Abstract:
To investigate the mechanisms involved in automatic processing of facial expressions, we used the QUEST procedure to measure the display durations needed to make a gender decision on emotional faces portraying fearful, happy, or neutral facial expressions. In line with predictions of appraisal theories of emotion, our results showed greater processing priority of emotional stimuli regardless of their valence. Whereas all experimental conditions led to an averaged threshold of about 50 ms, fearful and happy facial expressions led to significantly less variability in the responses than neutral faces. Results suggest that attention may have been automatically drawn by the emotion portrayed by face targets, yielding more informative perceptions and less variable responses. The temporal resolution of the perceptual system (expressed by the thresholds) and the processing priority of the stimuli (expressed by the variability in the responses) may influence subjective and objective measures of awareness, respectively.
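The QUEST procedure is an adaptive Bayesian staircase: each response updates a posterior distribution over the observer's threshold, and the next stimulus is placed according to that posterior. A simplified sketch of the idea in Python, not the published procedure: the logistic psychometric function, its slope, and the threshold grid below are illustrative assumptions, not the study's actual settings.

```python
import numpy as np

def quest_sketch(true_threshold_ms=50.0, n_trials=40, seed=0):
    """Toy Bayesian staircase in the spirit of QUEST: keep a posterior
    over the display-duration threshold and test at the posterior mean."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(10.0, 120.0, 221)   # candidate thresholds (ms)
    log_post = np.zeros_like(grid)         # flat prior, log space
    slope = 0.1                            # assumed psychometric slope

    def p_correct(duration, threshold):
        # Logistic psychometric function from 0.5 (guessing) toward 1.0
        return 0.5 + 0.5 / (1.0 + np.exp(-slope * (duration - threshold)))

    duration = float(grid.mean())
    for _ in range(n_trials):
        # Simulated observer responds correctly with probability given
        # by its (hidden) true threshold.
        correct = rng.random() < p_correct(duration, true_threshold_ms)
        lik = p_correct(duration, grid)
        log_post += np.log(lik) if correct else np.log(1.0 - lik)
        log_post -= log_post.max()
        post = np.exp(log_post)
        post /= post.sum()
        duration = float(np.sum(grid * post))  # next trial at posterior mean
    return duration
```

With enough trials the posterior mean converges toward the simulated observer's threshold; the reported ~50 ms thresholds correspond to the converged value of such a procedure.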
Abstract:
In this paper, we will address the endeavors of three disciplines, Psychology, Neuroscience, and Artificial Neural Network (ANN) modeling, in explaining how the mind perceives and attends to information. More precisely, we will shed some light on the efforts to understand the allocation of attentional resources to the processing of emotional stimuli. This review aims to inform the three disciplines about converging points of their research and to provide a starting point for discussion.
Abstract:
We present the first climate prediction of the coming decade made with multiple models, initialized with prior observations. This prediction accrues from an international activity to exchange decadal predictions in near real-time, in order to assess differences and similarities, provide a consensus view to prevent over-confidence in forecasts from any single model, and establish current collective capability. We stress that the forecast is experimental, since the skill of the multi-model system is as yet unknown. Nevertheless, the forecast systems used here are based on models that have undergone rigorous evaluation and individually have been evaluated for forecast skill. Moreover, it is important to publish forecasts to enable open evaluation, and to provide a focus on climate change in the coming decade. Initialized forecasts of the year 2011 agree well with observations, with a pattern correlation of 0.62 compared to 0.31 for uninitialized projections. In particular, the forecast correctly predicted La Niña in the Pacific, and warm conditions in the north Atlantic and USA. A similar pattern is predicted for 2012 but with a weaker La Niña. Indices of Atlantic multi-decadal variability and Pacific decadal variability show no signal beyond climatology after 2015, while temperature in the Niño3 region is predicted to warm slightly by about 0.5 °C over the coming decade. However, uncertainties are large for individual years and initialization has little impact beyond the first 4 years in most regions. Relative to uninitialized forecasts, initialized forecasts are significantly warmer in the north Atlantic sub-polar gyre and cooler in the north Pacific throughout the decade. They are also significantly cooler in the global average and over most land and ocean regions out to several years ahead. However, in the absence of volcanic eruptions, global temperature is predicted to continue to rise, with each year from 2013 onwards having a 50 % chance of exceeding the current observed record. Verification of these forecasts will provide an important opportunity to test the performance of models and our understanding and knowledge of the drivers of climate change.
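The pattern correlations quoted above (0.62 versus 0.31) are typically centered, area-weighted spatial correlations between forecast and observed anomaly fields. A minimal sketch of that computation, assuming regular latitude-longitude grids and cos(latitude) area weights (the exact weighting and masking used in the paper may differ):

```python
import numpy as np

def pattern_correlation(forecast, observed, lat):
    """Centered, area-weighted pattern correlation of two (lat, lon)
    anomaly fields, using cos(latitude) area weights."""
    # Broadcast per-latitude weights across longitudes, then normalize.
    w = np.cos(np.deg2rad(lat))[:, None] * np.ones_like(forecast)
    w = w / w.sum()
    # Remove the area-weighted mean of each field (centering).
    fa = forecast - np.sum(w * forecast)
    oa = observed - np.sum(w * observed)
    cov = np.sum(w * fa * oa)
    return cov / np.sqrt(np.sum(w * fa ** 2) * np.sum(w * oa ** 2))
```

Identical fields give a correlation of 1, perfectly anti-correlated fields give -1; values such as 0.62 indicate a forecast anomaly pattern that substantially resembles the observed one.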
Abstract:
In biological mass spectrometry (MS), two ionization techniques are predominantly employed for the analysis of larger biomolecules, such as polypeptides. These are nano-electrospray ionization [1, 2] (nanoESI) and matrix-assisted laser desorption/ionization [3, 4] (MALDI). Both techniques are considered to be “soft”, allowing the desorption and ionization of intact molecular analyte species and thus their successful mass-spectrometric analysis. One of the main differences between these two ionization techniques lies in their ability to produce multiply charged ions. MALDI typically generates singly charged peptide ions whereas nanoESI easily provides multiply charged ions, even for peptides as low as 1000 Da in mass. The production of highly charged ions is desirable as this allows the use of mass analyzers, such as ion traps (including orbitraps) and hybrid quadrupole instruments, which typically offer only a limited m/z range (< 2000–4000). It also enables more informative fragmentation spectra using techniques such as collision-induced dissociation (CID) and electron capture/transfer dissociation (ECD/ETD) in combination with tandem MS (MS/MS). [5, 6] Thus, there is a clear advantage of using ESI in research areas where peptide sequencing, or in general, the structural elucidation of biomolecules by MS/MS is required. Nonetheless, MALDI's higher tolerance to contaminants and additives, ease of operation, potential for high-speed and automated sample preparation and analysis, as well as its MS imaging capabilities make it an ionization technique that can cover bioanalytical areas for which ESI is less suitable. [7, 8] If these strengths could be combined with the analytical power of multiply charged ions, new instrumental configurations and large-scale proteomic analyses based on MALDI MS(/MS) would become feasible.
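The m/z argument above is simple charge-state arithmetic: in positive-ion mode each additional charge adds roughly one proton mass while dividing the mass-to-charge ratio. A quick illustration (the 4000 Da peptide is a hypothetical example, not from the text):

```python
PROTON_MASS = 1.007276  # Da, mass of a proton

def mz(neutral_mass, charge):
    """m/z of an ion formed by attaching `charge` protons (positive mode)."""
    return (neutral_mass + charge * PROTON_MASS) / charge

# A hypothetical 4000 Da peptide: singly charged (typical MALDI) it sits
# far outside a 2000 m/z analyzer window; triply charged (easily reached
# with ESI) it falls comfortably inside it.
print(round(mz(4000.0, 1), 2))  # 4001.01
print(round(mz(4000.0, 3), 2))  # 1334.34
```

This is why multiply charged ions open up analyzers with restricted m/z ranges, and why higher charge states also favor ETD/ECD fragmentation efficiency.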
Abstract:
BACKGROUND: Fibroblast growth factor 9 (FGF9) is secreted from bone marrow cells, which have been shown to improve systolic function after myocardial infarction (MI) in a clinical trial. FGF9 promotes cardiac vascularization during embryonic development but is only weakly expressed in the adult heart. METHODS AND RESULTS: We used a tetracycline-responsive binary transgene system based on the α-myosin heavy chain promoter to test whether conditional expression of FGF9 in the adult myocardium supports adaptation after MI. In sham-operated mice, transgenic FGF9 stimulated left ventricular hypertrophy with microvessel expansion and preserved systolic and diastolic function. After coronary artery ligation, transgenic FGF9 enhanced hypertrophy of the noninfarcted left ventricular myocardium with increased microvessel density, reduced interstitial fibrosis, attenuated fetal gene expression, and improved systolic function. Heart failure mortality after MI was markedly reduced by transgenic FGF9, whereas rupture rates were not affected. Adenoviral FGF9 gene transfer after MI similarly promoted left ventricular hypertrophy with improved systolic function and reduced heart failure mortality. Mechanistically, FGF9 stimulated proliferation and network formation of endothelial cells but induced no direct hypertrophic effects in neonatal or adult rat cardiomyocytes in vitro. FGF9-stimulated endothelial cell supernatants, however, induced cardiomyocyte hypertrophy via paracrine release of bone morphogenetic protein 6. In accord with this observation, expression of bone morphogenetic protein 6 and phosphorylation of its downstream targets SMAD1/5 were increased in the myocardium of FGF9 transgenic mice. CONCLUSIONS: Conditional expression of FGF9 promotes myocardial vascularization and hypertrophy with enhanced systolic function and reduced heart failure mortality after MI. These observations suggest a previously unrecognized therapeutic potential for FGF9 after MI.
Abstract:
The warm event which spread in the tropical Atlantic during spring–summer 1984 is assumed to be partially initiated by atmospheric disturbances, themselves related to the major 1982–1983 El Niño which occurred a year earlier in the Pacific. This paper tests this hypothesis. For that purpose, an atmospheric general circulation model (AGCM) is forced by different conditions of climatological and observed sea surface temperature, and an Atlantic ocean general circulation model (OGCM) is subsequently forced by the outputs of the AGCM. It is first shown that both the AGCM and the OGCM behave correctly when globally observed SSTs are used: the strengthening of the trades over the tropical Atlantic during 1983 and their subsequent weakening at the beginning of 1984 are well captured by the AGCM, as is the spring 1984 deepening of the thermocline in the eastern equatorial Atlantic, simulated by the OGCM. As assumed, the SST anomalies located in the El Niño Pacific area are partly responsible for the anomalous wind signal in the tropical Atlantic. Though this remotely forced atmospheric signal has a small amplitude, it can generate, in the OGCM run, an anomalous sub-surface signal leading to a flattening of the thermocline in the equatorial Atlantic. This forced oceanic experiment cannot explain the amplitude and phase of the observed sub-surface oceanic anomaly: part of the Atlantic ocean response, due to local interaction between ocean and atmosphere, requires a coupled approach. Nevertheless, this experiment shows that anomalous conditions in the Pacific during 1982–83 created favorable conditions for anomaly development in the Atlantic.
Abstract:
The Asian winter monsoon (AWM) response to global warming was investigated through a long-term transient greenhouse-warming integration with the ECHAM4/OPYC3 CGCM. The physics of the response was studied by analyzing the impact of global warming on the land–ocean contrast near the ground in the Asian and western Pacific region and on the east Asian trough and jet stream in the middle and upper troposphere. The forcing of transient eddy activity on the zonal circulation over the Asian and western Pacific region was also analyzed. It is found that in the global warming scenario the winter northeasterlies along the Pacific coast of the Eurasian continent weaken systematically and significantly and the intensity of the AWM is clearly reduced, but the AWM variance on interannual and interdecadal scales is not much affected by global warming. It is suggested that global warming makes the climate over most of Asia milder, with enhanced moisture in winter. In the global warming scenario the contrasts in sea level pressure and near-surface temperature between the Asian continent and the Pacific Ocean become significantly smaller, and the east Asian trough and jet stream in the middle and upper troposphere weaken and shift northward and eastward. As a consequence, the cold air in the AWM originating from the east Asian trough and high latitudes is less powerful. In addition, the feedback of transient eddy activity makes a considerable contribution to the poleward shift of the jet stream over the North Pacific in the global warming scenario.
Abstract:
In this article, we present FACSGen 2.0, new animation software for creating static and dynamic three-dimensional facial expressions on the basis of the Facial Action Coding System (FACS). FACSGen permits total control over the action units (AUs), which can be animated at all levels of intensity and applied alone or in combination to an infinite number of faces. In two studies, we tested the validity of the software for the AU appearance defined in the FACS manual and the conveyed emotionality of FACSGen expressions. In Experiment 1, four FACS-certified coders evaluated the complete set of 35 single AUs and 54 AU combinations for AU presence or absence, appearance quality, intensity, and asymmetry. In Experiment 2, lay participants performed a recognition task on emotional expressions created with FACSGen software and rated the similarity of expressions displayed by human and FACSGen faces. Results showed good to excellent classification levels for all AUs by the four FACS coders, suggesting that the AUs are valid exemplars of FACS specifications. Lay participants’ recognition rates for nine emotions were high, and human and FACSGen expressions were rated as highly similar. The findings demonstrate the effectiveness of the software in producing reliable and emotionally valid expressions, and suggest its application in numerous scientific areas, including perception, emotion, and clinical and neuroscience research.
Abstract:
The variability of results from different automated methods of detection and tracking of extratropical cyclones is assessed in order to identify uncertainties related to the choice of method. Fifteen international teams applied their own algorithms to the same dataset—the period 1989–2009 of interim European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis (ERA-Interim) data. This experiment is part of the community project Intercomparison of Mid Latitude Storm Diagnostics (IMILAST; see www.proclim.ch/imilast/index.html). The spread of results for cyclone frequency, intensity, life cycle, and track location is presented to illustrate the impact of using different methods. Globally, methods agree well for geographical distribution in large oceanic regions, interannual variability of cyclone numbers, geographical patterns of strong trends, and distribution shape for many life cycle characteristics. In contrast, the largest disparities exist for the total numbers of cyclones, the detection of weak cyclones, and distribution in some densely populated regions. Consistency between methods is better for strong cyclones than for shallow ones. Two case studies of relatively large, intense cyclones reveal that the identification of the most intense part of the life cycle of these events is robust between methods, but considerable differences exist during the development and the dissolution phases.
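Most of the fifteen algorithms build on some variant of the same primitive: flagging candidate cyclone centres as local minima of sea-level pressure (or maxima of vorticity), then linking them in time into tracks. A deliberately minimal version of that first detection step; the 8-neighbour test and the 1000 hPa cutoff are illustrative choices, not any participating team's actual criteria:

```python
import numpy as np

def detect_pressure_minima(mslp, max_hpa=1000.0):
    """Flag interior grid points that are strict local minima of
    sea-level pressure (8-neighbour comparison) below a cutoff."""
    ny, nx = mslp.shape
    centers = []
    for j in range(1, ny - 1):
        for i in range(1, nx - 1):
            window = mslp[j - 1:j + 2, i - 1:i + 2]
            # Require a unique minimum at the centre, below the cutoff.
            if (mslp[j, i] < max_hpa
                    and mslp[j, i] == window.min()
                    and np.sum(window == window.min()) == 1):
                centers.append((j, i))
    return centers
```

The intercomparison spread arises precisely because each team makes different choices at this and later stages (search radius, intensity thresholds, smoothing, tracking rules), which matters most for weak, shallow systems.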
Abstract:
A novel version of the classical surface pressure tendency equation (PTE) is applied to ERA-Interim reanalysis data to quantitatively assess the contribution of diabatic processes to the deepening of extratropical cyclones relative to effects of temperature advection and vertical motions. The five cyclone cases selected, Lothar and Martin in December 1999, Kyrill in January 2007, Klaus in January 2009, and Xynthia in February 2010, all showed explosive deepening and brought considerable damage to parts of Europe. For Xynthia, Klaus, and Lothar, diabatic processes contribute more to the observed surface pressure fall than horizontal temperature advection during their respective explosive deepening phases, while Kyrill and Martin appear to be more baroclinically driven storms. The powerful new diagnostic tool presented here can easily be applied to large numbers of cyclones and will help to better understand the role of diabatic processes in future changes in extratropical storminess.
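The PTE rests on hydrostatic balance: with pressure at some upper level fixed, surface pressure falls when the virtual temperature of the column below rises. A schematic form of that relationship follows; the symbols and the three-way split of the temperature tendency are indicative only, and the paper's exact formulation (including boundary and moisture terms) may differ.

```latex
% Hydrostatic link between surface pressure and the column-mean
% virtual temperature \overline{T_v} below a fixed height z_{top}:
p_s \approx p_{top}\,\exp\!\left( \frac{g\, z_{top}}{R_d\, \overline{T_v}} \right)

% Differentiating in time: surface pressure falls when the column
% warms or when pressure aloft falls.
\frac{\partial p_s}{\partial t} \approx
  \frac{p_s}{p_{top}} \frac{\partial p_{top}}{\partial t}
  \;-\; \frac{g\, z_{top}\, p_s}{R_d\, \overline{T_v}^{\,2}}\,
        \frac{\partial \overline{T_v}}{\partial t}

% The column-temperature tendency is then partitioned into horizontal
% advection, vertical motion, and diabatic heating, which is how the
% diabatic contribution to deepening is isolated:
\frac{\partial \overline{T_v}}{\partial t} =
  \underbrace{-\,\overline{\mathbf{v} \cdot \nabla T_v}}_{\text{advection}}
  \;+\; \underbrace{\overline{\left( \frac{\kappa T_v}{p}
        - \frac{\partial T_v}{\partial p} \right) \omega}}_{\text{vertical motion}}
  \;+\; \underbrace{\overline{\frac{Q}{c_p}}}_{\text{diabatic}}
```

Evaluating each term from reanalysis fields over the deepening period yields the attribution quoted above: for Xynthia, Klaus, and Lothar the diabatic term dominates, whereas for Kyrill and Martin the advective (baroclinic) term does.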
Abstract:
Despite decades of research, the roles of climate and humans in driving the dramatic extinctions of large-bodied mammals during the Late Quaternary period remain contentious. Here we use ancient DNA, species distribution models and the human fossil record to elucidate how climate and humans shaped the demographic history of woolly rhinoceros, woolly mammoth, wild horse, reindeer, bison and musk ox. We show that climate has been a major driver of population change over the past 50,000 years. However, each species responds differently to the effects of climatic shifts, habitat redistribution and human encroachment. Although climate change alone can explain the extinction of some species, such as Eurasian musk ox and woolly rhinoceros, a combination of climatic and anthropogenic effects appears to be responsible for the extinction of others, including Eurasian steppe bison and wild horse. We find no genetic signature or any distinctive range dynamics distinguishing extinct from surviving species, emphasizing the challenges associated with predicting future responses of extant mammals to climate and human-mediated habitat change.
Abstract:
What are the microfoundations of dynamic capabilities that sustain competitive advantage in a highly volatile environment, such as a transition economy? We explore the detailed nature of these dynamic capabilities along with their antecedents by tracing the sequence of their development based on a longitudinal case study of an organization subject to an external context of radical transition — the Russian oil company, Yukos. Our rich qualitative data indicate two distinct types of dynamic capabilities that are pivotal for organizational transformation. Adaptation dynamic capabilities relate to routines of resource exploitation and deployment, which are supported by acquisition, internalization and dissemination of extant knowledge, as well as resource reconfiguration, divestment and integration. Innovation dynamic capabilities relate to the creation of completely new capabilities via exploration and path-creation processes, which are supported by search, experimentation and risk taking, as well as project selection, funding and implementation. Second, we find that sequencing the two types of dynamic capabilities helped the organization both to secure short-term competitive advantage and to create the basis for long-term competitive advantage. These dynamic capability constructs advance theoretical understanding of what dynamic capabilities are, whilst their sequencing explains how firms create, leverage and enhance them over time.
Abstract:
Recent studies have indicated that research practices in psychology may be susceptible to factors that increase false-positive rates, raising concerns about the possible prevalence of false-positive findings. The present article discusses several practices that may run counter to the inflation of false-positive rates. Taking these practices into account would lead to a more balanced view on the false-positive issue. Specifically, we argue that an inflation of false-positive rates would diminish, sometimes to a substantial degree, when researchers (a) have explicit a priori theoretical hypotheses, (b) include multiple replication studies in a single paper, and (c) collect additional data based on observed results. We report findings from simulation studies and statistical evidence that support these arguments. Being aware of these preventive factors allows researchers not to overestimate the pervasiveness of false positives in psychology and to gauge the susceptibility of a paper to possible false positives in practical and fair ways.
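The protective effect of in-paper replication in point (b) can be checked with a short Monte Carlo under a true null effect. This sketch is not the article's actual simulation; the sample size, critical value (two-sided alpha = .05, df = 19), and number of simulated experiments are arbitrary illustrations:

```python
import numpy as np

def false_positive_rates(n_experiments=20000, n=20, t_crit=2.093, seed=1):
    """Under a true null (data ~ N(0, 1)), estimate (1) how often a single
    one-sample t-test is 'significant' and (2) how often a study AND an
    independent exact replication are both significant."""
    rng = np.random.default_rng(seed)

    def significant():
        # One batch of simulated studies: one-sample t-test against 0.
        x = rng.standard_normal((n_experiments, n))
        t = x.mean(axis=1) / (x.std(axis=1, ddof=1) / np.sqrt(n))
        return np.abs(t) > t_crit

    single = significant()
    both = single & significant()  # require an independent replication too
    return single.mean(), both.mean()
```

With alpha = .05 per study, a single study is falsely significant about 5% of the time, but study plus replication both reaching significance drops to roughly alpha squared (about 0.25%), which is the intuition behind argument (b).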