40 results for Theweleit, Klaus
Abstract:
Starting from the classical Saltzman two-dimensional convection equations, we derive via a severe spectral truncation a minimal system of 10 ODEs which includes the thermal effect of viscous dissipation. Neglecting this process leads to a dynamical system which includes a decoupled generalized Lorenz system. The consideration of this process breaks an important symmetry and couples the dynamics of fast and slow variables, with the ensuing modifications to the structural properties of the attractor and of the spectral features. When the relevant nondimensional number (the Eckert number, Ec) is different from zero, an additional time scale of O(Ec⁻¹) is introduced in the system, as shown by standard multiscale analysis and made clear by several lines of numerical evidence. Moreover, the system is ergodic and hyperbolic, the slow variables feature long-term memory with 1/f^(3/2) power spectra, and the fast variables feature amplitude modulation. Increasing the strength of the thermal-viscous feedback has a stabilizing effect, as both the metric entropy and the Kaplan-Yorke attractor dimension decrease monotonically with Ec. The analyzed system features very rich dynamics: it overcomes some of the limitations of the Lorenz system and might have prototypical value for relevant processes in complex-system dynamics, such as the interaction between slow and fast variables, the presence of long-term memory, and the associated extreme value statistics. This analysis shows how neglecting the coupling of slow and fast variables on the basis of scale analysis alone can be catastrophic. In fact, this leads to spurious invariances that affect essential dynamical properties (ergodicity, hyperbolicity) and that cause the model to lose its ability to describe intrinsically multiscale processes.
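The Kaplan-Yorke attractor dimension mentioned above is obtained from the Lyapunov spectrum as D_KY = j + (λ₁ + … + λⱼ)/|λⱼ₊₁|, where j is the largest index at which the partial sum of exponents is still non-negative. A minimal sketch in Python (the function and the illustrative Lorenz-63 exponents are our additions, not values from the 10-ODE system studied in the paper):

```python
def kaplan_yorke_dimension(lyapunov_exponents):
    """Kaplan-Yorke (Lyapunov) dimension computed from a Lyapunov spectrum."""
    exps = sorted(lyapunov_exponents, reverse=True)
    cumulative = 0.0
    for j, lam in enumerate(exps):
        if cumulative + lam < 0:
            # j is the first index whose exponent tips the partial sum negative
            return j + cumulative / abs(lam)
        cumulative += lam
    # The sum never turns negative: dimension equals the phase-space dimension.
    return float(len(exps))

# Classic Lorenz-63 exponents (sigma=10, rho=28, beta=8/3): ~(0.906, 0, -14.572)
print(round(kaplan_yorke_dimension([0.906, 0.0, -14.572]), 3))  # → 2.062
```

For the classical Lorenz-63 spectrum this gives the familiar value of about 2.06; for the 10-ODE system the full spectrum would first have to be estimated numerically.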
Abstract:
In this paper the authors exploit two equivalent formulations of the average rate of material entropy production in the climate system to propose an approximate splitting between contributions due to vertical and eminently horizontal processes. This approach is based only on 2D radiative fields at the surface and at the top of atmosphere. Using 2D fields at the top of atmosphere alone, lower bounds to the rate of material entropy production and to the intensity of the Lorenz energy cycle are derived. By introducing a measure of the efficiency of the planetary system with respect to horizontal thermodynamic processes, it is possible to gain insight into a previous intuition on the possibility of defining a baroclinic heat engine extracting work from the meridional heat flux. The approximate formula of the material entropy production is verified and used for studying the global thermodynamic properties of climate models (CMs) included in the Program for Climate Model Diagnosis and Intercomparison (PCMDI)/phase 3 of the Coupled Model Intercomparison Project (CMIP3) dataset in preindustrial climate conditions. It is found that about 90% of the material entropy production is due to vertical processes such as convection, whereas the large-scale meridional heat transport contributes to only about 10% of the total. This suggests that the traditional two-box models used for providing a minimal representation of entropy production in planetary systems are not appropriate, whereas a basic—but conceptually correct—description can be framed in terms of a four-box model. The total material entropy production is typically 55 mW m⁻² K⁻¹, with discrepancies on the order of 5%, and CMs’ baroclinic efficiencies are clustered around 0.055. The lower bounds on the intensity of the Lorenz energy cycle featured by CMs are found to be around 1.0–1.5 W m⁻², which implies that the derived inequality is rather stringent.
When looking at the variability and covariability of the considered thermodynamic quantities, the agreement among CMs is worse, suggesting that the description of feedbacks is more uncertain. The contributions to material entropy production from vertical and horizontal processes are positively correlated, so that no compensation mechanism seems to be in place. Quite consistently among CMs, the variability of the efficiency of the system is a better proxy for the variability of the entropy production due to horizontal processes than that of the large-scale heat flux. The possibility of providing constraints on the 3D dynamics of the fluid envelope based only on 2D observations of radiative fluxes seems promising for the observational study of planets and for testing numerical models.
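The split between horizontal entropy production and the efficiency measure can be illustrated with the textbook two-reservoir formulas for entropy production by down-gradient heat transport and for Carnot-like efficiency (a hedged sketch with made-up numbers; the paper's actual four-box formulation and its 2D radiative inputs are more elaborate):

```python
def horizontal_entropy_production(heat_flux, t_warm, t_cold):
    """Entropy production (W m^-2 K^-1) of a heat flux (W m^-2) carried
    from a warm reservoir at t_warm (K) to a cold one at t_cold (K)."""
    return heat_flux * (1.0 / t_cold - 1.0 / t_warm)

def carnot_like_efficiency(t_warm, t_cold):
    """Carnot-like efficiency of the corresponding 'baroclinic heat engine'."""
    return (t_warm - t_cold) / t_warm

# Illustrative numbers only (not from the paper): ~5 W m^-2 exchanged
# between boxes at 290 K and 275 K.
phi, tw, tc = 5.0, 290.0, 275.0
print(round(1e3 * horizontal_entropy_production(phi, tw, tc), 2))  # → 0.94 (mW m^-2 K^-1)
print(round(carnot_like_efficiency(tw, tc), 3))  # → 0.052
```

Even with these arbitrary inputs, the horizontal term lands in the mW m⁻² K⁻¹ range and the efficiency near a few percent, i.e. the same orders of magnitude as the CM values quoted above.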
Abstract:
The starting point for these outputs is a large-scale research project in collaboration with the Zurich University of the Arts and the Kunstmuseum Thun, looking at a redefinition of Social Sculpture (Joseph Beuys/Bazon Brock, 1970) as a functional device re-deployed to expand the art discourse into a societal discourse. Although Beuys' version of a social sculpture involved notions of abstruse mysticism and reformulations of a national identity, these were nevertheless part of a social transformation that shifted and re-arranged power relations. Following Laclau and Mouffe in their contention that democracy is a fundamentally antagonistic process, and contesting Grant Kester's understanding of an ethically based relational practice, this work aligns itself with Hirschhorn's claim to an aesthetic practice within communities, pursuing the possibility of viewing a socially based practice from both ends of the ethics debate, whereby ethical aspects fuel the aesthetic to “create situations that are beautiful because they are ethical and shocking because they are ethical, thus in turn aesthetic because they are ethical” (O'Donnell). This project sets out to engage in activities which interact with surrounding communities and evoke new imaginations of site, thereby understanding site as a catalyst for subjective emergences. Performance is tested as a site for social practice. Archival research into local audio/visual resources, such as the Swiss Radio Archive, the Swiss Military Film Archives and the zoological film archives of Basel Zoo, was instrumental to the navigation of this work, under the themes of crisis, catastrophe, landscape and fallout, in order to create a visual language for an active performance site.
Commissioned by the Kunstmuseum Thun in collaboration with the Zurich University of the Arts as part of a year-long exhibition programme (the other artists are Jeanne van Heeswijk (NL) and San Keller (CH)), this project brings together a series of different works in a new performance installation. The performance process includes a performance workshop with 30 school children from local Swiss schools and their teachers, which was conducted publicly in the museum spaces. It enabled the children to engage with an unexpected set of tribal and animalistic behaviours, looking at situations of flight and rescue, and resulted in a large performance choreography orchestrated without an apparent conductor. The project also includes collaborations with the renowned Swiss zoologist Prof. Klaus Zuberbühler (University of St Andrews) and with Colonel General Haldimann, commander of the military base in Thun. The installation included two static video images, shot in and around a spectacular local cave site (the Beatus Caves) and featuring three children. The project will culminate in an edited edition of the OnCurating Journal (issue no. tbc, in 2012), including interviews and essays from project collaborators (Army Commander General, Thun; Jörg Hess; performance script, Timothy Long; and others).
Abstract:
A nitric oxide synthase (NOS)-like activity has been demonstrated in human red blood cells (RBCs), but doubts about its functional significance, isoform identity and disease relevance remain. Using flow cytometry in combination with the NO-imaging probe DAF-FM, we find that all blood cells form NO intracellularly, with a rank order of monocytes > neutrophils > lymphocytes > RBCs > platelets. The observation of a NO-related fluorescence within RBCs was unexpected given the abundance of the NO-scavenger oxyhemoglobin. Constitutive normoxic NO formation was abolished by NOS inhibition and intracellular NO scavenging, confirmed by laser-scanning microscopy and unequivocally validated by detection of the DAF-FM reaction product with NO using HPLC and LC-MS/MS. Employing immunoprecipitation, ESI-MS/MS-based peptide sequencing and enzymatic assay, we further demonstrate that human RBCs contain an endothelial NOS (eNOS) that converts L-[3H]arginine to L-[3H]citrulline in a Ca2+/calmodulin-dependent fashion. Moreover, in patients with coronary artery disease, red cell eNOS expression and activity are both lower than in age-matched healthy individuals and correlate with the degree of endothelial dysfunction. Thus, human RBCs constitutively produce NO under normoxic conditions via an active eNOS isoform, the activity of which is compromised in patients with coronary artery disease.
Abstract:
To investigate the mechanisms involved in automatic processing of facial expressions, we used the QUEST procedure to measure the display durations needed to make a gender decision on emotional faces portraying fearful, happy, or neutral facial expressions. In line with predictions of appraisal theories of emotion, our results showed greater processing priority of emotional stimuli regardless of their valence. Whereas all experimental conditions led to an averaged threshold of about 50 ms, fearful and happy facial expressions led to significantly less variability in the responses than neutral faces. Results suggest that attention may have been automatically drawn by the emotion portrayed by face targets, yielding more informative perceptions and less variable responses. The temporal resolution of the perceptual system (expressed by the thresholds) and the processing priority of the stimuli (expressed by the variability in the responses) may influence subjective and objective measures of awareness, respectively.
Abstract:
In this paper, we will address the endeavors of three disciplines, Psychology, Neuroscience, and Artificial Neural Network (ANN) modeling, in explaining how the mind perceives and attends to information. More precisely, we will shed some light on the efforts to understand the allocation of attentional resources to the processing of emotional stimuli. This review aims at informing the three disciplines about converging points of their research and at providing a starting point for discussion.
Abstract:
We present the first climate prediction of the coming decade made with multiple models, initialized with prior observations. This prediction accrues from an international activity to exchange decadal predictions in near real-time, in order to assess differences and similarities, provide a consensus view to prevent over-confidence in forecasts from any single model, and establish current collective capability. We stress that the forecast is experimental, since the skill of the multi-model system is as yet unknown. Nevertheless, the forecast systems used here are based on models that have undergone rigorous evaluation and individually have been evaluated for forecast skill. Moreover, it is important to publish forecasts to enable open evaluation, and to provide a focus on climate change in the coming decade. Initialized forecasts of the year 2011 agree well with observations, with a pattern correlation of 0.62 compared to 0.31 for uninitialized projections. In particular, the forecast correctly predicted La Niña in the Pacific, and warm conditions in the north Atlantic and USA. A similar pattern is predicted for 2012 but with a weaker La Niña. Indices of Atlantic multi-decadal variability and Pacific decadal variability show no signal beyond climatology after 2015, while temperature in the Niño3 region is predicted to warm slightly by about 0.5 °C over the coming decade. However, uncertainties are large for individual years and initialization has little impact beyond the first 4 years in most regions. Relative to uninitialized forecasts, initialized forecasts are significantly warmer in the north Atlantic sub-polar gyre and cooler in the north Pacific throughout the decade. They are also significantly cooler in the global average and over most land and ocean regions out to several years ahead. 
However, in the absence of volcanic eruptions, global temperature is predicted to continue to rise, with each year from 2013 onwards having a 50 % chance of exceeding the current observed record. Verification of these forecasts will provide an important opportunity to test the performance of models and our understanding and knowledge of the drivers of climate change.
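The pattern correlation quoted for the 2011 forecast (0.62 versus 0.31) is, in essence, a centered, area-weighted spatial correlation between forecast and observed anomaly fields. A minimal sketch, assuming cosine-latitude weighting on a regular latitude-longitude grid (the function name and the tiny synthetic fields are ours, not the study's verification code):

```python
import numpy as np

def pattern_correlation(field_a, field_b, lats_deg):
    """Centered, cos-latitude-weighted pattern correlation of two
    2D (lat x lon) anomaly fields."""
    w = np.cos(np.deg2rad(lats_deg))[:, None] * np.ones_like(field_a)
    w = w / w.sum()  # normalize the area weights
    a = field_a - (w * field_a).sum()  # remove the weighted mean
    b = field_b - (w * field_b).sum()
    cov = (w * a * b).sum()
    return cov / np.sqrt((w * a * a).sum() * (w * b * b).sum())

# Tiny synthetic check: a field correlates perfectly with itself.
lats = np.array([-45.0, 0.0, 45.0])
f = np.array([[1.0, 2.0], [0.5, -1.0], [2.0, 0.0]])
print(round(float(pattern_correlation(f, f, lats)), 6))  # → 1.0
```

In practice the fields would be forecast and observed temperature anomalies on the model grid, and the weighting prevents the dense high-latitude grid points from dominating the score.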
Abstract:
In biological mass spectrometry (MS), two ionization techniques are predominantly employed for the analysis of larger biomolecules, such as polypeptides. These are nano-electrospray ionization [1, 2] (nanoESI) and matrix-assisted laser desorption/ionization [3, 4] (MALDI). Both techniques are considered to be “soft”, allowing the desorption and ionization of intact molecular analyte species and thus their successful mass-spectrometric analysis. One of the main differences between these two ionization techniques lies in their ability to produce multiply charged ions. MALDI typically generates singly charged peptide ions whereas nanoESI easily provides multiply charged ions, even for peptides as low as 1000 Da in mass. The production of highly charged ions is desirable as this allows the use of mass analyzers, such as ion traps (including orbitraps) and hybrid quadrupole instruments, which typically offer only a limited m/z range (< 2000–4000). It also enables more informative fragmentation spectra using techniques such as collision-induced dissociation (CID) and electron capture/transfer dissociation (ECD/ETD) in combination with tandem MS (MS/MS). [5, 6] Thus, there is a clear advantage of using ESI in research areas where peptide sequencing or, in general, the structural elucidation of biomolecules by MS/MS is required. Nonetheless, MALDI with its higher tolerance to contaminants and additives, ease of operation, potential for high-speed and automated sample preparation and analysis, as well as its MS imaging capabilities makes it an ionization technique that can cover bioanalytical areas for which ESI is less suitable. [7, 8] If these strengths could be combined with the analytical power of multiply charged ions, new instrumental configurations and large-scale proteomic analyses based on MALDI MS(/MS) would become feasible.
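The advantage of multiple charging for analyzers with a limited m/z range can be made concrete with the standard relation for protonated ions, m/z = (M + n·m_proton)/n. A small illustration (the 3 kDa peptide mass is an arbitrary example, not a species from this work):

```python
PROTON_MASS = 1.007276  # Da

def mz(neutral_mass, charge):
    """m/z of an ion carrying `charge` protons (positive-ion mode)."""
    return (neutral_mass + charge * PROTON_MASS) / charge

# A 3 kDa peptide: singly charged (typical of MALDI) vs triply charged
# (readily produced by nanoESI).
print(round(mz(3000.0, 1), 3))  # → 3001.007
print(round(mz(3000.0, 3), 3))  # → 1001.007
```

The singly charged ion falls outside a 2000 m/z analyzer window, while the triply charged species of the same peptide is comfortably inside it, which is precisely why multiply charging extends the accessible mass range.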
Abstract:
BACKGROUND: Fibroblast growth factor 9 (FGF9) is secreted from bone marrow cells, which have been shown to improve systolic function after myocardial infarction (MI) in a clinical trial. FGF9 promotes cardiac vascularization during embryonic development but is only weakly expressed in the adult heart. METHODS AND RESULTS: We used a tetracycline-responsive binary transgene system based on the α-myosin heavy chain promoter to test whether conditional expression of FGF9 in the adult myocardium supports adaptation after MI. In sham-operated mice, transgenic FGF9 stimulated left ventricular hypertrophy with microvessel expansion and preserved systolic and diastolic function. After coronary artery ligation, transgenic FGF9 enhanced hypertrophy of the noninfarcted left ventricular myocardium with increased microvessel density, reduced interstitial fibrosis, attenuated fetal gene expression, and improved systolic function. Heart failure mortality after MI was markedly reduced by transgenic FGF9, whereas rupture rates were not affected. Adenoviral FGF9 gene transfer after MI similarly promoted left ventricular hypertrophy with improved systolic function and reduced heart failure mortality. Mechanistically, FGF9 stimulated proliferation and network formation of endothelial cells but induced no direct hypertrophic effects in neonatal or adult rat cardiomyocytes in vitro. FGF9-stimulated endothelial cell supernatants, however, induced cardiomyocyte hypertrophy via paracrine release of bone morphogenetic protein 6. In accord with this observation, expression of bone morphogenetic protein 6 and phosphorylation of its downstream targets SMAD1/5 were increased in the myocardium of FGF9 transgenic mice. CONCLUSIONS: Conditional expression of FGF9 promotes myocardial vascularization and hypertrophy with enhanced systolic function and reduced heart failure mortality after MI. These observations suggest a previously unrecognized therapeutic potential for FGF9 after MI.
Abstract:
The warm event which spread in the tropical Atlantic during spring-summer 1984 is assumed to have been partially initiated by atmospheric disturbances, themselves related to the major 1982–1983 El Niño which occurred one year earlier in the Pacific. This paper tests such a hypothesis. For that purpose, an atmospheric general circulation model (AGCM) is forced by different conditions of climatological and observed sea surface temperature (SST), and an Atlantic ocean general circulation model (OGCM) is subsequently forced by the outputs of the AGCM. It is first shown that both the AGCM and the OGCM behave correctly when globally observed SST are used: the strengthening of the trades over the tropical Atlantic during 1983 and their subsequent weakening at the beginning of 1984 are well captured by the AGCM, and so is the spring 1984 deepening of the thermocline in the eastern equatorial Atlantic, simulated by the OGCM. As assumed, the SST anomalies located in the El Niño Pacific area are partly responsible for the anomalous wind signal in the tropical Atlantic. Though this remotely forced atmospheric signal has a small amplitude, it can generate, in the OGCM run, an anomalous sub-surface signal leading to a flattening of the thermocline in the equatorial Atlantic. This forced oceanic experiment cannot explain the amplitude and phase of the observed sub-surface oceanic anomaly: part of the Atlantic ocean response, due to local interaction between ocean and atmosphere, requires a coupled approach. Nevertheless, this experiment showed that anomalous conditions in the Pacific during 1982–83 created favorable conditions for anomaly development in the Atlantic.
Abstract:
The Asian winter monsoon (AWM) response to global warming was investigated through a long-term integration of transient greenhouse warming with the ECHAM4/OPYC3 CGCM. The physics of the response was studied through analyses of the impact of global warming on the variations of the ocean-land contrast near the ground in the Asian and western Pacific region and on the east Asian trough and jet stream in the middle and upper troposphere. The forcing of transient eddy activity on the zonal circulation over the Asian and western Pacific region was also analyzed. It is found that in the global warming scenario the winter northeasterlies along the Pacific coast of the Eurasian continent weaken systematically and significantly and the intensity of the AWM is markedly reduced, but the AWM variances on the interannual and interdecadal scales are not affected much by global warming. It is suggested that global warming makes the climate over most of Asia milder, with enhanced moisture in winter. In the global warming scenario the contrasts of sea level pressure and near-surface temperature between the Asian continent and the Pacific Ocean become significantly smaller, and northward and eastward shifts and a weakening of the east Asian trough and jet stream in the middle and upper troposphere are found. As a consequence, the cold air in the AWM originating from the east Asian trough and high latitudes is less powerful. In addition, the feedback of transient eddy activity also makes a considerable contribution to the higher-latitude shift of the jet stream over the North Pacific in the global warming scenario.
Abstract:
In this article, we present FACSGen 2.0, new animation software for creating static and dynamic three-dimensional facial expressions on the basis of the Facial Action Coding System (FACS). FACSGen permits total control over the action units (AUs), which can be animated at all levels of intensity and applied alone or in combination to an infinite number of faces. In two studies, we tested the validity of the software for the AU appearance defined in the FACS manual and the conveyed emotionality of FACSGen expressions. In Experiment 1, four FACS-certified coders evaluated the complete set of 35 single AUs and 54 AU combinations for AU presence or absence, appearance quality, intensity, and asymmetry. In Experiment 2, lay participants performed a recognition task on emotional expressions created with FACSGen software and rated the similarity of expressions displayed by human and FACSGen faces. Results showed good to excellent classification levels for all AUs by the four FACS coders, suggesting that the AUs are valid exemplars of FACS specifications. Lay participants’ recognition rates for nine emotions were high, and human and FACSGen expressions were rated as very similar. The findings demonstrate the effectiveness of the software in producing reliable and emotionally valid expressions, and suggest its application in numerous scientific areas, including perception, emotion, and clinical and neuroscience research.
Abstract:
The variability of results from different automated methods of detection and tracking of extratropical cyclones is assessed in order to identify uncertainties related to the choice of method. Fifteen international teams applied their own algorithms to the same dataset—the period 1989–2009 of interim European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis (ERA-Interim) data. This experiment is part of the community project Intercomparison of Mid Latitude Storm Diagnostics (IMILAST; see www.proclim.ch/imilast/index.html). The spread of results for cyclone frequency, intensity, life cycle, and track location is presented to illustrate the impact of using different methods. Globally, methods agree well for geographical distribution in large oceanic regions, interannual variability of cyclone numbers, geographical patterns of strong trends, and distribution shape for many life cycle characteristics. In contrast, the largest disparities exist for the total numbers of cyclones, the detection of weak cyclones, and distribution in some densely populated regions. Consistency between methods is better for strong cyclones than for shallow ones. Two case studies of relatively large, intense cyclones reveal that the identification of the most intense part of the life cycle of these events is robust between methods, but considerable differences exist during the development and the dissolution phases.
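At their core, many detection schemes of this kind first identify candidate cyclone centers as local minima of the sea-level pressure field before linking them into tracks. A deliberately simplified sketch (the 8-neighbour test and the 1000 hPa threshold are illustrative choices of ours, not any IMILAST team's actual algorithm):

```python
import numpy as np

def detect_cyclone_centers(slp, threshold=1000.0):
    """Grid points that are strict local minima of sea-level pressure (hPa)
    among their 8 neighbours and lie below `threshold`."""
    centers = []
    for i in range(1, slp.shape[0] - 1):
        for j in range(1, slp.shape[1] - 1):
            window = slp[i - 1:i + 2, j - 1:j + 2]
            is_unique_min = (slp[i, j] == window.min()
                             and (window == window.min()).sum() == 1)
            if slp[i, j] < threshold and is_unique_min:
                centers.append((i, j))
    return centers

# Synthetic field: uniform 1015 hPa with one 985 hPa low at grid point (2, 3).
field = np.full((5, 6), 1015.0)
field[2, 3] = 985.0
print(detect_cyclone_centers(field))  # → [(2, 3)]
```

The spread reported above arises precisely because each team makes different choices at this step (pressure vs. vorticity, thresholds, smoothing) and in the subsequent tracking, so weak and shallow systems are where the methods diverge most.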
Abstract:
A novel version of the classical surface pressure tendency equation (PTE) is applied to ERA-Interim reanalysis data to quantitatively assess the contribution of diabatic processes to the deepening of extratropical cyclones relative to effects of temperature advection and vertical motions. The five cyclone cases selected, Lothar and Martin in December 1999, Kyrill in January 2007, Klaus in January 2009, and Xynthia in February 2010, all showed explosive deepening and brought considerable damage to parts of Europe. For Xynthia, Klaus and Lothar diabatic processes contribute more to the observed surface pressure fall than horizontal temperature advection during their respective explosive deepening phases, while Kyrill and Martin appear to be more baroclinically driven storms. The powerful new diagnostic tool presented here can easily be applied to large numbers of cyclones and will help to better understand the role of diabatic processes in future changes in extratropical storminess.
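The classical starting point for any surface pressure tendency diagnostic follows from hydrostatic balance and mass continuity (this is the textbook form, not the novel splitting introduced in the paper, and the evaporation/precipitation mass source term is written here in its commonly used form):

```latex
p_s = g \int_0^{\infty} \rho \, dz ,
\qquad
\frac{\partial p_s}{\partial t}
  = -\, g \int_0^{\infty} \nabla \cdot (\rho \mathbf{v}) \, dz
    \; + \; g \, (E - P)
```

Re-expressing the column mass change through the hypsometric equation turns this into tendencies of virtual temperature, which is where the separate contributions of horizontal temperature advection, vertical motion, and diabatic heating enter diagnostics of this kind.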