Abstract:
Decadal and longer timescale variability in the winter North Atlantic Oscillation (NAO) has considerable impact on regional climate, yet it remains unclear what fraction of this variability is potentially predictable. This study takes a new approach to this question by demonstrating clear physical differences between NAO variability on interannual-decadal (<30 year) and multidecadal (>30 year) timescales. It is shown that on the shorter timescale the NAO is dominated by variations in the latitude of the North Atlantic jet and storm track, whereas on the longer timescale it represents changes in their strength instead. NAO variability on the two timescales is associated with different dynamical behaviour in terms of eddy-mean flow interaction, Rossby wave breaking and blocking. The two timescales also exhibit different regional impacts on temperature and precipitation and different relationships to sea surface temperatures. These results are derived from linear regression analysis of the Twentieth Century and NCEP-NCAR reanalyses and of a high-resolution HiGEM General Circulation Model control simulation, with additional analysis of a long sea level pressure reconstruction. Evidence is presented for an influence of the ocean circulation on the longer timescale variability of the NAO, which is particularly clear in the model data. As well as providing new evidence of potential predictability, these findings are shown to have implications for the reconstruction and interpretation of long climate records.
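The analysis described above can be sketched as a band separation of an annual NAO index followed by a pointwise linear regression of a climate field onto the index. This is an illustrative sketch only: the running-mean low-pass filter, the synthetic data, and the function names are assumptions, not the study's exact procedure.

```python
import numpy as np

def split_timescales(x, cutoff_years=30):
    """Split an annual index into multidecadal (> cutoff) and
    interannual-decadal (< cutoff) components.  A running mean is a
    crude low-pass filter; 'same' convolution keeps the series
    length, at the cost of biased edges."""
    kernel = np.ones(cutoff_years) / cutoff_years
    slow = np.convolve(x, kernel, mode="same")
    fast = x - slow
    return slow, fast

def regression_map(index, field):
    """Regress a (time, space) field onto a standardized index and
    return the slope at each grid point.  The same function can be
    applied to the slow or fast component of the index."""
    idx = (index - index.mean()) / index.std()
    anomalies = field - field.mean(axis=0)
    return (idx[:, None] * anomalies).mean(axis=0) / idx.var()

rng = np.random.default_rng(0)
nao = rng.standard_normal(150)                        # 150 synthetic "years"
field = 0.5 * nao[:, None] + rng.standard_normal((150, 4))
slow, fast = split_timescales(nao)
slopes = regression_map(nao, field)                   # ~0.5 at each point
```

In practice the two components would be regressed separately against pressure, temperature, or precipitation fields to contrast the jet-latitude and jet-strength signatures.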
Abstract:
Ice cores provide a robust reconstruction of past climate. However, development of timescales by annual-layer counting, essential to detailed climate reconstruction and interpretation, is problematic for ice cores collected at low-accumulation sites or in regions of compressed ice, due to closely spaced layers. Ice-core analysis by laser ablation–inductively coupled plasma–mass spectrometry (LA-ICP-MS) provides sub-millimeter-scale sampling resolution (on the order of 100 μm in this study) and the low detection limits (ng L⁻¹) necessary to measure the chemical constituents preserved in ice cores. We present a newly developed cryocell that can hold a 1 m long section of ice core, and an alternative strategy for calibration. Using ice-core samples from central Greenland, we demonstrate the repeatability of multiple ablation passes, highlight the improved sampling resolution, verify the calibration technique and identify annual layers in the chemical profile in a deep section of an ice core where annual layers have not previously been identified using chemistry. In addition, using sections of cores from the Swiss/Italian Alps, we illustrate the relationship between Ca, Na and Fe and particle concentration and conductivity, and validate the LA-ICP-MS Ca profile through a direct comparison with continuous flow analysis results.
Abstract:
The rapid development of computed tomography (CT) and magnetic resonance imaging (MRI) prompted the idea of using these techniques for postmortem documentation of forensic findings. Until now, only a few institutes of forensic medicine have acquired experience in postmortem cross-sectional imaging. Protocols, image interpretation and visualization have to be adapted to postmortem conditions. In particular, postmortem alterations such as putrefaction and livores, the different temperature of the corpse and the loss of circulation pose a challenge for the imaging process and its interpretation. Advantages of postmortem imaging are the higher exposure and resolution available in CT when there is no concern for the biologic effects of ionizing radiation, and the lack of cardiac motion artifacts during scanning. CT and MRI may become useful tools for postmortem documentation in forensic medicine. In Bern, 80 human corpses underwent postmortem imaging by CT and MRI prior to traditional autopsy through August 2003. Here, we describe the imaging appearance of postmortem alterations--internal livores, putrefaction, postmortem clotting--and distinguish them from forensic findings of the heart, such as calcification, endocarditis, myocardial infarction, myocardial scarring, injury and other morphological alterations.
Abstract:
The ecosystem services (ES) concept is becoming a cornerstone of contemporary sustainability thought. Challenges with this concept and its applications are well documented, but have not yet been systematically assessed alongside its strengths and the external factors that influence uptake. Such an assessment could form the basis for improving ES thinking and further embedding it in environmental decisions and management. The Young Ecosystem Services Specialists (YESS) completed a Strengths–Weaknesses–Opportunities–Threats (SWOT) analysis of ES through member surveys. Strengths include the approach being interdisciplinary and a useful communication tool. Weaknesses include an incomplete scientific basis, inconsistently applied frameworks, and difficulty accounting for nature's intrinsic value. Opportunities include alignment with existing policies and established methodologies, and increasing environmental awareness. Threats include resistance to change and difficulty with interdisciplinary collaboration. Consideration of the SWOT themes suggested five strategic areas for developing and implementing ES. The ES concept could improve decision-making related to natural resource use and the interpretation of the complexities of human-nature interactions. It is contradictory – valued as a simple means of communicating the importance of conservation, whilst also considered an oversimplification characterised by ambiguous language. Nonetheless, given sufficient funding and political will, the ES framework could facilitate interdisciplinary research, ensuring decision-making that supports sustainable development.
Abstract:
"Hole in the Head" is a play about a woman who wakes up. Maude wakes up in the first act, and in every subsequent scene she undergoes some form of physical or emotional awakening as characters walk in and out of her front door. "Hole in the Head" is accompanied by an introduction that attempts to understand the interplay between creativity and academia through an analysis of theatre, feminist and queer theory, and science.
Abstract:
Alessandro Baricco is an Italian author, pianist, journalist and music critic, among many other talents. His novels have won great critical acclaim in Italy and France and are popular around the world. While he is generally considered among the postmodern writers, some critics have accused him of being a forerunner of a 1990s movement dubbed letteratura giovanile: juvenile literature that is simplistic, targets a young audience and is created for the sole purpose of making money. This criticism is unwarranted. Baricco is a multitalented author who pays strict attention to the quality of his work and weaves plotlines replete with a diverse set of genres, literary devices and symbolism, often inspired by other great writers and thinkers. However, literary critics have yet to acknowledge one of Baricco's strongest and most important influences: Homer, the ancient Greek bard and author of the epic poems the Iliad and the Odyssey. Placing Baricco's work in a Homeric context can aid in viewing it as valid and important work, worthy of scholarly discussion and interpretation, rather than, as some critics accuse, a one-dimensional story meant only for children. This paper will argue that Baricco's work is Homeric and that, in fact, Baricco's implementation of many of Homer's devices, such as his understanding of his audience and his use of rhythmic language and stereotyped story patterns, has aided Baricco's great success and popularity.
Abstract:
The investigator conducted an action-oriented investigation of pregnancy and birth among the women of Mesa los Hornos, an urban squatter slum in Mexico City. Three aims guided the project: (1) to obtain information for improving prenatal and maternity service utilization; (2) to examine the utility of rapid ethnographic and epidemiologic assessment methodologies; (3) to cultivate community involvement in health development. Viewing service utilization as a culturally-bound decision, the study included a qualitative phase to explore women's cognition of pregnancy and birth, their perceived needs during pregnancy, and their criteria of service acceptability. A probability-based community survey delineated parameters of service utilization and pregnancy health events, and probed reasons for decisions to use medical services, lay midwives, or other sources of prenatal and labor and delivery assistance. A qualitative survey of service providers at relevant clinics, hospitals, and practices contributed information on service availability and access, and on coordination among the private, social security, and public assistance health service sectors. The ethnographic approach to exploring the rationale for use or non-use of services provided a necessary complement to conventional barrier-based assessment, to inform planning of culturally appropriate interventions. Information collection and interpretation were conducted under the aegis of an advisory committee of community residents and service agency representatives; the residents' committee formulated recommendations for action based on findings, and forwarded the mandate to governmental social and urban development offices. Recommendations were designed to inform and develop community participation in health care decision-making. Rapid research methods are powerful tools for achieving community-based empowerment toward investigation and resolution of local health problems.
But while ethnography works well in synergy with quantitative assessment approaches to strengthen the validity and richness of short-term field work, the author strongly urges caution in the application of Rapid Ethnographic Assessments. An ethnographic sensibility is essential to the research enterprise for the development of an active and cooperative community base, the design and use of quantitative instruments, the appropriate use of qualitative techniques, and the interpretation of culturally-oriented information. However, prescribed and standardized Rapid Ethnographic Assessment techniques are counter-productive if used as research short-cuts before locale- and subject-specific cultural understanding is achieved.
Abstract:
Clinical text understanding (CTU) is of interest to health informatics because critical clinical information, frequently represented as unconstrained text in electronic health records, is extensively used by human experts to guide clinical practice and decision making and to document delivery of care, but is largely unusable by information systems for queries and computations. Recent initiatives advocating for translational research call for technologies that can integrate structured clinical data with unstructured data, provide a unified interface to all data, and contextualize clinical information for reuse in the multidisciplinary and collaborative environment envisioned by the CTSA program. This implies that technologies for the processing and interpretation of clinical text should be evaluated not only in terms of their validity and reliability in their intended environment, but also in light of their interoperability and their ability to support information integration and contextualization in a distributed and dynamic environment. This vision adds a new layer of information representation requirements that needs to be accounted for when conceptualizing the implementation or acquisition of clinical text processing tools and technologies for multidisciplinary research. On the other hand, electronic health records frequently contain unconstrained clinical text with high variability in the use of terms and documentation practices, and without commitment to the grammatical or syntactic structure of the language (e.g. triage notes, physician and nurse notes, chief complaints, etc.). This hinders the performance of natural language processing technologies, which typically rely heavily on the syntax of language and the grammatical structure of the text.
This document introduces our method to transform unconstrained clinical text found in electronic health information systems into a formal (computationally understandable) representation that is suitable for querying, integration, contextualization and reuse, and is resilient to the grammatical and syntactic irregularities of clinical text. We present our design rationale, method, and results of an evaluation in processing chief complaints and triage notes from 8 different emergency departments in Houston, Texas. At the end, we discuss the significance of our contribution in enabling the use of clinical text in a practical bio-surveillance setting.
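A minimal sketch of the kind of syntax-free normalization described above: chief-complaint free text is mapped to concept codes by dictionary phrase matching, ignoring grammar and word order. The lexicon, abbreviations, and concept codes here are invented for illustration; a real system would map to a standard clinical vocabulary rather than these toy codes.

```python
import re

# Hypothetical toy lexicon; substring matching like this would need
# word-boundary handling in a real system.
LEXICON = {
    "shortness of breath": "C_SHORTNESS_OF_BREATH",
    "sob": "C_SHORTNESS_OF_BREATH",
    "chest pain": "C_CHEST_PAIN",
    "cp": "C_CHEST_PAIN",
    "fever": "C_FEVER",
}

def normalize(chief_complaint):
    """Map free text to concept codes by matching dictionary phrases,
    independent of grammar and word order.  Longer phrases are tried
    first so 'chest pain' wins over shorter abbreviations."""
    text = chief_complaint.lower()
    text = re.sub(r"[^\w\s/]", " ", text)   # strip punctuation except '/'
    found = []
    for phrase in sorted(LEXICON, key=len, reverse=True):
        if phrase in text:
            found.append(LEXICON[phrase])
            text = text.replace(phrase, " ")  # consume the matched span
    return sorted(set(found))

codes = normalize("Pt c/o CP + SOB, fever x2 days")
```

The point of the sketch is resilience: the shorthand-laden input yields the same codes as a grammatical sentence would, with no parsing required.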
Abstract:
Essential biological processes are governed by organized, dynamic interactions between multiple biomolecular systems. Complexes are thus formed to enable the biological function and are disassembled as the process is completed. Examples of such processes include the translation of messenger RNA into protein by the ribosome, the folding of proteins by chaperonins, and the entry of viruses into host cells. Understanding these fundamental processes by characterizing the molecular mechanisms that enable them would allow the (better) design of therapies and drugs. Such molecular mechanisms may be revealed through the structural elucidation of the biomolecular assemblies at the core of these processes. Various experimental techniques may be applied to investigate the molecular architecture of biomolecular assemblies. High-resolution techniques, such as X-ray crystallography, may solve the atomic structure of the system, but are typically constrained to biomolecules of reduced flexibility and dimensions. In particular, X-ray crystallography requires the sample to form a three-dimensional (3D) crystal lattice, which is technically difficult, if not impossible, to obtain, especially for large, dynamic systems. Often these techniques solve the structure of the different constituent components within the assembly, but encounter difficulties when investigating the entire system. On the other hand, imaging techniques, such as cryo-electron microscopy (cryo-EM), are able to depict large systems in a near-native environment, without requiring the formation of crystals. The structures solved by cryo-EM cover a wide range of resolutions, from very low levels of detail where only the overall shape of the system is visible, to high resolutions that approach, but do not yet reach, atomic level of detail.
In this dissertation, several modeling methods are introduced to either integrate cryo-EM datasets with structural data from X-ray crystallography, or to directly interpret the cryo-EM reconstruction. These computational techniques were developed with the goal of creating an atomic model for the cryo-EM data. Low-resolution reconstructions lack the level of detail to permit a direct atomic interpretation, i.e. one cannot reliably locate the atoms or amino-acid residues within the structure obtained by cryo-EM. One therefore needs to consider additional information, for example structural data from other sources such as X-ray crystallography, in order to enable such a high-resolution interpretation. Modeling techniques are thus developed to integrate the structural data from the different biophysical sources; examples include the work described in manuscripts I and II of this dissertation. At intermediate and high resolution, cryo-EM reconstructions depict consistent 3D folds, such as tubular features which in general correspond to alpha-helices. Such features can be annotated and later used to build the atomic model of the system, as in manuscript III. Three manuscripts are presented as part of this PhD dissertation, each introducing a computational technique that facilitates the interpretation of cryo-EM reconstructions. The first manuscript is an application paper that describes a heuristic to generate the atomic model for the protein envelope of the Rift Valley fever virus. The second manuscript introduces evolutionary tabu search strategies to enable the integration of multiple component atomic structures with the cryo-EM map of their assembly. Finally, the third manuscript develops the latter technique further and applies it to annotate consistent 3D patterns in intermediate-resolution cryo-EM reconstructions.
The first manuscript, titled An assembly model for Rift Valley fever virus, was submitted for publication in the Journal of Molecular Biology. The cryo-EM structure of the Rift Valley fever virus was previously solved at 27 Å resolution by Dr. Freiberg and collaborators. This reconstruction shows the overall shape of the virus envelope, yet the reduced level of detail prevents a direct atomic interpretation. High-resolution structures are not yet available for the entire virus, nor for the two different component glycoproteins that form its envelope. However, homology models may be generated for these glycoproteins based on similar structures that are available at atomic resolution. The manuscript presents the steps required to identify an atomic model of the entire virus envelope, based on the low-resolution cryo-EM map of the envelope and the homology models of the two glycoproteins. Starting with the results of an exhaustive search to place the two glycoproteins, the model is built iteratively by running multiple multi-body refinements to hierarchically generate models for the different regions of the envelope. The generated atomic model is supported by prior knowledge of virus biology and contains valuable information about the molecular architecture of the system. It provides the basis for further investigations seeking to reveal different processes in which the virus is involved, such as assembly or fusion. The second manuscript was recently published in the Journal of Structural Biology (doi:10.1016/j.jsb.2009.12.028) under the title Evolutionary tabu search strategies for the simultaneous registration of multiple atomic structures in cryo-EM reconstructions. This manuscript introduces the evolutionary tabu search strategies applied to enable multi-body registration. This technique is a hybrid approach that combines a genetic algorithm with a tabu search strategy to promote proper exploration of the high-dimensional search space.
Similar to the Rift Valley fever virus, it is common that the structure of a large multi-component assembly is available at low resolution from cryo-EM, while high-resolution structures are solved for the different components but are lacking for the entire system. Evolutionary tabu search strategies enable the building of an atomic model for the entire system by considering the different components simultaneously. Such registration indirectly introduces spatial constraints, as all components need to be placed within the assembly, enabling proper docking into the low-resolution map of the entire assembly. Along with the method description, the manuscript covers the validation, presenting the benefit of the technique in both synthetic and experimental test cases. This approach successfully docked multiple components at resolutions up to 40 Å. The third manuscript is entitled Evolutionary Bidirectional Expansion for the Annotation of Alpha Helices in Electron Cryo-Microscopy Reconstructions and was submitted for publication in the Journal of Structural Biology. The modeling approach described in this manuscript applies the evolutionary tabu search strategies in combination with a bidirectional expansion to annotate secondary structure elements in intermediate-resolution cryo-EM reconstructions. In particular, secondary structure elements such as alpha helices show consistent patterns in cryo-EM data, and are visible as rod-like regions of high density. The evolutionary tabu search strategy is applied to identify the placement of the different alpha helices, while the bidirectional expansion characterizes their length and curvature. The manuscript presents the validation of the approach at resolutions ranging between 6 and 14 Å, a level of detail where alpha helices are visible. Up to a resolution of 12 Å, the method measures sensitivities between 70 and 100% as estimated in experimental test cases, i.e.
70-100% of the alpha-helices were correctly predicted in an automatic manner in the experimental data. The three manuscripts presented in this PhD dissertation cover different computational methods for the integration and interpretation of cryo-EM reconstructions. The methods were developed in the molecular modeling software Sculptor (http://sculptor.biomachina.org) and are available to the scientific community interested in the multi-resolution modeling of cryo-EM data. The work spans a wide range of resolutions, covering multi-body refinement and registration at low resolution along with annotation of consistent patterns at high resolution. Such methods are essential for the modeling of cryo-EM data, and may be applied in other fields where similar spatial problems are encountered, such as medical imaging.
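The hybrid strategy at the heart of the second manuscript, a genetic algorithm combined with a tabu list that forbids revisiting recently explored regions, can be illustrated on a toy placement problem. Everything below is an invented stand-in: the fitness function substitutes for a cryo-EM cross-correlation score, and the parameters and search space are not those of the published method.

```python
import random

def fitness(x, target=(2.0, -1.0, 0.5)):
    """Toy stand-in for a cross-correlation score: higher is better
    as the candidate placement x approaches the hidden target."""
    return -sum((a - b) ** 2 for a, b in zip(x, target))

def mutate(x, step=0.5):
    return tuple(a + random.uniform(-step, step) for a in x)

def crossover(x, y):
    return tuple(random.choice(pair) for pair in zip(x, y))

def key(x):
    # Coarse rounding so the tabu list forbids a neighbourhood,
    # not just one exact floating-point solution.
    return tuple(round(a, 1) for a in x)

def ga_tabu(generations=300, pop_size=20, tabu_len=50, seed=1):
    random.seed(seed)
    pop = [tuple(random.uniform(-5, 5) for _ in range(3))
           for _ in range(pop_size)]
    tabu = []
    best = max(pop, key=fitness)
    for _ in range(generations):
        parents = sorted(pop, key=fitness, reverse=True)[:pop_size // 2]
        children = []
        while len(children) < pop_size:
            child = mutate(crossover(*random.sample(parents, 2)))
            if key(child) in tabu:        # tabu: skip revisited regions
                continue
            tabu.append(key(child))
            tabu = tabu[-tabu_len:]
            children.append(child)
        pop = children
        best = max(pop + [best], key=fitness)   # elitism
    return best

best = ga_tabu()
```

The tabu list is what distinguishes this from a plain genetic algorithm: by rejecting offspring that land in recently visited cells, the search is pushed out of local optima and explores the space more evenly.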
Abstract:
High-frequency data collected continuously over a multiyear time frame are required for investigating the various agents that drive ecological and hydrodynamic processes in estuaries. Here, we present water quality and current in-situ observations from a fixed monitoring station operating from 2008 to 2014 in the lower Guadiana Estuary, southern Portugal (37°11.30' N, 7°24.67' W). The data were recorded by a multi-parametric probe providing hourly records (temperature, salinity, chlorophyll, dissolved oxygen, turbidity, and pH) at a water depth of ~1 m, and by a bottom-mounted acoustic Doppler current profiler measuring the pressure, near-bottom temperature, and flow velocity through the water column every 15 min. The time-series data, in particular the probe records, contain substantial gaps arising from equipment failure and maintenance, which are unavoidable with this type of observation in harsh environments. However, prolonged (months-long) periods of multi-parametric observations under contrasting external forcing conditions are available. The raw data are reported together with flags indicating the quality status of each record. River discharge data from two hydrographic stations located near the estuary head are also provided to support data analysis and interpretation.
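Working with such a flagged, gappy record typically starts by masking out bad or missing values and summarizing the gaps. A minimal sketch, with invented values and an illustrative flag convention (0 = good, 4 = bad) rather than the dataset's actual scheme:

```python
import numpy as np

# Toy hourly salinity record with a gap (NaN) and per-record quality
# flags; values and flag codes are illustrative only.
salinity = np.array([35.1, 35.2, np.nan, 34.9, 2.0, 35.0])
flags    = np.array([0,    0,    4,      0,    4,   0])

# Keep only records that are flagged good and not missing.
good = (flags == 0) & ~np.isnan(salinity)
clean = np.where(good, salinity, np.nan)

# Simple gap report: lengths of consecutive bad/missing runs.
runs, count = [], 0
for ok in good:
    if ok:
        if count:
            runs.append(count)
        count = 0
    else:
        count += 1
if count:
    runs.append(count)
```

Keeping rejected values as NaN (rather than deleting them) preserves the hourly time base, which matters when comparing against the 15-min current-profiler series.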
Abstract:
A joint research expedition between the French IFREMER and the German MARUM was conducted in 2011 using the R/V 'Pourquoi pas?' to study gas hydrate distributions in a pockmark field (1141-1199 m below sea surface) at the continental margin of Nigeria. The seafloor drill rig MeBo of MARUM was used to recover sediments as deep as 56.74 m below the seafloor. The presence of gas hydrates in specific core sections was deduced from temperature anomalies recorded by continuous infrared thermal scanning and from anomalies in pore water chloride concentrations. In situ sediment temperature measurements showed elevated geothermal gradients of up to 258 °C/km in the center of the so-called pockmark A, up to 4.6 times higher than in the background sediment (72 °C/km). The gas hydrate distribution and thermal regime in the pockmark are largely controlled by the intensity, periodicity and direction of fluid flow. The joint interaction between fluid flow, gas hydrate formation and dissolution, and the thermal regime governs pockmark formation and evolution on the Nigerian continental margin.
Abstract:
Group IV nanostructures have attracted a great deal of attention because of their potential applications in optoelectronics and nanodevices. Raman spectroscopy has been extensively used to characterize nanostructures since it provides non-destructive information about their size through adequate modeling of the phonon confinement effect. The Raman spectrum is also sensitive to other factors, such as stress and temperature, which can mix with the size effects and blur the interpretation of the Raman spectrum. We present herein an analysis of the Raman spectra obtained for Si and SiGe nanowires; the influence of the excitation conditions and the heat dissipation media is discussed in order to optimize the experimental conditions for reliable spectrum acquisition and interpretation.
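Phonon confinement models of this kind are commonly evaluated, in the Richter/Campbell-Fauchet spirit, as a Brillouin-zone integral of a Lorentzian weighted by a Gaussian confinement function: smaller sizes admit larger phonon wavevectors, downshifting and broadening the peak. The one-parameter Si dispersion, linewidth, and Gaussian convention below are textbook-style approximations chosen for illustration, not the authors' fitted model.

```python
import numpy as np

W0 = 520.5    # bulk Si optical-phonon frequency (cm^-1), approximate
GAMMA = 3.5   # natural linewidth (cm^-1), illustrative

def omega(q):
    """Approximate optical-phonon dispersion; q in units of 2*pi/a."""
    return W0 - 120.0 * q ** 2

def confined_spectrum(w, L_over_a):
    """Raman intensity at shift w (cm^-1) for a confinement length L
    given in lattice constants a: Gaussian-weighted Lorentzians summed
    over the zone (simple Riemann sum over reduced q)."""
    q = np.linspace(1e-4, 1.0, 2000)
    weight = np.exp(-(q * L_over_a) ** 2 / 4.0)   # confinement function
    lorentz = 1.0 / ((w - omega(q)) ** 2 + (GAMMA / 2.0) ** 2)
    return float(np.sum(weight * lorentz * q ** 2) * (q[1] - q[0]))

shifts = np.linspace(500, 530, 301)
bulklike = [confined_spectrum(w, 50.0) for w in shifts]  # weak confinement
small = [confined_spectrum(w, 9.0) for w in shifts]      # ~5 nm wire
peak_bulk = float(shifts[int(np.argmax(bulklike))])
peak_small = float(shifts[int(np.argmax(small))])
```

The qualitative behaviour is the point: the weakly confined spectrum peaks at the bulk frequency, while the small-size spectrum is shifted to lower wavenumber, which is the size signature that stress and heating can mimic.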
Abstract:
An increasing number of neuroimaging studies are concerned with the identification of interactions or statistical dependencies between brain areas. Dependencies between the activities of different brain regions can be quantified with functional connectivity measures such as the cross-correlation coefficient. An important factor limiting the accuracy of such measures is the amount of empirical data available. For event-related protocols, the amount of data also affects the temporal resolution of the analysis. We use analytical expressions to calculate the amount of empirical data needed to establish whether a certain level of dependency is significant when the time series are autocorrelated, as is the case for biological signals. These analytical results are then contrasted with estimates from simulations based on real data recorded with magnetoencephalography during a resting-state paradigm and during the presentation of visual stimuli. Results indicate that, for broadband signals, 50–100 s of data are required to detect a true underlying cross-correlation coefficient of 0.05. This corresponds to a resolution of a few hundred milliseconds for typical event-related recordings. The required time window increases for narrow-band signals as frequency decreases. For instance, approximately 3 times as much data is necessary for signals in the alpha band. Important implications can be derived for the design and interpretation of experiments to characterize weak interactions, which are potentially important for brain processing.
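The scaling reported above can be reproduced approximately with a back-of-the-envelope calculation: the Fisher z-transform gives the number of independent samples needed to detect a given correlation, and autocorrelation stretches the required record by roughly one decorrelation time per independent sample. This is a simplified sketch, not the study's analytical expressions, and the decorrelation times below are assumptions chosen for illustration.

```python
import math

def samples_needed(r, z_crit=1.96):
    """Independent samples needed for a true correlation r to reach
    two-sided significance, via the Fisher z-transform:
    sqrt(N - 3) * atanh(r) > z_crit."""
    return math.ceil((z_crit / math.atanh(r)) ** 2) + 3

def record_length_needed(r, decorr_time_s):
    """Autocorrelated data yield roughly one independent sample per
    decorrelation time, so the required record scales accordingly."""
    return samples_needed(r) * decorr_time_s

n = samples_needed(0.05)                        # ~1500 independent samples
broadband = record_length_needed(0.05, 0.05)    # assumed ~50 ms decorrelation
alpha_band = record_length_needed(0.05, 0.15)   # assumed slower, narrow band
```

With these assumed decorrelation times, the broadband estimate lands in the 50–100 s range and the alpha-band record is three times longer, consistent with the figures quoted in the abstract.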