Abstract:
The HUPO Proteomics Standards Initiative has developed several standardized data formats to facilitate data sharing in mass spectrometry (MS)-based proteomics. These allow researchers to report their complete results in a unified way. However, at present, there is no format to describe the final qualitative and quantitative results for proteomics and metabolomics experiments in a simple tabular format. Many downstream analysis use cases are only concerned with the final results of an experiment and require an easily accessible format, compatible with tools such as Microsoft Excel or R. We developed the mzTab file format for MS-based proteomics and metabolomics results to meet this need. mzTab is intended as a lightweight supplement to the existing standard XML-based file formats (mzML, mzIdentML, mzQuantML), providing a comprehensive summary, similar in concept to the supplemental material of a scientific publication. mzTab files can contain protein, peptide, and small molecule identifications together with experimental metadata and basic quantitative information. The format is not intended to store the complete experimental evidence but provides mechanisms to report results at different levels of detail. These range from a simple summary of the final results to a representation of the results including the experimental design. This format is ideally suited to make MS-based proteomics and metabolomics results available to a wider biological community outside the field of MS. Several software tools for proteomics and metabolomics have already adopted the format as an output format. The comprehensive mzTab specification document and extensive additional documentation can be found online.
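Because mzTab is a plain tab-delimited format, its sections can be read with generic tools. Below is a minimal sketch (not from the paper; the file name and column choices are illustrative) of pulling the protein section into a pandas DataFrame, assuming the mzTab 1.0 line prefixes (PRH for the protein table header, PRT for protein rows):

```python
import pandas as pd

def read_mztab_proteins(path):
    """Return the protein section of an mzTab file as a DataFrame."""
    header, rows = None, []
    with open(path) as fh:
        for line in fh:
            fields = line.rstrip("\n").split("\t")
            if fields[0] == "PRH":    # protein table header line
                header = fields[1:]
            elif fields[0] == "PRT":  # one protein identification per line
                rows.append(fields[1:])
    return pd.DataFrame(rows, columns=header)

# proteins = read_mztab_proteins("results.mzTab")   # hypothetical file
# print(proteins[["accession", "description"]].head())
```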
Abstract:
The functional architecture of the occipital cortex is being studied in increasing detail. Functional and structural MR-based imaging are altering views about the organisation of the human visual system. Recent advances have ranged from comparative studies with non-human primates to predictive scanning. The latter multivariate technique describes, with sub-voxel resolution, patterns of activity that are characteristic of specific visual experiences: one can deduce what a subject experienced visually from the pattern of cortical activity recorded. The challenge for the future is to understand visual functions in terms of cerebral computations at a mesoscopic level of description and to relate this information to electrophysiology. The principal medical application of this new knowledge has focused to a large extent on plasticity and the capacity for functional reorganisation. Cross-modal visual-sensory interactions and cross-correlations between visual and other cerebral areas in the resting state are areas of considerable current interest. The lecture will review findings over the last two decades and reflect on possible roles for imaging studies in the future.
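As an illustration of the decoding idea mentioned above (not part of the lecture), here is a hedged sketch of multivariate pattern classification with scikit-learn; the voxel data, array sizes, and injected signal are all simulated for the example:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 200
labels = rng.integers(0, 2, n_trials)             # two visual conditions
patterns = rng.normal(size=(n_trials, n_voxels))  # simulated voxel responses
patterns[labels == 1, :20] += 0.5                 # weak condition-specific signal

# Above-chance cross-validated accuracy means the experienced stimulus
# can be decoded from the distributed activity pattern.
scores = cross_val_score(LinearSVC(dual=False), patterns, labels, cv=5)
print(f"decoding accuracy: {scores.mean():.2f}")
```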
Abstract:
The effect of copper (Cu) filtration on image quality and dose in different digital X-ray systems was investigated. Two computed radiography systems and one digital radiography detector were used. Three different polymethylmethacrylate blocks simulated the pediatric body. The effects of Cu filters of 0.1, 0.2, and 0.3 mm thickness on the entrance surface dose (ESD) and the corresponding effective doses (EDs) were measured at tube voltages of 60, 66, and 73 kV. Image quality was evaluated in a contrast-detail phantom with automated analyzer software. Cu filters of 0.1, 0.2, and 0.3 mm thickness decreased the ESD by 25-32%, 32-39%, and 40-44%, respectively, the ranges depending on the respective tube voltage. There was no consistent decline in image quality with increasing Cu filtration. The estimated ED of anterior-posterior (AP) chest projections was reduced by up to 23%. No relevant reduction in the ED was noted in AP radiographs of the abdomen and pelvis or in posterior-anterior radiographs of the chest. Cu filtration reduces the ESD but generally does not reduce the effective dose. Cu filters can help protect radiosensitive superficial organs, such as the mammary glands, in AP chest projections.
Abstract:
Stratigraphic and petrographic analysis of the Cretaceous to Eocene Tibetan sedimentary succession has allowed us to reinterpret in detail the sequence of events which led to closure of Neotethys and continental collision in the NW Himalaya. During the Early Cretaceous, the Indian passive margin recorded basaltic magmatic activity. Albian volcanic arenites, probably related to a major extensional tectonic event, are unconformably overlain by an Upper Cretaceous to Paleocene carbonate sequence, with a major quartzarenite episode triggered by the global eustatic sea-level fall at the Cretaceous/Tertiary boundary. At the same time, Neotethyan oceanic crust was being subducted beneath Asia, as testified by calc-alkalic volcanism and forearc basin sedimentation in the Transhimalayan belt. Onset of collision and obduction of the Asian accretionary wedge onto the Indian continental rise was recorded by shoaling of the outer shelf at the Paleocene/Eocene boundary, related to flexural uplift of the passive margin. A few My later, foreland basin volcanic arenites derived from the uplifted Asian subduction complex onlapped onto the Indian continental terrace. All along the Himalaya, marine facies were rapidly replaced by continental redbeds in collisional basins on both sides of the ophiolitic suture. Next, foreland basin sedimentation was interrupted by fold-thrust deformation and final ophiolite emplacement. The observed sequence of events compares favourably with theoretical models of rifted margin to overthrust belt transition and shows that initial phases of continental collision and obduction were completed within 10 to 15 My, with formation of a proto-Himalayan chain by the end of the middle Eocene.
Abstract:
BACKGROUND AND PURPOSE: Risk factors for ischemic stroke (IS) in young adults differ between genders and evolve with age, but data on the age- and gender-specific differences by stroke etiology are scarce. These features were compared based on individual patient data from 15 European stroke centers. METHODS: Stroke etiology was reported in detail for 3331 patients aged 15-49 years with first-ever IS according to the Trial of Org 10172 in Acute Stroke Treatment (TOAST) criteria: large-artery atherosclerosis (LAA), cardioembolism (CE), small-vessel occlusion (SVO), other determined etiology, or undetermined etiology. CE was categorized into low- and high-risk sources. The other determined group was divided into dissection and other non-dissection causes. Comparisons were done using logistic regression, adjusting for age, gender, and center heterogeneity. RESULTS: Etiology remained undetermined in 39.6%. Other determined etiology was found in 21.6%, CE in 17.3%, SVO in 12.2%, and LAA in 9.3%. Other determined etiology was more common in females and younger patients, with cervical artery dissection being the single most common etiology (12.8%). CE was more common in younger patients. Within CE, the most frequent high-risk sources were atrial fibrillation/flutter (15.1%) and cardiomyopathy (11.5%). LAA, high-risk sources of CE, and SVO were more common in males. LAA and SVO showed an increasing frequency with age. No significant differences in etiologic distribution were found amongst southern, central, or northern Europe. CONCLUSIONS: The etiology of IS in young adults has clear gender-specific patterns that change with age. A notable portion of these patients remains without an evident stroke mechanism according to TOAST criteria.
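A hedged sketch of the kind of adjusted comparison described in the methods (not the study's actual code; the file name, the column names, and the fixed-effects treatment of center are assumptions for illustration):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Assumed columns: laa (0/1 indicator for large-artery atherosclerosis),
# age (years), male (0/1), center (categorical center identifier).
df = pd.read_csv("young_stroke_patients.csv")  # hypothetical file

# Center heterogeneity handled as fixed effects here, purely for illustration.
fit = smf.logit("laa ~ age + male + C(center)", data=df).fit()
print(np.exp(fit.params))  # odds ratios for the adjusted comparisons
```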
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field.
The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach proves highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
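The gradual-deformation idea can be sketched compactly: a cos/sin blend of two independent standard Gaussian fields is again standard Gaussian, so the proposal respects the prior while the blending angle tunes the perturbation strength. A minimal sketch (not the thesis code; the field size and the value of theta are arbitrary):

```python
import numpy as np

def gradual_deformation_proposal(m_current, theta, rng):
    """Blend the current Gaussian field with an independent draw.

    cos(theta)*m1 + sin(theta)*m2 is standard Gaussian whenever m1 and m2
    are, so the proposal stays consistent with the prior; a small theta
    gives a small perturbation, theta = pi/2 an independent redraw.
    """
    m_independent = rng.standard_normal(m_current.shape)
    return np.cos(theta) * m_current + np.sin(theta) * m_independent

rng = np.random.default_rng(42)
m = rng.standard_normal((64, 64))   # current model realization
m_proposed = gradual_deformation_proposal(m, theta=0.2, rng=rng)
```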
Abstract:
Background and aim: Hepatitis E virus (HEV) infection has emerged as a cause of travel-related and autochthonous acute hepatitis as well as chronic hepatitis in immunosuppressed patients. While travel-related cases are caused primarily by infections with HEV of genotype 1 (HEV-1), autochthonous cases and chronic cases are due to genotype 3 (HEV-3), which is shared between humans and diverse animal species. The aim of this study was to establish HEV RNA detection assays for quantitative viral load testing and genotyping. Methods: Viral RNA was purified from plasma or serum and converted to cDNA prior to (1) multiplex real-time PCR for HEV RNA quantification and (2) multiplex PCR coupled to DNA sequencing for HEV genotype determination. The real-time PCR was designed to match all known HEV genotypes available in GenBank, while the genotyping PCR was designed using conserved primers flanking a variable region of the HEV RNA. Results: In a validation panel, the newly developed assays allowed for the reliable detection and genotyping of HEV-1 or HEV-3. Cases of travel-related and autochthonous acute hepatitis E as well as chronic hepatitis E in immunosuppressed patients have been identified using these assays and will be presented in detail. Anti-HEV antibodies were negative in three well-characterized patients with chronic hepatitis E after organ transplantation. Conclusions: We developed and validated a quantitative HEV RNA detection assay that can now be offered on a routine basis (www.chuv.ch/imul/imu-collaborations-viral_hepatitis). Genotyping can also be offered in selected cases. HEV RNA detection is key in diagnosing chronic hepatitis E in immunosuppressed patients with unexplained transaminase elevations, as serology can be negative in these patients.
Abstract:
The amino acid cysteine has long been known to be toxic at elevated levels for bacteria, fungi, and humans. However, mechanisms of cysteine tolerance in microbes remain largely obscure. Here we show that the human pathogenic yeast Candida albicans excretes sulfite when confronted with increasing cysteine concentrations. Mutant construction and phenotypic analysis revealed that sulfite formation from cysteine in C. albicans relies on cysteine dioxygenase Cdg1, an enzyme with similar functions in humans. Environmental cysteine induced not only the expression of the CDG1 gene in C. albicans, but also the expression of SSU1, encoding a putative sulfite efflux pump. Accordingly, the deletion of SSU1 resulted in enhanced sensitivity of the fungal cells to both cysteine and sulfite. To study the regulation of sulfite/cysteine tolerance in more detail, we screened a C. albicans library of transcription factor mutants in the presence of sulfite. This approach and subsequent independent mutant analysis identified the zinc cluster transcription factor Zcf2 to govern sulfite/cysteine tolerance, as well as cysteine-inducible SSU1 and CDG1 gene expression. cdg1Δ and ssu1Δ mutants displayed reduced hypha formation in the presence of cysteine, indicating a possible role of the newly proposed mechanisms of cysteine tolerance and sulfite secretion in the pathogenicity of C. albicans. Moreover, cdg1Δ mutants induced delayed mortality in a mouse model of disseminated infection. Since sulfite is toxic and a potent reducing agent, its production by C. albicans suggests diverse roles during host adaptation and pathogenicity.
Abstract:
To provide nursing practice with evidence, it is important to understand nursing phenomena in detail. Therefore, good descriptions are needed, including the identification of characteristics and attributes of nursing phenomena on various levels of abstraction, i.e., concepts. In this article the significance of concept development for nursing science is demonstrated by drawing on the example of 'transitoriness'. The evolutionary concept analysis proposed by Rodgers (2000) is introduced in more detail and then applied to present the phenomenon of transitoriness. The phenomenon's characteristics and attributes are identified, as well as potential areas of application. Moreover, areas are outlined in which interventions for nursing practice can be developed, implemented, and evaluated. Thus, nursing practice is updated to include new findings and innovations. Through concept analysis, nursing phenomena can be described in more detail, enhanced, or broadened for use in nursing practice. Such structured processes as concept analysis can be employed successfully for other nursing phenomena. Concept analyses can lead to the identification of tasks for the respective scientific discipline and professionals, and thus to the concretisation of tasks in nursing.
Abstract:
The last ten years of research in the field of innate immunity have been incredibly fertile: the transmembrane Toll-like receptors (TLRs) were discovered as guardians protecting the host against microbial attacks, and the pathways emerging from them were characterized in detail. More recently, cytoplasmic sensors were identified, which are capable of detecting not only microbial but also self molecules. Importantly, while such receptors trigger crucial host responses to microbial insult, over-activity of some of them has been linked to autoinflammatory disorders, demonstrating the importance of tightly regulating their actions over time and space. Here, we provide an overview of recent findings covering this area of innate and inflammatory responses that originate from the cytoplasm.
Abstract:
ABSTRACT This dissertation investigates the nature of space-time as described by the theory of general relativity. It mainly argues that space-time can be naturally interpreted as a physical structure in the precise sense of a network of concrete space-time relations among concrete space-time points that do not possess any intrinsic properties or any intrinsic identity. Such an interpretation is fundamentally based on two related key features of general relativity, namely substantive general covariance and background independence, where substantive general covariance is understood as a gauge-theoretic invariance under active diffeomorphisms and background independence is understood in the sense that the metric (or gravitational) field is dynamical and that, strictly speaking, it cannot be uniquely split into a purely gravitational part and a fixed purely inertial part or background. More broadly, a precise notion of (physical) structure is developed within the framework of a moderate version of structural realism understood as a metaphysical claim about what there is in the world. The development of this moderate structural realism pursues two main aims. The first is purely metaphysical: to develop a coherent metaphysics of structures and of objects (particular attention is paid to the questions of identity and individuality of the latter within this structural realist framework). The second is to argue that moderate structural realism provides a convincing interpretation of the world as described by fundamental physics, and in particular of space-time as described by general relativity. This structuralist interpretation of space-time is discussed within the traditional substantivalist-relationalist debate, which is best understood within the broader framework of the question about the relationship between space-time on the one hand and matter on the other. In particular, it is claimed that space-time structuralism does not constitute a 'tertium quid' in the traditional debate. Some new light on the question of the nature of space-time may be shed by the fundamental foundational issue of space-time singularities. Their possible 'non-local' (or global) character is discussed in some detail, and it is argued that a broad structuralist conception of space-time may provide a physically meaningful understanding of space-time singularities, one that is not plagued by the conceptual difficulties of the usual atomistic framework. Indeed, part of these difficulties may come from the standard differential geometric description of space-time, which encodes this atomistic framework to some extent; this raises the question of the importance of the mathematical formalism for the interpretation of space-time.
Abstract:
Profiling miRNA levels in cells with miRNA microarrays is becoming a widely used technique. Although normalization methods for mRNA gene expression arrays are well established, miRNA array normalization has so far not been investigated in detail. In this study we investigate the impact of normalization on data generated with the Agilent miRNA array platform. We have developed a method to select nonchanging miRNAs (invariants) and use them to compute linear regression normalization coefficients or variance stabilizing normalization (VSN) parameters. We compared the invariants normalization to normalization by scaling, quantile, and VSN with default parameters, as well as to no normalization, using samples with strong differential expression of miRNAs (heart-brain comparison) and samples where only a few miRNAs are affected (p53 overexpression in squamous carcinoma cells versus control). All normalization methods performed better than no normalization. Normalization procedures based on the set of invariants and quantile were the most robust over all experimental conditions tested. Our method of invariant selection and normalization is not limited to Agilent miRNA arrays and can be applied to other data sets, including those from one-color miRNA microarray platforms, focused gene expression arrays, and gene expression analysis using quantitative PCR.
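A hedged sketch of the invariant-based normalization idea described above (not the authors' implementation; the rank-stability criterion, the number of invariants, and the choice of the first array as reference are assumptions for illustration):

```python
import numpy as np

def invariant_normalize(X, n_invariants=50):
    """X: probes x arrays matrix of log intensities."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)  # per-array probe ranks
    stability = ranks.std(axis=1)                      # rank variability across arrays
    invariants = np.argsort(stability)[:n_invariants]  # most rank-stable probes
    reference = X[:, 0]                                # first array as reference
    X_norm = np.empty_like(X, dtype=float)
    for j in range(X.shape[1]):
        slope, intercept = np.polyfit(X[invariants, j], reference[invariants], 1)
        X_norm[:, j] = slope * X[:, j] + intercept     # linear regression rescaling
    return X_norm

# X = np.log2(raw_intensities)   # hypothetical probes x arrays matrix
# X_norm = invariant_normalize(X)
```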
Abstract:
With the arrival of several new antivirals and the growing molecular and clinical knowledge of hepatitis B virus (HBV) infection, therapy of hepatitis B has become complex. Clinical guidelines aim at streamlining medical attitudes: in this respect, the European Association for the Study of the Liver (EASL) recently issued clinical practice guidelines for the management of chronic hepatitis B. Guidelines drawn up by international experts nevertheless need to be adapted to local health care systems. Here, we summarise the EASL guidelines with some minor modifications to make them compatible with the particular Swiss situation, and discuss some aspects in more detail. Chronic hepatitis B is a complex disease with several phases in which host and viral factors interact: the features of this continuous interplay need to be evaluated when choosing the most appropriate treatment. The EASL guidelines recommend using, as first-line agents, the most potent antivirals available with the optimal resistance profile, in order to suppress HBV DNA as rapidly and as sustainably as possible. Once therapy has been started, the infection evolves and resistant viral strains may emerge. Rescue therapy needs to be started early, with more potent agents lacking cross-resistance.
Abstract:
Plasmodium falciparum is the parasite responsible for the most acute form of malaria in humans. Recently, the serine repeat antigen (SERA) in P. falciparum has attracted attention as a potential vaccine and drug target, and it has been shown to be a member of a large gene family. To clarify the relationships among the numerous P. falciparum SERAs and to identify orthologs of SERA5 and SERA6 in Plasmodium species affecting rodents, gene trees were inferred from nucleotide and amino acid sequence data for 33 putative SERA homologs in seven different species. (A distance method for nucleotide sequences that is specifically designed to accommodate differing GC content yielded results that were largely compatible with the amino acid tree. Standard-distance and maximum-likelihood methods for nucleotide sequences, on the other hand, yielded gene trees that differed in important respects.) To infer the pattern of duplication, speciation, and gene loss events in the SERA gene family history, the resulting gene trees were then "reconciled" with two competing Plasmodium species tree topologies that have been identified by previous phylogenetic studies. Parsimony of reconciliation was used as a criterion for selecting a gene tree/species tree pair and provided (1) support for one of the two species trees and for the core topology of the amino acid-derived gene tree, (2) a basis for critiquing fine detail in a poorly resolved region of the gene tree, (3) a set of predicted "missing genes" in some species, (4) clarification of the relationships among the P. falciparum SERAs, and (5) some information about SERA5 and SERA6 orthologs in the rodent malaria parasites. Parsimony of reconciliation and a second criterion, the implied mutational pattern at two key active sites in the SERA proteins, were also seen to be useful supplements to standard "bootstrap" analysis for inferred topologies.
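The duplication-inference core of such a reconciliation can be sketched with the standard LCA mapping: each gene-tree node is mapped to the lowest common ancestor of its leaves' species, and a node that maps to the same species-tree node as one of its children implies a duplication. A toy illustration (the species names and trees are invented; gene losses, which the full parsimony criterion also counts, are omitted):

```python
# Toy species tree: falciparum and reichenowi form clade "A", sister to rodent.
sp_parent = {"falciparum": "A", "reichenowi": "A", "A": "root", "rodent": "root"}

def sp_ancestors(s):
    out = [s]
    while s in sp_parent:
        s = sp_parent[s]
        out.append(s)
    return out

def sp_lca(a, b):
    ancestors_a = set(sp_ancestors(a))
    for s in sp_ancestors(b):
        if s in ancestors_a:
            return s

def reconcile(node):
    """node: species name (leaf) or a (left, right) pair.

    Returns (species-tree node this gene node maps to, duplication count).
    """
    if isinstance(node, str):
        return node, 0
    (lmap, ldups), (rmap, rdups) = reconcile(node[0]), reconcile(node[1])
    mapping = sp_lca(lmap, rmap)
    is_dup = mapping in (lmap, rmap)  # same mapping as a child => duplication
    return mapping, ldups + rdups + int(is_dup)

# Gene tree with two falciparum paralogs: one duplication is inferred.
gene_tree = (("falciparum", ("falciparum", "reichenowi")), "rodent")
print(reconcile(gene_tree))  # -> ('root', 1)
```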