947 results for Data pre-processing
Abstract:
BACKGROUND: We reviewed the current evidence on the benefit and harm of pre-hospital tracheal intubation and mechanical ventilation after traumatic brain injury (TBI). METHODS: We conducted a systematic literature search up to December 2007, without language restriction, to identify interventional and observational studies comparing pre-hospital intubation with other airway management (e.g. bag-valve-mask or oxygen administration) in patients with TBI. Information on study design, population, interventions, and outcomes was abstracted by two investigators and cross-checked by two others. Seventeen studies were included, with data for 15,335 patients collected from 1985 to 2004. There were 12 retrospective analyses of trauma registries or hospital databases, three cohort studies, one case-control study, and one controlled trial. Using the Brain Trauma Foundation classification of evidence, there were 14 class 3 studies, three class 2 studies, and no class 1 study. Six studies were of adults, five of children, and three of both; age groups were unclear in three studies. Maximum follow-up was 6 months or until hospital discharge. RESULTS: In 13 studies, the unadjusted odds ratios (ORs) for an effect of pre-hospital intubation on in-hospital mortality ranged from 0.17 (favouring control interventions) to 2.43 (favouring pre-hospital intubation); adjusted ORs ranged from 0.24 to 1.42. Estimates for functional outcomes after TBI were equivocal. Three studies indicated a higher risk of pneumonia associated with pre-hospital (when compared with in-hospital) intubation. CONCLUSIONS: Overall, the available evidence did not support any benefit from pre-hospital intubation and mechanical ventilation after TBI. Additional arguments, including medical and procedural aspects, need to be taken into account.
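The odds ratios cited above are the standard measure derived from 2x2 mortality tables. As a hedged aside, the snippet below shows how such an OR and its 95% confidence interval are computed; the counts are invented for illustration, not taken from the review.

```python
import math

# Hypothetical 2x2 table (NOT data from the review):
# rows = pre-hospital intubation yes/no, columns = died/survived
a, b = 30, 70   # intubated: died, survived
c, d = 20, 80   # control:   died, survived

odds_ratio = (a * d) / (b * c)

# 95% CI via the standard log-OR normal approximation
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```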
Abstract:
Traffic safety engineers are among the early adopters of Bayesian statistical tools for analyzing crash data. As in many other areas of application, empirical Bayes methods were their first choice, perhaps because they represent an intuitively appealing yet relatively easy-to-implement alternative to purely classical approaches. With the enormous progress in numerical methods made in recent years, and with the availability of free, easy-to-use software that permits implementing a fully Bayesian approach, however, there is now ample justification to progress towards fully Bayesian analyses of crash data. The fully Bayesian approach, in particular as implemented via multi-level hierarchical models, has many advantages over the empirical Bayes approach. In a full Bayesian analysis, prior information and all available data are seamlessly integrated into posterior distributions on which practitioners can base their inferences. All uncertainties are thus accounted for in the analyses, and there is no need to pre-process data to obtain Safety Performance Functions and other such prior estimates of the effect of covariates on the outcome of interest. In this light, fully Bayesian methods may well be less costly to implement and may result in safety estimates with more realistic standard errors. In this manuscript, we present the full Bayesian approach to analyzing traffic safety data and focus on highlighting the differences between the empirical Bayes and the full Bayes approaches. We use an illustrative example to discuss a step-by-step Bayesian analysis of the data and to show some of the types of inferences that are possible within the full Bayesian framework.
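To make the empirical Bayes vs. full Bayes contrast concrete, here is a minimal sketch using a conjugate gamma-Poisson model for site-level crash counts. All data are simulated, and a crude grid approximation stands in for the MCMC that real software would perform; this is an illustration of the idea, not the paper's analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative crash counts for 50 sites over one year (NOT real data)
true_rates = rng.gamma(shape=2.0, scale=1.5, size=50)
counts = rng.poisson(true_rates)

# --- Empirical Bayes: estimate Gamma(alpha, beta) hyperparameters from
# the data (method of moments), then plug them in as if known.
m, v = counts.mean(), counts.var()
beta_hat = m / max(v - m, 1e-9)       # rate parameter of the gamma prior
alpha_hat = m * beta_hat
# Conjugate posterior for each site's rate: Gamma(alpha + y_i, beta + 1)
eb_post_mean = (alpha_hat + counts) / (beta_hat + 1.0)

# --- Full Bayes sketch: put a (flat grid) prior on (alpha, beta) and
# average over their posterior instead of plugging in point estimates.
alphas = np.linspace(0.5, 6.0, 60)
betas = np.linspace(0.1, 3.0, 60)
log_post = np.zeros((60, 60))
for i, a in enumerate(alphas):
    for j, b in enumerate(betas):
        # Gamma-Poisson marginal likelihood is negative binomial
        p = b / (b + 1.0)
        log_post[i, j] = stats.nbinom.logpmf(counts, a, p).sum()
w = np.exp(log_post - log_post.max())
w /= w.sum()

# Posterior mean rate for site 0, averaged over hyperparameter uncertainty
site0 = sum(w[i, j] * (alphas[i] + counts[0]) / (betas[j] + 1.0)
            for i in range(60) for j in range(60))
print(f"EB estimate: {eb_post_mean[0]:.2f}, full-Bayes estimate: {site0:.2f}")
```

The point estimates are typically close; the practical difference is that the full Bayes posterior also carries the hyperparameter uncertainty that empirical Bayes discards, which is what yields the "more realistic standard errors" mentioned above.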
Abstract:
The human body is composed of a huge number of cells acting together in a concerted manner. The current understanding is that proteins perform most of the activities necessary to keep a cell alive, while the DNA stores the information on how to produce the different proteins in the genome. Regulating gene transcription is the first important step that can affect the life of a cell, modify its functions and its responses to the environment. Regulation is a complex operation that involves specialized proteins, the transcription factors (TFs). Transcription factors can bind to DNA and activate the processes leading to the expression of genes into new proteins. Errors in this process may lead to diseases. In particular, some transcription factors have been associated with a lethal pathological state, commonly known as cancer, characterized by uncontrolled cellular proliferation, invasiveness of healthy tissues and abnormal responses to stimuli. Understanding cancer-related regulatory programs is a difficult task, often involving several TFs interacting together and influencing each other's activity. This Thesis presents new computational methodologies to study gene regulation. In addition, we present applications of our methods to the understanding of cancer-related regulatory programs. The understanding of transcriptional regulation is a major challenge. We address this difficult question by combining computational approaches with large collections of heterogeneous experimental data. In detail, we design signal processing tools to recover transcription factor binding sites on the DNA from genome-wide surveys such as chromatin immunoprecipitation assays on tiling arrays (ChIP-chip). We then use the localization of TF binding to explain expression levels of regulated genes. In this way we identify a regulatory synergy between two TFs, the oncogene C-MYC and SP1. C-MYC and SP1 bind preferentially at promoters, and when SP1 binds next to C-MYC on the DNA, the nearby gene is strongly expressed. The association between the two TFs at promoters is reflected in the conservation of their binding sites across mammals and in the permissive underlying chromatin states; it represents an important control mechanism in cellular proliferation, and is thereby implicated in cancer. Secondly, we identify the characteristics of the target genes of the TF estrogen receptor alpha (hERα) and we study the influence of hERα in regulating transcription. hERα, upon hormone estrogen signaling, binds to DNA to regulate transcription of its targets in concert with its co-factors. To overcome the scarcity of experimental data about the binding sites of other TFs that may interact with hERα, we conduct in silico analysis of the sequences underlying the ChIP sites using the collection of position weight matrices (PWMs) of hERα partners, the TFs FOXA1 and SP1. We combine ChIP-chip and ChIP-paired-end-diTag (ChIP-PET) data about hERα binding on DNA with the sequence information to explain gene expression levels in a large collection of cancer tissue samples, and also in studies of the response of cells to estrogen. We confirm that hERα binding sites are distributed throughout the genome. However, we distinguish between binding sites near promoters and binding sites along the transcripts. The first group shows weak binding of hERα and high occurrence of SP1 motifs, in particular near estrogen-responsive genes.
The second group shows strong binding of hERα and a significant correlation between the number of binding sites along a gene and the strength of gene induction in the presence of estrogen. Some binding sites of the second group also show presence of FOXA1, but the role of this TF still needs to be investigated. Different mechanisms have been proposed to explain hERα-mediated induction of gene expression. Our work supports the model of hERα activating gene expression from distal binding sites by interacting with promoter-bound TFs, like SP1. hERα has been associated with survival rates of breast cancer patients, though explanatory models are still incomplete; this result is important for better understanding how hERα can control gene expression. Thirdly, we address the difficult question of regulatory network inference. We tackle this problem by analyzing time series of biological measurements such as quantifications of mRNA levels or protein concentrations. Our approach uses well-established penalized linear regression models in which we impose sparseness on the connectivity of the regulatory network. We extend this method by enforcing the coherence of the regulatory dependencies: a TF must behave coherently as an activator, or a repressor, on all its targets. This requirement is implemented as constraints on the signs of the regressed coefficients in the penalized linear regression model. Our approach is better at reconstructing meaningful biological networks than previous methods based on penalized regression. The method was tested on the DREAM2 challenge of reconstructing a five-gene/TF regulatory network, obtaining the best performance in the "undirected signed excitatory" category. Thus, these bioinformatics methods, which are reliable, interpretable and fast enough to cover large biological datasets, have enabled us to better understand gene regulation in humans.
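The sign-coherence constraint described above can be prototyped with off-the-shelf tools. Below is a minimal sketch, not the thesis' implementation: for each candidate assignment of activator/repressor signs, a positivity-constrained lasso on sign-flipped predictors forces every TF to act with one sign on all of its targets, and the assignment with the lowest penalized loss is kept. Data, dimensions and the selection heuristic are all illustrative.

```python
import itertools
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)

# Toy time-series expression data for 5 TFs/genes (NOT the DREAM2 data):
# X[t] = TF expression at time t, Y[t] = expression change of each target.
n_times, n_tfs = 40, 5
X = rng.normal(size=(n_times, n_tfs))
true_signs = np.array([1, -1, 1, 1, -1])
W = (np.abs(rng.normal(size=(n_tfs, n_tfs))) * true_signs[:, None]
     * (rng.random((n_tfs, n_tfs)) < 0.4))
Y = X @ W + 0.1 * rng.normal(size=(n_times, n_tfs))

def fit_coherent(X, Y, alpha=0.05):
    """Sparse regression where each TF is forced to act with ONE sign
    (activator or repressor) on all of its targets."""
    best = (np.inf, None, None)
    # Enumerating sign assignments is feasible for 5 TFs (2**5 = 32);
    # a positive-constrained lasso on sign-flipped columns enforces
    # coherence within each assignment.
    for signs in itertools.product([1, -1], repeat=X.shape[1]):
        s = np.array(signs)
        model = Lasso(alpha=alpha, positive=True, max_iter=5000)
        model.fit(X * s, Y)              # multi-target lasso, coefs >= 0
        resid = Y - model.predict(X * s)
        loss = (resid ** 2).sum() + alpha * np.abs(model.coef_).sum()
        if loss < best[0]:
            best = (loss, s, model.coef_ * s)   # restore true signs
    return best[1], best[2]

signs, coef = fit_coherent(X, Y)
print("inferred TF signs:", signs)
```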
Abstract:
A better integration of the information conveyed by traces within an intelligence-led framework would allow forensic science to contribute more intensively to security assessments through forensic intelligence (Part I). In this view, the collection of data through crime scene examination is an integral part of intelligence processes. This conception frames our proposal for a model that promotes better use of the knowledge available in the organisation to drive and support crime scene examination. The suggested model also clarifies the uncomfortable situation of crime scene examiners, who must simultaneously comply with the needs and expectations of the justice system and serve organisations that are mostly driven by broader security objectives. It also opens new perspectives for forensic science and crime scene investigation by proposing directions other than the traditional path suggested by the dominant movements in these fields.
Abstract:
This study aims to improve the accuracy and usability of Iowa Falling Weight Deflectometer (FWD) data by incorporating significant enhancements into the fully-automated software system for rapid processing of the FWD data. These enhancements include: (1) refined prediction of backcalculated pavement layer moduli through deflection basin matching/optimization, (2) temperature correction of the backcalculated Hot-Mix Asphalt (HMA) layer modulus, (3) computation of the effective structural number (SNeff) and effective k-value (keff) related to the 1993 AASHTO design guide, (4) computation of the Structural Rating (SR) and k-value (k) related to Iowa DOT asphalt concrete (AC) overlay design, and (5) enhanced user-friendliness of input and output in the software tool. A high-quality, easy-to-use backcalculation software package, referred to as I-BACK (the Iowa Pavement Backcalculation Software), was developed to achieve the project goals and requirements. This report presents the theoretical background behind the incorporated enhancements as well as guidance on the use of I-BACK. The developed tool provides more finely tuned ANN pavement backcalculation results through the implementation of a deflection basin matching optimizer for conventional flexible, full-depth, rigid, and composite pavements. Implementation of this tool within the Iowa DOT will facilitate accurate pavement structural evaluation and rehabilitation designs for pavement/asset management purposes. This research has also set the framework for the development of a simplified FWD deflection-based HMA overlay design procedure, which is one of the recommended areas for future research.
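Deflection basin matching is, at heart, an optimization that adjusts layer moduli until a forward model reproduces the measured FWD basin. The sketch below illustrates that loop with scipy; the closed-form "forward model" is a crude placeholder for the layered-elastic or ANN model a tool like I-BACK actually uses, and the sensor offsets, deflections and seed moduli are all invented.

```python
import numpy as np
from scipy.optimize import minimize

# FWD sensor offsets from the load plate, in mm (a typical array)
offsets = np.array([0, 203, 305, 457, 610, 914, 1524], dtype=float)
# Measured deflections in microns (illustrative values only)
measured = np.array([310.0, 250.0, 215.0, 165.0, 125.0, 75.0, 35.0])

def forward_model(moduli, offsets):
    """Placeholder: deflection decays with offset and with stiffness.
    A real tool would use a layered-elastic (or ANN) model here."""
    e_hma, e_base, e_subgrade = moduli
    scale = 1e6 / (e_hma * 0.5 + e_base * 0.3 + e_subgrade * 5.0)
    return scale / (1.0 + offsets / 300.0) ** 1.2

def basin_rmse(log_moduli):
    predicted = forward_model(np.exp(log_moduli), offsets)
    return np.sqrt(np.mean((predicted - measured) ** 2))

# Optimize in log space so moduli stay positive; start from seed values
# (in I-BACK the ANN prediction would provide this seed).
x0 = np.log([3000.0, 300.0, 100.0])   # MPa, illustrative seeds
result = minimize(basin_rmse, x0, method="Nelder-Mead")
print("backcalculated moduli (MPa):", np.exp(result.x).round(1))
```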
Abstract:
Proneuropeptide Y (ProNPY) undergoes cleavage at a single dibasic site, Lys38-Arg39, resulting in the formation of NPY(1-39), which is further processed successively by carboxypeptidase-like and peptidylglycine alpha-amidating monooxygenase enzymes. To investigate whether prohormone convertases are involved in ProNPY processing, a vaccinia virus derived expression system was used to coexpress recombinant ProNPY with each of the prohormone convertases PC1/3, PC2, furin, and PACE4 in Neuro2A and NIH 3T3 cell lines, as regulated neuroendocrine and constitutive prototype cell lines, respectively. The analysis of processed products shows that only PC1/3 generates NPY in NIH 3T3 cells, while both PC1/3 and PC2 are able to generate NPY in Neuro2A cells. The convertases furin and PACE4 are unable to process ProNPY in either cell line. Moreover, comparative in vitro cleavage of the recombinant NPY precursor by the enzymes PC1/3, PC2 and furin shows that only PC1/3 and PC2 are involved in specific cleavage of the dibasic site. Kinetic studies demonstrate that PC1/3 cleaves ProNPY more efficiently than PC2: the main difference in cleavage efficiency is observed in the Vmax values, whereas no major difference is observed in the Km values. In addition, the cleavage by PC1/3 and PC2 of two peptides reproducing the dibasic cleavage site with different amino acid sequence lengths, namely (20-49)-ProNPY and (28-43)-ProNPY, was studied. These shortened ProNPY substrates, when recognized by the enzymes, are more efficiently cleaved than ProNPY itself. The shortest peptide is not cleaved by PC2, while it is by PC1/3. On the basis of these observations it is proposed, first, that constitutively secreted NPY does not result from cleavage carried out by the ubiquitously expressed enzymes furin and PACE4; second, that PC1/3 and PC2 are not equipotent in the cleavage of ProNPY; and third, that substrate peptide length might discriminate between PC1/3 and PC2 processing activities.
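The Vmax/Km comparison above comes from fitting Michaelis-Menten kinetics to initial-rate data. As an illustration of that routine step, with invented substrate concentrations and rates rather than the study's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    """Initial velocity as a function of substrate concentration [S]."""
    return vmax * s / (km + s)

# Illustrative substrate concentrations (uM) and initial rates
# (NOT data from this study)
s = np.array([1, 2, 5, 10, 20, 50, 100, 200], dtype=float)
v = np.array([0.9, 1.6, 3.1, 4.6, 6.0, 7.4, 8.0, 8.4])

(vmax, km), cov = curve_fit(michaelis_menten, s, v, p0=[8.0, 10.0])
perr = np.sqrt(np.diag(cov))
print(f"Vmax = {vmax:.2f} +/- {perr[0]:.2f}")
print(f"Km   = {km:.1f} +/- {perr[1]:.1f} uM")
# The ratio Vmax/Km (catalytic efficiency) is the usual basis for
# statements like "PC1/3 cleaves ProNPY more efficiently than PC2".
```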
Abstract:
New isotopic results on bulk carbonate and mollusc (gastropods and bivalves) samples from Lake Geneva (Switzerland), spanning the period from the Oldest Dryas to the present day, are compared with pre-existing stable isotope data. According to preliminary calibration of modern samples, Lake Geneva endogenic calcite precipitates at or near oxygen isotopic equilibrium with ambient water, confirming the potential of this large lake to record paleoenvironmental and paleoclimatic changes. The onset of endogenic calcite precipitation at the beginning of the Allerød biozone is clearly indicated by the oxygen isotopic signature of bulk carbonate. A large change in δ13C values occurs during the Preboreal. This carbon shift is likely to be due to a change in bioproductivity and/or to a 'catchment effect', the contribution of biogenic CO2 from the catchment area to the dissolved inorganic carbon reservoir of the lake water becoming significant only during the Preboreal. Gastropods are confirmed as valuable for studies of changes in paleotemperature and in paleowater isotopic composition, despite the presence of a vital effect. Mineralogical evidence indicates an increased detrital influence upon sedimentation since the Subboreal time period. On the other hand, stable isotope measurements of Subatlantic carbonate sediments show values comparable to those of pure endogenic calcite and of gastropods (taking into account the vital effect). This apparent disagreement still remains difficult to explain.
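The claim of near-equilibrium calcite precipitation is what licenses paleotemperature reconstruction from oxygen isotopes. As a worked illustration, one commonly used calcite-water paleotemperature equation (Craig, 1965) can be evaluated as below; the choice of calibration and the input values are ours, not the paper's.

```python
def craig_1965_temperature(delta_calcite_pdb, delta_water_smow):
    """Calcite-water oxygen-isotope paleotemperature equation of
    Craig (1965): T(degC) = 16.9 - 4.2*D + 0.13*D**2, with
    D = d18O(calcite, PDB) - d18O(water, SMOW). One of several
    published calibrations; shown for illustration only."""
    d = delta_calcite_pdb - delta_water_smow
    return 16.9 - 4.2 * d + 0.13 * d ** 2

# Illustrative values (NOT measurements from the Lake Geneva study)
print(f"{craig_1965_temperature(-8.0, -10.0):.1f} degC")
```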
Abstract:
The overall system is designed to permit automatic collection of delamination field data for bridge decks. In addition to measuring and recording the data in the field, the system provides for transferring the recorded data to a personal computer for processing and plotting. This permits rapid turnaround from data collection to a finished plot of the results, in a fraction of the time previously required for manual analysis of the analog data captured on a strip chart recorder. In normal operation the Delamtect provides, for each of two channels, an analog voltage proportional to the extent of any delamination. These voltages are recorded on a strip chart for later visual analysis. An event marker voltage, produced by a momentary push button on the handle, is also provided by the Delamtect and recorded on a third channel of the analog recorder.
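Once the recorded voltages reach the personal computer, the processing-and-plotting step might look like the sketch below. The file layout, threshold and marker logic are all assumptions for illustration; the abstract does not specify the recorder's data format.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical digitized record: columns = time (s), channel 1 (V),
# channel 2 (V), event marker (V). This layout is assumed, not documented.
data = np.loadtxt("delamtect_run.csv", delimiter=",", skiprows=1)
t, ch1, ch2, marker = data.T

THRESHOLD = 2.5  # volts; delamination alarm level (assumed)

for name, ch in [("channel 1", ch1), ("channel 2", ch2)]:
    flagged = ch > THRESHOLD
    print(f"{name}: {flagged.mean():.1%} of trace above threshold")

# Rising edges of the event-marker pulses tag known deck stations
stations = t[np.flatnonzero(np.diff((marker > 1.0).astype(int)) == 1)]

fig, ax = plt.subplots()
ax.plot(t, ch1, label="channel 1")
ax.plot(t, ch2, label="channel 2")
ax.vlines(stations, 0, 5, colors="gray", linestyles="dotted")
ax.axhline(THRESHOLD, color="red", linestyle="--", label="threshold")
ax.set_xlabel("time (s)")
ax.set_ylabel("Delamtect output (V)")
ax.legend()
fig.savefig("delamtect_plot.png")
```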
Abstract:
As a principled combination of probability theory and graph theory, Bayesian networks currently enjoy widespread interest as a means for studying factors that affect the coherent evaluation of scientific evidence in forensic science. Paper I of this series intends to contribute to the discussion of Bayesian networks as a framework that is helpful for both illustrating and implementing statistical procedures commonly employed for the study of uncertainties (e.g. the estimation of unknown quantities). While the respective statistical procedures are widely described in the literature, the primary aim of this paper is to offer an essentially non-technical introduction to how interested readers may use these analytical approaches - with the help of Bayesian networks - for processing their own forensic science data. Attention is mainly drawn to the structure and underlying rationale of a series of basic and context-independent network fragments that users may incorporate as building blocks while constructing larger inference models. As an example of how this may be done, the proposed concepts will be used in a second paper (Part II) for specifying graphical probability networks whose purpose is to assist forensic scientists in the evaluation of scientific evidence encountered in the context of forensic document examination (i.e. results of the analysis of black toners present on printed or copied documents).
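The most basic such network fragment is a two-node "hypothesis → evidence" structure, which needs nothing more than Bayes' theorem to evaluate. A minimal sketch, with made-up probabilities loosely themed on the toner example:

```python
# Minimal "hypothesis -> evidence" network fragment, the kind of
# building block described above. All probabilities are illustrative.

p_h = {"same source": 0.5, "different source": 0.5}        # prior
p_e_given_h = {                                            # likelihoods
    "same source": 0.95,       # P(toner match | same source)
    "different source": 0.02,  # P(toner match | different source)
}

# Observe E = "toner match"; propagate by Bayes' theorem
joint = {h: p_h[h] * p_e_given_h[h] for h in p_h}
z = sum(joint.values())
posterior = {h: joint[h] / z for h in joint}

# The likelihood ratio is the usual forensic summary of evidential value
lr = p_e_given_h["same source"] / p_e_given_h["different source"]
print("posterior:", {h: round(p, 3) for h, p in posterior.items()})
print(f"likelihood ratio = {lr:.1f}")
```

Larger inference models chain such fragments together; dedicated Bayesian network software automates exactly this propagation over many nodes.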
Abstract:
We investigated in conscious normotensive rats the effect of SKF64139 (2 mg i.v.), a potent phenylethanolamine N-methyltransferase (PNMT) inhibitor, on blood pressure responses to norepinephrine (40, 80, and 160 ng i.v.); methoxamine (2.5, 5 and 10 micrograms i.v.), a directly acting sympathomimetic agent that is not taken up by adrenergic nerves; and tyramine (20, 40, and 80 micrograms i.v.), an indirectly acting sympathomimetic amine. The pressor effect of norepinephrine was not changed by 2 mg of SKF64139, while those of methoxamine and tyramine were significantly reduced. The dose-response curve to exogenous norepinephrine was also evaluated following blockade of norepinephrine uptake in the nerve endings using 0.25 mg desipramine i.v. This dose of desipramine had no effect on the blood pressure increase induced by methoxamine. In rats pretreated with the neuronal uptake inhibitor desipramine in a dose that did not affect alpha-adrenoceptors, SKF64139 significantly decreased the pressor responses to norepinephrine. Increasing the dose of SKF64139 to 8 mg i.v. resulted in a significant fall in baseline blood pressure and in a blunted blood pressure response to norepinephrine. These data demonstrate that in vivo the PNMT inhibitor SKF64139 blocks alpha-adrenoceptors and inhibits neuronal uptake. The alpha-adrenoceptor blocking properties of SKF64139 are masked by simultaneous blockade of norepinephrine uptake when agonists with affinity for the uptake system are used. These findings need to be taken into account when interpreting cardiovascular effects of the PNMT inhibitor SKF64139.
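Shifts in pressor dose-response curves like those described are conventionally summarized by fitting an Emax model to the response at each dose. A hedged sketch with invented response values (only the norepinephrine doses match the protocol; nothing else is from the study):

```python
import numpy as np
from scipy.optimize import curve_fit

def emax_model(dose, emax, ed50):
    """Hyperbolic Emax model for pressor response vs. dose."""
    return emax * dose / (ed50 + dose)

# Norepinephrine doses (ng) from the protocol; blood pressure rises
# (mmHg) below are illustrative, NOT the study's measurements.
dose = np.array([40.0, 80.0, 160.0])
control = np.array([14.0, 24.0, 33.0])
with_skf = np.array([8.0, 14.0, 20.0])

for label, resp in [("control", control), ("treated", with_skf)]:
    (emax, ed50), _ = curve_fit(emax_model, dose, resp, p0=[40.0, 100.0])
    print(f"{label}: Emax = {emax:.0f} mmHg, ED50 = {ed50:.0f} ng")
# A reduced Emax and/or increased ED50 after treatment is how a
# "blunted" or right-shifted dose-response curve is quantified.
```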
Abstract:
OBJECTIVE: To determine whether resin-dentin microtensile bond strength (µTBS) results are correlated with the outcome parameters of clinical studies on non-retentive Class V restorations. METHODS: Resin-dentin µTBS data were obtained from one test center; the in vitro tests were all performed by the same operator. The µTBS testing was performed 8 h after bonding and after 6 months of storing the specimens in water. Pre-test failures (PTFs) of specimens were included in the analysis, assigning them a value of 1 MPa. Prospective clinical studies on cervical restorations (Class V) with an observation period of at least 18 months were searched in the literature. The clinical outcome variables were retention loss, marginal discoloration and marginal integrity. Furthermore, an index was formulated to better compare the laboratory and clinical results. Estimates of adhesive effects in a linear mixed model were used to summarize the clinical performance of each adhesive between 12 and 36 months. Spearman correlations between these clinical performances and the µTBS values were calculated subsequently. RESULTS: Thirty-six clinical studies with 15 adhesive/restorative systems for which µTBS data were also available were included in the statistical analysis. In general, 3-step and 2-step etch-and-rinse systems showed higher bond strength values than the 2-step/3-step self-etching systems, which, however, produced higher values than the 1-step self-etching and the resin-modified glass ionomer systems. Prolonged water storage of specimens resulted in a significant decrease of the mean bond strength values in 5 adhesive systems (Wilcoxon, p<0.05). There was a significant correlation between µTBS values, both after 8 h and after 6 months of storage, and marginal discoloration (r=0.54 and r=0.67, respectively). However, the same correlation was not found between µTBS values and the retention rate, clinical index or marginal integrity. SIGNIFICANCE: As µTBS data of adhesive systems, especially after water storage for 6 months, showed a good correlation with marginal discoloration in short-term clinical Class V restorations, longitudinal clinical trials should explore whether early marginal staining is predictive of future retention loss in non-carious cervical restorations.
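The key statistic here is a Spearman rank correlation between per-adhesive µTBS values (with PTFs entered as 1 MPa) and a clinical outcome. A minimal sketch of that computation on invented numbers:

```python
import numpy as np
from scipy.stats import spearmanr

# Illustrative per-adhesive values (NOT the study's data).
# Pre-test failures are entered as 1 MPa, following the study's protocol.
utbs = np.array([48.0, 1.0, 35.5, 22.1, 1.0, 41.3, 28.7, 15.2])  # MPa
marginal_discoloration = np.array(
    [0.05, 0.60, 0.12, 0.30, 0.55, 0.08, 0.20, 0.40])  # fraction affected

rho, p = spearmanr(utbs, marginal_discoloration)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
# A negative rho on these toy numbers means higher bond strength goes
# with less discoloration; being rank-based, Spearman tolerates the
# censored 1 MPa entries better than a Pearson correlation would.
```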
Abstract:
Remote sensing image processing is nowadays a mature research area. The techniques developed in the field allow many real-life applications with great societal value. For instance, urban monitoring, fire detection or flood prediction can have a great impact on economic and environmental issues. To attain such objectives, remote sensing has turned into a multidisciplinary field of science that embraces physics, signal theory, computer science, electronics, and communications. From a machine learning and signal/image processing point of view, all the applications are tackled under specific formalisms, such as classification and clustering, regression and function approximation, image coding, restoration and enhancement, source unmixing, data fusion, and feature selection and extraction. This paper serves as a survey of methods and applications, and reviews the latest methodological advances in remote sensing image processing.
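Among the formalisms listed, supervised classification of pixels by their spectral signature is the most common entry point. A small self-contained sketch on synthetic "multispectral" data; real work would use labeled, georeferenced imagery:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)

# Synthetic stand-in for labeled multispectral pixels: 4 band values per
# pixel, 3 land-cover classes (class spectra are invented).
n = 600
labels = rng.integers(0, 3, size=n)
class_means = np.array([[0.1, 0.2, 0.3, 0.6],    # e.g. water
                        [0.3, 0.4, 0.3, 0.2],    # e.g. urban
                        [0.1, 0.5, 0.2, 0.8]])   # e.g. vegetation
pixels = class_means[labels] + 0.05 * rng.normal(size=(n, 4))

X_train, X_test, y_train, y_test = train_test_split(
    pixels, labels, test_size=0.3, random_state=0)

# Kernel SVMs are a standard baseline in remote sensing classification
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_train, y_train)
print(f"held-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")
```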
Abstract:
The mature TCR is composed of a clonotypic heterodimer (alpha beta or gamma delta) associated with the invariant CD3 components (gamma, delta, epsilon and zeta). There is now considerable evidence that more immature forms of the TCR-CD3 complex (consisting of either CD3 alone or CD3 associated with a heterodimer of TCR beta and pre-T alpha) can be expressed at the cell surface on early thymocytes. These pre-TCR complexes are believed to be necessary for the ordered progression of early T cell development. We have analyzed in detail the expression of both the pre-TCR and the CD3 complex at various stages of adult thymus development. Our data indicate that all CD3 components are already expressed at the mRNA level by the earliest identifiable (CD4lo) thymic precursor. In contrast, genes encoding the pre-TCR complex (pre-T alpha and fully rearranged TCR beta) are first expressed at the CD44loCD25+CD4-CD8- stage. Detectable surface expression of both CD3 and TCR beta is delayed relative to expression of the corresponding genes, suggesting the existence of other (as yet unidentified) components of the pre-TCR complex.