25 results for Mathematical and statistical techniques
Abstract:
Rapid industrialization and urbanization in developing countries have led to an increase in air pollution, along a trajectory similar to that previously experienced by the developed nations. In China, particulate pollution is a serious environmental problem that is influencing air quality, regional and global climates, and human health. In response to the extremely severe and persistent haze pollution experienced by about 800 million people during the first quarter of 2013 (refs 4, 5), the Chinese State Council announced its aim to reduce concentrations of PM2.5 (particulate matter with an aerodynamic diameter less than 2.5 micrometres) by up to 25 per cent relative to 2012 levels by 2017 (ref. 6). Such efforts, however, require elucidation of the factors governing the abundance and composition of PM2.5, which remain poorly constrained in China. Here we combine a comprehensive set of novel and state-of-the-art offline analytical approaches and statistical techniques to investigate the chemical nature and sources of particulate matter at urban locations in Beijing, Shanghai, Guangzhou and Xi'an during January 2013. We find that the severe haze pollution event was driven to a large extent by secondary aerosol formation, which contributed 30-77 per cent and 44-71 per cent (average for all four cities) of PM2.5 and of organic aerosol, respectively. On average, the contributions of secondary organic aerosol (SOA) and secondary inorganic aerosol (SIA) are found to be of similar importance (SOA/SIA ratios range from 0.6 to 1.4). Our results suggest that, in addition to mitigating primary particulate emissions, reducing the emissions of secondary aerosol precursors from, for example, fossil fuel combustion and biomass burning is likely to be important for controlling China's PM2.5 levels and for reducing the environmental, economic and health impacts resulting from particulate pollution.
Abstract:
Noninvasive blood flow measurements based on Doppler ultrasound are the main clinical tool for studying the cardiovascular status of fetuses at risk for circulatory compromise. Usually, qualitative analysis of peripheral arteries is used to gauge the level of compensation in a fetus; in particular clinical situations, such as severe growth restriction or volume overload, venous vessels close to the heart or flow patterns within the heart are also assessed. However, quantitative assessment of the driving force of the fetal circulation, the cardiac output, remains an elusive goal in fetal medicine. This article reviews the methods for direct and indirect assessment of cardiac function and explains new clinical applications. Part 1 of this review describes the concept of cardiac function and cardiac output and the techniques that have been used to quantify output. Part 2 summarizes the use of arterial and venous Doppler studies in the fetus and gives a detailed description of indirect measurements of cardiac function (such as indices derived from the duration of segments of the cardiac cycle), with current examples of their application.
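As a concrete illustration of an index derived from the durations of cardiac cycle segments, the sketch below computes the myocardial performance (Tei) index; this is one widely used example of such an index, not necessarily the one emphasized in the article, and the timing values are hypothetical.

```python
# Illustrative sketch: the myocardial performance (Tei) index is derived
# from durations of cardiac cycle segments. Timings are hypothetical
# example values in milliseconds, not measurements from the article.

def myocardial_performance_index(ict_ms: float, irt_ms: float, et_ms: float) -> float:
    """Tei index = (isovolumetric contraction time + isovolumetric
    relaxation time) / ejection time; higher values suggest poorer
    combined systolic and diastolic function."""
    return (ict_ms + irt_ms) / et_ms

print(myocardial_performance_index(ict_ms=33.0, irt_ms=43.0, et_ms=170.0))  # ~0.45
```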
Abstract:
This article reports on the internet-based second multicenter study (MCS II) of the spine study group (AG WS) of the German trauma association (DGU). It represents a continuation of the first study conducted between 1994 and 1996 (MCS I). To provide a common, centralised data capture methodology, a newly developed internet-based data collection system ( http://www.memdoc.org ) of the Institute for Evaluative Research in Orthopaedic Surgery of the University of Bern was used. The aim of this first publication on MCS II is to describe in detail the new method of data collection and the structure of the internet-based database system. The goal of the study was to assess the current state of treatment for fresh traumatic injuries of the thoracolumbar spine in the German-speaking part of Europe. For that reason, we intended to collect a large number of cases and representative, valid information about radiographic, clinical and subjective treatment outcomes. Thanks to the new study design of MCS II, not only the common surgical treatment concepts but also the new and constantly broadening spectrum of spine surgery, i.e. vertebro-/kyphoplasty, computer-assisted surgery and navigation, and minimally invasive and endoscopic techniques, were documented and evaluated. We present a first statistical overview and preliminary analysis of 18 centers from Germany and Austria that participated in MCS II. Real-time data capture at source was made possible by the constant availability of the data collection system via internet access. Following the principle of an application service provider, software, questionnaires and validation routines are located on a central server, which is accessed from the periphery (hospitals) by means of standard internet browsers. Costly and time-consuming software installation and maintenance of local data repositories are thereby avoided and, more importantly, cumbersome migration of data into one integrated database becomes obsolete. Finally, this set-up also replaces traditional systems in which paper questionnaires were mailed to the central study office and entered by hand, whereby incomplete or incorrect forms always represented a resource-consuming problem and source of error. With the new study concept and the expanded inclusion criteria of MCS II, 1,251 case histories with admission and surgical data were collected. This remarkable number of interventions documented during 24 months represents an increase of 183% compared with the previously conducted MCS I. The concept and technical feasibility of the MEMdoc data collection system were proven, as the participants of MCS II succeeded in collecting the largest series of patients with spinal injuries treated within a 2-year period ever published.
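To make the application-service-provider idea concrete, here is a minimal, hypothetical sketch of a server-side validation routine of the kind described, where incomplete submissions are rejected at entry rather than arriving on paper; the field names and rules are illustrative assumptions, not the actual MEMdoc schema.

```python
# Hypothetical sketch of server-side validation: questionnaires and
# validation routines live on the central server, so a submitted case
# can be rejected immediately instead of arriving with missing fields.
# Field names and allowed values are illustrative, not MEMdoc's schema.
REQUIRED = {"center_id", "admission_date", "injury_level", "treatment"}

def validate_case(form: dict) -> list[str]:
    """Return a list of validation errors for one submitted case record."""
    errors = [f"missing field: {f}" for f in REQUIRED - form.keys()]
    if "injury_level" in form and form["injury_level"] not in {"thoracic", "lumbar", "thoracolumbar"}:
        errors.append("injury_level must be thoracic, lumbar or thoracolumbar")
    return errors

print(validate_case({"center_id": "BE-01", "admission_date": "2004-05-17"}))
```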
Abstract:
AIM: The aim of the present review was to systematically assess the dental literature in terms of soft tissue grafting techniques. The focused question was: is one method superior to others for augmentation and stability of the augmented soft tissue in terms of increasing the width of keratinized tissue (part 1) and gain in soft tissue volume (part 2)? METHODS: A Medline search was performed for human studies focusing on augmentation of keratinized tissue and/or soft tissue volume, and complemented by additional hand searching. Relevant studies were identified, and statistical results were reported for meta-analyses, including the test-minus-control weighted mean differences with 95% confidence intervals, the I-squared statistic for tests of heterogeneity, and the number of significant studies. RESULTS: Twenty-five (part 1) and three (part 2) studies met the inclusion criteria; 14 studies (part 1) were eligible for comparison using meta-analyses. An apically positioned flap/vestibuloplasty (APF/V) procedure resulted in a statistically significantly greater gain in keratinized tissue than untreated controls. APF/V plus autogenous tissue revealed statistically significantly more attached gingiva compared with untreated controls and a borderline statistical significance compared with APF/V plus allogenic tissue. Statistically significantly more shrinkage was observed for the APF/V plus allogenic graft compared with the APF/V plus autogenous tissue. Patient-centered outcomes did not reveal any of the treatment methods to be superior regarding postoperative complications. The three studies reporting on soft tissue volume augmentation could not be compared due to lack of homogeneity. The use of subepithelial connective tissue grafts (SCTGs) resulted in statistically significantly more soft tissue volume gain compared with free gingival grafts (FGGs). CONCLUSIONS: APF/V is a successful treatment concept to increase the width of keratinized tissue or attached gingiva around teeth. The addition of autogenous tissue statistically significantly increases the width of attached gingiva. For soft tissue volume augmentation, only limited data are available, favoring SCTGs over FGGs.
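For readers unfamiliar with the summary statistics named above, this minimal sketch computes a fixed-effect weighted mean difference with its 95% confidence interval, Cochran's Q, and the I-squared heterogeneity statistic; the three study entries are hypothetical values, not data from the review.

```python
# Minimal sketch of the meta-analytic quantities named above (fixed-effect
# weighted mean difference, 95% CI, Cochran's Q and I-squared). The three
# study entries below are hypothetical, not data from the review.
import math

studies = [  # (mean difference test - control, variance of the difference)
    (1.2, 0.30),
    (0.8, 0.25),
    (1.5, 0.40),
]

w = [1.0 / var for _, var in studies]          # inverse-variance weights
wmd = sum(wi * d for wi, (d, _) in zip(w, studies)) / sum(w)
se = math.sqrt(1.0 / sum(w))
ci = (wmd - 1.96 * se, wmd + 1.96 * se)        # 95% confidence interval

q = sum(wi * (d - wmd) ** 2 for wi, (d, _) in zip(w, studies))  # Cochran's Q
i2 = max(0.0, (q - (len(studies) - 1)) / q) * 100.0             # I^2 in per cent

print(f"WMD = {wmd:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), I^2 = {i2:.1f}%")
```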
Abstract:
This paper presents a comparison of principal component (PC) regression and regularized expectation maximization (RegEM) to reconstruct European summer and winter surface air temperature over the past millennium. Reconstruction is performed within a surrogate climate using the National Center for Atmospheric Research (NCAR) Climate System Model (CSM) 1.4 and the climate model ECHO-G, assuming different white and red noise scenarios to define the distortion of pseudoproxy series. We show how sensitivity tests lead to valuable “a priori” information that provides a basis for improving real-world proxy reconstructions. Our results emphasize the need to carefully test and evaluate reconstruction techniques with respect to the temporal resolution and the spatial scale they are applied to. Furthermore, we demonstrate that uncertainties inherent to the predictand and predictor data have to be more rigorously taken into account. The comparison of the two statistical techniques, in the specific experimental setting presented here, indicates that more skilful results are achieved with RegEM, as low-frequency variability is better preserved. We further detect seasonal differences in reconstruction skill at the continental scale; e.g. the target temperature average is more adequately reconstructed for summer than for winter. For the specific predictor network given in this paper, both techniques underestimate the target temperature variations to an increasing extent as more noise is added to the signal, albeit less so with RegEM than with PC regression. We conclude that climate field reconstruction techniques can be improved and need to be further optimized in future applications.
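The following toy pseudoproxy experiment sketches the PC regression step in the spirit of such sensitivity tests: a synthetic target series is degraded with white noise to form pseudoproxies, and the regression is calibrated on the most recent window. All sizes, noise levels and the red-noise-like target are illustrative assumptions, not the paper's configuration.

```python
# Toy pseudoproxy experiment: a "true" temperature series is degraded with
# white noise to form pseudoproxies, and principal component (PC) regression
# calibrated on a recent window reconstructs the full period. All numbers
# are illustrative assumptions, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)
n_years, n_proxies, n_pcs = 1000, 15, 3

temperature = np.cumsum(rng.normal(0, 0.1, n_years))        # red-noise-like target
proxies = temperature[:, None] + rng.normal(0, 0.5, (n_years, n_proxies))

calib = slice(n_years - 150, n_years)                        # calibration window

# Leading PCs of the (calibration-centered) proxy matrix
anoms = proxies - proxies[calib].mean(axis=0)
_, _, vt = np.linalg.svd(anoms[calib], full_matrices=False)
pcs = anoms @ vt[:n_pcs].T                                   # project all years

# Least-squares fit of temperature on the leading PCs in the calibration window
beta, *_ = np.linalg.lstsq(pcs[calib], temperature[calib] - temperature[calib].mean(), rcond=None)
reconstruction = pcs @ beta + temperature[calib].mean()

print("correlation over full period:", np.corrcoef(reconstruction, temperature)[0, 1])
```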
Abstract:
In order to overcome the limitations of the linear-quadratic model and to include synergistic effects of heat and radiation, a novel radiobiological model is proposed. The model is based on a chain of cell populations which are characterized by the number of radiation-induced damages (hits). Cells can shift downward along the chain by collecting hits and upward by a repair process. The repair process is governed by a repair probability, which depends upon state variables used for a simplistic description of the impact of heat and radiation upon repair proteins. Based on the parameters used, populations with up to 4-5 hits are relevant for the calculation of survival. The model intuitively describes the mathematical behaviour of apoptotic and nonapoptotic cell death. Linear-quadratic-linear behaviour of the logarithmic cell survival, fractionation, and (with one exception) the dose-rate dependencies are described correctly. The model covers the time-gap dependence of the synergistic cell killing due to combined application of heat and radiation, but further validation of the proposed approach against experimental data is needed. Nevertheless, the model offers a workbench for testing different biological concepts of damage induction, repair, and statistical approaches for calculating the state variables.
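A minimal numerical sketch of such a hit chain is given below: cells accumulate hits with one probability per step and repair one hit with another, and survival is read off from the low-hit populations. The chain length, probabilities and survival cutoff are illustrative assumptions, not the model's calibrated parameters or its treatment of heat.

```python
# Minimal sketch of a hit chain of cell populations: n[k] holds the
# fraction of cells carrying k radiation-induced hits. Each step shifts
# cells down the chain (one more hit) with probability p_hit and back up
# (one hit repaired) with probability p_rep. Parameters are illustrative
# assumptions, not the published model's calibrated values.
K = 6                      # track populations with 0..K hits
p_hit, p_rep = 0.30, 0.15  # per-step damage and repair probabilities
n = [1.0] + [0.0] * K      # all cells start undamaged

for _ in range(20):        # 20 dose/repair steps
    new = [0.0] * (K + 1)
    for k, frac in enumerate(n):
        hit = frac * p_hit if k < K else 0.0   # gain one hit
        rep = frac * p_rep if k > 0 else 0.0   # repair one hit
        new[k] += frac - hit - rep
        if k < K:
            new[k + 1] += hit
        if k > 0:
            new[k - 1] += rep
    n = new

surviving = sum(n[:5])     # treat cells with >= 5 hits as inactivated
print(f"surviving fraction after 20 steps: {surviving:.3f}")
```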
Climate refugia: joint inference from fossil records, species distribution models and phylogeography
Abstract:
Climate refugia, locations where taxa survive periods of regionally adverse climate, are thought to be critical for maintaining biodiversity through the glacial–interglacial climate changes of the Quaternary. A critical research need is to better integrate and reconcile the three major lines of evidence used to infer the existence of past refugia – fossil records, species distribution models and phylogeographic surveys – in order to characterize the complex spatiotemporal trajectories of species and populations in and out of refugia. Here we review the complementary strengths, limitations and new advances for these three approaches. We provide case studies to illustrate their combined application, and point the way towards new opportunities for synthesizing these disparate lines of evidence. Case studies with European beech, Qinghai spruce and Douglas-fir illustrate how the combination of these three approaches successfully resolves complex species histories not attainable from any one approach. Promising new statistical techniques can capitalize on the strengths of each method and provide a robust quantitative reconstruction of species history. Studying past refugia can help identify contemporary refugia and clarify their conservation significance, in particular by elucidating the fine-scale processes and the particular geographic locations that buffer species against rapidly changing climate.
Abstract:
PURPOSE: Hodgkin lymphoma (HL) is a highly curable disease. Reducing late complications and second malignancies has become increasingly important. Radiotherapy target paradigms are currently changing and radiotherapy techniques are evolving rapidly. DESIGN: This overview reports to what extent target volume reduction with involved-node radiotherapy (INRT) and advanced radiotherapy techniques, such as intensity-modulated radiotherapy (IMRT) and proton therapy, compared with involved-field radiotherapy (IFRT) and 3D radiotherapy (3D-RT), can reduce high doses to organs at risk (OAR), and examines the issues that still remain open. RESULTS: Although no comparison of all available techniques on identical patient datasets exists, clear patterns emerge. Advanced dose-calculation algorithms (e.g., convolution-superposition/Monte Carlo) should be used in mediastinal HL. INRT consistently reduces treated volumes compared with IFRT, with the exact amount depending on the INRT definition. The number of patients that might significantly benefit from highly conformal techniques such as IMRT over 3D-RT regarding high-dose exposure to OAR is smaller with INRT. The impact of the larger volumes treated with low doses in advanced techniques is unclear. The type of IMRT used (static/rotational) is of minor importance. All advanced photon techniques result in similar potential benefits and disadvantages; therefore, only the degree of modulation should be chosen based on individual treatment goals. Treatment in deep-inspiration breath hold is being evaluated. Protons theoretically provide both excellent high-dose conformality and reduced integral dose. CONCLUSION: Further reduction of treated volumes most effectively reduces OAR dose, most likely without disadvantages if the excellent control rates achieved currently are maintained. For both IFRT and INRT, the benefits of advanced radiotherapy techniques depend on the individual patient/target geometry. Their use should therefore be decided case by case with comparative treatment planning.
Abstract:
This study compares gridded European seasonal series of surface air temperature (SAT) and precipitation (PRE) reconstructions with a regional climate simulation over the period 1500–1990. The area is analysed separately for nine subareas that represent the majority of the climate diversity in the European sector. In their spatial structure, an overall good agreement is found between the reconstructed and simulated climate features across Europe, supporting the consistency of both products. Systematic biases between the two data sets can be explained by a priori known deficiencies in the simulation. Simulations and reconstructions, however, largely differ in the temporal evolution of past climate for European subregions. In particular, the simulated anomalies during the Maunder and Dalton minima show a stronger response to changes in the external forcings than recorded in the reconstructions. Although this disagreement is to some extent expected given the prominent role of internal variability in the evolution of regional temperature and precipitation, a certain degree of agreement is a priori expected in variables directly affected by external forcings. In this sense, the inability of the model to reproduce a warm period similar to that recorded for the winters during the first decades of the 18th century in the reconstructions is indicative of fundamental limitations in the simulation that preclude reproducing exceptionally anomalous conditions. Despite these limitations, the simulated climate is a physically consistent data set, which can be used as a benchmark to analyse the consistency and limitations of gridded reconstructions of different variables. A comparison of the leading modes of SAT and PRE variability indicates that the reconstructions are too simplistic, especially for precipitation, which is associated with the linear statistical techniques used to generate them. The analysis of the co-variability between sea level pressure (SLP) and SAT and PRE in the simulation yields a result which resembles the canonical co-variability recorded in the observations for the 20th century. However, the same analysis for the reconstructions exhibits anomalously low correlations, which points towards a lack of dynamical consistency between the independent reconstructions.
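As a sketch of how leading modes of variability can be compared between a reconstruction and a simulation, the snippet below extracts the first empirical orthogonal function (EOF) of two (years x gridpoints) anomaly fields via SVD and correlates the spatial patterns; the random fields are placeholders for real SAT data, and the comparison method is our assumption, not necessarily the paper's exact procedure.

```python
# Minimal sketch of comparing leading modes of variability: compute the
# first EOF of each (years x gridpoints) anomaly field and correlate the
# spatial patterns. The random fields are placeholders for real SAT data.
import numpy as np

rng = np.random.default_rng(1)
years, points = 490, 60                      # e.g. 1500-1990, gridded Europe
recon = rng.normal(size=(years, points))
simul = rng.normal(size=(years, points))

def leading_eof(field: np.ndarray) -> np.ndarray:
    anom = field - field.mean(axis=0)
    _, _, vt = np.linalg.svd(anom, full_matrices=False)
    return vt[0]                             # first spatial mode

# EOF signs are arbitrary, so compare the absolute pattern correlation
pattern_corr = abs(np.corrcoef(leading_eof(recon), leading_eof(simul))[0, 1])
print(f"pattern correlation of leading modes: {pattern_corr:.2f}")
```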
Abstract:
Monte Carlo integration is firmly established as the basis for most practical realistic image synthesis algorithms because of its flexibility and generality. However, the visual quality of rendered images often suffers from estimator variance, which appears as visually distracting noise. Adaptive sampling and reconstruction algorithms reduce variance by controlling the sampling density and aggregating samples in a reconstruction step, possibly over large image regions. In this paper we survey recent advances in this area. We distinguish between “a priori” methods that analyze the light transport equations and derive sampling rates and reconstruction filters from this analysis, and “a posteriori” methods that apply statistical techniques to sets of samples to drive the adaptive sampling and reconstruction process. They typically estimate the errors of several reconstruction filters, and select the best filter locally to minimize error. We discuss advantages and disadvantages of recent state-of-the-art techniques, and provide visual and quantitative comparisons. Some of these techniques are proving useful in real-world applications, and we aim to provide an overview for practitioners and researchers to assess these approaches. In addition, we discuss directions for potential further improvements.
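To illustrate the "a posteriori" strategy in its simplest form, the sketch below takes a pilot batch of Monte Carlo samples per pixel of a 1-D toy scene, estimates each pixel's variance, and spends the remaining sample budget where the variance estimate is largest; the scene function and budgets are illustrative assumptions, far simpler than a real renderer.

```python
# Minimal sketch of "a posteriori" adaptive sampling: a pilot batch of
# samples per pixel, per-pixel variance estimates, then extra samples
# allocated proportionally to variance. The 1-D "scene" and budgets are
# illustrative assumptions, not a real light transport integrand.
import random, statistics

def radiance(x: float) -> float:
    # stand-in stochastic integrand: smooth region plus a noisy highlight
    return 1.0 if (0.4 < x < 0.6 and random.random() < 0.2) else 0.1

pixels = 32
pilot, budget = 8, 32 * 24                   # pilot samples and extra budget

samples = [[radiance(p / pixels) for _ in range(pilot)] for p in range(pixels)]
variances = [statistics.variance(s) for s in samples]
total_var = sum(variances) or 1.0            # guard against an all-smooth scene

for p in range(pixels):                      # allocate extras by relative variance
    extra = round(budget * variances[p] / total_var)
    samples[p] += [radiance(p / pixels) for _ in range(extra)]

image = [statistics.mean(s) for s in samples]
print([f"{v:.2f}" for v in image[:8]])
```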