903 results for Geo-statistical model


Relevance:

80.00%

Publisher:

Abstract:

Study Objective: To estimate the concentration of natural killer (NK) cells in the peripheral blood in patients with and without endometriosis. Design: Case-control study (Canadian Task Force classification II-2). Setting: Tertiary referral hospital. Patients: One hundred fifty-five patients who had undergone videolaparoscopy were divided into 2 groups: those with endometriosis (n = 100) and those without endometriosis (n = 55). Interventions: The percentage of NK cells relative to peripheral lymphocytes was quantified by flow cytometry in 155 patients who had undergone laparoscopy. In addition to verifying the presence of endometriosis, the stage of disease and the sites affected were also evaluated. Measurements and Main Results: The mean (SD) percentage of NK cells was higher (15.3% [9.8%]) in patients with endometriosis than in the group without the disease (10.6% [5.8%]) (p < .001). The percentage of NK cells was highest (19.8% [10.3%]) in patients with advanced stages of endometriosis and in those in whom the rectosigmoid colon was affected. In a statistical model of probability, the association of this marker (NK cells >= 11%) with the presence of symptoms such as pain and intestinal bleeding during menstruation and the absence of previous pregnancy yielded a 78% likelihood of the rectosigmoid colon being affected. Conclusion: Compared with patients without endometriosis, those with endometriosis demonstrate a higher concentration of peripheral NK cells. The percentage of NK cells is greater primarily in patients with advanced stages of endometriosis involving the rectosigmoid colon. Therefore, it may serve as a diagnostic marker for this type of severe endometriosis, in particular when considered in conjunction with the symptoms. Journal of Minimally Invasive Gynecology (2012) 19, 317-324 (C) 2012 AAGL. All rights reserved.

Relevance:

80.00%

Publisher:

Abstract:

Purpose: Few reports have evaluated cumulative survival rates of extraoral rehabilitation and peri-implant soft tissue reaction at long-term follow-up. The objective of this study was to evaluate implant and prosthesis survival rates and the soft tissue reactions around the extraoral implants used to support craniofacial prostheses. Materials and Methods: A retrospective study was performed of patients who received implants for craniofacial rehabilitation from 2003 to 2010. Two outcome variables were considered: implant and prosthetic success. The following predictor variables were recorded: gender, age, implant placement location, number and size of implants, irradiation status in the treated field, date of prosthesis delivery, soft tissue response, and date of last follow-up. A statistical model was used to estimate survival rates and associated confidence intervals. We randomly selected 1 implant per patient for analysis. Data were analyzed using the Kaplan-Meier method and log-rank test to compare survival curves. Results: A total of 150 titanium implants were placed in 56 patients. The 2-year overall implant survival rates were 94.1% for auricular implants, 90.9% for nasal implants, 100% for orbital implants, and 100% for complex midfacial implants (P = .585). The implant survival rates were 100% for implants placed in irradiated patients and 94.4% for those placed in nonirradiated patients (P = .324). The 2-year overall prosthesis survival rates were 100% for auricular implants, 90.0% for nasal implants, 92.3% for orbital implants, and 100% for complex midfacial implants (P = .363). The evaluation of the peri-implant soft tissue response showed that 15 patients (26.7%) had a grade 0 soft tissue reaction, 30 (53.5%) had grade 1, 6 (10.7%) had grade 2, and 5 (8.9%) had grade 3. 
Conclusions: From this study, it was concluded that craniofacial rehabilitation with extraoral implants is a safe, reliable, and predictable method to restore the patient's normal appearance. (C) 2012 American Association of Oral and Maxillofacial Surgeons J Oral Maxillofac Surg 70:1551-1557, 2012
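The survival rates above come from the Kaplan-Meier product-limit method. As an illustration only (the follow-up times below are hypothetical, not the study's data), a minimal estimator might look like:

```python
def kaplan_meier(times, events):
    """Product-limit estimate of the survival function S(t).

    times  : follow-up time for each implant
    events : 1 if the implant failed at that time, 0 if censored
    Returns a list of (time, survival probability) steps.
    """
    n_at_risk = len(times)
    surv = 1.0
    curve = []
    # process distinct follow-up times in increasing order
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if d > 0:
            surv *= 1.0 - d / n_at_risk
            curve.append((t, surv))
        n_at_risk -= sum(1 for ti in times if ti == t)
    return curve

# Hypothetical follow-up (months) for a handful of implants
times  = [6, 12, 12, 18, 24, 24, 24]
events = [1,  0,  1,  0,  1,  0,  0]
print(kaplan_meier(times, events))
```

Censored observations (events = 0) leave the curve unchanged but still reduce the number at risk, which is what distinguishes this estimate from a naive failure fraction.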

Relevance:

80.00%

Publisher:

Abstract:

Mosquitoes are vectors of arboviruses that can cause encephalitis and hemorrhagic fevers in humans. Aedes serratus (Theobald), Aedes scapularis (Rondani) and Psorophora ferox (Von Humboldt) are potential vectors of arboviruses and are abundant in Vale do Ribeira, located in the Atlantic Forest in the southeast of the State of Sao Paulo, Brazil. The objective of this study was to predict the spatial distribution of these mosquitoes and estimate the risk of human exposure to mosquito bites. Results of the analyses show that humans are highly exposed to bites in the municipalities of Cananeia, Iguape and Ilha Comprida. In these localities the incidence of Rocio encephalitis was 2% in the 1970s. Furthermore, Ae. serratus, a recently implicated vector of yellow fever virus in the State of Rio Grande do Sul, should be a target for entomological surveillance in the southeastern Atlantic Forest. Considering the continental dimensions of Brazil and the inherent difficulties in sampling its vast area, the habitat suitability method used in this study can be an important tool for predicting the distribution of vectors of pathogens.

Relevance:

80.00%

Publisher:

Abstract:

Response surface methodology (RSM), based on a 2^2 full factorial design, was used to evaluate the effect of moisture on xylose recovery by diethyl oxalate (DEO) hydrolysis. Experiments were carried out in laboratory reactors (10 mL glass ampoules) containing properly ground corn stover (0.5 g). The ampoules were kept at 160 degrees C for 90 min. Both DEO concentration and corn stover moisture content were statistically significant at the 99% confidence level. The maximum xylose recovery predicted by the response surface was achieved with both DEO concentration and corn stover moisture near their highest levels. This region was refined using an overlay plot for graphical optimization, targeting a xylose recovery above 80%. The statistical model was validated by testing a specific condition within the satisfactory overlay plot region. Experimentally, a maximum xylose recovery (81.2%) was achieved using an initial corn stover moisture of 60% and a DEO concentration of 4% w/w. The model showed that xylose recovery during DEO corn stover acid hydrolysis increases as the corn stover moisture level increases. This observation could be important during the harvesting of corn before it is fully dried in the field. Corn stover moisture was an important variable for improving xylose recovery by DEO acid hydrolysis. (c) 2011 Elsevier Ltd. All rights reserved.
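The first-order model with interaction that underlies a 2^2 factorial RSM fit can be sketched by least squares on coded factor levels; the four response values below are hypothetical, not the paper's measurements:

```python
import numpy as np

# Coded 2^2 factorial design: x1 = DEO concentration, x2 = moisture (-1 low, +1 high)
X_design = np.array([[-1, -1], [+1, -1], [-1, +1], [+1, +1]], dtype=float)
# Hypothetical xylose recoveries (%) for the four runs -- illustrative values only
y = np.array([55.0, 62.0, 70.0, 81.0])

# Model matrix with intercept and interaction: y = b0 + b1*x1 + b2*x2 + b12*x1*x2
X = np.column_stack([np.ones(4), X_design[:, 0], X_design[:, 1],
                     X_design[:, 0] * X_design[:, 1]])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b12 = beta
print(f"b0={b0:.2f}  b1={b1:.2f}  b2={b2:.2f}  b12={b12:.2f}")

# Predicted recovery with both factors at their high levels (x1 = x2 = +1)
y_high = b0 + b1 + b2 + b12
```

With four runs and four coefficients the model is saturated, so the intercept equals the grand mean and each effect is a simple contrast of the corner responses, mirroring how factorial effects are usually reported.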

Relevance:

80.00%

Publisher:

Abstract:

A density function is to be estimated under the model assumption that it lies in a suitable Besov class and has compact support. To this end, a wavelet estimator TW that uses thresholding methods is examined in detail. The asymptotic rate of convergence of TW for a large number of observations is stated and proved. Finally, further wavelet estimators are surveyed and compared with TW. It turns out that TW attains the optimal rate of convergence under many model assumptions.
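The thresholding idea behind such estimators can be illustrated with a Haar-basis density estimate on synthetic data. This is a simplified sketch, not the estimator TW analyzed above; in particular, the threshold choice here is heuristic:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.5, 0.1, size=2000)          # synthetic sample, supported in [0, 1]

# A histogram on 2^J bins supplies the fine-scale coefficients of a Haar expansion
J, a, b = 6, 0.0, 1.0
nbins = 2 ** J
counts, _ = np.histogram(x, bins=nbins, range=(a, b))
width = (b - a) / nbins
c = counts / (len(x) * width)                # raw density estimate per bin

# Forward Haar transform: pairwise averages (approximation) and differences (detail)
details = []
for _ in range(J):
    avg = (c[0::2] + c[1::2]) / 2.0
    det = (c[0::2] - c[1::2]) / 2.0
    details.append(det)
    c = avg

# Soft-threshold every detail coefficient (a universal-style threshold; heuristic here)
lam = 0.5 * np.sqrt(2.0 * np.log(nbins))
details = [np.sign(d) * np.maximum(np.abs(d) - lam, 0.0) for d in details]

# Inverse transform back to the fine scale
for det in reversed(details):
    up = np.empty(2 * len(c))
    up[0::2] = c + det
    up[1::2] = c - det
    c = up

density = c                                   # smoothed density estimate on the bins
print(density.sum() * width)                  # integral is preserved by construction
```

Because the coarsest approximation coefficient is never thresholded, the total integral of the estimate is unchanged; only the local detail (and with it the noise) is shrunk.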

Relevance:

80.00%

Publisher:

Abstract:

Small-scale dynamic stochastic general equilibrium (DSGE) models have been treated as the benchmark of much of the monetary policy literature, given their ability to explain the impact of monetary policy on output, inflation and financial markets. The empirical failure of New Keynesian models is partially due to the Rational Expectations (RE) paradigm, which entails a tight structure on the dynamics of the system. Under this hypothesis, the agents are assumed to know the data generating process. In this paper, we propose the econometric analysis of New Keynesian DSGE models under an alternative expectations-generating paradigm, which can be regarded as an intermediate position between rational expectations and learning, namely an adapted version of the "Quasi-Rational" Expectations (QRE) hypothesis. Given the agents' statistical model, we build a pseudo-structural form from the baseline system of Euler equations, imposing that the length of the reduced form is the same as in the `best' statistical model.
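Under quasi-rational expectations, agents forecast with a well-fitting statistical model for the observed data, typically a VAR. A minimal sketch of estimating such a model by OLS on simulated data (illustrative only; this is not the paper's pseudo-structural estimation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a bivariate VAR(1), z_t = A z_{t-1} + e_t
# (think of z as, e.g., the output gap and inflation -- labels are ours, not the paper's)
A_true = np.array([[0.5, 0.1],
                   [0.2, 0.6]])
T = 5000
z = np.zeros((T, 2))
for t in range(1, T):
    z[t] = A_true @ z[t - 1] + 0.1 * rng.standard_normal(2)

# OLS estimate of the agents' statistical model: regress z_t on z_{t-1}
Y, X = z[1:], z[:-1]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
print(A_hat)

# One-step-ahead "quasi-rational" expectation given the last observation
expectation = A_hat @ z[-1]
```

The estimated coefficient matrix then plays the role of the expectations-generating mechanism that the pseudo-structural form is required to be consistent with.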

Relevance:

80.00%

Publisher:

Abstract:

Locally affine (polyaffine) image registration methods capture intersubject non-linear deformations with a low number of parameters, while providing an intuitive interpretation for clinicians. Considering the mandible bone, anatomical shape differences can be found at different scales, e.g. the left or right side, the teeth, etc. Classically, sequential coarse-to-fine registrations are used to handle multiscale deformations; instead, we propose a simultaneous optimization of all scales. To avoid local minima we incorporate a prior on the polyaffine transformations. This kind of groupwise registration approach is natural in a polyaffine context, if we assume one configuration of regions that describes an entire group of images, with varying transformations for each region. In this paper, we reformulate polyaffine deformations in a generative statistical model, which enables us to incorporate deformation statistics as a prior in a Bayesian setting. We find optimal transformations by maximizing the a posteriori probability. We assume that the polyaffine transformations follow a normal distribution with a mean and a concentration matrix. Parameters of the prior are estimated from an initial coarse-to-fine registration. Knowing the region structure, we develop a blockwise pseudoinverse to obtain the concentration matrix. To our knowledge, we are the first to introduce simultaneous multiscale optimization through groupwise polyaffine registration. We show results on 42 mandible CT images.
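With a Gaussian prior (mean plus concentration matrix) and a Gaussian data term, the maximum a posteriori estimate has a closed form. The sketch below uses a generic linear model standing in for the polyaffine parameterization; all quantities are illustrative:

```python
import numpy as np

# Toy stand-in for the registration problem: observed displacements y depend
# linearly on transformation parameters w, y = B w + noise.
rng = np.random.default_rng(2)
n, p = 200, 5
B = rng.standard_normal((n, p))
w_true = np.array([1.0, -0.5, 0.3, 0.0, 0.2])
sigma2 = 0.25                                   # noise variance
y = B @ w_true + np.sqrt(sigma2) * rng.standard_normal(n)

# Gaussian prior on the parameters: w ~ N(mu, Lambda^{-1}), Lambda = concentration
mu = np.zeros(p)
Lam = 2.0 * np.eye(p)

# MAP estimate: argmax p(w|y) solves (B^T B / sigma2 + Lambda) w = B^T y / sigma2 + Lambda mu
lhs = B.T @ B / sigma2 + Lam
rhs = B.T @ y / sigma2 + Lam @ mu
w_map = np.linalg.solve(lhs, rhs)
print(w_map)
```

The concentration matrix enters only through the normal equations, which is why a blockwise pseudoinverse of the empirical covariance (as in the paper) is enough to define the prior.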

Relevance:

80.00%

Publisher:

Abstract:

We derive a new class of iterative schemes for accelerating the convergence of the EM algorithm by exploiting the connection between fixed point iterations and extrapolation methods. First, we present a general formulation of one-step iterative schemes, which are obtained by cycling with the extrapolation methods. We then square the one-step schemes to obtain the new class of methods, which we call SQUAREM. Squaring a one-step iterative scheme is simply applying it twice within each cycle of the extrapolation method. Here we focus on the first-order or rank-one extrapolation methods for two reasons: (1) simplicity, and (2) computational efficiency. In particular, we study two first-order extrapolation methods, the reduced rank extrapolation (RRE1) and minimal polynomial extrapolation (MPE1). The convergence of the new schemes, both one-step and squared, is non-monotonic with respect to the residual norm. The first-order one-step and SQUAREM schemes are linearly convergent, like the EM algorithm, but they have a faster rate of convergence. We demonstrate, through five different examples, the effectiveness of the first-order SQUAREM schemes, SqRRE1 and SqMPE1, in accelerating the EM algorithm. The SQUAREM schemes are also shown to be vastly superior to their one-step counterparts, RRE1 and MPE1, in terms of computational efficiency. The proposed extrapolation schemes can fail due to the numerical problems of stagnation and near breakdown. We have developed a new hybrid iterative scheme that combines the RRE1 and MPE1 schemes in such a manner that it overcomes both stagnation and near breakdown. The squared first-order hybrid scheme, SqHyb1, emerges as the iterative scheme of choice based on our numerical experiments: it combines the fast convergence of SqMPE1 with the stability of SqRRE1, avoiding both near breakdown and stagnation. The SQUAREM methods can be incorporated very easily into an existing EM algorithm. 
They only require the basic EM step for their implementation and do not require any other auxiliary quantities such as the complete data log-likelihood and its gradient or Hessian. They are an attractive option in problems with a very large number of parameters, and in problems where the statistical model is complex, the EM algorithm is slow, and each EM step is computationally demanding.
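The squaring step applies to any fixed-point map, which is exactly why SQUAREM needs only the basic EM update. The sketch below follows the general first-order recipe (residual, curvature term, steplength), applied to a toy contraction standing in for an EM map rather than one of the paper's five examples:

```python
import numpy as np

def F(x):
    # Toy fixed-point map standing in for one EM step (fixed point: x = cos(x))
    return np.cos(x)

def squarem_step(x, F):
    """One squared first-order extrapolation step (SQUAREM-style)."""
    Fx = F(x)
    FFx = F(Fx)
    r = Fx - x                         # "EM" residual
    v = FFx - 2.0 * Fx + x             # curvature term
    if abs(v) < 1e-15:                 # already converged / degenerate step
        return FFx
    alpha = -abs(r) / abs(v)           # steplength used by the SqS-type schemes
    x_new = x - 2.0 * alpha * r + alpha ** 2 * v
    return F(x_new)                    # stabilizing EM step

def iterate(step, x0, tol=1e-10, max_iter=1000):
    x, n = x0, 0
    while abs(F(x) - x) > tol and n < max_iter:
        x = step(x)
        n += 1
    return x, n

x_plain, n_plain = iterate(F, 1.0)                         # plain "EM" iteration
x_sq, n_sq = iterate(lambda x: squarem_step(x, F), 1.0)    # squared scheme
print(n_plain, n_sq)
```

On this toy map the plain iteration needs dozens of steps while the squared scheme converges in a handful, illustrating the acceleration described above; the trailing `F(x_new)` is the stabilizing EM step the hybrid schemes also rely on.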

Relevance:

80.00%

Publisher:

Abstract:

Automatic identification and extraction of bone contours from X-ray images is an essential first step for further medical image analysis. In this paper we propose a 3D statistical model based framework for proximal femur contour extraction from calibrated X-ray images. Automatic initialization is solved by an estimation of Bayesian network algorithm that fits a multiple-component geometrical model to the X-ray data. The contour extraction is accomplished by a non-rigid 2D/3D registration between a 3D statistical model and the X-ray images, in which bone contours are extracted by graphical model based Bayesian inference. Preliminary experiments on clinical data sets verified its validity.

Relevance:

80.00%

Publisher:

Abstract:

The report explores the problem of detecting complex point target models in a MIMO radar system. A complex point target is a mathematical and statistical model for a radar target that is not resolved in space but exhibits varying complex reflectivity across the different bistatic view angles. The complex reflectivity can be modeled as a complex stochastic process whose index set is the set of all bistatic view angles, and the parameters of the stochastic process follow from an analysis of a target model comprising a number of ideal point scatterers randomly located within some radius of the target's center of mass. The proposed complex point targets may be applicable to statistical inference in multistatic or MIMO radar systems. Six different target models are summarized here: three 2-dimensional (Gaussian, Uniform Square, and Uniform Circle) and three 3-dimensional (Gaussian, Uniform Cube, and Uniform Sphere). They are assumed to have different distributions of the point scatterers' locations within the target. We develop data models for the received signals from such targets in the MIMO radar system with distributed assets and partially correlated signals, and consider the resulting detection problem, which reduces to the familiar Gauss-Gauss detection problem. We illustrate that the target parameters and the transmit signal influence detector performance through the target extent and the SNR, respectively. A series of receiver operating characteristic (ROC) curves is generated to show the impact of varying SNR on the detector. The Kullback-Leibler (KL) divergence is applied to quantify the difference between the density functions of scatterer locations under the different target models, showing how detector performance changes with the spatial extent of the point scatterers.
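For Gaussian densities the KL divergence used above has a closed form. A short sketch (the covariances are illustrative stand-ins for two scatterer-location models, not values from the report):

```python
import numpy as np

def kl_gauss(m0, S0, m1, S1):
    """KL divergence D( N(m0,S0) || N(m1,S1) ) in closed form."""
    k = len(m0)
    S1_inv = np.linalg.inv(S1)
    dm = m1 - m0
    return 0.5 * (np.trace(S1_inv @ S0)
                  + dm @ S1_inv @ dm
                  - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# Hypothetical 2-D scatterer-location covariances: a compact target vs. a wider one
m = np.zeros(2)
S_narrow = np.eye(2) * 1.0
S_wide   = np.eye(2) * 4.0
print(kl_gauss(m, S_narrow, m, S_wide))
```

A larger divergence between the two scatterer distributions corresponds to target models that the Gauss-Gauss detector can separate more easily, which is the role KL plays in the analysis above.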

Relevance:

80.00%

Publisher:

Abstract:

Standard procedures for forecasting flood risk (Bulletin 17B) assume annual maximum flood (AMF) series are stationary, meaning the distribution of flood flows is not significantly affected by climatic trends/cycles, or anthropogenic activities within the watershed. Historical flood events are therefore considered representative of future flood occurrences, and the risk associated with a given flood magnitude is modeled as constant over time. However, in light of increasing evidence to the contrary, this assumption should be reconsidered, especially as the existence of nonstationarity in AMF series can have significant impacts on planning and management of water resources and relevant infrastructure. Research presented in this thesis quantifies the degree of nonstationarity evident in AMF series for unimpaired watersheds throughout the contiguous U.S., identifies meteorological, climatic, and anthropogenic causes of this nonstationarity, and proposes an extension of the Bulletin 17B methodology which yields forecasts of flood risk that reflect climatic influences on flood magnitude. To appropriately forecast flood risk, it is necessary to consider the driving causes of nonstationarity in AMF series. Herein, large-scale climate patterns—including El Niño-Southern Oscillation (ENSO), Pacific Decadal Oscillation (PDO), North Atlantic Oscillation (NAO), and Atlantic Multidecadal Oscillation (AMO)—are identified as influencing factors on flood magnitude at numerous stations across the U.S. Strong relationships between flood magnitude and associated precipitation series were also observed for the majority of sites analyzed in the Upper Midwest and Northeastern regions of the U.S. Although relationships between flood magnitude and associated temperature series are not apparent, results do indicate that temperature is highly correlated with the timing of flood peaks. 
Despite consideration of watersheds classified as unimpaired, analyses also suggest that identified change-points in AMF series are due to dam construction and other types of regulation and diversion. Although not explored herein, trends in AMF series are also likely to be partially explained by changes in land use and land cover over time. Results obtained herein suggest that improved forecasts of flood risk may be obtained using a simple modification of the Bulletin 17B framework, wherein the mean and standard deviation of the log-transformed flows are modeled as functions of climate indices associated with oceanic-atmospheric patterns (e.g. AMO, ENSO, NAO, and PDO) with lead times between 3 and 9 months. Herein, one-year-ahead forecasts of the mean and standard deviation, and subsequently flood risk, are obtained by applying site-specific multivariate regression models, which reflect the phase and intensity of a given climate pattern, as well as possible impacts of coupling of the climate cycles. These forecasts of flood risk are compared with forecasts derived using the existing Bulletin 17B model; large differences in the one-year-ahead forecasts are observed in some locations. The increased knowledge of the inherent structure of AMF series and an improved understanding of physical and/or climatic causes of nonstationarity gained from this research should serve as insight for the formulation of a physical-causal statistical model, incorporating both climatic variations and human impacts, for flood risk over the longer planning horizons (e.g. 10-, 50-, and 100-year) necessary for water resources design, planning, and management.
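The proposed modification can be sketched as a site-specific regression of the log-flow mean on lagged climate indices, from which a conditional flood quantile follows. Everything below is synthetic: the index names, lead times, and coefficients are placeholders, and a constant spread and a simple log-normal (rather than the full Bulletin 17B log-Pearson III) are assumed for brevity:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical annual series: lagged climate indices and log10 annual-maximum flows
n = 60
X = np.column_stack([np.ones(n),
                     rng.standard_normal(n),    # e.g. an ENSO index at a 6-month lead
                     rng.standard_normal(n)])   # e.g. an AMO index at a 6-month lead
log_q = X @ np.array([3.0, 0.15, -0.10]) + 0.2 * rng.standard_normal(n)

# Site-specific regression of the mean of the log flows on the climate indices
beta = np.linalg.lstsq(X, log_q, rcond=None)[0]
resid = log_q - X @ beta
sigma = resid.std(ddof=X.shape[1])              # spread treated as constant here

# One-year-ahead forecast for given index values, and the "100-year" flood quantile
x_next = np.array([1.0, 1.5, -0.5])             # e.g. strong El Nino, cool AMO phase
mu_next = x_next @ beta
z99 = 2.326                                     # 99th percentile of the standard normal
q100 = 10.0 ** (mu_next + z99 * sigma)          # flow exceeded with 1% annual probability
print(mu_next, q100)
```

Because the quantile now depends on the forecast index values, the "100-year flood" varies from year to year, which is precisely the nonstationary behavior the thesis argues Bulletin 17B should accommodate.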

Relevance:

80.00%

Publisher:

Abstract:

Demand for bio-fuels is expected to increase, due to rising prices of fossil fuels and concerns over greenhouse gas emissions and energy security. The overall cost of biomass energy generation is primarily related to biomass harvesting activity, transportation, and storage. With a commercial-scale cellulosic ethanol processing facility in Kinross Township of Chippewa County, Michigan about to be built, models including a simulation model and an optimization model have been developed to provide decision support for the facility. Both models track cost, emissions and energy consumption. While the optimization model provides guidance for a long-term strategic plan, the simulation model aims to present detailed output for specified operational scenarios over an annual period. Most importantly, the simulation model considers the uncertainty of spring break-up timing, i.e., seasonal road restrictions. Spring break-up timing is important because it will impact the feasibility of harvesting activity and the time duration of transportation restrictions, which significantly changes the availability of feedstock for the processing facility. This thesis focuses on the statistical model of spring break-up used in the simulation model. Spring break-up timing depends on various factors, including temperature, road conditions and soil type, as well as individual decision making processes at the county level. The spring break-up model, based on the historical spring break-up data from 27 counties over the period of 2002-2010, starts by specifying the probability distribution of a particular county’s spring break-up start day and end day, and then relates the spring break-up timing of the other counties in the harvesting zone to the first county. In order to estimate the dependence relationship between counties, regression analyses, including standard linear regression and reduced major axis regression, are conducted. 
Using realizations (scenarios) of spring break-up generated by the statistical spring breakup model, the simulation model is able to probabilistically evaluate different harvesting and transportation plans to help the bio-fuel facility select the most effective strategy. For early spring break-up, which usually indicates a longer than average break-up period, more log storage is required, total cost increases, and the probability of plant closure increases. The risk of plant closure may be partially offset through increased use of rail transportation, which is not subject to spring break-up restrictions. However, rail availability and rail yard storage may then become limiting factors in the supply chain. Rail use will impact total cost, energy consumption, system-wide CO2 emissions, and the reliability of providing feedstock to the bio-fuel processing facility.
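Reduced major axis regression, used above to relate counties' break-up timing, differs from ordinary least squares in treating both variables as error-prone. A minimal sketch with hypothetical break-up start days (not the 2002-2010 county data):

```python
import numpy as np

def rma_regression(x, y):
    """Reduced major axis (geometric mean) regression of y on x.

    Unlike OLS, RMA treats both variables as subject to error:
    slope = sign(r) * sd(y) / sd(x), and the line passes through the means.
    """
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * y.std(ddof=1) / x.std(ddof=1)
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

# Hypothetical break-up start days (day of year) for a reference county (x)
# and a neighboring county (y) over nine seasons -- illustrative values only
x = np.array([65, 70, 72, 75, 78, 80, 83, 85, 90], dtype=float)
y = np.array([68, 72, 76, 77, 82, 83, 88, 88, 95], dtype=float)
slope, intercept = rma_regression(x, y)
print(f"day_y ~= {slope:.2f} * day_x + {intercept:.2f}")
```

Fitted relationships of this form let the simulation model translate a sampled break-up date for the first county into dates for the rest of the harvesting zone.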

Relevance:

80.00%

Publisher:

Abstract:

Airway disease in childhood comprises a heterogeneous group of disorders. Attempts to distinguish different phenotypes have generally considered few disease dimensions. The present study examines phenotypes of childhood wheeze and chronic cough, by fitting a statistical model to data representing multiple disease dimensions. From a population-based, longitudinal cohort study of 1,650 preschool children, 319 with parent-reported wheeze or chronic cough were included. Phenotypes were identified by latent class analysis using data on symptoms, skin-prick tests, lung function and airway responsiveness from two preschool surveys. These phenotypes were then compared with respect to outcome at school age. The model distinguished three phenotypes of wheeze and two phenotypes of chronic cough. Subsequent wheeze, chronic cough and inhaler use at school age differed clearly between the five phenotypes. The wheeze phenotypes shared features with previously described entities and partly reconciled discrepancies between existing sets of phenotype labels. This novel, multidimensional approach has the potential to identify clinically relevant phenotypes, not only in paediatric disorders but also in adult obstructive airway diseases, where phenotype definition is an equally important issue.
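Latent class analysis with binary indicators can be fitted by EM. The sketch below uses synthetic "symptom" data with two classes and invented item probabilities; it is not the cohort's five-phenotype model, which also used continuous lung-function measures:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic data: two latent phenotypes with different binary symptom probabilities
n, items = 2000, 4
p_true = np.array([[0.9, 0.8, 0.9, 0.7],    # hypothetical class 0 (e.g. wheeze-dominant)
                   [0.1, 0.2, 0.1, 0.3]])   # hypothetical class 1 (e.g. cough-dominant)
z = rng.random(n) < 0.4                      # 40% belong to class 1
Y = (rng.random((n, items)) < p_true[z.astype(int)]).astype(float)

# EM for a 2-class latent class model with independent Bernoulli items
K = 2
pi = np.full(K, 1.0 / K)                     # class weights
p = rng.uniform(0.3, 0.7, size=(K, items))   # item probabilities, random start
for _ in range(200):
    # E-step: responsibilities from Bernoulli log-likelihoods plus log-weights
    log_lik = (Y @ np.log(p).T + (1 - Y) @ np.log(1 - p).T) + np.log(pi)
    log_lik -= log_lik.max(axis=1, keepdims=True)
    resp = np.exp(log_lik)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: update class weights and item probabilities
    pi = resp.mean(axis=0)
    p = (resp.T @ Y) / resp.sum(axis=0)[:, None]
    p = p.clip(1e-6, 1 - 1e-6)

order = np.argsort(p[:, 0])                  # fix label switching via the first item
print(np.round(p[order], 2))
```

The fitted item-probability profiles are what give each latent class its clinical interpretation, analogous to the wheeze and cough phenotypes described above.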

Relevance:

80.00%

Publisher:

Abstract:

The reasons for the development and collapse of Maya civilization remain controversial, and historical events carved on stone monuments throughout this region provide a remarkable source of data about the rise and fall of these complex polities. Use of these records depends on correlating the Maya and European calendars so that they can be compared with climate and environmental datasets. Proposed correlation constants vary by up to 1000 years and remain controversial. We report a series of high-resolution AMS C-14 dates on a wooden lintel collected from the Classic Period city of Tikal bearing Maya calendar dates. The radiocarbon dates were calibrated using a Bayesian statistical model and indicate that the dates were carved on the lintel between AD 658 and 696. This strongly supports the Goodman-Martinez-Thompson (GMT) correlation and the hypothesis that climate change played an important role in the development and demise of this complex civilization.
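Bayesian calibration of a single radiocarbon date can be sketched as a grid posterior over calendar years. The calibration curve below is a hypothetical linear segment, not IntCal, and the measurement is invented; a real analysis (e.g. with OxCal) would also use the ordering constraints among the lintel dates:

```python
import numpy as np

# Calendar-year grid (AD) and a hypothetical, locally linear calibration curve
# mapping calendar year to conventional 14C age BP (illustrative, not IntCal)
cal_years = np.arange(600, 801)                    # AD 600-800
c14_curve = 1950.0 - 1.05 * cal_years              # toy curve: ~1320 BP near AD 600
curve_sigma = 10.0                                 # curve uncertainty (14C yr)

# A single hypothetical AMS measurement: 1290 +/- 20 14C yr BP
meas, meas_sigma = 1290.0, 20.0

# Likelihood of each calendar year: Gaussian in 14C-age space,
# combining measurement and curve uncertainties
s2 = meas_sigma ** 2 + curve_sigma ** 2
lik = np.exp(-0.5 * (meas - c14_curve) ** 2 / s2)

posterior = lik / lik.sum()                        # flat prior over the grid
mode_year = cal_years[np.argmax(posterior)]
print(mode_year)
```

The posterior mass over calendar years, rather than a single point estimate, is what allows statements such as "carved between AD 658 and 696" at a stated probability.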

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVES: The aim of the study was to assess whether prospective follow-up data within the Swiss HIV Cohort Study can be used to predict patients who stop smoking, or, among smokers who stop, those who start smoking again. METHODS: We built prediction models first using clinical reasoning ('clinical models') and then by selecting from numerous candidate predictors using advanced statistical methods ('statistical models'). Our clinical models were based on literature suggesting that motivation drives smoking cessation, while dependence drives relapse in those attempting to stop. Our statistical models were based on automatic variable selection using additive logistic regression with component-wise gradient boosting. RESULTS: Of 4833 smokers, 26% stopped smoking, at least temporarily; among those who stopped, 48% started smoking again. The predictive performance of our clinical and statistical models was modest. A basic clinical model for cessation, with patients classified into three motivational groups, was nearly as discriminatory as a constrained statistical model with just the most important predictors (the ratio of nonsmoking visits to total visits, alcohol or drug dependence, psychiatric comorbidities, recent hospitalization and age). A basic clinical model for relapse, based on the maximum number of cigarettes per day prior to stopping, was not as discriminatory as a constrained statistical model with just the ratio of nonsmoking visits to total visits. CONCLUSIONS: Predicting smoking cessation and relapse is difficult, so simple models are nearly as discriminatory as complex ones. Patients with a history of attempting to stop and those known to have stopped recently are the best candidates for an intervention.
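Component-wise gradient boosting selects and updates one covariate per step, which is what performs the automatic variable selection described above. A minimal sketch for logistic loss on synthetic data (the covariates are generic stand-ins, not the Swiss HIV Cohort variables, and real implementations such as R's mboost add stopping rules):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic cohort: several candidate predictors, only the first two informative
n, p = 3000, 6
X = rng.standard_normal((n, p))
logit = 1.2 * X[:, 0] - 0.8 * X[:, 1]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

# Component-wise gradient boosting for additive logistic regression:
# each step fits the negative gradient (y - prob) by univariate least squares
# on every covariate and updates only the best-fitting one.
beta = np.zeros(p)
intercept = 0.0
nu = 0.1                                           # step length / learning rate
for _ in range(300):
    prob = 1.0 / (1.0 + np.exp(-(X @ beta + intercept)))
    grad = y - prob                                # negative gradient of the log-loss
    intercept += nu * grad.mean()
    # score each covariate by the squared-error reduction of its univariate fit
    coefs = (X * grad[:, None]).sum(axis=0) / (X ** 2).sum(axis=0)
    scores = coefs ** 2 * (X ** 2).sum(axis=0)
    j = int(np.argmax(scores))
    beta[j] += nu * coefs[j]

print(np.round(beta, 2))   # mass concentrates on the informative covariates
```

Uninformative covariates are rarely selected and so keep near-zero coefficients, which is how boosting yields the sparse "constrained" models the abstract compares against clinical reasoning.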