937 results for Bose-Einstein condensation statistical model


Relevance:

100.00%

Abstract:

We derive a new class of iterative schemes for accelerating the convergence of the EM algorithm, by exploiting the connection between fixed point iterations and extrapolation methods. First, we present a general formulation of one-step iterative schemes, which are obtained by cycling with the extrapolation methods. We then square the one-step schemes to obtain the new class of methods, which we call SQUAREM. Squaring a one-step iterative scheme is simply applying it twice within each cycle of the extrapolation method. Here we focus on the first-order, or rank-one, extrapolation methods for two reasons: (1) simplicity and (2) computational efficiency. In particular, we study two first-order extrapolation methods, the reduced rank extrapolation (RRE1) and minimal polynomial extrapolation (MPE1). The convergence of the new schemes, both one-step and squared, is non-monotonic with respect to the residual norm. The first-order one-step and SQUAREM schemes are linearly convergent, like the EM algorithm, but with a faster rate of convergence. We demonstrate, through five different examples, the effectiveness of the first-order SQUAREM schemes, SqRRE1 and SqMPE1, in accelerating the EM algorithm. The SQUAREM schemes are also shown to be vastly superior to their one-step counterparts, RRE1 and MPE1, in terms of computational efficiency. The proposed extrapolation schemes can fail due to the numerical problems of stagnation and near-breakdown. We have developed a new hybrid iterative scheme that combines the RRE1 and MPE1 schemes in such a manner that it overcomes both stagnation and near-breakdown. The squared first-order hybrid scheme, SqHyb1, emerges as the iterative scheme of choice based on our numerical experiments: it combines the fast convergence of SqMPE1, while avoiding its near-breakdowns, with the stability of SqRRE1, while avoiding its stagnations. The SQUAREM methods can be incorporated very easily into an existing EM algorithm. They only require the basic EM step for their implementation and do not require any other auxiliary quantities such as the complete-data log-likelihood, its gradient, or its Hessian. They are an attractive option in problems with a very large number of parameters, and in problems where the statistical model is complex, the EM algorithm is slow, and each EM step is computationally demanding.
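
A minimal sketch of one SQUAREM cycle, assuming only a user-supplied EM mapping. The steplength shown is the common norm-ratio choice rather than the exact RRE1/MPE1 formulas, and the toy fixed-point map stands in for a real EM update:

```python
import numpy as np

def squarem_step(theta, em_update):
    """One SQUAREM cycle: two EM steps, a squared extrapolation, one stabilizing EM step.

    em_update : callable mapping a parameter vector to its EM update F(theta).
    The steplength used here is the norm-ratio choice alpha = -||r|| / ||v||;
    RRE1- and MPE1-based variants use different ratios of the same inner products.
    """
    theta1 = em_update(theta)                # first EM step
    theta2 = em_update(theta1)               # second EM step ("squaring" uses both)
    r = theta1 - theta                       # first difference
    v = (theta2 - theta1) - r                # second difference
    if np.linalg.norm(v) == 0:               # already at a fixed point
        return theta2
    alpha = -np.linalg.norm(r) / np.linalg.norm(v)
    alpha = min(alpha, -1.0)                 # safeguard: at least an EM-sized step
    theta_new = theta - 2 * alpha * r + alpha**2 * v   # squared extrapolation
    return em_update(theta_new)              # a final EM step stabilizes the cycle

# Toy linearly convergent fixed-point map (a stand-in for an EM update).
A = np.diag([0.9, 0.99]); b = np.array([1.0, 1.0])
F = lambda x: A @ x + b
x = np.zeros(2)
for _ in range(20):
    x = squarem_step(x, F)
print(x, np.linalg.solve(np.eye(2) - A, b))  # converges to the fixed point
```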

Relevance:

100.00%

Abstract:

Automatic identification and extraction of bone contours from X-ray images is an essential first step for further medical image analysis. In this paper we propose a 3D statistical model-based framework for proximal femur contour extraction from calibrated X-ray images. The automatic initialization is solved by an estimation of Bayesian network algorithm, which fits a multiple-component geometrical model to the X-ray data. The contour extraction is accomplished by a non-rigid 2D/3D registration between a 3D statistical model and the X-ray images, in which bone contours are extracted by graphical model-based Bayesian inference. Preliminary experiments on clinical data sets verified its validity.
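
The 3D statistical model in such pipelines is typically a PCA point-distribution model built from aligned training shapes; a generic sketch (illustrative, not the authors' implementation):

```python
import numpy as np

def build_ssm(shapes, var_kept=0.95):
    """Fit a PCA-based statistical shape model (point distribution model).

    shapes : (n_samples, n_points * dim) array of aligned landmark coordinates.
    Returns the mean shape, the principal modes, and per-mode standard deviations.
    """
    mean = shapes.mean(axis=0)
    X = shapes - mean
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    var = s**2 / (len(shapes) - 1)                    # variance per mode
    k = np.searchsorted(np.cumsum(var) / var.sum(), var_kept) + 1
    return mean, Vt[:k], np.sqrt(var[:k])

def synthesize(mean, modes, sdev, b):
    """Generate a shape from mode coefficients b (in standard-deviation units)."""
    return mean + (b * sdev) @ modes

# Toy usage: 20 noisy variants of a 2D circle sampled at 30 landmarks.
t = np.linspace(0, 2 * np.pi, 30, endpoint=False)
base = np.c_[np.cos(t), np.sin(t)].ravel()
rng = np.random.default_rng(0)
data = base + 0.05 * rng.standard_normal((20, base.size))
mean, modes, sdev = build_ssm(data)
new_shape = synthesize(mean, modes, sdev, np.zeros(len(sdev)))  # the mean shape
```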

Relevance:

100.00%

Abstract:

The report explores the problem of detecting complex point targets in a MIMO radar system. A complex point target is a mathematical and statistical model for a radar target that is not resolved in space but exhibits varying complex reflectivity across the different bistatic view angles. The complex reflectivity can be modeled as a complex stochastic process whose index set is the set of all bistatic view angles, and the parameters of the stochastic process follow from an analysis of a target model comprising a number of ideal point scatterers randomly located within some radius of the target's center of mass. The proposed complex point targets may be applicable to statistical inference in multistatic or MIMO radar systems. Six different target models are summarized here: three 2-dimensional (Gaussian, Uniform Square, and Uniform Circle) and three 3-dimensional (Gaussian, Uniform Cube, and Uniform Sphere), which assume different distributions for the locations of the point scatterers within the target. We develop data models for the received signals from such targets in a MIMO radar system with distributed assets and partially correlated signals, and consider the resulting detection problem, which reduces to the familiar Gauss-Gauss detection problem. We illustrate that the target parameters and the transmit signal influence detector performance through the target extent and the SNR, respectively. A series of receiver operating characteristic (ROC) curves is generated to show the impact of varying SNR on the detector. The Kullback-Leibler (KL) divergence is applied to obtain the approximate mean difference between the density functions the scatterers assume inside the target models, showing how detector performance changes with the target extent of the point scatterers.
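
Since the detection problem reduces to Gauss-Gauss, the KL divergence between the two zero-mean hypothesis distributions has a closed form; a sketch with an illustrative "extent"-controlled covariance (not the report's six target models):

```python
import numpy as np

def kl_div_gauss(cov_p, cov_q):
    """KL divergence D( N(0, cov_p) || N(0, cov_q) ) between zero-mean Gaussians.

    In the Gauss-Gauss detection problem the data are zero-mean Gaussian under
    both hypotheses and only the covariance differs, so this divergence is a
    natural summary of how separable the hypotheses are.
    """
    n = cov_p.shape[0]
    m = np.linalg.solve(cov_q, cov_p)          # cov_q^{-1} cov_p
    return 0.5 * (np.trace(m) - n - np.log(np.linalg.det(m)))

# Toy example: H0 is white noise; H1 adds a target return whose correlation
# across bistatic views is controlled by an "extent" parameter (illustrative).
n = 8
R0 = np.eye(n)
i, j = np.indices((n, n))
for extent in (0.1, 1.0, 10.0):
    R1 = R0 + np.exp(-(i - j) ** 2 / extent)   # broader extent, flatter kernel
    print(extent, kl_div_gauss(R1, R0))        # divergence shifts with extent
```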

Relevance:

100.00%

Abstract:

Standard procedures for forecasting flood risk (Bulletin 17B) assume annual maximum flood (AMF) series are stationary, meaning the distribution of flood flows is not significantly affected by climatic trends/cycles or anthropogenic activities within the watershed. Historical flood events are therefore considered representative of future flood occurrences, and the risk associated with a given flood magnitude is modeled as constant over time. However, in light of increasing evidence to the contrary, this assumption should be reconsidered, especially as the existence of nonstationarity in AMF series can have significant impacts on the planning and management of water resources and relevant infrastructure. Research presented in this thesis quantifies the degree of nonstationarity evident in AMF series for unimpaired watersheds throughout the contiguous U.S., identifies meteorological, climatic, and anthropogenic causes of this nonstationarity, and proposes an extension of the Bulletin 17B methodology that yields forecasts of flood risk reflecting climatic influences on flood magnitude.

To appropriately forecast flood risk, it is necessary to consider the driving causes of nonstationarity in AMF series. Herein, large-scale climate patterns, including the El Niño-Southern Oscillation (ENSO), Pacific Decadal Oscillation (PDO), North Atlantic Oscillation (NAO), and Atlantic Multidecadal Oscillation (AMO), are identified as influencing factors on flood magnitude at numerous stations across the U.S. Strong relationships between flood magnitude and associated precipitation series were also observed for the majority of sites analyzed in the Upper Midwest and Northeastern regions of the U.S. Although relationships between flood magnitude and associated temperature series are not apparent, results do indicate that temperature is highly correlated with the timing of flood peaks. Despite the restriction to watersheds classified as unimpaired, analyses also suggest that identified change-points in AMF series are due to dam construction and other types of regulation and diversion. Although not explored herein, trends in AMF series are also likely to be partially explained by changes in land use and land cover over time.

Results obtained herein suggest that improved forecasts of flood risk may be obtained using a simple modification of the Bulletin 17B framework, wherein the mean and standard deviation of the log-transformed flows are modeled as functions of climate indices associated with oceanic-atmospheric patterns (e.g., AMO, ENSO, NAO, and PDO) with lead times between 3 and 9 months. Herein, one-year-ahead forecasts of the mean and standard deviation, and subsequently flood risk, are obtained by applying site-specific multivariate regression models, which reflect the phase and intensity of a given climate pattern, as well as possible impacts of coupling of the climate cycles. These forecasts of flood risk are compared with forecasts derived using the existing Bulletin 17B model; large differences in the one-year-ahead forecasts are observed in some locations. The increased knowledge of the inherent structure of AMF series and the improved understanding of physical and/or climatic causes of nonstationarity gained from this research should serve as insight for the formulation of a physical-causal statistical model of flood risk, incorporating both climatic variations and human impacts, over the longer planning horizons (e.g., 10-, 50-, and 100-year) necessary for water resources design, planning, and management.
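
A minimal sketch of the moment-regression idea described above, assuming a lognormal flood distribution for brevity (rather than the log-Pearson III of Bulletin 17B); the indices, coefficients, and 60-year toy record are synthetic:

```python
import numpy as np
from scipy.stats import norm

def fit_moment_model(log_flows, indices):
    """Regress annual log flows on lagged climate indices by least squares.

    log_flows : (n_years,) log-transformed annual maximum floods.
    indices   : (n_years, k) climate indices (e.g., ENSO/PDO/NAO/AMO state)
                observed 3-9 months ahead of the flood season.
    Returns coefficients for the conditional mean and a residual std estimate.
    """
    X = np.c_[np.ones(len(indices)), indices]
    beta, *_ = np.linalg.lstsq(X, log_flows, rcond=None)
    sigma = (log_flows - X @ beta).std(ddof=X.shape[1])
    return beta, sigma

def flood_quantile(beta, sigma, next_indices, aep=0.01):
    """One-year-ahead flood magnitude with annual exceedance probability aep."""
    mu = beta[0] + beta[1:] @ next_indices          # conditional mean of log flow
    return np.exp(mu + norm.ppf(1 - aep) * sigma)   # e.g., the "100-year" flood

# Toy usage with two synthetic climate indices over a 60-year record.
rng = np.random.default_rng(1)
idx = rng.standard_normal((60, 2))
logq = 5.0 + 0.4 * idx[:, 0] + 0.1 * rng.standard_normal(60)
beta, sigma = fit_moment_model(logq, idx)
print(flood_quantile(beta, sigma, np.array([1.5, -0.5])))  # a strong-phase year
```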

Relevance:

100.00%

Abstract:

Demand for bio-fuels is expected to increase, due to rising prices of fossil fuels and concerns over greenhouse gas emissions and energy security. The overall cost of biomass energy generation is primarily related to biomass harvesting activity, transportation, and storage. With a commercial-scale cellulosic ethanol processing facility about to be built in Kinross Township of Chippewa County, Michigan, a simulation model and an optimization model have been developed to provide decision support for the facility. Both models track cost, emissions, and energy consumption. While the optimization model provides guidance for a long-term strategic plan, the simulation model aims to present detailed output for specified operational scenarios over an annual period. Most importantly, the simulation model considers the uncertainty of spring break-up timing, i.e., seasonal road restrictions. Spring break-up timing is important because it affects the feasibility of harvesting activity and the duration of transportation restrictions, which significantly changes the availability of feedstock for the processing facility.

This thesis focuses on the statistical model of spring break-up used in the simulation model. Spring break-up timing depends on various factors, including temperature, road conditions, and soil type, as well as individual decision-making processes at the county level. The spring break-up model, based on historical spring break-up data from 27 counties over the period 2002-2010, starts by specifying the probability distribution of a particular county's spring break-up start day and end day, and then relates the spring break-up timing of the other counties in the harvesting zone to the first county. In order to estimate the dependence relationship between counties, regression analyses, including standard linear regression and reduced major axis regression, are conducted. Using realizations (scenarios) of spring break-up generated by the statistical spring break-up model, the simulation model is able to probabilistically evaluate different harvesting and transportation plans to help the bio-fuel facility select the most effective strategy.

For early spring break-up, which usually indicates a longer-than-average break-up period, more log storage is required, total cost increases, and the probability of plant closure increases. The risk of plant closure may be partially offset through increased use of rail transportation, which is not subject to spring break-up restrictions. However, rail availability and rail yard storage may then become limiting factors in the supply chain. Rail use will impact total cost, energy consumption, system-wide CO2 emissions, and the reliability of providing feedstock to the bio-fuel processing facility.
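
A minimal sketch of the two-stage scenario generator described above: marginal distributions for a reference county, then regression links to the others. All parameters are illustrative placeholders, not values fitted to the 2002-2010 county records:

```python
import numpy as np

def breakup_scenarios(n_scenarios, rng=None):
    """Generate spring break-up (start, end) days for a reference county and
    one dependent county, mirroring the two-stage model structure: a marginal
    distribution for the first county, then a regression link to the others.
    """
    rng = rng or np.random.default_rng(0)
    start1 = rng.normal(75, 7, n_scenarios)            # day of year, reference county
    end1 = start1 + rng.gamma(16.0, 2.5, n_scenarios)  # duration of about 40 days
    # Dependent county via a (regression-estimated) linear relationship plus noise.
    start2 = 5.0 + 0.95 * start1 + rng.normal(0, 3, n_scenarios)
    end2 = 8.0 + 0.90 * end1 + rng.normal(0, 3, n_scenarios)
    return np.c_[start1, end1, start2, end2]

scenarios = breakup_scenarios(1000)
early = scenarios[:, 0] < 65                           # "early break-up" scenarios
print((scenarios[early, 1] - scenarios[early, 0]).mean())  # their mean duration
```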

Relevance:

100.00%

Abstract:

Airway disease in childhood comprises a heterogeneous group of disorders. Attempts to distinguish different phenotypes have generally considered only a few disease dimensions. The present study examines phenotypes of childhood wheeze and chronic cough by fitting a statistical model to data representing multiple disease dimensions. From a population-based, longitudinal cohort study of 1,650 preschool children, 319 with parent-reported wheeze or chronic cough were included. Phenotypes were identified by latent class analysis using data on symptoms, skin-prick tests, lung function and airway responsiveness from two preschool surveys. These phenotypes were then compared with respect to outcome at school age. The model distinguished three phenotypes of wheeze and two phenotypes of chronic cough. Subsequent wheeze, chronic cough and inhaler use at school age differed clearly between the five phenotypes. The wheeze phenotypes shared features with previously described entities and partly reconciled discrepancies between existing sets of phenotype labels. This novel, multidimensional approach has the potential to identify clinically relevant phenotypes, not only in paediatric disorders but also in adult obstructive airway diseases, where phenotype definition is an equally important issue.
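
For binary indicators, latent class analysis reduces to an EM fit of a mixture of independent Bernoullis; a compact generic sketch (not the study's software, which also accommodated lung function and responsiveness measures):

```python
import numpy as np

def lca_em(X, n_classes, n_iter=200, rng=None):
    """Latent class analysis for binary indicators via EM.

    X : (n, d) 0/1 matrix (e.g., symptom and test indicators per child).
    Returns class weights, per-class item probabilities, and posterior
    class memberships.
    """
    rng = rng or np.random.default_rng(0)
    n, d = X.shape
    w = np.full(n_classes, 1.0 / n_classes)            # class weights
    p = rng.uniform(0.3, 0.7, (n_classes, d))          # item probabilities
    for _ in range(n_iter):
        # E-step: posterior of class given responses (Bernoulli likelihood).
        logl = X @ np.log(p).T + (1 - X) @ np.log(1 - p).T + np.log(w)
        logl -= logl.max(axis=1, keepdims=True)
        post = np.exp(logl)
        post /= post.sum(axis=1, keepdims=True)        # (n, n_classes)
        # M-step: reestimate weights and item probabilities.
        w = post.mean(axis=0)
        p = np.clip((post.T @ X) / post.sum(axis=0)[:, None], 1e-6, 1 - 1e-6)
    return w, p, post

# Toy usage: two planted classes with distinct symptom profiles.
rng = np.random.default_rng(1)
z = rng.integers(0, 2, 300)
true_p = np.array([[0.8, 0.7, 0.1], [0.2, 0.3, 0.9]])
X = (rng.random((300, 3)) < true_p[z]).astype(float)
w, p, post = lca_em(X, n_classes=2)
```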

Relevance:

100.00%

Abstract:

The reasons for the development and collapse of the Maya civilization remain controversial, and historical events carved on stone monuments throughout this region provide a remarkable source of data about the rise and fall of these complex polities. Use of these records depends on correlating the Maya and European calendars so that they can be compared with climate and environmental datasets. Proposed correlation constants can vary by up to 1000 years and remain controversial. We report a series of high-resolution AMS C-14 dates on a wooden lintel collected from the Classic Period city of Tikal and bearing Maya calendar dates. The radiocarbon dates were calibrated using a Bayesian statistical model and indicate that the calendar dates were carved on the lintel between AD 658 and 696. This strongly supports the Goodman-Martinez-Thompson (GMT) correlation and the hypothesis that climate change played an important role in the development and demise of this complex civilization.
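
A minimal sketch of grid-based Bayesian calibration for a single date, with a made-up curve standing in for a real calibration curve such as IntCal; the study's actual model jointly calibrates the series of dates with the ordering constraints the lintel provides:

```python
import numpy as np

def calibrate_c14(c14_age, c14_err, curve_mu, curve_err):
    """Grid-based Bayesian calibration of one radiocarbon date.

    With a flat prior on the calendar year t, the posterior is
        p(t | y) proportional to N(y; mu(t), sigma^2 + s(t)^2),
    where (curve_mu, curve_err) tabulate the calibration curve on a
    calendar-year grid.
    """
    var = c14_err ** 2 + curve_err ** 2
    log_post = -0.5 * (c14_age - curve_mu) ** 2 / var - 0.5 * np.log(var)
    post = np.exp(log_post - log_post.max())
    return post / post.sum()

# Toy calibration curve: linear trend plus a wiggle (purely illustrative).
cal_years = np.arange(550, 800)                        # calendar years AD
curve_mu = 1950.0 - cal_years + 20 * np.sin(cal_years / 15.0)
post = calibrate_c14(1290, 25, curve_mu, np.full_like(curve_mu, 10.0))
print(cal_years[np.argmax(post)])                      # posterior mode (AD)
```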

Relevance:

100.00%

Abstract:

OBJECTIVES: The aim of the study was to assess whether prospective follow-up data within the Swiss HIV Cohort Study can be used to predict patients who stop smoking, or, among smokers who stop, those who start smoking again. METHODS: We built prediction models first using clinical reasoning ('clinical models') and then by selecting from numerous candidate predictors using advanced statistical methods ('statistical models'). Our clinical models were based on literature suggesting that motivation drives smoking cessation, while dependence drives relapse in those attempting to stop. Our statistical models were based on automatic variable selection using additive logistic regression with component-wise gradient boosting. RESULTS: Of 4833 smokers, 26% stopped smoking, at least temporarily; among those who stopped, 48% started smoking again. The predictive performance of our clinical and statistical models was modest. A basic clinical model for cessation, with patients classified into three motivational groups, was nearly as discriminatory as a constrained statistical model with just the most important predictors (the ratio of nonsmoking visits to total visits, alcohol or drug dependence, psychiatric comorbidities, recent hospitalization, and age). A basic clinical model for relapse, based on the maximum number of cigarettes per day prior to stopping, was not as discriminatory as a constrained statistical model with just the ratio of nonsmoking visits to total visits. CONCLUSIONS: Predicting smoking cessation and relapse is difficult, so simple models are nearly as discriminatory as complex ones. Patients with a history of attempting to stop and those known to have stopped recently are the best candidates for an intervention.
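
The component-wise gradient boosting in the methods is, in miniature, the following loop: a generic sketch of the idea behind tools like glmboost, not the authors' setup, with synthetic data and names:

```python
import numpy as np

def glm_boost(X, y, n_steps=300, nu=0.1):
    """Component-wise gradient boosting for logistic regression.

    At each step, fit every single predictor to the negative gradient of the
    logistic loss and update only the best-fitting coefficient, so variable
    selection happens implicitly. X should be centered/scaled; y is 0/1.
    """
    n, d = X.shape
    beta = np.zeros(d)
    intercept = np.log(y.mean() / (1 - y.mean()))
    f = np.full(n, intercept)
    col_ss = (X ** 2).sum(axis=0)             # per-column sum of squares
    for _ in range(n_steps):
        u = y - 1 / (1 + np.exp(-f))          # negative gradient of logistic loss
        fits = X.T @ u / col_ss               # univariate LS coefficient per column
        j = np.argmax(fits ** 2 * col_ss)     # column explaining the most of u
        beta[j] += nu * fits[j]               # update only that coefficient
        f += nu * fits[j] * X[:, j]
    return intercept, beta

# Toy usage: only predictors 0 and 3 matter; boosting should pick them out.
rng = np.random.default_rng(0)
X = rng.standard_normal((400, 10))
y = (X[:, 0] - 0.8 * X[:, 3] + rng.standard_normal(400) > 0).astype(float)
intercept, beta = glm_boost(X, y)
print(np.nonzero(np.abs(beta) > 1e-8)[0])    # indices of selected predictors
```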

Relevance:

100.00%

Abstract:

Fission fragment mass distributions were measured in heavy-ion-induced fission of 238U. The mass distributions changed drastically with incident energy. The results are explained by a change in the ratio between fusion and quasifission with nuclear orientation. A calculation based on a fluctuation-dissipation model reproduced the mass distributions and their incident-energy dependence, and the fusion probability was determined in this analysis. Evaporation residue cross sections were calculated with a statistical model for the reactions 30Si+238U and 34S+238U using the fusion probability obtained for the entrance channel. The results agree with the measured cross sections of 263,264Sg and 267,268Hs, produced by 30Si+238U and 34S+238U, respectively. It is also suggested that sub-barrier energies can be used for heavy-element synthesis.
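
For orientation, statistical-model estimates of this kind factorize the evaporation-residue cross section into capture, fusion, and survival terms; a toy numeric sketch with placeholder values, not the measured Sg/Hs numbers:

```python
def evaporation_residue_xs(sigma_cap_mb, p_fusion, w_survive):
    """Evaporation-residue cross section in the standard factorized form:

        sigma_ER = sigma_capture * P_fusion * W_survival

    sigma_cap_mb : capture cross section (mb) at the given beam energy
    p_fusion     : probability the captured system fuses rather than undergoing
                   quasifission (the quantity extracted from the mass distributions)
    w_survive    : probability the compound nucleus survives fission during
                   neutron evaporation (from a statistical-model calculation)
    """
    return sigma_cap_mb * p_fusion * w_survive

# Example: a few mb of capture, strongly hindered fusion, tiny survival.
print(evaporation_residue_xs(5.0, 1e-3, 2e-6) * 1e9, "pb")  # mb converted to pb
```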

Relevance:

100.00%

Abstract:

We describe the recovery of three daily meteorological records for the southern Alps (Domodossola, Riva del Garda, and Rovereto), all starting in the second half of the nineteenth century. We use these new data, along with additional records, to study regional changes in mean temperature and in extreme indices of heat-wave and cold-spell frequency and duration over the period 1874–2015. The records are homogenized using subdaily cloud cover observations as a constraint for the statistical model, an approach that has not previously been applied in the literature. A case study based on a record of parallel observations between a traditional meteorological window and a modern screen shows that the use of cloud cover can reduce the root-mean-square error of the homogenization by up to 30% in comparison to an unaided statistical correction. We find that mean temperature in the southern Alps has increased by 1.4°C per century over the analyzed period, with larger increases in daily minimum temperatures than in maximum temperatures. The number of hot days in summer has more than tripled, and a similar increase is observed in the duration of heat waves. Cold days in winter have dropped at a similar rate. These trends are mainly caused by climate change over the last few decades.
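
A schematic of how a cloud-cover constraint can enter a homogenization model: radiation-driven biases of historical exposures are largest on clear days, so the pre-breakpoint bias against a reference series is modeled as a function of cloud cover. This is a stand-in illustration under that assumption, not the paper's actual statistical model:

```python
import numpy as np

def cloud_aware_correction(t_old, t_ref, cloud, breakpoint):
    """Estimate a cloud-cover-dependent homogenization correction.

    t_old      : daily temperatures from the inhomogeneous record
    t_ref      : a homogeneous reference series on the same days
    cloud      : daily cloud cover in [0, 1] (0 = clear, 1 = overcast)
    breakpoint : index of the documented change (e.g., window to screen)
    The bias before the breakpoint is modeled as linear in cloud cover.
    """
    pre = np.arange(len(t_old)) < breakpoint
    bias = t_old[pre] - t_ref[pre]                   # daily bias vs. reference
    X = np.c_[np.ones(pre.sum()), cloud[pre]]
    (a, b), *_ = np.linalg.lstsq(X, bias, rcond=None)
    corrected = t_old.copy()
    corrected[pre] -= a + b * cloud[pre]             # remove cloud-dependent bias
    return corrected
```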

Relevance:

100.00%

Abstract:

Predicting the federal funds rate and beating the federal funds futures market: mission impossible? Not so. We employ a Markov transition process and show that this model outperforms the federal funds futures market in predicting the target federal funds rate. Thus, using purely historical data, we are able to explain future monetary policy better than a forward-looking measure like the federal funds futures rate. The fact that the federal funds futures market can be beaten by a statistical model suggests that the market lacks efficiency: it allocates too much weight to current Federal Reserve communication and other real-time macro events, and too little weight to past monetary policy behavior.
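
A minimal sketch of the purely historical approach: estimate a first-order Markov transition matrix over discretized policy moves and read off the predictive distribution. The state coding and toy history are illustrative; the paper's actual state space may differ:

```python
import numpy as np

def fit_transition_matrix(states, n_states):
    """Estimate a first-order Markov transition matrix from a state sequence.

    `states` encodes target-rate moves, e.g. 0 = cut, 1 = hold, 2 = hike.
    Rows are the conditional distributions P(next | current). Purely
    historical: no futures prices or Fed communication enter the estimate.
    """
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

moves = [1, 1, 2, 2, 2, 1, 0, 0, 1, 1, 2, 1, 0, 1]   # toy cut/hold/hike history
P = fit_transition_matrix(moves, 3)
print(P[moves[-1]])        # predicted distribution of the next policy move
```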

Relevance:

100.00%

Abstract:

The objective of this dissertation was to design and implement strategies for assessment of exposures to organic chemicals used in the production of a styrene-butadiene polymer at the Texas Plastics Company (TPC). Linear statistical retrospective exposure models, univariate and multivariate, were developed based on the validation of historical industrial hygiene monitoring data collected by industrial hygienists at TPC, together with additional current industrial hygiene monitoring data collected for the purposes of this study. The current monitoring data served several purposes. First, they provided information on current exposures, in the form of unbiased estimates of mean exposure to organic chemicals for each job title included. Second, they provided information on the homogeneity of exposure within each job title, through a carefully designed sampling scheme that addressed variability of exposure both between and within job titles. Third, they permitted an investigation of how well current exposure data can serve as an evaluation tool for retrospective exposure estimation. Finally, this dissertation investigated the simultaneous evaluation of exposure to several chemicals, as well as the use of values below detection limits in a multivariate linear statistical model of exposures.
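
One concrete way to use values below detection limits in such a model is a left-censored lognormal fit by maximum likelihood, sketched here as a standard technique rather than the dissertation's exact formulation:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def censored_lognormal_mle(x, detected):
    """Fit a lognormal exposure distribution with left-censored non-detects.

    x        : measurements; for non-detects, the detection limit itself
    detected : boolean mask, True where the value was quantified
    Non-detects contribute P(X < LOD) to the likelihood instead of a density,
    avoiding ad hoc substitutions such as LOD/2.
    """
    z = np.log(x)

    def nll(params):
        mu, log_sig = params
        sig = np.exp(log_sig)
        ll = norm.logpdf(z[detected], mu, sig).sum()      # quantified values
        ll += norm.logcdf(z[~detected], mu, sig).sum()    # P(X < LOD) terms
        return -ll

    res = minimize(nll, x0=[z.mean(), 0.0])
    return res.x[0], np.exp(res.x[1])        # mu, sigma of log exposure

# Toy usage: lognormal exposures with about a third censored at LOD = 1.5.
rng = np.random.default_rng(2)
true = rng.lognormal(1.0, 0.8, 200)
detected = true >= 1.5
x = np.where(detected, true, 1.5)
print(censored_lognormal_mle(x, detected))
```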

Relevance:

100.00%

Abstract:

The purpose of this study was to analyze the implementation of national family planning policy in the United States, which was embedded in four separate statutes during the period of study, Fiscal Years 1976-81. The design of the study utilized a modification of the Sabatier and Mazmanian framework for policy analysis, which defined implementation as the carrying out of statutory policy. The study was divided into two phases. The first phase compared the implementation of family planning policy under each of the pertinent statutes. The second phase identified factors associated with the implementation of federal family planning policy within the context of block grants.

Implementation was measured here by federal dollars spent for family planning, adjusted for the size of the respective state target populations. Expenditure data were collected from the Alan Guttmacher Institute and from each of the federal agencies having administrative authority for the four pertinent statutes, respectively. Data from the former were used for most of the analysis because they were more complete and more reliable.

The first phase of the study tested the hypothesis that the coherence of a statute is directly related to effective implementation. Equity in the distribution of funds to the states was used to operationalize effective implementation. To a large extent, the results of the analysis supported the hypothesis. In addition to their theoretical significance, these findings were also significant for policymakers insofar as they demonstrated the effectiveness of categorical legislation in implementing desired health policy.

Given the current and historically intermittent emphasis on more state and less federal decision-making in health and human services, the second phase of the study focused on state-level factors associated with expenditures of social service block grant funds for family planning. Using the Sabatier-Mazmanian implementation model as a framework, many factors were tested. Those factors showing the strongest conceptual and statistical relationship to the dependent variable were used to construct a statistical model. Using multivariable regression analysis, this model was applied cross-sectionally to each of the years of the study. The most striking finding was that the dominant determinants of state spending varied for each year of the study (Fiscal Years 1976-1981). The significance of these results is that they provide empirical support for current implementation theory, showing that the dominant determinants of implementation vary greatly over time.

Relevance:

100.00%

Abstract:

The importance of race as a factor in mental health status has been a topic of controversy. This study reviews the history of research in this area and examines racial variances in the relationship between selected socio-demographic variables and general well-being. The study also examines the appropriateness of an additive versus an interactive statistical model for this investigation.

The sample consists of 6,913 persons who completed the General Well-Being Schedule as administered in the detailed component of the first National Health and Nutrition Examination Survey (NHANES I), conducted by the National Center for Health Statistics between April 1971 and October 1975. The sampling design is a multistage probability sample of clusters of persons in area-based segments. Of the 6,913 persons, 873 are Black.

Unlike other recent community-based mental health studies, this study revealed significant differences between the general well-being of Blacks and Whites. Blacks continued to exhibit significantly lower levels of well-being even after adjustments were made for income, education, marital status, sex, age, and place of residence. Statistical interaction was found between race and sex, with Black females reporting lower levels of well-being than Black or White males or their White female counterparts.

The study includes a detailed review of the NHANES I sample design. It is shown that selected aspects of the design make it difficult to render appropriate national comparisons of Black-White differences. As a result, conclusions pertaining to these differences based on NHANES I may be of questionable validity.
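
The additive-versus-interactive distinction above is simply whether a race by sex product term enters the regression; a small synthetic illustration using statsmodels formulas (variable names are placeholders, not NHANES I fields):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data with a planted race-by-sex interaction in well-being scores.
rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "black": rng.integers(0, 2, n),
    "female": rng.integers(0, 2, n),
})
df["wellbeing"] = (80 - 3 * df.black - 1 * df.female
                   - 4 * df.black * df.female + rng.normal(0, 5, n))

additive = smf.ols("wellbeing ~ black + female", df).fit()
interactive = smf.ols("wellbeing ~ black * female", df).fit()  # adds black:female
print(interactive.params["black:female"])   # recovers the planted interaction
```

An additive model forces the race gap to be the same for men and women; the interaction term lets Black women deviate from that additive prediction, which is the pattern the study reports.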

Relevance:

100.00%

Abstract:

The efficacy of waste stabilization lagoons for the treatment of five priority pollutants and two widely used commercial compounds was evaluated in laboratory model ponds. Three ponds were designed to simulate a primary anaerobic lagoon, a secondary facultative lagoon, and a tertiary aerobic lagoon. Biodegradation, volatilization, and sorption losses were quantified for bis(2-chloroethyl) ether, benzene, toluene, naphthalene, phenanthrene, ethylene glycol, and ethylene glycol monoethyl ether. A statistical model using a log-normal transformation indicated that biodegradation of bis(2-chloroethyl) ether followed first-order kinetics. Additionally, multiple regression analysis indicated that biochemical oxygen demand was the water quality variable most highly correlated with the bis(2-chloroethyl) ether effluent concentration.
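
The first-order kinetics finding corresponds to exponential decay, C(t) = C0 * exp(-k t), which is typically fitted by regressing ln C on t; a minimal sketch with made-up data:

```python
import numpy as np

def first_order_fit(t, conc):
    """Estimate a first-order decay rate k from concentration-time data.

    Regress ln(C) on t: C(t) = C0 * exp(-k t) implies ln C = ln C0 - k t,
    so the slope of the log-linear fit gives -k.
    """
    slope, intercept = np.polyfit(t, np.log(conc), 1)
    return -slope, np.exp(intercept)          # k (1/day), fitted C0

t = np.array([0, 1, 2, 4, 7, 10.0])          # sampling times (days)
noise = np.exp(np.random.default_rng(4).normal(0, 0.05, t.size))
conc = 12.0 * np.exp(-0.35 * t) * noise      # synthetic concentrations
k, c0 = first_order_fit(t, conc)
print(k, c0)                                  # roughly 0.35 per day and 12
```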