982 results for parametric model


Relevance: 100.00%

Abstract:

The alkali-aggregate reaction (AAR) is a chemical reaction that causes heterogeneous expansion of concrete and degrades important properties such as Young's modulus, reducing the structure's useful life. In this study, a parametric model is employed to determine the spatial distribution of the concrete expansion, combining normalized factors that influence the reaction through an AAR expansion law. Optimization techniques were employed to fit the numerical results to observations from a real structure. A three-dimensional version of the model has been implemented in a commercial finite element package (ANSYS©) and verified in the analysis of an accelerated mortar test. Comparisons were made between two AAR mathematical descriptions of the mechanical phenomenon, using the same methodology, and an expansion curve obtained from experiment. Some parametric studies are also presented. The numerical results compared very well with the experimental data, validating the proposed method.
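
As an illustration of the fitting step described above, the sketch below adjusts an assumed Larive-type AAR expansion law to hypothetical accelerated-test measurements by least squares; the paper's actual expansion law, influence factors, and data are not reproduced here.

```python
# Minimal sketch: fitting an assumed AAR expansion law to measured data.
# The Larive-type curve below is a common choice in the AAR literature;
# it is an assumption here, not necessarily the paper's expansion law.
import numpy as np
from scipy.optimize import curve_fit

def aar_expansion(t, eps_inf, tau_c, tau_l):
    """Normalized AAR expansion vs. time (Larive-type law)."""
    return eps_inf * (1.0 - np.exp(-t / tau_c)) / (1.0 + np.exp(-(t - tau_l) / tau_c))

# Hypothetical accelerated-test measurements: time (days) vs. expansion (%)
t_obs = np.array([0, 30, 60, 90, 120, 180, 240, 360], dtype=float)
e_obs = np.array([0.00, 0.01, 0.04, 0.09, 0.13, 0.17, 0.19, 0.20])

popt, _ = curve_fit(aar_expansion, t_obs, e_obs, p0=[0.2, 60.0, 80.0])
print("eps_inf=%.3f  tau_c=%.1f  tau_l=%.1f" % tuple(popt))
```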

Relevance: 100.00%

Abstract:

Survival times for the Acacia mangium plantation in the Segaliud Lokan Project, Sabah, East Malaysia were analysed based on 20 permanent sample plots (PSPs) established in 1988 as a spacing experiment. The PSPs were established following a complete randomized block design, with five levels of spacing randomly assigned to units within four blocks at different sites. The survival times of trees, in years, are of interest. Since the inventories were conducted only annually, the actual survival time for each tree was not observed; hence, the data set comprises censored survival times. Initial analysis of the survival of the Acacia mangium plantation suggested a block-by-spacing interaction; a Weibull model gives a reasonable fit to the replicate survival times within each PSP, but a standard Weibull regression model is inappropriate because the shape parameter differs between PSPs. In this paper we investigate the form of the non-constant Weibull shape parameter. Parsimonious models for the Weibull survival times have been derived using maximum likelihood methods. The factor selection for the parameters is based on a backward elimination procedure, and the models are compared using likelihood ratio statistics. The results suggest that both Weibull parameters depend on spacing and block.
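
A minimal sketch of the core likelihood machinery: a constant-shape Weibull fitted to right-censored survival times by maximum likelihood. The data are illustrative, and the paper's key feature, a shape parameter that varies with spacing and block, is omitted for brevity.

```python
# Minimal sketch: ML fit of a Weibull model to right-censored survival
# times. Log-likelihood: sum(d * log h(t)) + sum(log S(t)), with
# h(t) = (k/lam)*(t/lam)**(k-1) and S(t) = exp(-(t/lam)**k).
import numpy as np
from scipy.optimize import minimize

t = np.array([2.0, 3.0, 5.0, 5.0, 6.0, 8.0, 8.0, 9.0])   # years (illustrative)
died = np.array([1, 1, 0, 1, 1, 0, 1, 0])                 # 0 = right-censored

def neg_loglik(params):
    k, lam = np.exp(params)                       # keep parameters positive
    z = t / lam
    log_h = np.log(k / lam) + (k - 1) * np.log(z) # log hazard
    log_s = -z**k                                 # log survival
    return -(np.sum(died * log_h) + np.sum(log_s))

res = minimize(neg_loglik, x0=[0.0, np.log(t.mean())], method="Nelder-Mead")
k_hat, lam_hat = np.exp(res.x)
print("shape=%.2f  scale=%.2f" % (k_hat, lam_hat))
```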

Relevance: 100.00%

Abstract:

We present a refined parametric model for forecasting electricity demand which performed particularly well in the recent Global Energy Forecasting Competition (GEFCom 2012). We begin by motivating and presenting a simple parametric model, treating the electricity demand as a function of the temperature and the day of the week. We then set out a series of refinements of the model, explaining the rationale for each and using the competition scores to demonstrate that each successive refinement increases the accuracy of the model's predictions. These refinements include combining models from multiple weather stations, removing outliers from the historical data, and special treatment of public holidays.
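
A minimal sketch of the kind of simple parametric model the abstract starts from: demand regressed on a polynomial in temperature plus day-of-week dummies, fitted by least squares on synthetic data. The competition model's exact terms and refinements are not reproduced here.

```python
# Minimal sketch: demand as a smooth function of temperature plus
# day-of-week effects, fit by ordinary least squares. Synthetic data;
# the functional form is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)
n = 500
temp = rng.uniform(-5, 35, n)                  # temperature, degrees C
dow = rng.integers(0, 7, n)                    # day of week, 0 = Monday
demand = 60 + 0.05 * (temp - 18)**2 - 3 * (dow >= 5) + rng.normal(0, 2, n)

# Design matrix: cubic in temperature + one dummy per non-baseline weekday
X = np.column_stack([np.ones(n), temp, temp**2, temp**3] +
                    [(dow == d).astype(float) for d in range(1, 7)])
beta, *_ = np.linalg.lstsq(X, demand, rcond=None)
pred = X @ beta
print("in-sample RMSE: %.2f" % np.sqrt(np.mean((demand - pred) ** 2)))
```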

Relevance: 100.00%

Abstract:

Electrospinning (ES) can readily produce polymer fibers with cross-sectional dimensions ranging from tens of nanometers to tens of microns. Qualitative estimates of surface area coverage are rather intuitive. However, quantitative analytical and numerical methods for predicting surface coverage during ES have not been covered in sufficient depth to be applied in the design of novel materials, surfaces, and devices from ES fibers. This article presents a modeling approach to ES surface coverage in which an analytical model is derived for the quantitative prediction of the surface coverage of ES fibers. The analytical model is used to predict the diameter of circular deposition areas of constant field strength and constant electrostatic force. Experimental results for polyvinyl alcohol fibers are reported and compared with numerical models to supplement the derived analytical model. The analytical model provides scientists and engineers with a method for estimating surface area coverage. Both the applied voltage and the capillary-to-collection-plate separation are treated as independent variables in the analysis. The electric field produced by the ES process was modeled using COMSOL Multiphysics software to determine a correlation between the applied field strength and the size of the deposition area of the ES fibers. MATLAB scripts were used to combine the numerical COMSOL results with the derived analytical equations. Experimental results reinforce the parametric trends produced via modeling and lend credibility to the use of modeling techniques for the qualitative prediction of surface area coverage from ES. (Copyright 2014 American Vacuum Society.)
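
A minimal sketch of how voltage and separation enter such an analysis, using the common needle-to-plate approximation E ≈ 2V / (r ln(4L/r)) for the field at the capillary tip; this approximation and the tip radius are assumptions, not the article's COMSOL-based model.

```python
# Minimal sketch, not the article's COMSOL/MATLAB workflow: a standard
# needle-to-plate approximation for the tip field, showing how applied
# voltage V and tip-to-collector separation L trade off. Tip radius r
# is an illustrative assumption.
import numpy as np

def tip_field(V, L, r=0.0005):
    """Approximate capillary-tip field strength (V/m), tip radius r (m)."""
    return 2.0 * V / (r * np.log(4.0 * L / r))

for V, L in [(10e3, 0.10), (15e3, 0.15), (20e3, 0.20)]:
    print("V=%4.0f kV  L=%4.0f mm  E=%.2e V/m" % (V / 1e3, L * 1e3, tip_field(V, L)))
```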

Relevance: 100.00%

Abstract:

The purpose of this work is twofold: first, to develop a process to automatically create parametric models of the aorta that can adapt to any possible intraoperative deformation of the vessel; and second, to provide the tools needed to perform this deformation in real time by means of a non-rigid registration method. This dynamically deformable model will later be used in a VR-based surgery guidance system for aortic catheterization procedures, showing the vessel changes in real time.

Relevance: 70.00%

Abstract:

In this paper, the theory of hidden Markov models (HMMs) is applied to the problem of blind (without training sequences) channel estimation and data detection. Within an HMM framework, the Baum-Welch (BW) identification algorithm is frequently used to find maximum-likelihood (ML) estimates of the corresponding model. However, such a procedure assumes the model (i.e., the channel response) to be static throughout the observation sequence. By introducing a parametric model for time-varying channel responses, a version of the algorithm that is more appropriate for mobile channels, the time-dependent Baum-Welch (TDBW) algorithm, is derived. To compare algorithm behavior, a set of computer simulations for a GSM scenario is provided. The results indicate that, in comparison to other BW versions of the algorithm, the TDBW approach attains a remarkable enhancement in performance, at the cost of only a moderate increase in computational complexity.
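
For context, a minimal sketch of the scaled forward-backward pass at the heart of BW identification, computing per-time state posteriors for a small discrete HMM; the TDBW extension, in which the emission (channel) parameters vary with time, is not reproduced.

```python
# Minimal sketch of the Baum-Welch E-step (forward-backward with scaling)
# for a discrete HMM. Transition matrix A, emission matrix B, and initial
# distribution pi are illustrative assumptions.
import numpy as np

def forward_backward(obs, A, B, pi):
    """Return per-time state posteriors gamma[t, i], using scaled recursions."""
    T, N = len(obs), len(pi)
    alpha, beta = np.zeros((T, N)), np.zeros((T, N))
    c = np.zeros(T)                               # scaling factors
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum(); alpha[0] /= c[0]
    for t in range(1, T):                         # forward pass
        alpha[t] = (alpha[t-1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum(); alpha[t] /= c[t]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):                # backward pass
        beta[t] = (A @ (B[:, obs[t+1]] * beta[t+1])) / c[t+1]
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

A = np.array([[0.9, 0.1], [0.2, 0.8]])    # state transitions
B = np.array([[0.8, 0.2], [0.3, 0.7]])    # emission probabilities
pi = np.array([0.5, 0.5])
print(forward_backward([0, 1, 1, 0], A, B, pi))
```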

Relevance: 70.00%

Abstract:

The problem of using information available from one variable X to make inference about another, Y, is classical in many physical and social sciences. In statistics this is often done via regression analysis, where the mean response is used to model the data. One stipulates the model Y = µ(X) + ɛ, where µ(x) is the mean response at the predictor value X = x and ɛ = Y - µ(X) is the error. In classical regression analysis both X and Y are observable, and one proceeds to make inference about the mean response function µ. In practice there are numerous examples where X is not available, but a variable Z is observed which provides an estimate of X. As an example, consider the herbicide study of Rudemo et al. [3], in which a nominal measured amount Z of herbicide was applied to a plant, but the actual amount absorbed by the plant, X, is unobservable. As another example, from Wang [5], an epidemiologist studies the severity of a lung disease, Y, among the residents of a city in relation to the amount of certain air pollutants. The amount of the air pollutants, Z, can be measured at certain observation stations in the city, but the actual exposure of the residents to the pollutants, X, is unobservable and may vary randomly from the Z-values. In both cases X = Z + error; this is the so-called Berkson measurement error model. In the more classical measurement error model one observes an unbiased estimator W of X and stipulates the relation W = X + error. An example of this model occurs when assessing the effect of nutrition, X, on a disease: measuring nutrition intake precisely within 24 hours is almost impossible. There are many similar examples in agricultural and medical studies; see, e.g., Carroll, Ruppert and Stefanski [1] and Fuller [2], among others. In this talk we address the question of fitting a parametric model to the regression function µ in the Berkson measurement error model: Y = µ(X) + ɛ, X = Z + η, where ɛ and η are random errors with E(ɛ) = 0, X and η are d-dimensional, and Z is the observable d-dimensional random vector.
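
A minimal sketch of the Berkson setup: simulate Y = µ(X) + ɛ with X = Z + η, then regress Y on the observed Z. In the linear case the naive slope remains consistent (since E[X|Z] = Z), but the residual variance is inflated by the slope squared times var(η); µ and the error scales below are illustrative choices.

```python
# Minimal sketch: simulating the Berkson model Y = mu(X) + eps, X = Z + eta,
# with a linear mu, and fitting Y on the observed Z. All scales illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 2000
Z = rng.uniform(0, 10, n)              # nominal (observed) dose
X = Z + rng.normal(0, 1.0, n)          # actual (unobserved) dose
Y = 2.0 + 0.5 * X + rng.normal(0, 0.5, n)

# Naive least-squares fit of Y on Z: slope stays near 0.5, but the
# residual variance grows to sigma_eps^2 + beta^2 * var(eta) = 0.5.
A = np.column_stack([np.ones(n), Z])
coef, *_ = np.linalg.lstsq(A, Y, rcond=None)
resid_var = np.mean((Y - A @ coef) ** 2)
print("slope=%.3f  residual variance=%.3f" % (coef[1], resid_var))
```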

Relevance: 70.00%

Abstract:

In this paper we introduce a parametric model for handling lifetime data in which an early failure can be attributed either to infant mortality or to wear processes, but we do not know which risk is responsible for the failure. The maximum likelihood approach and the sampling-based approach are used to obtain the inferences of interest. Some special cases of the proposed model are studied via Monte Carlo methods to assess the size and power of hypothesis tests. To illustrate the proposed methodology, we present an example based on a real data set.
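
One plausible reading of such a model, sketched under stated assumptions: latent competing risks T = min(T1, T2), with a decreasing-hazard Weibull for infant mortality and an increasing-hazard Weibull for wear, so the cause of failure is unobserved and the overall hazard h(t) = h1(t) + h2(t) is bathtub shaped. The paper's actual parametrization may differ.

```python
# Minimal sketch: bathtub-shaped overall hazard from two latent Weibull
# risks, h(t) = h1(t) + h2(t). Shape/scale values are illustrative.
import numpy as np

def weibull_hazard(t, k, lam):
    return (k / lam) * (t / lam) ** (k - 1)

t = np.linspace(0.05, 10, 5)
h = weibull_hazard(t, 0.5, 2.0) + weibull_hazard(t, 3.0, 8.0)  # infant + wear
for ti, hi in zip(t, h):
    print("t=%5.2f  h(t)=%.3f" % (ti, hi))
```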

Relevance: 70.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 70.00%

Abstract:

In this article, for the first time, we propose the negative binomial-beta Weibull (BW) regression model for studying the recurrence of prostate cancer and predicting the cure fraction for patients with clinically localized prostate cancer treated by open radical prostatectomy. The cure model assumes that a fraction of the survivors are cured of the disease. The survival function for the population of patients can be modeled by a parametric cure model using the BW distribution. We derive an explicit expansion for the moments of the recurrence time distribution for the uncured individuals. The proposed distribution can be used to model survival data when the hazard rate function is increasing, decreasing, unimodal, or bathtub shaped. Another advantage is that the proposed model includes some of the well-known cure rate models discussed in the literature as special sub-models. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes, and we analyze a real data set on localized prostate cancer patients after open radical prostatectomy.
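
For intuition, a minimal sketch of the standard mixture cure model S_pop(t) = p + (1 - p) S(t), using a Weibull survival function for the uncured; the paper's negative binomial-beta Weibull model is a richer construction than this.

```python
# Minimal sketch: mixture cure model. A cured fraction p never relapses,
# so the population survival plateaus at p. Parameters are illustrative.
import numpy as np

def pop_survival(t, p, k, lam):
    """S_pop(t) = p + (1 - p) * exp(-(t/lam)**k)."""
    return p + (1.0 - p) * np.exp(-(t / lam) ** k)

for t in (1.0, 5.0, 10.0, 20.0):
    print("t=%4.1f  S_pop=%.3f" % (t, pop_survival(t, p=0.4, k=1.5, lam=6.0)))
```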

Relevance: 70.00%

Abstract:

BACKGROUND: Many HIV-infected patients on highly active antiretroviral therapy (HAART) experience metabolic complications, including dyslipidaemia and insulin resistance, which may increase their coronary heart disease (CHD) risk. We developed a prognostic model for CHD tailored to the changes in risk factors observed in patients starting HAART. METHODS: Data from five cohort studies (British Regional Heart Study, Caerphilly and Speedwell Studies, Framingham Offspring Study, Whitehall II), covering 13,100 men aged 40-70 and 114,443 years of follow-up, were used. CHD was defined as myocardial infarction or death from CHD. Model fit was assessed using the Akaike Information Criterion; generalizability across cohorts was examined using internal-external cross-validation. RESULTS: A parametric model based on the Gompertz distribution generalized best. The variables included in the model were systolic blood pressure, total cholesterol, high-density lipoprotein cholesterol, triglyceride, glucose, diabetes mellitus, body mass index and smoking status. Compared with patients not on HAART, the estimated CHD hazard ratio (HR) for patients on HAART was 1.46 (95% CI 1.15-1.86) for moderate and 2.48 (95% CI 1.76-3.51) for severe metabolic complications. CONCLUSIONS: The change in the risk of CHD in HIV-infected men starting HAART can be estimated from typical changes in risk factors, assuming that HRs estimated from data on non-infected men are applicable to HIV-infected men. Based on this model, the risk of CHD is likely to increase, but the increases may often be modest and could be offset by lifestyle changes.
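
A minimal sketch of a Gompertz proportional-hazards form of the kind described, with baseline hazard a*exp(b*t) scaled by a hazard ratio; the baseline parameters below are illustrative assumptions, not the fitted model.

```python
# Minimal sketch: Gompertz survival under proportional hazards,
# S(t|x) = exp(-(a/b) * (exp(b*t) - 1) * exp(lin_pred)).
# Baseline a, b and the 10-year horizon are illustrative choices.
import numpy as np

def gompertz_survival(t, a, b, lin_pred):
    """Survival at t for hazard h(t|x) = a * exp(b*t) * exp(lin_pred)."""
    return np.exp(-(a / b) * (np.exp(b * t) - 1.0) * np.exp(lin_pred))

# Hypothetical 10-year CHD risk at the abstract's hazard ratios
for hr in (1.0, 1.46, 2.48):
    s10 = gompertz_survival(10.0, a=0.002, b=0.08, lin_pred=np.log(hr))
    print("HR=%.2f  10-year risk=%.1f%%" % (hr, 100 * (1 - s10)))
```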

Relevance: 70.00%

Abstract:

Prevalent sampling is an efficient and focused approach to the study of the natural history of disease. Right-censored time-to-event data observed in prospective prevalent cohort studies are often subject to left-truncated sampling. Left-truncated samples are not randomly selected from the population of interest and are therefore subject to selection bias. Extensive studies have focused on estimating the unbiased distribution from left-truncated samples. In many applications, however, the exact date of disease onset is not observed. For example, in an HIV infection study the exact HIV infection time is not observable, but is known to lie between two observable dates. These challenges motivated our study. We propose parametric models to estimate the unbiased distribution of left-truncated, right-censored time-to-event data with uncertain onset times. We first consider data from length-biased sampling, a special case of left-truncated sampling, and then extend the proposed method to general left-truncated sampling. With a parametric model, we construct the full likelihood given a biased sample with unobservable onset of disease. The parameters are estimated by maximizing the constructed likelihood, adjusting for the selection bias and the unobserved exact onset. Simulations are conducted to evaluate the finite-sample performance of the proposed methods. We apply the proposed method to an HIV infection study, estimating the unbiased survival function and covariate coefficients.
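
A minimal sketch of the full likelihood for left-truncated, right-censored data under a Weibull model, where each subject's density or survival contribution is divided by S(a) at the truncation (study-entry) time a; the paper's extension to uncertain onset times is not handled here, and the data are illustrative.

```python
# Minimal sketch: ML for left-truncated, right-censored Weibull data.
# Likelihood contribution: [f(t)^d * S(t)^(1-d)] / S(a), with a < t.
import numpy as np
from scipy.optimize import minimize

t = np.array([4.0, 6.0, 7.0, 9.0, 12.0])   # event/censoring times (illustrative)
a = np.array([1.0, 2.0, 2.0, 3.0, 5.0])    # left-truncation (entry) times
d = np.array([1, 1, 0, 1, 0])              # 1 = event, 0 = right-censored

def neg_loglik(params):
    k, lam = np.exp(params)                 # keep parameters positive
    log_f = np.log(k / lam) + (k - 1) * np.log(t / lam) - (t / lam) ** k
    log_s = -(t / lam) ** k
    log_s_trunc = -(a / lam) ** k            # divide out S(a)
    return -np.sum(d * log_f + (1 - d) * log_s - log_s_trunc)

res = minimize(neg_loglik, x0=[0.0, np.log(t.mean())], method="Nelder-Mead")
print("shape=%.2f  scale=%.2f" % tuple(np.exp(res.x)))
```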

Relevance: 70.00%

Abstract:

A sample of 95 sib pairs affected with insulin-dependent diabetes, typed with their normal parents for 28 markers on chromosome 6, has been analyzed by several methods. When appropriate parameters are efficiently estimated, a parametric model is equivalent to the β model, which is superior to nonparametric alternatives in both single-point tests (as found previously) and multipoint tests. Theory is given for meta-analysis combined with allelic association, and problems that may be associated with errors of map location and/or marker typing are identified. Reducing the number of association tests in a dense map by multipoint analysis can give a 3-fold reduction in the critical lod, and therefore in the cost of positional cloning.