879 results for Predictive regression


Relevance: 20.00%

Abstract:

A total of 152,145 weekly test-day milk yield records from 7317 first lactations of Holstein cows distributed in 93 herds in southeastern Brazil were analyzed. Test-day milk yields were classified into 44 weekly classes of days in milk (DIM). The contemporary groups were defined as herd-year-week of test day. The model included direct additive genetic, permanent environmental and residual effects as random effects, the fixed effect of contemporary group, and age of cow at calving as a covariate with linear and quadratic effects. Mean trends were modeled by a cubic regression on orthogonal polynomials of DIM. Additive genetic and permanent environmental random effects were estimated by random regression on orthogonal Legendre polynomials. Residual variances were modeled using third- to seventh-order variance functions or a step function with 1, 6, 13, 17 and 44 variance classes. Results from Akaike's and Schwarz's Bayesian information criteria suggested that a model considering a 7th-order Legendre polynomial for the additive effect, a 12th-order polynomial for the permanent environment effect and a step function with 6 classes for residual variances fitted best. However, a parsimonious model, with a 6th-order Legendre polynomial for additive effects and a 7th-order polynomial for permanent environmental effects, yielded very similar genetic parameter estimates. (C) 2008 Elsevier B.V. All rights reserved.
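The random-regression machinery described above rests on evaluating Legendre polynomials at standardized DIM values. A minimal sketch, assuming weekly DIM classes 1-44 are mapped onto [-1, 1] (the scaling and orders here are illustrative, not taken verbatim from the study):

```python
# Sketch: Legendre polynomial basis as used in random regression test-day
# models. Assumes weekly DIM classes 1..44 standardized to [-1, 1].

def legendre_basis(x, order):
    """Evaluate Legendre polynomials P_0..P_order at x via the Bonnet
    recurrence (n+1) P_{n+1} = (2n+1) x P_n - n P_{n-1}."""
    p = [1.0, x]
    for n in range(1, order):
        p.append(((2 * n + 1) * x * p[n] - n * p[n - 1]) / (n + 1))
    return p[: order + 1]

def standardize_dim(week, first=1, last=44):
    """Map a weekly DIM class onto [-1, 1]."""
    return -1.0 + 2.0 * (week - first) / (last - first)

# Basis covariables for a 6th-order additive genetic effect at lactation week 22:
x = standardize_dim(22)
basis = legendre_basis(x, 6)
```

Each animal's additive genetic curve is then a linear combination of these basis values with animal-specific random regression coefficients.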

Relevance: 20.00%

Abstract:

We introduce the log-beta Weibull regression model based on the beta Weibull distribution (Famoye et al., 2005; Lee et al., 2007). We derive expansions for the moment generating function which do not depend on complicated functions. The new regression model represents a parametric family of models that includes as sub-models several widely known regression models that can be applied to censored survival data. We employ a frequentist analysis, a jackknife estimator, and a parametric bootstrap for the parameters of the proposed model. We derive the appropriate matrices for assessing local influences on the parameter estimates under different perturbation schemes and present some ways to assess global influences. Further, for different parameter settings, sample sizes, and censoring percentages, several simulations are performed. In addition, the empirical distribution of some modified residuals is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be extended to a modified deviance residual in the proposed regression model applied to censored data. We define martingale and deviance residuals to evaluate the model assumptions. The extended regression model is very useful for the analysis of real data and could give more realistic fits than other special regression models.
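The jackknife estimator mentioned above works by recomputing the statistic on leave-one-out subsamples. A generic sketch with the sample mean as a stand-in statistic (the paper applies the idea to regression parameters):

```python
# Sketch of the leave-one-out jackknife: bias-corrected point estimate and
# jackknife variance of a statistic. The statistic here (the mean) is a
# stand-in for the regression parameter estimators discussed in the abstract.

def jackknife(data, stat):
    n = len(data)
    full = stat(data)
    # Leave-one-out replicates theta_(i)
    loo = [stat(data[:i] + data[i + 1:]) for i in range(n)]
    mean_loo = sum(loo) / n
    bias = (n - 1) * (mean_loo - full)
    corrected = full - bias
    # Jackknife variance estimate of the statistic
    var = (n - 1) / n * sum((t - mean_loo) ** 2 for t in loo)
    return corrected, var

mean = lambda xs: sum(xs) / len(xs)
est, var = jackknife([1.0, 2.0, 3.0, 4.0], mean)
```

For the sample mean the jackknife bias is zero and the variance estimate reproduces s²/n, which makes it a convenient sanity check before applying the recipe to less tractable estimators.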

Relevance: 20.00%

Abstract:

A cholesterol-rich nanoemulsion (LDE) that resembles LDL binds to the LDL receptors and, after injection into the blood stream, may concentrate in cells with LDL receptor overexpression, as occurs in neoplasias and other proliferative processes. Thus, LDE can be used as a vehicle to target drugs against those cells. The current study was designed to verify in rabbits whether LDE concentrates in the lesioned artery and whether a paclitaxel derivative, paclitaxel oleate, associated with LDE could reduce the atherosclerotic lesions. Sixteen male New Zealand rabbits were fed a 1% cholesterol diet for 60 days. Starting from day 30 under cholesterol feeding, eight animals were treated with four weekly intravenous injections of LDE-paclitaxel (4 mg/kg) and eight with four weekly intravenous saline solution injections for an additional 30 days. On day 60, the animals were sacrificed for analysis. The uptake of LDE labeled with [C-14]-cholesteryl oleate by the aortic arch of cholesterol-fed rabbits was twice that observed in animals fed only regular chow. LDE-paclitaxel reduced the lesion areas of cholesterol-fed animals by 60% and the intima/media ratio fourfold, and inhibited the macrophage migration and the smooth muscle cell proliferation and invasion of the intima. LDE-paclitaxel treatment had no toxicity. In conclusion, LDE-paclitaxel produced pronounced atherosclerosis regression without toxicity and has shown remarkable potential in cardiovascular therapeutics. (c) 2008 Published by Elsevier Ireland Ltd.

Relevance: 20.00%

Abstract:

Data mining is the process of identifying valid, implicit, previously unknown, potentially useful and understandable information from large databases. It is an important step in the process of knowledge discovery in databases (Olaru & Wehenkel, 1999). In a data mining process, input data can be structured, semi-structured, or unstructured. Data can be in text, categorical or numerical values. One of the important characteristics of data mining is its ability to deal with data that are large in volume, distributed, time-variant, noisy, and high-dimensional. A large number of data mining algorithms have been developed for different applications. For example, association rules mining can be useful for market basket problems, clustering algorithms can be used to discover trends in unsupervised learning problems, classification algorithms can be applied in decision-making problems, and sequential and time series mining algorithms can be used in predicting events, fault detection, and other supervised learning problems (Vapnik, 1999). Classification is among the most important tasks in data mining, particularly for data mining applications in engineering fields. Together with regression, classification is mainly used for predictive modelling. A number of classification algorithms are in practical use. According to Sebastiani (2002), the main classification algorithms can be categorized as: decision tree and rule-based approaches such as C4.5 (Quinlan, 1996); probability methods such as the Bayesian classifier (Lewis, 1998); on-line methods such as Winnow (Littlestone, 1988) and CVFDT (Hulten, 2001); neural network methods (Rumelhart, Hinton & Williams, 1986); example-based methods such as k-nearest neighbors (Duda & Hart, 1973); and SVM (Cortes & Vapnik, 1995). Other important techniques for classification tasks include Associative Classification (Liu et al., 1998) and Ensemble Classification (Tumer, 1996).
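One of the example-based methods listed above, k-nearest neighbors, is simple enough to sketch in full: classify a query point by a majority vote among the k closest training points. The data and distance choice (Euclidean) are illustrative:

```python
# Minimal sketch of the k-nearest neighbours classifier (Duda & Hart, 1973)
# named in the text: Euclidean distance, majority vote. Data are made up.
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs; returns the majority
    label among the k training points nearest to `query`."""
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
         ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
label = knn_predict(train, (0.05, 0.1))
```

In practice k is chosen by cross-validation and features are scaled first, since Euclidean distance is sensitive to units.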

Relevance: 20.00%

Abstract:

This study examined the relationship between isokinetic hip extensor/hip flexor strength, 1-RM squat strength, and sprint running performance for both a sprint-trained and a non-sprint-trained group. Eleven male sprinters and 8 male controls volunteered for the study. On the same day, subjects ran 20-m sprints from both a stationary start and with a 50-m acceleration distance, completed isokinetic hip extension/flexion exercises at 1.05, 4.74, and 8.42 rad·s⁻¹, and had their squat strength estimated. Stepwise multiple regression analysis showed that equations for predicting both 20-m maximum velocity run time and 20-m acceleration time may be calculated with an error of less than 0.05 sec using only isokinetic and squat strength data. However, a single regression equation for predicting both 20-m acceleration and maximum velocity run times from isokinetic or squat tests was not found. The regression analysis indicated that hip flexor strength at all test velocities was a better predictor of sprint running performance than hip extensor strength.
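A prediction equation of the kind described above is, at its core, a least-squares fit. A minimal one-predictor sketch with invented numbers (the study itself used stepwise multiple regression over several strength measures):

```python
# Sketch: ordinary least-squares fit of sprint time on a single strength
# measure. Torque and time values are hypothetical, for illustration only.

def ols_line(xs, ys):
    """Return (intercept, slope) of the least-squares line y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical hip-flexor torque (N*m) vs 20-m sprint time (s):
torque = [120.0, 140.0, 160.0, 180.0]
times = [3.30, 3.20, 3.10, 3.00]
a, b = ols_line(torque, times)
predicted = a + b * 150.0  # predicted 20-m time for a 150 N*m athlete
```

Stepwise regression repeats this idea with several candidate predictors, adding or dropping them according to how much each improves the fit.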

Relevance: 20.00%

Abstract:

A significant problem in the collection of responses to potentially sensitive questions, such as those relating to illegal, immoral or embarrassing activities, is non-sampling error due to refusal to respond or false responses. Eichhorn & Hayre (1983) suggested the use of scrambled responses to reduce this form of bias. This paper considers a linear regression model in which the dependent variable is unobserved but for which the sum or product with a scrambling random variable of known distribution is known. The performance of two likelihood-based estimators is investigated, namely a Bayesian estimator achieved through a Markov chain Monte Carlo (MCMC) sampling scheme and a classical maximum-likelihood estimator. These two estimators and an estimator suggested by Singh, Joarder & King (1996) are compared. Monte Carlo results show that the Bayesian estimator outperforms the classical estimators in almost all cases, and the relative performance of the Bayesian estimator improves as the responses become more scrambled.
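The multiplicative scrambling idea can be illustrated with a simple simulation: each respondent reports y = x · s, where s comes from a scrambling distribution with known mean, so E[Y] = E[X] · E[S] and the sensitive mean is recoverable. A moment-estimator sketch (the paper itself studies likelihood-based estimators, which this does not reproduce):

```python
# Sketch of multiplicative scrambled responses (Eichhorn & Hayre, 1983).
# The respondent reports x * s; only the distribution of s is known to the
# analyst. All distributions and parameters below are invented for the demo.
import random

random.seed(1)
true_values = [random.gauss(50.0, 5.0) for _ in range(5000)]  # unobserved X
scramblers = [random.uniform(0.5, 1.5) for _ in range(5000)]  # known E[S] = 1.0
reported = [x * s for x, s in zip(true_values, scramblers)]   # observed Y

e_s = 1.0  # mean of the scrambling distribution, known by design
estimated_mean = sum(reported) / len(reported) / e_s
```

The respondent's privacy is protected because any individual report y is consistent with a range of true values x, yet the population mean remains estimable.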

Relevance: 20.00%

Abstract:

Several long-term studies of breast cancer survival have shown continued excess mortality from breast cancer up to 20-40 years following treatment. The purpose of this report was to investigate temporal trends in long-term survival from breast cancer in all New South Wales (NSW) women. Breast cancer cases incident in 1972-1996 (54,228) were derived from the NSW Central Cancer Registry, a population-based registry which began in 1972. All cases of breast cancer not known to be dead were matched against death records. The expected survival for NSW women was derived from published annual life tables. Relative survival analysis compared the survival of cancer cases with the age-, sex- and period-matched mortality of the total population. Cases were considered alive at the end of 1996, except when known to be dead. Proportional hazards regression was employed to model survival on age, period and degree of spread at diagnosis. Survival at 5, 10, 15, 20 and 25 years of follow-up was 76 per cent, 65 per cent, 60 per cent, 57 per cent and 56 per cent. The annual hazard rate for excess mortality was 4.3 per cent in year 1, maximal at 6.5 per cent in year 3, declining to 4.7 per cent in year 5, 2.7 per cent in year 10, 1.4 per cent in year 15, 1.0 per cent for years 16-20, and 0.4 per cent for years 20-25 of follow-up. Relative survival was highest in 40-49 year-olds. Cases diagnosed most recently (1992-1996) had the highest survival, compared with cases diagnosed in previous periods. Five-year survival improved over time, especially from the late 1980s for women in the screening age group (50-69 years). Survival was highest for those with localised cancer at diagnosis: 88.4 per cent, 79.1 per cent, 74.6 per cent, 72.7 per cent and 72.8 per cent at 5, 10, 15, 20 and 25 years follow-up (excluding those aged ≥ 70 years). There was no significant difference between the survival of the breast cancer cases and the general population at 20-25 years follow-up.
Degree of spread was less predictive of survival 5-20 years after diagnosis, compared with 0-5 years after diagnosis, and was not significant at 20-25 years of follow-up. Relative survival from breast cancer in NSW women continues to decrease to 25 years after diagnosis, but there is little excess mortality after 15 years follow-up, especially for those with localised cancer at diagnosis, and the minimal excess mortality at 20-25 years of follow-up is not statistically significant. (C) 2002 Elsevier Science Ltd. All rights reserved.
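Relative survival, the quantity tracked throughout this study, is simply the observed survival of the cohort divided by the survival expected from matched population life tables. A minimal sketch with one illustrative expected-survival figure (the abstract does not report the expected values themselves):

```python
# Sketch of the relative-survival ratio: observed cumulative survival in the
# cancer cohort divided by the survival expected in an age-, sex- and
# period-matched general population. The 0.97 expected value is hypothetical.

def relative_survival(observed, expected):
    """Cumulative relative survival ratio (observed / expected)."""
    return observed / expected

# The abstract's observed 5-year survival is 76%; suppose the matched
# population's expected 5-year survival were 97%:
rs_5y = relative_survival(0.76, 0.97)
```

A ratio of 1.0 means the cohort dies at the same rate as the general population, which is exactly the condition the study reports at 20-25 years of follow-up.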

Relevance: 20.00%

Abstract:

This paper is part of a larger study assessing the adequacy of the use of multivariate statistical techniques in theses and dissertations on consumer behavior produced from 1997 to 2006 in the marketing programs of some higher education institutions. Regression and conjoint analysis, two techniques with great potential for use in marketing studies, are the focus of this paper. The objective of this study was to analyze whether the employment of these techniques suits the needs of the research problem presented, as well as to evaluate the level of success in meeting their premises. Overall, the results suggest the need for more involvement of researchers in verifying all the theoretical precepts for applying the techniques classified in the category of investigation of dependence among variables.

Relevance: 20.00%

Abstract:

PURPOSE: Many guidelines advocate measurement of total or low density lipoprotein cholesterol (LDL), high density lipoprotein cholesterol (HDL), and triglycerides (TG) to determine treatment recommendations for preventing coronary heart disease (CHD) and cardiovascular disease (CVD). This analysis is a comparison of lipid variables as predictors of cardiovascular disease. METHODS: Hazard ratios for coronary and cardiovascular deaths by fourths of total cholesterol (TC), LDL, HDL, TG, non-HDL, TC/HDL, and TG/HDL values, and for a one standard deviation change in these variables, were derived in an individual participant data meta-analysis of 32 cohort studies conducted in the Asia-Pacific region. The predictive value of each lipid variable was assessed using the likelihood ratio statistic. RESULTS: Adjusting for confounders and regression dilution, each lipid variable had a positive (negative for HDL) log-linear association with fatal CHD and CVD. Individuals in the highest fourth of each lipid variable had approximately twice the risk of CHD compared with those with lowest levels. TG and HDL were each better predictors of CHD and CVD risk compared with TC alone, with test statistics similar to TC/HDL and TG/HDL ratios. Calculated LDL was a relatively poor predictor. CONCLUSIONS: While LDL reduction remains the main target of intervention for lipid-lowering, these data support the potential use of TG or lipid ratios for CHD risk prediction. (c) 2005 Elsevier Inc. All rights reserved.
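The "fourths" grouping used in the hazard-ratio analysis above is a quartile assignment: rank a lipid variable and split the cohort into four equal-sized groups. A sketch with invented cholesterol values:

```python
# Sketch of grouping subjects by fourths (quartiles) of a lipid variable, as
# in the meta-analysis above. The cholesterol values are invented.

def fourths(values):
    """Return, for each value, its fourth (1 = lowest 25%, 4 = highest 25%)."""
    ranked = sorted(range(len(values)), key=lambda i: values[i])
    n = len(values)
    groups = [0] * n
    for rank, i in enumerate(ranked):
        groups[i] = 1 + (4 * rank) // n
    return groups

tc = [4.2, 6.8, 5.1, 7.5, 5.9, 4.9, 6.1, 5.5]  # total cholesterol, mmol/L
g = fourths(tc)
```

Hazard ratios are then estimated contrasting each upper fourth against the lowest, which is how the "approximately twice the risk" comparison in the abstract is constructed.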

Relevance: 20.00%

Abstract:

Objectives: To describe current practice for the discontinuation of continuous renal replacement therapy in a multinational setting and to identify variables associated with successful discontinuation. The approach to discontinuing continuous renal replacement therapy may affect patient outcomes; however, there is a lack of information on how and under what conditions continuous renal replacement therapy is discontinued. Design: Post hoc analysis of a prospective observational study. Setting: Fifty-four intensive care units in 23 countries. Patients: Five hundred twenty-nine patients (52.6%) who survived initial therapy among 1006 patients treated with continuous renal replacement therapy. Interventions: None. Measurements and Main Results: Three hundred thirteen patients were removed successfully from continuous renal replacement therapy and did not require any renal replacement therapy for at least 7 days; they were classified as the "success" group, and the rest (216 patients) were classified as the "repeat-RRT" (renal replacement therapy) group. Patients in the "success" group had lower hospital mortality (28.5% vs. 42.7%, p < .0001) compared with patients in the "repeat-RRT" group. They also had lower creatinine and urea concentrations and a higher urine output at the time of stopping continuous renal replacement therapy. Multivariate logistic regression analysis for successful discontinuation of continuous renal replacement therapy identified urine output (during the 24 hrs before stopping continuous renal replacement therapy: odds ratio, 1.078 per 100 mL/day increase) and creatinine (odds ratio, 0.996 per μmol/L increase) as significant predictors of successful cessation. The area under the receiver operating characteristic curve to predict successful discontinuation of continuous renal replacement therapy was 0.808 for urine output and 0.635 for creatinine. 
The predictive ability of urine output was negatively affected by the use of diuretics (area under the receiver operating characteristic curve, 0.671 with diuretics and 0.845 without diuretics). Conclusions: We report on the current practice of discontinuing continuous renal replacement therapy in a multinational setting. Urine output at the time of stopping continuous renal replacement therapy was the most important predictor of successful discontinuation, especially if occurring without the administration of diuretics. (Crit Care Med 2009; 37:2576-2582)
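An area under the ROC curve like the 0.808 reported for urine output can be computed in its Mann-Whitney form: the probability that a randomly chosen "success" patient has a higher predictor value than a "repeat-RRT" patient, with ties counting half. A sketch with invented urine outputs:

```python
# Sketch: AUC via the Mann-Whitney (rank) formulation. Patient data below are
# invented for illustration; "success" / "repeat-RRT" mirror the study's groups.

def auc(positives, negatives):
    """AUC = P(pos > neg) + 0.5 * P(pos == neg) over all cross-group pairs."""
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in positives
        for n in negatives
    )
    return wins / (len(positives) * len(negatives))

# Hypothetical 24-h urine outputs (mL) before stopping CRRT:
success = [900.0, 1500.0, 2200.0, 400.0]
repeat_rrt = [100.0, 350.0, 900.0]
value = auc(success, repeat_rrt)
```

An AUC of 0.5 means the predictor is no better than chance; 1.0 means the groups separate perfectly, which puts the study's 0.808 for urine output in context.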

Relevance: 20.00%

Abstract:

The Ordos Plateau in China is covered with up to 300,000 ha of peashrub (Caragana), which is the dominant natural vegetation and ideal for fodder production. To exploit peashrub fodder, it is crucially important to optimize the culture conditions, especially the culture substrate, to produce pectinase complex. In this study, a new prescription process was developed. The process, based on a uniform experimental design, first optimizes the solid substrate and second, after incubation, applies two different temperature treatments (30 °C for the first 30 h and 23 °C for the second 42 h) in the fermentation process. A multivariate regression analysis is applied to a number of independent variables (water, wheat bran, rice dextrose, ammonium sulfate, and Tween 80) to develop a predictive model of pectinase activity. A second-degree polynomial model is developed which accounts for an excellent proportion of the explained variation (R² = 97.7%). Using unconstrained mathematical programming, an optimized substrate prescription for pectinase production is subsequently developed. The mathematical analysis revealed that the optimal formula for pectinase production from Aspergillus niger by solid fermentation under the conditions of natural aeration, natural substrate pH (about 6.5), and environmental humidity of 60% is rice dextrose 8%, wheat bran 24%, ammonium sulfate ((NH4)2SO4) 6%, and water 61%. Tween 80 was found to have a negative effect on the production of pectinase in solid substrate. With this substrate prescription, pectinase produced by solid fermentation of A. niger reached 36.3 IU/(g DM). Goats fed on the pectinase complex obtained an incremental gain of 0.47 kg day⁻¹ during the initial 25 days of feeding, which is a very promising new feeding prospect for the local peashrub. It is concluded that the new formula may be very useful for the sustainable development of arid and semiarid pastures such as those of the Ordos Plateau. (c) 2005 Elsevier Inc. All rights reserved.
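The R² = 97.7% quoted above is the proportion of variation in pectinase activity explained by the fitted polynomial model. A sketch of the computation from observed responses and fitted values (the numbers are invented, not the study's data):

```python
# Sketch: coefficient of determination R^2 = 1 - SS_res / SS_tot, the fit
# statistic quoted for the second-degree polynomial model. Values invented.

def r_squared(observed, fitted):
    mean = sum(observed) / len(observed)
    ss_res = sum((y - f) ** 2 for y, f in zip(observed, fitted))
    ss_tot = sum((y - mean) ** 2 for y in observed)
    return 1.0 - ss_res / ss_tot

obs = [10.0, 14.0, 18.0, 16.0, 12.0]  # hypothetical pectinase activity, IU/(g DM)
fit = [10.5, 13.5, 17.5, 16.5, 12.0]  # hypothetical fitted values
r2 = r_squared(obs, fit)
```

Optimizing the fitted second-degree polynomial over the substrate proportions (the "unconstrained mathematical programming" step) then yields the prescription reported in the abstract.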

Relevance: 20.00%

Abstract:

Aortic valve calcium (AVC) can be quantified on the same computed tomographic scan as coronary artery calcium (CAC). Although CAC is an established predictor of cardiovascular events, limited evidence is available for an independent predictive value of AVC. We studied a cohort of 8,401 asymptomatic subjects (mean age 53 ± 10 years, 69% men), who were free of known coronary heart disease and were undergoing electron beam computed tomography for assessment of subclinical atherosclerosis. The patients were followed for a median of 5 years (range 1 to 7) for the occurrence of mortality from any cause. Multivariate Cox regression models were developed to predict all-cause mortality according to the presence of AVC. A total of 517 patients (6%) had AVC on electron beam computed tomography. During follow-up, 124 patients died (1.5%), for an overall survival rate of 96.1% and 98.7% for those with and without AVC, respectively (hazard ratio 3.39, 95% confidence interval 2.09 to 5.49). After adjustment for age, gender, hypertension, dyslipidemia, diabetes mellitus, smoking, and a family history of premature coronary heart disease, AVC remained a significant predictor of mortality (hazard ratio 1.82, 95% confidence interval 1.11 to 2.98). Likelihood ratio chi-square statistics demonstrated that the addition of AVC contributed significantly to the prediction of mortality in a model adjusted for traditional risk factors (chi-square = 5.03, p = 0.03) as well as traditional risk factors plus the presence of CAC (chi-square = 3.58, p = 0.05). In conclusion, AVC was associated with increased all-cause mortality, independent of the traditional risk factors and the presence of CAC. (C) 2010 Published by Elsevier Inc. (Am J Cardiol 2010;106:1787-1791)
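Cox-model hazard ratios like those above live on the log scale: the 95% confidence interval is exp(log HR ± 1.96 · SE). As a sketch, the standard error can be backed out of the adjusted AVC estimate quoted in the abstract (HR 1.82, 95% CI 1.11 to 2.98) and used to reproduce the interval:

```python
# Sketch: relation between a hazard ratio, its standard error on the log
# scale, and its 95% confidence interval. The SE is inferred from the
# reported interval, so the round trip is approximate.
import math

def hr_confidence_interval(hr, se, z=1.96):
    """95% CI of a hazard ratio: exp(log(hr) +/- z * se)."""
    b = math.log(hr)
    return math.exp(b - z * se), math.exp(b + z * se)

# Standard error implied by the reported interval (1.11, 2.98):
se = (math.log(2.98) - math.log(1.11)) / (2 * 1.96)
low, high = hr_confidence_interval(1.82, se)
```

The recovered bounds land close to the published 1.11 and 2.98, which is a useful consistency check when reading reported hazard ratios.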

Relevance: 20.00%

Abstract:

We reviewed the data of 307 patients treated with autologous bone marrow transplantation with the aim of identifying factors associated with poor hematopoietic stem cell (HSC) mobilization after administration of cyclophosphamide and granulocyte colony-stimulating factor. Success in mobilization was defined as the collection of ≥ 2.0 × 10⁶ CD34+ cells/kg body weight with ≤ 3 leukapheresis procedures. Success was observed in 260 patients (84.7%) and nonsuccess in 47 patients (15.3%). According to the stepwise regression model, diagnosis, chemotherapy load, treatment with mitoxantrone and platelet count before mobilization were found to be independent predictive factors for HSC mobilization. These results could help in the early recognition of patients at risk of nonresponse to mobilization and allow planning of an alternative protocol for this group of patients. (C) 2008 Elsevier Ltd. All rights reserved.
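A stepwise logistic model like the one above ultimately converts its predictors into a probability of mobilization success through the inverse-logit transform. A sketch with placeholder coefficients (the paper's actual estimates are not reported in the abstract, so everything numeric here is invented):

```python
# Sketch: turning a logistic model's linear predictor into a probability.
# Intercept, coefficients and predictor values are hypothetical placeholders,
# not the paper's fitted model.
import math

def success_probability(intercept, coefs, predictors):
    """Inverse-logit of the linear predictor b0 + sum(b_i * x_i)."""
    lp = intercept + sum(b * x for b, x in zip(coefs, predictors))
    return 1.0 / (1.0 + math.exp(-lp))

# Hypothetical two-predictor model, e.g. scaled platelet count and a binary
# mitoxantrone-treatment indicator:
p = success_probability(-0.5, [1.2, -0.8], [1.8, 1.0])
```

Stepwise selection would choose which predictors enter this linear predictor; once fitted, a probability threshold can flag patients at risk of nonresponse for an alternative protocol.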