965 results for nonlinear regression analysis
Abstract:
Multiple regression analysis is a complex statistical method with many potential uses. It has also become one of the most abused of all statistical procedures, since anyone with a database and suitable software can carry it out. An investigator should always have a clear hypothesis in mind before carrying out such a procedure, as well as knowledge of the limitations of each aspect of the analysis. In addition, multiple regression is probably best used in an exploratory context, identifying variables that might profitably be examined by more detailed studies. Where there are many variables potentially influencing Y, they are likely to be intercorrelated and to account for relatively small amounts of the variance. Any analysis in which R² is less than 50% should be regarded with suspicion as probably not indicating the presence of significant variables. A further problem relates to sample size. It is often stated that the number of subjects or patients must be at least 5-10 times the number of variables included in the study [5]. This advice should be taken only as a rough guide, but it does indicate that the variables included should be selected with great care, as inclusion of an obviously unimportant variable may have a significant impact on the sample size required.
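A minimal sketch (entirely hypothetical data and study dimensions) of the two safeguards discussed above: checking R² after fitting a multiple regression and comparing the number of subjects per variable against the 5-10 rule of thumb. It assumes numpy and statsmodels are available.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_subjects, n_vars = 120, 6                        # hypothetical study size
X = rng.normal(size=(n_subjects, n_vars))
y = X @ rng.normal(size=n_vars) + rng.normal(scale=2.0, size=n_subjects)

model = sm.OLS(y, sm.add_constant(X)).fit()        # multiple linear regression
print(f"R-squared: {model.rsquared:.2f}")          # treat R^2 < 0.5 with caution
print(f"Subjects per variable: {n_subjects / n_vars:.1f}")   # aim for at least 5-10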
Abstract:
Objectives: To describe current practice for the discontinuation of continuous renal replacement therapy in a multinational setting and to identify variables associated with successful discontinuation. The approach to discontinuing continuous renal replacement therapy may affect patient outcomes; however, there is a lack of information on how and under what conditions continuous renal replacement therapy is discontinued. Design: Post hoc analysis of a prospective observational study. Setting: Fifty-four intensive care units in 23 countries. Patients: Five hundred twenty-nine patients (52.6%) who survived initial therapy among 1006 patients treated with continuous renal replacement therapy. Interventions: None. Measurements and Main Results: Three hundred thirteen patients were removed successfully from continuous renal replacement therapy and did not require any renal replacement therapy for at least 7 days; they were classified as the "success" group, and the rest (216 patients) were classified as the "repeat-RRT" (renal replacement therapy) group. Patients in the "success" group had lower hospital mortality (28.5% vs. 42.7%, p < .0001) compared with patients in the "repeat-RRT" group. They also had lower creatinine and urea concentrations and a higher urine output at the time of stopping continuous renal replacement therapy. Multivariate logistic regression analysis for successful discontinuation of continuous renal replacement therapy identified urine output (during the 24 hrs before stopping continuous renal replacement therapy: odds ratio, 1.078 per 100 mL/day increase) and creatinine (odds ratio, 0.996 per μmol/L increase) as significant predictors of successful cessation. The area under the receiver operating characteristic curve to predict successful discontinuation of continuous renal replacement therapy was 0.808 for urine output and 0.635 for creatinine. The predictive ability of urine output was negatively affected by the use of diuretics (area under the receiver operating characteristic curve, 0.671 with diuretics and 0.845 without diuretics). Conclusions: We report on the current practice of discontinuing continuous renal replacement therapy in a multinational setting. Urine output at the time of initial cessation of continuous renal replacement therapy was the most important predictor of successful discontinuation, especially if occurring without the administration of diuretics. (Crit Care Med 2009; 37:2576-2582)
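A minimal sketch (synthetic data, illustrative coefficients) of the kind of analysis described: a logistic regression predicting successful discontinuation from urine output and creatinine, reporting odds ratios and the area under the ROC curve. Variable names, magnitudes and effect sizes are assumptions, not the study's data.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 500
urine_output = rng.gamma(shape=2.0, scale=400.0, size=n)     # mL/day, synthetic
creatinine = rng.normal(loc=250.0, scale=80.0, size=n)       # micromol/L, synthetic
logit = -2.0 + 0.004 * urine_output - 0.004 * creatinine
success = rng.random(n) < 1 / (1 + np.exp(-logit))           # simulated outcome

X = np.column_stack([urine_output, creatinine])
clf = LogisticRegression().fit(X, success)
odds_ratios = np.exp(clf.coef_[0])                           # per-unit odds ratios
auc = roc_auc_score(success, clf.predict_proba(X)[:, 1])     # discrimination
print(odds_ratios, auc)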
Abstract:
Background: Mucosal leishmaniasis is caused mainly by Leishmania braziliensis and occurs months or years after the cutaneous lesions. This progressive disease destroys cartilage and osseous structures of the face, pharynx and larynx. Objective and methods: The aim of this study was to analyse the association of clinical and epidemiological findings, diagnosis and treatment with the outcome and recurrence of mucosal leishmaniasis, using a binary logistic regression model applied to 140 patients with mucosal leishmaniasis from a Brazilian centre. Results: The median age of patients was 57.5 years, and systemic arterial hypertension was the most prevalent secondary disease found in patients with mucosal leishmaniasis (43%). Diabetes, chronic nephropathy, viral hepatitis, allergy and coagulopathy were each found in less than 10% of patients. Human immunodeficiency virus (HIV) infection was found in 7 of 140 patients (5%). Rhinorrhea (47%) and epistaxis (75%) were the most common symptoms. N-methyl-glucamine showed a cure rate of 91% and a recurrence rate of 22%. Pentamidine showed a similar cure rate (91%) and recurrence rate (25%). Fifteen patients received itraconazole, with a cure rate of 73% and recurrence of 18%. Amphotericin B was used in 30 patients, with an 82% response rate and a recurrence rate of 7%. The binary logistic regression analysis demonstrated that systemic arterial hypertension and HIV infection were associated with failure of the treatment (P < 0.05). Conclusion: The current first-line mucosal leishmaniasis therapy shows an adequate cure rate but later recurrence. HIV infection and systemic arterial hypertension should be investigated before starting treatment of mucosal leishmaniasis. Conflicts of interest: The authors are not part of any associations or commercial relationships that might represent conflicts of interest in the writing of this study (e.g. pharmaceutical stock ownership, consultancy, advisory board membership, relevant patents, or research funding).
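A minimal sketch (synthetic patients, illustrative effect sizes) of the binary logistic regression used above: two binary predictors (hypertension, HIV infection) tested for association with treatment failure, reported as odds ratios and p-values. Nothing here reproduces the study's actual data.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 140
hypertension = rng.random(n) < 0.43            # simulated prevalence
hiv = rng.random(n) < 0.05
logit = -2.0 + 1.0 * hypertension + 1.5 * hiv
failure = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

X = sm.add_constant(np.column_stack([hypertension, hiv]).astype(float))
fit = sm.Logit(failure, X).fit(disp=0)
print(np.exp(fit.params[1:]))                  # odds ratios: hypertension, HIV
print(fit.pvalues[1:])                         # corresponding p-values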
Abstract:
Chemotherapy-induced oral mucositis is a frequent therapeutic challenge in cancer patients. The purpose of this retrospective study was to estimate the prevalence and risk factors of oral mucositis in 169 acute lymphoblastic leukaemia (ALL) patients treated according to different chemotherapeutic trials at the Darcy Vargas Children's Hospital from 1994 to 2005. Demographic data, clinical history, chemotherapeutic treatment and patients' follow-up were recorded. The association of oral mucositis with age, gender, leucocyte counts at diagnosis and treatment was assessed by the chi-squared test and multivariate regression analysis. Seventy-seven ALL patients (46%) developed oral mucositis during the treatment. Patient age (P = 0.33), gender (P = 0.08) and leucocyte counts at diagnosis (P = 0.34) showed no correlation with the occurrence of oral mucositis. Multivariate regression analysis showed a significant risk for oral mucositis (P = 0.009) for ALL patients treated according to the ALL-BFM-95 protocol. These results strongly suggest a greater stomatotoxic effect of the ALL-BFM-95 trial compared with the Brazilian trials. We concluded that chemotherapy-induced oral mucositis should be systematically analysed prospectively in specialized centres for ALL treatment, to establish the degree of toxicity of chemotherapeutic drugs and to improve the quality of life of patients through more effective therapeutic and prophylactic approaches. Oral Diseases (2008) 14, 761-766
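A minimal sketch of the chi-squared screening step mentioned above, applied to a made-up 2x2 table (treatment protocol versus mucositis occurrence); the counts are illustrative, not the study's.

from scipy.stats import chi2_contingency

# rows: protocol A / protocol B; columns: mucositis yes / no (illustrative counts)
table = [[30, 20],
         [47, 72]]
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")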
Abstract:
The use of a fitted parameter watershed model to address water quantity and quality management issues requires that it be calibrated under a wide range of hydrologic conditions. However, rarely does model calibration result in a unique parameter set. Parameter nonuniqueness can lead to predictive nonuniqueness. The extent of model predictive uncertainty should be investigated if management decisions are to be based on model projections. Using models built for four neighboring watersheds in the Neuse River Basin of North Carolina, the application of the automated parameter optimization software PEST in conjunction with the Hydrologic Simulation Program Fortran (HSPF) is demonstrated. Parameter nonuniqueness is illustrated, and a method is presented for calculating many different sets of parameters, all of which acceptably calibrate a watershed model. A regularization methodology is discussed in which models for similar watersheds can be calibrated simultaneously. Using this method, parameter differences between watershed models can be minimized while maintaining fit between model outputs and field observations. In recognition of the fact that parameter nonuniqueness and predictive uncertainty are inherent to the modeling process, PEST's nonlinear predictive analysis functionality is then used to explore the extent of model predictive uncertainty.
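A minimal, generic illustration of parameter nonuniqueness (a toy exponential model fitted with scipy, not PEST or HSPF): starting the calibration from different initial guesses yields different parameter sets that reproduce the observations almost equally well.

import numpy as np
from scipy.optimize import least_squares

t = np.linspace(0.0, 10.0, 50)
obs = 4.0 * np.exp(-0.4 * t)                       # synthetic "observations"

def residuals(p):
    a1, k1, a2, k2 = p                             # over-parameterized model
    return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t) - obs

fit_a = least_squares(residuals, x0=[3.0, 0.5, 1.0, 0.3])
fit_b = least_squares(residuals, x0=[1.0, 0.2, 3.0, 0.6])
print(fit_a.x, np.sum(fit_a.fun**2))               # different parameters ...
print(fit_b.x, np.sum(fit_b.fun**2))               # ... comparably good fits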
Abstract:
Beyond the classical statistical approaches (determination of basic statistics, regression analysis, ANOVA, etc.), a new set of applications of different statistical techniques has increasingly gained relevance in the analysis, processing and interpretation of data concerning the characteristics of forest soils, as can be seen in some recent publications on Multivariate Statistics. These newer methods require additional care that is not always taken or mentioned in some approaches. In the particular case of geostatistical applications it is necessary, besides geo-referencing all data acquisition, to collect the samples in regular grids and in sufficient quantity so that the variograms can reflect the spatial distribution of soil properties in a representative manner. As for the great majority of Multivariate Statistics techniques (Principal Component Analysis, Correspondence Analysis, Cluster Analysis, etc.), although in most cases they do not require the assumption of a normal distribution, they nevertheless need a proper and rigorous strategy for their use. In this work, some reflections are presented on these methodologies, in particular on the main constraints that often arise during data collection and on the various ways in which these different techniques can be combined. Finally, illustrations of some particular applications of these statistical methods are also presented.
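A minimal sketch of one of the multivariate techniques mentioned (Principal Component Analysis) applied to a hypothetical table of standardized forest-soil properties; the data and dimensions are assumptions for illustration only.

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
soil = rng.normal(size=(60, 5))                    # 60 samples x 5 soil properties
scaled = StandardScaler().fit_transform(soil)      # standardize before PCA
pca = PCA(n_components=2)
scores = pca.fit_transform(scaled)                 # samples projected on PC1-PC2
print(pca.explained_variance_ratio_, scores.shape)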
Abstract:
Dissertation presented to the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa in fulfilment of the requirements for the degree of Master in Biomedical Engineering.
Abstract:
Auditory event-related potentials (AERPs) are widely used in diverse fields of today's neuroscience concerned with auditory processing, speech perception, language acquisition, neurodevelopment, attention and cognition in normal aging, gender, and developmental, neurologic and psychiatric disorders. However, their transposition to clinical practice has remained minimal, mainly due to the scarce literature on normative data across age, the wide spectrum of results, the variety of auditory stimuli used and the different neuropsychological meanings attributed to AERP components by different authors. One of the most prominent AERP components studied in recent decades is N1, which reflects auditory detection and discrimination. Subsequently, N2 indicates attention allocation and phonological analysis. The simultaneous analysis of N1 and N2 elicited by feasible novelty experimental paradigms, such as the auditory oddball, seems an objective method to assess central auditory processing. The aim of this systematic review was to bring forward normative values for auditory oddball N1 and N2 components across age. EBSCO, PubMed, Web of Knowledge and Google Scholar were systematically searched for studies that elicited N1 and/or N2 by an auditory oddball paradigm. A total of 2,764 papers published between 1988 and 2013 (the last 25 years) were initially identified in the databases, of which 19 resulted from hand searching and additional references. A final total of 68 studies met the eligibility criteria, with a total of 2,406 participants from control groups for N1 (age range 6.6–85 years; mean 34.42) and 1,507 for N2 (age range 9–85 years; mean 36.13). Polynomial regression analysis revealed that N1 latency decreases with aging at Fz and Cz, and that N1 amplitude at Cz decreases from childhood to adolescence and stabilizes after 30–40 years, whereas at Fz the decrement finishes by 60 years and amplitude increases markedly after this age. Regarding N2, latency did not covary with age, but amplitude showed a significant decrement with age for both Cz and Fz. Results suggested reliable normative values for the Cz and Fz electrode locations; however, changes in brain development and component topography over age should be considered in clinical practice.
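A minimal sketch (made-up latencies) of the polynomial regression approach described: fitting a low-order polynomial of age to an AERP measure such as N1 latency at Cz. The values and the chosen polynomial degree are assumptions for illustration.

import numpy as np

age = np.array([7, 12, 18, 25, 35, 45, 55, 65, 75, 85], dtype=float)        # years
n1_latency = np.array([135, 128, 120, 112, 108, 106, 105, 104, 103, 103],
                      dtype=float)                                          # ms, illustrative

coeffs = np.polyfit(age, n1_latency, deg=2)        # quadratic fit: latency ~ age
fitted = np.polyval(coeffs, age)
print(coeffs)                                      # highest-order coefficient first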
Abstract:
Accurate size measurements are fundamental in characterizing the population structure and secondary production of a species. The purpose of this study was to determine the best morphometric parameter to estimate the size of individuals of Capitella capitata (Fabricius, 1780). The morphometric analysis was applied to individuals collected in the intertidal zones of two beaches on the northern coast of the state of São Paulo, Brazil: São Francisco and Araçá. The following measurements were taken: the width and length (height) of the 4th, 5th and 7th setigers, and the length of the thoracic region (first nine setigers). The area and volume of these setigers were calculated and a linear regression analysis was applied to the data. The data were log-transformed to fit the allometric equation y = ax^b as a straight line (log y = log a + b log x). The measurements which best correlated with the thoracic length in individuals from both beaches were the length of setiger 5 (r² = 0.722, p < 0.05 in São Francisco and r² = 0.795, p < 0.05 in Araçá) and the area of setiger 7 (r² = 0.705, p < 0.05 in São Francisco and r² = 0.634, p < 0.05 in Araçá). According to these analyses, the length of setiger 5 and/or the area of setiger 7 are the best parameters to evaluate the growth of individuals of C. capitata.
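A minimal sketch of the log-log fit described above: the allometric relation y = ax^b becomes linear after log transformation, so an ordinary linear regression recovers the exponent b and, after back-transformation, the coefficient a. The measurement values below are invented for illustration.

import numpy as np
from scipy import stats

setiger5_length = np.array([0.10, 0.15, 0.22, 0.30, 0.41, 0.55])   # illustrative units
thoracic_length = np.array([0.90, 1.20, 1.60, 2.10, 2.70, 3.40])

slope, intercept, r, p, se = stats.linregress(np.log(setiger5_length),
                                              np.log(thoracic_length))
b, a = slope, np.exp(intercept)              # back-transform: y = a * x**b
print(f"b = {b:.2f}, a = {a:.2f}, r^2 = {r**2:.3f}, p = {p:.4f}")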
Abstract:
BACKGROUND: Recommended oral voriconazole (VRC) doses are lower than intravenous doses. Because plasma concentrations impact efficacy and safety of therapy, optimizing individual drug exposure may improve these outcomes. METHODS: A population pharmacokinetic analysis (NONMEM) was performed on 505 plasma concentration measurements involving 55 patients with invasive mycoses who received recommended VRC doses. RESULTS: A 1-compartment model with first-order absorption and elimination best fitted the data. VRC clearance was 5.2 L/h, the volume of distribution was 92 L, the absorption rate constant was 1.1 h^-1, and oral bioavailability was 0.63. Severe cholestasis decreased VRC elimination by 52%. A large interpatient variability was observed on clearance (coefficient of variation [CV], 40%) and bioavailability (CV, 84%), and an interoccasion variability was observed on bioavailability (CV, 93%). Lack of response to therapy occurred in 12 of 55 patients (22%), and grade 3 neurotoxicity occurred in 5 of 55 patients (9%). A logistic multivariate regression analysis revealed an independent association between VRC trough concentrations and probability of response or neurotoxicity by identifying a therapeutic range of 1.5 mg/L (>85% probability of response) to 4.5 mg/L (<15% probability of neurotoxicity). Population-based simulations with the recommended 200 mg oral or 300 mg intravenous twice-daily regimens predicted probabilities of 49% and 87%, respectively, for achievement of 1.5 mg/L and of 8% and 37%, respectively, for achievement of 4.5 mg/L. With 300-400 mg twice-daily oral doses and 200-300 mg twice-daily intravenous doses, the predicted probabilities of achieving the lower target concentration were 68%-78% for the oral regimen and 70%-87% for the intravenous regimen, and the predicted probabilities of achieving the upper target concentration were 19%-29% for the oral regimen and 18%-37% for the intravenous regimen. CONCLUSIONS: Higher oral than intravenous VRC doses, followed by individualized adjustments based on measured plasma concentrations, improve achievement of the therapeutic target that maximizes the probability of therapeutic response and minimizes the probability of neurotoxicity. These findings challenge dose recommendations for VRC.
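A minimal sketch of the one-compartment model with first-order absorption and elimination, using the population parameter estimates reported above (clearance 5.2 L/h, volume 92 L, ka 1.1 h^-1, bioavailability 0.63); the dose and sampling times are illustrative assumptions, and the sketch ignores the reported variability terms.

import numpy as np

CL, V, ka, F = 5.2, 92.0, 1.1, 0.63        # reported typical values
ke = CL / V                                # first-order elimination rate constant
dose = 400.0                               # mg, single oral dose (illustrative)

t = np.linspace(0.0, 12.0, 49)             # hours after the dose
conc = (F * dose * ka) / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))
print(f"Cmax ~ {conc.max():.2f} mg/L at t = {t[conc.argmax()]:.2f} h")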
Abstract:
This paper seeks to identify whether there is a representative empirical Okun's Law coefficient (OLC) and to measure its size. We carry out a meta-regression analysis on a sample of 269 estimates of the OLC to uncover reasons for differences in empirical results and to estimate the 'true' OLC. On statistical (and other) grounds, we find it appropriate to investigate two separate sub-samples, using respectively (some measure of) unemployment or output as the dependent variable. Our results can be summarized as follows. First, there is evidence of type II publication bias in both sub-samples, but a type I bias is present only among the papers using some measure of unemployment as the dependent variable. Second, after correction for publication bias, authentic and statistically significant OLC effects are present in both sub-samples. Third, bias-corrected estimates of the true OLC are significantly lower (in absolute value) in models using some measure of unemployment as the dependent variable. Using a bivariate MRA approach, the estimated true effects are -0.25 for the unemployment sub-sample and -0.61 for the output sub-sample; with a multivariate MRA methodology, the estimated true effects are -0.40 and -1.02 for the unemployment and output sub-samples respectively.
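A minimal sketch (simulated estimates) of a bivariate meta-regression of the kind described: reported coefficients are regressed on their standard errors with inverse-variance weights, so the intercept approximates the bias-corrected 'true' effect and the slope tests for publication bias. The numbers below are assumptions, not the paper's data set.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
se = rng.uniform(0.05, 0.40, size=120)                 # standard errors of the estimates
olc = -0.4 + 1.5 * se + rng.normal(scale=se)           # simulated, selection-biased estimates

X = sm.add_constant(se)
fit = sm.WLS(olc, X, weights=1.0 / se**2).fit()        # precision-weighted meta-regression
print(fit.params)      # [intercept ~ bias-corrected OLC, slope ~ publication-bias term]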
Abstract:
In a recent paper, Bermúdez [2009] used bivariate Poisson regression models for ratemaking in car insurance, and included zero-inflated models to account for the excess of zeros and the overdispersion in the data set. In the present paper, we revisit this model in order to consider alternatives. We propose a 2-finite mixture of bivariate Poisson regression models to demonstrate that the overdispersion in the data requires more structure if it is to be taken into account, and that a simple zero-inflated bivariate Poisson model does not suffice. At the same time, we show that a finite mixture of bivariate Poisson regression models embraces zero-inflated bivariate Poisson regression models as a special case. Additionally, we describe a model in which the mixing proportions depend on covariates, to model the way in which each individual belongs to a separate cluster. Finally, an EM algorithm is provided to make the models straightforward to fit. These models are applied to the same automobile insurance claims data set as used in Bermúdez [2009], and it is shown that the modelling of the data set can be improved considerably.
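A minimal sketch of the bivariate Poisson construction these regression models build on (trivariate reduction): two claim counts share a common Poisson shock, which induces the positive correlation between lines of business. The intensities below are illustrative, and no regression or zero-inflation structure is included.

import numpy as np

rng = np.random.default_rng(4)
lam0, lam1, lam2 = 0.1, 0.3, 0.2              # illustrative intensities
n = 100_000
common = rng.poisson(lam0, n)                 # shared shock
claims_motor = rng.poisson(lam1, n) + common
claims_other = rng.poisson(lam2, n) + common
print(np.corrcoef(claims_motor, claims_other)[0, 1])   # positive, driven by the shared shock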
Abstract:
This article focuses on business risk management in the insurance industry. A methodology for estimating the profit loss caused by each customer in the portfolio due to policy cancellation is proposed. Using data from a European insurance company, customer behaviour over time is analyzed in order to estimate the probability of policy cancellation and the resulting potential profit loss due to cancellation. Customers may have up to two different lines of business contracts: motor insurance and other diverse insurance (such as home contents, life or accident insurance). Implications for understanding customer cancellation behaviour as the core of business risk management are outlined.
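A minimal sketch of the core quantity behind such an approach: the expected profit loss per customer as the estimated cancellation probability times the profit contribution at risk. The probabilities and profit figures below are purely hypothetical.

cancel_prob = [0.05, 0.20, 0.60]          # estimated per-customer cancellation probabilities
profit = [300.0, 150.0, 500.0]            # profit contribution at risk per customer
expected_loss = [p * m for p, m in zip(cancel_prob, profit)]
print(expected_loss, sum(expected_loss))  # per-customer and portfolio expected profit loss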