979 results for Methods : Statistical
Abstract:
Small area health statistics has assumed increasing importance as the focus of population and public health moves to a more individualised approach based on smaller area populations. Small populations and low event occurrence produce difficulties in interpretation and require appropriate statistical methods, including methods for age adjustment. There are also statistical questions related to multiple comparisons. Privacy and confidentiality issues include the possibility of revealing information on individuals or health care providers through fine cross-tabulations. Interpretation of small area population differences in health status requires consideration of migrant and Indigenous composition, socio-economic status and rural-urban geography before assessment of the effects of physical environmental exposure and of services and interventions. Burden of disease studies produce a single measure for morbidity and mortality - the disability-adjusted life year (DALY) - which is the sum of the years of life lost (YLL) from premature mortality and the years lived with disability (YLD) for particular diseases (or all conditions). Calculation of YLD requires estimates of disease incidence (and complications) and duration, and weighting by severity. These procedures often rest on problematic assumptions, as do the future discounting and age weighting of both YLL and YLD. Evaluation of the Victorian small area population disease burden study presents important cross-disciplinary challenges, as it relies heavily on the synthetic approaches of demography and economics rather than on the empirical methods of epidemiology. Both empirical and synthetic methods are used to compute small area mortality and morbidity, disease burden, and then attribution to risk factors. Readers need to examine the methodology and assumptions carefully before accepting the results.
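For reference, the arithmetic described above follows the standard (undiscounted, non-age-weighted) burden-of-disease identities; the symbols below are the conventional ones and are not taken from this study:

```latex
\mathrm{DALY} = \mathrm{YLL} + \mathrm{YLD}, \qquad
\mathrm{YLL} = N \times L, \qquad
\mathrm{YLD} = I \times DW \times L'
```

where N is the number of deaths, L the standard life expectancy at the age of death, I the number of incident cases, DW the severity (disability) weight, and L' the average duration of disability. The future discounting and age weighting the abstract flags as problematic enter as modifications of each term.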
Abstract:
Summary: Plant biologists in the fields of ecology, evolution, genetics and breeding frequently use multivariate methods. This paper illustrates Principal Component Analysis (PCA) and Gabriel's biplot as applied to microarray expression data from plant pathology experiments. Availability: An example program in the publicly distributed statistical language R is available from the web site (www.tpp.uq.edu.au) and by e-mail from the contact. Contact: scott.chapman@csiro.au.
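The R program referenced above is not reproduced here; the following is a minimal, hypothetical Python sketch of the same idea (PCA of a genes-by-samples expression matrix plus a Gabriel-style biplot), with random data standing in for real microarray measurements:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 6))  # hypothetical expression matrix: 30 genes x 6 samples

X_centered = X - X.mean(axis=0)         # center columns before PCA
pca = PCA(n_components=2)
scores = pca.fit_transform(X_centered)  # gene (row) coordinates
loadings = pca.components_.T            # sample (column) coordinates

# Gabriel-style biplot: row scores as points, column loadings as arrows
fig, ax = plt.subplots()
ax.scatter(scores[:, 0], scores[:, 1], s=10, label="genes")
for j in range(loadings.shape[0]):
    ax.arrow(0, 0, *(loadings[j] * scores.std(axis=0)),
             color="red", head_width=0.05)
ax.set_xlabel(f"PC1 ({pca.explained_variance_ratio_[0]:.0%})")
ax.set_ylabel(f"PC2 ({pca.explained_variance_ratio_[1]:.0%})")
plt.show()
```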
Abstract:
Background: A variety of methods for the prediction of peptide binding to major histocompatibility complex (MHC) molecules have been proposed. These methods are based on binding motifs, binding matrices, hidden Markov models (HMM), or artificial neural networks (ANN). There has been little prior work on the comparative analysis of these methods. Materials and Methods: We compared the performance of six methods, including binding matrices and motifs, ANNs and HMMs, applied to the prediction of peptide binding to two human MHC class I molecules. Results: The selection of the optimal prediction method depends on the amount of available data (the number of peptides of known binding affinity to the MHC molecule of interest), the biases in the data set and the intended purpose of the prediction (screening of a single protein versus mass screening). When little or no peptide data are available, binding motifs are the most useful alternative to random guessing or the use of a complete overlapping set of peptides for selection of candidate binders. As the number of known peptide binders increases, binding matrices and HMMs become more useful predictors. ANNs and HMMs are the predictive methods of choice for MHC alleles with more than 100 known binding peptides. Conclusion: The ability of bioinformatic methods to reliably predict MHC-binding peptides, and thereby potential T-cell epitopes, has major implications for clinical immunology, particularly in the area of vaccine design.
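To make the "binding matrix" idea concrete: a position-specific scoring matrix assigns each amino acid a weight at each peptide position, and a peptide's predicted binding score is the sum of the weights it hits. A minimal sketch follows; the toy matrix and its weights are invented for illustration, not taken from any published MHC matrix:

```python
# Hypothetical position-specific scoring matrix (PSSM) for peptide scoring.
# Real MHC class I matrices cover all 20 amino acids over 9 positions; this
# toy version uses 2 positions and 3 residues purely to show the mechanics.
PSSM = [
    {"L": 1.2, "A": 0.1, "K": -0.8},  # position 1 weights (invented)
    {"V": 0.9, "A": 0.0, "E": -1.1},  # position 2 weights (invented)
]

def score_peptide(peptide: str) -> float:
    """Additive matrix score: sum position weights; unseen residues score 0."""
    return sum(PSSM[i].get(aa, 0.0) for i, aa in enumerate(peptide))

print(score_peptide("LV"))  # 1.2 + 0.9 = 2.1
```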
Abstract:
Background: Controversy exists surrounding pharmacological therapy in acute variceal bleeding. Aim: To determine the efficacy and safety of terlipressin. Methods: Randomized trials were identified, and duplicate, independent review identified 20 randomized trials involving 1609 patients that compared terlipressin with placebo, balloon tamponade, endoscopic treatment, octreotide, somatostatin or vasopressin for the treatment of acute oesophageal variceal haemorrhage. Results: Meta-analysis showed that, compared to placebo, terlipressin reduced mortality (relative risk 0.66, 95% CI 0.49-0.88), failure of haemostasis (relative risk 0.63, 95% CI 0.45-0.89) and the number of emergency procedures per patient required for uncontrolled bleeding or rebleeding (relative risk 0.72, 95% CI 0.55-0.93). When used as an adjuvant to endoscopic sclerotherapy, terlipressin reduced failure of haemostasis (relative risk 0.75, 95% CI 0.58-0.96) and had an effect on reducing mortality that approached statistical significance (relative risk 0.74, 95% CI 0.53-1.04). No significant difference was demonstrated between terlipressin and endoscopic sclerotherapy, balloon tamponade, somatostatin or vasopressin. Haemostasis was achieved more frequently with octreotide compared to terlipressin (relative risk 1.62, 95% CI 1.05-2.50), but this result was based on unblinded studies. Adverse events were similar between terlipressin and the other comparison groups except for vasopressin, which caused more withdrawals due to adverse events. Conclusions: Terlipressin is a safe and effective treatment for acute oesophageal variceal bleeding, with or without adjuvant endoscopic sclerotherapy. Terlipressin appears to reduce mortality in acute oesophageal variceal bleeding compared to placebo, and is the only pharmacological agent shown to do so. Future studies will be required to detect potential mortality differences between terlipressin and other therapeutic approaches.
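For readers checking the figures quoted above: given a 2x2 table with a and b treated patients with and without the event, and c and d control patients with and without the event, the relative risk and its 95% confidence interval follow the standard formulas:

```latex
RR = \frac{a/(a+b)}{c/(c+d)}, \qquad
SE(\ln RR) = \sqrt{\tfrac{1}{a} - \tfrac{1}{a+b} + \tfrac{1}{c} - \tfrac{1}{c+d}}, \qquad
\text{95\% CI} = \exp\!\left(\ln RR \pm 1.96 \cdot SE(\ln RR)\right)
```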
Abstract:
This paper is concerned with the use of scientific visualization methods for the analysis of feedforward neural networks (NNs). Inevitably, the kinds of data associated with the design and implementation of neural networks are of very high dimensionality, presenting a major challenge for visualization. A method is described using the well-known statistical technique of principal component analysis (PCA). This is found to be an effective and useful method of visualizing the learning trajectories of many learning algorithms such as back-propagation and can also be used to provide insight into the learning process and the nature of the error surface.
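A minimal sketch of the visualization idea, assuming nothing about the original implementation: snapshot the flattened weight vector at each training step, then project the snapshots onto the first two principal components to view the learning trajectory.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Stand-in for training: a random walk in weight space playing the role of
# weight snapshots collected after each back-propagation epoch.
steps = rng.normal(scale=0.1, size=(200, 50))
weights = np.cumsum(steps, axis=0)   # 200 snapshots of a 50-dim weight vector

pca = PCA(n_components=2)
traj = pca.fit_transform(weights)    # learning trajectory in the PC1-PC2 plane

# Plotting traj[:, 0] against traj[:, 1] (e.g. with matplotlib) shows the
# path the optimizer traces; curvature and clustering hint at the shape of
# the error surface along the dominant directions of weight change.
print(pca.explained_variance_ratio_)
```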
Abstract:
Estimating energy requirements is necessary in clinical practice when indirect calorimetry is impractical. This paper systematically reviews current methods for estimating energy requirements. Conclusions include: there are discrepancies between the characteristics of the populations on which the predictive equations were based and current populations; the tools are not well understood; and patient care can be compromised by inappropriate application of the tools. Data comparing tools and methods are presented and issues for practitioners are discussed.
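As one concrete example of the kind of predictive equation under review (chosen here for illustration; the paper itself compares several tools), the original 1919 Harris-Benedict equations estimate basal energy expenditure (kcal/day) from weight W (kg), height H (cm) and age A (years), coefficients rounded:

```latex
\text{Men:} \quad BEE = 66.47 + 13.75\,W + 5.00\,H - 6.76\,A \\
\text{Women:} \quad BEE = 655.10 + 9.56\,W + 1.85\,H - 4.68\,A
```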
Abstract:
Taking functional programming to its extremes in search of simplicity still requires integration with other development methods (e.g. formal methods). Induction is the key to deriving and verifying functional programs, but it can be simplified by packaging proofs with functions, particularly folds, on data (structures). Totally Functional Programming (TFP) avoids the complexities of interpretation by directly representing data (structures) as platonic combinators - the functions characteristic to the data. The link between the two simplifications is that platonic combinators are a kind of partially-applied fold, which means that they inherit fold-theoretic properties, but with some apparent simplifications due to the platonic combinator representation. However, despite observable behaviour within functional programming that suggests TFP is widely applicable, significant work remains before TFP as such could be widely adopted.
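To illustrate "data as platonic combinators" outside a full functional language: a Church-encoded list is the function that folds over itself, so the fold is not applied to the data, it is the data. A small sketch of the idea follows, with Python standing in for a proper functional language purely for illustration:

```python
# Church-encoded lists: a list IS its own fold (a "platonic combinator").
# nil and cons build functions that, given a combining step and a seed,
# behave exactly like a right fold over the corresponding concrete list.
nil = lambda step, seed: seed
cons = lambda head, tail: lambda step, seed: step(head, tail(step, seed))

xs = cons(1, cons(2, cons(3, nil)))   # the "list" [1, 2, 3] as a function

total = xs(lambda h, acc: h + acc, 0)        # foldr (+) 0 -> 6
as_list = xs(lambda h, acc: [h] + acc, [])   # recover a concrete list
print(total, as_list)                        # 6 [1, 2, 3]
```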
Abstract:
This paper is part of a larger study assessing the adequacy of the use of multivariate statistical techniques in theses and dissertations on consumer behavior in the marketing area, produced at selected higher education institutions from 1997 to 2006. This paper focuses on regression and conjoint analysis, two techniques with great potential for use in marketing studies. The objective was to analyze whether the employment of these techniques suits the needs of the research problem presented, as well as to evaluate the level of success in meeting their premises. Overall, the results suggest the need for greater involvement of researchers in verifying all the theoretical precepts of application of the techniques classified in the category of investigation of dependence among variables.
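As a generic illustration of the kind of premise checking the study found lacking (a sketch, not the authors' procedure): before reporting a regression, one can at least test residual normality and homoscedasticity.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(2)
X = sm.add_constant(rng.normal(size=(100, 2)))   # simulated predictors
y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(size=100)

model = sm.OLS(y, X).fit()

# Premise checks of the kind the reviewed theses often skipped:
_, shapiro_p = stats.shapiro(model.resid)         # residual normality
_, bp_p, _, _ = het_breuschpagan(model.resid, X)  # homoscedasticity
print(f"Shapiro-Wilk p={shapiro_p:.3f}, Breusch-Pagan p={bp_p:.3f}")
```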
Abstract:
This article deals with the efficiency of fractional integration parameter estimators. The study was based on Monte Carlo experiments involving simulated stochastic processes with integration orders in the range (-1, 1). The evaluated estimation methods were classified into two groups: heuristic and semiparametric/maximum likelihood (ML). The study revealed that the comparative efficiency of the estimators, measured by the lowest mean squared error, depends on the stationarity/non-stationarity and persistence/anti-persistence conditions of the series. The ML estimator was shown to be superior for stationary persistent processes; the wavelet spectrum-based estimators were better for non-stationary mean-reverting and invertible anti-persistent processes; the weighted periodogram-based estimator was shown to be superior for non-invertible anti-persistent processes.
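To make "periodogram-based estimation" concrete, here is a minimal sketch of the Geweke-Porter-Hudak (GPH) log-periodogram regression, one standard semiparametric estimator of the fractional integration order d (the study's own estimator set may differ):

```python
import numpy as np

def gph_estimate(x, m=None):
    """GPH log-periodogram regression estimate of the integration order d."""
    n = len(x)
    m = m or int(n ** 0.5)                  # a common bandwidth choice
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    # periodogram at the first m Fourier frequencies
    fft = np.fft.fft(x - np.mean(x))
    I = (np.abs(fft[1:m + 1]) ** 2) / (2 * np.pi * n)
    # regress log I(lambda_j) on log(4 sin^2(lambda_j / 2)); the slope is -d
    regressor = np.log(4 * np.sin(freqs / 2) ** 2)
    slope = np.polyfit(regressor, np.log(I), 1)[0]
    return -slope

# white noise has d = 0, so the estimate should be near zero
print(gph_estimate(np.random.default_rng(3).normal(size=2048)))
```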
Abstract:
Discussions opposing the Theory of the Firm to the Theory of Stakeholders are contemporary and polemical. One focal point of such debates refers to which objective function companies should choose, whether that of the shareholders or that of the stakeholders, and whether it is possible to opt for both simultaneously. Several empirical studies have attempted to test a possible correlation between both functions, and there has not been any consensus so far. The objective of the present research is to examine a gap in such discussions: is there (or not) a subordination of the stakeholders' objective function to that of the shareholders? The research is empirical and analytical and employs quantitative methods. Hypotheses were tested and data analyzed using non-parametric (chi-square test) and parametric procedures (frequency, correlation coefficient). Secondary data were collected from the Economática database and from the Brazilian Institute of Social and Economic Analyses (IBASE) website for public companies that published their Social Balance Statements following the IBASE model from 1999 to 2006, a sample of 65 companies. To assess the shareholders' objective function, a proxy was created based on three indices: ROE (return on equity), Enterprise Value and Tobin's Q. To assess the stakeholders' objective function, a proxy was created employing the following IBASE social balance indices: internal (ISI), external (ISE) and environmental (IAM). The results showed no evidence of subordination of the stakeholders' objective function to that of the shareholders in the companies analyzed, negating initial expectations and calling for deeper investigation of the results. The main conclusion, that the hypothesized subordination does not take place, is limited to the sample investigated here and calls for ongoing research aimed at improvements that may lead to sample enlargement and, as a consequence, make feasible the application of other statistical techniques yielding a more thorough analysis of the studied phenomenon.
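A generic sketch of the non-parametric procedure mentioned above (illustrative data only, not the study's): a chi-square test of independence between two categorical classifications, e.g. a high/low shareholder proxy against a high/low stakeholder proxy.

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 cross-tabulation of companies: rows = shareholder proxy
# (high/low), columns = stakeholder proxy (high/low). Counts are invented.
table = [[18, 14],
         [15, 18]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}")  # large p -> no evidence of association
```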
Abstract:
Objective: The Assessing Cost-Effectiveness - Mental Health (ACE-MH) study aims to assess, from a health sector perspective, whether there are options for change that could improve the effectiveness and efficiency of Australia's current mental health services by directing available resources toward 'best practice' cost-effective services. Method: The use of standardized evaluation methods addresses the reservations expressed by many economists about the simplistic use of league tables based on economic studies confounded by differences in methods, context and setting. The cost-effectiveness ratio for each intervention is calculated using economic and epidemiological data. This includes systematic reviews and randomised controlled trials for efficacy, the Australian Surveys of Mental Health and Wellbeing for current practice, and a combination of trials and longitudinal studies for adherence. The cost-effectiveness ratios are presented as cost (A$) per disability-adjusted life year (DALY) saved, with a 95% uncertainty interval based on Monte Carlo simulation modelling. An assessment of interventions on 'second filter' criteria ('equity', 'strength of evidence', 'feasibility' and 'acceptability to stakeholders') allows broader concepts of 'benefit' to be taken into account, as well as factors that might influence policy judgements in addition to cost-effectiveness ratios. Conclusions: The main limitation of the study is in the translation of the effect size from trials into a change in the DALY disability weight, which required the use of newly developed methods. While comparisons within disorders are valid, comparisons across disorders should be made with caution. A series of articles is planned to present the results.
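A minimal sketch of the Monte Carlo step described above, with invented distributions standing in for the study's economic and epidemiological inputs: sample cost and DALYs averted, form the ratio, and read the 95% uncertainty interval off the percentiles.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Hypothetical input distributions (placeholders, not the ACE-MH inputs):
cost = rng.normal(5_000_000, 500_000, n)    # A$ programme cost
dalys = rng.lognormal(np.log(400), 0.3, n)  # DALYs averted

ratio = cost / dalys                        # A$ per DALY saved
lo, mid, hi = np.percentile(ratio, [2.5, 50, 97.5])
print(f"A${mid:,.0f}/DALY (95% UI A${lo:,.0f} - A${hi:,.0f})")
```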
Abstract:
In the Apis mellifera post-genomic era, RNAi protocols have been used in functional approaches. However, sample manipulation and invasive methods such as injection of double-stranded RNA (dsRNA) can compromise physiology and survival. To circumvent these problems, we developed a non-invasive method for honeybee gene knockdown, using a well-established vitellogenin RNAi system as a model. Second instar larvae received dsRNA for vitellogenin (dsVg-RNA) in their natural diet. As an exogenous control, larvae received dsRNA for GFP (dsGFP-RNA); untreated larvae formed another control group. Around 60% of the treated larvae developed naturally until adult emergence when 0.5 μg of dsVg-RNA or dsGFP-RNA was offered, whereas no larvae that received 3.0 μg of dsRNA reached pupal stages. Diet dilution did not affect the removal rates. Viability depends not only on the delivered doses but also on the internal conditions of the colonies. The weights of treated and untreated groups showed no statistical differences, indicating that RNAi ingestion did not elicit drastic collateral effects. Approximately 90% of vitellogenin transcripts from 7-day-old workers were silenced compared to controls. A large number of samples can be handled in a relatively short time, and smaller quantities of RNAi molecules are used compared to invasive methods. These advantages culminate in a versatile and cost-effective approach.
Abstract:
This paper describes algorithms that can identify patterns of brain structure and function associated with Alzheimer's disease, schizophrenia, normal aging, and abnormal brain development, based on imaging data collected in large human populations. Extraordinary information can be discovered with these techniques: dynamic brain maps reveal how the brain grows in childhood, how it changes in disease, and how it responds to medication. Genetic brain maps can reveal genetic influences on brain structure, shedding light on the nature-nurture debate and the mechanisms underlying inherited neurobehavioral disorders. Recently, we created time-lapse movies of brain structure for a variety of diseases. These identify complex, shifting patterns of brain structural deficits, revealing where, and at what rate, the path of brain deterioration in illness deviates from normal. Statistical criteria can then identify situations in which these changes are abnormally accelerated, or when medication or other interventions slow them. In this paper, we focus on describing our approaches to mapping structural changes in the cortex. These methods have already been used to reveal the profile of brain anomalies in studies of dementia, epilepsy, depression, childhood- and adult-onset schizophrenia, bipolar disorder, attention-deficit/hyperactivity disorder, fetal alcohol syndrome, Tourette syndrome, Williams syndrome, and in methamphetamine abusers. Specifically, we describe an image analysis pipeline known as cortical pattern matching that helps compare and pool cortical data over time and across subjects. Statistics are then defined to identify brain structural differences between groups, including localized alterations in cortical thickness, gray matter density (GMD), and asymmetries in cortical organization. Subtle features, not seen in individual brain scans, often emerge when population-based brain data are averaged in this way. Illustrative examples are presented to show the profound effects of development and various diseases on the human cortex. Dynamically spreading waves of gray matter loss are tracked in dementia and schizophrenia, and these sequences are related to normally occurring changes in healthy subjects of various ages.
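As a generic illustration of the group-difference statistics described above (not the cortical pattern matching pipeline itself): a permutation test on a per-vertex measure such as gray matter density avoids parametric assumptions when maps from two groups are compared.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical gray matter density at one cortical vertex for two groups.
patients = rng.normal(0.45, 0.05, 30)
controls = rng.normal(0.50, 0.05, 30)

observed = controls.mean() - patients.mean()
pooled = np.concatenate([patients, controls])

# Permutation null: shuffle group labels, recompute the mean difference.
perm = np.array([
    (lambda s: s[30:].mean() - s[:30].mean())(rng.permutation(pooled))
    for _ in range(10_000)
])
p_value = np.mean(np.abs(perm) >= abs(observed))
print(f"difference={observed:.3f}, permutation p={p_value:.4f}")
```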