55 results for Statisticians
Abstract:
Hypertension is a common, modifiable and heritable cardiovascular risk factor. Some rare monogenic forms of hypertension have been described, but the majority of patients suffer from "essential" hypertension, for whom the underlying pathophysiological mechanism is not clear. Essential hypertension is a complex trait, involving multiple genes and environmental factors. Recently, progress in the identification of common genetic variants associated with blood pressure and hypertension has been made thanks to large-scale international collaborative projects involving geneticists, epidemiologists, statisticians and clinicians. In this article, we review some basic genetic concepts and the main research methods used to study the genetics of hypertension, as well as selected recent findings in this field.
Abstract:
Continuing developments in science and technology mean that the amount of information forensic scientists are able to provide for criminal investigations is ever increasing. The commensurate increase in complexity creates difficulties for scientists and lawyers with regard to evaluation and interpretation, notably with respect to issues of inference and decision. Probability theory, implemented through graphical methods, and specifically Bayesian networks, provides powerful methods to deal with this complexity. Extensions of these methods to elements of decision theory provide further support and assistance to the judicial system. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science provides a unique and comprehensive introduction to the use of Bayesian decision networks for the evaluation and interpretation of scientific findings in forensic science, and for the support of decision-makers in their scientific and legal tasks. The book:
- includes self-contained introductions to probability and decision theory;
- develops the characteristics of Bayesian networks, object-oriented Bayesian networks and their extension to decision models;
- features implementation of the methodology with reference to commercial and academically available software;
- presents standard networks and their extensions that can be easily implemented and that can assist in the reader's own analysis of real cases;
- provides a technique for structuring problems and organizing data based on methods and principles of scientific reasoning;
- contains a method for the construction of coherent and defensible arguments for the analysis and evaluation of scientific findings and for decisions based on them;
- is written in a lucid style, suitable for forensic scientists and lawyers with minimal mathematical background;
- includes a foreword by Ian Evett.
The clear and accessible style of this second edition makes the book ideal for all forensic scientists, applied statisticians and graduate students wishing to evaluate forensic findings from the perspective of probability and decision analysis. It will also appeal to lawyers and other scientists and professionals interested in the evaluation and interpretation of forensic findings, including decision making based on scientific information.
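As an illustration of the decision-theoretic extension the book describes, here is a minimal, hypothetical sketch in Python: a posterior over two competing propositions is combined with a utility table, and the action with the highest expected utility is chosen. All probabilities and utilities are invented for illustration, not taken from the book.

```python
# Minimal sketch of a Bayesian decision step: given a posterior over two
# competing propositions, choose the action with the highest expected utility.
# All probabilities and utilities below are hypothetical illustration values.

posterior = {"Hp": 0.85, "Hd": 0.15}  # posterior over the propositions

# utilities[action][state]: consequence of taking `action` when `state` holds
utilities = {
    "report_match":    {"Hp": 10.0, "Hd": -100.0},  # very costly if wrong
    "report_no_match": {"Hp": -5.0, "Hd": 5.0},
}

def expected_utility(action):
    return sum(posterior[s] * utilities[action][s] for s in posterior)

best = max(utilities, key=expected_utility)
print({a: round(expected_utility(a), 2) for a in utilities}, "->", best)
```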
Abstract:
Cancer omics data are being generated at an exponential rate and associated with clinical variables, and important findings can be extracted with bioinformatics approaches and then experimentally validated. Many of these findings concern a specific class of non-coding RNA molecules called microRNAs (miRNAs), post-transcriptional regulators of mRNA expression. The related research field is quite heterogeneous: bioinformaticians, clinicians, statisticians and biologists, as well as data miners and engineers, collaborate to curate stored data and to integrate the output of the latest Next-Generation Sequencing technologies. Here we review the main research findings on miRNAs from the first 10 years of colon cancer research, with an emphasis on possible uses in clinical practice. This review is intended as a road map through the jungle of publications on miRNAs in colorectal cancer, focusing on data availability and on new ways to generate biologically relevant information from these huge amounts of data.
Abstract:
The focus of my PhD research was the concept of modularity. Over the last 15 years, modularity has become a classic term in different fields of biology. At the conceptual level, a module is a set of interacting elements that remain mostly independent of the elements outside the module. I used modular analysis techniques to study gene expression evolution in vertebrates. In particular, I identified "natural" modules of gene expression in mouse and human, and I showed that the expression of organ-specific and system-specific genes tends to be conserved between vertebrates as distant as mammals and fishes. Also using a modular approach, I studied patterns of developmental constraints on transcriptome evolution. I showed that neither of the two commonly accepted models of the evolution of embryonic development ("evo-devo") is exclusively valid. In particular, I found that the conservation of the sequences of regulatory regions is highest during the mid-development of zebrafish, which supports the "hourglass model". In contrast, gene duplication events and the introduction of new genes are rarest in early development, which supports the "early conservation model". In addition to the biological insights into transcriptome evolution, I have also discussed in detail the advantages of modular approaches in large-scale data analysis. Moreover, I re-analyzed several studies (published in high-ranking journals) and showed that their conclusions do not hold up under detailed analysis. This demonstrates that complex analysis of high-throughput data requires co-operation between biologists, bioinformaticians, and statisticians.
Abstract:
Forensic scientists face increasingly complex inference problems for evaluating likelihood ratios (LRs) for an appropriate pair of propositions. Up to now, scientists and statisticians have derived LR formulae using an algebraic approach. However, this approach reaches its limits when addressing cases with an increasing number of variables and dependence relationships between these variables. In this study, we suggest using a graphical approach, based on the construction of Bayesian networks (BNs). We first construct a BN that captures the problem, and then deduce the expression for calculating the LR from this model to compare it with existing LR formulae. We illustrate this idea by applying it to the evaluation of an activity level LR in the context of the two-trace transfer problem. Our approach allows us to relax assumptions made in previous LR developments, produce a new LR formula for the two-trace transfer problem and generalize this scenario to n traces.
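As a minimal sketch of this graphical approach, consider a three-node chain H → T → E (proposition, latent transfer event, observed evidence); the LR follows by marginalizing over the transfer node. The structure and all conditional probabilities below are hypothetical and far simpler than the actual two-trace transfer problem treated in the paper.

```python
# Minimal sketch: computing a likelihood ratio from a small Bayesian network
# H -> T -> E (proposition -> transfer event -> observed evidence) by
# marginalizing over the latent transfer node. All CPT values are hypothetical.

p_T_given_H = {"Hp": 0.7, "Hd": 0.05}   # P(transfer | proposition)
p_E_given_T = {True: 0.9, False: 0.01}  # P(evidence observed | transfer)

def p_E_given_H(h):
    """P(E | H=h) = sum over T of P(E | T) * P(T | H=h)."""
    pt = p_T_given_H[h]
    return p_E_given_T[True] * pt + p_E_given_T[False] * (1 - pt)

LR = p_E_given_H("Hp") / p_E_given_H("Hd")
print(f"LR = {LR:.2f}")  # support the evidence gives Hp over Hd
```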
Abstract:
1. Biogeographical models of species' distributions are essential tools for assessing impacts of changing environmental conditions on natural communities and ecosystems. Practitioners need more reliable predictions to integrate into conservation planning (e.g. reserve design and management).
2. Most models still largely ignore or inappropriately take into account important features of species' distributions, such as spatial autocorrelation, dispersal and migration, and biotic and environmental interactions. Whether distributions of natural communities or ecosystems are better modelled by assembling individual species' predictions in a bottom-up approach or modelled as collective entities is another important issue. An international workshop was organized to address these issues.
3. We discuss more specifically six issues in a methodological framework for generalized regression: (i) links with ecological theory; (ii) optimal use of existing data and artificially generated data; (iii) incorporating spatial context; (iv) integrating ecological and environmental interactions; (v) assessing prediction errors and uncertainties; and (vi) predicting distributions of communities or collective properties of biodiversity.
4. Synthesis and applications. Better predictions of the effects of impacts on biological communities and ecosystems can emerge only from more robust species' distribution models and better documentation of the uncertainty associated with these models. An improved understanding of the causes of species' distributions, especially at their range limits, as well as of ecological assembly rules and ecosystem functioning, is necessary if further progress is to be made. A better collaborative effort between theoretical and functional ecologists, ecological modellers and statisticians is required to reach these goals.
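To make point 3 concrete, here is a minimal sketch of a species' distribution model fitted by generalized regression, implemented as a logistic GLM with scikit-learn on simulated presence/absence data. The covariates, effect sizes and prediction site are hypothetical.

```python
# Minimal sketch of a species' distribution model via generalized regression
# (logistic GLM). Data are simulated; covariates and effects are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
temperature = rng.normal(10, 3, n)    # hypothetical environmental covariates
rainfall = rng.normal(800, 150, n)

# Simulated presence/absence with a known response to the covariates
logit = -2 + 0.4 * (temperature - 10) + 0.005 * (rainfall - 800)
presence = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([temperature, rainfall])
model = LogisticRegression().fit(X, presence)

# Predicted probability of presence at a new site
print(model.predict_proba([[12.0, 900.0]])[0, 1])
```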
Abstract:
This dissertation examines knowledge and industrial knowledge creation processes. It looks at the way knowledge is created in industrial processes from data, which is transformed into information and finally into knowledge. In the context of this dissertation, the main tools for industrial knowledge creation are statistical methods. The dissertation strives to define industrial statistics. This is done using an expert opinion survey sent to a number of industrial statisticians; the survey was conducted to create a definition for this field of applied statistics and to demonstrate the wide applicability of statistical methods to industrial problems. In this part of the dissertation, traditional methods of industrial statistics are introduced. As industrial statistics is the main tool for knowledge creation, the basics of statistical decision making and statistical modeling are also included. The widely known Data-Information-Knowledge-Wisdom (DIKW) hierarchy serves as the theoretical background for this dissertation: the way that data is transformed into information, information into knowledge and knowledge finally into wisdom is used as the theoretical frame of reference. Some scholars have, however, criticized the DIKW model. Based on these different perceptions of the knowledge creation process, a new knowledge creation process based on statistical methods is proposed. In the context of this dissertation, data is the source of knowledge in industrial processes. Because of this, the mathematical categorization of data into continuous and discrete types is explained, as are the different methods for gathering data from processes. Two methods of data gathering are used in this dissertation: surveys and measurements. The enclosed publications provide an example of the wide applicability of statistical methods in industry. In these publications, data are gathered using surveys and measurements, and the publications have been chosen so that each employs different statistical methods to analyze the data; there are some similarities between the analysis methods used, but mainly different methods are applied. Based on this dissertation, the use of statistical methods for industrial knowledge creation is strongly recommended: with statistical methods it is possible to handle large datasets, and the results of different types of statistical analysis can easily be transformed into knowledge.
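As one concrete example of a traditional industrial-statistics tool of the kind the dissertation surveys, the sketch below computes a Shewhart X-bar control chart (our illustrative choice, not necessarily a method from the enclosed publications): raw measurements (data) are condensed into control limits and out-of-control signals (information a process owner can act on).

```python
# Minimal sketch of a Shewhart X-bar control chart on simulated measurements:
# subgroup means are compared against 3-sigma control limits.
import numpy as np

rng = np.random.default_rng(1)
samples = rng.normal(loc=50.0, scale=2.0, size=(20, 5))  # 20 subgroups of 5

xbar = samples.mean(axis=1)                       # subgroup means
center = xbar.mean()                              # overall process mean
sigma_xbar = samples.std(ddof=1) / np.sqrt(samples.shape[1])
ucl, lcl = center + 3 * sigma_xbar, center - 3 * sigma_xbar

signals = np.where((xbar > ucl) | (xbar < lcl))[0]
print(f"CL={center:.2f}  LCL={lcl:.2f}  UCL={ucl:.2f}  signals: {signals}")
```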
Abstract:
Participation in street running races has increased, which makes it necessary to detect risks prior to physical exertion. Objective: To identify behavioural risk factors and readiness among those registered for a race. Method: Cross-sectional study of amateur runners aged 18-64 years. Digital survey with IPAQ, PARQ+ and STEP modules. Systematic random sampling with n=510, for an expected physical inactivity of 35% (±5%). Physical activity level, (hazardous) alcohol consumption, fruit, vegetable, tobacco and salt intake, and readiness were assessed. Results: Compliance with physical activity recommendations was 97.4%; 2.4% consumed optimal levels of fruit or vegetables (with differences by age, sex and socioeconomic stratum), 3.7% smoked and 44.1% reported hazardous alcohol consumption. A positive PARQ+ was reported by 19.8%, and 5.7% required supervision. There were differences by work and study status. Discussion: Amateur runners meet the physical activity level but fall short on the other factors. One safety strategy for street athletics is to assess lifestyle-related risk factors as well as readiness.
Abstract:
The aim of phase II single-arm clinical trials of a new drug is to determine whether it has sufficiently promising activity to warrant further development. In recent years, Bayesian statistical methods have been proposed and used for this purpose. Bayesian approaches are ideal for early-phase trials, as they take into account the information that accrues during a trial: predictive probabilities are updated and so become more accurate as the trial progresses. Suitable priors can act as pseudo samples, which make small clinical trials more informative, so patients have a better chance of receiving better treatments. The goal of this paper is to provide a tutorial for statisticians who are using Bayesian methods for the first time, and for investigators who have some statistical background. In addition, real data from three clinical trials are presented as examples to illustrate how to conduct a Bayesian analysis for phase II single-arm clinical trials with binary outcomes.
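A minimal sketch of the machinery described above, assuming a standard Beta-Binomial model: a Beta prior acts as a pseudo sample, the posterior is updated at an interim look, and the predictive probability of final success is computed. All design numbers (prior, target rate, sample sizes, success threshold) are hypothetical and not taken from the three trials.

```python
# Minimal sketch of a Bayesian interim analysis for a phase II single-arm
# trial with a binary endpoint, using a conjugate Beta-Binomial model.
from scipy.stats import beta, betabinom

a0, b0 = 1, 1                  # Beta(1, 1) prior ~ 2 pseudo patients
p_target = 0.20                # response rate the drug must beat
n_interim, x_interim = 20, 6   # 6 responses in the first 20 patients
n_total = 40                   # planned total enrolment

# Posterior after the interim data
a, b = a0 + x_interim, b0 + (n_interim - x_interim)
print("P(p > target | data) =", beta.sf(p_target, a, b))

# Predictive probability of final "success", defined here as at least 12
# responses out of 40; future responses are Beta-Binomial under the posterior.
n_remaining = n_total - n_interim
needed = 12 - x_interim
print("Predictive P(success) =", betabinom.sf(needed - 1, n_remaining, a, b))
```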
Abstract:
Concerns about potentially misleading reporting of pharmaceutical industry research have surfaced many times. The potential for duality (and thereby conflict) of interest is only too clear when one considers the sums of money required for the discovery, development and commercialization of new medicines. As the ability of major, mid-size and small pharmaceutical companies to innovate has waned, as evidenced by the seemingly relentless year-on-year decline in the number of new medicines approved by the Food and Drug Administration and the European Medicines Agency, not only has the cost per newly approved medicine risen; so too has public and media concern about the extent to which the pharmaceutical industry is open and honest about the efficacy, safety and quality of the drugs we manufacture and sell. In 2005, an editorial in the Journal of the American Medical Association made clear that, so great was the journal's concern about misleading reporting of industry-sponsored studies, henceforth no article would be published that was not also guaranteed by an independent statistical analysis. We examine the precursors to this editorial, as well as its immediate and lasting effects for statisticians, for the manner in which statistical analysis is carried out, and for the industry more generally.
Abstract:
Background: Medication errors are common in primary care and are associated with considerable risk of patient harm. We tested whether a pharmacist-led, information-technology-based intervention was more effective than simple feedback in reducing the number of patients at risk from hazardous prescribing and inadequate blood-test monitoring of medicines, 6 months after the intervention.
Methods: In this pragmatic, cluster randomised trial, general practices in the UK were stratified by research site and list size, and randomly assigned by a web-based randomisation service, in block sizes of two or four, to one of two groups. The practices were allocated to either computer-generated simple feedback for at-risk patients (control) or a pharmacist-led information technology intervention (PINCER), composed of feedback, educational outreach, and dedicated support. The allocation was masked to general practices, patients, pharmacists, researchers, and statisticians. Primary outcomes were the proportions of patients, at 6 months after the intervention, with any of three clinically important errors: non-selective non-steroidal anti-inflammatory drugs (NSAIDs) prescribed to those with a history of peptic ulcer without co-prescription of a proton-pump inhibitor; β blockers prescribed to those with a history of asthma; long-term prescription of an angiotensin converting enzyme (ACE) inhibitor or loop diuretics to those 75 years or older without assessment of urea and electrolytes in the preceding 15 months. The cost per error avoided was estimated by incremental cost-effectiveness analysis. This study is registered with Controlled-Trials.com, number ISRCTN21785299.
Findings: 72 general practices with a combined list size of 480,942 patients were randomised. At 6 months' follow-up, patients in the PINCER group were significantly less likely to have been prescribed a non-selective NSAID if they had a history of peptic ulcer without gastroprotection (OR 0.58, 95% CI 0.38–0.89); a β blocker if they had asthma (0.73, 0.58–0.91); or an ACE inhibitor or loop diuretic without appropriate monitoring (0.51, 0.34–0.78). PINCER has a 95% probability of being cost effective if the decision-maker's ceiling willingness to pay reaches £75 per error avoided at 6 months.
Interpretation: The PINCER intervention is an effective method for reducing a range of medication errors in general practices with computerised clinical records.
Funding: Patient Safety Research Portfolio, Department of Health, England.
Abstract:
The societal need for reliable climate predictions and a proper assessment of their uncertainties is pressing. Uncertainties arise not only from initial conditions and forcing scenarios, but also from model formulation. Here, we identify and document three broad classes of problems, each representing what we regard to be an outstanding challenge in the area of mathematics applied to the climate system. First, there is the problem of the development and evaluation of simple physically based models of the global climate. Second, there is the problem of the development and evaluation of the components of complex models such as general circulation models. Third, there is the problem of the development and evaluation of appropriate statistical frameworks. We discuss these problems in turn, emphasizing the recent progress made by the papers presented in this Theme Issue. Many pressing challenges in climate science require closer collaboration between climate scientists, mathematicians and statisticians. We hope the papers contained in this Theme Issue will act as inspiration for such collaborations and for setting future research directions.
Abstract:
Background: Despite the promising benefits of adaptive designs (ADs), their routine use, especially in confirmatory trials, lags behind the prominence given to them in the statistical literature. Much of the previous research on barriers and potential facilitators to the use of ADs has been driven by a pharmaceutical drug development perspective, with little focus on trials in the public sector. In this paper, we explore key stakeholders' experiences, perceptions and views on barriers and facilitators to the use of ADs in publicly funded confirmatory trials.
Methods: Semi-structured, in-depth interviews of key stakeholders in clinical trials research (CTU directors, funding board and panel members, statisticians, regulators, chief investigators, data monitoring committee members and health economists) were conducted by telephone or face-to-face, predominantly in the UK. We purposively selected participants sequentially to maximise variation in views and experiences, and employed the framework approach to analyse the qualitative data.
Results: We interviewed 27 participants. Some of the perceived barriers were: lack of knowledge and experience coupled with a paucity of case studies; lack of applied training; a degree of reluctance to use ADs; lack of bridge funding and time to support design work; lack of statistical expertise; some anxiety about the impact of early trial stopping on researchers' employment contracts; lack of understanding of the acceptable scope of ADs and of when ADs are appropriate; and statistical and practical complexities. Reluctance to use ADs seemed to be influenced by therapeutic area, unfamiliarity, concerns about their robustness in decision-making and the acceptability of findings to change practice, and perceived complexities and the proposed type of AD, among others.
Conclusions: Considerable multifaceted, individual and organisational obstacles remain to be addressed to improve uptake and the successful implementation of ADs when appropriate. Nevertheless, the positive change in attitudes and the receptiveness of public funders towards the appropriate use of ADs are encouraging, and are a stepping stone towards the future utilisation of ADs by researchers.
Abstract:
Alterations in the neuropsychomotor development of children are not rare and can manifest with varying intensity at different stages of development. In this context, maternal risk factors may contribute to the appearance of these alterations. A number of studies have reported that diagnosing neuropsychomotor development delay is not an easy task, especially in the basic public health network; diagnosis requires effective, low-cost, and easy-to-apply procedures. The Denver Developmental Screening Test, first published in 1967, is currently used in several countries; it has been revised and renamed the Denver II Test, and meets the aforementioned criteria. Accordingly, the aim of this study was to apply the Denver II Test to verify the prevalence of suspected neuropsychomotor development delay in children between the ages of 0 and 12 months, and to correlate it with the following maternal risk factors: family income, schooling, age at pregnancy, drug use during pregnancy, gestational age, gestational problems, type of delivery, and the desire to have children. For data collection, performed during the first 6 months of 2004, a clinical assessment was made of 398 children selected by pediatricians and the nursing team of each public health unit. The parents or guardians were then asked to complete a structured questionnaire to determine possible risk indicators of neuropsychomotor development delay, and finally the Denver II Developmental Screening Test (DDST) was applied. The data were analyzed using Statistical Package for Social Science (SPSS) software, version 6.1, with the confidence interval set at 95%. The Denver II Test yielded both normal and questionable results; the latter suggest compromised neuropsychomotor development in the children examined and deserve further investigation. The correlation of the results with the pre-established maternal risk variables family income, mother's schooling, age at pregnancy, drug use during pregnancy and gestational age was strongly significant; the other maternal risk variables (gestational problems, type of delivery and desire to have children) were not significant. Using an adjusted logistic regression model, we estimated that a child is most likely to have suspected neuropsychomotor development delay when the mother has 4 years of schooling or less, is younger than 20 years, and used drugs during pregnancy. This study produced two manuscripts: one published in Acta Cirúrgica Brasileira, in which an analysis was performed of children with suspected neuropsychomotor development delay in the city of Natal, Brazil; the other (to be published) analyzed the magnitude of the independent variable maternal schooling associated with neuropsychomotor development delay, every 3 months during the first twelve months of life of the children selected. The results of the present study reinforce the multifactorial character of development and the cumulative effect of maternal risk factors, and show the need for a regional policy that promotes low-cost programs for the community, involving children at risk of neuropsychomotor development delay. Moreover, they suggest the need for health professionals better qualified to monitor child development.
This was an inter- and multidisciplinary study, with the integrated participation of doctors, nurses, nursing assistants and professionals from other areas, such as statisticians and information technology professionals, which met all the requirements of the Postgraduate Program in Health Sciences of the Federal University of Rio Grande do Norte.
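As an illustration of the kind of adjusted logistic regression model described in this abstract, the sketch below fits the three retained maternal risk factors on simulated data; the prevalences and coefficients are hypothetical, not the study's estimates.

```python
# Minimal sketch of an adjusted logistic regression: probability of suspected
# developmental delay as a function of maternal risk factors. Simulated data;
# the effect sizes are hypothetical, not the study's results.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 398                                # same size as the study sample
low_schooling = rng.random(n) < 0.30   # few years of maternal schooling
young_mother = rng.random(n) < 0.25    # mother younger than 20 at pregnancy
drug_use = rng.random(n) < 0.10        # drug use during pregnancy

logit = -2.0 + 1.2 * low_schooling + 0.8 * young_mother + 1.0 * drug_use
delay = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(np.column_stack([low_schooling, young_mother, drug_use]).astype(float))
fit = sm.Logit(delay.astype(float), X).fit(disp=0)
print(np.exp(fit.params[1:]))  # adjusted odds ratios for the three factors
```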
Abstract:
Pressure, tension and overwork are noticeably frequent in health professionals' routines. Work in ward areas demands deep attention and vigilance. It is therefore essential to look specifically at humanization directed at health professionals, considering that taking care of other human beings is the essence of their job. This study analyzed the psychic health levels of health professionals and the stress they are submitted to, providing a debate about humanization in six public hospitals (three awarded for actions of humanization, and three not awarded) in Rio Grande do Norte state, Brazil. A study was carried out with 126 active health professionals (doctors, nurses, psychologists, nutritionists and social workers) working in the ward areas of their respective institutions. This multidisciplinary thesis counted on the support of statisticians (for sample calculation and data analysis), psychologists, social workers and administrators (linked to the human resources sector of each hospital). A cross-sectional study was performed, taking into consideration both quantitative and qualitative factors. The tools used were a semi-structured questionnaire covering socio-demographic characteristics, work and humanization; Lipp's Stress Symptoms Inventory for Adults (ISSL); and Goldberg's General Health Questionnaire (QSG). The workers are predominantly women (84.9%), married (54.8%), between 46 and 55 years old (40.5%), working in the same institution for more than 20 years (22.2%) or for 16 to 20 years (20.6%). They work 40 hours a week (71.4%) and have multiple jobs (61.9%). Although the global psychic health of most of these individuals is at a good level, a significant number are gradually getting worse with regard to psychic stress (F1), as shown by the QSG (54.7%), and to stress, as shown by the ISSL (42.1%). By category, nurses (41.5%), nutritionists (20.8%), and doctors and social workers (18.9% each) were among the most affected. Regarding general health (F6), 63% of the awarded hospitals and 70% of the non-awarded ones presented good health levels (scores ranging from 5 to 50%); in the same groups, 25% and 20% respectively fell in scores between 55 and 90%, which means they are in a worsening phase. Whether or not a hospital is awarded or well recognized does not interfere with health professionals' stress levels or psychic health. From what was heard from these individuals, it was possible to verify that they know little about humanization, since few of them identify or know that the service they offer is in the process of adopting Ministerial Policies. The need to develop actions aimed at workers' health was also detected. These results show the importance of investing more in programs directed at workers' well-being: these professionals deal with other people's health, and it is difficult for them to offer high-quality assistance without suitable physical, psychological and material conditions for their work. Finally, it is fair to highlight the need for investment in actions that provide humanized care to health professionals, mainly preventive care for their health and quality of life at work.