932 results for Good manufacturing practice
Abstract:
In a retrospective review, the telemedical management of 65 outpatients from a randomized controlled trial (RCT) of telemedicine for non-urgent referrals to a consultant neurologist was compared with the management of 76 patients seen face to face in the same trial, with that of 150 outpatients seen in the neurology clinics of district general hospitals, and with that of 102 neurological outpatients seen by general physicians. Outcome measures were the numbers of investigations and of patient reviews. The telemedicine group did not differ significantly from the 150 patients seen face to face by neurologists in hospital clinics in terms of either the number of investigations or the number of reviews they received. Patients from the RCT seen face to face had significantly fewer investigations than, but a similar number of reviews to, the 150 patients seen face to face by neurologists (the disparity in the number of investigations may explain the negative result for telemedicine in that RCT). Patients with neurological symptoms assessed by general physicians had significantly more investigations and were reviewed significantly more often than all the other groups. Patients from the RCT seen by telemedicine were not managed significantly differently from those seen face to face by neurologists in hospital clinics, but had significantly fewer investigations and follow-ups than those patients managed by general physicians. The results suggest that management of new neurological outpatients by neurologists using telemedicine is similar to that by neurologists using a face-to-face consultation, and is more efficient than management by general physicians.
Abstract:
Seasonal climate forecasting offers potential for improving management of crop production risks in the cropping systems of NE Australia. But how is this capability best connected to management practice? Over the past decade, we have pursued participative systems approaches involving simulation-aided discussion with advisers and decision-makers. This has led to the development of discussion support software as a key vehicle for facilitating infusion of forecasting capability into practice. In this paper, we set out the basis of our approach, its implementation and preliminary evaluation. We outline the development of the discussion support software Whopper Cropper, which was designed for, and in close consultation with, public and private advisers. Whopper Cropper consists of a database of simulation output and a graphical user interface to generate analyses of risks associated with crop management options. The charts produced provide conversation pieces for advisers to use with their farmer clients in relation to the significant decisions they face. An example application, detail of the software development process and an initial survey of user needs are presented. We suggest that discussion support software is about moving beyond traditional notions of supply-driven decision support systems. Discussion support software is largely demand-driven and can complement participatory action research programs by providing cost-effective general delivery of simulation-aided discussions about relevant management actions. The critical role of farm management advisers and dialogue among key players is highlighted. We argue that the discussion support concept, as exemplified by the software tool Whopper Cropper and the group processes surrounding it, provides an effective means to infuse innovations, like seasonal climate forecasting, into farming practice. Crown Copyright (C) 2002 Published by Elsevier Science Ltd. All rights reserved.
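The abstract describes Whopper Cropper only at the architectural level: a database of crop-simulation output plus a graphical interface that turns it into risk charts for discussion. As a rough, hypothetical illustration of that kind of chart, and not the actual Whopper Cropper code or data, the Python sketch below plots probability-of-exceedance curves for simulated yields under two invented management options.

```python
# Hypothetical sketch (not Whopper Cropper itself): turning a table of
# crop-simulation output into a probability-of-exceedance chart that an
# adviser could discuss with a farmer client. Option names, yield
# distributions and units are assumptions made for illustration only.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# Simulated yields (t/ha) across seasons for two invented management options.
options = {
    "Sow early, high N": rng.normal(3.2, 0.9, 100),
    "Sow late, low N": rng.normal(2.6, 0.6, 100),
}

for label, yields in options.items():
    y = np.sort(yields)
    # Probability that simulated yield exceeds each value.
    exceedance = 1.0 - np.arange(1, len(y) + 1) / len(y)
    plt.step(y, exceedance, where="post", label=label)

plt.xlabel("Simulated yield (t/ha)")
plt.ylabel("Probability of exceedance")
plt.title("Risk comparison of crop management options")
plt.legend()
plt.show()
```

Curves of this kind are the sort of "conversation pieces" the abstract refers to: an adviser and a farmer client can compare the upside and downside risk of each option at a glance.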
Abstract:
The present paper reviews the findings of 30 years of verbal/manual dual task studies, the method most commonly used to assess lateralization of speech production in non-clinical samples. Meta-analysis of 64 results revealed that both the type of manual task used and the nature of the practice given influence the size of the laterality effect. A meta-analysis of 36 results examining the effect size of sex differences in estimates of lateralization of speech production indicated that males appear to show slightly larger laterality effects than females. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
Background: Augmentation strategies in schizophrenia treatment remain an important issue because, despite the introduction of several new antipsychotics, many patients remain treatment resistant. The aim of this study was to undertake a systematic review and meta-analysis of the safety and efficacy of one frequently used adjunctive compound: carbamazepine. Data sources and study selection: Randomized controlled trials comparing carbamazepine (as a sole or as an adjunctive compound) with placebo or no intervention in participants with schizophrenia or schizoaffective disorder were searched for by accessing 7 electronic databases, cross-referencing publications cited in pertinent studies, and contacting drug companies that manufacture carbamazepine. Method: The identified studies were independently inspected and their quality assessed by 2 reviewers. Because the study results were generally incompletely reported, original patient data were requested from the authors; data were received for 8 of the 10 randomized controlled trials included in the present analysis, allowing for a reanalysis of the primary data. Dichotomous variables were analyzed using the Mantel-Haenszel odds ratio, and continuous data were analyzed using standardized mean differences, both specified with 95% confidence intervals. Results: Ten studies (total N = 283 subjects) were included. Carbamazepine was not effective in preventing relapse in the only randomized controlled trial that compared carbamazepine monotherapy with placebo. Carbamazepine tended to be less effective than perphenazine in the only trial comparing carbamazepine with an antipsychotic. Although there was a trend indicating a benefit from carbamazepine as an adjunct to antipsychotics, this trend did not reach statistical significance. Conclusion: At present, this augmentation strategy cannot be recommended for routine use. The most promising targets for future trials are patients with excitement, aggression, and schizoaffective disorder bipolar type.
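For readers unfamiliar with the pooling method named above, the sketch below shows how a Mantel-Haenszel pooled odds ratio combines dichotomous outcomes across stratified 2x2 tables. The per-trial counts are invented for illustration and are not data from this review.

```python
# Minimal sketch of the Mantel-Haenszel pooled odds ratio for combining
# dichotomous outcomes across trials. The counts below are hypothetical,
# not data from the carbamazepine review.

def mantel_haenszel_or(tables):
    """Each table is (a, b, c, d): events/non-events in treatment and control arms."""
    numerator = 0.0
    denominator = 0.0
    for a, b, c, d in tables:
        n = a + b + c + d
        numerator += a * d / n
        denominator += b * c / n
    return numerator / denominator

# Hypothetical per-trial 2x2 counts (responders/non-responders, adjunct vs control).
trials = [(12, 8, 9, 11), (7, 13, 6, 14), (10, 10, 8, 12)]
print(f"Pooled Mantel-Haenszel OR: {mantel_haenszel_or(trials):.2f}")
```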
Abstract:
The Load-Unload Response Ratio (LURR) method is an intermediate-term earthquake prediction approach that has shown considerable promise. It involves calculating the ratio of a specified energy release measure during loading and unloading, where the loading and unloading periods are determined from the earth-tide-induced perturbations in the Coulomb Failure Stress on optimally oriented faults. In the lead-up to large earthquakes, high LURR values are frequently observed a few months or years prior to the event. These signals may have a similar origin to the observed accelerating seismic moment release (AMR) prior to many large earthquakes, or may be due to critical sensitivity of the crust when a large earthquake is imminent. As a first step towards studying the underlying physical mechanism for the LURR observations, numerical studies are conducted using the particle-based lattice solid model (LSM) to determine whether LURR observations can be reproduced. The model is initialized as a heterogeneous 2-D block made up of random-sized particles bonded by elastic-brittle links. The system is subjected to uniaxial compression from rigid driving plates on the upper and lower edges of the model. Experiments are conducted using both strain and stress control to load the plates. A sinusoidal stress perturbation is added to the gradual compressional loading to simulate loading and unloading cycles, and LURR is calculated. The results reproduce signals similar to those observed in earthquake prediction practice, with a high LURR value followed by a sudden drop prior to macroscopic failure of the sample. The results suggest that LURR provides a good predictor for catastrophic failure in elastic-brittle systems and motivate further research to study the underlying physical mechanisms and statistical properties of high LURR values. The results provide encouragement for earthquake prediction research and the use of advanced simulation models to probe the physics of earthquakes.
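In schematic form, using notation chosen here rather than taken from the paper, the ratio described above can be written as

\[
Y = \frac{X^{+}}{X^{-}}, \qquad X^{\pm} = \sum_{i \in P^{\pm}} E_i ,
\]

where \(P^{+}\) and \(P^{-}\) are the loading and unloading periods defined by the sign of the earth-tide-induced change in Coulomb Failure Stress on optimally oriented faults, \(E_i\) is the chosen energy-release measure for event \(i\), and values of \(Y\) well above 1 indicate a markedly stronger seismic response during loading than during unloading.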
Abstract:
Objectives: To study the influence of different diagnostic criteria on the prevalence of diabetes mellitus and characteristics of those diagnosed. Design and setting: Retrospective analysis of data from the general-practice-based Australian Diabetes Screening Study (January 1994 to June 1995). Participants: 5911 people with no previous diagnosis of diabetes, two or more symptoms or risk factors for diabetes, a random venous plasma glucose (PG) level > 5.5 mmol/L and a subsequent oral glucose tolerance test (OGTT) result. Main outcome measure: Prevalence of undiagnosed diabetes based on each of three sets of criteria: 1997 criteria of the American Diabetes Association (ADA), 1996 two-step screening strategy of the Australian Diabetes Society (ADS) (modified according to ADA recommendations about lowered diagnostic fasting PG level), and 1999 definition of the World Health Organization (WHO). Results: Prevalence estimates for undiagnosed diabetes using the American (ADA), Australian (ADS) and WHO criteria (95% CI) were 9.4% (8.7%-10.1%), 16.0% (15.3%-16.7%) and 18.1% (17.1%-19.1%), respectively. People diagnosed with diabetes by fasting PG level (common to all sets of criteria) were more likely to be male and younger than those diagnosed only by 2 h glucose challenge PG level (Australian and WHO criteria only). The Australian (ADS) stepwise screening strategy detected 88% of those who met the WHO criteria for diabetes, including about three-quarters of those with isolated post-challenge hyperglycaemia. Conclusion: The WHO criteria (which include an OGTT result) are preferable to the American (ADA) criteria (which rely totally on fasting PG level), as the latter underestimated the prevalence of undiagnosed diabetes by almost a half. The Australian (ADS) strategy identified most of those diagnosed with diabetes by WHO criteria.
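As a minimal sketch of how the fasting-only and OGTT-based rules compared above can disagree for a single patient record, the Python snippet below applies the two sets of cut-points. The thresholds used (fasting PG >= 7.0 mmol/L; 2-h PG >= 11.1 mmol/L) are the commonly cited 1997 ADA and 1999 WHO values, stated from general knowledge rather than taken from this abstract.

```python
# Sketch comparing a fasting-only rule with a rule that also uses the 2-h OGTT
# value. Thresholds are the widely published 1997 ADA / 1999 WHO cut-points
# (assumed here, not quoted from the abstract). Glucose values in mmol/L.

def ada_1997_diabetes(fasting_pg: float) -> bool:
    # ADA criteria rely on fasting plasma glucose alone.
    return fasting_pg >= 7.0

def who_1999_diabetes(fasting_pg: float, two_hour_pg: float) -> bool:
    # WHO criteria also capture isolated post-challenge hyperglycaemia.
    return fasting_pg >= 7.0 or two_hour_pg >= 11.1

# Example record with a normal fasting value but a high 2-h value.
print(ada_1997_diabetes(6.2))        # False
print(who_1999_diabetes(6.2, 12.4))  # True
```

A record like this one, the isolated post-challenge hyperglycaemia discussed in the abstract, is classified as diabetes by the WHO rule but missed by the fasting-only ADA rule, which is why the fasting-only criteria yield the lower prevalence estimate.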
Abstract:
This paper proposes a template for modelling complex datasets that integrates traditional statistical modelling approaches with more recent advances in statistics and modelling through an exploratory framework. Our approach builds on the well-known and long-standing idea of 'good practice in statistics' by establishing a comprehensive framework for modelling that focuses on exploration, prediction, interpretation and reliability assessment, a relatively new idea that allows individual assessment of predictions. The integrated framework we present comprises two stages. The first involves the use of exploratory methods to help visually understand the data and identify a parsimonious set of explanatory variables. The second encompasses a two-step modelling process, where the use of non-parametric methods such as decision trees and generalized additive models is promoted to identify important variables and their relationship with the response before a final predictive model is considered. We focus on fitting the predictive model using parametric, non-parametric and Bayesian approaches. This paper is motivated by a medical problem where interest focuses on developing a risk stratification system for morbidity of 1,710 cardiac patients given a suite of demographic, clinical and preoperative variables. Although the methods we use are applied specifically to this case study, they can be applied in any field, irrespective of the type of response.
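A minimal sketch of the two-step idea, using synthetic data and scikit-learn rather than the authors' actual case-study pipeline: a non-parametric learner (a decision tree) first screens for influential variables, and a parametric model (logistic regression) is then fitted on the selected variables for prediction.

```python
# Rough sketch of the two-step modelling process described above, not the
# authors' pipeline: non-parametric screening of variables, then a parametric
# predictive model. Data are synthetic (1710 rows echo the case-study size).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=1710, n_features=20, n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Step 1: use a decision tree to identify the most important candidate predictors.
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
top = np.argsort(tree.feature_importances_)[::-1][:5]

# Step 2: fit the final parametric predictive model on the selected variables.
model = LogisticRegression(max_iter=1000).fit(X_train[:, top], y_train)
print("Selected features:", top.tolist())
print("Held-out accuracy:", round(model.score(X_test[:, top], y_test), 3))
```

A generalized additive model or a Bayesian model could equally play either role, as the abstract notes; the point of the sketch is the separation of variable identification from the final predictive fit.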