Abstract:
Venous thromboembolism (VTE) is common and has a high impact on morbidity, mortality, and costs of care. Although most patients with VTE are aged ≥65 years, there are few data on medical outcomes in the elderly with VTE. The Swiss Cohort of Elderly Patients with VTE (SWITCO65+) is a prospective multicenter cohort study of in- and outpatients aged ≥65 years with acute VTE from all five Swiss university hospitals and four high-volume non-university hospitals. The goal is to examine which clinical and biological factors and processes of care drive short- and long-term medical outcomes, health-related quality of life, and medical resource utilization in elderly patients with acute VTE. The cohort also includes a large biobank with biological material from each participant. From September 2009 to March 2012, 1,863 elderly patients with VTE were screened and 1,003 (53.8 %) were enrolled in the cohort. Overall, 51.7 % of patients were aged ≥75 years and 52.7 % were men. By October 16, 2012, after an average follow-up time of 512 days, 799 (79.7 %) patients were still actively participating. SWITCO65+ is a unique opportunity to study short- and long-term outcomes in elderly patients with VTE. The Steering Committee encourages national and international collaborative research projects related to SWITCO65+, including sharing anonymized data and biological samples.
Abstract:
Globalisation in coronary stent research calls for harmonization of clinical endpoint definitions and event adjudication. Little has been published about the various processes used for event adjudication or their impact on outcome reporting.
Abstract:
The AEGISS (Ascertainment and Enhancement of Gastrointestinal Infection Surveillance and Statistics) project aims to use spatio-temporal statistical methods to identify anomalies in the space-time distribution of non-specific gastrointestinal infections in the UK, using the Southampton area in southern England as a test-case. In this paper, we use the AEGISS project to illustrate how spatio-temporal point process methodology can be used in the development of a rapid-response spatial surveillance system. Current surveillance of gastroenteric disease in the UK relies on general practitioners reporting cases of suspected food-poisoning through a statutory notification scheme, voluntary laboratory reports of the isolation of gastrointestinal pathogens and standard reports of general outbreaks of infectious intestinal disease by public health and environmental health authorities. However, most statutory notifications are made only after a laboratory reports the isolation of a gastrointestinal pathogen. As a result, detection is delayed and the ability to react to an emerging outbreak is reduced. For more detailed discussion, see Diggle et al. (2003). A new and potentially valuable source of data on the incidence of non-specific gastroenteric infections in the UK is NHS Direct, a 24-hour phone-in clinical advice service. NHS Direct data are less likely than reports by general practitioners to suffer from spatially and temporally localized inconsistencies in reporting rates. Also, reporting delays by patients are likely to be reduced, as no appointments are needed. Against this, NHS Direct data sacrifice specificity. Each call to NHS Direct is classified only according to the general pattern of reported symptoms (Cooper et al., 2003). The current paper focuses on the use of spatio-temporal statistical analysis for early detection of unexplained variation in the spatio-temporal incidence of non-specific gastroenteric symptoms, as reported to NHS Direct. Section 2 describes our statistical formulation of this problem, the nature of the available data and our approach to predictive inference. Section 3 describes the stochastic model. Section 4 gives the results of fitting the model to NHS Direct data. Section 5 shows how the model is used for spatio-temporal prediction. The paper concludes with a short discussion.
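As a hedged illustration of the kind of model such a surveillance system rests on (not necessarily the exact formulation fitted in the AEGISS analysis), incident calls can be treated as a spatio-temporal log-Gaussian Cox process whose intensity separates the known structure from an unexplained residual:

\[
\lambda(x,t) \;=\; \lambda_0(x)\,\mu(t)\,\exp\{S(x,t)\},
\]

where \(\lambda_0(x)\) is the spatial baseline of reporting, \(\mu(t)\) absorbs day-of-week and seasonal effects, and \(S(x,t)\) is a latent spatio-temporal Gaussian process. In a formulation of this kind, an anomaly is flagged at location \(x\) on day \(t\) when the predictive probability \(P\{\exp S(x,t) > c \mid \text{data up to } t\}\) exceeds a chosen threshold, and it is these exceedance probabilities that a rapid-response surveillance map would display.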
Abstract:
This article reports on the Internet-based second multicenter study (MCS II) of the spine study group (AG WS) of the German trauma association (DGU). It represents a continuation of the first study conducted between 1994 and 1996 (MCS I). For the purpose of a common, centralised data capture methodology, a newly developed Internet-based data collection system (http://www.memdoc.org) of the Institute for Evaluative Research in Orthopaedic Surgery of the University of Bern was used. The aim of this first publication on MCS II was to describe in detail the new method of data collection via the Internet and the structure of the database system that was developed. The goal of the study was to assess the current state of treatment of fresh traumatic injuries of the thoracolumbar spine in the German-speaking part of Europe. For that reason, we intended to collect a large number of cases and representative, valid information about the radiographic, clinical and subjective treatment outcomes. Thanks to the new study design of MCS II, not only the common surgical treatment concepts but also the new and constantly broadening spectrum of spine surgery, i.e. vertebro-/kyphoplasty, computer-assisted surgery and navigation, and minimally invasive and endoscopic techniques, could be documented and evaluated. We present a first statistical overview and preliminary analysis of the 18 centers from Germany and Austria that participated in MCS II. Real-time data capture at source was made possible by the constant availability of the data collection system via Internet access. Following the principle of an application service provider, software, questionnaires and validation routines are located on a central server, which is accessed from the periphery (hospitals) by means of standard Internet browsers. In this way, costly and time-consuming software installation and maintenance of local data repositories are avoided and, more importantly, cumbersome migration of data into one integrated database becomes obsolete. Finally, this set-up also replaces traditional systems in which paper questionnaires were mailed to the central study office and entered by hand, where incomplete or incorrect forms always represent a resource-consuming problem and source of error. With the new study concept and the expanded inclusion criteria of MCS II, 1,251 case histories with admission and surgical data were collected. This remarkable number of interventions documented over 24 months represents an increase of 183% compared to the previously conducted MCS I. The concept and technical feasibility of the MEMdoc data collection system were proven, as the participants of MCS II succeeded in collecting the largest series of patients with spinal injuries treated within a 2-year period ever published.
Abstract:
OBJECTIVE: To assess the methodology of meta-analyses published in leading general and specialist medical journals over a 10-year period. STUDY DESIGN AND SETTING: Volumes 1993-2002 of four general medicine journals and four specialist journals were searched by hand for meta-analyses including at least five controlled trials. Characteristics were assessed using a standardized questionnaire. RESULTS: A total of 272 meta-analyses, which included a median of 11 trials (range 5-195), were assessed. Most (81%) were published in general medicine journals. The median (range) number of databases searched increased from 1 (1-9) in 1993/1994 to 3.5 (1-21) in 2001/2002, P<0.0001. The proportion of meta-analyses including searches by hand (10% in 1993/1994, 25% in 2001/2002, P=0.005), searches of the grey literature (29%, 51%, P=0.010 by chi-square test), and of trial registers (10%, 32%, P=0.025) also increased. Assessments of the quality of trials also became more common (45%, 70%, P=0.008), including whether allocation of patients to treatment groups had been concealed (24%, 60%, P=0.001). The methodological and reporting quality was consistently higher in general medicine journals than in specialist journals. CONCLUSION: Many meta-analyses published in leading journals have important methodological limitations. The situation has improved in recent years, but considerable room for further improvement remains.
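As a brief, hedged sketch of how the period-to-period comparisons above are tested (the 2×2 counts below are hypothetical and chosen only to illustrate the chi-square calculation; the abstract reports proportions, not raw counts per period):

```python
# Hypothetical example of a chi-square test comparing the proportion of
# meta-analyses that reported hand searching in 1993/1994 vs 2001/2002.
from scipy.stats import chi2_contingency

# rows: period (1993/1994, 2001/2002); columns: hand search reported (yes, no)
table = [[6, 54],    # hypothetical: 10% of 60 meta-analyses
         [15, 45]]   # hypothetical: 25% of 60 meta-analyses

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, P = {p:.3f}")
```

The same construction applies to the other before/after proportions reported in the RESULTS.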
Abstract:
Riparian zones are dynamic, transitional ecosystems between aquatic and terrestrial ecosystems with well-defined vegetation and soil characteristics. Development of an all-encompassing definition for riparian ecotones is challenging because of their high variability. However, all riparian ecotones depend on two primary factors: the watercourse and its associated floodplain. Previous approaches to riparian boundary delineation have utilized fixed-width buffers, but this methodology has proven inadequate because it takes only the watercourse into consideration and ignores critical geomorphology, associated vegetation and soil characteristics. Our approach offers advantages over previously used methods by utilizing: the geospatial modeling capabilities of ArcMap GIS; a better sampling technique along the watercourse that can distinguish the 50-year floodplain, which is the optimal hydrologic descriptor of riparian ecotones; the Soil Survey Geographic (SSURGO) and National Wetlands Inventory (NWI) databases to distinguish contiguous areas beyond the 50-year floodplain; and land use/cover characteristics associated with the delineated riparian zones. The model utilizes spatial data readily available from Federal and State agencies and geospatial clearinghouses. An accuracy assessment was performed to assess the impact of varying the 50-year flood height, changing the DEM spatial resolution (1, 3, 5, and 10 m), and positional inaccuracies in the National Hydrography Dataset (NHD) streams layer on the boundary placement of the delineated variable-width riparian ecotones. The result of this study is a robust, automated GIS-based model attached to ESRI ArcMap software to delineate and classify variable-width riparian ecotones.
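As a rough, hedged sketch of the hydrologic core of such a delineation (using plain NumPy rather than the ArcMap toolchain described above; the array names and values are hypothetical), the 50-year flood water-surface elevation can be compared cell by cell with the DEM to flag candidate riparian cells, which the SSURGO, NWI, and land use/cover layers would then refine:

```python
import numpy as np

# Hypothetical inputs on a common grid (elevations in metres above datum):
# a small DEM and a 50-year flood water-surface elevation derived from the
# NHD stream network (simplified here to a single flat flood stage).
dem = np.array([[10.2, 10.6, 11.4],
                [10.1, 10.4, 11.0],
                [ 9.9, 10.3, 10.8]])
flood_surface = np.full(dem.shape, 10.5)

# Cells at or below the 50-year flood surface form the hydrologic core of
# the variable-width riparian ecotone; adjacent soil and wetland polygons
# would then extend the boundary beyond this core.
riparian_core = dem <= flood_surface
print(riparian_core)
```

In the actual model the flood surface varies along the stream network, which is why the accuracy assessment examines sensitivity to flood height, DEM resolution, and NHD positional error.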
Abstract:
Monte Carlo simulation was used to evaluate properties of a simple Bayesian MCMC analysis of the random effects model for single group Cormack-Jolly-Seber capture-recapture data. The MCMC method is applied to the model via a logit link, so the parameters p, S are on a logit scale, where logit(S) is assumed to have, and is generated from, a normal distribution with mean μ and variance σ². Marginal prior distributions on logit(p) and μ were independent normal with mean zero and standard deviation 1.75 for logit(p) and 100 for μ; hence minimally informative. The marginal prior distribution on σ² was placed on τ² = 1/σ² as a gamma distribution with α = β = 0.001. The study design has 432 points spread over 5 factors: occasions (t), new releases per occasion (u), p, μ, and σ. At each design point 100 independent trials were completed (hence 43,200 trials in total), each with sample size n = 10,000 from the parameter posterior distribution. At 128 of these design points comparisons are made to previously reported results from a method-of-moments procedure. We looked at properties of point and interval inference on μ and σ based on the posterior mean, median, and mode and the equal-tailed 95% credibility interval. Bayesian inference did very well for the parameter μ, but under the conditions used here, MCMC inference performance for σ was mixed: poor for sparse data (i.e., only 7 occasions) or σ = 0, but good when there were sufficient data and σ was not small.
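Written out in display form, the random-effects structure and priors described above are (the interval index t on S is added here only for readability; the abstract treats S generically):

\[
\mathrm{logit}(S_t) \sim \mathrm{N}(\mu, \sigma^2), \qquad
\mathrm{logit}(p) \sim \mathrm{N}(0,\, 1.75^2), \qquad
\mu \sim \mathrm{N}(0,\, 100^2), \qquad
\tau^2 = 1/\sigma^2 \sim \mathrm{Gamma}(\alpha = 0.001,\ \beta = 0.001).
\]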
Abstract:
Desertification research conventionally focuses on the problem – that is, degradation – while neglecting the appraisal of successful conservation practices. Based on the premise that Sustainable Land Management (SLM) experiences are not sufficiently or comprehensively documented, evaluated, and shared, the World Overview of Conservation Approaches and Technologies (WOCAT) initiative (www.wocat.net), in collaboration with FAO’s Land Degradation Assessment in Drylands (LADA) project (www.fao.org/nr/lada/) and the EU’s DESIRE project (http://www.desire-project.eu/), has developed standardised tools and methods for compiling and evaluating the biophysical and socio-economic knowledge available about SLM. The tools allow SLM specialists to share their knowledge and assess the impact of SLM at the local, national, and global levels. As a whole, the WOCAT–LADA–DESIRE methodology comprises tools for documenting, self-evaluating, and assessing the impact of SLM practices, as well as for knowledge sharing and decision support in the field, at the planning level, and in scaling up identified good practices. SLM depends on flexibility and responsiveness to changing complex ecological and socioeconomic causes of land degradation. The WOCAT tools are designed to reflect and capture this capacity of SLM. In order to take account of new challenges and meet emerging needs of WOCAT users, the tools are constantly further developed and adapted. Recent enhancements include tools for improved data analysis (impact and cost/benefit), cross-scale mapping, climate change adaptation and disaster risk management, and easier reporting on SLM best practices to UNCCD and other national and international partners. Moreover, WOCAT has begun to give land users a voice by backing conventional documentation with video clips straight from the field. To promote the scaling up of SLM, WOCAT works with key institutions and partners at the local and national level, for example advisory services and implementation projects.
Keywords: Sustainable Land Management (SLM), knowledge management, decision-making, WOCAT–LADA–DESIRE methodology.
Abstract:
The process of developing a successful stroke rehabilitation methodology requires four key components: a good understanding of the pathophysiological mechanisms underlying this brain disease, clear neuroscientific hypotheses to guide therapy, adequate clinical assessments of its efficacy on multiple timescales, and a systematic approach to the application of modern technologies to assist in the everyday work of therapists. Achieving this goal requires collaboration between neuroscientists, technologists and clinicians to develop well-founded systems and clinical protocols that are able to provide quantitatively validated improvements in patient rehabilitation outcomes. In this article we present three new applications of complementary technologies developed in an interdisciplinary matrix for acute-phase upper limb stroke rehabilitation – functional electrical stimulation, arm robot-assisted therapy and virtual reality-based cognitive therapy. We also outline the neuroscientific basis of our approach, present our detailed clinical assessment protocol and provide preliminary results from patient testing of each of the three systems showing their viability for patient use.