850 results for Prospective econometrics
Abstract:
Increasing amounts of clinical research data are collected by manual data entry into electronic source systems and directly from research subjects. For such manually entered source data, common data-cleaning methods such as double data entry and post-entry identification and resolution of discrepancies are not feasible. However, the data accuracy achieved without these mechanisms may be lower than desired for a particular research use. We evaluated a heuristic usability method for its utility as a tool to independently and prospectively identify data collection form questions associated with data errors. The method showed a promising sensitivity of 64% and a specificity of 67%. It was applied as described in the usability literature, with no further adaptation or specialization for predicting data errors. We conclude that usability evaluation methodology should be further investigated for use in data quality assurance.
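As a brief illustration of how the reported sensitivity (64%) and specificity (67%) follow from a confusion matrix, here is a minimal sketch; the counts below are hypothetical, chosen only to reproduce the published percentages, and are not the study's data:

```python
# Hypothetical confusion matrix for flagging form questions as error-prone.
tp = 16  # questions flagged by the method and truly associated with errors
fn = 9   # error-associated questions the method missed
tn = 33  # clean questions correctly left unflagged
fp = 16  # questions flagged without associated errors

sensitivity = tp / (tp + fn)  # fraction of error-prone questions detected
specificity = tn / (tn + fp)  # fraction of clean questions correctly passed

print(f"sensitivity = {sensitivity:.2f}")  # 16/25 = 0.64
print(f"specificity = {specificity:.2f}")  # 33/49 ≈ 0.67
```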
Abstract:
QUESTIONS UNDER STUDY: After years of advocating ABC (Airway-Breathing-Circulation), current guidelines of cardiopulmonary resuscitation (CPR) recommend CAB (Circulation-Airway-Breathing). This trial compared ABC with CAB as the initial approach to CPR, from the arrival of rescuers until the completion of the first resuscitation cycle. METHODS: 108 teams, consisting of two physicians each, were randomized to receive a graphical display of either the ABC algorithm or the CAB algorithm. Subsequently, teams had to treat a simulated cardiac arrest. Data analysis was performed using video recordings obtained during the simulations. The primary endpoint was the time to completion of the first resuscitation cycle of 30 compressions and two ventilations. RESULTS: The time to execution of the first resuscitation measure was 32 ± 12 seconds in ABC teams and 25 ± 10 seconds in CAB teams (P = 0.002). 18 of 53 ABC teams (34%) and none of the 55 CAB teams (P = 0.006) applied more than the recommended two initial rescue breaths, which caused a longer duration of the first cycle of 30 compressions and two ventilations in ABC teams (31 ± 13 vs. 23 ± 6 sec; P = 0.001). Overall, the time to completion of the first resuscitation cycle was longer in ABC teams (63 ± 17 vs. 48 ± 10 sec; P < 0.0001). CONCLUSIONS: This randomized controlled trial found CAB superior to ABC, with an earlier start of CPR and a shorter time to completion of the first 30:2 resuscitation cycle. These findings endorse the change from ABC to CAB in international resuscitation guidelines.
Abstract:
Background: Total knee replacement is the gold-standard treatment for patients suffering from advanced symptomatic knee osteoarthritis. The main goals of knee replacement are pain reduction and restoration of knee motion. New prostheses on the market, such as the bi-cruciate stabilized Journey knee implant, promise reconstruction of the knee's full physiological function with a physiological range of motion and therefore high patient satisfaction. Purpose: The aim of this study was to analyze the patient-based Knee Injury and Osteoarthritis Outcome Score (KOOS) outcome after total knee replacement with the new physiological bi-cruciate stabilized Journey knee prosthesis. Study Design: Prospective, consecutive case series. Patients: Ninety-nine patients who received the bi-cruciate stabilized Journey total knee prosthesis between January 1st, 2006 and May 31st, 2012 were included in the study. A single surgeon operated on all patients. 61.1% were female, and the overall average age was 68 years (range 41-83 years). The left knee was replaced in 55.6%. Methods: The patients filled in the KOOS questionnaire preoperatively and 1 year postoperatively. Range of motion (ROM) was assessed preoperatively and at the 1-year follow-up. The pre- and postoperative KOOS subscores and ROM were compared using the Wilcoxon signed-rank test. Results: There were significant improvements in all KOOS subscores. Ninety percent of patients reached the minimum clinically relevant improvement of 10 points in symptoms, 94.5% in pain, 94.5% in activities of daily living, 84.9% in sport and recreation, and 90% in knee-related quality of life. Postoperatively, the average passive ROM was 131° (range 110-145°) and the average active ROM 122° (range 105-135°). The highest correlation coefficients between ROM and the KOOS were observed for the activity and pain subscores. Very low or no correlation was seen for the sport subscore.
Conclusions: The bi-cruciate stabilized knee prosthesis offers a solid outcome 1 year postoperatively, based on the results measured with the KOOS questionnaire. Patients showed a generalized improvement in all KOOS domains of at least 35 and up to more than 52 points, which was statistically significant. Patients described their level of functionality as close to double the preoperative status.
Abstract:
A cohort of 418 United States Air Force (USAF) personnel from over 15 different bases deployed to Morocco in 1994. This was the first study of its kind and was designed with two primary goals: to determine whether the USAF was medically prepared to deploy with its changing mission in the new world order, and to evaluate factors that might improve or degrade USAF medical readiness. The mean length of deployment was 21 days. The cohort was 95% male, 86% enlisted, 65% married, and 78% white. This study shows major deficiencies indicating that the USAF medical readiness posture has not fully responded to its new mission requirements. Lack of required logistical items (e.g., mosquito nets, rainboots, DEET insecticide cream) revealed a low state of preparedness. The most notable deficiency was that 82.5% (95% CI = 78.4, 85.9) did not have permethrin-pretreated mosquito nets and 81.0% (95% CI = 76.8, 84.6) lacked mosquito net poles. Additionally, 18% were deficient on vaccinations and 36% had not received a tuberculin skin test. Excluding injections, overall compliance with preventive medicine requirements had a mean frequency of only 50.6% (95% CI = 45.36, 55.90). Several factors had a positive impact on compliance with logistical requirements. The most prominent was "receiving a medical intelligence briefing" from USAF Public Health. After adjustment for mobility and age, individuals who underwent a briefing were 17.2 (95% CI = 4.37, 67.99) times more likely to have received an immunoglobulin shot and 4.2 (95% CI = 1.84, 9.45) times more likely to start their antimalarial prophylaxis at the proper time. "Personnel on mobility" had the second-strongest positive effect on medical readiness.
When mobility and briefing were included in the models, "personnel on mobility" were 2.6 (95% CI = 1.19, 5.53) times as likely to have DEET insecticide and 2.2 (95% CI = 1.16, 4.16) times as likely to have had a TB skin test. Five recommendations to improve the medical readiness of the USAF were outlined: upgrade base-level logistical support, improve medical intelligence messages, include medical requirements on travel orders, place more personnel on mobility or deploy only personnel on mobility, and conduct research dedicated to capitalizing on the powerful effect of predeployment briefings. Since this is the first study of its kind, more studies should be performed in different geographic theaters to assess medical readiness and establish acceptable compliance levels for the USAF.
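Adjusted ratios like those above are typically reported with Wald-type confidence intervals computed on the log scale. A minimal sketch of that calculation, assuming an illustrative ratio of 4.2 with a standard error of 0.41 on the log scale (the standard error is hypothetical, not taken from the study):

```python
import math

def wald_ci(ratio, se_log, z=1.96):
    """95% Wald confidence interval for a ratio, computed on the log scale."""
    log_ratio = math.log(ratio)
    lower = math.exp(log_ratio - z * se_log)
    upper = math.exp(log_ratio + z * se_log)
    return lower, upper

# Illustrative values only: ratio = 4.2, SE(log ratio) = 0.41
lo, hi = wald_ci(4.2, 0.41)
print(f"ratio = 4.2, 95% CI = ({lo:.2f}, {hi:.2f})")
```

The interval is asymmetric around the point estimate because the symmetry holds on the log scale, which is why published CIs for ratios (like 4.2 with CI 1.84 to 9.45) stretch further above the estimate than below it.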
Abstract:
BACKGROUND Although well-established for suspected lower limb deep venous thrombosis, an algorithm combining a clinical decision score, d-dimer testing, and ultrasonography has not been evaluated for suspected upper extremity deep venous thrombosis (UEDVT). OBJECTIVE To assess the safety and feasibility of a new diagnostic algorithm in patients with clinically suspected UEDVT. DESIGN Diagnostic management study. (ClinicalTrials.gov: NCT01324037) SETTING: 16 hospitals in Europe and the United States. PATIENTS 406 inpatients and outpatients with suspected UEDVT. MEASUREMENTS The algorithm consisted of the sequential application of a clinical decision score, d-dimer testing, and ultrasonography. Patients were first categorized as likely or unlikely to have UEDVT; in those with an unlikely score and normal d-dimer levels, UEDVT was excluded. All other patients had (repeated) compression ultrasonography. The primary outcome was the 3-month incidence of symptomatic UEDVT and pulmonary embolism in patients with a normal diagnostic work-up. RESULTS The algorithm was feasible and completed in 390 of the 406 patients (96%). In 87 patients (21%), an unlikely score combined with normal d-dimer levels excluded UEDVT. Superficial venous thrombosis and UEDVT were diagnosed in 54 (13%) and 103 (25%) patients, respectively. All 249 patients with a normal diagnostic work-up, including those with protocol violations (n = 16), were followed for 3 months. One patient developed UEDVT during follow-up, for an overall failure rate of 0.4% (95% CI, 0.0% to 2.2%). LIMITATIONS This study was not powered to show the safety of the substrategies. d-Dimer testing was done locally. CONCLUSION The combination of a clinical decision score, d-dimer testing, and ultrasonography can safely and effectively exclude UEDVT. If confirmed by other studies, this algorithm has potential as a standard approach to suspected UEDVT. PRIMARY FUNDING SOURCE None.
Abstract:
BACKGROUND Adherence to guidelines is associated with improved outcomes in patients with acute coronary syndrome (ACS). Clinical registries developed to assess quality of care at discharge often do not collect the reasons for non-prescription of proven efficacious preventive medication in Continental Europe. In a prospective cohort of patients hospitalized for an ACS, we aimed to measure the rate of recommended treatment at discharge, using pre-specified quality indicators recommended in cardiology guidelines and including systematic collection of reasons for non-prescription of preventive medications. METHODS In a prospective cohort of 1260 patients hospitalized for ACS, we measured the rate of recommended treatment at discharge in 4 academic centers in Switzerland. Performance measures for medication at discharge were pre-specified according to guidelines, systematically collected for all patients, and included in a centralized database. RESULTS Six hundred and eighty-eight patients (54.6%) were discharged with a main diagnosis of STEMI, 491 (39%) of NSTEMI, and 81 (6.4%) of unstable angina. Mean age was 64 years and 21.3% were women. 94.6% were prescribed angiotensin-converting enzyme inhibitors/angiotensin II receptor blockers at discharge when considering only raw prescription rates, but this increased to 99.5% when including reasons for non-prescription. For statins, rates increased from 98% to 98.6% when including reasons for non-prescription, and for beta-blockers, from 82% to 93%. For aspirin, rates further increased from 99.4% to 100%, and from 99.8% to 100% for P2Y12 inhibitors. CONCLUSIONS We found very high adherence to ACS guidelines for drug prescription at discharge when including reasons for non-prescription of drug therapy. For beta-blockers, prescription rates were suboptimal, even after taking into account reasons for non-prescription.
In an era of improving quality of care toward 100% prescription rates at discharge unless contraindicated, pre-specifying reasons for non-prescription of cardiovascular preventive medication makes it possible to identify remaining gaps in quality of care at discharge.
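The adjustment described above amounts to removing patients with a documented reason for non-prescription from the denominator. A minimal sketch of that arithmetic, using hypothetical counts chosen only to mirror the beta-blocker pattern (82% raw, 93% adjusted); the study's actual patient counts are not reported at this granularity:

```python
def adjusted_rate(prescribed, total, documented_reasons):
    """Prescription rate after excluding patients with a documented
    reason for non-prescription from the denominator."""
    eligible = total - documented_reasons
    return prescribed / eligible

# Hypothetical numbers: 820 of 1000 patients prescribed (82% raw rate),
# 118 of the 180 non-prescriptions justified by a documented reason.
raw = 820 / 1000
adjusted = adjusted_rate(820, 1000, 118)
print(f"raw = {raw:.0%}, adjusted = {adjusted:.1%}")
```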
Abstract:
About 500,000 elderly people in Switzerland suffer a fall each year, so medical attention and help are essential for these people, who mostly live alone without a caregiver. Only 3% of people aged over 65 in Switzerland use an emergency system. Personal telehealth devices allow patients to receive sufficient information about appropriate treatment, as well as follow-up with their doctors and reporting of any emergency, in the absence of a caregiver. This increases their quality of life in a cost-effective fashion. "Limmex", a new medical emergency watch, was launched in Switzerland in 2011 and has been a great commercial success. In this paper, we give a brief review of the watch's technology, along with the results of a survey of 620 users conducted by the Department of Emergency Medicine in Bern.
Abstract:
BACKGROUND Prediction studies in subjects at Clinical High Risk (CHR) for psychosis are hampered by a high proportion of uncertain outcomes. We therefore investigated whether quantitative EEG (QEEG) parameters can contribute to an improved identification of CHR subjects with a later conversion to psychosis. METHODS This investigation was a project within the European Prediction of Psychosis Study (EPOS), a prospective multicenter, naturalistic field study with an 18-month follow-up period. QEEG spectral power and alpha peak frequencies (APF) were determined in 113 CHR subjects. The primary outcome measure was conversion to psychosis. RESULTS Cox regression yielded a model including frontal theta (HR = 1.82; 95% CI, 1.00-3.32) and delta (HR = 2.60; 95% CI, 1.30-5.20) power, and occipital-parietal APF (HR = 0.52; 95% CI, 0.35-0.80) as predictors of conversion to psychosis. The resulting equation enabled the development of a prognostic index with three risk classes (hazard rates 0.057 to 0.81). CONCLUSIONS Power in the theta and delta ranges and APF contribute to the short-term prediction of psychosis and enable a further stratification of risk in CHR samples. Combined with (other) clinical ratings, EEG parameters may therefore be a useful tool for individualized risk estimation and, consequently, targeted prevention.
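A Cox model's prognostic index is simply the linear predictor, a weighted sum of the covariates with coefficients beta = ln(HR); differences in the index translate into relative hazards via exponentiation. A minimal sketch using the hazard ratios reported above; the covariate values for the two example subjects are hypothetical standardized scores, not study data:

```python
import math

# Coefficients recovered from the reported hazard ratios (beta = ln(HR)).
betas = {
    "frontal_theta": math.log(1.82),
    "frontal_delta": math.log(2.60),
    "occipital_parietal_apf": math.log(0.52),
}

def prognostic_index(x):
    """Linear predictor of a Cox model: PI = sum(beta_i * x_i)."""
    return sum(betas[name] * value for name, value in x.items())

# Hypothetical subjects: A has elevated slow-wave power and reduced APF,
# B sits at the reference level (all covariates zero).
a = {"frontal_theta": 1.0, "frontal_delta": 0.5, "occipital_parietal_apf": -0.5}
b = {"frontal_theta": 0.0, "frontal_delta": 0.0, "occipital_parietal_apf": 0.0}

# Relative hazard of A vs. B = exp(PI_A - PI_B)
rel_hazard = math.exp(prognostic_index(a) - prognostic_index(b))
print(f"relative hazard A vs. B: {rel_hazard:.2f}")
```

Note the APF term: because its HR is below 1, a lower APF (negative standardized score) increases the index, consistent with the reported direction of risk.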
Abstract:
BACKGROUND Advanced lower extremity peripheral artery disease (PAD), whether presenting as acute limb ischemia (ALI) or chronic critical limb ischemia (CLI), is associated with high rates of cardiovascular ischemic events, amputation, and death. Past research has focused on strategies of revascularization, but few data are available that prospectively evaluate the impact of key process-of-care factors (spanning pre-admission, acute hospitalization, and post-discharge) that might contribute to improving short- and long-term health outcomes. METHODS/DESIGN The FRIENDS registry is designed to prospectively evaluate a range of patient and health system care delivery factors that might serve as future targets for efforts to improve limb and systemic outcomes for patients with ALI or CLI. This hypothesis-driven registry was designed to evaluate the contributions of: (i) pre-hospital limb ischemia symptom duration, (ii) use of leg revascularization strategies, and (iii) use of risk-reduction pharmacotherapies, as pre-specified factors that may affect amputation-free survival. Sequential patients would be included at an index "vascular specialist-defined" ALI or CLI episode, and patients excluded only for non-vascular etiologies of limb threat. Data including baseline demographics, functional status, co-morbidities, pre-hospital time segments, and use of medical therapies; hospital-based use of revascularization strategies, time segments, and pharmacotherapies; and rates of systemic ischemic events (e.g., myocardial infarction, stroke, hospitalization, and death) and limb ischemic events (e.g., hospitalization for revascularization or amputation) will be recorded during a minimum of one year of follow-up. DISCUSSION The FRIENDS registry is designed to evaluate the potential impact of key factors that may contribute to adverse outcomes for patients with ALI or CLI.
Definition of new "health system-based" therapeutic targets could then become the focus of future interventional clinical trials for individuals with advanced PAD.
Abstract:
Patient self-management (PSM) of oral anticoagulation is under discussion because evidence from real-life settings is missing. Using data from a nationwide, prospective cohort study in Switzerland, we assessed the overall long-term efficacy and safety of PSM and examined subgroups. Data from 1140 patients (5818.9 patient-years) were analysed, and no patients were lost to follow-up. Median follow-up was 4.3 years (range 0.2-12.8 years). Median age at the time of training was 54.2 years (range 18.2-85.2) and 34.6% were women. All-cause mortality was 1.4 per 100 patient-years (95% CI 1.1-1.7), with higher rates in patients with atrial fibrillation (2.5; 1.6-3.7; p<0.001), patients >50 years of age (2.0; 1.6-2.6; p<0.001), and men (1.6; 1.2-2.1; p = 0.036). The rate of thromboembolic events was 0.4 (0.2-0.6) and was independent of indication, sex, and age. Major bleeding was observed at a rate of 1.1 (0.9-1.5) per 100 patient-years. Efficacy was comparable to standard care and new oral anticoagulants in a network meta-analysis. PSM by properly trained patients is effective and safe in a long-term real-life setting and robust across clinical subgroups. Adoption in various clinical settings, including those with limited access to medical care or in rural areas, is warranted.
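Rates "per 100 patient-years" are events divided by total follow-up time, scaled by 100. A minimal sketch of that arithmetic; the event count of 81 is inferred back from the published mortality rate and the 5818.9 patient-years, not reported directly in the abstract:

```python
def rate_per_100py(events, patient_years):
    """Incidence rate expressed per 100 patient-years of follow-up."""
    return 100 * events / patient_years

# The cohort contributed 5818.9 patient-years; an all-cause mortality of
# 1.4 per 100 patient-years corresponds to roughly 81 deaths
# (81 is inferred from the published rate, not stated in the abstract).
print(f"{rate_per_100py(81, 5818.9):.1f} per 100 patient-years")
```

Expressing rates this way lets cohorts with very different follow-up durations (here, 0.2 to 12.8 years per patient) be compared on a common scale.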
Abstract:
Background: Deep brain stimulation (DBS) is highly successful in treating Parkinson's disease (PD), dystonia, and essential tremor (ET). Until recently, implantable neurostimulators were nonrechargeable, battery-driven devices with a lifetime of about 3-5 years. This relatively short duration causes problems for patients (e.g. programming and device-use limitations, unpredictable expiration, surgeries to replace depleted batteries). Additionally, these batteries (relatively large and heavy) may cause discomfort. To overcome these issues, the first rechargeable DBS device was introduced: smaller, lighter, and intended to function for 9 years. Methods: Of 35 patients implanted with the rechargeable device, 21 (including 8 PD, 10 dystonia, and 2 ET) were followed before and 3 months after surgery and completed a systematic survey of satisfaction with the rechargeable device. Results: Overall patient satisfaction was high (83.3 ± 18.3). Dystonia patients tended to have lower satisfaction values for fit and comfort of the system than PD patients. Age was significantly negatively correlated with satisfaction regarding the process of battery recharging. Conclusions: Dystonia patients (generally high energy consumption, severe problems at the DBS device's end of life) are good, reliable candidates for a rechargeable DBS system. In PD, younger patients without signs of dementia and with good technical understanding might have the highest benefit.
Abstract:
Performing a prospective memory task repeatedly changes the nature of the task from episodic to habitual. The goal of the present study was to investigate the neural basis of this transition. In two experiments, we contrasted event-related potentials (ERPs) evoked by correct responses to prospective memory targets in the first, more episodic part of the experiment with those of the second, more habitual part of the experiment. Specifically, we tested whether the early, middle, or late ERP components, which are thought to reflect cue detection, retrieval of the intention, and post-retrieval processes, respectively, would be changed by routinely performing the prospective memory task. The results showed a differential ERP effect in the middle time window (450-650 ms post-stimulus). Source localization using low-resolution brain electromagnetic tomography (LORETA) suggests that the transition was accompanied by an increase of activation in the posterior parietal and occipital cortex. These findings indicate that habitual prospective memory involves retrieval processes guided more strongly by parietal brain structures. In brief, the study demonstrates that episodic and habitual prospective memory tasks recruit different brain areas.
Abstract:
Forgetting to carry out an intention as planned can have serious consequences in everyday life. People sometimes even forget intentions that they consider very important. Here, we review the literature on the impact of importance on prospective memory performance. We highlight different methods used to manipulate the importance of a prospective memory task, such as providing rewards, importance relative to other ongoing activities, absolute importance, and providing social motives. Moreover, we address the relationship between importance and other factors known to affect prospective memory and ongoing-task performance, such as the type of prospective memory task (time-, event-, or activity-based), cognitive load, and processing overlaps. Finally, we draw a connection to motivation, summarize the effects of task importance, and identify important avenues for future research.
Abstract:
Traumatic brain injuries (TBIs) occur frequently in childhood and entail broad cognitive deficits, particularly in the domain of executive functions (EF). Concerning mild TBI (mTBI), little empirical evidence is available on acute and postacute EF performance. Given that EF are linked to school adaptation and achievement, even subtle deficits in performance may affect children's academic careers. The present study assessed performance in the EF components of inhibition, working memory (WM), and switching in children after mTBI. Regarding both acute and postacute consequences, performance trajectories were measured in 13 patients aged between 5 and 10 years and in 13 controls closely matched in terms of sex, age, and education. Performance in the EF components of inhibition, switching, and WM was assessed in a short-term longitudinal design at 2, 6, and 12 weeks after the mTBI. Results indicate subtle deficits after mTBI, which became apparent in the longitudinal trajectories of the EF components of switching and WM. Compared with controls, children who sustained an mTBI displayed inferior performance enhancement across testing sessions in the first 6 weeks after the injury in switching and WM, resulting in a delayed deficit in the EF component of WM 12 weeks after the injury. Results are interpreted as mTBI-related deficits that become evident as an inability to profit from previous learning opportunities, a finding that is potentially important for children's mastery of their daily lives.