12 results for computerized electrocardiography
at Duke University
Abstract:
Children with sickle cell disease (SCD) have a high risk of neurocognitive impairment. No known research, however, has examined the impact of neurocognitive functioning on quality of life in this pediatric population. In addition, limited research has examined neurocognitive interventions for these children. In light of these gaps, two studies were undertaken to (a) examine the relationship between cognitive functioning and quality of life in a sample of children with SCD and (b) investigate the feasibility and preliminary efficacy of a computerized working memory training program in this population. Forty-five youth (ages 8-16) with SCD and a caregiver were recruited for the first study. Participants completed measures of cognitive ability, quality of life, and psychosocial functioning. Results indicated that cognitive ability significantly predicted child- and parent-reported quality of life among youth with SCD. In turn, a randomized controlled trial of a computerized working memory program was undertaken. Eighteen youth with SCD and a caregiver enrolled in this study and were randomized to either a waitlist control condition or the working memory training condition. Data pertaining to cognitive functioning, psychosocial functioning, and disease characteristics were obtained from participants. The results of this study indicated a high degree of acceptance for this intervention but poor feasibility in practice. Factors related to feasibility were identified. Implications and future directions are discussed.
Abstract:
This meta-analysis synthesizes research on the effectiveness of intelligent tutoring systems (ITS) for college students. Thirty-five reports were found containing 39 studies assessing the effectiveness of 22 types of ITS in higher education settings. Most frequently studied were AutoTutor, Assessment and Learning in Knowledge Spaces, eXtended Tutor-Expert System, and Web Interface for Statistics Education. Major findings include: (a) overall, ITS had a moderate positive effect on college students' academic learning (g = .32 to g = .37); (b) ITS were less effective than human tutoring, but they outperformed all other instruction methods and learning activities, including traditional classroom instruction, reading printed text or computerized materials, computer-assisted instruction, laboratory or homework assignments, and no-treatment control; (c) ITS effectiveness did not differ significantly across different ITS, subject domains, or the manner or degree of their involvement in instruction and learning; and (d) effectiveness in earlier studies appeared to be significantly greater than that in more recent studies. In addition, there is some evidence suggesting the importance of teachers and pedagogy in ITS-assisted learning.
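For readers unfamiliar with the effect-size metric reported above, the short Python sketch below shows one standard way to compute Hedges' g from two groups' summary statistics. The function and the example numbers are purely illustrative and are not drawn from the meta-analysis:

import math

def hedges_g(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference (Hedges' g) for two independent groups."""
    # Pooled standard deviation across the two groups
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / pooled_sd            # Cohen's d
    correction = 1 - 3 / (4 * (n_t + n_c) - 9)   # small-sample bias correction
    return d * correction

# Hypothetical example: ITS group vs. comparison group on a course exam
print(round(hedges_g(78.0, 10.0, 40, 74.5, 11.0, 42), 2))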
Abstract:
Simultaneous neural recordings taken from multiple areas of the rodent brain are garnering growing interest due to the insight they can provide about spatially distributed neural circuitry. The promise of such recordings has inspired great progress in methods for surgically implanting large numbers of metal electrodes into intact rodent brains. However, methods for localizing the precise location of these electrodes have remained severely lacking. Traditional histological techniques that require slicing and staining of physical brain tissue are cumbersome, and become increasingly impractical as the number of implanted electrodes increases. Here we solve these problems by describing a method that registers 3-D computerized tomography (CT) images of intact rat brains implanted with metal electrode bundles to a Magnetic Resonance Imaging Histology (MRH) Atlas. Our method allows accurate visualization of each electrode bundle's trajectory and location without removing the electrodes from the brain or surgically implanting external markers. In addition, unlike physical brain slices, once the 3D images of the electrode bundles and the MRH atlas are registered, it is possible to verify electrode placements from many angles by "re-slicing" the images along different planes of view. Further, our method can be fully automated and easily scaled to applications with large numbers of specimens. Our digital imaging approach to efficiently localizing metal electrodes offers a substantial addition to currently available methods, which, in turn, may help accelerate the rate at which insights are gleaned from rodent network neuroscience.
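The abstract does not spell out the registration machinery, but the core step it describes, aligning a CT volume of an implanted brain with an MRI-based atlas, can be sketched with an off-the-shelf toolkit. The snippet below is a minimal illustration using SimpleITK's mutual-information-driven rigid registration; the file names, parameter values, and choice of library are assumptions rather than the authors' actual pipeline:

import SimpleITK as sitk

# Hypothetical inputs: an MRH atlas volume and a CT volume of the implanted brain
fixed = sitk.ReadImage("mrh_atlas.nii.gz", sitk.sitkFloat32)
moving = sitk.ReadImage("implanted_ct.nii.gz", sitk.sitkFloat32)

# Initialize a rigid (6 degree-of-freedom) transform at the volume centers
initial = sitk.CenteredTransformInitializer(
    fixed, moving, sitk.Euler3DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)  # handles CT vs. MR contrast
reg.SetOptimizerAsRegularStepGradientDescent(learningRate=1.0,
                                             minStep=1e-4,
                                             numberOfIterations=200)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetInitialTransform(initial, inPlace=False)

transform = reg.Execute(fixed, moving)

# Resample the CT into atlas space so electrode bundles can be "re-sliced" in any plane
ct_in_atlas = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
sitk.WriteImage(ct_in_atlas, "implanted_ct_in_atlas_space.nii.gz")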
Abstract:
BACKGROUND: Automated reporting of estimated glomerular filtration rate (eGFR) is a recent advance in laboratory information technology (IT) that generates a measure of kidney function with chemistry laboratory results to aid early detection of chronic kidney disease (CKD). Because accurate diagnosis of CKD is critical to optimal medical decision-making, several clinical practice guidelines have recommended the use of automated eGFR reporting. Since its introduction, automated eGFR reporting has not been uniformly implemented by U.S. laboratories despite the growing prevalence of CKD. CKD is highly prevalent within the Veterans Health Administration (VHA), and implementation of automated eGFR reporting within this integrated healthcare system has the potential to improve care. In July 2004, the VHA adopted automated eGFR reporting through a system-wide mandate for software implementation by individual VHA laboratories. This study examines the timing of software implementation by individual VHA laboratories and factors associated with implementation. METHODS: We performed a retrospective observational study of laboratories in VHA facilities from July 2004 to September 2009. Using laboratory data, we identified the status of implementation of automated eGFR reporting for each facility and the time to actual implementation from the date the VHA adopted its policy for automated eGFR reporting. Using survey and administrative data, we assessed facility organizational characteristics associated with implementation of automated eGFR reporting via bivariate analyses. RESULTS: Of 104 VHA laboratories, 88% implemented automated eGFR reporting in existing laboratory IT systems by the end of the study period. Time to initial implementation ranged from 0.2 to 4.0 years with a median of 1.8 years. All VHA facilities with on-site dialysis units implemented the eGFR software (52%, p<0.001). Other organizational characteristics were not statistically significant. CONCLUSIONS: The VHA did not have uniform implementation of automated eGFR reporting across its facilities. Facility-level organizational characteristics were not associated with implementation, and this suggests that decisions for implementation of this software are not related to facility-level quality improvement measures. Additional studies on implementation of laboratory IT, such as automated eGFR reporting, could identify factors that are related to more timely implementation and lead to better healthcare delivery.
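For context, the eGFR value that such software appends to creatinine results comes from a prediction equation computed by the laboratory IT system. The sketch below implements the IDMS-traceable four-variable MDRD study equation, one equation commonly used for automated reporting, purely as an illustration; the abstract does not state which equation the VHA software used, and the example patient is hypothetical:

def egfr_mdrd(serum_creatinine_mg_dl, age_years, is_female, is_black):
    """IDMS-traceable 4-variable MDRD study equation (mL/min/1.73 m^2)."""
    egfr = 175.0 * serum_creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if is_female:
        egfr *= 0.742
    if is_black:
        egfr *= 1.212
    return egfr

# Hypothetical patient: 68-year-old non-Black male with serum creatinine 1.4 mg/dL
print(round(egfr_mdrd(1.4, 68, is_female=False, is_black=False), 1))  # roughly 50 mL/min/1.73 m^2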
Abstract:
BACKGROUND: The 12-lead ECG is a critical component of the initial evaluation of cardiac ischemia, but has traditionally been limited to large, dedicated equipment in medical care environments. Smartphones provide a potential alternative platform for extending ECG to new care settings and improving timeliness of care. OBJECTIVE: To gain experience with smartphone electrocardiography prior to designing a larger multicenter study evaluating standard 12-lead ECG compared to smartphone ECG. METHODS: Six patients for whom the hospital STEMI protocol was activated were evaluated with traditional 12-lead ECG followed immediately by a smartphone ECG using right (VnR) and left (VnL) limb leads for precordial grounding. The AliveCor™ Heart Monitor was utilized for this study. All tracings were taken prior to catheterization or immediately after revascularization while still in the catheterization laboratory. RESULTS: The smartphone ECG had excellent correlation with the gold-standard 12-lead ECG in all patients. Four out of six tracings were judged to meet STEMI criteria on both modalities as determined by three experienced cardiologists, and in the remaining two, consensus indicated a non-STEMI ECG diagnosis. No significant difference was noted between VnR and VnL. CONCLUSIONS: Smartphone-based electrocardiography is a promising, developing technology intended to increase the availability and speed of electrocardiographic evaluation. This study confirmed the potential of a smartphone ECG for evaluation of acute ischemia and the feasibility of studying this technology further to define the diagnostic accuracy, limitations, and appropriate use of this new technology.
Abstract:
Effective dosages for enzyme replacement therapy (ERT) in Pompe disease are much higher than for other lysosomal storage disorders, which has been attributed to low cation-independent mannose-6-phosphate receptor (CI-MPR) in skeletal muscle. We have previously demonstrated the benefit of increased CI-MPR-mediated uptake of recombinant human acid-α-glucosidase during ERT in mice with Pompe disease following addition of albuterol therapy. We have now completed a pilot study of albuterol in patients with late-onset Pompe disease already on ERT for >2 yr, who were not improving further. The 6-min walk test (6MWT) distance increased in all 7 subjects at wk 6 (30±13 m; P=0.002), wk 12 (34±14 m; P=0.004), and wk 24 (42±37 m; P=0.02), in comparison with baseline. Grip strength was improved significantly for both hands at wk 12. Furthermore, individual subjects reported benefits; e.g., a female patient could stand up from sitting on the floor much more easily (time for supine to standing position decreased from 30 to 11 s), and a male patient could readily swing his legs out of his van seat (hip abduction increased from 1 to 2+ on manual muscle testing). Finally, analysis of the quadriceps biopsies suggested increased CI-MPR at wk 12 (P=0.08), compared with baseline. With the exception of 1 patient who succumbed to respiratory complications of Pompe disease in the first week, only mild adverse events have been reported, including tremor, transient difficulty falling asleep, and mild urinary retention (requiring early morning voiding). Therefore, this open-label pilot study revealed initial safety and efficacy of adjunctive albuterol therapy in patients with late-onset Pompe disease who had been stable on ERT with no improvements noted over the previous several years.
Abstract:
This research validates a computerized dietary selection task (Food-Linked Virtual Response or FLVR) for use in studies of food consumption. In two studies, FLVR task responses were compared with measures of health consciousness, mood, body mass index, personality, cognitive restraint toward food, and actual food selections from a buffet table. The FLVR task was associated with variables which typically predict healthy decision-making and was unrelated to mood or body mass index. Furthermore, the FLVR task predicted participants' unhealthy selections from the buffet, but not overall amount of food. The FLVR task is an inexpensive, valid, and easily administered option for assessing momentary dietary decisions.
Abstract:
BACKGROUND: Curcumin is a natural product that is often explored by patients with cancer. Weight loss due to fat and muscle depletion is a hallmark of pancreatic cancer and is associated with worse outcomes. Studies of curcumin's effects on muscularity show conflicting results in animal models. METHODS AND RESULTS: Retrospective matched 1:2 case-control study to evaluate the effects of curcumin on body composition (determined by computerized tomography) of 66 patients with advanced pancreatic cancer (22 treated, 44 controls). Average age (SEM) was 63 (1.8) years, 30/66 (45%) were women, the median number of prior therapies was 2, and the median (IQR) time from advanced pancreatic cancer diagnosis to baseline image was 7 (2-13.5) months (p>0.2, all variables). All patients lost weight (3.3% and 1.3%, treated vs. control, p=0.13). Treated patients lost more muscle (median [IQR] percent change -4.8% [-9.1, -0.1] vs. -0.05% [-4.2, 2.6] in controls, p<0.001) and fat (median [IQR] percent change -6.8% [-15, -0.6] vs. -4.0% [-7.6, 1.3] in controls, p=0.04). Subcutaneous fat was more affected in the treated patients. Sarcopenic patients treated with curcumin (n=15) had a survival of 169 (115-223) days vs. 299 (229-369) days in sarcopenic controls (p=0.024). No survival difference was found amongst non-sarcopenic patients. CONCLUSIONS: Patients with advanced pancreatic cancer treated with curcumin showed significantly greater loss of subcutaneous fat and muscle than matched untreated controls.
Abstract:
BACKGROUND: Guidance for appropriate utilisation of transthoracic echocardiograms (TTEs) can be incorporated into ordering prompts, potentially affecting the number of requests. METHODS: We incorporated data from the 2011 Appropriate Use Criteria for Echocardiography, the 2010 National Institute for Clinical Excellence Guideline on Chronic Heart Failure, and American College of Cardiology Choosing Wisely list on TTE use for dyspnoea, oedema and valvular disease into electronic ordering systems at Durham Veterans Affairs Medical Center. Our primary outcome was TTE orders per month. Secondary outcomes included rates of outpatient TTE ordering per 100 visits and frequency of brain natriuretic peptide (BNP) ordering prior to TTE. Outcomes were measured for 20 months before and 12 months after the intervention. RESULTS: The number of TTEs ordered did not decrease (338±32 TTEs/month prior vs 320±33 afterwards, p=0.12). Rates of outpatient TTE ordering decreased minimally post intervention (2.28 per 100 primary care/cardiology visits prior vs 1.99 afterwards, p<0.01). Effects on TTE ordering and ordering rate significantly interacted with time from intervention (p<0.02 for both), as the small initial effects waned after 6 months. The percentage of TTE orders with preceding BNP increased (36.5% prior vs 42.2% after for inpatients, p=0.01; 10.8% prior vs 14.5% after for outpatients, p<0.01). CONCLUSIONS: Ordering prompts for TTEs initially minimally reduced the number of TTEs ordered and increased BNP measurement at a single institution, but the effect on TTEs ordered was likely insignificant from a utilisation standpoint and decayed over time.
Abstract:
The problem of social diffusion has animated sociological thinking on topics ranging from the spread of an idea, an innovation, or a disease, to the foundations of collective behavior and political polarization. While network diffusion has been a productive metaphor, the reality of diffusion processes is often muddier. Ideas and innovations diffuse differently from diseases, but, with a few exceptions, the diffusion of ideas and innovations has been modeled under the same assumptions as the diffusion of disease. In this dissertation, I develop two new diffusion models for "socially meaningful" contagions that address two of the most significant problems with current diffusion models: (1) that contagions can only spread along observed ties, and (2) that contagions do not change as they spread between people. I augment insights from these statistical and simulation models with an analysis of an empirical case of diffusion: the use of enterprise collaboration software in a large technology company. I focus the empirical study on when people abandon innovations, a crucial and understudied aspect of the diffusion of innovations. Using timestamped posts, I analyze in fine detail when people abandon the software.
To address the first problem, I suggest a latent space diffusion model. Rather than treating ties as stable conduits for information, the latent space diffusion model treats ties as random draws from an underlying social space and simulates diffusion over that space. To address the second problem, I suggest a diffusion model with schemas. Rather than treating information as though it spreads unchanged, the schema diffusion model allows people to modify the information they receive to fit an underlying mental model before they pass the information to others. Theoretically, the social space model integrates actor ties and attributes simultaneously in a single social plane, while incorporating schemas into diffusion processes gives explicit form to the reciprocal influences that cognition and the social environment have on each other. Practically, the latent space diffusion model produces statistically consistent diffusion estimates where using the network alone does not, and the schema diffusion model shows that introducing cognitive processing into diffusion changes the rate and ultimate distribution of the spreading information. Combining the latent space models with a schema notion for actors thus improves our models for social diffusion both theoretically and practically.
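As a rough illustration of the latent-space idea described above (not the dissertation's actual model), the following Python sketch places actors in a low-dimensional social space, lets contact probabilities decay with latent distance, and simulates a contagion over the space rather than over a single observed network; all parameter values are arbitrary:

import numpy as np

rng = np.random.default_rng(0)
n, dims = 200, 2
positions = rng.normal(size=(n, dims))       # latent social space coordinates

def contact_prob(i, j, scale=1.0):
    """Chance that actors i and j interact in a step, decaying with latent distance."""
    d = np.linalg.norm(positions[i] - positions[j])
    return np.exp(-d / scale)

adopted = np.zeros(n, dtype=bool)
adopted[rng.choice(n, size=5, replace=False)] = True   # seed adopters

for _ in range(20):                           # simulate 20 time steps
    new = adopted.copy()
    for i in np.flatnonzero(~adopted):
        # Each current adopter independently gets a chance to transmit to i
        for j in np.flatnonzero(adopted):
            if rng.random() < 0.05 * contact_prob(i, j):
                new[i] = True
                break
    adopted = new

print(f"adopters after 20 steps: {adopted.sum()} / {n}")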
The empirical case study focuses on how the changing value of an innovation, introduced by the innovation's network externalities, influences when people abandon the innovation. In it, I find that people are least likely to abandon an innovation when other people in their neighborhood currently use the software as well. The effect is particularly pronounced for supervisors' current use and for the number of supervisory team members who currently use the software. This case study not only points to an important process in the diffusion of innovations, but also suggests a new approach -- computerized collaboration systems -- to collecting and analyzing data on organizational processes.
Abstract:
BACKGROUND: It is unclear whether diagnostic protocols based on cardiac markers to identify low-risk chest pain patients suitable for early release from the emergency department can be applied to patients older than 65 years or with traditional cardiac risk factors. METHODS AND RESULTS: In a single-center retrospective study of 231 consecutive patients with a high risk-factor burden in which a first cardiac troponin (cTn) level was measured in the emergency department and a second cTn sample was drawn 4 to 14 hours later, we compared the performance of a modified 2-Hour Accelerated Diagnostic Protocol to Assess Patients with Chest Pain Using Contemporary Troponins as the Only Biomarker (ADAPT) rule to a new risk classification scheme that identifies patients as low risk if they have no known coronary artery disease, a nonischemic electrocardiogram, and 2 cTn levels below the assay's limit of detection. Demographic and outcome data were abstracted through chart review. The median age of our population was 64 years, and 75% had a Thrombolysis In Myocardial Infarction (TIMI) risk score ≥2. Using our risk classification rule, 53 (23%) patients were low risk with a negative predictive value for 30-day cardiac events of 98%. Applying a modified ADAPT rule to our cohort, 18 (8%) patients were identified as low risk with a negative predictive value of 100%. In a sensitivity analysis, the negative predictive value of our risk algorithm did not change when we relied only on undetectable baseline cTn and eliminated the second cTn assessment. CONCLUSIONS: If confirmed in prospective studies, this less-restrictive risk classification strategy could be used to safely identify chest pain patients with more traditional cardiac risk factors for early emergency department release.
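The risk classification rule evaluated here reduces to a simple decision function. The Python sketch below only restates the stated logic (no known coronary artery disease, a nonischemic electrocardiogram, and two troponin values below the assay's limit of detection); the function name, argument names, and numeric values are hypothetical:

def is_low_risk(known_cad, ischemic_ecg, ctn_first, ctn_second, limit_of_detection):
    """Classify a chest pain patient as low risk under the rule described in the abstract."""
    troponins_undetectable = (ctn_first < limit_of_detection and
                              ctn_second < limit_of_detection)
    return (not known_cad) and (not ischemic_ecg) and troponins_undetectable

# Hypothetical patient: no CAD history, nonischemic ECG, two undetectable troponins
print(is_low_risk(known_cad=False, ischemic_ecg=False,
                  ctn_first=0.005, ctn_second=0.004, limit_of_detection=0.01))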
Abstract:
Periods of drought and low streamflow can have profound impacts on both human and natural systems. People depend on a reliable source of water for numerous reasons, including potable water supply and the economic value produced through agriculture or energy production. Aquatic ecosystems likewise depend on water, in addition to the economic benefits they provide to society through ecosystem services. Given that periods of low streamflow may become more extreme and frequent in the future, it is important to study the factors that control water availability during these times. In the absence of precipitation, the slower hydrological response of groundwater systems plays an amplified role in water supply. Understanding the variability of the fraction of streamflow contributed by baseflow, or groundwater, during periods of drought provides insight into what future water availability may look like and how it can best be managed. The Mills River Basin in North Carolina is chosen as a case study to test this understanding. First, a physically meaningful estimate of baseflow is obtained from USGS streamflow data via computerized hydrograph analysis techniques. Time series analysis, including wavelet analysis, is then applied to highlight signals of non-stationarity and to evaluate the changes in variance needed to better understand the natural variability of baseflow and low flows. In addition to natural variability, human influence must be taken into account in order to accurately assess how the combined system reacts to periods of low flow. Defining a combined demand that consists of both natural and human demand allows a more rigorous assessment of the level of sustainable use of a shared resource, in this case water. The analysis of baseflow variability can differ based on regional location and local hydrogeology, but it was found that baseflow varies from multiyear scales, such as those associated with ENSO (3.5, 7 years), up to multidecadal time scales, with most of the contributing variance coming from decadal or multiyear scales. It was also found that the behavior of baseflow, and subsequently water availability, depends a great deal on overall precipitation, the tracks of hurricanes or tropical storms and associated climate indices, as well as physiography and hydrogeology. By evaluating and utilizing the Duke Combined Hydrology Model (DCHM), reasonably accurate estimates of streamflow during periods of low flow were obtained, in part due to the model's ability to capture subsurface processes. Being able to accurately simulate streamflow levels and subsurface interactions during periods of drought can be very valuable to water suppliers and decision makers, and ultimately affects citizens. Knowledge of future droughts and periods of low flow, in addition to tracking customer demand, will allow water suppliers to adopt better management practices, such as knowing when to withdraw more water during a surplus so that stress on the system is minimized when ample water supply is not available.
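The baseflow-separation step mentioned above is commonly performed with a recursive digital filter applied to the streamflow record. The Python sketch below implements a single forward pass of the one-parameter Lyne-Hollick filter purely as an illustration; the abstract does not specify which hydrograph analysis technique was used, and the filter parameter and example flows are hypothetical:

import numpy as np

def baseflow_lyne_hollick(streamflow, alpha=0.925):
    """Single forward pass of the Lyne-Hollick recursive digital filter.

    Returns the baseflow series; quickflow is streamflow minus baseflow.
    """
    q = np.asarray(streamflow, dtype=float)
    quick = np.zeros_like(q)
    for t in range(1, len(q)):
        quick[t] = alpha * quick[t - 1] + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
        quick[t] = min(max(quick[t], 0.0), q[t])   # keep quickflow physically plausible
    return q - quick

# Hypothetical daily streamflow record (cubic feet per second)
flows = [120, 118, 115, 300, 260, 200, 170, 150, 140, 135]
print(np.round(baseflow_lyne_hollick(flows), 1))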