11 results for Longitudinal field study

in DigitalCommons@The Texas Medical Center


Relevance:

90.00%

Publisher:

Abstract:

Background and Purpose. There is growing consensus among health care researchers that Quality of Life (QoL) is an important outcome and, within the field of family caregiving, that cost-effectiveness research is needed to determine which programs have the greatest benefit for family members. This study uses a multidimensional approach to measure the cost effectiveness of a multicomponent intervention designed to improve the quality of life of spousal caregivers of stroke survivors. Methods. The CAReS study (Committed to Assisting with Recovery after Stroke) was a 5-year prospective, longitudinal intervention study of 159 stroke survivors and their spousal caregivers, enrolled upon discharge of the stroke survivor from inpatient rehabilitation to home. CAReS cost data were analyzed to determine the incremental cost of the intervention per caregiver. The mean values of the quality-of-life predictor variables for the intervention group of caregivers were compared to the mean values of usual-care groups reported in the literature. The cost of the intervention per caregiver was then divided by each significant difference to calculate the incremental cost-effectiveness ratio for each predictor variable. Results. The cost of the intervention was approximately $2,500 per caregiver. Statistically significant differences were found between the mean scores for the Perceived Stress and Satisfaction with Life scales, but not for the Self-Reported Health Status, Mutuality, and Preparedness scales. Conclusions. This study provides a prototype cost-effectiveness analysis on which researchers can build. The multidimensional approach to measuring QoL used in this analysis incorporates both the subjective and objective components of QoL. Some of the QoL predictor variable scores differed significantly between the intervention and comparison groups, indicating a significant impact of the intervention. The estimated cost of that impact was also examined. Future studies should use a scale that accounts for both the dimensions of QoL and the weighting each person places on those dimensions, yielding a single QoL score per participant. With participant-level cost and outcome data, the uncertainty around each cost-effectiveness ratio can be quantified using the bias-corrected percentile bootstrap and plotted as cost-effectiveness acceptability curves.
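The incremental cost-effectiveness arithmetic and the bootstrap approach the conclusions propose can be sketched as follows. All numbers here are simulated for illustration (the study's participant-level data are not reproduced), and a plain percentile bootstrap stands in for the bias-corrected variant the abstract recommends.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical participant-level data (illustrative only; not study data):
# intervention cost per caregiver ($) and improvement on a QoL predictor scale.
costs = rng.normal(2500, 300, size=159)
effects = rng.normal(2.0, 1.5, size=159)

def icer(c, e):
    """Incremental cost-effectiveness ratio: mean cost per unit of effect."""
    return c.mean() / e.mean()

point = icer(costs, effects)

# Percentile bootstrap of the ICER (resampling participants with replacement).
n = len(costs)
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(icer(costs[idx], effects[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"ICER ~ ${point:.0f} per unit improvement (95% CI {lo:.0f}-{hi:.0f})")
```

Counting the fraction of bootstrap replicates that fall below each willingness-to-pay threshold is what yields the cost-effectiveness acceptability curve mentioned in the abstract.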

Relevance:

90.00%

Publisher:

Abstract:

Longitudinal principal components analyses of a combination of four subcutaneous skinfolds (biceps, triceps, subscapular and suprailiac) were performed using data from the London Longitudinal Growth Study. The main objectives were to discover at what age during growth sex differences in body fat distribution emerge and to see whether there is continuity in body fatness and body fat distribution from childhood into adult status (18 years). The analyses were done for four age sectors (3 mon-3 yrs, 3 yrs-8 yrs, 8 yrs-18 yrs and 3 yrs-18 yrs). Longitudinal principal component one (LPC1) for each age interval in both sexes represents the population mean fat curve. Component two (LPC2) is a velocity-of-fatness component. Component three (LPC3) in the 3 mon-3 yrs age sector represents the infant fat wave in both sexes. In the next two age sectors, component three in males represents peaks and shifts in fat growth (change in velocity), while in females it represents body fat distribution. Component four (LPC4) in the same two age sectors reverses the pattern seen for component three, i.e., in males it is body fat distribution and in females velocity shifts. Components five and above represent more complicated patterns of change (multiple increases and decreases across the age interval). In both sexes there is strong tracking in fatness from middle childhood to adolescence. In males only, there is also low-to-moderate tracking of infant fat with middle-to-late-childhood fat. These findings are strongly supported in the literature. Several factors are known to predict adult fatness, among the most important being previous levels of fatness (at earlier ages) and the age at rebound. In addition, we found that the velocity of fat change in middle childhood was highly predictive of later fatness (r ≈ -0.7), even more so than age at rebound (r ≈ -0.5). In contrast to fatness (LPC1), body fat distribution (LPC3-LPC4) did not track well, even though significant components of body fat distribution occur at each age. Tracking of body fat distribution was higher in females than in males. Sex differences in body fat distribution are largely non-existent, although some differences are evident in the peripheral-to-central ratios after age 14 years.
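The idea behind longitudinal PCA here, that the leading component of a child-by-age measurement matrix captures overall fatness level (tracking) while later components capture velocity-like contrasts, can be sketched on simulated trajectories. The data below are synthetic; the study used four measured skinfolds from 3 months to 18 years.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: 100 children x 10 measurement ages, each value a
# combined skinfold score (illustrative, not study data).
n_children, n_ages = 100, 10
mean_curve = np.linspace(8.0, 14.0, n_ages)        # population mean fat curve
level = rng.normal(0, 2, (n_children, 1))          # stable individual level
velocity = rng.normal(0, 1, (n_children, 1)) * np.linspace(-1, 1, n_ages)
X = mean_curve + level + velocity + rng.normal(0, 0.3, (n_children, n_ages))

# Longitudinal PCA: principal components of the centered child-by-age matrix.
# PC1 loads uniformly across ages (level / tracking); PC2 changes sign across
# ages (a velocity component, analogous to LPC2 in the abstract).
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / (S**2).sum()
print("variance explained by PC1, PC2:", explained[:2].round(2))
```

With these simulation settings the level component dominates, mirroring the strong tracking in overall fatness the abstract reports, while the velocity component is a distinct but smaller second mode.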

Relevance:

80.00%

Publisher:

Abstract:

An understanding of interruptions in healthcare is important for the design, implementation, and evaluation of health information systems and for the management of clinical workflow and medical errors. The purpose of this study was to identify and classify the types of interruptions experienced by Emergency Department (ED) nurses working in a Level One Trauma Center. This was an observational field study of Registered Nurses (RNs) employed in a Level One Trauma Center, using the shadowing method. Results indicate that nurses were both recipients and initiators of interruptions. Telephones, pagers, and face-to-face conversations were the most common sources of interruptions. Unlike other industries, the healthcare community has not systematically studied interruptions in clinical settings to determine and weigh the necessity of an interruption against its sometimes negative results, such as medical errors, decreased efficiency, and increased costs. The study presented here is an initial step toward understanding the nature, causes, and effects of interruptions, and thereby improving both the quality of healthcare and patient safety. We developed an ethnographic data collection technique and a data coding method for capturing and analyzing interruptions. The interruption data we collected are systematic, comprehensive, and close to exhaustive. They confirm findings from earlier studies by other researchers that interruptions are frequent events in critical care and other healthcare settings. We are currently using these data to analyze the workflow dynamics of ED clinicians, to identify bottlenecks in information flow, and to develop interventions to improve the efficiency of emergency care through the management of interruptions.
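A coding-and-tally scheme of the kind the abstract describes, recording each interruption's source and who initiated it, might look like the sketch below. The field names and categories are illustrative assumptions; the study's actual coding taxonomy is not given here.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical coding record for one shadowing observation.
@dataclass
class Interruption:
    source: str      # e.g. "telephone", "pager", "face-to-face"
    initiator: str   # "nurse" (nurse initiated) or "other" (nurse was recipient)

# A few illustrative coded events from a shadowing session.
events = [
    Interruption("telephone", "other"),
    Interruption("face-to-face", "nurse"),
    Interruption("pager", "other"),
    Interruption("telephone", "other"),
]

by_source = Counter(e.source for e in events)
by_initiator = Counter(e.initiator for e in events)
print(by_source.most_common())   # most frequent interruption sources
print(by_initiator)              # nurses as recipients vs. initiators
```

Aggregating coded events this way is what lets frequency claims ("telephones were the most common source") be checked against the raw observation log.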

Relevance:

80.00%

Publisher:

Abstract:

An understanding of interruptions in healthcare is important for the design, implementation, and evaluation of health information systems and for the management of clinical workflow and medical errors. The purpose of this study was to identify and classify the types of interruptions experienced by ED nurses working in a Level One Trauma Center. This was an observational field study of Registered Nurses employed in a Level One Trauma Center, using the shadowing method. Results indicate that nurses were both recipients and initiators of interruptions. Telephones, pagers, and face-to-face conversations were the most common sources of interruptions. Unlike other industries, healthcare has not systematically studied the outcomes caused by interruptions, such as medical errors, decreased efficiency, and increased costs. The study presented here is an initial step toward understanding the nature, causes, and effects of interruptions, and toward developing interventions to manage interruptions and improve healthcare quality and patient safety. We developed an ethnographic data collection technique and a data coding method for capturing and analyzing interruptions. The interruption data we collected are systematic, comprehensive, and close to exhaustive. They confirm findings from earlier studies by other researchers that interruptions are frequent events in critical care and other healthcare settings. We are currently using these data to analyze the workflow dynamics of ED clinicians, identify bottlenecks in information flow, and develop interventions to improve the efficiency of emergency care through the management of interruptions.

Relevance:

80.00%

Publisher:

Abstract:

Second-generation antipsychotics (SGAs) are increasingly prescribed to treat psychiatric symptoms in pediatric patients infected with HIV. We examined the relationship between prescribed SGAs and physical growth in a cohort of youth with perinatally acquired HIV-1 infection. Pediatric AIDS Clinical Trials Group (PACTG) Protocol 219C (P219C), a multicenter, longitudinal observational study of children and adolescents perinatally exposed to HIV, was conducted from September 2000 until May 2007. The analysis included P219C participants who were perinatally HIV-infected, 3-18 years old, prescribed their first SGA for at least 1 month, and had baseline data available prior to starting the first SGA. Each participant prescribed an SGA was matched (on gender, age, Tanner stage, and baseline body mass index [BMI] z score) with 1-3 controls without antipsychotic prescriptions. The main outcomes were short-term (approximately 6 months) and long-term (approximately 2 years) changes in BMI z scores from baseline. There were 236 participants in the short-term and 198 in the long-term analysis. In linear regression models, youth with SGA prescriptions had increased BMI z scores relative to youth without antipsychotic prescriptions, for all SGAs (short-term increase = 0.192, p = 0.003; long-term increase = 0.350, p < 0.001) and for risperidone alone (short-term = 0.239, p = 0.002; long-term = 0.360, p = 0.001). Participants receiving both protease inhibitors (PIs) and SGAs showed especially large increases. These findings suggest that growth should be carefully monitored in youth with perinatally acquired HIV who are prescribed SGAs. Future research should investigate the interaction between PIs and SGAs in children and adolescents with perinatally acquired HIV infection.
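The linear regression comparison described above, change in BMI z score regressed on SGA exposure with adjustment for baseline, can be sketched on simulated data. The effect size below loosely echoes the reported short-term estimate (+0.192), but the data and the single adjustment covariate are illustrative assumptions, not the study's model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical matched-cohort data (illustrative only).
n = 236
sga = rng.integers(0, 2, n).astype(float)   # 1 = prescribed an SGA
baseline_z = rng.normal(0, 1, n)            # baseline BMI z score
delta_z = 0.19 * sga + 0.05 * baseline_z + rng.normal(0, 0.4, n)

# OLS of BMI z-score change on SGA exposure, adjusting for baseline z.
X = np.column_stack([np.ones(n), sga, baseline_z])
beta, *_ = np.linalg.lstsq(X, delta_z, rcond=None)
print(f"estimated SGA effect on BMI z change: {beta[1]:.3f}")
```

With a true simulated effect of 0.19 and this sample size, the fitted coefficient lands near the generating value, which is the pattern of estimate the abstract reports.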

Relevance:

80.00%

Publisher:

Abstract:

The purpose of this prospective observational field study was to present a model for measuring energy expenditure among nurses and to determine whether there was a difference between the energy expenditure of nurses providing direct care to adult patients on general medical-surgical units in two major metropolitan hospitals and a recommended energy expenditure of 3.0 kcal/minute over 8 hours. One-third of the predicted cycle-ergometer VO2max for the study population was used to calculate the recommended energy expenditure. Two methods were used to measure energy expenditure among participants during an 8-hour day shift. First, the Energy Expenditure Prediction Program (EEPP), developed by the University of Michigan Center for Ergonomics, was used to calculate energy expenditure from activity recordings made by observation (OEE; n = 39). The second method used ambulatory electrocardiography and the heart rate-oxygen consumption relationship (HREE; n = 20) to measure energy expenditure. It was concluded that energy expenditure among nurses can be estimated using the EEPP. Using classification systems from previous research, the workload of the study population was categorized as "moderate" but was significantly less (p = 0.021) than 3.0 kcal/minute over 8 hours, or 1/3 of the predicted VO2max. In addition, the relationships between OEE, body-part discomfort (BPCDS) and mental workload (MWI) were evaluated. The relationships between OEE/BPCDS and OEE/MWI were not significant (p = 0.062 and 0.091, respectively). Among the study population, body-part discomfort had increased significantly for the upper arms, mid-back, lower back, legs and feet by mid-shift; by the end of the shift, the increase was also significant for the neck and thighs. The study also provided documentation of a comprehensive list of nursing activities. Among the most important findings were that the study population spent 23% of the workday in a bent posture, walked an average of 3.14 miles, and spent two-thirds of the shift on activities other than direct patient care, such as paperwork and communicating with other departments. A discussion is provided of the ergonomic implications of these findings.
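The heart rate-oxygen consumption (HREE) method works by fitting an individual HR-VO2 calibration line and applying it to ambulatory heart-rate recordings. The sketch below uses invented calibration values and a simplified constant shift heart rate; none of these numbers come from the study.

```python
import numpy as np

# Individual calibration: VO2 (L/min) measured at several heart rates
# (hypothetical values for one participant).
hr_cal = np.array([70.0, 90.0, 110.0, 130.0])   # beats per minute
vo2_cal = np.array([0.35, 0.65, 0.95, 1.25])    # litres O2 per minute

# Linear HR-VO2 relationship fitted per participant.
slope, intercept = np.polyfit(hr_cal, vo2_cal, 1)

# Ambulatory ECG gives minute-by-minute heart rate over the 8-hour shift;
# a constant rate is used here purely to keep the sketch short.
hr_shift = np.full(480, 95.0)
vo2 = slope * hr_shift + intercept    # estimated VO2 per minute (L/min)
kcal = vo2 * 4.85                     # ~4.85 kcal per litre of O2 consumed
print(f"shift energy expenditure: {kcal.sum():.0f} kcal "
      f"({kcal.mean():.2f} kcal/min)")
```

Comparing the resulting kcal/minute figure against the 3.0 kcal/minute benchmark is the comparison the study's conclusion rests on.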

Relevance:

80.00%

Publisher:

Abstract:

Knee osteoarthritis (OA) is the most prevalent form of arthritis in the US, affecting approximately 37% of adults. Approximately 300,000 total knee arthroplasty (TKA) procedures take place in the United States each year. Total knee arthroplasty is an elective procedure available to patients as an irreversible treatment after failure of previous medical treatments. Some patients sacrifice quality of life and endure many years of pain before making the decision to undergo total knee replacement. In making their decision, it is therefore imperative for patients to understand the procedure, its risks, and its surgical outcomes in order to create realistic expectations and increase outcome satisfaction. From 2004 to 2007, 236 OA patients who underwent TKA participated in the PEAKS (Patient Expectations About Knee Surgery) study, an observational longitudinal cohort study, completing baseline and 6-month post-surgery follow-up questionnaires. We performed a secondary data analysis of the PEAKS study to: (1) determine the specific presurgical patient characteristics associated with patients' presurgical expectations of time to functional recovery; and (2) determine the association between presurgical expectations of time to functional recovery and postsurgical patient capabilities (6 months after TKA). We utilized the WOMAC to measure knee pain and function, the SF-36 to measure health-related quality of life, and the DASS and MOS-SSS to measure psychosocial quality-of-life variables. Expectation and capability measures were generated by a panel of experts. A list of 10 activities was used in this analysis to measure functional expectations and postoperative functional capabilities. The final cohort of 236 individuals was predominantly White, with 154 women and 82 men; the mean age was 65 years. Patients were optimistic about their time to functional recovery: the median expected time to be able to perform the listed activities was less than 3 months. Patients who expected to be able to perform the functional activities by 3 months had better knee function, less pain, and better overall health-related quality of life. Despite differences in expectations, all patients showed significant improvement 6 months after surgery. Participant expectation of time to functional recovery was not an independent predictor of the capability to perform functional activities at 6 months. Better presurgical patient characteristics were, however, associated with a higher likelihood of being able to perform all activities at 6 months. This study gives initial insight into the relationship between presurgical patient characteristics and expectations of functional recovery after total knee replacement. Future studies clarifying the relationship between presurgical characteristics and postsurgical functional capabilities are needed.

Relevance:

80.00%

Publisher:

Abstract:

Long Term Acute Care Hospitals (LTACHs), which serve medically complex patients, have grown tremendously in recent years, expanding the number of Medicare patient admissions and thus increasing Medicare expenditures (Stark 2004). In an attempt to mitigate the rapid growth of LTACHs and reduce related Medicare expenditures, Congress enacted Section 114 of P.L. 110-173 (§114) of the Medicare, Medicaid and SCHIP Extension Act (MMSEA) on December 29, 2007 to regulate the LTACH industry. MMSEA increased medical-necessity reviews for Medicare admissions, imposed a moratorium on new LTACHs, and allowed the Centers for Medicare and Medicaid Services (CMS) to recoup Medicare overpayments for unnecessary admissions. This study examines whether MMSEA affected LTACH admissions, operating margins, and efficiency. These objectives were analyzed by comparing LTACH data for 2008 (post-MMSEA) with data for 2006-2007 (pre-MMSEA). Secondary data were utilized from the American Hospital Association (AHA) database and the American Hospital Directory (AHD). This is a longitudinal retrospective study with a total sample of 55 LTACHs, selected from 396 LTACH facilities that were fully operational during the study period of 2006-2008. The research found no statistically significant change in total Medicare admissions; there was a small, non-significant reduction of 5% in Medicare admissions for 2008 relative to 2006. A statistically significant decrease in mean operating margins was confirmed between 2006 and 2008. The LTACHs' Technical Efficiency (TE), as computed by Data Envelopment Analysis (DEA), showed a significant decrease over the same period. Thirteen of the 55 LTACHs in the sample (24%) were calculated as "efficient" by the DEA analysis in 2006; this dropped to 13% (7/55) in 2008. Longitudinally, the DEA extension technique (the Malmquist Index, or MI) indicated a deterioration of 10% in efficiency over the same period. Interestingly, however, when the sample was stratified into high-efficiency versus low-efficiency subgroups (approximately 25% in each group), a comparison of the MIs suggested a significant improvement in Efficiency Change (EC) for the least efficient LTACHs (MI = 0.92022) and a reduction in efficiency for the most efficient (MI = 1.38761) over the same period. While a reduction in efficiency for the most efficient is unexpected, it is not particularly surprising, since efficiency measures can vary over time. An improvement in efficiency for the least efficient should be expected, however, as those LTACHs begin to manage expenses (and controllable resources) more carefully to offset the payment/reimbursement pressures on their margins from MMSEA.
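DEA scores a hospital's technical efficiency by solving one linear program per unit: minimize the input-contraction factor theta subject to a convex combination of peers producing at least the unit's outputs with no more than theta times its inputs. The sketch below implements the input-oriented CCR model on five hypothetical hospitals with one input and one output; the study's actual input/output variables are not specified here.

```python
import numpy as np
from scipy.optimize import linprog

# 5 hypothetical hospitals: one input (operating cost) and one output
# (admissions). Illustrative numbers only.
inputs = np.array([[100.0], [120.0], [80.0], [150.0], [90.0]])
outputs = np.array([[50.0], [55.0], [45.0], [60.0], [30.0]])
n = len(inputs)

def dea_efficiency(j):
    """Input-oriented CCR DEA: min theta s.t.
    X^T lam <= theta * X_j,  Y^T lam >= Y_j,  lam >= 0."""
    # Decision variables: [theta, lam_1, ..., lam_n]
    c = np.r_[1.0, np.zeros(n)]
    # Input constraint rows: X^T lam - theta*X_j <= 0
    A1 = np.c_[-inputs[j].reshape(-1, 1), inputs.T]
    b1 = np.zeros(inputs.shape[1])
    # Output constraint rows: -Y^T lam <= -Y_j
    A2 = np.c_[np.zeros((outputs.shape[1], 1)), -outputs.T]
    b2 = -outputs[j]
    res = linprog(c, A_ub=np.vstack([A1, A2]), b_ub=np.r_[b1, b2],
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]

scores = [round(dea_efficiency(j), 3) for j in range(n)]
print(scores)  # a score of 1.0 marks an efficient hospital
```

The Malmquist Index the abstract uses then compares each unit's DEA frontier distance across two periods, decomposing the change into efficiency change and frontier shift.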

Relevance:

80.00%

Publisher:

Abstract:

Ascertaining the family health history (FHH) may provide insight into genetic and environmental susceptibilities specific to a variety of chronic diseases, including type II diabetes mellitus. However, discussion of FHH during patient-provider encounters has been limited and uncharacterized. A longitudinal, observational study was conducted to compare the content of FHH topics in a convenience sample of 37 patients, 13 new and 24 established. Each patient had an average of three follow-up encounters, involving 6 staff physicians at the Audie L. Murphy Memorial Veterans Hospital (VHA) in San Antonio, TX, from 2003 to 2005. A total of 131 encounters were analyzed. The average age of the selected population was 68 years; the sample included 35 males and two females. Transcriptions of encounters were obtained, coded, and analyzed in NVivo 8. Of the 131 encounters transcribed for the 37 patients, only 24 (18.3%) included discussion of FHH. Additionally, the relationship between FHH discussion and discussion of self-care management (SCM) topics was assessed. Providers were more likely to initiate discussion of family health history with new patients in the first encounter (OR for new patients = 8.55, 95% CI: 1.49-52.90). Discussion of FHH occurred sporadically among established patients throughout the longitudinal study, with no apparent pattern. Provider-initiated FHH discussion most frequently reached a satisfactory level of discussion, while patient-initiated FHH discussion most frequently remained at a minimal level. FHH discussion most often involved cancer and cardiovascular disease among first-degree familial relationships. Overall, family health histories are a largely underutilized tool in personalized preventive care.
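An odds ratio with a 95% confidence interval, like the one reported for new patients above, comes from a 2x2 table of exposure (new vs. established patient) against outcome (FHH discussed or not). The counts below are hypothetical, chosen only to show the arithmetic; they are not the study's data and do not reproduce its OR of 8.55.

```python
import math

# Hypothetical 2x2 table:        FHH discussed   not discussed
a, b = 9, 4    # new patients
c, d = 6, 18   # established patients

odds_ratio = (a * d) / (b * c)

# Wald 95% CI on the log-odds scale.
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

The very wide interval reported in the abstract (1.49-52.90) is typical of small cell counts, where the log-OR standard error is dominated by the smallest cells.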

Relevance:

80.00%

Publisher:

Abstract:

To address concerns about the possible effect of drilling mud discharges on shallow, low-energy estuarine ecosystems, a 12-month study was designed to detect alterations in water quality and sediment geochemistry. Each drilling mud used in the study, and sediments from the study site, were analyzed in the laboratory for chemical and physical characteristics. Potential water quality impacts were simulated by the EPA-COE elutriation test procedure. Mud toxicity was measured by acute and chronic bioassays with Mysidopsis bahia, Mercenaria mercenaria, and Nereis virens. For the field study, a relatively pristine, shallow (1.2 m) estuary (Christmas Bay, TX) with no drilling activity for the last 30 years was chosen as the study site. After a three-month baseline study, three stations were selected. Station 1 was an external control. At each treatment station (2, 3), mesocosms were constructed to enclose a 3.5 m³ water column; each treatment station also included an internal control site. Each in situ mesocosm, except the controls, was successively dosed at a mesocosm-specific dilution (1:100, 1:1,000, or 1:10,000 v/v) with 4 field-collected drilling muds (spud, nondispersed, lightly-treated, and heavily-treated lignosulfonate) in sequential order over 1.5 months. Twenty-four hours after each dose, water exchange was allowed until the next treatment. Station 3 was destroyed by a winter storm. After the last treatment, the enclosures were removed and the remaining sites were monitored for 6 months. One additional site was similarly dosed (1:100 v/v) with clean dredged sediment from Christmas Bay for comparison between dredged sediments and drilling muds. Results of the analysis of the water samples and field measurements showed that water quality was impacted during the discharges, primarily at the highest dose (1:100 v/v), but that elevated levels of C, Cr (T, F), Cr³⁺ (T, F), N, Pb, and Zn returned to ambient levels before the end of the 24-hour exposure period, or immediately after water exchange was allowed (Al, Ba (T), chlorophyll ABC, SS, %T). Barium, from the barite, was used as a geochemical tracer in the sediments to confirm estimated doses by mass-balance calculations. Barium reached a maximum of 166x background levels at the high-dose mesocosm. Barium levels returned to ambient or only slightly elevated levels by the end of the 6-month monitoring period, owing to sediment deposition, resuspension, and bioturbation. QA/QC results, using blind samples consisting of lab standards and spiked samples for both water and sediment matrices, were within acceptable coefficients of variation. To avoid impacts on water quality and sediment geochemistry in a shallow estuarine ecosystem, this study concluded that a minimum dilution of 1:1,000 (v/v) would be required in addition to existing regulatory constraints.
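The dosing arithmetic behind the study's 1:100, 1:1,000, and 1:10,000 (v/v) treatments of a 3.5 m³ enclosed water column reduces to a simple volume calculation. The helper function below is illustrative; only the column volume and dilution ratios come from the abstract.

```python
# Volume of drilling mud needed to reach a nominal 1:dilution (v/v) dose
# in an enclosed water column of the given size.
def mud_volume_litres(column_m3: float, dilution: float) -> float:
    """Litres of mud for a 1:dilution (v/v) dose (1 m^3 = 1000 L)."""
    return column_m3 * 1000.0 / dilution

for d in (100, 1_000, 10_000):
    print(f"1:{d} v/v -> {mud_volume_litres(3.5, d):.2f} L of mud")
```

So the highest dose corresponds to 35 L of mud per mesocosm, and the dilution the study recommends as a minimum (1:1,000) to a tenth of that.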