214 results for consecutive
Abstract:
Background: Right-to-left shunting via a patent foramen ovale (PFO) has a recognized association with embolic events in younger patients. The use of agitated saline contrast imaging (ASCi) for detecting atrial shunting is well documented; however, the optimal technique is not well described. The purpose of this study is to assess the efficacy and safety of ASCi via TTE for assessment of right-to-left atrial communication in a large cohort of patients. Method: A retrospective review was undertaken of 1162 consecutive transthoracic (TTE) ASCi studies, of which 195 had also undergone clinically indicated transesophageal (TEE) echo. ASCi shunt results were compared with color flow imaging (CFI) and the role of provocative maneuvers (PM) assessed. Results: 403 TTE studies (35%) had paradoxical shunting seen during ASCi. Of these, 48% were positive with PM only. There was strong agreement between TTE ASCi and reported TEE findings (99% sensitivity, 85% specificity), with six false positive and two false negative results. In hindsight, the latter were likely due to suboptimal right atrial opacification, and the former due to transpulmonary shunting. TTE CFI was found to be insensitive (22%) for the detection of a PFO compared with TTE ASCi. Conclusions: TTE ASCi is minimally invasive and highly accurate for the detection of right-to-left atrial communication when PM are used. TTE CFI was found to be insensitive for PFO screening. It is recommended that TTE ASCi be considered the initial diagnostic tool for the detection of PFO in clinical practice. A dedicated protocol should be followed to ensure adequate agitated saline contrast delivery and performance of provocative maneuvers.
Abstract:
The technique of femoral cement-in-cement revision is well established, but there are no previous series reporting its use on the acetabular side at the time of revision total hip arthroplasty. We describe the surgical technique and report the outcome of 60 consecutive cement-in-cement revisions of the acetabular component at a mean follow-up of 8.5 years (range 5-12 years). All had a radiologically and clinically well-fixed acetabular cement mantle at the time of revision. 29 patients died; no case was lost to follow-up. The 2 most common indications for acetabular revision were recurrent dislocation (77%) and to complement a femoral revision (20%). There were 2 cases of aseptic cup loosening (3.3%) requiring re-revision. No other hip was clinically or radiologically loose (96.7%) at latest follow-up. One case was re-revised for infection, 4 for recurrent dislocation and 1 for disarticulation of a constrained component. At 5 years, the Kaplan-Meier survival rate was 100% for aseptic loosening and 92.2% (95% CI 84.8-99.6%) with revision for all causes as the endpoint. These results support the use of the cement-in-cement revision technique in appropriate cases on the acetabular side. Theoretical advantages include preservation of bone stock, reduced operating time, reduced risk of complications and durable fixation.
Abstract:
Objective. The aim of this paper is to report the clinical practice changes resulting from strategies to standardise diabetic foot clinical management in three diverse ambulatory service sites in Queensland, Australia. Methods. Multifaceted strategies were implemented in 2008, including: multidisciplinary teams, clinical pathways, clinical training, clinical indicators, and telehealth support. Prior to the intervention, none of the aforementioned strategies were used, except one site had a basic multidisciplinary team. A retrospective audit of consecutive patient records from July 2006 to June 2007 determined baseline clinical activity (n = 101). A clinical pathway teleform was implemented as a clinical activity analyser in 2008 (n = 327) and followed up in 2009 (n = 406). Pre- and post-implementation data were analysed using Chi-square tests with a significance level set at P < 0.05. Results. There was an improvement in surveillance of the high risk population of 34% in 2008 and 19% in 2009, and in treating according to risk of 15% in 2009 (P < 0.05). The documentation of all best-practice clinical activities performed improved 13–66% (P < 0.03). Conclusion. These findings support the use of multifaceted strategies to standardise practice and improve diabetic foot complications management in diverse ambulatory services.
Abstract:
Background Diabetic foot complications are recognised as the most common reason for diabetes-related hospitalisation and lower extremity amputations. Multi-faceted strategies to reduce diabetic foot hospitalisation and amputation rates have been successful. However, most diabetic foot ulcers are managed in ambulatory settings where data availability is poor and studies limited. The project aimed to develop and evaluate strategies to improve the management of diabetic foot complications in three diverse ambulatory settings and measure the subsequent impact on hospitalisation and amputation. Methods Multifaceted strategies were implemented in 2008, including: multi-disciplinary teams, clinical pathways and training, clinical indicators, telehealth support and surveys. A retrospective audit of consecutive patient records from July 2006 – June 2007 determined baseline clinical indicators (n = 101). A clinical pathway teleform was implemented as a clinical record and clinical indicator analyser in all sites in 2008 (n = 327) and followed up in 2009 (n = 406). Results Prior to the intervention, clinical pathways were not used and multi-disciplinary teams were limited. There was an absolute improvement in treating according to risk of 15% in 2009, and in surveillance of the high risk population of 34% and 19% in 2008 and 2009 respectively (p < 0.001). Improvements of 13 – 66% (p < 0.001) were recorded in 2008 for individual clinical activities, to a performance > 92% in perfusion, ulcer depth, infection assessment and management, offloading and education. Hospitalisation impacts included reductions of up to 64% in amputation rates per 100,000 population (p < 0.001) and of 24% in average length of stay (p < 0.001). Conclusion These findings support the use of multi-faceted strategies in diverse ambulatory services to standardise practice, improve diabetic foot complications management and positively impact on hospitalisation outcomes.
As of October 2010, these strategies had been rolled out to over 25 ambulatory sites, representing 66% of Queensland Health districts, managing 1,820 patients and 13,380 occasions of service, including 543 healed ulcer patients. It is expected that this number will rise dramatically as an incentive payment for the use of the teleform is expanded.
Abstract:
OBJECTIVES: To measure the thickness at which primary schoolchildren apply sunscreen on school day mornings and to compare it with the thickness (2.00 mg/cm²) at which sunscreen is tested during product development, as well as to investigate how application thickness was influenced by age of the child (school grades 1-7) and by dispenser type (500-mL pump, 125-mL squeeze bottle, or 50-mL roll-on). DESIGN: A crossover quasiexperimental study design comparing 3 sunscreen dispenser types. SETTING: Children aged 5 to 12 years from public primary schools (grades 1-7) in Queensland, Australia. PARTICIPANTS: Children (n=87) and their parents randomly recruited from the enrollment lists of 7 primary schools. Each child provided up to 3 observations (n=258). INTERVENTION: Children applied sunscreen during 3 consecutive school weeks (Monday through Friday) for the first application of the day using a different dispenser each week. MAIN OUTCOME MEASURE: Thickness of sunscreen application (in milligrams per square centimeter). The dispensers were weighed before and after use to calculate the weight of sunscreen applied. This was divided by the coverage area of application (in square centimeters), which was calculated by multiplying the children's body surface area by the percentage of the body covered with sunscreen. RESULTS: Children applied their sunscreen at a median thickness of 0.48 mg/cm². Children applied significantly more sunscreen when using the pump (0.75 mg/cm²) and the squeeze bottle (0.57 mg/cm²) compared with the roll-on (0.22 mg/cm²) (P<.001 for both). CONCLUSIONS: Regardless of age, primary schoolchildren apply sunscreen at substantially less than 1.00 mg/cm², similar to what has been observed among adults. Some sunscreen dispensers seem to facilitate thicker application than others.
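The outcome measure above is a simple quotient: mass of sunscreen applied divided by the area it covered. A minimal sketch of that calculation follows; the function name and the example numbers are illustrative assumptions, not values from the study.

```python
# Sketch of the application-thickness calculation described in the abstract.
# All example inputs below are hypothetical, not study data.

def application_thickness_mg_per_cm2(weight_before_g, weight_after_g,
                                     body_surface_area_cm2, fraction_covered):
    """Thickness (mg/cm^2) = mass applied (mg) / coverage area (cm^2)."""
    mass_applied_mg = (weight_before_g - weight_after_g) * 1000.0
    coverage_area_cm2 = body_surface_area_cm2 * fraction_covered
    return mass_applied_mg / coverage_area_cm2

# Example: a dispenser 1.2 g lighter after use, a child with ~9000 cm^2
# body surface area, sunscreen applied to ~25% of the body:
thickness = application_thickness_mg_per_cm2(101.2, 100.0, 9000, 0.25)
print(round(thickness, 2))  # 1200 mg over 2250 cm^2 -> 0.53 mg/cm^2
```

Dividing by the covered fraction of body surface area (rather than total surface area) is what makes the result comparable with the 2.00 mg/cm² laboratory testing standard.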
Abstract:
Purpose. To evaluate the use of optical coherence tomography (OCT) to assess the effect of different soft contact lenses on corneoscleral morphology. Methods. Ten subjects had anterior segment OCT B-scans taken in the morning and again after six hours of soft contact lens wear. For each subject, three different contact lenses were used in the right eye on non-consecutive days: a hydrogel sphere, a silicone hydrogel sphere and a silicone hydrogel toric. After image registration and layer segmentation, analyses were performed of the first hyper-reflective layer (HRL), the epithelial basement membrane (EBL) and the epithelial thickness (HRL to EBL). A root mean square difference (RMSD) of the layer profiles, and the thickness change between the morning and afternoon measurements, were used to assess the effect of the contact lens on the corneoscleral morphology. Results. The soft contact lenses had a statistically significant effect on the morphology of the anterior segment layers (p < 0.001). The average amounts of change for the three lenses (average RMSD values) for the corneal region were lower (3.93±1.95 µm for the HRL and 4.02±2.14 µm for the EBL) than those measured in the limbal/scleral region (11.24±6.21 µm for the HRL and 12.61±6.42 µm for the EBL). Similarly, averaged across the three lenses, the RMSD in epithelial thickness was lower in the corneal (2.84±0.84 µm) than the limbal/scleral (5.47±1.71 µm) region. Post-hoc analysis showed that ocular surface changes were significantly smaller with the silicone hydrogel sphere lens than with both the silicone hydrogel toric (p < 0.005) and the hydrogel sphere (p < 0.02) for the combined HRL and EBL data. Conclusions. In this preliminary study, we have shown that soft contact lenses can produce small but significant changes in the morphology of the limbal/scleral region and that OCT technology is useful in assessing these changes. The clinical significance of these changes is yet to be determined.
Abstract:
Recent research indicates that brief periods (60 minutes) of monocular defocus lead to small but significant changes in human axial length. However, the effects of longer periods of defocus on the axial length of human eyes are unknown. We examined the influence of a 12 hour period of monocular myopic defocus on the natural daily variations occurring in axial length and choroidal thickness of young adult emmetropes. A series of axial length and choroidal thickness measurements (collected at ~3 hourly intervals, with the first measurement at ~9 am and the final measurement at ~9 pm) were obtained for 13 emmetropic young adults over three consecutive days. The natural daily rhythms (Day 1, baseline day, no defocus), the daily rhythms with monocular myopic defocus (Day 2, defocus day, +1.50 DS spectacle lens over the right eye), and the recovery from any defocus induced changes (Day 3, recovery day, no defocus) were all examined. Significant variations over the course of the day were observed in both axial length and choroidal thickness on each of the three measurement days (p<0.0001). The magnitude and timing of the daily variations in axial length and choroidal thickness were significantly altered with the monocular myopic defocus on day 2 (p<0.0001). Following the introduction of monocular myopic defocus, the daily peak in axial length occurred approximately 6 hours later, and the peak in choroidal thickness approximately 8.5 hours earlier in the day compared to days 1 and 3 (with no defocus). The mean amplitude (peak to trough) of change in axial length (0.030 ± 0.012 on day 1, 0.020 ± 0.010 on day 2 and 0.033 ± 0.012 mm on day 3) and choroidal thickness (0.030 ± 0.007 on day 1, 0.022 ± 0.006 on day 2 and 0.027 ± 0.009 mm on day 3) were also significantly different between the three days (both p<0.05). 
The introduction of monocular myopic defocus disrupts the daily variations in axial length and choroidal thickness of human eyes (in terms of both amplitude and timing); these variations return to normal the day after the defocus is removed.
Abstract:
Purpose. To compare radiological records of 90 consecutive patients who underwent cemented total hip arthroplasty (THA) with or without use of the Rim Cutter to prepare the acetabulum. Methods. The acetabulum of 45 patients was prepared using the Rim Cutter, whereas the device was not used in the other 45 patients. Postoperative radiographs were evaluated using a digital templating system to measure (1) the positions of the operated hips with respect to the normal, contralateral hips (the centre of rotation of the socket, the height of the centre of rotation from the teardrop, and lateralisation of the centre of rotation from the teardrop) and (2) the uniformity and width of the cement mantle in the 3 DeLee and Charnley acetabular zones, and the number of radiolucencies in these zones. Results. The study group showed improved radiological parameters: hips were closer to the anatomic centre of rotation both vertically (1.5 vs. 3.7 mm, p < 0.001) and horizontally (1.8 vs. 4.4 mm, p < 0.001) and had consistently thicker and more uniform cement mantles (p < 0.001). There were 2 radiolucent lines in the control group but none in the study group. Conclusion. The Rim Cutter resulted in more accurate placement of the centre of rotation of a cemented prosthetic socket, and produced a thicker, more congruent cement mantle with fewer radiolucent lines.
Abstract:
Critically ill patients receiving extracorporeal membrane oxygenation (ECMO) are often noted to have increased sedation requirements. However, data related to sedation in this complex group of patients are limited. The aim of our study was to characterise the sedation requirements in adult patients receiving ECMO for cardiorespiratory failure. A retrospective chart review was performed to collect sedation data for 30 consecutive patients who received venovenous or venoarterial ECMO between April 2009 and March 2011. To test for a difference in doses over time we used a regression model. The dose of midazolam received on ECMO support increased by an average of 18 mg per day (95% confidence interval 8, 29 mg, P=0.001), while the dose of morphine increased by 29 mg per day (95% confidence interval 4, 53 mg, P=0.021). The venovenous group received a daily midazolam dose that was 157 mg higher than the venoarterial group (95% confidence interval 53, 261 mg, P=0.005). We did not observe any significant increase in fentanyl doses over time (95% confidence interval −1269, 4337 µg, P=0.94). There is a significant increase in dose requirement for morphine and midazolam during ECMO. Patients on venovenous ECMO received higher sedative doses as compared to patients on venoarterial ECMO. Future research should focus on mechanisms behind these changes and also identify drugs that are most suitable for sedation during ECMO.
Abstract:
We examined the variation in association between high temperatures and elderly mortality (age ≥ 75 years) from year to year in 83 US cities between 1987 and 2000. We used a Poisson regression model and decomposed the mortality risk for high temperatures into: a “main effect” due to high temperatures, modelled using a lagged non-linear function, and an “added effect” due to consecutive high temperature days. We pooled yearly effects at both regional and national levels. The high temperature effects (both main and added effects) on elderly mortality varied greatly from year to year. In every city there was at least one year in which higher temperatures were associated with lower mortality. Years with relatively high heat-related mortality were often followed by years with relatively low mortality. These year-to-year changes have important consequences for heat-warning systems and for predictions of heat-related mortality due to climate change.
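The "added effect" in a decomposition like the one above requires an exposure variable marking days that belong to a run of consecutive hot days, which then enters the Poisson regression alongside the main temperature term. A minimal sketch of how such an indicator might be constructed is shown below; the 35 °C threshold and the 2-day run definition are illustrative assumptions, not the study's actual specification.

```python
# Hypothetical construction of a "consecutive hot days" indicator of the
# kind used as an added-effect exposure in heat-mortality models.
# Threshold and minimum run length are illustrative assumptions.

def consecutive_heat_indicator(temps, threshold=35.0, min_run=2):
    """Return 1 for each day belonging to a run of >= min_run days
    with temperature above the threshold, else 0."""
    above = [t > threshold for t in temps]
    flags = [0] * len(temps)
    i = 0
    while i < len(temps):
        if above[i]:
            j = i
            while j < len(temps) and above[j]:
                j += 1            # scan to the end of this hot run
            if j - i >= min_run:
                for k in range(i, j):
                    flags[k] = 1  # flag every day in a qualifying run
            i = j
        else:
            i += 1
    return flags

temps = [33.0, 36.1, 36.5, 34.0, 37.2, 33.5, 36.0, 36.8, 37.0]
print(consecutive_heat_indicator(temps))  # [0, 1, 1, 0, 0, 0, 1, 1, 1]
```

Note that the isolated hot day (37.2 °C) is not flagged: by construction, the added effect captures only sustained heat, separating it from the main single-day temperature effect.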
Abstract:
The purpose of this study was to describe patterns of medical and nursing practice in the care of patients dying of oncological and hematological malignancies in the acute care setting in Australia. A tool validated in a similar American study was used to study the medical records of 100 consecutive patients who died of oncological or hematological malignancies before August 1999 at The Canberra Hospital in the Australian Capital Territory. The three major indicators of patterns of end-of-life care were documentation of Do Not Resuscitate (DNR) orders, evidence that the patient was considered dying, and the presence of a palliative care intention. Findings were that 88 patients were documented DNR, 63 patients' records suggested that the patient was dying, and 74 patients had evidence of a palliative care plan. Forty-six patients were documented DNR 2 days or less prior to death and, of these, 12 were documented the day of death. Similar patterns emerged for days between considered dying and death, and between palliative care goals and death. Sixty patients had active treatment in progress at the time of death. The late implementation of end-of-life management plans and the lack of consistency within these plans suggested that patients were subjected to medical interventions and investigations up to the time of death. Implications for palliative care teams include the need to educate health care staff and to plan and implement policy regarding the management of dying patients in the acute care setting. Although the health care system in Australia has cultural differences when compared to the American context, this research suggests that the treatment imperative to prolong life is similar to that found in American-based studies.
Abstract:
Passive air samplers (PAS) consisting of polyurethane foam (PUF) disks were deployed at 6 outdoor air monitoring stations in different land use categories (commercial, industrial, residential and semi-rural) to assess the spatial distribution of polybrominated diphenyl ethers (PBDEs) in the Brisbane airshed. Air monitoring sites covered an area of 1143 km² and PAS were allowed to accumulate PBDEs in the city's airshed over three consecutive seasons commencing in the winter of 2008. The average sum of five (∑5) PBDE (BDEs 28, 47, 99, 100 and 209) levels was highest at the commercial and industrial sites (12.7 ± 5.2 ng PUF⁻¹), which were relatively close to the city centre, and was a factor of 8 higher than at the residential and semi-rural sites located in outer Brisbane. To estimate the magnitude of the urban ‘plume’, an empirical exponential decay model was used to fit PAS data vs. distance from the CBD, with the best correlation observed when the particulate-bound BDE-209 was not included (∑5-209) (r² = 0.99), rather than ∑5 (r² = 0.84). At 95% confidence intervals the model predicts that, regardless of site characterisation, ∑5-209 concentrations in a PAS sample taken 4–10 km from the city centre would be half those in a sample taken at the city centre, reaching a baseline or plateau (0.6 to 1.3 ng PUF⁻¹) approximately 30 km from the CBD. The observed exponential decay in ∑5-209 levels over distance corresponded with Brisbane's decreasing population density (persons/km²) from the city centre. The residual error associated with the model increased significantly when BDE-209 levels were included, primarily due to the highest level (11.4 ± 1.8 ng PUF⁻¹) being consistently detected at the industrial site, indicating a potential primary source at this site. Active air samples collected alongside the PAS at the industrial air monitoring site (B) indicated that BDE-209 dominated the congener composition and was entirely associated with the particulate phase.
This study demonstrates that PAS are effective tools for monitoring citywide regional differences; however, interpretation of spatial trends for POPs that are predominantly associated with the particulate phase, such as BDE-209, may be restricted to identifying ‘hotspots’ rather than broad spatial trends.
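An exponential decay model of the general form described above, with a baseline plateau, can be sketched as follows. The parameter values are illustrative assumptions chosen only to reproduce the qualitative behaviour reported (halving of the excess within roughly 4–10 km and a plateau approached by about 30 km from the CBD); they are not the study's fitted coefficients.

```python
# Hedged sketch of an urban-plume exponential decay model:
#     C(d) = baseline + a * exp(-k * d)
# baseline, a and k below are hypothetical, not the study's fit.

import math

def pbde_level(distance_km, baseline=1.0, a=11.0, k=0.10):
    """Predicted sum-PBDE level (ng per PUF disk) at a given distance."""
    return baseline + a * math.exp(-k * distance_km)

# Distance at which the excess above baseline halves is ln(2) / k:
half_distance = math.log(2) / 0.10
print(round(half_distance, 1))  # 6.9 km, within the reported 4-10 km band
```

The halving distance depends only on the decay rate k, not on the city-centre level, which is why the model can make that prediction "regardless of site characterisation".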
Abstract:
Background: Despite important implications for the budgets, statistical power and generalisability of research findings, detailed reports of recruitment and retention in randomised controlled trials (RCTs) are rare. The NOURISH RCT evaluated a community-based intervention for first-time mothers that promoted protective infant feeding practices as a primary prevention strategy for childhood obesity. The aim of this paper is to provide a detailed description and evaluation of the recruitment and retention strategies used. Methods: A two stage recruitment process designed to provide a consecutive sampling framework was used. First time mothers delivering healthy term infants were initially approached in postnatal wards of the major maternity services in two Australian cities for consent to later contact (Stage 1). When infants were about four months old mothers were re-contacted by mail for enrolment (Stage 2), baseline measurements (Time 1) and subsequent random allocation to the intervention or control condition. Outcomes were assessed at infant ages 14 months (Time 2) and 24 months (Time 3). Results: At Stage 1, 86% of eligible mothers were approached and of these women, 76% consented to later contact. At Stage 2, 3% had become ineligible and 76% could be recontacted. Of the latter, 44% consented to full enrolment and were allocated. This represented 21% of mothers screened as eligible at Stage 1. Retention at Time 3 was 78%. Mothers who did not consent or discontinued the study were younger and less likely to have a university education. Conclusions: The consent and retention rates of our sample of first time mothers are comparable with or better than other similar studies. The recruitment strategy used allowed for detailed information from non-consenters to be collected; thus selection bias could be estimated. 
Recommendations for future studies include being able to contact participants via mobile phone (particularly text messaging), offering home visits to reduce participant burden, and considering the use of financial incentives to support participant retention.
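The recruitment figures reported above form a multiplicative funnel, and the quoted 21% overall rate follows directly from chaining the stage-by-stage proportions. The arithmetic can be checked as follows (the percentages are the ones reported in the abstract):

```python
# Recruitment funnel arithmetic from the abstract: each stage retains a
# proportion of the previous one, so the overall rate is their product.
approached     = 0.86  # of eligible mothers, approached at Stage 1
stage1_consent = 0.76  # of those approached, consented to later contact
still_eligible = 0.97  # 3% became ineligible by Stage 2
recontacted    = 0.76  # of those still eligible, could be re-contacted
stage2_consent = 0.44  # of those re-contacted, fully enrolled

overall = (approached * stage1_consent * still_eligible
           * recontacted * stage2_consent)
print(f"{overall:.0%}")  # 21%, matching the reported overall enrolment rate
```

Laying the funnel out this way also makes clear which stage dominates the attrition: the 44% full-enrolment consent at Stage 2 is the largest single loss.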
Abstract:
PURPOSE: To test the reliability of Timed Up and Go Tests (TUGTs) in cardiac rehabilitation (CR) and compare TUGTs to the 6-Minute Walk Test (6MWT) for outcome measurement. METHODS: Sixty-one of 154 consecutive community-based CR patients were prospectively recruited. Subjects undertook repeated TUGTs and 6MWTs at the start of CR (start-CR), postdischarge from CR (post-CR), and 6 months postdischarge from CR (6 months post-CR). The main outcome measurements were TUGT time (TUGTT) and 6MWT distance (6MWD). RESULTS: Mean (SD) TUGTT1 and TUGTT2 at the 3 assessments were 6.29 (1.30) and 5.94 (1.20); 5.81 (1.22) and 5.53 (1.09); and 5.39 (1.60) and 5.01 (1.28) seconds, respectively. A reduction in TUGTT occurred between each outcome point (P ≤ .002). Repeated TUGTTs were strongly correlated at each assessment, intraclass correlation (95% CI) = 0.85 (0.76–0.91), 0.84 (0.73–0.91), and 0.90 (0.83–0.94), despite a reduction between TUGTT1 and TUGTT2 of 5%, 5%, and 7%, respectively (P ≤ .006). Relative decreases in TUGTT1 (TUGTT2) occurred from start-CR to post-CR and from start-CR to 6 months post-CR of −7.5% (−6.9%) and −14.2% (−15.5%), respectively, while relative increases in 6MWD1 (6MWD2) occurred, 5.1% (7.2%) and 8.4% (10.2%), respectively (P < .001 in all cases). Pearson correlation coefficients for 6MWD1 to TUGTT1 and TUGTT2 across all times were −0.60 and −0.68 (P < .001) and the intraclass correlations (95% CI) for the speeds derived from averaged 6MWDs and TUGTTs were 0.65 (0.54, 0.73) (P < .001). CONCLUSIONS: Similar relative changes occurred for the TUGT and the 6MWT in CR. A significant correlation between the TUGTT and 6MWD was demonstrated, and we suggest that the TUGT may provide a related or a supplementary measurement of functional capacity in CR.
Abstract:
Background Overweight and obesity have become a serious public health problem in many parts of the world. Studies suggest that making small changes in daily activity levels, such as “breaking up” sedentary time (i.e., standing), may help mitigate the health risks of sedentary behavior. The aim of the present study was to examine time spent in standing (determined by count threshold), lying, and sitting postures (determined by inclinometer function) via the ActiGraph GT3X among sedentary adults with differing weight status based on body mass index (BMI) categories. Methods Participants included 22 sedentary adults (14 men, 8 women; mean age 26.5 ± 4.1 years). All subjects completed the self-report International Physical Activity Questionnaire to determine time spent sitting over the previous 7 days. Participants were included if they spent seven or more hours sitting per day. Postures were determined with the ActiGraph GT3X inclinometer function. Participants were instructed to wear the accelerometer for 7 consecutive days (24 h a day). BMI was categorized as: 18.5 to <25 kg/m² as normal, 25 to <30 kg/m² as overweight, and ≥30 kg/m² as obese. Results After adjustment for moderate-to-vigorous intensity physical activity and wear-time, participants in the normal weight (n = 10) and overweight (n = 6) groups spent significantly more time standing (6.7 h and 7.3 h respectively) and less time sitting (7.1 h and 6.9 h respectively) than those in the obese (n = 6) category (5.5 h standing and 8.0 h sitting) (p < 0.001). There were no significant differences in standing and sitting time between the normal weight and overweight groups (p = 0.051 and p = 0.670 respectively). Differences among groups in lying time were not significant (p = 0.55). Conclusion This study described postural allocations (standing, lying, and sitting) among normal weight, overweight, and obese sedentary adults.
The results provide additional evidence for promoting increased standing time as part of obesity prevention strategies.
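The BMI cut-points used to group participants in the study above translate directly into a small classifier. A minimal sketch follows; the "underweight" label for BMI below 18.5 kg/m² is an addition for completeness, as the study's categories started at the normal range.

```python
# BMI categorisation with the cut-points stated in the abstract (kg/m^2).
# The "underweight" branch below 18.5 is an assumption added for
# completeness; the study only defined normal, overweight and obese.

def bmi_category(bmi_kg_m2):
    """Map a BMI value to the study's weight-status category."""
    if bmi_kg_m2 < 18.5:
        return "underweight"
    if bmi_kg_m2 < 25:
        return "normal"      # 18.5 to <25
    if bmi_kg_m2 < 30:
        return "overweight"  # 25 to <30
    return "obese"           # >= 30

print(bmi_category(22.0), bmi_category(27.0), bmi_category(31.0))
# normal overweight obese
```

The boundary values matter: a BMI of exactly 25 falls in the overweight category and exactly 30 in the obese category, matching the half-open ranges in the abstract.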