985 results for Endotracheal-tube Length
Abstract:
This study aimed to investigate the influence of water loading upon intraocular pressure (IOP), ocular pulse amplitude (OPA) and axial length. Twenty-one young adult subjects, classified by spherical equivalent refraction as either myopes (n=11) or emmetropes (n=10), participated. Measures of IOP, OPA and ocular biometrics were collected before, and then 10, 15, 25 and 30 minutes following, the ingestion of 1000 ml of water. Significant increases in both IOP and OPA occurred following water loading (p<0.0001), with peaks in both parameters at 10 minutes after water loading (mean ± SEM increase of 2.24 ± 0.31 mmHg in IOP and 0.46 ± 0.06 mmHg in OPA). Axial length reduced significantly following water loading (p=0.0005), with the largest reduction evident 10 minutes after water drinking (mean decrease 12 ± 3 µm). A significant time by refractive error group interaction (p=0.048) was found in axial length, indicating a different pattern of change in eye length following water loading between the myopic and emmetropic populations. The largest difference in axial length change was evident at 10 minutes after water loading, with a 17 ± 5 µm reduction in the myopes and only a 6 ± 2 µm reduction in the emmetropes. These findings illustrate significant changes in ocular parameters in young adult subjects following water loading.
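The time-by-group interaction above amounts to each group showing a different mean change from baseline at each timepoint. A minimal sketch of that computation in Python, assuming a hypothetical long-format data layout (the column names and toy values are invented for illustration, not taken from the study):

# Mean change in axial length from baseline per refractive group and
# timepoint -- the quantity underlying the reported interaction.
# All data below are invented placeholders.
import pandas as pd

df = pd.DataFrame({
    "subject":  [1, 1, 2, 2],
    "group":    ["myope", "myope", "emmetrope", "emmetrope"],
    "minutes":  [0, 10, 0, 10],                 # time after water loading
    "axial_um": [24012, 23995, 23880, 23874],   # axial length in micrometres
})

# Subtract each subject's baseline (minutes == 0) measurement
baseline = df[df["minutes"] == 0].set_index("subject")["axial_um"]
df["change_um"] = df["axial_um"] - df["subject"].map(baseline)

# A different pattern of mean change across groups over time is what a
# significant time-by-group interaction reflects.
print(df[df["minutes"] > 0].groupby(["group", "minutes"])["change_um"].mean())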
Abstract:
This paper describes the behaviour of very high strength (VHS) circular steel tubes strengthened by carbon fibre reinforced polymer (CFRP) and subjected to axial tension. A series of tests was conducted with different bond lengths and numbers of layers. The distribution of strain through the thickness of the CFRP layers and along the CFRP bond length was studied. The strain was found to generally decrease along the CFRP bond length with distance from the joint, and to decrease through the thickness of the CFRP layers from the bottom to the top layer. The effective bond length for high modulus CFRP was established. Finally, empirical models were developed to estimate the maximum load for a given CFRP arrangement.
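As a rough illustration of the strain trend described above, bond strain in externally bonded FRP is often idealized as decaying exponentially with distance from the joint; the sketch below uses that generic idealization with assumed values (the decay form, eps0 and L_eff are placeholders, not results from this paper):

# Illustrative only: exponential decay of CFRP strain away from the joint,
# eps(x) = eps0 * exp(-x / L_eff). Parameters are assumed placeholders.
import math

eps0 = 3.0e-3   # assumed strain at the joint
L_eff = 50.0    # assumed effective bond length, mm

for x in (0, 25, 50, 75, 100):   # distance from the joint, mm
    print(f"x = {x:3d} mm  strain = {eps0 * math.exp(-x / L_eff):.2e}")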
Abstract:
Objective. To estimate the excess length of stay in an intensive care unit (ICU) due to a central line–associated bloodstream infection (CLABSI), using a multistate model that accounts for the timing of infection. Design. A cohort of 3,560 patients followed up for 36,806 days in ICUs. Setting. Eleven ICUs in 3 Latin American countries: Argentina, Brazil, and Mexico. Patients. All patients admitted to the ICU during a defined time period with a central line in place for more than 24 hours. Results. The average excess length of stay due to a CLABSI increased in 10 of 11 ICUs and varied from −1.23 days to 4.69 days. A reduction in length of stay in Mexico was probably caused by an increased risk of death due to CLABSI, leading to shorter times to death. Adjusting for patient age and Average Severity of Illness Score tended to increase the estimated excess length of stay due to CLABSI. Conclusions. CLABSIs are associated with an excess length of ICU stay. The average excess length of stay varies between ICUs, most likely because of the case-mix of admissions and differences in the ways that hospitals deal with infections.
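To make the multistate idea concrete, here is a minimal discrete-time sketch (not the paper's estimator): ICU stay is modelled as movement between an uninfected state, a CLABSI state, and an absorbing discharged/dead state, and excess stay falls out as a difference in expected days in the ICU. The transition probabilities are invented for illustration:

# States: 0 = in ICU, uninfected; 1 = in ICU, CLABSI; 2 = discharged/dead.
# Transition probabilities are invented placeholders, not study estimates.
import numpy as np

P = np.array([
    [0.88, 0.02, 0.10],   # uninfected: stay, acquire CLABSI, leave ICU
    [0.00, 0.85, 0.15],   # infected:   stay infected, leave ICU
    [0.00, 0.00, 1.00],   # absorbing state
])

def expected_icu_days(start_state, horizon=365):
    """Expected days spent in the ICU (states 0 and 1) from start_state."""
    p = np.zeros(3)
    p[start_state] = 1.0
    days = 0.0
    for _ in range(horizon):
        days += p[0] + p[1]   # probability mass still in the ICU today
        p = p @ P
    return days

# Crude analogue of the excess stay attributable to infection
print(expected_icu_days(1) - expected_icu_days(0))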
Abstract:
Objective: Diarrhoea in the enterally tube fed (ETF) intensive care unit (ICU) patient is a multifactorial problem. Diarrhoeal aetiologies in this patient cohort remain debatable; however, the consequences of diarrhoea are well established and include electrolyte imbalance, dehydration, bacterial translocation, perianal wound contamination and sleep deprivation. This study examined the incidence of diarrhoea and explored factors contributing to its development in the ETF, critically ill, adult patient. Method: After institutional ethical review and approval, a single-centre medical chart audit was undertaken to examine the incidence of diarrhoea in ETF, critically ill patients. Retrospective, non-probability sequential sampling was used, covering all emergency admission adult ICU patients who met the inclusion/exclusion criteria. Results: Fifty patients were audited. Faecal frequency, consistency and quantity were considered important criteria in defining ETF diarrhoea. The incidence of diarrhoea was 78%. Total patient diarrhoea days (r = 0.422; p = 0.02) and total diarrhoea frequency (r = 0.313; p = 0.027) increased when the patient was ETF for longer periods of time. Increased severity of illness, peripheral oxygen saturation (SpO2), glucose control, albumin and white cell count were found to be statistically significant factors for the development of diarrhoea. Conclusion: Diarrhoea in ETF critically ill patients is multifactorial. The early identification of diarrhoea risk factors and the development of a diarrhoea risk management algorithm are recommended.
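The feeding-duration associations above are simple Pearson correlations; a sketch of that calculation with scipy, using invented data (only the method mirrors the audit):

# Pearson correlation of diarrhoea days against days of enteral feeding,
# mirroring the reported r = 0.422, p = 0.02 association. Data are invented.
from scipy.stats import pearsonr

days_fed       = [3, 5, 7, 10, 12, 14, 18, 21]   # days of enteral tube feeding
diarrhoea_days = [0, 1, 2,  3,  3,  5,  6,  8]   # diarrhoea days per patient

r, p = pearsonr(days_fed, diarrhoea_days)
print(f"r = {r:.3f}, p = {p:.3f}")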
Abstract:
Objective: The aim of this literature review is to identify the role of probiotics in the management of enteral tube feeding (ETF) diarrhoea in critically ill patients. Background: Diarrhoea is a common gastrointestinal problem in ETF patients, with a reported incidence varying from 2% to 68%. Despite extensive investigation, the pathogenesis of ETF diarrhoea remains unclear, and evidence to support probiotics for managing ETF diarrhoea in critically ill patients remains sparse. Method: Literature on ETF diarrhoea and probiotics in critically ill, adult patients was reviewed from 1980 to 2010. The Cochrane Library, Pubmed, Science Direct, Medline and the Cumulative Index of Nursing and Allied Health Literature (CINAHL) electronic databases were searched using specific inclusion/exclusion criteria. Key search terms were: enteral nutrition, diarrhoea, critical illness, probiotics, probiotic species and randomised controlled trial (RCT). Results: Four RCT papers were identified: two reporting full studies, one reporting a pilot RCT and one conference abstract reporting a pilot RCT. A trend towards a reduction in diarrhoea incidence was observed in the probiotic groups. However, mortality associated with probiotic use in some severely and critically ill patients must caution the clinician against its use. Conclusion: Evidence to support probiotic use in the management of ETF diarrhoea in critically ill patients remains unclear. This paper argues that probiotics should not be administered to critically ill patients until further research has examined the causal relationship between probiotics and mortality, irrespective of the patient's disease state or the projected prophylactic benefit of probiotic administration.
Abstract:
The recognition that Web 2.0 applications and social media sites will strengthen and improve interaction between governments and citizens has resulted in a global push into new e-democracy or Government 2.0 spaces. These typically follow government-to-citizen (g2c) or citizen-to-citizen (c2c) models, but both these approaches are problematic: g2c is often concerned more with service delivery to citizens as clients, or exists to make a show of ‘listening to the public’ rather than to genuinely source citizen ideas for government policy, while c2c often takes place without direct government participation and therefore cannot ensure that the outcomes of citizen deliberations are accepted into the government policy-making process. Building on recent examples of Australian Government 2.0 initiatives, we suggest a new approach based on government support for citizen-to-citizen engagement, or g4c2c, as a workable compromise, and suggest that public service broadcasters should play a key role in facilitating this model of citizen engagement.
Abstract:
Purpose: To analyze the repeatability of measuring nerve fiber length (NFL) from images of the human corneal subbasal nerve plexus using semiautomated software. Methods: Images were captured from the corneas of 50 subjects with type 2 diabetes mellitus who showed varying severity of neuropathy, using the Heidelberg Retina Tomograph 3 with Rostock Corneal Module. Semiautomated nerve analysis software was independently used by two observers to determine NFL from images of the subbasal nerve plexus. This procedure was undertaken on two occasions, 3 days apart. Results: The intraclass correlation coefficient values were 0.95 (95% confidence intervals: 0.92–0.97) for individual subjects and 0.95 (95% confidence intervals: 0.74–1.00) for observer. Bland-Altman plots of the NFL values indicated a reduced spread of data with lower NFL values. The overall spread of data was less for (a) the observer who was more experienced at analyzing nerve fiber images and (b) the second measurement occasion. Conclusions: Semiautomated measurement of NFL in the subbasal nerve fiber layer is highly repeatable. Repeatability can be enhanced by using more experienced observers. It may be possible to markedly improve repeatability when measuring this anatomic structure using fully automated image analysis software.
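A Bland-Altman plot of the kind described above is straightforward to reproduce; the sketch below uses numpy/matplotlib with invented NFL values (only the plotting method mirrors the study):

# Bland-Altman plot: difference vs mean of paired measurements, with a bias
# line and 95% limits of agreement. All measurement values are invented.
import numpy as np
import matplotlib.pyplot as plt

m1 = np.array([14.2, 18.5, 9.8, 22.1, 16.0, 12.3])   # NFL, occasion 1
m2 = np.array([14.8, 18.1, 9.5, 21.4, 16.6, 12.9])   # NFL, occasion 2

mean = (m1 + m2) / 2
diff = m1 - m2
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)    # 95% limits of agreement

plt.scatter(mean, diff)
plt.axhline(bias)
plt.axhline(bias + loa, linestyle="--")
plt.axhline(bias - loa, linestyle="--")
plt.xlabel("Mean of the two measurements")
plt.ylabel("Difference between measurements")
plt.title("Bland-Altman plot (illustrative data)")
plt.show()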
Abstract:
Catheter associated urinary tract infections (CAUTI) are a worldwide problem that may lead to increased patient morbidity, cost and mortality.1-3 The literature is divided on whether there are real effects from CAUTI on length of stay or mortality. Platt4 found the costs and mortality risks to be large, yet Graves et al found the opposite.5 A review of the published estimates of the extra length of stay showed results between zero and 30 days.6 The differences in estimates may have been caused by the different epidemiological methods applied. Accurately estimating the effects of CAUTI is difficult because it is a time-dependent exposure. This means that standard statistical techniques, such as matched case-control studies, tend to overestimate the increased hospital stay and mortality risk due to infection. The aim of the study was to estimate excess length of stay and mortality in an intensive care unit (ICU) due to a CAUTI, using a statistical model that accounts for the timing of infection. Data collected from ICUs in lower and middle income countries were used for this analysis.7,8 There has been little research for these settings, hence the need for this paper.
Abstract:
Background: Ambulance ramping within the Emergency Department (ED) is a common problem both internationally and in Australia. Previous research has focused on various issues associated with ambulance ramping such as access block, ED overcrowding and ambulance bypass. However, limited research has been conducted on ambulance ramping and its effects on patient outcomes. Methods: A case-control design was used to describe, compare and predict patient outcomes of 619 ramped (cases) vs. 1238 non-ramped (control) patients arriving at one ED via ambulance from 1 June 2007 to 31 August 2007. Cases and controls were matched (on a 1:2 basis) on age, gender and presenting problem. Outcome measures included ED length of stay and in-hospital mortality. Results: The median ramp time for all 1857 patients was 11 (IQR 6–21) min. Compared to non-ramped patients, ramped patients had a significantly longer wait time to be triaged (10 min vs. 4 min). Ramped patients also comprised a significantly higher proportion of those access blocked (43% vs. 34%). No significant difference in the proportion of in-hospital deaths was identified (2% vs. 3%). Multivariate analysis revealed that the likelihood of having an ED length of stay greater than eight hours was 34% higher among patients who were ramped (OR 1.34, 95% CI 1.06–1.70, p = 0.014). In relation to in-hospital mortality, age was the only significant independent predictor (p < 0.0001). Conclusion: Ambulance ramping is one factor that contributes to prolonged ED length of stay and adds additional strain on ED service provision. The potential for adverse patient outcomes as a result of ramping warrants close attention by health care service providers.
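The reported OR 1.34 (95% CI 1.06–1.70) came from a multivariate model, but the basic odds-ratio arithmetic can be sketched from a 2x2 table; the counts below are invented, not the study's data:

# Odds ratio with a Wald 95% CI from a 2x2 table. Counts are invented.
import math

# rows: ramped / non-ramped; cols: ED stay > 8 h (yes / no)
a, b = 240, 379    # ramped
c, d = 400, 838    # non-ramped

or_ = (a * d) / (b * c)
se = math.sqrt(1/a + 1/b + 1/c + 1/d)        # standard error of log(OR)
lo = math.exp(math.log(or_) - 1.96 * se)
hi = math.exp(math.log(or_) + 1.96 * se)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")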
Abstract:
Despite many arguments to the contrary, the three-act story structure, as propounded and refined by Hollywood, continues to dominate the blockbuster and independent film markets. Recent successes in post-modern cinema could indicate new directions and opportunities for low-budget national cinemas.
Abstract:
With the identification of common single locus point mutations as risk factors for thrombophilia, many DNA testing methodologies have been described for detecting these variations. Traditionally, functional or immunological testing methods have been used to investigate quantitative anticoagulant deficiencies. However, with the emergence of the genetic variations factor V Leiden, prothrombin 20210 and, to a lesser extent, the methylene tetrahydrofolate reductase (MTHFR677) and factor V HR2 haplotype, traditional testing methodologies have proved less useful and DNA technology is instead more commonly employed in diagnostics. This review considers many of the DNA techniques that have proved useful in the detection of common genetic variants that predispose to thrombophilia. Techniques involving gel analysis are used to detect the presence or absence of restriction sites, electrophoretic mobility shifts (as in single strand conformation polymorphism or denaturing gradient gel electrophoresis), and product formation in allele-specific amplification. Such techniques may be sensitive, but are unwieldy and often need to be validated objectively. In order to overcome some of the limitations of gel analysis, especially when dealing with larger sample numbers, many alternative detection formats, such as closed tube systems, microplates and microarrays (minisequencing, real-time polymerase chain reaction, and oligonucleotide ligation assays), have been developed. In addition, many of the emerging technologies take advantage of colourimetric or fluorescence detection (including energy transfer) that allows qualitative and quantitative interpretation of results. With the large variety of DNA technologies available, the choice of methodology will depend on several factors including cost and the need for speed, simplicity and robustness.
Abstract:
The compressed gas industry and government agencies worldwide utilize "adiabatic compression" testing for qualifying high-pressure valves, regulators, and other related flow control equipment for gaseous oxygen service. This test methodology is known by various terms, the most common being adiabatic compression testing, gaseous fluid impact testing, pneumatic impact testing, and BAM testing. The methodology is described in greater detail throughout this document, but in summary it consists of pressurizing a test article (valve, regulator, etc.) with gaseous oxygen within 15 to 20 milliseconds (ms). Because the driven gas1 and the driving gas2 are rapidly compressed to the final test pressure at the inlet of the test article, they are rapidly heated by the sudden increase in pressure to temperatures (thermal energies) sufficient to sometimes ignite the nonmetallic materials (seals and seats) used within the test article. In general, the more rapid the compression process, the more "adiabatic" the pressure surge is presumed to be and the more closely it has been argued to simulate an isentropic process. Generally speaking, adiabatic compression is widely considered the most efficient ignition mechanism for directly kindling a nonmetallic material in gaseous oxygen and has been implicated in many fire investigations. Because of the ease of ignition of many nonmetallic materials by this heating mechanism, many industry standards prescribe this testing. However, the results between various laboratories conducting the testing have not always been consistent. Research into the test method indicated that the thermal profile achieved (i.e., the temperature/time history of the gas) during adiabatic compression testing as required by the prevailing industry standards has not been fully modeled or empirically verified, although attempts have been made. This research evaluated the following questions: 1) Can the rapid compression process required by the industry standards be thermodynamically and fluid-dynamically modeled so that predictions of the thermal profiles can be made? 2) Can the thermal profiles produced by the rapid compression process be measured, in order to validate the thermodynamic and fluid dynamic models and to estimate the severity of the test? 3) Can controlling parameters be recommended so that new guidelines may be established for the industry standards, resolving inconsistencies between the various test laboratories conducting tests according to the present standards?
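For a sense of the temperatures involved, the ideal isentropic limit of the compression heating described above follows T2 = T1 * (P2/P1)^((gamma-1)/gamma); the values below (a 1 to 250 bar surge in oxygen) are generic placeholders, and real tests deviate from this ideal, which is precisely what the research examines:

# Ideal-gas isentropic compression temperature. Pressures and the initial
# temperature are assumed placeholders, not values from the standards.
T1 = 293.15           # initial gas temperature, K (20 C)
P1, P2 = 1.0, 250.0   # initial and final pressure, bar
gamma = 1.4           # ratio of specific heats for oxygen (diatomic ideal gas)

T2 = T1 * (P2 / P1) ** ((gamma - 1) / gamma)
print(f"Ideal isentropic final temperature: {T2:.0f} K ({T2 - 273.15:.0f} C)")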