Abstract:
Background Medication safety is a pressing concern for residential aged care facilities (RACFs). Retrospective studies in RACF settings identify inadequate communication between RACFs, doctors, hospitals and community pharmacies as the major cause of medication errors. Existing literature offers limited insight into the gaps in the existing information exchange process that may lead to medication errors. The aim of this research was to explicate the cognitive distribution that underlies RACF medication ordering and delivery, to identify gaps in medication-related information exchange which lead to medication errors in RACFs. Methods The study was undertaken in three RACFs in Sydney, Australia. Data were generated through ethnographic field work over a period of five months (May–September 2011). Triangulated analysis of the data primarily focused on examining the transformation and exchange of information between different media across the process. Results The findings of this study highlight the extensive scope and intense nature of information exchange in RACF medication ordering and delivery. Rather than attributing error to individual care providers, the explication of distributed cognition processes enabled the identification of gaps in three information exchange dimensions which potentially contribute to the occurrence of medication errors, namely: (1) the design of medication charts, which complicates order processing and record keeping; (2) the lack of coordination mechanisms between participants, which results in misalignment of local practices; and (3) reliance on restricted-bandwidth communication channels, mainly telephone and fax, which complicates the information processing requirements. The study demonstrates how the identification of these gaps enhances understanding of medication errors in RACFs.
Conclusions Application of the theoretical lens of distributed cognition can assist in enhancing our understanding of medication errors in RACFs through identification of gaps in information exchange. Understanding the dynamics of the cognitive process can inform the design of interventions to manage errors and improve residents’ safety.
Abstract:
In this article, we study traffic flow in the presence of speed-breaking structures. Speed breakers are typically used to reduce the local speed of vehicles near certain institutions such as schools and hospitals. Through a cellular automata model, we study the impact of such structures on global traffic characteristics. The simulation results indicate that the presence of speed breakers can reduce the global flow under moderate global densities. However, under low and high global density traffic regimes, the presence of speed breakers does not have an impact on the global flow. Further, the speed limit enforced by the speed breaker creates a phase distinction. For a given global density and slowdown probability, as the speed limit enforced by the speed breaker increases, the traffic moves from the reduced-flow phase to the maximum-flow phase. This underlines the importance of the proper design of these structures to avoid undesired flow restrictions.
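The kind of model described above can be sketched with a minimal Nagel-Schreckenberg-style cellular automaton. This is an illustrative reconstruction, not the authors' model: representing the speed breaker as a single cell with a lower speed limit (`breaker_vmax`) is an assumption, as are all parameter values.

```python
import random

def nasch_step(road, vmax, p_slow, breaker_pos, breaker_vmax):
    """One parallel update of a circular Nagel-Schreckenberg road.
    road[i] is a vehicle's speed (int) or None for an empty cell.
    A hypothetical speed breaker at breaker_pos caps speed there."""
    L = len(road)
    new_road = [None] * L
    for i, v in enumerate(road):
        if v is None:
            continue
        # 1. accelerate toward the local speed limit
        limit = breaker_vmax if i == breaker_pos else vmax
        v = min(v + 1, limit)
        # 2. brake to the gap in front (prevents collisions)
        gap = next(d for d in range(1, L + 1)
                   if road[(i + d) % L] is not None) - 1
        v = min(v, gap)
        # 3. random slowdown with probability p_slow
        if v > 0 and random.random() < p_slow:
            v -= 1
        # 4. move
        new_road[(i + v) % L] = v
    return new_road

def global_flow(density, vmax=5, p_slow=0.3, L=200,
                steps=400, breaker_vmax=1):
    """Mean flow (cells moved per cell per step),
    measured after a warm-up of half the simulated steps."""
    n_cars = int(density * L)
    road = [0] * n_cars + [None] * (L - n_cars)
    random.shuffle(road)
    moved = 0
    for t in range(steps):
        road = nasch_step(road, vmax, p_slow, L // 2, breaker_vmax)
        if t >= steps // 2:
            moved += sum(v for v in road if v is not None)
    return moved / (steps - steps // 2) / L
```

Sweeping `density` (and `breaker_vmax`) with `global_flow` yields the fundamental diagram on which the phase behaviour described in the abstract would be read off.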
Abstract:
Context: Tumor-induced osteomalacia (TIO) is a rarely diagnosed disorder presenting with bone pain, fractures, muscle weakness, and moderate-to-severe hypophosphatemia resulting from fibroblast growth factor 23-mediated renal phosphate wasting. Tumors secreting fibroblast growth factor 23 are often small and difficult to find with conventional imaging. Objective: We studied the utility of 68Ga-DOTA-octreotate (DOTATATE) somatostatin receptor positron emission tomography (PET)/computed tomography (CT) imaging in the diagnosis of TIO. Design and Setting: A multicenter case series was conducted at tertiary referral hospitals. Patients and Methods: Six patients with TIO diagnosed between 2003 and 2012 in Australia were referred for DOTATATE PET imaging. We reviewed the clinical history, biochemistry, imaging characteristics, histopathology, and clinical outcome of each patient. Results: Each case demonstrated delayed diagnosis despite severe symptoms. DOTATATE PET/CT imaging demonstrated high uptake and localized the tumor with confidence in each case. After surgical excision, there was resolution of clinical symptoms and serum phosphate, except in one patient who demonstrated residual disease on PET/CT. All tumors demonstrated high somatostatin receptor subtype 2 cell surface receptor expression using immunohistochemistry. Conclusions: In patients with TIO, DOTATATE PET/CT can successfully localize phosphaturic mesenchymal tumors and may be a practical first step in functional imaging for this disorder. Serum phosphate should be measured routinely in patients with unexplained muscle weakness, bone pain, or stress fractures to allow earlier diagnosis of TIO.
Abstract:
The first aim of the current study was to evaluate the survival of total hip arthroplasty (THA) in patients aged 55 years and older on a nationwide level. The second aim was to evaluate, on a nationwide basis, the geographical variation in the incidence of primary THA for primary OA, and to identify those variables possibly associated with this variation. The third aim was to evaluate the effects of hospital volume on the length of stay, the number of re-admissions and the number of complications of THR at a population level in Finland. The survival of implants was analysed based on data from the Finnish Arthroplasty Register. The incidence and hospital volume data were obtained from the Hospital Discharge Register. Cementless total hip replacements had a significantly reduced risk of revision for aseptic loosening compared with cemented hip replacements. When revision for any reason was the end point in the survival analyses, no significant differences were found between the groups. Adjusted incidence ratios of THA varied from 1.9- to 3.0-fold during the study period. Neither the average income within a region nor the morbidity index was associated with the incidence of THA. For the four categories of volume of total hip replacements performed per hospital, the length of the surgical treatment period was shorter for the highest-volume group than for the lowest-volume group. The odds ratio for dislocations was significantly lower in the high-volume group than in the low-volume group. In patients who were 55 years of age or older, the survival of cementless total hip replacements was as good as that of the cemented replacements. However, multiple wear-related revisions of the cementless cups indicate that excessive polyethylene wear was a major clinical problem with modular cementless cups. The variation in the long-term rates of survival for different cemented stems was considerable.
Cementless proximal porous-coated stems were found to be a good option for elderly patients. Where hip surgery was performed with a large repertoire of procedures, the indications for performing THA due to primary OA were strict. The socio-economic status of the patient had no apparent effect on the THA rate. Concentrating hip replacements in high-volume hospitals should reduce costs by significantly shortening the length of stay, and may reduce the dislocation rate.
Abstract:
Background Many different guidelines recommend that people with foot complications, or those at risk, should attend multiple health professionals for foot care each year. However, few studies have investigated the characteristics of those attending health professionals for foot care, and whether those characteristics match those requiring foot care as per guideline recommendations. The aim of this paper was to determine the characteristics associated with people who attended a health professional for foot care in the year prior to their hospitalisation. Methods Eligible participants were all adults admitted overnight, for any reason, into five diverse hospitals on one day, excluding maternity, mental health and cognitively impaired patients. Participants underwent a foot examination to clinically diagnose different foot complications, including wounds, infections, deformity, peripheral arterial disease and peripheral neuropathy. They were also surveyed on social determinants, medical history, self-care, foot complication history, and past health professional attendance for foot care in the year prior to hospitalisation. Results Overall, 733 participants consented; mean (±SD) age 62 (±19) years, 408 (55.8%) male, 172 (23.5%) with diabetes. Two hundred and fifty-six (34.9%; 95% CI 31.6-38.4) participants had attended a health professional for foot care, including podiatrists 180 (24.5%), GPs 93 (12.7%), and surgeons 36 (4.9%). In backwards stepwise multivariate analyses, attending any health professional for foot care was independently associated (OR (95% CI)) with diabetes (3.0 (2.1-4.5)), arthritis (1.8 (1.3-2.6)), mobility impairment (2.0 (1.4-2.9)) and previous foot ulcer (5.4 (2.9-10.0)). Attending a podiatrist was independently associated with female gender (2.6 (1.7-3.9)), increasing years of age (1.06 (1.04-1.08)), diabetes (5.0 (3.2-7.9)), arthritis (2.0 (1.3-3.0)), hypertension (1.7 (1.1-2.6)) and previous foot ulcer (4.5 (2.4-8.1)).
Attending a GP was independently associated with having a foot ulcer (10.4 (5.6-19.2)). Conclusions Promisingly, these findings indicate that people with a diagnosis of diabetes and arthritis are more likely to attend health professionals for foot care. However, it also appears that those with active foot complications, or significant risk factors, may not be more likely to receive the multi-disciplinary foot care recommended by guidelines. More concerted efforts are required to ensure all people with foot complications are receiving recommended foot care.
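The odds ratios above come from multivariate logistic regression. As a simpler illustration of the same quantity, a crude odds ratio with a Woolf (log-based) 95% confidence interval can be computed from a 2x2 table; the counts in the usage note are hypothetical, not taken from the study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with Woolf 95% confidence interval.
    a, b: exposed with / without the outcome
    c, d: unexposed with / without the outcome"""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper
```

An adjusted OR from a logistic model is exp(beta) for the corresponding coefficient; the crude version here ignores confounders, which is why multivariate estimates like those in the abstract can differ from raw cross-tabulations.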
Abstract:
Background Foot complications have been found to be predictors of mobility impairment and falls in community-dwelling elderly patients. However, fewer studies have investigated the link between foot complications and mobility impairment in hospital inpatient populations. The aim of this paper was to investigate the associations between mobility impairment and various foot complications in general inpatient populations. Methods Eligible participants were all adults admitted overnight, for any reason, into five diverse hospitals on one day, excluding maternity, mental health and cognitively impaired patients. Participants underwent a foot examination to clinically diagnose different foot complications, including foot wounds, infections, deformity, peripheral arterial disease and peripheral neuropathy. They were also surveyed on social determinants, medical history, self-care, footwear, foot complication history risk factors, and mobility impairment, defined as requiring a mobility aid for mobilisation prior to hospitalisation. Results Overall, 733 participants consented; mean (±SD) age 62 (±19) years, 408 (55.8%) male, 172 (23.5%) with diabetes. Mobility impairment was present in 242 (33.2%) participants; diabetes populations reported more mobility impairment than non-diabetes populations (40.7% vs 30.9%, p < 0.05). In a backwards stepwise multivariate analysis, controlling for other risk factors, mobility impairment was independently associated with increasing years of age (OR = 1.04 (95% CI 1.02-1.05)), male gender (OR = 1.7 (1.2-2.5)), being born in Australia (OR = 1.7 (1.1-2.8)), vision impairment (OR = 2.0 (1.2-3.1)), peripheral neuropathy (OR = 3.1 (2.0-4.6)) and foot deformity (OR = 2.0 (1.3-3.0)). Conclusions These findings support the results of other large studies of community-dwelling elderly patients showing that peripheral neuropathy and foot deformity are independently associated with mobility impairment and potentially falls.
Furthermore, the findings suggest that routine clinical diagnosis of foot complications, as defined by national diabetic foot guidelines, was sufficient to determine these foot complication risk factors associated with mobility impairment. Further research is required to establish whether these foot complication risk factors for mobility impairment are predictors of actual falls in the inpatient environment.
Abstract:
Background: The aging population is placing increasing demands on surgical services, simultaneously with a decreasing supply of professional labor and a worsening economic situation. Under growing financial constraints, successful operating room management will be one of the key issues in the struggle for technical efficiency. This study focused on several issues affecting operating room efficiency. Materials and methods: The current formal operating room management in Finland and the use of performance metrics and information systems used to support this management were explored using a postal survey. We also studied the feasibility of a wireless patient tracking system as a tool for managing the process. The reliability of the system as well as the accuracy and precision of its automatically recorded time stamps were analyzed. The benefits of a separate anesthesia induction room in a prospective setting were compared with the traditional way of working, where anesthesia is induced in the operating room. Using computer simulation, several models of parallel processing for the operating room were compared with the traditional model with respect to cost-efficiency. Moreover, international differences in operating room times for two common procedures, laparoscopic cholecystectomy and open lung lobectomy, were investigated. Results: The managerial structure of Finnish operating units was not clearly defined. Operating room management information systems were found to be out-of-date, offering little support to online evaluation of the care process. Only about half of the information systems provided information in real time. Operating room performance was most often measured by the number of procedures in a time unit, operating room utilization, and turnover time. The wireless patient tracking system was found to be feasible for hospital use. 
Automatic documentation of the system facilitated patient flow management by increasing process transparency via more available and accurate data, while lessening work for staff. Any parallel work flow model was more cost-efficient than the traditional way of performing anesthesia induction in the operating room. Mean operating times for two common procedures differed by 50% among eight hospitals in different countries. Conclusions: The structure of daily operative management of an operating room warrants redefinition. Performance measures as well as information systems require updating. Parallel work flows are more cost-efficient than the traditional induction-in-room model.
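The cost-efficiency of parallel induction can be illustrated with a toy throughput model. This is a deliberately simplified sketch, not the thesis's simulation: the timings in the usage note are hypothetical, and the model assumes induction of the next patient always finishes before the operating room is free.

```python
def cases_per_day(day_len, induction, surgery, turnover, overlap):
    """Count cases finishing within day_len minutes of OR time.
    overlap=False: anesthesia induced inside the OR (serial model).
    overlap=True: a separate induction room lets induction of the
    next patient run during the previous case, so after the first
    case only surgery occupies the OR (assumes induction always
    completes in time -- a deliberate simplification)."""
    t, cases = 0, 0
    while True:
        or_time = surgery if (overlap and cases > 0) else induction + surgery
        if t + or_time > day_len:
            return cases
        t += or_time + turnover
        cases += 1
```

With a hypothetical 480-minute day, 30-minute inductions, 60-minute operations and 15-minute turnovers, the serial model completes 4 cases while the overlapped model completes 6, which conveys the flavor of the efficiency gain the simulation study reports.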
Abstract:
The occurrence and nature of civilian firearm and explosion injuries in Finland, and the nature of severe gunshot injuries of the extremities, were described in seven original articles. The main data sources used were the National Hospital Discharge Register, the Cause-of-Death Register, and the Archive of Death Certificates at Statistics Finland. The present study was population based. Epidemiologic methods were used in six papers and clinical analyses in five. In the clinical studies, every original hospital record and death certificate was critically analyzed. The incidence of hospitalized firearm injuries declined slightly in Finland from the late 1980s to the early 2000s, decreasing from 5.1 per 100 000 person-years in 1990 to 2.6 in 2003. The decline was found in unintentional firearm injuries. A high incidence of unintentional firearm injuries was characteristic of the country, while violence and homicides by firearms represented a minor problem. The incidence of fatal non-suicidal firearm injuries was stable, at 1.8 cases per 100 000 person-years. Suicides using firearms were eight times more common during the period studied. This is contrary to corresponding reports from many other countries. However, the use of alcohol and illegal drugs or substances was detected in as many as one-third of the injuries studied. The median length of hospitalization was three days and was significantly associated (p<0.001) with the type of injury. The mean length of hospital stay decreased from the 1980s to the early 2000s. In this study, there was a special interest in gunshot injuries of the extremities. From a clinical point of view, the nature of severe extremity gunshot wounds, as well as the primary operative approach to their management, varied.
Patients with severe injuries of this kind were managed at university and central hospital emergency departments, by general surgeons in smaller hospitals and by cardiothoracic or vascular surgeons in larger hospitals. Such injuries were rare and therefore a challenge for surgeons on call. Some noteworthy aspects of their management were noticed, and these should be focused on in the future. On the other hand, the small population density and the relatively large geographic area of Finland do not favor high-volume, centralized trauma management systems. However, experimental war surgery has been increasingly taught in the country since the 1990s, and excellent results can be expected during the present decade. Epidemiologically, explosion injuries can be considered a minor problem in Finland at present, but their significance should not be underestimated. Fatal explosion injuries showed up sporadically, with an increase from 2002 to 2004 for no obvious reason. However, in view of the historical facts, another rare major explosion involving several people might occur within the next decade. The national control system of firearms is mainly based on the new legislation of 1998 and 2002. However, as shown in this study, there is no reason to assume that national hospitalization policies, the political climate, or the legislation changed over the study period and influenced the declining trend, at least not directly. Indeed, the reason why the decline appeared in the incidence of unintentional injuries only remains unclear. It may derive from many practical steps, e.g. locked firearm cases, or from the stability of the community itself. For effective reduction of firearm-related injuries, preventive measures, such as education and counseling, should be targeted at recreational firearm users.
To sum up, this study showed that the often-reported increasing trend in firearm- and explosion-related injuries has not manifested in Finland. Consequently, it can be concluded that, overall, Finnish legislation together with the various prevention strategies has succeeded in preventing firearm- and explosion-related injuries in the country.
Abstract:
Background and aims. Since 1999, hospitals in the Finnish Hospital Infection Program (SIRO) have reported data on surgical site infections (SSI) following major hip and knee surgery. The purpose of this study was to obtain detailed information to support prevention efforts by analyzing SIRO data on SSIs, to evaluate possible factors affecting the surveillance results, and to assess the disease burden of postoperative prosthetic joint infections in Finland. Methods. Procedures under surveillance included total hip (THA) and total knee arthroplasties (TKA), and the open reduction and internal fixation (ORIF) of femur fractures. Hospitals prospectively collected data using common definitions and a written protocol, and also performed postdischarge surveillance. In the validation study, a blinded retrospective chart review was performed and infection control nurses were interviewed. Patient charts of deep incisional and organ/space SSIs were reviewed, and data from three sources (SIRO, the Finnish Arthroplasty Register, and the Finnish Patient Insurance Centre) were linked for capture-recapture analyses. Results. During 1999-2002, the overall SSI rate was 3.3% after 11,812 orthopedic procedures (median length of stay, eight days). Of all SSIs, 56% were detected after discharge. The majority of deep incisional and organ/space SSIs (65/108, 60%) were detected on readmission. Positive and negative predictive values, sensitivity, and specificity for SIRO surveillance were 94% (95% CI, 89-99%), 99% (99-100%), 75% (56-93%), and 100% (97-100%), respectively. Of the 9,831 total joint replacements performed during 2001-2004, 7.2% (THA 5.2% and TKA 9.9%) of the implants were inserted in a simultaneous bilateral operation. Patients who underwent bilateral operations were younger, healthier, and more often male than those who underwent unilateral procedures. The rates of deep SSIs and mortality did not differ between bilateral and unilateral THAs or TKAs.
Four deep SSIs were reported following bilateral operations (antimicrobial prophylaxis administered 48-218 minutes before incision). In the three registers, altogether 129 prosthetic joint infections were identified after 13,482 THA and TKA during 1999-2004. After correction with the positive predictive value of SIRO (91%), a log-linear model provided an estimated overall prosthetic joint infection rate of 1.6% after THA and 1.3% after TKA. The sensitivity of the SIRO surveillance ranged from 36% to 57%. According to this estimation, nearly 200 prosthetic joint infections could occur in Finland each year (the average from 1999 to 2004) after THA and TKA. Conclusions. Postdischarge surveillance had a major impact on SSI rates after major hip and knee surgery. A minority of deep incisional and organ/space SSIs would be missed, however, if postdischarge surveillance by questionnaire were not performed. According to the validation study, most SSIs reported to SIRO were true infections. Some SSIs were missed, revealing some weakness in case finding. Variation in diagnostic practices may also affect SSI rates. No differences were found in deep SSI rates or mortality between bilateral and unilateral THA and TKA. However, the patient populations of these two groups differed. Bilateral operations require specific attention to their antimicrobial prophylaxis as well as to data management in the surveillance database. The true disease burden of prosthetic joint infections may be heavier than the rates from national nosocomial surveillance systems usually suggest.
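The validation metrics and capture-recapture figures above follow standard definitions, which can be sketched as below. The counts are illustrative, not the SIRO data, and the study itself linked three sources with a log-linear model rather than the simpler two-source Chapman estimator shown here.

```python
def surveillance_validation(tp, fp, fn, tn):
    """2x2 validation of a surveillance system against chart review."""
    return {
        "sensitivity": tp / (tp + fn),  # true SSIs the system reported
        "specificity": tn / (tn + fp),  # non-SSIs correctly not reported
        "ppv": tp / (tp + fp),          # reported SSIs that were real
        "npv": tn / (tn + fn),          # non-reports truly negative
    }

def chapman_estimate(n1, n2, m):
    """Two-source capture-recapture estimate of total case count
    (Chapman's nearly unbiased form): n1, n2 = cases found by each
    register, m = cases found by both."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1
```

The smaller the overlap m relative to n1 and n2, the larger the estimated number of cases missed by both sources, which is the intuition behind the "true disease burden may be heavier" conclusion.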
Abstract:
The VuoKKo trial consisted of 236 women referred and randomised due to menorrhagia in the five university hospitals of Finland between November 1994 and November 1997. Of these women, 117 were randomised to hysterectomy and 119 to use of the levonorgestrel-releasing intrauterine system (LNG-IUS) to treat this complaint. Their follow-up visits took place six and twelve months after treatment and five years after randomisation. The primary trial focused on quality-of-life and monetary aspects; the present study compared ovarian function, bone mineral density (BMD) and sexual functioning after these two treatment options. Ovarian function seemed to decrease after hysterectomy, demonstrated by increased hot flashes and serum follicle-stimulating hormone concentrations twelve months after the operation. Such an increase was not seen among LNG-IUS users. The pulsatility index of intraovarian arteries measured by two-dimensional ultrasound decreased in the hysterectomy group, but not in the LNG-IUS group. The decrease in serum inhibin B concentrations was similar in both groups, while ovarian artery circulation remained unchanged. BMD measured by dual x-ray absorptiometry (DXA) at the lumbar spine and femoral neck at baseline and five years after treatment showed a decrease at the lumbar spine among hysterectomised women, but not among LNG-IUS users. In both groups, BMD at the femoral neck had decreased. Differences between the groups were not, however, significant. Sexual functioning assessed by McCoy's sexual scale showed that sexual satisfaction as well as intercourse frequency had increased, and sexual problems decreased, among hysterectomised women six months after treatment. Among LNG-IUS users, sexual satisfaction and sexual problems remained unchanged.
Although the two groups did not differ in terms of sexual satisfaction or sexual problems at the one-year and five-year follow-ups, LNG-IUS users were less satisfied with their partners than hysterectomised women.
Abstract:
Visual acuities at the time of referral and on the day before surgery were compared in 124 patients operated on for cataract in Vaasa Central Hospital, Finland. Preoperative visual acuity and the occurrence of ocular and general disease were compared in samples of consecutive cataract extractions performed in 1982, 1985, 1990, 1995 and 2000 in two hospitals in the Vaasa region in Finland. The repeatability and the standard deviation of random measurement error in visual acuity and refractive error determination in a clinical environment in cataractous, pseudophakic and healthy eyes were estimated by re-examining the visual acuity and refractive error of patients referred for cataract surgery or consultation by ophthalmic professionals. Altogether 99 eyes of 99 persons (41 cataractous, 36 pseudophakic and 22 healthy eyes) with a visual acuity range of Snellen 0.3 to 1.3 (0.52 to -0.11 logMAR) were examined. During an average waiting time of 13 months, visual acuity in the study eye deteriorated from 0.68 logMAR to 0.96 logMAR (from 0.2 to 0.1 in Snellen decimal values). The average decrease in vision was 0.27 logMAR per year. In the fastest quartile, the visual acuity change per year was 0.75 logMAR, and in the second fastest 0.29 logMAR; the third and fourth quartiles were virtually unaffected. From 1982 to 2000, the incidence of cataract surgery increased from 1.0 to 7.2 operations per 1000 inhabitants per year in the Vaasa region. The average preoperative visual acuity in the operated eye improved by 0.85 logMAR (in decimal values, from 0.03 to 0.2) and in the better eye by 0.27 logMAR (in decimal values, from 0.23 to 0.43) over this period. The proportion of patients profoundly visually handicapped (VA in the better eye <0.1) before the operation fell from 15% to 4%, and that of patients less profoundly visually handicapped (VA in the better eye 0.1 to <0.3) from 47% to 15%.
The repeatability of visual acuity measurement, estimated as a coefficient of repeatability for all 99 eyes, was ±0.18 logMAR, and the standard deviation of measurement error was 0.06 logMAR. Eyes with the lowest visual acuity (0.3-0.45) had the largest variability, with a coefficient of repeatability of ±0.24 logMAR, and eyes with a visual acuity of 0.7 or better had the smallest, ±0.12 logMAR. The repeatability of refractive error measurement was studied in the same patient material as the repeatability of visual acuity. Differences between measurements 1 and 2 were calculated as three-dimensional vector values and spherical equivalents and expressed as coefficients of repeatability. Coefficients of repeatability for all eyes for vertical, torsional and horizontal vectors were ±0.74 D, ±0.34 D and ±0.93 D, respectively, and for the spherical equivalent for all eyes ±0.74 D. Eyes with lower visual acuity (0.3-0.45) had larger variability in vector and spherical equivalent values (±1.14 D), but the difference between visual acuity groups was not statistically significant. The difference in the mean defocus equivalent between measurements 1 and 2 was, however, significantly greater in the lower visual acuity group. If a change of ±0.5 D (measured in defocus equivalents) is accepted as a basis for a change of spectacles for eyes with good vision, the corresponding basis for eyes in the visual acuity range of 0.3-0.65 would be ±1 D. Differences in repeated visual acuity measurements are partly explained by errors in refractive error measurements.
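The acuity conversions and repeatability coefficients used above follow common conventions, sketched below. The Bland-Altman-style coefficient (1.96 x SD of paired test-retest differences) is one standard convention and may differ in detail from the thesis's exact method; the sample values in the test are hypothetical.

```python
import math

def snellen_to_logmar(decimal_va):
    """Snellen decimal acuity to logMAR, e.g. 1.0 -> 0.0, 0.1 -> 1.0."""
    return -math.log10(decimal_va)

def coefficient_of_repeatability(first, second):
    """1.96 x SD of the differences between paired repeat
    measurements: about 95% of test-retest differences fall
    within +/- this value (Bland-Altman convention)."""
    diffs = [a - b for a, b in zip(first, second)]
    mean_d = sum(diffs) / len(diffs)
    var = sum((d - mean_d) ** 2 for d in diffs) / (len(diffs) - 1)
    return 1.96 * math.sqrt(var)
```

Because logMAR is a log scale, a fixed repeatability band like ±0.18 logMAR corresponds to a multiplicative band in decimal acuity, which is why repeatability is best reported in logMAR units as the study does.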
Abstract:
Every year, approximately 62 000 people with stroke and transient ischemic attack are treated in Canadian hospitals, and the evidence suggests one-third or more will experience vascular cognitive impairment and/or intractable fatigue, either alone or in combination. The 2015 update of the Canadian Stroke Best Practice Recommendations: Mood, Cognition and Fatigue Module guideline is a comprehensive summary of current evidence-based recommendations for clinicians in a range of settings who provide care to patients following stroke. The three consequences of stroke that are the focus of this guideline (poststroke depression, vascular cognitive impairment, and fatigue) have high incidence rates and a significant impact on the lives of people who have had a stroke, impede recovery, and result in worse long-term outcomes. Significant practice variations and gaps in the research evidence have been reported for initial screening and in-depth assessment of stroke patients for these conditions. Also of concern, an increased number of family members and informal caregivers may also experience depressive symptoms in the poststroke recovery phase, which further impacts patient recovery. These factors emphasize the need for a system of care that ensures screening occurs as a standard and consistent component of clinical practice across settings as stroke patients transition from acute care to active rehabilitation and reintegration into their community. Additionally, building system capacity to ensure access to appropriate specialists for treatment and ongoing management of stroke survivors with these conditions is another great challenge.
Abstract:
Septic shock is a common killer in intensive care units (ICUs). The most crucial issue for outcome is the early and aggressive start of treatment aimed at normalization of hemodynamics and the early start of antibiotics during the very first hours. The optimal targets of hemodynamic treatment, and the impact of hemodynamic treatment on survival after the first resuscitation period, are less well known. The objective of this study was to evaluate different aspects of the hemodynamic pattern in septic shock, with special attention to the prediction of outcome. In particular, components of early treatment and monitoring in the ICU were assessed. A total of 401 patients, 218 with septic shock and 192 with severe sepsis or septic shock, were included in the study. The patients were treated in 24 Finnish ICUs during 1999-2005; 295 of the patients were included in the Finnish national epidemiologic Finnsepsis study. We found that the most important hemodynamic variables for outcome were mean arterial pressure (MAP) and lactate during the first six hours in the ICU, and MAP and mixed venous oxygen saturation (SvO2) below 70% during the first 48 hours. MAP below 65 mmHg and SvO2 below 70% were the best predictive thresholds. High central venous pressure (CVP) also correlated with adverse outcome. We assessed the correlation and agreement of SvO2 and mean central venous oxygen saturation (ScvO2) in septic shock during the first day in the ICU. The mean SvO2 was below ScvO2 during early sepsis. The bias of the difference was 4.2% (95% limits of agreement -8.1% to 16.5%) by Bland-Altman analysis. The difference between saturation values correlated significantly with cardiac index and oxygen delivery. Thus, ScvO2 cannot be used as a substitute for SvO2 in hemodynamic monitoring in the ICU. Several biomarkers have been investigated for their ability to help in diagnosis or outcome prediction in sepsis.
We assessed the predictive value of N-terminal pro-brain natriuretic peptide (NT-proBNP) for mortality in severe sepsis or septic shock. NT-proBNP levels were significantly higher in hospital nonsurvivors. The NT-proBNP level 72 hrs after inclusion was an independent predictor of hospital mortality. Acute cardiac load contributed to NT-proBNP values at admission, but renal failure was the main confounding factor later. The accuracy of NT-proBNP, however, was not sufficient for clinical decision-making concerning outcome prediction. Delays in the start of treatment are associated with a poorer prognosis in sepsis. We assessed how the early treatment guidelines had been adopted, and what the impact of early treatment on mortality in septic shock was in Finland. We found that early treatment was not optimal in Finnish hospitals, and this was reflected in mortality. Delayed initiation of antimicrobial agents was especially associated with an unfavorable outcome.
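The SvO2/ScvO2 comparison above uses Bland-Altman agreement analysis, which can be sketched as below; the paired values in the test are hypothetical, not the study's data.

```python
import math

def bland_altman(a, b):
    """Bland-Altman agreement between two paired measurement methods:
    returns (bias, lower_loa, upper_loa), where the 95% limits of
    agreement are bias +/- 1.96 x SD of the paired differences."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

By construction the bias sits exactly at the midpoint of the two limits of agreement, a quick sanity check when reading reported agreement figures.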
Abstract:
The aim of the present thesis was to study the role of the epithelial sodium channel (ENaC) in the clearance of fetal lung fluid in the newborn infant by measuring airway epithelial expression of ENaC, nasal transepithelial potential difference (N-PD), and lung compliance (LC). In addition, the effect of postnatal dexamethasone on airway epithelial ENaC expression was measured in preterm infants with bronchopulmonary dysplasia (BPD). The patient population consisted of selected term newborn infants born in the Department of Obstetrics (Studies II-IV) and selected preterm newborn infants treated in the neonatal intensive care unit of the Hospital for Children and Adolescents (Studies I and IV) of Helsinki University Central Hospital in Finland. A small population of preterm infants suffering from BPD was included in Study I. Studies I, III, and IV included airway epithelial measurement of ENaC, and Studies II and III included measurement of N-PD and LC. In Study I, the ENaC expression analyses were performed at the Research Institute of the Hospital for Sick Children in Toronto, Ontario, Canada; in the subsequent studies, analyses were performed in the Scientific Laboratory of the Hospital for Children and Adolescents. N-PD and LC measurements were performed at the bedside in these hospitals. In term newborn infants, the percentage of amiloride-sensitive N-PD, a surrogate for ENaC activity, measured during the first 4 postnatal hours correlated positively with LC measured 1 to 2 days postnatally. Preterm infants with BPD had higher airway epithelial ENaC expression after a therapeutic dose of dexamethasone than before treatment. These patients were subsequently weaned from mechanical ventilation, probably as a result of the clearance of excess fluid from the alveolar spaces. In addition, we found that in preterm infants ENaC expression increases with gestational age (GA).
In preterm infants, ENaC expression in the airway epithelium was lower than in term newborn infants. During the early postnatal period, airway epithelial βENaC expression decreased significantly in infants born both preterm and at term. Term newborn infants delivered vaginally had significantly lower airway epithelial expression of αENaC after the first postnatal day than those delivered by cesarean section, although the functional studies showed no difference in N-PD between the two delivery modes. We therefore conclude that the low airway epithelial expression of ENaC in the preterm infant, and the correlation of N-PD with LC in the term infant, indicate a role for ENaC in perinatal pulmonary adaptation and in the pathogenesis of neonatal respiratory distress. Because dexamethasone raised ENaC expression in preterm infants with BPD, and these infants were subsequently weaned from ventilator therapy, we suggest that studies on the treatment of respiratory distress in the preterm infant should include the induction of ENaC activity.
Abstract:
The spread of multidrug-resistant (MDR) bacteria has reached a threatening level. Extended-spectrum beta-lactamase-producing Enterobacteriaceae (ESBLE) are now endemic in many hospitals worldwide as well as in the community, while resistance rates continue to rise steadily in Acinetobacter baumannii and Pseudomonas aeruginosa [1]. Even more alarming is the dissemination of carbapenemase-producing Enterobacteriaceae (CPE), which causes therapeutic and organizational problems in hospitals facing outbreaks or endemicity. This context could elicit serious concern for the coming two decades; nevertheless, effective measures exist to stop the amplification of the problem, and several axes of prevention remain to be fully exploited, leaving room for realistic hope, at least in many parts of the world...