873 results for non-dialysis management
Abstract:
The aim of this master's thesis was to assess the ten-year trends and regional differences in management and outcome of acute myocardial infarction (AMI) within Switzerland. The thesis is composed of two articles. First, in the article "Trends in hospital management of acute myocardial infarction in Switzerland, 1998 to 2008", over 102,700 cases of AMI with corresponding management and revascularization procedures were assessed. The results showed a considerable increase in the number of hospital discharges for AMI, mainly due to the increase in between-hospital transfers. Rates of intensive care unit admissions remained stable. All types of revascularization procedures showed an increase. In particular, overall stenting rates increased, with drug-eluting stents partly replacing bare stents. Second, in the article "The region makes the difference: disparities in management of acute myocardial infarction within Switzerland", around 25,600 cases of AMI with corresponding management were assessed for the period 2007-2008 and according to seven Swiss regions. The results showed considerable regional differences in AMI management within Switzerland. Although each region showed different trends regarding revascularization interventions, Leman and Ticino stood out by presenting the minimum and maximum rates for almost all assessed parameters; as a consequence, these two regions differed the most from the Swiss average. The impact of the changes in trends and of the regional differences in AMI management on Swiss patients' outcomes and on costs remains to be assessed. Purpose: To assess ten-year trends in management and outcome of acute myocardial infarction (AMI) in Switzerland. Methods: Analysis of the Swiss hospital discharge database for the period 1998 to 2008. AMI was defined as a primary discharge diagnosis code I21 according to the ICD-10 classification of the World Health Organization. Management and revascularization procedures were assessed. Results: Overall, 102,729 hospital discharges with a diagnosis of AMI were analyzed. The number of hospital discharges increased almost three-fold, from 5530 in 1998 to 13,834 in 2008, mainly due to a considerable increase in between-hospital transfers (1352 in 1998, 6494 in 2008). Relative to all hospital discharges, the intensive care unit admission rate was 38.0% in 1998 and remained stable (36.2%) in 2008 (p for trend=0.25). Percutaneous revascularization rates increased from 6.0% to 39.9% (p for trend<0.001). Non-drug-eluting stent use increased from 1.3% to 16.6% (p for trend<0.05). Drug-eluting stents appeared in 2004 and increased to 23.5% of hospital discharges in 2008 (p for trend=0.07). Coronary artery bypass grafting increased from 1.0% to 3.0% (p for trend<0.001). Circulatory assistance increased from 0.2% to 1.7% (p for trend<0.001). Thrombolysis showed no significant change, from 0.5% to 1.9% (p for trend=0.64). Most of these trends were confirmed after multivariate adjustment. Conclusion: Between 1998 and 2008, the number of hospital discharges for AMI increased considerably in Switzerland, mainly due to between-hospital transfers. Overall stenting rates increased, with drug-eluting stents partly replacing bare stents. The impact of these changes on outcome and economics remains to be assessed.
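The p-for-trend figures above can be checked with a Cochran-Armitage test for linear trend in proportions. Below is a minimal Python sketch; the 1998 and 2008 endpoints match the abstract, but the intermediate yearly counts are illustrative interpolations, not the study's data.

```python
import numpy as np
from scipy.stats import norm

# Yearly totals and percutaneous revascularizations. Endpoints follow the
# abstract (6.0% of 5530 in 1998, 39.9% of 13,834 in 2008); the years in
# between are invented for illustration only.
years = np.arange(1998, 2009)
tot = np.array([5530, 6100, 6800, 7500, 8300, 9200, 10100, 11000, 12000, 12900, 13834])
pci = np.array([330, 520, 760, 1050, 1400, 1900, 2450, 3100, 3900, 4700, 5520])

def cochran_armitage_trend(successes, totals, scores):
    """Two-sided Cochran-Armitage test for a linear trend in proportions."""
    n = totals.sum()
    p_bar = successes.sum() / n
    s_bar = (scores * totals).sum() / n
    t = (scores * (successes - totals * p_bar)).sum()
    var = p_bar * (1 - p_bar) * (totals * (scores - s_bar) ** 2).sum()
    z = t / np.sqrt(var)
    return z, 2 * norm.sf(abs(z))

z, p_trend = cochran_armitage_trend(pci, tot, years - 1998)
print(f"z = {z:.1f}, p for trend = {p_trend:.2g}")
```

The test is equivalent to the score test of a logistic regression of revascularization on calendar year, another common way such trends are assessed.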
Abstract:
BACKGROUND: Since 1981 Princess Margaret Hospital has used initial active surveillance (AS) with delayed treatment at relapse as the preferred management for all patients with clinical stage I nonseminomatous germ cell tumors (NSGCT). OBJECTIVE: Our aim was to report our overall AS experience and compare outcomes over different periods using this non-risk-adapted approach. DESIGN, SETTING, AND PARTICIPANTS: Three hundred and seventy-one patients with stage I NSGCT were managed by AS from 1981 to 2005. For analysis by time period, patients were divided into two cohorts by diagnosis date: initial cohort, 1981-1992 (n=157), and recent cohort, 1993-2005 (n=214). INTERVENTION: Patients were followed at regular intervals, and treatment was only given for relapse. MEASUREMENTS: Recurrence rates, time to relapse, risk factors for recurrence, disease-specific survival, and overall survival were determined. RESULTS AND LIMITATIONS: With a median follow-up of 6.3 yr, 104 patients (28%) relapsed: 53 of 157 (33.8%) in the initial group and 51 of 214 (23.8%) in the recent group. Median time to relapse was 7 mo. Lymphovascular invasion (p<0.0001) and pure embryonal carcinoma (p=0.02) were independent predictors of recurrence; 125 patients (33.7%) were designated as high risk based on the presence of one or both factors. In the initial cohort, 66 of 157 patients (42.0%) were high risk and 36 of 66 patients (54.5%) relapsed versus 17 of 91 low-risk patients (18.7%) (p<0.0001). In the recent cohort, 59 of 214 patients (27.6%) were high risk and 29 of 59 had a recurrence (49.2%) versus 22 of 155 low-risk patients (14.2%) (p<0.0001). Three patients (0.8%) died from testis cancer. The estimated 5-yr disease-specific survival was 99.3% in the initial group and 98.9% in the recent one. CONCLUSIONS: Non-risk-adapted surveillance is an effective, simple strategy for the management of all stage I NSGCT.
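The risk-group comparisons (e.g. 54.5% vs 18.7% relapse in the initial cohort, p<0.0001) can be reproduced from the counts in the abstract with a standard 2x2 test; the sketch below assumes a chi-square test, which the abstract does not name explicitly.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Initial cohort (1981-1992): relapse counts by risk group, from the abstract.
#                 relapse   no relapse
table = np.array([[36, 66 - 36],   # high risk (n=66)
                  [17, 91 - 17]])  # low risk  (n=91)

chi2, p, dof, expected = chi2_contingency(table)
print(f"high-risk relapse {36/66:.1%}, low-risk relapse {17/91:.1%}")
print(f"chi2 = {chi2:.1f}, p = {p:.1e}")
```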
Abstract:
OBJECTIVE: HIV-1 post-exposure prophylaxis (PEP) is frequently prescribed after exposure to source persons with an undetermined HIV serostatus. To reduce unnecessary use of PEP, we implemented a policy including active contacting of source persons and the availability of free, anonymous HIV testing ('PEP policy'). METHODS: All consultations for potential non-occupational HIV exposures (i.e. outside the medical environment) were prospectively recorded. The impact of the PEP policy on PEP prescription and costs was analysed and modelled. RESULTS: Among 146 putative exposures, 47 involved a source person already known to be HIV positive and 23 had no indication for PEP. The remaining 76 exposures involved a source person of unknown HIV serostatus. Of the 33 (43.4%) exposures for which the source person could be contacted and tested, PEP was avoided in 24 (72.7%), initiated and discontinued in seven (21.2%), and prescribed and completed in two (6.1%). In contrast, of the 43 (56.6%) exposures for which the source person could not be tested, PEP was prescribed in 35 (81.4%) (P < 0.001). Upon modelling, the PEP policy allowed a 31% reduction in the cost of managing exposures to source persons of unknown HIV serostatus. The policy remained cost-saving for an HIV prevalence of up to 70% in the source population. Availability of all source persons for testing would have reduced costs by 64%. CONCLUSION: In the management of non-occupational HIV exposures, active contacting and free, anonymous testing of source persons proved feasible. This policy resulted in a decrease in PEP prescription, proved to be cost-saving, and presumably helped to avoid unnecessary toxicity and psychological stress.
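A minimal Python sketch of the kind of cost model described. All unit costs are hypothetical placeholders; the study's actual cost inputs are not given in the abstract, and the 31%, 64% and 70% figures come from those real inputs, so the sketch only illustrates the structure of such a model.

```python
# Hypothetical unit costs; placeholders, not the study's values.
COST_TEST = 50.0        # contacting and testing the source person
COST_PEP_FULL = 1000.0  # full 4-week PEP course with follow-up
COST_PEP_START = 250.0  # PEP started, then discontinued after a negative test

def expected_cost(p_testable, prevalence):
    """Expected cost per exposure to a source person of unknown serostatus."""
    # If the source person can be tested, PEP is completed only when the test
    # is positive; a starter course may be given while the result is pending.
    cost_tested = COST_TEST + prevalence * COST_PEP_FULL \
                  + (1 - prevalence) * COST_PEP_START
    # If the source person cannot be tested, PEP is prescribed empirically.
    return p_testable * cost_tested + (1 - p_testable) * COST_PEP_FULL

no_policy = expected_cost(0.0, 0.01)    # no source person ever tested
policy = expected_cost(0.434, 0.01)     # 43.4% of sources tested, as observed
all_tested = expected_cost(1.0, 0.01)   # every source person tested
print(f"saving with policy: {1 - policy / no_policy:.0%}")
print(f"saving if all sources testable: {1 - all_tested / no_policy:.0%}")
```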
Abstract:
Early admission to hospital with minimal delay is a prerequisite for successful management of acute stroke. We sought to determine the local pre- and in-hospital factors influencing this delay. Time from onset of symptoms to admission (admission time) was prospectively documented during a 6-month period (December 2004 to May 2005) in patients consecutively admitted for an acute focal neurological deficit, present on arrival and of presumed vascular origin. Mode of transportation and the patient's knowledge and correct recognition of stroke symptoms were assessed. Physicians contacted by the patients or their relatives were interviewed. The influence of referral patterns on in-hospital delays was further evaluated. Overall, 331 patients were included; 249 had an ischaemic and 37 a haemorrhagic stroke. Forty-five patients had a TIA, with neurological symptoms subsiding within the first hours after admission. Median admission time was 3 hours 20 minutes. Transportation by ambulance significantly shortened admission delays compared with the patient's own means of transport (HR 2.4, 95% CI 1.6-3.7). The only other factor associated with reduced delays was awareness of stroke (HR 1.9, 95% CI 1.3-2.9). Early in-hospital delays, specifically time to CT-scan request and time to call the neurologist, were shorter when the patient was referred by the family or, to a lesser extent, by an emergency physician than by the family physician (p < 0.04 and p < 0.01, respectively), and were shorter when the patient was transported by ambulance rather than by own means (p < 0.01). Transportation by ambulance and referral by the patient or family significantly improved admission delays and early in-hospital management. Correct recognition of stroke symptoms further contributed to a significant shortening of admission time. Educational programmes should take these findings into account.
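The hazard ratios above suggest a time-to-admission analysis; the sketch below assumes a Cox proportional hazards model (the abstract does not state the exact model), fitted with the lifelines library on synthetic data. Here the "event" is hospital admission, so an HR above 1 means shorter delays.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 331
ambulance = rng.integers(0, 2, n)
# Synthetic admission delays (hours): exponential scales of 2.5 h vs 6.0 h
# give a true hazard ratio of 6.0/2.5 = 2.4, matching the reported HR.
delay_h = rng.exponential(scale=np.where(ambulance == 1, 2.5, 6.0))

df = pd.DataFrame({"delay_h": delay_h, "admitted": 1, "ambulance": ambulance})
cph = CoxPHFitter()
cph.fit(df, duration_col="delay_h", event_col="admitted")
cph.print_summary()  # exp(coef) for 'ambulance' should be close to 2.4
```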
Abstract:
BACKGROUND: Intrathecal analgesia and avoidance of perioperative fluid overload are key items within enhanced recovery pathways. Potential side effects include hypotension and renal dysfunction. STUDY DESIGN: From January 2010 until May 2010, all patients undergoing colorectal surgery within enhanced recovery pathways were included in this retrospective cohort study and were analyzed by intrathecal analgesia (IT) vs none (noIT). Primary outcome measures were systolic and diastolic blood pressure, mean arterial pressure, and heart rate for 48 hours after surgery. Renal function was assessed by urine output and creatinine values. RESULTS: One hundred and sixty-three consecutive colorectal patients (127 IT and 36 noIT) were included in the analysis. Both patient groups showed low blood pressure values within the first 4 to 12 hours and a steady increase thereafter, before returning to baseline values after about 24 hours. Systolic and diastolic blood pressure and mean arterial pressure were significantly lower until 16 hours after surgery in patients having IT compared with the noIT group. Low urine output (<0.5 mL/kg/h) was reported in 11% vs 29% (IT vs noIT; p = 0.010) intraoperatively, and in 20% vs 11% (p = 0.387), 33% vs 22% (p = 0.304), and 31% vs 21% (p = 0.478) in the postanesthesia care unit and on postoperative days 1 and 2, respectively. Only 3 of 127 (2.4%) IT and 1 of 36 (2.8%) noIT patients had a transitory creatinine increase >50%; no patients required dialysis. CONCLUSIONS: Postoperative hypotension affects approximately 10% of patients within an enhanced recovery pathway and is slightly more pronounced in patients with IT. The hemodynamic depression persists for <20 hours after surgery; it has no measurable negative impact and therefore does not justify detrimental postoperative fluid overload.
Abstract:
A nationwide survey was conducted in Switzerland to assess the quality of osteoporosis management in patients aged 50 years or older presenting with a fragility fracture to the emergency ward of the participating hospitals. Eight centres recruited 4966 consecutive patients who presented with one or more fractures between 2004 and 2006. Of these, 3667 (2797 women, 73.8 years old, and 870 men, 73.0 years old on average) were considered as having a fragility fracture and were included in the survey. Included patients presented with a fracture of the upper limbs (30.7%), lower limbs (26.4%), axial skeleton (19.5%) or another localisation, including malleolar fractures (23.4%). Thirty-two percent reported one or more previous fractures during adulthood. Of the 2941 (80.2%) hospitalised women and men, only half returned home after discharge. During the diagnostic workup, dual-energy X-ray absorptiometry (DXA) measurement was performed in only 31.4% of the patients. Of those, 46.0% had a T-score ≤ -2.5 SD and 81.1% ≤ -1.0 SD. The osteoporosis treatment rate increased from 26.3% before fracture to 46.9% after fracture in women, and from 13.0% to 30.3% in men. However, only 24.0% of the women and 13.8% of the men were finally adequately treated with a bone-active substance, generally an oral bisphosphonate, with or without calcium/vitamin D supplements. A positive history of previous fracture vs none increased the likelihood of being treated with a bone-active substance (36.6% vs 17.9%, Δ 18.7%, 95% CI 15.1 to 22.3, and 22.6% vs 9.9%, Δ 12.7%, 95% CI 7.3 to 18.5, in women and men, respectively). In Switzerland, osteoporosis remains underdiagnosed and undertreated in patients aged 50 years and older presenting with a fragility fracture.
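The Δ values with 95% CI are differences in proportions with Wald-type intervals. The sketch below reproduces the women's figures; the subgroup sizes are assumptions derived from the reported 32% previous-fracture rate among the 2797 women, not numbers given in the abstract.

```python
import numpy as np

def prop_diff_ci(p1, n1, p2, n2, z=1.96):
    """Difference in proportions with a Wald 95% confidence interval."""
    d = p1 - p2
    se = np.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return d, d - z * se, d + z * se

# Women treated with a bone-active substance, with vs without a previous
# fracture. n1 and n2 are assumed: ~32% of 2797 women with a previous fracture.
d, lo, hi = prop_diff_ci(0.366, 895, 0.179, 1902)
print(f"delta = {d:.1%}, 95% CI {lo:.1%} to {hi:.1%}")  # ~18.7% (15.1 to 22.3)
```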
Abstract:
Poor long-term adherence and persistence to drug therapy is universally recognized as one of the major clinical issues in the management of chronic diseases, and patients with renal diseases are no exception. Chronic kidney disease (CKD) patients carry one of the highest burdens of daily pill intake, with up to more than 20 pills per day depending on the severity of their disease. The purpose of the present review is to discuss the difficulties encountered by nephrologists in diagnosing and managing poor adherence and persistence in CKD patients, including patients receiving maintenance dialysis. The review also attempts to provide some clues and new perspectives on how drug adherence could be addressed and possibly improved. Working on drug adherence may look like a long and tedious path, but physicians and healthcare providers should be aware that drug adherence is in general much lower than they may think, and that there are many ways to improve and support drug adherence and persistence so that renal patients obtain the full benefit of their treatments.
Abstract:
Isolated ventricular non-compaction (IVNC) is a rare, congenital, unclassified cardiomyopathy characterized by a prominent trabecular meshwork and deep recesses. The major clinical manifestations of IVNC are heart failure, atrial and ventricular arrhythmias, and thrombo-embolic events. We describe the case of a 69-year-old woman in whom the diagnosis of IVNC was made late, former echocardiographic examinations having been considered normal. She had been known for systolic left ventricular dysfunction for 3 years and then became symptomatic (NYHA III). In the past, she had suffered from multiple episodes of deep vein thrombosis and pulmonary embolism. The electrocardiogram revealed a wide QRS complex, and transthoracic echocardiography showed the typical apical thickening of the left and right ventricular myocardial wall with two distinct layers. The ratio of non-compacted to compacted myocardium was >2:1. Cardiac MRI confirmed the echocardiographic findings. Cerebral MRI revealed multiple ischaemic sequelae. In view of the heart failure remaining refractory to medical treatment in a patient meeting the classical criteria for cardiac resynchronization therapy, as well as the ventricular arrhythmias, a biventricular implantable cardioverter-defibrillator (biventricular ICD) was implanted. The 2-year follow-up period was characterized by an improvement in NYHA functional class from III to I and an increase in left ventricular function. We present a case of IVNC with a favourable outcome after biventricular ICD implantation. Cardiac resynchronization therapy could be considered in the management of this pathology.
Abstract:
Nitrous oxide (N2O) is the most important non-CO2 greenhouse gas, and soil management systems should be evaluated for their N2O mitigation potential. This research evaluated a long-term (22-year) experiment testing the effect of soil management systems on N2O emissions in the postharvest period (autumn) from a subtropical Rhodic Hapludox at the research center FUNDACEP, in Cruz Alta, state of Rio Grande do Sul. Three treatments were evaluated: one under conventional tillage with soybean residues (CTsoybean) and two under no-tillage with soybean (NTsoybean) and maize residues (NTmaize). N2O emissions were measured eight times within 24 days (May 2007) using closed static chambers. Gas fluxes were calculated from the change in gas concentration in the chamber over regular intervals (0, 15, 30, 45 min), with concentrations analyzed by gas chromatography. After soybean harvest, accumulated N2O emissions in the period were approximately three times higher in the untilled soil (164 mg m-2 N) than under CT (51 mg m-2 N), with a short-lived N2O peak of 670 µg m-2 h-1 N. In contrast, soil N2O emissions under NT were lower after maize than after soybean, with a N2O peak of 127 µg m-2 h-1 N. A multivariate analysis of N2O fluxes and soil variables, determined simultaneously with air sampling, showed that the main driving variables of soil N2O emissions were soil microbial activity, temperature, water-filled pore space, and NO3- content. Crop rotation including maize, in place of soybean monoculture, should be considered as a strategy to decrease autumn soil N2O emissions from no-tillage soils in Southern Brazil.
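Static-chamber fluxes of this kind are derived from the slope of the chamber concentration against time, scaled by the chamber geometry and the molar volume of air. A sketch with illustrative concentrations and assumed chamber dimensions (neither is given in the abstract):

```python
import numpy as np

# N2O concentration in the closed chamber at 0, 15, 30 and 45 min (ppb).
# Values are illustrative, not measured data from the study.
t_h = np.array([0.0, 15.0, 30.0, 45.0]) / 60.0
c_ppb = np.array([330.0, 360.0, 388.0, 419.0])

slope_ppb_h = np.polyfit(t_h, c_ppb, 1)[0]  # dC/dt by linear regression

V = 0.020    # chamber headspace volume (m3), assumed
A = 0.10     # soil area covered by the chamber (m2), assumed
M_N = 28.0   # g of N per mol of N2O (2 x 14)
VM = 0.0245  # molar volume of air at ~25 degC (m3/mol)

# mol-fraction change per h -> mol/h in the chamber -> g N m-2 h-1 -> ug
flux_ug = slope_ppb_h * 1e-9 * (V / A) * (M_N / VM) * 1e6
print(f"N2O flux: {flux_ug:.0f} ug N m-2 h-1")
```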
Abstract:
Pasture productivity can drop due to soil compaction caused by animal trampling. Physical and mechanical alterations are therefore extremely important indicators for pasture management. The objectives of this research were to: draw and evaluate the Mohr failure line of a Red-Yellow Latosol under different pasture cycles and under natural forest; calculate apparent cohesion; observe possible physical alterations in this soil; and propose a correction factor for stocking rates based on shear strength properties. The study was conducted between March 2006 and March 2007 on the Experimental Farm of Fundação de Ensino Superior de Passos, in Passos, state of Minas Gerais. The total study area covered 6 ha, of which 2 ha were irrigated pasture, 2 ha non-irrigated pasture and 2 ha natural forest. Brachiaria brizantha cv. MG-5 Vitória was used as the forage plant, and the pasture area was divided into paddocks. The Mohr failure line of samples of the Red-Yellow Latosol under irrigated pasture equilibrated at a water tension of 6 kPa indicated higher shear strength than under non-irrigated pasture; shear strength under irrigated pasture and natural forest was higher than under non-irrigated pasture. At a water tension of 33 kPa, no difference in shear strength was found between management systems and uses. Apparent cohesion pointed to possible changes in soil structure. The values of the correction factor were close to 1, which may indicate possible soil compaction under prolonged periods of management.
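The Mohr failure line is the Mohr-Coulomb envelope τ = c + σn·tan(φ); fitting it by linear regression to failure points gives the apparent cohesion c as the intercept and the friction angle φ from the slope. A sketch with illustrative direct-shear data (the paper's measured values are not in the abstract):

```python
import numpy as np

# Normal stress vs shear stress at failure (kPa); illustrative values only.
sigma_n = np.array([25.0, 50.0, 100.0, 200.0])
tau_f = np.array([32.0, 45.0, 72.0, 125.0])

# Mohr-Coulomb failure envelope: tau = c + sigma_n * tan(phi)
tan_phi, c = np.polyfit(sigma_n, tau_f, 1)
phi_deg = np.degrees(np.arctan(tan_phi))
print(f"apparent cohesion c = {c:.1f} kPa, friction angle = {phi_deg:.1f} deg")
```

Comparing intercepts fitted under the different managements would yield the kind of apparent-cohesion comparison reported above.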
Abstract:
Soils play a fundamental role in the production of human food. The Oxisols of the state of Paraná are among the richest and most productive soils in Brazil, but degradation and low porosity are frequently documented, due to intensive farming involving various management strategies and the application of high-tech solutions. This study investigated changes in the porosity of two Red Oxisols (Latossolos Vermelhos), denoted LVef (eutroferric) and LVdf (dystroferric), under conventional and no-tillage soil management, with a succession of annual crops of soybean, maize and wheat over a continuous period of more than 20 years. After describing the soil profiles under native forest, no-tillage management and conventional tillage using the crop profile method, deformed and non-deformed soil samples were collected from the volumes most compacted by human intervention, and their physical, chemical and mineralogical properties were analyzed. The various porosity classes were determined: total pore volume, inter-aggregate porosity (channels and biological cavities) and intra-aggregate porosity, the latter measured on 10 cm³ saturated clods subjected to a pressure of -10 kPa to separate pore volumes with an equivalent radius (r_eq) > 15 μm and < 15 μm. The results showed that the effects of no-tillage farming on porosity were apparent in both soil types. Porosity of the LVdf was higher than that of the LVef soil, whatever the management type. In the LVdf soil, only pores with a radius > 15 μm were affected by farming, whereas in the LVef soil, pores with a radius < 15 μm were affected as well.
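The 15 μm cut-off at -10 kPa follows from the capillary (Young-Laplace) relation between matric tension and the largest water-filled pore radius, r = 2σ·cos(θ)/ψ. A short check, assuming pure water at 20 °C and a zero contact angle:

```python
import math

SIGMA = 0.0728  # surface tension of water at 20 degC (N/m)
THETA = 0.0     # contact angle (rad), assumed zero

def equivalent_radius_um(psi_kpa):
    """Largest water-filled pore radius (um) at matric tension psi (kPa)."""
    return 2 * SIGMA * math.cos(THETA) / (psi_kpa * 1e3) * 1e6

print(f"r_eq at 10 kPa: {equivalent_radius_um(10.0):.1f} um")  # ~15 um
```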
Abstract:
Increased availability of soil water is important for the management of non-irrigated orange orchards. The objective of this study was to evaluate the availability of soil water in a Haplorthox (Rhodic Ferralsol) under the different tillage systems used for orchard plantation, mulch managements and rootstocks in a "Pêra" orange orchard in northwest Paraná, Brazil. An experiment in a split-split-plot design was established in 2002 in an area cultivated with Brachiaria brizantha grass, in which three tillage systems (no-tillage, conventional tillage and strip-tillage) were used for orchard plantation. The grass was mowed twice a year between the rows, representing two mulch managements in the split plots (no mulching and mulching in the plant rows). The split-split-plots were represented by two rootstocks ("Rangpur" lime and "Cleopatra" mandarin). The soil water content in the plant rows was evaluated in the 0-20 cm layer in 2007 and in the 0-20 and 20-40 cm layers in 2008-2009. The effect of the soil tillage systems used prior to orchard implantation on soil water availability was less pronounced than that of mulching and the rootstocks. Soil water availability was lower when "Pêra" orange trees were grafted on "Cleopatra" mandarin than on "Rangpur" lime rootstocks. Mulching had a positive influence on soil water availability in the sandy surface layer (0-20 cm) and the sandy clay loam subsurface layer (20-40 cm) in the spring. The production of B. brizantha between the rows and the disposal of its residues in the plant rows as mulch increased water availability to the "Pêra" orange trees.
Abstract:
AIM: The study aims to evaluate the effects of assertive community treatment (ACT) on the mental health and overall functioning of adolescents suffering from severe psychiatric disorders who refuse any traditional child psychiatric care. Few studies have evaluated the effects of ACT on a population of adolescents with psychiatric disorders. This short report highlights the impact of an ACT programme tailored to the needs of these patients, not only as an alternative to hospitalization, but also as a new form of intervention for patients who are difficult to engage. METHODS: The effect of ACT on 35 adolescents was evaluated using the Health of the Nation Outcome Scales for Children and Adolescents (HoNOSCA) as a measuring tool pre- and post-intervention. RESULTS: The intervention was associated with a significant improvement in the HoNOSCA overall score, with the following items showing significant amelioration: hyperactivity/focus problems, non-organic somatic symptoms, emotional symptoms, scholastic/language skills, peer relationships, family relationships and school attendance. CONCLUSION: ACT appears to be a feasible intervention for hard-to-engage adolescents suffering from psychiatric disorders. The intervention seems to improve their mental health and functioning. This pilot study may serve as a basis for a controlled study that will also take the costs of the intervention into account.
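A pre/post comparison of 35 paired HoNOSCA scores would typically use a paired test; the sketch below assumes a Wilcoxon signed-rank test on synthetic scores (the abstract names neither the test used nor the individual scores).

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(1)
# Synthetic pre/post HoNOSCA totals for 35 adolescents, illustration only.
pre = rng.integers(15, 30, 35).astype(float)
post = pre - rng.normal(4.0, 3.0, 35)  # mean improvement of ~4 points

stat, p = wilcoxon(pre, post)
print(f"Wilcoxon statistic = {stat:.0f}, p = {p:.3g}")
```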
Abstract:
Introduction. The management of large burn victims has significantly improved in the last decades. Specifically, autologous cultured keratinocytes (CEA) have overcome the problem of limited donor sites in severely burned patients. Several studies testing CEAs in burn centers report mixed results on the general outcome of burn patients. Methods. Publications from 1989 until 2011 with a minimum of 15 patients per study using CEA for the management of severe burn injury were retrieved through online databases, including Medline and PubMed, and the archives of the medical library of the CHUV in Lausanne. Results. Eighteen studies with a total of 977 patients were included in this review. Most of the studies did not specify whether CEAs were grafted alone or in combination with split-thickness skin grafts (STSG), although most patients seem to have received both methods. The mean total body surface area (TBSA) burned per study ranged from 33% to 78% in patients grafted with CEAs, and no common minimum TBSA making a patient eligible for CEA grafting could be found. The definition of the "take rate" is not standardized, and take rates varied widely, from 26% to 73%. Mortality and hospitalization time could not be shown to correlate with CEA use in all of the studies. As a late complication, some authors described the fragility of CEA-regenerated skin. Conclusion. Since the healing of large burn victims demands a variety of surgical and non-surgical treatment strategies, and the final outcome depends mainly on the burned surface as well as the general health condition of the patient, no definitive conclusion on the use of CEAs could be drawn from the reviewed studies. From our own experience, we know that selected patients profit significantly from CEA grafts, although cost efficiency or a reduction of mortality cannot be demonstrated in these particular cases.
Abstract:
BACKGROUND: The treatment of status epilepticus (SE) is based on relatively little evidence, although several guidelines have been published. A recent study reported a worse SE prognosis in a large urban setting as compared to a peripheral hospital, postulating better management in the latter. The aim of this study was to analyse SE episodes occurring in different settings and to address possible explanatory variables regarding outcome, including treatment quality. METHODS: Over six months we prospectively recorded consecutive adults with SE (seizure lasting five or more minutes) at the Centre Hospitalier Universitaire Vaudois (CHUV) and in six peripheral hospitals (PH) in the same region. Demographic, historical and clinical variables were collected, including an SE severity estimation (STESS score) and adherence to Swiss SE treatment guidelines. Outcome at discharge was categorised as "good" (return to baseline) or "poor" (persistent neurological sequelae or death). RESULTS: Of 54 patients (CHUV: 36; PH: 18), 33% had a poor outcome. Whilst age, SE severity, the percentage of SE episodes lasting less than 30 minutes and total SE duration were similar, fewer patients had a good outcome at the CHUV (61% vs 83%; OR 3.57; 95% CI 0.8-22.1). Mortality was 14% at the CHUV and 5% at the PH. Most treatments were in agreement with national guidelines, although less often in the PH (78% vs 97%, P = 0.04). CONCLUSION: Although not statistically significant, we observed a slightly worse SE prognosis in a large academic centre as compared to smaller hospitals. Since SE severity was similar in the two settings but adherence to national treatment guidelines was higher in the academic centre, further investigation of the prognostic role of SE treatment and of outcome determinants is required.
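The reported odds ratio can be approximated from a 2x2 table reconstructed from the percentages in the abstract; because the counts are rounded reconstructions, the result differs slightly from the published OR of 3.57.

```python
import numpy as np
from scipy.stats import fisher_exact

# Outcome by setting; counts reconstructed from 61% of 36 and 83% of 18.
#                 poor   good
table = np.array([[14, 22],    # CHUV
                  [3, 15]])    # peripheral hospitals

or_, p = fisher_exact(table)
# Woolf (log) 95% confidence interval for the odds ratio:
se = np.sqrt((1.0 / table).sum())
lo, hi = np.exp(np.log(or_) + np.array([-1.96, 1.96]) * se)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f}), Fisher exact p = {p:.2f}")
```

The wide interval crossing 1 matches the abstract's conclusion that the difference in outcome was not statistically significant.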