Abstract:
Objective: To determine the factors associated with general practitioners' current practice location, with particular emphasis on rural location. Design: Observational, retrospective, case-control study using a self-administered questionnaire. Setting: Australian general practices in December 2000. Participants: 2414 Australian-trained rural and urban GPs. Main outcome measure: Current urban or rural practice location. Results: For Australia as a whole, rural GPs were more likely to be male (odds ratio [OR], 1.42; 95% CI, 1.17-1.73), Australian-born (OR, 1.95; 95% CI, 1.55-2.45), and to report attending a rural primary school for some (OR, 2.21; 95% CI, 1.69-2.89) or all (OR, 2.79; 95% CI, 1.94-4.00) of their primary schooling. Rural GPs' partners or spouses were also more likely to report some (OR, 2.75; 95% CI, 2.07-3.66) or all (OR, 2.86; 95% CI, 2.02-4.05) rural primary schooling. A rural background in both GP and partner produced the highest likelihood of rural practice (OR, 6.28; 95% CI, 4.26-9.25). For individual jurisdictions, a trend towards more rural GPs being men was only significant in Tasmania. In all jurisdictions except Tasmania and the Northern Territory, rural GPs were more likely to be Australian-born. Conclusions: GPs' and their partners' rural background (residence and primary and secondary schooling) influences choice of practice location, with partners' background appearing to exert more influence.
Abstract:
Objective: To determine the age-standardised prevalence of peripheral arterial disease (PAD) and associated risk factors, particularly smoking. Design: Cross-sectional survey of a randomly selected population. Setting: Metropolitan area of Perth, Western Australia. Participants: Men aged 65-83 years. Results: The adjusted response fraction was 77.2%. Of 4,470 men assessed, 744 were identified as having PAD by the Edinburgh Claudication Questionnaire and/or the ankle-brachial index of systolic blood pressure, yielding an age-standardised prevalence of PAD of 15.6% (95% confidence interval [CI], 14.5%-16.6%). The main risk factors identified in univariate analyses were increasing age, current (OR=3.9, 95% CI 2.9-5.1) or former smoking (OR=2.0, 95% CI 1.6-2.4), physical inactivity (OR=1.4, 95% CI 1.2-1.7), a history of angina (OR=2.2, 95% CI 1.8-2.7) and diabetes mellitus (OR=2.1, 95% CI 1.7-2.6). The multivariate analysis showed that the highest relative risk associated with PAD was for current smoking of 25 or more cigarettes daily (OR=7.3, 95% CI 4.2-12.8). In this population, 32% of PAD was attributable to current smoking and a further 40% was attributable to past smoking by men who did not smoke currently. Conclusions: This large observational study shows that PAD is relatively common in older, urban Australian men. In contrast with its relationship to coronary disease and stroke, previous smoking appears to have a long legacy of increased risk of PAD. Implications: This research emphasises the importance of smoking as a preventable cause of PAD.
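Attributable-fraction figures of the kind quoted above can be approximated with Levin's population attributable fraction formula. The abstract does not report its exact method or the exposure prevalences, so the sketch below is a minimal illustration with hypothetical inputs, treating the reported odds ratios as approximations of relative risk.

```python
# Minimal sketch, not the authors' method: Levin's population attributable
# fraction, PAF = p_e * (RR - 1) / (1 + p_e * (RR - 1)), where p_e is the
# prevalence of the exposure and RR is approximated here by the reported OR.

def levin_paf(p_exposed: float, rr: float) -> float:
    """Fraction of cases in the population attributable to the exposure."""
    excess = p_exposed * (rr - 1.0)
    return excess / (1.0 + excess)

# Hypothetical exposure prevalences (not reported in the abstract):
print(f"current smoking: {levin_paf(0.15, 3.9):.0%}")  # ~30%
print(f"former smoking:  {levin_paf(0.50, 2.0):.0%}")  # ~33%
```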
Abstract:
Objective-To evaluate the presence of a dominance rank in a group of cats and the relation between agonistic behavior and the use of resources, including environmental enrichment, in these cats. Design-Observational analytic study. Animals-27 neutered cats in a shelter in São Paulo, Brazil. Procedures-The cats were video recorded for 4 consecutive days to obtain baseline data. Subsequently, a puzzle feeder was added as an enrichment device every other day over 8 days, for a total of 4 days with enrichment. Cats were also video recorded on these days. All pretreatment and posttreatment agonistic behaviors and interactions with the puzzle feeder were recorded by reviewing the videotapes. Results-143 agonistic encounters were recorded, of which 44 were related to resources and 99 were not. There were insufficient agonistic interactions to determine a dominance rank. Presence or absence of the puzzle feeder did not affect the rate of aggression. There was no significant effect of weight, sex, or coat color on the rate of aggression, and aggressive behavior did not correlate with time spent with the puzzle feeder. Twenty-three of the 27 cats interacted with the puzzle feeder. Conclusions and Clinical Relevance-In a stable group of communally housed cats, environmental enrichment did not cause increased aggression as a result of competition for the source of enrichment. Because environmental enrichment increases the opportunity to perform exploratory behaviors, it may improve the welfare of groups of cats maintained long-term in shelters, sanctuaries, or multicat households. (J Am Vet Med Assoc 2011;239:796-802)
Abstract:
Objective To determine the costs and benefits of interventions for maternal and newborn health to assess the appropriateness of current strategies and guide future plans to attain the millennium development goals. Design Cost effectiveness analysis. Setting Two regions classified by the World Health Organization according to their epidemiological grouping: Afr-E, those countries in sub-Saharan Africa with very high adult and high child mortality, and Sear-D, comprising countries in South East Asia with high adult and high child mortality. Data sources Effectiveness data from several sources, including trials, observational studies, and expert opinion. For resource inputs, quantities came from WHO guidelines, the literature, and expert opinion, and prices from the WHO Choosing Interventions that are Cost Effective (WHO-CHOICE) database. Main outcome measures Cost per disability adjusted life year (DALY) averted in year 2000 international dollars. Results The most cost effective mix of interventions was similar in Afr-E and Sear-D. These were the community based newborn care package, followed by antenatal care (tetanus toxoid, screening for pre-eclampsia, screening and treatment of asymptomatic bacteriuria and syphilis); skilled attendance at birth, offering first level maternal and neonatal care around childbirth; and emergency obstetric and neonatal care around and after birth. Screening and treatment of maternal syphilis, community based management of neonatal pneumonia, and steroids given during the antenatal period were relatively less cost effective in Sear-D. Scaling up all of the included interventions to 95% coverage would halve neonatal and maternal deaths. Conclusion Preventive interventions at the community level for newborn babies and at the primary care level for mothers and newborn babies are extremely cost effective, but the millennium development goals for maternal and child health will not be achieved without universal access to clinical services as well.
Abstract:
Methods. Data from the Beginning and Ending Supportive Therapy for the Kidney (BEST Kidney) study, a prospective observational study of critically ill patients with severe AKI from 54 ICUs in 23 countries, were analysed. The RIFLE class was determined by using observed (o) pre-morbid and estimated (e) baseline SCr values. Agreement was evaluated by correlation coefficients and Bland-Altman plots. Sensitivity analysis by chronic kidney disease (CKD) status was performed. Results. Seventy-six percent of patients (n = 1327) had a pre-morbid baseline SCr, and 1314 had complete data for evaluation. Forty-six percent had CKD. The median (IQR) values were 97 μmol/L (79-150) for oSCr and 88 μmol/L (71-97) for eSCr. The oSCr and eSCr determined at ICU admission and at study enrolment showed only a modest correlation (r = 0.49, r = 0.39). At ICU admission and study enrolment, eSCr misclassified 18.8% and 11.7% of patients as having AKI compared with oSCr. Exclusion of CKD patients improved the correlation between oSCr and eSCr at ICU admission and study enrolment (r = 0.90, r = 0.84), resulting in 6.6% and 4.0% being misclassified, respectively. Conclusions. Although limited, estimating baseline SCr by the MDRD equation when pre-morbid SCr is unavailable appears to perform reasonably well for determining RIFLE categories, but only when pre-morbid GFR was near normal. In patients with suspected CKD, however, using the MDRD equation to estimate baseline SCr overestimates the incidence of AKI, and it probably should not be used. Improved methods to estimate baseline SCr are needed.
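When pre-morbid SCr is unavailable, the ADQI/RIFLE convention is to back-solve the 4-variable MDRD equation for creatinine under an assumed near-normal GFR of 75 mL/min/1.73 m². The sketch below illustrates that general convention only; it is not the BEST Kidney study's code.

```python
# Illustration of the ADQI/RIFLE back-calculation convention (not the
# study's own code): solve the 4-variable MDRD equation for serum
# creatinine, assuming GFR = 75 mL/min/1.73 m^2.
# MDRD: eGFR = 186 * SCr^-1.154 * age^-0.203 * 0.742 (female) * 1.210 (black)

def estimated_baseline_scr(age: float, female: bool, black: bool,
                           assumed_gfr: float = 75.0) -> float:
    """Estimated baseline serum creatinine in mg/dL."""
    factor = 186.0 * age ** -0.203
    if female:
        factor *= 0.742
    if black:
        factor *= 1.210
    # Solve assumed_gfr = factor * SCr**-1.154 for SCr:
    return (factor / assumed_gfr) ** (1.0 / 1.154)

MG_DL_TO_UMOL_L = 88.4  # unit conversion for creatinine

scr = estimated_baseline_scr(age=60, female=False, black=False)
print(f"eSCr: {scr:.2f} mg/dL ({scr * MG_DL_TO_UMOL_L:.0f} umol/L)")
```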
Abstract:
Purpose: The aim of this study is to evaluate the relationship between timing of renal replacement therapy (RRT) in severe acute kidney injury and clinical outcomes. Methods: This was a prospective multicenter observational study conducted at 54 intensive care units (ICUs) in 23 countries enrolling 1238 patients. Results: Timing of RRT was stratified into "early" and "late" by median urea and creatinine at the time RRT was started. Timing was also categorized temporally from ICU admission into early (<2 days), delayed (2-5 days), and late (>5 days). Renal replacement therapy timing by serum urea showed no significant difference in crude (63.4% for urea <= 24.2 mmol/L vs 61.4% for urea >24.2 mmol/L; odds ratio [OR], 0.92; 95% confidence interval [CI], 0.73-1.15; P = .48) or covariate-adjusted mortality (OR, 1.25; 95% CI, 0.91-1.70; P = .16). When stratified by creatinine, late RRT was associated with lower crude (53.4% for creatinine >309 μmol/L vs 71.4% for creatinine <= 309 μmol/L; OR, 0.46; 95% CI, 0.36-0.58; P < .0001) and covariate-adjusted mortality (OR, 0.51; 95% CI, 0.37-0.69; P < .001). However, for timing relative to ICU admission, late RRT was associated with greater crude (72.8% vs 62.3% vs 59%, P < .001) and covariate-adjusted mortality (OR, 1.95; 95% CI, 1.30-2.92; P = .001). Overall, late RRT was associated with a longer duration of RRT and stay in hospital and greater dialysis dependence. Conclusion: Timing of RRT, a potentially modifiable factor, might exert an important influence on patient survival. However, this largely depended on its definition. Late RRT (days from admission) was associated with a longer duration of RRT, longer hospital stay, and higher dialysis dependence.
Abstract:
Background: Vascular calcification is common and constitutes a prognostic marker of mortality in the hemodialysis population. Derangements of mineral metabolism may influence its development. The aim of this study is to prospectively evaluate the association between bone remodeling disorders and progression of coronary artery calcification (CAC) in hemodialysis patients. Study Design: Cohort study nested within a randomized controlled trial. Setting & Participants: 64 stable hemodialysis patients. Predictor: Bone-related laboratory parameters and bone histomorphometric characteristics at baseline and after 1 year of follow-up. Outcomes: Progression of CAC assessed by means of coronary multislice tomography at baseline and after 1 year of follow-up. Baseline calcification score of 30 Agatston units or greater was defined as calcification. Change in calcification score of 15% or greater was defined as progression. Results: Of 64 patients, 38 (60%) had CAC at baseline and 26 (40%) did not. Participants without CAC at baseline were younger (P < 0.001), mainly men (P = 0.03) and nonwhite (P = 0.003), and had lower serum osteoprotegerin levels (P = 0.003) and higher trabecular bone volume (P = 0.001). Age (P = 0.003; beta coefficient = 1.107; 95% confidence interval [CI], 1.036 to 1.183) and trabecular bone volume (P = 0.006; beta coefficient = 0.828; 95% CI, 0.723 to 0.948) were predictors for CAC development. Of 38 participants who had calcification at baseline, 26 (68%) had CAC progression in 1 year. Progressors had lower bone-specific alkaline phosphatase (P = 0.03) and deoxypyridinoline levels (P = 0.02) on follow-up, and low turnover was mainly diagnosed at the 12-month bone biopsy (P = 0.04). Low-turnover bone status at the 12-month bone biopsy was the only independent predictor for CAC progression (P = 0.04; beta coefficient = 4.5; 95% CI, 1.04 to 19.39). According to bone histological examination, nonprogressors with initially high turnover (n = 5) subsequently had decreased bone formation rate (P = 0.03), and those initially with low turnover (n = 7) subsequently had increased bone formation rate (P = 0.003) and osteoid volume (P = 0.001). Limitations: Relatively small population, absence of patients with severe hyperparathyroidism, short observational period. Conclusions: Lower trabecular bone volume was associated with CAC development, whereas improvement in bone turnover was associated with lower CAC progression in patients with high- and low-turnover bone disorders. Because CAC is implicated in cardiovascular mortality, bone derangements may constitute a modifiable mortality risk factor in hemodialysis patients.
Abstract:
Objectives: Lung hyperinflation may be assessed by computed tomography (CT). As shown for patients with emphysema, however, CT image reconstruction affects quantification of hyperinflation. We studied the impact of reconstruction parameters on hyperinflation measurements in mechanically ventilated (MV) patients. Design: Observational analysis. Setting: A university hospital-affiliated research unit. Patients: MV patients with injured (n = 5) or normal lungs (n = 6), and spontaneously breathing patients (n = 5). Interventions: None. Measurements and results: Eight image series involving 3, 5, 7, and 10 mm slices and standard and sharp filters were reconstructed from identical CT raw data. Hyperinflated (V-hyper), normally (V-normal), poorly (V-poor), and nonaerated (V-non) volumes were calculated by densitometry as percentages of total lung volume (V-total). V-hyper obtained with the sharp filter systematically exceeded that with the standard filter, showing a median (interquartile range) increment of 138 (62-272) ml, corresponding to approximately 4% of V-total. In contrast, sharp filtering minimally affected the other subvolumes (V-normal, V-poor, V-non, and V-total). Decreasing slice thickness also increased V-hyper significantly. When changing from 10 to 3 mm thickness, V-hyper increased by a median value of 107 (49-252) ml, in parallel with a small and inconsistent increment in V-non of 12 (7-16) ml. Conclusions: Reconstruction parameters significantly affect quantitative CT assessment of V-hyper in MV patients. Our observations suggest that sharp filters are inappropriate for this purpose. Thin slices combined with standard filters and more appropriate thresholds (e.g., -950 HU in normal lungs) might improve the detection of V-hyper. Different studies on V-hyper can be compared only if identical reconstruction parameters were used.
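The densitometric subvolumes above are obtained by thresholding the Hounsfield unit (HU) values of voxels within the lung mask. The sketch below illustrates the conventional compartment cut-offs (hyperinflation below -900 HU; the abstract suggests -950 HU for normal lungs); it is a generic illustration, not the study's analysis software.

```python
# Illustrative sketch of CT lung densitometry (conventional HU cut-offs;
# not the study's software). Input: HU values of voxels inside the lung mask.
import numpy as np

def aeration_fractions(lung_hu: np.ndarray, hyper_cut: float = -900.0) -> dict:
    """Fractions of total lung volume in each aeration compartment."""
    return {
        "V_hyper":  float(np.mean(lung_hu < hyper_cut)),                   # hyperinflated
        "V_normal": float(np.mean((lung_hu >= hyper_cut) & (lung_hu < -500))),
        "V_poor":   float(np.mean((lung_hu >= -500) & (lung_hu < -100))),  # poorly aerated
        "V_non":    float(np.mean(lung_hu >= -100)),                       # nonaerated
    }

rng = np.random.default_rng(0)
synthetic_voxels = rng.normal(-700, 150, size=100_000)  # toy HU distribution
print({k: f"{v:.1%}" for k, v in aeration_fractions(synthetic_voxels).items()})
```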
Abstract:
Introduction: Quantitative computed tomography (qCT)-based assessment of total lung weight (M(lung)) has the potential to differentiate atelectasis from consolidation and could thus provide valuable information for managing trauma patients fulfilling commonly used criteria for acute lung injury (ALI). We hypothesized that qCT would identify atelectasis as a frequent mimic of early posttraumatic ALI. Methods: In this prospective observational study, M(lung) was calculated by qCT in 78 mechanically ventilated trauma patients fulfilling the ALI criteria at admission. A reference interval for M(lung) was derived from 74 trauma patients with morphologically and functionally normal lungs (reference). Results are given as medians with interquartile ranges. Results: The ratio of arterial partial pressure of oxygen to the fraction of inspired oxygen was 560 (506 to 616) mmHg in reference patients and 169 (95 to 240) mmHg in ALI patients. The median reference M(lung) value was 885 (771 to 973) g, and the reference interval for M(lung) was 584 to 1164 g, which matched that of previous reports. Despite the significantly greater median M(lung) value (1088 (862 to 1342) g) in the ALI group, 46 (59%) ALI patients had M(lung) values within the reference interval and thus most likely had atelectasis. In only 17 patients (22%) was M(lung) increased to the range previously reported for ALI patients and compatible with lung consolidation. Statistically significant differences between atelectasis and consolidation patients were found for age, Lung Injury Score, Glasgow Coma Scale score, total lung volume, mass of the nonaerated lung compartment, ventilator-free days and intensive care unit-free days. Conclusions: Atelectasis is a frequent cause of early posttraumatic lung dysfunction. Differentiating atelectasis from consolidation and other causes of lung damage by using qCT may help to identify patients who could benefit from management strategies such as damage control surgery and lung-protective mechanical ventilation that focus on the prevention of pulmonary complications.
Abstract:
Background. Many resource-limited countries rely on clinical and immunological monitoring without routine virological monitoring for human immunodeficiency virus (HIV)-infected children receiving highly active antiretroviral therapy (HAART). We assessed whether HIV load had independent predictive value in the presence of immunological and clinical data for the occurrence of new World Health Organization (WHO) stage 3 or 4 events (hereafter, WHO events) among HIV-infected children receiving HAART in Latin America. Methods. The NISDI (Eunice Kennedy Shriver National Institute of Child Health and Human Development International Site Development Initiative) Pediatric Protocol is an observational cohort study designed to describe HIV-related outcomes among infected children. Eligibility criteria for this analysis included perinatal infection, age <15 years, and continuous HAART for >= 6 months. Cox proportional hazards modeling was used to assess time to new WHO events as a function of immunological status, viral load, hemoglobin level, and potential confounding variables; laboratory tests repeated during the study were treated as time-varying predictors. Results. The mean duration of follow-up was 2.5 years; new WHO events occurred in 92 (15.8%) of 584 children. In proportional hazards modeling, a most recent viral load >5000 copies/mL was associated with a nearly doubled risk of developing a WHO event (adjusted hazard ratio, 1.81; 95% confidence interval, 1.05-3.11; P = .033), even after adjustment for immunological status defined on the basis of CD4 T lymphocyte value, hemoglobin level, age, and body mass index. Conclusions. Routine virological monitoring using the WHO virological failure threshold of 5000 copies/mL adds independent predictive value to immunological and clinical assessments for identification of children receiving HAART who are at risk for significant HIV-related illness. To provide optimal care, periodic virological monitoring should be considered for all settings that provide HAART to children.
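The "time-varying predictors" mentioned above correspond to a Cox model fitted on long-format data, with one row per child per interval between laboratory measurements. A minimal sketch with toy data and hypothetical column names (not the NISDI analysis code), using the lifelines library:

```python
# Minimal sketch of a Cox model with time-varying covariates (toy data and
# hypothetical column names; not the NISDI analysis code).
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Long format: one row per child per interval between lab measurements.
df = pd.DataFrame({
    "child_id":   [1, 1, 2, 2, 3],
    "start":      [0.0, 0.5, 0.0, 0.5, 0.0],  # years since HAART baseline
    "stop":       [0.5, 1.2, 0.5, 2.0, 1.0],
    "who_event":  [0, 1, 0, 0, 1],            # new WHO stage 3/4 event in interval
    "vl_gt_5000": [0, 1, 1, 0, 0],            # latest viral load > 5000 copies/mL
})

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="child_id", event_col="who_event",
        start_col="start", stop_col="stop")
ctv.print_summary()  # hazard ratio for the time-varying viral-load flag
```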
Abstract:
In animal models, interstitial angiotensin II (ang II) and the AT1 receptor (AT1R) are key mediators of renal inflammation and fibrosis in progressive chronic nephropathies. We hypothesized that these molecules are overexpressed in patients with progressive glomerulopathies. In this observational retrospective study, we described the expression of ang II and AT1R by immunohistochemistry in kidney biopsies of 7 patients with minimal change disease (MCD) and 25 patients with progressive glomerulopathies (PGPs). Proteinuria, serum albumin, and serum creatinine were not statistically different between MCD and PGP patients. Total expression of ang II and AT1R was not statistically different between MCD (108.7 ± 11.5 and 73.2 ± 13.6 cells/mm², respectively) and PGP patients (100.7 ± 9.0 and 157.7 ± 13.8 cells/mm², respectively; p>0.05). Yet, interstitial expression of ang II and AT1R (91.6 ± 16.0 and 45.6 ± 5.4 cells/mm², respectively) was higher in patients with PGP than in those with MCD (22.0 ± 4.1 and 17.9 ± 2.9 cells/mm², respectively; p<0.05), as was the proportion of interstitial fibrosis (11.0 ± 0.7% versus 6.1 ± 1.2%, p<0.05). In patients with MCD, ang II and AT1R expression predominates in the tubular compartment (52% and 36% of the positive cells, respectively). In those with PGP, interstitial expression of ang II and AT1R predominates (58% and 45%, respectively). In conclusion, interstitial expression of ang II and AT1R is increased in patients with progressive glomerulopathies. The relationship between these findings, interstitial fibrosis, and disease progression in humans warrants further investigation.
Abstract:
Dyslipidemia is known to increase significantly the odds of major cardiovascular events in the general population. Its control becomes even more important in the type 2 diabetic (T2DM) population. Bariatric surgeries, especially gastric bypass, are effective in achieving long-term control of dyslipidemia in morbidly obese patients. The objective of the study was to evaluate the control of dyslipidemia in patients with T2DM and a BMI below 30 kg/m² who underwent laparoscopic ileal interposition associated with sleeve gastrectomy. An observational cross-sectional study was performed in a tertiary care hospital between June 2005 and August 2007. Mean follow-up was 24.5 months (range 12-38). The procedure was performed in 72 patients: 51 were men and 21 were women. Mean age was 53.1 years (38-66). Mean BMI was 27 kg/m² (22.1-29.4). Mean duration of T2DM was 10.5 years (3-22). Mean HbA1c was 8.5%. Hypercholesterolemia was diagnosed in 68% of the patients and hypertriglyceridemia in 63.9%. Mean postoperative BMI was 21.2 kg/m² (17-26.7). Mean postoperative HbA1c was 6.1%, ranging from 4.4% to 8.3%. Overall, 86.1% of the patients achieved adequate glycemic control (HbA1c < 7%) without anti-diabetic medication. HbA1c below 6% was achieved by 50%, 36.1% had HbA1c between 6% and 7%, and 13.9% had HbA1c above 7%. Hypercholesterolemia was normalized in 91.8% and hypertriglyceridemia in 89.1% of patients. Low-density lipoprotein below 100 mg/dl was seen in 85.7%. Laparoscopic ileal interposition associated with sleeve gastrectomy was an effective operation for the regression of dyslipidemia and T2DM in a non-obese population.
Abstract:
Objective: to determine the relationship between age and in-hospital mortality of elderly patients, admitted to ICU, requiring and not requiring invasive ventilatory support. Design: prospective observational cohort study conducted over a period of 11 months. Setting: medical-surgical ICU at a Brazilian university hospital. Subjects: a total of 840 patients aged 55 years and older were admitted to ICU. Methods: in-hospital death rates for patients requiring and not requiring invasive ventilatory support were compared across three successive age intervals (55-64; 65-74 and 75 or more years), adjusting for severity of illness using the Acute Physiologic Score. Results: age was strongly correlated with mortality among the invasively ventilated subgroup of patients and the multivariate adjusted odds ratios increased progressively with every age increment (OR = 1.60, 95% CI = 1.01-2.54 for 65-74 years old and OR = 2.68, 95% CI = 1.58-4.56 for >= 75 years). For the patients not submitted to invasive ventilatory support, age was not independently associated with in-hospital mortality (OR = 2.28, 95% CI = 0.99-5.25 for 65-74 years old and OR = 1.95, 95% CI = 0.82-4.62 for >= 75 years old). Conclusions: the combination of age and invasive mechanical ventilation is strongly associated with in-hospital mortality. Age should not be considered as a factor related to in-hospital mortality of elderly patients not requiring invasive ventilatory support in ICU.
Abstract:
Background: Inhaled corticosteroids (ICSs) are recommended as the first line of treatment in children with moderate-to-severe asthma. Exhaled nitric oxide (ENO) has been proposed as a clinically useful marker of control that might help identify patients in whom ICS dose may be safely reduced. Objective: To evaluate the ability of ENO to predict future asthma exacerbations in children with moderate-to-severe asthma undergoing ICS tapering. Methods: This is an observational study with no control group. ENO was measured biweekly for 14 weeks in 32 children with moderate-to-severe asthma who were undergoing ICS tapering. Clinical evaluations and spirometry were performed concomitantly, and families kept daily diaries to record symptoms between visits. We used generalized estimating equations to model the ln(odds) of an asthma exacerbation in the subsequent 2-week interval as a function of ENO level at the start of the interval while adjusting for age, sex, asthma severity, and current medication use. Results: We were able to successfully lower ICS doses in 10 (56%) of the 18 children with moderate asthma and in 3 (21%) of the 14 children with severe asthma. In 83 of the 187 follow-up clinical evaluations, children were determined to have had an exacerbation during the preceding 2 weeks. ENO levels, whether expressed as a continuous variable or dichotomized, were not associated with future risk for exacerbations in either unadjusted or adjusted models. Conclusion: ENO was not a useful clinical predictor of future asthma exacerbations for children with moderate-to-severe asthma undergoing ICS tapering. Ann Allergy Asthma Immunol. 2009;103:206-211.
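The modeling step described above can be sketched as a GEE logistic regression, in which the logit link yields the ln(odds) being modeled and a working covariance structure handles repeated measurements within each child. The sketch below uses toy data and hypothetical variable names (not the authors' code); the full model would also adjust for age, sex, severity, and medication use, as the authors did.

```python
# Minimal GEE sketch (toy data, hypothetical names; not the authors' code):
# logistic model of exacerbation in the next 2-week interval vs ENO level,
# with repeated observations clustered within each child.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({  # one row per child per 2-week interval
    "child_id":     [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "eno_ppb":      [12, 30, 45, 20, 22, 19, 55, 40, 35, 15, 18, 60],
    "exacerbation": [0, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1],
})

model = smf.gee(
    "exacerbation ~ eno_ppb",                 # covariates would be added here
    groups="child_id",
    data=df,
    family=sm.families.Binomial(),            # logit link -> models ln(odds)
    cov_struct=sm.cov_struct.Exchangeable(),  # within-child correlation
)
print(model.fit().summary())
```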
Abstract:
Background - The effect of prearrest left ventricular ejection fraction (LVEF) on outcome after cardiac arrest is unknown. Methods and Results - During a 26-month period, Utstein-style data were prospectively collected on 800 consecutive inpatient adult index cardiac arrests in an observational, single-center study at a tertiary cardiac care hospital. Prearrest echocardiograms were performed on 613 patients (77%) at 11 ± 14 days before the cardiac arrest. Outcomes among patients with normal or nearly normal prearrest LVEF (>= 45%) were compared with those of patients with moderate or severe dysfunction (LVEF < 45%) by chi-square and logistic regression analyses. Survival to discharge was 19% in patients with normal or nearly normal LVEF compared with 8% in those with moderate or severe dysfunction (adjusted odds ratio, 4.8; 95% confidence interval, 2.3 to 9.9; P < 0.001) but did not differ with regard to sustained return of spontaneous circulation (59% versus 56%; P = 0.468) or 24-hour survival (39% versus 36%; P = 0.550). Postarrest echocardiograms were performed on 84 patients within 72 hours after the index cardiac arrest; the LVEF decreased 25% in those with normal or nearly normal prearrest LVEF (60 ± 9% to 45 ± 14%; P < 0.001) and decreased 26% in those with moderate or severe dysfunction (31 ± 7% to 23 ± 6%, P < 0.001). For all patients, prearrest beta-blocker treatment was associated with higher survival to discharge (33% versus 8%; adjusted odds ratio, 3.9; 95% confidence interval, 1.8 to 8.2; P < 0.001). Conclusions - Moderate and severe prearrest left ventricular systolic dysfunction was associated with substantially lower rates of survival to hospital discharge compared with normal or nearly normal function.