713 results for all Adult: 19 years
Abstract:
Peritoneal dialysis (PD) should be considered a suitable method of renal replacement therapy in acute kidney injury (AKI) patients. This study is the largest cohort providing patient characteristics, clinical practice patterns, and their relationship to outcomes in a developing country. Its objective was to describe the main determinants of patient and technique survival, including trends over time in PD treatment of AKI patients. This was a Brazilian prospective cohort study in which all adult AKI patients on PD were studied from January 2004 to January 2014. For comparison purposes, patients were divided into 2 groups according to the year of treatment: 2004-2008 and 2009-2014. Patient survival and technique failure (TF) were analyzed using the competing risk model of Fine and Gray. A total of 301 patients were included; 51 (16.9%) were transferred to hemodialysis during the study period. The main cause of TF was mechanical complication (47%), followed by peritonitis (41.2%). TF changed during the study period: compared to 2004-2008, patients treated in 2009-2014 had a relative risk (RR) reduction of 0.86 (95% CI 0.77-0.96), and three independent risk factors were identified: treatment in the 2009-2014 period, sepsis, and age >65 years. There were 180 deaths (59.8%) during the study. Death was the leading cause of dropout (77.9% of all cases), mainly from sepsis (58.3%), followed by cardiovascular disease (36.1%). The overall patient survival was 41% at 30 days. Patient survival improved across study periods: compared to 2004-2008, patients treated in 2009-2014 had an RR reduction of 0.87 (95% CI 0.79-0.98). The independent risk factors for mortality were sepsis, age >70 years, ATN-ISS >0.65, and positive fluid balance. In conclusion, we observed an improvement in patient survival and TF over the years, even after correction for several confounders and using a competing risk approach.
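For readers unfamiliar with the Fine and Gray model named above: it regresses the subdistribution hazard of one event type (here, technique failure or death) while keeping subjects who already experienced the competing event in the risk set. A minimal sketch in standard notation, as general background rather than reproduced from the paper:

```latex
% Subdistribution hazard for event type k given covariates Z (Fine & Gray):
\lambda_k(t \mid Z)
  = \lim_{\Delta t \to 0} \frac{1}{\Delta t}
    \Pr\bigl\{ t \le T < t + \Delta t,\ \epsilon = k
      \bigm| T \ge t \ \text{or}\ (T < t \ \text{and}\ \epsilon \ne k),\ Z \bigr\}
  = \lambda_{k0}(t)\, e^{\beta^{\top} Z}
```

The exponentiated coefficients e^beta are subdistribution hazard ratios, which play the role of the relative risks reported above.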
Abstract:
Peripheral cement-ossifying fibroma is a relatively common gingival growth of a reactive rather than neoplastic nature, whose pathogenesis is uncertain. It predominantly affects adolescents and young adults, with peak prevalence between 10 and 19 years of age. We report the clinical case of a 5-year-old girl with a disease duration of 3 years, who was followed up for 4 years and showed gingival health and normal bone radiopacity.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
OBJECTIVE: To determine the success rate of flexible bronchoscopy as the first-choice method for removing foreign bodies from the airways of adults. METHODS: Retrospective study of all adult patients (over 18 years of age) with foreign body aspiration submitted to bronchoscopy at the Hospital das Clínicas, Faculdade de Medicina da Universidade de São Paulo, in São Paulo, Brazil. RESULTS: The sample consisted of 40 adult patients, with a mean age of 52 years (range: 18-88 years). The median time that the foreign body remained in the airway was 15 days (range: 12 h to 10 years). All patients first underwent diagnostic flexible bronchoscopy. Removal of the foreign body by flexible bronchoscopy was successful in 33 of the patients (82.5%). In 1 patient, a metallic object lodged in the distal bronchial tree required the use of fluoroscopy. Six patients (15%) underwent rigid bronchoscopy: 2 because of dyspnea induced by a tracheal foreign body, and 4 because the foreign body was too large for the flexible forceps. Bronchoscopy failed in only 1 patient, who therefore required bronchotomy. CONCLUSIONS: Although rigid bronchoscopy is considered the gold standard for the removal of airway foreign bodies, our experience showed that flexible bronchoscopy can be used safely and efficiently in the diagnosis and treatment of stable adult patients.
Abstract:
Background: Cardiovascular disease is an important cause of death in patients on dialysis. Peripheral arterial disease (PAD) is a prognostic factor for cardiovascular disease. The ankle-brachial index (ABI) is a noninvasive method used for the diagnosis of PAD. The difference between pre- and post-dialysis ABI had not yet been formally tested, and testing it was the main objective of this study. Methods: The ABI was assessed using an automated oscillometric device in incident patients on hemodialysis. All blood pressure readings were taken in triplicate pre- and post-dialysis in three consecutive dialysis sessions (times 1, 2, and 3). Results: One hundred and twenty-three patients (85 men) aged 53 +/- 19 years were enrolled. We found no difference between pre- and post-dialysis ABI on the right or left side, and no difference across times 1, 2, and 3. In patients with a history of PAD, the pre- versus post-dialysis difference was of borderline significance on the right side (p = 0.088). Conclusion: ABI measured pre- and post-dialysis showed low variability. The ABI in patients with a history of PAD should be evaluated with caution. The applicability of the current method in predicting mortality among patients on hemodialysis therefore needs further investigation. Copyright (C) 2012 S. Karger AG, Basel
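As background, the ABI is simply the ratio of ankle to brachial systolic pressure. A minimal sketch of how triplicate oscillometric readings like those in this study could be reduced to a single ABI value; the function and example values are illustrative, not taken from the study:

```python
# Background sketch: the ankle-brachial index (ABI) is the ratio of ankle
# systolic pressure to brachial systolic pressure; an ABI below ~0.9 is
# conventionally taken to suggest peripheral arterial disease (PAD).
# Names and values below are illustrative, not from this study.

def mean(readings):
    return sum(readings) / len(readings)

def ankle_brachial_index(ankle_systolic, brachial_systolic):
    """Average each triplicate of readings, then take the ratio."""
    return mean(ankle_systolic) / mean(brachial_systolic)

# Example: triplicate pre-dialysis readings (mmHg) for one side.
abi = ankle_brachial_index([138, 142, 140], [150, 148, 152])
print(f"ABI = {abi:.2f}")  # -> ABI = 0.93
```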
Abstract:
The aim of this study was to compare time-motion indicators during judo matches performed by athletes from different age groups. The following age groups were analysed: Pre-Juvenile (13-14 years, n = 522), Juvenile (15-16 years, n = 353), Junior (19 years, n = 349) and Senior (>20 years, n = 587). The time-motion indicators included: Total Combat Time, Standing Combat Time, Displacement Without Contact, Gripping Time, Groundwork Combat Time and Pause Time. One-way analysis of variance (ANOVA) with the Tukey test, as well as the Kruskal-Wallis and Mann-Whitney tests (for non-parametric data), were conducted, using P < 0.05 as the significance level. All analysed groups obtained a median of 7 (first quartile = 3, third quartile = 12) sequences of combat/pause cycles. For Total Combat Time, Standing Combat Time and Gripping Time, Pre-Juvenile and Senior were significantly longer than Juvenile and Junior. For Displacement Without Contact, Junior was significantly longer than all other age groups. For Groundwork Combat Time, Senior was significantly longer than all other age groups and Pre-Juvenile was longer than Junior. These results can be used to improve physiological performance in intermittent practices, as well as technical-tactical training during judo sessions.
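The test battery described (one-way ANOVA with Tukey post hoc for parametric data; Kruskal-Wallis with Mann-Whitney follow-up otherwise) can be reproduced with standard libraries. A hedged sketch with placeholder data, not the study's measurements (tukey_hsd requires SciPy 1.11+):

```python
# Sketch of the reported test battery using SciPy; the arrays are
# placeholder data, not the study's time-motion measurements.
from scipy.stats import f_oneway, kruskal, mannwhitneyu, tukey_hsd

pre_juvenile = [105, 110, 98, 120]   # e.g. Total Combat Time (s)
juvenile     = [90, 85, 95, 88]
junior       = [92, 87, 91, 89]
senior       = [108, 115, 102, 111]

# Parametric: one-way ANOVA across the four age groups, Tukey post hoc.
f_stat, p_anova = f_oneway(pre_juvenile, juvenile, junior, senior)
tukey = tukey_hsd(pre_juvenile, juvenile, junior, senior)

# Non-parametric: Kruskal-Wallis, with a Mann-Whitney pairwise follow-up.
h_stat, p_kw = kruskal(pre_juvenile, juvenile, junior, senior)
u_stat, p_mw = mannwhitneyu(pre_juvenile, juvenile)

print(p_anova, p_kw, p_mw)  # judged against the P < 0.05 threshold
```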
Abstract:
Methane (CH4) emission from agricultural soils increases dramatically as a result of the deleterious effect of soil disturbance and nitrogen fertilization on methanotrophic organisms; however, few studies have attempted to evaluate the potential of long-term conservation management systems to mitigate CH4 emissions in tropical and subtropical soils. This study aimed to evaluate the long-term effect (>19 years) of no-till grass- and legume-based cropping systems on annual soil CH4 fluxes in a formerly degraded Acrisol in Southern Brazil. Air sampling was carried out using static chambers, and CH4 was analyzed by gas chromatography. Analysis of the experiment's historical data set showed a remarkable effect of high C- and N-input cropping systems on the improvement of biological, chemical, and physical characteristics of this no-tilled soil. Soil CH4 fluxes, which represent a net balance between consumption (-) and production (+) of CH4 in soil, varied from -40 +/- 2 to +62 +/- 78 µg C m-2 h-1. Mean weighted contents of ammonium (NH4+-N) and dissolved organic carbon (DOC) in soil had a positive relationship with accumulated soil CH4 fluxes in the post-management period (r2 = 0.95, p = 0.05), suggesting an additive effect of these nutrients in suppressing CH4 oxidation and stimulating methanogenesis, respectively, in legume-based cropping systems with high biomass input. Annual CH4 fluxes ranged from -50 +/- 610 to +994 +/- 105 g C ha-1 and were inversely related to annual biomass-C input (r2 = 0.99, p = 0.003), with the exception of the cropping system containing pigeon pea, a summer legume that had the highest biologically fixed N input (>300 kg ha-1 yr-1). Our results showed a small effect of conservation management systems on decreasing CH4 emissions from soil, despite their significant effect in restoring soil quality. We hypothesize that soil CH4 uptake has been offset by an injurious effect of biologically fixed N in legume-based cropping systems on the soil methanotrophic microbiota, and by increased methanogenesis resulting from O2 depletion in niches of high biological activity in the surface layer of the no-till soil. (C) 2012 Elsevier B.V. All rights reserved.
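The inverse relationship reported between annual CH4 flux and biomass-C input (r2 = 0.99) is an ordinary least-squares fit. A minimal sketch of such a regression with placeholder values and assumed units, not the experiment's data:

```python
# Sketch: OLS fit of annual CH4 flux against annual biomass-C input,
# mirroring the r2 statistic reported in the abstract. Values and units
# are placeholders, not the experimental data.
from scipy.stats import linregress

biomass_c_input = [4.0, 6.5, 8.0, 10.5]   # Mg C ha-1 yr-1 (assumed units)
annual_ch4_flux = [900, 400, 100, -50]    # g C ha-1 yr-1

fit = linregress(biomass_c_input, annual_ch4_flux)
print(f"slope={fit.slope:.1f}, r2={fit.rvalue**2:.2f}, p={fit.pvalue:.3f}")
```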
Abstract:
BACKGROUND: Worldwide, diarrheal diseases rank second among conditions that afflict children. Despite this disease burden, there is limited consensus on how to define and measure pediatric acute diarrhea in trials. OBJECTIVES: In RCTs of children in which acute diarrhea was the primary outcome, we documented (1) how acute diarrhea and its resolution were defined, (2) all primary outcomes, (3) the psychometric properties of instruments used to measure acute diarrhea, and (4) the methodologic quality of included trials, as reported. METHODS: We searched CENTRAL, Embase, Global Health, and Medline from inception to February 2009. English-language RCTs of children younger than 19 years that measured acute diarrhea as a primary outcome were chosen. RESULTS: We identified 138 RCTs reporting on 1 or more primary outcomes related to pediatric acute diarrhea/diseases. Included trials used 64 unique definitions of diarrhea, 69 unique definitions of diarrhea resolution, and 46 unique primary outcomes. The majority of included trials evaluated short-term clinical disease activity (incidence and duration of diarrhea), laboratory outcomes, or a composite of these end points. Thirty-two trials used instruments (eg, single- and multidomain scoring systems) to support assessment of disease activity. Of these, 3 trials stated that their instrument was valid; however, none of the trials (or their citations) reported evidence of this validity. The overall methodologic quality of included trials was good. CONCLUSIONS: Even in what would be considered methodologically sound clinical trials, the definitions of diarrhea, primary outcomes, and instruments employed in RCTs of pediatric acute diarrhea are heterogeneous, lack evidence of validity, and focus on indices that may not be important to participants.
Abstract:
OBJECTIVES: We aimed to (i) evaluate psychological distress in adolescent survivors of childhood cancer and compare them to siblings and a norm population; (ii) compare the severity of distress of distressed survivors and siblings with that of psychotherapy patients; and (iii) determine risk factors for psychological distress in survivors. METHODS: We sent a questionnaire to all childhood cancer survivors aged <16 years when diagnosed, who had survived ≥5 years and were aged 16-19 years at the time of study. Our control groups were same-aged siblings, a norm population, and psychotherapy patients. Psychological distress was measured with the Brief Symptom Inventory-18 (BSI-18) assessing somatization, depression, anxiety, and a global severity index (GSI). Participants with a T-score ≥57 were defined as distressed. We used logistic regression to determine risk factors. RESULTS: We evaluated the BSI-18 in 407 survivors and 102 siblings. Fifty-two survivors (13%) and 11 siblings (11%) had scores above the distress threshold (T ≥ 57). Distressed survivors scored significantly higher in somatization (p = 0.027) and GSI (p = 0.016) than distressed siblings, and also scored higher in somatization (p ≤ 0.001) and anxiety (p = 0.002) than psychotherapy patients. In the multivariable regression, psychological distress was associated with female sex, self-reported late effects, and low perceived parental support. CONCLUSIONS: The majority of survivors did not report psychological distress. However, the severity of distress of distressed survivors exceeded that of distressed siblings and psychotherapy patients. Systematic psychological follow-up can help to identify survivors at risk and support them during the challenging period of adolescence. Copyright © 2013 John Wiley & Sons, Ltd.
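The risk-factor analysis above is a standard binary logistic regression on the distressed/not-distressed indicator (T-score >= 57). A sketch using statsmodels with simulated data whose covariates and coefficient signs merely mirror the factors named in the abstract; none of it is from the study:

```python
# Sketch: logistic regression of a binary distress indicator (BSI-18
# T-score >= 57) on the three risk factors named in the abstract.
# The data are simulated with arbitrary coefficients, not study records.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400
female = rng.integers(0, 2, n)
late_effects = rng.integers(0, 2, n)
parental_support = rng.integers(1, 6, n)   # 1 = low, 5 = high support

# Distress made more likely by female sex and late effects, less likely
# with higher perceived parental support (signs match the abstract).
logit = -1.5 + 0.8 * female + 1.0 * late_effects - 0.4 * parental_support
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([female, late_effects, parental_support]))
fit = sm.Logit(y, X).fit(disp=False)
print(np.exp(fit.params).round(2))  # odds ratios: const, female, late effects, support
```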
Abstract:
Prospective cohort studies have provided evidence on the longer-term mortality risks of fine particulate matter (PM2.5), but because of their complexity and costs, only a few have been conducted. By linking monitoring data to the U.S. Medicare system by county of residence, we developed a retrospective cohort study, the Medicare Air Pollution Cohort Study (MCAPS), comprising over 20 million enrollees in the 250 largest counties during 2000-2002. We estimated log-linear regression models with the age-specific mortality rate for each county as the outcome and the average PM2.5 level for 2000 as the main predictor. Area-level covariates were used to adjust for socio-economic status and smoking. We report results under several degrees of adjustment for spatial confounding and with stratification by eastern, central, and western counties. We estimated that a 10 µg/m3 increase in PM2.5 is associated with a 7.6% increase in mortality (95% CI: 4.4 to 10.8%). We found a stronger association in the eastern counties than nationally, with no evidence of an association in western counties. When adjusted for spatial confounding, the estimated log-relative risks drop by 50%. We demonstrated the feasibility of using Medicare data to establish cohorts for follow-up for the effects of air pollution. Particulate matter (PM) air pollution is a global public health problem (1). In developing countries, levels of airborne particles still reach concentrations at which serious health consequences are well documented; in developed countries, recent epidemiologic evidence shows continued adverse effects, even though particle levels have declined in the last two decades (2-6). Increased mortality associated with higher levels of PM air pollution has been of particular concern, giving an imperative for stronger protective regulations (7). Evidence on PM and health comes from studies of acute and chronic adverse effects (6). The London Fog of 1952 provides dramatic evidence of the unacceptable short-term risk of extremely high levels of PM air pollution (8-10); multi-site time-series studies of daily mortality show that far lower levels of particles are still associated with short-term risk (5,11-13). Cohort studies provide complementary evidence on the longer-term risks of PM air pollution, indicating the extent to which exposure reduces life expectancy. The design of these studies involves follow-up of cohorts for mortality over periods of years to decades and an assessment of mortality risk in association with estimated long-term exposure to air pollution (2-4,14-17). Because of the complexity and costs of such studies, only a small number have been conducted. The most rigorously executed, including the Harvard Six Cities Study and the American Cancer Society's (ACS) Cancer Prevention Study II, have provided generally consistent evidence for an association of long-term exposure to particulate matter air pollution with increased all-cause and cardio-respiratory mortality (2,4,14,15). Results from these studies have been used in risk assessments conducted for setting the U.S. National Ambient Air Quality Standard (NAAQS) for PM and for estimating the global burden of disease attributable to air pollution (18,19). Additional prospective cohort studies are necessary, however, to confirm associations between long-term exposure to PM and mortality, to broaden the populations studied, and to refine estimates across regions in which particle composition varies. Toward this end, we have used data from the U.S. Medicare system, which covers nearly all persons 65 years of age and older in the United States. We linked Medicare mortality data to PM2.5 (particulate matter less than 2.5 µm in aerodynamic diameter) air pollution monitoring data to create a new retrospective cohort study, the Medicare Air Pollution Cohort Study (MCAPS), consisting of 20 million persons from 250 counties and representing about 50% of the US population of elderly persons living in urban settings. In this paper, we report on the relationship between longer-term exposure to PM2.5 and mortality risk over the period 2000 to 2002 in the MCAPS.
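The log-linear model described (county age-specific mortality rates regressed on average PM2.5 with area-level covariates) corresponds to a Poisson regression with a log-population offset. A sketch with fully simulated county data, not MCAPS data; the 7.6%-per-10-µg/m3 figure is used only to seed the simulation:

```python
# Sketch: log-linear (Poisson) model of county death counts with a
# log-population offset, PM2.5 as the exposure of interest, and one
# area-level covariate. All data below are simulated, not MCAPS data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_counties = 250
pm25 = rng.uniform(5, 20, n_counties)            # µg/m3
ses = rng.normal(0, 1, n_counties)               # area-level covariate
population = rng.integers(20_000, 500_000, n_counties)

# Simulated rates: ~7.6% mortality increase per 10 µg/m3, i.e. a
# log-rate slope of log(1.076)/10 (taken from the abstract's estimate).
log_rate = np.log(0.05) + (np.log(1.076) / 10) * pm25 - 0.05 * ses
deaths = rng.poisson(np.exp(log_rate) * population)

X = sm.add_constant(np.column_stack([pm25, ses]))
fit = sm.GLM(deaths, X, family=sm.families.Poisson(),
             offset=np.log(population)).fit()
pct_per_10 = (np.exp(10 * fit.params[1]) - 1) * 100
print(f"{pct_per_10:.1f}% increase in mortality per 10 µg/m3 PM2.5")
```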
Abstract:
AIMS: To determine whether the current practice of sweat testing in Swiss hospitals is consistent with current international guidelines. METHODS: A questionnaire was mailed to all children's hospitals (n = 8), regional paediatric sections of general hospitals (n = 28), and all adult pulmonology centres (n = 8) in Switzerland that care for patients with cystic fibrosis (CF). The results were compared with the published 2000 guidelines of the American National Committee for Clinical Laboratory Standards (NCCLS) and the UK guidelines of 2003. RESULTS: The response rate was 89%. All 8 children's hospitals and 18 of the 23 responding paediatric sections performed sweat tests, but none of the adult pulmonology centres did. In total, 1560 sweat tests per year were performed (range: 5-200 tests/centre/year, median 40). 88% (23/26) used Wescor systems: 73% (19/26) the Macroduct system for collecting sweat and 31% (8/26) the Nanoduct system. Sweat chloride was determined by only 62% (16/26) of all centres; of these, only 63% (10/16) indicated that they used the recommended diagnostic chloride CF reference value of >60 mmol/l. Osmolality was measured in 35%, sodium in 42%, and conductivity in 62% of the hospitals. Sweat was collected for a maximum of 30-120 (median 55) minutes; only three centres used the maximum 30-minute sample time recommended by the international guidelines. CONCLUSIONS: Sweat testing practice in Swiss hospitals was inconsistent and seldom followed the current international guidelines for sweat collection, analysis method, and reference values. Only 62% used the chloride concentration as a diagnostic reference, the only diagnostic measurement accepted by the NCCLS or UK guidelines.
Abstract:
INTRODUCTION: Primary ciliary dyskinesia (PCD) is a rare hereditary recessive disease with symptoms of recurrent pneumonia, chronic bronchitis, bronchiectasis, and chronic sinusitis. Chronic rhinitis is often the presenting symptom in newborns and infants. Approximately half of the patients show visceral mirror-image arrangements (situs inversus). In this study, we aimed 1) to determine the number of paediatric PCD patients in Austria, 2) to describe the diagnostic and therapeutic modalities used in the clinical centres, and 3) to describe the symptoms of children with PCD. PATIENTS, MATERIAL AND METHODS: For the first two aims, we analysed data from a questionnaire survey of the European Respiratory Society (ERS) task force on Primary Ciliary Dyskinesia in children. All paediatric respiratory units in Austria received a questionnaire. Symptoms of PCD patients from the Vienna Children's University Hospital (aim 3) were extracted from case histories. RESULTS: In 13 Austrian clinics, 48 patients with PCD (36 aged 0-19 years) were identified. The prevalence of reported cases (aged 0-19 yrs) in Austria was 1:48,000. Median age at diagnosis was 4.8 years (IQR 0.3-8.2), lower in children with situs inversus than in those without (3.1 vs. 8.1 yrs, p = 0.067). In 2005-2006, the saccharine test was still the most commonly used screening test for PCD in Austria (45%). Confirmation of the diagnosis was usually by electron microscopy (73%). All clinics treated exacerbations immediately with antibiotics; 73% prescribed airway clearance therapy routinely to all patients. Other therapies and diagnostic tests were applied very inconsistently across Austrian hospitals. All PCD patients from Vienna (n = 13) had increased upper and lower respiratory secretions; most had recurring airway infections (n = 12), bronchiectasis (n = 7) and bronchitis (n = 7). CONCLUSION: Diagnosis and therapy of PCD in Austria are inhomogeneous. Prospective studies are needed to learn more about the course of the disease and to evaluate the benefits and harms of different treatment strategies.
Abstract:
To study the time course of demineralization and fracture incidence after spinal cord injury (SCI), 100 paraplegic men with complete motor loss were investigated in a cross-sectional study 3 months to 30 years after their traumatic SCI. Fracture history was assessed and verified using patients' files and X-rays. BMD of the lumbar spine (LS), femoral neck (FN), distal forearm (ultradistal part = UDR, 1/3 distal part = 1/3R), distal tibial diaphysis (TDIA), and distal tibial epiphysis (TEPI) was measured using DXA. Stiffness of the calcaneus (QUI.CALC), speed of sound of the tibia (SOS.TIB), and amplitude-dependent SOS across the proximal phalanges (adSOS.PHAL) were measured using quantitative ultrasound (QUS). Z-scores of BMD and QUS were plotted against time-since-injury and compared among four groups of paraplegics stratified according to time-since-injury (<1 year, stratum I; 1-9 years, stratum II; 10-19 years, stratum III; 20-29 years, stratum IV). Biochemical markers of bone turnover (deoxypyridinoline/creatinine (D-pyr/Cr), osteocalcin, alkaline phosphatase) and the main parameters of calcium phosphate metabolism were measured. Fifteen of 98 paraplegics had sustained a total of 39 fragility fractures within 1,010 patient-years of observation. All recorded fractures were fractures of the lower limbs, with a mean time to first fracture of 8.9 +/- 1.4 years. Fracture incidence increased with time after SCI, from 1% in the first 12 months to 4.6%/year in paraplegics injured for >20 years (p < .01). The overall fracture incidence was 2.2%/year. Compared with nonfractured paraplegics, those with a fracture history had been injured for a longer time (p < .01). Furthermore, they had lower Z-scores at FN, TEPI, and TDIA (p < .01 to < .0001), with the largest difference observed at TDIA. At the lower limbs, BMD decreased with time at all sites (r = .49 to .78, all p < .0001). At FN and TEPI, bone loss followed a log curve which leveled off between 1 and 3 years after injury. In contrast, Z-scores of TDIA continuously decreased even beyond 10 years after injury. The LS BMD Z-score increased with time since SCI (p < .05). Similarly to DXA, QUS allowed differentiation of early and rapid trabecular bone loss (QUI.CALC) vs. slow and continuous cortical bone loss (SOS.TIB). Biochemical markers reflected a disproportion between highly elevated bone resorption and almost normal bone formation early after injury. Turnover declined following a log curve with time after SCI; however, D-pyr/Cr remained elevated in 30% of paraplegics injured >10 years earlier. In paraplegic men, early (trabecular) and persistent (cortical) bone loss occurs at the lower limbs and leads to an increasing fracture incidence with time after SCI.
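As background, the Z-scores used throughout express each BMD or QUS measurement relative to an age- and sex-matched reference population; the standard definition (general convention, not specific to this paper) is:

```latex
Z = \frac{X_{\text{patient}} - \mu_{\text{reference}}}{\sigma_{\text{reference}}}
```

so a Z-score of -1 places a measurement one reference standard deviation below the reference mean.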
Abstract:
BACKGROUND: A number of epidemiological studies indicate an inverse association between atopy and brain tumors in adults, particularly gliomas. We investigated the association between atopic disorders and intracranial brain tumors in children and adolescents, using international collaborative CEFALO data. PATIENTS AND METHODS: CEFALO is a population-based case-control study conducted in Denmark, Norway, Sweden, and Switzerland, including all children and adolescents aged 7-19 years diagnosed with a primary brain tumor between 2004 and 2008. Two controls per case were randomly selected from population registers, matched on age, sex, and geographic region. Information about atopic conditions and potential confounders was collected through personal interviews. RESULTS: In total, 352 cases (83%) and 646 controls (71%) participated in the study. For all brain tumors combined, there was no association between ever having had an atopic disorder and brain tumor risk [odds ratio (OR) 1.03; 95% confidence interval (CI) 0.70-1.34]. The OR was 0.76 (95% CI 0.53-1.11) for a current atopic condition (in the year before diagnosis) and 1.22 (95% CI 0.86-1.74) for an atopic condition in the past. Similar results were observed for glioma. CONCLUSIONS: There was no association between atopic conditions and the risk of all brain tumors combined or of glioma in particular. Stratification by current or past atopic conditions suggested the possibility of reverse causality, but this may also be the result of random variation owing to small numbers in subgroups. In addition, ongoing tumor treatment may affect the manifestation of atopic conditions, which could affect recall when reporting a history of atopic diseases. Only a few studies on atopic conditions and pediatric brain tumors are currently available, and the evidence is conflicting.
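The odds ratios and confidence intervals quoted above are the standard case-control estimates. A minimal sketch of the unconditional 2x2-table computation with a Woolf (log-based) interval, using illustrative counts rather than CEFALO data:

```python
# Sketch: odds ratio with a 95% Woolf (log-based) confidence interval
# from a 2x2 exposure-by-case table. The counts are illustrative only.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

print(odds_ratio_ci(120, 232, 220, 426))  # OR ~1.00 with its 95% CI
```

Because controls were matched on age, sex, and region, an analysis of the actual study data would ordinarily use conditional logistic regression; the unconditional form above is shown only to make the quoted quantities concrete.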