43 results for Objective risk
Abstract:
Objective: The objective was to evaluate the cardiovascular profile of first-episode psychosis patients in Sao Paulo, Brazil, an issue that has not been sufficiently explored in low-/middle-income countries. Method: A cross-sectional study was performed 1 to 3 years after an initial, larger survey that assessed first-episode psychosis in Sao Paulo. We evaluated cardiovascular risk factors and lifestyle habits using standard clinical examination and laboratory evaluation. Results: Of 151 contacted patients, 82 agreed to participate (mean age=35 years; 54% female). The following diagnoses were found: 20.7% were obese, 29.3% had hypertension, 39.0% had dyslipidemia, 19.5% had metabolic syndrome, and 1.2% had a >20% 10-year risk of coronary heart disease based on the Framingham score. Also, 72% were sedentary, 25.6% were current smokers, and 7.3% reported heavy alcohol intake. Conclusion: Compared to other samples, ours presented a distinct profile of higher rates of hypertension and diabetes (possibly due to dietary habits) and lower rates of smoking and alcohol intake (possibly due to higher dependence on social support). Indirect comparison with healthy, age-matched Brazilians revealed that our sample had higher frequencies of hypertension, diabetes and metabolic syndrome. Therefore, we confirmed a high cardiovascular risk in first-episode psychosis in Brazil. Transcultural studies are needed to investigate to what extent lifestyle contributes to such increased risk. (C) 2012 Elsevier Inc. All rights reserved.
Abstract:
The Simplified Acute Physiology Score II (SAPS II) and Logistic Organ Dysfunction System (LODS) are instruments used to classify Intensive Care Unit (ICU) inpatients according to the severity of their condition and risk of death, and to evaluate the quality of nursing care. The objective of this study was to evaluate and compare the performance of SAPS II and LODS in predicting the mortality of patients admitted to the ICU. The participants were 600 patients from four ICUs located in Sao Paulo, Brazil. Receiver Operating Characteristic (ROC) curves were used to compare the performance of the indexes. Results: The areas under the ROC curves of LODS (0.69) and SAPS II (0.71) indicated moderate discriminatory capacity to identify death or survival. No statistically significant difference was found between these areas (p=0.26). In conclusion, SAPS II and LODS were equivalent in estimating the risk of death of ICU patients.
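The abstract compares the two severity indexes by the area under their ROC curves. The sketch below, in Python with synthetic data, illustrates that kind of comparison; the bootstrap of the AUC difference is an illustrative stand-in, not the authors' statistical procedure, and all variable names and values are hypothetical.

```python
# Minimal sketch: comparing the discriminatory capacity of two severity scores
# (e.g., SAPS II vs. LODS) for ICU mortality via ROC AUC, on synthetic data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)

# Hypothetical cohort: 600 patients, ~20% mortality, two scores loosely related to outcome.
n = 600
death = rng.binomial(1, 0.2, size=n)
saps2 = 30 + 10 * death + rng.normal(0, 12, size=n)  # higher score -> higher risk
lods = 4 + 2 * death + rng.normal(0, 3, size=n)

print(f"AUC SAPS II: {roc_auc_score(death, saps2):.2f}")
print(f"AUC LODS:    {roc_auc_score(death, lods):.2f}")

# Bootstrap the AUC difference to judge whether the two indexes really differ.
diffs = []
for _ in range(2000):
    idx = rng.integers(0, n, size=n)
    if death[idx].min() == death[idx].max():
        continue  # resample must contain both survivors and non-survivors
    diffs.append(roc_auc_score(death[idx], saps2[idx]) - roc_auc_score(death[idx], lods[idx]))
lo, hi = np.percentile(diffs, [2.5, 97.5])
print(f"AUC difference: {np.mean(diffs):.3f} (95% bootstrap CI {lo:.3f} to {hi:.3f})")
```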
Abstract:
The objective of this study was to estimate the prevalence of inadequate micronutrient intake and excess sodium intake among adults age 19 years and older in the city of Sao Paulo, Brazil. Twenty-four-hour dietary recall and sociodemographic data were collected from each participant (n=1,663) in a cross-sectional study, Inquiry of Health of Sao Paulo, of a representative sample of the adult population of the city of Sao Paulo in 2003 (ISA-2003). The variability in intake was measured through two replications of the 24-hour recall in a subsample of this population in 2007 (ISA-2007). Usual intake was estimated with the PC-SIDE program (version 1.0, 2003, Department of Statistics, Iowa State University), which uses an approach developed by Iowa State University. The prevalence of nutrient inadequacy was calculated using the Estimated Average Requirement cut-point method for vitamins A and C, thiamin, riboflavin, niacin, copper, phosphorus, and selenium. For vitamin D, pantothenic acid, manganese, and sodium, the proportion of individuals with usual intake equal to or more than the Adequate Intake value was calculated. The percentage of individuals with intake equal to or more than the Tolerable Upper Intake Level was calculated for sodium. The highest prevalences of inadequacy for males and females, respectively, occurred for vitamin A (67% and 58%), vitamin C (52% and 62%), thiamin (41% and 50%), and riboflavin (29% and 19%). Adjustment for within-person variation yielded lower prevalences of inadequacy because within-person variability was removed. All adult residents of Sao Paulo had excess sodium intake, and the rates of nutrient inadequacy were high for certain key micronutrients. J Acad Nutr Diet. 2012;112:1614-1618.
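Under the Estimated Average Requirement (EAR) cut-point method mentioned above, the prevalence of inadequacy is the proportion of individuals whose usual intake falls below the EAR. The sketch below shows that final step in Python on hypothetical usual-intake values; the PC-SIDE estimation of usual intake (removal of within-person variability) is not reproduced.

```python
# Minimal sketch of the EAR cut-point method: prevalence of inadequacy is the
# share of people whose usual intake is below the Estimated Average Requirement.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical usual vitamin C intakes (mg/day) for adult men, already adjusted
# for within-person variability (here just simulated as a lognormal distribution).
usual_intake_vit_c = rng.lognormal(mean=4.2, sigma=0.5, size=1000)
ear_vit_c_men = 75.0  # mg/day, EAR for adult men

prevalence_inadequacy = np.mean(usual_intake_vit_c < ear_vit_c_men)
print(f"Prevalence of inadequate vitamin C intake: {prevalence_inadequacy:.0%}")
```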
Abstract:
Introduction: Bipolar disorder (BD) is a highly incapacitating disease typically associated with high rates of familial dysfunction. Despite recent literature suggesting that maternal care is an important environmental factor in the development of behavioral disorders, it is unclear to what extent maternal care is dysfunctional in BD subjects. Objective: The objective of this study was to characterize maternal care in DSM-IV/SCID-diagnosed BD type I subjects compared to healthy controls with (PD) and without (NPD) other psychiatric diagnoses. Materials and methods: Thirty-four BD mothers and 106 controls underwent an interview about family planning and maternal care, obstetrical complications, and mother-child interactions. K-SADS-PL questions about violence exposure were used to ascertain domestic violence and physical/sexual abuse. Results: BD mothers were less likely to have stable unions (45.5%; p < 0.01) or to live with the biological father of their children (33.3%; p < 0.01), but had a higher educational level and higher rates of social security use/retirement. They also had fewer children and used fewer contraceptive methods than controls. Children of BD women had higher rates of neonatal anoxia, and reported more physical abuse (16.1%; p = 0.02) than offspring of NPD mothers. Due to BD mothers' symptoms, 33.3% of offspring suffered physical and/or psychological abuse. Limitations: Post hoc analysis, and the use of questions as a surrogate for symptoms as opposed to validated instruments. Conclusion: This is one of the few reports confirming that maternal care given by BD women is dysfunctional. BD psychopathology can lead to poor maternal care, and both should be considered important environmental risk factors in BD, suggesting that BD psychoeducation should include maternal care orientation. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
The objective of this study was to identify, among motorcyclists involved in traffic incidents, the factors associated with risk of injuries. In 2004, in the city of Maringa-PR, a total of 2,362 motorcyclists were involved in traffic incidents, according to records from the local Military Police. Multivariate analysis was applied to identify the factors associated with the presence of injury. A significantly higher probability of injury was observed among motorcyclists involved in collisions (odds ratio = 11.19) and falls (odds ratio = 3.81); the estimated odds ratio for females was close to four, and those involved in incidents involving up to two vehicles were 2.63 times more likely to have injuries. Women involved in motorcycle falls and collisions with up to two vehicles stood out as a high-risk group for injuries.
Abstract:
Introduction. The number of patients with terminal heart failure has increased faster than the number of available organs, leading to a high mortality rate on the waiting list. Use of marginal and expanded-criteria donors has increased due to the donor heart shortage. Objective. We analyzed all heart transplantations (HTx) in Sao Paulo state over 8 years for donor profile and recipient risk factors. Method. This multi-institutional review collected HTx data from all institutions in the state of Sao Paulo, Brazil. From 2002 to 2008 (6 years), only 512 (28.8%) of 1777 available heart donors were accepted for transplantation. All medical records were analyzed retrospectively; none of the used donors was excluded, even those considered to be nonstandard. Results. The hospital mortality rate was 27.9% (n = 143) and the average follow-up time was 29.4 +/- 28.4 months. The survival rate was 55.5% (n = 285) at 6 years after HTx. Univariate analysis evaluated the impact on survival of the following factors: age (P = .0004), arterial hypertension (P = .4620), norepinephrine (P = .0450), cardiac arrest (P = .8500), diabetes mellitus (P = .5120), infection (P = .1470), CKMB (creatine kinase MB) (P = .8694), creatinine (P = .7225), and Na+ (P = .3273). On multivariate analysis, only age showed significance; logistic regression showed a significant cut-off at 40 years: organs from donors older than 40 years showed lower late survival rates (P = .0032). Conclusions. Donor age older than 40 years represents an important risk factor for survival after HTx. Neither donor gender nor norepinephrine use negatively affected early survival.
Abstract:
Objective: To determine the accuracy of the Timed Up and Go Test (TUGT) for screening the risk of falls among community-dwelling elderly individuals. Method: This is a prospective cohort study of 63 community-dwelling elderly individuals, selected by random sampling without replacement and stratified proportionally by gender. Elderly individuals were excluded if they reported having Parkinson's disease, a history of transient ischemic attack or stroke, scored lower than expected for their educational level on the Mini Mental State Exam, used a wheelchair, or reported a single fall in the previous six months. The TUGT, a mobility test, was the measure of interest, and the occurrence of falls was the outcome. The performance of basic activities of daily living (ADL) and instrumental activities of daily living (IADL) was determined through the Older American Resources and Services, and the socio-demographic and clinical data were determined through the use of additional questionnaires. Receiver Operating Characteristic curves were used to analyze the sensitivity and specificity of the TUGT. Results: Elderly individuals who fell had greater difficulties in ADL and IADL (p<0.01) and a slower performance on the TUGT (p=0.02). No differences were found in socio-demographic and clinical characteristics between fallers and non-fallers. Considering the different sensitivities and specificities, the best predictive value for discriminating elderly individuals who fell was 12.47 seconds [RR = 3.2; 95% CI: 1.3-7.7]. Conclusions: The TUGT proved to be an accurate measure for screening the risk of falls among elderly individuals. Although different from that reported in the international literature, the 12.47-second cutoff point seems to be a better predictive value for Brazilian elderly individuals.
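The 12.47-second cutoff reported above comes from ROC analysis of TUGT times against fall status. The sketch below shows, on synthetic data in Python, one common way such a cutoff can be chosen (maximizing Youden's J, i.e., sensitivity + specificity - 1); the authors' exact selection criterion and data are not reproduced.

```python
# Minimal sketch: picking a screening cutoff for TUGT time from a ROC curve
# by maximizing Youden's J. Data are synthetic.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(1)

# Hypothetical sample of 63 elderly individuals: fallers tend to be slower.
fell = rng.binomial(1, 0.3, size=63)
tugt_seconds = 10 + 3 * fell + rng.normal(0, 2, size=63)

fpr, tpr, thresholds = roc_curve(fell, tugt_seconds)
youden_j = tpr - fpr
best = np.argmax(youden_j)
print(f"Best cutoff: {thresholds[best]:.2f} s "
      f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
```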
Abstract:
The objective of this study was to determine the frequencies of autoantibodies to heterogeneous islet-cell cytoplasmic antigens (ICA), glutamic acid decarboxylase 65 (GAD65A), insulinoma-associated antigen-2 (IA-2A) and insulin (IAA), and of human leukocyte antigen (HLA) class II markers (HLA-DR and -DQ), in first-degree relatives of heterogeneous Brazilian patients with type 1 diabetes (T1DM). A major focus of this study was to determine the influence of age, gender, proband characteristics and ancestry on the prevalence of autoantibodies and HLA-DR and -DQ alleles, and on disease progression and genetic predisposition to T1DM, among the first-degree relatives. IAA, ICA, GAD65A, IA-2A and HLA class II alleles were determined in 546 first-degree relatives, 244 siblings, 55 offspring and 233 parents of 178 Brazilian patients with T1DM. Overall, 8.9% of the relatives were positive for one or more autoantibodies. IAA was the only antibody detected in parents. GAD65 was the most prevalent antibody in offspring and siblings as compared to parents, and it was the sole antibody detected in offspring. Five siblings were positive for the IA-2 antibody. A significant number (62.1%) of siblings had 1 or 2 high-risk HLA haplotypes. During a 4-year follow-up study, 5 siblings (expressing HLA-DR3 or -DR4 alleles) and 1 offspring positive for GAD65A progressed to diabetes. The data indicated that the GAD65 and IA-2 antibodies were the strongest predictors of T1DM in our study population. The high-risk HLA haplotypes alone were not predictive of progression to overt diabetes.
Abstract:
Background: The biobehavioural pain reactivity and recovery of preterm infants in the neonatal period may reflect the capacity of the central nervous system to regulate neurobiological development. Objective: The aim of the present study was to analyse the influence of the neonatal clinical risk for illness severity on biobehavioural pain reactivity in preterm infants. Methods: Fifty-two preterm infants were allocated into two groups according to neonatal severity of illness, as measured by the Clinical Risk Index for Babies (CRIB). The low clinical risk (LCr) group included 30 neonates with CRIB scores <4, and the high clinical risk (HCr) group included 22 neonates with CRIB scores >= 4. Pain reactivity was assessed during a blood collection, which was divided into five phases (baseline, antisepsis, puncture, recovery-dressing and recovery-resting). Behavioural pain reactivity was measured using the scores and the magnitude of responses on the Neonatal Facial Coding System (NFCS) and the Sleep-Wake States Scale (SWS). The heart rate was continuously recorded. Results: The HCr group demonstrated a higher magnitude of response on the SWS score from the baseline to the puncture phase than the LCr group. Also, the HCr group exhibited a higher mean heart rate and minimum heart rate than the LCr group in the recovery-resting phase. In addition, the HCr group exhibited a higher minimum heart rate from the baseline to the recovery-resting phase than the LCr group. Conclusion: The infants exhibiting a high neonatal clinical risk showed high arousal during the puncture procedure and higher physiological reactivity in the recovery phase.
Abstract:
OBJECTIVE: To analyze the nutritional status of pediatric patients after orthotopic liver transplantation and its relationship with short-term clinical outcome. METHOD: Anthropometric evaluations of 60 children and adolescents after orthotopic liver transplantation were performed during the first 24 hours in a tertiary pediatric intensive care unit. Nutritional status was determined from the Z score for the following indices: weight/age, height/age or length/age, weight/height or weight/length, body mass index/age, arm circumference/age and triceps skinfold/age. The severity of liver disease was evaluated using one of two models, according to the patient's age: 1. Pediatric End-Stage Liver Disease; 2. Model for End-Stage Liver Disease. RESULTS: We found 50.0% undernutrition by height/age; 27.3% by weight/age; 11.1% by weight/height or weight/length; 10.0% by body mass index/age; 61.6% by arm circumference/age and 51.0% by triceps skinfold/age. There was no correlation between nutritional status and Pediatric End-Stage Liver Disease or mortality. We found a negative correlation between arm circumference/age and length of hospitalization. CONCLUSION: Children with chronic liver diseases experience a significant degree of undernutrition, which makes nutritional support an important aspect of therapy. Despite the difficulties in assessment, anthropometric evaluation of the upper limbs is useful to evaluate the nutritional status of children before or after liver transplantation.
Abstract:
OBJECTIVE: Many changes in mucosal morphology are observed following ileal pouch construction, including colonic metaplasia and dysplasia. Additionally, one rare but potential complication is the development of adenocarcinoma of the reservoir. The aim of this study was to evaluate the most frequently observed histopathological changes in ileal pouches and to correlate these changes with potential risk factors for complications. METHODS: A total of 41 patients were enrolled in the study and divided into the following three groups: a non-pouchitis group (group 1) (n = 20; 8 males; mean age: 47.5 years) demonstrating optimal outcome; a pouchitis without antibiotics group (group 2) (n = 14; 4 males; mean age: 47 years), containing individuals with pouchitis who did not receive treatment with antibiotics; and a pouchitis plus antibiotics group (group 3) (n = 7; 3 males; mean age: 41 years), containing those patients with pouchitis who were administered antibiotics. Ileal pouch endoscopy was performed, and tissue biopsy samples were collected for histopathological analysis. RESULTS: Colonic metaplasia was found in 15 (36.6%) of the 41 patients evaluated; of these, five (25%) were from group 1, eight (57.1%) were from group 2, and two (28.6%) were from group 3. However, no correlation was established between the presence of metaplasia and pouchitis (p = 0.17), and no differences in mucosal atrophy or the degree of chronic or acute inflammation were observed between groups 1, 2, and 3 (p > 0.45). Moreover, no dysplasia or neoplastic changes were detected. However, the degree of mucosal atrophy correlated well with the time of postoperative follow-up (p = 0.05). CONCLUSIONS: The degree of mucosal atrophy, the presence of colonic metaplasia, and the degree of acute or chronic inflammation do not appear to constitute risk factors for the development of pouchitis. Moreover, we observed that longer postoperative follow-up times were associated with greater degrees of mucosal atrophy.
Abstract:
Objective: The objective of this study was to analyze the incidence of and risk factors for healthcare-associated infections (HAI) among hematopoietic stem cell transplantation (HSCT) patients, and the impact of such infections on mortality during hospitalization. Methods: We conducted a 9-year (2001-2009) retrospective cohort study including patients who underwent HSCT at a reference center in Sao Paulo, Brazil. The incidence of HAI was calculated using days of neutropenia as the denominator. Data were analyzed using EpiInfo 3.5.1. Results: Over the 9-year period there were 429 neutropenic HSCT patients, with a total of 6816 days of neutropenia. Bloodstream infections (BSI) were the most frequent infection, presenting in 80 (18.6%) patients, with an incidence of 11.7 per 1000 days of neutropenia. Most bloodstream infections were due to Gram-negative bacteria: 43 (53.8%) cases were caused by Gram-negative species, while 33 (41.2%) were caused by Gram-positive species, and four (5%) by fungal species. Independent risk factors associated with HAI were prolonged neutropenia (odds ratio (OR) 1.07, 95% confidence interval (CI) 1.04-1.10) and duration of fever (OR 1.20, 95% CI 1.12-1.30). Risk factors associated with death in multivariate analyses were age (OR 1.02, 95% CI 1.01-1.43), undergoing an allogeneic transplant (OR 3.08, 95% CI 1.68-5.56), a microbiologically documented infection (OR 2.96, 95% CI 1.87-4.6), invasive aspergillosis disease (OR 2.21, 95% CI 1.1-4.3), and acute leukemias (OR 2.24, 95% CI 1.3-3.6). Conclusions: BSI was the most frequent HAI, and there was a predominance of Gram-negative microorganisms. Independent risk factors associated with HAI were duration of neutropenia and fever, and the risk factors for a poor outcome were older age, type of transplant (allogeneic), the presence of a microbiologically documented infection, invasive aspergillosis, and acute leukemia. Further prospective studies with larger numbers of patients may confirm the role of these risk factors for a poor clinical outcome and death in this transplant population. (C) 2012 Published by Elsevier Ltd on behalf of International Society for Infectious Diseases.
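The reported BSI incidence follows directly from the figures in the abstract: the number of events divided by the total days of neutropenia, expressed per 1000 neutropenia-days. A quick check in Python:

```python
# Incidence of bloodstream infections per 1000 days of neutropenia,
# using the counts reported in the abstract.
bsi_events = 80
neutropenia_days = 6816

incidence_per_1000_days = bsi_events / neutropenia_days * 1000
print(f"BSI incidence: {incidence_per_1000_days:.1f} per 1000 days of neutropenia")  # ~11.7
```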
Abstract:
Objective: To evaluate whether the presence of polycystic ovary syndrome (PCOS) alters multiple ultrasonographic and laboratory markers of metabolic and cardiovascular disease risk in obese women without any other health condition that could interfere with combined oral contraceptive (COC) eligibility criteria. Methods: This was a case-control study evaluating 90 obese women (body mass index (BMI) >= 30.0 kg/m2 and < 40 kg/m2) aged between 18 and 40 years without any other health condition that could interfere with COC eligibility criteria, of whom 45 had PCOS and 45 were age-matched controls. BMI, waist and hip circumference, arterial blood pressure, fasting insulin and glucose, quantitative insulin sensitivity check index (QUICKI), high-density lipoprotein cholesterol, low-density lipoprotein cholesterol, total cholesterol, triglycerides, testosterone, sex hormone-binding globulin, free androgen index (FAI), carotid stiffness index, intima-media thickness, flow-mediated dilatation (FMD) of the brachial artery and non-alcoholic fatty liver disease (NAFLD) were assessed. Results: In women with PCOS, we observed a higher frequency of NAFLD (73.3 vs. 46.7%, P < 0.01) and a higher FAI (10.4 vs. 6.8%, P < 0.01). We also observed a trend towards increased insulin levels (10.06 +/- 6.66 vs. 7.45 +/- 5.88 μIU/mL, P = 0.05), decreased QUICKI (0.36 +/- 0.06 vs. 0.39 +/- 0.07, P = 0.05) and decreased FMD (7.00 +/- 3.87 vs. 8.41 +/- 3.79%, P = 0.08). No other significant differences were observed. Conclusions: NAFLD is frequent in obese women without any other health condition that could interfere with COC eligibility criteria, especially in those with PCOS. This should be considered when choosing the best contraceptive option. Copyright (C) 2012 ISUOG. Published by John Wiley & Sons, Ltd.
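The QUICKI values reported (about 0.36-0.39) are consistent with the standard formula, QUICKI = 1 / (log10(fasting insulin in μIU/mL) + log10(fasting glucose in mg/dL)). The sketch below assumes that formula; the example values are hypothetical but in the range given in the abstract.

```python
# Minimal sketch of the quantitative insulin sensitivity check index (QUICKI),
# assuming the standard formula with insulin in uIU/mL and glucose in mg/dL.
import math

def quicki(fasting_insulin_uiu_ml: float, fasting_glucose_mg_dl: float) -> float:
    """QUICKI = 1 / (log10(insulin) + log10(glucose)); lower values mean lower insulin sensitivity."""
    return 1.0 / (math.log10(fasting_insulin_uiu_ml) + math.log10(fasting_glucose_mg_dl))

print(f"QUICKI: {quicki(10.0, 90.0):.2f}")  # hypothetical fasting insulin 10 uIU/mL, glucose 90 mg/dL
```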
Abstract:
The objective of this study was to compare the perceptions of families living in two different neighborhoods (rated according to risk levels) regarding social support. A questionnaire was designed to assess social support according to the following dimensions: instrumental, emotional, religious, and support from friends, neighbors and family. The sample was composed as follows: of the 114 families living in neighborhood 1, 52 were interviewed; and of the 162 families living in neighborhood 2, 60 were interviewed. No significant difference was found between the families from the two neighborhoods with respect to instrumental, religious and emotional support, including support from relatives. The results disagree with the reviewed literature, which indicated a strong association between social support and families living at socioeconomic risk. In conclusion, social support is important for families, regardless of their risk stratification.
Abstract:
Objective: To assess the risk factors for delayed diagnosis of uterine cervical lesions. Materials and Methods: This is a case-control study that recruited 178 women at 2 Brazilian hospitals. The cases (n = 74) were composed of women with a late diagnosis of a lesion in the uterine cervix (invasive carcinoma at any stage). The controls (n = 104) were composed of women with cervical lesions diagnosed early (low- or high-grade intraepithelial lesions). The analysis was performed by means of a logistic regression model with a hierarchical structure. The socioeconomic and demographic variables were included at level I (distal). Level II (intermediate) included the personal and family antecedents and knowledge about the Papanicolaou test and human papillomavirus. Level III (proximal) encompassed the variables relating to individuals' care for their own health, gynecologic symptoms, and access to the health care system. Results: The risk factors for late diagnosis of uterine cervical lesions were age older than 40 years (odds ratio [OR] = 10.4; 95% confidence interval [CI], 2.3-48.4), not knowing the difference between the Papanicolaou test and gynecological pelvic examinations (OR = 2.5; 95% CI, 1.3-4.9), not thinking that the Papanicolaou test was important (OR = 4.2; 95% CI, 1.3-13.4), and abnormal vaginal bleeding (OR = 15.0; 95% CI, 6.5-35.0). Previous treatment for a sexually transmissible disease was a protective factor (OR = 0.3; 95% CI, 0.1-0.8) against delayed diagnosis. Conclusions: Deficiencies in cervical cancer prevention programs in developing countries are not simply a matter of better provision and coverage of Papanicolaou tests. The misconception about the Papanicolaou test is a serious educational problem, as demonstrated by the present study.
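The odds ratios and 95% confidence intervals reported above are the usual outputs of a logistic regression of case/control status on risk factors. The sketch below illustrates that extraction step in Python with statsmodels on synthetic data and two hypothetical predictors; the study's hierarchical, level-by-level modelling strategy is not reproduced.

```python
# Minimal sketch: fitting a logistic regression and reporting odds ratios with
# 95% confidence intervals. Data and predictors are synthetic/hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 178  # same sample size as the study; the data themselves are invented

age_over_40 = rng.binomial(1, 0.5, size=n)
abnormal_bleeding = rng.binomial(1, 0.3, size=n)
logit = -1.5 + 1.2 * age_over_40 + 2.0 * abnormal_bleeding
late_diagnosis = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([age_over_40, abnormal_bleeding]))
model = sm.Logit(late_diagnosis, X).fit(disp=0)

odds_ratios = np.exp(model.params)
conf_ints = np.exp(model.conf_int())
for name, or_, (ci_lo, ci_hi) in zip(["intercept", "age > 40 years", "abnormal bleeding"],
                                     odds_ratios, conf_ints):
    print(f"{name}: OR = {or_:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f})")
```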