869 results for "Risk model"


Relevance: 30.00%

Abstract:

de Araujo CC, Silva JD, Samary CS, Guimaraes IH, Marques PS, Oliveira GP, do Carmo LGRR, Goldenberg RC, Bakker-Abreu I, Diaz BL, Rocha NN, Capelozzi VL, Pelosi P, Rocco PRM. Regular and moderate exercise before experimental sepsis reduces the risk of lung and distal organ injury. J Appl Physiol 112: 1206-1214, 2012. First published January 19, 2012; doi:10.1152/japplphysiol.01061.2011. Physical activity modulates inflammation and the immune response in both normal and pathologic conditions. We investigated whether regular and moderate exercise before the induction of experimental sepsis reduces the risk of lung and distal organ injury and improves survival. One hundred twenty-four BALB/c mice were randomly assigned to two groups: sedentary (S) and trained (T). Animals in the T group ran on a motorized treadmill at moderate intensity, 5% grade, 30 min/day, 3 times a week for 8 wk. Cardiac adaptation to exercise was evaluated using echocardiography. Systolic volume and left ventricular mass were increased in the T group compared with the S group. Both T and S groups were further randomized either to sepsis induced by cecal ligation and puncture surgery (CLP) or to sham operation (control). After 24 h, lung mechanics and histology; the degree of cell apoptosis in lung, heart, kidney, liver, and small intestine villi; interleukin (IL)-6, KC (the murine functional homolog of IL-8), IL-1 beta, and IL-10; and the number of cells in bronchoalveolar lavage fluid (BALF), peritoneal lavage fluid (PLF), and plasma were measured. In CLP animals, the T group compared with the S group showed: 1) improved survival; 2) reduced lung static elastance, alveolar collapse, collagen and elastic fiber content, number of neutrophils in BALF, PLF, and plasma, and lung and distal organ cell apoptosis; and 3) increased IL-10 in BALF and plasma, with reduced IL-6, KC, and IL-1 beta in PLF. In conclusion, regular and moderate exercise before the induction of sepsis reduced the risk of lung and distal organ damage, thus increasing survival.

Relevance: 30.00%

Abstract:

Santos C.S.A.B., Piatti R.M., Azevedo S.S., Alves C.J., Higino S.S.S., Silva M.L.C.R., Brasil A.W.L. & Gennari S.M. 2012. Seroprevalence and risk factors associated with Chlamydophila abortus infection in dairy goats in the Northeast of Brazil. Pesquisa Veterinaria Brasileira 32(11):1082-1086. Unidade Academica de Medicina Veterinaria, Centro de Saúde e Tecnologia Rural, Universidade Federal de Campina Grande, Av. Universitaria s/n, Bairro Santa Cecilia, Patos, PB 58700-970, Brazil. E-mail: sergio.azevedo@pq.cnpq.br Few data are available on the prevalence and risk factors of Chlamydophila abortus infection in goats in Brazil. A cross-sectional study was carried out to determine the flock-level prevalence of C. abortus infection in goats from the semiarid region of Paraiba State, in the Northeast region of Brazil, and to identify risk factors associated with the infection. Flocks were randomly selected and a pre-established number of female goats >= 12 mo old was sampled in each of these flocks. A total of 975 serum samples from 110 flocks were collected, and a structured questionnaire focusing on risk factors for C. abortus infection was given to each farmer at the time of blood collection. For the serological diagnosis, the complement fixation test (CFT) using the C. abortus S26/3 strain as antigen was performed. Flock-level factors for C. abortus prevalence were tested using a multivariate logistic regression model. Fifty-five flocks out of 110 presented at least one seropositive animal, an overall prevalence of 50.0% (95% CI: 40.3%-59.7%). Ninety-one of the 975 dairy goats examined were seropositive with titers >= 32, a frequency of 9.3%. Lending bucks for breeding (odds ratio = 2.35; 95% CI: 1.04-5.33) and a history of abortions (odds ratio = 3.06; 95% CI: 1.37-6.80) were associated with increased flock prevalence.
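
The flock-level risk-factor analysis described above is a multivariate logistic regression reported as odds ratios with 95% confidence intervals. The sketch below is a minimal, hedged illustration of how such a model is typically fitted; the variable names and the data are hypothetical, not the study's dataset.

```python
# Hypothetical sketch of a flock-level logistic regression reported as odds ratios.
# Variable names and data are illustrative; the study's actual dataset is not reproduced here.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# One row per flock: 1 = at least one seropositive animal, 0 = none.
flocks = pd.DataFrame({
    "seropositive": np.random.default_rng(0).integers(0, 2, size=110),
    "lends_buck":   np.random.default_rng(1).integers(0, 2, size=110),
    "abortions":    np.random.default_rng(2).integers(0, 2, size=110),
})

model = smf.logit("seropositive ~ lends_buck + abortions", data=flocks).fit(disp=0)

odds_ratios = np.exp(model.params)   # exponentiated coefficients are the odds ratios
ci = np.exp(model.conf_int())        # 95% confidence intervals on the OR scale
print(pd.concat([odds_ratios.rename("OR"),
                 ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```

With the study's real data, the exponentiated coefficients for lending bucks and for a history of abortions would correspond to the reported ORs of 2.35 and 3.06.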

Relevance: 30.00%

Abstract:

Congenital heart disease (CHD) occurs in approximately 1% of newborns. CHD arises from many distinct etiologies, ranging from genetic or genomic variation to exposure to teratogens, which elicit diverse cell and molecular responses during cardiac development. To systematically explore the relationships between CHD risk factors and responses, we compiled and integrated comprehensive datasets from studies of CHD in humans and model organisms. We examined two alternative models of potential functional relationships between genes in these datasets: direct convergence, in which CHD risk factors significantly and directly impact the same genes and molecules, and functional convergence, in which risk factors significantly impact different molecules that participate in a discrete heart development network. We observed no evidence for direct convergence. In contrast, we show that CHD risk factors functionally converge in protein networks driving the development of specific anatomical structures (e.g., outflow tract, ventricular septum, and atrial septum) that are malformed in CHD. This integrative analysis of CHD risk factors and responses suggests a complex pattern of functional interactions between genomic variation and environmental exposures that modulate critical biological systems during heart development.
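
The "direct convergence" model can be read as a gene-set overlap question: do two risk factors hit significantly more of the same genes than expected by chance? A minimal sketch of one conventional way to test that, a hypergeometric overlap test, follows; the gene counts are invented for illustration and this is not the authors' analysis pipeline.

```python
# Illustrative overlap test between two hypothetical risk-factor gene sets.
# All counts are invented; the study's actual gene lists are not reproduced here.
from scipy.stats import hypergeom

genome_size = 20000   # background: number of assayed genes (assumed)
set_a_size = 300      # genes affected by risk factor A (hypothetical)
set_b_size = 250      # genes affected by risk factor B (hypothetical)
overlap = 12          # genes affected by both (hypothetical)

# P(overlap >= observed) under random sampling of set B from the genome.
p_value = hypergeom.sf(overlap - 1, genome_size, set_a_size, set_b_size)
expected_overlap = set_a_size * set_b_size / genome_size

print(f"expected overlap ~ {expected_overlap:.1f}, observed {overlap}, p = {p_value:.3g}")
```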

Relevance: 30.00%

Abstract:

The aim of the present study is to provide an ecologically relevant assessment of the ecotoxicological effects of pesticide applications in agricultural areas in the tropics, using an integrated approach with information gathered from the soil and aquatic compartments. Carbofuran, an insecticide/nematicide widely used on sugarcane crops, was selected as a model substance. To evaluate the toxic effects of pesticide spraying on soil biota, as well as the potential indirect effects on aquatic biota resulting from surface runoff and/or leaching, field and laboratory trials (the latter using a cost-effective simulator of pesticide applications) were performed. Standard ecotoxicological tests were performed with soil (Eisenia andrei, Folsomia candida, and Enchytraeus crypticus) and aquatic (Ceriodaphnia silvestrii) organisms, using serial dilutions of soil, eluate, leachate, and runoff samples. Among the soil organisms, sensitivity was found to be E. crypticus < E. andrei < F. candida. Among the aqueous extracts, mortality of C. silvestrii was extreme in runoff samples, whereas eluates were by far the least toxic samples. Generally higher toxicity was found in the bioassays performed with samples from the field trial, indicating the need for improvements in the laboratory simulator. However, the tool developed proved valuable in evaluating the toxic effects of pesticide spraying on soils and the potential risks for aquatic compartments. Environ. Toxicol. Chem. 2012;31:437-445. (C) 2011 SETAC
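
Serial-dilution toxicity data of this kind are commonly summarized by fitting a concentration-response curve and deriving an effect concentration such as an LC50. The abstract does not report such values, so the sketch below is purely illustrative: a two-parameter log-logistic model fitted to invented dilution/mortality data.

```python
# Illustrative log-logistic concentration-response fit for serial-dilution mortality data.
# The dilutions and mortalities are invented; they are not the study's measurements.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(conc, slope, lc50):
    """Two-parameter log-logistic: mortality fraction as a function of concentration."""
    return 1.0 / (1.0 + np.exp(slope * (np.log(lc50) - np.log(conc))))

# Hypothetical dilution series (% of undiluted runoff sample) and observed mortality fractions.
dilution = np.array([3.125, 6.25, 12.5, 25.0, 50.0, 100.0])
mortality = np.array([0.05, 0.10, 0.30, 0.60, 0.90, 1.00])

params, _ = curve_fit(log_logistic, dilution, mortality, p0=[1.0, 20.0])
slope, lc50 = params
print(f"fitted slope = {slope:.2f}, LC50 ~ {lc50:.1f}% of the undiluted sample")
```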

Relevance: 30.00%

Abstract:

OBJECTIVE: To analyze the nutritional status of pediatric patients after orthotopic liver transplantation and its relationship with short-term clinical outcome. METHOD: Anthropometric evaluations were performed on 60 children and adolescents after orthotopic liver transplantation, during the first 24 hours in a tertiary pediatric intensive care unit. Nutritional status was determined from the Z score for the following indices: weight/age, height/age or length/age, weight/height or weight/length, body mass index/age, arm circumference/age and triceps skinfold/age. The severity of liver disease was evaluated using whichever of two models was appropriate for the patient's age: the Pediatric End-Stage Liver Disease (PELD) score or the Model for End-Stage Liver Disease (MELD). RESULTS: We found 50.0% undernutrition by height/age; 27.3% by weight/age; 11.1% by weight/height or weight/length; 10.0% by body mass index/age; 61.6% by arm circumference/age and 51.0% by triceps skinfold/age. There was no correlation between nutritional status and the PELD score or mortality. We found a negative correlation between arm circumference/age and length of hospitalization. CONCLUSION: Children with chronic liver diseases experience a significant degree of undernutrition, which makes nutritional support an important aspect of therapy. Despite the difficulties in assessment, anthropometric evaluation of the upper limbs is useful to evaluate the nutritional status of children before or after liver transplantation.
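
Anthropometric indices such as weight-for-age are classified via Z scores against a growth reference; in practice this relies on the WHO sex- and age-specific LMS tables. The sketch below uses a simplified normal approximation with hypothetical reference values, purely to illustrate the classification step rather than the exact WHO procedure.

```python
# Simplified Z-score classification for an anthropometric index.
# The reference median and SD are hypothetical stand-ins for the WHO
# sex/age-specific LMS tables actually used in this kind of study.
def z_score(observed, ref_median, ref_sd):
    """Normal-approximation Z score of an observed measurement against a reference."""
    return (observed - ref_median) / ref_sd

def classify_weight_for_age(z):
    """Common cut-offs: Z < -2 undernutrition, Z > +2 overweight (simplified)."""
    if z < -2:
        return "undernourished"
    if z > 2:
        return "overweight"
    return "adequate"

z = z_score(observed=13.5, ref_median=18.0, ref_sd=2.0)  # hypothetical child, weight in kg
print(f"weight-for-age Z = {z:.2f} -> {classify_weight_for_age(z)}")
```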

Relevance: 30.00%

Abstract:

In this paper, the effects of uncertainty and expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-Based Design Optimization (RBDO) has emerged as an alternative that properly models the safety-under-uncertainty part of the problem. With RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure. However, results are dependent on the failure probabilities used as constraints in the analysis. Risk Optimization (RO) increases the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure, as well as the costs associated with construction, operation and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems. The broader RO solution is found first, and the optimum results are used as constraints in DDO and RBDO. Results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations which respect these design constraints and reduce manufacturing costs but increase total expected costs (including expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, this solution also reduces manufacturing costs but increases total expected costs. This happens when the costs associated with different failure modes are distinct. Hence, a general equivalence between the formulations cannot be established. Optimum structural design considering expected costs of failure cannot be controlled solely by safety factors or by failure probability constraints, but depends on the actual structural configuration. (c) 2011 Elsevier Ltd. All rights reserved.
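
The RO objective discussed above can be made concrete with the usual total expected cost expression, C_total = C_construction + C_operation + P_f * C_failure, minimized over the design variables. The toy sketch below shows how an interior optimum between economy and safety emerges; the cost figures and the failure-probability model are invented for illustration and are not taken from the paper.

```python
# Toy risk-optimization example: total expected cost as a function of a single safety factor.
# Cost figures and the failure-probability model are invented for illustration only.
import numpy as np

safety_factor = np.linspace(1.0, 3.0, 201)

construction_cost = 100.0 * safety_factor                      # building stronger costs more
failure_prob = 1e-2 * np.exp(-3.0 * (safety_factor - 1.0))     # stronger design -> rarer failure
failure_cost = 1e5                                             # monetary consequence of failure

total_expected_cost = construction_cost + failure_prob * failure_cost

best = np.argmin(total_expected_cost)
print(f"optimum safety factor ~ {safety_factor[best]:.2f}, "
      f"Pf ~ {failure_prob[best]:.2e}, expected cost ~ {total_expected_cost[best]:.0f}")
```

Under these toy assumptions the minimum sits at an intermediate safety factor: pushing safety further keeps reducing the expected failure cost but is no longer paid back by the extra construction cost, which is the compromise RO formalizes.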

Relevance: 30.00%

Abstract:

We investigated the association between diet and head and neck cancer (HNC) risk using data from the International Head and Neck Cancer Epidemiology (INHANCE) consortium. The INHANCE pooled data included 22 case-control studies with 14,520 cases and 22,737 controls. Center-specific quartiles among the controls were used for food groups, and frequencies per week were used for single food items. A dietary pattern score combining high fruit and vegetable intake and low red meat intake was created. Odds ratios (OR) and 95% confidence intervals (CI) for the associations of dietary items with HNC risk were estimated with a two-stage random-effects logistic regression model. An inverse association was observed for higher-frequency intake of fruit (4th vs. 1st quartile OR = 0.52, 95% CI = 0.43-0.62, p-trend < 0.01) and vegetables (OR = 0.66, 95% CI = 0.49-0.90, p-trend = 0.01). Intake of red meat (OR = 1.40, 95% CI = 1.13-1.74, p-trend = 0.13) and processed meat (OR = 1.37, 95% CI = 1.14-1.65, p-trend < 0.01) was positively associated with HNC risk. Higher dietary pattern scores, reflecting high fruit/vegetable and low red meat intake, were associated with reduced HNC risk (per score increment OR = 0.90, 95% CI = 0.84-0.97).
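
The "per score increment OR = 0.90" reported above compounds multiplicatively under the fitted logistic model: a diet scoring k points higher is associated with odds lower by a factor of 0.90^k. The one-line illustration below assumes a 3-point score difference, which is not a value stated in the abstract.

```python
# The per-increment odds ratio compounds multiplicatively in a logistic model.
# The 3-point difference is an assumed example, not a value from the paper.
or_per_increment = 0.90
k = 3  # hypothetical difference in dietary pattern score
print(f"OR for a {k}-point higher score ~ {or_per_increment ** k:.2f}")  # ~0.73
```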

Relevance: 30.00%

Abstract:

To assess the contribution of different parenteral routes to the risk of exposure to hepatitis C virus (HCV), samples from nine surveys or cross-sectional studies conducted in two Brazilian inland regions were pooled, including a total of 3,910 subjects. Heterogeneity among the study results for different risk factors was tested and the results were shown to be homogeneous. Anti-HCV antibodies were observed in 241 individuals, of whom 146 (3.7%, 95% CI = 3.2-4.4) had HCV exposure confirmed by immunoblot analysis or PCR test. After adjustment for relevant variables, a correlation between confirmed HCV exposure and injection drug use, tattooing, and advanced age was observed. In a second logistic model that included exposures not investigated in all nine studies, a smaller sample was analyzed, revealing independent associations of HCV with a past history of surgery and with males who have sex with males, in addition to repeated injection drug use. Overall, these analyses corroborate the finding that injection drug use is the main risk factor for HCV exposure and spread, in addition to other parenteral routes. J. Med. Virol. 84:756-762, 2012. (C) 2012 Wiley Periodicals, Inc.
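
The pooling step above rests on first checking that the risk-factor estimates are homogeneous across the nine surveys. One conventional check is Cochran's Q on the study-level log odds ratios; the sketch below implements that test on invented per-study estimates, since the actual per-survey ORs are not given in the abstract.

```python
# Cochran's Q homogeneity test across study-level log odds ratios.
# The per-study ORs and standard errors are invented for illustration.
import numpy as np
from scipy.stats import chi2

log_or = np.log(np.array([2.1, 1.8, 2.6, 2.0, 1.7]))  # hypothetical per-study odds ratios
se = np.array([0.30, 0.35, 0.40, 0.25, 0.45])          # hypothetical standard errors

weights = 1.0 / se**2
pooled = np.sum(weights * log_or) / np.sum(weights)    # fixed-effect pooled log OR
q_stat = np.sum(weights * (log_or - pooled) ** 2)
p_value = chi2.sf(q_stat, df=len(log_or) - 1)

print(f"pooled OR ~ {np.exp(pooled):.2f}, Q = {q_stat:.2f}, p = {p_value:.2f}")
# A non-significant p is consistent with pooling the studies, as was done above.
```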

Relevance: 30.00%

Abstract:

Purpose: Refractory frontal lobe epilepsy (FLE) remains one of the most challenging surgically remediable epilepsy syndromes. Nevertheless, the definition of independent predictors and predictive models of postsurgical seizure outcome remains poorly explored in FLE. Methods: We retrospectively analyzed data from 70 consecutive patients with refractory FLE who underwent surgical treatment at our center from July 1994 to December 2006. Univariate results were entered into logistic regression models and Cox proportional hazards regression to identify independent risk factors for poor surgical results and to construct predictive models for surgical outcome in FLE. Results: Of the 70 patients who underwent surgery, 45 (64%) had a favorable outcome and 37 (47%) became seizure free. Independent risk factors for poor surgical outcome, expressed as hazard ratios (HR), were epilepsy duration (HR = 4.2; 95% CI = 1.5-11.7; p = 0.006), ictal EEG recruiting rhythm (HR = 2.9; 95% CI = 1.1-7.7; p = 0.033), normal MRI (HR = 4.8; 95% CI = 1.4-16.6; p = 0.012), and MRI lesion involving eloquent cortex (HR = 3.8; 95% CI = 1.2-12.0; p = 0.021). Based on these variables and using logistic regression, we constructed a model that correctly predicted long-term surgical outcome in up to 80% of patients. Conclusion: Among the independent risk factors for postsurgical seizure outcome, epilepsy duration is a potentially modifiable factor that could impact surgical outcome in FLE. Early diagnosis, presence of an MRI lesion not involving eloquent cortex, and ictal EEG without a recruiting rhythm independently predicted favorable outcome in this series. (C) 2011 Elsevier B.V. All rights reserved.
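
The hazard ratios above come from a Cox proportional hazards model of time to seizure recurrence. As a minimal sketch of the mechanics only, the snippet below fits such a model with the lifelines package on its bundled example dataset; this is not the authors' data or code, just an illustration of how covariate hazard ratios are obtained.

```python
# Hedged sketch of a Cox proportional-hazards fit with the lifelines package,
# run on lifelines' bundled example dataset (NOT the study's data) to show the mechanics.
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()                  # example time-to-event data shipped with lifelines
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")
cph.print_summary()                # hazard ratios are exp(coef), analogous to the HRs above
```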

Relevance: 30.00%

Abstract:

This study examined the extent to which cognitive and affective/emotional variables could distinguish caregivers accused of committing physical abuse (G1) from those without physical abuse records (G2). The Child Abuse Potential Inventory (CAP), an instrument designed to assess psychological risk factors in caregivers, was used. A questionnaire on socio-demographic characteristics and another on economic classification were also employed to match the groups. G1 presented a greater potential risk than G2, with higher levels of Distress, Rigidity, Problems with the Child and with Themselves, and Problems with Others, and a lower level of Ego Strength. These variables contribute to the composition of physical abuse risk since, in agreement with the Social Information Processing Model, they relate to basic cognitive and affective processes that underlie the perceptions and evaluations/interpretations associated with abusive parental behavior.

Relevance: 30.00%

Abstract:

Abstract Background Patients undergoing haemodialysis are considered at high risk of acquiring hepatitis B virus (HBV) infection. Since few data are reported from Brazil, our aim was to assess the frequency of and risk factors for HBV infection in haemodialysis patients from 22 dialysis centres in Santa Catarina State, southern Brazil. Methods This study included 813 patients, 149 haemodialysis workers and 772 healthy controls matched by sex and age. Serum samples were assayed for HBV markers, and viraemia was detected by nested PCR. HBV was genotyped by partial S gene sequencing. Univariate and multivariate statistical analyses with stepwise logistic regression were carried out to analyse the relationship between HBV infection and the characteristics of patients and their dialysis units. Results The frequency of HBV infection was 10.0%, 2.7% and 2.7% among patients, haemodialysis workers and controls, respectively. Among patients, the most frequent HBV genotypes were A (30.6%), D (57.1%) and F (12.2%). Univariate analysis showed an association between HBV infection and total time on haemodialysis, type of dialysis equipment, hygiene and sterilization of equipment, number of times the dialysis lines and filters were reused, number of patients per care worker and current HCV infection. The logistic regression model showed that total time on haemodialysis, number of times the dialysis lines and filters were reused, and number of patients per worker were significantly related to HBV infection. Conclusions The frequency of HBV infection among haemodialysis patients in Santa Catarina State is very high. The most frequent HBV genotypes were A, D and F. The risk for a patient to become HBV positive increases 1.47 times with each month of haemodialysis; 1.96 times if the dialysis unit reuses the lines and filters ≥ 10 times compared with units which reuse them < 10 times; and 3.42 times if the number of patients per worker is more than five. Sequence similarity among the HBV S gene isolates from different patients pointed to nosocomial transmission.

Relevance: 30.00%

Abstract:

Abstract Background Chronic hepatitis C liver disease is a major cause of liver transplantation in developed countries. This article reports the first nationwide population-based survey conducted to estimate the seroprevalence of HCV antibodies and associated risk factors in the urban population of Brazil. Methods The cross-sectional study was conducted in all Brazilian macro-regions from 2005 to 2009, as a stratified multistage cluster sample of 19,503 inhabitants aged between 10 and 69 years, representing individuals living in all 26 state capitals and the Federal District. Hepatitis C antibodies were detected by a third-generation enzyme immunoassay. Seropositive individuals were retested by polymerase chain reaction and genotyped. Adjusted prevalence was estimated by macro-region. Potential risk factors associated with HCV infection were assessed by calculating crude and adjusted odds ratios, 95% confidence intervals (95% CI) and p values. Population attributable risk was estimated for multiple factors using a case-control approach. Results The overall weighted prevalence of hepatitis C antibodies was 1.38% (95% CI: 1.12%-1.64%). Prevalence of infection increased in older groups but was similar for both sexes. The multivariate model showed the following to be predictors of HCV infection: age, injected drug use (OR = 6.65), sniffed drug use (OR = 2.59), hospitalization (OR = 1.90), groups socially deprived by the lack of sewage disposal (OR = 2.53), and injection with a glass syringe (OR = 1.52, with a borderline p value). Genotypes 1 (subtypes 1a, 1b), 2b and 3a were identified. The estimated population attributable risk for the ensemble of risk factors was 40%. Approximately 1.3 million individuals would be expected to be anti-HCV positive in the country. Conclusions The large estimated absolute number of infected individuals reveals the burden of the disease in the near future, giving rise to costs for the health care system and society at large. The known risk factors explain less than 50% of the infected cases, limiting prevention strategies. Our findings regarding risk behaviors associated with HCV infection show that there is still room for improving strategies to reduce transmission among drug users and nosocomial infection, as well as a need for specific prevention and control strategies targeting individuals living in poverty.
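
The population attributable risk quoted above is commonly obtained, in a case-control setting, from the exposure prevalence among cases and the adjusted odds ratio (Miettinen's formula). The sketch below illustrates the formula for a single factor; the exposure prevalence is invented, since the abstract reports only the combined attributable risk of about 40%.

```python
# Population attributable risk (fraction) for a single exposure in a case-control setting,
# using Miettinen's formula: PAR = p_cases_exposed * (OR - 1) / OR.
# The exposure prevalence among cases is invented; only the adjusted OR comes from the abstract.
def population_attributable_risk(p_cases_exposed, odds_ratio):
    return p_cases_exposed * (odds_ratio - 1.0) / odds_ratio

# Hypothetical example: assume 10% of cases report injected drug use, adjusted OR = 6.65.
par_idu = population_attributable_risk(p_cases_exposed=0.10, odds_ratio=6.65)
print(f"PAR for injected drug use ~ {par_idu:.1%}")  # ~8.5% under these assumed figures
```

Summing such single-factor contributions (with adjustment for overlap between exposures) is what yields a combined figure like the 40% reported for the ensemble of risk factors.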

Relevance: 30.00%

Abstract:

Background UCP2 (uncoupling protein 2) plays an important role in cardiovascular diseases, and recent studies have suggested that the A55V polymorphism can cause UCP2 dysfunction. The main aim was to investigate the association of the A55V polymorphism with cardiovascular events in a group of 611 patients enrolled in the Medical, Angioplasty or Surgery Study II (MASS II), a randomized trial comparing treatments for patients with coronary artery disease and preserved left ventricular function. Methods The participants of MASS II were genotyped for the A55V polymorphism using an allele-specific PCR assay. Survival curves were calculated with the Kaplan-Meier method and evaluated with the log-rank statistic. The relationship between baseline variables and the composite end-point of cardiac death, acute myocardial infarction (AMI), refractory angina requiring revascularization, and cerebrovascular accident was assessed using a Cox proportional hazards survival model. Results There were no significant differences in baseline variables according to genotype. After 2 years of follow-up, dysglycemic patients harboring the VV genotype had a higher occurrence of AMI (p=0.026), death plus AMI (p=0.033), new revascularization intervention (p=0.009) and combined events (p=0.037) compared with patients carrying the other genotypes. This association was not evident in normoglycemic patients. Conclusions These findings support the hypothesis that the A55V polymorphism is associated with UCP2 functional alterations that increase the risk of cardiovascular events in patients with previous coronary artery disease and dysglycemia.
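
Event-free survival by genotype, as analyzed above, is the classic Kaplan-Meier plus log-rank setup. The sketch below shows that setup with the lifelines package on invented follow-up data; it is not the MASS II dataset and the group sizes are far smaller than in the trial.

```python
# Kaplan-Meier estimate and log-rank comparison between two genotype groups.
# Follow-up times and event indicators are invented; this is not the MASS II dataset.
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

months_vv = [3, 8, 12, 15, 20, 24, 24, 24]       # VV carriers (hypothetical)
event_vv = [1, 1, 1, 0, 1, 0, 0, 0]              # 1 = cardiovascular event, 0 = censored
months_other = [10, 14, 18, 24, 24, 24, 24, 24]  # AA/AV carriers (hypothetical)
event_other = [1, 0, 0, 0, 0, 0, 1, 0]

kmf = KaplanMeierFitter()
kmf.fit(months_vv, event_observed=event_vv, label="VV")
print(kmf.survival_function_.tail())             # event-free survival over follow-up

result = logrank_test(months_vv, months_other,
                      event_observed_A=event_vv, event_observed_B=event_other)
print(f"log-rank p = {result.p_value:.3f}")
```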

Relevance: 30.00%

Abstract:

Background: Childhood obesity is a public health problem worldwide. Visceral obesity, particularly associated with cardio-metabolic risk, has been assessed by body mass index (BMI) and waist circumference, but both methods use sex- and age-specific percentile tables and are influenced by sexual maturity. The waist-to-height ratio (WHtR) is easier to obtain, does not involve tables and can be used to diagnose visceral obesity, even in normal-weight individuals. This study aims to compare the WHtR to the 2007 World Health Organization (WHO) reference for BMI in screening for the presence of cardio-metabolic and inflammatory risk factors in 6-10-year-old children. Methods: A cross-sectional study was undertaken with 175 subjects selected from the Reference Center for the Treatment of Children and Adolescents in Campos, Rio de Janeiro, Brazil. The subjects were classified according to the 2007 WHO standard as normal-weight (BMI z score > −1 and < 1) or overweight/obese (BMI z score ≥ 1). Systolic blood pressure (SBP), diastolic blood pressure (DBP), fasting glycemia, low-density lipoprotein (LDL), high-density lipoprotein (HDL), triglycerides (TG), Homeostatic Model Assessment - Insulin Resistance (HOMA-IR), leukocyte count and ultrasensitive C-reactive protein (CRP) were also analyzed. Results: There were significant correlations between WHtR and BMI z score (r = 0.88, p < 0.0001), SBP (r = 0.51, p < 0.0001), DBP (r = 0.49, p < 0.0001), LDL (r = 0.25, p < 0.0008), HDL (r = −0.28, p < 0.0002), TG (r = 0.26, p < 0.0006), HOMA-IR (r = 0.83, p < 0.0001) and CRP (r = 0.51, p < 0.0001). The WHtR and BMI areas under the curve were similar for all the cardio-metabolic parameters. A WHtR cut-off value of > 0.47 was sensitive for screening for insulin resistance and any one of the cardio-metabolic parameters. Conclusions: The WHtR was as sensitive as the 2007 WHO BMI reference in screening for metabolic risk factors in 6-10-year-old children. The public health message "keep your waist to less than half your height" can be effective in reducing cardio-metabolic risk, because most of these risk factors are already present at a WHtR cut point of ≥ 0.5. However, as this is the first study to correlate the WHtR with inflammatory markers, we recommend further exploration of the use of the WHtR in this age group and in other population-based samples.
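
The comparison of WHtR and BMI "areas under the curve" and the 0.47 cut-off correspond to a standard ROC analysis. The sketch below shows that analysis with scikit-learn on simulated data; the labels, WHtR values and the resulting AUC and cut-off are illustrative, not the study's measurements.

```python
# ROC analysis sketch: AUC for WHtR as a screen for insulin resistance, plus a Youden-index cut-off.
# All values are simulated for illustration; they are not the study's measurements.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(42)
insulin_resistant = rng.integers(0, 2, size=200)                        # hypothetical labels
whtr = 0.45 + 0.05 * insulin_resistant + rng.normal(0, 0.04, size=200)  # hypothetical WHtR values

auc = roc_auc_score(insulin_resistant, whtr)
fpr, tpr, thresholds = roc_curve(insulin_resistant, whtr)
best_cut = thresholds[np.argmax(tpr - fpr)]   # Youden index: maximizes sensitivity + specificity - 1

print(f"AUC ~ {auc:.2f}, suggested WHtR cut-off ~ {best_cut:.2f}")
```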

Relevance: 30.00%

Abstract:

Abstract Background Allogeneic red blood cell (RBC) transfusion has been proposed as a negative indicator of quality in cardiac surgery. Hospital length of stay (LOS) may be a surrogate of poor outcome in transfused patients. Methods Data from the 502 patients included in the Transfusion Requirements After Cardiac Surgery (TRACS) study were analyzed to assess the relationship between RBC transfusion and hospital LOS in patients undergoing cardiac surgery. Results According to the status of RBC transfusion, patients were categorized into the following three groups: 1) 199 patients (40%) who did not receive RBC, 2) 241 patients (48%) who received 3 RBC units or fewer (low transfusion requirement group), and 3) 62 patients (12%) who received more than 3 RBC units (high transfusion requirement group). In a multivariable Cox proportional hazards model, the following factors were predictive of a prolonged hospital length of stay: age over 65 years, EuroSCORE, valvular surgery, combined procedure, LVEF lower than 40% and RBC transfusion of > 3 units. Conclusion RBC transfusion is an independent risk factor for increased LOS in patients undergoing cardiac surgery. This finding highlights the adequacy of a restrictive transfusion strategy in patients undergoing cardiac surgery. Trial registration: ClinicalTrials.gov identifier NCT01021631.