14 results for Statistical decision.
Abstract:
The human leukocyte antigen (HLA) DRB1*1501 has been consistently associated with multiple sclerosis (MS) in nearly all populations tested. This points to specific antigen presentation as the pathogenic mechanism, although this does not fully explain the disease association. The identification of expression quantitative trait loci (eQTL) for genes in the HLA locus raises the question of what role gene expression plays in MS susceptibility. We analyzed the eQTLs in the HLA region with respect to MS-associated HLA variants obtained from genome-wide association studies (GWAS). We found that the A allele of rs3135388, the tag SNP for DRB1*1501, correlated with high expression of the DRB1, DRB5 and DQB1 genes in a Caucasian population. In quantitative terms, carriers of the MS-risk AA genotype of rs3135388 showed 15.7-, 5.2- and 8.3-fold higher expression of DQB1, DRB5 and DRB1, respectively, than non-risk GG carriers. Haplotype analysis of expression-associated variants in a Spanish MS cohort revealed that high expression of DRB1 and DQB1 alone did not contribute to the disease. However, in Caucasian, Asian and African American populations, the DRB1*1501 allele was always highly expressed. In other immune-related diseases such as type 1 diabetes, inflammatory bowel disease, ulcerative colitis, asthma and IgA deficiency, the best GWAS-associated HLA SNPs were also eQTLs for different HLA class II genes. Our data suggest that DR/DQ expression levels, together with specific structural properties of alleles, appear to be the causal factor in MS and other immunopathologies, rather than specific antigen presentation alone.
Abstract:
BACKGROUND AND OBJECTIVES The prevalence of hyponutrition among hospitalized patients is very high, and it has been shown to be an important prognostic factor. Most admitted patients depend on hospital food to cover their nutritional demands, so it is important to assess the factors influencing their intake, which may be modified in order to improve it and prevent the consequences of inadequate feeding. Previous work has shown that temperature is one of the worst-rated characteristics of hospital dishes. The aim of this study was to assess the influence of temperature on patients' satisfaction and the amount eaten, depending on whether or not the food was served from isothermal trolleys that keep food at the proper temperature. MATERIAL AND METHODS We administered satisfaction surveys to hospitalized patients on regular diets, served with or without isothermal trolleys. The following data were gathered: age, gender, weight, number of visits, mobility, autonomy, amount of orally taken medication, intake of out-of-hospital foods, rating of food temperature, presentation and smokiness, amount of food eaten, and reasons for not eating all the content of the tray. RESULTS Of the 363 surveys, 134 (37.96%) were administered to patients served with isothermal trolleys and 229 (62.04%) to patients without them. Sixty percent of the patients reported having eaten less than their normal amount within the last week, the most frequent reason being decreased appetite. At lunch and dinner, 69.3% and 67.7% of patients, respectively, ate half or less of the tray content, the main reasons being lack of appetite (42% at lunch and 40% at dinner), disliking the food (24.3% and 26.2%) or its taste (15.3% and 16.8%). Other less common reasons were the odor, the amount of food, nausea or vomiting, fatigue, and lack of autonomy. There were no significant differences in the amount eaten by gender, weight, number of visits, amount of medication, or level of physical activity. The food temperature was rated as adequate by 62% of the patients, the presentation by 95%, and the smokiness by 85%. When comparing patients served with or without isothermal trolleys, there were no differences in the baseline characteristics analyzed that might have influenced the amount eaten. Ninety percent of the patients served with isothermal trolleys rated the food temperature as good, compared with 57.2% of the patients served with conventional trolleys, a statistically significant difference (P < 0.001). In addition, there were differences in the amount of food eaten between patients with and without isothermal trolleys: 41% and 27.7%, respectively, ate all the tray content, also a statistically significant difference (P = 0.007). There were no differences in the ratings of smokiness or presentation. CONCLUSIONS Most of the patients (60%) had decreased appetite during hospital admission. The percentage of hospitalized patients rating the food temperature as good is higher among patients served with isothermal trolleys. The amount of food eaten by patients served with isothermal trolleys is significantly higher than in those without them.
Abstract:
Drug addiction is associated with impaired judgment in unstructured situations, in which success depends on self-regulation of behavior according to internal goals (adaptive decision-making). However, most executive measures assess decision-making in structured scenarios, in which success is determined by external criteria inherent to the situation (veridical decision-making). The aim of this study was to examine the performance of Substance Abusers (SA, n = 97) and Healthy Comparison participants (HC, n = 81) on two behavioral tasks that mimic the uncertainty inherent in real-life decision-making: the Cognitive Bias Task (CB) and the Iowa Gambling Task (IGT) (administered only to SA). A related goal was to study the interdependence between performance on the two tasks. We conducted univariate analyses of variance (ANOVAs) to contrast the decision-making performance of the two groups, and used correlation analyses to study the relationship between the tasks. SA showed a markedly context-independent decision-making strategy on the CB's adaptive condition, but no differences were found on the veridical conditions in a subsample of SA (n = 34) and HC (n = 22). A high percentage of SA (75%) also showed impaired performance on the IGT. The two tasks were correlated only when impaired participants were excluded. The results indicate that SA show abnormal decision-making performance in unstructured situations, but not in veridical situations.
Abstract:
OBJECTIVE. The main goal of this paper is to obtain a classification model based on feed-forward multilayer perceptrons in order to improve postpartum depression prediction during the 32 weeks after childbirth with high sensitivity and specificity, and to develop a tool to be integrated into a decision support system for clinicians. MATERIALS AND METHODS. Multilayer perceptrons were trained on data from 1397 women who had just given birth at seven Spanish general hospitals, including clinical, environmental and genetic variables. A prospective cohort study was conducted, with assessments just after delivery and at 8 and 32 weeks after delivery. The models were evaluated with the geometric mean of accuracies using a hold-out strategy. RESULTS. Multilayer perceptrons showed good performance (high sensitivity and specificity) as predictive models for postpartum depression. CONCLUSIONS. The use of these models in a decision support system can be clinically evaluated in future work. Analysis of the models by pruning yields a qualitative interpretation of the influence of each variable, which is of interest for clinical protocols.
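To make the evaluation strategy concrete, below is a minimal sketch (in Python, with scikit-learn) of a hold-out evaluation of a multilayer perceptron scored by the geometric mean of accuracies, i.e. the square root of sensitivity times specificity. The synthetic data, class imbalance, network size and split ratio are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch: hold-out evaluation of a feed-forward MLP scored by the
# geometric mean of sensitivity and specificity. All settings are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix

# Synthetic stand-in for the cohort: imbalanced binary outcome.
X, y = make_classification(n_samples=1397, n_features=20,
                           weights=[0.85, 0.15], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0)
clf.fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
g_mean = np.sqrt(sensitivity * specificity)  # geometric mean of accuracies
print(f"sensitivity={sensitivity:.2f}  specificity={specificity:.2f}  g-mean={g_mean:.2f}")
```

The geometric mean is a sensible choice here because postpartum depression is a minority class: unlike plain accuracy, it penalizes a model that achieves high specificity by rarely predicting the positive class.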
Abstract:
This study evaluated the frequency of cognitive impairment in patients with fibromyalgia syndrome (FMS) using the Mini-Mental State Examination (MMSE).
METHODS
We analyzed baseline data from all 46 patients with FMS and 92 age- and sex-matched controls per diagnosis of neuropathic pain (NeP) or mixed pain (MP), selected from a larger prospective study.
RESULTS
Patients with FMS had a slightly but statistically significantly lower adjusted MMSE score (26.9; 95% CI 26.7-27.1) than patients with either NeP (27.3; 95% CI 27.2-27.4) or MP (27.3; 95% CI 27.2-27.5). The percentage of patients with cognitive impairment (adjusted MMSE
Abstract:
BACKGROUND Only multifaceted hospital-wide interventions have been successful in achieving sustained improvements in hand hygiene (HH) compliance. METHODOLOGY/PRINCIPAL FINDINGS Pre-post intervention study of HH performance at baseline (October 2007-December 2009) and during an intervention that included two phases. Phase 1 (2010) applied the multimodal WHO approach. Phase 2 (2011) added Continuous Quality Improvement (CQI) tools and was based on: a) increased placement of alcohol hand rub (AHR) dispensers (from 0.57 dispensers/bed to 1.56); b) increased frequency of audits (three days every three weeks: the "3/3 strategy"); c) implementation of a standardized register form for HH corrective actions; d) Statistical Process Control (SPC) as the time-series analysis methodology, through appropriate control charts. During the intervention period we performed 819 scheduled direct-observation audits, which provided data from 11,714 HH opportunities. The most remarkable findings were: a) significant improvement in HH compliance with respect to baseline (25% mean increase); b) a sustained high level (82%) of HH compliance during the intervention; c) a significant increase in AHR consumption over time; d) a significant decrease in the rate of healthcare-acquired MRSA; e) small but significant improvements in HH compliance when comparing phase 2 to phase 1 [79.5% (95% CI: 78.2-80.7) vs 84.6% (95% CI: 83.8-85.4), p<0.05]; f) successful use of control charts to identify significant negative and positive deviations (special causes) in the HH compliance process over time ("positive": 90.1%, the highest HH compliance, coinciding with World Hand Hygiene Day; "negative": 73.7%, the lowest HH compliance, coinciding with a statutory lay-off proceeding). CONCLUSIONS/SIGNIFICANCE CQI tools may be a key addition to the WHO strategy for maintaining good HH performance over time. In addition, SPC has been shown to be a powerful methodology for detecting special causes in HH performance (positive and negative) and for helping to establish adequate feedback to healthcare workers.
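For readers unfamiliar with SPC, the sketch below shows one common control chart for proportion data, a p-chart, applied to hand hygiene compliance audits. The audit counts are invented, and the choice of a p-chart is an assumption: the abstract states only that appropriate control charts were used.

```python
# Minimal p-chart sketch: flag audits whose HH compliance falls outside
# 3-sigma control limits (special causes). Audit data are invented.
import numpy as np

compliant = np.array([48, 52, 41, 55, 38, 60, 44, 50])      # compliant actions per audit
opportunities = np.array([60, 63, 58, 62, 59, 65, 61, 60])  # HH opportunities per audit

p = compliant / opportunities
p_bar = compliant.sum() / opportunities.sum()        # centre line (overall proportion)
sigma = np.sqrt(p_bar * (1 - p_bar) / opportunities) # per-audit standard error
ucl = np.minimum(p_bar + 3 * sigma, 1.0)             # upper control limit
lcl = np.maximum(p_bar - 3 * sigma, 0.0)             # lower control limit

for i, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1):
    status = "special cause" if (pi < lo or pi > hi) else "in control"
    print(f"audit {i}: p={pi:.2f}  limits=({lo:.2f}, {hi:.2f})  {status}")
```

Points outside the limits correspond to the "special causes" the study reports, such as the compliance peak around World Hand Hygiene Day and the dip during the lay-off proceeding.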
Abstract:
BACKGROUND Waist circumference (WC) is a simple and reliable measure of fat distribution that may add to the prediction of type 2 diabetes (T2D), but previous studies have been too small to reliably quantify the relative and absolute risk of future diabetes by WC at different levels of body mass index (BMI). METHODS AND FINDINGS The prospective InterAct case-cohort study was conducted in 26 centres in eight European countries and consists of 12,403 incident T2D cases and a stratified subcohort of 16,154 individuals from a total cohort of 340,234 participants with 3.99 million person-years of follow-up. We used Prentice-weighted Cox regression and random-effects meta-analysis methods to estimate hazard ratios for T2D. Kaplan-Meier estimates of the cumulative incidence of T2D were calculated. BMI and WC were each independently associated with T2D, with WC being a stronger risk factor in women than in men. Risk increased across groups defined by BMI and WC; compared to low-normal-weight individuals (BMI 18.5-22.4 kg/m²) with a low WC (<94/80 cm in men/women), the hazard ratio of T2D was 22.0 (95% confidence interval 14.3; 33.8) in men and 31.8 (25.2; 40.2) in women with grade 2 obesity (BMI ≥ 35 kg/m²) and a high WC (>102/88 cm). Among the large group of overweight individuals, WC measurement was highly informative and facilitated the identification of a subgroup of overweight people with high WC whose 10-y T2D cumulative incidence (men, 70 per 1,000 person-years; women, 44 per 1,000 person-years) was comparable to that of the obese group (50-103 per 1,000 person-years in men and 28-74 per 1,000 person-years in women). CONCLUSIONS WC is independently and strongly associated with T2D, particularly in women, and should be more widely measured for risk stratification. If targeted measurement is necessary for reasons of resource scarcity, measuring WC in overweight individuals may be an effective strategy, since it identifies a high-risk subgroup of individuals who could benefit from individualised preventive action.
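As a rough, hedged illustration of the modelling approach, the sketch below fits a weighted Cox model with the Python lifelines package on toy data. The Prentice weights are assumed to be precomputed elsewhere (lifelines consumes weights via `weights_col` but does not derive case-cohort weights itself), and robust standard errors are requested, as recommended for weighted fits.

```python
# Toy weighted Cox regression in the spirit of a case-cohort analysis.
# Data and weights are invented; only the API usage is the point here.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "followup_years": [4.2, 9.9, 7.5, 3.1, 8.8, 6.0, 5.5, 7.0],
    "t2d":            [1,   0,   1,   1,   0,   0,   1,   0],   # incident T2D
    "bmi":            [31.0, 22.5, 27.8, 36.2, 24.1, 29.3, 33.4, 23.7],
    "waist_cm":       [105,  78,   90,   118,  82,   95,   110,  80],
    "prentice_w":     [1.0,  2.5,  1.0,  1.0,  2.5,  2.5,  1.0,  2.5],  # assumed precomputed
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="t2d",
        weights_col="prentice_w", robust=True)  # robust SEs for weighted data
cph.print_summary()  # hazard ratios for BMI and waist circumference
```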
Abstract:
BACKGROUND Identifying individuals at high risk of excess weight gain may help target prevention efforts at those at risk of the various metabolic diseases associated with weight gain. Our aim was to develop a risk score to identify these individuals and validate it in an external population. METHODS We used lifestyle and nutritional data from 53,758 individuals followed for a median of 5.4 years from six centers of the European Prospective Investigation into Cancer and Nutrition (EPIC) to develop a risk score to predict substantial weight gain (SWG) over the next 5 years (derivation sample). Assuming linear weight gain, SWG was defined as gaining ≥ 10% of baseline weight during follow-up. Proportional hazards models were used to identify significant predictors of SWG separately by EPIC center. Regression coefficients of the predictors were pooled using random-effects meta-analysis. The pooled coefficients were used to assign weights to each predictor, and the risk score was calculated as a linear combination of the predictors. External validity of the score was evaluated in nine other centers of the EPIC study (validation sample). RESULTS Our final model included age, sex, baseline weight, level of education, baseline smoking, sports activity, alcohol use, and intake of six food groups. The model's discriminatory ability, measured by the area under a receiver operating characteristic curve, was 0.64 (95% CI = 0.63-0.65) in the derivation sample and 0.57 (95% CI = 0.56-0.58) in the validation sample, with variation between centers. Positive and negative predictive values for the optimal cut-off value of ≥ 200 points were 9% and 96%, respectively. CONCLUSION The present risk score confidently excluded a large proportion of individuals from being at any appreciable risk of developing SWG within the next 5 years. Future studies, however, may attempt to further refine the positive prediction of the score.
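The sketch below illustrates the general construction of such a score: pooled regression coefficients serve as weights, and the score is their linear combination over a subject's predictor values. The coefficients, predictors, scaling and cut-off are invented for illustration and are not the published EPIC values.

```python
# Hypothetical linear risk score: pooled coefficients x predictor values,
# rescaled to a points system. None of these numbers are the EPIC values.
from sklearn.metrics import roc_auc_score

coef = {"age": 0.02, "female": 0.30, "baseline_weight": -0.01,
        "smoker": 0.45, "sports_hours": -0.05}  # invented pooled coefficients

def risk_points(subject: dict) -> float:
    """Linear combination of predictors, scaled x100 into 'points'."""
    return 100 * sum(coef[k] * subject[k] for k in coef)

subjects = [
    {"age": 30, "female": 1, "baseline_weight": 62, "smoker": 1, "sports_hours": 0},
    {"age": 55, "female": 0, "baseline_weight": 90, "smoker": 0, "sports_hours": 4},
]
gained = [1, 0]  # observed substantial weight gain during follow-up

scores = [risk_points(s) for s in subjects]
print("points:", scores, " AUC:", roc_auc_score(gained, scores))  # discrimination
```

The reported pattern (high negative predictive value, low positive predictive value at the ≥ 200-point cut-off) is typical of such scores: they are better at ruling out low-risk individuals than at pinpointing who will actually gain weight.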
Abstract:
BACKGROUND Functional brain images such as Single-Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) have been widely used to guide clinicians in the diagnosis of Alzheimer's Disease (AD). However, the subjectivity involved in their evaluation has favoured the development of Computer Aided Diagnosis (CAD) systems. METHODS We propose a novel combination of feature extraction techniques to improve the diagnosis of AD. Firstly, Regions of Interest (ROIs) are selected by means of a t-test carried out on 3D Normalised Mean Square Error (NMSE) features restricted to lie within a predefined brain activation mask. In order to address the small-sample-size problem, the dimension of the feature space was further reduced by: Large Margin Nearest Neighbours using a rectangular matrix (LMNN-RECT), Principal Component Analysis (PCA) or Partial Least Squares (PLS) (the latter two also analysed with an LMNN transformation). Regarding the classifiers, kernel Support Vector Machines (SVMs) and LMNN using Euclidean, Mahalanobis and energy-based metrics were compared. RESULTS Several experiments were conducted in order to evaluate the proposed LMNN-based feature extraction algorithms and their benefits as: i) a linear transformation of the PLS- or PCA-reduced data, ii) a feature reduction technique, and iii) a classifier (with Euclidean, Mahalanobis or energy-based methodology). The system was evaluated by means of k-fold cross-validation, yielding accuracy, sensitivity and specificity values of 92.78%, 91.07% and 95.12% (for SPECT) and 90.67%, 88% and 93.33% (for PET), respectively, when the NMSE-PLS-LMNN feature extraction method was used in combination with an SVM classifier, thus outperforming recently reported baseline methods. CONCLUSIONS All the proposed methods turned out to be valid solutions for the presented problem. One advance is the robustness of the LMNN algorithm, which not only provides a higher separation rate between the classes but also (in combination with NMSE and PLS) makes this rate more stable. Another advance is the methods' generalization ability, since the experiments were performed on two image modalities (SPECT and PET).
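A minimal sketch of the reduce-then-classify idea (PLS followed by a linear SVM under k-fold cross-validation) is given below on synthetic stand-in data. The NMSE ROI features and the LMNN metric-learning step (available in, e.g., the metric-learn package) are not reproduced, so this is a simplified skeleton of the pipeline, not the paper's method.

```python
# Simplified pipeline skeleton: PLS dimensionality reduction + linear SVM,
# evaluated with stratified k-fold CV on synthetic high-dimensional data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import accuracy_score

# Small-sample, high-dimensional data mimicking voxel-derived features.
X, y = make_classification(n_samples=100, n_features=500,
                           n_informative=20, random_state=0)

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
accs = []
for tr, te in cv.split(X, y):
    pls = PLSRegression(n_components=10)           # supervised reduction
    Z_tr = pls.fit(X[tr], y[tr]).transform(X[tr])  # fit on training fold only
    Z_te = pls.transform(X[te])
    clf = SVC(kernel="linear").fit(Z_tr, y[tr])    # SVM on reduced features
    accs.append(accuracy_score(y[te], clf.predict(Z_te)))

print(f"k-fold accuracy: {np.mean(accs):.2%} +/- {np.std(accs):.2%}")
```

Note that the reduction step is fitted inside each fold, which is what makes the cross-validated estimate honest when features vastly outnumber samples.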
Abstract:
INTRODUCTION Selenium is an essential micronutrient for human health, being a cofactor for enzymes with antioxidant activity that protect the organism from oxidative damage. An inadequate intake of this mineral has been associated with the onset and progression of chronic diseases such as hypertension, diabetes, coronary disease, asthma, and cancer. For this reason, knowledge of the plasma and erythrocyte selenium levels of a population makes a relevant contribution to the assessment of its nutritional status. OBJECTIVE The objective of the present study was to determine the nutritional status of selenium and the risk of selenium deficiency in a healthy adult population in Spain by examining food and nutrient intake and analyzing biochemical parameters related to selenium metabolism, including plasma and erythrocyte levels and selenium-dependent glutathione peroxidase (GPx) enzymatic activity. MATERIAL AND METHODS We studied 84 healthy adults (31 males and 53 females) from the province of Granada, determining their plasma and erythrocyte selenium concentrations and the association of these levels with the enzymatic activity of glutathione peroxidase (GPx) and with lifestyle factors. We also gathered data on their food and nutrient intake and the results of biochemical analyses. Correlations were studied among all of these variables. RESULTS The mean plasma selenium concentration was 76.6 ± 17.3 μg/L (87.3 ± 17.4 μg/L in males, 67.3 ± 10.7 μg/L in females), whereas the mean erythrocyte selenium concentration was 104.6 μg/L (107.9 ± 26.1 μg/L in males and 101.7 ± 21.7 μg/L in females). The nutritional status of selenium was defined by the plasma concentration required to reach maximum GPx activity, establishing 90 μg/L as the reference value. According to this criterion, 50% of the men and 53% of the women were selenium deficient. CONCLUSIONS Selenium is subject to multiple regulatory mechanisms. Erythrocyte selenium is a good marker of longer-term selenium status, while plasma selenium appears to be a marker of short-term nutritional status. The present findings indicate a positive correlation between plasma selenium concentration and the practice of physical activity. Bioavailability studies are required to establish appropriate reference levels of this mineral for the Spanish population.
Abstract:
OBJECTIVE: To report the data of the Home Parenteral Nutrition (HPN) registry of the NADYA-SENPE working group for the years 2011 and 2012. METHODOLOGY: We compiled the data entered in the online registry by the NADYA group reviewers responsible for monitoring HPN, from January 1, 2011 to December 31, 2012. The fields included were: age, sex, diagnosis and reason for HPN, access route, complications, beginning and end dates, complementary oral or enteral nutrition, activity level, degree of autonomy, supply of product and consumable material, reason for withdrawal, and indication for intestinal transplantation. RESULTS: In 2011, 184 patients from 29 hospitals were registered, representing a rate of 3.98 patients/million inhabitants/year, with 186 HPN episodes recorded. During 2012, 203 patients from 29 hospitals were registered, representing a rate of 4.39 patients/million inhabitants/year, with a total of 211 HPN episodes recorded. CONCLUSIONS: We observed an increase in registered patients with respect to previous years. Neoplasia has remained the main underlying condition since 2003. Although the NADYA registry is well consolidated and has been an indispensable source of information for understanding the progress of Home Artificial Nutrition in our country, there is ample room for improvement, especially regarding the registration of pediatric patients and of complications.
Abstract:
BACKGROUND Understanding of the genetic basis of type 2 diabetes (T2D) has progressed rapidly, but the interactions between common genetic variants and lifestyle risk factors have not been systematically investigated in studies with adequate statistical power. Therefore, we aimed to quantify the combined effects of genetic and lifestyle factors on risk of T2D in order to inform strategies for prevention. METHODS AND FINDINGS The InterAct study includes 12,403 incident T2D cases and a representative sub-cohort of 16,154 individuals from a cohort of 340,234 European participants with 3.99 million person-years of follow-up. We studied the combined effects of an additive genetic T2D risk score and modifiable and non-modifiable risk factors using Prentice-weighted Cox regression and random effects meta-analysis methods. The effect of the genetic score was significantly greater in younger individuals (p for interaction = 1.20×10⁻⁴). Relative genetic risk (per standard deviation [4.4 risk alleles]) was also larger in participants who were leaner, both in terms of body mass index (p for interaction = 1.50×10⁻³) and waist circumference (p for interaction = 7.49×10⁻⁹). Examination of absolute risks by strata showed the importance of obesity for T2D risk. The 10-y cumulative incidence of T2D rose from 0.25% to 0.89% across extreme quartiles of the genetic score in normal weight individuals, compared to 4.22% to 7.99% in obese individuals. We detected no significant interactions between the genetic score and sex, diabetes family history, physical activity, or dietary habits assessed by a Mediterranean diet score. CONCLUSIONS The relative effect of a T2D genetic risk score is greater in younger and leaner participants. However, this sub-group is at low absolute risk and would not be a logical target for preventive interventions. The high absolute risk associated with obesity at any level of genetic risk highlights the importance of universal rather than targeted approaches to lifestyle intervention.
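As a minimal illustration of what an additive genetic risk score is, the sketch below sums simulated risk-allele counts (0, 1 or 2 per SNP) across a panel of SNPs and expresses the score in per-standard-deviation units, as in the abstract. The SNP count and allele frequencies are simulated, not the InterAct panel.

```python
# Additive genetic risk score on simulated genotypes: sum of risk-allele
# counts per person, then standardised to per-SD units.
import numpy as np

rng = np.random.default_rng(0)
n_people, n_snps = 1000, 40
risk_allele_freq = rng.uniform(0.1, 0.5, size=n_snps)  # invented frequencies

# 0, 1 or 2 copies of the risk allele per SNP, per person.
genotypes = rng.binomial(2, risk_allele_freq, size=(n_people, n_snps))

grs = genotypes.sum(axis=1)                  # unweighted additive score
grs_per_sd = (grs - grs.mean()) / grs.std()  # per-SD units for relative-risk models

print(f"one SD of this toy score = {grs.std():.2f} risk alleles")
```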
Abstract:
INTRODUCTION Monotherapy (MT) against HIV has undoubted theoretical advantages and a sound scientific rationale. However, it remains controversial, and here we analyze the efficacy and safety of MT with ritonavir-boosted darunavir (DRV/r) in patients who received this treatment in our hospitals. MATERIALS AND METHODS Retrospective observational study including patients from 10 Andalusian hospitals who received DRV/r as MT and were followed for a minimum of 12 months. We carried out a descriptive statistical analysis of the profile of patients who had been prescribed MT and of the efficacy and safety observed, paying special attention to treatment failure and virological evolution. RESULTS DRV/r was prescribed to 604 patients, of whom 41.1% had a CD4 nadir <200/mm³. Of these patients, 33.1% had chronic hepatitis caused by HCV; they had received an average of five lines of previous treatment and had a history of treatment failure on analogues in 33%, on non-analogues in 22%, and on protease inhibitors (PIs) in 19.5%. 76.6% came from a previous PI-based regimen. Simplification was the main criterion for starting MT in 81.5% of cases, and adverse effects in 18.5%. We managed to maintain MT in 84% of cases, with only 4.8% virological failure (VF) with a viral load (VL) >200 copies/mL and an additional 3.6% of losses due to VF with a VL between 50 and 200 copies/mL. Thirty-three genotype tests were performed after failure, with no resistance mutations to DRV/r or other PIs found. Only 23.7% of patients presented blips during the period of exposure to MT. Eighty-seven percent of all VL determinations were <50 copies/mL, and only 4.99% were >200 copies/mL. Although up to 14.9% experienced an adverse effect (AE) at some point, only 2.6% abandoned MT because of an AE and 1.2% by voluntary decision. Although mean total and LDL cholesterol increased by 10 mg/dL after 2 years of follow-up, HDL cholesterol also increased, by 3 mg/dL, while triglycerides (-14 mg/dL) and GPT (-6 IU/mL) decreased. The average CD4 lymphocyte count increased from 642 to 714/mm³ at 24 weeks. CONCLUSIONS In a very broad series of patients from clinical practice, the data from clinical trials were confirmed: MT with DRV/r as a de-escalation strategy is very safe, is associated with a negligible rate of adverse effects, and maintains good suppression of HIV replication. VF (whether at >50 or >200 copies/mL) remains below 10% and is, in any case, without consequences.
Abstract:
Clinical Decision Support Systems (CDSS) are software applications that support clinicians in making healthcare decisions by providing relevant information about the specific conditions of individual patients. The lack of integration between CDSS and the Electronic Health Record (EHR) has been identified as a significant barrier to CDSS development and adoption. The Andalusia Healthcare Public System (AHPS) provides an interoperable health information infrastructure based on a Service Oriented Architecture (SOA) that eases CDSS implementation. This paper details the joint deployment of a CDSS and a Terminology Server (TS) within the AHPS infrastructure. It also presents a case study on the application of decision support to thromboembolism patients and its potential impact on improving patient safety. We apply the proposed inSPECt tool to evaluate the appropriateness of alerts in this scenario.
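To give a flavour of what such decision support can look like, here is a purely hypothetical sketch of an alert rule for thromboembolism prophylaxis. The patient fields, risk logic and alert text are invented placeholders; it does not model the AHPS services, the TS, or the inSPECt tool.

```python
# Hypothetical CDSS alert rule sketch; all fields and logic are placeholders.
from dataclasses import dataclass

@dataclass
class Patient:
    immobilised: bool
    recent_major_surgery: bool
    on_anticoagulant: bool

def thromboembolism_alert(p: Patient) -> str | None:
    """Return an alert message for the EHR front end, or None if no action is needed."""
    at_risk = p.immobilised or p.recent_major_surgery
    if at_risk and not p.on_anticoagulant:
        return "Consider thromboprophylaxis: patient at risk and not anticoagulated."
    return None  # no alert; suppressing low-value alerts limits alert fatigue

print(thromboembolism_alert(Patient(immobilised=True,
                                    recent_major_surgery=False,
                                    on_anticoagulant=False)))
```

In an SOA deployment like the one described, such a rule would sit behind a service interface, with the patient data drawn from the EHR and the condition codes resolved through the Terminology Server.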