971 results for CLINICAL-SAMPLES


Relevance: 30.00%

Abstract:

OBJECTIVE Cognitive impairments are regarded as a core component of schizophrenia. However, the cognitive dimension of psychosis is hardly considered by ultra-high risk (UHR) criteria. Therefore, we studied whether the combination of symptomatic UHR criteria and the basic symptom criterion "cognitive disturbances" (COGDIS) is superior in predicting first-episode psychosis. METHOD In a naturalistic 48-month follow-up study, the conversion rate to first-episode psychosis was studied in 246 outpatients of an early detection of psychosis service (FETZ); thereby, the association between conversion and the combined versus singular use of UHR criteria and COGDIS was compared. RESULTS Patients who met both UHR criteria and COGDIS (n=127) at baseline had a significantly higher risk of conversion (hr=0.66 at month 48) and a shorter time to conversion than patients who met only UHR criteria (n=37; hr=0.28) or only COGDIS (n=30; hr=0.23). Furthermore, the risk of conversion was higher for the combined criteria than for UHR criteria (n=164; hr=0.56 at month 48) and COGDIS (n=158; hr=0.56 at month 48) when considered irrespective of each other. CONCLUSIONS Our findings support the merits of considering both COGDIS and UHR criteria in the early detection of persons who are at high risk of developing a first psychotic episode within 48 months. Applying both sets of criteria improves sensitivity and individual risk estimation, and may thereby support the development of stage-targeted interventions. Moreover, since the combined approach enables the identification of considerably more homogeneous at-risk samples, it should support both preventive and basic research.
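Below is a minimal, illustrative sketch of the kind of time-to-conversion comparison the abstract reports, using Kaplan-Meier estimates from the lifelines Python package on entirely hypothetical follow-up data; the group labels, durations and event indicators are invented for illustration and are not the study's data.

```python
# Hypothetical data only: compare cumulative conversion risk between criteria groups.
from lifelines import KaplanMeierFitter
import pandas as pd

df = pd.DataFrame({
    "months":    [6, 12, 18, 24, 36, 48, 48, 48],   # time to conversion or censoring
    "converted": [1,  1,  1,  0,  1,  0,  0,  0],   # 1 = converted to psychosis
    "group":     ["UHR+COGDIS"] * 4 + ["UHR only"] * 4,
})

kmf = KaplanMeierFitter()
for name, sub in df.groupby("group"):
    kmf.fit(sub["months"], event_observed=sub["converted"], label=name)
    # 1 - S(48) approximates the cumulative conversion risk at month 48
    risk_48 = 1 - kmf.survival_function_at_times(48).iloc[0]
    print(f"{name}: cumulative conversion risk at 48 months = {risk_48:.2f}")
```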

Relevance: 30.00%

Abstract:

BACKGROUND Thrombotic thrombocytopenic purpura (TTP) associated with severe, acquired ADAMTS13 deficiency is uncommonly reported in children. The incidence, demographic, and clinical features of these children, compared to adults, have not been described. PROCEDURES This study focused on children (<18 years old) and adults with TTP associated with severe, acquired ADAMTS13 deficiency, defined as activity <10%. The incidence rates for TTP in children and adults were calculated from patients enrolled in the Oklahoma TTP-HUS (Hemolytic-Uremic Syndrome) Registry, 1996-2012. To describe demographic and clinical features, children with TTP were also identified from a systematic review of published reports and from samples sent to a reference laboratory for analysis of ADAMTS13. RESULTS The standardized annual incidence rate of TTP in children was 0.09 × 10⁻⁶ children per year, 3% of the incidence rate among adults (2.88 × 10⁻⁶ adults per year). Among the 79 children who were identified (one from the Oklahoma Registry, 55 from published reports, 23 from the reference laboratory), TTP appeared to be more common among females, similar to the relative increased frequency of women among adults with TTP, and more common in older children. Clinical data were available on 52 children; the frequency of severe renal failure, relapse, treatment with rituximab, and systemic lupus erythematosus in these children was similar to adults with TTP. CONCLUSIONS TTP associated with severe, acquired ADAMTS13 deficiency is uncommon in children. The demographic and clinical features of these children are similar to the features of adults with TTP.
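A quick arithmetic check of the reported rate comparison, as a minimal sketch using only the two incidence figures quoted in the abstract (expressed per 10⁶ person-years):

```python
# Incidence figures taken from the abstract; the ratio reproduces the quoted "3%".
rate_children = 0.09e-6   # 0.09 per million children per year
rate_adults   = 2.88e-6   # 2.88 per million adults per year

ratio = rate_children / rate_adults
print(f"child-to-adult incidence ratio: {ratio:.2%}")   # about 3%
```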

Relevance: 30.00%

Abstract:

BACKGROUND Prediction studies in subjects at Clinical High Risk (CHR) for psychosis are hampered by a high proportion of uncertain outcomes. We therefore investigated whether quantitative EEG (QEEG) parameters can contribute to an improved identification of CHR subjects with a later conversion to psychosis. METHODS This investigation was a project within the European Prediction of Psychosis Study (EPOS), a prospective multicenter, naturalistic field study with an 18-month follow-up period. QEEG spectral power and alpha peak frequencies (APF) were determined in 113 CHR subjects. The primary outcome measure was conversion to psychosis. RESULTS Cox regression yielded a model including frontal theta (HR=1.82; [95% CI 1.00-3.32]) and delta (HR=2.60; [95% CI 1.30-5.20]) power, and occipital-parietal APF (HR=0.52; [95% CI 0.35-0.80]) as predictors of conversion to psychosis. The resulting equation enabled the development of a prognostic index with three risk classes (hazard rate 0.057 to 0.81). CONCLUSIONS Power in theta and delta ranges and APF contribute to the short-term prediction of psychosis and enable a further stratification of risk in CHR samples. Combined with (other) clinical ratings, EEG parameters may therefore be a useful tool for individualized risk estimation and, consequently, targeted prevention.
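As a minimal sketch of how such a Cox-based prognostic index is typically built, the snippet below converts the reported hazard ratios to coefficients (beta = ln HR) and sums beta_i * x_i for a hypothetical subject; the covariate values and variable names are illustrative assumptions, not the EPOS model.

```python
import math

# Hazard ratios quoted in the abstract
hazard_ratios = {"frontal_theta": 1.82, "frontal_delta": 2.60, "occipito_parietal_apf": 0.52}
betas = {k: math.log(hr) for k, hr in hazard_ratios.items()}

# Hypothetical subject (e.g., standardized QEEG values); purely illustrative
subject = {"frontal_theta": 1.2, "frontal_delta": 0.8, "occipito_parietal_apf": -0.5}

prognostic_index = sum(betas[k] * subject[k] for k in betas)
print(f"prognostic index: {prognostic_index:.2f}")   # higher values imply higher predicted hazard
```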

Relevance: 30.00%

Abstract:

Listeria (L.) monocytogenes causes orally acquired infections and is of major importance in ruminants. Little is known about L. monocytogenes transmission between the farm environment and ruminants. In order to determine potential sources of infection, we investigated the distribution of L. monocytogenes genetic subtypes in a sheep farm during a listeriosis outbreak by applying four subtyping methods (MALDI-TOF MS, MLST, MLVA and PFGE). L. monocytogenes was isolated from a lamb with septicemia and from the brainstem of three sheep with encephalitis. Samples from the farm environment were screened for the presence of L. monocytogenes during the listeriosis outbreak, as well as four weeks and eight months afterwards. L. monocytogenes was found only in soil and water tank swabs during the outbreak. Four weeks later, following thorough cleaning of the barn, as well as eight months later, L. monocytogenes was absent from environmental samples. All environmental and clinical L. monocytogenes isolates were found to be the same strain. Our results show that the outbreak, involving two different clinical syndromes, was caused by a single L. monocytogenes strain and that soil and water tanks were potential infection sources during this outbreak. However, silage cannot be completely ruled out, as the bales fed prior to the outbreak were not available for analysis. Faecal samples were negative, suggesting that sheep did not act as amplification hosts contributing to environmental contamination. In conclusion, farm management appears to be a crucial factor in limiting a listeriosis outbreak.

Relevance: 30.00%

Abstract:

Background Tumor necrosis factor (TNF) inhibition is central to the therapy of inflammatory bowel diseases (IBD). However, loss of response (LOR) is frequent, and additional tests to support decision making with costly anti-TNF therapy are needed. Methods Consecutive IBD patients receiving anti-TNF therapy (infliximab (IFX), or adalimumab after LOR to IFX) at Bern University Hospital were identified and followed prospectively. Patient whole blood was stimulated with a dose-titration of two triggers, human TNF and LPS. The median fluorescence intensity of CD62L on the surface of granulocytes was quantified by surface staining with specific antibodies (CD33, CD62L) and flow cytometry; fitting logistic curves to these data permits calculation of the EC50, the half-maximal effective TNF concentration for inducing shedding [1]. A shift in the concentration at which CD62L shedding occurred was seen between samples taken before and after administration of the anti-TNF agent, which permits prediction of the response to the drug. This predicted response was correlated with the clinical evolution of the patients in order to analyze the ability of the test to identify LOR to IFX. Results We collected prospective clinical data and blood samples, before and after anti-TNF agent administration, on 33 IBD patients (25 with Crohn's disease and 8 with ulcerative colitis; 45% female) between June 2012 and November 2013. The assay showed functional blockade by IFX (PFR) for 22 patients (17 CD and 5 UC), whereas 11 (8 CD and 3 UC) had no functional response (NR) to IFX. Clinical characteristics (e.g. diagnosis, disease location, smoking status, BMI and number of infusions) were not significantly different between predicted PFR and NR. Among the 22 patients with PFR, only 1 was a clinical non-responder (LOR to IFX), based on prospective clinical evaluation by IBD gastroenterologists (PJ, AM), and among the 11 predicted NR, 3 had no clinical LOR. The sensitivity of the test was 95%, the specificity 73%, and the AUC adjusted for age and gender was 0.81 (Figure 1). During follow-up (median 10 months, range 3–15), 8 "hard" outcomes occurred (3 medic. flares, 4 resections and 1 new fistula): 2 in the PFR and 6 in the NR group (25% vs. 75%; p < 0.01). The correlation with clinical response is presented in Figure 2 (correlation of clinical response with log EC50 changes: 1 = no, 2 = partial, 3 = complete clinical response). Conclusion CD62L (L-selectin) shedding is the first validated test of functional blockade of TNF alpha in anti-TNF-treated IBD patients and will be a useful tool to guide medical decisions on the use of anti-TNF agents. Comparative studies with ATI and IFX trough levels are ongoing. [1] Patuto N, Slack E, Seibold F, Macpherson AJ. Quantitating Anti-TNF Functionality to Inform Dosing and Choice of Therapy. Gastroenterology. 2011;140(5 Suppl 1):S689.
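The EC50 estimation described in the methods can be sketched with an ordinary four-parameter logistic fit; the snippet below uses scipy on invented CD62L fluorescence values over a TNF dose titration, so the concentrations, MFI values and starting parameters are assumptions for illustration rather than the study's assay.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ec50, hill):
    """Four-parameter logistic: CD62L MFI as a function of TNF concentration."""
    return bottom + (top - bottom) / (1.0 + (conc / ec50) ** hill)

tnf_ng_ml = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])   # dose titration (hypothetical)
cd62l_mfi = np.array([980, 950, 800, 520, 260, 140, 110])       # shedding lowers surface CD62L

params, _ = curve_fit(four_pl, tnf_ng_ml, cd62l_mfi, p0=[100, 1000, 0.3, 1.0], maxfev=10000)
print(f"estimated EC50: {params[2]:.2f} ng/mL")
# A rightward shift of the EC50 measured after drug administration, relative to the
# pre-dose sample, would indicate functional TNF blockade by the anti-TNF agent.
```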

Relevance: 30.00%

Abstract:

Mycoplasma bovis causes mastitis in dairy cows and is associated with pneumonia and polyarthritis in cattle. The present investigation included a retrospective case-control study to identify potential herd-level risk factors for M. bovis-associated disease, and a prospective cohort study to evaluate the course of clinical disease in M. bovis-infected dairy cattle herds in Switzerland. Eighteen herds with confirmed M. bovis cases were visited twice within an average interval of 75 d. One control herd with no history of clinical mycoplasmosis, matched for herd size, was randomly selected within a 10 km range for each case herd. Animal health data, production data, and information on milking and feeding management, housing and the presence of potential stress factors were collected. Composite quarter milk samples were aseptically collected from all lactating cows, and 5% of all animals within each herd were sampled by nasal swabs. Organ samples of culled diseased cows were collected when logistically possible. All samples were analyzed by real-time polymerase chain reaction (PCR). In case herds, the incidence risks of pneumonia, arthritis and clinical mastitis prior to the first visit and the incidence rates of clinical mastitis and clinical pneumonia between the two visits were estimated. Logistic regression was used to identify potential herd-level risk factors for M. bovis infection. In case herds, the incidence risk of M. bovis mastitis prior to the first visit ranged from 2 to 15%, whereas 2 to 35% of the cows suffered from clinical pneumonia within the 12 months prior to the first herd visit. The incidence rates of mycoplasmal mastitis and clinical pneumonia between the two herd visits were low in case herds (0-0.1 and 0.1-0.6 per animal-year at risk, respectively). In the retrospective case-control study, high mean milk production, appropriate stimulation until milk let-down, fore-stripping, animal movements (cattle shows and trade), presence of stress factors, and use of a specific brand of milking equipment were identified as potential herd-level risk factors. The prospective cohort study revealed a decreased incidence of clinical disease within three months and prolonged colonization of the nasal cavity by M. bovis in young stock.

Relevance: 30.00%

Abstract:

This guidance paper from the European Psychiatric Association (EPA) aims to provide evidence-based recommendations on early intervention in clinical high risk (CHR) states of psychosis, assessed according to the EPA guidance on early detection. The recommendations were derived from a meta-analysis of current empirical evidence on the efficacy of psychological and pharmacological interventions in CHR samples. Eligible studies had to investigate conversion rate and/or functioning as a treatment outcome in CHR patients defined by the ultra-high risk and/or basic symptom criteria. Besides analyses of treatment effects on conversion rate and functional outcome, age and type of intervention were examined as potential moderators. Based on data from 15 studies (n = 1394), early intervention generally produced significantly reduced conversion rates at 6- to 48-month follow-up compared to control conditions. However, early intervention failed to achieve significantly greater functional improvements because both early intervention and control conditions produced similar positive effects. With regard to the type of intervention, both psychological and pharmacological interventions produced significant effects on conversion rates, but not on functional outcome, relative to the control conditions. Early intervention in youth samples was generally less effective than in predominantly adult samples. Seven evidence-based recommendations for early intervention in CHR samples could be formulated, although more studies are needed to investigate the specificity of treatment effects and potential age effects in order to tailor interventions to individual treatment needs and risk status.

Relevance: 30.00%

Abstract:

The aim of this guidance paper of the European Psychiatric Association is to provide evidence-based recommendations on the early detection of a clinical high risk (CHR) for psychosis in patients with mental problems. To this aim, we conducted a meta-analysis of studies reporting on conversion rates to psychosis in non-overlapping samples meeting at least one of the main CHR criteria: ultra-high risk (UHR) and/or basic symptom criteria. Further, the effects of potential moderators (different UHR criteria definitions, single UHR criteria and age) on conversion rates were examined. Conversion rates in the 42 identified samples, with altogether more than 4000 CHR patients who had mainly been identified by UHR criteria and/or the basic symptom criterion 'cognitive disturbances' (COGDIS), showed considerable heterogeneity. While UHR criteria and COGDIS were related to similar conversion rates up to the 2-year follow-up, conversion rates of COGDIS were significantly higher thereafter. Differences in the onset and frequency requirements of symptomatic UHR criteria, or in their different consideration of functional decline, substance use and co-morbidity, did not seem to impact conversion rates. The 'genetic risk and functional decline' UHR criterion was rarely met and showed only a non-significant pooled effect. However, age significantly affected UHR conversion rates, with lower rates in children and adolescents. Although more research into potential sources of heterogeneity in conversion rates is needed to facilitate improvement of CHR criteria, six evidence-based recommendations for an early detection of psychosis were developed as a basis for the EPA guidance on early intervention in CHR states.
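A common way to pool conversion rates of this kind is an inverse-variance random-effects model on the logit scale; the sketch below implements a DerSimonian-Laird estimate on invented per-sample counts, so the event and sample-size numbers are purely illustrative and not the meta-analysis data.

```python
import numpy as np

events = np.array([12, 30, 8, 20, 15])      # converters per sample (hypothetical)
totals = np.array([60, 120, 50, 90, 70])    # CHR patients per sample (hypothetical)

p = events / totals
logit = np.log(p / (1 - p))
var = 1 / events + 1 / (totals - events)    # approximate variance of each logit

w = 1 / var                                  # fixed-effect weights
fixed = np.sum(w * logit) / np.sum(w)
q = np.sum(w * (logit - fixed) ** 2)         # Cochran's Q
df = len(p) - 1
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)                # DerSimonian-Laird between-sample variance

w_re = 1 / (var + tau2)                      # random-effects weights
pooled_logit = np.sum(w_re * logit) / np.sum(w_re)
pooled_rate = 1 / (1 + np.exp(-pooled_logit))
i2 = max(0.0, (q - df) / q) * 100            # heterogeneity statistic
print(f"pooled conversion rate: {pooled_rate:.1%}, I2 = {i2:.0f}%")
```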

Relevance: 30.00%

Abstract:

BACKGROUND Prostate cancer (PCa) is a very heterogeneous disease with respect to clinical outcome. This study explored differential DNA methylation in a priori selected genes to diagnose PCa and predict clinical failure (CF) in high-risk patients. METHODS A quantitative multiplex, methylation-specific PCR assay was developed to assess promoter methylation of the APC, CCND2, GSTP1, PTGS2 and RARB genes in formalin-fixed, paraffin-embedded tissue samples from 42 patients with benign prostatic hyperplasia and radical prostatectomy specimens of patients with high-risk PCa, encompassing training and validation cohorts of 147 and 71 patients, respectively. Log-rank tests and univariate and multivariate Cox models were used to investigate the prognostic value of DNA methylation. RESULTS Hypermethylation of APC, CCND2, GSTP1, PTGS2 and RARB was highly cancer-specific. However, only GSTP1 methylation was significantly associated with CF in both independent high-risk PCa cohorts. Importantly, trichotomization into low, moderate and high GSTP1 methylation level subgroups was highly predictive of CF. Patients with either a low or high GSTP1 methylation level, as compared to the moderate methylation group, were at a higher risk of CF in both the training (hazard ratio [HR], 3.65; 95% CI, 1.65 to 8.07) and validation sets (HR, 4.27; 95% CI, 1.03 to 17.72), as well as in the combined cohort (HR, 2.74; 95% CI, 1.42 to 5.27), in multivariate analysis. CONCLUSIONS Classification of primary high-risk tumors into three subtypes based on DNA methylation can be combined with clinico-pathological parameters for a more informative risk stratification of these PCa patients.
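The trichotomized survival comparison can be illustrated with a log-rank test across methylation tertiles; the sketch below uses the lifelines package on invented methylation levels and follow-up times, so all values, column names and group sizes are assumptions for illustration only.

```python
import pandas as pd
from lifelines.statistics import multivariate_logrank_test

df = pd.DataFrame({
    "gstp1_methylation": [5, 12, 20, 35, 48, 60, 72, 85, 90, 95],  # arbitrary units
    "months_to_failure": [20, 60, 55, 80, 85, 90, 70, 40, 30, 25],
    "failed":            [1,  0,  0,  0,  0,  0,  0,  1,  1,  1],  # 1 = clinical failure
})

# Split into low / moderate / high methylation subgroups by tertile
df["group"] = pd.qcut(df["gstp1_methylation"], q=3, labels=["low", "moderate", "high"])

result = multivariate_logrank_test(df["months_to_failure"], df["group"], df["failed"])
print(f"log-rank p-value across methylation subgroups: {result.p_value:.3f}")
```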

Relevance: 30.00%

Abstract:

BACKGROUND: In clinical practice, the high-dose ACTH stimulation test (HDT) is frequently used in the assessment of adrenal insufficiency (AI). However, there is uncertainty regarding the optimal time points and number of blood samplings. The present study compared the utility of a single cortisol value taken either 30 or 60 minutes after ACTH stimulation with the traditional interpretation of the HDT. METHODS: Retrospective analysis of 73 HDTs performed at a single tertiary endocrine centre. Serum cortisol was measured at baseline and 30 and 60 minutes after intravenous administration of 250 µg synthetic ACTH1-24. AI was defined as a stimulated cortisol level <550 nmol/l. RESULTS: Twenty patients (27.4%) showed an insufficient rise in serum cortisol using traditional HDT criteria and were diagnosed with AI. Ten individuals showed insufficient cortisol values after 30 minutes, rising to sufficient levels at 60 minutes. All patients with an insufficient cortisol response after 60 minutes also had an insufficient result after 30 minutes. The cortisol value taken after 30 minutes did not add incremental diagnostic value in any of the cases under investigation compared with the 60-minute sample. CONCLUSIONS: Based on the findings of the present analysis, the utility of a cortisol measurement 30 minutes after high-dose ACTH injection was low, and it did not add incremental diagnostic value to a single measurement after 60 minutes.
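The interpretation rule underlying the analysis can be written down directly; the function below is a minimal sketch assuming the peak of the stimulated values is compared against the 550 nmol/l threshold, and its name and signature are illustrative rather than taken from the study.

```python
def interpret_hdt(cortisol_30min_nmol_l: float, cortisol_60min_nmol_l: float,
                  cutoff_nmol_l: float = 550.0) -> str:
    """Classify a high-dose ACTH test (HDT) from the stimulated cortisol values."""
    peak = max(cortisol_30min_nmol_l, cortisol_60min_nmol_l)
    return "adrenal insufficiency" if peak < cutoff_nmol_l else "sufficient adrenal response"

print(interpret_hdt(480, 610))   # 60-minute value exceeds the cutoff -> sufficient response
print(interpret_hdt(300, 420))   # both values below the cutoff -> adrenal insufficiency
```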

Relevance: 30.00%

Abstract:

Background: The individual risk of developing psychosis after being tested for clinical high-risk (CHR) criteria (posttest risk of psychosis) depends on the underlying risk of the disease in the population from which the person is selected (pretest risk of psychosis), and thus on recruitment strategies. Yet, the impact of recruitment strategies on the pretest risk of psychosis is unknown. Methods: Meta-analysis of the pretest risk of psychosis in help-seeking patients selected to undergo CHR assessment: total transitions to psychosis over the pool of patients assessed for potential risk and deemed at risk (CHR+) or not at risk (CHR−). Recruitment strategies (number of outreach activities per study, main target of the outreach campaign, and proportion of self-referrals) were the moderators examined in meta-regressions. Results: 11 independent studies met the inclusion criteria, for a total of 2519 (CHR+: n = 1359; CHR−: n = 1160) help-seeking patients undergoing CHR assessment (mean follow-up: 38 months). The overall meta-analytical pretest risk for psychosis in help-seeking patients was 15%, with high heterogeneity (95% CI: 9%–24%, I² = 96%, P < .001). Recruitment strategies were heterogeneous and opportunistic. Heterogeneity was largely explained by intensive outreach campaigns (n = 11, β = −.166, Q = 9.441, P = .002) primarily targeting the general public (n = 11, β = −1.15, Q = 21.35, P < .001), along with higher proportions of self-referrals (n = 10, β = −.029, Q = 4.262, P = .039), which diluted the pretest risk for psychosis in patients undergoing CHR assessment. Conclusions: There is meta-analytical evidence for overall risk enrichment (pretest risk for psychosis at 38 months = 15%) in help-seeking samples selected for CHR assessment as compared to the general population (pretest risk of psychosis at 38 months = 0.1%). Intensive outreach campaigns predominantly targeting the general population and a higher proportion of self-referrals diluted the pretest risk for psychosis.
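The pretest-risk definition in the methods reduces to simple arithmetic; in the sketch below, the CHR+ and CHR− totals are taken from the abstract while the transition count is a hypothetical number chosen only to reproduce the reported 15% pooled estimate.

```python
chr_positive = 1359          # from the abstract
chr_negative = 1160          # from the abstract
assessed = chr_positive + chr_negative   # 2519 help-seeking patients assessed

transitions = 378            # hypothetical converter count, for illustration only
pretest_risk = transitions / assessed
print(f"pretest risk of psychosis: {pretest_risk:.1%}")   # ~15% with these numbers
```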

Relevance: 30.00%

Abstract:

OBJECTIVE To analytically validate a gas chromatography-mass spectrometry (GC-MS) method for measurement of 6 amino acids in canine serum samples and to assess the stability of each amino acid after sample storage. SAMPLES Surplus serum from 80 canine samples submitted to the Gastrointestinal Laboratory at Texas A&M University and serum samples from 12 healthy dogs. PROCEDURES GC-MS was validated to determine precision, reproducibility, limit of detection, and percentage recovery of known added concentrations of 6 amino acids in surplus serum samples. Amino acid concentrations in serum samples from healthy dogs were measured before (baseline) and after storage in various conditions. RESULTS Intra- and interassay coefficients of variation (10 replicates involving 12 pooled serum samples) were 13.4% and 16.6% for glycine, 9.3% and 12.4% for glutamic acid, 5.1% and 6.3% for methionine, 14.0% and 15.1% for tryptophan, 6.2% and 11.0% for tyrosine, and 7.4% and 12.4% for lysine, respectively. Observed-to-expected concentration ratios in dilutional parallelism tests (6 replicates involving 6 pooled serum samples) were 79.5% to 111.5% for glycine, 80.9% to 123.0% for glutamic acid, 77.8% to 111.0% for methionine, 85.2% to 98.0% for tryptophan, 79.4% to 115.0% for tyrosine, and 79.4% to 110.0% for lysine. No amino acid concentration changed significantly from baseline after serum sample storage at -80°C for ≤ 7 days. CONCLUSIONS AND CLINICAL RELEVANCE GC-MS measurement of concentrations of 6 amino acids in canine serum samples yielded precise, accurate, and reproducible results. Sample storage at -80°C for 1 week had no effect on GC-MS results.
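The two validation metrics reported here (assay coefficient of variation and observed-to-expected recovery) can be sketched in a few lines; the replicate measurements and spiked concentration below are invented values for illustration, not data from the study.

```python
import numpy as np

replicates = np.array([98.2, 101.5, 95.8, 103.1, 99.4])   # hypothetical repeated measurements
cv_percent = 100 * replicates.std(ddof=1) / replicates.mean()
print(f"intra-assay CV: {cv_percent:.1f}%")

observed = 92.0    # measured concentration after spiking (hypothetical)
expected = 100.0   # nominal added concentration (hypothetical)
print(f"observed-to-expected recovery: {100 * observed / expected:.1f}%")
```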

Relevance: 30.00%

Abstract:

BACKGROUND AND OBJECTIVES Allelic variants in UMOD, the gene coding for uromodulin, are associated with rare tubulointerstitial kidney disorders and with the risk of CKD and hypertension in the general population. The factors associated with uromodulin excretion in the normal population remain largely unknown and were therefore explored in this study. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS Urinary uromodulin excretion was measured using a validated ELISA in two population-based cohorts that included more than 6500 individuals. The Swiss Kidney Project on Genes in Hypertension study (SKIPOGH) included 817 adults (mean age±SD, 45±17 years) who underwent renal ultrasonography and performed a 24-hour urine collection. The Cohorte Lausannoise study included 5706 adults (mean age, 53±11 years) with fresh spot morning urine samples. We calculated eGFRs using the CKD-Epidemiology Collaboration formula and by 24-hour creatinine clearance. RESULTS In both studies, positive associations were found between uromodulin and urinary sodium, chloride, and potassium excretion and osmolality. In SKIPOGH, 24-hour uromodulin excretion (median, 41 [interquartile range, 29-57] mg/24 h) was positively associated with kidney length and volume and with creatinine excretion and urine volume. It was negatively associated with age and diabetes. Both spot uromodulin concentration and 24-hour uromodulin excretion were linearly and positively associated (multivariate analyses) with eGFR <90 ml/min per 1.73 m². CONCLUSION Age, creatinine excretion, diabetes, and urinary volume are independent clinical correlates of urinary uromodulin excretion. The associations of uromodulin excretion with markers of tubular function and kidney dimensions suggest that it may reflect tubule activity in the general population.
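For reference, a minimal sketch of the 2009 CKD-EPI creatinine equation referred to above (eGFR in ml/min per 1.73 m², serum creatinine in mg/dl); the coefficients are quoted from the published equation but should be double-checked against the primary source before any real use.

```python
def ckd_epi_2009(scr_mg_dl: float, age_years: float, female: bool, black: bool = False) -> float:
    """Estimated GFR from the CKD-EPI 2009 creatinine equation (sketch)."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age_years)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

print(round(ckd_epi_2009(0.7, 30, female=True), 1))   # roughly 116 for this example
```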

Relevance: 30.00%

Abstract:

Footrot is a widespread problem in Swiss sheep farming. The objectives of this study were to determine whether flocks that were clinically free from footrot carry virulent strains of Dichelobacter nodosus, and to describe the infection dynamics for flocks and individual sheep. For this purpose, a new PCR diagnostic tool was used, which is able to distinguish benign from virulent D. nodosus. Nine farms were examined three times at intervals of 6 months. Cotton swabs were used to collect samples from the interdigital skin and analyze them for the presence of virulent and benign strains of D. nodosus. Additionally, epidemiological data on the farms were collected with the aid of a standardized questionnaire. On four farms, benign strains were diagnosed at each visit; in one farm, benign strains were detected once only. Two flocks revealed sheep infected with virulent D. nodosus throughout the study but without clinical evidence of footrot. In two flocks, virulent strains of D. nodosus were introduced into the flock during the study period. In one farm, clinical symptoms of virulent footrot were evident only two weeks after the positive finding by PCR. Only individual sheep with previously negative status, but none with previously benign status, became infected with virulent strains during the study. The newly developed competitive RT-PCR proved to be more sensitive than clinical diagnosis for detecting footrot infection in herds, as it unequivocally classified the four flocks as infected with virulent D. nodosus, even though they did not show clinical signs at the times of sampling. This early detection may be crucial to the success of any control program. Both new infections with virulent strains could be explained by contact with sheep from herds with virulent D. nodosus, as evaluated from the questionnaires. These results show that within-herd eradication of footrot becomes possible using the competitive PCR assay to specifically diagnose virulent D. nodosus.

Relevance: 30.00%

Abstract:

AIMS A non-invasive gene-expression profiling (GEP) test for rejection surveillance of heart transplant recipients originated in the USA. A European-based study, the Cardiac Allograft Rejection Gene Expression Observational II Study (CARGO II), was conducted to further clinically validate the GEP test performance. METHODS AND RESULTS Blood samples for GEP testing (AlloMap®, CareDx, Brisbane, CA, USA) were collected during post-transplant surveillance. The reference standard for rejection status was based on histopathology grading of tissue from endomyocardial biopsy. The area under the receiver operating characteristic curve (AUC-ROC) and the negative (NPV) and positive (PPV) predictive values for the GEP scores (range 0-39) were computed. Considering a GEP score of 34 as the cut-off (>6 months post-transplantation), 95.5% (381/399) of GEP tests were true negatives, 4.5% (18/399) were false negatives, 10.2% (6/59) were true positives, and 89.8% (53/59) were false positives. Based on 938 paired biopsies, the GEP test score AUC-ROC for distinguishing ≥3A rejection was 0.70 and 0.69 for ≥2-6 and >6 months post-transplantation, respectively. Depending on the chosen threshold score, the NPV and PPV range from 98.1 to 100% and 2.0 to 4.7%, respectively. CONCLUSION For ≥2-6 and >6 months post-transplantation, the CARGO II GEP score performance (AUC-ROC = 0.70 and 0.69) is similar to the CARGO study results (AUC-ROC = 0.71 and 0.67). The low prevalence of ACR contributes to the high NPV and limited PPV of GEP testing. The choice of threshold score for practical use of GEP testing should consider the overall clinical assessment of the patient's baseline risk for rejection.
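The predictive values can be recomputed directly from the 2×2 counts given above; the sketch below assumes "positive" means a GEP score at or above the cut-off of 34 and uses only the counts quoted in the abstract.

```python
tp, fp, tn, fn = 6, 53, 381, 18   # counts from the abstract (>6 months, cut-off 34)

npv = tn / (tn + fn)              # 381/399
ppv = tp / (tp + fp)              # 6/59
sensitivity = tp / (tp + fn)      # 6/24
specificity = tn / (tn + fp)      # 381/434

print(f"NPV {npv:.1%}, PPV {ppv:.1%}, sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")
# NPV ~95.5% and PPV ~10.2% match the abstract; the low prevalence of rejection
# in this stratum (24/458 tests) drives the high NPV and limited PPV.
```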