235 results for AUTOINFLAMMATORY SYNDROMES
Abstract:
Fibromuscular dysplasia (FMD) is a rare, nonatherosclerotic arterial disease whose molecular basis is unknown. We comprehensively studied 47 subjects with FMD by physical examination, spine magnetic resonance imaging, bone densitometry, and brain magnetic resonance angiography. Inflammatory biomarkers in plasma and transforming growth factor β (TGF-β) cytokines in patient-derived dermal fibroblasts were measured by ELISA. Arterial pathology other than medial fibrodysplasia with multifocal stenosis included cerebral aneurysm, found in 12.8% of subjects. Extra-arterial pathology included low bone density (P<0.001); early-onset degenerative spine disease (95.7%); increased incidence of Chiari I malformation (6.4%) and dural ectasia (42.6%); and physical examination findings of a mild connective tissue dysplasia (95.7%). Screening for mutations causing known genetically mediated arteriopathies was unrevealing. Compared with age- and gender-matched controls, subjects with FMD had elevated plasma TGF-β1 (P=0.009) and TGF-β2 (P=0.004) and additional elevated inflammatory markers, as well as increased TGF-β1 (P=0.0009) and TGF-β2 (P=0.0001) secretion in dermal fibroblast cell lines. Detailed phenotyping of patients with FMD demonstrates that FMD is a systemic disease sharing features with the spectrum of genetic syndromes that involve altered TGF-β signaling, and suggests TGF-β as a biomarker of FMD.
Abstract:
The perioperative management of patients with mediastinal masses is a particular clinical challenge in anaesthesia. Although regional anaesthesia is normally the first choice, in some cases it is not feasible because of the surgical approach. In these cases general anaesthesia is the second option but can lead to respiratory and haemodynamic decompensation due to tumor-associated compression (mediastinal mass syndrome). Appropriate management begins with preoperative risk classification on the basis of clinical and radiological findings. In addition to the patient history, chest radiograph, and CT, dynamic methods (e.g. pneumotachography and echocardiography) should be applied to detect possible intraoperative compression syndromes. General anaesthesia should be induced by awake fibreoptic intubation, with the tube introduced via the nasal route while the patient's spontaneous breathing is maintained. Anaesthesia is then continued with short-acting agents given by inhalation or intravenously. If the surgical procedure permits, muscle relaxants should be avoided. If the anaesthetic risk is classified as uncertain or unsafe then, depending on the location of tumor compression (tracheobronchial tree, pulmonary artery, superior vena cava), alternative techniques for securing the airway (different tubes, rigid bronchoscope) and cardiopulmonary bypass with extracorporeal oxygenation should be prepared. For patients with severe clinical symptoms and an extensive mediastinal mass, preoperative cannulation of the femoral vessels is also recommended. Beyond the necessary technical and staffing resources, interdisciplinary cooperation among the participating specialties is the most important prerequisite for optimal patient care.
Abstract:
NaV-β subunits associate with the pore-forming NaV-α subunit of the voltage-gated sodium channel and play critical roles in channel expression, the voltage dependence of channel gating, cell adhesion, signal transduction, and channel pharmacology. Five NaV-β subunits have been identified in humans, all of them implicated in primary arrhythmia syndromes that cause sudden death or in neurologic disorders, including long QT syndrome, Brugada syndrome, cardiac conduction disorders, idiopathic ventricular fibrillation, epilepsy, neurodegenerative diseases, and neuropsychiatric disorders.
Abstract:
Listeria (L.) monocytogenes causes orally acquired infections and is of major importance in ruminants. Little is known about L. monocytogenes transmission between the farm environment and ruminants. To determine potential sources of infection, we investigated the distribution of L. monocytogenes genetic subtypes on a sheep farm during a listeriosis outbreak by applying four subtyping methods (MALDI-TOF-MS, MLST, MLVA and PFGE). L. monocytogenes was isolated from a lamb with septicemia and from the brainstems of three sheep with encephalitis. Samples from the farm environment were screened for the presence of L. monocytogenes during the listeriosis outbreak and again four weeks and eight months later. During the outbreak, L. monocytogenes was found only in soil and water-tank swabs. Four weeks later, following thorough cleaning of the barn, as well as eight months later, L. monocytogenes was absent from environmental samples. All environmental and clinical L. monocytogenes isolates proved to be the same strain. Our results show that the outbreak, which involved two different clinical syndromes, was caused by a single L. monocytogenes strain and that soil and water tanks were potential infection sources during this outbreak. However, silage cannot be completely ruled out, as the bales fed prior to the outbreak were not available for analysis. Faeces samples were negative, suggesting that sheep did not act as amplification hosts contributing to environmental contamination. In conclusion, farm management appears to be a crucial factor in limiting a listeriosis outbreak.
Abstract:
BACKGROUND The CD4 cell count or percentage (CD4%) at the start of combination antiretroviral therapy (cART) is an important prognostic factor in children starting therapy and an important indicator of program performance. We describe trends and determinants of CD4 measures at cART initiation in children from low-, middle-, and high-income countries. METHODS We included children aged <16 years from clinics participating in a collaborative study spanning sub-Saharan Africa, Asia, Latin America, and the United States. Missing CD4 values at cART start were estimated through multiple imputation. Severe immunodeficiency was defined according to World Health Organization criteria. Analyses used generalized additive mixed models adjusted for age, country, and calendar year. RESULTS A total of 34,706 children from 9 low-income, 6 lower-middle-income, and 4 upper-middle-income countries and 1 high-income country (the United States) were included; 20,624 children (59%) had severe immunodeficiency. In low-income countries, the estimated prevalence of children starting cART with severe immunodeficiency declined from 76% in 2004 to 63% in 2010. Corresponding figures were 77% to 66% for lower-middle-income countries and 75% to 58% for upper-middle-income countries. In the United States, the percentage decreased from 42% to 19% between 1996 and 2006. In low- and middle-income countries, infants and children aged 12-15 years had the highest prevalence of severe immunodeficiency at cART initiation. CONCLUSIONS Despite progress in most low- and middle-income countries, many children continue to start cART with severe immunodeficiency. Early diagnosis and treatment of HIV-infected children to prevent the morbidity and mortality associated with immunodeficiency must remain a global public health priority.
Abstract:
BACKGROUND Mutations in the SCN9A gene cause chronic pain and pain insensitivity syndromes. We aimed to study the clinical, genetic, and electrophysiological features of paroxysmal extreme pain disorder (PEPD) caused by a novel SCN9A mutation. METHODS Description of a 4-generation family suffering from PEPD, with clinical, genetic and electrophysiological studies including patch-clamp experiments assessing responses to drugs and temperature. RESULTS The family was clinically comparable to those reported previously, except for a favorable effect of cold exposure and a lack of drug efficacy, including carbamazepine, a proposed treatment for PEPD. A novel p.L1612P mutation in the Nav1.7 voltage-gated sodium channel was found in the four affected family members tested. Electrophysiologically, the mutation substantially depolarized the steady-state inactivation curve (V1/2 from -61.8 ± 4.5 mV to -30.9 ± 2.2 mV, n = 4 and 7, P < 0.001), significantly increased the ramp current (from 1.8% to 3.4%, n = 10 and 12) and shortened recovery from inactivation (from 7.2 ± 5.6 ms to 2.2 ± 1.5 ms, n = 11 and 10). However, there was no persistent current. Cold exposure reduced the peak current and prolonged recovery from inactivation in both wild-type and mutant channels. Amitriptyline only slightly corrected the steady-state inactivation shift of the mutant channel, consistent with the lack of clinical benefit. CONCLUSIONS The novel p.L1612P Nav1.7 mutation expands the PEPD spectrum with a unique combination of clinical symptoms and electrophysiological properties. Symptoms are partially responsive to temperature but not to drug therapy. In vitro testing of sodium channel blockers or of temperature dependence might help predict treatment efficacy in PEPD.
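To illustrate what the reported depolarizing shift in steady-state inactivation implies for channel availability, the two curves can be sketched with a standard Boltzmann function using the V1/2 values from the abstract. This is an illustrative calculation only: the slope factor k is an assumed value, not reported in the abstract.

```python
import numpy as np

def boltzmann_inactivation(v, v_half, k=6.0):
    """Steady-state availability h_inf(V) = 1 / (1 + exp((V - V1/2) / k)).
    k (slope factor, mV) is an assumed illustrative value."""
    return 1.0 / (1.0 + np.exp((v - v_half) / k))

v = -50.0  # holding potential in mV
wt = boltzmann_inactivation(v, v_half=-61.8)   # wild-type V1/2 from the abstract
mut = boltzmann_inactivation(v, v_half=-30.9)  # p.L1612P V1/2 from the abstract
print(f"fraction available at {v} mV: WT {wt:.2f}, mutant {mut:.2f}")
```

At a potential where most wild-type channels are already inactivated, most mutant channels remain available, consistent with the gain-of-function phenotype described.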
Abstract:
The field of animal syndromic surveillance (SyS) is growing, with many systems being developed worldwide. Now is an appropriate time to share ideas and lessons learned from early SyS design and implementation. Based on our practical experience in animal health SyS, with additions from the public health and animal health SyS literature, we put forward for discussion a 6-step approach to designing SyS systems for livestock and poultry. The first step is to formalise policy and surveillance goals that take stakeholder expectations into account and reflect priority issues (1). Next, it is important to find consensus on national priority diseases and identify current surveillance gaps; the geographic, demographic, and temporal coverage of the system must be carefully assessed (2). A minimum dataset for SyS should then be defined, comprising the essential data needed to achieve all surveillance objectives while minimizing the amount of data collected; one can then compile an inventory of the available data sources and evaluate each using the criteria developed (3). A list of syndromes should then be produced for all data sources, cases classified into syndrome classes, and the data converted into time series (4). Based on the characteristics of the syndrome time series, the length of historic data available, and the type of outbreaks the system must detect, different aberration detection algorithms can be tested (5). Finally, it is essential to develop a minimally acceptable response protocol for each statistical signal produced (6). Important outcomes of this pre-operational phase should be the building of a national network of experts and the development of collective action and evaluation plans. While some of the more applied steps (4 and 5) are currently receiving consideration, decision makers and surveillance developers should put more emphasis on the earlier conceptual steps (1-3).
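As a concrete instance of the kind of aberration detection algorithm that might be tested in step (5), the simple EARS C1 method (a moving-window z-score over daily syndrome counts) can be sketched as follows. This is an illustrative sketch, not the algorithm of any particular system; the 7-day baseline and threshold of 3 standard deviations are conventional defaults, and the floor on sigma is an assumption to guard against flat baselines.

```python
import numpy as np

def ears_c1(counts, baseline=7, threshold=3.0):
    """Flag aberrations in a daily syndrome-count series using the EARS C1
    method: compare each day's count with the mean and standard deviation
    of the preceding `baseline` days; alarm if the z-score exceeds `threshold`."""
    counts = np.asarray(counts, dtype=float)
    alarms = []
    for t in range(baseline, len(counts)):
        window = counts[t - baseline:t]
        mu, sigma = window.mean(), window.std(ddof=1)
        sigma = max(sigma, 0.5)  # assumed floor: avoid division by ~zero
        if (counts[t] - mu) / sigma > threshold:
            alarms.append(t)
    return alarms

# A quiet baseline with a spike on day 10 should trigger a single alarm.
series = [2, 3, 2, 1, 3, 2, 2, 3, 2, 2, 15, 2, 3]
print(ears_c1(series))  # → [10]
```

In practice, the choice among C1-style, regression-based, or scan-statistic detectors would depend on the time-series characteristics and historical data length, as the abstract notes.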
Abstract:
BACKGROUND The association between combination antiretroviral therapy (cART) and cancer risk, especially regimens containing protease inhibitors (PIs) or nonnucleoside reverse transcriptase inhibitors (NNRTIs), is unclear. METHODS Participants were followed from the latest of D:A:D study entry or January 1, 2004, until the earliest of a first cancer diagnosis, February 1, 2012, death, or 6 months after the last visit. Multivariable Poisson regression models assessed associations between cumulative (per year) use of either any cART or PI/NNRTI, and the incidence of any cancer, non-AIDS-defining cancers (NADC), AIDS-defining cancers (ADC), and the most frequently occurring ADC (Kaposi sarcoma, non-Hodgkin lymphoma) and NADC (lung, invasive anal, head/neck cancers, and Hodgkin lymphoma). RESULTS A total of 41,762 persons contributed 241,556 person-years (PY). A total of 1832 cancers were diagnosed [incidence rate: 0.76/100 PY (95% confidence interval: 0.72 to 0.79)], 718 ADC [0.30/100 PY (0.28-0.32)], and 1114 NADC [0.46/100 PY (0.43-0.49)]. Longer exposure to cART was associated with a lower ADC risk [adjusted rate ratio: 0.88/year (0.85-0.92)] but a higher NADC risk [1.02/year (1.00-1.03)]. Both PI and NNRTI use were associated with a lower ADC risk [PI: 0.96/year (0.92-1.00); NNRTI: 0.86/year (0.81-0.91)]. PI use was associated with a higher NADC risk [1.03/year (1.01-1.05)]. Although this was largely driven by an association with anal cancer [1.08/year (1.04-1.13)], the association remained after excluding anal cancers from the end point [1.02/year (1.01-1.04)]. No association was seen between NNRTI use and NADC [1.00/year (0.98-1.02)]. CONCLUSIONS Cumulative use of PIs may be associated with a higher risk of anal cancer and possibly other NADC. Further investigation of biological mechanisms is warranted.
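The crude incidence rates quoted in the abstract follow directly from the reported event counts and person-years (events / person-years × 100); a quick arithmetic check:

```python
# Crude incidence rate per 100 person-years = cases / person-years * 100,
# using the counts reported in the abstract.
person_years = 241_556

for label, cases in [("any cancer", 1832), ("ADC", 718), ("NADC", 1114)]:
    rate = cases / person_years * 100
    print(f"{label}: {rate:.2f}/100 PY")
# → any cancer: 0.76/100 PY
# → ADC: 0.30/100 PY
# → NADC: 0.46/100 PY
```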
Abstract:
SUMMARY There is interest in the potential of companion animal surveillance to provide data to improve pet health and to give early warning of environmental hazards to people. We implemented a companion animal surveillance system in Calgary, Alberta and the surrounding communities. Informatics technologies automatically extracted electronic medical records from participating veterinary practices and identified cases of enteric syndrome in the warehoused records. The data were analysed using time-series analyses and a retrospective space-time permutation scan statistic. We identified a seasonal pattern in reports of enteric syndrome in companion animals and four statistically significant clusters of enteric syndrome cases. The cases within each cluster were examined, and information was collected about the animals involved (species, age, sex), their vaccination history, possible exposure or risk-behaviour history, disease severity, and the aetiological diagnosis. We then assessed whether the cases within each cluster were unusual and whether they represented an animal or public health threat. There was often insufficient information recorded in the medical record to characterize the clusters by aetiology or exposures. Space-time analysis of companion animal enteric syndrome cases found evidence of clustering. Collection of more epidemiologically relevant data would enhance the utility of practice-based companion animal surveillance.
Abstract:
Flavonoid-rich dark chocolate consumption benefits cardiovascular health, but the underlying mechanisms are elusive. We investigated the acute effect of dark chocolate on the reactivity of prothrombotic measures to psychosocial stress. Healthy men aged 20-50 years (mean ± SD: 35.7 ± 8.8) were assigned to a single serving of either 50 g of flavonoid-rich dark chocolate (n=31) or 50 g of optically identical flavonoid-free placebo chocolate (n=34). Two hours after chocolate consumption, both groups underwent an acute standardised psychosocial stress task combining public speaking and mental arithmetic. We determined plasma levels of four stress-responsive prothrombotic measures (i.e., fibrinogen, clotting factor VIII activity, von Willebrand factor antigen, and fibrin D-dimer) prior to chocolate consumption, immediately before and after stress, and at 10 and 20 minutes after stress cessation. We also measured the flavonoid epicatechin and the catecholamines epinephrine and norepinephrine in plasma. The dark chocolate group showed a significantly attenuated stress reactivity of the hypercoagulability marker D-dimer (F=3.87, p=0.017) relative to the placebo chocolate group. Moreover, the blunted D-dimer stress reactivity was related to higher plasma levels of the flavonoid epicatechin assessed before stress (F=3.32, p=0.031) but not to stress-induced changes in catecholamines (p's=0.35). There were no significant group differences in the other coagulation measures (p's≥0.87). Adjustment for covariates did not alter these findings. In conclusion, our findings indicate that a single consumption of flavonoid-rich dark chocolate blunted the acute prothrombotic response to psychosocial stress, thereby perhaps mitigating the risk of acute coronary syndromes triggered by emotional stress.
Abstract:
BACKGROUND Patients with HIV exposed to the antiretroviral drug abacavir may have an increased risk of cardiovascular disease (CVD). There is concern that this association arises from channelling bias. Even if exposure is a risk, it is not clear how that risk changes as exposure accumulates. METHODS We assess the effect of exposure to abacavir on the risk of CVD events in the Swiss HIV Cohort Study. We use a new marginal structural Cox model to estimate the effect of abacavir as a flexible function of past exposures while accounting for risk factors that potentially lie on a causal pathway between exposure to abacavir and CVD. RESULTS 11,856 patients were followed for a median of 6.6 years; 365 patients had a CVD event (4.6 events per 1000 patient-years). In a conventional Cox model, recent - but not cumulative - exposure to abacavir increased the risk of a CVD event. In the new marginal structural Cox model, continued exposure to abacavir during the past four years increased the risk of a CVD event (hazard ratio 2.06, 95% confidence interval 1.43-2.98). The estimated function for the effect of past exposures suggests that exposure during the past 6 to 36 months caused the greatest increase in risk. CONCLUSIONS Abacavir increases the risk of a CVD event: the effect of exposure is not immediate; rather, the risk increases as exposure accumulates over the past few years. This gradual increase in risk is not consistent with a rapidly acting mechanism, such as acute inflammation.
Abstract:
Early detection of psychosis is an important topic in psychiatry. Yet there is limited information on the prevalence and clinical significance of high-risk symptoms in children and adolescents as compared to adults. We examined ultra-high-risk (UHR) symptoms and criteria in a sample of individuals aged 8-40 years from the general population of Canton Bern, Switzerland, enrolled from June 2011 to May 2014. The current presence of attenuated psychotic symptoms (APS) and brief limited intermittent psychotic symptoms (BLIPS), and the fulfillment of the onset/worsening and frequency requirements for these symptoms in the UHR criteria, were assessed using the Structured Interview for Psychosis Risk Syndromes. Additionally, perceptive and non-perceptive APS were differentiated. Psychosocial functioning and current non-psychotic DSM-IV axis I disorders were also surveyed. Well-trained psychologists performed the assessments. Altogether, 9.9% of subjects reported APS and none reported BLIPS; 1.3% met all UHR requirements for APS. APS were related to more current axis I disorders and impaired psychosocial functioning, indicating some clinical significance. A strong age effect was detected around age 16: compared with older individuals, 8-15-year-olds reported more perceptive APS, that is, unusual perceptual experiences and attenuated hallucinations. Perceptive APS were generally less related to functional impairment, regardless of age. Conversely, non-perceptive APS were related to low functioning, although this relationship was weaker in those below age 16. Future studies should address the differential effects of perceptive and non-perceptive APS, and their interaction with age, also in terms of conversion to psychosis.
Abstract:
Germline mutation testing in patients with colorectal cancer (CRC) is offered only to the subset of patients whose clinical presentation or tumor histology is suggestive of familial CRC syndromes, which probably underestimates familial CRC predisposition. The aim of our study was to determine whether unbiased screening of newly diagnosed CRC cases with next-generation sequencing (NGS) increases the overall detection rate of germline mutations. We analyzed 152 consecutive CRC patients for germline mutations in 18 CRC-associated genes using NGS. All patients were also evaluated against the Bethesda criteria, and all tumors were investigated for microsatellite instability, immunohistochemistry for mismatch repair proteins, and the BRAF*V600E somatic mutation. NGS-based sequencing identified 27 variants in 9 genes in 23 of the 152 patients studied (18%). Three of these variants were already reported as pathogenic and 12 were class 3 germline variants with an uncertain prediction of pathogenicity. Only 1 of these patients fulfilled the Bethesda criteria and had a microsatellite-instable tumor and an MLH1 germline mutation. The others would have been missed with current approaches: 2 with an MSH6 premature termination mutation and 12 with uncertain, potentially pathogenic class 3 variants in APC, MLH1, MSH2, MSH6, MSH3 and MLH3. The higher mutation detection rate of NGS compared with current testing strategies based on clinicopathological criteria is probably due to the large genetic heterogeneity and overlapping clinical presentation of the various CRC syndromes. NGS can, however, also identify apparently nonpenetrant germline mutations, complicating the clinical management of patients and their families.
Abstract:
White markings and spotting patterns in animal species are thought to be a result of the domestication process. They often serve for the identification of individuals but are sometimes accompanied by complex pathological syndromes. In the Swiss Franches-Montagnes horse population, white markings have increased markedly in size and occurrence during the past 30 years, although the breeding goal demands a horse with as few depigmented areas as possible. In order to improve selection and avoid more excessive depigmentation at the population level, we estimated population parameters and breeding values for white head markings and anterior and posterior leg markings. Heritabilities and genetic correlations for the traits were high (h² > 0.5). A strong positive correlation was found between the chestnut allele at the melanocortin-1 receptor gene locus and the extent of white markings. Segregation analysis revealed that our data fit best to a model including a polygenic effect and a biallelic locus with a dominant-recessive mode of inheritance; the recessive allele was found to be the allele increasing the white trait. Multilocus linkage disequilibrium analysis allowed the mapping of the putative major locus to a chromosomal region on ECA3q harboring the KIT gene.