874 results for Animals as carriers of disease
Abstract:
The role of endothelial progenitor cells (EPCs) in peripheral artery disease (PAD) remains unclear. We hypothesized that EPC mobilization and function play a central role in the development of endothelial dysfunction and directly influence the degree of atherosclerotic burden in peripheral artery vessels. The number of circulating EPCs, defined as CD34(+)/KDR(+) cells, was assessed by flow cytometry in 91 subjects classified according to a predefined sample size of 31 non-diabetic PAD patients, 30 diabetic PAD patients, and 30 healthy volunteers. Both PAD groups had undergone endovascular treatment in the past. As a functional parameter, EPC colony-forming units were determined ex vivo. In addition to a broad laboratory analysis, a series of clinical measures, namely the ankle-brachial index (ABI), flow-mediated dilatation (FMD) and carotid intima-media thickness (cIMT), was investigated. A significant reduction of EPC counts and proliferation indices was observed in both PAD groups compared with healthy subjects. Low EPC number and pathological findings on clinical assessment were strongly correlated with group allocation. Multivariate statistical analysis revealed these findings to be independent predictors of disease appearance. Linear regression analysis showed the ABI to be a predictor of circulating EPC number (p=0.02). Moreover, the functionality of EPCs was correlated with cIMT by linear regression (p=0.017). The influence of diabetes mellitus on EPCs in our study has to be considered marginal in already disease-affected patients. This study demonstrated that EPCs could predict the prevalence and severity of symptomatic PAD, with ABI as the determinant of the state of EPC populations in disease-affected groups.
Abstract:
BACKGROUND Diabetes mellitus and angiographic coronary artery disease complexity are intertwined and unfavorably affect prognosis after percutaneous coronary interventions, but their relative impact on long-term outcomes after percutaneous coronary intervention with drug-eluting stents remains controversial. This study determined drug-eluting stent outcomes in relation to diabetic status and coronary artery disease complexity as assessed by the Synergy Between PCI With Taxus and Cardiac Surgery (SYNTAX) score. METHODS AND RESULTS In a patient-level pooled analysis from 4 all-comers trials, 6081 patients were stratified according to diabetic status and according to the median SYNTAX score (≤11 or >11). The primary end point was major adverse cardiac events, a composite of cardiac death, myocardial infarction, and clinically indicated target lesion revascularization within 2 years. Diabetes mellitus was present in 1310 patients (22%), and new-generation drug-eluting stents were used in 4554 patients (75%). Major adverse cardiac events occurred in 173 diabetic (14.5%) and 436 nondiabetic patients (9.9%; P<0.001). In adjusted Cox regression analyses, SYNTAX score and diabetes mellitus were both associated with the primary end point (P<0.001 and P=0.028, respectively; P for interaction, 0.07). In multivariable analyses, diabetic versus nondiabetic patients had higher risks of major adverse cardiac events (hazard ratio, 1.25; 95% confidence interval, 1.03-1.53; P=0.026) and target lesion revascularization (hazard ratio, 1.54; 95% confidence interval, 1.18-2.01; P=0.002) but similar risks of cardiac death (hazard ratio, 1.41; 95% confidence interval, 0.96-2.07; P=0.08) and myocardial infarction (hazard ratio, 0.89; 95% confidence interval, 0.64-1.22; P=0.45), without significant interaction with SYNTAX score ≤11 or >11 for any of the end points. CONCLUSIONS In this population treated predominantly with new-generation drug-eluting stents, diabetic patients were at increased risk for repeat target lesion revascularization consistently across the spectrum of disease complexity. The SYNTAX score was an independent predictor of 2-year outcomes but did not modify the respective effect of diabetes mellitus. CLINICAL TRIAL REGISTRATION URL: http://www.clinicaltrials.gov. Unique identifiers: NCT00297661, NCT00389220, NCT00617084, and NCT01443104.
Abstract:
BACKGROUND/AIMS The use of antihypertensive medicines has been shown to reduce proteinuria, morbidity, and mortality in patients with chronic kidney disease (CKD). A specific recommendation for a class of antihypertensive drugs is not available in this population, despite the pharmacodynamic differences. We have therefore analysed the association between antihypertensive medicines and survival of patients with chronic kidney disease. METHODS Out of 2687 consecutive patients undergoing kidney biopsy, a cohort of 606 subjects with retrievable medical therapy was included in the analysis. Kidney function was assessed by glomerular filtration rate (GFR) estimation at the time of kidney biopsy. The main outcome variable was death. RESULTS Overall, 114 (18.7%) patients died. In univariate regression analysis, the use of alpha-blockers and calcium channel antagonists, progression of disease, diabetes mellitus (DM) type 1 and 2, arterial hypertension, coronary heart disease, peripheral vascular disease, male sex and age were associated with mortality (all p<0.05). In a multivariate Cox regression model, the use of calcium channel blockers (HR 1.89), age (HR 1.04), DM type 1 (HR 8.43), DM type 2 (HR 2.17) and chronic obstructive pulmonary disease (HR 1.66) were associated with mortality (all p<0.05). CONCLUSION The use of calcium channel blockers, but not of other antihypertensive medicines, is associated with mortality in patients with CKD, primarily those with glomerulonephritis (GN).
Abstract:
BACKGROUND AND AIMS Inflammatory bowel diseases (IBDs) may impair quality of life (QoL) in paediatric patients. We aimed to evaluate, in a nationwide cohort, whether patients perceive their QoL differently from how their parents perceive it. METHODS Sociodemographic and psychosocial characteristics were prospectively acquired from paediatric patients and their parents included in the Swiss IBD Cohort Study. Disease activity was evaluated by the Paediatric Crohn's Disease Activity Index (PCDAI) and the Paediatric Ulcerative Colitis Activity Index (PUCAI). We assessed QoL using the KIDSCREEN questionnaire. The QoL domains were analysed and compared between children and parents according to type of disease and the parents' age, origin, education and marital status. RESULTS We included 110 children and their parents (59 Crohn's disease [CD], 45 ulcerative colitis [UC], 6 IBD unclassified [IBDU]). There was no significant difference in QoL between CD and UC/IBDU, whether the disease was active or in remission. Parents rated overall QoL, as well as the 'mood', 'family' and 'friends' domains, lower than the children themselves, independently of their place of birth and education. However, better concordance was found for the 'school performance' and 'physical activity' domains. Marital status and age of the parents significantly influenced the evaluation of QoL. Married or cohabiting mothers and fathers rated the mood, family and friends domains significantly lower than their children did, whereas mothers living alone had a lower perception of the friends domain and fathers living alone had a lower perception of the family and mood subscores. CONCLUSION Parents of Swiss paediatric IBD patients significantly underestimate the overall QoL and QoL domains of their children, independently of origin and education.
Abstract:
Syndromic surveillance (SyS) systems currently exploit various sources of health-related data, most of which are collected for purposes other than surveillance (e.g. economic). Several European SyS systems use data collected during meat inspection for syndromic surveillance of animal health, as some diseases may be more easily detected post-mortem than at their point of origin or during the ante-mortem inspection upon arrival at the slaughterhouse. In this paper we use simulation to evaluate the performance of a quasi-Poisson regression (also known as an improved Farrington) algorithm for the detection of disease outbreaks during post-mortem inspection of slaughtered animals. When the algorithm was parameterized based on retrospective analyses of 6 years of historic data, the probability of detection was satisfactory for large (range 83-445 cases) outbreaks but poor for small (range 20-177 cases) outbreaks. Varying the amount of historical data used to fit the algorithm can help increase the probability of detection for small outbreaks. However, while the use of a 0.975 quantile generated a low false-positive rate, in most cases more than 50% of outbreak cases had already occurred at the time of detection. The high variance observed in the whole-carcass condemnation time series, and the lack of flexibility in the temporal distribution of simulated outbreaks resulting from the low (monthly) reporting frequency, constitute major challenges for early detection of outbreaks in the livestock population based on meat inspection data. Reporting frequency should be increased in the future to improve the timeliness of the SyS system, while increased sensitivity may be achieved by integrating meat inspection data into a multivariate system simultaneously evaluating multiple sources of data on livestock health.
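As a rough illustration of the detection step described above, the sketch below fits a quasi-Poisson baseline to historical monthly counts and flags the current month when it exceeds the upper 0.975 prediction bound. This is a minimal sketch under stated assumptions (a simple linear-trend model, statsmodels/scipy for the fit, synthetic counts), not the algorithm as implemented in the study.

```python
# Farrington-style detection sketch: quasi-Poisson baseline + 0.975 bound.
# Assumptions: monthly condemnation counts, a simple linear-trend model,
# and a normal approximation to the predictive distribution. Illustrative
# only -- not the study's implementation.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

def detect_outbreak(history, current_count, quantile=0.975):
    """Flag `current_count` if it exceeds the upper prediction bound of a
    quasi-Poisson GLM fitted to the historical baseline counts."""
    t = np.arange(len(history))
    X = sm.add_constant(t)                       # intercept + linear trend
    fit = sm.GLM(history, X, family=sm.families.Poisson()).fit(scale="X2")
    mu = fit.predict(np.array([[1.0, float(len(history))]]))[0]  # expected count
    phi = fit.scale                              # Pearson overdispersion estimate
    upper = mu + norm.ppf(quantile) * np.sqrt(phi * mu)
    return current_count > upper, upper

# Example with six years of synthetic monthly history and one new month.
rng = np.random.default_rng(0)
history = rng.poisson(lam=40, size=72)
flag, threshold = detect_outbreak(history, current_count=65)
print(flag, round(threshold, 1))
```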
Abstract:
Research has shown that disease-specific health-related quality of life (HRQoL) instruments are more responsive than generic instruments to particular disease conditions. However, only a few studies have used disease-specific instruments to measure HRQoL in hemophilia. The goal of this project was to develop a disease-specific utility instrument that measures patient preferences for various hemophilia health states. The visual analog scale (VAS), a ranking method, and the standard gamble (SG), a choice-based method incorporating risk, were used to measure patient preferences. Study participants (n = 128) were recruited from the UT/Gulf States Hemophilia and Thrombophilia Center and stratified by age: 0–18 years and 19+ years. Test-retest reliability was demonstrated for both VAS and SG instruments: overall within-subject correlation coefficients were 0.91 and 0.79, respectively. Results showed statistically significant differences in responses between pediatric and adult participants when using the SG (p = .045). However, no significant differences were shown between these groups when using the VAS (p = .636). When responses to the VAS and SG instruments were compared, statistically significant differences in both the pediatric (p < .0001) and adult (p < .0001) groups were observed. Data from this study also demonstrated that persons with hemophilia with varying severity of disease, as well as those who were HIV infected, were able to evaluate a range of health states for hemophilia. This has important implications for the study of quality of life in hemophilia and the development of disease-specific HRQoL instruments. The utility measures obtained from this study can be applied in economic evaluations that analyze the cost/utility of alternative hemophilia treatments. Results derived from the SG indicate that age can influence patients' preferences regarding their state of health. This may have implications for considering treatment options based on the mean age of the population under consideration. Although both instruments independently demonstrated reliability and validity, the results indicate that the two measures may not be interchangeable.
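To make the two elicitation methods concrete, here is a minimal sketch of how VAS and SG responses map onto 0-1 utilities. The anchors and rescaling conventions are assumptions for illustration, not the instrument developed in this project.

```python
# Minimal sketch of how VAS and SG responses map to 0-1 utilities.
# Illustrative only; anchors and rescaling conventions are assumptions,
# not the instrument developed in the study.

def vas_utility(rating, worst=0.0, best=100.0):
    """Rescale a visual analog scale rating (0-100) to a 0-1 utility."""
    return (rating - worst) / (best - worst)

def sg_utility(indifference_probability):
    """Standard gamble: the utility of the health state equals the probability
    of 'perfect health' at which the respondent is indifferent between the
    certain health state and the gamble (perfect health vs. death)."""
    return indifference_probability

# Example: a respondent rates a hemophilia health state at 70/100 on the VAS
# but is indifferent at a 0.85 chance of perfect health in the gamble.
print(vas_utility(70))   # 0.70
print(sg_utility(0.85))  # 0.85 -- choice-based utilities often exceed VAS ratings
```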
Abstract:
Alzheimer's disease (AD) is associated with greater mortality and reduced survival compared with individuals without dementia. It is uncertain how these survival estimates change when the clinical signs and/or symptoms of comorbid conditions are present in individuals with Alzheimer's disease. Cardiovascular risk factors such as hypertension, hyperlipidemia, congestive heart failure, coronary artery disease, and diabetes mellitus are common conditions in the aged population. Independently, these factors influence mortality and may have an additive effect on reduced survival in an individual with concomitant Alzheimer's disease. The bulk of the evidence from previous research efforts suggests an association between vascular co-morbidities and Alzheimer's disease incidence, but their role in survival remains to be elucidated. The objective of this study was to examine the effects of cardiovascular comorbidities on the survival experience of individuals with probable Alzheimer's disease in order to identify prognostic factors for life expectancy following onset of disease. This study utilized data from the Baylor College of Medicine Alzheimer's Disease Center (ADC) longitudinal study of Alzheimer's disease and other memory disorders. Individuals aged 55-69, 70-79, and ≥80 years had a median survival from date of onset of 9.2 years, 8.0 years, and 7.2 years, respectively (p<0.001), and of 5.5 years, 4.3 years, and 3.4 years from diagnosis. Sex was the strongest predictor of death from onset of AD, with females having a 30 percent lower risk compared to males. These findings further support the notion that age (both from onset and from diagnosis) and sex are the strongest predictors of survival among those with AD.
Abstract:
Context. Despite the rapid growth of disease management programs, there are still questions about their efficacy and effectiveness for improving patient outcomes and their ability to reduce costs associated with chronic disease. Objective. To determine the effectiveness of disease management programs at improving the results of HbA1c tests, lipid profiles and systolic blood pressure (SBP) readings among diabetics. These three quantitative measures are widely accepted methods of determining the quality of a patient's diabetes management and the potential for future complications. Data Sources. MEDLINE and CINAHL were searched from 1950 to June 2008 using MeSH terms designed to capture all relevant studies. Scopus pearling and hand searching were also done. Only English-language articles were selected. Study Selection. Titles and abstracts for the 2347 articles were screened against predetermined inclusion and exclusion criteria, yielding 217 articles for full screening. After full-article screening, 29 studies were selected for inclusion in the review. Data Extraction. From the selected studies, data extraction included sample size, mean change over baseline, and standard deviation for each control and experimental arm. Results. The pooled results show a mean HbA1c reduction of 0.64% (95% CI, -0.83 to -0.44), a mean SBP reduction of 7.39 mmHg (95% CI, -11.58 to -3.2), a mean total cholesterol reduction of 5.74 mg/dL (95% CI, -10.01 to -1.43), and a mean LDL cholesterol reduction of 3.74 mg/dL (95% CI, -8.34 to 0.87). Results for HbA1c, SBP and total cholesterol were statistically significant, while the results for LDL cholesterol were not. Conclusions. The findings suggest that disease management programs utilizing five hallmarks of care can be effective at improving intermediate outcomes among diabetics. However, given the significant heterogeneity present, there may be fundamental differences with respect to study-specific interventions and populations that render them inappropriate for meta-analysis.
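The pooling reported above follows the usual logic of combining per-study mean differences by inverse-variance weighting. The sketch below implements a generic DerSimonian-Laird random-effects pool with made-up inputs; it is not the review's dataset or the exact software used, just an illustration of the calculation.

```python
# Generic DerSimonian-Laird random-effects pooling of mean differences.
# Inputs (effect sizes and standard errors) are illustrative, not the
# review's extracted data.
import numpy as np
from scipy.stats import norm

def pool_random_effects(effects, std_errs):
    effects, std_errs = np.asarray(effects, float), np.asarray(std_errs, float)
    w = 1.0 / std_errs**2                         # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)        # Cochran's Q (heterogeneity)
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_re = 1.0 / (std_errs**2 + tau2)             # random-effects weights
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    z = norm.ppf(0.975)
    return pooled, (pooled - z * se, pooled + z * se)

# Example: HbA1c mean changes (intervention minus control) from three studies.
pooled, ci = pool_random_effects([-0.5, -0.8, -0.6], [0.15, 0.20, 0.10])
print(round(pooled, 2), tuple(round(x, 2) for x in ci))
```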
Abstract:
Health care workers have been known to carry into the workplace a variety of judgmental and negative attitudes towards their patients. In no other area of patient care has this issue been more pronounced than in the management of patients with AIDS. Health care workers have refused to treat or manage patients with AIDS and have often treated them more harshly than identically described leukemia patients. Some health care institutions have simply refused to admit patients with AIDS, and even recent applicants to medical colleges and schools of nursing have indicated a preference for schools in areas with a low prevalence of HIV disease. Since the attitudes of health care workers have significant consequences for patient management, this study was carried out to determine the differences in clinical practice between Nigeria and the United States of America as they relate to knowledge of a patient's HIV status; to determine HIV prevalence and culture at each study site and how they affect infection control practices; to determine the relationship between infection control practices and fear of AIDS; and to determine the predictors of safe infection control practices at each study site. The study utilized the 38-item fear of AIDS scale and the measure of infection control questionnaire for its data. Questionnaires were administered to health care workers at the university teaching hospital sites of Houston, Texas and Calabar, Nigeria. Data were analyzed using chi-square tests and, where appropriate, Student's t-tests to establish the demographic variables for each country. Factor analysis was done using principal components analysis followed by varimax rotation to simple structure. The subscale scores for each study site were compared using t-tests (separate variance estimates) with Bonferroni adjustments for the number of tests. Finally, correlations between infection control procedures and fear of AIDS at each study site were carried out using Pearson product-moment correlation coefficients. The study revealed that there were five dimensions of the fear of AIDS in health care workers, namely fear of loss of control, fear of sex, fear of HIV infection through blood and illness, fear of death and medical interventions, and fear of contact with out-groups. Fear of loss of control was the primary area of concern among Nigerian health care workers, whereas fear of HIV infection through blood and illness was the most important area of AIDS-related fear among United States health care workers. The study also revealed that infection control precautions and practices in Nigeria were based more on normative and social pressures, whereas in the United States they were based on knowledge of disease transmission, supervision and employee discipline. This stresses the need for focused educational programs in health care settings that emphasize universal precautions at all times and that are sensitive to the cultural nuances of the particular environment.
Abstract:
Background: Little is known about the effects on patient adherence when the same study drug is administered at the same dose in two populations with two different diseases in two different clinical trials. The Minocycline in Rheumatoid Arthritis (MIRA) trial and the NIH Exploratory Trials in Parkinson's disease (NET-PD) Futility Study I provide a unique opportunity to do the above and to compare methods of measuring adherence. This study may increase understanding of the influence of disease and adverse events on patient adherence and will provide insights to investigators selecting adherence assessment methods in future clinical trials of minocycline and other drugs. Methods: Minocycline adherence by pill count and the effect of adverse events were compared in the MIRA and NET-PD FS1 trials using multivariable linear regression. Within the MIRA trial, agreement between assay and pill count was compared. The association of adverse events with assay adherence was examined using multivariable logistic regression. Results: Adherence derived from pill count in the MIRA and NET-PD FS1 trials did not differ significantly. Adverse events potentially related to minocycline did not appear useful for predicting minocycline adherence. In the MIRA trial, adherence measured by pill count appeared higher than adherence measured by assay. Agreement between pill count and assay was poor (kappa statistic = 0.25). Limitations: Trial and disease are completely confounded, and hence the independent effect of disease on adherence to minocycline treatment cannot be studied. Conclusion: Simple pill count may be preferred over assay in minocycline clinical trials to measure adherence. Assays may be less sensitive in a clinical setting where appointments are not scheduled in relation to medication administration time, given that assays depend on many pharmacokinetic and instrument-related factors. However, pill count can be manipulated by the patient. Another study suggested that the self-report method is more sensitive than the pill count method in differentiating adherence from non-adherence. An effect of medication-related adverse events on adherence could not be detected.
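For reference, the kappa statistic quoted above measures chance-corrected agreement between two dichotomous adherence classifications (pill count vs. assay). The sketch below shows the standard calculation from a 2x2 table; the counts are illustrative, not the MIRA data.

```python
# Cohen's kappa for agreement between two dichotomous adherence measures
# (pill count vs. assay). The 2x2 counts are illustrative, not MIRA data.
def cohens_kappa(a_yes_b_yes, a_yes_b_no, a_no_b_yes, a_no_b_no):
    n = a_yes_b_yes + a_yes_b_no + a_no_b_yes + a_no_b_no
    p_observed = (a_yes_b_yes + a_no_b_no) / n       # raw agreement
    # Expected agreement if the two measures were independent.
    p_a_yes = (a_yes_b_yes + a_yes_b_no) / n
    p_b_yes = (a_yes_b_yes + a_no_b_yes) / n
    p_expected = p_a_yes * p_b_yes + (1 - p_a_yes) * (1 - p_b_yes)
    return (p_observed - p_expected) / (1 - p_expected)

# Example: 80 participants classified adherent by both measures, with
# substantial disagreement elsewhere.
print(round(cohens_kappa(80, 30, 25, 15), 2))
```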
Abstract:
Few studies have been conducted on the epidemiology of enteric infectious diseases of public health importance in communities along the United States-Mexico border, and these studies typically focus on bacterial and viral diseases. The epidemiology of intestinal helminth infections along the border has not recently been explored, and there are no published reports for El Paso and Ciudad Juarez, both of which are high-traffic urban areas along the Texas-Mexico border. The purpose of this research project was to conduct a cross-sectional epidemiologic survey for enteric helminths of medical importance along the Texas-Mexico border region of El Paso and Ciudad Juarez and to evaluate risk factors for exposure to these parasites. In addition, an emphasis was placed on the zoonotic tapeworm, Taenia solium. This tapeworm is especially important in this region because of the increasing incidence of neurocysticercosis, a severe disease spread by carriers of intestinal T. solium. Fecal samples were collected from individuals of all ages in a population-based cross-sectional household survey and evaluated for the presence of helminth parasites using fecal flotations. In addition, a Taenia coproantigen enzyme-linked immunosorbent assay (ELISA) was performed on each stool sample to identify tapeworm carriers. A standardized questionnaire was administered to identify risk factors and routes of exposure for enteric helminth infections, with additional questions to assess risk factors specific for taeniasis. The actual prevalence of taeniasis along the Texas-Mexico border was unknown, and this is the first population-based study performed in this region. Flotations were performed on 395 samples and four (1%) were positive for helminths, including Ascaris, hookworms and Taenia species. Immunodiagnostic testing demonstrated a prevalence of 2.9% (11/378) for taeniasis. Based on the case definition, a 3% (12/395) prevalence of taeniasis was detected in this area. In addition, statistical analyses indicate that residents of El Paso are 8.5 times more likely to be tapeworm carriers than residents of Ciudad Juarez (PR=8.5, 95% CI 2.35-30.81). This finding has important implications for planning effective health education campaigns to decrease the prevalence of enteric helminths in populations along the Texas-Mexico border.
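The prevalence ratio and confidence interval quoted above follow the standard log-scale calculation for a ratio of two proportions. The sketch below illustrates that calculation with made-up counts; the study's own 2x2 table is not reproduced here.

```python
# Prevalence ratio (PR) with a log-scale 95% CI from a 2x2 table:
# rows = city (exposed vs. reference), columns = carrier yes/no.
# Counts are illustrative, not the study's actual data.
import math

def prevalence_ratio(cases_exp, n_exp, cases_ref, n_ref, z=1.96):
    p_exp, p_ref = cases_exp / n_exp, cases_ref / n_ref
    pr = p_exp / p_ref
    # Standard error of log(PR) for a ratio of two proportions.
    se_log = math.sqrt((1 - p_exp) / cases_exp + (1 - p_ref) / cases_ref)
    lower = math.exp(math.log(pr) - z * se_log)
    upper = math.exp(math.log(pr) + z * se_log)
    return pr, (lower, upper)

# Example: 10 carriers among 200 residents of one city vs. 2 among 195 in the other.
pr, ci = prevalence_ratio(10, 200, 2, 195)
print(round(pr, 1), tuple(round(x, 2) for x in ci))
```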
Abstract:
The effect of biodiversity on the ability of parasites to infect their host and cause disease (i.e. disease risk) is a major question in pathology, which is central to understanding the emergence of infectious diseases and to developing strategies for their management. Two hypotheses, which can be considered as extremes of a continuum, relate biodiversity to disease risk: one states that biodiversity is positively correlated with disease risk (Amplification Effect), and the second predicts a negative correlation between biodiversity and disease risk (Dilution Effect). Which of them applies better to different host-parasite systems is still a source of debate, due to limited experimental or empirical data. This is especially the case for viral diseases of plants. To address this subject, we monitored for three years the prevalence of several viruses, and of virus-associated symptoms, in populations of wild pepper (chiltepin) under different levels of human management. For each population, we also measured the habitat species diversity, host plant genetic diversity and host plant density. Results indicate that disease and infection risk increased with the level of human management, which was associated with decreased species diversity and host genetic diversity, and with increased host plant density. Importantly, species diversity of the habitat was the primary predictor of disease risk for wild chiltepin populations. This changed in managed populations, where host genetic diversity was the primary predictor. Host density was generally a poorer predictor of disease and infection risk. These results support the dilution effect hypothesis and underline the relevance of different ecological factors in determining disease/infection risk in host plant populations under different levels of anthropic influence. These results are relevant for managing plant diseases and for establishing conservation policies for endangered plant species.
Abstract:
The risk of disease associated with persistent virus infections such as HIV-1, hepatitis B and C, and human T-lymphotropic virus type I (HTLV-I) is strongly determined by the virus load. However, it is not known whether a persistent class I HLA-restricted antiviral cytotoxic T lymphocyte (CTL) response reduces viral load and is therefore beneficial, or causes tissue damage and contributes to disease pathogenesis. Patients with HTLV-I-associated myelopathy/tropical spastic paraparesis (HAM/TSP) have a high virus load compared with asymptomatic HTLV-I carriers. We hypothesized that HLA alleles control HTLV-I provirus load and thus influence susceptibility to HAM/TSP. Here we show that, after infection with HTLV-I, the class I allele HLA-A*02 halves the odds of HAM/TSP (P < 0.0001), preventing 28% of potential cases of HAM/TSP. Furthermore, HLA-A*02+ healthy HTLV-I carriers have a proviral load one-third that of HLA-A*02− HTLV-I carriers (P = 0.014). An association of HLA-DRB1*0101 with disease susceptibility was also identified, which doubled the odds of HAM/TSP in the absence of the protective effect of HLA-A*02. These data have implications for other persistent virus infections in which virus load is associated with prognosis and imply that an efficient antiviral CTL response can reduce virus load and so prevent disease in persistent virus infections.
Abstract:
Mutations in superoxide dismutase 1 (SOD1; EC 1.15.1.1) are responsible for a proportion of familial amyotrophic lateral sclerosis (ALS) through acquisition of an as-yet-unidentified toxic property or properties. Two proposed possibilities are that toxicity may arise from imperfectly folded mutant SOD1 catalyzing the nitration of tyrosines [Beckman, J. S., Carson, M., Smith, C. D. & Koppenol, W. H. (1993) Nature (London) 364, 584] through use of peroxynitrite or from peroxidation arising from elevated production of hydroxyl radicals through use of hydrogen peroxide as a substrate [Wiedau-Pazos, M., Goto, J. J., Rabizadeh, S., Gralla, E. D., Roe, J. A., Valentine, J. S. & Bredesen, D. E. (1996) Science 271, 515–518]. To test these possibilities, levels of nitrotyrosine and markers for hydroxyl radical formation were measured in two lines of transgenic mice that develop progressive motor neuron disease from expressing human familial ALS-linked SOD1 mutation G37R. Relative to normal mice or mice expressing high levels of wild-type human SOD1, 3-nitrotyrosine levels were elevated by 2- to 3-fold in spinal cords coincident with the earliest pathological abnormalities and remained elevated in spinal cord throughout progression of disease. However, no increases in protein-bound nitrotyrosine were found during any stage of SOD1-mutant-mediated disease in mice or at end stage of sporadic or SOD1-mediated familial human ALS. When salicylate trapping of hydroxyl radicals and measurement of levels of malondialdehyde were used, there was no evidence throughout disease progression in mice for enhanced production of hydroxyl radicals or lipid peroxidation, respectively. The presence of elevated nitrotyrosine levels beginning at the earliest stages of cellular pathology and continuing throughout progression of disease demonstrates that tyrosine nitration is one in vivo aberrant property of this ALS-linked SOD1 mutant.
Abstract:
We demonstrate that the receptor-binding moiety of Escherichia coli heat-labile enterotoxin (EtxB) can completely prevent autoimmune disease in a murine model of arthritis. Injection of male DBA/1 mice at the base of the tail with type II collagen in the presence of complete Freund's adjuvant normally leads to arthritis, as evidenced by inflammatory infiltration and swelling of the joints. A separate injection of EtxB at the same time as collagen challenge prevented leukocyte infiltration, synovial hyperplasia, and degeneration of the articular cartilage, and reduced clinical symptoms of disease by 82%. The principal biological property of EtxB is its ability to bind to the ubiquitous cell surface receptor GM1 ganglioside, and to other galactose-containing glycolipids and galactoproteins. The importance of receptor interaction in mediating protection from arthritis was demonstrated by the failure of a non-receptor-binding mutant of EtxB to elicit any protective effect. Analysis of T cell responses to collagen, in cultures of draining lymph node cells, revealed that protection was associated with a marked increase in interleukin 4 production concomitant with a reduction in interferon γ levels. Furthermore, in protected mice there was a significant reduction in anti-collagen antibody levels as well as an increase in the IgG1/IgG2a ratio. These observations show that protection is associated with a shift in the Th1/Th2 balance as well as a general reduction in the extent of the anti-type II collagen immune response. This suggests that EtxB-receptor-mediated modulation of lymphocyte responses provides a means of preventing autoimmune disease.