943 results for Disease evaluation
Abstract:
The clinical presentation and neuroimaging findings of children with pseudotumoral hemicerebellitis (PTHC) and Lhermitte-Duclos disease (LDD) may be very similar. The differentiation between these entities, however, is important because their management and prognosis are different. We report on three children with PTHC. For all three children, in the acute situation, the differentiation between PTHC and LDD was challenging. A review of the literature shows that a detailed evaluation of conventional and neuroimaging data may help to differentiate between these two entities. A striated folial pattern, brainstem involvement, and prominent veins surrounding the thickened cerebellar folia on susceptibility-weighted imaging favor LDD, while post-contrast enhancement and an increased choline peak on ¹H magnetic resonance spectroscopy suggest PTHC.
Abstract:
OBJECTIVE To evaluate horses with atrial fibrillation for hypercoagulability; plasma D-dimer concentrations, as a marker of a procoagulant state; and a relationship between coagulation profile results and duration of atrial fibrillation or presence of structural heart disease. DESIGN Case-control study. ANIMALS Plasma samples from 42 horses (25 with atrial fibrillation and 17 without cardiovascular or systemic disease [control group]). PROCEDURES Results of hematologic tests (ie, plasma fibrinogen and D-dimer concentrations, prothrombin and activated partial thromboplastin times, and antithrombin activity) in horses were recorded to assess coagulation and fibrinolysis. Historical and clinical variables, as associated with a hypercoagulable state in other species, were also recorded. RESULTS Horses with atrial fibrillation and control horses lacked clinical signs of hypercoagulation or thromboembolism. Compared with control horses, horses with atrial fibrillation had significantly lower antithrombin activity. No significant differences in plasma fibrinogen and D-dimer concentrations and prothrombin and activated partial thromboplastin times existed between horse groups. In horses with atrial fibrillation versus control horses, a significantly larger proportion had an abnormal plasma D-dimer concentration (10/25 vs 2/17), test results indicative of subclinical activated coagulation (18/25 vs 6/17), or abnormal coagulation test results (25/121 vs 7/85), respectively. CONCLUSIONS AND CLINICAL RELEVANCE Horses with atrial fibrillation did not have clinical evidence of a hypercoagulable state, but a higher proportion of horses with atrial fibrillation, compared with control horses, did have subclinical activated coagulation on the basis of standard coagulation test results.
Abstract:
Sarcoptic mange occurs in free-ranging wild boar (Sus scrofa) but has been poorly described in this species. We evaluated the performance of a commercial indirect enzyme-linked immunosorbent assay (ELISA) for serodiagnosis of sarcoptic mange in domestic swine when applied to wild boar sera. We tested 96 sera from wild boar in populations without mange history ("truly noninfected") collected in Switzerland between December 2012 and February 2014, and 141 sera from free-ranging wild boar presenting mange-like lesions, including 50 live animals captured and sampled multiple times in France between May and August 2006 and three cases submitted to necropsy in Switzerland between April 2010 and February 2014. Mite infestation was confirmed by skin scraping in 20 of them ("truly infected"). We defined sensitivity of the test as the proportion of truly infected that were found ELISA-positive, and specificity as the proportion of truly noninfected that were found negative. Sensitivity and specificity were 75% and 80%, respectively. Success of antibody detection increased with the chronicity of lesions, and seroconversion was documented in 19 of 27 wild boar sampled multiple times that were initially negative or doubtful. In conclusion, the evaluated ELISA has been successfully applied to wild boar sera. It appears to be unreliable for early detection in individual animals but may represent a useful tool for population surveys.
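As a worked illustration of the definitions above (sensitivity is the proportion of truly infected animals found ELISA-positive; specificity is the proportion of truly noninfected animals found negative), the short Python sketch below computes both from a 2x2 confusion matrix. The cell counts are hypothetical, chosen only so the results land near the reported 75% and 80%; they are not the study's raw data.

```python
# Hypothetical confusion-matrix counts (not the study's raw data),
# chosen to illustrate the sensitivity/specificity definitions above.
true_positives = 15    # truly infected wild boar found ELISA-positive
false_negatives = 5    # truly infected wild boar found ELISA-negative
true_negatives = 77    # truly noninfected wild boar found ELISA-negative
false_positives = 19   # truly noninfected wild boar found ELISA-positive

sensitivity = true_positives / (true_positives + false_negatives)
specificity = true_negatives / (true_negatives + false_positives)

print(f"Sensitivity: {sensitivity:.1%}")  # 75.0%
print(f"Specificity: {specificity:.1%}")  # ~80.2%
```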
Abstract:
BACKGROUND In parallel to the increase in wild boar abundance over the past decades, increased exposure to Aujeszky's disease virus (ADV) has been reported in wild boar in several parts of Europe. Since high animal densities have been proposed as one of the major factors influencing ADV seroprevalence in wild boar populations, and wild boar abundance has increased in Switzerland too, a re-evaluation of the ADV status of wild boar in Switzerland was required. We tested wild boar sera collected from 2008 to 2013 with a commercial ELISA for antibodies against ADV. To set our data in the European context, we reviewed scientific publications on ADV serosurveys in Europe for two time periods (1995-2007 and 2008-2014). RESULTS Seven out of 1,228 wild boar sera were positive for antibodies against ADV, resulting in an estimated seroprevalence of 0.57% (95% confidence interval [CI]: 0.32-0.96%). This is significantly lower than the prevalence found in a previous survey in 2004-2005. The literature review revealed that high to very high ADV seroprevalences are reported from Mediterranean and central-eastern countries. By contrast, an "island" of low to medium seroprevalences is observed in the centre of Europe, with few isolated foci of high seroprevalences. We were unable to identify a general temporal trend in ADV seroprevalence at the European scale. CONCLUSIONS The seroprevalence of ADV in wild boar in Switzerland is among the lowest documented in Europe. Considering the disparity of seroprevalences in wild boar across Europe, the fact that seroprevalences in Switzerland and other countries have decreased despite increasing wild boar densities, and the knowledge that stress leads to reactivation of latent ADV with subsequent excretion and transmission, we hypothesize that not only animal density but a range of factors leading to stress, such as management, might play a crucial role in the dynamics of ADV infections.
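For readers who want to reproduce this style of estimate (7 seropositive sera out of 1,228), the sketch below computes the point prevalence with a 95% confidence interval using statsmodels. The abstract does not state which interval method was used; a Wilson score interval is assumed here purely for illustration, so the bounds need not match the published 0.32-0.96% exactly.

```python
# Seroprevalence with a 95% CI; the Wilson score method is an assumption,
# since the paper does not specify how its interval was computed.
from statsmodels.stats.proportion import proportion_confint

positives = 7     # ADV-seropositive wild boar sera
n_sera = 1228     # total sera tested, 2008-2013

prevalence = positives / n_sera
low, high = proportion_confint(positives, n_sera, alpha=0.05, method="wilson")

print(f"Seroprevalence: {prevalence:.2%} (95% CI {low:.2%}-{high:.2%})")
```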
Abstract:
Syndromic surveillance (SyS) systems currently exploit various sources of health-related data, most of which are collected for purposes other than surveillance (e.g. economic). Several European SyS systems use data collected during meat inspection for syndromic surveillance of animal health, as some diseases may be more easily detected post-mortem than at their point of origin or during the ante-mortem inspection on arrival at the slaughterhouse. In this paper we use simulation to evaluate the performance of a quasi-Poisson regression (also known as an improved Farrington) algorithm for the detection of disease outbreaks during post-mortem inspection of slaughtered animals. When parameterizing the algorithm based on retrospective analyses of 6 years of historical data, the probability of detection was satisfactory for large outbreaks (range 83-445 cases) but poor for small outbreaks (range 20-177 cases). Varying the amount of historical data used to fit the algorithm can help increase the probability of detection for small outbreaks. However, while the use of a 0.975 quantile generated a low false-positive rate, in most cases more than 50% of outbreak cases had already occurred at the time of detection. The high variance observed in the whole-carcass condemnation time series and the lack of flexibility in the temporal distribution of simulated outbreaks resulting from the low (monthly) reporting frequency constitute major challenges for early detection of outbreaks in the livestock population based on meat inspection data. Reporting frequency should be increased in the future to improve the timeliness of the SyS system, while increased sensitivity may be achieved by integrating meat inspection data into a multivariate system that simultaneously evaluates multiple sources of data on livestock health.
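The detection step described above can be thought of as: fit an overdispersed (quasi-)Poisson regression to the historical baseline counts, then raise an alarm whenever the current count exceeds an upper quantile of the fitted baseline. The Python sketch below is a deliberately simplified stand-in for the improved Farrington algorithm, not the algorithm itself: it omits the down-weighting of past outbreaks and the power transformation, and reduces seasonality to month indicators. The 0.975 quantile mirrors the abstract; everything else (variable names, the normal approximation used for the threshold) is an assumption for illustration.

```python
# Simplified quasi-Poisson baseline detector for monthly condemnation counts.
# NOT the full improved-Farrington algorithm (no outbreak re-weighting,
# no 2/3-power transformation); intended only to show the basic idea.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

def quasi_poisson_alarm(history, history_months, current, current_month, q=0.975):
    """history: past monthly counts; history_months: their calendar months (1-12);
    current: this month's count. Returns (alarm_flag, threshold)."""
    n = len(history)
    # Design matrix: intercept, linear trend, month-of-year dummies (Feb-Dec).
    X = np.column_stack(
        [np.ones(n), np.arange(n)]
        + [(np.asarray(history_months) == m).astype(float) for m in range(2, 13)]
    )
    fit = sm.GLM(np.asarray(history), X, family=sm.families.Poisson()).fit(scale="X2")
    x_new = np.array([1.0, float(n)] + [float(current_month == m) for m in range(2, 13)])
    mu = float(np.exp(x_new @ fit.params))   # predicted baseline count
    sd = np.sqrt(fit.scale * mu)             # quasi-Poisson variance = dispersion * mean
    threshold = mu + norm.ppf(q) * sd        # upper 0.975 limit (normal approximation)
    return current > threshold, threshold
```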
Abstract:
Mastitis-metritis-agalactia (MMA), also known as postpartum dysgalactia syndrome (PPDS), is the most important disease complex in sows after farrowing. The present study compared 30 MMA problem herds (more than 12% of farrowing sows affected) with 30 control farms (fewer than 10% of farrowing sows affected) to identify risk factors and treatment incidence. Important risk factors identified were, in gilts, integration into the herd after the first farrowing; in gestating sows, firm fecal consistency; and, in lactating sows, soiled troughs, a low flow rate (<2 liters per minute) at drinking nipples, and a high prevalence of lameness. Treatment incidence also differed significantly between the two groups. MMA prevalence could be reduced through optimization of husbandry, feeding and management, which could substantially reduce the use of antibiotics.
Abstract:
The basophil activation test (BAT) has become a pervasive test for allergic response through the development of flow cytometry, discovery of activation markers such as CD63 and unique markers identifying basophil granulocytes. Basophil activation test measures basophil response to allergen cross-linking IgE on between 150 and 2000 basophil granulocytes in <0.1 ml fresh blood. Dichotomous activation is assessed as the fraction of reacting basophils. In addition to clinical history, skin prick test, and specific IgE determination, BAT can be a part of the diagnostic evaluation of patients with food-, insect venom-, and drug allergy and chronic urticaria. It may be helpful in determining the clinically relevant allergen. Basophil sensitivity may be used to monitor patients on allergen immunotherapy, anti-IgE treatment or in the natural resolution of allergy. Basophil activation test may use fewer resources and be more reproducible than challenge testing. As it is less stressful for the patient and avoids severe allergic reactions, BAT ought to precede challenge testing. An important next step is to standardize BAT and make it available in diagnostic laboratories. The nature of basophil activation as an ex vivo challenge makes it a multifaceted and promising tool for the allergist. In this EAACI task force position paper, we provide an overview of the practical and technical details as well as the clinical utility of BAT in diagnosis and management of allergic diseases.
Abstract:
Neural tube defects (NTDs) are the most common severely disabling birth defects in the United States, with a frequency of approximately 1-2 of every 1,000 births. This work describes the identification and evaluation of candidate susceptibility genes that confer risk for the development of NTDs. The project focused on isolated meningomyelocele, also termed spina bifida (SB). Spina bifida is a complex disease with multifactorial inheritance; the subject population (consisting of North American Caucasians and Hispanics of Mexican-American descent) was therefore composed of 459 simplex SB families who were tested for genetic associations using the transmission disequilibrium test (TDT), a nonparametric linkage technique. Three categories of candidate genes were studied: (1) human equivalents of genes shown in mouse models to cause NTDs, (2) HOX and PAX genes, and (3) the MTHFR gene, which is involved in the metabolic pathway of folate. The C677T variant of the 5,10-methylenetetrahydrofolate reductase (MTHFR) gene was the first mutation in this gene to be implicated as a risk factor for NTDs. Our evaluation of the MTHFR gene provides evidence that maternal C677T homozygosity is a risk factor for upper-level spina bifida defects in Hispanics [OR = 2.3, P = 0.02]. This observed risk factor is of great importance because of the high prevalence of this homozygous genotype in the Hispanic population. Additionally, maternal C677T/A1298C compound heterozygosity is a risk factor for upper-level spina bifida defects in non-Hispanic whites [OR = 3.6, P = 0.03]. For TDT analysis, the 1,128 subjects in our total population were genotyped for 54 markers within and/or flanking the 20 candidate genes/gene regions of interest. Significant TDT findings were obtained for 3 of the 54 analyzed markers: D20S101, flanking the PAX1 gene (P = 0.019); D1S228, within the PAX7 gene (P = 0.011); and D2S110, within the PAX8 gene (P = 0.013). These results were followed up by testing the genes directly for mutations using single-strand conformational analysis (SSCA) and direct sequencing. Multiple variants were detected in each of these PAX genes; however, these variants were not passed from parent to child in phase with the positively transmitted alleles. They therefore do not contribute to susceptibility to spina bifida, but rather are previously unreported single nucleotide polymorphisms.
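As context for the TDT results above: the test counts, among parents heterozygous for a marker allele, how often that allele is transmitted versus not transmitted to the affected child; under no association the two counts are expected to be equal, and the usual statistic is a McNemar-style chi-square with one degree of freedom. The sketch below implements that basic statistic with invented transmission counts; it is not the software or the data used in the study.

```python
# Basic transmission disequilibrium test (TDT):
# chi-square = (b - c)^2 / (b + c), 1 df, where b = transmissions of the
# candidate allele from heterozygous parents and c = non-transmissions.
from scipy.stats import chi2

def tdt(transmitted: int, not_transmitted: int):
    stat = (transmitted - not_transmitted) ** 2 / (transmitted + not_transmitted)
    p_value = chi2.sf(stat, df=1)
    return stat, p_value

# Hypothetical counts, for illustration only (not the study's data):
stat, p = tdt(transmitted=70, not_transmitted=45)
print(f"TDT chi-square = {stat:.2f}, p = {p:.3f}")   # ~5.43, p ~0.020
```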
Abstract:
Objective. Although complete blood count (CBC) changes occur with the development of clinical sepsis in newborns, the CBC has not been reported to be a sensitive predictor of sepsis in asymptomatic full-term newborn infants, nor has it been reported to be related to risk factors for sepsis or to clinical decisions. The objective of this study was to evaluate the relationship between the WBC/I:T (immature:total neutrophil) ratio and maternal group B streptococcal (GBS) risk factors (rupture of membranes ≥18 hours, maternal temperature ≥100.4°F, maternal age ≤20 years, previous infant with invasive GBS disease, maternal GBS bacteriuria, and black ethnicity), and to evaluate the relationship between the WBC/I:T ratios and providers' clinical decisions (observe versus repeat the CBC or complete a sepsis evaluation) in asymptomatic full-term newborns at risk for early-onset GBS sepsis. Methods. Medical records were reviewed for infants admitted to the well-baby nursery at a tertiary care teaching hospital in Houston, TX between 1/1/99 and 12/31/00 whose gestational ages were ≥35 weeks, whose mothers had GBS-positive or unknown culture status and inadequate intrapartum antibiotic prophylaxis, and who had screening CBCs performed in the first 30 hours of life because of GBS risk (n = 412). Demographic information, maternal GBS risk factors, CBC results, clinical decisions, and rationales for clinical decisions were collected. Results. With the exception of black ethnicity (p < 0.0001, odds ratio = 0.213), no statistically significant differences in risk factors were found between infants with normal and abnormal WBC counts or normal and abnormal I:T ratios. Infants with abnormal WBCs had a significantly higher likelihood of having a CBC repeated (p = 0.002 for WBC). Providers documented the CBC result in the rationale for clinical decisions in 62% of cases. Conclusion. The CBC results were not related to maternal risk factors for GBS except for ethnicity. Black infants had significantly lower WBC levels than infants of other ethnicities, although this difference was clinically insignificant. Infants with abnormal WBCs had a significantly higher likelihood of undergoing repeat CBCs but not sepsis evaluations. Provider rationale was difficult to evaluate because of insufficient documentation. The screening CBC result did not influence clinicians' decisions to initiate sepsis evaluations in this population.
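To make the kind of association reported above concrete (e.g., the odds ratio of 0.213 for black ethnicity and an abnormal WBC count), the sketch below computes an odds ratio and a Fisher's exact p-value from a 2x2 table; the cell counts are hypothetical and do not reproduce the study's figures.

```python
# Odds ratio and Fisher's exact test from a 2x2 table.
# Rows: exposure (black ethnicity yes/no); columns: abnormal WBC yes/no.
# Counts are hypothetical, for illustration only.
from scipy.stats import fisher_exact

table = [[12, 148],   # black ethnicity: abnormal WBC, normal WBC
         [44, 208]]   # other ethnicities: abnormal WBC, normal WBC

odds_ratio, p_value = fisher_exact(table)
print(f"Odds ratio = {odds_ratio:.3f}, Fisher exact p = {p_value:.4f}")
```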
Abstract:
Dental caries is a common preventable childhood disease that leads to severe physical, mental, and economic repercussions for children and their families if left untreated. A needs assessment in Harris County reported that 45.9% of second graders had untreated dental caries. To address this growing problem, the School Sealant Program (SSP), a primary preventive initiative, was launched by the Houston Department of Health and Human Services (HDHHS) to provide oral health education and underutilized dental preventive services to second grade children from participating Local School Districts (LSDs). To determine the effectiveness and efficiency of the SSP, a program evaluation of the Oral Health Education (OHE) component of the SSP was conducted by the HDHHS between September 2007 and June 2008. The objective of the evaluation was to assess short-term changes in oral health knowledge of the participants and determine whether any such changes were due to the OHE sessions. An 8-item multiple-choice pre/post test was developed for this purpose and administered to the participants before and immediately after the OHE sessions. The present project analyzed pre- and post-test data of 1,088 second graders from 22 participating schools. Changes in overall and topic-specific knowledge of the program participants before and after the OHE sessions were analyzed using the Wilcoxon signed-rank test. Results. The overall knowledge assessment showed a statistically significant (p < 0.001) increase in the dental health knowledge of the participants after the oral health education sessions. Participants in the higher-scoring category (7-8 correct responses) increased from 9.5% at baseline to 60.8% after the education sessions. Overall knowledge increased in all school regions, with the highest gains seen in the Central and South regions. Males and females had similar knowledge gains. Significant knowledge differences were also found for each of the topic-specific categories (functions of teeth, healthy diet, healthy habits, dental sealants; p < 0.001), indicating an increase in topic-specific knowledge of the participants after the health education sessions. Conclusions. The OHE sessions were successful in increasing the short-term oral health knowledge of the participants.
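A minimal sketch of the pre/post analysis described above: paired knowledge scores for the same children are compared with the Wilcoxon signed-rank test via scipy. The 0-8 scores below are invented for illustration and are not the program's data; with small samples or tied differences scipy falls back to a normal approximation.

```python
# Paired pre/post comparison with the Wilcoxon signed-rank test.
# Scores are hypothetical 0-8 knowledge scores, for illustration only.
from scipy.stats import wilcoxon

pre_scores  = [3, 4, 2, 5, 4, 3, 6, 2, 4, 5, 3, 6]
post_scores = [6, 7, 4, 8, 5, 7, 8, 6, 7, 8, 5, 7]

stat, p_value = wilcoxon(pre_scores, post_scores)
print(f"Wilcoxon statistic = {stat}, p = {p_value:.4f}")
```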
Abstract:
Cardiovascular disease has been the leading cause of death in the United States for over fifty years. While multiple risk factors for cardiovascular disease have been identified, hypertension is one of the most commonly recognized and treatable. Recent studies indicate that the prevalence of hypertension among children and adolescents is between 3% and 5%, much higher than originally estimated and likely rising because of the epidemic of obesity in the U.S. In 2004, the National High Blood Pressure Education Program Working Group on High Blood Pressure in Children and Adolescents published new guidelines for the diagnosis and treatment of hypertension in this population. Included in these recommendations was the creation of a new diagnosis, pre-hypertension, aimed at identifying children at risk for hypertension so that early lifestyle interventions could be provided in an effort to prevent its ultimate development. To determine the risk associated with pre-hypertension for the development of incident hypertension, a secondary analysis was performed of a repeated cross-sectional study measuring blood pressure in Houston-area adolescents from 2000 to 2007. Of 1,006 students who participated in the blood pressure screening on more than one occasion and were not diagnosed with hypertension at the initial encounter, eleven were later found to have hypertension, giving an overall incidence rate of 0.5% per year. Incidence rates were higher among overweight adolescents, at 1.9% per year [IRR 8.6 (1.97, 51.63)]; among students "at risk for hypertension" (pre-hypertensive or with an initial blood pressure in the hypertensive range that fell on subsequent measures), at 1.4% per year [IRR 4.77 (1.21, 19.78)]; and among those with blood pressure ≥90th percentile on three occasions, at 6.6% per year [IRR 21.87 (3.40, 112.40)]. Students with pre-hypertension as currently defined by the Task Force did have an increased rate of hypertension (1.1% per year), but this did not reach statistical significance [IRR 2.44 (0.42, 10.18)]. Further research is needed to determine the morbidity and mortality associated with pre-hypertension in this age group, as well as the effectiveness of various interventions for preventing the development of hypertensive disease among these at-risk individuals.
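A small sketch of how incidence rates and rate ratios like those quoted above are computed: incident cases divided by person-years in each group, with a Wald-type 95% confidence interval on the log rate ratio. The person-year denominators and case splits below are hypothetical stand-ins, since the abstract reports only the rates and IRRs.

```python
# Incidence rate ratio (IRR) with a 95% Wald CI on the log scale.
# Event counts and person-year denominators are hypothetical.
import math

def irr_with_ci(events_exposed, py_exposed, events_ref, py_ref, z=1.96):
    rate_exposed = events_exposed / py_exposed
    rate_ref = events_ref / py_ref
    irr = rate_exposed / rate_ref
    se_log = math.sqrt(1 / events_exposed + 1 / events_ref)
    low = math.exp(math.log(irr) - z * se_log)
    high = math.exp(math.log(irr) + z * se_log)
    return irr, low, high

# e.g., 4 incident cases in 210 person-years in one group vs
# 2 cases in 900 person-years in a reference group:
irr, low, high = irr_with_ci(4, 210, 2, 900)
print(f"IRR = {irr:.1f} (95% CI {low:.2f}-{high:.2f})")
```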
Abstract:
The study of obesity and its causes has evolved into one of the most important public health issues in the United States (Office of Disease Prevention and Health Promotion, 2007). Obesity is linked to several chronic conditions, such as cardiovascular disease, diabetes, and some cancers (National Center for Chronic Disease Prevention and Health Promotion, 2008b), and the public health concern resides in the morbidity and mortality associated with obesity and related conditions (National Heart, Lung and Blood Institute, 1998). Furthermore, obesity and its related conditions present economic challenges to employers in terms of medical health care, sick leave, short-term disability, and long-term disability benefits utilized by employees (Østbye, Dement, and Krause, 2007). Recently, articles covering intervention programs targeting obesity in the occupational setting have surfaced in the scientific literature. The increased interest in this area stems from the fact that employees in the United States spend more time in the work environment than employees in many other industrialized nations, including Japan and most of Western Europe (Organisation for Economic Co-operation and Development, 2006). Moreover, the scientific literature supports the idea of investing in healthy human capital to promote employee productivity and output (Berger, Howell, Nicholson, & Sharda, 2003). The time spent in the work environment, the business need for healthy employees, and the public health concern create an opportunity for planning, implementing, and analyzing interventions for effectiveness. This paper aims to identify intervention programs that target obesity in the occupational setting; to analyze the overall effect of diet, physical fitness, and behavioral change interventions addressing overweight and obesity in the occupational setting; and to evaluate the details and effectiveness of components such as intervention setting, target participant group, content, industry, and length of follow-up. Once the strengths and weaknesses of the interventions are evaluated, ideas will be suggested for future implementation.
Abstract:
Diabetes mellitus is not a single disease but a group of diseases. Common to all types of diabetes are high levels of blood glucose arising from a variety of causes. In 2006, the American Diabetes Association ranked diabetes as the fifth leading cause of death in the United States. The complications and consequences are serious and include nephropathy, retinopathy, neuropathy, heart disease, amputations, pregnancy complications, sexual dysfunction, biochemical imbalances, susceptibility and sensitivity to many other diseases, and in some cases death. The serious nature of diabetes mellitus and its complications has compelled researchers to devise new strategies to reach population segments at high risk. Various avenues of outreach have been attempted. This pilot program is not unique in using a health museum as a point of outreach; however, health museums have not been a major source of interventions either. Little information was available regarding health museum visitor demographics, visitation patterns, companion status, and museum trust levels prior to this pilot intervention. This visitor information will improve planning for further interventions and studies. This thesis also examined prevalence data in a temporal context, the populations at risk for diabetes, the collecting agencies, and other relevant collected data. The prevalence of diabetes has been increasing rapidly. The increase is partially explained by refinement of the definition of diabetes as its etiology has become better understood. Increasing obesity and sedentary lifestyles have also contributed to the increase, which has fallen disproportionately on minority populations. Treatment options are complex and have had limited effectiveness, which leads one to conclude that prevention and early diagnosis are preferable. However, the general public has insufficient awareness and education regarding diabetes symptoms and the serious risks and complications the disease can cause. Reaching high-risk, high-prevalence populations is challenging for any intervention. During its "free family Thursdays," The Health Museum (Houston, Texas) has attracted patrons of varied ethnicities, similar to Houston and Harris County demographics. This research project explored the effectiveness of a pilot diabetes educational intervention in a health museum setting that people chose to visit.
Abstract:
Coronary artery disease (CAD) is the most common cause of morbidity and mortality in the United States. While coronary angiography (CA) is the gold-standard test to investigate coronary artery disease, prospectively gated 64-slice computed tomography (Prosp-64CT) is a new non-invasive technology that uses 64-slice computed tomography (64CT) with electrocardiographic gating to investigate coronary artery disease. The aim of the current study was to investigate the role of body mass index (BMI) as a factor affecting the occurrence of CA after a Prosp-64CT, as well as the quality of the Prosp-64CT. Demographic and clinical characteristics of the study population were described. A secondary analysis of data on patients who underwent a Prosp-64CT for evaluation of coronary artery disease was performed. Seventy-seven patients who underwent Prosp-64CT for evaluation of coronary artery disease were included. Fifteen patients were excluded because they had missing data regarding BMI, quality of the Prosp-64CT, or CA; thus, a total of 62 patients were included in the final analysis. The mean age was 56.2 years. The mean BMI was 31.3 kg/m². Eight (13%) patients underwent a CA within one month of Prosp-64CT. Eight (13%) patients had a poor-quality Prosp-64CT. There was a significant association between higher BMI and the occurrence of CA after Prosp-64CT (P < 0.05). There was a trend, but no statistical significance, for the association between obesity and the occurrence of CA (P = 0.06). Neither BMI nor obesity was significantly associated with poor quality of the Prosp-64CT (P = 0.19 and P = 0.76, respectively). In conclusion, BMI was significantly associated with the occurrence of CA within one month of Prosp-64CT. Thus, in patients with a higher BMI, diagnostic investigation with both tests could be avoided; rather, only a CA could be performed. However, the relationship of BMI to the quality of Prosp-64CT needs to be investigated further, since the sample size of the current study was small.
Abstract:
Background. About a third of the world's population is infected with tuberculosis (TB), with sub-Saharan Africa being the worst hit. Uganda is ranked 16th among the countries with the highest TB burden; the burden in children, however, has not been determined. The burden of TB has been worsened by the advent of HIV, and TB is the leading cause of mortality in HIV-infected individuals. Development of TB disease can be prevented if TB is diagnosed during its latent stage and treated with isoniazid. For over a century, latent TB infection (LTBI) was diagnosed using the tuberculin skin test (TST). New interferon-gamma release assays (IGRAs) have been approved by the FDA for the diagnosis of LTBI, and adult studies have shown that IGRAs are superior to the TST, but there have been few studies in children, especially in areas of high TB and HIV endemicity. Objective. The objective of this study was to examine whether IGRAs have a role in LTBI diagnosis in HIV-infected children in Uganda. Methods. Three hundred and eighty-one (381) children were recruited at the Baylor College of Medicine-Bristol-Myers Squibb Children's Clinical Center of Excellence at Mulago Hospital, Kampala, Uganda between March and August 2010. All the children received a TST and the T-SPOT.TB test, which was the IGRA chosen for this study. Sputum examination and chest x-rays were also done to rule out active TB. Results. There was no statistically significant difference between the tests. The agreement between the two assays was 95.9%, and the kappa statistic was 0.7 (95% CI: 0.55-0.85, p-value < 0.05), indicating substantial (good) agreement. The TST was associated with older age and higher weight-for-age z-scores, but the T-SPOT.TB was not. Both tests were associated with a history of taking antiretroviral therapy (ART). Conclusion. Before promoting the use of IGRAs in children living in HIV/TB-endemic countries, more research needs to be done.
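To illustrate the agreement measures reported above (95.9% raw agreement, kappa of 0.7), the sketch below computes percent agreement and Cohen's kappa from a 2x2 cross-classification of TST and T-SPOT.TB results; the cell counts are invented to land near the reported values and are not the study's data.

```python
# Percent agreement and Cohen's kappa between two diagnostic tests.
# Hypothetical 2x2 counts (total 381), chosen only to approximate the
# abstract's reported 95.9% agreement and kappa of about 0.7.
both_pos, tst_only, tspot_only, both_neg = 20, 8, 8, 345
n = both_pos + tst_only + tspot_only + both_neg

observed = (both_pos + both_neg) / n                      # raw agreement
tst_pos_rate = (both_pos + tst_only) / n
tspot_pos_rate = (both_pos + tspot_only) / n
expected = (tst_pos_rate * tspot_pos_rate
            + (1 - tst_pos_rate) * (1 - tspot_pos_rate))  # chance agreement

kappa = (observed - expected) / (1 - expected)
print(f"Agreement = {observed:.1%}, Cohen's kappa = {kappa:.2f}")
```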