881 results for Disease severity
Abstract:
AIM To relate the mean percentage of bleeding on probing (BOP) to smoking status in patients enrolled in supportive periodontal therapy (SPT). MATERIALS AND METHODS Retrospective data on BOP from 8,741 SPT visits were related to smoking status across categories of both periodontal disease severity and progression (instability) in patients undergoing dental hygiene treatment at the Medi School of Dental Hygiene (MSDH), Bern, Switzerland, between 1985 and 2011. RESULTS A total of 445 patients were identified, of whom 27.2% (n = 121) were smokers, 27.6% (n = 123) former smokers and 45.2% (n = 201) non-smokers. Mean BOP increased statistically significantly with disease severity (p = 0.0001) and periodontal instability (p = 0.0115), irrespective of smoking status. Periodontally stable smokers (n = 30) categorized with advanced periodontal disease demonstrated a mean BOP of 16.2%, compared with unstable smokers (n = 15), whose mean BOP was 22.4% (p = 0.0291). Assessment of BOP in relation to the percentage of sites with periodontal probing depths (PPD) ≥ 4 mm at the patient level yielded a statistically significantly lower proportion of BOP in smokers than in non-smokers and former smokers (p = 0.0137). CONCLUSIONS Irrespective of smoking status, increased mean BOP in SPT patients relates to disease severity and periodontal instability, while smokers demonstrate lower mean BOP concomitantly with an increased prevalence of residual PPDs.
Abstract:
Classical swine fever (CSF) causes major losses in pig farming, with varying degrees of disease severity. Efficient live attenuated vaccines against classical swine fever virus (CSFV) are used routinely in endemic countries. However, despite intensive vaccination programs in these areas for more than 20 years, CSF has not been eradicated. Molecular epidemiology studies in these regions suggest that the virus circulating in the field has evolved under the positive selection pressure exerted by the immune response to the vaccine, leading to new attenuated viral variants. Recent work by our group demonstrated that a high proportion of persistently infected piglets can be generated by early postnatal infection with low- and moderately virulent CSFV strains. Here, we studied the immune response to a hog cholera lapinised virus vaccine (HCLV), C-strain, in six-week-old persistently infected pigs following postnatal infection. CSFV-negative pigs were vaccinated as controls. The humoral and interferon-gamma responses as well as the CSFV RNA loads were monitored for 21 days post-vaccination. No vaccine viral RNA was detected in serum samples or tonsils from CSFV postnatally persistently infected pigs during the 21 days post-vaccination. Furthermore, no E2-specific antibody response or neutralising antibody titres were detected in CSFV persistently infected vaccinated animals. Likewise, no IFN-gamma-producing cell response against CSFV or PHA was observed. To our knowledge, this is the first report demonstrating the absence of a response to vaccination in CSFV persistently infected pigs.
Abstract:
In the demanding environment of healthcare reform, reduction of unwanted physician practice variation is promoted, often through evidence-based guidelines. Guidelines represent innovations that direct change(s) in physician practice; however, compliance has been disappointing. Numerous studies have analyzed guideline development and dissemination, while few have evaluated the consequences of guideline adoption. The primary purpose of this study was to explore and analyze the relationship between physician adoption of the glycated hemoglobin test guideline for management of adult patients with diabetes and the cost of medical care. The study also examined six personal and organizational characteristics of physicians and their association with innovativeness, or adoption of the guideline. Cost was represented by approved charges from a managed care claims database. Total cost, and the cost of diabetes and related complications, were first compared for all patients of adopter physicians with those of non-adopter physicians. Then, data were analyzed controlling for disease severity based on insulin dependency, and for high-cost cases. There was no statistically significant difference in any of the eight cost categories analyzed. This study covered a twelve-month period and did not reflect cost associated with future complications known to result from inadequate management of glycemia. Guideline compliance did not increase annual cost, which, combined with the future benefit of glycemic control, lends support to the cost-effectiveness of the guideline in the long term. Physician adoption of the guideline was recommended to reduce the future personal and economic burden of this chronic disease. Only half of the physicians studied had adopted the glycated hemoglobin test guideline for at least 75% of their diabetic patients. No statistically significant relationship was found between any physician characteristic and guideline adoption. Instead, it was likely that the innovation-decision process and guideline dissemination methods were most influential. A multidisciplinary, multi-faceted approach, including interventions for each stage of the innovation-decision process, was proposed to diffuse practice guidelines more effectively. Further, it was recommended that Organized Delivery Systems expand existing administrative databases to include clinical information, decision support systems, and reminder mechanisms, to promote and support physician compliance with this and other evidence-based guidelines.
Abstract:
Symptoms have been shown to predict quality of life, treatment course and survival in solid tumor patients. Currently, no instrument exists that measures both cancer-related symptoms and the neurologic symptoms that are unique to persons with primary brain tumors (PBT). The aim of this study was to develop and validate an instrument to measure symptoms in patients who have PBT. A conceptual analysis of symptoms and symptom theories led to defining the symptom experience as the perception of the frequency, intensity, distress, and meaning that occurs as symptoms are produced, perceived, and expressed. The M.D. Anderson Symptom Inventory (MDASI) measures both symptoms and how they interfere with daily functioning in patients with cancer, which is similar to the situational meaning defined in the analysis. A list of symptoms pertinent to the PBT population was added to the core MDASI and reviewed by a group of experts for validity. As a result, 18 items were added to the core MDASI (the MDASI-BT) for the next phase of instrument development, which established validity and reliability through a descriptive, cross-sectional approach with PBT patients. Data were collected with a patient-completed demographic data sheet, an investigator-completed clinician checklist, and the MDASI-BT. Analysis evaluated the reliability and validity of the MDASI-BT in PBT patients. Data were obtained from 201 patients. The number of items was reduced to 22 by evaluation of symptom severity as well as cluster analysis. Regression analysis showed that more than half (56%) of the variability in symptom severity was explained by the brain tumor module items. Factor analysis confirmed that the 22-item MDASI-BT measured six underlying constructs: (a) affective; (b) cognitive; (c) focal neurologic deficits; (d) constitutional symptoms; (e) treatment-related symptoms; and (f) gastrointestinal symptoms. The MDASI-BT was sensitive to disease severity and to whether the patient was hospitalized. The MDASI-BT is the first instrument to measure symptoms in PBT patients that has demonstrated reliability and validity. It is the first step in a program of research to evaluate the occurrence of symptoms and to plan and evaluate interventions for PBT patients.
Abstract:
Background. Racial disparities in healthcare span such areas as access, outcomes after procedures, and patient satisfaction. Previous work has suggested that minorities receive less healthcare and have worse survival rates. In adult orthotopic liver transplantation (OLT), mixed results have been reported, with some studies showing African-American recipients having poorer survival than Caucasians, and others finding no such discrepancy. Purpose. This study's purpose was to analyze the most recent United Network for Organ Sharing (UNOS) data, both before and after the implementation of the Model for End-Stage Liver Disease (MELD)/Pediatric End-Stage Liver Disease (PELD) scoring system, to determine whether minority racial groups still experience poor outcomes after OLT. Methods. The UNOS dataset for 1992-2001 (Era I) and 2002-2007 (Era II) was used. Patient survival rates for each Era, and for adult and pediatric recipients, were analyzed with adjustment. A separate multivariate analysis was performed on African-American adult patients in Era II in order to identify unique predictors of poor patient survival. Results. The overall study included 66,118 OLT recipients. The majority were Caucasian (78%), followed by Hispanics (13%) and African-Americans (9%). Hispanic and African-American adults were more likely to be female, to have hepatitis C, to be in the intensive care unit (ICU) or ventilated at the time of OLT, to have a MELD score ≥ 23, to have a lower education level, and to have public insurance when compared to Caucasian adults (all p-values < 0.05). Hispanic and African-American pediatric recipients were more likely to have public insurance and less likely to receive a living donor OLT than were Caucasian pediatric OLT recipients (p < 0.05). There was no difference in the likelihood of having a PELD score ≥ 21 among racial groups (p > 0.40). African-American adults in Era I and Era II had worse patient survival rates than both Caucasians and Hispanics (pair-wise p-values < 0.05). The same disparity was seen for pediatric recipients in Era I, but not in Era II. Multivariate analysis of African-American recipients revealed no unique predictors of patient death. Conclusions. African-American race is still a predictor of poor outcome after adult OLT, even after adjustment for multiple clinical, demographic, and liver disease severity variables. Although African-American and Hispanic subgroups share many characteristics previously thought to increase the risk of post-OLT death, only African-American patients have poorer survival rates than Caucasians.
Abstract:
Objective. To evaluate the host risk factors associated with rifamycin-resistant Clostridium difficile (C. diff) infection in hospitalized patients compared to rifamycin-susceptible C. diff infection. Background. C. diff is the most common definable cause of nosocomial diarrhea, affecting elderly hospitalized patients taking antibiotics for prolonged durations. The epidemiology of Clostridium difficile-associated disease is now changing with reports of a new hypervirulent strain causing hospital outbreaks. This new strain is associated with increased disease severity and mortality. Conventional therapy for C. diff includes metronidazole and vancomycin, but high recurrence rates and treatment failures are now becoming a major concern. Rifamycin antibiotics are being developed as a new therapeutic option to treat C. diff infection after their efficacy was established in a few in vivo and in vitro studies. Some recent studies report an association between the hypervirulent strain and emerging rifamycin resistance. These findings underscore the need for clinical studies to better understand the efficacy of rifamycin drugs against C. diff. Methods. This is a hospital-based, matched case-control study using de-identified data drawn from two prospective cohort studies involving C. diff patients at St Luke's Hospital. The C. diff isolates from these patients were screened for rifamycin resistance using agar dilution methods to determine minimum inhibitory concentrations (MIC) as part of Dr Zhi-Dong Jiang's study. Twenty-four rifamycin-resistant C. diff cases were identified, each matched with one rifamycin-susceptible C. diff control on the basis of age (± 10 years) and hospitalization within 30 days before or after the case. De-identified data for the 48 subjects were obtained from Dr Kevin Garey's clinical study enrolling C. diff patients at St Luke's Hospital and reviewed to gather information about host risk factors, outcome variables and relevant clinical characteristics. Results. Medical diagnosis at the time of admission (p = 0.0281) and history of chemotherapy (p = 0.022) were identified as significant risk factors, while hospital stay ranging from 1 week to 1 month and artificial feeding emerged as noteworthy outcome variables (p = 0.072 and p = 0.081, respectively). Horn's Index, assessing the severity of underlying illness, and the duration of antibiotics showed no significant difference between cases and controls. Conclusion. The study was a small project designed to identify host risk factors and understand the clinical implications of rifamycin resistance. The study was underpowered, and a larger sample size is needed to validate the results.
Abstract:
The purpose of this study was to assess whether C. difficile infection (CDI) increases the risk of bacteremia or E. coli infection. The first specific aim was to study the incidence of post-CDI bacteremia in CDI patients stratified by disease severity versus controls. The second specific aim was to study the incidence of post-CDI E. coli infection from normally sterile sites stratified by disease severity versus controls. This was a retrospective case-case-control study. The cases came from an ongoing prospective cohort study of CDI. Case group 1 comprised patients with mild to moderate CDI; case group 2 comprised patients with severe CDI. Controls were hospitalized patients given broad-spectrum antibiotics who did not develop CDI, matched by age (± 10 years) and duration of hospital visit (± 1 week). 191 cases were selected from the cohort study and 191 controls were matched to the cases. Patients were followed up to 60 days after the initial diagnosis of CDI and assessed for bacteremia and E. coli infections. The Zar score was used to determine the severity of CDI. Stata 11 was used to run all analyses. The risk of non-staphylococcal bacteremia after diagnosis of CDI was higher than in controls (14% and 7%, respectively; OR: 2.27; 95% CI: 1.07-5.01; p = 0.028). The risk of E. coli infection was higher in cases than in controls (13% and 9%, respectively), although the result was not statistically significant (OR: 1.4; 95% CI: 0.38-5.59; p = 0.32). Rates of non-staphylococcal bacteremia and E. coli infection did not differ based on CDI severity. This study showed that the risk of developing non-staphylococcal bacteremia was higher in patients with CDI compared to matched controls. The findings support the hypothesis that CDI increases the risk of bacterial translocation, specifically leading to the development of bacteremia.
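For orientation only (the study's analyses were run in Stata 11; this is not its code), the sketch below shows how an odds ratio of the kind reported above can be computed from a 2x2 table with a simple unmatched Wald confidence interval; the matched design of the study would normally call for conditional methods, and the counts here are only rough reconstructions from the reported percentages.

```python
# Minimal sketch (assumed counts, not the study's data): odds ratio and
# Wald 95% confidence interval from a 2x2 case/control outcome table.
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int):
    """a = cases with the outcome, b = cases without,
    c = controls with the outcome, d = controls without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Counts loosely reconstructed from the reported 14% vs 7% of 191 patients
# (about 27/191 vs 13/191); they are illustrative only.
print(odds_ratio_ci(27, 164, 13, 178))
```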
Abstract:
Fusarium proliferatum has been reported on garlic in the Northwest USA, Spain and Serbia, causing water-soaked, tan-colored lesions on cloves. In this work, Fusarium proliferatum was isolated from 300 symptomatic garlic bulbs. Morphological identification of Fusarium was confirmed using species-specific PCR assays and EF-1α sequencing. Pathogenicity was confirmed with eighteen isolates. Six randomly selected F. proliferatum isolates from garlic were tested for specific pathogenicity and screened for fusaric acid production. Additionally, the pathogenicity of each F. proliferatum isolate was tested on healthy seedlings of onion (Allium cepa), leek (A. porrum), scallions (A. fistulosum), chives (A. schoenoprasum) and garlic (A. sativum). A disease severity index (DSI) was calculated as the mean severity on three plants of each species, with four test replicates. Symptoms on onion and garlic plants were observed three weeks after inoculation. All isolates tested produced symptoms on all varieties inoculated. Inoculation of F. proliferatum isolates from diseased garlic onto other Allium species provided new information on host range and pathogenicity. The results demonstrated differences in susceptibility with respect to host species and cultivar. All F. proliferatum isolates tested produced fusaric acid (FA); correlations between FA production and isolate pathogenicity are discussed. Additionally, all isolates carried the FUM1 gene, suggesting that the Spanish isolates are able to produce fumonisins.
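As a hedged illustration of the disease severity index described above, the short sketch below averages severity ratings over three plants per species across four replicates; the 0-4 rating scale and all values are hypothetical, not the study's data.

```python
# Minimal sketch (hypothetical scores): a disease severity index (DSI) taken as
# the mean severity rating over three plants per species across four replicates.
from statistics import mean

# severity ratings per replicate: one rating per plant (assumed 0-4 scale)
replicates = [
    [2, 3, 2],   # replicate 1: three plants of one Allium species
    [3, 3, 2],   # replicate 2
    [1, 2, 2],   # replicate 3
    [2, 2, 3],   # replicate 4
]

dsi = mean(score for rep in replicates for score in rep)
print(f"DSI = {dsi:.2f}")   # mean severity over all plants and replicates
```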
Abstract:
The ever-evolving sophistication of magnetic resonance imaging techniques continues to provide new tools to characterize and quantify, in vivo, brain morphologic changes related to neurodevelopment, senescence, learning or disease. The majority of morphometric methods extract shape or size descriptors such as volume, surface area, and cortical thickness from the MRI image. These morphological measurements are commonly entered into statistical analytic approaches for testing between-group differences or for correlations between the morphological measurement and other variables such as age, sex, or disease severity. A wide variety of morphological biomarkers are reported in the literature. Despite this wide range of potentially useful biomarkers and available morphometric methods, the hypotheses and findings of the great majority of morphological studies are biased because reports assess only one morphometric feature and usually use only one image processing method.
Throughout this dissertation biomarkers and image processing strategies are combined to provide innovative and useful morphometric tools for examining brain changes during neurodevelopment. Specifically, a shape analysis technique allowing for a fine-grained assessment of regional thalamic volume in early-onset psychosis patients and healthy comparison subjects is implemented. Results show that disease-related reductions in global thalamic volume, as previously described by other authors, could be particularly driven by a deficit in the anterior-mediodorsal and pulvinar thalamic regions in patients relative to healthy subjects. Furthermore, in healthy adolescents different cortical features are extracted and combined and their interdependency is assessed over time. This study attempts to extend current knowledge of normal brain development, specifically the largely unexplored relationship between changes of distinct cortical morphological measurements during adolescence. This study demonstrates that cortical flattening, present during adolescence, is produced by a combination of age-related increase in sulcal width and decrease in sulcal depth. Finally, this methodology is applied to a cross-sectional study, investigating the mechanisms underlying the decrease in cortical thickness and gyrification observed in psychotic patients with a disease onset during adolescence.
Abstract:
Clinical findings suggest that inflammatory disease symptoms are aggravated by ongoing, repeated stress, but not by acute stress. We hypothesized that, compared with single acute stressors, chronic repeated stress may engage different physiological mechanisms that exert qualitatively different effects on the inflammatory response. Because inhibition of plasma extravasation, a critical component of the inflammatory response, has been associated with increased disease severity in experimental arthritis, we tested for a potential repeated stress-induced inhibition of plasma extravasation. Repeated, but not single, exposures to restraint stress produced a profound inhibition of bradykinin-induced synovial plasma extravasation in the rat. Experiments examining the mechanism of inhibition showed that the effect of repeated stress was blocked by adrenalectomy, but not by adrenal medullae denervation, suggesting that the adrenal cortex mediates this effect. Consistent with known effects of stress and with mediation by the adrenal cortex, restraint stress evoked repeated transient elevations of plasma corticosterone levels. This elevated corticosterone was necessary and sufficient to produce inhibition of plasma extravasation because the stress-induced inhibition was blocked by preventing corticosterone synthesis and, conversely, induction of repeated transient elevations in plasma corticosterone levels mimicked the effects of repeated stress. These data suggest that repetition of a mild stressor can induce changes in the physiological state of the animal that enable a previously innocuous stressor to inhibit the inflammatory response. These findings provide a potential explanation for the clinical association between repeated stress and aggravation of inflammatory disease symptoms and provide a model for study of the biological mechanisms underlying the stress-induced aggravation of chronic inflammatory diseases.
Abstract:
Type 1 fimbriae are adhesion organelles expressed by many Gram-negative bacteria. They facilitate adherence to mucosal surfaces and inflammatory cells in vitro, but their contribution to virulence has not been defined. This study presents evidence that type 1 fimbriae increase the virulence of Escherichia coli for the urinary tract by promoting bacterial persistence and enhancing the inflammatory response to infection. In a clinical study, we observed that disease severity was greater in children infected with E. coli O1:K1:H7 isolates expressing type 1 fimbriae than in those infected with type 1 negative isolates of the same serotype. The E. coli O1:K1:H7 isolates had the same electrophoretic type, were hemolysin-negative, expressed P fimbriae, and carried the fim DNA sequences. When tested in a mouse urinary tract infection model, the type 1-positive E. coli O1:K1:H7 isolates survived in higher numbers, and induced a greater neutrophil influx into the urine, than O1:K1:H7 type 1-negative isolates. To confirm a role of type 1 fimbriae, a fimH null mutant (CN1016) was constructed from an O1:K1:H7 type 1-positive parent. E. coli CN1016 had reduced survival and inflammatogenicity in the mouse urinary tract infection model. E. coli CN1016 reconstituted with type 1 fimbriae (E. coli CN1018) had restored virulence similar to that of the wild-type parent strain. These results show that type 1 fimbriae in the genetic background of a uropathogenic strain contribute to the pathogenesis of E. coli in the urinary tract.
Abstract:
The impact of transmission events from patients with shingles (zoster) on the epidemiology of varicella is examined before and after the introduction of mass immunization by using a stochastic mathematical model of transmission dynamics. Reactivation of the virus is shown to damp stochastic fluctuations and move the dynamics toward simple annual oscillations. The force of infection due to zoster cases is estimated by comparison of simulated and observed incidence time series. The presence of infectious zoster cases reduces the tendency for mass immunization to increase varicella incidence at older ages when disease severity is typically greater.
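A minimal sketch of the kind of stochastic transmission model described above is given below, under strong simplifying assumptions: a chain-binomial step in which the weekly force of infection combines transmission from current varicella cases with a small, roughly constant contribution from infectious zoster (reactivation) cases. All parameter values and the model structure are illustrative, not those estimated in the paper.

```python
# Minimal sketch (assumed structure and parameters, not the paper's model):
# stochastic chain-binomial varicella dynamics with a zoster-derived force of infection.
import math
import numpy as np

def simulate(weeks=520, N=100_000, beta0=3.3, seasonality=0.3,
             zoster_foi=2e-5, births_per_week=25, seed=1):
    rng = np.random.default_rng(seed)
    S, I = 30_000, 25                          # susceptibles and current infectious cases
    incidence = []
    for t in range(weeks):
        beta = beta0 * (1 + seasonality * math.cos(2 * math.pi * t / 52))
        foi = beta * I / N + zoster_foi        # per-susceptible weekly infection hazard
        p_inf = 1 - math.exp(-foi)
        new_cases = int(rng.binomial(S, p_inf))    # stochastic transmission step
        S = S + births_per_week - new_cases    # births replenish the susceptible pool
        I = new_cases                          # one-week infectious generations
        incidence.append(new_cases)
    return incidence

# With zoster_foi > 0 the troughs between epidemics are shallower, making stochastic
# fade-out less likely; setting zoster_foi = 0 removes reactivation-derived transmission.
print(sum(simulate()[-52:]))                   # varicella cases in the final simulated year
```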
Abstract:
Reducing mortality is a fundamental goal of pediatric intensive care units (PICUs). Disease severity at admission reflects the magnitude of comorbidities and physiological derangements and can be assessed by prognostic mortality scores. The two main scores used in the PICU are the Pediatric Risk of Mortality (PRISM) and the Pediatric Index of Mortality (PIM). PRISM uses the worst values of physiological and laboratory variables during the first 24 hours of admission, whereas PIM2 uses data from the first hour of PICU admission and a single arterial blood gas. There is no consensus in the literature on whether PRISM or PIM2 is more useful or better standardized for intensive care admission of children and adolescents, particularly in a tertiary-care unit. The aim of this study was to establish which score performs better in assessing mortality prognosis and is easily applicable in routine PICU practice, so that it can be used in a standardized and continuous manner. A retrospective study was conducted reviewing the PRISM and PIM2 scores of 359 patients admitted to the pediatric intensive care unit of the Instituto da Criança do Hospital das Clínicas da Faculdade de Medicina da USP, considered a tertiary-care unit. Mortality was 15%, the main type of admission was clinical (78%), and the main cause of admission was respiratory dysfunction (37.3%). Scores of patients who died were higher than those of survivors: 15 versus 7 for PRISM (p = 0.0001) and 11 versus 5 for PIM2 (p = 0.0002). For the overall sample, the Standardized Mortality Ratio (SMR) indicated that both PIM2 and PRISM underestimated mortality [1.15 (0.84-1.46) and 1.67 (1.23-2.11), respectively]. The Hosmer-Lemeshow test showed adequate calibration for both scores [χ² = 12.96 (p = 0.11) for PRISM and χ² = 13.7 (p = 0.09) for PIM2]. Discrimination, assessed by the area under the ROC curve, was better for PRISM than for PIM2 [0.76 (95% CI 0.69-0.83) and 0.65 (95% CI 0.57-0.72), respectively; p = 0.002]. In the present study, the best sensitivity and specificity for the risk of death with PRISM was obtained at a score between 13 and 14, showing that, with technological advances, a patient needs a higher score, that is, greater clinical severity than in the original population, to be at higher risk of mortality. The results of severity scores can be affected by the health system (public or private), the PICU infrastructure (number of beds, staffing, available technology), and the indication for admission. The choice of a severity score depends on the individual characteristics of the PICU, such as waiting time in the emergency department, the presence of complex chronic disease (for example, oncology patients), and how transport to the PICU is carried out. Ideally, multicenter studies have greater statistical power; however, studies with larger and more homogeneous populations, especially in developing countries, are difficult to carry out.
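As a hedged illustration of the standardized mortality ratio discussed above (not the study's own code), the sketch below computes an SMR from per-patient predicted death probabilities such as those produced by PRISM or PIM2; the outcomes and probabilities are hypothetical toy values.

```python
# Minimal sketch (hypothetical data): SMR from per-patient predicted mortality
# probabilities produced by a prognostic score such as PRISM or PIM2.
from typing import Sequence

def standardized_mortality_ratio(died: Sequence[int],
                                 predicted_prob: Sequence[float]) -> float:
    """SMR = observed deaths / expected deaths, where expected deaths
    is the sum of the score's predicted probabilities of death."""
    observed = sum(died)                 # 1 = died, 0 = survived
    expected = sum(predicted_prob)       # deaths expected by the score
    return observed / expected

# Toy data; an SMR above 1 means the score underestimated mortality.
outcomes = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
probs = [0.40, 0.05, 0.10, 0.30, 0.02, 0.08, 0.15, 0.35, 0.05, 0.10]
print(round(standardized_mortality_ratio(outcomes, probs), 2))  # 1.88
```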
Abstract:
Leishmania parasites cause a broad range of disease, with cutaneous afflictions being, by far, the most prevalent. Variations in disease severity and symptomatic spectrum are mostly associated with parasite species. One risk factor for the severity and emergence of leishmaniasis is immunosuppression, usually arising from coinfection of the patient with human immunodeficiency virus (HIV). Interestingly, several species of Leishmania have been shown to bear an endogenous cytoplasmic dsRNA virus (LRV) of the Totiviridae family, and recently we correlated the presence of LRV1 within Leishmania parasites with an exacerbation of murine leishmaniasis and with an elevated frequency of drug treatment failures in humans. This raises the possibility of further exacerbation of leishmaniasis in the presence of both viruses, and here we report a case of cutaneous leishmaniasis caused by Leishmania braziliensis bearing LRV1 with aggressive pathogenesis in an HIV patient. LRV1 was isolated and partially sequenced from skin and nasal lesions. The genetic identity of both sequences reinforced the assumption that the nasal parasites originated from the primary skin lesions. Surprisingly, combined antiretroviral therapy did not impact the devolution of the Leishmania infection. The Leishmania infection was successfully treated through administration of liposomal amphotericin B.
Abstract:
Thesis (Master's)--University of Washington, 2016-06