70 results for structural health monitoring method
Abstract:
INTRODUCTION: A contribution to the regional epidemiological profile of the most common fungal agents in Public Health Services in Cuiabá, state of Mato Grosso, including university hospitals and polyclinics. METHODS: Clinical specimens (n = 1,496) from 1,078 patients were collected, submitted to direct mycological examination (potassium hydroxide or sticky-tape method) and cultured on specific media. Dermatophytic and non-dermatophytic agents were identified according to micromorphology (Riddell technique). RESULTS: The majority of the 1,496 specimens were skin (n = 985) and nail exams (n = 472). Of the 800 positive cultures, 246 (30.8%) corresponded to dermatophytes, 336 (42%) to yeasts of the genus Candida, 190 (23.7%) to other yeasts, 27 (3.4%) to non-dermatophytic filamentous fungi and one (0.1%) to an agent of subcutaneous mycosis. Lesions considered primary occurred in greater numbers (59.5%) than recurrent lesions (37.4%), with a greater concentration of positivity on the arms and legs. CONCLUSIONS: Comorbidities, allergies and diabetes mellitus were conditions associated with greater positivity in direct mycological exams and cultures. Positive culture was considered a definitive diagnosis of fungal infection and confirmed 47.8% of diagnostic hypotheses.
Abstract:
This study describes the development and application of a new PCR assay for the specific detection of pathogenic leptospires and its comparison with a previously reported PCR protocol. New primers were designed for PCR optimization and evaluation in artificially infected paraffin-embedded tissues. The PCR was then applied to post-mortem, paraffin-embedded samples, followed by amplicon sequencing. The new PCR was more efficient than the previously reported protocol, allowing amplification of the expected DNA fragment from the artificially infected samples and from 44% of the post-mortem samples. The sequences of the PCR amplicons from different patients showed >99% homology with DNA sequences of pathogenic leptospires. The availability of a highly sensitive and specific tool to screen histological specimens for pathogenic Leptospira spp. would facilitate a better assessment of the prevalence and epidemiology of leptospirosis, which constitutes a health problem in many countries.
Abstract:
INTRODUCTION: Chagas disease (ChD) is a chronic illness associated with significant morbidity and mortality that can affect the quality of life (QoL) of infected patients. However, there are few studies of QoL in ChD. The objectives of this study were to construct a health-related QoL (HRQoL) profile of ChD patients, compare it with a non-ChD (NChD) group, and identify factors associated with the worst HRQoL scores in ChD patients. METHODS: HRQoL was investigated in 125 patients with ChD and 21 NChD individuals using the Medical Outcomes Study 36-item Short-Form (SF-36) and the Minnesota Living with Heart Failure Questionnaire (MLWHFQ). Patients underwent a standard protocol that included clinical examination, ECG, Holter monitoring, Doppler echocardiogram and autonomic function tests. RESULTS: HRQoL scores were significantly worse in the ChD group compared with the NChD group in the SF-36 domains of physical functioning and role-emotional and on the MLWHFQ scale. For the ChD group, univariate analysis showed that HRQoL score quartiles were associated with level of education, sex, marital status, use of medication, functional classification and cardiovascular and gastrointestinal symptoms. In the multivariate analysis, female sex, fewer years of education, single status, worse functional classification, presence of cardiovascular and gastrointestinal symptoms, associated illnesses, Doppler echocardiographic abnormalities and ventricular arrhythmia detected during Holter monitoring were predictors of lower HRQoL scores. CONCLUSIONS: ChD patients showed worse HRQoL scores than NChD individuals. For the ChD group, sociodemographic and clinical variables were associated with the worst scores.
Abstract:
INTRODUCTION: The goal was to develop an in-house serological method with high specificity and sensitivity for the diagnosis and monitoring of Chagas disease morbidity. METHODS: To this end, the reactivities of anti-T. cruzi IgG and its subclasses were tested in successive serum dilutions of patients from Berilo municipality, Jequitinhonha Valley, Minas Gerais, Brazil. The performance of the in-house ELISA was also evaluated in samples from other relevant infectious diseases, including HIV, hepatitis C (HCV), syphilis (SYP), visceral leishmaniasis (VL), and American tegumentary leishmaniasis (ATL), and from noninfected controls (NI). Further analysis evaluated the applicability of this in-house methodology for monitoring Chagas disease morbidity in three groups of patients classified by clinical status: indeterminate (IND), cardiac (CARD), and digestive/mixed (DIG/Mix). RESULTS: The analysis of total IgG reactivity at serum dilution 1:40 was an excellent approach to Chagas disease diagnosis (100% sensitivity and specificity). The analysis of IgG subclasses showed cross-reactivity, mainly with NI, VL, and ATL, at all selected serum dilutions. The IND group displayed higher IgG3 levels, and the DIG/Mix group presented higher levels of total IgG compared with the IND and CARD groups. CONCLUSIONS: These findings demonstrate that this methodology has promising applicability in the analysis of anti-T. cruzi IgG reactivity for the differential diagnosis and evaluation of Chagas disease morbidity.
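The diagnostic performance figures quoted above (100% sensitivity and specificity at the 1:40 dilution) follow from standard confusion-table arithmetic. The sketch below is illustrative only; the counts are hypothetical, not the study's data.

```python
# Minimal sketch of how an assay's sensitivity and specificity are
# computed from a 2x2 confusion table. All counts are hypothetical.

def sensitivity(tp: int, fn: int) -> float:
    """Proportion of truly infected samples the assay detects."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Proportion of non-infected samples the assay correctly rejects."""
    return tn / (tn + fp)

# A perfect split, as reported for total IgG at dilution 1:40
# (100% sensitivity and specificity), means fn = fp = 0:
print(sensitivity(tp=50, fn=0))   # 1.0
print(specificity(tn=30, fp=0))   # 1.0
```

Cross-reactivity with VL or ATL sera, as seen for the IgG subclasses, would surface here as false positives and lower the specificity.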
Abstract:
INTRODUCTION: Tuberculosis (TB) control is linked to the availability of qualified methods for microbiological diagnosis; however, microscopy, with its limited sensitivity, is the only method available in many locations. The objective of this study was to evaluate the introduction of culture, drug susceptibility testing (DST), and genotyping into the routine of a Municipal Program of Tuberculosis Control. METHODS: Direct microscopy of sputum and culture in Ogawa-Kudoh medium were performed on 1,636 samples from 787 patients. DST of positive cultures was performed by resazurin microtiter assay, and genotyping by mycobacterial interspersed repetitive units-variable number tandem repeat (MIRU-VNTR). RESULTS: A total of 91 patients with TB were identified. Culture increased case detection by 32% compared with microscopy; acquired resistance was 3.3%, and genotyping showed high genetic diversity. CONCLUSIONS: Ogawa-Kudoh culture contributed significantly to the increase in case detection and is suitable for implementation in resource-poor settings. The acquired resistance rate was lower than that reported in a recent Brazilian survey. The high genetic diversity is possibly related to the high TB prevalence in the population, as well as to early detection and suitable treatment of patients. The interaction between research and health care is important for reorienting practice, transferring technology, and improving TB control.
Abstract:
Introduction The aim of the present study was to assess the polymerase chain reaction (PCR) as a method for detecting Trypanosoma cruzi infection in triatomines, compared with prior microscopic examination, in the State of Mato Grosso do Sul, Brazil. Methods In total, 515 specimens were collected. Material from the digestive tract of each triatomine was analyzed for the presence of T. cruzi by microscopic examination and by PCR using the 121/122 primer set. Results Among the 515 specimens tested, 58 (11.3%) were positive by microscopy and 101 (19.6%) were positive by PCR, and there was an association between the results of the two techniques (χ2 = 53.354, p = 0.001). The main triatomine species identified was T. sordida (95.5%). Conclusions The use of PCR in entomological surveillance may contribute to a better assessment of the occurrence of T. cruzi in triatomine populations.
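The association between microscopy and PCR results reported above rests on a Pearson chi-squared test of a 2x2 table. A minimal pure-Python sketch of that statistic is given below; the joint cell counts are hypothetical, since the abstract reports only the marginal totals (58 microscopy-positive, 101 PCR-positive, of 515).

```python
def chi2_2x2(a: int, b: int, c: int, d: int) -> float:
    """Pearson chi-squared statistic (no continuity correction) for the
    2x2 contingency table [[a, b], [c, d]], e.g. rows = microscopy +/-,
    columns = PCR +/-."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical joint counts for illustration only (not the study's data):
# 55 positive by both, 3 microscopy-only, 46 PCR-only, 411 negative by both.
print(chi2_2x2(55, 3, 46, 411))
```

A statistic this large against the chi-squared distribution with 1 degree of freedom would give a very small p-value, consistent with the strong association the abstract reports.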
Abstract:
Introduction In addition to the common alterations and diseases inherent in the aging process, elderly persons with a history of leprosy are particularly vulnerable to dependence because of disease-related impairments. Objective To determine whether physical impairment from leprosy is associated with dependence among the elderly. Methods An analytical cross-sectional study of elderly individuals with a history of leprosy and no signs of cognitive impairment was conducted using a database from a former leprosy colony-hospital. The patients were evaluated for dependence in the basic activities of daily living (BADL) and the instrumental activities of daily living (IADL) and subjected to standard leprosy physical disability grading. Subsequently, descriptive and univariate analyses were conducted, the latter using Pearson's chi-squared test. Results A total of 186 elderly persons were included in the study. Of these individuals, 53.8% were women, 49.5% were older than 75 years of age, 93% had four or fewer years of formal education, 24.2% lived in an institution for the long-term care of the elderly (ILTC), and 18.3% had lower limb amputations. Among those evaluated, 79.8% had visible physical impairments from leprosy (grade 2), 83.3% were independent in BADL, and 10.2% were independent in IADL. There was a higher impairment grade among those patients who were IADL dependent (p=0.038). Conclusions The leprosy physical impairment grade is associated with dependence for IADL, creating the need for greater social support and systematic monitoring by a multidisciplinary team. The results highlight the importance of early diagnosis and treatment of leprosy to prevent physical impairment and dependence in later years.
Abstract:
A prospective study was conducted to determine whether standardized vancomycin doses could produce adequate serum concentrations in 25 term newborn infants with sepsis. PURPOSE: The therapeutic response of neonatal sepsis caused by Staphylococcus sp. treated with vancomycin was evaluated through serum vancomycin concentrations, serum bactericidal titers (SBT), and minimum inhibitory concentration (MIC). METHODS: Vancomycin serum concentrations were determined by fluorescence polarization immunoassay, SBT by the macro-broth dilution method, and MIC by agar diffusion test. RESULTS: Thirteen newborn infants (59.1%) had adequate peak vancomycin serum concentrations (20-40 µg/mL), and one had a peak concentration with potential ototoxicity risk (>40 µg/mL). Only 48% had adequate trough concentrations (5-10 µg/mL), and seven (28%) had a potential nephrotoxicity risk (>10 µg/mL). There was no significant agreement between peak and trough vancomycin concentrations with regard to normality (McNemar test: p = 0.7905). Peak serum vancomycin concentrations were compared with the clinical evaluation (good or poor clinical evolution) of the infants, with no significant difference found (U = 51.5; p = 0.1947). There was also no significant difference between the patients' trough concentrations and good or poor clinical evolution (U = 77.0; p = 0.1710). All Staphylococcus isolates were sensitive to vancomycin according to the MIC. Half of the patients with adequate trough SBT (1/8) also had adequate trough vancomycin concentrations and satisfactory clinical evolution. CONCLUSIONS: Recommended vancomycin schedules for term newborn infants with neonatal sepsis should be based on weight and postconceptional age only to start antimicrobial therapy. There is no ideal pattern of vancomycin dosing; dosages must be individualized. SBT interpretation should be made in conjunction with the patient's clinical presentation and vancomycin serum concentrations. These laboratory and clinical data help elucidate the probable cause of a patient's poor evolution, facilitating dose adjustment and reducing the risk of toxicity or of failing to achieve therapeutic levels.
Abstract:
The age of 12 years has been established as the global index age for monitoring caries, enabling international comparisons and the monitoring of disease trends. The aim was to evaluate the prevalence of dental caries, fluorosis and periodontal conditions, and their relation to socioeconomic factors, among twelve-year-old schoolchildren in the city of Manaus, AM. The study was conducted in 2008 with a probability sample of 661 children, 609 from public and 52 from private schools. Dental caries, periodontal condition and dental fluorosis were evaluated. To obtain the socioeconomic classification of each child (high, upper middle, middle, lower middle, low and lower low socioeconomic classes), guardians were given a questionnaire. The mean number of decayed, missing, and filled teeth (DMFT) at age twelve was 1.89. Dental calculus was the most severe periodontal condition detected, present in 39.48% of the children. Regarding dental fluorosis, prevalence was low among the children examined, i.e., the more pronounced lines of opacity only occasionally merged, forming small white areas. The study showed a significant association at the 5% level between social class and both dental caries and periodontal condition. Among schoolchildren in Manaus, mean DMFT and fluorosis prevalence were low, but there was a high occurrence of gingival bleeding.
Abstract:
OBJECTIVE: Brazil is the country with the largest community of Japanese descendants in the world, from a migration movement that started in 1908. However, more recently (1988), a movement in the opposite direction began. Many of these descendants went to Japan for work purposes and suffered mental distress. Some of them sought treatment in Japan, while others returned to Brazil to seek treatment. The aim of the present study was to compare the sociodemographic profile and diagnoses of Japanese Brazilian psychiatric outpatients in Japan (remaining group) and in Brazil (returning group). METHOD: All consecutive Japanese Brazilian outpatients who received care from the psychiatric units in Japan and Brazil from April 1997 to April 2000 were compared. The diagnoses were based on ICD-10 and were made by psychiatrists. Sociodemographic data and diagnoses in Brazil and Japan were compared by means of the Chi-Squared Test. RESULTS: The individuals who returned to Brazil were mostly male and unmarried, had lived alone in Japan, had stayed there for short periods and were classified in the schizophrenia group. The individuals who remained in Japan were mostly female and married, were living with family or friends, had stayed there for long periods and were classified in the anxiety group. Logistic regression showed that the most significant factors associated with the returning group were that they had lived alone and stayed for short periods (OR = 0.93 and 40.21, respectively). CONCLUSION: We conclude that living with a family and having a network of friends is very important for mental health in the context evaluated.
Abstract:
OBJECTIVE: To evaluate the relationship between 24-hour ambulatory blood pressure monitoring and the prognosis of patients with advanced congestive heart failure. METHODS: We studied 38 patients with NYHA functional class IV congestive heart failure and analyzed left ventricular ejection fraction, diastolic diameter, and ambulatory blood pressure monitoring data. RESULTS: Twelve deaths occurred. Left ventricular ejection fraction (35.2 ± 7.3%) and diastolic diameter (72.2 ± 7.8 mm) were not correlated with survival. The mean 24-hour (SBP24), waking (SBPw), and sleeping (SBPs) systolic pressures of the surviving patients were higher than those of the deceased patients and were significant for predicting survival. Patients with mean SBP24, SBPw, and SBPs ≥ 105 mmHg had longer survival (p = 0.002, p = 0.01 and p = 0.0007, respectively). Patients with diastolic blood pressure sleep decrements (dip) and patients with a mean blood pressure dip ≤ 6 mmHg had longer survival (p = 0.04 and p = 0.01, respectively). In the multivariate analysis, SBPs was the only significant variable, with an odds ratio of 7.61 (CI: 1.56-37.04) (p = 0.01). Patients with mean SBP < 105 mmHg were 7.6 times more likely to die than those with SBP ≥ 105 mmHg. CONCLUSION: Ambulatory blood pressure monitoring appears to be a useful method for evaluating patients with congestive heart failure.
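The odds ratio with its confidence interval reported in the multivariate analysis above is, in its simplest univariate form, computed from a 2x2 table of exposure (e.g. SBPs below vs. at or above 105 mmHg) against outcome (death vs. survival). The sketch below shows that standard calculation with a Wald confidence interval; the cell counts are hypothetical, not the study's data.

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio for the 2x2 table [[a, b], [c, d]]
    (rows = exposed/unexposed, columns = event/no event),
    with a Wald 95% confidence interval on the log scale."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts for illustration only:
# 10 deaths / 5 survivors among low-SBPs patients,
# 4 deaths / 8 survivors among high-SBPs patients.
print(odds_ratio_ci(10, 5, 4, 8))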
Resumo:
AbstractBackground:Heart surgery has developed with increasing patient complexity.Objective:To assess the use of resources and real costs stratified by risk factors of patients submitted to surgical cardiac procedures and to compare them with the values reimbursed by the Brazilian Unified Health System (SUS).Method:All cardiac surgery procedures performed between January and July 2013 in a tertiary referral center were analyzed. Demographic and clinical data allowed the calculation of the value reimbursed by the Brazilian SUS. Patients were stratified as low, intermediate and high-risk categories according to the EuroSCORE. Clinical outcomes, use of resources and costs (real costs versus SUS) were compared between established risk groups.Results:Postoperative mortality rates of low, intermediate and high-risk EuroSCORE risk strata showed a significant linear positive correlation (EuroSCORE: 3.8%, 10%, and 25%; p < 0.0001), as well as occurrence of any postoperative complication EuroSCORE: 13.7%, 20.7%, and 30.8%, respectively; p = 0.006). Accordingly, length-of-stay increased from 20.9 days to 24.8 and 29.2 days (p < 0.001). The real cost was parallel to increased resource use according to EuroSCORE risk strata (R$ 27.116,00 ± R$ 13.928,00 versus R$ 34.854,00 ± R$ 27.814,00 versus R$ 43.234,00 ± R$ 26.009,00, respectively; p < 0.001). SUS reimbursement also increased (R$ 14.306,00 ± R$ 4.571,00 versus R$ 16.217,00 ± R$ 7.298,00 versus R$ 19.548,00 ± R$935,00; p < 0.001). However, as the EuroSCORE increased, there was significant difference (p < 0.0001) between the real cost increasing slope and the SUS reimbursement elevation per EuroSCORE risk strata.Conclusion:Higher EuroSCORE was related to higher postoperative mortality, complications, length of stay, and costs. Although SUS reimbursement increased according to risk, it was not proportional to real costs.
Resumo:
Abstract Casual blood pressure measurements have been extensively questioned over the last five decades. A significant percentage of patients have different blood pressure readings when examined in the office or outside it. For this reason, a change in the paradigm of the best manner to assess blood pressure has been observed. The method that has been most widely used is the Ambulatory Blood Pressure Monitoring - ABPM. The method allows recording blood pressure measures in 24 hours and evaluating various parameters such as mean BP, pressure loads, areas under the curve, variations between daytime and nighttime, pulse pressure variability etc. Blood pressure measurements obtained by ABPM are better correlated, for example, with the risks of hypertension. The main indications for ABPM are: suspected white coat hypertension and masked hypertension, evaluation of the efficacy of the antihypertensive therapy in 24 hours, and evaluation of symptoms. There is increasing evidence that the use of ABPM has contributed to the assessment of blood pressure behaviors, establishment of diagnoses, prognosis and the efficacy of antihypertensive therapy. There is no doubt that the study of 24-hour blood pressure behavior and its variations by ABPM has brought more light and less darkness to the field, which justifies the title of this review.
Resumo:
The Kilombero Malaria Project (KMP) attemps to define opperationally useful indicators of levels of transmission and disease and health system relevant monitoring indicators to evaluate the impact of disease control at the community or health facility level. The KMP is longitudinal community based study (N = 1024) in rural Southern Tanzania, investigating risk factors for malarial morbidity and developing household based malaria control strategies. Biweekly morbidity and bimonthly serological, parasitological and drug consumption surveys are carried out in all study households. Mosquito densities are measured biweekly in 50 sentinel houses by timed light traps. Determinants of transmission and indicators of exposure were not strongly aggregated within households. Subjective morbidity (recalled fever), objective morbidity (elevated body temperature and high parasitaemia) and chloroquine consumption were strongly aggregated within a few households. Nested analysis of anti-NANP40 antibody suggest that only approximately 30% of the titer variance can explained by household clustering and that the largest proportion of antibody titer variability must be explained by non-measured behavioral determinants relating to an individual's level of exposure within a household. Indicators for evaluation and monitoring and outcome measures are described within the context of health service management to describe control measure output in terms of community effectiveness.
Resumo:
Fluorescence flow cytometry was employed to assess the potential of a vital dye, hydroethiedine, for use in the detection and monitoring of the viability of hemoparasites in infected erythrocytes, using Babesia bovis as a model parasite. The studies demonstrated that hydroethidine is taken up by B. bovis and metabolically converted to the DNA binding fluorochrone, ethidium. Following uptake of the dye, erythrocytes contamine viable parasites were readily distinguished and quantitated. Timed studies with the parasiticidal drug, Ganaseg, showed that it is possible to use the fluorochrome assay to monitor the effects of the drug on the rate of replication and viability of B. bovis in culture. The assay provides a rapid method for evaluation of the in vitro effect of drugs on hemoparasites and for analysis of the effect of various components of the immune response, such as lymphokines, monocyte products, antibodies, and effector cells (T, NK, LAK, ADCC) on the growth and viability of intraerythrocytic parasites.