965 results for 111506 Toxicology (incl. Clinical Toxicology)
Abstract:
Background: Malaria rapid diagnostic tests (RDTs) are appropriate for case management, but persistent antigenaemia is a concern for HRP2-detecting RDTs in endemic areas. It has been suggested that pan-pLDH test bands on combination RDTs could be used to distinguish persistent antigenaemia from active Plasmodium falciparum infection; however, this assumes all active infections produce positive results on both bands of RDTs, an assertion that has not been demonstrated. Methods: In this study, data generated during the WHO-FIND product testing programme for malaria RDTs were reviewed to investigate the reactivity of individual test bands against P. falciparum in 18 combination RDTs. Each product was tested against multiple wild-type, P. falciparum-only samples. Antigen levels were measured by quantitative ELISA for HRP2, pLDH and aldolase. Results: When tested against P. falciparum samples at 200 parasites/μL, 92% of RDTs were positive; 57% of these were positive on both the P. falciparum and pan bands, while 43% were positive on the P. falciparum band only. There was a relationship between antigen concentration and band positivity; ≥4 ng/mL of HRP2 produced positive results in more than 95% of P. falciparum bands, while ≥45 ng/mL of pLDH was required for at least 90% of pan bands to be positive. Conclusions: In active P. falciparum infections it is common for combination RDTs to return a positive HRP2 band combined with a negative pan-pLDH band, and when both bands are positive, the pan band is often faint. Thus active infections could be missed if the presence of an HRP2 band in the absence of a pan band is interpreted as being caused solely by persistent antigenaemia.
Abstract:
Background Historically, the paper hand-held record (PHR) has been used for sharing information between hospital clinicians, general practitioners and pregnant women in a maternity shared-care environment. Recently, in alignment with a national e-health agenda, an electronic health record (EHR) was introduced at an Australian tertiary maternity service to replace the PHR for collection and transfer of data. The aim of this study was to examine and compare the completeness of clinical data collected in a PHR and an EHR. Methods We undertook a comparative cohort design study to determine differences in completeness between data collected from maternity records in two phases. Phase 1 data were collected from the PHR and Phase 2 data from the EHR. Records were compared for completeness of best practice variables collected. The primary outcome was the presence of best practice variables and the secondary outcomes were the differences in individual variables between the records. Results Ninety-four percent of paper medical charts were available in Phase 1 and 100% of records from an obstetric database in Phase 2. No PHR or EHR had a complete dataset of best practice variables. The variables with significant improvement in completeness of data documented in the EHR, compared with the PHR, were urine culture, glucose tolerance test, nuchal screening, morphology scans, folic acid advice, tobacco smoking, illicit drug assessment and domestic violence assessment (p = 0.001). Additionally, the documentation of immunisations (pertussis, hepatitis B, varicella, fluvax) was markedly improved in the EHR (p = 0.001). The variables of blood pressure, proteinuria, blood group, antibody, rubella and syphilis status showed no significant differences in completeness of recording. Conclusion This is the first paper to report on the comparison of clinical data collected on a PHR and EHR in a maternity shared-care setting.
The use of an EHR demonstrated significant improvements to the collection of best practice variables. Additionally, the data in an EHR were more available to relevant clinical staff with the appropriate log-in and more easily retrieved than from the PHR. This study contributes to an under-researched area of determining data quality collected in patient records.
Abstract:
GVHD remains the major complication of allo-HSCT. Murine models are the primary system used to understand GVHD and to develop potential therapies. Several factors are critical for GVHD in these models, including histocompatibility, conditioning regimen, and T-cell number. We serendipitously found that environmental factors such as the caging system and bedding also significantly impact the kinetics of GVHD in these models. This is important because such factors may influence the experimental conditions required to cause GVHD and how mice respond to various treatments. Consequently, this is likely to alter interpretation of results between research groups, and the perceived effectiveness of experimental therapies.
Abstract:
Purpose To investigate the frequency of convergence and accommodation anomalies in an optometric clinical setting in Mashhad, Iran, and to determine the tests with the highest accuracy in diagnosing these anomalies. Methods Of 261 patients who attended the optometric clinics of Mashhad University of Medical Sciences during a one-month period, 83 were included in the study based on the inclusion criteria. Near point of convergence (NPC), near and distance heterophoria, monocular and binocular accommodative facility (MAF and BAF, respectively), lag of accommodation, positive and negative fusional vergences (PFV and NFV, respectively), AC/A ratio, relative accommodation, and amplitude of accommodation (AA) were measured to diagnose the convergence and accommodation anomalies. The results were also compared between symptomatic and asymptomatic patients. The accuracy of these tests was explored using sensitivity (S), specificity (Sp), and positive and negative likelihood ratios (LR+, LR−). Results Mean age of the patients was 21.3 ± 3.5 years and 14.5% of them had specific binocular and accommodative symptoms. Convergence and accommodative anomalies were found in 19.3% of the patients; accommodative excess (4.8%) and convergence insufficiency (3.6%) were the most common accommodative and convergence disorders, respectively. Symptomatic patients showed lower values for BAF (p = .003), MAF (p = .001), and AA (p = .001) compared with asymptomatic patients. Moreover, BAF (S = 75%, Sp = 62%) and MAF (S = 62%, Sp = 89%) were the most accurate tests for detecting accommodative and convergence disorders in terms of both sensitivity and specificity. Conclusions Convergence and accommodative anomalies are the most common binocular disorders in optometric patients. Including tests of monocular and binocular accommodative facility in routine eye examinations as accurate tests to diagnose these anomalies requires further investigation.
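The diagnostic-accuracy measures reported in this abstract (sensitivity, specificity, LR+, LR−) all follow from a standard 2×2 contingency table. A minimal sketch, using hypothetical counts chosen only to reproduce the BAF figures above (S = 75%, Sp = 62%), not the study's actual data:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Standard diagnostic-accuracy measures from 2x2 counts:
    tp/fp = test-positive with/without the disorder,
    fn/tn = test-negative with/without the disorder."""
    sensitivity = tp / (tp + fn)              # proportion of disordered patients testing positive
    specificity = tn / (tn + fp)              # proportion of normal patients testing negative
    lr_pos = sensitivity / (1 - specificity)  # LR+: how much a positive result raises the odds
    lr_neg = (1 - sensitivity) / specificity  # LR-: how much a negative result lowers the odds
    return sensitivity, specificity, lr_pos, lr_neg

# Hypothetical counts for illustration only (12/16 = 0.75, 31/50 = 0.62)
s, sp, lrp, lrn = diagnostic_accuracy(tp=12, fp=19, fn=4, tn=31)
print(f"S={s:.0%}, Sp={sp:.0%}, LR+={lrp:.2f}, LR-={lrn:.2f}")
```

An LR+ near 1 (as in this hypothetical case) means a positive result shifts the pre-test odds very little, which is why both sensitivity and specificity are reported alongside the likelihood ratios.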
Abstract:
PURPOSE - To present the results of same-day topography-guided photorefractive keratectomy (TG-PRK) and corneal collagen cross-linking (CXL) after intrastromal corneal ring (ISCR) implantation in patients with keratoconus. METHODS - Thirty-three patients (41 eyes) aged between 19 and 45 years were included in this prospective study. All patients underwent a femtosecond laser-enabled (Intralase FS; Abbott Medical Optics, Inc.) placement of intracorneal ring segments (Kerarings; Mediphacos, Brazil). Uncorrected distance visual acuity (UDVA), corrected distance visual acuity (CDVA), and keratometry readings remained stable for 6 months. Same-day PRK and CXL was subsequently performed in all patients. RESULTS - Twelve months after completion of the procedure, mean UDVA in log of minimal angle of resolution was significantly improved (from 0.74±0.54 to 0.10±0.16); CDVA did not improve significantly, but 85% of eyes maintained or gained multiple lines of CDVA; mean refraction spherical equivalent improved (from -3.03±1.98 to -0.04±0.99 D); all keratometry readings were significantly reduced from preoperative values, but coma did not vary significantly from preoperative values. Central corneal thickness and corneal thickness at the thinnest point were significantly (P<0.0001) reduced from 519.76±29.33 and 501.87±31.50 preoperatively to 464.71±36.79 and 436.55±47.42 postoperatively, respectively. Safety and efficacy indices were 0.97 and 0.88, respectively. From 6 months up until more than 1 year of follow-up, further significant improvement was observed only for UDVA (P<0.0001). CONCLUSIONS - Same-day combined TG-PRK and CXL after ISCR implantation is a safe and effective option for improving visual acuity and visual function, and it halts the progression of the keratoconus. The improvements recorded after 6 months of follow-up were maintained or improved upon 1 year after the procedure.
Abstract:
Large volumes of heterogeneous health data silos pose a big challenge when exploring for information to allow for evidence-based decision making and ensuring quality outcomes. In this paper, we present a proof of concept for adopting data warehousing technology to aggregate and analyse disparate health data in order to understand the impact of various lifestyle factors on obesity. We present a practical model for data warehousing with a detailed explanation, which can be adopted similarly for studying various other health issues.
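The aggregation idea described in this abstract can be sketched with a tiny star-schema fragment: a fact table of patient observations joined to a lifestyle dimension, rolled up per factor. All table and column names and the BMI ≥ 30 cutoff are illustrative assumptions, not the paper's actual model:

```python
import sqlite3

# Hypothetical star-schema fragment, for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_lifestyle (id INTEGER PRIMARY KEY, factor TEXT);
CREATE TABLE fact_obs (patient_id INTEGER, lifestyle_id INTEGER, bmi REAL);
INSERT INTO dim_lifestyle VALUES (1, 'sedentary'), (2, 'active');
INSERT INTO fact_obs VALUES
  (101, 1, 32.1), (102, 1, 27.4),   -- sedentary patients
  (103, 2, 24.0), (104, 2, 26.0);   -- active patients
""")

# Roll up obesity prevalence (BMI >= 30) per lifestyle factor
rows = conn.execute("""
SELECT d.factor,
       AVG(CASE WHEN f.bmi >= 30 THEN 1.0 ELSE 0.0 END) AS obesity_rate
FROM fact_obs f
JOIN dim_lifestyle d ON f.lifestyle_id = d.id
GROUP BY d.factor
ORDER BY d.factor
""").fetchall()
for factor, rate in rows:
    print(factor, rate)
```

In a real warehouse the fact table would be populated by ETL jobs from the disparate source systems, but the analytic pattern (dimension join, conditional aggregate, group-by) is the same.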
Abstract:
Decision-making is such an integral aspect of health care routine that the ability to make the right decisions at crucial moments can lead to patient health improvements. Evidence-based practice (EBP), the paradigm used to make those informed decisions, relies on the use of current best evidence from systematic research such as randomized controlled trials. Limitations of the outcomes from randomized controlled trials (RCTs), such as the "quantity" and "quality" of evidence generated, have lowered healthcare professionals' confidence in using EBP. An alternate paradigm of Practice-Based Evidence has evolved, with the key being evidence drawn from practice settings. Through the use of health information technology, electronic health records (EHRs) capture relevant clinical practice "evidence". A data-driven approach is proposed to capitalize on the benefits of EHRs. The issues of data privacy, security and integrity are diminished by an information accountability concept. Data warehouse architecture completes the data-driven approach by integrating health data from multi-source systems, unique within the healthcare environment.
Abstract:
Background The use of Electronic Medical Record (EMR) systems is increasing internationally, though developing countries, such as Saudi Arabia, have tended to lag behind in the adoption and implementation of EMR systems due to several barriers. The literature shows that the main barriers to EMR in Saudi Arabia are lack of knowledge or experience using EMR systems and staff resistance to using the implemented EMR system. Methods A quantitative methodology was used to examine health personnel knowledge and acceptance of and preference for EMR systems in seven Saudi public hospitals in Jeddah, Makkah and Taif cities. Results Both English literacy and education levels were significantly correlated with computer literacy and EMR literacy. Participants whose first language was not Arabic were more likely to prefer using an EMR system compared to those whose first language was Arabic. Conclusion This study suggests that as computer literacy levels increase, so too do staff preferences for using EMR systems. Thus, it would be beneficial for hospitals to assess English language proficiency and computer literacy levels of staff prior to implementing an EMR system. It is recommended that hospitals offer training and targeted educational programs to the potential users of the EMR system. This would help to increase English language proficiency and computer literacy levels of staff as well as staff acceptance of the system.
Abstract:
Recent increases in the incidence of childhood cancers cannot be explained by genetic factors. Identifying the environmental risk factors that may explain increases in cancer incidence is an important step to reduce the overall burden of disease. The risk factors for which the most evidence exists include ionising radiation, ultraviolet radiation and chemicals such as benzene and pesticides, biological agents, as well as parental smoking and parental substance use. Regarding the link between exposure to non-ionising radiation and the development of cancer, the evidence is limited. Maternal vitamin supplementation may reduce the risk of cancer in offspring. Environmental exposures encountered during development and early childhood may be even more important contributors to the risk of cancer than exposures in adulthood, and the early developmental period presents an important opportunity for cancer prevention.
Abstract:
The incidence of autism spectrum disorders, a heterogeneous group of neurodevelopmental disorders, is increasing. In response, there has been a concerted effort by researchers to identify environmental risk factors that explain the epidemiological changes seen with autism. Advanced parental age, maternal migrant status, maternal gestational stress, pregnancy and birth complications, maternal obesity and gestational diabetes, maternal vitamin D deficiency, use of antidepressants during gestation and exposure to organochlorine pesticides during pregnancy are all associated with an increased risk of autism. Folic acid use prior to pregnancy may reduce the risk of autism. Exposure to antenatal ultrasonography and maternal gestational cigarette and alcohol use do not appear to influence the risk of autism in offspring. There is little evidence that exposure to environmental toxins such as thimerosal, polybrominated diphenyl ethers and di-(2-ethylhexyl) phthalate in early childhood increases the risk of autism. Apart from birth complications, the current evidence suggests that the majority of environmental factors increasing the risk of autism occur in the antenatal period. Consistent with the rise in the incidence of autism, some of these environmental factors are now more common in developed nations. Further research is required to determine how these environmental exposures translate to an increased risk of autism. Understanding how these exposures alter neurodevelopment in autistic children may inform both the aetiopathogenesis and the strategies for prevention of autism.
Abstract:
There is a large amount of research conducted each year examining every aspect of the mechanics of the human body and its interaction with medical devices and the environment, from the cellular level through to the whole body. While, as researchers, we obtain great pleasure from conducting studies and creating new knowledge, we need to keep in mind that this knowledge is even more valuable if it leads to improvements in the quality of life of individuals suffering from biomechanical disorders. Commercialisation is a worthwhile aim, but not all research leads to marketable outcomes; it can, however, lead to improvements in surgical techniques and clinical practice. It is important for us to identify and promote how the outcomes of research lead to improvements in quality of care, as this is perhaps the most important outcome for individual patients.
Abstract:
Background Assessing hand injury is of great interest given the level of involvement of the hand with the environment. Knowing different assessment systems and their limitations generates new perspectives. The integration of digital systems (accelerometry and electromyography) as a tool to supplement functional assessment allows the clinician to know more about the motor component and its relation to movement. Therefore, the purpose of this study was to perform kinematic and electromyographic analysis of functional hand movements. Method Ten subjects carried out six functional movements (terminal pinch, termino-lateral pinch, tripod pinch, power grip, extension grip and ball grip). Muscle activity (hand and forearm) was measured in real time using electromyograms, acquired with the Mega ME 6000, whilst acceleration was measured using the AcceleGlove. Results Electrical activity and acceleration variables were recorded simultaneously while the functional movements were carried out. The acceleration outcome variables were the modular vectors of each finger of the hand and the palm. In the electromyography, the main variables were normalized by the mean and by the maximum muscle activity of the thenar region, hypothenar region, first dorsal interosseous, wrist flexors, carpal flexors and wrist extensors. Conclusions Knowing muscle behavior allows the clinician to take a more direct approach in treatment. Based on the results, the tripod grip shows greater kinetic activity, and the middle finger is the most relevant in this regard. Ball grip involves the most muscle activity, with the thenar region playing a fundamental role in hand activity. Clinical relevance Relating muscle activation, movements, individual load and displacement offers the possibility to proceed with rehabilitation by individual component.
Abstract:
Clinical Data Warehousing: A Business Analytic approach for managing health data