887 results for Medical laboratory technology
Abstract:
The electroencephalogram (EEG) is a medical technology used in the monitoring of the brain and in the diagnosis of many neurological illnesses. Although coarse in its precision, the EEG is a non-invasive tool that requires minimal set-up time, and is sufficiently unobtrusive and mobile to allow continuous monitoring of the patient, either in clinical or domestic environments. Consequently, the EEG is the current tool of choice for continuous brain monitoring where temporal resolution, ease of use and mobility are important. Traditionally, EEG data are examined by a trained clinician who identifies neurological events of interest. However, recent advances in signal processing and machine learning techniques have allowed the automated detection of neurological events for many medical applications. In doing so, the burden of work on the clinician has been significantly reduced, improving the response time to illness and allowing the relevant medical treatment to be administered within minutes rather than hours. However, as typical EEG signals are of the order of microvolts (μV), contamination by signals arising from sources other than the brain is frequent. These extra-cerebral sources, known as artefacts, can significantly distort the EEG signal, making its interpretation difficult, and can dramatically degrade the classification performance of automatic neurological event detection. This thesis therefore contributes to the further improvement of automated neurological event detection systems by identifying some of the major obstacles to deploying these EEG systems in ambulatory and clinical environments, so that EEG technologies can emerge from the laboratory towards real-world settings, where they can have a real impact on the lives of patients.
In this context, the thesis tackles three major problems in EEG monitoring, namely: (i) the problem of head-movement artefacts in ambulatory EEG, (ii) the high numbers of false detections in state-of-the-art, automated, epileptiform activity detection systems and (iii) false detections in state-of-the-art, automated neonatal seizure detection systems. To accomplish this, the thesis employs a wide range of statistical, signal processing and machine learning techniques drawn from mathematics, engineering and computer science. The first body of work outlined in this thesis proposes a system that uses supervised machine learning classifiers to automatically detect head-movement artefacts in ambulatory EEG. The resulting head-movement artefact detection system is the first of its kind and offers accurate detection of head-movement artefacts in ambulatory EEG. Subsequently, additional signals, in the form of gyroscope recordings, are used to detect head movements and, in doing so, bring additional information to the head-movement artefact detection task. A framework for combining EEG and gyroscope signals is then developed, offering improved head-movement artefact detection. The artefact detection methods developed for ambulatory EEG are subsequently adapted for use in an automated epileptiform activity detection system. Information from support vector machine classifiers used to detect epileptiform activity is fused with information from artefact-specific detection classifiers in order to significantly reduce the number of false detections in the epileptiform activity detection system. By this means, epileptiform activity detection that compares favourably with other state-of-the-art systems is achieved.
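To illustrate the classifier-fusion idea described above, the following sketch trains a support vector machine on simple per-window features drawn from both EEG and gyroscope channels. The feature set, window length, and synthetic signals are illustrative assumptions, not the thesis's actual design.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def window_features(eeg, gyro, fs=256, win_s=2):
    """Simple per-window features from EEG (uV) and gyroscope (deg/s) signals."""
    step = fs * win_s
    feats = []
    for start in range(0, len(eeg) - step + 1, step):
        e = eeg[start:start + step]
        g = gyro[start:start + step]
        feats.append([
            np.std(e),        # EEG amplitude variability
            np.ptp(e),        # EEG peak-to-peak range
            np.std(g),        # gyroscope variability (head motion)
            np.abs(g).max(),  # peak angular velocity
        ])
    return np.asarray(feats)

rng = np.random.default_rng(0)
fs = 256
clean = rng.normal(0, 10, fs * 60)     # 60 s of surrogate "clean" EEG
moving = rng.normal(0, 60, fs * 60)    # surrogate artefact-contaminated EEG
gyro_still = rng.normal(0, 1, fs * 60)
gyro_move = rng.normal(0, 50, fs * 60)

X = np.vstack([window_features(clean, gyro_still),
               window_features(moving, gyro_move)])
y = np.array([0] * 30 + [1] * 30)      # 0 = clean window, 1 = artefact window

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
print(clf.score(X, y))
```

The gyroscope features carry motion information that the EEG alone cannot provide, which is the essence of the fusion framework the abstract describes.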
Finally, the problem of false detections in automated neonatal seizure detection is approached in an alternative manner: blind source separation techniques, complemented with information from additional physiological signals, are used to remove respiration artefact from the EEG. In utilising these methods, some encouraging advances have been made in detecting and removing respiration artefacts from the neonatal EEG, and in doing so the performance of the underlying diagnostic technology is improved, bringing its deployment in the real-world, clinical domain one step closer.
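The blind source separation step can be sketched as follows, here using FastICA with a reference respiration signal to identify and remove the artefact component. The toy signals, mixing coefficients, and correlation-based component selection are illustrative assumptions, not the exact implementation in the thesis.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
fs = 64
t = np.arange(0, 30, 1 / fs)                # 30 s at 64 Hz
resp = np.sin(2 * np.pi * 0.5 * t)          # ~0.5 Hz respiration source
brain = rng.normal(0, 1, t.size)            # surrogate cerebral activity

# Two EEG channels as different mixtures of brain activity and respiration.
X = np.c_[brain + 0.8 * resp, brain - 0.5 * resp]

ica = FastICA(n_components=2, random_state=0)
S = ica.fit_transform(X)                    # estimated sources

# Identify the respiration component by correlation with a reference signal
# (e.g. from a respiration sensor), zero it, and reconstruct the channels.
corr = [abs(np.corrcoef(S[:, k], resp)[0, 1]) for k in range(2)]
S[:, int(np.argmax(corr))] = 0.0
cleaned = ica.inverse_transform(S)

# The cleaned channels correlate less with respiration than the originals.
print(abs(np.corrcoef(cleaned[:, 0], resp)[0, 1]))
```

Using an additional physiological signal to select the artefact component mirrors the abstract's combination of blind source separation with extra physiological information.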
Infant milk formula manufacture: process and compositional interactions in high dry matter wet-mixes
Abstract:
Infant milk formula (IMF) is fortified milk with a composition based on the nutrient content of human mother's milk, 0 to 6 months postpartum. Extensive medical and clinical research has led to advances in the nutritional quality of infant formula; however, relatively few studies have focused on interactions between nutrients and the manufacturing process. The objective of this research was to investigate the impact of composition and processing parameters on the physical behaviour of high dry matter (DM) IMF systems, with a view to designing more sustainable manufacturing processes. The study showed that commercial IMFs with similar compositions, manufactured by different processes, had markedly different physical properties in the dehydrated or reconstituted state. Commercial products made with hydrolysed protein were more heat stable than products made with intact protein; however, emulsion quality was compromised. Heat-induced denaturation of whey proteins resulted in increased viscosity of wet-mixes, an effect that was dependent on both whey concentration and interactions with lactose and caseins. Expanding on fundamental laboratory studies, a novel high-velocity steam injection process was developed whereby high DM (60%) wet-mixes with lower denaturation/viscosity compared to conventional processes could be achieved; powders produced using this process were of similar quality to those manufactured conventionally. Hydrolysed proteins were also shown to be an effective way of reducing viscosity in heat-treated high DM wet-mixes. In particular, using a whey protein concentrate in which β-lactoglobulin was selectively hydrolysed, i.e., α-lactalbumin remained intact, reduced the viscosity of wet-mixes during processing while still providing good emulsification. The thesis provides new insights into interactions between nutrients and/or processing which influence the physical stability of IMF in both concentrated liquid and powdered form.
The outcomes of the work have applications in areas such as increasing the DM content of spray-drier feeds in order to save energy, and controlling final powder quality.
Abstract:
BACKGROUND: Automated reporting of estimated glomerular filtration rate (eGFR) is a recent advance in laboratory information technology (IT) that generates a measure of kidney function with chemistry laboratory results to aid early detection of chronic kidney disease (CKD). Because accurate diagnosis of CKD is critical to optimal medical decision-making, several clinical practice guidelines have recommended the use of automated eGFR reporting. Since its introduction, automated eGFR reporting has not been uniformly implemented by U.S. laboratories despite the growing prevalence of CKD. CKD is highly prevalent within the Veterans Health Administration (VHA), and implementation of automated eGFR reporting within this integrated healthcare system has the potential to improve care. In July 2004, the VHA adopted automated eGFR reporting through a system-wide mandate for software implementation by individual VHA laboratories. This study examines the timing of software implementation by individual VHA laboratories and factors associated with implementation. METHODS: We performed a retrospective observational study of laboratories in VHA facilities from July 2004 to September 2009. Using laboratory data, we identified the status of implementation of automated eGFR reporting for each facility and the time to actual implementation from the date the VHA adopted its policy for automated eGFR reporting. Using survey and administrative data, we assessed facility organizational characteristics associated with implementation of automated eGFR reporting via bivariate analyses. RESULTS: Of 104 VHA laboratories, 88% implemented automated eGFR reporting in existing laboratory IT systems by the end of the study period. Time to initial implementation ranged from 0.2 to 4.0 years with a median of 1.8 years. All VHA facilities with on-site dialysis units implemented the eGFR software (52%, p<0.001). Other organizational characteristics were not statistically significant.
CONCLUSIONS: The VHA did not have uniform implementation of automated eGFR reporting across its facilities. Facility-level organizational characteristics were not associated with implementation, and this suggests that decisions for implementation of this software are not related to facility-level quality improvement measures. Additional studies on implementation of laboratory IT, such as automated eGFR reporting, could identify factors that are related to more timely implementation and lead to better healthcare delivery.
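For context, automated eGFR reporting applies a creatinine-based estimating equation to routine chemistry results. The study does not state which equation the VHA software used, but a common choice in that era, the IDMS-traceable 4-variable MDRD study equation, can be sketched as follows (coefficients are the published MDRD values, not VHA-specific details):

```python
def egfr_mdrd(scr_mg_dl: float, age_years: float,
              female: bool, black: bool) -> float:
    """Estimated GFR in mL/min/1.73 m^2 via the 4-variable MDRD equation."""
    egfr = 175.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Example: 60-year-old non-black woman with serum creatinine 1.1 mg/dL.
print(round(egfr_mdrd(1.1, 60, female=True, black=False), 1))  # ≈ 50.7
```

In practice, laboratory IT systems compute this automatically from the measured serum creatinine and patient demographics, which is the "automated reporting" the study tracks.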
Abstract:
© 2015 Elsevier Inc. All rights reserved.
Background: The 12-lead ECG is a critical component of the initial evaluation of cardiac ischemia, but has traditionally been limited to large, dedicated equipment in medical care environments. Smartphones provide a potential alternative platform for the extension of ECG to new care settings and to improve timeliness of care. Objective: To gain experience with smartphone electrocardiography prior to designing a larger multicenter study evaluating standard 12-lead ECG compared to smartphone ECG. Methods: Six patients for whom the hospital STEMI protocol was activated were evaluated with traditional 12-lead ECG followed immediately by a smartphone ECG using right (VnR) and left (VnL) limb leads for precordial grounding. The AliveCor™ Heart Monitor was utilized for this study. All tracings were taken prior to catheterization or immediately after revascularization while still in the catheterization laboratory. Results: The smartphone ECG had excellent correlation with the gold-standard 12-lead ECG in all patients. Four out of six tracings were judged to meet STEMI criteria on both modalities as determined by three experienced cardiologists, and in the remaining two, consensus indicated a non-STEMI ECG diagnosis. No significant difference was noted between VnR and VnL. Conclusions: Smartphone-based electrocardiography is a promising, developing technology intended to increase the availability and speed of electrocardiographic evaluation. This study confirmed the potential of a smartphone ECG for evaluation of acute ischemia and the feasibility of studying this technology further to define the diagnostic accuracy, limitations and appropriate use of this new technology.
Abstract:
The overall objective of this work is to develop a computational model of particle degradation during dilute-phase pneumatic conveying. A key feature of such a model is the prediction of particle breakage due to particle–wall collisions in pipeline bends. This paper presents a method for calculating particle impact degradation propensity under a range of particle velocities and particle sizes. It is based on interpolation of impact data obtained in a new laboratory-scale degradation tester. The method is tested and validated against experimental results for degradation at a 90° impact angle of a full-size distribution sample of granulated sugar. In subsequent work, the calculation of degradation propensity is coupled with a flow model of the solids and gas phases in the pipeline.
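The interpolation step described above can be sketched as follows. The grid of impact velocities, particle sizes, and breakage fractions is invented for illustration, standing in for the measurements from the laboratory-scale degradation tester.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

velocity = np.array([5.0, 10.0, 15.0, 20.0])   # impact velocity, m/s
size = np.array([0.3, 0.6, 0.9, 1.2])          # particle size, mm
# Breakage fraction rises with velocity and particle size (illustrative data).
breakage = np.array([[0.01, 0.02, 0.04, 0.06],
                     [0.03, 0.06, 0.10, 0.14],
                     [0.07, 0.12, 0.18, 0.25],
                     [0.12, 0.20, 0.29, 0.38]])

# Bilinear interpolation over the measured (velocity, size) grid.
propensity = RegularGridInterpolator((velocity, size), breakage)

# Degradation propensity for a 0.75 mm particle impacting at 12 m/s.
print(float(propensity([[12.0, 0.75]])[0]))  # ≈ 0.108
```

Conditions between the tested points are estimated this way, so the conveying model can query breakage for any particle velocity and size within the measured range.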
Abstract:
Ocean Virtual Laboratory is an ESA-funded project to prototype the concept of a single point of access for all satellite remote-sensing data with ancillary model output and in situ measurements for a given region. The idea is to provide easy access for the non-specialist to both data and state-of-the-art processing techniques and enable their easy analysis and display. The project, led by OceanDataLab, is being trialled in the region of the Agulhas Current, as it contains signals of strong contrast (due to very energetic upper ocean dynamics) and special SAR data acquisitions have been recorded there. The project also encourages the uptake of Earth Observation data by developing training material to help those not in large scientific or governmental organizations make the best use of what data are available. The website for access is: http://ovl-project.oceandatalab.com/
Abstract:
Physiological studies on M. parvicella have been conducted to determine the rate of growth of this organism in pure culture. The organism displayed a doubling time of 128 days despite its profuse abundance in a local Wastewater Treatment Works (WWTW). An extensive survey into the extent of M. parvicella in the WWTW has been ongoing since February 2000. A suite of monoclonal and polyclonal antibodies has been developed to detect and quantify M. parvicella.
Abstract:
Huge magnetic fields are predicted [1–4] to exist in the high-density region of plasmas produced during intense laser–matter interaction, near the critical-density surface where most laser absorption occurs, but until now these fields have never been measured. By using pulses focused to extreme intensities to investigate laser–plasma interactions [5], we have been able to record the highest magnetic fields ever produced in a laboratory, over 340 megagauss, by polarimetry measurements of self-generated laser harmonics.
Abstract:
Experiments were carried out from June 2000 to April 2001 to compare survival of European lobster (Homarus gammarus) offspring (larvae and juveniles) from three brood sources, Kvitsøy Wild (KW), Kvitsøy Cultured (KC), and Rogaland Wild (RW), Norway. In the first set of experiments, newly hatched larvae (stage I) were raised in separate family tanks. All larvae groups survived to stage III/IV, although large variation in relative survival was observed among families within each of the three different female groups. Highest overall survival was observed for the RW group (12.8%), whereas no differences in overall survival were found between the KW (9.0%) and KC groups (9.6%). From stage III/IV, larvae from single family tank experiments were mixed in five “common garden” juvenile experiments. These lasted for 9 months, and the surviving juveniles were identified to family/female group using microsatellite DNA profiling. Significantly higher survival of the KW families (7.0%) was found compared with the KC (3.7%) and the RW families (3.2%), and differences in family ranking of relative survival values were evident between the KW and KC groups. The relative survival rate of the different groups was independent of female lobster size. An estimate based on only stage IV larvae reduced the difference in survival between the KW (11.4%) and KC (8.3%) group. The experiments provided evidence that cultured females (KC) are producing viable offspring with lower, but comparable survival to that of offspring from wild females (KW).
Abstract:
Bio art, understood as the convergence of the relations between art, biology and technology, constitutes a useful case study for discussing the meaning of interdisciplinarity in the artistic field. This paper explores different discourses around interdisciplinarity in order to challenge certain generic approaches for their ineffectiveness when assessing artistic practices. It is proposed that the analysis of interdisciplinarity must address the singular connections produced in artistic practice itself, considering the impossibility of reducing the complexity of interdisciplinary dialogues to generic considerations. Taking bio art as a case study, different kinds of relationships between the artist and the lab are identified and analyzed, ranging from the use of the lab as a true atelier and as a resource for materials and techniques, to the rejection of the lab by proposing amateurism as an alternative.