601 results for Medical Monitoring
Abstract:
This thesis represents a major step forward in understanding the link between the development of combustion-related faults in diesel engines and the generation of acoustic emissions. The findings presented throughout the thesis provide a foundation enabling future diesel engine monitoring systems to detect and monitor developing faults more effectively. In undertaking this research, knowledge of engine function and relevant failure mechanisms was combined with different modelling methods to generate a framework that was used to identify fault-related activity within acoustic emissions recorded from different engines.
Abstract:
We propose a computationally efficient, border-pixel-based watermark embedding scheme for medical images. We treat the border pixels of a medical image as the region of non-interest (RONI), since those pixels are of little or no interest to doctors and medical professionals, irrespective of the image modality. Although the RONI is used for embedding, our proposed scheme still keeps distortion in the embedding region to a minimum by using the optimum number of least significant bit-planes for the border pixels. This not only ensures that a watermarked image is safe for diagnosis, but also helps minimize the legal and ethical concerns of altering all pixels of medical images in any manner (e.g., reversible or irreversible). The proposed scheme avoids the need for RONI segmentation, which incurs capacity and computational overheads. The performance of the proposed scheme has been compared with a relevant scheme in terms of embedding capacity, image perceptual quality (measured by SSIM and PSNR), and computational efficiency. Our experimental results show that the proposed scheme is computationally efficient, offers an image-content-independent embedding capacity, and maintains good image quality.
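The core idea of border-pixel embedding can be sketched in a few lines. The function below is a minimal, hypothetical illustration only: it assumes a grayscale image, walks the border in a simple raster order, and writes the payload into a fixed number of least-significant bit-planes. The published scheme's bit-plane selection and payload layout are more elaborate.

```python
import numpy as np

def embed_border_lsb(image, payload_bits, n_planes=2):
    """Embed payload bits into the least-significant bit-planes of the
    border pixels (the RONI) of a 2D grayscale image.

    Hypothetical sketch of border-pixel RONI embedding; interior pixels,
    which carry the diagnostic content, are never touched."""
    wm = image.copy()
    h, w = wm.shape
    # Border pixel coordinates: top row, bottom row, then side columns.
    coords = ([(0, j) for j in range(w)]
              + [(h - 1, j) for j in range(w)]
              + [(i, 0) for i in range(1, h - 1)]
              + [(i, w - 1) for i in range(1, h - 1)])
    if len(payload_bits) > len(coords) * n_planes:
        raise ValueError("payload exceeds border embedding capacity")
    k = 0
    for (i, j) in coords:
        for plane in range(n_planes):
            if k >= len(payload_bits):
                return wm
            val = int(wm[i, j])
            # Clear the target bit-plane, then set it to the payload bit.
            wm[i, j] = (val & ~(1 << plane)) | (payload_bits[k] << plane)
            k += 1
    return wm
```

Because only border pixels change, the distortion is confined to the RONI, which is why such a scheme can claim the watermarked image remains safe for diagnosis.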
Abstract:
In this paper we introduce a novel design for a translational medical research ecosystem. Translational medical research is an emerging field of work which aims to bridge the gap between basic medical science research and clinical research/patient care. We analyze the key challenges of digital ecosystems for translational research, based on real-world scenarios posed by the Lab for Translational Research at Harvard Medical School and the Genomics Research Centre at Griffith University, and show how traditional IT approaches fail to meet these challenges. We then introduce our design for a translational research ecosystem. Several key contributions are made: a novel approach to managing ad-hoc research ecosystems is introduced; a new security approach for translational research is proposed which allows each participating site to retain control over its data and define its own policies to ensure legal and ethical compliance; and a design for a novel interactive access control framework, which allows users to easily share data while adhering to their organization's policies, is presented.
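The site-local policy idea can be modelled very simply: a request against a site's data succeeds only if that site's own policy permits the requesting role. The sketch below is a hypothetical minimal model of that principle; the class and function names are illustrative and not taken from the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SitePolicy:
    """A site's own access rule set (hypothetical minimal model)."""
    site: str
    allowed_roles: frozenset

def can_access(policies, site, role):
    """Grant access to a site's data only if that site's own policy
    allows the requesting role; no central authority overrides it."""
    policy = policies.get(site)
    return policy is not None and role in policy.allowed_roles
```

The design point this captures is that each participating site remains the sole author of its policy, so legal and ethical compliance is decided locally rather than by the ecosystem as a whole.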
Abstract:
Liuwei Dihuang Wan (LDW), a classic Chinese medicinal formula, has been used to improve or restore declined functions related to aging and geriatric diseases, such as impaired mobility, vision, hearing, cognition and memory. It has attracted increasing attention as one of the most popular and valuable herbal medicines. However, the systematic analysis of the chemical constituents of LDW is difficult and thus has not been well established. In this paper, a rapid, sensitive and reliable ultra-performance liquid chromatography with electrospray ionization quadrupole time-of-flight high-definition mass spectrometry (UPLC-ESI-Q-TOF-MS) method with automated MetaboLynx analysis in positive and negative ion mode was established to characterize the chemical constituents of LDW. The analysis was performed on a Waters UPLC™ HSS T3 column using a gradient elution system. MS/MS fragmentation behavior was proposed to aid the structural identification of the components. Under the optimized conditions, a total of 50 peaks were tentatively characterized by comparing the retention times and MS data. It is concluded that a rapid and robust platform based on UPLC-ESI-Q-TOF-MS has been successfully developed for globally identifying the multiple constituents of traditional Chinese medicine prescriptions. This is the first report on systematic analysis of the chemical constituents of LDW.
Abstract:
Balancing the competing interests of autonomy and protection of individuals is an escalating challenge confronting an ageing Australian society. Legal and medical professionals are increasingly being asked to determine whether individuals are legally competent/capable of making their own testamentary and substitute decisions, that is, financial and/or personal/health care decisions. No consistent and transparent competency/capacity assessment paradigm currently exists in Australia. Consequently, assessments are currently being undertaken on an ad hoc basis, which is concerning as Australia’s population ages and issues of competency/capacity increase. The absence of nationally accepted competency/capacity assessment guidelines and supporting principles results in legal and medical professionals involved with competency/capacity assessment implementing individual processes tailored to their own abilities. Legal and medical approaches differ both between and within the professions. The terminology used also varies. The legal practitioner is concerned with whether the individual has the legal ability to make the decision. A medical practitioner assesses fluctuations in physical and mental abilities. The problem is that the terms competency and capacity are used interchangeably, resulting in confusion about what is actually being assessed. The terminological and methodological differences subsequently create miscommunication and misunderstanding between the professions. Consequently, it is not necessarily a simple solution for a legal professional to seek the opinion of a medical practitioner when assessing testamentary and/or substitute decision-making competency/capacity. This research investigates the effects of the current inadequate testamentary and substitute decision-making assessment paradigm and whether there is a more satisfactory approach.
This exploration is undertaken within a framework of therapeutic jurisprudence which promotes principles fundamentally important in this context. Empirical research has been undertaken to first, explore the effects of the current process with practising legal and medical professionals; and second, to determine whether miscommunication and misunderstanding actually exist between the professions such that it gives rise to a tense relationship which is not conducive to satisfactory competency/capacity assessments. The necessity of reviewing the adequacy of the existing competency/capacity assessment methodology in the testamentary and substitute decision-making domain will be demonstrated and recommendations for the development of a suitable process made.
Abstract:
Robotic systems are increasingly being utilised as fundamental data-gathering tools by scientists, allowing new perspectives and a greater understanding of the planet and its environmental processes. Today's robots are already exploring our deep oceans, tracking harmful algal blooms and pollution spread, monitoring climate variables, and even studying remote volcanoes. This article collates and discusses the significant advancements and applications of marine, terrestrial, and airborne robotic systems developed for environmental monitoring during the last two decades. Emerging research trends for achieving large-scale environmental monitoring are also reviewed, including cooperative robotic teams, robot and wireless sensor network (WSN) interaction, adaptive sampling and model-aided path planning. These trends offer efficient and precise measurement of environmental processes at unprecedented scales that will push the frontiers of robotic and natural sciences.
Abstract:
The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients in Australian and New Zealand hospitals consume ≤50% of the offered food. The ANCDS found a significant association between poor food intake and increased in-hospital mortality after controlling for confounders (nutritional status, age, disease type and severity). Evidence for the effectiveness of medical nutrition therapy (MNT) in hospital patients eating poorly is lacking. An exploratory study was conducted in respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, 24-hour food intake (0%, 25%, 50%, 75%, 100% of offered meals) was evaluated for patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT, with food intake re-evaluated on day 7. 184 patients were observed over four weeks. Sixty-two patients (34%) consumed ≤50% of the offered meals. Simple interventions (feeding/menu assistance, diet texture modifications) improved intake to ≥75% in 30 patients, who did not require further MNT. Of the 32 patients referred for MNT, baseline and day-7 data were available for 20 patients (68±17 years, 65% females, BMI: 22±5 kg/m2, median energy and protein intake: 2250 kJ and 25 g, respectively). On day 7, 17 participants (85%) demonstrated significantly higher consumption (4300 kJ, 53 g; p<0.01). Three participants demonstrated no improvement due to ongoing nutrition-impact symptoms. “Percentage food intake” was a quick tool to identify patients in whom simple interventions could enhance intake. MNT was associated with improved dietary intake in hospital patients. Further research is needed to establish a causal relationship.
Abstract:
The Lake Wivenhoe Integrated Wireless Sensor Network is conceptually similar to traditional SCADA monitoring and control approaches. However, it is applied in an open system, using wireless devices to monitor processes that affect water quality at high spatial and temporal frequency. This monitoring helps scientists better understand the drivers of key processes that influence water quality and provides operators with an early warning system if below-standard water enters the reservoir. Both of these aspects improve the safe and efficient delivery of drinking water to end users.
Abstract:
Background and aims The Australasian Nutrition Care Day Survey (ANCDS) reported two-in-five patients consume ≤50% of the offered food in Australian and New Zealand hospitals. After controlling for confounders (nutritional status, age, disease type and severity), the ANCDS also established an independent association between poor food intake and increased in-hospital mortality. This study aimed to evaluate if medical nutrition therapy (MNT) could improve dietary intake in hospital patients eating poorly. Methods An exploratory pilot study was conducted in the respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, percentage food intake (0%, 25%, 50%, 75%, and 100%) was evaluated for each main meal and snack for a 24-hour period in patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT. Food intake was re-evaluated on the seventh day following recruitment (post-MNT). Results 184 patients were observed over four weeks; 32 patients were referred for MNT. Although baseline and post-MNT data for 20 participants (68±17years, 65% females) indicated a significant increase in median energy and protein intake post-MNT (3600kJ/day, 40g/day) versus baseline (2250kJ/day, 25g/day) (p<0.05), the increased intake met only 50% of dietary requirements. Persistent nutrition impact symptoms affected intake. Conclusion In this pilot study whilst dietary intake improved, it remained inadequate to meet participants’ estimated requirements due to ongoing nutrition-impact symptoms. Appropriate medical management and early enteral feeding could be a possible solution for such patients.
Abstract:
Aim To develop clinical practice guidelines for nurse-administered procedural sedation and analgesia in the cardiac catheterisation laboratory. Background Numerous studies have reported that nurse-administered procedural sedation and analgesia is safe. However, the broad scope of existing guidelines for the administration and monitoring of patients who receive sedation during medical procedures without an anaesthetist present means there is a lack of specific guidance regarding optimal nursing practices for the unique circumstances in which nurse-administered procedural sedation and analgesia is used in the cardiac catheterisation laboratory. Methods A sequential mixed methods design was utilised. Initial recommendations were produced from three studies conducted by the authors: an integrative review; a qualitative study; and a cross-sectional survey. The recommendations were revised in accordance with responses from a modified Delphi study. The first Delphi round was completed by nine senior cardiac catheterisation laboratory nurses. All but one of the draft recommendations met the pre-determined cut-off point for inclusion. There were a total of 59 responses to the second round. Consensus was reached on all recommendations. Implications for nursing The guidelines that were derived from the Delphi study offer twenty-four recommendations within six domains of nursing practice: pre-procedural assessment; pre-procedural patient and family education; pre-procedural patient comfort; intra-procedural patient comfort; intra-procedural patient assessment and monitoring; and post-procedural patient assessment and monitoring. Conclusion These guidelines provide an important foundation for the delivery of safe, consistent and evidence-based nursing care for the many patients who receive sedation in the cardiac catheterisation laboratory setting.
Abstract:
INTRODUCTION It is known that vascular morphology and functionality are changed following closed soft tissue trauma (CSTT) [1] and bone fractures [2]. The disruption of blood vessels may lead to hypoxia and necrosis. Currently, most clinical methods for the diagnosis and monitoring of CSTT with or without bone fractures are primarily based on qualitative measures or practical experience, making the diagnosis subjective and inaccurate. There is evidence that CSTT and early vascular changes following the injury delay soft tissue and bone healing [3]. However, a precise qualitative and quantitative morphological assessment of vasculature changes after trauma is currently missing. In this research, we aim to establish a diagnostic framework to assess the 3D vascular morphological changes after standardized CSTT in a rat model, qualitatively and quantitatively, using contrast-enhanced micro-CT imaging. METHODS An impact device was used for the application of a controlled, reproducible CSTT to the left thigh (Biceps Femoris) of anaesthetized male Wistar rats. After euthanizing the animals at 6 hours, 24 hours, 3 days, 7 days, or 14 days after trauma, CSTT was qualitatively evaluated by macroscopic visual observation of the skin and muscles. For visualization of the vasculature, the blood vessels of sacrificed rats were flushed with heparinised saline and then perfused with a radio-opaque contrast agent (Microfil, MV 122, Flowtech, USA) using an infusion pump. After allowing the contrast agent to polymerize overnight, both hind-limbs were dissected, and then the whole injured and contra-lateral control limbs were imaged using a micro-CT scanner (µCT 40, Scanco Medical, Switzerland) to evaluate the vascular morphological changes. Correlated biopsy samples were also taken from the CSTT region of both injured and control legs.
Morphological parameters such as the vessel volume ratio (VV/TV), vessel diameter (V.D), spacing (V.Sp), number (V.N), connectivity (V.Conn) and the degree of anisotropy (DA) were then quantified by evaluating the scans of biopsy samples using the micro-CT imaging system. RESULTS AND DISCUSSION A qualitative evaluation of the CSTT has shown that the developed impact protocols were capable of producing a defined and reproducible injury within the region of interest (ROI), resulting in a large hematoma and moderate swelling on both lateral and medial sides of the injured legs. Also, the visualization of the vascular network using 3D images confirmed the ability to consistently perfuse the large vessels and the majority of the microvasculature (Figure 1). Quantification of the vascular morphology obtained from correlated biopsy samples has demonstrated that V.D, V.N and V.Sp were significantly higher in the injured legs 24 hours after impact in comparison with the control legs (p<0.05). The evaluation of the other time points is currently progressing. CONCLUSIONS The findings of this research will contribute to a better understanding of the changes to the vascular network architecture following traumatic injuries and during the healing process. When interpreted in the context of functional changes, such as tissue oxygenation, this will allow for objective diagnosis and monitoring of CSTT and serve as validation for future non-invasive clinical assessment modalities.
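Of the morphometric indices above, the vessel volume ratio is the simplest to state concretely: it is the fraction of the evaluated volume occupied by segmented vessel voxels. The sketch below is illustrative only; the study quantified these parameters with the micro-CT system's own morphometry software, and the binarisation threshold here is hypothetical.

```python
import numpy as np

def vessel_volume_ratio(ct_volume, threshold):
    """VV/TV: fraction of voxels classified as contrast-filled vessel
    within the total evaluated volume.

    ct_volume  -- 3D array of attenuation values from a micro-CT scan
    threshold  -- attenuation above which a voxel counts as vessel
                  (a hypothetical global threshold for illustration)"""
    binary = ct_volume > threshold          # segment vessel voxels
    vessel_voxels = np.count_nonzero(binary)
    return vessel_voxels / ct_volume.size   # VV / TV
```

Indices such as V.D, V.Sp and DA require distance-transform or mean-intercept analysis of the segmented network and are not reducible to a one-liner, which is why dedicated morphometry software is used.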
Abstract:
Voltammetric techniques have been introduced to monitor the formation of gold nanoparticles produced via the reaction of the amino acid glycyl-L-tyrosine with Au(III) (bromoaurate) in 0.05 M KOH conditions. The alkaline conditions facilitate amino acid binding to Au(III), inhibit the rate of reduction to Au(0), and provide an excellent supporting electrolyte for voltammetric studies. Data obtained revealed that a range of time-dependent gold solution species are involved in gold nanoparticle formation and that the order in which reagents are mixed is critical to the outcome. Concomitantly with voltammetric measurements, the properties of gold nanoparticles formed are probed by examination of electronic spectra in order to understand how the solution environment present during nanoparticle growth affects the final distribution of the nanoparticles. Images obtained by the ex situ transmission electron microscopy (TEM) technique enable the physical properties of the nanoparticles isolated in the solid state to be assessed. Use of this combination of in situ and ex situ techniques provides a versatile framework for elucidating the details of nanoparticle formation.
Abstract:
Scanning electrochemical microscopy (SECM), in the substrate generation–tip collection (SG-TC) mode, has been used to detect the cuprous ion intermediate formed during the course of electrodeposition of Cu metal from aqueous solution. Addition of chloride is confirmed to strongly stabilize the ion in aqueous solution and enhance the rate of Cu electrodeposition. This SECM method in the SG-TC mode offers an alternative to the rotating ring disk electrode (RRDE) technique for in situ studies on the effect of plating bath additives in metal electrodeposition. An attractive feature of the SECM relative to the RRDE method is that it allows qualitative aspects of the electrodeposition process to be studied in close proximity to the substrate in a simple and direct fashion using an inexpensive probe, and without the need for forced convection.
Abstract:
BACKGROUND: The prevalence of protein-energy malnutrition in older adults is reported to be as high as 60% and is associated with poor health outcomes. Inadequate feeding assistance and mealtime interruptions may contribute to malnutrition and poor nutritional intake during hospitalisation. Despite being widely implemented in practice in the United Kingdom and increasingly in Australia, there have been few studies examining the impact of strategies such as Protected Mealtimes and dedicated feeding assistant roles on nutritional outcomes of elderly inpatients. AIMS: The aim of this research was to implement and compare three system-level interventions designed to specifically address mealtime barriers and improve energy intakes of medical inpatients aged ≥65 years. This research also aimed to evaluate the sustainability of any changes to mealtime routines six months post-intervention and to gain an understanding of staff perceptions of the post-intervention mealtime experience. METHODS: Three mealtime assistance interventions were implemented in three medical wards at Royal Brisbane and Women's Hospital: AIN-only: Additional assistant-in-nursing (AIN) with dedicated nutrition role. PM-only: Multidisciplinary approach to meals, including Protected Mealtimes. PM+AIN: Combined intervention: AIN + multidisciplinary approach to meals. An action research approach was used to carefully design and implement the three interventions in partnership with ward staff and managers. Significant time was spent in consultation with staff throughout the implementation period to facilitate ownership of the interventions and increase likelihood of successful implementation. A pre-post design was used to compare the implementation and nutritional outcomes of each intervention to a pre-intervention group. 
Using the same wards, eligible participants (medical inpatients aged ≥65 years) were recruited to the pre-intervention group between November 2007 and March 2008 and to the intervention groups between January and June 2009. The primary nutritional outcome was daily energy and protein intake, which was determined by visually estimating plate waste at each meal and mid-meal on Day 4 of admission. Energy and protein intakes were compared between the pre- and post-intervention groups. Data were collected on a range of covariates (demographics, nutritional status and known risk factors for poor food intake), which allowed for multivariate analysis of the impact of the interventions on nutritional intake. The provision of mealtime assistance to participants and the activities of ward staff (including mealtime interruptions) were observed in the pre-intervention and intervention groups, with staff observations repeated six months post-intervention. Focus groups were conducted with nursing and allied health staff in June 2009 to explore their attitudes and behaviours in response to the three mealtime interventions. These focus group discussions were analysed using thematic analysis. RESULTS: A total of 254 participants were recruited to the study (pre-intervention: n=115, AIN-only: n=58, PM-only: n=39, PM+AIN: n=42). Participants had a mean age of 80 years (SD 8), and 40% (n=101) were malnourished on hospital admission, 50% (n=108) had anorexia and 38% (n=97) required some assistance at mealtimes. Occasions of mealtime assistance significantly increased in all interventions (p<0.01). However, no change was seen in mealtime interruptions. No significant difference was seen in mean total energy and protein intake between the pre-intervention and intervention groups.
However, when total kilojoule intake was compared with estimated requirements at the individual level, participants in the intervention groups were more likely to achieve adequate energy intake (OR=3.4, p=0.01), with no difference noted between interventions (p=0.29). Despite small improvements in nutritional adequacy, the majority of participants in the intervention groups (76%, n=103) had inadequate energy intakes to meet their estimated energy requirements. Patients with cognitive impairment or feeding dependency appeared to gain substantial benefit from mealtime assistance interventions. The increase in occasions of mealtime assistance by nursing staff during the intervention period was maintained six months post-intervention. Staff focus groups highlighted the importance of clearly designating and defining mealtime responsibilities in order to provide adequate mealtime care. While the purpose of the dedicated feeding assistant was to increase levels of mealtime assistance, staff indicated that responsibility for mealtime duties may have merely shifted from nursing staff to the assistant. Implementing the multidisciplinary interventions empowered nursing staff to "protect" the mealtime from external interruptions, but further work is required to empower nurses to prioritise mealtime activities within their own work schedules. Staff reported an increase in the profile of nutritional care on all wards, with additional non-nutritional benefits noted, including improved mobility and functional independence, and better identification of swallowing difficulties. IMPLICATIONS: The PhD research provides clinicians with practical strategies to immediately introduce change to deliver better mealtime care in the hospital setting, and, as such, has initiated local and state-wide roll-out of mealtime assistance programs.
Improved nutritional intakes of elderly inpatients was observed; however given the modest effect size and reducing lengths of hospital stays, better nutritional outcomes may be achieved by targeting the hospital-to-home transition period. Findings from this study suggest that mealtime assistance interventions for elderly inpatients with cognitive impairment and/or functional dependency show promise.
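The odds ratio reported above (OR=3.4 for achieving adequate energy intake) is the standard 2x2-table comparison of intervention versus pre-intervention groups. The sketch below shows how such an odds ratio and its 95% confidence interval are computed; the counts in the usage test are hypothetical and are not the study's data.

```python
import math

def odds_ratio(exposed_event, exposed_no_event, control_event, control_no_event):
    """Odds ratio for a 2x2 table, with a Wald 95% confidence interval
    computed on the log-odds scale.

    'event' here would be achieving adequate energy intake, 'exposed'
    the intervention group (counts are supplied by the caller)."""
    or_ = (exposed_event * control_no_event) / (exposed_no_event * control_event)
    # Standard error of log(OR): sqrt of summed reciprocal cell counts.
    se = math.sqrt(1 / exposed_event + 1 / exposed_no_event
                   + 1 / control_event + 1 / control_no_event)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)
```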
Abstract:
Today, the majority of semiconductor fabrication plants (fabs) conduct equipment preventive maintenance based on statistically derived time- or wafer-count-based intervals. While these practices have had relative success in managing equipment availability and product yield, the cost, both in time and materials, remains high. Condition-based maintenance has been successfully adopted in several industries, where costs associated with equipment downtime range from potential loss of life to unacceptable effects on companies’ bottom lines. In this paper, we present a method for the monitoring of complex systems in the presence of multiple operating regimes. In addition, the new representation of degradation processes is used to define an optimization procedure that facilitates concurrent maintenance and operational decision-making in a manufacturing system. This decision-making procedure metaheuristically maximizes a customizable cost function that reflects the benefits of production uptime and the losses incurred due to deficient quality and downtime. The new degradation monitoring method is illustrated through the monitoring of a deposition tool operating over a prolonged period of time in a major fab, while the operational decision-making is demonstrated using simulated operation of a generic cluster tool.
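The trade-off behind such a cost function can be sketched in miniature: uptime earns revenue, each maintenance event carries a fixed cost amortised over the interval, and quality losses grow as the tool degrades. The model and all parameters below are hypothetical, and exhaustive search stands in for the metaheuristic optimization described in the paper.

```python
def benefit_rate(interval, uptime_value=100.0, maintenance_cost=400.0,
                 degradation_rate=0.02):
    """Net benefit per unit time for a given maintenance interval
    (toy model; parameters are illustrative, not from the paper).

    Revenue accrues linearly with uptime; the fixed cost of a
    maintenance event is amortised over the interval; quality losses
    grow quadratically as the equipment degrades between services."""
    quality_loss = degradation_rate * interval ** 2
    return (uptime_value * interval - maintenance_cost - quality_loss) / interval

def best_interval(candidates):
    # Exhaustive search over candidate intervals; a real system would
    # use the paper's metaheuristic over a richer decision space.
    return max(candidates, key=benefit_rate)
```

Too-frequent maintenance wastes the fixed event cost; too-infrequent maintenance lets degradation-driven quality losses dominate, so the benefit rate peaks at an intermediate interval.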