153 results for Medical Research Council (MRC)
in CORA - Cork Open Research Archive - University College Cork - Ireland
Abstract:
The overarching aim of this thesis was to develop an intervention to support patient-centred prescribing in the context of multimorbidity in primary care. Methods: A range of research methods were used to address different components of the Medical Research Council, UK (MRC) guidance on the development and evaluation of complex interventions in health care. The existing evidence on GPs’ perceptions of the management of multimorbidity was systematically reviewed. In qualitative interviews, chart-stimulated recall was used to explore the challenges experienced by GPs when prescribing for multimorbid patients. In a cross-sectional study, the psychosocial issues that complicate the management of multimorbidity were examined. To develop the complex intervention, the Behaviour Change Wheel (BCW) was used to integrate behavioural theory with the findings of these three studies. A feasibility study of the new intervention was then conducted with GPs. Results: The systematic review revealed four domains of clinical practice in which GPs experienced difficulties in multimorbidity. The qualitative interview study showed that GPs responded to these difficulties by ‘satisficing’. In multimorbid patients perceived as stable, GPs preferred to ‘maintain the status quo’ rather than actively change medications. The cross-sectional study demonstrated a significant association between multimorbidity and negative psychosocial factors. These findings informed the development of the ‘Multimorbidity Collaborative Medication Review and Decision-making’ (MY COMRADE) intervention. The intervention involves peer support: two GPs together review the medications prescribed to a complex multimorbid patient. In the feasibility study, GPs reported that the intervention was appropriate for the context of general practice and widely applicable to their patients with multimorbidity, and that recommendations for optimising medications arose from all collaborative reviews. Conclusion: Applying theory to empirical data has led to an intervention that is implementable in clinical practice and has the potential to positively change GPs’ behaviour in the management of medications for patients with multimorbidity.
Abstract:
Background: Dietary behaviour interventions have the potential to reduce diet-related disease, and ample opportunity exists to implement them in the workplace. The overall aim is to assess the effectiveness and cost-effectiveness of complex dietary interventions focused on environmental dietary modification, alone or in combination with nutrition education, in large manufacturing workplace settings. Methods/design: A clustered controlled trial involving four large multinational manufacturing workplaces in Cork will be conducted. The complex intervention design has been developed using the Medical Research Council's framework and the National Institute for Health and Clinical Excellence (NICE) guidelines, and will be reported using the TREND statement for the transparent reporting of evaluations with non-randomized designs. It will draw on a soft paternalistic 'nudge' theoretical perspective. Nutrition education will include three elements: group presentations, individual nutrition consultations and detailed nutrition information. Environmental dietary modification will consist of five elements: (a) restriction of fat, saturated fat, sugar and salt, (b) increase in fibre, fruit and vegetables, (c) price discounts for whole fresh fruit, (d) strategic positioning of healthier alternatives and (e) portion size control. No intervention will be offered in workplace A (control). Workplace B will receive nutrition education. Workplace C will receive nutrition education and environmental dietary modification. Workplace D will receive environmental dietary modification alone. A total of 448 participants aged 18 to 64 years will be selected randomly. All permanent, full-time employees purchasing at least one main meal in the workplace daily will be eligible. Changes in dietary behaviours, nutrition knowledge and health status will be recorded, with measurements obtained at baseline and at intervals of 3 to 4 months, 7 to 9 months and 13 to 16 months. A process evaluation and a cost-effectiveness economic evaluation will be undertaken. Discussion: A 'Food Choice at Work' toolbox (a concise teaching kit to replicate the intervention) will be developed to inform and guide future researchers, workplace stakeholders, policy makers and the food industry. Trial registration: Current Controlled Trials, ISRCTN35108237.
Abstract:
This thesis investigates the extent and range of the ocular vocabulary and themes employed by the playwright Thomas Middleton in the context of early modern scientific, medical, and moral-philosophical writing on vision. More specifically, this thesis concerns Middleton’s revelation of the substance or essence of outward forms through mimesis. This paradoxical stance implies Middleton’s use of an illusory (theatrical) art form to explore hidden truths. This can be related to the early modern belief in the imagination (or fantasy) as chief mediator between the corporeal and spiritual worlds, as well as to a reformed belief in the power of signs to indicate divine truth. This thesis identifies striking parallels between Middleton’s policy of social diagnosis and cure and an increased preoccupation with knowledge of the interior man, which culminates in Robert Burton’s Anatomy of Melancholy of 1621. All of these texts seek a cure for diseased internal sense faculties (such as fantasy and will) which cause the raging passions to destroy the individual. The purpose of this thesis is to demonstrate how Middleton takes a similar ‘mental-medicinal’ approach, which investigates the idols created by the imagination before ‘purging’ the same and restoring order (Corneanu and Vermeir 184). The idea of infection incurred through eyes fixed on vice (or error) has moral, religious, and political implications; the discovery of corruption involves stripping away the illusions of false appearances to reveal the truth within, so that disease can be cured and order restored. Finally, Middleton’s use of theatrical fantasy to detect the idols of the diseased imagination can be read as a Paracelsian, rather than Galenic, form of medicine whereby like is ‘joined with their like’ (Bostocke C7r) to restore health.
Abstract:
As a by-product of the ‘information revolution’ which is currently unfolding, lifetimes of man (and indeed computer) hours are being allocated to the automated and intelligent interpretation of data. This is particularly true in medical and clinical settings, where research into machine-assisted diagnosis of physiological conditions gains momentum daily. Of the conditions which have been addressed, however, automated classification of allergy has not been investigated, even though the number of allergic persons is rising and undiagnosed allergies can have fatal consequences. On the basis of the observations of allergists who conduct oral food challenges (OFCs), activity-based analyses of allergy tests were performed. Algorithms were investigated and validated in a pilot study, which verified that accelerometer-based measurement of human movement is particularly well suited to the objective appraisal of activity. However, when these analyses were applied to OFCs, accelerometer-based investigations were found to provide very poor separation between allergic and non-allergic persons, and it was concluded that these activity-based avenues are inadequate for the classification of allergy. Heart rate variability (HRV) analysis is known to provide very significant diagnostic information for many conditions. Accordingly, electrocardiograms (ECGs) were recorded during OFCs to assess the effect that allergy induces on HRV features. It was found that, with appropriate analysis, excellent separation between allergic and non-allergic subjects can be obtained. These results were, however, obtained with manual QRS annotations, which are not a viable methodology for real-time diagnostic applications. Even so, this is the first work to categorically correlate changes in HRV features with the onset of allergic events, and the manual annotations provide undeniable affirmation of this. Encouraged by the successful results obtained with manual classifications, automatic QRS detection algorithms were investigated to facilitate the fully automated classification of allergy. The results obtained by this process are very promising. Most importantly, the work presented in this thesis did not produce any false positive classifications. This is a most desirable result for OFC classification, as it allows complete confidence to be attributed to classifications of allergy. Furthermore, these results could be particularly advantageous in clinical settings, as machine-based classification can detect the onset of allergy and allow early termination of OFCs. Consequently, machine-based monitoring of OFCs has been shown in this work to have the capacity to significantly and safely advance the current clinical state of the art in allergy diagnosis.
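The abstract does not state which HRV features separated the allergic and non-allergic groups. Purely as an illustrative sketch, two common time-domain HRV features (SDNN and RMSSD) can be computed from QRS annotation times as follows; the feature choice and the simulated beat times are assumptions, not the thesis's method or data:

```python
import numpy as np

def hrv_features(qrs_times_s):
    """Time-domain HRV features from QRS (R-peak) times given in seconds."""
    rr = np.diff(qrs_times_s) * 1000.0           # RR intervals in milliseconds
    sdnn = np.std(rr, ddof=1)                    # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))   # beat-to-beat (short-term) variability
    return {"mean_rr_ms": rr.mean(), "sdnn_ms": sdnn, "rmssd_ms": rmssd}

# Illustration only: compare a pre-challenge window with a window recorded during the OFC.
rng = np.random.default_rng(1)
baseline  = hrv_features(np.cumsum(rng.normal(0.85, 0.05, 300)))   # simulated beat times
challenge = hrv_features(np.cumsum(rng.normal(0.70, 0.02, 300)))
print(baseline, challenge, sep="\n")
```

Comparing such features between windows is one generic way changes in HRV around an event could be quantified; whether the thesis used these particular features is not stated.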
Abstract:
This PhD thesis investigates the application of hollow core photonic crystal fibre as an optical fibre nanolitre liquid sensor. The use of hollow core photonic crystal fibre for optical fibre sensing draws on the vast wealth of knowledge and years of research conducted on optical waveguides. Hollow core photonic crystal fibres have the potential to serve as simple, rapid and continuous sensors for a wide range of applications. In this thesis, the velocity of a liquid flowing through the core of the fibre (driven by capillary forces) is used to determine the viscosity of the liquid. The structure of the hollow core photonic crystal fibre is harnessed to collect Raman scatter from the sample liquid. These two methods are combined to investigate the range of applications for which the hollow core photonic crystal fibre can be used as an optical liquid sensor. Understanding the guidance properties of hollow core photonic crystal fibre is central to dynamically monitoring the liquid filling. When liquid is introduced fully or selectively into the capillaries, the propagation properties change from photonic bandgap guidance when the fibre is empty, to index guidance when only the core is filled, and finally to a shifted photonic bandgap effect when the capillaries are completely filled. These alterations to the guidance are exploited for all viscosity and Raman scattering measurements. The concept of the optical fibre viscosity sensor was tested on a wide range of samples, from aqueous solutions of propan-1-ol to solutions of monosaccharides in phosphate buffered saline. The samples chosen to test the concept were selected after careful consideration of the importance of the liquids in medical and industrial applications. The Raman scattering of a wide range of biologically important fluids, such as creatinine, glucose and lactate, was investigated, some for the first time with hollow core photonic crystal fibre.
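The abstract does not give the relation used to extract viscosity from the capillary-driven filling velocity. One standard model for capillary filling of a cylindrical channel, included here only to illustrate the principle (core radius $r$, surface tension $\gamma$, contact angle $\theta$, dynamic viscosity $\eta$), is the Lucas–Washburn equation:

$$ l(t) = \sqrt{\frac{\gamma\, r \cos\theta}{2\eta}\, t}, \qquad \frac{\mathrm{d}l}{\mathrm{d}t} = \frac{\gamma\, r \cos\theta}{4\,\eta\, l(t)}, $$

so tracking the meniscus position or filling velocity through the change in fibre guidance yields $\eta$ once $\gamma$, $r$ and $\theta$ are known. Whether the thesis uses this particular relation is an assumption here.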
Abstract:
The analysis of energy detector systems is a well-studied topic in the literature: numerous models have been derived describing the behaviour of single and multiple antenna architectures operating in a variety of radio environments. However, in many cases of interest, these models are not in a closed form and so their evaluation requires the use of numerical methods. In general, these are computationally expensive, which can cause difficulties in certain scenarios, such as in the optimisation of device parameters on low cost hardware. The problem becomes acute in situations where the signal-to-noise ratio is small and reliable detection is to be ensured, or where the number of samples of the received signal is large. Furthermore, owing to the analytic complexity of the models, insight into the behaviour of various system parameters of interest is not readily available. In this thesis, an approximation based approach is taken towards the analysis of such systems. By focusing on the situations where exact analyses become complicated, and making a small number of astute simplifications to the underlying mathematical models, it is possible to derive novel, accurate and compact descriptions of system behaviour. Approximations are derived for the analysis of energy detectors with single and multiple antennae operating on additive white Gaussian noise (AWGN) and independent and identically distributed Rayleigh, Nakagami-m and Rice channels; in the multiple antenna case, approximations are derived for systems with maximal ratio combiner (MRC), equal gain combiner (EGC) and square law combiner (SLC) diversity. In each case, error bounds are derived describing the maximum error resulting from the use of the approximations. In addition, it is demonstrated that the derived approximations require fewer computations of simple functions than any of the exact models available in the literature. Consequently, the regions of applicability of the approximations directly complement the regions of applicability of the available exact models. Further novel approximations for other system parameters of interest, such as sample complexity, minimum detectable signal-to-noise ratio and diversity gain, are also derived. In the course of the analysis, a novel theorem describing the convergence of the chi square, noncentral chi square and gamma distributions towards the normal distribution is derived. The theorem describes a tight upper bound on the error resulting from the application of the central limit theorem to random variables of the aforementioned distributions and gives a much better description of the resulting error than existing Berry-Esseen type bounds. A second novel theorem, providing an upper bound on the maximum error resulting from the use of the central limit theorem to approximate the noncentral chi square distribution where the noncentrality parameter is a multiple of the number of degrees of freedom, is also derived.
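For context, the kind of approximation at issue can be sketched with the textbook central-limit-theorem (Gaussian) approximation for an $N$-sample energy detector over AWGN with real-valued samples, noise variance $\sigma^2$, per-sample SNR $\gamma$ and detection threshold $\lambda$; this notation is assumed here and is not taken from the thesis:

$$ T = \sum_{n=1}^{N} y[n]^2, \qquad P_{fa} \approx Q\!\left(\frac{\lambda/\sigma^2 - N}{\sqrt{2N}}\right), \qquad P_{d} \approx Q\!\left(\frac{\lambda/\sigma^2 - N(1+\gamma)}{\sqrt{2N(1+2\gamma)}}\right). $$

Bounding the error incurred by Gaussian approximations of exactly this kind, in which chi square and noncentral chi square statistics are replaced by normal ones, is what the convergence theorems described above address.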
Abstract:
One problem with most three-dimensional (3D) scalar data visualization techniques is that they often fail to depict the uncertainty that accompanies the 3D scalar data; they therefore do not faithfully present the data and risk misleading users’ interpretations, conclusions or even decisions. This thesis therefore focuses on uncertainty visualization for 3D scalar data: we seek to create better uncertainty visualization techniques and to establish the advantages and disadvantages of the state-of-the-art uncertainty visualization techniques. To do this, we address three specific hypotheses: (1) the proposed Texture uncertainty visualization technique enables users to better identify scalar/error data, and provides reduced visual overload and more appropriate brightness than four state-of-the-art uncertainty visualization techniques, as demonstrated using a perceptual effectiveness user study. (2) The proposed Linked Views and Interactive Specification (LVIS) uncertainty visualization technique enables users to better search for maximum/minimum scalar and error data than four state-of-the-art uncertainty visualization techniques, as demonstrated using a perceptual effectiveness user study. (3) The proposed Probabilistic Query uncertainty visualization technique, in comparison to traditional Direct Volume Rendering (DVR) methods, enables radiologists/physicians to better identify possible alternative renderings relevant to a diagnosis and the classification probabilities associated with the materials appearing in these renderings; this leads to improved decision support for diagnosis, as demonstrated in the domain of medical imaging. Each hypothesis is tested by implementing a unified framework that consists of three main steps. The first step is uncertainty data modeling, which defines and generates the types of uncertainty associated with the given 3D scalar data. The second step is uncertainty visualization, which transforms the 3D scalar data and the associated uncertainty generated in the first step into two-dimensional (2D) images for insight, interpretation or communication. The third step is evaluation, which transforms the 2D images generated in the second step into quantitative scores according to specific user tasks and statistically analyzes the scores. As a result, the quality of each uncertainty visualization technique is determined.
Abstract:
This thesis explores the use of electromagnetics for both steering and tracking of medical instruments in minimally invasive surgeries. The end application is virtual navigation of the lung for biopsy of early stage cancer nodules. Navigation to the peripheral regions of the lung is difficult due to the physical dimensions of the bronchi, and current methods have low success rates for accurate diagnosis. Firstly, the potential use of DC magnetic fields for the actuation of catheter devices with permanently magnetised distal attachments is investigated. Catheter models formed from various materials and magnetic tip formations are used to examine the usefulness of relatively low power and compact electromagnets. The force and torque that can be exerted on a small permanent magnet is shown to be extremely limited. Hence, after this initial investigation we turn our attention to electromagnetic tracking, in the development of a novel, low-cost implementation of a GPS-like system for navigating within a patient. A planar magnetic transmitter, formed on a printed circuit board for low-profile and low-cost manufacture, is used to generate a low frequency magnetic field distribution which is detected by a small induction coil sensor. The field transmitter is controlled by a novel closed-loop system that ensures a highly stable magnetic field with reduced interference from one transmitter coil to another. Efficient demodulation schemes are presented which utilise synchronous detection of each magnetic field component experienced by the sensor. The overall tracking error of the system is shown to be less than 2 mm in position, with an orientation error of less than 1°. A novel demodulation implementation using a unique undersampling approach allows the use of reduced sample rates to sample the signals of interest without loss of tracking accuracy. This is advantageous for embedded microcontroller implementations of EM tracking systems. The EM tracking system is demonstrated in the pre-clinical environment of a breathing lung phantom. The airways of the phantom are successfully navigated using the system in combination with a 3D computer model rendered from CT data. Registration is achieved using both a landmark rigid registration method and a hybrid fiducial-free approach. The design of a planar magnetic shield structure for blocking the effects of metallic distortion from below the transmitter is presented; it successfully blocks the impact of large ferromagnetic objects such as operating tables. A variety of shielding materials are analysed, with MuMetal and ferrite both providing excellent shielding performance and an increased signal-to-noise ratio. Finally, the effect of conductive materials and human tissue on magnetic field measurements is presented. Error due to induced eddy currents and capacitive coupling is shown to severely affect EM tracking accuracy at higher frequencies.
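The abstract notes that demodulation relies on synchronous detection of each transmitter coil's field component. A minimal sketch of that general lock-in style technique is shown below; the coil frequencies and sample rate are assumptions for illustration, not the system's actual values:

```python
import numpy as np

def synchronous_amplitudes(signal, fs, coil_freqs):
    """Estimate the amplitude of each transmitter-coil frequency present in the
    sensed signal by correlating with in-phase/quadrature references (lock-in style)."""
    t = np.arange(len(signal)) / fs
    amplitudes = {}
    for f in coil_freqs:
        i = np.mean(signal * np.cos(2 * np.pi * f * t))   # in-phase correlation
        q = np.mean(signal * np.sin(2 * np.pi * f * t))   # quadrature correlation
        amplitudes[f] = 2 * np.hypot(i, q)                # field-component amplitude
    return amplitudes

# Illustration: two coils present at 1 kHz and 1.2 kHz, a third (1.4 kHz) absent.
fs = 20_000.0
t = np.arange(0, 0.1, 1 / fs)                      # 0.1 s window (integer number of periods)
sensed = 1.0 * np.sin(2 * np.pi * 1000 * t) + 0.5 * np.sin(2 * np.pi * 1200 * t)
print(synchronous_amplitudes(sensed, fs, [1000, 1200, 1400]))
```

Because each coil's contribution is isolated by its own reference frequency, the components can be separated from a single sensor signal; the paper's undersampling refinement is not shown here.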
Abstract:
Global biodiversity is eroding at an alarming rate, through a combination of anthropogenic disturbance and environmental change. Ecological communities are bewildering in their complexity. Experimental ecologists strive to understand the mechanisms that drive the stability and structure of these complex communities in a bid to inform nature conservation and management. Two fields of research have had high profile success at developing theories related to these stabilising structures and testing them through controlled experimentation. Biodiversity-ecosystem functioning (BEF) research has explored the likely consequences of biodiversity loss on the functioning of natural systems and the provision of important ecosystem services. Empirical tests of BEF theory often consist of simplified laboratory and field experiments, carried out on subsets of ecological communities. Such experiments often overlook key information relating to patterns of interactions, important relationships, and fundamental ecosystem properties. The study of multi-species predator-prey interactions has also contributed much to our understanding of how complex systems are structured, particularly through the importance of indirect effects and predator suppression of prey populations. A growing number of studies describe these complex interactions in detailed food webs, which encompass all the interactions in a community. This has led to recent calls for an integration of BEF research with the comprehensive study of food web properties and patterns, to help elucidate the mechanisms that allow complex communities to persist in nature. This thesis adopts such an approach, through experimentation at Lough Hyne marine reserve, in southwest Ireland. Complex communities were allowed to develop naturally in exclusion cages, with only the diversity of top trophic levels controlled. Species removals were carried out and the resulting changes to predator-prey interactions, ecosystem functioning, food web properties, and stability were studied in detail. The findings of these experiments contribute greatly to our understanding of the stability and structure of complex natural communities.
Abstract:
This research focuses on the design and implementation of a tool to speed up the development and deployment of heterogeneous wireless sensor networks. The THAWS (Tyndall Heterogeneous Automated Wireless Sensors) tool can be used to quickly create and configure application-specific sensor networks. THAWS presents the user with a choice of options in order to characterise the desired functionality of the network. With this information, THAWS generates the necessary code from pre-written templates and well-tested, optimized software modules. This code is then automatically compiled to form binary files for each node in the network. Wireless programming of the network completes the task of targeting the wireless network towards a specific sensing application. THAWS is an adaptable tool that works with both homogeneous and heterogeneous networks built from wireless sensor nodes developed at the Tyndall National Institute.
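THAWS's actual templates, option set and generated code are not described in this abstract. Purely as an illustration of the template-driven workflow outlined above (user options filled into pre-written templates, one configuration per node), a sketch might look like the following, with every field name invented:

```python
from string import Template

# Hypothetical firmware-configuration template; the real THAWS templates and
# options are not published in this abstract, so every field here is illustrative.
NODE_TEMPLATE = Template("""\
#define NODE_ID        $node_id
#define SENSOR_TYPE    $sensor      /* e.g. temperature, humidity */
#define SAMPLE_PERIOD  $period_ms   /* milliseconds between readings */
#define RADIO_CHANNEL  $channel
""")

def generate_node_config(node_id, sensor, period_ms, channel):
    """Fill the per-node template from the user's chosen options."""
    return NODE_TEMPLATE.substitute(node_id=node_id, sensor=sensor,
                                    period_ms=period_ms, channel=channel)

# Each generated header would then be compiled into that node's binary image
# and delivered to the node by wireless programming.
for nid in range(1, 4):
    print(generate_node_config(nid, "temperature", 5000, 11))
```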
Abstract:
Two complementary wireless sensor nodes for building two-tiered heterogeneous networks are presented. A larger node with a 25 mm by 25 mm size acts as the backbone of the network and can handle complex data processing. A smaller, cheaper node with a 10 mm by 10 mm size can perform simpler sensor-interfacing tasks. The 25 mm node is based on previous work at the Tyndall National Institute that created a modular wireless sensor node. In this work, a new 25 mm module operating in the 433/868 MHz frequency bands is developed, with a range of 3.8 km. The 10 mm node is highly miniaturised while retaining a high level of modularity. It has been designed to support very energy efficient operation for applications with low duty cycles, with a sleep current of 3.3 μA. Both nodes use commercially available components and have low manufacturing costs to allow the construction of large networks. In addition, interface boards for communicating with the nodes have been developed for both the 25 mm and 10 mm nodes. These interface boards provide a USB connection and support recharging of a Li-ion battery from the USB power supply. This paper discusses the design goals, the design methods, and the resulting implementation.
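To put the quoted 3.3 μA sleep current in context, a rough duty-cycle battery-life estimate can be sketched as follows; the active current, wake time, period and battery capacity are assumed values for illustration, not figures from the paper:

```python
# Rough battery-life estimate for a duty-cycled node. Only the 3.3 uA sleep
# current comes from the abstract; every other figure is an assumption.
sleep_current_ua  = 3.3        # quoted sleep current
active_current_ua = 20_000.0   # assumed current while sampling/transmitting (20 mA)
wake_time_s       = 0.05       # assumed awake time per measurement cycle
period_s          = 60.0       # assumed one measurement per minute
battery_mah       = 750.0      # assumed Li-ion capacity

duty = wake_time_s / period_s
avg_current_ua = duty * active_current_ua + (1 - duty) * sleep_current_ua
lifetime_days = (battery_mah * 1000.0 / avg_current_ua) / 24.0
print(f"average current: {avg_current_ua:.1f} uA, estimated lifetime: {lifetime_days:.0f} days")
```

Under these assumed figures the sleep current contributes only a few μA to the average draw, which is what makes multi-year lifetimes plausible at low duty cycles.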
Abstract:
This work considers the effect of hardware constraints that typically arise in practical power-aware wireless sensor network systems. A rigorous methodology is presented that quantifies the effect of output power limit and quantization constraints on bit error rate performance. The approach uses a novel, intuitively appealing means of addressing the output power constraint, wherein the attendant saturation block is mapped from the output of the plant to its input and compensation is then achieved using a robust anti-windup scheme. A priori levels of system performance are attained using a quantitative feedback theory approach on the initial, linear stage of the design paradigm. This hybrid design is assessed experimentally using a fully compliant 802.15.4 testbed where mobility is introduced through the use of autonomous robots. A benchmark comparison between the new approach and a number of existing strategies is also presented.
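The abstract does not detail the compensation scheme beyond naming it a robust anti-windup design. As a generic, hedged illustration of the anti-windup idea it builds on (back-calculation around an output saturation, with invented gains and limits), a discrete PI controller step might look like:

```python
def saturate(u, u_max):
    """Model of the output-power limit (actuator saturation)."""
    return max(-u_max, min(u_max, u))

def pi_antiwindup_step(error, integrator, kp=1.0, ki=0.5, kaw=0.8, u_max=1.0, dt=0.01):
    """One step of a discrete PI controller with back-calculation anti-windup.
    All gains and the limit are illustrative assumptions, not the paper's design."""
    u_unsat = kp * error + integrator
    u = saturate(u_unsat, u_max)
    # Back-calculation: when the output saturates, (u - u_unsat) is non-zero and
    # bleeds the integrator so it does not wind up while the actuator is at its limit.
    integrator += dt * (ki * error + kaw * (u - u_unsat))
    return u, integrator
```

The paper additionally maps the saturation from the plant output to its input and designs the linear stage with quantitative feedback theory; neither of those steps is represented in this sketch.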
Abstract:
Lacticin 3147, enterocin AS-48, lacticin 481, variacin, and sakacin P are bacteriocins offering promising perspectives for the preservation and shelf-life extension of food products and should find commercial application in the near future. The studies detailing their characterization and bio-preservative applications are reviewed. Transcriptomic analyses showed a cell wall-targeted response of Lactococcus lactis IL1403 during the early stages of infection with the lytic bacteriophage c2, which is probably orchestrated by a number of membrane stress proteins and involves D-alanylation of membrane lipoteichoic acids, restoration of the physiological proton motive force disrupted following bacteriophage infection, and energy conservation. Sequencing of the eight plasmids of L. lactis subsp. cremoris DPC3758 from raw milk cheese revealed three anti-phage restriction/modification (R/M) systems, immunity/resistance to nisin, lacticin 481, cadmium and copper, and six conjugative/mobilization regions. A food-grade derivative strain with enhanced bacteriophage resistance was generated via stacking of R/M plasmids. Sequencing and functional analysis of the four plasmids of L. lactis subsp. lactis biovar diacetylactis DPC3901 from raw milk cheese revealed genes novel to Lactococcus and typical of bacteria associated with plants, in addition to genes associated with plant-derived lactococcal strains. The functionality of a novel high-affinity regulated system for cobalt uptake was demonstrated. The bacteriophage-resistance and bacteriocin-production plasmid pMRC01 places a metabolic burden on lactococcal hosts, resulting in lowered growth rates and increased cell permeability and autolysis. The magnitude of these effects is strain dependent but not related to bacteriocin production. Starters’ acidification capacity is not significantly affected. Transcriptomic analyses showed that the pMRC01 abortive infection (Abi) system is probably subject to complex regulatory control by the Rgg-like ORF51 and CopG-like ORF58 proteins. These regulators are suggested to modulate the activity of the putative Abi effectors ORF50 and ORF49, which exhibit topology and functional similarities to the Rex system that aborts bacteriophage λ lytic growth.
Experimental quantification and modelling of attrition of infant formulae during pneumatic conveying
Abstract:
Infant formula is often produced as an agglomerated powder using a spray drying process. Pneumatic conveying is commonly used for transporting this product within a manufacturing plant. The transient mechanical loads imposed by this process cause some of the agglomerates to disintegrate, which has implications for key quality characteristics of the formula, including bulk density and wettability. This thesis used both experimental and modelling approaches to investigate this breakage during conveying. One set of conveying trials had the objective of establishing relationships between the geometry and operating conditions of the conveying system and the resulting changes in bulk properties of the infant formula upon conveying. A modular stainless steel pneumatic conveying rig was constructed for these trials. The mode of conveying and the air velocity had a statistically significant effect on bulk density at the 95% confidence level, while the mode of conveying was the only factor which significantly influenced D[4,3] or wettability. A separate set of conveying experiments investigated the effect of infant formula composition, rather than the pneumatic conveying parameters, and also assessed the relationships between the mechanical responses of individual agglomerates of four infant formulae and their compositions. The bulk densities before conveying, and the forces and strains at failure of individual agglomerates, were related to the protein content. The force at failure and stiffness of individual agglomerates were strongly correlated, and generally increased with increasing protein to fat ratio, while the strain at failure decreased. Two models of breakage were developed at different scales; the first was a detailed discrete element model of a single agglomerate. This was calibrated using a novel approach based on Taguchi methods, which was shown to have considerable advantages over the widely used basic parameter studies. The data obtained using this model compared well with experimental results for quasi-static uniaxial compression of individual agglomerates. The model also gave adequate results for dynamic loading simulations. A probabilistic model of pneumatic conveying was also developed; this was suitable for predicting breakage in large populations of agglomerates and was highly versatile: parts of the model could easily be substituted by the researcher according to their specific requirements.
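The structure of the thesis's probabilistic conveying model is not described in the abstract. Purely as an illustration of how a population-level breakage model of this general kind can be set up (a fixed breakage probability per impact event, applied over a population by Monte Carlo sampling; every number below is assumed):

```python
import numpy as np

rng = np.random.default_rng(0)

def intact_fraction(n_agglomerates=100_000, n_impacts=6, p_break_per_impact=0.08):
    """Monte Carlo sketch: each agglomerate experiences a series of impact events
    (e.g. bends in the conveying line) and breaks at each with a fixed probability.
    Returns the fraction of the population still intact after conveying."""
    survived = np.ones(n_agglomerates, dtype=bool)
    for _ in range(n_impacts):
        survived &= rng.random(n_agglomerates) >= p_break_per_impact
    return survived.mean()

print(f"intact fraction: {intact_fraction():.3f}")   # ~ (1 - 0.08)**6 ≈ 0.61
```

In a more versatile model of the kind described, the per-impact breakage rule is exactly the part a researcher would substitute, for example with a probability that depends on impact velocity or agglomerate strength.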