5 results for electronic healthcare data

in Glasgow Theses Service


Relevance: 30.00%

Abstract:

Background: Individuals with Major Mental Illness (MMI), such as schizophrenia and bipolar disorder, experience increased rates of physical health comorbidity compared to the general population. They also experience inequalities in access to certain aspects of healthcare, and this ultimately leads to premature mortality. Studies detailing patterns of physical health comorbidity are limited by their definitions of comorbidity, by a single-disease approach to comorbidity and by the study of heterogeneous groups. To date, the investigation of possible sources of the healthcare inequalities experienced by individuals with MMI has been relatively limited. Moreover, studies detailing the extent of premature mortality in MMI vary both in the measure of premature mortality reported and in the age of the cohort investigated, limiting their generalisability to the wider population. Local and national data can therefore be used to describe patterns of physical health comorbidity, investigate possible reasons for health inequalities and describe mortality rates; these findings will extend existing work in this area.

Aims and Objectives: To review the relevant literature regarding patterns of physical health comorbidity, evidence for inequalities in physical healthcare, and evidence for premature mortality in individuals with MMI. To examine the rates of physical health comorbidity in a large primary care database and to assess the evidence for inequalities in access to healthcare using both routine primary care prescribing data and incentivised national Quality and Outcomes Framework (QOF) data. Finally, to examine the rates of premature mortality in a local context, with a particular focus on cause of death across the lifespan and the effects of International Classification of Diseases, Tenth Revision (ICD-10) diagnosis and socioeconomic status on rates and causes of death.

Methods: A narrative review of the literature on patterns of physical health comorbidity, the evidence for inequalities in physical healthcare, and premature mortality in MMI was undertaken. Rates of physical health comorbidity and multimorbidity in schizophrenia and bipolar disorder were examined using a large primary care dataset, the Scottish Programme for Improving Clinical Effectiveness in Primary Care (SPICE). Possible inequalities in access to healthcare were investigated by comparing patterns of prescribing in individuals with MMI and comorbid physical health conditions with prescribing rates in individuals with the same physical health conditions but without MMI, using SPICE data. Potential inequalities in access to health promotion advice (in the form of smoking cessation advice) and in the prescribing of Nicotine Replacement Therapy (NRT) were also investigated using SPICE data. Possible inequalities in access to incentivised primary healthcare were investigated using national QOF data. Finally, a pre-existing case register, the Glasgow Psychosis Clinical Information System (PsyCIS), was linked to Scottish mortality data (available from the Scottish Government website) to investigate rates and primary causes of death in individuals with MMI. Rate and primary cause of death were compared with those of the local population, and the impact of age, socioeconomic status and ICD-10 diagnosis (schizophrenia vs. bipolar disorder) was investigated.
Results: Analysis of the SPICE data found that sixteen of the thirty-two common physical comorbidities assessed occurred significantly more frequently in individuals with schizophrenia; in individuals with bipolar disorder, fourteen occurred more frequently. The most prevalent chronic physical health conditions in individuals with schizophrenia and bipolar disorder were: viral hepatitis (odds ratio (OR) 3.99, 95% confidence interval (CI) 2.82-5.64, and OR 5.90, 95% CI 3.16-11.03, respectively), constipation (OR 3.24, 95% CI 3.01-3.49, and OR 2.84, 95% CI 2.47-3.26, respectively) and Parkinson's disease (OR 3.07, 95% CI 2.43-3.89, and OR 2.52, 95% CI 1.60-3.97, respectively). Both groups had significantly increased rates of multimorbidity compared to controls: in the schizophrenia group the OR for two comorbidities was 1.37 (95% CI 1.29-1.45) and in the bipolar disorder group the OR was 1.34 (95% CI 1.20-1.49).

In the studies investigating inequalities in access to healthcare there was evidence of under-recording of cardiovascular-related conditions. For example, in individuals with schizophrenia the OR for atrial fibrillation (AF) was 0.62 (95% CI 0.52-0.73), for hypertension 0.71 (95% CI 0.67-0.76), for coronary heart disease (CHD) 0.76 (95% CI 0.69-0.83) and for peripheral vascular disease (PVD) 0.83 (95% CI 0.72-0.97). Similarly, in individuals with bipolar disorder the OR for AF was 0.56 (95% CI 0.41-0.78), for hypertension 0.69 (95% CI 0.62-0.77) and for CHD 0.77 (95% CI 0.66-0.91). There was also evidence of less intensive prescribing for individuals with schizophrenia and bipolar disorder who had comorbid hypertension and CHD compared to individuals with hypertension and CHD who did not have schizophrenia or bipolar disorder. Statins were prescribed significantly less frequently to individuals with schizophrenia and CHD than to individuals with CHD without MMI (OR 0.67, 95% CI 0.56-0.80). Rates of prescribing of two or more antihypertensives were lower in individuals with CHD and schizophrenia and in individuals with CHD and bipolar disorder than in individuals with CHD without MMI (OR 0.66, 95% CI 0.56-0.78 and OR 0.55, 95% CI 0.46-0.67, respectively). Smoking was more common in individuals with MMI than in individuals without MMI (OR 2.53, 95% CI 2.44-2.63) and was particularly increased in men (OR 2.83, 95% CI 2.68-2.98). Rates of ex-smoking and non-smoking were lower in individuals with MMI (OR 0.79, 95% CI 0.75-0.83 and OR 0.50, 95% CI 0.48-0.52, respectively). However, recorded rates of smoking cessation advice in smokers with MMI were significantly lower than the recorded rates in smokers with diabetes (88.7% vs. 98.0%, p<0.001), smokers with CHD (88.9% vs. 98.7%, p<0.001) and smokers with hypertension (88.3% vs. 98.5%, p<0.001) without MMI. The odds of NRT prescription were also significantly lower in smokers with MMI without diabetes than in smokers with diabetes without MMI (OR 0.75, 95% CI 0.69-0.81). Similar findings were observed for smokers with MMI without CHD compared to smokers with CHD without MMI (OR 0.34, 95% CI 0.31-0.38) and for smokers with MMI without hypertension compared to smokers with hypertension without MMI (OR 0.71, 95% CI 0.66-0.76).

At a national level, payment and population achievement rates for the recording of body mass index (BMI) in MMI were significantly lower than the corresponding rates for BMI recording in diabetes for the UK as a whole (payment rate 92.7%, interquartile range (IQR) 89.3-95.8, vs. 95.5%, IQR 93.3-97.2, p<0.001; population achievement rate 84.0%, IQR 76.3-90.0, vs. 92.5%, IQR 89.7-94.9, p<0.001) and for each country individually: for example, in Scotland the payment rate was 94.0% (IQR 91.4-97.2) vs. 96.3% (IQR 94.3-97.8), p<0.001. The exception rate for the recording of BMI in MMI was significantly higher than that for BMI recording in diabetes for the UK as a whole (7.4%, IQR 3.3-15.9, vs. 2.3%, IQR 0.9-4.7, p<0.001) and for each country individually; for example, in Scotland the exception rate in MMI was 11.8% (IQR 5.4-19.3) compared to 3.5% (IQR 1.9-6.1) in diabetes. Similar findings were observed for blood pressure (BP) recording: across the whole of the UK, payment and population achievement rates for BP recording in MMI were significantly reduced compared to the corresponding rates for BP recording in chronic kidney disease (CKD) (payment rate 94.1%, IQR 90.9-97.1, vs. 97.8%, IQR 96.3-98.9, p<0.001; population achievement rate 87.0%, IQR 81.3-91.7, vs. 97.1%, IQR 95.5-98.4, p<0.001). Exception rates were again significantly higher for the recording of BP in MMI compared to CKD (6.4%, IQR 3.0-13.1, vs. 0.3%, IQR 0.0-1.0, p<0.001). There was also evidence of differences in rates of recording of BMI and BP in MMI across the UK: BMI and BP recording in MMI were significantly lower in Scotland than in England (BMI: -1.5%, 99% CI -2.7 to -0.3%, p<0.001; BP: -1.8%, 99% CI -2.7 to -0.9%, p<0.001), while rates of BMI and BP recording in diabetes and CKD were similar in Scotland and England (BMI: -0.5, 99% CI -1.0 to 0.05, p=0.004; BP: 0.02, 99% CI -0.2 to 0.3, p=0.797).

Data from the PsyCIS cohort showed increased Standardised Mortality Ratios (SMRs) across the lifespan for individuals with MMI compared to the local Glasgow and wider Scottish populations (Glasgow SMR 1.8, 95% CI 1.6-2.0, and Scotland SMR 2.7, 95% CI 2.4-3.1). Increasing socioeconomic deprivation was associated with an increased overall rate of death in MMI (350.3 deaths/10,000 population/5 years in the least deprived quintile compared to 794.6 deaths/10,000 population/5 years in the most deprived quintile). No significant difference in rate of death was found between individuals with schizophrenia and those with bipolar disorder (6.3% vs. 4.9%, p=0.086), but primary cause of death varied, with higher rates of suicide in individuals with bipolar disorder (22.4% vs. 11.7%, p=0.04).

Discussion: Local and national datasets can be used for epidemiological study to inform local practice and complement existing national and international studies. While the strengths of this thesis include the large datasets used, and therefore their likely representativeness of the wider population, some limitations, largely associated with the use of secondary data sources, are acknowledged. While this thesis has confirmed evidence of increased physical health comorbidity and multimorbidity in individuals with MMI, it is likely that these findings represent significant under-reporting, and likely under-recognition, of physical health comorbidity in this population. This is likely due to a combination of patient, health professional and healthcare system factors and requires further investigation. Moreover, evidence of inequality in access to healthcare, in terms of physical health promotion (namely smoking cessation advice), recording of physical health indices (BMI and BP), prescribing of medications for the treatment of physical illness and prescribing of NRT, has been found at a national level.
While significant premature mortality in individuals with MMI within a Scottish setting has been confirmed, more work is required to detail and investigate the impact of socioeconomic deprivation on cause and rate of death in this population. It is clear that further education and training are required for all healthcare staff to improve the recognition, diagnosis and treatment of physical health problems in this population, with the aim of addressing the significant premature mortality that is seen.

Conclusions: Future work lies in the challenge of designing strategies to reduce health inequalities and narrow the gap in premature mortality reported in individuals with MMI. Models of care that allow a much more integrated approach to diagnosing, monitoring and treating both the physical and mental health of individuals with MMI, particularly in areas of social and economic deprivation, may be helpful. Strategies to engage this “hard to reach” population also need to be developed. While greater integration of psychiatric services with primary care and with specialist medical services is clearly vital, the evidence on how best to achieve this is limited. While the National Health Service (NHS) is currently undergoing major reform, attention needs to be paid to designing better ways to bridge the current disconnect between primary and secondary care. This should then help to improve physical, psychological and social outcomes for individuals with MMI.
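The comorbidity and prescribing findings above are reported as odds ratios with 95% confidence intervals. The minimal sketch below, which uses purely hypothetical counts rather than SPICE or QOF data, shows how such an estimate is typically derived from a 2x2 table using a Wald interval on the log odds ratio.

```python
# A minimal sketch, with hypothetical counts, of an odds ratio (OR) and
# approximate 95% confidence interval from a 2x2 table.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Wald 95% CI for the 2x2 table:

                        condition present   condition absent
    MMI group                  a                   b
    comparison group           c                   d
    """
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical example: 120 of 1,000 people with MMI and 40 of 1,000 comparison
# patients have the comorbidity of interest.
print(odds_ratio_ci(a=120, b=880, c=40, d=960))   # ~ (3.27, 2.26, 4.73)
```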

Relevance: 30.00%

Abstract:

This thesis reports on an investigation of the feasibility and usefulness of incorporating dynamic management facilities for managing sensed context data in a distributed context-aware mobile application. The investigation focuses on reducing the work required to integrate new sensed context streams into an existing context-aware architecture. Current architectures require integration work for each new stream and each new context that is encountered. This mode of operation is acceptable for fixed architectures; however, as systems become more mobile, the number of discoverable streams increases. Without the ability to discover and use these new streams, the functionality of any given device will be limited to the streams that it knows how to decode. The integration of new streams requires that the sensed context data be understood by the current application. If a new source provides data of a type that an application currently requires, then the new source should be connected to the application without any prior knowledge of that source. If the type is similar and can be converted, then this stream too should be appropriated by the application. Such applications are based on portable devices (phones, PDAs) providing semi-autonomous services that use data from sensors connected to the devices, plus data exchanged with other such devices and remote servers. Such applications must handle input from a variety of sensors, refining the data locally and managing its communication from the device in volatile and unpredictable network conditions. The choice to focus on locally connected sensory input allows for the introduction of privacy and access controls; this local control can determine how the information is communicated to others.

This investigation focuses on the evaluation of three approaches to sensor data management. The first system, which served as the reference system, is characterised by static management based on prepended metadata: developed for a mobile system, it processed the data according to the attached metadata, and the code that performed the processing was static. The second system was developed to move away from static processing and introduce greater freedom in handling the data stream; this resulted in a heavyweight approach, which focused on pushing the processing of the data into a number of networked nodes rather than the monolithic design of the previous system. By creating a separate communication channel for the metadata, it is possible to be more flexible with the amount and type of data transmitted. The final system pulled the benefits of the other systems together: by providing a small management class that loads a separate handler based on the incoming data, dynamism was maximised whilst maintaining ease of code understanding.

The three systems were then compared to highlight their ability to dynamically manage new sensed context. The evaluation took two approaches. The first is a quantitative analysis of the code to understand the relative complexity of the three systems, carried out by evaluating what changes to each system were required to support a new context. The second takes a qualitative view of the work required by the software engineer to reconfigure the systems to provide support for a new data stream. The evaluation highlights the scenarios in which each of the three systems is most suited. There is always a trade-off in the development of a system, and the three approaches highlight this fact. The creation of a statically bound system can be quick, but the system may need to be completely rewritten if the requirements move too far. Alternatively, a highly dynamic system may be able to cope with new requirements, but the developer time to create such a system may be greater than that needed to create several simpler systems.
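The final approach described above, a small management class that loads a handler based on incoming data, can be illustrated with a minimal sketch. The class, stream and field names below are hypothetical, not the implementation evaluated in the thesis; the point is that new stream types are integrated by registering a handler rather than by editing the manager.

```python
# A minimal sketch of metadata-driven handler dispatch for sensed-context streams.
from typing import Callable, Dict

class StreamManager:
    """Dispatches incoming sensed-context samples to a handler chosen by type."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[dict], object]] = {}

    def register(self, stream_type: str, handler: Callable[[dict], object]) -> None:
        self._handlers[stream_type] = handler

    def handle(self, sample: dict) -> object:
        # The metadata travels with the data and names the stream type.
        stream_type = sample["metadata"]["type"]
        handler = self._handlers.get(stream_type)
        if handler is None:
            raise KeyError(f"no handler registered for stream type {stream_type!r}")
        return handler(sample)

manager = StreamManager()
manager.register("gps", lambda s: (s["data"]["lat"], s["data"]["lon"]))
print(manager.handle({"metadata": {"type": "gps"},
                      "data": {"lat": 55.87, "lon": -4.29}}))
```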

Relevance: 30.00%

Abstract:

Due to the growth of design size and complexity, design verification is an important aspect of the logic circuit development process. The purpose of verification is to validate that the design meets the system requirements and specification. This is done by either functional or formal verification. The most popular approach to functional verification is the use of simulation-based techniques: using models to replicate the behaviour of an actual system is called simulation. In this thesis, a software/data structure architecture without explicit locks is proposed to accelerate logic gate circuit simulation. We call this system ZSIM. The ZSIM software architecture simulator targets low-cost SIMD multi-core machines. Its performance is evaluated on the Intel Xeon Phi and two other machines (Intel Xeon and AMD Opteron). The aim of these experiments is to:

• Verify that the data structure used allows SIMD acceleration, particularly on machines with gather instructions (Section 5.3.1).
• Verify that, on sufficiently large circuits, substantial gains can be made from multicore parallelism (Section 5.3.2).
• Show that a simulator using this approach out-performs an existing commercial simulator on a standard workstation (Section 5.3.3).
• Show that the performance on a cheap Xeon Phi card is competitive with results reported elsewhere on much more expensive supercomputers (Section 5.3.5).

To evaluate ZSIM, two types of test circuits were used: (1) circuits from the IWLS benchmark suite [1], which allow direct comparison with other published studies of parallel simulators; and (2) circuits generated by a parametrised circuit synthesizer. The synthesizer used an algorithm that has been shown to generate circuits that are statistically representative of real logic circuits, and it allowed testing of a range of very large circuits, larger than those for which open source files could be obtained. The experimental results show that with SIMD acceleration and multicore parallelism, ZSIM achieved a peak parallelisation factor of 300 on the Intel Xeon Phi and 11 on the Intel Xeon. With only SIMD enabled, ZSIM achieved a maximum parallelisation gain of 10 on the Intel Xeon Phi and 4 on the Intel Xeon. Furthermore, it was shown that this software architecture simulator running on a SIMD machine is much faster than, and can handle much bigger circuits than, a widely used commercial simulator (Xilinx) running on a workstation. The performance achieved by ZSIM was also compared with similar pre-existing work on logic simulation targeting GPUs and supercomputers. It was shown that the ZSIM simulator running on a Xeon Phi machine gives comparable simulation performance to the IBM Blue Gene supercomputer at very much lower cost. The experimental results have also shown that the Xeon Phi is competitive with simulation on GPUs and allows the handling of much larger circuits than have been reported for GPU simulation. When targeting the Xeon Phi architecture, the automatic cache management of the Xeon Phi handles the on-chip local store without any explicit mention of the local store in the architecture of the simulator itself; when targeting GPUs, by contrast, explicit cache management in the program increases the complexity of the software architecture. Furthermore, one of the strongest points of the ZSIM simulator is its portability: the same code was tested on both AMD and Xeon Phi machines, and the same architecture that performs efficiently on the Xeon Phi was ported to a 64-core NUMA AMD Opteron.
To conclude, the two main achievements are restated as follows. The primary achievement of this work was proving that the ZSIM architecture is faster than previously published logic simulators on low-cost platforms. The secondary achievement was the development of a synthetic testing suite that went beyond the scale range previously publicly available, based on prior work showing that the synthesis technique is valid.
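The following minimal sketch (not ZSIM itself) illustrates the data-layout idea behind SIMD-friendly, lock-free gate-level simulation: gates are held in flat arrays (structure of arrays) and a whole topological level is evaluated in one vectorised sweep using gather-style indexing. The gate types, net ids and toy netlist are illustrative assumptions only.

```python
# Vectorised evaluation of one level of two-input gates over flat arrays.
import numpy as np

AND, OR, XOR, NOT = 0, 1, 2, 3

def simulate_level(values, gate_type, in_a, in_b):
    """values: int8 signal values indexed by net id; gate_type, in_a, in_b:
    one entry per gate in this level (in_a/in_b are gather indices)."""
    a = values[in_a]                     # vectorised gather of first inputs
    b = values[in_b]                     # vectorised gather of second inputs
    out = np.empty_like(a)
    out[gate_type == AND] = (a & b)[gate_type == AND]
    out[gate_type == OR]  = (a | b)[gate_type == OR]
    out[gate_type == XOR] = (a ^ b)[gate_type == XOR]
    out[gate_type == NOT] = (1 - a)[gate_type == NOT]
    return out

# Toy netlist: nets 0 and 1 are primary inputs; one level with an AND and an XOR.
values = np.array([1, 0, 0, 0], dtype=np.int8)
outputs = simulate_level(values,
                         gate_type=np.array([AND, XOR]),
                         in_a=np.array([0, 0]),
                         in_b=np.array([1, 1]))
print(outputs)  # [0 1]
```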

Relevance: 30.00%

Abstract:

Developments in theory and experiment have raised the prospect of an electronic technology based on the discrete nature of electron tunnelling through a potential barrier. This thesis deals with novel design and analysis tools developed to study such systems. Possible devices include those constructed from ultrasmall normal tunnelling junctions. These exhibit charging effects, including the Coulomb blockade and correlated electron tunnelling; they allow transistor-like control of the transfer of single carriers and present the prospect of digital systems operating at the information-theoretic limit. As such, they are often referred to as single electronic devices. Single electronic devices exhibit self-quantising logic and good structural tolerance. Their speed, immunity to thermal noise, and operating voltage all scale beneficially with junction capacitance, and for ultrasmall junctions room-temperature operation at sub-picosecond timescales seems feasible. However, they are sensitive to external charge, whether from trapping-detrapping events, externally gated potentials, or system cross-talk. Quantum effects such as macroscopic quantum tunnelling of charge may degrade performance. Finally, any practical system will be complex and spatially extended (amplifying the above problems) and prone to fabrication imperfection. This summarises why new design and analysis tools are required. Simulation tools are developed, concentrating on the basic building blocks of single electronic systems: the tunnelling junction array and the gated turnstile device. Three main points are considered: the best method of estimating capacitance values from physical system geometry; the mathematical model that should represent electron tunnelling based on these data; and the application of this model to the investigation of single electronic systems. (DXN004909)
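As a point of reference for the kind of quantities such simulation tools compute, the sketch below evaluates a junction's Coulomb charging energy and the orthodox-theory tunnelling rate for a given free-energy change. It is an illustrative outline only; the capacitance, tunnel resistance and temperature are assumed values, not parameters from the thesis.

```python
# Charging energy and orthodox-theory tunnelling rate for a single junction.
import math

E = 1.602176634e-19      # elementary charge (C)
KB = 1.380649e-23        # Boltzmann constant (J/K)

def charging_energy(capacitance):
    """Charging energy e^2 / 2C of a junction with capacitance C (joules)."""
    return E ** 2 / (2.0 * capacitance)

def tunnel_rate(delta_f, r_tunnel, temperature):
    """Orthodox-theory rate Gamma = dF / (e^2 R_T (1 - exp(-dF / kT))).

    delta_f is the free-energy decrease of the tunnelling event (J): a positive
    value gives a finite rate, while a negative value is exponentially
    suppressed -- the Coulomb blockade.
    """
    kt = KB * temperature
    if delta_f == 0.0:
        return kt / (E ** 2 * r_tunnel)          # limiting value as dF -> 0
    return delta_f / (E ** 2 * r_tunnel * (1.0 - math.exp(-delta_f / kt)))

c = 1e-18                                        # assumed 1 aF junction
print(charging_energy(c) / E)                    # ~0.08 eV, above kT at 300 K
print(tunnel_rate(charging_energy(c), 100e3, 4.2))  # events/s at 4.2 K, 100 kOhm
```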

Relevance: 30.00%

Abstract:

This thesis investigates how web search evaluation can be improved using historical interaction data. Modern search engines combine offline and online evaluation approaches in a sequence of steps that a tested change needs to pass through to be accepted as an improvement and subsequently deployed. We refer to such a sequence of steps as an evaluation pipeline, and we consider it to contain three sequential steps: an offline evaluation step, an online evaluation scheduling step, and an online evaluation step. In this thesis we show that historical user interaction data can aid in improving the accuracy or efficiency of each of these steps and that, as a result of these improvements, the overall efficiency of the entire evaluation pipeline is increased. Firstly, we investigate how user interaction data can be used to build accurate offline evaluation methods for query auto-completion mechanisms. We propose a family of offline evaluation metrics for query auto-completion that represent the effort the user has to spend in order to submit their query. The parameters of our proposed metrics are trained against a set of user interactions recorded in the search engine's query logs. From our experimental study, we observe that our proposed metrics are significantly more correlated with an online user satisfaction indicator than the metrics proposed in the existing literature. Hence, fewer changes that pass the offline evaluation step will be rejected after the online evaluation step, which allows a higher efficiency of the entire evaluation pipeline to be achieved. Secondly, we state the problem of the optimised scheduling of online experiments. We tackle this problem by considering a greedy scheduler that prioritises the evaluation queue according to the predicted likelihood of success of a particular experiment. This predictor is trained on a set of online experiments and uses a diverse set of features to represent an online experiment. Our study demonstrates that a higher number of successful experiments per unit of time can be achieved by deploying such a scheduler at the second step of the evaluation pipeline; consequently, we argue that the efficiency of the evaluation pipeline can be increased. Next, to improve the efficiency of the online evaluation step, we propose the Generalised Team Draft interleaving framework. Generalised Team Draft considers both the interleaving policy (how often a particular combination of results is shown) and click scoring (how important each click is) as parameters in a data-driven optimisation of the interleaving sensitivity. Further, Generalised Team Draft is applicable beyond domains with a list-based representation of results, for example in domains with a grid-based representation, such as image search. Our study, using datasets of interleaving experiments performed in both document and image search domains, demonstrates that Generalised Team Draft achieves the highest sensitivity. A higher sensitivity indicates that the interleaving experiments can be deployed for a shorter period of time or use a smaller sample of users. Importantly, Generalised Team Draft optimises the interleaving parameters w.r.t. historical interaction data recorded in the interleaving experiments. Finally, we propose to apply sequential testing methods to reduce the mean deployment time of interleaving experiments, adapting two sequential tests for interleaving experimentation.
We demonstrate that a significant decrease in experiment duration can be achieved by using such sequential testing methods. The highest efficiency is achieved by the sequential tests that adjust their stopping thresholds using historical interaction data recorded in diagnostic experiments. Our further experimental study demonstrates that cumulative gains in online experimentation efficiency can be achieved by combining the interleaving sensitivity optimisation approaches, including Generalised Team Draft, with the sequential testing approaches. Overall, the central contributions of this thesis are the proposed approaches to improving the accuracy or efficiency of the steps of the evaluation pipeline: the offline evaluation frameworks for query auto-completion, an approach for the optimised scheduling of online experiments, a general framework for efficient online interleaving evaluation, and a sequential testing approach for online search evaluation. The experiments in this thesis are based on massive real-life datasets obtained from Yandex, a leading commercial search engine, and demonstrate the potential of the proposed approaches to improve the efficiency of the evaluation pipeline.
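For context, the sketch below shows the classic Team Draft interleaving scheme that Generalised Team Draft builds on: two rankings are merged round by round with per-round coin flips, each result is tagged with the ranker that contributed it, and clicks are credited to that ranker. The learned interleaving policy and click weights of the generalised framework are not shown; the rankings, document ids and clicks are illustrative only.

```python
# A minimal sketch of standard Team Draft interleaving and click crediting.
import random

def team_draft_interleave(ranking_a, ranking_b, rng=random):
    """Merge two rankings; record which ranker supplied each document."""
    interleaved, team, placed = [], {}, set()
    all_docs = set(ranking_a) | set(ranking_b)
    while len(placed) < len(all_docs):
        order = [("A", ranking_a), ("B", ranking_b)]
        rng.shuffle(order)                 # coin flip: who picks first this round
        for side, ranking in order:
            pick = next((d for d in ranking if d not in placed), None)
            if pick is not None:
                interleaved.append(pick)
                team[pick] = side
                placed.add(pick)
    return interleaved, team

def credit(clicked_docs, team):
    """Credit each click to the ranker whose pick was clicked."""
    wins = {"A": 0, "B": 0}
    for doc in clicked_docs:
        if doc in team:
            wins[team[doc]] += 1
    return wins

ranking_a = ["d1", "d2", "d3"]
ranking_b = ["d2", "d4", "d1"]
merged, team = team_draft_interleave(ranking_a, ranking_b)
print(merged, credit(["d2"], team))   # e.g. ['d1', 'd2', 'd4', 'd3'] {'A': 0, 'B': 1}
```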