3 results for capacity management

in Glasgow Theses Service


Relevance:

30.00%

Publisher:

Abstract:

The anticipated growth of air traffic worldwide requires enhanced Air Traffic Management (ATM) technologies and procedures to increase the system capacity, efficiency, and resilience, while reducing environmental impact and maintaining operational safety. To deal with these challenges, new automation and information exchange capabilities are being developed through different modernisation initiatives toward a new global operational concept called Trajectory Based Operations (TBO), in which aircraft trajectory information becomes the cornerstone of advanced ATM applications. This transformation will lead to higher levels of system complexity requiring enhanced Decision Support Tools (DST) to aid humans in the decision making processes. These will rely on accurate predicted aircraft trajectories, provided by advanced Trajectory Predictors (TP). The trajectory prediction process is subject to stochastic effects that introduce uncertainty into the predictions. Regardless of the assumptions that define the aircraft motion model underpinning the TP, deviations between predicted and actual trajectories are unavoidable. This thesis proposes an innovative method to characterise the uncertainty associated with a trajectory prediction based on the mathematical theory of Polynomial Chaos Expansions (PCE). Assuming univariate PCEs of the trajectory prediction inputs, the method describes how to generate multivariate PCEs of the prediction outputs that quantify their associated uncertainty. Arbitrary PCE (aPCE) was chosen because it allows a higher degree of flexibility to model input uncertainty. The obtained polynomial description can be used in subsequent prediction sensitivity analyses thanks to the relationship between polynomial coefficients and Sobol indices. The Sobol indices enable ranking the input parameters according to their influence on trajectory prediction uncertainty. 
The applicability of the aPCE-based uncertainty quantification detailed herein is analysed through a case study representing a typical aircraft trajectory prediction problem in ATM, in which uncertain parameters regarding aircraft performance, aircraft intent description, weather forecast, and initial conditions are considered simultaneously. Numerical results are compared to those obtained from a Monte Carlo simulation, demonstrating the advantages of the proposed method. The thesis includes two examples of DSTs (a Demand and Capacity Balancing tool and an Arrival Manager) to illustrate the potential benefits of exploiting the proposed uncertainty quantification method.
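The coefficient–Sobol relationship mentioned in the abstract can be made concrete with a small sketch. For an expansion in an orthonormal basis, the variance of the output is the sum of the squared non-constant coefficients, and the first-order Sobol index of input i is the share of that variance carried by terms in which only input i is active. This is an illustrative sketch under those standard assumptions, not the thesis code; the multi-index representation and the example coefficients are hypothetical.

```python
# Illustrative sketch (not the thesis implementation): first-order Sobol
# indices from the coefficients of an orthonormal polynomial chaos
# expansion. Each term is identified by a multi-index tuple alpha; for an
# orthonormal basis, Var[f] = sum of c_alpha^2 over alpha != (0, ..., 0).

def first_order_sobol(coeffs):
    """coeffs: dict mapping multi-index tuples to PCE coefficients.
    Returns one first-order Sobol index per input dimension."""
    dim = len(next(iter(coeffs)))
    total_var = sum(c * c for a, c in coeffs.items() if any(a))
    indices = []
    for i in range(dim):
        # Terms in which only input i is active (all other entries zero).
        var_i = sum(
            c * c for a, c in coeffs.items()
            if a[i] > 0 and all(aj == 0 for j, aj in enumerate(a) if j != i)
        )
        indices.append(var_i / total_var)
    return indices

# Hypothetical 2-input expansion: constant, two linear terms, one
# interaction term.
pce = {(0, 0): 1.0, (1, 0): 0.8, (0, 1): 0.4, (1, 1): 0.2}
print(first_order_sobol(pce))  # -> [0.7619..., 0.1904...]
```

The indices sum to less than one; the remainder (here 0.04/0.84) is the interaction contribution, which is exactly what lets the method rank inputs by their influence on prediction uncertainty.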

Relevance:

30.00%

Publisher:

Abstract:

Maintaining accessibility to and understanding of digital information over time is a complex challenge that often requires contributions and interventions from a variety of individuals and organisations. The processes of preservation planning and evaluation are fundamentally implicit and share similar complexity. Both demand comprehensive knowledge and understanding of every aspect of to-be-preserved content and the contexts within which preservation is undertaken. Consequently, means are required for the identification, documentation and association of those properties of data, representation and management mechanisms that in combination lend value, facilitate interaction and influence the preservation process. These properties may be almost limitless in their diversity, but are integral to the establishment of classes of risk exposure, and to the planning and deployment of appropriate preservation strategies. We explore several research objectives over the course of this thesis. Our main objective is the conception of an ontology for risk management of digital collections. Incorporated within this are our aims to survey the contexts within which preservation has been undertaken successfully, to develop an appropriate methodology for risk management, to evaluate existing preservation evaluation approaches and metrics, to structure best-practice knowledge and, lastly, to demonstrate a range of tools that utilise our findings. We describe a mixed methodology that uses interview and survey, extensive content analysis, practical case study, and iterative software and ontology development. We build on a robust foundation: the development of the Digital Repository Audit Method Based on Risk Assessment (DRAMBORA).
We summarise the extent of the challenge facing the digital preservation community (and, by extension, users and creators of digital materials from many disciplines and operational contexts) and present the case for a comprehensive and extensible knowledge base of best practice. These challenges are manifested in the scale of data growth, increasing complexity, and the increasing onus on communities with no formal training to offer assurances of data management and sustainability. Collectively they demand an intuitive and adaptable means of evaluating digital preservation efforts; the need for individuals and organisations to validate the legitimacy of their own efforts is a particular priority. We introduce our approach, based on risk management. Risk is an expression of both the likelihood of a negative outcome and the impact of such an occurrence. We describe how risk management may be considered synonymous with preservation activity: a persistent effort to negate the dangers posed to information availability, usability and sustainability. Risks can be characterised according to associated goals, activities, responsibilities and policies, in terms of both their manifestation and mitigation. They can be deconstructed into atomic units, and responsibility for their resolution delegated appropriately. We continue to describe how the manifestation of risks typically spans an entire organisational environment and how, as the focus of our analysis, risk safeguards against omissions that may occur when pursuing functional, departmental or role-based assessment. We discuss the importance of relating risk factors, through the risks themselves or associated system elements; doing so will yield the preservation best-practice knowledge base that is conspicuously lacking within the international digital preservation community.
We present as research outcomes an encapsulation of preservation practice (and explicitly defined best practice) as a series of case studies, in turn distilled into atomic, related information elements. We conduct our analyses in the formal evaluation of memory institutions in the UK, US and continental Europe. Furthermore, we showcase a series of applications that use the fruits of this research as their intellectual foundation, and we document our results in a range of technical reports and conference and journal articles. We present evidence of preservation approaches and infrastructures from a series of case studies conducted in a range of international preservation environments, then aggregate this into a linked data structure entitled PORRO, an ontology relating preservation repository, object and risk characteristics, intended to support preservation decision making and evaluation. The methodology leading to this ontology is outlined, and lessons are drawn by revisiting legacy studies and exposing the resource and associated applications to evaluation by the digital preservation community.
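The abstract's view of risk, combining the likelihood of a negative outcome with its impact, and delegating responsibility for resolution per risk, can be sketched as a minimal risk register. This is illustrative only: the risk names, scales and owners below are hypothetical and are not taken from the thesis, DRAMBORA or PORRO.

```python
# Illustrative only: a minimal risk-register sketch in the spirit of the
# likelihood-and-impact view of risk described above. All names, scales
# and owners are hypothetical.

from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    likelihood: int  # 1 (rare) .. 5 (almost certain)
    impact: int      # 1 (negligible) .. 5 (catastrophic)
    owner: str       # responsibility for resolution is delegated per risk

    @property
    def severity(self) -> int:
        return self.likelihood * self.impact

register = [
    Risk("Format obsolescence", likelihood=4, impact=4, owner="curation team"),
    Risk("Storage media failure", likelihood=2, impact=5, owner="IT operations"),
    Risk("Loss of staff expertise", likelihood=3, impact=3, owner="management"),
]

# Rank risks so mitigation effort targets the highest severity first.
for r in sorted(register, key=lambda r: r.severity, reverse=True):
    print(f"{r.severity:2d}  {r.name}  ({r.owner})")
```

Ranking by severity is the simplest form of the prioritisation the abstract describes; an ontology-backed approach additionally relates each risk to the goals, activities and policies it threatens.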

Relevance:

30.00%

Publisher:

Abstract:

Background and aims: Advances in modern medicine have led to improved outcomes after stroke, yet an increased treatment burden has been placed on patients. Treatment burden is the workload of health care for people with chronic illness and the impact that this has on functioning and well-being; those with comorbidities are likely to be particularly burdened, and excessive treatment burden can negatively affect outcomes. Individuals are likely to differ in their ability to manage health problems and follow treatments, defined as patient capacity. The aim of this thesis was to explore the experience of treatment burden for people who have had a stroke and the factors that influence patient capacity. Methods: There were four phases of research. 1) A systematic review of the qualitative literature that explored the experience of treatment burden for those with stroke; data were analysed using framework synthesis, underpinned by Normalisation Process Theory (NPT). 2) A cross-sectional study of 1,424,378 participants aged >18 years, demographically representative of the Scottish population; binary logistic regression was used to analyse the relationship between stroke and the presence of comorbidities and prescribed medications. 3) Interviews with twenty-nine individuals with stroke, fifteen analysed by framework analysis underpinned by NPT and fourteen by thematic analysis, exploring the experience of treatment burden in depth along with the factors that influence patient capacity. 4) Integration of the findings into a conceptual model of treatment burden and patient capacity in stroke. Results: Phase 1) A taxonomy of treatment burden in stroke was created, identifying the following broad areas: making sense of stroke management and planning care; interacting with others, including health professionals, family and other stroke patients; enacting management strategies; and reflecting on management.
Phase 2) 35,690 people (2.5%) had a diagnosis of stroke, and of the 39 comorbidities examined, 35 were significantly more common in those with stroke. The proportion of those with stroke who had >1 additional morbidity present (94.2%) was almost twice that of controls (48%) (odds ratio (OR) adjusted for age, gender and socioeconomic deprivation: 5.18; 95% confidence interval (CI): 4.95-5.43), and 34.5% had 4-6 comorbidities compared with 7.2% of controls (OR 8.59; 95% CI 8.17-9.04). In the stroke group, 12.6% of people had a record of >11 repeat prescriptions compared with only 1.5% of the control group (OR adjusted for age, gender, deprivation and morbidity count: 15.84; 95% CI 14.86-16.88). Phase 3) The taxonomy of treatment burden from Phase 1 was verified and expanded, and treatment burdens were additionally identified as arising either from the workload of healthcare or from the endurance of care deficiencies. A taxonomy of patient capacity was created, identifying six factors that influence patient capacity: personal attributes and skills; physical and cognitive abilities; support network; financial status; life workload; and environment. A conceptual model of treatment burden was created: healthcare workload and the presence of care deficiencies can influence, and be influenced by, patient capacity, while the quality and configuration of health and social care services influence healthcare workload, care deficiencies and patient capacity. Conclusions: This thesis provides important insights into the considerable treatment burden experienced by people who have had a stroke and the factors that affect their capacity to manage their health. Multimorbidity and polypharmacy are common, and present at high levels, in those with stroke. The findings have important implications for the design of clinical guidelines and healthcare delivery: for example, co-ordination of care should be improved, shared decision-making enhanced, and patients better supported following discharge from hospital.
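The odds ratios reported above come from logistic regression adjusted for covariates such as age, gender and deprivation. As a simpler illustration of the underlying quantity, the sketch below computes an unadjusted odds ratio and a Woolf-method 95% confidence interval from a 2x2 table; the counts used are hypothetical, not the study data, so the result deliberately differs from the adjusted estimates in the abstract.

```python
# Illustrative sketch: unadjusted odds ratio and 95% CI from a 2x2 table.
# The thesis estimates adjusted ORs via binary logistic regression; the
# counts below are hypothetical and only demonstrate the arithmetic.

import math

def odds_ratio_ci(exposed_cases, exposed_noncases,
                  control_cases, control_noncases, z=1.96):
    """Cross-product odds ratio with a Woolf (log-scale) confidence interval."""
    or_ = (exposed_cases * control_noncases) / (exposed_noncases * control_cases)
    # Standard error of log(OR): sqrt of the summed reciprocal cell counts.
    se = math.sqrt(1 / exposed_cases + 1 / exposed_noncases
                   + 1 / control_cases + 1 / control_noncases)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: comorbidity present/absent in a stroke group
# (942 vs 58) and a control group (480 vs 520).
or_, lo, hi = odds_ratio_ci(942, 58, 480, 520)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Adjustment for confounders narrows or shifts such estimates, which is why the regression-adjusted ORs in the abstract (e.g. 5.18) are smaller than a naive cross-product on raw proportions would suggest.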