919 results for Process control -- Statistical methods


Relevance: 100.00%

Abstract:

Identifying crash “hotspots”, “blackspots”, “sites with promise”, or “high risk” locations is standard practice in departments of transportation throughout the US. The literature is replete with the development and discussion of statistical methods for hotspot identification (HSID). Theoretical derivations and empirical studies have been used to weigh the benefits of various HSID methods; however, only a small number of studies have used controlled experiments to systematically assess them. Using experimentally derived simulated data—argued to be superior to empirical data for this purpose—three hotspot identification methods observed in practice are evaluated: simple ranking, confidence interval, and Empirical Bayes. With simulated data, sites with promise are known a priori, in contrast to empirical data, where high risk sites are not known with certainty. To conduct the evaluation, properties of observed crash data are used to generate simulated crash frequency distributions at hypothetical sites. A variety of factors are manipulated to simulate a host of ‘real world’ conditions. Various levels of confidence are explored, and false positives (identifying a safe site as high risk) and false negatives (identifying a high risk site as safe) are compared across methods. Finally, the effect of crash history duration on the three HSID approaches is assessed. The results illustrate that the Empirical Bayes technique significantly outperforms the ranking and confidence interval techniques (with certain caveats). As found by others, false positives and false negatives are inversely related. Three years of crash history appears, in general, to be an appropriate duration.
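As a rough illustration of the experimental setup the abstract describes, the sketch below simulates crash counts at sites whose true risk is known a priori and scores the three HSID rules by false positives and negatives. The safety performance function, dispersion parameter, thresholds and all numbers are assumptions for this toy example, not values from the study.

```python
# Toy comparison of three hotspot identification (HSID) rules on simulated
# data where the truly high-risk sites are known. All parameters below are
# illustrative assumptions.
import math
import random

random.seed(7)
YEARS, N = 3, 200

def poisson(lam):
    # Knuth's method; adequate for the small means used here.
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Each site has an expected rate mu (crashes/year) from an assumed safety
# performance function; the first 10% of sites are truly high risk.
mu = [random.uniform(0.5, 3.0) for _ in range(N)]
high_risk = [i < N // 10 for i in range(N)]
true_rate = [m * (3.0 if h else 1.0) for m, h in zip(mu, high_risk)]
counts = [poisson(r * YEARS) for r in true_rate]

def evaluate(flags):
    fp = sum(f and not h for f, h in zip(flags, high_risk))
    fn = sum(h and not f for f, h in zip(flags, high_risk))
    return fp, fn

# 1) Simple ranking: flag the top 10% of sites by raw count.
cut = sorted(counts, reverse=True)[N // 10 - 1]
rank_flags = [c >= cut for c in counts]

# 2) Confidence interval: flag sites whose count exceeds an upper
#    ~95% Poisson bound around the SPF prediction.
ci_flags = [c > m * YEARS + 1.645 * math.sqrt(m * YEARS)
            for c, m in zip(counts, mu)]

# 3) Empirical Bayes: shrink each count toward the SPF prediction, then
#    flag sites whose EB estimate is well above expectation.
phi = 2.0  # assumed overdispersion parameter
eb_flags = []
for c, m in zip(counts, mu):
    w = 1.0 / (1.0 + m * YEARS / phi)      # weight on the SPF prediction
    eb = w * m * YEARS + (1 - w) * c
    eb_flags.append(eb > 1.5 * m * YEARS)  # illustrative threshold

for name, flags in [("ranking", rank_flags), ("CI", ci_flags),
                    ("EB", eb_flags)]:
    fp, fn = evaluate(flags)
    print(f"{name:8s} false positives={fp:3d} false negatives={fn:3d}")
```

Because the truth is known in simulation, the false positive/negative trade-off can be read off directly, which is exactly what empirical data cannot provide.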

Relevance: 100.00%

Abstract:

Plant biosecurity requires statistical tools to interpret field surveillance data in order to manage pest incursions that threaten crop production and trade. Ultimately, management decisions need to be based on the probability that an area is infested or free of a pest. Current informal approaches to delimiting pest extent rely upon expert ecological interpretation of presence/absence data over space and time. Hierarchical Bayesian models provide a cohesive statistical framework that can formally integrate the available information on both pest ecology and data. The overarching method involves constructing an observation model for the surveillance data, conditional on the hidden extent of the pest and uncertain detection sensitivity. The extent of the pest is then modelled as a dynamic invasion process that includes uncertainty in ecological parameters. Modelling approaches to assimilate this information are explored through case studies on spiralling whitefly, Aleurodicus dispersus, and red banded mango caterpillar, Deanolis sublimbalis. Markov chain Monte Carlo simulation is used to estimate the probable extent of pests, given the observation and process model conditioned by surveillance data. Statistical methods, based on time-to-event models, are developed to apply hierarchical Bayesian models to early detection programs and to demonstrate area freedom from pests. The value of early detection surveillance programs is demonstrated through an application to interpret surveillance data for exotic plant pests with uncertain spread rates. The model suggests that typical early detection programs provide a moderate reduction in the probability of an area being infested but a dramatic reduction in the expected area of incursions at a given time. Estimates of spiralling whitefly extent are examined at local, district and state-wide scales.
The local model estimates the rate of natural spread and the influence of host architecture, host suitability and inspector efficiency. These parameter estimates can support the development of robust surveillance programs. Hierarchical Bayesian models for the human-mediated spread of spiralling whitefly are developed for the colonisation of discrete cells connected by a modified gravity model. By estimating dispersal parameters, the model can be used to predict the extent of the pest over time. An extended model predicts the climate restricted distribution of the pest in Queensland. These novel human-mediated movement models are well suited to demonstrating area freedom at coarse spatio-temporal scales. At finer scales, and in the presence of ecological complexity, exploratory models are developed to investigate the capacity for surveillance information to estimate the extent of red banded mango caterpillar. It is apparent that excessive uncertainty about observation and ecological parameters can impose limits on inference at the scales required for effective management of response programs. The thesis contributes novel statistical approaches to estimating the extent of pests and develops applications to assist decision-making across a range of plant biosecurity surveillance activities. Hierarchical Bayesian modelling is demonstrated as both a useful analytical tool for estimating pest extent and a natural investigative paradigm for developing and focussing biosecurity programs.
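The core Bayesian update behind statements of area freedom can be sketched in a few lines: the probability that an area is infested after a series of negative surveys with imperfect detection. This is a generic illustration; the prior and sensitivity values below are assumed, and the thesis's hierarchical models are far richer.

```python
# Minimal Bayesian update for "area freedom": P(infested | all surveys
# negative), assuming independent surveys with a fixed detection
# sensitivity. The 0.2 prior and 0.6 sensitivity are assumptions.
def posterior_infested(prior, sensitivity, n_negative_surveys):
    """Posterior probability of infestation after n negative surveys."""
    miss = (1.0 - sensitivity) ** n_negative_surveys
    return prior * miss / (prior * miss + (1.0 - prior))

# Each extra negative survey shrinks the infestation probability.
for n in (0, 1, 3, 6):
    print(n, round(posterior_infested(0.2, 0.6, n), 4))
```

Hierarchical models extend this by letting the prior evolve as an invasion process and by treating the sensitivity itself as uncertain.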

Relevance: 100.00%

Abstract:

The elastic task model, a significant development in the scheduling of real-time control tasks, provides a mechanism for flexible workload management in uncertain environments. It specifies how to adjust control periods to satisfy workload constraints. However, it is not directly linked to quality-of-control (QoC) management, the ultimate goal of a control system, and consequently does not indicate how to make the best use of system resources to maximize the QoC improvement. To fill this gap, a new feedback scheduling framework, which we refer to as QoC elastic scheduling, is developed in this paper for real-time process control systems. It addresses the QoC directly by embedding both QoC management and workload adaptation into a constrained optimization problem. The resulting solution for period adjustment is a closed form expressed in QoC measurements, enabling closed-loop feedback of the QoC to the task scheduler. Whenever the QoC elastic scheduler is activated, it improves the QoC as much as possible while still meeting the system constraints. Examples are given to demonstrate the effectiveness of QoC elastic scheduling.
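To make the closed-form flavour of such period adjustment concrete, the sketch below solves a simplified stand-in problem: QoC degradation is approximated as linear in each task's period, and total degradation is minimised subject to a CPU utilisation bound, which by a Lagrangian argument gives periods proportional to sqrt(C_i / a_i). This is an illustrative analogue, not the paper's actual QoC elastic scheduling law, and the task set is hypothetical.

```python
# Closed-form period assignment for a simplified stand-in problem:
# minimise sum(a_i * T_i) subject to sum(C_i / T_i) <= U.
# Lagrangian solution: T_i = sqrt(C_i / a_i) * S / U, S = sum(sqrt(a_j*C_j)).
import math

def assign_periods(C, a, U):
    """C: execution times, a: QoC sensitivity weights, U: utilisation bound."""
    S = sum(math.sqrt(ai * ci) for ai, ci in zip(a, C))
    return [math.sqrt(ci / ai) * S / U for ci, ai in zip(C, a)]

C = [2.0, 1.0, 4.0]  # ms of CPU per job (assumed)
a = [3.0, 1.0, 0.5]  # QoC degradation per unit of period (assumed)
T = assign_periods(C, a, U=0.8)
print("periods:", [round(t, 2) for t in T])
print("utilisation:", round(sum(c / t for c, t in zip(C, T)), 6))
```

The bound is met with equality by construction, and the most QoC-sensitive loop receives the shortest period, which is the qualitative behaviour a QoC-driven scheduler should exhibit.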

Relevance: 100.00%

Abstract:

In 2008, a three-year pilot ‘pay for performance’ (P4P) program, known as the ‘Clinical Practice Improvement Payment’ (CPIP), was introduced into Queensland Health (QHealth). QHealth is a large public health sector provider of acute, community, and public health services in Queensland, Australia. The organisation has recently embarked on a significant reform agenda, including a review of existing funding arrangements (Duckett et al., 2008). Partly in response to this reform agenda, a casemix funding model has been implemented to reconnect health care funding with outcomes. CPIP was conceptualised as a performance-based scheme that rewarded quality with financial incentives. This is the first time such a scheme has been implemented in the public health sector in Australia with a focus on rewarding quality, and it is unique in its large state-wide scope, covering 15 Districts. CPIP initially targeted five acute and community clinical areas: Mental Health, Discharge Medication, Emergency Department, Chronic Obstructive Pulmonary Disease, and Stroke. The CPIP scheme was designed around key concepts, including the identification of clinical indicators that met the set criteria of high disease burden; a well-defined single diagnostic group or intervention; significant variations in clinical outcomes and/or practices; a good evidence base; and clinician control and support (Ward, Daniels, Walker & Duckett, 2007). This evaluative research targeted Phase One of implementation of the CPIP scheme, from January 2008 to March 2009. A formative evaluation utilising a mixed methodology and complementarity analysis was undertaken. The research involved three research questions and aimed to determine the knowledge, understanding, and attitudes of clinicians; identify improvements to the design, administration, and monitoring of CPIP; and determine the financial and economic costs of the scheme.
Three key studies were undertaken to address the research questions. First, a survey of clinicians examined their levels of knowledge and understanding and their attitudes to the scheme. Second, the study applied Statistical Process Control (SPC) to the process indicators to assess whether this enhanced the scheme, and a third study comprised a simple economic cost analysis. The CPIP survey elicited 192 clinician respondents. Over 70% of these respondents were supportive of the continuation of the CPIP scheme. This finding was also supported by the results of a quantitative attitude survey, which identified positive attitudes in six of the seven domains, including impact, awareness and understanding, and clinical relevance, all scored positively across the combined respondent group. SPC as a trending tool may play an important role in the early identification of indicator weakness for the CPIP scheme. This evaluative research study supports a previously identified need in the literature for a phased introduction of pay-for-performance (P4P) type programs. It further highlights the value of undertaking a formal risk assessment of clinician, management, and systemic levels of literacy and competency with the measurement and monitoring of quality prior to a phased implementation. This phasing can then be guided by a P4P Design Variable Matrix, which provides a selection of program design options such as indicator targets and payment mechanisms. It became evident that a clear process is required to standardise how clinical indicators evolve over time and to direct movement towards more rigorous pay-for-performance targets and the development of an optimal funding model. Use of this matrix will enable the scheme to mature and build the literacy and competency of clinicians and the organisation as implementation progresses.
Furthermore, the research identified that CPIP created a spotlight on clinical indicators, and incentive payments of over $5 million, from a potential $10 million, were secured across the five clinical areas in the first 15 months of the scheme. This indicates that quality was rewarded in the new QHealth funding model and that, despite issues identified with the payment mechanism, funding was distributed. The economic model used identified a relatively low cost of reporting (under $8,000) as opposed to funds secured of over $300,000 for mental health, as an example. Movement to a full cost-effectiveness study of CPIP is supported. Overall, the introduction of the CPIP scheme into QHealth has been a positive and effective strategy for engaging clinicians in quality and has been the catalyst for the identification and monitoring of valuable clinical process indicators. This research has highlighted that clinicians are supportive of the scheme in general; however, there are some significant risks, including the functioning of the CPIP payment mechanism. Given clinician support for the use of a pay-for-performance methodology in QHealth, the CPIP scheme has the potential to be a powerful addition to a multi-faceted suite of quality improvement initiatives within QHealth.
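The kind of SPC trending applied to the process indicators can be illustrated with a p-chart: monthly compliance proportions are compared against three-sigma limits around the pooled rate, and months falling outside the limits signal indicator weakness. The monthly audit figures below are invented for illustration and are not CPIP data.

```python
# p-chart sketch for a clinical process indicator: flag months whose
# compliance proportion falls outside 3-sigma limits around the pooled rate.
import math

def p_chart(successes, samples):
    """Return the centre line, per-month (lcl, ucl) and out-of-control flags."""
    pbar = sum(successes) / sum(samples)
    limits, flags = [], []
    for x, n in zip(successes, samples):
        sigma = math.sqrt(pbar * (1 - pbar) / n)
        lcl, ucl = max(0.0, pbar - 3 * sigma), min(1.0, pbar + 3 * sigma)
        limits.append((lcl, ucl))
        p = x / n
        flags.append(p < lcl or p > ucl)
    return pbar, limits, flags

# Hypothetical monthly discharge-medication audits: cases and compliant cases.
n = [120, 118, 125, 122, 119, 121]
x = [102, 100, 108, 104, 70, 103]  # month 5 is a deliberate dip
pbar, limits, flags = p_chart(x, n)
print("centre line:", round(pbar, 3))
print("out-of-control months:", [i + 1 for i, f in enumerate(flags) if f])
```

On this made-up series only the deliberately degraded month is flagged, which is the early-warning behaviour the evaluation attributes to SPC as a trending tool.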

Relevance: 100.00%

Abstract:

Genetic research of complex diseases is a challenging but exciting area of research. Early development of the field was limited until the completion of the Human Genome and HapMap projects, along with the reduction in the cost of genotyping, which paved the way for understanding the genetic composition of complex diseases. In this thesis, we focus on statistical methods for two aspects of genetic research: phenotype definition for diseases with complex etiology, and methods for identifying potentially associated Single Nucleotide Polymorphisms (SNPs) and SNP-SNP interactions. With regard to phenotype definition for diseases with complex etiology, we first investigated the effects of different statistical phenotyping approaches on subsequent analysis. In light of the findings, and the difficulties in validating the estimated phenotype, we propose two different methods for reconciling phenotypes of different models, using Bayesian model averaging as a coherent mechanism for accounting for model uncertainty. In the second part of the thesis, the focus turns to methods for identifying associated SNPs and SNP interactions. We review the use of Bayesian logistic regression with variable selection for SNP identification and extend the model to detect interaction effects in population-based case-control studies. In this part of the study, we also develop a machine learning algorithm to cope with large-scale data analysis, namely modified Logic Regression with Genetic Program (MLR-GEP), which is then compared with the Bayesian model, Random Forests and other variants of logic regression.
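Bayesian model averaging of the sort used to reconcile competing phenotype models can be sketched with the standard BIC approximation, where each model's posterior probability is proportional to exp(-BIC/2) and quantities of interest are averaged under those weights. The BIC values and the averaged quantity below are hypothetical.

```python
# BIC-approximated Bayesian model averaging: posterior model weights
# proportional to exp(-BIC/2), then a weight-averaged estimate.
import math

def bma_weights(bics):
    """Normalised posterior model probabilities from BIC values."""
    best = min(bics)  # subtract the best BIC for numerical stability
    raw = [math.exp(-(b - best) / 2.0) for b in bics]
    total = sum(raw)
    return [r / total for r in raw]

bics = [1012.4, 1010.1, 1015.8]  # hypothetical competing phenotype models
w = bma_weights(bics)
print("weights:", [round(x, 3) for x in w])

# Model-averaged estimate of some quantity theta reported by each model:
theta = [0.32, 0.41, 0.28]
print("averaged:", round(sum(wi * t for wi, t in zip(w, theta)), 3))
```

Averaging over models, rather than conditioning on a single "best" phenotype model, is what carries the model uncertainty through to the final inference.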

Relevance: 100.00%

Abstract:

Despite promising benefits and advantages, there are reports of failures and low realisation of benefits in Enterprise System (ES) initiatives. Among the research on the factors that influence ES success, there is a dearth of studies on the knowledge implications of multiple end-user groups using the same ES application. An ES facilitates the work of several user groups, ranging from strategic management, management, to operational staff, all using the same system for multiple objectives. Given the fundamental characteristics of ES – integration of modules, business process views, and aspects of information transparency – it is necessary that all frequent end-users share a reasonable amount of common knowledge and integrate their knowledge to yield new knowledge. Recent literature on ES implementation highlights the importance of Knowledge Integration (KI) for implementation success. Unfortunately, the importance of KI is often overlooked and little about the role of KI in ES success is known. Many organisations do not achieve the potential benefits from their ES investment because they do not consider the need or their ability to integrate their employees’ knowledge. This study is designed to improve our understanding of the influence of KI among ES end-users on operational ES success. The three objectives of the study are: (I) to identify and validate the antecedents of KI effectiveness, (II) to investigate the impact of KI effectiveness on the goodness of individuals’ ES-knowledge base, and (III) to examine the impact of the goodness of individuals’ ES-knowledge base on the operational ES success. For this purpose, we employ the KI factors identified by Grant (1996) and an IS-impact measurement model from the work of Gable et al. (2008) to examine ES success. The study derives its findings from data gathered from six Malaysian companies in order to obtain the three-fold goal of this thesis as outlined above. 
The relationships between the antecedents of KI effectiveness and its consequences are tested using 188 responses to a survey representing the views of management and operational employment cohorts. Using statistical methods, we confirm three antecedents of KI effectiveness and validate their consequences for ES success. The findings demonstrate a statistically significant positive impact of KI effectiveness on ES success, with KI effectiveness contributing almost one-third of ES success. This research makes a number of contributions to the understanding of the influence of KI on ES success. First, based on empirical work using a complete nomological net model, the role of KI effectiveness in ES success is evidenced. Second, the model provides a theoretical lens for a more comprehensive understanding of the impact of KI on the level of ES success. Third, restructuring the dimensions of the knowledge-based theory to fit the context of ES extends its applicability and generalisability to contemporary Information Systems. Fourth, the study develops and validates measures for the antecedents of KI effectiveness. Fifth, the study demonstrates the statistically significant positive influence of the goodness of KI on ES success. From a practical viewpoint, this study emphasises the importance of KI effectiveness as a direct antecedent of ES success. Practical lessons can be drawn from this work to empirically identify the critical factors among the antecedents of KI effectiveness that should be given attention.

Relevance: 100.00%

Abstract:

Though the value of a process-centred view for the understanding and (re-)design of corporations has been widely accepted, our understanding of the research process in Information Systems (IS) remains superficial. A process-centred view on IS research considers the conduct of a research project as a sequence of activities involving resources, data and research artifacts. As such, it helps to reflect on more effective ways to conduct IS research, to consolidate and compare diverse practices and to complement the focus on research methodologies with research project practices. This paper takes a first step towards the discipline of ‘Research Process Management’ by exploring the features of research processes and by presenting a preliminary approach for research process design that can facilitate modelling IS research. The case study method and the design science research method are used as examples to demonstrate the potential of such reference research process models.

Relevance: 100.00%

Abstract:

Background The four principles of Beauchamp and Childress - autonomy, non-maleficence, beneficence and justice - have been extremely influential in the field of medical ethics, and are fundamental for understanding the current approach to ethical assessment in health care. This study tests whether these principles can be quantitatively measured on an individual level, and subsequently whether they are used in the decision-making process when individuals are faced with ethical dilemmas. Methods The Analytic Hierarchy Process was used as a tool for the measurement of the principles. Four scenarios involving conflicts between the medical ethical principles were presented to participants, who made judgements about the ethicality of the action in each scenario and their intention to act in the same manner if they were in that situation. Results Individual preferences for these medical ethical principles can be measured using the Analytic Hierarchy Process, which provides a useful tool for highlighting individual medical ethical values. On average, individuals show a significant preference for non-maleficence over the other principles; however, and perhaps counter-intuitively, this preference does not seem to relate to applied ethical judgements in specific ethical dilemmas. Conclusions People state that they value these medical ethical principles, but they do not actually seem to use them directly in the decision-making process. The reasons for this are explained through the lack of a behavioural model accounting for the relevant situational factors not captured by the principles. The limitations of the principles in predicting ethical decision making are discussed.
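The measurement step of the Analytic Hierarchy Process can be sketched as follows: a participant's pairwise comparisons of the four principles (on Saaty's 1-9 scale) form a reciprocal matrix, whose principal eigenvector gives the priority weights. The comparison values below are hypothetical, not data from the study.

```python
# AHP priority weights from a pairwise comparison matrix, via power
# iteration to the principal eigenvector. The matrix entries are
# hypothetical judgements on Saaty's 1-9 scale.

# Order: autonomy, non-maleficence, beneficence, justice.
A = [
    [1.0, 1 / 3, 1.0, 2.0],
    [3.0, 1.0, 2.0, 4.0],
    [1.0, 1 / 2, 1.0, 2.0],
    [1 / 2, 1 / 4, 1 / 2, 1.0],
]

def ahp_weights(M, iters=100):
    """Normalised principal eigenvector of a positive reciprocal matrix."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

w = ahp_weights(A)
print("priority weights:", [round(x, 3) for x in w])
```

For this made-up respondent, non-maleficence receives the largest weight, mirroring the average preference the study reports.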

Relevance: 100.00%

Abstract:

A significant number of patients diagnosed with primary brain tumours report unmet information needs. Using concept mapping methodology, this study aimed to identify strategies for improving information provision, and to describe factors that health professionals understood to influence their provision of information to patients with brain tumours and their families. Concept mapping is a mixed methods approach that uses statistical methods to represent participants’ perceived relationships between elements as conceptual maps. These maps, and results of associated data collection and analyses, are used to extract concepts involved in information provision to these patients. Thirty health professionals working across a range of neuro-oncology roles and settings participated in the concept mapping process. Participants rated a care coordinator as the most important strategy for improving brain tumour care, with psychological support as a whole rated as the most important element of care. Five major themes were identified as facilitating information provision: health professionals’ communication skills, style and attitudes; patients’ needs and preferences; perceptions of patients’ need for protection and initiative; rapport and continuity between patients and health professionals; and the nature of the health care system. Overall, health professionals conceptualised information provision as ‘individualised’, dependent on these interconnected personal and environmental factors.

Relevance: 100.00%

Abstract:

A model has been developed to track the flow of cane constituents through the milling process. While previous models have tracked the flow of fibre, brix and water through the process, this model tracks the soluble and insoluble solid cane components, using modelling theory and experimental data, assisting in further understanding the flow of constituents into mixed juice and final bagasse. The work provided an opportunity to understand the factors that affect the distribution of the cane constituents in juice and bagasse. Application of the model should lead to improvements in the overall performance of the milling train.
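The constituent-tracking idea can be sketched as a mass balance over a single milling unit: each constituent in the feed splits between the juice and bagasse streams, with fibre assumed here to stay entirely in the bagasse. The flows and extraction fractions below are illustrative assumptions, not the model's fitted values.

```python
# Schematic constituent mass balance over one milling unit. Flows are in
# t/h; the split fractions are assumed for illustration.
def mill_unit(feed, brix_extraction, water_extraction):
    """Split a feed stream (dict of constituent flows) into juice and bagasse."""
    juice = {
        "fibre": 0.0,  # simplifying assumption: no fibre reports to juice
        "brix": feed["brix"] * brix_extraction,
        "water": feed["water"] * water_extraction,
    }
    bagasse = {k: feed[k] - juice[k] for k in feed}
    return juice, bagasse

feed = {"fibre": 30.0, "brix": 28.0, "water": 142.0}  # assumed cane feed
juice, bagasse = mill_unit(feed, brix_extraction=0.72, water_extraction=0.65)
print("juice  :", juice)
print("bagasse:", bagasse)

# Mass is conserved constituent by constituent:
for k in feed:
    assert abs(feed[k] - juice[k] - bagasse[k]) < 1e-9
```

Chaining several such units (with imbibition water added between them) is the basic structure of a milling-train constituent model.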

Relevance: 100.00%

Abstract:

Background On-site wastewater treatment system (OWTS) siting, design and management have traditionally been based on site-specific conditions, with little regard for the surrounding environment or the cumulative effect of other systems in the environment. The general approach has been to apply the same framework of standards and regulations to all sites equally, regardless of the sensitivity, or lack thereof, of the receiving environment. Consequently, this has led to the continuing poor performance and failure of on-site systems, with environmental and public health consequences. As a result, there is increasing realisation that more scientifically robust evaluations of site assessment and the underlying ground conditions are needed. Risk-based approaches to on-site system siting, design and management are considered the most appropriate means of improving the current standards and codes for on-site wastewater treatment systems. The Project Research for this project was undertaken within the Gold Coast City Council region, the major focus being the semi-urban, rural residential and hinterland areas of the city that are not serviced by centralised treatment systems. The Gold Coast has over 15,000 on-site systems in use, approximately 66% of which are common septic tank-subsurface dispersal systems. A recent study evaluating the performance of these systems within the Gold Coast area showed that approximately 90% were not meeting the specified guidelines for effluent treatment and dispersal. The main focus of this research was to incorporate strong scientific knowledge into an integrated risk assessment process, allowing suitable management practices to be set in place to mitigate the inherent risks. To achieve this, research was undertaken focusing on three main aspects of the performance and management of OWTS. Firstly, an investigation into the suitability of soil for providing appropriate effluent renovation was conducted.
This involved detailed soil investigations, laboratory analysis and the use of multivariate statistical methods for analysing soil information. The outcomes of these investigations were developed into a framework for assessing soil suitability for effluent renovation, which formed the basis for the assessment of OWTS siting and design risks employed in the developed risk framework. Secondly, an assessment of the environmental and public health risks was performed, specifically related to the release of contaminants from OWTS. This involved detailed groundwater and surface water sampling and analysis to assess the current and potential risks of contamination throughout the Gold Coast region. Additionally, the assessment of public health risk incorporated the use of bacterial source tracking methods to identify the different sources of faecal contamination within monitored regions. Antibiotic resistance pattern analysis was utilised to determine the extent of human faecal contamination, with the outcomes utilised to provide a more indicative public health assessment. Finally, the outcomes of both the soil suitability assessment and the ground and surface water monitoring were utilised in the development of the integrated risk framework. The research outcomes achieved through this project enabled the primary research aims and objectives to be accomplished. This in turn will enable Gold Coast City Council to provide more appropriate assessment and management guidelines based on robust scientific knowledge, which will ultimately ensure that the potential environmental and public health impacts resulting from on-site wastewater treatment are minimised. As part of the implementation of suitable management strategies, a critical point monitoring program (CPM) was formulated. This entailed the identification of the key critical parameters that contribute to the characterised risks at monitored locations within the study area. 
The CPM will allow more direct procedures to be implemented, targeting the specific hazards at sensitive areas throughout the Gold Coast region.

Relevance: 100.00%

Abstract:

Currently, 1.3 billion tonnes of food is lost annually due to a lack of proper processing and preservation methods. Drying is one of the easiest and oldest methods of food processing and can contribute to reducing these huge losses, combating hunger and promoting food security. Drying increases shelf life, reduces the weight and volume of food, thus minimising packing, storage and transportation costs, and enables storage of food under ambient conditions. However, drying is a complex process which involves coupled heat and mass transfer together with physical property changes and shrinkage of the food material. Modelling of this process is essential to optimise the drying kinetics and improve the energy efficiency of the process. Since material properties vary with moisture content, models should not assume constant material properties or a constant diffusion coefficient. The objective of this paper is to develop a multiphysics-based mathematical model to simulate coupled heat and mass transfer during convective drying of fruit, considering variable material properties. The model can be used to predict the temperature and moisture distribution inside the food during drying. The effects of different drying air temperatures and velocities on drying kinetics are demonstrated. The governing equations of heat and mass transfer were solved with COMSOL Multiphysics 4.3.
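A stripped-down, one-dimensional analogue of the moisture side of such a model can be sketched as Fickian diffusion through a slab with a moisture-dependent diffusivity, solved by explicit finite differences. Everything below (geometry, diffusivity law, boundary moisture) is an assumption for illustration; the paper's model additionally couples heat transfer and property change and is solved in COMSOL Multiphysics.

```python
# 1-D moisture diffusion in a drying slab with moisture-dependent
# diffusivity, explicit finite differences. All parameters are assumed.

L = 0.01   # slab half-thickness, m
NX = 21    # grid points
DX = L / (NX - 1)
DT = 0.05  # s; small enough for explicit stability here
M_AIR = 0.1  # equilibrium moisture at the drying surface (dry basis)

def diffusivity(m):
    # Assumed law: drier material diffuses moisture more slowly.
    return 1e-9 * (0.2 + 0.8 * m)

def step(m):
    new = m[:]
    for i in range(1, NX - 1):
        D = diffusivity(m[i])
        new[i] = m[i] + DT * D * (m[i - 1] - 2 * m[i] + m[i + 1]) / DX**2
    new[0] = new[1]    # sealed centre (symmetry plane)
    new[-1] = M_AIR    # surface in equilibrium with the drying air
    return new

m = [3.0] * NX         # initial moisture content, dry basis (assumed)
m[-1] = M_AIR
for _ in range(20000): # ~17 minutes of drying
    m = step(m)
print("average moisture:", round(sum(m) / NX, 3))
```

Making the diffusivity depend on local moisture is exactly the "variable material properties" point the abstract makes: with a constant D the predicted drying curve would be noticeably different.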