75 results for Process control -- Data processing


Relevance: 100.00%

Abstract:

Analytical expressions are derived for the mean and variance of estimates of the bispectrum of a real time series, assuming a cosinusoidal model. The effects of spectral leakage, inherent in the discrete Fourier transform operation when the modes present in the signal have a nonintegral number of wavelengths in the record, are included in the analysis. A single phase-coupled triad of modes can cause the bispectrum to have a nonzero mean value over the entire region of computation owing to leakage. The variance of bispectral estimates in the presence of leakage has contributions from individual modes and from triads of phase-coupled modes. Time-domain windowing reduces the leakage. The theoretical expressions for the mean and variance of bispectral estimates are derived in terms of a function dependent on an arbitrary symmetric time-domain window applied to the record, the number of data points, and the statistics of the phase coupling among triads of modes. The theoretical results are verified by numerical simulations for simple test cases and applied to laboratory data to examine phase coupling in a hypothesis-testing framework.
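As a minimal sketch of the estimation problem the abstract describes (not the paper's derivation), the following computes a direct segment-averaged bispectrum estimate with an optional symmetric time-domain window, and shows that a phase-coupled triad produces a peak at its frequency pair; all signal parameters here are invented for illustration.

```python
import numpy as np

def bispectrum(x, nseg, window=True):
    """Average B(k, l) = X[k] X[l] conj(X[k+l]) over data segments."""
    seg_len = len(x) // nseg
    w = np.hanning(seg_len) if window else np.ones(seg_len)
    nf = seg_len // 4                       # keep k + l inside the spectrum
    idx = np.add.outer(np.arange(nf), np.arange(nf))
    B = np.zeros((nf, nf), dtype=complex)
    for s in range(nseg):
        seg = x[s * seg_len:(s + 1) * seg_len]
        X = np.fft.fft(w * (seg - seg.mean()))
        B += np.outer(X[:nf], X[:nf]) * np.conj(X[idx])
    return B / nseg

# A phase-coupled triad (f3 = f1 + f2, phi3 = phi1 + phi2) should give a
# bispectrum peak at (k1, k2); frequencies sit on integer bins of a
# 512-point segment, so leakage is minimal in this toy case.
rng = np.random.default_rng(0)
t = np.arange(4096)
k1, k2 = 24, 56
f1, f2 = k1 / 512, k2 / 512
p1, p2 = rng.uniform(0, 2 * np.pi, 2)
x = (np.cos(2 * np.pi * f1 * t + p1)
     + np.cos(2 * np.pi * f2 * t + p2)
     + np.cos(2 * np.pi * (f1 + f2) * t + p1 + p2)   # coupled triad
     + 0.1 * rng.standard_normal(t.size))
B = bispectrum(x, nseg=8)
```

Shifting the frequencies off integer bins (a nonintegral number of wavelengths per segment) spreads energy across the whole bifrequency plane, which is the leakage effect the paper analyses.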

Relevance: 100.00%

Abstract:

This paper investigates the relationship between traffic conditions and crash occurrence likelihood (COL) using the I-880 data. To remedy the data limitations and methodological shortcomings of previous studies, a multiresolution data processing method is proposed and implemented, upon which binary logistic models are developed. The major findings of this paper are: 1) traffic conditions have significant impacts on COL at the study site; specifically, COL in a congested (transitioning) traffic flow is about 6 (1.6) times that in a free-flow condition; 2) speed variance alone is not sufficient to capture the impact of traffic dynamics on COL; a traffic chaos indicator that integrates speed, speed variance and flow is proposed and shows promising performance; 3) models based on aggregated data should be interpreted with caution. In general, conclusions obtained from such models should not be generalized to individual vehicles (drivers) without further evidence from high-resolution data, and it is dubious to either claim or refute that "speed kills" on the basis of aggregated data.
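The modelling step can be sketched as follows: a binary logistic model linking a "chaos"-style indicator (one hypothetical combination of speed, speed variance and flow; the paper's exact indicator is not reproduced here) to crash occurrence. The data are synthetic and the fit uses plain gradient ascent to stay self-contained.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Gradient-ascent fit of P(y=1) = sigmoid(X @ w + b)."""
    w = np.zeros(X.shape[1]); b = 0.0
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        g = y - p                       # gradient of the log-likelihood
        w += lr * X.T @ g / len(y)
        b += lr * g.mean()
    return w, b

rng = np.random.default_rng(1)
n = 2000
speed = rng.uniform(20, 70, n)          # mph
speed_var = rng.gamma(2.0, 2.0, n)      # within-interval speed variance
flow = rng.uniform(500, 2000, n)        # veh/h
# one possible chaos indicator: high when variance and flow are high, speed low
chaos = speed_var * flow / speed
z = (chaos - chaos.mean()) / chaos.std()
# simulate crashes with a true coefficient of 1.5 on the indicator
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-(1.5 * z - 2.0)))).astype(float)
w, b = fit_logistic(z[:, None], y)
odds_ratio = np.exp(w[0])   # odds multiplier per 1-sd rise in the indicator
```

The fitted coefficient recovers the positive association between the indicator and crash odds; the per-unit odds ratio is the usual way such logistic results are reported.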

Relevance: 100.00%

Abstract:

A model has been developed to track the flow of cane constituents through the milling process. While previous models have tracked the flow of fibre, brix and water through the process, this model tracks the soluble and insoluble solid cane components, using modelling theory and experimental data to assist in further understanding the flow of constituents into mixed juice and final bagasse. The work provided an opportunity to understand the factors which affect the distribution of the cane constituents between juice and bagasse. Application of the model should lead to improvements in the overall performance of the milling train.
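A toy mass-balance sketch of the idea (not the authors' model): each constituent is split between juice and bagasse at every mill according to an assumed per-mill extraction fraction, and total mass is conserved. The feed composition and extraction values below are invented.

```python
def milling_train(feed, extraction, n_mills=4):
    """feed: constituent -> mass flow (t/h) entering mill 1.
    extraction: constituent -> fraction extracted to juice at each mill."""
    juice = {c: 0.0 for c in feed}
    bagasse = dict(feed)
    for _ in range(n_mills):
        for c in bagasse:
            to_juice = bagasse[c] * extraction[c]
            juice[c] += to_juice            # accumulates into mixed juice
            bagasse[c] -= to_juice          # remainder carries forward
    return juice, bagasse

# hypothetical cane feed: soluble solids (brix), insoluble solids (fibre), water
feed = {"soluble_solids": 15.0, "insoluble_solids": 14.0, "water": 71.0}
extraction = {"soluble_solids": 0.6, "insoluble_solids": 0.02, "water": 0.5}
juice, bagasse = milling_train(feed, extraction)
```

The check that juice plus bagasse equals the feed for every constituent is exactly the constraint such a constituent-tracking model must satisfy at each stage.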

Relevance: 100.00%

Abstract:

Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good-quality products and services is the key factor for an organization, and even for government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now applied widely to improve the quality of products and services in industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. Some of the work in the healthcare area appears to have evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in available datasets and data flows; then a data capturing algorithm using Bayesian decision-making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle.
Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from an industrial context are adapted to monitor the radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, the adoption of a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root-cause analysis within the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time-frame prior to the signal. The approach yields highly informative estimates for change point parameters, since the results are based on probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated. The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, compared with a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change point estimators is then continued for healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes.
In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention that is being monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed models are also considered. The advantages of the Bayesian approach seen in the general context of quality control may also extend to the industrial and business domains where quality monitoring was initially developed.
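The simplest instance of the change-point problem described above, a step change in a Poisson rate, admits an exact posterior by enumeration rather than MCMC; the sketch below (not the thesis's estimator) places conjugate Gamma priors on the before/after rates and computes the posterior of the change time directly. The rates, priors and sample sizes are invented.

```python
import numpy as np
from scipy.special import gammaln

def changepoint_posterior(counts, a=1.0, b=1.0):
    """P(tau | data) for counts ~ Poisson(lam1) up to tau, Poisson(lam2) after,
    with independent Gamma(a, b) priors on lam1 and lam2."""
    n = len(counts)
    logpost = np.full(n - 1, -np.inf)
    for tau in range(1, n):             # change occurs after observation tau
        s1, n1 = counts[:tau].sum(), tau
        s2, n2 = counts[tau:].sum(), n - tau
        # tau-dependent part of a segment's Gamma-Poisson marginal likelihood
        seg = lambda s, m: gammaln(a + s) - (a + s) * np.log(b + m)
        logpost[tau - 1] = seg(s1, n1) + seg(s2, n2)
    logpost -= logpost.max()
    post = np.exp(logpost)
    return post / post.sum()

rng = np.random.default_rng(2)
counts = np.concatenate([rng.poisson(5.0, 60), rng.poisson(9.0, 40)])
post = changepoint_posterior(counts)
tau_hat = int(np.argmax(post)) + 1      # MAP estimate of the change time
```

As in the thesis, the output is a full probability distribution over the change time rather than a point estimate, which is what makes the root-cause search window quantifiable.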

Relevance: 100.00%

Abstract:

Citizen Science projects are initiatives in which members of the general public participate in scientific research and perform or manage research-related tasks such as data collection and/or data annotation. Citizen Science is technologically possible and scientifically significant. However, because the information is gathered from the crowd, data quality is always hard to manage. There are many ways to manage data quality, and reputation management is one of the most common approaches. In recent years, many research teams have deployed audio and image sensors in natural environments in order to monitor the status of animals or plants, with the collected data to be analysed by ecologists. However, as the amount of collected data is extremely large and the number of ecologists is very limited, it is impossible for scientists to analyse all these data manually. The capabilities of existing automated tools to process the data are still very limited, and the results are still not very accurate. Therefore, researchers have turned to recruiting members of the public who are interested in helping scientific research to perform pre-processing tasks such as species tagging. Although research teams can save time and money by recruiting volunteers to help with data analysis, the reliability of contributed data varies considerably. This research therefore aims to investigate techniques for enhancing the reliability of data contributed by citizens in scientific research projects, especially acoustic sensing projects. In particular, we aim to investigate how reputation management can be used to enhance data reliability. Reputation systems have been used to resolve uncertainty and improve data quality in many marketing and E-Commerce domains, and commercial organizations that have embraced reputation management have gained many benefits.
Data quality issues are significant in the Citizen Science domain because of the quantity and diversity of the people and devices involved, yet research on reputation management in this area is relatively new. We therefore start our investigation by examining existing reputation systems in different domains. We then design novel reputation management approaches for Citizen Science projects to categorise participants and data. We have investigated several critical elements that may influence data reliability in Citizen Science projects, including personal information such as location and education, and performance information such as the ability to recognise certain bird calls. The designed reputation framework is evaluated through a series of experiments involving many participants collecting and interpreting data, in particular environmental acoustic data. Our research into the advantages of reputation management in Citizen Science (or crowdsourcing in general) will help increase awareness among organizations that are unacquainted with its potential benefits.
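One common reputation scheme that could weight citizen-contributed species tags is the beta reputation model; the toy sketch below (not the thesis's framework) keeps per-participant counts of agreeing and disagreeing judgements and scores reliability as the posterior mean of a Beta distribution.

```python
def update(rep, correct):
    """rep = (alpha, beta): counts of agreeing / disagreeing judgements,
    started at (1, 1) for a uniform prior."""
    a, b = rep
    return (a + 1, b) if correct else (a, b + 1)

def score(rep):
    """Expected reliability = posterior mean of Beta(alpha, beta)."""
    a, b = rep
    return a / (a + b)

rep = (1, 1)                      # uninformative prior
for outcome in [True, True, False, True, True, True]:
    rep = update(rep, outcome)
print(score(rep))                 # → 0.75 after 5 agreements, 1 disagreement
```

A tag from a participant with a low score can then be down-weighted or routed for expert review, which is the basic mechanism by which reputation systems improve crowd data quality.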

Relevance: 100.00%

Abstract:

Urban renewal is a significant issue in developed urban areas, and a particular problem for urban planners is redeveloping land to meet demand while ensuring compatibility with existing land use. This paper presents a geographic information systems (GIS)-based decision support tool (called LUDS) to quantitatively assess land-use suitability for site redevelopment in urban renewal areas. It consists of a model for the suitability analysis and an affiliated land-information database covering residential, commercial, industrial, G/I/C (government/institution/community) and open space land uses. Development was supported by interviews with industry experts, focus group meetings and an experimental trial, combined with several advanced techniques and tools, including GIS data processing and spatial analysis, multi-criteria analysis, and the AHP (analytic hierarchy process) method for constructing the model and database. As demonstrated in the trial, LUDS assists planners in making land-use decisions and supports the planning process in assessing urban land-use suitability for site redevelopment. Moreover, it facilitates public consultation (participatory planning) by providing stakeholders with an explicit understanding of planners' views.
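The AHP step mentioned above can be sketched as deriving criterion weights from a pairwise-comparison matrix via its principal eigenvector; the criteria and comparison values below are invented for illustration, not taken from LUDS.

```python
import numpy as np

def ahp_weights(A):
    """Normalised principal eigenvector of a pairwise-comparison matrix."""
    vals, vecs = np.linalg.eig(A)
    v = np.real(vecs[:, np.argmax(np.real(vals))])
    return np.abs(v) / np.abs(v).sum()

# hypothetical criteria: accessibility, land cost, neighbourhood compatibility;
# A[i, j] is how much more important criterion i is than criterion j
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])
w = ahp_weights(A)
```

The resulting weights would then multiply each parcel's criterion scores in the GIS overlay to produce a composite suitability score per land use.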

Relevance: 100.00%

Abstract:

Control theory has provided a useful theoretical foundation for examining the co-ordination between client and vendor in Information Systems development outsourcing (ISD-outsourcing). Recent research has identified two control mechanisms: structural (the structure of the control mode) and process (the process through which the control mode is enacted). Yet control theory research to date does not describe the ways in which the two control mechanisms can be combined to ensure project success. Grounded in case study data from eight ISD-outsourcing projects, we derive three "control configurations": i) aligned, ii) negotiated, and iii) self-managed, which describe the combinative patterns of structural and process control mechanisms within and across control modes.

Relevance: 100.00%

Abstract:

Service processes such as giving financial advice, booking a business trip or conducting a consulting project have emerged as units of analysis of high interest for the business process and service management communities, in practice and in academia. While the transactional nature of production processes is relatively well understood and deployed, the less predictable and highly interactive nature of service processes still lacks appropriate methodological grounding in many areas. This paper proposes a framework for a process laboratory as a new IT artefact to facilitate the holistic analysis and simulation of such service processes. Using financial services as an example, it shows how such a process laboratory can be used to reduce the complexity of service process analysis and facilitate operational service process control.

Relevance: 100.00%

Abstract:

For sugar factories with cogeneration plants, major changes to the process stations have been undertaken to reduce the consumption of exhaust steam from the turbines and maximise the generated power. In many cases the process steam consumption has been reduced from greater than 52% on cane to about 40% on cane. The main changes have been to install additional evaporation area at the front of the set, to operate the pan stages on vapour from the No 1 or No 2 effects, and to undertake juice heating using vapour bled from evaporators as far down the set as the penultimate stage. Operationally, one of the main challenges has been to develop a control system for the evaporators that addresses the objectives of juice processing rate (throughput) and steam economy, while producing syrup consistently at the required brix and providing an adequate and consistent vapour pressure for pan stage operations. The cyclic demand for vapour by batch pans causes process disturbances through the evaporator set, and these must be regulated effectively to satisfy the above objectives for the evaporator station. The impact of the cyclic pan stage vapour demand has been modelled to define its effect on juice rate, steam economy, syrup brix and head-space pressures in the evaporators. Experiences with the control schemes used at the Pioneer and Rocky Point mills are discussed. For each factory the paper provides information on (a) the control system used, the philosophy behind it, and the experiences in reaching the current control system; (b) the performance of the control system in handling the disturbances imposed by the pan stage and operating within the other constraints of the factory; and (c) deficiencies in the current system and plans for further improvements. Other processing changes to boost the performance of the evaporators are also discussed.
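The regulation problem described above, cyclic pan vapour demand disturbing the evaporator set, can be illustrated with a small simulation: a PI loop holding head-space pressure against a square-wave vapour draw, with an invented first-order pressure response. The gains, pressures and dynamics below are placeholders, not mill data or the paper's control schemes.

```python
import numpy as np

def simulate(kp=8.0, ki=2.0, dt=1.0, steps=600):
    """PI control of evaporator head-space pressure under cyclic pan demand."""
    p_set, p = 180.0, 180.0                 # pressure set point / state (kPa abs)
    integ, trace = 0.0, []
    for t in range(steps):
        pan_draw = 20.0 if (t // 60) % 2 else 5.0   # batch pans draw cyclically
        err = p_set - p
        integ += err * dt
        steam = float(np.clip(kp * err + ki * integ, 0.0, 100.0))  # steam valve
        # first-order pressure balance: steam supply raises p, pan draw and
        # self-evaporation losses lower it (all coefficients invented)
        p += dt / 30.0 * (0.8 * steam - pan_draw - 0.1 * (p - 150.0))
        trace.append(p)
    return np.array(trace)

trace = simulate()
```

Without the integral term the pressure would settle with a different offset in each half of the pan cycle; with it, the loop returns to the set point within each demand phase, which is the behaviour an evaporator control scheme has to deliver for consistent pan stage operation.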

Relevance: 100.00%

Abstract:

This article presents mathematical models to simulate coupled heat and mass transfer during convective drying of food materials, using three different effective diffusivities: shrinkage-dependent, temperature-dependent, and the average of the two. The engineering simulation software COMSOL Multiphysics was used to simulate the model in 2D and 3D, and the simulation results were compared with experimental data. It is found that the temperature-dependent effective diffusivity model predicts the moisture content more accurately at the initial stage of drying, whereas the shrinkage-dependent effective diffusivity model is better for the final stage. The model with shrinkage-dependent effective diffusivity shows an evaporative cooling phenomenon at the initial stage of drying; this phenomenon is investigated and explained. Three-dimensional temperature and moisture profiles show that even when the surface is dry, the inside of the sample may still contain a large amount of moisture. The drying process should therefore be managed carefully, otherwise microbial spoilage may start from the centre of the "dried" food. A parametric investigation was conducted after validation of the model.
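One ingredient of such models can be sketched in a few lines: explicit finite differences for 1-D moisture diffusion in a slab with an Arrhenius-type temperature-dependent effective diffusivity. The property values, geometry and boundary conditions below are assumed for illustration, not the article's COMSOL model.

```python
import numpy as np

def dry_slab(hours=4.0, n=41, L=0.01, T=333.0):
    """Half-slab of thickness L (m), surface at x = L, sealed centre at x = 0."""
    D = 1e-7 * np.exp(-2000.0 / T)      # assumed Arrhenius diffusivity (m^2/s)
    dx = L / (n - 1)
    dt = 0.4 * dx * dx / D              # explicit scheme, D*dt/dx^2 < 0.5
    M = np.full(n, 5.0)                 # moisture (kg water / kg dry solid)
    for _ in range(int(hours * 3600 / dt)):
        Mn = M.copy()
        Mn[1:-1] = M[1:-1] + D * dt / dx**2 * (M[2:] - 2 * M[1:-1] + M[:-2])
        Mn[0] = Mn[1]                   # zero-flux symmetry at the centre
        Mn[-1] = 0.5                    # surface held at equilibrium moisture
        M = Mn
    return M

M = dry_slab()
```

The resulting profile reproduces the abstract's qualitative point: after four hours the surface node is at its equilibrium moisture while the centre is still close to its initial value.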

Relevance: 100.00%

Abstract:

Injection velocity has been recognized as a key variable in thermoplastic injection molding. Its closed-loop control is, however, difficult owing to the complexity of the process dynamics. The basic requirements of the control system include tracking of a pre-determined injection velocity curve defined in a profile, load rejection, and robustness; it is difficult for a conventional control scheme to meet all of these requirements. This paper first analyzes the injection velocity dynamics, then adopts a novel double-controller scheme for injection velocity control. The scheme allows set-point tracking and load rejection to be designed independently and gives good system robustness. The implementation of the double-controller scheme for injection velocity control is discussed, and special techniques such as profile transformation and shifting are introduced to improve the velocity responses. The proposed velocity control has been experimentally demonstrated to be effective over a wide range of processing conditions.
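The separation of tracking and load rejection can be illustrated with a generic two-degree-of-freedom loop (not the authors' double-controller design): a feedforward term shaped by the reference profile handles tracking, while a PI feedback term handles disturbances. The plant model, gains and profile values are invented placeholders.

```python
import numpy as np

def track_profile(profile, kp=5.0, ki=45.0, a=5.0, b=5.0, dt=0.001):
    """Plant: v' = -a*v + b*u. The feedforward (a/b)*r inverts the plant's
    steady-state gain; the PI part only has to absorb errors and loads."""
    v, integ, out = 0.0, 0.0, []
    for r in profile:
        err = r - v
        integ += err * dt
        u = (a / b) * r + kp * err + ki * integ   # feedforward + PI feedback
        v += dt * (-a * v + b * u)
        out.append(v)
    return np.array(out)

# stepped velocity profile (mm/s): slow fill, fast injection, packing
profile = np.concatenate([np.full(500, 20.0),
                          np.full(500, 60.0),
                          np.full(500, 10.0)])
v = track_profile(profile)
```

Because the feedforward path carries the whole steady-state effort, the feedback gains can be tuned purely for disturbance response, which is the design freedom a two-degree-of-freedom structure provides.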

Relevance: 100.00%

Abstract:

As a result of the more distributed nature of organisations and the inherently increasing complexity of their business processes, significant effort is required for the specification and verification of those processes. The composition of activities into a business process that accomplishes a specific organisational goal has primarily been a manual task. Automated planning is a branch of artificial intelligence (AI) in which activities are selected and organised by anticipating their expected outcomes, with the aim of achieving some goal. As such, automated planning would seem a natural fit for the BPM domain as a way to automate the specification of control flow. A number of attempts have been made to apply automated planning to business process and service composition at different stages of the BPM lifecycle. However, a unified adoption of these techniques throughout the BPM lifecycle is missing. We therefore propose a new intention-centric BPM paradigm, which aims to minimise the specification effort by exploiting automated planning techniques to achieve a pre-stated goal. This paper provides a vision of the future possibilities for enhancing BPM using automated planning, and presents a research agenda giving an overview of the opportunities and challenges for the exploitation of automated planning in BPM.
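As a toy illustration of the idea, a basic automated-planning technique, forward state-space search, can order process activities from precondition/effect annotations to reach a stated goal; the activity names and annotations below are invented, not from the paper.

```python
from collections import deque

# each activity: (preconditions, effects) over a set of state facts
ACTIVITIES = {
    "receive_order": (set(),                  {"order"}),
    "check_credit":  ({"order"},              {"credit_ok"}),
    "pick_goods":    ({"order", "credit_ok"}, {"picked"}),
    "ship_goods":    ({"picked"},             {"shipped"}),
    "send_invoice":  ({"shipped"},            {"invoiced"}),
}

def plan(goal, start=frozenset()):
    """Breadth-first search over world states; returns a shortest plan."""
    queue, seen = deque([(start, [])]), {start}
    while queue:
        state, steps = queue.popleft()
        if goal <= state:
            return steps
        for name, (pre, eff) in ACTIVITIES.items():
            if pre <= state:                    # activity is applicable
                nxt = frozenset(state | eff)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, steps + [name]))
    return None                                 # goal unreachable

p = plan({"invoiced"})
```

Here the control flow of the process is not specified by hand; it is derived from the goal and the activity annotations, which is the intention-centric shift the paper envisions.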