75 results for Process control -- Data processing


Relevance: 100.00%

Abstract:

Purpose: Accelerometers are recognized as a valid and objective tool to assess free-living physical activity. Despite their widespread use, there is no standardized way to process and summarize accelerometer data, which limits our ability to compare results across studies. This paper a) reviews the decision rules researchers have used in the past, b) compares the impact of different decision rules on a common data set, and c) identifies issues to consider in accelerometer data reduction. Methods: The methods sections of studies published in 2003 and 2004 were reviewed to determine the decision rules previous researchers used to identify the wearing period, the minimal wear requirement for a valid day, spurious data, the number of days used to calculate the outcome variables, and bouts of moderate-to-vigorous physical activity (MVPA). Four data reduction algorithms employing different decision rules were then used to analyze the same data set. Results: Among the studies that reported their decision rules, much variability was observed. The analyses indicated that the choice of algorithm affected several important outcome variables: the most stringent algorithm yielded significantly lower wearing time, the lowest activity counts per minute and counts per day, and fewer minutes of MVPA per day. An exploratory sensitivity analysis revealed that the most stringent inclusion criterion affected sample size and wearing time, which in turn affected many outcome variables. Conclusions: The decision rules employed to process accelerometer data have a significant impact on important outcome variables. Until guidelines are developed, it will remain difficult to compare findings across studies.
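
As a rough illustration of how such decision rules are codified, here is a minimal Python sketch of one hypothetical rule set (a 60-minute consecutive-zeros non-wear rule, a 10-hour valid-day minimum, and a fixed spurious-count ceiling); these cut-points are assumptions for demonstration, not the rules compared in the paper.

```python
# Hypothetical accelerometer data-reduction rules: the 60-min non-wear
# window, 10 h valid-day minimum and spurious threshold are illustrative.

def non_wear_minutes(counts, window=60):
    """Flag minutes inside runs of >= `window` consecutive zero counts."""
    flags = [False] * len(counts)
    run_start = None
    for i, c in enumerate(counts + [1]):          # sentinel closes a trailing run
        if i < len(counts) and c == 0:
            run_start = i if run_start is None else run_start
        else:
            if run_start is not None and i - run_start >= window:
                for j in range(run_start, i):
                    flags[j] = True
            run_start = None
    return flags

def valid_day(counts, min_wear_hours=10, spurious=32767):
    """A day is valid if wear time meets the minimum and no minute is spurious."""
    if any(c >= spurious for c in counts):
        return False
    worn = [c for c, nw in zip(counts, non_wear_minutes(counts)) if not nw]
    return len(worn) >= min_wear_hours * 60

# Example: a 1440-minute day with two long zero runs and 700 min of activity
day = [0] * 120 + [350] * 700 + [0] * 620
print(valid_day(day))   # True: 700 min of wear meets the 600 min minimum
```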

Relevance: 100.00%

Abstract:

This thesis was a step forward in extracting valuable features of human movement behaviour, in terms of space utilisation, from Media-Access-Control (MAC) data. The research offered a low-cost approach with lower computational complexity than existing movement-tracking methods. It was successfully applied at QUT's Gardens Point campus and can be scaled to larger environments and populations. The information extractable from human movement with this approach can add significant value to the study of movement behaviour, enhancing future urban and interior design, crowd safety, and evacuation planning.
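
To make the approach concrete, below is a hedged Python sketch of one way MAC sightings could be aggregated into zone occupancy counts; the record format, the access-point-to-zone mapping, and the hashing step are illustrative assumptions, not the thesis's actual pipeline.

```python
# Illustrative aggregation of Wi-Fi MAC sightings into zone occupancy.
import hashlib
from collections import defaultdict

AP_TO_ZONE = {"ap-lib-1": "library", "ap-lib-2": "library",
              "ap-cafe": "cafe"}                      # hypothetical mapping

def anonymise(mac: str) -> str:
    """One-way hash so raw MAC addresses are never stored."""
    return hashlib.sha256(mac.encode()).hexdigest()[:12]

def occupancy_by_zone(sightings):
    """sightings: iterable of (hour, mac, ap) -> {(zone, hour): device count}."""
    seen = defaultdict(set)
    for hour, mac, ap in sightings:
        zone = AP_TO_ZONE.get(ap)
        if zone:
            seen[(zone, hour)].add(anonymise(mac))    # unique devices only
    return {key: len(devices) for key, devices in seen.items()}

demo = [(9, "aa:bb:cc:01", "ap-lib-1"), (9, "aa:bb:cc:02", "ap-lib-2"),
        (9, "aa:bb:cc:01", "ap-lib-2"), (10, "aa:bb:cc:03", "ap-cafe")]
print(occupancy_by_zone(demo))   # {('library', 9): 2, ('cafe', 10): 1}
```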

Relevance: 100.00%

Abstract:

Most pan stages in Australian factories use only five or six batch pans for high grade massecuite production and operate them in a fairly rigid repeating production schedule. Commonly, some of the pans have a large dropping capacity, e.g. 150 to 240 t. Because of the relatively small number and large sizes of the pans, steam consumption varies widely through the schedule, often by ±30% about the mean value. Large fluctuations in steam consumption have implications for the factory's steam generation and condensate management, and for the evaporators when bleed vapour is used. Among the objectives of a project to develop a supervisory control system for a pan stage are to (a) reduce the average steam consumption and (b) reduce the variation in steam consumption. The operation of each high grade pan within the schedule at Macknade Mill was analysed to determine the idle (or buffer) time, the time allocated to essential but unproductive operations (e.g. pan turn-round, charging, and slow ramping up of steam rates at pan start), and the productive time, i.e. the time during boil-on of liquor and molasses feed. Empirical models were developed for each high grade pan on the stage to define the interdependence of production rate and evaporation rate for the different phases of each pan's cycle. The data were analysed in a spreadsheet model to attempt to reduce and smooth the total steam consumption. This paper reports on the methodology developed in the model and the results of the investigations for the pan stage at Macknade Mill. It was found that the operation of the schedule severely restricted the ability to reduce the average steam consumption and smooth the steam flows: while longer cycle times provided increased flexibility, the steam consumption profile changed only slightly. The ability to cut massecuite on the run among pans, or the use of a high grade seed vessel, would assist in reducing both the average steam consumption and the magnitude of the variations in steam flow.
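
The flavour of the spreadsheet analysis can be sketched in a few lines of Python: each pan contributes a cyclic, phase-dependent steam profile, and shifting a pan's start within its buffer time changes the summed stage profile. The two-pan schedule and steam figures below are invented for illustration, not Macknade Mill data.

```python
# Toy stage-steam model: sum cyclic per-pan profiles and search the start
# offset of one pan to smooth the total. All numbers are invented.

def stage_profile(pan_profiles, offsets, horizon):
    """Sum cyclic per-pan steam profiles (t/h per step) with start offsets."""
    total = [0.0] * horizon
    for profile, off in zip(pan_profiles, offsets):
        for t in range(horizon):
            total[t] += profile[(t - off) % len(profile)]
    return total

def variation(profile):
    """Worst-case deviation from the mean, as a fraction of the mean."""
    mean = sum(profile) / len(profile)
    return max(abs(x - mean) for x in profile) / mean

pan_a = [0, 0, 8, 12, 12, 6]    # idle, turn-round, ramp-up, boil-on phases
pan_b = [0, 0, 8, 12, 12, 6]
for off in range(6):            # shift pan B within the repeating cycle
    p = stage_profile([pan_a, pan_b], [0, off], 6)
    print(off, round(variation(p), 2))   # offset 3 roughly halves the swing
```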

Relevance: 100.00%

Abstract:

This paper describes a safety data recording and analysis system developed to capture safety occurrences, including precursors, using high-definition forward-facing video from train cabs and data from other train-borne systems. The paper describes the data processing model and how events detected through data analysis are related to an underlying socio-technical model of accident causation. The integrated approach to safety data recording and analysis ensures that systemic factors which condition, influence or potentially contribute to an occurrence are captured for both safety occurrences and precursor events, providing a rich tapestry of antecedent causal factors that can significantly improve learning around accident causation. This can ultimately benefit railways through the development of targeted and more effective countermeasures, better risk models, and more effective use and prioritization of safety funds. Level crossing occurrences are a key focus of this paper, with data analysis scenarios describing causal factors around near-miss occurrences. The paper concludes with a discussion of how the system can also be applied to other types of railway safety occurrences.
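
One conceivable shape for the event-to-causation mapping is sketched below in Python; the event types, causal layers, and mapping are assumptions for illustration, not the system's actual schema.

```python
# Hypothetical mapping of detected precursor events to layers of a
# socio-technical accident-causation model.
from dataclasses import dataclass

LAYER_OF = {  # assumed event-type -> causal-layer mapping
    "late_braking_at_crossing": "precondition",
    "obscured_crossing_sign": "environmental condition",
    "roster_fatigue_flag": "organisational influence",
}

@dataclass
class Event:
    time_s: float
    kind: str
    source: str            # e.g. "video" or "train-borne telemetry"

def causal_summary(events):
    """Group detected events by causal layer for downstream analysis."""
    summary = {}
    for ev in events:
        layer = LAYER_OF.get(ev.kind, "unclassified")
        summary.setdefault(layer, []).append(ev)
    return summary

log = [Event(12.4, "late_braking_at_crossing", "video"),
       Event(12.9, "obscured_crossing_sign", "video")]
for layer, evs in causal_summary(log).items():
    print(layer, [e.kind for e in evs])
```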

Relevance: 100.00%

Abstract:

Variations in the treatment of patients (with similar symptoms) across different hospitals substantially impact the quality and cost of healthcare. Consequently, it is important to understand the similarities and differences between practices across hospitals. This paper presents a case study on the application of process mining techniques to measure and quantify the differences in the treatment of patients presenting with chest pain symptoms across four South Australian hospitals. The case study focuses on cross-organisational benchmarking of processes and their performance. Techniques such as clustering, process discovery, performance analysis, and scientific workflows were applied to facilitate these comparative analyses. Lessons learned in overcoming unique challenges in cross-organisational process mining, such as ensuring population comparability, data granularity comparability, and experimental repeatability, are also presented.
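
A minimal, library-free Python sketch of one comparative step, computing trace variants per hospital and contrasting their frequencies, is given below; the event data is invented, and the paper's actual analyses rely on full process-mining tooling.

```python
# Trace-variant comparison across two hospitals' (invented) event logs.
from collections import Counter

def variants(cases):
    """cases: {case_id: [activity, ...]} -> Counter of activity sequences."""
    return Counter(tuple(trace) for trace in cases.values())

hospital_a = {"c1": ["triage", "ecg", "troponin", "discharge"],
              "c2": ["triage", "ecg", "troponin", "admit"]}
hospital_b = {"c7": ["triage", "troponin", "ecg", "admit"]}

va, vb = variants(hospital_a), variants(hospital_b)
for v in set(va) | set(vb):                    # every variant seen anywhere
    print(" -> ".join(v), "| A:", va[v], "| B:", vb[v])
```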

Relevance: 100.00%

Abstract:

This is the third TAProViz workshop run at BPM. The intention this year is to consolidate the results of the previous successful workshops by further developing this important topic and identifying the key research topics of interest to the BPM visualization community. We note this year the continuing interest in the visualization of process mining data and the resulting process models. More info at: http://wst.univie.ac.at/topics/taproviz14/

Relevance: 100.00%

Abstract:

The ambiguity acceptance test is an important quality control procedure in high precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, the determination of their thresholds is still not well understood. Currently, the threshold is determined with either the empirical approach or the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. Hence, the key question in threshold determination is how to determine the threshold efficiently and in a reasonable way. In this study, a new threshold determination method, named the threshold function method, is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach through a modeling procedure and an approximation procedure. The modeling procedure uses a rational function model to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modeling and approximation errors are analysed with simulation data to avoid nuisance biases and unrealistic stochastic model effects. The results indicate that the proposed method can greatly simplify the FF-approach without introducing significant modeling error, making fixed failure rate threshold determination feasible for real-time applications.
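
The two ideas can be sketched in Python as follows: the closed-form integer bootstrapping success rate stands in for the ILS success rate, and a fitted rational function maps that rate to a difference-test threshold. The rational-function coefficients here are placeholders, not the paper's fitted values.

```python
# Sketch of the threshold function method with invented coefficients.
import math

def ib_success_rate(cond_stds):
    """IB success rate from conditional std devs of decorrelated ambiguities:
    P = prod_i (2*Phi(1/(2*sigma_i)) - 1)."""
    phi = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
    p = 1.0
    for s in cond_stds:
        p *= 2 * phi(1 / (2 * s)) - 1
    return p

def threshold(p_success, a=(0.1, 0.9), b=0.8):
    """Hypothetical fitted rational function t(p) = (a0 + a1*p) / (1 + b*p)."""
    return (a[0] + a[1] * p_success) / (1 + b * p_success)

p = ib_success_rate([0.10, 0.12, 0.15])    # cheap stand-in for the ILS rate
print(round(p, 4), round(threshold(p), 4))
```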

Relevance: 100.00%

Abstract:

This study investigates the effects of trait anxiety on self-reported driving behaviours through its negative impact on Central Executive functions. Following a self-report study that found trait anxiety to be significantly related to driving behaviours, the present study extended the predictions of Eysenck and Calvo's Attentional Control Theory, proposing that anxiety affects driving behaviours, in particular driving lapses, through its impact on the Central Executive. Seventy-five Australian drivers participated in the study, completing the Parametric Go/No-Go and n-back tasks, as well as the State-Trait Anxiety Inventory and the Driving Behaviour Questionnaire. While both trait anxiety and the processing efficiency of the Central Executive were found to significantly predict driving lapses, trait anxiety remained a strong predictor of driving lapses after processing efficiency was controlled for. It is concluded that while the processing efficiency of the Central Executive is a key determinant of driving lapses, another Central Executive function, one lying closer to driving lapses in the relationship between trait anxiety and driving lapses, may need to be considered. Suggestions for improving future research on trait anxiety and driving behaviours are discussed.
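
The analysis logic, testing whether trait anxiety still predicts lapses once processing efficiency is controlled for, can be sketched as a two-step (hierarchical) regression; the numpy example below uses random stand-in data, not the study's sample.

```python
# Two-step hierarchical OLS on synthetic data: does adding anxiety raise R²
# beyond what processing efficiency alone explains?
import numpy as np

def r_squared(X, y):
    X = np.column_stack([np.ones(len(y)), X])      # add intercept column
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

rng = np.random.default_rng(0)
n = 75                                             # matches the sample size
efficiency = rng.normal(size=n)
anxiety = 0.5 * efficiency + rng.normal(size=n)
lapses = 0.4 * anxiety + 0.3 * efficiency + rng.normal(size=n)

r2_step1 = r_squared(efficiency[:, None], lapses)
r2_step2 = r_squared(np.column_stack([efficiency, anxiety]), lapses)
print(f"R2 efficiency only: {r2_step1:.3f}, with anxiety added: {r2_step2:.3f}")
```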

Relevance: 100.00%

Abstract:

In this paper we illustrate a set of features of the Apromore process model repository for analyzing business process variants. Two types of analysis are provided: one is static, based on differences in the process control flow; the other is dynamic, based on differences in process behavior between the variants. These features combine techniques for managing large process model collections with techniques for mining process knowledge from process execution logs. The tool demonstration will be useful for researchers and practitioners working with large process model collections and process execution logs, and specifically for those interested in understanding, managing and consolidating business process variants both within and across organizational boundaries.
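
A library-free sketch of the two comparison flavours follows: a static diff over model activities and a behavioural diff over directly-follows relations mined from logs. Apromore's own algorithms are richer; the variant data below is invented.

```python
# Static vs behavioural comparison of two (invented) process variants.

def directly_follows(traces):
    """All (a, b) pairs where activity b directly follows a in some trace."""
    return {(t[i], t[i + 1]) for t in traces for i in range(len(t) - 1)}

variant_1 = [["receive", "check", "approve", "pay"]]
variant_2 = [["receive", "approve", "check", "pay"]]

# Static view: which activities appear in only one variant?
static_diff = set(sum(variant_1, [])) ^ set(sum(variant_2, []))
# Behavioural view: which orderings appear in only one variant?
df1, df2 = directly_follows(variant_1), directly_follows(variant_2)

print("activities only in one variant:", static_diff or "none")
print("behaviour only in variant 1:", df1 - df2)
print("behaviour only in variant 2:", df2 - df1)
```

Note how the example makes the distinction concrete: both variants share the same activities (no static difference), yet their behaviour differs because the activities occur in a different order.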

Relevance: 100.00%

Abstract:

Flos Chrysanthemum is a generic name for a particular group of edible plants that also have medicinal properties. There are, in fact, twenty to thirty different cultivars, which are commonly used in beverages and for medicinal purposes. In this work, four Flos Chrysanthemum cultivars, Hangju, Taiju, Gongju, and Boju, were collected, and chromatographic fingerprints were used to distinguish and assess these cultivars for quality control purposes. Chromatographic fingerprints contain chemical information but often suffer from baseline drift and peak shifts, which complicate data processing; adaptive iteratively reweighted penalized least squares and correlation optimized warping were therefore applied to correct the fingerprint baselines and peaks. The adjusted data were submitted to unsupervised and supervised pattern recognition methods. Principal component analysis was used to qualitatively differentiate the Flos Chrysanthemum cultivars, while partial least squares, continuum power regression, and K-nearest neighbors were used to predict unknown samples. Finally, the elliptic joint confidence region method was used to evaluate the prediction ability of these models; the partial least squares and continuum power regression methods were shown to best represent the experimental results.
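
A hedged scikit-learn sketch of the pattern-recognition stage is shown below, with random data standing in for the chromatographic fingerprints; classification via one-hot-encoded PLS regression is an assumption about the setup, not the paper's exact configuration.

```python
# PCA for unsupervised exploration, then PLS and k-NN as supervised
# classifiers, on synthetic stand-in fingerprints.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 200))         # 40 fingerprints x 200 retention times
y = np.repeat([0, 1, 2, 3], 10)        # Hangju, Taiju, Gongju, Boju labels

scores = PCA(n_components=2).fit_transform(X)             # qualitative view
pls = PLSRegression(n_components=5).fit(X, np.eye(4)[y])  # one-hot PLS
knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)

pred_pls = pls.predict(X[:2]).argmax(axis=1)   # class = largest PLS output
print(scores[:2])
print("PLS:", pred_pls, "k-NN:", knn.predict(X[:2]))
```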

Relevance: 100.00%

Abstract:

Increasingly large-scale applications are generating unprecedented amounts of data. However, the growing gap between computation and I/O capacity on high-end computing (HEC) machines creates a severe bottleneck for data analysis. Instead of moving data from its source to output storage, in-situ analytics processes output data while simulations are running. However, in-situ data analysis contends with the simulations for computing resources, and such contention can severely degrade simulation performance on HEC platforms. Since different data processing strategies have different impacts on performance and cost, there is a consequent need for flexibility in the placement of data analytics. In this paper, we explore and analyze several potential data-analytics placement strategies along the I/O path. To determine the best strategy for reducing data movement in a given situation, we propose a flexible data analytics (FlexAnalytics) framework. Based on this framework, a FlexAnalytics prototype system was developed for analytics placement. The FlexAnalytics system enhances the scalability and flexibility of the current I/O stack on HEC platforms and is useful for data pre-processing, runtime data analysis and visualization, and large-scale data transfer. Two use cases, scientific data compression and remote visualization, were studied to verify the performance of FlexAnalytics. Experimental results demonstrate that the FlexAnalytics framework increases data transfer bandwidth and improves end-to-end application transfer performance.
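
A toy cost model conveys the placement trade-off: run analytics (e.g. compression) in situ at the simulation node, at a staging node, or offline after writing raw output. All rates and sizes below are invented; the real framework measures such costs.

```python
# Invented end-to-end time model for three analytics placements.

def end_to_end_time(data_gb, placement, compress_ratio=4.0,
                    analyze_gbps=2.0, net_gbps=0.5, disk_gbps=1.0):
    analyze = data_gb / analyze_gbps
    if placement == "in-situ":     # shrink data before it leaves the node
        return analyze + (data_gb / compress_ratio) / net_gbps
    if placement == "staging":     # move raw data one hop, then analyze there
        return data_gb / net_gbps + analyze
    if placement == "offline":     # write raw output, transfer it all later
        return data_gb / disk_gbps + data_gb / net_gbps
    raise ValueError(placement)

for p in ("in-situ", "staging", "offline"):
    print(p, round(end_to_end_time(100, p), 1), "s")
```

With these made-up rates, in-situ compression wins because it sends a quarter of the bytes over the slowest link, which is exactly the kind of situational decision a flexible placement framework has to automate.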

Relevance: 100.00%

Abstract:

A vessel stabilizer control system includes a sensor fault detection means which senses the availability of signals from a gyrostabilizer precession motion sensor and a vessel roll motion sensor. The control system controls the action of a gyro-actuator mechanically coupled to a gyrostabilizer. The benefit of fault sensing on the sensors that provide the process control variables is that the number of available process control variables (or sensors) can be used to activate a tiered system of control modes. Each tiered control mode is designed to use the available process control variables to ensure safe and effective operation of the gyrostabilizer, tolerant of sensor faults and loss of power supply. A control mode selector selects the appropriate control mode based on the number of available process control variables.
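
The tiered-mode idea reduces naturally to code: count the healthy sensors and select a mode accordingly. The Python sketch below uses illustrative mode names and a hypothetical staleness test, not the patent's actual control laws.

```python
# Illustrative tiered mode selection from sensor availability.

def available(signals, timeout_s=0.5, now=10.0):
    """A sensor counts as available if it reported within the timeout."""
    return [name for name, last_seen in signals.items()
            if now - last_seen <= timeout_s]

def select_mode(signals):
    ok = set(available(signals))
    if {"precession", "roll"} <= ok:
        return "full stabilisation"        # both process variables present
    if "precession" in ok:
        return "precession-only damping"   # degraded but still safe
    return "brake gyro to safe state"      # no usable feedback at all

sensors = {"precession": 9.8, "roll": 7.0}   # roll reading is stale -> fault
print(select_mode(sensors))                  # precession-only damping
```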

Relevance: 100.00%

Abstract:

Being able to accurately predict the risk of falling is crucial in patients with Parkinson's disease (PD), because of the unfavourable effects of falls, which can lower quality of life and directly affect survival. Three methods were considered for predicting falls: decision trees (DT), Bayesian networks (BN), and support vector machines (SVM). Data from a 1-year prospective study of 51 people with PD, conducted at IHBI, Australia, were used. Data processing was conducted using the rpart and e1071 packages in R for DT and SVM, respectively, and Bayes Server 5.5 for the BN. The results show that BN and SVM produce consistently higher accuracy over the 12-month evaluation time points (average sensitivity and specificity > 92%) than DT (average sensitivity 88%, average specificity 72%). DT is sensitive to imbalanced data and so needs adjustment for the misclassification cost. However, DT provides a straightforward, interpretable result and is thus appealing for identifying important items related to falls and generating faller profiles.
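
For illustration, the comparison can be approximated with scikit-learn stand-ins (the study itself used R's rpart and e1071, plus Bayes Server): a class-weighted decision tree to counter imbalance, and an SVM. The data below is synthetic, not the IHBI cohort.

```python
# Synthetic-data approximation of the DT vs SVM comparison.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(51, 6))            # 51 patients, 6 clinical features
y = (X[:, 0] + 0.5 * X[:, 1]            # imbalanced faller/non-faller labels
     + rng.normal(scale=0.5, size=51) > 0.8).astype(int)

dt = DecisionTreeClassifier(class_weight="balanced",  # misclassification-cost
                            max_depth=3, random_state=0)  # style adjustment
svm = SVC(kernel="rbf", class_weight="balanced")
for name, model in [("DT", dt), ("SVM", svm)]:
    print(name, cross_val_score(model, X, y, cv=5).mean().round(2))
```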