467 results for Prediction techniques
Abstract:
In Australia, railway systems play a vital role in transporting the sugarcane crop from farms to mills. The sugarcane transport system is very complex and uses daily schedules, consisting of a set of locomotive runs, to satisfy the requirements of the mill and harvesters. The total cost of sugarcane transport operations is very high; over 35% of the total cost of sugarcane production in Australia is incurred in cane transport. Efficient schedules for sugarcane transport can reduce this cost and limit the negative effects that the transport system can have on the raw sugar production system. There are several benefits to formulating the train scheduling problem as a blocking parallel-machine job shop scheduling (BPMJSS) problem: it prevents two trains passing in one section at the same time; it keeps the train activities (operations) in sequence during each run (trip) by applying precedence constraints; it passes the trains through a section in the correct order (priorities of passing trains) by applying disjunctive constraints; and it eases the passing of trains by resolving rail conflicts through blocking constraints and parallel machine scheduling. Therefore, the sugarcane rail operations are formulated as a BPMJSS problem. Mixed integer programming and constraint programming approaches are used to describe the BPMJSS problem, and the model is solved by integrating constraint programming, mixed integer programming and search techniques. Optimality performance is tested with the Optimization Programming Language (OPL) and CPLEX software on small and large instances based on specific criteria. A real-life problem is used to verify and validate the approach. Constructive heuristics and new metaheuristics, including simulated annealing and tabu search, are proposed to solve this complex and NP-hard scheduling problem and produce a more efficient scheduling system.
Innovative hybrid and hyper metaheuristic techniques are developed and coded in C# to improve solution quality and CPU time. Hybrid techniques integrate heuristic and metaheuristic techniques consecutively, while hyper techniques completely integrate different metaheuristic techniques, heuristic techniques, or both.
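The simulated annealing metaheuristic mentioned above can be illustrated with a minimal sketch. This is written in Python rather than the authors' C#, and the toy one-machine weighted-completion-time objective, function names and parameter values are all illustrative assumptions, not the thesis's actual BPMJSS model:

```python
import math
import random

def simulated_annealing(cost, initial, neighbour, t0=100.0, cooling=0.95,
                        iters=2000, seed=0):
    """Generic simulated annealing: always accept improvements, and accept
    worse moves with probability exp(-delta / T) under a geometric cooling
    schedule, keeping the best solution seen."""
    rng = random.Random(seed)
    current, best = initial, initial
    t = t0
    for _ in range(iters):
        cand = neighbour(current, rng)
        delta = cost(cand) - cost(current)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current = cand
            if cost(current) < cost(best):
                best = current
        t *= cooling
    return best

# Toy instance (hypothetical): order jobs on one machine to minimise
# total weighted completion time; each job is (processing_time, weight).
jobs = [(3, 1), (1, 4), (2, 2), (4, 3)]

def twct(order):
    t, total = 0, 0
    for j in order:
        p, w = jobs[j]
        t += p
        total += w * t
    return total

def swap(order, rng):
    # Neighbourhood move: swap two randomly chosen positions.
    i, j = rng.sample(range(len(order)), 2)
    new = list(order)
    new[i], new[j] = new[j], new[i]
    return tuple(new)

best = simulated_annealing(twct, tuple(range(len(jobs))), swap)
```

Since the best-so-far solution is only ever replaced by a strictly cheaper one, the returned schedule can never be worse than the starting order.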
Abstract:
Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good-quality products and services is the key factor for an organization and even a government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now applied widely to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. It seems that some of the work in the healthcare area has evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. Therefore, this research investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in the available datasets and data flow; then a data capturing algorithm using Bayesian decision making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle.
After ensuring clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from an industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, adoption of a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root cause analysis in the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time-frame prior to the signal. This approach yields highly informative estimates for change point parameters, since probability-distribution-based results are obtained. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated. The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of the shifts, compared to a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change point estimators is then extended to healthcare surveillance for processes in which pre-intervention characteristics of patients affect the outcomes.
In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention that is being monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation study undertaken in each research component and the results obtained highly recommend the developed Bayesian estimators as a strong alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed model are also considered. The advantages of the Bayesian approach seen in the general context of quality control may also extend to the industrial and business domains where quality monitoring was initially developed.
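The core change point idea can be illustrated with a minimal sketch. The assumptions here are deliberately simple and are not the thesis's method: a single step change, known pre- and post-change Poisson rates, a uniform prior over the change time, and invented counts; this lets the posterior be computed by direct enumeration rather than the MCMC used in the actual work:

```python
import math

def poisson_change_posterior(counts, lam0, lam1):
    """Posterior over the change point tau (uniform prior), assuming
    observations 0..tau-1 ~ Poisson(lam0) and tau.. ~ Poisson(lam1)
    with both rates known. Constant terms (log y!) cancel."""
    n = len(counts)
    logpost = []
    for tau in range(1, n):
        ll = sum(y * math.log(lam0) - lam0 for y in counts[:tau])
        ll += sum(y * math.log(lam1) - lam1 for y in counts[tau:])
        logpost.append(ll)
    # Normalise in a numerically stable way (subtract the max log value).
    m = max(logpost)
    w = [math.exp(l - m) for l in logpost]
    s = sum(w)
    return [x / s for x in w]  # entry k is P(tau = k + 1 | data)

# Hypothetical monitored counts: the rate appears to jump after index 3.
counts = [2, 3, 1, 2, 8, 9, 7, 10]
post = poisson_change_posterior(counts, lam0=2.0, lam1=8.0)
tau_hat = max(range(len(post)), key=post.__getitem__) + 1
```

The posterior mode `tau_hat` identifies the observation at which the shift most plausibly began, which is exactly the quantity the estimators above report back to the quality improvement cycle.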
Abstract:
This paper reports on the development and implementation of a self-report risk assessment tool developed in an attempt to increase the efficacy of crash prediction within Australian fleet settings. This study forms part of a broader program of research into work-related road safety and the identification of driving risk. The first phase of the study involved a series of focus groups conducted with 217 professional drivers, which revealed that the following factors were proposed to influence driving performance: Fatigue, Knowledge of risk, Mood, Impatience and frustration, Speed limits, Experience, Other road users, Passengers, Health, and Culture. The second phase involved piloting the newly developed 38-item Driving Risk Assessment Scale - Work Version (DRAS-WV) with 546 professional drivers. Factor analytic techniques identified a 9-factor solution comprising speeding, aggression, time pressure, distraction, casualness, awareness, maintenance, fatigue and minor damage. Speeding and aggressive driving manoeuvres were identified as the most frequent aberrant driving behaviours engaged in by the sample. However, a series of logistic regression analyses undertaken to determine the DRAS-WV scale's ability to predict self-reported crashes revealed limited predictive efficacy (e.g., 10% of crashes). This paper outlines proposed reasons for this limited predictive ability of the DRAS-WV and provides suggestions for future research that aims to develop methods to identify "at risk" drivers.
Abstract:
The recent expansion of prediction markets provides a great opportunity to test the market efficiency hypothesis and the calibration of trader judgements. Using a large database of observed prices, this article studies the calibration of prediction market prices on sporting events using both nonparametric and parametric methods. While only minor bias is observed during most of the lifetime of the contracts, the calibration of prices deteriorates very significantly in the final moments of the contracts' lives. Traders tend to overestimate the probability that the losing team will reverse the situation in the last minutes of the game.
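A nonparametric calibration check of the kind described can be sketched as follows. This is a minimal illustration only; the equal-width binning scheme and the toy data are assumptions, not the article's actual methodology or data:

```python
def calibration_table(prices, outcomes, n_bins=10):
    """Nonparametric calibration check: bin contract prices (interpreted as
    probabilities in [0, 1]) and compare each bin's mean price with the
    empirical frequency of the event occurring. A well-calibrated market
    has mean price roughly equal to empirical frequency in every bin."""
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(prices, outcomes):
        idx = min(int(p * n_bins), n_bins - 1)  # clamp p == 1.0 into last bin
        bins[idx].append((p, y))
    table = []
    for b in bins:
        if b:
            mean_p = sum(p for p, _ in b) / len(b)
            freq = sum(y for _, y in b) / len(b)
            table.append((round(mean_p, 3), round(freq, 3), len(b)))
    return table

# Hypothetical contracts: price, then 1 if the event occurred, else 0.
prices = [0.75, 0.75, 0.75, 0.75, 0.25, 0.25, 0.25, 0.25]
outcomes = [1, 1, 1, 0, 0, 0, 1, 0]
table = calibration_table(prices, outcomes, n_bins=4)
```

Deviations between the first and second entries of each row quantify miscalibration; the paper's finding corresponds to these deviations growing as contracts approach expiry.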
Abstract:
The overall aim of this project was to contribute to existing knowledge regarding methods for measuring the characteristics of airborne nanoparticles and controlling occupational exposure to them, and to gather data on nanoparticle emission and transport in various workplaces. The scope of this study involved investigating the characteristics and behaviour of particles arising from the operation of six nanotechnology processes, subdivided into nine processes for measurement purposes. It did not include toxicological evaluation of the aerosols, and therefore no direct conclusion was made regarding the health effects of exposure to these particles. Our research included real-time measurement of sub- and supermicrometre particle number and mass concentration, count median diameter, and alveolar deposited surface area using condensation particle counters, an optical particle counter, a DustTrak photometer, a scanning mobility particle sizer, and a nanoparticle surface area monitor, respectively. Off-line particle analysis included scanning and transmission electron microscopy, energy-dispersive X-ray spectrometry, and thermal-optical analysis of elemental carbon. Sources of fibrous and non-fibrous particles were included.
Abstract:
The performance of techniques for evaluating multivariate volatility forecasts is not yet as well understood as that of their univariate counterparts. This paper evaluates the efficacy of a range of traditional statistical methods for multivariate forecast evaluation together with methods based on underlying considerations of economic theory. It is found that a statistical method based on likelihood theory and an economic loss function based on portfolio variance are the most effective means of identifying optimal forecasts of conditional covariance matrices.
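A portfolio-variance loss function of the kind mentioned can be illustrated for the two-asset case. This is a minimal sketch under assumed covariance values: the weights follow the textbook minimum-variance formula w1 = (s22 - s12) / (s11 + s22 - 2*s12), and the setup is not necessarily the paper's exact construction:

```python
def min_var_weight(s11, s22, s12):
    """Minimum-variance weight on asset 1 for a 2-asset covariance forecast
    (s11, s22 variances; s12 covariance)."""
    return (s22 - s12) / (s11 + s22 - 2 * s12)

def portfolio_variance_loss(forecast, realised):
    """Economic loss: build the minimum-variance portfolio from the FORECAST
    covariance, then evaluate its variance under the REALISED covariance.
    A better forecast yields a lower realised portfolio variance."""
    s11, s22, s12 = forecast
    w1 = min_var_weight(s11, s22, s12)
    w2 = 1.0 - w1
    r11, r22, r12 = realised
    return w1 * w1 * r11 + w2 * w2 * r22 + 2 * w1 * w2 * r12

# Hypothetical "true" covariance: variances 0.04 and 0.09, covariance 0.01.
true_cov = (0.04, 0.09, 0.01)
loss_perfect = portfolio_variance_loss(true_cov, true_cov)   # perfect forecast
loss_naive = portfolio_variance_loss((1.0, 1.0, 0.0), true_cov)  # ignores the data
```

Ranking candidate forecasts by this loss is what makes the criterion "economic": forecasts are judged by the quality of the investment decisions they induce rather than by a purely statistical distance.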
Abstract:
The application of nanotechnology products has increased significantly in recent years. With their broad range of applications, including electronics, food and agriculture, power and energy, scientific instruments, clothing, cosmetics, buildings, and biomedicine and health, etc. (Catanzariti, 2008), nanomaterials are an indispensable part of human life.
Abstract:
Complex flow datasets are often difficult to represent in detail using traditional vector visualisation techniques such as arrow plots and streamlines. This is particularly true when the flow regime changes in time. Texture-based techniques, which are based on the advection of dense textures, are novel techniques for visualising such flows (i.e., those with complex, time-dependent dynamics). In this paper, we review two popular texture-based techniques and their application to flow datasets sourced from real research projects. The techniques investigated were Line Integral Convolution (LIC) and Image-Based Flow Visualisation (IBFV). We evaluated these techniques and report on their visualisation effectiveness (compared with traditional techniques), their ease of implementation, and their computational overhead.
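For readers unfamiliar with LIC, the following is a minimal sketch of its core idea: each output pixel averages a white-noise texture along a short streamline traced through the vector field, so the result is smeared along flow lines. The nearest-cell sampling, unit-speed Euler steps and tiny grids here are illustrative simplifications, not the paper's implementation:

```python
def lic(vx, vy, noise, length=5):
    """Minimal Line Integral Convolution: for each cell, trace a streamline
    forward and backward through the (vx, vy) field with unit-speed Euler
    steps and average the noise texture values visited. (For simplicity the
    starting cell is sampled in both directions.)"""
    h, w = len(noise), len(noise[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            total, count = 0.0, 0
            for sign in (1, -1):
                x, y = j + 0.5, i + 0.5  # start at the cell centre
                for _ in range(length):
                    cj, ci = int(x), int(y)
                    if not (0 <= ci < h and 0 <= cj < w):
                        break  # streamline left the grid
                    total += noise[ci][cj]
                    count += 1
                    u, v = vx[ci][cj], vy[ci][cj]
                    norm = (u * u + v * v) ** 0.5 or 1.0
                    x += sign * u / norm
                    y += sign * v / norm
            out[i][j] = total / count
    return out

# Hypothetical uniform horizontal flow over a noise texture whose value
# equals its row index: smearing along rows leaves each row constant.
vx = [[1.0] * 8 for _ in range(4)]
vy = [[0.0] * 8 for _ in range(4)]
noise = [[float(i)] * 8 for i in range(4)]
out = lic(vx, vy, noise)
```

With the horizontal field, the streamline through row i never leaves that row, so every averaged value in row i equals i; any cross-row variation in a real noise texture would be smoothed only along the flow direction, which is precisely the visual cue LIC provides.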
Abstract:
Prostate cancer (CaP) is the second leading cause of cancer-related deaths in North American males and the most common newly diagnosed cancer in men worldwide. Biomarkers are widely used in both early detection and prognostic tests for cancer. The current, commonly used biomarker for CaP is serum prostate specific antigen (PSA). However, the specificity of this biomarker is low, as its serum level is increased not only in CaP but also in various other diseases, and with age and even body mass index. Human body fluids provide an excellent resource for the discovery of biomarkers, with the advantage over tissue/biopsy samples of ease of access, due to the less invasive nature of collection. However, their analysis presents challenges in terms of variability and validation. Blood and urine are two human body fluids commonly used for CaP research, but their proteomic analyses are limited both by the large dynamic range of protein abundance, which makes detection of low-abundance proteins difficult, and, in the case of urine, by the high salt concentration. To overcome these challenges, different techniques for removal of high-abundance proteins and enrichment of low-abundance proteins are used. Their applications and limitations are discussed in this review. A number of innovative proteomic techniques have improved the detection of biomarkers. They include two-dimensional differential gel electrophoresis (2D-DIGE), quantitative mass spectrometry (MS) and functional proteomic studies, i.e., investigating the association of post-translational modifications (PTMs) such as phosphorylation, glycosylation and protein degradation. The recent development of quantitative MS techniques such as stable isotope labeling with amino acids in cell culture (SILAC), isobaric tags for relative and absolute quantitation (iTRAQ) and multiple reaction monitoring (MRM) has allowed proteomic researchers to quantitatively compare data from different samples.
2D-DIGE has greatly improved the statistical power of classical 2D gel analysis by introducing an internal control. This chapter aims to review novel CaP biomarkers as well as to discuss current trends in biomarker research from two angles: the source of biomarkers (particularly human body fluids such as blood and urine), and emerging proteomic approaches for biomarker research.
Abstract:
Detailed representations of complex flow datasets are often difficult to generate using traditional vector visualisation techniques such as arrow plots and streamlines. This is particularly true when the flow regime changes in time. Texture-based techniques, which are based on the advection of dense textures, are novel techniques for visualising such flows. We review two popular texture-based techniques and their application to flow datasets sourced from active research projects. The techniques investigated were Line Integral Convolution (LIC) [1] and Image-Based Flow Visualisation (IBFV) [18]. We evaluate these and report on their effectiveness from a visualisation perspective, as well as on their ease of implementation and computational overheads.
Abstract:
Objectives: To compare measures of fat-free mass (FFM) obtained from three different bioelectrical impedance analysis (BIA) devices and to assess the agreement between three different equations validated in older adult and/or overweight populations. Design: Cross-sectional study. Setting: Orthopaedics ward of a Brisbane public hospital, Australia. Participants: Twenty-two overweight, older Australians (72 ± 6.4 yr, BMI 34 ± 5.5 kg/m2) with knee osteoarthritis. Measurements: Body composition was measured using three BIA devices: Tanita 300-GS (foot-to-foot), Impedimed DF50 (hand-to-foot) and Impedimed SFB7 (bioelectrical impedance spectroscopy, BIS). Three equations for predicting FFM were selected based on their applicability to an older adult and/or overweight population. Impedance values were extracted from the hand-to-foot BIA device and entered into the equations to estimate FFM. Results: The mean FFM measured by BIS (57.6 ± 9.1 kg) differed significantly from those measured by foot-to-foot (54.6 ± 8.7 kg) and hand-to-foot BIA (53.2 ± 10.5 kg) (P < 0.001). The mean ± SD FFM values predicted by the three equations using raw data from hand-to-foot BIA were 54.7 ± 8.9 kg, 54.7 ± 7.9 kg and 52.9 ± 11.05 kg, respectively. These did not differ from the FFM predicted by the hand-to-foot device (F = 2.66, P = 0.118). Conclusions: Our results suggest that foot-to-foot and hand-to-foot BIA may be used interchangeably in overweight older adults at the group level, but the large limits of agreement may lead to unacceptable error in individuals. There was no difference between the three prediction equations; however, these results should be confirmed in a larger sample and against a reference standard.
Abstract:
The mining environment presents a challenging prospect for stereo vision. Our objective is to produce a stereo vision sensor suited to close-range scenes consisting mostly of rocks. This sensor should produce a dense depth map within real-time constraints. Speed and robustness are of foremost importance for this application. This paper compares a number of stereo matching algorithms in terms of robustness and suitability to fast implementation. These include traditional area-based algorithms, and algorithms based on non-parametric transforms, notably the rank and census transforms. Our experimental results show that the rank and census transforms are robust with respect to radiometric distortion and introduce less computational complexity than conventional area-based matching techniques.
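The census transform referred to above can be sketched as follows. This is a minimal illustration only; the 3x3 window and the toy image values are assumptions, not the paper's configuration:

```python
def census_transform(img, r=1):
    """Census transform: encode each pixel as a bit string recording which
    neighbours in a (2r+1)x(2r+1) window are darker than the centre pixel.
    Matching then compares bit strings by Hamming distance, which depends
    only on intensity ORDERING, making it robust to radiometric (gain and
    offset) distortion between the two stereo images."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(r, h - r):
        for j in range(r, w - r):
            bits = 0
            for di in range(-r, r + 1):
                for dj in range(-r, r + 1):
                    if di == 0 and dj == 0:
                        continue  # skip the centre pixel itself
                    bits = (bits << 1) | (img[i + di][j + dj] < img[i][j])
            out[i][j] = bits
    return out

def hamming(a, b):
    """Matching cost between two census codes: number of differing bits."""
    return bin(a ^ b).count("1")

# Hypothetical 3x3 patch; a gain-and-offset distorted copy (2*v + 10)
# preserves intensity ordering, so its census code is identical.
img = [[10, 20, 30], [40, 50, 60], [70, 80, 9]]
img_distorted = [[2 * v + 10 for v in row] for row in img]
```

Because the codes are unchanged under monotonic intensity transforms, a census-based matcher pays no penalty when the left and right cameras differ in exposure, which is the robustness property the experiments above report.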
Abstract:
Purpose. To compare the radiological records of 90 consecutive patients who underwent cemented total hip arthroplasty (THA) with or without use of the Rim Cutter to prepare the acetabulum. Methods. The acetabulum was prepared using the Rim Cutter in 45 patients; the device was not used in the other 45 patients. Postoperative radiographs were evaluated using a digital templating system to measure (1) the positions of the operated hips with respect to the normal, contralateral hips (the centre of rotation of the socket, the height of the centre of rotation from the teardrop, and the lateralisation of the centre of rotation from the teardrop) and (2) the uniformity and width of the cement mantle in the 3 DeLee-Charnley acetabular zones, and the number of radiolucencies in these zones. Results. The study group showed improved radiological parameters: hips were closer to the anatomic centre of rotation both vertically (1.5 vs. 3.7 mm, p < 0.001) and horizontally (1.8 vs. 4.4 mm, p < 0.001) and had consistently thicker and more uniform cement mantles (p < 0.001). There were 2 radiolucent lines in the control group but none in the study group. Conclusion. The Rim Cutter resulted in more accurate placement of the centre of rotation of a cemented prosthetic socket and produced a thicker, more congruent cement mantle with fewer radiolucent lines.