15 results for Data handling

in Deakin Research Online - Australia


Relevance:

70.00%

Publisher:

Abstract:

The workshop will first provide an overview of the problems associated with missing data in the context of clinical trials and how to minimise them. Missing data will be explored by modelling their impact on a number of datasets. This approach is invaluable in highlighting how alternative methods for controlling for missing data differentially affect the interpretation of study findings. Popular strategies involve options based on an assessment of the percentage of missing data. More innovative approaches to the management of missing data (e.g. those based on reliability analyses) will be explored and evaluated, and the role of the most popular methods of data management will be examined in several study designs beyond the classic randomised controlled trial. Participants will have the opportunity to appraise and debate existing methods of missing data handling.
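
As a rough illustration of how the choice of missing-data strategy can shift findings, the sketch below compares complete-case analysis with a simple regression imputation on a simulated two-arm trial; the data, missingness mechanism and effect sizes are entirely hypothetical and merely stand in for the kind of datasets modelled in the workshop.

```python
# Minimal sketch (hypothetical data): two missing-data strategies and their
# effect on the estimated treatment effect of a simulated two-arm trial.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "arm": rng.integers(0, 2, n),            # 0 = control, 1 = treatment
    "baseline": rng.normal(50, 10, n),
})
df["outcome"] = df["baseline"] + 5 * df["arm"] + rng.normal(0, 5, n)

# Introduce roughly 20% missingness, more likely for low baseline scores
miss = rng.random(n) < (0.2 * (df["baseline"] < 50) + 0.1).to_numpy()
df.loc[miss, "outcome"] = np.nan

# Strategy 1: complete-case analysis (drop rows), often justified by a
# "percentage of missing data" rule of thumb.
cc = df.dropna()
effect_cc = cc.groupby("arm")["outcome"].mean().diff().iloc[-1]

# Strategy 2: simple regression imputation using the baseline covariate.
slope, intercept = np.polyfit(cc["baseline"], cc["outcome"], 1)
imputed = df.copy()
imputed.loc[miss, "outcome"] = intercept + slope * imputed.loc[miss, "baseline"]
effect_imp = imputed.groupby("arm")["outcome"].mean().diff().iloc[-1]

print(f"complete-case effect: {effect_cc:.2f}, imputed effect: {effect_imp:.2f}")
```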

Relevance:

60.00%

Publisher:

Abstract:

TLAStat+ is a powerful but easy-to-use statistics software package. It is an add-in for statistics used in conjunction with Microsoft Excel, providing extra calculation, graphing, and help features for statistical data analysis. TLAStat+ is designed for teaching introductory to intermediate courses in statistics. It runs on PCs under various versions of Excel and has powerful data handling capabilities.

Relevance:

60.00%

Publisher:

Abstract:

Offshore wind turbines require more systematic operation and maintenance strategies to ensure that systems are safe, profitable and cost-effective. Condition monitoring and fault diagnostic systems play an important role in offshore wind turbines in order to cut down maintenance and operational costs. Condition monitoring techniques describe complex fault and failure mode types, and the traceable signs they generate, to provide cost-effective condition monitoring, predictive maintenance and diagnostic schemes. Continuously monitoring the condition of critical parts is the most efficient way to improve the reliability of wind turbines. Implementation of a Condition Based Maintenance (CBM) strategy provides timely maintenance decisions and Predictive Health Monitoring (PHM) data to prevent breakdowns and machine downtime. Fault detection and CBM implementation are challenging for offshore wind farms due to the complexity of remote sensing, component health and predictive assessment, data collection, data analysis, data handling, state recognition, and advisory decision-making. The rapid expansion of wind farms, advanced technological development and harsh installation sites demand a successful CM approach. This paper briefly reviews the status of recent developments in CM techniques, focusing on the major faults that occur in the gearbox and bearings, rotor and blades, pitch, yaw and tower systems, and the generator and control systems.

Relevance:

40.00%

Publisher:

Abstract:

Airport baggage handling systems (BHS) are a critical infrastructure component within major airports, essential to ensure smooth luggage transfer while preventing dangerous material from being loaded onto aircraft. This paper proposes a standard set of measures to assess the expected performance of a baggage handling system through discrete event simulation. These evaluation methods also have application in the study of general network systems. Results from the application of these methods reveal operational characteristics of the studied BHS in terms of metrics such as peak throughput, in-system time and system recovery time.
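
A minimal sketch of how such measures might be computed from a simulation's event log; the bag records, time resolution and in-system counts below are hypothetical placeholders, not the paper's actual model or data.

```python
# Minimal sketch (hypothetical event log): three performance measures for a
# baggage handling system simulation.
from collections import Counter

# Each record: (bag_id, entry_time_s, exit_time_s) -- placeholder data
events = [("bag1", 0, 180), ("bag2", 30, 240), ("bag3", 35, 600), ("bag4", 400, 560)]

# In-system time: how long each bag spends between check-in and make-up
in_system = [exit_t - entry_t for _, entry_t, exit_t in events]
avg_in_system = sum(in_system) / len(in_system)

# Peak throughput: the largest number of bags delivered in any one-minute window
per_minute = Counter(exit_t // 60 for _, _, exit_t in events)
peak_throughput = max(per_minute.values())

# System recovery time: minutes after a disruption until the number of bags in
# the system falls back to its pre-disruption level (counts are hypothetical)
in_system_counts = [10, 12, 40, 35, 20, 11, 9]   # bags in system, per minute
disruption_minute, baseline = 2, 12
recovery = next((m - disruption_minute
                 for m, c in enumerate(in_system_counts)
                 if m > disruption_minute and c <= baseline), None)

print(avg_in_system, peak_throughput, recovery)
```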

Relevance:

30.00%

Publisher:

Abstract:

Data mining refers to extracting or "mining" knowledge from large amounts of data. It is an increasingly popular field that uses statistical, visualization, machine learning, and other data manipulation and knowledge extraction techniques to gain insight into the relationships and patterns hidden in data. The availability of digital data within picture archiving and communication systems raises the possibility of enhancing health care and research through the manipulation, processing and handling of data by computers. That is the basis for the development of computer-assisted radiology. Further development of computer-assisted radiology is associated with the use of new intelligent capabilities, such as multimedia support and data mining, in order to discover the knowledge relevant for diagnosis. It is very useful if the results of data mining can be communicated to humans in an understandable way. In this paper, we present our work on data mining in medical image archiving systems. We investigate the use of a very efficient data mining technique, the decision tree, in order to learn the knowledge needed for computer-assisted image analysis. We apply our method to the classification of x-ray images for lung cancer diagnosis. The proposed technique is based on an inductive decision tree learning algorithm that has low complexity with high transparency and accuracy. The results show that the proposed algorithm is robust, accurate and fast, and that it produces a comprehensible structure summarizing the knowledge it induces.
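
The general idea of inducing a transparent decision tree from image-derived features can be sketched with scikit-learn; the feature names and synthetic data below are hypothetical, and DecisionTreeClassifier merely stands in for the authors' inductive learning algorithm.

```python
# Minimal sketch: a shallow decision tree over image-derived features, chosen
# shallow so the induced structure stays human-readable.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
# Pretend each x-ray has been reduced to a handful of texture/shape features
X = rng.normal(size=(300, 4))
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.5, 300) > 0).astype(int)  # 1 = suspicious

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X_train, y_train)

print("test accuracy:", tree.score(X_test, y_test))
# The induced rules are readable, matching the comprehensibility claim
print(export_text(tree, feature_names=["mean_intensity", "contrast",
                                       "nodule_area", "edge_density"]))
```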

Relevance:

30.00%

Publisher:

Abstract:

Wireless sensor networks (WSNs) have been proposed as a powerful means of fine-grained monitoring in different classes of applications, at very low cost and for extended periods of time. Among various solutions, supporting WSNs with intelligent mobile platforms that handle data management has proven its benefits in extending network lifetime and enhancing performance. The mobility model applied strongly affects data latency in the network as well as the sensors' energy consumption levels. Intelligence-based models that take the network's runtime conditions into consideration are adopted to overcome such problems. In this chapter, existing proposals that use intelligent mobility for managing data in WSNs are surveyed. Different classifications are presented throughout the chapter to give a complete view of the solutions in this domain. Furthermore, these models are compared with respect to various metrics and design goals.

Relevance:

30.00%

Publisher:

Abstract:

In this study, we develop deterministic metamodels to quickly and precisely predict the future behaviour of a technically complex system. The underlying system is a stochastic, discrete event simulation model of a large baggage handling system. The highly detailed simulation model is used to conduct experiments and log data, which are then used to train artificial neural network metamodels. The results demonstrate that the developed metamodels are well able to predict different performance measures related to the travel time of bags within the system. In contrast to simulation models, which are computationally expensive and require extensive expertise to develop, run and maintain, the artificial neural network metamodels can serve as real-time decision-aiding tools that are considerably fast, precise, simple to use, and reliable.
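
A minimal sketch of the metamodelling workflow: logged simulation inputs and outputs are used to fit a neural-network regressor that then answers queries near-instantly. The input factors, the synthetic "logged" data and the MLPRegressor choice are assumptions for illustration, not the authors' actual setup.

```python
# Minimal sketch: training a neural-network metamodel on logged simulation runs
# to predict mean bag travel time.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# Each row: simulation inputs (arrival rate, conveyor speed, screening capacity)
X = rng.uniform([50, 1.0, 2], [200, 3.0, 8], size=(500, 3))
# Logged output: mean bag travel time from the detailed simulation (synthetic here)
y = 300 + 2.0 * X[:, 0] - 60 * X[:, 1] - 15 * X[:, 2] + rng.normal(0, 20, 500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
metamodel = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(32, 16),
                                       max_iter=2000, random_state=0))
metamodel.fit(X_train, y_train)

# Once trained, predictions are near-instant compared with re-running the simulation
print("example predictions:", metamodel.predict(X_test[:3]))
print("R^2 on held-out runs:", metamodel.score(X_test, y_test))
```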

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a triple-random ensemble learning method for handling multi-label classification problems. The proposed method integrates and develops the concepts of the random subspace, bagging and random k-label sets ensemble learning methods to form an approach for classifying multi-label data. It applies the random subspace method to the feature space, the label space and the instance space. The devised subset selection procedure is executed iteratively. Each multi-label classifier is trained using the randomly selected subsets. At the end of the iteration, optimal parameters are selected and the ensemble of multi-label classifiers is constructed. The proposed method is implemented and its performance compared against that of popular multi-label classification methods. The experimental results reveal that the proposed method outperforms the examined counterparts in most cases when tested on six small-to-large multi-label datasets from different domains. This demonstrates that the developed method is generally applicable to a variety of multi-label classification problems.
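
The triple-random idea, as described, can be sketched as each ensemble member training on a random subset of instances, features and labels; the base learner, parameter values and voting rule below are illustrative assumptions rather than the paper's exact procedure.

```python
# Minimal sketch of the triple-random idea: each ensemble member sees a random
# subset of instances (bagging), features (random subspace), and labels
# (random k-label sets). Assumes Y has at least two label columns.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def train_triple_random_ensemble(X, Y, n_members=10, k_labels=3,
                                 feat_frac=0.5, inst_frac=0.8, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    q = Y.shape[1]
    members = []
    for _ in range(n_members):
        rows = rng.choice(n, size=int(inst_frac * n), replace=True)           # instance space
        cols = rng.choice(d, size=max(1, int(feat_frac * d)), replace=False)  # feature space
        labels = rng.choice(q, size=min(k_labels, q), replace=False)          # label space
        clf = DecisionTreeClassifier(random_state=0)     # multi-output base learner
        clf.fit(X[np.ix_(rows, cols)], Y[np.ix_(rows, labels)])
        members.append((cols, labels, clf))
    return members

def predict_votes(members, X, q):
    votes, counts = np.zeros((X.shape[0], q)), np.zeros(q)
    for cols, labels, clf in members:
        votes[:, labels] += clf.predict(X[:, cols])      # accumulate per-label votes
        counts[labels] += 1
    return (votes / np.maximum(counts, 1)) > 0.5         # majority vote per label

# Usage (synthetic): members = train_triple_random_ensemble(X, Y)
#                    Y_hat = predict_votes(members, X_new, Y.shape[1])
```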

Relevance:

30.00%

Publisher:

Abstract:

Stressors of various kinds constantly affect fish both in the wild and in culture, examples being acute changes in water temperature and quality, predation, handling, and confinement. Known physiological responses of fish to stress, such as increases in plasma cortisol and glucose levels, are considered to be adaptive, allowing the animal to cope in the short term. Prolonged exposure to stressors, however, has the potential to affect growth, immune function, and survival. Nonetheless, little is known about the mechanisms underlying the long-term stress response. We investigated the metabolic response of juvenile Atlantic salmon (Salmo salar) to long-term handling stress by analyzing fish plasma via 1H nuclear magnetic resonance (NMR) spectroscopy and ultra-high-performance liquid chromatography–mass spectrometry (UPLC–MS), and comparing the results with controls. Analysis of the NMR data indicated a difference in the metabolic profiles of control and stressed fish after 1 week of stress, with a maximum difference observed after 2 weeks. These differences were associated with stress-induced increases in phosphatidylcholine, lactate, carbohydrates, alanine, valine and trimethylamine-N-oxide, and decreases in low-density lipoprotein, very-low-density lipoprotein, and lipid. UPLC–MS data showed differences at week 2 associated with another set of compounds, tentatively identified on the basis of their mass/charge ratios. Overall, the results provided a multi-faceted view of the response of fish to long-term handling stress, indicating that the metabolic disparity between the control and stress groups increased to week 2 but declined by weeks 3 and 4, and revealed several new molecular indicators of long-term stress.

Relevance:

30.00%

Publisher:

Abstract:

Coal handling is a complex process involving different correlated and highly dependent operations, such as selecting appropriate product types, planning stockpiles, scheduling stacking and reclaiming activities, and managing train loads. Planning these operations manually is time consuming and can result in non-optimized schedules, as the future impact of decisions may not be appropriately considered. This paper addresses the operational scheduling of the continuous coal handling problem with multiple conflicting objectives. As the problem is NP-hard in nature, an effective heuristic is presented for planning stockpiles and scheduling resources to minimize delays in production and the age of coal in the stockyard. A model of stockyard operations within a coal mine is described and the problem is formulated as a Bi-Objective Optimization Problem (BOOP). The efficacy of the algorithm is demonstrated on different real-life data scenarios. Computational results show that the solution algorithm is effective and that coal throughput is substantially affected by the conflicting objectives. Together, the model and the proposed heuristic can act as a decision support system for the stockyard planner to explore the effects of alternative decisions, such as balancing the age and volume of stockpiles and minimizing conflicts due to stacker and reclaimer movements.
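
A toy sketch of the bi-objective tension: a greedy rule that scores stockpiles by a weighted trade-off between coal age and reclaim delay. The stockpile data, weights and scoring rule are hypothetical and far simpler than the paper's heuristic.

```python
# Minimal sketch (toy data): choosing which stockpile to reclaim next by
# trading off coal age against the production delay the reclaim would cause.
stockpiles = [
    # (name, tonnes, age_in_days, reclaim_rate_tph)
    ("SP1", 40_000, 12, 4_000),
    ("SP2", 25_000, 3, 4_000),
    ("SP3", 60_000, 20, 3_500),
]

def score(pile, w_age=0.6, w_delay=0.4):
    name, tonnes, age, rate = pile
    delay_hours = tonnes / rate          # time the reclaimer is tied up
    # Older coal should be reclaimed sooner; longer delays are penalised
    return w_age * age - w_delay * delay_hours

# Reclaim old-but-quick stockpiles first; in practice re-score after each decision
plan = sorted(stockpiles, key=score, reverse=True)
print([name for name, *_ in plan])
```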

Relevance:

30.00%

Publisher:

Abstract:

Stochastic search techniques such as evolutionary algorithms (EAs) are known to be better explorers of the search space than conventional techniques, including deterministic methods. However, in the era of big data, the suitability of evolutionary algorithms, like that of most other search methods and learning algorithms, is naturally questioned. Big data pose new computational challenges, including very high dimensionality and sparseness of data. The superior exploration skills of evolutionary algorithms should make them promising candidates for handling optimization problems involving big data. High-dimensional problems introduce added complexity to the search space, so EAs need to be enhanced to ensure that the majority of potential winning solutions get the chance to survive and mature. In this paper we present an evolutionary algorithm with an enhanced ability to deal with the problems of high dimensionality and sparseness of data. In addition to an informed exploration of the solution space, the technique balances exploration and exploitation using a hierarchical multi-population approach. The proposed model uses informed genetic operators to introduce diversity by expanding the scope of the search process at the expense of redundant, less promising members of the population. The next phase of the algorithm deals with the problem of high dimensionality by ensuring a broader and more exhaustive search and preventing the premature death of potential solutions. To achieve this, in addition to the above exploration-controlling mechanism, a multi-tier hierarchical architecture is employed in which, in separate layers, the less fit isolated individuals evolve in dynamic sub-populations that coexist alongside the original or main population. Evaluation of the proposed technique on well-known benchmark problems confirms its superior performance. The algorithm has also been successfully applied to a real-world problem of financial portfolio management. Although the proposed method cannot be considered big data-ready, it is certainly a move in the right direction.
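
A minimal sketch of the two-tier idea, under the assumption that "less fit" individuals are demoted to a side population that keeps evolving and can later promote matured solutions back into the main population; the toy objective, operators and parameters below are illustrative, not the authors' actual scheme.

```python
# Minimal sketch: a two-tier population GA where weaker individuals are moved
# to a side population instead of being discarded outright.
import random

def fitness(x):                      # toy high-dimensional sphere problem
    return -sum(v * v for v in x)

def mutate(x, sigma=0.1):
    return [v + random.gauss(0, sigma) for v in x]

def evolve(dim=50, pop_size=40, generations=200, elite_frac=0.5):
    main = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    side = []                                        # shelter for weaker solutions
    for _ in range(generations):
        main.sort(key=fitness, reverse=True)
        cut = int(elite_frac * pop_size)
        side.extend(main[cut:])                      # demote instead of culling
        main = main[:cut] + [mutate(random.choice(main[:cut]))
                             for _ in range(pop_size - cut)]
        if side:                                     # evolve the side tier more aggressively
            side = sorted((mutate(x, sigma=0.3) for x in side),
                          key=fitness, reverse=True)[:pop_size]
            if fitness(side[0]) > fitness(main[-1]): # promote a matured outsider
                main[-1] = side.pop(0)
    return max(main, key=fitness)

best = evolve()
print(round(-fitness(best), 3))
```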

Relevance:

30.00%

Publisher:

Abstract:

The focus of this paper is on handling non-monotone information in the modelling process of a single-input target monotone system. On the one hand, the monotonicity property is a piece of useful prior (or additional) information that can be exploited for modelling a monotone target system. On the other hand, it is difficult to model a monotone system if the available information is not monotonically ordered. In this paper, an interval-based method for analysing non-monotonically ordered information is proposed. The applicability of the proposed method to handling a non-monotone function, a non-monotone data set, and an incomplete and/or non-monotone fuzzy rule base is presented. The upper and lower bounds of the interval are first defined. The region governed by the interval is interpreted as a coverage measure, whose size represents the uncertainty pertaining to the available information. The proposed approach constitutes a new method for transforming non-monotonic information into an interval-valued monotone system. The proposed interval-based method for handling an incomplete and/or non-monotone fuzzy rule base constitutes a new fuzzy reasoning approach.
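
One plausible reading of the interval construction is sketched below: a non-monotone data set is wrapped between lower and upper monotone bounds, and the gap between them serves as the coverage measure. The exact bound and coverage definitions used in the paper may differ.

```python
# Minimal sketch: enclosing a non-monotone single-input data set between lower
# and upper monotonically increasing bounds; the area between them measures the
# uncertainty in the available information.
import numpy as np

x = np.array([1, 2, 3, 4, 5, 6], dtype=float)
y = np.array([1.0, 2.5, 2.0, 3.5, 3.0, 4.0])   # not monotonically ordered

# Lower bound: largest non-decreasing curve below the data (suffix running minimum)
lower = np.minimum.accumulate(y[::-1])[::-1]
# Upper bound: smallest non-decreasing curve above the data (prefix running maximum)
upper = np.maximum.accumulate(y)

coverage = float(np.sum(upper - lower))          # simple coverage (uncertainty) measure
print(lower, upper, coverage)
```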

Relevance:

30.00%

Publisher:

Abstract:

In this paper, a Radial Basis Function Network (RBFN) trained with the Dynamic Decay Adjustment (DDA) algorithm (i.e., RBFNDDA) is deployed as an incremental learning model for tackling transfer learning problems. An online learning strategy is exploited to allow the RBFNDDA model to transfer knowledge from one domain and apply it to classification tasks in a different yet related domain. An experimental study is carried out to evaluate the effectiveness of the online RBFNDDA model using a benchmark data set obtained from a public domain. The results are analyzed and compared with those from other methods. The outcomes positively reveal the potential of the online RBFNDDA model in handling transfer learning tasks.
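
For context, a compact sketch of a standard Dynamic Decay Adjustment training loop (commit, reinforce, shrink), which is what makes the RBFN incrementally trainable and hence reusable across domains. The threshold values and the prototype re-use shown here are generic assumptions, not the authors' specific transfer-learning setup.

```python
# Minimal sketch of DDA-style training: prototypes are committed, reinforced or
# shrunk one sample at a time, so learning can continue online.
import numpy as np

THETA_POS, THETA_NEG = 0.4, 0.1

def activation(x, center, sigma):
    return np.exp(-np.sum((x - center) ** 2) / sigma ** 2)

def dda_train(samples, prototypes=None):
    """samples: iterable of (x, class_label). Passing in an existing prototype
    list lets knowledge from a previous domain be carried over and refined."""
    prototypes = [] if prototypes is None else prototypes
    for x, label in samples:
        x = np.asarray(x, dtype=float)
        covered = [p for p in prototypes if p["label"] == label
                   and activation(x, p["center"], p["sigma"]) >= THETA_POS]
        if covered:
            covered[0]["weight"] += 1.0            # reinforce an existing prototype
        else:                                      # commit a new, wide prototype
            prototypes.append({"center": x, "sigma": 1e3, "weight": 1.0, "label": label})
        for p in prototypes:                       # shrink prototypes of other classes
            if p["label"] != label:
                d = np.linalg.norm(x - p["center"])
                if d > 0:
                    p["sigma"] = min(p["sigma"], d / np.sqrt(-np.log(THETA_NEG)))
    return prototypes

def dda_predict(x, prototypes):
    scores = {}
    for p in prototypes:
        scores[p["label"]] = scores.get(p["label"], 0.0) + \
            p["weight"] * activation(np.asarray(x, dtype=float), p["center"], p["sigma"])
    return max(scores, key=scores.get)
```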

Relevance:

30.00%

Publisher:

Abstract:

Musculoskeletal injuries are reported as a burden on the military. An identified risk factor for injury is carrying heavy loads; however, soldiers are also required to wear their load as body armour. To investigate the effects of body armour on trunk and hip kinematics during military-specific manual handling tasks, 16 males completed three tasks while wearing each of four body armour conditions plus a control. Three-dimensional motion analysis captured and quantified all kinematic data. Average trunk flexion for the heaviest armour type was higher compared with the control during the carry component of the ammunition box lift (p < 0.001) and sandbag lift tasks (p < 0.001). Trunk rotation range of motion (ROM) was lower for all armour types compared with the control during the ammunition box place component (p < 0.001). The altered kinematics with body armour occurred independently of armour design. In order to optimise armour design, manufacturers need to work with end-users to explore how armour configurations interact with a range of personal and situational factors in operationally relevant environments. Practitioner Summary: Musculoskeletal injuries are reported as burdening the military and may relate to body armour wear. Body armour increased trunk flexion and reduced trunk rotation during military-specific lifting and carrying tasks. The altered kinematics may contribute to injury risk, but more research is required.

Relevance:

30.00%

Publisher:

Abstract:

Whilst atom probe tomography (APT) is a powerful technique with the capacity to gather information on hundreds of millions of atoms from a single specimen, the ability to use this information effectively creates significant challenges. The main technological bottleneck lies in handling the extremely large amounts of data on spatial-chemical correlations, as well as in developing new quantitative computational foundations for image reconstruction that target critical and transformative problems in materials science. The power to explore materials at the atomic scale, with the extraordinary sensitivity of detection offered by atom probe tomography, has not been fully harnessed due to the challenges of dealing with missing, sparse and often noisy data. Hence there is a profound need to couple the analytical tools for dealing with these data challenges with the experimental issues associated with the instrument. In this paper we provide a summary of some key issues associated with these challenges, and of solutions for extracting or "mining" fundamental materials science information from the data.