227 results for Milling machines
Abstract:
This paper discusses the main milling train management tasks necessary for maintaining good extraction performance through a season. The main activities discussed are making week-by-week decisions about shredder and mill setting adjustments, and selecting preseason mill settings. To maintain satisfactory milling train extraction performance, the main factors affecting extraction should be examined: cane preparation (via pol in open cells or shredder torque), delivery nip compaction (via the load or torque controller outputs such as roll lift, feed chute flap position or pressure feeder-to-mill speed ratio), and added water rate. To select mill settings for the coming season, delivery nip compaction and feed chute exit compaction can be inferred from previous seasons' performance.
Abstract:
This paper describes recent updates to a milling train extraction model used to assess and predict the performance of a milling train. An extension was made to the milling unit model for the bagasse mills to replace the imbibition coefficient with a crushing factor and mixing efficiency. New empirical relationships for reabsorption factor, imbibition coefficient, crushing factor, mixing efficiency and purity ratio were developed. The new empirical relationships were tested against factory measurements and previous model predictions. The updated model has been implemented in the SysCAD process modelling software. New additions to the model implementation include a shredder model to assess or predict cane preparation, mill and shredder drives for power consumption, and an updated imbibition control system to allow water to be added to intermediate mills.
Abstract:
The increase in data center dependent services has made energy optimization of data centers one of the most exigent challenges in today's Information Age. The necessity of green and energy-efficient measures is very high for reducing carbon footprint and exorbitant energy costs. However, inefficient application management of data centers results in high energy consumption and low resource utilization efficiency. Unfortunately, in most cases, deploying an energy-efficient application management solution inevitably degrades the resource utilization efficiency of the data centers. To address this problem, a Penalty-based Genetic Algorithm (GA) is presented in this paper to solve a defined profile-based application assignment problem whilst maintaining a trade-off between the power consumption performance and resource utilization performance. Case studies show that the penalty-based GA is highly scalable and provides 16% to 32% better solutions than a greedy algorithm.
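As a rough illustration of the penalty idea described above, the following Python sketch scores a candidate application-to-server assignment by combining an active-server count (a proxy for power consumption) with a large penalty for any capacity violation, and evolves assignments with a tiny mutation-only GA. The demands, capacities and operator choices are invented for illustration and do not reproduce the paper's profile-based formulation.

```python
import random

# Hypothetical problem data: CPU demand of each application profile and
# capacity of each server (values are illustrative, not from the paper).
app_demand = [4, 7, 3, 6, 2, 5]
server_capacity = [10, 10, 12]

def fitness(assignment, penalty_weight=100.0):
    """Lower is better: active-server count plus a penalty for
    any server whose capacity is exceeded."""
    load = [0] * len(server_capacity)
    for app, srv in enumerate(assignment):
        load[srv] += app_demand[app]
    active = sum(1 for l in load if l > 0)               # proxy for power use
    overload = sum(max(0, l - c) for l, c in zip(load, server_capacity))
    return active + penalty_weight * overload            # penalise infeasibility

def mutate(assignment, rate=0.2):
    return [random.randrange(len(server_capacity)) if random.random() < rate else s
            for s in assignment]

# Tiny mutation-only GA loop, for illustration only.
population = [[random.randrange(len(server_capacity)) for _ in app_demand]
              for _ in range(20)]
for _ in range(200):
    population += [mutate(p) for p in population]
    population = sorted(population, key=fitness)[:20]

print(population[0], fitness(population[0]))
```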
Abstract:
Although live VM migration has been intensively studied, the problem of live migration of multiple interdependent VMs has hardly been investigated. The most important problem in the live migration of multiple interdependent VMs is how to schedule VM migrations as the schedule will directly affect the total migration time and the total downtime of those VMs. Aiming at minimizing both the total migration time and the total downtime simultaneously, this paper presents a Strength Pareto Evolutionary Algorithm 2 (SPEA2) for the multi-VM migration scheduling problem. The SPEA2 has been evaluated by experiments, and the experimental results show that the SPEA2 can generate a set of VM migration schedules with a shorter total migration time and a shorter total downtime than an existing genetic algorithm, namely Random Key Genetic Algorithm (RKGA). This paper also studies the scalability of the SPEA2.
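To make the two objectives concrete, the sketch below evaluates the total migration time and total downtime of a candidate schedule under a deliberately simplified one-round pre-copy model, and checks Pareto dominance between two schedules. The memory sizes, dirty rates and the purely sequential-migration assumption are illustrative only and are not the paper's model.

```python
# Illustrative only: simplified objective evaluation for a multi-VM
# migration schedule (a permutation of VMs migrated one at a time).
# In the real problem concurrent migrations share bandwidth and VMs are
# interdependent, so the schedule matters; here migrations are sequential
# for brevity.

vm_memory_gb = [4.0, 8.0, 2.0, 16.0]      # memory footprint per VM (assumed)
dirty_rate_gbps = [0.1, 0.3, 0.05, 0.4]   # page-dirtying rate while migrating (assumed)
bandwidth_gbps = 1.0

def evaluate(schedule):
    total_time = 0.0
    total_downtime = 0.0
    for vm in schedule:
        copy_time = vm_memory_gb[vm] / bandwidth_gbps      # first full memory copy
        dirtied = dirty_rate_gbps[vm] * copy_time          # pages dirtied meanwhile
        downtime = dirtied / bandwidth_gbps                # stop-and-copy phase
        total_time += copy_time + downtime
        total_downtime += downtime
    return total_time, total_downtime

def dominates(a, b):
    """Pareto dominance on (total_time, total_downtime), both minimised."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

s1, s2 = [0, 1, 2, 3], [3, 1, 0, 2]
print(evaluate(s1), evaluate(s2), dominates(evaluate(s1), evaluate(s2)))
```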
Abstract:
This paper reviews the innovations that have been introduced in the milling train at Rocky Point mill since 2001 and provides some operational, performance and maintenance comparisons of the technologies in use. The decision to install BHEM mills in the #2 and #3 mill positions to complement the six-roll mills in the #1 and #4 mill positions has proven a good one. Satisfactory performance is being obtained by these mills while maintenance costs are significantly less. Very good #1 mill extraction and final bagasse moisture content are being achieved. The innovation of using Hägglunds hydraulic drives at higher speed…
Abstract:
Being able to accurately predict the risk of falling is crucial in patients with Parkinson's disease (PD). This is due to the unfavorable effect of falls, which can lower quality of life as well as directly impact survival. Three methods considered for predicting falls are decision trees (DT), Bayesian networks (BN), and support vector machines (SVM). Data from a 1-year prospective study conducted at IHBI, Australia, on 51 people with PD are used. Data processing is conducted using the rpart and e1071 packages in R for DT and SVM, respectively, and Bayes Server 5.5 for the BN. The results show that BN and SVM produce consistently higher accuracy over the 12-month evaluation time points (average sensitivity and specificity > 92%) than DT (average sensitivity 88%, average specificity 72%). DT is sensitive to imbalanced data and so needs adjustment of the misclassification cost. However, DT provides a straightforward, interpretable result and thus is appealing for helping to identify important items related to falls and to generate fallers' profiles.
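A minimal sketch of the kind of comparison described, reproduced here in Python with scikit-learn stand-ins (the study itself used R's rpart and e1071 plus Bayes Server) on synthetic data; class weighting stands in for the misclassification-cost adjustment mentioned above.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                  # stand-in clinical features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0.8).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

def sens_spec(model):
    tn, fp, fn, tp = confusion_matrix(y_te, model.fit(X_tr, y_tr).predict(X_te)).ravel()
    return tp / (tp + fn), tn / (tn + fp)

# class_weight="balanced" plays the role of adjusting the misclassification
# cost for the imbalanced faller/non-faller classes.
print("DT  sens/spec:", sens_spec(DecisionTreeClassifier(class_weight="balanced", random_state=0)))
print("SVM sens/spec:", sens_spec(SVC(class_weight="balanced")))
```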
Abstract:
In the field of workplace air quality, measuring and analyzing the size distribution of airborne particles to identify their sources and apportion their contribution has become widely accepted; however, the driving factors that influence this parameter, particularly for nanoparticles (< 100 nm), have not been thoroughly determined. Identification of driving factors, and in turn, general trends in the size distribution of emitted particles would facilitate the prediction of nanoparticles' emission behavior and significantly contribute to their exposure assessment. In this study, a comprehensive analysis of the particle number size distribution data, with a particular focus on the ultrafine size range of synthetic clay particles emitted from a jet milling machine, was conducted using the multi-lognormal fitting method. The results showed a relatively high contribution of nanoparticles to the emissions in many of the tested cases, and also that both surface treatment and feed rate of the machine are significant factors influencing the size distribution of the emitted particles of this size. In particular, applying surface treatments and increasing the machine feed rate have a similar effect of reducing the size of the particles; however, no general trend was found in the variation of size distribution across different surface treatments and feed rates. The findings of our study demonstrate that for this process and other activities, where no general trend is found in the size distribution of the emitted airborne particles due to dissimilar effects of the driving factors, each case must be treated separately in terms of workplace exposure assessment and regulations.
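The sketch below shows the general shape of multi-lognormal fitting of a number size distribution: a sum of lognormal modes in dN/dlogDp is fitted with non-linear least squares. The two-mode choice, diameters and parameter values are synthetic assumptions, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def lognormal_mode(dp, n_total, dp_g, sigma_g):
    """One lognormal mode of dN/dlogDp with total number n_total,
    geometric mean diameter dp_g and geometric std dev sigma_g."""
    return (n_total / (np.sqrt(2 * np.pi) * np.log10(sigma_g))
            * np.exp(-(np.log10(dp) - np.log10(dp_g)) ** 2
                     / (2 * np.log10(sigma_g) ** 2)))

def two_modes(dp, n1, d1, s1, n2, d2, s2):
    return lognormal_mode(dp, n1, d1, s1) + lognormal_mode(dp, n2, d2, s2)

dp = np.logspace(np.log10(10), np.log10(1000), 60)        # 10 nm to 1000 nm
true = two_modes(dp, 5e3, 40, 1.6, 2e3, 250, 1.8)         # synthetic "measurement"
noisy = true * (1 + 0.05 * np.random.default_rng(1).normal(size=dp.size))

popt, _ = curve_fit(two_modes, dp, noisy,
                    p0=[4e3, 50, 1.5, 1e3, 200, 1.7], maxfev=20000)
print("fitted (N, Dp_g, sigma_g) per mode:", popt)
```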
Abstract:
Current IEEE 802.11 wireless networks are vulnerable to session hijacking attacks as the existing standards fail to address the lack of authentication of management frames and network card addresses, and rely on loosely coupled state machines. Even the new WLAN security standard - IEEE 802.11i does not address these issues. In our previous work, we proposed two new techniques for improving detection of session hijacking attacks that are passive, computationally inexpensive, reliable, and have minimal impact on network performance. These techniques utilise unspoofable characteristics from the MAC protocol and the physical layer to enhance confidence in the intrusion detection process. This paper extends our earlier work and explores usability, robustness and accuracy of these intrusion detection techniques by applying them to eight distinct test scenarios. A correlation engine has also been introduced to maintain the false positives and false negatives at a manageable level. We also explore the process of selecting optimum thresholds for both detection techniques. For the purposes of our experiments, Snort-Wireless open source wireless intrusion detection system was extended to implement these new techniques and the correlation engine. Absence of any false negatives and low number of false positives in all eight test scenarios successfully demonstrated the effectiveness of the correlation engine and the accuracy of the detection techniques.
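As a toy illustration of the correlation idea, the sketch below raises a hijack alert only when two independent detectors (here a MAC-layer sequence-number monitor and a received-signal-strength monitor) fire within a short window, suppressing isolated alerts. The window, field names and alert structure are assumptions and do not reflect the Snort-Wireless implementation.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    detector: str    # "seq" or "rssi" (hypothetical detector names)
    timestamp: float

def correlate(alerts, window_s=2.0):
    """Return timestamps where both detectors fired within window_s seconds."""
    seq = sorted(a.timestamp for a in alerts if a.detector == "seq")
    rssi = sorted(a.timestamp for a in alerts if a.detector == "rssi")
    return [t for t in seq if any(abs(t - u) <= window_s for u in rssi)]

alerts = [Alert("seq", 10.1), Alert("rssi", 10.8), Alert("seq", 55.0)]
print(correlate(alerts))   # -> [10.1]; the lone "seq" alert at 55.0 is suppressed
```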
Abstract:
John Frazer's architectural work is inspired by living and generative processes. Both evolutionary and revolutionary, it explores information ecologies and the dynamics of the spaces between objects. Fuelled by an interest in the cybernetic work of Gordon Pask and Norbert Wiener, and the possibilities of the computer and the "new science" it has facilitated, Frazer and his team of collaborators have conducted a series of experiments that utilize genetic algorithms, cellular automata, emergent behaviour, complexity and feedback loops to create a truly dynamic architecture. Frazer studied at the Architectural Association (AA) in London from 1963 to 1969, and later became unit master of Diploma Unit 11 there. He was subsequently Director of Computer-Aided Design at the University of Ulster - a post he held while writing An Evolutionary Architecture in 1995 - and a lecturer at the University of Cambridge. In 1983 he co-founded Autographics Software Ltd, which pioneered microprocessor graphics. Frazer was awarded a personal chair at the University of Ulster in 1984. In Frazer's hands, architecture becomes machine-readable, formally open-ended and responsive. His work as computer consultant to Cedric Price's Generator Project of 1976 (see p84) led to the development of a series of tools and processes; these have resulted in projects such as the Calbuild Kit (1985) and the Universal Constructor (1990). These subsequent computer-orientated architectural machines are makers of architectural form beyond the full control of the architect-programmer. Frazer makes much reference to the multi-celled relationships found in nature, and their ongoing morphosis in response to continually changing contextual criteria. He defines the elements that describe his evolutionary architectural model thus: "A genetic code script, rules for the development of the code, mapping of the code to a virtual model, the nature of the environment for the development of the model and, most importantly, the criteria for selection." In setting out these parameters for designing evolutionary architectures, Frazer goes beyond the usual notions of architectural beauty and aesthetics. Nevertheless his work is not without an aesthetic: some pieces are a frenzy of mad wire, while others have a modularity that is reminiscent of biological form. Algorithms form the basis of Frazer's designs. These algorithms determine a variety of formal results dependent on the nature of the information they are given. His work, therefore, is always dynamic, always evolving and always different. Designing with algorithms is also critical to other architects featured in this book, such as Marcos Novak (see p150). Frazer has made an unparalleled contribution to defining architectural possibilities for the twenty-first century, and remains an inspiration to architects seeking to create responsive environments. Architects were initially slow to pick up on the opportunities that the computer provides. These opportunities are both representational and spatial: computers can help architects draw buildings and, more importantly, they can help architects create varied spaces, both virtual and actual. Frazer's work was groundbreaking in this respect, and well before its time.
Abstract:
This paper describes the approach taken to the XML Mining track at INEX 2008 by a group at the Queensland University of Technology. We introduce the K-tree clustering algorithm in an Information Retrieval context by adapting it for document clustering. Document clustering involves many large-scale problems; K-tree scales well with large inputs due to its low complexity and offers promising results in terms of both efficiency and quality. Document classification was completed using Support Vector Machines.
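A minimal sketch of the two tasks described, using scikit-learn stand-ins: MiniBatchKMeans in place of the K-tree (which is not shown here) for clustering, and a linear SVM for classification, on toy documents and labels.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import MiniBatchKMeans
from sklearn.svm import LinearSVC

docs = ["sugar milling extraction", "mill roll maintenance",
        "wireless intrusion detection", "session hijacking wlan"]
labels = [0, 0, 1, 1]

X = TfidfVectorizer().fit_transform(docs)        # sparse term vectors

clusters = MiniBatchKMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
classifier = LinearSVC().fit(X, labels)          # supervised classification task

print("clusters:", clusters, "predictions:", classifier.predict(X))
```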
Abstract:
This paper proposes a new prognosis model based on health state estimation of machines for accurate assessment of remnant life. To evaluate the health stages of machines, a Support Vector Machine (SVM) classifier was employed to obtain the probability of each health state. Two case studies involving bearing failures were used to validate the proposed model: simulated bearing failure data and experimental data from an accelerated bearing test rig were used to train and test the model. The results obtained are very encouraging and show that the proposed prognostic model has the potential to be used as an estimation tool for machine remnant life prediction.
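A minimal sketch of the health-state idea, assuming a probabilistic multi-class SVM and an assumed remaining-life value per state; the feature values, four-state labelling and life table are invented for illustration and do not come from the paper.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Toy vibration-style features for four degradation states 0 (healthy) .. 3 (near failure)
X = np.vstack([rng.normal(loc=s, scale=0.4, size=(30, 3)) for s in range(4)])
y = np.repeat(np.arange(4), 30)

clf = SVC(probability=True).fit(X, y)            # probabilistic multi-class SVM

remaining_life_hours = np.array([1000.0, 600.0, 250.0, 50.0])   # assumed per state
p = clf.predict_proba([[1.8, 2.1, 1.9]])[0]      # new measurement
print("state probabilities:", p.round(3))
print("expected remnant life:", float(p @ remaining_life_hours), "hours")
```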
Abstract:
Changing informational constraints of practice, such as when using ball projection machines, has been shown to significantly affect movement coordination of skilled cricketers. To date, there has been no similar research on movement responses of developing batters, an important issue since ball projection machines are used heavily in cricket development programmes. Timing and coordination of young cricketers (n = 12, age = 15.6 ± 0.7 years) were analyzed during the forward defensive and forward drive strokes when facing a bowling machine and bowler (both with a delivery velocity of 28.14 ± 0.56 m·s⁻¹). Significant group performance differences were observed between the practice task constraints, with earlier initiation of the backswing, front foot movement, downswing, and front foot placement when facing the bowler compared to the bowling machine. Peak height of the backswing was higher when facing the bowler, along with a significantly larger step length. Altering the informational constraints of practice caused major changes to the information–movement couplings of developing cricketers. Data from this study were interpreted to emanate from differences in available specifying variables under the distinct practice task constraints. Considered with previous findings, results confirmed the need to ensure representative batting task constraints in practice, cautioning against an over-reliance on ball projection machines in cricket development programmes.
Abstract:
In condition-based maintenance (CBM), effective diagnostics and prognostics are essential tools for maintenance engineers to identify imminent faults and to predict the remaining useful life before components finally fail. This enables remedial actions to be taken in advance and production to be rescheduled if necessary. This paper presents a technique for accurate assessment of the remnant life of machines based on historical failure knowledge embedded in a closed-loop diagnostic and prognostic system. The technique uses the Support Vector Machine (SVM) classifier for both fault diagnosis and evaluation of the health stages of machine degradation. To validate the feasibility of the proposed model, data at five different severity levels for four typical faults from High Pressure Liquefied Natural Gas (HP-LNG) pumps were used for multi-class fault diagnosis. In addition, two sets of impeller-rub data were analysed and employed to predict the remnant life of the pump based on estimation of its health state. The results obtained were very encouraging and showed that the proposed prognosis system has the potential to be used as an estimation tool for machine remnant life prediction in real-life industrial applications.
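For the diagnosis side, the sketch below trains a one-vs-one multi-class SVM to separate four invented fault classes and reports per-class performance; the fault names, features and split are synthetic assumptions rather than HP-LNG pump data.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(1)
faults = ["impeller_rub", "misalignment", "bearing_wear", "cavitation"]  # assumed labels
# Toy spectral features whose mean shifts with the fault type
X = np.vstack([rng.normal(loc=i, scale=0.5, size=(40, 4)) for i in range(len(faults))])
y = np.repeat(np.arange(len(faults)), 40)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)
clf = SVC(decision_function_shape="ovo").fit(X_tr, y_tr)   # one-vs-one multi-class SVM
print(classification_report(y_te, clf.predict(X_te), target_names=faults))
```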
Abstract:
With service interaction modelling, it is customary to distinguish between two types of models: choreographies and orchestrations. A choreography describes interactions within a collection of services from a global perspective, where no service plays a privileged role. Instead, services interact in a peer-to-peer manner. In contrast, an orchestration describes the interactions between one particular service, the orchestrator, and a number of partner services. The main proposition of this work is an approach to bridge these two modelling viewpoints by synthesising orchestrators from choreographies. To start with, choreographies are defined using a simple behaviour description language based on communicating finite state machines. From such a model, orchestrators are initially synthesised in the form of state machines. It turns out that state machines are not suitable for orchestration modelling, because orchestrators generally need to engage in concurrent interactions. To address this issue, a technique is proposed to transform state machines into process models in the Business Process Modelling Notation (BPMN). Orchestrations represented in BPMN can then be augmented with additional business logic to achieve value-adding mediation. In addition, techniques exist for refining BPMN models into executable process definitions. The transformation from state machines to BPMN relies on Petri nets as an intermediary representation and leverages techniques from theory of regions to identify concurrency in the initial Petri net. Once concurrency has been identified, the resulting Petri net is transformed into a BPMN model. The original contributions of this work are: an algorithm to synthesise orchestrators from choreographies and a rules-based transformation from Petri nets into BPMN.
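As a toy illustration of moving from a global interaction model to one service's behaviour, the sketch below projects a small choreography (given as sender, receiver, message triples) onto the orchestrator role to obtain its send/receive state machine. The roles and messages are invented, and the Petri-net and BPMN transformation steps of the actual approach are not reproduced here.

```python
# Toy choreography as a sequence of interactions (sender, receiver, message).
choreography = [
    ("Customer", "Orchestrator", "order"),
    ("Orchestrator", "Supplier", "quote_request"),
    ("Supplier", "Orchestrator", "quote"),
    ("Orchestrator", "Customer", "confirmation"),
]

def project(interactions, role):
    """Build (state, action, next_state) transitions for one role by keeping
    only the interactions that role participates in."""
    transitions, state = [], 0
    for sender, receiver, msg in interactions:
        if sender == role:
            transitions.append((state, f"send {msg} to {receiver}", state + 1))
            state += 1
        elif receiver == role:
            transitions.append((state, f"receive {msg} from {sender}", state + 1))
            state += 1
    return transitions

for t in project(choreography, "Orchestrator"):
    print(t)
```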