156 results for handling
Abstract:
Design for Manufacturing (DFM) is an integral methodology in product development, starting from the concept development phase, with the aim of improving manufacturing productivity. It is used to reduce manufacturing costs in complex production environments while maintaining product quality. Whereas Design for Assembly (DFA) focuses on eliminating parts or combining them with other components, which in most cases means performing a function and a manufacturing operation in a simpler way, DFM follows a more holistic approach. Common considerations in DFM are standard components, manufacturing tool inventory and capability, material compatibility with the production process, part handling, logistics, tool wear and process optimization, quality control complexity, and Poka-Yoke design. In DFM, the considerable background work required in the conceptual phase is compensated for by a shortening of later development phases. Current DFM projects normally apply an iterative step-by-step approach and eventually transfer to the development team. This study introduces a new, knowledge-based approach to DFM that eliminates steps of the DFM process, and shows its implications for the work process. Furthermore, a concurrent engineering process via a transparent interface between the manufacturing engineering and product development systems is brought forward.
Abstract:
Retrofit projects differ from new-build projects in many respects. A retrofit project involves an existing building, which imposes constraints on the owners, designers, operators and constructors throughout the project process. Retrofit projects are risky, complex, less predictable and difficult to plan well, and they need greater coordination. For office building retrofit projects, further restrictions apply, as these buildings are often located in CBD areas and most have to remain operational while project work progresses. Issues such as site space, material storage and handling, noise and dust need to be considered and well addressed. In this context, waste management is even more challenging, with small spaces for waste handling, uncertainties in waste control, and the impact of waste management activities on project delivery and building occupants. The current literature on waste management in office building retrofit projects focuses on increasing the waste recovery rate through project planning, monitoring and stakeholder collaboration. However, previous research has not produced knowledge of the particular retrofit processes and their impact on waste generation and management. This paper discusses the interim results of continuing research on new strategies for waste management in office building retrofit projects. First, based on the literature review, it summarizes the unique characteristics of office building retrofit projects and their influence on waste management, and an assumption about waste management strategies is formed. Semi-structured interviews were then conducted with industry practitioners, and the findings are presented in the paper. The research assumption was validated in the interviews by the opinions and experiences of the respondents. Finally, the research develops a process model for waste management in office building retrofit projects, introducing two different waste management strategies. In the dismantling phase, waste is generated quickly as the work progresses, so integrated planning of project delivery and waste generation is needed in order to organize prompt handling and treatment. In the fit-out phase, the work is similar to new construction; factors particularly linked to on-site waste generation need to be controlled and monitored. Continuing research in this space will help improve the practice of waste management in office building retrofit projects. The new strategies will help promote the practicality of project waste planning and management, and stakeholders' capability to coordinate waste management and project delivery.
Abstract:
Workplace mobbing is a particularly serious phenomenon that is extremely costly to organizations and to the health of those targeted. This article reports on a study of self-identified targets of mobbing. The findings support a five-stage process of mobbing, which commences with unresolved conflict and leads ultimately to expulsion from the organization. Participants report a number of experiences, such as lengthy investigations and escalation of conflict, that progressively shift the balance of power away from the individual and towards the organization. The study reveals a mismatch between the expected organizational justice processes and support and the actual experience. Recommendations for approaching this problem are discussed.
Abstract:
Past empirical research has found little support for the efficacy of motorcycle rider training as a road safety countermeasure. However, it has been argued that rider training should focus more specifically on the psychosocial factors that influence risk-taking behaviour, in addition to the traditional practice of developing vehicle-handling skills. This paper examines how rider training to reduce risk taking could be guided by appropriate theories. Two fundamental perspectives are examined: firstly, training can be considered in terms of behaviour change, and secondly, in terms of adult learning. Whilst behaviour change theories assume some pre-existing level of dysfunctional behaviour, an adult learning perspective does not necessarily carry this assumption. This distinction in perspectives conceptually aligns with the notions of intervention and prevention (respectively), with possible implications for specific target groups for pre-licence and post-licence training. The application of the Theory of Reasoned Action (Ajzen & Fishbein, 1975, 1980) and Transformative Learning Theory (Mezirow, 1997) to a pre-licence rider training program in Queensland, Australia is discussed.
Abstract:
The Poisson distribution has often been used for count data such as accident counts. The Negative Binomial (NB) distribution has been adopted for such count data to address the over-dispersion problem. However, Poisson and NB distributions are incapable of accounting for the unobserved heterogeneities due to spatial and temporal effects in accident data. To overcome this problem, Random Effect models have been developed. Another challenge with existing traffic accident prediction models is the excess of zero accident observations in some accident data. Although the Zero-Inflated Poisson (ZIP) model is capable of handling the dual-state system in accident data with excess zero observations, it does not accommodate the within-location and between-location correlation heterogeneities that are the basic motivation for Random Effect models. This paper proposes an effective way of fitting a ZIP model with location-specific random effects, and recommends Bayesian analysis for model calibration and assessment.
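As a concrete illustration of the dual-state structure described above, the following sketch evaluates the ZIP log-likelihood for a small count sample. The data and parameter values are hypothetical, and the paper's location-specific random effects and Bayesian calibration are only indicated in comments, not implemented:

```cpp
#include <cmath>
#include <iostream>
#include <vector>

// Zero-inflated Poisson log-likelihood.
// pi     : probability of the structural "always-zero" state
// lambda : Poisson mean of the ordinary count state
double zipLogLik(const std::vector<int>& y, double pi, double lambda) {
  double ll = 0.0;
  for (int yi : y) {
    if (yi == 0)
      ll += std::log(pi + (1.0 - pi) * std::exp(-lambda));  // zero from either state
    else
      ll += std::log(1.0 - pi) - lambda + yi * std::log(lambda)
            - std::lgamma(yi + 1.0);                        // Poisson state only
  }
  return ll;
}

int main() {
  // Hypothetical accident counts with excess zeros.
  std::vector<int> y = {0, 0, 0, 1, 0, 2, 0, 0, 3, 0, 1, 0};
  std::cout << "log-likelihood: " << zipLogLik(y, 0.4, 1.2) << "\n";
  // Location-specific random effects would replace lambda with
  // exp(beta + u[site]), u[site] ~ N(0, sigma^2), calibrated by Bayesian MCMC.
}
```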
Abstract:
The use of containers has greatly reduced handling operations at ports and all other transfer points, thus increasing the efficiency and speed of transportation. This was done in an attempt to cut the cost of maritime transport, mainly by reducing cargo-handling costs and ships' time in port through faster handling operations. This paper discusses the major factors influencing the transfer efficiency of seaport container terminals. A network model is designed to analyse container progress through the system and is applied to a seaport container terminal. The model presented here can be seen as a decision support system in the context of investment appraisal of multimodal container terminals.
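The abstract does not spell out the network model itself; as a minimal, hedged sketch of the kind of container-progress analysis such a model supports, the following simulates containers flowing through a hypothetical three-stage transfer chain (quay crane, yard truck, stack crane) with assumed FIFO single-server stages and exponential times:

```cpp
#include <algorithm>
#include <iostream>
#include <random>
#include <vector>

int main() {
  // Hypothetical three-stage transfer chain: quay crane -> yard truck -> stack crane.
  const int N = 1000;                              // containers discharged
  const double meanService[3] = {2.0, 3.5, 2.5};   // assumed minutes per container
  std::mt19937 rng(7);
  std::exponential_distribution<double> interArrival(1.0 / 4.0);  // assumed discharge rate

  std::vector<double> arrive(N);
  double t = 0.0;
  for (int i = 0; i < N; ++i) { t += interArrival(rng); arrive[i] = t; }

  // FIFO single-server recursion: a container starts a stage when both it
  // and the stage are free; done[s] records when stage s last finished.
  double done[3] = {0.0, 0.0, 0.0};
  double totalTime = 0.0;
  for (int i = 0; i < N; ++i) {
    double ready = arrive[i];
    for (int s = 0; s < 3; ++s) {
      std::exponential_distribution<double> service(1.0 / meanService[s]);
      double start = std::max(ready, done[s]);
      done[s] = start + service(rng);
      ready = done[s];
    }
    totalTime += ready - arrive[i];
  }
  std::cout << "mean transfer time: " << totalTime / N << " min\n";
}
```

Varying the assumed rates per stage is the kind of what-if question an investment-appraisal decision support model answers at full scale.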
Abstract:
Optimising the container transfer schedule at multimodal terminals is known to be NP-hard, which implies that finding the best solution becomes computationally infeasible as problem sizes increase. Genetic Algorithm (GA) techniques are used to reduce container handling/transfer times and ships' time at the port by speeding up handling operations. The GA is chosen because even the simplest GA implementations have been reported to give relatively good, near-optimal solutions in reasonable time. Also discussed is the application of the model to assess the consequences of increased scheduled throughput time, as well as different strategies such as alternative plant layouts, storage policies and numbers of yard machines. A real data set is used for the solution, and a subsequent sensitivity analysis is applied to the alternative plant layouts, storage policies and numbers of yard machines.
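The encoding and operators are not given in the abstract; the sketch below shows one common GA setup for such sequencing problems, using a permutation encoding, order crossover, swap mutation and binary tournament selection against a hypothetical container-to-slot handling-time matrix:

```cpp
#include <algorithm>
#include <iostream>
#include <numeric>
#include <random>
#include <vector>

using Perm = std::vector<int>;
std::mt19937 rng(42);

// Hypothetical objective: total handling time when container p[j] occupies slot j.
double cost(const Perm& p, const std::vector<std::vector<double>>& C) {
  double s = 0.0;
  for (size_t j = 0; j < p.size(); ++j) s += C[p[j]][j];
  return s;
}

// Order crossover (OX): keep a random segment of parent a, fill the rest in b's order.
Perm crossover(const Perm& a, const Perm& b) {
  size_t n = a.size();
  std::uniform_int_distribution<size_t> d(0, n - 1);
  size_t lo = d(rng), hi = d(rng);
  if (lo > hi) std::swap(lo, hi);
  Perm child(n, -1);
  std::vector<char> used(n, 0);
  for (size_t i = lo; i <= hi; ++i) { child[i] = a[i]; used[a[i]] = 1; }
  size_t pos = (hi + 1) % n;
  for (size_t i = 0; i < n; ++i) {
    int g = b[(hi + 1 + i) % n];
    if (!used[g]) { child[pos] = g; used[g] = 1; pos = (pos + 1) % n; }
  }
  return child;
}

int main() {
  const size_t n = 15, popSize = 80, generations = 300;
  std::vector<std::vector<double>> C(n, std::vector<double>(n));
  std::uniform_real_distribution<double> u(1.0, 10.0);
  for (auto& row : C) for (auto& v : row) v = u(rng);  // random demo instance

  std::vector<Perm> pop(popSize, Perm(n));
  for (auto& p : pop) { std::iota(p.begin(), p.end(), 0); std::shuffle(p.begin(), p.end(), rng); }

  std::uniform_int_distribution<size_t> pick(0, popSize - 1), gene(0, n - 1);
  for (size_t g = 0; g < generations; ++g) {
    std::vector<Perm> next;
    while (next.size() < popSize) {
      // binary tournament selection for each parent
      auto tourney = [&]() -> const Perm& {
        const Perm& x = pop[pick(rng)];
        const Perm& y = pop[pick(rng)];
        return cost(x, C) < cost(y, C) ? x : y;
      };
      Perm child = crossover(tourney(), tourney());
      if (gene(rng) < n / 5)                                  // ~20% swap mutation
        std::swap(child[gene(rng)], child[gene(rng)]);
      next.push_back(std::move(child));
    }
    pop = std::move(next);
  }
  auto best = *std::min_element(pop.begin(), pop.end(),
      [&](const Perm& a, const Perm& b) { return cost(a, C) < cost(b, C); });
  std::cout << "best handling time: " << cost(best, C) << "\n";
}
```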
Abstract:
Facial expression is an important channel of human social communication. Facial expression recognition (FER) aims to perceive and understand the emotional states of humans based on information in the face. Building robust, high-performance FER systems that can work on real-world video is still a challenging task, due to various unpredictable facial variations and complicated exterior environmental conditions, as well as the difficulty of choosing a suitable type of feature descriptor for extracting discriminative facial information. Facial variations caused by factors such as pose, age, gender, race and occlusion can exert a profound influence on robustness, while a suitable feature descriptor largely determines performance. Most attention in FER has been paid to addressing variations in pose and illumination. No approach has been reported on handling face localization errors, and relatively few on overcoming facial occlusions, although the significant impact of these two variations on performance has been demonstrated and highlighted in many previous studies. Many texture and geometric features have previously been proposed for FER. However, few comparison studies have explored the performance differences between features or examined the performance improvement arising from fusing texture and geometry, especially on data with spontaneous emotions. The majority of existing approaches are evaluated on databases with posed or induced facial expressions collected in laboratory environments, whereas little attention has been paid to recognizing naturalistic facial expressions in real-world data. This thesis investigates techniques for building robust, high-performance FER systems based on a number of established feature sets. It comprises contributions towards three main objectives: (1) Robustness to face localization errors and facial occlusions. An approach is proposed to handle face localization errors and facial occlusions using Gabor-based templates. Template extraction algorithms are designed to collect a pool of local template features, and template matching is then performed to convert these templates into distances, which are robust to localization errors and occlusions. (2) Improvement of performance through feature comparison, selection and fusion. A comparative framework is presented to compare the performance of different features and different feature selection algorithms, and to examine the performance improvement arising from fusing texture and geometry. The framework is evaluated for both discrete and dimensional expression recognition on spontaneous data. (3) Evaluation of performance in the context of real-world applications. A system is selected and applied to discriminating posed versus spontaneous expressions and to recognizing naturalistic facial expressions. A database is collected from real-world recordings and used to explore feature differences between standard database images and real-world images, as well as between real-world images and real-world video frames. The performance evaluations are based on the JAFFE, CK, Feedtum, NVIE, Semaine and self-collected QUT databases. The results demonstrate high robustness of the proposed approach to the simulated localization errors and occlusions. Texture and geometry contribute differently to the performance of discrete and dimensional expression recognition, as well as to posed versus spontaneous emotion discrimination.
These investigations provide useful insights into enhancing robustness and achieving high performance of FER systems, and putting them into real-world applications.
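As a rough, hedged illustration of the template-to-distance idea in objective (1) above (not the thesis's actual algorithm), the following OpenCV sketch extracts a patch from a Gabor response map around a landmark and converts it into a distance by taking the best match inside a small search window, so that moderate localization errors leave the score largely unchanged; the file name, coordinates and filter parameters are placeholders:

```cpp
#include <opencv2/opencv.hpp>
#include <iostream>

// Convert a local Gabor template into a distance that tolerates small
// localization errors: take the best match inside a search window.
double templateDistance(const cv::Mat& gaborMap, const cv::Mat& templ,
                        cv::Point expectedTopLeft, int searchRadius) {
  cv::Rect window(expectedTopLeft.x - searchRadius, expectedTopLeft.y - searchRadius,
                  templ.cols + 2 * searchRadius, templ.rows + 2 * searchRadius);
  window &= cv::Rect(0, 0, gaborMap.cols, gaborMap.rows);  // clip to the image
  cv::Mat result;
  cv::matchTemplate(gaborMap(window), templ, result, cv::TM_SQDIFF_NORMED);
  double minVal, maxVal;
  cv::minMaxLoc(result, &minVal, &maxVal);
  return minVal;  // smallest normalized squared difference in the window
}

int main() {
  cv::Mat face = cv::imread("face.png", cv::IMREAD_GRAYSCALE);  // placeholder image
  if (face.empty()) return 1;
  face.convertTo(face, CV_32F, 1.0 / 255.0);

  // One Gabor response map (a single scale/orientation of a typical filter bank).
  cv::Mat kernel = cv::getGaborKernel(cv::Size(21, 21), 4.0, 0.0, 10.0, 0.5, 0.0, CV_32F);
  cv::Mat gabor;
  cv::filter2D(face, gabor, CV_32F, kernel);

  // Hypothetical template: a 16x16 patch around an expected landmark.
  cv::Rect patch(56, 40, 16, 16);
  cv::Mat templ = gabor(patch).clone();

  // The distance stays small even if the true landmark is a few pixels off.
  std::cout << "distance: " << templateDistance(gabor, templ, patch.tl(), 6) << "\n";
}
```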
Abstract:
Thermal-infrared imagery is relatively robust to many of the failure conditions of visual and laser-based SLAM systems, such as fog, dust and smoke. The ability to use thermal-infrared video for localization is therefore highly appealing for many applications. However, operating in thermal-infrared is beyond the capacity of existing SLAM implementations. This paper presents the first known monocular SLAM system designed and tested for hand-held use in the thermal-infrared modality. The implementation includes a flexible feature detection layer able to achieve robust feature tracking in high-noise, low-texture thermal images. A novel approach for structure initialization is also presented. The system is robust to irregular motion and capable of handling the unique mechanical shutter interruptions common to thermal-infrared cameras. The evaluation demonstrates promising performance of the algorithm in several environments.
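The paper's feature detection layer is not detailed in the abstract; the following OpenCV sketch only illustrates the kind of robust tracking loop such a front end needs, normalizing weak local contrast with CLAHE before KLT tracking and re-detecting when tracks collapse, as they do during a thermal camera's shutter interruptions (the file name and thresholds are assumptions):

```cpp
#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main() {
  cv::VideoCapture cap("thermal.avi");           // placeholder sequence
  cv::Ptr<cv::CLAHE> clahe = cv::createCLAHE(3.0, cv::Size(8, 8));
  cv::Mat frame, prev8;
  std::vector<cv::Point2f> pts;

  while (cap.read(frame)) {
    cv::Mat gray, norm8;
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    clahe->apply(gray, norm8);                   // boost low-texture local contrast

    if (pts.size() < 100) {                      // (re)detect when tracks thin out
      cv::goodFeaturesToTrack(norm8, pts, 300, 0.01, 8);
      prev8 = norm8.clone();
      continue;
    }

    std::vector<cv::Point2f> next;
    std::vector<uchar> status;
    std::vector<float> err;
    cv::calcOpticalFlowPyrLK(prev8, norm8, pts, next, status, err);

    std::vector<cv::Point2f> kept;
    for (size_t i = 0; i < next.size(); ++i)
      if (status[i]) kept.push_back(next[i]);

    // Near-total track loss marks a shutter/NUC interruption: drop the
    // frame and re-detect rather than feeding bad tracks downstream.
    if (kept.size() < pts.size() / 2) { pts.clear(); continue; }

    pts = std::move(kept);
    prev8 = norm8.clone();
    std::cout << "active tracks: " << pts.size() << "\n";
  }
}
```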
Abstract:
Health complaint statistics are important for identifying problems and bringing about improvements to health care provided by health service providers and to the wider health care system. This paper overviews complaints handling by the eight Australian state and territory health complaint entities, based on an analysis of data from their annual reports. The analysis shows considerable variation between jurisdictions in the ways complaint data are defined, collected and recorded. Complaints from the public are an important accountability mechanism and open a window on service quality. The lack of a national approach leads to fragmentation of complaint data and a lost opportunity to use national data to assist policy development and identify the main areas causing consumers to complain. We need a national approach to complaints data collection in order to better respond to patients’ concerns.
Abstract:
A paradigm shift is taking place in orthopaedic and reconstructive surgery. This transition from using medical devices and tissue grafts towards the utilization of a tissue engineering approach combines biodegradable scaffolds with cells and/or biological molecules in order to repair and/or regenerate tissues. One of the potential benefits offered by solid freeform fabrication (SFF) technologies is the ability to create such biodegradable scaffolds with highly reproducible architecture and compositional variation across the entire scaffold due to their tightly controlled computer-driven fabrication. Many of these biologically activated materials can induce bone formation at ectopic and orthotopic sites, but they have not yet gained widespread use due to several continuing limitations, including poor mechanical properties, difficulties in intraoperative handling, lack of porosity suitable for cellular and vascular infiltration, and suboptimal degradation characteristics. In this chapter, we define scaffold properties and attempt to provide some broad criteria and constraints for scaffold design and fabrication in combination with growth factors for bone engineering applications. Lastly, we comment on the current and future developments in the field, such as the functionalization of novel composite scaffolds with combinations of growth factors designed to promote cell attachment, cell survival, vascular ingrowth, and osteoinduction.
Abstract:
The Mackay Renewable Biocommodities Pilot Plant is a pilot scale facility owned and operated by QUT for research and demonstration of the conversion of lignocellulosic biomass such as sugarcane bagasse into biofuels. The pilot plant accommodates unique state-of-the-art equipment to process a wide range of feedstocks and is strategically located on the site of the Mackay Sugar Ltd Racecourse Mill. Major facilities include a biomass handling system, pre-treatment reactor, saccharification reactor, fermentors, distillation column and bioseparations equipment. This paper provides an update on the design, construction, commissioning and start-up of the facility. In addition, the paper provides results from preliminary facility trials on the pre-treatment of sugarcane bagasse for cellulosic ethanol production.
Abstract:
The R statistical environment and language has demonstrated particular strengths for interactive development of statistical algorithms, as well as data modelling and visualisation. Its current implementation has an interpreter at its core, which may result in a performance penalty compared with directly executing user algorithms in the native machine code of the host CPU. In contrast, the C++ language has no built-in visualisation capabilities, linear algebra handling or even basic statistical algorithms; however, user programs are converted to high-performance machine code ahead of execution. A new method avoids possible speed penalties in R by using the Rcpp extension package in conjunction with the Armadillo C++ matrix library. In addition to the inherent performance advantages of compiled code, Armadillo provides an easy-to-use, template-based meta-programming framework that allows several linear algebra operations to be automatically pooled into one, which in turn can lead to further speedups. With the aid of Rcpp and Armadillo, conversion of linear-algebra-centred algorithms from R to C++ becomes straightforward. The converted algorithms retain their overall structure and readability, while maintaining a bidirectional link with the host R environment. Empirical timing comparisons of R and C++ implementations of a Kalman filtering algorithm indicate a speedup of several orders of magnitude.
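A minimal RcppArmadillo sketch in the same spirit (not the paper's benchmark code; the state-space model and dimensions are illustrative) shows how compact the C++ side can be:

```cpp
// [[Rcpp::depends(RcppArmadillo)]]
#include <RcppArmadillo.h>
using namespace arma;

// Basic Kalman filter for x_{t+1} = A x_t + w,  y_t = C x_t + v,
// with w ~ N(0, Q) and v ~ N(0, R). Y holds one observation per column;
// returns the filtered state estimates, one column per time step.
// [[Rcpp::export]]
mat kalmanFilter(const mat& Y, const mat& A, const mat& C,
                 const mat& Q, const mat& R, vec x, mat P) {
  const uword T = Y.n_cols;
  mat X(x.n_elem, T);
  for (uword t = 0; t < T; ++t) {
    x = A * x;                                       // predict state
    P = A * P * A.t() + Q;                           // predict covariance
    mat S = C * P * C.t() + R;                       // innovation covariance
    mat K = P * C.t() * inv(S);                      // Kalman gain
    x += K * (Y.col(t) - C * x);                     // measurement update
    P = (eye<mat>(P.n_rows, P.n_cols) - K * C) * P;  // covariance update
    X.col(t) = x;
  }
  return X;
}
```

From R, Rcpp::sourceCpp() compiles the file and exposes kalmanFilter() as an ordinary R function, with RcppArmadillo converting matrices in both directions, which is the bidirectional link the abstract describes.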
Abstract:
Twenty-first-century learners operate in organic, immersive environments. A pedagogy of student-centred learning is not a recipe for rooms. A contemporary learning environment is like a landscape that grows, morphs, and responds to the pressures of the context and micro-culture. There is no single adaptable solution, nor a suite of off-the-shelf answers; propositions must be customisable and infinitely variable. They must be indeterminate and changeable, based on the creation of learning places, not restrictive or constraining spaces. A sustainable solution will be un-fixed, responsive to the life cycle of the components and materials, able to be manipulated by the users; it will create and construct its own history. Learning occurs as formal education with situational knowledge structures, but also as informal learning, active learning, blended learning, social learning, incidental learning, and unintended learning. These are not spatial concepts but socio-cultural patterns of discovery. Individual learning requirements must run free and need to be accommodated as the learner sees fit. The spatial solution must accommodate and enable a full array of learning situations. It is a system, not an object.
Three major components:
1. The determinate landscape: an in-situ concrete 'plate' that is permanent. It predates the other components of the system and remains as a remnant/imprint/fossil after the other components of the system have been relocated. It is a functional learning landscape in its own right, enabling a variety of experiences and activities.
2. The indeterminate landscape: a kit of pre-fabricated 2-D panels assembled in a unique manner at each site to suit the client and context. Manufactured to the principles of design-for-disassembly. A symbiotic, barnacle-like system that attaches itself to the existing infrastructure through the determinate landscape, which acts as a fast-growth rhizome. A carapace of protective panels, infinitely variable to create enclosed, semi-enclosed, and open learning places.
3. The stations: pre-fabricated packages of highly-serviced space connected through the determinate landscape. Four main types of stations: wet-room learning centres, dry-room learning centres, ablutions, and low-impact building services. Entirely customised at the factory and delivered to site. The stations can be retro-fitted to suit a new context during relocation.
Principles of design for disassembly:
Material principles
• use recycled and recyclable materials
• minimise the number of types of materials
• no toxic materials
• use lightweight materials
• avoid secondary finishes
• provide identification of material types
Component principles
• minimise/standardise the number of types of components
• use mechanical, not chemical, connections
• design for use of common tools and equipment
• provide easy access to all components
• make component sizes suit the means of handling
• provide built-in means of handling
• design to realistic tolerances
• use a minimum number of connectors and a minimum number of types
System principles
• design for durability and repeated use
• use prefabrication and mass production
• provide spare components on site
• sustain all assembly and material information
Abstract:
Securing the IT infrastructures of modern life is a challenging task because of their increasing complexity, scale and agile nature. Monolithic approaches such as stand-alone firewalls and IDS devices protecting the perimeter cannot cope with complex malware and multi-step attacks. Collaborative security emerges as a promising approach, but research results in collaborative security are not yet mature and require continuous evaluation and testing. In this work, we present CIDE, a Collaborative Intrusion Detection Extension for the network security simulation platform NeSSi2. Built-in functionalities include dynamic group formation based on node preferences, group-internal communication, group management, and an approach for handling the infection process in malware-based attacks. The CIDE simulation environment provides functionalities for easy implementation of collaborating nodes in large-scale setups. We evaluate the group communication mechanism, and we provide a case study evaluating our collaborative security platform in a signature exchange scenario.
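CIDE's actual interfaces are not shown in the abstract; the toy sketch below only illustrates the two mechanisms named there, preference-based group formation and group-internal signature exchange, with invented node profiles and signature names:

```cpp
#include <algorithm>
#include <iostream>
#include <set>
#include <string>
#include <vector>

// A node has a preference profile (e.g. monitored protocols) and a set of
// known attack signatures; groups form among nodes with overlapping profiles.
struct Node {
  int id;
  std::set<std::string> prefs;
  std::set<std::string> signatures;
};

int overlap(const Node& a, const Node& b) {
  int n = 0;
  for (const auto& p : a.prefs) n += static_cast<int>(b.prefs.count(p));
  return n;
}

int main() {
  std::vector<Node> nodes = {
      {0, {"http", "dns"}, {}},
      {1, {"http", "smtp"}, {}},
      {2, {"dns", "http"}, {}},
      {3, {"ssh"}, {}}};

  // Greedy group formation: join the first group whose seed shares a preference.
  std::vector<std::vector<int>> groups;
  for (const auto& n : nodes) {
    bool placed = false;
    for (auto& g : groups)
      if (overlap(nodes[g.front()], n) > 0) { g.push_back(n.id); placed = true; break; }
    if (!placed) groups.push_back({n.id});
  }

  // Group-internal signature exchange: node 0 detects a signature and
  // shares it with every member of its group.
  nodes[0].signatures.insert("sig:worm-x");
  for (const auto& g : groups)
    if (std::find(g.begin(), g.end(), 0) != g.end())
      for (int id : g) nodes[id].signatures.insert("sig:worm-x");

  for (const auto& n : nodes)
    std::cout << "node " << n.id << " knows " << n.signatures.size() << " signature(s)\n";
}
```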