239 results for "metrics"
Abstract:
Increasing global competitiveness has forced manufacturing organizations to produce high-quality products more quickly and at a competitive cost, which demands continuous improvement techniques. In this paper, we propose a fuzzy-based performance evaluation method for lean supply chains. To understand the overall performance of a cost-competitive supply chain, we investigate the alignment of market strategy and the position of the supply chain. Competitive strategies can be supported by using different weight calculations for different supply chain situations. By identifying optimal performance metrics and applying performance evaluation methods, managers can predict overall supply chain performance under a lean strategy.
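To illustrate the kind of weighted fuzzy aggregation this abstract alludes to, the following is a minimal sketch, not the paper's actual method; the triangular fuzzy scores, the three lean metrics and the strategy-dependent weights are all hypothetical:

```python
# Minimal sketch: fuzzy weighted performance evaluation with triangular
# fuzzy numbers (TFNs). All metric names, scores and weights are hypothetical.
def tfn_weighted_average(scores, weights):
    """Aggregate triangular fuzzy scores (l, m, u) with crisp weights."""
    total = sum(weights)
    return tuple(sum(w * s[i] for s, w in zip(scores, weights)) / total
                 for i in range(3))

def defuzzify(tfn):
    """Centroid defuzzification of a triangular fuzzy number."""
    l, m, u = tfn
    return (l + m + u) / 3

# Hypothetical ratings for three lean metrics: cost, delivery, quality.
scores = [(5, 7, 9), (3, 5, 7), (7, 9, 10)]
weights = [0.5, 0.2, 0.3]  # strategy-dependent weights
overall = defuzzify(tfn_weighted_average(scores, weights))
```

Changing the weight vector per supply chain situation, as the abstract suggests, changes the aggregate score without altering the underlying metric ratings.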
Abstract:
Airports represent the epitome of complex systems, with multiple stakeholders, multiple jurisdictions and complex interactions between many actors. The large number of existing models that capture different aspects of the airport is a testament to this. However, these existing models do not systematically consider modelling requirements, nor how stakeholders such as airport operators or airlines would make use of these models. This can detrimentally impact the verification and validation of models and makes the development of extensible and reusable modelling tools difficult. This paper develops, from the Concept of Operations (CONOPS) framework, a methodology to help structure the review and development of modelling capabilities and usage scenarios. The method is applied to the review of existing airport terminal passenger models. It is found that existing models can be broadly categorised according to four usage scenarios: capacity planning, operational planning and design, security policy and planning, and airport performance review. The models, the performance metrics they evaluate and their usage scenarios are discussed. It is found that capacity and operational planning models predominantly focus on performance metrics such as waiting time, service time and congestion, whereas performance review models attempt to link those to passenger satisfaction outcomes. Security policy models, on the other hand, focus on probabilistic risk assessment. However, there is an emerging need to capture trade-offs between multiple criteria, such as security and processing time. Based on the CONOPS framework and the literature findings, guidance is provided for the development of future airport terminal models.
Abstract:
All processes are modeled, all process metrics are defined, and all process support systems are set up; yet processes still do not run smoothly, and departmental silos are more present than ever. Both practitioners and academics tell the same story: a successful business process management (BPM) implementation goes beyond using the right methods and putting the right systems in place. In fact, an important success factor for BPM is establishing the right organizational culture, that is, a culture that supports the achievement and maintenance of efficient and effective business processes.
Abstract:
The importance of actively managing and analyzing business processes is acknowledged more than ever in today's organizations. Business processes form an essential part of an organization, and their application areas are manifold. Most organizations keep records of various activities that have been carried out for auditing purposes, but these records are rarely used for analysis. This paper describes the design and implementation of a process analysis tool that replays, analyzes and visualizes a variety of performance metrics using a process definition and its execution logs. Performing performance analysis on existing and planned process models offers organizations a great way to detect bottlenecks within their processes and enables more effective process improvement decisions. Our technique is applied to processes modeled in the YAWL language. Execution logs of process instances are compared against the corresponding YAWL process model and replayed in a robust manner, taking into account any noise in the logs. Finally, performance characteristics, obtained from replaying the log in the model, are projected onto the model.
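One performance metric such a replay can derive is waiting time: the delay between a task becoming enabled and being started. The sketch below illustrates that idea only; the event-tuple format is a hypothetical stand-in, not the tool's actual log schema:

```python
# Sketch: average waiting time per task from execution-log events.
# Event format (hypothetical): (case_id, task, kind, timestamp),
# with kind in {'enabled', 'started'} and timestamps in seconds.
from collections import defaultdict

def avg_waiting_time(events):
    enabled = {}
    waits = defaultdict(list)
    for case_id, task, kind, ts in events:
        if kind == 'enabled':
            enabled[(case_id, task)] = ts
        elif kind == 'started' and (case_id, task) in enabled:
            # Waiting time = start timestamp minus enablement timestamp.
            waits[task].append(ts - enabled.pop((case_id, task)))
    return {task: sum(ws) / len(ws) for task, ws in waits.items()}
```

Projecting such per-task averages onto the process model is what lets bottleneck tasks stand out visually.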
Abstract:
IEEE 802.11p is the new standard for inter-vehicular communications (IVC) in the 5.9 GHz frequency band; it is planned to be widely deployed to enable cooperative systems. The use and performance of 802.11p have been studied theoretically and in simulations over the past years. Unfortunately, many of these results have not been confirmed by on-track experimentation. In this paper, we describe field trials of 802.11p technology with our test vehicles. Metrics such as maximum range, latency and frame loss are examined.
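Two of the metrics named here are straightforward to compute from trial logs. The sketch below assumes hypothetical per-trial records (frame counts and one-way latencies in milliseconds), not the authors' actual logging format:

```python
# Sketch: frame loss ratio and latency statistics from field-trial logs
# (hypothetical record format).
def frame_loss_ratio(sent, received):
    """Fraction of transmitted frames that never arrived."""
    return 1.0 - received / sent

def latency_stats(latencies_ms):
    """Mean and approximate 95th-percentile latency (nearest-rank)."""
    ordered = sorted(latencies_ms)
    mean = sum(ordered) / len(ordered)
    p95 = ordered[int(0.95 * (len(ordered) - 1))]
    return mean, p95
```

Maximum range would come from the same logs by correlating the last successfully received frame with the GPS distance between the vehicles.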
Abstract:
Modern toxicology investigates a wide array of both old and new health hazards. Priority setting is needed to select agents for research from the plethora of exposure circumstances. Changing societies and a growing fraction of the aged have to be taken into consideration. Precise exposure assessment is important for risk estimation and regulation. Toxicology contributes to the exploration of pathomechanisms to specify the exposure metrics for risk estimation. Combined effects of co-existing agents are not yet sufficiently understood. Animal experiments allow a separate administration of agents that cannot be disentangled by epidemiological means, but their value is limited for the low exposure levels in many of today's settings. As an experimental science, toxicology has to keep pace with the rapidly growing knowledge about the language of the genome and the changing paradigms in cancer development. Toxicogenomics was developed during the pioneer era of assembling a working draft of the human genome. Gene and pathway complexity have to be considered when investigating gene–environment interactions. To conduct studies well, modern toxicology needs a close liaison with many other disciplines, such as epidemiology and bioinformatics.
Abstract:
The research reported in this paper introduces a knowledge-based urban development assessment framework, constructed to evaluate and assist in the (re)formulation of local and regional policy frameworks and applications necessary for knowledge city transformations. The paper also reports the findings of an application of this framework in a comparative study of Boston, Vancouver, Melbourne and Manchester. With its assessment framework, the paper: demonstrates an innovative way of examining the knowledge-based development capacity of cities by scrutinising their economic, socio-cultural, enviro-urban and institutional development mechanisms and capabilities; presents some of the generic indicators used to evaluate the knowledge-based development performance of cities; reveals how a city can benchmark its development level against that of other cities; and provides insights for achieving more sustainable and knowledge-based development.
Abstract:
Traffic safety studies demand more than current micro-simulation models can provide, as these models presume that all drivers exhibit safe behavior. All microscopic traffic simulation models include a car-following model. This paper highlights the limitations of the Gipps car-following model's ability to emulate driver behavior for safety study purposes. A safety-adapted car-following model based on the Gipps model is proposed to simulate unsafe vehicle movements, with safety indicators below critical thresholds. The modifications are based on observations of driver behavior in real data as well as on psychophysical notions. NGSIM vehicle trajectory data are used to evaluate the new model, and short following headways and Time To Collision are employed to assess critical safety events within the traffic flow. Risky events are extracted from the available NGSIM data to evaluate the modified model against them. The results from simulation tests illustrate that the proposed model predicts the safety metrics better than the generic Gipps model. The outcome of this paper can potentially facilitate assessing and predicting traffic safety using microscopic simulation.
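The two surrogate safety indicators named in this abstract have standard definitions, sketched below with assumed variable names (gap in metres, speeds in m/s); the critical thresholds themselves are study-specific:

```python
# Standard definitions of two surrogate safety indicators
# (variable names and units are assumptions for illustration).
def time_to_collision(gap_m, v_follower, v_leader):
    """TTC in seconds; defined only while the follower closes on the leader."""
    closing = v_follower - v_leader
    return gap_m / closing if closing > 0 else float('inf')

def time_headway(gap_m, v_follower):
    """Following headway in seconds: spacing divided by follower speed."""
    return gap_m / v_follower if v_follower > 0 else float('inf')

# Example: 20 m gap, follower at 25 m/s, leader at 20 m/s -> TTC = 4 s.
```

A critical safety event is then typically flagged when TTC or headway drops below a chosen threshold, which is how risky events are extracted from trajectory data such as NGSIM.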
Abstract:
The Australian e-Health Research Centre and Queensland University of Technology recently participated in the TREC 2011 Medical Records Track. This paper reports on our methods, results and experience using a concept-based information retrieval approach. Our concept-based approach is intended to overcome specific challenges we identify in searching medical records. Queries and documents are transformed from their term-based originals into medical concepts as defined by the SNOMED-CT ontology. Results show our concept-based approach performed above the median in all three performance metrics: bpref (+12%), R-prec (+18%) and Prec@10 (+6%).
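Two of the reported metrics have simple set-based definitions, sketched here with a toy ranking (bpref, which also accounts for judged non-relevant documents, is more involved and omitted):

```python
# Standard definitions of Prec@10 and R-precision over a ranked result list.
def precision_at_k(ranked, relevant, k=10):
    """Fraction of the top-k ranked documents that are relevant."""
    return sum(1 for doc in ranked[:k] if doc in relevant) / k

def r_precision(ranked, relevant):
    """Precision at rank R, where R is the number of relevant documents."""
    r = len(relevant)
    return sum(1 for doc in ranked[:r] if doc in relevant) / r

# Toy example: 4 retrieved documents, 2 of which are relevant.
ranked = ['a', 'b', 'c', 'd']
relevant = {'a', 'c'}
```

In TREC evaluations these values are averaged over all topics, which is what the per-metric medians in the abstract refer to.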
Abstract:
The concept of Six Sigma was initiated in the 1980s by Motorola and has since been implemented in several manufacturing and service organizations. To date, Six Sigma implementation in the private service sector has been mostly limited to healthcare and financial services. Its implementation is now gradually picking up in services such as call centers, education, and construction and related engineering, in both the private and public sectors. Through a literature review, a questionnaire survey and a multiple case study approach, the paper develops a conceptual framework to facilitate widening the scope of Six Sigma implementation in service organizations. Using grounded theory methodology, this study develops theory for Six Sigma implementation in service organizations. The study involves a questionnaire survey and case studies to understand and build a conceptual framework. The survey, which was exploratory in nature, was conducted in service organizations in Singapore. The case studies involved three service organizations that implemented Six Sigma. The objective is to explore and understand the issues highlighted by the survey and the literature. The findings confirm the inclusion of critical success factors, critical-to-quality characteristics, and a set of tools and techniques, as observed from the literature. In the case of key performance indicators, interpretations differ in the literature as well as among industry practitioners: some sources describe key performance indicators as performance metrics, whereas others regard them as key process input or output variables. The responses "not relevant" and "unknown to us" as reasons for not implementing Six Sigma show the need to understand the specific requirements of service organizations. Though much theoretical description of Six Sigma is available, there has been limited rigorous academic research on it.
This gap is far more pronounced for Six Sigma implementation in service organizations, where the theory is not yet mature. Identifying this need, the study contributes through a theory-building exercise, developing a conceptual framework to understand the issues involved in its implementation in service organizations.
Abstract:
Traffic safety studies demand more than existing micro-simulation models can offer, as these models postulate that every driver exhibits safe behaviour. All microscopic traffic simulation models include a car-following model, and the Gazis–Herman–Rothery (GHR) car-following model is widely used. This paper highlights the limitations of the GHR car-following model's capability to model longitudinal driving behaviour for safety study purposes. The study reviews and compares different versions of the GHR model. To improve the GHR model's reproduction of safety metrics, a new set of car-following model parameters is offered to simulate unsafe vehicle conflicts. NGSIM vehicle trajectory data are used to evaluate the new model, and short following headways and Time to Collision are employed to assess critical safety events within the traffic flow. Risky events are extracted from the available NGSIM data to evaluate the modified model against the generic versions of the GHR model. The results from simulation tests illustrate that the proposed model predicts the safety metrics better than the generic GHR model, and it can potentially facilitate assessing and predicting the safety of traffic facilities using microscopic simulation. The new model can also predict near-miss rear-end crashes.
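For reference, the GHR stimulus-response rule has the well-known general form a(t+T) = c · v(t)^m · Δv(t) / Δx(t)^l, where Δv is the leader-minus-follower speed difference and Δx the spacing. The sketch below encodes that form with illustrative default parameters, not the calibrated values proposed in the paper:

```python
# Sketch of the GHR stimulus-response rule. Parameter defaults are
# illustrative only; calibrated values are model- and study-specific.
def ghr_acceleration(v_follower, delta_v, spacing, c=1.1, m=0.9, l=1.0):
    """a(t+T) = c * v^m * dv / dx^l, with dv = v_leader - v_follower (m/s)
    and dx = spacing (m). Negative dv (closing in) yields deceleration."""
    return c * (v_follower ** m) * delta_v / (spacing ** l)
```

Choosing (c, m, l) to reproduce short headways and low Time-to-Collision values, rather than only macroscopic flow, is the kind of re-parameterisation the abstract describes.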
Abstract:
High Dynamic Range (HDR) imaging was used to collect luminance information at workstations in two open-plan office buildings in Queensland, Australia: one lit by skylights, vertical windows and electric light, and another by skylights and electric light. This paper compares illuminance and luminance data collected in these offices with occupant feedback to evaluate these open-plan environments against available and emerging metrics for visual comfort and glare. The study highlights issues of daylighting quality and measurement specific to open-plan spaces. The results demonstrate that overhead glare is a serious threat to user acceptance of skylights, and that the integration and control of electric light and daylight have a major impact on the perception of daylighting quality. With regard to the measurement of visual comfort, the Daylight Glare Probability (DGP) gave poor agreement with occupant reports of discomfort glare in open-plan spaces with skylights, while the CIE Glare Index (CGI) gave the best agreement. Horizontal and vertical illuminances gave no indication of visual comfort in these spaces.
Abstract:
This work was motivated by the limited knowledge on personal exposure to ultrafine (UF) particles, especially for children (Mejía et al. 2011). Most research efforts in the past have investigated particle mass concentration and only a limited number of studies have been conducted to quantify other particle metrics, such as particle number, in the classrooms and school microenvironment in general (Diapouli et al. 2008; Guo et al. 2008; Weichenthal et al. 2008; Mullen et al. 2011).