225 results for BENCHMARK
Abstract:
In this paper, three metaheuristics are proposed for solving a class of job shop, open shop, and mixed shop scheduling problems. We evaluate the performance of the proposed algorithms on a set of Lawrence's benchmark instances for the job shop problem, a set of randomly generated instances for the open shop problem, and a combined set of job shop and open shop test data for the mixed shop problem. The computational results show that the proposed algorithms perform extremely well on all three types of shop scheduling problem. The results also reveal that the mixed shop problem is easier to solve than the job shop problem, because the inclusion of more open shop jobs makes the scheduling procedure more flexible.
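The abstract does not detail the metaheuristics themselves, but evaluating candidate solutions on Lawrence-style job shop instances typically comes down to decoding a solution representation into a schedule and measuring its makespan. The sketch below is a hypothetical illustration of one common decoder (an operation-based permutation), not the paper's method; the instance data are invented.

```python
# Hypothetical sketch (not the paper's method): decoding an operation-based
# permutation into a job shop schedule and computing its makespan -- the
# objective a metaheuristic would minimise on Lawrence-style benchmark
# instances. The instance data below are illustrative only.

def makespan(jobs, permutation):
    """jobs[j] is a list of (machine, duration) operations in technological order;
    permutation contains each job index once per operation of that job."""
    next_op = [0] * len(jobs)        # next unscheduled operation of each job
    job_ready = [0] * len(jobs)      # time at which each job becomes available
    machine_ready = {}               # time at which each machine becomes available
    for j in permutation:
        machine, duration = jobs[j][next_op[j]]
        start = max(job_ready[j], machine_ready.get(machine, 0))
        job_ready[j] = machine_ready[machine] = start + duration
        next_op[j] += 1
    return max(job_ready)

# Tiny 2-job, 2-machine example (not a Lawrence instance):
jobs = [[(0, 3), (1, 2)],            # job 0: machine 0 for 3, then machine 1 for 2
        [(1, 4), (0, 1)]]            # job 1: machine 1 for 4, then machine 0 for 1
print(makespan(jobs, [0, 1, 0, 1]))  # candidate operation ordering -> makespan 6
```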
Abstract:
Deterministic transit capacity analysis applies to the planning, design and operational management of urban transit systems. The Transit Capacity and Quality of Service Manual (1) and Vuchic (2, 3) enable transit performance to be quantified and assessed using transit capacity and productive capacity. This paper further defines important productive performance measures of an individual transit service and transit line. Transit work (p-km) captures the transit task performed over distance. Passenger transmission (p-km/h) captures the passenger task delivered by a service at speed. Transit productiveness (p-km/h) captures transit work performed over time. These measures are useful to operators in understanding their services' or systems' capabilities and the passenger quality of service. This paper accounts for variability in the demand utilized by passengers along a line and for high passenger load conditions where passenger pass-up delay occurs. A hypothetical case study of an individual bus service's operation demonstrates the usefulness of passenger transmission in comparing existing and growth scenarios. A hypothetical case study of a bus line's operation during a peak hour window demonstrates the theory's usefulness in examining the contribution of individual services to line productive performance. Using this theory, scenarios may be assessed to benchmark or compare lines, segments and conditions, or to consider improvements.
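The formal definitions belong to the paper and the cited manuals; the sketch below is only a minimal numeric illustration, under assumed definitions, of how transit work (p-km), passenger transmission (p-km/h) and transit productiveness (p-km/h) might be computed for a single service. All figures and the dwell-time allowance are made up.

```python
# Minimal numeric sketch of the line-level measures named in the abstract,
# under assumed definitions (the paper gives the formal ones): transit work as
# passenger load times segment distance, transmission as that work per hour of
# travel, and productiveness as that work per hour of elapsed service time.

segments = [
    # (passengers on board, segment length km, segment travel time h) -- made up
    (30, 2.0, 0.10),
    (45, 3.5, 0.15),
    (20, 1.5, 0.08),
]

transit_work = sum(p * d for p, d, _ in segments)      # p-km carried over distance
travel_time = sum(t for _, _, t in segments)           # hours in motion
service_time = travel_time + 0.05                      # assumed dwell/layover, h

passenger_transmission = transit_work / travel_time    # p-km/h at operating speed
transit_productiveness = transit_work / service_time   # p-km/h over elapsed time

print(f"transit work:   {transit_work:.1f} p-km")
print(f"transmission:   {passenger_transmission:.1f} p-km/h")
print(f"productiveness: {transit_productiveness:.1f} p-km/h")
```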
Abstract:
The research field was curatorship of the Machinima genre - a film-making practice that uses real time 3D computer graphics engines to create cinematic productions. The context was the presentation of gallery non-specific work for large-scale exhibition, as an investigation in thinking beyond traditional strategies of the white cube. Strongly influenced by Christiane Paul's (ed.) seminal text, 'New Media in the White Cube and Beyond: Curatorial Models for Digital Art', the context was the repositioning of a genre traditionally focussed on delivery through small-screen, indoor, personal spaces, to large exhibition hall spaces. Beyond the core questions of collecting, documenting, expanding and rethinking the place of Machinima within the history of contemporary digital arts, the curatorial premise asked how best to invert the relationship between the exhibition context and the context of media production within the gaming domain, using novel presentational strategies that might best promote the 'take-home' impulse. The exhibition was used not as the ultimate destination for work but rather as a place to experience, sort and choose from a high volume of possible works for subsequent investigation by audiences within their own game-ready, domestic environments. In pursuit of this core aim, the exhibition intentionally promoted 'sensory overload'. The exhibition also included a gaming lab experience where audiences could begin to learn the DIY concepts of the medium, and be stimulated to revisit, consider and re-make their own relationship to this genre. The research was predominantly practice-led and collaborative (in close concert with the Machinima community), and ethnographic in that it sought to work with, understand and promote the medium in a contemporary art context. This benchmark exhibition, building on the 15-year history of the medium, was warmly received by the global Machinima community, as evidenced by the significant debate, feedback and general interest recorded. The exhibition has recently begun an ongoing Australian touring schedule. To date, the exhibition has received critical attention nationally and internationally in Das Superpaper, the Courier Mail, Machinimart, 4ZZZ-FM, the Sydney Morning Herald, Games and Business, Australian Gamer, Kotaku Australia, and the Age.
Abstract:
In this study, we provide insight into how private equity players choose their targets and the bid arrangements they prefer. We test our expectations of the unique features of private equity targets using a sample of 23 listed private equity target firms during 2001–2007. Relative to a benchmark sample of 81 corporate targets matched by year and industry, we find the private equity target firms to be larger, more profitable, more efficient in their use of assets, more highly levered, and to have greater cash flow. Multivariate testing indicates that private equity targets have relatively greater financial slack, greater financial stability, greater free cash flow and lower measurable growth prospects. All conclusions are found to be robust to a control sample of 502 takeover bids during 2001–2007.
Abstract:
This note examines the productive efficiency of 62 starting guards during the 2011/12 National Basketball Association (NBA) season. This period coincides with the phenomenal and largely unanticipated performance of New York Knicks' starting point guard Jeremy Lin and the attendant public and media hype known as Linsanity. We employ a data envelopment analysis (DEA) approach that includes allowance for an undesirable output, here turnovers per game, alongside the desirable outputs of points, rebounds, assists, steals, and blocks per game and an input of minutes per game. The results indicate that, depending upon the specification, between 29 and 42 percent of NBA guards are fully efficient, including Jeremy Lin, with mean inefficiency of between 3.7 and 19.2 percent. However, while Jeremy Lin is technically efficient, he seldom serves as a benchmark for inefficient players, at least when compared with established players such as Chris Paul and Dwyane Wade. This suggests the uniqueness of Jeremy Lin's productive solution and may explain why his unique style of play, encompassing individual brilliance, unselfish play, and team leadership, is of such broad public appeal.
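The abstract specifies the DEA inputs and outputs but not the exact model variant. The sketch below is a hypothetical, simplified input-oriented CCR envelopment model solved with SciPy, with the undesirable output (turnovers) treated as an additional input, one common workaround; the paper's treatment may differ, and the player data are invented.

```python
# Hypothetical DEA sketch for player data of the kind described in the abstract.
# Input-oriented CCR envelopment form: min theta s.t. X'lambda <= theta*x_k,
# Y'lambda >= y_k, lambda >= 0. Turnovers are folded in as an input here, which
# is an assumption, not necessarily the paper's model. Data are made up.
import numpy as np
from scipy.optimize import linprog

# rows = players; inputs = [minutes, turnovers]; outputs = [pts, reb, ast, stl, blk]
X = np.array([[36.0, 2.1], [34.5, 3.6], [30.0, 1.5]])
Y = np.array([[19.8, 3.6, 9.1, 2.5, 0.1],
              [14.6, 3.1, 6.2, 1.6, 0.3],
              [10.2, 2.5, 4.0, 1.0, 0.2]])

def efficiency(k):
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]
    c = np.zeros(1 + n); c[0] = 1.0                     # minimise theta
    # input rows:  sum_j lambda_j * x_ij - theta * x_ik <= 0
    A_in = np.hstack([-X[k].reshape(m, 1), X.T])
    # output rows: -sum_j lambda_j * y_rj <= -y_rk
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    A = np.vstack([A_in, A_out])
    b = np.concatenate([np.zeros(m), -Y[k]])
    bounds = [(None, None)] + [(0, None)] * n           # theta free, lambdas >= 0
    res = linprog(c, A_ub=A, b_ub=b, bounds=bounds, method="highs")
    return res.x[0]                                     # theta = 1 means efficient

for k in range(X.shape[0]):
    print(f"player {k}: efficiency = {efficiency(k):.3f}")
```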
Abstract:
Enterprise architecture management (EAM) has become an intensively discussed approach to managing enterprise transformations. While many organizations employ EAM, notable uncertainty about the value of EAM remains. In this paper, we propose a model to measure the realization of benefits from EAM. We identify EAM success factors and EAM benefits through a comprehensive literature review and eleven exploratory expert interviews. Based on our findings, we integrate the EAM success factors and benefits with the established DeLone & McLean IS success model, resulting in a model that explains the realization of EAM benefits. This model serves organizations as a benchmark and framework for identifying and assessing the setup of their EAM initiatives and whether and how EAM benefits are realized. We also see our model as a first step toward gaining insight into, and starting a discussion on, the theory of EAM benefit realization.
Abstract:
The deal value of private equity merger and takeover activity has achieved unprecedented growth in the last couple of years, in Australia and globally. Private equity deals are not a new feature of the market; however, such deals have been subject to increased academic, professional and policy interest. This study examines the particular features of 15 major deals involving listed company "targets" and provides evidence – based on a comparison with a benchmark sample – of the role that private equity plays in the market for corporate control. The objective of this study was to assess the friendliness of private equity bids. Based on the indicia compiled, lower bid premiums, the presence of break fees and the intention to retain senior management clearly distinguish private equity bids from the comparative sample of bids. Using these several characteristics of "friendliness", the authors show that private equity deals are generally friendly in nature, consistent with industry rhetoric but perhaps inconsistent with the popular belief that private equity bidders are the "barbarians at the gate".
Abstract:
It is nearly 10 years since the introduction of s 299(1)(f) of the Corporations Act, which requires the disclosure of information regarding a company's environmental performance within its annual report. This provision has generated considerable debate in the years since its introduction, fundamentally between proponents of either a voluntary or a mandatory environmental reporting framework. This study examines the adequacy of the current regulatory framework. The environmental reporting practices of 24 listed companies in the resources industries are assessed against a standard set by the Global Reporting Initiative (GRI) Sustainability Reporting Guidelines. These Guidelines are argued to represent "international best practice" in environmental reporting, and a "scorecard" approach is used to score the quality of disclosure against this voluntary benchmark. Larger companies in the sample tend to report environmental information over and above the level required by legislation. Some, but not all, companies present a stand-alone environmental/sustainability report. However, smaller companies provide minimal information in compliance with s 299(1)(f). The findings indicate that "international best practice" environmental reporting is unlikely to be achieved by Australian companies under the current regulatory framework. In the current regulatory environment that scrutinises s 299(1)(f), this article provides some preliminary evidence of the quality of disclosures generated in the Australian market.
Abstract:
In this paper we use a sequence-based visual localization algorithm to reveal surprising answers to the question: how much visual information is actually needed to conduct effective navigation? The algorithm actively searches for the best local image matches within a sliding window of short route segments or 'sub-routes', and matches sub-routes by searching for coherent sequences of local image matches. In contrast to many existing techniques, the technique requires no pre-training or camera parameter calibration. We compare the algorithm's performance to the state-of-the-art FAB-MAP 2.0 algorithm on a 70 km benchmark dataset. Performance matches or exceeds the state-of-the-art feature-based localization technique using images as small as 4 pixels, fields of view reduced by a factor of 250, and pixel bit depths reduced to 2 bits. We present further results demonstrating the system localizing in an office environment with near 100% precision using two 7-bit Lego light sensors, as well as using 16 and 32 pixel images from a motorbike race and a mountain rally car stage. By demonstrating how little image information is required to achieve localization along a route, we hope to stimulate future 'low fidelity' approaches to visual navigation that complement probabilistic feature-based techniques.
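As a rough illustration of what sequence-based matching over heavily reduced images can look like (a simplification in the spirit of the abstract, not the paper's exact algorithm), the sketch below builds a frame-to-frame difference matrix from tiny intensity vectors and selects the reference sub-route whose summed differences over the window are smallest, assuming equal traversal speeds.

```python
# Illustrative sketch of sequence-based matching: frames reduced to tiny
# intensity vectors, a difference matrix between query and reference frames,
# and a sub-route matched by the lowest summed difference along the window.
# This simplification assumes equal speeds; all data are random, not the
# paper's benchmark dataset.
import numpy as np

rng = np.random.default_rng(0)
reference = rng.random((200, 4))        # 200 reference frames, 4-"pixel" images
query = reference[50:60] + 0.05 * rng.random((10, 4))   # noisy revisit of a sub-route

# difference matrix: rows = query frames, cols = reference frames
D = np.abs(query[:, None, :] - reference[None, :, :]).sum(axis=2)

window = query.shape[0]
# a candidate match starting at reference index s accumulates the diagonal
# D[0, s], D[1, s+1], ..., D[window-1, s+window-1]
scores = [D[np.arange(window), s + np.arange(window)].sum()
          for s in range(reference.shape[0] - window)]
best = int(np.argmin(scores))
print(f"best matching sub-route starts at reference frame {best}")   # expect ~50
```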
Abstract:
Recognizing the impact of reconfiguration on the QoS of running systems is especially necessary for choosing an appropriate approach to dealing with the dynamic evolution of mission-critical or non-stop business systems. The rationale is that the impaired QoS caused by inappropriate use of dynamic approaches is unacceptable for such running systems. To predict this impact in advance, the challenge is two-fold. First, a unified benchmark is necessary to expose the QoS problems of existing dynamic approaches. Second, an abstract representation is necessary to provide a basis for modeling and comparing the QoS of existing and new dynamic reconfiguration approaches. Our previous work [8] successfully evaluated the QoS assurance capabilities of existing dynamic approaches and provided guidance on the appropriate use of particular approaches. This paper reinvestigates our evaluations, extending them to concurrent and parallel environments by abstracting hardware and software conditions to design an evaluation context. We report the new evaluation results and conclude with updated impact analysis and guidance.
Abstract:
Data quality has become a major concern for organisations. The rapid growth in the size and technology of databases and data warehouses has brought significant advantages in accessing, storing, and retrieving information. At the same time, rapid data throughput and heterogeneous access create great challenges for maintaining high data quality. Yet, despite the importance of data quality, the literature has usually reduced data quality to detecting and correcting poor data such as outliers and incomplete or inaccurate values. As a result, organisations are unable to efficiently and effectively assess data quality. An accurate and proper data quality assessment method would enable users to benchmark their systems and monitor their improvement. This paper introduces a granule mining approach for measuring the degree of randomness in erroneous data, which will enable decision makers to conduct accurate quality assessments and locate the most severely affected data, thereby providing an accurate estimate of the human and financial resources required for quality improvement tasks.
Abstract:
The research reported in this paper introduces a knowledge-based urban development assessment framework, constructed to evaluate and assist in the (re)formulation of the local and regional policy frameworks and applications necessary in knowledge city transformations. The paper also reports the findings of an application of this framework in a comparative study of Boston, Vancouver, Melbourne and Manchester. With its assessment framework, the paper: demonstrates an innovative way of examining the knowledge-based development capacity of cities by scrutinising their economic, socio-cultural, enviro-urban and institutional development mechanisms and capabilities; presents some of the generic indicators used to evaluate the knowledge-based development performance of cities; reveals how a city can benchmark its development level against that of other cities; and provides insights for achieving more sustainable, knowledge-based development.
Abstract:
Current complication rates for adolescent scoliosis surgery necessitate the development of better surgical planning tools to improve outcomes. Here we present our approach to developing finite element models of the thoracolumbar spine for deformity surgery simulation, with patient-specific model anatomy based on low-dose pre-operative computed tomography scans. As a first step towards defining patient-specific tissue properties, an initial 'benchmark' set of properties was used to simulate a clinically performed pre-operative spinal flexibility assessment, the fulcrum bending radiograph. Clinical data for ten patients were compared with the simulated results for this assessment, and in cases where these data differed by more than 10%, soft tissue properties for the costo-vertebral joint (CVJt) were altered to achieve better agreement. Results from these analyses showed that changing the CVJt stiffness resulted in acceptable agreement between clinical and simulated flexibility in two of the six cases. In light of these results and those of our previous studies in this area, we suggest that spinal flexibility in the fulcrum bending test is not governed by any single soft tissue structure acting in isolation. More detailed biomechanical characterisation of the fulcrum bending test is required to provide better data for the determination of patient-specific soft tissue properties.
Abstract:
The discovery of protein variation is an important strategy in disease diagnosis within the biological sciences. The current benchmark for elucidating information from multiple biological variables is the so-called "omics" disciplines of the biological sciences. Such variability is uncovered by multivariable data mining techniques, which fall into two primary categories: machine learning strategies and statistics-based approaches. Proteomic studies typically produce hundreds or thousands of variables, p, per observation, n, depending on the analytical platform or method employed to generate the data. Many classification methods are limited by an n≪p constraint and therefore require pre-treatment to reduce the dimensionality prior to classification. Recently, machine learning techniques have gained popularity in the field for their ability to successfully classify unknown samples. One limitation of such methods is the lack of a functional model allowing meaningful interpretation of results in terms of the features used for classification. This problem might be solved using a statistical model-based approach in which not only is the importance of each individual protein explicit, but the proteins are combined into a readily interpretable classification rule without relying on a black-box approach. Here we incorporate the statistical dimension reduction techniques Partial Least Squares (PLS) and Principal Components Analysis (PCA), followed by both statistical and machine learning classification methods, and compare them to a popular machine learning technique, Support Vector Machines (SVM). Both PLS and SVM demonstrate strong utility for proteomic classification problems.
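As a rough, hypothetical illustration of the kind of pipeline the abstract describes, the sketch below applies PCA-based dimension reduction followed by a simple statistical classifier and compares it with a linear SVM on synthetic data with n ≪ p; a PLS-DA variant thresholds a PLS regression on a binary label. None of the data, component counts or classifier choices are taken from the paper.

```python
# Illustrative sketch (not the paper's data or exact pipeline): dimension
# reduction by PCA/PLS followed by a simple classifier, compared against an
# SVM, on a synthetic n << p problem resembling a proteomic study.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# 60 samples, 1000 variables: the n << p setting described in the abstract
X, y = make_classification(n_samples=60, n_features=1000, n_informative=20,
                           random_state=0)

pca_lda = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
svm = make_pipeline(SVC(kernel="linear"))

for name, model in [("PCA + LDA", pca_lda), ("linear SVM", svm)]:
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: cross-validated accuracy = {acc:.2f}")

# PLS-DA variant: regress the {0,1} label on PLS scores and threshold at 0.5
pls = PLSRegression(n_components=5).fit(X, y)
pls_pred = (pls.predict(X).ravel() > 0.5).astype(int)
print(f"PLS-DA (resubstitution) accuracy = {(pls_pred == y).mean():.2f}")
```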
Abstract:
This CD-ROM includes PDFs of presentations on the following topics: "TXDOT Revenue and Expenditure Trends;" "Examine Highway Fund Diversions, & Benchmark Texas Vehicle Registration Fees;" "Evaluation of the JACK Model;" "Future highway construction cost trends;" "Fuel Efficiency Trends and Revenue Impact"