898 results for "sisäinen benchmarking"


Relevance: 10.00%

Abstract:

There is growing popularity in the use of composite indices and rankings for cross-organizational benchmarking. However, little attention has been paid to alternative methods and procedures for computing these indices, and to how the choice of method may affect the resulting indices and rankings. This dissertation developed an approach for assessing composite indices and rankings based on the integration of a number of methods for aggregation, data transformation, and attribute weighting involved in their computation. The integrated model is based on the simulation of composite indices using methods and procedures proposed in the areas of multi-criteria decision making (MCDM) and knowledge discovery in databases (KDD). The approach was automated through an IT artifact that was designed, developed, and evaluated following the framework and guidelines of the design science paradigm of information systems research. This artifact dynamically generates multiple versions of indices and rankings by considering different methodological scenarios according to user-specified parameters. The computerized implementation was done in Visual Basic for Excel 2007. Using different performance measures, the artifact produces a number of Excel outputs for the comparison and assessment of the indices and rankings. To evaluate the efficacy of the artifact and its underlying approach, a full empirical analysis was conducted using the World Bank's Doing Business database for the year 2010, which includes ten sub-indices (each corresponding to a different area of the business environment and regulation) for 183 countries. The output results, obtained using 115 methodological scenarios for the assessment of this index and its ten sub-indices, indicated that the variability of the component indicators considered in each case influenced the sensitivity of the rankings to the methodological choices.
Overall, the results of our multi-method assessment were consistent with the World Bank rankings, except in cases where the indices involved cost indicators measured relative to per capita income, which yielded more sensitive results. Low-income countries exhibited more sensitivity in their rankings, and less agreement between the benchmark rankings and our multi-method rankings, than higher-income country groups.
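The kind of multi-method simulation the artifact performs can be sketched as follows (toy data and a small method grid; the actual artifact, implemented in Visual Basic for Excel, covers many more transformation, weighting, and aggregation choices):

```python
import numpy as np

def normalize(X, method):
    # Column-wise data transformation of the raw indicators.
    if method == "minmax":
        return (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    if method == "zscore":
        return (X - X.mean(axis=0)) / X.std(axis=0)
    raise ValueError(method)

def aggregate(Z, w, method):
    # Weighted aggregation of the transformed indicators into one index.
    if method == "arithmetic":
        return Z @ w
    if method == "geometric":
        # Clip to keep the geometric mean defined for non-positive values.
        return np.prod(np.clip(Z, 1e-9, None) ** w, axis=1)
    raise ValueError(method)

# Toy data: 5 "countries" x 3 sub-indicators, fixed weights.
rng = np.random.default_rng(0)
X = rng.random((5, 3))
w = np.array([0.5, 0.3, 0.2])

# One ranking per methodological scenario (2 transformations x 2 aggregations).
rankings = {}
for norm in ("minmax", "zscore"):
    for agg in ("arithmetic", "geometric"):
        scores = aggregate(normalize(X, norm), w, agg)
        rankings[(norm, agg)] = np.argsort(np.argsort(-scores))  # rank 0 = best
```

Comparing the rank vectors across scenarios then reveals how sensitive each entity's position is to the methodological choices.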

Relevance: 10.00%

Abstract:

This paper assesses the status of pre-disaster risk management in Turkey. Focusing on the period following the catastrophic earthquake of August 17, 1999, the study draws on USAID's Disaster Risk Management Benchmarking Tool (DRMBT). In line with the benchmarking tool, the paper covers key developments in the four components of pre-disaster risk management: risk identification, risk mitigation, risk transfer, and disaster preparedness. It presents three major conclusions: (i) although post-1999 Turkey has made important progress in the pre-disaster phase of DRM, particularly with the enactment of obligatory earthquake insurance and tightened building construction standards, the country remains far from substantial success in DRM; (ii) in recent years local governments have been given more authority in the realm of DRM, yet Turkey's approach to DRM is still predominantly centralized, at the expense of successful DRM practices at the local level; (iii) while the devastating 1999 earthquake has led to advances in the pre-disaster components of DRM, progress has mostly concerned earthquakes; Turkey's other major disasters (e.g., landslides, floods, and wildfires) require similar attention from local and central authorities.

Relevance: 10.00%

Abstract:

This doctoral thesis analyzes the discursive representations of the bandit Lampião and his gang of outlaws in news stories published in Mossoró newspapers in the 1920s, on the occasion of the gang's invasion of the city of Mossoró, in the state of Rio Grande do Norte, on June 13, 1927. To this end, we take as our basis the theoretical assumptions of Textual Linguistics, especially the narrower framework of what is known today as Textual Analysis of Discourses (ADT), a theoretical and descriptive approach to the linguistic study of texts proposed by the French linguist Jean-Michel Adam. Within this approach we are interested, specifically, in the semantic level of the text, highlighting the notion of discursive representation, studied on the basis of the operations of referencing, predication, modification, spatial and temporal localization, connection, and analogy (ADAM, 2011; CASTILHO, 2010; KOCH, 2002, 2006; MARCUSCHI, 1998, 2008; NEVES, 2007; RODRIGUES, PASSEGGI & SILVA NETO, 2010). The corpus of this research consists of three news reports published in the 1920s in the newspapers O Mossoroense, Correio do Povo, and O Nordeste, reconstituted from the files held in the Lauro da Escóssia Municipal Museum and the Mossoró Resistance Memorial, both located in Mossoró, and from the collection of newspaper reports on Lampião compiled by the Rio Grande do Norte historian Raimundo Nonato. The discursive representations are reconstructed through these semantic analysis operations. For Lampião, the following representations are built: bandit, chief of bandits, briber, defeated man, Captain, and Lord. For the outlaws of Lampião's band, the following discursive representations are built: group, gang, gangsters, accomplices, bloodthirsty pack, brigands, bandits, criminals, horde of thieves, and wild beasts.
These representations chiefly reveal the views of the newspapers of the time, which represented above all the interests of traders, politicians, the government itself, and the Mossoró population in general.

Relevance: 10.00%

Abstract:

The spread of wireless networks and the growing proliferation of mobile devices require the development of mobility control mechanisms to support different traffic demands under different network conditions. A major obstacle to developing this kind of technology is the complexity involved in handling all the information about the large number of Moving Objects (MOs), as well as the signaling overhead required to manage these procedures in the network. Although several initiatives have been proposed by the scientific community to address this issue, they have not proved effective, since they depend on a particular request from the MO that is responsible for triggering the mobility process. Moreover, they are often guided only by wireless-medium statistics, such as the Received Signal Strength Indicator (RSSI) of the candidate Point of Attachment (PoA). This work therefore seeks to develop, evaluate, and validate a sophisticated communication infrastructure for Wireless Networking for Moving Objects (WiNeMO) systems by making use of the flexibility provided by the Software-Defined Networking (SDN) paradigm, where network functions are easily and efficiently deployed by integrating the OpenFlow and IEEE 802.21 standards. For benchmarking purposes, the analysis was conducted on both the control and data planes, demonstrating that the proposal significantly outperforms typical IP-based SDN and QoS-enabled approaches by allowing the network to handle multimedia traffic with optimal Quality of Service (QoS) transport and acceptable Quality of Experience (QoE) over time.

Relevance: 10.00%

Abstract:

This research paper focuses on the self-declared initiatives of the four largest chocolate companies to tackle social problems in the context of establishing a sustainable supply chain. After a literature review of sustainability, supply chain management, and cocoa farming, the paper assesses the extant practices of the chocolatiers and makes a comparative analysis based on their Corporate Social Responsibility (CSR) and sustainability reports. The paper uses a case study approach based on secondary data. A roadmap and benchmarking of social sustainability initiatives were conducted for the supply chain management activities of the world's four largest chocolatiers. The paper analyses the extant sustainability practices of the chocolatiers and offers a model framework for comparing the measures taken. Because the paper is based on self-declared secondary data, some practices may not have been documented by the case companies, and companies may claim practices they do not actually follow. The paper provides a framework for agricultural businesses to compare their sustainability efforts and improve the performance of their supply chains. The originality and value of this research lie in both its literature and its methodology: the framework for analysing the social sustainability aspects of agricultural supply chains is original and gives an up-to-date view of sustainability practices, and the use of secondary data to compare self-declared initiatives is a novel approach to business sustainability research.

Relevance: 10.00%

Abstract:

This thesis introduces two related lines of study on the classification of hyperspectral images (HSI) with nonlinear methods. First, it describes a quantitative and systematic evaluation, by the author, of each major component in a pipeline for classifying hyperspectral images developed earlier in a joint collaboration [23]. The pipeline, with its novel use of nonlinear classification methods, has reached beyond the state of the art in classification accuracy on commonly used benchmarking HSI data [6], [13]. More importantly, it provides a clutter map, with respect to a predetermined set of classes, for real application situations where the image pixels do not necessarily fall into a predetermined set of classes to be identified, detected, or classified.

The particular components evaluated are a) band selection with band-wise entropy spread, b) feature transformation with spatial filters and spectral expansion with derivatives, c) graph spectral transformation via locally linear embedding for dimension reduction, and d) statistical ensemble for clutter detection. The quantitative evaluation of the pipeline verifies that these components are indispensable to high-accuracy classification.
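Component a), band selection by entropy, can be illustrated with a minimal sketch (a hypothetical histogram-entropy criterion; the thesis's exact "entropy spread" measure may differ):

```python
import numpy as np

def select_bands(cube, k, bins=32):
    """Keep the k bands whose pixel-value histograms have the highest
    entropy -- a simplified stand-in for a band-wise entropy criterion."""
    entropies = []
    for b in range(cube.shape[-1]):
        hist, _ = np.histogram(cube[..., b], bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]
        entropies.append(float(-(p * np.log2(p)).sum()))
    # Indices of the k highest-entropy bands, best first.
    return list(np.argsort(entropies)[-k:][::-1])

# A constant band carries no information (zero entropy) and is never selected.
rng = np.random.default_rng(1)
cube = rng.random((8, 8, 4))   # toy 8x8 image with 4 spectral bands
cube[..., 2] = 0.5             # band 2 is constant
selected = select_bands(cube, 2)
```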

Second, the work extends the HSI classification pipeline from a single HSI data cube to multiple HSI data cubes, each of which, with its own feature variation, is to be classified into multiple classes. The main challenge is deriving the cube-wise classification from the pixel-wise classification. The thesis presents an initial attempt to circumvent this challenge and discusses the potential for further improvement.

Relevance: 10.00%

Abstract:

The dissertation consists of three chapters related to the low-price guarantee marketing strategy and to energy efficiency analysis. A low-price guarantee is a marketing strategy in which firms promise to charge consumers the lowest price among their competitors. Chapter 1 addresses the research question "Does a Low-Price Guarantee Induce Lower Prices?" by looking at the retail gasoline industry in Quebec, where a major branded firm started a low-price guarantee in 1996. Chapter 2 conducts a consumer welfare analysis of low-price guarantees to derive policy implications and offers a new explanation of firms' incentives to adopt a low-price guarantee. Chapter 3 develops energy performance indicators (EPIs) to measure the energy efficiency of manufacturing plants in the pulp, paper, and paperboard industry.

Chapter 1 revisits the traditional view that a low-price guarantee results in higher prices by facilitating collusion. Using accurate market definitions and station-level data from the retail gasoline industry in Quebec, I conduct a descriptive analysis based on stations and price zones to compare price and sales movements before and after the guarantee was adopted. I find that, contrary to the traditional view, the stores that offered the guarantee significantly decreased their prices and increased their sales. I also build a difference-in-differences model, which quantifies the decrease in the posted price of the stores that offered the guarantee at 0.7 cents per liter. While this change is significant, I do not find a significant response in competitors' prices. The sales of the stores that offered the guarantee increased significantly while competitors' sales decreased significantly; however, the significance vanishes when I use station-clustered standard errors. Comparing my observations with the predictions of different theoretical models of low-price guarantees, I conclude that the empirical evidence supports the view that the low-price guarantee is a simple commitment device and induces lower prices.
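The difference-in-differences logic can be sketched with toy numbers (invented here, chosen so the sketch reproduces a 0.7 cents-per-liter drop; not the chapter's actual Quebec data):

```python
from statistics import mean

# Toy panel: mean posted prices (cents per liter) before and after the
# guarantee, for guarantee ("treated") stores and competitor ("control")
# stores. All figures are hypothetical.
prices = {
    ("treated", "before"): [95.0, 96.2, 94.8],
    ("treated", "after"):  [94.1, 95.4, 94.3],
    ("control", "before"): [95.5, 96.0, 95.1],
    ("control", "after"):  [95.4, 96.1, 95.0],
}

def did_estimate(prices):
    # DiD = (treated after - treated before) - (control after - control before):
    # the control group's change nets out market-wide price movements.
    d_treated = mean(prices[("treated", "after")]) - mean(prices[("treated", "before")])
    d_control = mean(prices[("control", "after")]) - mean(prices[("control", "before")])
    return d_treated - d_control

effect = did_estimate(prices)  # negative: guarantee stores cut prices relative to controls
```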

Chapter 2 conducts a consumer welfare analysis of low-price guarantees to address antitrust concerns and potential government regulation, and explains firms' potential incentives to adopt a low-price guarantee. Using station-level data from the retail gasoline industry in Quebec, I estimate consumers' demand for gasoline with a structural model of spatial competition that incorporates the low-price guarantee as a commitment device allowing firms to pre-commit to charging the lowest price among their competitors. A counterfactual analysis under a Bertrand competition setting shows that the stores that offered the guarantee attracted considerably more consumers and decreased their posted price by 0.6 cents per liter. Although the matching stores suffered a decrease in profits from gasoline sales, they are incentivized to adopt the low-price guarantee to attract more consumers to the store, likely increasing profits at attached convenience stores. Firms have strong incentives to adopt a low-price guarantee on the product about which their consumers are most price-sensitive, while earning a profit from products not covered by the guarantee. I estimate that consumers earn about 0.3% more surplus when the low-price guarantee is in place, which suggests that the authorities need not be concerned about, or regulate, low-price guarantees. In Appendix B, I also propose an empirical model to examine how low-price guarantees would change consumer search behavior and whether consumer search plays an important role in estimating consumer surplus accurately.

Chapter 3, joint with Gale Boyd, describes work with the pulp, paper, and paperboard (PP&PB) industry to provide a plant-level indicator of energy efficiency for facilities that produce various types of paper products in the United States. Organizations that implement strategic energy management programs undertake a set of activities that, if carried out properly, have the potential to deliver sustained energy savings. Energy performance benchmarking is a key activity of strategic energy management and one way to enable companies to set energy efficiency targets for manufacturing facilities. The opportunity to assess plant energy performance through comparison with similar plants in the same industry is a highly desirable and strategic benchmarking method for industrial energy managers; however, access to the energy performance data needed for such benchmarking is usually unavailable to them. The U.S. Environmental Protection Agency (EPA), through its ENERGY STAR program, seeks to overcome this barrier through the development of manufacturing sector-based plant energy performance indicators (EPIs) that encourage U.S. industries to use energy more efficiently. In developing the EPI tools, consideration is given to the role that performance-based indicators play in motivating change; to the steps necessary for indicator development, from interacting with an industry to securing adequate data for the indicator; and to the actual application and use of an indicator once complete. How indicators are employed in EPA's efforts to encourage industries to voluntarily improve their use of energy is discussed as well. The chapter describes the data and statistical methods used to construct the EPI for plants within selected segments of the pulp, paper, and paperboard industry: specifically, pulp mills and integrated paper and paperboard mills.
The individual equations are presented, as are the instructions for using those equations as implemented in an associated Microsoft Excel-based spreadsheet tool.
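The EPI idea of scoring a plant's actual energy use against a statistical prediction for comparable plants can be sketched as follows (the function name, the lognormal-peer assumption, and the `sigma` parameter are all illustrative; the actual ENERGY STAR equations live in the chapter's spreadsheet tool):

```python
import math

def epi_score(actual_energy, predicted_energy, sigma):
    """Percentile-style energy performance score: 100 = best in class,
    50 = typical. Assumes peer log efficiency ratios are Normal(0, sigma),
    an illustrative distributional assumption, not the chapter's fitted model."""
    ratio = actual_energy / predicted_energy  # < 1.0: uses less than predicted
    z = math.log(ratio) / sigma
    peer_cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return 100.0 * (1.0 - peer_cdf)           # share of peers this plant beats

# A plant using exactly its predicted energy sits at the 50th percentile.
typical = epi_score(100.0, 100.0, sigma=0.25)
efficient = epi_score(80.0, 100.0, sigma=0.25)
```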

Relevance: 10.00%

Abstract:

Prior research has established that the idiosyncratic volatility of securities prices exhibits a positive trend. This trend, among other factors, has made the merits of investment diversification and portfolio construction more compelling. A new optimization technique, a greedy algorithm, is proposed to optimize the weights of assets in a portfolio. The main benefits of using this algorithm are to: a) increase the efficiency of the portfolio optimization process, b) enable large-scale optimizations, and c) improve the resulting optimal weights. In addition, the technique utilizes a novel approach to constructing a time-varying covariance matrix: a modified integrated dynamic conditional correlation GARCH (IDCC-GARCH) model is applied to account for the dynamics of the conditional covariance matrices employed. The stochastic aspects of the expected returns of the securities are integrated into the technique through Monte Carlo simulation: instead of being represented as deterministic values, expected returns are assigned simulated values based on their historical measures. The time series of the securities are fitted to a probability distribution that matches their time-series characteristics, using the Anderson-Darling goodness-of-fit criterion. Both simulated and actual data sets are used to generalize the results. Employing the S&P 500 securities as the base, 2000 simulated data sets are created using Monte Carlo simulation; in addition, the Russell 1000 securities are used to generate 50 sample data sets. The results indicate an increase in risk-return performance. Using Value-at-Risk (VaR) as the criterion and the Crystal Ball portfolio optimizer, a commercial product currently available on the market, as the benchmark for comparison, the new greedy technique clearly outperforms the alternatives on samples of the S&P 500 and Russell 1000 securities.
The resulting improvements in performance are consistent among five securities selection methods (maximum, minimum, random, absolute minimum, and absolute maximum) and three covariance structures (unconditional, orthogonal GARCH, and integrated dynamic conditional GARCH).
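A minimal sketch of the greedy weighting idea, repeatedly assigning a small weight increment to whichever asset most improves the objective (here a mean-return-per-unit-risk ratio as a simple stand-in for the thesis's VaR criterion and IDCC-GARCH covariance):

```python
import numpy as np

def greedy_weights(mu, cov, steps=100):
    """Allocate portfolio weight in 1/steps increments, each time to the
    asset that yields the best return/risk ratio for the whole portfolio.
    Objective and covariance are simplified stand-ins for the thesis's."""
    n = len(mu)
    w = np.zeros(n)
    inc = 1.0 / steps
    for _ in range(steps):
        best, best_score = 0, -np.inf
        for i in range(n):
            trial = w.copy()
            trial[i] += inc
            ret = trial @ mu
            risk = np.sqrt(trial @ cov @ trial)
            score = ret / risk if risk > 0 else -np.inf
            if score > best_score:
                best, best_score = i, score
        w[best] += inc
    return w

# Toy universe: 3 assets with independent risks.
mu = np.array([0.08, 0.10, 0.12])    # expected returns
cov = np.diag([0.04, 0.09, 0.16])    # diagonal covariance
w = greedy_weights(mu, cov)
```

Each pass costs O(n) objective evaluations, which is what makes the greedy scheme attractive for large-scale problems compared with joint optimization over all weights.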

Relevance: 10.00%

Abstract:

This paper formulates the linear-kernel support vector machine (SVM) as a regularized least-squares (RLS) problem. By defining a set of indicator variables for the errors, the solution to the RLS problem is represented as an equation that relates the error vector to the indicator variables. Through partitioning of the training set, the SVM weights and bias are expressed analytically using the support vectors. It is also shown how this approach naturally extends to SVMs with nonlinear kernels while avoiding the need for Lagrange multipliers and duality theory. A fast iterative solution algorithm based on Cholesky decomposition with permutation of the support vectors is suggested as a solution method. The properties of our SVM formulation are analyzed and compared with standard SVMs using a simple example that can be illustrated graphically. The correctness and behavior of our solution (derived purely in the primal RLS context) are demonstrated using a set of public benchmarking problems for both linear and nonlinear SVMs.
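The flavor of the primal, Cholesky-based solution can be sketched with a plain regularized least-squares classifier (a simplified stand-in: the paper's method additionally uses the error indicator variables and support-vector partitioning to recover the exact SVM solution):

```python
import numpy as np

def rls_classifier(X, y, lam=1.0):
    """Fit (w, b) minimizing ||Xw + b - y||^2 + lam * ||w||^2 via Cholesky."""
    n, d = X.shape
    Xa = np.hstack([X, np.ones((n, 1))])               # absorb bias as last column
    A = Xa.T @ Xa + lam * np.diag([1.0] * d + [0.0])   # do not regularize the bias
    L = np.linalg.cholesky(A)                          # A = L L^T
    z = np.linalg.solve(L, Xa.T @ y)                   # forward substitution
    wb = np.linalg.solve(L.T, z)                       # back substitution
    return wb[:d], wb[d]

# Linearly separable toy problem with labels in {-1, +1}.
X = np.array([[1.0, 1.0], [2.0, 1.5], [-1.0, -1.0], [-2.0, -1.5]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = rls_classifier(X, y, lam=0.1)
preds = np.sign(X @ w + b)
```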

Relevance: 10.00%

Abstract:

Resource selection (or query routing) is an important step in P2P information retrieval (IR). Though analogous to document retrieval in the sense of choosing a relevant subset of resources, resource selection methods have evolved independently from those for document retrieval. Among the reasons for this divergence is that document retrieval targets scenarios where the underlying resources are semantically homogeneous, whereas peers typically manage diverse content. We observe that, in the clustered two-tier P2P IR architecture, semantic heterogeneity at the resource selection layer is mitigated by clustering, and we posit that this necessitates a fresh look at the applicability of document retrieval methods to resource selection within such a framework. This paper empirically benchmarks document retrieval models against state-of-the-art resource selection models for the problem of resource selection in the clustered P2P IR architecture, using classical IR evaluation metrics. Our benchmarking study shows that document retrieval models significantly outperform other methods for the task of resource selection in the clustered P2P IR architecture. This indicates that the clustered P2P IR framework can exploit advances in document retrieval methods to deliver corresponding improvements in resource selection, suggesting a potential convergence of these fields for the clustered P2P IR architecture.
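The core idea of applying a document retrieval model to resource selection, scoring each peer's pooled content as one large pseudo-document, can be sketched as follows (a toy TF-IDF variant with invented data; the paper benchmarks standard IR models):

```python
from collections import Counter
import math

def score_resources(query_terms, resources):
    """Pool each resource's documents into one pseudo-document and score
    it with a TF-IDF-style formula -- a simple stand-in for the document
    retrieval models the paper benchmarks for resource selection."""
    n = len(resources)
    df = Counter()  # number of resources in which each term occurs
    for docs in resources.values():
        for t in set(t for d in docs for t in d.split()):
            df[t] += 1
    scores = {}
    for name, docs in resources.items():
        tf = Counter(t for d in docs for t in d.split())
        scores[name] = sum(
            tf[t] * math.log((n + 1) / (df[t] + 1)) for t in query_terms
        )
    return scores

# Two toy "peers", each holding a few tiny documents.
resources = {
    "music":   ["guitar chords songs", "piano songs"],
    "cooking": ["pasta recipe", "bread recipe baking"],
}
scores = score_resources(["songs"], resources)
```

A query would then be routed to the highest-scoring resources, rather than to individual documents.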

Relevance: 10.00%

Abstract:

Introduction
Evaluating the quality of palliative day services is essential for assessing care across diverse settings and for monitoring quality improvement approaches.

Aim
To develop a set of quality indicators for assessment of all aspects (structure, process and outcome) of care in palliative day services.

Methods
Using a modified version of the RAND/UCLA appropriateness method (Fitch et al., 2001), a multidisciplinary panel of 16 experts independently completed a survey rating the appropriateness of 182 potential quality indicators previously identified during a systematic evidence review. Panel members then attended a one day, face-to-face meeting where indicators were discussed and subsequently re-rated. Panel members were also asked to rate the feasibility and necessity of measuring each indicator.

Results
Seventy-one indicators classified as inappropriate during the survey were removed based on median appropriateness ratings and level of agreement. Following the panel discussions, a further 60 were removed based on appropriateness and feasibility ratings, level of agreement, and assessment of necessity. Themes identified during the panel discussion, together with findings of the evidence review, were used to translate the remaining 51 indicators into a final set of 27.
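The rating logic behind this filtering can be sketched as follows (illustrative thresholds; the RAND/UCLA manual specifies the exact median bands and disagreement rules, which vary with panel size):

```python
import statistics

def classify(ratings):
    """Classify an indicator from 1-9 appropriateness ratings.
    Simplified RAND/UCLA-style rule: appropriate if median >= 7 with
    agreement; inappropriate if median <= 3 with agreement; else uncertain.
    'Disagreement' here means at least a third of panelists in each extreme
    tertile -- an illustrative threshold, not the manual's exact definition."""
    med = statistics.median(ratings)
    n = len(ratings)
    low = sum(1 for r in ratings if r <= 3)
    high = sum(1 for r in ratings if r >= 7)
    if low >= n / 3 and high >= n / 3:
        return "uncertain"          # panel is split between the extremes
    if med >= 7:
        return "appropriate"
    if med <= 3:
        return "inappropriate"
    return "uncertain"

# Example panels of 7 raters.
a = classify([8, 9, 7, 8, 9, 7, 8])   # high ratings, agreement
b = classify([2, 1, 3, 2, 2, 1, 3])   # low ratings, agreement
c = classify([9, 9, 9, 1, 1, 1, 5])   # polarized panel
```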

Conclusion
The final indicator set included information on rationale and supporting evidence, methods of assessment, risk adjustment, and recommended performance levels. Further implementation work will test the suitability of this ‘toolkit’ for measurement and benchmarking. The final indicator set provides the basis for standardised assessment of quality across services, including care delivered in community and primary care settings.

Reference

• Fitch K, Bernstein SJ, Aguilar MD, et al. The RAND/UCLA Appropriateness Method User’s Manual. Santa Monica, CA: RAND Corporation; 2001. http://www.rand.org/pubs/monograph_reports/MR1269

Relevance: 10.00%

Abstract:

This is the final report to the Iowa DOT Offices of Construction and the Highway Division for the calendar year 1999 research project entitled "Continuation of Benchmarking Project: Phase IV". The project continues efforts started in 1995 with the development of a performance measurement system. The performance measurements were used to identify areas that required improvement, and process improvement teams (PITs) were launched to make recommendations for improvement. This report provides a brief historical background, documents Benchmark Steering Team activities, and describes measurement activities, including the employee survey and the collection of non-survey data. A retrospective of past PIT activities is then given, which sets the stage for the substantial increase in PIT activity that occurred during the winter of 1998/99. Finally, the report closes with suggestions for future directions in benchmarking activity.

Relevance: 10.00%

Abstract:

In the past decades, social-ecological systems (SESs) worldwide have undergone dramatic transformations with often detrimental consequences for livelihoods. Although resilience thinking offers promising conceptual frameworks to understand SES transformations, empirical resilience assessments of real-world SESs are still rare because SES complexity requires integrating knowledge, theories, and approaches from different disciplines. Taking up this challenge, we empirically assess the resilience of a South African pastoral SES to drought using various methods from natural and social sciences. In the ecological subsystem, we analyze rangelands’ ability to buffer drought effects on forage provision, using soil and vegetation indicators. In the social subsystem, we assess households’ and communities’ capacities to mitigate drought effects, applying agronomic and institutional indicators and benchmarking against practices and institutions in traditional pastoral SESs. Our results indicate that a decoupling of livelihoods from livestock-generated income was initiated by government interventions in the 1930s. In the post-apartheid phase, minimum-input strategies of herd management were adopted, leading to a recovery of rangeland vegetation due to unintentionally reduced stocking densities. Because current livelihood security is mainly based on external monetary resources (pensions, child grants, and disability grants), household resilience to drought is higher than in historical phases. Our study is one of the first to use a truly multidisciplinary resilience assessment. Conflicting results from partial assessments underline that measuring narrow indicator sets may impede a deeper understanding of SES transformations. The results also imply that the resilience of contemporary, open SESs cannot be explained by an inward-looking approach because essential connections and drivers at other scales have become relevant in the globalized world. 
Our study thus has helped to identify pitfalls in empirical resilience assessment and to improve the conceptualization of SES dynamics.

Relevance: 10.00%

Abstract:

Metagenomic studies use high-throughput sequence data to investigate microbial communities in situ. However, considerable challenges remain in the analysis of these data, particularly with regard to speed and reliable analysis of microbial species as opposed to higher level taxa such as phyla. We here present Genometa, a computationally undemanding graphical user interface program that enables identification of bacterial species and gene content from datasets generated by inexpensive high-throughput short read sequencing technologies. Our approach was first verified on two simulated metagenomic short read datasets, detecting 100% and 94% of the bacterial species included with few false positives or false negatives. Subsequent comparative benchmarking analysis against three popular metagenomic algorithms on an Illumina human gut dataset revealed Genometa to attribute the most reads to bacteria at species level (i.e. including all strains of that species) and demonstrate similar or better accuracy than the other programs. Lastly, speed was demonstrated to be many times that of BLAST due to the use of modern short read aligners. Our method is highly accurate if bacteria in the sample are represented by genomes in the reference sequence but cannot find species absent from the reference. This method is one of the most user-friendly and resource efficient approaches and is thus feasible for rapidly analysing millions of short reads on a personal computer.

Relevance: 10.00%

Abstract:

Despite a growing market, universities compete for high-performing first-year students. The aim of this study was to examine the competitiveness of the Bachelor's program in Business Administration at Hochschule Hannover, Faculty IV, Department of Business Administration. To this end, a comparative analysis of 23 degree programs in business administration or economics in Lower Saxony and neighboring federal states was conducted, based on the criteria of elective options, methods training, training in key competences, and internationalization. This paper reports the results of the analysis and the conclusions drawn for the Business Administration program at Hochschule Hannover.