799 results for Utility-based performance measures


Relevance: 100.00%

Abstract:

This document describes the analytical methods used to quantify core organic chemicals in tissue and sediment collected as part of NOAA's National Status and Trends Program (NS&T) for the years 2000-2006. Organic contaminant analytical methods used during the early years of the program are described in NOAA Technical Memoranda NOS ORCA 71 and 130 (Lauenstein and Cantillo, 1993; Lauenstein and Cantillo, 1998) for the years 1984-1992 and 1993-1996, respectively. These reports are available from our website (http://www.ccma.nos.gov). The methods detailed in this document were used by the Mussel Watch Project and the Bioeffects Project, both part of the NS&T Program. The Mussel Watch Project has been monitoring contaminants in bivalves and sediments since 1986 and is the longest active national contaminant monitoring program operating in U.S. coastal waters. Approximately 280 Mussel Watch sites are sampled on biennial and decadal timescales for bivalve tissue and sediment, respectively. Similarly, the Bioeffects Assessment Project began in 1986 to characterize estuaries and near-coastal environs. Using the sediment quality triad approach, which measures (1) levels of contaminants in sediments, (2) incidence and severity of toxicity, and (3) benthic macrofaunal communities, the Bioeffects Project describes the spatial extent of sediment toxicity. Contaminant assessment is a core function of both projects. These methods, while discussed here in the context of sediment and bivalve tissue, were also used with other matrices, including fish fillet, fish liver, nepheloid layer, and suspended particulate matter. The methods described herein are for the core organic contaminants monitored in the NS&T Program and include polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), butyltins, and organochlorines that have been analyzed consistently over the past 15-20 years. Organic contaminants such as dioxins, perfluoro compounds and polybrominated diphenyl ethers (PBDEs) were analyzed periodically in special studies of the NS&T Program and will be described in another document. All of the analytical techniques described in this document were used by B&B Laboratories, Inc., an affiliate of TDI-Brooks International, Inc., in College Station, Texas, under contract to NOAA. The NS&T Program uses a performance-based system approach to obtain the best possible data quality and comparability, and requires laboratories to demonstrate precision, accuracy, and sensitivity to ensure results-based performance goals and measures. (PDF contains 75 pages)

Relevance: 100.00%

Abstract:

This paper investigates a method of automatic pronunciation scoring for use in computer-assisted language learning (CALL) systems. The method utilizes a likelihood-based "Goodness of Pronunciation" (GOP) measure which is extended to include individual thresholds for each phone, based both on averaged native confidence scores and on rejection statistics provided by human judges. Further improvements are obtained by incorporating models of the subject's native language and by augmenting the recognition networks to include expected pronunciation errors. The various GOP measures are assessed using a specially recorded database of non-native speakers which has been annotated to mark phone-level pronunciation errors. Since pronunciation assessment is highly subjective, a set of four performance measures has been designed, each of them measuring different aspects of how well computer-derived phone-level scores agree with human scores. These performance measures are used to cross-validate the reference annotations and to assess the basic GOP algorithm and its refinements. The experimental results suggest that a likelihood-based pronunciation scoring metric can achieve usable performance, especially after applying the various enhancements.
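A minimal sketch of the core computation behind a likelihood-based GOP measure with per-phone thresholds, under simplified assumptions: the per-frame log-likelihoods, threshold values and phone inventory below are placeholders, not the paper's models or annotated data.

```python
# A minimal sketch of a likelihood-based Goodness of Pronunciation (GOP) score
# with per-phone rejection thresholds. Frame log-likelihoods and threshold
# values are placeholders, not the paper's actual models or data.
import numpy as np

def gop_score(frame_loglik: np.ndarray, intended: int) -> float:
    """frame_loglik: (n_frames, n_phones) log-likelihoods of each phone model.
    Returns the duration-normalised log-likelihood ratio for the intended phone."""
    n_frames = frame_loglik.shape[0]
    intended_ll = frame_loglik[:, intended].sum()          # forced-alignment score
    best_ll = frame_loglik.max(axis=1).sum()               # unconstrained best phone per frame
    return abs(intended_ll - best_ll) / n_frames           # lower = better pronunciation

def is_mispronounced(score: float, phone: str, thresholds: dict) -> bool:
    """Per-phone thresholds, e.g. derived from native-speaker score statistics."""
    return score > thresholds.get(phone, 3.0)              # 3.0 is an arbitrary default

# Toy usage with random "acoustic scores" for a 40-frame segment and 45 phone models
rng = np.random.default_rng(0)
loglik = rng.normal(-5.0, 1.0, size=(40, 45))
s = gop_score(loglik, intended=7)
print(s, is_mispronounced(s, "ae", {"ae": 1.5}))
```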

Relevance: 100.00%

Abstract:

This paper presents a preliminary study which describes and evaluates a multi-objective (MO) version of a recently created single objective (SO) optimization algorithm called the "Alliance Algorithm" (AA). The algorithm is based on the metaphorical idea that several tribes, with certain skills and resource needs, try to conquer an environment for their survival and ally together to improve the likelihood of conquest. The AA has given promising results in several fields to which it has been applied; thus, the development of an MO variant (MOAA) is a natural extension. Here the MOAA's performance is compared with two well-known MO algorithms: NSGA-II and SPEA-2. The performance measures chosen for this study are the convergence and diversity metrics. The benchmark functions chosen for the comparison are from the ZDT and OKA families and the main classical MO problems. The results show that the three algorithms have similar overall performance. Thus, it is not possible to identify a best algorithm for all the problems; the three algorithms show a certain complementarity because they offer superior performance for different classes of problems. © 2012 IEEE.
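For illustration, the sketch below shows one common way to compute the two kinds of performance measure named above: convergence to a reference Pareto front and diversity (here Schott's spacing). The ZDT1-style reference front and the obtained front are invented, and the exact metric definitions used in the paper may differ.

```python
# A minimal sketch of convergence and diversity (spacing) measures commonly used
# to compare multi-objective optimisers such as NSGA-II, SPEA-2 and the MOAA.
# The reference front below is illustrative (ZDT1-like), not the paper's data.
import numpy as np

def convergence(front: np.ndarray, reference: np.ndarray) -> float:
    """Mean Euclidean distance from each obtained point to its nearest reference point."""
    d = np.linalg.norm(front[:, None, :] - reference[None, :, :], axis=2)
    return float(d.min(axis=1).mean())

def spacing(front: np.ndarray) -> float:
    """Schott's spacing: standard deviation of nearest-neighbour distances (0 = evenly spread)."""
    d = np.linalg.norm(front[:, None, :] - front[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    return float(np.std(d.min(axis=1)))

# Toy usage: compare an obtained front against a sampled Pareto front (f2 = 1 - sqrt(f1))
f1 = np.linspace(0, 1, 100)
reference = np.column_stack([f1, 1 - np.sqrt(f1)])
obtained = reference[::10] + 0.02            # slightly shifted, sparser front
print(convergence(obtained, reference), spacing(obtained))
```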

Relevance: 100.00%

Abstract:

Current evidence increasingly suggests that very short, supra-maximal bouts of exercise can have significant health and performance benefits. The majority of research conducted in the area, however, uses laboratory-based protocols, which can lack ecological validity. The purpose of this study was to examine the effects of a high-intensity sprint-training programme on hockey-related performance measures. Fourteen semi-professional hockey players completed either a 4-week high-intensity training (HIT) intervention, consisting of a total of six HIT sessions that progressively increased in volume (n=7), or followed their normal training programme (Con; n=7). Straight-line sprint speed with and without a hockey stick and ball, and slalom sprint speed with and without a hockey stick and ball, were used as performance indicators. Maximal sprint speed over 22.9 m was also assessed. Upon completion of the four-week intervention, straight-line sprint speed improved significantly in the HIT group (~3%), with no change in performance for the Con group. Slalom sprint speed, both with and without a hockey ball, was not significantly different following the training programme in either group. Maximal sprint speed improved significantly (12.1%) in the HIT group, but there was no significant performance change in the Con group. The findings of this study indicate that a short period of HIT can significantly improve hockey-related performance measures, and could be beneficial to athletes and coaches in field settings.

Relevance: 100.00%

Abstract:

Recent measurements of local-area and wide-area traffic have shown that network traffic exhibits variability at a wide range of scales (self-similarity). In this paper, we examine a mechanism that gives rise to self-similar network traffic and present some of its performance implications. The mechanism we study is the transfer of files or messages whose size is drawn from a heavy-tailed distribution. We examine its effects through detailed transport-level simulations of multiple TCP streams in an internetwork. First, we show that in a "realistic" client/server network environment (i.e., one with bounded resources and coupling among traffic sources competing for resources), the degree to which file sizes are heavy-tailed can directly determine the degree of traffic self-similarity at the link level. We show that this causal relationship is not significantly affected by changes in network resources (bottleneck bandwidth and buffer capacity), network topology, the influence of cross-traffic, or the distribution of interarrival times. Second, we show that properties of the transport layer play an important role in preserving and modulating this relationship. In particular, the reliable transmission and flow control mechanisms of TCP (Reno, Tahoe, or Vegas) serve to maintain the long-range dependency structure induced by heavy-tailed file size distributions. In contrast, if a non-flow-controlled and unreliable (UDP-based) transport protocol is used, the resulting traffic shows little self-similarity: although still bursty at short time scales, it has little long-range dependence. If flow-controlled, unreliable transport is employed, the degree of traffic self-similarity is positively correlated with the degree of throttling at the source. Third, in exploring the relationship between file sizes, transport protocols, and self-similarity, we are also able to show some of the performance implications of self-similarity. We present data on the relationship between traffic self-similarity and network performance as captured by performance measures including packet loss rate, retransmission rate, and queueing delay. Increased self-similarity, as expected, results in degradation of performance. Queueing delay, in particular, exhibits a drastic increase with increasing self-similarity. Throughput-related measures such as packet loss and retransmission rate, however, increase only gradually with increasing traffic self-similarity as long as a reliable, flow-controlled transport protocol is used.
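As a rough illustration of the mechanism described above, the sketch below draws transfer sizes from a heavy-tailed (Pareto) distribution, builds a toy aggregate traffic series, and estimates a Hurst parameter with the aggregated-variance method. The traffic model and parameter values are assumptions for illustration, not the paper's transport-level simulations.

```python
# A minimal sketch, under simplified assumptions, of two ingredients discussed
# above: heavy-tailed (Pareto) transfer sizes and a check of the resulting
# traffic for self-similarity via an aggregated-variance Hurst estimate.
import numpy as np

rng = np.random.default_rng(1)

def pareto_sizes(n: int, alpha: float = 1.2, k: float = 1000.0) -> np.ndarray:
    """Heavy-tailed file sizes: P[X > x] = (k/x)^alpha for x >= k."""
    return k * (1.0 - rng.random(n)) ** (-1.0 / alpha)

def hurst_aggregated_variance(series: np.ndarray, scales=(1, 2, 4, 8, 16, 32, 64)) -> float:
    """Slope of log(var) vs log(m) for non-overlapping block means; H = 1 + slope/2."""
    variances = []
    for m in scales:
        blocks = series[: len(series) // m * m].reshape(-1, m).mean(axis=1)
        variances.append(blocks.var())
    slope = np.polyfit(np.log(scales), np.log(variances), 1)[0]
    return 1.0 + slope / 2.0

# Toy traffic: each of 20000 time slots starts a transfer with small probability;
# the bytes of a transfer are spread uniformly over the following slots.
traffic = np.zeros(20000)
starts = np.flatnonzero(rng.random(20000) < 0.05)
for t, size in zip(starts, pareto_sizes(len(starts))):
    duration = max(1, int(size // 1000))
    traffic[t : t + duration] += size / duration
print("estimated Hurst parameter:", hurst_aggregated_variance(traffic))
```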

Relevance: 100.00%

Abstract:

We discuss the design principles of TCP within the context of heterogeneous wired/wireless networks and mobile networking. We identify three shortcomings in TCP's behavior: (i) the protocol's error detection mechanism, which does not distinguish different types of errors and thus does not suffice for heterogeneous wired/wireless environments, (ii) the error recovery, which is not responsive to the distinctive characteristics of wireless networks such as transient or burst errors due to handoffs and fading channels, and (iii) the protocol strategy, which does not control the tradeoff between performance measures such as goodput and energy consumption, and often entails wasteful retransmission effort and energy expenditure. We discuss a solution framework based on selected research proposals and the associated evaluation criteria for the suggested modifications. We highlight an important angle that has not yet attracted the attention it requires: the need for new performance metrics, appropriate for evaluating the impact of protocol strategies on battery-powered devices.
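The goodput-versus-energy trade-off mentioned in point (iii) can be made concrete with a small sketch; the transfer record format and the per-byte energy cost below are illustrative assumptions, not values from the paper.

```python
# A minimal sketch of goodput, energy and retransmission-overhead measures for a
# single transfer, using an assumed per-byte transmission energy cost.
from dataclasses import dataclass

@dataclass
class Transfer:
    payload_bytes: int        # bytes delivered to the application
    sent_bytes: int           # bytes actually transmitted, including retransmissions
    duration_s: float

def goodput_bps(t: Transfer) -> float:
    return 8 * t.payload_bytes / t.duration_s

def energy_joules(t: Transfer, j_per_byte: float = 2e-6) -> float:
    """Energy spent at the sender; retransmitted bytes are pure overhead."""
    return t.sent_bytes * j_per_byte

def overhead_ratio(t: Transfer) -> float:
    """Wasted effort: fraction of transmitted bytes that were retransmissions."""
    return (t.sent_bytes - t.payload_bytes) / t.sent_bytes

wireless = Transfer(payload_bytes=1_000_000, sent_bytes=1_250_000, duration_s=4.0)
print(goodput_bps(wireless), energy_joules(wireless), overhead_ratio(wireless))
```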

Relevance: 100.00%

Abstract:

In this paper, we discuss the problem of maintenance of a CBR system for retrieval of rotationally symmetric shapes. The special feature of this system is that similarity is derived primarily from graph matching algorithms. The special problem of such a system is that it does not operate on search indices that may be derived from single cases and then used for visualisation and principal component analyses. Rather, the system is built on a similarity metric defined directly over pairs of cases. The problems of efficiency, consistency, redundancy, completeness and correctness are discussed for such a system. Performance measures for the CBR system are given, and the results for trials of the system are presented. The competence of the current case base is discussed, with reference to a representation of cases as points in an n-dimensional feature space, and a Gramian visualisation. A refinement of the case base is performed as a result of the competence analysis, and the performance of the case base before and after refinement is compared.
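As a sketch of the kind of machinery implied above, the code below collects a pairwise similarity metric into a Gramian, embeds the cases as points in a low-dimensional feature space (a classical-MDS-style eigendecomposition), and flags near-duplicate cases as redundancy candidates. The similarity values are random placeholders rather than real graph-matching scores.

```python
# A minimal sketch: from a pairwise similarity (Gram) matrix to an embedding of
# cases in feature space and a list of potentially redundant case pairs.
import numpy as np

def embed_from_gram(G: np.ndarray, dims: int = 2) -> np.ndarray:
    """Classical-MDS-style embedding: coordinates from the top eigenpairs of the Gramian."""
    G = 0.5 * (G + G.T)                              # enforce symmetry
    vals, vecs = np.linalg.eigh(G)
    order = np.argsort(vals)[::-1][:dims]
    vals = np.clip(vals[order], 0.0, None)           # drop negative eigenvalues
    return vecs[:, order] * np.sqrt(vals)

def redundancy_candidates(G: np.ndarray, threshold: float = 0.95):
    """Pairs of cases so similar that one may be redundant in the case base."""
    i, j = np.triu_indices_from(G, k=1)
    mask = G[i, j] >= threshold
    return list(zip(i[mask].tolist(), j[mask].tolist()))

rng = np.random.default_rng(2)
raw = rng.random((8, 8))
gram = (raw + raw.T) / 2
np.fill_diagonal(gram, 1.0)                          # each case is fully similar to itself
print(embed_from_gram(gram).shape, redundancy_candidates(gram, 0.9))
```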

Relevance: 100.00%

Abstract:

Software metrics are the key tool in software quality management. In this paper, we propose to use support vector machines for regression applied to software metrics to predict software quality. In experiments we compare this method with other regression techniques such as Multivariate Linear Regression, Conjunctive Rule and Locally Weighted Regression. Results on the MIS benchmark dataset, using mean absolute error and correlation coefficient as regression performance measures, indicate that support vector machine regression is a promising technique for software quality prediction. In addition, our investigation of PCA-based metrics extraction shows that using only the first few Principal Components (PCs) we can still get relatively good performance.
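A minimal sketch of the evaluation described above: support vector regression scored with mean absolute error and the correlation coefficient, with and without a PCA step. Synthetic data stands in for the MIS dataset, and the model settings are assumptions rather than the paper's configuration.

```python
# A minimal sketch of SVR for software quality prediction, with an optional PCA
# step for metrics extraction. Synthetic data replaces the MIS dataset.
import numpy as np
from sklearn.svm import SVR
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 11))                      # e.g. 11 code metrics per module
y = X[:, 0] * 2 + X[:, 1] + rng.normal(0, 0.5, 200) # e.g. number of reported faults

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# SVR on all metrics, and SVR on the first few principal components only
full = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
pca = make_pipeline(StandardScaler(), PCA(n_components=3), SVR(kernel="rbf", C=10.0))

for name, model in [("all metrics", full), ("3 PCs", pca)]:
    pred = model.fit(X_tr, y_tr).predict(X_te)
    mae = mean_absolute_error(y_te, pred)
    corr = np.corrcoef(y_te, pred)[0, 1]
    print(f"{name}: MAE={mae:.3f}, r={corr:.3f}")
```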

Relevance: 100.00%

Abstract:

Traditionally, when designing a ship the driving issues are seen to be powering, stability, strength and seakeeping. Issues related to ship operations and evolutions are investigated later in the design process, within the constraint of a fixed layout. This can result in operational inefficiencies and limitations, excessive crew numbers and potentially hazardous situations. This paper summarises work by University College London and the University of Greenwich prior to the completion of a three-year EPSRC-funded research project to integrate the simulation of personnel movement into early stage ship design. This integration is intended to facilitate the assessment of onboard operations while the design is still highly amenable to change. The project brings together the University of Greenwich developed maritimeEXODUS personnel movement simulation software and the SURFCON implementation of the Design Building Block approach to early stage ship design, which originated with the UCL Ship Design Research team and has been implemented within the PARAMARINE ship design system produced by Graphics Research Corporation. Central to the success of this project is the definition of a suitable series of Performance Measures (PMs) which can be used to assess the human performance of the design in different operational scenarios. The paper outlines the progress made on deriving the PMs from human dynamics criteria measured in simulations and their incorporation into a Human Performance Metric (HPM) for analysis. It describes the production of a series of SURFCON ship designs, based on the Royal Navy's Type 22 Batch 3 frigate, and their analysis using the PARAMARINE and maritimeEXODUS software. Conclusions on the work to date and for the remainder of the project are presented, addressing the integration of personnel movement simulation into the preliminary ship design process.
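To illustrate the idea of rolling individual PMs into an HPM, the sketch below normalises each measure and combines them with scenario weights; the measure names, bounds and weights are hypothetical and are not the project's actual definitions.

```python
# A minimal sketch of aggregating Performance Measures (PMs) from personnel
# movement simulations into a single Human Performance Metric (HPM).
# Names, bounds and weights below are illustrative assumptions.
def human_performance_metric(pms: dict, bounds: dict, weights: dict) -> float:
    """Weighted sum of PMs, each scaled to [0, 1] where 1 is best (shortest time, etc.)."""
    score = 0.0
    for name, value in pms.items():
        lo, hi = bounds[name]
        normalised = (hi - value) / (hi - lo)        # lower raw value = better
        score += weights[name] * max(0.0, min(1.0, normalised))
    return score / sum(weights.values())

# Example PMs for one evacuation scenario of a candidate layout
pms = {"evacuation_time_s": 420.0, "mean_congestion_s": 35.0, "total_distance_m": 5200.0}
bounds = {"evacuation_time_s": (180.0, 600.0), "mean_congestion_s": (0.0, 120.0),
          "total_distance_m": (2000.0, 8000.0)}
weights = {"evacuation_time_s": 0.5, "mean_congestion_s": 0.3, "total_distance_m": 0.2}
print(human_performance_metric(pms, bounds, weights))
```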

Relevance: 100.00%

Abstract:

Purpose – The aim of this paper is to explore the issues involved in developing and applying performance management approaches within a large UK public sector department using a multiple stakeholder perspective and an accompanying theoretical framework. Design/methodology/approach – An initial short questionnaire was used to determine perceptions about the implementation and effectiveness of the new performance management system across the organisation. In total, 700 questionnaires were distributed. Running concurrently with an ethnographic approach, and informed by the questionnaire responses, was a series of semi-structured interviews and focus groups. Findings – Staff at all levels had an understanding of the new system and perceived it as being beneficial. However, there were concerns that the approach was not continuously managed throughout the year and was in danger of becoming an annual event, rather than an ongoing process. Furthermore, the change process seemed to have advanced without corresponding changes to appraisal and reward and recognition systems. Thus, the business objectives were not aligned with motivating factors within the organisation. Research limitations/implications – Additional research to test the validity and usefulness of the theoretical model, as discussed in this paper, would be beneficial. Practical implications – The strategic integration of the stakeholder performance measures and scorecards was found to be essential to producing an overall stakeholder-driven strategy within the case study organisation. Originality/value – This paper discusses in detail the approach adopted and the progress made by one large UK public sector organisation, as it attempts to develop better relationships with all of its stakeholders and hence improve its performance. This paper provides a concerted attempt to link theory with practice.

Relevance: 100.00%

Abstract:

Logistic regression and Gaussian mixture model (GMM) classifiers have been trained to estimate the probability of acute myocardial infarction (AMI) in patients based upon the concentrations of a panel of cardiac markers. The panel consists of two new markers, fatty acid binding protein (FABP) and glycogen phosphorylase BB (GPBB), in addition to the traditional cardiac troponin I (cTnI), creatine kinase MB (CKMB) and myoglobin. The effect of using principal component analysis (PCA) and Fisher discriminant analysis (FDA) to preprocess the marker concentrations was also investigated. The need for classifiers to give an accurate estimate of the probability of AMI is argued and three categories of performance measure are described, namely discriminatory ability, sharpness, and reliability. Numerical performance measures for each category are given and applied. The optimum classifier, based solely upon the samples taken on admission, was the logistic regression classifier using FDA preprocessing. This gave an accuracy of 0.85 (95% confidence interval: 0.78-0.91) and a normalised Brier score of 0.89. When samples at both admission and a further time, 1-6 h later, were included, the performance increased significantly, showing that logistic regression classifiers can indeed use the information from the five cardiac markers to accurately and reliably estimate the probability of AMI. © Springer-Verlag London Limited 2008.
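A minimal sketch of the winning pipeline described above, Fisher (linear) discriminant analysis preprocessing followed by logistic regression, evaluated with accuracy and a Brier score; synthetic concentrations stand in for the real five-marker panel, so the numbers it prints are not the paper's results.

```python
# A minimal sketch: FDA (linear discriminant) preprocessing + logistic regression,
# scored with accuracy and the Brier score. Data are synthetic placeholders for
# the cardiac-marker panel (cTnI, CKMB, myoglobin, FABP, GPBB).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, brier_score_loss

rng = np.random.default_rng(4)
n = 400
y = rng.integers(0, 2, n)                                   # 1 = AMI, 0 = no AMI
X = rng.lognormal(mean=0.0, sigma=1.0, size=(n, 5))         # 5 marker concentrations
X[y == 1] *= 2.5                                            # AMI cases have raised markers

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = make_pipeline(LinearDiscriminantAnalysis(n_components=1), LogisticRegression())
prob = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]      # estimated probability of AMI

print("accuracy:", accuracy_score(y_te, prob > 0.5))
print("Brier score:", brier_score_loss(y_te, prob))         # probability reliability measure
```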

Relevance: 100.00%

Abstract:

Purpose – The purpose of this paper is to summarize the accumulated body of knowledge on the performance of new product projects and provide directions for further research. Design/methodology/approach – Using a refined classification of antecedents of new product project performance, the research results in the literature are meta-analyzed in order to identify the strength and stability of predictor-performance relationships. Findings – The results reveal that 22 variables have a significant relationship with new product project performance, of which only 12 have a sizable relationship. In order of importance, these factors are the degree of organizational interaction, R&D and marketing interface, general product development proficiency, product advantage, financial/business analysis, technical proficiency, management skill, marketing proficiency, market orientation, technology synergy, project manager competency and launch activities. Of the 34 variables, 16 predictors show potential for moderator effects. Research limitations/implications – The validity of the results is constrained by publication bias and heterogeneity of performance measures, and directions for the presentation of data in future empirical publications are provided. Practical implications – This study helps new product project managers in understanding and managing the performance of new product development projects. Originality/value – This paper provides unique insights into the importance of predictors of new product performance at the project level. Furthermore, it identifies which predictor-performance relations are contingent on other factors.
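For illustration, the sketch below shows a simple sample-size-weighted meta-analytic aggregation of a predictor-performance correlation, with the residual between-study variance that hints at moderator effects; the study correlations and sample sizes are invented, and the paper's actual meta-analytic procedure may differ.

```python
# A minimal sketch of aggregating one predictor-performance relationship across
# studies: a sample-size-weighted mean correlation plus a rough check of how much
# between-study variability remains after accounting for sampling error.
import math

studies = [(0.32, 120), (0.45, 80), (0.28, 200), (0.51, 60)]   # (correlation r, sample size n)

total_n = sum(n for _, n in studies)
mean_r = sum(r * n for r, n in studies) / total_n              # weighted mean correlation

observed_var = sum(n * (r - mean_r) ** 2 for r, n in studies) / total_n
sampling_var = sum(n * ((1 - mean_r**2) ** 2 / (n - 1)) for _, n in studies) / total_n
residual_var = max(0.0, observed_var - sampling_var)           # variance not explained by sampling error

print(f"weighted mean r = {mean_r:.3f}")
print(f"between-study SD = {math.sqrt(residual_var):.3f} (large values suggest moderators)")
```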

Relevance: 100.00%

Abstract:

The increasing importance placed upon regional development and the knowledge-based economy as economic growth stimuli has led to a changing role for universities and their interaction with the business community through (though not limited to) the transfer of technology from academia to industry. With the emergence of Local Enterprise Partnerships (LEPs) replacing the Regional Development Agencies (RDAs), there is a need for policy and practice going forward to be clearly informed by a critique of the TTO (Technology Transfer Office)–RDA stakeholder relationship in a lessons-learned approach, so that LEPs can benefit from a faster learning curve. Thus, the aim of this paper is to examine the stakeholder relationship between three regional universities, in the context of their TTOs, and the RDA, with a view to determining lessons learned for the emerging LEP approach. Although the issues raised are contextual, the abstracted stakeholder conceptualisation of the TTO–RDA relationship should enable wider generalisation of the issues raised beyond the UK. Stakeholder theory relationship and stage development models are used to guide a repeat-interview study of the TTO and RDA stakeholder groupings. The findings, interpreted using combined category- and stage-based stakeholder models, show how the longitudinal development of the TTO–RDA stakeholder relationship for each case has progressed through different stakeholder pathways, and stages where specific targeting of funding was dependent on the stakeholder stage. Greater targeted policy and funding, based on the stakeholder relationship approach, led to the development of joint mechanisms and a closer alignment of performance measures between the TTO and the RDA. However, over-reliance on the unitary nature of the TTO–RDA relationship may lead to a lack of cultivation of, and dependency for funding from, other stakeholders.

Relevance: 100.00%

Abstract:

This study undertakes a modeling-based performance assessment of all Irish credit unions between 2002 and 2010, a particularly turbulent period in their history. The analysis explicitly addresses the current challenges faced by credit unions in that the modeling approach used rewards credit unions for reducing undesirable outputs (impaired loans and investments) as well as for increasing desirable outputs (loans, earning assets and members' funds) and decreasing inputs (labour expenditure, capital expenditure and fund expenses). The main findings are: credit unions are subject to increasing returns to scale; technical regression occurred in the years after 2007; there is significant scope for an improvement in efficiency through expansion of desirable outputs and contraction of undesirable outputs and inputs; and larger credit unions that are better capitalised and pay a higher dividend to members are more efficient than their smaller, less capitalised and lower-dividend-paying counterparts.
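A minimal sketch of one way such a model can reward contraction of inputs and undesirable outputs alongside expansion of desirable outputs: a directional distance function solved as a linear program. The three-credit-union data set is invented and the formulation is illustrative, not the paper's model.

```python
# A minimal sketch of a directional distance function efficiency model with
# undesirable outputs, solved with a linear program. Data are invented.
import numpy as np
from scipy.optimize import linprog

X = np.array([[3.0, 2.0], [5.0, 4.0], [8.0, 6.0]])    # inputs: labour, capital expenditure
Y = np.array([[10.0, 4.0], [14.0, 6.0], [16.0, 7.0]]) # desirable outputs: loans, earning assets
B = np.array([[1.0], [2.5], [4.0]])                   # undesirable output: impaired loans

def inefficiency(o: int) -> float:
    """Max beta such that unit o can cut inputs/bad outputs and raise good outputs by beta*own level."""
    n = X.shape[0]
    c = np.zeros(n + 1); c[0] = -1.0                  # maximise beta (variables: [beta, lambdas])
    rows, rhs = [], []
    for i in range(X.shape[1]):                       # inputs: sum(l*X) + beta*x_o <= x_o
        rows.append(np.concatenate(([X[o, i]], X[:, i]))); rhs.append(X[o, i])
    for r in range(Y.shape[1]):                       # good outputs: sum(l*Y) - beta*y_o >= y_o
        rows.append(np.concatenate(([Y[o, r]], -Y[:, r]))); rhs.append(-Y[o, r])
    for k in range(B.shape[1]):                       # bad outputs: sum(l*B) + beta*b_o <= b_o
        rows.append(np.concatenate(([B[o, k]], B[:, k]))); rhs.append(B[o, k])
    res = linprog(c, A_ub=np.array(rows), b_ub=np.array(rhs), bounds=[(0, None)] * (n + 1))
    return -res.fun                                   # 0 = efficient; larger = more inefficient

for o in range(3):
    print(f"credit union {o}: inefficiency = {inefficiency(o):.3f}")
```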