964 results for Performance metrics


Relevance: 70.00%

Abstract:

Political drivers such as the Kyoto Protocol, the EU Energy Performance of Buildings Directive and the Energy End-use and Services Directive have been implemented in response to an identified need for a reduction in human-related CO2 emissions. Buildings account for a significant portion of global CO2 emissions, approximately 25-30%, and it is widely acknowledged by industry and research organisations that they operate inefficiently. In parallel, unsatisfactory indoor environmental conditions have proven to negatively impact occupant productivity. Legislative drivers and client education are seen as the key motivating factors for an improvement in the holistic environmental and energy performance of a building. A symbiotic relationship exists between building indoor environmental conditions and building energy consumption. However, traditional Building Management Systems and Energy Management Systems treat these separately. Conventional performance analysis compares building energy consumption with a previously recorded value or with the consumption of a similar building and does not recognise the fact that all buildings are unique. What is therefore required is a new framework which incorporates performance comparison against a theoretical, building-specific ideal benchmark. Traditionally, Energy Managers, who work at the operational level of organisations with respect to building performance, do not have access to ideal performance benchmark information and as a result cannot operate buildings optimally. This thesis systematically defines Holistic Environmental and Energy Management and specifies the Scenario Modelling Technique, which in turn uses an ideal performance benchmark. The holistic technique uses quantified expressions of building performance and by doing so enables the profiled Energy Manager to visualise his actions and their downstream consequences in the context of overall building operation. The Ideal Building Framework facilitates the use of this technique by acting as a Building Life Cycle (BLC) data repository through which ideal building performance benchmarks are systematically structured and stored in parallel with actual performance data. The Ideal Building Framework utilises transformed data in the form of the Ideal Set of Performance Objectives and Metrics, which are capable of defining the performance of any building at any stage of the BLC. It is proposed that the union of Scenario Models for an individual building would result in a building-specific Combination of Performance Metrics, which would in turn be stored in the BLC data repository. The Ideal Data Set underpins the Ideal Set of Performance Objectives and Metrics and is the set of measurements required to monitor the performance of the Ideal Building. A Model View describes the unique, building-specific data relevant to a particular project stakeholder. The energy management data and information exchange requirements that underlie a Model View implementation are detailed and incorporate both traditional and proposed energy management. This thesis also specifies the Model View Methodology, which complements the Ideal Building Framework. The developed Model View and Rule Set methodology utilises stakeholder-specific rule sets to define stakeholder-pertinent environmental and energy performance data. This generic process further enables each stakeholder to define the desired resolution of the data, for example basic, intermediate or detailed.
The Model View methodology is applicable to all project stakeholders, each requiring its own customised rule set. Two rule sets are defined in detail: the Energy Manager rule set and the LEED Accreditor rule set. This measurement generation process, accompanied by a defined View, would filter and expedite data access for all stakeholders involved in building performance. Information presentation is critical for effective use of the data provided by the Ideal Building Framework and the Energy Management View definition. The specifications for a customised Information Delivery Tool account for the established profile of Energy Managers and best-practice user interface design. Components of the developed tool could also be used by Facility Managers working at the tactical and strategic levels of organisations. Informed decision making is made possible through specified decision assistance processes which incorporate the Scenario Modelling and Benchmarking techniques, the Ideal Building Framework, the Energy Manager Model View, the Information Delivery Tool and the established profile of Energy Managers. The Model View and Rule Set Methodology is effectively demonstrated on an appropriate mixed-use existing 'green' building, the Environmental Research Institute at University College Cork, using the Energy Management and LEED rule sets. Informed decision making is also demonstrated using a prototype scenario for the demonstration building.
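
To make the rule-set filtering described above concrete, the following is a minimal sketch assuming a simple in-memory representation of metrics and stakeholder rule sets; the metric names, categories and resolution labels are hypothetical and are not taken from the thesis.

```python
# Hypothetical sketch of stakeholder-specific rule-set filtering (illustrative only).
from dataclasses import dataclass

RESOLUTIONS = ("basic", "intermediate", "detailed")

@dataclass
class Metric:
    name: str
    category: str       # e.g. "energy" or "environment"
    resolution: str     # coarsest resolution level at which the metric is reported

@dataclass
class RuleSet:
    stakeholder: str
    categories: set     # performance categories this stakeholder is interested in

METRICS = [
    Metric("annual_electricity_kwh", "energy", "basic"),
    Metric("monthly_gas_kwh", "energy", "intermediate"),
    Metric("daily_co2_ppm", "environment", "intermediate"),
    Metric("hourly_zone_temperature_c", "environment", "detailed"),
]

def apply_rule_set(rule_set: RuleSet, resolution: str) -> list:
    """Return the metrics visible to a stakeholder at the requested data resolution."""
    max_level = RESOLUTIONS.index(resolution)
    return [m for m in METRICS
            if m.category in rule_set.categories
            and RESOLUTIONS.index(m.resolution) <= max_level]

energy_manager = RuleSet("Energy Manager", {"energy", "environment"})
leed_accreditor = RuleSet("LEED Accreditor", {"energy"})

print([m.name for m in apply_rule_set(energy_manager, "intermediate")])
print([m.name for m in apply_rule_set(leed_accreditor, "basic")])
```

In a full implementation the rule sets and metric catalogue would be drawn from the Ideal Set of Performance Objectives and Metrics in the BLC data repository rather than hard-coded.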

Relevance: 70.00%

Abstract:

Data registration refers to a series of techniques for matching or bringing similar objects or datasets into alignment. These techniques enjoy widespread use in a wide variety of applications, such as video coding, tracking, object and face detection and recognition, surveillance and satellite imaging, medical image analysis and structure from motion. Registration methods are as numerous as their manifold uses, ranging from pixel-level and block- or feature-based methods to Fourier-domain methods.

This book focuses on providing algorithms and image and video techniques for registration, together with quality performance metrics. The authors provide various assessment metrics for measuring registration quality alongside analyses of registration techniques, introducing and explaining both familiar and state-of-the-art registration methodologies used in a variety of targeted applications.

Key features:
- Provides a state-of-the-art review of image and video registration techniques, allowing readers to develop an understanding of how well the techniques perform by using specific quality assessment criteria
- Addresses a range of applications from familiar image and video processing domains to satellite and medical imaging among others, enabling readers to discover novel methodologies with utility in their own research
- Discusses quality evaluation metrics for each application domain with an interdisciplinary approach from different research perspectives
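
As a hedged illustration of the kind of intensity-based quality measure surveyed in such a book, the sketch below computes mean squared error and normalized cross-correlation between a reference image and a registered image; the function names and the synthetic test images are assumptions for illustration only.

```python
# Minimal sketch of intensity-based registration quality metrics (illustrative only).
import numpy as np

def mse(reference: np.ndarray, registered: np.ndarray) -> float:
    """Mean squared error between a reference and a registered image."""
    return float(np.mean((reference.astype(float) - registered.astype(float)) ** 2))

def ncc(reference: np.ndarray, registered: np.ndarray) -> float:
    """Normalized cross-correlation; 1.0 indicates perfect linear agreement."""
    r = reference.astype(float).ravel()
    g = registered.astype(float).ravel()
    r -= r.mean()
    g -= g.mean()
    return float(np.dot(r, g) / (np.linalg.norm(r) * np.linalg.norm(g)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.random((64, 64))
    misregistered = np.roll(ref, shift=1, axis=1)   # crude stand-in for a residual misalignment
    print(f"MSE: {mse(ref, misregistered):.4f}")
    print(f"NCC: {ncc(ref, misregistered):.4f}")
```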

Relevance: 70.00%

Abstract:

We present a rigorous methodology and new metrics for fair comparison of server and microserver platforms. Deploying our methodology and metrics, we compare a microserver with ARM cores against two servers with x86 cores running the same real-time financial analytics workload. We define workload-specific but platform-independent performance metrics for platform comparison, targeting both datacenter operators and end users. Our methodology establishes that a server based on the Xeon Phi co-processor delivers the highest performance and energy efficiency. However, by scaling out energy-efficient microservers, we achieve competitive or better energy efficiency than a power-equivalent server with two Sandy Bridge sockets, despite the microserver's slower cores. Using a new iso-QoS metric, we find that the ARM microserver scales enough to meet market throughput demand, that is, a 100% QoS in terms of timely option pricing, with as little as 55% of the energy consumed by the Sandy Bridge server.
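
A minimal sketch of how such workload-specific, platform-independent metrics might be computed is shown below; the platform figures, field names and QoS threshold are placeholders, not measurements from the study.

```python
# Illustrative platform comparison using energy-per-option and an iso-QoS ratio.
# All figures are placeholders, not results reported in the paper.
from dataclasses import dataclass

@dataclass
class Platform:
    name: str
    options_per_sec: float   # sustained option-pricing throughput
    avg_power_w: float       # average power draw while sustaining that throughput
    qos_met: float           # fraction of options priced within the deadline

    def energy_per_option_j(self) -> float:
        """Energy efficiency expressed as joules per option priced."""
        return self.avg_power_w / self.options_per_sec

def iso_qos_energy_ratio(a: Platform, b: Platform, qos_target: float = 1.0) -> float:
    """Ratio of a's to b's energy per option; valid only if both meet the QoS target."""
    if a.qos_met < qos_target or b.qos_met < qos_target:
        raise ValueError("both platforms must meet the QoS target for an iso-QoS comparison")
    return a.energy_per_option_j() / b.energy_per_option_j()

arm_microserver = Platform("ARM microserver (scaled out)", 9000.0, 180.0, 1.0)
two_socket_server = Platform("Two-socket x86 server", 12000.0, 430.0, 1.0)

print(f"energy ratio (microserver / server): "
      f"{iso_qos_energy_ratio(arm_microserver, two_socket_server):.2f}")
```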

Relevance: 70.00%

Abstract:

In the last decade, mobile wireless communications have witnessed explosive growth in user penetration rates and widespread deployment around the globe. In particular, a research topic of particular relevance in telecommunications today is the design and implementation of fourth-generation (4G) mobile communication systems. 4G networks will be characterized by the support of multiple radio access technologies in a core network fully compliant with the Internet Protocol (the all-IP paradigm). Such networks will sustain the stringent quality of service (QoS) requirements and the high data rates expected from the type of multimedia applications (e.g. YouTube and Skype) to be available in the near future. Therefore, 4G wireless communication systems will be of paramount importance to the development of the information society. As 4G wireless services continue to increase, they will put more and more pressure on spectrum availability. There is worldwide recognition that methods of spectrum management have reached their limit and are no longer optimal, and therefore new paradigms must be sought. Studies show that most of the assigned spectrum is under-utilized; thus the problem in most cases is inefficient spectrum management rather than spectrum shortage. There are currently trends towards a more liberalized approach to spectrum management, which are tightly linked to what is commonly termed Cognitive Radio (CR). Furthermore, conventional deployments of 4G wireless systems (one base station (BS) per cell, with mobiles deployed around it) are known to have problems in providing fairness (users closer to the BS benefit more than cell-edge users) and in covering some zones affected by shadowing, so the use of relays has been proposed as a solution. To evaluate and analyse the performance of 4G wireless systems, software tools are normally used. These tools have become more and more mature in recent years, and the need for them to provide a high-level evaluation of proposed algorithms and protocols is now more important. System level simulation (SLS) tools provide a fundamental and flexible way to test all the envisioned algorithms and protocols under realistic conditions, without the need to deal with the problems of live networks or reduced-scope prototypes. Furthermore, the tools allow network designers to rapidly collect a wide range of performance metrics that are useful for the analysis and optimization of different algorithms. This dissertation proposes the design and implementation of a conventional system level simulator (SLS), which is afterwards enhanced for the 4G wireless technologies, namely cognitive radios (IEEE 802.22) and relays (IEEE 802.16j). The SLS is then used for the analysis of the proposed algorithms and protocols.
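
As a hedged illustration of what a system level simulator's inner loop looks like, the sketch below simulates a single cell with a round-robin scheduler and collects mean and cell-edge throughput; the channel model, scheduler and parameter values are simplified placeholders rather than the simulator developed in the dissertation.

```python
# Minimal system-level simulation loop collecting per-user throughput metrics.
# Channel model, scheduler and parameters are illustrative placeholders only.
import math
import random
import statistics

random.seed(42)

NUM_USERS = 10
NUM_TTIS = 1000              # number of transmission time intervals simulated
TTI_S = 1e-3                 # TTI duration in seconds
BANDWIDTH_HZ = 10e6

def snr_db(user: int) -> float:
    """Crude per-TTI SNR draw; a stand-in for a path-loss and fading model."""
    return random.gauss(mu=5 + user, sigma=3)    # higher-index users sit closer to the BS

def rate_bps(snr_db_value: float) -> float:
    """Achievable rate from the Shannon bound (an optimistic approximation)."""
    return BANDWIDTH_HZ * math.log2(1 + 10 ** (snr_db_value / 10))

throughput_bits = [0.0] * NUM_USERS
for tti in range(NUM_TTIS):
    scheduled = tti % NUM_USERS                  # simple round-robin scheduler
    throughput_bits[scheduled] += rate_bps(snr_db(scheduled)) * TTI_S

per_user_mbps = [b / (NUM_TTIS * TTI_S) / 1e6 for b in throughput_bits]
print(f"mean user throughput:   {statistics.mean(per_user_mbps):.1f} Mbit/s")
print(f"cell-edge (worst) user: {min(per_user_mbps):.1f} Mbit/s")   # simple fairness indicator
```

Swapping the round-robin line for a proportional-fair or relay-aware scheduler is the kind of change such a simulator is built to evaluate.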

Relevance: 70.00%

Abstract:

Space weather effects on technological systems originate with energy carried from the Sun to the terrestrial environment by the solar wind. In this study, we present results of modeling of solar corona-heliosphere processes to predict solar wind conditions at the L1 Lagrangian point upstream of Earth. In particular, we calculate performance metrics for (1) empirical, (2) hybrid empirical/physics-based, and (3) full physics-based coupled corona-heliosphere models over an 8-year period (1995–2002). L1 measurements of the radial solar wind speed are the primary basis for validation of the coronal and heliosphere models studied, though other solar wind parameters are also considered. The models are from the Center for Integrated Space-Weather Modeling (CISM), which has developed a coupled model of the whole Sun-to-Earth system, from the solar photosphere to the terrestrial thermosphere. Simple point-by-point analysis techniques, such as mean square error and correlation coefficients, indicate that the empirical coronal-heliosphere model currently gives the best forecast of solar wind speed at 1 AU. A more detailed analysis shows that errors in the physics-based models are predominantly the result of small timing offsets to solar wind structures and that the large-scale features of the solar wind are actually well modeled. We suggest that additional “tuning” of the coupling between the coronal and heliosphere models could lead to a significant improvement in their accuracy. Furthermore, we note that the physics-based models accurately capture dynamic effects at solar wind stream interaction regions, such as magnetic field compression, flow deflection, and density buildup, which the empirical scheme cannot.
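
A minimal sketch of the point-by-point skill metrics mentioned above is given below, using synthetic time series in place of model output and L1 observations; it also illustrates how a small timing offset degrades these metrics even when the large-scale structure is well reproduced.

```python
# Point-by-point skill metrics for a modeled vs. observed solar wind speed series.
# Both series below are synthetic placeholders, not CISM output or L1 data.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0, 27, 0.25)                                   # days, roughly one solar rotation
observed = 400 + 150 * np.maximum(0, np.sin(2 * np.pi * t / 27)) + rng.normal(0, 20, t.size)
modeled = np.roll(observed, 4) + rng.normal(0, 30, t.size)   # same structure, 1-day timing offset

rmse = np.sqrt(np.mean((modeled - observed) ** 2))
corr = np.corrcoef(modeled, observed)[0, 1]

print(f"RMSE:        {rmse:.1f} km/s")
print(f"correlation: {corr:.2f}")
```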

Relevance: 70.00%

Abstract:

The aim of this paper is to investigate the relationship-based factors that affect the performance of general building projects in China. Thirteen performance metrics that may be used to measure the success level of construction projects are defined and categorized into four groups, namely cost, schedule, quality and relationship performance. Fourteen risks inherent in relationships and 16 tools expected to facilitate relationship building that may affect project success are identified. Data on different projects were collected in China via a self-administered postal survey. Multiple linear regression (MLR) models are developed to help explain the variance in the different performance metrics. It has been found that ten risks and nine tools have either a positive or a negative influence on project performance, to different extents and in different stages of the project development process. Detailed explanations are given, especially for those variables bearing unexpected signs. It is recommended that firms in the Chinese construction industry manage the relationship-based factors that are significant in the MLR models so as to achieve project success.
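
A hedged sketch of the kind of multiple linear regression used here is shown below; the predictor counts follow the paper (14 risks, 16 tools) but the data are random placeholders, so the coefficients carry no substantive meaning.

```python
# Illustrative MLR of a project performance metric on risk and tool indicators.
# Data are random placeholders; only the mechanics of the fit are shown.
import numpy as np

rng = np.random.default_rng(7)
n_projects, n_risks, n_tools = 120, 14, 16

risks = rng.normal(size=(n_projects, n_risks))    # e.g. standardized risk ratings
tools = rng.normal(size=(n_projects, n_tools))    # e.g. extent of use of relationship tools
X = np.hstack([np.ones((n_projects, 1)), risks, tools])   # intercept + predictors

# Synthetic "cost performance" response: some risks hurt, some tools help, plus noise.
true_beta = np.concatenate([[1.0], rng.normal(0, 0.5, n_risks + n_tools)])
y = X @ true_beta + rng.normal(0, 1.0, n_projects)

beta_hat, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta_hat
r_squared = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

print(f"R^2 of the fitted model: {r_squared:.2f}")
print(f"coefficient of first risk indicator: {beta_hat[1]:+.2f}")
```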

Relevance: 70.00%

Abstract:

The aim of this paper is to investigate the relationship-based factors that affect the performance of general building projects in China. Eight performance metrics that may be used to measure the success level of construction projects are defined and categorized into two groups, namely 'hard' and 'soft' performance. Eight indicators of risks inherent in relationships and seven indicators of tools expected to facilitate relationship building that may affect project success are identified. Data on different projects were collected in China via a self-administered postal survey. Using structural equation modelling techniques, a structural model is developed to help explain the relationships among the different variables. It has been found that relational risk has a negative influence on project performance. It is recommended that firms in the Chinese construction industry manage the relationship-based factors that are significant in the model so as to achieve project success.

Relevance: 70.00%

Abstract:

The image fusion process merges two images into a single, more informative image. Objective image fusion performance metrics rely primarily on measuring the amount of information transferred from each source image into the fused image. Objective image fusion metrics have evolved from image processing dissimilarity metrics. Additionally, researchers have developed many additions to image dissimilarity metrics in order to better account for the locally fusion-worthy features in source images. This paper studies the evolution of objective image fusion performance metrics and their subjective and objective validation. It describes how a fusion performance metric evolves, starting from image dissimilarity metrics, through its realization in image fusion contexts and its localized weighting factors, to the validation process.
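
One widely used family of objective fusion metrics is based on mutual information between each source image and the fused image; the hedged sketch below implements that idea as an illustration, not as the specific metrics surveyed in the paper.

```python
# Simple objective fusion metric: sum of the mutual information between each
# source image and the fused image (higher means more information transferred).
import numpy as np

def mutual_information(a: np.ndarray, b: np.ndarray, bins: int = 64) -> float:
    """Mutual information estimated from a joint grey-level histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nonzero = pxy > 0
    return float(np.sum(pxy[nonzero] * np.log2(pxy[nonzero] / (px @ py)[nonzero])))

def fusion_mi(source_a: np.ndarray, source_b: np.ndarray, fused: np.ndarray) -> float:
    """Total information transferred from both sources into the fused image."""
    return mutual_information(source_a, fused) + mutual_information(source_b, fused)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    a = rng.random((128, 128))
    b = rng.random((128, 128))
    fused = 0.5 * (a + b)                    # naive average fusion as a baseline
    print(f"MI-based fusion score: {fusion_mi(a, b, fused):.3f}")
```

Localized variants of such metrics typically weight per-window scores by a saliency measure of each source, which is the kind of addition the paper traces.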

Relevance: 70.00%

Abstract:

Phyllosoma of the tropical spiny rock lobster, Panulirus ornatus, possess a rudimentary digestive system with a limited capacity to digest large protein molecules. As such, to foster the successful aquaculture of this species, research into dietary requirements should focus on feed ingredients aligned with this digestive capacity. Thus, the aim of the present study was to assess the effects of two protein pre-digestion treatments, acid denaturation and enzyme hydrolysis, on a regular fishmeal ingredient in a novel formulated diet for early-mid stage P. ornatus phyllosoma (Stages III-VIII). Three iso-nitrogenous, iso-lipidic and iso-energetic diets were formulated with 100% of protein originating from intact fishmeal (IFM), acid-denatured fishmeal (DFM) or enzyme-hydrolysed fishmeal (HFM) and fed to early-mid stage phyllosoma for a period of 35 days. Growth performance metrics were all significantly higher in phyllosoma receiving the HFM treatment compared to the DFM and IFM treatments. Phyllosoma fed the HFM diet also had the most advanced development stages, with a significantly greater proportion of individuals reaching Stage VII (2). No significant differences were detectable in either the protein-bound or free amino acid (FAA) composition of phyllosoma across all treatments, suggesting that the superior growth performance of the HFM-fed phyllosoma was the result of an increased abundance of intermediate, shorter-chain dietary peptides. The present study suggests that enzyme-hydrolysed fishmeal is a superior protein ingredient for artificial diets and most closely resembles the requisite dietary protein format for P. ornatus phyllosoma.

Relevance: 70.00%

Abstract:

Strategic control is defined as the use of qualitative and quantitative tools for the evaluation of strategic organizational performance. Most research in strategic planning has focused on strategy formulation and implementation, but little work has been done on strategic performance evaluation, particularly in the area of cancer research. The objective of this study was to identify strategic control approaches and financial performance metrics used by major cancer centers in the country as an initial step in expanding the theory and practice behind strategic organizational performance. Focusing on hospitals which share a similar mandate and resource constraints was expected to improve measurement precision. The results indicate that most cancer centers use a wide selection of evaluation tools, but sophisticated analytical approaches were less common. In addition, there was evidence that high-performing centers tend to invest a larger degree of resources in the area of strategic performance analysis than centers showing lower financial results. The conclusions point to the need for incorporating a higher degree of analytical power in order to improve the tracking of strategic performance. This study is one of the first to concentrate on the area of strategic control.

Relevance: 70.00%

Abstract:

Characterization of the Count Rate Performance and Evaluation of the Effects of High Count Rates on Modern Gamma Cameras. Michael Stephen Silosky, B.S. Supervisory Professor: S. Cheenu Kappadath, Ph.D.

Evaluation of count rate performance (CRP) is an integral component of gamma camera quality assurance, and measurement of system dead time (τ) is important for quantitative SPECT. The CRP of three modern gamma cameras was characterized using established methods (Decay and Dual Source) under a variety of experimental conditions. For the Decay method, input count rate was plotted against observed count rate and fit to the paralyzable detector model (PDM) to estimate τ (Rates method). A novel expression for observed counts as a function of measurement time interval was derived, and the observed counts were fit to this expression to estimate τ (Counts method). Correlation and Bland-Altman analyses were performed to assess agreement in estimates of τ between methods. The dependencies of τ on energy window definition and incident energy spectrum were characterized. The Dual Source method was also used to estimate τ; its agreement with the Decay method under identical conditions and the effects of total activity and the ratio of source activities were investigated. Additionally, the effects of count rate on several performance metrics were evaluated. The CRP curves for each system agreed with the PDM at low count rates but deviated substantially at high count rates. Estimates of τ for the paralyzable portion of the CRP curves using the Rates and Counts methods were highly correlated (r=0.999) but showed a small (~6%) difference. No significant difference was observed between the highly correlated estimates of τ obtained with the Decay and Dual Source methods under identical experimental conditions (r=0.996). Estimates of τ increased as a power-law function with decreasing ratio of counts in the photopeak to total counts and linearly with decreasing spectral effective energy. Dual Source method estimates of τ varied quadratically with the ratio of single-source to combined-source activities and linearly with the total activity used across a large range. Image uniformity, spatial resolution and energy resolution degraded linearly with count rate, and image-distorting effects were observed. Guidelines for CRP testing and a possible method for the correction of count rate losses in clinical images have been proposed.
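
As a hedged illustration of the Rates-method style of dead-time estimation described above, the sketch below fits synthetic data to the paralyzable detector model R_obs = R_in * exp(-R_in * τ); the count rates and the true τ are placeholders, not measurements from this work.

```python
# Estimating dead time tau by fitting observed count rates to the paralyzable model.
# Synthetic data only; values are illustrative placeholders.
import numpy as np
from scipy.optimize import curve_fit

def paralyzable(r_in, tau):
    """Observed rate predicted by the paralyzable detector model."""
    return r_in * np.exp(-r_in * tau)

rng = np.random.default_rng(5)
true_tau = 0.8e-6                                  # seconds (illustrative)
r_in = np.linspace(1e4, 6e5, 40)                   # input count rates (counts/s)
r_obs = paralyzable(r_in, true_tau) * rng.normal(1.0, 0.01, r_in.size)   # 1% noise

popt, pcov = curve_fit(paralyzable, r_in, r_obs, p0=[1e-6])
print(f"estimated tau: {popt[0] * 1e6:.2f} microseconds")
```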

Relevance: 70.00%

Abstract:

In this paper, a computer-based tool is developed to analyze student performance along a given curriculum. The proposed software makes use of historical data to compute passing/failing probabilities and simulates future student academic performance using stochastic simulation methods (Monte Carlo), in accordance with the specific university regulations. This makes it possible to compute the academic performance rates for the specific subjects of the curriculum for each semester, as well as the overall rates for the set of subjects in the semester, namely the efficiency rate and the success rate. Additionally, we compute the rates for the Bachelor's degree: the graduation rate, measured as the percentage of students who finish as scheduled or taking one extra year, and the efficiency rate, measured as the ratio of the credits in the curriculum to the credits actually taken. In Spain, these metrics have been defined by the National Quality Evaluation and Accreditation Agency (ANECA). Moreover, the sensitivity of the performance metrics to some of the parameters of the simulator is analyzed using statistical tools (Design of Experiments). The simulator has been adapted to the curriculum characteristics of the Bachelor in Engineering Technologies at the Technical University of Madrid (UPM).
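
A minimal Monte Carlo sketch of the kind of student-progression simulation described above is shown below; the two-semester curriculum, passing probabilities and retake rules are hypothetical placeholders, not the UPM curriculum or the ANECA definitions.

```python
# Monte Carlo simulation of student progression from per-subject passing probabilities.
# Curriculum, probabilities and retake rules are illustrative placeholders only.
import random

random.seed(11)

# pass probability per attempt, grouped by the semester in which the subject is first offered
CURRICULUM = {
    1: {"Calculus I": 0.70, "Physics I": 0.65, "Programming": 0.80},
    2: {"Calculus II": 0.68, "Physics II": 0.66, "Data Structures": 0.75},
}
SCHEDULED_SEMESTERS = 2
MAX_SEMESTERS = 4            # scheduled duration plus an allowed extra year

def simulate_student() -> int:
    """Semesters needed to pass every subject, or MAX_SEMESTERS + 1 if never finished."""
    pending = [(sem, subj, p) for sem, subjects in CURRICULUM.items()
               for subj, p in subjects.items()]
    for semester in range(1, MAX_SEMESTERS + 1):
        still_pending = []
        for nominal_sem, subj, p in pending:
            not_yet_offered = nominal_sem > semester
            if not_yet_offered or random.random() > p:   # failed attempts are retaken later
                still_pending.append((nominal_sem, subj, p))
        pending = still_pending
        if not pending:
            return semester
    return MAX_SEMESTERS + 1

N = 10_000
finish_times = [simulate_student() for _ in range(N)]
on_schedule = sum(t <= SCHEDULED_SEMESTERS for t in finish_times) / N
graduation_rate = sum(t <= MAX_SEMESTERS for t in finish_times) / N

print(f"finish on schedule:            {on_schedule:.1%}")
print(f"graduate within an extra year: {graduation_rate:.1%}")
```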

Relevance: 70.00%

Abstract:

The amplification of demand variation up a supply chain, widely termed 'the Bullwhip Effect', is disruptive, costly and something that supply chain management generally seeks to minimise. It was originally attributed to poor system design: deficiencies in policies and organisation structure, together with delays in material and information flow, all lead to sub-optimal reorder point calculation. It has since been attributed to exogenous random factors such as uncertainties in demand, supply and distribution lead time, but these causes are not exclusive, as subsequent academic and operational studies have shown that orders and/or inventories can exhibit significant variability even if customer demand and lead time are deterministic. This increase in the range of possible causes of dynamic behaviour indicates that our understanding of the phenomenon is far from complete. One possible, yet previously unexplored, factor that may influence dynamic behaviour in supply chains is the application and operation of supply chain performance measures. Organisations monitoring and responding to their adopted key performance metrics will make operational changes, and this action may influence the level of dynamics within the supply chain, possibly degrading the performance of the very system they were intended to measure. In order to explore this, a plausible abstraction of the operational responses to the Supply Chain Council's SCOR® (Supply Chain Operations Reference) model was incorporated into a classic Beer Game distribution representation, using the dynamic discrete event simulation software Simul8. During the simulation, the five SCOR Supply Chain Performance Attributes (Reliability, Responsiveness, Flexibility, Cost and Utilisation) were continuously monitored and compared to established targets. Operational adjustments to the reorder point, transportation modes and production capacity (where appropriate) were made for three independent supply chain roles, and the degree of dynamic behaviour in the supply chain was measured using the ratio of the standard deviation of upstream demand to the standard deviation of downstream demand. Factors employed to build the detailed model include variable retail demand, order transmission, transportation delays, production delays, capacity constraints, demand multipliers and demand averaging periods. Five dimensions of supply chain performance were monitored independently in three autonomous supply chain roles, and operational settings were adjusted accordingly. The uniqueness of this research stems from the application of the five SCOR performance attributes with modelled operational responses in a dynamic discrete event simulation model. This project makes its primary contribution to knowledge by measuring the impact, on supply chain dynamics, of applying a representative performance measurement system.
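
The bullwhip measure described above is straightforward to compute; the hedged sketch below generates orders with a simple single-echelon order-up-to policy and reports the ratio of the standard deviation of upstream orders to that of downstream demand. The policy, forecast window and parameters are illustrative placeholders, not the Simul8 Beer Game model used in the project.

```python
# Bullwhip ratio: std of orders placed upstream over std of downstream demand.
# Single-echelon order-up-to policy with a moving-average forecast (illustrative only).
import random
import statistics

random.seed(2024)

LEAD_TIME = 2
PERIODS = 500
demand = [random.gauss(100, 10) for _ in range(PERIODS)]      # variable retail demand

inventory = 300.0
pipeline = [0.0] * LEAD_TIME       # orders already placed but not yet received
orders, window = [], []

for d in demand:
    inventory += pipeline.pop(0)                              # receive the oldest shipment
    inventory -= d                                            # serve customer demand
    window = (window + [d])[-5:]                              # 5-period moving-average forecast
    forecast = sum(window) / len(window)
    target = forecast * (LEAD_TIME + 1) + 2 * 10              # order-up-to level plus safety stock
    order = max(0.0, target - inventory - sum(pipeline))
    pipeline.append(order)
    orders.append(order)

bullwhip = statistics.stdev(orders) / statistics.stdev(demand)
print(f"bullwhip ratio (orders vs demand): {bullwhip:.2f}")   # values above 1 indicate amplification
```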

Relevance: 70.00%

Abstract:

The extent to which animal migrations shape parasite transmission networks is critically dependent on a migrant's ability to tolerate infection and migrate successfully. Yet sub-lethal effects of parasites can be intensified through periods of increased physiological stress. Long-distance migrants may, therefore, be especially susceptible to negative effects of parasitic infection. Although a handful of studies have investigated the short-term, transmission-relevant behaviors of wild birds infected with low-pathogenic avian influenza viruses (LPAIV), the ecological consequences of LPAIV for the hosts themselves remain largely unknown. Here, we assessed the potential effects of naturally acquired LPAIV infections in Bewick's swans, a long-distance migratory species that experiences a relatively low incidence of LPAIV infection during early winter. We monitored both foraging and movement behavior in the winter of infection, as well as subsequent breeding behavior and inter-annual resighting probability over 3 years. Incorporating data on infection history, we hypothesized that any effects would be most apparent in naïve individuals experiencing their first LPAIV infection. Indeed, significant effects of infection were only seen in birds that were infected but lacked antibodies indicative of prior infection. Swans that were infected but had survived a previous infection were indistinguishable from uninfected birds in each of the ecological performance metrics. Despite showing reduced foraging rates, individuals in the naïve-infected category had accumulated body stores similar to those of re-infected and uninfected individuals prior to departure on spring migration, possibly as a result of having higher scaled mass at the time of infection. And yet individuals in the naïve-infected category were unlikely to be resighted one year after infection: 6 out of 7 such individuals were never resighted again, compared with 20 out of 63 uninfected individuals and 5 out of 12 individuals in the re-infected category. Collectively, our findings indicate that acute and superficially harmless infection with LPAIV may have indirect effects on individual performance and recruitment in migratory Bewick's swans. Our results also highlight the potential for infection history to play an important role in shaping ecological constraints throughout the annual cycle.