820 results for Performance Based Assessment
Abstract:
Discovering the function of an unknown protein, particularly one with neither structural nor functional correlates, is a daunting task. Interaction analyses determine binding partners, whereas DNA transfection, either transient or stable, leads to intracellular expression, though not necessarily at physiologically relevant levels. In theory, direct intracellular protein delivery (protein transduction) provides a conceptually simpler alternative, but in practice the approach is problematic. Domains such as HIV TAT protein are valuable, but their effectiveness is protein specific. Similarly, the delivery of intact proteins via endocytic pathways (e.g. using liposomes) is problematic for functional analysis because of the potential for protein degradation in the endosomes/lysosomes. Consequently, recent reports that microspheres can deliver bio-cargoes into cells via a non-endocytic, energy-independent pathway offer an exciting and promising alternative for in vitro delivery of functional protein. In order for such promise to be fully exploited, microspheres are required that (i) are stably linked to proteins, (ii) can deliver those proteins with good efficiency, (iii) release functional protein once inside the cells, and (iv) permit concomitant tracking. Herein, we report the application of microspheres to successfully address all of these criteria simultaneously, for the first time. After cellular uptake, protein release was autocatalyzed by the reducing cytoplasmic environment. Outside of cells, the covalent microsphere-protein linkage was stable for ≥90 h at 37°C. Using conservative methods of estimation, 74.3% ± 5.6% of cells were shown to take up these microspheres after 24 h of incubation, with the whole process of delivery and intracellular protein release occurring within 36 h. Intended for in vitro functional protein research, this approach will enable study of the consequences of protein delivery at physiologically relevant levels, without recourse to nucleic acids, and offers a useful alternative to commercial protein transfection reagents such as Chariot™. We also provide clear immunostaining evidence to resolve residual controversy surrounding FACS-based assessment of microsphere uptake. © 2014 by The American Society for Biochemistry and Molecular Biology Inc.
Abstract:
This theoretical study shows the technical feasibility of self-powered geothermal desalination of groundwater sources at <100 °C. A general method and framework are developed and then applied to specific case studies. First, the analysis considers an ideal limit to performance based on exergy analysis using generalised idealised assumptions. This thermodynamic limit applies to any type of process technology. Then, the analysis focuses specifically on the Organic Rankine Cycle (ORC) driving Reverse Osmosis (RO), as these are among the most mature and efficient applicable technologies. Important dimensionless parameters are calculated for the ideal case of the self-powered arrangement and semi-ideal case where only essential losses dependent on the RO system configuration are considered. These parameters are used to compare the performance of desalination systems using ORC-RO under ideal, semi-ideal and real assumptions for four case studies relating to geothermal sources located in India, Saudi Arabia, Tunisia and Turkey. The overall system recovery ratio (the key performance measure for the self-powered process) depends strongly on the geothermal source temperature. It can be as high as 91.5% for a hot spring emerging at 96 °C with a salinity of 1830 mg/kg.
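To make the exergy-limited ("ideal") case concrete, the sketch below compares the thermal exergy released when the 96 °C spring water is cooled to ambient with the least work needed to desalinate it at the reported 91.5% recovery. The dead-state temperature, the treatment of the salts as NaCl in an ideal dilute solution, and the least-work expression pi_f·ln(1/(1−r)) per m³ of feed are simplifying assumptions for illustration, not the generalised framework or case-study data of the study itself.

```python
# Hedged sketch: exergy budget for a self-powered geothermal RO scheme.
# Simplifying assumptions: the spring water is cooled reversibly to ambient,
# all of it then serves as RO feed, and the salts behave as NaCl in an ideal
# dilute solution.
import numpy as np

T0 = 298.15               # dead-state (ambient) temperature, K -- assumed
T_src = 96.0 + 273.15     # hot-spring temperature, K (case from the abstract)
cp, rho = 4180.0, 1000.0  # water heat capacity, J/(kg K), and density, kg/m^3
salinity = 1830e-6        # salt mass fraction (1830 mg/kg)
R, M_NaCl, i_vh = 8.314, 58.44e-3, 2  # gas constant, molar mass, van 't Hoff factor

# Specific flow exergy of the hot water relative to the dead state (J/kg)
e_thermal = cp * ((T_src - T0) - T0 * np.log(T_src / T0))

# Feed osmotic pressure from the van 't Hoff relation (Pa)
c_feed = salinity * rho / M_NaCl   # mol of salt per m^3 of feed
pi_f = i_vh * c_feed * R * T0

r = 0.915                                     # recovery ratio reported for this spring
w_available = rho * e_thermal                 # exergy, J per m^3 of source water
w_required = pi_f * np.log(1.0 / (1.0 - r))   # least separation work, J per m^3 of feed

print(f"thermal exergy available : {w_available / 1e6:6.2f} MJ/m^3")
print(f"least separation work    : {w_required / 1e6:6.2f} MJ/m^3 at r = {r}")
print(f"ideal exergy margin      : {w_available / w_required:4.0f}x")
```

Under these assumptions the thermal exergy exceeds the least separation work by roughly two orders of magnitude, which is why a self-powered arrangement is thermodynamically plausible even before ORC and RO losses are introduced.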
Abstract:
Agency costs are said to arise as a result of the separation of ownership from control inherent in the corporate form of ownership. One such agency problem concerns the potential variance between the time horizons of principal shareholders and agent managers. Agency theory suggests that these costs can be alleviated or controlled through performance-based Chief Executive Officer (CEO) contracting. However, components of a CEO's compensation contract can exacerbate or mitigate agency-related problems (Antle and Smith, 1985). According to the horizon hypothesis, a self-serving CEO reduces discretionary research and development (R&D) expenditures to increase earnings and earnings-based bonus compensation. Agency theorists contend that a CEO's market-based compensation component can mitigate horizon problems. This study seeks to determine whether there is a relationship between CEO earnings- and market-based compensation components and R&D expenditures in the largest United States industrial firms from 1987 to 1993. Consistent with the horizon hypothesis, results provide evidence of a negative and statistically significant relationship between CEO cash compensation (i.e., salary and bonus) and the firm's R&D expenditures. Consistent with the expectations of agency theory, results provide evidence of a positive and statistically significant relationship between market-based CEO compensation and R&D. Further results of this study provide evidence of a positive and statistically significant relationship between CEO tenure and the firm's R&D expenditures. Although there is a negative relationship between CEO age and the firm's R&D, it was not statistically significant at the 0.05 level.
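As a sketch of the kind of test the horizon hypothesis implies, the snippet below regresses R&D intensity on the earnings-based (cash) and market-based components of CEO pay, plus tenure and age. The variable names, controls, and synthetic panel are illustrative assumptions only; they are not the study's data or exact specification.

```python
# Hedged sketch of a horizon-hypothesis regression: cash pay is expected to be
# negatively related to R&D, market-based pay positively related.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "cash_comp": rng.lognormal(0.0, 0.5, n),    # salary + bonus (scaled)
    "market_comp": rng.lognormal(0.0, 0.8, n),  # options/stock grants (scaled)
    "ceo_tenure": rng.integers(1, 20, n),
    "ceo_age": rng.integers(45, 70, n),
    "firm_size": rng.lognormal(2.0, 1.0, n),    # e.g. an assets proxy
})
# Synthetic outcome built to mimic the signs reported in the abstract:
# cash pay depresses R&D, market-based pay and tenure raise it.
df["rnd_intensity"] = (0.05 - 0.02 * df.cash_comp + 0.015 * df.market_comp
                       + 0.001 * df.ceo_tenure + 0.01 * np.log(df.firm_size)
                       + rng.normal(0, 0.02, n))

model = smf.ols(
    "rnd_intensity ~ cash_comp + market_comp + ceo_tenure + ceo_age"
    " + np.log(firm_size)",
    data=df,
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors
print(model.summary().tables[1])
```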
Abstract:
The purpose of this study was to document and critically analyze the lived experience of selected nursing staff developers in the process of moving toward a new model for hospital nursing education. Eleven respondents were drawn from a nation-wide population of about two hundred individuals involved in nursing staff development. These subjects were responsible for the implementation of the Performance Based Development System (PBDS) in their institutions. A purposive, criterion-based sampling technique was used, with respondents selected according to size of hospital, primary responsibility for orchestration of the change, influence over budgetary factors, and managerial responsibility for PBDS. Data were gathered by the researcher through both in-person and telephone interviews. A semi-structured interview guide designed by the researcher was used, and respondents were encouraged to amplify on their recollections as desired. Audiotapes were transcribed and the resulting computer files were analyzed using the program "Martin". Answers to interview questions were compiled and reported across cases. The data were then reviewed a second time and interpreted for emerging themes and patterns. Two types of verification were used in the study. Internal verification was done through interview transcript review and feedback by respondents. External verification was done through review of and feedback on the data analysis by readers experienced in the management of staff development departments. All respondents were female, so Gilligan's concept of the "ethic of care" was examined as a decision-making strategy. Three levels of caring that influenced decision making were found: caring (a) for the organization, (b) for the employee, and (c) for the patient. The four existentials of the lived experience (relationality, corporeality, temporality and spatiality) were also examined to reveal the everydayness of making change.
Abstract:
This study investigated the factors considered by forensic examiners when evaluating sexually violent predators (SVP) for civil commitment under Florida's "Jimmy Ryce Act." The project was funded by a pre-doctoral research grant awarded by the Association for the Treatment of Sexual Abusers (ATSA). This study proposed two specific research questions. First, what is the direct relationship between actuarial risk assessment scores and recommendations for sex offender civil commitment? Second, which other variables are likely to influence SVP commitment decisions, and to what degree? The purpose of the study was to determine whether risk assessment practices are evidence-based, and whether offenders selected for commitment meet statutory criteria. The purposive sample of 450 SVPs was drawn from the population of sex offenders evaluated for civil commitment in Florida between July 1, 2000 and June 30, 2001. Data were extracted from SVP evaluations provided by the Florida Department of Children and Families. Using multivariate logistic regression, this correlational research design examined the relationship between the dependent variable, the commitment decision, and several sets of independent variables. The independent variables were derived from a review of the literature and were grouped conceptually according to their degree of correlation with sex offense recidivism. They included diagnoses, actuarial risk assessment scores, empirically validated static and dynamic risk factors, consensus-based risk factors, evaluator characteristics, and demographics. This study investigated the degree to which the identified variables predicted civil commitment decisions. Logistic regression results revealed that the statistically significant predictors of recommendations for sex offender civil commitment were actuarial risk assessment scores, diagnoses of Pedophilia and Paraphilia NOS, psychopathy, younger victim age, and non-minority race. Discriminant function analysis confirmed that these variables correctly predicted commitment decisions in 90% of cases. It appears that civil commitment evaluators in Florida used empirically based assessment procedures and did not make decisions that were heavily influenced by extraneous factors. SVPs recommended for commitment consistently met the criteria set forth by the U.S. Supreme Court in Kansas v. Hendricks (1997): they suffered from a mental abnormality predisposing them to sexual violence, and risk assessment determined that they were likely to reoffend.
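The core estimation step described above can be sketched as a multivariate logistic regression of the commitment recommendation on actuarial scores, diagnoses, and other factors. The field names and synthetic records below are illustrative assumptions, not the Florida evaluation data, and the coefficients are chosen only to mimic the direction of the reported findings.

```python
# Hedged sketch: logistic regression of a commitment recommendation on
# candidate predictors, plus a simple classification-accuracy check.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 450
df = pd.DataFrame({
    "actuarial_score": rng.normal(0, 1, n),  # e.g. a standardized actuarial total
    "dx_pedophilia": rng.integers(0, 2, n),
    "dx_paraphilia_nos": rng.integers(0, 2, n),
    "psychopathy": rng.normal(0, 1, n),      # e.g. a standardized psychopathy score
    "victim_age": rng.integers(5, 40, n),
    "minority": rng.integers(0, 2, n),
})
# Synthetic outcome with signs mirroring the abstract's reported predictors.
logit = (-1.0 + 1.5 * df.actuarial_score + 0.8 * df.dx_pedophilia
         + 0.6 * df.dx_paraphilia_nos + 0.7 * df.psychopathy
         - 0.05 * df.victim_age - 0.4 * df.minority)
df["committed"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

fit = smf.logit(
    "committed ~ actuarial_score + dx_pedophilia + dx_paraphilia_nos"
    " + psychopathy + victim_age + minority",
    data=df,
).fit(disp=False)
print(fit.summary().tables[1])
print("classification accuracy:",
      ((fit.predict(df) > 0.5) == df.committed).mean())
```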
Abstract:
The most important factor affecting the decision-making process in finance is risk, usually measured by variance (total risk) or systematic risk (beta). Since an investor's sentiment (whether he or she is an optimist or a pessimist) plays a very important role in the choice of beta measure, any decision made for the same asset within the same time horizon will differ across individuals. In other words, behavioral traits mean there will be neither homogeneity of beliefs nor the rational expectations assumed to prevail in the market. This dissertation consists of three essays. In the first essay, "Investor Sentiment and Intrinsic Stock Prices", a new technical trading strategy was developed using a firm-specific individual sentiment measure. This behavioral trading strategy forecasts a range within which a stock price moves in a particular period and can be used for stock trading. Results indicate that sample firms trade within a range and give signals as to when to buy or sell. The second essay, "Managerial Sentiment and the Value of the Firm", examined the effect of managerial sentiment on the project selection process using the net present value criterion and on the value of the firm. The analysis showed that high-sentiment and low-sentiment managers obtain different values for the same firm before and after the acceptance of a project, and that managerial sentiment produces changes in the cost of capital and the weighted average cost of capital. The last essay, "Investor Sentiment and Optimal Portfolio Selection", analyzed how investor sentiment affects the nature and composition of the optimal portfolio as well as portfolio performance. Results suggested that the choice of investor sentiment completely changes the portfolio composition, i.e., the high-sentiment investor will have a completely different choice of assets in the portfolio in comparison with the low-sentiment investor. The results indicate the practical application of a behavioral-model-based technical indicator for stock trading. Additional insights developed include the valuation of firms with a behavioral component and the importance of distinguishing portfolio performance based on sentiment factors.
Abstract:
The main objective is to demonstrate how usage data from new media can be used to assess areas where students need more help in creating their ETDs. After attending this session, attendees will be able to use usage data from new media, in conjunction with traditional assessment data, to identify strengths and weaknesses in ETD training and resources. The burgeoning ETD program at Florida International University (FIU) has provided many opportunities to experiment with assessment strategies and new media. The usage statistics from YouTube and the ETD LibGuide revealed areas of strength and weakness in the training resources and the overall ETD training initiative. With the ability to assess these materials, they have been updated to better meet student needs. In addition to these assessment tools, there are opportunities to connect these statistics with data from a common error checklist, student feedback from ETD workshops, and final ETD submission surveys to create a full-fledged outcome-based assessment program for the ETD initiative.
Abstract:
The Highway Safety Manual (HSM) estimates roadway safety performance based on predictive models that were calibrated using national data. Calibration factors are then used to adjust these predictive models to local conditions for local applications. The HSM recommends that local calibration factors be estimated using 30 to 50 randomly selected sites that experienced at least a total of 100 crashes per year. It also recommends that the factors be updated every two to three years, preferably on an annual basis. However, these recommendations are primarily based on expert opinions rather than data-driven research findings. Furthermore, most agencies do not have data for many of the input variables recommended in the HSM. This dissertation is aimed at determining the best way to meet three major data needs affecting the estimation of calibration factors: (1) the required minimum sample sizes for different roadway facilities, (2) the required frequency for calibration factor updates, and (3) the influential variables affecting calibration factors. In this dissertation, statewide segment and intersection data were first collected for most of the HSM recommended calibration variables using a Google Maps application. In addition, eight years (2005-2012) of traffic and crash data were retrieved from existing databases from the Florida Department of Transportation. With these data, the effect of sample size criterion on calibration factor estimates was first studied using a sensitivity analysis. The results showed that the minimum sample sizes not only vary across different roadway facilities, but they are also significantly higher than those recommended in the HSM. In addition, results from paired sample t-tests showed that calibration factors in Florida need to be updated annually. To identify influential variables affecting the calibration factors for roadway segments, the variables were prioritized by combining the results from three different methods: negative binomial regression, random forests, and boosted regression trees. Only a few variables were found to explain most of the variation in the crash data. Traffic volume was consistently found to be the most influential. In addition, roadside object density, major and minor commercial driveway densities, and minor residential driveway density were also identified as influential variables.
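For reference, the local calibration step the HSM prescribes reduces to the ratio of observed to predicted crashes over the calibration sample, with predictions coming from the relevant safety performance function (SPF). The sketch below uses a placeholder SPF form and made-up site data; the coefficients are not the HSM's published values or the Florida data.

```python
# Hedged sketch of the local calibration step: C = sum(observed) / sum(predicted).
import numpy as np

def spf_segment_crashes(aadt, length_mi, a=-7.7, b=1.0):
    """Placeholder HSM-style safety performance function: N = exp(a) * AADT^b * L."""
    return np.exp(a) * aadt ** b * length_mi

# Hypothetical calibration sites: (AADT, segment length in miles, observed crashes/yr).
sites = np.array([
    (12000, 0.8, 5),
    (18000, 1.2, 11),
    (25000, 0.5, 7),
    (9000,  2.0, 6),
    (30000, 0.7, 12),
])

predicted = spf_segment_crashes(sites[:, 0], sites[:, 1])
observed = sites[:, 2]

calibration_factor = observed.sum() / predicted.sum()
print(f"sum observed  = {observed.sum():.0f}")
print(f"sum predicted = {predicted.sum():.2f}")
print(f"local calibration factor C = {calibration_factor:.2f}")
```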
Abstract:
The thermodynamic performance of a refrigeration system can be improved by reducing the compression work required for a specific heat removal rate. This study examines the effect of dispersing small concentrations of Al2O3 nanoparticles (50 nm) in the mineral-oil-based lubricant on viscosity, thermal conductivity, and lubrication characteristics, as well as on the overall performance (based on the Second Law of Thermodynamics) of a refrigerating system using R134a or R600a as refrigerant. The study examined the influence of three variables, (i) refrigerant charge (100, 110, 120 and 130 g), (ii) condenser blower rotational speed (800 and 1100 RPM) and (iii) nanoparticle concentration (0.1 and 0.5 g/l), on system performance using the Taguchi method in an L8 trial matrix with the criterion "smaller irreversibility is better". Pull-down and cycling tests were carried out according to NBR 12866 and NBR 12869, respectively, to evaluate the operational parameters: on-time ratio, cycles per hour, suction and discharge pressures, oil sump temperature, evaporation and condensation temperatures, energy consumption at the set point, total energy consumption and compressor power. To characterize the nanolubricants, accelerated tests were performed on an HFRR bench. In each 60-minute test with a nanolubricant at a given concentration (0, 0.1 or 0.5 g/l), with three replications, a sphere (diameter 6.00 ± 0.05 mm, Ra 0.05 ± 0.005 µm, AISI 52100 steel, E = 210 GPa, HRC 62 ± 4) slid against a flat plate (cast iron FC200, Ra <0.5 ± 0.005 µm) in reciprocating motion with an amplitude of 1 mm, a frequency of 20 Hz and a normal load of 1.96 N. The friction coefficient signals were recorded by sensors coupled to the HFRR system. A trend rarely discussed in the literature was observed: a reduction in nanolubricant viscosity at low nanoparticle concentrations. The dominant trend reported in the literature was confirmed: thermal conductivity increased with increasing nanoparticle mass fraction in the base fluid. The thermal conductivity of the nanolubricant also increased significantly with temperature. The condenser fan rotational speed was the most influential parameter on refrigerator performance (46.192%), followed by the R600a charge (38.606%); the Al2O3 nanoparticle concentration in the lubricant had a minor influence (12.44%). The energy consumption results indicate that adding nanoparticles to the lubricant (0.1 g/L) together with R600a reduces the refrigerator's consumption by 22% relative to R134a with POE lubricant, while adding the Al2O3 nanoparticles to the lubricant alone yields a reduction of about 5%.
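The Taguchi analysis described above can be sketched with a simplified two-level version of the design: a "smaller is better" signal-to-noise ratio is computed for each trial, and each factor's percentage contribution follows from a sum-of-squares decomposition. The response values and factor coding below are made-up placeholders, not the measured irreversibilities or the actual L8 layout used in the study.

```python
# Hedged sketch: "smaller is better" Taguchi S/N ratios and factor contributions.
import numpy as np

# Two-level coding (0/1) for three factors over eight trials:
# refrigerant charge proxy, condenser-fan speed, nanoparticle concentration.
L8 = np.array([[0, 0, 0], [0, 0, 1], [0, 1, 0], [0, 1, 1],
               [1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]])
# Hypothetical irreversibility response for each trial (kW, say).
y = np.array([0.42, 0.44, 0.31, 0.33, 0.40, 0.41, 0.29, 0.30])

# Smaller-the-better signal-to-noise ratio for each trial.
sn = -10.0 * np.log10(y ** 2)

grand_mean = sn.mean()
ss_total = ((sn - grand_mean) ** 2).sum()
for j, name in enumerate(["charge", "fan speed", "nanoparticle conc."]):
    level_means = np.array([sn[L8[:, j] == lvl].mean() for lvl in (0, 1)])
    ss_factor = 4 * ((level_means - grand_mean) ** 2).sum()  # 4 runs per level
    print(f"{name:18s} contribution: {100 * ss_factor / ss_total:5.1f} %")
```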
Abstract:
This thesis deals with the evaluation of different bracing systems, both in terms of their response to a seismic event and in terms of the economic losses associated with damage to the various components. Among them, a new structural typology is also presented, conceived to reduce the "soft-story" and "weak-story" behaviour typical of conventional braced structures. In this case, a steel truss is integrated into the structure, acting as a vertical support and designed to remain in the elastic range. This support ensures a more uniform distribution of forces along the entire height of the structure, rather than concentrating them in a single story. The research studies the economic feasibility of this new technology relative to previously adopted bracing solutions, comparing the economic losses of the different solutions applied to a single prototype building located in Berkeley, CA. The seismic analysis considers three different intensity levels, referred to a 50-year period corresponding to the life of the building and characterized by probabilities of occurrence of 2%, 10% and 50% in 50 years, respectively. The research area presented is highly innovative and of primary interest for the development of a resilience study, which can also be adapted to a model of future urbanization.
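For reference, the three hazard levels quoted above map onto the familiar return periods used in loss assessment. The sketch below performs that conversion under the usual Poisson-occurrence assumption; it is a generic illustration, not part of the thesis's loss model.

```python
# Hedged sketch: convert probability of exceedance in 50 years into a mean
# annual exceedance rate and return period, assuming Poisson occurrence.
import math

exposure_years = 50.0
for p50 in (0.02, 0.10, 0.50):
    annual_rate = -math.log(1.0 - p50) / exposure_years
    return_period = 1.0 / annual_rate
    print(f"P = {p50:4.0%} in 50 yr  ->  lambda = {annual_rate:.5f} /yr, "
          f"T_R = {return_period:5.0f} yr")
```

These correspond to the roughly 2475-, 475- and 72-year return periods commonly used in performance-based earthquake engineering; the losses computed at each level can then be rate-weighted to approximate an expected annual loss.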
Comparison of fining agents of animal and vegetable origin and yeast extracts in red wines
Abstract:
Alternatives to fining agents of animal origin, namely plant-based proteins and yeast extracts, have been appearing on the market. The aim of this work was to compare a large number of animal-origin fining agents (gelatines) with alternative products of plant and yeast origin. Initially, the fining agents were applied to the wine at different doses. Based on sensory evaluation, the products and doses that performed best were selected. The selected fining agents were then applied to the wine and their effect on its sensory and physico-chemical characteristics was evaluated. The parameter showing the most significant differences was turbidity. Statistical analysis revealed significant differences only for the attributes flavour quality and overall score; although no significant differences were found for the extraction/dryness attribute, the variations observed for it were very important from a practical point of view. The gelatines that performed best were POA 10 and POA 12, along with the plant proteins POV 6 (pea protein) and POV 10 (potato protein). This study gathered information to be used in selecting the fining agent best suited to the winery's wine profile and the most appropriate alternative to gelatines.
Abstract:
In this dissertation, I explore the impact of several public policies on civic participation. Using a unique combination of school administrative and public-use voter files and methods for causal inference, I evaluate the impact of three new, as of yet unexplored, policies: one informational, one institutional, and one skill-based. Chapter 2 examines the causal effect of No Child Left Behind's performance-based accountability school failure signals on turnout in school board elections and on individuals' use of exit. I find that failure signals mobilize citizens both at the ballot box and by encouraging them to vote with their feet. However, these increases in voice and exit come primarily from citizens who are already active, thus exacerbating inequalities in both forms of participation. Chapter 3 examines the causal effect of preregistration, an electoral reform that allows young citizens to enroll in the electoral system before turning 18 while also providing them with various in-school supports. Using data from the Current Population Survey and Florida Voter Files and multiple methods for causal inference, I (with my coauthor) show that preregistration mobilizes and does so for a diverse set of citizens. Finally, Chapter 4 examines the impact of psychosocial, or so-called non-cognitive, skills on voter turnout. Using information from the Fast Track intervention, I show that early-childhood investments in psychosocial skills have large, long-run spillovers on civic participation. These gains are widely distributed, being especially large for those least likely to participate. These chapters provide clear insights that reach across disciplinary boundaries and speak to current policy debates. In placing specific attention not only on whether these programs mobilize, but also on whom they mobilize, I provide scholars and practitioners with new ways of thinking about how to address stubbornly low and unequal rates of citizen engagement.
Abstract:
This dissertation consists of three chapters related to the low-price guarantee marketing strategy and to energy efficiency analysis. A low-price guarantee is a marketing strategy in which firms promise to charge consumers the lowest price among their competitors. Chapter 1 addresses the research question "Does a Low-Price Guarantee Induce Lower Prices?" by looking at the retail gasoline industry in Quebec, where a major branded firm introduced a low-price guarantee in 1996. Chapter 2 conducts a consumer welfare analysis of low-price guarantees to derive policy implications and offers a new explanation of firms' incentives to adopt a low-price guarantee. Chapter 3 develops energy performance indicators (EPIs) to measure the energy efficiency of manufacturing plants in the pulp, paper and paperboard industry.
Chapter 1 revisits the traditional view that a low-price guarantee results in higher prices by facilitating collusion. Using accurate market definitions and station-level data from the retail gasoline industry in Quebec, I conduct a descriptive analysis based on stations and price zones to compare price and sales movements before and after the guarantee was adopted. I find that, contrary to the traditional view, the stores that offered the guarantee significantly decreased their prices and increased their sales. I also build a difference-in-difference model, which quantifies the decrease in the posted price of the stores that offered the guarantee at 0.7 cents per liter. While this change is significant, I do not find a significant response in competitors' prices. The sales of the stores that offered the guarantee increased significantly while competitors' sales decreased significantly; however, the significance vanishes if I use station-clustered standard errors. Comparing my observations with the predictions of different theories of low-price guarantees, I conclude that the empirical evidence supports the view that the low-price guarantee acts as a simple commitment device and induces lower prices.
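The difference-in-difference comparison described above can be sketched as a regression of posted price on a treated-station indicator, a post-adoption indicator, and their interaction, with standard errors clustered by station. The column names and synthetic panel below are illustrative assumptions, not the Quebec station-level data; the built-in effect simply mirrors the reported 0.7 cent estimate.

```python
# Hedged sketch: difference-in-differences with station-clustered standard errors.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
stations, weeks = 60, 80
df = pd.DataFrame([
    {"station": s, "week": w,
     "treated": int(s < 20),   # stations offering the guarantee
     "post": int(w >= 40)}     # weeks after adoption
    for s in range(stations) for w in range(weeks)
])
# Synthetic posted price (cents/litre) with a -0.7 cent treatment effect.
df["price"] = (70 + 0.5 * df.treated + 0.3 * df.post
               - 0.7 * df.treated * df.post + rng.normal(0, 1.0, len(df)))

did = smf.ols("price ~ treated * post", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["station"]}
)
print(did.summary().tables[1])  # the treated:post coefficient is the DiD estimate
```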
Chapter 2 conducts a consumer welfare analysis of low-price guarantees to address antitrust concerns and potential government regulation, and explains firms' potential incentives to adopt a low-price guarantee. Using station-level data from the retail gasoline industry in Quebec, I estimate consumers' demand for gasoline with a structural model of spatial competition that incorporates the low-price guarantee as a commitment device allowing firms to pre-commit to charging the lowest price among their competitors. The counterfactual analysis under a Bertrand competition setting shows that the stores that offered the guarantee attracted many more consumers and decreased their posted price by 0.6 cents per liter. Although the matching stores suffered a decrease in profits from gasoline sales, they are incentivized to adopt the low-price guarantee because it attracts more consumers to the store, likely increasing profits at attached convenience stores. Firms have strong incentives to adopt a low-price guarantee on the product their consumers are most price-sensitive about, while earning a profit from products not covered by the guarantee. I estimate that consumers earn about 0.3% more surplus when the low-price guarantee is in place, which suggests that the authorities need not be concerned about or regulate low-price guarantees. In Appendix B, I also propose an empirical model to examine how low-price guarantees would change consumer search behavior and whether consumer search plays an important role in estimating consumer surplus accurately.
Chapter 3, joint with Gale Boyd, describes work with the pulp, paper, and paperboard (PP&PB) industry to provide a plant-level indicator of energy efficiency for facilities that produce various types of paper products in the United States. Organizations that implement strategic energy management programs undertake a set of activities that, if carried out properly, have the potential to deliver sustained energy savings. Energy performance benchmarking is a key activity of strategic energy management and one way to enable companies to set energy efficiency targets for manufacturing facilities. The opportunity to assess plant energy performance through a comparison with similar plants in its industry is a highly desirable and strategic method of benchmarking for industrial energy managers. However, access to energy performance data for conducting industry benchmarking is usually unavailable to most industrial energy managers. The U.S. Environmental Protection Agency (EPA), through its ENERGY STAR program, seeks to overcome this barrier through the development of manufacturing sector-based plant energy performance indicators (EPIs) that encourage U.S. industries to use energy more efficiently. In developing the energy performance indicator tools, consideration is given to the role that performance-based indicators play in motivating change; to the steps necessary for indicator development, from interacting with an industry and securing adequate data to the actual application and use of an indicator when complete; and to how indicators are employed in EPA's efforts to encourage industries to voluntarily improve their use of energy. The chapter describes the data and statistical methods used to construct the EPI for plants within selected segments of the pulp, paper, and paperboard industry: specifically, pulp mills and integrated paper & paperboard mills. The individual equations are presented, as are the instructions for using those equations as implemented in an associated Microsoft Excel-based spreadsheet tool.
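The benchmarking idea behind such an EPI can be sketched as follows: regress plant energy use on the main drivers of energy demand, then score each plant by where its residual falls in the peer distribution, so that plants using less energy than predicted score closer to 100. This is a deliberately simplified illustration with synthetic mill data; it is not the ENERGY STAR EPI equations or the spreadsheet tool described above.

```python
# Hedged sketch: a regression-based plant energy benchmark with percentile scores.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(3)
n = 120
mills = pd.DataFrame({
    "pulp_tons": rng.lognormal(11, 0.6, n),   # annual pulp production (hypothetical)
    "paper_tons": rng.lognormal(11, 0.6, n),  # annual paper production (hypothetical)
    "recycled_share": rng.uniform(0, 1, n),   # share of recycled fibre (hypothetical)
})
mills["energy_mmbtu"] = (5.0 * mills.pulp_tons + 3.0 * mills.paper_tons
                         - 0.5 * mills.recycled_share * mills.paper_tons
                         + rng.normal(0, 4e4, n))

fit = smf.ols("energy_mmbtu ~ pulp_tons + paper_tons + recycled_share",
              data=mills).fit()
resid = fit.resid
# Performance score: percentile of the (negative) residual among peers, so
# plants using less energy than predicted score closer to 100.
mills["epi_score"] = 100 * (1 - stats.rankdata(resid) / (len(resid) + 1))
print(mills[["energy_mmbtu", "epi_score"]].head())
```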