841 results for Firm Performance Measures
Abstract:
Most research on single machine scheduling has assumed the linearity of job holding costs, which is arguably not appropriate in some applications. This motivates our study of a model for scheduling $n$ classes of stochastic jobs on a single machine, with the objective of minimizing the total expected holding cost (discounted or undiscounted). We allow general holding cost rates that are separable, nondecreasing and convex on the number of jobs in each class. We formulate the problem as a linear program over a certain greedoid polytope, and establish that it is solved optimally by a dynamic (priority) index rule, which extends the classical Smith's rule (1956) for the linear case. Unlike Smith's indices, defined for each class, our new indices are defined for each extended class, consisting of a class and a number of jobs in that class, and yield an optimal dynamic index rule: work at each time on a job whose current extended class has the largest index. We further show that the indices possess a decomposition property, as they are computed separately for each class, and interpret them in economic terms as marginal expected cost rate reductions per unit of expected processing time. We establish the results by deploying a methodology recently introduced by us [J. Niño-Mora (1999). "Restless bandits, partial conservation laws, and indexability." Forthcoming in Advances in Applied Probability Vol. 33 No. 1, 2001], based on the satisfaction by performance measures of partial conservation laws (PCL) (which extend the generalized conservation laws of Bertsimas and Niño-Mora (1996)): PCL provide a polyhedral framework for establishing the optimality of index policies with special structure in scheduling problems under admissible objectives, which we apply to the model of concern.
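For reference, the classical Smith's rule that this abstract generalizes ranks job classes by the ratio of holding cost rate to expected processing time and serves the class with the largest ratio. The sketch below illustrates only that classical linear-cost rule, not the paper's extended-class indices; all names and data are our own illustrative assumptions.

```python
# Illustrative sketch of classical Smith's rule (1956) for linear holding
# costs: serve classes in decreasing order of c_i / E[S_i], where c_i is
# the holding cost rate and E[S_i] the mean processing time of class i.
# Data are hypothetical; the paper's extended-class indices generalize
# this rule to convex holding costs and are not reproduced here.

def smith_indices(cost_rates, mean_proc_times):
    """Return the Smith index c_i / E[S_i] for each class."""
    return [c / s for c, s in zip(cost_rates, mean_proc_times)]

cost_rates = [4.0, 3.0, 5.0]       # holding cost per unit time, per class
mean_proc_times = [2.0, 1.0, 4.0]  # expected processing times E[S_i]

indices = smith_indices(cost_rates, mean_proc_times)
priority_order = sorted(range(len(indices)), key=lambda i: -indices[i])
print(priority_order)  # classes ranked by decreasing index: [1, 0, 2]
```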
Abstract:
We present a polyhedral framework for establishing general structural properties on optimal solutions of stochastic scheduling problems, where multiple job classes vie for service resources: the existence of an optimal priority policy in a given family, characterized by a greedoid (whose feasible class subsets may receive higher priority), where optimal priorities are determined by class-ranking indices, under restricted linear performance objectives (partial indexability). This framework extends that of Bertsimas and Niño-Mora (1996), which explained the optimality of priority-index policies under all linear objectives (general indexability). We show that, if performance measures satisfy partial conservation laws (with respect to the greedoid), which extend previous generalized conservation laws, then the problem admits a strong LP relaxation over a so-called extended greedoid polytope, which has strong structural and algorithmic properties. We present an adaptive-greedy algorithm (which extends Klimov's) taking as input the linear objective coefficients, which (1) determines whether the optimal LP solution is achievable by a policy in the given family; and (2) if so, computes a set of class-ranking indices that characterize optimal priority policies in the family. In the special case of project scheduling, we show that, under additional conditions, the optimal indices can be computed separately for each project (index decomposition). We further apply the framework to the important restless bandit model (two-action Markov decision chains), obtaining new index policies that extend Whittle's (1988), and simple sufficient conditions for their validity. These results highlight the power of polyhedral methods (the so-called achievable region approach) in dynamic and stochastic optimization.
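To convey the flavour of an adaptive-greedy index computation in the style of Klimov's algorithm, the following is a schematic outline only, under simplifying assumptions: the coefficient function `A` stands in for the model's conservation-law (workload) coefficients and is a hypothetical placeholder, not the paper's exact construction.

```python
# Schematic adaptive-greedy index computation in the spirit of Klimov's
# algorithm: classes are peeled off one at a time, each receiving an index
# equal to the best current reward-to-workload ratio, after which the
# rewards of the remaining classes are adjusted. This is an outline under
# assumed inputs, not the paper's exact procedure.

def adaptive_greedy(rewards, A):
    """rewards: dict class -> linear objective coefficient.
    A: function (class i, frozenset S) -> workload coefficient > 0.
    Returns dict class -> index, assigned in decreasing priority order."""
    remaining = set(rewards)
    adjusted = dict(rewards)
    indices = {}
    while remaining:
        S = frozenset(remaining)
        # pick the class with the largest reward-to-coefficient ratio
        best = max(remaining, key=lambda i: adjusted[i] / A(i, S))
        nu = adjusted[best] / A(best, S)
        indices[best] = nu
        remaining.remove(best)
        # charge the assigned index against the remaining classes' rewards
        for i in remaining:
            adjusted[i] -= nu * A(i, S)
    return indices
```

The one-pass structure (each class is ranked exactly once, with rewards re-adjusted after each pick) is the defining feature of this algorithm family; the specific coefficients determine which model it solves.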
Abstract:
We show that if performance measures in a stochastic scheduling problem satisfy a set of so-called partial conservation laws (PCL), which extend previously studied generalized conservation laws (GCL), then the problem is solved optimally by a priority-index policy for an appropriate range of linear performance objectives, where the optimal indices are computed by a one-pass adaptive-greedy algorithm, based on Klimov's. We further apply this framework to investigate the indexability property of restless bandits introduced by Whittle, obtaining the following results: (1) we identify a class of restless bandits (PCL-indexable) which are indexable; membership in this class is tested through a single run of the adaptive-greedy algorithm, which also computes the Whittle indices when the test is positive; this provides a tractable sufficient condition for indexability; (2) we further identify the class of GCL-indexable bandits, which includes classical bandits, having the property that they are indexable under any linear reward objective. The analysis is based on the so-called achievable region method, as the results follow from new linear programming formulations for the problems investigated.
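As background (our summary, not taken from the abstract): in Whittle's formulation each bandit earns a subsidy $\lambda$ whenever it is rested, and indexability requires the set of states where resting is optimal to grow monotonically with $\lambda$. The Whittle index of a state is then the critical subsidy:

```latex
% Whittle index of state x: the smallest passive subsidy at which the
% passive action becomes optimal in x (well defined under indexability).
W(x) \;=\; \inf\bigl\{\lambda \in \mathbb{R} :
      \text{the passive action is optimal in state } x
      \text{ under passive subsidy } \lambda\bigr\}.
```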
Abstract:
We propose a model in which economic relations and institutions in advanced and less developed economies differ as these societies have access to different amounts of information. This lack of information makes it hard to give the right incentives to managers and entrepreneurs. We argue that differences in the amount of information arise because of the differences in the scale of activities in rich and poor economies; namely, there is too little repetition of similar activities in poor economies, thus insufficient information to set the appropriate standards for firm performance. Our model predicts a number of institutional and structural transformations as the economy accumulates capital and information.
Abstract:
Highway agencies spend millions of dollars to ensure safe and efficient winter travel. However, the effectiveness of winter weather maintenance practices on safety and mobility is somewhat difficult to quantify. Phase I of this project investigated opportunities for improving traffic safety on state-maintained roads in Iowa during winter weather conditions. The primary objective was to develop several preliminary means for the Iowa Department of Transportation (DOT) to systematically identify locations of possible interest with respect to winter weather-related safety performance, based on crash history. Specifically, metrics were developed to assist in identifying possible habitual, winter weather-related crash sites on state-maintained rural highways in Iowa. In addition, the current state of practice, for both domestic and international highway agencies, regarding integration of traffic safety- and mobility-related data in winter maintenance activities and performance measures was investigated. This investigation also included previous research efforts. Finally, a preliminary work plan, focusing on systematic use of safety-related data in support of winter maintenance activities and site evaluation, was prepared.
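The abstract does not specify the metrics; as a purely hypothetical illustration of the kind of crash-history screening it describes, the sketch below flags road segments whose winter weather-related crash frequency exceeds a simple threshold. All field names, data, and the threshold rule are our assumptions.

```python
# Hypothetical screening metric: flag road segments whose average number
# of winter weather-related crashes per season exceeds a chosen threshold.
# Segment identifiers, crash records, and the threshold are illustrative.
from collections import Counter

crashes = [  # (segment_id, season) for winter weather-related crashes
    ("IA-3_mp12", 2004), ("IA-3_mp12", 2005), ("IA-3_mp12", 2006),
    ("US-20_mp88", 2005),
]
seasons = 3          # number of winter seasons in the crash history
threshold = 0.5      # flag segments averaging > 0.5 crashes per season

per_segment = Counter(seg for seg, _ in crashes)
flagged = {seg: n / seasons for seg, n in per_segment.items()
           if n / seasons > threshold}
print(flagged)  # {'IA-3_mp12': 1.0}
```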
Abstract:
Executive Summary: The unifying theme of this thesis is the pursuit of satisfactory ways to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broad scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we apply an idea from the field of fuzzy set theory to the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results on asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third one can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds. Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative agent model that addresses some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. The empirical investigation to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one. Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics relative to those of realized returns from portfolio strategies optimal with respect to single performance measures. When comparing the distributions of realized returns we used two partial risk-reward orderings: first and second order stochastic dominance.
We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the way we propose to aggregate performance measures leads to portfolio realized returns that first order stochastically dominate the ones that result from optimization with respect to only a single measure, for example the Treynor ratio or Jensen's alpha. We checked for second order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e. the sequence of expected shortfalls for a range of quantiles. Since the plot of the absolute Lorenz curve for the aggregated performance measures was above the one corresponding to each individual measure, we were tempted to conclude that the algorithm we propose leads to a portfolio returns distribution that second order stochastically dominates those obtained from virtually all performance measures considered. Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index, relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests the presence of an inherent weakness in the attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results about the bounds on the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
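As a rough illustration of the checks described above (our own sketch with synthetic data, not the thesis code), two realized-return samples can be compared with SciPy's two-sample Kolmogorov-Smirnov test, an empirical-CDF check for first-order dominance, and a comparison of absolute Lorenz curves for second-order dominance:

```python
# Sketch of the distributional checks described above: a two-sample
# Kolmogorov-Smirnov test, plus simple empirical checks of first- and
# second-order stochastic dominance. Data are synthetic placeholders.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
r_agg = rng.normal(0.008, 0.04, 500)     # returns, aggregated measures
r_single = rng.normal(0.005, 0.05, 500)  # returns, single measure

# Are the two return distributions different at all?
stat, pvalue = ks_2samp(r_agg, r_single)

# First-order dominance check: F_agg(x) <= F_single(x) on a grid.
grid = np.linspace(min(r_agg.min(), r_single.min()),
                   max(r_agg.max(), r_single.max()), 200)
F_agg = np.searchsorted(np.sort(r_agg), grid, side="right") / len(r_agg)
F_sgl = np.searchsorted(np.sort(r_single), grid, side="right") / len(r_single)
fsd = np.all(F_agg <= F_sgl + 1e-12)

# Second-order check via absolute Lorenz curves: cumulative averages of
# sorted returns, i.e. expected shortfalls across quantiles.
def lorenz(r):
    return np.cumsum(np.sort(r)) / len(r)

ssd = np.all(lorenz(r_agg) >= lorenz(r_single))
print(pvalue, fsd, ssd)
```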
Abstract:
The choice to adopt risk-sensitive measurement approaches for operational risks: the case of the Advanced Measurement Approach under the Basel II New Capital Accord. This paper investigates the choice of the operational risk approach under Basel II requirements and whether the adoption of advanced risk measurement approaches allows banks to save capital. Among the three possible approaches for operational risk measurement, the Advanced Measurement Approach (AMA) is the most sophisticated and requires the use of historical loss data, the application of statistical tools, and the engagement of a highly qualified staff. Our results provide evidence that the adoption of AMA is contingent on the availability of bank resources and prior experience in risk-sensitive operational risk measurement practices. Moreover, banks that choose AMA exhibit low capital requirements and, as a result, might gain a competitive advantage compared to banks that opt for less sophisticated approaches. - Internal Risk Controls and their Impact on Bank Solvency. Recent cases in the financial sector have shown the importance of risk management controls for risk taking and firm performance. Despite advances in the design and implementation of risk management mechanisms, there is little research on their impact on the behavior and performance of firms. Based on data from a sample of 88 banks covering the period between 2004 and 2010, we provide evidence that internal risk controls affect the solvency of banks. In addition, our results show that the level of internal risk controls leads to a higher degree of solvency in banks with a major shareholder, in contrast to widely-held banks. However, the relationship between internal risk controls and bank solvency is negatively affected by BHC growth strategies and external restrictions on bank activities, while higher regulatory requirements for bank capital positively moderate this relationship. - The Impact of the Sophistication of Risk Measurement Approaches under Basel II on Bank Holding Companies' Value. Previous research has shown the importance of external regulation for banks' behavior. Some inefficient standards may accentuate risk-taking in banks and provoke a financial crisis. Despite the growing literature on the potential effects of Basel II rules, there is little empirical research on the efficiency of risk-sensitive capital measurement approaches and their impact on bank profitability and market valuation. Based on data from a sample of 66 banks covering the period between 2008 and 2010, we provide evidence that prudential ratios computed under Basel II standards predict the value of banks. However, this relation is contingent on the degree of sophistication of the risk measurement approaches that banks apply. Capital ratios are effective in predicting bank market valuation when banks adopt the advanced approaches to compute the value of their risk-weighted assets.
Abstract:
Like many businesses and government agencies, the Iowa Department of Corrections has been measuring its results for some years now. Certain performance measures are collected and reported to the Governor as part of the Director's Flexible Performance Agreement used to evaluate the DOC Director. Updates of these measures are forwarded to DOC staff on a quarterly basis. In addition, the Iowa Department of Management requires each state agency to report on certain performance measures as part of Iowa's effort to ensure accountability in state government. These measures and their progress are posted to www.ResultsIowa.org
Abstract:
Improving safety at nighttime work zones is important because of the extra visibility concerns. The deployment of sequential lights is an innovative method for improving driver recognition of lane closures and work zone tapers. Sequential lights are wireless warning lights that flash in a sequence to clearly delineate the taper at work zones. The effectiveness of sequential lights was investigated using controlled field studies. Traffic parameters were collected at the same field site with and without the deployment of sequential lights. Three surrogate performance measures were used to determine the impact of sequential lights on safety. These measures were the speeds of approaching vehicles, the number of late taper merges, and the locations where vehicles merged into the open lane from the closed lane. In addition, an economic analysis was conducted to monetize the benefits and costs of deploying sequential lights at nighttime work zones. The results of this study indicate that sequential warning lights had a net positive effect in reducing the speeds of approaching vehicles, enhancing driver compliance, and preventing late taper merges by passenger cars, trucks, and other vehicles at rural work zones. Statistically significant decreases of 2.21 mph in mean speed and 1 mph in 85th percentile speed resulted with sequential lights. The shift in the cumulative speed distributions to the left (i.e., a speed decrease) was also found to be statistically significant using the Mann-Whitney and Kolmogorov-Smirnov tests. However, a statistically significant increase of 0.91 mph in the speed standard deviation also resulted with sequential lights. With sequential lights, the percentage of vehicles that merged earlier increased from 53.49% to 65.36%. A benefit-cost ratio of around 5 or 10 resulted from this analysis of Missouri nighttime work zones and historical crash data. The two different benefit-cost ratios reflect two different ways of computing labor costs.
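A minimal sketch of the speed-distribution comparison described above, using SciPy's standard two-sample tests (our illustration with synthetic speed samples, not the study's data):

```python
# Compare approach-speed distributions with and without sequential lights
# using the Mann-Whitney U and two-sample Kolmogorov-Smirnov tests, as in
# the study's analysis. Speed samples here are synthetic placeholders.
import numpy as np
from scipy.stats import mannwhitneyu, ks_2samp

rng = np.random.default_rng(1)
speeds_without = rng.normal(55.0, 5.0, 300)  # mph, no sequential lights
speeds_with = rng.normal(52.8, 5.9, 300)     # mph, with sequential lights

# One-sided test: are speeds with lights stochastically lower?
u_stat, u_p = mannwhitneyu(speeds_with, speeds_without,
                           alternative="less")
ks_stat, ks_p = ks_2samp(speeds_with, speeds_without)

print(f"mean drop: {speeds_without.mean() - speeds_with.mean():.2f} mph")
print(f"Mann-Whitney p = {u_p:.4f}, KS p = {ks_p:.4f}")
```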
Abstract:
The objective of this study is to systematically evaluate the Iowa Department of Transportation's (DOT's) existing Pavement Management Information System (PMIS) with respect to the input information required for Mechanistic-Empirical Pavement Design Guide (MEPDG) rehabilitation analysis and design. To accomplish this objective, all available PMIS data for interstate and primary roads in Iowa were retrieved from the Iowa DOT PMIS. The retrieved data were evaluated with respect to the input requirements and outputs for the latest version of the MEPDG software (version 1.0). The input parameters that are required for MEPDG HMA rehabilitation design but are currently unavailable in the Iowa DOT PMIS were identified. The differences between the Iowa DOT PMIS and the MEPDG in the specific measurement metrics, and their units, used for some of the pavement performance measures were identified and discussed. Based on the results of this study, it is recommended that the Iowa DOT PMIS be updated, if possible, to include the identified parameters that are currently unavailable but are required for MEPDG rehabilitation design. Similarly, the measurement units of distress survey results in the Iowa DOT PMIS should be revised to correspond to those of MEPDG performance predictions.
Abstract:
In the last decade, Intelligent Transportation Systems (ITS) have increasingly been deployed in work zones by state departments of transportation. Also known as smart work zone systems, they improve traffic operations and safety by providing real-time information to travelers, monitoring traffic conditions, and managing incidents. Although there have been numerous ITS deployments in work zones, a framework for evaluating the effectiveness of these deployments does not exist. To justify the continued development and implementation of smart work zone systems, this study developed a framework to determine ITS effectiveness for specific work zone projects. The framework recommends using one or more of five performance measures: diversion rate, delay time, queue length, crash frequency, and speed. The monetary benefits and costs of ITS deployment in a work zone can then be computed using the performance measure values. Such ITS computations include additional considerations that are typically not present in standard benefit-cost computations. The proposed framework will allow for consistency in performance measures across different ITS studies, thus allowing for comparisons across studies or for meta-analysis. In addition, guidance is provided on the circumstances under which ITS deployment is recommended for a work zone. The framework was illustrated using two case studies in Missouri: one urban work zone on I-70 and one rural work zone on I-44. The goals of the two ITS deployments were different: the I-70 deployment was targeted at improving mobility, whereas the I-44 deployment was targeted at improving safety. For the I-70 site, only permanent ITS equipment that was already in place was used for the project, and no temporary ITS equipment was deployed. The permanent DMS equipment serves multiple purposes, and it is arguable whether that cost should be attributed to the work zone project. The data collection effort for the I-70 site was very significant, as portable surveillance captured the actual diversion flows to alternative routes. The benefit-cost ratio for the I-70 site was 2.1 to 1 if adjusted equipment costs were included and 6.9 to 1 without equipment costs. The safety-focused I-44 ITS deployment had an estimated benefit-cost ratio of 3.2 to 1.
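As a simple numerical illustration of the framework's final step (our sketch; all dollar figures and monetization rates are hypothetical placeholders, not the study's values), the benefit-cost ratio is monetized benefits over deployment costs:

```python
# Toy benefit-cost computation for an ITS work zone deployment: monetize
# performance-measure improvements, then divide by deployment cost. All
# quantities below are hypothetical assumptions for illustration only.

delay_hours_saved = 12_000   # vehicle-hours of delay avoided (assumed)
value_of_time = 18.0         # $/vehicle-hour (assumed)
crashes_avoided = 3          # crash-frequency improvement (assumed)
cost_per_crash = 60_000.0    # $ per crash (assumed)

benefits = (delay_hours_saved * value_of_time
            + crashes_avoided * cost_per_crash)   # $396,000
costs = 150_000.0            # equipment + operations (assumed)

print(f"benefit-cost ratio: {benefits / costs:.1f} to 1")  # 2.6 to 1
```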
Abstract:
This report describes the development of performance measures for the Iowa DOT Construction Offices. The offices are responsible for administering all transportation construction projects for the Iowa DOT. In conjunction with a steering team composed of representatives of the Construction Offices, the research team developed a list of eight key processes and a set of measures for each. Two kinds of data were gathered: baseline data and benchmark data. Baseline data is used to characterize current performance. Benchmark data is gathered to find organizations that have excellent performance records for one or more key functions. This report discusses the methodology used and the results obtained. The data obtained represents the first set of data points. Subsequent years will establish trends for each of the measures, showing improvement or lack of it.
Abstract:
This report describes the continuation of the development of performance measures for the Iowa Department of Transportation (DOT) Offices of Construction. Those offices are responsible for administering transportation construction projects for the Iowa DOT. Researchers worked closely with the Benchmark Steering Team which was formed during Phase I of this project and is composed of representatives of the Offices of Construction. The research team conducted a second survey of Offices of Construction personnel, interviewed numerous members of the Offices and continued to work to improve the eight key processes identified during Phase I of this research. The eight key processes include Inspection of Work, Resolution of Technical Issues, Documentation of Work Progress and Pay Quantities, Employee Training and Development, Continuous Feedback for Improved Contract Documents, Provide Safe Traffic Control, External/Public Communication, and Providing Pre-Letting Information. Three to four measurements were specified for each key process. Many of these measurements required opinion surveys of employees, contractors, and others. During Phase II, researchers concentrated on conducting surveys, interviewing respondents to improve future surveys, and facilitating Benchmark Steering Team monthly meetings. Much effort was placed on using the information collected during the first year's research to improve the effectiveness and efficiency of the Offices of Construction. The results from Process Improvement Teams that studied Traffic Control and Resolution of Technical Issues were used to improve operations.
Abstract:
In this thesis, we study the behavioural aspects of agents interacting in queueing systems, using simulation models and experimental methodologies. Each period, customers must choose a service provider. The objective is to analyse the impact of the customers' and providers' decisions on the formation of queues. In a first case, we consider customers with a certain degree of risk aversion. Based on their perception of the average waiting time and of its variability, they form an estimate of the upper bound of the waiting time at each provider. Each period, they choose the provider for which this estimate is lowest. Our results indicate that there is no monotonic relationship between the degree of risk aversion and overall performance. Indeed, a population of customers with an intermediate degree of risk aversion generally incurs a higher average waiting time than a population of risk-neutral or highly risk-averse agents. Next, we incorporate the providers' decisions by allowing them to adjust their service capacity based on their perception of the average arrival rate. The results show that customer behaviour and provider decisions exhibit strong path dependence. Moreover, we show that the providers' decisions drive the weighted average waiting time towards the market's reference waiting time. Finally, a laboratory experiment in which subjects played the role of a service provider allowed us to conclude that capacity installation and dismantling delays significantly affect performance and the subjects' decisions. In particular, a provider's decisions are influenced by its order backlog, its currently available service capacity, and the capacity adjustment decisions it has taken but not yet implemented. - Queuing is a fact of life that we witness daily. We all have had the experience of waiting in line for some reason, and we also know that it is an annoying situation. As the adage says, "time is money"; this is perhaps the best way of stating what queuing problems mean for customers. Human beings are not very tolerant, but they are even less so when having to wait in line for service. Banks, roads, post offices and restaurants are just some examples where people must wait for service. Studies of queuing phenomena have typically addressed the optimisation of performance measures (e.g. average waiting time, queue length and server utilisation rates) and the analysis of equilibrium solutions. The individual behaviour of the agents involved in queueing systems and their decision-making processes have received little attention. Although this work has been useful to improve the efficiency of many queueing systems, or to design new processes in social and physical systems, it has only provided us with a limited ability to explain the behaviour observed in many real queues. In this dissertation we differ from this traditional research by analysing how the agents involved in the system make decisions instead of focusing on optimising performance measures or analysing an equilibrium solution. This dissertation builds on and extends the framework proposed by van Ackere and Larsen (2004) and van Ackere et al. (2010).
We focus on studying behavioural aspects of queueing systems and incorporate this still underdeveloped framework into the operations management field. In the first chapter of this thesis we provide a general introduction to the area, as well as an overview of the results. In Chapters 2 and 3, we use Cellular Automata (CA) to model service systems where captive interacting customers must decide each period which facility to join for service. They base this decision on their expectations of sojourn times. Each period, customers use new information (their most recent experience and that of their best-performing neighbour) to form expectations of sojourn time at the different facilities. Customers update their expectations using an adaptive expectations process to combine their memory and their new information. We label "conservative" those customers who give more weight to their memory than to the new information. In contrast, when they give more weight to new information, we call them "reactive". In Chapter 2, we consider customers with different degrees of risk-aversion who take into account uncertainty. They choose which facility to join based on an estimated upper bound of the sojourn time, which they compute using their perceptions of the average sojourn time and the level of uncertainty. We assume the same exogenous service capacity for all facilities, which remains constant throughout. We first analyse the collective behaviour generated by the customers' decisions. We show that the system achieves low weighted average sojourn times when the collective behaviour results in neighbourhoods of customers loyal to a facility and the customers are approximately equally split among all facilities. The lowest weighted average sojourn time is achieved when exactly the same number of customers patronises each facility, implying that they do not wish to switch facility. In this case, the system has achieved the Nash equilibrium. We show that there is a non-monotonic relationship between the degree of risk-aversion and system performance. Customers with an intermediate degree of risk-aversion typically achieve higher sojourn times; in particular, they rarely achieve the Nash equilibrium. Risk-neutral customers have the highest probability of achieving the Nash equilibrium. Chapter 3 considers a service system similar to the previous one but with risk-neutral customers, and relaxes the assumption of exogenous service rates. In this sense, we model a queueing system with endogenous service rates by enabling managers to adjust the service capacity of the facilities. We assume that managers do so based on their perceptions of the arrival rates and use the same principle of adaptive expectations to model these perceptions. We consider service systems in which the managers' decisions take time to be implemented. Managers are characterised by a profile which is determined by the speed at which they update their perceptions, the speed at which they take decisions, and how coherent they are in accounting for their previous decisions still to be implemented when taking their next decision. We find that the managers' decisions exhibit a strong path-dependence: owing to the initial conditions of the model, the facilities of managers with identical profiles can evolve completely differently. In some cases the system becomes "locked-in" to a monopoly or duopoly situation.
The competition between managers causes the weighted average sojourn time of the system to converge to the exogenous benchmark value which they use to estimate their desired capacity. Concerning the managers' profile, we found that the more conservative a manager is regarding new information, the larger the market share his facility achieves. Additionally, the faster he takes decisions, the higher the probability that he achieves a monopoly position. In Chapter 4 we consider a one-server queueing system with non-captive customers. We carry out an experiment aimed at analysing the way human subjects, taking on the role of the manager, make decisions in a laboratory regarding the capacity of a service facility. We adapt the model proposed by van Ackere et al. (2010). This model relaxes the assumption of a captive market and allows current customers to decide whether or not to use the facility. Additionally, the facility also has potential customers who currently do not patronise it, but might consider doing so in the future. We identify three groups of subjects whose decisions cause similar behavioural patterns. These groups are labelled gradual investors, lumpy investors, and random investors. Using an autocorrelation analysis of the subjects' decisions, we illustrate that these decisions are positively correlated with the decisions taken one period earlier. Subsequently we formulate a heuristic to model the decision rule used by subjects in the laboratory. We found that this decision rule fits very well for those subjects who gradually adjust capacity, but it does not capture the behaviour of the subjects in the other two groups. In Chapter 5 we summarise the results and provide suggestions for further work. Our main contribution is the use of simulation and experimental methodologies to explain the collective behaviour generated by customers' and managers' decisions in queueing systems, as well as the analysis of the individual behaviour of these agents. In this way, we differ from the typical literature on queueing systems, which focuses on optimising performance measures and the analysis of equilibrium solutions. Our work can be seen as a first step towards understanding the interaction between customer behaviour and the capacity adjustment process in queueing systems. This framework is still in its early stages, and accordingly there is large potential for further work spanning several research topics. Interesting extensions to this work include incorporating other characteristics of queueing systems which affect the customers' experience (e.g. balking, reneging and jockeying); providing customers and managers with additional information for their decisions (e.g. service price, quality, customers' profile); analysing different decision rules; and studying other characteristics which determine the profile of customers and managers.
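The adaptive expectations process referred to throughout this summary has a standard form (our rendering, not taken from the thesis; $\alpha$ is the weight on new information, so "reactive" agents have a high $\alpha$ and "conservative" agents a low one):

```latex
% Adaptive expectations: the next-period expectation blends the most
% recent observation X_t with the current expectation E_t.
E_{t+1} \;=\; \alpha\, X_t \;+\; (1 - \alpha)\, E_t,
\qquad 0 \le \alpha \le 1 .
```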
Abstract:
Effective winter maintenance makes use of freezing-point-depressant chemicals (also known as ice-control products) to prevent the formation of the bond between snow and ice and the highway pavement. In performing such winter maintenance, the selection of appropriate ice-control products for the bond prevention task involves consideration of a number of factors, as indicated in Nixon and Williams (2001). The factors are in essence performance measurements of the ice-control products, and as such can be easily incorporated into a specification document to allow for selection of the best ice-control products for a given agency to use in its winter maintenance activities. Once performance measures for de-icing or anti-icing chemicals have been specified, this allows the creation of a quality control program for the acceptance of those chemicals. This study presents a series of performance measurement tests for ice-control products, and discusses the role that they can play in such a quality control program. Some tests are simple and rapid enough that they can be performed on every load of ice-control products received, while for others, a sampling technique must be used. An appropriate sampling technique is presented. Further, each test is categorized as to whether it should be applied to every load of ice-control products or on a sampling basis. The study includes a detailed literature review that considers the performance of ice-control products in three areas: temperature-related performance, product consistency, and negative side effects. The negative side effects are further broken down into three areas, namely operational side effects (such as chemical slipperiness), environmental side effects, and infrastructural side effects (such as corrosion of vehicles and damage to concrete). The review indicated that in the area of side effects the field performance of ice-control products is currently so difficult to model in the laboratory that no particular specification tests can be recommended at this time. A study of the impact of ice-control products on concrete was performed by Professor Wang of Iowa State University as a sub-contract to this study, and has been presented to the Iowa Highway Research Board prior to this report.