919 results for Adjustment cost models


Relevance:

80.00%

Publisher:

Abstract:

Exascale systems of the future are predicted to have a mean time between failures (MTBF) of less than one hour. Malleable applications, in which the number of processors on which an application executes can be changed during execution, can use their malleability to better tolerate high failure rates. We present AdFT, an adaptive fault tolerance framework for long-running malleable applications that maximizes application performance in the presence of failures. The AdFT framework includes cost models for evaluating the benefits of various fault tolerance actions, including checkpointing, live migration and rescheduling, and runtime decisions for dynamically selecting fault tolerance actions at different points of application execution to maximize performance. Simulations with real and synthetic failure traces show that our approach outperforms existing fault tolerance mechanisms for malleable applications, yielding up to 23% improvement in application performance, and is effective even for petascale systems and beyond.
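
The abstract does not spell out AdFT's cost models, so the following is only a minimal sketch of how an adaptive runtime might compare fault tolerance actions; the functional forms, the coefficients and the Young/Daly-style checkpoint interval are assumptions, not the models defined in the paper.

```python
import math

def expected_overhead(action, mtbf, remaining_work, p):
    """Illustrative expected overhead (seconds) of one fault tolerance action.

    Placeholder cost model: checkpointing uses a Young/Daly-style interval,
    migration and rescheduling use flat costs plus a slowdown term.
    """
    if action == "checkpoint":
        c = p["checkpoint_cost"]
        interval = math.sqrt(2.0 * c * mtbf)               # first-order optimal interval
        n_ckpts = remaining_work / interval                # checkpoints still to write
        rework = (remaining_work / mtbf) * (interval / 2)  # expected lost work after failures
        return n_ckpts * c + rework
    if action == "migrate":
        # Proactive live migration away from nodes predicted to fail, once per expected failure.
        return p["migration_cost"] * (remaining_work / mtbf)
    if action == "reschedule":
        # Shrink the malleable job onto fewer, healthier processors.
        return p["reschedule_cost"] + p["shrink_slowdown"] * remaining_work
    raise ValueError(action)

def choose_action(mtbf, remaining_work, p):
    """Pick the action with the lowest expected overhead at this point of execution."""
    return min(("checkpoint", "migrate", "reschedule"),
               key=lambda a: expected_overhead(a, mtbf, remaining_work, p))

print(choose_action(mtbf=3600, remaining_work=6 * 3600,
                    p={"checkpoint_cost": 120, "migration_cost": 300,
                       "reschedule_cost": 600, "shrink_slowdown": 0.15}))
```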

Relevance:

80.00%

Publisher:

Abstract:

Professionals who are responsible for coastal environmental and natural resource planning and management need to become conversant with new concepts designed to provide quantitative measures of the environmental benefits of natural resources. These amenities range from beaches to wetlands to clean water and other assets that normally are not bought and sold in everyday markets. At all levels of government — from federal agencies to townships and counties — decision-makers are being asked to account for the costs and benefits of proposed actions. To non-specialists, the tools of professional economists are often poorly understood and sometimes inappropriate for the problem at hand. This handbook is intended to bridge this gap. The most widely used organizing tool for dealing with natural and environmental resource choices is benefit-cost analysis — it offers a convenient way to carefully identify and array, quantitatively if possible, the major costs, benefits, and consequences of a proposed policy or regulation. The major strength of benefit-cost analysis is not necessarily the predicted outcome, which depends upon assumptions and techniques, but the process itself, which forces an approach to decision-making that is based largely on rigorous and quantitative reasoning. However, a major shortfall of benefit-cost analysis has been the difficulty of quantifying both benefits and costs of actions that affect environmental assets not normally, nor even regularly, bought and sold in markets. Failure to account for these assets, that is, to omit them from the benefit-cost equation, could seriously bias decision-making, often to the detriment of the environment. Economists and other social scientists have put a great deal of effort into addressing this shortcoming by developing techniques to quantify these non-market benefits. The major focus of this handbook is on introducing and illustrating concepts of environmental valuation, among them Travel Cost models and Contingent Valuation. These concepts, combined with advances in the natural sciences that allow us to better understand how changes in the natural environment influence human behavior, aim to address some of the more serious shortcomings in the application of economic analysis to natural resource and environmental management and policy analysis. Because the handbook is intended for non-economists, it addresses basic concepts of economic value, such as willingness-to-pay, and other tools often used in decision making, such as cost-effectiveness analysis, economic impact analysis, and sustainable development. A number of regionally oriented case studies are included to illustrate the practical application of these concepts and techniques.
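
As a concrete illustration of the kind of arithmetic the handbook introduces (not a quotation from it), a benefit-cost test and a single-site travel cost demand model can be written as below; the semi-log demand form is just one common specification, and all symbols are generic.

```latex
% Benefit-cost test: a policy passes when discounted benefits exceed discounted costs.
\[
  \mathrm{NPV} = \sum_{t=0}^{T} \frac{B_t - C_t}{(1+r)^{t}} > 0
\]
% Single-site travel cost model (semi-log specification): trips q_i taken by
% visitor i as a function of round-trip travel cost p_i and characteristics x_i.
\[
  \ln q_i = \alpha - \beta\, p_i + \gamma^{\top} x_i + \varepsilon_i ,
  \qquad \text{consumer surplus per trip} = 1/\beta
\]
```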

Relevance:

80.00%

Publisher:

Abstract:

Increasingly, infrastructure providers are supplying the cloud marketplace with storage and on-demand compute resources to host cloud applications. From an application user's point of view, it is desirable to identify the most appropriate set of available resources on which to execute an application. Resource choice can be complex and may involve comparing available hardware specifications, operating systems, value-added services such as network configuration or data replication, and operating costs such as hosting cost and data throughput. Providers' cost models often change, and new commodity cost models, such as spot pricing, have been introduced to offer significant savings. In this paper, a software abstraction layer is used to discover infrastructure resources for a particular application, across multiple providers, using a two-phase constraints-based approach. In the first phase, a set of possible infrastructure resources is identified for a given application. In the second phase, a heuristic is used to select the most appropriate resources from the initial set. For some applications a cost-based heuristic is most appropriate; for others a performance-based heuristic may be used. A financial services application and a high performance computing application are used to illustrate the execution of the proposed resource discovery mechanism. The experimental results show that the proposed model can dynamically select an appropriate set of resources that matches the application's requirements.
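
The paper's abstraction layer is not described in detail in the abstract, so the sketch below only illustrates the two-phase idea: a constraint filter followed by a cost- or performance-based heuristic. The class fields, catalogue entries and prices are invented, not the paper's data model.

```python
from dataclasses import dataclass

@dataclass
class Resource:
    provider: str
    vcpus: int
    ram_gb: int
    hourly_cost: float      # on-demand or spot price
    benchmark_score: float  # higher is faster

def discover(resources, requirements):
    """Phase 1: keep only resources satisfying the application's hard constraints."""
    return [r for r in resources
            if r.vcpus >= requirements["min_vcpus"]
            and r.ram_gb >= requirements["min_ram_gb"]]

def select(candidates, heuristic="cost"):
    """Phase 2: rank the feasible set with a cost- or performance-based heuristic."""
    if heuristic == "cost":
        return min(candidates, key=lambda r: r.hourly_cost)
    if heuristic == "performance":
        return max(candidates, key=lambda r: r.benchmark_score)
    raise ValueError(heuristic)

catalogue = [
    Resource("providerA", 4, 16, 0.19, 310.0),
    Resource("providerB", 8, 32, 0.42, 540.0),
    Resource("providerB-spot", 8, 32, 0.13, 540.0),
]
feasible = discover(catalogue, {"min_vcpus": 4, "min_ram_gb": 16})
print(select(feasible, "cost"), select(feasible, "performance"))
```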

Relevance:

80.00%

Publisher:

Abstract:

We describe an approach aimed at addressing the joint exploitation of control (stream) and data parallelism in a skeleton-based parallel programming environment, based on annotations and refactoring. Annotations drive the efficient implementation of a parallel computation. Refactoring is used to transform the associated skeleton tree into a more efficient, functionally equivalent skeleton tree. In most cases, cost models are used to drive the refactoring process. We show how sample use-case applications and kernels may be optimized, and we discuss preliminary experiments with FastFlow assessing the theoretical results. © 2013 Springer-Verlag.
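
To make the idea concrete, here is a toy skeleton cost model and one rewrite rule (fusing a pipeline of sequential stages into a farm) driven by it. The cost formulas and the rule are illustrative assumptions, not those of the paper or of FastFlow.

```python
# Toy skeleton cost model: service time (seconds per task) of a skeleton tree.

def seq(t):                   # sequential stage with per-task time t
    return ("seq", t)

def pipe(*stages):            # pipeline: service time = slowest stage
    return ("pipe", list(stages))

def farm(worker, nw):         # farm: worker replicated on nw workers
    return ("farm", worker, nw)

def service_time(node):
    kind = node[0]
    if kind == "seq":
        return node[1]
    if kind == "pipe":
        return max(service_time(s) for s in node[1])
    if kind == "farm":
        return service_time(node[1]) / node[2]
    raise ValueError(kind)

def refactor(node, nw):
    """Rewrite pipe(f, g, ...) into farm(seq(f;g;...), nw) when the model predicts a win."""
    if node[0] == "pipe" and all(s[0] == "seq" for s in node[1]):
        fused = seq(sum(s[1] for s in node[1]))
        candidate = farm(fused, nw)
        if service_time(candidate) < service_time(node):
            return candidate
    return node

original = pipe(seq(2.0), seq(5.0))
print(service_time(original), service_time(refactor(original, 4)))  # 5.0 -> 1.75
```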

Relevance:

80.00%

Publisher:

Abstract:

In image synthesis, reproducing the complex effects of light on translucent materials, such as wax, marble, or skin, contributes greatly to the realism of an image. Unfortunately, this added realism is computationally expensive. Models based on diffusion theory aim to reduce this cost by simulating the physical behavior of subsurface light transport while imposing smoothness constraints on the incident and outgoing light. An important component of these models is their use in hierarchically evaluating the numerical integral of the illumination over an object's surface. This thesis first reviews the current literature on the realistic simulation of translucency, before investigating in greater depth the application and extensions of diffusion models in image synthesis. We then propose and evaluate a new hierarchical numerical integration technique that uses a new frequency analysis of the outgoing and incident light to efficiently adapt the sampling rate during integration. We apply this theory to several state-of-the-art diffusion models, offering a possible improvement to their efficiency and accuracy.
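
The thesis's frequency analysis is not reproduced here; the sketch below only illustrates the general idea of hierarchical integration that refines the sampling rate where the integrand varies quickly, on a one-dimensional toy integrand.

```python
import math

def integrate(f, a, b, tol=1e-4, depth=0, max_depth=20):
    """Hierarchical midpoint integration: refine where coarse and fine estimates disagree."""
    mid = 0.5 * (a + b)
    coarse = (b - a) * f(mid)
    fine = 0.5 * (b - a) * (f(0.5 * (a + mid)) + f(0.5 * (mid + b)))
    if abs(fine - coarse) < tol or depth >= max_depth:
        return fine
    # Large disagreement signals high local variation: split and recurse.
    return (integrate(f, a, mid, tol / 2, depth + 1, max_depth)
            + integrate(f, mid, b, tol / 2, depth + 1, max_depth))

# Sharply varying integrand: most refinement happens near x = 0.
print(integrate(lambda x: math.exp(-8.0 * x), 0.0, 4.0))   # ~ (1 - e^-32) / 8 = 0.125
```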

Relevance:

80.00%

Publisher:

Abstract:

Identification of cost-determinant variables and evaluation of their degree of influence play an essential role in building reliable cost models and in enhancing the competitive edge of quantity surveyors as well as contracting organisations. Sixty-seven variables affecting pre-tender construction cost estimates are identified through the literature and interviews. These factors are grouped into six categories, and a comparative analysis of their impact is conducted. Priority ranking of the cost-influencing factors is carried out using a questionnaire survey conducted amongst quantity surveyors based in the UK. The findings of this survey indicate that there is strong agreement between quantity surveyors in ranking the cost-influencing factors of construction projects. Comparisons between the outcomes of this research and other related studies are presented.
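
One common way to quantify agreement among respondents who rank the same set of factors is Kendall's coefficient of concordance W; the snippet below uses made-up ranks and is not a reproduction of the study's own analysis or statistic.

```python
def kendalls_w(rankings):
    """Kendall's W for untied rankings: list of m rank lists, each a permutation of 1..n."""
    m, n = len(rankings), len(rankings[0])
    rank_sums = [sum(r[j] for r in rankings) for j in range(n)]
    mean = m * (n + 1) / 2.0
    s = sum((rs - mean) ** 2 for rs in rank_sums)
    return 12.0 * s / (m ** 2 * (n ** 3 - n))   # W in [0, 1], 1 = full agreement

# Hypothetical rankings of five cost factors by three surveyors.
surveyors = [
    [1, 2, 3, 4, 5],
    [1, 3, 2, 4, 5],
    [2, 1, 3, 5, 4],
]
print(round(kendalls_w(surveyors), 3))   # 0.844
```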

Relevance:

80.00%

Publisher:

Abstract:

Constructing a building is a long process that can take several years. Most building services products are installed while a building is constructed, but they are not operated until the building is commissioned. The warranty term for the building services systems may cover the time from their installation to the end of the warranty period. Prior to the commissioning of the building, the building services systems are protected by warranty although they are not operated. The burn-in time for such systems is important when warranty cost is analyzed. In this paper, warranty cost models for products with burn-in periods are presented. Two burn-in policies are developed to optimize the total mean warranty cost. A special case on the relationship between the failure rates of the product in the dormant state and in the operating state is presented.
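
As a hedged sketch of the kind of expression such models involve (the paper's own policies are not reproduced here), assume failures are minimally repaired at cost c_r, the dormant-state hazard is a fraction rho of the operating hazard, and a warranty of length w measured from installation covers a dormant period of length tau followed by operation:

```latex
% Cumulative operating hazard: \Lambda_o(t) = \int_0^t \lambda_o(u)\,du .
\[
  E[C(w)] = c_r\!\left( \int_0^{\tau} \rho\,\lambda_o(t)\,dt
            + \int_{\tau}^{w} \lambda_o(t)\,dt \right)
          = c_r \bigl( \rho\,\Lambda_o(\tau) + \Lambda_o(w) - \Lambda_o(\tau) \bigr)
\]
% A burn-in of length b before installation shifts the hazard to \lambda_o(t + b);
% a burn-in policy then trades the burn-in cost c_b\,b against the reduction in E[C(w)].
```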

Relevance:

80.00%

Publisher:

Abstract:

Macadamia processing units use separate drying silos for each drying stage in order to preserve product quality by reducing moisture to desirable levels. Given the need to quantify the resistance offered by the nuts when subjected to different airflow rates during drying, and to evaluate the possibility of using empirical models that estimate the pressure gradient from the airflow rate, several laboratory tests were conducted to obtain experimental data and fit models. Macadamia (M. integrifolia) nuts with a moisture content of 0.11 (dry basis), after cleaning and grading, were placed inside a prototype consisting of a galvanized-sheet column (with taps for measuring static pressure), a plenum, and a fan, and were subjected to different airflow rates. The tests consisted of three measurements per depth for each of three batches of nuts, giving a total of nine static pressure measurements per depth in the column. The results showed that the tested airflow rates had a significant effect on the static pressure drop in the macadamia column, which increased linearly with depth. The experimental data fit the Shedd and Hunter models very well, suggesting their good applicability to macadamia.
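
As an illustration of fitting an empirical airflow-resistance relation of the Shedd type (a power law linking airflow rate and pressure gradient), here is a sketch with invented data; the power-law form, the data values and the coefficients are assumptions and do not reproduce the functional forms or results of the study.

```python
import numpy as np

# Hypothetical data: airflow rate Q (m^3 s^-1 m^-2) versus pressure drop per
# unit depth dP/L (Pa m^-1). Only the fitting procedure is illustrated.
# Assumed Shedd-type model: dP/L = a * Q**b, which is linear on log-log axes.
Q   = np.array([0.05, 0.10, 0.20, 0.40, 0.80])
dPL = np.array([4.1, 10.2, 26.0, 66.5, 168.0])

b, log_a = np.polyfit(np.log(Q), np.log(dPL), 1)   # slope and intercept in log space
a = np.exp(log_a)
print(f"dP/L ~ {a:.1f} * Q**{b:.2f}")

# Predicted pressure gradient at an intermediate airflow rate:
print(a * 0.30 ** b)
```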

Relevance:

80.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

80.00%

Publisher:

Abstract:

Includes bibliography

Relevance:

80.00%

Publisher:

Abstract:

Due to highly erodible volcanic soils and a harsh climate, livestock grazing in Iceland has led to serious soil erosion on about 40% of the country's surface. Over the last 100 years, various revegetation and restoration measures were taken on large areas distributed all over Iceland in an attempt to counteract this problem. The present research aimed to develop models for estimating percent vegetation cover (VC) and aboveground biomass (AGB) based on satellite data, as this would make it possible to assess and monitor the effectiveness of restoration measures over large areas at a fairly low cost. Models were developed based on 203 vegetation cover samples and 114 aboveground biomass samples distributed over five SPOT satellite datasets. All satellite datasets were atmospherically corrected, and digital numbers were converted into ground reflectance. Then a selection of vegetation indices (VIs) was calculated, followed by simple and multiple linear regression analysis of the relations between the field data and the calculated VIs. Best results were achieved using multiple linear regression models for both %VC and AGB. The model calibration and validation results showed that R² and RMSE values for most VIs do not vary very much. For percent VC, R² values range between 0.789 and 0.822, leading to RMSEs ranging between 15.89% and 16.72%. For AGB, R² values for low-biomass areas (AGB < 800 g/m²) range between 0.607 and 0.650, leading to RMSEs ranging between 126.08 g/m² and 136.38 g/m². The AGB model developed for all areas, including those with high biomass coverage (AGB > 800 g/m²), achieved R² values between 0.487 and 0.510, resulting in RMSEs ranging from 234 g/m² to 259.20 g/m². The models predicting percent VC generally overestimate observed low percent VC and slightly underestimate observed high percent VC. The estimation models for AGB behave in a similar way, but over- and underestimation are much more pronounced. These results show that it is possible to estimate percent VC with high accuracy based on various VIs derived from SPOT satellite data. AGB of restoration areas with low-biomass values of up to 800 g/m² can likewise be estimated with high accuracy based on various VIs derived from SPOT satellite data, whereas in the case of high biomass coverage, estimation accuracy decreases with increasing biomass values. Accordingly, percent VC can be estimated with high accuracy anywhere in Iceland, whereas AGB is much more difficult to estimate, particularly for areas with high-AGB variability.
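
A minimal sketch of the modelling chain described above, with invented reflectance and field values: compute a couple of vegetation indices from ground reflectance and fit a multiple linear regression. The actual SPOT bands, index set and coefficients of the study are not reproduced here.

```python
import numpy as np

red = np.array([0.12, 0.10, 0.08, 0.05, 0.04])   # red reflectance (made up)
nir = np.array([0.18, 0.25, 0.32, 0.40, 0.45])   # near-infrared reflectance (made up)
vc  = np.array([22.0, 41.0, 58.0, 81.0, 90.0])   # observed percent vegetation cover (made up)

ndvi = (nir - red) / (nir + red)
savi = 1.5 * (nir - red) / (nir + red + 0.5)      # soil-adjusted index with L = 0.5

# Multiple linear regression of %VC on the two indices.
X = np.column_stack([np.ones_like(ndvi), ndvi, savi])
coef, *_ = np.linalg.lstsq(X, vc, rcond=None)
pred = X @ coef
rmse = np.sqrt(np.mean((vc - pred) ** 2))
r2 = 1.0 - np.sum((vc - pred) ** 2) / np.sum((vc - vc.mean()) ** 2)
print(coef, round(rmse, 2), round(r2, 3))
```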

Relevance:

80.00%

Publisher:

Abstract:

Characteristics of Medicare-certified home health agencies in Texas and the contributions of selected agency characteristics to home health care costs were examined. Cost models were developed and estimated for both nursing and total visit costs using multiple regression procedures. The models included home health agency size, profit status, control, hospital-based affiliation, contract-cost ratio, service provision, competition, urban-rural input-price differences, and selected measures of patient case-mix. The study population comprised 314 home health agencies in Texas that had been certified for at least one year as of July 1, 1986. Data for the analysis were obtained from Medicare Cost Reports for fiscal years ending between July 1, 1985 and June 30, 1986. Home health agency size, as measured by the logs of nursing and total visits, has a statistically significant negative linear relationship with nursing visit and total visit costs. Nursing and total visit costs decrease at a declining rate as size increases. The size-cost relationship is not altered when controlling for any other agency characteristic. The number of visits per patient per year, a measure of patient case-mix, is also negatively related to costs, suggesting that costs decline with the care of chronic patients. Hospital-based affiliation and urban location are positively associated with costs. Together, the four characteristics explain 19 percent of the variance in nursing visit costs and 24 percent of the variance in total visit costs. Profit status and control, although correlated with other agency characteristics, exhibit no observable effect on costs. Although no relationship was found between costs and competition, contract-cost ratio, or the provision of non-reimbursable services, no conclusions can be made due to problems with the measurement of these variables.
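
To illustrate the specification (not the study's data or estimates), the sketch below regresses per-visit cost on the log of agency size plus indicator variables, using simulated values; the variable names and coefficients are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 314                                              # matches the sample size only
log_visits = rng.normal(9.0, 1.0, n)                 # log of total visits (size)
hospital_based = rng.integers(0, 2, n)               # 1 = hospital-based affiliation
urban = rng.integers(0, 2, n)                        # 1 = urban location
visits_per_patient = rng.normal(30.0, 8.0, n)        # case-mix proxy

# Simulated per-visit cost with made-up coefficients and noise.
cost_per_visit = (80.0 - 4.0 * log_visits + 6.0 * hospital_based
                  + 5.0 * urban - 0.3 * visits_per_patient
                  + rng.normal(0.0, 5.0, n))

X = np.column_stack([np.ones(n), log_visits, hospital_based, urban, visits_per_patient])
beta, *_ = np.linalg.lstsq(X, cost_per_visit, rcond=None)
print(dict(zip(["const", "log_visits", "hospital", "urban", "visits_per_patient"],
               beta.round(2))))
```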

Relevance:

80.00%

Publisher:

Abstract:

Irregular computations pose some of the most interesting and challenging problems in automatic parallelization. Irregularity appears in certain kinds of numerical problems and is pervasive in symbolic applications. Such computations often use dynamic data structures, which make heavy use of pointers. This complicates all the steps of a parallelizing compiler, from independence detection to task partitioning and placement. Starting in the mid 80s there has been significant progress in the development of parallelizing compilers for logic programming (and more recently, constraint programming), resulting in quite capable parallelizers. The typical applications of these paradigms frequently involve irregular computations and make heavy use of dynamic data structures with pointers, since logical variables represent in practice a well-behaved form of pointers. This arguably makes the techniques used in these compilers potentially interesting. In this paper, we introduce, in a tutorial way, some of the problems faced by parallelizing compilers for logic and constraint programs and provide pointers to some of the significant progress made in the area. In particular, this work has resulted in a series of achievements in the areas of inter-procedural pointer aliasing analysis for independence detection, cost models and cost analysis, cactus-stack memory management, techniques for managing speculative and irregular computations through task granularity control, and dynamic task allocation (such as work-stealing schedulers), etc.
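
As a toy illustration of task granularity control (one of the techniques listed above), the sketch below splits work recursively but only creates parallel tasks for chunks whose estimated cost exceeds a threshold; the threshold, the cost estimate and the thread-pool runtime are placeholders rather than any particular compiler's machinery, and the point is the decision rule, not actual speedup.

```python
from concurrent.futures import ThreadPoolExecutor

THRESHOLD = 10_000      # below this estimated cost a chunk becomes a sequential leaf

def split(xs):
    """Recursively split until chunks are small enough to run sequentially."""
    if len(xs) < THRESHOLD:
        return [xs]
    mid = len(xs) // 2
    return split(xs[:mid]) + split(xs[mid:])

def par_sum(xs, pool):
    chunks = split(xs)                      # granularity control happens here
    return sum(pool.map(sum, chunks))       # only coarse-grained leaf tasks are scheduled

with ThreadPoolExecutor(max_workers=4) as pool:
    print(par_sum(list(range(100_000)), pool))   # 4999950000
```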

Relevance:

80.00%

Publisher:

Abstract:

In society's overall drive for continuous improvement towards quality, the construction sector, and more specifically the work of architects in drafting projects, cannot and must not stand aside. This research outlines a procedure for the technical control of projects and demonstrates the effectiveness and profitability of this or any other control method, putting forward an approach to the quality cost models of architecture offices. The working method planned for the thesis rests mainly on defining a general procedure for reviewing projects, typifying the main errors (a system of inspection points), analysing the causes that generate them and their repercussions on schedule, durability and client satisfaction, and defining a quantification method that approximates the (economic) importance of the errors and indefinitions detected. To demonstrate the validity of the initial hypothesis on the profitability of applying a system of technical project control, part of the general procedure, particularised for the systematic evaluation of the problems, indefinitions and faults detected, was applied; we call it, for short, the Particular Method. It is applied to a sample of reviewed projects which, so that the results of the analysis would be representative, were selected at random and correspond, in their defining variables, to the population under study: all housing execution projects produced in the territory of the Community of Madrid between 1990 and 1995. Moreover, this representativeness depends on how accurately the cost overruns that project errors and indefinitions may generate are valued, so it is essential to make the valuation criteria as objective as possible. With the data generated in the review of each project, the full set of results for the sample under study is analysed, drawing conclusions on the frequency and importance of the errors, the incidence of the influencing variables, and possible combinations of variables that increase risk. Extrapolating the analysis to the general method and to the population under study, significant conclusions are drawn about the effectiveness of the technical control of projects and about how to optimise the investment made in it, selectively refining and optimising the general method defined. The analysis of quality cost models shows that, by investing in systems to assure the quality of projects or, more modestly, by controlling their technical quality, architecture offices not only offer a better service to their clients, with the continuity in the market that this implies, but also significantly improve their competitiveness and profit margin, proving that such investments are highly profitable both for architects themselves and for society in general.
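
A minimal quality-cost comparison of the kind the thesis argues for can be written as follows; the symbols are invented for illustration and are not the thesis's model.

```latex
% Technical control of a project pays off when the review cost is smaller than
% the expected failure cost it avoids:
\[
  C_{\mathrm{rev}} < \sum_{i} p_i \, d_i \, F_i
\]
% p_i : probability that error type i would otherwise reach construction,
% d_i : fraction of such errors the review detects,
% F_i : average rework or overrun cost caused by error type i.
```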