918 results for Benefit Cost Analysis


Relevance:

90.00%

Publisher:

Abstract:

Automatic cost analysis of programs has traditionally been studied in terms of a number of concrete, predefined resources such as execution steps, time, or memory. However, the increasing relevance of analysis applications such as static debugging and/or certification of user-level properties (including for mobile code) makes it interesting to develop analyses for resource notions that are actually application-dependent. These may include, for example, bytes sent or received by an application, number of files left open, number of SMSs sent or received, number of accesses to a database, money spent, energy consumption, etc. We present a fully automated analysis for inferring upper bounds on the usage that a Java bytecode program makes of a set of application programmer-definable resources. In our context, a resource is defined by programmer-provided annotations which state the basic consumption that certain program elements make of that resource. From these definitions our analysis derives functions which return an upper bound on the usage that the whole program (and individual blocks) makes of that resource for any given set of input data sizes. The analysis proposed is independent of the particular resource. We also present some experimental results from a prototype implementation of the approach covering an ample set of interesting resources.
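The annotation-driven scheme the abstract describes can be pictured with a toy sketch. The real analysis works on Java bytecode and derives the bound functions automatically; the resource name, cost annotations, and the hand-written bound below are all invented for illustration:

```python
# Programmer-provided annotations: basic consumption that certain
# program elements make of a user-defined resource ("bytes sent").
# All names and cost values here are hypothetical.
COST = {"send_header": 16, "send_record": 40}

def upper_bound_bytes_sent(n_records: int) -> int:
    """Upper-bound function such an analysis would derive for a
    program that sends one header and then one record per input
    element: bound(n) = 16 + 40 * n."""
    return COST["send_header"] + COST["send_record"] * n_records
```

In the actual system such bound functions are inferred for any given set of input data sizes rather than written by hand.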

Relevance:

90.00%

Publisher:

Abstract:

Resource Analysis (a.k.a. Cost Analysis) tries to approximate the cost of executing programs as functions on their input data sizes, without actually having to execute the programs. While a powerful resource analysis framework for object-oriented programs existed before this thesis, advanced aspects that improve the efficiency, the accuracy and the reliability of the analysis results still need to be further investigated. This thesis tackles this need from four different perspectives. (1) Shared mutable data structures are the bane of formal reasoning and static analysis. Analyses which keep track of heap-allocated data are referred to as heap-sensitive. Recent work proposes locality conditions for soundly tracking field accesses by means of ghost non-heap-allocated variables. In this thesis we present two extensions to this approach: the first is to consider array accesses (in addition to object fields), while the second focuses on handling cases for which the locality conditions cannot be proven unconditionally, by finding aliasing preconditions under which tracking such heap locations is feasible. (2) The aim of incremental analysis is, given a program, its analysis results and a series of changes to the program, to obtain the new analysis results as efficiently as possible and, ideally, without having to re-analyze fragments of code that are not affected by the changes. During software development, programs are permanently modified, but most analyzers still read and analyze the entire program at once in a non-incremental way. This thesis presents an incremental resource usage analysis which, after a change in the program is made, is able to reconstruct the upper bounds of all affected methods incrementally. To this purpose, we propose (i) a multi-domain incremental fixed-point algorithm which can be used by all global analyses required to infer the cost, and (ii) a novel form of cost summaries that allows us to incrementally reconstruct only those components of cost functions affected by the change. (3) Resource guarantees that are automatically inferred by static analysis tools are generally not considered completely trustworthy unless the tool implementation or the results are formally verified. Performing full-blown verification of such tools is a daunting task, since they are large and complex. In this thesis we focus on developing a formal framework for verifying the resource guarantees obtained by the analyzers, instead of verifying the tools themselves. We have implemented this idea using COSTA, a state-of-the-art cost analyzer for Java programs, and KeY, a state-of-the-art verification tool for Java source code. COSTA derives upper bounds for Java programs, while KeY proves the validity of these bounds and provides a certificate; the main contribution of our work is to show that this tool cooperation can automatically produce verified resource guarantees. (4) Distribution and concurrency are today mainstream. Concurrent objects form a well-established model for distributed concurrent systems: objects are the concurrency units, and they communicate via asynchronous method calls. Distribution suggests that the analysis must infer the cost of the diverse distributed components separately. In this thesis we propose a novel object-sensitive cost analysis which, by using the results gathered by a points-to analysis, keeps the cost of the diverse distributed components separate.
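The "re-analyze only what the change affects" idea behind incremental analysis can be sketched as a transitive-caller computation over a call graph. This is a simplified illustration, not COSTA's actual algorithm; the helper name and the example graph are invented:

```python
def affected_methods(call_graph, changed):
    """Return the changed methods plus all their transitive callers;
    only these need to be re-analyzed after the change."""
    # Invert the call graph: for each method, who calls it.
    callers = {m: set() for m in call_graph}
    for m, callees in call_graph.items():
        for c in callees:
            callers.setdefault(c, set()).add(m)
    # Worklist propagation from the changed methods upward.
    work, seen = list(changed), set(changed)
    while work:
        m = work.pop()
        for caller in callers.get(m, ()):
            if caller not in seen:
                seen.add(caller)
                work.append(caller)
    return seen

# Hypothetical call graph: main calls f and g; f calls h.
CALL_GRAPH = {"main": ["f", "g"], "f": ["h"], "g": [], "h": []}
```

With this graph, a change to `h` invalidates only `h`, `f`, and `main`; the results previously computed for `g` can be reused untouched.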

Relevance:

90.00%

Publisher:

Abstract:

This article presents an alternative approach to the decision-making process in transport strategy design. The study explores the possibility of integrating forecasting, assessment and optimization procedures in support of a decision-making process designed to reach the best achievable scenario through mobility policies. Long-term evaluation, as required by a dynamic system such as a city, is provided by a strategic Land-Use and Transport Interaction (LUTI) model. The social welfare achieved by implementing mobility LUTI model policies is measured through a cost-benefit analysis and maximized through an optimization process throughout the evaluation period. The method is tested by optimizing a pricing policy scheme in Madrid on a cordon toll in a context requiring system efficiency, social equity and environmental quality. The optimized scheme yields an appreciable increase in social surplus through a relatively low rate compared to other similar pricing toll schemes. The results highlight the different considerations regarding mobility impacts on the case study area, as well as the major contributors to social welfare surplus. This leads the authors to reconsider the cost-analysis approach, as defined in the study, as the best option for formulating sustainability measures.
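The optimization loop the abstract describes (maximizing social welfare over a pricing policy) can be sketched as a grid search over toll levels with a toy welfare function. Everything below (demand, elasticity, cost coefficients) is invented for illustration and has no relation to the Madrid LUTI model's actual figures:

```python
def social_surplus(toll, demand0=100_000, elasticity=-0.3,
                   ext_cost=0.8, op_cost=0.2):
    """Toy welfare measure for a cordon toll: toll revenue plus the
    avoided external cost of suppressed trips, minus the scheme's
    operating cost. All coefficients are hypothetical."""
    trips = demand0 * max(0.0, 1 + elasticity * toll)
    suppressed = demand0 - trips
    return toll * trips + ext_cost * suppressed - op_cost * demand0

def best_toll(candidate_tolls):
    """Grid-search stand-in for the optimization process."""
    return max(candidate_tolls, key=social_surplus)
```

Even this crude sketch shows the trade-off the study optimizes: a higher toll raises revenue per trip and avoids external costs, but suppresses the demand that generates the revenue.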

Relevance:

90.00%

Publisher:

Abstract:

Service-Oriented Computing (SOC) is a widely accepted paradigm for the development of flexible, distributed and adaptable software systems, in which service compositions perform more complex, higher-level, often cross-organizational tasks using atomic services or other service compositions. In such systems, Quality of Service (QoS) properties, such as performance, cost, availability or security, are critical for the usability of services and their compositions in concrete applications. Analysis of these properties can become more precise and richer in information if it employs program analysis techniques, such as complexity and sharing analyses, which are able to take into account simultaneously the control and data structures, dependencies, and operations in a composition. Computation cost analysis for service compositions can support predictive monitoring and proactive adaptation by automatically inferring computation cost as upper- and lower-bound functions of the value or size of input messages. These cost functions can be used for adaptation by selecting service candidates that minimize the total cost of the composition, based on the actual data that is passed to them. They can also be combined with empirically collected infrastructural parameters to produce QoS bound functions of the input data, which can be used to predict, at the moment of invocation, potential or imminent Service Level Agreement (SLA) violations. In mission-critical applications, effective and accurate continuous QoS prediction can be achieved by constraint modeling of the composition's QoS based on its structure, data known at run time, and (when available) the results of complexity analysis. This approach can be applied to service orchestrations with centralized flow control, as well as to choreographies with multiple participants and complex stateful interactions. Sharing analysis can support adaptation actions, such as parallelization, fragmentation, and component selection, which are based on functional dependencies and on the information content of the composition's messages, internal data, and activities, in the presence of complex control constructs such as loops, branches, and sub-workflows. Both the functional dependencies and the information content (described using user-defined attributes) can be expressed using a first-order logic (Horn clause) representation, and the analysis results can be interpreted as lattice-based conceptual models.
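The candidate-selection use of cost bounds described above can be sketched directly: evaluate each candidate's cost-bound function on the actual input size and pick the cheapest. The service names and cost coefficients below are invented:

```python
# Hypothetical upper-bound cost functions of input message size n,
# one per candidate service.
CANDIDATES = {
    "svcA": lambda n: 5 + 2.0 * n,    # cheap setup, costly per unit
    "svcB": lambda n: 50 + 0.5 * n,   # costly setup, cheap per unit
}

def select_service(n):
    """Pick the candidate minimizing the cost bound for the actual
    input size n that will be passed to the service."""
    return min(CANDIDATES, key=lambda s: CANDIDATES[s](n))
```

The crossover behaviour is the point: for small messages the low-setup service wins, for large messages the low-per-unit one does, and the bound functions let the composition decide per invocation.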


Relevance:

90.00%

Publisher:

Abstract:

Description based on: 1982; title from cover.

Relevance:

90.00%

Publisher:

Abstract:

This paper addresses the theme of retrofit applied to buildings belonging to the modernist architectural production of historical interest located in the urban area of Natal. The overall objective is to identify and harmonize procedures for retrofit and architectural heritage preservation, using elements of analysis of constructive expression and the Benefit Cost Ratio (BCR) parameters established by the National Electric Energy Agency (ANEEL). The hypothesis put forward is that, by bringing this analysis into the design stage of retrofit interventions, it is possible to obtain projects with better BCR results that also address the issues of architectural heritage preservation. To this end, analyses were developed of process solutions and of proposed actions on elements and systems that seek to improve the energy performance of the building while restoring or preserving its architectural elements. The performance of the proposed interventions was assessed through computer simulations with tools such as DesignBuilder, Solar and Sun Tool. The energy results were converted into the BCR parameter and compared with the constructive expression of the project before and after the intervention. From the results, a graph was constructed comparing the BCR and the constructive expression of the simulated interventions.

Relevance:

90.00%

Publisher:

Abstract:

Algae biodiesel is a promising but expensive alternative fuel to petro-diesel. To overcome cost barriers, detailed cost analyses are needed. A decade-old cost analysis by the U.S. National Renewable Energy Laboratory indicated that the costs of algae biodiesel were in the range of $0.53–0.85/L (2012 USD values). However, the costs of land and transesterification were only roughly estimated. In this study, an updated comprehensive techno-economic analysis was conducted with optimized processes and improved cost estimations. The latest process improvements, quotes from vendors, government databases, and other relevant data sources were used to calculate the updated algal biodiesel costs, and the final costs of biodiesel are in the range of $0.42–0.97/L. Additional improvements for cost-effective algae cultivation and biodiesel production around the globe were also recommended. Overall, the calculated costs seem promising, suggesting that a single-step biodiesel production process is close to commercial reality.
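The per-liter figures in a techno-economic analysis boil down to a levelized unit cost: annualized capital plus operating expenses divided by annual output. A minimal sketch, with all input values invented (they are not NREL's or this study's numbers):

```python
def biodiesel_cost_per_liter(capex_annualized, annual_opex,
                             liters_per_year):
    """Levelized unit cost of biodiesel in $/L: annualized capital
    cost plus annual operating cost, divided by annual production.
    All inputs here are hypothetical placeholders."""
    return (capex_annualized + annual_opex) / liters_per_year
```

Updating any component (cheaper land, a one-step transesterification process, better vendor quotes) simply shifts one of the numerator terms, which is why improved cost estimations move the whole range.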

Relevance:

90.00%

Publisher:

Abstract:

This paper presents a methodology to explore the impact on poverty of public spending on education. The methodology consists of two approaches: Benefit Incidence Analysis (BIA) and a behavioral approach. BIA considers the cost and use of the educational service, and the distribution of the benefits among income groups. Regarding the behavioral approach, we use a Probit model of school attendance in order to determine the influence of public spending on the probability that the poor attend school. As a complement, a measurement of targeting errors in the allocation of public spending is included in the methodology.
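The BIA step can be sketched with the standard unit-cost allocation: spending is attributed to income groups in proportion to their use of the service. The group labels and figures below are invented:

```python
def benefit_incidence(spending, enrollment_by_group):
    """Allocate total public education spending to income groups in
    proportion to their enrollment (unit-cost BIA approach)."""
    total = sum(enrollment_by_group.values())
    return {group: spending * users / total
            for group, users in enrollment_by_group.items()}
```

Comparing each group's allocated share with its share of the population is what reveals whether spending is pro-poor, neutral, or regressive.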

Relevance:

90.00%

Publisher:

Abstract:

Recent developments in automation, robotics and artificial intelligence have pushed these technologies into wider use in recent years, and nowadays driverless transport systems are already state-of-the-art on certain legs of transportation. This has prompted the maritime industry to join the advancement. The case organisation, the AAWA initiative, is a joint industry-academia research consortium with the objective of developing readiness for the first commercial autonomous solutions, exploiting state-of-the-art autonomous and remote technology. The initiative develops both autonomous and remote operation technology for navigation, machinery, and all on-board operating systems. The aim of this study is to develop a model with which to estimate and forecast the operational costs, and thus enable comparisons between manned and autonomous cargo vessels. The building process of the model is also described and discussed. Furthermore, the model aims to track and identify the critical success factors of the chosen ship design, and to enable monitoring and tracking of the incurred operational costs as the life cycle of the vessel progresses. The study adopts the constructive research approach, as the aim is to develop a construct to meet the needs of a case organisation. Data has been collected through discussions and meetings with consortium members and researchers, as well as through written and internal communications material. The model itself is built using activity-based life cycle costing, which enables both realistic cost estimation and forecasting, as well as the identification of critical success factors, thanks to the process orientation adopted from activity-based costing and the statistical nature of Monte Carlo simulation techniques.
As the model was able to meet the multiple aims set for it, and the case organisation was satisfied with it, it can be argued that activity-based life cycle costing is a suitable method with which to conduct cost estimation and forecasting in the case of autonomous cargo vessels. The model was able to perform the cost analysis and forecasting, as well as to trace the critical success factors. Later on, it also enabled, albeit hypothetically, monitoring and tracking of the incurred costs. By collecting costs this way, it was argued that the activity-based LCC model is able to facilitate learning from, and continuous improvement of, the autonomous vessel. As for the building process of the model, an individual approach was chosen, while still using the implementation and model-building steps presented in the existing literature. This was due to two factors: the nature of the model and, perhaps even more importantly, the nature of the case organisation. Furthermore, the loosely organised network structure means that knowing the case organisation and its aims is of great importance when conducting constructive research.
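The combination of activity-based costing and Monte Carlo simulation can be sketched as follows: each activity gets a cost range instead of a point estimate, and repeated sampling yields an expected life cycle cost. The activities, ranges, and the one-draw-per-activity simplification below are all assumptions for illustration, not the AAWA model:

```python
import random

# Hypothetical annual cost ranges (low, high) per activity.
ACTIVITIES = {"maintenance": (100, 140), "remote_operations": (60, 80),
              "insurance": (20, 30)}

def lcc_monte_carlo(activities, years, runs=10_000, seed=42):
    """Activity-based life cycle cost with Monte Carlo sampling:
    draw each activity's annual cost from a uniform range, sum over
    activities, scale by the number of years, and average over runs."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(runs):
        total += sum(rng.uniform(lo, hi)
                     for lo, hi in activities.values()) * years
    return total / runs  # expected life cycle cost
```

Keeping costs broken down per activity is what makes critical success factors traceable: the activities whose ranges dominate the total are the ones worth monitoring as the vessel's life cycle progresses.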

Relevance:

80.00%

Publisher:

Abstract:

This work reports an economic evaluation of dried banana production at an agroindustry located in Guaraqueçaba, PR State, Brazil. The conventional and organic banana processes were evaluated by comparing economic viability indicators. The dried organic banana is exported to Europe, and the dried conventional banana is commercialized in the region of Curitiba, PR. Both processes showed positive economic viability, but the dried organic banana presented better indices (IRR (TIR) 94%, NPV (VPL) R$ 486,009.39 and a benefit-cost ratio of 2.11) than the conventional dried banana (IRR 14%, NPV R$ 34,668.00 and a benefit-cost ratio of 1.17). The dried organic banana had a production cost of R$ 3.64, with 50.1% corresponding to inputs and 27% to labour. The dried conventional banana had a cost of R$ 3.21, with 45.3% for inputs and 31.2% for labour.
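The indicators compared above (NPV/VPL and the benefit-cost ratio) come from standard discounting of yearly cash flows. A minimal sketch, with simple illustrative cash flows rather than the study's data:

```python
def npv(rate, cashflows):
    """Net present value (VPL) of yearly cash flows, year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def benefit_cost_ratio(rate, benefits, costs):
    """Discounted benefit-cost ratio: present value of benefits over
    present value of costs, the indicator behind figures like the
    abstract's 2.11 and 1.17."""
    return npv(rate, benefits) / npv(rate, costs)
```

A ratio above 1 means discounted benefits exceed discounted costs, which is why both processes count as economically viable while the organic one, with the higher ratio, is preferred.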

Relevance:

80.00%

Publisher:

Abstract:

The study was done to evaluate the cost-effectiveness of a national rotavirus vaccination programme in Brazilian children from the healthcare system perspective. A hypothetical annual birth-cohort was followed for a five-year period. Published and national administrative data were incorporated into a model to quantify the consequences of vaccination versus no vaccination. Main outcome measures included the reduction in disease burden, lives saved, and disability-adjusted life-years (DALYs) averted. A rotavirus vaccination programme in Brazil would prevent an estimated 1,804 deaths associated with gastroenteritis due to rotavirus, 91,127 hospitalizations, and 550,198 outpatient visits. Vaccination is likely to reduce 76% of the overall healthcare burden of rotavirus-associated gastroenteritis in Brazil. At a vaccine price of US$ 7-8 per dose, the cost-effectiveness ratio would be US$ 643 per DALY averted. Rotavirus vaccination can reduce the burden of gastroenteritis due to rotavirus at a reasonable cost-effectiveness ratio.
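The headline metric of such an evaluation, cost per DALY averted, is a simple incremental ratio: net cost to the healthcare system over health gain. A minimal sketch; the figures used in the comment are invented placeholders, not the study's inputs:

```python
def cost_per_daly_averted(programme_cost, treatment_cost_savings,
                          dalys_averted):
    """Cost-effectiveness ratio: net programme cost (vaccination cost
    minus averted treatment costs) divided by DALYs averted.
    E.g. a net cost of US$ 6.43 million over 10,000 DALYs averted
    gives US$ 643 per DALY (hypothetical numbers)."""
    return (programme_cost - treatment_cost_savings) / dalys_averted
```

Averted hospitalizations and outpatient visits enter this calculation as cost savings, which is why reducing 76% of the healthcare burden keeps the ratio reasonable despite the vaccine price.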

Relevance:

80.00%

Publisher:

Abstract:

Nitrogen is the nutrient most absorbed by the corn crop, has the most complex management, and accounts for the highest share of the cost of corn production. The objective of this work was to evaluate the economic viability of different rates and split applications of nitrogen fertilization, as urea, in the corn crop in a eutrophic Red Latosol (Oxisol). The study was carried out at the Experimental Station of the Regional Pole of the Sao Paulo Northwest Agribusiness Development (APTA), in Votuporanga, State of Sao Paulo, Brazil. The experimental design was randomized complete blocks with nine treatments and four replications, consisting of five N rates: 0, 55, 95, 135 and 175 kg ha(-1), with 15 kg ha(-1) applied at seeding and the remainder as top dressing: 40 and 80 kg ha(-1) N at forty days after seeding (DAS), or 1/2 + 1/2 at 20 and 40 DAS; 120 kg ha(-1) N split as 1/2 + 1/2 or 1/3 + 1/3 + 1/3 at 20, 40 or 60 DAS; 160 kg ha(-1) N split as 1/4 + 3/8 + 3/8 or 1/4 + 1/4 + 1/4 + 1/4 at 20, 40, 60 and 80 DAS. The application of 135 kg ha(-1) of N split in three applications provided the best benefit/cost ratio. The non-application of N provided the lowest economic return, proving to be unviable.
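Choosing the rate with the best benefit/cost ratio can be sketched as a comparison of a yield-response curve against a cost curve. The response and cost coefficients below are invented; they merely reproduce the abstract's qualitative result (an intermediate rate wins), not the experiment's data:

```python
def best_rate(grain_value, production_cost, rates):
    """Pick the N rate with the highest benefit/cost ratio, given a
    revenue response function and a cost function of the rate."""
    return max(rates, key=lambda r: grain_value(r) / production_cost(r))

# Hypothetical curves: diminishing-returns revenue, linear cost (R$/ha).
grain_value = lambda r: 600 + 9 * r - 0.015 * r ** 2
production_cost = lambda r: 200 + 1.1 * r
```

With diminishing returns in the response curve, the ratio peaks at an intermediate rate: too little N forgoes yield, too much pays for fertilizer the crop cannot convert into grain.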

Relevance:

80.00%

Publisher:

Abstract:

We conducted a randomised, controlled field trial during 1998/1999 to evaluate the hypothesis that improved piglet management would improve the reproductive performance of smallholder sows. Simple changes were introduced into the treatment herds including the construction of a heated piglet-separation pen, vitamin injections, creep feeding and early weaning. The control herds were unchanged. Data were collected from all sows in each enrolled herd over two farrowings. We enrolled 176 sows, including 170 (96 treatment and 74 control) sows that remained throughout the study period. Significant differences in the reproductive performance of treatment and control sows were recorded for interfarrowing interval (median 176 versus 220 days), average number liveborn over 2 litters (11 versus 12), and average preweaning mortality over 2 litters (0 versus 37%). Based on a discount rate of 17%, the benefit-cost ratio of the treatment was 11.1 and 12.1 over 3 and 5 years, respectively. (C) 2001 Elsevier Science B.V. All rights reserved.
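A benefit-cost ratio evaluated "over 3 and 5 years" at a discount rate, as reported above, discounts a stream of annual benefits back to the time of the up-front intervention cost. A minimal sketch with invented cash flows (the trial's actual benefits and costs are not reproduced here):

```python
def discounted_bcr(annual_benefit, initial_cost, rate, years):
    """Benefit-cost ratio of an intervention with an up-front cost
    and a constant annual benefit, discounted at `rate`."""
    pv_benefits = sum(annual_benefit / (1 + rate) ** t
                      for t in range(1, years + 1))
    return pv_benefits / initial_cost
```

Extending the horizon adds discounted benefit years without adding cost, which is why the ratio rises from the 3-year to the 5-year figure even at a discount rate as high as 17%.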

Relevance:

80.00%

Publisher:

Abstract:

Objective: First, to assess the clinical effectiveness of hylan G-F 20 in an appropriate care treatment regimen (as defined by the American College of Rheumatology (ACR) 1995 guidelines) as measured by validated disease-specific outcomes and health-related quality of life endpoints for patients with osteoarthritis (OA) of the knee. Second, to utilize the measures of effectiveness and costs in an economic evaluation (see accompanying manuscript). Design: A total of 255 patients with OA of the knee were enrolled by rheumatologists or orthopedic surgeons into a prospective, randomized, open-label, 1-year, multi-centred trial conducted in Canada. Patients were randomized to 'Appropriate care with hylan G-F 20' (AC+H) or 'Appropriate care without hylan G-F 20' (AC). Data were collected at clinic visits (baseline, 12 months) and by telephone (1, 2, 4, 6, 8, 10, and 12 months). Results: The AC+H group was superior to the AC group for all primary (% reduction in mean Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) pain scale: 38% vs 13%, P=0.0001) and secondary effectiveness outcome measures. These differences were all statistically significant and exceeded the 20% difference between groups set a priori by the investigators as the minimum clinically important difference. Health-related quality of life improvements in the AC+H group were statistically superior for the WOMAC pain, stiffness and physical function (all P