919 results for Total Cost Management


Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND: Trauma care is expensive. However, reliable data on the exact lifelong costs incurred by a major trauma patient are lacking. Discussion usually focuses on direct medical costs--underestimating consequential costs resulting from absence from work and permanent disability. METHODS: Direct medical costs and consequential costs of 63 major trauma survivors (ISS >13) at a Swiss trauma center from 1995 to 1996 were assessed 5 years posttrauma. The following cost evaluation methods were used: correction cost method (direct cost of restoring an original state), human capital method (indirect cost of lost productivity), contingent valuation method (human cost as the lost quality of life), and macroeconomic estimates. RESULTS: Mean ISS (Injury Severity Score) was 26.8 +/- 9.5 (mean +/- SD). In all, 22 patients (35%) were disabled, causing discounted average lifelong total costs of USD 1,293,800, compared with 41 patients (65%) who recovered without any disabilities with incurred costs of USD 147,200 (average of both groups USD 547,800). Two thirds of these costs were attributable to a loss of production whereas only one third was a result of the cost of correction. Primary hospital treatment (USD 27,800 +/- 37,800) was only a minor fraction of the total cost--less than the estimated cost of police and the judiciary. Loss of quality of life led to considerable intangible human costs similar to real costs. CONCLUSIONS: Trauma costs are commonly underestimated. Direct medical costs make up only a small part of the total costs. Consequential costs, such as lost productivity, are well in excess of the usual medical costs. Mere cost averages give a false estimate of the costs incurred by patients with/without disabilities.
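The overall average reported above is simply the size-weighted mean of the two group figures; a minimal sketch of that arithmetic, using the values from the abstract (small rounding differences are expected):

```python
# Weighted mean of lifelong costs across disabled and non-disabled survivors
# (figures from the abstract; minor rounding differences are expected).
n_disabled, cost_disabled = 22, 1_293_800      # USD, discounted lifelong cost
n_recovered, cost_recovered = 41, 147_200      # USD

n_total = n_disabled + n_recovered             # 63 major trauma survivors
mean_cost = (n_disabled * cost_disabled + n_recovered * cost_recovered) / n_total
print(f"overall mean lifelong cost: USD {mean_cost:,.0f}")   # ~USD 547,600
```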

Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND AND OBJECTIVE: Most economic evaluations of chlamydia screening do not include costs incurred by patients. The objective of this study was to estimate both the health service and private costs of patients who participated in proactive chlamydia screening, using mailed home-collected specimens as part of the Chlamydia Screening Studies project. METHODS: Data were collected on the administrative costs of the screening study; laboratory time-and-motion studies and patient-cost questionnaire surveys were conducted. The cost of each screening invitation and of each accepted offer was estimated. One-way sensitivity analysis was conducted to explore the effects of variations in patient costs and in the number of patients accepting the screening offer. RESULTS: The time and costs of processing urine specimens and vulvo-vaginal swabs from women using two nucleic acid amplification tests were similar. The total cost per screening invitation was £20.37 (95% CI £18.94 to £24.83). This included the National Health Service cost per individual screening invitation of £13.55 (95% CI £13.15 to £14.33) and average patient costs of £6.82 (95% CI £5.48 to £10.22). Administrative costs accounted for 50% of the overall cost. CONCLUSIONS: The cost of proactive chlamydia screening is comparable to that of opportunistic screening. Results from this study, the first to collect private patient costs associated with a chlamydia screening programme, could be used to inform future policy recommendations and provide unique primary cost data for economic evaluations.
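The headline figure decomposes into the health-service and patient components quoted above; a one-line check of that arithmetic using the point estimates from the abstract:

```python
# Decomposition of the cost per screening invitation (point estimates from
# the abstract): NHS cost per invitation plus average patient (private) cost.
nhs_cost_per_invitation = 13.55     # GBP
patient_cost_per_invitation = 6.82  # GBP

total_per_invitation = nhs_cost_per_invitation + patient_cost_per_invitation
print(f"total cost per invitation: £{total_per_invitation:.2f}")  # £20.37
```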

Relevance:

80.00%

Publisher:

Abstract:

Demand for bio-fuels is expected to increase, due to rising prices of fossil fuels and concerns over greenhouse gas emissions and energy security. The overall cost of biomass energy generation is primarily related to biomass harvesting activity, transportation, and storage. With a commercial-scale cellulosic ethanol processing facility in Kinross Township of Chippewa County, Michigan about to be built, models including a simulation model and an optimization model have been developed to provide decision support for the facility. Both models track cost, emissions and energy consumption. While the optimization model provides guidance for a long-term strategic plan, the simulation model aims to present detailed output for specified operational scenarios over an annual period. Most importantly, the simulation model considers the uncertainty of spring break-up timing, i.e., seasonal road restrictions. Spring break-up timing is important because it will impact the feasibility of harvesting activity and the time duration of transportation restrictions, which significantly changes the availability of feedstock for the processing facility. This thesis focuses on the statistical model of spring break-up used in the simulation model. Spring break-up timing depends on various factors, including temperature, road conditions and soil type, as well as individual decision making processes at the county level. The spring break-up model, based on the historical spring break-up data from 27 counties over the period of 2002-2010, starts by specifying the probability distribution of a particular county’s spring break-up start day and end day, and then relates the spring break-up timing of the other counties in the harvesting zone to the first county. In order to estimate the dependence relationship between counties, regression analyses, including standard linear regression and reduced major axis regression, are conducted. Using realizations (scenarios) of spring break-up generated by the statistical spring breakup model, the simulation model is able to probabilistically evaluate different harvesting and transportation plans to help the bio-fuel facility select the most effective strategy. For early spring break-up, which usually indicates a longer than average break-up period, more log storage is required, total cost increases, and the probability of plant closure increases. The risk of plant closure may be partially offset through increased use of rail transportation, which is not subject to spring break-up restrictions. However, rail availability and rail yard storage may then become limiting factors in the supply chain. Rail use will impact total cost, energy consumption, system-wide CO2 emissions, and the reliability of providing feedstock to the bio-fuel processing facility.
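A minimal sketch of the reduced major axis (geometric mean) regression mentioned above, which relates one county's break-up start day to that of the reference county; the break-up days below are invented for illustration and are not the 2002-2010 data:

```python
import numpy as np

def rma_fit(x, y):
    """Reduced major axis (geometric mean) regression of y on x.
    Slope = sign(r) * std(y) / std(x); the line passes through the means."""
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
    intercept = np.mean(y) - slope * np.mean(x)
    return slope, intercept

# Hypothetical break-up start days (day of year) for a reference county (x)
# and a neighbouring county (y) over several seasons.
x = np.array([68, 72, 75, 80, 83, 86, 90])
y = np.array([70, 75, 76, 83, 85, 90, 93])
m, b = rma_fit(x, y)
print(f"county_y_start ~= {m:.2f} * county_x_start + {b:.2f}")
```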

Relevance:

80.00%

Publisher:

Abstract:

A significant cost for foundations is the design and installation of piles when they are required because of poor ground conditions. It is important not only that piles be designed properly, but also that the installation equipment and total cost be evaluated. A number of methods have been developed to assist in this evaluation. This research investigated three of them, developed by the Federal Highway Administration (FHWA), the US Army Corps of Engineers, and the American Petroleum Institute (API). The results from these methods were entered into the program GRLWEAP™ to assess pile drivability and to provide a common basis for comparing the three methods. An additional element of this research was the development of EXCEL spreadsheets implementing the three methods. The Army Corps and API methods currently have no publicly available software and must be applied manually, which requires reading data off figures and tables and can introduce error into the predicted pile capacities. After the EXCEL spreadsheets were developed, they were validated against both manual calculations and existing data sets to ensure that their output is correct. To evaluate the three pile capacity methods, data were used from four project sites in North America. The data included site geotechnical data along with field-determined pile capacities. To achieve a standard comparison, the pile capacities and geotechnical data from the three methods were entered into GRLWEAP™. The sites contained both cohesive and cohesionless soils: one site was primarily cohesive, one was primarily cohesionless, and the other two consisted of inter-bedded cohesive and cohesionless soils. Based on this limited data set, the results indicated that the US Army Corps of Engineers method compared most closely with the field test data, followed to a lesser degree by the API method. The DRIVEN program (the FHWA method) compared favorably in cohesive soils but overpredicted capacity in cohesionless material.
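The three design methods differ mainly in how they assign unit skin friction and end bearing along the pile. The sketch below is a simplified total-stress (alpha-method) capacity calculation for a driven pile in clay, intended only to illustrate the kind of computation such spreadsheets automate; it is not the FHWA, Corps, or API formulation, and the pile and soil values are hypothetical.

```python
import math

# Hypothetical closed-end pipe pile in clay, alpha (total stress) method.
diameter = 0.61            # m
length = 20.0              # m, embedded length
undrained_strength = 60.0  # kPa, average su along the shaft
alpha = 0.7                # adhesion factor (depends on su in the real methods)
nc = 9.0                   # bearing capacity factor for the pile tip

shaft_area = math.pi * diameter * length          # m^2
tip_area = math.pi * diameter**2 / 4.0            # m^2

skin_friction = alpha * undrained_strength * shaft_area   # kN
end_bearing = nc * undrained_strength * tip_area          # kN
ultimate_capacity = skin_friction + end_bearing

print(f"skin friction ~ {skin_friction:.0f} kN, end bearing ~ {end_bearing:.0f} kN")
print(f"ultimate axial capacity ~ {ultimate_capacity:.0f} kN")
```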

Relevance:

80.00%

Publisher:

Abstract:

Forming tools are a new and so far unexplored application of additively manufactured tooling. The presentation describes a case study in which a typical forged part of fairly complex geometry was produced successfully using an additively manufactured forging die. The market requirements for making real forged parts available as early as possible are outlined. The entire process chain, from 3D CAD tool design through forging process simulation, laser beam melting of the die inserts and die assembly to the actual forging trials under production-like conditions, is presented and compared with conventional forging die design and manufacture. The advantages and particular features of the additive process chain are highlighted. The forged parts produced are compared with conventionally forged parts with respect to die filling, dimensional accuracy and microstructure. The delivery time of the additively manufactured forging dies is compared with that of conventionally produced dies, as are the costs, in order to demonstrate the benefits of using additive manufacturing. Finally, boundary conditions are described under which the additive manufacturing of forging dies is technically and economically worthwhile.

Relevance:

80.00%

Publisher:

Abstract:

The heifer development project was a five-year project conducted on the site of the former Jackson County Farm north of Andrew, Iowa, for four years and on an area producer's farm for the fifth year. Heifers arrived around December 1 each year, and the average number of heifers each year was 43, with a low of 37 and a high of 47. After a 30+ day warm-up period the heifers were put on a 112-day test from early January to late April. They were fed a shelled corn and legume-grass hay ration containing between 13% and 14% crude protein and a range of .44 to .58 megacal/pound of NEg over the five years. During the 112-day test heifers gained 1.86, 1.78, 1.5, 1.63 and 2.2 pounds per day, respectively, for years 1992 through 1996. The actual average breeding weight was less than the target weight in three years by 5, 12 and 22 pounds and exceeded the target weight in two years by 17 and 28 pounds. Estrus synchronization used a combination of MGA feeding and Lutalyse injection. Heifers were heat detected and bred 12 hours later over a three-day period. On the fourth day, all heifers not yet bred were mass inseminated. Heifers then ran with the cleanup bull for 58 days. The average synchronization response rate during the project was 79%. The overall pregnancy rate based on September pregnancy exams averaged 92%. The five-year average total cost per head for heifer development was $286.18, or about $.85 per day. Feed and pasture costs averaged 61% of the total costs.
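The per-day figure quoted above follows directly from the total development cost; a small check of that arithmetic using the five-year averages from the report:

```python
# Back out the implied development period and the feed-and-pasture share
# from the five-year averages reported above.
total_cost_per_head = 286.18        # USD per heifer
cost_per_day = 0.85                 # USD per day, approximate
feed_pasture_share = 0.61

implied_days = total_cost_per_head / cost_per_day          # ~337 development days
feed_pasture_cost = feed_pasture_share * total_cost_per_head
print(f"~{implied_days:.0f} days on feed; feed & pasture ~ ${feed_pasture_cost:.2f}/head")
```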

Relevance:

80.00%

Publisher:

Abstract:

The heifer development project took place over the past four years on the site of the former Jackson County Farm north of Andrew, Iowa. Heifers arrived around December 1, with 38 heifers delivered for 1992, 44 for 1993, 46 for 1994, and 47 for 1995. After a 30+ day warm-up period, the heifers were put on a 112-day test from early January to late April. They were fed a shelled corn and legume-grass hay ration consisting of between 13% and 14% crude protein and .48, .58, .44, and .54 megacal/pound of NEg, respectively, for the years 1992-1995. During the 112-day test heifers gained 1.86, 1.78, 1.5, and 1.63 pounds per day, respectively, for years 1992 through 1995. The 1995 heifers averaged 853 pounds at breeding (22 pounds under target weight). This compares with previous years in which the breeding weight was less than target weight in two years by 5 and 12 pounds and exceeded the target weight in one year by 17 pounds. Estrus synchronization used a combination of MGA feeding and Lutalyse injection. Heifers were heat detected and bred 12 hours later over a three-day period. On the fourth day, all heifers not yet bred were mass inseminated. Heifers then ran with the cleanup bull for 58 days. The synchronization response rate in 1995 was 83%, which compares with the previous three-year average of 77%. The overall pregnancy rates based on September pregnancy exams were 94.6% in 1992, 93% in 1993, 91% in 1994, and 91.5% in 1995. Development costs for the 326 days in 1995 totaled $269.14 per heifer. This compares with the average of $286.92 for the three previous years. The four-year average total cost per head for heifer development was $282.48, or about $.84 per day. Feed and pasture costs represented 58% of the total costs, or $.49 per day.

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVE To determine the prevalence of methicillin-resistant Staphylococcus aureus (MRSA) nasal colonization in hemodialysis patients and to analyze the cost-effectiveness of our screening approach compared with an alternative strategy. DESIGN Screening study and cost-effectiveness analysis. METHODS Analysis of twice-yearly MRSA prevalence studies conducted in the hemodialysis unit of a 950-bed tertiary care hospital from January 1, 2004, through December 31, 2013. For this purpose, nasal swab samples were cultured on MRSA screening agar (mannitol-oxacillin biplate). RESULTS There were 20 mass screenings during the 10-year study period. We identified 415 patients participating in at least 1 screening, with an average of 4.5 screenings per patient. Of the 415 screened patients, 15 (3.6%) were found to be MRSA carriers. The first mass screening in 2004 yielded the highest percentage of MRSA (6/101 [6%]). Only 7 subsequent screenings revealed new MRSA carriers, whereas 4 screenings confirmed previously known carriers and 8 remained negative. None of the carriers developed MRSA bacteremia during the study period. The total cost of our screening approach, that is, screening plus isolation costs, was US $93,930. The total cost of an alternative strategy with no mass screening, consisting only of isolation of index cases and contact tracing, was estimated to be US $5,382 (difference, US $88,548). CONCLUSIONS In an area of low MRSA endemicity (<5%), regular nasal screening of a high-risk population yielded a low rate of MRSA carriers. Twice-yearly MRSA screening of dialysis patients is unlikely to be cost-effective if MRSA prevalence is low.

Relevance:

80.00%

Publisher:

Abstract:

Objectives. This dissertation focuses on estimating the cost of providing a minimum package of prevention of mother-to-child HIV transmission (PMTCT) services in Vietnam from a societal perspective and on discussing the issues of scaling up the minimum package nationwide. Methods. Through collection of cost-related data on PMTCT services at 22 PMTCT sites in 5 provinces (Hanoi, Quang Ninh, Thai Nguyen, Hochiminh City, and An Giang) in Vietnam, the research investigates the item cost of each service in the minimum PMTCT package and the actual cost per PMTCT site at different organizational levels: central, provincial, and district. Next, the actual cost per site at each organizational level is standardized by adjusting for the HIV prevalence rate, to arrive at standardized costs per site. The study then uses the standardized costs per site to project, under different scenarios, the total cost of scaling up the PMTCT program in Vietnam. Results. The costs of HIV tests, infant formula, and health worker salaries are consistently found to be the largest expenditures in the PMTCT minimum package program across all organizational levels. Annual costs for prophylaxis drugs, operating and capital costs, and training are not substantial (less than 5% of total costs at all levels). The actual annual estimated cost for a PMTCT site at the central level is nearly VND 1.9 billion or US$ 107,650 (exchange rate US$ 1 = VND 17,500), while the annual cost for a provincial site is VND 375 million or US$ 21,400. The annual cost for a district site is VND 139 million (~US$ 8,000). The estimated total annual cost to roll out the PMTCT minimum package to the 5 studied provinces is approximately US$ 1.1 million. If the PMTCT program is scaled up to 14 provinces by 2008 and to 40 provinces through the end of 2010 as planned by the Ministry of Health, it would cost the health system approximately US$ 2.1 million and US$ 5.04 million per year, respectively. The annual cost of scaling up the PMTCT minimum package nationwide is around US$ 7.6 million. Meanwhile, the total annual cost of implementing the PMTCT minimum package so as to achieve the 2010 national PMTCT targets (providing counseling services to 90% of all pregnant women, with 60% of them receiving HIV tests and 100% of HIV-positive mothers and their newborns receiving prophylaxis treatment) would be US$ 6.1 million. Recommendations. This study recommends that: (1) the Ministry of Health of Vietnam should adjust its short-term national targets to a more feasible and achievable level given the current level of available resources; (2) a detailed budget for scaling up the PMTCT program should be developed together with the national PMTCT action plan; (3) the scaling-up plan developed by the Ministry of Health should focus on coverage of high-prevalence populations and on the quality of services provided rather than on the number of provinces reached; (4) an exclusive breastfeeding strategy should be promoted as part of the PMTCT program; and (5) for a smooth and effective nationwide roll-out of PMTCT services, a national training plan must be developed and executed before any other parts of the PMTCT scaling-up plan are initiated.
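The scale-up projections above amount to multiplying the standardized per-site cost at each organizational level by the number of sites operated; a minimal sketch of that projection, in which the site counts are hypothetical (the abstract does not report them), so the example does not reproduce the US$ 1.1 million figure:

```python
# Standardized annual cost per PMTCT site (US$, from the abstract).
cost_per_site = {"central": 107_650, "provincial": 21_400, "district": 8_000}

def annual_cost(n_central, n_provincial, n_district):
    """Projected annual cost of the PMTCT minimum package for a given mix of sites."""
    return (n_central * cost_per_site["central"]
            + n_provincial * cost_per_site["provincial"]
            + n_district * cost_per_site["district"])

# Hypothetical illustration only: actual site counts per province are not given
# in the abstract, so this scenario is not the study's 5-province estimate.
print(f"example scenario: US$ {annual_cost(2, 5, 60):,}")
```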

Relevance:

80.00%

Publisher:

Abstract:

A sample of 157 AIDS patients aged 17 years or over was followed for six months from the date of hospital discharge to derive the average total cost of medical care, utilization, and satisfaction with care. Those referred for home care follow-up after discharge from the hospital were compared with those who did not receive home care. The average total cost of medical care for all patients was $34,984. Home care patients' costs averaged $29,614, while patients with no home care averaged $37,091. Private hospital patients had average costs of $50,650, compared with $25,494 for public hospital patients. Hospital days for the six-month period averaged 23.9 per patient for the no home care group and 18.5 days for the home care group. Patient satisfaction with care was higher in the home care group than in the no home care group, with a mean score of 68.2 compared with 61.1. Other health services information indicated that 98% of the private hospital patients had insurance while only 2% of public hospital patients had coverage. The time between the initial date of diagnosis with AIDS and admission to the study was longer for private hospital patients, survival time over the study period was shorter, and the number of hospitalizations prior to entering the study was higher for private hospital patients. These results suggest that patients treated in the private hospital were sicker than public hospital patients, which may explain their higher average total cost. Statistical analyses showed that cost and utilization had no significant relationship with home care or no home care when controlling for indicators of severity of illness and treatment in a public or private hospital. In future studies, selecting a matched group of patients from the same hospital and following them for nine months to one year would be helpful in making a more realistic comparison of the cost-effectiveness of home care.

Relevance:

80.00%

Publisher:

Abstract:

Scheduling problems are very important in today's world. They arise in virtually every branch of modern industry, hence the importance of solving them optimally so that the resources associated with the problem can be saved. Properly scheduling jobs in manufacturing processes is an important problem that arises in production in many companies. The order in which jobs are processed is not indifferent: it determines parameters of interest whose values should be optimized as far as possible. The total cost of executing the jobs, the time needed to complete them, or the stock of work in progress generated may all be affected. This leads directly to the problem of determining the most appropriate order in which to carry out the jobs so as to optimize some of these or similar parameters. Because of the limitations of conventional optimization techniques, this thesis presents a metaheuristic based on a Simple Genetic Algorithm (SGA) to solve Job Shop Scheduling (JSS) and Flow Shop Scheduling (FSS) problems that arise in a shop with machining technology, with the aim of optimizing several performance measures of a work plan. The main contribution of this thesis is a mathematical model, used as an optimization criterion, of the energy consumed by the machines involved in executing a work plan. A method is also proposed to improve the performance of the Simple Genetic Algorithm in searching for solutions, based on exploiting idle time.
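As a rough illustration of the kind of metaheuristic described above, the sketch below runs a plain simple genetic algorithm with permutation encoding on a small permutation flow shop instance, minimizing makespan; the energy consumption model and the idle-time improvement proposed in the thesis are not reproduced here, and the processing times are invented:

```python
import random

# Processing times: proc[j][m] = time of job j on machine m (hypothetical data).
proc = [[5, 3, 4], [2, 6, 3], [4, 2, 5], [3, 4, 2], [6, 2, 3]]
n_jobs, n_machines = len(proc), len(proc[0])

def makespan(seq):
    """Completion time of the last job on the last machine for a permutation flow shop."""
    finish = [0.0] * n_machines
    for j in seq:
        for m in range(n_machines):
            start = max(finish[m], finish[m - 1] if m > 0 else 0.0)
            finish[m] = start + proc[j][m]
    return finish[-1]

def order_crossover(p1, p2):
    """OX crossover: copy a slice from p1, fill the rest in p2's order."""
    a, b = sorted(random.sample(range(n_jobs), 2))
    child = [None] * n_jobs
    child[a:b] = p1[a:b]
    rest = [j for j in p2 if j not in child]
    for i in range(n_jobs):
        if child[i] is None:
            child[i] = rest.pop(0)
    return child

def mutate(seq, rate=0.2):
    """Occasionally swap two job positions."""
    if random.random() < rate:
        i, k = random.sample(range(n_jobs), 2)
        seq[i], seq[k] = seq[k], seq[i]
    return seq

pop = [random.sample(range(n_jobs), n_jobs) for _ in range(30)]
for _ in range(200):                           # generations
    pop.sort(key=makespan)
    parents = pop[:10]                         # simple truncation selection
    children = [mutate(order_crossover(random.choice(parents), random.choice(parents)))
                for _ in range(20)]
    pop = parents + children

best = min(pop, key=makespan)
print("best sequence:", best, "makespan:", makespan(best))
```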

Relevance:

80.00%

Publisher:

Abstract:

Port services account for most of the cost incurred as goods pass through a port, in particular the cargo handling service. Providing these services reliably and efficiently is key in a sector marked by considerable opacity. This study equips the responsible administration, the Port Authority, with a tool that helps make decision-making more objective, both when granting the required licences and during the period in which the service is provided. In addition, a series of measures is proposed whose implementation would improve the conditions under which the service is delivered and reduce the total cost of moving goods through the port.

Relevance:

80.00%

Publisher:

Abstract:

A reliability analysis method is proposed that starts with the identification of all the variables involved. These are divided into five groups: (a) variables fixed by codes, such as the design values of loads and strengths and their corresponding partial safety coefficients; (b) geometric variables defining the dimensions of the main elements involved; (c) cost variables, including the possible damages caused by failure; (d) random variables such as loads, strengths, etc.; and (e) variables defining the statistical model, such as the family of distributions and its corresponding parameters. Once the variables are known, the Buckingham Π-theorem is used to obtain a minimum equivalent set of non-dimensional variables, which is used to define the limit states. This allows a reduction in the number of variables involved and a better understanding of their coupling effects. Two minimum cost criteria are used for selecting the project dimensions: one is based on a bounded probability of failure, and the other on a total cost that includes the damages of a possible failure. Finally, the method is illustrated by means of an application.
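The second selection criterion, minimizing a total cost that includes the expected damages of failure, can be written as C_total(d) = C_construction(d) + P_f(d) * C_damage and minimized over the design dimension d. A minimal sketch with hypothetical placeholder cost and failure-probability functions (not those of the paper):

```python
import math

def failure_probability(d):
    """Hypothetical placeholder: probability of failure decreasing with member size d."""
    return math.exp(-3.0 * d)

def construction_cost(d):
    """Hypothetical placeholder: construction cost growing with member size d."""
    return 100.0 * d

def total_expected_cost(d, damage_cost=5_000.0):
    """Construction cost plus expected cost of damages caused by failure."""
    return construction_cost(d) + failure_probability(d) * damage_cost

# Crude one-dimensional search over candidate sizes.
best = min((total_expected_cost(d), d) for d in [x / 100 for x in range(10, 301)])
print(f"minimum expected total cost {best[0]:.1f} at d = {best[1]:.2f}")
```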

Relevance:

80.00%

Publisher:

Abstract:

Service-Oriented Computing (SOC) is a widely accepted paradigm for developing flexible, distributed and adaptable software systems, in which service compositions perform more complex, higher-level, often cross-organizational tasks using atomic services or other service compositions. In such systems, Quality of Service (QoS) properties, such as performance, cost, availability or security, are critical for the usability of services and their compositions in concrete applications. The analysis of these properties becomes more precise and more informative if it employs program analysis techniques, such as complexity and sharing analyses, which can simultaneously take into account the control and data structures, dependencies and operations in a composition. Computation cost analysis for service compositions can support predictive monitoring and proactive adaptation by automatically inferring computation cost as upper- and lower-bound functions of the value or size of the input messages. These cost functions can be used for adaptation by selecting the service candidates that minimize the total cost of the composition, based on the actual data passed to them. The cost functions can also be combined with empirically collected infrastructural parameters to produce QoS bound functions of the input data, which can be used to predict, at the moment of invocation, potential or imminent Service Level Agreement (SLA) violations. In mission-critical compositions, effective and accurate continuous QoS prediction can be achieved by constraint modeling of the composition QoS based on its structure, the data known at runtime and, when available, the results of complexity analysis. This approach can be applied to service orchestrations with centralized flow control, as well as to choreographies with multiple participants engaged in complex stateful interactions. Sharing analysis can support adaptation actions, such as parallelization, fragmentation and component selection, which are based on functional dependencies and on the information content of the composition messages, internal data and activities, in the presence of complex control constructs such as loops, branches and sub-workflows. Both the functional dependencies and the information content (described through user-defined attributes) can be expressed using a first-order logic (Horn clause) representation, and the analysis results can be interpreted as lattice-based conceptual models.
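A minimal sketch of the adaptation step described above, selecting the service candidate whose inferred cost-bound function predicts the lowest cost for the actual input; the candidate cost functions are invented for illustration:

```python
# Each candidate exposes an upper-bound cost function of the input message size n,
# e.g. inferred by resource analysis; the functions below are made-up examples.
candidates = {
    "service_a": lambda n: 5.0 * n + 200.0,        # linear in message size
    "service_b": lambda n: 0.5 * n * n + 20.0,     # quadratic, cheap for small inputs
    "service_c": lambda n: 1200.0,                 # flat cost
}

def select_candidate(message_size):
    """Pick the candidate minimizing the predicted (upper-bound) cost for this input."""
    return min(candidates, key=lambda name: candidates[name](message_size))

for n in (5, 40, 400):
    print(n, "->", select_candidate(n))
```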

Relevance:

80.00%

Publisher:

Abstract:

There is an increasing awareness among all kinds of organisations (in business, government and civil society) of the benefits of working jointly with stakeholders to satisfy both their goals and the social demands placed upon them. This is particularly the case within corporate social responsibility (CSR) frameworks. In this regard, multi-criteria decision-making tools such as the analytic hierarchy process (AHP) described in this paper can be useful for building relationships with stakeholders. Since these tools can reveal a decision-maker's preferences, integrating the opinions of various stakeholders into the decision-making process may lead to better and more innovative solutions with significant shared value. This paper is based on ongoing research to assess the feasibility of an AHP-based model to support CSR decisions in large infrastructure projects carried out by Red Electrica de España, the sole transmission agent and operator of the Spanish electricity system.
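As an illustration of how AHP can reveal decision-maker preferences, the sketch below derives a priority vector from a pairwise comparison matrix via its principal eigenvector, together with Saaty's consistency ratio; the comparison values are invented and are not data from the Red Electrica de España study:

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three CSR criteria on Saaty's
# 1-9 scale; entry (i, j) says how strongly criterion i is preferred over j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3., 1.0, 2.0],
    [1/5., 1/2., 1.0],
])

# Priorities = normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Consistency ratio (CR) = CI / RI, with RI = 0.58 for a 3x3 matrix.
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
print("priorities:", np.round(w, 3), "CR:", round(ci / 0.58, 3))
```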