767 results for cost utility analysis
Abstract:
In this paper we apply cooperative game theory concepts to the analysis of a supply chain. The bullwhip effect in a two-stage (supplier–manufacturer) supply chain is captured in the framework of an Arrow–Karlin type model with linear inventory-holding and convex production costs. It is assumed that both firms minimize their relevant costs, and two modes of operation are compared: a hierarchical (decentralized) decision-making system, in which first the manufacturer and then the supplier optimizes its position, and a centralized (cooperative) model, in which the firms minimize their joint cost. The question of how the participants should share the savings from the reduced bullwhip effect is answered with the tools of transferable utility cooperative games.
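The savings-sharing question in such a two-player transferable utility game can be illustrated with the Shapley value. Below is a minimal sketch in Python using made-up standalone and joint costs, not values from the Arrow–Karlin model itself:

```python
import math
from itertools import permutations

# Illustrative numbers (not from the paper): standalone costs under
# decentralized operation, and a lower joint cost under cooperation.
standalone = {"supplier": 100.0, "manufacturer": 120.0}
joint_cost = 190.0

def savings(coalition):
    """Characteristic function of the transferable utility savings game."""
    if len(coalition) < 2:
        return 0.0  # a firm acting alone realizes no savings
    return sum(standalone.values()) - joint_cost

def shapley(players, v):
    """Shapley value: average marginal contribution over all join orders."""
    phi = {p: 0.0 for p in players}
    for order in permutations(players):
        seen = set()
        for p in order:
            phi[p] += v(frozenset(seen | {p})) - v(frozenset(seen))
            seen.add(p)
    n_fact = math.factorial(len(players))
    return {p: phi[p] / n_fact for p in players}

shares = shapley(list(standalone), savings)
# Each firm pays its standalone cost minus its share of the savings.
allocation = {p: standalone[p] - shares[p] for p in standalone}
```

With these numbers the joint savings are 30, split 15/15, so the supplier pays 85 and the manufacturer 105; the two payments sum to the centralized cost of 190.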
Abstract:
There are many factors which can assist in controlling the cost of labor in the food service industry. The author discusses a number of these, including scheduling, establishing production standards, forecasting workloads, analyzing employee turnover, combating absenteeism, and controlling overtime.
Abstract:
Construction projects are complex endeavors that require the involvement of different professional disciplines in order to meet various project objectives that are often conflicting. The level of complexity and the multi-objective nature of construction projects lend themselves to collaborative design and construction approaches such as integrated project delivery (IPD), in which the relevant disciplines work together during project conception, design and construction. Traditionally, the main objectives of construction projects have been to build in the least amount of time at the lowest possible cost, so the inherent and well-established relationship between cost and time has been the focus of many studies. The importance of being able to effectively model relationships among multiple objectives in building construction has been emphasized in a wide range of research. In general, the trade-off relationship between time and cost is well understood, and there is ample research on the subject. However, the relationships between time and environmental impact, and between cost and environmental impact, have not been fully investigated, even for sustainable building designs. The objectives of this research were to analyze and identify the relationships among time, cost, and environmental impact (in terms of CO2 emissions) at three levels of a building (material, component, and whole building) during the pre-use phase, which includes manufacturing and construction, and the relationship between life cycle cost and life cycle CO2 emissions during the usage phase. Additionally, this research aimed to develop a robust simulation-based multi-objective decision-support tool, called SimulEICon, which takes construction data uncertainty into account and is capable of incorporating life cycle assessment information into the decision-making process. The findings of this research supported the trade-off relationship between time and cost at the different building levels.
Moreover, the relationship between time and CO2 emissions also showed trade-off behavior at the pre-use phase. Interestingly, cost and CO2 emissions were proportional at the pre-use phase, and the same pattern persisted from construction through the usage phase. Understanding the relationships among these objectives is key to successfully planning and designing environmentally sustainable construction projects.
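Multi-objective trade-offs of this kind are commonly explored by filtering design alternatives down to the non-dominated (Pareto) set. A minimal sketch with hypothetical time/cost/CO2 values, not SimulEICon data:

```python
def pareto_front(alternatives):
    """Keep alternatives not dominated on any objective (all minimized)."""
    front = []
    for a in alternatives:
        dominated = any(
            all(b[k] <= a[k] for k in a) and any(b[k] < a[k] for k in a)
            for b in alternatives
            if b is not a
        )
        if not dominated:
            front.append(a)
    return front

# Hypothetical design options (arbitrary units): construction time, cost,
# and pre-use-phase CO2 emissions.
options = [
    {"time": 10, "cost": 100, "co2": 50},
    {"time": 12, "cost": 90, "co2": 45},   # slower but cheaper and cleaner
    {"time": 11, "cost": 110, "co2": 60},  # dominated by the first option
]
front = pareto_front(options)
```

The first two options survive because neither beats the other on every objective; the third is strictly worse than the first and is discarded.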
Abstract:
ACKNOWLEDGMENT: We are grateful to RTE for the financial support of this project.
Abstract:
AIMS: Diagnosis of soft tissue sarcomas can be difficult, and in many cases it can be aided by the detection of specific genetic aberrations. This study assessed the utility of a molecular genetics/cytogenetics service as part of the routine diagnostic service at the Royal Marsden Hospital. METHODS: A retrospective audit was performed over a 15-month period to evaluate the diagnostic usefulness of fluorescence in situ hybridisation (FISH) and reverse-transcriptase PCR (RT-PCR) on paraffin-embedded (PE) material for translocation-associated soft tissue sarcomas. Results were compared with histology and evaluated. RESULTS: Molecular investigations were performed on PE material from 158 samples (194 RT-PCR and 174 FISH tests in total), of which 85 were referral cases. Synovial sarcoma, Ewing sarcoma and low-grade fibromyxoid sarcoma were the most commonly tested tumours. Myxoid liposarcoma showed the best histological and molecular concordance, and alveolar rhabdomyosarcoma showed the best agreement between methods. FISH had a higher sensitivity for detecting tumours (73%, compared with 59% for RT-PCR) and a better success rate than RT-PCR, although the latter was specific in identifying the partner gene of each fusion. In particular, referral blocks for which the methods of tissue fixation and processing were uncertain resulted in higher RT-PCR failure rates. CONCLUSIONS: FISH and RT-PCR on PE tissue are practical and effective ancillary tools in the diagnosis of soft tissue sarcomas. They are useful in confirming doubtful histological diagnoses and in excluding malignant diagnoses. RT-PCR is less sensitive than FISH, and the use of both techniques is optimal for maximising the detection rate of translocation-positive sarcomas.
Abstract:
The effectiveness of the Incredible Years Basic parent programme (IYBP) in reducing child conduct problems and improving parent competencies and mental health was examined in a 12-month follow-up. Pre- to post-intervention service use and related costs were also analysed. A total of 103 families and their children (aged 32–88 months), who previously participated in a randomised controlled trial of the IYBP, took part in a 12-month follow-up assessment. Child and parent behaviour and well-being were measured using psychometric and observational measures. An intention-to-treat analysis was carried out using a one-way repeated measures ANOVA. Pairwise comparisons were subsequently conducted to determine whether treatment outcomes were sustained 1 year post-baseline assessment. Results indicate that post-intervention improvements in child conduct problems, parenting behaviour and parental mental health were maintained. Service use and associated costs continued to decline. The results indicate that parent-focused interventions, implemented in the early years, can result in improvements in child and parent behaviour and well-being 12 months later. A reduced reliance on formal services is also indicated.
Abstract:
Teachers frequently struggle to cope with conduct problems in the classroom. The aim of this study was to assess the effectiveness of the Incredible Years Teacher Classroom Management Training Programme for improving teacher competencies and child adjustment. The study involved a group randomised controlled trial which included 22 teachers and 217 children (102 boys and 115 girls). The average age of children included in the study was 5.3 years (standard deviation = 0.89). Teachers were randomly allocated to an intervention group (n = 11 teachers; 110 children) or a waiting-list control group (n = 11; 107 children). The sample also included 63 ‘high-risk’ children (33 intervention; 30 control), who scored above the cut-off (>12) on the Strengths and Difficulties Questionnaire for abnormal socioemotional and behavioural difficulties. Teacher and child behaviours were assessed at baseline and 6 months later using psychometric and observational measures. Programme delivery costs were also analysed. Results showed an increase in teachers’ self-reported use of positive classroom management strategies (effect size = 0.56) and a decrease in their use of negative classroom management strategies (effect size = −0.43). Teacher reports also highlight improvements in the classroom behaviour of the high-risk group of children, while the estimated cost of delivering the Incredible Years Teacher Classroom Management Training Programme was modest. However, analyses of teacher and child observations were largely non-significant. A need for further research exploring the effectiveness and cost-effectiveness of the Incredible Years Teacher Classroom Management Training Programme is indicated.
Abstract:
Empirical validity of the claim that overhead costs are driven not by production volume but by transactions resulting from production complexity is examined using data from 32 manufacturing plants from the electronics, machinery, and automobile components industries. Transactions are measured using number of engineering change orders, number of purchasing and production planning personnel, shop-floor area per part, and number of quality control and improvement personnel. Results indicate a strong positive relation between manufacturing overhead costs and both manufacturing transactions and production volume. Most of the variation in overhead costs, however, is explained by measures of manufacturing transactions, not volume.
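The study's central test, regressing overhead on both a volume measure and a transaction measure, can be sketched with ordinary least squares. The data below are simulated for illustration (the true coefficients are chosen so that transactions dominate) and are not the study's figures:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 32  # same number of plants as the study; the data here are simulated

# Simulated drivers: production volume and a transaction measure
# (e.g. engineering change orders). True overhead loads mostly on transactions.
volume = rng.uniform(50.0, 150.0, n)
transactions = rng.uniform(10.0, 60.0, n)
overhead = 5.0 + 0.1 * volume + 2.0 * transactions + rng.normal(0.0, 1.0, n)

# Ordinary least squares: overhead ~ 1 + volume + transactions.
X = np.column_stack([np.ones(n), volume, transactions])
beta, *_ = np.linalg.lstsq(X, overhead, rcond=None)
fitted = X @ beta
r2 = 1.0 - ((overhead - fitted) ** 2).sum() / ((overhead - overhead.mean()) ** 2).sum()
```

Here `beta[1]` and `beta[2]` estimate the volume and transaction coefficients respectively; comparing their contributions to `r2` mirrors the study's variance-decomposition argument.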
Abstract:
Background The HCL-32 is a widely-used screening questionnaire for hypomania. We aimed to use a Rasch analysis approach to (i) evaluate the measurement properties, principally unidimensionality, of the HCL-32, and (ii) generate a score table to allow researchers to convert raw HCL-32 scores into an interval-level measurement which will be more appropriate for statistical analyses. Methods Subjects were part of the Bipolar Disorder Research Network (BDRN) study with DSM-IV bipolar disorder (n=389). Multidimensionality was assessed using the Rasch fit statistics and principal components analysis of the residuals (PCA). Item invariance (differential item functioning, DIF) was tested for gender, bipolar diagnosis and current mental state. Item estimates and reliabilities were calculated. Results Three items (29, 30, 32) had unacceptable fit to the Rasch unidimensional model. Item 14 displayed significant DIF for gender and items 8 and 17 for current mental state. Item estimates confirmed that not all items measure hypomania equally. Limitations This sample was recruited as part of a large ongoing genetic epidemiology study of bipolar disorder and may not be fully representative of the broader clinical population of individuals with bipolar disorder. Conclusion The HCL-32 is unidimensional in practice, but measurements may be further strengthened by the removal of four items. Re-scored linear measurements may be more appropriate for clinical research.
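Converting a raw questionnaire score to an interval-level (logit) measurement amounts to inverting the test characteristic curve of the Rasch model. A sketch with hypothetical item difficulties (not the BDRN estimates):

```python
import math

def rasch_prob(theta, difficulty):
    """Dichotomous Rasch model: P(endorse) given person and item logits."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

# Hypothetical item difficulties in logits (not the BDRN estimates).
difficulties = [-1.2, -0.4, 0.0, 0.7, 1.5]

def expected_raw_score(theta):
    """Test characteristic curve: expected raw score at a given person logit."""
    return sum(rasch_prob(theta, b) for b in difficulties)

def theta_for_raw(raw, lo=-6.0, hi=6.0, tol=1e-6):
    """Invert the (monotone) test characteristic curve by bisection."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if expected_raw_score(mid) < raw:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Tabulating `theta_for_raw(r)` for each possible raw score `r` yields exactly the kind of raw-score-to-logit conversion table the paper aims to provide.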
Abstract:
At the national level, with a fixed amount of resources available for public investment in the restoration of biodiversity, it is difficult to prioritize alternative restoration projects. One way to do this is to assess the level of ecosystem services delivered by these projects and to compare them with their costs. The challenge is to derive a common unit of measurement for ecosystem services in order to compare projects which are carried out in different institutional contexts having different goals (application of environmental laws, management of natural reserves, etc.). This paper assesses the use of habitat equivalency analysis (HEA) as a tool to evaluate ecosystem services provided by restoration projects developed in different institutional contexts. This tool was initially developed to quantify the level of ecosystem services required to compensate for non-market impacts coming from accidental pollution in the US. In this paper, HEA is used to assess the cost effectiveness of several restoration projects in relation to different environmental policies, using case studies based in France. Four case studies were used: the creation of a market for wetlands, public acceptance of a port development project, the rehabilitation of marshes to mitigate nitrate loading to the sea, and the restoration of streams in a protected area. Our main conclusion is that HEA can provide a simple tool to clarify the objectives of restoration projects, to compare the cost and effectiveness of these projects, and to carry out trade-offs, without requiring significant amounts of human or technical resources.
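The core of HEA is equating discounted service-acre-years (DSAYs) lost at an injured site with DSAYs gained per restored acre, which scales the required project. A minimal sketch with hypothetical service trajectories and a 3% discount rate:

```python
def discounted_service_acre_years(services_by_year, rate=0.03):
    """Per-acre service levels summed with discounting back to year 0."""
    return sum(level / (1.0 + rate) ** year
               for year, level in services_by_year.items())

# Hypothetical injury: 10 acres lose 40% of baseline services for 5 years.
loss_per_acre = {year: 0.4 for year in range(1, 6)}
debit = 10 * discounted_service_acre_years(loss_per_acre)

# Hypothetical restoration: each acre's service uplift ramps to 80% over 30 years.
gain_per_acre = {year: min(0.8, 0.1 * year) for year in range(1, 31)}
credit_per_acre = discounted_service_acre_years(gain_per_acre)

# Project scale needed so that discounted gains offset discounted losses.
acres_required = debit / credit_per_acre
```

The common unit of measurement the paper seeks is exactly these DSAYs: once each project's gains are expressed in them, cost per DSAY makes projects from different institutional contexts comparable.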
Abstract:
Investors value the special attributes of monetary assets (e.g., exchangeability, liquidity, and safety) and pay a premium for holding them in the form of a lower rate of return. The user cost of holding monetary assets can be measured approximately by the difference between the returns on illiquid risky assets and those on safer liquid assets. A more appropriate measure should adjust this difference by the differential risk of the assets in question. We investigate the impact that time non-separable preferences have on the estimation of the risk-adjusted user cost of money. Using U.K. data from 1965Q1 to 2011Q1, we estimate a habit-based asset pricing model with money in the utility function and find that the risk adjustment for risky monetary assets is negligible. Thus, researchers can dispense with risk-adjusting the user cost of money when constructing monetary aggregate indexes.
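The unadjusted user cost referred to above is, in Barnett's formulation, the spread between the benchmark return and the asset's own return, discounted by the benchmark. A quick numerical sketch with illustrative rates:

```python
def user_cost(benchmark_rate, own_rate):
    """Barnett-style user cost: forgone return, discounted by the benchmark."""
    return (benchmark_rate - own_rate) / (1.0 + benchmark_rate)

# Illustrative annual rates: a 6% benchmark (illiquid risky asset),
# a 2% interest-bearing deposit, and non-interest-bearing currency.
u_deposit = user_cost(0.06, 0.02)
u_currency = user_cost(0.06, 0.00)
```

The more return an asset gives up relative to the benchmark, the higher its user cost; currency, paying nothing, carries the largest liquidity premium.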
Abstract:
We explore the recently developed snapshot-based dynamic mode decomposition (DMD) technique, a matrix-free Arnoldi-type method, to predict 3D linear global flow instabilities. We apply the DMD technique to flows confined in an L-shaped cavity and compare the resulting modes to their counterparts obtained from classic, matrix-forming, linear instability analysis (i.e. the BiGlobal approach) and from direct numerical simulations. Results show that the DMD technique, which uses snapshots generated by a 3D non-linear incompressible discontinuous Galerkin Navier–Stokes solver, provides results very similar to those of classical linear instability analysis techniques. In addition, we compare DMD results obtained from non-linear and linearised Navier–Stokes solvers, showing that linearisation is not necessary (i.e. a base flow is not required) to obtain linear modes, as long as the analysis is restricted to the exponential growth regime, that is, the flow regime governed by the linearised Navier–Stokes equations; this demonstrates the potential of this type of snapshot-based analysis for general-purpose CFD codes, without the need for modifications. Finally, this work shows that the DMD technique can provide three-dimensional direct and adjoint modes through snapshots provided by the linearised and adjoint linearised Navier–Stokes equations advanced in time. Subsequently, these modes are used to provide structural sensitivity maps and sensitivity to base flow modification for 3D flows and complex geometries, at an affordable computational cost. The information provided by the sensitivity study is used to modify the L-shaped geometry and control the most unstable 3D mode.
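A minimal snapshot-based DMD fits in a few lines: SVD of the snapshot matrix, projection of the one-step map onto the POD basis, then an eigendecomposition. The sketch below recovers the eigenvalues of a known small linear system; it is illustrative only, as production use streams snapshots from the CFD solver:

```python
import numpy as np

def dmd(snapshots, rank):
    """Exact DMD: eigenpairs of the best-fit linear map between snapshots."""
    X, Y = snapshots[:, :-1], snapshots[:, 1:]
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, V = U[:, :rank], s[:rank], Vh.conj().T[:, :rank]
    A_tilde = U.conj().T @ Y @ V / s      # operator projected onto POD basis
    eigvals, W = np.linalg.eig(A_tilde)
    modes = Y @ V / s @ W                 # lift eigenvectors back to full space
    return eigvals, modes

# Sanity check on a known 3-state linear system with eigenvalues 0.9, 0.6, 0.3.
rng = np.random.default_rng(1)
B = rng.normal(size=(3, 3))
A = B @ np.diag([0.9, 0.6, 0.3]) @ np.linalg.inv(B)
x0 = rng.normal(size=3)
snaps = np.column_stack([np.linalg.matrix_power(A, k) @ x0 for k in range(12)])
eigvals, modes = dmd(snaps, rank=3)
```

Because the snapshots come exactly from a linear map and the data have full rank, the DMD eigenvalues match the system's eigenvalues; for flow data in the exponential growth regime the same identification holds approximately, which is the property the paper exploits.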
Abstract:
This thesis attempts to find the least-cost strategy for reducing CO2 emissions by replacing coal with other energy sources for electricity generation, in the context of the EPA’s proposed regulation of CO2 emissions from existing coal-fired power plants. An ARIMA model is built to forecast coal consumption for electricity generation, and its CO2 emissions, in Michigan from 2016 to 2020. CO2 emission reduction costs are calculated under three emission reduction scenarios: reduction to 17%, 30%, and 50% below the 2005 emission level. The impacts of the Production Tax Credit (PTC) and of the intermittency of renewable energy are also discussed. The results indicate that in most cases natural gas will be the best alternative to coal for electricity generation to realize the CO2 reduction goals; if the PTC for wind power continues after 2015, a combination of natural gas and wind could be the best strategy under the least-cost criterion.
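Setting aside the ARIMA forecasting step, the least-cost criterion itself is a merit-order calculation: fill the reduction target with the cheapest abatement options first. A sketch with hypothetical costs and potentials, not the thesis's Michigan figures:

```python
# Hypothetical abatement options: unit cost ($/ton CO2 avoided) and the
# tons each option can deliver. Not the thesis's Michigan figures.
options = [
    {"name": "wind", "cost_per_ton": 25.0, "tons": 5.0e6},
    {"name": "natural gas", "cost_per_ton": 20.0, "tons": 8.0e6},
    {"name": "biomass", "cost_per_ton": 45.0, "tons": 4.0e6},
]

def least_cost_mix(options, target_tons):
    """Fill the reduction target cheapest-first (a simple merit order)."""
    remaining, total_cost, mix = target_tons, 0.0, []
    for opt in sorted(options, key=lambda o: o["cost_per_ton"]):
        take = min(opt["tons"], remaining)
        if take <= 0:
            break
        mix.append((opt["name"], take))
        total_cost += take * opt["cost_per_ton"]
        remaining -= take
    return mix, total_cost

mix, total_cost = least_cost_mix(options, target_tons=10.0e6)
```

Policy levers such as the PTC enter this picture simply by lowering an option's `cost_per_ton`, which can reorder the merit order and change the chosen mix, as the thesis finds for wind.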
Abstract:
We present the first detailed application of Meadows’s cost-based modelling framework to the analysis of JFK, an Internet key agreement protocol. The analysis identifies two denial of service attacks against the protocol that are possible when an attacker is willing to reveal the source IP address. The first attack was identified through direct application of a cost-based modelling framework, while the second was only identified after considering coordinated attackers. Finally, we demonstrate how the inclusion of client puzzles in the protocol can improve denial of service resistance against both identified attacks.
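Client puzzles of the kind mentioned above are typically hash-based proofs of work: cheap for the server to issue and verify, but exponentially costly in the difficulty parameter for the client to solve, which forces an attacker to spend resources before the server commits any. Below is a generic hashcash-style sketch, not the specific construction analysed for JFK:

```python
import hashlib
import os

def make_puzzle(difficulty_bits=16):
    """Server side: a fresh random challenge; cheap to create."""
    return os.urandom(16), difficulty_bits

def leading_zero_bits(digest):
    """Count the leading zero bits of a byte string."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
        else:
            bits += 8 - byte.bit_length()
            break
    return bits

def solve(challenge, difficulty_bits):
    """Client side: brute-force a nonce; expected work is 2**difficulty_bits."""
    nonce = 0
    while True:
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if leading_zero_bits(digest) >= difficulty_bits:
            return nonce
        nonce += 1

def verify(challenge, difficulty_bits, nonce):
    """Server side: a single hash checks the claimed solution."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return leading_zero_bits(digest) >= difficulty_bits
```

The asymmetry is the point: raising `difficulty_bits` scales the client's expected work while the server's verification cost stays at one hash, which is how puzzles rebalance the cost model against the attacks identified in the paper.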