877 results for decision analysis


Relevance: 30.00%

Abstract:

This paper introduces a mixture model based on the beta distribution, without pre-established means and variances, to analyze a large set of Beauty-Contest data obtained from diverse groups of experiments (Bosch-Domenech et al. 2002). This model gives a better fit to the experimental data, and lends more precision to the hypothesis that a large proportion of individuals follow a common pattern of reasoning, described as iterated best reply (degenerate), than mixture models based on the normal distribution. The analysis shows that the means of the distributions are fairly stable across the groups of experiments, while the proportions of choices at different levels of reasoning vary across groups.
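
A minimal sketch (in Python, assuming the NumPy/SciPy stack) of the core idea: fitting a K-component beta mixture by maximum likelihood with free, not pre-established, component parameters. The function names and synthetic data are illustrative, not the paper's estimation code.

```python
import numpy as np
from scipy import optimize, stats

def neg_log_lik(theta, x, k):
    """Negative log-likelihood of a k-component beta mixture.

    theta packs k-1 weight logits, then log(alpha_i), then log(beta_i)."""
    logits = np.concatenate([theta[:k - 1], [0.0]])
    w = np.exp(logits - logits.max())
    w /= w.sum()                                   # mixture weights
    a = np.exp(theta[k - 1:2 * k - 1])             # alpha_i > 0
    b = np.exp(theta[2 * k - 1:])                  # beta_i > 0
    dens = sum(w[i] * stats.beta.pdf(x, a[i], b[i]) for i in range(k))
    return -np.log(np.clip(dens, 1e-300, None)).sum()

def fit_beta_mixture(x, k=2, seed=0):
    theta0 = np.random.default_rng(seed).normal(size=3 * k - 1)
    return optimize.minimize(neg_log_lik, theta0, args=(x, k),
                             method="Nelder-Mead",
                             options={"maxiter": 10_000})

# Synthetic choices loosely clustered near iterated-best-reply levels
# (e.g. around 33 and 22 on a 0-100 scale, rescaled into (0, 1)).
rng = np.random.default_rng(1)
choices = np.clip(np.concatenate([rng.beta(40, 80, 300),
                                  rng.beta(20, 70, 200)]), 1e-4, 1 - 1e-4)
print(fit_beta_mixture(choices, k=2).fun)
```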

Relevance: 30.00%

Abstract:

Standard practice in wave-height hazard analysis often pays little attention to the uncertainty of the assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, the uncertainty of the hazard estimates is normally able to hide the effect of such large events. This is illustrated using data from the Mediterranean coast of Spain, where the last few years have been extremely disastrous, making it possible to compare the hazard assessment based on data prior to those years with the analysis including them. With our approach, no significant change is detected when the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. The occurrence of events in time is assumed to be Poisson distributed. The wave height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave heights are assumed independent from event to event and also independent of their occurrence in time. A threshold for excesses is assessed empirically. The other three parameters (the Poisson rate and the shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for the physical features of ocean waves in the Mediterranean Sea and for experience with these phenomena. The posterior distribution of the parameters yields posterior distributions of derived quantities such as occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
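
A minimal sketch of the derived quantities described above: given a Poisson occurrence rate and a GPD upper tail for excesses over a threshold, the return period of a wave height follows from the annual exceedance rate. All parameter values below are hypothetical, not the estimates from the Spanish Mediterranean data; in the Bayesian setting of the paper one would evaluate `return_period` over posterior draws of the three parameters to obtain its posterior distribution.

```python
import numpy as np

def gpd_survival(h, u, sigma, xi):
    """P(H > h) for an event whose excess over u is GPD(sigma, xi)."""
    z = (h - u) / sigma
    if xi == 0.0:
        return np.exp(-z)                       # exponential-tail limit
    return max(1.0 + xi * z, 0.0) ** (-1.0 / xi)

def return_period(h, lam, u, sigma, xi):
    """Mean years between events exceeding wave height h."""
    rate = lam * gpd_survival(h, u, sigma, xi)  # exceedances per year
    return np.inf if rate == 0.0 else 1.0 / rate

# Hypothetical values: 4 storm events/year, threshold 3 m, sigma 1.2, xi 0.1.
print(return_period(7.0, lam=4.0, u=3.0, sigma=1.2, xi=0.1))
```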

Relevance: 30.00%

Abstract:

BACKGROUND: Guidelines surrounding maternal contact with the stillborn infant have been contradictory over the past thirty years. Most studies have reported that seeing and holding the stillborn baby is associated with fewer anxiety and depressive symptoms among mothers of stillborn babies than not doing so. In contrast, other studies suggest that contact with the stillborn infant can lead to poorer maternal mental health outcomes. There is a lack of research focusing on the maternal experience of this contact. The present study aimed to investigate how mothers describe their experience of spending time with their stillborn baby and how they felt retrospectively about the decision they made to see and hold their baby or not. METHOD: In-depth interviews were conducted with twenty-one mothers three months after stillbirth. All mothers had decided to see, and the majority to hold, their baby. Qualitative analysis of the interview data was performed using Interpretative Phenomenological Analysis. RESULTS: Six superordinate themes were identified: Characteristics of Contact; Physicality; Emotional Experience; Surreal Experience; Finality; and Decision. Having contact with their stillborn infant provided mothers with time to process what had happened, to build memories, and to 'say goodbye', often sharing the experience with partners and other family members. The majority of mothers felt satisfied with their decision to spend time with their stillborn baby. Several mothers talked about their fear of seeing a damaged or dead body. Some mothers experienced strong disbelief and dissociation during the contact. CONCLUSIONS: The results indicate that preparation before contact with the baby, professional support during the contact, and professional follow-up are crucial in order to prevent the development of maternal mental health problems. Fears of seeing a damaged or dead body should be sensitively explored and ways of coping discussed. Even in cases where mothers experienced intense distress during the contact with their stillborn baby, they still described having had this contact as important and said that they had made the right decision. This indicates a need to give parents an informed choice by engaging in discussions about the possible benefits and risks of seeing their stillborn baby.

Relevance: 30.00%

Abstract:

The paper deals with the development and application of a methodology for the automatic mapping of pollution/contamination data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool for this problem. The automatic tuning of isotropic and anisotropic GRNN models using a cross-validation procedure is presented. Results are compared with a k-nearest-neighbours interpolation algorithm using an independent validation data set. The quality of the mapping is controlled by analysing the raw data and the residuals using variography. Maps of the probability of exceeding a given decision level and 'thick' isoline visualizations of the uncertainties are presented as examples of decision-oriented mapping. The real case study is based on the mapping of radioactively contaminated territories.
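
A minimal sketch of an isotropic GRNN (equivalently, Nadaraya-Watson kernel regression with a Gaussian kernel) with its single bandwidth tuned by leave-one-out cross-validation, in the spirit of the automatic tuning described above; the data and names are illustrative. An anisotropic model would replace the scalar `sigma` with one bandwidth per coordinate.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma):
    # Squared distances between every query point and every training point.
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / np.clip(w.sum(axis=1), 1e-300, None)

def loo_cv_error(X, y, sigma):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(w, 0.0)                    # leave each point out
    pred = (w @ y) / np.clip(w.sum(axis=1), 1e-300, None)
    return ((pred - y) ** 2).mean()

def tune_sigma(X, y, grid):
    errs = [loo_cv_error(X, y, s) for s in grid]
    return grid[int(np.argmin(errs))]

# Synthetic sampling locations and measurements standing in for real data.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
sigma = tune_sigma(X, y, np.geomspace(0.05, 5.0, 30))
print(sigma, grnn_predict(X, y, X[:5], sigma))
```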

Relevance: 30.00%

Abstract:

Decisions taken in modern organizations are often multi-dimensional, involving multiple decision makers and several criteria measured on different scales. Multiple Criteria Decision Making (MCDM) methods are designed to analyze and to give recommendations in these kinds of situations. Among the numerous MCDM methods, two large families are the multi-attribute utility theory based methods and the outranking methods. Traditionally, both families require exact values for technical parameters and criteria measurements, as well as for preferences expressed as weights. Often it is hard, if not impossible, to obtain exact values. Stochastic Multicriteria Acceptability Analysis (SMAA) is a family of methods designed to help in situations where exact values are not available. Different variants of SMAA allow handling all types of MCDM problems. They support defining the model through uncertain, imprecise, or completely missing values. The methods are based on simulation, which is applied to obtain descriptive indices characterizing the problem. In this thesis we present new advances in the SMAA methodology. We present and analyze algorithms for the SMAA-2 method and its extension to handle ordinal preferences. We then present an application of SMAA-2 to an area where MCDM models have not been applied before: planning elevator groups for high-rise buildings. Following this, we introduce two new methods to the family: SMAA-TRI, which extends ELECTRE TRI for sorting problems with uncertain parameter values, and SMAA-III, which extends ELECTRE III in a similar way. Efficient software implementing these two methods has been developed in conjunction with this work and is briefly presented in this thesis. The thesis closes with a comprehensive survey of SMAA methodology, including the definition of a unified framework.
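
A minimal sketch of the simulation idea behind SMAA-2, assuming an additive value model: criteria weights are sampled uniformly from the simplex, the alternatives are ranked under each weight draw, and the share of the weight space giving each alternative each rank (the rank acceptability index) is recorded. The value matrix is illustrative, not from the thesis.

```python
import numpy as np

def rank_acceptability(values, n_draws=10_000, seed=0):
    """values: (n_alternatives, n_criteria) partial values, higher = better."""
    n_alt, n_crit = values.shape
    rng = np.random.default_rng(seed)
    # Uniform sampling on the weight simplex via normalized exponentials.
    w = rng.exponential(size=(n_draws, n_crit))
    w /= w.sum(axis=1, keepdims=True)
    total = w @ values.T                              # (n_draws, n_alt)
    ranks = (-total).argsort(axis=1).argsort(axis=1)  # 0 = best
    idx = np.zeros((n_alt, n_alt))
    for r in range(n_alt):
        idx[:, r] = (ranks == r).mean(axis=0)
    return idx  # idx[i, r] = share of weight space giving alt i rank r

vals = np.array([[0.9, 0.2, 0.5],
                 [0.4, 0.8, 0.6],
                 [0.6, 0.6, 0.4]])
print(rank_acceptability(vals).round(3))
```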

Relevance: 30.00%

Abstract:

Nowadays the variety of fuels used in power boilers is widening, and new boiler designs and operating models have to be developed. This research and development is done in small pilot plants, where a faster analysis of the boiler mass and heat balance is needed in order to make the right decisions already during the test run. The barrier to determining the boiler balance during test runs is the long process of chemically analysing the collected input and output matter samples. The present work concentrates on finding a way to determine the boiler balance without chemical analyses and on optimising the test rig to get the best possible accuracy for the heat and mass balance of the boiler. The purpose of this work was to create an automatic boiler balance calculation method for the 4 MW CFB/BFB pilot boiler of Kvaerner Pulping Oy located in Messukylä in Tampere. The calculation was created in the data management computer of the pilot plant's automation system. The calculation is made in a Microsoft Excel environment, which provides a good basis and functions for handling large databases and calculations without any intricate programming. The automation system in the pilot plant was reconstructed and updated by Metso Automation Oy during 2001, and the new system, MetsoDNA, has good data management properties, which is necessary for large calculations such as the boiler balance calculation. Two possible methods for calculating the boiler balance during a test run were found. Either the fuel flow is determined and used to calculate the boiler's mass balance, or the unburned carbon loss is estimated and the mass balance of the boiler is calculated on the basis of the boiler's heat balance. Both methods have their own weaknesses, so they were implemented in parallel in the calculation and the choice of method was left to the user. The user also needs to define the fuels used and some solid mass flows that aren't measured automatically by the automation system. A sensitivity analysis showed that the most essential values for an accurate boiler balance determination are the flue gas oxygen content, the boiler's measured heat output and the lower heating value of the fuel. The theoretical part of this work concentrates on the error management of these measurements and analyses, and on measurement accuracy and boiler balance calculation in theory. The empirical part concentrates on the creation of the balance calculation for the boiler in question and on describing the work environment.
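
A minimal sketch of the second method described above, with illustrative numbers rather than the pilot plant's actual Excel calculation: assume an unburned-carbon loss, then back out the fuel mass flow from the measured heat output and the fuel's lower heating value.

```python
def fuel_flow_from_heat_balance(q_out_kw, lhv_kj_per_kg,
                                eta_boiler=0.88, carbon_loss=0.02):
    """Fuel mass flow [kg/s] consistent with the measured heat output.

    q_out_kw      measured heat output [kW]
    lhv_kj_per_kg lower heating value of the (possibly mixed) fuel [kJ/kg]
    eta_boiler    assumed overall boiler efficiency excluding carbon loss
    carbon_loss   assumed fraction of fuel energy lost as unburned carbon
    """
    usable = lhv_kj_per_kg * eta_boiler * (1.0 - carbon_loss)  # kJ/kg
    return q_out_kw / usable

# 4 MW output with a biomass LHV of ~18 MJ/kg gives roughly 0.26 kg/s.
print(fuel_flow_from_heat_balance(4000.0, 18000.0))
```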

Relevance: 30.00%

Abstract:

Background Analysing the observed differences in incidence or mortality of a particular disease between two situations (such as time points, geographical areas, gender or other social characteristics) can be useful for both scientific and administrative purposes. From an epidemiological and public health point of view, it is of great interest to assess the effect of demographic factors on these observed differences in order to elucidate the effect of the risk of developing a disease or dying from it. The method proposed by Bashir and Estève, which splits the observed variation into three components (risk, population structure and population size), is a common choice in practice. Results A web-based application called RiskDiff has been implemented (available at http://rht.iconcologia.net/riskdiff.htm) to perform this kind of statistical analysis, providing text and graphical summaries. The R code of the implemented functions is also provided. An application to cancer mortality data from Catalonia is used for illustration. Conclusions Combining epidemiological and demographic factors is crucial for analysing incidence or mortality from a disease, especially if the population pyramids show substantial differences. The implemented tool may serve to promote and disseminate the use of this method, and to support epidemiological interpretation and decision making in public health.
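
A minimal sketch of the kind of decomposition RiskDiff implements. The exact Bashir and Estève formulas are not reproduced here; this shows one standard sequential split of the difference in case counts between two populations into population size, age structure and risk components, with illustrative age-specific rates and population counts.

```python
import numpy as np

def decompose(pop1, rate1, pop2, rate2):
    """pop*: persons per age group; rate*: cases per person per age group."""
    n1, n2 = pop1.sum(), pop2.sum()
    w1, w2 = pop1 / n1, pop2 / n2                  # age structures
    cases1 = n1 * (w1 * rate1).sum()
    cases2 = n2 * (w2 * rate2).sum()
    size      = (n2 - n1) * (w1 * rate1).sum()     # grow population, old rates
    structure = n2 * ((w2 - w1) * rate1).sum()     # shift the age pyramid
    risk      = n2 * (w2 * (rate2 - rate1)).sum()  # change the rates
    assert np.isclose(cases2 - cases1, size + structure + risk)
    return size, structure, risk

pop1  = np.array([50_000, 30_000, 20_000])         # young / middle / old
pop2  = np.array([48_000, 33_000, 30_000])
rate1 = np.array([0.0001, 0.0010, 0.0080])
rate2 = np.array([0.0001, 0.0009, 0.0075])
print(decompose(pop1, rate1, pop2, rate2))
```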

Relevance: 30.00%

Abstract:

It is clear that the centre of gravity of world trade is gradually shifting to Asia, and China in particular has been the focus of attention. From the perspective of manufacturing companies especially, the change has been significant, and this fact increases the pressure on companies to create cost-effective supply chain solutions with the shortest possible response time. At the same time, an examination of transport flows reveals a large imbalance between the continents. This is mostly a consequence of the supply chain strategies of large, globally operating companies. Most of these players optimize their networks by resorting to 'local sourcing' in order to better control their supply chains and make them more responsive. Manufacturing units in Europe are therefore often forced to use expensive raw materials and semi-finished products. Transport and warehousing costs prove to be critical factors, as does the resulting idle time caused by delays. To reach an optimal solution, a decision has to be made on how products are warehoused (centrally or in a decentralized manner), and this choice must be integrated with suitable transport modes. Sea freight from Asia to Northern Europe is cheap, but the operation takes a very long time, in some cases up to eight weeks. Air freight, on the other hand, is both expensive and limits the batch size of the goods transported. There is also a third option that could provide a solution: rail freight is cheaper than air freight, and its response times are shorter than those of sea freight. In this study, the situation is examined through a survey directed at companies operating in Finland and Sweden. Based on the results, we draw conclusions about the future market shares of the transport modes and form a picture of the transport flows between Europe, Russia, South Korea, India, China and Japan. At the same time, the aim is to anticipate how the companies under review intend to develop their transport and warehousing in the coming years. The results suggest that transport costs will not change significantly over the next five years and that sea and road transport will remain the most popular options. However, the share of air freight will decline slightly, while the weight given to rail freight will grow. The results reveal that container volumes transported in China and Russia will grow; for India the result points in the same direction, although less strongly. According to our analysis, the imbalance in transport flows will persist for Russian traffic: companies will continue their export-based strategy in the future. On the warehousing side, we identify a smaller shift: the number of small warehouses is likely to decrease in the future, while interest in large warehouses increases. It should be noted here that Finnish companies have more warehouses in Central and Eastern Europe than Swedish players, who focus more clearly on Western European countries. In both cases, companies keep much of their warehousing in their home country. When choosing locations for their warehouses, companies emphasize the following criteria: low distribution costs, proximity to the assembly site/manufacturing plant, the integrability of inbound logistics, and the available logistics services. At the end of our study, we conclude that warehouse locations will not change very quickly, owing to the structure of ports and existing transport connections.

Relevance: 30.00%

Abstract:

Life cycle analysis (LCA) is a comprehensive method for assessing the environmental impact of a product or an activity over its entire life cycle. The purpose of conducting LCA studies varies from one application to another. In general, the main aim of using LCA is to reduce the environmental impact of products by guiding the decision-making process towards more sustainable solutions. The most critical phase in an LCA study is the Life Cycle Impact Assessment (LCIA), where the life cycle inventory (LCI) results for the substances related to the studied system are transformed into understandable impact categories that represent the impact on the environment. In this research work, a general structure clarifying the steps that should be followed in order to conduct an LCA study effectively is presented. These steps are based on the ISO 14040 standard framework. In addition, a survey of the most widely used LCIA methodologies is given. Recommendations about possible developments and suggestions for further research regarding the use of LCA and LCIA methodologies are discussed as well.

Relevance: 30.00%

Abstract:

Due to intense international competition, demanding and sophisticated customers, and diverse, transformative technological change, organizations need to renew their products and services by allocating resources to research and development (R&D). Managing R&D is complex but vital for many organizations to survive in a dynamic, turbulent environment. Thus, the increased interest among decision-makers in finding the right performance measures for R&D is understandable. The measures or evaluation methods of R&D performance can be utilized for multiple purposes: for strategic control, for justifying the existence of R&D, for providing information and improving activities, as well as for motivating and benchmarking. Earlier research in the field of R&D performance analysis has generally focused either on the activities and relevant factors and dimensions (e.g. strategic perspectives, purposes of measurement, levels of analysis, types of R&D or phases of the R&D process) prior to the selection of R&D performance measures, or on proposed principles or the actual implementation of the selection or design processes of R&D performance measures or measurement systems. This study aims at integrating the consideration of the essential factors and dimensions of R&D performance analysis into developed selection processes of R&D measures that have been applied in real-world organizations. The earlier models for corporate performance measurement found in the literature are to some extent adaptable to the development of measurement systems and the selection of measures in R&D activities. However, it is necessary to emphasize the special aspects of measuring R&D performance that make the development of new approaches, especially for R&D performance measure selection, necessary. First, the special characteristics of R&D, such as the long time lag between inputs and outcomes, as well as the overall complexity and difficult coordination of activities, give rise to R&D performance analysis problems such as the need for more systematic, objective, balanced and multi-dimensional approaches to R&D measure selection, as well as the incompatibility of R&D measurement systems with other corporate measurement systems and vice versa. Secondly, these characteristics and challenges bring forth the significance of the influencing factors and dimensions that need to be recognized in order to derive the selection criteria for measures and choose the right R&D metrics, which is the most crucial step in the measurement system development process. The main purpose of this study is to support the management and control of the research and development activities of organizations by increasing the understanding of R&D performance analysis, clarifying the main factors related to the selection of R&D measures, and providing novel approaches and methods for systematizing the whole strategy- and business-based selection and development process of R&D indicators. The final aim of the research is to support management in their R&D decision making with suitable, systematically chosen measures or evaluation methods of R&D performance. Thus, the emphasis in most sub-areas of the present research has been on promoting the selection and development process of R&D indicators with the help of different tools and decision support systems; that is, the research has normative features, providing guidelines through novel types of approaches.
Gathering data and conducting case studies in metal and electronics industry companies, in the information and communications technology (ICT) sector, and in non-profit organizations helped us to formulate a comprehensive picture of the main challenges of R&D performance analysis in different organizations, which is essential, as the recognition of the most important problem areas is a crucial element of the constructive research approach utilized in this study. Multiple practical benefits regarding the defined problem areas could be found in the various constructed approaches presented in this dissertation: 1) the selection of R&D measures became more systematic compared with the empirical analysis, as it was common that no systematic approaches had been utilized in the studied organizations earlier; 2) the evaluation methods or measures of R&D chosen with the help of the developed approaches can be utilized more directly in decision-making, because of the thorough consideration of the purpose of measurement, as well as other dimensions of measurement; 3) more balance in the set of R&D measures was desired and was gained through the holistic approaches to the selection processes; and 4) more objectivity was gained by organizing the selection processes, as the earlier systems were considered subjective in many organizations. Scientifically, this dissertation aims to contribute to the present body of knowledge of R&D performance analysis by facilitating the handling of the versatility and challenges of R&D performance analysis, as well as the factors and dimensions influencing the selection of R&D performance measures, and by integrating these aspects into the developed novel approaches, methods and tools in the selection processes of R&D measures, applied in real-world organizations. Throughout the research, the handling of the versatility and challenges of R&D performance analysis, as well as the factors and dimensions influencing R&D performance measure selection, is strongly integrated with the constructed approaches. Thus, the research meets the above-mentioned purposes and objectives of the dissertation from both the scientific and the practical point of view.

Relevance: 30.00%

Abstract:

BACKGROUND: Pneumonia is the biggest cause of death in young children in developing countries, but early diagnosis and intervention can effectively reduce mortality. We aimed to assess the diagnostic value of clinical signs and symptoms for identifying radiological pneumonia in children younger than 5 years and to review the accuracy of the WHO criteria for diagnosis of clinical pneumonia. METHODS: We searched Medline (PubMed), Embase (Ovid), the Cochrane Database of Systematic Reviews, and reference lists of relevant studies, without date restrictions, to identify articles assessing clinical predictors of radiological pneumonia in children. Selection was based on: design (diagnostic accuracy studies), target disease (pneumonia), participants (children aged <5 years), setting (ambulatory or hospital care), index test (clinical features), and reference standard (chest radiography). Quality assessment was based on the 2011 Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2) criteria. For each index test, we calculated sensitivity and specificity and, when the tests were assessed in four or more studies, calculated pooled estimates with use of a bivariate model and hierarchical summary receiver operating characteristic plots for meta-analysis. FINDINGS: We included 18 articles in our analysis. The WHO-approved signs of age-related fast breathing (six studies; pooled sensitivity 0·62, 95% CI 0·26-0·89; specificity 0·59, 0·29-0·84) and lower chest wall indrawing (four studies; 0·48, 0·16-0·82; 0·72, 0·47-0·89) showed poor diagnostic performance in the meta-analysis. Features with the highest pooled positive likelihood ratios were respiratory rate higher than 50 breaths per min (1·90, 1·45-2·48), grunting (1·78, 1·10-2·88), chest indrawing (1·76, 0·86-3·58), and nasal flaring (1·75, 1·20-2·56). Features with the lowest pooled negative likelihood ratios were cough (0·30, 0·09-0·96), history of fever (0·53, 0·41-0·69), and respiratory rate higher than 40 breaths per min (0·43, 0·23-0·83). INTERPRETATION: No single clinical feature was sufficient to diagnose pneumonia definitively. Combining clinical features in a decision tree might improve diagnostic performance, but the addition of new point-of-care tests for the diagnosis of bacterial pneumonia would help to attain an acceptable level of accuracy. FUNDING: Swiss National Science Foundation.
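
A minimal sketch of how the quoted likelihood ratios relate to sensitivity and specificity: LR+ = sens / (1 - spec) and LR- = (1 - sens) / spec. Note that the paper's pooled LRs come from a bivariate meta-analysis model, so they need not equal these point computations.

```python
def likelihood_ratios(sens, spec):
    """Positive and negative likelihood ratios from sensitivity/specificity."""
    return sens / (1.0 - spec), (1.0 - sens) / spec

# Pooled estimates for age-related fast breathing quoted in the abstract:
lr_pos, lr_neg = likelihood_ratios(0.62, 0.59)
print(round(lr_pos, 2), round(lr_neg, 2))   # ~1.51 and ~0.64
```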

Relevance: 30.00%

Abstract:

Occupational exposure modeling is widely used in the context of the E.U. regulation on the registration, evaluation, authorization, and restriction of chemicals (REACH). First-tier tools, such as the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) targeted risk assessment (TRA) or Stoffenmanager, are used to screen a wide range of substances. Those of concern are investigated further using second-tier tools, e.g., the Advanced REACH Tool (ART). Local sensitivity analysis (SA) methods are used here to determine the dominant factors for three models commonly used within the REACH framework: ECETOC TRA v3, Stoffenmanager 4.5, and ART 1.5. Based on the results of the SA, the robustness of the models is assessed. For ECETOC, the process category (PROC) is the most important factor; a failure to identify the correct PROC has severe consequences for the exposure estimate. Stoffenmanager is the most balanced model, and decision-making uncertainties in one modifying factor are less severe in Stoffenmanager. ART requires a careful evaluation of the decisions in the source compartment, since it constitutes ∼75% of the total exposure range, which corresponds to an exposure estimate spanning 20-22 orders of magnitude. Our results indicate that there is a trade-off between the accuracy and the precision of the models. Previous studies suggested that ART may lead to more accurate results in well-documented exposure situations. However, the choice of the adequate model should ultimately be determined by the quality of the available exposure data: if the practitioner is uncertain concerning two or more decisions in the entry parameters, Stoffenmanager may be more robust than ART.
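
A minimal sketch of a one-at-a-time local sensitivity analysis on a generic multiplicative exposure model of the kind these tools implement (exposure as a product of modifying factors). The factor names and ranges are illustrative, not the actual model parameters; each factor's share of the total log10 exposure range indicates its dominance, analogous to the source compartment dominating ART above.

```python
import numpy as np

factors = {                      # (low, baseline, high) multipliers
    "source_strength": (1e-2, 1.0, 1e6),
    "local_controls":  (0.1,  1.0, 1.0),
    "dilution":        (0.3,  1.0, 3.0),
    "duration":        (0.25, 1.0, 1.0),
}

def exposure(vals):
    # Multiplicative model: exposure is the product of all factors.
    return np.prod(list(vals.values()))

baseline = {k: v[1] for k, v in factors.items()}
spans = {}
for name, (lo, _, hi) in factors.items():
    # Sweep one factor over its range, keep the others at baseline.
    e_lo = exposure({**baseline, name: lo})
    e_hi = exposure({**baseline, name: hi})
    spans[name] = abs(np.log10(e_hi) - np.log10(e_lo))

total = sum(spans.values())
for name, s in sorted(spans.items(), key=lambda kv: -kv[1]):
    print(f"{name:16s} {s:5.1f} decades  ({100 * s / total:4.1f}% of range)")
```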

Relevance: 30.00%

Abstract:

The evaluation of investments in advanced technology is one of the most important decision-making tasks. The importance is even more pronounced considering the huge budgets involved and the strategic, economic and analytic justification required to shorten design and development time. Choosing the most appropriate technology requires an accurate and reliable system that can guide decision makers through such a complicated task. Currently, several Information and Communication Technology (ICT) manufacturers that design global products are seeking local firms to act as their sales and service representatives (called distributors) to the end user. At the same time, the end user or customer is also searching for the best possible deal for their investment in ICT projects. Therefore, the objective of this research is to present a holistic decision support system to assist decision makers in Small and Medium Enterprises (SMEs), working either individually or in a group, in evaluating the investment to become an ICT distributor or an ICT end user. The model is composed of the Delphi/MAH (Maximising Agreement Heuristic) analysis, a well-known quantitative method in Group Support Systems (GSS), which is applied to gather the average ranking data from among Decision Makers (DMs). After that, the Analytic Network Process (ANP) analysis is brought in to analyse holistically: it performs quantitative and qualitative analysis simultaneously. The illustrative data are obtained from industrial entrepreneurs by using the Group Support System (GSS) laboratory facilities at Lappeenranta University of Technology, Finland, and in Thailand. The result of the research, which is currently implemented in Thailand, can provide benefits to the industry in the evaluation of becoming an ICT distributor or an ICT end user, particularly in the assessment of an Enterprise Resource Planning (ERP) programme. After the model was put to the test in in-depth collaboration with industrial entrepreneurs in Finland and Thailand, a sensitivity analysis was also performed to validate the robustness of the model. The contribution of this research lies in developing a new approach and the Delphi/MAH software to obtain an analysis of the value of becoming an ERP distributor or end user that is flexible and applicable to entrepreneurs who are looking for the most appropriate investment to become an ERP distributor or end user. The main advantage of this research over others is that the model can deliver the value of becoming an ERP distributor or end user as a single number, which makes it easier for DMs to choose the most appropriate ERP vendor. An associated advantage is that the model can include qualitative as well as quantitative data, as the results from using quantitative data alone can be misleading and inadequate. There is a need to utilise quantitative and qualitative analysis together, as can be seen from the case studies.
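
A minimal sketch of the first stage described above: aggregating several decision makers' rankings into an average ranking. This is a plain Borda-style average for illustration, not the actual MAH algorithm, which iteratively seeks the ranking that maximises agreement among the DMs; the rank matrix is invented.

```python
import numpy as np

# rows = decision makers, columns = alternatives, entries = rank (1 = best)
ranks = np.array([[1, 3, 2, 4],
                  [2, 1, 3, 4],
                  [1, 2, 4, 3]])

mean_rank = ranks.mean(axis=0)        # average rank per alternative
order = np.argsort(mean_rank)         # consensus order of the alternatives
print(mean_rank, "consensus order:", order)
```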

Relevance: 30.00%

Abstract:

ABSTRACT The traditional net present value (NPV) method for analyzing the economic profitability of an investment (based on a deterministic approach) does not adequately represent the implicit risk associated with different but correlated input variables. Using a stochastic simulation approach to evaluate the profitability of blueberry (Vaccinium corymbosum L.) production in Chile, the objective of this study is to illustrate the complexity of including risk in an economic feasibility analysis when the project is subject to several correlated risks. The results of the simulation analysis suggest that not including the intratemporal correlation between input variables underestimates the risk associated with investment decisions. The methodological contribution of this study illustrates the complexity of the interrelationships between uncertain variables and their impact on the viability of carrying out this type of business in Chile. The steps for the analysis of economic viability were as follows. First, fitted probability distributions for the stochastic input variables (SIV) were simulated and validated. Second, the random values of the SIV were used to calculate random values of variables such as production, revenues, costs, depreciation, taxes and net cash flows. Third, the complete stochastic model was simulated with 10,000 iterations using random values for the SIV. This gave the information needed to estimate the probability distributions of the stochastic output variables (SOV), such as the net present value, internal rate of return, value at risk, average cost of production, contribution margin and return on capital. Fourth, the results of the complete stochastic model simulation were used to analyze alternative scenarios and to provide the results to decision makers in the form of probabilities, probability distributions, and probabilistic forecasts for the SOV. The main conclusion is that this project is a profitable alternative for investment in fruit crops in Chile.
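
A minimal sketch of the stochastic NPV simulation described above, with intratemporal correlation between two input variables (price and yield) induced via a Cholesky factor of their correlation matrix. All figures are illustrative, not the study's blueberry data.

```python
import numpy as np

def simulate_npv(n=10_000, years=10, rate=0.10, invest=100_000,
                 rho=-0.4, seed=0):
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
    z = rng.standard_normal((n, years, 2)) @ L.T   # correlated normals
    price = 4.0 * np.exp(0.15 * z[..., 0])         # $/kg, lognormal
    yld = 9_000.0 * np.exp(0.20 * z[..., 1])       # kg/ha, lognormal
    cost = 20_000.0                                # $/ha/year, fixed
    cash = price * yld - cost                      # net cash flow per year
    disc = (1.0 + rate) ** -np.arange(1, years + 1)
    return cash @ disc - invest                    # NPV per iteration

npv = simulate_npv()
print(f"mean NPV {npv.mean():,.0f}, P(NPV < 0) = {(npv < 0).mean():.3f}")
```

Setting `rho=0.0` and comparing the spread of the NPV distribution against the correlated case shows directly how ignoring the intratemporal correlation changes the estimated risk.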

Relevance: 30.00%

Abstract:

The GH-2000 and GH-2004 projects have developed a method for detecting GH misuse based on measuring insulin-like growth factor-I (IGF-I) and the amino-terminal pro-peptide of type III collagen (P-III-NP). The objectives were to analyze more samples from elite athletes to improve the reliability of the decision limit estimates, to evaluate whether the existing decision limits needed revision, and to validate further non-radioisotopic assays for these markers. The study included 998 male and 931 female elite athletes. Blood samples were collected according to World Anti-Doping Agency (WADA) guidelines at various sporting events, including the 2011 International Association of Athletics Federations (IAAF) World Athletics Championships in Daegu, South Korea. IGF-I was measured by the Immunotech A15729 IGF-I IRMA, the Immunodiagnostic Systems iSYS IGF-I assay and a recently developed mass spectrometry (LC-MS/MS) method. P-III-NP was measured by the Cisbio RIA-gnost P-III-P, Orion UniQ PIIINP RIA and Siemens ADVIA Centaur P-III-NP assays. The GH-2000 score decision limits were developed using existing statistical techniques. Decision limits were determined using a specificity of 99.99% and an allowance for uncertainty because of the finite sample size. The revised decision limit for the Immunotech IGF-I and Orion P-III-NP assay combination did not change significantly following the addition of the new samples. The new decision limits apply to currently available non-radioisotopic assays for measuring IGF-I and P-III-NP in elite athletes, which should allow wider flexibility to implement the GH-2000 marker test for GH misuse while providing some resilience against manufacturer withdrawal or change of assays.
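
A minimal sketch of setting a decision limit at 99.99% specificity from marker scores in a clean reference population, with an allowance for the finite sample size. It assumes normally distributed scores and uses a standard large-sample approximation for the uncertainty of an estimated quantile; this illustrates the idea only and is not the GH-2000 procedure itself.

```python
import numpy as np
from scipy import stats

def decision_limit(scores, specificity=0.9999, confidence=0.95):
    n = len(scores)
    m, s = scores.mean(), scores.std(ddof=1)
    z = stats.norm.ppf(specificity)
    # Approximate standard error of the estimated quantile m + z*s.
    se = s * np.sqrt(1.0 / n + z ** 2 / (2.0 * (n - 1)))
    # Shift the limit upward to allow for finite-sample uncertainty.
    return m + z * s + stats.norm.ppf(confidence) * se

# Synthetic standardized scores standing in for, e.g., 998 male athletes.
scores = np.random.default_rng(0).normal(0.0, 1.0, 998)
print(decision_limit(scores))   # slightly above the plain 99.99% quantile
```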