998 results for uncertainty factor
Abstract:
This master's thesis was carried out for the HSE Services of Neste Oil Oyj's Development and Laboratories unit. The aim of the work was to assess the measurement uncertainty of the results of Neste Oil's statutory environmental-impact monitoring. The review covered ambient air quality measurements of SO2, NO2, TRS and O3, environmental noise measurements, and groundwater sampling. The Environmental Protection Act (86/2000) obliges production plants to determine the environmental impacts of their operations. Measurement uncertainty must also be known, for example, when methods are accredited. It has been estimated that future directives will tighten emission limit values and that the concept of measurement uncertainty will come into use across all environmental sectors. In this work, the measurement uncertainty of the air quality measurements was assessed by comparing Neste Oil's results with the reference measurements and calibrations of the Finnish Meteorological Institute. The measurement uncertainty of environmental noise was assessed according to the Ministry of the Environment's environmental noise measurement guideline (1/1995). The measurement uncertainty of groundwater sampling was assessed by calculating the combined measurement uncertainty of the temporal variation of contaminants, the sampling methods, contamination caused by sample transport and storage, and the uncertainty components of the analysis stage. The review found that the air quality results did not deviate significantly from the reference measurements and calibrations; the expanded measurement uncertainty of the methods was 6–8%. The measurement uncertainty of environmental noise corresponded to the values given in the measurement guideline, ranging from 2 to 10 dB depending on measurement distance and the number of measurements. No quality targets have been set for the measurement uncertainty of groundwater sampling; in this review, it was estimated at 33%.
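The combination of groundwater-sampling uncertainty components described above follows the usual root-sum-of-squares rule for independent components. A minimal sketch, with hypothetical relative uncertainties (the thesis's actual budget values are not given here):

```python
import math

def combined_standard_uncertainty(components):
    """GUM-style root-sum-of-squares combination of independent
    relative standard-uncertainty components (all in %)."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical components (%): temporal variation of contaminants,
# sampling method, transport/storage contamination, analysis stage.
u_c = combined_standard_uncertainty([12.0, 8.0, 5.0, 7.0])
print(f"combined: {u_c:.1f}%  expanded (k=2): {2 * u_c:.1f}%")
```

With these invented numbers the expanded (k=2) value happens to land near the 33% reported above, but the point is only the combination rule, not the specific components.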
Abstract:
A default uncertainty factor of 10 is used when deriving toxicological reference values in environmental health, to account for interindividual variability in the population. The toxicokinetic component of this variability corresponds to the square root of 10, i.e. 3.16. Its validity has previously been studied on the basis of pharmaceutical data collected from various populations (adults, children, the elderly). The value of 3.16 can thus be compared with the human kinetic adjustment factor (HKAF), defined as the ratio between a high percentile (e.g. the 95th) of the internal-dose distribution in presumed sensitive subgroups and its median in adults, or within a general population. However, experimental human data on environmental pollutants are scarce, and these substances generally have properties quite different from those of drugs, so it is difficult to validate, for pollutants, estimates made from drug data. To address this, physiologically based toxicokinetic (PBTK) modelling has been used to simulate the interindividual variability of internal doses during exposure to pollutants. To date, however, studies have provided little insight into the impact of exposure conditions (i.e. route, duration, intensity), of the physico-/biochemical properties of pollutants, and of the characteristics of the exposed population on the value of the HKAF, and hence on the validity of the default value of 3.16. This thesis aims to fill these gaps. Using Monte Carlo simulations, a PBTK model was first used to simulate the interindividual variability of internal doses (i.e. in adults, the elderly, children and pregnant women) of drinking-water contaminants during oral, inhalation or dermal exposure.
Second, such a model was used to simulate this variability during inhalation of contaminants at varying intensities and durations. Next, a probabilistic steady-state toxicokinetic algorithm was used to estimate the interindividual variability of internal doses during chronic exposure to hypothetical contaminants with varying physico-/biochemical properties; volatility, fraction metabolized, metabolic pathway and oral bioavailability were each analysed specifically. Finally, the impact of the chosen reference group and of demographic characteristics on the HKAF during chronic inhalation was evaluated, again using a steady-state toxicokinetic algorithm. The internal-dose distributions generated in the various scenarios were used to compute the HKAF in each case, following the approach described above. The study highlighted the various determinants of toxicokinetic sensitivity according to the subgroup and the internal-dose metric considered. It characterized the determinants of the HKAF and thus the cases in which it exceeds the default value of 3.16 (up to 28.3), observed almost exclusively in neonates and for the parent compound. This thesis improves knowledge in toxicological risk analysis by characterizing the HKAF under a variety of conditions.
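The HKAF computation described above, a high percentile of the internal-dose distribution in a sensitive subgroup divided by the adult median, can be sketched with Monte Carlo draws. The lognormal parameters below are hypothetical stand-ins, not outputs of the thesis's PBTK model:

```python
import random
import statistics

def percentile(values, p):
    """Linear-interpolated percentile of a sample (p in [0, 100])."""
    xs = sorted(values)
    k = (len(xs) - 1) * p / 100.0
    lo = int(k)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

def internal_doses(n, mu, sigma, rng):
    """Lognormal internal doses (arbitrary units) for one subgroup."""
    return [rng.lognormvariate(mu, sigma) for _ in range(n)]

rng = random.Random(42)
# Hypothetical parameters: neonates assumed to have higher and more
# variable internal doses than adults.
adults = internal_doses(10_000, 0.0, 0.3, rng)
neonates = internal_doses(10_000, 0.8, 0.6, rng)

# HKAF: high percentile of the sensitive subgroup over the adult median.
hkaf = percentile(neonates, 95) / statistics.median(adults)
print(f"HKAF = {hkaf:.2f} (default toxicokinetic factor = 3.16)")
```

With these invented parameters the HKAF comfortably exceeds 3.16, which mirrors the thesis's qualitative finding for neonates; the actual exceedances depend on the PBTK model, not on this toy distribution.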
Abstract:
When companies evaluate the results of their strategic and marketing planning, they often face scepticism about how that planning improved or harmed perceptions of the company and its products. Using a computational simulation tool, this project proposes to reduce LG Electronics' uncertainty factor regarding the brand value perceived by the population of Bogotá D.C. for each of its product lines, and regarding the appropriate degree of investment in marketing, advertising, distribution and service. To this end, consumers are modelled as intelligent agents with the power to make recommendations, driven mainly by the experience generated by the product and by the degree of influence of the marketing strategies that affect their purchase, preference and retention decisions. Additionally, the return of the company's marketing investments, in profits and in brand recall, is measured.
Abstract:
When evaluating the results of strategic planning and marketing processes, company management faces a degree of uncertainty, not knowing whether those plans affected the firm's position in its environment positively or negatively. This work uses a simulation tool based on intelligent agents to reduce that uncertainty factor, in this case for the company Corgranos S.A. The aim is to model the behaviour of the population groups directly and indirectly involved with the company in order to fine-tune the efforts directed at each of them; the difference between the initial assumptions and the simulation results constitutes the work's real contribution.
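The agent-based approach used in both projects above can be illustrated with a deliberately minimal model: agents buy when perceived value, combining product experience, peer recommendation and marketing pressure, crosses a personal threshold. Every attribute, weight and threshold below is invented for illustration:

```python
import random

rng = random.Random(5)

class Consumer:
    """Minimal agent: buys when perceived value (own experience plus
    peer recommendation plus marketing pressure) crosses a threshold."""
    def __init__(self):
        self.experience = rng.uniform(0.3, 0.9)
        self.threshold = rng.uniform(0.4, 0.8)
        self.buys = False

def step(agents, marketing, peer_weight=0.3):
    """One synchronous update: the current share of buyers acts as the
    recommendation signal seen by every agent."""
    buying_share = sum(a.buys for a in agents) / len(agents)
    for a in agents:
        value = 0.5 * a.experience + peer_weight * buying_share + 0.2 * marketing
        a.buys = value > a.threshold

agents = [Consumer() for _ in range(1000)]
for _ in range(20):
    step(agents, marketing=0.6)
share = sum(a.buys for a in agents) / len(agents)
print(f"adoption share after 20 steps: {share:.2f}")
```

Varying the `marketing` input and re-running gives the kind of what-if comparison both abstracts describe, without committing to either project's actual model structure.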
Abstract:
In situ precipitation measurements can differ greatly in space and time. Taking into account the limited spatial–temporal representativity and the uncertainty of a single station is important for validating mesoscale numerical model results as well as for interpreting remote sensing data. In situ precipitation data from a high-resolution network in north-eastern Germany are analysed to determine their temporal and spatial representativity. For the dry year 2003, precipitation amounts were available at 10 min resolution from 14 rain gauges distributed over a 25 km × 25 km area around the Meteorological Observatory Lindenberg (Richard-Aßmann Observatory). Our analysis reveals that short-term (up to 6 h) precipitation events dominate (94% of all events) and that the distribution is skewed, with a high frequency of very low precipitation amounts. Long-lasting precipitation events are rare (6% of all events) but account for nearly 50% of the annual precipitation. The spatial representativity of a single-site measurement increases slightly for longer measurement intervals, and the variability decreases. Hourly precipitation amounts are representative for an area of 11 km × 11 km. Daily precipitation amounts appear to be reliable with an uncertainty factor of 3.3 for an area of 25 km × 25 km, and weekly and monthly precipitation amounts have uncertainty factors of 2 and 1.4, respectively, when compared to 25 km × 25 km mean values.
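An "uncertainty factor" in this sense is a multiplicative bound F such that a single-gauge value falls between mean/F and mean·F of the area mean on most days. A sketch on synthetic data, where the lognormal scatter parameters are hypothetical and not fitted to the Lindenberg network:

```python
import math
import random

rng = random.Random(7)

def uncertainty_factor(station, area_mean, coverage=0.95):
    """Multiplicative factor F such that the single-station value lies
    within [mean/F, mean*F] of the area mean for `coverage` of days."""
    log_ratios = sorted(abs(math.log(s / m))
                        for s, m in zip(station, area_mean)
                        if s > 0 and m > 0)
    k = min(int(coverage * len(log_ratios)), len(log_ratios) - 1)
    return math.exp(log_ratios[k])

# Synthetic daily totals (mm): area mean vs. one gauge, with lognormal
# scatter standing in for real spatial variability.
area = [rng.lognormvariate(1.0, 0.8) for _ in range(365)]
gauge = [m * rng.lognormvariate(0.0, 0.5) for m in area]
factor = uncertainty_factor(gauge, area)
print(f"daily uncertainty factor: {factor:.1f}")
```

Averaging the synthetic series to weekly or monthly totals before calling `uncertainty_factor` reproduces the qualitative result above: the factor shrinks as the accumulation interval grows.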
Abstract:
Epidemiological studies report confidence or uncertainty intervals around their estimates. Estimates of the burden of diseases and risk factors are subject to a broader range of uncertainty because of the combination of multiple data sources and value choices. Sensitivity analysis can be used to examine the effects of the social values that have been incorporated into the design of the disability-adjusted life year (DALY). Age weighting, under which a year of healthy life lived at one age is valued differently from a year lived at another age, is the most controversial value built into the DALY. The discount rate, which addresses the difference in value between current and future health benefits, has also been criticized. The distribution of the global disease burden and the rankings of various conditions are largely insensitive to alternative assumptions about the discount rate and age weighting. The main effect of discounting and age weighting is to enhance the importance of neuropsychiatric conditions and sexually transmitted infections. The Global Burden of Disease study has also been criticized for estimating mortality and disease burden for regions using incomplete and uncertain data. Including uncertain results, with uncertainty quantified to the extent possible, is nevertheless preferable to leaving blank cells in tables intended to provide policy makers with an overall assessment of the burden of disease: no estimate is generally interpreted as no problem. Greater investment in getting the descriptive epidemiology of diseases and injuries correct in poor countries will do vastly more to reduce uncertainty in disease burden assessments than a philosophical debate about the appropriateness of social values.
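The two contested value choices, the age weight C·a·e^(−βa) and a 3% discount rate, can be made concrete with a small numerical integration. The parameter values are the widely cited GBD-1990-style defaults; this is a sketch of the weighting idea, not the GBD implementation:

```python
import math

def weighted_years(onset, duration, r=0.03, K=1.0,
                   beta=0.04, C=0.1658, step=0.01):
    """Value of `duration` years lived from age `onset`, with a
    GBD-style age weight K*C*a*exp(-beta*a) + (1-K) and continuous
    discounting at rate r back to the age of onset."""
    total, a = 0.0, onset
    while a < onset + duration:
        weight = K * C * a * math.exp(-beta * a) + (1 - K)
        total += weight * math.exp(-r * (a - onset)) * step
        a += step
    return total

# With age weights on (K=1), a year at 25 counts more than one at 70;
# with K=0 and r=0, a year is just a year.
print(round(weighted_years(25, 1), 3), round(weighted_years(70, 1), 3))
print(round(weighted_years(25, 1, r=0.0, K=0.0), 3))
```

Running the same comparison with different `r` and `K` settings is exactly the kind of sensitivity analysis the abstract describes.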
Abstract:
The author investigates the impact of the financial crisis that began in 2008 on the forecasting error for earnings per share (EPS). There is plentiful evidence from the 1980s onward that analysts give systematically more favourable values in their EPS forecasts than are later realized, i.e. they are generally optimistic. Other investigations have shown that the EPS forecasting error grows under uncertain environmental circumstances, while there is also ample evidence that analysts under-react to negative information in their forecasts. The financial crisis brought a myriad of negative information for analysts to consider in such forecasts, while also increasing the level of uncertainty for the entire economy. The article therefore investigates the impact of the financial crisis on the EPS forecasting error, distinguishing the period when the crisis was merely negative news from the period when, as a consequence of the crisis, uncertainty had increased significantly across the entire economy.
Abstract:
In this study, the concentration probability distributions of 82 pharmaceutical compounds detected in the effluents of 179 European wastewater treatment plants were computed and inserted into a multimedia fate model. The comparative ecotoxicological impact of the direct emission of these compounds from wastewater treatment plants on freshwater ecosystems, based on a potentially affected fraction (PAF) of species approach, was assessed to rank compounds based on priority. As many pharmaceuticals are acids or bases, the multimedia fate model accounts for regressions to estimate pH-dependent fate parameters. An uncertainty analysis was performed by means of Monte Carlo analysis, which included the uncertainty of fate and ecotoxicity model input variables, as well as the spatial variability of landscape characteristics on the European continental scale. Several pharmaceutical compounds were identified as being of greatest concern, including 7 analgesics/anti-inflammatories, 3 β-blockers, 3 psychiatric drugs, and 1 each of 6 other therapeutic classes. The fate and impact modelling relied extensively on estimated data, given that most of these compounds have little or no experimental fate or ecotoxicity data available, as well as a limited reported occurrence in effluents. The contribution of estimated model input variables to the variance of freshwater ecotoxicity impact, as well as the lack of experimental abiotic degradation data for most compounds, helped in establishing priorities for further testing. Generally, the effluent concentration and the ecotoxicity effect factor were the model input variables with the most significant effect on the uncertainty of output results.
Abstract:
This paper extends the Nelson-Siegel linear factor model by developing a flexible macro-finance framework for modeling and forecasting the term structure of US interest rates. Our approach is robust to parameter uncertainty and structural change, as we consider instabilities in parameters and volatilities, and our model averaging method allows for investors' model uncertainty over time. Our time-varying parameter Nelson-Siegel Dynamic Model Averaging (NS-DMA) predicts yields better than standard benchmarks and successfully captures plausible time-varying term premia in real time. The proposed model has significant in-sample and out-of-sample predictability for excess bond returns, and the predictability is of economic value.
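The Nelson-Siegel factor structure underlying the model above has a simple closed form: a level, slope and curvature loading per maturity. A sketch of the static curve, where the λ value and β coefficients are illustrative (in the Diebold-Li spirit), not the paper's estimates:

```python
import math

def nelson_siegel(tau, beta1, beta2, beta3, lam=0.0609):
    """Nelson-Siegel yield at maturity tau (in months): beta1 is the
    level, beta2 loads on the slope term, beta3 on the curvature hump."""
    x = lam * tau
    slope = (1.0 - math.exp(-x)) / x
    curvature = slope - math.exp(-x)
    return beta1 + beta2 * slope + beta3 * curvature

# Level 5%, negative slope loading, mild curvature: an upward-sloping
# curve from 3 months out to 10 years.
curve = [nelson_siegel(t, 5.0, -2.0, 0.5) for t in (3, 12, 60, 120)]
print([round(y, 2) for y in curve])
```

The paper's contribution sits on top of this form: the betas become time-varying factors forecast under parameter and model uncertainty, with model averaging across specifications.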
Abstract:
Background: Alcohol is a major risk factor for burden of disease and injuries globally. This paper presents a systematic method to compute the 95% confidence intervals of alcohol-attributable fractions (AAFs) with exposure and risk relations stemming from different sources. Methods: The computation was based on previous work done on modelling drinking prevalence using the gamma distribution and the inherent properties of this distribution. The Monte Carlo approach was applied to derive the variance for each AAF by generating random sets of all the parameters. A large number of random samples were thus created for each AAF to estimate variances. The derivation of the distributions of the different parameters is presented, as well as sensitivity analyses which give an estimation of the number of samples required to determine the variance with predetermined precision, and which determine which parameter had the most impact on the variance of the AAFs. Results: The analysis of the five Asian regions showed that 150 000 samples gave a sufficiently accurate estimation of the 95% confidence intervals for each disease. The relative risk functions accounted for most of the variance in the majority of cases. Conclusions: Within reasonable computation time, the method yielded very accurate values for variances of AAFs.
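The approach above can be sketched end to end: gamma-distributed consumption among drinkers, a risk function applied to it, and an outer Monte Carlo loop propagating parameter uncertainty into an AAF interval. The prevalence, risk slope and the SD/mean ratio linking the gamma parameters are illustrative stand-ins for values this literature estimates from data:

```python
import math
import random

rng = random.Random(1)

def aaf(p_drinkers, mean_g, slope):
    """Alcohol-attributable fraction for a log-linear risk
    RR(x) = exp(slope * x) and gamma-distributed daily consumption
    (grams/day), with the gamma SD tied to its mean as in this
    literature (SD roughly 1.171 * mean)."""
    sd = 1.171 * mean_g
    k, theta = (mean_g / sd) ** 2, sd ** 2 / mean_g
    # E[RR - 1] over consumption, via Monte Carlo integration.
    excess = sum(math.exp(slope * rng.gammavariate(k, theta)) - 1
                 for _ in range(2000)) / 2000
    return p_drinkers * excess / (p_drinkers * excess + 1)

# Outer loop: propagate uncertainty in prevalence and risk slope
# (hypothetical means and spreads) into the AAF.
draws = sorted(aaf(rng.gauss(0.6, 0.05), 15.0, rng.gauss(0.02, 0.003))
               for _ in range(500))
print(f"AAF 95% CI: [{draws[12]:.3f}, {draws[487]:.3f}]")
```

The paper's sensitivity finding, that the relative risk function dominates the variance, corresponds here to the `slope` draw moving the AAF far more than the prevalence draw.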
Abstract:
The GH-2000 and GH-2004 projects have developed a method for detecting GH misuse based on measuring insulin-like growth factor-I (IGF-I) and the amino-terminal pro-peptide of type III collagen (P-III-NP). The objectives were to analyze more samples from elite athletes to improve the reliability of the decision limit estimates, to evaluate whether the existing decision limits needed revision, and to validate further non-radioisotopic assays for these markers. The study included 998 male and 931 female elite athletes. Blood samples were collected according to World Anti-Doping Agency (WADA) guidelines at various sporting events including the 2011 International Association of Athletics Federations (IAAF) World Athletics Championships in Daegu, South Korea. IGF-I was measured by the Immunotech A15729 IGF-I IRMA, the Immunodiagnostic Systems iSYS IGF-I assay and a recently developed mass spectrometry (LC-MS/MS) method. P-III-NP was measured by the Cisbio RIA-gnost P-III-P, Orion UniQ™ PIIINP RIA and Siemens ADVIA Centaur P-III-NP assays. The GH-2000 score decision limits were developed using existing statistical techniques. Decision limits were determined using a specificity of 99.99% and an allowance for uncertainty because of the finite sample size. The revised Immunotech IGF-I - Orion P-III-NP assay combination decision limit did not change significantly following the addition of the new samples. The new decision limits are applied to currently available non-radioisotopic assays to measure IGF-I and P-III-NP in elite athletes, which should allow wider flexibility to implement the GH-2000 marker test for GH misuse while providing some resilience against manufacturer withdrawal or change of assays. Copyright © 2015 John Wiley & Sons, Ltd.
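A "decision limit at 99.99% specificity with an allowance for finite sample size" can be sketched as a one-sided normal-theory tolerance bound. The synthetic scores and the Howe-style allowance below are illustrative assumptions, not the GH-2000 statistical procedure:

```python
import math
import random
import statistics

rng = random.Random(11)

# Synthetic "GH-2000-like" scores for 998 athletes (hypothetical:
# standard-normal scores stand in for the real marker combination).
scores = [rng.gauss(0.0, 1.0) for _ in range(998)]

def decision_limit(xs, z=3.719, gamma=1.645):
    """One-sided normal limit at 99.99% specificity (z = 3.719),
    inflated by an approximate tolerance-bound allowance so that
    estimation error in the mean and SD is unlikely to push the
    realized specificity below the target."""
    m, s, n = statistics.mean(xs), statistics.stdev(xs), len(xs)
    k = z + gamma * math.sqrt(1.0 / n + z * z / (2 * (n - 1)))
    return m + k * s

limit = decision_limit(scores)
naive = statistics.mean(scores) + 3.719 * statistics.stdev(scores)
print(f"decision limit = {limit:.2f} (naive mean + 3.719*sd = {naive:.2f})")
```

The gap between `limit` and `naive` is the finite-sample allowance; it shrinks as more athlete samples are added, which is why analysing more samples improves the reliability of the limits.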
Abstract:
The use of the Bayes factor (BF) or likelihood ratio as a metric to assess the probative value of forensic traces is largely supported by operational standards and recommendations in different forensic disciplines. However, progress towards more widespread consensus about foundational principles is still fragile, as it raises new problems on which views differ. It is not uncommon, for example, to encounter scientists who feel the need to compute the probability distribution of a given expression of evidential value (i.e. a BF), or to place intervals or significance probabilities on such a quantity. This article presents arguments to show that such views involve a misconception of principles and an abuse of language. The conclusion of the discussion is that, in a given case at hand, forensic scientists ought to offer a court of justice a single value for the BF, rather than an expression based on a distribution over a range of values.
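The article's point, that uncertainty about parameters is integrated out inside the BF, so a single number rather than an interval is reported, can be illustrated with a toy marginal-likelihood computation. The glass refractive-index numbers are invented for illustration:

```python
import math
import random

rng = random.Random(3)

def marginal_likelihood(evidence, param_samples, likelihood):
    """Average the likelihood over parameter uncertainty: this
    integration happens *inside* the BF, which is why the BF itself
    is a single number, not a distribution."""
    return sum(likelihood(evidence, p) for p in param_samples) / len(param_samples)

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Toy glass-fragment example (hypothetical numbers): a refractive-index
# measurement under Hp (same source, mean known with some uncertainty)
# vs. Hd (random source from a broad population).
e = 1.5185
hp_means = [rng.gauss(1.5184, 0.0002) for _ in range(5000)]
hd_means = [rng.gauss(1.5180, 0.0015) for _ in range(5000)]
bf = (marginal_likelihood(e, hp_means, lambda x, m: normal_pdf(x, m, 0.0001))
      / marginal_likelihood(e, hd_means, lambda x, m: normal_pdf(x, m, 0.0001)))
print(f"BF = {bf:.1f}")
```

Each uncertain quantity enters as a sample to be averaged over, never as a reason to quote "BF = a to b": the output is one value for the court.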
Abstract:
Traditionally, in the cigarette industry, the determination of ammonium ion in mainstream smoke is performed by ion chromatography. This work studies this determination and compares results obtained with external versus internal standard calibration. A reference cigarette sample showed measurement uncertainties of 2.0 μg/cigarette and 1.5 μg/cigarette with external and internal standardization, respectively. The greatest source of uncertainty is the bias correction factor, and it is even more significant when using the external standard, confirming the importance of internal standardization for this correction.
Abstract:
Abstract taken from the publication
Abstract:
Abstract taken from the publication