913 results for risk-based modeling


Relevance:

100.00%

Publisher:

Abstract:

It is well known that there is a dynamic relationship between cerebral blood flow (CBF) and cerebral blood volume (CBV). With increasing applications of functional MRI, where blood-oxygen-level-dependent signals are recorded, understanding and accurately modeling the hemodynamic relationship between CBF and CBV become increasingly important. This study presents an empirical, data-based modeling framework for model identification from experimental CBF and CBV data. It is shown that the relationship between changes in CBF and CBV can be described by a parsimonious autoregressive with exogenous input (ARX) model structure. Neither the ordinary least-squares (LS) method nor the classical total least-squares (TLS) method produces accurate estimates from the original noisy CBF and CBV data. A regularized total least-squares (RTLS) method is therefore introduced and extended to solve this errors-in-variables problem. Quantitative results show that RTLS works very well on the noisy CBF and CBV data. Finally, combining RTLS with a filtering method leads to a parsimonious yet very effective model of the relationship between changes in CBF and CBV.
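The ARX fit and the total-least-squares estimate can be sketched as follows. This is a minimal illustration, not the paper's RTLS algorithm: the regularization here is a simple penalty on the parameter block of the augmented matrix, and the system, lags and signals are invented.

```python
import numpy as np

def build_arx(u, y, na=1, nb=1):
    """Stack lagged outputs/inputs for an ARX model y[t] ~ y[t-1..t-na], u[t..t-nb+1]."""
    start = max(na, nb)
    A = np.column_stack(
        [y[start - i:len(y) - i] for i in range(1, na + 1)]
        + [u[start - j:len(u) - j] for j in range(nb)]
    )
    return A, y[start:]

def regularized_tls(A, b, lam=0.0):
    """TLS via the smallest eigenvector of [A|b]^T [A|b]; `lam` penalizes the
    parameter block (an illustrative stand-in for the paper's RTLS scheme)."""
    Z = np.column_stack([A, b])
    M = Z.T @ Z
    M[:-1, :-1] += lam * np.eye(A.shape[1])
    _, V = np.linalg.eigh(M)          # eigenvalues in ascending order
    v = V[:, 0]                       # eigenvector of the smallest eigenvalue
    return -v[:-1] / v[-1]

# Invented ground-truth dynamics: y[t] = 0.8*y[t-1] + 0.5*u[t]
rng = np.random.default_rng(0)
u = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * y[t - 1] + 0.5 * u[t]

A, b = build_arx(u, y)
theta = regularized_tls(A, b, lam=1e-3)   # close to [0.8, 0.5]; the data are noise-free
```

On noisy measurements of both u and y (the errors-in-variables setting), plain LS is biased, which is what motivates a TLS-type estimator in the first place.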

Relevance:

100.00%

Publisher:

Abstract:

Geoengineering by stratospheric aerosol injection has been proposed as a policy response to warming from human emissions of greenhouse gases, but it may produce unequal regional impacts. We present a simple, intuitive risk-based framework for classifying these impacts according to whether geoengineering increases or decreases the risk of substantial climate change, with further classification by the level of pre-existing risk from rising carbon dioxide concentrations. This framework is applied to two climate model simulations of geoengineering counterbalancing the surface warming produced by a quadrupling of carbon dioxide concentrations, one using a layer of sulphate aerosol in the lower stratosphere and the other a reduction in total solar irradiance. The solar dimming simulation shows less regional inequality of impacts than the aerosol geoengineering simulation. In the solar dimming simulation, 10% of the Earth’s surface area, containing 10% of its population and 11% of its gross domestic product, experiences greater risk of substantial precipitation changes under geoengineering than under enhanced carbon dioxide concentrations. In the aerosol geoengineering simulation the increased risk of substantial precipitation change is experienced by 42% of Earth’s surface area, containing 36% of its population and 60% of its gross domestic product.

Relevance:

100.00%

Publisher:

Abstract:

We report on the assembly of tumor necrosis factor receptor 1 (TNF-R1) prior to ligand activation and its ligand-induced reorganization at the cell membrane. We apply single-molecule localization microscopy to obtain quantitative information on receptor cluster sizes and copy numbers. Our data suggest a dimeric pre-assembly of TNF-R1, as well as receptor reorganization toward higher oligomeric states with stable populations comprising three to six TNF-R1. Our experimental results directly serve as input parameters for computational modeling of the ligand-receptor interaction. Simulations corroborate the experimental finding of higher-order oligomeric states. This work is a first demonstration of how quantitative super-resolution and advanced microscopy can be used for systems-biology approaches at the single-molecule and single-cell level.

Relevance:

100.00%

Publisher:

Abstract:

Probabilistic hydro-meteorological forecasts have been used more and more over recent decades to communicate forecast uncertainty. This uncertainty cuts both ways: it is both an added value and a challenge for the forecaster and for the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational use, partly because of the difficulty of transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection, called “How much are you prepared to pay for a forecast?”. The game was played at several workshops in 2015, attended by operational forecasters and academics working in hydrometeorology. Its aim was to better understand the role of probabilistic forecasts in decision-making processes and the value decision-makers perceive in them. Based on the participants’ willingness to pay for a forecast, the results of the game show that the value (or usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performance as decision-makers.
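The binary-decision difficulty mentioned here is often formalised with the classic cost-loss model (a standard idealisation, not necessarily the mechanics of the workshop game): pay a protection cost C up front, or risk a loss L that occurs with the forecast probability p, so protecting is worthwhile in expectation when p > C/L. All numbers below are invented.

```python
def should_protect(p: float, cost: float, loss: float) -> bool:
    """Protect iff the expected avoided loss p*loss exceeds the protection cost."""
    return p * loss > cost

def expected_expense(p: float, cost: float, loss: float) -> float:
    """Expected expense under the optimal binary decision for forecast probability p."""
    return min(cost, p * loss)

# A 30% flood probability with protection at 20 and potential loss of 100:
# expected avoided loss is 0.3 * 100 = 30 > 20, so protecting is the cheaper bet.
decision = should_protect(0.3, 20.0, 100.0)
```

The model also makes explicit why forecast quality matters: a user who distrusts p and substitutes their own probability will make systematically different (and possibly costlier) decisions.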

Relevance:

100.00%

Publisher:

Abstract:

The aim of this work is to explore competitive dynamics through agent-based simulation. Building on a growing number of studies in strategy and organization theory that use simulation methods, a computational model was developed to simulate competition among firms and to observe the relative efficiency of the theorized methods of searching for performance improvement. The study also explores possible explanations for the persistence of superior or inferior firm performance, associated with conditions of competitive advantage or disadvantage.
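A minimal sketch of this kind of agent-based competition model, with an invented payoff landscape and invented search rules: firms improve performance either by local search (small adjustments to their current strategy) or by distant search (random jumps), adopting only changes that improve their payoff.

```python
import math
import random

def payoff(x):
    """Rugged performance landscape over a strategy space [0, 1] (invented)."""
    return 0.6 * x + 0.4 * abs(math.sin(8 * math.pi * x))

def search(strategy, steps=300, seed=0):
    """One firm searching the landscape; returns its final performance."""
    rng = random.Random(seed)
    x = rng.random()
    for _ in range(steps):
        if strategy == "distant":
            cand = rng.random()                            # random jump
        else:
            cand = min(max(x + rng.gauss(0, 0.02), 0), 1)  # small local move
        if payoff(cand) > payoff(x):                       # adopt improvements only
            x = cand
    return payoff(x)

local = [search("local", seed=s) for s in range(20)]
distant = [search("distant", seed=s) for s in range(20)]
```

On a rugged landscape, local searchers can get stuck on inferior peaks while distant searchers keep sampling new regions, which is one mechanism behind persistent performance differences among otherwise identical firms.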

Relevance:

100.00%

Publisher:

Abstract:

Simulating large and complex systems, such as computing grids, is a difficult task. Current simulators, despite providing accurate results, are hard to use: they usually demand strong programming knowledge, which is not typical of today's users of grids and high-performance computing. This need for computing expertise prevents such users from simulating how the environment will respond to their applications, which may imply large losses of efficiency and waste precious computational resources. In this paper we introduce iSPD, the iconic Simulator of Parallel and Distributed Systems, in which grid models are produced through an iconic interface. We describe the simulator and its intermediate model languages. The results presented here provide insight into its ease of use and accuracy.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, the effects of uncertainty and of the expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-Based Design Optimization (RBDO) has emerged as an alternative that properly models the safety-under-uncertainty part of the problem. With RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure. However, results depend on the failure probabilities used as constraints in the analysis. Risk Optimization (RO) broadens the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure, as well as the costs associated with construction, operation and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems. The broader RO solution is found first, and its optimum results are used as constraints in DDO and RBDO. Results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations which respect these design constraints and reduce manufacturing costs, but increase total expected costs (including expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, this solution also reduces manufacturing costs but increases total expected costs. This happens when the costs associated with different failure modes are distinct. Hence, a general equivalence between the formulations cannot be established. Optimum structural design considering expected costs of failure cannot be controlled solely by safety-factor or failure-probability constraints, but depends on the actual structural configuration. (c) 2011 Elsevier Ltd. All rights reserved.
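The economy-versus-safety trade-off at the heart of risk optimization can be illustrated with a one-variable sketch: total expected cost = material cost + failure probability × cost of failure. All numbers below (load distribution, capacity model, unit costs) are invented for illustration, not taken from the paper's examples.

```python
import math

def failure_probability(area):
    """P(load > capacity) for a bar: load ~ N(100, 15), capacity = 2*area.
    Illustrative first-order normal model with invented numbers."""
    beta = (2.0 * area - 100.0) / 15.0           # reliability index
    return 0.5 * math.erfc(beta / math.sqrt(2))  # Phi(-beta)

def total_expected_cost(area, c_material=1.0, c_failure=1e5):
    """Construction cost plus expected monetary consequence of failure."""
    return c_material * area + failure_probability(area) * c_failure

# Risk optimization: sweep the design variable for the minimum expected total cost.
candidates = [50 + 0.1 * i for i in range(500)]   # areas 50.0 .. 99.9
best_area = min(candidates, key=total_expected_cost)
```

In these terms, DDO would pick the smallest area satisfying a fixed safety factor and RBDO the smallest area whose failure probability is below a target; the RO optimum instead sits where the marginal material cost equals the marginal reduction in expected failure cost, which is why the three formulations generally disagree.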

Relevance:

100.00%

Publisher:

Abstract:

The frequency of random loss-of-containment events from lines and equipment is generally estimated from the information available in specialised databases. These databases contain records of incidents that occurred in various types of installation, ranging from chemical to petrochemical plants. Some of them are also rather dated, since they refer to incidents that took place many years ago. As a result, the leak frequencies they provide are very conservative. To overcome this limitation and account for technical progress, the API Recommended Practice 581 guideline, published in 2000 and updated in 2008, introduced a criterion for deriving leak frequencies tailored to the specific plant, by means of corrective coefficients that account for the component's failure mechanism, the safety management system and the effectiveness of the inspection activity. The aim of this thesis is to trace the evolution of the approach to estimating leak frequencies from piping. It is organised as follows. Chapter 1 is introductory. Chapter 2 examines the leak frequencies available in general-purpose databases. Chapter 3 presents two approaches, one qualitative and one quantitative, for identifying the lines with the highest priority for inspection. Chapter 4 describes the API Recommended Practice 581 guideline. Chapter 5 applies the line-selection criteria of Chapter 3 to a case study and defines the characteristics of the inspection activity according to API Recommended Practice 581. Finally, Chapter 6 sets out the conclusions of the study.
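In the API RP 581 approach described above, the generic failure frequency from the databases is specialised to a given plant through multiplicative correction factors, roughly: adjusted frequency = generic frequency × damage factor × management systems factor. A minimal sketch with invented input values (consult the guideline itself for the actual factor tables):

```python
def adjusted_leak_frequency(generic_freq, damage_factor, management_factor):
    """API RP 581-style adjusted frequency: generic failure frequency scaled by
    a damage factor (equipment-specific degradation) and a management systems
    factor (quality of the plant's safety management)."""
    return generic_freq * damage_factor * management_factor

# Invented example: generic small-leak frequency 3e-5 /yr, damage factor 20
# for an aged line, management systems factor 0.8.
freq = adjusted_leak_frequency(3e-5, 20.0, 0.8)   # 4.8e-4 /yr
```

The damage factor pushes the frequency up for degraded components while an effective inspection programme and management system pull it back down, which is how the method moves beyond the conservative generic database values.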

Relevance:

100.00%

Publisher:

Abstract:

Traditionally, the routine artificial digestion test is applied to assess the presence of Trichinella larvae in pigs. However, this diagnostic method has a low sensitivity compared with serological tests. The results of artificial digestion tests in Switzerland were evaluated over a period of 15 years to determine by when freedom from infection could be confirmed on the basis of these data. Freedom was defined as a 95% probability that the prevalence of infection was below 0.0001%. Freedom was demonstrated after 12 years at the latest. A new risk-based surveillance approach, based on serology, was then developed. Risk-based surveillance was also assessed over 15 years, starting in 2010. It was shown that with this design the sample size could be reduced by at least a factor of 4 compared with the traditional testing regimen, without lowering the level of confidence in the Trichinella-free status of the pig population.
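The "probability of freedom" criterion can be sketched with the standard survey formula: if all n tested animals are negative, the probability that at least one would have tested positive had the true prevalence equalled the design prevalence p (with test sensitivity Se) is 1 − (1 − p·Se)^n. The annual sample count below is invented; only the 95% confidence and 0.0001% prevalence thresholds come from the abstract.

```python
def confidence_of_freedom(n_negative, design_prevalence, sensitivity=1.0):
    """Confidence that prevalence < design_prevalence after n negative tests."""
    return 1.0 - (1.0 - design_prevalence * sensitivity) ** n_negative

# Hypothetical 250,000 digestion tests per year, all negative, accumulated
# until the 95% confidence threshold at a design prevalence of 1e-6 is met.
years_needed = next(
    y for y in range(1, 30)
    if confidence_of_freedom(250_000 * y, 1e-6) >= 0.95
)
```

Roughly 3 million cumulative negatives are needed at this design prevalence (n ≈ ln(0.05)/ln(1 − 10⁻⁶)), which is why demonstrating freedom at such low prevalences takes many years of routine testing.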

Relevance:

100.00%

Publisher:

Abstract:

In all European Union countries, chemical residues are required to be routinely monitored in meat. Good farming and veterinary practice can prevent the contamination of meat with pharmaceutical substances, resulting in a low detection of drug residues through random sampling. An alternative approach is to target-monitor farms suspected of treating their animals with antimicrobials. The objective of this project was to assess, using a stochastic model, the efficiency of these two sampling strategies. The model integrated data on Swiss livestock as well as expert opinion and results from studies conducted in Switzerland. Risk-based sampling showed an increase in detection efficiency of up to 100% depending on the prevalence of contaminated herds. Sensitivity analysis of this model showed the importance of the accuracy of prior assumptions for conducting risk-based sampling. The resources gained by changing from random to risk-based sampling should be transferred to improving the quality of prior information.
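The random-versus-targeted comparison can be sketched with a toy stochastic model; the group prevalences and sampling weights below are invented, not the Swiss data and expert opinion that fed the study's model.

```python
import random

def detections(n_samples, prevalence_by_group, sampling_weights, seed=42):
    """Sample farms from risk groups with the given weights and count how many
    sampled farms carry residues (illustrative stochastic model)."""
    rng = random.Random(seed)
    groups = list(prevalence_by_group)
    weights = [sampling_weights[g] for g in groups]
    hits = 0
    for _ in range(n_samples):
        g = rng.choices(groups, weights=weights)[0]
        if rng.random() < prevalence_by_group[g]:
            hits += 1
    return hits

prev = {"low_risk": 0.001, "high_risk": 0.02}
# Random sampling mirrors the population (mostly low-risk farms);
# risk-based sampling concentrates on the suspected high-risk group.
random_hits = detections(5000, prev, {"low_risk": 0.9, "high_risk": 0.1})
targeted_hits = detections(5000, prev, {"low_risk": 0.2, "high_risk": 0.8})
```

The gain from targeting hinges on how accurate the assumed group prevalences are, which matches the abstract's conclusion that resources saved by risk-based sampling should go into improving the quality of the prior information.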

Relevance:

100.00%

Publisher:

Abstract:

Swiss aquaculture farms were assessed according to their risk of acquiring or spreading viral haemorrhagic septicaemia (VHS) and infectious haematopoietic necrosis (IHN). Risk factors for the introduction and spread of VHS and IHN were defined and assessed using published data and expert opinions. Among the 357 aquaculture farms identified in Switzerland, 49.3% were categorised as high risk, 49.0% as medium risk and 1.7% as low risk. According to the new Directive 2006/88/EC for aquaculture of the European Union, the frequency of farm inspections must be derived from their risk levels. A sensitivity analysis showed that water supply and fish movements were highly influential on the output of the risk assessment regarding the introduction of VHS and IHN. Fish movements were also highly influential on the risk assessment output regarding the spread of these diseases.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: This study focused on the descriptive analysis of cattle movements and of farm-level parameters derived from cattle movements, which are considered generically suitable for risk-based surveillance systems in Switzerland for diseases in which animal movements constitute an important risk pathway. METHODS: A framework was developed to select farms for surveillance based on a risk score summarizing 5 parameters. The proposed framework was validated using data from the 2013 bovine viral diarrhoea (BVD) surveillance programme. RESULTS: A cumulative score was calculated per farm from the following parameters: the maximum monthly ingoing contact chain (in 2012), the average number of animals per incoming movement, the use of mixed alpine pastures and the number of weeks in 2012 in which a farm had movements registered. The final score for a farm depended on the distribution of the parameters. Different cut-offs (50, 90, 95 and 99%) were explored. The final scores ranged between 0 and 5. Validation of the scores against results from the 2013 BVD surveillance programme gave promising results when the cut-off for each of the five selected farm-level criteria was set at the 50th percentile. Restricting testing to farms with a score ≥ 2 would have resulted in the same number of detected BVD-positive farms as testing all farms; i.e., the outcome of the 2013 surveillance programme could have been reached with a smaller survey. CONCLUSIONS: The seasonality and time dependency of the activity of single farms in the networks require a careful assessment of the time period used to determine the farm-level criteria. However, the selection of farms for risk-based surveillance can be optimized with the proposed scoring system, which was validated using data from the BVD eradication programme. The proposed method is a promising framework for selecting farms according to their risk of infection based on animal movements.
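The scoring scheme can be sketched as follows, with three invented parameters instead of the study's five and invented farm data: each farm scores one point per parameter strictly above that parameter's 50th-percentile cut-off across all farms, and farms scoring ≥ 2 are selected for testing.

```python
from statistics import median

def risk_scores(farms):
    """Per-farm score: number of movement parameters exceeding the 50th
    percentile of that parameter over all farms (simplified illustration)."""
    params = next(iter(farms.values())).keys()
    cutoff = {p: median(f[p] for f in farms.values()) for p in params}
    return {name: sum(f[p] > cutoff[p] for p in params)
            for name, f in farms.items()}

# Invented farm data with three movement-derived parameters.
farms = {
    "A": {"ingoing_chain": 40, "animals_per_move": 8, "active_weeks": 30},
    "B": {"ingoing_chain": 5,  "animals_per_move": 2, "active_weeks": 4},
    "C": {"ingoing_chain": 25, "animals_per_move": 6, "active_weeks": 20},
}
scores = risk_scores(farms)
to_test = [name for name, s in scores.items() if s >= 2]   # score >= 2 rule
```

Raising the percentile cut-off (to 90, 95 or 99%) shrinks the selected sample at the cost of sensitivity, which is the trade-off the study explored when validating against the BVD programme.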

Relevance:

100.00%

Publisher:

Abstract:

Even the best school health education programs will be unsuccessful if they are not disseminated effectively in a manner that encourages classroom adoption and implementation. This study involved two components: (1) the development of a videotape intervention to be used in the dissemination phase of a 4-year, NCI-funded diffusion study and (2) the evaluation of that videotape intervention strategy in comparison with a print (information transfer) strategy. Conceptualization was guided by Social Learning Theory, Diffusion Theory, and communication theory; additionally, the PRECEDE Framework was used. Seventh- and 8th-grade classroom teachers from Spring Branch Independent School District in west Houston participated in the evaluation of the videotape and print interventions using a 57-item preadoption survey instrument developed by the UT Center for Health Promotion Research and Development. Two-way ANOVA was used to study individual score differences for five outcome variables: Total Scale Score (comprising 57 predisposing, enabling, and reinforcing items), Adoption Characteristics Subscale, Attitude Toward Innovation Subscale, Receptivity Toward Innovation Subscale, and Reinforcement Subscale. The aim of the study was to compare the effect on score differences of the video and print interventions alone and in combination. Seventy-three 7th- and 8th-grade classroom teachers completed the study, providing baseline and post-intervention measures on factors related to the adoption and implementation of tobacco-use prevention programs. Two-way ANOVA found significant scoring differences for those exposed to the videotape intervention alone on both the Attitude Toward Innovation Subscale and the Receptivity to Adopt Subscale. No significant results were found to suggest that print alone produces favorable scoring differences between baseline and post-intervention testing. One interaction effect was found, suggesting that video and print combined are more effective in producing favorable scoring differences on the Reinforcement Subscale. This research is unique in that it represents a newly emerging field in health promotion communications research, with implications for Social Learning Theory, Diffusion Theory, and communication science that are applicable to the development of improved school health interventions.