913 results for risk-based modeling
Abstract:
The water stored in and flowing through the subsurface is fundamental for sustaining human activities and needs, feeding water and its constituents to surface water bodies and supporting the functioning of their ecosystems. Quantifying the changes that affect subsurface water is crucial for understanding its dynamics and the changes driven by climate change and other changes in the landscape, such as changes in land use and water use. It is inherently difficult to directly measure soil moisture and groundwater levels over large spatial scales and long time periods. Models are therefore needed to capture soil moisture and groundwater level dynamics over such large spatiotemporal scales. This thesis develops a modeling framework that allows for long-term catchment-scale screening of soil moisture and groundwater level changes. The novelty of this development resides in an explicit link drawn between catchment-scale hydroclimatic and soil hydraulic conditions, using observed runoff data as an approximation of soil water flux and accounting for the effects of snow storage-melting dynamics on that flux. Both past and future relative changes can be assessed with this modeling framework, with future change projections based on common climate model outputs. By direct model-observation comparison, the thesis shows that the developed modeling framework can reproduce the temporal variability of large-scale changes in soil water storage, as obtained from the GRACE satellite product, for most of 25 large study catchments around the world. Compared also with locally measured soil water content and groundwater levels in 10 U.S. catchments, the modeling approach reproduces reasonably well the relative seasonal fluctuations around long-term average values. The developed modeling framework is further used to project soil moisture changes due to expected future climate change for 81 catchments around the world.
The future soil moisture changes depend on the radiative forcing scenario (RCP) considered but are overall large for the occurrence frequency of dry and wet events and for the inter-annual variability of seasonal soil moisture. These changes tend to be greater for dry events and the dry season, respectively, than for the corresponding wet quantities, indicating increased drought risk for some parts of the world.
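The core water-balance idea in this abstract (observed runoff as an approximation of soil water flux, with snow storage-melting dynamics) can be sketched as follows. This is a minimal illustrative bucket model, not the thesis framework itself; the degree-day factor, melt threshold, and input values are all assumptions made for the example.

```python
def storage_change(precip, temp, runoff, t_melt=0.0, ddf=3.0):
    """Daily soil water storage change from a simple water balance in which
    observed runoff approximates the soil water flux out of the catchment.
    precip and runoff in mm/day, temp in deg C; t_melt and ddf (degree-day
    melt factor, mm/deg C/day) are illustrative parameter values."""
    snow = 0.0
    storage = 0.0
    series = []
    for p, t, q in zip(precip, temp, runoff):
        if t <= t_melt:            # precipitation accumulates as snow
            snow += p
            influx = 0.0
        else:                      # rain plus degree-day snowmelt
            melt = min(snow, ddf * (t - t_melt))
            snow -= melt
            influx = p + melt
        storage += influx - q      # water-balance update
        series.append(storage)
    return series

# Example: a cold day stores precipitation as snow; a warm day releases it.
s = storage_change(precip=[5.0, 0.0], temp=[-2.0, 3.0], runoff=[0.0, 1.0])
```

On the cold day nothing reaches the soil store; on the warm day the 5 mm of snow melts and 1 mm leaves as runoff, so storage rises to 4 mm.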
Abstract:
Not all the relevant risk factors contributing to breast cancer etiology are fully known. Exposure to organochlorine pesticides has been linked to an increased incidence of the disease, although not all data have been consistent. Most published studies evaluated exposure to organochlorines individually, ignoring the potential effects exerted by mixtures of chemicals.
Abstract:
Despite its huge potential in risk analysis, the Dempster–Shafer Theory of Evidence (DST) has not received enough attention in construction management. This paper presents a DST-based approach for structuring personal experience and professional judgment when assessing construction project risk. DST was innovatively used to tackle the problem of insufficient information by enabling analysts to provide incomplete assessments. Risk cost is used as a common scale for measuring risk impact on the various project objectives, and the Evidential Reasoning algorithm is suggested as a novel alternative for aggregating individual assessments. A spreadsheet-based decision support system (DSS) was devised to facilitate the proposed approach. Four case studies were conducted to examine the approach's viability. Senior managers in four British construction companies tried the DSS and gave very promising feedback. The paper concludes that the proposed methodology may contribute to bridging the gap between the theory and practice of construction risk assessment.
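The DST mechanics that make incomplete assessments possible can be sketched with Dempster's classical rule of combination, in which mass can be assigned to sets of outcomes (e.g. "Low or High impact") rather than single values. This is a minimal illustration of DST itself, not the Evidential Reasoning algorithm the paper proposes as the aggregator, and the example masses are invented.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic mass assignments (dicts mapping frozenset -> mass)
    with Dempster's rule: multiply masses of intersecting focal elements
    and renormalize by the total non-conflicting mass."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("Totally conflicting evidence cannot be combined")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

# Two assessors judge a risk's impact: Low ("L"), High ("H"), or undecided.
# The first gives an incomplete assessment, leaving 0.4 on the whole frame.
m1 = {frozenset({"L"}): 0.6, frozenset({"L", "H"}): 0.4}
m2 = {frozenset({"H"}): 0.3, frozenset({"L"}): 0.5, frozenset({"L", "H"}): 0.2}
combined = dempster_combine(m1, m2)
```

Here the combined belief concentrates on "Low" while retaining some mass on the undecided set, which is exactly how DST represents a lack of sufficient information.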
Abstract:
Structural Health Monitoring (SHM) is an emerging area of research associated with improving the maintainability and safety of aerospace, civil and mechanical infrastructures by means of monitoring and damage detection. Guided wave structural testing is an approach for health monitoring of plate-like structures using smart-material piezoelectric transducers. Among many kinds of transducers, those with a beam steering feature can perform more accurate surface interrogation. Frequency-steerable acoustic transducers (FSATs) are capable of beam steering by varying the input frequency and consequently can detect and localize damage in structures. Guided wave inspection is typically performed through phased arrays, which feature a large number of piezoelectric transducers, with attendant complexity and limitations. To overcome the weight penalty, complex circuitry and maintenance concerns associated with wiring a large number of transducers, new FSATs are proposed that present inherent directional capabilities when generating and sensing elastic waves. The first generation of spiral FSAT has two main limitations: first, waves are excited or sensed in one direction and in the opposite one (180° ambiguity); second, only a relatively crude approximation of the desired directivity has been attained. A second generation of spiral FSAT is proposed to overcome the first generation's limitations. The importance of simulation tools becomes greater when a new idea is proposed and starts to be developed. The shaped transducer concept, especially the second generation of spiral FSAT, is a novel idea in guided-wave-based Structural Health Monitoring systems, hence a simulation tool is necessary to develop various design aspects of this innovative transducer. In this work, numerical simulation of the 1st and 2nd generations of spiral FSAT has been conducted to prove the directional capability of excited guided waves through a plate-like structure.
Abstract:
Thesis (Master's)--University of Washington, 2016-08
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Emerging infectious diseases are a growing concern in wildlife conservation. Documenting outbreak patterns and determining the ecological drivers of transmission risk are fundamental to predicting disease spread and assessing potential impacts on population viability. However, evaluating disease in wildlife populations requires expansive surveillance networks that often do not exist in remote and developing areas. Here, we describe the results of a community-based research initiative conducted in collaboration with indigenous harvesters, the Inuit, in response to a new series of Avian Cholera outbreaks affecting Common Eiders (Somateria mollissima) and other commingling species in the Canadian Arctic. Avian Cholera is a virulent disease of birds caused by the bacterium Pasteurella multocida. Common Eiders are a valuable subsistence resource for Inuit, who hunt the birds for meat and visit breeding colonies during the summer to collect eggs and feather down for use in clothing and blankets. We compiled the observations of harvesters about the growing epidemic and with their assistance undertook field investigation of 131 colonies distributed over >1200 km of coastline in the affected region. Thirteen locations were identified where Avian Cholera outbreaks have occurred since 2004. Mortality rates ranged from 1% to 43% of the local breeding population at these locations. Using a species-habitat model (Maxent), we determined that the distribution of outbreak events has not been random within the study area and that colony size, vegetation cover, and a measure of host crowding in shared wetlands were significantly correlated with outbreak risk. In addition, outbreak locations have been spatially structured with respect to hypothesized introduction foci and clustered along the migration corridor linking Arctic breeding areas with wintering areas in Atlantic Canada.
At present, Avian Cholera remains a localized threat to Common Eider populations in the Arctic; however, expanded community-based surveillance will be required to track disease spread.
Abstract:
Images acquired from unmanned aerial vehicles (UAVs) can provide data with unprecedented spatial and temporal resolution for three-dimensional (3D) modeling. Solutions developed for this purpose mainly operate on photogrammetry concepts, namely UAV-Photogrammetry Systems (UAV-PSs). Such systems are used in applications where both geospatial and visual information of the environment is required. These applications include, but are not limited to, natural resource management such as precision agriculture, military and police-related services such as traffic-law enforcement, precision engineering such as infrastructure inspection, and health services such as epidemic emergency management. UAV-photogrammetry systems can be differentiated based on their spatial characteristics in terms of accuracy and resolution. That is, some applications, such as precision engineering, require high-resolution and high-accuracy information about the environment (e.g. 3D modeling with less than one centimeter of accuracy and resolution). In other applications, lower levels of accuracy might be sufficient (e.g. wildlife management needing only a few decimeters of resolution). However, even in those applications, the specific characteristics of UAV-PSs should be well considered in both system development and application in order to yield satisfying results. In this regard, this thesis presents a comprehensive review of the applications of unmanned aerial imagery, where the objective was to determine the challenges that remote-sensing applications of UAV systems currently face. This review also made it possible to recognize the specific characteristics and requirements of UAV-PSs, which are mostly ignored or not thoroughly assessed in recent studies. Accordingly, the focus of the first part of this thesis is on exploring the methodological and experimental aspects of implementing a UAV-PS.
The developed system was extensively evaluated for precise modeling of an open-pit gravel mine and performing volumetric-change measurements. This application was selected for two main reasons. Firstly, this case study provided a challenging environment for 3D modeling, in terms of scale changes, terrain relief variations as well as structure and texture diversities. Secondly, open-pit-mine monitoring demands high levels of accuracy, which justifies our efforts to improve the developed UAV-PS to its maximum capacities. The hardware of the system consisted of an electric-powered helicopter, a high-resolution digital camera, and an inertial navigation system. The software of the system included the in-house programs specifically designed for camera calibration, platform calibration, system integration, onboard data acquisition, flight planning and ground control point (GCP) detection. The detailed features of the system are discussed in the thesis, and solutions are proposed in order to enhance the system and its photogrammetric outputs. The accuracy of the results was evaluated under various mapping conditions, including direct georeferencing and indirect georeferencing with different numbers, distributions and types of ground control points. Additionally, the effects of imaging configuration and network stability on modeling accuracy were assessed. The second part of this thesis concentrates on improving the techniques of sparse and dense reconstruction. The proposed solutions are alternatives to traditional aerial photogrammetry techniques, properly adapted to specific characteristics of unmanned, low-altitude imagery. Firstly, a method was developed for robust sparse matching and epipolar-geometry estimation. The main achievement of this method was its capacity to handle a very high percentage of outliers (errors among corresponding points) with remarkable computational efficiency (compared to the state-of-the-art techniques). 
Secondly, a block bundle adjustment (BBA) strategy was proposed based on the integration of intrinsic camera calibration parameters as pseudo-observations into the Gauss-Helmert model. The principal advantage of this strategy was controlling the adverse effect of unstable imaging networks and noisy image observations on the accuracy of self-calibration. A sparse implementation of this strategy was also performed, which allowed its application to data sets containing many tie points. Finally, the concept of intrinsic curves was revisited for dense stereo matching. The proposed technique could achieve a high level of accuracy and efficiency by searching only through a small fraction of the whole disparity search space as well as internally handling occlusions and matching ambiguities. These photogrammetric solutions were extensively tested using synthetic data, close-range images and the images acquired from the gravel-pit mine. Achieving an absolute 3D mapping accuracy of 11±7 mm illustrates the success of this system for high-precision modeling of the environment.
Abstract:
This dissertation contains four essays that all share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling and forecasting of financial asset volatility and correlations. The first two chapters provide useful tools for univariate applications while the last two chapters develop multivariate methodologies. In chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on the basis of a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains from the FloGARCH models in terms of in-sample fit, out-of-sample fit and forecasting accuracy compared to classical and Realized GARCH models. In chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model allowing for a time-varying intercept and implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set. Chapter 3 is a joint work with David Veredas.
We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite-activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite-sample properties are studied under four data-generating processes, in the presence or absence of microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient, and easy alternative for measuring integrated covariances on the basis of noisy and asynchronous prices. Along these lines, a minimum-variance portfolio application shows the superiority of this disentangled realized estimator in terms of numerous performance metrics. Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose to use the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities, which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context and compare it to models used in the industry. We demonstrate significant economic gains in a realistic setting including short-selling constraints and transaction costs.
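The baseline object these chapters refine is the realized covariance matrix computed from synchronized intraday returns. The sketch below shows only that plain estimator, not the disentangled, pre-averaged, or rank-based variants studied in the dissertation; the simulated return series is an assumption for the example.

```python
import numpy as np

def realized_covariance(returns):
    """Plain realized covariance: the sum of outer products of
    synchronized intraday log-return vectors. `returns` has shape
    (n_intervals, n_assets); the result is (n_assets, n_assets)."""
    r = np.asarray(returns, dtype=float)
    return r.T @ r

# Simulated 5-minute log-returns for two assets over one trading day
# (78 intervals in a 6.5-hour session); purely illustrative data.
rng = np.random.default_rng(42)
r = rng.normal(0.0, 0.001, size=(78, 2))
rcov = realized_covariance(r)
```

In practice this estimator is biased by microstructure noise and asynchronous trading, which is precisely what motivates the noise-robust and synchronization-aware alternatives the chapter compares.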
Abstract:
Authentication plays an important role in how we interact with computers, mobile devices, the web, etc. The idea of authentication is to uniquely identify a user before granting access to system privileges. For example, in recent years more corporate information and applications have been accessible via the Internet and Intranet. Many employees are working from remote locations and need access to secure corporate files. During this time, it is possible for malicious or unauthorized users to gain access to the system. For this reason, it is logical to have some mechanism in place to detect whether the logged-in user is the same user in control of the user's session. Therefore, highly secure authentication methods must be used. We posit that each of us is unique in our use of computer systems. It is this uniqueness that is leveraged to "continuously authenticate users" while they use web software. To monitor user behavior, n-gram models are used to capture user interactions with web-based software. This statistical language model essentially captures sequences and sub-sequences of user actions, their orderings, and temporal relationships that make them unique by providing a model of how each user typically behaves. Users are then continuously monitored during software operations. Large deviations from "normal behavior" can possibly indicate malicious or unintended behavior. This approach is implemented in a system called Intruder Detector (ID) that models user actions as embodied in web logs generated in response to a user's actions. User identification through web logs is cost-effective and non-intrusive. We perform experiments on a large fielded system with web logs of approximately 4000 users. For these experiments, we use two classification techniques: binary and multi-class classification. We evaluate model-specific differences of user behavior based on coarse-grain (i.e., role) and fine-grain (i.e., individual) analysis.
A specific set of metrics is used to provide valuable insight into how each model performs. Intruder Detector achieves accurate results when identifying legitimate users and user types. It is also able to detect outliers in role-based user behavior with optimal performance. In addition to web applications, this continuous-monitoring technique can be used with other user-based systems such as mobile devices and the analysis of network traffic.
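The n-gram idea behind this kind of continuous authentication can be sketched in a few lines: build a profile of action n-grams from a user's web-log history, then score a new session by how many of its n-grams the profile has never seen. This is a minimal toy version; the action names are invented and Intruder Detector's actual models and metrics are more elaborate.

```python
from collections import Counter

def ngrams(actions, n=2):
    """All length-n sliding windows over a sequence of user actions."""
    return [tuple(actions[i:i + n]) for i in range(len(actions) - n + 1)]

def build_profile(history, n=2):
    """Count the n-grams observed in a user's historical action logs."""
    return Counter(ngrams(history, n))

def deviation(session, profile, n=2):
    """Fraction of a session's n-grams unseen in the profile (0 = fully
    familiar behavior, 1 = entirely novel behavior)."""
    grams = ngrams(session, n)
    if not grams:
        return 0.0
    unseen = sum(1 for g in grams if g not in profile)
    return unseen / len(grams)

# Hypothetical web-log actions for one user.
profile = build_profile(["login", "search", "view", "search", "view", "logout"])
normal = deviation(["login", "search", "view", "logout"], profile)   # familiar
odd = deviation(["login", "export", "export", "delete"], profile)    # novel
```

A session whose deviation exceeds a tuned threshold would be flagged for re-authentication; in practice one would use longer n-grams, smoothing, and per-role as well as per-user profiles.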
Abstract:
Current practices in agricultural management involve the application of rules and techniques to ensure high-quality and environmentally friendly production. Based on their experience, agricultural technicians and farmers make critical decisions affecting crop growth while considering several interwoven agricultural, technological, environmental, legal and economic factors. In this context, decision support systems, and the knowledge models that support them, enable the incorporation of valuable experience into software systems, providing support for agricultural technicians to make rapid and effective decisions for efficient crop growth. Pest control is an important issue in agricultural management, due to the crop yield reductions caused by pests, and it involves expert knowledge. This paper presents a formalisation of the pest control problem and of the workflow followed by agricultural technicians and farmers in integrated pest management, the crop production strategy that combines different practices for growing healthy crops whilst minimising pesticide use. A generic decision schema for estimating the infestation risk of a given pest on a given crop is defined; it acts as a metamodel for the maintenance and extension of the knowledge embedded in a pest management decision support system, which is also presented. This software tool has been implemented by integrating a rule-based tool into a web-based architecture. Evaluation from validity and usability perspectives concluded that both agricultural technicians and farmers considered it a useful tool in pest control, particularly for training new technicians and inexperienced farmers.
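A rule-based infestation-risk schema of the kind described can be sketched as a small scoring function. Everything here is hypothetical: the rules, thresholds, and inputs are invented for illustration and carry no agronomic authority; a real decision support system would encode expert-validated rules per pest and crop.

```python
def infestation_risk(temp_c, humidity_pct, crop_stage, pest_detected):
    """Toy rule-based estimate of infestation risk for a given pest on a
    given crop. All thresholds are illustrative assumptions, not advice."""
    score = 0
    if 20 <= temp_c <= 30:                      # pest-favorable temperature
        score += 1
    if humidity_pct > 70:                       # pest-favorable humidity
        score += 1
    if crop_stage in ("flowering", "fruiting"): # vulnerable crop stage
        score += 1
    if pest_detected:                           # scouting found the pest
        score += 2
    return "high" if score >= 4 else "medium" if score >= 2 else "low"

# Hypothetical field readings.
risk_now = infestation_risk(25, 80, "flowering", True)
risk_off = infestation_risk(10, 40, "seedling", False)
```

Keeping the rules in a declarative schema like this is what lets technicians maintain and extend the knowledge base without touching the surrounding software.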
Abstract:
Historically, the health risk of mycotoxins has been evaluated on the basis of single-chemical and single-exposure-pathway scenarios. However, the co-contamination of foodstuffs with these compounds is being reported at an increasing rate, and a multiple-exposure scenario for humans and vulnerable population groups such as children is urgently needed. Cereals are among the first solid foods eaten by children and thus constitute an important food group of their diet. Few data are available on early-stage childhood exposure to mycotoxins through consumption of cereal-based foods. The present study aims to perform a cumulative risk assessment of mycotoxins present in a set of cereal-based foods, including breakfast cereals (BC), processed cereal-based foods (PCBF) and biscuits (BT), consumed by children (1 to 3 years old, n=75) from the Lisbon region, Portugal. Children's food consumption and the occurrence of 12 mycotoxins (aflatoxins, ochratoxin A, fumonisins and trichothecenes) in cereal-based foods were combined to estimate the mycotoxin daily intake, using deterministic and probabilistic approaches. Different strategies were used to treat the left-censored data. For aflatoxins, as carcinogenic compounds, the margin of exposure (MoE) was calculated as the ratio of the BMDL (benchmark dose lower confidence limit) to the daily aflatoxin exposure. For the remaining mycotoxins, the exposure output was compared to tolerable daily intake (TDI) reference values in order to calculate hazard quotients (HQ, the ratio between exposure and a reference dose). The concentration addition (CA) concept was used for the cumulative risk assessment of multiple mycotoxins. The combined margin of exposure (MoET) and the hazard index (HI) were calculated for aflatoxins and the remaining mycotoxins, respectively. The main results revealed a significant health concern related to aflatoxins, and especially aflatoxin M1 exposure, according to the MoET and MoE values (below 10000).
HQ and HI values for the remaining mycotoxins were below 1, revealing low concern from a public health point of view. These are the first results on cumulative risk assessment of multiple mycotoxins present in cereal-based foods consumed by children. Considering the present results, more research studies are needed to provide governmental regulatory bodies with data to develop an approach that contemplates human exposure, and particularly children's exposure, to multiple mycotoxins in food. The last issue is particularly important considering the potential synergistic effects that could occur between mycotoxins and their potential impact on human health and, especially, children's health.
Abstract:
People, animals and the environment can be exposed to multiple chemicals at once from a variety of sources, but current risk assessment is usually carried out on one chemical substance at a time. In human health risk assessment, ingestion of food is considered a major route of exposure to many contaminants, namely mycotoxins, a wide group of fungal secondary metabolites known to potentially cause toxic and carcinogenic outcomes. Mycotoxins are commonly found in a variety of foods, including those intended for consumption by infants and young children, and have been found in processed cereal-based foods available in the Portuguese market. The use of mathematical models, including probabilistic approaches using Monte Carlo simulations, constitutes a prominent issue in human health risk assessment in general and in mycotoxin exposure assessment in particular. The present study aims to characterize, for the first time, the risk associated with the exposure of Portuguese children to single and multiple mycotoxins present in processed cereal-based foods (CBF). Food consumption data for Portuguese children (0-3 years old, n=103) were collected using a 3-day food diary. Contamination data concerned the quantification of 12 mycotoxins (aflatoxins, ochratoxin A, fumonisins and trichothecenes) in 20 CBF samples marketed in 2014 and 2015 in Lisbon; samples were analyzed by HPLC-FLD, LC-MS/MS and GC-MS. Daily exposure of children to mycotoxins was estimated using deterministic and probabilistic approaches. Different strategies were used to treat the left-censored data. For aflatoxins, as carcinogenic compounds, the margin of exposure (MoE) was calculated as the ratio of the BMDL (benchmark dose lower confidence limit) to the aflatoxin exposure. The magnitude of the MoE gives an indication of the risk level.
For the remaining mycotoxins, the exposure output was compared to tolerable daily intake (TDI) reference values in order to calculate hazard quotients (HQ, the ratio between exposure and a reference dose). For the cumulative risk assessment of multiple mycotoxins, the concentration addition (CA) concept was used. The combined margin of exposure (MoET) and the hazard index (HI) were calculated for aflatoxins and the remaining mycotoxins, respectively. Of the analyzed CBF samples, 71% were contaminated with mycotoxins (at values below the legal limits), and approximately 56% of the studied children consumed CBF at least once during the 3 diary days. Preliminary results showed that children's exposure to single mycotoxins present in CBF was below the TDI. Aflatoxin MoE and MoET values revealed a reduced potential risk from exposure through consumption of CBF (with values around 10000 or more). HQ and HI values for the remaining mycotoxins were below 1. Children are a particularly vulnerable population group to food contaminants, and the present results point out an urgent need to establish legal limits and control strategies regarding the presence of multiple mycotoxins in children's foods in order to protect their health. The development of packaging materials with antifungal properties is a possible solution to control the growth of moulds and consequently reduce mycotoxin production, contributing to guaranteeing the quality and safety of foods intended for children's consumption.
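The three risk metrics used in these two studies reduce to simple ratios and sums, sketched below. The formulas (MoE = BMDL / exposure, HQ = exposure / TDI, HI = sum of HQs under concentration addition) follow the abstracts; the numeric values in the example are purely hypothetical and are not the studies' data.

```python
def margin_of_exposure(bmdl, exposure):
    """MoE = BMDL / daily exposure; for genotoxic carcinogens such as
    aflatoxins, values of 10000 or more are read as low health concern."""
    return bmdl / exposure

def hazard_quotient(exposure, tdi):
    """HQ = daily exposure / tolerable daily intake (same units)."""
    return exposure / tdi

def hazard_index(hqs):
    """Concentration addition: HI is the sum of the individual HQs;
    HI > 1 flags potential concern for the mixture."""
    return sum(hqs)

# Hypothetical exposures in ng/kg bw/day (illustrative numbers only).
moe = margin_of_exposure(bmdl=170.0, exposure=0.01)   # well above 10000
mixture_hi = hazard_index([
    hazard_quotient(exposure=50.0, tdi=1000.0),       # HQ = 0.05
    hazard_quotient(exposure=200.0, tdi=1000.0),      # HQ = 0.20
])                                                    # HI = 0.25 < 1
```

The direction of each comparison matters: a *large* MoE and a *small* HQ/HI both indicate low concern, which is why the two abstracts report "below 10000" as a concern and "below 1" as reassurance.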
Abstract:
Water regimes in the Brazilian Cerrados are sensitive to climatological disturbances and human intervention. The risk that critical water-table levels are exceeded over long periods of time can be estimated by applying stochastic methods in modeling the dynamic relationship between water levels and driving forces such as precipitation and evapotranspiration. In this study, a transfer function-noise model, the so-called PIRFICT model, is applied to estimate the dynamic relationship between water-table depth and precipitation surplus/deficit in a watershed with a groundwater monitoring scheme in the Brazilian Cerrados. Critical limits were defined for a period in the Cerrados agricultural calendar, the end of the rainy season, when extremely shallow levels (< 0.5-m depth) can pose a risk to plant health and machinery before harvesting. By simulating time-series models, the risk of exceeding critical thresholds during a continuous period of time (e.g. 10 days) is described by probability levels. These simulated probabilities were interpolated spatially using universal kriging, incorporating information related to the drainage basin from a digital elevation model. The resulting map reduced model uncertainty. Three areas were identified as presenting potential risk at the end of the rainy season. These areas deserve attention with respect to water management and land-use planning.
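The probability described here (exceeding a critical threshold for a continuous period) can be estimated from an ensemble of simulated water-table series by counting the runs. The sketch below is a generic Monte Carlo estimate under that definition; the ensemble data are toy values, not PIRFICT model output.

```python
def has_critical_run(depths, threshold=0.5, run_length=10):
    """True if the water table is shallower than `threshold` (m below the
    surface) for at least `run_length` consecutive days."""
    run = 0
    for depth in depths:
        run = run + 1 if depth < threshold else 0
        if run >= run_length:
            return True
    return False

def exceedance_probability(simulated_series, threshold=0.5, run_length=10):
    """Monte Carlo estimate of the risk over an ensemble of simulated
    daily water-table depth series (e.g. from a calibrated stochastic
    time-series model)."""
    hits = sum(has_critical_run(s, threshold, run_length) for s in simulated_series)
    return hits / len(simulated_series)

# Toy ensemble of two 30-day realizations: one stays deep all month, the
# other spends 12 consecutive days shallower than 0.5 m.
ensemble = [[1.2] * 30, [1.0] * 10 + [0.3] * 12 + [1.0] * 8]
p = exceedance_probability(ensemble)
```

With many simulated realizations per monitoring well, the resulting point probabilities are what the study then interpolates spatially with universal kriging.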
Abstract:
"Growing Up Happily in the Family" is a program to prevent child maltreatment, targeted at parents of children aged 0-5 years in at-risk psychosocial contexts. The program is delivered via either a group-based or a home-visit format. The objective of this study was to evaluate the impact of various implementation components in the home and group versions on changes in parental attitudes about child development and education. At-risk and non-at-risk parents participated in the group-based (196 participants in 26 groups) and home-visit (95 participants) versions of the program, delivered through local social services. We analyzed program adherence, adaptations, participant responsiveness, quality of delivery, and implementation barriers as predictors of changes in parental attitudes. The results showed that greater program adherence, better quality of delivery and participant responsiveness, and a positive climate predicted changes in parental attitudes in both formats. Therefore, it is important to take into account the quality of the implementation process when testing the effectiveness of early group-based and home-visit interventions in at-risk families.