944 results for Hazard-Based Models


Relevance:

90.00%

Publisher:

Abstract:

Proper management of supply chains is fundamental to the overall performance of forest-based activities. Efficient management techniques usually rely on decision support software, which must be able to generate fast and effective outputs from the set of possibilities. This, in turn, requires accurate models of the dynamic interactions of the underlying systems. Given the nature of forest-based supply chains, event-based models are well suited to describing their behaviour. This work proposes the modelling and simulation of a forest-based supply chain, in particular the biomass supply chain, through the SimPy framework. This Python-based tool allows the modelling of discrete-event systems using constructs such as events, processes and resources. The developed model was used to assess the impact of changes in the daily working plan in three situations. First, as a control case, the deterministic behaviour was simulated. Second, a machine delay was introduced and its implications for plan accomplishment were analysed. Finally, to better address real operating conditions, stochastic processing and driving times were simulated. The results validate SimPy as a framework for modelling supply chains in general and the biomass problem in particular.

Relevance:

90.00%

Publisher:

Abstract:

Community-based models for injury prevention have become an accepted part of the overall injury control strategy. This systematic review of the scientific literature examines the evidence for their effectiveness in reducing pedestrian injury in children 0-14 years of age. A comprehensive search of the literature was performed using the following study selection criteria: community-based intervention study; target population of children under 14 years; outcome measure of either pedestrian injury rates or observed child pedestrian or vehicle driver behaviour; and use of a community control or an historical control in the study design. Quality assessment and data abstraction were guided by a standardized procedure and performed independently by two authors. Data synthesis was in tabular and text form, meta-analysis not being possible due to the discrepancy in methods and measures between the studies.

Relevance:

90.00%

Publisher:

Abstract:

We present unified, systematic derivations of schemes in the two known measurement-based models of quantum computation. The first model (introduced by Raussendorf and Briegel, [Phys. Rev. Lett. 86, 5188 (2001)]) uses a fixed entangled state, adaptive measurements on single qubits, and feedforward of the measurement results. The second model (proposed by Nielsen, [Phys. Lett. A 308, 96 (2003)] and further simplified by Leung, [Int. J. Quant. Inf. 2, 33 (2004)]) uses adaptive two-qubit measurements that can be applied to arbitrary pairs of qubits, and feedforward of the measurement results. The underlying principle of our derivations is a variant of teleportation introduced by Zhou, Leung, and Chuang, [Phys. Rev. A 62, 052316 (2000)]. Our derivations unify these two measurement-based models of quantum computation and provide significantly simpler schemes.
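The one-bit teleportation primitive of Zhou, Leung, and Chuang that underlies these derivations can be checked numerically in a few lines (a minimal real-amplitude sketch, not the paper's general derivation): a CZ with a |+⟩ ancilla followed by an X-basis measurement of the input qubit leaves X^m H|ψ⟩ on the ancilla.

```python
import numpy as np

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
CZ = np.diag([1.0, 1.0, 1.0, -1.0])

def one_bit_teleport(psi, m):
    """CZ with a |+> ancilla, then an X-basis measurement (outcome m)
    on the input qubit; the ancilla is left in X^m H |psi>."""
    plus = np.array([1.0, 1.0]) / np.sqrt(2)
    state = CZ @ np.kron(psi, plus)                     # entangle input with ancilla
    basis = np.array([1.0, (-1.0) ** m]) / np.sqrt(2)   # <+| (m=0) or <-| (m=1)
    out = basis @ state.reshape(2, 2)                   # contract measured qubit
    return out / np.linalg.norm(out)

psi = np.array([0.6, 0.8])            # an arbitrary normalized real state
out0 = one_bit_teleport(psi, 0)       # expect H psi
out1 = one_bit_teleport(psi, 1)       # expect X H psi
```

Chaining this primitive, with measurement bases adapted to earlier outcomes, is exactly the feedforward structure both measurement-based models share.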

Relevance:

90.00%

Publisher:

Abstract:

Study Objective: Community-based models for injury prevention have become an accepted part of the overall injury control strategy. This systematic review of the scientific literature examines the evidence for their effectiveness in reducing injury due to inadequate car seat restraint use in children 0-16 years of age. Methods: A comprehensive search of the literature was performed using the following study selection criteria: community-based intervention study; target population of children aged 0-16 years; outcome measure of either injury rates due to motor vehicle crashes or observed changes in child restraint use; and use of a community control or historical control in the study design. Quality assessment and data abstraction were guided by a standardized procedure and performed independently by two authors. Data synthesis was in tabular and text form, meta-analysis not being possible due to the discrepancy in methods and measures between the studies. Results: This review found eight studies that met all the inclusion criteria. In the studies that measured injury outcomes, significant reductions in the risk of motor vehicle occupant injury (33-55%) were reported in the study communities. Among the studies reporting observed car seat restraint use, community-based programs increased toddler restraint use in children aged 1-5 years by up to 11%; booster seat use in children aged 4-8 years by up to 13%; rear restraint use in children aged 0-15 years by 8%; restraint use in pre-school aged children in a high-risk community by 50%; and restraint use in children aged 5-11 years by 44%. Conclusion: While this review highlights some evidence supporting the effectiveness of community-based programs in promoting car restraint use and/or reducing motor vehicle occupant injury, limitations in the evaluation methodologies of the studies require the results to be interpreted with caution. 
There is clearly a need for further high quality program evaluation research to develop an evidence base. (C) 2004 Elsevier Ltd. All rights reserved.

Relevance:

90.00%

Publisher:

Abstract:

An appreciation of the physical mechanisms which cause observed seismicity complexity is fundamental to the understanding of the temporal behaviour of faults and single slip events. Numerical simulation of fault slip can provide insights into fault processes by allowing exploration of the parameter spaces that influence the microscopic and macroscopic physics of these processes, and may lead towards an answer to those questions. Particle-based models such as the Lattice Solid Model have been used previously for the simulation of stick-slip dynamics of faults, although mainly in two dimensions. Recent increases in the power of computers and the ability to use the power of parallel computer systems have made it possible to extend particle-based fault simulations to three dimensions. In this paper a particle-based numerical model of a rough planar fault embedded between two elastic blocks in three dimensions is presented. A very simple friction law without any rate dependency and no spatial heterogeneity in the intrinsic coefficient of friction is used in the model. To simulate earthquake dynamics the model is sheared in a direction parallel to the fault plane with a constant velocity at the driving edges. Spontaneous slip occurs on the fault when the shear stress is large enough to overcome the frictional forces on the fault. Slip events with a wide range of event sizes are observed. Investigation of the temporal evolution and spatial distribution of slip during each event shows a high degree of variability between the events. In some of the larger events highly complex slip patterns are observed.
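The stick-slip phenomenology described here can be caricatured with a far simpler lattice model. The sketch below is an illustrative 1-D Olami-Feder-Christensen-style automaton (all parameters invented), not the Lattice Solid Model itself, but it reproduces the qualitative point that uniform loading plus simple threshold friction yields slip events with a range of sizes:

```python
import random

def ofc(n=30, alpha=0.2, steps=2000, seed=1):
    """1-D Olami-Feder-Christensen-style caricature of stick-slip:
    load all sites until the most-stressed one reaches the failure
    threshold (1.0), then cascade, each failing site passing a
    fraction alpha of its stress to each neighbour. Returns the
    size (number of site failures) of each event."""
    rng = random.Random(seed)
    stress = [rng.random() for _ in range(n)]
    sizes = []
    for _ in range(steps):
        m = max(stress)
        stress = [s - m + 1.0 for s in stress]   # load; max site hits 1.0 exactly
        size = 0
        failing = [i for i, s in enumerate(stress) if s >= 1.0]
        while failing:                            # avalanche until all below threshold
            size += len(failing)
            for i in failing:
                for j in (i - 1, i + 1):
                    if 0 <= j < n:
                        stress[j] += alpha * stress[i]
                stress[i] = 0.0                   # failed site drops its stress
            failing = [i for i, s in enumerate(stress) if s >= 1.0]
        sizes.append(size)
    return sizes

sizes = ofc()
```

With alpha < 0.5 the rule is dissipative, so every avalanche terminates; single-site events coexist with larger cascades, echoing the event-size variability reported above.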

Relevance:

90.00%

Publisher:

Abstract:

This paper compares the UK/US exchange rate forecasting performance of linear and nonlinear models based on monetary fundamentals with that of a random walk (RW) model. Structural breaks are identified and taken into account. The exchange rate forecasting framework is also used for assessing the relative merits of the official Simple Sum and the weighted Divisia measures of money. Overall, there are four main findings. First, the majority of the models with fundamentals are able to beat the RW model in forecasting the UK/US exchange rate. Second, the most accurate forecasts of the UK/US exchange rate are obtained with a nonlinear model. Third, taking into account structural breaks reveals that the Divisia aggregate performs better than its Simple Sum counterpart. Finally, Divisia-based models provide more accurate forecasts than Simple Sum-based models provided they are constructed within a nonlinear framework.
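The basic forecast horse-race can be sketched on synthetic data (the AR coefficient, noise scales and "fundamentals" series below are invented stand-ins; the paper's Divisia/Simple Sum data are not reproduced here): a random walk predicts no change, while a fundamentals model predicts partial reversion of the deviation from fundamentals, and the two are scored by out-of-sample RMSE.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 400
# Synthetic "exchange rate": a random walk plus a mean-reverting
# deviation from fundamentals (illustrative only).
dev = np.zeros(T)
for t in range(1, T):
    dev[t] = 0.5 * dev[t - 1] + rng.normal(scale=0.3)
rate = np.cumsum(rng.normal(scale=0.05, size=T)) + dev

rmse = lambda e: float(np.sqrt(np.mean(np.square(e))))
rw_err, fund_err = [], []
for t in range(300, T):                       # one-step-ahead, last 100 points
    rw_err.append(rate[t] - rate[t - 1])      # RW forecast: no change
    pred = rate[t - 1] - 0.5 * dev[t - 1]     # fundamentals: partial reversion
    fund_err.append(rate[t] - pred)
print(rmse(rw_err), rmse(fund_err))
```

Extending this to nonlinear models and structural breaks, as the paper does, changes only the forecast rule, not the evaluation loop.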

Relevance:

90.00%

Publisher:

Abstract:

We study the comparative importance of thermal to nonthermal fluctuations for membrane-based models in the linear regime. Our results, both in 1+1 and 2+1 dimensions, suggest that nonthermal fluctuations dominate thermal ones only when the relaxation time τ is large. For moderate to small values of τ, the dynamics is defined by a competition between these two forces. The results are expected to act as a quantitative benchmark for biological modeling in systems involving cytoskeletal and other nonthermal fluctuations. © 2011 American Physical Society.
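For an overdamped mode driven by thermal white noise plus an active Ornstein-Uhlenbeck force of fixed variance, a standard filtering result gives the two contributions to ⟨x²⟩ in closed form, so the τ-dependence of their competition can be written directly (a sketch with illustrative parameter values, not the paper's membrane model):

```python
def variance_ratio(tau, f0_sq=1.0, k=1.0, D_th=0.5):
    """Active vs thermal contribution to <x^2> for x' = -k x + f + xi,
    where xi is thermal white noise (diffusivity D_th) and f is an
    Ornstein-Uhlenbeck force with fixed variance f0_sq and correlation
    time tau. Standard results: <x^2>_active = f0_sq*tau/(k*(1+k*tau)),
    <x^2>_thermal = D_th/k. Parameter values are illustrative."""
    active = f0_sq * tau / (k * (1.0 + k * tau))
    thermal = D_th / k
    return active / thermal

# Long correlation times let the active force dominate; short ones do not.
print(variance_ratio(10.0), variance_ratio(0.1))
```

This toy reproduces the qualitative statement above: the nonthermal term dominates only at large τ, while for small τ the two contributions compete.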

Relevance:

90.00%

Publisher:

Abstract:

Analysis of risk measures associated with price series movements, and their prediction, is of strategic importance in financial markets as well as to policy makers, particularly for short- and long-term planning and for setting economic growth targets. For example, oil-price risk management focuses primarily on when and how an organization can best prevent costly exposure to price risk. Value-at-Risk (VaR) is the commonly practised instrument to measure risk and is evaluated by analysing the negative/positive tail of the probability distribution of the returns (profit or loss). In modelling applications, least-squares estimation (LSE)-based linear regression models are often employed for modelling and analysing correlated data. These linear models are optimal and perform relatively well under conditions such as errors following normal or approximately normal distributions, freedom from large outliers, and satisfaction of the Gauss-Markov assumptions. However, in practical situations the LSE-based linear regression models often fail to provide optimal results, for instance in non-Gaussian situations, especially when the errors follow fat-tailed distributions and may not possess a finite variance. This is the situation in risk analysis, which involves analysing tail distributions. Thus, applications of the LSE-based regression models may be questioned for appropriateness and may have limited applicability. We have carried out a risk analysis of Iranian crude oil price data based on Lp-norm regression models and have noted that the LSE-based models do not always perform best. We discuss results from the L1, L2 and L∞-norm based linear regression models. ACM Computing Classification System (1998): B.1.2, F.1.3, F.2.3, G.3, J.2.
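Two of the ingredients named here, historical VaR and Lp-norm regression, can be sketched as follows. Iteratively reweighted least squares (IRLS) is one common way to fit Lp-norm regressions; the data below are synthetic, not the Iranian crude oil series:

```python
import numpy as np

def historical_var(returns, alpha=0.95):
    """Historical Value-at-Risk: the loss not exceeded with
    probability alpha, read from the empirical lower tail."""
    return -float(np.quantile(returns, 1.0 - alpha))

def lp_fit(X, y, p, iters=200, eps=1e-8):
    """Minimize ||y - X b||_p by iteratively reweighted least squares
    (p=1 gives robust L1 regression; p=2 reproduces ordinary LSE)."""
    b = np.linalg.lstsq(X, y, rcond=None)[0]      # L2 starting point
    for _ in range(iters):
        w = (np.abs(y - X @ b) + eps) ** (p - 2)  # IRLS weights
        Xw = X * w[:, None]
        b = np.linalg.solve(X.T @ Xw, Xw.T @ y)
    return b

x = np.arange(10, dtype=float)
y = 2.0 * x
y[9] = 100.0                        # one fat-tail outlier
X = x[:, None]
b1 = lp_fit(X, y, p=1)[0]           # L1 slope: barely moved by the outlier
b2 = lp_fit(X, y, p=2)[0]           # LSE slope: dragged toward the outlier
print(b1, b2)
```

The example makes the abstract's point concrete: with one fat-tail observation, the L1 fit stays near the true slope 2 while the LSE fit does not. An L∞ fit would instead minimize the largest residual and is usually posed as a linear program.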

Relevance:

90.00%

Publisher:

Abstract:

In recent years, there has been an increasing interest in learning distributed representations of word senses. Traditional context clustering based models usually require careful tuning of model parameters and typically perform worse on infrequent word senses. This paper presents a novel approach which addresses these limitations by first initializing the word sense embeddings through learning sentence-level embeddings from WordNet glosses using a convolutional neural network. The initialized word sense embeddings are then used by a context clustering based model to generate the distributed representations of word senses. Our learned representations outperform the publicly available embeddings on half of the metrics in the word similarity task and on 6 out of 13 subtasks in the analogical reasoning task, and give the best overall accuracy in the word sense effect classification task, which shows the effectiveness of our proposed distributed representation learning model.
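The context clustering step can be sketched with plain k-means on toy context vectors (this illustrates the general technique, not the paper's model: contexts of an ambiguous word are grouped into k sense clusters, and each centroid seeds one sense embedding):

```python
import numpy as np

def cluster_contexts(ctx, k=2, iters=20, seed=0):
    """Toy context clustering (k-means): assign each context vector
    to one of k sense centroids; centroids seed the sense embeddings."""
    rng = np.random.default_rng(seed)
    centroids = ctx[rng.choice(len(ctx), size=k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(ctx[:, None, :] - centroids[None], axis=2)
        labels = d.argmin(axis=1)              # nearest-centroid assignment
        centroids = np.array([ctx[labels == j].mean(axis=0) for j in range(k)])
    return labels

# Two artificial "senses": contexts near (0,0) vs contexts near (10,10).
ctx = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
                [10.0, 10.0], [10.1, 10.0], [10.0, 10.1]])
labels = cluster_contexts(ctx)
```

The gloss-based initialization the paper proposes would replace the random centroid seeding with CNN-derived sentence embeddings of each sense's WordNet gloss.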

Relevance:

90.00%

Publisher:

Abstract:

There has been an increasing interest in the use of agent-based simulation and some discussion of the relative merits of this approach as compared to discrete-event simulation. There are differing views on whether an agent-based simulation offers capabilities that discrete-event cannot provide or whether all agent-based applications can at least in theory be undertaken using a discrete-event approach. This paper presents a simple agent-based NetLogo model and corresponding discrete-event versions implemented in the widely used ARENA software. The two versions of the discrete-event model presented use a traditional process flow approach normally adopted in discrete-event simulation software and also an agent-based approach to the model build. In addition, a real-time spatial visual display facility is provided using a spreadsheet platform controlled by VBA code embedded within the ARENA model. Initial findings from this investigation are that discrete-event simulation can indeed be used to implement agent-based models and, with suitable integration elements such as VBA, can provide the spatial displays associated with agent-based software.
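The claim that discrete-event machinery can host agent-based models can be sketched with a stdlib event queue: the "agent" below schedules its own next event on a shared future-event list, which is the core mechanism of discrete-event engines such as ARENA or SimPy (the Walker agent and its parameters are invented for illustration):

```python
import heapq
import itertools

class Engine:
    """Minimal discrete-event core: a time-ordered future-event list."""
    def __init__(self):
        self.queue, self.now = [], 0.0
        self._ids = itertools.count()          # tie-breaker for equal times
    def schedule(self, delay, fn):
        heapq.heappush(self.queue, (self.now + delay, next(self._ids), fn))
    def run(self):
        while self.queue:
            self.now, _, fn = heapq.heappop(self.queue)
            fn()                               # fire events in time order

class Walker:
    """An 'agent': moves one step per tick, NetLogo-style, but driven
    entirely by the discrete-event scheduler."""
    def __init__(self, eng, pos, goal, trace):
        self.eng, self.pos, self.goal, self.trace = eng, pos, goal, trace
        eng.schedule(1.0, self.step)
    def step(self):
        if self.pos < self.goal:
            self.pos += 1
            self.trace.append((self.eng.now, self.pos))
            self.eng.schedule(1.0, self.step)  # agent re-schedules itself

eng = Engine()
trace = []
Walker(eng, pos=0, goal=3, trace=trace)
eng.run()
print(trace)   # [(1.0, 1), (2.0, 2), (3.0, 3)]
```

Each agent owning its state and scheduling its own events is exactly the "agent-based approach to the model build" on a discrete-event core that the paper describes; the spatial display would read agent positions each tick, as the VBA/spreadsheet layer does in ARENA.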

Relevance:

90.00%

Publisher:

Abstract:

The spatial and temporal distributions of modern diatom assemblages in surface sediments, on the most dominant macrophytes, and in the water column at 96 locations in Florida Bay, Biscayne Bay and adjacent regions were examined in order to develop paleoenvironmental prediction models for this region. Analyses of these distributions revealed distinct temporal and spatial differences in assemblages among the locations. The differences among diatom assemblages living on subaquatic vegetation, on sediments, and in the water column were significant. Because concentrations of salts, total phosphorus (WTP), total nitrogen (WTN) and total organic carbon (WTOC) are partly controlled by water management in this region, diatom-based models were produced to assess these variables. Discriminant function analyses showed that diatoms can also be successfully used to reconstruct changes in the abundance of diatom assemblages typical of different habitats and life habits. 
To interpret paleoenvironmental changes, changes in salinity, WTN, WTP and WTOC were inferred from diatoms preserved in sediment cores collected along environmental gradients in Florida Bay (4 cores) and from nearshore and offshore locations in Biscayne Bay (3 cores). The reconstructions showed that water quality conditions in these estuaries have been fluctuating for thousands of years due to natural processes and sea-level changes, but almost synchronized shifts in diatom assemblages occurred in the mid-1960s at all coring locations (except Ninemile Bank and Bob Allen Bank in Florida Bay). These alterations correspond to the major construction of numerous water management structures on the mainland. Additionally, all the coring sites (except Card Sound Bank, Biscayne Bay and Trout Cove, Florida Bay) showed decreasing salinity and fluctuations in nutrient levels in the last two decades that correspond to increased rainfall in the 1990s and increased freshwater discharge to the bays, a result of increased freshwater deliveries to the Everglades by the South Florida Water Management District in the 1980s and 1990s. Reconstructions of the abundance of diatom assemblages typical of different habitats and life habits revealed multiple sources of diatoms to the coring locations and showed that epiphytic assemblages in both bays have increased in abundance since the early 1990s.
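A diatom-based prediction model of the weighted-averaging kind can be sketched in a few lines (the taxa, abundances and salinities below are invented; the study's 96-site calibration data are not reproduced): each taxon's salinity optimum is the abundance-weighted mean of the training samples, and a fossil assemblage is then scored by the weighted mean of those optima.

```python
import numpy as np

# Training set: rows are modern samples, columns are relative
# abundances of three hypothetical diatom taxa.
abund = np.array([[0.8, 0.2, 0.0],
                  [0.5, 0.4, 0.1],
                  [0.1, 0.5, 0.4],
                  [0.0, 0.2, 0.8]])
salinity = np.array([5.0, 12.0, 25.0, 35.0])   # measured at each sample

# Weighted-averaging calibration: each taxon's optimum is the
# abundance-weighted mean salinity of the samples it occurs in.
optima = (abund * salinity[:, None]).sum(axis=0) / abund.sum(axis=0)

# Reconstruction for a fossil assemblage from a sediment core:
fossil = np.array([0.6, 0.3, 0.1])
estimate = float(fossil @ optima / fossil.sum())
print(estimate)   # inferred paleosalinity for this assemblage
```

The same calibrate-then-reconstruct scheme applies to the WTN, WTP and WTOC models; published transfer functions typically add deshrinking and cross-validated error estimates on top of this core.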

Relevance:

90.00%

Publisher:

Abstract:

This dissertation aimed to improve travel time estimation for the purpose of transportation planning by developing a travel time estimation method that incorporates the effects of signal timing plans, which were difficult to consider in planning models. For this purpose, an analytical model has been developed. The model parameters were calibrated based on data from CORSIM microscopic simulation, with signal timing plans optimized using the TRANSYT-7F software. Independent variables in the model are link length, free-flow speed, and traffic volumes from the competing turning movements. The developed model has three advantages compared to traditional link-based or node-based models. First, the model considers the influence of signal timing plans for a variety of traffic volume combinations without requiring signal timing information as input. Second, the model describes the non-uniform spatial distribution of delay along a link, making it possible to estimate the impacts of queues at different upstream locations of an intersection and to attribute delays to the subject link and the upstream link. Third, the model shows promise of improving the accuracy of travel time prediction. The mean absolute percentage error (MAPE) of the model is 13% for a set of field data from the Minnesota Department of Transportation (MDOT); this is close to the MAPE of uniform delay in the HCM 2000 method (11%). The HCM is the industry-accepted analytical model in the existing literature, but it requires signal timing information as input for calculating delays. The developed model also outperforms the HCM 2000 method for a set of Miami-Dade County data that represent congested traffic conditions, with a MAPE of 29%, compared to 31% for the HCM 2000 method. The advantages of the proposed model make it feasible for application to a large network without the burden of signal timing input, while improving the accuracy of travel time estimation. 
An assignment model with the developed travel time estimation method has been implemented in a South Florida planning model, which improved assignment results.
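The MAPE criterion quoted above, and the HCM 2000 uniform delay term it is benchmarked against, can be written out directly (d1 is the standard HCM 2000 uniform delay formula; the example cycle length, green time and saturation values are invented):

```python
def mape(actual, predicted):
    """Mean absolute percentage error, the accuracy measure used above."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

def hcm_uniform_delay(C, g, X):
    """HCM 2000 uniform delay d1 (s/veh):
    d1 = 0.5*C*(1 - g/C)^2 / (1 - min(1, X)*(g/C)),
    for cycle length C, effective green g, degree of saturation X.
    Note that g (the signal timing plan) is a required input, which is
    exactly the burden the dissertation's model avoids."""
    gc = g / C
    return 0.5 * C * (1.0 - gc) ** 2 / (1.0 - min(1.0, X) * gc)

print(mape([10.0, 20.0], [9.0, 22.0]))      # 10.0 (%)
print(hcm_uniform_delay(90.0, 30.0, 0.8))   # uniform delay for a 90 s cycle
```
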

Relevance:

90.00%

Publisher:

Abstract:

Human use of the oceans is increasingly in conflict with conservation of endangered species. Methods for managing the spatial and temporal placement of industries such as military, fishing, transportation and offshore energy, have historically been post hoc; i.e. the time and place of human activity is often already determined before assessment of environmental impacts. In this dissertation, I build robust species distribution models in two case study areas, US Atlantic (Best et al. 2012) and British Columbia (Best et al. 2015), predicting presence and abundance respectively, from scientific surveys. These models are then applied to novel decision frameworks for preemptively suggesting optimal placement of human activities in space and time to minimize ecological impacts: siting for offshore wind energy development, and routing ships to minimize risk of striking whales. Both decision frameworks relate the tradeoff between conservation risk and industry profit with synchronized variable and map views as online spatial decision support systems.

For siting offshore wind energy development (OWED) in the U.S. Atlantic (chapter 4), bird density maps are combined across species with weights of OWED sensitivity to collision and displacement, and 10 km² sites are compared against OWED profitability based on average annual wind speed at 90 m hub height and distance to the transmission grid. A spatial decision support system enables toggling between the map and tradeoff plot views by site. A selected site can be inspected for sensitivity to cetaceans throughout the year, so as to identify the months that minimize episodic impacts of pre-operational activities such as seismic airgun surveying and pile driving.

Routing ships to avoid whale strikes (chapter 5) can similarly be viewed as a tradeoff, but is a different problem spatially. A cumulative cost surface is generated from density surface maps and the conservation status of cetaceans, and then applied as a resistance surface to calculate least-cost routes between start and end locations, i.e. ports and entrance locations to study areas. Varying a multiplier on the cost surface enables calculation of multiple routes with different costs to conservation of cetaceans versus cost to the transportation industry, measured as distance. As in the siting chapter, a spatial decision support system enables toggling between the map and tradeoff plot views of proposed routes. The user can also input arbitrary start and end locations to calculate the tradeoff on the fly.
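The least-cost routing step can be sketched with Dijkstra's algorithm over a toy resistance grid (the 3×3 grid below is invented; each cell's cost stands in for cetacean density weighted by conservation status):

```python
import heapq

def least_cost_route(cost, start, goal):
    """Dijkstra over a grid resistance surface: the cost of a route is
    the summed cost of every cell entered (4-neighbour moves)."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev, pq = {}, [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist[(r, c)]:
            continue                      # stale queue entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:                  # walk predecessors back to start
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1], dist[goal]

grid = [[1, 1, 1],
        [1, 9, 1],                        # high-density cell to route around
        [1, 1, 1]]
route, total = least_cost_route(grid, (0, 0), (2, 2))
```

Multiplying the conservation layer before summing into `grid` reproduces the tradeoff sweep described above: a larger multiplier makes detours around high-density cells cheaper relative to the added distance.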

Essential inputs to these decision frameworks are the distributions of the species. The two preceding chapters comprise species distribution models from two case study areas, U.S. Atlantic (chapter 2) and British Columbia (chapter 3), predicting presence and density, respectively. Although density is preferred for estimating potential biological removal, per Marine Mammal Protection Act requirements in the U.S., the necessary parameters, especially distance and angle of observation, are less readily available across publicly mined datasets.

In the case of predicting cetacean presence in the U.S. Atlantic (chapter 2), I extracted datasets from the online OBIS-SEAMAP geo-database, and integrated scientific surveys conducted by ship (n=36) and aircraft (n=16), weighting a Generalized Additive Model by minutes surveyed within space-time grid cells to harmonize effort between the two survey platforms. For each of 16 cetacean species guilds, I predicted the probability of occurrence from static environmental variables (water depth, distance to shore, distance to continental shelf break) and time-varying conditions (monthly sea-surface temperature). To generate maps of presence vs. absence, Receiver Operator Characteristic (ROC) curves were used to define the optimal threshold that minimizes false positive and false negative error rates. I integrated model outputs, including tables (species in guilds, input surveys) and plots (fit of environmental variables, ROC curve), into an online spatial decision support system, allowing for easy navigation of models by taxon, region, season, and data provider.
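Choosing the presence/absence cutoff from a ROC curve is commonly done by maximizing Youden's J (sensitivity + specificity − 1), which jointly penalizes false positive and false negative rates as described above; the chapter does not state the exact criterion, so this is one standard choice, shown with invented scores:

```python
def best_threshold(scores, labels):
    """Return the score cutoff maximizing Youden's J
    (true positive rate minus false positive rate)."""
    pos = sum(labels)
    neg = len(labels) - pos
    best, best_j = None, -1.0
    for t in sorted(set(scores)):                 # candidate cutoffs
        tp = sum(s >= t and l for s, l in zip(scores, labels))
        fp = sum(s >= t and not l for s, l in zip(scores, labels))
        j = tp / pos - fp / neg                   # sensitivity - (1 - specificity)
        if j > best_j:
            best_j, best = j, t
    return best

# Toy predicted occurrence probabilities and true presences:
print(best_threshold([0.1, 0.2, 0.6, 0.8], [0, 0, 1, 1]))   # 0.6
```
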

For predicting cetacean density within the inner waters of British Columbia (chapter 3), I calculated density from systematic, line-transect marine mammal surveys over multiple years and seasons (summer 2004, 2005, 2008, and spring/autumn 2007) conducted by Raincoast Conservation Foundation. Abundance estimates were calculated using two different methods: Conventional Distance Sampling (CDS) and Density Surface Modelling (DSM). CDS generates a single density estimate for each stratum, whereas DSM explicitly models spatial variation and offers potential for greater precision by incorporating environmental predictors. Although DSM yields a more relevant product for the purposes of marine spatial planning, CDS has proven to be useful in cases where there are fewer observations available for seasonal and inter-annual comparison, particularly for the scarcely observed elephant seal. Abundance estimates are provided on a stratum-specific basis. Steller sea lions and harbour seals are further differentiated by ‘hauled out’ and ‘in water’. This analysis updates previous estimates (Williams & Thomas 2007) by including additional years of effort, providing greater spatial precision with the DSM method over CDS, novel reporting for spring and autumn seasons (rather than summer alone), and providing new abundance estimates for Steller sea lion and northern elephant seal. In addition to providing a baseline of marine mammal abundance and distribution, against which future changes can be compared, this information offers the opportunity to assess the risks posed to marine mammals by existing and emerging threats, such as fisheries bycatch, ship strikes, and increased oil spill and ocean noise issues associated with increases of container ship and oil tanker traffic in British Columbia’s continental shelf waters.
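The CDS point estimate mentioned above has a simple closed form; a sketch with invented numbers (n detections over total transect length L, truncation half-width w, and average detection probability p within the strip):

```python
def cds_density(n, L, w, p=1.0):
    """Conventional Distance Sampling density estimate:
    D = n / (2 * w * L * p)
    for n detections on total transect length L, truncation
    half-width w, and average detectability p (all invented here)."""
    return n / (2.0 * w * L * p)

# e.g. 40 sightings over 100 km of effort, 0.5 km half-width, p = 0.8:
print(cds_density(40, L=100.0, w=0.5, p=0.8))   # animals per km^2
```

DSM differs by fitting the detections to spatial covariates instead of producing one stratum-wide number, which is the gain in spatial precision the chapter describes.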

Starting with marine animal observations at specific coordinates and times, I combine these data with environmental data, often satellite derived, to produce seascape predictions generalizable in space and time. These habitat-based models enable prediction of encounter rates and, in the case of density surface models, abundance that can then be applied to management scenarios. Specific human activities, OWED and shipping, are then compared within a tradeoff decision support framework, enabling interchangeable map and tradeoff plot views. These products make complex processes transparent for gaming conservation, industry and stakeholders towards optimal marine spatial management, fundamental to the tenets of marine spatial planning, ecosystem-based management and dynamic ocean management.

Relevance:

90.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance:

90.00%

Publisher:

Abstract:

This paper proposes a numerical method based on PCA-ANFIS (Adaptive Neuro-Fuzzy Inference System) to address the problem of the uncertain effectiveness cycle of water injection in oilfields. After PCA reduces the dimensionality of the original data, ANFIS is applied to train and test the transformed data. The correctness of the PCA-ANFIS models is verified against injection statistics collected from 116 wells in an oilfield; the average absolute testing error is 1.80 months. Compared with non-PCA-based models, whose average error of 4.33 months is far larger, this shows that the testing accuracy is greatly enhanced by our approach. From these results, the PCA-ANFIS method is robust in predicting the effectiveness cycle of water injection, which helps oilfield developers design the water injection scheme.
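The PCA stage of such a pipeline can be sketched with numpy (the feature matrix below is synthetic; the ANFIS stage itself is not reproduced here):

```python
import numpy as np

def pca_reduce(X, k):
    """Standardize features, then project onto the top-k principal
    components via SVD: the dimension-reduction step that would
    precede ANFIS training in a PCA-ANFIS pipeline."""
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]          # scores and component loadings

rng = np.random.default_rng(3)
base = rng.normal(size=(50, 2))           # two latent factors
X = base @ rng.normal(size=(2, 6))        # six correlated observed features
Z, comps = pca_reduce(X, k=2)             # 6 features -> 2 components
```

Because the synthetic features have rank two, the two retained components reconstruct the standardized data exactly; on real injection statistics, k would be chosen from the explained-variance profile of the singular values.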