931 results for Urban Simulation Model


Relevance: 80.00%

Publisher:

Abstract:

In 2002, 2003 and 2004 we took macroinvertebrate samples on a total of 36 occasions in the Badacsony bay of Lake Balaton. Our sampling site was characterised by areas of open water (in 2003 and 2004 overgrown with reed-grass) as well as by areas covered by common reed (Phragmites australis) and narrowleaf cattail (Typha angustifolia). Samples were taken both from the water body and from the benthic ooze using a stiff hand net. Our data were obtained by processing 208 individual samples. We sampled frequently from early spring until late autumn to gain a deeper understanding of the processes of seasonal dynamics. The main seasonal patterns and temporal changes of diversity were described. To allow further use of our data in climate change research, we constructed a weather-dependent simulation model of the processes of seasonal dynamics. We described the total number of individuals, the biovolume and the diversity of all macroinvertebrate species with a single index and used the temporal trends of this index for simulation modelling. Our discrete deterministic model includes only the impact of temperature; other interactions can appear only implicitly. By running the model for different climate change scenarios, it became possible to estimate conditions for the 2070-2100 period. The results, however, should be interpreted with caution, not only because our model is very simple but also because the scenarios are themselves the outputs of different models.
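
The core of such a weather-dependent, discrete deterministic model can be sketched as a daily update in which tomorrow's value of the community index depends only on today's value and the day's temperature. The sketch below is illustrative only; the logistic form, the bell-shaped temperature response and all parameter values are assumptions, not the authors' fitted model.

    import math

    def temperature_response(temp_c, t_opt=20.0, t_width=8.0):
        # Hypothetical bell-shaped temperature response (not the authors' fitted curve).
        return math.exp(-((temp_c - t_opt) / t_width) ** 2)

    def simulate(index0, daily_temp_c, r_max=0.15, capacity=100.0):
        # Discrete daily step: the next value depends only on today's value and temperature.
        index = index0
        series = [index]
        for temp in daily_temp_c:
            index = index + r_max * temperature_response(temp) * index * (1.0 - index / capacity)
            series.append(index)
        return series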

Relevance: 80.00%

Publisher:

Abstract:

In 2004 and 2005 we collected samples of phytoplankton, zooplankton and macroinvertebrates in a small artificial pond in Budapest. We set up a simulation model predicting the abundances of the cyclopoids, Eudiaptomus zachariasi and Ischnura pumilio by considering only temperature and the previous day's abundance of each population. Phytoplankton abundance was simulated by considering not only temperature but also the abundances of the three groups mentioned above. This discrete deterministic model generated patterns similar to those observed, and testing it on historical data was successful. However, because the model overpredicted the abundances of Ischnura pumilio and Cyclopoida at the end of the year, these results were not considered. Running the model with the data series of climate change scenarios allowed us to predict individual numbers for the period around 2050. When the model is run with the data series of the two scenarios UKHI and UKLO, which predict drastic global warming, we observe a decrease in abundance and a shift in the date at which the maximum abundance occurs (except for Ischnura pumilio, whose maximum abundance increases and occurs later), whereas under unchanged climatic conditions (BASE scenario) the change in abundance is negligible. According to the scenarios GFDL 2535, GFDL 5564 and UKTR, a transition could be observed.
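
A minimal sketch of the kind of coupled, discrete deterministic update described above: the consumer groups respond only to temperature and their previous-day abundances, while phytoplankton responds to temperature and to the abundances of the three consumer groups. All functional forms, coefficients and names are hypothetical, not the calibrated model.

    import math

    def temp_factor(temp_c, t_opt=20.0, t_width=8.0):
        # Hypothetical bell-shaped temperature response.
        return math.exp(-((temp_c - t_opt) / t_width) ** 2)

    def daily_step(phyto, consumers, temp_c):
        # consumers: dict of abundances for cyclopoids, Eudiaptomus zachariasi, Ischnura pumilio.
        f = temp_factor(temp_c)
        grazing = 0.001 * sum(consumers.values())
        next_phyto = max(phyto + (0.2 * f - grazing) * phyto, 0.0)
        next_consumers = {name: max(n + (0.1 * f - 0.05) * n, 0.0)
                          for name, n in consumers.items()}
        return next_phyto, next_consumers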

Relevance: 80.00%

Publisher:

Abstract:

Climate change is one of the most pressing ecological problems of our age. The seasonal dynamics of aquatic communities are regulated, among other factors, by the climate, especially by temperature. In this case study we attempted to simulate the seasonal dynamics of a copepod species of the zooplankton community, Cyclops vicinus, based on a quantitative database containing ten years of data from the Göd area of the Danube. We set up a simulation model predicting the abundance of Cyclops vicinus by considering only temperature as the driver of population abundance. The model was fitted to eight years of daily temperature data observed between 1981 and 1994 and was tested successfully on the data of two further years. The model was then run with the data series of climate change scenarios specified for the period around 2070-2100. In addition, we looked for areas geographically analogous to the Göd region, that is, areas whose present climate is most similar to the projected future climate of the Göd area. Together, these approaches give a view of how the climate of the region may change by the end of the 21st century and how the seasonal dynamics of a chosen planktonic crustacean species may follow this change. According to our results, the climate of the Göd area will become similar to that of the northern region of Greece. The maximum abundance of the examined species is expected to occur a month to a month and a half earlier, and larger between-year variances in abundance are expected. The deviations are expected to be towards either smaller or considerably larger abundances than observed so far.
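
The search for geographically analogous areas can be illustrated as a nearest-neighbour problem: the projected future climate of the Göd area is compared with the present climate of candidate areas using a simple distance on a set of climate descriptors. The descriptors, the distance measure and the data layout below are assumptions for illustration; the study's actual analogue method may differ.

    import numpy as np

    def closest_analogues(future_god_climate, candidate_climates, k=3):
        # future_god_climate: 1-D array of climate descriptors (e.g. 12 monthly mean temperatures)
        # candidate_climates: dict mapping area name -> descriptor array of the same length
        dists = {area: float(np.linalg.norm(np.asarray(vec) - np.asarray(future_god_climate)))
                 for area, vec in candidate_climates.items()}
        return sorted(dists, key=dists.get)[:k]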

Relevance: 80.00%

Publisher:

Abstract:

In 2004 and 2005, we collected samples of phytoplankton, zooplankton, and macroinvertebrates in a small artificial pond in Budapest (Hungary). We set up a simulation model predicting the abundances of the cyclopoids, Eudiaptomus zachariasi, and Ischnura pumilio by considering only temperature and the previous day's abundance of each population. Phytoplankton abundance was simulated by considering not only temperature but also the abundances of the three groups mentioned above. The model was run with the data series of internationally accepted climate change scenarios, and the different outcomes were discussed. A comparative assessment of the alternative climate change scenarios was also carried out with statistical methods.

Relevance: 80.00%

Publisher:

Abstract:

Traffic incidents are non-recurring events that can cause a temporary reduction in roadway capacity. They have been recognized as a major contributor to traffic congestion on our nation’s highway systems. To alleviate their impacts on capacity, automatic incident detection (AID) has been applied as an incident management strategy to reduce the total incident duration. AID relies on an algorithm to identify the occurrence of incidents by analyzing real-time traffic data collected from surveillance detectors. Significant research has been performed to develop AID algorithms for incident detection on freeways; however, similar research on major arterial streets remains largely at the initial stage of development and testing. This dissertation research aims to identify design strategies for the deployment of an Artificial Neural Network (ANN) based AID algorithm for major arterial streets. A section of the US-1 corridor in Miami-Dade County, Florida was coded in the CORSIM microscopic simulation model to generate data for both model calibration and validation. To better capture the relationship between the traffic data and the corresponding incident status, Discrete Wavelet Transform (DWT) and data normalization were applied to the simulated data. Multiple ANN models were then developed for different detector configurations, historical data usage, and the selection of traffic flow parameters. To assess the performance of different design alternatives, the model outputs were compared based on both detection rate (DR) and false alarm rate (FAR). The results show that the best models were able to achieve a high DR of between 90% and 95%, a mean time to detect (MTTD) of 55-85 seconds, and a FAR below 4%. The results also show that a detector configuration including only the mid-block and upstream detectors performs almost as well as one that also includes a downstream detector. In addition, DWT was found to be able to improve model performance, and the use of historical data from previous time cycles improved the detection rate. Speed was found to have the most significant impact on the detection rate, while volume was found to contribute the least. The results from this research provide useful insights on the design of AID for arterial street applications.
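
A sketch of the kind of preprocessing pipeline described above, assuming 30-second speed, volume and occupancy series from the detectors; the wavelet family, decomposition level, normalization and network size are illustrative choices, not the dissertation's calibrated design.

    import numpy as np
    import pywt
    from sklearn.neural_network import MLPClassifier

    def make_features(speed, volume, occupancy, wavelet="db4", level=2):
        # Denoise each detector series with a discrete wavelet transform, keep the
        # low-frequency approximation coefficients, then normalize the stacked vector.
        feats = []
        for series in (speed, volume, occupancy):
            coeffs = pywt.wavedec(np.asarray(series, dtype=float), wavelet, level=level)
            feats.append(coeffs[0])
        x = np.concatenate(feats)
        return (x - x.mean()) / (x.std() + 1e-9)

    # X: one feature vector per time step, y: 1 = incident present, 0 = no incident
    # clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)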

Relevance: 80.00%

Publisher:

Abstract:

Traffic incidents are a major source of traffic congestion on freeways. Freeway traffic diversion using pre-planned alternate routes has been used as a strategy to reduce traffic delays due to major traffic incidents. However, it is not always beneficial to divert traffic when an incident occurs. Route diversion may adversely impact traffic on the alternate routes and may not result in an overall benefit. This dissertation research attempts to apply Artificial Neural Network (ANN) and Support Vector Regression (SVR) techniques to predict the percent of delay reduction from route diversion to help determine whether traffic should be diverted under given conditions. The DYNASMART-P mesoscopic traffic simulation model was applied to generate simulated data that were used to develop the ANN and SVR models. A sample network that comes with the DYNASMART-P package was used as the base simulation network. A combination of different levels of incident duration, capacity lost, percent of drivers diverted, VMS (variable message sign) messaging duration, and network congestion was simulated to represent different incident scenarios. The resulting percent of delay reduction, average speed, and queue length from each scenario were extracted from the simulation output. The ANN and SVR models were then calibrated for percent of delay reduction as a function of all of the simulated input and output variables. The results show that both the calibrated ANN and SVR models, when applied to the same location used to generate the calibration data, were able to predict delay reduction with a relatively high accuracy in terms of mean square error (MSE) and regression correlation. It was also found that the performance of the ANN model was superior to that of the SVR model. Likewise, when the models were applied to a new location, only the ANN model could produce comparatively good delay reduction predictions under high network congestion level.
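
A minimal sketch of calibrating and comparing the two model types on the simulated scenarios; the feature list in the comment mirrors the variables named above, while the hyper-parameters and the helper name are assumptions rather than the dissertation's settings.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.svm import SVR
    from sklearn.metrics import mean_squared_error

    def fit_and_compare(X_train, y_train, X_test, y_test):
        # X columns: incident duration, capacity lost, percent of drivers diverted,
        # VMS messaging duration, congestion level, average speed, queue length.
        # y: percent of delay reduction from route diversion.
        ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000).fit(X_train, y_train)
        svr = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X_train, y_train)
        return {name: mean_squared_error(y_test, model.predict(X_test))
                for name, model in (("ANN", ann), ("SVR", svr))}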

Relevance: 80.00%

Publisher:

Abstract:

Freeway systems are becoming more congested each day. One contributor to freeway traffic congestion is platoons of on-ramp traffic merging into freeway mainlines. As a relatively low-cost countermeasure to the problem, ramp meters are being deployed in both directions of an 11-mile section of I-95 in Miami-Dade County, Florida. The local Fuzzy Logic (FL) ramp metering algorithm implemented in Seattle, Washington, has been selected for deployment. The FL ramp metering algorithm is powered by the Fuzzy Logic Controller (FLC). The FLC depends on a series of parameters that can significantly alter the behavior of the controller, thus affecting the performance of ramp meters. However, the most suitable values for these parameters are often difficult to determine, as they vary with current traffic conditions. Thus, for optimum performance, the parameter values must be fine-tuned. This research presents a new method of fine-tuning the FLC parameters using Particle Swarm Optimization (PSO), which attempts to optimize several important parameters of the FLC. The objective function of the optimization model incorporates the METANET macroscopic traffic flow model to minimize delay time, subject to the constraints of reasonable ranges of ramp metering rates and FLC parameters. To further improve performance, a short-term traffic forecasting module using a discrete Kalman filter was incorporated to predict the downstream freeway mainline occupancy. This helps to detect the presence of downstream bottlenecks. The CORSIM microscopic simulation model was selected as the platform to evaluate the performance of the proposed PSO tuning strategy. The ramp-metering algorithm incorporating the tuning strategy was implemented using CORSIM's run-time extension (RTE) and was tested on the aforementioned I-95 corridor. The performance of the FLC with PSO tuning was compared with the performance of the existing FLC without PSO tuning. The results show that the FLC with PSO tuning outperforms the existing FL metering, fixed-time metering, and existing conditions without metering in terms of total travel time savings, average speed, and system-wide throughput.
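
The tuning idea can be illustrated with a generic particle swarm optimizer; here the objective function would wrap a METANET (or CORSIM) run that returns total delay for a candidate vector of FLC parameters. The swarm size, inertia and acceleration coefficients below are common textbook defaults, not the values used in the study.

    import numpy as np

    def pso_minimize(objective, bounds, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
        # bounds: list of (low, high) pairs, one per FLC parameter to tune.
        lo, hi = np.array(bounds, dtype=float).T
        pos = np.random.uniform(lo, hi, (n_particles, len(bounds)))
        vel = np.zeros_like(pos)
        pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
        gbest = pbest[pbest_val.argmin()].copy()
        for _ in range(iters):
            r1, r2 = np.random.rand(2, *pos.shape)
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, lo, hi)
            vals = np.array([objective(p) for p in pos])
            improved = vals < pbest_val
            pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
            gbest = pbest[pbest_val.argmin()].copy()
        return gbest, float(pbest_val.min())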

Relevance: 80.00%

Publisher:

Abstract:

This dissertation develops a process improvement method for service operations based on the Theory of Constraints (TOC), a management philosophy that has been shown to be effective in manufacturing for decreasing work in process (WIP) and improving throughput. While TOC has enjoyed much attention and success in the manufacturing arena, its application to services in general has been limited. The contribution to industry and knowledge is a method for improving global performance measures based on TOC principles. The method proposed in this dissertation is tested using discrete-event simulation based on the scenario of the service factory of airline turnaround operations. To evaluate the method, a simulation model of the aircraft turn operations of a U.S.-based carrier was built and validated using actual data from airline operations. The model was then adjusted to reflect an application of the Theory of Constraints for determining how to deploy the scarce resource of ramp workers. The results indicate that, given slight modifications to TOC terminology and the development of a method for constraint identification, the Theory of Constraints can be applied with success to services. Bottlenecks in services must be defined as those processes for which the process rate and the amount of work remaining are such that completing the process will not be possible without an increase in the process rate. The bottleneck ratio is used to determine to what degree a process is a constraint. Simulation results also suggest that redefining performance measures to reflect a global business perspective of reducing costs related to specific flights, rather than the locally optimal operational approach of turning all aircraft quickly, results in significant savings to the company. Simulated savings to the airline's annual operating costs equalled 30% of the current potential expenses for misconnecting passengers, with only a modest increase in worker utilization achieved through a more efficient heuristic of deploying workers to the highest-priority tasks. This dissertation contributes to the literature on service operations by describing a dynamic, adaptive dispatch approach, based on the Theory of Constraints, for managing service factory operations similar to airline turnaround operations.
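
One plausible formalization of that bottleneck definition, given only for illustration (the dissertation's exact constraint-identification formula is not reproduced here): the ratio of the processing rate required to finish the remaining work in the time remaining to the rate currently available.

    def bottleneck_ratio(work_remaining, process_rate, time_remaining):
        # Ratio > 1: the process cannot finish at its current rate without help,
        # so it is a constraint; the larger the ratio, the stronger the constraint.
        required_rate = work_remaining / max(time_remaining, 1e-9)
        return required_rate / max(process_rate, 1e-9)

    # Hypothetical turn task: 12 bags left to load, 0.5 bags/min available, 20 min to pushback.
    # bottleneck_ratio(12, 0.5, 20) -> 1.2, i.e. this task is currently a bottleneck.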

Relevance: 80.00%

Publisher:

Abstract:

The lack of analytical models that can accurately describe large-scale networked systems makes empirical experimentation indispensable for understanding complex behaviors. Research on network testbeds for testing network protocols and distributed services, including physical, emulated, and federated testbeds, has made steady progress. Although the success of these testbeds is undeniable, they fail to provide: 1) scalability, for handling large-scale networks with hundreds or thousands of hosts and routers organized in different scenarios, 2) flexibility, for testing new protocols or applications in diverse settings, and 3) inter-operability, for combining simulated and real network entities in experiments. This dissertation tackles these issues in three different dimensions. First, we present SVEET, a system that enables inter-operability between real and simulated hosts. In order to increase the scalability of networks under study, SVEET enables time-dilated synchronization between real hosts and the discrete-event simulator. Realistic TCP congestion control algorithms are implemented in the simulator to allow seamless interactions between real and simulated hosts. SVEET is validated via extensive experiments and its capabilities are assessed through case studies involving real applications. Second, we present PrimoGENI, a system that allows a distributed discrete-event simulator, running in real-time, to interact with real network entities in a federated environment. PrimoGENI greatly enhances the flexibility of network experiments, through which a great variety of network conditions can be reproduced to examine what-if questions. Furthermore, PrimoGENI performs resource management functions, on behalf of the user, for instantiating network experiments on shared infrastructures. Finally, to further increase the scalability of network testbeds to handle large-scale high-capacity networks, we present a novel symbiotic simulation approach. We present SymbioSim, a testbed for large-scale network experimentation where a high-performance simulation system closely cooperates with an emulation system in a mutually beneficial way. On the one hand, the simulation system benefits from incorporating the traffic metadata from real applications in the emulation system to reproduce the realistic traffic conditions. On the other hand, the emulation system benefits from receiving the continuous updates from the simulation system to calibrate the traffic between real applications. Specific techniques that support the symbiotic approach include: 1) a model downscaling scheme that can significantly reduce the complexity of the large-scale simulation model, resulting in an efficient emulation system for modulating the high-capacity network traffic between real applications; 2) a queuing network model for the downscaled emulation system to accurately represent the network effects of the simulated traffic; and 3) techniques for reducing the synchronization overhead between the simulation and emulation systems.
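
Time-dilated synchronization can be illustrated with a minimal clock wrapper: with a time dilation factor (TDF) of, say, 10, ten wall-clock seconds experienced by a real host correspond to one second of virtual time, giving the discrete-event simulator ten times longer to process each simulated second. The class and method names below are illustrative, not SVEET's API.

    import time

    class DilatedClock:
        # Maps wall-clock time on a real host to virtual (simulated) time.
        def __init__(self, tdf):
            self.tdf = float(tdf)      # time dilation factor, e.g. 10.0
            self.start = time.time()

        def virtual_now(self):
            return (time.time() - self.start) / self.tdf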

Relevance: 80.00%

Publisher:

Abstract:

Financial innovations have emerged globally to close the gap between the rising global demand for infrastructures and the availability of financing sources offered by traditional financing mechanisms such as fuel taxation, tax-exempt bonds, and federal and state funds. The key to sustainable innovative financing mechanisms is effective policymaking. This paper discusses the theoretical framework of a research study whose objective is to structurally and systemically assess financial innovations in global infrastructures. The research aims to create analysis frameworks, taxonomies and constructs, and simulation models pertaining to the dynamics of the innovation process to be used in policy analysis. Structural assessment of innovative financing focuses on the typologies and loci of innovations and evaluates the performance of different types of innovative financing mechanisms. Systemic analysis of innovative financing explores the determinants of the innovation process using the System of Innovation approach. The final deliverables of the research include propositions pertaining to the constituents of System of Innovation for infrastructure finance which include the players, institutions, activities, and networks. These static constructs are used to develop a hybrid Agent-Based/System Dynamics simulation model to derive propositions regarding the emergent dynamics of the system. The initial outcomes of the research study are presented in this paper and include: (a) an archetype for mapping innovative financing mechanisms, (b) a System of Systems-based analysis framework to identify the dimensions of Systems of Innovation analyses, and (c) initial observations regarding the players, institutions, activities, and networks of the System of Innovation in the context of the U.S. transportation infrastructure financing.
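
A highly simplified sketch of what one step of a hybrid Agent-Based/System Dynamics model might look like: a System Dynamics stock of available infrastructure funding is updated by inflows and outflows, while agents (project sponsors) make discrete adoption decisions about an innovative financing mechanism and imitate one another. All names, rules and coefficients are hypothetical and only illustrate the coupling of the two paradigms, not the paper's model.

    def hybrid_step(funding_stock, agents, inflow, dt=1.0):
        # ABM side: count sponsors whose adoption propensity crosses a threshold.
        adopters = sum(1 for a in agents if a["propensity"] > 0.5)
        # SD side: stock-flow update of the funding pool drawn on by adopted projects.
        funding_stock += (inflow - 10.0 * adopters) * dt
        # ABM side: simple imitation dynamics nudging propensities toward the adoption share.
        share = adopters / max(len(agents), 1)
        for a in agents:
            a["propensity"] += 0.05 * (share - a["propensity"])
        return funding_stock, agents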

Relevance: 80.00%

Publisher:

Abstract:

The distribution and mobilization of fluids in a porous medium depend on the capillary, gravity, and viscous forces. In oil fields, enhanced oil recovery processes alter the balance and relative importance of these forces in order to increase the oil recovery factor. In the case of the gas-assisted gravity drainage (GAGD) process, it is important to understand the physical mechanisms that mobilize oil through the interaction of these forces. For this reason, several authors have developed laboratory physical models and core floods of GAGD to study the behaviour of these forces through dimensionless groups, and these models showed conclusive results. However, numerical simulation models have not been used for this type of study. Therefore, the objective of this work is to study the behaviour of the capillary, viscous and gravity forces in the GAGD process and their influence on the oil recovery factor through a 2D numerical simulation model. To analyze the interplay of these forces, dimensionless groups reported in the literature were used, namely the capillary number (Nc), the Bond number (Nb) and the gravity number (Ng), in order to determine the effectiveness of each force relative to the others. The results obtained from the numerical simulation were also compared with results reported in the literature. The results showed that before breakthrough, the lower the injection flow rate, the more oil recovery is increased by the capillary force, while after breakthrough, the higher the injection flow rate, the more oil recovery is increased by the gravity force. A good agreement was found between the results obtained in this research and those published in the literature. The simulation results indicated that before gas breakthrough, higher oil recoveries were obtained at lower Nc and Nb and, after gas breakthrough, higher oil recoveries were obtained at lower Ng. The numerical models are consistent with the results reported in the literature.
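
Common forms of the three dimensionless groups are sketched below; several definitions circulate in the literature (particularly for the gravity number), so these expressions should be read as representative rather than as the exact ones used in this work.

    def capillary_number(viscosity_pa_s, velocity_m_s, ift_n_m):
        # Nc: viscous forces relative to capillary forces.
        return viscosity_pa_s * velocity_m_s / ift_n_m

    def bond_number(delta_rho_kg_m3, permeability_m2, ift_n_m, g=9.81):
        # Nb: gravity forces relative to capillary forces (permeability as the length scale squared).
        return delta_rho_kg_m3 * g * permeability_m2 / ift_n_m

    def gravity_number(delta_rho_kg_m3, permeability_m2, viscosity_pa_s, velocity_m_s, g=9.81):
        # Ng: gravity forces relative to viscous forces (equivalently Nb / Nc).
        return delta_rho_kg_m3 * g * permeability_m2 / (viscosity_pa_s * velocity_m_s)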

Relevance: 80.00%

Publisher:

Abstract:

Over the last 16 years, a segment of independent producers focused on onshore basins and shallow waters has emerged in Brazil. Among the challenges these companies face is the development of fields whose projects have a low net present value (NPV). The objective of this work was to study, using reservoir simulation, the best technical and economic option for developing an oil field in the Brazilian Northeast. Real geology, reservoir and production data were used to build the geological and simulation models. Since no PVT analysis was available, distillation test data known as the true boiling point (TBP) curve were used to create a fluid model that generated the PVT data. After the history match, four development scenarios were simulated: extrapolation of production without new investments, conversion of a producing well to immiscible gas injection, drilling of a vertical well, and drilling of a horizontal well. From the financial point of view, gas injection is the alternative with the lowest added value, but it may become viable if there are environmental or regulatory restrictions on flaring or venting the produced gas into the atmosphere from this field or from neighbouring accumulations. The recovery factors achieved by drilling a vertical and a horizontal well are similar, but the horizontal well is a production acceleration project; therefore, its incremental cumulative production discounted at the company's minimum attractive rate of return is higher. Depending on the Brent crude oil price and the drilling cost, this option can be technically and financially viable.
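
The financial comparison hinges on discounting incremental cash flows at the company's minimum attractive rate of return; a minimal sketch, with made-up numbers purely for illustration, is given below.

    def npv(cash_flows, rate):
        # cash_flows[0] is the year-0 investment (negative); later entries are yearly net cash flows.
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

    # Illustrative only: an acceleration project (horizontal well) front-loads production,
    # so its discounted value can exceed that of a vertical well with a similar recovery factor.
    horizontal = npv([-8e6, 5e6, 4e6, 2e6, 1e6], 0.15)
    vertical = npv([-5e6, 2e6, 2e6, 2e6, 2e6], 0.15)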

Relevance: 80.00%

Publisher:

Abstract:

The shoulder is the most mobile and the most unstable joint of the human body, owing to the small number of bony constraints and to the role of the soft tissues, which give it at least ten degrees of freedom. Shoulder mobility is a performance factor in several sports, but its instability leads to musculoskeletal disorders, among which rotator cuff tears are frequent and the most disabling. Assessment of range of motion is a common index of shoulder function; however, it is often limited to a few planar measurements in which the degrees of freedom vary independently of one another. When used in musculoskeletal simulation models, such values can lead to non-physiological solutions. The objective of this thesis was to develop tools for characterizing the three-dimensional mobility of the shoulder joint, namely: i) to provide a method and its experimental approach for assessing the three-dimensional range of motion of the shoulder, including interactions between degrees of freedom; ii) to propose a representation for interpreting the three-dimensional data obtained; iii) to present normalized ranges of motion; iv) to implement a three-dimensional range of motion within a numerical simulation model in order to generate more realistic optimal sports movements; v) to predict safe ranges of motion; and vi) to predict safe rehabilitation exercises for patients who have undergone rotator cuff repair. i) Sixteen subjects performed series of active maximal-amplitude movements combining the different degrees of freedom of the shoulder. A motion analysis system coupled with a kinematic model of the upper limb was used to estimate the three-dimensional joint kinematics. ii) The set of orientations defined by a sequence of three angles was enclosed in a non-convex polyhedron representing the joint mobility space, taking into account the interactions between degrees of freedom. The combination of elevation and rotation series is recommended to assess the full range of motion of the shoulder. iii) A normalized mobility space was also defined, encompassing the positions reached by at least 50% of the subjects and having the average volume. iv) This average space, defining physiological mobility, was used within a kinematic simulation model to optimize the technique of an acrobatic bar-release element performed by gymnasts. With the customary use of planar joint limits to constrain shoulder mobility, only 17% of the optimal solutions are physiological. In addition to ensuring the realism of the solutions, our three-dimensional joint constraint did not increase the computational cost of the optimization. v) and vi) The sixteen participants also performed series of passive range-of-motion movements and passive rehabilitation exercises. The stress in the rotator cuff muscles during these movements was estimated with a musculoskeletal model reproducing different tear types and sizes. Safe stress thresholds were used to distinguish ranges of motion that do or do not put the integrity of the surgical repair at risk.
Larger tear sizes, as well as tears affecting several muscles, reduced the safe joint mobility space. In particular, glenohumeral elevations below 38° or above 65°, or performed with the arm held in internal rotation, generate excessive stresses for most injury types and sizes during abduction, scaption, or flexion movements. This thesis developed an innovative representation of shoulder mobility that accounts for the interactions between degrees of freedom. With this representation, clinical assessment can be more exhaustive and thus broaden the possibilities for diagnosing shoulder disorders. Movement simulation can now be more realistic. Finally, we showed the importance of personalizing patient rehabilitation in terms of range of motion, since early passive rehabilitation exercises can contribute to re-tearing because of the excessive stress they impose on the tendons.
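
To give a concrete feel for how a recorded mobility space can constrain a simulation, the sketch below tests whether a candidate shoulder orientation (three joint angles) falls inside the cloud of measured orientations. A Delaunay triangulation yields a convex approximation, whereas the thesis uses a non-convex polyhedron, so this is a simplification for illustration only.

    import numpy as np
    from scipy.spatial import Delaunay

    def inside_mobility_space(measured_angles_deg, candidate_deg):
        # measured_angles_deg: (n, 3) array of recorded orientations (e.g. plane of elevation,
        # elevation, axial rotation); candidate_deg: one orientation to test.
        hull = Delaunay(np.asarray(measured_angles_deg, dtype=float))
        return bool(hull.find_simplex(np.asarray(candidate_deg, dtype=float)) >= 0)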