936 results for Microscopic simulation models


Relevance: 90.00%

Variable Speed Limit (VSL) strategies identify and disseminate dynamic speed limits that are determined to be appropriate based on prevailing traffic conditions, road surface conditions, and weather conditions. This dissertation develops and evaluates a shockwave-based VSL system that uses a heuristic switching logic-based controller with specified thresholds of prevailing traffic flow conditions. The system aims to improve operations and mobility at critical bottlenecks. Before a traffic breakdown occurs, the proposed VSL's goal is to prevent or postpone breakdown by decreasing the inflow and achieving a uniform distribution in speed and flow. After breakdown occurs, the VSL system aims to dampen traffic congestion by reducing the inflow to the congested area and increasing the bottleneck capacity by deactivating the VSL at the head of the congested area. The shockwave-based VSL system pushes the VSL location upstream as the congested area propagates upstream. In addition to testing the system using infrastructure detector-based data, this dissertation investigates the use of Connected Vehicle trajectory data as input to the shockwave-based VSL system and its effect on system performance. Since field Connected Vehicle data are not available, as part of this research, Vehicle-to-Infrastructure communication is modeled in the microscopic simulation to obtain individual vehicle trajectories. In this system, a wavelet transform is used to analyze aggregated individual vehicles' speed data to determine the locations of congestion. The currently recommended calibration procedures for simulation models are generally based on capacity, volume and system-performance values and do not specifically examine traffic breakdown characteristics. However, since the proposed VSL strategies are countermeasures to the impacts of breakdown conditions, considering breakdown characteristics in the calibration procedure is important for a reliable assessment.
Several enhancements were proposed in this study to account for the breakdown characteristics at bottleneck locations in the calibration process. In this dissertation, performance of shockwave-based VSL is compared to VSL systems with different fixed VSL message sign locations utilizing the calibrated microscopic model. The results show that shockwave-based VSL outperforms fixed-location VSL systems, and it can considerably decrease the maximum back of queue and duration of breakdown while increasing the average speed during breakdown.
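The congestion-location step described above lends itself to a small illustration. The sketch below applies finest-scale Haar wavelet detail coefficients to a spatial profile of aggregated vehicle speeds; the segment layout, the 15 km/h threshold, and the function name are assumptions for illustration, not the dissertation's actual design.

```python
import numpy as np

def congestion_front(speeds, drop_kmh=15.0):
    """Locate a sharp speed drop along a corridor from aggregated vehicle
    speeds (one value per segment, ordered upstream -> downstream).
    Uses finest-scale, undecimated Haar detail coefficients, i.e. scaled
    differences of neighbouring segments."""
    s = np.asarray(speeds, dtype=float)
    d = (s[:-1] - s[1:]) / np.sqrt(2.0)   # Haar detail coefficients
    i = int(np.argmax(np.abs(d)))         # sharpest transition
    # Report a congestion front only if the transition is large enough.
    return i if np.abs(d[i]) >= drop_kmh / np.sqrt(2.0) else None

# Free flow (~100 km/h) collapsing to ~30 km/h after segment 3:
profile = [100, 98, 101, 99, 32, 30, 29, 31]
print(congestion_front(profile))  # -> 3 (front between segments 3 and 4)
```

In a VSL context, the returned index would mark where the reduced speed limit should be activated, moving upstream as the congested region grows.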

Relevance: 90.00%

Traffic incidents are non-recurring events that can cause a temporary reduction in roadway capacity. They have been recognized as a major contributor to traffic congestion on our national highway systems. To alleviate their impacts on capacity, automatic incident detection (AID) has been applied as an incident management strategy to reduce the total incident duration. AID relies on an algorithm to identify the occurrence of incidents by analyzing real-time traffic data collected from surveillance detectors. Significant research has been performed to develop AID algorithms for incident detection on freeways; however, similar research on major arterial streets remains largely at the initial stage of development and testing. This dissertation research aims to identify design strategies for the deployment of an Artificial Neural Network (ANN) based AID algorithm for major arterial streets. A section of the US-1 corridor in Miami-Dade County, Florida was coded in the CORSIM microscopic simulation model to generate data for both model calibration and validation. To better capture the relationship between the traffic data and the corresponding incident status, Discrete Wavelet Transform (DWT) and data normalization were applied to the simulated data. Multiple ANN models were then developed for different detector configurations, historical data usage, and the selection of traffic flow parameters. To assess the performance of different design alternatives, the model outputs were compared based on both detection rate (DR) and false alarm rate (FAR). The results show that the best models were able to achieve a high DR of between 90% and 95%, a mean time to detect (MTTD) of 55-85 seconds, and a FAR below 4%. The results also show that a detector configuration including only the mid-block and upstream detectors performs almost as well as one that also includes a downstream detector. 
In addition, DWT was found to be able to improve model performance, and the use of historical data from previous time cycles improved the detection rate. Speed was found to have the most significant impact on the detection rate, while volume was found to contribute the least. The results from this research provide useful insights on the design of AID for arterial street applications.
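As a rough illustration of the preprocessing stage described above, the sketch below applies one level of a Haar DWT to a normalized window of detector speeds to form ANN input features. The free-flow bound and window length are invented for the example and do not come from the study.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar DWT: approximation (trend) and
    detail (fluctuation) coefficients."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def features(speeds, free_flow=70.0):
    """Normalize a window of detector speeds, then concatenate
    DWT coefficients as inputs for an ANN classifier."""
    x = np.asarray(speeds, dtype=float) / free_flow
    a, d = haar_dwt(x)
    return np.concatenate([a, d])

# 30-second speed readings (mph); the sudden drop mimics an incident
window = [45, 44, 46, 43, 20, 18, 19, 21]
print(features(window).round(2))
```

The approximation coefficients capture the level shift caused by an incident, while the detail coefficients capture short-term fluctuation, which is the kind of separation the DWT provides to the classifier.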

Relevance: 90.00%

The problem of social diffusion has animated sociological thinking on topics ranging from the spread of an idea, an innovation or a disease, to the foundations of collective behavior and political polarization. While network diffusion has been a productive metaphor, the reality of diffusion processes is often muddier. Ideas and innovations diffuse differently from diseases, but, with a few exceptions, the diffusion of ideas and innovations has been modeled under the same assumptions as the diffusion of disease. In this dissertation, I develop two new diffusion models for "socially meaningful" contagions that address two of the most significant problems with current diffusion models: (1) that contagions can only spread along observed ties, and (2) that contagions do not change as they spread between people. I augment insights from these statistical and simulation models with an analysis of an empirical case of diffusion - the use of enterprise collaboration software in a large technology company. I focus the empirical study on when people abandon innovations, a crucial and understudied aspect of the diffusion of innovations. Using timestamped posts, I analyze in fine detail when people abandon the software.

To address the first problem, I suggest a latent space diffusion model. Rather than treating ties as stable conduits for information, the latent space model treats ties as random draws from an underlying social space and simulates diffusion over that space. Theoretically, the model integrates actor ties and attributes simultaneously in a single social plane; practically, it produces statistically consistent diffusion estimates where using the network alone does not. To address the second problem, I suggest a diffusion model with schemas. Rather than treating information as though it spreads unchanged, the schema model allows people to modify the information they receive to fit an underlying mental model before they pass it to others. Theoretically, incorporating schemas gives an explicit form to the reciprocal influences that cognition and social environment have on each other; practically, the schema model shows that introducing cognitive processing into diffusion changes the rate and ultimate distribution of the spreading information. Combining the latent space models with a schema notion for actors thus improves our models of social diffusion both theoretically and practically.
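A minimal sketch of the latent-space idea: actors occupy positions in an unobserved social space, ties are random draws whose probability decays with distance, and the contagion spreads over the space rather than over any single observed network. The positions, decay function, infection rule, and all constants here are illustrative assumptions, not the dissertation's specification.

```python
import numpy as np

rng = np.random.default_rng(42)

N = 150
positions = rng.normal(size=(N, 2))   # latent social positions

def tie_prob(i, j):
    """Ties are draws from the latent space: the closer two actors
    sit, the more likely a tie between them."""
    return np.exp(-np.linalg.norm(positions[i] - positions[j]))

def spread(seed=0, beta=0.08, steps=8):
    """Contagion simulated over the space itself: each step, every
    susceptible actor is exposed to every adopter with a chance
    proportional to their tie probability."""
    adopted = np.zeros(N, dtype=bool)
    adopted[seed] = True
    for _ in range(steps):
        for i in np.flatnonzero(~adopted):
            p_miss = np.prod([1.0 - beta * tie_prob(i, j)
                              for j in np.flatnonzero(adopted)])
            if rng.random() < 1.0 - p_miss:
                adopted[i] = True
    return int(adopted.sum())

print(spread())  # number of adopters after 8 steps
```

Because diffusion runs over the space, an unobserved tie between two nearby actors can still transmit the contagion, which is precisely what a fixed observed network rules out.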

The empirical case study focuses on how the changing value of an innovation, introduced by the innovation's network externalities, influences when people abandon the innovation. In it, I find that people are least likely to abandon an innovation when other people in their neighborhood currently use the software as well. The effect is particularly pronounced for supervisors' current use and the number of supervisory team members who currently use the software. This case study not only points to an important process in the diffusion of innovation, but also suggests a new approach -- computerized collaboration systems -- to collecting and analyzing data on organizational processes.

Relevance: 90.00%

Intelligent agents offer a new and exciting way of understanding the world of work. Agent-Based Simulation (ABS), one way of using intelligent agents, carries great potential for progressing our understanding of management practices and how they link to retail performance. We have developed simulation models based on research by a multi-disciplinary team of economists, work psychologists and computer scientists, and we discuss our experiences of implementing these concepts in work with a well-known retail department store. There is no doubt that management practices are linked to the performance of an organisation (Reynolds et al., 2005; Wall & Wood, 2005). Best practices have been developed, but when it comes to the actual application of these guidelines, considerable ambiguity remains regarding their effectiveness within particular contexts (Siebers et al., forthcoming a). Most Operational Research (OR) methods can only be used as analysis tools once management practices have been implemented, and they are often not very useful for answering speculative ‘what-if’ questions, particularly when one is interested in the development of the system over time rather than just the state of the system at a certain point in time. Simulation can be used to analyse the operation of dynamic and stochastic systems. ABS is particularly useful when complex interactions between system entities exist, such as autonomous decision making or negotiation. In an ABS model the researcher explicitly describes the decision process of simulated actors at the micro level. Structures emerge at the macro level as a result of the actions of the agents and their interactions with other agents and the environment. We show how ABS experiments can deal with testing and optimising management practices such as training, empowerment or teamwork. Hence, questions such as “will staff setting their own break times improve performance?” can be investigated.
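As a toy illustration of the micro-level decision rules such a model encodes, the agent below chooses its own break time based on queue length, and a macro-level service capacity emerges from the individual choices. The rule, class names, and constants are invented for the sketch, not taken from the department-store model.

```python
import random

class StaffAgent:
    """Minimal retail staff agent. The decision rule and constants
    are illustrative assumptions, not the actual model's parameters."""
    def __init__(self, empowered):
        self.empowered = empowered      # may choose own break time
        self.on_break = False

    def step(self, hour, queue_len):
        if self.empowered:
            # Empowered staff defer breaks while customers are queuing.
            self.on_break = queue_len < 3 and random.random() < 0.2
        else:
            self.on_break = (hour == 12)  # fixed lunch hour for all

def service_capacity(agents, hour, queue_len):
    """Macro-level outcome emerging from individual decisions."""
    for a in agents:
        a.step(hour, queue_len)
    return sum(not a.on_break for a in agents)

staff = [StaffAgent(empowered=False) for _ in range(5)]
print(service_capacity(staff, hour=12, queue_len=8))  # -> 0: all at lunch
```

Running the two policies over a simulated day and comparing queue lengths is the shape of the "what-if" experiment described above.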

Relevance: 90.00%

In this paper, we investigate output accuracy for a Discrete Event Simulation (DES) model and an Agent Based Simulation (ABS) model. The purpose of this investigation is to find out which of these simulation techniques is better for modelling human reactive behaviour in the retail sector. In order to study the output accuracy of both models, we carried out a validation experiment in which we compared the results from our simulation models to the performance of a real system. Our experiment used a large UK department store as a case study, and the task was to determine an efficient implementation of a management policy for the store’s fitting room using DES and ABS. Overall, we found that both simulation models were a good representation of the real system when modelling human reactive behaviour.

Relevance: 90.00%

This research investigated the behaviour of traditional discrete event simulation models and combined discrete event and agent based simulation models when modelling human reactive and proactive behaviour in human centric complex systems. A department store was chosen as the human centric complex case study, where the operation of a fitting room in the WomensWear department was investigated. We looked at ways to determine the efficiency of new management policies for the fitting room operation by simulating the reactive and proactive behaviour of staff towards customers. Once the simulation models had been developed and verified, we carried out a validation experiment in the form of a sensitivity analysis. Subsequently, we performed a statistical analysis in which the mixed reactive and proactive behaviour experimental results were compared with reactive behaviour results from previously published work. Overall, this case study found that simple proactive individual behaviour could be modelled in both types of simulation model. In addition, we found that the traditional discrete event model produced simulation output similar to that of the combined discrete event and agent based simulation when modelling similar human behaviour.

Relevance: 80.00%

In recent years, maize has become one of the main alternative crops for the autumn-winter growing season in the central-western and southeastern regions of Brazil. However, water deficits, sub-optimal temperatures and low solar radiation levels are common problems experienced during this growing season by local farmers. One methodology to assess the impact of variable weather conditions on crop production is the use of crop simulation models. The goal of this study was to evaluate the effect of climate variability on maize yield in a subtropical region of Brazil. Specific objectives were (1) to analyse the effect of the El Nino Southern Oscillation (ENSO) on precipitation and air temperature for four locations in the state of Sao Paulo and (2) to analyse the impact of ENSO on maize grown off-season at the same four locations using a crop simulation model. For each site, historical weather data were categorised as belonging to one of three phases of ENSO: El Nino (warm sea surface temperature anomalies in the Pacific), La Nina (cool sea surface temperature anomalies) or neutral, based on an index derived from observed sea surface temperature anomalies. During El Nino years, rainfall tends to increase during May at all four locations, and during April at three of them, resulting in an increase in simulated yield for maize planted between 15 February and 15 March. In general, there was a decrease in simulated yield for maize grown off-season during neutral years. This study showed how a crop model can be used to assess the impact of climate variability on the yield of maize grown off-season in a subtropical region of Brazil. The outcomes can be useful to both policy makers and local farmers for agricultural planning and decision making. Copyright © 2009 Royal Meteorological Society

Relevance: 80.00%

Warm-season grasses are economically important for cattle production in tropical regions, and tools to aid in the management and research of these forages would be highly beneficial. Crop simulation models synthesize numerous physiological processes and are important research tools for evaluating the production of warm-season grasses. This research was conducted to adapt the perennial CROPGRO Forage model to simulate growth of the tropical species palisadegrass [Brachiaria brizantha (A. Rich.) Stapf. cv. Xaraes] and to describe the model adaptation for this species. To develop the CROPGRO parameters for this species, we began with values and relationships reported in the literature. Some parameters and relationships were then calibrated by comparison with observed growth, development, dry matter accumulation and partitioning during a 2-year experiment with Xaraes palisadegrass in Piracicaba, SP, Brazil. Starting from the parameters for the bahiagrass (Paspalum notatum Flugge) perennial forage model, dormancy effects had to be minimized, partitioning to storage tissue and root decreased, and partitioning to leaf and stem increased to provide for more leaf and stem growth and less root. Parameters affecting specific leaf area (SLA) and senescence of plant tissues were also improved. After these changes, biomass accumulation was better simulated: mean predicted herbage yield per cycle was 3573 kg ha⁻¹, with an RMSE of 538 kg DM ha⁻¹ (D-Stat = 0.838, simulated/observed ratio = 1.028). The results of the adaptation suggest that the CROPGRO model is an efficient tool for integrating physiological aspects of palisadegrass and can be used to simulate its growth. © 2010 Elsevier B.V. All rights reserved.
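The fit statistics quoted above (RMSE and the "D-Stat", Willmott's index of agreement) can be computed mechanically from paired simulated and observed values. A small sketch follows; the sample yields are made up for illustration and are not the paper's data.

```python
import numpy as np

def rmse(sim, obs):
    """Root mean square error between simulated and observed values."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def d_stat(sim, obs):
    """Willmott's index of agreement (0 to 1; 1 = perfect fit)."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    om = obs.mean()
    return float(1.0 - np.sum((sim - obs) ** 2)
                 / np.sum((np.abs(sim - om) + np.abs(obs - om)) ** 2))

# Made-up per-cycle herbage yields (kg DM/ha), not the paper's data:
obs = [3100, 3600, 4000, 3300, 3800]
sim = [3350, 3500, 4200, 3200, 3700]
print(round(rmse(sim, obs)), round(d_stat(sim, obs), 3))
```

A D-Stat near 1 together with a simulated/observed ratio near 1 indicates both good agreement and negligible overall bias, which is how the calibration result above should be read.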

Relevance: 80.00%

The ability to predict leaf area and leaf area index is crucial in crop simulation models that predict crop growth and yield. Previous studies have shown existing methods of predicting leaf area to be inadequate when applied to a broad range of cultivars with different numbers of leaves. The objectives of the study were to (i) develop generalised methods of modelling individual and total plant leaf area, and leaf senescence, that do not require constants that are specific to environments and/or genotypes, (ii) re-examine the base, optimum, and maximum temperatures for calculation of thermal time for leaf senescence, and (iii) assess the method of calculation of individual leaf area from leaf length and leaf width in experimental work. Five cultivars of maize differing widely in maturity and adaptation were planted in October 1994 in south-eastern Queensland, and grown under non-limiting conditions of water and plant nutrient supplies. Additional data for maize plants with low total leaf number (12-17) grown at Katumani Research Centre, Kenya, were included to extend the range in the total leaf number per plant. The equation for the modified (slightly skewed) bell curve could be generalised for modelling individual leaf area, as all coefficients in it were related to total leaf number. Use of coefficients for individual genotypes can be avoided, and individual and total plant leaf area can be calculated from total leaf number. A single, logistic equation, relying on maximum plant leaf area and thermal time from emergence, was developed to predict leaf senescence. The base, optimum, and maximum temperatures for calculation of thermal time for leaf senescence were 8, 34, and 40 degrees C, and apply for the whole crop-cycle when used in modelling of leaf senescence. Thus, the modelling of leaf production and senescence is simplified, improved, and generalised. Consequently, the modelling of leaf area index (LAI) and variables that rely on LAI will be improved. 
For experimental purposes, we found that the calculation of leaf area from leaf length and leaf width remains appropriate, though the relationship differed slightly from previously published equations.
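A hedged sketch of the two relationships described above: a slightly skewed bell curve for individual leaf area driven by total leaf number, and a logistic senescence curve in thermal time. All constants below are placeholders; in the generalised model the curve's coefficients are themselves functions of total leaf number, and the fitted values are not reproduced here.

```python
import numpy as np

def leaf_area(i, n_leaves, area_max=550.0, a=-0.03, b=0.0008):
    """Modified (slightly skewed) bell curve for the area of leaf i
    (cm^2). area_max, a, b and the largest-leaf position are
    placeholder values, not the study's fitted coefficients."""
    x = i - 0.62 * n_leaves          # distance from the largest leaf
    return area_max * np.exp(a * x ** 2 + b * x ** 3)

def plant_leaf_area(n_leaves):
    """Total plant leaf area computed from total leaf number alone."""
    return sum(leaf_area(i, n_leaves) for i in range(1, n_leaves + 1))

def frac_senesced(tt, tt_half=800.0, k=0.01):
    """Logistic fraction of maximum plant leaf area senesced, as a
    function of thermal time from emergence (placeholder constants)."""
    return 1.0 / (1.0 + np.exp(-k * (tt - tt_half)))

total = plant_leaf_area(18)          # e.g. a cultivar with 18 leaves
```

Green LAI at any time would then be maximum plant leaf area times plant density, reduced by the senesced fraction, which is how these two curves combine in a crop model.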

Relevance: 80.00%

Hemichordates were traditionally allied to the chordates, but recent molecular analyses have suggested that hemichordates are a sister group to the echinoderms, a relationship that has important consequences for the interpretation of the evolution of deuterostome body plans. However, the molecular phylogenetic analyses to date have not provided robust support for the hemichordate + echinoderm clade. We use a maximum likelihood framework, including the parametric bootstrap, to reanalyze DNA data from complete mitochondrial genomes and nuclear 18S rRNA. This approach provides the first statistically significant support for the hemichordate + echinoderm clade from molecular data. This grouping implies that the ancestral deuterostome had features that included an adult with a pharynx and a dorsal nerve cord and an indirectly developing dipleurula-like larva.

Relevance: 80.00%

Urbanization and the ability to manage for a sustainable future present numerous challenges for geographers and planners in metropolitan regions. Remotely sensed data are inherently suited to provide information on urban land cover characteristics, and their change over time, at various spatial and temporal scales. Data models for establishing the range of urban land cover types and their biophysical composition (vegetation, soil, and impervious surfaces) are integrated to provide a hierarchical approach to classifying land cover within urban environments. These data also provide an essential component for current simulation models of urban growth patterns, as both calibration and validation data. The first stages of the approach have been applied to examine urban growth between 1988 and 1995 for a rapidly developing area in southeast Queensland, Australia. Landsat Thematic Mapper image data provided accurate (83% adjusted overall accuracy) classification of broad land cover types and their change over time. The combination of commonly available remotely sensed data, image processing methods, and emerging urban growth models highlights an important application for current and next generation moderate spatial resolution image data in studies of urban environments.

Relevance: 80.00%

Using peanuts as an example, a generic methodology is presented to forward-estimate regional crop production and associated climatic risks based on phases of the Southern Oscillation Index (SOI). Yield fluctuations caused by a highly variable rainfall environment are of concern to peanut processing and marketing bodies. The industry could profitably use forecasts of likely production to adjust their operations strategically. Significant, physically based lag-relationships exist between an index of the ocean-atmosphere El Nino/Southern Oscillation phenomenon and future rainfall in Australia and elsewhere. Combining knowledge of SOI phases in November and December with output from a dynamic simulation model allows the derivation of yield probability distributions based on historic rainfall data. This information is available shortly after planting a crop and at least 3-5 months prior to harvest. The study shows that in years when the November-December SOI phase is positive there is an 80% chance of exceeding average district yields. Conversely, in years when the November-December SOI phase is either negative or rapidly falling there is only a 5% chance of exceeding average district yields, but a 95% chance of below average yields. This information allows the industry to adjust strategically for the expected volume of production. The study shows that simulation models can enhance SOI signals contained in rainfall distributions by discriminating between useful and damaging rainfall events. The methodology can be applied to other industries and regions.
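A minimal sketch of the conditional-probability step: group yields by the November-December SOI phase of their season and estimate the chance of exceeding the long-term average. The yields and phase labels below are invented for illustration; real inputs would be simulated yields driven by historic rainfall records.

```python
import numpy as np

# Invented (yield t/ha, Nov-Dec SOI phase) pairs, one per season:
records = [(3.1, "positive"), (2.0, "negative"), (2.8, "positive"),
           (1.9, "falling"),  (3.3, "positive"), (2.1, "negative"),
           (1.7, "falling"),  (2.9, "positive"), (2.2, "negative")]

average = np.mean([y for y, _ in records])

def p_exceed_average(phase):
    """P(yield > long-term average | SOI phase at planting)."""
    sub = np.array([y for y, p in records if p == phase])
    return float((sub > average).mean())

print(p_exceed_average("positive"), p_exceed_average("falling"))
```

With enough seasons, the same grouping yields the full conditional probability distribution of production, which is the forecast product the industry would use.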

Relevance: 80.00%

Crop modelling has evolved over the last 30 or so years in concert with advances in crop physiology, crop ecology and computing technology. Having reached a respectable degree of acceptance, it is appropriate to review briefly the course of developments in crop modelling and to project what might be major contributions of crop modelling in the future. Two major opportunities are envisioned for increased modelling activity in the future. One opportunity is in a continuing central, heuristic role to support scientific investigation, to facilitate decision making by crop managers, and to aid in education. Heuristic activities will also extend to the broader system-level issues of environmental and ecological aspects of crop production. The second opportunity is projected as a prime contributor in understanding and advancing the genetic regulation of plant performance and plant improvement. Physiological dissection and modelling of traits provides an avenue by which crop modelling could contribute to enhancing integration of molecular genetic technologies in crop improvement. Crown Copyright (C) 2002 Published by Elsevier Science B.V. All rights reserved.

Relevance: 80.00%

A new thermodynamic approach has been developed in this paper to analyze adsorption in slitlike pores. The equilibrium is described by two thermodynamic conditions: the Helmholtz free energy must be minimal, and the grand potential functional at that minimum must be negative. This approach has led to local isotherms that describe adsorption in the form of a single layer or two layers near the pore walls. In narrow pores, local isotherms have one step that can be either very sharp but continuous, or discontinuous and bench-like for a definite range of pore widths. The latter reflects a so-called 0 → 1 monolayer transition. In relatively wide pores, local isotherms have two steps: the first corresponds to the appearance of two layers near the pore walls, while the second corresponds to the filling of the space between these layers. All features of the local isotherms are in agreement with the results obtained from density functional theory and Monte Carlo simulations. The approach is used for determining pore size distributions of carbon materials. We illustrate this with benzene adsorption data on activated carbon at 20, 50, and 80 °C, argon adsorption on activated carbon Norit ROX at 87.3 K, and nitrogen adsorption on activated carbon Norit R1 at 77.3 K.