8 results for Traffic Flow Modeling
in DRUM (Digital Repository at the University of Maryland)
Abstract:
Traffic demand increases are pushing aging ground transportation infrastructures to their theoretical capacity. The result of this demand is traffic bottlenecks that are a major cause of delay on urban freeways. In addition, the queues associated with those bottlenecks increase the probability of a crash while adversely affecting environmental measures such as emissions and fuel consumption. With limited resources available for network expansion, traffic professionals have developed active traffic management systems (ATMS) in an attempt to mitigate the negative consequences of traffic bottlenecks. Among these ATMS strategies, variable speed limits (VSL) and ramp metering (RM) have been gaining international interest for their potential to improve safety, mobility, and environmental measures at freeway bottlenecks. Though previous studies have shown the tremendous potential of variable speed limit (VSL) and VSL paired with ramp metering (VSLRM) control, little guidance has been developed to assist decision makers in the planning phase of a congestion mitigation project that is considering VSL or VSLRM control. To address this need, this study has developed a comprehensive decision/deployment support tool for the application of VSL and VSLRM control in recurrently congested environments. The decision tool will assist practitioners in deciding which control strategy is most appropriate at a candidate site, which candidate sites have the most potential to benefit from the suggested control strategy, and how to most effectively design the field deployment of the suggested control strategy at each implementation site. To do so, the tool comprises three key modules: (1) a Decision Module, (2) a Benefits Module, and (3) a Deployment Guidelines Module. Each module uses commonly known traffic flow and geometric parameters as inputs to statistical models and empirically based procedures to provide guidance on the application of VSL and VSLRM at each candidate site.
These models and procedures were developed from the outputs of simulated experiments calibrated with field data. To demonstrate the application of the tool, a list of real-world candidate sites was selected from the Maryland State Highway Administration Mobility Report. Field data from each candidate site were then input into the tool to illustrate the step-by-step process required for efficient planning of VSL or VSLRM control. The output of the tool includes the suggested control system at each site, a ranking of the sites based on the expected benefit-to-cost ratio, and guidelines on how to deploy the VSL signs, ramp meters, and detectors at the deployment site(s). This research has the potential to assist traffic engineers in the planning of VSL and VSLRM control, thus enhancing the procedure for allocating limited resources for mobility and safety improvements on highways plagued by recurrent congestion.
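The site-ranking step of the tool can be pictured as sorting candidate sites by their expected benefit-to-cost ratio. The sketch below is illustrative only: the site names, benefit estimates, and costs are invented placeholders, not data or outputs from the study.

```python
# Hypothetical sketch of the tool's ranking step: order candidate sites by
# expected benefit-to-cost (B/C) ratio. All names and figures are invented.

def rank_sites(sites):
    """Return sites sorted by descending benefit-to-cost ratio."""
    return sorted(sites, key=lambda s: s["benefit"] / s["cost"], reverse=True)

candidates = [
    {"name": "I-95 NB @ MD-212", "benefit": 4.2e6, "cost": 1.5e6},   # B/C = 2.80
    {"name": "I-495 IL @ US-29", "benefit": 2.8e6, "cost": 0.9e6},   # B/C = 3.11
    {"name": "I-270 SB @ MD-28", "benefit": 1.1e6, "cost": 0.8e6},   # B/C = 1.38
]

for rank, site in enumerate(rank_sites(candidates), start=1):
    print(f"{rank}. {site['name']}  B/C = {site['benefit'] / site['cost']:.2f}")
```

In a real application the benefit term would come from the Benefits Module's statistical models rather than fixed numbers.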
Abstract:
Authentication plays an important role in how we interact with computers, mobile devices, the web, etc. The idea of authentication is to uniquely identify a user before granting access to system privileges. For example, in recent years more corporate information and applications have been accessible via the Internet and Intranet. Many employees are working from remote locations and need access to secure corporate files. During this time, it is possible for malicious or unauthorized users to gain access to the system. For this reason, it is logical to have some mechanism in place to detect whether the logged-in user is still the user in control of the session. Therefore, highly secure authentication methods must be used. We posit that each of us is unique in our use of computer systems. It is this uniqueness that is leveraged to "continuously authenticate users" while they use web software. To monitor user behavior, n-gram models are used to capture user interactions with web-based software. This statistical language model captures the sequences and sub-sequences of user actions, their orderings, and the temporal relationships that make each user unique, providing a model of how each user typically behaves. Users are then continuously monitored during software operations. Large deviations from "normal behavior" can possibly indicate malicious or unintended behavior. This approach is implemented in a system called Intruder Detector (ID) that models user actions as embodied in web logs generated in response to a user's actions. User identification through web logs is cost-effective and non-intrusive. We perform experiments on a large fielded system with web logs of approximately 4000 users. For these experiments, we use two classification techniques: binary and multi-class classification. We evaluate model-specific differences in user behavior based on coarse-grain (i.e., role) and fine-grain (i.e., individual) analysis.
A specific set of metrics is used to provide valuable insight into how each model performs. Intruder Detector achieves accurate results when identifying legitimate users and user types. The tool is also able to detect outliers in role-based user behavior with optimal performance. In addition to web applications, this continuous monitoring technique can be used with other user-based systems, such as mobile devices and the analysis of network traffic.
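The core n-gram idea can be sketched in a few lines: train on a user's historical action sequence, then score new sessions by how many of their n-grams were never seen in training. This is a minimal illustration of the general technique, not the Intruder Detector implementation; the action names and the notion of "score" are assumptions for demonstration.

```python
# Minimal n-gram sketch of continuous authentication (illustrative, not ID's code).
from collections import Counter

def ngrams(actions, n=2):
    """Yield overlapping n-grams from a sequence of user actions."""
    return zip(*(actions[i:] for i in range(n)))

def train(actions, n=2):
    """Count the n-grams in a user's historical action sequence."""
    return Counter(ngrams(actions, n))

def anomaly_score(model, actions, n=2):
    """Fraction of n-grams in a new session that were unseen during training."""
    grams = list(ngrams(actions, n))
    unseen = sum(1 for g in grams if g not in model)
    return unseen / len(grams)

history = ["login", "search", "open", "edit", "save", "search", "open", "logout"]
model = train(history)

normal  = ["login", "search", "open", "edit", "save", "logout"]
suspect = ["login", "export", "export", "export", "delete", "logout"]
print(anomaly_score(model, normal))   # low: session resembles past behavior
print(anomaly_score(model, suspect))  # high: possible intruder or role change
```

A deployed system would compare such scores against a calibrated threshold per user or per role before raising an alert.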
Abstract:
Two-phase flow heat exchangers have been shown to have very high efficiencies, but the lack of a dependable model and data precludes them from use in many cases. Herein, a new method for the measurement of local convective heat transfer coefficients from the outside of a heat-transferring wall has been developed, which results in accurate local measurements of heat flux during two-phase flow. This novel technique uses a chevron-pattern corrugated plate heat exchanger (PHE) consisting of a specially machined calcium fluoride plate and the refrigerant HFE7100, with heat flux values up to 1 W cm-2 and flow rates up to 300 kg m-2 s-1. As calcium fluoride is largely transparent to infrared radiation, the measurement of the surface temperature of the PHE that is in direct contact with the liquid is accomplished through use of a mid-range (3.0-5.1 µm) infrared camera. The objective of this study is to develop, validate, and use a unique infrared thermometry method to quantify the heat transfer characteristics of flow boiling within different plate heat exchanger geometries. This new method allows high spatial and temporal resolution measurements. Furthermore, quasi-local pressure measurements enable us to characterize the performance of each geometry. Validation of this technique is demonstrated by comparison to accepted single- and two-phase data. The results can be used to develop new heat transfer correlations and optimization tools for heat exchanger designers. The scientific contribution of this thesis is to give PHE developers further tools to identify the heat transfer and pressure drop performance of any corrugated plate pattern directly, without the need to account for typical error sources due to inlet and outlet distribution systems. Furthermore, designers will now gain information on the local heat transfer distribution within one plate heat exchanger cell, which will help them choose the correct corrugation geometry for a given task.
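The data-reduction step behind such a measurement is the defining relation h = q'' / (T_wall - T_sat): the IR camera supplies the local wall temperature, and together with the applied heat flux and the refrigerant's saturation temperature this yields a local heat transfer coefficient. The sketch below shows only that relation; the numeric values are illustrative, not measured data from the study.

```python
# Hedged sketch of the reduction step: local convective heat transfer
# coefficient from heat flux and an IR-measured wall temperature.
# All numbers are illustrative assumptions, not experimental data.

def local_htc(q_flux_w_cm2, t_wall_c, t_sat_c):
    """h = q'' / (T_wall - T_sat), returned in W/(m^2 K)."""
    q_w_m2 = q_flux_w_cm2 * 1e4          # convert W/cm^2 -> W/m^2
    return q_w_m2 / (t_wall_c - t_sat_c)

# Example: 1 W/cm^2 heat flux, wall imaged at 66 C, saturation at 61 C
h = local_htc(1.0, 66.0, 61.0)
print(f"h = {h:.0f} W/(m^2 K)")  # prints h = 2000 W/(m^2 K)
```

The value of the high-resolution IR map is that this relation can be evaluated per pixel and per frame, giving a spatially and temporally resolved h field rather than a single averaged coefficient.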
Experimental Modeling of Twin-Screw Extrusion Processes to Predict Properties of Extruded Composites
Abstract:
Twin-screw extrusion is used to compound fillers into a polymer matrix in order to improve the properties of the final product. The resultant properties of the composite are determined by the operating conditions used during extrusion processing. Changes in the operating conditions affect the physics of the melt flow, inducing unique composite properties. In the following work, the Residence Stress Distribution methodology has been applied to model both the stress behavior and the property response of a twin-screw compounding process as a function of the operating conditions. The compounding of a pigment into a polymer melt has been investigated to determine the effect of stress on the degree of mixing, which in turn affects the properties of the composite. In addition, the pharmaceutical properties resulting from the compounding of an active pharmaceutical ingredient are modeled as a function of the operating conditions, identifying the physical behavior that induces the property responses.
Abstract:
Fatigue damage in the connections of single mast arm signal support structures is a primary safety concern because collapse could result from fatigue-induced cracking. This type of cantilever signal support structure typically has very light damping, and excessively large wind-induced vibrations have been observed. Major changes related to fatigue design were made in the 2001 AASHTO LRFD Specification for Structural Supports for Highway Signs, Luminaires, and Traffic Signals, and supplemental damping devices have been shown to be promising in reducing the vibration response and thus the fatigue load demand on mast arm signal support structures. The primary objective of this study is to investigate the effectiveness and optimal use of one type of damping device, the tuned mass damper (TMD), in vibration response mitigation. Three prototype single mast arm signal support structures with 50-ft, 60-ft, and 70-ft arms, respectively, are selected for this numerical simulation study. In order to validate the finite element models for the subsequent simulation study, analytical modeling of the static deflection response of the mast arms was performed and found to be close to the numerical results from the beam-element-based finite element model. A 3-DOF dynamic model was then built using the analytically derived stiffness matrix for modal analysis and time history analysis. The free vibration response and forced (harmonic) vibration response of the mast arm structures from the 3-DOF dynamic model are observed to be in good agreement with the finite element analysis results. Furthermore, experimental results from a recent free vibration test of a full-scale 50-ft mast arm specimen in the lab are used to verify the prototype structure's fundamental frequency and viscous damping ratio.
After validating the finite element models, a series of parametric studies was conducted to examine trends and determine the optimal use of a tuned mass damper on the prototype single mast arm signal support structures by varying the following parameters: mass, frequency, viscous damping ratio, and location of the TMD. The numerical simulation results reveal that the two parameters that most influence the vibration mitigation effectiveness of a TMD on single mast arm signal pole structures are the TMD frequency and its viscous damping ratio.
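The finding that TMD frequency and damping ratio dominate echoes classical TMD tuning theory. As context, the sketch below evaluates Den Hartog's textbook tuning rules for an undamped primary structure; these are generic formulas, not necessarily the optima identified by this study's parametric simulations, and the 2% mass ratio is an illustrative assumption.

```python
# Classic Den Hartog TMD tuning rules (textbook formulas for an undamped
# primary structure; not necessarily the values found in this study).
import math

def den_hartog_tmd(mass_ratio):
    """Optimal frequency ratio and TMD damping ratio for mu = m_tmd / m_modal."""
    mu = mass_ratio
    freq_ratio = 1.0 / (1.0 + mu)                            # f_tmd / f_structure
    damping = math.sqrt(3.0 * mu / (8.0 * (1.0 + mu) ** 3))  # optimal TMD zeta
    return freq_ratio, damping

# Example: a TMD sized at 2% of the mast arm's modal mass (assumed figure)
f_opt, zeta_opt = den_hartog_tmd(0.02)
print(f"tune TMD to {f_opt:.3f} x structure frequency, damping ratio {zeta_opt:.3f}")
```

Both formulas depend only on the mass ratio, which is consistent with the observation that, once the mass is fixed, frequency and damping become the decisive design variables.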
Abstract:
A primary goal of this dissertation is to understand the links between mathematical models that describe crystal surfaces at three fundamental length scales: the scale of individual atoms, the scale of collections of atoms forming crystal defects, and the macroscopic scale. Characterizing connections between different classes of models is a critical task for gaining insight into the physics they describe, a long-standing objective in applied analysis, and also highly relevant in engineering applications. The key concept I use in each problem addressed in this thesis is coarse graining, which is a strategy for connecting fine representations or models with coarser representations. Often this idea is invoked to reduce a large discrete system to an appropriate continuum description, e.g., individual particles are represented by a continuous density. While there is no general theory of coarse graining, one closely related mathematical approach is asymptotic analysis, i.e., the description of limiting behavior as some parameter becomes very large or very small. In the case of crystalline solids, it is natural to consider cases where the number of particles is large or where the lattice spacing is small. Limits such as these often make explicit the nature of links between models capturing different scales, and, once established, provide a means of improving our understanding, or the models themselves. Finding appropriate variables whose limits illustrate the important connections between models is no easy task, however. This is one area where computer simulation is extremely helpful, as it allows us to see the results of complex dynamics and gather clues regarding the roles of different physical quantities. On the other hand, connections between models enable the development of novel multiscale computational schemes, so understanding can assist computation and vice versa. Some of these ideas are demonstrated in this thesis.
The important outcomes of this thesis include: (1) a systematic derivation of the step-flow model of Burton, Cabrera, and Frank, with corrections, from an atomistic solid-on-solid-type model in 1+1 dimensions; (2) the inclusion of an atomistically motivated transport mechanism in an island dynamics model, allowing for a more detailed account of mound evolution; and (3) the development of a hybrid discrete-continuum scheme for simulating the relaxation of a faceted crystal mound. Central to all of these modeling and simulation efforts is the presence of steps composed of individual layers of atoms on vicinal crystal surfaces. Consequently, a recurring theme in this research is the observation that mesoscale defects play a crucial role in crystal morphological evolution.
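The solid-on-solid picture underlying outcome (1) can be illustrated with a toy 1+1-dimensional simulation: the surface is an array of integer column heights, atoms deposit at random sites, and a crude downhill relaxation mimics surface diffusion. This is a generic SOS sketch for intuition only, not the specific atomistic model or dynamics derived in the thesis.

```python
# Toy 1+1D solid-on-solid deposition sketch (generic, not the thesis's model).
import random

def sos_deposit(heights, steps, rng):
    """Deposit `steps` atoms; each lands at a random site and relaxes to the
    lowest of that site and its two neighbors (periodic boundaries)."""
    n = len(heights)
    for _ in range(steps):
        i = rng.randrange(n)
        left, right = (i - 1) % n, (i + 1) % n
        j = min((i, left, right), key=lambda k: heights[k])  # downhill relaxation
        heights[j] += 1
    return heights

rng = random.Random(0)
surface = sos_deposit([0] * 50, steps=500, rng=rng)
print("mean height:", sum(surface) / len(surface))  # 10.0 (deposited mass conserved)
```

Even this crude rule exhibits the competition between deposition noise and smoothing that, in far more careful limits, connects atomistic models to step-flow and continuum descriptions.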
Abstract:
The service of a critical infrastructure, such as a municipal wastewater treatment plant (MWWTP), is taken for granted until a flood or another low-frequency, high-consequence crisis brings its fragility to attention. The unique aspects of the MWWTP call for a method to quantify the flood stage-duration-frequency relationship. By developing a bivariate joint distribution model of flood stage and duration, this study adds a second dimension, time, into flood risk studies. A new parameter, inter-event time, is developed to further illustrate the effect of event separation on the frequency assessment. The method is tested on riverine, estuarine, and tidal sites in the Mid-Atlantic region. Equipment damage functions are characterized by linear and step damage models. The Expected Annual Damage (EAD) of the underground equipment is further estimated by the parametric joint distribution model, which is a function of both flood stage and duration, demonstrating the application of the bivariate model in risk assessment. Flood likelihood may alter due to climate change. A sensitivity analysis method is developed to assess future flood risk by estimating flood frequency under conditions of higher sea level and stream flow response to increased precipitation intensity. Scenarios based on steady and unsteady flow analysis are generated for the current climate, future climate within this century, and future climate beyond this century, consistent with MWWTP planning horizons. The spatial extent of flood risk is visualized by inundation mapping and a GIS-Assisted Risk Register (GARR). This research will help stakeholders of this critical infrastructure become aware of the flood risk, its vulnerability, and the inherent uncertainty.
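The EAD idea can be sketched with a Monte Carlo estimate over a joint stage-duration distribution combined with a step damage model. Everything below is an illustrative assumption: the lognormal marginals, the correlation, the damage threshold and cost, and the simplification of one flood event per year; the study's fitted parametric joint distribution would replace these.

```python
# Illustrative Monte Carlo EAD sketch: damage depends jointly on flood stage
# and duration. Marginals, correlation, threshold, cost, and the one-event-
# per-year simplification are all assumptions, not the study's fitted model.
import math
import random

def sample_stage_duration(rho=0.6):
    """Draw one correlated (stage [m], duration [h]) pair via a Gaussian pair."""
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    z2 = rho * z1 + math.sqrt(1 - rho**2) * z2   # impose correlation on z2
    stage = math.exp(0.2 + 0.5 * z1)             # assumed lognormal stage
    duration = math.exp(2.0 + 0.7 * z2)          # assumed lognormal duration
    return stage, duration

def step_damage(stage, duration, threshold=1.5, cost=2.0e6):
    """Step damage model: underground equipment is lost if the stage exceeds
    its elevation for more than a short submergence time."""
    return cost if stage > threshold and duration > 1.0 else 0.0

random.seed(1)
n = 100_000
ead = sum(step_damage(*sample_stage_duration()) for _ in range(n)) / n
print(f"EAD estimate: ${ead:,.0f} per year")
```

Because the damage function uses both coordinates, changing the duration marginal alone changes the EAD, which is exactly the second dimension the bivariate model adds over a stage-only analysis.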
Abstract:
Transportation system resilience has been the subject of several recent studies. To assess the resilience of a transportation network, however, it is essential to model its interactions with and reliance on other lifelines. In this work, a bi-level, mixed-integer, stochastic program is presented for quantifying the resilience of a coupled traffic-power network under a host of potential natural or anthropogenic hazard-impact scenarios. A two-layer network representation is employed that includes details of both systems. Interdependencies between the urban traffic and electric power distribution systems are captured through linking variables and logical constraints. The modeling approach was applied to a case study developed on a portion of the signalized traffic-power distribution system in southern Minneapolis. The results of the case study show the importance of explicitly considering interdependencies between critical infrastructures in transportation resilience estimation. The results also provide insights on lifeline performance from an alternative power perspective.
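The linking-variable idea can be shown with a toy scenario evaluation: each signalized intersection is tied to a power distribution node, and a logical constraint degrades its capacity when that node is de-energized. This is a deliberately simplified sketch, not the bi-level stochastic program itself; the network, capacities, and the dark-signal factor are invented for illustration.

```python
# Toy sketch of traffic-power linking constraints (illustrative data only,
# not the Minneapolis case study or the full bi-level optimization model).

power_up = {"P1": True, "P2": False, "P3": True}        # scenario: node P2 fails
signal_power_node = {"S1": "P1", "S2": "P2", "S3": "P3"}  # linking variables
signal_capacity = {"S1": 1800, "S2": 1600, "S3": 1500}    # veh/h when powered
DARK_SIGNAL_FACTOR = 0.5   # assumed capacity retained when a signal goes dark

def served_capacity(signal):
    """Logical constraint: full capacity only if the linked power node is up."""
    cap = signal_capacity[signal]
    return cap if power_up[signal_power_node[signal]] else cap * DARK_SIGNAL_FACTOR

total = sum(served_capacity(s) for s in signal_capacity)
baseline = sum(signal_capacity.values())
print(f"network capacity retained under scenario: {total / baseline:.0%}")
```

In the full model, such constraints couple the two network layers inside the optimization, so hardening or backup-power decisions in the power layer directly change the resilience metric computed for the traffic layer.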