490 results for traffic modeling
Abstract:
Temporary Traffic Control Plans (TCPs), which provide construction phasing to maintain traffic during construction operations, are an integral component of highway construction project design. Using the initial design, designers develop estimated quantities for the required TCP devices, which become the basis for bids submitted by highway contractors. However, actual as-built quantities often differ significantly from the engineer's original estimate. The total cost of TCP phasing on highway construction projects amounts to 6–10% of the total construction cost. Variations between engineer-estimated quantities and final quantities reduce cost control, increase the chance of cost-related litigation, and distort bid rankings and selection. Statistical analyses of over 2000 highway construction projects were performed to determine the sources of variation, which then served as the basis for developing an automated hybrid prediction model that uses multiple regressions and heuristic rules to provide accurate TCP quantities and costs. The predictive accuracy of the model was demonstrated through several case studies.
Abstract:
Loop detectors are the oldest and most widely used traffic data source. On urban arterials, they are mainly installed for signal control. Recently, state-of-the-art Bluetooth MAC Scanners (BMS) have attracted significant interest from stakeholders for area-wide traffic monitoring. Loop detectors provide flow, a fundamental traffic parameter, whereas BMS provide individual vehicle travel times between BMS stations. Hence, these two data sources complement each other and, if integrated, should increase the accuracy and reliability of traffic state estimation. This paper proposes a model that integrates loop and BMS data for seamless travel time and density estimation on urban signalised networks. The proposed model is validated using both real and simulated data, and the results indicate that its accuracy is over 90%.
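The complementarity of the two sources can be illustrated with the fundamental traffic relation: loops give flow, BMS travel times give speed, and together they yield density. This is a minimal sketch of that relation only, not the integration model proposed in the paper; all names and values are illustrative.

```python
# Why loop flow and BMS travel time are complementary: combining them via
# the fundamental relation k = q / v yields density, which neither source
# provides alone. Illustration only, not the paper's estimation model.

def density_from_loop_and_bms(flow_veh_per_h, link_length_km, travel_time_s):
    """Estimate link density (veh/km) from loop flow and BMS travel time."""
    speed_km_per_h = link_length_km * 3600.0 / travel_time_s  # v = L / tt
    return flow_veh_per_h / speed_km_per_h                    # k = q / v

# Example: 900 veh/h on a 0.5 km link traversed in 60 s -> 30 km/h
print(density_from_loop_and_bms(900, 0.5, 60))  # 30.0 veh/km
```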
Abstract:
Due to rapidly diminishing international supplies of fossil fuels such as petroleum and diesel, the cost of fuel is constantly increasing, leading to higher costs of living, given the significant reliance of many industries on motor vehicles. Many technologies have been developed to replace part or all of a fossil fuel with biofuels. One dual-fuel technology is ethanol fumigation in diesel engines, which injects ethanol into the intake air stream of the engine. Its advantage is that it avoids any costly modification of the engine's high-pressure diesel injection system, while reducing the volume of diesel required and potentially increasing power output and efficiency. This paper investigates the performance of a diesel engine converted to implement ethanol fumigation. The project uses existing experimental data together with computer-modelled results generated using the program AVL Boost. The data from both the experiments and the numerical simulation indicate desirable results for the peak pressure and the indicated mean effective pressure (IMEP). Increasing the ethanol substitution elevated the combustion pressure and increased the IMEP, while varying the ethanol injection location produced negligible change. These increases in cylinder pressure led to higher work output and total efficiency as the ethanol substitution was increased. In comparing the numerical and experimental results, the simulation showed a slight elevation due to inaccuracies in the heat-release models. Future work is required to improve the combustion model and to investigate the effect of varying the location of ethanol injection.
Abstract:
Bandwidths and offsets are important components in vehicle traffic control strategies. This article proposes new methods for quantifying and selecting them. Bandwidth is the amount of green time available for vehicles to travel through adjacent intersections without being required to stop at the second traffic light. The offset is the difference between the starting times of "green" periods at two adjacent intersections along a given route. The core ideas in this article were developed during the 2013 Maths and Industry Study Group in Brisbane, Australia. Analytical expressions for computing bandwidth as a function of offset are developed. An optimisation model for selecting offsets across an arterial is proposed. The focus is on arterial roads, as bandwidth and offset have a greater impact on these roads than on a full traffic network. A generic optimisation-simulation approach is also proposed to refine an initial starting solution according to a specified metric. A metric that reflects the number of stops, and the distance between stops, is proposed to explicitly reduce the dissatisfaction of road users and to implicitly reduce fuel consumption and emissions. Conceptually, the optimisation-simulation approach is superior, as it handles real-life complexities and is a global optimisation approach. The models and equations in this article can be used in road planning and traffic control.
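The bandwidth/offset definitions above can be sketched for a single pair of intersections. The following is an assumed simplified model (common cycle length and equal green durations at both signals), not the analytical expressions derived in the article.

```python
# Sketch: bandwidth as a function of offset for two intersections sharing
# a common cycle and green duration. Assumed simplified model for
# illustration, not the article's analytical expressions.

def bandwidth(offset, green, cycle, travel_time):
    """Width (s) of the green band through two adjacent intersections."""
    # A vehicle departing in the upstream green [0, green) arrives
    # travel_time later; it avoids stopping if the arrival falls inside
    # the downstream green, which starts at `offset` within the cycle.
    shift = (offset - travel_time) % cycle
    direct = max(0.0, green - shift)             # non-wrapped overlap
    wrapped = max(0.0, green - (cycle - shift))  # overlap past the cycle end
    return direct + wrapped

# Perfect coordination: offset equals travel time -> full green band
print(bandwidth(offset=20, green=30, cycle=90, travel_time=20))  # 30.0
```

Sweeping `offset` over one cycle with this function reproduces the familiar triangular bandwidth profile, which is what an offset-selection optimisation would search over.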
Abstract:
Perfluorooctanoic acid (PFOA) and perfluorooctane sulfonic acid (PFOS) have been used for a variety of applications, including fluoropolymer processing, fire-fighting foams and surface treatments, since the 1950s. Both PFOS and PFOA are polyfluoroalkyl chemicals (PFCs), man-made compounds that are persistent in the environment and in humans; some PFCs have shown adverse effects in laboratory animals. Here we describe the application of a simple one-compartment pharmacokinetic model to estimate total intakes of PFOA and PFOS for the general population of urban areas on the east coast of Australia. Key parameters for this model are the elimination rate constants and the volume of distribution within the body. The volume of distribution for PFOA was calibrated to a value of 170 mL/kg bw using data from two communities in the United States where residents' serum concentrations could be assumed to result primarily from a known and characterized source: drinking water contaminated with PFOA by a single fluoropolymer manufacturing facility. For PFOS, a value of 230 mL/kg bw was used, based on adjustment of the PFOA value. Applying measured Australian serum data to the model gave mean ± standard deviation intake estimates for PFOA of 1.6 ± 0.3 ng/kg bw/day for males and females >12 years of age combined, based on samples collected in 2002-2003, and 1.3 ± 0.2 ng/kg bw/day based on samples collected in 2006-2007. Mean intakes of PFOS were 2.7 ± 0.5 ng/kg bw/day for males and females >12 years of age combined, based on samples collected in 2002-2003, and 2.4 ± 0.5 ng/kg bw/day for the 2006-2007 samples. ANOVA for PFOA intake demonstrated significant differences by age group (p=0.03), sex (p=0.001) and date of collection (p<0.001). Estimated intake rates were highest in those aged >60 years, higher in males than in females, and higher in 2002-2003 than in 2006-2007. The same pattern was seen for PFOS intake, with significant differences by age group (p<0.001), sex (p=0.001) and date of collection (p=0.016).
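At steady state, the one-compartment model described reduces to intake = serum concentration × volume of distribution × elimination rate constant. The sketch below uses the 170 mL/kg bw volume of distribution quoted in the abstract; the serum concentration and half-life are assumed illustrative values, not the paper's data.

```python
import math

# One-compartment steady-state intake estimate:
#   intake = C_serum * Vd * k_e,  with k_e = ln(2) / half-life.
# Vd = 170 mL/kg bw is the calibrated PFOA value from the abstract;
# the serum level and half-life below are assumed, illustrative inputs.

def daily_intake_ng_per_kgbw(serum_ng_per_ml, vd_ml_per_kgbw, half_life_days):
    k_e = math.log(2) / half_life_days  # first-order elimination rate (1/day)
    return serum_ng_per_ml * vd_ml_per_kgbw * k_e

# Example: assumed serum 7 ng/mL, assumed half-life of ~3.5 years
print(round(daily_intake_ng_per_kgbw(7.0, 170.0, 3.5 * 365), 2))  # 0.65
```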
Abstract:
Several significant studies have been conducted in recent decades toward understanding road traffic noise and its effects on residential balconies. These previous studies have used a variety of techniques, such as theoretical models, scale models and measurements on real balconies, and have considered road traffic noise levels within the balcony space, inside an adjacent habitable room, or both. Previous theoretical models have used, for example, simplified specular reflection calculations, boundary element methods (BEM), adaptations of CoRTN or Sabine theory. This paper presents an alternative theoretical model to predict the effects of road traffic noise spatially within the balcony space. The model includes a specular reflection component by calculating up to 10 orders of source images. To account for diffusion effects, a two-compartment radiosity component is utilised. The first radiosity compartment is the urban street, represented as a street with building facades on either side. The second radiosity compartment is the balcony space. The model calculates the predicted road traffic noise levels within the balcony space and is capable of establishing the effect of changing street and balcony geometries. Screening attenuation algorithms are included to determine the effects of solid balcony parapets and balcony ceiling shields.
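The specular component can be illustrated with the image-source idea: reflections between two parallel street facades are replaced by a line of mirror-image sources whose energy contributions are summed. This is a toy sketch under assumed parameters (a simple facade reflection coefficient, spherical spreading); the paper's model additionally includes the two-compartment radiosity and screening terms, which are omitted here.

```python
import math

# Image-source sketch for a street canyon: facades at x = 0 and
# x = street_width act as mirrors, producing images at 2nW + x_s and
# 2nW - x_s. Energy is summed with a per-reflection loss factor `refl`.
# Toy illustration only; radiosity and screening terms are omitted.

def image_source_level(src_x, rcv_x, rcv_dist, street_width, orders, refl=0.8):
    """Relative sound level (dB re free field at 1 m) at the receiver."""
    energy = 0.0
    for n in range(-orders, orders + 1):
        if n % 2 == 0:
            img_x = src_x + n * street_width       # even-order image
        else:
            img_x = -src_x + (n + 1) * street_width  # odd-order image
        r = math.hypot(img_x - rcv_x, rcv_dist)
        energy += (refl ** abs(n)) / (r * r)       # spreading + facade loss
    return 10.0 * math.log10(energy)

# Source 2 m from one facade, receiver 5 m across a 10 m street,
# including images up to order 10 as in the paper's specular component
print(image_source_level(2.0, 5.0, 3.0, 10.0, orders=10))
```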
Abstract:
This paper presents two novel nonlinear models of U-shaped anti-roll tanks for ships, and their linearizations. In addition, a third simplified nonlinear model is presented. The models are derived using Lagrangian mechanics. This formulation not only simplifies the modeling process, but also allows one to obtain models that satisfy energy-related physical properties. The proposed nonlinear models and their linearizations are validated using model-scale experimental data. Unlike other models in the literature, the nonlinear models in this paper are valid for large roll amplitudes. Even at moderate roll angles, the nonlinear models have three orders of magnitude lower mean square error relative to experimental data than the linear models.
Abstract:
Process models are usually depicted as directed graphs, with nodes representing activities and directed edges representing control flow. While structured processes with pre-defined control flow have been studied in detail, flexible processes including ad-hoc activities need further investigation. This paper presents the flexible process graph, a novel approach to modeling processes in the context of a dynamic environment and adaptive process participants' behavior. The approach allows defining execution constraints that are more restrictive than traditional ad-hoc processes and less restrictive than traditional control flow, thereby balancing structured control flow with unstructured ad-hoc activities. The flexible process graph focuses on what can be done to perform a process, with participants' routing decisions based on the current process state. As a formal grounding, the approach uses hypergraphs, in which each edge can connect any number of nodes; hypergraphs are used to define the execution semantics of processes formally. We provide a process scenario to motivate and illustrate the approach.
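The structural difference from ordinary process graphs can be made concrete: a hyperedge associates an arbitrary set of activity nodes rather than exactly two. A minimal sketch with hypothetical names, not the paper's formalism:

```python
from dataclasses import dataclass, field

# Minimal hypergraph sketch: unlike a directed-graph edge, a hyperedge can
# associate any number of activity nodes. All names are illustrative.

@dataclass(frozen=True)
class HyperEdge:
    activities: frozenset  # any number of associated activity nodes

@dataclass
class FlexibleProcessGraph:
    activities: set = field(default_factory=set)
    edges: set = field(default_factory=set)

    def add_edge(self, *activities):
        """Add one hyperedge over the given activities."""
        self.activities.update(activities)
        self.edges.add(HyperEdge(frozenset(activities)))

g = FlexibleProcessGraph()
g.add_edge("review", "approve", "archive")  # one edge over three activities
print(len(g.edges), len(g.activities))      # 1 3
```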
Abstract:
Lean construction and building information modeling (BIM) are quite different initiatives, but both are having profound impacts on the construction industry. A rigorous analysis of the myriad specific interactions between them indicates that a synergy exists which, if properly understood in theoretical terms, can be exploited to improve construction processes beyond what either paradigm could achieve independently. Using a matrix that juxtaposes BIM functionalities with prescriptive lean construction principles, 56 interactions have been identified, all but four of which represent constructive interaction. Although evidence has been found for the majority of these, the matrix is not considered complete but rather a framework for research to explore the degree of validity of the interactions. Construction executives, managers, designers, and developers of information technology systems for construction can also benefit from the framework as an aid to recognizing the potential synergies when planning their lean and BIM adoption strategies.
Abstract:
Unstable density-driven flow can lead to enhanced solute transport in groundwater. Only recently has the complex fingering pattern associated with free convection been documented in field settings. Electrical resistivity (ER) tomography has been used to capture a snapshot of convective instabilities at a single point in time, but a thorough transient analysis is still lacking in the literature. We present the results of a 2 year experimental study at a shallow aquifer in the United Arab Emirates that was designed to specifically explore the transient nature of free convection. ER tomography data documented the presence of convective fingers following a significant rainfall event. We demonstrate that the complex fingering pattern had completely disappeared a year after the rainfall event. The observation is supported by an analysis of the aquifer halite budget and hydrodynamic modeling of the transient character of the fingering instabilities. Modeling results show that the transient dynamics of the gravitational instabilities (their initial development, infiltration into the underlying lower-density groundwater, and subsequent decay) are in agreement with the timing observed in the time-lapse ER measurements. All experimental observations and modeling results are consistent with the hypothesis that a dense brine that infiltrated into the aquifer from a surficial source was the cause of free convection at this site, and that the finite nature of the dense brine source and dispersive mixing led to the decay of instabilities with time. This study highlights the importance of the transience of free convection phenomena and suggests that these processes are more rapid than was previously understood.
Abstract:
Designing systems for multiple stakeholders requires frequent collaboration with multiple stakeholders from the start. In many cases at least some stakeholders lack a professional habit of formal modeling. We report observations from two case studies of stakeholder-involvement in early design where non-formal techniques supported strong collaboration resulting in deep understanding of requirements and of the feasibility of solutions.
Abstract:
Motivation: Task analysis for designing modern collaborative work needs a more fine-grained approach, especially in a complex task domain, such as collaborative scientific authoring, where a single overall goal can be accomplished only through collaboration between multiple roles, each requiring its own expertise. We analyzed and reconsidered roles, activities, and objects for the design of complex collaboration contexts. Our main focus is a generic approach to designing for multiple roles and subtasks in a domain with a shared overall goal. Collaborative authoring is our current example. This research is incremental: an existing task analysis approach (GTA) is reconsidered by applying it to a case of complex collaboration. Our analysis shows that designing for collaboration indeed requires a refined approach to task modeling: GTA will, in future, need to consider tasks at the lowest level that can be delegated or mandated. These tasks need to be analyzed and redesigned in more detail, along with the relevant task objects.
Abstract:
Process choreographies describe interactions between different business partners and the dependencies between these interactions. While different proposals have been made for capturing choreographies at the implementation level, it remains unclear how choreographies should be described at a conceptual level. While the Business Process Modeling Notation (BPMN) is already in use for describing choreographies in terms of interconnected interface behavior models, this paper introduces interaction modeling using BPMN. Such interaction models do not suffer from incompatibility issues and are better suited for human modelers. BPMN extensions are proposed, and a mapping from interaction models to interface behavior models is presented.
Abstract:
This paper presents a novel framework to further advance the recent trend of using query decomposition and high-order term relationships in query language modeling, taking into account terms implicitly associated with different subsets of query terms. Existing approaches, most notably the language model based on the Information Flow method, are unable to capture multiple levels of association and also suffer from high computational overhead. In this paper, we propose to compute association rules from pseudo-feedback documents that are segmented into variable-length chunks via multiple sliding windows of different sizes. Extensive experiments have been conducted on various TREC collections, and our approach significantly outperforms a baseline Query Likelihood language model, the Relevance Model and the Information Flow model.
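The segmentation step described, chunking feedback documents with multiple sliding windows of different sizes, can be sketched directly. The window sizes below are arbitrary illustrative choices, not the ones used in the paper.

```python
# Sketch of the chunking step: segment a pseudo-feedback document into
# variable-length chunks using multiple sliding windows of different
# sizes. Window sizes here are arbitrary illustrative choices.

def sliding_window_chunks(tokens, window_sizes=(2, 4, 8)):
    """Return all contiguous token chunks for each window size."""
    chunks = []
    for w in window_sizes:
        chunks.extend(tuple(tokens[i:i + w])
                      for i in range(len(tokens) - w + 1))
    return chunks

doc = "traffic state estimation on urban arterials".split()
# 6 tokens -> 5 chunks of size 2, 3 of size 4, none of size 8
print(len(sliding_window_chunks(doc)))  # 8
```

Association rules would then be mined over these chunks, treating each chunk as a transaction of co-occurring terms.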
Abstract:
Accurate prediction of incident duration is not only important information for a Traffic Incident Management System, but also an effective input for travel time prediction. In this paper, hazard-based prediction models are developed for both incident clearance time and arrival time. The data are obtained from the Queensland Department of Transport and Main Roads' STREAMS Incident Management System (SIMS) for the year ending November 2010. Best-fitting distributions are derived for both clearance and arrival time for three types of incident: crash, stationary vehicle, and hazard. The results show that the Gamma, Log-logistic, and Weibull distributions are the best fits for crash, stationary vehicle, and hazard incidents, respectively. Significant impact factors are identified for crash clearance time and arrival time, and their quantitative influences are presented for both clearance and arrival in crash and hazard incidents. Finally, model accuracy is analyzed.
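The distribution-selection step described, fitting Gamma, Log-logistic, and Weibull candidates and choosing the best, can be sketched with standard maximum-likelihood fits. The data below are synthetic and the comparison by log-likelihood is an assumed stand-in for the paper's goodness-of-fit procedure.

```python
import numpy as np
from scipy import stats

# Hedged sketch: fit candidate duration distributions to incident
# clearance times and compare them by log-likelihood, in the spirit of
# the hazard-based approach described. Synthetic data, illustration only.

rng = np.random.default_rng(42)
clearance_min = rng.weibull(1.5, size=500) * 40.0  # synthetic clearance times

candidates = {
    "weibull":      stats.weibull_min,
    "gamma":        stats.gamma,
    "log-logistic": stats.fisk,  # fisk is the log-logistic in SciPy
}

for name, dist in candidates.items():
    params = dist.fit(clearance_min, floc=0)          # fix location at zero
    loglik = np.sum(dist.logpdf(clearance_min, *params))
    print(f"{name}: log-likelihood = {loglik:.1f}")
```

In practice the fitted distribution then supplies the baseline hazard for a parametric hazard-based duration model, with covariates such as incident type entering through the scale parameter.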