899 results for CONICAL INTERSECTIONS


Relevance: 10.00%

Abstract:

Predicting safety on roadways is standard practice for road safety professionals and has a correspondingly extensive literature. The majority of safety prediction models are estimated using roadway segment and intersection (microscale) data, while more recently efforts have been undertaken to predict safety at the planning level (macroscale). Safety prediction models typically include roadway, operations, and exposure variables, factors known to affect safety in fundamental ways. Environmental variables, in particular variables attempting to capture the effect of rain on road safety, are difficult to obtain and have rarely been considered. In the few cases where weather variables have been included, historical averages rather than the actual weather conditions under which crashes were observed have been used. Without the inclusion of weather-related variables, researchers have had difficulty explaining regional differences in the safety performance of various entities (e.g., intersections, road segments, highways). As part of the NCHRP 8-44 research effort, researchers developed PLANSAFE, a set of planning-level safety prediction models. These models make use of socio-economic, demographic, and roadway variables for predicting planning-level safety. Accounting for regional differences, as with microscale safety models, has been problematic during the development of planning-level safety prediction models. More specifically, without weather-related variables there is an insufficient set of variables for explaining safety differences across regions and states. Furthermore, omitted variable bias resulting from excluding these important variables may adversely affect the coefficients of included variables, contributing to difficulty in model interpretation and accuracy. This paper summarizes the results of an effort to include weather-related variables, particularly various measures of rainfall, into models of accident frequency and of the frequency of fatal and/or injury crashes. The purpose of the study was to determine whether these variables improve the overall goodness of fit of the models, whether they explain some or all of the observed regional differences, and what the estimated effects of rainfall on safety are. The models are based on Traffic Analysis Zone level datasets from Michigan and from Pima and Maricopa Counties in Arizona. Numerous rain-related variables were found to be statistically significant, selected rain-related variables improved the overall goodness of fit, and inclusion of these variables reduced the portion of the model explained by the constant in the base models without weather variables. Rain tends to diminish safety, as expected, in fairly complex ways that depend on rain frequency and intensity.
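The abstract does not state the model family, so the sketch below is a hedged illustration only: it fits a count model with and without a rainfall covariate and compares goodness of fit via a likelihood ratio test. The file and column names (taz_crashes.csv, crashes, aadt, rain_days) are invented for the example.

```python
# Hedged sketch only: compare count-model fit with and without rainfall.
# All file and column names are invented, not the paper's dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats

df = pd.read_csv("taz_crashes.csv")              # hypothetical TAZ data

y = df["crashes"]                                # crash counts per TAZ
X_base = sm.add_constant(np.log(df[["aadt"]]))   # exposure only
X_rain = sm.add_constant(
    pd.concat([np.log(df[["aadt"]]), df[["rain_days"]]], axis=1))

fam = sm.families.NegativeBinomial(alpha=1.0)    # fixed dispersion, for
base = sm.GLM(y, X_base, family=fam).fit()       # illustration only
rain = sm.GLM(y, X_rain, family=fam).fit()

lr = 2 * (rain.llf - base.llf)                   # one added parameter
p = stats.chi2.sf(lr, df=1)
print(f"LR = {lr:.2f}, p = {p:.4f}")             # small p: rain improves fit
```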

Relevance: 10.00%

Abstract:

There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions that accompany each, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings) exist in one of two states: perfectly safe or unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of “excess” zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process, and show that the fundamental crash process follows Bernoulli trials with unequal probabilities of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate it. We also present the theory behind dual-state count models, and note why they have become popular for modeling crash data. A simulation experiment is then conducted to demonstrate how crash data give rise to the “excess” zeros frequently observed. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed, and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales, not from an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (for observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
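As a minimal sketch of the simulation logic described (with invented rates and exposures, not the paper's experimental design), the following shows how Bernoulli trials with heterogeneous probabilities produce far more zeros at low exposure than a single Poisson fit implies, with no latent "perfectly safe" state involved:

```python
# Minimal sketch (not the paper's experiment): crash counts from Bernoulli
# trials with site-to-site heterogeneity in crash probability. Low exposure
# plus heterogeneity yields "excess" zeros relative to one Poisson fit.
import numpy as np

rng = np.random.default_rng(42)
n_sites = 10_000
risk = rng.uniform(1e-6, 2e-3, size=n_sites)  # per-passage crash probability

for exposure in (2_000, 200_000):             # vehicle passages per site
    counts = rng.binomial(exposure, risk)     # Bernoulli trials per passage
    lam = counts.mean()
    print(f"exposure={exposure:>7}: zeros observed={np.mean(counts == 0):.3f},"
          f" Poisson-implied={np.exp(-lam):.3f}")
# At low exposure, observed zeros far exceed exp(-mean), which a ZIP model
# would misread as evidence of a dual 'safe/unsafe' state.
```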

Relevance: 10.00%

Abstract:

At least two important transportation planning activities rely on planning-level crash prediction models. One is motivated by the Transportation Equity Act for the 21st Century, which requires departments of transportation and metropolitan planning organizations to consider safety explicitly in the transportation planning process. The second could arise from a need for state agencies to establish incentive programs to reduce injuries and save lives. Both applications require a forecast of safety for a future period. Planning-level crash prediction models for the Tucson, Arizona, metropolitan region are presented to demonstrate the feasibility of such models. Data were separated into fatal, injury, and property-damage crashes. To accommodate overdispersion in the data, negative binomial regression models were applied, and to accommodate the simultaneity of fatality and injury crash outcomes, the models were estimated simultaneously. All models produce crash forecasts at the traffic analysis zone level. Statistically significant (p-values < 0.05) and theoretically meaningful variables for the fatal crash model included population density, persons 17 years old or younger as a percentage of the total population, and intersection density. Significant variables for the injury and property-damage crash models were population density, number of employees, intersection density, percentage of miles of principal arterials, percentage of miles of minor arterials, and percentage of miles of urban collectors. Among several conclusions, it is suggested that planning-level safety models are feasible and may play a role in future planning activities; however, caution must be exercised with such models.
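A hedged sketch of the kind of negative binomial model described, using statsmodels; the TAZ file and column names are hypothetical stand-ins for the variables named in the abstract, and the paper's simultaneous estimation of fatal and injury models is omitted:

```python
# Hedged sketch of a planning-level negative binomial model. File and
# column names are hypothetical; simultaneous estimation is omitted.
import pandas as pd
import statsmodels.api as sm

taz = pd.read_csv("tucson_taz.csv")
X = sm.add_constant(taz[["pop_density", "pct_under_18", "int_density"]])
y = taz["fatal_crashes"]

# Overdispersion check: variance well above the mean motivates NB over Poisson
print(f"mean={y.mean():.2f}, var={y.var():.2f}")

nb = sm.NegativeBinomial(y, X).fit(disp=False)   # dispersion estimated from data
print(nb.summary())
taz["forecast"] = nb.predict(X)                  # TAZ-level crash forecasts
```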

Relevance: 10.00%

Abstract:

In recent years the development and use of crash prediction models for roadway safety analyses have received substantial attention. These models, also known as safety performance functions (SPFs), relate the expected crash frequency of roadway elements (intersections, road segments, on-ramps) to traffic volumes and other geometric and operational characteristics. A commonly practiced approach for applying intersection SPFs is to assume that crash types occur in fixed proportions (e.g., rear-end crashes make up 20% of crashes, angle crashes 35%, and so forth) and then apply these fixed proportions to crash totals to estimate crash frequencies by type. As demonstrated in this paper, such a practice makes questionable assumptions and results in considerable error in estimating crash proportions. Through the use of rudimentary SPFs based solely on the annual average daily traffic (AADT) of major and minor roads, the homogeneity-in-proportions assumption is shown not to hold across AADT, because crash proportions vary as a function of both major and minor road AADT. For example, with minor road AADT of 400 vehicles per day, the proportion of intersecting-direction crashes decreases from about 50% with 2,000 major road AADT to about 15% with 82,000 AADT. Same-direction crashes increase from about 15% to 55% for the same comparison. The homogeneity-in-proportions assumption should be abandoned, and crash type models should be used to predict crash frequency by crash type. SPFs that use additional geometric variables would only exacerbate the problem quantified here. Comparison of models for different crash types using additional geometric variables remains the subject of future research.
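To make the mechanism concrete, the sketch below computes crash-type proportions from rudimentary SPFs of the standard form N = exp(b0) * AADT_major^b1 * AADT_minor^b2; the coefficients are invented for illustration, not the paper's estimates:

```python
# Hedged illustration of why fixed crash-type proportions fail. The SPF
# form is a typical choice; the coefficients below are invented.
import numpy as np

def spf(aadt_major, aadt_minor, b0, b1, b2):
    return np.exp(b0) * aadt_major**b1 * aadt_minor**b2

aadt_minor = 400
coeffs = {                      # hypothetical per-crash-type coefficients
    "intersecting": (-8.0, 0.45, 0.60),
    "same_dir":     (-11.0, 0.95, 0.30),
    "other":        (-9.0, 0.60, 0.40),
}

for aadt_major in (2_000, 20_000, 82_000):
    n = {k: spf(aadt_major, aadt_minor, *b) for k, b in coeffs.items()}
    total = sum(n.values())
    print(aadt_major, {k: f"{v / total:.0%}" for k, v in n.items()})
# The proportions shift with AADT, so applying fixed shares to a
# total-crash SPF misestimates crash frequencies by type.
```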

Relevance: 10.00%

Abstract:

This paper addresses the problem of constructing consolidated business process models out of collections of process models that share common fragments. The paper considers the construction of unions of multiple models (called merged models) as well as intersections (called digests). Merged models are intended for analysts who wish to create a model that subsumes a collection of process models, typically representing variants of the same underlying process, with the aim of replacing the variants with the merged model. Digests, on the other hand, are intended for analysts who wish to identify the most recurring fragments across a collection of process models, so that they can focus their efforts on optimizing these fragments. The paper presents an algorithm for computing merged models and an algorithm for extracting digests from a merged model. The merging and digest extraction algorithms have been implemented and tested against collections of process models taken from multiple application domains. The tests show that the merging algorithm produces compact models and scales up to process models containing hundreds of nodes. Furthermore, a case study conducted in a large insurance company has demonstrated the usefulness of the merging and digest extraction operators in a practical setting.
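A highly simplified sketch of the union/intersection idea (not the paper's algorithm, which also matches similar fragments and tracks provenance in annotated graphs), reducing each process model to a set of directed edges between tasks:

```python
# Simplified sketch: a process model reduced to a set of directed edges.
# Task names and variants are invented for illustration.
variant_a = {("Receive claim", "Check policy"), ("Check policy", "Assess"),
             ("Assess", "Pay")}
variant_b = {("Receive claim", "Check policy"), ("Check policy", "Assess"),
             ("Assess", "Reject")}
variant_c = {("Receive claim", "Check policy"), ("Check policy", "Escalate")}

variants = [variant_a, variant_b, variant_c]

# Merged model: union of edges, each tagged with the variants it came from,
# so the merged model subsumes every variant's behaviour.
merged = {}
for i, edges in enumerate(variants):
    for e in edges:
        merged.setdefault(e, set()).add(i)

# Digest: edges recurring in at least `threshold` variants (a strict
# intersection when threshold == len(variants)), highlighting the most
# recurring fragments worth optimizing.
threshold = 2
digest = {e for e, src in merged.items() if len(src) >= threshold}
print(sorted(digest))
```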

Relevance: 10.00%

Abstract:

Epilogue for the edited book "Nexus: New Intersections in Internet Research"

Relevance: 10.00%

Abstract:

Skid resistance is a condition parameter characterising the contribution that a road makes to the friction between the road surface and a vehicle tyre. Studies of traffic crash histories around the world have consistently found that a disproportionate number of crashes occur where the road surface has a low level of surface friction and/or surface texture, particularly when the road surface is wet. Research published over many years has tried to quantify the influence of skid resistance on accident occurrence and to characterise the correlation between skid resistance and accident frequency; most of these studies used simple statistical correlation methods to analyse skid resistance and crash data.

Preliminary findings of a systematic and extensive literature search conclude that there is rarely a single causation factor in a crash. Findings from research projects do affirm various levels of correlation between skid resistance and accident occurrence, and studies indicate that the level of skid resistance at critical places such as intersections, curves, roundabouts, ramps and approaches to pedestrian crossings needs to be well maintained.

Management of risk is an integral aspect of the Queensland Department of Main Roads (QDMR) strategy for managing its infrastructure assets. The risk-based approach has been used in many areas of infrastructure engineering; however, very limited information has been reported on using a risk-based approach to mitigate crash rates related to the road surface. Low skid resistance and low surface texture may increase the risk of traffic crashes.

The objectives of this paper are to explore current issues of skid resistance in relation to crashes, to provide a framework for a probability-based approach to be adopted by QDMR in assessing the relationship between crashes and pavement properties, and to explain why the probability-based approach is a suitable tool for QDMR to reduce accident rates due to skid resistance.
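The paper outlines a framework rather than a specific estimator, so the following is only one hedged illustration of a probability-based treatment: a logistic model of wet-weather crash occurrence against skid resistance, with a hypothetical segment-level dataset and column names:

```python
# One hedged illustration of a probability-based treatment; the specific
# model and all file/column names are assumptions, not the paper's method.
import pandas as pd
import statsmodels.api as sm

seg = pd.read_csv("segments.csv")        # hypothetical: one row per segment
X = sm.add_constant(seg[["skid_number", "texture_depth"]])
y = seg["wet_crash"]                     # 1 if a wet-weather crash occurred

model = sm.Logit(y, X).fit(disp=False)
seg["crash_prob"] = model.predict(X)

# Rank segments by predicted crash probability to prioritise treatment at
# critical sites (intersections, curves, roundabouts, pedestrian approaches).
print(seg.sort_values("crash_prob", ascending=False).head())
```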

Relevance: 10.00%

Abstract:

Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance; capacity may be predicted as the maximum uncongested flow achievable. Although macroscopic models are effective tools for design and analysis, they offer little insight into the nature of the processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes while providing for the assessment of performance through measures of capacity and delay. However, these models are limited to only a few circumstances.

The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to portray accurately the mechanisms of freeway operations at the specific locations under consideration, to be calibrated using data acquired at those locations, and to have outputs that could be validated with data acquired at the same sites, so that the outputs would be truly descriptive of the performance of the facility. A theoretical basis needed to underlie the form of the models, rather than the empiricism of the macroscopic models currently used, and the models needed to be adaptable to variable operating conditions so that they could be applied, where possible, to other similar systems and facilities. It was not possible in this single study to produce a stand-alone model applicable to all facilities and locations; however, the scene has been set for the application of the models to a much broader range of operating conditions. Opportunities for further development of the models were identified, and procedures provided for calibrating and validating the models against a wide range of conditions.

The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour in Brisbane was applied to the models; different mechanisms are likely in other locations owing to variability in road rules and driving cultures. Not all observed manoeuvres were modelled: some unusual manoeuvres were considered unwarranted to model. Nevertheless, the models developed contain the principal processes of freeway operations, merging and lane changing.

Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream (the kerb-lane traffic) exercises only a limited priority over the minor stream (the on-ramp traffic), and theory was established to account for this behaviour. Kerb-lane drivers were also found to change to the median lane where possible to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major stream flow rate that excludes lane changers. Cowan's M3 model was calibrated for both streams; on-ramp and total upstream flows are required as input. Relationships between the proportion of headways greater than 1 s and flow differed between on-ramps fed by signalised intersections and those fed by unsignalised intersections.

Constant-departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated. Critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited priority capacity and other boundary relationships were established by Troutbeck (1995). The minimum average minor stream delay and the corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor and major stream delays across all minor and major stream flows, and pseudo-empirical relationships were established to predict average delays. Major stream average delays are limited to 0.5 s, insignificant compared with minor stream delay, which reaches infinity at capacity. Minor stream delays were shown to be smaller when unsignalised rather than signalised intersections are located upstream of on-ramps, and smaller still when ramp metering is installed; smaller delays correspond to improved merge area performance.

A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model, and merging probabilities can be predicted for given taper lengths, a most useful performance measure. This model was also shown to be applicable to lane changing. Tolerable limits to merging probabilities require calibration; from these, practical capacities can be estimated. Further calibration is required of the traffic inputs, critical gap and minimum follow-on time, for both merging and lane changing, and a general relationship to predict the proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models in assessing performance, and to provide further insight into the nature of operations.
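As a hedged illustration of the gap-acceptance machinery involved, the sketch below computes minor-stream (merge) capacity with Cowan's M3 headway model under absolute priority, using the standard M3 capacity relationship; the thesis itself uses Troutbeck's limited-priority extension, which is omitted here, and all parameter values are illustrative only:

```python
# Hedged sketch: gap-acceptance capacity with Cowan M3 headways under
# absolute priority (the thesis uses a limited-priority extension).
# Parameter values below are illustrative, not calibrated results.
import math

def m3_capacity(q_major, alpha, delta, t_c, t_f):
    """Minor-stream (on-ramp) capacity in veh/s, given major-stream flow
    q_major (veh/s), proportion of free headways alpha, minimum headway
    delta (s), critical gap t_c (s) and follow-on time t_f (s)."""
    lam = alpha * q_major / (1.0 - delta * q_major)   # M3 decay rate
    return (q_major * alpha * math.exp(-lam * (t_c - delta))
            / (1.0 - math.exp(-lam * t_f)))

q_major = 1800 / 3600          # kerb-lane flow, veh/s
alpha = 0.7                    # proportion of headways > 1 s
delta = 1.0                    # minimum headway, s
t_c, t_f = 1.7, 1.2            # critical gap and follow-on time, s

cap = m3_capacity(q_major, alpha, delta, t_c, t_f)
print(f"merge capacity = {cap * 3600:.0f} veh/h")
```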

Relevance: 10.00%

Abstract:

Road safety is a major concern worldwide, and it improves as road conditions and their effects on crashes are continually investigated. This paper proposes using the capability of data mining to include a greater set of road variables for all available crashes with skid resistance values across the Queensland state main road network, in order to understand the relationships among crash, traffic and road variables. It presents a data-mining-based methodology for road asset management data to identify the road properties that contribute unduly to crashes. The models demonstrate high levels of accuracy in predicting crashes on roads when various road properties are included. The paper presents the findings of these models to show the relationships among skid resistance, crashes, crash characteristics and other road characteristics such as seal type, seal age, road type, texture depth, lane count, pavement width, rutting, speed limit, traffic rates, intersections, traffic signage and road design.
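The abstract names the road properties but not the mining algorithm, so a random forest classifier stands in below as one hedged possibility; the dataset and column names are hypothetical stand-ins for the Queensland road asset data:

```python
# Hedged sketch: the mining algorithm is an assumption (random forest),
# and the file/column names are hypothetical stand-ins.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

roads = pd.read_csv("road_segments.csv")
features = ["skid_resistance", "texture_depth", "seal_age", "lane_count",
            "pavement_width", "rutting", "speed_limit", "aadt"]
X, y = roads[features], roads["crash_occurred"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print(f"hold-out accuracy: {rf.score(X_te, y_te):.2f}")

# Feature importances hint at which road properties contribute to crashes
for name, imp in sorted(zip(features, rf.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name:>16}: {imp:.3f}")
```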

Relevance: 10.00%

Abstract:

This paper explores how young children are constructed in educational policy for citizenship in Australia, investigating tensions between early childhood educational discourses that construct young children as active citizens and the broader discourses of citizenship in Australian educational policy. There is a widespread discourse within early childhood education that regards young children as citizens and democratic participants in their own lives, a view that reflects the oft-cited Article 12 of the UN Convention on the Rights of the Child (UNCRC 1989). However, educational policy and curriculum for citizenship in Australia by and large adhere to age-and-stage understandings of children, implicitly deeming young children unable to conceptualise abstract ideas of what it means to ‘be a good citizen’. This paper is located in the borders and intersections between discourses of early childhood education, young children as active participants in their own lives, and what it means to be an active citizen in Australia. We are concerned with the interweaving of these ideas and how they play out in educational policy making. This perspective is important because governing and policy making are exercises in harnessing existing ideas and discourses, thereby rendering strategies and tactics for managing populations thinkable and sayable (Rose 1999). The ‘views from the margins’ (Burman 2008, p. 7) can provide alternative perspectives on policymaking, illuminating discursive tactics and strategies.

Relevance: 10.00%

Abstract:

The use of the stable isotope ratios δ18O and δ2H is well established in the assessment of groundwater systems and their hydrology. The conventional approach is based on x/y plots and their relation to various meteoric water lines (MWLs), and on plots of either ratio against parameters such as Cl or EC. Interpretation has been extended through 2D maps, contour plots and 2D hydrogeological vertical sections, and presentation and interpretation further enhanced by the production of “isoscapes”, usually as 2.5D surface projections. We have applied groundwater isotopic data to a 3D visualisation of the alluvial aquifer system of the Lockyer Valley. The 3D framework is produced in GVS (Groundwater Visualisation System). This format enables enhanced presentation by displaying the spatial relationships and allowing interpolation between “data points”, i.e. borehole screened zones where groundwater enters. The relative variations in the δ18O and δ2H values are similar in these ambient-temperature systems; however, δ2H better reflects hydrological processes, whereas δ18O also reflects aquifer/groundwater exchange reactions. The 3D model has the advantage of displaying borehole relations to spatial features, enabling isotopic ratios and their values to be associated with, for example, bedrock groundwater mixing, interaction between aquifers, relation to stream recharge, and evaporation of near-surface and return irrigation water. Some specific features are also shown, such as zones of leakage of deeper groundwater (in this case with a GAB signature), and variations in the source of recharging water can be displayed at a catchment scale. Interpolation between bores is not always possible, depending on bore numbers and spacing and on the elongate configuration of the alluvium; in these cases, the visualisation uses discs around the screens that can be manually expanded to test extents or intersections. Separate displays are used for each of δ18O and δ2H, with colour coding for isotope values.

Relevance: 10.00%

Abstract:

Over less than a decade, we have witnessed a seismic shift in the way knowledge is produced and exchanged. This is opening up new opportunities for civic and community engagement, entrepreneurial behaviour, sustainability initiatives and creative practices. It also has the potential to create fresh challenges in areas of privacy, cyber-security and misuse of data and personal information.

The field of urban informatics focuses on the use and impacts of digital media technology in urban environments. Urban informatics is a dynamic and cross-disciplinary area of inquiry that encapsulates social media, ubiquitous computing, mobile applications and location-based services. Its insights suggest the emergence of a new economic force with the potential for driving innovation, wealth and prosperity through technological advances, digital media and online networks that affect patterns of both social and economic development. Urban informatics explores the intersections between people, place and technology, and their implications for creativity, innovation and engagement.

This paper examines how the key learnings from this field can be used to position creative and cultural institutions such as galleries, libraries, archives and museums (GLAM) to take advantage of the opportunities presented by these changing social and technological developments. It introduces the underlying principles, concepts and research areas of urban informatics, against the backdrop of modern knowledge economies, and covers both theoretical ideas and empirical examples. The first part discusses three challenges:

a. People, and the challenge of creativity: the paper explores the opportunities and challenges of urban informatics that can lead to the design and development of new tools, methods and applications fostering participation, the democratisation of knowledge, and new creative practices.

b. Technology, and the challenge of innovation: the paper examines how urban informatics can be applied to support user-led innovation with a view to promoting entrepreneurial ideas and creative industries.

c. Place, and the challenge of engagement: the paper discusses the potential to establish place-based applications of urban informatics, using the example of library spaces designed to deliver community and civic engagement strategies.

The discussion of these challenges is illustrated by a review of example projects drawn from diverse fields such as urban computing, locative media, community activism, and sustainability initiatives. The second part of the paper introduces an empirically grounded case study that responds to these three challenges: The Edge, the Queensland Government’s Digital Culture Centre, an initiative of the State Library of Queensland to explore the nexus of technology and culture in an urban environment. The paper not only explores the new role of libraries in the knowledge economy, but also how the application of urban informatics in prototype engagement spaces such as The Edge can provide transferable insights to inform the design and development of responsive and inclusive new library spaces elsewhere. To set the scene and background, the paper begins by drawing the bigger picture and outlining some key characteristics of the knowledge economy and the role that the creative and cultural industries play in it, grasping new opportunities that can contribute to the prosperity of Australia.

Relevance: 10.00%

Abstract:

The aim of this study is to assess the potential use of Bluetooth data for traffic monitoring of arterial road networks. Bluetooth data provide a direct measurement of travel time between pairs of scanners, and intensive research has been reported on this topic. Bluetooth data also include “Duration” data, representing the time a Bluetooth device spends within the detection range of a scanner. If the scanners are located at signalised intersections, Duration can be related to intersection performance and hence represents valuable information for traffic monitoring; however, Duration has been ignored in previous analyses. In this study, both Duration and travel time data are analysed to capture the traffic condition of a main arterial route in Brisbane. The data consist of one week of Bluetooth data provided by Brisbane City Council. In addition, micro-simulation analysis is conducted to further investigate the properties of Duration. The results reveal characteristics of Duration and identify future research needed to utilise this valuable data source.
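A minimal sketch of how Duration and travel time can be derived from raw detections, assuming a simplified log schema (device, scanner, timestamp) rather than the actual Brisbane City Council format:

```python
# Hedged sketch: derive Duration and travel time from Bluetooth detection
# logs. The schema and scanner names are assumed simplifications.
import pandas as pd

det = pd.read_csv("detections.csv", parse_dates=["timestamp"])

# Duration: time a device spends inside one scanner's detection range
passes = (det.groupby(["device", "scanner"])["timestamp"]
             .agg(first="min", last="max")
             .reset_index())
passes["duration_s"] = (passes["last"] - passes["first"]).dt.total_seconds()

# Travel time between an upstream/downstream scanner pair on the route
a = passes[passes["scanner"] == "A"][["device", "first"]]
b = passes[passes["scanner"] == "B"][["device", "first"]]
trips = a.merge(b, on="device", suffixes=("_A", "_B"))
trips["travel_s"] = (trips["first_B"] - trips["first_A"]).dt.total_seconds()
print(trips[["device", "travel_s"]].head())
```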

Relevance: 10.00%

Abstract:

The city centre represents a complex environment for cycling, with large volumes of pedestrians and motorised vehicles and frequent signalised intersections. Much of the previous literature has focused on cyclist-motor vehicle interactions because of the safety implications for cyclists, but there is increasing concern from pedestrians about the threats they perceive from cyclists. In the absence of objective data, this has the potential to lead to restrictions on cyclist access and behaviour. This presentation reports the development of a method to study the extent of cycling in the city centre and the frequency and nature of interactions between cyclists and pedestrians. Queensland is one of the few Australian jurisdictions that permit adults to cycle on the footpath, and this was also of interest. A total of 1,992 cyclists were observed at six locations in the Brisbane city centre during 7-9am, 9-11am, 2-4pm and 4-6pm on four weekdays in October 2010. The majority (85.5%) of cyclists were male, and 21.8% rode on the footpath; females were more likely to travel on the footpath than males. One or more pedestrians were within 1 m of 18.1% of observed cyclists, and within 5 m of 39.1% of observed cyclists. There were few conflicts (defined as occasions where a collision would occur if no one took evasive action) between cyclists and pedestrians or vehicles (1.1% and 0.6%, respectively), but conflicts were more common for adolescents and for riders not wearing (or not fastening) helmets.

Relevance: 10.00%

Abstract:

This edited book brings together empirical studies of young people in paid employment from a variety of disciplinary perspectives and in different national settings. In the context of increasing youth labour market participation rates and debates about the value of early employment, it draws on multi-level analyses to reflect the complexity of the field. Each of the three sections of the book explores a key aspect of young people's employment: their experience of work, intersections between work and education, and the impact of other actors and institutions. The book contributes to broadening and strengthening knowledge about the opportunities and constraints that young people face during their formative experiences in the labour market.