989 results for Traffic volume.
Abstract:
Urban traffic and climate change are two phenomena that have the potential to degrade urban water quality by influencing the build-up and wash-off of pollutants, respectively. However, limited knowledge has made it difficult to establish any link between pollutant build-up and wash-off under such dynamic conditions. Adaptive water quality mitigation measures are required to safeguard urban water quality. In this research, pollutant build-up and wash-off were investigated from a dynamic point of view that incorporated the impacts of changed urban traffic as well as changes in rainfall characteristics induced by climate change. The study developed a dynamic object classification system and thereby conceptualised the study of pollutant build-up and wash-off under future changes in urban traffic and rainfall characteristics. It also characterised the build-up and wash-off processes of traffic-generated heavy metals and of volatile, semi-volatile and non-volatile hydrocarbons under dynamic conditions, which enables the development of adaptive water quality mitigation measures. Additionally, predictive frameworks for the build-up and wash-off of selected pollutants were developed.
Abstract:
Evaluating the safety of different traffic facilities is a complex and crucial task. Microscopic simulation models have been widely used for traffic management but have been largely neglected in traffic safety studies. Using micro-simulation to study safety is more ethical and accessible than traditional safety studies, which rely solely on historical crash data. However, current microscopic models are unable to mimic unsafe driver behavior, as they are based on assumptions of safe driver behavior. This highlights the need for a critical examination of current microscopic models to determine which components and parameters affect the reproduction of safety indicators, and whether those indicators are valid measures of traffic safety. Selected safety indicators were therefore tested for straight motorway segments in Brisbane, Australia. This test examined the capability of a micro-simulation model and provides a better understanding of how such models, in particular car-following models, can be enriched to produce more accurate safety indicators.
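Surrogate safety indicators of this kind are computed directly from vehicle trajectories in simulation output. As a minimal illustration (the specific indicators selected in the study are not named here), the widely used time-to-collision (TTC) for a car-following pair can be sketched as:

```python
def time_to_collision(gap_m, v_follow_ms, v_lead_ms):
    """Time-to-collision (seconds) for a follower approaching a leader.

    TTC is a common surrogate safety indicator: the time until impact
    if both vehicles hold their current speeds. It is only defined
    while the follower is closing on the leader.
    """
    closing_speed = v_follow_ms - v_lead_ms
    if closing_speed <= 0:
        return float("inf")  # not on a collision course
    return gap_m / closing_speed
```

For example, a 20 m gap closed at 10 m/s gives a TTC of 2 s; low TTC values flag potentially unsafe interactions in simulated trajectories.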
Abstract:
This paper outlines a study to determine the correlation between the LA10(18hour) indicator and other road traffic noise indicators. It is based on a database comprising 404 measurement locations and 947 individual days of valid noise measurements, taken across a wide range of circumstances between November 2001 and November 2007. The paper first discusses the need for, and constraints on, the indicators, and the problem of matching a suitable indicator to the various dynamic characteristics of road traffic noise. It then presents a statistical analysis of the road traffic noise monitoring data, correlating various indicators with the LA10(18hour) statistical indicator, and provides a comprehensive table of linear correlations, with an extended analysis of relationships across the night-time period. The paper concludes with a discussion of the findings.
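Linear correlations of this kind can be reproduced from paired daily indicator values. A minimal sketch with NumPy, using entirely hypothetical dB values (the paper's 947-day database is not reproduced here):

```python
import numpy as np

# Hypothetical daily noise indicator values (dB), for illustration only.
la10_18h = np.array([68.2, 70.1, 66.5, 71.3, 69.0])
laeq_24h = np.array([66.0, 68.3, 64.1, 69.5, 66.9])

# Pearson correlation coefficient between the two indicators.
r = np.corrcoef(la10_18h, laeq_24h)[0, 1]

# Least-squares line LA10(18h) ~ a * LAeq(24h) + b, as used for
# converting between indicators once the correlation is established.
a, b = np.polyfit(laeq_24h, la10_18h, 1)
```

Tabulating `r` for every indicator pair yields the kind of correlation table the paper presents.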
Raising awareness of traffic pollution: the potential benefits and problems of using a warning smell
Abstract:
Exposure to traffic pollution is increasing worldwide as people move to cities, and as more vehicles join the roads, creating longer journeys and more traffic jams. Most traffic pollutants are odourless and invisible, which hides exposure from the public. If traffic pollution had a distinctive smell it would enable people to avoid exposure, and increase the political will for difficult policy changes. A smell may also instigate longer-term changes, such as switching to active transport for school pick-ups. A smell could be added using a fuel additive or a temporary device attached to vehicle exhausts.
Abstract:
Over the last decade, researchers and legislators have struggled to get an accurate picture of the scale and nature of the problem of human trafficking. In the absence of reliable data, some anti-prostitution activists have asserted that a causal relationship exists between legalised prostitution and human trafficking. They claim that systems of legalised or decriminalised prostitution lead to increases in trafficking into the sex industry. This paper critically analyses attempts to substantiate this claim during the development of anti-trafficking policy in Australia and the United States. These attempts are explored within the context of persistent challenges in measuring the scale and nature of human trafficking. The efforts of abolitionist campaigners to use statistical evidence and logical argumentation are analysed, with a specific focus on the characterisation of demand for sexual services and systems of legalised prostitution as ‘pull’ factors fuelling an increase in sex trafficking. The extent to which policymakers sought to introduce evidence-based policy is also explored.
Abstract:
A road bridge containing disused flatbed rail wagons as the primary deck superstructure was performance tested on a low-volume, high-axle-load road in Queensland, Australia; some key results are presented in this paper. A fully laden truck with a total weight of 28.88% of the serviceability design load prescribed in the Australian bridge code was used; its wheel positions were accurately captured using a high-speed camera and synchronised with the real-time deflections and strains measured at the critical members of the flatbed rail wagons. The strains remained well below yield and indicated composite action between the reinforced concrete slab pavement and the wagon deck. A three-dimensional grillage model was developed and calibrated using the test data, which established the structural adequacy of the rail wagons and the positive contribution of the reinforced concrete slab pavement in resisting high axle loads on a single-lane bridge in the low-volume road network.
Abstract:
OBJECTIVE: Childhood-onset type 1 diabetes is associated with neurocognitive deficits, but there is limited evidence to date regarding associated neuroanatomical brain changes and their relationship to illness variables such as age at disease onset. This report examines age-related changes in volume and T2 relaxation time (a fundamental parameter of magnetic resonance imaging that reflects tissue health) across the whole brain. RESEARCH DESIGN AND METHODS: Participants were 79 individuals with type 1 diabetes (mean age 20.32 ± 4.24 years) and 50 healthy control participants (mean age 20.53 ± 3.60 years). There were no substantial group differences in socioeconomic status, sex ratio, or intelligence quotient. RESULTS: Regression analyses revealed a negative correlation between age and brain changes, with decreasing gray matter volume and T2 relaxation time with age in multiple brain regions in the type 1 diabetes group. In comparison, the age-related decline in the control group was small. Examination of the interaction of group and age confirmed a group difference (type 1 diabetes vs. control) in the relationship between age and brain volume/T2 relaxation time. CONCLUSIONS: We demonstrated an interaction between age and group in predicting brain volumes and T2 relaxation time, such that there was a decline in these outcomes in participants with type 1 diabetes that was much less evident in control subjects. Findings suggest that the neurodevelopmental pathways of youth with type 1 diabetes have diverged from those of their healthy peers by late adolescence and early adulthood, but the explanation for this phenomenon remains to be clarified.
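A group-by-age interaction of the kind reported is typically estimated with an ordinary least-squares model containing an interaction term. A minimal sketch on simulated data (not the study's data; the effect sizes below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical data for illustration: age (years), group (0 = control,
# 1 = type 1 diabetes) and a brain outcome (e.g. regional volume).
n = 120
age = rng.uniform(15, 28, n)
group = rng.integers(0, 2, n)
outcome = 100 - 0.2 * age - 1.5 * age * group + rng.normal(0, 2, n)

# Design matrix: intercept, age, group, and the age x group interaction.
X = np.column_stack([np.ones(n), age, group, age * group])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
# beta[3] estimates the group difference in the age slope; a non-zero
# value corresponds to the diverging age trajectories reported above.
```

In practice this would be fitted with a dedicated statistics package that also reports standard errors and p-values for the interaction term.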
Abstract:
The greatly increased risk of being killed or injured in a car crash for the young novice driver has been recognised in the road safety and injury prevention literature for decades. Risky driving behaviour has consistently been found to contribute to traffic crashes, and researchers have devised a number of instruments to measure it. One tool developed specifically to measure the risky behaviour of young novice drivers is the Behaviour of Young Novice Drivers Scale (BYNDS) (Scott-Parker et al., 2010). The BYNDS consists of 44 items comprising five subscales: transient violations, fixed violations, misjudgement, risky driving exposure, and driving in response to mood. The factor structure of the BYNDS has not been examined since its development in a matched sample of 476 novice drivers aged 17-25 years. Method: The current research attempted to refine the BYNDS and explore its relationship with the self-reported crash and offence involvement and driving intentions of 390 drivers aged 17-25 years (M = 18.23, SD = 1.58) in Queensland, Australia, during their first six months of independent driving on a Provisional (intermediate) driver's licence. A confirmatory factor analysis was undertaken to examine the fit of the originally proposed BYNDS measurement model. Results: The model was not a good fit to the data. A number of iterations removed items with low factor loadings, resulting in a 36-item revised BYNDS which was a good fit to the data and highly internally consistent. Crashes were associated with fixed violations, risky driving exposure, and misjudgement; offences were moderately associated with risky driving exposure and transient violations; and road-rule compliance intentions were highly associated with transient violations. Conclusions: Applications of the BYNDS in other young novice driver populations will further explore the factor structure of both the original and revised BYNDS. The relationships between BYNDS subscales and self-reported risky behaviour and attitudes can also inform countermeasure development, such as targeting young novice driver non-compliance through enforcement and education initiatives.
Abstract:
The impact of weather on traffic and driver behaviour is not well studied in the literature, primarily due to a lack of integrated traffic and weather data. Weather can significantly affect traffic, and traffic management measures developed for fine weather might not be optimal for adverse weather. Simulation is an efficient tool for analysing traffic management measures even before their actual implementation. Therefore, in order to develop and test traffic management measures for adverse weather conditions, we first need to analyse the effect of weather on fundamental traffic parameters and then calibrate the simulation model parameters so that traffic under adverse weather conditions can be simulated. In this paper we first analyse the impact of weather on motorway traffic flow and driver behaviour using traffic data from Swiss motorways and weather data from MeteoSuisse. We then develop a methodology to calibrate a microscopic simulation model with the aim of using it to simulate traffic under adverse weather conditions. The study is performed using AIMSUN, a microscopic traffic simulator.
Abstract:
Road traffic noise affects the quality of life in areas adjoining the road. Its effects on people are wide ranging and may include sleep disturbance and a negative impact on work efficiency. To address the problem of traffic noise, it is necessary to estimate the noise level. A number of noise estimation models have been developed that can estimate noise at receptor points based on simple configurations of buildings. In the real world, however, multiple buildings form a built-up area, and it is almost impossible to account for every diffraction and reflection in sound propagation from the source to the receptor point. An engineering solution to this real-world problem is needed to estimate noise levels in built-up areas.
Abstract:
Traffic-generated semi-volatile and non-volatile organic compounds (SVOCs and NVOCs) pose a serious threat to human and ecosystem health when washed off into receiving water bodies by stormwater. Rainfall characteristics influenced by climate change make the estimation of these pollutants in stormwater quite complex. The research study discussed in this paper developed a prediction framework for such pollutants under the dynamic influence of climate change on rainfall characteristics. It was established through principal component analysis (PCA) that the intensities and durations of low to moderate rain events induced by climate change mainly affect the wash-off of SVOCs and NVOCs from urban roads. The study outcomes were able to overcome the limitations of stringent laboratory preparation of calibration matrices by extracting uncorrelated underlying factors in the data matrices through the systematic application of PCA and factor analysis (FA). Based on the initial findings from PCA and FA, the framework incorporated an orthogonal rotatable central composite experimental design to set up calibration matrices, and partial least squares regression to identify significant variables in predicting the target SVOCs and NVOCs in four particulate fractions ranging from >300 to 1 μm and one dissolved fraction of <1 μm. For the particulate fractions in the >300 to 1 μm range, similar distributions of predicted and observed concentrations of the target compounds from the minimum to the 75th percentile were achieved, and the inter-event coefficients of variation were 5% to 25%. The limited solubility of the target compounds in stormwater restricted the predictive capacity of the proposed method for the dissolved fraction of <1 μm.
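The factor-extraction step described, using PCA to obtain uncorrelated underlying factors from a data matrix, can be sketched with NumPy alone; the matrix below is a hypothetical placeholder for the study's rainfall and pollutant measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical wash-off data matrix: rows = rain events, columns =
# measured variables (intensity, duration, pollutant loads, ...).
X = rng.normal(size=(30, 6))
X[:, 1] = 0.8 * X[:, 0] + 0.2 * X[:, 1]  # induce some correlation

# Standardise each variable, then extract principal components via SVD.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)

explained = s**2 / np.sum(s**2)  # variance ratio per component
scores = Xs @ Vt.T               # uncorrelated factor scores per event
```

The leading components (largest `explained` values) play the role of the uncorrelated underlying factors that the framework feeds into the subsequent regression step.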
Abstract:
Traffic-related emissions have been recognised as one of the main sources of air pollutants. In the research study discussed in this paper, the variability of atmospheric total suspended particulate matter (TSP), polycyclic aromatic hydrocarbon (PAH) and heavy metal (HM) concentrations with traffic and land use characteristics during weekdays and weekends was investigated. Data for the study were collected from a range of sampling sites to ensure a wide mix of traffic and land use characteristics. The analysis confirmed that zinc has the highest concentration in the atmospheric phase during weekends as well as weekdays. Although the use of leaded gasoline was discontinued a decade ago, lead was the second most commonly detected heavy metal; this is attributed to the association of previously generated lead with roadside soil and its re-suspension to the atmosphere. Soil-related particles are the primary source of TSP and manganese in the atmosphere. The analysis further revealed that traffic is the dominant source of gas-phase PAHs during weekdays, while land-use-related sources become important contributors to atmospheric PAHs during weekends, when traffic sources are at their lowest levels.
Abstract:
Volume measurements are useful in many branches of science and medicine. They are usually accomplished by acquiring a sequence of cross-sectional images through the object using an appropriate scanning modality, for example x-ray computed tomography (CT), magnetic resonance (MR) or ultrasound (US). In the cases of CT and MR, a dividing cubes algorithm can be used to describe the surface as a triangle mesh. However, such algorithms are not suitable for US data, especially when the image sequence is multiplanar (as it usually is). This problem may be overcome by manually tracing regions of interest (ROIs) on the registered multiplanar images and connecting the points into a triangular mesh. In this paper we describe and evaluate a new discrete form of Gauss' theorem which enables the calculation of the volume of any enclosed surface described by a triangular mesh. The volume is calculated by summing, over all surface triangles, the scalar product of the centroid with the area-weighted normal. The algorithm was tested on computer-generated objects, US-scanned balloons, livers and kidneys, and CT-scanned clay rocks. The results, expressed as the mean percentage difference ± one standard deviation, were 1.2 ± 2.3, 5.5 ± 4.7, 3.0 ± 3.2 and −1.2 ± 3.2% for balloons, livers, kidneys and rocks respectively. These results compare favourably with other volume estimation methods such as planimetry and tetrahedral decomposition.
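The per-triangle summation described above can be written down directly. A minimal sketch, assuming a closed mesh with consistently outward-oriented triangles (the function and variable names are illustrative, not the paper's):

```python
import numpy as np

def mesh_volume(vertices, triangles):
    """Volume of a closed triangle mesh via a discrete form of Gauss'
    (divergence) theorem: V = (1/3) * sum over triangles of c . (A n),
    where c is the triangle centroid and A n its area-weighted outward
    normal (half the cross product of two edge vectors)."""
    v = np.asarray(vertices, dtype=float)
    total = 0.0
    for i0, i1, i2 in triangles:
        p0, p1, p2 = v[i0], v[i1], v[i2]
        centroid = (p0 + p1 + p2) / 3.0
        area_normal = 0.5 * np.cross(p1 - p0, p2 - p0)  # A * n
        total += np.dot(centroid, area_normal)
    return total / 3.0
```

Because the integrand x·n is linear over each planar face, the summation is exact for polyhedra; on a unit right tetrahedron it recovers the exact volume of 1/6.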
Abstract:
A system is described for calculating volume from a sequence of multiplanar 2D ultrasound images. Ultrasound images are captured using a video digitising card (Hauppauge Win/TV card) installed in a personal computer, and regions of interest are transformed into 3D space using position and orientation data obtained from an electromagnetic device (Polhemus Fastrak). The accuracy of the system was assessed by scanning 10 water-filled balloons (13-141 ml), 10 kidneys (147-200 ml) and 16 fetal livers (8-37 ml) in water using an Acuson 128XP/10 (5 MHz curvilinear probe). Volume was calculated using the ellipsoid, planimetry, tetrahedral and ray tracing methods and compared with the actual volume measured by weighing (balloons) and water displacement (kidneys and livers). The mean percentage error for the ray tracing method was 0.9 ± 2.4%, 2.7 ± 2.3% and 6.6 ± 5.4% for balloons, kidneys and livers, respectively. So far the system has been used clinically to scan fetal livers and lungs, neonate brain ventricles and adult prostate glands.
Abstract:
A new system is described for estimating volume from a series of multiplanar 2D ultrasound images. Ultrasound images are captured using a personal computer video digitizing card and an electromagnetic localization system is used to record the pose of the ultrasound images. The accuracy of the system was assessed by scanning four groups of ten cadaveric kidneys on four different ultrasound machines. Scan image planes were oriented either radially, in parallel or slanted at 30° to the vertical. The cross-sectional images of the kidneys were traced using a mouse and the outline points transformed to 3D space using the Fastrak position and orientation data. Points on adjacent region of interest outlines were connected to form a triangle mesh and the volume of the kidneys estimated using the ellipsoid, planimetry, tetrahedral and ray tracing methods. There was little difference between the results for the different scan techniques or volume estimation algorithms, although, perhaps as expected, the ellipsoid results were the least precise. For radial scanning and ray tracing, the mean and standard deviation of the percentage errors for the four different machines were as follows: Hitachi EUB-240, −3.0 ± 2.7%; Tosbee RM3, −0.1 ± 2.3%; Hitachi EUB-415, 0.2 ± 2.3%; Acuson, 2.7 ± 2.3%.
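The step of transforming traced outline points into 3D space using the recorded pose can be sketched as a rigid-body transform. The function below is an illustrative reconstruction, assuming the pose has already been converted to a rotation matrix and translation vector (names and interface are assumptions, not the system's actual code):

```python
import numpy as np

def roi_to_world(pixels, pixel_size_mm, rotation, translation_mm):
    """Map 2D ROI outline points traced on an ultrasound image into 3D
    world coordinates, given the image pose from a spatial locator as a
    3x3 rotation matrix and a translation vector (mm).
    `pixels` is an (N, 2) array of (column, row) image coordinates."""
    px = np.asarray(pixels, dtype=float)
    # Scale image coordinates to mm and embed them in the image plane (z = 0).
    in_plane = np.column_stack([px * pixel_size_mm, np.zeros(len(px))])
    # Rigid-body transform into the world frame.
    return in_plane @ np.asarray(rotation).T + np.asarray(translation_mm)
```

Applying this to every traced outline yields the 3D point sets that are then connected into the triangle mesh used for volume estimation.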