927 results for National Research Council Canada
Abstract:
Statisticians, along with other scientists, have made significant computational advances that enable the estimation of formerly intractable statistical models. The Bayesian inference framework, combined with Markov chain Monte Carlo estimation methods such as the Gibbs sampler, enables the estimation of discrete choice models such as the multinomial logit (MNL) model. MNL models are frequently applied in transportation research to model choice outcomes such as mode, destination, or route choices, or to model categorical outcomes such as crash outcomes. Recent developments allow for the modification of potentially limiting assumptions of MNL, such as the independence from irrelevant alternatives (IIA) property. However, relatively little transportation-related research has focused on Bayesian MNL models, the tractability of which is of great value to researchers and practitioners alike. This paper addresses MNL model specification issues in the Bayesian framework, such as the value of including prior information on parameters, allowing for nonlinear covariate effects, and extensions to random parameter models, thus relaxing the usual limiting IIA assumption. This paper also provides an example, using route-choice data, that demonstrates the considerable potential of the Bayesian MNL approach for many transportation applications. The paper concludes with a discussion of the pros and cons of this Bayesian approach and identifies when its application is worthwhile.
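As a concrete illustration of the model family discussed above: the MNL choice probability is a softmax over linear-in-parameters utilities. The sketch below uses made-up route attributes and taste parameters and shows only the classical likelihood kernel, not the paper's Bayesian estimation machinery.

```python
import numpy as np

def mnl_probabilities(X, beta):
    """MNL choice probabilities: softmax of systematic utilities V = X @ beta."""
    v = X @ beta
    v = v - v.max()          # shift utilities for numerical stability
    expv = np.exp(v)
    return expv / expv.sum()

# Three hypothetical routes described by (travel time, toll cost);
# the negative taste parameters are illustrative, not estimated from data.
X = np.array([[20.0, 2.5],
              [25.0, 1.5],
              [15.0, 4.0]])
beta = np.array([-0.1, -0.5])
p = mnl_probabilities(X, beta)   # probabilities over the three routes, summing to 1
```

In a Bayesian treatment, a prior on `beta` would be combined with this likelihood and explored by MCMC rather than maximized directly.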
Abstract:
Graduated licensing schemes have been found to reduce the crash risk of young novice drivers, but there is less evidence of their success with novice motorcycle riders. This study examined the riding experience of a sample of Australian learner-riders to establish the extent and variety of their riding practice during the learner stage. Riders completed an anonymous questionnaire at a compulsory rider-training course for the licensing test. The majority of participants were male (81%) with an average age of 33 years. They worked full time (81%), held an unrestricted driver's license (81%), and owned the motorcycle that they rode (79%). These riders had held their learner's license for an average of 6 months. On average, they rode 6.4 h/week. By the time they attempted the rider-licensing test, they had ridden a total of 101 h. Their total hours of on-road practice were comparable to those of learner-drivers at the same stage of licensing, but they had less experience in adverse or challenging road conditions. A substantial proportion had little or no experience of riding in the rain (57%), at night (36%), in heavy traffic (22%), on winding rural roads (52%), or on high-speed roads (51%). These findings highlight the differences in the learning processes between unsupervised novice motorcycle riders and supervised novice drivers. Further research is necessary to clarify whether specifying the conditions under which riders should practice during the graduated licensing process would likely reduce or increase their crash risk.
Abstract:
Characteristics of the road infrastructure affect both the popularity of bicycling and its safety, but comparisons of the safety performance of infrastructure may be confounded by differences in the profiles of cyclists who use them. Data from a survey of 2,532 adult bicycle riders in Queensland, Australia, demonstrated that many riders rode reluctantly in particular locations and that preference for riding location was influenced by degree of experience and riding purpose. Most riders rode most often and furthest per week on urban roads, but approximately one-third of all riders (and more new riders) rode there reluctantly. Almost two-thirds of riders rode on bicycle paths, most by choice, not reluctantly. New riders rode proportionally more on bicycle paths, but continuing riders rode further in absolute terms. Utilitarian riders were more likely to ride on bicycle paths than social and fitness riders and almost all of this riding was by choice. Fitness riders were more reluctant in their use of bicycle paths, but still most of their use was by choice. One-third of the respondents reported riding on the sidewalk (legal in Queensland), with approximately two-thirds doing so reluctantly. The frequency and distance ridden on the sidewalk was less than for urban roads and bicycle paths. Sidewalks and bicycle paths were important facilities for both inexperienced and experienced riders and for utilitarian riding, especially when urban roads were considered a poor choice for cycling.
Abstract:
Achieving sustainability is one of the major goals of many urban transportation systems. Over the years, many innovative policies have been attempted to achieve an efficient, safe, and sustainable transport system. These policies often require smart technologies to assist the implementation process and enhance effectiveness. This paper discusses how sustainability can be promoted by embedding smart technologies in a modern transportation system. In particular, it studies the transport system of Singapore to show how that system addresses sustainability through the use of smart technologies. Various technological initiatives in managing traffic flow, monitoring and enforcement, sharing real-time information, and managing revenues are discussed in light of their potential to address sustainability issues. The Singapore experience provides a useful reference for cities intending to develop and promote a sustainable transport system.
Abstract:
Deterministic transit capacity analysis applies to the planning, design, and operational management of urban transit systems. The Transit Capacity and Quality of Service Manual (1) and Vuchic (2, 3) enable transit performance to be quantified and assessed using transit capacity and productive capacity. This paper further defines important productive performance measures of an individual transit service and a transit line. Transit work (p-km) captures the transit task performed over distance. Passenger transmission (p-km/h) captures the passenger task delivered by a service at speed. Transit productiveness (p-km/h) captures transit work performed over time. These measures help operators understand their services' or systems' capabilities and passenger quality of service. This paper accounts for variability in the demand utilized by passengers along a line and for high passenger load conditions in which passenger pass-up delay occurs. A hypothetical case study of an individual bus service's operation demonstrates the usefulness of passenger transmission in comparing existing and growth scenarios. A hypothetical case study of a bus line's operation during a peak hour window demonstrates the theory's usefulness in examining the contribution of individual services to line productive performance. The theory may be used to benchmark or compare lines and segments, assess conditions, or consider improvements.
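Under one plausible reading of the measures defined above (the abstract gives units but not formulas), transit work accumulates passenger load over distance, and productiveness divides that work by service time:

```python
def transit_work(loads, seg_lengths_km):
    """Transit work (p-km): passenger load carried across each segment, summed.
    One plausible formulation; the paper's exact definitions are not shown here."""
    return sum(p * d for p, d in zip(loads, seg_lengths_km))

def productiveness(work_pkm, service_hours):
    """Transit productiveness (p-km/h): transit work performed per hour of service."""
    return work_pkm / service_hours

# Hypothetical bus trip: loads of 40, 55, 30 passengers over 2, 3, 5 km segments
work = transit_work([40, 55, 30], [2.0, 3.0, 5.0])   # 395 p-km
rate = productiveness(work, 0.5)                      # 790 p-km/h over a half-hour trip
```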
Importance of a resilient air services network to Australian remote, rural, and regional communities
Abstract:
Rural, regional, and remote settlements in Australia require resilient infrastructure to remain sustainable in a context characterized by frequent large-scale natural disasters, long distances between urban centers, and the pressures of economic change. A critical aspect of this infrastructure is the air services network, a system of airports, aircraft operators, and related industries that enables the high-speed movement of people, goods, and services to remote locations. A process of deregulation during the 1970s and 1980s resulted in many of these airports passing into local government and private ownership, and the rationalization of the industry saw the closure of a number of airlines and airports. This paper examines the impacts of deregulation on the resilience of air services and the contribution that they make to regional and rural communities. In particular, the robustness, redundancy, resourcefulness, and rapidity of the system are examined. The conclusion is that while the air services network has remained resilient in a situation of considerable change, the pressures of commercialization and the tendency to manage aspects of the system in isolation have contributed to a potential decrease in overall resilience.
Abstract:
For the evaluation, design, and planning of traffic facilities and measures, traffic simulation packages are the de facto tools for consultants, policy makers, and researchers. However, the available commercial simulation packages do not always offer the desired work flow and flexibility for academic research. In many cases, researchers resort to designing and building their own dedicated models, without an intrinsic incentive (or the practical means) to make the results available in the public domain. To make matters worse, a substantial part of these efforts pertains to rebuilding basic functionality and, in many respects, reinventing the wheel. This problem not only affects the research community but also adversely affects the entire traffic simulation community and frustrates the development of traffic simulation in general. To address this problem, this paper describes an open source approach, OpenTraffic, which is being developed as a collaborative effort between the Queensland University of Technology, Australia; the National Institute of Informatics, Tokyo; and the Technical University of Delft, the Netherlands. The OpenTraffic simulation framework enables academics from different geographic areas and disciplines within the traffic domain to work together and contribute to a specific topic of interest, ranging from travel choice behavior to car following, and from response to intelligent transportation systems to activity planning. The modular approach enables users of the software to focus on their area of interest, whereas other functional modules can be regarded as black boxes. Specific attention is paid to a standardization of data inputs and outputs for traffic simulations. Such standardization will allow the sharing of data with many existing commercial simulation packages.
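The modular, black-box idea described above can be sketched as a plug-in interface. The class names, method signature, and parameters below are hypothetical illustrations of the pattern, not OpenTraffic's actual API:

```python
from abc import ABC, abstractmethod
import math

class CarFollowingModel(ABC):
    """Hypothetical module contract: any car-following model that implements
    acceleration() can be swapped into the simulation loop as a black box."""
    @abstractmethod
    def acceleration(self, gap_m: float, v_self: float, v_leader: float) -> float:
        ...

class IDMLike(CarFollowingModel):
    """An Intelligent-Driver-Model-style behaviour as one interchangeable module.
    Parameter values are illustrative defaults."""
    def __init__(self, v0=30.0, T=1.5, a=1.0, b=2.0, s0=2.0):
        self.v0, self.T, self.a, self.b, self.s0 = v0, T, a, b, s0

    def acceleration(self, gap_m, v_self, v_leader):
        dv = v_self - v_leader
        s_star = self.s0 + v_self * self.T + v_self * dv / (2 * math.sqrt(self.a * self.b))
        return self.a * (1 - (v_self / self.v0) ** 4 - (s_star / gap_m) ** 2)

model: CarFollowingModel = IDMLike()
acc_free = model.acceleration(gap_m=200.0, v_self=10.0, v_leader=10.0)   # large gap: accelerate
acc_close = model.acceleration(gap_m=5.0, v_self=10.0, v_leader=10.0)    # short gap: brake
```

The simulation core depends only on the abstract contract, which is what lets other modules be treated as black boxes.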
Abstract:
Long traffic queues on off-ramps significantly compromise the safety and throughput of motorways. Obtaining accurate queue information is crucial for countermeasure strategies. However, it is challenging to estimate traffic queues with locally installed inductive loop detectors. This paper deals with the problem of queue estimation through the interpretation of queuing dynamics and the corresponding time-occupancy distribution over motorway off-ramps. A novel algorithm for real-time queue estimation with two detectors is presented and discussed. Results derived from microscopic traffic simulation validated the effectiveness of the algorithm and revealed some of its useful features: (a) long and intermediate traffic queues could be accurately measured, (b) relatively simple detector input (i.e., time occupancy) was required, and (c) the estimation philosophy was independent of signal-timing changes, giving it the potential to cooperate with advanced signal control strategies. Some issues concerning field implementation are also discussed.
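To make the two-detector idea concrete, here is a deliberately simplified heuristic, not the paper's algorithm: sustained high time occupancy at a detector indicates standing traffic over it, so comparing the occupancies of a downstream and an upstream detector gives a coarse estimate of where the queue tail lies. The jam-occupancy threshold and the linear interpolation are assumptions for illustration only.

```python
def queue_tail_between_detectors(occ_down, occ_up, x_down_m, x_up_m, occ_jam=0.6):
    """Crude off-ramp queue-extent sketch from two loop detectors' time occupancies.
    x_down_m / x_up_m are detector distances from the stop line; occ_jam is an
    assumed occupancy level indicating standing traffic (illustrative value)."""
    if occ_up >= occ_jam:
        return x_up_m        # queue has reached (or passed) the upstream detector
    if occ_down < occ_jam:
        return 0.0           # no standing queue even at the downstream detector
    # queue tail lies between the detectors: interpolate on the occupancy gradient
    frac = (occ_down - occ_jam) / max(occ_down - occ_up, 1e-9)
    return x_down_m + frac * (x_up_m - x_down_m)

# e.g. downstream detector saturated, upstream nearly free-flowing
tail = queue_tail_between_detectors(0.9, 0.1, 0.0, 200.0)   # tail somewhere mid-ramp
```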
Traffic queue estimation for metered motorway on-ramps through use of loop detector time occupancies
Abstract:
The primary objective of this study is to develop a robust queue estimation algorithm for motorway on-ramps. Real-time queue information is a vital input for dynamic queue management on metered on-ramps. Accurate and reliable queue information enables the on-ramp queue to be managed adaptively to the actual queue size, and thus minimises the adverse impacts of queue flush while increasing the benefit of ramp metering. The proposed algorithm is developed within the Kalman filter framework. The fundamental conservation model is used to predict the system state (queue size) from the flow-in and flow-out measurements. The projected state is then updated through the measurement equation, using the time occupancies from mid-link and link-entrance loop detectors. This study also proposes a novel single-point correction method, which resets the estimated system state to eliminate the counting errors that accumulate over time. In the performance evaluation, the proposed algorithm demonstrated accurate and reliable performance and consistently outperformed the benchmark Single Occupancy Kalman Filter (SOKF) method. The improvements over SOKF average 62% and 63% in terms of estimation accuracy (MAE) and reliability (RMSE), respectively. The benefit of the algorithm's innovative concepts is well justified by the improved estimation performance in congested ramp traffic conditions, where long queues may significantly compromise the benchmark algorithm's performance.
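The predict-update structure described above can be sketched as a scalar Kalman filter: vehicle conservation supplies the prediction, and an occupancy-derived queue measurement supplies the correction. The noise variances and numbers below are illustrative, not the paper's calibrated values, and the single-point correction step is omitted.

```python
def kalman_queue_step(q, P, flow_in, flow_out, z_occ, Q=4.0, R=25.0):
    """One scalar Kalman-filter step for on-ramp queue size (vehicles).
    q, P   : previous queue estimate and its variance
    z_occ  : queue size inferred from loop-detector time occupancies
    Q, R   : assumed process / measurement noise variances (illustrative)."""
    # predict: vehicle conservation over the metering interval
    q_pred = q + flow_in - flow_out
    P_pred = P + Q
    # update: blend the prediction with the occupancy-based measurement
    K = P_pred / (P_pred + R)
    return q_pred + K * (z_occ - q_pred), (1 - K) * P_pred

q_est, P_est = 10.0, 9.0
q_est, P_est = kalman_queue_step(q_est, P_est, flow_in=5, flow_out=3, z_occ=14.0)
```

The gain K automatically weights the conservation prediction against the occupancy measurement according to their assumed reliabilities, which is what lets the filter absorb noisy detector counts.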
Abstract:
This report is one of a series of products resulting from a National Health and Medical Research Council (NHMRC) Urgent Research Grant – Pandemic Influenza [No 409973]. The research targeted two key aspects of planning and preparedness for a human influenza pandemic, namely:
Abstract:
In 2009, the National Research Council of the National Academies released a report on A New Biology for the 21st Century. The council preferred the term ‘New Biology’ to capture the convergence and integration of the various disciplines of biology. The National Research Council stressed: ‘The essence of the New Biology, as defined by the committee, is integration—re-integration of the many sub-disciplines of biology, and the integration into biology of physicists, chemists, computer scientists, engineers, and mathematicians to create a research community with the capacity to tackle a broad range of scientific and societal problems.’ They define the ‘New Biology’ as ‘integrating life science research with physical science, engineering, computational science, and mathematics’. The National Research Council reflected: 'Biology is at a point of inflection. Years of research have generated detailed information about the components of the complex systems that characterize life––genes, cells, organisms, ecosystems––and this knowledge has begun to fuse into greater understanding of how all those components work together as systems. Powerful tools are allowing biologists to probe complex systems in ever greater detail, from molecular events in individual cells to global biogeochemical cycles. Integration within biology and increasingly fruitful collaboration with physical, earth, and computational scientists, mathematicians, and engineers are making it possible to predict and control the activities of biological systems in ever greater detail.' The National Research Council contended that the New Biology could address a number of pressing challenges. First, it stressed that the New Biology could ‘generate food plants to adapt and grow sustainably in changing environments’. Second, the New Biology could ‘understand and sustain ecosystem function and biodiversity in the face of rapid change’. Third, the New Biology could ‘expand sustainable alternatives to fossil fuels’. 
Moreover, it was hoped that the New Biology could lead to a better understanding of individual health: ‘The New Biology can accelerate fundamental understanding of the systems that underlie health and the development of the tools and technologies that will in turn lead to more efficient approaches to developing therapeutics and enabling individualized, predictive medicine.’ Biological research has certainly been changing direction in response to changing societal problems. Over the last decade, increasing awareness of the impacts of climate change and dwindling supplies of fossil fuels can be seen to have generated investment in fields such as biofuels, climate-ready crops and storage of agricultural genetic resources. In considering biotechnology’s role in the twenty-first century, biotechnology forecaster Carlson’s firm Biodesic states: ‘The problems the world faces today – ecosystem responses to global warming, geriatric care in the developed world or infectious diseases in the developing world, the efficient production of more goods using less energy and fewer raw materials – all depend on understanding and then applying biology as a technology.’ This collection considers the roles of intellectual property law in regulating emerging technologies in the biological sciences. Stephen Hilgartner comments that patent law plays a significant part in social negotiations about the shape of emerging technological systems or artefacts: 'Emerging technology – especially in such hotbeds of change as the life sciences, information technology, biomedicine, and nanotechnology – became a site of contention where competing groups pursued incompatible normative visions. Indeed, as people recognized that questions about the shape of technological systems were nothing less than questions about the future shape of societies, science and technology achieved central significance in contemporary democracies.
In this context, states face ongoing difficulties trying to mediate these tensions and establish mechanisms for addressing problems of representation and participation in the sociopolitical process that shapes emerging technology.' The introduction to the collection will provide a thumbnail, comparative overview of recent developments in intellectual property and biotechnology – as a foundation to the collection. Section I of this introduction considers recent developments in United States patent law, policy and practice with respect to biotechnology – in particular, highlighting the Myriad Genetics dispute and the decision of the Supreme Court of the United States in Bilski v. Kappos. Section II considers the cross-currents in Canadian jurisprudence in intellectual property and biotechnology. Section III surveys developments in the European Union – and the interpretation of the European Biotechnology Directive. Section IV focuses upon Australia and New Zealand, and considers the policy responses to the controversy of Genetic Technologies Limited’s patents in respect of non-coding DNA and genomic mapping. Section V outlines the parts of the collection and the contents of the chapters.
Abstract:
Drivers behave in different ways, and these different behaviors are a cause of traffic disturbances. A key objective for simulation tools is to correctly reproduce this variability, in particular for car-following models. From data collection to the sampling of realistic behaviors, a chain of key issues must be addressed. This paper discusses data filtering, robustness of calibration, correlation between parameters, and sampling techniques of acceleration-time continuous car-following models. The robustness of calibration is systematically investigated with an objective function that allows confidence regions around the minimum to be obtained. Then, the correlation between sets of calibrated parameters and the validity of the joint distributions sampling techniques are discussed. This paper confirms the need for adapted calibration and sampling techniques to obtain realistic sets of car-following parameters, which can be used later for simulation purposes.
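The calibration step described above can be illustrated on a deliberately simple one-parameter model: the follower's speed relaxes toward the leader's speed at rate k. This toy model and the grid-search objective stand in for the richer acceleration-time car-following models and calibration machinery discussed in the paper; the data are synthetic.

```python
import numpy as np

def simulate(k, v0, v_lead, dt=0.1):
    """First-order linear car-following: follower speed relaxes toward the
    leader's speed at rate k (a stand-in for a full car-following model)."""
    v = np.empty_like(v_lead)
    v[0] = v0
    for t in range(len(v_lead) - 1):
        v[t + 1] = v[t] + dt * k * (v_lead[t] - v[t])
    return v

# Synthetic "observed" follower trajectory generated with a known k = 0.8,
# plus small measurement noise, then recovered by minimising squared error.
rng = np.random.default_rng(0)
v_lead = 15.0 + np.sin(np.linspace(0.0, 6.0, 200))
v_obs = simulate(0.8, 12.0, v_lead) + rng.normal(0.0, 0.02, 200)

ks = np.linspace(0.1, 2.0, 191)          # grid-search calibration
sse = np.array([np.sum((simulate(k, 12.0, v_lead) - v_obs) ** 2) for k in ks])
k_hat = ks[int(np.argmin(sse))]          # should land near the true k = 0.8
```

Plotting `sse` against `ks` gives exactly the kind of objective-function surface from which confidence regions around the minimum can be read off.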
Abstract:
Studies on the swelling behaviour of mixtures of bentonite clay and nonswelling coarser fractions of different sizes and shapes reveal that observable swelling occurs only after the voids between the nonswelling particles are filled with swollen clay particles. The magnitude of the swell within the voids, called intervoid swelling, is large when the size and percentage of the nonswelling coarser fraction are large. The observable swell after intervoid swelling, called primary swelling, follows a rectangular hyperbolic relationship with time. The total swell per gram of clay decreases with an increase in the size of the nonswelling fraction and with a decrease in the percentage of swelling clay. Time-swell relationships show that swelling continues for a long time after the primary swelling; this is called secondary swelling.
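A rectangular hyperbolic time-swell relationship is commonly written as S(t) = t / (a + b·t), so that t/S(t) is linear in t and 1/b gives the ultimate primary swell. The constants below are illustrative, not values from the study:

```python
# Rectangular hyperbola for primary swelling: S(t) = t / (a + b*t).
# With this form, plotting t/S against t gives a straight line with
# intercept a and slope b, and the swell approaches 1/b as t grows.
a, b = 2.0, 0.05            # illustrative fitting constants, not from the study

def primary_swell(t):
    return t / (a + b * t)

early = primary_swell(1.0)        # small swell shortly after loading
late = primary_swell(1000.0)      # close to the asymptote
ultimate = 1.0 / b                # asymptotic primary swell (same units as S)
```

In practice a and b would be obtained by a straight-line fit of t/S versus t from time-swell test data.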