Abstract:
Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance. Capacity may be predicted as the maximum uncongested flow achievable. Although they are effective tools for design and analysis, macroscopic models lack an understanding of the nature of the processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes whilst providing for the assessment of performance through measures of capacity and delay. However, these models are limited to only a few circumstances. The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to accurately portray the mechanisms of freeway operations at the specific locations under consideration, to be calibrated using data acquired at those locations, and to have their outputs validated against data acquired at the same sites, so that the outputs would be truly descriptive of the performance of the facility. A theoretical basis needed to underlie the form of these models, rather than the empiricism on which the macroscopic models currently in use rest, and the models needed to be adaptable to variable operating conditions so that they might be applied, where possible, to other similar systems and facilities. It was not possible, in this single study, to produce a stand-alone model applicable to all facilities and locations; however, the scene has been set for the application of the models to a much broader range of operating conditions. Opportunities for further development of the models were identified, and procedures provided for their calibration and validation across a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour observed in Brisbane was applied to the models; different mechanisms are likely in other locations owing to variability in road rules and driving cultures. Not all observed manoeuvres were modelled; some unusual manoeuvres were judged not to warrant modelling. However, the models developed contain the principal processes of freeway operations: merging and lane changing. Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream, the kerb lane traffic, exercises only a limited priority over the minor stream, the on-ramp traffic. Theory was established to account for this behaviour. Kerb lane drivers were also found to change to the median lane where possible to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major stream flow rate which excludes lane changers. Cowan's M3 model was calibrated for both streams; on-ramp and total upstream flows are required as input. Relationships between the proportion of headways greater than 1 s and flow differed between on-ramps fed by signalised intersections and those fed by unsignalised intersections.
Constant-departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated. Critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited priority capacity and other boundary relationships were established by Troutbeck (1995). The minimum average minor stream delay and the corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor and major stream delays across all minor and major stream flows, and pseudo-empirical relationships were established to predict average delays. Major stream average delays are limited to 0.5 s, insignificant compared with minor stream delays, which reach infinity at capacity. Minor stream delays were shown to be smaller when unsignalised intersections, rather than signalised intersections, are located upstream of on-ramps, and smaller still when ramp metering is installed. Smaller delays correspond to improved merge area performance. A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model, and merging probabilities can be predicted for given taper lengths, a most useful performance measure. This model was also shown to be applicable to lane changing. Tolerable limits to merging probabilities require calibration; from these, practical capacities can be estimated. Further calibration of the traffic inputs, critical gap and minimum follow-on time is required for both merging and lane changing, and a general relationship to predict the proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models in assessing performance, and to provide further insight into the nature of operations.
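As a concrete illustration of the headway and gap-acceptance machinery this abstract describes, the following is a minimal Python sketch of Cowan's M3 headway model and a simple merge-capacity count. The flow rate, critical gap (2 s) and follow-on time (1.1 s) are illustrative assumptions, and the absolute-priority count is a simplification of the limited-priority formulation the study actually uses.

```python
import numpy as np

# Minimal sketch of Cowan's M3 headway model and a simple gap-acceptance
# capacity count. Flow, critical gap and follow-on values are illustrative.

def sample_m3_headways(q, alpha, delta, n, rng):
    """Sample n headways (s) from Cowan's M3 model.

    q     -- major stream flow (veh/s)
    alpha -- proportion of free headways greater than delta
    delta -- minimum headway (s), 1 s in the study
    """
    lam = alpha * q / (1.0 - delta * q)      # decay rate of free headways
    free = rng.random(n) < alpha
    h = np.full(n, delta)                    # bunched vehicles sit at delta
    h[free] = delta + rng.exponential(1.0 / lam, free.sum())
    return h

def entries_per_gap(h, t_c, t_f):
    """On-ramp vehicles absorbed by each major gap (absolute priority)."""
    return np.where(h >= t_c, np.floor((h - t_c) / t_f) + 1, 0)

rng = np.random.default_rng(1)
q = 1200 / 3600.0                            # kerb lane flow, veh/s
h = sample_m3_headways(q, alpha=0.7, delta=1.0, n=100_000, rng=rng)
cap = entries_per_gap(h, t_c=2.0, t_f=1.1).mean() * q * 3600
print(f"mean headway {h.mean():.2f} s, merge capacity ~{cap:.0f} veh/h")
```

A quick sanity check on the sampler: the mean headway should come out as 1/q (3 s at 1200 veh/h), since the bunched mass at delta and the shifted exponential tail reproduce the stream's flow rate.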
Abstract:
The ultimate goal of an authorisation system is to allocate each user the level of access they need to complete their job, no more and no less. This proves challenging in an organisational setting: on the one hand, employees need enough access to perform their tasks; on the other, more access brings an increasing risk of misuse, either intentional, where an employee uses the access for personal benefit, or unintentional, where an employee is careless, loses the information, or is socially engineered into giving access to an adversary. With the goal of developing a more dynamic authorisation model, we have adopted a game-theoretic framework to reason about the factors that may affect users' likelihood of misusing a permission at the time of an access decision. Game theory provides a useful but previously ignored perspective in authorisation theory: the notion of the user as a self-interested player who selects among a range of possible actions depending on their pay-offs.
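To make the game-theoretic framing concrete, here is a minimal Python sketch of an access decision that treats the user as a self-interested player. The payoff values, the audit probability, and the grant/deny rule are illustrative assumptions, not the paper's model.

```python
# Sketch of a game-theoretic access decision: the user chooses between
# proper use and misuse according to expected payoff. All numbers below
# are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class AccessGame:
    benefit_proper: float   # user's payoff for using the permission as intended
    gain_misuse: float      # user's gain from undetected misuse
    penalty: float          # sanction if misuse is detected
    p_audit: float          # probability the system audits the access

    def misuse_payoff(self) -> float:
        return (1 - self.p_audit) * self.gain_misuse - self.p_audit * self.penalty

    def rational_user_misuses(self) -> bool:
        # A self-interested player misuses only if it beats honest use.
        return self.misuse_payoff() > self.benefit_proper

def decide(game: AccessGame) -> str:
    """Grant only when a rational user is not incentivised to misuse."""
    return "deny" if game.rational_user_misuses() else "grant"

g = AccessGame(benefit_proper=1.0, gain_misuse=10.0, penalty=50.0, p_audit=0.2)
print(decide(g), f"(expected misuse payoff {g.misuse_payoff():.1f})")
```

With these numbers the expected misuse payoff is 0.8 x 10 - 0.2 x 50 = -2, below the honest payoff of 1, so access is granted; raising the misuse gain or lowering the audit probability flips the decision.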
Abstract:
Recommender systems are one of the recent inventions for dealing with ever-growing information overload. Collaborative filtering is perhaps the most popular technique in recommender systems, and its performance is promising given sufficient item-rating data. Research shows, however, that it performs very poorly in cold-start situations, where previous rating data is sparse. As an alternative, trust can be used for neighbor formation to generate automated recommendations, based on explicit trust ratings in which users state how much they trust each other. However, reliable explicit trust data is not always available. In this paper we propose a new method of developing trust networks based on users' interest similarity in the absence of explicit trust data. To identify interest similarity, we use users' personalized tagging information. This trust network can be used to find the neighbors from whom automated recommendations are made. Our experimental results show that the proposed trust-based method outperforms the traditional collaborative filtering approach, which uses users' rating data. Its performance improves even further when we utilize trust propagation techniques to broaden the range of the neighborhood.
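A minimal sketch of the proposed idea, under assumed data: interest similarity is taken as cosine similarity between users' tag-frequency vectors, sufficiently similar users are linked as trust neighbors, and one-hop propagation broadens the neighborhood. The tag counts, the 0.3 threshold, and the multiplicative propagation rule are all illustrative assumptions.

```python
# Tag-based trust network sketch: cosine similarity over tag-frequency
# vectors, thresholded into trust edges, then one-hop propagation.
# Data and parameters are illustrative placeholders.

from collections import Counter
from itertools import combinations
import math

tags = {
    "ann": Counter({"python": 4, "ml": 2, "music": 3}),
    "bob": Counter({"python": 3, "ml": 3}),
    "cat": Counter({"music": 5, "film": 2}),
    "dan": Counter({"film": 3, "music": 2, "ml": 1}),
}

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

# Direct trust edges: users with sufficiently similar tag profiles.
trust = {u: {} for u in tags}
for u, v in combinations(tags, 2):
    s = cosine(tags[u], tags[v])
    if s >= 0.3:
        trust[u][v] = trust[v][u] = s

# One-hop propagation: trust(u, w) ~ max over v of t(u, v) * t(v, w).
def propagated(u: str, w: str) -> float:
    direct = trust[u].get(w, 0.0)
    via = max((trust[u][v] * trust[v].get(w, 0.0) for v in trust[u]), default=0.0)
    return max(direct, via)

print(sorted(trust["bob"]))                 # bob's direct neighbors: ['ann']
print(round(propagated("bob", "dan"), 2))   # reachable via ann, ~0.31
```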
Abstract:
Vehicular traffic in urban areas may adversely affect urban water quality through the build-up of traffic-generated semi-volatile and non-volatile organic compounds (SVOCs and NVOCs) on road surfaces. Characterisation of the build-up processes is the key to developing mitigation measures for the removal of such pollutants from urban stormwater. An in-depth analysis of the build-up of SVOCs and NVOCs was undertaken in the Gold Coast region of Australia. Principal Component Analysis (PCA) and multicriteria decision tools such as PROMETHEE and GAIA were employed to understand SVOC and NVOC build-up under combined traffic scenarios of low, moderate and high traffic in different land uses. It was found that congestion in commercial areas and the use of lubricants and motor oils in industrial areas were the main sources of SVOCs and NVOCs on urban roads, respectively. The contribution from residential areas to the build-up of such pollutants was hardly noticeable. It was also revealed through this investigation that the target SVOCs and NVOCs were mainly attached to particulate fractions of 75 to 300 µm, whilst the redistribution of coarse fractions due to vehicle activity mainly occurred in the >300 µm size range. Lastly, under the combined traffic scenario, moderate traffic with average daily traffic ranging from 2,300 to 5,900 and an average congestion of 0.47 was found to dominate SVOC and NVOC build-up on roads.
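The PROMETHEE and GAIA steps are hard to reproduce briefly, but the PCA step can be sketched. Below is a minimal Python example of the workflow on placeholder build-up data; the site labels, loads and size-fraction columns are fabricated for illustration only and are not the Gold Coast measurements.

```python
# PCA sketch: rows are road sites under different land uses and traffic
# levels, columns are pollutant build-up measures. Values are placeholders.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

sites = ["res_low", "res_mod", "com_mod", "com_high", "ind_low", "ind_mod"]
# Columns: SVOC load, NVOC load, fine fraction (75-300 um), coarse (>300 um)
X = np.array([
    [0.2, 0.1, 0.3, 0.1],
    [0.3, 0.2, 0.4, 0.2],
    [1.1, 0.6, 0.9, 0.4],
    [1.8, 0.9, 1.2, 0.5],
    [0.7, 1.4, 0.8, 0.3],
    [0.9, 1.9, 1.0, 0.4],
])

# Standardise, then project the sites onto the first two components.
Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=2)
scores = pca.fit_transform(Z)

for site, (pc1, pc2) in zip(sites, scores):
    print(f"{site:9s} PC1={pc1:+.2f} PC2={pc2:+.2f}")
print("explained variance:", pca.explained_variance_ratio_.round(2))
```

Plotting the component scores and loadings together is what GAIA-style biplots do: sites with similar build-up signatures cluster, and the loadings show which pollutant measures drive the separation.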
Abstract:
The effects of rapid development have increased pressures on these places, exacerbated by the competition between two key industry sectors: the commercial base and tourism development. This, together with urbanisation and industrialisation, has placed high demand on the uses of these spaces. The political scenario, and the lack of attention to ecological principles and public participation in the design approach, have created severe environmental, historical and cultural constraints on the landscape character as well as the ecological system. A holistic approach to improving the landscape design process is therefore necessary to protect the human well-being and the cultural, environmental and historical values of these places. Only limited research has been carried out to overcome this situation, creating an urgent need to explore better ways to improve the landscape design process of Malaysian heritage urban river corridor developments, ways that encompass the needs and aspirations of Malaysia's multi-ethnic society without making drastic changes to the landscape character of the rivers. This paper presents a methodology for developing an advanced Landscape Character Assessment (aLCA) framework for evaluating the landscape character of these places, derived from the perceptions of two key yet oppositional stakeholders: the urban design team and the special interest public. A triangulation of subjectivist-paradigm methodologies will be employed: the psychophysical, the psychological and the phenomenological approaches. The outcome will be used to improve the present landscape design process for the future development of these places. Unless a range of perspectives can be brought to bear on enhancing the form and function of their future development and management, urban river corridors in the Malaysian context will continue to decline.
Abstract:
The Texas Transportation Commission (the Commission) is responsible for planning and making policies for the location, construction, and maintenance of a comprehensive system of highways and public roads in Texas. In order for the Commission to carry out its legislative mandate, the Texas Constitution requires that most revenue generated by motor vehicle registration fees and motor fuel taxes be used for constructing and maintaining public roadways and other designated purposes. The Texas Department of Transportation (TxDOT) assists the Commission in executing state transportation policy. It is the responsibility of the legislature to appropriate money for TxDOT's operation and maintenance expenses. All money authorized to be appropriated for TxDOT's operations must come from the State Highway Fund (also known as Fund 6, Fund 006, or Fund 0006). The Commission can then use the balance in the fund to fulfill its responsibilities. However, the value of the revenue received in Fund 6 is not keeping pace with the growing demand for transportation infrastructure in Texas. Additionally, the diversion of revenue to non-transportation uses now exceeds $600 million per year. As shown in Figure 1.1, revenues and expenditures of the State Highway Fund per vehicle mile traveled (VMT) in Texas have remained almost flat since 1993. In the meantime, construction cost inflation has exceeded 100%, effectively halving the value of expenditure.
Abstract:
Linking real-time schedulability directly to Quality of Control (QoC), the ultimate goal of a control system, this paper proposes a hierarchical feedback QoC management framework, with the Fixed Priority (FP) and Earliest Deadline First (EDF) policies as plug-ins, for real-time control systems with multiple control tasks. The framework uses a task decomposition model for continuous QoC evaluation, even in overload conditions, and then employs heuristic rules to adjust the period of each control task for QoC improvement. If the total requested workload exceeds the desired value, a global adaptation of control periods is triggered to maintain the workload. A sufficient stability condition is derived for a class of control systems subject to the delay and period switching introduced by the heuristic rules. Examples are given to demonstrate the proposed approach.
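A minimal Python sketch of the period-adaptation layer described above: a per-task heuristic rule trades sampling period against QoC, and a global adjustment stretches all periods when the requested utilisation exceeds the desired bound. The task parameters, the QoC threshold, and the rule gain are illustrative assumptions, not the paper's design.

```python
# Sketch of heuristic period adaptation with a global utilisation bound.
# Task set, thresholds and gains are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ControlTask:
    name: str
    wcet: float       # execution time per job (ms)
    period: float     # current sampling period (ms)
    qoc_error: float  # current QoC measure (higher = worse control)

def utilisation(tasks):
    return sum(t.wcet / t.period for t in tasks)

def adapt(tasks, u_desired=0.8, p_min=5.0, p_max=100.0, gain=0.1):
    # Heuristic rule: tasks with poor QoC get shorter periods (faster
    # sampling); tasks doing well give some period back.
    for t in tasks:
        factor = 1.0 - gain if t.qoc_error > 0.5 else 1.0 + gain
        t.period = min(max(t.period * factor, p_min), p_max)
    # Global adaptation: if the requested workload exceeds the desired
    # utilisation, stretch all periods by the same ratio.
    u = utilisation(tasks)
    if u > u_desired:
        for t in tasks:
            t.period = min(t.period * (u / u_desired), p_max)

tasks = [ControlTask("loop1", 2.0, 10.0, 0.8),
         ControlTask("loop2", 3.0, 20.0, 0.2),
         ControlTask("loop3", 4.0, 25.0, 0.9)]
adapt(tasks, u_desired=0.5)   # tight bound so the global step also fires
for t in tasks:
    print(f"{t.name}: period -> {t.period:.1f} ms")
print(f"requested utilisation {utilisation(tasks):.2f}")
```

Because utilisation scales inversely with period, the uniform stretch in the global step drives the requested workload back to the desired bound in one pass.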
Abstract:
This paper in the journalism education field reports on the construction of a new subject as part of a postgraduate coursework degree. The subject, or unit, will offer both Journalism students and other students an introductory experience of creating media, using common new media tools, with exercises that will model the learning of communication principles through practice. It has been named Fundamental Media Skills for the Workplace. Its conceptualisation and teaching will be characteristic of the Journalism academic discipline, which uses the inside perspective: understanding mass media by observing from within. Proposers of the unit within the Journalism discipline have sought to extend the common teaching approach, based on training to produce start-ready recruits for media jobs, backed by a study of contexts, e.g. journalistic ethics or media audiences. In this proposal, students would then examine the process to elicit additional knowledge about their learning. The paper draws on the literature of journalism and its pedagogy, and on communication generally. It also documents a community of practice exercise conducted among practitioners as teachers for the subject, developing exercises and models of media work. A preliminary conclusion from that exercise is that it has taken a step towards enhancing skills-based learning for media work, as a portal to more generalised knowledge.
Abstract:
This article applies social network analysis techniques to a case study of police corruption in order to produce findings that will assist in corruption prevention and investigation. Police corruption is commonly studied, but sophisticated analytical tools are rarely engaged to add rigour to the field of study. This article analyses the First Joke, a systemic and long-lasting corruption network in the Queensland Police Force, a state police agency in Australia. It uses data obtained from a commission of inquiry which exposed the network, and develops hypotheses as to the nature of the network's structure based on the existing literature on dark networks and criminal networks. These hypotheses are tested by entering the data into UCINET and analysing the outcomes through the social network analysis measures of average path distance, centrality and density. The conclusions reached show that the network has characteristics not predicted by the literature.
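The measures named above are standard and easy to reproduce outside UCINET; the following Python sketch computes density, average path distance, and degree centrality with networkx on a small made-up edge list (the First Joke data itself is not reproduced here).

```python
# Sketch of the named SNA measures on a placeholder network; the edge
# list is illustrative, not the commission-of-inquiry data.

import networkx as nx

edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"),
         ("D", "E"), ("E", "F"), ("C", "F")]
G = nx.Graph(edges)

print("density:", round(nx.density(G), 2))
print("average path distance:",
      round(nx.average_shortest_path_length(G), 2))
# Degree centrality highlights likely coordinators in the network.
for node, c in sorted(nx.degree_centrality(G).items(),
                      key=lambda kv: -kv[1]):
    print(node, round(c, 2))
```

In dark-network studies, low density and long average paths are usually read as security-driven structure, so comparing these numbers against the literature's predictions is exactly the kind of hypothesis test the article describes.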
Abstract:
"How do you film a punch?" This question can be posed by actors, make-up artists, directors and cameramen. Though they can all ask the same question, they are not all seeking the same answer. Within a given domain, based on the roles they play, agents of the domain have different perspectives and they want the answers to their question from their perspective. In this example, an actor wants to know how to act when filming a scene involving a punch. A make-up artist is interested in how to do the make-up of the actor to show bruises that may result from the punch. Likewise, a director wants to know how to direct such a scene and a cameraman is seeking guidance on how best to film such a scene. This role-based difference in perspective is the underpinning of the Loculus framework for information management for the Motion Picture Industry. The Loculus framework exploits the perspective of agent for information extraction and classification within a given domain. The framework uses the positioning of the agents role within the domain ontology and its relatedness to other concepts in the ontology to determine the perspective of the agent. Domain ontology had to be developed for the motion picture industry as the domain lacked one. A rule-based relatedness score was developed to calculate the relative relatedness of concepts with the ontology, which were then used in the Loculus system for information exploitation and classification. The evaluation undertaken to date have yielded promising results and have indicated that exploiting perspective can lead to novel methods of information extraction and classifications.
Abstract:
This paper presents a novel topology for the generation of high-voltage pulses that uses both slow and fast solid-state power switches. The topology includes diode-capacitor units in parallel with commutation circuits connected to a positive buck-boost converter, which enables the generation of a range of high output voltages with a given number of capacitors. Its advantages are the use of slow switches and a reduced number of diodes in comparison with the conventional Marx generator. Simulations of single and repetitive pulse generation and experimental tests of prototype hardware verify the proposed topology.
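As a back-of-envelope illustration of why a converter stage widens the output range, the sketch below combines the ideal n-stage Marx relation (output of roughly n times the stage voltage) with the ideal lossless buck-boost relation |Vout| = Vin * D / (1 - D). The supply voltage, stage count and duty cycles are illustrative assumptions; the paper's actual topology is not reproduced here.

```python
# Idealised voltage bookkeeping only: a converter stage lets one capacitor
# bank cover a range of pulse voltages. All component values are
# illustrative assumptions.

def marx_output(n_stages: int, v_charge: float) -> float:
    """Ideal erected voltage of an n-stage Marx bank."""
    return n_stages * v_charge

def buck_boost_output(v_in: float, duty: float) -> float:
    """Ideal lossless buck-boost conversion: |Vout| = Vin * D / (1 - D)."""
    return v_in * duty / (1.0 - duty)

v_supply = 400.0                       # V, illustrative DC link
for duty in (0.5, 0.7, 0.9):
    v_c = buck_boost_output(v_supply, duty)
    print(f"D={duty}: stage voltage {v_c:.0f} V, "
          f"5-stage pulse ~{marx_output(5, v_c) / 1e3:.1f} kV")
```

Sweeping the duty cycle rather than the stage count is what gives "a range of high output voltages with a given number of capacitors" in this idealised picture.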
Abstract:
This paper describes algorithms that can musically augment the real-time performance of electronic dance music by generating new musical material through morphing. Note-sequence morphing involves the algorithmic generation of music that smoothly transitions between two existing musical segments. The potential of musical morphing in electronic dance music is outlined and previous research is summarised, including discussions of relevant music-theoretic and algorithmic concepts. A novel Markov morphing process that uses similarity measures to construct transition matrices is outlined and explained. The paper reports on a focus-concert study used to evaluate this morphing algorithm and to compare its output with performances by a professional DJ. Discussion of this trial includes reflections on some of the aesthetic characteristics of note-sequence morphing. The research suggests that the proposed morphing technique could be used effectively in some electronic dance music contexts.
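A minimal Python sketch of Markov-based note-sequence morphing: first-order transition matrices are learned from the source and target sequences and blended with a weight that moves from 0 to 1 across the morph. The paper constructs its matrices from similarity measures; plain bigram counts and linear blending are used here as a stand-in, and the note sequences are placeholders.

```python
# Markov morphing sketch: blend two learned transition matrices while
# walking from source to target. Counts stand in for the paper's
# similarity-based matrix construction.

import random
from collections import defaultdict

def transition_matrix(seq):
    counts = defaultdict(lambda: defaultdict(float))
    for a, b in zip(seq, seq[1:]):
        counts[a][b] += 1.0
    # Normalise each row into a probability distribution.
    return {a: {b: n / sum(row.values()) for b, n in row.items()}
            for a, row in counts.items()}

def morph(source, target, length, rng=random):
    A, B = transition_matrix(source), transition_matrix(target)
    states = sorted(set(source) | set(target))
    note, out = source[0], []
    for i in range(length):
        w = i / max(length - 1, 1)           # 0 -> source, 1 -> target
        probs = [(1 - w) * A.get(note, {}).get(s, 0.0)
                 + w * B.get(note, {}).get(s, 0.0) for s in states]
        total = sum(probs)
        if total == 0:                       # dead end: restart in target
            note = rng.choice(target)
        else:
            note = rng.choices(states, weights=probs)[0]
        out.append(note)
    return out

source = [60, 62, 64, 62, 60, 64, 62, 60]    # MIDI note numbers
target = [67, 69, 71, 69, 67, 71, 69, 67]
print(morph(source, target, 16))
```

Early in the output the walk follows the source's transitions; as the blend weight grows, target transitions dominate, which is what produces the smooth handover between segments.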
Abstract:
The concept of constructability integrates the functions of individual project disciplines by bringing valuable and timely construction input into the planning and design development stages. It results in significant savings in the cost and time needed to finalize infrastructure projects. However, the available constructability principles, developed by CII Australia (1993), do not cover the Operation and Maintenance (O&M) phases of projects, whilst the major cost and time of multifaceted infrastructure projects are incurred in the post-occupancy stages. This paper discusses the need to extend the constructability concept by examining current O&M issues in the provision of multifaceted building projects. It highlights known O&M problems and shortcomings of building projects, as well as their causes, organised into categories. This initial categorization is an efficient starting point for testing for present O&M issues in various cases of complex infrastructure building projects, and serves as a benchmark for developing an extended constructability model that considers all project life cycle phases rather than a specific phase. It is anticipated that an extended constructability model can significantly reduce the rework, mistakes, extra cost and wasted time incurred during the delivery stages of multifaceted building projects.
Abstract:
This paper describes the characterisation, for airborne use, of the public mobile data communication systems known broadly as 3G. The motivation for this study was to explore how these mature public communication systems could be used for aviation purposes. An experimental system was fitted to a light aircraft to record communication latency, line speed, RF level, packet loss and cell tower identifier. Communication was established using internet protocols, and a connection was made to a local server. The aircraft was flown in both remote and populous areas, at altitudes up to 8,500 ft, in a region of South East Queensland, Australia. Results show that average airborne RF levels are better than those on the ground by 21%, in the order of -77 dBm. Latencies were in the order of 500 ms (half the latency of Iridium), with an average downlink speed of 0.48 Mb/s, an average uplink speed of 0.85 Mb/s, and a packet loss of 6.5%. The maximum communication range observed was 70 km from a single cell station. The paper also describes the possible limitations and utility of using such a communications architecture for both manned and unmanned aircraft systems.