167 results for Corner Operators
Abstract:
Multi-disciplinary approaches to complex problems are becoming more common – they enable criteria manifested in distinct (and potentially conflicting) domains to be jointly balanced and satisfied. In this paper we present airport terminals as a case study which requires multi-disciplinary knowledge in order to balance conflicting security, economic and passenger-driven needs and correspondingly enhance the design, management and operation of airport terminals. The need for a truly multi-disciplinary scientific approach which integrates information, process, people, technology and space domains is highlighted through a brief discussion of two challenges currently faced by airport operators. The paper outlines the approach taken by this project, detailing the aims and objectives of each of seven diverse research programs.
Abstract:
Porphyrins are one of Nature's essential building blocks and play an important role in several biological systems, including oxygen transport, photosynthesis, and enzymes. Their capacity to absorb visible light, facilitate oxidation and reduction, and act as energy- and electron-transfer agents, particularly when several are held closely together, is of interest to chemists who seek to mimic Nature and to make and use these compounds in order to synthesise novel advanced materials. During this project 26 new 5,10-diarylsubstituted porphyrin monomers, 10 dimers, and 1 tetramer were synthesised. The spectroscopic and structural properties of these compounds were investigated using 1D/2D 1H NMR, UV/visible, ATR-IR and Raman spectroscopy, mass spectrometry, X-ray crystallography, electrochemistry and gel permeation chromatography. Nitration, amination, bromination and alkynylation of one or both of the meso positions of the porphyrin monomers have expanded the synthetic possibilities for the 5,10-diarylsubstituted porphyrins. The development of these new porphyrin monomers has led to the successful synthesis of new azo- and butadiyne-linked dimers. The functionalisation of these compounds was investigated, in particular nitration, amination, and bromination. The synthesised dimers containing the azo bridge have absorption spectra that show a large split in the Soret bands and intense Q-bands that are significantly red-shifted. The butadiyne dimers also have intense, red-shifted Q-bands but smaller Soret band splittings. Crystal structures of two new azoporphyrins have been acquired and compared to the azoporphyrin previously synthesised from 5,10,15-triarylsubstituted porphyrin monomers. A completely new cyclic porphyrin oligomer (CPO) was synthesised, comprising four porphyrin monomers linked by azo and butadiyne bridges. This is the first cyclic tetramer that has both the azo and butadiyne linking groups.
The absorption spectrum of the tetramer exhibits a large Soret split, making it more similar to the azo-linked dimers than to the butadiyne-linked dimers. The spectroscopic characteristics of the synthesised tetramer have been compared to those of other cyclic porphyrin tetramers. The collected data indicate that the new cyclic tetramer has a more efficient π-overlap and better ground-state electronic communication between the porphyrin rings.
Abstract:
If one clear argument emerged from my doctoral thesis in political science, it is that there is no agreement as to what democracy is. There are over 40 different varieties of democracy, ranging from those in the mainstream with subtle or minute differences to those playing by themselves in the corner. And many of these various types of democracy are very well argued, empirically supported, and highly relevant to certain polities. The irony is that the thing all of these democratic varieties share, the 'basic democracy' from which all other forms of democracy stem, is elusive. There is no agreement in the literature or in political practice as to what 'basic democracy' is, and that is problematic, as many of us use the word 'democracy' every day and it is a concept of tremendous importance internationally. I am still uncertain as to why this problem has not been resolved before by far greater minds than my own; it may have something to do with the recent growth in democratic theory this past decade and the innovative areas of thought my thesis required, but I think I have the answer. By listing each type of democracy and filling the column next to this list with the literature associated with these various styles of democracy, I amassed a large and comprehensive body of textual data. My research intended to find out what these various styles of democracy had in common and to create a taxonomy (like the 'tree of life' in biology) of democracy to show how various styles of democracy have 'evolved' over the past 5000 years. I then ran a word-frequency analysis program, a piece of software that counts the 100 most commonly used words in the texts. This is where my logic came in, as I had to make sense of these words. How did they answer what the most fundamental commonalities are between 40 different styles of democracy?
I used a grounded theory analysis, which required that I argue my way through these words to form a 'theory', a plausible explanation as to why these particular words and not others are the important ones for answering the question. It came down to the argument that all 40 styles of democracy analysed have the following in common: 1) a concept of a citizenry; 2) a concept of sovereignty; 3) a concept of equality; 4) a concept of law; 5) a concept of communication; and 6) a concept of selecting officials. Thus, democracy is a defined citizenry with its own concept of sovereignty, which it exercises through the institutions that support the citizenry's understandings of equality, law, communication, and the selection of officials. Once any of these six concepts is defined in a particular way, it creates a style of democracy. From this, we can also see that there can be more than one style of democracy active in a particular government, as a citizenry is composed of many different aggregates with their own understandings of the six concepts.
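The word-frequency step described above can be sketched in a few lines. The corpus and the simple tokeniser here are illustrative assumptions, not the author's actual data or software:

```python
from collections import Counter
import re

def top_words(texts, n=100):
    """Count the n most common words across a collection of texts."""
    counts = Counter()
    for text in texts:
        # Lowercase and split on non-letter characters (a crude tokeniser).
        counts.update(re.findall(r"[a-z]+", text.lower()))
    return counts.most_common(n)

# Tiny illustrative corpus standing in for the democracy literature.
corpus = [
    "citizens exercise sovereignty through law",
    "equality before the law protects citizens",
]
print(top_words(corpus, n=3))
```

In the real study the analysis covered the literature on all 40 styles of democracy and kept the top 100 words; stop-word removal would normally be added before counting.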
Abstract:
Reliability analysis has several important engineering applications. Designers and operators of equipment are often interested in the probability of the equipment operating successfully to a given age; this probability is known as the equipment's reliability at that age. Reliability information is also important to those charged with maintaining an item of equipment, as it enables them to model and evaluate alternative maintenance policies for the equipment. In each case, information on failures and survivals of a typical sample of items is used to estimate the required probabilities as a function of the item's age, a process that is one of many applications of the statistical technique known as distribution fitting. In most engineering applications, the estimation procedure must deal with samples containing survivors (suspensions or censorings); this thesis focuses on several graphical estimation methods that are widely used for analysing such samples. Although these methods have been in use for many years, they share a common shortcoming: none of them is continuously sensitive to changes in the ages of the suspensions, and we show that the resulting reliability estimates are therefore more pessimistic than necessary. We use a simple example to show that the existing graphical methods take no account of any service recorded by suspensions beyond their respective previous failures, and that this behaviour is inconsistent with one's intuitive expectations. In the course of this thesis, we demonstrate that the existing methods are only justified under restricted conditions. We present several improved methods and demonstrate that each of them overcomes the problem described above, while reducing to one of the existing methods where this is justified. Each of the improved methods thus provides a realistic set of reliability estimates for general (unrestricted) censored samples. Several related variations on these improved methods are also presented and justified.
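As an illustration of an estimator that is continuously sensitive to suspension ages, here is a minimal Kaplan-Meier sketch. This is a standard textbook technique, not one of the thesis's improved graphical methods, and the sample ages are invented:

```python
def kaplan_meier(events):
    """Kaplan-Meier reliability estimate from (age, failed) pairs.

    Suspensions (failed=False) leave the risk set at their own ages,
    so later suspension ages yield less pessimistic estimates.
    """
    events = sorted(events)          # process items in age order
    at_risk = len(events)
    reliability = 1.0
    curve = []
    for age, failed in events:
        if failed:
            reliability *= (at_risk - 1) / at_risk
            curve.append((age, reliability))
        at_risk -= 1                 # failure or suspension leaves the risk set
    return curve

# Ages in hours; True = failure, False = suspension (survivor).
sample = [(100, True), (150, False), (200, True), (250, False)]
print(kaplan_meier(sample))  # → [(100, 0.75), (200, 0.375)]
```

Because the suspension at 150 hours shrinks the risk set before the 200-hour failure, moving that suspension to a later age would raise the estimated reliability, exactly the sensitivity the thesis argues the existing graphical methods lack.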
Abstract:
The Inflatable Rescue Boat (IRB) is arguably the most effective rescue tool used by Australian surf lifesavers. The exceptional features of high mobility and rapid response have enabled it to become an icon on Australia's popular beaches. However, the IRB's extensive use within an environment that is as rugged as it is spectacular has also made it a danger to those who risk their lives to save others. Epidemiological research revealed lower-limb injuries to be predominant, particularly to the right leg. The common types of injuries were fractures and dislocations, as well as muscle or ligament strains and tears. The concern expressed by Surf Life Saving Queensland (SLSQ) and Surf Life Saving Australia (SLSA) led to a biomechanical investigation into this unique and relatively unresearched field. The aim of the research was to identify the causes of injury and propose processes that may reduce the incidence and severity of injury to surf lifesavers during IRB operation. Following a review of related research, a design analysis of the craft was undertaken as an introduction to the craft, its design and uses. The mechanical characteristics of the vessel were then evaluated, and the accelerations applied to the crew in the IRB were established through field tests. The data were then combined and modelled in the 3-D mathematical modelling and simulation package MADYMO. A tool was created to compare various scenarios of boat design and methods of operation to determine possible mechanisms to reduce injuries. The results of this study showed that under simulated wave loading the boats flex around a pivot point determined by the position of the hinge in the floorboard. It was also found that the accelerations experienced by the crew exhibited similar characteristics to those of road vehicle accidents. Staged simulations indicated the attributes of an optimum foam in terms of thickness and density.
Likewise, modelling of the boat and crew produced simulations that predicted realistic crew response to the tested variables. Unfortunately, the observed lack of adherence to the SLSA footstrap Standard has impeded successful epidemiological and modelling outcomes. If uniformity of boat setup can be assured, then epidemiological studies will be able to highlight the influence of implementing changes to the boat design. In conclusion, the research provided a tool that successfully links the epidemiology and injury diagnosis to the mechanical engineering design through the use of biomechanics. This was a novel application of the mathematical modelling software MADYMO. Other craft can also be investigated in this manner to provide solutions to the problems identified and thereby reduce the risk of injury for operators.
Abstract:
Hydrocarbon spills on roads are a major safety concern for the driving public and can have severe cost impacts, both on pavement maintenance and on the economy through disruption to services. The time taken to clean up spills and re-open roads in a safe driving condition is an issue of increasing concern given traffic levels on major urban arterials. Thus, the primary aim of the research was to develop a sorbent material that facilitates rapid clean-up of road spills. The methodology involved extensive research into a range of materials (organic, inorganic and synthetic sorbents), comprehensive testing in the laboratory, at scale-up and in the field, and product design (i.e. concept to prototype). The study also applied chemometrics to provide consistent, comparative methods of sorbent evaluation and performance. In addition, sorbent materials at every stage were compared against a commercial benchmark. For the first time, the impact of diesel on asphalt pavement has been quantified and assessed in a systematic way. Contrary to conventional thinking and anecdotal observations, the study determined that the action of diesel on asphalt was quite rapid (i.e. hours rather than weeks or months). This significant finding demonstrates the need to minimise the impact of hydrocarbon spills and the potential application of the sorbent option. To better understand the adsorption phenomenon, surface characterisation techniques were applied to selected sorbent materials (i.e. sand, organo-clay and cotton fibre). Brunauer-Emmett-Teller (BET) and thermal analysis indicated that the main adsorption mechanism for the sorbents occurred on the external surface of the material in the diffusion region (sand and organo-clay) and/or capillaries (cotton fibre). Using environmental scanning electron microscopy (ESEM), it was observed that adsorption by the interfibre capillaries contributed to the high uptake of hydrocarbons by the cotton fibre.
Understanding the adsorption mechanism for these sorbents provided some guidance and a scientific basis for the selection of materials. The study determined that non-woven cotton mats were ideal sorbent materials for clean-up of hydrocarbon spills. The prototype sorbent was found to perform significantly better than the commercial benchmark, displaying the following key properties:
• superior hydrocarbon pick-up from the road pavement;
• high hydrocarbon retention capacity under an applied load;
• adequate field skid resistance post treatment;
• functional and easy to use in the field (e.g. routine handling, transportation, application and recovery);
• relatively inexpensive to produce due to the use of raw cotton fibre and a simple production process;
• environmentally friendly (e.g. renewable materials, non-toxic to the environment and operators, and biodegradable); and
• rapid response time (e.g. two minutes total clean-up time compared with thirty minutes for reference sorbents).
The major outcomes of the research project include: a) development of a specifically designed sorbent material suitable for cleaning up hydrocarbon spills on roads; b) submission of a patent application (serial number AU2005905850) for the prototype product; and c) preparation of a Commercialisation Strategy to advance the sorbent product to the next phase (i.e. R&D to product commercialisation).
Abstract:
Camera calibration information is required in order for multiple-camera networks to deliver more than the sum of many single-camera systems. Methods exist for manually calibrating cameras with high accuracy. Manually calibrating networks with many cameras is, however, time consuming, expensive and impractical for networks that undergo frequent change. For this reason, automatic calibration techniques have been vigorously researched in recent years. Fully automatic calibration methods depend on the ability to automatically find point correspondences between overlapping views. In typical camera networks, cameras are placed far apart to maximise coverage; this is referred to as a wide baseline scenario. Finding sufficient correspondences for camera calibration in wide baseline scenarios presents a significant challenge. This thesis focuses on developing more effective and efficient techniques for finding correspondences in uncalibrated, wide baseline, multiple-camera scenarios. The project consists of two major areas of work. The first is the development of more effective and efficient view-covariant local feature extractors. The second involves finding methods to extract scene information from a limited set of matched affine features. Several novel affine adaptation techniques for salient features have been developed. A method is presented for efficiently computing the discrete scale space primal sketch of local image features, and a scale selection method was implemented that makes use of the primal sketch. The primal sketch-based scale selection method has several advantages over existing methods: it allows greater freedom in how the scale space is sampled, enables more accurate scale selection, is more effective at combining different functions for spatial position and scale selection, and leads to greater computational efficiency.
Existing affine adaptation methods make use of the second moment matrix to estimate the local affine shape of local image features. In this thesis, it is shown that the Hessian matrix can be used in a similar way to estimate local feature shape. The Hessian matrix is effective for estimating the shape of blob-like structures, but is less effective for corner structures. It is simpler to compute than the second moment matrix, leading to a significant reduction in computational cost. A wide baseline dense correspondence extraction system, called WiDense, is presented in this thesis. It allows the extraction of large numbers of additional accurate correspondences, given only a few initial putative correspondences. It consists of the following algorithms: an affine region alignment algorithm that ensures accurate alignment between matched features; a method for extracting more matches in the vicinity of a matched pair of affine features, using the alignment information contained in the match; and an algorithm for extracting large numbers of highly accurate point correspondences from an aligned pair of feature regions. Experiments show that the correspondences generated by the WiDense system improve the success rate of computing the epipolar geometry of very widely separated views. This new method is successful in many cases where the features produced by the best wide baseline matching algorithms are insufficient for computing the scene geometry.
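The idea that the Hessian captures local blob shape can be sketched with central differences on a synthetic anisotropic Gaussian blob. The image function, the sampling scheme and all numbers are illustrative assumptions, not the thesis's implementation:

```python
import math

def hessian_2x2(f, x, y, h=1e-3):
    """Central-difference Hessian [[fxx, fxy], [fxy, fyy]] of image function f."""
    fxx = (f(x + h, y) - 2 * f(x, y) + f(x - h, y)) / h**2
    fyy = (f(x, y + h) - 2 * f(x, y) + f(x, y - h)) / h**2
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4 * h**2)
    return fxx, fxy, fyy

def shape_eigenvalues(fxx, fxy, fyy):
    """Closed-form eigenvalues of the symmetric 2x2 Hessian; their
    magnitude ratio indicates the elongation of a blob-like structure."""
    tr, det = fxx + fyy, fxx * fyy - fxy**2
    disc = math.sqrt(max(tr**2 / 4 - det, 0.0))
    return tr / 2 + disc, tr / 2 - disc

# Anisotropic Gaussian blob, elongated along x (variances 4 and 1).
blob = lambda x, y: math.exp(-(x**2 / 8 + y**2 / 2))
l1, l2 = shape_eigenvalues(*hessian_2x2(blob, 0.0, 0.0))
print(l1, l2)  # eigenvalues ≈ -0.25 and -1.0: a 1:4 magnitude ratio
```

The 1:4 eigenvalue magnitude ratio mirrors the blob's 4:1 variance ratio, which is the information an affine adaptation step would use to normalise the feature's shape.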
Abstract:
Following the success of Coalbed Natural Gas (CBNG) operations in the United States, companies in Australia and New Zealand have been actively exploring and developing this technology for the last two decades. In particular, the Bowen and Surat basins in Queensland, Australia, have undergone extensive CBNG development. Unfortunately, awareness of potential environmental problems associated with CBNG abstraction has not been widespread, and legislation has at times struggled to keep up with rapid development. In Australia, the combined CBNG resource for both the Bowen and Surat basins has been estimated at approximately 10,500 PJ, with gas content as high as 10 m3/tonne of coal. There are no official estimates of the magnitude of the CBNG resource in New Zealand, but initial estimates suggest it could be up to 1,300 PJ, with gas content ranging from 1 to 5 m3/tonne of coal. In Queensland, depressurization of the Walloon Coal Measures to recover CBNG has the potential to induce drawdown in adjacent deep aquifer systems through intraformational groundwater flow. In addition, CBNG operators have been disposing of their co-produced water in large unlined ponds, which is not best practice for managing co-produced water. CBNG waters in Queensland have the typical geochemical signature associated with CBNG waters (Van Voast, 2003) and thus have the potential to impair soils and plant growth where land disposal is considered. Water quality from exploration wells in New Zealand exhibits the same characteristics, although full-scale production has not yet begun. In general, the environmental impacts that could arise from CBNG water extraction depend on the aquifer system, the quantity and quality of produced water, and the method of treatment and disposal being used.
Understanding these impacts is necessary to adequately manage CBNG waters so that environmental effects are minimized; if properly managed, CBNG waters can be used for beneficial applications and can become a valuable resource to stakeholders.
Abstract:
Open access reforms to railway regulations allow multiple train operators to provide rail services on a common infrastructure. As railway operations are now independently managed by different stakeholders, conflicts in operations may arise, and there have been attempts to derive an effective access charge regime so that these conflicts may be resolved. One approach is direct negotiation between the infrastructure manager and the train service providers. Despite the substantial literature on the topic, few studies consider the benefits of employing computer simulation as an evaluation tool for railway operational activities such as access pricing. This article proposes a multi-agent system (MAS) framework for the railway open market and demonstrates its feasibility by modelling the negotiation between an infrastructure provider and a train service operator. Empirical results show that the model is capable of resolving operational conflicts according to market demand.
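A toy version of such a bilateral negotiation can be sketched as follows. The alternating-concession protocol, the function names and all figures are illustrative assumptions, not the article's MAS model:

```python
def negotiate(provider_ask, operator_bid, provider_floor, operator_ceiling,
              concession=0.05, max_rounds=100):
    """Alternating-concession negotiation over a track access charge.

    Each round, both agents concede a fixed fraction toward the other,
    bounded by their private reserve values. Returns the agreed charge,
    or None if no agreement is reached within max_rounds.
    """
    for _ in range(max_rounds):
        if provider_ask <= operator_bid:      # offers have crossed: deal
            return round((provider_ask + operator_bid) / 2, 2)
        provider_ask = max(provider_floor, provider_ask * (1 - concession))
        operator_bid = min(operator_ceiling, operator_bid * (1 + concession))
    return None

# Hypothetical charges per train path (arbitrary monetary units).
print(negotiate(provider_ask=1000, operator_bid=600,
                provider_floor=700, operator_ceiling=900))
```

In an MAS framework each agent would additionally consult its own model of market demand and timetable conflicts before conceding, rather than using a fixed concession rate.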
Abstract:
The concept of moving block signalling (MBS) has been adopted in a few mass transit railway systems. When a dense queue of trains begins to move from a complete stop, the trains can re-start in very close succession under MBS. The feeding substations nearby are likely to be overloaded, and the service will inevitably be disturbed unless substations of higher power rating are used. By introducing starting time delays among the trains or limiting the trains' acceleration rate to a certain extent, the peak energy demand can be contained. However, delay is introduced and quality of service is degraded. An expert system approach is presented to provide a supervisory tool for the operators. As the knowledge base is vital to the quality of the decisions to be made, the study focuses on its formulation with a balance between delay and peak power demand.
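The trade-off between staggered starts and peak substation demand can be illustrated with a toy calculation; the power figures, timings and rectangular demand profile are invented, not from the study:

```python
def peak_power(start_delays, accel_time=30, train_power=5.0, step=1):
    """Peak substation demand (MW) when trains re-start with given delays.

    Each train is assumed to draw train_power MW for accel_time seconds
    after its own start. Staggering the starts trades delay for a lower peak.
    """
    horizon = max(start_delays) + accel_time
    peak = 0.0
    for t in range(0, horizon, step):
        demand = sum(train_power for d in start_delays
                     if d <= t < d + accel_time)
        peak = max(peak, demand)
    return peak

# Four queued trains: simultaneous vs staggered re-starts.
print(peak_power([0, 0, 0, 0]))      # → 20.0 (all accelerating at once)
print(peak_power([0, 30, 60, 90]))   # → 5.0 (one at a time, but delayed)
```

An expert system's knowledge base would sit between these extremes, picking delays and acceleration limits that keep the peak below the substation rating while minimising the added delay.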
Abstract:
This paper addresses the problem of constructing consolidated business process models out of collections of process models that share common fragments. The paper considers the construction of unions of multiple models (called merged models) as well as intersections (called digests). Merged models are intended for analysts who wish to create a model that subsumes a collection of process models - typically representing variants of the same underlying process - with the aim of replacing the variants with the merged model. Digests, on the other hand, are intended for analysts who wish to identify the most recurring fragments across a collection of process models, so that they can focus their efforts on optimizing these fragments. The paper presents an algorithm for computing merged models and an algorithm for extracting digests from a merged model. The merging and digest extraction algorithms have been implemented and tested against collections of process models taken from multiple application domains. The tests show that the merging algorithm produces compact models and scales up to process models containing hundreds of nodes. Furthermore, a case study conducted in a large insurance company has demonstrated the usefulness of the merging and digest extraction operators in a practical setting.
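Treating each process model simply as a set of control-flow edges, the union/intersection idea can be sketched as follows. This is a gross simplification of the paper's algorithms, which also handle connectors, annotations and fragment matching; the claim-handling variants are invented:

```python
from collections import Counter

def merge_models(models):
    """Merged model: the union of all variants' control-flow edges."""
    merged = set()
    for edges in models:
        merged |= edges
    return merged

def digest(models, min_variants=2):
    """Digest: edges recurring in at least min_variants of the models."""
    counts = Counter(e for edges in models for e in edges)
    return {e for e, c in counts.items() if c >= min_variants}

# Three hypothetical claim-handling variants as (source, target) edges.
v1 = {("receive", "check"), ("check", "approve")}
v2 = {("receive", "check"), ("check", "reject")}
v3 = {("receive", "check"), ("check", "approve"), ("approve", "pay")}
print(merge_models([v1, v2, v3]))  # all edges from all variants
print(digest([v1, v2, v3]))        # only the recurring edges
```

The real merging algorithm additionally annotates each merged element with the variants it came from, so that any original variant can be recovered from the merged model.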
Abstract:
Condition monitoring of rails and train wheels is vitally important to railway asset management, and rail-wheel interactions provide crucial information on the health state of both rails and wheels. Continuous and remote monitoring is always preferred by operators. With a new generation of strain-sensing devices, Fibre Bragg Grating (FBG) sensors, this study explores the possibility of continuous monitoring of the health state of the rails and investigates the required signal processing techniques and their limitations.
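A minimal sketch of how strain is recovered from an FBG wavelength shift, using the standard strain-optic relation. The wavelengths are illustrative, the photo-elastic coefficient value is a typical figure for silica fibre, and temperature effects are ignored:

```python
def strain_from_shift(lam_bragg_nm, lam_measured_nm, p_e=0.22):
    """Strain inferred from an FBG wavelength shift.

    Uses the standard relation dL/L = (1 - p_e) * strain, where p_e is
    the effective photo-elastic coefficient (~0.22 for silica fibre).
    Temperature-induced shift is assumed to be zero or compensated.
    """
    shift = lam_measured_nm - lam_bragg_nm
    return shift / (lam_bragg_nm * (1 - p_e))

# A 1550 nm grating stretched to 1550.6 nm.
eps = strain_from_shift(1550.0, 1550.6)
print(eps * 1e6)  # strain in microstrain, roughly 496
```

In a rail-monitoring deployment the raw wavelength stream would then be filtered and segmented, which is where the signal processing techniques investigated in the study come in.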
Abstract:
The track allocation problem (TAP) at a multi-track, multi-platform mainline railway station is defined by the station track layout and service timetable, which together imply combinations of spatial and temporal conflicts. Feasible solutions are available from either traditional planning or advanced intelligent search methods, and their evaluation against operational requirements is essential for the operators. To facilitate thorough analysis, a timed Coloured Petri Nets (CPN) model is presented here to encapsulate the inter-relationships of the spatial and temporal constraints in the TAP.
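The spatial-temporal conflicts that such a model encapsulates can be illustrated with a simple pairwise check; the timetable is invented, and this is not the paper's Petri net formulation:

```python
def conflicts(allocations):
    """Find spatial-temporal conflicts: two services allocated the same
    track with overlapping occupation windows (times in minutes)."""
    clashes = []
    for i, (svc_a, track_a, arr_a, dep_a) in enumerate(allocations):
        for svc_b, track_b, arr_b, dep_b in allocations[i + 1:]:
            # Same track and time intervals overlap -> conflict.
            if track_a == track_b and arr_a < dep_b and arr_b < dep_a:
                clashes.append((svc_a, svc_b))
    return clashes

# Hypothetical allocation: (service, platform track, arrival, departure).
timetable = [("IC101", 1, 0, 10), ("RE202", 1, 8, 18), ("IC103", 2, 5, 15)]
print(conflicts(timetable))  # reports the IC101/RE202 clash on track 1
```

A timed CPN model expresses the same constraints declaratively, with tracks as places and train movements as timed transitions, so that feasibility can be checked by exploring the net's state space rather than by pairwise enumeration.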