841 results for Trajectory analysis
in Queensland University of Technology - ePrints Archive
Abstract:
An experimental set-up was used to visually observe the characteristics of bubbles as they moved up a column holding xanthan gum crystal suspensions. The bubble rise characteristics in xanthan gum solutions with crystal suspension are presented in this paper. The suspensions were made by using different concentrations of xanthan gum solutions with 0.23 mm mean diameter polystyrene crystal particles. The influence of the dimensionless quantities, namely the Reynolds number, Re, the Weber number, We, and the drag coefficient, Cd, is identified for the determination of the bubble rise velocity. The effect of these dimensionless groups, together with the Eötvös number, Eo, the Froude number, Fr, and the bubble deformation parameter, D, on the bubble rise velocity and bubble trajectory is analysed. The experimental results show that the average bubble velocity increases with bubble volume for xanthan gum crystal suspensions. At high We, Eo and Re, bubbles are spherical-capped and their velocities are found to be very high. At low We and Eo, the surface tension force is significant compared to the inertia force. The viscous forces were shown to have no substantial effect on the bubble rise velocity for 45 < Re < 299. The results show that the drag coefficient decreases with increasing bubble velocity and Re. The trajectory analysis showed that small bubbles followed a zigzag motion while larger bubbles followed a spiral motion. The smaller bubbles experienced less horizontal motion in crystal-suspended xanthan gum solutions, while larger bubbles exhibited a greater degree of spiral motion than those seen in previous studies on bubble rise in xanthan gum solutions without crystals.
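The dimensionless groups named in this abstract are all computable directly from measured bubble and fluid properties. A minimal sketch, in which the function name, the symbol choices, and the buoyancy-drag form of Cd are illustrative assumptions rather than the paper's formulation:

```python
import math

def bubble_dimensionless_groups(d_eq, velocity, rho_l, mu_l, sigma, g=9.81):
    """Compute common dimensionless groups for a bubble of equivalent
    diameter d_eq (m) rising at `velocity` (m/s) through a liquid of
    density rho_l (kg/m^3), apparent viscosity mu_l (Pa.s) and surface
    tension sigma (N/m). Illustrative textbook definitions only."""
    re = rho_l * velocity * d_eq / mu_l      # Reynolds: inertia vs viscous forces
    we = rho_l * velocity**2 * d_eq / sigma  # Weber: inertia vs surface tension
    eo = rho_l * g * d_eq**2 / sigma         # Eotvos: buoyancy vs surface tension
    fr = velocity / math.sqrt(g * d_eq)      # Froude: inertia vs gravity
    # Drag coefficient from a buoyancy-drag balance at terminal velocity,
    # neglecting the gas density inside the bubble.
    cd = 4.0 * g * d_eq / (3.0 * velocity**2)
    return {"Re": re, "We": we, "Eo": eo, "Fr": fr, "Cd": cd}
```

For instance, a 5 mm bubble rising at 0.2 m/s in a water-like liquid with an apparent viscosity of 0.05 Pa.s gives Re = 20, inside the 45 < Re < 299 range only at higher velocities.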
Abstract:
The characteristics of dust particles deposited during the 2009 dust storm in the Gold Coast and Brisbane regions of Australia are discussed in this paper. The study outcomes provide important knowledge in relation to the potential impacts of dust storm related pollution on ecosystem health, in the context that the frequency of dust storms is predicted to increase due to anthropogenic desert surface modifications and climate change impacts. The investigated dust storm contributed a large fraction of fine particles to the environment with an increased amount of total suspended solids, compared to dry deposition under ambient conditions. Although the dust storm passed over forested areas, the organic carbon content in the dust was relatively low. The primary metals present in the dust storm deposition were aluminium, iron and manganese, which are common soil minerals in Australia. The dust storm deposition did not contain significant loads of nickel, cadmium, copper and lead, which are commonly present in the urban environment. Furthermore, the comparison between the ambient and dust storm chromium and zinc loads suggested that these metals were contributed to the dust storm by local anthropogenic sources. The potential ecosystem health impacts of the 2009 dust storm include increased fine solids deposition on ground surfaces, resulting in an enhanced capacity to adsorb toxic pollutants, as well as increased aluminium, iron and manganese loads. In contrast, the ecosystem health impacts related to organic carbon and other metals from dust storm atmospheric deposition are not considered to be significant.
Abstract:
Designing trajectories for a submerged rigid body motivates this paper. Two approaches are addressed: the time optimal approach and the motion planning approach using concatenation of kinematic motions. We focus on the structure of singular extremals and their relation to the existence of rank-one kinematic reductions, thereby linking the optimization problem to the inherent geometric framework. Using these kinematic reductions, we provide a solution to the motion planning problem in the under-actuated scenario, or equivalently, in the case of actuator failures. We finish the paper by comparing a time optimal trajectory to one formed by concatenation of pure motions.
Abstract:
The Macroscopic Fundamental Diagram (MFD) relates space-mean density and flow. Since the MFD represents the area-wide network traffic performance, studies on perimeter control strategies and network-wide traffic state estimation utilising the MFD concept have been reported. Most previous works have utilised data from fixed sensors, such as inductive loops, to estimate the MFD, which can cause biased estimation in urban networks due to queue spillovers at intersections. To overcome this limitation, recent literature reports the use of trajectory data obtained from probe vehicles. However, these studies have been conducted using simulated datasets; few works have discussed the limitations of real datasets and their impact on the variable estimation. This study compares two methods for estimating traffic state variables of signalised arterial sections: a method based on cumulative vehicle counts (CUPRITE), and one based on vehicles’ trajectories from taxi Global Positioning System (GPS) logs. The comparisons reveal some characteristics of the taxi trajectory data available in Brisbane, Australia. The current trajectory data have limitations in quantity (i.e., the penetration rate), due to which the traffic state variables tend to be underestimated. Nevertheless, the trajectory-based method successfully captures the features of traffic states, which suggests that trajectories from taxis can be a good estimator of network-wide traffic states.
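Trajectory-based estimation of section-level flow and density is commonly built on Edie's generalized definitions, which much of the probe-vehicle literature uses. A minimal sketch under assumed inputs; the function name and data layout are hypothetical, and the paper's CUPRITE method is not reproduced here:

```python
def edie_traffic_state(trajectories, section_length, period):
    """Estimate flow and density for a road section over an analysis period
    using Edie's generalized definitions. `trajectories` is a list of
    (distance_travelled_m, time_spent_s) tuples, one per observed vehicle,
    measured inside the space-time region of size section_length (m) by
    period (s). Returns (flow veh/s, density veh/m). With probe data, the
    results must still be scaled up by the penetration rate, which is the
    key limitation the abstract discusses."""
    area = section_length * period                 # space-time region (m*s)
    total_distance = sum(d for d, _ in trajectories)
    total_time = sum(t for _, t in trajectories)
    flow = total_distance / area                   # vehicles per second
    density = total_time / area                    # vehicles per metre
    return flow, density
```

Space-mean speed then follows as flow divided by density, which is why underestimating counts (low penetration) biases both variables the same way but can leave speed usable.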
Abstract:
Since the formal recognition of practice-led research in the 1990s, many higher research degree candidates in art, design and media have submitted creative works along with an accompanying written document or ‘exegesis’ for examination. Various models for the exegesis have been proposed in university guidelines and academic texts during the past decade, and students and supervisors have experimented with its contents and structure. With a substantial number of exegeses submitted and archived, it has now become possible to move beyond proposition to empirical analysis. In this article we present the findings of a content analysis of a large, local sample of submitted exegeses. We identify the emergence of a persistent pattern in the types of content included as well as overall structure. Besides an introduction and conclusion, this pattern includes three main parts, which can be summarized as situating concepts (conceptual definitions and theories); precedents of practice (traditions and exemplars in the field); and researcher’s creative practice (the creative process, the artifacts produced and their value as research). We argue that this model combines earlier approaches to the exegesis, which oscillated between academic objectivity, by providing a contextual framework for the practice, and personal reflexivity, by providing commentary on the creative practice. But this model is more than simply a hybrid: it provides a dual orientation, which allows the researcher to both situate their creative practice within a trajectory of research and do justice to its personally invested poetics. By performing the important function of connecting the practice and creative work to a wider emergent field, the model helps to support claims for a research contribution to the field. We call it a connective model of exegesis.
Abstract:
The closure of large institutions for people with intellectual disability and the subsequent shift to community living has been a feature of social policies in most western democracies for more than two decades. While the move from congregated settings to homes in the community has been heralded as a positive and desirable strategy, deinstitutionalisation has continued to be a controversial policy and practice. This research critically analyses the implementation of a deinstitutionalisation policy called Institutional Reform in the state of Queensland from May 1994 until it was dismantled under a new government in the middle of 1996. A trajectory study of the policy from early conceptualisation through its development, implementation and final extinction was undertaken. Several methods were utilised in the research, including the textual analysis of policy documents, discussion papers and newspaper articles, interviews with stakeholders, and participant observation. The research draws on theories of discourse and focuses on how discourses of disability shape policy and practice. The thesis outlines a number of implications for policy implementation more generally as well as for disability services. In particular, the theoretical framework builds on Fulcher's (1989) disabling discourses - medical, charity, lay and rights - and identifies two additional discourses of economics and inclusion. The thesis argues that competing disability discourses operated in powerful ways to shape the implementation of the policy and illustrates how older discourses based on fear and prejudice were promoted to positions of dominance and power.
Abstract:
This paper discusses the effects of thyristor controlled series compensator (TCSC), a series FACTS controller, on the transient stability of a power system. Trajectory sensitivity analysis (TSA) has been used to measure the transient stability condition of the system. The TCSC is modeled by a variable capacitor, the value of which changes with the firing angle. It is shown that TSA can be used in the design of the controller. The optimal locations of the TCSC-controller for different fault conditions can also be identified with the help of TSA. The paper depicts the advantage of the use of TCSC with a suitable controller over fixed capacitor operation.
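Trajectory sensitivity, the core quantity in TSA, measures how a system trajectory changes with a parameter such as the TCSC firing angle. When the analytic sensitivity equations are not set up, it can be approximated numerically. A hedged sketch with a hypothetical `simulate` interface, not the paper's TSA formulation:

```python
def trajectory_sensitivity(simulate, theta, delta=1e-4):
    """Approximate the trajectory sensitivity dx(t)/dtheta by central finite
    differences. `simulate(theta)` must return a list of sampled states
    x(t_k) for scalar parameter value theta (e.g. a controller gain or a
    TCSC capacitance); this interface is assumed for illustration only.
    Larger sensitivity peaks indicate a less transiently stable condition,
    which is how TSA ranks fault scenarios and controller locations."""
    up = simulate(theta + delta)     # perturbed trajectory, theta + delta
    down = simulate(theta - delta)   # perturbed trajectory, theta - delta
    return [(u - d) / (2 * delta) for u, d in zip(up, down)]
```

For a linear toy system the finite-difference result is exact, which makes the approach easy to validate before applying it to a full power-system simulation.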
Abstract:
In this paper we analyze the equations of motion of a submerged rigid body. Our motivation is based on recent developments in trajectory design for this problem. Our goal is to relate some properties of singular extremals to the existence of decoupling vector fields. The ideas displayed in this paper can be viewed as a starting point for a geometric formulation of the trajectory design problem for mechanical systems with potential and external forces.
Abstract:
In this paper we identify the origins of stop-and-go (or slow-and-go) driving and measure microscopic features of their propagations by analyzing vehicle trajectories via Wavelet Transform. Based on 53 oscillation cases analyzed, we find that oscillations can be originated by either lane-changing maneuvers (LCMs) or car-following behavior (CF). LCMs were predominantly responsible for oscillation formations in the absence of considerable horizontal or vertical curves, whereas oscillations formed spontaneously near roadside work on an uphill segment. Regardless of the trigger, the features of oscillation propagations were similar in terms of propagation speed, oscillation duration, and amplitude. All observed cases initially exhibited a precursor phase, in which slow-and-go motions were localized. Some of them eventually transitioned into a well-developed phase, in which oscillations propagated upstream in queue. LCMs were primarily responsible for the transition, although some transitions occurred without LCMs. Our findings also suggest that an oscillation has a regressive effect on car-following behavior: a deceleration wave of an oscillation induces a timid driver (with larger response time and minimum spacing) to become less timid and an aggressive driver to become less aggressive, although this change may be short-lived. An extended framework of Newell’s CF model is able to describe the regressive effects with two additional parameters with reasonable accuracy, as verified using vehicle trajectory data.
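Newell's simplified car-following model, on which the extended framework above builds, reproduces the leader's trajectory shifted by a response time tau and a minimum spacing d: x_f(t) = x_l(t - tau) - d. A minimal fixed-parameter sketch; the paper's extension, in which the two parameters vary through an oscillation, is not reproduced here:

```python
def newell_follower(leader_traj, dt, tau, d):
    """Newell's simplified car-following model on sampled trajectories.
    leader_traj: leader positions (m) sampled every dt seconds.
    tau: follower response time (s); d: minimum spacing (m).
    Returns the follower's positions at the same sample times."""
    shift = round(tau / dt)  # response time expressed in whole time steps
    # Before t = tau the shifted leader position is undefined; holding the
    # first sample is a simple initialization assumption for this sketch.
    return [leader_traj[max(0, k - shift)] - d for k in range(len(leader_traj))]
```

In the abstract's terms, a "timid" driver has larger tau and d; the regressive effect corresponds to those values shrinking after a deceleration wave passes.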
Abstract:
Traffic oscillations are typical features of congested traffic flow that are characterized by recurring decelerations followed by accelerations. However, people have limited knowledge of this complex topic. In this research, 1) the impact of traffic oscillations on freeway crash occurrences has been measured using the matched case-control design. The results consistently reveal that oscillations have a more significant impact on freeway safety than the average traffic states. 2) Wavelet Transform has been adopted to locate oscillations' origins and measure their characteristics along their propagation paths using vehicle trajectory data. 3) Lane changing maneuver's impact on the immediate follower is measured and modeled. The knowledge and the new models generated from this study could provide a better understanding of the fundamentals of congested traffic; enable improvements to existing traffic control strategies and freeway crash countermeasures; and motivate new operational strategies aimed at reducing the negative effects of oscillatory driving.
Abstract:
Complex networks have been studied extensively due to their relevance to many real-world systems such as the world-wide web, the internet, biological and social systems. During the past two decades, studies of such networks in different fields have produced many significant results concerning their structures, topological properties, and dynamics. Three well-known properties of complex networks are scale-free degree distribution, small-world effect and self-similarity. The search for additional meaningful properties and the relationships among these properties is an active area of current research. This thesis investigates a newer aspect of complex networks, namely their multifractality, which is an extension of the concept of self-similarity. The first part of the thesis aims to confirm that the study of properties of complex networks can be expanded to a wider field including more complex weighted networks. Those real networks that have been shown to possess the self-similarity property in the existing literature are all unweighted networks. We use the protein-protein interaction (PPI) networks as a key example to show that their weighted networks inherit the self-similarity from the original unweighted networks. Firstly, we confirm that the random sequential box-covering algorithm is an effective tool to compute the fractal dimension of complex networks. This is demonstrated on the Homo sapiens and E. coli PPI networks as well as their skeletons. Our results verify that the fractal dimension of the skeleton is smaller than that of the original network because the shortest distance between nodes is larger in the skeleton; hence, for a fixed box size, more boxes are needed to cover the skeleton. Then we adopt the iterative scoring method to generate weighted PPI networks of five species, namely Homo sapiens, E. coli, yeast, C. elegans and Arabidopsis thaliana.
By using the random sequential box-covering algorithm, we calculate the fractal dimensions for both the original unweighted PPI networks and the generated weighted networks. The results show that self-similarity is still present in generated weighted PPI networks. This implication will be useful for our treatment of the networks in the third part of the thesis. The second part of the thesis aims to explore the multifractal behavior of different complex networks. Fractals such as the Cantor set, the Koch curve and the Sierpinski gasket are homogeneous since these fractals consist of a geometrical figure which repeats on an ever-reduced scale. Fractal analysis is a useful method for their study. However, real-world fractals are not homogeneous; there is rarely an identical motif repeated on all scales. Their singularity may vary on different subsets, implying that these objects are multifractal. Multifractal analysis is a useful way to systematically characterize the spatial heterogeneity of both theoretical and experimental fractal patterns. However, the tools for multifractal analysis of objects in Euclidean space are not suitable for complex networks. In this thesis, we propose a new box-covering algorithm for multifractal analysis of complex networks. This algorithm is demonstrated in the computation of the generalized fractal dimensions of some theoretical networks, namely scale-free networks, small-world networks and random networks, and a class of real networks, namely the PPI networks of different species. Our main finding is the existence of multifractality in scale-free networks and PPI networks, while the multifractal behaviour is not confirmed for small-world networks and random networks. As another application, we generate gene interaction networks for patients and healthy people using the correlation coefficients between microarrays of different genes. Our results confirm the existence of multifractality in gene interaction networks.
This multifractal analysis then provides a potentially useful tool for gene clustering and identification. The third part of the thesis aims to investigate the topological properties of networks constructed from time series. Characterising complicated dynamics from time series is a fundamental problem of continuing interest in a wide variety of fields. Recent works indicate that complex network theory can be a powerful tool to analyse time series. Many existing methods for transforming time series into complex networks share a common feature: they define the connectivity of a complex network by the mutual proximity of different parts (e.g., individual states, state vectors, or cycles) of a single trajectory. In this thesis, we propose a new method to construct networks from time series: we define nodes by vectors of a certain length in the time series, and the weight of an edge between two nodes by the Euclidean distance between the corresponding two vectors. We apply this method to build networks for fractional Brownian motions, whose long-range dependence is characterised by their Hurst exponent. We verify the validity of this method by showing that time series with stronger correlation, hence larger Hurst exponent, tend to have smaller fractal dimension, hence smoother sample paths. We then construct networks via the technique of horizontal visibility graph (HVG), which has been widely used recently. We confirm a known linear relationship between the Hurst exponent of fractional Brownian motion and the fractal dimension of the corresponding HVG network. In the first application, we apply our newly developed box-covering algorithm to calculate the generalized fractal dimensions of the HVG networks of fractional Brownian motions as well as those for binomial cascades and five bacterial genomes. The results confirm the monoscaling of fractional Brownian motion and the multifractality of the rest.
As an additional application, we discuss the resilience of networks constructed from time series via two different approaches: visibility graph and horizontal visibility graph. Our finding is that the degree distribution of VG networks of fractional Brownian motions is scale-free (i.e., having a power law) meaning that one needs to destroy a large percentage of nodes before the network collapses into isolated parts; while for HVG networks of fractional Brownian motions, the degree distribution has exponential tails, implying that HVG networks would not survive the same kind of attack.
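The horizontal visibility graph construction discussed in this thesis abstract is simple enough to sketch directly: two points of the series are linked when every value between them lies strictly below both. An illustrative O(n^2) version, written for clarity rather than speed and not taken from the thesis:

```python
def horizontal_visibility_graph(series):
    """Build the horizontal visibility graph (HVG) of a numeric time
    series. Nodes are sample indices; indices i < j are linked when every
    intermediate value is strictly below min(series[i], series[j]).
    Returns the edge set as (i, j) tuples with i < j."""
    n = len(series)
    edges = set()
    for i in range(n - 1):
        edges.add((i, i + 1))  # consecutive points always see each other
        for j in range(i + 2, n):
            # All intermediate bars must sit below both endpoints.
            if max(series[i + 1:j]) < min(series[i], series[j]):
                edges.add((i, j))
    return edges
```

The degree distribution of such graphs is what distinguishes the two cases in the abstract: exponential tails for HVG networks of fractional Brownian motion versus the power-law tails reported for the (unrestricted) visibility graph.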
Abstract:
This paper outlines a feasible scheme to extract the deck trend when a rotary-wing unmanned aerial vehicle (RUAV) approaches an oscillating deck. An extended Kalman filter (EKF) is developed to fuse measurements from multiple sensors for effective estimation of the unknown deck heave motion. Also, a recursive Prony Analysis (PA) procedure is proposed to implement online curve-fitting of the estimated heave motion. The proposed PA constructs an appropriate model with parameters identified using the forgetting factor recursive least squares (FFRLS) method. The deck trend is then extracted by separating dominant modes. Performance of the proposed procedure is evaluated using real ship motion data, and simulation results justify the suitability of the proposed method for safe landing of RUAVs operating in a maritime environment.
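The FFRLS recursion mentioned above follows the standard forgetting-factor least-squares update, here sketched for a scalar measurement y = phi . theta + noise. A pure-Python single-step illustration under stated assumptions (symmetric covariance P, scalar measurement); the paper's application of it to Prony model parameters is not reproduced:

```python
def ffrls_step(theta, P, phi, y, lam=0.98):
    """One forgetting-factor recursive least squares (FFRLS) update.
    theta: current parameter estimate (list of n floats).
    P: symmetric covariance matrix (n x n list of lists).
    phi: regressor vector for this sample; y: scalar measurement.
    lam: forgetting factor in (0, 1]; values below 1 discount old data,
    which lets the fit track slowly varying deck motion.
    Returns updated (theta, P)."""
    n = len(theta)
    # Gain K = P phi / (lam + phi^T P phi); P is assumed symmetric, so
    # phi^T P equals (P phi)^T and Pphi can be reused below.
    Pphi = [sum(P[i][j] * phi[j] for j in range(n)) for i in range(n)]
    denom = lam + sum(phi[i] * Pphi[i] for i in range(n))
    K = [v / denom for v in Pphi]
    err = y - sum(phi[i] * theta[i] for i in range(n))  # prediction error
    theta = [theta[i] + K[i] * err for i in range(n)]
    # Covariance update: P <- (P - K phi^T P) / lam
    P = [[(P[i][j] - K[i] * Pphi[j]) / lam for j in range(n)] for i in range(n)]
    return theta, P
```

Called once per new heave sample, the recursion keeps a running fit whose dominant modes can then be separated to recover the trend.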
Abstract:
This paper will compare and evaluate the effectiveness of commercial media lobbying and advocacy against public service media in two countries, the United Kingdom and Australia. The paper will focus empirically on the commercial media coverage of public service media issues in these countries (relating to the BBC and ABC respectively) over the period since the election of the Conservative-led Coalition in Britain in June 2010, and the election of the Gillard government in Australia in August 2010. Reference will be made to preceding periods as relevant to an understanding of the current environment. In both countries the main commercial media rival to public service media is News Corp and its associated organisations – News Ltd and Sky News in Australia, and News International and BSkyB in the UK. The paper will examine, through analysis of print and online news and commentary content, how News Corp outlets have reported and commented on the activities and plans of public service media as the latter have developed and extended their presence on digital TV and online platforms. It will also consider the responses of the ABC and BBC to these interventions. It will consider, thirdly, the responses of Australian and British governments to these debates, and the policy outcomes. This section of the paper will seek to evaluate the trajectory of the policy-public-private dynamic in recent years, and to draw conclusions as to the future direction of policy. Particular attention will be devoted to recent key moments in this unfolding dialogue.
These key moments include: in Britain, debates around the efforts of News Corp to take over 100% of BSkyB, both before and after the breaking of the phone-hacking scandal in July 2011; in Australia, the debate around the National Broadband Network and the competitive tender process for ABC World, that country’s public service transnational broadcaster; and other moments where rivalry between News Corp companies and public service media became mainstream news stories provoking wider public debate. The paper will conclude with recommendations as to how public service media organisations might engage constructively with commercial organisations in the future, including News Corp, and taking into account emerging technological and financial challenges to traditional rationales for public service provision.
Abstract:
In this study we develop a theorization of an Internet dating site as a cultural artifact. The site, Gaydar, is targeted at gay men. We argue that contemporary received representations of their sexuality figure heavily in the site’s focus by providing a cultural logic for the apparent ad hoc development trajectories of its varied commercial and non-commercial services. More specifically, we suggest that the growing sets of services related to the website are heavily enmeshed within current social practices and meanings. These practices and meanings are, in turn, shaped by the interactions and preferences of a variety of diverse groups involved in what is routinely seen within the mainstream literature as a singularly specific sexuality and cultural project. Thus, we attend to two areas – the influence of the various social engagements associated with Gaydar together with the further extension of its trajectory ‘beyond the web’. Through the case of Gaydar, we contribute a study that recognizes the need for attention to sexuality in information systems research and one which illustrates sexuality as a pivotal aspect of culture. We also draw from anthropology to theorize ICTs as cultural artifacts and provide insights into the contemporary phenomena of ICT enabled social networking.