221 results for trajectory


Relevance:

10.00%

Publisher:

Abstract:

Cognitive load theory was used to generate a series of three experiments investigating the effects of various worked example formats on learning orthographic projection. Experiments 1 and 2 investigated the relative benefits of problems, conventional worked examples incorporating only the final 2-D and 3-D representations, and modified worked examples with several intermediate stages of rotation between the 2-D and 3-D representations. Modified worked examples proved superior to conventional worked examples without intermediate stages, while conventional worked examples were, in turn, superior to problems. Experiment 3 investigated the consequences of varying the number and location of intermediate stages in the rotation trajectory and found three stages to be superior to one; a single intermediate stage was superior when located nearer the 2-D end of the trajectory than the 3-D end. It was concluded that (a) orthographic projection is learned best using worked examples with several intermediate stages, and (b) a linear relation between angle of rotation and problem difficulty does not hold for orthographic projection material. Cognitive load theory can be used to suggest the ideal location of the intermediate stages.

Relevance:

10.00%

Publisher:

Abstract:

In this paper we identify the origins of stop-and-go (or slow-and-go) driving and measure microscopic features of its propagation by analyzing vehicle trajectories with the wavelet transform. Based on 53 oscillation cases analyzed, we find that oscillations can originate from either lane-changing maneuvers (LCMs) or car-following (CF) behavior. LCMs were predominantly responsible for oscillation formation in the absence of considerable horizontal or vertical curves, whereas oscillations formed spontaneously near roadside work on an uphill segment. Regardless of the trigger, the features of oscillation propagation were similar in terms of propagation speed, duration, and amplitude. All observed cases initially exhibited a precursor phase, in which slow-and-go motions were localized. Some eventually transitioned into a well-developed phase, in which oscillations propagated upstream through the queue. LCMs were primarily responsible for the transition, although some transitions occurred without LCMs. Our findings also suggest that an oscillation has a regressive effect on car-following behavior: the deceleration wave of an oscillation causes a timid driver (one with larger response time and minimum spacing) to become less timid, and an aggressive driver to become less aggressive, although this change may be short-lived. An extended framework of Newell's CF model describes these regressive effects with reasonable accuracy using two additional parameters, as verified against vehicle trajectory data.
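The abstract does not spell out the wavelet procedure, but a common realisation in this line of work is to scan each vehicle's speed series with a Mexican-hat wavelet and flag bursts of wavelet energy as oscillation onsets. Below is a minimal sketch of that idea; the function names, scale range, and the use of a hand-rolled Ricker convolution (so only NumPy is assumed) are illustrative choices, not the paper's own code:

```python
import numpy as np

def ricker(length, a):
    """Mexican-hat (Ricker) wavelet of width a, sampled at `length` points."""
    t = np.arange(length) - (length - 1) / 2.0
    norm = 2.0 / (np.sqrt(3.0 * a) * np.pi**0.25)
    return norm * (1.0 - (t / a) ** 2) * np.exp(-(t**2) / (2.0 * a**2))

def wavelet_energy(speed, scales=range(2, 65)):
    """Average wavelet energy of a speed series across scales.

    Sharp deceleration waves concentrate energy at many scales, so peaks
    in this series mark candidate oscillation origins. Assumes the speed
    series is longer than the largest wavelet (10 * max scale samples)."""
    x = speed - np.mean(speed)
    energy = np.zeros(len(speed))
    for a in scales:
        coef = np.convolve(x, ricker(10 * a, a), mode="same")
        energy += coef**2
    return energy / len(scales)

# speed = ...  # 1-D array of one vehicle's speeds, e.g. sampled at 10 Hz
# onset_index = int(np.argmax(wavelet_energy(speed)))
# Comparing onset times across a platoon of trajectories then yields the
# propagation speed of the oscillation.
```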

Relevance:

10.00%

Publisher:

Abstract:

Traffic oscillations are a typical feature of congested traffic flow, characterized by recurring decelerations followed by accelerations. However, knowledge of this complex phenomenon remains limited. In this research: 1) the impact of traffic oscillations on freeway crash occurrence was measured using a matched case-control design, with results consistently revealing that oscillations have a more significant impact on freeway safety than average traffic states; 2) the wavelet transform was adopted to locate oscillations' origins and measure their characteristics along their propagation paths using vehicle trajectory data; and 3) the impact of lane-changing maneuvers on the immediate follower was measured and modeled. The knowledge and new models generated by this study could provide a better understanding of the fundamentals of congested traffic, enable improvements to existing traffic control strategies and freeway crash countermeasures, and motivate new operational strategies aimed at reducing the negative effects of oscillatory driving.

Relevance:

10.00%

Publisher:

Abstract:

This paper considers the implications of journalism research being located within the Field of Research associated with the creative arts and writing in the recent Excellence in Research for Australia (ERA) evaluations. While noting that this classification does capture a significant trajectory in Australian journalism research, it also points to some anomalous implications of understanding journalism as an arts discipline, given its historical co-location in universities with communications disciplines, and the mutually reinforcing relationships between the two fields.

Relevance:

10.00%

Publisher:

Abstract:

The Future of Financial Regulation is an edited collection of papers presented at a major conference at the University of Glasgow in Spring 2009. It draws together a variety of perspectives on the international financial crisis which began in August 2007 and turned into a more widespread economic crisis following the collapse of Lehman Brothers in the autumn of 2008. Spring 2009 was in many respects the nadir of the crisis, since valuations in financial markets had reached their low point and crisis management, rather than regulatory reform, was the main focus of attention. The conference and book were deliberately framed as an attempt to re-focus attention from the former to the latter. The first part of the book focuses on the context of the crisis, discussing the general characteristics of financial crises and the specific influences at work during this period. The second part focuses more specifically on the regulatory techniques and practices implicated in the crisis, noting in particular an over-reliance on the capacity of regulators and financial institutions to manage risk and on the capacity of markets to self-correct. The third part focuses on the role of governance and ethics in the crisis, and in particular the need for a common ethical framework to underpin governance practices and to provide greater clarity in the design of accountability mechanisms. The final part focuses on the trajectory of regulatory reform, noting the considerable potential for change created by the role of the state in the rescue and recuperation of the financial system, and stressing the need for a fundamental re-appraisal of business and regulatory models. This informative book will be of interest to financial regulators and theorists, commercial and financial law practitioners, and academics involved in the law and economics of regulation.

Relevance:

10.00%

Publisher:

Abstract:

While the use of unmanned systems in combat is not new, what will be new in the foreseeable future is how such systems are used and integrated in the civilian space. The potential use of Unmanned Aerial Vehicles (UAVs) in civil and commercial applications is becoming a reality and is receiving considerable attention from industry and the research community. The majority of UAVs performing civilian tasks are restricted to flying only in segregated space, not within the National Airspace. The areas that UAVs are restricted to are typically not above populated regions, which in turn are the areas most useful for civilian applications. The reasoning behind the current restrictions is mainly that current UAV technologies cannot demonstrate an Equivalent Level of Safety to manned aircraft, particularly in the case of an engine failure, which would require an emergency or forced landing. This chapter will present and guide the reader through a number of developments that would facilitate the integration of UAVs into the National Airspace. Algorithms for UAV Sense-and-Avoid and Forced Landing are recognized as two major enabling technologies for the integration of UAVs into civilian airspace. The following sections describe some of the techniques currently being tested at the Australian Research Centre for Aerospace Automation (ARCAA), with emphasis on the detection of candidate landing sites using computer vision, the planning of the UAV's descent path trajectory, and the decision-making process behind the selection of the final landing site.

Relevance:

10.00%

Publisher:

Abstract:

In a similar fashion to many western countries, the political context of Japan has been transformed since the 1975 UN World Conference on Women, which eventually led to the establishment of the Basic Law for a Gender-equal Society in Japan in 1999. The Basic Law sets out a series of general guidelines across every field of society, including education. This policy-trajectory study targets gender issues in Japanese higher education, following the development of the Basic Law and, in particular, how it has been interpreted by bureaucrats and implemented within the field of higher education. This feminist policy research examines Japanese power relationships within the field of gender and identifies gender discourses embedded within Japanese gender equity policy documents. The study documents the experiences of, and strategies used by, Japanese feminists in relation to gender equity policies in education. Drawing on critical feminist theory and feminist critical discourse theory, the study explores the relationship between gender discourses and social practices and analyses how unequal gender relations have been sustained through the implementation of Japanese gender equity policy. Feminist critical policy analysis and feminist critical discourse analysis are used to examine data collected through interviews with key players, including policy makers and policy administrators from the national government and from higher education institutions offering teacher education courses. The study also scrutinises the minutes of government meetings and other relevant policy documents. The study highlights the struggles between policy makers in the government and bureaucracy, and feminist educators working for change. Following an anti-feminist backlash, feminist discourses in the original policy documents were weakened or marginalised in revisions, ultimately diluting the impact of the Basic Law in higher education institutions. Four key findings are presented: 1) a tracing of the original feminist teachers' movement that existed just prior to the development of the Basic Law in 1999; 2) an account of the formation of the Basic Law, and of how the policy resulted in a weakening of the main tenets of women's policy from a feminist perspective; 3) the problematic manner in which the Basic Law was interpreted at the bureaucratic level; and 4) the limited impact of the Basic Law on higher education, and the strategies and struggles of feminist scholars in reaction to this law.

Relevance:

10.00%

Publisher:

Abstract:

Language is a unique aspect of human communication because it can be used to discuss itself in its own terms. For this reason, human societies potentially have superior capacities for co-ordination, reflexive self-correction, and innovation compared with other animal, physical or cybernetic systems. However, this analysis also reveals that language is interconnected with the economically and technologically mediated social sphere and hence is vulnerable to abstraction, objectification, reification, and therefore ideology – all of which are antithetical to its reflexive function, whilst paradoxically being a fundamental part of it. In particular, under capitalism, language is increasingly commodified within the social domains created and affected by ubiquitous communication technologies. The advent of the so-called ‘knowledge economy’ implicates exchangeable forms of thought (language) as the fundamental commodities of this emerging system. The historical point at which a ‘knowledge economy’ emerges, then, is the critical point at which thought itself becomes a commodified ‘thing’, and language becomes its “objective” means of exchange. However, the processes by which such commodification and objectification occur obscure the unique social relations within which these language commodities are produced. The latest economic phase of capitalism – the knowledge economy – and the obfuscating trajectory which accompanies it are, we argue, destroying the reflexive capacity of language, particularly through the process of commodification. This can be seen in the language practices that have emerged in conjunction with digital technologies, which are increasingly non-reflexive and therefore less capable of self-critical, conscious change.

Relevance:

10.00%

Publisher:

Abstract:

This paper examines some of the implications for China of the creative industries agenda as drawn by some recent commentators. The creative industries have been seen by many commentators as essential if China is to move from an imitative low-value economy to an innovative high-value one. Some suggest that this trajectory is impossible without a full transition to liberal capitalism and democracy - not just removing censorship but instituting 'enlightenment values'. Others suggest that the development of the creative industries themselves will promote social and political change. The paper suggests that the creative industries agenda takes certain elements of a prior cultural industries concept and links them to a new kind of economic development agenda. Though this agenda presents problems for the Chinese government, it does not in itself imply the kind of radical democratic political change with which these commentators associate it. In the form in which the creative industries are presented – as part of an informational economy rather than as a cultural politics – they can be accommodated by a Chinese regime doing ‘business as usual’.

Relevance:

10.00%

Publisher:

Abstract:

A forced landing is an unscheduled event in flight requiring an emergency landing, most commonly attributed to engine failure, avionics failure or adverse weather. Since the ability to conduct a successful forced landing is the primary indicator of safety in the aviation industry, automating this capability for unmanned aerial vehicles (UAVs) will help facilitate their integration into, and subsequent routine operations over, civilian airspace. Currently, no commercial system is available to perform this task; however, a team at the Australian Research Centre for Aerospace Automation (ARCAA) is working towards developing such an automated forced landing system. This system, codenamed Flight Guardian, will operate onboard the aircraft and use machine vision for site identification; artificial intelligence for data assessment and evaluation; and path planning, guidance and control techniques to actualize the landing. This thesis focuses on research specific to the third category, and presents the design, testing and evaluation of a Trajectory Generation and Guidance System (TGGS) that navigates the aircraft to land at a chosen site following an engine failure. Firstly, two algorithms are developed that adapt manned-aircraft forced landing techniques to the UAV planning problem. Algorithm 1 allows the UAV to select a route (from a library) based on a fixed glide range and the ambient wind conditions, while Algorithm 2 uses a series of adjustable waypoints to cater for changing winds. A comparison of both algorithms over 200 simulated forced landings found that with Algorithm 2, twice as many landings fell within the designated area, with an average lateral miss distance of 200 m at the aimpoint. These results provide a baseline for further refinements to the planning algorithms. A significant contribution is the design of the 3-D Dubins Curves planning algorithm, which extends the elementary concepts underlying 2-D Dubins paths to account for powerless flight in three dimensions. This has also resulted in new methods for testing path traversability, for losing excess altitude, and for forming the actual path so as to ensure aircraft stability. Simulations using this algorithm have demonstrated lateral and vertical miss distances of under 20 m at the approach point, in wind speeds of up to 9 m/s. This is more than a tenfold improvement over Algorithm 2 and emulates the performance of manned, powered aircraft. The lateral guidance algorithm originally developed by Park, Deyst, and How (2007) is enhanced to include wind information in the guidance logic. A simple assumption is also made that reduces the complexity of the algorithm when following a circular path, without sacrificing performance, and a specific method of supplying the correct turning direction is used. Simulations show that this new algorithm, named the Enhanced Nonlinear Guidance (ENG) algorithm, performs much better in changing winds, with cross-track errors at the approach point within 2 m, compared to over 10 m using Park's original algorithm. A fourth contribution is the Flight Path Following Guidance (FPFG) algorithm, which uses path-angle calculations and MacCready theory to determine the optimal speed to fly in winds. This algorithm also uses proportional-integral-derivative (PID) gain schedules to finely tune the tracking accuracy, and has demonstrated in simulation vertical miss distances of under 2 m in changing winds.
A fifth contribution is the Modified Proportional Navigation (MPN) algorithm, which uses principles from proportional navigation and the ENG algorithm, as well as methods of its own, to calculate the required pitch to fly. This algorithm is robust to wind changes and is easily adaptable to any aircraft type. Tracking accuracies obtained with this algorithm are comparable to those obtained using the FPFG algorithm. For all three preceding guidance algorithms, a novel method utilising the geometric and temporal relationship between aircraft and path is also employed to ensure that the aircraft can still track the desired path to completion in strong winds while remaining stabilised. Finally, a derived contribution is made in modifying the 3-D Dubins Curves algorithm to suit helicopter flight dynamics. This modification allows a helicopter to autonomously track both stationary and moving targets in flight, and is highly advantageous for applications such as traffic surveillance, police pursuit, security or payload delivery. Each of these achievements serves to enhance the onboard autonomy and safety of a UAV, which in turn will help facilitate the integration of UAVs into civilian airspace for a wider appreciation of the good that they can provide. The automated UAV forced landing planning and guidance strategies presented in this thesis will allow the progression of this technology from the design and developmental stages through to a prototype system that can demonstrate its effectiveness to the UAV research and operations community.
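The thesis's ENG enhancements are not reproduced in this abstract, but the baseline it builds on, the Park, Deyst and How (2007) nonlinear lateral guidance law, is compact: command a lateral acceleration a_cmd = 2V²/L1 · sin(η) toward a reference point a distance L1 ahead on the desired path, where η is the angle between the velocity vector and the line of sight to that point. A minimal 2-D sketch under those assumptions (flat-Earth frame, ground-velocity inputs); the wind and turn-direction refinements described above are not included:

```python
import math

def l1_lateral_accel(pos, vel, ref_point):
    """Park/Deyst/How nonlinear guidance: a_cmd = 2*V^2/L1 * sin(eta).

    pos, vel  : aircraft 2-D position (m) and ground velocity (m/s)
    ref_point : point on the desired path roughly L1 ahead of the aircraft
    Returns the signed lateral acceleration command (m/s^2); the sign of
    the 2-D cross product supplies the turn direction.
    """
    dx, dy = ref_point[0] - pos[0], ref_point[1] - pos[1]
    L1 = math.hypot(dx, dy)
    V = math.hypot(vel[0], vel[1])
    sin_eta = (vel[0] * dy - vel[1] * dx) / (V * L1)  # signed sin(eta)
    return 2.0 * V * V / L1 * sin_eta

# The command maps to a bank-angle setpoint via phi = atan(a_cmd / g):
# phi = math.atan(l1_lateral_accel((0, 0), (25, 0), (100, 20)) / 9.81)
```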

Relevance:

10.00%

Publisher:

Abstract:

Car-following models play a critical role in all microscopic traffic simulation models. Current microscopic simulation models are unable to mimic the unsafe behaviour of drivers, as most are based on presumptions about drivers' safe behaviour. The Gipps model is a widely used car-following model embedded in various micro-simulation packages. This paper examines the Gipps car-following model to investigate ways of improving it for safety studies. The paper puts forward suggestions to modify the Gipps model so as to improve its ability to simulate unsafe vehicle movements (vehicles with safety indicators below critical thresholds). The result is one step towards facilitating the assessment and prediction of safety on motorways using microscopic simulation. NGSIM, a rich source of motorway vehicle trajectory data, is used to extract relatively risky events; short following headways and Time To Collision are used to identify safety-critical events within the traffic flow. The results show that the proposed modified car-following model predicts the unsafe trajectories with smaller errors than the generic Gipps model.
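The abstract does not reproduce the modification itself, but the generic Gipps (1981) update it starts from is standard: the follower's next speed is the minimum of an acceleration branch and a safe-braking branch. A minimal sketch with illustrative, uncalibrated parameter values; the TTC helper mirrors the kind of safety indicator the paper uses but is this sketch's own addition:

```python
import math

def gipps_speed(v, v_lead, gap, tau=0.67, a=1.7, b=-3.4, b_lead=-3.2, V=30.0):
    """One Gipps (1981) car-following update.

    v, v_lead : follower and leader speeds (m/s)
    gap       : follower-front to leader-rear spacing minus the leader's
                effective length (m)
    tau       : driver reaction time (s); a, b: max accel / decel (b < 0)
    Returns the follower's speed tau seconds later. Values illustrative.
    """
    # Acceleration branch: free-flow speed gain toward desired speed V.
    v_acc = v + 2.5 * a * tau * (1.0 - v / V) * math.sqrt(0.025 + v / V)
    # Safe-braking branch: the fastest speed that still lets the follower
    # stop behind the leader if the leader brakes at rate b_lead.
    term = b * b * tau * tau - b * (2.0 * gap - v * tau - v_lead * v_lead / b_lead)
    v_safe = b * tau + math.sqrt(max(term, 0.0))
    return max(0.0, min(v_acc, v_safe))

def time_to_collision(v, v_lead, gap):
    """TTC safety indicator: time until collision at the current speeds."""
    return gap / (v - v_lead) if v > v_lead else math.inf
```

A safety-oriented modification of the kind the paper studies would relax the safe-braking branch (e.g. shorter assumed headway buffers) so that simulated vehicles can reach sub-threshold TTC values, which the generic model by construction avoids.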

Relevance:

10.00%

Publisher:

Abstract:

Stochastic models for competing clonotypes of T cells, formulated as multivariate, continuous-time, discrete-state Markov processes, have been proposed in the literature by Stirk, Molina-París and van den Berg (2008). A stochastic modelling framework is important because of rare events associated with small populations of some critical cell types. Computational methods for these problems usually employ a trajectory-based approach built on Monte Carlo simulation, partly because the complementary probability density function (PDF) approaches can be expensive; here, however, we describe some efficient PDF approaches that directly solve the governing equations, known as the Master Equation. These computations are made very efficient through an approximation of the state space by the Finite State Projection and through the use of Krylov subspace methods when evaluating the action of the matrix exponential. These computational methods allow us to explore the evolution of the PDFs associated with these stochastic models, and bimodal distributions arise in some parameter regimes. Time-dependent propensities naturally arise in immunological processes due to, for example, age-dependent effects. Incorporating time-dependent propensities into the framework of the Master Equation significantly complicates the corresponding computational methods, but here we describe an efficient approach via Magnus formulas. Although this contribution focuses on the example of competing clonotypes, the general principles are relevant to multivariate Markov processes and provide fundamental techniques for computational immunology.
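As a concrete illustration of the PDF route, here is a minimal sketch on a toy one-dimensional birth-death process; the competing-clonotype model is its two-dimensional analogue on a flattened lattice. The Finite State Projection is simply the truncation of the state space to {0, ..., N}. SciPy's expm_multiply stands in for the Krylov propagator described above, and the rates and truncation size are illustrative, not the paper's:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import expm_multiply

def birth_death_generator(N, birth, death):
    """Master Equation generator A on the truncated states {0, ..., N}
    (the Finite State Projection): dp/dt = A p, so p(t) = exp(A t) p(0)."""
    n = np.arange(N + 1)
    b = birth(n).astype(float)   # propensity of n -> n + 1
    d = death(n).astype(float)   # propensity of n -> n - 1
    b[-1] = 0.0                  # FSP: no probability flux out of the box
    d[0] = 0.0
    return diags([b[:-1], -(b + d), d[1:]], offsets=[-1, 0, 1])

N = 200
A = birth_death_generator(N,
                          birth=lambda n: np.full(n.shape, 5.0),
                          death=lambda n: 0.05 * n)
p0 = np.zeros(N + 1)
p0[10] = 1.0                      # start with exactly 10 cells
p_t = expm_multiply(A * 4.0, p0)  # PDF at t = 4 via the action of exp(At)
print(p_t.sum())                  # mass below 1 bounds the FSP error
```

Time-dependent propensities break the single matrix exponential used here, which is where the Magnus-formula approach mentioned above comes in.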

Relevance:

10.00%

Publisher:

Abstract:

PKU is a genetically inherited inborn error of metabolism caused by a deficiency of the enzyme phenylalanine hydroxylase. The failure of this enzyme causes incomplete metabolism of protein ingested in the diet, specifically the conversion of one amino acid, phenylalanine, to tyrosine, which is a precursor of the neurotransmitter dopamine. Rising levels of phenylalanine are toxic to the developing brain, disrupting the formation of white matter tracts. The impact of tyrosine deficiency is not as well understood, but is hypothesised to create a low-dopamine environment for the developing brain. Detection in the newborn period and continuous treatment (a low-protein, phenylalanine-restricted diet supplemented with phenylalanine-free protein formulas) has meant that children with early and continuously treated PKU now develop normal IQ. However, deficits in executive function (EF) are common, leading to a rate of Attention Deficit Hyperactivity Disorder (ADHD) up to five times the norm. EF worsens with exposure to higher phenylalanine levels; however, recent research has demonstrated that a high phenylalanine:tyrosine ratio, which is hypothesised to lead to poorer dopamine function, has a more negative impact on EF than phenylalanine levels alone. Research and treatment of PKU are currently phenylalanine-focused, with little investigation of the impact of tyrosine on neuropsychological development, and there is no current consensus as to the value of tyrosine monitoring or treatment in this population. Further, the research agenda in this population has focused primarily on EF impairment alone, even though additional neuropsychological functions may be compromised (e.g., mood, visuospatial function). The aim of this PhD research was to identify residual neuropsychological deficits in a cohort of children with early and continuously treated phenylketonuria at two time points in development (early childhood and early adolescence), separated by eight years. In addition, this research sought to determine which biochemical markers were associated with neuropsychological impairments. A clinical practice survey was also undertaken to ascertain the current level of monitoring and treatment of tyrosine in this population. Thirteen children with early and continuously treated PKU were tested at mean age 5.9 years and again at mean age 13.95 years on several neuropsychological measures. Four children with hyperphenylalaninemia (a milder form of PKU) were also tested at both time points and provide a comparison group for the analyses. Associations between neuropsychological function and biochemical markers were analysed. A between-groups analysis in adolescence was also conducted (children with PKU compared with their siblings) on parent-report measures of EF and mood. Minor EF impairments were evident in the PKU group by age 6 years and persisted into adolescence. Lifelong exposure to a high phenylalanine:tyrosine ratio and/or low tyrosine independent of phenylalanine was significantly associated with EF impairments at both time points. Over half the children with PKU showed severe impairment on a visuospatial task, and in adolescence this was associated only with concurrent levels of tyrosine. Children with PKU also showed a statistically significant decline on a language comprehension task from 6 years to adolescence (going from normal to subnormal); this deficit was associated with lifetime levels of phenylalanine.
In comparison, the four children with hyperphenylalaninemia demonstrated normal function at both time points, across all measures. No statistically significant differences were detected between children with PKU and their siblings on the parent reports of EF and mood. However, depressive symptoms were significantly correlated with EF, with long-term exposure to a high phenylalanine:tyrosine ratio, and with low tyrosine levels independent of phenylalanine. The practice survey of metabolic clinics from 12 countries indicated a high level of variability in the monitoring and treatment of tyrosine in this population: whilst over 80% of the clinics surveyed routinely monitored tyrosine levels in their child patients, 25% reported treatment strategies to increase tyrosine (and thereby lower the phenylalanine:tyrosine ratio) under a variety of patient presentation conditions. Overall, these studies show that the EF impairments associated with PKU provide support for the dopamine-deficiency model. A language comprehension task showed a different trajectory, serving as a timely reminder that non-EF functions also remain vulnerable in this population and that normal function in childhood does not guarantee normal function by adolescence. Mood impairments were associated with EF impairments as well as with long-term measures of the phenylalanine:tyrosine ratio and/or tyrosine. The implications of this research for enhanced clinical guidelines are discussed, given the varied current practice.

Relevance:

10.00%

Publisher:

Abstract:

The availability and use of online counseling approaches have increased rapidly over the last decade. While research has suggested a range of potential affordances and limitations of online counseling modalities, very few studies have offered detailed examinations of how counselors and clients manage asynchronous email counseling exchanges. In this paper we examine email exchanges between clients and counselors at Kids Helpline, a national Australian counseling service that offers free online, email and telephone counseling for young people up to the age of 25. We employ tools from the traditions of ethnomethodology and conversation analysis to analyze the ways in which counselors from Kids Helpline request that their clients call them, and hence change the modality of their counseling relationship from email to telephone counseling. The paper shows three multi-layered approaches counselors take in these emails as they negotiate the potentially delicate task of requesting and persuading a client to change the trajectory of their counseling relationship from text to talk without placing that relationship in jeopardy.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a comprehensive study to find the most efficient bitrate requirement for delivering mobile video that optimizes bandwidth while maintaining a good user viewing experience. In the study, forty participants were asked to choose the lowest-quality video that would still provide a comfortable, long-term viewing experience, knowing that higher video quality is more expensive and bandwidth-intensive. The paper proposes the lowest pleasing bitrates and corresponding encoding parameters for five content types: cartoon, movie, music, news and sports. It also explores how the lowest pleasing quality is influenced by content type, image resolution and bitrate, and by user gender, prior viewing experience and preference. In addition, it analyzes the trajectory of users' progression in selecting the lowest pleasing quality. The findings reveal that the lowest bitrate required for a pleasing viewing experience is much higher than that for the lowest acceptable quality. Users' criteria for the lowest pleasing video quality relate to the video's content features, as well as its usage purpose and the user's personal preferences. These findings can give video providers guidance on what quality they should offer to please mobile users.