Abstract:
Adaptation of novels and other source texts into theatre has proven to be a recurring and popular form of writing through the ages. This study argues that as the theoretical discourse has moved on from outmoded notions of fidelity to original sources, the practice of adaptation is a method of re-invigorating theatre forms and inventing new ones. This practice-led research employed a tripartite methodology comprising the writing of two play adaptations, participation by the author/researcher in their productions, and exegetical components focused on the development and deployment of analytical tools. These tools were derived from theoretical literature and a creative practice based on professional artistry "learnt by doing" over a longstanding career as actor, director and writer. A suite of analytical tools was developed through the three phases of the first project, the adaptation of Nick Earls’ novel Perfect Skin. The tools draw on Cardwell’s "comparative analysis", which encompasses close consideration of generic context, authorial context and medium-specific context; and on Stam’s "mechanics of narrative": order, duration, frequency, the narrator and point of view. A third analytical lens was developed from an awareness of the significance of the commissioning brief, and of ethical considerations and obligations to the source text, its author and its audience. The tripartite methodology provided an adaptation template that was applied to the writing and production of the second play, Red Cap, which used factual and anecdotal sources. The second play’s exegesis (Chapter 10) analyses the effectiveness of the suite of analytical tools and the reception of the production in order to conclude the study with a workable model for use in the practice of adapting existing texts, both factual and fictional, for the theatre.
Abstract:
This poster summarises the current findings from STRC’s Integrated Traveller Information research domain, which aims for accurate and reliable travel time prediction and the optimisation of multimodal trips. Three selected topics are discussed: a) a fundamental understanding of the use of the Bluetooth MAC Scanner (BMS) for travel time estimation; b) the integration of multiple sources (loops and Bluetooth) for travel time and density estimation; and c) an architecture for an online and predictive multimodal trip planner.
Abstract:
Global Navigation Satellite Systems (GNSS)-based observation systems can provide high precision positioning and navigation solutions in real time, in the order of subcentimetre, if we make use of carrier phase measurements in the differential mode and deal well with all the bias and noise terms. However, these carrier phase measurements are ambiguous by an unknown integer number of cycles. One key challenge in the differential carrier phase mode is to fix the integer ambiguities correctly. On the other hand, in safety-of-life or liability-critical applications, such as vehicle safety positioning and aviation, not only is high accuracy required, but the reliability requirement is also important. This PhD research studies how to achieve high reliability for ambiguity resolution (AR) in a multi-GNSS environment. GNSS ambiguity estimation and validation problems are the focus of the research effort. In particular, we study the case of multiple constellations, covering the initial to full operations of the foreseeable Galileo, GLONASS, Compass and QZSS navigation systems from the next few years to the end of the decade. Since real observation data are only available from the GPS and GLONASS systems, a simulation method named Virtual Galileo Constellation (VGC) is applied to generate observational data from another constellation in the data analysis. In addition, both full ambiguity resolution (FAR) and partial ambiguity resolution (PAR) algorithms are used in processing single and dual constellation data. Firstly, a brief overview of related work on AR methods and reliability theory is given. Next, a modified inverse integer Cholesky decorrelation method and its performance on AR are presented. Subsequently, a new measure of decorrelation performance called the orthogonality defect is introduced and compared with other measures.
Furthermore, a new AR scheme that considers the ambiguity validation requirement in the control of the search space size is proposed to improve search efficiency. With respect to the reliability of AR, we also discuss the computation of the ambiguity success rate (ASR) and confirm that the success rate computed with the integer bootstrapping method is quite a sharp approximation to the success rate of the actual integer least-squares (ILS) method. The advantages of multi-GNSS constellations are examined in terms of the PAR technique involving a predefined ASR. Finally, a novel satellite selection algorithm for reliable ambiguity resolution, called SARA, is developed. In summary, the study demonstrates that when the ASR is close to one, the reliability of AR can be guaranteed and the ambiguity validation is effective. The work then focuses on new strategies to improve the ASR, including a partial ambiguity resolution procedure with a predefined success rate and a novel satellite selection strategy with a high success rate. The proposed strategies bring the significant benefits of multi-GNSS signals to real-time high precision and high reliability positioning services.
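The integer bootstrapping success rate mentioned above has a standard closed form, P = ∏ᵢ (2Φ(1/(2σᵢ)) − 1), where Φ is the standard normal CDF and the σᵢ are the conditional standard deviations of the (decorrelated) ambiguities. A minimal numerical sketch, with illustrative σ values rather than data from this study:

```python
import math

def phi(x):
    # Standard normal cumulative distribution function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bootstrap_success_rate(sigmas):
    """Integer-bootstrapping lower bound on the ambiguity success rate.

    sigmas: conditional standard deviations of the decorrelated
    ambiguities, in cycles (illustrative values, not study data).
    """
    p = 1.0
    for s in sigmas:
        p *= 2.0 * phi(1.0 / (2.0 * s)) - 1.0
    return p
```

Smaller conditional standard deviations (i.e. stronger models and better decorrelation) push the bound towards one, which is the regime in which, per the study, AR reliability can be guaranteed.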
Abstract:
This practice-led research examines the generative function of loss in fiction that explores themes of grief and longing. The research considers how loss may be understood as a structuring mechanism through which characters evaluate time, resolve loss and effect future change. The creative work is a work of literary fiction titled A Distance Too Far Away. Aubrey, the story’s protagonist, is a woman in her twenties living in Brisbane in the early 1980s, carving out an independent life for herself away from her family. Through a flashback narrative sequence, told from the perspective of the twelve-year-old narrator, Aubrey retraces a significant point of rupture in her life following a series of family tragedies. A Distance Too Far Away explores the tension between belonging and freedom, and considers how the past provides a malleable space for illuminating desire in order to traverse the gap between the world as it is and the world as we want it to be. The exegetical component of this research considers an alternative critical frame for interpreting the work of American author Anne Tyler, a writer who has had a significant influence on my own practice. Tyler is frequently criticised for creating sentimental and inert characters, and many critics observe that nothing happens in her circular plots. This research challenges these assertions and, through a contextual analysis of Tyler’s Ladder of Years (1995), investigates how Tyler engages with memory and nostalgia in order to move across time and resolve loss.
Abstract:
Pressure feeder chutes are pieces of equipment used in sugar cane crushing to increase the amount of cane that can be put through a mill. The continuous pressure feeder was developed with the objective of providing a constant feed of bagasse under pressure to the mouth of the crushing mills; the pressure feeder chute is used in a sugarcane milling unit to transfer bagasse from one set of crushing rolls to a second set. There have been many pressure feeder chute failures in the past. The chute is quite vulnerable: if the bagasse throughput is blocked at the mill rollers, the pressure build-up in the chute can be enormous, which can ultimately result in failure. The result is substantial damage to the rollers, mill and chute construction, and downtimes of up to 48 hours can be experienced. Part of the problem is that bagasse behaviour in the pressure feeder chute is not well understood. If it were understood, the chute geometry design could be modified to minimise the risk of failure, and there are possible avenues for changing pressure feeder chute design and operations with a view to producing more reliable chutes in the future. Previous experimental work has attempted to determine the causes of pressure feeder chute failures, and certain guidelines are available; however, failures continue and chute behaviour remains poorly understood. This thesis contains the work carried out between 14 April 2009 and 10 October 2012, which focuses on the design of an experimental apparatus to measure forces and visually observe bagasse behaviour, in an attempt to understand bagasse behaviour in pressure feeder chutes and minimise the risk of failure.
Abstract:
This paper outlines an innovative and feasible flight control scheme for a rotary-wing unmanned aerial system (RUAS) with guaranteed safety and reliable flight quality in a gusty environment. The proposed control methodology aims to increase the gust-attenuation capability of a RUAS to ensure improved flight performance when strong gusts occur. Based on the design of an effective estimator, an altitude controller is first constructed to synchronously compensate for fluctuations of the main rotor thrust which might lead to crashes in a gusty environment. Afterwards, a nonlinear state feedback controller is proposed to stabilise the horizontal positions of the RUAS with a gust-attenuation property. Performance of the proposed control framework is evaluated using parameters of a Vario XLC helicopter, and high-fidelity simulations show that the proposed controllers can effectively reduce the side-effects of gusts and demonstrate performance improvement when compared with proportional-integral-derivative (PID) controllers.
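The PID baseline the paper compares against is a standard textbook controller. A minimal discrete-time sketch for reference (the gains and sample time are illustrative placeholders, not the tuning used in the paper's simulations):

```python
def make_pid(kp, ki, kd, dt):
    """Return a one-input discrete PID step function.

    kp, ki, kd: proportional, integral and derivative gains (illustrative).
    dt: sample time in seconds.
    """
    state = {"integral": 0.0, "prev_err": 0.0}

    def step(err):
        # Accumulate the integral and approximate the derivative
        # with a backward difference on the error signal.
        state["integral"] += err * dt
        deriv = (err - state["prev_err"]) / dt
        state["prev_err"] = err
        return kp * err + ki * state["integral"] + kd * deriv

    return step
```

Each call to `step(error)` returns the control action for that sample; the paper's point is that a fixed-gain law of this kind attenuates gusts less effectively than its estimator-based and nonlinear state feedback controllers.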
Abstract:
Density functional theory (DFT) is a powerful approach to electronic structure calculations in extended systems, but currently suffers from inadequate incorporation of long-range dispersion, or van der Waals (VdW), interactions. VdW-corrected DFT is tested for interactions involving molecular hydrogen, graphite, single-walled carbon nanotubes (SWCNTs), and SWCNT bundles. The energy correction, based on an empirical London dispersion term with a damping function at short range, allows a reasonable physisorption energy and equilibrium distance to be obtained for H2 on a model graphite surface. The VdW-corrected DFT calculation for an (8, 8) nanotube bundle accurately reproduces the experimental lattice constant. For H2 inside or outside an (8, 8) SWCNT, we find the binding energies are respectively higher and lower than that on a graphite surface, correctly predicting the well-known curvature effect. We conclude that the VdW correction is a very effective method for augmenting DFT calculations, allowing a reliable description of both short-range chemical bonding and long-range dispersive interactions. The method will find powerful applications in areas of SWCNT research where empirical potential functions either have not been developed, or do not capture the necessary range of both dispersion and bonding interactions.
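The damped empirical London term described above takes the generic DFT-D form E_disp = −f(r)·C6/r⁶, where a Fermi-type switching function f(r) turns the correction off at short range so it does not interfere with the chemical bonding already described by DFT. A sketch of that functional form, with illustrative parameters rather than the coefficients fitted in the study:

```python
import math

def damped_london_energy(r, c6, r0, d=20.0):
    """Pairwise London dispersion energy with Fermi damping.

    r:  interatomic distance; c6: dispersion coefficient; r0: sum of
    VdW radii; d: damping steepness. All values are illustrative,
    not the fitted parameters of the paper.
    """
    # f -> 1 at large r (full -C6/r^6), f -> 0 at short range.
    f = 1.0 / (1.0 + math.exp(-d * (r / r0 - 1.0)))
    return -f * c6 / r**6
```

At large separations the term reduces to the bare −C6/r⁶ attraction, while at short range the damping suppresses the otherwise divergent correction, which is what lets the scheme coexist with DFT's short-range description.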
Abstract:
The Oceania region is an area particularly prone to natural disasters such as cyclones, tsunamis, floods, droughts, earthquakes and volcanic eruptions. Many of the nations in the region are Small Island Developing States (SIDS), yet even within wealthy states such as Australia and New Zealand there are groups which are vulnerable to disaster. Vulnerability to natural disaster can be understood in human rights terms, as natural disasters threaten the enjoyment of a number of rights which are guaranteed under international law, including rights to health, housing, food, water and even the right to life itself. The impacts of climate change threaten to exacerbate these vulnerabilities, yet, despite the foreseeability of further natural disasters as a result of climate change, there currently exists no comprehensive international framework for disaster response offering practical and/or legally reliable mechanisms to assist at‐risk states and communities. This paper sets out to explore the human rights issues presented by natural disasters and examine the extent to which these issues can be addressed by disaster response frameworks at the international, regional and national levels.
Abstract:
As highlighted by previous work in Normal Accident Theory [1] and High Reliability Organisations [2], the ability of a system to be flexible is of critical importance to its capability to prepare for, respond to, and recover from disturbances and disasters. This paper proposes that the research into ‘edge organisations’ [3] and ‘agility’ [4] is a potential means to operationalise components that embed high-reliability traits in the management and oversight of critical infrastructure systems. Much prior work has focused on these concepts in a military frame, whereas the study reported on here examines the application of these concepts to aviation infrastructure, specifically a commercial international airport. As a commercial entity functions in a distinct manner from a military organisation, this study aims to better understand the complementary and contradictory components of applying agility work to a commercial context. Findings highlight the challenges of making commercial operators of infrastructure systems agile, as well as of embedding traits of High Reliability in such complex infrastructure settings.
Abstract:
STUDY DESIGN: Reliability and case-control injury study. OBJECTIVES: 1) To determine if a novel device, designed to measure eccentric knee flexor strength via the Nordic hamstring exercise (NHE), displays acceptable test-retest reliability; 2) to determine normative values for eccentric knee flexor strength derived from the device in individuals without a history of hamstring strain injury (HSI); and 3) to determine if the device could detect weakness in elite athletes with a previous history of unilateral HSI. BACKGROUND: HSIs and reinjuries are the most common cause of lost playing time in a number of sports. Eccentric knee flexor weakness is a major modifiable risk factor for future HSIs; however, there is a lack of easily accessible equipment to assess this strength quality. METHODS: Thirty recreationally active males without a history of HSI completed NHEs on the device on 2 separate occasions. Intraclass correlation coefficients (ICCs), typical error (TE), typical error as a coefficient of variation (%TE), and minimum detectable change at a 95% confidence interval (MDC95) were calculated. Normative strength data were determined using the most reliable measurement. An additional 20 elite athletes with a history of unilateral HSI within the previous 12 months performed NHEs on the device to determine if residual eccentric muscle weakness existed in the previously injured limb. RESULTS: The device displayed high to moderate reliability (ICC = 0.83 to 0.90; TE = 21.7 N to 27.5 N; %TE = 5.8 to 8.5; MDC95 = 76.2 to 60.1 N). Mean ± SD normative eccentric knee flexor strength, based on the uninjured group, was 344.7 ± 61.1 N for the left and 361.2 ± 65.1 N for the right side.
The previously injured limbs were 15% weaker than the contralateral uninjured limbs (mean difference = 50.3 N; 95% CI = 25.7 to 74.9 N; P < .01), 15% weaker than the normative left limb data (mean difference = 50.0 N; 95% CI = 1.4 to 98.5 N; P = .04) and 18% weaker than the normative right limb data (mean difference = 66.5 N; 95% CI = 18.0 to 115.1 N; P < .01). CONCLUSIONS: The experimental device offers a reliable method to determine eccentric knee flexor strength and strength asymmetry, and revealed residual weakness in previously injured elite athletes.
Abstract:
Cell trajectory data are often reported in the experimental cell biology literature to distinguish between different types of cell migration. Unfortunately, there is no accepted protocol for designing or interpreting such experiments, and this makes it difficult to quantitatively compare different published data sets and to understand how changes in experimental design influence our ability to interpret different experiments. Here, we use an individual-based mathematical model to simulate the key features of a cell trajectory experiment. This shows that our ability to correctly interpret trajectory data is extremely sensitive to the geometry and timing of the experiment, the degree of motility bias and the number of experimental replicates. We show that cell trajectory experiments produce the most reliable data when the experiment is performed in a quasi-1D geometry with a large number of identically prepared experiments conducted over a relatively short time interval, rather than a few trajectories recorded over particularly long time intervals.
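An individual-based trajectory model of the kind described can be sketched as a biased random walk in a quasi-1D geometry; the step length, bias parameter and replicate-averaging below are illustrative simplifications, not the paper's model.

```python
import random

def simulate_trajectory(steps, bias, step_len=1.0, seed=None):
    """1D biased random walk standing in for one cell trajectory.

    bias: probability of stepping in the +x direction (0.5 = unbiased
    motility). Returns the list of positions, starting at the origin.
    """
    rng = random.Random(seed)
    x = 0.0
    path = [x]
    for _ in range(steps):
        x += step_len if rng.random() < bias else -step_len
        path.append(x)
    return path

def mean_final_displacement(n_cells, steps, bias, seed=0):
    """Average end position over many replicate trajectories; its
    expected value is steps * step_len * (2 * bias - 1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_cells):
        total += simulate_trajectory(steps, bias, seed=rng.random())[-1]
    return total / n_cells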
Resumo:
The dc capacitors voltage unbalancing is the main technical drawback of a diode-clamped multilevel inverter (DCMLI), with more than three levels. A voltage-balancing circuit based on buck–boost chopper connected to the dc link of DCMLI is a reliable and robust solution to this problem. This study presents four different schemes for controlling the chopper circuit to achieve the capacitor voltages equalisation. These can be broadly categorised as single-pulse, multi-pulse and hysteresis band current control schemes. The single-pulse scheme does not involve faster switching actions but need the chopper devices to be rated for higher current. The chopper devices current rating can be kept limited by using the multi-pulse scheme but it involves faster switching actions and slower response. The hysteresis band current control scheme offers faster dynamics, lower current rating of the chopper devices and can nullify the initial voltage imbalance as well. However, it involves much faster switching actions which may not be feasible for some of its applications. Therefore depending on the system requirements and ratings, one of these schemes may be used. The performance and validity of the proposed schemes are confirmed through both simulation and experimental investigations on a prototype five-level diode-clamped inverter.
Resumo:
Mentors play a key role in developing preservice teachers for their chosen careers and providing feedback appears as a significant relational interaction between the mentor and mentee that assists in guiding the mentee’s practices. Yet, what are mentors’ perspectives on providing feedback to their mentees? In this case study, eight mentors viewed a professional video recorded science lesson facilitated by a final-year preservice teacher during practicum for the purposes of providing oral feedback in a simulated mentor-mentee discussion. Findings showed that mentors’ feedback was variable in both their positive feedback and constructive criticisms and, in one case, the feedback was contrasting in nature. Implications are discussed, including preservice teachers receiving feedback from more than one mentor and universities researching the design of valid and reliable tools to guide mentors’ oral feedback.
Resumo:
During the last several decades, the quality of natural resources and their services have been exposed to significant degradation from increased urban populations combined with the sprawl of settlements, development of transportation networks and industrial activities (Dorsey, 2003; Pauleit et al., 2005). As a result of this environmental degradation, a sustainable framework for urban development is required to provide the resilience of natural resources and ecosystems. Sustainable urban development refers to the management of cities with adequate infrastructure to support the needs of its population for the present and future generations as well as maintain the sustainability of its ecosystems (UNEP/IETC, 2002; Yigitcanlar, 2010). One of the important strategic approaches for planning sustainable cities is „ecological planning‟. Ecological planning is a multi-dimensional concept that aims to preserve biodiversity richness and ecosystem productivity through the sustainable management of natural resources (Barnes et al., 2005). As stated by Baldwin (1985, p.4), ecological planning is the initiation and operation of activities to direct and control the acquisition, transformation, disruption and disposal of resources in a manner capable of sustaining human activities with a minimum disruption of ecosystem processes. Therefore, ecological planning is a powerful method for creating sustainable urban ecosystems. In order to explore the city as an ecosystem and investigate the interaction between the urban ecosystem and human activities, a holistic urban ecosystem sustainability assessment approach is required. Urban ecosystem sustainability assessment serves as a tool that helps policy and decision-makers in improving their actions towards sustainable urban development. 
There are several methods used in urban ecosystem sustainability assessment among which sustainability indicators and composite indices are the most commonly used tools for assessing the progress towards sustainable land use and urban management. Currently, a variety of composite indices are available to measure the sustainability at the local, national and international levels. However, the main conclusion drawn from the literature review is that they are too broad to be applied to assess local and micro level sustainability and no benchmark value for most of the indicators exists due to limited data availability and non-comparable data across countries. Mayer (2008, p. 280) advocates that by stating "as different as the indices may seem, many of them incorporate the same underlying data because of the small number of available sustainability datasets". Mori and Christodoulou (2011) also argue that this relative evaluation and comparison brings along biased assessments, as data only exists for some entities, which also means excluding many nations from evaluation and comparison. Thus, there is a need for developing an accurate and comprehensive micro-level urban ecosystem sustainability assessment method. In order to develop such a model, it is practical to adopt an approach that uses a method to utilise indicators for collecting data, designate certain threshold values or ranges, perform a comparative sustainability assessment via indices at the micro-level, and aggregate these assessment findings to the local level. Hereby, through this approach and model, it is possible to produce sufficient and reliable data to enable comparison at the local level, and provide useful results to inform the local planning, conservation and development decision-making process to secure sustainable ecosystems and urban futures. 
To advance research in this area, this study investigated the environmental impacts of an existing urban context by using a composite index with an aim to identify the interaction between urban ecosystems and human activities in the context of environmental sustainability. In this respect, this study developed a new comprehensive urban ecosystem sustainability assessment tool entitled the „Micro-level Urban-ecosystem Sustainability IndeX‟ (MUSIX). The MUSIX model is an indicator-based indexing model that investigates the factors affecting urban sustainability in a local context. The model outputs provide local and micro-level sustainability reporting guidance to help policy-making concerning environmental issues. A multi-method research approach, which is based on both quantitative analysis and qualitative analysis, was employed in the construction of the MUSIX model. First, a qualitative research was conducted through an interpretive and critical literature review in developing a theoretical framework and indicator selection. Afterwards, a quantitative research was conducted through statistical and spatial analyses in data collection, processing and model application. The MUSIX model was tested in four pilot study sites selected from the Gold Coast City, Queensland, Australia. The model results detected the sustainability performance of current urban settings referring to six main issues of urban development: (1) hydrology, (2) ecology, (3) pollution, (4) location, (5) design, and; (6) efficiency. For each category, a set of core indicators was assigned which are intended to: (1) benchmark the current situation, strengths and weaknesses, (2) evaluate the efficiency of implemented plans, and; (3) measure the progress towards sustainable development. 
While the indicator set of the model provided specific information about the environmental impacts in the area at the parcel scale, the composite index score provided general information about the sustainability of the area at the neighbourhood scale. Finally, in light of the model findings, integrated ecological planning strategies were developed to guide the preparation and assessment of development and local area plans in conjunction with the Gold Coast Planning Scheme, which establishes regulatory provisions to achieve ecological sustainability through the formulation of place codes, development codes, constraint codes and other assessment criteria that provide guidance for best practice development solutions. These relevant strategies can be summarised as follows: • Establishing hydrological conservation through sustainable stormwater management in order to preserve the Earth’s water cycle and aquatic ecosystems; • Providing ecological conservation through sustainable ecosystem management in order to protect biological diversity and maintain the integrity of natural ecosystems; • Improving environmental quality through developing pollution prevention regulations and policies in order to promote high quality water resources, clean air and enhanced ecosystem health; • Creating sustainable mobility and accessibility through designing better local services and walkable neighbourhoods in order to promote safe environments and healthy communities; • Sustainable design of urban environment through climate responsive design in order to increase the efficient use of solar energy to provide thermal comfort, and; • Use of renewable resources through creating efficient communities in order to provide long-term management of natural resources for the sustainability of future generations.
Resumo:
In a classification problem typically we face two challenging issues, the diverse characteristic of negative documents and sometimes a lot of negative documents that are closed to positive documents. Therefore, it is hard for a single classifier to clearly classify incoming documents into classes. This paper proposes a novel gradual problem solving to create a two-stage classifier. The first stage identifies reliable negatives (negative documents with weak positive characteristics). It concentrates on minimizing the number of false negative documents (recall-oriented). We use Rocchio, an existing recall based classifier, for this stage. The second stage is a precision-oriented “fine tuning”, concentrates on minimizing the number of false positive documents by applying pattern (a statistical phrase) mining techniques. In this stage a pattern-based scoring is followed by threshold setting (thresholding). Experiment shows that our statistical phrase based two-stage classifier is promising.