374 results for train delays
Abstract:
Background: SEQ Catchments Ltd and QUT are collaborating on groundwater investigations in the SE Qld region, which utilise community engagement and 3D visualisation methodologies. The projects, funded by the Australian Government's NHT and Caring for our Country programmes, were initiated from local community concerns regarding groundwater sustainability and quality in areas where little was previously known. ----- Objectives: engage local and regional stakeholders to tap all available sources of information; establish ongoing (2+ years) community-based groundwater/surface water monitoring programmes; develop 3D visualisation from all available data; and involve, train and inform the local community for improved on-ground land and water use management. ----- Results and findings: Respectful community engagement yielded information, access to numerous monitoring sites and education opportunities at low cost, which would otherwise be unavailable. A Framework for Community-Based Groundwater Monitoring has been documented (Todd, 2008). 3D visualisation models have been developed for basaltic settings, which relate surface features familiar to the local community to the interpreted sub-surface hydrogeology. Groundwater surface movements have been animated and compared to local rainfall using the time-series monitoring data. An important 3D visualisation feature of particular interest to the community was the interaction between groundwater and surface water. This factor was crucial in raising awareness of potential impacts of land and water use on groundwater and surface water resources.
Abstract:
Studies have examined the associations between cancers and circulating 25-hydroxyvitamin D [25(OH)D], but little is known about the impact of different laboratory practices on 25(OH)D concentrations. We examined the potential impact of delayed blood centrifuging, choice of collection tube, and type of assay on 25(OH)D concentrations. Blood samples from 20 healthy volunteers underwent alternative laboratory procedures: four centrifuging times (2, 24, 72, and 96 h after blood draw); three types of collection tubes (red top serum tube, and two different plasma anticoagulant tubes containing heparin or EDTA); and two types of assays (DiaSorin radioimmunoassay [RIA] and chemiluminescence immunoassay [CLIA/LIAISON®]). Log-transformed 25(OH)D concentrations were analyzed using generalized estimating equation (GEE) linear regression models. We found no difference in 25(OH)D concentrations by centrifuging time or type of assay. There was some indication of a difference in 25(OH)D concentrations by tube type in CLIA/LIAISON®-assayed samples, with concentrations in heparinized plasma (geometric mean, 16.1 ng ml−1) higher than those in serum (geometric mean, 15.3 ng ml−1) (p = 0.01), but the difference was significant only after substantial centrifuging delays (96 h). Our study suggests that immediate processing of blood samples after collection is not necessary, and that the choice of tube type or assay has little impact on measured concentrations.
Abstract:
This study is the first to investigate the effect of prolonged reading on reading performance and visual functions in students with low vision. The study focuses on one of the most common modes of achieving adequate magnification for reading by students with low vision, their close reading distance (proximal or relative distance magnification). Close reading distances impose high demands on near visual functions, such as accommodation and convergence. Previous research on accommodation in children with low vision shows that their accommodative responses are reduced compared to normal vision. In addition, there is an increased lag of accommodation for higher stimulus levels, as may occur at close reading distances. Reduced accommodative responses in low vision and a higher lag of accommodation at close reading distances could together impact the reading performance of students with low vision, especially during prolonged reading tasks. The presence of convergence anomalies could further affect reading performance. Therefore, the aims of the present study were: 1) to investigate the effect of prolonged reading on reading performance in students with low vision; and 2) to investigate the effect of prolonged reading on visual functions in students with low vision. This study was conducted as cross-sectional research on 42 students with low vision and a comparison group of 20 students with normal vision, aged 7 to 20 years. The students with low vision had vision impairments arising from a range of causes and represented a typical group of students with low vision, with no significant developmental delays, attending school in Brisbane, Australia. All participants underwent a battery of clinical tests before and after a prolonged reading task.
An initial reading-specific history and pre-task measurements that included Bailey-Lovie distance and near visual acuities, Pelli-Robson contrast sensitivity, ocular deviations, sensory fusion, ocular motility, near point of accommodation (pull-away method), accuracy of accommodation (Monocular Estimation Method (MEM) retinoscopy) and Near Point of Convergence (NPC) (push-up method) were recorded for all participants. Reading performance measures were Maximum Oral Reading Rates (MORR), Near Text Visual Acuity (NTVA) and acuity reserves using Bailey-Lovie text charts. Symptoms of visual fatigue were assessed using the Convergence Insufficiency Symptom Survey (CISS) for all participants. Pre-task measurements of reading performance, accuracy of accommodation and NPC were compared with post-task measurements to test for any effects of prolonged reading. The prolonged reading task involved reading a storybook silently for at least 30 minutes. The task was controlled for print size, contrast, difficulty level and content of the reading material. Silent Reading Rate (SRR) was recorded every 2 minutes during prolonged reading. Symptom scores and visual fatigue scores were also obtained for all participants. A visual fatigue analogue scale (VAS) was used to assess visual fatigue during the task: once at the beginning, once at the middle and once at the end of the task. In addition to the subjective assessments of visual fatigue, tonic accommodation was monitored using a photorefractor (PlusoptiX CR03™) every 6 minutes during the task, as an objective assessment of visual fatigue. Reading measures were taken at the habitual reading distance of students with low vision and at 25 cm for students with normal vision. The initial history showed that the students with low vision read for significantly shorter periods at home compared to the students with normal vision.
The working distances of participants with low vision ranged from 3–25 cm, and half of them were not using any optical devices for magnification. Nearly half of the participants with low vision were able to resolve 8-point print (1M) at 25 cm. Half of the participants in the low vision group had ocular deviations and suppression at near. Reading rates were significantly reduced in students with low vision compared to those of students with normal vision. In addition, a significantly larger number of participants in the low vision group could not sustain the 30-minute task compared to the normal vision group. However, there were no significant changes in reading rates during or following prolonged reading in either the low vision or normal vision groups. Individual changes in reading rates were independent of baseline reading rates, indicating that the changes in reading rates during prolonged reading cannot be predicted from a typical clinical assessment of reading using brief reading tasks. Contrary to previous reports, the silent reading rates of the students with low vision were significantly lower than their oral reading rates, although oral and silent reading were assessed using different methods. Although visual acuity, contrast sensitivity, near point of convergence and accuracy of accommodation were significantly poorer for the low vision group compared to those of the normal vision group, there were no significant changes in any of these visual functions following prolonged reading in either group. Interestingly, a few students with low vision (n = 10) were found to be reading at a distance closer than their near point of accommodation. This suggests a decreased sensitivity to blur.
Further evaluation revealed that the equivalent intrinsic refractive errors (an estimate of the spherical dioptric defocus which would be expected to yield a patient's visual acuity in normal subjects) were significantly larger for the low vision group compared to those of the normal vision group. As expected, accommodative responses were significantly reduced for the low vision group compared to the expected norms, which is consistent with their close reading distances, reduced visual acuity and contrast sensitivity. For those in the low vision group who had an accommodative error exceeding their equivalent intrinsic refractive errors, a significant decrease in MORR was found following prolonged reading. The silent reading rates, however, were not significantly affected by accommodative errors in the present study. Suppression also had a significant impact on the changes in reading rates during prolonged reading. The participants who did not have suppression at near showed significant decreases in silent reading rates during and following prolonged reading. This impact of binocular vision at near on prolonged reading was possibly due to the high demands on convergence. The significant predictors of MORR in the low vision group were age, NTVA, reading interest and reading comprehension, accounting for 61.7% of the variance in MORR. SRR was not significantly influenced by any factors, except for the duration of the reading task sustained; participants with higher reading rates were able to sustain a longer reading duration. In students with normal vision, age was the only predictor of MORR. Participants with low vision also reported significantly greater visual fatigue compared to the normal vision group. Measures of tonic accommodation, however, were little influenced by visual fatigue in the present study. Visual fatigue analogue scores were found to be significantly associated with reading rates in students with low vision and normal vision.
However, the patterns of association between visual fatigue and reading rates were different for SRR and MORR. The participants with low vision with higher symptom scores had lower SRRs and participants with higher visual fatigue had lower MORRs. As hypothesized, visual functions such as accuracy of accommodation and convergence did have an impact on prolonged reading in students with low vision, for students whose accommodative errors were greater than their equivalent intrinsic refractive errors, and for those who did not suppress one eye. Those students with low vision who have accommodative errors higher than their equivalent intrinsic refractive errors might significantly benefit from reading glasses. Similarly, considering prisms or occlusion for those without suppression might reduce the convergence demands in these students while using their close reading distances. The impact of these prescriptions on reading rates, reading interest and visual fatigue is an area of promising future research. Most importantly, it is evident from the present study that a combination of factors such as accommodative errors, near point of convergence and suppression should be considered when prescribing reading devices for students with low vision. Considering these factors would also assist rehabilitation specialists in identifying those students who are likely to experience difficulty in prolonged reading, which is otherwise not reflected during typical clinical reading assessments.
Abstract:
Automatic recognition of people is an active field of research with important forensic and security applications. In these applications, it is not always possible for the subject to be in close proximity to the system. Voice represents a human behavioural trait which can be used to recognise people in such situations. Automatic Speaker Verification (ASV) is the process of verifying a person's identity through the analysis of their speech, and enables recognition of a subject at a distance over a telephone channel, wired or wireless. A significant amount of research has focussed on the application of Gaussian mixture model (GMM) techniques to speaker verification systems, providing state-of-the-art performance. GMMs are a type of generative classifier trained to model the probability distribution of the features used to represent a speaker. Recently introduced to the field of ASV research is the support vector machine (SVM). An SVM is a discriminative classifier requiring examples from both positive and negative classes to train a speaker model. The SVM is based on margin maximisation, whereby a hyperplane attempts to separate classes in a high-dimensional space. SVMs applied to the task of speaker verification have shown high potential, particularly when used to complement current GMM-based techniques in hybrid systems. This work aims to improve the performance of ASV systems using novel and innovative SVM-based techniques. Research was divided into three main themes: session variability compensation for SVMs; unsupervised model adaptation; and impostor dataset selection. The first theme investigated the differences between the GMM and SVM domains for the modelling of session variability, an aspect crucial for robust speaker verification. Techniques developed to improve the robustness of GMM-based classification were shown to bring about similar benefits to discriminative SVM classification through their integration in the hybrid GMM mean supervector SVM classifier.
Further, the domains for the modelling of session variation were contrasted to find a number of common factors; however, the SVM domain consistently provided marginally better session variation compensation. Minimal complementary information was found between the techniques, due to the similarities in how they achieved their objectives. The second theme saw the proposal of a novel model for the purpose of session variation compensation in ASV systems. Continuous progressive model adaptation attempts to improve speaker models by retraining them after exploiting all encountered test utterances during normal use of the system. The introduction of the weight-based factor analysis model provided significant performance improvements of over 60% in an unsupervised scenario. SVM-based classification was then integrated into the progressive system, providing further benefits in performance over the GMM counterpart. Analysis demonstrated that SVMs also hold several characteristics beneficial to the task of unsupervised model adaptation, prompting further research in the area. In pursuing the final theme, an innovative background dataset selection technique was developed. This technique selects the most appropriate subset of examples from a large and diverse set of candidate impostor observations for use as the SVM background by exploiting the SVM training process. This selection was performed on a per-observation basis so as to overcome the shortcomings of the traditional heuristic-based approach to dataset selection. Results demonstrate that the approach provides performance improvements over both the use of the complete candidate dataset and the best heuristically selected dataset, whilst being only a fraction of the size. The refined dataset was also shown to generalise well to unseen corpora and to be highly applicable to the selection of impostor cohorts required in alternative techniques for speaker verification.
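The hybrid GMM mean supervector SVM classifier referred to above can be sketched in a deliberately simplified form: fit a small GMM to each utterance's feature frames, stack the component means into a fixed-length supervector, and train a linear SVM against a background of impostor supervectors. This is a hedged illustration, not the thesis implementation (which adapts from a universal background model and uses real speech features); the data here are synthetic and all names are invented.

```python
# Simplified sketch of a GMM mean-supervector SVM speaker model.
# Synthetic "utterances"; not the thesis code or real speech features.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

rng = np.random.default_rng(42)

def supervector(features, n_components=4):
    """Fit a small GMM to an utterance's feature frames and stack the
    component means into a single fixed-length supervector."""
    gmm = GaussianMixture(n_components=n_components, random_state=0)
    gmm.fit(features)
    return gmm.means_.ravel()

# Synthetic utterances: 200 frames of 12-dim features each.
target_utts = [rng.normal(0.5, 1.0, size=(200, 12)) for _ in range(5)]
impostor_utts = [rng.normal(-0.5, 1.0, size=(200, 12)) for _ in range(20)]

X = np.array([supervector(u) for u in target_utts + impostor_utts])
y = np.array([1] * len(target_utts) + [0] * len(impostor_utts))

# Discriminative speaker model: target supervectors vs impostor background.
clf = SVC(kernel="linear").fit(X, y)

# Score a new utterance from the "target" speaker.
test_utt = rng.normal(0.5, 1.0, size=(200, 12))
print(clf.predict([supervector(test_utt)]))
```

The impostor list here stands in for the background dataset whose per-observation selection the final theme addresses: in that work, the subset of impostor examples is chosen by exploiting the SVM training process rather than fixed in advance.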
Abstract:
Principal Topic: Project structures are often created by entrepreneurs and large corporate organizations to develop new products. Since new product development projects (NPDP) are most often situated within a larger organization, intrapreneurship or corporate entrepreneurship plays an important role in bringing these projects to fruition. Since NPDP often involves the development of a new product using immature technology, we describe the development of one such project: the Joint Strike Fighter (JSF) F-35 aircraft, which is being developed by the U.S. Department of Defense and eight allied nations. In 2001 Lockheed Martin won a $19 billion contract to develop an affordable, stealthy and supersonic all-weather strike fighter designed to replace a wide range of aging fighter aircraft. In this research we define a complex project as one that demonstrates a number of sources of uncertainty to a degree, or level of severity, that makes it extremely difficult to predict project outcomes or to control or manage the project (Remington & Zolin, Forthcoming). Project complexity has been conceptualized by Remington and Pollock (2007) in terms of four major sources of complexity: temporal, directional, structural and technological complexity (see Figure 1). Temporal complexity exists when projects experience significant environmental change outside the direct influence or control of the project. The Global Economic Crisis of 2008–2009 is a good example of the type of environmental change that can make a project complex, as in the JSF project, where project managers attempt to respond to changes in interest rates, international currency exchange rates, commodity prices etc. Directional complexity exists in a project where stakeholders' goals are unclear or undefined, where progress is hindered by unknown political agendas, or where stakeholders disagree about or misunderstand project goals.
In the JSF project, all the services and all participating countries have to agree to the specifications of the three variants of the aircraft: Conventional Take Off and Landing (CTOL), Short Take Off/Vertical Landing (STOVL) and the Carrier Variant (CV). Because the Navy requires a plane that can take off and land on an aircraft carrier, a special variant of the aircraft design was required, adding complexity to the project. Technical complexity occurs in a project using technology that is immature or where design characteristics are unknown or untried. Developing a plane that can take off on a very short runway and land vertically created many highly interdependent technological challenges: to correctly locate, direct and balance the lift fans, modulate the airflow, and provide an equivalent amount of thrust from the downward-vectored rear exhaust to lift the aircraft, while at the same time controlling engine temperatures. These technological challenges make costing and scheduling equally challenging. Structural complexity in a project comes from the sheer number of elements, such as the number of people, teams or organizations involved, ambiguity regarding the elements, and the massive degree of interconnectedness between them. While Lockheed Martin is the prime contractor, they are assisted in major aspects of the JSF development by Northrop Grumman, BAE Systems, Pratt & Whitney and the GE/Rolls-Royce Fighter Engine Team, and innumerable subcontractors. In addition to identifying opportunities to achieve project goals, complex projects also need to identify and exploit opportunities to increase agility in response to changing stakeholder demands or to reduce project risks. Complexity Leadership Theory contends that in complex environments adaptive and enabling leadership are needed (Uhl-Bien, Marion and McKelvey, 2007).
Adaptive leadership facilitates creativity, learning and adaptability, while enabling leadership handles the conflicts that inevitably arise between adaptive leadership and traditional administrative leadership (Uhl-Bien and Marion, 2007). Hence, adaptive leadership involves the recognition of opportunities to adapt, while enabling leadership involves the exploitation of these opportunities. Our research questions revolve around the type or source of complexity and its relationship to opportunity recognition and exploitation. For example, is it only external environmental complexity that creates the need for entrepreneurial behaviours, such as opportunity recognition and opportunity exploitation? Do the internal dimensions of project complexity, such as technological and structural complexity, also create the need for opportunity recognition and opportunity exploitation? The Kropp, Zolin and Lindsay (2009) model describes a relationship between entrepreneurial orientation (EO), opportunity recognition (OR), and opportunity exploitation (OX) in complex projects, with environmental and organizational contextual variables as moderators. We extend their model by defining the effects of external complexity and internal complexity on OR and OX. ---------- Methodology/Key Propositions: When the environment is complex, EO is more likely to result in OR because project members will be actively looking for solutions to problems created by environmental change. But in projects that are technologically or structurally complex, project leaders and members may try to make the minimum changes possible, to reduce the risk of creating new problems due to delays or schedule changes. In projects with environmental or technological complexity, project leaders who encourage the innovativeness dimension of EO will increase OR.
But in projects with technical or structural complexity, innovativeness will not necessarily result in the recognition and exploitation of opportunities, due to the over-riding importance of maintaining stability in the highly intricate and interconnected project structure. We propose that in projects with environmental complexity, which creates the need for change and innovation, project leaders who are willing to accept and manage risk are more likely to identify opportunities to increase project effectiveness and efficiency. In contrast, in projects with internal complexity, a much higher willingness to accept risk will be necessary to trigger opportunity recognition. In structurally complex projects we predict it will be less likely to find a relationship between risk taking and OR. When the environment is complex and a project has autonomy, its members will be motivated to execute opportunities to improve the project's performance; in contrast, when the project has high internal complexity, they will be more cautious in execution. Likewise, when a project experiences high competitive aggressiveness and its environment is complex, project leaders will be motivated to execute opportunities to improve the project's performance; when the project has high internal complexity, they will be more cautious in execution. This paper reports the first stage of a three-year study into the behaviours of managers, leaders and team members of complex projects. We conduct a qualitative study involving a group discussion with experienced project leaders. The objective is to determine how leaders of large and potentially complex projects perceive that external and internal complexity will influence the effects of EO on OR. ---------- Results and Implications: These results will help identify and distinguish the impact of external and internal complexity on entrepreneurial behaviours in NPDP.
Project managers will be better able to quickly decide how and when to respond to changes in the environment and internal project events.
Abstract:
Natural disasters and deliberate, willful damage to telecommunication infrastructure can result in a loss of critical voice and data services. This loss of service hinders efficient emergency response and can cause delays leading to loss of life. Current mobile devices are generally tied to one network operator. When a disaster has a significant impact, that network operator cannot be relied upon to provide the service and coverage levels that would normally exist. While some operators have agreements with other operators to share resources (such as network roaming), these agreements are contractual in nature and cannot be activated quickly in an emergency. This paper introduces Fourth Generation (4G) wireless networks. 4G networks are highly mobile and heterogeneous, which makes them highly resilient in times of disaster.
Abstract:
This paper describes an autonomous docking system and web interface that allow long-term unaided use of a sophisticated robot by untrained web users. These systems have been applied to the biologically inspired RatSLAM system as a foundation for testing both its long-term stability and its practicality. While docking and web interface systems already exist, this system allows for a significantly larger margin of error in docking accuracy due to its mechanical design, thereby increasing robustness against navigational errors. Also, a standard vision sensor is used for both long-range and short-range docking, in contrast to the many systems that require both omni-directional cameras and high-resolution laser range finders for navigation. The web interface has been designed to accommodate the significant delays experienced on the Internet, and to facilitate the non-Cartesian operation of the RatSLAM system.
Abstract:
A research project was conducted at Queensland University of Technology on the relationship between the forces at the wheel-rail interface in track and the rate of degradation of track. Data for the study was obtained from an instrumented vehicle which ran repeatedly over a section of Queensland Rail's track in Central Queensland over a 6-month period. The wheel-rail forces had to be correlated with the elements of roughness in the test track profile, which were measured with a variety of equipment. At low frequencies, there was strong correlation between forces and profile, as expected, but diminishing correlation as frequencies increased.
Abstract:
Transnational mergers are mergers involving firms operating in more than one jurisdiction, or which occur in one jurisdiction but have an impact on competition in another. Being of this nature, they have the potential to raise competition law concerns in more than one jurisdiction. When they do, the transaction costs of the merger to the firms involved, and the competition law authorities, are likely to increase significantly and, even where the merger is allowed to proceed, delays are likely to occur in reaping the benefits of the merger. Ultimately, these costs are borne by consumers. This thesis will identify the nature and source of regulatory costs associated with transnational merger review and identify and evaluate possible mechanisms by which these costs might be reduced. It will conclude that there is no single panacea for transnational merger regulation, but that a multi-faceted approach, including the adoption of common filing forms, agreement on filing and review deadlines and continuing efforts toward increasing international cooperation in merger enforcement, is needed to reduce regulatory costs and more successfully improve the welfare outcomes to which merger regulation is directed.
Abstract:
Current train of thought in appetite research is favouring an interest in non-homeostatic or hedonic (reward) mechanisms in relation to overconsumption and energy balance. This tendency is supported by advances in neurobiology that precede the emergence of a new conceptual approach to reward where affect and motivation (liking and wanting) can be seen as the major force in guiding human eating behaviour. In this review, current progress in applying processes of liking and wanting to the study of human appetite is examined by discussing the following issues: How can these concepts be operationalised for use in human research to reflect the neural mechanisms by which they may be influenced? Do liking and wanting operate independently to produce functionally significant changes in behaviour? Can liking and wanting be truly experimentally separated, or will an expression of one inevitably contain elements of the other? The review contains a re-examination of selected human appetite research before exploring more recent methodological approaches to the study of liking and wanting in appetite control. In addition, some theoretical developments are described in four diverse models that may enhance current understanding of the role of these processes in guiding ingestive behaviour. Finally, the implications of a dual process modulation of food reward for weight gain and obesity are discussed. The review concludes that processes of liking and wanting are likely to have independent roles in characterising susceptibility to weight gain. Further research into the dissociation of liking and wanting through implicit and explicit levels of processing would help to disclose the relative importance of these components of reward for appetite control and weight regulation.
Abstract:
We present a novel, simple and effective approach for tele-operation of aerial robotic vehicles with haptic feedback. Such feedback provides the remote pilot with an intuitive feel of the robot’s state and perceived local environment that will ensure simple and safe operation in cluttered 3D environments common in inspection and surveillance tasks. Our approach is based on energetic considerations and uses the concepts of network theory and port-Hamiltonian systems. We provide a general framework for addressing problems such as mapping the limited stroke of a ‘master’ joystick to the infinite stroke of a ‘slave’ vehicle, while preserving passivity of the closed-loop system in the face of potential time delays in communication links and limited sensor data.
Abstract:
As part of ongoing research on the development of a longer-life insulated rail joint (IRJ), this paper reports a field experiment and a simplified 2D numerical model for the purpose of investigating the behaviour of the rail web in the vicinity of the endpost in an IRJ due to wheel passages. A simplified 2D plane stress finite element model is used to simulate the wheel-rail rolling contact impact at the IRJ. This model is validated using data from a strain-gauged IRJ that was installed in a heavy haul network; data in terms of the vertical and shear strains at specific positions of the IRJ during train passage were captured and compared with the results of the FE model. The comparison indicates a satisfactory agreement between the FE model and the field testing. Furthermore, it demonstrates that the experimental and numerical analyses reported in this paper provide a valuable basis for developing further insight into the behaviour of IRJs under wheel impacts.
Abstract:
There has been a worldwide trend to increase axle loads and train speeds. This means that railway track degradation will be accelerated, and track maintenance costs will increase significantly. There is a need to investigate the consequences of increasing traffic load. The aim of the research is to develop a model for the analysis of physical degradation of railway tracks in response to changes in traffic parameters, especially increased axle loads and train speeds. This research has developed an integrated track degradation model (ITDM) by integrating several models into a comprehensive framework. Mechanistic relationships for track degradation have been used wherever possible in each of the models contained in ITDM. This overcomes the deficiency of traditional statistical track models, which rely heavily on historical degradation data that is generally not available in many railway systems. In addition, statistical models lack the flexibility to incorporate future changes in traffic patterns or maintenance practices. The research starts by reviewing railway track related studies both in Australia and overseas to develop a comprehensive understanding of track performance under various traffic conditions. Existing railway related models are then examined for their suitability for track degradation analysis in Australian conditions. The ITDM model is subsequently developed by modifying suitable existing models, and developing new models where necessary. The ITDM model contains four interrelated submodels: for rails, sleepers, ballast and subgrade, and track modulus. The rail submodel is for rail wear analysis and is developed from a theoretical concept. The sleeper submodel is for timber sleeper damage prediction. This submodel is developed by modifying and extending an existing model developed elsewhere, and also incorporates an analysis of the likelihood of concrete sleeper cracking.
The ballast and subgrade submodel evolved from a concept developed in the USA, with substantial modifications and improvements. The track modulus submodel is developed from a conceptual method, with corrections for more global track conditions. The integration of these submodels into one comprehensive package has enabled the interaction between individual track components to be taken into account. This is done by calculating the wheel load distribution over time and updating track conditions periodically in the process of track degradation simulation. A Windows-based computer program associated with ITDM has also been developed. The program enables the user to carry out analysis of the degradation of individual track components and to investigate the interrelationships between these track components and their deterioration. The successful implementation of this research has provided essential information for prediction of increased maintenance as a consequence of railway track degradation. The model, having been presented at various conferences and seminars, has attracted wide interest. It is anticipated that the model will be put into practical use among Australian railways, enabling track maintenance planning to be optimized and potentially saving Australian railway systems millions of dollars in operating costs.
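The integration strategy described above, periodically updating track conditions so that submodels interact, can be sketched as a simulation loop. This is a heavily hedged illustration of the structure only: the four submodel functions, all constants, and the axle-load scaling below are invented placeholders, not the thesis's mechanistic relationships.

```python
# Hedged, highly simplified sketch of an ITDM-style integrated degradation
# loop: four placeholder submodels (rail, sleeper, ballast/subgrade, track
# modulus) updated periodically as traffic accumulates. All formulas and
# constants are illustrative stand-ins, not the thesis models.

def rail_wear(traffic_mgt, wear_mm):
    return wear_mm + 0.01 * traffic_mgt           # wear grows with traffic

def sleeper_damage(traffic_mgt, damage):
    return damage + 0.002 * traffic_mgt

def ballast_settlement(traffic_mgt, settlement_mm, modulus):
    # softer track (lower modulus) settles faster under the same traffic
    return settlement_mm + 0.5 * traffic_mgt / modulus

def track_modulus(settlement_mm):
    return max(10.0, 40.0 - 0.8 * settlement_mm)  # placeholder relationship

def simulate(axle_load_t, years, step_mgt=5.0, mgt_per_year=50.0):
    wear, damage, settlement = 0.0, 0.0, 0.0
    modulus = track_modulus(settlement)
    total = 0.0
    while total < years * mgt_per_year:
        # heavier axles scale the effective traffic increment (placeholder law)
        eff = step_mgt * (axle_load_t / 25.0) ** 3
        wear = rail_wear(eff, wear)
        damage = sleeper_damage(eff, damage)
        settlement = ballast_settlement(eff, settlement, modulus)
        modulus = track_modulus(settlement)  # submodels interact via modulus
        total += step_mgt
    return {"rail_wear_mm": wear, "sleeper_damage": damage,
            "settlement_mm": settlement, "modulus": modulus}

base = simulate(axle_load_t=25.0, years=5)
heavy = simulate(axle_load_t=30.0, years=5)
```

The point of the sketch is the periodic update: because the modulus is recomputed after each traffic increment, the ballast/subgrade state feeds back into subsequent degradation, which is the interaction between components that the integrated framework captures and isolated statistical models miss.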
Abstract:
Throughout the twentieth century, increased interest in the training of actors resulted in the emergence of a plethora of acting theories and innovative theatrical movements in Europe, the UK and the USA. The individuals or groups involved with the formulation of these theories and movements developed specific terminologies, or languages of acting, in an attempt to clearly articulate the nature and practice of acting according to their particular pedagogy or theatrical aesthetic. Now, at the dawn of the twenty-first century, Australia boasts quite a number of schools and university courses professing to train actors. This research aims to discover the language used in actor training on the east coast of Australia today. Using interviews with staff of the National Institute of Dramatic Art, the Victorian College of the Arts, and the Queensland University of Technology as the primary source of data, a constructivist grounded theory has emerged to assess the influence of last century's theatrical theorists and practitioners on Australian training and to ascertain the possibility of a distinctly Australian language of acting.