943 results for framework-intensive applications
Abstract:
In this paper I explore the Indigenous Australian women's performance classroom (hereafter ANTH2120) as a dialectic and discursive space where the location of possibility is opened for female Indigenous performers to enter into a dialogue from and between both non-Indigenous and Indigenous voices. The work of Bakhtin on dialogue serves as a useful standpoint for understanding the multiple speaking positions and texts in the ANTH2120 context. Bakhtin emphasizes performance, history, actuality and the openness of dialogue to provide an important framework for analysing multiple speaking positions and ways of making meaning through dialogue between shifting and differing subjectivities. I begin by briefly critiquing Bakhtin's "dialogic imagination" and consider the application and usefulness of concepts such as dialogism, heteroglossia and the utterance to understanding the ANTH2120 classroom as a polyphonic and discursive space. I then turn to an analysis of dialogue in the ANTH2120 classroom and primarily situate my gaze on an examination of the interactions that took place between the voices of myself as family/teacher/student and senior Yanyuwa women from the remote Northern Territory Aboriginal community of Borroloola as family/performers/teachers. The 2000 and 2001 Yanyuwa women's performance workshops will be used as examples of the way power is constantly shifting in this dialogue to allow particular voices to speak with authority, and for others to remain silent as roles and relationships between myself and the Yanyuwa women change. Conclusions will be drawn regarding how my subject positions and white race privilege affect who speaks, who listens and on whose terms, and further, the efficacy of this pedagogical platform for opening up the location of possibility for Indigenous Australian women to play a powerful part in the construction of knowledges about women's performance traditions.
Abstract:
The isotope composition of Pb is difficult to determine accurately due to the lack of a stable normalisation ratio. Double- and triple-spike addition techniques provide one solution and presently yield the most accurate measurements. A number of recent studies have claimed that improved accuracy and precision could also be achieved by multi-collector ICP-MS (MC-ICP-MS) Pb-isotope analysis using the addition of Tl of known isotope composition to Pb samples. In this paper, we verify whether the known isotope composition of Tl can be used for correction of mass discrimination of Pb, with an extensive dataset for the NIST standard SRM 981, comparison of MC-ICP-MS with TIMS data, and comparison with three isochrons from different geological environments. When all our NIST SRM 981 data are normalised with one constant Tl-205/Tl-203 of 2.38869, the following averages and reproducibilities were obtained: Pb-207/Pb-206 = 0.91461+/-18; Pb-208/Pb-206 = 2.1674+/-7; and Pb-206/Pb-204 = 16.941+/-6. These two-sigma standard deviations of the mean correspond to 149, 330, and 374 ppm, respectively. Accuracies relative to triple-spike values are 149, 157, and 52 ppm, respectively, and thus well within uncertainties. The largest component of the uncertainties stems from the Pb data alone and is not caused by differential mass discrimination behaviour of Pb and Tl. In routine operation, variation of sample introduction memory and production of isobaric molecular interferences in the spectrometer's collision cell currently appear to be the ultimate limitation to better reproducibility. A comparative study of five different datasets from actual samples (bullets, international rock standards, carbonates, metamorphic minerals, and sulphide minerals) demonstrates that in most cases geological scatter of the sample exceeds the achieved analytical reproducibility. We observe good agreement between TIMS and MC-ICP-MS data for international rock standards but find that such comparison does not constitute the ultimate test for the validity of the MC-ICP-MS technique. Two attempted isochrons resulted in geological scatter (in one case small) in excess of analytical reproducibility. However, in one case (leached Great Dyke sulphides) we obtained a true isochron (MSWD = 0.63) age of 2578.3 +/- 0.9 Ma, which is identical to and more precise than a recently published U-Pb zircon age (2579 +/- 3 Ma) for a Great Dyke websterite [Earth Planet. Sci. Lett. 180 (2000) 1-12]. We regard reproduction of this age by means of an isochron as a robust test of accuracy over a wide dynamic range. We show that reliable and accurate Pb-isotope data can be obtained by careful operation of second-generation MC-ICP-MS magnetic sector mass spectrometers.
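The Tl normalisation described above rests on the exponential mass-fractionation law: a fractionation exponent is derived from the measured versus accepted Tl-205/Tl-203 and applied to the adjacent Pb masses. A minimal sketch of that arithmetic, assuming the standard exponential law; the atomic masses are rounded and the measured values are hypothetical, not taken from the paper:

```python
import math

# Approximate atomic masses (u); rounded, for illustration only.
M_TL203, M_TL205 = 202.9723, 204.9744
M_PB206, M_PB207 = 205.9745, 206.9759

TL_TRUE = 2.38869  # 205Tl/203Tl used for normalisation (from the abstract)

def mass_bias_beta(tl_measured: float) -> float:
    """Fractionation exponent from the Tl doublet (exponential law)."""
    return math.log(TL_TRUE / tl_measured) / math.log(M_TL205 / M_TL203)

def correct_ratio(measured: float, m_num: float, m_den: float, beta: float) -> float:
    """Exponential-law correction: R_true = R_meas * (m_num / m_den) ** beta."""
    return measured * (m_num / m_den) ** beta

# Hypothetical raw measurement in which the instrument favours heavy isotopes.
beta = mass_bias_beta(tl_measured=2.3990)            # measured 205Tl/203Tl
pb_207_206 = correct_ratio(0.91627, M_PB207, M_PB206, beta)
print(f"corrected 207Pb/206Pb = {pb_207_206:.5f}")
```

The same exponent is reused for Pb-208/Pb-206 and Pb-206/Pb-204 with the appropriate mass pairs, which is precisely where the assumption of identical Pb and Tl mass-discrimination behaviour enters.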
Abstract:
Motion study is an engineering technology that analyzes human body motions. During the past decade (1990-1999) a series of studies investigated the role of motion study in developmental disabilities. This article reviews the literature on the applications of motion study in the field. A historical and conceptual review of motion study leading to the current status of studies is presented, followed by a review of the research literature. Two main eras of research focus were identified. The first era (1990-1995) of studies established the superior effectiveness and efficiency of tasks designed with motion study or motion study-related principles over traditional site-based task designs. The second era (1995-1999) of studies examined the interaction between motion study-based task designs and other variables such as choice, preference, functionally equivalent and competing task designs, and communicative alternatives. Our review found that applying motion study principles as an antecedent guide and practice for eliminating or reducing ineffective motions and simplifying effective motions resulted in positive task outcomes for most of the participants.
Abstract:
The fate of N-15-enriched formulated feed fed to shrimp was traced through the food web in shallow, outdoor tank systems (1000 L) stocked with shrimp. Triplicate tanks, with and without sediment, were used to identify the role of the natural biota in the water column and sediment in processing dietary nitrogen (N). A preliminary experiment demonstrated that N-15-enriched feed products could be detected in the food web. Based on this, a 15-day experiment was conducted. The ammonium (NH4+) pool in the water column became rapidly enriched (within one day) with N-15 after shrimp were fed N-15-enriched feed. By day 15, 6% of the added N-15 was in this fraction in the 'sediment' tanks compared with 0.4% in the 'no sediment' tanks. The particulate fraction in the water column, principally autotrophic nanoflagellates, accounted for 4-5% of the N-15 fed to shrimp after one day. This increased to 16% in the 'no sediment' treatment, and decreased to 2% in the 'sediment' treatment, by day 15. It appears that dietary N was more accessible to the phytoplankton community in the absence of sediment. The difference is possibly because a proportion of the dietary N was buried in the sediment in the 'sediment' treatment, making it unavailable to the phytoplankton. Alternatively, the dietary N was retained in the NH4+ pool in the water column, since phytoplankton growth, and hence N utilization, was lower in the 'sediment' treatment. The lower growth of phytoplankton in the 'sediment' treatment appeared to be related to higher turbidity, and hence lower light availability for growth. The percentage of N-15 detected in the sediment was only 6% despite the high capacity for sedimentation of the large biomass of plankton detritus and shrimp waste. This suggests rapid remineralization of organic waste by the microbial community in the sediment, resulting in diffusion of inorganic N sources into the water column. It is likely that most of the dietary N will ultimately be removed from the tank system by water discharges. Our study showed that N-15 derived from aquaculture feed can be processed by the microbial community in outdoor aquaculture systems, and it provides a method for determining the effect of dietary N on ecosystems. However, a significant amount of the dietary N was not retained by the natural biota and is likely to be present in the soluble organic fraction.
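Tracer budgets like the percentages quoted above follow from standard atom%-excess bookkeeping: the label in a pool is its N mass multiplied by its 15N enrichment above natural background, expressed as a share of the label added with the feed. A minimal sketch under that assumption; the pool sizes, enrichments and the 0.366 atom% background are illustrative values, not the study's data:

```python
NATURAL_15N = 0.366  # atom% 15N, typical natural abundance (illustrative)

def excess(atom_percent: float) -> float:
    """Atom% 15N above the natural background."""
    return max(atom_percent - NATURAL_15N, 0.0)

def percent_recovered(pool_n_mg: float, pool_atom_pct: float,
                      fed_n_mg: float, feed_atom_pct: float) -> float:
    """Share of the 15N label added via feed that ended up in this pool."""
    label_in_pool = pool_n_mg * excess(pool_atom_pct) / 100
    label_added = fed_n_mg * excess(feed_atom_pct) / 100
    return 100 * label_in_pool / label_added

# e.g. a hypothetical water-column NH4+ pool in a 'sediment' tank on day 15
print(percent_recovered(pool_n_mg=150, pool_atom_pct=2.1,
                        fed_n_mg=4500, feed_atom_pct=1.34))  # ~6%
```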
Abstract:
Evaluation of the performance of the APACHE III (Acute Physiology and Chronic Health Evaluation) ICU (intensive care unit) and hospital mortality models at the Princess Alexandra Hospital, Brisbane is reported. Demographic, diagnostic, physiological, laboratory, admission and discharge data were collected prospectively for 5681 consecutive eligible admissions (1 January 1995 to 1 January 2000) at the Princess Alexandra Hospital, a metropolitan Australian tertiary referral medical/surgical adult ICU. ROC (receiver operating characteristic) curve areas for the APACHE III ICU mortality and hospital mortality models demonstrated excellent discrimination. Observed ICU mortality (9.1%) was significantly overestimated by the APACHE III model adjusted for hospital characteristics (10.1%), but did not significantly differ from the prediction of the generic APACHE III model (8.6%). In contrast, observed hospital mortality (14.8%) agreed well with the prediction of the APACHE III model adjusted for hospital characteristics (14.6%), but was significantly underestimated by the unadjusted APACHE III model (13.2%). Calibration curves and goodness-of-fit analysis using Hosmer-Lemeshow statistics demonstrated that calibration was good for the unadjusted APACHE III ICU mortality model and for the APACHE III hospital mortality model adjusted for hospital characteristics. Post hoc analysis revealed a declining annual SMR (standardized mortality ratio) during the study period. This trend was present in each of the non-surgical, emergency and elective surgical diagnostic groups, and the change was temporally related to increased specialist staffing levels. This study demonstrates that the APACHE III model performs well on independent assessment in an Australian hospital. Changes observed in annual SMR using such a validated model support a hypothesis of improved survival outcomes over 1995-1999.
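The two quantities at the heart of this kind of evaluation, the standardized mortality ratio and the Hosmer-Lemeshow goodness-of-fit statistic, are straightforward to compute from per-patient predicted risks. A minimal sketch, with simulated outcomes standing in for the actual APACHE III dataset; the function names and the decile grouping convention are mine:

```python
import numpy as np
from scipy.stats import chi2

def smr(deaths: np.ndarray, predicted_risk: np.ndarray) -> float:
    """Standardized mortality ratio: observed deaths / model-expected deaths."""
    return deaths.sum() / predicted_risk.sum()

def hosmer_lemeshow(deaths: np.ndarray, predicted_risk: np.ndarray, groups: int = 10):
    """Hosmer-Lemeshow statistic over risk deciles; returns (H, p-value)."""
    order = np.argsort(predicted_risk)
    h = 0.0
    for idx in np.array_split(order, groups):
        o, e, n = deaths[idx].sum(), predicted_risk[idx].sum(), len(idx)
        pbar = e / n
        h += (o - e) ** 2 / (n * pbar * (1 - pbar))
    return h, chi2.sf(h, groups - 2)

# Simulated cohort: 0/1 hospital outcomes and hypothetical predicted risks.
rng = np.random.default_rng(0)
risk = rng.uniform(0.01, 0.6, size=5681)
died = rng.binomial(1, risk)
print("SMR =", round(smr(died, risk), 3))
print("H-L (H, p) =", hosmer_lemeshow(died, risk))
```

An SMR below 1 under a well-calibrated model is what motivates the abstract's inference of improving survival over the study period.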
Abstract:
The Agricultural Production Systems Simulator (APSIM) is a modular modelling framework developed by the Agricultural Production Systems Research Unit in Australia. APSIM was developed to simulate biophysical processes in farming systems, in particular where there is interest in the economic and ecological outcomes of management practice in the face of climatic risk. The paper outlines APSIM's structure and provides details of the concepts behind the different plant, soil and management modules. These modules cover a diverse range of crops, pastures and trees; soil processes including water balance, N and P transformations, soil pH and erosion; and a full range of management controls. Reports of APSIM testing in a diverse range of systems and environments are summarised, and an example of model performance in a long-term cropping systems trial is provided. APSIM has been used in a broad range of applications, including support for on-farm decision making, farming systems design for production or resource management objectives, assessment of the value of seasonal climate forecasting, analysis of supply chain issues in agribusiness activities, development of waste management guidelines, risk assessment for government policy making, and as a guide to research and education activity. An extensive citation list for these model testing and application studies is provided.
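The modular architecture described, with independent plant, soil and management modules advancing over shared state at a daily time step, can be illustrated generically. A minimal sketch of that pattern only; it is not APSIM's actual API, and the water-balance and growth rules are placeholders:

```python
# Generic module pattern: each module reads and updates shared daily state.

class Module:
    def step(self, state: dict) -> None: ...

class Crop(Module):
    def step(self, state):
        # growth limited by available soil water (placeholder response)
        state["uptake_mm"] = min(5.0, state["soil_water_mm"])
        state["biomass_kg_ha"] += 10.0 * state["uptake_mm"]

class SoilWater(Module):
    def step(self, state):
        # simple bucket water balance: rain in, crop uptake out
        state["soil_water_mm"] = max(
            state["soil_water_mm"] + state["rain_mm"] - state["uptake_mm"], 0.0)

def run(modules, state, days):
    for _ in range(days):
        for m in modules:
            m.step(state)
    return state

print(run([Crop(), SoilWater()],
          {"soil_water_mm": 120.0, "rain_mm": 2.0,
           "uptake_mm": 0.0, "biomass_kg_ha": 0.0},
          days=30))
```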
Abstract:
A range of lasers is now available for use in dentistry. This paper summarizes key current and emerging applications for lasers in clinical practice. A major diagnostic application of low-power lasers is the detection of caries, using fluorescence elicited from hydroxyapatite or from bacterial by-products. Laser fluorescence is an effective method for detecting and quantifying incipient occlusal and cervical carious lesions, and with further refinement could be used in the same manner for proximal lesions. Photoactivated dye techniques have been developed which use low-power lasers to elicit a photochemical reaction. Photoactivated dye techniques can be used to disinfect root canals, periodontal pockets, cavity preparations and sites of peri-implantitis. Using similar principles, more powerful lasers can be used for photodynamic therapy in the treatment of malignancies of the oral mucosa. Laser-driven photochemical reactions can also be used for tooth whitening. In combination with fluoride, laser irradiation can improve the resistance of tooth structure to demineralization, an application of particular benefit for susceptible sites in high caries risk patients. Laser technology for caries removal, cavity preparation and soft tissue surgery is at a high state of refinement, having had several decades of development up to the present time. Used in conjunction with or as a replacement for traditional methods, specific laser technologies are expected to become an essential component of contemporary dental practice over the next decade.
Abstract:
This paper proposes a template for modelling complex datasets that integrates traditional statistical modelling approaches with more recent advances in statistics and modelling through an exploratory framework. Our approach builds on the well-known and long-standing idea of 'good practice in statistics' by establishing a comprehensive framework for modelling that focuses on exploration, prediction, interpretation and reliability assessment, a relatively new idea that allows individual assessment of predictions. The integrated framework we present comprises two stages. The first involves the use of exploratory methods to help visually understand the data and identify a parsimonious set of explanatory variables. The second encompasses a two-step modelling process, in which non-parametric methods such as decision trees and generalized additive models are used to identify important variables and their relationship with the response before a final predictive model is considered. We focus on fitting the predictive model using parametric, non-parametric and Bayesian approaches. This paper is motivated by a medical problem where interest focuses on developing a risk stratification system for morbidity in 1710 cardiac patients given a suite of demographic, clinical and preoperative variables. Although the methods are applied specifically to this case study, they can be applied in any field, irrespective of the type of response.
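The two-stage flow, a non-parametric screen to identify important variables followed by a parametric predictive model, can be sketched with off-the-shelf tools. A minimal illustration assuming scikit-learn and statsmodels; the simulated data, column names and importance threshold are placeholders, not the cardiac dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.tree import DecisionTreeClassifier

# Simulated stand-in for the motivating data: 1710 patients, 6 covariates.
rng = np.random.default_rng(1)
X = pd.DataFrame(rng.normal(size=(1710, 6)),
                 columns=[f"x{i}" for i in range(6)])
y = (X["x0"] + 0.5 * X["x3"] + rng.normal(size=1710) > 0).astype(int)

# Stage 1: non-parametric screen; keep variables the tree actually uses.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
keep = X.columns[tree.feature_importances_ > 0.05]

# Stage 2: parametric predictive model on the reduced variable set.
logit = sm.Logit(y, sm.add_constant(X[keep])).fit(disp=0)
print(logit.summary2().tables[1].round(3))
```

A GAM could replace the tree in stage 1, and a Bayesian logistic model could replace `sm.Logit` in stage 2, without changing the overall flow.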
Abstract:
In spite of their wide application in comminution circuits, hydrocyclones have at least one significant disadvantage: their operation inherently tends to return the fine, denser liberated minerals to the grinding mill. This results in unnecessary overgrinding, which adds to the milling cost and can adversely affect the efficiency of downstream processes. In an attempt to solve this problem, a three-product cyclone has been developed at the Julius Kruttschnitt Mineral Research Centre (JKMRC) to generate a second overflow in which the fine dense liberated minerals can be selectively concentrated for further treatment. In this paper, the design and operation of the three-product cyclone are described. The influence of the length of the second vortex finder on the performance of a 150-mm unit treating a mixture of magnetite and silica is investigated. Conventional cyclone tests were also conducted under similar conditions. Using the operational performance data of the three-product and conventional cyclones, it is shown that by optimising the length of the second vortex finder, the amount of fine dense mineral particles that reports to the three-product cyclone underflow can be reduced. In addition, the three-product cyclone can be used to generate a middlings stream that may be more suitable for flash flotation than the conventional cyclone underflow, or that could alternatively be classified with a microscreen to separate the valuables from the gangue. At the same time, a fines stream having similar properties to those of the conventional overflow can be obtained. Hence, if the middlings stream were used as feed for flash flotation or microscreening, the fines stream could be used in lieu of the conventional overflow without compromising the feed requirements for the conventional flotation circuit. Some of the other potential applications of the new cyclone are also described.
Abstract:
One of the most important advantages of database systems is that the underlying mathematics is rich enough to specify very complex operations with a small number of statements in the database language. This research covers an aspect of biological informatics, the marriage of information technology and biology, involving the study of real-world phenomena using virtual plants derived from L-systems simulation. L-systems were introduced by Aristid Lindenmayer as a mathematical model of multicellular organisms. Little consideration has been given to the problem of persistent storage for these simulations, and current procedures for querying data generated by L-systems for scientific experiments, simulations and measurements are also inadequate. To address these problems, this paper presents a generic data-modelling process (L-DBM) between L-systems and database systems. The paper shows how L-system productions can be generically and automatically represented in database schemas and how a database can be populated from the L-system strings. It further describes the idea of pre-computing recursive structures in the data into derived attributes using compiler generation, and supplies a method for establishing a correspondence between biologists' terms and compiler-generated terms in a biologist's computing environment. Given any specific set of L-system productions and their declarations, L-DBM can generate a schema covering both simple correspondence terminology and complex recursive-structure data attributes and relationships.
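The core of the idea, deriving L-system strings and persisting each derivation step so it can be queried in a database language, can be shown in miniature. A minimal sketch assuming SQLite; the single-table schema is mine for illustration and is far simpler than the generated L-DBM schemas the paper describes:

```python
import sqlite3

def derive(axiom: str, rules: dict, steps: int):
    """Classic Lindenmayer rewriting: apply all productions in parallel."""
    s = axiom
    yield 0, s
    for n in range(1, steps + 1):
        s = "".join(rules.get(ch, ch) for ch in s)
        yield n, s

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE derivation (
    step INTEGER PRIMARY KEY, string TEXT NOT NULL, length INTEGER)""")

# Lindenmayer's algae model: A -> AB, B -> A
for step, s in derive("A", {"A": "AB", "B": "A"}, steps=7):
    con.execute("INSERT INTO derivation VALUES (?,?,?)", (step, s, len(s)))

# Query the stored simulation, e.g. string growth per step (Fibonacci lengths).
for row in con.execute("SELECT step, length FROM derivation"):
    print(row)
```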