288 results for Statistical tools
Abstract:
Statistical methodology was applied to a survey of time-course incidence of four viruses (alfalfa mosaic virus, clover yellow vein virus, subterranean clover mottle virus and subterranean clover red leaf virus) in improved pastures in southern regions of Australia.
Abstract:
In this age of rapidly evolving technology, teachers are encouraged to adopt ICTs by government, syllabus, school management, and parents. Indeed, it is an expectation that teachers will incorporate technologies into their classroom teaching practices to enhance the learning experiences and outcomes of their students. In particular, regarding the science classroom, a subject that traditionally incorporates hands-on experiments and practicals, the integration of modern technologies should be a major feature. Although myriad studies report on technologies that enhance students’ learning outcomes in science, there is a dearth of literature on how teachers go about selecting technologies for use in the science classroom. Teachers can feel ill-prepared to assess the range of available choices and might feel pressured and somewhat overwhelmed by the avalanche of new developments thrust before them in marketing literature and teaching journals. The consequences of making bad decisions are costly in terms of money, time and teacher confidence. Additionally, no research to date has identified what technologies science teachers use on a regular basis, and whether some purchased technologies have proven to be too problematic, preventing their sustained use and possible wider adoption. The primary aim of this study was to provide research-based guidance to teachers to aid their decision-making in choosing technologies for the science classroom. The study unfolded in several phases. The first phase of the project involved survey and interview data from teachers in relation to the technologies they currently use in their science classrooms and the frequency of their use. These data were coded and analysed using the Grounded Theory approach of Corbin and Strauss, resulting in the development of the PETTaL model, which captured the salient factors in the data. This model incorporated usability theory from the Human Computer Interaction literature, and education theory and models such as Mishra and Koehler’s (2006) TPACK model, where the grounded data indicated these issues. The PETTaL model identifies Power (school management, syllabus, etc.), Environment (classroom / learning setting), Teacher (personal characteristics, experience, epistemology), Technology (usability, versatility, etc.) and Learners (academic ability, diversity, behaviour, etc.) as fields that can impact the use of technology in science classrooms. The PETTaL model was used to create a Predictive Evaluation Tool (PET): a tool designed to assist teachers in choosing technologies, particularly for science teaching and learning. The evolution of the PET was cyclical (employing agile development methodology), involving repeated testing with in-service and pre-service teachers at each iteration, and incorporating their comments in subsequent versions. Once no new suggestions were forthcoming, the PET was tested with eight in-service teachers, and the results showed that the PET outcomes obtained by (experienced) teachers concurred with their instinctive evaluations. They felt the PET would be a valuable tool when considering new technology, and that it would be particularly useful as a means of communicating perceived value between colleagues and between budget holders and requestors during the acquisition process. It is hoped that the PET could make the tacit knowledge acquired by experienced teachers about technology use in classrooms explicit to novice teachers.
Additionally, the PET could be used as a research tool to discover a teacher’s professional development needs. Therefore, the outcomes of this study can aid a teacher in the process of selecting educationally productive and sustainable new technology for their science classroom. This study has produced an instrument for assisting teachers in the decision-making process associated with the use of new technologies for the science classroom. The instrument is generic in that it can be applied to all subject areas. Further, this study has produced a powerful model that extends the TPACK model, which is currently extensively employed to assess teachers’ use of technology in the classroom. The PETTaL model, grounded in data from this study, responds to the calls in the literature for TPACK’s further development. As a theoretical model, PETTaL has the potential to serve as a framework for the development of a teacher’s reflective practice (either self-evaluation or critical evaluation of observed teaching practices). Additionally, PETTaL has the potential to aid the formulation of a teacher’s personal professional development plan. It will be the basis for further studies in this field.
Abstract:
Background: Appropriate disposition of emergency department (ED) patients with chest pain is dependent on clinical evaluation of risk. A number of chest pain risk stratification tools have been proposed. The aim of this study was to compare the predictive performance for major adverse cardiac events (MACE) using risk assessment tools from the National Heart Foundation of Australia (HFA), the Goldman risk score and the Thrombolysis in Myocardial Infarction risk score (TIMI RS). Methods: This prospective observational study evaluated ED patients aged ≥30 years with non-traumatic chest pain for which no definitive non-ischemic cause was found. Data collected included demographic and clinical information, investigation findings and occurrence of MACE by 30 days. The outcome of interest was the comparative predictive performance of the risk tools for MACE at 30 days, as analyzed by receiver operating characteristic (ROC) curves. Results: Two hundred eighty-one patients were studied; the rate of MACE was 14.1%. The area under the curve (AUC) of the HFA, TIMI RS and Goldman tools for the endpoint of MACE was 0.54, 0.71 and 0.67, respectively, with the difference between the tools in predictive ability for MACE being highly significant [χ²(3) = 67.21, N = 276, p < 0.0001]. Conclusion: The TIMI RS and Goldman tools performed better than the HFA in this undifferentiated ED chest pain population, but selection of cutoffs balancing sensitivity and specificity was problematic. There is an urgent need for validated risk stratification tools specific to the ED chest pain population.
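For readers unfamiliar with this kind of comparison, the sketch below shows, under purely illustrative assumptions, how the discrimination of several risk scores for a binary outcome such as 30-day MACE can be compared via the area under the ROC curve. The cohort, event rate and score behaviours are simulated placeholders, not the study data.

```python
# Illustrative sketch only: comparing risk-score discrimination for a binary
# outcome (e.g. 30-day MACE) via area under the ROC curve. The data below are
# simulated, not the study cohort, and the score behaviours are placeholders.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 281                                    # cohort size, as in the abstract
mace = rng.random(n) < 0.141               # ~14.1% event rate

# Hypothetical integer risk scores: stronger "signal" means higher scores
# are more common among patients who go on to have an event.
def simulated_score(outcome, signal):
    base = rng.integers(0, 5, size=n)
    return base + (signal * outcome).astype(int)

scores = {
    "HFA":     simulated_score(mace, rng.random(n) * 1.0),
    "TIMI_RS": simulated_score(mace, rng.random(n) * 4.0),
    "Goldman": simulated_score(mace, rng.random(n) * 3.0),
}

for name, score in scores.items():
    print(f"{name}: AUC = {roc_auc_score(mace, score):.2f}")
```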
Abstract:
The Department of Culture and the Arts undertook the first mapping of Perth’s creative industries in 2007 in partnership with the City of Perth and the Departments of Industry and Resources and the Premier and Cabinet. The 2013 Creative Industries Statistical Analysis for Western Australia report has updated the mapping with 2011 Census employment data to provide invaluable information for the State’s creative industries, their peak associations and potential investors. The report maps sector employment numbers and growth between the 2006 and 2011 Censuses in the areas of music, visual and performing arts, film, TV and radio, advertising and marketing, software and digital content, publishing, and architecture and design, which includes designer fashion.
Abstract:
This chapter was developed as part of the ‘People, communities and economies of the Lake Eyre Basin’ project. It has been written for communities, government agencies and interface organisations involved in natural resource management (NRM) in the Lake Eyre Basin (LEB). Its purpose is to identify the key factors for successful community engagement processes relevant to the LEB and to present tools and principles for successful engagement. The term ‘interface organisation’ is used here to refer to the diverse range of local and regional organisations (such as Catchment Committees or NRM Regional Bodies) that serve as linkages, or translators, between local communities and the broader Australian and State Governments. The importance of fostering and harnessing effective processes of community engagement has been identified as crucial to building a prosperous future for rural and remote regions in Australia. The chapter presents an overview of the literature on successful community engagement processes for NRM, as well as an overview of the current NRM arrangements in the LEB. The main part of the chapter presents findings from a series of interviews conducted with government liaison officers, representing both state and federal organisations, who are responsible for coordinating and facilitating regional NRM in the LEB, and with members of LEB communities.
Abstract:
Electricity network investment and asset management require accurate estimation of future demand in energy consumption within specified service areas. For this purpose, simple models are typically developed to predict future trends in electricity consumption using various methods and assumptions. This paper presents a statistical model to predict electricity consumption in the residential sector at the Census Collection District (CCD) level over the state of New South Wales, Australia, based on spatial building and household characteristics. Residential household demographic and building data from the Australian Bureau of Statistics (ABS) and actual electricity consumption data from electricity companies are merged for 74% of the 12,000 CCDs in the state. Eighty percent of the merged dataset is randomly set aside to establish the model using regression analysis, and the remaining 20% is used to independently test the accuracy of model prediction against actual consumption. In 90% of the cases, the predicted consumption is shown to be within 5 kWh per dwelling per day of actual values, with an overall state accuracy of -1.15%. Given a future scenario with a shift in climate zone and a growth in population, the model is used to identify the geographical or service areas that are most likely to have increased electricity consumption. Such geographical representation can be of great benefit when assessing alternatives to the centralised generation of energy; having such a model gives a quantifiable method for selecting the 'most' appropriate system when a review or upgrade of the network infrastructure is required.
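As a rough illustration of the workflow described above (not the authors' actual model or predictor set), the sketch below fits a linear regression on hypothetical area-level features with an 80/20 train/test split, then reports the share of test areas predicted within 5 kWh per dwelling per day and an aggregate bias figure analogous to the reported overall state accuracy. All feature names and data are invented.

```python
# Minimal sketch (not the authors' model): regress average daily household
# electricity use on spatial/demographic features at an area level, using an
# 80/20 train/test split. Features and data are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_areas = 9000                              # roughly 74% of 12,000 CCDs

# Hypothetical predictors per area: persons per dwelling, bedrooms, income index.
X = np.column_stack([
    rng.normal(2.6, 0.6, n_areas),
    rng.normal(3.0, 0.8, n_areas),
    rng.normal(1.0, 0.3, n_areas),
])
# Synthetic "actual" consumption in kWh per dwelling per day.
y = 4.0 + 3.0 * X[:, 0] + 1.5 * X[:, 1] + 2.0 * X[:, 2] + rng.normal(0, 2, n_areas)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LinearRegression().fit(X_train, y_train)
pred = model.predict(X_test)

within_5 = np.mean(np.abs(pred - y_test) < 5.0)            # share within 5 kWh/dwelling/day
overall_bias = (pred.sum() - y_test.sum()) / y_test.sum()  # aggregate over/under-prediction
print(f"within 5 kWh/day: {within_5:.1%}, overall bias: {overall_bias:+.2%}")
```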
Abstract:
The safe working lifetime of a structure in a corrosive or other harsh environment is frequently not limited by the material itself but rather by the integrity of the coating material. Advanced surface coatings are usually crosslinked organic polymers such as epoxies and polyurethanes, which must not shrink, crack or degrade when exposed to environmental extremes. While standard test methods for environmental durability of coatings have been devised, the tests are structured more towards determining the end of life than towards anticipating degradation. We have been developing prognostic tools to anticipate coating failure by using a fundamental understanding of their degradation behaviour, which, depending on the polymer structure, is mediated through hydrolytic or oxidative processes. Fourier transform infrared spectroscopy (FTIR) is a widely used laboratory technique for the analysis of polymer degradation, and with the development of portable FTIR spectrometers, new opportunities have arisen to measure polymer degradation non-destructively in the field. For IR reflectance sampling, both diffuse (scattered) and specular (direct) reflections can occur. The complexity in these spectra has provided interesting opportunities to study surface chemical and physical changes during paint curing, service abrasion and weathering, but has often required the use of advanced statistical analysis methods such as chemometrics to discern these changes. Results from our studies using these and related techniques, and the technical challenges that have arisen, will be presented.
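As a minimal illustration of the kind of chemometric analysis mentioned above (not the authors' workflow), the sketch below applies principal component analysis to a set of simulated reflectance spectra in which a hypothetical degradation band grows with exposure time; every spectrum and parameter here is synthetic.

```python
# Illustrative chemometrics sketch: PCA on a matrix of simulated IR spectra,
# checking whether a degradation-related band (e.g. a growing carbonyl peak)
# dominates the first principal component. All spectra below are synthetic.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
wavenumbers = np.linspace(4000, 650, 800)              # cm^-1 axis
carbonyl = np.exp(-((wavenumbers - 1720) / 15) ** 2)   # hypothetical degradation band

# 20 spectra at increasing exposure times: baseline + growing band + noise.
exposure = np.linspace(0, 1, 20)
spectra = (0.2
           + exposure[:, None] * carbonyl[None, :]
           + rng.normal(0, 0.01, (20, wavenumbers.size)))

pca = PCA(n_components=2)
scores = pca.fit_transform(spectra - spectra.mean(axis=0))
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
print("PC1 score vs exposure correlation:",
      np.round(np.corrcoef(scores[:, 0], exposure)[0, 1], 3))
```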
Abstract:
Many emerging economies are using the patent system as an incentive to stimulate biotechnological innovation, on the premise that this will improve their economic and social growth. The patent system mandates full disclosure of the patented invention in exchange for a temporary exclusive patent right. Recently, however, patent offices have fallen short of complying with such a mandate, especially for genetic inventions. Most patent offices provide only static information about disclosed patent sequences, and some do not even keep track of the sequence listing data in their own database. The successful partnership of QUT Library and Cambia exemplifies advocacy in Open Access, Open Innovation and User Participation. The library extends its services to various departments within the university, and builds and encourages research networks to complement the skills needed to make a contribution in the real world.
Abstract:
Sugar cane is a major source of food and fuel worldwide. Biotechnology has the potential to improve economically important traits in sugar cane as well as diversify sugar cane beyond traditional applications such as sucrose production. High levels of transgene expression are key to the success of improving crops through biotechnology. Here we describe new molecular tools that both expand and improve gene expression capabilities in sugar cane. We have identified promoters that can be used to drive high levels of gene expression in the leaf and stem of transgenic sugar cane. One of these promoters, derived from the Cestrum yellow leaf curling virus, drives levels of constitutive transgene expression that are significantly higher than those achieved by the historical benchmark maize polyubiquitin-1 (Zm-Ubi1) promoter. A second promoter, the maize phosphoenolpyruvate carboxylase promoter, was found to be a strong, leaf-preferred promoter that enables levels of expression comparable to Zm-Ubi1 in this organ. Transgene expression was increased approximately 50-fold by gene modification, which included optimising the codon usage of the coding sequence to better suit sugar cane. We also describe a novel dual transcriptional enhancer that increased gene expression from different promoters, boosting expression from Zm-Ubi1 over eightfold. These molecular tools will be extremely valuable for the improvement of sugar cane through biotechnology.
Abstract:
In clinical electrocardiogram (ECG) signal analysis, it is important to detect not only the centres of the P wave, the QRS complex and the T wave, but also time intervals such as the ST segment. Much research has focused entirely on QRS complex detection, via methods such as wavelet transforms, spline fitting and neural networks. However, drawbacks include the false classification of a severe noise spike as a QRS complex, possibly requiring manual editing, or the omission of information contained in other regions of the ECG signal. While some attempts have been made to develop algorithms to detect additional signal characteristics, such as P and T waves, the reported success rates vary from person to person and from beat to beat. To address this variability, we propose the use of Markov chain Monte Carlo statistical modelling to extract the key features of an ECG signal, and we report on a feasibility study to investigate the utility of the approach. The modelling approach is examined with reference to a realistic computer-generated ECG signal, where details such as wave morphology and noise levels are variable.
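To illustrate the general idea (this is not the paper's model), the sketch below treats a single ECG wave as a Gaussian bump and uses random-walk Metropolis sampling, a simple Markov chain Monte Carlo scheme, to recover the wave's amplitude, centre and width from a noisy, computer-generated beat. The waveform shape, priors and tuning parameters are all illustrative assumptions.

```python
# Minimal MCMC sketch (not the paper's model): fit amplitude, centre and width
# of one Gaussian "wave" in a noisy synthetic beat via random-walk Metropolis.
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 500)                        # one beat, arbitrary units

def wave(params, t):
    amp, centre, width = params
    return amp * np.exp(-0.5 * ((t - centre) / width) ** 2)

true_params = np.array([0.3, 0.65, 0.05])             # hypothetical "T wave"
noise_sd = 0.02
signal = wave(true_params, t) + rng.normal(0, noise_sd, t.size)

def log_posterior(params):
    amp, centre, width = params
    if not (0 < amp < 2 and 0 < centre < 1 and 0.005 < width < 0.3):
        return -np.inf                                # flat prior on a plausible box
    resid = signal - wave(params, t)
    return -0.5 * np.sum((resid / noise_sd) ** 2)     # Gaussian likelihood

# Random-walk Metropolis sampling
current = np.array([0.5, 0.5, 0.1])
current_lp = log_posterior(current)
step = np.array([0.02, 0.01, 0.005])
samples = []
for i in range(20000):
    proposal = current + rng.normal(0, step)
    lp = log_posterior(proposal)
    if np.log(rng.random()) < lp - current_lp:        # accept uphill / sometimes downhill
        current, current_lp = proposal, lp
    if i >= 5000:                                     # discard burn-in
        samples.append(current)

samples = np.array(samples)
print("posterior means (amp, centre, width):", np.round(samples.mean(axis=0), 3))
```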
Abstract:
This chapter addresses data modelling as a means of promoting statistical literacy in the early grades. Consideration is first given to the importance of increasing young children’s exposure to statistical reasoning experiences and how data modelling can be a rich means of doing so. Selected components of data modelling are then reviewed, followed by a report on some findings from the third year of a three-year longitudinal study across grades one through three.
Abstract:
At NDSS 2012, Yan et al. analyzed the security of several challenge-response type user authentication protocols against passive observers, and proposed a generic counting-based statistical attack to recover the secret of some counting-based protocols given a number of observed authentication sessions. Roughly speaking, the attack is based on the fact that secret (pass) objects appear in challenges with a different probability from non-secret (decoy) objects when the responses are taken into account. Although they mentioned that a protocol susceptible to this attack should minimize this difference, they did not give details as to how this can be achieved, beyond a few suggestions. In this paper, we attempt to fill this gap by generalizing the attack with a much more comprehensive theoretical analysis. Our treatment is more quantitative, which enables us to describe a method to theoretically estimate a lower bound on the number of sessions for which a protocol can be safely used against the attack. Our results include 1) two proposed fixes to make counting protocols practically safe against the attack at the cost of usability, 2) the observation that the attack can be used on non-counting-based protocols too, as long as challenge generation is contrived, and 3) two main design principles for user authentication protocols which can be considered as extensions of the principles from Yan et al. This detailed theoretical treatment can be used as a guideline during the design of counting-based protocols to determine their susceptibility to this attack. The Foxtail protocol, one of the protocols analyzed by Yan et al., is used as a representative to illustrate our theoretical and experimental results.
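The toy simulation below illustrates the core observation in a deliberately simplified counting scheme (a stand-in, not the Foxtail protocol): if a passive observer sees the raw count of pass objects in each challenge as the response, then the average response conditioned on an object's presence cleanly separates pass objects from decoys. Real counting protocols obscure the count (e.g. by modular reduction), which weakens but does not remove this signal; all parameters here are arbitrary.

```python
# Toy counting-attack illustration: rank objects by how the observed response
# shifts when each object appears in the challenge. Simplified stand-in only.
import numpy as np

rng = np.random.default_rng(7)
N_OBJECTS, SECRET_SIZE, CHALLENGE_SIZE, SESSIONS = 30, 5, 10, 2000
secret = set(rng.choice(N_OBJECTS, SECRET_SIZE, replace=False).tolist())

present_sum = np.zeros(N_OBJECTS); present_cnt = np.zeros(N_OBJECTS)
absent_sum = np.zeros(N_OBJECTS);  absent_cnt = np.zeros(N_OBJECTS)

for _ in range(SESSIONS):
    challenge = rng.choice(N_OBJECTS, CHALLENGE_SIZE, replace=False)
    response = sum(obj in secret for obj in challenge)      # what the observer sees
    mask = np.zeros(N_OBJECTS, dtype=bool); mask[challenge] = True
    present_sum[mask] += response; present_cnt[mask] += 1
    absent_sum[~mask] += response; absent_cnt[~mask] += 1

# Objects whose presence raises the average response are likely pass objects.
gap = present_sum / present_cnt - absent_sum / absent_cnt
guessed = set(np.argsort(gap)[-SECRET_SIZE:].tolist())
print("true secret:", sorted(secret))
print("guessed    :", sorted(guessed))
```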
Abstract:
Management of the industrial nations' hazardous waste is a current and exponentially increasing global threat. Improved environmental information must be obtained and managed concerning the current status, temporal dynamics and potential future status of these critical sites. To test the application of spatial environmental techniques to the problem of hazardous waste sites, a Superfund (CERCLA) test site was chosen in an industrial/urban valley experiencing severe TCE, PCE and CTC groundwater contamination. A paradigm is presented for investigating spatial/environmental tools available for the mapping, monitoring and modelling of the environment and its toxic contaminant plumes. This model incorporates a range of technical issues concerning the collection of data as augmented by remotely sensed tools, the format and storage of data utilizing geographic information systems, and the analysis and modelling of the environment through the use of advanced GIS analysis algorithms and geophysical models of hydrologic transport, including statistical surface generation. This spatially based approach is evaluated against current government/industry standards of operation. Advantages of the spatial approach and lessons learned are discussed.
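As a small, self-contained example of the "statistical surface generation" step mentioned above (not the study's actual method), the sketch below interpolates made-up groundwater contaminant measurements onto a regular grid using inverse-distance weighting; the well locations, concentrations and plume shape are all invented.

```python
# Illustrative surface-generation sketch: inverse-distance-weighted (IDW)
# interpolation of scattered contaminant measurements onto a regular grid.
import numpy as np

rng = np.random.default_rng(11)
wells_xy = rng.uniform(0, 1000, size=(25, 2))                  # monitoring wells (m)
plume_centre = np.array([400.0, 600.0])
conc = 500.0 * np.exp(-np.sum((wells_xy - plume_centre) ** 2, axis=1) / 2e5)  # ug/L

def idw(points, values, grid_xy, power=2.0, eps=1e-9):
    """Inverse-distance-weighted estimate at each grid location."""
    d = np.linalg.norm(grid_xy[:, None, :] - points[None, :, :], axis=2) + eps
    w = 1.0 / d ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

# 50 x 50 grid over the site
gx, gy = np.meshgrid(np.linspace(0, 1000, 50), np.linspace(0, 1000, 50))
grid_xy = np.column_stack([gx.ravel(), gy.ravel()])
surface = idw(wells_xy, conc, grid_xy).reshape(gx.shape)
print("estimated peak concentration on grid:", round(surface.max(), 1), "ug/L")
```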
Abstract:
This study set out to investigate the kinds of learning difficulties encountered by Malaysian students and how they actually coped with online learning. The modified Online Learning Environment Survey (OLES) instrument was used to collect data from a sample of 40 Malaysian students at a university in Brisbane, Australia. A control group of 35 Australian students was also included for comparison purposes. Contrary to assumptions from previous research, the findings revealed that there were only a few differences between the international Asian and Australian students with regard to their perceptions of online learning. Recommendations based on the findings of this research study are applicable to Australian universities that have Asian international students enrolled to study online.
Abstract:
Providing culturally appropriate health communication tools at a community level, whilst meeting funding objectives set by Government-led initiatives, can be challenging. The literature states that a translational research framework fostering community communication can encourage the development of appropriate communication tools to facilitate the transfer of health information between community and researchers. Reflections from initial Need for Feed cooking and nutrition education program trials in remote Indigenous communities across Cape York indicated that program resources were meeting neither community nor researcher needs. In response, a translational research framework was modelled, with collaborative partnerships formed between researchers and community with the aim of modifying current resources. Local working groups were established to facilitate communication and guide the continual remodelling and retrial of resources for successive programs. Feedback from working groups indicated that community members wanted resources with more pictures and fewer words. Partnership with Chronic Disease Resources Online (CDRO) led to the development of pictorial resources including 3 evaluation tools, 27 recipe sets and 10 education support materials. Between June and December 2012, resources were trialled across 4 Cape York communities with 69 school-aged children and 4 community elders. Qualitative data indicated high satisfaction with the modified pictorial resources, showing pictorial resources to be an effective and culturally appropriate method both to communicate health messages to community and to facilitate the flow of evaluation data to researchers. A translational research framework fostering communication between community and researchers can potentially enhance the quality of health communication tools.