971 results for statistical science
Abstract:
The critical problem of student disengagement and underachievement in the middle years of schooling (Years 4-9) has focussed attention on the quality of educational programs in schools, in Australia and elsewhere. The loss of enthusiasm for science in the middle years is particularly problematic given the growing demand for science professionals. Reshaping middle years programs has included an emphasis on integrating Information and Communication Technologies (ICTs) and improving assessment practices to engage students in higher cognitive processes and enhance academic rigour. Understanding the nature of academic rigour and how to embed it in students' science assessment tasks that incorporate the use of ICTs could enable teachers to optimise the quality of the learning environment. However, academic rigour is not clearly described or defined in the literature, and there is little empirical evidence upon which researchers and teachers could draw to enhance their understandings. This study used a collective case study design to explore teachers' understandings of academic rigour within science assessment tasks. The research design is based on a conceptual framework underpinned by socio-cultural theory. Three methods were used to collect data from six middle years teachers and their students: a survey, focus group discussions with teachers and a group of students, and individual semi-structured interviews with teachers. Findings of the case study revealed six criteria of academic rigour, namely higher order thinking, alignment, building on prior knowledge, scaffolding, knowledge construction and creativity. Results showed that the middle years teachers held rich understandings of academic rigour that led to effective utilisation of ICTs in science assessment tasks. Findings also indicated that teachers could further enhance their understandings of academic rigour in some aspects of each of the criteria.
In particular, this study found that academic rigour could have been further optimised by: promoting more thoughtful discourse and interaction to foster higher order thinking; increasing alignment between curriculum, pedagogy, assessment, and students' prior knowledge; placing greater emphasis on identifying, activating and building on prior knowledge; better differentiating the level of scaffolding provided and applying it more judiciously; fostering creativity throughout tasks; enhancing teachers' content knowledge and pedagogical content knowledge; and providing more in-depth coverage of fewer topics to support knowledge construction. Key contributions of this study are a definition and a model which clarify the nature of academic rigour.
Abstract:
A graduate destination survey can provide a snapshot in time of a graduate's career progression and outcomes. This paper presents the results of a Queensland University of Technology study exploring the employment outcomes of students who completed a library and information science course in the Faculty of Information Technology between 2000 and 2008. Seventy-four graduates completed an online questionnaire administered in July 2009. The study found that 90% of the graduates surveyed were working and living in Queensland, with over three quarters living and working in Brisbane. Nearly 70% were working full-time, while only 1.4% indicated that they were unemployed and looking for work. Over 80% of the graduates identified themselves as working in “librarianship”. This study is the first step in understanding the progression and destination of QUT's library and information science graduates. It is recommended that this survey become an ongoing initiative so that the results can be analysed and compared over time.
Abstract:
This paper presents the results from a study of information behaviors in the context of people's everyday lives, undertaken in order to develop an integrated model of information behavior (IB). Thirty-four participants from six countries maintained a daily information journal or diary – mainly through a secure web log – for two weeks, to an aggregate of 468 participant days over five months. The text-rich diary data was analyzed using a multi-method qualitative-quantitative analysis in the following order: Grounded Theory analysis with manual coding, automated concept analysis using thesaurus-based visualization, and finally a statistical analysis of the coding data. The findings indicate that people engage in several information behaviors simultaneously throughout their everyday lives (including home and work life) and that sense-making is entangled in all aspects of them. Participants engaged in many of the information behaviors in a parallel, distributed, and concurrent fashion: many information behaviors for one information problem, one information behavior across many information problems, and many information behaviors concurrently across many information problems. Findings also indicate that information avoidance – both active and passive – is a common phenomenon, and that information organizing behaviors, or the lack thereof, caused the most problems for participants. An integrated model of information behaviors is presented based on the findings.
Abstract:
Today’s evolving networks experience a large number of different attacks, ranging from system break-ins and infection by automated attack tools such as worms, viruses, and Trojan horses, to denial of service (DoS) attacks. One important aspect of such attacks is that they are often indiscriminate, targeting Internet addresses without regard to whether they are bona fide allocated or not. Due to the absence of any advertised host services, the traffic observed on unused IP addresses is by definition unsolicited and likely to be either opportunistic or malicious. The analysis of large repositories of such traffic can be used to extract useful information about both ongoing and new attack patterns and to unearth unusual attack behaviors. However, such analysis is difficult due to the size and nature of the traffic collected on unused address spaces. In this dissertation, we present a network traffic analysis technique which uses traffic collected from unused address spaces and relies on the statistical properties of the collected traffic to accurately and quickly detect new and ongoing network anomalies. Detection of network anomalies is based on the concept that an anomalous activity usually transforms the network parameters in such a way that their statistical properties no longer remain constant, resulting in abrupt changes. In this dissertation, we use sequential analysis techniques to identify changes in the behavior of network traffic targeting unused address spaces, unveiling both ongoing and new attack patterns. Specifically, we have developed dynamic sliding-window-based non-parametric cumulative sum (CUSUM) change detection techniques for identifying changes in network traffic. Furthermore, we have introduced dynamic thresholds to detect changes in network traffic behavior and also to detect when a particular change has ended.
Experimental results are presented that demonstrate the operational effectiveness and efficiency of the proposed approach, using both synthetically generated datasets and real network traces collected from a dedicated block of unused IP addresses.
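The detection principle described above – accumulating deviations of a traffic statistic until an abrupt change pushes the cumulative sum past a threshold – can be sketched with a basic one-sided non-parametric CUSUM. This is an illustrative simplification only, not the dissertation's algorithm: the drift and threshold values are arbitrary, the reference window is assumed change-free, and the dynamic sliding window and adaptive thresholds are omitted.

```python
import numpy as np

def cusum_detect(x, train=20, drift=0.5, threshold=5.0):
    """One-sided non-parametric CUSUM over a series of traffic statistics.

    Standardizes observations against an initial reference window, then
    accumulates deviations above a drift allowance; an alarm is raised
    when the cumulative sum crosses the threshold.
    """
    x = np.asarray(x, dtype=float)
    ref = x[:train]                       # assumed change-free reference window
    sigma = ref.std() if ref.std() > 0 else 1.0
    z = (x - ref.mean()) / sigma

    s, alarms = 0.0, []
    for i, zi in enumerate(z):
        s = max(0.0, s + zi - drift)      # accumulate positive deviations
        if s > threshold:
            alarms.append(i)              # abrupt change detected here
            s = 0.0                       # restart to find subsequent changes
    return alarms

# Synthetic series: stable baseline, then an abrupt level shift
# (e.g. the onset of a scan burst against an unused address block)
traffic = np.concatenate([np.zeros(50), np.full(50, 3.0)])
alarms = cusum_detect(traffic)
```

Because the statistic resets after each alarm, repeated alarms while the shift persists could also be used as a crude indicator of when the change ends, in the spirit of the end-of-change detection mentioned above.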
Abstract:
Standardization is critical to scientists and regulators to ensure the quality and interoperability of research processes, as well as the safety and efficacy of the attendant research products. This is perhaps most evident in the case of “omics science,” which is enabled by a host of diverse high-throughput technologies such as genomics, proteomics, and metabolomics. But standards are of interest to (and shaped by) others far beyond the immediate realm of individual scientists, laboratories, scientific consortia, or governments that develop, apply, and regulate them. Indeed, scientific standards have consequences for the social, ethical, and legal environment in which innovative technologies are regulated, and thereby command the attention of policy makers and citizens. This article argues that standardization of omics science is both technical and social. A critical synthesis of the social science literature indicates that: (1) standardization requires a degree of flexibility to be practical at the level of scientific practice in disparate sites; (2) the manner in which standards are created, and by whom, will impact their perceived legitimacy and therefore their potential to be used; and (3) the process of standardization itself is important to establishing the legitimacy of an area of scientific research.
Abstract:
We have developed a new experimental method for interrogating statistical theories of music perception by implementing these theories as generative music algorithms. We call this method Generation in Context. This method differs from most experimental techniques in music perception in that it incorporates aesthetic judgments. Generation in Context is designed to measure percepts for which the musical context is suspected to play an important role. In particular, the method is suitable for the study of perceptual parameters which are temporally dynamic. We outline a use of this approach to investigate David Temperley’s (2007) probabilistic melody model, and provide some provisional insights as to what it reveals about the model. We suggest that Temperley’s model could be improved by dynamically modulating the probability distributions according to the changing musical context.
Abstract:
In this thesis, the relationship between air pollution and human health has been investigated utilising a Geographic Information System (GIS) as an analysis tool. The research focused on how vehicular air pollution affects human health. The main objective of this study was to analyse the spatial variability of pollutants, taking Brisbane City in Australia as a case study, by identifying the areas of high concentration of air pollutants and their relationship with the number of deaths caused by air pollutants. A correlation test was performed to establish the relationship between air pollution, the number of deaths from respiratory disease, and the total distance travelled by road vehicles in Brisbane. GIS was utilised to investigate the spatial distribution of the air pollutants. The main finding of this research is the comparison between spatial and non-spatial analysis approaches, which indicated that correlation analysis and simple GIS buffer analysis, using the average levels of air pollutants from a single monitoring station or from a group of a few monitoring stations, is a relatively simple method for assessing the health effects of air pollution. There was a significant positive correlation between the variables under consideration, and the research shows a decreasing trend in the concentration of nitrogen dioxide at the Eagle Farm and Springwood sites and an increasing trend at the CBD site. Statistical analysis shows that there is a positive relationship between the level of emissions and the number of deaths, though the impact is not uniform, as certain sections of the population are more vulnerable to exposure. Further statistical tests found that elderly people over 75 years of age and children between 0 and 15 years of age are the people most vulnerable to air pollution exposure. A non-spatial approach alone may be insufficient for an appropriate evaluation of the impact of air pollutant variables and their inter-relationships.
It is important to evaluate the spatial features of air pollutants before modeling the air pollution-health relationships.
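The non-spatial side of the analysis above rests on a standard correlation test. A minimal sketch of a Pearson correlation between a pollutant series and a mortality series is shown below; the numbers are entirely hypothetical and are not the thesis data, and the station name and variable names are illustrative assumptions.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical illustrative values only: annual mean NO2 (ppb) at one
# monitoring station, and respiratory deaths in the surrounding area
no2 = [12.1, 13.4, 11.8, 14.9, 15.2, 16.0]
respiratory_deaths = [210, 225, 205, 240, 238, 251]

r = pearson_r(no2, respiratory_deaths)  # close to +1 for these toy values
```

As the abstract cautions, a high correlation computed this way from one station ignores the spatial variability of pollutant concentrations, which is exactly the gap the GIS-based analysis addresses.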
Abstract:
Longitudinal data, where data are repeatedly observed or measured over time or age, provide the foundation for the analysis of processes which evolve over time; such analyses can be referred to as growth or trajectory models. One of the traditional ways of looking at growth models is to employ either linear or polynomial functional forms to model trajectory shape, and to account for variation around an overall mean trend with the inclusion of random effects or individual variation on the functional shape parameters. The identification of distinct subgroups or sub-classes (latent classes) within these trajectory models which are not based on some pre-existing individual classification provides an important methodology with substantive implications. The identification of subgroups or classes has wide application in the medical arena, where responder/non-responder identification based on distinctly differing trajectories delivers further information for clinical processes. This thesis develops Bayesian statistical models and techniques for the identification of subgroups in the analysis of longitudinal data where the number of time intervals is limited. These models are then applied to a single case study which investigates neuropsychological cognition in early stage breast cancer patients undergoing adjuvant chemotherapy treatment, from the Cognition in Breast Cancer Study undertaken by the Wesley Research Institute of Brisbane, Queensland. Alternative formulations to the linear or polynomial approach are taken which use piecewise linear models, with a single turning point, change-point or knot at a known time point, and latent basis models for the non-linear trajectories found for the verbal memory domain of cognitive function before and after chemotherapy treatment.
Hierarchical Bayesian random effects models are used as a starting point for the latent class modelling process and are extended with the incorporation of covariates in the trajectory profiles and as predictors of class membership. The Bayesian latent basis models enable the degree of recovery post-chemotherapy to be estimated for short and long-term follow-up occasions, and the distinct class trajectories assist in the identification of breast cancer patients who may be at risk of long-term verbal memory impairment.
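The change-point parameterisation behind the piecewise linear trajectories can be sketched as follows. This is a minimal ordinary least-squares stand-in for illustration only; the thesis itself fits hierarchical Bayesian random effects models with latent classes, which this sketch does not attempt. The knot position, coefficients, and data are hypothetical.

```python
import numpy as np

def design_matrix(t, knot):
    """Columns: intercept, time, and post-knot slope change max(t - knot, 0)."""
    t = np.asarray(t, dtype=float)
    return np.column_stack([np.ones_like(t), t, np.maximum(t - knot, 0.0)])

def fit_piecewise(t, y, knot):
    """Least-squares fit of the two-slope model with a known change-point;
    returns [intercept, pre-knot slope, change in slope after the knot]."""
    X = design_matrix(t, knot)
    coef, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return coef

# Hypothetical cognition scores: decline up to the knot (say, the end of
# chemotherapy at t = 2), then partial recovery afterwards
t = np.arange(6)
y = design_matrix(t, knot=2) @ np.array([10.0, -1.0, 1.5])

coef = fit_piecewise(t, y, knot=2)  # recovers [10.0, -1.0, 1.5]
```

In the Bayesian latent class setting described above, each class would carry its own set of these trajectory coefficients, with random effects capturing individual variation around the class-specific shape.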
Abstract:
There is no denying that the information technology revolution of the late twentieth century has arrived. While it is not equitably accessible for many, others hold high expectations for the contributions online activity will make to student learning outcomes. Concurrently, and not necessarily consequentially, the number of science and technology secondary school and university graduates throughout the world has declined substantially, as has students' motivation and engagement with school science (OECD, 2006). The aim of this research paper is to explore one aspect of online activity, that of forum-based netspeak (Crystal, 2006), in relation to the possibilities and challenges it provides for forms of scientific learning. This paper reports findings from a study investigating student-initiated netspeak in a science-inspired multiliteracies (New London Group, 2000) project in one middle primary (aged 7-10 years) multi-age Australian classroom. Drawing on the theoretical description of the five phases of enquiry proposed by Bybee (1997), an analytic framework is proffered that allows identification of student engagement, exploration, explanation, elaboration and evaluation in scientific enquiry. The findings provide insight into online forums for advancing learning in and motivation for science in the middle primary years.