418 results for explicit läsundervisning
Abstract:
Social media analytics is a rapidly developing field of research at present: new, powerful ‘big data’ research methods draw on the Application Programming Interfaces (APIs) of social media platforms. Twitter has proven to be a particularly productive space for such methods development, initially due to the explicit support and encouragement of Twitter, Inc. However, because of the growing commercialisation of Twitter data, and the increasing API restrictions imposed by Twitter, Inc., researchers are now facing a considerably less welcoming environment, and are forced to find additional funding for paid data access, or to bend or break the rules of the Twitter API. This article considers the increasingly precarious nature of ‘big data’ Twitter research, and flags the potential consequences of this shift for academic scholarship.
Abstract:
This article develops a method for the analysis of growth data with multiple recaptures when the initial ages of all individuals are unknown. The existing approaches either impute the initial ages or model them as random effects. Assumptions about the initial age are not verifiable because all the initial ages are unknown. We present an alternative approach that treats all the lengths, including the length at first capture, as correlated repeated measures for each individual. Optimal estimating equations are developed using the generalized estimating equations approach, which requires only assumptions on the first two moments. Explicit expressions for the estimation of both mean growth parameters and variance components are given to minimize the computational complexity. Simulation studies indicate that the proposed method works well. Two real data sets are analyzed for illustration, one from whelks (Dicathais aegrota) and the other from southern rock lobster (Jasus edwardsii) in South Australia.
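As a rough illustration of the estimating-equation idea summarised above (not the authors' implementation), the sketch below simulates capture-recapture lengths from a Fabens-style von Bertalanffy increment model, assumed here purely for illustration, and estimates the mean growth parameters by minimising a GEE-type weighted criterion under an exchangeable working correlation within individuals.

```python
# Minimal sketch: growth-parameter estimation from recapture lengths with
# unknown initial ages, via a GEE-type weighted estimating criterion.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)

# Simulated data: length at first capture plus lengths at later recaptures
# after known elapsed times (the individuals' ages remain unknown).
n_ind, n_recap = 200, 3
Linf_true, k_true, sigma = 180.0, 0.4, 4.0
L0 = rng.uniform(60, 150, n_ind)                                  # length at first capture
dt = np.cumsum(rng.uniform(0.3, 1.5, (n_ind, n_recap)), axis=1)   # years since first capture
mu = Linf_true - (Linf_true - L0[:, None]) * np.exp(-k_true * dt)
y = mu + rng.normal(0, sigma, mu.shape)                           # observed recapture lengths

def whitened_residuals(theta, rho=0.3):
    """Per-individual residuals whitened by an exchangeable working correlation
    R = (1 - rho) I + rho J, so the squared norm equals the GEE-type quadratic
    form sum_i r_i' R^{-1} r_i (up to a constant variance scale)."""
    Linf, k = theta
    fitted = Linf - (Linf - L0[:, None]) * np.exp(-k * dt)
    R = (1 - rho) * np.eye(n_recap) + rho * np.ones((n_recap, n_recap))
    W = np.linalg.cholesky(np.linalg.inv(R))   # whitening factor: W W' = R^{-1}
    return ((y - fitted) @ W).ravel()

fit = least_squares(whitened_residuals, x0=[150.0, 0.2])
print("estimated (Linf, k):", fit.x)
```

A sandwich-type variance estimate and the explicit variance-component expressions discussed in the abstract would sit on top of this, but are omitted from the sketch.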
Abstract:
Due to the increasing speed of landscape changes and the massive development of computer technologies, the methods of representing heritage landscapes using digital tools have become a worldwide concern in conservation research. The aim of this paper is to demonstrate how an ‘interpretative model’ can be used for contextual design of heritage landscape information systems. This approach is explored through building a geographic information system database for St Helena Island national park in Moreton Bay, South East Queensland, Australia. Stakeholders' interpretations of this landscape were collected through interviews, and then used as a framework for designing the database. The designed database is a digital inventory providing contextual descriptions of the historic infrastructure remnants on St Helena Island. It also reveals the priorities of different sites in terms of historic research, landscape restoration, and tourism development. Additionally, this database produces thematic maps of the intangible heritage values, which could be used for landscape interpretation. This approach differs from existing methods in that building a heritage information system is treated as an interpretative activity, rather than a value-free replication of the physical environment. This approach also shows how a cultural landscape methodology can be used to create a flexible information system for heritage conservation. The conclusion is that an ‘interpretative model’ of database design facilitates a more explicit focus on information support, and is a potentially effective approach to user-centred design of geographic information systems.
Abstract:
English is currently ascendant as the language of globalisation, evident in its mediation of interactions and transactions worldwide. For many international students, completion of a degree in English means significant credentialing and increased job prospects. Australian universities are the third largest English-speaking destination for overseas students behind the United States and the United Kingdom. International students comprise one-fifth of the total Australian university population, with 80% coming from Asian countries (ABS, 2010). In this competitive higher education market, English has been identified as a valued ‘good’. Indeed, universities have been critiqued for relentlessly reproducing the “hegemony and homogeneity of English” (Marginson, 2006, p. 37) in order to sustain their advantage in the education market. For international students, English is the gatekeeper to enrolment, the medium of instruction and the mediator of academic success. For these reasons, English is not benign, yet it remains largely taken-for-granted in the mainstream university context. This paper problematises the naturalness of English and reports on a study of an Australian Master of Education course in which English was a focus. The study investigated representations of English as they were articulated across a chain of texts including the university strategic plan, course assessment criteria, student assignments, lecturer feedback, and interviews. Critical Discourse Analysis (CDA) and Foucault’s work on discourse enabled understandings of how a particular English is formed through an apparatus of specifications, exclusionary thresholds, strategies for maintenance (and disruption), and privileged concepts and speaking positions. The findings indicate that English has hegemonic status within the Australian university, with material consequences for students whose proficiency falls outside the thresholds of accepted English practice. Central to the constitution of what counts as English is the relationship of equivalence between standard written English and successful academic writing. International students’ representations of English indicate a discourse that impacts on identities and practices and preoccupies them considerably as they negotiate language and task demands. For the lecturer, there is strategic manoeuvring within the institutional regulative regime to support students’ English language needs using adapted assessment practices, explicit teaching of academic genres and scaffolded classroom interaction. The paper concludes with the implications for university teaching and learning.
Abstract:
This study examined an aspect of adolescent writing development, specifically whether teaching secondary school students to use strategies to enhance succinctness in their essays changed the grammatical sophistication of their sentences. A quasi-experimental intervention was used to compare changes in syntactic complexity and lexical density between one-draft and polished essays. No link was demonstrated between the intervention and the changes. A thematic analysis of teacher interviews explored links between changes to student texts and teaching approaches. The study has implications for making syntactic complexity an explicit goal of student drafting.
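For readers unfamiliar with the two text measures named above, the following sketch computes lexical density and a crude syntactic-complexity proxy (mean sentence length in words); the function-word list and example sentences are illustrative stand-ins, not the study's instruments.

```python
# Minimal sketch of two writing measures: lexical density and mean sentence length.
import re

# A small illustrative sample of English function words (not a complete list).
FUNCTION_WORDS = {"the", "a", "an", "and", "or", "but", "of", "to", "in", "on",
                  "is", "are", "was", "were", "it", "that", "this", "with", "as"}

def measures(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-zA-Z']+", text.lower())
    content = [w for w in words if w not in FUNCTION_WORDS]
    return {"lexical_density": len(content) / len(words),
            "mean_sentence_length": len(words) / len(sentences)}

draft = "The dog ran. It ran to the park and it was fast."
polished = "The dog sprinted swiftly toward the nearby park."
print(measures(draft))
print(measures(polished))
```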
Abstract:
In this paper we consider the third-moment structure of a class of time series models. It is often argued that the marginal distribution of financial time series, such as returns, is skewed. It is therefore important to know what properties a model should possess if it is to accommodate unconditional skewness. We consider modeling the unconditional mean and variance using models that respond nonlinearly or asymmetrically to shocks. We investigate the implications of these models for the third-moment structure of the marginal distribution, as well as the conditions under which the unconditional distribution exhibits skewness and a nonzero third-order autocovariance structure. In this respect, an asymmetric or nonlinear specification of the conditional mean is found to be of greater importance than the properties of the conditional variance. Several examples are discussed and, whenever possible, explicit analytical expressions are provided for all third-order moments and cross-moments. Finally, we introduce a new tool, the shock impact curve, for investigating the impact of shocks on the conditional mean squared error of return series.
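In one common convention (a notational sketch, not necessarily the authors' exact definitions), the third-moment quantities at issue for a return series $y_t$ with unconditional mean $\mu$ are

\[
\gamma_3(0) = \mathrm{E}\!\left[(y_t - \mu)^3\right], \qquad
\gamma_3(j) = \mathrm{E}\!\left[(y_t - \mu)^2 (y_{t-j} - \mu)\right], \quad j \neq 0,
\]

with unconditional skewness

\[
\mathrm{sk}(y_t) \;=\; \frac{\gamma_3(0)}{\left(\mathrm{E}\!\left[(y_t - \mu)^2\right]\right)^{3/2}}.
\]

In these terms, a model accommodates unconditional skewness when its implied $\gamma_3(0)$ is nonzero, and exhibits a nonzero third-order autocovariance structure when $\gamma_3(j) \neq 0$ for some $j \neq 0$.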
Abstract:
The effectiveness of any trapping system is highly dependent on the ability to accurately identify the specimens collected. For many fruit fly species, accurate identification (= diagnostics) using morphological or molecular techniques is relatively straightforward and poses few technical challenges. However, nearly all genera of pest tephritids also contain groups of species where single, stand-alone tools are not sufficient for accurate identification: such groups include the Bactrocera dorsalis complex, the Anastrepha fraterculus complex and the Ceratitis FAR complex. Misidentification of high-impact species from such groups can have dramatic consequences and negate the benefits of an otherwise effective trapping program. To help prevent such problems, this chapter defines what is meant by a species complex and describes in detail how the correct identification of species within a complex requires the use of an integrative taxonomic approach. Integrative taxonomy uses multiple, independent lines of evidence to delimit species boundaries, and the underpinnings of this approach from both the theoretical speciation literature and the systematics/taxonomy literature are described. The strength of the integrative approach lies in the explicit testing of hypotheses and the use of multiple, independent species delimitation tools. A case is made for a core set of species delimitation tools (pre- and post-zygotic compatibility tests, multi-locus phylogenetic analysis, chemoecological studies, and morphometric and geometric morphometric analyses) to be adopted as standards by tephritologists aiming to resolve economically important species complexes. In discussing the integrative approach, emphasis is placed on the subtle but important differences between integrative and iterative taxonomy. The chapter finishes with a case study that illustrates how iterative taxonomy applied to the B. dorsalis species complex led to incorrect taxonomic conclusions, which has had major implications for quarantine, trade, and horticultural pest management. In contrast, an integrative approach to the problem has resolved species limits in this taxonomically difficult group, meaning that robust diagnostics are now available.
Abstract:
The growth of APIs and Web services on the Internet, especially through larger enterprise systems increasingly being leveraged for Cloud and software-as-a-service opportunities, poses challenges to improving the efficiency of integration with these services. Compared with the fine-grained operations of contemporary interfaces, the interfaces of enterprise systems are typically larger, more complex and overloaded, with single operations carrying multiple data entities and parameter sets, supporting varying requests, and reflecting versioning across different system releases. We propose a technique to support the refactoring of service interfaces by deriving business entities and their relationships. In this paper, we focus on the behavioural aspects of service interfaces, aiming to discover the sequential dependencies of operations (otherwise known as protocol extraction) based on the entities and relationships derived. Specifically, we propose heuristics based on these relationships and, in turn, derive permissible orders in which operations are invoked. As a result, service operations can be refactored along business entity CRUD lines, with explicit behavioural protocols as part of an interface definition. This supports flexible service discovery, composition and integration. A prototypical implementation and an analysis of existing Web services, including those of commercial logistics systems (FedEx), are used to validate the algorithms proposed throughout the paper.
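As a toy illustration of the kind of heuristic described (the entity and operation names are hypothetical, and this is not the paper's algorithm), the sketch below classifies operations along CRUD lines, builds a precedence graph in which an entity's create precedes its other operations and a parent's create precedes a child's create, and derives one permissible invocation order.

```python
# Minimal sketch: deriving a permissible operation order from CRUD-classified
# operations and parent/child relationships between business entities.
from graphlib import TopologicalSorter

# Hypothetical entities and relationships derived from a service interface.
parent_of = {"OrderLine": "Order", "Order": "Customer"}   # child -> parent
operations = {                                            # op -> (entity, CRUD kind)
    "createCustomer": ("Customer", "C"), "getCustomer": ("Customer", "R"),
    "createOrder": ("Order", "C"), "updateOrder": ("Order", "U"),
    "createOrderLine": ("OrderLine", "C"), "deleteOrderLine": ("OrderLine", "D"),
}

def precedence_graph(ops, parents):
    """Map each operation to the set of operations that must precede it."""
    creates = {entity: op for op, (entity, kind) in ops.items() if kind == "C"}
    preds = {op: set() for op in ops}
    for op, (entity, kind) in ops.items():
        if kind != "C" and entity in creates:
            preds[op].add(creates[entity])                 # C before R/U/D on same entity
        if kind == "C" and entity in parents and parents[entity] in creates:
            preds[op].add(creates[parents[entity]])        # parent C before child C
    return preds

order = list(TopologicalSorter(precedence_graph(operations, parent_of)).static_order())
print(order)   # one permissible invocation sequence over the interface
```

In the refactored interface, such a derived protocol would accompany the operation signatures so that clients can discover not only what operations exist but in which orders they may legitimately be invoked.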
Abstract:
The quality of species distribution models (SDMs) relies to a large degree on the quality of the input data, from bioclimatic indices to environmental and habitat descriptors (Austin, 2002). Recent reviews of SDM techniques have sought to optimize predictive performance (e.g. Elith et al., 2006). In general, SDMs employ one of three approaches to variable selection. The simplest approach relies on the expert to select the variables, as in environmental niche models (Nix, 1986) or a generalized linear model without variable selection (Miller and Franklin, 2002). A second approach explicitly incorporates variable selection into model fitting, which allows examination of particular combinations of variables. Examples include generalized linear or additive models with variable selection (Hastie et al., 2002), or classification trees with complexity- or model-based pruning (Breiman et al., 1984; Zeileis, 2008). A third approach uses model averaging to summarize the overall contribution of a variable, without considering particular combinations. Examples include neural networks, boosted or bagged regression trees and Maximum Entropy, as compared in Elith et al. (2006). Typically, users of SDMs will either consider a small number of variable sets, via the first approach, or else supply all of the candidate variables (often numbering more than a hundred) to the second or third approaches. Bayesian SDMs exist, with several methods for eliciting and encoding priors on model parameters (see the review in Low Choy et al., 2010). However, few methods have been published for informative variable selection; one example is Bayesian trees (O’Leary, 2008). Here we report an elicitation protocol that helps make explicit a priori expert judgements on the quality of candidate variables. This protocol can be flexibly applied to any of the three approaches to variable selection described above, Bayesian or otherwise. We demonstrate how this information can be obtained and then used to guide variable selection in classical or machine learning SDMs, or to define priors within Bayesian SDMs.
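A minimal sketch of how elicited expert judgements might be encoded (the variable names and scores are hypothetical; the actual elicitation protocol is the one described above): scores are normalised into prior inclusion weights and also used as a simple screening rule before model fitting.

```python
# Hypothetical elicited quality scores (0-10) for candidate predictor variables.
elicited = {"annual_mean_temperature": 9,
            "precipitation_driest_quarter": 7,
            "soil_ph": 4,
            "distance_to_roads": 2,
            "elevation": 6}

total = sum(elicited.values())
prior_weights = {v: score / total for v, score in elicited.items()}

# A crude screening rule: carry forward only variables the experts rated >= 5.
shortlist = [v for v, score in elicited.items() if score >= 5]

print("prior inclusion weights:", prior_weights)
print("variables carried into model fitting:", shortlist)
# In a Bayesian SDM the weights could parameterise prior inclusion probabilities;
# in a classical or machine-learning SDM the shortlist simply restricts the
# candidate set before fitting.
```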
Abstract:
Crime analysts have traditionally received little guidance from academic researchers on key tasks in the analysis process, specifically the testing of multiple hypotheses and the evaluation of evidence in a scientific fashion. This article attempts to fill this gap by outlining a method (the Analysis of Competing Hypotheses) for systematically analysing multiple explanations for crime problems. The method is systematic, avoids many cognitive errors common in analysis, and is explicit. It is argued that the implementation of this approach makes analytic products auditable, makes the reasoning underpinning them transparent, and provides intelligence managers with a rational professional development tool for individual analysts.
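A small sketch of an ACH-style matrix with hypothetical hypotheses and evidence: following the usual ACH logic, each item of evidence is rated for consistency against every hypothesis, and hypotheses are ranked by how much evidence is inconsistent with them rather than by how much supports them.

```python
# Minimal ACH sketch: evidence items are rated against each hypothesis as
# "C" (consistent), "I" (inconsistent) or "N" (neutral). All values are hypothetical.
hypotheses = ["local opportunistic offenders",
              "organised travelling group",
              "insider facilitation"]

matrix = {  # evidence -> ratings against each hypothesis, in the order above
    "offences cluster near transport corridors": ["I", "C", "N"],
    "identical tool marks at widely separated sites": ["I", "C", "C"],
    "alarm codes were known in two incidents": ["N", "I", "C"],
    "stolen goods resurface in local markets": ["C", "I", "N"],
}

def inconsistencies(h_index):
    """Count evidence items rated inconsistent with hypothesis h_index."""
    return sum(1 for ratings in matrix.values() if ratings[h_index] == "I")

# In ACH the analyst works to refute: the hypothesis with the LEAST inconsistent
# evidence is retained, not the one with the most support.
for h_index, hypothesis in sorted(enumerate(hypotheses), key=lambda ih: inconsistencies(ih[0])):
    print(f"{inconsistencies(h_index)} inconsistencies: {hypothesis}")
```

Recording the matrix and the refutation reasoning alongside the conclusion is what makes the resulting analytic product auditable and its reasoning transparent.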
Abstract:
Older populations are more likely to have multiple co-morbid diseases that require multiple treatments, which makes them large consumers of medications. As a person grows older, their ability to tolerate medications declines because of age-related changes in pharmacokinetics and pharmacodynamics, often setting them on a path that leads to frailty. Frail older persons often have multiple co-morbidities with signs of impairment in activities of daily living. Prescribing drugs for these vulnerable individuals is difficult and potentially unsafe. Inappropriate prescribing in the older population can be detected using explicit (criterion-based) or implicit (judgment-based) criteria. Unfortunately, most current therapeutic guidelines are applicable only to healthy older adults and cannot be generalized to frail patients. These discrepancies should be addressed either by developing new criteria or by refining the existing tools for frail older people. The first and foremost step is to identify the frail patient in clinical practice by applying clinically validated tools. Once the frail patient has been identified, there is a need for specific measures or criteria to assess the appropriateness of therapy that take into account factors such as quality of life, functional status and remaining life expectancy, and thus modified goals of care.
Abstract:
The out-of-plane behaviour of mortared and mortarless masonry walls with various forms of reinforcement, including unreinforced masonry as a base case, is examined using an explicit finite element modelling method based on layered shell elements. Wall systems containing internal reinforcement, external surface reinforcement and intermittently laced reinforced concrete members, as well as unreinforced masonry panels, are considered. Masonry is modelled as a layer with macroscopic orthotropic properties; external reinforcing render, grout and reinforcing bars are modelled as distinct layers of the shell element. Predictions from the layered shell model have been validated using several out-of-plane experimental datasets reported in the literature. The model is used to examine the effectiveness of two retrofitting schemes for an unreinforced masonry wall.
Abstract:
With a strong emphasis on the interconnection between theory and practice, and how past/present intersections inform the future, these thirty-one articles by artist/scholars and artist/teachers profile current dance research from thirteen countries. The papers coalesce around five sub-themes: approaches to choreography and performance; shifting cultural dance identities; contemporary research perspectives; changing dance pedagogies; and site & environmental dance. In exploring the cognitive and the sensory, the rational and the instinctive, the explicit and the implicit, these writings create and celebrate common and differentiated dance understandings — at times directly and provocatively, and at times liminally and poetically.
Abstract:
The Road Safety Remuneration Act 2012 (Cth) (the Act) explicitly enables the Road Safety Remuneration Tribunal to make orders that can impose binding requirements on all the participants in the road transport supply chain, including consignors and consignees at the apex of the chain, for the pay and safety of both employee and independent contractor drivers. The tribunal is also specifically empowered to make enforceable orders to reduce or remove remuneration-related incentives and pressures that contribute to unsafe work practices in the road transport industry. Recently the tribunal handed down its first order. The article considers whether, and the degree to which, the tribunal has been willing to exercise its explicit power to impose enforceable obligations on consignors and consignees — such as large supermarket chains — at the apex of road transport supply chains. It examines the substance and extent of the obligations imposed by the tribunal, including whether the tribunal has exercised the full range of powers vested in it by the Act. We contend that the tribunal’s first order primarily imposes obligations on direct work providers and drivers without making large, powerful consignors and consignees substantively responsible for driver pay and safety. We argue that the tribunal’s first order could have more comprehensively fulfilled the objectives of the Act by more directly addressing the root causes of low pay and poor safety in the road transport industry.
Abstract:
Information sharing in distance collaboration: A software engineering perspective, Queensland
Factors in software engineering workgroups such as geographical dispersion and background discipline can be conceptually characterized as "distances", and they are obstructive to team collaboration and information sharing. This thesis focuses on information sharing across multidimensional distances and develops an information sharing distance model with six core dimensions: geography, time zone, organization, multi-discipline, heterogeneous roles, and varying project tenure. The research suggests that the effectiveness of workgroups may be improved through the mindful conduct of information sharing, especially proactive consideration of, and explicit adjustment for, the distances of the recipient when sharing information.