393 results for IT intention to learn


Relevance: 100.00%

Abstract:

Predictors of people’s intention to register with a body bequest program for donating their deceased body to medical science and research were examined using the standard theory of planned behavior (TPB) predictors (attitude, subjective norm, perceived behavioral control), with moral norm, altruism, and knowledge added. Australian students (N = 221) at a university with a recently established body bequest program completed measures of the TPB’s underlying beliefs (behavioral, normative, and control beliefs) and of the standard and extended TPB predictors, with a sub-sample reporting their registration-related behavior 2 months later. The standard TPB predictors accounted for 43.6% of the variance in intention, and the extended predictors for an additional 15.1%. The significant predictors were attitude, subjective norm, and moral norm, partially supporting an extended TPB in understanding people’s body donation intentions. Further, the important underlying beliefs identified can inform strategies to target prospective donors.
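
The variance figures above correspond to a two-step hierarchical regression: the standard TPB predictors are entered first, the extended predictors second, and the gain in R² is credited to the extension. A minimal Python sketch of that logic, assuming hypothetical file and column names (this is not the study's own analysis code):

```python
# Hierarchical (two-step) regression reproducing the delta-R^2 logic.
# File and column names are hypothetical; this is not the study's code.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("tpb_survey.csv")  # hypothetical composite scale scores

step1 = smf.ols("intention ~ attitude + subjective_norm + pbc", data=df).fit()
step2 = smf.ols(
    "intention ~ attitude + subjective_norm + pbc"
    " + moral_norm + altruism + knowledge",
    data=df,
).fit()

print(f"Standard TPB R^2: {step1.rsquared:.3f}")  # paper reports 43.6%
print(f"Extended TPB R^2: {step2.rsquared:.3f}")
print(f"Delta R^2:        {step2.rsquared - step1.rsquared:.3f}")  # paper: +15.1%
```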

Relevance: 100.00%

Abstract:

Humans and microbes have developed a symbiotic relationship over time, and alterations in this symbiotic relationship have been linked to several immune mediated diseases such as inflammatory bowel disease, type 1 diabetes and spondyloarthropathies. Improvements in sequencing technologies, coupled with a renaissance in 16S rRNA gene based community profiling, have enabled the characterization of microbiomes throughout the body including the gut. Improved characterization and understanding of the human gut microbiome means the gut flora is progressively being explored as a target for novel therapies including probiotics and faecal microbiota transplants. These innovative therapies are increasingly used for patients with debilitating conditions where conventional treatments have failed. This review discusses the current understanding of the interplay between host genetics and the gut microbiome in the pathogenesis of spondyloarthropathies, and how this may relate to potential therapies for these conditions.

Relevance: 100.00%

Abstract:

Prior to embarking on further study into the subject of relevance it is essential to consider why the concept of relevance has remained inconclusive, despite extensive research and its centrality to the discipline of information science. The approach taken in this paper is to reconstruct the science of information retrieval from first principles including the problem statement, role, scope and objective. This framework for document selection is put forward as a straw man for comparison with the historical relevance models. The paper examines five influential relevance models over the past 50 years. Each is examined with respect to its treatment of relevance and compared with the first principles model to identify contributions and deficiencies. The major conclusion drawn is that relevance is a significantly overloaded concept which is both confusing and detrimental to the science.

Relevance: 100.00%

Abstract:

The Australian government has recently pledged a reduction in GHG emissions of 26–28% below the 2005 level by 2030. How big is the challenge of achieving this target, given the country's present emissions profile, recent historical trends, and the contributions to those trends from key proximate factors? In this paper, we attempt a quantitative judgement of the challenge using decomposition analysis. The analysis suggests the announced target will be quite challenging to achieve if the average annual mitigating effects from economic restructuring, energy efficiency improvements and movement towards less emissions-intensive energy sources in evidence over 2002–2013 continue through to 2030; however, if the contribution from these mitigating sources in evidence over 2006–2013 can be sustained, achieving the target will be much less challenging. The challenge for government, then, will be to provide a policy framework that ensures the more pronounced beneficial impacts of the mitigating factors evidenced during 2006–2013 can be maintained over the years to 2030.
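
Decomposition analyses of this kind typically attribute emissions growth to proximate drivers through a Kaya-style identity, often with the log-mean Divisia index (LMDI) method. The abstract names the factors but not the index formula, so the LMDI choice and all numbers below are illustrative assumptions, not the paper's results:

```python
# Illustrative LMDI-I decomposition over a Kaya-style identity:
# C = GDP * (E/GDP) * (C/E); all values are made-up index numbers.
import math

def lmdi_weight(c0: float, c1: float) -> float:
    """Logarithmic mean of emissions in the two periods (LMDI weight)."""
    return (c1 - c0) / math.log(c1 / c0) if c1 != c0 else c0

gdp0, ei0, ci0 = 1.00, 1.00, 1.00   # base year (indexed)
gdp1, ei1, ci1 = 1.35, 0.85, 0.95   # end year (hypothetical trends)

c0, c1 = gdp0 * ei0 * ci0, gdp1 * ei1 * ci1
w = lmdi_weight(c0, c1)

effects = {
    "activity (GDP growth)":          w * math.log(gdp1 / gdp0),
    "energy intensity (efficiency)":  w * math.log(ei1 / ei0),
    "emissions intensity (fuel mix)": w * math.log(ci1 / ci0),
}
print(f"Total change in emissions: {c1 - c0:+.3f}")
for name, eff in effects.items():
    print(f"  {name}: {eff:+.3f}")
# The additive effects sum exactly to the total change (LMDI-I property).
```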

Relevance: 100.00%

Abstract:

A quantitative, quasi-experimental study of the effectiveness of computer-based scientific visualizations for concept learning by Year 11 physics students (n=80) was conducted in six Queensland high school classrooms. Students’ gender and academic ability were also considered as factors in the effectiveness of teaching with visualizations. Learning with visualizations was found to be as effective as learning without them for all students, with no statistically significant difference in outcomes observed for the group as a whole or on the academic ability dimension. Male students, however, were found to learn significantly better with visualizations than without, while no such effect was observed for female students; this may give rise to some concern about the equity issues raised by introducing visualizations. Given that other research shows that students enjoy learning with visualizations and that their engagement with learning is enhanced, the finding that learning outcomes are the same as for teaching without visualizations supports teachers’ use of visualizations.
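
The reported pattern (no overall difference, but a gender-specific effect) is what a condition-by-gender interaction test detects. A hedged sketch of such an analysis on a hypothetical long-format results file; this is not the study's actual analysis script:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("physics_posttest.csv")  # hypothetical: one row per student

# Two-way ANOVA: visualization condition, gender, and their interaction.
model = smf.ols("post_score ~ C(condition) * C(gender)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
# A significant C(condition):C(gender) interaction alongside a null
# C(condition) main effect would match the pattern of results above.
```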

Relevance: 100.00%

Abstract:

In his 1987 book, The Media Lab: Inventing the Future at MIT, Stewart Brand provides an insight into the visions of the future of the media in the 1970s and 1980s. He notes that Nicholas Negroponte made a compelling case for the foundation of a media laboratory at MIT with diagrams detailing the convergence of three sectors of the media—the broadcast and motion picture industry; the print and publishing industry; and the computer industry. Stewart Brand commented: ‘If Negroponte was right and communications technologies really are converging, you would look for signs that technological homogenisation was dissolving old boundaries out of existence, and you would expect an explosion of new media where those boundaries used to be’. Two decades later, technology developers, media analysts and lawyers have become excited about the latest phase of media convergence. In 2006, the faddish Time Magazine heralded the arrival of various Web 2.0 social networking services: ‘You can learn more about how Americans live just by looking at the backgrounds of YouTube videos—those rumpled bedrooms and toy‐strewn basement rec rooms—than you could from 1,000 hours of network television. And we didn’t just watch, we also worked. Like crazy. We made Facebook profiles and Second Life avatars and reviewed books at Amazon and recorded podcasts. We blogged about our candidates losing and wrote songs about getting dumped. We camcordered bombing runs and built open‐source software. America loves its solitary geniuses—its Einsteins, its Edisons, its Jobses—but those lonely dreamers may have to learn to play with others. Car companies are running open design contests. Reuters is carrying blog postings alongside its regular news feed. Microsoft is working overtime to fend off user‐created Linux. We’re looking at an explosion of productivity and innovation, and it’s just getting started, as millions of minds that would otherwise have drowned in obscurity get backhauled into the global intellectual economy.’ The magazine announced that Time’s Person of the Year was ‘You’, the everyman and everywoman consumer, ‘for seizing the reins of the global media, for founding and framing the new digital democracy, for working for nothing and beating the pros at their own game’. This review essay considers three recent books that have explored the legal dimensions of new media. In contrast to the unbridled exuberance of Time Magazine, this series of legal works displays an anxious trepidation about the legal ramifications of the rise of social networking services. In his tour de force, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet, Daniel Solove considers the implications of social networking services, such as Facebook and YouTube, for the legal protection of reputation under privacy law and defamation law. Andrew Kenyon’s edited collection, TV Futures: Digital Television Policy in Australia, explores the intersection between media law and copyright law in the regulation of digital television and Internet videos. In The Future of the Internet and How to Stop It, Jonathan Zittrain explores the impact of ‘generative’ technologies and ‘tethered appliances’—considering everything from the Apple Mac and the iPhone to the One Laptop per Child programme.

Relevance: 100.00%

Abstract:

Typically, adolescents' friends are considered a risk factor for adolescent engagement in risk-taking. This study took a more novel approach by examining adolescent friendship as a protective factor; in particular, it investigated friends' potential to intervene to reduce risk-taking. 540 adolescents (mean age 13.47 years) were asked about their intention to intervene to reduce friends' alcohol and drug use and alcohol-related harms, and about psychosocial factors potentially associated with intervening. More than half indicated that they would intervene in friends' alcohol use, drug use, alcohol-related harms and interpersonal violence. Intervening was associated with being female, having friends who engage in less risk-taking overall, and having greater school connectedness. The findings provide an important understanding of increasing adolescent protective behavior as a potential strategy to reduce alcohol- and drug-related harms.

Relevance: 100.00%

Abstract:

Over the last two decades, the notion of teacher leadership has emerged as a key concept in both the teaching and leadership literature. While researchers have not reached consensus regarding a definition, there has been some agreement that teacher leadership can operate at both a formal and an informal level in schools and that it includes leadership of an instructional, organisational and professional development nature (York-Barr & Duke, 2004). Teacher leadership is a construct that tends not to be applied to pre-service teachers as interns, but is more often connected with the professional role of mentors who collaborate with them as they make the transition to being a beginning teacher. We argue that teacher leadership should be recognised as a professional and career goal during this formative learning phase and that interns should be expected to overtly demonstrate signs, albeit early ones, of leadership in instruction and other professional areas of development. The aim of this paper is to explore the extent to which teacher education interns at one university in Queensland reported on activities that may be deemed ‘teacher leadership’. The research approach used in this study was an examination of 145 reflective reports written in 2008 by final-year Bachelor of Education (primary) pre-service teachers. These reports recorded the pre-service teachers’ perceptions of their professional learning with a school-based mentor in response to four outcomes of internship that were scaffolded by their mentor or initiated by the interns themselves. These outcomes formed the bases of our research questions into the professional learning of the interns and included ‘increased knowledge and capacity to teach within the total world of work as a teacher’, the capacity ‘to work autonomously and interdependently’, ‘growth in critical reflectivity’, and the ‘ability to initiate professional development with the mentoring process’. Using the constant comparative method of Strauss and Corbin (1998), key categories of experiences emerged; these categories were then identified as belonging to a main meta-category labelled ‘teacher leadership’. Our research findings revealed that five dimensions of teacher leadership – effective practice in schools; school curriculum work; professional development of colleagues; parent and community involvement; and contributions to the profession – were evident in the interns’ written reports. Not surprisingly, the mentor/intern relationship was the main vehicle for enabling the intern to learn about teaching and leadership. The paper concludes with some key implications for developers of pre-service education programmes regarding the need for teacher leadership to be part of the discourse of these programmes.

Relevance: 100.00%

Abstract:

This study examined whether supervision characteristics impacted on mental health practice and morale, and developed a new Supervision Attitude Scale (SAS). Telephone surveys were conducted with a representative sample of 272 staff from public mental health services across Queensland. Although supervision was widely received and positively rated, it had low average intensity, and assessment and training of skills was rarely incorporated. Perceived impact on practice was associated with acquisition of skills and positive attitudes to supervisors, but extent of supervision was related to impact only if it was from within the profession. Intention to resign was unrelated to extent of supervision, but was associated with positive attitudes to supervisors, accessibility, high impact, and empathy or praise in supervision sessions. The SAS had high internal consistency, and its intercorrelations were consistent with it being a measure of relationship positivity. The study supported the role of supervision in retention and in improving practice. It also highlighted supervision characteristics that might be targeted in training, and provided preliminary data on a new measure.
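
The "high internal consistency" reported for the SAS is conventionally quantified with Cronbach's alpha. A minimal sketch of that computation on hypothetical item responses (the actual SAS items are not reproduced here):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents x items matrix of scale scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical responses: 5 respondents x 4 items on a 1-5 Likert scale.
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")
```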

Relevance: 100.00%

Abstract:

Context: The School of Information Technology at QUT has recently undertaken a major restructuring of its Bachelor of Information Technology (BIT) course. The aims of this restructuring include reducing first year attrition and providing an attractive degree course that meets both student and industry expectations. Emphasis has been placed on the first semester in the context of retaining students, by introducing a set of four units that complement one another and provide introductory material on technology, programming and related skills, and generic skills that will aid the students throughout their undergraduate course and in their careers. This discussion relates to one of these four first semester units, namely Building IT Systems. The aim of this unit is to create small Information Technology (IT) systems that use programming or scripting and databases, as either standalone applications or web applications. In the prior history of teaching introductory computer programming at QUT, programming was taught as a standalone subject, and integration of computer applications with other systems such as databases and networks was not undertaken until students had been given a thorough grounding in those topics as well. Feedback has indicated that students do not believe that working with a database requires programming skills. In effect, the building blocks of computer applications have been compartmentalized and taught in isolation from each other. The teaching of introductory computer programming has been an industry requirement of IT degree courses, as many jobs require at least some knowledge of the topic. Yet computer programming is not a skill that all students have equal capabilities of learning (Bruce et al., 2004), as is clearly shown by the volume of publications dedicated to this topic in the literature over a broad period of time (Eckerdal & Berglund, 2005; Mayer, 1981; Winslow, 1996). The teaching of this introductory material has been done in much the same way over the past thirty years. Over the period that introductory computer programming courses have been taught at QUT, a number of different programming languages and programming paradigms have been used, and different approaches to teaching and learning have been attempted in an effort to find the golden thread that would allow students to learn this complex topic. Unfortunately, computer programming is not a skill that can be learnt in one semester: some basics can be learnt, but it can take many years to master (Norvig, 2001). Faculty data have typically shown a bimodal distribution of results for students undertaking introductory programming courses, with a high proportion of students receiving a high mark and a high proportion receiving a low or failing mark. This indicates that there is a group of students who understand and excel with the introductory material, while another group struggles to understand the concepts and practices required to translate a specification or problem statement into a computer program that achieves what is being requested. The consequence of a large group of students failing the introductory programming course has been a high level of attrition amongst first year students. This attrition level does not provide good continuity in student numbers in later years of the degree program, and the current approach is not seen as sustainable.

Relevance: 100.00%

Abstract:

This paper focuses on issues of access to productive literacy learning as part of socially just schooling for recently arrived refugee youth in Australia. It argues that a sole reliance on traditional ESL pedagogy is failing this vulnerable group of students, who differ significantly from past refugees who have settled in Australia. Many have been ‘placeless’ for some time, are likely to have received at best an interrupted education before arriving in Australia, and may have experienced significant trauma (Christie & Sidhu, 2006; Cottone, 2004; Miller, Mitchell, & Brown, 2005). Australian Government policy has resulted in spatialized settlement, leaving particular schools dealing with a large influx of refugee students who may be attending school for the first time (Centre for Multicultural Youth Issues, 2004; Sidhu & Christie, 2002). While this has implications generally, it has particular consequences for secondary school students attempting to learn English literacy in short periods of time, without basic foundations in either English or print-based literacy in any first language (Centre for Multicultural Youth Issues, 2006). Many of these students leave school without the most basic early literacy practices, having endured several years of pedagogy pitched well beyond their needs. This paper suggests that schools must take up three key roles: to educate, to provide a site for the development of civic responsibility, and to act as a site for welfare with responsibility. As one teacher put it: ‘As a system, our department needs to work out what can we do for 17-18 year olds that are coming into our school system in year 10 without more than 1-2 years of education. I don't think there is a policy about what to do.’ – (T2-ESL teacher)

Relevance: 100.00%

Abstract:

The present paper focuses on some interesting classes of process-control games, where winning essentially means successfully controlling the process. A master for one of these games is an agent who plays a winning strategy. In this paper we investigate situations in which even a complete model (given by a program) of a particular game does not provide enough information to synthesize—even incrementally—a winning strategy. However, if in addition to getting a program, a machine may also watch masters play winning strategies, then the machine is able to incrementally learn a winning strategy for the given game. Studied are successful learning from arbitrary masters and from pedagogically useful selected masters. It is shown that selected masters are strictly more helpful for learning than are arbitrary masters. Both for learning from arbitrary masters and for learning from selected masters, though, there are cases where one can learn programs for winning strategies from masters but not if one is required to learn a program for the master's strategy itself. Both for learning from arbitrary masters and for learning from selected masters, one can learn strictly more by watching m+1 masters than one can learn by watching only m. Last, a simulation result is presented where the presence of a selected master reduces the complexity from infinitely many semantic mind changes to finitely many syntactic ones.
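
The paper's results are stated in a formal inductive-inference model, but the basic mechanic of converging on a winning strategy by watching a master, revising the conjecture (a "mind change") whenever new play contradicts it, can be illustrated informally. The toy sketch below is an analogy under that reading, not the paper's recursion-theoretic construction:

```python
# Toy analogue: a learner watches a master's (state, action) moves and
# incrementally copies them, "changing its mind" whenever a new
# observation contradicts its current conjectured strategy.
from typing import Callable, Dict, Iterable, Tuple

def learn_from_master(
    plays: Iterable[Tuple[str, str]],
) -> Callable[[str], str]:
    conjecture: Dict[str, str] = {}
    mind_changes = 0
    for state, action in plays:
        if conjecture.get(state) not in (None, action):
            mind_changes += 1  # revise the conjectured strategy
        conjecture[state] = action
    print(f"mind changes: {mind_changes}")
    # Outside observed states, fall back to an arbitrary default action.
    return lambda s: conjecture.get(s, "wait")

# Hypothetical master play in a tiny control game.
strategy = learn_from_master([("hot", "cool"), ("cold", "heat"), ("hot", "cool")])
print(strategy("hot"), strategy("unknown"))
```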

Relevance: 100.00%

Abstract:

My research investigates why nouns are learned disproportionately more frequently than other kinds of words during early language acquisition (Gentner, 1982; Gleitman et al., 2004). This question must be considered in the context of cognitive development in general. Infants have two major streams of environmental information to make meaningful: perceptual and linguistic. Perceptual information flows in from the senses and is processed into symbolic representations by the primitive language of thought (Fodor, 1975). These symbolic representations are then linked to linguistic input to enable language comprehension and ultimately production. Yet how exactly does perceptual information become conceptualized? Although this question is difficult, there has been progress. One way that children might have an easier job is if they have structures that simplify the data. Thus, if particular sorts of perceptual information could be separated from the mass of input, then it would be easier for children to refer to those specific things when learning words (Spelke, 1990; Pylyshyn, 2003). It would be easier still if linguistic input were segmented in predictable ways (Gentner, 1982; Gleitman et al., 2004). Unfortunately, the frequency of patterns in lexical or grammatical input cannot explain the cross-cultural and cross-linguistic tendency to favor nouns over verbs and predicates. There are three examples of this failure: 1) a wide variety of nouns are uttered less frequently than a smaller number of verbs and yet are learnt far more easily (Gentner, 1982); 2) word order and morphological transparency offer no insight when you contrast the sentence structures and word inflections of different languages (Slobin, 1973); and 3) particular language teaching behaviors (e.g. pointing at objects and repeating names for them) have little impact on children's tendency to prefer concrete nouns in their first fifty words (Newport et al., 1977). Although the linguistic solution appears problematic, there has been increasing evidence that the early visual system does indeed segment perceptual information in specific ways before the conscious mind begins to intervene (Pylyshyn, 2003). I argue that nouns are easier to learn because their referents directly connect with innate features of the perceptual faculty. This hypothesis stems from work done on visual indexes by Zenon Pylyshyn (2001, 2003). Pylyshyn argues that the early visual system (the architecture of the "vision module") segments perceptual data into pre-conceptual proto-objects called FINSTs. FINSTs typically correspond to physical things such as Spelke objects (Spelke, 1990). Hence, before conceptualization, visual objects are picked out by the perceptual system demonstratively, like a pointing finger indicating ‘this’ or ‘that’. I suggest that this primitive system of demonstration elaborates on Gareth Evans's (1982) theory of nonconceptual content. Nouns are learnt first because their referents attract demonstrative visual indexes. This theory also explains why infants less often name stationary objects such as ‘plate’ or ‘table’, but do name things that attract the focal attention of the early visual system, i.e., small objects that move, such as ‘dog’ or ‘ball’. This view leaves open the questions of how blind children learn words for visible objects and why children learn category nouns (e.g. 'dog') rather than proper nouns (e.g. 'Fido') or higher taxonomic distinctions (e.g. 'animal').

Relevance: 100.00%

Abstract:

A significant proportion of the cost of software development is due to software testing and maintenance. This is in part the result of the inevitable imperfections due to human error, a lack of quality during the design and coding of software, and the increasing need to reduce faults to improve customer satisfaction in a competitive marketplace. Given the cost and importance of removing errors, improvements in fault detection and removal can be of significant benefit. The earlier in the development process faults can be found, the less it costs to correct them and the less likely other faults are to develop. This research aims to make the testing process more efficient and effective by identifying those software modules most likely to contain faults, allowing testing efforts to be carefully targeted. This is done with the use of machine learning algorithms, which use examples of fault-prone and not fault-prone modules to develop predictive models of quality. In order to learn the numerical mapping between a module and its classification, a module is represented in terms of software metrics. A difficulty in this sort of problem is sourcing software engineering data of adequate quality. In this work, data is obtained from two sources: the NASA Metrics Data Program and the open source Eclipse project. Feature selection is applied before learning, and a number of different feature selection methods are compared to find which work best. Two machine learning algorithms are applied to the data - Naive Bayes and the Support Vector Machine - and predictive results are compared to those of previous efforts and found to be superior on selected data sets and comparable on others. In addition, a new classification method is proposed, Rank Sum, in which a ranking abstraction is laid over bin densities for each class, and a classification is determined based on the sum of ranks over features. A novel extension of this method is also described, based on an observed polarising of points by class when Rank Sum is applied to training data to convert it into 2D rank sum space. An SVM is applied to this transformed data to produce models whose parameters can be set according to trade-off curves to obtain a particular performance trade-off.
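
The modelling pipeline described (software metrics as features, feature selection, then Naive Bayes and an SVM under cross-validation) has a standard supervised-learning shape. A hedged scikit-learn sketch of that shape follows; the dataset file, column names, and the specific selector (univariate F-scores, k=10) are placeholders rather than the thesis's actual choices:

```python
import pandas as pd
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder: rows = modules, columns = software metrics + fault label.
df = pd.read_csv("module_metrics.csv")
X, y = df.drop(columns=["fault_prone"]), df["fault_prone"]

for name, clf in [("Naive Bayes", GaussianNB()), ("SVM", SVC())]:
    pipe = make_pipeline(
        StandardScaler(),
        SelectKBest(f_classif, k=10),  # one of several selection methods tried
        clf,
    )
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```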

Relevance: 100.00%

Abstract:

The emergence of Twenty20 cricket at the elite level has been marketed on the excitement of the big hitter, where it seems that winning is a result of the muscular batter hitting boundaries at will. This version of the game has captured the imagination of many young players, who all want to score runs with “big hits”. However, in junior cricket, boundary hitting is often more difficult due to the size limitations of children and games played on outfields where the ball does not travel quickly. As a result, winning is often achieved via a less spectacular route – by scoring more singles than your opponents. However, most standard coaching texts only describe how to play boundary scoring shots (e.g. the drives, pulls, cuts and sweeps) and defensive shots to protect the wicket. Learning to bat appears to have been reduced to extremes of force production, i.e. maximal force production to hit boundaries or minimal force production to stop the ball from hitting the wicket. Initially, this is not a problem, because the typical innings of a young player (<12 years) would be based on the concept of “block” or “bash” – they “block” the good balls and “bash” the short balls. This approach works because there are many opportunities to hit boundaries off the numerous inaccurate deliveries of novice bowlers. Most runs are scored behind the wicket by using the pace of the bowler’s delivery to re-direct the ball, because the intrinsic dynamics (i.e. lack of strength) of most children mean that they can only create sufficient power by playing shots where the whole body can contribute to force production. This method works well until the novice player comes up against more accurate bowling and finds they have no way of scoring runs. Once batters begin to face “good” bowlers, they have to learn to score runs via singles. In cricket coaching manuals (e.g. ECB, n.d.), running between the wickets is treated as a separate task to batting, and the “basics” of running, such as how to “back up”, carry the bat, call, turn, and slide the bat into the crease, are “drilled” into players. This task decomposition strategy focussing on techniques is a common approach to skill acquisition in many highly traditional sports, typified in cricket by activities where players hit balls off tees and receive “throw-downs” from coaches. However, the relative usefulness of these approaches in the acquisition of sporting skills is increasingly being questioned (Pinder, Renshaw & Davids, 2009). We will discuss why this is the case in the next section.