962 results for Eggemoggin Reach
Abstract:
While externally moderated standards-based assessment has been practised in Queensland senior schooling for more than three decades, there has been no such practice in the middle years. With the introduction of standards at state and national levels in these years, teacher judgement as developed in moderation practices is now vital. This paper argues that, in this context of assessment reform, standards intended to inform teacher judgement and to build assessment capacity are necessary but not sufficient for maintaining teacher and public confidence in schooling. Teacher judgement is intrinsic to moderation, and to professional practice, and can no longer remain private. Moderation too is intrinsic to efforts by the profession to realise judgements that are defensible, dependable and open to scrutiny. Moderation can no longer be considered an optional extra and requires system-level support, especially if, as intended, the standards are linked to system-wide efforts to improve student learning. In presenting this argument we draw on an Australian Research Council funded study with key industry partners (the Queensland Studies Authority and the National Council for Curriculum and Assessment of the Republic of Ireland). The data analysed included teacher interviews and additional teacher talk during moderation sessions, undertaken during the initial phase of policy development. The analysis identified the issues that emerge in moderation meetings designed to reach consistent, reliable judgements. Of interest are the different ways in which teachers talked through and interacted with one another to reach agreement about the quality of student work in the application of standards. There is evidence of differences in the way that teachers made compensations and trade-offs in their award of grades, depending on the subject domain in which they teach.
This article concludes with some empirically derived insights into moderation practices as policy and social events.
Abstract:
As organizations reach higher levels of Business Process Management maturity, they tend to accumulate large collections of process models. These repositories may contain thousands of activities and be managed by different stakeholders with varying skills and responsibilities. While of great value, these repositories induce high management costs. Thus, it becomes essential to keep track of the various model versions, as they may mutually overlap, supersede one another and evolve over time. We propose an innovative versioning model and associated storage structure, specifically designed to maximize sharing across process model versions and to automatically handle change propagation. The focal point of this technique is to version single process model fragments, rather than entire process models. Indeed, empirical evidence shows that real-life process model repositories contain numerous duplicate fragments. Experiments on two industrial datasets confirm the usefulness of our technique.
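The fragment-level sharing idea described here can be sketched with git-style content addressing: each distinct fragment is stored once under a hash of its content, and a model version is just a list of fragment hashes, so unchanged fragments are shared across versions. This is a minimal illustration of the principle, not the paper's actual storage structure; the names `FragmentStore` and `commit` are hypothetical.

```python
import hashlib

def fragment_hash(fragment: str) -> str:
    """Content-address a serialized process-model fragment."""
    return hashlib.sha256(fragment.encode("utf-8")).hexdigest()

class FragmentStore:
    """Stores each distinct fragment once; a version is a list of fragment hashes."""
    def __init__(self):
        self.fragments = {}   # hash -> serialized fragment (stored once)
        self.versions = {}    # (model_id, version) -> list of fragment hashes

    def commit(self, model_id, version, fragments):
        hashes = []
        for frag in fragments:
            h = fragment_hash(frag)
            self.fragments.setdefault(h, frag)  # shared across versions
            hashes.append(h)
        self.versions[(model_id, version)] = hashes
        return hashes
```

Committing two versions that differ in one fragment stores the common fragment only once, which is the sharing effect the abstract describes.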
Abstract:
It is widely contended that we live in a 'world risk society', where risk plays a central and ubiquitous role in contemporary social life. A seminal contributor to this view is Ulrich Beck, who claims that our world is governed by dangers that cannot be calculated or insured against. For Beck, risk is an inherently unrestrained phenomenon, emerging from a core and pouring out from and under national borders, unaffected by state power. Beck's focus on risk's ubiquity and uncontrollability at an infra-global level means that there is a necessary evenness to the expanse of risk: a 'universalization of hazards', which possesses an inbuilt tendency towards globalisation. While sociological scholarship has examined the reach and impact of globalisation processes on the role and power of states, Beck's argument that economic risk is without territory and resistant to domestic policy has come under less appraisal. This is contestable: what are often described as global economic processes, on closer inspection, reveal degrees of territorial embeddedness. This not only suggests that 'global' flows could sometimes be more appropriately explained as international, regional or even local processes, formed from and responsive to state strategies, but also demonstrates what can be missed if we overinflate the global. This paper briefly introduces two key principles of Beck's theory of risk society and positions them within a review of literature debating the novelty and degree of global economic integration and its impact on states pursuing domestic economic policies. In doing so, this paper highlights the value for future research to engage with questions such as 'is economic risk really without territory' and 'does risk produce convergence', not so much as a means of reducing Beck's thesis to a purely empirical analysis, but rather to avoid limiting our scope in understanding the complex relationship between risk and state.
Abstract:
Sports sponsorship increasingly provides organisations with the opportunity to reach their target audiences in a manner that facilitates engagement and encourages relationship development. This paper provides an Australian perspective on the value of sports sponsorship using a case study of WOW Sight and Sound’s long-term sponsorship of the Brisbane Broncos rugby league team. The case study investigates WOW’s marketing objectives, which centre on generating brand awareness by using the sponsorship of the Brisbane Broncos as an integrated marketing communications tool. WOW believes that the integration of its sponsorship of the Broncos with the team’s total marketing plan is integral to its success. This integration requires the facilitation of two-way communications between WOW, its advertising agency, the Brisbane Broncos and customers to ensure that all parties’ needs are met.
Abstract:
The study of the creative industries is not much more than a decade old. What makes it fascinating is that it is dealing with a rapidly evolving process, where a good deal of Schumpeterian ‘creative destruction’ – of old industries, business models, and some familiar cultural and creative pursuits – can already be observed. What happens next – and who will be the winner – is hard to predict. Furthermore, the creative industries encompass both large-scale ‘industry’ (media, publishing, digital applications) and individual creative talent; both economic and cultural values, and both global reach and local context. Thus, the challenge is to integrate ‘top-down’ policy and planning with ‘bottom-up’ experimentation and innovation. There is always the promise that this new creative ecology will provide some novel answers to problems of wealth-creation for emergent economies, new solutions to problems of intellectual emancipation for individuals, and sustainable development for that most intense incubator of creative ideas, the city.
Abstract:
The following report considers a number of key challenges the Australian Federal Government faces in designing the regulatory framework and the reach of its planned mandatory internet filter. Previous reports on the mandatory filtering scheme have concentrated on the filtering technologies, their efficacy, their cost and their likely impact on the broadband environment. This report focuses on the scope and the nature of content that is likely to be caught by the proposed filter and on identifying associated public policy implications.
Abstract:
This paper empirically explores the effects of effectuation on nascent firms’ performance. Three potential outcomes for nascent firms using different levels of effectuation and causation are investigated. Innovation, a measure of venture sophistication, was introduced as a moderator. We examine a longitudinal random sample of 625 nascent firms collected over two years in Australia and provide support for our hypotheses. Results show that in situations of high uncertainty, nascent firms using effectuation are more likely to reach the operational stage than their counterparts using causation.
Abstract:
Background: Falls are a major health and injury problem for people with Parkinson disease (PD). Despite the severe consequences of falls, a major unresolved issue is the identification of factors that predict the risk of falls in individual patients with PD. The primary aim of this study was to prospectively determine an optimal combination of functional and disease-specific tests to predict falls in individuals with PD.
Methods: A total of 101 people with early-stage PD undertook a battery of neurologic and functional tests in their optimally medicated state. The tests included Tinetti, Berg, Timed Up and Go, Functional Reach, and the Physiological Profile Assessment of Falls Risk; the latter assessment includes physiologic tests of visual function, proprioception, strength, cutaneous sensitivity, reaction time, and postural sway. Falls were recorded prospectively over 6 months.
Results: Forty-eight percent of participants reported a fall and 24% reported more than one fall. In the multivariate model, a combination of the Unified Parkinson's Disease Rating Scale (UPDRS) total score, total freezing of gait score, occurrence of symptomatic postural orthostasis, Tinetti total score, and extent of postural sway in the anterior-posterior direction produced the best sensitivity (78%) and specificity (84%) for predicting falls. From the UPDRS items, only the rapid alternating task category was an independent predictor of falls. Reduced peripheral sensation and knee extension strength in fallers contributed to increased postural instability.
Conclusions: Falls are a significant problem in optimally medicated early-stage PD. A combination of both disease-specific and balance- and mobility-related measures can accurately predict falls in individuals with PD.
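The reported sensitivity (78%) and specificity (84%) follow the standard definitions: the proportion of actual fallers the model flags, and the proportion of non-fallers it clears. A minimal sketch of the computation (not the study's multivariate model itself; the data below are illustrative):

```python
def sensitivity_specificity(actual, predicted):
    """Sensitivity = TP/(TP+FN) among fallers; specificity = TN/(TN+FP) among non-fallers.
    `actual` and `predicted` are parallel sequences of 0/1 fall indicators."""
    tp = sum(1 for a, p in zip(actual, predicted) if a and p)
    fn = sum(1 for a, p in zip(actual, predicted) if a and not p)
    tn = sum(1 for a, p in zip(actual, predicted) if not a and not p)
    fp = sum(1 for a, p in zip(actual, predicted) if not a and p)
    return tp / (tp + fn), tn / (tn + fp)
```

A predictive model is only useful for fall prevention when both numbers are high, which is why the study searched for the combination of tests that jointly maximises them.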
Abstract:
Internationally, the railway industry is facing a severe shortage of engineers with high-level, relevant professional and technical knowledge and abilities, in particular amongst engineers involved in the design, construction and maintenance of railway infrastructure. A unique graduate-level program has been created to meet that global need via a fully online, distance education format. The development and operation of this Master of Engineering degree is proposed as a model of the industry relevance, flexible delivery, international networking, and professional development required for a successful graduate engineering program in the 21st century. In particular, the paper demonstrates how a mix of new and more familiar technologies is utilised through a variety of tasks to overcome the huge distances and multiple time zones that separate the participants across a growing number of countries, successfully achieving close and sustained interaction amongst the participants and railway experts.
Abstract:
Communications media have been central to globalizing processes in modern societies. As technological forms, communication media have long extended the transmission of messages across space in ways that challenge the socio-cultural dimensions of the nation-state and national cultures, and the global communications infrastructure that has developed rapidly since the 1980s has further promoted global information flows and cross-border commercial activity. As institutional and organisational forms through which information and content are produced and distributed, media corporations have been at the forefront of international expansion of their market reach and the development of new sites of production and distribution, and media industries are highly dynamic on a global scale. Finally, as cultural forms, or providers of the informational and symbolic content that is received and used by consumers/audiences/users, global media constitute a core means through which people make sense of events in distant places, and the information and images that they carry are central to the existence of common systems of meaning and understanding across nations, regions and cultures.
Abstract:
Many fashion businesses in New Zealand have followed a global trend towards inexpensive offshore manufacturing. The transfer of the production of garments to overseas workers has had consequences for the wellbeing of local businesses, fashion designers and garment makers. The gradual decline of fashion manufacturing also appears to have resulted in a local fashion scene where many garments look the same in style, colour, fabric, cut and fit. The excitement of the past, where the majority of fashion designers established their own individuality through the cut and shape of the garments that they produced, may have been inadvertently lost in an effort to take advantage of cost savings achieved through mass production and manufacturing methods which are now largely unavailable in New Zealand. Consequently, a sustainable local fashion and manufacturing industry, with design integrity, seems further out of reach. This paper is focussed upon the thesis that the design and manufacture of a fashion garment, bearing in mind certain economic and practical restrictions at its inception, can contribute to a more sustainable fashion manufacturing industry in New Zealand.
Abstract:
As organizations reach higher levels of business process management maturity, they often find themselves maintaining repositories of hundreds or even thousands of process models, representing valuable knowledge about their operations. Over time, process model repositories tend to accumulate duplicate fragments (also called clones) as new process models are created or extended by copying and merging fragments from other models. This calls for methods to detect clones in process models, so that these clones can be refactored as separate subprocesses in order to improve maintainability. This paper presents an indexing structure to support the fast detection of clones in large process model repositories. The proposed index is based on a novel combination of a method for process model decomposition (specifically the Refined Process Structure Tree), with established graph canonization and string matching techniques. Experiments show that the algorithm scales to repositories with hundreds of models. The experimental results also show that a significant number of non-trivial clones can be found in process model repositories taken from industrial practice.
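The hash-and-bucket idea behind such an index can be sketched as follows: each fragment is reduced to a canonical string, and fragments that share a string are clone candidates. The real index combines Refined Process Structure Tree decomposition with proper graph canonization; the sorted edge list below is a deliberately naive stand-in for that step, and all names are illustrative.

```python
from collections import defaultdict

def canonical_string(fragment_edges):
    """Naive canonical form: the sorted list of labelled edges.
    (True graph canonization is needed when node labels can repeat.)"""
    return ";".join(sorted(f"{a}->{b}" for a, b in fragment_edges))

def find_clones(fragments):
    """fragments: dict mapping fragment id -> list of (source, target) edges.
    Returns groups of fragment ids whose canonical strings coincide."""
    index = defaultdict(list)
    for fid, edges in fragments.items():
        index[canonical_string(edges)].append(fid)  # bucket by canonical form
    return [ids for ids in index.values() if len(ids) > 1]
```

Because each fragment is hashed once and compared only within its bucket, lookup cost stays near-linear in repository size, which is what lets such an index scale to hundreds of models.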
Abstract:
The Blair Witch Project was a low-budget movie made by student filmmakers that became an international box office hit in 1999. Blair Witch was a landmark in movie marketing and distribution because it was the first time that any movie had successfully leveraged the Internet as a marketing platform to reach a wide audience. The marketing team employed a range of innovative strategies and tactics to stimulate audience demand. This case study describes and analyses the success of the marketing launch of The Blair Witch Project. It also provides an instructor's booklet comprising seven questions and answers related to the marketing success of the movie.
Abstract:
Issue addressed: Measures of 'social identity' and 'psychological sense of community' were included within a broader formative research inquiry to gain insight into the identity characteristics and level of connectedness among older recreational road travellers (commonly known as Grey Nomads). The research sought to gain insight into how best to reach or speak to this growing driver cohort.
Method: Participants included 631 older recreational road travellers ranging in age from 50 years to over 80 years. Data were obtained through three scales incorporated into a larger formative research survey: an identity hierarchy, the Three Factor Model of Social Identity and the Sense of Community Index.
Results: Older recreational road travellers see themselves principally as couples, with social group identity being secondary. Although many identified to some degree with the Grey Nomad identity, when asked to self-categorise as either members of the Broad Network of Recreational Vehicle Travellers or as Grey Nomads, the majority categorised themselves as the former. Those identifying as Grey Nomads, however, reported significantly higher levels of 'social identification' and 'sense of community'.
Conclusion: The Grey Nomad identity may not be the best identity at which to target road safety messages for this cohort. Targeting travelling 'couples' may be more efficacious. Using the 'Grey Nomad' identity is likely to reap at least some success, however, given that many identified to some degree with this group identity. Those identifying as Grey Nomads may be more open to community participation or behaviour change given their significantly higher levels of 'social identity' and 'sense of community'.
Abstract:
Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance. Capacity may be predicted as the maximum uncongested flow achievable. Although they are effective tools for design and analysis, macroscopic models lack an understanding of the nature of processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes whilst providing for the assessment of performance, through measures of capacity and delay. However, these models are limited to only a few circumstances. The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to accurately portray the mechanisms of freeway operations at the specific locations under consideration, to be calibrated using data acquired at those locations, and to have their output validated with data acquired at the same sites, so that the outputs are truly descriptive of the performance of the facility. A theoretical basis needed to underlie the form of these models, rather than the empiricism of the macroscopic models currently used. Finally, the models needed to be adaptable to variable operating conditions, so that they may be applied, where possible, to other similar systems and facilities. It was not possible in this single study to produce a stand-alone model applicable to all facilities and locations; however, the scene has been set for the application of the models to a much broader range of operating conditions.
Opportunities for further development of the models were identified, and procedures provided for the calibration and validation of the models to a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour in Brisbane was applied to the models; different mechanisms are likely in other locations due to variability in road rules and driving cultures. Not all manoeuvres evident were modelled: some unusual manoeuvres were considered not worth modelling. However, the models developed contain the principal processes of freeway operations, merging and lane changing. Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream, the kerb lane traffic, exercises only a limited priority over the minor stream, the on-ramp traffic. Theory was established to account for this activity. Kerb lane drivers were also found to change to the median lane where possible, to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major stream flow rate, which excludes lane changers. Cowan's M3 model was calibrated for both streams. On-ramp and total upstream flow are required as input. Relationships between the proportion of headways greater than 1 s and flow differed between on-ramps fed by signalised intersections and those fed by unsignalised intersections. Constant-departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated. Critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited priority capacity and other boundary relationships were established by Troutbeck (1995).
The minimum average minor stream delay and the corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor and major stream delays across all minor and major stream flows. Pseudo-empirical relationships were established to predict average delays. Major stream average delays are limited to 0.5 s, insignificant compared with minor stream delay, which reaches infinity at capacity. Minor stream delays were shown to be less when unsignalised intersections are located upstream of on-ramps than when signalised intersections are, and less still when ramp metering is installed. Smaller delays correspond to improved merge area performance. A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model. Merging probabilities can be predicted for given taper lengths, a most useful performance measure. This model was also shown to be applicable to lane changing. Tolerable limits to merging probabilities require calibration; from these, practical capacities can be estimated. Further calibration is required of the traffic inputs, critical gap and minimum follow-on time, for both merging and lane changing, and a general relationship to predict the proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models to assess performance, and to provide further insight into the nature of operations.
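The gap-acceptance capacity calculations behind this kind of analysis can be illustrated with the standard absolute-priority capacity formula for Cowan's M3 headway model (the limited-priority case studied here adds a correction factor, omitted in this sketch; the parameter values in the test are illustrative, not calibrated results from the study):

```python
import math

def m3_capacity(q_major, alpha, delta, t_c, t_f):
    """Minor-stream (merge) capacity in veh/s under Cowan's M3 headway model,
    using the classic absolute-priority gap-acceptance result.
    q_major: major-stream flow (veh/s); alpha: proportion of free headways;
    delta: minimum headway (s); t_c: critical gap (s); t_f: follow-on time (s)."""
    # Decay rate of the exponential tail of free (bunch-leader) headways.
    lam = alpha * q_major / (1.0 - delta * q_major)
    return (alpha * q_major * math.exp(-lam * (t_c - delta))
            / (1.0 - math.exp(-lam * t_f)))
```

As the kerb-lane flow `q_major` grows, fewer headways exceed the critical gap and the merge capacity falls, which is the qualitative behaviour the delay and merging-probability results above rest on.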