Promoting a more positive traffic safety culture in Australia : lessons learnt and future directions
Abstract:
Adopting a traffic safety culture approach, this paper identifies and discusses the ongoing challenge of promoting the road safety message in Australia. It is widely acknowledged that mass media and public education initiatives have played a critical role in the significant positive changes witnessed in community attitudes to road safety over the last three to four decades. It could be argued that mass media and education have had a direct influence on behaviours and attitudes, as well as an indirect influence through signposting and awareness-raising functions in conjunction with enforcement. Great achievements have been made in reducing fatalities on Australia's roads, a fact that is well understood among the international road safety fraternity. How well these achievements are appreciated by the general Australian community, however, is not clear. This paper explores the lessons that can be learnt from successes in attitudinal and behaviour change in regard to seatbelt use and drink driving in Australia. It also identifies and discusses key challenges associated with achieving further positive changes in community attitudes and behaviours, particularly in relation to behaviours that may not be perceived by the community as dangerous, such as speeding and mobile phone use while driving. Potential strategies for future mass media and public education campaigns to target these challenges are suggested, including ways of harnessing the power of contemporary traffic law enforcement techniques, such as point-to-point speed enforcement and in-vehicle technologies, to help spread the road safety message.
Abstract:
In contemporary game development circles the 'game making jam' has become an important rite of passage and baptism event, an exploration space, and a central indie lifestyle affirmation and community event. Game jams have recently become a focus for design researchers interested in the creative process. In this paper we tell the story of an established local game jam and our various documentation and data collection methods. We present the beginnings of the current project, which seeks to map the creative teams and their process in the space of the challenge, and which aims to enable participants to be more than the objects of the data collection. A perceived issue is that typical documentation approaches are 'about' the event as opposed to 'made by' the participants; they are thus at odds with the spirit of the jam as a phenomenon and do not really access the rich, playful potential of participant experience. In the data collection and visualisation projects described here, we focus on using collected data to re-include the participants in telling stories about their experiences of the event as a place-based experience. Our goal is to find a means to encourage the production of 'anecdata' - data based on individual storytelling that is subjective, malleable, and resists collection via formal mechanisms - and to enable mimesis, or active narrating, on the part of the participants. We present a concept design for data as game based on the logic of early medieval maps, and we reflect on how we could enable participation in the data collection itself.
Abstract:
The 48 hour game making challenge has been running since 2007. In recent years, we have not only been running a 'game jam' for the local community but have also been exploring the way in which the event itself, and the place of the event, has the potential to create its own stories. Game jams are the creative festivals of the game development community, and a game jam is very much an event or performance; its stories are those of subjective experience. Participants return year after year and recount personal stories from previous challenges; arrival in the 48hr location typically inspires instances of individual memory and narration more in keeping with those of a music festival or an oft-frequented holiday destination. Since its inception, the 48hr has been heavily documented, from the photo-blogging of our first jam and the twitter streams of more recent events to more formal interviews and documentaries (see Anderson, 2012). We have even had our own moments of Gonzo journalism, with an on-site press room one year and an 'embedded' journalist another year (Keogh, 2011). In the last two years of the 48hr we have started to explore ways and means to collect more abstract data during the event, that is, empirical data about movement and activity. The intent behind this form of data collection was to explore graphic and computer-generated visualisations of the event, not for the purpose of formal analysis but in the service of further storytelling. [Excerpt from truna aka j.turner, Thomas & Owen, 2013] See: truna aka j.turner, Thomas & Owen (2013) Living the indie life: mapping creative teams in a 48 hour game jam and playing with data, Proceedings of the 9th Australasian Conference on Interactive Entertainment, IE'2013, September 30 - October 01 2013, Melbourne, VIC, Australia
Abstract:
GO423 was initiated in 2012 as part of a community effort to ensure the vitality of the Queensland Games Sector. In common with other industrialised nations, the game industry in Australia is a reasonably significant contributor to Gross National Product (GNP). Games are played in 92% of Australian homes, and the average adult player has been playing them for at least twelve years, with 26% playing for more than thirty years (Brand, 2011). Like the games and interactive entertainment industries in other countries, the Australian industry has its roots in the small team model of the 1980s. So, for example, Beam Software, which was established in Melbourne in 1980, was started by two people, and Krome Studios was started in 1999 by three. Both these companies grew to employ over 100 people in their heydays (considered large by Antipodean standards), not by producing their own intellectual property (IP) but by content generation for offshore parent companies. Thus our bigger companies grew on a model of service provision and tended not to generate their own IP (Darchen, 2012). There are some notable exceptions where IP has originated locally and been acquired by international companies, but in the case of some of the works of which we are most proud, the Australian company took on the role of "Night Elf" – a convenience due to affordances of the time zone, which allowed our companies to work while the parent companies slept in a different time zone. In the post-GFC climate, the strong Australian dollar and the vulnerability of such service provision mean that job security is virtually non-existent, with employees invariably being on short-term contracts. These issues are exacerbated by the decline of middle-ground games (those which fall between the triple-A titles and the smaller games often produced for a casual audience). The response to this state of affairs has been a change in the Australian games industry towards new recognition of its identity as a wider cultural sector and the rise (or return) of an increasing number of small independent game development companies. 'Indies' consist of small teams, often making games for mobile and casual platforms, that depend on producing at least one if not two games a year and often explore more radical definitions of games as designed cultural objects. The need for innovation and creativity in the Australian context is seen as a vital aspect of the current changing scene, where we see the emphasis on the large studio production model give way to an emerging cultural sector model in which small independent teams are engaged in shorter design and production schedules driven by digital distribution. In terms of Quality of Life (QoL), this new digital distribution brings with it the danger of 'digital isolation' - a studio can work from home and deliver from home. Community events thus become increasingly important. The GO423 Symposium is a response to these perceived needs, and the event is based on the understanding that our new small creative teams depend on the local community of practice in no small way. GO423 thus offers local industry participants the opportunity to talk to each other about their work, to talk to potential new members about their work, and to show off their work in a small, intimate setting, encouraging both feedback and support.
Abstract:
With unpredictable workloads and a need for a multitude of specialized skills, many main contractors rely heavily on subcontracting to reduce their risks (Bresnen et al., 1985; Beardsworth et al., 1988). This is especially the case in Hong Kong, where the average direct labour content accounts for only around 1% of the total contract sum (Lai, 1987). Extensive use of subcontracting is also reported in many other countries, including the UK (Gray and Flanagan, 1989) and Japan (Bennett et al., 1987). In addition, depending upon the scale and complexity of the works, it is not uncommon for subcontractors to further sublet their works to lower-tier subcontractors. Richter and Mitchell (1982) argued that main contractors can obtain a higher profit margin by reducing their performance costs through subcontracting work to those who have the necessary resources to perform the work more efficiently and economically. Subcontracting is also used strategically to allow firms to employ a minimum workforce under fluctuating demand (Usdiken and Sözen, 1985). Through subcontracting, the risks of main contractors are also reduced, as errors in estimating or additional costs caused by delays or extra labour requirements can be absorbed by the subcontractors involved (Woon and Ofori, 2000). Despite these benefits, the quality of work can suffer when incapable or inexperienced subcontractors are employed. Additional problems also exist in the form of bid shopping, unclear accountability, and high fragmentation (Palaneeswaran et al., 2002). A recent report produced by the Hong Kong Construction Industry Review Committee (CIRC) points to the development of a framework to help distinguish between capable and incapable subcontractors (Tang, 2001). This paper describes research aimed at identifying and prioritising criteria for use in such a framework.
Abstract:
Digital Human Models (DHM) have been used for over 25 years. They have evolved from simple drawing templates, which are nowadays still used in architecture, to complex, Computer Aided Engineering (CAE) integrated design and analysis tools for various ergonomic tasks. DHM are most frequently used for applications in product design and production planning, with many successful implementations documented. DHM from other domains, for example computer user interfaces, artificial intelligence, training and education, or the entertainment industry, show that there is also an ongoing development towards a comprehensive understanding and holistic modeling of human behavior. While the development of DHM for the game sector has seen significant progress in recent years, advances of DHM in the area of ergonomics have been comparatively modest. As a consequence, we need to question whether current DHM systems are fit for the design of future mobile work systems. So far it appears that DHM in ergonomics are rather limited to some traditional applications. According to Dul et al. (2012), future characteristics of Human Factors and Ergonomics (HFE) can be assigned to six main trends: (1) global change of work systems; (2) cultural diversity; (3) ageing; (4) information and communication technology (ICT); (5) enhanced competitiveness and the need for innovation; and (6) sustainability and corporate social responsibility. Based on a literature review, we systematically investigate the capabilities of current ergonomic DHM systems versus the 'Future of Ergonomics' requirements. It is found that DHMs already provide broad functionality in support of trends (1) and (2), and more limited options in regard to trend (3). Today's DHM provide access to a broad range of national and international databases for correct differentiation and characterization of anthropometry for global populations. Some DHM explicitly address social and cultural modeling of groups of people. In comparison, the trends of the growing importance of ICT (4), the need for innovation (5) and sustainability (6) are addressed primarily from a hardware-oriented and engineering perspective and are not reflected in DHM. This reflects a persistent separation between hardware design (engineering) and software design (information technology) in the view of DHM – a disconnection which needs to be urgently overcome in the era of software-defined user interfaces and mobile devices. The design of a mobile ICT device is discussed to exemplify the need for a comprehensive future DHM solution. Designing such mobile devices requires an approach that includes organizational aspects as well as technical and cognitive ergonomics. Multiple interrelationships between the different aspects result in a challenging setting for future DHM. In conclusion, the 'Future of Ergonomics' poses particular challenges for DHM in regard to the design of mobile work systems and, moreover, mobile information access.
Abstract:
The purpose of this study is to determine visual performance in water, including the influence of pupil size. The water environment was simulated by placing a goggle filled with saline in front of the eyes, with apertures placed at the front of the goggle. Correction factors were determined for the different magnification under this condition in order to estimate vision in water. Experiments were conducted on letter visual acuity (7 participants), grating resolution (8 participants), and grating contrast sensitivity (1 participant). For letter acuity, the mean loss in vision in water, compared to corrected vision in air, varied between 1.1 log minutes of arc resolution (logMAR) for a 1 mm aperture and 2.2 logMAR for a 7 mm aperture. Vision in minutes of arc was described well by a linear relationship with pupil size. For grating acuity, the mean loss varied between 1.1 logMAR for a 2 mm aperture and 1.2 logMAR for a 6 mm aperture. Contrast sensitivity for a 2 mm aperture deteriorated as spatial frequency increased, with a 2 log unit loss by 3 cycles/degree. Superimposed on this deterioration were depressions (notches) in sensitivity, with the first three notches occurring at 0.45, 0.8 and 1.3 cycles/degree, with estimates for water of 0.39, 0.70 and 1.13 cycles/degree. In conclusion, vision in water is poor. It becomes worse as pupil size increases, but the effects are much more marked for letter targets than for grating targets.
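To make the reported logMAR losses easier to interpret, the short sketch below converts them to minimum angles of resolution. The 0.0 logMAR (1 arcmin) baseline for corrected vision in air is an illustrative assumption, not a value reported in the study.

```python
# Illustrative only: converts the reported logMAR losses into minimum angles of
# resolution (MAR, in minutes of arc). The 0.0 logMAR baseline is an assumption.

def mar_arcmin(logmar: float) -> float:
    """logMAR is the base-10 log of the minimum angle of resolution in arcmin."""
    return 10 ** logmar

baseline_logmar = 0.0  # assumed corrected acuity in air (1 arcmin)
for aperture_mm, loss_logmar in [(1, 1.1), (7, 2.2)]:
    water_logmar = baseline_logmar + loss_logmar
    print(f"{aperture_mm} mm aperture: ~{mar_arcmin(water_logmar):.0f} arcmin in water, "
          f"i.e. {mar_arcmin(loss_logmar):.1f}x coarser than in air")
```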
Abstract:
Determining what consequences are likely to serve as effective punishment for any given behaviour is a complex task. This chapter focuses specifically on illegal road user behaviours and the mechanisms used to punish and deter them. Traffic law enforcement has traditionally used the threat and/or receipt of legal sanctions and penalties to deter illegal and risky behaviours. This process represents the use of positive punishment, one of the key behaviour modification mechanisms. Behaviour modification principles describe four types of reinforcers: positive and negative punishments and positive and negative reinforcements. The terms 'positive' and 'negative' are not used in an evaluative sense here. Rather, they represent the presence (positive) or absence (negative) of stimuli to promote behaviour change. Punishments aim to inhibit behaviour and reinforcements aim to encourage it. This chapter describes a variety of punishments and reinforcements that have been and could be used to modify illegal road user behaviours. In doing so, it draws on several theoretical perspectives that have defined behavioural reinforcement and punishment in different ways. Historically, the main theoretical approach used to deter risky road use has been classical deterrence theory, which has focussed on the perceived certainty, severity and swiftness of penalties. Stafford and Warr (1993) extended the traditional deterrence principles to include the positive reinforcement concept of punishment avoidance. Evidence of the association between punishment avoidance experiences and behaviour has been established for a number of risky road user behaviours, including drink driving, unlicensed driving, and speeding. We chose a novel way of assessing punishment avoidance by specifying two sub-constructs (detection evasion and punishment evasion). Another theorist, Akers (1977), described the idea of competing reinforcers, termed differential reinforcement, within social learning theory. Differential reinforcement describes a balance of reinforcements and punishments as influential on behaviour. This chapter describes a comprehensive way of conceptualising a broad range of reinforcement and punishment concepts, consistent with Akers' differential reinforcement concept, within a behaviour modification framework that incorporates deterrence principles. The efficacy of three theoretical perspectives to explain self-reported speeding among a sample of 833 Australian car drivers was examined. Results demonstrated that a broad range of variables predicted speeding, including personal experiences of evading detection and punishment for speeding, intrinsic sensations, practical benefits expected from speeding, and an absence of punishing effects from being caught. Not surprisingly, being younger was also significantly related to more frequent speeding, although in a regression analysis gender did not retain a significant influence once all punishment and reinforcement variables were entered. The implications for speed management, as well as road user behaviour modification more generally, are discussed in light of these findings. Overall, the findings reported in this chapter suggest that a more comprehensive approach is required to manage the behaviour of road users, one which does not rely solely on traditional legal penalties and sanctions.
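As a rough illustration of the kind of analysis described (a regression of self-reported speeding on punishment and reinforcement variables plus demographics), the sketch below fits an ordinary least squares model to synthetic data. The column names and data are invented for illustration and do not reproduce the chapter's survey items, coding, or results.

```python
# Hypothetical sketch, not the authors' code or data: illustrates how a
# differential-reinforcement model of self-reported speeding might be fitted.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 833  # sample size reported in the chapter
df = pd.DataFrame({
    "speeding_freq": rng.integers(1, 6, n),     # self-reported speeding frequency (toy)
    "detection_evasion": rng.normal(0, 1, n),   # punishment-avoidance sub-construct
    "punishment_evasion": rng.normal(0, 1, n),  # punishment-avoidance sub-construct
    "intrinsic_reward": rng.normal(0, 1, n),    # e.g. enjoyment of speed
    "practical_benefit": rng.normal(0, 1, n),   # e.g. expected time savings
    "punishment_effect": rng.normal(0, 1, n),   # perceived effect of being caught
    "age": rng.integers(17, 80, n),
    "male": rng.integers(0, 2, n),
})

# Enter demographic, punishment and reinforcement variables together
model = smf.ols(
    "speeding_freq ~ detection_evasion + punishment_evasion + intrinsic_reward"
    " + practical_benefit + punishment_effect + age + male",
    data=df,
).fit()
print(model.summary())
```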
Abstract:
Growing up, my family worshipped at the altar of unionism. My parents embraced ‘working class’ as an active social position not as a step on the aspirational treadmill. In those days and in the areas where I lived, it was nothing special. It was a given that everyone was in a union and voted Labor, manning factories and building sites and marching or striking when the need arose...
Abstract:
The aims of this phase I study were to establish the maximum tolerated dose, safety profile and activity of liposomal daunorubicin, DaunoXome (NeXstar Pharmaceuticals), in the treatment of metastatic breast cancer. DaunoXome was administered intravenously over 2 h in 21-day cycles and doses were increased from 80 to 100, 120 and 150 mg/m2. Sixteen patients were enrolled. A total of 70 cycles of DaunoXome were administered. The maximum tolerated dose was 120 mg/m2, the dose-limiting toxicity being prolonged grade 4 neutropenia or neutropenic pyrexia necessitating dose reductions at 120 and 150 mg/m2. Asymptomatic cardiotoxicity was observed in three patients: grade 1 in one treated with a cumulative dose of 800 mg/m2 and grade 2 in two, one who received a cumulative dose of 960 mg/m2 and the other a cumulative dose of 600 mg/m2 with a previous neoadjuvant doxorubicin chemotherapy of 300 mg/m2. Tumour response was evaluable in 15 patients, of whom two had objective responses, six had stable disease and seven had progressive disease. In conclusion, DaunoXome is associated with mild, manageable toxicities and has anti-tumour activity in metastatic breast cancer. The findings support further phase II evaluation of DaunoXome alone and in combination with other standard non-anthracycline cytotoxic or novel targeted agents. Although the dose-limiting toxicity for DaunoXome was febrile neutropenia at 120 mg/m2, we would recommend this dose for further evaluation, as the febrile neutropenia occurred after four or more cycles in three of the four episodes seen, was short lived and uncomplicated. © 2002 Cancer Research UK.
Abstract:
OBJECTIVE: We present and analyze long-term outcomes following multimodal therapy for esophageal cancer, in particular the relative impact of histomorphologic tumor regression and nodal status. PATIENTS AND METHODS: A total of 243 patients [adenocarcinoma (n = 170) and squamous cell carcinoma (n = 73)] treated with neoadjuvant chemoradiotherapy in the period 1990 to 2004 were followed prospectively with a median follow-up of 60 months. Pathologic stage and tumor regression grade (TRG) were documented, the site of first failure was recorded, and Kaplan-Meier survival curves were plotted. RESULTS: Thirty patients (12%) did not undergo surgery due to disease progression or deteriorated performance status. Forty-one patients (19%) had a complete pathologic response (pCR), and there were 31 (15%) stage I, 69 (32%) stage II, and 72 (34%) stage III cases. The overall median survival was 18 months, and the 5-year survival was 27%. The 5-year survival of patients achieving a pCR was 50% compared with 37% in non-pCR patients who were node-negative (P = 0.86). Histomorphologic tumor regression was not associated with pre-CRT cTN stage but was significantly (P < 0.05) associated with ypN stage. By multivariate analysis, ypN status (P = 0.002) was more predictive of overall survival than TRG (P = 0.06) or ypT stage (P = 0.39). CONCLUSION: Achieving a node-negative status is the major determinant of outcome following neoadjuvant chemoradiotherapy. Histomorphologic tumor regression is less predictive of outcome than pathologic nodal status (ypN), and the need to include a primary site regression score in a new staging classification is unclear. © 2007 Lippincott Williams & Wilkins, Inc.
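For readers unfamiliar with the methods named in the abstract, the sketch below shows how Kaplan-Meier curves and a multivariate Cox model could be used to compare ypN status, tumour regression grade and ypT stage as predictors of overall survival. It uses the lifelines library with invented column names and toy data; it is not the study's analysis code.

```python
# Hypothetical sketch: survival analysis of the kind reported in the abstract.
# Column names and the random data are invented for illustration only.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(1)
n = 243  # cohort size reported in the abstract
df = pd.DataFrame({
    "months": rng.exponential(30, n),       # follow-up time (toy)
    "death": rng.integers(0, 2, n),         # event indicator (toy)
    "ypN_positive": rng.integers(0, 2, n),  # nodal status after chemoradiotherapy
    "trg": rng.integers(1, 6, n),           # tumour regression grade
    "ypT": rng.integers(0, 5, n),           # pathologic T stage after chemoradiotherapy
})

# Kaplan-Meier estimates stratified by nodal status
kmf = KaplanMeierFitter()
for label, grp in df.groupby("ypN_positive"):
    kmf.fit(grp["months"], grp["death"], label=f"ypN positive = {label}")
    print(label, kmf.median_survival_time_)

# Multivariate Cox regression comparing the three candidate predictors
cph = CoxPHFitter()
cph.fit(df[["months", "death", "ypN_positive", "trg", "ypT"]],
        duration_col="months", event_col="death")
cph.print_summary()
```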
Abstract:
Background: Charcot Neuro-Arthropathy (CN) is one of the more devastating complications of diabetes. To the best of the authors' knowledge, no clinical tools based on a systematic review of existing literature have been developed to manage acute CN. Thus, the aim of this paper was to systematically review existing literature and develop an evidence-based clinical pathway for the assessment, diagnosis and management of acute CN in patients with diabetes. Methods: Electronic databases (Medline, PubMed, CINAHL, Embase and Cochrane Library), reference lists, and relevant key websites were systematically searched for literature discussing the assessment, diagnosis and/or management of acute CN published between 2002 and 2012. At least two independent investigators then quality rated and graded the evidence of each included paper. Consistent recommendations emanating from the included papers were then fashioned into a clinical pathway. Results: The systematic search identified 267 manuscripts, of which 117 (44%) met the inclusion criteria for this study. Most manuscripts discussing the assessment, diagnosis and/or management of acute CN constituted level IV (case series) or EO (expert opinion) evidence. The included literature was used to develop an evidence-based clinical pathway for the assessment, investigations, diagnosis and management of acute CN. Conclusions: This research has assisted in developing a comprehensive, evidence-based clinical pathway to promote consistent and optimal practice in the assessment, diagnosis and management of acute CN. The pathway aims to support health professionals in making an early diagnosis and providing appropriate immediate management of acute CN, ultimately reducing its associated complications, such as amputations and hospitalisations.
Abstract:
"White Australia has always had a view on what makes a 'real' Aboriginal person. Andrew Bolt is the merely the latest in a long line of commentators who have put forward their views about 'black' and 'white' Aboriginals. Spread across a continent after 200 years of colonisation, Aboriginal people are diverse in a way that is at odds with media stereotypes of 'traditional' Aboriginal people living in troubled remote communities. At a crucial time for recognition and reconciliation, does 'white' or 'black' matter? Who speaks for Aboriginal people and defines who they are?"--Festivals, Talks & Ideas website
Abstract:
Television’s 50th anniversary marks half a century of extraordinary technological development. This begs the question: is the best we can expect for the next 50 years just Higher Definition pictures of the same old crap?
Abstract:
We provide a taxonomic redescription of the Fawn Antechinus, Antechinus bellus (Thomas). A. bellus is the only member of its genus to occur in Australia's Northern Territory, where it can be found in savannah woodlands of the Top End. It is perhaps the most distinctive antechinus, and clearly distinguishable from the other 10 extant species of antechinus found in Australia: externally, A. bellus has pale body fur, white feet and large ears; A. bellus skulls have large auditory bullae and narrow interorbital width, while broadening abruptly at the molar row; mitochondrial and nuclear genes clearly distinguish A. bellus from all congeners, phylogenetically positioning the Fawn Antechinus as sister to Queensland's A. leo Van Dyck, 1980, with which it shares a curled supratragus of the external ear and a similar tropical latitudinal range.