328 results for Investor attention
Abstract:
The nature of the relationship that is negotiated, developed and maintained between a clinical supervisor and supervisee is central to engaging effectively in clinical work, to promoting professional and personal development, and to ensuring consistent ethical practice. In this chapter, attention is given to the challenges, importance and benefits of the supervisory relationship. The ability to form and sustain relationships in supervision and in clinical practice is more crucial than specific knowledge and therapeutic skills (Dye, 2004). Parallel process, the working alliance, multiple roles, expectations and acculturative issues are addressed. The chapter is an introduction to some of the most salient issues concerning the supervisory relationship and a review of concepts and processes discussed in greater depth throughout this textbook. The reader is encouraged to utilise the references and suggested readings to deepen their understanding of the supervisory relationship.
Abstract:
Norman K. Denzin (1989) claims that the central assumption of the biographical method—that a life can be captured and represented in a text—is open to question. This paper explores Denzin's statement by documenting the role of creative writers in re-presenting oral histories in two case studies from Queensland, Australia. The first, The Queensland Business Leaders Hall of Fame, was a commercial research project commissioned by the State Library of Queensland (SLQ) in 2009 and involved semi-formal qualitative interviews and digital stories. The second is an ongoing practice-led PhD project, The Artful Life: Oral History and Fiction, which investigates the fictionalisation of oral histories. Both projects enter into a dialogue around the re-presentation of oral and life histories, with attention given to both critical scholarship and creative practice in the process. Creative writers represent a life with particular preoccupations and techniques that align more closely with fiction than with non-fiction (Hirsch and Dixon 2008). In this context, oral history resources are viewed not so much as repositories of historical facts but as ambiguous and fluid narrative sources. The comparison of the two case studies also demonstrates that the aims of a particular project dictate the nature of the re-presentation, revealing that writing about another's life is a complex act of artful 'shaping'. Alistair Thomson (2007) notes the growing interdisciplinary nature of oral history scholarship since the 1980s; oral histories are increasingly used in art-based contexts to produce diverse cultural artefacts, such as digital stories and works of fiction, which are very different from traditional histories. What are the methodological implications of such projects? This paper draws on self-reflexive practice to explore this question.
Abstract:
My research investigates why nouns are learned disproportionately more frequently than other kinds of words during early language acquisition (Gentner, 1982; Gleitman et al., 2004). This question must be considered in the context of cognitive development in general. Infants have two major streams of environmental information to make meaningful: perceptual and linguistic. Perceptual information flows in from the senses and is processed into symbolic representations by the primitive language of thought (Fodor, 1975). These symbolic representations are then linked to linguistic input to enable language comprehension and, ultimately, production. Yet how exactly does perceptual information become conceptualized? Although this question is difficult, there has been progress. One way that children might have an easier job is if they have structures that simplify the data. Thus, if particular sorts of perceptual information could be separated from the mass of input, it would be easier for children to refer to those specific things when learning words (Spelke, 1990; Pylyshyn, 2003). It would be easier still if linguistic input were segmented in predictable ways (Gentner, 1982; Gleitman et al., 2004). Unfortunately, the frequency of patterns in lexical or grammatical input cannot explain the cross-cultural and cross-linguistic tendency to favor nouns over verbs and predicates. There are three examples of this failure: 1) a wide variety of nouns are uttered less frequently than a smaller number of verbs and yet are learnt far more easily (Gentner, 1982); 2) word order and morphological transparency offer no insight when the sentence structures and word inflections of different languages are contrasted (Slobin, 1973); and 3) particular language-teaching behaviors (e.g. pointing at objects and repeating names for them) have little impact on children's tendency to prefer concrete nouns in their first fifty words (Newport et al., 1977). Although the linguistic solution appears problematic, there has been increasing evidence that the early visual system does indeed segment perceptual information in specific ways before the conscious mind begins to intervene (Pylyshyn, 2003). I argue that nouns are easier to learn because their referents directly connect with innate features of the perceptual faculty. This hypothesis stems from work done on visual indexes by Zenon Pylyshyn (2001, 2003). Pylyshyn argues that the early visual system (the architecture of the "vision module") segments perceptual data into pre-conceptual proto-objects called FINSTs. FINSTs typically correspond to physical things such as Spelke objects (Spelke, 1990). Hence, before conceptualization, visual objects are picked out by the perceptual system demonstratively, like a pointing finger indicating 'this' or 'that'. I suggest that this primitive system of demonstration elaborates on Gareth Evans's (1982) theory of nonconceptual content. Nouns are learnt first because their referents attract demonstrative visual indexes. This theory also explains why infants less often name stationary objects, such as 'plate' or 'table', but do name things that attract the focal attention of the early visual system, i.e. small objects that move, such as 'dog' or 'ball'. This view leaves open the questions of how blind children learn words for visible objects and why children learn category nouns (e.g. 'dog') rather than proper nouns (e.g. 'Fido') or higher taxonomic distinctions (e.g. 'animal').
Abstract:
In recent years the development and use of crash prediction models for roadway safety analyses have received substantial attention. These models, also known as safety performance functions (SPFs), relate the expected crash frequency of roadway elements (intersections, road segments, on-ramps) to traffic volumes and other geometric and operational characteristics. A commonly practiced approach for applying intersection SPFs is to assume that crash types occur in fixed proportions (e.g., rear-end crashes make up 20% of crashes, angle crashes 35%, and so forth) and then apply these fixed proportions to crash totals to estimate crash frequencies by type. As demonstrated in this paper, such a practice makes questionable assumptions and results in considerable error in estimating crash proportions. Through the use of rudimentary SPFs based solely on the annual average daily traffic (AADT) of major and minor roads, the homogeneity-in-proportions assumption is shown not to hold across AADT, because crash proportions vary as a function of both major and minor road AADT. For example, with minor road AADT of 400 vehicles per day, the proportion of intersecting-direction crashes decreases from about 50% with 2,000 major road AADT to about 15% with 82,000 AADT. Same-direction crashes increase from about 15% to 55% for the same comparison. The homogeneity-in-proportions assumption should be abandoned, and crash type models should be used to predict crash frequency by crash type. SPFs that use additional geometric variables would only exacerbate the problem quantified here. Comparison of models for different crash types using additional geometric variables remains the subject of future research.
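The homogeneity-in-proportions problem can be illustrated with a small numerical sketch. The functional form below (expected crashes = exp(b0) * AADT_major^b1 * AADT_minor^b2) is a common SPF specification; the coefficients are hypothetical, chosen only to reproduce the qualitative trend reported above, and are not taken from the paper.

```python
import numpy as np

def spf(aadt_major, aadt_minor, b0, b1, b2):
    """Generic intersection SPF: expected annual crashes
    = exp(b0) * AADT_major**b1 * AADT_minor**b2."""
    return np.exp(b0) * aadt_major**b1 * aadt_minor**b2

# Hypothetical type-specific coefficients (illustrative only):
# same-direction crashes are made to grow faster with major-road AADT
# than intersecting-direction crashes do.
types = {
    "intersecting": (-8.0, 0.45, 0.60),
    "same_dir":     (-12.0, 1.10, 0.20),
    "other":        (-8.0, 0.55, 0.30),
}

aadt_minor = 400
for aadt_major in (2_000, 82_000):
    freqs = {t: spf(aadt_major, aadt_minor, *p) for t, p in types.items()}
    total = sum(freqs.values())
    props = {t: round(f / total, 2) for t, f in freqs.items()}
    print(aadt_major, props)
# The predicted proportions shift substantially between low and high
# major-road AADT, so applying one fixed proportion (e.g. 35% angle
# crashes) across all volumes misestimates crash frequency by type.
```

Running the sketch shows the intersecting-direction share falling and the same-direction share rising as major-road AADT grows, which is the pattern the paper quantifies with real data.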
Abstract:
The costs of work-related crashes
In Australia and overseas, fleet safety, or work-related road safety, is an issue gaining increased attention from researchers, organisations, road safety practitioners and the general community. This attention is primarily a response to the substantial physical, emotional and economic costs associated with work-related road crashes. The increased risk factors and subsequent costs of work-related driving are also now well documented in the literature. For example, research has demonstrated that work-related drivers on average report a higher level of crash involvement than personal car drivers (Downs et al., 1999; Kweon and Kockelman, 2003), and within Australia in particular, road crashes are the most common form of work-related fatality (Haworth et al., 2000).
Abstract:
The concept of asset management is not a new idea but an evolving one that has been attracting the attention of many organisations operating and/or owning some kind of infrastructure assets. The term asset management has been used widely, with fundamental differences in interpretation and usage. Regardless of the context of its usage, asset management implies the process of optimising return by scrutinising performance and making key strategic decisions throughout all phases of an asset's lifecycle (Sarfi and Tao, 2004). Hence, asset management is a philosophy and discipline through which organisations are enabled to deploy their resources more effectively, providing higher levels of customer service and reliability while balancing financial objectives. In Australia, asset management made its way into public works in 1993, when the Australian Accounting Standards Board issued Australian Accounting Standard 27 (AAS27). AAS27 required government agencies to capitalise and depreciate assets rather than expense them against earnings. This development indirectly forced organisations managing infrastructure assets to consider the useful life and cost effectiveness of asset investments. The Australian State Treasuries and the Australian National Audit Office were the first organisations to formalise the concepts and principles of asset management in Australia, defining asset management as "a systematic, structured process covering the whole life of an asset" (Australian National Audit Office, 1996). This initiative led other government bodies and industry sectors to develop, refine and apply the concept of asset management in the management of their respective infrastructure assets. Hence, it can be argued that asset management emerged as a separate and recognised field of management during the late 1990s. In comparison to other disciplines such as construction, facilities, maintenance, project management, economics and finance, to name a few, asset management is a relatively new discipline and clearly a contemporary topic. The primary contributors to the literature in asset management are largely government organisations and industry practitioners, and these contributions take the form of guidelines and reports on best practice in asset management. More recently, some of these best practices have been developed into standards, such as PAS 55 (IAM, 2004; IAM, 2008b) in the UK. As such, current literature in this field tends to lack well-grounded theories. To date, while the field has received relatively more interest and attention from empirical researchers, its advancement, particularly in terms of the volume of academic and theoretical development, is at best moderate. A plausible reason for this lack of advancement is that many researchers and practitioners are still unaware of, or unimpressed by, the contribution that asset management can make to the performance of infrastructure assets. This paper seeks to explore the practices of organisations that manage infrastructure assets in order to develop a framework of strategic infrastructure asset management processes. It begins by examining the development of asset management. This is followed by a discussion of the method adopted for this paper and then of the results from the case studies. It first describes the goals of infrastructure asset management and how they can support the broader business goals.
Following this, a set of core processes that can support the achievement of business goals is provided. These core processes are synthesised from the practices of asset managers in the case study organisations.
Abstract:
Little is known about cancer survivors' experiences with and preferences for exercise programmes offered during rehabilitation (immediately after cancer treatment). This study documented colorectal cancer survivors' experiences in an exercise rehabilitation programme and their preferences for programme content and delivery. At the completion of 12 weeks of supervised exercise, 10 participants took part in one-on-one semi-structured interviews. Data from these interviews were coded, and themes were identified using qualitative software. Key findings were that most participants experienced improvements in treatment symptoms, including reduced fatigue and increased energy and confidence to do activities of daily living. They also reported that interactions with the exercise trainer and flexible programme delivery were important aspects of the intervention. Most participants reported that they preferred having a choice of exercise, starting to exercise within a month after completing treatment, having supervision and maintaining a one-on-one format. Frustrations included scheduling conflicts and the lack of a transition out of the programme. The findings indicate that colorectal cancer survivors experience benefits from exercise offered immediately after treatment and prefer individual attention from exercise staff. They further indicate directions for the implementation of future exercise programmes with this population.
Abstract:
This paper explores the idea of virtual participation through the historical example of the republic of letters in early modern Europe (circa 1500-1800). By reflecting on the construction of virtuality in a historical context, and more specifically in a pre-digital environment, this paper calls attention to accusations of technological determinism in ongoing research concerning the affordances of the Internet and related media of communication. It argues that ‘the virtual’ is not synonymous with ‘the digital’ and suggests that, in order to articulate what is novel about modern technologies, we must first understand the social interactions underpinning the relationships which are facilitated through those technologies. By analysing the construction of virtuality in a pre-digital environment, this paper thus offers a baseline from which scholars might consider what is different about the modes of interaction and communication being engaged in via modern media.
Abstract:
Thomas Young (1773-1829) carried out major pioneering work in many different subjects. In 1800 he gave the Bakerian Lecture of the Royal Society on the topic of the “mechanism of the eye”: this was published in the following year (Young, 1801). Young used his own design of optometer to measure refraction and accommodation, and discovered his own astigmatism. He considered the different possible origins of accommodation and confirmed that it was due to change in shape of the lens rather than to change in shape of the cornea or an increase in axial length. However, the paper also dealt with many other aspects of visual and ophthalmic optics, such as biometric parameters, peripheral refraction, longitudinal chromatic aberration, depth-of-focus and instrument myopia. These aspects of the paper have previously received little attention. We now give detailed consideration to these and other less-familiar features of Young’s work and conclude that his studies remain relevant to many of the topics which currently engage visual scientists.
Abstract:
Since its debut in 2001, Wikipedia has attracted the attention of many researchers in different fields. In recent years, researchers in the area of ontology learning have realised the huge potential of Wikipedia as a source of semi-structured knowledge, and several systems have used it as their main source of knowledge. However, the techniques used to extract semantic information vary greatly, as do the resulting ontologies. This paper introduces a framework for comparing ontology learning systems that use Wikipedia as their main source of knowledge. Six prominent systems are compared and contrasted using the framework.
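As a toy illustration of the kind of semi-structured content such systems mine (not a reconstruction of any particular system's method), the sketch below pulls attribute-value triples from infobox wikitext; the parsing is deliberately naive and the sample article is hypothetical.

```python
import re

def infobox_triples(title: str, wikitext: str):
    """Extract (subject, attribute, value) triples from a raw
    infobox block -- the sort of semi-structured Wikipedia content
    that ontology learning systems commonly exploit."""
    triples = []
    # Match "| key = value" lines inside the infobox markup.
    for key, value in re.findall(r"\|\s*(\w+)\s*=\s*([^\n|]+)", wikitext):
        # Strip [[wikilink]] brackets, keeping the link target.
        value = re.sub(r"\[\[([^\]|]+)(?:\|[^\]]+)?\]\]", r"\1", value).strip()
        if value:
            triples.append((title, key, value))
    return triples

sample = """{{Infobox settlement
| name = Brisbane
| country = [[Australia]]
| population_total = 2560720
}}"""
print(infobox_triples("Brisbane", sample))
# [('Brisbane', 'name', 'Brisbane'), ('Brisbane', 'country', 'Australia'),
#  ('Brisbane', 'population_total', '2560720')]
```

Real systems differ precisely in how far beyond such shallow extraction they go (category networks, article text, link structure), which is what the comparison framework is designed to surface.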
Abstract:
Background: Sun exposure is the main source of vitamin D. Increasing scientific and media attention to the potential health benefits of sun exposure may lead to changes in sun exposure behaviors. Methods: To provide data that might help frame public health messages, we conducted an online survey among office workers in Brisbane, Australia, to determine knowledge and attitudes about vitamin D and the associations of these with sun protection practices. Of the 4,709 people invited to participate, 2,867 (61%) completed the questionnaire. This analysis included the 1,971 (69%) participants who indicated that they had heard about vitamin D. Results: Lack of knowledge about vitamin D was apparent. Eighteen percent of people were unaware of the bone benefits of vitamin D, but 40% listed currently unconfirmed benefits. Over half of the participants indicated that more than 10 minutes in the sun was needed to attain enough vitamin D in summer, and 28% indicated more than 20 minutes in winter; these beliefs were significantly associated with increased time outdoors and decreased sunscreen use. People believing sun protection might cause vitamin D deficiency (11%) were less likely to be frequent sunscreen users (summer odds ratio, 0.63; 95% confidence interval, 0.52-0.75). Conclusions: Our findings suggest that there is some confusion about sun exposure and vitamin D, and that this may result in reduced sun-protective behavior. Impact: More information is needed about vitamin D production in the skin. In the interim, education campaigns need to specifically address the vitamin D issue to ensure that skin cancer incidence does not increase.
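For readers unfamiliar with the headline statistic, the sketch below shows how an odds ratio and Wald confidence interval are computed from a 2x2 table. The counts are hypothetical, chosen only to land near the reported point estimate; the study's own estimate would typically also reflect adjustment for covariates.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% Wald CI from a 2x2 table:
    a, b = frequent / infrequent sunscreen users among those who
    believe sun protection causes vitamin D deficiency;
    c, d = the same among those who do not (hypothetical counts)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Hypothetical counts, not the study's data:
print(odds_ratio_ci(60, 80, 700, 590))  # -> OR ~0.63 with its crude CI
```

An odds ratio below 1 here means believers were less likely to use sunscreen frequently, which is the direction of the association the paper reports.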
Abstract:
Over the years, public health in relation to Australian Aboriginal people has involved many individuals and groups, including health professionals, governments, politicians, special interest groups and corporate organisations. From the commencement of colonisation until the 1980s, public health relating to Aboriginal and Torres Strait Islander people was not necessarily conducted in the best interests of Aboriginal and Torres Strait Islander people, but rather in the interests of the non-Aboriginal population. The attention that was paid focused more generally on the subject of reproduction and issues of prostitution, exploitation, abuse and venereal diseases (Kidd, 1997). Since the late 1980s there has been a shift in the broader public health agenda (see Baum, 1998), along with public health in relation to Aboriginal and Torres Strait Islander people (NHMRC, 2003). This has been coupled with increasing calls to develop appropriate tertiary curricula and to educate, train and employ more Aboriginal and Torres Strait Islander and non-Aboriginal people in public health (Anderson et al., 2004; Genat, 2007; PHERP, 2008a, 2008b). Aboriginal and Torres Strait Islander people have been engaged in public health in ways that place them in a position to influence the public health agenda (Anderson 2004; 2008; Anderson et al., 2004; NATSIHC, 2003). There have been numerous projects, programs and strategies that have sought to develop the Aboriginal and Torres Strait Islander public health workforce (AHMAC, 2002; Oldenburg et al., 2005; SCATSIH, 2002). In recent times the Aboriginal community controlled health sector has joined forces with other peak bodies and governments to find solutions and strategies to improve the health outcomes of Aboriginal and Torres Strait Islander peoples (NACCHO & Oxfam, 2007). This case study chapter will not address these broader activities. Instead it will explore the activities and roles of staff within the Public Health and Research Unit (PHRU) at the Victorian Aboriginal Community Controlled Health Organisation (VACCHO). It will focus on their experiences with education institutions, their work in public health and their thoughts on gaps and where improvements can be made in public health, research and education. The chapter will demonstrate the diversity of education qualifications and experience among the staff, and will reflect how people work within public health on a daily basis to enact change for equity in health and to contribute to the improvement of the future health outcomes of the Victorian Aboriginal community.
Abstract:
With the advances in computer hardware and software development techniques over the past 25 years, digital computer simulation of train movement and traction systems has been widely adopted as a standard computer-aided engineering tool [1] during the design and development stages of existing and new railway systems. Simulators of different approaches and scales are used extensively to investigate various kinds of system studies, and simulation is now proven to be the cheapest means of carrying out performance prediction and system behaviour characterisation. When computers were first used to study railway systems, they were mainly employed to perform repetitive but time-consuming computational tasks, such as matrix manipulations for power network solutions and exhaustive searches for optimal braking trajectories. With only simple high-level programming languages available at the time, full advantage of the computing hardware could not be taken. Hence, structured simulations of the whole railway system were not very common; most applications focused on isolated parts of the railway system, and it is more appropriate to regard those applications as mechanised calculations rather than simulations. However, a railway system consists of a number of subsystems, such as train movement, power supply and traction drives, which inevitably contain many complexities and diversities. These subsystems interact frequently with each other while the trains are moving, and they have their own special features in different railway systems. To further complicate the simulation requirements, constraints such as track geometry, speed restrictions and friction have to be considered, not to mention possible non-linearities and uncertainties in the system. In order to provide a comprehensive and accurate account of system behaviour through simulation, a large amount of data has to be organised systematically to ensure easy access and efficient representation, and the interactions and relationships among the subsystems should be defined explicitly. These requirements call for sophisticated and effective simulation models for each component of the system. The software development techniques available nowadays allow the evolution of such simulation models. Advanced software design not only greatly enhances the applicability of the simulators; it also encourages maintainability and modularity, for ease of understanding and further development, and portability across hardware platforms. The objective of this paper is to review the development of a number of approaches to simulation models, with attention given in particular to models of train movement, power supply systems and traction drives. These models have been successfully used to resolve various 'what-if' questions effectively in a wide range of applications, such as speed profiles, energy consumption and run times.
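To make the train-movement component concrete, here is a minimal time-stepped sketch: Newton's second law integrated against a Davis-type resistance curve, with a crude full-effort-then-cruise driving policy. The mass, tractive-effort limit and Davis coefficients are illustrative placeholders, not values from any railway discussed in the paper.

```python
# Minimal train movement model: F = m*a with a Davis-type
# resistance curve R(v) = A + B*v + C*v^2.
MASS = 400_000.0               # train mass, kg (illustrative)
F_MAX = 300_000.0              # tractive effort limit, N (illustrative)
A, B, C = 3_000.0, 110.0, 6.0  # Davis coefficients (illustrative)
DT = 0.5                       # integration time step, s

def run_profile(target_speed, distance):
    """Integrate speed and position until `distance` is covered;
    returns elapsed run time and a rough traction-energy figure."""
    t = x = v = energy = 0.0
    while x < distance:
        resistance = A + B * v + C * v * v
        # Crude driving policy: full effort below the target speed,
        # then just balance resistance (cruise at constant speed).
        force = F_MAX if v < target_speed else resistance
        accel = (force - resistance) / MASS
        v = max(0.0, v + accel * DT)
        x += v * DT
        energy += force * v * DT   # traction work only, joules
        t += DT
    return t, energy / 3.6e6       # seconds, kWh

t, kwh = run_profile(target_speed=25.0, distance=5_000.0)  # 25 m/s ~ 90 km/h
print(f"run time {t:.0f} s, traction energy {kwh:.1f} kWh")
```

Gradient and curvature terms, speed restrictions along the route, and the coupling to the power supply and traction drive models are exactly the refinements the reviewed simulators layer on top of this basic loop.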
Abstract:
This overview focuses on the application of chemometrics techniques to the investigation of soils contaminated by polycyclic aromatic hydrocarbons (PAHs) and metals, because these two important and very diverse groups of pollutants are ubiquitous in soils. The salient features of various studies carried out in the micro- and recreational environments of humans are highlighted in the context of the various multivariate statistical techniques available across discipline boundaries that have been effectively used in soil studies. Particular attention is paid to techniques employed in the geosciences that may be effectively utilised for environmental soil studies; classical multivariate approaches that may be used in isolation or as complementary methods to these are also discussed. Chemometrics techniques widely applied in atmospheric studies for identifying sources of pollutants or for determining the importance of contaminant source contributions to a particular site have seen little use in soil studies, but may be effectively employed in such investigations. Suitable programs are also available for suggesting mitigating measures in cases of soil contamination, and these are also considered. Specific techniques reviewed include pattern recognition techniques such as Principal Components Analysis (PCA), Fuzzy Clustering (FC) and Cluster Analysis (CA); geostatistical tools, including variograms, Geographical Information Systems (GIS), contour mapping and kriging; and source identification and contribution estimation methods, including Positive Matrix Factorisation (PMF) and Principal Component Analysis on Absolute Principal Component Scores (PCA/APCS). Mitigating measures to limit or eliminate pollutant sources may be suggested through the use of ranking analysis and multi-criteria decision-making (MCDM) methods. These methods are mainly represented in this review by studies employing the Preference Ranking Organisation Method for Enrichment Evaluation (PROMETHEE) and its associated graphic output, Geometrical Analysis for Interactive Aid (GAIA).
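As an illustration of the pattern-recognition step that typically opens such analyses, the sketch below applies PCA to a toy soil dataset. The site labels and concentration values are hypothetical, and scikit-learn is assumed to be available.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Toy data: rows = soil sampling sites, columns = contaminant
# concentrations (two PAHs and two metals; hypothetical values, mg/kg).
X = np.array([
    [0.12, 0.08, 15.0, 22.0],   # background site
    [0.95, 0.70, 18.0, 25.0],   # PAH-dominated (e.g. near traffic)
    [0.15, 0.10, 95.0, 140.0],  # metal-dominated (e.g. industrial)
    [1.10, 0.85, 20.0, 30.0],
    [0.18, 0.12, 88.0, 120.0],
])

# Standardise first: PAH and metal concentrations sit on very
# different scales and would otherwise dominate the components.
Xs = StandardScaler().fit_transform(X)
pca = PCA(n_components=2).fit(Xs)
print("explained variance ratios:", pca.explained_variance_ratio_)
print("loadings (rows = components):\n", pca.components_)
# Sites with similar contamination patterns cluster together in the
# score space, while the loadings indicate which contaminants drive
# each axis -- the starting point for source identification methods
# such as PMF or PCA/APCS.
```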
Abstract:
The launch of the Apple iPad in January 2010 was one of the most anticipated and publicised launches of a new technological device in recent history. Positioning it between a smart phone and a PC, but with the attributes of both, Apple have sought to develop a new market niche for tablet PC devices with the iPad, and early signs are that market expectations are being met. The iPad's launch was potentially fortuitous for the newspaper industry worldwide, as it offered the potential to address the industry's two recurring problems: the slow but inexorable decline of print media circulation, and the inability to satisfactorily monetise online readerships. As a result, the Apple iPad has benefited from an enormous amount of free publicity in newspapers as they develop their own applications (apps) for the device. This paper reports on findings from work undertaken through the Smart Services CRC into potential take-up and likely uses of the iPad, and their implications for the news media industry. It reports on focus group analysis undertaken in mid-2010 using 'customer job mapping' methodologies, which draw attention to current gaps in user behaviour across available devices in order to anticipate possibilities beyond the current 'three screens' of PC, mobile phone and television.