857 results for: definition of PE
Abstract:
The status of entertainment, as both a dimension of human culture and a booming global industry, is increasing. Given more recent consumer-centric definitions of entertainment, the entertainment consumer has grown in prominence and is now coming under closer scrutiny. However, viewing entertainment consumers as always behaving towards entertainment as they do towards other products may be selling them short. For a start, entertainment consumers can exhibit a strong loyalty towards their favourite entertainment products that is the envy of the marketing world. Academic researchers and marketers who are keen to investigate entertainment consumers would benefit from a theoretical base from which to commence. This essay, therefore, takes a consumer-oriented focus in defining entertainment and conceptualises a model of entertainment consumption. In approaching the study of entertainment, one axiomatic question remains: how should we define it? Richard Dyer notes that, considering that the category of entertainment can include – by its own definition in the song ‘That’s entertainment!’ – everything from Hamlet and Oedipus Rex to ‘the clown with his pants falling down’ and ‘the lights on the lady in tights’, it does not make much sense to try to define entertainment as being marked by particular textual features (as is done, for example, by Avrich, 2002). Dyer’s position is rather that ‘entertainment is not so much a category of things as an attitude towards things’ (Dyer, 1973: 9). He traces the modern conception of entertainment back to the writings of Molière, who defended the purpose of his plays against attacks from the church that they were not sufficiently edifying by insisting that, as entertainments, they had no interest in edifying audiences – his ‘real purpose … was to provide people pleasure – and the definition of that was to be decided by “the people”’ (Dyer, 1973: 9).
In my own discipline of Marketing this approach has been embraced – Kaser and Oelkers, for example, define entertainment as ‘whatever people are willing to spend their money and spare time viewing’ (2008, 18). That is the approach taken in this paper, where I see entertainment as ‘consumer-driven culture’ (McKee and Collis, 2009) – a definition closely aligned with the marketing concept. Within a marketing framework I explore what the consumption of entertainment can tell us about the relationships between consumers and culture more generally. Entertainment offers an intriguing case study: it is often consumed in ways that challenge many of our assumptions about marketing and consumer behaviour.
Abstract:
In Legal Services Commissioner and Wright [2010] QSC 168 and Amos v Ian K Fry & Company, the Supreme Court of Queensland considered the scope of some of the provisions of the Legal Profession Act 2007 (Qld), including the definition of “third party payer” in s 301 of the Act.
Abstract:
This literature review was developed as background for the formulation of an Australian Psychological Society position on the mental health and wellbeing of refugees resettling in Australia. The major aim is to provide a broad overview of the concerns related to refugee mental health and wellbeing within the Australian context. To begin, a brief overview of the definition of a refugee and the scope of refugee movement is provided. Next, the review examines the pre-displacement, post-displacement, systemic and socio-political factors that influence the process of adaptation in refugee resettlement. It then reviews documented approaches to psychological assessment and therapeutic interventions with refugees; and finally it summarises suggestions for assessment and intervention in these practice contexts.
Abstract:
Existing distinctions between macro and micro approaches have been jeopardising the advance of Information Systems (IS) research. Both approaches have been criticized for explaining one level while neglecting the other; the current situation therefore calls for multilevel research to address these deficiencies. Instead of studying a single level (macro or micro), multilevel research entails more than one level of conceptualization and analysis simultaneously. As the notion of the multilevel is borrowed from reference disciplines, confusion and inconsistency have arisen within the IS discipline, hindering the adoption of multilevel research. This paper argues for the potential value of multilevel research by investigating its current application within the IS domain. A content analysis of multilevel research articles from major IS conferences and journals is presented. The results suggest that IS scholars have applied multilevel research to produce high-quality work across a variety of topics. However, researchers have not been consistent in defining “multilevel”, leading to idiosyncratic meanings of multilevel research resting, most often, on authors’ own interpretations. We argue that a rigorous definition of “multilevel research” needs to be explicated for consistency across the research community.
Abstract:
An environmentally sustainable, and thus green, business process is one that delivers organizational value whilst also exerting a minimal impact on the natural environment. Recent works from the field of Information Systems (IS) have argued that information systems can contribute to the design and implementation of sustainable business processes. While prior research has investigated how information systems can be used to support sustainable business practices, there is still a void as to the actual changes that business processes have to undergo in order to become environmentally sustainable, and the specific role that information systems play in enabling this change. In this paper, we provide a conceptualization of environmentally sustainable business processes, and discuss the role of functional affordances of information systems in enabling both incremental and radical changes to make processes environmentally sustainable. Our conceptualization is based on (a) a fundamental definition of the concept of environmental sustainability, grounded in two basic components: the environmental source and sink functions of any project or activity, and (b) the concept of functional affordances, which describes the potential uses originating in the material properties of information systems in relation to their use context. In order to illustrate the application of our framework and provide a first evaluation, we analyse two examples from prior research where information systems impacted on the sustainability of business processes.
Abstract:
There exists an important tradition of content analyses of aggression in sexually explicit material. The majority of these analyses use a definition of aggression that excludes consent. This article identifies three problems with this approach. First, it does not distinguish between aggression and some positive acts. Second, it excludes a key element of healthy sexuality. Third, it can lead to heteronormative definitions of healthy sexuality. It would be better to use a definition of aggression in our content analyses, such as Baron and Richardson's (1994), that includes a consideration of consent. A number of difficulties with attending to consent have been identified, but this article offers solutions to each of them.
Abstract:
Using Monte Carlo simulation for radiotherapy dose calculation can provide more accurate results than the analytical methods usually found in modern treatment planning systems, especially in regions with a high degree of inhomogeneity. These more accurate results, however, often require orders of magnitude more calculation time to attain high precision, reducing their utility within the clinical environment. This work aims to improve the utility of Monte Carlo simulation within the clinical environment by developing techniques which enable faster Monte Carlo simulation of radiotherapy geometries. This is achieved principally through the use of new high-performance computing environments and simpler, alternative yet equivalent representations of complex geometries. Firstly, the use of cloud computing technology and its application to radiotherapy dose calculation is demonstrated. As with other supercomputer-like environments, the time to complete a simulation decreases as 1/n with increasing numbers n of cloud-based computers performing the calculation in parallel. Unlike traditional supercomputer infrastructure, however, there is no initial outlay of cost, only modest ongoing usage fees; the simulations described in the following are performed using this cloud computing technology. The definition of geometry within the chosen Monte Carlo simulation environment – Geometry & Tracking 4 (GEANT4) in this case – is also addressed in this work. At the simulation implementation level, a new computer-aided design interface is presented for use with GEANT4, enabling direct coupling between manufactured parts and their equivalents in the simulation environment, which is of particular importance when defining linear accelerator treatment head geometry.
Further, a new technique for navigating tessellated or meshed geometries is described, allowing up to three orders of magnitude performance improvement through the use of tetrahedral meshes in place of complex triangular surface meshes. The technique has application in the definition of both mechanical parts in a geometry and patient geometry. Static patient CT datasets, like those found in typical radiotherapy treatment plans, are often very large and impose a significant performance penalty on a Monte Carlo simulation. By extracting the regions of interest in a radiotherapy treatment plan and representing them in a mesh-based form similar to those used in computer-aided design, the above optimisation techniques can be used to reduce the time required to navigate the patient geometry in the simulation environment. Results presented in this work show that these equivalent yet much simplified patient geometry representations enable significant performance improvements over simulations that consider raw CT datasets alone. Furthermore, this mesh-based representation allows for direct manipulation of the geometry, enabling, for example, motion augmentation for time-dependent dose calculation. Finally, an experimental dosimetry technique is described which allows the validation of time-dependent Monte Carlo simulations, like those made possible by the aforementioned patient geometry definition. A bespoke organic plastic scintillator dose-rate meter is embedded in a gel dosimeter, enabling simultaneous 3D dose distribution and dose-rate measurement. This work demonstrates the effectiveness of applying alternative and equivalent geometry definitions to complex geometries for the purpose of improving Monte Carlo simulation performance. Additionally, these alternative geometry definitions allow manipulations to be performed on otherwise static and rigid geometry.
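The 1/n parallel scaling described above can be sketched with an embarrassingly parallel Monte Carlo toy. This is a minimal illustration only, estimating π rather than dose, and the worker count and sample budget are illustrative assumptions, not the GEANT4/cloud configuration used in the thesis:

```python
import random
from multiprocessing import Pool


def count_hits(n_samples: int) -> int:
    """Count random points falling inside the unit quarter-circle."""
    rng = random.Random()
    hits = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return hits


def parallel_pi(total_samples: int, n_workers: int) -> float:
    """Split the sample budget across n_workers independent processes.

    Because the samples are independent, wall-clock time ideally falls
    as 1/n with n workers, while the pooled estimate is unchanged.
    """
    per_worker = total_samples // n_workers
    with Pool(n_workers) as pool:
        hits = sum(pool.map(count_hits, [per_worker] * n_workers))
    return 4.0 * hits / (per_worker * n_workers)


if __name__ == "__main__":
    print(parallel_pi(400_000, 4))
```

The same logic applies to dose scoring: each worker tracks its own particle histories, and only the scored tallies are summed at the end.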
Abstract:
Context: Various epidemiological studies have estimated that up to 70% of runners sustain an overuse running injury each year. Although few overuse running injuries have an established cause, more than 80% of running-related injuries occur at or below the knee, which suggests that some common mechanisms may be at work. The question then becomes: are there common mechanisms related to overuse running injuries? Evidence Acquisition: Research studies were identified via the following electronic databases: MEDLINE, EMBASE, PsycInfo, and CINAHL (1980–July 2008). Inclusion was based on evaluation of risk factors for overuse running injuries. Results: The majority of the risk factors researched over the past few years can be generally categorized into two groups: atypical foot pronation mechanics and inadequate hip muscle stabilization. Conclusion: Based on the review of the literature, there is no definitive link between atypical foot mechanics and running injury mechanisms; the lack of normative data and of a definition of typical foot structure has hampered progress. In contrast, a large and growing body of literature suggests that weakness of the hip-stabilizing muscles leads to atypical lower extremity mechanics and increased forces within the lower extremity while running.
Abstract:
Quantitative imaging methods to analyze cell migration assays are not standardized. Here we present a suite of two-dimensional barrier assays describing the collective spreading of an initially-confined population of 3T3 fibroblast cells. To quantify the motility rate we apply two different automatic image detection methods to locate the position of the leading edge of the spreading population after 24, 48 and 72 hours. These results are compared with a manual edge detection method where we systematically vary the detection threshold. Our results indicate that the observed spreading rates are very sensitive to the choice of image analysis tools, and we show that a standard measure of cell migration can vary by as much as 25% for the same experimental images depending on the details of the image analysis tools. Our results imply that it is very difficult, if not impossible, to meaningfully compare previously published measures of cell migration, since previous results have been obtained using different image analysis techniques and the details of these techniques are not always reported. Using a mathematical model, we provide a physical interpretation of our edge detection results. The physical interpretation is important since edge detection algorithms alone do not specify any physical measure, or physical definition, of the leading edge of the spreading population. Our modeling indicates that variations in the image threshold parameter correspond to a consistent variation in the local cell density. This means that varying the threshold parameter is equivalent to varying the location of the leading edge in the range of approximately 1–5% of the maximum cell density.
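The threshold sensitivity described above can be illustrated with a minimal sketch: a hypothetical sigmoidal density profile (illustrative numbers, not the authors' 3T3 data) and a simple threshold rule for locating the leading edge. Moving the threshold between 1% and 5% of the maximum density shifts the detected edge by hundreds of microns:

```python
import numpy as np


def leading_edge(density: np.ndarray, positions: np.ndarray,
                 threshold: float) -> float:
    """Outermost position at which density still exceeds the threshold."""
    above = positions[density >= threshold]
    return float(above.max())


# Hypothetical radial density profile of a spreading population:
# near the maximum density close to the barrier, decaying outwards.
positions = np.linspace(0, 2000, 401)                      # microns
density = 1.0 / (1.0 + np.exp((positions - 1200) / 150))   # fraction of max

edge_low = leading_edge(density, positions, 0.01)   # 1% of max density
edge_high = leading_edge(density, positions, 0.05)  # 5% of max density
print(edge_low, edge_high)  # detected edge position depends on the threshold
```

Because the profile decays smoothly rather than dropping to zero, any edge definition is implicitly a choice of density contour, which is exactly the point the abstract makes.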
Abstract:
Introduction: The accurate identification of tissue electron densities is of great importance for Monte Carlo (MC) dose calculations. When converting patient CT data into a voxelised format suitable for MC simulations, however, it is common to simplify the assignment of electron densities so that the complex tissues existing in the human body are categorised into a few basic types. This study examines the effects that the assignment of tissue types and the calculation of densities can have on the results of MC simulations, for the particular case of a Siemens Sensation 4 CT scanner located in a radiotherapy centre where QA measurements are routinely made using 11 tissue types (plus air). Methods: DOSXYZnrc phantoms are generated from CT data, using the CTCREATE user code, with the relationship between Hounsfield units (HU) and density determined via linear interpolation between a series of specified points on the ‘CT-density ramp’ (see Figure 1(a)). Tissue types are assigned according to HU ranges. Each voxel in the DOSXYZnrc phantom therefore has an electron density (electrons/cm3) defined by the product of the mass density (from the HU conversion) and the intrinsic electron density (electrons/gram) (from the material assignment) in that voxel. In this study, we consider the problems of density conversion and material identification separately: the CT-density ramp is simplified by decreasing the number of points which define it from 12 down to 8, 3 and 2; and the material-type assignment is varied by defining the materials which comprise our test phantom (a Supertech head) as two tissues and bone, two plastics and bone, water only and (as an extreme case) lead only. The effect of these parameters on radiological thickness maps derived from simulated portal images is investigated.
Results & Discussion: Increasing the degree of simplification of the CT-density ramp has an increasing effect on the radiological thickness calculated for the Supertech head phantom. For instance, defining the CT-density ramp using 8 points, instead of 12, results in a maximum radiological thickness change of 0.2 cm, whereas defining the CT-density ramp using only 2 points results in a maximum radiological thickness change of 11.2 cm. Changing the definition of the materials comprising the phantom between water, plastic and tissue results in millimetre-scale changes to the resulting radiological thickness. When the entire phantom is defined as lead, this alteration changes the calculated radiological thickness by a maximum of 9.7 cm. Evidently, the simplification of the CT-density ramp has a greater effect on the resulting radiological thickness map than does the alteration of the assignment of tissue types. Conclusions: It is possible to alter the definitions of the tissue types comprising the phantom (or patient) without substantially altering the results of simulated portal images. However, these images are very sensitive to the accurate identification of the HU-density relationship. When converting data from a patient’s CT into a MC simulation phantom, therefore, all possible care should be taken to accurately reproduce the conversion between HU and mass density for the specific CT scanner used. Acknowledgements: This work is funded by the NHMRC, through a project grant, and supported by the Queensland University of Technology (QUT) and the Royal Brisbane and Women's Hospital (RBWH), Brisbane, Australia. The authors are grateful to the staff of the RBWH, especially Darren Cassidy, for assistance in obtaining the phantom CT data used in this study. The authors also wish to thank Cathy Hargrave, of QUT, for assistance in formatting the CT data using the Pinnacle TPS.
Computational resources and services used in this work were provided by the HPC and Research Support Group, QUT, Brisbane, Australia.
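The HU-to-density conversion step described above can be sketched with a piecewise-linear interpolation along the CT-density ramp. The calibration points below are illustrative assumptions, not the Siemens scanner's actual ramp; the sketch shows how collapsing the ramp to two points distorts the density assigned to soft-tissue HU values:

```python
import numpy as np

# Illustrative (not scanner-specific) HU -> mass density (g/cm^3) points
hu_full = np.array([-1000, -700, -98, 0, 100, 300, 1000, 2000], dtype=float)
rho_full = np.array([0.001, 0.30, 0.93, 1.00, 1.07, 1.20, 1.60, 2.20])

# A crude 2-point ramp using only the endpoints, as in the extreme case above
hu_2pt = hu_full[[0, -1]]
rho_2pt = rho_full[[0, -1]]


def hu_to_density(hu, hu_pts, rho_pts):
    """Linear interpolation between specified points on the CT-density ramp."""
    return np.interp(hu, hu_pts, rho_pts)


hu_sample = np.array([-500.0, 50.0, 500.0])     # lung-, soft-tissue-, bone-like
dense_full = hu_to_density(hu_sample, hu_full, rho_full)
dense_2pt = hu_to_density(hu_sample, hu_2pt, rho_2pt)
print(np.abs(dense_full - dense_2pt))  # error introduced by the 2-point ramp
```

The mid-range HU values suffer the largest density errors under the 2-point ramp, consistent with the large radiological thickness changes reported above for the most simplified ramps.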
Abstract:
Purpose – The purpose of this paper is to investigate information communications technologies (ICT)-mediated inclusion and exclusion in terms of sexuality through a study of a commercial social networking web site for gay men. Design/methodology/approach – The paper uses an approach based on technological inscription and the commodification of difference to study Gaydar, a commercial social networking site. Findings – Through the activities, events and interactions offered by Gaydar, the study identifies a series of contrasting identity constructions and market segmentations that are constructed through the cyclic commodification of difference. These are fuelled by a particular series of meanings attached to gay male sexualities which serve to keep gay men positioned as a niche market. Research limitations/implications – The research centres on the study of one, albeit widely used, web site with a very specific set of purposes. The study offers a model for future research on sexuality and ICTs. Originality/value – This study places sexuality centre stage in an ICT-mediated environment and provides insights into the contemporary phenomenon of social networking. As a sexualised object, Gaydar presents a semiosis of politicised messages that question heteronormativity while simultaneously contributing to the definition of an increasingly globalised, commercialised and monolithic form of gay male sexuality defined against ICT
Abstract:
Football, or soccer as it is more commonly known in Australia and the US, is arguably the world’s most popular sport, and it generates a proportionate volume of related writing. Within this landscape, works of novel-length fiction are seemingly rare. This paper establishes and maps a substantial body of football fiction and explores the elements and qualities these works exhibit individually and collectively. In bringing together the current, limited surveys of the field, it presents the first rigorous definition of football fiction and captures the first historiography of the corpus. Drawing on distant reading methods developed in conjunction with closer textual analyses, the historiography and subsequent taxonomy represent the first articulation of relationships across the body of work, identify growth areas and establish a number of movements and trends. In advancing the understanding of football fiction as a collective body, the paper lays foundations for further research and for consideration of the works in generic terms.
Abstract:
Finite Element modelling of bone fracture fixation systems allows computational investigation of the deformation response of the bone to load. Once validated, these models can easily be adapted to explore changes in the design or configuration of a fixator. The deformation of the tissue within the fracture gap determines its healing and is often summarised as the stiffness of the construct. FE models capable of reproducing this behaviour would provide valuable insight into the healing potential of different fixation systems. Current model validation techniques lack depth in 6D load and deformation measurement, and other aspects of FE model creation, such as the definition of interfaces between components, have also not been explored. This project investigated the mechanical testing and FE modelling of a bone-plate construct for the determination of stiffness. In-depth 6D measurement and analysis of the generated forces, moments and movements showed large out-of-plane behaviours which had not previously been characterised. Stiffness calculated from the interfragmentary movement was found to be an unsuitable summary parameter, as the error propagation is too large. Current FE modelling techniques were applied in compression and torsion, mimicking the experimental setup. Compressive stiffness was well replicated, though torsional stiffness was not, and the out-of-plane behaviours prevalent in the experimental work were not replicated in the model. The interfaces between the components were investigated experimentally and through modification of the FE model. Incorporating the interface modelling techniques into the full construct models had no effect in compression but did act to reduce torsional stiffness, bringing it closer to that of the experiment. The interface definitions had no effect on out-of-plane behaviours, which were still not replicated.
Neither current nor novel FE modelling techniques were able to replicate the out of plane behaviours evident in the experimental work. New techniques for modelling loads and boundary conditions need to be developed to mimic the effects of the entire experimental system.
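The error-propagation problem with stiffness derived from interfragmentary movement can be sketched with first-order uncertainty propagation for k = F/d. The loads and measurement uncertainties below are hypothetical, chosen only to show how a small displacement with a modest absolute error dominates the uncertainty in the computed stiffness:

```python
import math


def stiffness(force_n: float, displacement_mm: float) -> float:
    """Construct stiffness as applied force over interfragmentary movement."""
    return force_n / displacement_mm


def stiffness_rel_error(force_n: float, sigma_f: float,
                        disp_mm: float, sigma_d: float) -> float:
    """First-order error propagation for k = F/d, assuming uncorrelated errors:
    sigma_k / k = sqrt((sigma_F / F)**2 + (sigma_d / d)**2)."""
    return math.sqrt((sigma_f / force_n) ** 2 + (sigma_d / disp_mm) ** 2)


# Hypothetical numbers: 500 N load producing 0.5 mm interfragmentary movement,
# measured with +/-5 N (1%) and +/-0.05 mm (10%) uncertainty.
k = stiffness(500.0, 0.5)
rel = stiffness_rel_error(500.0, 5.0, 0.5, 0.05)
print(k, rel)  # the displacement term dominates the relative stiffness error
```

As the movement shrinks (stiffer constructs), the relative displacement error grows, which is why stiffness computed this way becomes an unreliable summary parameter.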
Abstract:
In recent years, with the development of techniques in modern molecular biology, it has become possible to study the genetic basis of carcinogenesis down to the level of DNA sequence. Major advances have been made in our understanding of the genes involved in cell cycle control and descriptions of mutations in those genes. These developments have led to the definition of the role of specific oncogenes and tumour suppressor genes in several cancers, including, for example, colon cancers and some forms of breast cancer. Work reported from our laboratory has led to the identification of a number of candidate genes involved in the development of non-melanotic skin cancers. In this chapter, we attempt to further explain the observed (phenomic) alterations in metabolic pathways associated with oxygen consumption with the changes at the genetic level.
Abstract:
A Neutral cluster and Air Ion Spectrometer (NAIS) was used to monitor the concentration of airborne ions on 258 full days between Nov 2011 and Dec 2012 in Brisbane, Australia. The air was sampled from outside a window on the sixth floor of a building close to the city centre, approximately 100 m away from a busy freeway. The NAIS detects all ions and charged particles smaller than 42 nm. It was operated in a 4 min measurement cycle, with ion data recorded at 10 s intervals over 2 min during each cycle. The data were analysed to derive the diurnal variation of small, large and total ion concentrations in the environment. We adapt the definition of Horrak et al. (2000) and classify small ions as molecular clusters smaller than 1.6 nm, and large ions as charged particles larger than this size...