467 results for Impulsive sensation-seeking
Abstract:
Objective: The Brief Michigan Alcoholism Screening Test (bMAST) is a 10-item test derived from the 25-item Michigan Alcoholism Screening Test (MAST). It is widely used in the assessment of alcohol dependence. In the absence of previous validation studies, the principal aim of this study was to assess the validity and reliability of the bMAST as a measure of the severity of problem drinking. Method: A total of 6,594 patients (4,854 men, 1,740 women) who had been referred to a hospital alcohol and drug service for alcohol-use disorders voluntarily participated in this study. Results: An exploratory factor analysis defined a two-factor solution, consisting of Perception of Current Drinking and Drinking Consequences factors. Structural equation modeling confirmed that the fit of a nine-item, two-factor model was superior to the original one-factor model. Concurrent validity was assessed through simultaneous administration of the Alcohol Use Disorders Identification Test (AUDIT) and associations with alcohol consumption and clinically assessed features of alcohol dependence. The two-factor bMAST model showed moderate correlations with the AUDIT. The two-factor bMAST and AUDIT were similarly associated with quantity of alcohol consumption and clinically assessed dependence severity features. No differences were observed between the existing weighted scoring system and the proposed simple scoring system. Conclusions: In this study, both the existing bMAST total score and the two-factor model identified were as effective as the AUDIT in assessing problem drinking severity. There are additional advantages of employing the two-factor bMAST in the assessment and treatment planning of patients seeking treatment for alcohol-use disorders. (J. Stud. Alcohol Drugs 68: 771-779, 2007)
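The two-factor extraction reported above can be illustrated with a minimal sketch. This is not the study's method (which used exploratory factor analysis and structural equation modelling on patient data); it is a principal-component style eigen-decomposition of a synthetic 10-item correlation matrix whose two item clusters, and all numeric values, are purely illustrative:

```python
import numpy as np

def extract_loadings(R, n_factors=2):
    """Extract factor loadings from a correlation matrix R via
    eigen-decomposition (principal-component style extraction)."""
    vals, vecs = np.linalg.eigh(R)               # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:n_factors]   # keep the largest n_factors
    return vecs[:, order] * np.sqrt(vals[order])

# Synthetic 10-item correlation matrix with two item clusters,
# loosely mimicking a two-factor structure (values are illustrative).
R = np.full((10, 10), 0.1)
R[:5, :5] = 0.6   # e.g. a "Perception of Current Drinking" cluster
R[5:, 5:] = 0.6   # e.g. a "Drinking Consequences" cluster
np.fill_diagonal(R, 1.0)

loadings = extract_loadings(R)   # 10 items x 2 factors
```

With this block structure the first component loads all items and the second contrasts the two clusters; a rotation step would be needed to recover the simple structure a full EFA reports.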
Abstract:
This paper takes Kent and Taylor’s (2002) call to develop a dialogic theory of public relations and suggests that a necessary first step is the modelling of the process of dialogic communication in public relations. In order to achieve this, extant literature from a range of fields is reviewed, seeking to develop a definition of dialogic communication that is meaningful to the practice of contemporary public relations. A simple transmission model of communication is used as a starting point. This is synthesised with concepts relating specifically to dialogue, taken here in its broadest sense rather than defined as any one particular outcome. The definition that emerges from this review leads to the conclusion that dialogic communication in public relations involves the interaction of three roles – those of sender, receiver, and responder. These three roles are shown to be adopted at different times by both participants involved in dialogic communication. It is further suggested that variations occur in how these roles are conducted: the sender and receiver roles can be approached in a passive or an active way, while the responder role can be classified as being either resistant or responsive to the information received in dialogic communication. The final modelling of the definition derived provides a framework which can be tested in the field to determine whether variations in the conduct of the roles in dialogic communication actually exist, and if so, whether they can be linked to the different types of outcome from dialogic communication identified previously in the literature.
Abstract:
Aims: To develop clinical protocols for acquiring PET images, performing CT-PET registration and tumour volume definition based on the PET image data, for radiotherapy for lung cancer patients and then to test these protocols with respect to levels of accuracy and reproducibility. Method: A phantom-based quality assurance study of the processes associated with using registered CT and PET scans for tumour volume definition was conducted to: (1) investigate image acquisition and manipulation techniques for registering and contouring CT and PET images in a radiotherapy treatment planning system, and (2) determine technology-based errors in the registration and contouring processes. The outcomes of the phantom image based quality assurance study were used to determine clinical protocols. Protocols were developed for (1) acquiring patient PET image data for incorporation into the 3DCRT process, particularly for ensuring that the patient is positioned in their treatment position; (2) CT-PET image registration techniques and (3) GTV definition using the PET image data. The developed clinical protocols were tested using retrospective clinical trials to assess levels of inter-user variability which may be attributed to the use of these protocols. A Siemens Somatom Open Sensation 20 slice CT scanner and a Philips Allegro stand-alone PET scanner were used to acquire the images for this research. The Philips Pinnacle3 treatment planning system was used to perform the image registration and contouring of the CT and PET images. Results: Both the attenuation-corrected and transmission images obtained from standard whole-body PET staging clinical scanning protocols were acquired and imported into the treatment planning system for the phantom-based quality assurance study. Protocols for manipulating the PET images in the treatment planning system, particularly for quantifying uptake in volumes of interest and window levels for accurate geometric visualisation were determined. 
The automatic registration algorithms were found to have sub-voxel levels of accuracy, with transmission scan-based CT-PET registration more accurate than emission scan-based registration of the phantom images. Respiration-induced image artifacts were not found to influence registration accuracy, while inadequate pre-registration overlap of the CT and PET images was found to result in large registration errors. A threshold value based on a percentage of the maximum uptake within a volume of interest was found to accurately contour the different features of the phantom despite the lower spatial resolution of the PET images. Appropriate selection of the threshold value is dependent on target-to-background ratios and the presence of respiratory motion. The results from the phantom-based study were used to design, implement and test clinical CT-PET fusion protocols. The patient PET image acquisition protocols enabled patients to be successfully identified and positioned in their radiotherapy treatment position during the acquisition of their whole-body PET staging scan. While automatic registration techniques were found to reduce inter-user variation compared to manual techniques, there was no significant difference in the registration outcomes for transmission or emission scan-based registration of the patient images, using the protocol. Tumour volumes contoured on registered patient CT-PET images using the tested threshold values and viewing windows determined from the phantom study demonstrated less inter-user variation for the primary tumour volume contours than those contoured using only the patient’s planning CT scans. Conclusions: The developed clinical protocols allow a patient’s whole-body PET staging scan to be incorporated, manipulated and quantified in the treatment planning process to improve the accuracy of gross tumour volume localisation in 3D conformal radiotherapy for lung cancer. 
Image registration protocols which factor in potential software-based errors, combined with adequate user training, are recommended to increase the accuracy and reproducibility of registration outcomes. A semi-automated adaptive threshold contouring technique incorporating a PET windowing protocol accurately defines the geometric edge of a tumour volume using PET image data from a stand-alone PET scanner, including 4D target volumes.
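The percentage-of-maximum threshold contouring described above can be sketched in a few lines. This is a simplified stand-in, not the clinical protocol: the 40% fraction, the phantom geometry, and the 10:1 target-to-background ratio below are illustrative, and the thesis notes that the appropriate threshold depends on target-to-background ratio and respiratory motion:

```python
import numpy as np

def threshold_contour(pet_volume, fraction=0.4):
    """Return a binary mask of voxels at or above `fraction` of the
    maximum uptake within the volume of interest."""
    return pet_volume >= fraction * pet_volume.max()

# Illustrative digital phantom: a bright spherical "tumour" (radius 6
# voxels) on a dim background, target-to-background ratio 10:1.
z, y, x = np.ogrid[:32, :32, :32]
dist2 = (z - 16) ** 2 + (y - 16) ** 2 + (x - 16) ** 2
phantom = np.where(dist2 <= 6 ** 2, 10.0, 1.0)

mask = threshold_contour(phantom, fraction=0.4)  # contours the sphere
```

At a 40% threshold (4.0 here) the background (1.0) is excluded and the sphere (10.0) is recovered exactly; in real PET data, partial-volume blurring at the tumour edge is what makes the threshold choice sensitive.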
Abstract:
Hollywood has dominated the global film business since the First World War. Economic formulas used by governments to assess levels of industry dominance typically measure market share to establish the degree of industry concentration. The business literature reveals that a marketing orientation strongly correlates with superior market performance and that market leaders that possess a set of six superior marketing capabilities are able to continually outperform rival firms. This paper argues that the historical evidence shows that the Hollywood Majors have consistently outperformed rival firms and rival film industries in each of those six marketing capabilities and that unless rivals develop a similarly integrated and cohesive strategic marketing management approach to the movie business and match the Major studios’ superior capabilities, then Hollywood’s dominance will continue. This paper also proposes that in cyberspace, whilst the Internet does provide a channel that democratises film distribution, the flat landscape of the world wide web means that in order to stand out from the clutter of millions of cyber-voices seeking attention, independent film companies need to possess superior strategic marketing management capabilities and develop effective e-marketing strategies to find a niche, attract a loyal online audience and prosper. However, mirroring a recent CIA report forecasting a multi-polar world economy, this paper also argues that potentially serious longer-term rivals are emerging and will increasingly take a larger slice of an expanding global box office as India, China and other major developing economies and their respective cultural channels grow and achieve economic parity with or surpass the advanced western economies. Thus, in terms of global market share over time, Hollywood’s slice of the pie will comparatively diminish in an emerging multi-polar movie business.
Abstract:
The performance of an adaptive filter may be studied through the behaviour of the optimal and adaptive coefficients in a given environment. This thesis investigates the performance of finite impulse response adaptive lattice filters for two classes of input signals: (a) frequency modulated signals with polynomial phases of order p in complex Gaussian white noise (as nonstationary signals), and (b) impulsive autoregressive processes with alpha-stable distributions (as non-Gaussian signals). Initially, an overview is given for linear prediction and adaptive filtering. The convergence and tracking properties of the stochastic gradient algorithms are discussed for stationary and nonstationary input signals. It is explained that the stochastic gradient lattice algorithm has many advantages over the least-mean square algorithm. These advantages include a modular structure, easily guaranteed stability, lower sensitivity to the eigenvalue spread of the input autocorrelation matrix, and easy quantization of the filter coefficients (normally called reflection coefficients). We then characterize the performance of the stochastic gradient lattice algorithm for the frequency modulated signals through the optimal and adaptive lattice reflection coefficients. This is a difficult task due to the nonlinear dependence of the adaptive reflection coefficients on the preceding stages and the input signal. To ease the derivations, we assume that the reflection coefficients of each stage are independent of the inputs to that stage. Then the optimal lattice filter is derived for the frequency modulated signals. This is performed by computing the optimal values of residual errors, reflection coefficients, and recovery errors. Next, we show the tracking behaviour of adaptive reflection coefficients for frequency modulated signals. This is carried out by computing the tracking model of these coefficients for the stochastic gradient lattice algorithm on average. 
The second-order convergence of the adaptive coefficients is investigated by modeling the theoretical asymptotic variance of the gradient noise at each stage. The accuracy of the analytical results is verified by computer simulations. Using the previous analytical results, we show a new property, the polynomial order reducing property of adaptive lattice filters. This property may be used to reduce the order of the polynomial phase of input frequency modulated signals. Considering two examples, we show how this property may be used in processing frequency modulated signals. In the first example, a detection procedure is carried out on a frequency modulated signal with a second-order polynomial phase in complex Gaussian white noise. We show that, using this technique, a better probability of detection is obtained for the reduced-order phase signals compared to that of the traditional energy detector. Also, it is empirically shown that the distribution of the gradient noise in the first adaptive reflection coefficients approximates the Gaussian law. In the second example, the instantaneous frequency of the same observed signal is estimated. We show that by using this technique a lower mean square error is achieved for the estimated frequencies at high signal-to-noise ratios in comparison to that of the adaptive line enhancer. The performance of adaptive lattice filters is then investigated for the second type of input signals, i.e., impulsive autoregressive processes with alpha-stable distributions. The concept of alpha-stable distributions is first introduced. We discuss how the stochastic gradient algorithm, which performs well for finite-variance input signals (like frequency modulated signals in noise), does not achieve fast convergence for infinite-variance stable processes (due to the use of the minimum mean-square error criterion). 
To deal with such problems, the concepts of the minimum dispersion criterion, fractional lower-order moments, and recently developed algorithms for stable processes are introduced. We then study the possibility of using the lattice structure for impulsive stable processes. Accordingly, two new algorithms, the least-mean p-norm lattice algorithm and its normalized version, are proposed for lattice filters based on fractional lower-order moments. Simulation results show that using the proposed algorithms, faster convergence speeds are achieved for parameter estimation of autoregressive stable processes with low to moderate degrees of impulsiveness in comparison to many other algorithms. Also, we discuss the effect of the impulsiveness of stable processes on the misalignment generated between the estimated parameters and the true values. Due to the infinite variance of stable processes, the performance of the proposed algorithms is only investigated using extensive computer simulations.
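The lattice recursion discussed in this abstract can be sketched as follows. This is a minimal single-loop version under the usual gradient adaptive lattice conventions, not the thesis's exact algorithms: setting p = 2 gives a standard stochastic gradient lattice update (mean-square criterion), while choosing 1 < p < alpha gives a least-mean p-norm style update of the kind proposed for alpha-stable inputs; the step size, order, and coefficient clipping are illustrative choices:

```python
import numpy as np

def adaptive_lattice(x, order=2, mu=0.005, p=2.0):
    """Gradient adaptive lattice predictor.

    Each stage m forms forward/backward prediction errors f, b and
    updates its reflection coefficient k[m] by a stochastic gradient
    step on |f|^p + |b|^p (p = 2 recovers the mean-square criterion).
    Returns the trajectory of the reflection coefficients.
    """
    k = np.zeros(order)            # reflection coefficients
    f = np.zeros(order + 1)        # forward prediction errors per stage
    b_prev = np.zeros(order + 1)   # backward errors, delayed one sample
    k_hist = np.zeros((len(x), order))
    for t in range(len(x)):
        b = np.zeros(order + 1)
        f[0] = b[0] = x[t]
        for m in range(order):
            f[m + 1] = f[m] + k[m] * b_prev[m]
            b[m + 1] = b_prev[m] + k[m] * f[m]
            # p-norm gradient of the stage cost w.r.t. k[m]
            grad = (np.abs(f[m + 1]) ** (p - 1) * np.sign(f[m + 1]) * b_prev[m]
                    + np.abs(b[m + 1]) ** (p - 1) * np.sign(b[m + 1]) * f[m])
            k[m] -= mu * grad
            k[m] = np.clip(k[m], -0.999, 0.999)  # keep the lattice stable
        b_prev = b
        k_hist[t] = k
    return k_hist
```

For a first-order autoregressive input x[t] = 0.8 x[t-1] + w[t], the first reflection coefficient should settle near -0.8 under this sign convention, since stage one minimizes the power of x[t] + k[0] x[t-1].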
Abstract:
This thesis addresses the contemporary issue of the control, restoration and potential for reuse of State Government-owned heritage properties with commercial potential. It attempts to reconcile the sometimes competing interests of the range of stakeholders in such properties, particularly those seeking to maximise economic performance and return on one hand and community expectations for heritage preservation and exhibition on the other. The matters are approached principally from the Government's position as asset owner/manager. It includes research into a number of key elements - including statutory, physical and economic parameters and an analysis of the legitimate requirements of all stakeholders. The thesis also recognises the need for innovation in approach and for the careful structuring and pre-planning of proposals on a project-by-project basis. On the matter of innovation, four case studies are included in the thesis to exhibit some approaches and techniques that have already been employed in addressing these issues. From this research base, a series of deductions at both a macro and micro level are established and a model for a rational decision-making process for dealing with such projects is developed as a major outcome of the work. Finally, the general model is applied to a specific project, the currently unused Port Office heritage site in the Brisbane Central Business District.
Abstract:
Australia is a land without haunted castles or subterranean corridors, without ancient graveyards or decaying monasteries, a land whose climate is rarely gloomy. Yet, the literary landscape is splattered with shades of the Gothic genre. This Gothic heritage is especially evident within elements of nineteenth century Australian sensation fiction. Australian crime fiction in the twentieth century, in keeping with this lineage, repeatedly employs elements of the Gothic, adapting and appropriating these conventions for literary effect. I believe that a ‘mélange’ of historical Gothic crime traditions could produce an exciting new mode of Gothic crime writing in the Australian context. As such, I have written a contemporary literary experiment in a Gothic crime ‘hybrid’ style: this novella forms my creative practice. The accompanying exegesis is a critical study of a selection of Australian literary works that exhibit the characteristics of both Gothic and crime genres. Through an analysis of these creative works, this study argues that the interlacing of Gothic traditions with crime writing conventions has been a noteworthy practice in Australian fiction during both the nineteenth and twentieth centuries and these literary tropes are interwoven in the writing of ‘The Candidate’, a Gothic crime novella.
Abstract:
Monotony has been identified as a contributing factor to road crashes. Drivers’ ability to react to unpredictable events deteriorates when they are exposed to highly predictable and uneventful driving tasks, such as driving on Australian rural roads, many of which are monotonous by nature. Highway design in particular attempts to reduce the driver’s task to a mere lane-keeping one. Such a task provides little stimulation and is monotonous, thus affecting the driver’s attention, which is no longer directed towards the road. Inattention contributes to crashes, especially for professional drivers. Monotony has been studied mainly from the endogenous perspective (for instance through sleep deprivation) without taking into account the influence of the task itself (repetitiveness) or the surrounding environment. The aim and novelty of this thesis is to develop a methodology (mathematical framework) able to predict driver lapses of vigilance under monotonous environments in real time, using endogenous and exogenous data collected from the driver, the vehicle and the environment. Existing approaches have tended to neglect the specificity of task monotony, leaving the question of the existence of a “monotonous state” unanswered. Furthermore, the issue of detecting vigilance decrement before it occurs (prediction) has not been investigated in the literature, let alone in real time. A multidisciplinary approach is necessary to explain how vigilance evolves in monotonous conditions. Such an approach needs to draw on psychology, physiology, road safety, computer science and mathematics. The systemic approach proposed in this study is unique with its predictive dimension and allows us to define, in real time, the impacts of monotony on the driver’s ability to drive. Such a methodology is based on mathematical models relating data available in vehicles to the vigilance state of the driver during a monotonous driving task in various environments. 
The model integrates different data measuring the driver’s endogenous and exogenous factors (related to the driver, the vehicle and the surrounding environment). Electroencephalography (EEG) is used to measure driver vigilance since it has been shown to be the most reliable and real-time methodology to assess vigilance level. A variety of mathematical models could provide a framework for such predictions; to find the most accurate, a collection of mathematical models was trained in this thesis and the most reliable identified. The methodology developed in this research is first applied to a theoretically sound measure of sustained attention called Sustained Attention Response to Task (SART) as adapted by Michael (2010), Michael and Meuter (2006, 2007). This experiment induced impairments due to monotony during a vigilance task. Analyses performed in this thesis confirm and extend findings from Michael (2010) that monotony leads to an important vigilance impairment independent of fatigue. This thesis is also the first to show that monotony changes the dynamics of vigilance evolution and tends to create a “monotonous state” characterised by reduced vigilance. Personality traits such as being a low sensation seeker can mitigate this vigilance decrement. It is also evident that lapses in vigilance can be predicted accurately with Bayesian modelling and Neural Networks. This framework was then applied to the driving task by designing a simulated monotonous driving task. The design of such a task requires multidisciplinary knowledge and involved psychologist Rebecca Michael. Monotony was varied through both the road design and the road environment variables. This experiment demonstrated that road monotony can lead to driving impairment. In particular, monotonous road scenery was shown to have more impact than monotonous road design. 
Next, this study identified a variety of surrogate measures that are correlated with vigilance levels obtained from the EEG. Such vigilance states can be predicted with these surrogate measures. This means that vigilance decrement can be detected in a car without the use of an EEG device. Amongst the different mathematical models tested in this thesis, only Neural Networks predicted the vigilance levels accurately. The results of both these experiments provide valuable information about the methodology to predict vigilance decrement. Such an issue is quite complex and requires modelling that can adapt to high inter-individual differences. Only Neural Networks proved accurate in both studies, suggesting that these models are the most likely to be accurate when used on real roads or for further research on vigilance modelling. This research provides a better understanding of the driving task under monotonous conditions. Results demonstrate that mathematical modelling can be used to determine the driver’s vigilance state when driving, using surrogate measures identified during this study. This research has opened up avenues for future research and could result in the development of an in-vehicle device predicting driver vigilance decrement. Such a device could contribute to a reduction in crashes and therefore improve road safety.
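As a minimal illustration of the Neural Network approach reported above, the sketch below trains a one-hidden-layer network on synthetic data. Everything here is hypothetical: the surrogate features (steering variability, lane deviation), the labels, and the network size are stand-ins, since the thesis worked with EEG-derived vigilance states and its own in-vehicle surrogate measures:

```python
import numpy as np

def train_vigilance_net(X, y, hidden=8, lr=1.0, epochs=1000, seed=0):
    """Train a one-hidden-layer sigmoid network with cross-entropy loss
    to classify vigilance state (1 = alert, 0 = vigilance lapse).
    Returns a prediction function giving P(alert)."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, hidden)
    b2 = 0.0
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        h = sig(X @ W1 + b1)                  # hidden activations
        prob = sig(h @ W2 + b2)               # predicted P(alert)
        d2 = (prob - y) / len(y)              # cross-entropy output gradient
        d1 = np.outer(d2, W2) * h * (1 - h)   # backpropagated to hidden layer
        W2 -= lr * (h.T @ d2)
        b2 -= lr * d2.sum()
        W1 -= lr * (X.T @ d1)
        b1 -= lr * d1.sum(axis=0)
    return lambda Xn: sig(sig(Xn @ W1 + b1) @ W2 + b2)

# Hypothetical surrogate measures (e.g. steering-wheel variability and
# lane-position deviation) with a synthetic vigilance label.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
predict = train_vigilance_net(X, y)
```

A real system along these lines would need per-driver calibration, which is consistent with the thesis's observation that the modelling must adapt to high inter-individual differences.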
Abstract:
Value Management (VM) has been proven to provide a structured framework, together with supporting tools and techniques, that facilitates effective decision-making in many types of projects, thus achieving ‘best value’ for clients. It is identified at an international level as a natural career progression for the construction service provider and as an opportunity to develop leading-edge skills. The services offered by contractors and consultants in the construction sector have been expanding. In an increasingly competitive and global marketplace, firms are seeking ways to differentiate their services to ever more knowledgeable and demanding clients. The traditional demarcations have given way, and the old definition of what contractors, designers, engineers and quantity surveyors can, and cannot, do in terms of their market offering has changed. Project management, design, cost and safety consultancy services are being delivered by a diverse range of suppliers. Value management services have been developing in various sectors in industry, from manufacturing to the military and now construction. Given the growing evidence that VM has been successful in delivering value-for-money to the client, VM would appear to be gaining some momentum as an essential management tool in the Malaysian construction sector. The recently issued VM Circular 3/2009 by the Economic Planning Unit Malaysia (EPU) possibly marks a new beginning in public sector client acceptance of the strength of VM in construction. This paper therefore attempts to study the prospects of marketing the benefits of VM by construction service providers, and how it may provide an edge in an increasingly competitive Malaysian construction industry.
Abstract:
Flinders University’s and Queensland University of Technology’s biofuels research interests cover a broad range of activities. Both institutions are seeking to overcome the twin evils of "peak oil" (Hubbert 1949 & 1956) and "global warming" (IPCC 2007, Stern 2006, Alison 2010), through development of Generation 1, 2 and 3 (Gen-1, 2 & 3) biofuels (Clarke 2008, Clarke 2010). This includes development of parallel Chemical Biorefinery, value-added, co-product chemical technologies, which can underpin the commercial viability of the biofuel industry. Whilst there is a focused effort to develop Gen-2 & 3 biofuels, thus avoiding the socially unacceptable use of food-based Gen-1 biofuels, it must also be recognized that, as yet, no country in the world has produced sustainable Gen-2 & 3 biofuel on a commercial basis. For example, in 2008 the United States used 38 billion litres (3.5% of total fuel use) of Gen-1 biofuel; in 2009/2010 this will be 47.5 billion litres (4.5% of fuel use) and in 2018 this has been estimated to rise to 96 billion litres (9% of total US fuel use). Brazil in 2008 produced 24.5 billion litres of ethanol, representing 37.3% of the world’s ethanol use for fuel, and Europe in 2008 produced 11.7 billion litres of biofuel (primarily as biodiesel). Compare this to Australia’s miserly biofuel production in 2008/2009 of 180 million litres of ethanol and 75 million litres of biodiesel, which is 0.4% of our fuel consumption! (Clarke, Graiver and Habibie 2010) To assist in the development of better biofuels technologies in the Asian developing regions, the Australian Government recently awarded the Materials & BioEnergy Group from Flinders University, in partnership with the Queensland University of Technology, an Australian Leadership Award (ALA) Biofuel Fellowship program to train scientists from Indonesia and India in all facets of advanced biofuel technology.
Abstract:
There is an increasing global reliance on the Internet for retrieving information on health, illness, and recovery (Sillence et al, 2007; Laurent et al, 2009; Adams, 2010). People suffering from a vast array of illnesses, conditions, and complaints, as well as healthy travelers seeking advice about safe practices abroad, and teens seeking information about safe sexual practices, are all now more likely to go to the Internet for information than they are to rely solely on a general practitioner or physician (Santor et al, 2007; Moreno et al, 2009; Bartlett et al, 2010). Women in particular seek advice and support online for a number of health-related concerns regarding issues such as puberty, conception, pregnancy, postnatal depression, mothering, breast-cancer recovery, and ageing healthily (van Zutphen, 2008; Raymond et al, 2005). In keeping with this increasing socio-technological trend, the Women’s Health Unit at the Queensland University of Technology (QUT), Brisbane, Australia, introduced the research, design, and development of online information resources for issues affecting the health of Australian women as an assessment item for students in the undergraduate Public Health curriculum. Students were required to research a particular health issue affecting Australian women, including pregnancy, pregnancy terminations, postnatal depression, returning to the work force after having a baby, breast cancer recovery, chronic disease prevention, health and safety for sex-workers, and ageing healthily. Students were required to design and develop websites that supported people living with these conditions, or who were in these situations. The websites were designed for communicating effectively with both women seeking information about their health, and their health practitioners. 
The pedagogical challenge inherent in this exercise was twofold: firstly, to encourage students to develop the skills to design and maintain software for online health forums; and secondly, to challenge public health students to go beyond generating ‘awareness’ and imparting health information to developing a nuanced understanding of the worlds and perspectives of their audiences, who require supportive networks and options that resonate with their restrictions, capabilities, and dispositions. This latter challenge spanned the realms of research, communication, and aesthetic design. This paper firstly discusses the increasing reliance on the Internet by women seeking health-related information and the potential health risks and benefits of this trend. Secondly, it applies a post-structural analysis of the de-centred and mobile female self, as online social ‘spaces’ and networks supersede geographical ‘places’ and hierarchies, with implications for democracy, equality, power, and ultimately women’s health. Thirdly, it depicts the processes (learning reflections) and products (developed websites) created within this Women’s Health Unit by the students. Finally, we review this development in the undergraduate curriculum in terms of the importance of providing students with skills in research, communication, and technology in order to share and implement improved health care and social marketing for women as both recipients and providers of health care in the Internet Age.
Abstract:
This paper investigates the Cooroy Mill community precinct (Sunshine Coast, Queensland), as a case study, seeking to understand the way local dynamics interplay and work with the community strengths to build a governance model of best fit. As we move to an age of ubiquitous computing and creative economies, the definition of public place and its governance take on new dimensions, which – while often utilizing models of the past – will need to acknowledge and adapt to the direction of the future. This paper considers a newly developed community precinct that has been built on three key principles: to foster creative expression with new media, to establish a knowledge economy in a regional area, and to subscribe to principles of community engagement. The study involved qualitative interviews with key stakeholders and a review of common practice models of governance along a spectrum from community control to state control. The paper concludes with a call for governance structures that are locally situated and tailored, inclusive, engaging, dynamic and flexible in order to build community capacity, encourage creativity, and build knowledge economies within emerging digital media cityscapes.