760 results for Primacy Thesis

Relevance:

10.00%

Publisher:

Abstract:

At the historical and conceptual confluences of modernity, technology, and the "human", the texts in our corpus critically negotiate and interrogate the material and symbolic possibilities of the prosthesis, in its phenomenological and speculative aspects: on the subjectivist and conceptualist side, with a philosophy of consciousness (Merleau-Ponty); and on the other, with the epistemologists of the body and historians of knowledge Canguilhem and Foucault. The promising trope of the prosthesis acts on the discursive and non-discursive formations concerning the reconstruction of bodies, where technology becomes the correlate of identity. Technology is humanised on contact with man and, by revealing a superior hybridity, absorbs the human in the same stroke. This work in the sociology of science (Latour, 1989), or alternatively the anthropology of science (Hakken, 2001) or biocultural anthropology (Andrieu, 1993; Andrieu, 2006; Andrieu, 2007a), offers itself as an example of the potential contribution that biological and cultural anthropology can make to reconstructive medicine, and that reconstructive medicine can make to the plasticity of man; biological anthropology concerns us through the biological transformation of the human body by the tool of technology, both in its history of mechanical and plastic reconstruction and in its project of bionic augmentation. We establish an archaeological continuity, in the Foucauldian sense, between the two practices. We question assumptions about the relations between nature and culture, and between biology and social context, and we present a definitional approach to technology, the cornerstone of our theoretical work. The trope of technology, as an adaptive tool of culture in the service of nature, performs a semantic shift by placing itself in the service of a biology to be improved.
One of the keys to our research on the augmentation of the functions and aesthetics of the human body lies in the very redefinition of these relations, and in the impact of the interpenetration of reality and the imaginary on the construction of the scientific object and on the transformation of the human body. To delineate what is at stake in the discourse on the "auto-evolution" of bodies, evolutionary theories are discussed, although they do not represent our speciality. Within the framework of auto-evolution and the bionic augmentation of man, the cultural shaping of the body is exercised through the use of biotechnologies, in epistemological rupture with Darwinian thought, even though the act of evolutionary hybridisation remains inscribed in a design of bionic/genetic maximisation of the human body. We explore the currents of cybernetic thought in their actions of biological transformation of the human body and in the performativity of mutilations. Technology and technique thus appear inseparable from science, and from its social constructionism.


Understanding the complexities involved in the genetics of multifactorial diseases is still a monumental task. In addition to environmental factors that can influence the risk of disease, there are a number of other complicating factors. Genetic variants associated with age of disease onset may differ from those associated with overall risk of disease, and variants may be located in positions that are not consistent with the traditional protein-coding genetic paradigm. Latent variable models are well suited to the analysis of genetic data. A latent variable is one that we do not directly observe, but which is believed to exist or is included for computational or analytic convenience in a model. This thesis presents a mixture of methodological developments utilising latent variables, and results from case studies in genetic epidemiology and comparative genomics. Epidemiological studies have identified a number of environmental risk factors for appendicitis, but the aetiology of this disease, affecting an organ often dismissed as a useless vestige, remains largely a mystery. The effects of smoking on other gastrointestinal disorders are well documented, and in light of this, the thesis investigates the association between smoking and appendicitis through the use of latent variables. By utilising data from a large Australian twin study questionnaire as both cohort and case-control, evidence is found for an association between tobacco smoking and appendicitis. Twin and family studies have also found evidence for the role of heredity in the risk of appendicitis. Results from previous studies are extended here to estimate the heritability of age-at-onset and to account for the effect of smoking. This thesis presents a novel approach for performing a genome-wide variance components linkage analysis on transformed residuals from a Cox regression.
This method finds evidence for a different subset of genes responsible for variation in age at onset than those associated with overall risk of appendicitis. Motivated by increasing evidence of functional activity in regions of the genome once thought of as evolutionary graveyards, this thesis develops a generalisation of the Bayesian multiple changepoint model on aligned DNA sequences to more than two species. This sensitive technique is applied to evaluating the distributions of evolutionary rates, with the finding that they are much more complex than previously apparent. We show strong evidence for at least 9 well-resolved evolutionary rate classes in an alignment of four Drosophila species, and at least 7 classes in an alignment of four mammals, including human. A pattern of enrichment and depletion of genic regions in the profiled segments suggests they are functionally significant, and most likely consist of various functional classes. Furthermore, a method of incorporating alignment characteristics representative of function, such as GC content and type of mutation, into the segmentation model is developed within this thesis. Evidence of fine-structured segmental variation is presented.
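The changepoint idea can be illustrated with a deliberately simplified sketch, far removed from the thesis's Bayesian multi-species model: given a 0/1 mismatch track along a pairwise alignment, find the single split that best separates a slowly evolving segment from a rapidly evolving one by maximising the two-segment Bernoulli log-likelihood. The function names and the data are hypothetical.

```python
import math

def bernoulli_loglik(ones, n):
    """Log-likelihood of a 0/1 segment evaluated at its MLE mismatch rate."""
    if n == 0 or ones == 0 or ones == n:
        return 0.0  # degenerate segments have likelihood 1 at their MLE
    p = ones / n
    return ones * math.log(p) + (n - ones) * math.log(1 - p)

def best_changepoint(mismatches):
    """Return the split index maximising the two-segment likelihood,
    or None if no split beats the single-segment model."""
    n = len(mismatches)
    total = sum(mismatches)
    best_k, best_ll = None, bernoulli_loglik(total, n)  # one-segment baseline
    left = 0
    for k in range(1, n):
        left += mismatches[k - 1]  # mismatches in the first k columns
        ll = bernoulli_loglik(left, k) + bernoulli_loglik(total - left, n - k)
        if ll > best_ll:
            best_k, best_ll = k, ll
    return best_k

# A conserved stretch followed by a fast-evolving one: split found at column 10
track = [0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 1]
```

The thesis's model generalises this in every direction: multiple changepoints, more than two species, and full posterior inference rather than maximum likelihood.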


The literature identifies several models that describe inter-phase mass transfer, the process at the heart of odour emission. While the emission process is complex, and these models may be more or less successful at predicting mass transfer rates, they identify three key variables for a system involving a liquid phase and an air phase in contact with it:

• A concentration (or partial pressure) gradient driving force;
• The fluid dynamic characteristics within the liquid and air phases; and
• The chemical properties of the individual components within the system.

In three applied research projects conducted prior to this study, samples collected with two well-known sampling devices resulted in very different odour emission rates. It was not possible to adequately explain the differences observed. It appeared likely, however, that the sample collection device might have artefact effects on the emission of odorants, i.e. the sampling device appeared to have altered the mass transfer process. This raised the obvious question: where two different emission rates are reported for a single source (differing only in the selection of sampling device), and a credible explanation for the difference in emission rate cannot be provided, which emission rate is correct? This research project aimed to identify the factors that determine odour emission rates, the impact that the characteristics of a sampling device may exert on the key mass transfer variables, and ultimately the impact of the sampling device on the emission rate itself. To meet these objectives, a series of targeted reviews, and laboratory and field investigations, were conducted. Two widely-used, representative devices were chosen to investigate the influence of various parameters on the emission process. These investigations provided insight into the odour emission process generally, and the influence of the sampling device specifically.
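The three variables above can be tied together in a minimal sketch of the classical two-film model; the function, its parameter names and the example values are illustrative assumptions, not quantities taken from this research.

```python
def emission_flux(k_l, k_g, henry, c_liquid, c_gas):
    """Two-film model: flux = overall coefficient x departure from equilibrium.

    k_l, k_g : liquid- and gas-film mass transfer coefficients (m/s)
    henry    : dimensionless Henry's law constant (gas/liquid)
    c_liquid, c_gas : bulk concentrations on either side of the interface (g/m^3)
    """
    # Film resistances add in series: 1/K_L = 1/k_l + 1/(H * k_g)
    overall = 1.0 / (1.0 / k_l + 1.0 / (henry * k_g))
    # Driving force: how far the liquid is from equilibrium with the gas phase
    return overall * (c_liquid - c_gas / henry)

# A sampling device that stills the air above the surface lowers k_g,
# shifting resistance to the gas film and reducing the computed flux.
```

Because a hood or chamber placed over a liquid surface alters the local fluid dynamics (k_l and k_g) and can let c_gas build up in the headspace, this model makes clear why two devices can report different emission rates for the same source.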


Ecological problems are typically multifaceted and need to be addressed from both a scientific and a management perspective. There is a wealth of modelling and simulation software available, each package designed to address a particular aspect of the issue of concern. Choosing the appropriate tool, making sense of the disparate outputs, and taking decisions when little or no empirical data is available are everyday challenges facing the ecologist and environmental manager. Bayesian Networks (BNs) provide a statistical modelling framework that enables analysis and integration of information in its own right, as well as integration of a variety of models addressing different aspects of a common overall problem. There has been increased interest in the use of BNs to model environmental systems and issues of concern. However, the development of more sophisticated BNs, utilising dynamic and object-oriented (OO) features, is still at the frontier of ecological research. Such features are particularly appealing in an ecological context, since the underlying facts are often spatial and temporal in nature. This thesis focuses on an integrated BN approach which facilitates OO modelling. Our research devises a new heuristic method, the Iterative Bayesian Network Development Cycle (IBNDC), for the development of BN models within a multi-field and multi-expert context. Expert elicitation is a popular method used to quantify BNs when data is sparse but expert knowledge is abundant. The resulting BNs need to be substantiated and validated taking this uncertainty into account. Our research demonstrates the application of the IBNDC approach to support these aspects of BN modelling. The complex nature of environmental issues makes them ideal case studies for the proposed integrated approach to modelling.
Moreover, they lend themselves to a series of integrated sub-networks describing different scientific components, combining scientific and management perspectives, or pooling similar contributions developed in different locations by different research groups. In southern Africa the two largest free-ranging cheetah (Acinonyx jubatus) populations are in Namibia and Botswana, where the majority of cheetahs are located outside protected areas. Consequently, cheetah conservation in these two countries is focussed primarily on the free-ranging populations as well as the mitigation of conflict between humans and cheetahs. In contrast, in neighbouring South Africa, the majority of cheetahs are found in fenced reserves. Nonetheless, conflict between humans and cheetahs remains an issue here. Conservation effort in South Africa is also focussed on managing the geographically isolated cheetah populations as one large meta-population. Relocation is one option among a suite of tools used to resolve human-cheetah conflict in southern Africa. Successfully relocating captured problem cheetahs, and maintaining a viable free-ranging cheetah population, are the two environmental issues in cheetah conservation forming the first case study in this thesis. The second case study involves the initiation of blooms of Lyngbya majuscula, a blue-green alga (cyanobacterium), in Deception Bay, Australia. L. majuscula is toxic, and its blooms have severe health, ecological and economic impacts on the communities located in their vicinity. Deception Bay is an important tourist destination given its proximity to Brisbane, Australia's third largest city. Lyngbya is one of several algae whose outbreaks are classed as Harmful Algal Blooms (HABs); this group also includes other widespread blooms such as red tides. The occurrence of Lyngbya blooms is not a local phenomenon: blooms of this toxic weed occur in coastal waters worldwide.
With the increase in frequency and extent of these HABs, it is important to gain a better understanding of the underlying factors contributing to their initiation and sustenance. This knowledge will contribute to better management practices and to the identification of those management actions which could prevent or diminish the severity of these blooms.
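As a toy illustration of how a BN integrates information (with entirely hypothetical structure and probabilities, not numbers from either case study), a two-parent network can be queried by simple enumeration:

```python
# Hypothetical network: Nutrients (N) and Light (L) are parents of Bloom (B).
P_N = 0.3  # P(high nutrient load)        - illustrative prior
P_L = 0.6  # P(high light availability)   - illustrative prior
P_B = {    # P(bloom | N, L)              - illustrative CPT
    (True, True): 0.8,
    (True, False): 0.4,
    (False, True): 0.2,
    (False, False): 0.05,
}

def p_bloom():
    """Marginal P(bloom), enumerating the joint states of the parents."""
    return sum((P_N if n else 1 - P_N) * (P_L if l else 1 - P_L) * P_B[(n, l)]
               for n in (True, False) for l in (True, False))

def p_nutrients_given_bloom():
    """Diagnostic query P(high nutrients | bloom) via Bayes' rule."""
    joint = P_N * sum((P_L if l else 1 - P_L) * P_B[(True, l)]
                      for l in (True, False))
    return joint / p_bloom()
```

Real environmental BNs, including the dynamic and OO variants discussed above, delegate this enumeration to inference engines, but the underlying arithmetic is exactly this.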


This thesis critically analyses sperm donation practices from a child-centred perspective. It examines the effects, both personal and social, of disrupting the unity of biological and social relatedness in families affected by donor conception. It examines how disruption is facilitated by a process of mediation, which is detailed using a model provided by Sunderland (2002). This model identifies four mediating movements (alienation, translation, re-contextualisation and absorption) which help to explain the powerful and dominating material, social and political processes at work in biotechnology, or in reproductive technology in this case. The understanding of such movements and of the mediation of meanings is inspired by the complementary work of Silverstone (1999) and Sunderland. This model allows for a more critical appreciation of the movement of meaning from previously inalienable aspects of life to alienable products through biotechnology (Sunderland, 2002). Once this mediation in donor conception is subjected to critical examination here, it is then approached from different angles of investigation. The thesis posits that two conflicting notions of the self are being applied to fertility-frustrated adults and to the offspring of reproductive interventions. Adults using reproductive interventions receive support to maximise their genetic continuity, but in so doing they create and dismiss the corresponding genetic discontinuity produced for the offspring. The offspring’s kinship and identity are then framed through an experimental postmodernist notion, presenting them as social rather than innate constructs. The adults using the reproductive intervention, on the other hand, have their identity and kinship continuity framed and supported as normative, innate, and based on genetic connection.
This use of shifting frameworks is presented as unjust and harmful, creating double standards and a corrosion of kinship values, connection and intelligibility between generations; indeed, it is put forward as adult-centric. The analysis of other forms of human kinship dislocation provided by this thesis explores an under-utilised resource which is used to counter the commonly held opinion that any disruption of social and genetic relatedness for donor offspring is insignificant. The experiences of adoption and the stolen generations are used to inform understanding of the personal and social effects of such kinship disruption and potential reunion for donor offspring. These examples, along with laws governing international human rights, further strengthen the appeal here for normative principles and protections based on collective knowledge and standards to be applied to children of reproductive technology. The thesis presents the argument that the framing and regulation of reproductive technology is excessively influenced by industry providers and users. The interests of these parties collide with and corrode any accurate assessments and protections afforded to the children of reproductive technology. The thesis seeks to counter such encroachments and concludes by presenting these protections, frameworks, and human experiences as resources which can help to address the problems created for the offspring of such reproductive interventions, thereby illustrating why these reproductive interventions should be discontinued.


An Approach with Vertical Guidance (APV) is an instrument approach procedure which provides horizontal and vertical guidance to a pilot on approach to landing in reduced visibility conditions. APV approaches can greatly reduce the safety risk to general aviation by improving the pilot’s situational awareness. In particular, the incidence of Controlled Flight Into Terrain (CFIT), which has featured in a number of fatal general aviation crashes in Australia over the past decade, can be reduced. APV approaches can also improve general aviation operations. If implemented at Australian airports, APV approach procedures are expected to bring a cost saving of millions of dollars to the economy due to fewer missed approaches and diversions, and an increased safety benefit. The provision of accurate horizontal and vertical guidance is achievable using the Global Positioning System (GPS). Because aviation is a safety-of-life application, an aviation-certified GPS receiver must have integrity monitoring or augmentation to ensure that its navigation solution can be trusted. However, the difficulty of the current GPS satellite constellation alone meeting APV integrity requirements, the susceptibility of GPS to jamming or interference, and the potential shortcomings of proposed augmentation solutions for Australia such as the Ground-based Regional Augmentation System (GRAS) justify the investigation of Aircraft Based Augmentation Systems (ABAS) as an alternative integrity solution for general aviation. ABAS augments GPS with other sensors at the aircraft to help it meet the integrity requirements. Typical ABAS designs assume high-quality inertial sensors to provide an accurate reference trajectory for Kalman filters. Unfortunately, high-quality inertial sensors are too expensive for general aviation.
In contrast to these approaches, the purpose of this research is to investigate fusing GPS with lower-cost Micro-Electro-Mechanical System (MEMS) Inertial Measurement Units (IMU) and a mathematical model of aircraft dynamics, referred to as an Aircraft Dynamic Model (ADM) in this thesis. Using a model of aircraft dynamics in navigation systems has been studied before in the literature and shown to be useful, particularly for aiding inertial coasting or attitude determination. In contrast to these applications, this thesis investigates its use in ABAS. This thesis presents an ABAS architecture concept which makes use of a MEMS IMU and ADM, named the General Aviation GPS Integrity System (GAGIS) for convenience. GAGIS includes a GPS, MEMS IMU, ADM, a bank of Extended Kalman Filters (EKF), and uses the Normalized Solution Separation (NSS) method for fault detection. The GPS, IMU and ADM information is fused together in a tightly-coupled configuration, with frequent GPS updates applied to correct the IMU and ADM. The use of both IMU and ADM allows for a number of different possible configurations. Three are investigated in this thesis: a GPS-IMU EKF, a GPS-ADM EKF and a GPS-IMU-ADM EKF. The integrity monitoring performance of the GPS-IMU EKF, GPS-ADM EKF and GPS-IMU-ADM EKF architectures is compared against each other and against a stand-alone GPS architecture in a series of computer simulation tests of an APV approach. Typical GPS, IMU, ADM and environmental errors are simulated. The simulation results show the GPS integrity monitoring performance achievable by augmenting GPS with an ADM and low-cost IMU for a general aviation aircraft on an APV approach. A contribution to research is made in determining whether a low-cost IMU or ADM can provide improved integrity monitoring performance over stand-alone GPS.
It is found that a reduction of approximately 50% in protection levels is possible using the GPS-IMU EKF or GPS-ADM EKF as well as faster detection of a slowly growing ramp fault on a GPS pseudorange measurement. A second contribution is made in determining how augmenting GPS with an ADM compares to using a low-cost IMU. By comparing the results for the GPS-ADM EKF against the GPS-IMU EKF it is found that protection levels for the GPS-ADM EKF were only approximately 2% higher. This indicates that the GPS-ADM EKF may potentially replace the GPS-IMU EKF for integrity monitoring should the IMU ever fail. In this way the ADM may contribute to the navigation system robustness and redundancy. To investigate this further, a third contribution is made in determining whether or not the ADM can function as an IMU replacement to improve navigation system redundancy by investigating the case of three IMU accelerometers failing. It is found that the failed IMU measurements may be supplemented by the ADM and adequate integrity monitoring performance achieved. Besides treating the IMU and ADM separately as in the GPS-IMU EKF and GPS-ADM EKF, a fourth contribution is made in investigating the possibility of fusing the IMU and ADM information together to achieve greater performance than either alone. This is investigated using the GPS-IMU-ADM EKF. It is found that the GPS-IMU-ADM EKF can achieve protection levels approximately 3% lower in the horizontal and 6% lower in the vertical than a GPS-IMU EKF. However this small improvement may not justify the complexity of fusing the IMU with an ADM in practical systems. Affordable ABAS in general aviation may enhance existing GPS-only fault detection solutions or help overcome any outages in augmentation systems such as the Ground-based Regional Augmentation System (GRAS). 
Countries such as Australia which currently do not have an augmentation solution for general aviation could especially benefit from the economic savings and safety benefits of satellite navigation-based APV approaches.
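The predict/correct cycle underlying all of the EKF configurations can be caricatured in one dimension. This hedged sketch shows only the generic Kalman mechanics: it is not the thesis's tightly-coupled EKF bank or its NSS fault detector, and all numbers are invented.

```python
def predict(x, p, u, q):
    """Time update: propagate the state with IMU/ADM-derived motion u,
    inflating the variance p by the process noise q."""
    return x + u, p + q

def update(x, p, z, r):
    """Measurement update: blend the prediction (variance p) with a
    GPS-style measurement z (variance r)."""
    gain = p / (p + r)          # Kalman gain: trust in measurement vs model
    x_new = x + gain * (z - x)  # corrected state
    p_new = (1 - gain) * p      # uncertainty shrinks after the update
    return x_new, p_new, gain

# One cycle: dead-reckon one unit, then correct with a GPS fix at 2.0
x, p = predict(0.0, 1.0, u=1.0, q=0.5)
x, p, k = update(x, p, z=2.0, r=0.5)
```

Integrity monitoring schemes such as solution separation compare states estimated from different sensor subsets; if, say, an ADM-aided and an IMU-aided solution diverge beyond a threshold derived from these variances, a fault is declared.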


Road and highway infrastructure provides the backbone for a nation’s economic growth. The wide dispersion of Australia’s population and its resource boom, coupled with improved living standards and growing societal expectations, call for continuing development and improvement of road infrastructure under current local, state and federal governments’ policies and strategic plans. As road infrastructure projects involve huge resources and mechanisms, achieving sustainability not only on economic scales but also through environmental and social responsibility becomes a crucial issue. While sustainability is a logical link to infrastructure development, a literature study and consultation with the industry found that there is a lack of common understanding of what constitutes sustainability in the infrastructure context. Its priorities are often interpreted differently among multiple stakeholders. For road infrastructure projects, which typically span long periods of time, achieving tangible sustainability outcomes during the lifecycle of development remains a formidable task. Sustainable development initiatives often remain ideological, as in macro-level policies and broad-based concepts. There was little elaboration, and few exemplar cases, on how these policies and concepts can be translated into practical decision-making during project implementation. In contrast, there seemed to be an over-commitment to research and development of sustainability assessment methods and tools. Between the two positions there is a perception-reality gap and mismatch, specifically on how to enhance sustainability deliverables during infrastructure project delivery. A review of past research in this industry sector also found that little has been done to promote sustainable road infrastructure development; this has wide and varied potential impacts.
This research identified the common perceptions and expectations held by different stakeholders towards achieving sustainability in road and highway infrastructure projects. Face-to-face interviews with selected representatives of these stakeholders were carried out in order to select, categorise, confirm and prioritise a list of sustainability performance targets identified through literature and past research. A Delphi study was conducted with the assistance of a panel of senior industry professionals and academic experts, which further considered the interrelationship and influence of the sustainability indicators, and identified critical sustainability indicators under ten critical sustainability criteria (Environmental; Health & Safety; Resource Utilization & Management; Social & Cultural; Economic; Public Governance & Community Engagement; Relations Management; Engineering; Institutional; and Project Management). This presented critical sustainability issues that needed to be addressed at the project level. Accordingly, exemplar highway development projects were used as case studies to elicit solutions for the critical issues. Through the identification and integration of the different perceptions and priority needs of the stakeholders, as well as key sustainability indicators and solutions for critical issues, a set of decision-making guidelines was developed to promote and drive consistent sustainability deliverables in road infrastructure projects.


Automatic recognition of people is an active field of research with important forensic and security applications. In these applications it is not always possible for the subject to be in close proximity to the system. Voice represents a human behavioural trait which can be used to recognise people in such situations. Automatic Speaker Verification (ASV) is the process of verifying a person’s identity through the analysis of their speech, and enables recognition of a subject at a distance over a telephone channel, wired or wireless. A significant amount of research has focussed on the application of Gaussian mixture model (GMM) techniques to speaker verification systems, providing state-of-the-art performance. GMMs are a type of generative classifier trained to model the probability distribution of the features used to represent a speaker. Recently introduced to the field of ASV research is the support vector machine (SVM). An SVM is a discriminative classifier requiring examples from both positive and negative classes to train a speaker model. The SVM is based on margin maximisation, whereby a hyperplane attempts to separate classes in a high-dimensional space. SVMs applied to the task of speaker verification have shown high potential, particularly when used to complement current GMM-based techniques in hybrid systems. This work aims to improve the performance of ASV systems using novel and innovative SVM-based techniques. Research was divided into three main themes: session variability compensation for SVMs; unsupervised model adaptation; and impostor dataset selection. The first theme investigated the differences between the GMM and SVM domains for the modelling of session variability, an aspect crucial for robust speaker verification. Techniques developed to improve the robustness of GMM-based classification were shown to bring about similar benefits to discriminative SVM classification through their integration in the hybrid GMM mean supervector SVM classifier.
Further, the domains for the modelling of session variation were contrasted to find a number of common factors; however, the SVM domain consistently provided marginally better session variation compensation. Minimal complementary information was found between the techniques, due to the similarities in how they achieved their objectives. The second theme saw the proposal of a novel model for the purpose of session variation compensation in ASV systems. Continuous progressive model adaptation attempts to improve speaker models by retraining them after exploiting all encountered test utterances during normal use of the system. The introduction of the weight-based factor analysis model provided significant performance improvements of over 60% in an unsupervised scenario. SVM-based classification was then integrated into the progressive system, providing further benefits in performance over the GMM counterpart. Analysis demonstrated that SVMs also hold several characteristics beneficial to the task of unsupervised model adaptation, prompting further research in the area. In pursuing the final theme, an innovative background dataset selection technique was developed. This technique selects the most appropriate subset of examples from a large and diverse set of candidate impostor observations for use as the SVM background, by exploiting the SVM training process. This selection was performed on a per-observation basis so as to overcome the shortcomings of the traditional heuristic-based approach to dataset selection. Results demonstrate that the approach provides performance improvements over both the use of the complete candidate dataset and the best heuristically selected dataset, while using only a fraction of their size. The refined dataset was also shown to generalise well to unseen corpora and to be highly applicable to the selection of the impostor cohorts required in alternative speaker verification techniques.
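The GMM mean supervector representation at the heart of the hybrid classifier can be sketched as follows. This is a hypothetical, pared-down fragment: frames are hard-assigned to their nearest component for brevity, whereas real systems use posterior-weighted MAP adaptation, and the resulting supervector would feed a linear SVM.

```python
def map_adapt_means(ubm_means, frames, relevance=16.0):
    """Shift UBM component means towards the speaker's data, with the
    relevance factor limiting how far sparse data can move each mean."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    # Hard-assign each feature frame to its nearest UBM component
    groups = {c: [] for c in range(len(ubm_means))}
    for f in frames:
        nearest = min(groups, key=lambda c: dist2(f, ubm_means[c]))
        groups[nearest].append(f)
    adapted = []
    for c, mean in enumerate(ubm_means):
        sel = groups[c]
        if sel:
            n = len(sel)
            alpha = n / (n + relevance)  # more data -> trust the speaker more
            centroid = [sum(col) / n for col in zip(*sel)]
            adapted.append([alpha * s + (1 - alpha) * m
                            for s, m in zip(centroid, mean)])
        else:
            adapted.append(list(mean))  # unseen components keep the UBM mean
    return adapted

def supervector(means):
    """Stack adapted component means into one long vector for a linear SVM."""
    return [x for mean in means for x in mean]
```

Two utterances from the same speaker pull the same component means in the same direction, so their supervectors land close together in the SVM's input space.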


The homeless have been subject to considerable scrutiny, historically and within current social, political and public discourse. The aetiology of homelessness has been the focus of a large body of economic, sociological, historical and political investigation. Importantly, efforts to conceptualise, explain and measure the phenomenon of homelessness and homeless people have occurred largely within the context of defining “the problem of the homeless” and the generation of solutions to the ‘problem’. There has been little consideration of how and why homelessness has come to be seen, or understood, as a problem, or how this can change across time and/or place. This alternative stream of research has focused on tracing and analysing the relationship between how people experiencing homelessness have become a matter of government concern and the manner in which homelessness itself has been problematised. With this in mind, this study has analysed the discourses - political, social and economic rationalities and knowledges - which have provided the conditions of possibility for the identification of the homeless and homelessness as a problem needing to be governed, and the means for translating these discourses into the applied domain. The aim of this thesis has been to contribute to current knowledge by developing a genealogy of the conditions and rationalities that have underpinned the problematisation of homelessness and the homeless. The outcome of this analysis has been to open up the opportunity to consider alternative governmental possibilities, arising from the exposure of the way in which contemporary problematisation and responses have been influenced by the past. An understanding of this process creates an ability to appreciate the intended and unintended consequences for the future direction of public policy and contemporary research.


Ultraviolet radiation (UV) is the carcinogen that causes the most common malignancy in humans – skin cancer. However, moderate UV exposure is essential for producing vitamin D in our skin. Vitamin D increases the absorption of calcium from the diet, and adequate calcium is necessary for the building and maintenance of bones. Thus, low levels of vitamin D can cause osteomalacia and rickets and contribute to osteoporosis. Emerging evidence also suggests vitamin D may protect against falls, internal cancers, psychiatric conditions, autoimmune diseases and cardiovascular diseases. Since the dominant source of vitamin D is sunlight exposure, there is a need to understand what is a “balanced” level of sun exposure to maintain an adequate level of vitamin D but minimise the risks of eye damage, skin damage and skin cancer resulting from excessive UV exposure. There are many steps in the pathway from incoming solar UV to the eventual vitamin D status of humans (measured as 25-hydroxyvitamin D in the blood), and our knowledge about many of these steps is currently incomplete. This project begins by investigating the levels of UV available for synthesising vitamin D, and how these levels vary across seasons, latitudes and times of the day. The thesis then covers experiments conducted with an in vitro model, which was developed to study several aspects of vitamin D synthesis. Results from the model suggest the relationship between UV dose and vitamin D is not linear. This is an important input into public health messages regarding ‘safe’ UV exposure: larger doses of UV, beyond a certain limit, may not continue to produce vitamin D; however, they will increase the risk of skin cancers and eye damage. The model also showed that, when given identical doses of UV, the amount of vitamin D produced was impacted by temperature. In humans, a temperature-dependent reaction must occur in the top layers of human skin, prior to vitamin D entering the bloodstream.
This raises the hypothesis that cooler temperatures (occurring in winter and at high latitudes) may reduce vitamin D production in humans. Finally, the model has also been used to study the wavelengths of UV thought to be responsible for producing vitamin D. It appears that vitamin D production is limited to a small range of UV wavelengths, which may be narrower than previously thought. Together, these results suggest that further research is needed into the ability of humans to synthesise vitamin D from sunlight. In particular, more information is needed about the dose–response relationship in humans and about the proposed impact of temperature. An accurate action spectrum will also be essential for measuring the available levels of vitamin D-effective UV. As this research continues, it will contribute to the scientific evidence base needed for devising a public health message that balances the risks of excessive UV exposure with maintaining adequate vitamin D.
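The saturating dose–response described above can be illustrated with a simple plateau model. Note that the exponential form and the parameter values below are illustrative assumptions for exposition, not the fitted model from the thesis:

```python
import math

def vitamin_d_yield(uv_dose, v_max=1.0, k=0.5):
    """Illustrative saturating dose-response: vitamin D yield approaches a
    plateau as UV dose increases, so extra dose beyond the knee of the curve
    adds little benefit while UV-related risks keep accumulating."""
    return v_max * (1.0 - math.exp(-k * uv_dose))

# Doubling the dose at low exposure roughly doubles the yield; doubling it
# again near the plateau adds almost nothing.
for dose in (1, 2, 4, 8, 16):
    print(dose, round(vitamin_d_yield(dose), 3))
```

Under such a model, a public health message could target the knee of the curve: enough exposure to approach the plateau, and no more.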

Resumo:

Frontline employee behaviours are recognised as vital for achieving a competitive advantage for service organisations. The services marketing literature has comprehensively examined ways to improve frontline employee behaviours in service delivery and recovery. However, limited attention has been paid to frontline employee behaviours that favour customers in ways that go against organisational norms or rules. This study examines these behaviours by introducing a behavioural concept of Customer-Oriented Deviance (COD). COD is defined as “frontline employees exhibiting extra-role behaviours that they perceive to defy existing expectations or prescribed rules of higher authority through service adaptation, communication and use of resources to benefit customers during interpersonal service encounters.” This thesis develops a COD measure and examines the key determinants of these behaviours from a frontline employee perspective. Existing research on similar behaviours, which originated in the positive deviance and pro-social behaviour domains, has limitations and is considered inadequate for examining COD in the services context. The absence of a well-developed body of knowledge on non-conforming service behaviours has implications for both theory and practice. The provision of ‘special favours’ increases customer satisfaction, but the over-servicing of customers is also counterproductive for service delivery and costly for the organisation. Despite these implications of non-conforming service behaviours, there is little understanding of the nature of these behaviours and their key drivers. This research addresses inadequacies in the prior positive deviance, pro-social and pro-customer literature to develop the theoretical foundation of COD. The concept of positive deviance, which has predominantly been used to study organisational behaviours, is applied within a services marketing setting. 
Further, it addresses previous limitations in the pro-social and pro-customer behavioural literature, which has examined limited forms of behaviours with no clear understanding of their nature. Building upon these literature streams, this research adopts a holistic approach to the conceptualisation of COD. It addresses previous shortcomings in the literature by providing a well-bounded definition, a psychometrically sound measure of COD and a conceptually well-founded model of COD. The concept of COD was examined across three separate studies, drawing on the theoretical foundations of role theory and social identity theory. Study 1 was exploratory and based on in-depth interviews using the Critical Incident Technique (CIT). The aim of Study 1 was to understand the nature of COD and qualitatively identify its key drivers. Thematic analysis was conducted on the data, and two potential dimensions of COD behaviours, Deviant Service Adaptation (DSA) and Deviant Service Communication (DSC), were revealed in the analysis. In addition, themes representing the potential influences on COD were broadly classified as individual factors, situational factors and organisational factors. Study 2 was a scale development procedure that involved the generation and purification of items for the measure based on two student samples working in customer service roles (pilot sample, N=278; initial validation sample, N=231). The results of the reliability analyses and Exploratory Factor Analysis (EFA) on the pilot sample suggested the scale had poor psychometric properties. As a result, major revisions were made to item wordings, and new items were developed based on the literature to reflect a new dimension, Deviant Use of Resources (DUR). The revised items were tested on the initial validation sample, with the EFA suggesting a four-factor structure of COD. 
The aim of Study 3 was to further purify the COD measure and test for nomological validity based on its theoretical relationships with key antecedents and similar constructs (key correlates). The theoretical model of COD, consisting of nine hypotheses, was tested on retail and hospitality samples of frontline employees (retail N=311; hospitality N=305) drawn from a market research panel using an online survey. The data were analysed using Structural Equation Modelling (SEM). The results provided support for a re-specified second-order three-factor model of COD consisting of 11 items. Overall, the COD measure was found to be reliable and valid, demonstrating convergent validity, discriminant validity and marginal partial invariance for the factor loadings. The results supported nomological validity, although the antecedents had differing impacts on COD across samples. Specifically, empathy and perspective-taking, role conflict, and job autonomy significantly influenced COD in the retail sample, whereas empathy and perspective-taking, risk-taking propensity and role conflict were significant predictors in the hospitality sample. In addition, customer orientation-selling orientation, the altruistic dimension of organisational citizenship behaviours, workplace deviance, and social desirability responding were found to correlate with COD. This research makes several contributions to theory. First, the findings of this thesis extend the literature on positive deviance, pro-social and pro-customer behaviours. Second, the research provides an empirically tested model that describes the antecedents of COD. Third, it contributes a reliable and valid measure of COD. Finally, it investigates the differential effects of the key antecedents on COD in different service sectors. The research findings also contribute to services marketing practice. 
Based on the research findings, service practitioners can better understand the phenomenon of COD and use the measurement tool to calibrate COD levels within their organisations. Knowledge of the key determinants of COD will help improve recruitment and training programs and drive internal initiatives within the firm.
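The scale-purification step described above hinges on internal-consistency reliability. A minimal sketch of the standard check, Cronbach's alpha, is given below; the synthetic item responses and the four-item design are invented for illustration and do not reproduce the thesis's COD data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Simulate 200 respondents answering 4 items that share one latent trait
# (e.g. a COD propensity) plus item-specific noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
items = latent + 0.8 * rng.normal(size=(200, 4))
print(round(cronbach_alpha(items), 2))
```

Items whose removal raises alpha are candidates for deletion, which is the kind of decision the pilot-sample purification in Study 2 involves.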

Resumo:

There is a need in industry for a commodity polyethylene film with controllable degradation properties that will degrade in an environmentally neutral way, for applications such as shopping bags and packaging film. Additives such as starch have been shown to accelerate the degradation of plastic films; however, control of degradation is required so that the film retains its mechanical properties during storage and use, and then degrades when no longer required. Through the addition of a photocatalyst, it is hoped that the polymer film will break down with exposure to sunlight. Furthermore, it is desired that the polymer film will degrade in the dark after a short initial exposure to sunlight. Research has been undertaken into the photo- and thermo-oxidative degradation processes of 25 µm thick LLDPE (linear low density polyethylene) film containing titania from different manufacturers. Films were aged in a suntest or in an oven at 50 °C, and the formation of oxidation products was followed using IR spectroscopy. Degussa P25, Kronos 1002, and various organic-modified and doped titanias of the Sachtleben Hombitan and Huntsman Tioxide types incorporated into LLDPE films were assessed for photoactivity. Degussa P25 was found to be the most photoactive under UVA and UVC exposure. Surface modification of titania was found to reduce photoactivity. Crystal phase is thought to be among the most important factors when assessing the photoactivity of titania as a photocatalyst for degradation. Pre-irradiation with UVA or UVC for 24 hours of the film containing 3% Degussa P25 titania, prior to ageing in an oven, resulted in embrittlement in ca. 200 days. The multivariate data analysis technique PCA (principal component analysis) was used as an exploratory tool to investigate the IR spectral data. Oxidation products formed in similar relative concentrations across all samples, confirming that titania was catalysing the oxidation of the LLDPE film without changing the oxidation pathway. 
PCA was also employed to compare rates of degradation in different films. PCA enabled the discovery of water vapour trapped inside cavities formed around titania particles during oxidation. Imaging ATR/FTIR spectroscopy with high lateral resolution was used in a novel experiment to examine the heterogeneous nature of the oxidation of a model polymer compound caused by the presence of titania particles. A model polymer containing Degussa P25 titania was solvent-cast onto the internal reflection element of the imaging ATR/FTIR instrument, and its oxidation under UVC was examined over time. Sensitisation of 5 µm domains by titania resulted in areas of relatively high oxidation product concentration. The suitability of transmission IR with a synchrotron light source for the study of polymer film oxidation was assessed at the Australian Synchrotron in Melbourne, Australia. Challenges such as interference fringes and a poor signal-to-noise ratio need to be addressed before this can become a routine technique.
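The exploratory use of PCA on IR spectra can be sketched as follows. The spectra here are synthetic (a single growing carbonyl-like band at an assumed 1715 cm⁻¹ plus noise), standing in for the real suntest/oven series; the point is only the mechanics of mean-centring, decomposition and variance attribution:

```python
import numpy as np

# Build a synthetic series of 30 IR spectra in which a Gaussian "carbonyl"
# band grows with ageing time -- a stand-in for oxidation product formation.
rng = np.random.default_rng(1)
wavenumbers = np.linspace(1600, 1800, 120)
carbonyl = np.exp(-((wavenumbers - 1715) / 15) ** 2)

ageing_time = np.linspace(0, 1, 30)
spectra = np.outer(ageing_time, carbonyl) + 0.01 * rng.normal(size=(30, 120))

# PCA via SVD of the mean-centred (samples x wavenumbers) matrix.
centred = spectra - spectra.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
scores = U * S                      # PC scores: one point per spectrum
explained = S**2 / (S**2).sum()     # fraction of variance per component
print(round(explained[0], 2))       # PC1 should dominate here
```

Plotting the PC1 scores against ageing time is the kind of view that lets degradation rates of different films be compared on one axis.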

Resumo:

In the quest for shorter time-to-market, higher quality and reduced cost, model-driven software development has emerged as a promising approach to software engineering. The central idea is to promote models to first-class citizens in the development process. Starting from a set of very abstract models in the early stage of the development, they are refined into more concrete models and finally, as a last step, into code. As early phases of development focus on different concepts compared to later stages, various modelling languages are employed to most accurately capture the concepts and relations under discussion. In light of this refinement process, translating between modelling languages becomes a time-consuming and error-prone necessity. This is remedied by model transformations providing support for reusing and automating recurring translation efforts. These transformations typically can only be used to translate a source model into a target model, but not vice versa. This poses a problem if the target model is subject to change. In this case the models get out of sync and therefore do not constitute a coherent description of the software system anymore, leading to erroneous results in later stages. This is a serious threat to the promised benefits of quality, cost-saving, and time-to-market. Therefore, providing a means to restore synchronisation after changes to models is crucial if the model-driven vision is to be realised. This process of reflecting changes made to a target model back to the source model is commonly known as Round-Trip Engineering (RTE). While there are a number of approaches to this problem, they impose restrictions on the nature of the model transformation. Typically, in order for a transformation to be reversed, for every change to the target model there must be exactly one change to the source model. 
While this makes synchronisation relatively “easy”, it is ill-suited for many practically relevant transformations as they do not have this one-to-one character. To overcome these issues and to provide a more general approach to RTE, this thesis puts forward an approach in two stages. First, a formal understanding of model synchronisation on the basis of non-injective transformations (where a number of different source models can correspond to the same target model) is established. Second, detailed techniques are devised that allow the implementation of this understanding of synchronisation. A formal underpinning for these techniques is drawn from abductive logic reasoning, which allows the inference of explanations from an observation in the context of a background theory. As non-injective transformations are the subject of this research, there might be a number of changes to the source model that all equally reflect a certain target model change. To help guide the procedure in finding “good” source changes, model metrics and heuristics are investigated. Combining abductive reasoning with best-first search and a “suitable” heuristic enables efficient computation of a number of “good” source changes. With this procedure Round-Trip Engineering of non-injective transformations can be supported.
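The combination of abductive candidate generation with best-first search can be sketched on a toy non-injective transformation. Everything below (the dict-based models, the sum transformation, the edit-distance heuristic) is an invented stand-in for the thesis's actual model representations and metrics:

```python
import heapq

def transform(source: dict) -> dict:
    """Toy non-injective transformation: the target only sees the total,
    so all sources with the same a+b are conflated."""
    return {"total": source["a"] + source["b"]}

def candidate_sources(new_target: dict):
    """Abductive step: enumerate source models that would *explain* the
    observed target, i.e. every (a, b) split of the new total."""
    total = new_target["total"]
    for a in range(total + 1):
        yield {"a": a, "b": total - a}

def edit_cost(old: dict, new: dict) -> int:
    """Heuristic: prefer explanations that change the old source least."""
    return sum(abs(old[k] - new[k]) for k in old)

def synchronise(source: dict, new_target: dict, k: int = 3):
    """Best-first search: return the k cheapest source models whose image
    under the transformation matches the changed target."""
    heap = [(edit_cost(source, c), i, c)
            for i, c in enumerate(candidate_sources(new_target))]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[2] for _ in range(min(k, len(heap)))]

# The target changed from total=5 to total=7; several source edits explain it.
old_source = {"a": 2, "b": 3}
for s in synchronise(old_source, {"total": 7}):
    assert transform(s) == {"total": 7}
    print(s)
```

The heuristic ranks the equally valid explanations, which mirrors the thesis's use of metrics to pick "good" source changes among the many that a non-injective transformation admits.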

Resumo:

The call to innovate is ubiquitous across the Australian educational policy context. The claims of innovative practices and environments that occur frequently in university mission statements, strategic plans and marketing literature suggest that this exhortation to innovate appears to have been taken up enthusiastically by the university sector. Throughout the history of universities, a range of reported deficiencies of higher education have worked to produce a notion of crisis. At present, it would seem that innovation is positioned as the solution to the notion of crisis. This thesis is an inquiry into how the insistence on innovation works to both enable and constrain teaching and learning practices in Australian universities. Alongside the interplay between innovation and crisis is the link between resistance and innovation, a link which remains largely unproblematized in the scholarly literature. This thesis works to locate and unsettle understandings of a relationship between innovation and Australian higher education. The aim of this inquiry is to generate new understandings of what counts as innovation within this context and how innovation is enacted. The thesis draws on a number of postmodernist theorists, whose works have informed firstly the research method, and then the analysis and findings. Firstly, there is an assumption that power is capillary and works through discourse to enact power relations which shape certain truths (Foucault, 1990). Secondly, this research scrutinised language practices which frame the capacity for individuals to act, alongside the language practices which encourage an individual to adopt certain attitudes and actions as one’s own (Foucault, 1988). Thirdly, innovation talk is read in this thesis as an example of needs talk, that is, as a medium through which what is considered domestic, political or economic is made and contested (Fraser, 1989). 
Fourthly, relationships between and within discourses were identified and analysed beyond cause and effect descriptions, and more productively considered to be in a constant state of becoming (Deleuze, 1987). Finally, the use of ironic research methods assisted in producing alternate configurations of innovation talk which are useful and new (Rorty, 1989). The theoretical assumptions which underpin this thesis inform a document analysis methodology, used to examine how certain texts work to shape the ways in which innovation is constructed. The data consisted of three Federal higher education funding policies selected on the rationale that these documents, as opposed to state or locally based policy and legislation, represent the only shared policy context for all Australian universities. The analysis first provided a modernist reading of the three documents, and this was followed by postmodernist readings of these same policy documents. The modernist reading worked to locate and describe the current truths about innovation. The historical context in which the policy was produced as well as the textual features of the document itself were important to this reading. In the first modernist reading, the binaries involved in producing proper and improper notions of innovation were described and analysed. In the process of the modernist analysis and the subsequent location of binary organisation, a number of conceptual collisions were identified, and these sites of struggle were revisited, through the application of a postmodernist reading. By applying the theories of Rorty (1989) and Fraser (1989) it became possible to not treat these sites as contradictory and requiring resolution, but rather as spaces in which binary tensions are necessary and productive. This postmodernist reading constructed new spaces for refusing and resisting dominant discourses of innovation which value only certain kinds of teaching and learning practices. 
By exploring a number of ironic language practices found within the policies, this thesis proposes an alternative way of thinking about what counts as innovation and how it happens. The new readings of innovation made possible through the work of this thesis responded to a suite of enduring, inter-related questions: What counts as innovation? Who or what supports innovation? How does innovation occur? And who are the innovators? The truths presented in response to these questions were treated as the language practices which constitute a dominant discourse of innovation talk. The collisions that occur within these truths were the contested sites of most interest for the analysis. The thesis concludes by presenting a theoretical blueprint which works to shift the boundaries of what counts as innovation and how it happens in a manner which is productive, inclusive and powerful. This blueprint forms the foundation upon which a number of recommendations are made, both for my own professional practice and for broader contexts. In keeping with the conceptual tone of this study, these recommendations are a suite of new questions which focus attention on the boundaries of innovation talk as an attempt to re-configure what is valued about teaching and learning at university.

Resumo:

Accessibility to housing for low to moderate income groups in Australia has been in severe decline since 2001. On the supply side, the public sector has been reducing its commitment to the direct provision of public housing. Despite high demand for affordable housing, there has been limited supply generated by non-government housing providers. One possible solution to promote an increase in affordable housing supply, as with other infrastructure, is through the development of multi-stakeholder partnerships and private financing. This research aims to identify current issues underlying the decision-making criteria for building multi-stakeholder partnerships to deliver affordable housing projects. It also investigates strategies for minimising risk and ensuring the financial outcomes of these partnership arrangements. A mix of qualitative in-depth interviews and quantitative surveys was used as the main method to explore stakeholder experiences of involvement in partnership arrangements in the affordable housing sector in Queensland. Two sets of interviews were conducted following an exploratory pilot study: one in 2003-2004 and the other in 2007-2008. Nineteen respondents represented government, private and not-for-profit organisations in the first-stage interviews and surveys. The second-stage interviews focussed on twenty-two housing providers in South East Queensland. Initial analyses were conducted using thematic and statistical techniques. This study extends the use of existing decision-making tools and combines them with a Soft Systems framework to analyse the ideal-state questionnaires using qualitative thematic analysis. Soft Systems Methodology (SSM) has been used to analyse this unstructured, complex problem: systems thinking is used to develop a conceptual model, which is then carried into real-world situations to address the problem. 
This research found that the diversity of stakeholder capabilities and levels of risk acceptance allows partnerships to develop the best synergies and a degree of collaboration which achieves the required financial return within acceptable risk parameters. However, some of the negativity attached to future commitment to such partnerships was found to stem from the anticipation of a worse outcome than that expected from independent action. Many interviewees agreed that housing providers' fear of financial risk and community rejection has been central to dampening their enthusiasm for entering such investment projects. The creation of a mixed-use development structure mitigates both risk and return, as the commercial income subsidises the affordable housing development and normalises the concentration of marginalised low-income residents, who come to live in a prime location with an award-winning design. In addition, tenant support schemes and rent-to-buy incentive programs encourage tenants to secure their tenancies and significantly reduce the risk of rent arrears and property damage. There is also a breakthrough investment vehicle, offered by the social developer, which sells a financial rather than physical product to individual and institutional investors to further mitigate financial risk. Finally, this study recommends modification of the current value-for-money framework in favour of broader partnership arrangements which are more closely aligned with risk minimisation strategies.