Abstract:
This article contests Sean McMeekin’s claims concerning Russian culpability for the First World War. McMeekin maintains that Ottoman rearmament, particularly the purchase of several battleships released onto the global arms market by South American states, threatened to create a situation where the Russian Black Sea Fleet would be outclassed by its Ottoman opposite number. Rather than waiting for this to happen, the tsarist regime chose to go to war. Yet, contrary to McMeekin’s claims, the Ottoman naval expansion never assumed threatening dimensions because the Porte was unable to purchase battleships from Chile or Argentina. As a result, it provided no incentive for Russia to go to war in 1914.
Abstract:
In this essay, I discuss how I turned my master's thesis into three peer-reviewed publications and the lessons I learned about academic writing and publication in the process.
Abstract:
Schools in Queensland, Australia, are undergoing inclusive education reform, following the report of the Ministerial Taskforce on Inclusive Education (Students with Disabilities) in 2004. The State government's responses to the taskforce report emphasise a commitment to social justice and equity so that all students can be included in ways that enable them to achieve their potential. Teacher aides are employed in schools as ancillary staff to support students with disabilities and learning difficulties. Their support roles in schools are emerging within an educational context in which assumptions about disability, difference and the inclusion of students with disabilities and learning difficulties are changing. It is important to acknowledge teacher aides as support practitioners, and to understand their roles in relation to the inclusion of students with disabilities and learning difficulties as inclusive education reform continues. This study used a phenomenological approach to explore the lived experiences of teacher aides as they supported students with disabilities and learning difficulties in primary schools. Four key insights into the support roles of teacher aides in primary schools in Brisbane, Queensland, emerged from the study: 1) teacher aides develop empathetic relationships with students that contribute significantly to the students' sense of belonging within school communities; 2) a lack of clear definition of roles and responsibilities for teacher aides has detrimental effects on the inclusion of students; 3) collaborative planning and implementation of classroom learning and socialisation programs enhances inclusion; and 4) teacher aides learn about supporting students on the job, and in consultation and collaboration with other members of the students' support networks.
Abstract:
This work is concerned with the genetic basis of normal human pigmentation variation. Specifically, the role of polymorphisms within the solute carrier family 45 member 2 (SLC45A2, or membrane associated transporter protein; MATP) gene was investigated with respect to variation in hair, skin and eye colour, both between and within populations. SLC45A2 is an important regulator of melanin production, and mutations in the gene underlie the most recently identified form of oculocutaneous albinism. There is evidence to suggest that non-synonymous polymorphisms in SLC45A2 are associated with normal pigmentation variation between populations. Therefore, the underlying hypothesis of this thesis is that polymorphisms in SLC45A2 will alter the function or regulation of the protein, thereby altering the important role it plays in melanogenesis and providing a mechanism for normal pigmentation variation. In order to investigate the role that SLC45A2 polymorphisms play in human pigmentation variation, a DNA database was established that collected pigmentation phenotype information and blood samples from more than 700 individuals. This database was used as the foundation for the two association studies outlined in this thesis, the first of which involved genotyping two previously described non-synonymous polymorphisms, p.Glu272Lys and p.Phe374Leu, in four different population groups. For both polymorphisms, allele frequencies were significantly different between population groups, and the 272Lys and 374Leu alleles were strongly associated with black hair, brown eyes and olive skin colour in Caucasians. This was the first report to show that SLC45A2 polymorphisms were associated with normal human intra-population pigmentation variation. The second association study involved genotyping several SLC45A2 promoter polymorphisms to determine whether they also played a role in pigmentation variation. Firstly, the transcription start site (TSS), and hence the putative proximal promoter region, was identified using 5' RNA ligase mediated rapid amplification of cDNA ends (RLM-RACE). Two alternate TSSs were identified and the putative promoter region was screened for novel polymorphisms using denaturing high performance liquid chromatography (dHPLC). A novel duplication (c.–1176_–1174dupAAT) was identified along with other previously described single nucleotide polymorphisms (c.–1721C>G and c.–1169G>A). Strong linkage disequilibrium ensured that all three polymorphisms were associated with skin colour, such that the –1721G, +dup and –1169A alleles were associated with olive skin in Caucasians. No linkage disequilibrium was observed between the promoter and coding region polymorphisms, suggesting independent effects. The association analyses were complemented with functional data showing that the –1721G, +dup and –1169A alleles significantly decreased SLC45A2 transcriptional activity. Based on in silico bioinformatic analysis showing that these alleles remove a microphthalmia-associated transcription factor (MITF) binding site, and the fact that MITF is a known regulator of SLC45A2 (Baxter and Pavan, 2002; Du and Fisher, 2002), it was postulated that SLC45A2 promoter polymorphisms could contribute to the regulation of pigmentation by altering MITF binding affinity. Further characterisation of the SLC45A2 promoter was carried out using luciferase reporter assays to determine the transcriptional activity of different regions of the promoter. Five constructs of increasing length were designed and their promoter activity evaluated.
Constitutive promoter activity was observed within the first ~200 bp, and promoter activity increased as the construct size increased. The functional impact of the –1721G, +dup and –1169A alleles, which removed a MITF consensus binding site, was assessed using electrophoretic mobility shift assays (EMSA) and expression analysis of genotyped melanoblast and melanocyte cell lines. EMSA results confirmed that the promoter polymorphisms affected DNA-protein binding. Interestingly, however, the protein(s) involved were not MITF, or at least MITF was not the protein directly binding to the DNA. In an effort to more thoroughly characterise the functional consequences of SLC45A2 promoter polymorphisms, the mRNA expression levels of SLC45A2 and MITF were determined in melanocyte/melanoblast cell lines. Based on SLC45A2's role in processing and trafficking TYRP1 from the trans-Golgi network to stage 2 melanosomes, the mRNA expression of TYRP1 was also investigated. Expression results suggested coordinated expression of pigmentation genes. This thesis has substantially contributed to the field of pigmentation by showing that SLC45A2 polymorphisms not only show allele frequency differences between population groups, but also contribute to normal pigmentation variation within a Caucasian population. In addition, promoter polymorphisms have been shown to have functional consequences for SLC45A2 transcription and the expression of other pigmentation genes. Combined, the data presented in this work support the notion that SLC45A2 is an important contributor to normal pigmentation variation and should be the target of further research to elucidate its role in determining pigmentation phenotypes. Understanding SLC45A2's function may lead to the development of therapeutic interventions for oculocutaneous albinism and other disorders of pigmentation. It may also help in our understanding of skin cancer susceptibility and evolutionary adaptation to different UV environments, and contribute to the forensic application of pigmentation phenotype prediction.
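The association studies described above rest on comparing allele counts between phenotype or population groups. As a rough, hypothetical illustration of that kind of comparison (not the thesis's actual data or analysis pipeline), the sketch below runs a chi-square test on an invented 2x2 table of allele counts:

```python
# Hypothetical allele-count association test of the kind used in
# candidate-gene studies such as the SLC45A2 work described above.
# The counts and group labels below are invented for illustration.
from scipy.stats import chi2_contingency

# Rows: phenotype groups (e.g. olive vs fair skin).
# Columns: allele counts (e.g. 374Leu vs 374Phe); each person contributes two alleles.
observed = [
    [90, 30],   # olive skin: 374Leu, 374Phe
    [40, 80],   # fair skin:  374Leu, 374Phe
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p_value:.2e}, dof = {dof}")
```

A significant p-value in a table like this would indicate that allele frequencies differ between the two groups, which is the form of evidence the abstract refers to.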
Abstract:
Banana bunchy top is regarded as the most important viral disease of banana, causing significant yield losses worldwide. The disease is caused by Banana bunchy top virus (BBTV), which is a circular ssDNA virus belonging to the genus Babuvirus in the family Nanoviridae. There are currently few effective control strategies for this and other ssDNA viruses. “In Plant Activation” (InPAct) is a novel technology being developed at QUT for ssDNA virus-activated suicide gene expression. The technology exploits the rolling circle replication mechanism of ssDNA viruses and is based on a unique “split” gene design such that suicide gene expression is only activated in the presence of the viral Rep. This PhD project aimed to develop a BBTV-based InPAct system as a suicide gene strategy to control BBTV. The BBTV-based InPAct vector design requires a BBTV intergenic region (IR) to be embedded within an intron in the gene expression cassette. To ensure that the BBTV IR would not interfere with intron splicing, a TEST vector was initially generated that contained the entire BBTV IR embedded within an intron in a β-glucuronidase (GUS) expression vector. Transient GUS assays in banana embryogenic cell suspensions indicated that cryptic intron splice sites were present within the IR. Transcript analysis revealed two cryptic intron splice sites in the Domain III sequence of the CR-M within the IR. Removal of the CR-M from the TEST vector resulted in an enhancement of GUS expression, suggesting that the cryptic intron splice sites had been removed. An InPAct GUS vector was subsequently generated that contained the modified BBTV IR, with the CR-M (minus Domain III) repositioned within the InPAct cassette. Using transient histochemical and fluorometric GUS assays in banana embryogenic cells, the InPAct GUS vector was shown to be activated in the presence of the BBTV Rep. However, the presence of both BBTV Rep and Clink was shown to have a deleterious effect on GUS expression, suggesting that these proteins were cytotoxic at the levels expressed. Analysis of replication of the InPAct vectors by Southern hybridisation revealed low levels of InPAct cassette-based episomal DNA released from the vector through the nicking/ligation activity of BBTV Rep. However, Rep-mediated episomal replicons, indicative of rolling circle replication of the released circularised cassettes, were not observed. The inability of the InPAct cassette to replicate was further investigated. To examine whether the absence of Domain III of the CR-M was responsible, a suite of modified BBTV-based InPAct GUS vectors was constructed that contained the CR-M with the inclusion of Domain III, the CR-M with the inclusion of Domain III and additional upstream IR sequence, or no CR-M. Analysis of replication by Southern hybridisation revealed that neither the presence of Domain III nor the entire CR-M had an effect on replication levels. Since the InPAct cassette was significantly larger than the native BBTV genomic components (approximately 1 kb), the effect of InPAct cassette size on replication was also investigated. A suite of size-variant BBTV-based vectors was constructed that increased the size of a replication-competent cassette from 1.1 kbp through to 2.1 kbp. Analysis of replication by Southern hybridisation revealed that an increase in vector size above approximately 1.5-1.7 kbp resulted in a decrease in replication.
Following the demonstration of Rep-mediated release, circularisation and expression from the InPAct GUS vector, an InPAct vector was generated in which the uidA reporter gene was replaced with the ribonuclease-encoding suicide gene, barnase. Initially, a TEST vector was generated to assess the cytotoxicity of Barnase in banana cells. Although transient assays revealed a Barnase-induced cytotoxic effect in banana cells, the expression levels were sub-optimal. An InPAct BARNASE vector was generated and tested for BBTV Rep-activated Barnase expression using transient assays in banana embryogenic cells. High levels of background expression from the InPAct BARNASE vector made it difficult to accurately assess Rep-activated Barnase expression. Analysis of replication by Southern hybridisation revealed low levels of InPAct cassette-based episomal DNA released from the vector but, again, no Rep-mediated episomal replicons indicative of rolling circle replication of the released circularised cassettes were observed. Despite the inability of the InPAct vectors to replicate and thus enable high-level gene expression, the InPAct BARNASE vector was assessed in planta for BBTV Rep-mediated activation of Barnase expression. Eleven lines of transgenic InPAct BARNASE banana plants were generated by Agrobacterium-mediated transformation and were challenged with viruliferous Pentalonia nigronervosa. At least one clonal plant in each line developed bunchy top symptoms and infection was confirmed by PCR. No localised lesions were observed on any plants, nor was there any localised GUS expression in the one InPAct GUS line challenged with viruliferous aphids. The results presented in this thesis represent the first study towards the development of a BBTV-based InPAct system as a Rep-activatable suicide gene expression system to control BBTV. Although further optimisation of the vectors is necessary, the preliminary results suggest that this approach has the potential to be an effective control strategy for BBTV. The use of iterons within the InPAct vectors that are recognised by Reps from different ssDNA plant viruses may provide a broad-spectrum resistance strategy against multiple ssDNA plant viruses. Further, this technology holds great promise as a platform technology for the molecular farming of high-value proteins in vitro or in vivo through expression of the ssDNA virus Rep protein.
Abstract:
Views on the nature and relevance of science education have changed significantly over recent decades. This has serious implications for the way in which science is taught in secondary schools, particularly with respect to teaching emerging topics such as biotechnology, which have a socio-scientific dimension and also require novel laboratory skills. It is apparent in the current literature that there is a lack of adequate teacher professional development opportunities in biotechnology education and that a significant need exists for researchers to develop a carefully crafted and well-supported professional development design which will positively impact on the way in which teachers engage with contemporary science. This study used a retrospective case study methodology to document the recent evolution of modern biotechnology education as part of the changing nature of science education; to examine the adoption and implementation processes for biotechnology education by three secondary schools; and to propose an evidence-based biotechnology professional development model for science educators. Data were gathered from documents, one-on-one interviews and focus group discussions. Analysis of these data has led to the proposal of a biotechnology professional development model which considers all of the key components of science professional development that are outlined in the literature, as well as the additional components articulated by the educators studied. This research is timely and pertinent to the needs of contemporary science education because it recognises, and responds to, the need for a professional development model in biotechnology education that addresses the content knowledge, practical skills, pedagogical knowledge and curriculum management components.
Abstract:
Mobile devices in the near future will need to collaborate to fulfill their function. Collaboration will be achieved through communication. We use a real world example of robotic soccer to derive the structures required for robotic communication. A review of related work is done and it is found that no existing examples come close to providing a robotic ad hoc network (RANET). The RANET we suggest uses existing structures drawn from the areas of wireless networks, peer-to-peer networking and software life-cycle management. Gaps are found in the existing structures, so we describe how to extend some structures to satisfy the design. The RANET design supports robot cooperation by exchanging messages, discovering needed skills that other robots on the network may possess, and transferring those skills. The network is built on top of a Bluetooth wireless network and uses JXTA to communicate and transfer skills. OSGi bundles form the skills that can be transferred. To test the final design, a reference implementation is done. Deficiencies in some third-party software are found, specifically in JXTA, JamVM and GNU Classpath. Lastly, we look at how to fix the deficiencies by porting the JXTA C implementation to the target robotic platform and potentially eliminating the TCP/IP layer, using UDP instead of TCP, or using an adaptive TCP/IP stack. We also propose future areas of investigation: how to seed the configuration for the Personal Area Network (PAN) Bluetooth protocol extension so that a Bluetooth TCP/IP link is formed more quickly, and using the STP to allow multi-hop messaging and transfer of skills.
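The design above centres on robots advertising, discovering and transferring skills over a peer network. The following minimal Python sketch only illustrates that advertise/request/transfer message flow; the real system uses JXTA advertisements and OSGi bundles on Java, and every message type and field name here is an invented stand-in:

```python
# Hypothetical sketch of the advertise / discover / transfer flow a RANET-style
# design implies. Message types and fields are invented for illustration only;
# in the described system the payload would be an OSGi bundle sent via JXTA.
import json
from dataclasses import dataclass, asdict

@dataclass
class SkillAdvert:
    peer_id: str      # robot advertising the skill
    skill_name: str   # e.g. "kick-planner"
    version: str

@dataclass
class SkillRequest:
    peer_id: str      # robot that needs the skill
    skill_name: str

@dataclass
class SkillTransfer:
    skill_name: str
    payload: str      # stand-in for the transferred bundle bytes

def encode(msg) -> bytes:
    """Serialise a message for sending over the network link."""
    return json.dumps({"type": type(msg).__name__, **asdict(msg)}).encode()

# One robot advertises a skill; another requests it and receives the transfer.
for message in (
    SkillAdvert(peer_id="robot-A", skill_name="kick-planner", version="1.0"),
    SkillRequest(peer_id="robot-B", skill_name="kick-planner"),
    SkillTransfer(skill_name="kick-planner", payload="<bundle bytes>"),
):
    print(encode(message))
```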
Abstract:
Ordinary desktop computers continue to obtain ever more resources – increased processing power, memory, network speed and bandwidth – yet these resources spend much of their time underutilised. Cycle stealing frameworks harness these resources so they can be used for high-performance computing. Traditionally, cycle stealing systems have used client-server based architectures which place significant limits on their ability to scale and the range of applications they can support. By applying a fully decentralised network model to cycle stealing, the limits of centralised models can be overcome. Using decentralised networks in this manner presents some difficulties which have not been encountered in their previous uses. Generally, decentralised applications do not require any significant fault tolerance guarantees. High-performance computing, on the other hand, requires very stringent guarantees to ensure correct results are obtained. Unfortunately, mechanisms developed for traditional high-performance computing cannot simply be translated because of their reliance on a reliable storage mechanism. In the highly dynamic world of P2P computing this reliable storage is not available. As part of this research a fault tolerance system has been created which provides considerable reliability without the need for persistent storage. As well as increased scalability, fully decentralised networks offer the ability for volunteers to communicate directly. This ability provides the possibility of supporting applications whose tasks require direct, message passing style communication. Previous cycle stealing systems have only supported embarrassingly parallel applications and applications with limited forms of communication, so a new programming model has been developed which can support this style of communication within a cycle stealing context. In this thesis I present a fully decentralised cycle stealing framework. The framework addresses the problems of providing a reliable fault tolerance system and supporting direct communication between parallel tasks. The thesis includes a programming model for developing cycle stealing applications with direct inter-process communication and methods for optimising object locality on decentralised networks.
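To give a feel for the kind of programming model described above (tasks running on volunteer nodes that can exchange messages directly), here is a minimal, hypothetical Python sketch; the class and method names are invented for illustration and are not the framework's actual API, and a local thread-based mock stands in for the decentralised network:

```python
# Hypothetical sketch of a cycle-stealing programming model in which parallel
# tasks exchange messages directly. A local, thread-based mock stands in for
# the decentralised volunteer network; all names are invented for illustration.
import queue
import threading

class Task:
    def __init__(self, task_id: str, mailboxes: dict):
        self.task_id = task_id
        self._mailboxes = mailboxes              # task_id -> queue of messages

    def send(self, dest: str, message) -> None:
        self._mailboxes[dest].put((self.task_id, message))

    def receive(self):
        return self._mailboxes[self.task_id].get()

    def run(self) -> None:                       # overridden by applications
        raise NotImplementedError

class Pinger(Task):
    def run(self):
        self.send("pong", "ping")
        sender, reply = self.receive()
        print(f"{self.task_id} got {reply!r} from {sender}")

class Ponger(Task):
    def run(self):
        sender, msg = self.receive()
        self.send(sender, msg.replace("ping", "pong"))

# Run two communicating tasks on local "volunteer" threads.
mailboxes = {"ping": queue.Queue(), "pong": queue.Queue()}
threads = [threading.Thread(target=t.run)
           for t in (Pinger("ping", mailboxes), Ponger("pong", mailboxes))]
for t in threads: t.start()
for t in threads: t.join()
```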
Abstract:
Shared Services (SS) involves the convergence and streamlining of an organisation's functions to ensure timely service delivery as effectively and efficiently as possible. As a management structure designed to promote value generation, cost savings and improved service delivery by leveraging economies of scale, the idea of SS is driven by cost reduction and improvements in the quality and efficiency of services. Current conventional wisdom is that the potential for SS is increasing due to the rising costs organisations face in changing systems and business requirements, and in implementing and running information systems. In addition, due to the commoditisation of large information systems such as enterprise systems, many common, supporting functions across organisations are becoming more similar than not, leading to an increasing overlap in processes and fuelling the notion that it is possible for organisations to derive benefits from collaborating and sharing their common services through an inter-organisational shared services (IOSS) arrangement. While there is some research on traditional SS, very little research has been done on IOSS. In particular, it is unclear what the potential drivers and inhibitors of IOSS are. As the concepts of IOSS and SS are closely related to that of Outsourcing, and their distinction is sometimes blurred, the first objective of this research is to seek a clear conceptual understanding of the differences between SS and Outsourcing (in motivators, arrangements, benefits, disadvantages, etc.). Based on this conceptual understanding, the second objective is to develop a decision model (the Shared Services Potential model) which would aid organisations in deciding which arrangement is more appropriate for them to adopt in pursuit of process improvements for their operations. As the context of the study is universities in higher education sharing administrative services common to or across them, with the assumption that such services are homogeneous in nature, this thesis also reports on a case study. The case study involved face-to-face interviews with representatives of an Australian university to explore the potential for IOSS. Our key findings suggest that it is possible for universities to share services common across them, as most were using the same systems, although independently.
Abstract:
This study investigates variation in IT professionals' experience of ethics with a view to enhancing their formation and support. This is explored through an examination of the experience of IT, IT professional ethics and IT professional ethics education. The study's principal contribution is the empirical study and description of IT professionals' experience of ethics. The empirical phase is preceded by a review of conceptions of IT and followed by an application of the findings to IT education. The study's empirical findings are based on 30 semi-structured interviews with IT professionals who represent a wide demographic, experience and IT sub-discipline range. Their experience of ethics is depicted as five citizenships: Citizenship of my world, Citizenship of the corporate world, Citizenship of a shared world, Citizenship of the client's world and Citizenship of the wider world. These signify an expanding awareness, which progressively accords rights to others and defines responsibility in terms of others. The empirical findings inform a Model of Ethical IT. This maps an IT professional space increasingly oriented towards others. Such a model provides a conceptual tool, available to prompt discussion and reflection, and which may be employed in pursuing formation aimed at experiential change. Its usefulness for the education of IT professionals with respect to ethics is explored. The research approach employed in this study is phenomenography. This method seeks to elicit and represent variation of experience. It understands experience as a relationship between a subject (IT professionals) and an object (ethics), and describes this relationship in terms of its foci and boundaries. The study's findings culminate in three observations, that change is indicated in the formation and support of IT professionals in: 1. IT professionals' experience of their discipline, moving towards a focus on information users; 2. IT professionals' experience of professional ethics, moving towards the adoption of other-centred attitudes; and 3. IT professionals' experience of professional development, moving towards an emphasis on a change in lived experience. Based on these results, employers, educators and professional bodies may want to evaluate how they approach professional formation and support, if they aim to promote a comprehensive awareness of ethics in IT professionals.
Abstract:
Search engines have forever changed the way people access and discover knowledge, allowing information about almost any subject to be quickly and easily retrieved within seconds. As more and more material becomes available electronically, the influence of search engines on our lives will continue to grow. This presents the problem of how to find what information is contained in each search engine, what bias a search engine may have, and how to select the best search engine for a particular information need. This research introduces a new method, search engine content analysis, in order to solve the above problem. Search engine content analysis is a new development of the traditional information retrieval field of collection selection, which deals with general information repositories. Current research in collection selection relies on full access to the collection or estimations of the size of the collections. In addition, collection descriptions are often represented as term occurrence statistics. An automatic ontology learning method is developed for search engine content analysis, which trains an ontology with world knowledge of hundreds of different subjects in a multilevel taxonomy. This ontology is then mined to find important classification rules, and these rules are used to perform an extensive analysis of the content of the largest general-purpose Internet search engines in use today. Instead of representing collections as a set of terms, as commonly occurs in collection selection, they are represented as a set of subjects, leading to a more robust representation of information and reducing the impact of synonymy. The ontology-based method was compared with ReDDE (Relevant Document Distribution Estimation method for resource selection), the current state-of-the-art collection selection method, which relies on collection size estimation, using the standard R-value metric, with encouraging results. The method was also used to analyse the content of the most popular search engines in use today, including Google and Yahoo. In addition, several specialist search engines, such as PubMed and that of the U.S. Department of Agriculture, were analysed. In conclusion, this research shows that the ontology-based method avoids the need for collection size estimation.
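To make the subject-based collection selection idea concrete, here is a small, hypothetical Python sketch in which each search engine is described by counts over subjects rather than terms, and engines are ranked for a query mapped onto subjects. The subject labels, counts and scoring rule are invented for illustration and are not the thesis's ontology or the ReDDE algorithm:

```python
# Hypothetical sketch of subject-based collection selection: each engine is
# described by how many of its sampled documents fall under each subject, and
# engines are ranked by how well they cover the subjects a query maps to.
# Subjects, counts and the scoring rule are invented for illustration.

engines = {
    "engine_A": {"medicine": 500, "agriculture": 20,  "sport": 80},
    "engine_B": {"medicine": 40,  "agriculture": 300, "sport": 10},
    "engine_C": {"medicine": 150, "agriculture": 150, "sport": 150},
}

def subject_profile(counts: dict) -> dict:
    """Normalise raw subject counts into a probability distribution."""
    total = sum(counts.values())
    return {subject: n / total for subject, n in counts.items()}

def score(engine_counts: dict, query_subjects: list) -> float:
    """Score an engine by the probability mass it puts on the query's subjects."""
    profile = subject_profile(engine_counts)
    return sum(profile.get(subject, 0.0) for subject in query_subjects)

# A query about crop diseases might map onto these subjects via the ontology.
query_subjects = ["agriculture", "medicine"]
ranking = sorted(engines, key=lambda e: score(engines[e], query_subjects), reverse=True)
print(ranking)   # engines ordered by estimated usefulness for the query
```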
Abstract:
Denial-of-service attacks (DoS) and distributed denial-of-service attacks (DDoS) attempt to temporarily disrupt users or computer resources to cause service unavailability to legitimate users in the internetworking system. The most common type of DoS attack occurs when adversaries flood a large amount of bogus data to interfere with or disrupt the service on the server. The attack can be either a single-source attack, which originates at only one host, or a multi-source attack, in which multiple hosts coordinate to flood a large number of packets to the server. Cryptographic mechanisms in authentication schemes are an example approach to help the server to validate malicious traffic. Since authentication in key establishment protocols requires the verifier to spend some resources before successfully detecting the bogus messages, adversaries might be able to exploit this flaw to mount an attack to overwhelm the server resources. The attacker is able to perform this kind of attack because many key establishment protocols incorporate strong authentication at the beginning phase, before they can identify the attacks. This is an example of DoS threats in most key establishment protocols because they have been implemented to support confidentiality and data integrity, but do not carefully consider other security objectives, such as availability. The main objective of this research is to design denial-of-service resistant mechanisms in key establishment protocols. In particular, we focus on the design of cryptographic protocols related to key establishment protocols that implement client puzzles to protect the server against resource exhaustion attacks. Another objective is to extend formal analysis techniques to include DoS-resistance. Basically, the formal analysis approach is used not only to analyse and verify the security of a cryptographic scheme carefully but also to help in the design stage of new protocols with a high level of security guarantee. In this research, we focus on an analysis technique based on Meadows' cost-based framework, and we implement a DoS-resistant model using Coloured Petri Nets. Meadows' cost-based framework is directly proposed to assess denial-of-service vulnerabilities in cryptographic protocols using mathematical proof, while Coloured Petri Nets are used to model and verify communication protocols using interactive simulations. In addition, Coloured Petri Nets are able to help the protocol designer to clarify and reduce inconsistency in the protocol specification. Therefore, the second objective of this research is to explore vulnerabilities in existing DoS-resistant protocols, as well as to extend a formal analysis approach to our new framework for improving DoS-resistance and evaluating the performance of the new proposed mechanism. In summary, the specific outcomes of this research include the following: 1. A taxonomy of denial-of-service resistant strategies and techniques used in key establishment protocols; 2. A critical analysis of existing DoS-resistant key exchange and key establishment protocols; 3. An implementation of Meadows' cost-based framework using Coloured Petri Nets for modelling and evaluating DoS-resistant protocols; and 4. The development of new, efficient and practical DoS-resistant mechanisms to improve the resistance to denial-of-service attacks in key establishment protocols.
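The client-puzzle approach mentioned above makes a client pay a small computational cost before the server commits resources to expensive authentication work. The sketch below is a generic, hash-based puzzle in Python, included only to illustrate the general technique; it is not one of the specific protocols designed or analysed in the thesis:

```python
# Minimal hash-based client puzzle, a common DoS-resistance building block:
# the server issues a random nonce and a difficulty, the client brute-forces a
# counter whose hash has enough leading zero bits, and the server verifies the
# solution with a single hash before doing expensive key-establishment work.
import hashlib
import os

def leading_zero_bits(digest: bytes) -> int:
    bits = bin(int.from_bytes(digest, "big"))[2:].zfill(len(digest) * 8)
    return len(bits) - len(bits.lstrip("0"))

def issue_puzzle(difficulty: int = 16):
    """Server side: generate a fresh nonce and a difficulty level."""
    return os.urandom(16), difficulty

def solve_puzzle(nonce: bytes, difficulty: int) -> int:
    """Client side: brute-force a counter until the hash meets the target."""
    counter = 0
    while True:
        digest = hashlib.sha256(nonce + counter.to_bytes(8, "big")).digest()
        if leading_zero_bits(digest) >= difficulty:
            return counter
        counter += 1

def verify_puzzle(nonce: bytes, difficulty: int, counter: int) -> bool:
    """Server side: one cheap hash checks the client's work."""
    digest = hashlib.sha256(nonce + counter.to_bytes(8, "big")).digest()
    return leading_zero_bits(digest) >= difficulty

nonce, difficulty = issue_puzzle(difficulty=16)   # ~2**16 hashes for the client
solution = solve_puzzle(nonce, difficulty)
print(verify_puzzle(nonce, difficulty, solution))  # True
```

The asymmetry (many hashes to solve, one to verify) is what lets the server shed bogus requests cheaply before running the costly cryptographic steps.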
Abstract:
For robots to operate in human environments they must be able to make their own maps because it is unrealistic to expect a user to enter a map into the robot’s memory; existing floorplans are often incorrect; and human environments tend to change. Traditionally robots have used sonar, infra-red or laser range finders to perform the mapping task. Digital cameras have become very cheap in recent years and they have opened up new possibilities as a sensor for robot perception. Any robot that must interact with humans can reasonably be expected to have a camera for tasks such as face recognition, so it makes sense to also use the camera for navigation. Cameras have advantages over other sensors such as colour information (not available with any other sensor), better immunity to noise (compared to sonar), and not being restricted to operating in a plane (like laser range finders). However, there are disadvantages too, with the principal one being the effect of perspective. This research investigated ways to use a single colour camera as a range sensor to guide an autonomous robot and allow it to build a map of its environment, a process referred to as Simultaneous Localization and Mapping (SLAM). An experimental system was built using a robot controlled via a wireless network connection. Using the on-board camera as the only sensor, the robot successfully explored and mapped indoor office environments. The quality of the resulting maps is comparable to those that have been reported in the literature for sonar or infra-red sensors. Although the maps are not as accurate as ones created with a laser range finder, the solution using a camera is significantly cheaper and is more appropriate for toys and early domestic robots.
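One simple way a single camera can act as a range sensor, which is the role described above, is to assume that detected points lie on the floor and project their image rows onto the ground plane using the camera's known height and tilt. The short Python sketch below illustrates that geometry with made-up calibration values; it is an illustration of the general idea, not a claim about the thesis's actual method:

```python
# Ground-plane range estimation from a single camera: assuming a detected
# point lies on the floor, its image row can be converted into a horizontal
# distance using the camera's height, downward tilt and focal length.
# The calibration numbers are made up for illustration.
import math

CAMERA_HEIGHT_M = 0.30           # camera height above the floor (metres)
TILT_RAD = math.radians(15.0)    # downward tilt of the optical axis
FOCAL_PX = 500.0                 # vertical focal length in pixels
CY_PX = 240.0                    # principal point row (image centre)

def ground_range(pixel_row: float) -> float:
    """Horizontal distance to a floor point seen at the given image row."""
    # Angle of the pixel's ray below the optical axis, then below horizontal.
    ray_angle = math.atan((pixel_row - CY_PX) / FOCAL_PX)
    depression = TILT_RAD + ray_angle
    if depression <= 0:
        raise ValueError("ray does not intersect the floor ahead of the robot")
    return CAMERA_HEIGHT_M / math.tan(depression)

# Pixels lower in the image (larger row numbers) correspond to closer floor points.
for row in (300, 400, 470):
    print(f"row {row}: {ground_range(row):.2f} m")
```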
Abstract:
The measurement of submicrometre (diameter < 1.0 µm) and ultrafine (diameter < 0.1 µm) particle number concentrations has attracted attention over the last decade because the potential health impacts associated with exposure to these particles can be more significant than those due to exposure to larger particles. At present, ultrafine particles are not regularly monitored and they are yet to be incorporated into air quality monitoring programs. As a result, very few studies have analysed long-term and spatial variations in ultrafine particle concentration, and none have been conducted in Australia. To address this gap in scientific knowledge, the aim of this research was to investigate the long-term trends and seasonal variations in particle number concentrations in Brisbane, Australia. Data collected over a five-year period were analysed using weighted regression models. Monthly mean concentrations in the morning (6:00-10:00) and the afternoon (16:00-19:00) were plotted against time in months, using the monthly variance as the weights. During the five-year period, submicrometre and ultrafine particle concentrations increased in the morning by 105.7% and 81.5% respectively, whereas in the afternoon there was no significant trend. The morning concentrations were associated with fresh traffic emissions and the afternoon concentrations with the background. The statistical tests applied to the seasonal models, on the other hand, indicated that there was no seasonal component. The spatial variation in size distribution in a large urban area was investigated using particle number size distribution data collected at nine different locations during different campaigns. The size distributions were represented by modal structures and cumulative size distributions. Particle number peaked at around 30 nm, except at an isolated site dominated by diesel trucks, where the particle number peaked at around 60 nm. It was found that ultrafine particles contributed 82%-90% of the total particle number. At the sites dominated by petrol vehicles, nanoparticles (< 50 nm) contributed 60%-70% of the total particle number, and at the site dominated by diesel trucks they contributed 50%. Although the sampling campaigns took place during different seasons and were of varying duration, these variations did not have an effect on the particle size distributions. The results suggested that the distributions were instead affected by differences in traffic composition and distance to the road. To investigate the occurrence of nucleation events, that is, secondary particle formation from gaseous precursors, particle size distribution data collected over a 13-month period during five different campaigns were analysed. The study area was a complex urban environment influenced by anthropogenic and natural sources. The study introduced a new application of time series differencing for the identification of nucleation events. To evaluate the conditions favourable to nucleation, the meteorological conditions and gaseous concentrations prior to and during nucleation events were recorded. Gaseous concentrations did not exhibit a clear pattern of change. It was also found that nucleation was associated with sea breezes and long-range transport. The implications of this finding are that whilst vehicles are the most important source of ultrafine particles, sea breezes and aged gaseous emissions play a more important role in secondary particle formation in the study area.
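The trend analysis described above fits weighted regression lines to monthly mean concentrations. The short Python sketch below shows one standard way to fit such a weighted least-squares trend; the numbers are invented rather than the study's data, and the inverse-variance weighting used here is an assumption about the usual convention, not necessarily the exact weighting the study applied:

```python
# Weighted least-squares trend fit of monthly mean concentrations against time.
# The data are invented for illustration; inverse-variance weights are assumed
# as a common convention, not as the study's exact specification.
import numpy as np

months = np.arange(24)                                         # time index in months
means = 8000 + 60.0 * months + np.random.default_rng(0).normal(0, 300, 24)
variances = np.full(24, 300.0**2)                              # variance of each monthly mean

weights = 1.0 / variances                                      # inverse-variance weights
X = np.column_stack([np.ones_like(months, dtype=float), months])  # intercept, slope columns

# Closed-form WLS: beta = (X^T W X)^-1 X^T W y
W = np.diag(weights)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ means)
intercept, slope = beta
print(f"fitted trend: {slope:.1f} units per month (intercept {intercept:.0f})")
```

A positive, statistically significant slope in a fit like this corresponds to the kind of increasing morning trend the abstract reports.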
Abstract:
This program of research examines the experience of chronic pain in a community sample. While it is clear that, like patient samples, chronic pain in non-patient samples is also associated with psychological distress and physical disability, the experience of pain across the total spectrum of pain conditions (including acute and episodic pain conditions) and during the early course of chronic pain is less clear. Information about these aspects of the pain experience is important because effective early intervention for chronic pain relies on identification of people who are likely to progress to chronicity post-injury. A conceptual model of the transition from acute to chronic pain was proposed by Gatchel (1991a). In brief, Gatchel's model describes three stages that individuals who have a serious pain experience move through, each with worsening psychological dysfunction and physical disability. The aims of this program of research were to describe the experience of pain in a community sample in order to obtain pain-specific data on the problem of pain in Queensland, and to explore the usefulness of Gatchel's model in a non-clinical sample. Additionally, five risk factors and six protective factors were proposed as possible extensions to Gatchel's model. To address these aims, a prospective longitudinal mixed-method research design was used. Quantitative data were collected in Phase 1 via a comprehensive postal questionnaire. Phase 2 consisted of a follow-up questionnaire 3 months post-baseline. Phase 3 consisted of semi-structured interviews with a subset of the original sample 12 months post follow-up, which provided qualitative data for a further in-depth examination of the experience and process of chronic pain from the respondents' point of view. The results indicate that chronic pain is associated with high levels of anxiety and depressive symptoms. However, the levels of disability reported by this Queensland sample were generally lower than those reported by clinical samples and consistent with disability data reported in a New South Wales population-based study. With regard to the second aim of this program of research, while some elements of the pain experience of this sample were consistent with that described by Gatchel's model, overall the model was not a good fit with the experience of this non-clinical sample. The findings indicate that passive coping strategies (minimising activity), catastrophising, self-efficacy, optimism, social support, active strategies (use of distraction) and the belief that emotions affect pain may be important to consider in understanding the processes that underlie the transition to and continuation of chronic pain.