948 results for Methodological Development
Abstract:
Comprehensive two-dimensional gas chromatography (GC×GC) offers enhanced separation efficiency, reliability in qualitative and quantitative analysis, capability to detect low quantities, and information on the whole sample and its components. These features are essential in the analysis of complex samples, in which the number of compounds may be large or the analytes of interest are present at trace level. This study involved the development of instrumentation, data analysis programs and methodologies for GC×GC and their application in studies on qualitative and quantitative aspects of GC×GC analysis. Environmental samples were used as model samples. Instrumental development comprised the construction of three versions of a semi-rotating cryogenic modulator in which modulation was based on two-step cryogenic trapping with continuously flowing carbon dioxide as coolant. Two-step trapping was achieved by rotating the nozzle spraying the carbon dioxide with a motor. The fastest rotation and highest modulation frequency were achieved with a permanent magnet motor, and modulation was most accurate when the motor was controlled with a microcontroller containing a quartz crystal. Heated wire resistors were unnecessary for the desorption step when liquid carbon dioxide was used as coolant. With the modulators developed in this study, the narrowest peaks were 75 ms at base. Three data analysis programs were developed, allowing basic, comparison and identification operations. Basic operations enabled the visualisation of two-dimensional plots and the determination of retention times, peak heights and volumes. The overlaying feature in the comparison program allowed easy comparison of 2D plots. An automated identification procedure based on mass spectra and retention parameters allowed the qualitative analysis of data obtained by GC×GC and time-of-flight mass spectrometry. In the methodological development, sample preparation (extraction and clean-up) and GC×GC methods were developed for the analysis of atmospheric aerosol and sediment samples. Dynamic sonication-assisted extraction was well suited for atmospheric aerosols collected on a filter. A clean-up procedure utilising normal-phase liquid chromatography with ultraviolet detection worked well in the removal of aliphatic hydrocarbons from a sediment extract. GC×GC with flame ionisation detection or quadrupole mass spectrometry provided good reliability in the qualitative analysis of target analytes. However, GC×GC with time-of-flight mass spectrometry was needed in the analysis of unknowns. The automated identification procedure that was developed was efficient in the analysis of large data files, but manual search and analyst knowledge are invaluable as well. Quantitative analysis was examined in terms of calibration procedures and the effect of matrix compounds on GC×GC separation. In addition to calibration in GC×GC with summed peak areas or peak volumes, simplified area calibration based on the normal GC signal can be used to quantify compounds in samples analysed by GC×GC so long as certain qualitative and quantitative prerequisites are met. In a study of the effect of matrix compounds on GC×GC separation, it was shown that the quality of PAH separation is not significantly disturbed by the amount of matrix, and that quantitativeness suffers only slightly when matrix is present and the amount of target compounds is low. The benefits of GC×GC in the analysis of complex samples easily overcome some minor drawbacks of the technique.
The developed instrumentation and methodologies performed well for environmental samples, but they could also be applied to other complex samples.
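As a minimal illustration of the "basic operations" described above (visualising two-dimensional plots from the modulated signal), the sketch below folds a one-dimensional detector trace into a 2D retention-time matrix given a modulation period. The function name, sampling rate, modulation period and synthetic trace are hypothetical, not taken from the thesis.

```python
import numpy as np
import matplotlib.pyplot as plt

def fold_gcxgc(signal, sampling_rate_hz, modulation_period_s):
    """Fold a 1D GCxGC detector trace into a 2D matrix.

    Rows correspond to second-dimension retention time within one
    modulation cycle, columns to first-dimension retention time
    (one column per modulation).
    """
    points_per_modulation = int(round(sampling_rate_hz * modulation_period_s))
    n_modulations = len(signal) // points_per_modulation
    trimmed = signal[: n_modulations * points_per_modulation]
    # Each modulation cycle becomes one column of the 2D plot.
    return trimmed.reshape(n_modulations, points_per_modulation).T

# Hypothetical example: 100 Hz detector trace, 5 s modulation period.
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 0.01, 60_000)
trace[30_250:30_260] += 1.0          # a narrow modulated peak
plot = fold_gcxgc(trace, sampling_rate_hz=100, modulation_period_s=5.0)

plt.imshow(plot, aspect="auto", origin="lower", cmap="viridis")
plt.xlabel("1st-dimension retention (modulation no.)")
plt.ylabel("2nd-dimension retention (samples)")
plt.show()
```

Peak heights and volumes can then be read as maxima and sums over regions of this matrix; the actual programs developed in the thesis are, of course, considerably more elaborate.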
Abstract:
This Working Paper reports the background to the first stage of the ongoing research project, The Quest for Well-being in Growth Industries: A Collaborative Study in Finland and Scotland, conducted under the auspices of the Academy of Finland research programme The Future of Work and Well-being (2008-2011). This collaborative project provides national and transnational data, analysis and outputs. The study is being conducted in the Department of Management and Organisation, Hanken School of Economics, Finland, in collaboration with Glasgow Caledonian University, University of East London, Heriot-Watt University and Reading University, UK. The project examines policies and practices towards the enhancement of work-related well-being in growth industries, and the contradictory pressures and tensions that arise in this situation. The overall aim is to evaluate the development, implementation and use of work-related well-being policies in four selected growth industries. These sectors – electronics, care, finance and accounting, and tourism – have been selected on the basis of European Union and national forecasts, and demographic and socio-economic trends in employment. In this working paper we outline the background to the research study, the initial research plan, and how the survey of employers has been constructed. The working paper concludes with a brief discussion of general ongoing research issues arising in the project.
Abstract:
The overlapping sound pressure waves that enter our brain via the ears and auditory nerves must be organized into a coherent percept. Modelling the regularities of the auditory environment and detecting unexpected changes in these regularities, even in the absence of attention, is a necessary prerequisite for orientating towards significant information, as well as for speech perception and communication, for instance. The processing of auditory information, in particular the detection of changes in the regularities of the auditory input, gives rise to neural activity in the brain that is seen as a mismatch negativity (MMN) response of the event-related potential (ERP) recorded by electroencephalography (EEG). As the recording of MMN requires neither a subject's behavioural response nor attention towards the sounds, it can be done even with subjects who have problems in communicating or difficulties in performing a discrimination task, for example aphasic and comatose patients, newborns, and even fetuses. Thus with MMN one can follow the evolution of central auditory processing from the very early, often critical stages of development, and also in subjects who cannot be examined with the more traditional behavioural measures of auditory discrimination. Indeed, recent studies show that central auditory processing, as indicated by MMN, is affected in different clinical populations, such as schizophrenics, as well as during normal aging and abnormal childhood development. Moreover, the processing of auditory information can be selectively impaired for certain auditory attributes (e.g., sound duration, frequency) and can also depend on the context of the sound changes (e.g., speech or non-speech). Although its advantages over behavioural measures are undeniable, a major obstacle to the larger-scale routine use of the MMN method, especially in clinical settings, is the relatively long duration of its measurement. Typically, approximately 15 minutes of recording time is needed for measuring the MMN for a single auditory attribute. Recording a complete central auditory processing profile consisting of several auditory attributes would thus require from one hour to several hours. In this research, I have contributed to the development of new, fast multi-attribute MMN recording paradigms in which several types and magnitudes of sound changes are presented in both speech and non-speech contexts in order to obtain a comprehensive profile of auditory sensory memory and discrimination accuracy in a short measurement time (altogether approximately 15 min for 5 auditory attributes). The speed of the paradigms makes them highly attractive for clinical research, their reliability brings fidelity to longitudinal studies, and the language context is especially suitable for studies on language impairments such as dyslexia and aphasia. In addition, I have presented an even more ecological paradigm and, more importantly for the theory of MMN, an interesting result in which the MMN responses are recorded entirely without a repetitive standard tone. All in all, these paradigms contribute to the development of the theory of auditory perception and increase the feasibility of MMN recordings in both basic and clinical research. Moreover, they have already proven useful in studying, for instance, dyslexia, Asperger syndrome and schizophrenia.
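To make the MMN measurement concrete, the following sketch computes a deviant-minus-standard difference wave from epoched EEG data and a mean amplitude in a typical MMN latency window. The arrays, epoch counts and latency window are hypothetical illustrations of the general computation, not the paradigms developed in the thesis.

```python
import numpy as np

def mmn_difference_wave(standard_epochs, deviant_epochs):
    """Average epochs per condition and subtract: deviant minus standard."""
    return deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)

def mean_amplitude(wave, times, t_min=0.10, t_max=0.20):
    """Mean amplitude in a latency window (here 100-200 ms, a common MMN range)."""
    mask = (times >= t_min) & (times <= t_max)
    return wave[mask].mean()

# Hypothetical single-channel data: 200 standard and 50 deviant epochs,
# 600 samples at 1 kHz spanning -100 ms to 500 ms relative to sound onset.
rng = np.random.default_rng(1)
times = np.arange(-0.1, 0.5, 0.001)
standards = rng.normal(0, 1.0, (200, times.size))
deviants = rng.normal(0, 1.0, (50, times.size))
deviants[:, (times > 0.1) & (times < 0.2)] -= 2.0  # simulated negativity

mmn = mmn_difference_wave(standards, deviants)
print(f"MMN mean amplitude 100-200 ms: {mean_amplitude(mmn, times):.2f} (a.u.)")
```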
Abstract:
The paper describes the development and application of a multiple linear regression model to identify how the key elements of waste and recycling infrastructure, namely container capacity and frequency of collection, affect the yield from municipal kerbside recycling programmes. The overall aim of the research was to gain an understanding of the factors affecting the yield from municipal kerbside recycling programmes in Scotland. The study isolates the principal kerbside collection service offered by 32 councils across Scotland, eliminating those recycling programmes associated with flatted properties or multiple occupancies. The regression analysis identified three principal factors which explain 80% of the variability in the average yield of the principal dry recyclate services: weekly residual waste capacity, the number of materials collected and the weekly recycling capacity. The use of the model has been evaluated and recommendations made on ongoing methodological development and on the use of the results in informing the design of kerbside recycling programmes. The authors hope that the research can provide insights for the ongoing development of methods to optimise the design and operation of kerbside recycling programmes.
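As a hedged illustration of the kind of model reported (not the authors' data, coefficients or software), the sketch below fits an ordinary least-squares regression of kerbside yield on the three predictors named in the abstract and reports R². All variable names and values are invented for illustration only.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical council-level data; predictor names follow the abstract.
rng = np.random.default_rng(2)
n = 32  # one row per council
residual_capacity = rng.uniform(60, 240, n)     # weekly residual waste capacity (L/household)
n_materials = rng.integers(3, 9, n)             # number of materials collected
recycling_capacity = rng.uniform(40, 200, n)    # weekly recycling capacity (L/household)
yield_kg = (1.5 - 0.004 * residual_capacity + 0.12 * n_materials
            + 0.006 * recycling_capacity + rng.normal(0, 0.15, n))

X = sm.add_constant(np.column_stack([residual_capacity, n_materials, recycling_capacity]))
model = sm.OLS(yield_kg, X).fit()
print(model.summary())                        # coefficients, p-values and R-squared
print("R-squared:", round(model.rsquared, 2))
```

In the same spirit as the paper, the fitted coefficients indicate how yield responds to each infrastructure element, and R² summarises how much of the between-council variability the three factors explain.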
Abstract:
The paper highlights the methodological development of identifying and characterizing rice (Oryza sativa L.) ecosystems and the varietal deployment process through participatory approaches. Farmers have intricate knowledge of their rice ecosystems. Evidence from Begnas (mid-hill) and Kachorwa (plain) sites in Nepal suggests that farmers distinguish ecosystems for rice primarily on the basis of moisture and fertility of soils. Farmers also differentiate the number, relative size and specific characteristics of each ecosystem within a given geographic area. They allocate individual varieties to each ecosystem, based on the principle of ‘best fit’ between ecosystem characteristics and varietal traits, indicating that competition between varieties mainly occurs within the ecosystems. Land use and ecosystems determine rice genetic diversity, with marginal land having fewer options for varieties than more productive areas. Modern varieties are mostly confined to productive land, whereas landraces are adapted to marginal ecosystems. Researchers need to understand the ecosystems and varietal distribution within ecosystems better in order to plan and execute programmes on agrobiodiversity conservation on-farm, diversity deployment, repatriation of landraces and monitoring varietal diversity. Simple and practical ways to elicit information on rice ecosystems and associated varieties through farmers’ group discussion at village level are suggested.
Abstract:
Scott A. Shane is the 2009 winner of the Global Award for Entrepreneurship Research. In this article we discuss and analyze Shane’s most important contributions to the field of entrepreneurship. His contribution is extraordinarily broad in scope, which makes it difficult to pinpoint one or a few specifics that we associate with Shane’s scholarship. Instead, his contributions can be summarized in the following three points. First, he has influenced what we view as central aspects of entrepreneurship. Shane has been a leading figure in redirecting the focus of entrepreneurship research itself. Second, he has influenced how we view entrepreneurship. Shane’s research is arguably theory driven, and it applies and develops theoretical lenses that greatly improve our understanding of entrepreneurship. Third, he has contributed to how we conduct entrepreneurship research. Shane has been a forerunner in examining relevant units of analysis that are difficult to sample; research designs and databases specifically designed for studying entrepreneurial processes; and sophisticated analytical methods. This has contributed to advancing the methodological rigor of the field. Summing up, the contributions are very impressive indeed.
Abstract:
We present a novel approach for developing summary statistics for use in approximate Bayesian computation (ABC) algorithms by using indirect inference. ABC methods are useful for posterior inference in the presence of an intractable likelihood function. In the indirect inference approach to ABC, the parameters of an auxiliary model fitted to the data become the summary statistics. Although applicable to any ABC technique, we embed this approach within a sequential Monte Carlo algorithm that is completely adaptive and requires very little tuning. This methodological development was motivated by an application involving data on macroparasite population evolution, modelled by a trivariate stochastic process for which there is no tractable likelihood function. The auxiliary model here is based on a beta-binomial distribution. The main objective of the analysis is to determine which parameters of the stochastic model are estimable from the observed data on mature parasite worms.
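A compact sketch of the core idea, under simplifying assumptions: a beta-binomial auxiliary model is fitted to observed and simulated counts, its fitted parameters act as the summary statistics, and a plain rejection-ABC step (rather than the adaptive sequential Monte Carlo sampler of the paper) keeps the parameter draws whose summaries lie closest to those of the data. The simulator, prior and all numbers are hypothetical stand-ins for the macroparasite model.

```python
import numpy as np
from scipy.stats import betabinom
from scipy.optimize import minimize

N_TRIALS = 20  # parasites per host in the toy data (hypothetical)

def fit_auxiliary(counts):
    """Fit a beta-binomial auxiliary model by maximum likelihood;
    the fitted (log a, log b) serve as the summary statistics."""
    def nll(params):
        a, b = np.exp(params)
        return -betabinom.logpmf(counts, N_TRIALS, a, b).sum()
    return minimize(nll, x0=np.zeros(2), method="Nelder-Mead").x

def simulate(theta, n_hosts=100, rng=None):
    """Toy simulator standing in for the intractable stochastic model:
    per-host maturation probabilities drawn around theta."""
    rng = rng or np.random.default_rng()
    p = np.clip(rng.normal(theta, 0.1, n_hosts), 0.01, 0.99)
    return rng.binomial(N_TRIALS, p)

def abc_rejection(observed, prior_draws=500, keep=50, rng=None):
    """Plain rejection ABC with indirect-inference summaries."""
    rng = rng or np.random.default_rng(3)
    s_obs = fit_auxiliary(observed)
    thetas = rng.uniform(0.05, 0.95, prior_draws)       # uniform prior on theta
    dists = np.array([np.linalg.norm(fit_auxiliary(simulate(t, rng=rng)) - s_obs)
                      for t in thetas])
    return thetas[np.argsort(dists)[:keep]]             # closest draws ~ approximate posterior

observed = simulate(0.3, rng=np.random.default_rng(4))
posterior = abc_rejection(observed)
print("approximate posterior mean:", posterior.mean().round(3))
```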
Abstract:
Health Informatics is an intersection of information technology with several disciplines of medicine and health care. It sits at the common frontiers of health care services, including patient-centric, process-driven and procedure-centric care. From the information technology perspective it can be viewed as the application of computers to medical and/or health processes for delivering better health care solutions. In spite of the exaggerated hype, this field is having a major impact on health care solutions, in particular health care delivery, decision making, medical devices and allied health care industries. It also affords enormous research opportunities for new methodological development. Despite the obvious connections between Medical Informatics, Nursing Informatics and Health Informatics, most of the methodologies and approaches used in Health Informatics have so far originated from health system management, care aspects and medical diagnostics. This paper explores the reasoning for a domain knowledge analysis that would establish Health Informatics as a domain and see it recognised as an intellectual discipline in its own right.
Abstract:
We present a novel approach for developing summary statistics for use in approximate Bayesian computation (ABC) algorithms using indirect inference. We embed this approach within a sequential Monte Carlo algorithm that is completely adaptive. This methodological development was motivated by an application involving data on macroparasite population evolution modelled with a trivariate Markov process. The main objective of the analysis is to compare inferences on the Markov process when considering two different indirect models. The two indirect models are based on a Beta-Binomial model and a three-component mixture of Binomials, with the former providing a better fit to the observed data.
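The comparison between the two indirect models can be illustrated by fitting both to the same counts by maximum likelihood and comparing the fits, for example by AIC. The data below are synthetic and the optimisation deliberately simple; this only shows the flavour of such a comparison, not the paper's analysis.

```python
import numpy as np
from scipy.stats import betabinom, binom
from scipy.optimize import minimize
from scipy.special import logsumexp, expit, softmax

N_TRIALS = 20
rng = np.random.default_rng(5)
counts = rng.binomial(N_TRIALS, rng.beta(2, 5, size=200))   # hypothetical observed counts

def betabinom_nll(params):
    a, b = np.exp(params)                                    # keep shape parameters positive
    return -betabinom.logpmf(counts, N_TRIALS, a, b).sum()

def mixture_nll(params):
    # Three binomial components: success probabilities via expit, weights via softmax.
    probs = expit(params[:3])
    logw = np.log(softmax(params[3:]))
    comp = np.stack([binom.logpmf(counts, N_TRIALS, p) for p in probs])  # shape (3, n)
    return -logsumexp(comp + logw[:, None], axis=0).sum()

bb = minimize(betabinom_nll, x0=np.zeros(2), method="Nelder-Mead")
mix = minimize(mixture_nll, x0=np.array([-1.0, 0.0, 1.0, 0.0, 0.0, 0.0]),
               method="Nelder-Mead")

aic_bb = 2 * 2 + 2 * bb.fun            # 2 free parameters
aic_mix = 2 * 5 + 2 * mix.fun          # 3 probabilities + 2 free weights
print(f"beta-binomial AIC: {aic_bb:.1f}   mixture-of-binomials AIC: {aic_mix:.1f}")
```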
Abstract:
This article outlines the research approach used in the international 1000 Voices Project. The 1000 Voices project is an interdisciplinary research and public awareness project that uses a customised online multimodal storytelling platform to explore the lives of people with disability internationally. Through the project, researchers and partners have encouraged diverse participants to select the modes of storytelling (e.g. images, text, videos and combinations thereof) that suit them best and to self-define what both ‘disability’ and ‘life story’ mean to them. The online reflective component of the approach encourages participants to organically and reflectively develop story events and revisions over time in ways that suit them and their emerging lives. This article provides a detailed summary of the project's theoretical and methodological development alongside suggestions for future development in social work and qualitative research.
Abstract:
While a number of scholars have explored the special exigencies of local as opposed to metropolitan journalism, rarely have studies examined such differences in relation to journalism culture as constituted by journalists’ professional views. To address the gap in our knowledge, this study reports results from a representative survey of local and metropolitan newspaper journalists in Australia. Findings suggest that territorial context accounts for some significant differences in journalists’ demographics, as well as their role perceptions. In line with past research, local newspaper journalists exhibit much stronger support for the community forum and advocacy role. At the same time, and contrary to expectations, there is very little difference in their support of the watchdog role compared with metropolitan journalists. By combining questions about journalistic ideals and their enactment in journalists' work, and finding differences between the two, this study also has important implications for the methodological development of survey studies.
Abstract:
Data assimilation provides an initial atmospheric state, called the analysis, for Numerical Weather Prediction (NWP). This analysis consists of pressure, temperature, wind, and humidity on a three-dimensional NWP model grid. Data assimilation blends meteorological observations with the NWP model in a statistically optimal way. The objective of this thesis is to describe the methodological development carried out in order to allow data assimilation of ground-based measurements of the Global Positioning System (GPS) into the High Resolution Limited Area Model (HIRLAM) NWP system. Geodetic processing produces observations of tropospheric delay. These observations can be processed either for vertical columns at each GPS receiver station, or for the individual propagation paths of the microwave signals. These alternative processing methods result in Zenith Total Delay (ZTD) and Slant Delay (SD) observations, respectively. ZTD and SD observations are of use in the analysis of atmospheric humidity. A method is introduced for the estimation of the horizontal error covariance of ZTD observations. The method makes use of observation-minus-model-background (OmB) sequences of ZTD and conventional observations. It is demonstrated that the ZTD observation error covariance is relatively large at station separations shorter than 200 km, but non-zero covariances also appear at considerably larger station separations. The relatively low density of radiosonde observing stations limits the ability of the proposed estimation method to resolve the shortest length-scales of the error covariance. SD observations are shown to contain a statistically significant signal on the asymmetry of the atmospheric humidity field. However, the asymmetric component of SD is found to be nearly always smaller than the standard deviation of the SD observation error. SD observation modelling is described in detail, and other issues relating to SD data assimilation are also discussed. These include the determination of error statistics, the tuning of observation quality control and the means of taking local observation error correlations into account. The experiments show that the data assimilation system is able to retrieve the asymmetric information content of hypothetical SD observations at a single receiver station. Moreover, the impact of real SD observations on the humidity analysis is comparable to that of other observing systems.
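A minimal sketch of the general flavour of such an error-covariance estimation, assuming OmB time series and station coordinates are available as arrays (all stations, values and bin widths below are hypothetical, and the thesis' actual method is more refined): covariances of station-pair OmB series are averaged within separation-distance bins.

```python
import numpy as np
from itertools import combinations

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

def binned_omb_covariance(omb, lats, lons, bin_edges_km):
    """Average covariance of station-pair OmB time series per separation bin.

    omb: array (n_times, n_stations) of ZTD observation-minus-background values.
    """
    n_sta = omb.shape[1]
    dists, covs = [], []
    for i, j in combinations(range(n_sta), 2):
        dists.append(haversine_km(lats[i], lons[i], lats[j], lons[j]))
        covs.append(np.cov(omb[:, i], omb[:, j])[0, 1])
    dists, covs = np.asarray(dists), np.asarray(covs)
    idx = np.digitize(dists, bin_edges_km)
    return [covs[idx == k].mean() if np.any(idx == k) else np.nan
            for k in range(1, len(bin_edges_km))]

# Hypothetical network: 30 stations, 500 OmB samples each.
rng = np.random.default_rng(6)
lats, lons = rng.uniform(60, 65, 30), rng.uniform(20, 30, 30)
omb = rng.normal(0, 5.0, (500, 30))              # mm; uncorrelated noise in this toy example
edges = np.arange(0, 600, 100)                   # 100 km bins out to 500 km
print(binned_omb_covariance(omb, lats, lons, edges))
```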
Abstract:
Herbivorous insects, their host plants and natural enemies form the largest and most species-rich communities on earth. But what forces structure such communities? Do they represent random collections of species, or are they assembled by given rules? To address these questions, food webs offer excellent tools. As a result of their versatile information content, such webs have become the focus of intensive research over the last few decades. In this thesis, I study herbivore-parasitoid food webs from a new perspective: I construct multiple, quantitative food webs in a spatially explicit setting, at two different scales. Focusing on food webs consisting of specialist herbivores and their natural enemies on the pedunculate oak, Quercus robur, I examine consistency in food web structure across space and time, and how landscape context affects this structure. As an important methodological development, I use DNA barcoding to resolve potential cryptic species in the food webs, and to examine their effect on food web structure. I find that DNA barcoding changes our perception of species identity for as many as a third of the individuals, by reducing misidentifications and by resolving several cryptic species. In terms of the variation detected in food web structure, I find surprising consistency in both space and time. From a spatial perspective, landscape context leaves no detectable imprint on food web structure, while species richness declines significantly with decreasing connectivity. From a temporal perspective, food web structure remains predictable from year to year, despite considerable species turnover in local communities. The rate of such turnover varies between guilds and between species within guilds. These observations are best explained by the abundant and common species, which have a quantitatively dominant imprint on overall structure and suffer the lowest turnover. By contrast, rare species with little impact on food web structure exhibit the highest turnover rates. These patterns reveal important limitations of modern metrics of quantitative food web structure. While such metrics accurately describe the overall topology of the web and its most significant interactions, they are disproportionately affected by species with given traits, and insensitive to the specific identity of species. As rare species have been shown to be important for food web stability, metrics depicting quantitative food web structure should therefore not be used as the sole descriptors of communities in a changing world. To detect and resolve the versatile imprint of global environmental change, one should rather use these metrics as one tool among several.
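To illustrate why quantitative food-web metrics are dominated by abundant links, the sketch below computes the Shannon diversity of interactions for a small hypothetical host-parasitoid matrix and compares the value with and without the rarest links. This is a generic metric chosen for illustration, not necessarily the set of metrics used in the thesis.

```python
import numpy as np

def interaction_shannon_diversity(web):
    """Shannon diversity of interactions in a quantitative food web.

    web: matrix of interaction frequencies (rows = lower trophic level,
    columns = upper trophic level). Abundant links dominate the value,
    which is why rare species contribute little to such metrics.
    """
    p = web[web > 0] / web.sum()
    return -(p * np.log(p)).sum()

# Hypothetical herbivore-parasitoid web: 4 herbivores x 3 parasitoids.
web = np.array([[120,   5,  0],
                [ 40,  60,  2],
                [  0,   1,  1],
                [  3,   0, 90]], dtype=float)

h_all = interaction_shannon_diversity(web)
h_no_rare = interaction_shannon_diversity(np.where(web >= 5, web, 0.0))
print(f"interaction diversity: {h_all:.3f}; after dropping rare links: {h_no_rare:.3f}")
```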
Abstract:
Improvements in methods for the detection and enumeration of microbes in water, particularly the application of techniques of molecular biology, have highlighted shortcomings in the "standard methods" for assessing water quality. Higher expectations from the consumer and increased publicity associated with pollution incidents can lead to an uncoupling of the cycle which links methodological development with standard-setting and legislation. The new methodology has also highlighted problems within the water cycle, related to the introduction, growth and metabolism of microbes. A greater understanding of the true diversity of the microbial community and the ability to transmit genetic information within aquatic systems ensures that the subject of this symposium and volume provides an ideal forum to discuss the problems encountered by both researcher and practitioner.