Abstract:
This thesis examines the interaction between the user and the digital archive. The aim of the study is to support an in-depth examination of the interaction process, with a view to making recommendations and providing tools for system designers and archival professionals to promote development of the digital archive domain. Following a comprehensive literature review, an urgent requirement for models was identified. The Model of Contextual Interaction presented in this thesis aims to provide a conceptual model through which the interaction process between the user and the digital archive can be examined. Using a five-phased research development framework, the study presents a structured account of its methods, employing a multi-method methodology to ensure robust data collection and analysis. The findings of the study are presented across the Model of Contextual Interaction and provide the basis on which recommendations and tools for system designers have been made. The thesis concludes with a summary of key findings and a reflective account of how the findings and the Model of Contextual Interaction have influenced digital provision within the archive domain, and of how the model could be applied to other domains.
Abstract:
This thesis investigates the application of plasmonic gold nanostructures for mercury detection. Various gold and silver single nanostructures and gold nanostructure assemblies were characterised in detail by correlated single-nanostructure spectroscopy and electron microscopy. Several routes for mercury detection were explored: plasmon resonance energy transfer (PRET) upon Hg2+ binding to immobilised gold nanoparticle-organic ligand hybrid structures, and amalgamation of single immobilised gold nanorods upon chemical and upon electrochemical reduction of Hg2+ ions. The amalgamation approach showed large potential, with extraordinary shifts of the nanorods' scattering spectra upon exposure to reduced mercury, a result of the compositional and morphological changes induced in the nanorod by amalgamation with mercury. A shift of 5 nm could be recorded for a concentration as low as 10 nM Hg2+. Through detailed time-dependent experiments, insights into the amalgamation mechanism were gained and a five-step model was developed. Finally, spectroelectrochemistry proved to be an excellent way to study the amalgamation of mercury with gold nanorods in situ and in real time, paving the way for future work in this field.
Abstract:
In this paper, we examine exchange rates in Vietnam's transitional economy. Evidence of long-run equilibrium is established in most cases through a single co-integrating vector among the endogenous variables that determine the real exchange rates. This supports relative PPP, in which the error-correction terms (ECT) of the system can be combined linearly into a stationary process, reducing deviation from PPP in the long run. Restricted coefficient vectors β' = (1, 1, -1) for the real exchange rates of the currencies in question are not rejected. This evidence for relative PPP adds to findings by many researchers, including Flre et al. (1999), Lee (1999), Johnson (1990), Culver and Papell (1999), and Cuddington and Liang (2001). Instead of testing different time series against a common base currency, we use different base currencies (USD, GBP, JPY and EUR). By doing so we ask whether the theory posits significant differences against any one currency. We find consensus, allowing for inevitable technical differences, even with the smaller data sample for EUR. Speeds of convergence to PPP and of adjustment are faster than those reported for developed economies, using both observed and bootstrapped half-life (HL) measures. A likely explanation is the adjustment from the hyperinflation period, after which theory indicates that the adjustment process actually accelerates. We observe that deviation appears to have been large in the early stages of the reform, mostly overvaluation; over time, its correction took place, leading significant deviations to gradually disappear.
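The half-life (HL) measure mentioned above has a standard convention in the PPP literature: if deviations from parity decay as an AR(1) process with coefficient ρ, the half-life is ln(0.5)/ln(ρ). A minimal sketch of that convention (not the paper's bootstrapped procedure, and with an illustrative ρ):

```python
import math

def ppp_half_life(rho: float) -> float:
    """Half-life (in periods) of a PPP deviation decaying as an AR(1)
    process with autoregressive coefficient rho (0 < rho < 1)."""
    return math.log(0.5) / math.log(rho)

# Illustrative value: if deviations retain 80% of their size each
# period, half of a shock dissipates in roughly 3.1 periods.
print(round(ppp_half_life(0.8), 1))
```

Faster convergence, as reported for the Vietnamese data, corresponds to a smaller ρ and hence a shorter half-life.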
Abstract:
In the presence of a chemical potential, the physics of level crossings leads to singularities at zero temperature, even when the spatial volume is finite. These singularities are smoothed out at a finite temperature but leave behind nontrivial finite size effects which must be understood in order to extract thermodynamic quantities using Monte Carlo methods, particularly close to critical points. We illustrate some of these issues using the classical nonlinear O(2) sigma model with a coupling β and chemical potential μ on a 2+1-dimensional Euclidean lattice. In the conventional formulation this model suffers from a sign problem at nonzero chemical potential and hence cannot be studied with the Wolff cluster algorithm. However, when formulated in terms of the worldline of particles, the sign problem is absent, and the model can be studied efficiently with the "worm algorithm." Using this method we study the finite size effects that arise due to the chemical potential and develop an effective quantum mechanical approach to capture the effects. As a side result we obtain energy levels of up to four particles as a function of the box size and uncover a part of the phase diagram in the (β,μ) plane. © 2010 The American Physical Society.
Abstract:
The growth and proliferation of invasive bacteria in engineered systems is an ongoing problem. While there are a variety of physical and chemical processes to remove and inactivate bacterial pathogens, there are many situations in which these tools are no longer effective or appropriate for the treatment of a microbial target. For example, certain strains of bacteria are becoming resistant to commonly used disinfectants, such as chlorine and UV. Additionally, the overuse of antibiotics has contributed to the spread of antibiotic resistance, and there is concern that wastewater treatment processes are contributing to the spread of antibiotic resistant bacteria.
Due to the continually evolving nature of bacteria, it is difficult to develop methods for universal bacterial control in a wide range of engineered systems, as many of our treatment processes are static in nature. Still, invasive bacteria are present in many natural and engineered systems where the application of broad-acting disinfectants is impractical, because their use may inhibit the desired bioprocesses. Therefore, to better control the growth of treatment-resistant bacteria and to address limitations of the current disinfection processes, novel tools that are both specific and adaptable need to be developed and characterized.
In this dissertation, two possible biological disinfection processes were investigated for use in controlling invasive bacteria in engineered systems. First, antisense gene silencing, which is the specific use of oligonucleotides to silence gene expression, was investigated. This work was followed by the investigation of bacteriophages (phages), which are viruses that are specific to bacteria, in engineered systems.
For the antisense gene silencing work, a computational approach was used to quantify the number of off-targets and to determine the effects of off-targets in prokaryotic organisms. For the organisms of
Regarding the work with phages, the disinfection rates of bacteria in the presence of phages was determined. The disinfection rates of
In addition to determining disinfection rates, the long-term bacterial growth inhibition potential was determined for a variety of phages with both Gram-negative and Gram-positive bacteria. It was determined that, on average, phages can be used to inhibit bacterial growth for up to 24 h, and that this effect was concentration dependent for various phages at specific time points. Additionally, it was found that a phage cocktail was no more effective at inhibiting bacterial growth over the long term than the best performing phage in isolation.
Finally, for an industrial application, the use of phages to inhibit invasive
In conclusion, this dissertation improved the current methods for designing antisense gene silencing targets for prokaryotic organisms, and characterized phages from an engineering perspective. First, the current design strategy for antisense targets in prokaryotic organisms was improved through the development of an algorithm that minimized the number of off-targets. For the phage work, a framework was developed to predict the disinfection rates in terms of the initial phage and bacterial concentrations. In addition, the long-term bacterial growth inhibition potential of multiple phages was determined for several bacteria. In regard to the phage application, phages were shown to protect both final product yields and yeast concentrations during fermentation. Taken together, this work suggests that the rational design of phage treatment is possible and further work is needed to expand on this foundation.
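The framework above predicts disinfection rates from the initial phage and bacterial concentrations. A minimal sketch of one common way such a dependence is modelled, assuming simple mass-action (adsorption-driven) kinetics with phage in excess; the rate constant k and all concentrations below are illustrative, not the dissertation's fitted values:

```python
import math

def surviving_bacteria(b0: float, p0: float, k: float, t: float) -> float:
    """Bacterial concentration (CFU/mL) after t hours, assuming
    inactivation proportional to both concentrations and phage excess,
    so the phage concentration stays approximately p0 (PFU/mL).
    k is a hypothetical adsorption rate constant, mL/(PFU*h)."""
    return b0 * math.exp(-k * p0 * t)

# Illustrative run: 1e6 CFU/mL bacteria, 1e8 PFU/mL phage, 24 h exposure.
remaining = surviving_bacteria(1e6, 1e8, k=1e-9, t=24.0)
```

Under this sketch the log-reduction scales linearly with the initial phage concentration, which is one way a concentration-dependent effect like that reported above can arise.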
Abstract:
Cells are fundamental units of life, but little is known about the evolution of cell states. Induced pluripotent stem cells (iPSCs) are once-differentiated cells that have been reprogrammed to an embryonic stem cell-like state, providing a powerful platform for biology and medicine. However, they have been limited to a few mammalian species. Here we found that a set of four mammalian transcription factor genes used to generate iPSCs in mouse and humans can induce a partially reprogrammed pluripotent stem cell (PRPSC) state in vertebrate and invertebrate model organisms, in mammals, birds, fish, and fly, which span 550 million years from a common ancestor. These findings are among the first to show cross-lineage stem cell-like induction, and to generate pluripotent-like cells for several of these species with in vivo chimeras. We suggest that the stem-cell state may be highly conserved across a wide phylogenetic range. DOI: http://dx.doi.org/10.7554/eLife.00036.001.
Abstract:
Physarum polycephalum is a well-studied microbial eukaryote with unique experimental attributes relative to other experimental model organisms. It has a sophisticated life cycle with several distinct stages including amoebal, flagellated, and plasmodial cells. It is unusual in switching between open and closed mitosis according to specific life-cycle stages. Here we present the analysis of the genome of this enigmatic and important model organism and compare it with closely related species. The genome is littered with simple and complex repeats, and the coding regions are frequently interrupted by introns with a mean size of 100 bases. Complemented with extensive transcriptome data, we define approximately 31,000 gene loci, providing unexpected insights into early eukaryote evolution. We describe extensive use of histidine kinase-based two-component systems and tyrosine kinase signaling, the presence of bacterial and plant type photoreceptors (phytochromes, cryptochrome, and phototropin) and of plant-type pentatricopeptide repeat proteins, as well as metabolic pathways, and a cell cycle control system typically found in more complex eukaryotes. Our analysis characterizes P. polycephalum as a prototypical eukaryote with features attributed to the last common ancestor of Amorphea, that is, the Amoebozoa and Opisthokonts. Specifically, the presence of tyrosine kinases in Acanthamoeba and Physarum as representatives of two distantly related subdivisions of Amoebozoa argues against the later emergence of tyrosine kinase signaling in the opisthokont lineage and also against its acquisition by horizontal gene transfer.
Abstract:
In this paper, the buildingEXODUS (V1.1) evacuation model is described and discussed and attempts at qualitative and quantitative model validation are presented. The data set used for the validation is the Tsukuba pavilion evacuation data. This data set is of particular interest as the evacuation was influenced by external conditions, namely inclement weather. As part of the validation exercise, the sensitivity of the buildingEXODUS predictions to a range of variables and conditions is examined, including: exit flow capacity, occupant response times and the impact of external conditions on the developing evacuation. The buildingEXODUS evacuation model was found to be able to produce good qualitative and quantitative agreement with the experimental data.
Abstract:
In this paper, the buildingEXODUS (V1.1) evacuation model is described and discussed and attempts at qualitative and quantitative model validation are presented. The data sets used for validation are the Stapelfeldt and Milburn House evacuation data. As part of the validation exercise, the sensitivity of the buildingEXODUS predictions to a range of variables is examined, including: occupant drive, occupant location, exit flow capacity, exit size, occupant response times and geometry definition. An important consideration that has been highlighted by this work is that any validation exercise must be scrutinised to identify both the results generated and the considerations and assumptions on which they are based. During the course of the validation exercise, both data sets were found to be less than ideal for the purpose of validating complex evacuation models. However, the buildingEXODUS evacuation model was found to be able to produce reasonable qualitative and quantitative agreement with the experimental data.
Abstract:
There is concern in the Cross-Channel region of Nord-Pas-de-Calais (France) and Kent (Great Britain) regarding the extent of atmospheric pollution detected in the area from emitted gaseous (VOC, NOx, SO2) and particulate substances. In particular, the air quality of the Cross-Channel or "Trans-Manche" region is highly affected by the heavily industrial area of Dunkerque, in addition to transportation sources linked to cross-channel traffic in Kent and Calais, posing threats to the environment and human health. In the framework of the cross-border EU Interreg IIIA activity, the joint Anglo-French project ATTMA has been commissioned to study Aerosol Transport in the Trans-Manche Atmosphere. Using ground monitoring data from UK and French networks, and with the assistance of satellite images, the project aims to determine dispersion patterns and identify the sources responsible for the pollutants. The findings of this study will increase awareness and have a bearing on future air quality policy in the region. Public interest is evident in the presence of local authorities on both sides of the English Channel as collaborators. The research is based on pollution transport simulations using (a) Lagrangian Particle Dispersion (LPD) models and (b) an Eulerian receptor-based model. This paper is concerned with part (a), the LPD models. Lagrangian Particle Dispersion models are often used to numerically simulate the dispersion of a passive tracer in the planetary boundary layer by calculating the Lagrangian trajectories of thousands of notional particles. In this contribution, the project investigated two widely used particle dispersion models: the Hybrid Single Particle Lagrangian Integrated Trajectory (HYSPLIT) model and the model FLEXPART. In both models, forward tracking and inverse (or receptor-based) modes are possible.
Certain distinct pollution episodes have been selected from the monitor database EXPER/PF and from UK monitoring stations, and their likely trajectories predicted using prevailing weather data. Global meteorological datasets were downloaded from the ECMWF MARS archive. Part of the difficulty in identifying pollution sources arises from the fact that much of the pollution originates outside the monitoring area. For example, heightened particulate concentrations are thought to originate from sand storms in the Sahara, or volcanic activity in Iceland or the Caribbean; this work identifies such long-range influences. The output of the simulations shows that there are notable differences between the formulations of FLEXPART and HYSPLIT, although both models used the same meteorological data and source input, suggesting that the identification of the primary emissions during air pollution episodes may be rather uncertain.
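The core idea behind LPD models such as HYSPLIT and FLEXPART is to advect many notional particles by a mean wind plus a stochastic turbulent velocity. A toy sketch of one such step, with illustrative wind, turbulence, and time-step values that are not taken from either model:

```python
import random

def advect(particles, wind=(5.0, 1.0), sigma=0.8, dt=60.0, steps=10):
    """Advance (x, y) particle positions in metres.

    Each step adds the mean wind (m/s) plus a Gaussian turbulent
    perturbation of standard deviation sigma (m/s), over dt seconds.
    All parameter values here are illustrative placeholders.
    """
    out = []
    for (x, y) in particles:
        for _ in range(steps):
            x += (wind[0] + random.gauss(0.0, sigma)) * dt
            y += (wind[1] + random.gauss(0.0, sigma)) * dt
        out.append((x, y))
    return out

# Release a cloud of 1000 notional particles from the origin.
cloud = advect([(0.0, 0.0)] * 1000)
```

Running the same stochastic scheme backwards in time from a receptor gives the inverse (receptor-based) mode mentioned above; real models add vertical motion, boundary-layer physics, and gridded meteorological input.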
Abstract:
In this paper, the buildingEXODUS evacuation model is described and discussed and attempts at qualitative and quantitative model validation are presented. The data sets used for validation are the Stapelfeldt and Milburn House evacuation data. As part of the validation exercise, the sensitivity of the buildingEXODUS predictions to a range of variables is examined, including occupant drive, occupant location, exit flow capacity, exit size, occupant response times and geometry definition. An important consideration that has been highlighted by this work is that any validation exercise must be scrutinised to identify both the results generated and the considerations and assumptions on which they are based. During the course of the validation exercise, both data sets were found to be less than ideal for the purpose of validating complex evacuation models. However, the buildingEXODUS evacuation model was found to be able to produce reasonable qualitative and quantitative agreement with the experimental data.
Abstract:
We present an extensive dataset of dimethylsulphide (DMS, n = 651) and dimethylsulphoniopropionate (DMSP, n = 590) from the Atlantic Meridional Transect programme. These data are used to derive representative depth profiles that illustrate observed natural variations and can be used for DMS and DMSP model validation in oligotrophic waters. To further understand our dataset, we interpret the data with a wide range of accompanying parameters that characterise the prevailing biogeochemical conditions and phytoplankton community physiology, activity, taxonomic composition, and capacity to cope with light stress. No correlations were observed with typical biomarker pigments for DMSP-producing species. However, strong correlations were found between DMSP and primary production by cells >2 µm in diameter, and between DMSP and some photo-protective pigments. These parameters are measures of mixed phytoplankton communities, so we infer that such associations are likely to be stronger in DMSP-producing organisms. Further work is warranted to develop links between community parameters, DMS and DMSP at the global scale.
Abstract:
In this paper, NOx emissions modelling for real-time operation and control of a 200 MWe coal-fired power generation plant is studied. Three model types are compared. For the first model, the fundamentals governing the NOx formation mechanisms and a system identification technique are used to develop a grey-box model. Then a linear AutoRegressive model with eXogenous inputs (ARX) and a non-linear ARX model (NARX) are built. Plant operation data are used for modelling and validation. Model cross-validation tests show that the developed grey-box model consistently produces better overall long-term prediction performance than the other two models.
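To illustrate the ARX model class compared above, here is a minimal sketch of a first-order ARX(1,1) model, y[t] = a·y[t-1] + b·u[t-1] + e[t], fitted by ordinary least squares via its 2x2 normal equations. This shows the model class only; the paper's models use higher orders, real plant data, and a grey-box structure not reproduced here:

```python
def fit_arx11(u, y):
    """Return (a, b) minimising the sum of squared one-step-ahead
    prediction errors of y[t] = a*y[t-1] + b*u[t-1]."""
    s_yy = s_uu = s_yu = s_yt = s_ut = 0.0
    for t in range(1, len(y)):
        y1, u1 = y[t - 1], u[t - 1]
        s_yy += y1 * y1          # sum y[t-1]^2
        s_uu += u1 * u1          # sum u[t-1]^2
        s_yu += y1 * u1          # cross term
        s_yt += y1 * y[t]        # regressor-target products
        s_ut += u1 * y[t]
    det = s_yy * s_uu - s_yu * s_yu
    a = (s_yt * s_uu - s_ut * s_yu) / det
    b = (s_ut * s_yy - s_yt * s_yu) / det
    return a, b

# Synthetic check: noise-free data generated with a=0.9, b=0.5
# should be recovered exactly by least squares.
u = [1.0 if t % 7 < 3 else 0.0 for t in range(200)]
y = [0.0]
for t in range(1, 200):
    y.append(0.9 * y[t - 1] + 0.5 * u[t - 1])
a, b = fit_arx11(u, y)
```

A NARX model replaces the linear combination of lagged terms with a nonlinear function of them, which is one reason it can capture NOx formation dynamics a linear ARX misses.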
Abstract:
The incorporation of one-dimensional simulation codes within engine modelling applications has proved to be a useful tool in evaluating unsteady gas flow through elements in the exhaust system. This paper reports on an experimental and theoretical investigation into the behaviour of unsteady gas flow through catalyst substrate elements. A one-dimensional (1-D) catalyst model has been incorporated into a 1-D simulation code to predict this behaviour.
Experimental data was acquired using a ‘single pulse’ test rig. Substrate samples were tested under ambient conditions in order to investigate a range of regimes experienced by the catalyst during operation. This allowed reflection and transmission characteristics to be quantified in relation to both geometric and physical properties of substrate elements. Correlation between measured and predicted results is demonstrably good and the model provides an effective analysis tool for evaluating unsteady gas flow through different catalytic converter designs.
Abstract:
Estimating a time interval and temporally coordinating movements in space are fundamental skills, but the relationships between these different forms of timing, and the neural processes they involve, are not well understood. While different theories have been proposed to account for time perception, time estimation, and the temporal patterns of coordination, there is no general mechanism that unifies these various timing skills. This study considers whether a model of perceptuo-motor timing, the tau(GUIDE), can also describe how certain judgements of elapsed time are made. To evaluate this, an equation for determining interval estimates was derived from the tau(GUIDE) model and tested in a task where participants had to throw a ball and estimate when it would hit the floor. The results showed that, in accordance with the model, very accurate judgements could be made without vision (mean timing error -19.24 msec), and the model was a good predictor of skilled participants' estimation timing. It was concluded that since the tau(GUIDE) principle provides temporal information in a generic form, it could be a unitary process linking different forms of timing.
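Tau-based guides build on the generic tau variable of a motion gap: the gap divided by its rate of closure, i.e. a first-order estimate of the time remaining until the gap closes. A minimal sketch of that variable, with illustrative numbers that are not the study's derived tau(GUIDE) equation:

```python
def tau(gap: float, closure_rate: float) -> float:
    """First-order time-to-closure of a motion gap: the gap divided by
    the magnitude of its rate of closure. Units cancel to seconds when
    gap is in metres and closure_rate in m/s."""
    return gap / closure_rate

# Illustrative numbers: a ball 1.0 m above the floor, descending at
# 4.0 m/s, has tau = 0.25 s. Under gravity the true remaining flight
# time is shorter, since the ball keeps accelerating; this is why
# tau-based guides are coupled dynamically over the movement rather
# than read once.
estimate = tau(1.0, 4.0)
```

The study's contribution is deriving an interval-estimation equation from the tau(GUIDE) model itself, so that the same informational variable serves both movement guidance and elapsed-time judgement.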