963 results for framework structure
Abstract:
A comprehensive study of the coupling of magnetism, electrical polarization and the crystalline lattice with off-stoichiometry effects in self-doped multiferroic hexagonal h-LuMnxO3±δ (0.92≤x≤1.12) ceramic oxides was carried out for this PhD work. There is a complex coupling between the three ferroic degrees of freedom. The cancellation of the ionic magnetic moments in the antiferromagnetic order, the electric polarization with its specific vortex/antivortex topology and the lattice properties have pushed researchers to find ways to uncover the physics and chemistry underlying the magneto-electric and magneto-elastic couplings of h-RMnO3 multiferroics. In this work, self-doping of the Lu or Mn sites of h-LuMnxO3±δ ceramics prepared via the solid-state route was used to pave the way for a deeper understanding of the antiferromagnetic transition, of the weak ferromagnetism often reported in the same crystalline lattices, and of the ferroelectric properties coupled to the imposed lattice changes. In line with the aim of the thesis, two objectives were set for the sintering study in the first experimental chapter. The first was to sinter off-stoichiometric samples under conditions reported in the bibliography and extracted from the LuMnxO3±δ phase diagrams, using multiple firings ending with a final high-temperature step at 1300 °C for 24 hours. The second was to explore longer annealing times of up to 240 hours at the fixed temperature of 1300 °C, in search of improved properties of the solid solution under study. All series of LuMnxO3±δ ceramics for each annealing time were characterized in order to build a framework enabling comparison of the measured properties with results available in the literature. XRD and Rietveld refinement of the data give the evolution of the lattice parameters as a function of x. Shrinkage of the lattice parameters with increasing x was observed, and the stability limit of the solid solution was determined from the analysis of the lattice parameters. The evolution of grain size and the presence of secondary phases were investigated by TEM, SEM, EDS and EBSD. The dependence of grain growth and of the regression of secondary phases on composition x and on annealing time was further characterized. The magnetic susceptibility and magnetic irreversibility of the samples were examined extensively. The magnetic susceptibility, the Néel ordering transition and other important magnetic parameters are determined and compared with observations in other multiferroics in the following chapter of the thesis. Because of their high sensitivity to minor traces of the secondary phase hausmannite, magnetic measurements are suggested as a cross-check of phase diagrams. The difficulty previous studies had in interpreting the magnetic anomaly below 43 K in h-RMnO3 oxides is discussed, and the anomaly is assigned to the Mn3O4 phase with the support of electron microscopy. The magneto-electric coupling, in which AFM ordering is coupled to the dielectric polarization, is investigated as a function of x and of the sintering conditions via frequency- and temperature-dependent complex dielectric constant measurements in the final chapter of the thesis. Within the limits of solid solubility, the crystalline lattice of the off-stoichiometric ceramics was shown to preserve the magneto-electric coupling at TN. To the author's knowledge, this is the first study of magneto-electric coupling modified by vacancy doping.
The studied lattices reveal atomic-scale distortions imposed by local changes of x that depend on the sintering conditions; these were inspected extensively by TEM/STEM methods, complemented with EDS and EELS spectroscopy, which together provide comprehensive information on the cross-coupling of distortions, inhomogeneity and electronic structure, assembled and discussed in a dedicated chapter. Internal interfaces inside the crystalline grains were examined. Qualitative explanations of the measured magnetic and ferroelectric properties were established in relation to the observed nanoscale features of the h-LuMnxO3±δ ceramics. Ferroelectric domains and topological defects are displayed in both TEM and AFM/PFM images, the latter technique being used to examine the size, distribution and switching of ferroelectric domains influenced by vacancy doping at the micron scale, bridging to the complementary TEM studies of the atomic structure of the ferroelectric domains. In support of the experimental study, DFT simulations using the Wien2K code were carried out to interpret the EELS spectra of the O K-edge and to obtain information on the hybridization of the cations with the oxygen ions. The Mn L3,2 edges are used to assess the oxidation state of the Mn ions inside the crystalline grains. In addition, rehybridization-driven ferroelectricity is evaluated by comparing the partial densities of states of the orbitals of all ions in the samples, and the polarization was calculated and correlated with the off-stoichiometry effect.
Abstract:
The purpose of this paper is twofold. Firstly, it presents a preliminary and ethnomethodologically-informed analysis of the way in which the growing structure of a particular program's code was ongoingly derived from its earliest stages. This was motivated by an interest in how the detailed structure of a completed program `emerged from nothing' as a product of the concrete practices of the programmer within the framework afforded by the language. The analysis is broken down into three sections that discuss: the beginnings of the program's structure; the incremental development of structure; and finally the code productions that constitute the structure and the importance of the programmer's stock of knowledge. The discussion attempts to understand and describe the emerging structure of code rather than focus on generating `requirements' for supporting the production of that structure. Due to time and space constraints, however, only a relatively cursory examination of these features was possible. Secondly, the paper presents some thoughts on the difficulties associated with the analytic (in particular ethnographic) study of code, drawing on general problems as well as issues arising from the difficulties and failings encountered as part of the analysis presented in the first section.
Abstract:
This PhD thesis contains three main chapters on macro finance, with a focus on the term structure of interest rates and the applications of state-of-the-art Bayesian econometrics. Except for Chapter 1 and Chapter 5, which set out the general introduction and conclusion, each of the chapters can be considered as a standalone piece of work. In Chapter 2, we model and predict the term structure of US interest rates in a data rich environment. We allow the model dimension and parameters to change over time, accounting for model uncertainty and sudden structural changes. The proposed time-varying parameter Nelson-Siegel Dynamic Model Averaging (DMA) predicts yields better than standard benchmarks. DMA performs better since it incorporates more macro-finance information during recessions. The proposed method allows us to estimate plausible real-time term premia, whose countercyclicality weakened during the financial crisis. Chapter 3 investigates global term structure dynamics using a Bayesian hierarchical factor model augmented with macroeconomic fundamentals. More than half of the variation in the bond yields of seven advanced economies is due to global co-movement. Our results suggest that global inflation is the most important factor among global macro fundamentals. Non-fundamental factors are essential in driving global co-movements, and are closely related to sentiment and economic uncertainty. Lastly, we analyze asymmetric spillovers in global bond markets connected to diverging monetary policies. Chapter 4 proposes a no-arbitrage framework of term structure modeling with learning and model uncertainty. The representative agent considers parameter instability, as well as the uncertainty in learning speed and model restrictions. The empirical evidence shows that apart from observational variance, parameter instability is the dominant source of predictive variance when compared with uncertainty in learning speed or model restrictions. When accounting for ambiguity aversion, the out-of-sample predictability of excess returns implied by the learning model can be translated into significant and consistent economic gains over the Expectations Hypothesis benchmark.
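The dynamic Nelson-Siegel structure underlying the DMA approach in Chapter 2 builds the yield curve from three factor loadings (level, slope, curvature). The sketch below is a minimal illustration of those loadings and a static least-squares fit of one cross-section; the maturities, yields and the decay parameter lam are illustrative assumptions, not values from the thesis, and the thesis itself makes the factors and model set time-varying via DMA.

import numpy as np

def nelson_siegel_loadings(tau, lam=0.0609):
    """Nelson-Siegel loadings (level, slope, curvature) at maturities tau in months."""
    tau = np.asarray(tau, dtype=float)
    x = lam * tau
    slope = (1 - np.exp(-x)) / x
    curvature = slope - np.exp(-x)
    level = np.ones_like(tau)
    return np.column_stack([level, slope, curvature])

# Illustrative cross-section of yields (percent) at several maturities (months).
maturities = np.array([3, 12, 24, 60, 120])
yields = np.array([0.5, 0.9, 1.4, 2.1, 2.8])

# Static OLS estimate of the three latent factors for this single cross-section.
X = nelson_siegel_loadings(maturities)
beta, *_ = np.linalg.lstsq(X, yields, rcond=None)
print("level, slope, curvature factors:", beta)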
Abstract:
This dissertation investigates the connection between spectral analysis and frame theory. When considering the spectral properties of a frame, we present a few novel results relating to the spectral decomposition. We first show that scalable frames have the property that the inner product of the scaling coefficients and the eigenvectors must equal the inverse eigenvalues. From this, we prove a similar result when an approximate scaling is obtained. We then focus on the optimization problems inherent to scalable frames by first showing that there is an equivalence between scaling a frame and optimization problems with a non-restrictive objective function. Various objective functions are considered, and an analysis of the solution type is presented. For linear objectives, we can encourage sparse scalings, and with barrier objective functions, we force dense solutions. We further consider frames in high dimensions and derive various solution techniques. From here, we restrict ourselves to various frame classes, to add more specificity to the results. Using frames generated from distributions allows for the placement of probabilistic bounds on scalability. For discrete distributions (Bernoulli and Rademacher), we bound the probability of encountering an ONB, and for continuous symmetric distributions (Uniform and Gaussian), we show that symmetry is retained in the transformed domain. We also prove several hyperplane-separation results. With the theory developed, we discuss graph applications of the scalability framework. We make a connection with graph conditioning, and show the infeasibility of the problem in the general case. After a modification, we show that any complete graph can be conditioned. We then present a modification of standard PCA (robust PCA) developed by Candès, and give some background on Electron Energy-Loss Spectroscopy (EELS). We design a novel scheme for processing EELS data through robust PCA and least-squares regression, and test this scheme on biological samples. Finally, we take the idea of robust PCA and apply the technique of kernel PCA to perform robust manifold learning. We derive the problem and present an algorithm for its solution. We also discuss the differences from RPCA that make theoretical guarantees difficult.
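As a concrete illustration of the scalability question discussed in this abstract, the sketch below tests whether a finite frame admits nonnegative weights making it a Parseval (tight) frame, by solving sum_i w_i f_i f_i^T = I with nonnegative least squares (w_i playing the role of squared scaling coefficients). This is a generic numerical check, not the dissertation's own algorithm, and the example frame is an arbitrary choice.

import numpy as np
from scipy.optimize import nnls

def scaling_weights(F):
    """F: d x m matrix whose columns are the frame vectors f_i.
    Solve sum_i w_i f_i f_i^T = I with w_i >= 0; a near-zero residual
    indicates the frame is scalable (w_i = c_i^2)."""
    d, m = F.shape
    # Each column of A is vec(f_i f_i^T); the target is vec(I).
    A = np.column_stack([np.outer(F[:, i], F[:, i]).ravel() for i in range(m)])
    b = np.eye(d).ravel()
    w, rnorm = nnls(A, b)
    return w, rnorm

# Example: three equiangular unit vectors in R^2 (Mercedes-Benz frame) are scalable.
angles = np.array([np.pi / 2, np.pi / 2 + 2 * np.pi / 3, np.pi / 2 + 4 * np.pi / 3])
F = np.vstack([np.cos(angles), np.sin(angles)])
w, res = scaling_weights(F)
print("weights:", w, "residual:", res)  # residual ~ 0, weights ~ 2/3 each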
Abstract:
The Theoretical and Experimental Tomography in the Sea Experiment (THETIS 1) took place in the Gulf of Lion to observe the evolution of the temperature field and the process of deep convection during the 1991-1992 winter. The temperature measurements consist of moored sensors, conductivity-temperature-depth and expendable bathythermograph surveys, and acoustic tomography. Because of this diverse data set, and since the field evolves rather fast, the analysis uses a unified framework based on estimation theory and implementing a Kalman filter. The resolution and the errors associated with the model are systematically estimated. Temperature is a good tracer of water masses. The time-evolving three-dimensional view of the field resulting from the analysis shows the details of the three classical convection phases: preconditioning, vigorous convection, and relaxation. In all phases, there is strong spatial nonuniformity, with mesoscale activity, short timescales, and sporadic evidence of advective events (surface capping, intrusions of Levantine Intermediate Water (LIW)). Deep convection, reaching 1500 m, was observed in late February; by late April the field had not yet returned to its initial conditions (strong deficit of LIW). Comparison with available atmospheric flux data shows that advection acts to delay the occurrence of convection and confirms the essential role of buoyancy fluxes. For this winter, the deep mixing results in an injection of anomalously warm water (ΔT ≈ 0.03 °C) to a depth of 1500 m, compatible with the deep warming previously reported.
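The analysis described above rests on a Kalman filter within an estimation-theory framework. To make the ingredients explicit (forecast, gain, error covariance), a generic one-step linear Kalman filter is sketched below; the state and observation operators are placeholders, not the actual tomography/advection model of the study.

import numpy as np

def kalman_step(x, P, y, A, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.
    x, P : prior state estimate and its error covariance
    y    : observation vector
    A, H : state transition and observation operators
    Q, R : model and observation error covariances"""
    # Forecast (time update)
    x_f = A @ x
    P_f = A @ P @ A.T + Q
    # Analysis (measurement update)
    S = H @ P_f @ H.T + R                 # innovation covariance
    K = P_f @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_a = x_f + K @ (y - H @ x_f)
    P_a = (np.eye(len(x)) - K @ H) @ P_f  # posterior error covariance
    return x_a, P_a

# Toy usage with a 2-state, 1-observation system (placeholder values).
x0, P0 = np.zeros(2), np.eye(2)
A, H = np.eye(2), np.array([[1.0, 0.0]])
Q, R = 0.01 * np.eye(2), np.array([[0.1]])
x1, P1 = kalman_step(x0, P0, np.array([0.5]), A, H, Q, R)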
Abstract:
110 p.
Abstract:
The Water Framework Directive (WFD) establishes Environmental Quality Standards (EQS) in marine water for 34 priority substances. Among these substances, 25 are hydrophobic and bioaccumulable (2 metals and 23 organic compounds). For these 25 substances, monitoring in the water matrix is not appropriate and an alternative matrix should be developed. Bivalve mollusks, particularly mussels (Mytilus edulis, Mytilus galloprovincialis), have been used by Ifremer in France as a quantitative biological indicator since 1979 to assess marine water quality. This study was carried out in order to determine thresholds in mussels at least as protective as the EQS in marine water laid down by the WFD. Three steps are defined: - Provide an overview of current knowledge of the relations between contaminant concentrations in marine water and in mussels, through the bioaccumulation factor (BAF) and the bioconcentration factor (BCF). This allows examination of how a BCF or a BAF can be determined: a BCF can be determined experimentally (according to US EPA or ASTM standards) or by Quantitative Structure-Activity Relationship (QSAR) models, for which four equations can be used for mussels. A BAF can be determined from field experiments, but no standard exists; it could be determined by QSAR, but this method is considered invalid for mussels, or by using an existing model (the Dynamic Budget Model), which is complex to use. - Collect concentration data in marine water (Cwater) from the bibliography for those 25 substances, and compare them with the concentrations in mussels (Cmussels) obtained through the French monitoring network for chemical contaminants (ROCCH) and the biological integrator network RINBIO. Depending on the available data, this allows the BAF or BCF (Cmussels/Cwater) to be determined from field data. - Compare the BAF and BCF values (when available) obtained with the various methods for these substances: experimental BCF from the bibliography, BCF calculated by QSAR, and BAF determined from field data. This study points out that experimental BCF data are available for 3 substances (Chlorpyrifos, HCH, Pentachlorobenzene). BCF by QSAR can be calculated for 20 substances. The use of field data allows 4 BAF values to be evaluated for organic compounds and 2 for metals. Using these BAF or BCF values, thresholds in shellfish can be determined as an alternative to the EQS in marine water.
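To make the two quantities concrete, the sketch below computes a field BAF as Cmussels/Cwater and a QSAR-type BCF from a generic linear relation log10(BCF) = a*log10(Kow) + b. The coefficients, concentrations and units are illustrative placeholders, not the specific QSAR equations or monitoring data used in this study.

def field_baf(c_mussel_ug_per_kg, c_water_ug_per_l):
    """Field bioaccumulation factor (L/kg): BAF = Cmussels / Cwater."""
    return c_mussel_ug_per_kg / c_water_ug_per_l

def qsar_bcf(log_kow, a=0.85, b=-0.70):
    """Generic linear QSAR: log10(BCF) = a*log10(Kow) + b (illustrative coefficients)."""
    return 10 ** (a * log_kow + b)

# Illustrative numbers only.
print(field_baf(c_mussel_ug_per_kg=120.0, c_water_ug_per_l=0.004))  # ~30000 L/kg
print(qsar_bcf(log_kow=5.2))                                         # BCF from the QSAR line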
Abstract:
Part 11: Reference and Conceptual Models
Abstract:
This paper presents a harmonised framework for sediment quality assessment and dredging material characterisation for estuaries and port zones of the North and South Atlantic. This framework, based on the weight-of-evidence approach, provides a structure and a process for conducting sediment/dredging material assessment that leads to a decision. The main structure consists of step 1 (examination of available data); step 2 (chemical characterisation and toxicity assessment); decision 1 (is any chemical level higher than reference values? are sediments toxic?); step 3 (assessment of benthic community structure); step 4 (integration of the results); decision 2 (are sediments toxic or is the benthic community impaired?); step 5 (construction of the decision matrix); and decision 3 (is there environmental risk?). The sequence of assessments may be interrupted when the information obtained is judged sufficient for a correct characterisation of the risk posed by the sediments/dredging material. This framework brings novel features compared with other sediment/dredging material risk assessment frameworks: data integration through multivariate analysis allows the identification of which samples are toxic and/or related to impaired benthic communities; it also discriminates the chemicals responsible for negative biological effects; and the framework dispenses with the use of a reference area. We demonstrate the successful application of this framework in different port and estuarine zones of the North (Gulf of Cadiz) and South Atlantic (Santos and Paranagua Estuarine Systems).
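The tiered logic (steps 1-5 with three decision points) can be written down schematically. The sketch below is a simplified, hypothetical encoding of that flow for a single sample, with boolean placeholders standing in for the chemical, toxicity and benthic lines of evidence; it is not the paper's decision matrix.

def assess_sediment(chem_exceeds_reference, is_toxic, benthos_impaired):
    """Simplified, hypothetical encoding of the tiered weight-of-evidence flow."""
    # Step 2 / decision 1: chemical characterisation and toxicity screening.
    if not chem_exceeds_reference and not is_toxic:
        return "no further assessment required"
    # Step 3 / decision 2: benthic community assessment.
    if not is_toxic and not benthos_impaired:
        return "chemical exceedance only: weight of evidence does not indicate risk"
    # Steps 4-5 / decision 3: integrate the lines of evidence into the decision matrix.
    return "environmental risk indicated: management action needed"

print(assess_sediment(chem_exceeds_reference=True, is_toxic=False, benthos_impaired=False))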
Abstract:
Most studies devoted to thiolated gold clusters assume that their core and Au-S framework do not suffer distortion regardless of the protecting ligands (-SR), and it is taken as acceptable to simplify the ligand to SCH3. This work delivers a systematic study of the structure and vibrational properties (IR and Raman) of the Au18(SR)14 cluster. The goal is to understand how the vibrational properties displayed by the thiolated Au18 cluster depend on the ligand type. A set of six ligands was considered in the calculations of the vibrational properties, which are based on density functional theory (DFT) and on its dispersion-corrected approach (DFT-D).
Abstract:
The increase in resolution of numerical weather prediction models has allowed more and more realistic forecasts of atmospheric parameters. Owing to the growing variability of the predicted fields, traditional verification methods are not always able to describe model skill, because they are based on grid-point-by-grid-point matching between observation and prediction. Recently, new spatial verification methods have been developed with the aim of showing the benefit associated with high-resolution forecasts. Within the MesoVICT international project, the initial aim of this work is to compare the new techniques, highlighting their advantages and disadvantages. First of all, the MesoVICT basic examples, represented by synthetic precipitation fields, were examined. The SAL method, which gives an error evaluation in terms of structure, amplitude and location of the precipitation fields, was then studied more thoroughly than the other approaches and implemented for the core cases of the project. The verification procedure concerned precipitation fields over central Europe: the forecasts of the 00z COSMO-2 model were compared with the VERA (Vienna Enhanced Resolution Analysis). The study of these cases revealed some weaknesses of the methodology; in particular, a correlation between the optimal domain size and the extent of the precipitation systems was highlighted. In order to increase the ability of SAL, the original domain was subdivided into three subdomains and the method was applied again. Some limits were found in cases in which at least one of the two domains shows no precipitation. The overall results for the subdomains were summarized in scatter plots. With the aim of identifying systematic errors of the model, the variability of the three parameters was studied for each subdomain.
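For reference, the amplitude and the first location component of SAL follow directly from domain averages and centres of mass of the two precipitation fields. The numpy sketch below follows those standard definitions on synthetic placeholder fields (not MesoVICT data); the structure component, which requires identifying precipitation objects, is omitted.

import numpy as np

def sal_amplitude(fcst, obs):
    """A = (D(fcst) - D(obs)) / (0.5 * (D(fcst) + D(obs))), D = domain mean; range [-2, 2]."""
    d_f, d_o = fcst.mean(), obs.mean()
    return (d_f - d_o) / (0.5 * (d_f + d_o))

def sal_location1(fcst, obs):
    """L1 = distance between the fields' centres of mass, scaled by the largest domain distance."""
    def centre_of_mass(field):
        idx = np.indices(field.shape).reshape(2, -1)
        w = field.ravel()
        return (idx * w).sum(axis=1) / w.sum()
    d_max = np.hypot(*[s - 1 for s in fcst.shape])
    return np.linalg.norm(centre_of_mass(fcst) - centre_of_mass(obs)) / d_max

# Synthetic example fields (placeholders).
obs = np.zeros((50, 50)); obs[10:20, 10:20] = 5.0
fcst = np.zeros((50, 50)); fcst[15:25, 20:30] = 4.0
print("A =", sal_amplitude(fcst, obs), "L1 =", sal_location1(fcst, obs))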
Abstract:
User Quality of Experience (QoE) is a subjective entity and difficult to measure. One important aspect of it, User Experience (UX), corresponds to the sensory and emotional state of a user. For a user interacting through a User Interface (UI), precise information on how they are using the UI can contribute to understanding their UX, and thereby their QoE. As well as a user's use of the UI, such as clicking, scrolling, touching or selecting, other real-time digital information about the user, such as data from smartphone sensors (e.g. accelerometer, light level) and physiological sensors (e.g. heart rate, ECG, EEG), could contribute to understanding UX. Baran is a framework designed to capture, record, manage and analyse the User Digital Imprint (UDI), which is the data structure containing all user context information. Baran simplifies the process of collecting experimental information in Human-Computer Interaction (HCI) studies by recording comprehensive real-time data for any UI experiment and making the data available as a standard UDI data structure. This paper presents an overview of the Baran framework and provides an example of its use to record user interaction and perform some basic analysis of the interaction.
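The paper centres on the UDI as a standard container of user-context information. The sketch below is a hypothetical illustration of what such a record might look like; the type and field names are assumptions for illustration, not the actual Baran UDI schema.

from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class UDIEvent:
    """One timestamped item of user context (hypothetical structure)."""
    timestamp: float           # seconds since the start of the session
    source: str                # e.g. "ui", "accelerometer", "heart_rate"
    payload: Dict[str, Any]    # e.g. {"action": "click", "target": "submit_button"}

@dataclass
class UserDigitalImprint:
    """Hypothetical UDI: everything recorded about one user during one experiment."""
    user_id: str
    experiment_id: str
    events: List[UDIEvent] = field(default_factory=list)

    def add(self, timestamp: float, source: str, **payload: Any) -> None:
        self.events.append(UDIEvent(timestamp, source, dict(payload)))

# Example: record a UI interaction and a sensor reading.
udi = UserDigitalImprint(user_id="u01", experiment_id="exp-demo")
udi.add(timestamp=1.25, source="ui", action="scroll", delta=-120)
udi.add(timestamp=1.30, source="accelerometer", x=0.01, y=0.02, z=9.81)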
Abstract:
This dissertation describes a proposed development platform for Augmentative and Alternative Communication systems aimed at developers, with the objective of improving productivity and reducing the time spent implementing this type of solution. The proposal is based on a structure of widgets that are configurable in code and can be integrated into new applications, following a philosophy of reusing objects and functionality, and also allowing the code structure of this kind of software to be standardised. The framework is further intended to give developers flexibility, through the possibility of introducing new functionalities and widgets, and to allow new approaches to the software to be tested during research. The implementation in platform-independent open-source technologies also allows the objects of this toolkit to be used on several different operating systems.
Abstract:
Investigating the stock identity of marine species with a multidisciplinary, holistic approach can reveal patterns of complex spatial population structure and signatures of potential local adaptation. The population structure of common sole (Solea solea) in the Mediterranean Sea was delineated using genomic and otolith data, including single nucleotide polymorphism (SNP) markers. SNPs were correlated with environmental and spatial variables to evaluate the impact of these features on the actual genetic population structure. An integrated holistic approach was applied to combine tracers with different spatio-temporal scales. SNP data were also used to illustrate the population structure of European hake (Merluccius merluccius) within the Alboran Sea, extending into the neighboring Mediterranean Sea and Atlantic Ocean. The aim was to identify patterns of neutral and potentially adaptive genetic variation by applying a seascape genomics framework. Results from both the genetic and the otolith data suggested significant divergence among putative populations of common sole, confirming a clear separation between the Western Mediterranean, the Adriatic Sea and the Eastern Mediterranean Sea. Evidence of fine-scale population structure in the Western Mediterranean Sea was observed at the outlier loci level and in the Adriatic. Our study not only indicates that the separation among Mediterranean sole populations is driven primarily by neutral processes, but also suggests the presence of local adaptation influenced by environmental and spatial factors. The holistic approach, by considering the spatio-temporal scales of variation, confirmed that the same pattern of separation between these geographical sites is currently occurring and has occurred for many generations. The results showed the occurrence of population structure in Merluccius merluccius, detecting westward-eastward differentiation among populations and distinct subgroups at a fine geographical scale using outlier SNPs. These results enhance knowledge of the population structure of commercially relevant species to support the application of spatial stock assessment models, including a redefinition of fishery management units.