836 results for Framework Model
Abstract:
Electron-phonon interaction is considered within the framework of the fluctuating valence of Cu atoms. Anderson's lattice Hamiltonian is suitably modified to take this into account. Using the Green's function technique, the possible quasiparticle excitations are determined. The quantity 2Δk(0)/kB Tc is calculated for Tc = 40 K. The calculated values are in good agreement with the experimental results.
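For reference, the benchmark against which such gap ratios are usually read is the weak-coupling BCS value; the numbers below are standard theory, not this paper's computed results:

```latex
% Weak-coupling BCS benchmark for the gap-to-Tc ratio
\[
  \frac{2\Delta(0)}{k_B T_c} \approx 3.53
  \qquad\Longrightarrow\qquad
  \Delta(0) \approx 1.76\, k_B T_c \approx 6.1\ \text{meV for } T_c = 40\ \text{K}.
\]
```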
Abstract:
The “cotton issue” has been a topic of several academic discussions for trade policy analysts. However the design of trade and agricultural policy in the EU and the USA has become a politically sensitive matter throughout the last five years. This study utilizing the Agricultural Trade Policy Simulation Model (ATPSM) aims to gain insights into the global cotton market, to explain why domestic support for cotton has become an issue, to quantify the impact of the new EU agricultural policy on the cotton sector, and to measure the effect of eliminating support policies on production and trade. Results indicate that full trade liberalization would lead the four West African countries to better terms of trade with the EU. If tariff reduction follows the so-called Swiss formula, world prices would increase by 3.5%.
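The Swiss formula mentioned above is a harmonizing tariff-cut rule: the new tariff is t' = a·t/(a + t), where a is a negotiated coefficient that also caps the post-cut tariff. A minimal sketch follows; the coefficient a = 25 is an illustrative assumption, not a value from the study:

```python
def swiss_formula(tariff: float, a: float = 25.0) -> float:
    """Harmonizing tariff cut: new tariff = a*t / (a + t).

    Higher initial tariffs are cut proportionally more, and the
    result always stays below the coefficient `a` (the cap).
    NOTE: a = 25 here is purely illustrative.
    """
    return a * tariff / (a + tariff)

# A 100% tariff falls to 20%, while a 10% tariff only falls to ~7.1%,
# which is the "harmonizing" property of the formula.
print(swiss_formula(100.0))  # 20.0
print(swiss_formula(10.0))   # ~7.14
```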
Abstract:
Initialising the ocean internal variability for decadal predictability studies is a new area of research, and a variety of ad hoc methods have been proposed. In this study, we explore how nudging with sea surface temperature (SST) and salinity (SSS) can reconstruct the three-dimensional variability of the ocean in a perfect model framework. This approach builds on the hypothesis that oceanic processes themselves will transport the surface information into the ocean interior, as seen in ocean-only simulations. Five nudged simulations are designed to reconstruct a 150-year “target” simulation, defined as a portion of a long control simulation. The nudged simulations differ in the variables restored to, SST or SST + SSS, and in the area where the nudging is applied. The strength of the heat flux feedback is diagnosed from observations, and the restoring coefficients for SSS use the same time-scale. We observed that this choice prevents spurious convection at high latitudes and near the sea-ice edge when nudging both SST and SSS. In the tropics, nudging the SST is enough to reconstruct the tropical atmosphere circulation and the associated dynamical and thermodynamical impacts on the underlying ocean. In the tropical Pacific Ocean, the temperature profiles show a significant correlation from the surface down to 2,000 m, due to dynamical adjustment of the isopycnals. At mid-to-high latitudes, SSS nudging is required to reconstruct both the temperature and the salinity below the seasonal thermocline. This is particularly true in the North Atlantic, where adding SSS nudging makes it possible to reconstruct the deep convection regions of the target. By initiating a previously documented 20-year cycle of the model, the SST + SSS nudging is also able to reproduce most of the AMOC variations, a key source of decadal predictability.
Reconstruction at depth does not significantly improve with the amount of time spent nudging, and the efficiency of the surface nudging depends rather on the period and events considered. The joint SST + SSS nudging applied everywhere is the most efficient approach. It ensures that the right water masses are formed at the right surface density, with the subsequent circulation, subduction and deep convection further transporting them to depth. The results of this study underline the potential key role of SSS for decadal predictability and further make the case for sustained large-scale observations of this field.
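Surface nudging of the kind described is a Newtonian relaxation: at each time step the model's surface field is pulled toward the target with a restoring time-scale τ. A minimal single-point sketch, with illustrative numbers rather than the study's actual restoring coefficients:

```python
def nudge_step(value: float, target: float, dt: float, tau: float) -> float:
    """One explicit step of Newtonian relaxation toward `target`.

    Integrates dV/dt = -(V - target) / tau, so a short tau
    means strong nudging.  All numbers here are illustrative.
    """
    return value + dt * (target - value) / tau

# Relax a model SST of 20 C toward an observed 25 C with tau = 10 days
sst, dt, tau = 20.0, 1.0, 10.0
for _ in range(50):
    sst = nudge_step(sst, 25.0, dt, tau)
# sst has now decayed to within a few hundredths of a degree of the target
```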
Abstract:
The Caribbean region remains highly vulnerable to the impacts of climate change. In order to assess the social and economic consequences of climate change for the region, the Economic Commission for Latin America and the Caribbean (ECLAC) has developed a model for this purpose, referred to as the Climate Impact Assessment Model (ECLAC-CIAM): a tool that can simultaneously assess multiple sectoral climate impacts specific to the Caribbean as a whole and to individual countries. To achieve this goal, an Integrated Assessment Model (IAM) with a Computable General Equilibrium (CGE) core was developed, comprising three modules to be executed sequentially. The first module defines the type and magnitude of economic shocks on the basis of a climate change scenario; the second is a global Computable General Equilibrium model with a special regional and industrial classification; and the third processes the output of the CGE model to produce more disaggregated results. The model has the potential to produce several economic estimates, but the current default results include the percentage change in real national income for individual Caribbean states, which provides a simple measure of welfare impacts. With some modifications, the model can also be used to consider the effects of single sectoral shocks (land, labour, capital and tourism) on the percentage change in real national income. Ultimately, the model is envisioned as an evolving tool for assessing the impact of climate change in the Caribbean and as a guide to policy responses with respect to adaptation strategies.
Abstract:
Biophysical and biochemical systems are often based on nanoscale phenomena in different host environments, and systems involving many particles can often not be solved explicitly. Instead, a physicist, biologist or chemist has to rely on approximate or numerical methods. For a certain type of system, called integrable, there exist particular mathematical structures and symmetries which facilitate an exact and explicit description. Most integrable systems we come across are low-dimensional: for instance, a one-dimensional chain of coupled atoms in a DNA molecular system with a particular direction, or existing as a vector in the environment. This theoretical research paper aims at bringing out one of the pioneering ‘reaction-diffusion’ aspects of the DNA-plasma material system through an integrable lattice model approach utilizing quantized functional algebras, in order to disseminate the new developments and initiate novel computational and design paradigms.
Abstract:
Item response theory (IRT) comprises a set of statistical models which are useful in many fields, especially when there is an interest in studying latent variables (or latent traits). Usually such latent traits are assumed to be random variables and a convenient distribution is assigned to them. A very common choice for such a distribution has been the standard normal. Recently, Azevedo et al. [Bayesian inference for a skew-normal IRT model under the centred parameterization, Comput. Stat. Data Anal. 55 (2011), pp. 353-365] proposed a skew-normal distribution under the centred parameterization (SNCP), as studied in [R.B. Arellano-Valle and A. Azzalini, The centred parametrization for the multivariate skew-normal distribution, J. Multivariate Anal. 99(7) (2008), pp. 1362-1382], to model the latent trait distribution. This approach allows one to represent any asymmetric behaviour of the latent trait distribution. They also developed a Metropolis-Hastings within Gibbs sampling (MHWGS) algorithm based on the density of the SNCP and showed that the algorithm recovers all parameters properly. Their results indicated that, in the presence of asymmetry, the proposed model and estimation algorithm perform better than the usual model and estimation methods. Our main goal in this paper is to propose another type of MHWGS algorithm, based on a stochastic representation (hierarchical structure) of the SNCP studied in [N. Henze, A probabilistic representation of the skew-normal distribution, Scand. J. Statist. 13 (1986), pp. 271-275]. Our algorithm has only one Metropolis-Hastings step, in contrast to the algorithm developed by Azevedo et al., which has two such steps. This not only makes the implementation easier but also reduces the number of proposal densities to be used, which can be a problem in the implementation of MHWGS algorithms, as can be seen in [R.J. Patz and B.W. Junker, A straightforward approach to Markov chain Monte Carlo methods for item response models, J. Educ. Behav. Stat. 24(2) (1999), pp. 146-178; R.J. Patz and B.W. Junker, The applications and extensions of MCMC in IRT: Multiple item types, missing data, and rated responses, J. Educ. Behav. Stat. 24(4) (1999), pp. 342-366; A. Gelman, G.O. Roberts, and W.R. Gilks, Efficient Metropolis jumping rules, Bayesian Stat. 5 (1996), pp. 599-607]. Moreover, we consider a modified beta prior (which generalizes the one considered in [3]) and a Jeffreys prior for the asymmetry parameter. Furthermore, we study the sensitivity of such priors as well as the use of different kernel densities for this parameter. Finally, we assess the impact of the number of examinees, the number of items and the asymmetry level on parameter recovery. Results of the simulation study indicated that our approach performed as well as that in [3] in terms of parameter recovery, mainly when using the Jeffreys prior. They also indicated that the asymmetry level has the highest impact on parameter recovery, even though it is relatively small. A real data analysis is presented jointly with the development of model-fit assessment tools, and the results are compared with those obtained by Azevedo et al. They indicate that the hierarchical approach allows us to implement MCMC algorithms more easily, facilitates diagnosis of convergence and can be very useful for fitting more complex skew IRT models.
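The contrast drawn here (one Metropolis-Hastings step versus two inside a Gibbs sweep) can be illustrated on a toy target. The sketch below is a generic Metropolis-Hastings-within-Gibbs sampler for a standard bivariate normal with correlation 0.8, not the authors' IRT sampler: one coordinate has a closed-form conditional and is sampled exactly (the Gibbs step), while the other is updated by a single random-walk MH step.

```python
import math
import random

random.seed(42)
RHO = 0.8  # correlation of the illustrative bivariate-normal target


def log_cond_y(y: float, x: float) -> float:
    # Log of the conditional density of y | x for a standard
    # bivariate normal with correlation RHO (up to a constant).
    return -0.5 * (y - RHO * x) ** 2 / (1 - RHO ** 2)


def mhwgs(n_iter: int = 20000):
    """Metropolis-Hastings within Gibbs with a single MH step per sweep."""
    x, y = 0.0, 0.0
    samples = []
    for _ in range(n_iter):
        # Gibbs step: x | y is known in closed form, so sample it exactly.
        x = random.gauss(RHO * y, math.sqrt(1 - RHO ** 2))
        # MH step: treat y | x as if intractable; random-walk proposal.
        y_prop = y + random.gauss(0.0, 1.0)
        log_alpha = log_cond_y(y_prop, x) - log_cond_y(y, x)
        if math.log(random.random()) < log_alpha:
            y = y_prop
        samples.append((x, y))
    return samples
```

With a single proposal density to tune, the implementation burden the abstract describes (one tuning problem per MH step) is correspondingly reduced.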
Abstract:
The magnetic moments of the low-lying spin-parity J^P = 1/2^-, 3/2^- Lambda resonances, such as Lambda(1405) 1/2^- and Lambda(1520) 3/2^-, as well as their transition magnetic moments, are calculated using the chiral quark model. The results are compared with those obtained from the nonrelativistic quark model and with those of unitary chiral theories, where some of these states are generated through the dynamics of two-hadron coupled channels and their unitarization.
Abstract:
This thesis describes modelling tools and methods suited to complex systems (systems that are typically represented by a plurality of models). The basic idea is that all models representing the system should be linked by well-defined model operations in order to build a structured repository of information, a hierarchy of models. The port-Hamiltonian framework is a good candidate for solving this kind of problem, as it supports the most important model operations natively. The thesis in particular addresses the problem of integrating distributed parameter systems into a model hierarchy, and shows two possible mechanisms for doing so: a finite-element discretization in port-Hamiltonian form, and a structure-preserving model order reduction for discretized models obtainable from commercial finite-element packages.
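For readers unfamiliar with the framework, a port-Hamiltonian system takes the form dx/dt = (J - R)∇H(x), with a skew-symmetric interconnection matrix J, a positive semi-definite dissipation matrix R, and a stored-energy function H. The sketch below is a minimal illustration on the textbook mass-spring-damper example (not a model from the thesis), showing the defining property that with dissipation the stored energy can only decrease:

```python
def simulate_ph(q=1.0, p=0.0, m=1.0, k=1.0, c=0.5, dt=0.01, steps=1000):
    """Explicit-Euler simulation of the port-Hamiltonian system
    dx/dt = (J - R) grad H(x), with H(q, p) = p^2/(2m) + k*q^2/2,
    J = [[0, 1], [-1, 0]] (lossless interconnection) and
    R = [[0, 0], [0, c]] (dissipation): a mass-spring-damper.
    Returns the initial and final stored energy."""
    energy = lambda q, p: p * p / (2 * m) + k * q * q / 2
    h0 = energy(q, p)
    for _ in range(steps):
        dH_q, dH_p = k * q, p / m          # gradient of H
        dq = dH_p                          # (J - R) applied to grad H
        dp = -dH_q - c * dH_p
        q, p = q + dt * dq, p + dt * dp
    return h0, energy(q, p)

h_start, h_end = simulate_ph()
# with c > 0 the stored energy is dissipated, so h_end < h_start
```

The structure-preserving methods the thesis refers to are discretizations and reductions that keep this (J, R, H) form, so the energy balance above survives the approximation.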
Abstract:
The research hypothesis of the thesis is that “an open participation in the co-creation of services and environments makes life easier for vulnerable groups”, assuming that participatory and emancipatory approaches are processes of possible actions and changes aimed at facilitating people’s lives. The adoption of these approaches is put forward as the common denominator of socially innovative practices that, by supporting inclusive processes, allow a shift from a medical model to a civil and human rights approach to disability. The theoretical basis of this assumption finds support in many principles of Inclusive Education, and the main focus of the research hypothesis is on participation and emancipation as approaches aimed at facing emerging and existing problems related to inclusion. The framework of reference for the research is represented by the perspectives adopted in several international documents concerning policies and interventions to promote and support the leadership and participation of vulnerable groups. In the first part, an in-depth analysis of the main academic publications on the central themes of the thesis is carried out. After investigating the framework of reference, the analysis focuses on the main tools of participatory and emancipatory approaches, which are able to connect with the concepts of active citizenship and social innovation. In the second part, two case studies concerning participatory and emancipatory approaches in the areas of concern are presented and analyzed as examples of the improvement of inclusion through the involvement and participation of persons with disability. The research has been developed using a holistic and interdisciplinary approach, aimed at providing a knowledge base that fosters a shift from a situation of passivity and care towards a new scenario based on the person’s commitment to the elaboration of his/her own project of life.