933 results for "algebraic structures of integrable models"


Relevance: 100.00%

Abstract:

This paper does two things. First, it presents alternative approaches to the standard methods of estimating productive efficiency using a production function. It favours a parametric approach (viz. the stochastic production frontier approach) over a nonparametric approach (e.g. data envelopment analysis), and, further, one that provides a statistical explanation of efficiency as well as an estimate of its magnitude. Second, it illustrates the favoured approach (i.e. the ‘single-stage procedure’) with estimates of two models of explained inefficiency, using data from the Thai manufacturing sector after the crisis of 1997. Technical efficiency is modelled as dependent on capital investment in three major areas (viz. land, machinery and office appliances), where land is intended to proxy the effects of unproductive, speculative capital investment, and both machinery and office appliances the effects of productive, non-speculative capital investment. The estimates from these models cast new light on the five-year post-1997 crisis period in Thailand, suggesting a structural shift from relatively labour-intensive to relatively capital-intensive manufacturing production between 1998 and 2002.
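As a toy illustration of the composed-error structure underlying a stochastic production frontier (all names and parameter values below are illustrative, not the paper's estimates):

```python
import math
import random

random.seed(1)

# Sketch of the composed error behind a stochastic production frontier:
# log-output = frontier + v - u, where v is symmetric statistical noise and
# u >= 0 is a one-sided inefficiency term. Firm-level technical efficiency
# is conventionally reported as exp(-u). Values here are illustrative only.
beta0, beta1 = 1.0, 0.6          # hypothetical frontier coefficients
sigma_v, sigma_u = 0.1, 0.3      # noise and inefficiency scales

def simulate_firm(log_capital):
    v = random.gauss(0.0, sigma_v)        # two-sided noise
    u = abs(random.gauss(0.0, sigma_u))   # half-normal inefficiency
    log_output = beta0 + beta1 * log_capital + v - u
    return log_output, math.exp(-u)       # output and efficiency in (0, 1]

outputs, efficiencies = zip(*(simulate_firm(math.log(k)) for k in range(1, 101)))
print(min(efficiencies) > 0.0, max(efficiencies) <= 1.0)  # True True
```

The one-sided sign of u is what lets the estimator attribute part of the residual to inefficiency rather than to noise.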

Relevance: 100.00%

Abstract:

Report for the scientific sojourn carried out at Uppsala Universitet, Sweden, from April to July 2007. Two series of analogue models are used to explore how ductile-frictional contrasts of the basal décollement control the development of oblique and transverse structures during thin-skinned shortening. These models simulate the evolution of the Central External Sierras (Southern Pyrenees, Spain), which constitute the frontal emerging part of the southernmost Pyrenean thrust sheet. They are characterized by transverse N-S to NW-SE anticlines, perpendicular to the Pyrenean structural trend, which developed in the hangingwall of the Santo Domingo thrust system and detach on unevenly distributed Triassic materials (evaporitic-dolomitic interfingerings). The model setup comprises a décollement made of three patches of silicone adjoining purely brittle sand. Model series A tests the thickness ratio between overburden and décollement; model series B tests the width of the frictional detachment areas. The model results show how deformation propagates further in areas detached on the ductile layer, whereas frictional décollement areas accommodate the strain by additional uplift. This replicates the structural style of the Central External Sierras: higher structural relief of the N-S anticlines relative to orogen-parallel structures, absence of a representative ductile décollement in their cores, tilting towards the orogen, and a foreland-side closure not overthrust by the frontal emerging South-Pyrenean thrust.

Relevance: 100.00%

Abstract:

1. Model-based approaches have been used increasingly in conservation biology over recent years. Species presence data used for predictive species distribution modelling are abundant in natural history collections, whereas reliable absence data are sparse, most notably for vagrant species such as butterflies and snakes. As predictive methods such as generalized linear models (GLM) require absence data, various strategies have been proposed to select pseudo-absence data. However, only a few studies exist that compare different approaches to generating these pseudo-absence data. 2. Natural history collection data are usually available for long periods of time (decades or even centuries), thus allowing historical considerations. However, this historical dimension has rarely been assessed in studies of species distribution, although there is great potential for understanding current patterns, i.e. the past is the key to the present. 3. We used GLM to model the distributions of three 'target' butterfly species, Melitaea didyma, Coenonympha tullia and Maculinea teleius, in Switzerland. We developed and compared four strategies for defining pools of pseudo-absence data and applied them to natural history collection data from the last 10, 30 and 100 years. Pools included: (i) sites without target species records; (ii) sites where butterfly species other than the target species were present; (iii) sites without butterfly species but with habitat characteristics similar to those required by the target species; and (iv) a combination of the second and third strategies. Models were evaluated and compared by the total deviance explained, the maximized Kappa and the area under the curve (AUC). 4. Among the four strategies, model performance was best for strategy 3. Contrary to expectations, strategy 2 resulted in even lower model performance compared with models with pseudo-absence data simulated totally at random (strategy 1). 5. 
Independent of the strategy, model performance was enhanced when sites with historical species presence data were not considered as pseudo-absence data. Therefore, the combination of strategy 3 with species records from the last 100 years achieved the highest model performance. 6. Synthesis and applications. The protection of suitable habitat for species survival or reintroduction in rapidly changing landscapes is a high priority among conservationists. Model-based approaches offer planning authorities the possibility of delimiting priority areas for species detection or habitat protection. The performance of these models can be enhanced by fitting them with pseudo-absence data drawn from large archives of natural history collection species presence data rather than with randomly sampled pseudo-absence data.
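One of the evaluation measures mentioned, the AUC, can be computed directly from predicted scores at presence and pseudo-absence sites; a minimal sketch (the scores below are illustrative, not the study's data):

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a randomly chosen presence site receives a higher
    predicted score than a randomly chosen (pseudo-)absence site,
    with ties counted as one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# A model that ranks every presence above every pseudo-absence scores 1.0;
# a model no better than chance scores 0.5.
print(auc([0.9, 0.8, 0.7], [0.2, 0.3, 0.1]))  # 1.0
```

This rank-based view makes clear why AUC is insensitive to the overall prevalence of pseudo-absences, only to how well they are separated from presences.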

Relevance: 100.00%

Abstract:

In recent years there has been increasing concern about the identification of parameters in dynamic stochastic general equilibrium (DSGE) models. Given the structure of DSGE models, it may be difficult to determine whether a parameter is identified. For the researcher using Bayesian methods, a lack of identification may not be evident, since the posterior of a parameter of interest may differ from its prior even if the parameter is unidentified. We show that this can be the case even if the priors assumed on the structural parameters are independent. We suggest two Bayesian identification indicators that do not suffer from this difficulty and are relatively easy to compute. The first applies to DSGE models where the parameters can be partitioned into those that are known to be identified and the rest, for which it is not known whether they are identified. In such cases the marginal posterior of an unidentified parameter will equal the posterior expectation of the prior for that parameter conditional on the identified parameters. The second indicator is more generally applicable and considers the rate at which the posterior precision is updated as the sample size (T) increases. For identified parameters the posterior precision rises with T, whilst for an unidentified parameter the posterior precision may be updated, but at a rate slower than T. This result assumes that the identified parameters are √T-consistent, but similar differential rates of update for identified and unidentified parameters can be established in the case of super-consistent estimators. These results are illustrated by means of simple DSGE models.
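The rate-of-update logic behind the second indicator can be illustrated with a conjugate normal toy example (my own illustration, not taken from the paper):

```python
# In a conjugate model y_t ~ N(theta, sigma^2) with prior theta ~ N(0, 1/h0),
# the posterior precision of the identified parameter theta is h0 + T/sigma^2,
# i.e. it grows linearly in T. A parameter that never enters the likelihood
# keeps its prior precision, so its rate of update is flat. Values illustrative.
sigma2 = 4.0   # observation variance
h0 = 1.0       # prior precision

def posterior_precision_identified(T):
    return h0 + T / sigma2

def posterior_precision_unidentified(T):
    return h0   # the likelihood carries no information about this parameter

for T in (100, 1000, 10000):
    ratio = posterior_precision_identified(T) / posterior_precision_unidentified(T)
    print(T, ratio)   # the ratio grows roughly linearly in T
```

Plotting (or tabulating) this ratio against T is exactly the kind of diagnostic the indicator formalises: a flat or sub-linear profile flags a potentially unidentified parameter.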

Relevance: 100.00%

Abstract:

This paper compares the forecasting performance of different models which have been proposed for forecasting in the presence of structural breaks. These models differ in their treatment of the break process, the parameters defining the model which applies in each regime and the out-of-sample probability of a break occurring. In an extensive empirical evaluation involving many important macroeconomic time series, we demonstrate the presence of structural breaks and their importance for forecasting in the vast majority of cases. However, we find no single forecasting model consistently works best in the presence of structural breaks. In many cases, the formal modeling of the break process is important in achieving good forecast performance. However, there are also many cases where simple, rolling OLS forecasts perform well.
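As a minimal illustration of the rolling-window OLS benchmark mentioned above (a sketch on synthetic data, not the paper's empirical setup):

```python
def ols_fit(xs, ys):
    """Least-squares intercept and slope for paired data."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope

def rolling_forecast(series, window):
    """One-step-ahead forecasts from an OLS trend refit on only the last
    `window` observations, so pre-break data eventually drop out of the fit."""
    forecasts = []
    for t in range(window, len(series)):
        xs = list(range(t - window, t))
        intercept, slope = ols_fit(xs, series[t - window:t])
        forecasts.append(intercept + slope * t)
    return forecasts

# On a clean linear trend the rolling forecast is exact.
series = [2.0 * t + 1.0 for t in range(20)]
print(rolling_forecast(series, 5)[:3])  # [11.0, 13.0, 15.0]
```

The window length trades off robustness to breaks (short windows forget the old regime quickly) against estimation noise, which is one reason no single scheme dominates in the paper's comparison.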

Relevance: 100.00%

Abstract:

We present existence, uniqueness and continuous dependence results for some kinetic equations motivated by models for the collective behavior of large groups of individuals. Models of this kind have been recently proposed to study the behavior of large groups of animals, such as flocks of birds, swarms, or schools of fish. Our aim is to give a well-posedness theory for general models which possibly include a variety of effects: an interaction through a potential, such as a short-range repulsion and long-range attraction; a velocity-averaging effect where individuals try to adapt their own velocity to that of other individuals in their surroundings; and self-propulsion effects, which take into account effects on one individual that are independent of the others. We develop our theory in a space of measures, using mass transportation distances. As consequences of our theory we show also the convergence of particle systems to their corresponding kinetic equations, and the local-in-time convergence to the hydrodynamic limit for one of the models.
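A representative kinetic equation combining the three effects named above (interaction potential, velocity averaging, self-propulsion) can be written as follows; this generic form is assumed here for concreteness and is not the paper's exact notation:

```latex
% Illustrative generic form (assumed notation): f(t,x,v) is the density of
% individuals, U an interaction potential, \psi an alignment weight, and
% (\alpha - \beta|v|^2)v a self-propulsion/friction term.
\partial_t f + v \cdot \nabla_x f
  + \nabla_v \cdot \Big[ \Big( -\nabla_x U * \rho
    + \int \psi(x-y)\,(w-v)\, f(t,y,w)\,\mathrm{d}y\,\mathrm{d}w
    + (\alpha - \beta |v|^2)\, v \Big) f \Big] = 0,
\qquad \rho(t,x) = \int f(t,x,v)\,\mathrm{d}v .
```

The well-posedness theory in spaces of measures with mass transportation distances applies precisely because such equations transport the measure f along a velocity field that depends on f itself.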

Relevance: 100.00%

Abstract:

The Na,K-ATPase is a major ion-motive ATPase of the P-type family, responsible for many aspects of cellular homeostasis. To determine the structure of the pathway for cations across the transmembrane portion of the Na,K-ATPase, we mutated 24 residues of the fourth transmembrane segment to cysteine and studied their function and accessibility by exposure to the sulfhydryl reagent 2-aminoethyl-methanethiosulfonate. Accessibility was also examined after treatment with palytoxin, which transforms the Na,K-pump into a cation channel. Of the 24 tested cysteine mutants, seven had no transport function or a much reduced one. In particular, cysteine mutants of the highly conserved "PEG" motif had strongly reduced activity. However, most of the non-functional mutants could still be transformed by palytoxin, as could all of the functional mutants. Accessibility, determined as a 2-aminoethyl-methanethiosulfonate-induced reduction of transport activity or as inhibition of the membrane conductance after palytoxin treatment, was observed for the following positions: Phe(323), Ile(322), Gly(326), Ala(330), Pro(333), Glu(334), and Gly(335). In accordance with a structural model of the Na,K-ATPase obtained by homology modeling with the two published structures of the sarcoplasmic and endoplasmic reticulum calcium ATPase (Protein Data Bank codes 1EUL and 1IWO), the results suggest the presence of a cation pathway along the side of the fourth transmembrane segment that faces the space between transmembrane segments 5 and 6. The phenylalanine residue at position 323 occupies a critical position at the outer mouth of the cation pathway. The residues thought to form cation binding site II ((333)PEGL) are also part of the accessible wall of the cation pathway opened through the Na,K-pump by palytoxin.

Relevance: 100.00%

Abstract:

Most studies of the higher-order dimensions needed to parsimoniously describe Personality Disorders (PDs) have identified between two and four factors, but there is still no consensus on their exact number. In this context, the cross-cultural stability of these structures might be a criterion to consider. The aim of this study was to identify stable higher-order structures of PD traits in a French-speaking African and Swiss sample (N = 2,711). All subjects completed the IPDE screening questionnaire. Using Everett's criterion and conducting a series of principal component analyses, cross-culturally stable two- and four-factor structures were identified, associated with total congruence coefficients of .98 and .94, respectively, after Procrustes rotation. Moreover, these two structures were also highly replicable across the four African regions considered (North Africa, West Africa, Central Africa, and Mauritius), with mean total congruence coefficients of .97 and .87, respectively. The four-factor structure presented the advantage of being similar to Livesley's four components and of describing the ten PDs more accurately.
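The congruence coefficients reported are Tucker's congruence coefficients between factor loading vectors; a minimal sketch with hypothetical loadings:

```python
import math

def congruence(u, v):
    """Tucker's congruence coefficient between two factor loading vectors:
    the cosine of the angle between them, close to 1 for factors with the
    same loading pattern across the two samples."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den

# Hypothetical loadings for one factor in two samples: proportional
# patterns are maximally congruent, regardless of overall scale.
print(round(congruence([0.8, 0.6, 0.1], [0.4, 0.3, 0.05]), 4))  # 1.0
```

Values above roughly .95 are conventionally read as "equal" factors, which is why the study's .98 and .94 totals support cross-cultural stability.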

Relevance: 100.00%

Abstract:

There are different approaches to dealing with alcohol-related problems in the workplace. A literature review indicates that two of the models underpinning programmes to deal with alcohol-related problems in the workplace are the disease model and the health promotion model. The disease model considers alcoholism an illness and uses curative techniques to restore the individual to sobriety. The health promotion model looks at the determinants of health and promotes changes in the environment and structures that would support healthy behaviour in relation to alcohol. Employee Assistance Programmes (EAPs) may have elements of both these models. Dealing with alcohol problems at work involves a captive audience, and the workplace as a setting can be used to encourage healthier lifestyles. A workplace alcohol policy is a mechanism through which alcohol-related issues might be dealt with, and the necessary resources and commitment of managers and staff channelled to this end. The policy aims should be clear and unambiguous, with specific plans put in place for implementing all aspects of the policy. In the case of the alcohol policy in the organisation under study, the policy was underpinned by a health promotion ethos, and the policy document reflects broad aims and objectives to support this. The steering group that oversaw the development of the policy had particular needs of its own, which it brought to the development process. The common theme in these needs was how to identify and support employees with alcohol-related problems within an equitable staff welfare system. The role of the supervisor was recognised as crucial, and training was provided to introduce the skills needed for early intervention and constructive confrontation with employees who had alcohol-related problems.
Opportunities provided by this policy initiative to deal with broader issues around alcohol, and to consider the determinants of health in relation to alcohol, were not fully utilised. The policy formalised the procedures for dealing with people who have alcohol-related problems in an equitable and supportive manner. The wider aspect of the health promotion approach does not appear to have been a priority in the development and implementation of the policy. This resource was contributed by The National Documentation Centre on Drug Use.

Relevance: 100.00%

Abstract:

A survey was undertaken among Swiss occupational health and safety specialists in 2004 to identify uses, difficulties, and possible developments of exposure models. Occupational hygienists (121), occupational physicians (169), and safety specialists (95) were surveyed with an in-depth questionnaire. The results indicate that models are not used much in practice in Switzerland and are reserved to research groups focusing on specific topics. However, various determinants of exposure are often considered important by professionals (emission rate, work activity) and are in some cases recorded and used (room parameters, operator activity). These parameters cannot be directly included in present models. Nevertheless, more than half of the occupational hygienists think it is important to develop quantitative exposure models. Among research institutions, however, there is considerable interest in the use of models to solve problems that are difficult to address with direct measurements, i.e. retrospective exposure assessment for specific clinical cases, prospective evaluation for new situations, or estimation of the effect of selected parameters. In a recent study of cases of acute pulmonary toxicity following waterproofing-spray exposure, exposure models were used to reconstruct the exposure of a group of patients. Finally, in the context of exposure prediction, it is also worth noting that a measurement database has existed in Switzerland since 1991. [Authors]

Relevance: 100.00%

Abstract:

A wide range of numerical models and tools has been developed over the last decades to support decision making in environmental applications, ranging from physical models to a variety of statistically based methods. In this study, a landslide susceptibility map of part of the Three Gorges Reservoir region of China was produced using binary logistic regression analysis. The available information includes the digital elevation model of the region, the geological map and different GIS layers, including land cover data obtained from satellite imagery. The landslides were observed and documented during field studies. A validation analysis was carried out to assess the quality of the mapping.
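The statistical core of binary logistic regression can be sketched in a few lines; this is a minimal single-predictor illustration on toy data, not the study's multi-layer GIS workflow:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.5, epochs=2000):
    """Binary logistic regression with one predictor, fitted by plain
    gradient descent on the log-likelihood."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            err = sigmoid(w * x + b) - y   # predicted probability minus label
            gw += err * x / n
            gb += err / n
        w -= lr * gw
        b -= lr * gb
    return w, b

# Toy, separable sample: a centred (standardised) predictor, e.g. slope
# angle, with 1 = landslide observed and 0 = stable site (illustrative data).
xs = [-0.9, -0.8, -0.7, 0.7, 0.8, 0.9]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(xs, ys)
print(sigmoid(-0.9 * w + b) < 0.5 < sigmoid(0.9 * w + b))  # True
```

The fitted probabilities, evaluated on every grid cell's predictor values, are what a susceptibility map displays.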

Relevance: 100.00%

Abstract:

Leishmaniasis causes significant morbidity and mortality, constituting an important global health problem for which there are few effective drugs. Given the urgent need to identify a safe and effective Leishmania vaccine to help prevent the two million new cases of human leishmaniasis worldwide each year, all reasonable efforts to achieve this goal should be made. This includes the use of animal models that are as close to leishmanial infection in humans as is practical and feasible. Old world monkey species (macaques, baboons, mandrills etc.) have the closest evolutionary relatedness to humans among the approachable animal models. The Asian rhesus macaques (Macaca mulatta) are quite susceptible to leishmanial infection, develop a human-like disease, exhibit antibodies to Leishmania and parasite-specific T-cell mediated immune responses both in vivo and in vitro, and can be protected effectively by vaccination. Results from macaque vaccine studies could also prove useful in guiding the design of human vaccine trials. This review summarizes our current knowledge on this topic and proposes potential approaches that may result in the more effective use of the macaque model to maximize its potential to help the development of an effective vaccine for human leishmaniasis.

Relevance: 100.00%

Abstract:

Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has grown to account for up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject in which publications are scarce and infrequent. We review the basic mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics, necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped into the so-called technical models and the latter into so-called pairs trading. Technical models have been dismissed by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; more technically, the event is F_t-measurable. The concept of pairs trading, or market-neutral strategy, is on the other hand fairly simple, yet it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, through a co-integration framework, to stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck SDE and its variations.
A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor yet lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtest of the strategies mentioned. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasize the calibration of the strategies' parameters to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis. No other mathematical or statistical software was used.
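As a sketch of the mean-reverting spread dynamics mentioned above, here is an Euler-Maruyama simulation of an Ornstein-Uhlenbeck process (parameter values illustrative, not calibrated to the thesis's data):

```python
import random

random.seed(7)

def simulate_ou(x0, mu, theta, sigma, dt, steps):
    """Euler-Maruyama discretisation of the Ornstein-Uhlenbeck SDE
    dX = theta*(mu - X)*dt + sigma*dW, the mean-reverting process often
    used to model the spread between the two legs of a pairs trade."""
    x = x0
    path = [x]
    for _ in range(steps):
        x += theta * (mu - x) * dt + sigma * (dt ** 0.5) * random.gauss(0.0, 1.0)
        path.append(x)
    return path

# Starting far from the long-run mean, the simulated spread drifts back
# toward it, which is the effect a mean-reversion strategy tries to capture.
path = simulate_ou(x0=5.0, mu=0.0, theta=2.0, sigma=0.2, dt=0.01, steps=1000)
print(abs(path[-1]) < abs(path[0]))  # True: reverted toward mu = 0
```

A pairs-trading rule built on this model opens a position when the spread deviates sufficiently from mu and closes it on reversion; calibrating theta and sigma to recent data is the adaptation step the thesis emphasises.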

Relevance: 100.00%

Abstract:

The paper discusses the maintenance challenges of organisations with a huge number of devices and proposes the use of probabilistic models to assist monitoring and maintenance planning. The proposal assumes connectivity of the instruments to report features relevant for monitoring. Also, enough historical registers with diagnosed breakdowns are required to make the probabilistic models reliable and useful for the predictive maintenance strategies based on them. Regular Markov models based on estimated failure and repair rates are proposed to calculate the availability of the instruments, and Dynamic Bayesian Networks are proposed to model cause-effect relationships in order to trigger predictive maintenance services based on the influence between observed features and previously documented diagnostics.
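For the simplest two-state (working/failed) regular Markov model, the long-run availability follows directly from the estimated failure and repair rates; a minimal sketch with illustrative rates:

```python
def steady_state_availability(failure_rate, repair_rate):
    """Long-run availability of a two-state (up/down) continuous-time
    Markov model with constant failure rate lambda and repair rate mu:
    A = mu / (lambda + mu), i.e. MTBF / (MTBF + MTTR)."""
    return repair_rate / (failure_rate + repair_rate)

# Illustrative instrument: one failure per 1000 h on average, repaired
# in 10 h on average -> availability of roughly 99%.
print(steady_state_availability(1 / 1000, 1 / 10))
```

The rates come straight from the historical breakdown registers the paper requires: lambda as the reciprocal of the mean time between failures and mu as the reciprocal of the mean repair time.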