55 results for INDIVIDUAL-BASED MODEL
Abstract:
The paper presents a foundation model for Marxian theories of the breakdown of capitalism based on a new falling rate of profit mechanism. All of these theories are based on one or more of "the historical tendencies": a rising capital-wage bill ratio, a rising capitalist share and a falling rate of profit. The model is a foundation in the sense that it generates these tendencies in the context of a model with a constant subsistence wage. The newly discovered generating mechanism is based on neo-classical reasoning for a model with land. It is non-Ricardian in that land-augmenting technical progress can be unboundedly rapid. Finally, since the model has no steady state, it is necessary to use a new technique, Chaplygin's method, to prove the result.
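A compact statement of the three tendencies, in standard Marxian notation (our gloss; the notation is not taken from the paper itself): with constant capital c, variable capital v and surplus s,

\[ r=\frac{s}{c+v}, \qquad \text{capital--wage bill ratio } k=\frac{c}{v}, \qquad \text{capitalist share } \frac{s}{s+v}. \]

Writing the rate of exploitation as \(e=s/v\) gives \(r=e/(1+k)\), so a rising \(k\) with bounded \(e\) forces the rate of profit \(r\) to fall.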
Abstract:
Report for the scientific sojourn carried out at the Model-based Systems and Qualitative Reasoning Group (Technical University of Munich) from September until December 2005. Constructed wetlands (CWs), or modified natural wetlands, are used all over the world as wastewater treatment systems for small communities because they provide high treatment efficiency with low energy consumption and low construction, operation and maintenance costs. Their treatment process is very complex because it combines physical, chemical and biological mechanisms such as microbial oxidation and reduction, filtration, sedimentation and chemical precipitation, and these processes can be influenced by different factors. In order to guarantee the performance of CWs, an operation and maintenance program must be defined for each Wastewater Treatment Plant (WWTP). The main objective of this project is to provide computer support for the definition of the most appropriate operation and maintenance protocols to guarantee the correct performance of CWs. To this end, the definition of models representing the knowledge about CWs has been proposed: the components involved in the sanitation process, the relations among these units, and the processes that remove pollutants. Horizontal Subsurface Flow CWs are chosen as a case study and the filtration process is selected as the first modelled process. However, the goal is to represent the process knowledge in such a way that it can be reused for other types of WWTP.
Abstract:
The analysis of the effect of genes and environmental factors on the development of complex diseases is a major statistical and computational challenge. Among the various data mining methodologies proposed for interaction analysis, one of the most popular is the Multifactor Dimensionality Reduction method, MDR (Ritchie et al. 2001). The strategy of this method is to reduce the multifactor dimension to one by pooling the different genotypes into two risk groups: high and low. Despite its demonstrated utility, the MDR method has some drawbacks: the excessive pooling of genotypes can leave important interactions undetected, and the method does not allow adjusting for main effects or for confounding variables. In this article we illustrate the limitations of the MDR strategy and of other non-parametric approaches, and we demonstrate the convenience of using parametric methodologies to analyse interactions in case-control studies where adjustment for confounding variables and main effects is required. We propose a new methodology, a parametric version of the MDR method, which we call Model-Based Multifactor Dimensionality Reduction (MB-MDR). The proposed methodology aims to identify specific genotypes that are associated with the disease and allows adjustment for marginal effects and confounding variables. The new methodology is illustrated with data from the Spanish Bladder Cancer Study.
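As a rough illustration of the pooling step that MDR performs, and that MB-MDR replaces with a parametric, adjustable test, the Python sketch below labels each multi-locus genotype cell as high or low risk by comparing its case/control ratio with the overall ratio; the function and toy data are ours, not from the paper.

from collections import Counter

def mdr_pool(genotypes, status):
    # genotypes: one tuple of genotype codes per subject
    # status:    0/1 disease indicators (1 = case)
    cases = Counter(g for g, s in zip(genotypes, status) if s == 1)
    controls = Counter(g for g, s in zip(genotypes, status) if s == 0)
    overall = sum(status) / (len(status) - sum(status))  # global case/control ratio
    # a cell is "high risk" (1) when its case/control ratio exceeds the global one
    return {g: int(cases[g] / max(controls[g], 1) > overall)
            for g in set(cases) | set(controls)}

# Toy usage with two SNPs coded 0/1/2 per subject
print(mdr_pool([(0, 1), (0, 1), (2, 2), (2, 2), (1, 0)], [1, 1, 0, 1, 0]))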
Abstract:
Report for the scientific sojourn at the University of Linköping from April to July 2007. Monitoring of the air intake system of an automotive engine is important to meet emission-related legislative diagnosis requirements. During the research, the problem of fault detection in the air intake system was stated as a constraint satisfaction problem over continuous domains with a large number of variables and constraints. This problem was solved using interval-based consistency techniques, which are shown to be particularly efficient for checking the consistency of the Analytical Redundancy Relations (ARRs), dealing with uncertain measurements and parameters, and using experimental data. All experiments were performed on a four-cylinder turbo-charged spark-ignited SAAB engine located in the research laboratory of the Vehicular Systems Group at the University of Linköping.
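To give the flavour of interval-based consistency checking (a generic sketch, not the ARRs used in the project), the Python code below evaluates a residual over interval-valued measurements and flags a fault when the resulting interval excludes zero:

# Minimal interval arithmetic for checking an analytical
# redundancy relation (ARR); illustrative only.

def sub(a, b):  # [a0,a1] - [b0,b1]
    return (a[0] - b[1], a[1] - b[0])

def mul(a, b):  # [a0,a1] * [b0,b1]
    ps = [a[i] * b[j] for i in range(2) for j in range(2)]
    return (min(ps), max(ps))

def consistent(residual):
    # an ARR is consistent if its residual interval contains zero
    return residual[0] <= 0.0 <= residual[1]

# Hypothetical ARR: r = p_measured - k * m_air, with uncertain k and sensors
p_meas = (0.98, 1.02)   # manifold pressure, with sensor tolerance
m_air  = (0.48, 0.52)   # air mass flow, with sensor tolerance
k      = (1.9, 2.1)     # uncertain model parameter
r = sub(p_meas, mul(k, m_air))
print("consistent" if consistent(r) else "fault detected")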
Abstract:
Populations displaced as a result of mass violent conflict have become one of the most pressing humanitarian concerns of the last decades. They have also become a salient political issue, as a perceived burden (in economic and security terms) and as an important piece in the shift towards a more interventionist paradigm in the international system, based on both humanitarian and security grounds. The salience of these aspects has detracted attention from the analysis of the interactions between relocation processes and violent conflict. Violent conflict studies have also largely ignored those interactions, as a result of treating these processes as mere reaction movements determined by structural conditions. This article takes the view that individuals' agency is retained during such processes and that it is consequential, which calls for a micro perspective. On this basis, a model of the individual's return decision is presented. The model has the potential to account for the dynamics of return at both the individual and the aggregate level, and it further helps to grasp fundamental interconnections with violent conflict. Some relevant conclusions are derived for the case of Bosnia-Herzegovina and about the implications of the politicization of return.
Abstract:
In CoDaWork’05, we presented an application of discriminant function analysis (DFA) to 4 different compositional datasets and modelled the first canonical variable using a segmented regression model solely based on an observation about the scatter plots. In this paper, multiple linear regressions are applied to different datasets to confirm the validity of our proposed model. In addition to dating the unknown tephras by calibration as discussed previously, another method of mapping the unknown tephras into samples of the reference set or missing samples in between consecutive reference samples is proposed. The application of these methodologies is demonstrated with both simulated and real datasets. This new proposed methodology provides an alternative, more acceptable approach for geologists as their focus is on mapping the unknown tephra with relevant eruptive events rather than estimating the age of unknown tephra. Key words: Tephrochronology; Segmented regression
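A segmented (broken-stick) regression of the sort used for the first canonical variable can be fitted with a simple breakpoint grid search; the Python sketch below is a generic illustration, not the authors' fitting procedure.

import numpy as np

def fit_segmented(x, y, candidates):
    # Fit y = a + b*x + c*max(x - t, 0) for each candidate breakpoint t;
    # return (sse, t, coefficients) with the smallest sum of squared errors.
    best = None
    for t in candidates:
        X = np.column_stack([np.ones_like(x), x, np.maximum(x - t, 0.0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = float(np.sum((y - X @ beta) ** 2))
        if best is None or sse < best[0]:
            best = (sse, t, beta)
    return best

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 80)
y = 1.0 + 0.5 * x + 2.0 * np.maximum(x - 6, 0) + rng.normal(0, 0.2, x.size)
sse, t, beta = fit_segmented(x, y, candidates=np.linspace(1, 9, 81))
print(f"breakpoint ~ {t:.2f}, coefficients {beta.round(2)}")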
Abstract:
Following the approach developed by Luttens (2010), we consider a model where individuals with different levels of skills exert different levels of effort. Specifically, we propose a redistribution mechanism based on a lower bound on what every individual deserves: the so-called minimal rights (O'Neill (1982)). Our refinement of Luttens' mechanism ensures at the same time minimal-rights-based solidarity, participation (non-negativity) and claims feasibility. Keywords: Redistribution mechanism, Minimal rights, Solidarity, Participation, Claims feasibility. JEL classification: C71, D63, D71.
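For readers unfamiliar with O'Neill's minimal rights, the standard definition in the claims-problems literature (our addition, with \(E\) the amount to be divided and \(c_j\) agent \(j\)'s claim) is

\[ m_i(E,c)=\max\Bigl(0,\; E-\sum_{j\neq i} c_j\Bigr), \]

that is, agent \(i\)'s minimal right is whatever remains of \(E\) once every other agent's claim is fully honoured.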
Abstract:
In this article we present a hybrid approach to the automatic summarization of Spanish medical texts. There are many systems for automatic summarization based on statistics or on linguistics, but only a few combine both techniques. Our idea is that a good summary requires the linguistic aspects of texts, but should also benefit from the advantages of statistical techniques. We have integrated the Cortex (vector space model) and Enertex (statistical physics) systems, coupled with the Yate term extractor, and the Disicosum system (linguistics). We have compared these systems and then integrated them in a hybrid approach. Finally, we have applied this hybrid system to a corpus of medical articles and evaluated its performance, obtaining good results.
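The vector-space side of such a pipeline can be illustrated with a minimal extractive scorer: sentences are ranked by cosine similarity between their term vector and the whole document's term vector. This is a generic Python sketch, not the Cortex implementation.

import math, re
from collections import Counter

def term_vector(text):
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(u, v):
    dot = sum(u[t] * v[t] for t in u)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def summarize(sentences, k=2):
    doc = term_vector(" ".join(sentences))
    # return the k sentences most similar to the document vector
    return sorted(sentences, key=lambda s: cosine(term_vector(s), doc), reverse=True)[:k]

doc = ["El paciente presenta fiebre alta.",
       "La fiebre alta persiste tras el tratamiento.",
       "Se recomienda reposo."]
print(summarize(doc, k=1))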
Abstract:
In this paper we analyze the sensitivity of the labour market decisions of workers close to retirement with respect to the incentives created by public regulations. We improve upon the extensive prior literature on the effect of pension incentives on retirement in two ways. First, by modeling the transitions between employment, unemployment and retirement in a simultaneous manner, paying special attention to the transition from unemployment to retirement (which is particularly important in Spain). Second, by considering the influence of unobserved heterogeneity in the estimation of the effect of our (carefully constructed) incentive variables. Using administrative data, we find that, when properly defined, economic incentives have a strong impact on labour market decisions in Spain. Unemployment regulations are shown to be particularly influential for retirement behaviour, along with the more traditional determinants linked to the pension system. Pension variables also have a major bearing on both workers' reemployment decisions and on the strategic actions of employers. The quantitative impact of the incentives, however, is greatly affected by the existence of unobserved heterogeneity among workers. Its omission leads to sizable biases in the assessment of the sensitivity to economic incentives, a finding that has clear consequences for the credibility of any model-based policy analysis. We confirm the importance of this potential problem in one especially interesting instance: the reform of early retirement provisions undertaken in Spain in 2002. We use a difference-in-differences approach to measure the behavioural reaction to this change, finding a large overestimation when unobserved heterogeneity is not taken into account.
Abstract:
This paper shows how risk may aggravate fluctuations in economies with imperfect insurance and multiple assets. A two-period job matching model is studied, in which risk-averse agents act both as workers and as entrepreneurs. They choose between two types of investment: one type is riskless, while the other is a risky activity that creates jobs. Equilibrium is unique under full insurance. If investment is fully insured but unemployment risk is uninsured, then precautionary saving behavior dampens output fluctuations. However, if both investment and employment are uninsured, then an increase in unemployment gives agents an incentive to shift investment away from the risky asset, further increasing unemployment. This positive feedback may lead to multiple Pareto-ranked equilibria. An overlapping generations version of the model may exhibit poverty traps or persistent multiplicity. Greater insurance is doubly beneficial in this context, since it can both prevent multiplicity and promote risky investment.
Abstract:
A new parameter is introduced: the lightning potential index (LPI), which is a measure of the potential for charge generation and separation that leads to lightning flashes in convective thunderstorms. The LPI is calculated within the charge separation region of clouds, between 0°C and −20°C, where the noninductive mechanism involving collisions of ice and graupel particles in the presence of supercooled water is most effective. As shown in several case studies using the Weather Research and Forecasting (WRF) model with explicit microphysics, the LPI is highly correlated with observed lightning. It is suggested that the LPI may be a useful parameter for predicting lightning, as well as a tool for improving weather forecasting of convective storms and heavy rainfall.
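A much-simplified proxy for this kind of index can be sketched as follows: within the 0°C to −20°C layer, weight the squared updraft by a factor that peaks where supercooled liquid and ice coexist. The formula below is our illustrative stand-in, not the published LPI definition.

import numpy as np

def lpi_proxy(w, q_liq, q_ice, temp_c):
    # w: vertical velocity (m/s); q_liq, q_ice: hydrometeor mixing ratios (kg/kg)
    # temp_c: temperature (deg C); all arrays share one shape
    layer = (temp_c <= 0.0) & (temp_c >= -20.0)   # charge-separation zone
    denom = q_liq + q_ice
    eps = np.zeros_like(denom)
    mask = denom > 0
    # eps peaks where liquid and ice are present in similar amounts
    eps[mask] = 2.0 * np.sqrt(q_liq[mask] * q_ice[mask]) / denom[mask]
    return float(np.mean(eps[layer] * w[layer] ** 2)) if layer.any() else 0.0

# Toy model column
w  = np.array([1.0, 5.0, 8.0, 3.0])
ql = np.array([1e-3, 8e-4, 2e-4, 0.0])
qi = np.array([0.0, 4e-4, 9e-4, 1e-3])
t  = np.array([5.0, -5.0, -15.0, -30.0])
print(lpi_proxy(w, ql, qi, t))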
Abstract:
Background: Network reconstructions at the cell level are a major development in Systems Biology. However, we are far from fully exploiting their potential. Often, the incremental complexity of the pursued systems outstrips experimental capabilities, or increasingly sophisticated protocols are underutilized, merely refining confidence levels of already established interactions. For metabolic networks, the currently employed confidence scoring system rates reactions discretely according to nested categories of experimental evidence or model-based likelihood. Results: Here, we propose a complementary network-based scoring system that exploits the statistical regularities of a metabolic network as a bipartite graph. As an illustration, we apply it to the metabolism of Escherichia coli. The model is adjusted to the observations to derive connection probabilities between individual metabolite-reaction pairs and, after validation, to assess the reliability of each reaction in probabilistic terms. This network-based scoring system uncovers very specific reactions that could be functionally or evolutionarily important, identifies prominent experimental targets, and enables further confirmation of modeling results. Conclusions: We foresee a wide range of potential applications at different sub-cellular or supra-cellular levels of biological interactions, given the natural bipartivity of many biological networks.
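As a cartoon of such network-based scoring (our illustration; the paper's statistical model is more elaborate), one can compare each observed metabolite-reaction link against a simple degree-based null weight on the bipartite graph:

from collections import defaultdict

# Toy bipartite metabolic network: reaction -> metabolites it touches
reactions = {
    "R1": ["atp", "glc", "g6p"],
    "R2": ["g6p", "f6p"],
    "R3": ["atp", "f6p", "fbp"],
}

met_deg = defaultdict(int)
for mets in reactions.values():
    for m in mets:
        met_deg[m] += 1
n_links = sum(len(mets) for mets in reactions.values())

# Null weight for a link (r, m): proportional to the product of the
# reaction's degree and the metabolite's degree (configuration-model style)
for r, mets in reactions.items():
    for m in mets:
        print(f"{r}-{m}: null weight {len(mets) * met_deg[m] / n_links:.2f}")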
Abstract:
Interaction models of atomic Al with Si4H9, Si4H7, and Si6H9 clusters have been studied to simulate Al chemisorption on the Si(111) surface in the atop, fourfold atop, and open sites. Calculations were carried out using nonempirical pseudopotentials in the framework of the ab initio Hartree-Fock procedure. Equilibrium bond distances, binding energies for adsorption, and vibrational frequencies of the adatoms are calculated. Several basis sets were used in order to show the importance of polarization effects, especially in the binding energies. Final results show the importance of considering adatom-induced relaxation effects to specify the order of energy stabilities for the three different sites, the fourfold atop site being the preferred one, in agreement with experimental findings.
Abstract:
Geometric parameters of binary (1:1) PdZn and PtZn alloys with the CuAu-L10 structure were calculated with a density functional method. Based on the total energies, the two alloys are predicted to feature equal formation energies. Calculated surface energies of PdZn and PtZn alloys show that (111) and (100) surfaces exposing stoichiometric layers are more stable than (001) and (110) surfaces comprising alternating Pd (Pt) and Zn layers. The surface energy values of the alloys lie between the surface energies of the individual components, but they differ from their composition-weighted averages. Compared with the pure metals, the valence d-band widths and the Pd or Pt partial densities of states at the Fermi level are dramatically reduced in PdZn and PtZn alloys. The local valence d-band densities of states of Pd and Pt in the alloys resemble that of metallic Cu, suggesting that a similar catalytic performance of these systems can be related to this similarity in the local electronic structures.
Abstract:
Background: The MLPA method is a potentially useful semi-quantitative method to detect copy number alterations in targeted regions. In this paper, we propose a normalization procedure based on a non-linear mixed model, as well as a new approach for determining the statistical significance of altered probes based on a linear mixed model. This method establishes a threshold by using different tolerance intervals that accommodate the specific random error variability observed in each test sample. Results: Through simulation studies we have shown that our proposed method outperforms two existing methods that are based on simple threshold rules or iterative regression. We have illustrated the method using a controlled MLPA assay in which targeted regions are variable in copy number in individuals suffering from disorders such as Prader-Willi, DiGeorge or autism, showing the best performance. Conclusion: Using the proposed mixed model, we are able to determine thresholds to decide whether a region is altered. These thresholds are specific for each individual, incorporating experimental variability, and result in improved sensitivity and specificity, as the examples with real data have revealed.
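The thresholding idea can be illustrated with a per-sample tolerance band on normalized log-ratios; the normal-theory tolerance factor below (Howe's approximation) is a deliberate simplification of the mixed-model machinery described in the paper.

import numpy as np
from scipy import stats

def altered_probes(log_ratios, coverage=0.95, confidence=0.95):
    # Flag probes whose normalized log-ratio falls outside a per-sample
    # normal tolerance interval (simplified; assumes i.i.d. Gaussian error).
    x = np.asarray(log_ratios)
    n = x.size
    mean, sd = x.mean(), x.std(ddof=1)
    # Howe's approximate two-sided normal tolerance factor
    z = stats.norm.ppf((1 + coverage) / 2)
    chi2 = stats.chi2.ppf(1 - confidence, n - 1)
    k = z * np.sqrt((n - 1) * (1 + 1 / n) / chi2)
    lo, hi = mean - k * sd, mean + k * sd
    return np.where((x < lo) | (x > hi))[0], (lo, hi)

rng = np.random.default_rng(1)
ratios = rng.normal(0.0, 0.1, 40)
ratios[7] = 0.9   # simulated duplication on one probe
idx, band = altered_probes(ratios)
print("altered probe indices:", idx, "band:", np.round(band, 2))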