911 results for FULL CCSDT MODEL


Relevance: 80.00%

Publisher:

Abstract:

This thesis develops and evaluates statistical methods for different types of genetic analyses, including quantitative trait loci (QTL) analysis, genome-wide association studies (GWAS), and genomic evaluation. The main contribution of the thesis is to provide novel insights into modeling genetic variance, especially via random effects models. In variance component QTL analysis, a full likelihood model accounting for uncertainty in the identity-by-descent (IBD) matrix was developed; it was shown to correct the bias in genetic variance component estimation and to gain power and precision in QTL mapping. Double hierarchical generalized linear models, and a non-iterative simplified version, were implemented and applied to fit whole-genome data. These whole-genome models showed good performance in both QTL mapping and genomic prediction. A re-analysis of a publicly available GWAS data set identified significant loci in Arabidopsis that control phenotypic variance rather than the mean, validating the idea of variance-controlling genes. The work in the thesis is accompanied by R packages available online, including a general statistical tool for fitting random effects models (hglm), an efficient generalized ridge regression for high-dimensional data (bigRR), a double-layer mixed model for genomic data analysis (iQTL), a stochastic IBD matrix calculator (MCIBD), a computational interface for QTL mapping (qtl.outbred), and a GWAS analysis tool for mapping variance-controlling loci (vGWAS).
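
The whole-genome random effects models and the generalized ridge regression mentioned above both treat all markers jointly through shrinkage. As a rough illustration of that idea only, here is a minimal Python sketch of a plain ridge (SNP-BLUP type) fit with invented dimensions; it is not the hglm or bigRR implementation.

```python
import numpy as np

def ridge_snp_effects(X, y, lam):
    """Joint shrinkage estimate of all marker effects:
    beta_hat = (X'X + lam*I)^{-1} X'y  (ridge / SNP-BLUP form)."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Toy data: 200 individuals, 1000 markers coded 0/1/2 (purely illustrative).
rng = np.random.default_rng(1)
X = rng.integers(0, 3, size=(200, 1000)).astype(float)
beta_true = np.zeros(1000)
beta_true[:10] = rng.normal(0.0, 0.5, 10)        # a few causal markers
y = X @ beta_true + rng.normal(0.0, 1.0, 200)

Xc, yc = X - X.mean(axis=0), y - y.mean()        # center before shrinkage
beta_hat = ridge_snp_effects(Xc, yc, lam=100.0)
y_pred = Xc @ beta_hat + y.mean()                # genomic prediction
```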

Relevance: 80.00%

Publisher:

Abstract:

Includes bibliography.

Relevance: 80.00%

Publisher:

Abstract:

When there is a failure of the external sheath of a flexible pipe, a high hydrostatic pressure is transferred to its internal plastic layer and consequently to its interlocked carcass, creating the possibility of collapse. The design of a flexible pipe must therefore predict the maximum external pressure the carcass layer can withstand without collapsing. This value depends on the initial ovalization due to manufacturing tolerances. To study this problem, two numerical finite element models were developed to simulate the behavior of the carcass subjected to external pressure, including the plastic behavior of the materials. The first is a full 3D model and the second is a 3D ring model, both composed of solid elements. An interesting conclusion is that both models provide the same results. An analytical model using an equivalent-thickness approach for the carcass layer was also constructed. A good correlation between the analytical and numerical models was achieved for the pre-collapse behavior, but the collapse pressure and the post-collapse behavior were not well predicted by the analytical model. [DOI: 10.1115/1.4005185]
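
For context on the equivalent-thickness approach, the classical elastic collapse pressure of a long tube under external pressure is often used as a first estimate; the short Python sketch below applies that textbook formula to an assumed equivalent carcass thickness. The numbers are placeholders and the formula is not the paper's elasto-plastic collapse model.

```python
def elastic_collapse_pressure(E, nu, t_eq, D):
    """Classical elastic collapse pressure of a long tube under external
    pressure: p_cr = 2*E/(1 - nu**2) * (t/D)**3.

    E    : Young's modulus [Pa]
    nu   : Poisson's ratio [-]
    t_eq : equivalent wall thickness of the interlocked carcass [m]
    D    : mean diameter of the carcass layer [m]
    """
    return 2.0 * E / (1.0 - nu**2) * (t_eq / D) ** 3

# Illustrative steel-like carcass with a made-up equivalent thickness:
p_cr = elastic_collapse_pressure(E=200e9, nu=0.3, t_eq=0.004, D=0.20)
print(f"elastic collapse pressure ~ {p_cr/1e6:.1f} MPa")
```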

Relevance: 80.00%

Publisher:

Abstract:

The purpose of this study is to apply inverse dynamics control to a six-degree-of-freedom flight simulator motion system. Imperfect compensation of the inverse dynamics control is intentionally introduced in order to simplify the implementation of the approach. A control strategy, designed using H∞ theory, is applied in the outer loop of the inverse dynamics control to counteract the effects of the imperfect compensation. The forward and inverse kinematics and the full dynamic model of a six-degree-of-freedom motion base driven by electromechanical actuators are briefly presented. Describing functions, acceleration step responses and maneuvers computed from the washout filter were used to evaluate the performance of the controllers.
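
To make the control structure concrete: inverse dynamics (computed-torque) control cancels the nonlinear dynamics with a model, and an outer loop acts on the remaining tracking error; when the model is deliberately simplified, the cancellation is imperfect and the outer loop must absorb the mismatch. The single-axis Python sketch below illustrates this structure with a plain PD outer loop standing in for the H∞ design; it is not the paper's six-degree-of-freedom electromechanical model.

```python
def inverse_dynamics_control(q, qd, q_ref, qd_ref, qdd_ref,
                             M_hat, n_hat, kp=50.0, kd=15.0):
    """Computed-torque law with an outer loop on the tracking error.

    The inner loop uses the model estimates M_hat (inertia) and n_hat
    (Coriolis/gravity/friction terms); if they differ from the true plant,
    the compensation is imperfect and the outer loop (here a simple PD
    term, an H-infinity controller in the study) must correct for it.
    """
    v = qdd_ref + kd * (qd_ref - qd) + kp * (q_ref - q)   # outer loop
    return M_hat * v + n_hat                              # inverse dynamics

# One step of a toy 1-DOF actuator with a deliberate model error:
M_true, M_hat = 2.0, 1.6
q, qd = 0.0, 0.0
tau = inverse_dynamics_control(q, qd, q_ref=0.1, qd_ref=0.0, qdd_ref=0.0,
                               M_hat=M_hat, n_hat=0.0)
qdd = tau / M_true          # true plant response, not exactly the commanded v
```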

Relevance: 80.00%

Publisher:

Abstract:

In this work we propose a new approach for preliminary epidemiological studies of Standardized Mortality Ratios (SMRs) collected over many spatial regions. A preliminary study of SMRs aims to formulate hypotheses to be investigated via individual-level epidemiological studies, which avoid the bias carried by aggregated analyses. Starting from observed disease counts and expected counts computed from reference population disease rates, an SMR is derived in each area as the MLE under a Poisson assumption on each observation. Such estimators have high standard errors in small areas, i.e. where the expected count is low because the underlying population is small or the disease under study is rare. Disease mapping models and other techniques for screening disease rates across the map, aiming to detect anomalies and possible high-risk areas, have been proposed in the literature under both the classical and the Bayesian paradigm. Our proposal approaches this issue with a decision-oriented method focused on multiple testing control, without abandoning the preliminary-study perspective that an analysis of SMR indicators is meant to serve. We implement control of the false discovery rate (FDR), a quantity widely used to address multiple comparison problems in the field of microarray data analysis but not usually employed in disease mapping. Controlling the FDR means providing an estimate of the FDR for a set of rejected null hypotheses. The small-areas issue raises difficulties in applying traditional methods for FDR estimation, which are usually based only on p-values (Benjamini and Hochberg, 1995; Storey, 2003). Tests evaluated by a traditional p-value have weak power in small areas, where the expected number of disease cases is small. Moreover, tests cannot be assumed independent when spatial correlation between SMRs is expected, nor are they identically distributed when the population underlying the map is heterogeneous. The Bayesian paradigm offers a way to overcome the inappropriateness of p-value-based methods. Another peculiarity of the present work is to propose a hierarchical full Bayesian model for FDR estimation when testing many null hypotheses of absence of risk. We use concepts from Bayesian disease mapping models, referring in particular to the Besag, York and Mollié (1991) model, often used in practice for its flexible prior assumption on the distribution of risks across regions. The borrowing of strength between prior and likelihood typical of a hierarchical Bayesian model has the advantage of evaluating a single test (i.e. a test in a single area) by means of all observations in the map under study, rather than just the single observation. This improves test power in small areas and addresses more appropriately the spatial correlation issue, which suggests that relative risks are closer in spatially contiguous regions. The proposed model estimates the FDR by means of the MCMC-estimated posterior probabilities b_i of the null hypothesis (absence of risk) in each area. An estimate of the expected FDR conditional on the data (the estimated FDR) can be calculated for any set of areas declared at high risk (where the null hypothesis is rejected) by averaging the corresponding b_i's. The estimated FDR provides an easy decision rule for selecting high-risk areas, i.e. selecting as many areas as possible such that the estimated FDR does not exceed a prefixed value; we call these estimated-FDR-based decision (or selection) rules.
The sensitivity and specificity of such a rule depend on the accuracy of the FDR estimate: over-estimation of the FDR causes a loss of power, while under-estimation produces a loss of specificity. Moreover, our model retains the interesting feature of providing estimates of the relative risk values, as in the Besag, York and Mollié (1991) model. A simulation study was set up to evaluate the model's performance in terms of FDR estimation accuracy, sensitivity and specificity of the decision rule, and goodness of the relative risk estimates. We chose a real map from which we generated several spatial scenarios whose disease counts vary according to the degree of spatial correlation, the size of the areas, the number of areas where the null hypothesis is true, and the risk level in the latter areas. In summarizing the simulation results we always consider FDR estimation in the sets formed by all areas whose b_i falls below a threshold t. We show graphs of the estimated FDR and the true FDR (known by simulation) plotted against the threshold t to assess FDR estimation. Varying the threshold shows which FDR values can be accurately estimated by a practitioner applying the model, judged by the closeness between the estimated and true FDR. By plotting the calculated sensitivity and specificity (both known by simulation) against the estimated FDR we check the sensitivity and specificity of the corresponding estimated-FDR-based decision rules. To investigate the degree of over-smoothing of the relative risk estimates we compare box-plots of such estimates in high-risk areas (known by simulation) obtained from both our model and the classic Besag, York and Mollié model. All the summary tools are worked out for all simulated scenarios (54 scenarios in total). Results show that the FDR is well estimated (in the worst case we obtain an over-estimation, hence conservative FDR control) in scenarios with small areas, low risk levels and spatially correlated risks, which are our primary aim. In such scenarios we obtain good estimates of the FDR for all values less than or equal to 0.10. The sensitivity of the estimated-FDR-based decision rules is generally low, but specificity is high; there, selection rules based on an estimated FDR of 0.05 or 0.10 can be suggested. In cases where the number of true alternative hypotheses (true high-risk areas) is small, FDR values up to 0.15 are also well estimated, and a rule based on an estimated FDR of 0.15 gains power while maintaining high specificity. On the other hand, in scenarios with non-small areas and non-small risk levels the FDR is under-estimated except for very small values (much lower than 0.05); this results in a loss of specificity for a rule based on an estimated FDR of 0.05. In such scenarios, rules based on an estimated FDR of 0.05 or, even worse, 0.10 cannot be suggested, because the true FDR is actually much higher. As regards relative risk estimation, our model achieves almost the same results as the classic Besag, York and Mollié model. For this reason, our model is interesting for its ability to perform both the estimation of relative risk values and FDR control, except in scenarios with non-small areas and large risk levels. A case study is finally presented to show how the method can be used in epidemiology.
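
To make the selection rule concrete, the following minimal Python sketch ranks areas by their posterior null probabilities b_i and declares high-risk the largest set whose average b_i stays below the target, which is the estimated-FDR-based decision rule described above. The function name and the toy numbers are illustrative, not taken from the thesis.

```python
import numpy as np

def fdr_select(b, alpha=0.05):
    """Estimated-FDR-based selection of high-risk areas.

    b     : posterior probabilities of 'absence of risk', one per area
            (e.g. from MCMC output of a hierarchical disease-mapping model).
    alpha : target estimated FDR.

    Returns the indices of the rejected (high-risk) areas and the estimated
    FDR, computed as the average of the b_i in the selected set.
    """
    b = np.asarray(b, dtype=float)
    order = np.argsort(b)                             # most likely high-risk first
    running_fdr = np.cumsum(b[order]) / np.arange(1, b.size + 1)
    keep = running_fdr <= alpha                       # largest prefix with FDR-hat <= alpha
    if not keep.any():
        return np.array([], dtype=int), float("nan")
    k = int(np.nonzero(keep)[0].max()) + 1
    return order[:k], float(running_fdr[k - 1])

# Toy posterior probabilities for 8 areas:
selected, fdr_hat = fdr_select([0.01, 0.02, 0.40, 0.03, 0.90, 0.05, 0.70, 0.08])
```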

Relevance: 80.00%

Publisher:

Abstract:

The subject of this thesis is the interactions between nucleosome core particles (NCPs). NCPs are the primary storage units of DNA in eukaryotic cells. Each NCP consists of a core of eight histone proteins and a strand of DNA, which is wrapped around it about twice. Each histone protein has a terminal tail passing over and between the superhelix of the wrapped DNA. Special emphasis was placed on the role of the histone tails, since experimental findings suggest that the tails have a great influence on the mutual attraction of the NCPs. In those experiments Mangenot et al. observe a dramatic change in the configuration of the tails, accompanied by evidence of mutual attraction between NCPs, when a certain salt concentration is reached. Existing models used in theoretical approaches and in simulations focus on the description of the histone core and the wrapped DNA, but neglect the histone tails. We introduce the multi-chain complex as a new simulation model. Here the histone core and the wrapping DNA are modelled via a charged sphere, while the histone tails are represented by oppositely charged chains grafted onto the sphere surface. We start by investigating the parameter space describing a single NCP. The Debye-Hückel potential is used to model the electrostatic interactions and to determine the effective charge of the NCP core. This value is subsequently used in a study of the pair interaction of two NCPs via an extensive Molecular Dynamics study. The monomer distribution of the full chain model is investigated. The existence of tail bridges between the cores is demonstrated. Finally, by discriminating between bridging and non-bridging configurations, we can show that the effect of tail bridging between the spheres does indeed account for the observed attraction. The full chain model can serve as a model to study the acetylation of the histone tails of the nucleosome. The reduction of the charge fraction of the tails, which corresponds to the process of acetylation, leads to a reduction or even the disappearance of the attraction. A recent MC study links this effect to the unfolding of the chromatin fiber in the case of acetylated histone tails. In this case the acetylation of the histone tails leads to the formation of heterochromatin, and one could understand how larger regions of the genetic information could be inactivated through this mechanism.
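
The electrostatics in such coarse-grained NCP models are typically screened Coulomb (Debye-Hückel) interactions; the short Python sketch below writes down that pair energy in reduced units. The parameter values are placeholders and do not reproduce the simulation parameters of the thesis.

```python
import numpy as np

def debye_hueckel(r, z1, z2, l_bjerrum=0.71, debye_length=0.96):
    """Screened Coulomb (Debye-Hueckel) pair energy in units of kT:
    U(r)/kT = l_B * z1 * z2 * exp(-r/lambda_D) / r

    r            : center-to-center distance [nm]
    z1, z2       : charge valences of the two interacting beads
    l_bjerrum    : Bjerrum length, ~0.71 nm in water at room temperature
    debye_length : Debye screening length set by the salt concentration [nm]
    """
    return l_bjerrum * z1 * z2 * np.exp(-r / debye_length) / r

# Attraction between an effective (negative) core charge and a (positive) tail monomer:
u = debye_hueckel(r=2.0, z1=-50.0, z2=+1.0)
```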

Relevance: 80.00%

Publisher:

Abstract:

The increasing precision of current and future experiments in high-energy physics requires a corresponding increase in the accuracy of theoretical predictions, in order to find evidence for possible deviations from the generally accepted Standard Model of elementary particles and interactions. Calculating the experimentally measurable cross sections of scattering and decay processes to higher accuracy directly translates into including higher-order radiative corrections in the calculation. The large number of particles and interactions in the full Standard Model results in an exponentially growing number of Feynman diagrams contributing to any given process at higher orders. Additionally, the appearance of multiple independent mass scales makes even the calculation of single diagrams non-trivial. For over two decades now, the only way to cope with these issues has been to rely on the assistance of computers. The aim of the xloops project is to provide the tools needed to automate the calculation procedures as far as possible, including the generation of the contributing diagrams and the evaluation of the resulting Feynman integrals. The latter is based on the techniques developed in Mainz for solving one- and two-loop diagrams in a general and systematic way using parallel/orthogonal space methods. These techniques involve a considerable amount of symbolic computation. During the development of xloops it was found that conventional computer algebra systems were not a suitable implementation environment. For this reason, a new system called GiNaC has been created, which allows the development of large-scale symbolic applications in an object-oriented fashion within the C++ programming language. This system, which is now also in use for other projects besides xloops, is the main focus of this thesis. The implementation of GiNaC as a C++ library sets it apart from other algebraic systems. Our results prove that a highly efficient symbolic manipulator can be designed in an object-oriented way, and that having a very fine granularity of objects is also feasible. The xloops-related parts of this work consist of a new implementation, based on GiNaC, of functions for calculating one-loop Feynman integrals that already existed in the original xloops program, as well as the addition of supplementary modules belonging to the interface between the library of integral functions and the diagram generator.

Relevance: 80.00%

Publisher:

Abstract:

As land is developed, the impervious surfaces that are created increase the amount of runoff during rainfall events, disrupting the natural hydrologic cycle and increasing both runoff volumes and pollutant loadings. Pollutants deposited on or derived from activities on the land surface, such as nutrients, sediment, heavy metals, hydrocarbons, gasoline additives, pathogens, deicers, herbicides and pesticides, will likely end up in stormwater runoff in some concentration. Several of these pollutants are particulate-bound, so sediment removal can provide significant water-quality improvements, and knowledge of the ability of stormwater treatment devices to retain particulate matter is therefore important. For this reason, three different sediment-removal units were tested in the laboratory. First, a roadside gully pot was tested under steady hydraulic conditions while varying the characteristics of the influent solids (diameter, particle size distribution and specific gravity). The efficiency in terms of particles retained was evaluated as a function of influent flow rate and particle characteristics; the results were compared with the efficiency predicted by an overflow rate model. Furthermore, the role of particle settling velocity in determining efficiency was investigated. After the experimental runs on the gully pot, a standard full-scale model of a hydrodynamic separator (HS) was tested under unsteady influent flow rate and constant influent solids concentration. The results presented in this study illustrate that the particle separation efficiency of the unit is predominantly influenced by the operating flow rate, which strongly affects the particle and hydraulic residence times of the system. The efficiency data were compared with results obtained from a modified overflow rate model; moreover, the residence time distribution was determined experimentally through tracer analyses at several steady flow rates. Finally, three experiments were performed for two configurations of a full-scale model of a clarifier (linear and crenulated) under unsteady influent flow rate and constant influent solids concentration. The results illustrate that the particle separation efficiency of the unit is predominantly influenced by the configuration of the unit itself. Turbidity measurements were compared with the suspended sediment concentration in order to find a correlation between the two, which would allow the sediment concentration to be measured simply by installing a turbidity probe.
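
The efficiency comparisons above rely on an overflow rate model; the minimal Python sketch below writes down the ideal (Hazen-type) surface-loading version of that idea for orientation. It is a generic textbook form, not the modified model actually fitted in the study, and the inputs are placeholders.

```python
def overflow_rate_efficiency(v_s, flow_rate, surface_area):
    """Ideal settling (overflow rate) removal efficiency for one particle class.

    v_s          : particle settling velocity [m/s]
    flow_rate    : influent flow rate Q [m^3/s]
    surface_area : effective settling surface area A [m^2]

    A particle class is fully captured when its settling velocity exceeds the
    surface loading rate Q/A; otherwise capture scales with v_s / (Q/A).
    """
    surface_loading = flow_rate / surface_area
    return min(1.0, v_s / surface_loading)

def total_efficiency(fractions_and_vs, flow_rate, surface_area):
    """Weight the ideal efficiency over a particle size distribution given
    as (mass fraction, settling velocity) pairs summing to 1."""
    return sum(f * overflow_rate_efficiency(v, flow_rate, surface_area)
               for f, v in fractions_and_vs)

eta = total_efficiency([(0.3, 0.002), (0.5, 0.010), (0.2, 0.050)],
                       flow_rate=0.01, surface_area=0.5)
```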

Relevance: 80.00%

Publisher:

Abstract:

Diabatic Rossby waves (DRWs) are cyclonic vortices in the lower troposphere that continuously regenerate through a combined thermodynamic-dynamic mechanism and can thereby propagate rapidly. Previous studies attribute to such cyclonic vortices the potential, through interaction with an anomaly at the tropopause, to trigger rapid cyclone intensification and hence extreme weather events. So far, DRWs have mostly been examined in idealized studies, which leaves a number of open questions about this phenomenon, particularly in real model data.

At the center of this work is a case study of a DRW that occurred over the North Atlantic in December 2005. The life cycle of the system can be followed over several days and through several phases and culminates in explosive pressure deepening. The case study was investigated using operational data from a global model as well as the results of a finer-scale regional model, to which various analysis tools were applied.

The detailed investigation of the DRW's propagation phase confirmed that sufficient moisture and baroclinicity are essential for the propagation mechanism and the intensity of the DRW. During the propagation phase, the self-sustaining DRW mechanism operates independently of any forcing from waves at the tropopause. Sensitivity experiments with the regional model, in which the environmental conditions of the DRW were modified locally, showed that the propagation is a relatively robust process. Accordingly, the propagation phase was well captured in the four operational forecasts examined, whereas the rapid intensification that occurred according to the analyses was missed by two of the forecasts.

For the intensification phase, the position and timing of the movement of the tropopause anomaly relative to the DRW in the lower troposphere, as well as the strength of the two systems, turned out to be the decisive factors. The developments in the sensitivity simulations indicated that a cyclonic vortex formed at a suitable position independently of the DRW can contribute more constructively to strong cyclone intensification than the DRW itself.

In the second part of the work, a Northern Hemisphere data set for the years 2004-2008 was examined with respect to the geographical occurrence and intensification of DRWs. Over this period, DRWs occurred half as often over the Atlantic (255 DRWs) as over the Pacific (515 DRWs). Their genesis regions lay over the eastern parts of the continents and the western halves of the oceans. Their tracks mostly followed the baroclinic zone of the mid-latitudes. Of the identified DRWs, 16% over the Atlantic intensified into explosively deepening cyclones; over the Pacific the share is somewhat lower at 11%. DRWs thus contribute to roughly 20% of explosively intensifying extratropical cyclones.

Relevance: 80.00%

Publisher:

Abstract:

In many applications the observed data can be viewed as a censored version of a high-dimensional full-data random variable X. By the curse of dimensionality it is typically not possible to construct estimators that are asymptotically efficient at every probability distribution in a semiparametric censored-data model of such a high-dimensional censored data structure. We provide a general method for constructing one-step estimators that are efficient at a chosen submodel of the full-data model, remain well behaved off this submodel, and can be chosen to always improve on a given initial estimator. These one-step estimators rely on good estimators of the censoring mechanism and thus require a parametric or semiparametric model for the censoring mechanism. We present a general theorem that provides a template for proving the desired asymptotic results. We illustrate the general one-step estimation methods by constructing locally efficient one-step estimators of marginal distributions and regression parameters with right-censored data, current status data and bivariate right-censored data, in all models allowing the presence of time-dependent covariates. The conditions of the asymptotic theorem are rigorously verified in one of the examples, and the key condition of the general theorem is verified for all examples.
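
For orientation, one-step estimators of this kind are generically built by adding to the initial estimator the empirical mean of an estimated (efficient) influence curve; a schematic form, stated as general background rather than quoted from the paper, is

\[ \hat{\psi}^{1}_{n} = \hat{\psi}^{0}_{n} + \frac{1}{n}\sum_{i=1}^{n} \widehat{IC}\big(O_i \mid \hat{\psi}^{0}_{n}, \hat{G}_{n}\big), \]

where the \(O_i\) are the observed censored-data units, \(\hat{G}_{n}\) is the estimated censoring mechanism, and \(\widehat{IC}\) is the estimated influence curve at the chosen submodel.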

Relevance: 80.00%

Publisher:

Abstract:

While clinical studies have shown a negative relationship between obesity and mental health in women, population studies have not shown a consistent association. However, many of these studies can be criticized regarding fatness level criteria, lack of control variables, and validity of the psychological variables. The purpose of this research was to elucidate the relationship between fatness level and mental health in United States women using data from the First National Health and Nutrition Examination Survey (NHANES I), which was conducted on a national probability sample from 1971 to 1974. Mental health was measured by the General Well-Being Schedule (GWB), and fatness level was determined by the sum of the triceps and subscapular skinfolds. Women were categorized as lean (15th percentile or less), normal (16th to 84th percentiles), or obese (85th percentile or greater). A conceptual framework was developed which identified the variables of age, race, marital status, socioeconomic status (education), employment status, number of births, physical health, weight history, and perception of body image as important to the fatness level-GWB relationship. Multiple regression analyses were performed separately for whites and blacks with GWB as the response variable, and fatness level, age, education, employment status, number of births, marital status, and health perception as predictor variables. In addition, 2- and 3-way interaction terms for leanness, obesity and age were included as predictor variables. Variables related to weight history and perception of body image were not collected in NHANES I and thus were not included in this study. The results indicated that obesity was a statistically significant predictor of lower GWB in white women even when the other predictor variables were controlled. The full regression model identified young, more educated, obese women as a subgroup with lower GWB, especially among blacks. These findings were not consistent with the previous non-clinical studies, which found that obesity was associated with better mental health. The social stigma of being obese and the preoccupation of women with being lean may have contributed to the lower GWB in these women.
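
As a rough illustration only of the kind of model described (with invented column names and toy data, not the study's NHANES I variables or code), the regression structure with fatness-by-age interactions might be sketched in Python as:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy stand-in data; the real analysis used NHANES I (1971-1974) records.
rng = np.random.default_rng(0)
n = 500
women = pd.DataFrame({
    "gwb": rng.normal(75, 15, n),          # General Well-Being score
    "obese": rng.integers(0, 2, n),        # skinfold sum >= 85th percentile
    "lean": rng.integers(0, 2, n),         # skinfold sum <= 15th percentile
    "age": rng.integers(25, 75, n),
    "education": rng.integers(8, 18, n),
    "employed": rng.integers(0, 2, n),
    "births": rng.integers(0, 6, n),
    "health_perception": rng.integers(1, 5, n),
})

# GWB regressed on fatness level, controls, and fatness-by-age interactions,
# fitted separately by race in the study (a single fit is shown here).
model = smf.ols("gwb ~ obese*age + lean*age + education + employed"
                " + births + health_perception", data=women).fit()
print(model.params)
```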

Relevance: 80.00%

Publisher:

Abstract:

The need for timely population data for health planning and indicators of need has increased the demand for population estimates. The data required to produce estimates are difficult to obtain and the process is time consuming, so estimation methods that require less effort and fewer data are needed. The structure preserving estimator (SPREE) is a promising technique not previously used to estimate county population characteristics. This study first uses traditional regression estimation techniques to produce estimates of county population totals. The structure preserving estimator, using the results produced in the first phase as constraints, is then evaluated. Regression methods are among the most frequently used demographic methods for estimating populations. These methods use symptomatic indicators to predict population change. This research evaluates three regression methods to determine which produces the best estimates based on the 1970 to 1980 indicators of population change. Strategies for stratifying the data to improve the ability of the methods to predict change were tested. Difference-correlation using PMSA strata produced the equation that best fit the data. Regression diagnostics were used to evaluate the residuals. The second phase of this study evaluates the use of the structure preserving estimator in making estimates of population characteristics. The SPREE approach uses existing data (the association structure) to establish the relationship between the variable of interest and the associated variable(s) at the county level. Marginals at the state level (the allocation structure) supply the current relationship between the variables. The full allocation structure model uses current estimates of county population totals to limit the magnitude of the county estimates, whereas the limited full allocation structure model places no constraints on county size. The 1970 county census age-gender population provides the association structure, and the 1980 state age-gender distribution provides the allocation structure. The full allocation model produces good estimates of the 1980 county age-gender populations. An unanticipated finding of this research is that the limited full allocation model produces estimates of county population totals that are superior to those produced by the regression methods. The full allocation model is used to produce estimates of 1986 county population characteristics.
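
The structure preserving estimator is commonly computed by iterative proportional fitting: the old association structure (here a county by age-gender table) is rescaled until it reproduces the new allocation structure (state-level margins), with county totals supplied as row constraints in the full allocation variant. The Python sketch below illustrates that reading with made-up dimensions; it is not the study's code.

```python
import numpy as np

def spree_ipf(association, row_margins, col_margins, iters=100, tol=1e-9):
    """Adjust an old cross-classified table (association structure) to new
    margins (allocation structure) by iterative proportional fitting.

    association : old county x age-gender counts, shape (n_counties, n_classes)
    row_margins : new county totals, or None to leave county sizes unconstrained
    col_margins : new state-level totals for each age-gender class
    """
    est = association.astype(float).copy()
    for _ in range(iters):
        if row_margins is not None:
            est *= (row_margins / est.sum(axis=1))[:, None]   # match county totals
        est *= col_margins / est.sum(axis=0)                  # match state margins
        rows_ok = row_margins is None or np.allclose(est.sum(axis=1), row_margins, rtol=tol)
        if rows_ok and np.allclose(est.sum(axis=0), col_margins, rtol=tol):
            break
    return est

# Toy example: 3 counties x 4 age-gender classes from a 1970-style table,
# adjusted to hypothetical 1980 state-level class totals.
old = np.array([[120, 130, 110, 100],
                [ 80,  90,  70,  60],
                [200, 210, 190, 180]])
new_class_totals = np.array([450, 470, 420, 380])
estimate = spree_ipf(old, row_margins=None, col_margins=new_class_totals)
```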

Relevance: 80.00%

Publisher:

Abstract:

This paper attempts to synthesize how society was articulated around the capitalist model of production in the historical period running from the impulse of the Second Industrial Revolution, at the end of the nineteenth century, to the first decade of the twenty-first century. This period shaped a form of social integration tied to wage labor and linked to a more present State that articulated and distributed social wealth. Some of the constitutive elements that sustained this model were the New Deal and Keynesian economic policy, which would last until well into the 1970s. At this stage the capital-labor relationship entered into crisis; the Keynesian full-employment model was no longer useful to capitalism; and the State therefore allowed the incorporation of a new discourse, championed by the so-called orthodox liberal economists and, in the last decade of the twentieth century, by the American economist John Williamson, associated with an intellectual movement known as the "Washington Consensus". This gave rise to a new form of social structuring in which winners and losers of the system would coexist. The paper also attempts to explain the impact that adopting the neoliberal model through an open economy (1989-2002) had on Argentina. Finally, we describe the situation of the Province of San Luis and the devices and mechanisms it used to counteract the double-digit unemployment that arose after 2002; for this we refer to the Plan de Inclusión Social, presented through statistical data taken from INDEC and the Dirección Provincial de Estadísticas y Censos of the Province of San Luis.
