967 results for Partial Credit Model
A refined LEED analysis of water on Ru{0001}: an experimental test of the partial dissociation model
Abstract:
Despite a number of earlier studies which seemed to confirm molecular adsorption of water on close-packed surfaces of late transition metals, new controversy has arisen over a recent theoretical work by Feibelman, according to which partial dissociation occurs on the Ru{0001} surface, leading to a mixed (H2O + OH + H) superstructure. Here, we present a refined LEED-IV analysis of the (√3 × √3)R30°-D2O-Ru{0001} structure, explicitly testing this new model by Feibelman. Our results favour the model proposed earlier by Held and Menzel, which assumes intact water molecules with almost coplanar oxygen atoms and out-of-plane hydrogen atoms atop the slightly higher oxygen atoms. The partially dissociated model, with an almost identical arrangement of oxygen atoms, cannot, however, be unambiguously excluded, especially when the single hydrogen atoms are not present in the surface unit cell. In contrast to the earlier LEED-IV analysis, however, we can clearly exclude a buckled geometry of the oxygen atoms.
Abstract:
Polytomous Item Response Theory Models provides a unified, comprehensive introduction to the range of polytomous models available within item response theory (IRT). It begins by outlining the primary structural distinction between the two major types of polytomous IRT models. This focuses on the two types of response probability that are unique to polytomous models and their associated response functions, which are modeled differently by the different types of IRT model. It describes, both conceptually and mathematically, the major specific polytomous models, including the Nominal Response Model, the Partial Credit Model, the Rating Scale Model, and the Graded Response Model. Important variations, such as the Generalized Partial Credit Model, are also described, as are less common variations, such as the Rating Scale version of the Graded Response Model. Relationships among the models are also investigated, and the operation of measurement information is described for each major model. Practical examples of the major models using real data are provided, as is a chapter on choosing an appropriate model. Figures are used throughout to illustrate important elements as they are described.
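For reference, the Partial Credit Model described here is usually written, in Masters' (1982) parameterization, as
\[
P(X_i = x \mid \theta) = \frac{\exp\left(\sum_{k=0}^{x}(\theta - \delta_{ik})\right)}{\sum_{h=0}^{m_i}\exp\left(\sum_{k=0}^{h}(\theta - \delta_{ik})\right)}, \qquad x = 0,1,\ldots,m_i,
\]
with the convention \(\sum_{k=0}^{0}(\theta-\delta_{i0}) \equiv 0\). The Generalized Partial Credit Model replaces \((\theta-\delta_{ik})\) with \(a_i(\theta-\delta_{ik})\), adding an item-specific discrimination \(a_i\).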
Abstract:
Collaboration is an essential competency that future physicians must develop, and determining levels of competency is crucial when planning this learning. Descriptive scales are attracting growing interest because they describe expected performance in qualitative terms. Drawing on the mixed methodology of Blais, Laurier, & Rousseau (2009), we built a collaboration competency level scale in five steps: 1) formulation of a list of indicators situated at four levels of medical training (preclinical, clerkship, junior and senior residency) by the researchers (n=3) and a group of educators (n=7), pedagogical leaders with expertise in the collaboration competency; 2) an online survey comprising four questionnaires covering the levels of 118 indicators, administered to clinical teachers representing the different specialties (n=277); 3) analysis of the questionnaire responses, paired through concurrent calibration, with the Rasch partial credit model; 4) determination of the indicator levels by the educators and the researchers; and 5) drafting of the scale from the indicators at each level. Iterative analysis of the responses shows adequate fit to the Rasch model and distributes the indicators along the linear scale across the four levels. The educators determined the level of the 111 retained indicators, taking into account the survey results and consistency with the curriculum. The scale comprises one descriptive paragraph per level, organized around three capabilities: 1) participating in the functioning of a team; 2) preventing and managing conflicts; and 3) planning, coordinating and delivering care as a team. The scale makes explicit the collaborative behaviours expected at the end of each level and is useful for planning the learning and assessment of this competency. The discrepancy between the levels chosen by the educators and those derived from the analysis of the clinical teachers' responses is mainly due to the teachers rarely selecting the preclinical level and to fit problems for the indicators describing conflict management. This research marks an advance in the understanding of the collaboration competency and demonstrates the effectiveness of the Blais (2009) methodology in the context of a cross-disciplinary competency in the health sciences. This methodology could help to map the developmental trajectories of other competencies.
Abstract:
This paper describes the construction, application and results of a battery of computerized exercises for training the spatial visualization of Engineering students. The battery contains four exercises based on very common tasks from basic technical drawing instruction. Each exercise consists of 18 items with four response options, only one of which is correct. After answering each item the student receives immediate feedback showing the accuracy of the answer. The response format of the exercises is called Answer-until-correct (Responder até acertar), since, if the answer is incorrect, the student is informed of the score obtained. To evaluate the influence of the training on spatial visualization, tests of this aptitude were administered at the beginning and at the end of the technical drawing course. The figures for the exercises and the tests were created with AutoCad, and the programming was done with Revolution Studio 2. Several models were used to obtain the measures: the Partial Credit Model (Masters, 1982) and the Rasch Model (Rasch, 1960). The students showed a moderate improvement in spatial visualization.
Abstract:
Studies suggest that enjoyment, perceived benefits and perceived barriers may be important mediators of physical activity. However, the psychometric properties of these scales have not been assessed using Rasch modeling. The purpose of this study was to use Rasch modeling to evaluate the properties of three scales commonly used in physical activity studies: the Physical Activity Enjoyment Scale, the Benefits of Physical Activity Scale and the Barriers to Physical Activity Scale. The scales were administered to 378 healthy adults, aged 25–75 years (50% women, 62% Whites), at the baseline assessment for a lifestyle physical activity intervention trial. The ConQuest software was used to assess model fit, item difficulty, item functioning and standard error of measurement. For all scales, the partial credit model fit the data. Item content of one scale did not adequately cover all respondents. Response options of each scale were not targeting respondents appropriately, and standard error of measurement varied across the total score continuum of each scale. These findings indicate that each scale's effectiveness at detecting differences among individuals may be limited unless changes in scale content and response format are made.
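To illustrate why targeting and the standard error of measurement vary across the score continuum under a partial credit model, here is a minimal sketch (hypothetical item parameters, not the authors' ConQuest calibration) that computes category probabilities, item information, and the corresponding SEM at a few ability levels:

```python
# Minimal partial-credit-model sketch with made-up step difficulties.
import numpy as np

def pcm_category_probs(theta, deltas):
    """Category probabilities for one PCM item.
    deltas: step difficulties for categories 1..m (category 0 is the reference)."""
    cumulative = np.concatenate(([0.0], np.cumsum(theta - np.asarray(deltas))))
    numer = np.exp(cumulative)
    return numer / numer.sum()

def pcm_item_information(theta, deltas):
    """For the PCM, item information equals the variance of the item score."""
    p = pcm_category_probs(theta, deltas)
    scores = np.arange(len(p))
    return np.sum(scores**2 * p) - np.sum(scores * p)**2

# Hypothetical 4-category item (step difficulties invented for illustration).
deltas = [-1.0, 0.2, 1.3]
for theta in (-2.0, 0.0, 2.0):
    info = pcm_item_information(theta, deltas)
    sem = 1.0 / np.sqrt(info)   # single-item SEM; sum information over items for a test
    print(f"theta={theta:+.1f}  info={info:.3f}  SEM={sem:.3f}")
```

The SEM is largest where the item provides little information, which is why response options that do not target respondents lead to uneven measurement precision across the score range.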
Abstract:
There are very few studies in Spain that treat underachievement rigorously, and those that do are typically related to gifted students. The present study examined the proportion of underachieving students using the Rasch measurement model. A sample of 643 first-year high school students (mean age = 12.09; SD = 0.47) from 8 schools in the province of Alicante (Spain) completed the Battery of Differential and General Skills (Badyg), and these students' Grade Point Averages (GPAs) were obtained from their teachers. Dichotomous and partial credit Rasch models were fitted. After adjusting the measurement instruments, the individual underachievement index identified 181 underachieving students, or 28.14% of the total sample, across the ability levels. This study confirms that the Rasch measurement model can accurately estimate the construct validity of both the intelligence test and the academic grades for the identification of underachieving students. Furthermore, the present study constitutes a pioneering framework for estimating the prevalence of underachievement in Spain.
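For reference, the dichotomous Rasch model used here alongside the partial credit model is conventionally written as
\[
P(X_{ni}=1 \mid \theta_n, b_i) = \frac{\exp(\theta_n - b_i)}{1 + \exp(\theta_n - b_i)},
\]
where \(\theta_n\) is the ability of person \(n\) and \(b_i\) the difficulty of item \(i\); the partial credit model extends this expression to items scored in more than two ordered categories.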
Abstract:
Part I: What makes science hard for newcomers? 1) The background (briefly) of my research (why the math-anxiety model doesn't fit); 2) The Tier analysis (a visual) – message: there are more types of science learners in your class than simply younger versions of yourself; 3) Three approaches (bio, chem, physics) but only one Nature; 4) The (different) vocabularies of the three Sciences; 5) How mathematics is variously used in Science. Part II: Rules and rules-driven assignments – lQ vs OQ. 1) How to incorporate creativity into assignments and tests? 2) Tests – borrowing "thought questions" from other fields (If Columbus hadn't discovered the New World, when and under whose law would it have been discovered?); 3) Grading practices (partial credit, post-exam credit for finding and explaining nontrivial errors); 4) Icing on the cake – applications, examples of science/engineering from Tuesday's NY Times. Part III: Making Change at the Departmental Level. 1) Taking control of at least some portion of the curriculum; 2) Varying the style of presentation; 3) Taking control of at least some portion of the exams; 4) Grading: pros and cons of grading on a curve; 5) Updating labs and lab reporting.
Abstract:
Matsukawa and Habeck (2007) analyse the main instruments for risk mitigation in infrastructure financing with Multilateral Financial Institutions (MFIs). Their review coincided with the global financial crisis of 2007-08, and it is highly relevant in current times considering the sovereign debt crisis, the lack of available capital and the increases in bank regulation in Western economies. The current macroeconomic environment has seen a slowdown in the level of finance for infrastructure projects, as they pose a higher credit risk given their requirements for long-term investment. The rationale for this work is to look for innovative solutions focused on the credit risk mitigation of infrastructure and energy projects whilst optimizing the economic capital allocation for commercial banks. This objective is achieved through risk-sharing with MFIs and seeking capital relief in project finance transactions. This research answers the main question: "What is the impact of risk-sharing with MFIs on project finance transactions to increase their efficiency and viability?", and is developed from the perspective of a commercial bank assessing the economic capital used and analysing the relevant variables: Probability of Default, Loss Given Default and Recovery Rates (Altman, 2010). An overview of project finance for the infrastructure and energy sectors in terms of the volume of transactions worldwide is outlined, along with a summary of risk-sharing financing with MFIs. The current regulatory framework underpinning risk-sharing in structured finance with MFIs is also reviewed. From here, the impact of risk-sharing and the diversification effect in infrastructure and energy projects is assessed from the perspective of economic capital allocation for a commercial bank. CreditMetrics (J. P. Morgan, 1997) is applied to an existing, well-diversified portfolio of project finance infrastructure and energy investments, working with the main risk capital measures: economic capital, RAROC, and EVA. The conclusions of this research show that economic capital allocation on a portfolio of project finance, together with risk-sharing with MFIs, has a large impact on capital relief whilst increasing performance profitability for commercial banks. There is a substantial diversification effect due to the portfolio, which is combined with risk mitigation and an improvement in recovery rates through Partial Credit Guarantees issued by MFIs. A stress-test scenario analysis is applied to the current assumptions and credit risk model, considering a downgrade in the rating of the commercial bank (lender) and an increase in defaults in emerging countries, showing a direct impact on economic capital through an increase in expected loss and a decrease in performance profitability. Obtaining capital relief through risk-sharing makes it more viable for commercial banks to finance infrastructure and energy projects, with the beneficial effect of a direct impact of these investments on GDP growth and employment. The main contribution of this work is to promote a strategic economic capital allocation in infrastructure and energy financing through innovative risk-sharing with MFIs and economic pricing, to create economic value added for banks and to allow the financing of more infrastructure and energy projects. This work suggests several topics for further research in relation to the issues analysed.
Matsukawa and Habeck (2007) analyse the main risk mitigation instruments available from Multilateral Financial Institutions (MFIs) for infrastructure financing. Their presentation coincided with the onset of the financial crisis in August 2007, whose consequences persist today, notably the sovereign debt problems in developed economies and the capitalization problems of banks. This macroeconomic environment has slowed the financing of infrastructure projects. The motivation for this research is the search for solutions for financing infrastructure and energy projects that mitigate their inherent risks, with the aim of reducing the economic capital consumed by the lending banks. This objective is achieved by sharing the financing risk with MFIs through risk-sharing structures. The research answers the question: "What is the impact of risk-sharing with MFIs on project financing, in order to increase its efficiency and viability?". The work is developed from the standpoint of a commercial bank, estimating the economic capital consumed in project financing and analysing the main credit risk variables: Probability of Default, Loss Given Default and Recovery Rates (Altman, 2010). The research presents the global figures for project finance in the infrastructure and energy sectors, and analyses the international regulatory framework regarding the economic capital consumed in the financing of projects in which MFIs participate. The work then models a real, well-diversified portfolio of infrastructure and energy project finance, applying the CreditMetrics methodology (J. P. Morgan, 1997). Its purpose is to estimate the economic capital consumption and the profitability of the project portfolio through RAROC and EVA. The modelling makes it possible to estimate the diversification effect and the release of economic capital resulting from risk-sharing. The results show the large impact of the portfolio diversification effect, as well as of the MFIs' partial guarantees, which mitigate risks, improve the projects' recovery rates and reduce the economic capital consumed by the commercial bank, while increasing profitability (RAROC) and creating economic value (EVA). In economic scenarios of instability, deterioration of bank ratings, and increases in project defaults and portfolio correlations, there is a direct impact on economic capital and a loss of profitability. The release of economic capital, as proposed in this research, would make it possible to finance more infrastructure and energy projects, which would translate into greater economic growth and job creation. The main contribution of this work is to promote the active management of economic capital in the financing of infrastructure and energy projects, through innovative risk-sharing structures with MFIs and the creation of economic value in commercial banks, which would improve their efficiency and capitalization. The methodological contribution of the work, by virtue of its originality, suggests and facilitates new lines of academic research on the main credit risk variables affecting economic capital in project financing.
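To make the capital-relief mechanism concrete, the sketch below uses made-up illustrative figures rather than the thesis's CreditMetrics portfolio results (the net revenue, guarantee share and economic-capital proxy are all assumptions); it shows how a Partial Credit Guarantee covering part of the exposure lowers expected loss and economic capital and thereby raises RAROC and EVA for a single project loan.

```python
# Stylized single-loan illustration of risk-sharing via a partial credit guarantee.
# All figures are invented; the thesis derives economic capital from a
# CreditMetrics portfolio model, which is not reproduced here.

def expected_loss(ead: float, pd: float, lgd: float) -> float:
    """Expected loss = exposure at default x probability of default x loss given default."""
    return ead * pd * lgd

def raroc(net_rev: float, el: float, economic_capital: float) -> float:
    """Risk-adjusted return on (economic) capital."""
    return (net_rev - el) / economic_capital

def eva(net_rev: float, el: float, economic_capital: float, hurdle: float) -> float:
    """Economic value added: risk-adjusted income minus the cost of economic capital."""
    return (net_rev - el) - hurdle * economic_capital

ead, pd_, recovery = 100.0, 0.03, 0.55   # exposure (mEUR), default probability, recovery rate
lgd = 1.0 - recovery
net_rev = 2.5                            # assumed net revenue on the loan (mEUR)

for label, cover in (("no guarantee", 0.0), ("40% partial credit guarantee", 0.40)):
    retained_ead = ead * (1.0 - cover)   # simplification: guaranteed share treated as risk-free
    el = expected_loss(retained_ead, pd_, lgd)
    ec = 8.0 * (1.0 - cover)             # crude proxy for economic capital, scaled with retained risk
    print(f"{label}: EL={el:.2f}  EC={ec:.2f}  "
          f"RAROC={raroc(net_rev, el, ec):.1%}  EVA={eva(net_rev, el, ec, 0.12):.2f}")
```

Running the sketch shows expected loss and economic capital falling with the guarantee while RAROC and EVA rise, which is the qualitative effect the thesis quantifies at portfolio level.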
Abstract:
This study explores factors related to prompt difficulty in Automated Essay Scoring. The sample was composed of 6,924 students. For each student there were 1-4 essays, across 20 different writing prompts, for a total of 20,243 essays. The E-rater® v.2 essay scoring engine developed by the Educational Testing Service was used to score the essays. The scoring engine employs a statistical model that incorporates 10 predictors associated with writing characteristics, of which 8 were used. A Rasch partial credit analysis was applied to the scores to determine the difficulty levels of the prompts. In addition, the scores were used as outcomes in a series of hierarchical linear models (HLM) in which students and prompts constituted the cross-classification levels. This methodology was used to explore the partitioning of the essay score variance. The results indicated significant differences in prompt difficulty levels due to genre: descriptive prompts, as a group, were found to be more difficult than persuasive prompts. In addition, the essay score variance was partitioned between students and prompts. The amount of essay score variance that lies between prompts was found to be relatively small (4 to 7 percent). When the essay-level, student-level, and prompt-level predictors were included in the model, it was able to explain almost all of the variance that lies between prompts. Since in most high-stakes writing assessments only 1-2 prompts per student are used, the essay score variance that lies between prompts represents an undesirable or "noise" variation. Identifying factors associated with this "noise" variance may prove to be important for prompt writing and for constructing Automated Essay Scoring mechanisms that weight prompt difficulty when assigning essay scores.
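A minimal form of the cross-classified model behind this variance partitioning (written generically; the study's actual HLMs also include essay-, student- and prompt-level predictors) is
\[
y_{i(jk)} = \gamma_0 + u_j + v_k + e_{i(jk)}, \qquad u_j \sim N(0,\sigma_u^2),\; v_k \sim N(0,\sigma_v^2),\; e_{i(jk)} \sim N(0,\sigma_e^2),
\]
where \(u_j\) and \(v_k\) are the random effects of student \(j\) and prompt \(k\); the share of essay-score variance lying between prompts is \(\sigma_v^2/(\sigma_u^2+\sigma_v^2+\sigma_e^2)\), the 4 to 7 percent reported above.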
Abstract:
Traditional Real-Time Operating Systems (RTOS) are not designed to accommodate application-specific requirements. They address a general case, and the application must co-exist with any limitations imposed by such a design. For modern real-time applications this limits the quality of service offered to the end-user. Research in this field has shown that it is possible to develop dynamic systems where adaptation is the key to success. However, adaptation requires full knowledge of the system state. To overcome this, we propose a framework to gather data and interact with the operating system, extending the traditional POSIX trace model with a partial reflective model. Such a combination still preserves the trace mechanism semantics while creating a powerful platform for developing new dynamic systems, with little impact on the system and avoiding complex changes to the kernel source code.
Abstract:
This thesis focuses on theoretical asset pricing models and their empirical applications. I aim to investigate the following noteworthy problems: i) whether the relationship between asset prices and investors' propensities to gamble and to fear disaster is time varying, ii) whether the conflicting evidence for firm- and market-level skewness can be explained by downside risk, and iii) whether costly learning drives liquidity risk. Moreover, empirical tests support the above assumptions and provide novel findings in asset pricing, investment decisions, and firms' funding liquidity. The first chapter considers a partial equilibrium model where investors have heterogeneous propensities to gamble and to fear disaster. Skewness preference represents the desire to gamble, while kurtosis aversion represents fear of extreme returns. Using US data from 1988 to 2012, my model demonstrates that in bad times risk aversion is higher, more people fear disaster, and fewer people gamble, in contrast to good times. This leads to a new empirical finding: gambling preference has a greater impact on asset prices during market downturns than during booms. The second chapter consists of two essays. The first essay introduces a formula based on the conditional CAPM for decomposing the market skewness. We find that major market upward and downward movements can be well predicted by the asymmetric comovement of betas, which is characterized by an indicator called "Systematic Downside Risk" (SDR). We find that SDR can effectively forecast future stock market movements, and we obtain out-of-sample R-squares (compared with a strategy using the historical mean) of more than 2.27% with monthly data. The second essay reconciles a well-known empirical fact: aggregating positively skewed firm returns leads to negatively skewed market returns. We reconcile this fact through firms' greater response to negative market news than to positive market news. We also propose several market return predictors, such as downside idiosyncratic skewness. The third chapter studies funding liquidity risk based on a general equilibrium model which features two agents: one entrepreneur and one external investor. Only the investor needs to acquire information to estimate the unobservable fundamentals driving the economic outputs. The novelty is that information acquisition is more costly in bad times than in good times, i.e. a counter-cyclical information cost, as supported by previous empirical evidence. We then show that liquidity risks are principally driven by costly learning.
Résumé: This thesis presents theoretical asset pricing models and their empirical applications. My objective is to study the following problems: whether the relationship between asset prices and investors' propensities to gamble and to fear disaster varies over time; whether the conflicting evidence for firm- and market-level skewness can be explained by downside risk; and whether costly learning drives liquidity risk. In addition, empirical tests support the above assumptions and provide new findings on asset pricing, investment decisions and firms' funding liquidity. The first chapter examines an equilibrium model in which investors have heterogeneous propensities to gamble and to fear disaster. Skewness preference represents the desire to gamble, while kurtosis aversion represents fear of disaster. Using US data from 1988 to 2012, my model shows that in bad times risk aversion is higher, more people fear disaster and fewer people gamble, in contrast to good times. This leads to a new empirical finding: gambling preference has a greater impact on asset prices during market downturns than during booms. Exploiting this relationship alone would generate an annual excess return of 7.74% that is not explained by the popular factor models. The second chapter comprises two essays. The first essay introduces a formula based on the conditional CAPM for decomposing market skewness. We find that major upward and downward market movements can be predicted by the comovement of betas. An indicator called Systematic Downside Risk (SDR) is constructed to characterize this asymmetry in the comovement of betas. We find that SDR can effectively forecast future stock market movements, and we obtain out-of-sample R-squares (compared with a strategy using the historical mean) of more than 2.272% with monthly data. An investor timing the market using SDR would have obtained a large increase of 0.206 in the ratio. The second essay reconciles a well-known empirical fact about firm- and market-level skewness: aggregating positively skewed firm returns leads to negatively skewed market returns. We decompose market-return skewness at the firm level and reconcile this fact through firms' greater response to negative market news than to positive market news. This decomposition reveals several effective market-return predictors, such as volatility-weighted characteristic skewness and downside characteristic skewness. The third chapter provides a new theoretical foundation for time-varying liquidity problems in an incomplete-market environment. We propose a general equilibrium model with two agents: an entrepreneur and an external investor. Only the investor needs to know the true state of the firm, so payoff-relevant information is costly. The novelty is that acquiring information costs more in bad times than in good times, as confirmed by previous empirical evidence. When a recession begins, costly learning raises liquidity premia, causing a liquidity dry-up, as also confirmed by previous empirical evidence.
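One standard way to formalize the asymmetric comovement of betas that SDR builds on (the textbook downside/upside-beta split, not necessarily the thesis's exact SDR construction) is
\[
\beta^{-} = \frac{\operatorname{Cov}(r_i, r_m \mid r_m < \mu_m)}{\operatorname{Var}(r_m \mid r_m < \mu_m)}, \qquad
\beta^{+} = \frac{\operatorname{Cov}(r_i, r_m \mid r_m \ge \mu_m)}{\operatorname{Var}(r_m \mid r_m \ge \mu_m)},
\]
with the asymmetry summarized by \(\beta^{-} - \beta^{+}\), i.e. how much more strongly an asset comoves with the market in down markets than in up markets.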
Abstract:
Evaluating the possible benefits of the introduction of genetically modified (GM) crops must address the issue of consumer resistance as well as the complex regulation that has ensued. In the European Union (EU) this regulation envisions the “co-existence” of GM food with conventional and quality-enhanced products, mandates the labelling and traceability of GM products, and allows only a stringent adventitious presence of GM content in other products. All these elements are brought together within a partial equilibrium model of the EU agricultural food sector. The model comprises conventional, GM and organic food. Demand is modelled in a novel fashion, whereby organic and conventional products are treated as horizontally differentiated but GM products are vertically differentiated (weakly inferior) relative to conventional ones. Supply accounts explicitly for the land constraint at the sector level and for the need for additional resources to produce organic food. Model calibration and simulation allow insights into the qualitative and quantitative effects of the large-scale introduction of GM products in the EU market. We find that the introduction of GM food reduces overall EU welfare, mostly because of the associated need for costly segregation of non-GM products, but the producers of quality-enhanced products actually benefit.
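A minimal textbook way to capture the vertical, weakly inferior positioning of GM food relative to conventional food (sketched only to illustrate the modelling choice, not the paper's calibrated demand system): a consumer with taste parameter \(\theta\) obtains utility
\[
U = \theta v - p_c \ \text{(conventional)}, \qquad U = \theta \lambda v - p_g \ \text{(GM)}, \qquad 0 < \lambda \le 1,
\]
so the GM product is chosen only when its price discount outweighs the perceived quality loss, \(p_c - p_g > \theta(1-\lambda)v\); as \(\lambda \to 1\) the two products become homogeneous and the vertical differentiation vanishes.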
Abstract:
Projections of U.S. ethanol production and its impacts on planted acreage, crop prices, livestock production and prices, trade, and retail food costs are presented under the assumption that current tax credits and trade policies are maintained. The projections were made using a multi-product, multi-country deterministic partial equilibrium model. The impacts of higher oil prices, a drought combined with an ethanol mandate, and removal of land from the Conservation Reserve Program (CRP) relative to baseline projections are also presented. The results indicate that expanded U.S. ethanol production will cause long-run crop prices to increase. In response to higher feed costs, livestock farmgate prices will increase enough to cover the feed cost increases. Retail meat, egg, and dairy prices will also increase. If oil prices are permanently $10-per-barrel higher than assumed in the baseline projections, U.S. ethanol will expand significantly. The magnitude of the expansion will depend on the future makeup of the U.S. automobile fleet. If sufficient demand for E-85 from flex-fuel vehicles is available, corn-based ethanol production is projected to increase to over 30 billion gallons per year with the higher oil prices. The direct effect of higher feed costs is that U.S. food prices would increase by a minimum of 1.1% over baseline levels. Results of a model of a 1988-type drought combined with a large mandate for continued ethanol production show sharply higher crop prices, a drop in livestock production, and higher food prices. Corn exports would drop significantly, and feed costs would rise. Wheat feed use would rise sharply. Taking additional land out of the CRP would lower crop prices in the short run. But because long-run corn prices are determined by ethanol prices and not by corn acreage, the long-run impacts on commodity prices and food prices of a smaller CRP are modest. Cellulosic ethanol from switchgrass and biodiesel from soybeans do not become economically viable in the Corn Belt under any of the scenarios. This is so because high energy costs that increase the prices of biodiesel and switchgrass ethanol also increase the price of corn-based ethanol. So long as producers can choose between soybeans for biodiesel, switchgrass for ethanol, and corn for ethanol, they will choose to grow corn. Cellulosic ethanol from corn stover does not enter into any scenario because of the high cost of collecting and transporting corn stover over the large distances required to supply a commercial-sized ethanol facility.
Abstract:
Although the Santiago variety of Cape Verdean Creole (CVC) has been the subject of numerous linguistic works, the second major variety of the language, i.e. the São Vicente variety of CVC (CVSV), has hardly been described. Notwithstanding this lack of studies, and given its striking differences, on all linguistic levels, from the variety of Santiago (CVST), the implicit explanation for such divergences, echoed for decades in the literature on CVC, has been the presumably decreolized character of CVSV. First, this study provides a comprehensive fieldwork-based synchronic description of the major morpho-syntactic categories of CVSV, with the intent of documenting the variety. Second, it aims to place the study of CVSV within the broader scope of contact linguistics in the quest to explain its structure. Based on analyses of historical documents and studies, it reconstructs the sociohistorical scenario of the emergence and development of CVSV in the period 1797-1975. From the comparison of the current structures of CVSV and CVST, the examination of linguistic data in historical texts and the analysis of sociohistorical facts, it becomes clear that the contemporary structure of CVSV stems from the contact-induced changes that occurred during the intensive language and dialect contact on the island of São Vicente in the early days of its settlement in the late 18th century and the ensuing early 19th-century development, rather than from modern-day pressure of Portuguese. Although this dissertation argues for multiple explanations rather than a single theory, by showing that processes such as language shift among the first Portuguese settlers, L2 acquisition, migration of Barlavento speakers and subsequent dialect leveling, as well as language borrowing at a later stage, were at stake, it demonstrates the usefulness of the partial restructuring model proposed by Holm (2004).