942 results for Dual process theories
Abstract:
This study examined the public policies for Professional and Technological Education and their influence on the pedagogical project of the Higher Technology Course in Human Resources Management (Curso Superior de Tecnologia em Gestão de Recursos Humanos) at Faculdade do Pará and, in particular, how this course stands in relation to the historical dual process of educational provision in Brazil. To this end, we analyzed the content of Brazilian public policies for Professional and Technological Education from the 1990s onward. We used qualitative research procedures, drawing on interviews and document analysis, to examine our object of study. Our working hypothesis was that Brazil's professional and technological education policies presuppose training aimed specifically at work, seeking to shape the fundamental social classes, propertied and non-propertied, and that the Higher Technology Course in Human Resources Management at Faculdade do Pará reinforces this same political-pedagogical perspective. We analyzed the documents that regulate higher technology courses in Brazil, the pedagogical project and curricular design of the course in question, and the statements of teachers, technicians, graduates, and administrators of the institution, in the light of authors aligned with historical materialism. We found that both the regulatory documents of Brazilian professional and technological education and those that define the structure of the course studied at Faculdade do Pará have been oriented toward the development of "doing" and "knowing how to do", failing to address the scientific bases of this doing or to provide an education that considers the historical-social relations in which it is embedded. Thus, the direction given to the education of technologists has fundamentally promoted the ability to learn specific technological processes, encouraging production and scientific-technological innovation and their applications in the world of work, with a view to developing general and specific professional technological competencies. We conclude our study with the conviction that the content of the policies for professional and technological education, as well as of the Higher Technology Course in Human Resources Management, reinforces the historical duality of Brazilian education by orienting formative processes solely toward the demands of the productive sectors and toward the acquisition of knowledge related exclusively to the performance of specific functions.
Abstract:
The purpose of this doctoral thesis is to prove existence of a mutually catalytic random walk with infinite branching rate on countably many sites. The process is defined as a weak limit of an approximating family of processes. An approximating process is constructed by adding jumps to a deterministic migration on an equidistant time grid. As the law of the jumps we choose the invariant probability measure of the mutually catalytic random walk with finite branching rate in the recurrent regime. This model was introduced by Dawson and Perkins (1998), and this thesis relies heavily on their work. Due to the properties of this invariant distribution, which is in fact the exit distribution of planar Brownian motion from the first quadrant, it is possible to establish a martingale problem for the weak limit of any convergent sequence of approximating processes. We prove a duality relation for the solution of this martingale problem, which goes back to Mytnik (1996) in the case of finite-rate branching, and this duality gives rise to weak uniqueness for the solution of the martingale problem. Using standard arguments we show that this solution is in fact a Feller process with the strong Markov property. For the case of only one site, we prove that the model we have constructed is the limit of finite-rate mutually catalytic branching processes as the branching rate approaches infinity. Therefore, it seems natural to refer to the above model as an infinite rate branching process. However, a result for convergence on infinitely many sites remains open.
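For orientation, a minimal sketch of the finite-rate Dawson-Perkins system that the approximation starts from, on a countable site space with migration kernel a(i,j) and branching rate γ (notation assumed here, not taken from the thesis):

\[
du_t(i) = \sum_{j} a(i,j)\bigl(u_t(j) - u_t(i)\bigr)\,dt + \sqrt{\gamma\, u_t(i)\, v_t(i)}\; dB^{1}_t(i),
\]
\[
dv_t(i) = \sum_{j} a(i,j)\bigl(v_t(j) - v_t(i)\bigr)\,dt + \sqrt{\gamma\, u_t(i)\, v_t(i)}\; dB^{2}_t(i),
\]

where the B^1(i), B^2(i) are independent Brownian motions; each population branches at a rate proportional to the local mass of the other. The infinite rate process discussed above corresponds to the γ → ∞ limit.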
Abstract:
Intestinal immunoglobulin A (IgA) ensures host defense and symbiosis with our commensal microbiota. Yet previous studies hint at a surprisingly low diversity of intestinal IgA, and it is unknown to what extent the diverse Ig arsenal generated by somatic recombination and diversification is actually used. In this study, we analyze more than one million mouse IgA sequences to describe the shaping of the intestinal IgA repertoire, its determinants, and its stability over time. We show that expanded and infrequent clones combine to form highly diverse polyclonal IgA repertoires with very little overlap between individual mice. Selective homing allows expanded clones to evenly seed the small but not the large intestine. Repertoire diversity increases during aging in a dual process: on the one hand, microbiota-, T cell-, and transcription factor RORγt-dependent but Peyer's patch-independent somatic mutations drive the diversification of expanded clones; on the other hand, new clones are introduced into the repertoire of aged mice. An individual's IgA repertoire is stable and recalled after plasma cell depletion, which is indicative of functional memory. These data provide a conceptual framework for understanding how IgA repertoires change dynamically to match environmental and intrinsic stimuli.
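To illustrate the kind of repertoire statistics discussed (clonal diversity and inter-individual overlap), a minimal sketch in Python; the metrics chosen and the simulated clone counts are hypothetical stand-ins, not the study's pipeline:

```python
import numpy as np

def shannon_diversity(counts):
    """Shannon index of a clone-abundance vector."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def morisita_horn(x, y):
    """Morisita-Horn overlap between two abundance vectors indexed
    over the same clone identities (0 = disjoint, 1 = identical)."""
    px, py = x / x.sum(), y / y.sum()
    return 2 * np.sum(px * py) / (np.sum(px**2) + np.sum(py**2))

# Hypothetical clonal repertoires of two mice over a shared clone index:
# a few expanded clones plus many infrequent ones
rng = np.random.default_rng(1)
mouse_a = rng.poisson(rng.exponential(5.0, 500))
mouse_b = rng.poisson(rng.exponential(5.0, 500))
print(f"diversity A: {shannon_diversity(mouse_a):.2f}")
print(f"overlap A-B: {morisita_horn(mouse_a, mouse_b):.3f}")
```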
Abstract:
We consider black probes of Anti-de Sitter and Schrödinger spacetimes embedded in string theory and M-theory and perturbatively construct new black hole geometries. We begin by reviewing black string configurations in Anti-de Sitter dual to finite-temperature Wilson loops in the deconfined phase of the gauge theory and generalise the construction to the confined phase. We then consider black strings in thermal Schrödinger, obtained via a null Melvin twist of the extremal D3-brane, and construct three distinct types of black string configurations with spacelike as well as lightlike separated boundary endpoints. One of these configurations interpolates between the Wilson loop operators with bulk duals defined in Anti-de Sitter and another class of Wilson loop operators with bulk duals defined in Schrödinger. The case of black membranes with boundary endpoints on the M5-brane, dual to Wilson surfaces in the gauge theory, is analysed in detail. Four types of black membranes, ending on the null Melvin twist of the extremal M5-brane and exhibiting the Schrödinger symmetry group, are then constructed. We highlight the differences between Anti-de Sitter and Schrödinger backgrounds and comment on the properties of the corresponding dual gauge theories.
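For orientation, one common presentation of the z = 2 Schrödinger spacetime produced by such null Melvin twists (a sketch in conventions that may differ from the paper's):

\[
ds^2 = L^2 \left( -\frac{dt^2}{z^4} + \frac{-2\,dt\,d\xi + dx_1^2 + dx_2^2 + dz^2}{z^2} \right),
\]

where ξ is the compact null direction generated by the twist; dropping the dt²/z⁴ term recovers AdS in Poincaré coordinates written in lightcone form. The anisotropic scaling t → λ²t, (x, z) → λ(x, z), ξ → ξ is the Schrödinger symmetry referred to above.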
Abstract:
OBJECTIVE Intense alcohol consumption is a risk factor for a number of health problems. Dual-process models assume that self-regulatory behavior such as drinking alcohol is guided by both reflective and impulsive processes. Evidence suggests that (a) impulsive processes such as implicit attitudes are more strongly associated with behavior when executive functioning abilities are low, and (b) higher neural baseline activation in the lateral prefrontal cortex (PFC) is associated with better inhibitory control. The present study integrates these 2 strands of research to investigate how individual differences in neural baseline activation in the lateral PFC moderate the association between implicit alcohol attitudes and drinking behavior. METHOD Baseline cortical activation was measured with resting electroencephalography (EEG) in 89 moderate drinkers. In a subsequent behavioral testing session they completed measures of implicit alcohol attitudes and self-reported drinking behavior. RESULTS Implicit alcohol attitudes were related to self-reported alcohol consumption. Most centrally, implicit alcohol attitudes were more strongly associated with drinking behavior in individuals with low as compared with high baseline activation in the right lateral PFC. CONCLUSIONS These findings are in line with predictions made on the basis of dual-process models. They provide further evidence that individual differences in neural baseline activation in the right lateral PFC may contribute to executive functioning abilities such as inhibitory control. Moreover, individuals with strongly positive implicit alcohol attitudes coupled with a low baseline activation in the right lateral PFC may be at greater risk of developing unhealthy drinking patterns than others.
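As an illustration of the kind of moderation analysis reported, here is a minimal sketch in Python; the variable names and simulated data are hypothetical stand-ins, not the study's data or pipeline:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 89  # sample size matching the study

# Hypothetical standardized predictors
implicit_attitude = rng.standard_normal(n)  # e.g., implicit association scores
pfc_baseline = rng.standard_normal(n)       # resting EEG activation, right lateral PFC
drinking = (0.4 * implicit_attitude
            - 0.3 * implicit_attitude * pfc_baseline
            + rng.standard_normal(n))

# Moderated regression: attitude x baseline-activation interaction
X = np.column_stack([implicit_attitude, pfc_baseline,
                     implicit_attitude * pfc_baseline])
X = sm.add_constant(X)
model = sm.OLS(drinking, X).fit()
print(model.summary())  # a negative interaction term mirrors the reported moderation
```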
Abstract:
Malmö Eastern Cemetery (1916-1973) has received little international attention, despite the fact that Swedish architect Sigurd Lewerentz (1885-1975) led a large-scale experimental project there, rejecting the picturesque cemetery plans popular in Europe at the time and developing a more orthogonal layout, linked to the site in different ways.
After Lewerentz won first prize in the competition to develop Malmö Eastern Cemetery, the project unfolded over a period of nearly sixty years. The Malmö Eastern Cemetery project offers a unique lens through which to examine how relationships were established between the architectural project and various features of the landscape: with the urban environment and various social circumstances, using greenery as a boundary; with topography, through the scale of construction and materials; with tradition, by designing plans, sections and construction; with the international architectural arena, through the building of walls; with common construction methods in Sweden, using ordinary materials; and with the surroundings, through detailing. The hypothesis of this thesis is that the notion of limit in Lewerentz's work, and in Malmö Eastern Cemetery in particular, has a dual nature, simultaneously creating separations and connections with various aspects of place and time. This research documents and examines the construction of Malmö Eastern Cemetery by studying original drawings and writings from various archives, field data gathered at the project and at other cemeteries of the period, and the existing literature. By providing a detailed study of Lewerentz's evolving approach to the relationship between architecture and landscape, this thesis defines a framework in which Lewerentz's work may be understood in a more comprehensive way. Lewerentz was critical of some of the models of his time, while establishing connections with aspects of people's everyday lives. This set of references defines a particular setting, a set of circumstances linked to the project, which redefine the site in an abstract way. Within this double role of limits, a space of intersection is generated, which can be understood as a place of convergence for people. In this set of connections, the user is always at the centre of the design: a portico that provides shelter, an avenue of trees that produces a tranquil space of commemoration, a simplified window. This thesis contributes to research in the field of integrating an architectural project into a site by describing a design methodology in which the elements used to subdivide spaces are also used to connect them to a set of selected references at different scales, locations and times. The dual process involved in building limits encourages proximity to some aspects of the environment while rejecting others, thereby allowing for a critical perspective and the proposal of alternative situations.
Abstract:
The Tunisian constitution of 27 January 2014 was deemed essentially compatible with international human rights principles and standards. It was adopted as the outcome of a dual process, underway both inside the National Constituent Assembly (NCA) and outside it, between the NCA and civil society stakeholders. Three successive drafts fell considerably short of expectations (6 August 2012, 14 December 2012 and 22 April 2013). The fourth draft (1 June 2013) was still fraught with some 20 fundamental divergences. These were resolved thanks to the National Dialogue, in cooperation with the ad hoc "consensus commission" (lajnet tawafuqat) within the NCA, chaired by Mustapha Ben Jaafar (President of the NCA). The final text was overwhelmingly adopted on 26 January 2014 by 200 votes, with 12 against and four abstentions. It was promulgated on 10 February 2014.
Abstract:
Following study, participants received 2 tests. The 1st was a recognition test; the 2nd was designed to tap recollection. The objective was to examine performance on Test 1 conditional on Test 2 performance. In Experiment 1, contrary to process dissociation assumptions, exclusion errors better predicted subsequent recollection than did inclusion errors. In Experiments 2 and 3, with alternate questions posed on Test 2, words having high estimates of recollection with one question had high estimates of familiarity with the other question. Results supported the following: (a) the 2-test procedure has considerable potential for elucidating the relationship between recollection and familiarity; (b) there is substantial evidence for dependency between such processes when estimates are obtained using the process dissociation and remember-know procedures; and (c) order of information access appears to depend on the question posed to the memory system.
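For reference, a sketch of the standard process-dissociation estimates at issue here (Jacoby's equations; the notation is assumed, not taken from the paper). With I and E the probabilities of calling a critical item "old" under inclusion and exclusion instructions, recollection R and familiarity F are estimated as

\[
R = I - E, \qquad F = \frac{E}{1 - R}.
\]

The dependency findings reported above matter because these estimates are derived under the assumption that R and F are independent.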
Abstract:
Affective learning, the acquisition of likes and dislikes, was investigated in two experiments using verbal ratings and affective priming as indices of affective change. In both experiments, neutral geometric shapes were paired with pleasant or unpleasant pictures in a picture-picture conditioning procedure so that the shapes would acquire positive or negative valence. Experiment 1 found acquisition of positive valence; however, this valence change was lost after extinction training. Experiment 2 used more salient pictures as unconditioned stimuli. Positive and negative valence was acquired during paired presentations and retained across extinction training. The results of Experiment 2 are consistent with dual-process accounts, which claim that evaluative conditioning is distinct from Pavlovian conditioning because it is resistant to extinction.
Abstract:
Semantic priming occurs when a subject is faster to recognise a target word when it is preceded by a related rather than an unrelated word. The effect is attributed to automatic or controlled processing mechanisms elicited by short or long interstimulus intervals (ISIs) between primes and targets. We employed event-related functional magnetic resonance imaging (fMRI) to investigate blood oxygen level-dependent (BOLD) responses associated with automatic semantic priming, using an experimental design identical to that used in standard behavioural priming tasks. Prime-target semantic strength was manipulated by using lexically ambiguous primes (e.g., bank) and target words related to the dominant or subordinate meaning of the ambiguity. Subjects made speeded lexical decisions (word/nonword) on dominant related, subordinate related, and unrelated word pairs presented randomly with a short ISI. The major finding was a pattern of reduced activity in middle temporal and inferior prefrontal regions for dominant versus unrelated and subordinate versus unrelated comparisons, respectively. These findings are consistent with both a dual process model of semantic priming and recent repetition priming data suggesting that reductions in BOLD responses represent neural priming associated with automatic semantic activation, and they implicate the left middle temporal cortex and inferior prefrontal cortex in more automatic aspects of semantic processing.
Abstract:
This paper addresses the dearth of research into material artifacts and how they are engaged in strategizing activities. Building on the strategy-as-practice perspective and the notion of epistemic objects, we develop a typology of strategy practices that shows how managers use material artifacts to strategize through a dual process of knowledge abstraction and substitution. Empirically, we study the practice of underwriting managers in reinsurance companies. Our findings first identify the artifacts – pictures, maps, data packs, spreadsheets and graphs – that these managers use to appraise reinsurance deals. Second, the analysis of each artifact's situated use leads to the identification of five practices for doing strategy with artifacts: physicalizing, locating, enumerating, analyzing, and selecting. Last, we develop a typology that shows how practices vary in their level of abstraction from the physical properties of the risk being reinsured and unfold through a process of substitution. Our conceptual framework extends existing work in the strategy-as-practice field that calls for research into the role of material artifacts.
Abstract:
Nowadays, driven by regulation and internal motivations, financial institutions pay closer attention to their risks. Besides the previously dominant market and credit risks, a new trend is to handle operational risk systematically. Operational risk is the risk of loss resulting from inadequate or failed internal processes, people and systems, or from external events. First we present the basic features of operational risk and its modelling and regulatory approaches, and then we analyse operational risk within a simulation framework of our own development. Our approach is based on the analysis of a latent risk process instead of the manifest risk process widely used in the risk literature. In our model the latent risk process is a stochastic process, the so-called Ornstein-Uhlenbeck process, which is a mean-reverting process. Within this framework we define a catastrophe as the breach of a critical barrier by the process. We analyse the distributions of catastrophe frequency, severity and first time to hit, not only for a single process but for a dual process as well. Based on our first results, we could not falsify the Poisson character of the frequency or the long-tailed character of the severity. The distribution of the first time to hit requires more sophisticated analysis. At the end of the paper we examine the advantages of simulation-based forecasting and conclude with possible directions for further research.
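A minimal sketch of the kind of simulation described: an Euler-Maruyama discretisation of the latent Ornstein-Uhlenbeck process, with a catastrophe defined as the first breach of a critical barrier. All parameter values are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameters: mean-reversion speed, level, volatility, barrier
theta, mu, sigma = 1.0, 0.0, 0.5
barrier = 0.75
dt, n_steps, n_paths = 0.01, 5000, 2000

# Euler-Maruyama simulation of dX = theta*(mu - X) dt + sigma dW
x = np.full(n_paths, mu)
first_hit = np.full(n_paths, np.nan)
for k in range(n_steps):
    x += theta * (mu - x) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    newly_hit = np.isnan(first_hit) & (x >= barrier)
    first_hit[newly_hit] = (k + 1) * dt  # record first barrier breach per path

times = first_hit[~np.isnan(first_hit)]
print(f"catastrophe frequency: {times.size / n_paths:.2f}")
print(f"mean first time to hit: {times.mean():.2f}")
```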
Abstract:
Family caregivers manage home enteral nutrition (HEN) for over 77% of an estimated 1 of every 400 Medicare recipients. Increasing usage of HEN in older adults, combined with reliance on family caregivers, raises concerns about the quality, outcomes, and costs of care. These concerns are relevant in light of Medicare limitations on nursing assistance and non-reimbursement for nutrition services, despite annual costs of over $600 million. This study applied stress process theories to assess stressor, mediator, and outcome variables salient to HEN and caregiving. In-home structured interviews were conducted with a multi-ethnic sample of 30 caregiving dyads at 1-3 months after discharge on HEN. Care recipients were aged ≥60 (M = 68.4 years) and did not have dementia. Caregivers were aged ≥21, unpaid, and lived within 45 minutes of care recipients. Caregivers performed an average of 19.7 tasks daily for 61.9 hours weekly. Training needs were identified for 33 functional, care management, technical, and nutritional tasks. Preparedness scores were low (M = 1.73/4.0) and correlated positively with competence, self-rated quality of care, and positive feelings, and negatively with overload, role captivity, and negative feelings (Ps < .05). Caregivers reported multiple changes in lifestyle and dietary behaviors. Lifestyle changes correlated positively with overload, and negatively with preparedness and positive feelings. Dietary changes correlated positively with number of tasks, overload, role captivity, and negative feelings, and negatively with preparedness (Ps < .01). Fifty-seven percent of caregivers aged >50 were at nutrition risk. Care recipients fared worse: average weight change was −4.35 pounds (P < .001), physical complications interrupted daily enteral infusions, and water intake was half of fluid need and associated with signs of dehydration (P < .001). Physical and social function was poor, with older subjects more impaired (P < .04). Those with better-prepared or less-overloaded caregivers had higher functionality and QOL (P < .002). Complications, type of feeding tube, and caregiver preparedness correlated with frequency of health care utilization (Ps < .05). Efficacy of HEN in older adults requires specialized caregiver training, attention to caregivers' needs, and frequent monitoring by a highly skilled multidisciplinary team including dietitians.
Abstract:
My thesis consists of three essays that investigate strategic interactions between individuals engaging in risky collective action in uncertain environments. The first essay analyzes a broad class of incomplete information coordination games with a wide range of applications in economics and politics. The second essay draws from the general model developed in the first essay to study decisions by individuals of whether to engage in protest/revolution/coup/strike. The final essay explicitly integrates state response into the analysis. The first essay, Coordination Games with Strategic Delegation of Pivotality, exhaustively analyzes a class of binary-action, two-player coordination games in which players receive stochastic payoffs only if both players take a "stochastic-coordination action". Players receive conditionally independent noisy private signals about the normally distributed stochastic payoffs. With this structure, each player can exploit the information contained in the other player's action only when he takes the "pivotalizing action". This feature has two consequences: (1) when the fear of miscoordination is not too large, in order to utilize the other player's information, each player takes the "pivotalizing action" more often than he would based solely on his private information, and (2) best responses feature both strategic complementarities and strategic substitutes, implying that the game is neither supermodular nor a typical global game. This class of games has applications to a wide range of economic and political phenomena, including war and peace, protest/revolution/coup/strike, interest group lobbying, international trade, and the adoption of a new technology. My second essay, Collective Action with Uncertain Payoffs, studies the decision problem of citizens who must decide whether to submit to the status quo or mount a revolution. If they coordinate, they can overthrow the status quo. Otherwise, the status quo is preserved and participants in a failed revolution are punished. Citizens face two types of uncertainty: (a) non-strategic, in that they are uncertain about the relative payoffs of the status quo and revolution, and (b) strategic, in that they are uncertain about each other's assessments of the relative payoffs. I draw on the existing literature and historical evidence to argue that uncertainty in the payoffs of the status quo and revolution is intrinsic to politics. Several counter-intuitive findings emerge: (1) Better communication between citizens can lower the likelihood of revolution. In fact, when the punishment for failed protest is not too harsh and citizens' private knowledge is accurate, further communication reduces incentives to revolt. (2) Increasing strategic uncertainty can increase the likelihood of revolution attempts, and even the likelihood of successful revolution. In particular, revolt may be more likely when citizens obtain information privately than when they receive it from a common media source. (3) Two dilemmas arise concerning the intensity and frequency of punishment (repression) and the frequency of protest. Punishment Dilemma 1: harsher punishments may increase the probability that punishment is materialized. That is, as the state increases the punishment for dissent, it might also have to punish more dissidents. Only when the punishment is sufficiently harsh does harsher punishment reduce the frequency of its application.
Punishment Dilemma 1 leads to Punishment Dilemma 2: the frequencies of repression and protest can be positively or negatively correlated depending on the intensity of repression. My third essay, The Repression Puzzle, investigates the relationship between the intensity of grievances and the likelihood of repression. First, I make the observation that the occurrence of state repression is a puzzle: if repression is to succeed, dissidents should not rebel; if it is to fail, the state should concede in order to save the costs of unsuccessful repression. I then propose an explanation for the "repression puzzle" that hinges on information asymmetries between the state and dissidents about the costs of repression to the state, and hence the likelihood of its application by the state. I present a formal model that combines the insights of grievance-based and political process theories to investigate the consequences of this information asymmetry for the dissidents' contentious actions and for the relationship between the magnitude of grievances (formulated here as the extent of inequality) and the likelihood of repression. The main contribution of the paper is to show that this relationship is non-monotone: as the magnitude of grievances increases, the likelihood of repression might decrease. I investigate the relationship between inequality and the likelihood of repression in all country-years from 1981 to 1999. To mitigate specification problems, I estimate the probability of repression using a generalized additive model with thin-plate splines (GAM-TPS). This technique allows for a flexible relationship between inequality, the proxy for the costs of repression and revolution (income per capita), and the likelihood of repression. The empirical evidence supports my prediction that the relationship between the magnitude of grievances and the likelihood of repression is non-monotone.
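As a sketch of the estimation strategy named here, a minimal Python example; pygam's penalized B-splines stand in for the thin-plate splines of GAM-TPS (typically fit with R's mgcv), and all variables and data are hypothetical:

```python
import numpy as np
from pygam import LogisticGAM, s

rng = np.random.default_rng(7)
n = 2000  # hypothetical country-year observations

# Hypothetical covariates: inequality (Gini-like) and log income per capita
inequality = rng.uniform(20, 60, n)
log_income = np.log(rng.lognormal(8.0, 1.0, n))

# Simulated non-monotone relationship between inequality and repression
logit = -2 + 0.15 * inequality - 0.002 * inequality**2 - 0.1 * log_income
repression = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Logistic GAM with a smooth term per covariate
X = np.column_stack([inequality, log_income])
gam = LogisticGAM(s(0) + s(1)).fit(X, repression)
gam.summary()  # the smooth on inequality should recover the non-monotone shape
```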
Abstract:
Previous research with the ratio-bias task found larger response latencies for conflict trials, where the heuristic- and analytic-based responses are assumed to be in opposition (e.g., choosing between 1/10 and 9/100 ratios of success), than for no-conflict trials, where both processes converge on the same response (e.g., choosing between 1/10 and 11/100). This pattern is consistent with parallel dual-process models, which assume that there is effective, rather than lax, monitoring of the output of heuristic processing. It is, however, unclear why conflict resolution sometimes fails. Ratio-biased choices may increase because of a decline in analytic reasoning (leaving heuristic-based responses unopposed) or because of a rise in heuristic processing (making it more difficult for analytic processes to override the heuristic preferences). Using the process-dissociation procedure, we found that instructions to respond logically and response speed affected analytic (controlled) processing (C), leaving heuristic processing (H) unchanged, whereas the intuitive preference for large numerators (as assessed by responses to equal-ratio trials) affected H but not C. These findings create new challenges for the debate between dual-process and single-process accounts, which are discussed.