987 results for "interaction history"
Abstract:
The former occurrence of the North Atlantic right whale Eubalaena glacialis on the Portuguese coast may be inferred from the historical range of that species in Europe and in NW Africa. It is generally accepted that it was the main prey of coastal whaling in the Middle Ages and in the pre-modern period, but this assumption still needs to be substantiated by biological and archaeological evidence. We describe the skeletal remains of right whales excavated at Peniche in 2001-2002, in association with archaeological artefacts. The whale bones were covered by sandy sediments on the old seashore and have been tentatively dated to around the 16th to 17th centuries. This study contributes material evidence of the former occurrence of E. glacialis in Portugal (West Iberia). Some whale bones show unequivocal man-made scars, associated with wounds from instruments with a sharp cutting blade. This evidence of past human interaction may suggest that whaling for this species was active at Peniche around the early 17th century.
Abstract:
Introduction: Purpureocillium lilacinum is emerging as a causal agent of hyalohyphomycosis that is refractory to antifungal drugs; however, the pathogenic mechanisms underlying P. lilacinum infection are not understood. In this study, we investigated the interaction of P. lilacinum conidia with human macrophages and dendritic cells in vitro. Methods: Spores of a P. lilacinum clinical isolate were obtained by chill-heat shock. Mononuclear cells were isolated from eight healthy individuals. Monocytes were separated by cold aggregation and differentiated into macrophages by incubation for 7 to 10 days at 37°C, or into dendritic cells by the addition of the cytokines human granulocyte-macrophage colony stimulating factor and interleukin-4. Conidial suspension was added to the human cells at 1:1, 2:1, and 5:1 (conidia:cells) ratios for 1 h, 6 h, and 24 h, and the infection was evaluated by Giemsa staining and light microscopy. Results: After 1 h of interaction, P. lilacinum conidia were internalized by human cells, and after 6 h of contact, some conidia became inflated. After 24 h of interaction, the conidia produced germ tubes and hyphae, leading to the disruption of macrophage and dendritic cell membranes. The infection rate analyzed after 6 h of incubation of P. lilacinum conidia with cells at 2:1 and 1:1 ratios was 76.5% and 25.5%, respectively, for macrophages and 54.3% and 19.5%, respectively, for cultured dendritic cells. Conclusions: P. lilacinum conidia are capable of infecting and destroying both macrophages and dendritic cells, clearly demonstrating the ability of this pathogenic fungus to invade human phagocytic cells.
Abstract:
Economics is a social science which, therefore, focuses on people and on the decisions they make, be it in an individual context or in group situations. It studies human choices in the face of needs to be fulfilled and a limited amount of resources with which to fulfill them. For a long time, there was a convergence between the normative and positive views of human behavior, in that the ideal and the predicted decisions of agents in economic models were entangled in one single concept. That is, it was assumed that the best that could be done in each situation was exactly the choice that would prevail. Or, at least, that the facts that economics needed to explain could be understood in the light of models in which individual agents act as if they were able to make ideal decisions. However, in the last decades, the complexity of the environment in which economic decisions are made and the limits on the ability of agents to deal with it have been recognized and incorporated into models of decision making, in what came to be known as the bounded rationality paradigm. This was triggered by the incapacity of the unbounded rationality paradigm to explain observed phenomena and behavior. This thesis contributes to the literature in three different ways.

Chapter 1 is a survey of bounded rationality, which gathers and organizes the contributions to the field since Simon (1955) first recognized the need to account for the limits on human rationality. The focus of the survey is on theoretical work rather than on the experimental literature, which presents evidence of actual behavior that differs from what classic rationality predicts. The general framework is as follows. Given a set of exogenous variables, the economic agent needs to choose an element from the choice set available to him in order to optimize the expected value of an objective function (assuming his preferences are representable by such a function). If this problem is too complex for the agent to deal with, one or more of its elements is simplified. Each bounded rationality theory is categorized according to the most relevant element it simplifies.

Chapter 2 proposes a novel theory of bounded rationality. Much in the same fashion as Conlisk (1980) and Gabaix (2014), we assume that thinking is costly in the sense that agents have to pay a cost for performing mental operations. In our model, if they choose not to think, that cost is avoided, but they are left with a single alternative, labeled the default choice. We exemplify the idea with a very simple model of consumer choice and introduce the concept of isofin curves, i.e., sets of default choices which generate the same utility net of thinking cost. Then, we apply the idea to a linear symmetric Cournot duopoly, in which the default choice can be interpreted as the most natural quantity to be produced in the market. We find that, as the thinking cost increases, the number of firms thinking in equilibrium decreases. More interestingly, for intermediate levels of thinking cost, there exists an equilibrium in which one of the firms chooses the default quantity and the other best responds to it, generating asymmetric choices in a symmetric model. Our model is able to explain well-known regularities identified in the Cournot experimental literature, such as the adoption of different strategies by players (Huck et al., 1999), the intertemporal rigidity of choices (Bosch-Domènech & Vriend, 2003) and the dispersion of quantities in the context of difficult decision making (Bosch-Domènech & Vriend, 2003).

Chapter 3 applies a model of bounded rationality in a game-theoretic setting to the well-known turnout paradox: in large elections, pivotal probabilities vanish very quickly and no one should vote, in sharp contrast with the observed high levels of turnout. Inspired by the concept of rhizomatic thinking, introduced by Bravo-Furtado & Côrte-Real (2009a), we assume that each person is self-delusional in the sense that, when making a decision, she believes that a fraction of the people who support the same party decides alike, even if no communication is established between them. This kind of belief simplifies the decision of the agent, as it reduces the number of players he believes to be playing against; it is thus a bounded rationality approach. Studying a two-party first-past-the-post election with a continuum of self-delusional agents, we show that the turnout rate is positive in all possible equilibria and that it can be as high as 100%. The game displays multiple equilibria, at least one of which entails a victory for the bigger party. The smaller party may also win, provided its relative size is not too small; more self-delusion among the minority party's voters decreases this threshold size. Our model is able to explain some empirical facts, such as the possibility that a close election leads to low turnout (Geys, 2006), a lower margin of victory when turnout is higher (Geys, 2006) and high turnout rates favoring the minority (Bernhagen & Marsh, 1997).
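The abstract does not give the model's functional forms, so the following is only a minimal sketch, in Python, of the kind of thinking-cost Cournot game it describes. It assumes a standard linear inverse demand P = a - b(q1 + q2), a constant marginal cost c, an illustrative default quantity q_default and a thinking cost k paid only by a firm that best responds; the functions and parameter values are hypothetical and not taken from the thesis.

    # Sketch of a thinking-cost Cournot duopoly (hypothetical parametrisation).
    # A "thinking" firm pays k and best responds; a non-thinking firm produces
    # the default quantity for free.

    def profit(qi, qj, a=100.0, b=1.0, c=10.0):
        """Profit of a firm producing qi when its rival produces qj."""
        price = max(a - b * (qi + qj), 0.0)
        return (price - c) * qi

    def best_response(qj, a=100.0, b=1.0, c=10.0):
        """Standard Cournot best response to the rival's quantity qj."""
        return max((a - c - b * qj) / (2.0 * b), 0.0)

    def equilibria(q_default, k, a=100.0, b=1.0, c=10.0):
        """Check which profiles (both think / one thinks / none thinks) are equilibria."""
        q_star = (a - c) / (3.0 * b)  # symmetric Cournot-Nash quantity
        results = {}

        # Both firms think: each pays k; deviating means producing q_default for free.
        results["both_think"] = (
            profit(q_star, q_star, a, b, c) - k >= profit(q_default, q_star, a, b, c)
        )

        # One firm thinks (best responds to q_default), the other keeps the default.
        q_think = best_response(q_default, a, b, c)
        thinker_ok = (
            profit(q_think, q_default, a, b, c) - k >= profit(q_default, q_default, a, b, c)
        )
        defaulter_ok = profit(q_default, q_think, a, b, c) >= (
            profit(best_response(q_think, a, b, c), q_think, a, b, c) - k
        )
        results["one_thinks"] = thinker_ok and defaulter_ok

        # Neither firm thinks: deviating means paying k to best respond to q_default.
        results["none_thinks"] = (
            profit(q_default, q_default, a, b, c)
            >= profit(q_think, q_default, a, b, c) - k
        )
        return results

    if __name__ == "__main__":
        for k in (5.0, 100.0, 400.0):
            print(k, equilibria(q_default=40.0, k=k))

With these illustrative numbers, both firms think when k is small, the asymmetric profile (one firm thinking, the other keeping the default) is an equilibrium only for intermediate k, and neither firm thinks when k is large, mirroring the comparative statics stated in the abstract.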
Abstract:
In the last 15 years, different types of Triatominae resistance to different insecticides have been reported; resistance may therefore be more widespread than is currently known, requiring better characterization and delimitation, which was the aim of this review. The review was based on a literature search of all articles from 1970 to 2015 in the PubMed database containing the keywords "insecticide resistance" and "Triatominae". Out of 295 articles screened by title, 33 texts were selected for detailed analysis. Insecticide resistance in triatomines is a complex phenomenon that has been reported primarily in Argentina and Bolivia and is caused by different factors (associated or isolated). Insecticide resistance in Triatominae is a characteristic inherited in an autosomal and semi-dominant manner, is polygenic, and is present in both domestic and sylvatic populations. The toxicological profile observed in eggs cannot be transposed to other developmental stages. Different toxicological profiles exist at macro- and microgeographical levels. The resistance phenotype has both reproductive and developmental costs. Different physiological mechanisms are involved in resistance. Studies of triatomine resistance to insecticides highlight three deficiencies in interpreting the obtained results: i) the vast diversity of methodologies, despite the existence of a single guiding protocol; ii) the lack of information on the actual impact of resistance ratios in the field; and iii) the concept of the susceptibility reference lineage. Research on the biological and behavioral characteristics of each Triatominae species that has evolved resistance is required in relation to the environmental conditions of each region.
Abstract:
Chagas disease (CD) is a parasitic infection that originated in the Americas and is caused by Trypanosoma cruzi. In the last few years, the disease has spread to countries in North America, Asia and Europe due to the migration of Latin Americans. In the Brazilian Amazon, CD transmission is endemic, especially in the Rio Negro region, where an occupational hazard has been described for piaçaveiros (piassaba gatherers). In the State of Amazonas, the first chagasic infection was reported in 1977, and the first acute CD case was recorded in 1980. After initiatives to integrate acute CD diagnostics into the malaria laboratory network, reports of acute CD cases have increased. Most of these cases are associated with oral transmission through the consumption of contaminated food. Chronic cases have also been diagnosed, mostly in the indeterminate form; these were detected through serological surveys in cardiology outpatient clinics and during blood donor screening. Considering that the control mechanisms adopted in Brazil's classic transmission areas are not fully applicable in the Amazon, it is important to understand the behavior of the disease in this region, in both acute and chronic cases. Therefore, the pursuit of control measures for the Amazon region should be a priority, given that CD represents a challenge to preserving the way of life of the Amazon's inhabitants.
Abstract:
PURPOSE: In spite of the long experience with the treatment of intermittent claudication, little is known about the natural history of stenotic lesions in the iliac segment. With the advent of endovascular treatment, this knowledge has become important. METHODS: Fifty-two stenoses, diagnosed using arteriography in 38 claudicant patients, were analyzed. After a minimum interval of 6 months, magnetic resonance angiography was performed to determine whether arterial occlusion had occurred. The primary factors that could influence the progression of a stenosis were analyzed, namely risk factors (smoking, hypertension, diabetes, sex, and age), compliance with clinical treatment, initial degree of stenosis, site of the stenosis, and length of follow-up. RESULTS: The average length of follow-up was 39 months. Of the 52 lesions analyzed, 13 (25%) progressed to occlusion. When occlusion occurred, there was clinical deterioration in 63.2% of cases; this association was statistically significant (P = .002). There was no statistically significant association between progression of the lesion and the degree or site of stenosis, compliance with treatment, or length of follow-up. Patients whose lesions progressed to occlusion were younger (P = .02). A logistic regression model showed that the determinant factors for clinical deterioration were arterial occlusion and noncompliance with clinical treatment. CONCLUSIONS: The progression of a stenosis to occlusion, which occurred in 25% of the cases, caused clinical deterioration. Clinical treatment was important, but it did not forestall arterial occlusion. Prevention of occlusion could be achieved by early endovascular intervention or by the development of drugs that might stabilize the atherosclerotic plaque.
Abstract:
The electric organ discharge (EOD) frequency of a South American pulse-type gymnotiform fish (40-100 Hz), Rhamphichthys rostratus, was studied. The animals were kept in pairs in an aquarium and observed: variation in EOD frequency had at least two components, one more positively correlated with temperature and another, less positively correlated, due to social interaction.
Abstract:
The paper will address George Kubler’s Portuguese Plain Architecture [PPA] (1972) and its effect on Portuguese architectural practice. Kubler’s philosophy of art history implied that closed sequences of objects could be reopened for several reasons. It will thus be argued that there is an effect upon Portuguese architecture after 1974, apparent in the reemergence of some of the form classes treated by Kubler. This was achieved mostly through the popularity of Kubler’s book within architectural practice and scholarship and, moreover, through the establishment of the term “Plain Architecture” in Portuguese architectural vocabulary. The Plain Architecture of the seventeenth and eighteenth centuries shared some qualities with the architecture to be built in post‑revolutionary Portugal, most importantly the effect that could be achieved with low-budget buildings that responded to a situation of crisis while simultaneously exuding aristocratic sparsity. The connection of PPA with the ideological attributes of early modernism and with the political context of the time catalysed the reemergence of a new order of Portuguese Plain that still resonates in contemporary architecture.
Abstract:
This paper aims to explore the ways in which standard art history terminology shapes the practice of art history by conditioning the interpretation of specific works of art and, in certain cases, the definition of a research subject (especially where questions of genre and periodization are concerned). Taking as a case study a painting by Georges de La Tour, the Peasant Couple Eating, I will argue that terms such as realism, realistic, naturalistic, etc., used for its description and/or interpretation, far from constituting objective stylistic characterizations, shape our perception of the work in question. Bringing the question of social class to the center of the discourse on realism, I propose to show how the social divide between the painter and his subject matter (in this case, the peasants) is internalized in the painting’s style and meaning, and how it is fundamental for the understanding of its intentionality and function.
Abstract:
This article aims to reconstruct the critical debate surrounding the crisis in the disciplines of art history and criticism, with a particular focus on the proposals formulated by the U.S. theorists who contributed to the journal October. The discrediting of many modernist critical methods, particularly that of Clement Greenberg – the formalist diktat – marked the birth of the journal and gave rise to proposals set forth by critics committed to a new approach. Their divergent positions have nonetheless contributed to undermining the traditional concepts of the autonomy of art and of criticism. The proposals discussed over the course of the journal’s publication were the result of a reappraisal of the disciplinary instruments of art history and criticism in the wake of the crucial cultural changes that took place in the 1980s.
Abstract:
This research project was initially structured around four major chapters (four broad thematic lines), all of them extensively developed so that we might map some of the main territories and symptoms of contemporary art. Each chapter rests precisely on the principles of a malleable structure that is, to all intents and purposes, a work in progress, in this case thanks to the plasticity of the body, of space, of the image and of the creative use of digital technologies, within which, moreover, everything around us today seems to be produced, transformed and disseminated (almost as if it were a genuine interactive journey). Hence, from this point on, the effort that follows will attempt to test a working hypothesis (to develop an investigation) that may allow us to open up paths towards the endless tunnels of the future, always in the hope of giving form, function and meaning to an irrepressible desire for creative freedom, for contemporary art has that extraordinary capacity to transport us to many other places in the world, as real and imaginary as our own lives. This being so, some of the main stages to be developed over the course of this investigation should be summarised. In a first moment, we begin by reflecting on the broad concept of «crisis» (the crisis of modernity), and then address the crisis of the old aesthetic categories, thereby questioning the concepts of the «beautiful» (Plato) and of «taste» (Kant), as well as the concept of «form» (Focillon), not only in order to understand some of the main reasons behind the so-called «end of art» (Hegel), but also some of those that led to the generalised aestheticisation of contemporary experience and its dissemination across the most varied digital platforms. In a second moment, we reflect on some of the main problems of the unsettling history of images, namely in order to understand how these technical transformations (linked to the emergence of photography, cinema, video, the computer and the internet) have contributed to establishing and expanding what we have all come to know as the new «age of the image», or the image in the «age of its own technical reproducibility» (Benjamin), since only in this way can we interrogate the unstoppable process of movement, fragmentation, dissemination, simulation and interaction of the most varied «forms of life» (Nietzsche, Agamben).
In the third major moment, we are interested in perceiving contemporary art as a kind of interactive platform which, in turn, leads us to question some of the main metaphorical and experimental devices of the journey, in this case the journey as a line facilitating access to art, to culture and to contemporary life in general; that is, a process of reflection that prompts us to map some of the most attractive symptoms arising from the aesthetics of the flâneur (in the perspective of Rimbaud, Baudelaire, Long and Benjamin) and, consequently, to summon some of the main sensations stemming from the highly seductive experience of those who live immersed in the interactive orbit of cyberspace (as cyberflâneurs), almost as if the entire world were now simply a poetic space that is «entirely navigable» (Manovich). Finally, in the fourth and last moment, we undertake a deeper reflection on the unsettling history of the body, chiefly with the aim of reinforcing the idea that, despite its countless biological fragilities (a being that falls ill and dies), the body remains one of the «most persistent categories of all Western culture» (Ieda Tucherman), not only because it has withstood all the transformations historically imposed upon it, but also because it has patiently reinvented and readapted itself in the face of those historical transformations. This is a clear sign that its plasticity would grant it, especially from the twentieth century onwards («the century of the body»), a truly special theoretical and performative status. So special, indeed, that even a brief notion of its unsettling history is enough for us to grasp immediately the extraordinary importance of some of its most varied transformations, attractions, connections and exhibitions over recent decades, namely under the creative effect of digital technologies (within which some of the most interesting operations of cultural and artistic dynamisation of our time take place). In short, we sincerely hope that this research may contribute to widening the ever more uncertain, dynamic and interactive frontiers of our knowledge of what seems to constitute, today, the fundamental game of our contemporaneity.
Abstract:
The present study investigates peer-to-peer oral interaction in two task-based language teaching classrooms, one a self-declared cohesive group and the other a self-declared less cohesive group, both at B1 level. It studies how learners talk cohesion into being and considers how this talk leads to learning opportunities in these groups. The study was classroom-based and was carried out over the period of an academic year. Research was conducted in the classrooms and the tasks were part of regular class work. The research was framed within a sociocognitive perspective of second language learning, and data came from a number of sources, namely questionnaires, interviews and audio-recorded talk of dyads, triads and groups of four students completing a total of eight oral tasks. These audio recordings were transcribed and analysed qualitatively, using conversation analysis, for interactions which encouraged a positive social dimension and for behaviours which led to learning opportunities. In addition, the recordings were analysed quantitatively for learning opportunities and for the quantity and quality of language produced. Results show that learners in both classes exhibited multiple behaviours in interaction which could promote a positive social dimension, although behaviours which could discourage positive affect amongst group members were also found. Analysis of the interactions also revealed the many ways in which learners in both the cohesive and the less cohesive class created learning opportunities. Further qualitative analysis showed that a number of factors, including how learners approach a task, the decisions they make at zones of interactional transition and the affective relationship between participants, influence the number of learning opportunities created, as well as the quality and quantity of language produced. The main conclusion of the study is that it is not the cohesive nature of the group as a whole but the nature of the relationship between the individual members of the small group completing the task which influences the effectiveness of oral interaction for learning. This study contributes to our understanding of the way in which learners individualise the learning space and highlights the situated nature of language learning. It shows how individuals interact with each other and with the task, and how talk in interaction changes moment by moment as learners react to the ‘here and now’ of the classroom environment.