895 results for Epistemic justification


Relevance: 10.00%

Abstract:

In the literature on interorganizational networks, part of the research approaches claim that this type of environment favors the conditions for learning through the combination of the capabilities of different members. At the same time, other strands of study point to caveats related to the members' own characteristics. This apparent contradiction prompted a question: how does the configuration of member positions in interorganizational networks affect the meaning that learning acquires for the members themselves? Drawing on conceptual references from the field of learning in interorganizational networks, the study presents the results of the analysis of an empirical case of a Brazilian epistemic community (a formally instituted scientific cooperation network) in the field of biotechnology. The research method adopted was the triangulation of quantitative and qualitative techniques. Initially, quantitative sociometric analysis was used to establish each member's position according to centrality for learning and academic output. After this identification, semi-structured interviews were conducted with members located in different positions, whether high value-added (members with high centrality) or marginal (members with low centrality). Based on a qualitative analytical framework focused on the concepts of practices and discursive genres, it was possible to delineate different meanings of the learning that took place according to the members' positions. The results showed the relationship between the position occupied by a network member and the meaning that member gives to learning. Despite the limitations of single-case research, which hinders the generalization of the results, the study opens possibilities for further research on the relationship between members' characteristics and positions and learning in interorganizational networks.
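
The sociometric step described above lends itself to a compact illustration. The sketch below ranks the members of a toy collaboration network by degree centrality to separate central from marginal positions; the member names, ties and cutoff are invented for illustration and are not taken from the study.

```python
# Minimal sketch: degree centrality as a proxy for member position in an
# interorganizational network. Names, ties and the 0.5 cutoff are hypothetical.
import networkx as nx

ties = [("A", "B"), ("A", "C"), ("A", "D"), ("A", "E"), ("B", "C")]
G = nx.Graph(ties)

centrality = nx.degree_centrality(G)  # degree / (n - 1) for each member
for member, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    position = "central" if score >= 0.5 else "marginal"
    print(f"{member}: {score:.2f} ({position})")
```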

Relevance: 10.00%

Abstract:

This article examines a variety of options for expressing speaker and writer stance in a subcorpus of MarENG, a maritime English learning tool sponsored by the EU (35,041 words). Non-verbal markers related to key areas of modal expression are presented: (1) epistemic adverbs and adverbial expressions, (2) epistemic adjectives, (3) deontic adjectives, (4) evidential adverbs, (5) evidential adjectives, (6) evidential interpersonal markers, and (7) single adverbials conveying the speaker's attitudes, feelings or value judgments. The overall aim is to present an overview of how these non-verbal markers operate in this LSP genre.
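
As a rough illustration of how such non-verbal stance markers can be surveyed in a corpus, the sketch below counts occurrences from a small, hypothetical list of epistemic adverbs in a made-up maritime sentence; the actual MarENG subcorpus and the article's marker inventory are not reproduced here.

```python
# Minimal sketch: counting epistemic adverbs in a text sample. The marker list
# and the sentence are invented stand-ins, not the MarENG material.
EPISTEMIC_ADVERBS = {"probably", "perhaps", "certainly", "possibly", "maybe"}

text = "The cargo will probably shift; perhaps the lashings should be checked."
tokens = [w.strip(".,;:").lower() for w in text.split()]
hits = [w for w in tokens if w in EPISTEMIC_ADVERBS]
print(f"{len(hits)} epistemic adverbs in {len(tokens)} tokens: {hits}")
```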

Relevance: 10.00%

Abstract:

Since the first underground nuclear explosion, carried out in 1958, the analysis of the seismic signals generated by such sources has allowed seismologists to refine the travel times of seismic waves through the Earth and to verify the accuracy of location algorithms (the ground truth for these sources was often known). Long international negotiations have been devoted to limiting the proliferation and testing of nuclear weapons. In particular, the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature in 1996; although it has been signed by 178 States, it has not yet entered into force. The Treaty underlines the fundamental role of seismological observations in verifying compliance, by detecting and locating seismic events and identifying the nature of their sources. A precise determination of the hypocentral parameters is the first step in discriminating whether a given seismic event is natural or not. If a specific event is deemed suspicious by the majority of the States Parties, the Treaty contains provisions for conducting an on-site inspection (OSI) in the area surrounding the epicenter of the event, located through the International Monitoring System (IMS) of the CTBT Organization. An OSI is supposed to include the use of passive seismic techniques in the area of the suspected clandestine underground nuclear test; high-quality seismological systems are thought to be capable of detecting and locating the very weak aftershocks triggered by underground nuclear explosions in the first days or weeks following the test. This PhD thesis deals with the development of two different seismic location techniques. The first, known as the double-difference joint hypocenter determination (DDJHD) technique, is aimed at locating closely spaced events at the global scale; the locations obtained by this method have high relative accuracy, although the absolute location of the whole cluster remains uncertain. We eliminate this problem by introducing a priori information: the known location of a selected event. The second technique concerns reliable estimates of the back azimuth and apparent velocity of seismic waves from very low magnitude local events recorded by a tripartite array at a very local scale. For both techniques we used cross-correlation among digital waveforms in order to minimize the errors linked to incorrect phase picking. The cross-correlation method relies on the similarity between the waveforms of a pair of events at the same station, at the global scale, and on the similarity between the waveforms of the same event at two different sensors of the tripartite array, at the local scale. After preliminary, simulation-based tests of the reliability of our location techniques, we applied both methodologies to real seismic events. The DDJHD technique was applied to a seismic sequence that occurred in the Turkey-Iran border region, using data recorded by the IMS. Initially the algorithm was applied to the differences among the original arrival times of the P phases, without cross-correlation. The considerable geometrical spreading noticeable in the standard locations (namely those produced by the analysts of the International Data Center (IDC) of the CTBT Organization, taken as our reference) was substantially reduced by the application of our technique.
This is what we expected, since the methodology was applied to a sequence of events whose hypocenters, belonging to the same seismic structure, can be assumed to be genuinely close. Our results point out the main advantage of this methodology: the systematic errors affecting the arrival times are removed, or at least reduced. The introduction of cross-correlation did not bring evident improvements: the two sets of locations (with and without the cross-correlation technique) are very similar to each other. This suggests that cross-correlation did not substantially improve the precision of the manual picks; the picks reported by the IDC are probably good enough to make the random picking error less important than the systematic error on travel times. A further explanation for the limited benefit of cross-correlation is that the events in our data set generally do not have a good signal-to-noise ratio (SNR): the selected sequence is composed of weak events (magnitude 4 or smaller), and the signals are strongly attenuated because of the large distance between the stations and the hypocentral area. At the local scale, in addition to cross-correlation, we performed a signal interpolation to improve the time resolution. The algorithm thus developed was applied to data collected during an experiment carried out in Israel between 1998 and 1999. The results pointed out the following relevant conclusions: a) it is necessary to correlate waveform segments corresponding to the same seismic phases; b) it is not essential to select the exact first arrivals; and c) relevant information can also be obtained from the maximum-amplitude wavelet of the waveforms (particularly under poor SNR conditions). Another remarkable feature of our procedure is that it does not demand long processing times, so the user can check the results immediately. During a field survey, this makes a quasi-real-time check possible, allowing immediate optimization of the array geometry if suggested by the results at an early stage.
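
The pick-refinement idea shared by both techniques can be sketched compactly: cross-correlate two similar digital waveforms and read their relative delay off the peak of the correlation. The example below uses a synthetic wavelet and a known shift; it is a schematic of the principle, not of the thesis code or the IMS data.

```python
# Minimal sketch: estimating the relative delay between two similar waveforms
# via cross-correlation, as used to refine phase picks. Synthetic data only.
import numpy as np

fs = 100.0                        # sampling rate (Hz), assumed
t = np.arange(0, 5, 1 / fs)
wavelet = np.exp(-((t - 1.0) ** 2) / 0.01) * np.sin(2 * np.pi * 5 * t)
shifted = np.roll(wavelet, 37)    # the same waveform delayed by 37 samples

xcorr = np.correlate(shifted, wavelet, mode="full")
lags = np.arange(-len(wavelet) + 1, len(wavelet))
delay = lags[np.argmax(xcorr)] / fs
print(f"estimated delay: {delay:.2f} s")  # ~0.37 s
```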

Relevance: 10.00%

Abstract:

The aim of this research is to estimate the impact of violent film excerpts on university students (30 female, 30 male) shown in two different sequences, a "justified" violent scene followed by an "unjustified" one, or vice versa, as follows: 1) before and after each sequence, using the Aggressive Behaviour I-R Questionnaire, the Self-Depression Scale and the ASQ-IPAT Anxiety Scale; 2) after every excerpt, using a self-report to evaluate the intensity and hedonic tone of emotions and the level of justification of the violence. Emotion regulation processes (suppression, reappraisal, self-efficacy) were also considered. Compared with the "unjustified" violent scene, during the "justified" one the justification level was higher, and the intensity and unpleasantness of negative emotions were lower. Anxiety (total and latent) and rumination diminished after both types of sequence, though rumination decreased less after the JV-UV sequence than after the UV-JV sequence. Self-efficacy in controlling negative emotions reduced rumination, whereas suppression reduced irritability. Reappraisal, self-efficacy in expressing positive emotions and perceived empathic self-efficacy had no effects.
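
The before/after part of the design corresponds to a standard paired comparison. A minimal sketch with simulated scores (the study's questionnaire data are not reproduced, and the effect size is invented) could look like this:

```python
# Minimal sketch: paired t-test on rumination measured before and after a
# sequence of excerpts. Simulated data; not the study's measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
before = rng.normal(50, 10, 60)        # hypothetical scores for 60 students
after = before - rng.normal(4, 3, 60)  # scores tend to drop after viewing

t_stat, p_value = stats.ttest_rel(before, after)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
```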

Relevance: 10.00%

Abstract:

The miniaturization race in the hardware industry, aimed at a continuous increase of transistor density on a die, no longer brings corresponding application performance improvements. One of the most promising alternatives is to exploit the heterogeneous nature of common applications in hardware. Supported by reconfigurable computation, which has already proved its efficiency in accelerating data-intensive applications, this concept promises a breakthrough in contemporary technology development. Memory organization in such heterogeneous reconfigurable architectures becomes critical, and two primary aspects introduce a delicate trade-off. On the one hand, the memory subsystem should provide a well-organized distributed data structure and guarantee the required data bandwidth. On the other hand, it should hide the heterogeneous hardware structure from the end user, in order to support feasible high-level programmability of the system. This thesis explores heterogeneous reconfigurable hardware architectures and presents possible solutions to the problem of memory organization and data structure. Using the MORPHEUS heterogeneous platform as an example, the discussion follows the complete design cycle, from decision making and justification to hardware realization. Particular emphasis is placed on methods to support high system performance, meet application requirements, and provide a user-friendly programming interface. As a result, the research introduces a complete heterogeneous platform enhanced with a hierarchical memory organization, which accomplishes its task by separating computation from communication, supplying the reconfigurable engines with computation and configuration data, and unifying the heterogeneous computational devices by means of local storage buffers. It is distinguished from related solutions by its distributed data-flow organization, mechanisms specifically engineered to operate on data in local domains, a communication infrastructure based on a Network-on-Chip, and thorough methods to prevent computation and communication stalls. In addition, a novel technique to accelerate memory access was developed and implemented.
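
One common mechanism behind "separating computation from communication" with local storage buffers is double buffering. The sketch below is purely illustrative and sequential (in hardware the prefetch would overlap with the computation); it does not reflect the actual MORPHEUS implementation.

```python
# Minimal, sequential sketch of double buffering: while the engine consumes
# one local buffer, the next block is fetched into the other. Illustrative
# only; real hardware overlaps the fetch with the computation.
def stream(blocks, fetch, compute):
    buf = fetch(blocks[0])                                   # fill first buffer
    for i in range(len(blocks)):
        nxt = fetch(blocks[i + 1]) if i + 1 < len(blocks) else None
        compute(buf)                                         # consume current
        buf = nxt                                            # swap buffers

stream([0, 1, 2, 3], fetch=lambda b: f"block-{b}", compute=print)
```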

Relevance: 10.00%

Abstract:

The title of my thesis is The Role of Ideas and Their Change in Higher Education Policy-Making Processes from the Eighties to the Present Day: the Cases of England and New Zealand in Comparative Perspective. From a theoretical point of view, the aim of my work is to carry out research modelled on constructivist theory. It focuses on the analysis of the impact of ideas on policy-making processes through epistemic communities, think tanks and the various socioeconomic contexts that may have played a key role in the construction of the different paths. From my point of view, ideas constitute a priority research field worth analysing, since their role in policy-making processes has traditionally been rather unexplored. In this context, and with the aim of developing a research strand based on the role of ideas, I intend to carry out my study from the perspective of change. Depending on the data and information collected, I evaluated the weight of each of these variables, and possibly of others, such as institutions and individual interests, which may have influenced the formation of the policy-making processes. In this light, I planned to adopt a qualitative research methodology, which I believe to be more effective than the more difficult and possibly reductive application of quantitative data sets. I reckon, therefore, that the most appropriate tools for processing the information include content analysis and in-depth interviews with figures of the political panorama (elite or not) who participated in the process of higher education reform from the eighties to the present day. The two cases taken into consideration certainly set an example of radical reform processes that occurred in quite different contexts, determined by socioeconomic characteristics and by the traits of the elites. In New Zealand the process took place at a steady pace and with a good degree of consequentiality, in line with the reforms in other state sectors driven by the ideas of New Public Management. By contrast, in England the reforming action of Margaret Thatcher acquired a very radical connotation, as it brought into higher education policy concepts such as efficiency, excellence and rationalization, which contrasted with the generalist, mass-oriented ideas fashionable during the seventies. The mission I intend to accomplish throughout my research is to investigate and analyse in greater depth the differences that seem to emerge from two contexts that most of the literature regards as a single model: the Anglo-Saxon model. In this light, a dense analysis of policy processes brings out both the controversial and contrasting aspects of the two realities compared, and the role and weight of variables such as ideas (the main variable), institutional settings and individual interests acting in each context. The cases I mean to examine present peculiar aspects worth an in-depth analysis, an outline of which is provided in this abstract.
England. From 1981 the Conservative government introduced radical changes in the higher education sector, first cutting state funding and then creating an institution for the planning and leadership of the polytechnics (the non-university sector). Afterwards, Margaret Thatcher's 1988 school reform caused a great stir all over Europe, owing both to its considerable innovative imprint and to its strong attack on the pedagogy of "active" schooling and progressive education, until then recognized as a merit of the British public school. In the field of university education this reform, together with similar measures introduced in 1992, put Conservative principles into practice through a series of actions that included the abolition of the irremovability principle for university teachers, the introduction of student loans for low-income students, and the removal of the clear distinction between universities and polytechnics. The policies of Mr Blair's Labour majority did not diverge much from the Conservative position. In 2003 Blair's cabinet risked defeat precisely over an important university reform proposal, which foresaw the autonomy for universities to raise student enrolment fees up to 3,000 pounds (the previous ceiling being 1,125 pounds). Blair had to face opposition within his own party over a measure that, according to the 150 MPs promoting an adverse motion, had not been included in the electoral programme and risked creating income-based discrimination among students. In fact, the bill centred on the introduction of very low-interest student loans to be repaid only once the student had found remunerated employment (a system already provided for by Australian legislation). New Zealand. Contrary to many other countries, New Zealand has adopted a very wide vision of tertiary education: it includes the full educational programme internationally recognized as post-secondary education. If one peculiarity of New Zealand tertiary education policy were to be spotlighted, it would be "change". Looking at the reform history of the tertiary education system, four sub-periods can be clearly identified from the eighties to the present day: 1. before the 1980s, an elitist system characterized by low participation rates; 2. between the mid and late 1980s, a trend towards wider participation associated with greater competition; 3. 1990-1999, a further step towards a competitive, market-oriented model; 4. from 2000 to today, a continuous evolution towards a more competitive, market-oriented model, together with growing attention to state control for the social and economic development of the nation. At present the government of New Zealand is working to strengthen this process, primarily in relation to the role of tertiary education as a steady factor of national welfare, in which professional development contributes actively to the growth of the national economic system. The cases of England and New Zealand are the focus of an in-depth investigation that starts from an analysis of the policies of each nation and develops into a comparative study. At this point I attempt to draw some preliminary impressions from the facts described above. University policies in England and New Zealand have both undergone a significant reform process since the early eighties; in both contexts the ideas that had formed the basis of policy until 1980 were quite relevant.
Generally speaking, in both cases the pre-reform policies were inspired by egalitarianism and the expansion of the student population, while those brought in by the reforms pursued efficiency, quality and competitiveness. Within this general tendency, which reflects the hypothesis proposed, the two university systems nevertheless present several differences. The university system in New Zealand proceeded steadily towards the implementation of a managerial conception of tertiary education, especially from 1996 onwards, in accordance with the reform process of the whole public sector. In the United Kingdom, as in the rest of Europe, the new approach to university policy-making had to confront a deep-rooted tradition of progressive education and an idea of educational expansion that in fact dominated until the eighties. From this viewpoint, the governing action of Margaret Thatcher gave rise to a radical change that revolutionized the objectives and key values of the whole educational system, particularly in the higher education sector: ideas such as efficiency, excellence and performance control became decisive. Blair's Labour cabinets developed in the wake of the Conservative reforms. This appears to be a focal point of this study, which observes how in New Zealand, too, the reform process occurred transversely, across progressive and conservative administrations. The preliminary impression is therefore that ideas deeply mark reform processes: the aim of my research is to verify to what extent this statement is true. In order to build a comprehensive analysis, further significant factors will have to be investigated: how ideas are perceived and implemented by the different political elites; how the various socioeconomic contexts influence the reform process; how institutional structures condition policy-making processes; and whether individual interests play a role and, if so, to what extent.

Relevance: 10.00%

Abstract:

The contemporary media landscape is characterized by the emergence of hybrid forms of digital communication that contribute to the ongoing redefinition of our societies' cultural context. An incontrovertible consequence of this phenomenon is the new public dimension that characterizes the transmission of historical knowledge in the twenty-first century. Awareness of this new epistemic scenario has led us to reflect on the following methodological questions: what strategies should be devised to establish a communication system, based on new technology, that is scientifically rigorous but at the same time engaging for museum visitors and Internet users? How can a comparative analysis of ancient documentary sources form a solid base of information for the virtual reconstruction of thirteenth-century Bologna in the metaverse? What benefits can the phenomenon of cross-mediality bring to virtual heritage? The implementation of a new version of the Nu.M.E. project made it possible to answer many of these questions. The investigation carried out between 2008 and 2010 has shown that real-time 3D graphics and collaborative virtual environments can indeed be feasible tools for representing the urban medieval landscape philologically and for communicating properly validated historical data to the general public. This research focuses on the study and implementation of a pipeline for the mass communication of historical information about an area of vital importance in late medieval Bologna: Piazza di Porta Ravegnana. The originality of the project is not limited to the methodological dimension of historical research: the adopted technological perspective is an excellent example of the innovation that digital technologies can bring to cultural heritage. The main result of this research is the creation of Nu.M.E. 2010, a cross-media system of real-time 3D visualization based on some of the most advanced free and open-source technologies available today.

Relevance: 10.00%

Abstract:

The objective of this thesis is to investigate the temporal dynamics (i.e., time courses) of the Simon effect, from both a theoretical and an experimental point of view, for a better understanding of a) whether one or more processes are responsible for the Simon effect, and b) how these mechanisms differently influence performance. In the first, theoretical part (the "Theoretical Overview"), I examine in detail the rationale for analyzing the temporal dynamics of the Simon effect and the assumptions that underlie the interpretation of the results obtained in the existing literature. In the second part ("Experimental Investigations"), I experimentally investigate several issues that the existing literature has left unsolved, in order to gather further evidence for or against the mainstream models currently used to account for the different Simon effect time courses. Some points about the experiments are worth mentioning. First, all the experiments were conducted in the laboratory, presenting participants with stimuli on a PC screen and recording their responses; both stimulus presentation and response collection were controlled by the E-Prime software, and the dependent variables of interest were always behavioral measures of performance, such as speed and accuracy. Second, most of the experiments were conducted at the Communication Sciences Department (University of Bologna), under Prof. Nicoletti's supervision; the remainder were conducted at the Psychological Sciences Department of Purdue University (West Lafayette, Indiana, USA), where I collaborated for one year as a visiting student with Prof. Proctor and his team. Third, the experimental pool was entirely composed of healthy young students, since the cognitive functioning of elderly people was not the target of my research.
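
The time-course analyses referred to above are conventionally summarized as delta plots: the effect (incongruent minus congruent RT) computed at successive quantiles of the RT distributions. A minimal sketch with simulated reaction times (means and spreads are invented, not the thesis data):

```python
# Minimal sketch of a delta-plot computation: the Simon effect at successive
# RT quantiles. Simulated RTs; in real data the effect often shrinks (or even
# reverses) at the slowest quantiles, which is what the time course diagnoses.
import numpy as np

rng = np.random.default_rng(0)
rt_congruent = rng.normal(450, 80, 1000)    # hypothetical congruent RTs (ms)
rt_incongruent = rng.normal(480, 80, 1000)  # hypothetical incongruent RTs (ms)

for q in (0.2, 0.4, 0.6, 0.8):
    effect = np.quantile(rt_incongruent, q) - np.quantile(rt_congruent, q)
    print(f"quantile {q:.1f}: Simon effect = {effect:5.1f} ms")
```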

Relevance: 10.00%

Abstract:

The specific way contents are acquired through digital interfaces condemns the epistemic agent to a fragmented interaction, insufficient from a computational, mnemonic and temporal point of view with respect to the mass of information accessible today through any implementation of the human-computer relation. This invalidates the applicability of the standard model of knowledge as justified true belief and undermines the concept of rationally grounded belief, since forming such a belief would require the agent to have conceptual, computational and temporal resources that are in fact inaccessible. The consequence is that the agent, constrained by the ontological limitations typical of interaction with cultural interfaces, is forced to fall back on processes of information selection and management that are ambiguous, arbitrary and often more random than the agent believes. These give rise to genuine epistemological hybrids (in Latour's sense), made of sensations and program outputs, ungrounded beliefs, bits of indirect testimony, and a whole series of human-digital relations that invite an escape into a transcendent dimension, which finds in the sacred its most immediate sphere of actuation. Given these premises, the present work sets out to construct a new epistemological paradigm of propositional knowledge obtainable through a digital content-acquisition interface, founded on the new concept of Digital Tracing, defined as a process of digital acquisition of a set of traces, that is, meta-information of a testimonial nature. This device, once recognized as a process of content communication, is based on the search for and selection of meta-information, i.e., traces, which allow the implementation of approaches derived from decision analysis under bounded rationality. These approaches, besides being almost never used in this field, are ontologically suited to handling the kind of uncertainty found in the instantiation of the informational hybrid and, under certain conditions, can reassure the agent about the epistemic quality of the acquired content.

Relevance: 10.00%

Abstract:

Synthetic Biology is a relatively new discipline, born at the beginning of the new millennium, that brings the typical engineering approach (abstraction, modularity and standardization) to biotechnology. These principles aim to tame the extreme complexity of the various components and aid the construction of artificial biological systems with specific functions, usually by means of synthetic genetic circuits implemented in bacteria or simple eukaryotes like yeast. The cell becomes a programmable machine whose low-level programming language is made of strings of DNA. This work was performed in collaboration with researchers of the Department of Electrical Engineering of the University of Washington in Seattle and with a student of the Corso di Laurea Magistrale in Ingegneria Biomedica at the University of Bologna, Marilisa Cortesi. During the collaboration I contributed to a Synthetic Biology project already under way in the Klavins Laboratory. In particular, I modeled and subsequently simulated a synthetic genetic circuit conceived to implement a multicelled behavior in a growing bacterial microcolony. The first chapter introduces the foundations of molecular biology: the structure of the nucleic acids, transcription, translation and the methods used to regulate gene expression. An introduction to Synthetic Biology completes the section. The second chapter describes the synthetic genetic circuit conceived to make two different groups of cells, termed leaders and followers, emerge spontaneously from an isogenic microcolony of bacteria. The circuit exploits the intrinsic stochasticity of gene expression and intercellular communication via small molecules to break the symmetry in the phenotype of the microcolony. The four modules of the circuit (coin flipper, sender, receiver and follower) and their interactions are then illustrated. The third chapter derives the mathematical representation of the various components of the circuit and makes the several simplifying assumptions explicit. Transcription and translation are modeled as a single step, and gene expression is a function of the intracellular concentration of the various transcription factors that act on the different promoters of the circuit. A list of the various parameters and a justification of their values closes the chapter. The fourth chapter describes the main characteristics of the gro simulation environment, developed by the Self Organizing Systems Laboratory of the University of Washington, and then details a sensitivity analysis performed to pinpoint the desirable characteristics of the various genetic components. The sensitivity analysis makes use of a cost function based on the fraction of cells in each of the possible states at the end of the simulation and on the wanted outcome. Thanks to a particular kind of scatter plot, the parameters are ranked: starting from an initial condition in which all the parameters assume their nominal value, the ranking suggests which parameter to tune in order to reach the goal. Obtaining a microcolony in which almost all the cells are in the follower state and only a few in the leader state turns out to be the most difficult task, because a small number of leader cells struggle to produce enough signal to turn the rest of the microcolony to the follower state. It is nevertheless possible to obtain a microcolony in which the majority of cells are followers by increasing the production of signal as much as possible.
Reaching the goal of a microcolony split in half between leaders and followers is comparatively easy; the best strategy seems to be a slight increase in the production of the enzyme. To end up with a majority of leaders, instead, it is advisable to increase the basal expression of the coin flipper module. At the end of the chapter, a possible future application of the leader election circuit, the spontaneous formation of spatial patterns in a microcolony, is modeled with the finite-state-machine formalism. The gro simulations provide insights into the genetic components needed to implement this behavior. In particular, since both examples of pattern formation rely on a local version of leader election, a short-range communication system is essential. Moreover, new synthetic components that allow the growth rate to be reliably downregulated in specific cells without side effects need to be developed. The appendix lists the gro code used to simulate the model of the circuit, a script in the Python programming language used to distribute the simulations on a Linux cluster, and the Matlab code developed to analyze the data.
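
The cost function that drives the sensitivity analysis is described only qualitatively above. A minimal sketch of one plausible form (the state names, target fractions and squared-error distance are assumptions, not taken from the thesis):

```python
# Minimal sketch: a cost comparing end-of-simulation state fractions with the
# wanted outcome. States and target values are hypothetical placeholders.
def cost(fractions, target):
    """Sum of squared deviations between observed and desired fractions."""
    return sum((fractions[s] - target[s]) ** 2 for s in target)

end_of_run = {"leader": 0.32, "follower": 0.61, "undecided": 0.07}
wanted = {"leader": 0.05, "follower": 0.95, "undecided": 0.00}
print(f"cost = {cost(end_of_run, wanted):.3f}")  # lower is closer to the goal
```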

Relevance: 10.00%

Abstract:

This work is a reflection on the development of the notion of definition in the recent debate on analyticity. The revival of this discussion, after Quine's criticisms and the consequent initial abandonment of Carnap's conventionalist conception, results in a new epistemic conception of analyticity. In most cases the new epistemic theories, among them those of Bob Hale and Crispin Wright (Implicit Definition and the A Priori, 2001) and Paul Boghossian (Analyticity, 1997; Epistemic Analyticity, a Defence, 2002; Blind Reasoning, 2003; Is Meaning Normative?, 2005), share the characteristic of understanding a priori knowledge in the form of an implicit definition (Paul Horwich, Stipulation, Meaning, and Apriority, 2001). But a second line of objections, raised first by Horwich and later by Hale and Wright themselves, highlights two difficulties for definition, corresponding to the issues of epistemic arrogance and of the acceptance (or stipulation) of an implicit definition. From this starting point several attempted answers arise: on the one hand, a conception of definition, in Hale and Wright's theory, on which it appears as an abstraction principle; on the other, a notion of definition as implicit definition that goes back to Boghossian's conception. In the latter, the implicit definition is given in the form of a linguistic conditional (EA, 2002; BR, 2003), obtained through a factorization of the theory built on the Carnapian model for the theoretical terms of empirical theories. A careful analysis of Rudolf Carnap's work (Philosophical Foundations of Physics, 1966) shows that the factorization strategy represents a possible route to a notion of analyticity adequate for theoretical terms. Carnap's strategy belongs, in fact, to an attempt to elaborate a notion of analyticity that takes into account the inductive aspects of empirical theories.

Relevance: 10.00%

Abstract:

For the detection of hidden objects by low-frequency electromagnetic imaging, the Linear Sampling Method works remarkably well, despite the fact that its rigorous mathematical justification is still incomplete. In this work, we give an explanation for this good performance by showing that in the low-frequency limit the measurement operator fulfills the assumptions for the fully justified variant of the Linear Sampling Method, the so-called Factorization Method. We also show how the method has to be modified in the physically relevant case of electromagnetic imaging with divergence-free currents. We present numerical results to illustrate our findings, and to show that similar performance can be expected for the case of conducting objects and layered backgrounds.
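
For reference, the range criterion that separates the fully justified Factorization Method from Linear Sampling can be stated compactly. The notation below is the standard textbook form (F the measurement operator, φ_z the test function for a sampling point z, (σ_n, ψ_n) a singular system of F), not a quotation from the paper:

```latex
% Standard Factorization Method criterion (textbook form): z lies in the
% scatterer D iff the Picard series converges.
\[
  z \in D
  \;\Longleftrightarrow\;
  \phi_z \in \mathcal{R}\bigl((F^{*}F)^{1/4}\bigr)
  \;\Longleftrightarrow\;
  \sum_{n} \frac{\lvert \langle \phi_z, \psi_n \rangle \rvert^{2}}{\sigma_n} < \infty .
\]
```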

Relevance: 10.00%

Abstract:

We consider the heat flux through a domain with subregions in which the thermal capacity approaches zero. In these subregions the parabolic heat equation degenerates to an elliptic one. We show the well-posedness of such parabolic-elliptic differential equations for general non-negative L-infinity-capacities and study the continuity of the solutions with respect to the capacity, thus giving a rigorous justification for modeling a small thermal capacity by setting it to zero. We also characterize weak directional derivatives of the temperature with respect to capacity as solutions of related parabolic-elliptic problems.
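
One standard way to write the degenerate problem described above (the notation is assumed, not necessarily the paper's): with a non-negative capacity c in L-infinity, the equation is parabolic where c > 0 and elliptic where c = 0,

```latex
% Parabolic-elliptic heat equation with non-negative capacity c (schematic
% form; notation assumed, not quoted from the paper).
\[
  \partial_t \bigl( c(x)\, u(x,t) \bigr)
  - \nabla \cdot \bigl( \kappa(x)\, \nabla u(x,t) \bigr) = f(x,t),
  \qquad c \ge 0,\; c \in L^{\infty},
\]
% On the set where c = 0 the time derivative drops out and the equation
% reduces to the elliptic constraint  -\nabla\cdot(\kappa\nabla u) = f.
```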

Relevance: 10.00%

Abstract:

We consider a simple (but fully three-dimensional) mathematical model for the electromagnetic exploration of buried, perfect electrically conducting objects within the soil underground. Moving an electric device parallel to the ground at constant height in order to generate a magnetic field, we measure the induced magnetic field within the device, and factor the underlying mathematics into a product of three operations which correspond to the primary excitation, some kind of reflection on the surface of the buried object(s) and the corresponding secondary excitation, respectively. Using this factorization we are able to give a justification of the so-called sampling method from inverse scattering theory for this particular set-up.
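
The three-operation factorization described above is usually written as an operator product. In generic notation (mine, not the paper's), with L the primary excitation and T the "reflection" on the buried object:

```latex
% Schematic factorization behind the sampling method (generic notation):
% the measurement operator splits into the primary excitation, the
% interaction with the object, and the secondary excitation (the adjoint).
\[
  M \;=\; L^{*}\, T\, L .
\]
```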

Relevance: 10.00%

Abstract:

The year 1989 marks not only the beginning of decisive geopolitical changes, but also the origin of a significant shift in international development cooperation. With the widely noted study 'Sub-Saharan Africa – From Crisis to Sustainable Growth', the World Bank initiated a debate on the relevance of institutional factors for economic development which, under the heading 'Good Governance', gained considerable importance in the following years. Nearly all central actors began to take such aspects into account in their practical work and developed their own concepts on the subject. Although the approaches share a basic focus on institutions as determinants of development, they differ considerably with respect to the inclusion of political factors, so that one cannot speak of a uniform understanding of 'Good Governance'. While most bilateral actors, as well as the DAC and UNDP, explicitly regard democracy and human rights as central components, the World Bank identifies a core of Good Governance that can be realized independently of the form of government, that is, in democracies as well as in autocracies. The implications of this claim are far-reaching. To begin with, only this view allows the Bank to address such aspects at all, since its statutes prohibit it from taking political factors into account. More significant, however, is that the claimed separability of Good Governance from the form of political rule opens up the possibility of achieving development without establishing a democratic order, since autocratic systems would consequently be capable, in the same way as democracies, of realizing the institutional preconditions identified as central determinants of economic progress. This removes not only an important justification for democratic rule as such; by reference to certain development-inhibiting characteristics attributed to democracy, autocracies may now even be understood as the superior form of government, since they are not marked by those characteristics. The World Bank's conclusions thus also support the idea of the developmental dictatorship, advanced above all in connection with the success story of the East Asian tiger states and experiencing a renaissance today with the rise of the People's Republic of China. On this view, the economic success of these states is attributable to the superior capacity for action of autocratic systems, whereas democracies, because of the accountability relations between the governing and the governed, are unable to take and enforce the necessary decisions. The World Bank's view has, however, been called into question by various authors, who recognize a connection with the form of political rule even for a Good Governance concept essentially restricted to technical elements. It has been argued, for example, that the Bank's concept explicitly does not operate in a system-neutral vacuum but, at least implicitly, promotes the establishment of democratic forms of government.
Moreover, the idea of the developmental dictatorship, newly derived from the World Bank's assumptions, stands in considerable contradiction both to the promotion of democratic rule as a means to economic development increasingly pursued by multilateral and bilateral actors, and to the continuing spread of democracy. If the form of government does influence the realization of Good Governance as a central determinant of development, and if it can moreover be assumed that democracies have advantages in this respect, then a developmental dictatorship is not a conceivable option; on the contrary, democratic rule is the appropriate path to economic growth and to an improvement of living conditions. Because of the important implications of the World Bank's conclusions, and because the subject has so far largely lacked detailed treatment in the literature, a detailed theoretical examination of the connections between the central elements of Good Governance and democratic rule is necessary. In addition, the relationships in question are also to be subjected to an empirical analysis. The subject of this thesis is therefore the question of whether Good Governance constitutes a development strategy that is theoretically and empirically independent of democratic rule.