14 results for Epistemic justification
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Since the first underground nuclear explosion, carried out in 1958, the analysis of seismic signals generated by these sources has allowed seismologists to refine the travel times of seismic waves through the Earth and to verify the accuracy of location algorithms (the ground truth for these sources was often known). Long international negotiations have been devoted to limiting the proliferation and testing of nuclear weapons. In particular, the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature in 1996; although it has been signed by 178 States, it has not yet entered into force. The Treaty underlines the fundamental role of seismological observations in verifying compliance, by detecting and locating seismic events and identifying the nature of their sources. A precise determination of the hypocentral parameters is the first step in discriminating whether a given seismic event is natural or not. If a specific event is deemed suspicious by the majority of the States Parties, the Treaty contains provisions for conducting an on-site inspection (OSI) in the area surrounding the epicenter of the event, located through the International Monitoring System (IMS) of the CTBT Organization. An OSI is expected to include the use of passive seismic techniques in the area of the suspected clandestine underground nuclear test. High-quality seismological systems are, in fact, thought to be capable of detecting and locating very weak aftershocks triggered by underground nuclear explosions in the first days or weeks following the test. This PhD thesis deals with the development of two different seismic location techniques. The first, known as the double-difference joint hypocenter determination (DDJHD) technique, is aimed at locating closely spaced events at a global scale. The locations obtained by this method have high relative accuracy, although the absolute location of the whole cluster remains uncertain.
We eliminate this problem by introducing a priori information: the known location of a selected event. The second technique concerns reliable estimates of the back azimuth and apparent velocity of seismic waves from local events of very low magnitude, recorded by a tripartite array at a very local scale. For both techniques, we have used cross-correlation among digital waveforms in order to minimize the errors linked to incorrect phase picking. The cross-correlation method relies on the similarity between the waveforms of a pair of events at the same station, at the global scale, and on the similarity between the waveforms of the same event at two different sensors of the tripartite array, at the local scale. After preliminary simulation-based tests of the reliability of our location techniques, we applied both methodologies to real seismic events. The DDJHD technique was applied to a seismic sequence that occurred in the Turkey-Iran border region, using the data recorded by the IMS. Initially, the algorithm was applied to the differences among the original arrival times of the P phases, without cross-correlation. We found that the considerable geometrical spreading noticeable in the standard locations (namely the locations produced by the analysts of the International Data Center (IDC) of the CTBT Organization, taken as our reference) was considerably reduced by the application of our technique. This is what we expected, since the methodology was applied to a sequence of events for which we can assume real closeness among the hypocenters, as they belong to the same seismic structure. Our results point out the main advantage of this methodology: the systematic errors affecting the arrival times were removed or at least reduced.
The introduction of cross-correlation did not bring evident improvements to our results: the two sets of locations (without and with the application of the cross-correlation technique) are very similar to each other. This suggests that the use of cross-correlation did not substantially improve the precision of the manual pickings. Probably the pickings reported by the IDC are good enough to make the random picking error less important than the systematic error in travel times. As a further explanation for the limited contribution of the cross-correlation, it should be noted that the events in our data set generally do not have a good signal-to-noise ratio (SNR): the selected sequence is composed of weak events (magnitude 4 or smaller), and the signals are strongly attenuated because of the large distance between the stations and the hypocentral area. At the local scale, in addition to the cross-correlation, we performed a signal interpolation in order to improve the time resolution. The resulting algorithm was applied to the data collected during an experiment carried out in Israel between 1998 and 1999. The results pointed out the following relevant conclusions: a) it is necessary to correlate waveform segments corresponding to the same seismic phases; b) it is not essential to select the exact first arrivals; and c) relevant information can also be obtained from the maximum-amplitude wavelet of the waveforms (particularly in bad SNR conditions). Another remarkable feature of our procedure is that it does not require long processing times, so the user can immediately check the results. During a field survey, this feature will make possible a quasi-real-time check, allowing the immediate optimization of the array geometry if suggested by the results at an early stage.
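The core of the waveform cross-correlation used at both scales can be sketched in a few lines: the lag at which the cross-correlation of two similar waveforms peaks gives their relative arrival-time shift. The following is a minimal illustration on synthetic signals (NumPy assumed; names and values are ours, not the thesis code), and it omits the sub-sample interpolation the thesis applies at the local scale.

```python
import numpy as np

def cc_lag(x, y, dt):
    """Relative time shift of x with respect to y, estimated as the
    lag that maximizes their full cross-correlation. Returns seconds."""
    cc = np.correlate(x, y, mode="full")
    # in 'full' mode, zero lag sits at index len(y) - 1
    k = int(np.argmax(cc))
    return (k - (len(y) - 1)) * dt

# synthetic test: a single wavelet, and a copy delayed by 5 samples
dt = 0.01                        # 100 Hz sampling
x = np.zeros(200)
x[50:60] = np.hanning(10)        # simple wavelet
y = np.roll(x, 5)                # delayed copy
print(cc_lag(y, x, dt))          # ≈ 0.05 s
```

In practice the same peak search would be run on windows around the picked phase of two events at one station (global scale) or of one event at two array sensors (local scale); interpolating the correlation function near its peak then refines the lag below one sample.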
Abstract:
The aim of this research is to estimate the impact of violent film excerpts on university students (30 female, 30 male) shown in two different sequences, a "justified" violent scene followed by an "unjustified" one, or vice versa, assessed as follows: 1) before and after the sequences, using the Aggressive Behaviour I-R Questionnaire, the Self Depression Scale and the ASQ-IPAT Anxiety Scale; 2) after every excerpt, using a self-report to evaluate the intensity and hedonic tone of emotions and the level of justification of the violence. Emotion regulation processes (suppression, reappraisal, self-efficacy) were considered. In contrast with the "unjustified" violent scene, during the "justified" one the justification level was higher, and the intensity and unpleasantness of negative emotions were lower. Anxiety (total and latent) and rumination diminished after both types of sequence. Rumination decreased less after the JV-UV sequence than after the UV-JV sequence. Self-efficacy in controlling negative emotions reduced rumination, whereas suppression reduced irritability. Reappraisal, self-efficacy in positive emotion expression and perceived empathic self-efficacy did not have any effect.
Abstract:
The miniaturization race in the hardware industry, aimed at a continuous increase of transistor density on a die, no longer brings corresponding application performance improvements. One of the most promising alternatives is to exploit the heterogeneous nature of common applications in hardware. Supported by reconfigurable computation, which has already proved its efficiency in accelerating data-intensive applications, this concept promises a breakthrough in contemporary technology development. Memory organization in such heterogeneous reconfigurable architectures becomes very critical. Two primary aspects introduce a sophisticated trade-off. On the one hand, a memory subsystem should provide a well-organized distributed data structure and guarantee the required data bandwidth. On the other hand, it should hide the heterogeneous hardware structure from the end user, in order to support feasible high-level programmability of the system. This thesis explores heterogeneous reconfigurable hardware architectures and presents possible solutions to the problem of memory organization and data structure. Using the example of the MORPHEUS heterogeneous platform, the discussion follows the complete design cycle, from decision making and justification to hardware realization. Particular emphasis is placed on the methods used to support high system performance, meet application requirements, and provide a user-friendly programmer interface. As a result, the research introduces a complete heterogeneous platform enhanced with a hierarchical memory organization, which accomplishes its task by separating computation from communication, providing the reconfigurable engines with computation and configuration data, and unifying the heterogeneous computational devices through local storage buffers.
It is distinguished from related solutions by its distributed data-flow organization, mechanisms specifically engineered to operate on data in local domains, a communication infrastructure based on a Network-on-Chip, and thorough methods to prevent computation and communication stalls. In addition, a novel technique to accelerate memory access was developed and implemented.
Abstract:
The title of my thesis is "The role of ideas and their change in higher education policy-making processes from the eighties to the present day: the cases of England and New Zealand in comparative perspective". From a theoretical point of view, the aim of my work is to carry out research modelled on constructivist theory. It focuses on analysing the impact of ideas on policy-making processes through epistemic communities, think tanks and the various socioeconomic contexts that may have played a key role in the construction of the different paths. From my point of view, ideas constitute a priority research field worth analysing, since their role in policy-making processes has traditionally remained rather unexplored. In this context, and with the aim of developing a research strand based on the role of ideas, I intend to carry out my study from the perspective of change. Depending on the data and information collected, I evaluated the weight of each of these variables, and possibly of others, such as institutions and individual interests, which may have influenced the formation of the policy-making processes. In this light, I planned to adopt a qualitative research methodology, which I believe to be very effective compared with the more difficult and possibly reductive application of quantitative data sets. I therefore consider that the most appropriate tools for information processing include content analysis and in-depth interviews with figures from the political panorama (élite or otherwise) who participated in the process of higher education reform from the eighties to the present day. The two cases taken into consideration certainly set an example of radical reform processes that occurred in quite different contexts, determined by their socioeconomic characteristics and the traits of their élites.
In New Zealand the described process took place at a steady pace and with a good degree of consequentiality, in line with the reforms in other state divisions driven by the ideas of the New Public Management. By contrast, in England the reforming action of Margaret Thatcher acquired a very radical connotation, as it brought into the ambit of higher education policy concepts such as efficiency, excellence and rationalization that contrasted with the generalistic, mass-oriented ideas fashionable during the seventies. The mission I intend to accomplish throughout my research is to investigate and analyse in greater depth the differences that seem to emerge from two contexts which most of the literature regards as a single model: the Anglo-Saxon model. In this light, the close analysis of policy processes brought out both the controversial and contrasting aspects of the two realities compared, and the role and weight of variables such as ideas (the main variable), institutional settings and individual interests acting in each context. The cases I mean to examine present peculiar aspects worth an in-depth analysis, an outline of which is provided in this abstract. England: from 1981 the Conservative government introduced radical changes in the higher education sector, first cutting down on state funding and then creating an institution for the planning and leadership of the polytechnics (the non-university sector). Afterwards, the school reform by Margaret Thatcher in 1988 caused a great stir all over Europe, due both to its considerable innovative imprint and to its strong attack on the pedagogy of "active" schooling and progressive education, until then recognized as a merit of the British public school.
In the ambit of university education this reform, together with similar measures brought in during 1992, put the Conservative principles into practice through a series of actions that included: the suppression of the irremovability principle for university teachers; the introduction of student loans for low-income students; and the cancellation of the clear distinction between universities and polytechnics. The policies of Mr Blair's Labour majority did not diverge much from the Conservatives' position. In 2003 Blair's cabinet risked becoming a minority precisely on the occasion of an important university reform proposal. This proposal foresaw the autonomy for universities to raise student enrolment fees up to 3,000 pounds (whereas formerly the ceiling was 1,125 pounds). Blair had to face internal opposition within his own party over a measure that, according to the 150 MPs promoting an adverse motion, had not been included in the electoral programme and risked creating income-based discrimination among students. As a matter of fact, the bill focused on the introduction of very low-interest student loans to be settled only once the student had found remunerated employment (a system already provided for by Australian legislation). New Zealand: contrary to many other countries, New Zealand has adopted a very wide vision of tertiary education. It in fact includes the full educational programme that is internationally recognized as post-secondary education. Should we spotlight one peculiarity of New Zealand tertiary education policy, it would be "change". Looking at the reform history of the tertiary education system, we can clearly identify four "sub-periods" from the eighties to the present day: 1. before the 80s: an elitist system characterized by low participation rates; 2. between the mid and late 80s: a trend towards the enlargement of participation associated with greater competition; 3. 1990-1999: a further step towards a competitive, market-oriented model; 4. from 2000 to today: a continuous evolution towards a more competitive, market-oriented model, together with growing attention to state control over the social and economic development of the nation. At present the government of New Zealand is working to strengthen this process, primarily in relation to the role of tertiary education as a steady factor of national welfare, in which professional development contributes actively to the growth of the national economic system. The cases of England and New Zealand are the focus of an in-depth investigation that starts from an analysis of the policies of each nation and develops into a comparative study. At this point I attempt to draw some preliminary impressions from the facts described above. The university policies of England and New Zealand have both undergone a significant reform process since the early eighties; in both contexts the importance of the ideas that constituted the basis of policy until 1980 was quite relevant. Generally speaking, in both cases the pre-reform policies were inspired by egalitarianism and the expansion of the student population, while those brought in by the reforms pursued efficiency, quality and competitiveness. Undoubtedly, within this general tendency, which reflects the proposed hypothesis, the two university systems present several differences. The university system of New Zealand proceeded steadily towards the implementation of a managerial conception of tertiary education, especially from 1996 onwards, in accordance with the reform process of the whole public sector. In the United Kingdom, as in the rest of Europe, the new approach to university policy-making had to confront a deep-rooted tradition of progressive education and the idea of educational expansion that in fact dominated until the eighties.
From this viewpoint, the governing action of Margaret Thatcher gave rise to a radical change that revolutionized the objectives and key values of the whole educational system, particularly in the higher education sector. Ideas such as efficiency, excellence and performance control became decisive. The Labour cabinets of Blair developed in the wake of the Conservative reforms. This appears to be a focal point of this study, which observes how in New Zealand too the reform process occurred transversely across progressive and conservative administrations. The preliminary impression is therefore that ideas deeply mark reform processes: the aim of my research is to verify to what extent this statement is true. In order to build a comprehensive analysis, further significant factors will have to be investigated: the way ideas are perceived and implemented by the different political élites; how the various socioeconomic contexts influence the reform process; how the institutional structures condition the policy-making processes; and whether individual interests play a role and, if so, to what extent.
Abstract:
The contemporary media landscape is characterized by the emergence of hybrid forms of digital communication that contribute to the ongoing redefinition of our societies' cultural context. An incontrovertible consequence of this phenomenon is the new public dimension that characterizes the transmission of historical knowledge in the twenty-first century. Awareness of this new epistemic scenario has led us to reflect on the following methodological questions: what strategies should be devised to establish a communication system, based on new technology, that is scientifically rigorous but at the same time engaging for museum visitors and Internet users? How can a comparative analysis of ancient documentary sources form a solid base of information for the virtual reconstruction of thirteenth-century Bologna in the Metaverse? What benefits can the phenomenon of cross-mediality bring to virtual heritage? The implementation of a new version of the Nu.M.E. project made it possible to address many of these questions. The investigation carried out between 2008 and 2010 has shown that real-time 3D graphics and collaborative virtual environments can indeed be feasible tools for representing the urban medieval landscape philologically and for communicating properly validated historical data to the general public. This research focuses on the study and implementation of a pipeline that permits mass communication of historical information about an area of vital importance in late medieval Bologna: Piazza di Porta Ravegnana. The originality of the developed project is not limited solely to the methodological dimension of historical research. The adopted technological perspective is an excellent example of the innovation that digital technologies can bring to cultural heritage.
The main result of this research is the creation of Nu.ME 2010, a cross-media system for real-time 3D visualization based on some of the most advanced free and open source technologies available today free of charge.
Abstract:
The objective of the current thesis is to investigate the temporal dynamics (i.e., time courses) of the Simon effect, from both a theoretical and an experimental point of view, for a better understanding of a) whether one or more processes are responsible for the Simon effect and b) how this mechanism, or these mechanisms, differently influence performance. In the first, theoretical part ("Theoretical Overview"), I examined in detail the process of, and justification for, analyzing the temporal dynamics of the Simon effect, and the assumptions underlying the interpretation of the results obtained in the existing literature so far. In the second part ("Experimental Investigations"), I experimentally investigated several issues that the existing literature has left unresolved, in order to obtain further evidence for or against the mainstream models currently used to account for the different Simon effect time courses. Some points about the experiments are worth mentioning. First, all the experiments were conducted in the laboratory, presenting participants with stimuli on a PC screen and recording their responses. Both stimulus presentation and response collection were controlled by the E-Prime software. The dependent variables of interest were always behavioral measures of performance, such as speed and accuracy. Second, most of my experiments were conducted at the Communication Sciences Department (University of Bologna), under Prof. Nicoletti's supervision. The remaining part was conducted at the Psychological Sciences Department of Purdue University (West Lafayette, Indiana, USA), where I collaborated for one year as a visiting student with Prof. Proctor and his team. Third, my experimental pool was entirely composed of healthy young students, since the cognitive functioning of elderly people was not the target of my research.
Abstract:
The specific way in which content is acquired through digital interfaces condemns the epistemic agent to a fragmented interaction, insufficient from a computational, mnemonic and temporal point of view with respect to the mass of information accessible today through any implementation of the human-computer relationship. It invalidates the applicability of the standard model of knowledge as justified true belief, undermining the concept of rationally grounded belief, since forming such a belief would require the agent to have at his disposal conceptual, computational and temporal resources that are in fact inaccessible. The consequence is that the agent, constrained by the ontological limitations typical of interaction with cultural interfaces, is forced to fall back on ambiguous, arbitrary and often more random than he believes processes of selecting and managing information. These give rise to genuine epistemological hybrids (in Latour's sense), made of sensations and program outputs, unfounded beliefs and bits of indirect testimony, and a whole series of human-digital relations that prompt an escape into a transcendent dimension which finds in the sacred its most immediate sphere of realization. Given all this, the present work sets out to construct a new epistemological paradigm of propositional knowledge obtainable through a digital content-acquisition interface, founded on the new concept of Digital Tracing (Tracciatura Digitale), defined as a process of digital acquisition of a set of traces, that is, meta-information of a testimonial nature.
This device, once recognized as a process of content communication, will be based on the search for and selection of meta-information, that is, traces, which will allow the implementation of approaches derived from decision analysis under bounded rationality. These approaches, besides being almost never used in this field, are ontologically suited to handling the kind of uncertainty found in the instantiation of the informational hybrid and, under certain conditions, can assure the agent of the epistemic quality of the acquired content.
Abstract:
This work is a reflection on the development of the notion of definition in the recent debate on analyticity. The revival of this discussion, after Quine's criticisms and a consequent initial abandonment of Carnap's conventionalist conception, has resulted in a new epistemic conception of analyticity. In most cases the new epistemic theories, among them those of Bob Hale and Crispin Wright (Implicit Definition and the A priori, 2001) and Paul Boghossian (Analyticity, 1997; Epistemic analyticity, a defence, 2002; Blind reasoning, 2003; Is Meaning Normative?, 2005), share the trait of understanding a priori knowledge in the form of an implicit definition (Paul Horwich, Stipulation, Meaning, and Apriority, 2001). But a second line of objections, stemming first from Horwich and later from Hale and Wright themselves, highlights two difficulties for definition, corresponding respectively to the questions of epistemic arrogance and of the acceptance (or stipulation) of an implicit definition. From this starting point, several attempted answers arise. On the one hand, there is a conception of definition, in Hale and Wright's theory, according to which it appears as an abstraction principle; on the other, a notion of definition as implicit definition, which goes back to Boghossian's conception. In the latter, the implicit definition is given in the form of a linguistic conditional (EA, 2002; BR, 2003), obtained through a factorization of the theory built on the Carnapian model for the theoretical terms of empirical theories. A careful analysis of Rudolf Carnap's work (Philosophical Foundations of Physics, 1966) shows that the decomposition strategy represents a possible route to a notion of analyticity adequate for theoretical terms.
The Carnapian strategy is in fact situated within an attempt to elaborate a notion of analyticity that takes account of the inductive aspects of empirical theories.
Abstract:
This thesis analyses, from a methodological and conceptual point of view, the illness narratives of people affected by multiple sclerosis. The purpose of the research is twofold: on the one hand, to investigate the narrative plots of those who recount the diagnosis of their illness; on the other, to analyse their experiences of illness through the categories of the sociology of health and medicine and of medical anthropology.
Abstract:
This dissertation introduces and develops a new method of rational reconstruction called structural heuristics. Structural heuristics takes the assignment of structure to any given object of investigation as the starting point for its rational reconstruction. This means looking at any given object as a system of relations and of transformation laws for those relations. The operational content of this heuristics can be summarized as follows: when facing any given system, the best way to approach it is to explicitly look for a possible structure of it. The use of structural heuristics enables structural awareness, which is considered a fundamental epistemic disposition, as well as a fundamental condition for the rational reconstruction of systems of knowledge. In this dissertation, structural heuristics is applied to reconstructing the domain of economic knowledge. This is done by exploring four distinct areas of economic research: (i) economic axiomatics; (ii) realism in economics; (iii) production theory; (iv) economic psychology. The application of structural heuristics to these fields of economic inquiry shows its flexibility and potential as an epistemic tool for theoretical exploration and reconstruction.
Abstract:
This thesis is divided into three chapters. In the first chapter we analyse the results of the worldwide forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure for evaluating earthquake forecasting models. We first present the models and the target earthquakes to be forecast. Then we explain the consistency and comparison tests used in CSEP experiments to evaluate the performance of the models. Introducing a methodology to create ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of probabilistic seismic hazard analysis (PSHA): the declustering of seismicity rates. We first introduce the Cornell-McGuire method for PSHA and present the different motivations behind the need to decluster seismic catalogs. Using a theorem of modern probability theory (Le Cam's theorem), we show that declustering is not necessary to obtain the Poissonian behaviour of the exceedances that is usually considered fundamental for transforming exceedance rates into exceedance probabilities in the PSHA framework. We present a method to correct PSHA for declustering, building a more realistic PSHA. In the last chapter we explore the methods commonly used to take epistemic uncertainty into account in PSHA. The most widely used is the logic tree, which underlies the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree, and then show that this structure is not adequate to describe epistemic uncertainty. We then propose a new probabilistic framework, based on ensemble modelling, that properly accounts for epistemic uncertainties in PSHA.
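The Poisson assumption mentioned above is what lets standard PSHA turn an annual exceedance rate into an exceedance probability over a time window: under a Poisson process with rate λ, the probability of at least one exceedance in t years is 1 - exp(-λt). A minimal generic illustration (not the thesis code; the return-period value below is the common hazard-map convention, used here only as an example):

```python
import math

def poisson_exceedance_prob(rate_per_year, t_years):
    """Probability of at least one exceedance in t_years,
    assuming exceedances follow a Poisson process with the given rate."""
    return 1.0 - math.exp(-rate_per_year * t_years)

# example: a ground-motion level with a ~475-year return period
# has roughly a 10% probability of exceedance in 50 years
rate = 1.0 / 475.0
print(round(poisson_exceedance_prob(rate, 50.0), 3))  # → 0.1
```

The thesis's point, via Le Cam's theorem, is that this conversion can remain valid for exceedances even without declustering the underlying catalog.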
Abstract:
Depending on the regulatory regime they are subject to, governments may or may not be allowed to hand out state aid to private firms. The economic justification for state aid control can be traced to several issues present in the competition for capital and in the competition for transfers from the state. First, there are principal-agent problems involved at several stages. Self-interested politicians might enter state aid deals that are the result of extensive rent-seeking activities by organized interest groups. Thus the institutional design of a political system will affect the propensity of a jurisdiction to award state aid. Secondly, fierce competition for firm locations can lead to over-spending. This effect is stronger if politicians do not take into account the entirety of the costs created by their participation in the firm location race. Thirdly, state aid deals can be incomplete and not in the interest of the citizens. This applies if there are no sanctions when firms do not meet the obligations attached to receiving aid, such as creating a certain number of jobs or not relocating again for a certain period of time. The separation of ownership and control in modern corporations leads to principal-agent problems on the side of the aid recipient as well. Managers might receive personal benefits from subsidies, the use of which is sometimes less closely monitored than private finance. This can ultimately be to the detriment of the shareholders. Overall, it can be concluded that state aid control should also serve the purpose of regulating the contracting between governments and firms. An extended mandate for supervision by the European Commission could include requirements to disincentivize the misuse of state aid. The Commission should also focus on the corporate governance regime in place in the jurisdiction that awards the aid, as well as in the recipient firm.
Abstract:
How to evaluate the cost-effectiveness of repair/retrofit interventions vs. demolition/replacement, and what level of shaking intensity the chosen repair/retrofit technique can sustain, are open questions affecting the pre-earthquake prevention, post-earthquake emergency and reconstruction phases. The (mis)conception that the cost of retrofit interventions increases linearly with the achieved seismic performance (%NBS) often discourages stakeholders from considering repair/retrofit options in a post-earthquake damage situation. Similarly, in the pre-earthquake phase, only the minimum (by-law) level of %NBS might be targeted, leading in some cases to no action. Furthermore, the performance measure prompting owners to take action, the %NBS, is generally evaluated deterministically. Since it does not directly reflect epistemic and aleatory uncertainties, the assessment can result in misleading confidence in the expected performance. The present study aims to contribute to the delicate decision-making process of repair/retrofit vs. demolition/replacement by developing a framework to assist stakeholders in evaluating the long-term losses and benefits of an increment in their initial investment (targeted retrofit level), and by highlighting the uncertainties hidden behind a deterministic approach. For a pre-1970 case-study building, different retrofit solutions are considered, targeting different levels of %NBS, and the actual probability of reaching Collapse under a suite of ground motions is evaluated, providing a correlation between %NBS and risk. Both simplified and probabilistic loss modelling are then undertaken to study the relationship between %NBS and expected direct and indirect losses.
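One common way to express the probability of reaching Collapse as a function of shaking intensity, in contrast with a single deterministic %NBS score, is a lognormal fragility curve fitted to the ground-motion suite results. The sketch below is a generic illustration of that standard form, with illustrative parameter values; it is not the thesis model.

```python
import math

def collapse_probability(im, median, beta):
    """Lognormal fragility: P(Collapse | intensity measure = im),
    with median capacity `median` and lognormal dispersion `beta`."""
    z = (math.log(im) - math.log(median)) / beta
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# at the median intensity, the collapse probability is 50% by construction
print(round(collapse_probability(0.4, 0.4, 0.5), 2))  # → 0.5
```

Evaluating such a curve for each retrofit option (each shifting the median capacity upward) is one way the %NBS-to-risk correlation described above can be made explicit, with the dispersion term carrying the uncertainty that a deterministic %NBS hides.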
Abstract:
The administrative detention of foreigners, while sharing the typically afflictive and stigmatizing character of criminal punishment, is not based on the commission of an offence and does not enjoy the same guarantees provided by the criminal justice system. In the Italian legal order, the inadequacy of the legislation, the wide margin of discretion left to the public security authorities, and the weak power of judicial review entrusted to the judiciary reach their most problematic peak in the practices of deprivation of personal liberty directed at the most vulnerable foreigners, namely those who have just arrived on the territory and whose legal status has not yet been ascertained (the so-called pre-admittance situation). It is on their condition that this work chiefly focuses. The de facto detention of foreigners in the pre-admittance condition is analysed, in the first chapter, starting from the "Lampedusa case", described in the light of the field research conducted by the author. The second chapter reconstructs the status of the foreigner's personal liberty on the basis of constitutional principles, and the third chapter analyses the principles informing the right to personal liberty in supranational sources, with particular reference to European Union law and the system of the European Convention on Human Rights. On the basis of the principles examined, the fourth chapter traces the legislative evolution of the administrative detention of foreigners in Italy, and the fifth chapter examines the immigration Centres and the rules governing them.
Finally, the conclusions draw the threads of the path traced, through an assessment of the protective instruments capable of preventing informal practices of deprivation of liberty and of guaranteeing a minimum standard of protection of individual liberty, even in the frontier zones of our legal order.