906 results for epistemic marking


Relevance: 10.00%

Abstract:

This dissertation introduces and develops a new method of rational reconstruction called structural heuristics. Structural heuristics takes the assignment of structure to any given object of investigation as the starting point for its rational reconstruction: any given object is viewed as a system of relations and of transformation laws for those relations. The operational content of this heuristics can be summarized as follows: when facing any given system, the best way to approach it is to look explicitly for a possible structure of it. The use of structural heuristics fosters structural awareness, which is considered both a fundamental epistemic disposition and a fundamental condition for the rational reconstruction of systems of knowledge. In this dissertation, structural heuristics is applied to reconstructing the domain of economic knowledge by exploring four distinct areas of economic research: (i) economic axiomatics; (ii) realism in economics; (iii) production theory; (iv) economic psychology. The application of structural heuristics to these fields of economic inquiry shows its flexibility and potential as an epistemic tool for theoretical exploration and reconstruction.

Relevance: 10.00%

Abstract:

The study presented here concerns non-conventional applications of laser welding and consists of three main strands. In the first, the feasibility of fusion welding with a continuous-wave laser source was evaluated on Aluminum Foam Sandwich panels and on aluminium-foam-filled tubes. The study highlighted numerous operational guidelines concerning the problems of welding the outer skins of the components, and demonstrated the feasibility of an integrated laser joining approach (welding followed by a post-weld heat treatment) for producing the complete joint of foam-filled tubular parts with restoration of the cellular structure at the joint interface. The second strand concerns the application of a very low power laser source, operating in short-pulse mode, to the welding of high-carbon steel. The study showed that this type of source, usually employed for ablation and marking operations, can also be applied to the welding of sub-millimetre thicknesses. In this phase the role of the process parameters in the joint geometry was highlighted and the feasibility window of the process was defined. The study was completed by investigating the possibility of applying a post-weld laser treatment to soften any hardened zones. The last strand focused on the use of high power density sources (60 MW/cm^2) in the deep-penetration welding of structural steels. The experimental work and the analysis of the results were carried out with Design of Experiment techniques to evaluate the precise role of all the process parameters, and numerous considerations on hot cracking formation were put forward.

Relevance: 10.00%

Abstract:

This thesis is divided into three chapters. In the first chapter we analyse the results of the world forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure for evaluating earthquake forecasting models. We first present the models and the target earthquakes to be forecast, then explain the consistency and comparison tests used in CSEP experiments to evaluate the performance of the models. Introducing a methodology to create ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of PSHA: the declustering of the seismicity rates. We first introduce the Cornell-McGuire method for PSHA and present the different motivations behind the need to decluster seismic catalogs. Using a theorem of modern probability theory (Le Cam's theorem), we show that declustering is not necessary to obtain the Poissonian behaviour of the exceedances that is usually considered fundamental for transforming exceedance rates into exceedance probabilities in the PSHA framework. We then present a method to correct PSHA for declustering, building a more realistic PSHA. In the last chapter we explore the methods commonly used to take into account the epistemic uncertainty in PSHA. The most widely used is the logic tree, which stands at the basis of the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree, and then show that this structure is not adequate to describe the epistemic uncertainty. We then propose a new probabilistic framework, based on ensemble modelling, that properly accounts for epistemic uncertainties in PSHA.
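Under the Poisson assumption discussed in the second chapter, exceedance rates convert to exceedance probabilities with a one-line formula; a minimal sketch in Python, where the rate and time window are the conventional 10%-in-50-years design values rather than figures from the thesis:

```python
import math

def poisson_exceedance_probability(rate_per_year: float, t_years: float) -> float:
    """Probability of at least one exceedance in t_years, assuming exceedances
    follow a homogeneous Poisson process with the given annual rate."""
    return 1.0 - math.exp(-rate_per_year * t_years)

# A mean return period of ~475 years gives the classical ~10% probability
# of exceedance in a 50-year window.
p = poisson_exceedance_probability(1.0 / 475.0, 50.0)
print(round(p, 3))  # 0.1
```

Declustering changes the rate fed into this formula; the chapter's point, via Le Cam's theorem, is that the Poissonian form of the exceedances survives even without declustering the catalog.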

Relevance: 10.00%

Abstract:

Marking the final explosive burning stage of massive stars, supernovae are one of the most energetic celestial events. Apart from their enormous optical brightness they are also known to be associated with strong emission of MeV neutrinos, up to now the only proven source of extrasolar neutrinos. Although designed for the detection of high energy neutrinos, the recently completed IceCube neutrino telescope in the Antarctic ice will have the highest sensitivity of all current experiments for measuring the shape of the neutrino light curve, which is in the MeV range. This measurement is crucial for the understanding of supernova dynamics. In this thesis, the development of a Monte Carlo simulation for a future low energy extension of IceCube, called PINGU, is described that investigates the response of PINGU to a supernova. Using this simulation, various detector configurations are analysed and optimised for supernova detection. The prospects of extracting not only the total light curve, but also the direction of the supernova and the mean neutrino energy from the data are discussed. Finally the performance of PINGU is compared to the current capabilities of IceCube.

Relevance: 10.00%

Abstract:

The country-of-origin is the “nationality” of a food when it goes through customs in a foreign country, and becomes a “brand” when the food is for sale in a foreign market. My research on country-of-origin labeling (COOL) started from a case study on the extra virgin olive oil exported from Italy to China; the results show that asymmetric and imperfect origin information may lead to market inefficiency, even market failure, in emerging countries. I then used the Delphi method to conduct qualitative and systematic research on COOL; the panel of experts in food labeling and food policy comprised 19 members in 13 countries, and their most important consensus is that multiple-countries-of-origin marking can provide accurate information about the origin of a food produced by two or more countries, avoiding misinformation for consumers. Moreover, I extended the research on COOL by analyzing the rules of origin and drafting a guideline for the standardization of origin marking. Finally, from the perspective of information economics, I estimated the potential effect of multiple-countries-of-origin labeling on the business models of international trade, and analyzed the regulatory options for mandatory or voluntary COOL of main ingredients. This research provides valuable insights for the formulation of COOL policy.

Relevance: 10.00%

Abstract:

During the last few decades an unprecedented technological growth has been at the center of embedded systems design, with Moore's Law being the leading factor of this trend. Today an ever-increasing number of cores can be integrated on the same die, marking the transition from state-of-the-art multi-core chips to the new many-core design paradigm. Despite the extraordinarily high computing power, the complexity of many-core chips opens the door to several challenges. As a result of the increased silicon density of modern Systems-on-a-Chip (SoC), the design space exploration needed to find the best design has exploded, and hardware designers face the problem of a huge design space. Virtual Platforms have always been used to enable hardware-software co-design, but today they must cope with the huge complexity of both hardware and software systems. In this thesis two research works on Virtual Platforms are presented: the first is intended for the hardware developer, to easily allow complex cycle-accurate simulations of many-core SoCs; the second exploits the parallel computing power of off-the-shelf General Purpose Graphics Processing Units (GPGPUs) with the goal of increased simulation speed. The term virtualization can be used in the context of many-core systems not only to refer to the aforementioned hardware emulation tools (Virtual Platforms), but also for two other main purposes: 1) to help the programmer achieve the maximum possible performance of an application by hiding the complexity of the underlying hardware; 2) to efficiently exploit the highly parallel hardware of many-core chips in environments with multiple active Virtual Machines. This thesis focuses on virtualization techniques with the goal of mitigating, and where possible overcoming, some of the challenges introduced by the many-core design paradigm.

Relevance: 10.00%

Abstract:

How to evaluate the cost-effectiveness of repair/retrofit intervention vs. demolition/replacement, and what level of shaking intensity the chosen repair/retrofit technique can sustain, are open questions affecting the pre-earthquake prevention, post-earthquake emergency, and reconstruction phases. The (mis)conception that the cost of retrofit interventions increases linearly with the achieved seismic performance (%NBS) often discourages stakeholders from considering repair/retrofit options in a post-earthquake damage situation. Similarly, in a pre-earthquake phase, only the minimum (by-law) level of %NBS might be targeted, leading in some cases to no action. Furthermore, the performance measure prompting owners to take action, the %NBS, is generally evaluated deterministically. By not directly reflecting epistemic and aleatory uncertainties, the assessment can result in misleading confidence in the expected performance. The present study aims to contribute to the delicate decision-making process of repair/retrofit vs. demolition/replacement by developing a framework to assist stakeholders with the evaluation of the long-term losses and benefits of an increment in their initial investment (targeted retrofit level), and by highlighting the uncertainties hidden behind a deterministic approach. For a pre-1970 case study building, different retrofit solutions are considered, targeting different levels of %NBS, and the actual probability of reaching Collapse under a suite of ground motions is evaluated, providing a correlation between %NBS and risk. Both a simplified and a probabilistic loss modelling are then undertaken to study the relationship between %NBS and expected direct and indirect losses.
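The probability of reaching Collapse under a suite of ground motions can, in its simplest empirical form, be estimated as the fraction of records whose response demand exceeds the structure's capacity. A sketch under that assumption; the drift demands and the 2.5% capacity below are hypothetical illustration values, not figures from the study:

```python
def collapse_probability(demands, capacity):
    """Empirical P(Collapse): fraction of ground-motion records whose
    response demand meets or exceeds the structural capacity."""
    return sum(1 for d in demands if d >= capacity) / len(demands)

# Hypothetical peak drift demands (%) from a suite of 20 records,
# checked against an assumed 2.5% drift capacity.
demands = [1.1, 0.8, 2.9, 1.7, 3.2, 0.6, 2.1, 1.4, 2.6, 0.9,
           1.8, 3.5, 1.2, 2.2, 0.7, 2.8, 1.5, 1.0, 2.4, 1.9]
p_collapse = collapse_probability(demands, 2.5)
print(p_collapse)  # 0.25
```

Repeating this calculation for each retrofit solution, each targeting a different %NBS, is one way a correlation between %NBS and risk could be tabulated.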

Relevance: 10.00%

Abstract:

The article explores the developments in German-language anthropology in the past decades, focussing on the period after the 1970s. It argues that the recent history of German-language Ethnologie (social and cultural anthropology) is one of catching-up modernization. German-speaking anthropologists are increasingly involved in, and contribute to, broader theoretical debates, publish in English and in international journals, and are actively engaged in international academic networks. The paper discusses how and under what conditions of knowledge production these transformations have taken place. It analyses the changing institutional environment in which German anthropologists have worked and work today, as well as the theoretical impulses from within and outside the discipline that have given rise to the contemporary orientation of German-language anthropology as an anthropology of the 'present'. Finally, and beyond the focus on Germany, the article offers some ideas on the future of anthropology as a symmetrical social science, characterized by a continued strong reliance on field work and a high level of 'worldliness', a basic attitude of systematically shifting perspectives, the critical reflection of the social and political embeddedness of knowledge production, and an engagement with social theory across disciplinary boundaries.

Relevance: 10.00%

Abstract:

This thesis was part of a multidisciplinary research project funded by the German Research Foundation (“Bevölkerungsgeschichte des Karpatenbeckens in der Jungsteinzeit und ihr Einfluss auf die Besiedlung Mitteleuropas”, grant no. Al 287/10-1) aimed at elucidating the population history of the Carpathian Basin during the Neolithic. The Carpathian Basin was an important waypoint on the spread of the Neolithic from southeastern to central Europe. On the Great Hungarian Plain (Alföld), the first farming communities appeared around 6000 cal BC. They belonged to the Körös culture, which derived from the Starčevo-Körös-Criş complex in the northern Balkans. Around 5600 cal BC the Alföld-Linearbandkeramik (ALBK), so called due to its stylistic similarities with the Transdanubian and central European LBK, emerged in the northwestern Alföld. Following a short “classical phase”, the ALBK split into several regional subgroups during its later stages, but did not expand beyond the Great Hungarian Plain. Marking the beginning of the late Neolithic period, the Tisza culture first appeared in the southern Alföld around 5000 cal BC and subsequently spread into the central and northern Alföld. Together with the Herpály and Csőszhalom groups it was an integral part of the late Neolithic cultural landscape of the Alföld. Up until now, the Neolithic cultural succession on the Alföld has been almost exclusively studied from an archaeological point of view, while very little is known about the population genetic processes during this time period. The aim of this thesis was to perform ancient DNA (aDNA) analyses on human samples from the Alföld Neolithic and analyse the resulting mitochondrial population data to address the following questions: is there population continuity between the Central European Mesolithic hunter-gatherer metapopulation and the first farming communities on the Alföld? Is there genetic continuity from the early to the late Neolithic? 
Are there genetic as well as cultural differences between the regional groups of the ALBK? Additionally, the relationships between the Alföld and the neighbouring Transdanubian Neolithic as well as other European early farming communities were evaluated to gain insights into the genetic affinities of the Alföld Neolithic in a larger geographic context. A total of 320 individuals were analysed for this study; reproducible mitochondrial haplogroup information (HVS-I and/or SNP data) could be obtained from 242 Neolithic individuals. According to the analyses, population continuity between hunter-gatherers and the Neolithic cultures of the Alföld can be excluded at any stage of the Neolithic. In contrast, there is strong evidence for population continuity from the early to the late Neolithic. All cultural groups on the Alföld were heavily shaped by the genetic substrate introduced into the Carpathian Basin during the early Neolithic by the Körös and Starčevo cultures. Accordingly, genetic differentiation between regional groups of the ALBK is not very pronounced. The Alföld cultures are furthermore genetically highly similar to the Transdanubian Neolithic cultures, probably due to common ancestry. In the wider European context, the Alföld Neolithic cultures are also highly similar to the central European LBK, while they differ markedly from contemporaneous populations of the Iberian Peninsula and the Ukraine. Thus, the Körös culture, the ALBK and the Tisza culture can be regarded as part of a “genetic continuum” that links the Neolithic Carpathian Basin to central Europe and likely has its roots in the Starčevo-Körös-Criş complex of the northern Balkans.

Relevance: 10.00%

Abstract:

The general dopamine agonist apomorphine has been shown to have mostly facilitative effects on sexual behavior in rodents (Domingues & Hull, 2005; Bitran & Hull, 1987). A study looking at the effects of apomorphine on sexual behavior in male golden hamsters observed that after systemic injections of apomorphine the males became aggressive towards the estrous females (Floody, unpublished). Studies on aggressive behavior have shown that apomorphine has facilitative effects on aggression in rodents (Nelson & Trainor, 2007; van Erp & Miczek, 2000; Ferrari, van Erp, Tornatzky, & Miczek, 2003). The studies presented here attempt to unravel the effects that apomorphine has on sexual and aggressive behavior in male golden hamsters. Studies 1, 2, 3, and 4 focused on the effects of apomorphine on aggression and Study 5 focused on the effects of apomorphine on sexual behavior. It was important for the purposes of this study to have separate, specific measures of aggression and sexual behavior that did not involve a social context with multiple behaviors and motivations. The measure used to assess aggression was flank marking behavior. The measure used to assess sexual behavior was the number of vocalizations in response to sexual stimuli. The results from Studies 1, 2, and 3 suggested that apomorphine increased aggressive motivation in a dose-dependent manner. In Studies 1 and 2 there was a high occurrence of stereotyped cheek pouching that interfered with the flank marking behavior. In Study 3 the procedure was modified to prevent cheek pouching and flank marking was observed uninhibited. Study 5 suggested a decrease in vocalizations after apomorphine treatment. However, this decrease may have been a result of the increase in stereotyped licking behavior. Results suggested that systemic apomorphine treatments increase aggressive motivation in hamsters. The increase in aggressive motivation may confuse the perception of the sensory signals that the males receive from the estrous females. They may have perceived the estrous female as a nonestrous female, which they would normally associate with an aggressive interaction (Lehman, Powers, & Winans, 1983).

Relevance: 10.00%

Abstract:

Justification Logic studies epistemic and provability phenomena by introducing justifications/proofs into the language in the form of justification terms. Pure justification logics serve as counterparts of traditional modal epistemic logics, and hybrid logics combine epistemic modalities with justification terms. The computational complexity of pure justification logics is typically lower than that of the corresponding modal logics. Moreover, the so-called reflected fragments, which still contain complete information about the respective justification logics, are known to be in NP for a wide range of justification logics, pure and hybrid alike. This paper shows that, under reasonable additional restrictions, these reflected fragments are NP-complete, thereby proving a matching lower bound. The proof method is then extended to provide a uniform proof that the corresponding full pure justification logics are $\Pi^p_2$-hard, reproving and generalizing an earlier result by Milnikel.

Relevance: 10.00%

Abstract:

Portfolio use in writing studies contexts is becoming ubiquitous and, as such, portfolios are in danger of being rendered meaningless; they therefore require fuller theorizing and historicizing. To this end, I examine two kinds of portfolio: the standardized portfolio used for assessment purposes and the personalized portfolio used for entering the job market. I take a critical look at portfolios as a form of technology and acknowledge some of the dangers of blindly using portfolios for gaining employment in the current economic structure of fast capitalism. As educators in the writing studies fields, it is paramount that instructors have a critical awareness of the consequences of portfolio creation for students as designers, lifelong learners, and citizens of a larger society. I argue that a better understanding of the pedagogical implications of portfolio use is imperative before implementing portfolios in the classroom, and that a social-epistemic approach provides a valuable rethinking of portfolio use for assessment purposes. Further, I argue for the notions of meditation and transformation to be added alongside collection, selection, and reflection, because they enable portfolio designers and evaluators alike to thoughtfully consider new ways of meaning-making and innovation. Included with meditation and transformation is the understanding that students are ideologically positioned in the educational system; for them to begin recognizing their situatedness is a step toward becoming designers of change. The portfolio can be a site for that change, and a way for them to document their own learning and ways of making meaning over a lifetime.

Relevance: 10.00%

Abstract:

This research looks at the use of the Interactive Student Notebook (ISN) in the math classroom and its impact on student achievement as part of the MiTEP program. A reflective critical analysis of the MiTEP program discusses its impact on teacher pedagogy, leadership, and connections to people and resources. The purpose of the study stemmed from the lack of student retention, poor organizational skills, and the students' inability to demonstrate college-readiness skills such as how to study, completing homework, and thinking independently. Motivation also stemmed from teacher frustration. The research was conducted at Linden Grove Middle School in Kalamazoo, Michigan, in a strategic math class. Twenty-two sixth graders, thirty-two seventh graders, and forty eighth graders were part of the study. Students were given the Strategic Math Inventory (SMI) test in week 1 of the class and again at the end of a 12-week marking period. Students participated in an attitude survey to record their feelings about the use of the ISN in the strategic math classroom. The data compared the control group (the previous year's [2012-2013] growth data) to the experimental group (the current year's [2013-2014] growth data). Both groups were statistically similar in that the mean average was about a 4th grade level equivalency and the groups had similar numbers of grade-level students. The significant findings were in the amount of growth made using the ISN. The control group started with a mean average of 586.6 and ended with a mean average of 697.1, making about one year's growth, from a 4th to a 5th grade level equivalency. The experimental group started with a mean average of 585.2 and ended with a mean average of 744.2, making about two years' growth, from a 4th to a 6th grade level equivalency. This is double the growth of the control group. A Cohen's d of 0.311 indicates that the teaching method, the use of the ISN in the math classroom, had a medium effect on student growth.
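Only the SMI means are reported above, so as a sketch, here is how a Cohen's d of roughly 0.311 could arise from the growth scores; the standard deviations and group sizes in this example are hypothetical, chosen for illustration:

```python
import math

def cohens_d(mean1, mean2, sd1, sd2, n1, n2):
    """Cohen's d for two independent groups, using a pooled standard deviation."""
    pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled

# Mean SMI growth from the study: experimental 744.2 - 585.2 = 159.0,
# control 697.1 - 586.6 = 110.5. The spreads (sd = 155 and 157) and the
# group sizes (94 each) are hypothetical.
d = cohens_d(159.0, 110.5, 155.0, 157.0, 94, 94)
print(round(d, 3))  # 0.311
```

With any plausible spread of SMI scores of this magnitude, a raw growth gap of about 48.5 points lands in the small-to-medium effect range, consistent with the reported value.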

Relevance: 10.00%

Abstract:

Non-uniformity of steps within a flight is a major risk factor for falls. Guidelines and requirements for uniformity of step risers and tread depths assume the measurement system provides precise dimensional values. The state-of-the-art measurement system is a relatively new method, known as the nosing-to-nosing method. It involves measuring the distance between the noses of adjacent steps and the angle formed with the horizontal. From these measurements, the effective riser height and tread depth are calculated. This study was undertaken for the purpose of evaluating the measurement system to determine how much of total measurement variability comes from the step variations versus that due to repeatability and reproducibility (R&R) associated with the measurers. Using an experimental design quality control professionals call a measurement system experiment, two measurers measured all steps in six randomly selected flights, and repeated the process on a subsequent day. After marking each step in a flight in three lateral places (left, center, and right), the measurers took their measurement. This process yielded 774 values of riser height and 672 values of tread depth. Results of applying the Gage R&R ANOVA procedure in Minitab software indicated that the R&R contribution to riser height variability was 1.42%; and to tread depth was 0.50%. All remaining variability was attributed to actual step-to-step differences. These results may be compared with guidelines used in the automobile industry for measurement systems that consider R&R less than 1% as an acceptable measurement system; and R&R between 1% and 9% as acceptable depending on the application, the cost of the measuring device, cost of repair, or other factors.
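The effective riser height and tread depth described above follow from the nosing-to-nosing distance and angle by simple trigonometry; a sketch of the presumed decomposition, where the 330 mm distance and 32 degree angle are illustrative values, not measurements from the study:

```python
import math

def effective_step_dimensions(nosing_distance_mm, angle_deg):
    """Split the nosing-to-nosing distance into an effective riser height
    (vertical component) and an effective tread depth (horizontal component),
    with the angle measured from the horizontal."""
    angle = math.radians(angle_deg)
    return (nosing_distance_mm * math.sin(angle),   # effective riser height
            nosing_distance_mm * math.cos(angle))   # effective tread depth

riser, tread = effective_step_dimensions(330.0, 32.0)
print(round(riser, 1), round(tread, 1))  # 174.9 279.9
```

Uniformity guidelines then compare these effective dimensions across adjacent steps, which is why the precision of the underlying distance and angle measurements (the R&R question studied here) matters.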

Relevance: 10.00%

Abstract:

In West African languages that have relative TAM marking, i.e., a system of syntactically conditioned alternating TAM paradigms, it is generally considered that the paradigms in each alternating pair necessarily have the same meaning. This paper shows that in Hausa, the Completive, which appears in pragmatically neutral clauses, and the Relative Perfective, which appears in pragmatically marked clauses (such as relative clauses), have, respectively, a basic perfect and perfective semantics, and that in some marked cases the alternation is not possible. The paper also shows that the two paradigms have acquired derived uses in a way consistent with the results of typological studies in the domain of tense/aspect.