7 results for "Strategic usage of IS"

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance: 100.00%

Abstract:

For their survival, humans and animals rely on motivational systems specialized in assessing the valence and imminence of dangers and appetitive cues. The Orienting Response (OR) is a fundamental response pattern that an organism executes whenever a novel or significant stimulus is detected, and it is consistently modulated by the affective value of the stimulus. Detecting threatening stimuli and appetitive affordances while they are still far away, rather than within reach, constitutes an obvious evolutionary advantage. Building on the lawful relationship between stimulus distance and retinal size, the present research investigated the extent to which the emotional modulation of distinct processes (action preparation, attentional capture, and subjective emotional state) is affected when the retinal size of a picture is reduced.

Studies 1-3 examined the effects of picture size on emotional response. Subjective feelings of engagement, as well as sympathetic activation, were modulated by picture size, suggesting that action preparation and subjective experience reflect the combined effects of detecting an arousing stimulus and assessing its imminence. On the other hand, physiological responses thought to reflect the amount of attentional resources invested in stimulus processing did not vary with picture size.

Studies 4-6 were conducted to substantiate and extend these results. In particular, a decrease in picture size is accompanied by a loss of the high spatial frequencies of a picture, which might confound the interpretation of the results of studies 1-3. Therefore, emotional and neutral images that were either low-pass filtered or reduced in size were presented, and affective responses were measured. Most of the effects observed when manipulating image size were replicated by blurring pictures. However, pictures depicting highly arousing unpleasant contents showed a more pronounced decrease in affective modulation when reduced in size than when blurred.

The present results provide important information for the study of the processes involved in picture perception and in the genesis and expression of an emotional response. In particular, the availability of high spatial frequencies might affect the degree of activation of an internal representation of an affectively charged scene, and might modulate subjective emotional state and preparation for action. Moreover, the manipulation of stimulus imminence revealed important effects of stimulus engagement on specific components of the emotional response, and the implications of these data for models of emotion are discussed. In particular, within the framework of a staged model of emotional response, the tactical and strategic roles of response preparation and attention allocation to stimuli of varying engaging power are discussed, considering the adaptive advantages that each might represent from an evolutionary perspective. Finally, the identification of the perceptual parameters that allow affective processing to be carried out has important methodological applications for future studies examining emotional response in basic research or clinical contexts.
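The size/spatial-frequency confound discussed above can be illustrated numerically. The following Python sketch is a toy 1-D analogue, not the stimulus pipeline used in the studies: averaging-and-decimating a signal (the discrete counterpart of reducing picture size) preserves its coarse, low-frequency structure while strongly attenuating fine, high-frequency detail. The signal, frequencies, and decimation factor are arbitrary choices for illustration.

```python
import numpy as np

def band_amplitude(signal, freq):
    """Amplitude of a sinusoidal component at `freq` cycles per signal, via the FFT."""
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    return 2 * spectrum[freq]

n = 1024
t = np.arange(n) / n
# A toy "image row": coarse structure (4 cycles) plus fine detail (200 cycles).
signal = np.sin(2 * np.pi * 4 * t) + 0.5 * np.sin(2 * np.pi * 200 * t)

# "Size reduction": average blocks of 8 neighbouring samples, then keep one value per block.
reduced = signal.reshape(-1, 8).mean(axis=1)

# The coarse component survives the reduction almost untouched...
coarse_after = band_amplitude(reduced, 4)
# ...while no component in the upper part of the reduced spectrum retains
# anything close to the fine detail's original amplitude (0.5).
fine_after = (2 * np.abs(np.fft.rfft(reduced)) / len(reduced))[20:].max()
```

Running this shows `coarse_after` close to the original amplitude of 1.0, while `fine_after` is only a small residue (an attenuated alias), mirroring the claim that shrinking a picture selectively removes its high-spatial-frequency content.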

Relevance: 100.00%

Abstract:

The term "Brain Imaging" identifies a set of techniques for analyzing the structure and/or functional behavior of the brain in normal and/or pathological conditions, and these techniques are widely used in the study of brain activity. Beyond clinical usage, the analysis of brain activity is gaining popularity in other emerging fields, such as Brain-Computer Interfaces (BCI) and the study of cognitive processes. In these contexts, classical solutions (e.g. fMRI, PET-CT) can be unfeasible due to their low temporal resolution, high cost, and limited portability. For these reasons, alternative low-cost techniques are an active subject of research, typically based on simple recording hardware combined with intensive data processing. Typical examples are ElectroEncephaloGraphy (EEG) and Electrical Impedance Tomography (EIT), in which the electric potential at the patient's scalp is recorded by high-impedance electrodes. In EEG the potentials are generated directly by neuronal activity, while in EIT they arise from the injection of small currents at the scalp. To retrieve meaningful insights into brain activity from these measurements, EIT and EEG rely on detailed knowledge of the underlying electrical properties of the body, obtained from numerical models of the electric field distribution therein. The inhomogeneous and anisotropic electrical properties of human tissues make accurate modeling and simulation very challenging, leading to a trade-off between physical accuracy and technical feasibility which currently limits the capabilities of these techniques severely. Moreover, processing the recorded data requires computationally intensive regularization techniques, which penalizes applications with strict timing constraints (such as BCI). This work focuses on the parallel implementation of a workflow for EEG and EIT data processing. The resulting software is accelerated on many-core GPUs in order to provide solutions in reasonable times and to meet the requirements of real-time BCI systems, without over-simplifying the complexity and accuracy of the head models.
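The regularized inversion mentioned above can be sketched in a few lines. The example below is a minimal Tikhonov (ridge) solution of a linear inverse problem, assuming a known forward matrix `A`; the matrix sizes, noise level, and regularization weight are hypothetical, and a real EEG/EIT pipeline would build `A` from a detailed head model and run the heavy linear algebra on the GPU rather than in NumPy.

```python
import numpy as np

def tikhonov_inverse(A, b, lam):
    """Solve min_x ||A x - b||^2 + lam * ||x||^2 via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Toy underdetermined problem: 20 "electrodes", 50 "sources" (hypothetical sizes).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))        # stand-in for a lead-field matrix
x_true = np.zeros(50)
x_true[[5, 30]] = 1.0                    # two active sources
b = A @ x_true + 0.01 * rng.standard_normal(20)  # noisy measurements

x_hat = tikhonov_inverse(A, b, lam=0.1)  # regularized source estimate
```

Without the `lam * np.eye(n)` term the normal equations are singular here (more unknowns than measurements); the regularizer is what makes the reconstruction well-posed, at the cost of biasing the solution toward small amplitudes.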

Relevance: 100.00%

Abstract:

The Neolithic is characterized by the transition from a subsistence economy based on hunting and gathering to one based on food production. This important change was paralleled by one of the most significant demographic increases in the recent history of European populations. The earliest Neolithic sites in Europe are located in Greece; however, the debate regarding the colonization route followed by the Middle Eastern farmers is still open. Based on archaeological, archaeobotanical, craniometric, and genetic data, two main hypotheses have been proposed. The first implies the maritime colonization of north-eastern Peloponnesus from Crete, whereas the second points to an island-hopping route that finally brought migrants to Central Greece. To test these hypotheses with a genetic approach, 206 samples were collected from the two Greek regions proposed as the arrival points of the two routes (the Korinthian district and Euboea). Expectations under each hypothesis were compared with empirical observations based on the analysis of 60 SNPs and 26 microsatellite loci of the Y chromosome and of mitochondrial DNA hypervariable region I. The analysis of Y-chromosome haplogroups revealed a strong genetic affinity of Euboea with Anatolian and Middle Eastern populations. Inferences of the time since population expansion suggest an earlier adoption of agriculture in Euboea. Moreover, the haplogroup J2a-M410, thought to be associated with the Neolithic transition, was observed at higher frequency and variance in Euboea, with both parameters showing a decreasing gradient moving away from this area. The time-since-expansion estimates for J2a-M410 were found to be compatible with the Neolithic and slightly older in Euboea. The analysis of mtDNA proved less informative; however, a higher genetic affinity of Euboea with Anatolian and Middle Eastern populations was confirmed. Taken as a whole, these results suggest that the most probable route followed by Neolithic farmers during the colonization of Greece was the island-hopping route.

Relevance: 100.00%

Abstract:

This thesis is concerned with the role played by software tools in the analysis and dissemination of linguistic corpora, and with their contribution to a more widespread adoption of corpora in different fields. Chapter 1 contains an overview of some of the most relevant corpus analysis tools available today, presenting their most interesting features and some of their drawbacks. Chapter 2 begins by explaining why none of the available tools appears to satisfy the requirements of the user community, and then continues with a technical overview of the current status of the new system developed as part of this work. This presentation is followed by highlights of the features that make the system appealing to users and to corpus builders (i.e. scholars willing to make their corpora available to the public). The chapter concludes with an indication of future directions for the project and information on the current availability of the software. Chapter 3 describes the design of an experiment devised to evaluate the usability of the new system in comparison with another corpus tool. Usage of the tool was tested in the context of a documentation task performed on a real assignment during a translation class in a master's degree course. In Chapter 4 the findings of the experiment are presented on two levels of analysis: first, a discussion of how participants interacted with and evaluated the two corpus tools in terms of interface and interaction design, usability, and perceived ease of use; then, an analysis of how users interacted with the corpora to complete the task and what kinds of queries they submitted. Finally, some general conclusions are drawn and areas for future work are outlined.

Relevance: 100.00%

Abstract:

This thesis analyses problems related to the applicability of Process Mining tools and techniques in business environments. The first contribution is a presentation of the state of the art of Process Mining and a characterization of companies in terms of their "process awareness". The work then identifies the circumstances where problems can emerge: data preparation, the actual mining, and the interpretation of results. Further problems are the configuration of parameters by non-expert users and computational complexity. We concentrate on two possible scenarios: "batch" and "on-line" Process Mining. Concerning batch Process Mining, we first investigated the data preparation problem and proposed a solution for the identification of "case IDs" whenever this field is not explicitly recorded. We then concentrated on problems at mining time and propose a generalization of a well-known control-flow discovery algorithm that exploits non-instantaneous events; the use of interval-based recording leads to an important improvement in performance. Later on, we report our work on parameter configuration for non-expert users, presenting two approaches to selecting the "best" parameter configuration: one is completely autonomous, while the other requires human interaction to navigate a hierarchy of candidate models. Concerning data interpretation and the evaluation of results, we propose two metrics: a model-to-model metric and a model-to-log metric. Finally, we present an automatic approach for extending a control-flow model with social information, in order to simplify the analysis of these perspectives. The second part of this thesis deals with control-flow discovery algorithms in on-line settings. We propose a formal definition of the problem and two baseline approaches. Two actual mining algorithms are then proposed: the first adapts a frequency-counting algorithm to the control-flow discovery problem; the second constitutes a framework of models which can be used for different kinds of streams (stationary versus evolving).
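As a rough illustration of frequency counting over an event stream, the sketch below applies a Lossy-Counting-style counter (one common choice for this setting; the abstract above does not name the specific algorithm adapted in the thesis) to directly-follows pairs, the basic relation many control-flow discovery algorithms are built on. The event tuples, error bound, and support threshold are illustrative.

```python
import math

class LossyCounter:
    """Approximate frequency counting over a stream (Lossy Counting scheme).

    Memory stays bounded; counts are underestimated by at most error * n,
    where n is the number of items seen so far.
    """
    def __init__(self, error=0.01):
        self.error = error
        self.width = math.ceil(1 / error)   # bucket width
        self.counts = {}                    # item -> (count, max_missed)
        self.n = 0

    def add(self, item):
        self.n += 1
        bucket = math.ceil(self.n / self.width)
        if item in self.counts:
            count, missed = self.counts[item]
            self.counts[item] = (count + 1, missed)
        else:
            self.counts[item] = (1, bucket - 1)
        if self.n % self.width == 0:        # prune rare items at bucket boundaries
            self.counts = {k: (c, m) for k, (c, m) in self.counts.items()
                           if c + m > bucket}

    def frequent(self, support):
        """Items whose true frequency may reach support * n."""
        return {k: c for k, (c, m) in self.counts.items()
                if c >= (support - self.error) * self.n}

# Stream of (case_id, activity) events; count directly-follows pairs per case.
last_activity = {}
counter = LossyCounter(error=0.1)
events = [(1, "a"), (2, "a"), (1, "b"), (2, "b"), (1, "c"), (2, "c")]
for case, act in events:
    if case in last_activity:               # a pair (prev, act) within the same case
        counter.add((last_activity[case], act))
    last_activity[case] = act
```

On this tiny stream the counter reports the pairs `("a", "b")` and `("b", "c")` as frequent; in an on-line discovery setting, such a bounded-memory set of weighted directly-follows relations is what a model (e.g. a dependency graph) would be periodically rebuilt from.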

Relevance: 100.00%

Abstract:

The present PhD dissertation is dedicated to the general topic of knowledge transfer from academia to industry and to the role of various measures, at both the institutional and the university level, in support of the commercialization of university research. The overall contribution of this dissertation lies in presenting an in-depth and comprehensive analysis of the main critical issues that currently exist with regard to the commercial exploitation of academic research, while providing evidence on the role of previously underexplored areas (e.g. the strategic use of academic patents; female academic patenting) in the general debate on the ways to successful knowledge transfer from academia to industry. The first paper included in this dissertation aims to address this gap by developing a taxonomy of the literature, based on a comprehensive review of the existing body of research on government measures in support of knowledge transfer from academia to industry. The review reveals a considerable gap in the analysis of the impact and relative effectiveness of public policy measures, especially with regard to measures aimed at building knowledge and expertise among academic faculty and technology transfer agents. The second paper focuses on the role of interorganizational collaborations and their effect on the likelihood of an academic patent remaining unused, and points to the strategic management of patents by universities. In the third paper I turn to the issue of female participation in patenting and commercialization; in particular, I find evidence of the positive role of the university and its internal support structures in closing the gender gap in female academic patenting. The results of the research carried out for this dissertation provide important implications for policy makers in crafting measures to increase the efficient use of the university knowledge stock.

Relevance: 100.00%

Abstract:

This dissertation seeks to improve the use of direct democracy in order to minimize agency cost. It first explains why insights from corporate governance can help to improve constitutional law, and then identifies the insights from corporate governance that can make direct democracy more efficient. To accomplish this, the dissertation examines a number of questions. What are the key similarities between corporate and constitutional law? Do these similarities create agency problems that are similar enough for a comparative analysis to yield valuable insights? Once the utility of corporate governance insights is established, the dissertation answers two further questions: are initiatives necessary to minimize agency cost if referendums are already provided for, and must the results of direct democracy be binding in order for agency cost to be minimized?