41 results for Representation and information retrieval technologies


Relevance: 100.00%

Abstract:

Many people routinely criticise themselves. While self-criticism is largely unproblematic for most individuals, depressed patients exhibit excessive self-critical thinking, which leads to strong negative affects. We used functional magnetic resonance imaging in healthy subjects (N = 20) to investigate neural correlates and possible psychological moderators of self-critical processing. Stimuli consisted of individually selected adjectives of personally negative content and were contrasted with neutral and negative non-self-referential adjectives. We found that confrontation with self-critical material yielded neural activity in regions involved in emotions (anterior insula/hippocampus-amygdala formation) and in anterior and posterior cortical midline structures, which are associated with self-referential and autobiographical memory processing. Furthermore, contrasts revealed an extended network of bilateral frontal brain areas. We suggest that the co-activation of superior and inferior lateral frontal brain regions reflects the recruitment of a frontal top-down pathway, representing cognitive reappraisal strategies for dealing with evoked negative affects. In addition, activation of right superior frontal areas was positively associated with neuroticism and negatively associated with cognitive reappraisal. Although these findings may not be specific to negative stimuli, they support a role for clinically relevant personality traits in successful regulation of emotion during confrontation with self-critical material.

Relevance: 100.00%

Abstract:

Background: A prerequisite for high performance in motor tasks is the acquisition of egocentric sensory information that must be translated into motor actions. A phenomenon that supports this process is the Quiet Eye (QE), defined as the long final fixation before movement initiation. It is assumed that the QE facilitates information processing, particularly regarding movement parameterization. Aims: The question remains whether this facilitation also holds for the information-processing stage of response selection and for the perceptually crucial stage of stimulus identification. Method: In two experiments with sport science students, performance-enhancing effects of experimentally manipulated QE durations were tested as a function of target position predictability and target visibility, thereby selectively manipulating response selection and stimulus identification demands, respectively. Results: The results support the hypothesis of facilitated information processing through long QE durations, since in both experiments performance-enhancing effects of long QE durations were found only under increased processing demands. In Experiment 1, QE duration affected performance only if the target position was not predictable and positional information had to be processed over the QE period. In Experiment 2, in a comparison of full vs. no target visibility with saccades to the upcoming target position induced by flicker cues, the functionality of a long QE duration depended on the visual stimulus identification period once the interval fell below a certain threshold. Conclusions: The results corroborate earlier findings that QE efficiency depends on the demands put on the visuomotor system, thereby supporting the assumption that the phenomenon aids the processes of sensorimotor integration.

Relevance: 100.00%

Abstract:

Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties.
In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be pure time cost from delaying agreement or cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time cost and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is only known to the seller.
Indeed, without the possibility to make repeated offers, it is too risky for the buyer to offer prices that allow for trade of high quality goods. When allowing for repeated offers, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, which substantially reduces efficiency. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.
In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to crucially depend on i) the degree to which players can renegotiate and gradually build up agreements and ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
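The partition-function formalism mentioned above can be made concrete with a minimal sketch. A partition function assigns a worth to each coalition embedded in a coalition structure, so the same coalition can be worth different amounts depending on how the outsiders are organized, which is precisely how externalities enter. All numeric values below are invented for illustration and do not come from the thesis:

```python
# Hypothetical partition function for 3 players: the worth of a
# coalition depends on the whole coalition structure it is embedded in.
partition_function = {
    # (coalition, coalition structure) -> worth
    ((1,), ((1,), (2,), (3,))): 1.0,
    ((1,), ((1,), (2, 3))): 0.5,      # {1} is worth less once {2,3} cooperate
    ((2, 3), ((1,), (2, 3))): 3.0,
    ((1, 2, 3), ((1, 2, 3),)): 5.0,   # grand coalition
}

def worth(coalition, structure):
    """Look up the worth of a coalition embedded in a given structure."""
    return partition_function[(coalition, structure)]

# Externality: player 1's stand-alone worth changes with the
# arrangement of the other players, even though {1} itself is unchanged.
print(worth((1,), ((1,), (2,), (3,))))  # 1.0
print(worth((1,), ((1,), (2, 3))))      # 0.5
```

In a classical characteristic function, by contrast, the second argument would be irrelevant: a coalition's worth would not depend on the outsiders' arrangement.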

Relevance: 100.00%

Abstract:

In developing meaningful mitigation strategies to combat desertification, it is important to address the complex constellation of desertification under different biophysical, social, demographic, political and economic conditions. In particular, desertification can be described as a cluster of key processes of global change which together form a typical syndrome. A critical reflection on the potential of research to help mitigate desertification will be a useful first step, before addressing the requirements for research partnerships between institutions at local levels and beyond. A practical example from Eritrea, a region which has been plagued by desertification for many centuries, is given at the end of the paper. It illustrates options for generating the necessary data and developing useful information in order to enhance the impact of research on sustainable development.

Relevance: 100.00%

Abstract:

As librarians of the Social & Preventive Medicine Library in Bern, we help researchers perform systematic literature searches and teach students to use medical databases. We developed our skills mainly "on the job", and we wondered how other health librarians in Europe were trained to become experts in searching. We had a great opportunity to "job shadow" specialists in this area of library service during a 5-day internship at the Royal Free Hospital Medical Library in London, Great Britain.

Relevance: 100.00%

Abstract:

Web-scale knowledge retrieval can be enabled by distributed information retrieval, clustering Web clients into a large-scale computing infrastructure for knowledge discovery from Web documents. Based on this infrastructure, we propose to apply semiotic (i.e., sub-syntactical) and inductive (i.e., probabilistic) methods for inferring concept associations in human knowledge. These associations can be combined to form a fuzzy (i.e., gradual) semantic net representing a map of the knowledge in the Web. Thus, we propose to provide interactive visualizations of these cognitive concept maps to end users, who can browse and search the Web in a human-oriented, visual, and associative interface.
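One way such gradual concept associations could be inferred is from document co-occurrence statistics. The following minimal sketch builds a fuzzy semantic net whose edge weights lie in [0, 1]; the toy corpus and the Jaccard-style weighting are illustrative assumptions, not the authors' actual method:

```python
from itertools import combinations
from collections import Counter

# Toy corpus standing in for crawled Web documents (illustrative only):
# each document is reduced to its set of concepts.
docs = [
    {"web", "search", "retrieval"},
    {"web", "semantic", "knowledge"},
    {"search", "retrieval", "knowledge"},
    {"web", "knowledge", "retrieval"},
]

# Count single and pairwise concept occurrences across documents.
single = Counter()
pair = Counter()
for doc in docs:
    for concept in doc:
        single[concept] += 1
    for a, b in combinations(sorted(doc), 2):
        pair[(a, b)] += 1

# Fuzzy semantic net: graded edge weight via Jaccard co-occurrence,
# i.e. |docs with both| / |docs with either|.
net = {
    (a, b): pair[(a, b)] / (single[a] + single[b] - pair[(a, b)])
    for (a, b) in pair
}

print(net[("retrieval", "web")])  # gradual association strength, 0.5 here
```

Because the weights are gradual rather than binary, the resulting net can be thresholded or ranked when rendering an associative concept map for browsing.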

Relevance: 100.00%

Abstract:

Digital technologies have profoundly changed not only the ways we create, distribute, access, use and re-use information but also many of the governance structures we had in place. Overall, "older" institutions at all governance levels have grappled with and often failed to master the multi-faceted and multi-directional issues of the Internet. Regulatory entrepreneurs have yet to discover and fully mobilize the potential of digital technologies as an influential factor impacting upon the regulability of the environment and as a potential regulatory tool in themselves. At the same time, we have seen a deterioration of some public spaces and lower prioritization of public objectives when strong private commercial interests are at play, most tellingly in the field of copyright. Less tangibly, private ordering has taken hold, capturing through contracts spaces previously regulated by public law. Code embedded in technology often replaces law. Non-state action has in general proliferated and put serious pressure upon conventional state-centered, command-and-control models. Under the conditions of this "messy" governance, the provision of key public goods, such as freedom of information, has been made difficult or is indeed jeopardized. The grand question is how we can navigate this complex multi-actor, multi-issue space and secure the attainment of fundamental public interest objectives. This is also the question that Ian Brown and Chris Marsden seek to answer with their book, Regulating Code, recently published in the "Information Revolution and Global Politics" series of MIT Press. This book review critically assesses the bold effort by Brown and Marsden.

Relevance: 100.00%

Abstract:

Soils are fundamental to ensuring water, energy and food security. Within the context of sustainable food production, it is important to share knowledge on existing and emerging technologies that support land and soil monitoring. Technologies, such as remote sensing, mobile soil testing, and digital soil mapping, have the potential to identify degraded and non-/little-responsive soils, and may also provide a basis for programmes targeting the protection and rehabilitation of soils. In the absence of such information, crop production assessments are often not based on the spatio-temporal variability in soil characteristics. In addition, uncertainties in soil information systems are notable and build up when predictions are used for monitoring soil properties or biophysical modelling. Consequently, interpretations of model-based results have to be done cautiously. As such they provide a scientific, but not always manageable, basis for farmers and/or policymakers. In general, the key incentives for stakeholders to aim for sustainable management of soils and more resilient food systems are complex at farm as well as higher levels. The same is true of drivers of soil degradation. The decision-making process aimed at sustainable soil management, be that at farm or higher level, also involves other goals and objectives valued by stakeholders, e.g. land governance, improved environmental quality, climate change adaptation and mitigation, etc. In this dialogue session we will share ideas on recent developments in the discourse on soils, their functions and the role of soil and land information in enhancing food system resilience.

Relevance: 100.00%

Abstract:

The metabolic network of a cell represents the catabolic and anabolic reactions that interconvert small molecules (metabolites) through the activity of enzymes, transporters and non-catalyzed chemical reactions. Our understanding of individual metabolic networks is increasing as we learn more about the enzymes that are active in particular cells under particular conditions and as technologies advance to allow detailed measurements of the cellular metabolome. Metabolic network databases are of increasing importance in allowing us to contextualise data sets emerging from transcriptomic, proteomic and metabolomic experiments. Here we present a dynamic database, TrypanoCyc (http://www.metexplore.fr/trypanocyc/), which describes the generic and condition-specific metabolic network of Trypanosoma brucei, a parasitic protozoan responsible for human and animal African trypanosomiasis. In addition to enabling navigation through the BioCyc-based TrypanoCyc interface, we have also implemented a network-based representation of the information through MetExplore, yielding a novel environment in which to visualise the metabolism of this important parasite.
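A metabolic network of the kind described above is naturally a bipartite structure linking reactions to the metabolites they interconvert. The following minimal sketch, using invented toy reactions rather than actual TrypanoCyc data, shows how such a representation supports the simple queries that underpin network navigation, such as finding which reactions produce or consume a metabolite:

```python
# Toy bipartite representation: reaction -> (substrates, products).
# Reaction and metabolite names are illustrative, not TrypanoCyc entries.
network = {
    "hexokinase": ({"glucose", "ATP"}, {"glucose-6-P", "ADP"}),
    "pgi": ({"glucose-6-P"}, {"fructose-6-P"}),
}

def consumers_of(metabolite, net):
    """Return the reactions that consume the given metabolite."""
    return {r for r, (subs, _) in net.items() if metabolite in subs}

def producers_of(metabolite, net):
    """Return the reactions that produce the given metabolite."""
    return {r for r, (_, prods) in net.items() if metabolite in prods}

print(consumers_of("glucose-6-P", network))  # {'pgi'}
print(producers_of("glucose-6-P", network))  # {'hexokinase'}
```

Chaining such producer/consumer lookups is essentially how a network-based viewer traces paths through the metabolism from one metabolite to the next.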