10 results for Information analysis

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

60.00%

Publisher:

Abstract:

Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and a methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates, consistency within and among components, alongside methodology and data limitations. CO2 emissions from fossil-fuel combustion and cement production (EFF) are based on energy statistics, while emissions from land-use change (ELUC), mainly deforestation, are based on combined evidence from land-cover change data, fire activity associated with deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The mean ocean CO2 sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. The variability in SOCEAN is evaluated for the first time in this budget with data products based on surveys of ocean CO2 measurements. The global residual terrestrial CO2 sink (SLAND) is estimated by the difference of the other terms of the global carbon budget and compared to results of independent dynamic global vegetation models forced by observed climate, CO2 and land-cover change (some including nitrogen–carbon interactions). All uncertainties are reported as ±1σ, reflecting the current capacity to characterise the annual estimates of each component of the global carbon budget. For the last decade available (2003–2012), EFF was 8.6 ± 0.4 GtC yr⁻¹, ELUC 0.9 ± 0.5 GtC yr⁻¹, GATM 4.3 ± 0.1 GtC yr⁻¹, SOCEAN 2.5 ± 0.5 GtC yr⁻¹, and SLAND 2.8 ± 0.8 GtC yr⁻¹. For the year 2012 alone, EFF grew to 9.7 ± 0.5 GtC yr⁻¹, 2.2% above 2011, reflecting a continued growing trend in these emissions; GATM was 5.1 ± 0.2 GtC yr⁻¹, SOCEAN was 2.9 ± 0.5 GtC yr⁻¹, and, assuming an ELUC of 1.0 ± 0.5 GtC yr⁻¹ (based on the 2001–2010 average), SLAND was 2.7 ± 0.9 GtC yr⁻¹. GATM was high in 2012 compared to the 2003–2012 average, almost entirely reflecting the high EFF. The global atmospheric CO2 concentration reached 392.52 ± 0.10 ppm averaged over 2012. We estimate that EFF will increase by 2.1% (1.1–3.1%) to 9.9 ± 0.5 GtC in 2013, 61% above emissions in 1990, based on projections of world gross domestic product and recent changes in the carbon intensity of the economy. With this projection, cumulative emissions of CO2 will reach about 535 ± 55 GtC for 1870–2013, about 70% from EFF (390 ± 20 GtC) and 30% from ELUC (145 ± 50 GtC). This paper also documents any changes in the methods and data sets used in this new carbon budget from previous budgets (Le Quéré et al., 2013). All observations presented here can be downloaded from the Carbon Dioxide Information Analysis Center.
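The residual land sink is simple bookkeeping: SLAND balances the other budget terms, and its 1σ uncertainty follows from combining theirs in quadrature (assuming independent errors, which is an assumption of this sketch, not a statement from the paper). A minimal illustration reproducing the 2012 numbers quoted above:

```python
import math

def residual_land_sink(e_ff, e_luc, g_atm, s_ocean):
    """Residual terrestrial sink: SLAND = EFF + ELUC - GATM - SOCEAN (GtC/yr)."""
    return e_ff + e_luc - g_atm - s_ocean

def combined_sigma(*sigmas):
    """1-sigma uncertainty of the residual, assuming independent errors
    combined in quadrature (illustrative assumption)."""
    return math.sqrt(sum(s * s for s in sigmas))

# 2012 values from the abstract (GtC/yr, +/- 1 sigma)
s_land = residual_land_sink(9.7, 1.0, 5.1, 2.9)   # -> 2.7
sigma = combined_sigma(0.5, 0.5, 0.2, 0.5)        # -> ~0.9
print(f"SLAND = {s_land:.1f} +/- {sigma:.1f} GtC/yr")  # SLAND = 2.7 +/- 0.9 GtC/yr
```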

Relevance:

40.00%

Publisher:

Abstract:

Java Enterprise Applications (JEAs) are large systems that integrate multiple technologies and programming languages. Transactions in JEAs simplify the development of code that deals with failure recovery and multi-user coordination by guaranteeing atomicity of sets of operations. The heterogeneous nature of JEAs, however, can obfuscate conceptual errors in the application code, and in particular can hide incorrect declarations of transaction scope. In this paper we present a technique to expose and analyze the application transaction scope in JEAs by merging and analyzing information from multiple sources. We also present several novel visualizations that aid in the analysis of transaction scope by highlighting anomalies in the specification of transactions and violations of architectural constraints. We have validated our approach on two versions of a large commercial case study.
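As an illustration of the kind of cross-source merging the abstract describes, the sketch below combines transaction-attribute declarations gathered from two hypothetical sources (code annotations and a deployment descriptor) and flags methods whose declared scope conflicts. The data format, names, and attribute values are invented for illustration; the paper's actual technique and visualizations are not reproduced here.

```python
# Hypothetical sketch: merge transaction-scope declarations from two
# sources and report anomalies. Input format and names are invented.

def merge_declarations(annotations, descriptor):
    """Merge {method: tx_attribute} maps; conflicting declarations are
    flagged rather than silently resolved."""
    merged, conflicts = {}, []
    for method in set(annotations) | set(descriptor):
        a, d = annotations.get(method), descriptor.get(method)
        if a is not None and d is not None and a != d:
            conflicts.append((method, a, d))
        merged[method] = a if a is not None else d
    return merged, conflicts

annotations = {"OrderBean.place": "REQUIRED", "OrderBean.audit": "REQUIRES_NEW"}
descriptor  = {"OrderBean.place": "NOT_SUPPORTED"}  # contradicts the annotation

merged, conflicts = merge_declarations(annotations, descriptor)
for method, a, d in conflicts:
    print(f"anomaly: {method} declared {a} (annotation) vs {d} (descriptor)")
```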

Relevance:

40.00%

Publisher:

Abstract:

A large body of research analyzes the runtime execution of a system to extract abstract behavioral views. Those approaches primarily analyze control flow by tracing method execution events, or they analyze object graphs of heap snapshots. However, they do not capture how objects are passed through the system at runtime. We refer to the exchange of objects as the object flow, and we claim that object flow must be analyzed if we are to understand the runtime behavior of an object-oriented application. We propose and detail Object Flow Analysis, a novel dynamic analysis technique that takes this new information into account. To evaluate its usefulness, we present a visual approach that allows a developer to study classes and components in terms of how they exchange objects at runtime. We illustrate our approach on three case studies.
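To make the notion of object flow concrete, here is a toy dynamic analysis in Python that records which classes exchange which objects at runtime by tracing method calls. It is a minimal sketch of the general idea only; the paper's instrumentation and the example classes here (Cart, Register, Order) are not from the source.

```python
import sys
from collections import defaultdict

# (producer class, consumer class) -> set of object ids passed between them
object_flow = defaultdict(set)

def trace_calls(frame, event, arg):
    """Record non-primitive objects flowing as arguments into methods of
    other objects; a toy stand-in for real object-flow instrumentation."""
    if event == "call":
        args = frame.f_locals
        callee = args.get("self")
        for name, value in args.items():
            if name != "self" and callee is not None and not isinstance(value, (int, float, str)):
                caller = frame.f_back.f_locals.get("self") if frame.f_back else None
                if caller is not None:
                    object_flow[(type(caller).__name__, type(callee).__name__)].add(id(value))
    return trace_calls

class Order: pass

class Cart:
    def checkout(self, register):
        register.ring_up(Order())   # an Order object flows from Cart to Register

class Register:
    def ring_up(self, order):
        pass

sys.settrace(trace_calls)
Cart().checkout(Register())
sys.settrace(None)

for (src, dst), objs in object_flow.items():
    print(f"{src} -> {dst}: {len(objs)} object(s)")   # Cart -> Register: 1 object(s)
```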

Relevance:

40.00%

Publisher:

Abstract:

Clinical studies indicate that exaggerated postprandial lipemia is linked to the progression of atherosclerosis, the leading cause of Cardiovascular Disease (CVD). CVD is a multi-factorial disease with complex etiology, and according to the literature postprandial Triglycerides (TG) can be used as an independent CVD risk factor. The aim of the current study is to construct an Artificial Neural Network (ANN) based system for the identification of the most important gene-gene and/or gene-environment interactions that contribute to a fast or slow postprandial metabolism of TG in blood, and consequently to investigate the causality of the postprandial TG response. The design and development of the system are based on a dataset of 213 subjects who underwent a two-meal fatty prandial protocol. For each subject, a total of 30 input variables corresponding to genetic variations, sex, age and fasting levels of clinical measurements were known. These variables provide input to the system, which is based on the combined use of a Parameter Decreasing Method (PDM) and an ANN. The system was able to identify the ten most informative variables and achieve a mean accuracy of 85.21%.
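The abstract does not spell out the Parameter Decreasing Method, but the general scheme it suggests is backward feature elimination: repeatedly retrain the network and drop the input whose removal hurts cross-validated accuracy least, until the desired number of inputs remains. A sketch under those assumptions, using scikit-learn and synthetic stand-in data (the paper's ANN architecture and dataset are not reproduced):

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

def parameter_decreasing_method(X, y, n_keep=10, cv=5):
    """Backward-elimination sketch: starting from all input variables,
    repeatedly drop the one whose removal costs the least CV accuracy."""
    features = list(range(X.shape[1]))
    while len(features) > n_keep:
        scores = {}
        for f in features:
            subset = [g for g in features if g != f]
            clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
            scores[f] = cross_val_score(clf, X[:, subset], y, cv=cv).mean()
        # Drop the feature whose removal leaves the highest accuracy.
        features.remove(max(scores, key=scores.get))
    return features

# Synthetic stand-in for the 213-subject, 30-variable dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(213, 30))
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # toy "fast vs slow TG response" label
print(parameter_decreasing_method(X, y, n_keep=10))
```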

Relevance:

40.00%

Publisher:

Abstract:

We present the results of an investigation into the nature of the information needs of software developers who work in projects that are part of larger ecosystems. This work is based on a quantitative survey of 75 professional software developers. We corroborate the results identified in the survey with needs and motivations proposed in a previous survey, and discover that tool support for developers working in an ecosystem context is even more meager than we thought: mailing lists and internet search are the most popular tools developers use to satisfy their ecosystem-related information needs.

Relevance:

40.00%

Publisher:

Abstract:

Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement.

This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare.

Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be pure time cost from delaying agreement or the cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time cost and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions.

I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is known only to the seller. Indeed, without the possibility to make repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When repeated offers are allowed, however, both types of goods trade with probability one in equilibrium. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, which substantially reduces efficiency. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions.

Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.

In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to depend crucially on (i) the degree to which players can renegotiate and gradually build up agreements and (ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
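A partition function, as used in Chapter 3, can be made concrete with a small data structure: a map from (coalition, coalition structure) pairs to worths. This captures externalities because a coalition's worth may depend on how the outsiders are organized. A minimal sketch with invented numbers (not from the thesis):

```python
# Toy partition function for players {1, 2, 3}. All worths are invented.
def fs(*players):
    return frozenset(players)

partition_function = {
    # grand coalition: efficient (total worth 10)
    (fs(1, 2, 3), frozenset({fs(1, 2, 3)})): 10.0,
    # {1,2} vs {3}: player 3 free rides on the cooperation of 1 and 2,
    # earning 5 alone -- more than an equal share of the grand coalition's 10.
    (fs(1, 2), frozenset({fs(1, 2), fs(3)})): 4.0,
    (fs(3),    frozenset({fs(1, 2), fs(3)})): 5.0,
    # all singletons
    (fs(1), frozenset({fs(1), fs(2), fs(3)})): 2.0,
    (fs(2), frozenset({fs(1), fs(2), fs(3)})): 2.0,
    (fs(3), frozenset({fs(1), fs(2), fs(3)})): 2.0,
}

def welfare(structure):
    """Total worth generated by a coalition structure."""
    return sum(partition_function[(c, structure)] for c in structure)

structures = {s for (_, s) in partition_function}
for s in sorted(structures, key=welfare, reverse=True):
    print([sorted(c) for c in s], welfare(s))
```

Even in this toy example the grand coalition maximizes total welfare (10 vs 9 vs 6), yet player 3's outside worth of 5 creates exactly the free-riding incentive against the efficient agreement that the chapter analyzes.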