869 results for Error-correcting codes (Information theory)
Abstract:
Software as a service (SaaS) is a service model in which applications are accessible from various client devices over the internet. Several studies report possible factors driving the adoption of SaaS, but none has considered the perception of SaaS features together with the pressures existing in the organization's environment. We propose an integrated research model that combines process virtualization theory (PVT) and institutional theory (INT). PVT seeks to explain whether SaaS processes are suitable for migration into virtual environments via an information technology-based mechanism. INT seeks to explain the effects of the institutionalized environment on the structure and actions of the organization. The research makes three contributions. First, it addresses a gap in the SaaS adoption literature by studying the internal perception of the technical features of SaaS and the external coercive, normative, and mimetic pressures faced by an organization. Second, it empirically tests many of the propositions of PVT and INT in the SaaS context, thereby helping to determine how these theories operate in practice. Third, the integration of PVT and INT contributes to the information systems (IS) discipline, deepening the applicability and strengths of these theories.
Abstract:
Information systems are widespread and used by individuals with computing devices as well as by corporations and governments. It is often the case that security leaks are introduced during the development of an application. The reasons for these security bugs are manifold, but among them is the fact that it is very hard to define and enforce relevant security policies in modern software. This is because modern applications often rely on container sharing and multi-tenancy, where, for instance, data can be stored in the same physical space but is logically mapped into different security compartments or data structures. In turn, these security compartments, into which data is classified by security policies, can themselves be dynamic and depend on runtime data. In this thesis we introduce and develop the novel notion of dependent information flow types, and focus on the problem of ensuring data confidentiality in data-centric software. Dependent information flow types fit within the standard framework of dependent type theory but, unlike usual dependent types, crucially allow the security level of a type, rather than just the structural data type itself, to depend on runtime values. Our dependent function and dependent sum information flow types provide a direct, natural, and elegant way to express and enforce fine-grained security policies on programs, namely programs that manipulate structured data types in which the security level of a structure field may depend on values dynamically stored in other fields. The main contribution of this work is an efficient analysis that allows programmers to verify, during the development phase, whether programs have information leaks, that is, whether programs protect the confidentiality of the information they manipulate. We also implemented a prototype typechecker, which can be found at http://ctp.di.fct.unl.pt/DIFTprototype/.
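To make the idea concrete, here is a minimal runtime sketch in Python (illustrative only; the thesis develops a static type system and typechecker, and all names below are hypothetical): a record in which the security level of one field depends on the runtime value of another, with a dynamic check standing in for the static guarantee.

```python
# Hypothetical sketch: a field's security level depends on another field's
# runtime value; a dynamic check stands in for dependent information flow
# types, which enforce this statically.
from dataclasses import dataclass

LEVELS = {"public": 0, "secret": 1}  # a two-point security lattice

@dataclass
class Labeled:
    value: object
    level: str  # "public" or "secret"

def patient_record(patient_id: int, vip: bool) -> dict:
    # The level of 'diagnosis' depends on the runtime value of 'vip':
    # VIP diagnoses are secret, ordinary ones are public.
    level = "secret" if vip else "public"
    return {"id": Labeled(patient_id, "public"),
            "diagnosis": Labeled("...", level)}

def flow(src: Labeled, dst_level: str) -> object:
    # Permit a flow only if it does not move data below its level.
    if LEVELS[src.level] > LEVELS[dst_level]:
        raise PermissionError(f"illegal flow: {src.level} -> {dst_level}")
    return src.value

rec = patient_record(42, vip=True)
flow(rec["id"], "public")              # fine: public -> public
try:
    flow(rec["diagnosis"], "public")   # rejected: secret -> public
except PermissionError as e:
    print(e)
```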
Abstract:
The ultimate goal of this study is to contribute to the discussion of the extent to which semantic and discourse concepts are syntactically encoded. More specifically, it investigates whether there is any consistent correlation between certain interpretive and syntactic aspects of four cleft constructions of European Portuguese, and how these potential correlations should be accounted for theoretically. The clefts considered are canonical clefts, pseudoclefts, é que clefts, and SER clefts. Syntactically, two types can be distinguished: biclausal clefts (canonical clefts and pseudoclefts) and monoclausal clefts (é que and SER clefts). All the structures have a focused cleft constituent, which may constitute either an informational focus or a contrastive focus, and a cleft clause that introduces an existential presupposition. In addition, the cleft constituent exhaustively identifies an empty position in the cleft clause. Alternative semantics for focus (Rooth 1985) is adopted, according to which intonational focus uniformly contributes a set of alternatives at Logical Form. Pragmatic rules operating on this set give rise to two implicatures that can be suspended: existential presupposition and exhaustivity. Given that é que and SER clefts have the same interpretation as non-cleft sentences, it is concluded that their particular syntactic structure does not contribute to these interpretive properties. By contrast, biclausal clefts, which are specificational copular sentences, have an existential presupposition and an exhaustive interpretation that cannot be suspended, just like non-cleft specificational sentences. It is argued that this is due to the fact that the cleft constituent identifies a variable introduced by a definite description. It is shown that the cleft clause, a relative clause in the complement position of a definite determiner in canonical clefts and a free relative in pseudoclefts, has the same denotation as a definite DP, and therefore carries an inherent existential presupposition. The exhaustive interpretation is due to the identificational relation between the cleft constituent and the definite description. Furthermore, it is argued that in European Portuguese a focus feature does not trigger A'-movement to a specialized FocP. Focused constituents move instead for reasons independent of focus. This is confirmed by the fact that only the cleft constituent of é que clefts has A'-movement properties; the others appear to be in situ. It is proposed that the cleft constituent of é que clefts is a topic with a focus feature that moves to a TopP. This analysis accounts for the existence of similar discourse restrictions on non-focused topics and on the cleft constituent of é que clefts. The quantificational focus feature carried along by topicalization gives rise to intervention effects, causing the non-recursivity of focus in the left periphery and its incompatibility with movement of other constituents bearing quantificational features. The analysis predicts the embedding restrictions observed for é que clefts. Finally, a syntactic analysis of SER clefts is developed that brings these structures close to structures with focus particles. It is proposed that the copula is a focus-sensitive operator that is merged together with the cleft constituent. The distributional restrictions on the copula are due to selectional requirements of heads.
Abstract:
The classical central limit theorem states the uniform convergence of the distribution functions of the standardized sums of independent and identically distributed square-integrable real-valued random variables to the standard normal distribution function. While the first versions of the central limit theorem are already due to de Moivre (1730) and Laplace (1812), a systematic study of this topic started at the beginning of the last century with the fundamental work of Lyapunov (1900, 1901). Meanwhile, extensions of the central limit theorem are available for a multitude of settings, including, e.g., Banach space valued random variables as well as substantial relaxations of the assumptions of independence and identical distributions. Furthermore, explicit error bounds have been established, and asymptotic expansions are employed to obtain better approximations. Classical error estimates like the famous bound of Berry and Esseen are stated in terms of absolute moments of the random summands and therefore do not reflect a potential closeness of the distributions of the single random summands to a normal distribution. Non-classical approaches take this issue into account by providing error estimates based on, e.g., pseudomoments. The latter field of investigation was initiated by the work of Zolotarev in the 1960s and is still in its infancy compared to the development of the classical theory. For example, non-classical error bounds for asymptotic expansions seem not to be available up to now ...
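For reference, the two classical statements discussed above can be written as follows (standard textbook formulations, not results specific to this thesis):

```latex
% Classical CLT: for i.i.d. X_1, X_2, ... with E X_i = \mu and Var X_i = \sigma^2 < \infty,
\[
  \sup_{x \in \mathbb{R}} \left| \Pr\!\left( \frac{S_n - n\mu}{\sigma\sqrt{n}} \le x \right) - \Phi(x) \right|
  \xrightarrow[n \to \infty]{} 0,
  \qquad S_n = X_1 + \dots + X_n .
\]
% Berry-Esseen bound: if additionally \rho = E|X_1 - \mu|^3 < \infty, then for an
% absolute constant C,
\[
  \sup_{x \in \mathbb{R}} \left| \Pr\!\left( \frac{S_n - n\mu}{\sigma\sqrt{n}} \le x \right) - \Phi(x) \right|
  \le \frac{C\,\rho}{\sigma^{3}\sqrt{n}} .
\]
```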
Abstract:
We construct estimates of educational attainment for a sample of OECD countries using previously unexploited sources. We follow a heuristic approach to obtain plausible time profiles for attainment levels by removing sharp breaks in the data that seem to reflect changes in classification criteria. We then construct indicators of the information content of our series and of a number of previously available data sets and examine their performance in several growth specifications. We find a clear positive correlation between data quality and the size and significance of human capital coefficients in growth regressions. Using an extension of the classical errors-in-variables model, we construct a set of meta-estimates of the coefficient of years of schooling in an aggregate Cobb-Douglas production function. Our results suggest that, after correcting for measurement error bias, the value of this parameter is well above 0.50.
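The measurement-error correction rests on the classical errors-in-variables attenuation result, given here in its standard textbook form (the paper's meta-estimation generalizes this setup):

```latex
% Classical errors-in-variables: the true regressor x_i^* is observed with
% independent noise, x_i = x_i^* + e_i. OLS of y on x is then attenuated by
% the reliability ratio \lambda:
\[
  y_i = \alpha + \beta x_i^{*} + u_i, \qquad
  \operatorname{plim}\,\hat{\beta}_{\mathrm{OLS}} = \beta \lambda, \qquad
  \lambda = \frac{\sigma_{x^{*}}^{2}}{\sigma_{x^{*}}^{2} + \sigma_{e}^{2}} < 1,
\]
% so that an estimate of \lambda permits recovering \beta from the attenuated
% OLS coefficient.
```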
Abstract:
We study markets where the characteristics or decisions of certain agents are relevant but not known to their trading partners. Assuming exclusive transactions, the environment is described as a continuum economy with indivisible commodities. We characterize incentive efficient allocations as solutions to linear programming problems and appeal to duality theory to demonstrate the generic existence of external effects in these markets. Because under certain conditions such effects may generate non-convexities, randomization emerges as a theoretical possibility. In characterizing market equilibria we show that, consistent with the personalized nature of transactions, prices are generally non-linear in the underlying consumption. On the other hand, external effects may have critical implications for market efficiency. With adverse selection, in fact, cross-subsidization across agents with different private information may be necessary for optimality, and so the market need not even achieve an incentive efficient allocation. In contrast, for the case of a single commodity, we find that when informational asymmetries arise after the trading period (e.g. moral hazard; ex post hidden types), external effects are fully internalized at a market equilibrium.
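For orientation, a generic finite-dimensional primal-dual pair of the kind appealed to (purely illustrative; the paper's programs are formulated over allocations in a continuum economy):

```latex
\[
  \text{(P)}\quad \max_{x \ge 0}\; c^{\top} x \;\;\text{s.t.}\;\; A x \le b
  \qquad\qquad
  \text{(D)}\quad \min_{y \ge 0}\; b^{\top} y \;\;\text{s.t.}\;\; A^{\top} y \ge c,
\]
% Under feasibility the optimal values coincide (strong duality), and the dual
% variables y act as shadow prices on the constraints.
```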
Abstract:
Drawing on social representations theory (SRT), this thesis examines the circulation and integration of scientific information into everyday thinking. As an alternative to the traditional approaches of science communication, it considers transformations between scientific and common-sense discourses as adaptive. Two studies, focused on the spreading of information into the media (Studies 1 and 2), show variations in the themes of discourses introduced to laypersons and in the themes among laypersons' discourses, according to different sources. Anchoring in prior positioning toward science is then studied for the explanation it provides of the reception and transmission of scientific information into common sense. Anchoring effects in prior attitudes and beliefs are reported in different contexts of circulation of scientific information (Studies 3 to 7), using results from correlational, field, and experimental studies. Overall, this thesis provides arguments for the relevance of SRT in science communication research and suggests theoretical and methodological developments for both domains of research.
Abstract:
Among the largest resources for biological sequence data is the large amount of expressed sequence tags (ESTs) available in public and proprietary databases. ESTs provide information on transcripts, but for technical reasons they often contain sequencing errors. Therefore, when analyzing EST sequences computationally, such errors must be taken into account. Earlier attempts to model error-prone coding regions have shown good performance in detecting and predicting coding regions while correcting sequencing errors using codon usage frequencies. In the research presented here, we improve the detection of translation start and stop sites by integrating a more complex mRNA model with codon-usage-bias-based error correction into one hidden Markov model (HMM), thus generalizing this error correction approach to more complex HMMs. We show that our method maintains the performance in detecting coding sequences.
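As a rough illustration of the underlying machinery (not the paper's model), the following is a generic Viterbi decoder for a toy two-state HMM in Python; the states, symbols, and probabilities are invented for the example, whereas the actual model works at the codon level with an error-correction component.

```python
import math

# Hypothetical toy HMM: two hidden states standing in for coding/noncoding
# regions; a real EST model would use codon-level states and an error model.
states = ["coding", "noncoding"]
start = {"coding": 0.5, "noncoding": 0.5}
trans = {"coding": {"coding": 0.9, "noncoding": 0.1},
         "noncoding": {"coding": 0.1, "noncoding": 0.9}}
emit = {"coding": {"A": 0.35, "C": 0.15, "G": 0.35, "T": 0.15},
        "noncoding": {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}}

def viterbi(seq):
    """Most probable state path for seq, computed in log space."""
    V = [{s: math.log(start[s]) + math.log(emit[s][seq[0]]) for s in states}]
    back = []
    for sym in seq[1:]:
        col, ptr = {}, {}
        for s in states:
            # Best predecessor for state s at this position.
            prev, score = max(
                ((p, V[-1][p] + math.log(trans[p][s])) for p in states),
                key=lambda t: t[1])
            col[s] = score + math.log(emit[s][sym])
            ptr[s] = prev
        V.append(col)
        back.append(ptr)
    # Trace back from the best final state.
    path = [max(states, key=lambda s: V[-1][s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

print(viterbi("ATGGGAGGTTTTAA"))
```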
Abstract:
We propose an elementary theory of wars fought by fully rational contenders. Two parties play a Markov game that combines stages of bargaining with stages where one side has the ability to impose surrender on the other. Under uncertainty and incomplete information, in the unique equilibrium of the game, long confrontations occur: war arises when reality disappoints initial (rational) optimism, and it persists longer when both agents are optimists but reality proves both wrong. Bargaining proposals that are rejected initially might eventually be accepted after several periods of confrontation. We provide an explicit computation of the equilibrium, evaluating the probability of war and its expected losses as a function of i) the costs of confrontation, ii) the asymmetry of the split imposed under surrender, and iii) the strengths of the contenders at attack and defense. Changes in these parameters display non-monotonic effects.
Abstract:
Adverse selection may thwart trade between an informed seller, who knows the probability p that an item of antiquity is genuine, and an uninformed buyer, who does not know p. The buyer might not be wholly uninformed, however. Suppose he can perform a simple inspection, a test of his own: the probability that an item passes the test is g if the item is genuine, but only f < g if it is fake. Given that the buyer is no expert, his test may have little power: f may be close to g. Unfortunately, without much power, the buyer's test will not resolve the difficulty of adverse selection; gains from trade may remain unexploited. But now consider a "store", where the seller groups a number of items, perhaps all with the same quality, the same probability p of being genuine. (We show that in equilibrium the seller will choose to group items in this manner.) Now the buyer can conduct his test across a large sample, perhaps all, of a group of items in the seller's store. He can thereby assess the overall quality of these items; he can invert the aggregate of his test results to uncover the underlying p; he can form a "prior". There is thus no longer asymmetric information between seller and buyer: gains from trade can be exploited. This is our theory of retailing: by grouping items together - setting up a store - a seller is able to supply buyers with priors, as well as the items themselves. We show that the weaker the power of the buyer's test (the closer f is to g), the greater the seller's profit. So the seller has no incentive to assist the buyer, e.g., by performing her own tests on the items, or by cleaning them to reveal more about their true age. The paper ends with an analysis of which sellers should specialise in which qualities. We show that quality will be low in busy locations and high in expensive locations.
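The "inversion" step can be spelled out (an elementary frequency-estimation argument, added here for clarity): if a fraction r of a large group of items passes the test, then, since genuine items pass with probability g and fakes with probability f,

```latex
\[
  r \approx p\,g + (1 - p)\,f
  \quad\Longrightarrow\quad
  \hat{p} = \frac{r - f}{g - f},
\]
% which is well defined only when g > f; as f approaches g the estimator
% becomes increasingly noisy, which is why the power of the test matters.
```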
Abstract:
It has been alleged that J M Keynes, quoting in the General Theory a passage from J S Mill's Principles, misunderstood the passage in question and was therefore wrong to cite Mill as an upholder of the 'classical' proposition that 'supply creates its own demand'. We believe that, although Keynes was admittedly in error with respect to, so-to-say, the 'letter' of Mill's exposition, he did not mislead readers as to the 'substance' of Mill's conception. The purpose of this paper is to demonstrate that J S Mill did indeed stand for a 'classical' position, vulnerable to Keynes's critique as developed in the General Theory. [This is a revised version of an earlier working paper: 'Keynes, Mill and Say's Law', Strathclyde Papers in Economics, 2000/11]
Abstract:
The remarkable increases in trade flows and in migratory flows of highly educated people are two important features of the globalization of the last decades. This paper extends a two-country model of inter- and intraindustry trade to a rich environment featuring technological differences, skill differences and the possibility of international labor mobility. The model is used to explain the patterns of trade and migration as countries remove barriers to trade and to labor mobility. We parameterize the model to match the features of the Western and Eastern European members of the EU and analyze first the effects of the trade liberalization which occurred between 1989 and 2004, and then the gains and losses from migration which are expected to occur if legal barriers to labor mobility are substantially reduced. The lower barriers to migration would result in significant migration of skilled workers from Eastern European countries. Interestingly, this would not only benefit the migrants and most Western European workers but, via trade, it would also benefit the workers remaining in Eastern Europe. Key Words: Skilled Migration, Gains from Variety, Real Wages, Eastern-Western Europe. JEL Codes: F12, F22, J61.
Abstract:
Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, strategic situations in which the players choose only once and simultaneously, and dynamic games, strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. A dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices that have been made so far; in the case of imperfect information, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. Firstly, games are modelled by so-called form structures, which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in more detail as a tree. In fact, it is standard to formalize static games with the normal form and dynamic games with the extensive form. Secondly, solution concepts are developed to solve models of games, in the sense of identifying the choices that should be taken by rational players. Indeed, the ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain that it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character. This rather recent discipline analyzes the relation between knowledge, belief, and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions, as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players: before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states, on which players base their decisions, are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated; Aumann's sufficient conditions for backward induction are also presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account of dynamic games is introduced, a type-based epistemic model is extended with a notion of agent connectedness, and sufficient conditions for backward induction are then derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated; in particular, the epistemic-topological operator limit knowledge is defined and some implications for games are considered. In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened, in the sense that possible contexts are provided in which agents can indeed agree to disagree.
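Backward induction itself, which the sufficient conditions in Chapters 1 and 3 concern, can be sketched generically; the following Python fragment solves a toy perfect-information game tree whose structure and payoffs are invented for illustration.

```python
# Hypothetical sketch: backward induction on a finite perfect-information
# game tree. Internal nodes name the player to move; leaves hold a payoff
# tuple (player 0's payoff, player 1's payoff).

def backward_induction(node):
    """Return (payoffs, action path) obtained by rational play from node."""
    if "payoffs" in node:                      # leaf node
        return node["payoffs"], []
    player = node["player"]
    best = None
    for action, child in node["children"].items():
        payoffs, path = backward_induction(child)
        # The mover keeps the action maximizing her own payoff.
        if best is None or payoffs[player] > best[0][player]:
            best = (payoffs, [action] + path)
    return best

# A two-stage game: player 0 moves first, then player 1 replies.
game = {"player": 0, "children": {
    "L": {"player": 1, "children": {
        "l": {"payoffs": (2, 1)},
        "r": {"payoffs": (0, 0)}}},
    "R": {"player": 1, "children": {
        "l": {"payoffs": (3, 0)},
        "r": {"payoffs": (1, 2)}}}}}

payoffs, path = backward_induction(game)
print(path, payoffs)  # ['L', 'l'] (2, 1): player 0 anticipates player 1's reply
```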
Abstract:
We study the interaction between nonprice public rationing and prices in the private market. Under a limited budget, the public supplier uses a rationing policy. A private firm may supply the good to those consumers who are rationed by the public system. Consumers have different amounts of wealth, and the costs of providing the good to them vary. We consider two regimes. In the first, the public supplier observes consumers' wealth information; in the second, it observes both wealth and cost information. The public supplier chooses a rationing policy and, simultaneously, the private firm, observing only cost but not wealth information, chooses a pricing policy. In the first regime, there is a continuum of equilibria. The Pareto-dominant equilibrium is a means-test equilibrium: poor consumers are supplied while rich consumers are rationed. Prices in the private market increase with the budget. In the second regime, there is a unique equilibrium. This exhibits a cost-effectiveness rationing rule: consumers are supplied if and only if their cost-benefit ratios are low. Prices in the private market do not change with the budget. Equilibrium consumer utility is higher in the cost-effectiveness equilibrium than in the means-test equilibrium.
Abstract:
In population surveys of exposure to medical X-rays, both the frequency of examinations and the effective dose per examination are required. The use of the Swiss medical tariffication system (TARMED) for establishing the frequency of X-ray medical examinations was explored. The method was tested for radiography examinations performed in 2008 at the Lausanne University Hospital. The annual numbers of radiographies determined from the "TARMED" database are in good agreement with the figures extracted from the local RIS (Radiology Information System). The "TARMED" is a reliable and fast method for establishing the frequency of radiography examinations, provided that the context in which the "TARMED" code is used is respected. In addition, this billing context provides most valuable information on the average number of radiographs per examination as well as on the age and sex distributions. Radiographies represent the major part of X-ray examinations and are performed by about 4,000 practices and hospitals in Switzerland. This method therefore has the potential to drastically simplify the organisation of nationwide surveys. There are still some difficulties to overcome if the method is to be used to assess the frequency of computed tomography or fluoroscopy examinations, the procedures that deliver most of the radiation dose to the population; this is due to the poor specificity of "TARMED" codes concerning these modalities. However, the use of CT and fluoroscopy installations is easier to monitor using conventional survey methods since there are fewer centres. Ways to overcome the "TARMED" limitations for these two modalities are still being explored.