903 results for Information theory.
Abstract:
A theory is provided for the detection efficiency of diffuse light whose frequency is modulated by an acoustical wave. We derive expressions for the speckle pattern of the modulated light, as well as an expression for the signal-to-noise ratio for the detector. The aim is to develop a new imaging technology for detection of tumors in humans. The acoustic wave is focused into a small geometrical volume, which provides the spatial resolution for the imaging. The wavelength of the light wave can be selected to provide information regarding the kind of tumor.
Abstract:
We develop a unifying theory of hypoxia tolerance based on information from two cell-level models (brain cortical cells and isolated hepatocytes) from the highly anoxia-tolerant aquatic turtle and from other, more hypoxia-sensitive systems. We propose that the response of hypoxia-tolerant systems to oxygen lack occurs in two phases (defense and rescue). The first lines of defense against hypoxia include a balanced suppression of ATP-demand and ATP-supply pathways; this regulation stabilizes adenylate concentrations at new steady-state levels even while ATP turnover rates greatly decline. The ATP demands of ion pumping are down-regulated by generalized "channel" arrest in hepatocytes and by "spike" arrest in neurons. Hypoxic ATP demands of protein synthesis are down-regulated, probably by translational arrest. In hypoxia-sensitive cells this translational arrest seems irreversible, but hypoxia-tolerant systems activate "rescue" mechanisms if the period of oxygen lack is extended, by preferentially regulating the expression of several proteins. In these cells, a cascade of processes underpinning hypoxia rescue and defense begins with an oxygen sensor (a heme protein) and a signal-transduction pathway, which leads to significant gene-based metabolic reprogramming (the rescue process), with maintained down-regulation of energy-demand and energy-supply pathways in metabolism throughout the hypoxic period. This recent work begins to clarify how normoxic maintenance ATP turnover rates can be drastically (10-fold) down-regulated to a new hypometabolic steady state, which is a prerequisite for surviving prolonged hypoxia or anoxia. The implications of these developments are extensive in biology and medicine.
Abstract:
This paper tests the existence of ‘reference dependence’ and ‘loss aversion’ in students’ academic performance. Accordingly, achieving a worse-than-expected academic result would have a much stronger effect on students’ (dis)satisfaction than obtaining a better-than-expected grade. Although loss aversion is a well-established finding, some authors have demonstrated that it can be moderated (diminished, to be precise). Within this line of research, we also examine whether the students’ emotional response (satisfaction/dissatisfaction) to their performance can be moderated by different musical stimuli. We design an experiment through which we test loss aversion in students’ performance under three conditions: ‘classical music’, ‘heavy music’ and ‘no music’. The empirical application supports the reference-dependence and loss-aversion hypotheses (significant at p < 0.05), and the musical stimuli do have an influence on the students’ satisfaction with their grades (at p < 0.05). Analyzing students’ perceptions is vital to understanding how they process information. In particular, knowing the elements that can favour not only students’ academic performance but also their attitude towards certain results is fundamental. This study demonstrates that musical stimuli can modify the perception of a given academic result: the effects of ‘positive’ and ‘negative’ surprises are larger or smaller not only as a function of the size of these surprises, but also according to the musical stimulus received.
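The abstract does not specify a functional form, but the asymmetry it tests is conventionally captured by the Kahneman-Tversky value function; the sketch below is only an illustration of that standard formulation, and the exponents and loss-aversion coefficient are the classic Tversky and Kahneman (1992) estimates, not parameters from this study.

```python
# A minimal sketch of the prospect-theory value function, one standard way to
# formalize reference dependence and loss aversion. Parameters are the classic
# Tversky & Kahneman (1992) estimates, used here purely for illustration.

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value of a deviation x from the reference (expected) grade.

    x > 0: grade better than expected (a "gain")
    x < 0: grade worse than expected (a "loss")
    lam > 1 encodes loss aversion: losses loom larger than equal-sized gains.
    """
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

# A one-point shortfall hurts more than a one-point surplus pleases:
print(value(+1.0))   #  1.00
print(value(-1.0))   # -2.25
```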
Abstract:
The modeling of complex dynamic systems depends on the solution of a system of differential equations. Problems appear when the mathematical expressions of those equations are unknown, even though enough numerical data on the system variables are available. The authors think that it is very important to establish a code between the different languages involved, allowing information to be coded and decoded. Coding permits us to reduce the study of some objects to that of others. The mathematical expressions used to model certain variables of the system are complex, so it is convenient to define an alphabet code determining the correspondence between these equations and words in the alphabet. In this paper the authors present an introduction to the coding and decoding of complex structural system models.
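As a purely illustrative sketch of the idea of an alphabet code, the toy codebook below (all symbols and terms are invented, not taken from the paper) maps hypothetical terms of a differential-equation model to letters, so that a model can be encoded as a word and decoded back.

```python
# A toy sketch of an "alphabet code": a bijection between terms of a model
# language (here, hypothetical differential-equation terms) and letters of a
# chosen alphabet, so models can be encoded as words and decoded back.
# The codebook below is invented purely for illustration.

CODEBOOK = {
    "dx/dt": "A",
    "x": "B",
    "y": "C",
    "+": "D",
    "-": "E",
    "k*": "F",
}
DECODEBOOK = {letter: term for term, letter in CODEBOOK.items()}

def encode(terms):
    """Map a sequence of model terms to a word over the alphabet."""
    return "".join(CODEBOOK[t] for t in terms)

def decode(word):
    """Recover the sequence of model terms from an encoded word."""
    return [DECODEBOOK[ch] for ch in word]

model = ["dx/dt", "+", "k*", "x", "-", "y"]   # stands for dx/dt + k*x - y
word = encode(model)                           # "ADFBEC"
assert decode(word) == model
print(word)
```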
Abstract:
The mathematical models of complex reality are texts belonging to a certain literature written in a semi-formal language, denominated L(MT) by the authors, whose mathematical-linguistic laws have been previously defined. Such a text possesses a linguistic entropy that is the reflection of the physical entropy of the real-world processes the text describes. Through the temperature of information defined by Mandelbrot, the authors begin a text-reality thermodynamic theory that leads to the existence of information attractors, or highly structured points, establishing a heterogeneity of the text space analogous to that of ontological space and completing the well-known Saint Matthew law of the General Theory of Systems formulated by Margalef: "To the one who has, more will be given; and from the one who has not, even the little he possesses will be taken away."
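As a minimal, hedged illustration of linguistic entropy, the sketch below computes the Shannon entropy of a text's symbol distribution; it is only a stand-in and does not reproduce the authors' L(MT) formalism or Mandelbrot's temperature of information.

```python
# A minimal sketch of "linguistic entropy" as the Shannon entropy of a text's
# character distribution. This is a standard information-theoretic measure and
# only a proxy for the authors' construction.
from collections import Counter
from math import log2

def shannon_entropy(text):
    """Shannon entropy (bits per symbol) of the empirical symbol distribution."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Entropy of a short model-like text (the string itself is arbitrary):
print(shannon_entropy("dx/dt = -k*x + u(t)"))
```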
Abstract:
Outliers are objects that show abnormal behavior with respect to their context or that have unexpected values in some of their parameters. In decision-making processes, information quality is of the utmost importance. In specific applications, an outlying data element may represent an important deviation in a production process or a damaged sensor. Therefore, the ability to detect these elements could make the difference between making a correct and an incorrect decision. This task is complicated by the large sizes of typical databases. Due to their importance in search processes in large volumes of data, researchers pay special attention to the development of efficient outlier detection techniques. This article presents a computationally efficient algorithm for the detection of outliers in large volumes of information. This proposal is based on an extension of the mathematical framework upon which the basic theory of detection of outliers, founded on Rough Set Theory, has been constructed. From this starting point, current problems are analyzed; a detection method is proposed, along with a computational algorithm that allows the performance of outlier detection tasks with an almost-linear complexity. To illustrate its viability, the results of the application of the outlier-detection algorithm to the concrete example of a large database are presented.
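The abstract does not detail the algorithm itself, so the sketch below is only a simple illustration in the spirit of rough-set outlier detection, not the paper's method: records are grouped into equivalence classes under the indiscernibility relation induced by their attribute values, and records falling into very small classes are flagged; hash-based grouping keeps the pass over the data near-linear. The attribute names and the support threshold are assumptions.

```python
# A simple sketch in the spirit of rough-set outlier detection (not the paper's
# algorithm): objects are grouped into equivalence classes under the
# indiscernibility relation induced by their attribute values, and objects in
# very small classes are flagged as outlier candidates. Hash-based grouping
# keeps the pass over the data near-linear in the number of records.
from collections import defaultdict

def indiscernibility_classes(records, attributes):
    """Group record indices by their values on the chosen attributes."""
    classes = defaultdict(list)
    for i, rec in enumerate(records):
        key = tuple(rec[a] for a in attributes)
        classes[key].append(i)
    return classes

def outlier_candidates(records, attributes, min_support=2):
    """Indices of records whose equivalence class has fewer than min_support members."""
    classes = indiscernibility_classes(records, attributes)
    return [i for members in classes.values() if len(members) < min_support
            for i in members]

data = [
    {"sensor": "A", "status": "ok"},
    {"sensor": "A", "status": "ok"},
    {"sensor": "B", "status": "ok"},
    {"sensor": "B", "status": "ok"},
    {"sensor": "B", "status": "fault"},   # rare combination -> flagged
]
print(outlier_candidates(data, ["sensor", "status"]))  # [4]
```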
Abstract:
The theory of deliberate practice (Ericsson, Krampe, & Tesch-Römer, 1993) is predicated on the concept that the engagement in specific forms of practice is necessary for the attainment of expertise. The purpose of this paper was to examine the quantity and type of training performed by expert UE triathletes. Twenty-eight UE triathletes were stratified into expert, middle of the pack, and back of the pack groups based on previous finishing times. All participants provided detailed information regarding their involvement in sports in general and the three triathlon sports in particular. Results illustrated that experts performed more training than non-experts but that the relationship between training and performance was not monotonic as suggested by Ericsson et al. Further, experts' training was designed so periods of high training stress were followed by periods of low stress. However, early specialization was not a requirement for expertise. This work indicates that the theory of deliberate practice does not fully explain expertise development in UE triathlon.
Abstract:
In this project we review the effects of reputation within the context of game theory. This is done through a study of two key papers. First, we examine a paper by Fudenberg and Levine, Reputation and Equilibrium Selection in Games with a Patient Player (1989). We add to this a review of Gossner’s Simple Bounds on the Value of a Reputation (2011). We look specifically at scenarios in which a long-run player faces a series of short-run opponents, and at how the former may develop a reputation. In turn, we show how reputation leads directly to both lower and upper bounds on the long-run player’s payoffs.
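Neither paper's formal results are reproduced here; the toy simulation below only illustrates the counting argument behind such payoff bounds: whenever short-run players assign the commitment action a probability below their best-response threshold and then observe it, their posterior on the commitment type must grow by a fixed factor, so the number of such periods is bounded. All parameter values are illustrative assumptions.

```python
# A toy sketch (not the formal results of Fudenberg-Levine or Gossner) of the
# counting argument behind reputation bounds. A long-run player always plays
# the commitment action. Short-run players hold prior mu0 on the "commitment
# type" and conjecture that the normal type plays the commitment action with
# probability p each period. Whenever their predicted probability q of that
# action is below the threshold pi needed to best respond, observing the action
# multiplies the posterior by 1/q > 1/pi, so the number of such "surprise"
# periods is at most ln(1/mu0) / ln(1/pi).
from math import log

def surprise_periods(mu0=0.05, p=0.3, pi=0.6, horizon=200):
    mu = mu0
    surprises = 0
    for _ in range(horizon):
        q = mu + (1.0 - mu) * p          # predicted prob. of commitment action
        if q < pi:
            surprises += 1               # short-run player fails to best respond
        mu = mu / q                      # Bayes update after seeing the action
    return surprises

bound = log(1.0 / 0.05) / log(1.0 / 0.6)
print(surprise_periods(), "<=", round(bound, 2))   # 3 <= 5.86
```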
Abstract:
Objectives: This article further examines the phenomenon of aggression inside barrooms by relying on the “bouncer-ethnographer” methodology. The objective is to investigate variations in aggression through time and space according to the role and routine of the target in a Montreal barroom. It thus provides an examination of routine activity theory at the micro level: the barroom. Methods: Over a period of 258 nights of observation in a Canadian barroom, bouncers completed reports on each intervention and provided specific information regarding what happened, when, and where within the venue. In addition, the bouncer-ethnographer compiled field observations and interviews with bar personnel in order to identify aggression hotspots and “rush hours” for three types of actors within barrooms: (a) bouncers, (b) barmaids and (c) patrons. Findings: Three different patterns of shifting aggression hotspots emerged depending on the target. As the night progresses, aggressive incidents between patrons, towards barmaids and towards bouncers have specific hotspots and rush hours influenced by the specific routine of the target inside the barroom. Implications: The current findings enrich those of previous work by pointing to the relevance of examining not only the environmental characteristics of the barroom but also the role of the target of aggression. Crime opportunities follow routine activities, even within a specific location at the micro level. Routine activity theory is thus relevant in this context: as actors in differing roles follow differing routines, their patterns of victimization differ accordingly.
Abstract:
The Crimean operation has served as an occasion for Russia to demonstrate to the entire world the capabilities and the potential of information warfare. Its goal is to use difficult-to-detect methods to subordinate the elites and societies of other countries by making use of various kinds of covert and overt channels (secret services, diplomacy and the media), psychological impact, and ideological and political sabotage. Russian politicians and journalists have argued that information battles are necessary for “the Russian/Eurasian civilisation” to counteract “informational aggression from the Atlantic civilisation led by the USA”. This argument from the arsenal of applied geopolitics has been used for years. This text is an attempt to interpret information warfare against the background of Russian geopolitical theory and practice.
Abstract:
This BEER addresses informational barriers to energy efficiency. It is a widely acknowledged result that an energy efficiency gap exists, implying that the level of energy efficiency is inefficiently low. Several barriers to energy efficiency create this gap, and the presence of asymmetric information is likely to be one such barrier. In this article a theoretical framework is presented addressing the issues of moral hazard and adverse selection related to energy efficiency. Based on the theoretical framework, European policies on energy efficiency are evaluated. The article is divided into two main parts. The first part presents the theory of information asymmetries and their consequences for energy efficiency, focusing on the problems of moral hazard and adverse selection. Having established a theoretical framework for understanding the agency barriers to energy efficiency, the second part evaluates the policies of the European Union on energy efficiency. The BEER finds that problems of moral hazard and adverse selection can indeed help explain the seemingly low levels of energy efficiency. In both models presented, the principal's cost of implementing a high energy-efficiency outcome increases with the informational asymmetries. The theory reveals two implications for policies on energy efficiency. First, measures enabling contractual parties to base remuneration on energy performance must be developed further, and second, information on technologies and the education of consumers and installers on energy efficiency must be increased. This could be complemented with certification of installers and energy efficiency advisors to enable consumers to select good agents. Finally, it is found that the preferred EU policy instrument on energy efficiency, so far, seems to be the use of minimum requirements. Less used in EU legislation are measuring and verification as well as certifications. Therefore, it is concluded that the EU should consider an increased use of these instruments, and in particular focus on a further development of standards on measurability and verification as well as an increased focus on the education of consumers, installers and advisors on energy efficiency.
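The BEER's own models are not reproduced here; the sketch below is a textbook two-effort, two-outcome moral hazard example showing the mechanism the abstract describes: when effort is unobservable, the principal must load risk onto a risk-averse agent (for example, an installer paid on measured energy performance), and the expected wage cost of implementing the high-efficiency outcome rises above the full-information benchmark. All parameter values are assumptions.

```python
# A textbook two-effort, two-outcome moral hazard sketch (not the models in the
# BEER itself). The agent has utility sqrt(w) - effort_cost and reservation
# utility u_bar; all parameter values below are illustrative assumptions.

u_bar, c = 1.0, 0.5          # reservation utility, cost of high effort
p_H, p_L = 0.8, 0.4          # prob. of an "efficient" outcome under high/low effort

# First best: effort is observable, so a flat wage suffices (no risk imposed).
w_first_best = (u_bar + c) ** 2

# Second best: effort is unobservable; pay must depend on the observed outcome.
# With the participation (IR) and incentive (IC) constraints binding:
#   sqrt(w_s) - sqrt(w_f) = c / (p_H - p_L)
#   p_H*sqrt(w_s) + (1 - p_H)*sqrt(w_f) = u_bar + c
spread = c / (p_H - p_L)
root_w_f = u_bar + c - p_H * spread
root_w_s = root_w_f + spread
expected_wage = p_H * root_w_s ** 2 + (1 - p_H) * root_w_f ** 2

print(f"first-best wage cost:       {w_first_best:.3f}")   # 2.250
print(f"moral-hazard expected cost: {expected_wage:.3f}")   # 2.500 (> first best)
```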
Abstract:
Mixed enterprises, which are entities jointly owned by the public and private sector, are spreading all over Europe in local utilities. Well aware that in the vast majority of cases the preference of local authorities towards such governance structure is determined by practical reasons rather than by the ambition to implement new regulatory designs (an alternative to the typical “external” regulation), our purpose is to confer some scientific value to this phenomenon which has not been sufficiently investigated in the economic literature. This paper aims at proposing an economic analysis of mixed enterprises, especially of the specific configuration in which the public partner acts as controller and the private one (or “industrial” partner) as service provider. We suggest that the public service concession to mixed enterprises could embody, under certain conditions, a noteworthy substitute to the traditional public provision and the concession to totally private enterprises, as it can push regulated operators to outperform and limit the risk of private opportunism. The starting point of the entire analysis is that ownership allows the (public) owner to gather more information about the actual management of the firm, according to property rights theory. Following this stream of research, we conclude that under certain conditions mixed enterprises could significantly reduce asymmetric information between regulators and regulated firms by implementing a sort of “internal” regulation. With more information, in effect, the public authority (as owner/controller of the regulated firm, but also as member of the regulatory agency) can stimulate the private operator to be more efficient and can monitor it more effectively with respect to the fulfilment of contractual obligations (i.e., public service obligations, quality standards, etc.). Moreover, concerning the latter function, the board of directors of the mixed enterprise can be the suitable place where public and private representatives (respectively, welfare and profit maximisers) can meet to solve all disputes arising from incomplete contracts, without recourse to third parties. Finally, taking into account that a disproportionate public intervention in the “private” administration (or an ineffective protection of the general interest) would imply too many drawbacks, we draw some policy implications that make an equitable debate on the board of the firm feasible. Some empirical evidence is taken from the Italian water sector.
Abstract:
The emergence of widespread offshoring of information-intensive services is arguably one of the more impactful phenomena to transform business in the last ten years. A growing body of research has examined the firm-level drivers and location factors (i.e., the why's and where's) of services offshoring. However, little empirical research has examined the maturation sequencing (or when's) of services offshoring. Adopting industry life cycle theory as a framework, the key research questions examined in the paper are: when do different categories of offshoring services provision change from being emergent sectors to more mature ones, and how does the timing of this sequence relate to the type of service offshored? Using a database of 1420 offshore services FDI projects, we find that the value-add as well as the information sensitivity of the service category are related to when the service categories progress through the industry life cycle. Implications for future waves of service offshoring are discussed.
Abstract:
"May 1982."
Abstract:
Final report; July 1978.