27 results for One-shot information theory
at Instituto Politécnico do Porto, Portugal
Abstract:
Electricity markets are complex environments, involving a large number of different entities playing in a dynamic scene to obtain the best advantages and profits. MASCEM is a multi-agent electricity market simulator that models market players and simulates their operation in the market. Market players are entities with specific characteristics and objectives, making their decisions and interacting with other players. MASCEM provides several dynamic strategies for agents' behaviour. This paper presents a method that aims to provide market players with strategic bidding capabilities, allowing them to obtain the highest possible gains out of the market. This method uses an auxiliary forecasting tool, e.g. an Artificial Neural Network, to predict electricity market prices, and analyses its forecasting error patterns. By recognizing the occurrence of such patterns, the method predicts the expected error for the next forecast and uses it to adapt the actual forecast. The goal is to approximate the forecast to the real value, reducing the forecasting error.
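A minimal sketch of the error-pattern idea described above, assuming a nearest-neighbour match on a sliding window of past forecast errors; the function name, the window length and the correction rule are illustrative, not the paper's actual method.

```python
import numpy as np

def pattern_corrected_forecast(raw_forecast, past_errors, window=5):
    """Adjust a price forecast using the error that followed the most
    similar past error pattern (nearest-neighbour match on a sliding window)."""
    errors = np.asarray(past_errors, dtype=float)
    if errors.size <= window:
        return raw_forecast  # not enough history to recognise a pattern
    recent = errors[-window:]
    best_dist, expected_error = np.inf, 0.0
    # Compare the most recent error window with every earlier window
    for start in range(errors.size - window):
        candidate = errors[start:start + window]
        dist = np.linalg.norm(candidate - recent)
        if dist < best_dist:
            best_dist = dist
            expected_error = errors[start + window]  # error that followed that pattern
    # If the forecaster is expected to overshoot by `expected_error`, remove it
    return raw_forecast - expected_error

# Hypothetical usage: errors = real_price - ann_forecast for past periods
history = [1.2, -0.4, 0.9, 1.1, -0.3, 0.8, 1.3, -0.5]
print(pattern_corrected_forecast(52.0, history, window=3))
```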
Abstract:
Power law distributions, also known as heavy-tail distributions, model distinct real-life phenomena in areas such as biology, demography, computer science, economics, information theory, language, and astronomy, amongst others. In this paper, we present a review of the literature, focusing on applications and possible explanations for the occurrence of power laws in real phenomena. We also unravel some controversies around power laws.
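For reference, a power law has density p(x) ∝ x^(−α) for x ≥ x_min. A small sketch, assuming the standard continuous maximum-likelihood estimator α̂ = 1 + n / Σ ln(x_i / x_min); the sampling routine and parameter values are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_power_law(n, alpha, x_min=1.0):
    """Inverse-transform sampling from p(x) ~ x^(-alpha), x >= x_min."""
    u = rng.random(n)
    return x_min * (1.0 - u) ** (-1.0 / (alpha - 1.0))

def mle_exponent(x, x_min=1.0):
    """Continuous maximum-likelihood estimate of the power-law exponent."""
    x = np.asarray(x, dtype=float)
    x = x[x >= x_min]
    return 1.0 + x.size / np.sum(np.log(x / x_min))

data = sample_power_law(50_000, alpha=2.5)
print(round(mle_exponent(data), 2))  # should be close to 2.5
```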
Abstract:
This study addresses deoxyribonucleic acid (DNA) and proposes a procedure based on the association of statistics, information theory, signal processing, Fourier analysis and fractional calculus for describing fundamental characteristics of DNA. In a first phase, the 24 human chromosomes are evaluated. In a second phase, 10 chromosomes from different species are also processed and the results compared. The results reveal invariance in the description and close resemblances with fractional Brownian motion.
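A minimal sketch of the kind of signal-processing step described: map the nucleotide sequence to a numeric signal, take the Fourier power spectrum, and estimate the exponent β of a 1/f^β trend (for fractional Brownian motion, β = 2H + 1). The numeric mapping and the log-log fit below are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

# Illustrative (not the paper's) numeric mapping of nucleotides
MAPPING = {"A": 1.0, "C": -1.0, "G": 2.0, "T": -2.0}

def spectral_exponent(sequence):
    """Estimate beta in a 1/f^beta fit to the power spectrum of a DNA signal."""
    signal = np.array([MAPPING.get(s, 0.0) for s in sequence.upper()])
    signal = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size)
    mask = freqs > 0  # drop the zero-frequency bin before taking logs
    slope, _ = np.polyfit(np.log(freqs[mask]), np.log(spectrum[mask]), 1)
    return -slope  # beta; for fBm-like signals beta = 2H + 1

random_dna = "".join(np.random.default_rng(1).choice(list("ACGT"), 4096))
print(round(spectral_exponent(random_dna), 2))  # near 0 for a random (white-noise-like) sequence
```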
Abstract:
Many of the most common human functions, such as temporal and non-monotonic reasoning, have not yet been fully mapped into developed systems, even though some theoretical breakthroughs have already been accomplished. This is mainly due to the inherent computational complexity of the theoretical approaches. In the particular area of fault diagnosis in power systems, however, some systems that tried to solve the problem have been deployed using methodologies such as production-rule-based expert systems, neural networks, recognition of chronicles, fuzzy expert systems, etc. SPARSE (from the Portuguese acronym, which means expert system for incident analysis and restoration support) was one of the developed systems, and in the course of its development came the need to cope with incomplete and/or incorrect information, as well as with the traditional problems of power system fault diagnosis based on SCADA (supervisory control and data acquisition) information retrieval, namely real-time operation, huge amounts of information, etc. This paper presents an architecture for a decision support system that can solve the presented problems, using a symbiosis of the event calculus and the default reasoning rule-based system paradigms, ensuring soft real-time operation with the ability to handle incomplete, incorrect or domain-incoherent information. A prototype implementation of this system is already at work in the control centre of the Portuguese Transmission Network.
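A toy sketch, loosely inspired by the problem described above, of rule-based diagnosis over timed SCADA events with a default conclusion when an expected protection message is missing; the event kinds, time window and rules are hypothetical and far simpler than the event-calculus/default-reasoning formalisation used in SPARSE.

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str      # e.g. "protection_trip", "breaker_open"
    element: str   # network element, e.g. "line_L1"
    time: float    # SCADA timestamp in seconds

def diagnose(events, window=2.0):
    """Toy default-reasoning rule: a protection trip followed by a breaker
    opening within `window` seconds suggests a fault on that element; a
    breaker opening with no trip message is assumed, by default, to be a
    fault with a lost protection message."""
    trips = {e.element: e.time for e in events if e.kind == "protection_trip"}
    conclusions = []
    for e in events:
        if e.kind != "breaker_open":
            continue
        t_trip = trips.get(e.element)
        if t_trip is not None and 0 <= e.time - t_trip <= window:
            conclusions.append(f"fault on {e.element} (confirmed by protection)")
        else:
            conclusions.append(f"fault on {e.element} assumed; protection message missing")
    return conclusions

log = [Event("protection_trip", "line_L1", 10.0),
       Event("breaker_open", "line_L1", 10.4),
       Event("breaker_open", "line_L2", 15.0)]
print(diagnose(log))
```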
Abstract:
To select each node by device and by context in urban computing, users have to put their plan information and their requests into a computing environment (e.g. PDAs, smart devices, laptops) in advance, and they will try to keep optimized states between the users and the computing environment. However, because of bad contexts, users may reach the wrong decision, so one of the users' demands may be a request for a good server with higher security. To address this issue, we define the structure of Dynamic State Information (DSI), which captures security-related factors in the sending/receiving contexts and supports selecting the best server during user movement, based on the server quality and security states held in the DSI. Finally, whenever some information changes, users and devices receive notices including the security factors, making an automatic reaction possible; therefore all users can safely use all devices in urban computing.
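A hypothetical sketch of what a DSI record and a quality-plus-security server selection might look like; the fields, the weighting and the scoring rule are illustrative assumptions, not the structure defined in the paper.

```python
from dataclasses import dataclass

@dataclass
class DynamicStateInfo:
    server_id: str
    quality: float         # e.g. normalised service quality in [0, 1]
    security_level: float  # e.g. normalised security score in [0, 1]
    reachable: bool        # whether the server is reachable from the current context

def select_server(dsi_records, security_weight=0.6):
    """Pick the reachable server with the best weighted quality/security score."""
    candidates = [d for d in dsi_records if d.reachable]
    if not candidates:
        return None

    def score(d):
        return security_weight * d.security_level + (1 - security_weight) * d.quality

    return max(candidates, key=score)

records = [DynamicStateInfo("srv-a", 0.9, 0.5, True),
           DynamicStateInfo("srv-b", 0.7, 0.9, True),
           DynamicStateInfo("srv-c", 0.95, 0.95, False)]
print(select_server(records).server_id)  # "srv-b" with the default security weight
```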
Abstract:
Electrical activity is extremely broad and diverse. On the one hand, it requires deep knowledge of rules, regulations, materials, equipment, technical solutions and technologies, and support in several areas such as electrical equipment, telecommunications, security, and the efficient and rational use of energy; on the other hand, it also requires other skills, depending on the specific projects to be implemented, this knowledge being characteristic of professionals with relevant experience in terms of the complexity and specificity of the projects they have carried out.
Abstract:
Proteins are biochemical entities consisting of one or more blocks typically folded in a 3D pattern. Each block (a polypeptide) is a single linear sequence of amino acids that are biochemically bonded together. The amino acid sequence in a protein is defined by the sequence of a gene or several genes encoded in the DNA-based genetic code. This genetic code typically uses twenty amino acids, but in certain organisms the genetic code can also include two other amino acids. After linking the amino acids during protein synthesis, each amino acid becomes a residue in a protein, which is then chemically modified, ultimately changing and defining the protein function. In this study, the authors analyze the amino acid sequence using alignment-free methods, aiming to identify structural patterns in sets of proteins and in the proteome, without any other previous assumptions. The paper starts by analyzing amino acid sequence data by means of histograms using fixed length amino acid words (tuples). After creating the initial relative frequency histograms, they are transformed and processed in order to generate quantitative results for information extraction and graphical visualization. Selected samples from two reference datasets are used, and results reveal that the proposed method is able to generate relevant outputs in accordance with current scientific knowledge in domains like protein sequence/proteome analysis.
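A minimal sketch of the histogram step described above: count fixed-length amino acid words (k-mers) in a sequence and convert the counts to relative frequencies; the tuple length and the example fragment are arbitrary.

```python
from collections import Counter

def kmer_relative_frequencies(sequence, k=2):
    """Relative-frequency histogram of fixed-length amino acid words (k-mers)."""
    sequence = sequence.upper()
    counts = Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))
    total = sum(counts.values())
    return {kmer: n / total for kmer, n in counts.items()}

# Hypothetical fragment of an amino acid sequence (one-letter codes)
fragment = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
hist = kmer_relative_frequencies(fragment, k=2)
for kmer, freq in sorted(hist.items(), key=lambda x: -x[1])[:5]:
    print(kmer, round(freq, 3))
```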
Abstract:
Master's Dissertation in Corporate Finance
Abstract:
Purpose: To describe and compare the content of instruments that assess environmental factors using the International Classification of Functioning, Disability and Health (ICF). Methods: A systematic search of PubMed, CINAHL and PEDro databases was conducted using a pre-determined search strategy. The identified instruments were screened independently by two investigators, and meaningful concepts were linked to the most precise ICF category according to published linking rules. Results: Six instruments were included, containing 526 meaningful concepts. Instruments had between 20% and 98% of items linked to categories in Chapter 1. The highest percentage of items from one instrument linked to categories in Chapters 2–5 varied between 9% and 50%. The presence or absence of environmental factors in a specific context is assessed in 3 instruments, while the other 3 assess the intensity of the impact of environmental factors. Discussion: Instruments differ in their content, type of assessment, and have several items linked to the same ICF category. Most instruments primarily assess products and technology (Chapter 1), highlighting the need to deepen the discussion on the theory that supports the measurement of environmental factors. This discussion should be thorough and lead to the development of methodologies and new tools that capture the underlying concepts of the ICF.
Abstract:
This paper presents a contrastive approach between three different ways of building concepts, after showing the similar syntactic possibilities that coexist in terms. From the semantic point of view, however, we can see that each language family has a different distribution of meaning. The most important point we try to show is that the differences found in the psychological process when communicating concepts should guide the translator and the terminologist in target-text production and in the terminology planning process. Differences between languages in the information transmission process are due to the different roles that different types of knowledge play. We distinguish here analytic-descriptive knowledge and analogical knowledge, among others. We also state that neither of them is the best when determining the correctness of a term; rather, there have to be adequacy criteria in the selection process. The success of this concept building or term building is important when looking at the linguistic map of the information society.
Abstract:
Grounded on Raymond Williams's definition of knowable community as a cultural tool to analyse literary texts, the essay reads the texts D. H. Lawrence wrote while travelling in the Mediterranean (Twilight in Italy, Sea and Sardinia and Etruscan Places) as knowable communities, bringing to the discussion the wide importance of literature not only as an object for aesthetic or textual readings, but also as a signifying practice which tells stories of culture. Departing from some considerations regarding the historical development of the relationship between literature and culture, the essay analyses the ways D. H. Lawrence constructed maps of meaning, where the readers, in a dynamic relation with the texts, apprehend experiences, structures and feelings; putting into perspective Williams's theory of culture as a whole way of life, it also analyses the ways the author communicates and organizes these experiences, creating a space of communication and operating at different levels of reality: on the one hand, the reality of the whole way of Italian life, and, on the other hand, the reality of the reader who aspires to make sense and to create an interpretative context where all the information is put, and, also, the reality of the writer in the poetic act of writing. To read these travel writings as knowable communities is to understand them as a form that invents a community with no other existence but that of the literary text. The cultural construction we find in these texts is the result of the selection and interpretation done by D. H. Lawrence, as well as the product of the author's enunciative positions, and of his epistemological and ontological filigrees of existence, structured by the conditions of possibility. In the rearticulation of the text, of the writer and of the reader, in a dynamic and shared process of discursive alliances, we understand that Lawrence tells stories of the Mediterranean through his literary art.
Abstract:
Electricity markets are complex environments, involving a large number of different entities with specific characteristics and objectives, making their decisions and interacting in a dynamic scene. Game theory has been widely used to support decisions in competitive environments; therefore its application in electricity markets can prove to be a high-potential tool. This paper proposes a new scenario analysis algorithm, which includes the application of game theory, to evaluate and preview different scenarios and provide players with the ability to react strategically in order to exhibit the behavior that best fits their objectives. This model includes forecasts of competitor players' actions, used to build models of their behavior and define the most probable expected scenarios. Once the scenarios are defined, game theory is applied to support the choice of the action to be performed. Our use of game theory is intended to support one specific agent, not to achieve equilibrium in the market. MASCEM (Multi-Agent System for Competitive Electricity Markets) is a multi-agent electricity market simulator that models market players and simulates their operation in the market. The scenario analysis algorithm has been tested within MASCEM, and our experimental findings from a case study based on real data from the Iberian Electricity Market are presented and discussed.
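A minimal sketch of the decision step for one supported agent, assuming each scenario built from competitor forecasts carries a probability and a payoff function evaluates every candidate action; the pay-as-bid payoff in the toy example is an illustrative assumption, not the simulator's market model.

```python
def best_action(actions, scenarios, payoff):
    """Choose the action with the highest expected payoff over weighted scenarios.

    actions   -- candidate bids for the supported agent
    scenarios -- list of (probability, scenario) pairs built from competitor forecasts
    payoff    -- payoff(action, scenario) -> float, the agent's gain in that scenario
    """
    def expected(a):
        return sum(p * payoff(a, s) for p, s in scenarios)
    return max(actions, key=expected)

# Toy example: scenarios are forecast market-clearing prices; the agent sells
# 10 MWh at its bid price whenever its bid does not exceed the clearing price.
scenarios = [(0.5, 48.0), (0.3, 52.0), (0.2, 60.0)]
actions = [45.0, 50.0, 55.0]
payoff = lambda bid, price: 10 * bid if bid <= price else 0.0
print(best_action(actions, scenarios, payoff))  # expected payoffs: 450, 250, 110 -> 45.0
```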
Abstract:
Timeliness guarantee is an important feature of the recently standardized IEEE 802.15.4 protocol, making it quite appealing for Wireless Sensor Network (WSN) applications under timing constraints. When operating in beacon-enabled mode, this protocol allows nodes with real-time requirements to allocate Guaranteed Time Slots (GTS) in the contention-free period. The protocol natively supports explicit GTS allocation, i.e. a node allocates a number of time slots in each superframe for exclusive use. The limitation of this explicit GTS allocation is that GTS resources may quickly disappear, since a maximum of seven GTSs can be allocated in each superframe, preventing other nodes from benefiting from guaranteed service. Moreover, the GTS may be underutilized, resulting in wasted bandwidth. To overcome these limitations, this paper proposes i-GAME, an implicit GTS Allocation Mechanism for beacon-enabled IEEE 802.15.4 networks. The allocation is based on implicit GTS allocation requests, taking into account the traffic specifications and the delay requirements of the flows. The i-GAME approach enables the use of one GTS by multiple nodes, while still guaranteeing that all their (delay, bandwidth) requirements are satisfied. For that purpose, we propose an admission control algorithm that decides whether to accept a new GTS allocation request, based not only on the remaining time slots, but also on the traffic specifications of the flows, their delay requirements and the available bandwidth resources. We show that our approach improves bandwidth utilization as compared to the native explicit allocation mechanism defined in the IEEE 802.15.4 standard. We also present some practical considerations for the implementation of i-GAME, ensuring backward compatibility with the IEEE 802.15.4 standard with only minor add-ons. Finally, an experimental evaluation on a real system that validates our theoretical analysis and demonstrates the implementation of i-GAME is also presented.
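A deliberately simplified sketch of an admission test in the spirit described above (the paper's actual analysis is based on the flows' traffic specifications and delay bounds, not on this crude check): accept a new flow only if the total requested rate fits the contention-free-period capacity and a once-per-superframe service would still meet every delay bound. All parameter names and numbers are illustrative.

```python
def admit_flow(flows, new_flow, cfp_capacity_bps, superframe_s):
    """Very simplified admission test for implicit GTS sharing.

    flows            -- list of dicts with 'rate_bps' and 'delay_s' already admitted
    new_flow         -- candidate flow with the same keys
    cfp_capacity_bps -- bandwidth available in the contention-free period
    superframe_s     -- superframe (beacon interval) duration in seconds
    """
    candidate = flows + [new_flow]
    total_rate = sum(f["rate_bps"] for f in candidate)
    if total_rate > cfp_capacity_bps:
        return False  # not enough guaranteed bandwidth for all flows
    if any(f["delay_s"] < superframe_s for f in candidate):
        return False  # a flow served once per superframe would miss its deadline
    return True

existing = [{"rate_bps": 2000, "delay_s": 1.0}]
print(admit_flow(existing, {"rate_bps": 1500, "delay_s": 0.5}, 6000, 0.246))  # True
print(admit_flow(existing, {"rate_bps": 5000, "delay_s": 1.0}, 6000, 0.246))  # False
```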
Abstract:
Dynamical systems theory is used as a theoretical language and tool to design a distributed control architecture for teams of mobile robots that must transport a large object while simultaneously avoiding collisions with (either static or dynamic) obstacles. Here we demonstrate, in simulations and in implementations on real robots, that it is possible to simplify the architectures presented in previous work and to extend the approach to teams of n robots. The robots have no prior knowledge of the environment. The motion of each robot is controlled by a time series of asymptotically stable states. The attractor dynamics permits the integration of information from various sources in a graded manner. As a result, the robots show strikingly smooth and stable team behaviour.
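A minimal sketch of attractor dynamics for the heading direction, assuming the usual form with an attractor at the target direction and a limited-range repeller at each obstacle direction; the gains, the repeller shape and the Euler step are illustrative choices.

```python
import math

def heading_rate(phi, psi_target, obstacle_dirs,
                 lam_target=1.0, lam_obs=2.0, sigma=0.5):
    """Attractor dynamics for the heading direction phi (radians)."""
    # Attractor at the target direction (stable fixed point at phi = psi_target)
    rate = -lam_target * math.sin(phi - psi_target)
    # Repeller at each obstacle direction, with limited angular range sigma
    for psi_obs in obstacle_dirs:
        diff = phi - psi_obs
        rate += lam_obs * diff * math.exp(-diff**2 / (2 * sigma**2))
    return rate

# Euler integration of the heading dynamics (toy simulation step of 0.05 s)
phi, dt = 0.0, 0.05
for _ in range(200):
    phi += dt * heading_rate(phi, psi_target=math.pi / 2, obstacle_dirs=[math.pi / 3])
print(round(phi, 2))  # settles near an attractor shifted away from the obstacle direction
```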
Abstract:
We consider a Bertrand duopoly model with unknown costs. Each firm's aim is to choose the price of its product according to the well-known concept of Bayesian Nash equilibrium. The choices are made simultaneously by both firms. In this paper, we suppose that each firm has two different technologies and uses one of them according to a certain probability distribution. The use of one technology or the other affects the unitary production cost. We show that this game has exactly one Bayesian Nash equilibrium. We analyse the advantages, for firms and for consumers, of using the technology with the highest production cost versus the one with the lowest production cost. We prove that the expected profit of each firm increases with the variance of its production costs. We also show that the expected price of each good increases with both expected production costs, the effect of the rival's expected production costs being dominated by the effect of the firm's own expected production costs.
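A numeric sketch under an assumed linear differentiated-demand specification q_i = a − b·p_i + d·p_j (not necessarily the model used in the paper): each cost type's best-response price is (a + b·c_i + d·E[p_rival]) / (2b), and iterating best responses over the type-contingent prices converges to the Bayesian Nash equilibrium.

```python
def bayesian_nash_prices(a, b, d, costs, probs, iters=200):
    """Fixed-point iteration for type-contingent equilibrium prices.

    costs[i] -- list of possible unit costs for firm i (one per technology)
    probs[i] -- probabilities of those costs
    Demand (assumed): q_i = a - b * p_i + d * p_j, with b > d >= 0.
    Best response of type c: p = (a + b * c + d * E[p_rival]) / (2 * b)
    """
    prices = [[a / (2 * b) for _ in c] for c in costs]  # initial guess
    for _ in range(iters):
        expected = [sum(p * q for p, q in zip(prices[i], probs[i])) for i in (0, 1)]
        prices = [[(a + b * c + d * expected[1 - i]) / (2 * b) for c in costs[i]]
                  for i in (0, 1)]
    return prices

# Each firm has two technologies (low/high cost) used with the given probabilities
costs = [[2.0, 4.0], [3.0, 5.0]]
probs = [[0.5, 0.5], [0.7, 0.3]]
print(bayesian_nash_prices(a=10.0, b=1.0, d=0.5, costs=costs, probs=probs))
```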