931 results for "information value"
Abstract:
Dissertation presented to the Instituto Superior de Contabilidade e Administração do Porto (ISCAP) to obtain the Master's degree in Auditing. Supervisor: Mestre Domingos da Silva Duarte
Abstract:
Dissertation for obtaining the Master's degree in Accounting and Finance. Supervisor: Mestre António Costa Reis
Abstract:
Master's dissertation in Corporate Finance
Abstract:
Dissertation presented to the Instituto Superior de Contabilidade to obtain the Master's degree in Auditing. Supervised by: Doutora Alcina Dias
Abstract:
Value has been defined in different theoretical contexts as need, desire, interest, standards/criteria, beliefs, attitudes, and preferences. The creation of value is key to any business: every business activity involves exchanging some tangible and/or intangible good or service and having its value accepted and rewarded by customers or clients, whether inside the enterprise or collaborative network or outside it. "Perhaps surprising then is that firms often do not know how to define value, or how to measure it" (Anderson and Narus, 1998, cited by [1]). Woodruff argued that we need a "richer customer value theory" to provide an "important tool for locking onto the critical things that managers need to know", and emphasized that "we need customer value theory that delves deeply into customer's world of product use in their situations" [2]. In this sense, we proposed and validated a novel "Conceptual Model for Decomposing the Value for the Customer". In doing so, we were aware that time has a direct impact on customer-perceived value: suppliers' and customers' perceptions change from the pre-purchase to the post-purchase phase, introducing uncertainty and doubt. We wanted to break value down into all its components, as well as into every asset built and used (from both endogenous and exogenous perspectives). This component analysis was then transposed into a mathematical formulation using the Fuzzy Analytic Hierarchy Process (AHP), so that the uncertainty and vagueness of value perceptions could be embedded in a model relating the assets used and built, in the tangible and intangible deliverables exchanged among the involved parties, to their actual value perceptions.
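As an illustration of the kind of computation a Fuzzy AHP step involves, the sketch below derives crisp criterion weights from a triangular-fuzzy pairwise comparison matrix using Buckley's geometric-mean method with centroid defuzzification. The three "value components" and all judgement values are hypothetical, not taken from the validated model.

```python
def fuzzy_geometric_mean(row):
    """Element-wise geometric mean of a row of triangular fuzzy numbers (l, m, u)."""
    n = len(row)
    l = m = u = 1.0
    for (a, b, c) in row:
        l *= a
        m *= b
        u *= c
    return (l ** (1 / n), m ** (1 / n), u ** (1 / n))

def fuzzy_weights(matrix):
    """Buckley's method: fuzzy weight w_i = r_i / (r_1 + ... + r_n), then centroid defuzzification."""
    r = [fuzzy_geometric_mean(row) for row in matrix]
    L = sum(t[0] for t in r)
    M = sum(t[1] for t in r)
    U = sum(t[2] for t in r)
    # Dividing triangle (l, m, u) by triangle (L, M, U) yields (l/U, m/M, u/L).
    fuzzy = [(l / U, m / M, u / L) for (l, m, u) in r]
    crisp = [(l + m + u) / 3 for (l, m, u) in fuzzy]  # centroid defuzzification
    total = sum(crisp)
    return [w / total for w in crisp]                 # normalized crisp weights

ONE = (1, 1, 1)
# Hypothetical pairwise judgements over three value components
# (e.g. product quality vs. service vs. price), NOT data from the dissertation.
comparisons = [
    [ONE,             (2, 3, 4),     (4, 5, 6)],
    [(1/4, 1/3, 1/2), ONE,           (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1), ONE],
]
print(fuzzy_weights(comparisons))
```

A full Fuzzy AHP application would also require consistency checking and the aggregation of several evaluators' judgements, both omitted here for brevity.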
Abstract:
The large increase in renewable energy sources and Distributed Generation (DG) of electricity gives rise to the Virtual Power Producer (VPP) concept. VPPs can make electricity generation from renewable sources valuable in electricity markets. Information availability and adequate decision-support tools are crucial for achieving VPPs' goals; this involves information concerning both the associated producers and market operation. This paper presents ViProd, a tool for simulating VPP operation, focusing mainly on the information requirements for adequate decision making.
Abstract:
Many common human reasoning capabilities, such as temporal and non-monotonic reasoning, have not yet been fully reproduced in deployed systems, even though some theoretical breakthroughs have been achieved; this is mainly due to the inherent computational complexity of the theoretical approaches. In the particular area of fault diagnosis in power systems, however, several systems that tackle the problem have been deployed, using methodologies such as production-rule-based expert systems, neural networks, chronicle recognition, and fuzzy expert systems. SPARSE (from the Portuguese acronym for expert system for incident analysis and restoration support) is one such system, and in the course of its development came the need to cope with incomplete and/or incorrect information, in addition to the traditional problems of power system fault diagnosis based on SCADA (supervisory control and data acquisition) information retrieval, namely real-time operation and huge volumes of information. This paper presents an architecture for a decision support system that addresses these problems through a symbiosis of the event calculus and default-reasoning rule-based system paradigms, ensuring soft real-time operation while handling incomplete, incorrect, or domain-incoherent information. A prototype implementation of this system is already at work in the control centre of the Portuguese Transmission Network.
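To make the event-calculus side of this symbiosis concrete, here is a minimal, self-contained sketch of its core inference: a fluent holds at time t if some event at or before t initiated it and no later event (still at or before t) terminated it. The fluent and event names are invented for illustration and do not reproduce SPARSE's actual rule base.

```python
# Which fluents each event initiates or terminates (illustrative only).
INITIATES = {
    "breaker_open": ["line_isolated"],
    "alarm_raised": ["incident_pending"],
}
TERMINATES = {
    "breaker_close": ["line_isolated"],
    "incident_closed": ["incident_pending"],
}

def holds_at(fluent, t, narrative):
    """narrative: chronologically ordered (time, event) pairs, e.g. from SCADA logs."""
    state = False
    for time, event in narrative:
        if time > t:
            break
        if fluent in INITIATES.get(event, ()):
            state = True
        elif fluent in TERMINATES.get(event, ()):
            state = False
    return state

narrative = [(10, "breaker_open"), (12, "alarm_raised"), (30, "breaker_close")]
print(holds_at("line_isolated", 20, narrative))  # True: opened at 10, not yet closed
print(holds_at("line_isolated", 35, narrative))  # False: closed at 30
```

Default reasoning would sit on top of this core, filling in assumed fluent values when the SCADA narrative is incomplete or contradictory.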
Abstract:
To select each node by device and by context in urban computing, users must enter their plan information and their requests into a computing environment (e.g., PDAs, smart devices, laptops) in advance, and then try to keep the state between users and the computing environment optimized. However, because of bad contexts, users may reach wrong decisions; one resulting user demand is to request a good server with higher security. To address this issue, we define the structure of Dynamic State Information (DSI), which incorporates a security process covering the relevant factors in sending/receiving contexts and selects the best server during user movement based on the server quality and security states held in the DSI. Finally, whenever some information changes, users and devices receive notices that include the security factors, so an automatic reaction becomes possible; therefore all users can safely use all devices in urban computing.
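The abstract does not specify DSI's internal structure, so the following is only one plausible reading sketched in code: each candidate server advertises quality and security states, a weighted score re-ranks the candidates as updates arrive, and subscribed devices are notified when the best choice changes. Every name, field, and weight here is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class DynamicStateInfo:
    server: str
    quality: float   # hypothetical quality score in [0, 1]
    security: float  # hypothetical security score in [0, 1]

def best_server(states, security_weight=0.7):
    """Rank candidates by a weighted quality/security score (the weight is an assumption)."""
    return max(states, key=lambda s: security_weight * s.security
                                     + (1 - security_weight) * s.quality)

class DSIRegistry:
    """Keeps the latest DSI per server and notifies subscribers when the best server changes."""
    def __init__(self):
        self.states = {}
        self.subscribers = []
        self.current = None

    def update(self, dsi):
        self.states[dsi.server] = dsi
        best = best_server(self.states.values())
        if best.server != self.current:
            self.current = best.server
            for notify in self.subscribers:
                notify(best)

registry = DSIRegistry()
registry.subscribers.append(lambda s: print("switching to", s.server))
registry.update(DynamicStateInfo("node-a", quality=0.9, security=0.5))
registry.update(DynamicStateInfo("node-b", quality=0.6, security=0.9))  # higher security wins
```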
Abstract:
Electricity markets are complex environments, involving a large number of different entities playing in a dynamic scene to obtain the best advantages and profits. MASCEM is a multi-agent electricity market simulator built to model market players and simulate their operation in the market. Market players are entities with specific characteristics and objectives, making their own decisions and interacting with other players. MASCEM provides several dynamic strategies for agents' behaviour. This paper presents a method that aims to provide market players with strategic bidding capabilities, allowing them to obtain the highest possible gains from the market. The method uses an auxiliary forecasting tool, e.g. an Artificial Neural Network, to predict electricity market prices, and analyses the patterns in its forecasting errors. By recognizing the occurrence of such patterns, the method predicts the expected error for the next forecast and uses it to adapt the actual forecast. The goal is to bring the forecast closer to the real value, reducing the forecasting error.
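A minimal sketch of the adaptation step described above: the expected error for the next forecast is estimated from the most similar past situations and subtracted from the raw ANN forecast. The abstract does not detail the pattern-recognition technique, so a simple k-nearest-neighbour match over past (context, error) pairs stands in for it, with invented numbers.

```python
import numpy as np

def expected_error(history, context, k=3):
    """Average the errors observed in the k past situations most similar to `context`.
    history: list of (context_vector, forecast_error) pairs; a stand-in for the
    pattern-recognition step, which the abstract does not detail."""
    ranked = sorted(history, key=lambda ce: np.linalg.norm(np.subtract(ce[0], context)))
    return float(np.mean([err for _, err in ranked[:k]]))

def adapted_forecast(raw_forecast, history, context):
    """Correct the raw ANN price forecast by the expected error
    (error taken as forecast minus observed, so we subtract it)."""
    return raw_forecast - expected_error(history, context)

# Invented (hour-of-day, demand-level) contexts and past errors in EUR/MWh.
history = [((9, 0.80), +2.1), ((10, 0.90), +1.7),
           ((21, 0.40), -0.9), ((11, 0.85), +1.9)]
print(adapted_forecast(52.0, history, context=(10, 0.87)))  # ~50.1
```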
Abstract:
We describe a novel approach to exploring DNA nucleotide sequence data, aiming to produce high-level categorical and structural information about the underlying chromosomes, genomes, and species. The article starts by analyzing chromosomal data through histograms of fixed-length DNA sequences. After creating the DNA-related histograms, the correlation between each pair of histograms is computed, producing a global correlation matrix. These data are then used as input to several data-processing methods for information extraction and tabular/graphical output generation. A set of 18 species is processed, and the extensive results reveal that the proposed method is able to generate significant and diversified outputs, in good accordance with current scientific knowledge in domains such as genomics and phylogenetics.
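A minimal sketch of the first two stages described above, assuming the fixed-length sequences are counted as simple k-mers: each sequence yields a relative-frequency histogram, and pairs of histograms are compared by Pearson correlation to form the global correlation matrix. The sequences are toy data standing in for full chromosomes.

```python
from itertools import product
import numpy as np

def kmer_histogram(seq, k=3):
    """Relative-frequency histogram over all 4**k possible DNA k-mers."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    index = {w: i for i, w in enumerate(kmers)}
    h = np.zeros(len(kmers))
    for i in range(len(seq) - k + 1):
        word = seq[i:i + k]
        if word in index:          # skip windows with ambiguous bases (N, etc.)
            h[index[word]] += 1
    return h / h.sum()

def correlation_matrix(sequences, k=3):
    """Pairwise Pearson correlation between the k-mer histograms (one row per sequence)."""
    H = np.array([kmer_histogram(s, k) for s in sequences])
    return np.corrcoef(H)

# Toy sequences standing in for whole chromosomes.
seqs = ["ACGTACGTACGTAGCTAGCT" * 5,
        "ACGGACGGACGGTTTTCGCG" * 5,
        "TTTTAAAACCCCGGGGACGT" * 5]
print(correlation_matrix(seqs).round(2))
```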
Abstract:
The analysis that has been carried out, in parallel with various experiences of implementing records of personal data of health-unit users, particularly regarding the protection of privacy as a value intrinsic to the human person, gains new contours in light of the recent work carried out within the Administração Central dos Sistemas de Saúde for the implementation of the Electronic Health Record ("Registo de Saúde Electrónico"). This work aims to analyze the polarization of the interests at stake: on the one hand, the public interest in adopting a single information system; on the other, the necessary protection of human privacy.
Abstract:
Electrical activity is extremely broad and varied. On the one hand, it requires deep knowledge of rules, regulations, materials, equipment, technical solutions, and technologies, together with assistance in several areas such as electrical equipment, telecommunications, security, and the efficient and rational use of energy. On the other hand, it also requires other skills, depending on the specific projects to be implemented; this knowledge is characteristic of professionals with relevant experience, in terms of the complexity and specificity of the projects they have carried out.
Abstract:
This paper analyzes the DNA code of several species from the perspective of information content. For that purpose, several concepts and mathematical tools are selected to establish a quantitative method that does not a priori distort the alphabet represented by the sequence of DNA bases. The synergies of associating Gray code, histogram characterization, and multidimensional scaling visualization lead to a collection of plots with a categorical representation of species and chromosomes.
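The following sketch chains together plausible versions of the three tools the abstract names: a 2-bit Gray code over the four bases (so that adjacent codes differ by one bit), histograms of Gray-coded words, and classical (Torgerson) multidimensional scaling of the pairwise histogram distances. The particular base-to-code assignment is an assumption; the paper's own mapping may differ.

```python
import numpy as np

# One possible 2-bit Gray coding of the four bases (00, 01, 11, 10).
GRAY = {"A": 0b00, "C": 0b01, "G": 0b11, "T": 0b10}

def gray_histogram(seq, k=3):
    """Histogram over k-base words, each word read as a 2k-bit Gray-coded integer."""
    h = np.zeros(4 ** k)
    for i in range(len(seq) - k + 1):
        word = 0
        for base in seq[i:i + k]:
            word = (word << 2) | GRAY[base]
        h[word] += 1
    return h / h.sum()

def classical_mds(D, dims=2):
    """Project a distance matrix to `dims` coordinates via Torgerson double centering."""
    n = len(D)
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:dims]          # keep the largest eigenvalues
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

seqs = ["ACGTACGTGGCC" * 10, "AATTAATTGGAA" * 10, "ACGTACGTGGCA" * 10]
H = np.array([gray_histogram(s) for s in seqs])
D = np.linalg.norm(H[:, None, :] - H[None, :, :], axis=-1)  # pairwise Euclidean distances
print(classical_mds(D).round(3))  # 2-D coordinates, one row per sequence
```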
Abstract:
Proteins are biochemical entities consisting of one or more blocks typically folded in a 3D pattern. Each block (a polypeptide) is a single linear sequence of amino acids that are biochemically bonded together. The amino acid sequence in a protein is defined by the sequence of a gene or several genes encoded in the DNA-based genetic code. This genetic code typically uses twenty amino acids, but in certain organisms it can include two additional amino acids. After the amino acids are linked during protein synthesis, each becomes a residue in the protein, which is then chemically modified, ultimately changing and defining the protein's function. In this study, the authors analyze amino acid sequences using alignment-free methods, aiming to identify structural patterns in sets of proteins and in the proteome without any other prior assumptions. The paper starts by analyzing amino acid sequence data by means of histograms of fixed-length amino acid words (tuples). After creating the initial relative-frequency histograms, they are transformed and processed to generate quantitative results for information extraction and graphical visualization. Selected samples from two reference datasets are used, and the results reveal that the proposed method is able to generate relevant outputs in accordance with current scientific knowledge in domains like protein sequence/proteome analysis.
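As a minimal sketch of the histogram stage described above: relative-frequency histograms over fixed-length amino acid tuples, compared here by cosine similarity as one simple alignment-free measure. The paper's transformation and visualization steps are not reproduced, and the sequences are toy fragments.

```python
from itertools import product
import numpy as np

AMINO = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

def tuple_histogram(seq, k=2):
    """Relative-frequency histogram over all 20**k amino acid k-tuples."""
    words = ["".join(p) for p in product(AMINO, repeat=k)]
    index = {w: i for i, w in enumerate(words)}
    h = np.zeros(len(words))
    for i in range(len(seq) - k + 1):
        w = seq[i:i + k]
        if w in index:            # skip tuples containing non-standard residues
            h[index[w]] += 1
    return h / h.sum()

def cosine_similarity(a, b):
    """One simple alignment-free comparison between two tuple histograms."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy polypeptides; real inputs would be full protein or proteome sequences.
p1 = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ" * 3
p2 = "MKTAYIAKQRQISFVKSHFARQLEERLGLIEVA" * 3
print(cosine_similarity(tuple_histogram(p1), tuple_histogram(p2)))
```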
Abstract:
The deposition of municipal solid waste (MSW) in sanitary landfills has a scale that demands special attention from society, since serious harm to the environment can result if the correct measures are not taken. One point that requires particular attention is the treatment of landfill leachate, as leachate generally carries highly polluting loads.

This work presents the structure of a sanitary landfill, defining its operating principles as well as the main characteristics to consider in its design and construction. It also addresses the composition of the solid waste produced in Portugal according to (1), and the quantitative and qualitative characteristics of the leachate produced by a landfill. The conventional biological and physico-chemical treatment systems most used in Portugal for leachate treatment are presented, relating the problems and limitations associated with each of them. The combined treatment of leachate with wastewater and the recirculation of leachate in the treatment system are also presented, along with some of the advantages associated with these practices.

Based on a real situation, a case study is assessed that aims to evaluate the operation of the pre-treatment plant for the leachate produced at the sanitary landfill of the Palmela eco-park, taking into account the fact that this treatment is carried out in combination with the treatment of urban wastewater. The evaluation of the operation of the leachate pre-treatment plant (EPTAL) is performed using the data provided in the reports prepared by the company LUSÁGUA, which present the pollutant loads measured at the inlet and outlet of the EPTAL; these data are used to verify whether the emission limit values measured at the EPTAL outlet are respected. Some changes are also proposed, based on the technical opinions accompanying the reports prepared by LUSÁGUA and on a study prepared by ECOserviços in February 2010, in which the efficiencies of treatment stages were obtained from laboratory tests and from tests performed at the EPTAL itself.