37 results for Information integration
Abstract:
The aim of this paper is to establish some basic guidelines to help draft the information letter sent to individual contributors should it be decided to use this model in the Spanish public pension system. With this end in mind and basing our work on the experiences of the most advanced countries in the field and the pioneering papers by Jackson (2005), Larsson et al. (2008) and Sunden (2009), we look into the concept of “individual pension information” and identify its most relevant characteristics. We then give a detailed description of two models, those in the United States and Sweden, and in particular look at how they are structured, what aspects could be improved and what their limitations are. Finally we make some recommendations of special interest for designing the model for Spain.
Abstract:
This paper proposes the use of a Modular Marx Multilevel Converter (M³C) as a solution for energy integration between an offshore wind farm and the power grid. The Marx modular multilevel converter is based on the Marx generator and solves two typical problems of this type of multilevel topology: modularity and dc capacitor voltage balancing. This paper details the strategy for dc capacitor voltage equalization. The dynamic models of the converter and power grid are presented in order to design the converter ac output voltages and the dc capacitor voltage controller. Average current control is presented and used for power flow control, harmonics and reactive power compensation. Simulation results are presented to show the effectiveness of the proposed M³C topology.
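The average current control mentioned in this abstract can be illustrated with a minimal sketch. The Python snippet below is a toy under stated assumptions, not the paper's converter model: a discrete PI controller (hypothetical gains `kp`, `ki`) drives a simple R-L branch so that its current tracks a 50 Hz sinusoidal reference, which is the basic mechanism behind current-mode power flow control.

```python
# Illustrative sketch only (not the paper's M3C model): a discrete PI
# controller tracking a sinusoidal current reference, as in average
# current control. The plant is an assumed R-L branch, forward-Euler step.
import math

def simulate_pi_current_control(kp=20.0, ki=5000.0, r=0.5, l=5e-3,
                                dt=1e-5, t_end=0.05, i_ref_amp=10.0, f=50.0):
    """Return the peak absolute tracking error over the last reference cycle."""
    i = 0.0          # branch current (A)
    integ = 0.0      # integrator state of the PI controller
    errors = []
    steps = int(t_end / dt)
    for k in range(steps):
        t = k * dt
        i_ref = i_ref_amp * math.sin(2 * math.pi * f * t)
        e = i_ref - i
        integ += e * dt
        v = kp * e + ki * integ          # converter output voltage command
        # R-L branch dynamics: L di/dt = v - R i  (forward Euler)
        i += dt * (v - r * i) / l
        if t > t_end - 1.0 / f:          # record errors over the final cycle
            errors.append(abs(e))
    return max(errors)
```

With the assumed gains, the residual tracking error stays well below the 10 A reference amplitude; an actual design would use the converter and grid dynamic models the paper derives.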
Abstract:
The advances made in channel-capacity codes, such as turbo codes and low-density parity-check (LDPC) codes, have played a major role in the emerging distributed source coding paradigm. LDPC codes can be easily adapted to new source coding strategies due to their natural representation as bipartite graphs and the use of quasi-optimal decoding algorithms, such as belief propagation. This paper tackles a relevant scenario in distributed video coding: lossy source coding when multiple side information (SI) hypotheses are available at the decoder, each one correlated with the source according to a different correlation noise channel. Thus, it is proposed to exploit the multiple SI hypotheses through an efficient joint decoding technique with multiple LDPC syndrome decoders that exchange information to obtain coding efficiency improvements. At the decoder side, the multiple SI hypotheses are created with motion-compensated frame interpolation and fused together in a novel iterative LDPC-based Slepian-Wolf decoding algorithm. With the creation of multiple SI hypotheses and the proposed decoding algorithm, bitrate savings of up to 8.0% are obtained for similar decoded quality.
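The syndrome-based Slepian-Wolf decoding that underlies such LDPC schemes can be shown with a toy example. The sketch below is a deliberate simplification and an assumption throughout (a hypothetical 2×4 parity-check matrix, brute-force search instead of belief propagation): the encoder transmits only the syndrome of the source, and the decoder returns the sequence with that syndrome closest to its side information.

```python
# Toy sketch of syndrome-based Slepian-Wolf coding (not the paper's codec).
# H is an assumed tiny parity-check matrix; real LDPC codes are large and
# sparse, and decoding uses belief propagation, not exhaustive search.
from itertools import product

H = [[1, 1, 0, 1],
     [0, 1, 1, 1]]

def syndrome(x):
    """Compute s = H.x (mod 2) for a bit tuple x."""
    return tuple(sum(h * b for h, b in zip(row, x)) % 2 for row in H)

def decode(s, y):
    """Return the candidate with syndrome s that is closest to side info y."""
    best = None
    for cand in product((0, 1), repeat=len(y)):
        if syndrome(cand) != s:
            continue
        dist = sum(a != b for a, b in zip(cand, y))
        if best is None or dist < best[0]:
            best = (dist, cand)
    return best[1]
```

For example, for source `x = (1, 0, 1, 1)` the encoder sends only `syndrome(x)`, and a decoder holding the correlated side information `(0, 0, 1, 1)` recovers `x` exactly.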
Abstract:
Anaemia has a significant impact on child development and mortality and is a severe public health problem in most countries in sub-Saharan Africa. Nutritional and infectious causes of anaemia are geographically variable, and anaemia maps based on information on the major aetiologies of anaemia are important for identifying the communities most in need and the relative contribution of major causes. We investigated the consistency between ecological and individual-level approaches to anaemia mapping by building spatial anaemia models for children aged ≤15 years using different modelling approaches. We aimed to a) quantify the role of malnutrition, malaria, Schistosoma haematobium and soil-transmitted helminths (STH) in anaemia endemicity in children aged ≤15 years and b) develop a high-resolution predictive risk map of anaemia for the municipality of Dande in Northern Angola. We used parasitological survey data on children aged ≤15 years to build Bayesian geostatistical models of malaria (PfPR≤15), S. haematobium, Ascaris lumbricoides and Trichuris trichiura and to predict small-scale spatial variation in these infections. The predictions and their associated uncertainty were used as inputs for a model of anaemia prevalence to predict small-scale spatial variation of anaemia. Stunting, PfPR≤15, and S. haematobium infections were significantly associated with anaemia risk. An estimated 12.5%, 15.6%, and 9.8% of anaemia cases could be averted by treating malnutrition, malaria, and S. haematobium, respectively. Spatial clusters of high anaemia risk (>86%) were identified. Using an individual-level approach to anaemia mapping at a small spatial scale, we found that anaemia in children aged ≤15 years is highly heterogeneous and that malnutrition and parasitic infections are important contributors to the spatial variation in anaemia risk.
The results presented in this study can help inform the integration of the current provincial malaria control program with ancillary micronutrient supplementation and control of neglected tropical diseases, such as urogenital schistosomiasis and STH infection.
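The "cases averted" figures reported in this abstract are population attributable fractions. As a minimal sketch, with made-up inputs rather than the study's estimates, the standard PAF formula can be computed as follows:

```python
# Minimal sketch of the population attributable fraction (PAF):
#   PAF = p_e * (RR - 1) / (1 + p_e * (RR - 1))
# where p_e is the exposure prevalence and RR the relative risk of the
# outcome among the exposed. The inputs used below are illustrative, not
# the study's estimates.

def population_attributable_fraction(p_exposed, relative_risk):
    """Fraction of cases in the population attributable to the exposure."""
    excess = p_exposed * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Hypothetical example: 30% exposure prevalence, relative risk 1.8.
paf = population_attributable_fraction(0.3, 1.8)
```

With these made-up numbers the formula yields a PAF of about 0.19, i.e. roughly 19% of cases would be averted by removing the exposure, which is the kind of quantity behind the 12.5%, 15.6% and 9.8% figures above.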
Abstract:
Video coding technologies have played a major role in the explosion of large-market digital video applications and services. In this context, the very popular MPEG-x and H.26x video coding standards adopted a predictive coding paradigm, where complex encoders exploit data redundancy and irrelevancy to 'control' much simpler decoders. This codec paradigm fits well with applications and services such as digital television and video storage, where decoder complexity is critical, but does not match the requirements of emerging applications such as visual sensor networks, where encoder complexity is more critical. The Slepian-Wolf and Wyner-Ziv theorems brought the possibility of developing so-called Wyner-Ziv video codecs, following a different coding paradigm where it is the task of the decoder, and no longer of the encoder, to (fully or partly) exploit the video redundancy. Theoretically, Wyner-Ziv video coding does not incur any compression performance penalty with regard to the more traditional predictive coding paradigm (at least under certain conditions). In the context of Wyner-Ziv video codecs, the so-called side information, which is a decoder estimate of the original frame to code, plays a critical role in the overall compression performance. For this reason, much research effort has been invested in the past decade to develop increasingly efficient side information creation methods. The main objective of this paper is to review and evaluate the available side information methods after proposing a classification taxonomy to guide this review, allowing more solid conclusions to be reached and the next relevant research challenges to be better identified.
After classifying the side information creation methods into four classes, notably guess, try, hint and learn, the review of the most important techniques in each class and the evaluation of some of them lead to the important conclusion that which side information creation methods provide the best rate-distortion (RD) performance depends on the amount of temporal correlation in each video sequence. It also became clear that the best available Wyner-Ziv video coding solutions are almost systematically based on the learn approach. The best solutions are already able to systematically outperform H.264/AVC Intra, and also the H.264/AVC zero-motion standard solutions for specific types of content. (C) 2013 Elsevier B.V. All rights reserved.
Abstract:
In distributed video coding, motion estimation is typically performed at the decoder to generate the side information, increasing the decoder complexity while providing low-complexity encoding in comparison with predictive video coding. Motion estimation can be performed once to create the side information or several times to refine the side information quality along the decoding process. In this paper, motion estimation is performed at the decoder side to generate multiple side information hypotheses, which are adaptively and dynamically combined whenever additional decoded information is available. The proposed iterative side information creation algorithm is inspired by video denoising filters and requires some statistics of the virtual channel between each side information hypothesis and the original data. With the proposed denoising algorithm for side information creation, an RD performance gain of up to 1.2 dB is obtained for the same bitrate.
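The idea of combining several side information hypotheses using virtual-channel statistics can be sketched as an inverse-variance weighted fusion, a simplification of denoising-style combination. The function names and the additive-noise model below are illustrative assumptions, not the paper's algorithm.

```python
# Illustrative sketch (not the paper's algorithm): fuse aligned side
# information hypotheses sample-wise with inverse-variance weights, where
# each hypothesis is modelled as the original signal plus correlation
# noise of known variance (the "virtual channel" statistics).
import random

def fuse_hypotheses(hypotheses, noise_vars):
    """Inverse-variance weighted average of equally long SI hypotheses."""
    weights = [1.0 / v for v in noise_vars]
    wsum = sum(weights)
    n = len(hypotheses[0])
    return [sum(w * h[i] for w, h in zip(weights, hypotheses)) / wsum
            for i in range(n)]

def mse(a, b):
    """Mean squared error between two equally long sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
```

Under this model the fused estimate has lower error variance than any single hypothesis (1/(1/v1 + 1/v2) < min(v1, v2)), which is the basic reason multi-hypothesis combination improves side information quality.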
Abstract:
Throughout the world, epidemiological studies have been established to examine the relationship between air pollution and mortality rates and adverse respiratory health effects. However, despite years of discussion, the correlation between adverse health effects and atmospheric pollution remains controversial, partly because these studies are frequently restricted to small and well-monitored areas. Monitoring air pollution is complex due to the large spatial and temporal variations of pollution phenomena, the high costs of recording instruments, and the low sampling density of a purely instrumental approach. Therefore, together with traditional instrumental monitoring, bioindication techniques allow the mapping of pollution effects over wide areas with a high sampling density. In this study, instrumental and biomonitoring techniques were integrated to support an epidemiological study that will be developed in an industrial area located in Gijón, on the coast of central Asturias, Spain. Three main objectives were proposed: (i) to analyze temporal patterns of PM10 concentrations in order to apportion emission sources, (ii) to investigate spatial patterns of lichen conductivity to identify the impact of the studied industrial area on air quality, and (iii) to establish relationships between lichen conductivity and some site-specific characteristics. Samples of the epiphytic lichen Parmelia sulcata were transplanted in a grid of 18 by 20 km with an industrial area in the center. Lichens were exposed for a 5-month period starting in April 2010. After exposure, lichen samples were soaked in 18-MΩ water to determine water electrical conductivity and, consequently, assess lichen vitality and cell damage. A marked decreasing gradient of lichen conductivity relative to distance from the emitting sources was observed. Transplants from a sampling site proximal to the industrial area reached values 10-fold higher than levels far from it.
This finding showed that lichens reacted physiologically in the polluted industrial area as evidenced by increased conductivity correlated to contamination level. The integration of temporal PM10 measurements and analysis of wind direction corroborated the importance of this industrialized region for air quality measurements and identified the relevance of traffic for the urban area.
Abstract:
Dissertation presented to the Escola Superior de Educação de Lisboa for the degree of Master in Educational Sciences, specialty Supervision in Education
Abstract:
Master's in Accounting
Abstract:
Master's in Accounting and Management of Financial Institutions
Abstract:
Research on cluster analysis for categorical data continues to develop, with new clustering algorithms being proposed. However, in this context, the determination of the number of clusters is rarely addressed. We propose a new approach in which clustering and the estimation of the number of clusters are performed simultaneously for categorical data. We assume that the data originate from a finite mixture of multinomial distributions and use a minimum message length (MML) criterion to select the number of clusters (Wallace and Bolton, 1986). For this purpose, we implement an EM-type algorithm (Silvestre et al., 2008) based on the approach of Figueiredo and Jain (2002). The novelty of the approach rests on the integration of model estimation and selection of the number of clusters in a single algorithm, rather than selecting this number based on a set of pre-estimated candidate models. The performance of our approach is compared with the use of the Bayesian Information Criterion (BIC) (Schwarz, 1978) and the Integrated Completed Likelihood (ICL) (Biernacki et al., 2000) using synthetic data. The obtained results illustrate the capacity of the proposed algorithm to attain the true number of clusters while outperforming BIC and ICL by being faster, which is especially relevant when dealing with large data sets.
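A heavily simplified sketch of the idea of fitting the mixture and selecting the number of clusters in a single run: EM for a mixture of independent Bernoulli components over binary categorical data, annihilating components whose weight collapses during the iterations. This uses a plain weight threshold in place of the actual MML criterion, so it is an illustration inspired by, not an implementation of, the cited approach.

```python
# Simplified illustration of simultaneous fitting and component
# annihilation (inspired by, not implementing, Figueiredo & Jain 2002):
# EM for a mixture of independent Bernoulli components over 0/1 vectors,
# pruning components whose weight drops below a threshold.
import random

def em_with_pruning(data, k_init=6, iters=60, prune_below=0.05, seed=1):
    rng = random.Random(seed)
    n, d = len(data), len(data[0])
    weights = [1.0 / k_init] * k_init
    thetas = [[rng.uniform(0.25, 0.75) for _ in range(d)]
              for _ in range(k_init)]
    for _ in range(iters):
        # E-step: responsibilities of each component for each point
        resp = []
        for x in data:
            lik = []
            for w, th in zip(weights, thetas):
                p = w
                for xi, t in zip(x, th):
                    p *= t if xi else (1.0 - t)
                lik.append(p)
            s = sum(lik) or 1e-300
            resp.append([l / s for l in lik])
        # M-step: update weights and Bernoulli means
        for j in range(len(weights)):
            nj = sum(r[j] for r in resp)
            weights[j] = nj / n
            if nj > 1e-12:
                thetas[j] = [sum(r[j] * x[i] for r, x in zip(resp, data)) / nj
                             for i in range(d)]
        # annihilate weak components and renormalize the weights
        keep = [j for j, w in enumerate(weights) if w >= prune_below]
        weights = [weights[j] for j in keep]
        thetas = [thetas[j] for j in keep]
        s = sum(weights)
        weights = [w / s for w in weights]
    return weights, thetas
```

The surviving number of components plays the role of the estimated number of clusters; the real algorithm replaces the ad hoc threshold with the MML-penalized weight update.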
Abstract:
PhD in Management
Abstract:
This paper suggests that the thought of the North American critical theorist James W. Carey provides a relevant perspective on communication and technology. Having as background American social pragmatism and progressive thinkers of the beginning of the 20th century (such as Dewey, Mead, Cooley, and Park), Carey built a perspective that brought together the political economy of Harold A. Innis and the social criticism of David Riesman and Charles W. Mills, and incorporated Marxist topics such as commodification and sociocultural domination. The main goal of this paper is to explore the connection established by Carey between modern technological communication and what he called the "transmissive model", a model which not only reduces the symbolic process of communication to instrumentalization and to information delivery, but also politically converges with capitalism as well as with power, control and expansionist goals. Conceiving communication as a process that creates symbolic and cultural systems, in which and through which social life takes place, Carey gives equal emphasis to the incorporation processes of communication. If symbolic forms and culture are ways of conditioning action, they are also influenced by technological and economic materializations of symbolic systems, and by other conditioning structures. In Carey's view, communication is never a disembodied force; rather, it is a set of practices in which conceptions, techniques and social relations co-exist. These practices configure reality or, alternatively, can refute, transform and celebrate it. Exhibiting a sensitivity favourable to the historical understanding of communication, media and information technologies, one of the issues Carey explored most was the history of the telegraph as a harbinger of the Internet, of its problems and contradictions.
For Carey, the Internet was the contemporary heir of the communications revolution triggered by the prototype of transmission technologies, namely the telegraph in the 19th century. In the telegraph Carey saw the prototype of many subsequent commercial empires based on science and technology, a pioneer model for complex business management, an example of conflict of interest over the control of patents, an inducer of changes both in language and in structures of knowledge, and a promoter of a futurist and utopian thought on information technologies. After a brief approach to Carey's communication theory, this paper focuses on his seminal essay "Technology and ideology. The case of the telegraph", bearing in mind the prospect of the communication revolution introduced by the Internet. We maintain that this essay is of seminal relevance for critically studying the information society. Our reading of it highlights the reach, as well as the problems, of an approach which conceives the innovation of the telegraph as a metaphor for all innovations, announcing the modern stage of history and determining to this day the major lines of development in modern communication systems.
Abstract:
Final Master's Project for the degree of Master in Mechanical Engineering, specialty Maintenance and Production
Impact of a price-maker pumped storage hydro unit on the integration of wind energy in power systems
Abstract:
The increasing integration of larger amounts of wind energy into power systems raises important operational issues, such as the balance between power generation and demand. Pumped storage hydro (PSH) units are one possible solution to mitigate this problem, since they can store the excess of energy in periods of higher generation and lower demand. However, the behaviour of a PSH unit may differ considerably from what is expected in terms of wind power integration when it operates in a liberalized electricity market under a price-maker context. In this regard, this paper models and computes the optimal PSH weekly scheduling in price-taker and price-maker scenarios, both when the PSH unit operates standalone and when it is integrated in a portfolio of other generation assets. Results show that the price-maker standalone PSH will integrate less wind power in comparison with the price-taker situation. Moreover, when the PSH unit is integrated in a portfolio with a base load power plant, the price elasticity of demand may completely change the operational profile of the PSH unit. (C) 2014 Elsevier Ltd. All rights reserved.
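The price-taker side of such a scheduling problem can be illustrated with a toy greedy scheduler. This is an assumption-laden sketch, not the paper's optimization model: the unit pumps at cheap hours and generates at expensive later hours while the efficiency-adjusted price spread remains positive, with unit hourly capacity and reservoir limits ignored.

```python
# Toy price-taker pumped-storage scheduling sketch (not the paper's model):
# repeatedly pair the cheapest remaining pumping hour with the most
# profitable later generation hour, one unit of energy per hour, applying
# a round-trip efficiency; stop when no profitable pair remains.

def greedy_psh_schedule(prices, efficiency=0.75):
    """Return (profit, schedule) where schedule[t] is 'pump', 'gen' or 'idle'."""
    n = len(prices)
    schedule = ['idle'] * n
    profit = 0.0
    while True:
        best = None
        for tp in range(n):
            if schedule[tp] != 'idle':
                continue
            for tg in range(tp + 1, n):   # generation must follow pumping
                if schedule[tg] != 'idle':
                    continue
                gain = efficiency * prices[tg] - prices[tp]
                if gain > 0 and (best is None or gain > best[0]):
                    best = (gain, tp, tg)
        if best is None:
            return profit, schedule
        gain, tp, tg = best
        schedule[tp], schedule[tg] = 'pump', 'gen'
        profit += gain
```

For a hypothetical price series [10, 50, 20, 80] the unit pumps in hour 0 and generates in hour 3; price-maker operation, where the unit's own actions move prices, requires the market model the paper develops.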