955 results for Networks partner techniques
Abstract:
A methodology based on data mining techniques to support the analysis of zonal prices in real transmission networks is proposed in this paper. This methodology uses clustering algorithms to group the buses into typical classes, each containing a set of buses with similar Locational Marginal Price (LMP) values. Two different clustering algorithms were used to determine the LMP clusters: the two-step and K-means algorithms. To evaluate the quality of the partitions, as well as to identify the best-performing algorithm, adequacy measurement indices are used. The paper includes a case study using an LMP database from the California ISO (CAISO) to identify zonal prices.
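The abstract does not give implementation details, but the K-means step can be illustrated with a minimal sketch: a plain one-dimensional K-means that groups buses by LMP value into price zones. The helper name and the sample prices are illustrative, not taken from the paper.

```python
import random

def kmeans_1d(values, k, iters=50, seed=0):
    """Cluster scalar LMP values into k price zones (illustrative sketch)."""
    rng = random.Random(seed)
    centroids = rng.sample(values, k)       # pick k distinct starting centroids
    for _ in range(iters):
        # assign each bus price to its nearest centroid
        clusters = [[] for _ in range(k)]
        for v in values:
            i = min(range(k), key=lambda j: abs(v - centroids[j]))
            clusters[i].append(v)
        # recompute centroids; keep the old one if a cluster went empty
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# two well-separated price groups collapse into two zones
centroids, zones = kmeans_1d([10, 11, 12, 50, 51, 52], k=2)
```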
Abstract:
Control Centre operators are essential to ensure good performance of power systems. Operators' actions are critical in dealing with incidents, especially severe faults such as blackouts. In this paper we present an Intelligent Tutoring approach for training Portuguese Control Centre operators in incident analysis and diagnosis and in power system service restoration, offering context awareness and easy integration into the working environment.
Abstract:
Master's degree in Radiation Applied to Health Technologies.
Abstract:
Collaborative work plays an important role in today's organizations, especially in areas where decisions must be made. Any decision that involves a collective or group of decision makers is, by itself, complex, and such decisions have become increasingly common in recent years. In this work we present the VirtualECare project, an intelligent multi-agent system able to monitor, interact with and serve its customers, who are typically in need of care services. In recent years there has been a substantial increase in the number of people in need of intensive care, especially among the elderly, a phenomenon related to population ageing. This is, however, no longer exclusive to the elderly, as diseases like obesity, diabetes and high blood pressure have been increasing among young adults. This is a new reality that must be dealt with by the health sector, particularly the public one. Given these scenarios, finding new and cost-effective ways of delivering health care is of particular importance, especially when we believe patients should not be removed from their natural "habitat". Following this line of thinking, the VirtualECare project is presented, together with similar projects that preceded it. Recently we have also witnessed a growing interest in combining the advances of the information society - computing, telecommunications and presentation - in order to create Group Decision Support Systems (GDSS). Indeed, the new economy, along with increased competition in today's complex business environments, leads companies to seek complementarities in order to increase competitiveness and reduce risks. Under these scenarios, planning takes a major role in a company's life. However, effective planning depends on the generation and analysis of ideas (innovative or not), and, as a result, the idea generation and management processes are crucial. Our objective is to apply the GDSS presented above to a new area.
We believe that the use of GDSS in the healthcare arena will allow professionals to achieve better results in the analysis of a patient's Electronic Clinical Profile (ECP). This is vital given the explosion of knowledge and skills, together with the need to use limited resources and obtain better results.
Abstract:
Amorphous SiC tandem heterostructures are used to filter a specific band in the visible range. Experimental and simulated results are compared to validate the use of SiC multilayered structures in applications where gain compensation is needed or where unwanted wavelengths must be attenuated. Spectral response data acquired under different frequencies, optical wavelength control and side irradiations are analyzed, and transfer function characteristics are discussed. Color pulsed communication channels are transmitted together, and the output signal is analyzed under different background conditions. Results show that, under controlled wavelength backgrounds, the device sensitivity is enhanced in a precise wavelength range and quenched in the others, tuning or suppressing a specific band. Depending on the background wavelength and irradiation side, the device acts as a long-pass, short-pass or band-rejection filter. An optoelectronic model supports the experimental results and gives insight into the physics of the device.
Abstract:
Video coding technologies have played a major role in the explosion of large-market digital video applications and services. In this context, the very popular MPEG-x and H.26x video coding standards adopted a predictive coding paradigm, where complex encoders exploit data redundancy and irrelevancy to 'control' much simpler decoders. This codec paradigm fits well applications and services such as digital television and video storage, where decoder complexity is critical, but does not match the requirements of emerging applications such as visual sensor networks, where encoder complexity is more critical. The Slepian-Wolf and Wyner-Ziv theorems brought the possibility to develop so-called Wyner-Ziv video codecs, following a different coding paradigm where it is the task of the decoder, and no longer of the encoder, to (fully or partly) exploit the video redundancy. Theoretically, Wyner-Ziv video coding does not incur any compression performance penalty with regard to the more traditional predictive coding paradigm (at least under certain conditions). In Wyner-Ziv video codecs, the so-called side information, a decoder estimate of the original frame to be coded, plays a critical role in the overall compression performance. For this reason, much research effort has been invested in the past decade in developing increasingly efficient side information creation methods. The main objective of this paper is to review and evaluate the available side information methods, after proposing a classification taxonomy to guide the review, allowing more solid conclusions to be reached and the next relevant research challenges to be better identified.
After classifying the side information creation methods into four classes, notably guess, try, hint and learn, the review of the most important techniques in each class, and the evaluation of some of them, leads to the important conclusion that the relative rate-distortion (RD) performance of side information creation methods depends on the amount of temporal correlation in each video sequence. It also became clear that the best available Wyner-Ziv video coding solutions are almost systematically based on the learn approach. The best solutions are already able to systematically outperform H.264/AVC Intra and, for specific types of content, also the H.264/AVC zero-motion standard solution. (C) 2013 Elsevier B.V. All rights reserved.
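As an illustration of the simplest variant of the "guess" class mentioned above, the sketch below estimates the side information for a Wyner-Ziv frame by pixel-wise averaging of the two adjacent key frames. Practical codecs use motion-compensated temporal interpolation instead; the frame data here is purely illustrative.

```python
def guess_side_information(prev_frame, next_frame):
    """Naive 'guess'-class side information: pixel-wise average of the
    previous and next key frames (no motion compensation)."""
    return [[(a + b) / 2 for a, b in zip(row_p, row_n)]
            for row_p, row_n in zip(prev_frame, next_frame)]

# decoder-side estimate of the missing middle frame
si = guess_side_information([[0, 2], [4, 6]], [[4, 6], [8, 10]])
```

The decoder then treats this estimate as a noisy version of the original frame and corrects it with the parity bits received from the Wyner-Ziv encoder.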
Abstract:
Master's degree in Informatics Engineering
Abstract:
Master's degree in Electrical and Computer Engineering
Abstract:
The foreseeable exponential growth of mobile communication networks, driven by user mobility, flexibility and convenience, has made them the most important segment of today's telecommunications world. It is therefore important to study and characterize radio channels for the various frequency bands used by the many existing technologies. The main objective of this Master's dissertation is to characterize a radio channel for the Worldwide Interoperability for Microwave Access (WiMAX) wireless technology, at the 3.5 GHz and 5 GHz frequencies, currently regarded by the scientific community as the wireless technology with the best prospects of success. To this end, the Power Delay Profile (PDP) and the power as a function of distance (PFD) were determined using the Finite-Difference Time-Domain (FDTD) computational simulation method. To study and characterize the radio channel in terms of fading due to delay spread, two alternative methods taking the PDP as input were used. To characterize the channel with respect to fading based on Doppler spread, two alternative techniques taking the PFD as input were also employed. In both cases the two alternative methods converged to the same results. The characterization is carried out in two different scenarios: one in which most obstacles are assumed to be perfect electric conductors, referred to as the PEC Scenario, and a second in which the obstacles have different electromagnetic properties, referred to as the MIX Scenario. In both scenarios it was concluded that the channel is flat, slow and free of inter-symbol interference (ISI).
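The flat-channel conclusion above rests on delay-spread statistics derived from the power delay profile (PDP). A minimal sketch of those standard formulas (mean excess delay and RMS delay spread) follows; the sample profile is illustrative, not taken from the dissertation.

```python
import math

def delay_spread(delays_ns, powers_lin):
    """Mean excess delay and RMS delay spread of a power delay profile.
    delays_ns: multipath delays (ns); powers_lin: linear path powers."""
    p_tot = sum(powers_lin)
    mean_tau = sum(p * t for p, t in zip(powers_lin, delays_ns)) / p_tot
    mean_tau_sq = sum(p * t * t for p, t in zip(powers_lin, delays_ns)) / p_tot
    return mean_tau, math.sqrt(mean_tau_sq - mean_tau ** 2)

# two equal-power paths 100 ns apart
mean_tau, sigma_tau = delay_spread([0.0, 100.0], [1.0, 1.0])
```

When the RMS delay spread is much smaller than the symbol period, the channel can be considered flat (no ISI), which is the criterion behind the conclusion reached in both scenarios.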
Abstract:
In general, modern networks are analysed by taking several Key Performance Indicators (KPIs) into account, and a proper balance among them is required to guarantee a desired Quality of Service (QoS), particularly in cellular wireless heterogeneous networks. A model that integrates a set of KPIs into a single one is presented, using a Cost Function that combines these KPIs and provides a single evaluation parameter for each network node, reflecting network conditions and the performance of common radio resource management strategies. The proposed model enables the implementation of different network management policies by manipulating KPIs according to users' or operators' perspectives, allowing for better QoS. Results show that different policies can in fact be established, with different impacts on the network, e.g., with median values varying by a factor higher than two.
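The abstract does not specify the exact form of the Cost Function; a common choice, sketched below under that assumption, is a weighted sum of KPIs normalized by target values, where the weight set encodes the management policy. All names and figures are illustrative.

```python
def node_cost(kpis, weights, targets):
    """Single evaluation parameter for a network node: weighted sum of
    KPIs normalized by their target values (weights encode the policy)."""
    return sum(weights[k] * kpis[k] / targets[k] for k in kpis)

# a user-centric policy might weigh delay and loss equally
cost = node_cost(kpis={"delay_ms": 30.0, "loss": 0.02},
                 weights={"delay_ms": 0.5, "loss": 0.5},
                 targets={"delay_ms": 60.0, "loss": 0.04})
```

Switching to a different `weights` dictionary implements a different policy (e.g. operator-centric, favouring load-related KPIs) without changing the model itself.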
Abstract:
Processes are a central entity in enterprise collaboration. Collaborative processes need to be executed and coordinated on a distributed computational platform where computers are connected through heterogeneous networks and systems. Life-cycle management of such collaborative processes requires a framework able to handle their diversity, based on different computational and communication requirements. This paper proposes a rationale for such a framework, points out key requirements and proposes a strategy for a supporting technological infrastructure. Beyond the portability of collaborative process definitions among different technological bindings, a framework to handle the different life-cycle phases of those definitions is presented and discussed. (c) 2007 Elsevier Ltd. All rights reserved.
Abstract:
Throughout the world, epidemiological studies have been established to examine the relationship between air pollution, mortality rates and adverse respiratory health effects. However, despite years of discussion, the correlation between adverse health effects and atmospheric pollution remains controversial, partly because these studies are frequently restricted to small, well-monitored areas. Monitoring air pollution is complex due to the large spatial and temporal variations of pollution phenomena, the high costs of recording instruments, and the low sampling density of a purely instrumental approach. Therefore, together with traditional instrumental monitoring, bioindication techniques allow the mapping of pollution effects over wide areas with a high sampling density. In this study, instrumental and biomonitoring techniques were integrated to support an epidemiological study to be developed in an industrial area located in Gijón, on the coast of central Asturias, Spain. Three main objectives were proposed: (i) to analyze temporal patterns of PM10 concentrations in order to apportion emission sources, (ii) to investigate spatial patterns of lichen conductivity to identify the impact of the studied industrial area on air quality, and (iii) to establish relationships between lichen conductivity and some site-specific characteristics. Samples of the epiphytic lichen Parmelia sulcata were transplanted in a grid of 18 by 20 km with the industrial area at its center. Lichens were exposed for a 5-month period starting in April 2010. After exposure, lichen samples were soaked in 18-MΩ water for determination of water electrical conductivity and, consequently, lichen vitality and cell damage. A marked decreasing gradient of lichen conductivity with distance from the emitting sources was observed. Transplants from a sampling site close to the industrial area reached values 10-fold higher than those far from it.
This finding showed that lichens reacted physiologically in the polluted industrial area, as evidenced by increased conductivity correlated with contamination level. The integration of temporal PM10 measurements and analysis of wind direction corroborated the importance of this industrialized region for air quality and identified the relevance of traffic for the urban area.
Abstract:
In this work, we present a neural network (NN) based method designed for 3D rigid-body registration of FMRI time series, which relies on a limited number of Fourier coefficients of the images to be aligned. These coefficients, comprised in a small cubic neighborhood located in the first octant of the 3D Fourier space (including the DC component), are fed into six NNs during the learning stage. Each NN yields the estimate of one registration parameter. The proposed method was assessed for 3D rigid-body transformations, using DC neighborhoods of different sizes. The mean absolute registration errors are approximately 0.030 mm in translations and 0.030 deg in rotations, for the typical motion amplitudes encountered in FMRI studies. The construction of the training set and the learning stage are fast, requiring, respectively, 90 s and 1 to 12 s, depending on the number of input and hidden units of the NN. We believe that NN-based approaches to the problem of FMRI registration can be of great interest in the future. For instance, NNs relying on limited k-space data (possibly from navigator echoes) could be a valid solution to the problem of prospective (in-frame) FMRI registration.
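A sketch of the feature-extraction step described above: taking a small cube of 3D DFT coefficients from the first octant (including the DC component) as the NN input vector. It uses NumPy; the neighborhood size and the real/imaginary packing are assumptions for illustration, not the paper's exact choices.

```python
import numpy as np

def fourier_features(volume, n=3):
    """Extract the n*n*n first-octant cube of 3D DFT coefficients
    (including DC) and pack real and imaginary parts into a flat
    feature vector suitable as NN input."""
    cube = np.fft.fftn(volume)[:n, :n, :n]
    return np.concatenate([cube.real.ravel(), cube.imag.ravel()])

# a constant volume carries all its energy in the DC coefficient
features = fourier_features(np.ones((4, 4, 4)), n=2)
```

One such vector per frame would feed the six networks, each regressing one of the six rigid-body parameters (three translations, three rotations).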
Abstract:
This work aims at investigating the impact of treating breast cancer with different radiation therapy (RT) techniques – forwardly-planned intensity-modulated RT (f-IMRT), inversely-planned IMRT and dynamic conformal arc RT (DCART) – and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the iPlan BrainLAB treatment planning system were compared: Pencil Beam Convolution (PBC) and the commercial Monte Carlo (iMC). Seven left-sided breast cancer patients submitted to breast-conserving surgery were enrolled in the study. For each patient, four RT techniques – f-IMRT, IMRT using 2 fields and 5 fields (IMRT2 and IMRT5, respectively) and DCART – were applied. The dose distributions in the planning target volume (PTV) and the doses to the organs at risk (OAR) were compared by analyzing dose–volume histograms; further statistical analysis was performed using IBM SPSS v20 software. For PBC, all techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV, in the OAR and also in the pattern of dose distribution spreading into normal tissues. IMRT5 and DCART spread low doses into greater volumes of normal tissue, right breast, right lung and heart than the tangential techniques. However, IMRT5 plans improved the dose distributions for the PTV, exhibiting better conformity and homogeneity in the target and reduced high-dose percentages in the ipsilateral OAR. DCART did not present advantages over any of the other techniques investigated. Differences were also found when comparing the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung and heart than the iMC algorithm predicted.
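The dose–volume histogram comparison described above can be illustrated with a minimal cumulative-DVH sketch: for each dose level, the fraction of a structure's voxels receiving at least that dose. The voxel doses below are synthetic, not patient data.

```python
def cumulative_dvh(voxel_doses, dose_levels):
    """Cumulative DVH: fraction of the structure's voxels receiving
    at least each dose level (in the same units as voxel_doses)."""
    n = len(voxel_doses)
    return [sum(1 for d in voxel_doses if d >= level) / n
            for level in dose_levels]

# four synthetic voxel doses (Gy) evaluated at three dose levels
dvh = cumulative_dvh([1.0, 2.0, 3.0, 4.0], [0.0, 2.5, 5.0])
```

Plotting such curves per structure (PTV and each OAR) for the four techniques and two algorithms gives the comparison the abstract summarizes.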