997 results for Palatal expansion techniques


Relevance:

20.00%

Publisher:

Abstract:

The automatic organization of e-mail messages is a current challenge in machine learning. The excessive number of messages affects more and more users, especially those who use e-mail as a communication and working tool. This thesis addresses the problem of automatic e-mail organization, proposing a solution aimed at the automatic labeling of messages. Automatic labeling uses the e-mail folders previously created by users, treating them as labels, and suggests multiple labels for each message (top-N). Several learning techniques are studied, and the various fields that compose an e-mail message are analyzed to determine their suitability as classification elements. The focus of this work is on the textual fields (the subject and body of the messages), for which different forms of representation, feature selection and classification algorithms are studied. The participant fields are also evaluated, through classification algorithms that represent them using the vector space model or as a graph. The various fields are combined for classification using the Majority Voting classifier-combination technique. Tests are carried out with a subset of Enron e-mail messages and a private dataset made available by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC). These datasets are analyzed in order to understand the characteristics of the data. The system is evaluated through the accuracy of the classifiers. The results obtained show significant improvements compared with related work.
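
The Majority Voting combination with top-N label suggestion can be sketched as follows. This is a minimal illustration, not the thesis implementation; the folder labels and the per-field rankings are invented, and each per-field classifier is abstracted to the ranked label list it would output.

```python
from collections import Counter

def top_n_labels(field_predictions, n=3):
    """Combine per-field classifier rankings by Majority Voting and
    return the n most voted folder labels for one e-mail message.
    field_predictions: one ranked label list per field
    (e.g. subject, body, participants)."""
    votes = Counter()
    for ranking in field_predictions:
        for label in ranking[:n]:       # each field votes for its own top-n labels
            votes[label] += 1
    return [label for label, _ in votes.most_common(n)]

# Hypothetical rankings produced by three per-field classifiers.
subject_clf = ["work", "conferences", "admin"]
body_clf    = ["conferences", "work", "personal"]
people_clf  = ["conferences", "admin", "work"]

print(top_n_labels([subject_clf, body_clf, people_clf], n=2))
# → ['conferences', 'work']
```

Suggesting several labels per message (rather than a single folder) is what lets the user pick among a short list, which is the top-N idea described above.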

Relevance:

20.00%

Publisher:

Abstract:

This dissertation aims to design and implement a fault-tolerant control system for the experimental irrigation canal of the University of Évora, using a model implemented in MATLAB/SIMULINK®. To meet this challenge, several fault-diagnosis techniques were analyzed, and neural-network-based techniques were chosen for the development of a fault detection and isolation system for the irrigation canal, independently of the type of control system used. Neural networks were thus the nonlinear processors used; they are the most advisable in situations where process data are abundant, because they learn from examples and are supported by statistical and optimization theories, focusing not only on signal processing but also expanding the horizons of that processing. The emphasis of neural network models is on their dynamics, stability and behavior. The research work that led to this dissertation therefore had as its main objectives the development of neural network models that best represent the dynamics of the irrigation canal, in order to obtain a fault detection system that compares the values produced by the models with those of the process. From this difference of values, which yields a residual, it is possible to develop both the fault detection and the fault isolation systems based on neural networks, thus enabling the development of a fault-tolerant control system comprising the detection, isolation/diagnosis and reconfiguration modules of the irrigation canal. In summary, this dissertation developed a system that allows the process to be reconfigured when faults occur, significantly improving the performance of the irrigation canal.
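
The residual-generation step described above can be illustrated with a minimal sketch. The neural model is abstracted here to a single predicted value, and the water level and threshold are hypothetical; a real system would filter the residual and use one model per fault to support isolation.

```python
def detect_fault(model_output, measured, threshold):
    """Residual-based fault detection: a fault is flagged when the
    discrepancy between the model prediction and the measured process
    output exceeds a threshold."""
    residual = abs(model_output - measured)
    return residual, residual > threshold

# Hypothetical canal water level (m): model prediction vs. sensor reading.
residual, fault = detect_fault(model_output=1.20, measured=1.55, threshold=0.25)
print(fault)  # a deviation of 0.35 m exceeds the 0.25 m threshold
```

Running one such comparison per neural model (each trained on a different fault-free or faulty regime) is what would allow isolation, i.e. deciding which fault occurred, as outlined in the abstract.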

Relevance:

20.00%

Publisher:

Abstract:

Storm and tsunami deposits are generated by similar depositional mechanisms, making them hard to discriminate using classic sedimentologic methods. Here we propose an original approach to identify tsunami-induced deposits by combining numerical simulation and rock magnetism. To test our method, we investigate the tsunami deposit of the Boca do Rio estuary generated by the 1755 Lisbon earthquake, which is well described in the literature. We first test the 1755 tsunami scenario using a numerical inundation model to provide physical parameters for the tsunami wave. Then we use concentration-sensitive (MS, SIRM) and grain-size-sensitive (chi(ARM), ARM, B1/2, ARM/SIRM) magnetic proxies, coupled with SEM microscopy, to unravel the magnetic mineralogy of the tsunami-induced deposit and its associated depositional mechanisms. To study the connection between the tsunami deposit and the different sedimentologic units present in the estuary, magnetic data were processed by multivariate statistical analyses. Our numerical simulation shows a large inundation of the estuary, with flow depths varying from 0.5 to 6 m and a run-up of ~7 m. Magnetic data show a dominance of paramagnetic minerals (quartz) mixed with a lesser amount of ferromagnetic minerals, namely titanomagnetite and titanohematite, both of detrital origin and reworked from the underlying units. Multivariate statistical analyses indicate the closest connection between the tsunami-induced deposit and a mixture of Units C and D. All these results point to a scenario where the energy released by the tsunami wave was strong enough to overtop and erode a significant amount of sand from the littoral dune and mix it with material reworked from underlying layers at least 1 m in depth. The method tested here represents an original and promising tool to identify tsunami-induced deposits in similar embayed beach environments.
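
The connection between the tsunami deposit and the candidate source units can be pictured as a distance comparison in magnetic-proxy space. This is only a toy sketch: the proxy vectors below are invented, already-standardized values, and the paper's multivariate statistics are richer than a nearest-neighbour rule.

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def closest_unit(target, units):
    """Return the sedimentologic unit whose standardized magnetic-proxy
    vector lies closest to the target deposit in parameter space."""
    return min(units, key=lambda name: euclidean(target, units[name]))

# Illustrative, made-up proxy vectors (e.g. MS, SIRM, ARM/SIRM) after
# standardization; real values would come from the measured samples.
tsunami = [0.9, 1.1, 0.4]
units = {
    "Unit C":       [1.0, 1.0, 0.5],
    "Unit D":       [0.7, 1.2, 0.3],
    "Unit C+D mix": [0.85, 1.1, 0.4],
}
print(closest_unit(tsunami, units))  # → Unit C+D mix
```

A mixture vector lying closer to the deposit than either pure unit is the kind of evidence behind the "mixture of Units C and D" conclusion above.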

Relevance:

20.00%

Publisher:

Abstract:

Recently, several distributed video coding (DVC) solutions based on the distributed source coding (DSC) paradigm have appeared in the literature. Wyner-Ziv (WZ) video coding, a particular case of DVC where side information is made available at the decoder, enables a flexible distribution of the computational complexity between the encoder and the decoder, promising to fulfill novel requirements from applications such as video surveillance, sensor networks and mobile camera phones. The quality of the side information at the decoder plays a critical role in determining the WZ video coding rate-distortion (RD) performance, notably in raising it as close as possible to the RD performance of standard predictive video coding schemes. Towards this target, efficient motion search algorithms for powerful frame interpolation are much needed at the decoder. In this paper, the RD performance of a Wyner-Ziv video codec is improved by using novel, advanced motion-compensated frame interpolation techniques to generate the side information. The development of this type of side information estimator is a difficult problem in WZ video coding, especially because the decoder only has some decoded reference frames available. Based on the regularization of the motion field, novel side information creation techniques are proposed in this paper, along with a new frame interpolation framework able to generate higher-quality side information at the decoder. To illustrate the RD performance improvements, this novel side information creation framework has been integrated into a transform-domain turbo-coding-based Wyner-Ziv video codec. Experimental results show that the novel side information creation solution leads to better RD performance than available state-of-the-art side information estimators, with improvements of up to 2 dB; moreover, it outperforms H.264/AVC Intra by up to 3 dB with a lower encoding complexity.
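
The core idea of motion-compensated frame interpolation can be shown with a toy 1-D sketch. Real codecs operate on 2-D blocks with regularized motion fields, so this only illustrates the bidirectional, symmetric-search principle: find a displacement that aligns the two reference frames, then average the motion-compensated samples.

```python
def interpolate_midframe(prev, next_, search=2):
    """Toy 1-D bidirectional motion-compensated interpolation: for each
    sample, find the symmetric displacement d minimizing the mismatch
    between prev[i - d] and next_[i + d], then average the two
    motion-compensated samples to build the interpolated frame."""
    n = len(prev)
    mid = []
    for i in range(n):
        best = None
        for d in range(-search, search + 1):
            j, k = i - d, i + d
            if 0 <= j < n and 0 <= k < n:
                cost = abs(prev[j] - next_[k])       # matching error for this d
                cand = (cost, (prev[j] + next_[k]) / 2)
                if best is None or cand[0] < best[0]:
                    best = cand
        mid.append(best[1])
    return mid

# An intensity edge moving two samples between the two reference frames;
# the interpolated frame places the edge halfway.
prev_frame = [0, 0, 10, 10, 10, 10]
next_frame = [0, 0, 0, 0, 10, 10]
print(interpolate_midframe(prev_frame, next_frame))
# → [0.0, 0.0, 0.0, 10.0, 10.0, 10.0]
```

The interpolated frame serves as the decoder's side information; the closer it is to the true intermediate frame, the fewer WZ bits the turbo decoder needs, which is why side information quality drives RD performance.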

Relevance:

20.00%

Publisher:

Abstract:

Interest rate risk is one of the major financial risks faced by banks due to the very nature of the banking business. The most common approach in the literature has been to estimate the impact of interest rate risk on banks using a simple linear regression model. However, the relationship between interest rate changes and bank stock returns need not be exclusively linear. This article provides a comprehensive analysis of the interest rate exposure of the Spanish banking industry, employing both parametric and nonparametric estimation methods. Its main contribution is to use, for the first time in the context of banks' interest rate risk, a nonparametric regression technique that avoids the assumption of a specific functional form. On the one hand, it is found that the Spanish banking sector exhibits a remarkable degree of interest rate exposure, although the impact of interest rate changes on bank stock returns has significantly declined following the introduction of the euro. Further, a pattern of positive exposure emerges during the post-euro period. On the other hand, the results of the nonparametric model support the expansion of the conventional linear model in an attempt to gain greater insight into the actual degree of exposure.
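
A nonparametric regression that avoids a functional-form assumption can be sketched with a Nadaraya-Watson kernel estimator. The article does not specify which estimator it uses, so this is one standard choice for illustration, and the return/rate data below are invented.

```python
import math

def nw_regression(x_query, xs, ys, bandwidth=1.0):
    """Nadaraya-Watson kernel estimator: a nonparametric regression that
    locally weights the observations with a Gaussian kernel instead of
    assuming a specific functional form."""
    weights = [math.exp(-((x_query - x) / bandwidth) ** 2 / 2) for x in xs]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)

# Made-up data: bank stock return vs. interest rate change, with an
# asymmetric response that a single straight line would average away.
rate_changes = [-2.0, -1.0, 0.0, 1.0, 2.0]
returns      = [ 3.0,  2.5, 0.0, -0.5, -0.7]
print(nw_regression(0.5, rate_changes, returns, bandwidth=0.5))
```

Because the fit at each query point depends mostly on nearby observations, the estimated exposure can differ between rate rises and rate falls, which is exactly the kind of nonlinearity the linear model cannot capture.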

Relevance:

20.00%

Publisher:

Abstract:

It is widely accepted that organizations and individuals must be innovative and continually create new knowledge and ideas to deal with rapid change. Innovation plays an important role not only in the development of new businesses, processes and products, but also in the competitiveness and success of any organization. Technology for Creativity and Innovation: Tools, Techniques and Applications provides empirical research findings and best practices on creativity and innovation in business, organizational, and social environments. It is written for educators, academics and professionals who want to improve their understanding of creativity and innovation, as well as the role technology plays in shaping this discipline.

Relevance:

20.00%

Publisher:

Abstract:

The introduction of Electric Vehicles (EVs), together with the implementation of smart grids, will raise new challenges for power system operators. This paper proposes a demand response program for electric vehicle users which provides the network operator with another useful resource: reducing vehicles' charging needs. This demand response program enables vehicle users to earn a profit by agreeing to reduce their travel needs and minimum battery level requirements in a given period. To support network operator actions, the amount of demand response usage can be estimated using data mining techniques applied to a database containing a large set of operation scenarios. The paper includes a case study based on simulated operation scenarios that consider different operating conditions, e.g. available renewable generation, and a diversity of distributed resources and electric vehicles with vehicle-to-grid and demand response capacity in a 33-bus distribution network.
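
The reduce-and-reward idea can be sketched as follows. The fleet data and the incentive rate are hypothetical, and the actual program design in the paper (minimum battery levels, trip constraints) is richer than this per-vehicle accounting.

```python
def dr_reduction(vehicles, incentive_per_kwh):
    """Sketch of the EV demand response program: each user may agree to
    reduce the charging need for a period, receiving an incentive for
    the energy not charged."""
    total_reduced = 0.0
    payments = {}
    for name, (required_kwh, agreed_reduction_kwh) in vehicles.items():
        reduction = min(agreed_reduction_kwh, required_kwh)  # cannot reduce more than needed
        total_reduced += reduction
        payments[name] = reduction * incentive_per_kwh
    return total_reduced, payments

# Hypothetical fleet: (charging requirement, agreed reduction), in kWh.
fleet = {"EV1": (20.0, 5.0), "EV2": (12.0, 3.0), "EV3": (8.0, 10.0)}
reduced, pay = dr_reduction(fleet, incentive_per_kwh=0.10)
print(reduced, pay)
```

The aggregate `reduced` figure is the resource the network operator can count on for that period; estimating it over many operation scenarios is where the data mining step in the paper comes in.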

Relevance:

20.00%

Publisher:

Abstract:

This paper describes a methodology developed for the classification of Medium Voltage (MV) electricity customers. Starting from a sample of databases resulting from a monitoring campaign, Data Mining (DM) techniques are used to discover a set of MV consumer typical load profiles and, therefore, to extract knowledge regarding electric energy consumption patterns. In the first stage, several hierarchical clustering algorithms were applied and their clustering performance compared using adequacy measures. In the second stage, a classification model was developed to allow classifying new consumers into one of the clusters obtained in the previous stage. Finally, the interpretation of the discovered knowledge is presented and discussed.
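
The second-stage assignment of a new consumer to one of the discovered clusters can be sketched as a nearest-centroid rule. The centroids and the load diagrams below are invented, and the paper's classification model may differ; this only illustrates the idea of matching a measured load diagram against the typical profiles.

```python
import math

def classify_consumer(load_profile, cluster_centroids):
    """Assign a new MV consumer to the typical load profile (cluster
    centroid) closest to its measured load diagram."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(cluster_centroids,
               key=lambda name: dist(load_profile, cluster_centroids[name]))

# Hypothetical normalized 4-point load diagrams
# (night, morning, afternoon, evening).
centroids = {
    "residential-like": [0.2, 0.5, 0.4, 1.0],
    "industrial":       [0.8, 1.0, 1.0, 0.6],
    "commercial":       [0.1, 0.9, 1.0, 0.3],
}
new_consumer = [0.15, 0.85, 0.95, 0.35]
print(classify_consumer(new_consumer, centroids))  # → commercial
```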

Relevance:

20.00%

Publisher:

Abstract:

To comply with natural gas demand growth patterns and Europe's import dependency, the gas industry needs to organize an efficient upstream infrastructure. The best location of Gas Supply Units (GSUs) and the choice of transportation mode, by physical or virtual pipelines, are the keys to a successful industry. In this work we study the optimal location of GSUs, as well as the most efficient allocation of gas loads to sources, selecting the best transportation mode, observing specific technical restrictions and minimizing system total costs. For the location of GSUs on the system we use the P-median problem; for assigning gas demand nodes to source facilities we use the classical transportation problem. The developed model is an optimisation-based approach built on a Lagrangean heuristic, using Lagrangean relaxation for P-median problems (the Simple Lagrangean Heuristic). The solution of this heuristic can be improved by adding a local search procedure, the Lagrangean Reallocation Heuristic. These two heuristics, Simple Lagrangean and Lagrangean Reallocation, were tested on a realistic network: the primary Iberian natural gas network, organized with 65 nodes connected by physical and virtual pipelines. Computational results are presented for both approaches, showing the gas source locations and load allocation arrangement, system total costs and gas transportation modes.
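
The location-then-allocation structure of the P-median problem can be illustrated with a simple greedy sketch. Note the paper solves it with a Lagrangean heuristic, which this toy greedy procedure does not reproduce; the cost matrix is invented.

```python
def p_median(costs, p):
    """Greedy sketch of the P-median location problem: iteratively open
    the source whose addition most reduces the total cost of serving
    every demand node from its cheapest open source, then allocate each
    node to its cheapest open source."""
    n_sources, n_nodes = len(costs), len(costs[0])

    def total_cost(opened):
        return sum(min(costs[s][j] for s in opened) for j in range(n_nodes))

    open_set = []
    for _ in range(p):
        best = min((s for s in range(n_sources) if s not in open_set),
                   key=lambda s: total_cost(open_set + [s]))
        open_set.append(best)
    allocation = [min(open_set, key=lambda s: costs[s][j]) for j in range(n_nodes)]
    return sorted(open_set), allocation, total_cost(open_set)

# Hypothetical transport costs: rows = candidate GSU sites, cols = demand nodes.
costs = [
    [1, 4, 9, 8],
    [5, 2, 3, 6],
    [7, 6, 1, 2],
]
print(p_median(costs, p=2))  # → ([1, 2], [1, 1, 2, 2], 10)
```

In the Lagrangean approach of the paper, the assignment constraints are relaxed into the objective with multipliers updated by subgradient steps; the greedy sketch above only conveys what a P-median solution looks like (which sites open, which node goes where, at what total cost).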

Relevance:

20.00%

Publisher:

Abstract:

Introduction: This paper deals with the increasing use of corporate merger and acquisition strategies within the pharmaceutical industry. The aim is to identify the triggers of this business phenomenon and its immediate impact on the financial outcome of two powerful biopharmaceutical corporations, Pfizer and GlaxoSmithKline, which have been sampled due to their successful use of the tactics in question. Materials and Methods: In order to create an overview of the development steps taken through mergers and acquisitions, the historical data of the two corporations was consulted from their official websites. The most relevant events were then associated with adequate information from the financial reports and statements of the two corporations, provided by web-based financial data services. Results and Discussion: In the past few decades Pfizer and GlaxoSmithKline have purchased or merged with various companies in order to monopolize new markets, diversify their product and service portfolios, and survive and surpass competitors. The consequences proved to be positive, although this approach requires a certain capital availability. Conclusions: The results reveal that, as far as the two sampled companies are concerned, acquisitions and mergers are reactions to the pressure of a highly competitive environment. Moreover, the continuous diversification of the market's needs is also a consistent motive. However, the prevalence and eminence of merger and acquisition strategies are conditioned by the tender offer, the announcer's caliber, research and development status, and other factors determined by the internal and external actors of the market.

Relevance:

20.00%

Publisher:

Abstract:

In recent years, Power Systems (PS) have experienced many changes in their operation. The introduction of new players managing Distributed Generation (DG) units and the existence of new Demand Response (DR) programs make the control of the system a more complex problem, while also allowing a more flexible management. Intelligent resource management in the context of smart grids is of huge importance so that smart grid functions are assured. This paper proposes a new methodology to support system operators and/or Virtual Power Players (VPPs) in determining effective and efficient DR programs that can be put into practice. The method is based on the use of data mining techniques applied to a database obtained from a large set of operation scenarios. The paper includes a case study based on 27,000 scenarios considering a diversity of distributed resources in a 32-bus distribution network.
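
One way the data mining step could work is a one-level decision tree (a stump) learned over the scenario database. The scenario records and the single "available renewable generation" feature below are invented for illustration; the paper's mined models operate on richer scenario descriptions.

```python
def best_threshold(scenarios):
    """Toy data-mining step: over a database of operation scenarios
    (available renewable generation vs. whether DR was actually needed),
    find the renewable-level threshold that best separates 'DR needed'
    from 'DR not needed' — a one-level decision tree."""
    def errors(t):
        # classify: DR needed when renewable generation is below threshold t
        return sum((renew < t) != needed for renew, needed in scenarios)
    candidates = sorted({renew for renew, _ in scenarios})
    return min(candidates, key=errors)

# Hypothetical scenarios: (available renewable MW, was DR needed?).
db = [(10, True), (15, True), (22, True),
      (30, False), (35, False), (50, False)]
print(best_threshold(db))  # → 30
```

A rule such as "call the DR program when renewables fall below 30 MW" is the kind of actionable knowledge a system operator or VPP could extract from a large scenario database.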

Relevance:

20.00%

Publisher:

Abstract:

The growing importance and influence of new resources connected to power systems has caused many changes in their operation. Environmental policies and several well-known advantages have made renewable-based energy resources widely disseminated. These resources, including Distributed Generation (DG), are being connected at lower voltage levels, where Demand Response (DR) must be considered too. These changes increase the complexity of system operation due to both new operational constraints and the amounts of data to be processed. Virtual Power Players (VPPs) are entities able to manage these resources. Addressing these issues, this paper proposes a methodology to support VPP actions when acting as a Curtailment Service Provider (CSP) that provides DR capacity to a DR program declared by the Independent System Operator (ISO) or by the VPP itself. The amount of DR capacity that the CSP can assure is determined using data mining techniques applied to a database obtained from a large set of operation scenarios. The paper includes a case study based on 27,000 scenarios considering a diversity of distributed resources in a 33-bus distribution network.
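
The DR capacity a CSP can "assure" could, for instance, be quoted as a conservative empirical quantile over the scenario results. This is an assumption for illustration, not the paper's data mining method, and the per-scenario capacity values are invented.

```python
def assured_dr_capacity(scenario_dr_mw, confidence=0.95):
    """Sketch of how a Curtailment Service Provider could quote the DR
    capacity it can assure: the level met in at least `confidence` of
    the simulated operation scenarios (an empirical low quantile)."""
    values = sorted(scenario_dr_mw)
    index = int((1 - confidence) * len(values))  # floor: stay conservative
    return values[index]

# Hypothetical DR capacity (MW) obtained in 10 operation scenarios.
scenarios = [4.2, 5.1, 3.8, 6.0, 4.9, 5.5, 4.4, 5.0, 4.1, 4.7]
print(assured_dr_capacity(scenarios))  # → 3.8
```

Quoting a low quantile rather than the mean protects the CSP from over-committing capacity it can only deliver in favourable scenarios.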

Relevance:

20.00%

Publisher:

Abstract:

In recent decades, all over the world, competition in the electric power sector has deeply changed the way this sector's agents play their roles. In most countries, deregulation of the electricity sector was conducted in stages, beginning with the clients at higher voltage levels and with larger electricity consumption, and later extended to all electricity consumers. The sector's liberalization and the operation of competitive electricity markets were expected to lower prices and improve quality of service, leading to greater consumer satisfaction. Transmission and distribution remain noncompetitive business areas, due to the large infrastructure investments required. However, the industry has yet to clearly establish the best business model for transmission in a competitive environment. After generation, electricity needs to be delivered to the system nodes where demand requires it, taking into consideration transmission constraints and electrical losses. If the amount of power flowing through a certain line is close to or surpasses its safety limits, then cheap but distant generation might have to be replaced by more expensive, closer generation to reduce the excess power flows. In a congested area, the optimal price of electricity rises to the marginal cost of local generation, or to the level needed to ration demand down to the amount of available electricity. Even without congestion, some power is lost in the transmission system through heat dissipation, so prices reflect that it is more expensive to supply electricity at the far end of a heavily loaded line than close to a generation site. Locational marginal pricing (LMP), resulting from bidding competition, represents electrical and economic values at nodes or in areas, and may provide economic indicator signals to market agents. This article proposes a data-mining-based methodology that helps characterize zonal prices in real power transmission networks.
To test our methodology, we used an LMP database from the California Independent System Operator (CAISO) for 2009 to identify economic zones. (CAISO is a nonprofit public benefit corporation charged with operating the majority of California's high-voltage wholesale power grid.) To group the buses into typical classes, each representing a set of buses with approximately the same LMP value, we used two-step and k-means clustering algorithms. By analyzing the various LMP components, our goal was to extract knowledge to support the ISO in investment and network-expansion planning.
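
The k-means grouping of buses by LMP can be sketched in one dimension. The LMP values below are invented, and the real study clusters on several LMP components rather than a single price per bus.

```python
def kmeans_1d(values, k, iters=20):
    """Minimal 1-D k-means, in the spirit of grouping buses by LMP value
    to reveal economic zones of the transmission network."""
    # spread initial seeds across the sorted values
    centroids = sorted(values)[::max(1, len(values) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda c: abs(v - centroids[c]))
            clusters[nearest].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Hypothetical LMPs ($/MWh) at eight buses: a cheap zone and a congested zone.
lmps = [28.0, 29.5, 30.2, 31.0, 44.8, 45.5, 46.1, 47.0]
print(kmeans_1d(lmps, k=2))
```

Each resulting centroid stands for one "economic zone"; a persistent gap between zone prices is the congestion signature that supports investment and network-expansion decisions.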

Relevance:

20.00%

Publisher:

Abstract:

Considerable research effort has been devoted to the topic of optimal planning of distribution systems. The nonlinear nature of the system, the need to consider a large number of scenarios, and the increasing necessity of dealing with uncertainties make optimal planning in distribution systems a difficult task. Heuristic approaches have been proposed to deal with these issues, overcoming some of the inherent difficulties of classic methodologies. This paper considers several methodologies used to address planning problems of electrical power distribution networks, namely mixed-integer linear programming (MILP), ant colony algorithms (AC), genetic algorithms (GA), tabu search (TS), branch exchange (BE), simulated annealing (SA), and the Benders decomposition deterministic nonlinear optimization technique (BD). The adequacy of these techniques to deal with uncertainties is discussed. The behaviour of each optimization technique is compared in terms of both the obtained solution and the methodology's performance. The paper presents results of the application of these optimization techniques to a real case: a 10-kV electrical distribution system with 201 nodes that feeds an urban area.
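
Among the compared heuristics, simulated annealing is the easiest to sketch generically: worse neighbours are accepted with a probability that decays with the temperature, which lets the search escape local minima. The toy objective below is invented and stands in for a real planning cost function (investment plus losses).

```python
import math, random

def simulated_annealing(cost, neighbor, x0, t0=10.0, cooling=0.95, steps=300):
    """Generic simulated annealing loop: always accept improving moves,
    accept worsening moves with probability exp(-delta / temperature)."""
    random.seed(42)                     # deterministic run, for illustration
    x, t = x0, t0
    best = x
    for _ in range(steps):
        y = neighbor(x)
        delta = cost(y) - cost(x)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x = y
        if cost(x) < cost(best):
            best = x
        t *= cooling                    # geometric cooling schedule
    return best

# Toy planning decision: an integer "reinforcement level" whose cost
# trades investment (rising in x) against losses (falling in x);
# hypothetical, with its minimum at x = 7.
cost = lambda x: (x - 7) ** 2 + 3
neighbor = lambda x: x + random.choice([-1, 1])
print(simulated_annealing(cost, neighbor, x0=0))
```

The same loop applies to real planning problems by swapping in a network-state representation, a feeder-swap neighbourhood, and a cost function that checks the operational constraints.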