Abstract:
One of the filmic trends neglected by the Academy Awards is metacinema, which for practical purposes I will consider to be a cross between the complexities of self-reflexive cinema (strongly associated with modernism) and the Hollywood Film (the classical films about the urge to 'make it' in Hollywood). Indeed, such films have always existed and some, such as Sunset Boulevard (Billy Wilder, 1950, USA) and Mulholland Dr. (David Lynch, 2001, FRA/USA), have even made it to the ceremony, but were, predictably, defeated in the main categories by other, more 'serious' or less self-reflexive products. The United States has always insisted on not revealing the tricks of the trade while, ironically, generating films that deal with this theme in order to cater to the curiosity of the metacinema-inclined spectator. For this reason such films are usually about the universe of cinema but not its medium, at least not in a way that discloses the operations of the technical apparatus. Why are these films not viewed as serious and artistic enough to be awarded Oscars by the Academy in the categories of Best Picture, Best Director, Best Screenplay, and Best Cinematography? Are they being discarded for the same reasons that comedies and musicals usually are? Or are they being punished for unveiling too much? Or is the industry favouring commercial products that can easily be pushed on a global scale and turn a profit?
Abstract:
Dissertation submitted to obtain the degree of Master in Electrical Engineering, Automation and Industrial Electronics branch
Abstract:
Consider the problem of assigning implicit-deadline sporadic tasks on a heterogeneous multiprocessor platform comprising two different types of processors (such a platform is referred to as a two-type platform). We present two low-degree polynomial time-complexity algorithms, SA and SA-P, each providing the following guarantee. For a given two-type platform and a task set, if there exists a task assignment such that tasks can be scheduled to meet deadlines by allowing them to migrate only between processors of the same type (intra-migrative), then (i) using SA, it is guaranteed to find such an assignment, with the same restriction on task migration, given a platform in which processors are 1+α/2 times faster, and (ii) SA-P succeeds in finding a task assignment in which tasks are not allowed to migrate between processors (non-migrative), given a platform in which processors are 1+α times faster. The parameter 0<α≤1 is a property of the task set; it is the maximum of all task utilizations that are no greater than 1. We evaluate the average-case performance of both algorithms by generating task sets randomly and measuring how much faster the processors need to be (upper bounded by 1+α/2 for SA and 1+α for SA-P) for the algorithms to output a feasible task assignment (intra-migrative for SA and non-migrative for SA-P). In our evaluations, for the vast majority of task sets, these algorithms require significantly smaller processor speedup than their theoretical bounds indicate. Finally, we consider a special case where no task utilization in the given task set exceeds one, and for this case we (re-)prove the performance guarantees of SA and SA-P. We show, for both algorithms, that changing the adversary from intra-migrative to a more powerful one, namely fully-migrative, in which tasks can migrate between processors of any type, does not degrade the performance guarantees. For this special case, we compare the average-case performance of SA-P and a state-of-the-art algorithm by generating task sets randomly. In our evaluations, SA-P outperforms the state-of-the-art algorithm by requiring much smaller processor speedup and by running orders of magnitude faster.
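As a rough illustration of how the parameter α and the two speedup bounds fit together, the following Python sketch computes α from a task set's utilizations, exactly as defined above, and reports the theoretical speedup required by SA and SA-P. The utilization values are hypothetical; the actual task-assignment logic of SA and SA-P is not shown.

```python
# Minimal sketch: alpha and the speedup bounds of SA and SA-P.
# The task utilizations below are hypothetical example values.

def alpha(utilizations):
    """alpha is the maximum of all task utilizations no greater than 1."""
    return max(u for u in utilizations if u <= 1)

tasks = [0.2, 0.5, 0.9, 0.35]   # implicit-deadline sporadic task utilizations
a = alpha(tasks)                # here: 0.9

print(f"SA   needs processors {1 + a / 2:.2f}x faster (intra-migrative)")
print(f"SA-P needs processors {1 + a:.2f}x faster (non-migrative)")
```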
Abstract:
Dissertation submitted to obtain the degree of Master in Electrical Engineering, Energy branch
Abstract:
1st European IAHR Congress, 4-6 May, Edinburgh, Scotland
Abstract:
River Flow 2010
Abstract:
Hyperspectral imaging has become one of the main topics in remote sensing. Hyperspectral images comprise hundreds of spectral bands at different (almost contiguous) wavelength channels over the same area, generating large data volumes of several GBs per flight. This high spectral resolution can be used for object detection and to discriminate between different objects based on their spectral characteristics. One of the main problems in hyperspectral analysis is the presence of mixed pixels, which arise when the spatial resolution of the sensor is not able to separate spectrally distinct materials. Spectral unmixing is one of the most important tasks for hyperspectral data exploitation. However, unmixing algorithms can be computationally very expensive and power-consuming, which compromises their use in applications under on-board constraints. In recent years, graphics processing units (GPUs) have evolved into highly parallel and programmable systems. Specifically, several hyperspectral imaging algorithms have been shown to benefit from this hardware, taking advantage of the extremely high floating-point processing performance, compact size, huge memory bandwidth, and relatively low cost of these units, which make them appealing for on-board data processing. In this paper, we propose a parallel implementation of an augmented Lagrangian based method for unsupervised hyperspectral linear unmixing on GPUs using CUDA. The method, called simplex identification via split augmented Lagrangian (SISAL), aims to identify the endmembers of a scene, i.e., it is able to unmix hyperspectral data sets in which the pure-pixel assumption is violated. The efficient implementation of the SISAL method presented in this work exploits the GPU architecture at a low level, using shared memory and coalesced memory accesses.
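For readers unfamiliar with linear unmixing, the sketch below illustrates the mixing model that SISAL inverts: each observed pixel is a convex combination of endmember spectra plus noise. This is a NumPy toy under invented dimensions and data, not the CUDA implementation described in the paper; the least-squares step is only a crude baseline for abundance estimation.

```python
import numpy as np

# Toy linear mixing model: each pixel y = M @ a, where the columns of M are
# endmember spectra and a holds abundance fractions (nonnegative, sum to one).
# All dimensions and values are hypothetical.
bands, endmembers, pixels = 50, 3, 1000
rng = np.random.default_rng(0)

M = rng.random((bands, endmembers))                 # endmember signatures
A = rng.dirichlet(np.ones(endmembers), pixels).T    # abundance fractions
Y = M @ A + 0.01 * rng.standard_normal((bands, pixels))  # noisy mixed pixels

# Unmixing: recover the endmembers (what SISAL does), then estimate
# abundances; plain least squares serves here as a crude baseline.
A_hat, *_ = np.linalg.lstsq(M, Y, rcond=None)
print("mean abundance error:", np.abs(A_hat - A).mean())
```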
Abstract:
Dissertation submitted to obtain a Ph.D. (Doutoramento) degree in Biology at the Instituto de Tecnologia Química e Biológica da Universidade Nova de Lisboa
Abstract:
Dissertation presented to obtain a Ph.D. degree in Engineering and Technology Sciences, Biotechnology at the Instituto de Tecnologia Química e Biológica, Universidade Nova de Lisboa
Abstract:
Optimization in today's decision support systems is strongly interdisciplinary, relating to the need to integrate different techniques and paradigms to solve complex real-world problems, for many of which computing optimal solutions is intractable. Heuristic search methods are known to produce good results within an acceptable time frame, but they often require their parameterization to be tuned in order to achieve good results. In this sense, learning strategies can improve a system's performance by endowing it with the ability to learn, for example, which optimization technique is most suitable for solving a particular class of problems, or which parameterization of a given algorithm is most appropriate in a given scenario. Some of the optimization methods most widely used for solving real-world problems resulted from adapting ideas from several research areas, mainly inspired by nature: meta-heuristics. The process of selecting a meta-heuristic to solve a given problem is itself an optimization problem. Hyper-heuristics arise in this context as efficient methodologies for selecting or generating heuristics (or meta-heuristics) to solve NP-hard optimization problems. This dissertation aims to contribute to the problem of selecting meta-heuristics and their respective parameterization. To that end, it describes the specification of a hyper-heuristic for selecting nature-inspired techniques to solve the task scheduling problem in manufacturing systems, based on previous experience. The hyper-heuristic module developed uses a reinforcement learning algorithm (Q-Learning), which endows the system with the ability to automatically select the meta-heuristic to be used in the optimization process, as well as its parameterization. Finally, computational tests are carried out to evaluate the influence of the hyper-heuristic on the performance of the AutoDynAgents scheduling system. As a general conclusion, the results show that there is a significant advantage in the system's performance when the Q-Learning based hyper-heuristic is introduced.
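To make the idea concrete, here is a minimal Python sketch of a Q-Learning selector over a set of meta-heuristics, in the spirit of the hyper-heuristic module described above. The state model, candidate meta-heuristic names, rewards, and update constants are all illustrative assumptions, not the AutoDynAgents implementation.

```python
import random

# Illustrative Q-Learning hyper-heuristic: learn which meta-heuristic to
# apply to a given problem class. Names, states and rewards are hypothetical.
ACTIONS = ["tabu_search", "simulated_annealing", "genetic_algorithm"]
STATES = ["small_instance", "large_instance"]
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2   # learning rate, discount, exploration

Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}

def choose(state):
    """Epsilon-greedy selection of a meta-heuristic for this state."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state):
    """Standard Q-Learning update after observing scheduling performance."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

# Example episode: pick a technique, observe a (hypothetical) reward, learn.
s = "large_instance"
a = choose(s)
update(s, a, reward=1.0, next_state=s)   # reward from schedule quality
```

In the dissertation's setting, the reward would come from the quality of the schedule produced by the selected meta-heuristic and its parameterization.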
Abstract:
Over the last fifty years mobility practices have changed dramatically, improving the way travel takes place and the time it takes, but also matters like road safety and prevention. Mortality caused by accidents has reached untenable levels, yet research into road mortality has remained limited to comparative statistical exercises that go no further than defining accident types. In terms of sharing information and mapping accidents, little progress has been made beyond the routine publication of figures, either in simplistic tables or on web pages. Despite considerable advances in geographical information technologies, research and development has remained rather static, with only a few good examples of dynamic mapping. The adoption of Global Positioning System (GPS) devices as standard equipment in the automobile industry has resulted in more dynamic mobility patterns, but also in higher degrees of uncertainty in road traffic. This paper describes a road accident georeferencing project for the Lisbon District covering fatalities and serious injuries during 2007. In the initial phase, individual information summaries were compiled, giving information on the accidents and their major characteristics, as collected by the security forces: the Public Safety Police Force (Polícia de Segurança Pública - PSP) and the National Guard (Guarda Nacional Republicana - GNR). The Google Earth platform was used to georeference the information in order to inform the public and the authorities of the accident locations, the nature of each location, and the causes and consequences of the accidents. The paper also offers future insights into augmented reality technologies, considered crucial for advances in road safety and prevention studies. In the end, this exercise can be considered a success on several counts, both for the stakeholders who decide what to do and for raising public awareness of the problem of road mortality.
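As an illustration of the georeferencing step, the sketch below writes accident records to a KML file that Google Earth can open. The record fields and coordinates are invented placeholders; the actual project compiled its data from the PSP and GNR summaries.

```python
# Minimal sketch: exporting accident locations as KML for Google Earth.
# Field names and coordinates are hypothetical placeholders.
accidents = [
    {"lat": 38.7169, "lon": -9.1399, "desc": "Fatal collision, urban road"},
    {"lat": 38.7500, "lon": -9.2300, "desc": "Serious injury, run-off-road"},
]

placemarks = "\n".join(
    f"  <Placemark><description>{a['desc']}</description>"
    f"<Point><coordinates>{a['lon']},{a['lat']},0</coordinates></Point></Placemark>"
    for a in accidents
)

kml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
    f"{placemarks}\n</Document>\n</kml>"
)

with open("accidents_lisbon_2007.kml", "w", encoding="utf-8") as f:
    f.write(kml)
```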
Abstract:
The potential of a shale gas reservoir is influenced by a large number of factors, such as its mineralogy and texture, the type and maturation of its kerogen, fluid saturation, gas storage mechanisms, reservoir depth, temperature, and pore pressure. Accordingly, the main objective of this thesis was to establish a methodology for the preliminary assessment of potential shale gas deposits (an outcrop study based on high-resolution lithostratigraphy), which was subsequently applied to the Vale das Fontes Formation (Lusitanian Basin, Portugal). This thesis contributes not only to deepening the geochemical knowledge of the site, but also through an innovative approach that enabled the petrophysical characterization of the Vale das Fontes Formation. To apply the established methodology, the following laboratory tests were carried out: Rock-Eval 6, helium gas pycnometry, uniaxial compressive strength testing, Darcypress, and X-ray diffraction using the Rietveld method. The petrophysical results show a low-porosity rock formation which, according to the ISRM classification, is classified as "Strong" ("Resistente"), in addition to exhibiting ductile behaviour and a high brittleness index. The average permeability obtained places the Formation within the range of permeabilities attributed to tight gas deposits, indicating the need for hydraulic fracturing in the event of hydrocarbon exploitation, while X-ray diffraction highlights calcite, quartz, and phyllosilicates as the most abundant minerals in the Formation. From a geochemical point of view, the results show that despite the considerable average total organic carbon content, the organic matter analysed is mostly immature and composed mainly of type IV kerogen, which indicates that the formation is unable to generate hydrocarbons in economically exploitable quantities.
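As one concrete example of the petrophysical workflow, porosity can be derived by combining the grain density from helium pycnometry with a bulk density measurement. The sketch below shows that arithmetic with invented sample values; the actual measured densities of the Formation are not reproduced here.

```python
# Porosity from helium pycnometry: a toy calculation with invented values.
# phi = 1 - (bulk density / grain density)
grain_density = 2.71   # g/cm3, from the helium pycnometer (calcite-rich rock)
bulk_density = 2.62    # g/cm3, from sample mass and envelope volume

porosity = 1 - bulk_density / grain_density
print(f"porosity = {porosity:.1%}")   # low porosity, consistent with the text
```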
Abstract:
This document presents a tool able to automatically gather data from real energy markets and to generate scenarios, capturing and improving market players' profiles and strategies through knowledge discovery in databases, supported by artificial intelligence techniques, data mining algorithms, and machine learning methods. It provides the means for generating scenarios with different dimensions and characteristics, ensuring the representation of real and adapted markets and their participating entities. The scenarios generator module enhances the MASCEM (Multi-Agent Simulator of Competitive Electricity Markets) simulator, yielding a more effective tool for decision support. The implementation of the proposed module enables researchers and electricity market participants to analyze data, create realistic scenarios, and experiment with them. On the other hand, applying knowledge discovery techniques to real data also allows the improvement of MASCEM agents' profiles and strategies, resulting in a better representation of real market players' behavior. This work aims to improve the comprehension of electricity markets and the interactions among the involved entities through adequate multi-agent simulation.
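As a sketch of the kind of knowledge-discovery step such a module might apply, the toy example below clusters hypothetical player bids to derive simple profiles with plain k-means. The data, features, and cluster count are assumptions for illustration, not the MASCEM implementation.

```python
import numpy as np

# Toy profile extraction: cluster market players by (bid price, bid volume).
# All data and the cluster count are hypothetical.
rng = np.random.default_rng(1)
bids = np.vstack([
    rng.normal([30.0, 100.0], [2.0, 10.0], (50, 2)),   # e.g. base-load sellers
    rng.normal([60.0, 20.0], [5.0, 5.0], (50, 2)),     # e.g. peaking sellers
])

k = 2
centroids = bids[rng.choice(len(bids), k, replace=False)]
for _ in range(20):  # plain k-means iterations
    labels = np.argmin(np.linalg.norm(bids[:, None] - centroids, axis=2), axis=1)
    centroids = np.array([bids[labels == j].mean(axis=0) for j in range(k)])

print("profile centroids (price, volume):\n", centroids)
```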
Abstract:
Electricity demand in Brazil has been growing. Some studies estimate that energy consumption will increase by 78% through 2035. Two distinct actions are necessary to meet this growth: constructing new generating plants and reducing electrical losses in the country. As the construction of power plants has a high price, and given growing environmental concern, electric utilities are investing in reducing losses, both technical and non-technical. In this context, this paper aims to present an overview of non-technical losses in Brazil and to raise a discussion of the reasons that contribute to energy fraud.
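A toy calculation of the standard loss decomposition helps fix terms: non-technical losses are what remains after subtracting billed energy and estimated technical losses from the energy injected into the grid. The figures below are invented for illustration.

```python
# Toy decomposition of distribution losses; all figures are hypothetical.
energy_injected = 1000.0   # GWh delivered to the distribution network
energy_billed = 880.0      # GWh actually invoiced to consumers
technical_losses = 70.0    # GWh estimated from network modelling

non_technical = energy_injected - energy_billed - technical_losses
print(f"non-technical losses: {non_technical:.0f} GWh "
      f"({non_technical / energy_injected:.1%} of injected energy)")
```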
Abstract:
Master's dissertation presented to the Instituto de Contabilidade e Administração do Porto to obtain the degree of Master in Entrepreneurship and Internationalization, under the supervision of Professora Doutora Maria Clara Ribeiro