947 results for Data aggregation


Relevance:

20.00%

Publisher:

Abstract:

This paper presents the Realistic Scenarios Generator (RealScen), a tool that processes data from real electricity markets to generate realistic scenarios that enable the modeling of electricity market players’ characteristics and strategic behavior. The proposed tool provides significant advantages to the decision-making process in an electricity market environment, especially when coupled with a multi-agent electricity markets simulator. The generation of realistic scenarios is performed using mechanisms for intelligent data analysis, based on artificial intelligence and data mining algorithms. These techniques allow the study of realistic scenarios, adapted to the existing markets, and improve the representation of market entities as software agents, enabling detailed modeling of their profiles and strategies. This work contributes significantly to the understanding of the interactions between the entities acting in electricity markets by increasing the capability and realism of market simulations.
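The abstract does not detail RealScen's data-analysis pipeline, but one elementary aggregation step it implies, deriving a per-hour price profile for a market player from historical bids, can be sketched as follows. The data, the function name, and the profile shape are illustrative assumptions, not RealScen's API:

```python
from collections import defaultdict
from statistics import mean, pstdev

# Hypothetical historical bids of one market player: (hour, price) pairs.
bids = [(0, 30.5), (0, 32.1), (1, 28.0), (1, 27.4), (1, 29.1), (2, 31.0), (2, 30.2)]

def hourly_profile(bids):
    """Aggregate bid prices per hour into a (mean, population-stdev) profile."""
    by_hour = defaultdict(list)
    for hour, price in bids:
        by_hour[hour].append(price)
    return {h: (mean(p), pstdev(p)) for h, p in sorted(by_hour.items())}

profile = hourly_profile(bids)  # e.g. a simple per-hour strategy profile
```

A real scenario generator would replace this averaging with the clustering and data mining algorithms the paper mentions; the sketch only shows the aggregation shape of the problem.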

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented to obtain the Ph.D. degree in Bioinformatics

Relevance:

20.00%

Publisher:

Abstract:

S100A6 is a small EF-hand calcium- and zinc-binding protein involved in the regulation of cell proliferation and cytoskeletal dynamics. It is overexpressed in neurodegenerative disorders and has been proposed as a marker for Amyotrophic Lateral Sclerosis (ALS). Following recent reports of amyloid formation by S100 proteins, we investigated the aggregation properties of S100A6. Computational analysis using the aggregation predictors Waltz and Zyggregator revealed increased propensity within S100A6 helices HI and HIV. Subsequent analysis of Thioflavin-T binding kinetics under acidic conditions elicited a very fast process with no lag phase and extensive formation of aggregates and stacked fibrils as observed by electron microscopy. Ca2+ exerted an inhibitory effect on the aggregation kinetics, which could be reverted upon chelation. An FT-IR investigation of the early conformational changes occurring under these conditions showed that Ca2+ promotes anti-parallel β-sheet conformations that repress fibrillation. At pH 7, Ca2+ rendered the fibril formation kinetics slower: time-resolved imaging showed that fibril formation is highly suppressed, with aggregates forming instead. In the absence of metals an extensive network of fibrils is formed. S100A6 oligomers, but not fibrils, were found to be cytotoxic, decreasing cell viability by up to 40%. This effect was not observed when the aggregates were formed in the presence of Ca2+. Interestingly, native S100A6 seeds SOD1 aggregation, shortening its nucleation process. This suggests a cross-talk between these two proteins involved in ALS. Overall, these results put forward novel roles for S100 proteins, whose metal-modulated aggregation propensity may be a key aspect in their physiology and function.

Relevance:

20.00%

Publisher:

Abstract:

Thesis submitted to Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa, in partial fulfilment of the requirements for the degree of Master in Computer Science

Relevance:

20.00%

Publisher:

Abstract:

Harnessing the idle CPU cycles, storage space, and other resources of networked computers for collaborative work is the main focus of all major grid computing research projects. Most university computer labs are now equipped with powerful desktop PCs, yet much of the time these machines sit idle, their computing power going to waste. However, complex problems and the analysis of very large amounts of data require substantial computational resources. For such problems, one may run the analysis algorithms on very powerful and expensive computers, which limits the number of users who can afford such data analysis tasks. Instead of using single expensive machines, distributed computing systems offer the possibility of using a set of much less expensive machines to do the same task. The BOINC and Condor projects have been successfully used to support real scientific research around the world at low cost. The main goal of this work is to explore both distributed computing platforms, Condor and BOINC, and to use their power to harness idle PC resources for academic researchers to use in their work. In this thesis, data mining tasks were performed by implementing several machine learning algorithms on the distributed computing environment.

Relevance:

20.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies

Relevance:

20.00%

Publisher:

Abstract:

This work falls within the field of fire safety in buildings and consists of a case study of a fire detection and suppression design for a Data Center. Its objectives are a survey of the state of the art in automatic fire detection and suppression, the development of a software tool to support the design of gaseous-agent suppression systems, and a study and analysis of fire protection in Data Centers. Finally, a case study was carried out. The concepts of fire and building fires are addressed through a theoretical study describing how a fire can start and its consequences. The Portuguese regulations on fire safety in buildings (SCIE) are also covered, with particular focus on automatic fire detection systems (SADI) and automatic fire suppression systems (SAEI); the national and international standards on the subject are also mentioned. Because they are central to this work, fire detection systems are covered in depth, describing the characteristics of detection equipment, the most widely used techniques, and the aspects to consider when dimensioning a SADI. As for suppression media, the most widely used today are described, along with their advantages and the classes of fire to which they apply, with particular emphasis on SAEI using inert gases, for which the dimensioning of such a system is described. Data Centers are also characterized in order to explain their functions, the importance of their existence, and the general aspects of fire protection in these facilities.
Finally, a case study was developed: a SADI was designed together with a SAEI using nitrogen as the extinguishing gas. The choices and systems selected were duly justified in light of the applicable regulations and standards.

Relevance:

20.00%

Publisher:

Abstract:

Many organizations around the world operate facilities of this kind; in Portugal, Portugal Telecom recently opened its Data Center in Covilhã. Developing a Data Center therefore requires a very careful design, which among other aspects must guarantee the security of the information and of the facilities themselves, particularly with respect to fire safety.

Relevance:

20.00%

Publisher:

Abstract:

Dissertation submitted to obtain the Master's degree in Molecular Genetics and Biomedicine

Relevance:

20.00%

Publisher:

Abstract:

Disaster management is one of the most relevant application fields of wireless sensor networks. In this application, the role of the sensor network usually consists of obtaining a representation or a model of a physical phenomenon spreading through the affected area. In this work we focus on forest firefighting operations, proposing three fully distributed ways of approximating the actual shape of the fire. In the simplest approach, a circular burnt area is assumed around each node that has detected the fire, and the union of these circles gives the overall fire shape. However, as this approach makes intensive use of the wireless sensor network's resources, we propose incorporating two in-network aggregation techniques, which do not require considering the complete set of fire detections. The first technique models the fire by means of a complex shape composed of multiple convex hulls representing different burning areas, while the second technique uses a set of arbitrary polygons. Performance evaluation of realistic fire models in computer simulations reveals that the method based on arbitrary polygons improves the accuracy of the fire shape approximation by 20%, while reducing the in-network resource overhead to 10% in the best case.
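The first aggregation technique rests on convex hulls of fire-detection points. A minimal sketch of that geometric building block, using Andrew's monotone chain algorithm on hypothetical detection coordinates (not the authors' code or data), is:

```python
# Fire-shape aggregation building block: the convex hull of the points where
# sensors detected fire. Pure-Python Andrew's monotone chain, O(n log n).
def convex_hull(points):
    """Return the hull vertices of a set of 2-D points, in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn.
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # Each chain ends where the other begins; drop the duplicated endpoints.
    return lower[:-1] + upper[:-1]

# Hypothetical sensor detection coordinates (grid units).
detections = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 1), (1, 2)]
hull = convex_hull(detections)  # interior detections are discarded
```

Because the hull is fully determined by its boundary vertices, nodes only need to forward those few points instead of every detection, which is precisely the resource saving the in-network aggregation exploits.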

Relevance:

20.00%

Publisher:

Abstract:

Accepted at the 13th IEEE Symposium on Embedded Systems for Real-Time Multimedia (ESTIMedia 2015), Amsterdam, Netherlands.

Relevance:

20.00%

Publisher:

Abstract:

Nowadays, data centers are large energy consumers, and this trend is expected to continue in the coming years given the growing demand for cloud services. A large portion of this power consumption is due to the control of the physical parameters of the data center (such as temperature and humidity). However, these physical parameters are tightly coupled with computations, and even more so in upcoming data centers, where the location of workloads can vary substantially because, for example, workloads are moved around the cloud infrastructure hosted in the data center. Managing the physical and compute infrastructure of a large data center is therefore an embodiment of a Cyber-Physical System (CPS). In this paper, we describe a data collection and distribution architecture that enables gathering the physical parameters of a large data center at very high temporal and spatial resolution. We believe this is an important characteristic for building more accurate heat-flow models of the data center and, with them, finding opportunities to optimize energy consumption. A high-resolution picture of data center conditions also enables minimizing local hot-spots, performing more accurate predictive maintenance (failures in any infrastructure equipment can be detected more promptly), and more accurate billing. We detail this architecture and define the structure of the underlying messaging system used to collect and distribute the data. Finally, we show the results of a preliminary study of a typical data center radio environment.
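The message structure is defined in the paper itself and is not reproduced here. As a hedged illustration only, a reading might carry a rack id, a sensor position, and a temperature sample, and a consumer could then flag hot-spot racks; the JSON schema, field names, and threshold below are all assumptions:

```python
import json
from collections import defaultdict

# Hypothetical message format: rack id, sensor position within the rack,
# and one temperature sample in degrees Celsius. Not the paper's schema.
messages = [
    json.dumps({"rack": "A1", "pos": [0, 0], "temp_c": 24.5}),
    json.dumps({"rack": "A1", "pos": [1, 0], "temp_c": 27.9}),
    json.dumps({"rack": "B3", "pos": [0, 1], "temp_c": 22.1}),
]

def hot_racks(messages, threshold_c=26.0):
    """Decode readings and flag racks whose peak temperature exceeds threshold."""
    peak = defaultdict(lambda: float("-inf"))
    for raw in messages:
        m = json.loads(raw)
        peak[m["rack"]] = max(peak[m["rack"]], m["temp_c"])
    return sorted(r for r, t in peak.items() if t > threshold_c)

flagged = hot_racks(messages)
```

The point of the sketch is the consumer side of a publish-subscribe pipeline: with high spatial resolution, the per-rack aggregation can localize a hot-spot to a specific rack rather than a whole aisle.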

Relevance:

20.00%

Publisher:

Abstract:

This paper studies the statistical distributions of worldwide earthquakes from 1963 to 2012. A Cartesian grid, dividing the Earth into geographic regions, is considered. Entropy and the Jensen–Shannon divergence are used to analyze and compare real-world data. Hierarchical clustering and multi-dimensional scaling techniques are adopted for data visualization. Entropy-based indices have the advantage of leading to a single parameter expressing the relationships between the seismic data. Classical and generalized (fractional) entropy and Jensen–Shannon divergence are tested. The generalized measures lead to a clear identification of patterns embedded in the data and contribute to a better understanding of earthquake distributions.
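The classical measures the abstract mentions are standard. A minimal sketch of Shannon entropy and the Jensen–Shannon divergence for discrete per-cell distributions (illustrative numbers, not the paper's earthquake counts; the generalized fractional variants are not shown):

```python
from math import log2

def entropy(p):
    """Shannon entropy in bits; zero-probability cells contribute nothing."""
    return -sum(x * log2(x) for x in p if x > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: entropy of the mixture minus mean entropy."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return entropy(m) - (entropy(p) + entropy(q)) / 2

# Illustrative normalized event counts for two geographic grid cells.
p = [0.5, 0.25, 0.25]
q = [0.25, 0.25, 0.5]
jsd = js_divergence(p, q)
```

The divergence is symmetric and zero only for identical distributions, which is what makes it usable as a pairwise dissimilarity input for the hierarchical clustering and multi-dimensional scaling steps described above.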