79 results for Posture Data


Relevance: 20.00%

Abstract:

The study of Electricity Markets operation has gained increasing importance in recent years, as a result of the new challenges that restructuring has produced. Currently, a large amount of information concerning Electricity Markets is available, as market operators provide, after a period of confidentiality, data regarding market proposals and transactions. These data can be used as a source of knowledge to define realistic scenarios, which are essential for understanding and forecasting Electricity Markets' behaviour. The development of tools able to extract, transform, store and dynamically update data is of great importance to deepen the comprehension of Electricity Markets and the behaviour of the involved entities. In this paper we present an adaptable tool capable of downloading, parsing and storing data from market operators' websites, ensuring that the stored data remain up to date and reliable.
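
As a rough illustration of such a download-parse-store cycle (not the tool's actual architecture, and with a hypothetical operator endpoint, CSV layout and storage schema), a minimal Python sketch could look like this:

```python
# Minimal sketch of a download-parse-store cycle for market data.
# The URL, CSV layout and table schema below are illustrative assumptions,
# not the actual interface of any market operator.
import csv
import io
import sqlite3
import urllib.request

OPERATOR_URL = "https://example.org/market/daily_results.csv"  # hypothetical endpoint

def download(url: str) -> str:
    """Fetch the raw CSV published by the market operator."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        return resp.read().decode("utf-8")

def parse(raw: str):
    """Turn the CSV text into (hour, price, energy) rows."""
    for row in csv.DictReader(io.StringIO(raw)):
        yield int(row["hour"]), float(row["price"]), float(row["energy"])

def store(rows, db_path: str = "market.db") -> None:
    """Insert rows, replacing previous values so stored data stay up to date."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS transactions "
                "(hour INTEGER PRIMARY KEY, price REAL, energy REAL)")
    con.executemany("INSERT OR REPLACE INTO transactions VALUES (?, ?, ?)", rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    store(parse(download(OPERATOR_URL)))
```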

Relevance: 20.00%

Abstract:

Electric power networks, namely distribution networks, have undergone several changes in recent years due to changes in power systems operation, towards the implementation of smart grids. Several approaches to the operation of resources have been introduced, such as demand response, making use of the new capabilities of smart grids. In the initial stages of smart grid implementation, only reduced amounts of data are generated, namely consumption data. The methodology proposed in the present paper uses demand response consumers' performance evaluation methods to determine the expected consumption of a given consumer. Potential commercial losses are then identified using monthly historic consumption data. Real consumption data are used in the case study to demonstrate the application of the proposed method.
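
The core idea, an expected-consumption baseline compared against metered values, can be sketched as follows; the plain historical average used here is only a stand-in for the demand response performance evaluation methods applied in the paper, and the tolerance threshold is an illustrative assumption:

```python
# Sketch: flag potential commercial losses by comparing metered consumption
# against an expected baseline. The baseline below is a plain average of the
# historic months; the paper's performance evaluation methods are not
# reproduced here.
from statistics import mean

def expected_consumption(history_kwh: list[float]) -> float:
    """Expected monthly consumption from historic data (simplified baseline)."""
    return mean(history_kwh)

def potential_loss(history_kwh: list[float], metered_kwh: float,
                   tolerance: float = 0.30) -> bool:
    """True if the metered value falls more than `tolerance` below expectation."""
    baseline = expected_consumption(history_kwh)
    return metered_kwh < (1.0 - tolerance) * baseline

# Example: twelve months of history, suspiciously low current reading.
history = [310, 295, 320, 305, 298, 315, 330, 340, 325, 300, 310, 305]
print(potential_loss(history, metered_kwh=150))  # True -> inspect this consumer
```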

Relevance: 20.00%

Abstract:

Worldwide, electricity markets have been evolving towards regional and even continental scales. One of the main reasons is the aim of using renewable-based generation efficiently in places where it exceeds local needs. A reference case of this evolution is the European Electricity Market, where countries are interconnected and several regional markets have been created, each grouping several countries and supporting transactions of huge amounts of electrical energy. The continuous transformations that electricity markets have been experiencing over the years create the need for simulation platforms to support operators, regulators, and involved players in understanding and dealing with this complex environment. This paper focuses on demonstrating the advantage that real electricity markets data brings to the creation of realistic simulation scenarios, which allow studying the impacts and implications that electricity market transformations will have on the participating countries. A case study using MASCEM (Multi-Agent System for Competitive Electricity Markets) is presented, with a scenario based on real data, simulating the European Electricity Market environment and comparing its performance under several different market mechanisms.
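
As a toy illustration of comparing market mechanisms over the same scenario (the data and the mechanisms actually simulated in MASCEM are not reproduced here), the sketch below clears one set of illustrative bids under uniform pricing and under pay-as-bid:

```python
# Toy comparison of two clearing mechanisms on the same bids/offers.
# Prices in EUR/MWh, quantities in MWh; purely illustrative data.
supply = [(20.0, 100), (35.0, 80), (50.0, 120)]   # (offer price, quantity)
demand = [(70.0, 90), (55.0, 70), (30.0, 100)]    # (bid price, quantity)

def clear(supply, demand):
    """Match cheapest offers with highest bids; return matched (offer_price, qty)."""
    s = sorted(supply)
    d = sorted(demand, reverse=True)
    matched = []
    while s and d and s[0][0] <= d[0][0]:
        qty = min(s[0][1], d[0][1])
        matched.append((s[0][0], qty))
        s[0] = (s[0][0], s[0][1] - qty)
        d[0] = (d[0][0], d[0][1] - qty)
        if s[0][1] == 0: s.pop(0)
        if d and d[0][1] == 0: d.pop(0)
    return matched

matched = clear(supply, demand)
uniform_price = max(p for p, _ in matched)          # marginal offer sets the price (simplified)
uniform_cost = uniform_price * sum(q for _, q in matched)
pay_as_bid_cost = sum(p * q for p, q in matched)    # each offer paid its own price
print(uniform_cost, pay_as_bid_cost)
```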

Relevance: 20.00%

Abstract:

This paper presents the Realistic Scenarios Generator (RealScen), a tool that processes data from real electricity markets to generate realistic scenarios that enable the modeling of electricity market players’ characteristics and strategic behavior. The proposed tool provides significant advantages to the decision making process in an electricity market environment, especially when coupled with a multi-agent electricity markets simulator. The generation of realistic scenarios is performed using mechanisms for intelligent data analysis, which are based on artificial intelligence and data mining algorithms. These techniques allow the study of realistic scenarios, adapted to the existing markets, and improve the representation of market entities as software agents, enabling a detailed modeling of their profiles and strategies. This work contributes significantly to the understanding of the interactions between the entities acting in electricity markets by increasing the capability and realism of market simulations.
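
As a hint of how data mining can turn market data into player profiles (the features and the k-means choice below are illustrative assumptions, not RealScen's actual pipeline), consider:

```python
# Sketch: group market players into profiles from simple bid features.
# The feature set (mean bid price, mean offered quantity) and the use of
# k-means are assumptions for illustration only.
from sklearn.cluster import KMeans

# One row per player: [average bid price (EUR/MWh), average offered quantity (MWh)]
features = [
    [28.0, 400.0],   # large, cheap producer
    [30.5, 380.0],
    [62.0,  50.0],   # small, expensive producer
    [58.0,  45.0],
    [45.0, 150.0],   # mid-range producer
]

profiles = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
for player, profile in enumerate(profiles):
    print(f"player {player} -> profile {profile}")
```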

Relevance: 20.00%

Abstract:

Centre of Research, Education, Innovation and Intervention in Sport, Faculty of Sport, University of Porto, Portugal. Background: For children aged ≤10 years, only a few international studies have been conducted to determine the prevalence of and risk factors for back pain. Although other studies on older Portuguese children point to a prevalence between 17% and 39%, none exists for this specific age group. Thus, this study was conducted to establish the prevalence of and risk factors for back pain in schoolchildren aged 7–10 years. Methods: A cross-sectional survey of 637 children was conducted. A self-rating questionnaire was used to assess the prevalence and duration of back pain, life habits, school absence, medical treatment and limitation of activities. For posture assessment, photographic records with bio-photogrammetric analysis were used to obtain data on head, acromion and pelvic alignment, horizontal alignment of the scapulae, vertical alignment of the trunk and vertical body alignment. Results: Postural problems were found in 25.4% of the children, especially in the 8- and 9-year-old groups. Back pain occurred in 12.7%, with the highest values among the 7- and 10-year-old children. The probability of back pain increased 7 times when the children had a history of school absences, 4.3 times when they experienced sleeping difficulties, 4.4 times when school furniture was uncomfortable, 4.7 times if the children perceived an occurrence of parental back pain and 2.5 times when children presented incorrect posture. Conclusions: The combination of school absences, parental pain, sleeping difficulties, inappropriate school furniture and postural deviations in the sagittal and frontal planes points to the multifactorial aetiology of back pain.

Relevance: 20.00%

Abstract:

Harnessing the idle CPU cycles, storage space and other resources of networked computers for collaborative work is a central goal of all major grid computing research projects. Nowadays, most university computer labs are equipped with powerful desktop PCs, and most of the time these machines sit idle, wasting computing power that is not put to good use. However, complex problems and the analysis of very large amounts of data require substantial computational resources. For such problems, one may run the analysis algorithms on very powerful and expensive computers, which reduces the number of users that can afford such data analysis tasks. Instead of using single expensive machines, distributed computing systems offer the possibility of using a set of much less expensive machines to do the same task. The BOINC and Condor projects have been successfully used to support real scientific research around the world at low cost. The main goal of this work is to explore both distributed computing platforms, Condor and BOINC, and to use them to harness idle PC resources for academic researchers to use in their work. In this thesis, data mining tasks were performed by running several machine learning algorithms on the distributed computing environment.
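
Job submission itself is specific to Condor or BOINC and is not shown here; the sketch below only illustrates the preparatory step of splitting a data mining experiment into independent work units that such middleware can schedule on idle lab PCs (algorithm names and parameters are hypothetical):

```python
# Sketch: split a machine learning experiment into independent work units,
# the shape of task that Condor/BOINC-style middleware can farm out to idle
# lab PCs. Middleware-specific submission is omitted.
import itertools
import json

algorithms = ["decision_tree", "naive_bayes", "svm"]   # hypothetical choices
folds = range(10)                                      # 10-fold cross-validation
params = [0.1, 1.0, 10.0]                              # hypothetical regularisation values

jobs = [
    {"algorithm": a, "fold": f, "param": p}
    for a, f, p in itertools.product(algorithms, folds, params)
]

# Each work unit is written to its own file and submitted as a separate job.
for i, job in enumerate(jobs):
    with open(f"job_{i:03d}.json", "w") as fh:
        json.dump(job, fh)
print(f"{len(jobs)} independent work units prepared")
```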

Relevance: 20.00%

Abstract:

This work falls within the field of fire safety in buildings and consists of a case study of a fire detection and suppression design for a Data Center. Its objectives comprise a study of the state of the art of automatic fire detection and suppression, the development of a software tool to support the design of gaseous-agent suppression systems, and a study and analysis of fire protection in Data Centers. Finally, a case study was carried out. The concepts of fire and building fires are addressed through a theoretical study describing how a fire can start and its consequences. The Portuguese regulations on fire safety in buildings (SCIE) are also addressed, with special focus on automatic fire detection systems (SADI) and automatic fire suppression systems (SAEI); the national and international standards on this subject are mentioned as well. Because they are highly relevant to this work, fire detection systems are covered in depth, including the characteristics of detection equipment, the most commonly used techniques and the aspects to consider when dimensioning a SADI. Regarding suppression, the most widely used extinguishing media are discussed, along with their advantages and the classes of fire to which they apply, with special emphasis on SAEI using inert gases, for which the dimensioning procedure is described. Data Centers are also characterized, in order to understand their functions, the importance of their existence and the general aspects of fire protection in these facilities. Finally, a case study was developed: a SADI was designed together with a SAEI using nitrogen as the extinguishing gas. The choices and the selected systems were duly justified, taking into account the regulations and standards in force.
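
As a hint of what such a design-support tool computes, the sketch below applies a commonly cited free-flooding relation for inert gases; real designs, and presumably the tool described above, include further corrections (temperature, altitude, hold time, safety margins) required by the applicable standards:

```python
# Sketch: first-order estimate of the inert gas volume needed to reach a design
# concentration in a protected room, using the commonly cited free-flooding
# relation V_agent = V_room * ln(100 / (100 - C)). Real designs apply further
# corrections defined by the applicable standards; this is not the thesis tool.
import math

def inert_gas_volume(room_volume_m3: float, design_concentration_pct: float) -> float:
    """Approximate agent volume (m3, at ambient conditions) for total flooding."""
    c = design_concentration_pct
    return room_volume_m3 * math.log(100.0 / (100.0 - c))

# Example: 500 m3 data hall, 40% nitrogen design concentration (illustrative value).
print(round(inert_gas_volume(500.0, 40.0), 1), "m3 of agent")
```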

Relevance: 20.00%

Abstract:

Many organizations around the world operate facilities of this kind; in Portugal, Portugal Telecom recently opened its Data Center in Covilhã. The development of a Data Center therefore requires a very careful design, which, among other aspects, must guarantee the security of the information and the safety of the facilities themselves, particularly with regard to fire safety.

Relevance: 20.00%

Abstract:

In-network storage of data in wireless sensor networks helps reduce communication inside the network and favors data aggregation. In this paper, we consider the use of n out of m codes and data dispersal in combination with in-network storage. In particular, we provide an abstract model of in-network storage to show how n out of m codes can be used, and we discuss how this can be achieved in five case studies. We also define a model aimed at evaluating the probability of correct data encoding and decoding, and we exploit this model and simulations to show how, in the case studies, the parameters of the n out of m codes and of the network should be configured in order to achieve correct data coding and decoding with high probability.
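
A minimal sketch of the decoding side of such a model, assuming fragments are lost independently and that any n of the m fragments suffice to reconstruct the datum (the paper's own model may differ):

```python
# Sketch: probability that an n-out-of-m coded datum can be decoded when each
# of the m fragments survives independently with probability (1 - p_loss).
# Independent fragment losses are an assumption of this sketch.
from math import comb

def decode_probability(n: int, m: int, p_loss: float) -> float:
    p_ok = 1.0 - p_loss
    return sum(comb(m, k) * p_ok**k * p_loss**(m - k) for k in range(n, m + 1))

# Example: 3-out-of-5 dispersal with 20% fragment loss.
print(round(decode_probability(3, 5, 0.20), 4))  # ~0.9421
```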

Relevance: 20.00%

Abstract:

Accepted at the 13th IEEE Symposium on Embedded Systems for Real-Time Multimedia (ESTIMedia 2015), Amsterdam, Netherlands.

Relevance: 20.00%

Abstract:

Nowadays, data centers are large energy consumers, and this trend is expected to increase in the coming years given the growing demand for cloud services. A large portion of this power consumption is due to the control of physical parameters of the data center (such as temperature and humidity). However, these physical parameters are tightly coupled with computations, and even more so in upcoming data centers, where the location of workloads can vary substantially because, for example, workloads are moved around within the cloud infrastructure hosted in the data center. Therefore, managing the physical and compute infrastructure of a large data center is an embodiment of a Cyber-Physical System (CPS). In this paper, we describe a data collection and distribution architecture that enables gathering physical parameters of a large data center at very high temporal and spatial resolution of the sensor measurements. We believe this is an important characteristic for building more accurate heat-flow models of the data center and, with them, finding opportunities to optimize energy consumption. A high-resolution picture of the data center conditions also enables minimizing local hot spots, performing more accurate predictive maintenance (failures in any infrastructure equipment can be detected more promptly) and more accurate billing. We detail this architecture and define the structure of the underlying messaging system that is used to collect and distribute the data. Finally, we show the results of a preliminary study of a typical data center radio environment.
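
The abstract does not give the message structure itself; as a sketch of the general idea, each physical measurement can be tagged with its location and time and published on a topic, for example:

```python
# Sketch: a sensor reading tagged with position and time, serialised for a
# publish/subscribe messaging layer. The field names and topic scheme are
# illustrative assumptions; the paper defines its own message structure.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class Reading:
    rack: str          # physical location inside the data center
    sensor: str        # e.g. "temperature" or "humidity"
    value: float
    unit: str
    timestamp: float   # seconds since the epoch

def to_message(reading: Reading) -> tuple[str, str]:
    """Return (topic, payload) for the messaging system."""
    topic = f"datacenter/{reading.rack}/{reading.sensor}"
    return topic, json.dumps(asdict(reading))

topic, payload = to_message(Reading("rack-A12", "temperature", 24.6, "C", time.time()))
print(topic, payload)
```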

Relevance: 20.00%

Abstract:

This paper studies the statistical distributions of worldwide earthquakes from 1963 to 2012. A Cartesian grid dividing the Earth into geographic regions is considered. Entropy and the Jensen–Shannon divergence are used to analyze and compare the real-world data. Hierarchical clustering and multidimensional scaling techniques are adopted for data visualization. Entropy-based indices have the advantage of leading to a single parameter expressing the relationships between the seismic data. Classical and generalized (fractional) entropy and Jensen–Shannon divergence are tested. The generalized measures lead to a clear identification of patterns embedded in the data and contribute to a better understanding of earthquake distributions.
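
A minimal sketch of the classical (Shannon) versions of these measures, applied to two illustrative event-count histograms; the generalized (fractional) variants studied in the paper are not reproduced here:

```python
# Sketch: classical Shannon entropy and Jensen-Shannon divergence for two
# histograms of event counts (e.g. earthquakes per grid cell or magnitude bin).
import math

def entropy(p):
    return -sum(x * math.log(x) for x in p if x > 0)

def kl(p, q):
    return sum(x * math.log(x / y) for x, y in zip(p, q) if x > 0 and y > 0)

def jensen_shannon(p, q):
    m = [(x + y) / 2 for x, y in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def normalise(counts):
    total = sum(counts)
    return [c / total for c in counts]

region_a = normalise([120, 40, 5, 1])   # illustrative event counts, not real data
region_b = normalise([90, 60, 12, 3])
print(entropy(region_a), jensen_shannon(region_a, region_b))
```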

Relevance: 20.00%

Abstract:

Complex industrial plants exhibit multiple interactions among their smaller parts and with human operators. A failure in one part can propagate across subsystem boundaries and cause a serious disaster. This paper analyzes industrial accident data series from the perspective of dynamical systems. First, we process real-world data and show that the statistics of the number of fatalities reveal features that are well described by power law (PL) distributions. For the early years, the data reveal double PL behavior, while for more recent periods a single PL fits the experimental data better. Second, we analyze the entropy of the data series statistics over time. Third, we use the Kullback–Leibler divergence to compare the empirical data, and multidimensional scaling (MDS) techniques for data analysis and visualization. Entropy-based analysis is adopted to assess complexity, having the advantage of yielding a single parameter to express relationships between the data. The classical and the generalized (fractional) entropy and Kullback–Leibler divergence are used. The generalized measures allow a clear identification of patterns embedded in the data.
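
As a minimal illustration of the kind of power law characterization discussed above (single tail, fixed cutoff, maximum likelihood exponent), and not the paper's double-PL fits or entropy analysis:

```python
# Sketch: maximum likelihood estimate of a power law exponent for the tail of
# a data series (continuous case, fixed xmin). Illustrative only; the paper's
# double power law fits and divergence analysis are not reproduced here.
import math

def powerlaw_alpha(data, xmin):
    """alpha_hat = 1 + n / sum(ln(x / xmin)) over the tail x >= xmin."""
    tail = [x for x in data if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Illustrative fatality counts (not the accident data set used in the paper).
fatalities = [12, 30, 45, 18, 90, 250, 17, 60, 15, 400, 22, 35]
print(round(powerlaw_alpha(fatalities, xmin=10), 2))
```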

Relevance: 20.00%

Abstract:

Currently, due to the widespread use of computers and the internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to verify theoretical results or to test hardware or a product under development. Although this is an attractive solution (low cost and an easy, fast way to carry out course work), it has major disadvantages. As everything is currently done with and in a computer, students are losing the "feel" for the real values of physical magnitudes. In engineering studies, for instance, and mainly in the first years, students need to learn electronics, algorithms, mathematics and physics. All of these areas can use numerical analysis software, simulation software or spreadsheets, and in the majority of cases the data used are either simulated or random numbers, when real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays. Also, when using spreadsheets to build graphics, instead of using a random table, students could use a real dataset based, for instance, on the room temperature and its variation throughout the day. In this work we present a framework with a simple interface that allows it to be used by different courses in which computers are central to the teaching and learning process, giving students a more realistic feeling by using real data. The framework is based on a set of low-cost sensors for different physical magnitudes, e.g. temperature, light and wind speed, which are either connected to a central server that students access over Ethernet or connected directly to the student's computer or laptop. These sensors use the available communication ports, such as serial ports, parallel ports, Ethernet or Universal Serial Bus (USB). Since a central server is used, students are encouraged to use sensor values in their different courses and consequently in different types of software, such as numerical analysis tools, spreadsheets or simply inside any programming language whenever a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using different types of computer communication. As long as the sensors are attached to a server connected to the internet, these tools can also be shared between schools, so that sensors not available at a given school can be used by reading the values from other places that share them. Moreover, students in more advanced years, with (theoretically) more know-how, can use courses with some affinity to electronics development to build new sensor modules and expand the framework further. The final solution is very attractive: low cost, simple to develop, and flexible, since the same materials can be reused in several courses, bringing real-world data into the students' computer work.
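
The server's interface is not specified in the abstract; assuming a plain HTTP endpoint that returns the latest reading as JSON, a student exercise could pull real values as sketched below:

```python
# Sketch: pulling a real sensor value from the framework's central server into
# a student exercise. The URL and response format are assumptions; the abstract
# does not specify the server's actual interface.
import json
import urllib.request

SERVER = "http://sensors.example.edu"  # hypothetical address of the central server

def latest(sensor: str) -> float:
    """Return the latest value of a sensor, e.g. 'temperature' or 'wind_speed'."""
    with urllib.request.urlopen(f"{SERVER}/{sensor}/latest", timeout=10) as resp:
        return float(json.load(resp)["value"])

# Example: use a real room temperature instead of a random number in a
# spreadsheet or numerical analysis exercise.
print(latest("temperature"))
```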

Relevance: 20.00%

Abstract:

Demo at the Workshop on ns-3 (WNS3 2015), 13–14 May 2015, Castelldefels, Spain.