101 results for Doubly-connected


Relevance:

10.00%

Publisher:

Abstract:

With the invention of the relational model by E. F. Codd in 1970, the way information was managed in a database was completely revolutionised. Systems migrated from hierarchical, file-based storage to relational databases with tables, relations and records, which greatly simplified information management and led many companies to adopt the model. What E. F. Codd did not foresee was that the information a database would have to store would grow to gigantic proportions, nor that the demands placed on databases would grow at the same rate. All of this came about with the spread of the internet, which connected people anywhere in the world who had a computer. As the number of internet users grew, so did the number of websites created (and it still grows exponentially). Search engines that once indexed a few sites per day now index millions of sites per second, and more recently social networks are also handling gigantic amounts of information. Both search engines and social networks concluded that a relational database is not enough to manage the enormous amount of information they produce, so a solution had to be found. That solution is NoSQL, and it is the subject of this thesis. This document defines and presents the problem relational databases face when dealing with large volumes of data, and introduces the limits of the relational model, which only recently became evident with the emergence of movements such as Big Data, the growing number of new websites created every day, and the large number of social network users.
It also illustrates the solution adopted so far by the major consumers of high-volume data, such as Google and Facebook, describing its characteristics, advantages, disadvantages and the other concepts associated with the NoSQL model. The thesis further demonstrates that the NoSQL model is a reality already used in some companies, and presents the main programming-level changes and the resulting good practices that NoSQL brings. Finally, the thesis concludes by explaining that NoSQL is a way of implementing an application's persistence that belongs to the new model of information persistence.

Relevance:

10.00%

Publisher:

Abstract:

With the continuing development of society, electricity consumption has been increasing steadily; at the same time, concern for the environment and the need for sustainable development mean that current legislation favours renewable energy sources over fossil ones. There are growing incentives for installing small production systems in consumer installations; these consumer/producers are called prosumers, and they are connected to the low-voltage grid. With the introduction of this type of producer, the grid must be equipped with means that allow the network operator to monitor and control, in real time, the state of the network and of these new producers. In this dissertation, an intelligent microgeneration control algorithm was developed. Evaluating consumption, production and other network-management parameters, the algorithm computes a set of set-points to be sent to the microproducers in order to limit the power injected into the network and thus control the voltage. An economic study was also carried out of the impact the proposed measures would have from the point of view of the network operators as well as of the microproducers.
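The abstract describes computing set-points that limit injected power to control voltage. A minimal sketch of this kind of logic is a droop-style curtailment rule; the function name, thresholds and units below are illustrative assumptions, not the dissertation's actual algorithm.

```python
# Hypothetical set-point rule: curtail a microproducer's injection linearly
# once the local voltage rises above v_start, cutting it to zero at v_max.
# Thresholds (in per-unit) are illustrative assumptions.

def injection_setpoint(voltage_pu, p_available_kw, v_start=1.05, v_max=1.10):
    """Return the active-power set-point (kW) for a microproducer.

    Below v_start the full available power may be injected; between
    v_start and v_max the set-point is reduced linearly; at or above
    v_max injection is cut to zero.
    """
    if voltage_pu <= v_start:
        return p_available_kw
    if voltage_pu >= v_max:
        return 0.0
    fraction = (v_max - voltage_pu) / (v_max - v_start)
    return p_available_kw * fraction
```

A real controller would also weigh the other management parameters the abstract mentions (consumption, production forecasts) before sending the set-point.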

Relevance:

10.00%

Publisher:

Abstract:

All over the world, the liberalization of electricity markets, which follows different paradigms, has created new challenges for those involved in this sector. In order to respond to these challenges, electric power systems underwent a significant restructuring of their operation and planning, which resulted in a considerable increase in the competitiveness of the electricity sector. In particular, the Ancillary Services (AS) market has been the target of constant change in its mode of operation, as it is the market in which services are traded whose main objective is to ensure the operation of electric power systems with appropriate levels of stability, safety, quality, equity and competitiveness. Thus, with the increasing penetration of distributed energy resources, including distributed generation, demand response, storage units and electric vehicles, it is essential to develop new, smarter and hierarchical methods for operating electric power systems. As these resources are mostly connected to the distribution network, it is important to consider their introduction in AS delivery in order to achieve greater reliability and cost efficiency in power system operation. The main contribution of this work is the design and development of mechanisms and methodologies for the AS market and for the joint energy and AS market, considering different management entities for the transmission and distribution networks. The models developed in this work consider the most common AS in the liberalized market environment: Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve.
The presented models consider different rules and modes of operation, such as the division of the market into network areas, which allows congestion management of the interconnections between areas, or the ancillary service cascading process, which allows AS of superior quality to replace AS of lower quality, ensuring better economic performance of the market. A major contribution of this work is the development of an innovative market clearing methodology for the joint energy and AS market, able to ensure viable and feasible solutions in markets where there are technical constraints in the transmission network involving its division into areas or regions. The proposed method is based on the determination of Bialek topological factors and considers the contribution to network congestion of the dispatch of all generation-increase services (energy, Regulation Up, Spinning and Non-Spinning reserves). The use of Bialek factors in each iteration of the proposed methodology allows the bids in the market to be limited while ensuring that the solution is feasible in any context of system operation. Another important contribution is the modelling of the contribution of distributed energy resources to ancillary services. For this purpose, a Virtual Power Player (VPP) is considered, which aggregates, manages and interacts with distributed energy resources. The VPP manages all the aggregated agents and is able to supply AS to the system operator, with the main purpose of participating in the electricity market. To ensure their participation in AS, the VPP should have a set of contracts with the agents that include diversified rules adapted to each kind of distributed resource. All methodologies developed and implemented in this work have been integrated into the MASCEM simulator, a multi-agent-based simulator that makes it possible to study the complex operation of electricity markets.
The developed methodologies thus allow the simulator to cover more operating contexts of the present and future electricity market. In this way, this dissertation makes a substantial contribution to AS market simulation, based on models and mechanisms currently used in several real markets, as well as introducing innovative market clearing methodologies for the joint energy and AS market. The dissertation presents five case studies, each consisting of multiple scenarios. The first case study illustrates the AS market simulation considering several bids from market players. The simulation of the joint energy and ancillary services market is presented in the second case study. The third case study compares the joint-market methodology, in which players' ancillary service bids are considered by network area, with a reference methodology. The fourth case study presents the simulation of the joint-market methodology based on Bialek topological distribution factors, applied to a 7-bus transmission network managed by a TSO. The last case study presents a joint-market simulation that considers the aggregation of small players by a VPP, as well as the complex contracts related to these entities; it comprises a 33-bus distribution network managed by the VPP, with several kinds of distributed resources, such as photovoltaics, CHP, fuel cells, wind turbines, biomass, small hydro, municipal solid waste, demand response and storage units.
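The cascading process mentioned above can be illustrated with a toy clearing loop: a bid offered for a higher-quality service may be dispatched to cover a lower-quality requirement when that is cheaper. This is an illustrative sketch under the usual quality ordering (Regulation Up above Spinning above Non-Spinning reserve), not the thesis's actual clearing model; all names and prices are assumptions.

```python
# Toy ancillary-service clearing with cascading: cheapest capacity is
# dispatched first, and a bid may serve its own service or any requirement
# of lower quality. Quality ordering is an assumption for illustration.

QUALITY = {"reg_up": 2, "spin": 1, "non_spin": 0}

def clear_with_cascading(bids, requirements):
    """bids: list of (service, price, mw); requirements: {service: mw}.
    Returns (total cost, unmet requirements)."""
    remaining = dict(requirements)
    cost = 0.0
    for service, price, mw in sorted(bids, key=lambda b: b[1]):
        # serve the highest-quality requirement this bid is allowed to cover
        for target in sorted(remaining, key=lambda s: -QUALITY[s]):
            if QUALITY[service] >= QUALITY[target] and remaining[target] > 0:
                used = min(mw, remaining[target])
                remaining[target] -= used
                mw -= used
                cost += used * price
            if mw == 0:
                break
    return cost, remaining
```

Here a cheap Regulation Up bid can cascade down to cover spinning-reserve needs, which is what yields the better economic performance the abstract refers to.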

Relevance:

10.00%

Publisher:

Abstract:

An ever-increasing need for extra functionality in a single embedded system demands extra Input/Output (I/O) devices, which are usually connected externally and are expensive in terms of energy consumption. To reduce their energy consumption, these devices are equipped with power-saving mechanisms. While I/O device scheduling for real-time (RT) systems with such power-saving features has been studied in the past, the use of energy resources by these scheduling algorithms may be improved. Technology enhancements in the semiconductor industry have allowed hardware vendors to reduce device transition and energy overheads. The decrease in the overhead of sleep transitions has opened new opportunities to further reduce device energy consumption. In this research effort, we propose an intra-task device scheduling algorithm for real-time systems that wakes up a device on demand and reduces its active time while ensuring system schedulability. This algorithm is extended to devices with multiple sleep states to further minimise the overall device energy consumption of the system. The proposed algorithms have lower complexity than conservative inter-task device scheduling algorithms. The system model used relaxes some of the assumptions commonly made in the state of the art that restrict its practical relevance. Apart from these advantages, the proposed algorithms are shown to achieve substantial energy savings.
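A common building block in scheduling for devices with multiple sleep states is the break-even test: pick the deepest state whose transition overhead still pays off within a guaranteed idle interval. The sketch below illustrates that idea only; the state table and all power/overhead numbers are made-up assumptions, not values or logic from the paper.

```python
# Pick the sleep state with the lowest total energy over a known idle
# interval, skipping states whose transition cannot complete in time.
# (name, power_mw, transition_time_ms, transition_energy_mj) - illustrative.
SLEEP_STATES = [
    ("active", 100.0, 0.0, 0.0),
    ("light",   30.0, 1.0, 0.2),
    ("deep",     5.0, 8.0, 1.5),
]

def pick_sleep_state(idle_ms):
    """Return the name of the most energy-efficient feasible state."""
    best = None
    for name, p_mw, t_ms, e_mj in SLEEP_STATES:
        if t_ms > idle_ms:
            continue  # transition overhead does not fit in the interval
        energy = e_mj + p_mw * (idle_ms - t_ms) / 1000.0  # total mJ
        if best is None or energy < best[1]:
            best = (name, energy)
    return best[0]
```

Shorter idle intervals favour shallower states; as the abstract notes, shrinking transition overheads make the deeper states pay off more often.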

Relevance:

10.00%

Publisher:

Abstract:

Distributed real-time systems such as automotive applications are becoming larger and more complex, thus requiring the use of more powerful hardware and software architectures. Furthermore, these distributed applications commonly have stringent real-time constraints, which implies that they would gain in flexibility if they were parallelized and distributed over the system. In this paper, we consider the problem of allocating fixed-priority fork-join parallel/distributed real-time tasks onto distributed multi-core nodes connected through a Flexible Time-Triggered Switched Ethernet network. We analyze the system requirements and present a set of formulations based on a constraint programming approach. Constraint programming allows us to express the relations between variables in the form of constraints. Our approach is guaranteed to find a feasible solution, if one exists, in contrast to approaches based on heuristics. Furthermore, approaches based on constraint programming have been shown to obtain solutions for this type of formulation in reasonable time.
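The "constraints over variables" flavour of the formulation can be shown with a toy model: assign each task to a node so that no node's utilisation exceeds 1.0, solved by exhaustive backtracking (which, like constraint programming, is complete: it finds a feasible assignment if one exists). This is an illustrative sketch; the paper's real model also encodes fixed priorities, fork-join precedence and network constraints.

```python
# Complete backtracking search for a feasible task-to-node assignment
# under a per-node utilisation capacity constraint (toy CSP model).

def allocate(task_utils, n_nodes):
    """Return a list mapping each task index to a node, or None."""
    assignment = [None] * len(task_utils)

    def search(i, loads):
        if i == len(task_utils):
            return True                       # all tasks placed feasibly
        for node in range(n_nodes):
            if loads[node] + task_utils[i] <= 1.0 + 1e-9:  # capacity
                assignment[i] = node
                loads[node] += task_utils[i]
                if search(i + 1, loads):
                    return True
                loads[node] -= task_utils[i]  # backtrack
        return False

    return assignment if search(0, [0.0] * n_nodes) else None
```

Unlike a heuristic, this search returns None only when no feasible assignment exists at all.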

Relevance:

10.00%

Publisher:

Abstract:

Currently, due to the widespread use of computers and the internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to prove theoretical results or to test hardware/products under development. Although this is an interesting solution (a low-cost, easy and fast way to carry out some coursework), it has major disadvantages. As everything is currently being done with or in a computer, students are losing the “feel” for the real values of physical magnitudes. In engineering studies, for instance, and mainly in the first years, students need to learn electronics, algorithmics, mathematics and physics. All of these areas can use numerical analysis software, simulation software or spreadsheets, and in the majority of cases the data used are either simulated or random numbers, although real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays. Also, when using spreadsheets to build graphics, instead of a random table, students could use a real dataset based, for instance, on the room temperature and its variation across the day. In this work we present a framework with a simple interface that can be used by different courses in which computers are part of the teaching/learning process, giving students a more realistic feeling by using real data. The framework is based on a set of low-cost sensors for different physical magnitudes, e.g. temperature, light and wind speed, which are connected either to a central server that students can access over an Ethernet protocol, or directly to the student's computer/laptop. These sensors use the available communication ports, such as serial ports, parallel ports, Ethernet or Universal Serial Bus (USB).
Since a central server is used, students are encouraged to use the sensor values in their different courses and consequently in different types of software, such as numerical analysis tools, spreadsheets, or simply inside any programming language when a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using different types of computer communication. As long as the sensors are attached to a server connected to the internet, these tools can also be shared between different schools, allowing sensors that are not available in a given school to be used by getting the values from other places that share them. Moreover, students in more advanced years, with (theoretically) more know-how, can use courses related to electronics development to build new sensor modules and expand the framework further. The final solution is low cost and simple to develop, and allows flexible use of resources by employing the same materials in several courses, bringing real-world data into students' computer work.
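A student client fetching readings from the central server could be as small as a parser for the server's reply. The abstract does not specify the wire format, so this sketch assumes a hypothetical plain-text "name=value" protocol; the function name and format are assumptions.

```python
# Hypothetical client-side parsing of sensor readings from the central
# server, assuming an illustrative "name=value" line-oriented reply.

def parse_readings(payload):
    """Parse lines such as 'temperature=23.5' into a dict of floats.
    Blank or malformed lines are skipped."""
    readings = {}
    for line in payload.splitlines():
        line = line.strip()
        if not line or "=" not in line:
            continue
        name, value = line.split("=", 1)
        readings[name.strip()] = float(value)
    return readings
```

The resulting dict can then feed a spreadsheet export, a numerical analysis exercise, or any course program that needs a real dataset.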

Relevance:

10.00%

Publisher:

Abstract:

This report presents the project developed at Casa-Acolhimento Santa Marta, whose purpose was to promote active and successful ageing with a view to improving the quality of life of the elderly people who attend the institution's day-care centre. The knowledge co-constructed with the elderly people and the institution's professionals allowed the design and development of the project "Não nos deixem dormir…" ("Don't let us sleep…"). As a project prepared together with the individuals, it favoured the assumptions of participatory action-research methodology. Inherent to this positioning, by encouraging the exploration and use of endogenous resources and potential, and by seeking to mitigate or solve the underlying problems and needs, the aim was to make the subjects actors and authors of their own lives. Thus, starting from the elderly people's contributions and needs, the project justifies its importance, namely through actions that provided a larger number of activities in line with their expectations and interests, and that promoted interpersonal relationships, providing moments of conviviality and dialogue, fostering self-knowledge and mutual knowledge as well as mutual respect among the elderly. To support the research and the intervention, theoretical contributions related mainly to old age were mobilised, which became essential for problematisation, reflection and action. Carrying out the project also allowed constant reflection on the role of the Social Educator with the elderly population, as well as on the relevance of their presence in this field of intervention.

Relevance:

10.00%

Publisher:

Abstract:

In recent years, easy and cheap access to tools for producing, editing and distributing audiovisual content has contributed to an exponential increase in the daily production of such content. In this paradigm of multimedia superabundance, a large percentage of video sequences contain explicit material, and stricter control is needed so that it is not easily accessible to minors. The concept of explicit content can be characterised in different ways; the work described in this document focused on the automatic detection of female nudity in video sequences. This process of automatic detection and classification of adult material can be an important tool in the management of a television channel: hundreds of hours of material may be received daily, making a manual quality-control process impracticable. The solution created in the context of this dissertation was studied and developed around a specific broadcasting product, the mxfSPEEDRAIL F1000, a solution from the company MOG Technologies. The main objective of the project is the development of a C++ library, accessible during the ingest process, that uses computer-vision techniques to detect and flag, in the signal's metadata, which frames potentially contain explicit content. The developed solution uses a set of state-of-the-art techniques adapted to the problem at hand, including algorithms for skin segmentation and object detection in images. Finally, a critical analysis of the solution developed in this dissertation is carried out so that future work can improve both its resource consumption during analysis and its success rate.
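Skin segmentation, one of the building blocks the dissertation mentions, is often bootstrapped with an explicit colour rule. The sketch below uses a classic per-pixel RGB threshold rule (Peer et al. style, for daylight illumination) purely as an illustration; it is not the thesis's C++ implementation, and a real detector would combine it with object detection as the abstract describes.

```python
# Classic explicit RGB skin rule applied per pixel (illustrative sketch).

def is_skin(r, g, b):
    """Heuristic skin test for 8-bit RGB under daylight illumination."""
    return (r > 95 and g > 40 and b > 20 and
            max(r, g, b) - min(r, g, b) > 15 and   # enough colour spread
            abs(r - g) > 15 and r > g and r > b)   # red-dominant pixel

def skin_mask(image):
    """image: 2-D list of (r, g, b) tuples -> 2-D list of booleans."""
    return [[is_skin(*px) for px in row] for row in image]
```

The fraction of True pixels per frame is a cheap first signal for flagging frames in the metadata before a heavier object-detection pass.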

Relevance:

10.00%

Publisher:

Abstract:

To improve surgical safety and to reduce mortality and the incidence of surgical complications, the World Health Organization (WHO) developed the Surgical Safety Checklist (SSC). The SSC is an information support that helps health professionals reduce the number of complications, with checks performed before the induction of anaesthesia, before skin incision and before the patient leaves the operating room (OR). The SSC was tested in several countries, and the results showed that after its introduction the incidence of patient complications fell from 11.0% to 7.0% (P < 0.001), the death rate declined from 1.5% to 0.8% (P = 0.003), and nurses recognised that patient identity was confirmed more often (81.6% to 94.2%, P < 0.01) in many institutions. Recently the SSC was also implemented in Portuguese hospitals, which led us to study it in a real clinical environment. An observational study was performed: several health professionals were observed and interviewed to understand how the SSC functions in an OR during clinical routine. The objective of this study was to understand the current use of the SSC and how its usability may be improved, taking advantage of technological advancements such as mobile applications. Over two days 14 surgeries were observed; only 2 met the requirements for the three phases of the SSC as defined by the WHO. Of the remaining 12 surgeries, 9 completed the last phase at the correct time. It was also observed that in only 2 surgeries were all phases of the SSC read aloud to the team, and that in 7 surgeries several items were read aloud and answered but no one was checking the SSC until after the end of the phase. The observational study results disclose that several health professionals do not comply with the rules of the WHO manual.
This study demonstrates that it is urgent to change the mindset of health professionals, and that different features in the SSC may be useful to make it easier to use. Based on the results of the observational study, an SSC application proposal was developed with new functionalities to improve and aid health professionals in its use. In this application the user can choose between an SSC already created for a specific surgery or creating a new SSC, adding and adapting questions from the WHO standard. To create a new SSC, the application is connected to an online questionnaire builder (JotForm). The choice of this builder was based on three essential characteristics: the number of question types (mainly checkbox, radio button and text), the possibility of creating sections inside sections, and the API. In addition, the proposed improvements focus on guiding the user through the workflow of the SSC and on saving input timestamps and every action performed. The following features were therefore implemented: display one item of the SSC at a time; display the stage the SSC is in; do not allow going back to the previous step; do not allow advancing to the next item if the current one is not filled; do not allow advancing if the time taken to fill the item was too short; and log every action made by the user.
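The workflow rules listed above map naturally onto a small state machine: one item at a time, no going back, no advancing on an empty answer or one filled suspiciously fast, every action timestamped. The class below is a minimal sketch of those rules only; the class and method names are illustrative, not the actual application's API.

```python
import time

# Minimal sketch of the proposed checklist workflow rules.
class ChecklistRunner:
    def __init__(self, items, min_fill_seconds=1.0, clock=time.monotonic):
        self.items = items
        self.index = 0                 # only ever moves forward
        self.min_fill = min_fill_seconds
        self.clock = clock             # injectable for testing
        self.log = []                  # (timestamp, action) pairs
        self.shown_at = clock()

    def current_item(self):
        return self.items[self.index] if self.index < len(self.items) else None

    def answer(self, value):
        """Record the answer; advance only if it passes the rules."""
        now = self.clock()
        self.log.append((now, f"answer:{self.items[self.index]}={value}"))
        if not value:
            return False               # empty answer: do not advance
        if now - self.shown_at < self.min_fill:
            return False               # filled too fast: do not advance
        self.index += 1                # advance; there is no way back
        self.shown_at = now
        return True
```

Because every call is logged with a timestamp before the rules are applied, even rejected answers leave the audit trail the proposal asks for.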

Relevance:

10.00%

Publisher:

Abstract:

In the traditional paradigm, large power plants supply the reactive power required at the transmission level, while capacitors and transformer tap changers are used at the distribution level. However, in the near future it will be necessary to schedule both active and reactive power at the distribution level, due to the high number of resources connected there. This paper proposes a new multi-objective methodology for optimal resource scheduling, considering distributed generation, electric vehicles and capacitor banks for the joint scheduling of active and reactive power. The proposed methodology considers the minimization of the cost (economic perspective) of all distributed resources and the minimization of the voltage magnitude difference (technical perspective) at all buses. The Pareto front is determined and a fuzzy-based mechanism is applied to present the best compromise solution. The proposed methodology has been tested on a 33-bus distribution network. The case study shows the results of three different scenarios for the economic, technical and multi-objective perspectives, and the results demonstrate the importance of incorporating reactive scheduling in the distribution network, using the multi-objective perspective to obtain the best compromise between the economic and technical perspectives.
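The decision step described above, extracting the Pareto front of (cost, voltage-deviation) solutions and picking a compromise, can be sketched with the usual fuzzy max-min rule. This is an illustrative sketch of the generic technique, not the paper's exact formulation; the sample points are made up.

```python
# Pareto front extraction and fuzzy max-min compromise for two objectives,
# both to be minimised: (cost, voltage deviation).

def pareto_front(points):
    """Keep points not dominated by any other point in both objectives."""
    return [p for p in points
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p
                       for q in points)]

def fuzzy_compromise(front):
    """Normalise each objective to a [0, 1] membership (1 = best value on
    the front) and return the solution with the largest minimum membership."""
    f1 = [p[0] for p in front]
    f2 = [p[1] for p in front]

    def mu(v, lo, hi):
        return 1.0 if hi == lo else (hi - v) / (hi - lo)

    return max(front, key=lambda p: min(mu(p[0], min(f1), max(f1)),
                                        mu(p[1], min(f2), max(f2))))
```

The max-min rule deliberately avoids solutions that are excellent on one objective but poor on the other, which is the "best compromise" notion the abstract uses.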

Relevance:

10.00%

Publisher:

Abstract:

The high penetration of distributed energy resources (DER) in distribution networks and the competitive environment of electricity markets impose the use of new approaches in several domains. Network cost allocation, traditionally used in transmission networks, should be adapted to distribution networks, considering the characteristics of the connected resources. The main goal is to develop a fairer methodology that distributes the distribution-network use costs among all players using the network in each period. In this paper, a model considering different types of costs (fixed, losses and congestion costs) is proposed, comprising the use of a large set of DER, namely distributed generation (DG), demand response (DR) of the direct load control type, energy storage systems (ESS), and electric vehicles with the capability of discharging energy to the network, known as vehicle-to-grid (V2G). The proposed model includes three distinct phases of operation. The first phase consists of an economic dispatch based on an AC optimal power flow (AC-OPF); in the second phase Kirschen's and Bialek's tracing algorithms are used and compared to evaluate the impact of each resource on the network; finally, the MW-mile method is used in the third phase. A 33-bus distribution network with large penetration of DER is used to illustrate the application of the proposed model.
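The third-phase MW-mile method charges each player in proportion to the flow it causes on each line, times that line's length and cost rate. A minimal sketch follows, assuming the per-player line flows come from the tracing phase; the numbers in the usage test are illustrative, not case-study data.

```python
# MW-mile allocation sketch: charge(player) = sum over lines of
#   (MW the player contributes on the line) * (line miles) * (rate per MW-mile)

def mw_mile_charges(player_flows, line_cost_per_mw_mile, line_miles):
    """player_flows: {player: {line: MW contributed}} -> {player: cost}."""
    charges = {}
    for player, flows in player_flows.items():
        charges[player] = sum(
            mw * line_miles[line] * line_cost_per_mw_mile[line]
            for line, mw in flows.items())
    return charges
```

Because the per-player flows come from a tracing algorithm (Kirschen's or Bialek's), the resulting charges reflect actual network use rather than a flat fee.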

Relevance:

10.00%

Publisher:

Abstract:

Health promotion in hospital environments can be improved using the most recent information and communication technologies. Internet connectivity to small sensor nodes carried by patients allows remote access to their bio-signals. To provide these features, healthcare wireless sensor networks (HWSNs) are used. In these networks, mobility support is a key issue in order to keep patients under real-time monitoring even when they move around. To stay connected to the network, sensors must change their access point of attachment when patients move to a new coverage area within an infirmary. This process, called handover, is responsible for maintaining continuous network connectivity to the sensors. This paper presents a detailed performance evaluation of three handover mechanisms for healthcare scenarios (Hand4MAC, RSSI-based, and Backbone-based). The study was performed by simulation, using several scenarios with different numbers of sensors and different movement speeds of the sensor nodes. The results show that Hand4MAC is the best solution, guaranteeing almost continuous connectivity to sensor nodes with the lowest energy consumption.
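One of the compared baselines, RSSI-based handover, reduces to a simple rule: switch to a new access point only when its signal beats the current one by a hysteresis margin, which prevents ping-pong between two access points of similar strength. The sketch below illustrates that generic rule; the margin and values are assumptions, not the paper's parameters.

```python
# RSSI-based handover decision with hysteresis (illustrative sketch).

def choose_access_point(current_ap, rssi_dbm, hysteresis_db=5.0):
    """rssi_dbm: {ap_name: RSSI in dBm}. Return the AP to attach to.

    Hand over only when the strongest AP beats the current one by at
    least hysteresis_db; otherwise stay attached to avoid ping-pong.
    """
    best_ap = max(rssi_dbm, key=rssi_dbm.get)
    if best_ap != current_ap and \
       rssi_dbm[best_ap] >= rssi_dbm[current_ap] + hysteresis_db:
        return best_ap      # hand over to the stronger access point
    return current_ap       # stay attached
```

Mechanisms such as Hand4MAC improve on this by preparing the next attachment in advance, which is what yields the near-continuous connectivity the results report.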

Relevance:

10.00%

Publisher:

Abstract:

Sectorization means dividing a whole into parts (sectors), a procedure that occurs in many contexts and applications, usually to achieve some goal or to facilitate an activity. The objective may be better organization, or the simplification of a large problem into smaller sub-problems. Examples of applications are political districting and sales territory division. When designing or comparing sectors, characteristics such as contiguity, equilibrium and compactness are usually considered. This paper presents and describes new generic measures and proposes a new measure, desirability, connected with the idea of preference.
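The equilibrium characteristic mentioned above is typically quantified by how evenly some quantity (demand, workload) is spread across sectors. The formula below, maximum relative deviation from the mean sector load, is one simple such measure given for illustration; it is not the paper's exact definition.

```python
# Illustrative equilibrium measure: largest relative deviation of any
# sector's load from the mean load (0.0 means perfectly balanced sectors).

def equilibrium_measure(sector_loads):
    mean = sum(sector_loads) / len(sector_loads)
    return max(abs(load - mean) for load in sector_loads) / mean
```

Lower values indicate better-balanced sectors; a measure like this can be computed alongside contiguity and compactness when comparing candidate sectorizations.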

Relevance:

10.00%

Publisher:

Abstract:

Sectorization means dividing a set of basic units into sectors or parts, a procedure that occurs in several contexts, such as political, health and school districting, social networks, and sales territory or airspace assignment, to achieve some goal or to facilitate an activity. This presentation focuses on three main issues: measures, a new approach to sectorization problems, and an application in waste collection. When designing or comparing sectors, different characteristics are usually taken into account. Some are commonly used and relate to the concepts of contiguity, equilibrium and compactness. These fundamental characteristics are addressed by defining new generic measures and by proposing a new measure, desirability, connected with the idea of preference. A new approach to sectorization inspired by Coulomb's Law, which establishes a force between electrically charged points, is proposed. A charged point represents a small region with specific characteristics/values, creating pairwise relations of attraction/repulsion with the others, proportional to the charges and inversely proportional to their distance. Finally, a real case of sectorization and vehicle routing in solid waste collection is discussed.
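The Coulomb-inspired pairwise relation can be sketched directly from the analogy: force proportional to the product of the two units' "charges" and inversely proportional to their distance. Whether the distance exponent is 1 or 2 is a modelling choice; the sketch below uses 2 by analogy with the physical law, which may differ from the presentation's exact formulation.

```python
import math

# Coulomb-analogy force between two basic units (illustrative sketch):
# proportional to the product of charges, inversely proportional to
# squared distance. Signs of the charges encode attraction/repulsion.

def coulomb_force(charge_a, charge_b, pos_a, pos_b):
    """Force between two charged points at positions pos_a and pos_b."""
    dist = math.dist(pos_a, pos_b)
    if dist == 0:
        raise ValueError("coincident units have undefined force")
    return charge_a * charge_b / dist ** 2
```

Summing these pairwise forces over all unit pairs gives the attraction/repulsion field that drives units toward forming sectors.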

Relevance:

10.00%

Publisher:

Abstract:

This academic internship report describes the work carried out at the company Civigest – Gestão de Projetos, Lda. over a period of six months, between December 2012 and June 2013. The report reflects a productive period at the company, during which diversified tasks were performed, with the greatest emphasis on the study and preparation of designs and on construction supervision in all its aspects. It therefore addresses areas of knowledge directly related to civil engineering design. In the context of planning in engineering design companies, forecasts are an indispensable tool for meeting objectives. Project scheduling and the staff's effort to comply with it are fundamental, since the fixed costs of this activity stem mainly from labour costs. Inseparably linked to the studies and design department is the budgeting department, which helps the designers choose solutions that do not compromise the target values of the works. The report describes the activities carried out in the organisation's various departments, with emphasis on the construction supervision department, whose main objectives are the coordination and management of works, involving the preparation and approval of measurement and payment records, verification of strict compliance with the designs, promotion of design improvements, verification of the technical specifications of materials and their correct application, and the promotion of a good relationship and dialogue between the stakeholders of the works. The time spent in the construction supervision department is crucial for a better perception of the difficulties found on site. The report also addresses another activity, the analysis of construction solutions aimed at the thermal performance of buildings.
This is the focus of the report, in which some cases of works started at Civigest and later followed on site are presented; for these, thermal improvement solutions are analysed. With less emphasis, but no less importance, the profession of on-site safety coordination is also discussed.