929 results for Running Kinematics


Relevance: 10.00%

Abstract:

Reconfigurable computing has experienced considerable expansion in the last few years, due in part to the fast run-time partial reconfiguration features offered by recent SRAM-based Field Programmable Gate Arrays (FPGAs), which allow the real-time implementation of dynamic resource allocation strategies, with multiple independent functions from different applications sharing the same logic resources in the spatial and temporal domains. However, when the sequence of reconfigurations to be performed is not predictable, the efficient management of the available logic space becomes the greatest challenge posed to these systems. Resource allocation decisions have to be made concurrently with system operation, taking into account function priorities and optimizing the space currently available. As a consequence of the unpredictability of this allocation procedure, the logic space becomes fragmented, with many small areas of free resources that fail to satisfy most requests and so remain unused. A rearrangement of the currently running functions is therefore necessary to obtain enough contiguous space to implement incoming functions, avoiding the spreading of their components and the resulting degradation of system performance. A novel active relocation procedure for Configurable Logic Blocks (CLBs) is presented here, able to carry out online rearrangements, defragmenting the available FPGA resources without disturbing the functions currently running.
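The abstract does not detail the relocation procedure itself, so the following is only a minimal sketch of the defragmentation idea it motivates: modelling the reconfigurable fabric as a 1-D array of slots and compacting running functions toward one end so that the free slots become contiguous. The `Function` record, the slot granularity and the compact-to-the-left policy are assumptions for illustration, not the authors' method.

```python
from dataclasses import dataclass

@dataclass
class Function:
    name: str
    start: int   # first occupied slot (e.g. CLB column)
    width: int   # number of contiguous slots used

def largest_free_gap(functions, total_slots):
    """Return the size of the largest contiguous free region."""
    occupied = sorted((f.start, f.width) for f in functions)
    gaps, cursor = [], 0
    for start, width in occupied:
        gaps.append(start - cursor)
        cursor = start + width
    gaps.append(total_slots - cursor)
    return max(gaps)

def compact(functions):
    """Relocate functions toward slot 0 so all free space becomes one gap.
    Returns the (name, old_start, new_start) moves to perform."""
    moves, cursor = [], 0
    for f in sorted(functions, key=lambda f: f.start):
        if f.start != cursor:
            moves.append((f.name, f.start, cursor))
            f.start = cursor
        cursor += f.width
    return moves

# Example: 20 slots, three running functions leaving fragmented free space.
fns = [Function("A", 0, 4), Function("B", 7, 3), Function("C", 14, 2)]
print(largest_free_gap(fns, 20))   # 4 -> a 5-slot request would fail
print(compact(fns))                # moves B to slot 4 and C to slot 7
print(largest_free_gap(fns, 20))   # 11 -> the same request now fits
```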

Relevance: 10.00%

Abstract:

Recent integrated circuit technologies have opened the possibility of designing parallel architectures with hundreds of cores on a single chip. The design space of these parallel architectures is huge, with many architectural options. Exploring the design space gets even more difficult if, beyond performance and area, we also consider extra metrics like performance and area efficiency, where the designer tries to obtain the best performance per chip area and the best sustainable performance. In this paper we present an algorithm-oriented approach to designing a many-core architecture. Instead of exploring the design space of the many-core architecture based on the experimental execution results of a particular benchmark of algorithms, our approach is to make a formal analysis of the algorithms considering the main architectural aspects and to determine how each particular architectural aspect is related to the performance of the architecture when running an algorithm or set of algorithms. The architectural aspects considered include the number of cores, the local memory available in each core, the communication bandwidth between the many-core architecture and the external memory, and the memory hierarchy. To exemplify the approach we carried out a theoretical analysis of a dense matrix multiplication algorithm and determined an equation that relates the number of execution cycles to the architectural parameters. Based on this equation, a many-core architecture was designed. The results obtained indicate that a 100 mm² integrated circuit implementation of the proposed architecture, using a 65 nm technology, is able to achieve 464 GFLOPs (double-precision floating point) for a memory bandwidth of 16 GB/s. This corresponds to a performance efficiency of 71%. Considering a 45 nm technology, a 100 mm² chip attains 833 GFLOPs, which corresponds to 84% of peak performance. These figures are better than those obtained by previous many-core architectures, except for the area efficiency, which is limited by the lower memory bandwidth considered. The results achieved are also better than those of previous state-of-the-art many-core architectures designed specifically to achieve high performance for matrix multiplication.
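The abstract does not reproduce the cycle-count equation, so the sketch below only illustrates the kind of analytic model it refers to: a roofline-style estimate for blocked dense matrix multiplication, where attainable performance is the minimum of the compute peak (cores x FLOPs/cycle x frequency) and the bandwidth-limited rate (arithmetic intensity x external bandwidth), with the tile size bounded by on-chip memory. All numeric parameters (frequency, FLOPs per cycle per core, on-chip memory size) are assumptions for illustration, not the paper's values.

```python
import math

def blocked_matmul_estimate(cores, on_chip_bytes, bw_bytes_s,
                            freq_hz, flops_per_cycle_per_core,
                            word_bytes=8):
    """Roofline-style estimate for blocked dense matrix multiplication.

    With tile edge b chosen so that three b x b tiles fit in on-chip memory,
    roughly 2*b^3 FLOPs are performed for every 3*b^2 words moved from
    external memory, so arithmetic intensity grows with b.
    """
    b = int(math.sqrt(on_chip_bytes / (3 * word_bytes)))      # tile edge
    intensity = (2 * b**3) / (3 * b**2 * word_bytes)          # FLOPs per byte
    peak = cores * flops_per_cycle_per_core * freq_hz         # compute roof
    bw_bound = intensity * bw_bytes_s                         # memory roof
    return b, peak, bw_bound, min(peak, bw_bound)

# Hypothetical design point (illustrative numbers, not the paper's).
b, peak, bw_bound, attainable = blocked_matmul_estimate(
    cores=256, on_chip_bytes=2 * 1024 * 1024, bw_bytes_s=16e9,
    freq_hz=1.0e9, flops_per_cycle_per_core=2)
print(f"tile edge b = {b}")
print(f"compute peak    = {peak / 1e9:.0f} GFLOP/s")
print(f"bandwidth bound = {bw_bound / 1e9:.0f} GFLOP/s")
print(f"attainable      = {attainable / 1e9:.0f} GFLOP/s")
```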

Relevance: 10.00%

Abstract:

This paper discusses the results of applied research in the eco-driving domain, based on a very large data set produced by a fleet of Lisbon's public transportation buses over a three-year period. This data set is built from events automatically extracted from the controller area network (CAN) bus and enriched with GPS coordinates, weather conditions, and road information. We apply online analytical processing (OLAP) and knowledge discovery (KD) techniques to deal with the high volume of this data set, to determine the major factors that influence average fuel consumption, and then to classify the drivers involved according to their driving efficiency. Consequently, we identify the most appropriate driving practices and styles. Our findings show that introducing simple practices, such as optimal clutch use, engine rotation, and engine idling, can reduce fuel consumption on average by 3 to 5 l/100 km, meaning a saving of about 30 l per bus per day. These findings have been strongly considered in the drivers' training sessions.
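As a rough illustration of the driver-classification step described above (not the authors' OLAP/KD pipeline), the snippet below aggregates hypothetical trip records per driver, computes average consumption in l/100 km, and ranks drivers by efficiency; the column names and the tertile-based efficiency classes are assumptions for illustration.

```python
import pandas as pd

# Hypothetical per-trip records extracted from the CAN bus and GPS logs.
trips = pd.DataFrame({
    "driver_id":   [1, 1, 2, 2, 3, 3, 4, 4],
    "distance_km": [180, 210, 195, 200, 175, 220, 190, 205],
    "fuel_l":      [78, 92, 70, 73, 95, 118, 66, 70],
})

per_driver = (
    trips.groupby("driver_id")
         .agg(total_km=("distance_km", "sum"), total_fuel=("fuel_l", "sum"))
)
per_driver["l_per_100km"] = 100 * per_driver["total_fuel"] / per_driver["total_km"]

# Classify drivers into three efficiency classes by consumption tertiles.
per_driver["class"] = pd.qcut(per_driver["l_per_100km"], 3,
                              labels=["efficient", "average", "inefficient"])
print(per_driver.sort_values("l_per_100km"))
```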

Relevance: 10.00%

Abstract:

This article reports findings from a project, DONA EMPRESA, that the Portuguese Association of Women Entrepreneurs has been promoting for four years. The project aims to support unemployed women who have a business idea in creating their own employment. So far, about one hundred enterprises have been created within the scope of this project, and their survival rate after one year of operation is very high.

Relevance: 10.00%

Abstract:

Sandwich structures with soft cores are widely used in applications where high bending stiffness is required without compromising the overall weight of the structure, as well as in situations where good thermal and damping properties are important. As equivalent single-layer approaches are not the most adequate to realistically describe the kinematics, the stress distributions, and the dynamic behaviour of this type of sandwich, where shear deformations and the extensibility of the core can be very significant, layerwise models may provide better solutions. Additionally, in connection with this multilayer approach, selecting different shear deformation theories according to the nature of the material that constitutes the core and the outer skins can predict the sandwich behaviour more accurately. In the present work the authors consider the use of different shear deformation theories to formulate different layerwise models, implemented through kriging-based finite elements. The viscoelastic material behaviour associated with the sandwich core is modelled using the complex modulus approach, and the dynamic problem is solved in the frequency domain. The outer elastic layers considered in this work may also be made from different nanocomposites. The performance of the models developed is illustrated through a set of test cases. (C) 2015 Elsevier Ltd. All rights reserved.
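As a minimal sketch of the frequency-domain solution with a complex (hysteretic) core stiffness mentioned above, and not of the layerwise kriging-based elements themselves, the example below assembles a two-DOF spring-mass model in which the "core" spring carries a loss factor eta, so its stiffness becomes k_c(1 + i*eta), and computes the receptance over a frequency band. All numerical values are assumptions for illustration.

```python
import numpy as np

# Two-DOF model: an elastic "skin" spring k_s to ground and a viscoelastic
# "core" spring between the masses, modelled as a complex stiffness.
m1, m2 = 1.0, 0.5          # masses [kg]
k_s = 4.0e4                # elastic skin stiffness [N/m]
k_c = 1.0e4                # core storage stiffness [N/m]
eta = 0.3                  # core loss factor
k_core = k_c * (1 + 1j * eta)

M = np.array([[m1, 0.0], [0.0, m2]])
K = np.array([[k_s + k_core, -k_core],
              [-k_core,       k_core]], dtype=complex)

f = np.zeros(2, dtype=complex)
f[1] = 1.0                 # unit harmonic force on the second mass

freqs = np.linspace(1.0, 120.0, 600)          # [Hz]
receptance = np.empty_like(freqs)
for i, fr in enumerate(freqs):
    w = 2 * np.pi * fr
    x = np.linalg.solve(K - w**2 * M, f)      # steady-state amplitudes
    receptance[i] = np.abs(x[1])

print("peak receptance %.3e m/N at %.1f Hz"
      % (receptance.max(), freqs[receptance.argmax()]))
```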

Relevance: 10.00%

Abstract:

ABSTRACT: Since 1976 the Armed Forces have been developing measures to prevent the use of drugs and alcohol. In the 1980s a laboratory facility was created and a programme of toxicological screening was started. In the five years from 2001 to 2005, the proportions of positive results, combining all types of screening, ranged from 3.7% to 1.5%. From October 2006 to July 2007, a cross-sectional analytical study was carried out to estimate the prevalence of drug use (cannabinoids, opiates, cocaine and amphetamines) in one branch of the Portuguese Armed Forces, based on the screenings performed by its laboratory, using a simple random sample of 1039 active-duty military personnel of both sexes, career (QP) and enlisted (RC). Dedicated computer support was used throughout the process, from the appointment of the personnel to be screened, through the chain of custody of the samples, to the delivery of the result. The testing process used two immunoassay screening techniques and GC/MS confirmation technology, in accordance with European recommendations, allowing a standard methodology to be established for organizations and companies. The estimated prevalence of drug users was 3.8/1,000, with a 0.37% error (95% confidence interval). The number of cases registered (4) did not permit the use of statistical tests to identify characteristics associated with positivity, but it nevertheless revealed unexpected aspects. In conclusion, the observation of case series and the regular conduct of epidemiological studies are suggested, to help redefine target groups and to understand the extent, determinants and consequences of drug use.
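As a quick check of the figures quoted above (a sketch, assuming the 0.37% error refers to a normal-approximation 95% confidence half-width for 4 positives out of 1039), the snippet below computes the point prevalence and the Wald interval.

```python
import math

positives, n = 4, 1039
p = positives / n                       # point prevalence
se = math.sqrt(p * (1 - p) / n)         # standard error (Wald)
half_width = 1.96 * se                  # 95% confidence half-width

print(f"prevalence     = {1000 * p:.1f} per 1,000")   # ~3.8 per 1,000
print(f"95% half-width = {100 * half_width:.2f}%")    # ~0.37-0.38%
```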

Relevance: 10.00%

Abstract:

Consider the problem of assigning implicit-deadline sporadic tasks on a heterogeneous multiprocessor platform comprising two different types of processors—such a platform is referred to as two-type platform. We present two low degree polynomial time-complexity algorithms, SA and SA-P, each providing the following guarantee. For a given two-type platform and a task set, if there exists a task assignment such that tasks can be scheduled to meet deadlines by allowing them to migrate only between processors of the same type (intra-migrative), then (i) using SA, it is guaranteed to find such an assignment where the same restriction on task migration applies but given a platform in which processors are 1+α/2 times faster and (ii) SA-P succeeds in finding a task assignment where tasks are not allowed to migrate between processors (non-migrative) but given a platform in which processors are 1+α times faster. The parameter 0<α≤1 is a property of the task set; it is the maximum of all the task utilizations that are no greater than 1. We evaluate average-case performance of both the algorithms by generating task sets randomly and measuring how much faster processors the algorithms need (which is upper bounded by 1+α/2 for SA and 1+α for SA-P) in order to output a feasible task assignment (intra-migrative for SA and non-migrative for SA-P). In our evaluations, for the vast majority of task sets, these algorithms require significantly smaller processor speedup than indicated by their theoretical bounds. Finally, we consider a special case where no task utilization in the given task set can exceed one and for this case, we (re-)prove the performance guarantees of SA and SA-P. We show, for both of the algorithms, that changing the adversary from intra-migrative to a more powerful one, namely fully-migrative, in which tasks can migrate between processors of any type, does not deteriorate the performance guarantees. For this special case, we compare the average-case performance of SA-P and a state-of-the-art algorithm by generating task sets randomly. In our evaluations, SA-P outperforms the state-of-the-art by requiring much smaller processor speedup and by running orders of magnitude faster.
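The speedup bounds quoted above depend only on alpha, the largest task utilization not exceeding 1. As a small illustration (assuming a task set described simply by its utilizations, not the SA or SA-P assignment algorithms themselves), the snippet below computes alpha and the processor speedups 1 + alpha/2 and 1 + alpha that the two guarantees refer to.

```python
def speedup_bounds(utilizations):
    """Compute alpha and the SA / SA-P speedup factors for a task set.

    Per the abstract, alpha is the maximum over the task utilizations
    that are no greater than 1.
    """
    eligible = [u for u in utilizations if u <= 1.0]
    alpha = max(eligible)
    return alpha, 1 + alpha / 2, 1 + alpha   # (alpha, SA bound, SA-P bound)

# Example task set given by its utilizations.
utils = [0.20, 0.55, 0.75, 0.40]
alpha, sa, sa_p = speedup_bounds(utils)
print(f"alpha = {alpha:.2f}, SA needs x{sa:.3f} faster, SA-P needs x{sa_p:.2f} faster")
```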

Relevance: 10.00%

Abstract:

Task scheduling is one of the key mechanisms to ensure timeliness in embedded real-time systems. Such systems often need to execute not only application tasks but also some urgent routines (e.g. error-detection actions, consistency checkers, interrupt handlers) with minimum latency. Although fixed-priority schedulers such as Rate-Monotonic (RM) are in line with this need, they usually make only a low processor utilization available to the system. Moreover, this availability usually decreases with the number of tasks considered. If dynamic-priority schedulers such as Earliest Deadline First (EDF) are applied instead, high system utilization can be guaranteed, but the minimum latency for executing urgent routines may not be ensured. In this paper we describe a scheduling model according to which urgent routines are executed at the highest priority level and all other system tasks are scheduled by EDF. We show that the guaranteed processor utilization for the assumed scheduling model is at least as high as the one provided by RM for two tasks, namely 2(√2 − 1). Seven polynomial-time tests for checking system timeliness are derived and proved correct. The proposed tests are compared against each other and against an exact but exponential running-time test.
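For reference, the quoted bound is the classical Liu and Layland least upper bound on RM schedulable utilization, n(2^(1/n) − 1), evaluated at n = 2. The short sketch below is only a check of that constant, not one of the paper's seven tests.

```python
def rm_utilization_bound(n: int) -> float:
    """Liu & Layland least upper bound on RM schedulable utilization."""
    return n * (2 ** (1.0 / n) - 1)

print(rm_utilization_bound(2))   # 2*(sqrt(2)-1) = 0.8284...
# Per the abstract, the model (urgent routines at top priority + EDF for
# the remaining tasks) guarantees at least this much utilization.
```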

Relevance: 10.00%

Abstract:

This paper presents a new communication architecture to enable the remote control, monitoring and debugging of embedded-system controllers designed using IOPT Petri nets. IOPT Petri nets and the related tools (http://gres.uninova.pt) have been used as a rapid prototyping and development framework, including model-checking, simulation and automatic code generation tools. The new architecture adds remote operation capabilities to the controllers produced by the automatic code generators, enabling quasi-real-time remote debugging and monitoring using the IOPT simulator tool. Furthermore, it enables the creation of graphical user interfaces for remote operation and the development of distributed systems where a Petri net model running on a central system supervises the actions of multiple remote subsystems. © 2015 IEEE.
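As a rough illustration of the kind of controller state such an architecture exposes (a toy place/transition net executor, not the IOPT semantics, the generated code, or the paper's protocol), the sketch below fires enabled transitions and reports the current marking after each step, which is the state a remote debugging or monitoring tool would want to observe.

```python
# Toy place/transition net: transitions consume/produce tokens in places.
# A remote monitor could poll the reported marking after every step.
net = {
    "t_start": {"in": {"idle": 1}, "out": {"running": 1}},
    "t_stop":  {"in": {"running": 1}, "out": {"idle": 1}},
}
marking = {"idle": 1, "running": 0}

def enabled(t):
    return all(marking[p] >= w for p, w in net[t]["in"].items())

def fire(t):
    for p, w in net[t]["in"].items():
        marking[p] -= w
    for p, w in net[t]["out"].items():
        marking[p] += w

for step, t in enumerate(["t_start", "t_stop", "t_start"]):
    if enabled(t):
        fire(t)
    # Status line a remote monitoring tool might consume (illustrative only).
    print(f"step={step} fired={t} marking={marking}")
```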

Relevance: 10.00%

Abstract:

Abstract: On the basis of Gollwitzer's (1993, 1999) implementation intentions concept and Kirsch and Lynn's (1997) response set theory, this dissertation tested the effectiveness of an intervention combining implementation intentions with hypnosis and posthypnotic suggestion in enhancing adherence to a simple (mood report) and a difficult (physical activity) health-related task. Participants were enrolled in a university in New Jersey (N=124, Study 1, USA) and in two universities in Lisbon (N=323, Study 2, Portugal). In both studies participants were selected from a broader sample based on their suggestibility scores on the Waterloo-Stanford Group C (WSGC) scale of hypnotic susceptibility and then randomly assigned to the experimental groups. Study 1 used a 2x2x3 factorial design (instruction x hypnosis x level of suggestibility) and Study 2 used a 2x2x2x4 factorial design (task x instruction x hypnosis x level of suggestibility). In Study 1 participants were asked to run in place for 5 minutes each day for a three-week period, to take their pulse rate before and after the activity, and to send a daily e-mail report to the experimenter, thus providing both a self-report and a behavioral measure of adherence. Participants in the goal intention condition were simply asked to run in place and send the e-mail once a day. Those in the implementation intention condition were further asked to specify the exact place and time at which they would perform the physical activity and send the e-mail. In addition, half of the participants were hypnotized and given a posthypnotic suggestion indicating that the thought of running in place would come to mind without effort at the appropriate moment; the other half did not receive a posthypnotic suggestion. Study 2 followed the same procedure, but half of the participants were instead instructed to send a daily mood report by SMS (easy task) and the other half were assigned to the physical activity task described above (difficult task). Study 1 results showed a significant interaction between participants' suggestibility level and posthypnotic suggestion (p<.01), indicating that the posthypnotic suggestion enhanced adherence among highly suggestible participants but lowered it among low-suggestible individuals. No differences between the goal intention and the implementation intention groups were found. In Study 2, participants adhered significantly more (p<.001) to the easy task than to the difficult task. Results did not reveal significant differences between the implementation intentions condition, the hypnosis condition and the two combined, indicating that implementation intentions were not effective in enhancing adherence to either task and did not benefit from being combined with posthypnotic suggestion. Hypnosis with posthypnotic suggestion alone significantly reduced adherence to both tasks in comparison with participants who did not receive hypnosis. Since there were no Portuguese-language instruments to assess hypnotic suggestibility, the Waterloo-Stanford Group C (WSGC) scale of hypnotic susceptibility was translated and adapted to Portuguese and used in the screening of a sample of college students from Lisbon (N=625). The Portuguese sample showed distribution shapes and item difficulty patterns of hypnotic suggestibility scores similar to those of the reference samples, except that the proportion of Portuguese students scoring in the high range of hypnotic suggestibility was significantly lower than in the reference samples. To shed some light on the reasons for this finding, participants' attitudes toward hypnosis were surveyed using a Portuguese translation and adaptation of the Escala de Valencia de Actitudes y Creencias Hacia la Hipnosis, Versión Cliente, and compared with those of participants with no prior hypnosis experience (N=444). Significant differences were found between the two groups, with participants without hypnosis experience scoring higher on factors indicating misconceptions and negative attitudes about hypnosis.

Relevance: 10.00%

Abstract:

Remote hyperspectral sensors collect large amounts of data per flight, usually with low spatial resolution. Since the bandwidth of the connection between the satellite/airborne platform and the ground station is limited, an onboard compression method is desirable to reduce the amount of data to be transmitted. This paper presents a parallel implementation of a compressive sensing method, called parallel hyperspectral coded aperture (P-HYCA), for graphics processing units (GPUs) using the compute unified device architecture (CUDA). This method takes into account two main properties of hyperspectral data sets, namely the high correlation existing among the spectral bands and the generally low number of endmembers needed to explain the data, which largely reduces the number of measurements necessary to correctly reconstruct the original data. Experimental results conducted using synthetic and real hyperspectral data sets on two different GPU architectures by NVIDIA, the GeForce GTX 590 and the GeForce GTX TITAN, reveal that the use of GPUs can provide real-time compressive sensing performance. The achieved speedup is up to 20 times when compared with the processing time of HYCA running on one core of an Intel i7-2600 CPU (3.4 GHz) with 16 GB of memory.
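To make the "few measurements suffice when the data live in a low-dimensional subspace" argument concrete, here is a small NumPy sketch (a generic subspace-based compressive sensing illustration under assumed dimensions, not the HYCA/P-HYCA algorithm or its coded-aperture measurement model): pixels spanned by a handful of endmembers are compressed with a random measurement matrix and recovered by least squares in the known subspace.

```python
import numpy as np

rng = np.random.default_rng(0)
bands, pixels, p = 200, 500, 5          # spectral bands, pixels, endmembers

# Synthetic data: every pixel is a non-negative mixture of p endmembers.
E = rng.random((bands, p))              # endmember signatures (bands x p)
S = rng.random((p, pixels))             # abundances (p x pixels)
X = E @ S                               # hyperspectral cube, flattened

# Compressive measurements: m << bands random projections per pixel.
m = 20
A = rng.standard_normal((m, bands)) / np.sqrt(m)
Y = A @ X                               # measured data (m x pixels)

# Reconstruction assuming the spanning subspace E is known: solve
# min ||(A E) S_hat - Y|| and lift back to the full spectral space.
S_hat, *_ = np.linalg.lstsq(A @ E, Y, rcond=None)
X_hat = E @ S_hat

rel_err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
print(f"{m} of {bands} measurements per pixel, relative error = {rel_err:.2e}")
```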

Relevance: 10.00%

Abstract:

In the framework of multibody dynamics, the path motion constraint enforces that a body follows a predefined curve, with its rotations with respect to the curve's moving frame also prescribed. The kinematic constraint formulation requires the evaluation of the fourth derivative of the curve with respect to its arc length. Even though higher-order polynomials lead to unwanted curve oscillations, at least a fifth-order polynomial is required to formulate this constraint. From the point of view of geometric control, lower-order polynomials are preferred. This work shows that for multibody dynamic formulations with dependent coordinates the use of cubic polynomials is possible, with a dynamic response similar to that obtained with higher-order polynomials. The stabilization of the equations of motion, always required to control the constraint violations during long analysis periods due to the inherent numerical errors of the integration process, is enough to correct the error introduced by using a lower-order polynomial interpolation, thus forfeiting the analytical requirement for higher-order polynomials.
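The point about polynomial order can be made with a few lines of arithmetic: a cubic segment has an identically zero fourth derivative, so the analytical requirement above can only be satisfied by a quintic or higher. The sketch below (generic polynomial differentiation, not the paper's constraint equations or stabilization scheme) differentiates example cubic and quintic segments four times; the segment coefficients are assumed for illustration.

```python
def differentiate(coeffs):
    """Differentiate a polynomial given as [a0, a1, a2, ...] (ascending powers)."""
    return [i * c for i, c in enumerate(coeffs)][1:] or [0.0]

def nth_derivative(coeffs, n):
    for _ in range(n):
        coeffs = differentiate(coeffs)
    return coeffs

# Example segments r(s) parameterised by arc length (coefficients assumed).
cubic   = [0.0, 1.0, -0.5, 0.2]                 # degree 3
quintic = [0.0, 1.0, -0.5, 0.2, 0.05, -0.01]    # degree 5

print(nth_derivative(cubic, 4))    # [0.0]        -> fourth derivative vanishes
print(nth_derivative(quintic, 4))  # [1.2, -1.2]  -> non-zero, varies along s
```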

Relevance: 10.00%

Abstract:

Railway vehicle homologation with respect to running dynamics is addressed via dedicated norms. The required results, such as accelerations and/or wheel-rail contact forces obtained from experimental tests or simulations, must be available. Multibody dynamics allows the modelling of railway vehicles and their representation in real operating conditions, with the realism of the multibody models greatly influenced by the modelling assumptions. In this paper, two alternative multibody models of the Light Rail Vehicle 2000 (LRV) are constructed and simulated in realistic railway track scenarios. The vehicle-track interaction compatibility analysis consists of two stages: the use of the simplified method described in the norm "UIC 518 - Testing and Approval of Railway Vehicles from the Point of View of their Dynamic Behaviour - Safety - Track Fatigue - Running Behaviour" for decision making; and visual inspection of the vehicle motion with respect to the track, via dedicated tools, for understanding the mechanisms involved.

Relevance: 10.00%

Abstract:

Cryptosporidium was detected in 21 (3.8%) individual stool samples collected from 553 pediatric patients hospitalized in our center, using a Telemann concentration technique (formalin-ether centrifugation) and staining with the modified Kinyoun method. The mean age of the patients with cryptosporidiosis (16 boys and 5 girls) was 11 months: 15 months for girls and 6.5 months for boys. Eighty-one percent of them were younger than 19 months. Seventy-six percent of the patients lived on the outskirts of Buenos Aires and 71% lacked treated running water at home. In 62% of the cases the parasitological diagnosis coincided with the warm seasons. At diagnosis, mucous (63%) or watery (36%) diarrhea was present in 90% of the patients, with a median of 5 (3-8) bowel movements per day. Fever was present in 66% of the patients, while abdominal pain and vomiting occurred in 60% and 52%, respectively. The median time from hospitalization to parasitological diagnosis was 20 days. Concomitant conditions observed were malnutrition, acute leukemia, bronchiolitis, HIV infection, anemia, celiac disease, myelofibrosis, vitelline sac tumor, neutropenia, osteosarcoma and dehydration. Cryptosporidiosis in our environment seems to occur more frequently in children younger than 18 months of age who present with diarrhea, are immunodeficient, come from a low socioeconomic background, and live in poor sanitary conditions with no potable running water.

Relevance: 10.00%

Abstract:

Geothermal energy use has been increasing significantly all over the world, with the United States of America being the largest producer of this energy from the Earth's interior, with about 3,187 MW of installed capacity. Portugal has a total installed capacity of 29 MW; however, "high-enthalpy" use, that is, geothermal exploitation for electricity production, exists only in the Azores archipelago, on the island of S. Miguel, where two geothermal power plants with a total capacity of 23 MW are installed and operating, producing 185 GWh of energy. In mainland Portugal, electricity cannot be produced because of the available temperatures, which restricts use to low-enthalpy applications (76 ºC at most). This use is normally made in cascade, predominantly for heating sanitary water, for space heating and cooling, and for thermal spas using thermomineral waters. To exploit this renewable resource it is necessary to know the hydrogeology of the country and to relate it to fracturing and tectonic structures. Mainland Portugal is divided into four distinct hydrogeological units: the Maciço Antigo (Ancient Massif), the Orla Ocidental (Western Border), the Bacia Tejo-Sado (Tagus-Sado Basin) and the Orla Meridional (Southern Border). Any geothermal development in Portugal must take these characteristics into account, while also fostering new geothermal operations oriented towards people and respecting social, cultural and environmental values. In this context, some geothermal complexes are in operation, others have been abandoned, and many others are under study for application in the near future. A successful example of geothermal heat use is the Chaves complex, which has evolved since 1985 up to the present day and remains in operation and under expansion to better serve the local population. Two boreholes, soon to be joined by a third, supply a swimming pool, a hotel, the thermal baths and balneotherapy. Given the available temperatures and flow rates and the existing energy needs, this complex has an investment payback time of about 7 years, which is generally considered acceptable for public-purpose investments, as is the case here. In the investigations now carried out, it was found that these projects can absorb some hydrogeological uncertainty, given the significant existing demand.