24 results for "measurement of successive time intervals"
Abstract:
The exponential growth of the world population has led to an increase in settlements located in areas prone to natural disasters, including earthquakes. Consequently, despite important advances in natural catastrophe modelling and in risk mitigation actions, overall human losses have continued to increase and unprecedented economic losses have been registered. In the research work presented herein, various areas of earthquake engineering and seismology are thoroughly investigated, and a case study application for mainland Portugal is performed. Seismic risk assessment is a critical link in the reduction of casualties and damage due to earthquakes. Recognition of this relation has led to a rapid rise in demand for accurate, reliable and flexible numerical tools and software. In the present work, an open-source platform for seismic hazard and risk assessment is developed. This software is capable of computing the distribution of losses or damage for a single earthquake scenario (deterministic event-based) or the earthquake losses due to all possible seismic events that might occur within a region in a given interval of time (probabilistic event-based). This effort has followed an open and transparent philosophy, and the platform is therefore available to any individual or institution. The estimation of seismic risk depends mainly on three components: seismic hazard, exposure and vulnerability. The latter component assumes special importance since, by intervening with appropriate retrofitting solutions, it may be possible to directly decrease the seismic risk. The employment of analytical methodologies is fundamental in the assessment of structural vulnerability, particularly in regions where post-earthquake building damage data might not be available. Several common methodologies are investigated, and conclusions are drawn regarding the method that provides an optimal balance between accuracy and computational effort. In addition, a simplified approach based on the displacement-based earthquake loss assessment (DBELA) method is proposed, which allows the rapid estimation of fragility curves while considering a wide spectrum of uncertainties. A novel vulnerability model for the reinforced concrete building stock in Portugal is also proposed, using statistical information collected from hundreds of real buildings. An analytical approach based on nonlinear time history analysis is adopted, and the impact of a set of key parameters is investigated, including the damage state criteria and the chosen intensity measure type. A comprehensive review of previous studies that contributed to the understanding of the seismic hazard and risk of Portugal is presented. An existing seismic source model was employed together with recently proposed attenuation models to compute probabilistic seismic hazard throughout the territory. These results are combined with information from the 2011 Building Census and the aforementioned vulnerability model to estimate economic loss maps for a return period of 475 years. The losses are disaggregated across the different building typologies, and conclusions are drawn regarding the types of construction most vulnerable to seismic activity.
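For readers less familiar with the 475-year return period mentioned above: under the usual Poissonian assumption it corresponds to roughly a 10% probability of exceedance in 50 years. A minimal Python sketch of that standard relation (given only for context, not part of the thesis itself):

```python
import math

def exceedance_probability(return_period_years: float, exposure_years: float) -> float:
    """Poisson model: probability of at least one exceedance during the exposure time."""
    return 1.0 - math.exp(-exposure_years / return_period_years)

# The 475-year return period used for the loss maps corresponds to
# roughly a 10% probability of exceedance in 50 years.
print(f"{exceedance_probability(475, 50):.3f}")  # ~0.100
```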
Abstract:
The expectations of citizens regarding Information Technologies (ITs) keep increasing, as ITs have become an integral part of our society, serving all kinds of activities, whether professional, leisure, safety-critical or business. Hence, the limitations of traditional network designs in providing innovative and enhanced services and applications motivated a consensus to integrate all services over packet-switching infrastructures, using the Internet Protocol, so as to leverage flexible control and economic benefits in Next Generation Networks (NGNs). However, the Internet is not capable of treating services differently, while each service has its own requirements (e.g., Quality of Service - QoS). Therefore, the need for more evolved forms of communication has driven radical changes in architectural and layering designs, which demand appropriate solutions for service admission and network resource control. This Thesis addresses QoS and network control issues, aiming to improve overall control performance in current and future networks which classify services into classes. The Thesis is divided into three parts. In the first part, we propose two resource over-reservation algorithms: Class-based bandwidth Over-Reservation (COR) and Enhanced COR (ECOR). Over-reservation means reserving more bandwidth than a Class of Service (CoS) needs, so that the QoS reservation signalling rate is reduced. COR and ECOR dynamically define over-reservation parameters for CoSs based on the resource conditions of network interfaces; they aim to reduce QoS signalling and the related overhead without incurring CoS starvation or waste of bandwidth. ECOR differs from COR in that it further optimizes the minimization of control overhead. We then propose a centralized control mechanism called Advanced Centralization Architecture (ACA), which uses a single stateful Control Decision Point (CDP) that maintains an up-to-date view of its underlying network topology and the related link resource statistics in real time in order to control the overall network. It is important to mention that, in this Thesis, we use multicast trees as the basis for session transport, not only for group communication purposes, but mainly to pin the packets of a session to the tree to which the session is mapped. Our simulation results show a drastic reduction of QoS control signalling and the related overhead without QoS violations or waste of resources. In addition, we provide a general-purpose analytical model to assess the impact of the various parameters (e.g., link capacity, session dynamics, etc.) that generally challenge resource over-provisioning control. In the second part of this Thesis, we propose a decentralized control mechanism called Advanced Class-based resource OverpRovisioning (ACOR), which aims to achieve better scalability than the ACA approach. ACOR enables multiple CDPs, distributed at the network edge, to cooperate and exchange appropriate control data (e.g., trees and bandwidth usage information) so that each CDP is able to maintain a good knowledge of the network topology and the related link resource statistics in real time. From a scalability perspective, ACOR cooperation is selective, meaning that control information is exchanged dynamically only among the CDPs which are concerned (correlated). Moreover, synchronization is carried out through our proposed concept of Virtual Over-Provisioned Resource (VOPR), which is a share of the over-reservations of each interface assigned to each tree that uses the interface.
Thus, each CDP can process several session requests over a tree without requiring synchronization with the correlated CDPs, as long as the VOPR of the tree is not exhausted. Analytical and simulation results demonstrate that aggregated over-reservation control in decentralized scenarios keeps signalling low without QoS violations or waste of resources. We also introduce a control signalling protocol, the ACOR Protocol (ACOR-P), to support the centralized and decentralized designs in this Thesis. Further, we propose an Extended ACOR (E-ACOR), which aggregates the VOPRs of all trees that originate at the same CDP, so that more session requests can be processed without synchronization when compared with ACOR. In addition, E-ACOR introduces a mechanism to efficiently track network congestion information, preventing unnecessary synchronization during congestion periods, when VOPRs would be exhausted upon every session request. The performance evaluation, through analytical and simulation results, proves the superiority of E-ACOR in minimizing overall control signalling overhead while keeping all advantages of ACOR, that is, without incurring QoS violations or waste of resources. The last part of this Thesis presents the Survivable ACOR (SACOR) proposal, which supports stable operation of the QoS and network control mechanisms in the case of failures and recoveries (e.g., of links and nodes). The performance results show flexible survivability, characterized by fast convergence times and differentiated traffic re-routing under efficient resource utilization, i.e., without wasting bandwidth. In summary, the QoS and architectural control mechanisms proposed in this Thesis provide efficient and scalable support for key network control sub-systems (e.g., QoS and resource control, traffic engineering, multicasting, etc.), and thus allow the overall network control performance to be optimized.
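To make the over-reservation idea concrete, the following minimal Python sketch admits sessions into a Class of Service on one interface and only triggers new QoS signalling when the residual over-reservation is exhausted. The names, the fixed over-reservation factor and the admission rule are illustrative assumptions, not the COR/ECOR algorithms themselves.

```python
from dataclasses import dataclass

@dataclass
class CosState:
    reserved: float = 0.0   # bandwidth currently over-reserved for this CoS (Mb/s)
    used: float = 0.0       # bandwidth actually consumed by admitted sessions

def admit(cos: CosState, demand: float, link_free: float, factor: float = 1.5):
    """Admit a session into a Class of Service on one interface.

    If the residual over-reservation covers the demand, no new QoS signalling
    is needed; otherwise reserve `factor * demand` of extra bandwidth (bounded
    by the free link capacity), which is where the signalling reduction
    described in the abstract comes from.
    """
    if cos.used + demand <= cos.reserved:
        cos.used += demand
        return True, 0.0                      # admitted without signalling
    extra = min(factor * demand, link_free)
    if cos.used + demand <= cos.reserved + extra:
        cos.reserved += extra
        cos.used += demand
        return True, extra                    # admitted, one signalling event
    return False, 0.0                         # rejected: would starve other CoSs
```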
Abstract:
In recent years we have been witnessing a change in the way information is made available online. The emergence of a web open to everyone made it easy to edit, publish and share information, generating a considerable increase in its volume. Systems for collecting and sharing that information quickly appeared which, beyond allowing the collection of resources, also allow users to describe them using tags or comments. The automatic organization of that information is one of the greatest challenges of the current web. Although several clustering algorithms exist, the trade-off between effectiveness (forming groups that make sense) and efficiency (running in acceptable time) is hard to achieve. Accordingly, this research investigates whether an automatic document clustering system improves its effectiveness when a social classification system is integrated. We analysed and discussed two methods, based on the k-means algorithm for document clustering, that allow the integration of social tagging into that process. The first integrates the tags directly into the Vector Space Model, and the second proposes using the tags to select the initial seeds. The first method allows the tags to be weighted as a function of their occurrence in the document through the Social Slider parameter. This method was built on a prediction model which suggests that, when cosine similarity is used, documents that share tags become closer, while documents that do not become more distant. The second method gave rise to an algorithm we call k-C, which, in addition to allowing the initial seed selection through a tag network, also changes the way the new centroids are computed in each iteration. The change to the centroid computation took into account a reflection on the use of Euclidean distance and cosine similarity in the k-means clustering algorithm. For the evaluation of the algorithms, two further algorithms were proposed: the "automatic ground truth" algorithm and the MCI algorithm. The first allows detecting the structure of the data when it is unknown, and the second is an internal evaluation measure based on the cosine similarity between each document and its nearest document. The analysis of preliminary results suggests that the first method of integrating tags into the VSM has more impact on the k-means algorithm than on the k-C algorithm. Moreover, the results obtained show no correlation between the choice of the SS parameter and cluster quality. Accordingly, the remaining tests were conducted using only the k-C algorithm (without tag integration in the VSM), and the results obtained indicate that this algorithm tends to generate more effective clusters.
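To make the first integration method concrete, here is a minimal Python sketch of a tag-weighted Vector Space Model entry and the cosine similarity used to compare documents. The weighting formula and names (e.g., social_slider) are illustrative assumptions, not the exact scheme defined in the thesis.

```python
import numpy as np

def document_vector(term_counts: dict, tag_counts: dict, vocabulary: list,
                    social_slider: float) -> np.ndarray:
    """Build a VSM vector in which social tags are added to the term weights,
    scaled by a slider in [0, 1] (0 ignores tags, 1 gives them full weight).
    Illustrative only; the thesis defines its own weighting via the Social Slider."""
    return np.array([term_counts.get(t, 0.0) + social_slider * tag_counts.get(t, 0.0)
                     for t in vocabulary], dtype=float)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity; documents sharing tags move closer as the slider grows."""
    norm = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / norm) if norm else 0.0
```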
Abstract:
Wireless communication technologies have become widely adopted, appearing in heterogeneous applications ranging from tracking victims, responders and equipment in disaster scenarios to machine health monitoring in networked manufacturing systems. Very often, applications demand a strictly bounded timing response, which, in distributed systems, is generally highly dependent on the performance of the underlying communication technology. These systems are said to have real-time timeliness requirements, since data communication must be conducted within predefined temporal bounds, whose violation may compromise the correct behaviour of the system and cause economic losses or endanger human lives. The potential adoption of wireless technologies for an increasingly broad range of application scenarios has made the operational requirements more complex and heterogeneous than those previously posed to wired technologies. In parallel with this trend, there is an increasing demand for cost-effective distributed systems with improved deployment, maintenance and adaptation features. These systems tend to require operational flexibility, which can only be ensured if the underlying communication technology provides both time-triggered and event-triggered data transmission services while supporting on-line, on-the-fly parameter modification. Generally, wireless-enabled applications have deployment requirements that can only be addressed through the use of batteries and/or energy harvesting mechanisms for power supply. These applications usually have stringent autonomy requirements and demand a small form factor, which hinders the use of large batteries. As the communication support may represent a significant part of the energy requirements of a station, the use of power-hungry technologies is not adequate. Hence, in such applications, low-power, short-range technologies have been widely adopted. In fact, although these technologies provide lower data rates, they spend just a fraction of the energy of their higher-power counterparts. The timeliness requirements of data communications can, in general, be met by ensuring the availability of the medium for any station initiating a transmission. In controlled (closed) environments this can be guaranteed, as there is strict regulation of which stations are installed in the area and for which purpose. Nevertheless, in open environments this is hard to control, because no a priori knowledge is available of which stations and technologies may contend for the medium at any given instant. Hence, the support of wireless real-time communications in unmanaged scenarios is a highly challenging task. Wireless low-power technologies have been the focus of a large research effort, for example in the Wireless Sensor Network domain. Although bringing extended autonomy to battery-powered stations, such technologies are known to be negatively influenced by similar technologies contending for the medium and, especially, by technologies using higher-power transmissions over the same frequency bands. A frequency band that is becoming increasingly crowded with competing technologies is the 2.4 GHz Industrial, Scientific and Medical band, encompassing, for example, Bluetooth and ZigBee, two low-power communication standards which are the basis of several real-time protocols.
Although these technologies employ mechanisms to improve their coexistence, they are still vulnerable to transmissions from uncoordinated stations using similar technologies, or to higher-power technologies such as Wi-Fi, which hinders the support of dependable wireless real-time communications in open environments. The Wireless Flexible Time-Triggered Protocol (WFTT) is a master/multi-slave protocol that builds on the flexibility and timeliness provided by the FTT paradigm and on the deterministic medium capture and maintenance provided by the bandjacking technique. This dissertation presents the WFTT protocol and argues that it allows supporting wireless real-time communication services with high dependability requirements in open environments where multiple contention-based technologies may dispute medium access. Moreover, it claims that it is feasible to provide both flexible and timely wireless communications in open environments. The WFTT protocol was inspired by the FTT paradigm, from which higher-layer services such as admission control have been ported. After realizing that bandjacking is an effective technique for ensuring medium access and maintenance in open environments crowded with contention-based communication technologies, it was recognized that the mechanism could be used to devise a wireless medium access protocol that brings the features offered by the FTT paradigm to the wireless domain. The performance of the WFTT protocol is reported in this dissertation, together with a description of the implemented devices, the test-bed and a discussion of the obtained results.
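As a rough illustration of the master/multi-slave, time-triggered structure that WFTT inherits from the FTT paradigm, the Python sketch below has the master build a per-cycle schedule and each slave transmit only in its assigned slot. The message format, slot-assignment policy and names are assumptions made for illustration, not the WFTT specification.

```python
import time
from dataclasses import dataclass

@dataclass
class TriggerMessage:
    cycle_id: int
    schedule: dict   # station id -> (slot offset in s, slot length in s)

def master_cycle(cycle_id: int, requests: dict, cycle_len: float) -> TriggerMessage:
    """Build the schedule for one elementary cycle: assign each requesting
    station a slot, in request order, until the cycle is full.
    (Generic FTT-style sketch; not the actual WFTT scheduler.)"""
    schedule, offset = {}, 0.0
    for station, slot_len in requests.items():
        if offset + slot_len > cycle_len:
            break                      # defer remaining requests to a later cycle
        schedule[station] = (offset, slot_len)
        offset += slot_len
    return TriggerMessage(cycle_id, schedule)

def slave_behaviour(tm: TriggerMessage, my_id: int, cycle_start: float) -> None:
    """A slave waits for its slot offset and then transmits within its slot length."""
    if my_id in tm.schedule:
        offset, slot_len = tm.schedule[my_id]
        time.sleep(max(0.0, cycle_start + offset - time.time()))
        # transmit(payload) would go here, bounded by slot_len
```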
Abstract:
Interest in using teams of mobile robots has been growing, due to their potential to cooperate for diverse purposes, such as rescue, de-mining, surveillance or even games such as robotic soccer. These applications require a real-time middleware and a wireless communication protocol that can support an efficient and timely fusion of the perception data from different robots, as well as the development of coordinated behaviours. Coordinating several autonomous robots towards a common goal is currently a topic of high interest, found in many application domains. Despite these different application domains, the technical problem of building an infrastructure to support the integration of distributed perception and subsequent coordinated action is similar. This problem becomes harder with stronger system dynamics, e.g., when the robots move faster or interact with fast objects, leading to tighter real-time constraints. This thesis addresses computing architectures and wireless communication protocols that support efficient information sharing and coordination strategies, taking into account the real-time nature of robot activities. The thesis makes two main claims. Firstly, we claim that, despite the use of a wireless communication protocol that includes arbitration mechanisms, self-organizing the team communications in a dynamic round that also accounts for variable team membership effectively reduces collisions within the team, independently of its current composition, significantly improving the quality of the communications. We validate this claim in terms of packet losses and communication latency, and we show how such self-organization of the communications can be achieved in an efficient way with the Reconfigurable and Adaptive TDMA protocol. Secondly, we claim that the development of distributed perception, cooperation and coordinated action for teams of mobile robots can be simplified by using a shared-memory middleware that replicates, in each cooperating robot, all the necessary remote data: the Real-Time Database (RTDB) middleware. These remote data copies, which are updated in the background by the self-organizing communications protocol, are extended with age information automatically computed by the middleware and are locally accessible through fast primitives. We validate this claim by showing a parsimonious use of the communication medium, improved timing information with respect to the shared data, and the simplicity of use and effectiveness of the proposed middleware in several use cases, reinforced by a reasonable impact in the Middle Size League of RoboCup.
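The following minimal Python sketch illustrates the kind of shared-memory record the RTDB idea describes: remote items are refreshed in the background by the team communication protocol, and each local read returns the value together with its age. The class and method names are illustrative assumptions, not the actual RTDB API.

```python
import time
from dataclasses import dataclass

@dataclass
class RtdbItem:
    value: object = None
    updated_at: float = 0.0   # set by the communications layer on each refresh

    def age(self) -> float:
        """Age of the local replica, analogous to the age information the RTDB exposes."""
        return time.time() - self.updated_at

class LocalRtdb:
    """Illustrative local replica store: remote items are refreshed in the
    background by the self-organizing communication protocol; reads are
    local and fast."""
    def __init__(self) -> None:
        self._items: dict[str, RtdbItem] = {}

    def put(self, key: str, value: object) -> None:
        """Called by the background communications task when fresh data arrives."""
        self._items[key] = RtdbItem(value, time.time())

    def get(self, key: str) -> tuple[object, float]:
        """Local read returning the replicated value and how old it is."""
        item = self._items[key]
        return item.value, item.age()
```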
Abstract:
This work is about the combination of functional ferroelectric oxides with multiwall carbon nanotubes for microelectronic applications, such as potential three-dimensional (3D) Non-Volatile Ferroelectric Random Access Memories (NVFeRAM). Miniaturized electronics are now ubiquitous. The drive to downsize electronics has been spurred by the need for more performance in smaller packages at lower cost, but the trend towards miniaturization challenges board assembly materials, processes and reliability. Semiconductor device and integrated circuit technology, coupled with its associated electronic packaging, forms the backbone of high-performance miniaturized electronic systems. However, as size decreases and functionality increases in modern electronics, further size reduction becomes difficult; below a certain size limit, signal reliability and device performance deteriorate. Hence, the miniaturization of silicon-based electronics has limitations. Against this background, the International Technology Roadmap for Semiconductors (ITRS) has, since 2011, suggested alternative technologies, designated as More than Moore, one of them being based on carbon (carbon nanotubes (CNTs) and graphene) [1]. CNTs, with their unique performance and three-dimensionality at the nanoscale, have been regarded as promising elements for miniaturized electronics [2]. CNTs are tubular in geometry and possess a unique set of properties, including ballistic electron transport and a huge current-carrying capacity, which make them of great interest for future microelectronics [2]. Indeed, CNTs might have a key role in the miniaturization of Non-Volatile Ferroelectric Random Access Memories (NVFeRAM). Moving from a traditional two-dimensional (2D) design (as is the case of thin films) to a 3D structure (based on a tridimensional arrangement of unidimensional structures) will result in higher reliability and better signal sensing due to the large contribution from the bottom electrode. One way to achieve this 3D design is by using CNTs. Ferroelectrics (FE) are spontaneously polarized and can have high dielectric constants and interesting pyroelectric, piezoelectric and electro-optic properties, a key application of FE being electronic memories. However, combining CNTs with FE functional oxides is challenging. It starts with materials compatibility, since the crystallization temperature of the FE and the oxidation temperature of the CNTs may overlap; in this case, low-temperature processing of the FE is fundamental. Within this context, a systematic study on the fabrication of CNT-FE structures using low-cost, low-temperature methods was carried out in this work. The FE under study comprise lead zirconate titanate (Pb(ZrxTi1-x)O3, PZT), barium titanate (BaTiO3, BT) and bismuth ferrite (BiFeO3, BFO). The various aspects related to the fabrication, such as the effect on the thermal stability of the MWCNTs, FE phase formation in the presence of MWCNTs, and the CNT/FE interfaces, are addressed in this work. The ferroelectric response, locally measured by Piezoresponse Force Microscopy (PFM), clearly evidenced that, even at low processing temperatures, the FE on CNTs retains its ferroelectric nature. The work started by verifying the thermal decomposition behaviour of the multiwall CNTs (MWCNTs) used in this work under different conditions.
It was verified that the purified MWCNTs are stable up to 420 ºC in air, as no weight loss occurs under non-isothermal conditions, although morphology changes were observed under isothermal conditions at 400 ºC by Raman spectroscopy and Transmission Electron Microscopy (TEM). In an oxygen-rich atmosphere the MWCNTs started to oxidize at 200 ºC, whereas in an argon-rich atmosphere and under a high heating rate they remained stable up to 1300 ºC with minimal sublimation. The activation energy for the decomposition of MWCNTs in air was calculated to lie between 80 and 108 kJ/mol. These results are relevant for the fabrication of MWCNT-FE structures. Indeed, we demonstrate that PZT can be deposited by sol-gel at low temperatures on MWCNTs. Particularly interesting, we prove that MWCNTs decrease the temperature and time for the formation of PZT by ~100 ºC, commensurate with a decrease in activation energy from 68±15 kJ/mol to 27±2 kJ/mol. As a consequence, monophasic PZT was obtained at 575 ºC for MWCNTs-PZT, whereas for pure PZT, in which the PZT phase forms by homogeneous nucleation, traces of pyrochlore were still present at 650 ºC. The piezoelectric nature of MWCNTs-PZT synthesized at 500 ºC for 1 h was proved by PFM. In the continuation of this work, we developed a low-cost methodology for coating MWCNTs using a hybrid sol-gel/hydrothermal method. In this case, the FE used as a proof of concept was BT, a well-known lead-free perovskite used in many microelectronic applications. However, its synthesis by solid-state reaction is typically performed at around 1100 to 1300 ºC, which jeopardizes the combination with MWCNTs. We also illustrate the ineffectiveness of conventional hydrothermal synthesis in this process due to the formation of carbonates, namely BaCO3. The grown MWCNT-BT structures are ferroelectric and exhibit an electromechanical response (15 pm/V). These results have broad implications, since this strategy can also be extended to other compounds with high crystallization temperatures. In addition, the coverage of MWCNTs with FE can be optimized, in this case with non-covalent functionalization of the tubes, namely with sodium dodecyl sulfate (SDS). MWCNTs were also used as templates to grow single-phase multiferroic BFO nanorods. This work shows that the use of a nitric solvent results in severe damage to the MWCNT layers, which leads to the early oxidation of the tubes during the annealing treatment. It was also observed that the use of the nitric solvent results in the partial filling of the MWCNTs with BFO, due to the low surface tension (<119 mN/m) of the nitric solution. The opening of the caps and the filling of the tubes occur simultaneously during the refluxing step. Furthermore, we verified that the MWCNTs have a critical role in the fabrication of monophasic BFO; i.e., the oxidation of the CNTs during the annealing process creates an oxygen-deficient atmosphere that restrains the formation of Bi2O3, so that monophasic BFO can be obtained. The morphology of the obtained BFO nanostructures indicates that the MWCNTs act as templates for the growth of 1D BFO structures. Magnetic measurements on these BFO nanostructures revealed a weak ferromagnetic hysteresis loop with a coercive field of 956 Oe at 5 K. We also exploited the possible use of vertically aligned multiwall carbon nanotubes (VA-MWCNTs) as bottom electrodes for microelectronics, for example for memory applications. As a proof of concept, BiFeO3 (BFO) films were deposited in situ on the surface of the VA-MWCNTs by RF (Radio Frequency) magnetron sputtering.
For an in situ deposition temperature of 400 ºC and deposition times up to 2 h, the BFO films cover the VA-MWCNTs and no damage occurs either in the film or in the MWCNTs. In spite of the macroscopically lossy polarization behaviour, the ferroelectric nature, domain structure and switching of these conformal BFO films were verified by PFM. A weak ferromagnetic hysteresis loop was also proved for the BFO films on VA-MWCNTs, with a coercive field of 700 Oe. Our systematic work is a significant step forward in the development of 3D memory cells; it clearly demonstrates that CNTs can be combined with FE oxides and can be used, for example, in the next 3D generation of FeRAMs, without excluding other applications in microelectronics.
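For context, activation energies such as those reported above (80-108 kJ/mol for MWCNT decomposition, and 68±15 kJ/mol dropping to 27±2 kJ/mol for PZT formation) are commonly extracted from thermal-analysis data recorded at several heating rates using a Kissinger-type relation. The abstract does not state which method was actually used, so the following is only a generic illustration:

```latex
% Kissinger relation: a plot of ln(beta / T_p^2) versus 1/T_p is linear
% with slope -E_a / R, which yields the apparent activation energy E_a.
\ln\!\left(\frac{\beta}{T_p^{2}}\right)
  = -\frac{E_a}{R\,T_p} + \ln\!\left(\frac{A\,R}{E_a}\right)
```

Here, beta is the heating rate, T_p the peak decomposition (or crystallization) temperature, E_a the apparent activation energy, R the gas constant and A the pre-exponential factor.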
Abstract:
Faced with the challenges of the contemporary world, marked by complexity, by an accelerated pace of change and by uncertainty, the development of students as scientifically literate citizens is a critical factor. In this context, Geosciences education in general, and the understanding of geological time in particular, can contribute to deepening scientific literacy and the responsibility of citizens as promoters of sustainable development, within a Science/Technology/Society framework. Thus, this study pursues two major aims: a) to contribute to the development of a theoretical and conceptual framework, in the scope of education in general and of the Geosciences in particular, aimed at developing students as scientifically literate citizens from a sustainability standpoint; b) to design, implement and evaluate strategies, grounded in the reference corpus, in the scope of Geology teaching in secondary education. The study corresponds to action research carried out by a Biology and Geology teacher with 11th-grade students and organized into three phases: i) development of a theoretical and conceptual reference framework; ii) design, development and evaluation of a didactic intervention; iii) elaboration of a well-founded methodological proposal for deepening a scientific culture that promotes sustainability on Earth through the teaching of geological time. The results of the study point to the importance of working on the concepts of geological time and sustainable development in an articulated way when educating scientifically literate citizens, capable of responding to the complex questions of today's society. Moreover, the analysis of the didactic intervention confirms the importance of activities outside the classroom and of practical work for the development of citizenship in secondary school students, and as facilitators of the understanding of complex concepts such as geological time.
Abstract:
This work focused on the study of the impact of environmental, installation and usage conditions on the degradation of optical fibres, which frequently results in reduced fibre performance. Among these factors, the effects of environments that are aggressive to the fibre coating were studied, namely on fibre lifetime and strength. The effect of propagating high-power optical signals through tight bends, and its influence on the degradation of fibre performance, was also studied. In this scope, the performance of bend-insensitive fibres and Erbium-doped fibres was also investigated, including an analysis of the dynamics of the fibre fuse effect in these fibres. As an integral part of optical networks, optical connectors are of extreme importance to their structure. Their performance is reflected in the quality of service of the network, and it is therefore crucial to study the factors that contribute to their degradation and malfunction. Thus, this work presents a study of the behaviour of optical connectors under mishandling conditions (such as insufficient cleaning and physical degradation of the end face). In addition, emphasis was also given to the reuse of fibre damaged by the fibre fuse effect in the development of sensors that can be used to monitor refractive index, hydrostatic pressure, strain or high temperature. This procedure emerges as a low-cost solution for developing optical fibre sensors from fibre that is damaged and unusable for its usual applications in the transmission and/or reflection of optical signals.
Abstract:
As supervision, associated with logics of interaction with the teaching activity and with other actors in educational contexts, acquired a reflexive dimension and came to be understood as an instrument for transforming human development and the quality of the teaching and learning process in the inclusive school organization, it has been attracting the interest of numerous researchers. Considering that the intention of offering a quality school to each and every student (one of the fundamental principles of inclusive education) has not been fully achieved, a new personal and institutional attitude is required: a systemic understanding (involving professionals, students, parents and the community) of the responses to be offered to students, capable of meeting the needs and specificities of each one and optimizing their opportunities for learning and development. In fact, intervention in the complexity of the different problem situations emerging in a school organization that aims to be inclusive can be greatly enhanced if the ongoing educational processes are supervised. The specificity of the Special Education system calls for a coordination and supervision structure; a structure capable of providing resources and generating dynamics of mediation for intervention, as well as activating mechanisms for the evaluation of processes and products, making them consequential at the level of practices and aiming at higher levels of functioning. With the main objective of building knowledge about the profile of professional competences of the Special Education Coordinator, with particular emphasis on the supervisory dimension inherent to that role, the present study was based on a mixed-methods approach for the collection and treatment of quantitative and qualitative data. In a first phase, it included a questionnaire survey of three groups of key informants: 105 Special Education teachers, 47 Special Education coordinators and 37 directors of school clusters/non-clustered schools of the public education system in the area of influence of the Direção de Serviços da Região Centro, Direção-Geral dos Estabelecimentos Escolares. The second phase, which sought insights that could clarify and deepen the data collected through the questionnaire surveys, comprised interviews with 10 specialists in Special Education and/or Supervision. The data point to dissimilar organizational/supervisory frameworks in Special Education, while a satisfactory leading role of the Special Education Coordinator can nevertheless be identified in common, based on activities of dialogue and reflection and respecting principles of collaboration and solidarity. There are, however, indicators that this coordinator's practice is largely concentrated on the bureaucratic and administrative management of the department/team, which may be explained by the fact that the Special Education Coordinator acts more as an intermediary than as an active participant between the management bodies, namely the director and the pedagogical council, and the teachers/professionals of the department/team. The data also show the lack of training of Special Education coordinators in supervision, and highlight the importance of the time factor for the effective exercise of this role, so as to promote rich and stimulating interactions centred on reflection about inclusive practices.
Since this coordinator can contribute significantly to energizing and stimulating the professionals of the school cluster/school, supporting them in their efforts and initiatives towards a more inclusive organization, some aspects considered decisive in their profile of professional competences are identified: experience, knowledge, abilities, values and personality traits.