947 results for Picture archiving
Abstract:
Even before the invention of writing, drawing was used to describe reality, evolving over time, gaining quality and detail and relying on increasingly advanced media capable of perpetuating that image: that information. From cave paintings on the walls of Palaeolithic caves, through the hieroglyphs of Egyptian temples, the engravings of ancient scriptures and paintings on canvas, the intent has always been to convey information in the most direct way, perceptible to any individual. Today, new technologies provide access to information with an ease never before seen or imagined, and other ways of recording and perpetuating information for future generations certainly remain to be discovered. Photography lies at the origin of the great evolutions of the image, making it possible to capture the moment and render it "eternal". Nowadays, in the era of the digital image, besides showing reality, it is possible to embed additional information in the image, enriching the viewing experience and maximizing the acquisition of knowledge. Three-dimensional (3D) visualization brought the realism that the original photographic format lacked. 3D allows the viewer to be immersed in the environment that the image itself portrays, to which written or even sensory information, such as sound, can be added. This immersion in a three-dimensional environment lets the user interact with the image itself by navigating and exploring details, using tools such as zoom or links embedded in the image. The internet is where these immersive environments are now made available, making the experience far more accessible to anyone. Until a few years ago, this was only possible with devices built specifically for the purpose, which were therefore available only to restricted groups of users. This dissertation aims to identify the characteristics of an immersive 3D environment and the existing techniques that can be used to maximize the viewing experience. Some applications of these environments and their usefulness in everyday life are presented, anticipating future trends in this area. Examples of tools for composing and producing these environments are presented, and some models illustrating these techniques are built, as a way of assessing the development effort and the result obtained compared with more conventional ways of transmitting and storing information. For a more objective evaluation, the models produced were submitted to the appreciation of several users, from which the final conclusions of this work were drawn regarding the potential of immersive 3D environments and their many applications.
Abstract:
Mimicking the online practices of citizens has been declared an imperative to improve communication and extend participation. This paper seeks to contribute to the understanding of how European discourses praising online video as a communication tool have been translated into actual practices by politicians, governments and organisations. By contrasting official documents with YouTube activity, it is argued that the new opportunities for European political communication are far from being fully embraced, much as in the early years of websites. The main choice has been to use YouTube channels fundamentally for distribution and archiving, thus neglecting their social media features. The disabling of comments by many heads of state and prime ministers - and, in 2010, by the European Commission - indicates such an attitude. The few attempts made to foster citizen engagement, in particular during elections, have had limited success, given low participation numbers and the lack of argument exchange.
Abstract:
The strong competitiveness of national and international markets has led many companies to study methods and techniques for eliminating waste, reducing costs and lead times, and increasing quality and flexibility, with the lean philosophy playing a crucial role in the pursuit of these goals. Since its beginnings, assessing the implementation of the lean philosophy across companies has been a research question in the field of industrial management. Although individual companies may quantify and evaluate the results of applying lean, the great difficulty arises when a comparison by sector or type of economic activity is sought. There are countries where the practice of lean has been a priority and whose companies are at the forefront of this field. In Portugal, however, there is a clear difficulty in determining to what extent Portuguese companies have assimilated this philosophy and what results they have obtained from the practice of lean. This work presents a study based on a survey, conducted through an online questionnaire, of companies operating in Portugal, in order to analyse the current state of lean in Portugal and to anticipate future trends in the evolution of this production-process management methodology. As a result of this study, it was possible to identify the main obstacles to the introduction of lean, the areas where success or lesser impact was observed, and the tools and techniques most used by sector. It is the author's conviction that the study yields a comprehensive picture of the current state of lean implementation, characterizing the areas at the forefront of lean implementation as well as those whose development is still incipient. The present study therefore appears to be of great use to academia as well as to the Portuguese business community.
Abstract:
Final Master's Project for obtaining the degree of Master in Mechanical Engineering
Abstract:
Master's degree in Physiotherapy
Abstract:
Dissertation for obtaining the degree of Master in Informatics and Computer Engineering
Abstract:
The theory and applications of fractional calculus (FC) have made considerable progress in recent years. Dynamical systems and control are among the most active areas, and several authors have focused on the stability of fractional-order systems. Nevertheless, owing to the multitude of efforts in a short period of time, contributions are scattered throughout the literature, and it becomes difficult for researchers to gain a complete and systematic picture of present-day knowledge. This paper attempts to overcome this situation by reviewing the state of the art and putting this topic in a systematic form. While the problem is formulated with rigour from the mathematical point of view, the exposition is intended to be easy to read for applied researchers. Different types of systems are considered, namely linear/nonlinear, positive, with delay, distributed, and continuous/discrete. Several possible routes of future progress that emerge are also tackled.
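To make the flavour of the surveyed results concrete, a classical stability condition for commensurate linear fractional-order systems, due to Matignon, is stated below for illustration (a standard textbook result, not necessarily the formulation adopted in the paper): the system

\[
D^{\alpha} x(t) = A\,x(t), \qquad 0 < \alpha \le 1,
\]

is asymptotically stable if and only if every eigenvalue \(\lambda_i\) of \(A\) satisfies

\[
\lvert \arg(\lambda_i) \rvert > \frac{\alpha \pi}{2}.
\]

For \(\alpha = 1\) this recovers the familiar integer-order condition that all eigenvalues lie in the open left half-plane.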
Abstract:
Objective - To define a checklist that can be used to assess the performance of a department and to evaluate the implementation of quality management (QM) activities across departments or pathways in acute care hospitals. Design - We developed and tested a checklist for the assessment of QM activities at department level in a cross-sectional study using on-site visits by trained external auditors. Setting and Participants - A sample of 292 hospital departments from 74 acute care hospitals across seven European countries. In every hospital, four departments participated, for the conditions acute myocardial infarction (AMI), stroke, hip fracture and deliveries. Main outcome measures - Four measures of QM activities were evaluated at care pathway level, focusing on specialized expertise and responsibility (SER), evidence-based organization of pathways (EBOP), patient safety strategies, and clinical review (CR). Results - Participating departments attained mean values on the various scales between 1.2 and 3.7 (theoretical range 0-4). Three of the four QM measures are identical for the four conditions, whereas one scale (EBOP) has condition-specific items. Correlations showed that every factor was related, but also distinct, and added to the overall picture of QM at pathway level. Conclusion - The newly developed checklist can be used across various types of departments and pathways in acute care hospitals, such as AMI, deliveries, stroke and hip fracture. The anticipated users of the checklist are internal (e.g. peers within the hospital and the hospital executive board) and external auditors (e.g. healthcare inspectorate, professional or patient organizations).
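Purely as an illustrative aside to the abstract above (the scale names SER, EBOP and CR come from the abstract; the items, scores and scoring rule below are invented), the department-level value of each scale could be computed as the mean of its audited items on the 0-4 range:

# Hypothetical scoring sketch: mean item score per QM scale (range 0-4).
# Scale names come from the abstract; the item scores are invented.
from statistics import mean

department_scores = {
    "SER":  [4, 3, 4, 2],      # specialized expertise and responsibility
    "EBOP": [3, 3, 2, 4, 3],   # evidence-based organization (condition-specific items)
    "CR":   [1, 2, 2],         # clinical review
}

for scale, items in department_scores.items():
    print(f"{scale}: mean = {mean(items):.1f} (theoretical range 0-4)")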
Abstract:
Managing the physical and compute infrastructure of a large data center is an embodiment of a Cyber-Physical System (CPS). The physical parameters of the data center (such as power, temperature, pressure, humidity) are tightly coupled with computations, even more so in upcoming data centers, where the location of workloads can vary substantially due, for example, to workloads being moved in a cloud infrastructure hosted in the data center. In this paper, we describe a data collection and distribution architecture that enables gathering physical parameters of a large data center at a very high temporal and spatial resolution of the sensor measurements. We believe this is an important characteristic for enabling more accurate heat-flow models of the data center and, with them, opportunities to optimize energy consumption. Having a high-resolution picture of the data center conditions also makes it possible to minimize local hotspots, to perform more accurate predictive maintenance (pending failures in cooling and other infrastructure equipment can be detected more promptly) and to bill more accurately. We detail this architecture and define the structure of the underlying messaging system used to collect and distribute the data. Finally, we show the results of a preliminary study of a typical data center radio environment.
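The abstract does not spell out the message format; purely as a sketch of what such a collection-and-distribution message might look like (all field names and the topic layout below are assumptions, not the paper's design), a reading could be published on a hierarchical topic:

# Hypothetical sketch of a sensor-reading message for a data center
# collection/distribution system. Topic layout and field names are invented.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class SensorReading:
    sensor_id: str      # e.g. "rack12.inlet.temp0"
    kind: str           # "temperature" | "power" | "pressure" | "humidity"
    value: float
    unit: str
    timestamp: float    # seconds since epoch

def topic_for(reading: SensorReading) -> str:
    # Hierarchical topic: datacenter/<kind>/<sensor_id>, so consumers can
    # subscribe to a whole class of sensors or to a single one.
    return f"datacenter/{reading.kind}/{reading.sensor_id}"

reading = SensorReading("rack12.inlet.temp0", "temperature", 24.6, "C", time.time())
payload = json.dumps(asdict(reading))
print(topic_for(reading), payload)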
Abstract:
Most research work on WSNs has focused on protocols or on specific applications. There is a clear lack of easy/ready-to-use WSN technologies and tools for planning, implementing, testing and commissioning WSN systems in an integrated fashion. While there is a plethora of papers about network planning and deployment methodologies, to the best of our knowledge none of them helps the designer to match coverage requirements with network performance evaluation. In this paper we aim at filling this gap by presenting a unified toolset, i.e., a framework able to provide a global picture of the system, from network deployment planning to system test and validation. This toolset was designed to back up the EMMON WSN system architecture for large-scale, dense, real-time embedded monitoring. It includes tools for network deployment planning, worst-case analysis and dimensioning, protocol simulation, automatic remote programming, and hardware testing. This toolset has been paramount in validating the system architecture through DEMMON1, the first EMMON demonstrator, i.e., a 300+ node test-bed which is, to the best of our knowledge, the largest single-site WSN test-bed in Europe to date.
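As an illustrative sketch of the kind of check a deployment-planning tool performs (the disc coverage model and all positions and parameters here are assumptions, not the EMMON toolset's actual algorithm), one can verify whether a set of candidate node positions covers a set of monitoring points:

# Hypothetical deployment-planning check: does every monitoring point fall
# within sensing range of at least one node? (Simple disc coverage model.)
import math

def covered(point, nodes, sensing_range):
    return any(math.dist(point, n) <= sensing_range for n in nodes)

nodes = [(0.0, 0.0), (20.0, 0.0), (10.0, 15.0)]            # candidate node positions (m)
monitoring_points = [(5.0, 5.0), (18.0, 3.0), (30.0, 30.0)]
RANGE_M = 12.0                                              # assumed sensing radius (m)

for p in monitoring_points:
    status = "covered" if covered(p, nodes, RANGE_M) else "NOT covered"
    print(p, status)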
Resumo:
A gerência da informação em estudos multicêntricos de grande porte requer uma abordagem especializada. O Estudo Longitudinal da Saúde do Adulto (ELSA-Brasil) criou um Centro de Dados para delinear e gerenciar seu sistema de dados. O objetivo do artigo foi descrever os passos envolvidos, incluindo os métodos de entrada, transmissão e gerência de informações. Foi desenvolvido um sistema web que permitiu, de forma segura e confidencial, a entrada online, verificação e edição, bem como incorporação de dados coletados em papel. Além disso, foi implantado e personalizado um sistema de armazenamento e comunicação de imagens (Picture Arquiving and Communication System) para ecocardiografia e retinografia que armazena as imagens recebidas dos Centros de Investigação e as torna acessíveis nos Centros de Leitura. Finalmente, foram desenvolvidos processos de extração e limpeza de dados para criação de bases de dados em formatos que permitam análises em múltiplos pacotes estatísticos.
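As a minimal sketch of the extraction-and-cleaning step described above (the column names, missing-value code and output files are invented; the article does not specify its tooling), cleaned records could be exported to formats readable by several statistical packages:

# Hypothetical extraction/cleaning sketch: normalize raw records and export
# to formats that multiple statistical packages can read. Names are invented.
import pandas as pd

raw = pd.DataFrame({
    "id":  ["001", "002", "003"],
    "sbp": ["120", "999", "135"],   # systolic blood pressure; "999" = missing code
})

clean = raw.copy()
clean["sbp"] = pd.to_numeric(clean["sbp"], errors="coerce")
clean["sbp"] = clean["sbp"].mask(clean["sbp"] == 999)      # recode 999 -> NaN

clean.to_csv("elsa_extract.csv", index=False)              # R, SAS, SPSS read CSV
clean.to_stata("elsa_extract.dta", write_index=False)      # native Stata format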
Abstract:
In video communication systems, the video signals are typically compressed and sent to the decoder through an error-prone transmission channel that may corrupt the compressed signal, causing degradation of the final decoded video quality. In this context, it is possible to enhance the error resilience of typical predictive video coding schemes by drawing on principles and tools from an alternative video coding approach, the so-called Distributed Video Coding (DVC), based on Distributed Source Coding (DSC) theory. Further improvements in the decoded video quality after error-prone transmission may also be obtained by considering the perceptual relevance of the video content, as distortions occurring in different regions of a picture have a different impact on the user's final experience. In this context, this paper proposes a Perceptually Driven Error Protection (PDEP) video coding solution that enhances the error resilience of a state-of-the-art H.264/AVC predictive video codec using DSC principles and perceptual considerations. To increase the H.264/AVC error resilience performance, the main technical novelties brought by the proposed video coding solution are: (i) an improved compressed-domain perceptual classification mechanism; (ii) an improved transcoding tool for the DSC-based protection mechanism; and (iii) the integration of a perceptual classification mechanism into an H.264/AVC compliant codec with a DSC-based error protection mechanism. The performance results obtained show that the proposed PDEP video codec provides a better-performing alternative to traditional error protection video coding schemes, notably Forward Error Correction (FEC)-based schemes.
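To illustrate the general idea of perceptually driven unequal error protection (the relevance scores, thresholds and protection classes below are invented for illustration and are not the paper's classifier), regions judged more perceptually relevant would receive stronger protection:

# Hypothetical sketch of perceptually driven unequal error protection:
# more perceptually relevant regions get stronger protection. The scores,
# thresholds and protection classes are invented for illustration.

def protection_level(relevance: float) -> str:
    """Map a perceptual relevance score in [0, 1] to a protection class."""
    if relevance >= 0.7:
        return "strong"   # e.g. more parity data for salient regions
    if relevance >= 0.4:
        return "medium"
    return "weak"         # background regions tolerate more distortion

# Per-macroblock relevance scores, e.g. from a compressed-domain classifier.
macroblocks = {"mb(0,0)": 0.9, "mb(0,1)": 0.5, "mb(1,0)": 0.2}

for mb, score in macroblocks.items():
    print(mb, "->", protection_level(score))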
Cardiorespiratory physiotherapy in burn patients: an early intervention project
Abstract:
Master's degree in Physiotherapy
Abstract:
In practice, robotic manipulators exhibit some degree of unwanted vibration. The advent of lightweight arm manipulators, mainly in the aerospace industry, where weight is an important issue, leads to the problem of intense vibrations. On the other hand, robots interacting with the environment often generate impacts that propagate through the mechanical structure and also produce vibrations. In order to analyze these phenomena, a robot signal acquisition system was developed. The manipulator motion produces vibrations, either from the structural modes or from end-effector impacts. The instrumentation system acquires signals from several sensors that capture the joint positions, mass accelerations, forces and moments, and electrical currents in the motors. Afterwards, an analysis package, running off-line, reads the data recorded by the acquisition system and extracts the signal characteristics. Due to the multiplicity of sensors, the data obtained can be redundant, because the same type of information may be seen by two or more sensors. Given the price of sensors, this aspect can be exploited to reduce the cost of the system. On the other hand, the placement of the sensors is an important issue in order to obtain signals that suitably capture the vibration phenomenon. Moreover, the study of these issues can help in the design optimization of the acquisition system. In this line of thought, a sensor classification scheme is presented.

Several authors have addressed the subject of sensor classification schemes. White (White, 1987) presents a flexible and comprehensive categorizing scheme that is useful for describing and comparing sensors. The author organizes the sensors according to several aspects: measurands, technological aspects, detection means, conversion phenomena, sensor materials and fields of application. Michahelles and Schiele (Michahelles & Schiele, 2003) systematize the use of sensor technology. They identified several dimensions of sensing that represent the sensing goals for physical interaction. A conceptual framework is introduced that allows categorizing existing sensors and evaluating their utility in various applications. This framework not only guides application designers in choosing meaningful sensor subsets, but can also inspire new systems and lead to the evaluation of existing applications.

Today's technology offers a wide variety of sensors. In order to use all the data from this diversity of sensors, a framework for integration is needed. Sensor fusion, fuzzy logic and neural networks are often mentioned when dealing with the problem of combining information from several sensors to get a more general picture of a given situation. The study of data fusion has been receiving considerable attention (Esteban et al., 2005; Luo & Kay, 1990). A survey of the state of the art in sensor fusion for robotics can be found in (Hackett & Shah, 1990). Henderson and Shilcrat (Henderson & Shilcrat, 1984) introduced the concept of the logic sensor, which defines an abstract specification of the sensors to integrate in a multisensor system. The recent development of micro-electro-mechanical sensors (MEMS) with wireless communication capabilities enables sensor networks with interesting capabilities. This technology has been applied in several domains (Arampatzis & Manesis, 2005), including robotics. Cheekiralla and Engels (Cheekiralla & Engels, 2005) propose a classification of wireless sensor networks according to their functionalities and properties.
This paper presents the development of a sensor classification scheme based on the frequency spectrum of the signals and on statistical metrics. Bearing these ideas in mind, the paper is organized as follows. Section 2 briefly describes the robotic system enhanced with the instrumentation setup. Section 3 presents the experimental results. Finally, section 4 draws the main conclusions and points out future work.
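As a minimal sketch of such a scheme (assuming FFT magnitude spectra compared with a correlation metric; the signals are synthetic and the 0.9 threshold is an assumption, not the paper's criterion), two sensors can be flagged as redundant when their spectra are strongly correlated:

# Minimal sketch: classify two sensor signals as redundant when their
# frequency spectra are strongly correlated. Synthetic data; the 0.9
# threshold is an assumed illustration.
import numpy as np

fs = 1000                                   # sampling rate (Hz)
t = np.arange(0, 1, 1 / fs)
sig_a = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.randn(t.size)
sig_b = 0.8 * np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.randn(t.size)

spec_a = np.abs(np.fft.rfft(sig_a))         # magnitude spectrum of sensor A
spec_b = np.abs(np.fft.rfft(sig_b))         # magnitude spectrum of sensor B

corr = np.corrcoef(spec_a, spec_b)[0, 1]    # statistical similarity metric
print(f"spectral correlation: {corr:.3f}")
print("redundant pair" if corr > 0.9 else "complementary pair")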
Abstract:
Biopsies of cutaneous and mucosal lesions from 40 patients with active paracoccidioidomycosis were studied histopathologically. All cases exhibited chronic granulomatous inflammation and 38 also presented suppuration; this picture corresponded to the mixed mycotic granuloma (MMG). Pseudoepitheliomatous hyperplasia and transepidermal (or epithelial) elimination of the parasite were observed in all cases. In paracoccidioidomycosis, elimination takes place through the formation of progressive edema, accompanied by exocytosis. The edema gives rise to spongiosis, microvesicles and microabscesses which contain not only the fungus but also various cellular elements. The cells in charge of the phagocytic process were essentially Langhans giant cells; PMNs, epithelioid cells and foreign body giant cells were poor phagocytes. An additional finding was the presence of fibrosis in most biopsies.