94 results for Bluetooth Data Noise
Abstract:
Objectives: The purpose of this article is to identify differences between surveys conducted with paper and online questionnaires. The author has extensive experience with questions concerning opinions in the development of survey-based research, e.g. the limits of postal and online questionnaires. Methods: Paper and online questionnaires were used in the physician studies carried out in 1995 (doctors graduated in 1982-1991), 2000 (doctors graduated in 1982-1996), 2005 (doctors graduated in 1982-2001) and 2011 (doctors graduated in 1977-2006), and in a study of 457 family doctors in 2000. The response rates were 64%, 68%, 64%, 49% and 73%, respectively. Results: The physician studies showed differences between the methods, connected with the use of a paper-based versus an online questionnaire and with the response rate. The online survey gave a lower response rate than the postal survey. The major advantages of the online survey were the short response time, the very low financial resources needed, and the fact that the data were loaded directly into the data analysis software, saving the time and resources associated with manual data entry. Conclusions: This article helps researchers plan their study design and choose the right data collection method.
Abstract:
This paper presents the SmartClean tool, whose purpose is to detect and correct data quality problems (DQPs). Compared with existing tools, SmartClean has one main advantage: the user does not need to specify the execution sequence of the data cleaning operations. To that end, an execution sequence was developed; the problems are manipulated (i.e., detected and corrected) following that sequence, which also supports the incremental execution of the operations. In this paper, the underlying architecture of the tool is presented and its components are described in detail. The validity of the tool, and consequently of the architecture, is demonstrated through a case study. Although SmartClean has cleaning capabilities at all other levels, this paper describes only those related to the attribute value level.
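The idea of a fixed execution sequence of cleaning operations can be sketched as follows. This is a minimal illustration, not the actual SmartClean implementation; the operation names and the particular sequence are invented for the example:

```python
# Illustrative sketch of a fixed cleaning sequence: operations run in a
# predefined order, so the user never has to specify it. All names here
# are hypothetical, not from the SmartClean tool itself.

def strip_whitespace(rows):
    """Attribute-value-level fix: trim stray whitespace in string fields."""
    return [{k: v.strip() if isinstance(v, str) else v for k, v in r.items()}
            for r in rows]

def drop_missing_values(rows):
    """Remove tuples with empty or null attribute values."""
    return [r for r in rows if all(v not in ("", None) for v in r.values())]

def remove_duplicates(rows):
    """Tuple-level fix: keep only the first occurrence of each tuple."""
    seen, out = set(), []
    for r in rows:
        key = tuple(sorted(r.items()))
        if key not in seen:
            seen.add(key)
            out.append(r)
    return out

# The fixed sequence: attribute-level fixes first, then tuple-level ones,
# so duplicate detection sees already-normalized values.
CLEANING_SEQUENCE = [strip_whitespace, drop_missing_values, remove_duplicates]

def clean(rows):
    for operation in CLEANING_SEQUENCE:
        rows = operation(rows)
    return rows
```

Ordering matters in this sketch: trimming whitespace before deduplication lets `" Ana "` and `"Ana"` be recognized as the same value.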
Abstract:
The emergence of new business models, namely the establishment of partnerships between organizations and the opportunity companies have to combine existing web data, especially from the semantic web, with their own information, has highlighted several problems in databases, particularly ones related to data quality. Poor data can result in a loss of competitiveness for the organizations holding them, and may even lead to their disappearance, since many of their decision-making processes are based on these data. For this reason, data cleaning is essential. Current approaches to these problems are closely tied to database schemas and specific domains. For data cleaning to be usable across different repositories, computer systems must be able to understand the data, i.e., an associated semantics is needed. The solution presented in this paper uses ontologies: (i) for the specification of data cleaning operations and (ii) as a way of solving the semantic heterogeneity problems of data stored in different sources. With data cleaning operations defined at a conceptual level, and with mappings between domain ontologies and an ontology derived from a database, the operations can be instantiated and proposed to the expert/specialist for execution over that database, thus enabling their interoperability.
Abstract:
Introduction: Myocardial Perfusion Imaging (MPI) is a very important tool in the assessment of Coronary Artery Disease (CAD) patients, and worldwide data demonstrate its increasingly wide use and clinical acceptance. Nevertheless, it is a complex process and is quite vulnerable to a number of possible artefacts, some of which seriously affect the overall quality and clinical utility of the obtained data. One of the most inconvenient, yet relatively frequent (20% of cases), artefacts is related to patient motion during image acquisition. In those situations, the specific data are usually evaluated and a decision is made to either (A) accept the results as they are, considering that the "noise" so introduced does not affect the final clinical information too seriously, or (B) repeat the acquisition process. Another possibility is to use the motion correction software included in the software package of any current gamma camera. The aim of this study is to compare the quality of the final images obtained after the application of motion correction software with that of images obtained by repeating the acquisition. Material and Methods: Thirty cases of MPI affected by motion artefacts, and subsequently repeated, were used. A group of three independent expert Nuclear Medicine clinicians, blinded to the origin of the images, was invited to evaluate the 30 sets of three images, one set per patient: (A) the original image, motion uncorrected; (B) the original image, motion corrected; and (C) the second acquisition, without motion. The results were statistically analysed.
Results and Conclusion: The results demonstrate that the motion correction software is useful essentially when the amplitude of movement is not too large (a precise quantification proved hard to define, due to discrepancies between clinicians and other factors, namely differences from one camera brand to another). When the amplitude of movement is too large, the percentage of agreement between clinicians is much higher and repetition of the examination is unanimously considered indispensable.
Abstract:
Master's degree in Electrical Engineering – Electrical Power Systems
Abstract:
The evaluation of organizations and the measurement of the performance achieved by management have been a constant concern of managers and shareholders, although with different objectives. Today the question arises with greater acuity, both because of increased competitiveness and because of the current size and complexity of companies. With this work we intend to describe the DEA (Data Envelopment Analysis) methodology in its simplest initial formulations. DEA seeks to obtain a single, simple measure of efficiency by combining a set of outputs and inputs relative to the different homogeneous units under evaluation. DEA is a non-parametric method which, by its characteristics, is particularly suited to the evaluation of homogeneous units that are not necessarily profit-oriented. We conclude, in general, that the information obtained through DEA is useful and constitutes an important advance, but that other methods, namely ratios and regression analyses, can make an important complementary contribution to that analysis.
Abstract:
The main purpose of this thesis is to create a data interface between a source of tourist route information and an interactive mobile system for navigating and visualizing those data. The technological format will be portable and mobility-oriented (PDA), and should be practical, intuitive and multi-faceted, offering good usability to users of various age groups. There will be an AI (Artificial Intelligence) component, which will use the supplied information to make weighted decisions taking a diversity of aspects into account. The system to be developed should thus be able to deal with contingencies (route changes, schedule management, cancellation of visiting points, new visiting points) and, finally, should help tourists manage their time between Points of Interest (POIs). It should also allow a given predefined route to be followed or not, with the possibility of POI exploration scenarios suggested in loco, similar to places included in the route and matching the users' profiles. The geographic test area of this project will be the riverside area of Porto, as it is an ex-libris of the city and, simultaneously, an area with many challenges both geographically (given its slopes) and in terms of the large number of events and places to visit.
Abstract:
Knowledge discovery in data is nowadays a strong asset for companies. CardMobili currently has no data mining system, and such a system would be an asset to its daily marketing operations, namely in issuing coupons to a restricted group of customers with a high probability of using them. To this end, the application's database was analysed in an attempt to extract as much data as possible, and the necessary transformations were applied so that the data could later be processed by data mining algorithms. During the data mining stage, association and classification techniques were applied, with the best results obtained by the association techniques. The results obtained are thus intended to assist the decision maker in making decisions.
Abstract:
The tongue is the most important and dynamic articulator in speech formation, because of its anatomical aspects (particularly the large volume of this muscular organ compared to the surrounding organs of the vocal tract) and also due to the wide range of movements and the flexibility involved. In speech communication research, a variety of techniques have been used for measuring three-dimensional vocal tract shapes. More recently, magnetic resonance imaging (MRI) has become common, mainly because this technique allows the collection of sets of static and dynamic images that can represent the entire vocal tract along any orientation. Over the years, different anatomical organs of the vocal tract have been modelled, namely with 2D and 3D tongue models, using parametric or statistical modelling procedures. Our aim is to present and describe some 3D models reconstructed from MRI data, for one subject uttering sustained articulations of some typical Portuguese sounds. Thus, we present a 3D database of the tongue, obtained by stack combination, with the subject articulating Portuguese vowels. This 3D knowledge of the speech organs could be very important, especially for clinical purposes (for example, the assessment of articulatory impairments followed by tongue surgery in speech rehabilitation), and also for a better understanding of the acoustic theory of speech formation.
Abstract:
Seismic data are difficult to analyze, and classical mathematical tools reveal strong limitations in exposing hidden relationships between earthquakes. In this paper, we study earthquake phenomena from the perspective of complex systems. Global seismic data covering the period from 1962 up to 2011 are analyzed. The events, characterized by their magnitude, geographic location and time of occurrence, are divided into groups, either according to the Flinn-Engdahl (F-E) seismic regions of the Earth or using a rectangular grid based on latitude and longitude coordinates. Two methods of analysis are considered and compared. In the first method, the distributions of magnitudes are approximated by Gutenberg-Richter (G-R) distributions and the fitted parameters are used to reveal the relationships among regions. In the second method, the mutual information is calculated and adopted as a measure of similarity between regions. In both cases, clustering analysis is used to generate visualization maps that provide an intuitive and useful representation of the complex relationships present in the seismic data. Such relationships might not be perceived on classical geographic maps; the generated charts are therefore a valid alternative to other visualization tools for understanding the global behavior of earthquakes.
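As a minimal illustration of the first method, the b-value of a Gutenberg-Richter distribution can be estimated from a catalogue of magnitudes with the classical maximum-likelihood (Aki) estimator. The catalogue below is invented for the example and is not from the paper's data:

```python
import math

def gr_b_value(magnitudes, m_min):
    """Maximum-likelihood (Aki) estimate of the Gutenberg-Richter b-value
    for events at or above the completeness magnitude m_min:
    b = log10(e) / (mean(M) - m_min)."""
    sample = [m for m in magnitudes if m >= m_min]
    mean_m = sum(sample) / len(sample)
    return math.log10(math.e) / (mean_m - m_min)

# Hypothetical magnitude catalogue (for illustration only).
catalogue = [4.1, 4.3, 4.0, 5.2, 4.6, 4.4, 6.1, 4.2, 4.8, 4.5]
b = gr_b_value(catalogue, m_min=4.0)
```

Comparing such fitted b-values across regions is one simple way the G-R parameters could serve as features for the clustering step described above.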
Abstract:
Ototoxic substances have been associated with damage to the auditory system, and their effects are potentiated by noise exposure. The present study aims to analyse auditory changes resulting from combined exposure to noise and organic solvents, through a pilot study in the furniture industry sector. Audiological tests were performed on 44 workers, their levels of exposure to toluene, xylene and ethylbenzene were determined, and their levels of noise exposure were evaluated. The results showed that the workers are generally exposed to high noise levels, and that workers in the cabin priming, filler and varnish sector have high levels of exposure to toluene. However, no hearing loss was registered among the workers. Workers exposed simultaneously to noise and ototoxic substances do not have a higher degree of hearing loss than workers exposed only to noise. Thus, the results of this study did not show that combined exposure to noise and organic solvents is associated with hearing disorders.
Abstract:
Consider the problem of disseminating data from an arbitrary source node to all other nodes in a distributed computer system, such as a Wireless Sensor Network (WSN). We assume that wireless broadcast is used and that nodes do not know the topology. We propose new protocols which disseminate data faster and use fewer broadcasts than the simple broadcast protocol.
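For reference, the simple broadcast (flooding) protocol that the abstract uses as a baseline can be sketched as follows; the topology and node names are invented for the example, and the sketch abstracts away radio-level details such as collisions:

```python
from collections import deque

def flood(adjacency, source):
    """Simple broadcast (flooding): every node rebroadcasts the message
    once on first reception. Returns (nodes reached, broadcasts sent)."""
    reached = {source}
    broadcasts = 0
    queue = deque([source])
    while queue:
        node = queue.popleft()
        broadcasts += 1  # this node transmits exactly once
        for neighbour in adjacency[node]:
            if neighbour not in reached:
                reached.add(neighbour)
                queue.append(neighbour)
    return reached, broadcasts

# Hypothetical 5-node topology (for illustration only).
topology = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}
reached, sent = flood(topology, "A")
```

In flooding, every reached node transmits once, so the broadcast count always equals the number of nodes reached; protocols that exploit overlap between neighbourhoods can reach all nodes with fewer transmissions.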
Abstract:
The paper investigates the risk factors for the severity of orthodontic root resorption. The multidimensional scaling (MDS) visualization method is used to investigate experimental data from patients who received orthodontic treatment at the Department of Orthodontics and Dentofacial Orthopedics, Faculty of Dentistry, "Carol Davila" University of Medicine and Pharmacy, over a period of 4 years. The clusters emerging in the MDS plots reveal features and properties not easily captured by classical statistical tools. The results support the adoption of MDS for tackling dentistry information and overcoming the noise embedded in the data. The method introduced in this paper is rapid, efficient, and very useful for analysing the risk factors for the severity of orthodontic root resorption.
Abstract:
Nowadays, due to the incredible growth of the mobile device market, when we implement client-server applications we must consider the limitations of mobile devices. In this paper we discuss what may be the most reliable and fastest way to exchange information between a server and an Android mobile application. This is an important issue because, with a responsive application, the user experience is more enjoyable. We present a study that tests and evaluates two data transfer protocols, sockets and HTTP, and three data serialization formats (XML, JSON and Protocol Buffers), using different environments and mobile devices, to determine which is the most practical and fastest to use.
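As a minimal illustration of why the serialization format matters, the same record can be encoded with Python's standard library in JSON and XML and the payload sizes compared. The record fields are invented for the example, and Protocol Buffers is omitted because it requires a schema compiler and a third-party library:

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical payload (for illustration only).
record = {"id": "42", "name": "sensor-a", "value": "23.5"}

# JSON encoding: keys quoted once, values delimited by punctuation.
json_bytes = json.dumps(record).encode("utf-8")

# Equivalent XML encoding: each field carries an opening and a closing tag.
root = ET.Element("record")
for key, value in record.items():
    child = ET.SubElement(root, key)
    child.text = value
xml_bytes = ET.tostring(root, encoding="utf-8")

# For this record, the XML payload is larger than the JSON one, which is
# one of the factors such a study would measure alongside parsing speed.
```

Payload size is only one axis of the comparison; parsing time and battery cost on the device are the others, and a binary format like Protocol Buffers typically wins on both size and speed at the cost of requiring a schema.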
Abstract:
The overall goal of the REMPLI project is to design and implement a communication infrastructure for distributed data acquisition and remote control operations using the power grid as the communication medium. The primary target application is remote meter reading with high time resolution, where the meters can be energy, heat, gas or water meters. The users of the system (e.g. utility companies) will benefit from the REMPLI system by gaining more detailed information about how energy is consumed by the end-users. In this context, power-line communication (PLC) is deployed to cover the distance between the utility company's private network and the end user. This document specifies a protocol for real-time PLC within the framework of the REMPLI project, mainly comprising the Network Layer and the Data Link Layer. The protocol was designed taking into consideration the specific aspects of the network: different network topologies (star, tree, ring, multiple paths), dynamic changes in network topology (due to network maintenance, hazards, etc.), and communication lines strongly affected by noise.