855 results for structural health monitoring (SHM)


Relevance: 100.00%

Abstract:

Advances in electronics now facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as the Internet of Things (IoT) and the Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices, with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who apply this Internet-oriented approach need a solid understanding of specific platforms and web technologies. To ease this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims to enable the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at the hardware level. ROOD's feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

Relevance: 100.00%

Abstract:

Development of PCB-integratable microsensors for monitoring chemical species is a goal in areas such as lab-on-a-chip analytical devices, diagnostic medicine and electronics for hand-held instruments, where device size is a major issue. Cellular phones have pervaded the world's inhabitants, and their usefulness has dramatically increased with the introduction of smartphones due to a combination of amazing processing power in a confined space, geolocalization and manifold telecommunication features. Therefore, a number of physical and chemical sensors that add value to the terminal for health monitoring, personal safety (at home, at work) and, eventually, national security have started to be developed, capitalizing also on the huge number of circulating cell phones. The chemical sensor-enabled "super" smartphone provides a unique (bio)sensing platform for monitoring airborne or waterborne hazardous chemicals or microorganisms, for both single-user and crowdsourced security applications; some of the latest developments are illustrated here with a few examples. Moreover, we have recently achieved, for the first time, covalent functionalization of p- and n-GaN semiconductor surfaces with tuneable luminescent indicator dyes of the Ru-polypyridyl family, a key step in the development of innovative microsensors for smartphone applications. Chemical "sensoring" of GaN-based blue LED chips with those indicators has also been achieved by plasma treatment of their surface, and the micrometer-sized devices have been tested for monitoring O2 in the gas phase to show their full functionality. Novel strategies to enhance sensor sensitivity, such as changing the length and nature of the siloxane buffer layer, are discussed in this paper.

Relevance: 100.00%

Abstract:

Advances in hardware make it possible to collect huge volumes of data, and applications are emerging that must deliver information in near-real time, e.g., patient monitoring or the health monitoring of water pipes. The data streaming model emerged to serve these applications, in contrast to the traditional store-then-process model. Under store-then-process, data is stored before being queried; in streaming systems, data is processed on arrival, producing continuous responses without ever being stored in full. This view imposes three challenges for processing data on the fly: 1) responses must be produced continuously whenever new data arrives; 2) data is accessed only once and is generally not retained in its entirety; and 3) the per-item processing time needed to produce a response must be low. Two models exist for computing continuous responses: the evolving model and the sliding-window model; the latter fits better in applications that must consider only the most recently received data rather than the entire history. In recent years, research on data stream mining has focused mainly on the evolving model. In the sliding-window model, less work has been presented, since these algorithms must not only be incremental but must also delete the information that expires as the window slides, while still meeting the three challenges above. Clustering is one of the fundamental tasks of data mining: given a data set, the goal is to find representative groups that provide a concise description of the data. Clustering is critical in applications such as network intrusion detection or customer segmentation in marketing and advertising.
Due to the massive amounts of data that such applications must process (up to millions of events per second), centralized solutions may be unable to meet the processing-time constraints and must resort to discarding data during load peaks. To avoid this loss of data, stream processing must be distributed; in particular, clustering algorithms must be adapted to environments in which the data itself is distributed. In streaming, research focuses not only on designs for general tasks such as clustering, but also on new approaches that better fit particular scenarios. As an example, an ad-hoc grouping mechanism turns out to be more adequate for defense against Distributed Denial of Service (DDoS) attacks than the traditional k-means problem. This thesis contributes to streaming clustering in both centralized and distributed environments. We have designed a centralized clustering algorithm and shown, in an extensive evaluation, its ability to discover high-quality clusters in low time compared with other state-of-the-art solutions. We have also developed a data structure that notably reduces the required memory while keeping the error of the computations under control at all times. Our work further provides two protocols for distributing the computation of clusters, analyzing two key aspects: the impact of distributed computation on clustering quality, and the conditions required to reduce processing time relative to the centralized solution. Finally, we have developed a clustering-based framework for detecting DDoS attacks; we have characterized the types of attacks detected and evaluated the efficiency and effectiveness of mitigating the attack's impact.
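The thesis's own algorithms are not reproduced here, but the sliding-window model it builds on can be sketched generically: keep each point only while it is inside the window, and subtract it from its cluster's running sums when it expires, so centroids always reflect the current window contents at O(1) cost per event. The class name, the seed-centroid assignment policy, and the time-based eviction rule below are all illustrative assumptions, not the thesis's method.

```python
from collections import deque

class SlidingWindowKMeans:
    """Toy sliding-window clustering: points expire after `window` time units.
    Each cluster keeps running sums so centroids update in O(1) per event."""

    def __init__(self, centroids, window):
        self.window = window
        self.centroids = [list(c) for c in centroids]   # seed centroids
        self.sums = [[0.0] * len(centroids[0]) for _ in centroids]
        self.counts = [0] * len(centroids)
        self.buffer = deque()                           # (timestamp, point, cluster)

    def _nearest(self, p):
        return min(range(len(self.centroids)),
                   key=lambda i: sum((a - b) ** 2
                                     for a, b in zip(p, self.centroids[i])))

    def insert(self, t, p):
        # 1) evict points that slid out of the window, undoing their contribution
        while self.buffer and self.buffer[0][0] <= t - self.window:
            _, old, i = self.buffer.popleft()
            self.counts[i] -= 1
            for d, v in enumerate(old):
                self.sums[i][d] -= v
            if self.counts[i]:                          # keep last centroid if empty
                self.centroids[i] = [s / self.counts[i] for s in self.sums[i]]
        # 2) absorb the new point into its nearest cluster
        i = self._nearest(p)
        self.buffer.append((t, p, i))
        self.counts[i] += 1
        for d, v in enumerate(p):
            self.sums[i][d] += v
        self.centroids[i] = [s / self.counts[i] for s in self.sums[i]]
```

The point of the sketch is the eviction step: unlike the evolving model, expired information must be *removed* from the summary, which is why each point remembers the cluster it was charged to.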

Relevance: 100.00%

Abstract:

This paper shows the preliminary results of the development and application of a procedure to filter Acoustic Emission (AE) signals in order to distinguish AE signals coming from friction from those coming from concrete cracking. These signals were recorded during the test runs of an experiment carried out on a reinforced concrete frame subjected to dynamic loadings on the shaking table of the University of Granada (Spain). Discrimination between friction and cracking AE signals is the basis for developing a successful procedure and damage index, based on AE testing, for the health monitoring of RC structures subjected to earthquakes.
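The paper's calibrated filter is not reproduced here; as a generic illustration of feature-based AE discrimination, the sketch below computes two standard AE hit parameters (peak amplitude, and average frequency as rising threshold crossings per unit time) and applies a placeholder decision rule. The cutoff values and the rule "higher average frequency means cracking" are assumptions for illustration, not the paper's procedure.

```python
def ae_features(signal, sample_rate, threshold):
    """Return (peak amplitude, average frequency in Hz) of one AE hit.
    Average frequency = rising threshold crossings / hit duration."""
    peak = max(abs(s) for s in signal)
    crossings = sum(1 for a, b in zip(signal, signal[1:])
                    if a < threshold <= b)       # rising crossings only
    duration = len(signal) / sample_rate         # seconds
    return peak, crossings / duration

def classify(signal, sample_rate, threshold=0.1, freq_cutoff=50_000.0):
    """Placeholder rule: hits with high average frequency -> 'cracking'."""
    _, avg_freq = ae_features(signal, sample_rate, threshold)
    return "cracking" if avg_freq >= freq_cutoff else "friction"
```

In practice such thresholds would be calibrated against labeled hits from the shake-table tests themselves, which is precisely the filtering procedure the paper develops.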

Relevance: 100.00%

Abstract:

Wistar rats are widely used as an animal model in biomedical research, and the sanitary control of animal facilities is essential to guarantee the quality of experiments. The aim of this study was to characterize the health status of a Wistar rat colony kept under a conventional housing system, determining the bacteria, fungi, viruses and parasites present and characterizing the anatomopathological lesions of the respiratory system. A total of 273 rats (N), male (M) and female (F), aged 4, 8, 12, 16 to 20 weeks and 12 to 18 months, were used for determinations of weight and body condition (N=273; 140M, 133F); bacteriological evaluation of the oropharynx, intestinal mucosa and tracheobronchial lavage (N=40; 20M, 20F); determination of antibodies against viruses and bacteria (N=20; 10M, 10F); parasitological examination (N=60; 30M, 30F); molecular identification of Mycoplasma pulmonis in lung samples (N=25; 15M, 10F); and anatomopathological characterization of the nasal cavity, oropharynx, larynx, trachea and lung (N=106; 53M, 53F). Microbiological evaluations of the rat rooms were also carried out in three periods, with isolation of Micrococcus spp., Staphylococcus spp., Bacillus spp., Aspergillus spp. and Penicillium spp. Weight was homogeneous within each age group and sex, with only seven thin animals (2.56%) and nine overweight (3.30%). No pathogenic bacteria were isolated by culture from the oropharynx, intestinal mucosa or tracheobronchial lavage. Mycoplasma pulmonis was detected in 72% of the lung samples and in 100% of the sera tested. Antibodies against Reovirus type III were detected in 35% and against bacilli associated with the ciliated respiratory epithelium in 100%. Syphacia muris was diagnosed in 91.67%, Eimeria spp. in 3.33% and Entamoeba muris in 1.67%.
Lesions related to infection by exogenous agents were observed in the nasal cavity, oropharynx, larynx and trachea from 4 weeks of age, and in the lung from 12 weeks, with the frequency of occurrence and the degree of progression increasing with age in the various segments studied. We conclude that characterizing the health status of the rats reveals the particularities of the biological model used and builds a database to assist researchers in experimental design and interpretation, besides providing a basis for the health monitoring program of animal facilities under similar conditions.

Relevance: 100.00%

Abstract:

Introduction: The prevalence of chronic diseases, especially in the elderly population, creates a need for longitudinal models of care. Individuals are increasingly being made responsible for managing their own health through the use of monitoring devices such as the glucometer and the blood pressure monitor, a new reality that culminates in decision-making at home. Objectives: To identify the decision-making of elderly people in the home monitoring of chronic conditions; to identify whether sex, education and income influence decision-making; to identify the elderly's perception of home care actions; and to identify the difficulties and strategies in handling the monitoring devices. Materials and methods: A quantitative, exploratory, cross-sectional study. Sample: 150 subjects aged 60 years or older, without cognitive impairment or depression, who use a glucometer and/or blood pressure monitor at home. Instruments for participant selection: (1) Mini-Mental State Examination; (2) Geriatric Depression Scale; and (3) the Lawton and Brody Instrumental Activities of Daily Living Scale. Data collection: carried out in the city of Ribeirão Preto - SP between September 2014 and October 2015. Instruments: (1) socioeconomic questionnaire; (2) questionnaire on decision-making in home health monitoring; (3) classification of the use of electronic health care devices. Data analysis: Descriptive statistics with absolute and percentage counts were used to identify the relationship between decision-making and sex, education and income. Results: 150 elderly people participated, 117 women and 33 men, with a mean age of 72 years. Of these, 113 were hypertensive and 62 diabetic.
Regarding immediate decision-making, most users of both the blood pressure monitor (n=128) and the glucometer (n=62) reported seeking medical help, followed by taking the prescribed medication and alternative treatment options. In the medium term, seeking professional help stood out for the majority of the elderly in both groups. A small difference in decision-making was noted with respect to sex. Regarding education, elderly people with more years of schooling tended to seek the health service more than those with less schooling. Income showed no influence among glucometer users; among blood pressure monitor users, those with higher income tended to seek the health service more. Most participants viewed home health monitoring positively, mainly for the convenience of not leaving home, the quick results, and the possibility of continuous control of the disease. The main difficulties in handling the glucometer were related to the use of the lancet and the test strip, followed by checking the stored results. Difficulties in using the blood pressure monitor were related to checking the result after each measurement and to correct body positioning during monitoring. In both groups the strategies used were asking for help from others and trial and error. Conclusion: The elderly have been favorable to home health monitoring actions. In general, they immediately decide on actions within the home itself to control symptoms, which reinforces the need to invest in quality information and health education so that home management can become a strand of comprehensive care in the treatment of chronic conditions.

Relevance: 100.00%

Abstract:

"N238.598"--P. [4] of cover.

Relevance: 100.00%

Abstract:

Nitrogen loading to aquatic ecosystems from sewage is recognised worldwide as a growing problem. The use of nitrogen stable isotopes as a means of discerning sewage nitrogen in the environment has been used annually by the Ecosystem Health Monitoring Program in Moreton Bay (Australia) since 1997, when the technique was first developed. This sewage plume mapping technique, which measures the δ15N isotopic signature of the red macroalga Catenella nipae after incubation in situ, has demonstrated a large reduction in the magnitude and spatial extent of sewage nitrogen within Moreton Bay over the past 5 years. This observed reduction coincides with considerable upgrades to the nitrogen removal efficacy at several sewage treatment plants within the region. This paper describes the observed changes and evaluates whether they can be attributed to the treatment upgrades. (c) 2004 Published by Elsevier Ltd.
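For context, the δ15N value this technique measures is the per-mil deviation of a sample's 15N/14N ratio from that of atmospheric N2, the conventional reference. A minimal sketch of the definition (the sample ratio used in testing is hypothetical):

```python
# delta-15N in per mil, relative to atmospheric N2.
R_STD_AIR = 0.0036765  # 15N/14N ratio of atmospheric N2 (conventional value)

def delta15N(r_sample, r_standard=R_STD_AIR):
    """delta-15N (per mil) = ((R_sample / R_standard) - 1) * 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0
```

A sample enriched by sewage-derived nitrogen carries a higher 15N/14N ratio than air, so its δ15N comes out positive; treatment-plant upgrades push algal δ15N back toward background values.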

Relevance: 100.00%

Abstract:

Background: Reliable information on causes of death is a fundamental component of health development strategies, yet globally only about one-third of countries have access to such information. For countries currently without adequate mortality reporting systems there are useful models other than resource-intensive population-wide medical certification. Sample-based mortality surveillance is one such approach. This paper provides methods for addressing appropriate sample size considerations in relation to mortality surveillance, with particular reference to situations in which prior information on mortality is lacking.
Methods: The feasibility of model-based approaches for predicting the expected mortality structure and cause composition is demonstrated for populations in which only limited empirical data are available. An algorithmic approach is then provided to derive the minimum person-years of observation needed to generate robust estimates for the rarest cause of interest in three hypothetical populations, each representing a different level of health development.
Results: Modelled life expectancies at birth and cause-of-death structures were within expected ranges based on published estimates for countries at comparable levels of health development. The total person-years of observation required in each population could be more than halved by limiting the set of age, sex, and cause groups regarded as 'of interest'.
Discussion: The methods proposed are consistent with the philosophy of establishing priorities across broad clusters of causes for which the public health response implications are similar. The examples provided illustrate the options available when considering the design of mortality surveillance for population health monitoring purposes.
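The paper's algorithm itself is not reproduced here, but the core sample-size logic it addresses can be illustrated with a standard Poisson back-of-envelope: if the rarest cause of interest occurs at rate λ deaths per person-year and the observed count is treated as Poisson, the relative standard error of a count n is 1/√n, so hitting a target RSE requires 1/RSE² expected deaths. The function name and default RSE below are illustrative assumptions.

```python
def min_person_years(rate, rel_se=0.2):
    """Person-years of observation needed so the rarest cause's death count,
    treated as Poisson, reaches the target relative standard error.
    RSE of a Poisson count n is 1/sqrt(n), so we need n = 1/rel_se**2
    expected deaths, i.e. person-years = n / rate."""
    expected_deaths = 1.0 / rel_se ** 2
    return expected_deaths / rate
```

For example, a cause at 5 deaths per 100,000 person-years estimated to within 20% RSE needs 25 expected deaths, i.e. about 500,000 person-years; this is why restricting the set of causes "of interest" (dropping the rarest strata) shrinks the required observation so sharply.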

Relevance: 100.00%

Abstract:

During the past decade, the use of stable isotopes to investigate transport pathways of nutrients in aquatic ecosystems has contributed new understanding and knowledge to many aspects of ecology, from the trophic structure of food webs to the spatial extent of nutrient discharges. At the same time, aquatic monitoring programs around the world have become more interested in quantifying ecosystem health rather than simply measuring the physical and chemical properties of water (nutrients, pH, temperature and turbidity). A novel technique using changes in the δ15N value of the red macroalga Catenella nipae to indicate regions impacted by sewage nitrogen was initiated in 1998 as part of the development of the Ecosystem Health Monitoring Program in S.E. Queensland, Australia (EHMP). Sewage plume mapping, using the δ15N of C. nipae, has demonstrated that over the past 5 years there has been a large reduction in the magnitude and spatial extent of 15N enrichment at sites close to sewage treatment plants (STPs) discharging into Moreton Bay. This presentation will discuss how the δ15N signatures of C. nipae in the plume at the mouth of the Brisbane River have declined since first sampled in 1998, and will evaluate the causes that may be responsible for these variations. A series of laboratory experiments were conducted to investigate how environmental conditions influence the δ15N signature of C. nipae over the incubation period. These data will be used to interpret the observed in-situ decline in δ15N, in an attempt to determine whether the reduction can be attributed solely to improvements in the wastewater discharge.

Relevance: 100.00%

Abstract:

The existing method of pipeline health monitoring, which requires an entire pipeline to be inspected periodically, is unproductive. A risk-based decision support system (DSS) that reduces the amount of time spent on inspection is presented. The risk-based DSS uses the analytic hierarchy process (AHP), a multiple-attribute decision-making technique, to identify the factors that influence failure on specific segments and analyzes their effects by determining the probability of occurrence of these risk factors. The severity of failure is determined through consequence analysis. From this, the effect of a failure caused by each risk factor can be established in terms of cost, and the cumulative effect of failure is determined through probability analysis. The model optimizes the cost of pipeline operations by reducing subjectivity in selecting a specific inspection method, identifying and prioritizing the right pipeline segments for inspection and maintenance, deriving budget allocations, providing guidance on deploying the right mix of labor for inspection and maintenance, planning emergency preparedness, and deriving a logical insurance plan. The proposed methodology also helps derive an inspection and maintenance policy for the entire pipeline system and suggests a design, an operational philosophy, and a construction methodology for new pipelines.
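AHP, as used above, derives priority weights for the risk factors from a pairwise comparison matrix via its principal eigenvector. A minimal sketch of that weight-extraction step (power iteration; the example matrix is illustrative, not the paper's factor hierarchy):

```python
def ahp_weights(M, iters=100):
    """Approximate the principal eigenvector of a pairwise comparison
    matrix M (M[i][j] = how much more important factor i is than j)
    by power iteration; the normalized result gives the AHP weights."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]          # renormalize each iteration
    return w
```

For a consistent 2x2 matrix saying factor A is 3 times as important as factor B, the weights come out 0.75 and 0.25; in a full AHP study the same step runs at every level of the factor hierarchy, with a consistency check on each matrix.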

Relevance: 100.00%

Abstract:

Offshore oil and gas pipelines pose a threat to the environment, as any leak or burst causes an oil/gas spill with huge negative impacts on marine life. Breakdown maintenance of these pipelines is also cost-intensive and time-consuming, resulting in huge tangible and intangible losses to the pipeline operators. Pipeline health monitoring and integrity analysis have been researched extensively to support successful pipeline operations, and the risk-based maintenance model is one outcome of that research. This study develops a risk-based maintenance model using a combined multiple-criteria decision-making and weight method for offshore oil and gas pipelines in Thailand, with the active participation of experienced executives. The model's effectiveness has been demonstrated through a real-life application to oil and gas pipelines in the Gulf of Thailand. Practical implications: a risk-based inspection and maintenance methodology is particularly important for oil pipeline systems, as any failure in the system will not only affect productivity negatively but also has a tremendous negative environmental impact. The proposed model helps pipeline operators to analyze the health of pipelines dynamically and to select a specific inspection and maintenance method for a specific section, in line with its probability and severity of failure.

Relevance: 100.00%

Abstract:

The existing method of pipeline health monitoring, which requires an entire pipeline to be inspected periodically, is both time-wasting and expensive. A risk-based model that reduces the amount of time spent on inspection is presented. This model not only reduces the cost of maintaining petroleum pipelines, but also suggests an efficient design and operation philosophy, construction methodology and logical insurance plans. The risk-based model uses the Analytic Hierarchy Process (AHP), a multiple-attribute decision-making technique, to identify the factors that influence failure on specific segments and analyzes their effects by determining the probability of the risk factors. The severity of failure is determined through consequence analysis. From this, the effect of a failure caused by each risk factor can be established in terms of cost, and the cumulative effect of failure is determined through probability analysis. The technique does not totally eliminate subjectivity, but it is an improvement over the existing inspection method.

Relevance: 100.00%

Abstract:

Product Service Systems (PSSs) emphasize the substitution of products with services. The term "servitisation" was introduced by Sandra Vandermerwe in the late 1980s to describe the addition of services to increase a company's competitive edge. Key to PSS, and to servitisation more generally, is the "informated product": an informated product enables health monitoring of the product in use and can be key to a workable PSS. This paper reviews the evolution of servitisation and the associated business benefits. It then reviews the concept of informated products, reconfiguration techniques and the remote services that enable PSS to be delivered.

Relevance: 100.00%

Abstract:

The existing method of pipeline health monitoring, which requires an entire pipeline to be inspected periodically, is both time-wasting and expensive. A risk-based model that reduces the amount of time spent on inspection is presented. This model not only reduces the cost of maintaining petroleum pipelines, but also suggests an efficient design and operation philosophy, construction methodology, and logical insurance plans. The risk-based model uses the analytic hierarchy process (AHP), a multiple-attribute decision-making technique, to identify the factors that influence failure on specific segments and to analyze their effects by determining the probability of the risk factors. The severity of failure is determined through consequence analysis. From this, the effect of a failure caused by each risk factor can be established in terms of cost, and the cumulative effect of failure is determined through probability analysis. The technique does not totally eliminate subjectivity, but it is an improvement over the existing inspection method.