791 results for vibration-based structural health monitoring


Relevance:

100.00%

Publisher:

Abstract:

Behavior is one of the most important indicators for assessing cattle health and well-being. The objective of this study was to develop and validate a novel algorithm to monitor the locomotor behavior of loose-housed dairy cows based on the output of the RumiWatch pedometer (ITIN+HOCH GmbH, Fütterungstechnik, Liestal, Switzerland). Locomotion data were acquired by simultaneous pedometer measurements at a sampling rate of 10 Hz and video recordings for later manual observation. The study consisted of 3 independent experiments: experiment 1 was carried out to develop and validate the algorithm for lying behavior, experiment 2 for walking and standing behavior, and experiment 3 for stride duration and stride length. The final version was validated using raw data collected from cows not included in the development of the algorithm. Spearman correlation coefficients were calculated between accelerometer variables and the respective data derived from the video recordings (gold standard). Dichotomous data were expressed as the proportion of correctly detected events, and the overall difference for continuous data was expressed as the relative measurement error. The proportions of correctly detected events or bouts were 1 for stand-ups, lie-downs, standing bouts, and lying bouts, and 0.99 for walking bouts. The relative measurement error and Spearman correlation coefficient were 0.09% and 1 for lying time; 4.7% and 0.96 for standing time; 17.12% and 0.96 for walking time; 6.23% and 0.98 for number of strides; 6.65% and 0.75 for stride duration; and 11.92% and 0.81 for stride length. The strong to very high correlations of the variables between visual observation and converted pedometer data indicate that the novel RumiWatch algorithm may markedly improve automated livestock management systems for efficient health monitoring of dairy cows.
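The two validation metrics used above can be sketched in a few lines. This is a minimal illustration, not the study's code; the sample values are invented, and the Spearman coefficient is computed from scratch (as the Pearson correlation of ranks) to keep the sketch dependency-free.

```python
# Sketch of the validation metrics described above: relative measurement
# error and Spearman rank correlation between pedometer output and video
# observation (the gold standard). Sample values are illustrative only.

def rank(values):
    """Assign average ranks to values (ties share the mean rank)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank for the tied run
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman correlation = Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

def relative_measurement_error(measured, reference):
    """Overall difference expressed as a fraction of the reference total."""
    return abs(sum(measured) - sum(reference)) / sum(reference)

# Illustrative lying-time minutes per cow: pedometer vs. video.
video = [610, 655, 590, 700, 640]
pedometer = [612, 654, 591, 699, 642]
print(spearman(pedometer, video))                # close to 1
print(relative_measurement_error(pedometer, video))
```

Because the pedometer preserves the ordering of the cows while differing slightly in totals, the correlation is near 1 while the relative measurement error stays small, mirroring how the two metrics complement each other in the study.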

Relevance:

100.00%

Publisher:

Abstract:

The U.S. Air Force assesses Active Duty Air Force (ADAF) health annually using the Air Force Web-based Preventative Health Assessment (AF WebPHA). The assessment is based on a self-administered survey used to determine overall Air Force health and readiness, as well as the individual health of each airman. Individual survey responses, as well as groups of responses, trigger further computer-generated assessment and result in a classification of 'Critical', 'Priority', or 'Routine', depending on the need and urgency for further evaluation by a health care provider. The purpose of the 'Priority' and 'Critical' classifications is to provide timely intervention to prevent or limit unfavorable outcomes that may threaten an airman. Though the USAF has been transitioning from a paper form to the online WebPHA survey over the last three years, it was not made mandatory for all airmen until 2009. The survey covers many health aspects, including family history, tobacco use, exercise, alcohol use, and mental health.

Military stressors such as deployment, change of station, and the trauma of war can aggravate and intensify the common baseline worries experienced by the general population and place airmen at additional risk for mental health concerns and illness. This study assesses the effectiveness of the AF WebPHA mental health screening questions in predicting a mental health disorder diagnosis according to International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9-CM) codes generated by physicians or their surrogates. To assess the sensitivity, specificity, and positive predictive value of the AF WebPHA as a mental health screening tool, survey results were compared against any mental health disorder diagnosis generated for the period from January 1, 2009 to March 31, 2010. Statistical analysis of the AF WebPHA mental health responses against matching ICD-9-CM codes found that the sensitivity of 'Critical' or 'Priority' responses was only 3.4%, and that a flagged response correctly predicted one of the selected mental health diagnoses only 9% of the time.
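The screening-performance figures above follow directly from a 2x2 confusion matrix. The counts below are hypothetical, chosen only so the arithmetic reproduces roughly the reported sensitivity (3.4%) and positive predictive value (9%); they are not data from the study.

```python
# Sensitivity, specificity, and positive predictive value from a 2x2
# confusion matrix (screen flag vs. ICD-9-CM diagnosis). Counts are
# illustrative assumptions, not study data.

def screening_metrics(tp, fp, fn, tn):
    """Return (sensitivity, specificity, PPV) for a screening test."""
    sensitivity = tp / (tp + fn)   # flagged among those diagnosed
    specificity = tn / (tn + fp)   # not flagged among those not diagnosed
    ppv = tp / (tp + fp)           # diagnosed among those flagged
    return sensitivity, specificity, ppv

# Hypothetical counts for a cohort of 10,000 airmen.
sens, spec, ppv = screening_metrics(tp=34, fp=344, fn=966, tn=8656)
print(f"sensitivity={sens:.3f} specificity={spec:.3f} ppv={ppv:.3f}")
```

With these counts, sensitivity is 34/1000 = 3.4% and PPV is 34/378, about 9%, illustrating how a screen can be highly specific yet still miss most true cases.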

Relevance:

100.00%

Publisher:

Abstract:

Large cross-section structural timber has been used in many structures over long periods of time and still makes up an important part of the market due to its mechanical properties; it is also frequently employed on new construction sites. This creates the need for a visual grading standard for construction timber based on quality assessment, since the material has to satisfy the requirements of the current regulations. UNE 56544 is the Spanish visual grading standard for coniferous structural timber. The 2007 version defined a new visual grade in the standard for large sections, termed Structural Large Timber (MEG). This research evaluates the new visual grade using 116 structural-size specimens of sawn coniferous timber of Scots pine (Pinus sylvestris L.) from Segovia, Spain. The pieces had a cross section of 150 by 200 mm and were visually graded according to UNE 56544:2007. Mechanical properties were also obtained according to standard EN 408. The results show a very low yield, with an excessive percentage of rejected pieces (33%). The main reasons for rejection are fissures and twist.

Relevance:

100.00%

Publisher:

Abstract:

This thesis proposes a comprehensive approach to the monitoring and management of Quality of Experience (QoE) in multimedia delivery services over IP. It addresses the problem of preventing, detecting, measuring, and reacting to QoE degradations under the constraints of a service provider: the solution must scale to a wide IP network delivering individual media streams to thousands of users. The proposed monitoring solution is called QuEM (Qualitative Experience Monitoring). It is based on detecting degradations in the network Quality of Service (packet losses, bandwidth drops, etc.) and mapping each degradation event to a qualitative description of its effect on the perceived Quality of Experience (audio mutes, video artifacts, etc.). This mapping is based on the analysis of the transport and Network Abstraction Layer information of the coded stream, and it characterizes the most relevant defects observed in this kind of service: screen freezing, macroblocking, audio mutes, video quality drops, delays, and service outages. The results have been validated by subjective quality assessment tests. The methodology used for those tests was designed to mimic as closely as possible the conditions of a real user of such services: the impairments to be evaluated are introduced randomly in the middle of a continuous video stream.
Based on the monitoring solution, several applications have also been proposed: an unequal error protection system that gives higher protection to the parts of the stream most critical for the QoE, a solution that applies the same principles to minimize the impact of incomplete segment downloads in HTTP Adaptive Streaming, and a selective scrambling algorithm that ciphers only the most sensitive parts of the media stream. A fast channel change application is also presented, together with a discussion of how to apply the previous results and concepts in a 3D video scenario.
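The core QuEM idea, mapping network-level degradation events to qualitative QoE defect labels, can be sketched as a small classifier. The event types, thresholds, and labels below are assumptions for illustration only; the thesis derives its mapping from transport- and NAL-layer analysis of real coded streams.

```python
# Minimal sketch of mapping QoS degradation events to qualitative QoE
# defect descriptions (the QuEM idea). Thresholds and labels are
# illustrative assumptions, not the thesis's actual rules.

from dataclasses import dataclass

@dataclass
class DegradationEvent:
    kind: str          # e.g. "packet_loss", "bandwidth_drop", "outage"
    duration_ms: int
    affects_audio: bool
    affects_video: bool

def classify(event: DegradationEvent) -> list:
    """Return qualitative QoE defect labels for one network event."""
    defects = []
    if event.kind == "packet_loss":
        if event.affects_video:
            # Long loss bursts tend to freeze the picture; short ones
            # show up as localized macroblocking artifacts.
            defects.append("freeze" if event.duration_ms > 500 else "macroblocking")
        if event.affects_audio:
            defects.append("audio mute")
    elif event.kind == "bandwidth_drop":
        defects.append("video quality drop")
    elif event.kind == "outage":
        defects.append("service interruption")
    return defects

events = [
    DegradationEvent("packet_loss", 800, affects_audio=True, affects_video=True),
    DegradationEvent("bandwidth_drop", 2000, affects_audio=False, affects_video=True),
]
for e in events:
    print(e.kind, "->", classify(e))
```

The point of the qualitative mapping is that a provider can report "a 2-second freeze plus audio mute" rather than raw packet-loss counters, which is what subjective tests can actually validate.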

Relevance:

100.00%

Publisher:

Abstract:

Advances in hardware make it possible to collect huge volumes of data, and applications are emerging that must provide information in near-real time, e.g., patient monitoring, health monitoring of water pipes, etc. The data streaming model emerges to serve these applications, in contrast to the traditional store-then-process model. In the store-then-process model, data is stored before being queried; in streaming systems, data is processed on the fly as it arrives, producing continuous responses without being stored in full. Processing data on the fly imposes the following challenges: 1) responses must be produced continuously whenever new data arrives in the system; 2) data is accessed only once and is generally not retained in its entirety; and 3) the per-item processing time needed to produce a response must be low. Two models exist for computing continuous responses: the evolving model and the sliding window model; the latter fits better with applications whose answers must be computed over the most recent data rather than over the whole history. In recent years, research on data stream mining has focused mainly on the evolving model. Less work has been presented for the sliding window model, since those algorithms must not only be incremental but must also delete the information that expires as the window slides, while still meeting the three challenges above. Clustering is one of the fundamental tasks in data mining: given a data set, the goal is to find representative groups that provide a concise description of the data being processed.
Clustering is critical in applications such as network intrusion detection or customer segmentation in marketing and advertising. Due to the huge amount of data that must be processed by such applications (up to millions of events per second), centralized solutions may be unable to cope with the timing restrictions and resort to shedding techniques, discarding data during load peaks. To avoid discarding data, stream processing (including clustering) must be distributed, and the clustering algorithms must be adapted to environments where the data itself is distributed. In streaming, research focuses not only on designs for general tasks, such as clustering, but also on new approaches that fit particular scenarios better. As an example, an ad hoc grouping mechanism turns out to be more adequate than traditional k-means for defense against Distributed Denial of Service (DDoS) attacks. This thesis contributes to clustering over data streams in both centralized and distributed environments. We present a centralized clustering algorithm that discovers clusters of high quality in low time, together with an extensive comparison against state-of-the-art solutions. We have also developed a data structure that significantly reduces the memory required while keeping the error of the cluster statistics under control at all times. The thesis further provides two protocols for distributing the clustering computation, analyzing two key aspects: the impact of distribution on clustering quality, and the conditions required for the distributed solution to reduce processing time relative to the centralized one. Finally, we have developed a clustering-based framework for DDoS attack detection, characterizing the types of attacks detected and evaluating the efficiency and effectiveness of mitigating the attack's impact.
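The "incremental insert plus expire on slide" requirement that distinguishes the sliding window model can be sketched with per-cluster count/linear-sum statistics. This toy version uses fixed reference centroids and a count-based window; it illustrates the bookkeeping the abstract describes, not the thesis's actual algorithm.

```python
# Toy sliding-window clustering bookkeeping: each arriving point updates
# cluster statistics, and expired points are subtracted when the window
# slides. Centroids are fixed for simplicity (an assumption of this
# sketch, not of the thesis).

from collections import deque

class SlidingWindowClusters:
    def __init__(self, centroids, window_size):
        self.centroids = centroids            # fixed reference centroids
        self.window = deque()                 # (cluster_id, point) pairs
        self.window_size = window_size
        self.count = [0] * len(centroids)     # per-cluster point count
        self.lsum = [0.0] * len(centroids)    # per-cluster linear sum

    def _nearest(self, x):
        return min(range(len(self.centroids)),
                   key=lambda i: abs(x - self.centroids[i]))

    def insert(self, x):
        """Process one arriving point; expire the oldest when full."""
        c = self._nearest(x)
        self.window.append((c, x))
        self.count[c] += 1
        self.lsum[c] += x
        if len(self.window) > self.window_size:   # the window slides:
            old_c, old_x = self.window.popleft()  # delete expired info
            self.count[old_c] -= 1
            self.lsum[old_c] -= old_x

    def means(self):
        return [self.lsum[i] / self.count[i] if self.count[i] else None
                for i in range(len(self.centroids))]

sw = SlidingWindowClusters(centroids=[0.0, 10.0], window_size=4)
for x in [0.5, 9.8, 1.2, 10.3, 0.1, 9.9]:
    sw.insert(x)
print(sw.means())
```

Note that each point is touched exactly twice (once on arrival, once on expiry) and never stored beyond the window, matching the single-pass and bounded-memory constraints listed above.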

Relevance:

100.00%

Publisher:

Abstract:

This paper presents the preliminary results of the development and application of a procedure to filter Acoustic Emission (AE) signals in order to distinguish AE signals produced by friction from AE signals produced by concrete cracking. These signals were recorded during the tests of an experiment carried out on a reinforced concrete frame subjected to dynamic loading on the shaking table of the University of Granada (Spain). Discriminating between friction and cracking AE signals is the basis for developing a successful procedure and damage index, based on AE testing, for the health monitoring of RC structures subjected to earthquakes.
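A filtering step of the kind described above can be sketched as a simple feature-threshold classifier over recorded AE hits. The features (peak frequency, rise time) and the thresholds are assumptions chosen for illustration; the paper develops its own discrimination procedure from the shaking-table recordings.

```python
# Sketch of discriminating friction vs. cracking AE hits by waveform
# features. Feature choice and thresholds are illustrative assumptions,
# not the paper's procedure.

def classify_ae_hit(peak_frequency_khz, rise_time_us):
    """Label one acoustic-emission hit as 'friction' or 'cracking'.

    Friction-type signals are typically lower-frequency with longer
    rise times; crack-related signals are short, higher-frequency bursts.
    """
    if peak_frequency_khz < 100 and rise_time_us > 50:
        return "friction"
    return "cracking"

# (peak frequency in kHz, rise time in microseconds) for four example hits
hits = [(60, 120), (250, 10), (80, 200), (180, 25)]
labels = [classify_ae_hit(f, r) for f, r in hits]
print(labels)
```

Once friction hits are filtered out, a damage index can be accumulated over the remaining crack-related hits only, which is the basis the abstract points to.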

Relevance:

100.00%

Publisher:

Abstract:

MEDLINEplus is a Web-based consumer health information resource, made available by the National Library of Medicine (NLM). MEDLINEplus has been designed to provide consumers with a well-organized, selective Web site facilitating access to reliable full-text health information. In addition to full-text resources, MEDLINEplus directs consumers to dictionaries, organizations, directories, libraries, and clearinghouses for answers to health questions. For each health topic, MEDLINEplus includes a preformulated MEDLINE search created by librarians. The site has been designed to match consumer language to medical terminology. NLM has used advances in database and Web technologies to build and maintain MEDLINEplus, allowing health sciences librarians to contribute remotely to the resource. This article describes the development and implementation of MEDLINEplus, its supporting technology, and plans for future development.

Relevance:

100.00%

Publisher:

Abstract:

As a rural state, Ohio has a vital interest in addressing rural health and information needs. NetWellness is a Web-based consumer health information service that focuses on the needs of the residents of Ohio. Health sciences faculty from the state's three Carnegie Research I universities—University of Cincinnati, Case Western Reserve University, and The Ohio State University—create and evaluate content and provide Ask an Expert service to all visitors. Through partnerships at the state and local levels, involving public, private, commercial, and noncommercial organizations, NetWellness has grown from a regional demonstration project in 1995 to a key statewide service. Collaboration with public libraries, complemented by alliances with kindergarten through twelfth grade agencies, makes NetWellness Ohio's essential health information resource.

Relevance:

100.00%

Publisher:

Abstract:

Both lifestyle and geography make the delivery of consumer health information in the rural setting unique. The Planetree Health Resource Center in The Dalles, Oregon, has served the public in a rural setting for the past eight years. It is a community-based consumer health library affiliated with a small rural hospital, Mid-Columbia Medical Center. One key to providing consumer health information in rural environments is building relationships with individuals in the community: integration into community life is very important for credibility and sustainability. The resource center takes a proactive approach and employs several different outreach efforts to deepen its relationship with community members. It also works hard to foster partnerships for improved health information delivery with other community organizations, including area schools. This paper describes Planetree Health Resource Center's approach to rural outreach.

Relevance:

100.00%

Publisher:

Abstract:

U.S. Atomic Energy Commission Plowshare Program -- Cover.

Relevance:

100.00%

Publisher:

Abstract:

"March 1996"--Cover of [pt. 6]

Relevance:

100.00%

Publisher:

Abstract:

Two water quality monitoring strategies designed to sample hydrophobic organic contaminants were applied and evaluated across an expected concentration gradient in PAHs in the Moreton region. Semipermeable membrane devices (SPMDs), which sequester contaminants via passive diffusion across a membrane, were used to evaluate the concentration of PAHs at four and five sites in spring and summer 2001/2002, respectively. In addition, induction of hepatic cytochrome P4501A, measured as EROD activity, in yellowfin bream (Acanthopagrus australis) captured in the vicinity of the SPMD sampling sites following the summer deployment was used as a biomarker of exposure to PAHs and related chemicals. SPMDs identified a clear and reproducible gradient in PAH contamination, with levels increasing from east to west in Moreton Bay and upstream in the Brisbane River. The highest PAH concentrations, expressed as B(a)P toxicity equivalents (TEQs), were found in urban areas, which were also furthest upstream and experienced the least flushing. Cytochrome P4501A induction in A. australis was similar at all sites. The absence of clear trends in EROD activity may be attributable to factors not measured in this study or to variable residency times of A. australis in contaminated areas. It is also possible that fish in the Moreton region are displaying enzymatic adaptation, which has been reported previously for fish subjected to chronic exposure to organic contaminants. These potential interferences complicate the interpretation of EROD activity from feral biota. It is therefore suggested that future monitoring combine the two methods by applying passive sampler extracts to in vitro EROD assays.
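The B(a)P toxicity equivalents (TEQs) mentioned above are obtained by weighting each PAH concentration with a toxic equivalency factor (TEF) relative to benzo[a]pyrene and summing. The TEF values below are commonly cited magnitudes, but they should be treated as illustrative here, since the study's exact factors are not given, and the site concentrations are invented.

```python
# B(a)P toxicity-equivalent (TEQ) calculation: sum over measured PAHs of
# concentration x TEF. TEFs and concentrations are illustrative.

TEF = {
    "benzo[a]pyrene": 1.0,
    "benz[a]anthracene": 0.1,
    "chrysene": 0.01,
    "fluoranthene": 0.001,
}

def bap_teq(concentrations):
    """Sum of concentration x TEF over all measured PAHs (same units in/out)."""
    return sum(c * TEF[name] for name, c in concentrations.items())

# Hypothetical SPMD-derived concentrations for one site (ng/L).
site = {"benzo[a]pyrene": 2.0, "benz[a]anthracene": 5.0,
        "chrysene": 10.0, "fluoranthene": 100.0}
print(bap_teq(site))  # approximately 2.7
```

The weighting explains why a site dominated by abundant but weakly toxic PAHs (like fluoranthene here) can still show a modest TEQ, while a small amount of benzo[a]pyrene dominates the total.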

Relevance:

100.00%

Publisher:

Abstract:

This paper describes the use of a website for the dissemination of the community-based '10,000 steps' program, which was originally developed and evaluated in Rockhampton, Queensland in 2001-2003. The website provides information and interactive activities for individuals, and promotes resources and programs for health promotion professionals. The dissemination activity was assessed in terms of program adoption and implementation. In a 2-year period (May 2004-March 2006), more than 18,000 people registered as users of the website (logging more than 8.5 billion steps), and almost 100 workplaces and 13 communities implemented aspects of the 10,000 steps program. These data support the use of the internet as an effective means of disseminating ideas and resources beyond the geographical borders of the original project. Following this preliminary dissemination, there remains a need for the systematic study of different dissemination strategies, so that evidence-based physical activity programs can be translated into more widespread public health practice.

Relevance:

100.00%

Publisher:

Abstract:

The Access to Allied Psychological Services component of Australia's Better Outcomes in Mental Health Care program enables eligible general practitioners to refer consumers to allied health professionals for affordable, evidence-based mental health care, via 108 projects conducted by Divisions of General Practice. The current study profiled the models of service delivery across these projects and examined whether particular models were associated with differential levels of access to services. We found that 76% of projects retained their allied health professionals under contract, 28% via direct employment, and 7% in some other way; that allied health professionals provided services from GPs' rooms in 63% of projects, from their own rooms in 63%, and from a third location in 42%; and that the referral mechanism of choice was direct referral in 51% of projects, a voucher system in 27%, a brokerage system in 24%, and a register system in 25%. Many of these models were used in combination. No model was predictive of differential levels of access, suggesting that the approach of adapting models to the local context is proving successful.

Relevance:

100.00%

Publisher:

Abstract:

In 1992 the Australian Government adopted the National Mental Health Strategy in an attempt to improve the provision of mental health services; one component was to improve geographical access to hospital-based mental health services. This paper is concerned with determining whether this objective has been achieved. Time-series data on patients with mental illness (at a regional level) in the State of Queensland are available for the years 1968-69 to 2002-03. A change in regional classification by the Australian Bureau of Statistics complicates the analysis by precluding certain empirical tests, such as tests for converging utilisation rates by region. To overcome this problem, concepts of concentration and equality commonly employed in industrial economics were applied to the regional data. The empirical results show no evidence of improving regional access following the National Mental Health Strategy; in fact, the statistical results show the opposite, i.e. declining regional access.
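One concentration measure from industrial economics of the kind the paper describes is the Herfindahl-Hirschman index (HHI), the sum of squared shares: equal shares across n regions give the minimum value 1/n, while full concentration in a single region gives 1. The abstract does not say which specific measure was used, so this is a representative example, and the regional counts are invented.

```python
# Herfindahl-Hirschman index over regional patient counts: a rising HHI
# over time would indicate services concentrating in fewer regions,
# i.e. declining regional access. Counts are illustrative.

def herfindahl(counts):
    """HHI of a distribution given raw counts per region."""
    total = sum(counts)
    return sum((c / total) ** 2 for c in counts)

equal = [100, 100, 100, 100]        # perfectly even access: HHI = 1/4
concentrated = [370, 10, 10, 10]    # services clustered in one region
print(herfindahl(equal))            # 0.25
print(herfindahl(concentrated))
```

Comparing the index across classification schemes works even when region boundaries change, which is why a concentration measure sidesteps the Australian Bureau of Statistics reclassification problem noted above.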