950 results for Monitoring vibration systems
Abstract:
Whole-body vibration exposure of locomotive engineers and the vibration attenuation of seats in 22 U.S. locomotives (built between 1959 and 2000) were studied during normal revenue service, following international measurement guidelines. Triaxial vibration measurements (mean duration 155 min, range 84-383 min) on the seat and on the floor were compared. In addition to the basic vibration evaluation (aw rms), the vector sum (av), the maximum transient vibration value (MTVV/aw), the vibration dose value (VDV/(aw T1/4)), and the seat effective amplitude transmissibility (SEAT) factor were calculated. The power spectral densities are also reported. The mean basic vibration level (aw rms) was 0.18 m/sec2 in the fore-aft (x) axis, 0.28 m/sec2 in the lateral (y) axis, and 0.32 m/sec2 in the vertical (z) axis. The mean vector sum was 0.59 m/sec2 (range 0.27 to 1.44). The crest factors were generally at or above 9 in the horizontal and vertical axes. The mean MTVV/aw was 5.3 (x), 5.1 (y), and 4.8 (z), and the VDV/(aw T1/4) values ranged from 1.32 to 2.3 (x-axis), 1.33 to 1.7 (y-axis), and 1.38 to 1.86 (z-axis), generally indicating high levels of shocks. The mean SEAT factor was 1.4 (x), 1.2 (y), and 1.0 (z), demonstrating the general ineffectiveness of the seat suspension systems. In conclusion, these data indicate that locomotive rides are characterized by a relatively high shock content (acceleration peaks) of the vibration signal in all directions. Locomotive vertical and lateral vibrations are of similar magnitude, which appears to be characteristic of rail vehicles compared with many road/off-road vehicles. The tested locomotive cab seats currently in use (new or old) appear inadequate to reduce potentially harmful vibration and shocks transmitted to the seated operator, and older seats in particular lack basic ergonomic features regarding adjustability and postural support.
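For readers unfamiliar with these metrics, the following is a minimal sketch (not the study's code) of how the reported quantities are typically computed from frequency-weighted acceleration records. The ISO 2631-1 axis multiplying factors of 1.4, 1.4 and 1.0 and the 1 s MTVV window are assumptions here, and the frequency-weighting filters themselves are omitted.

```python
# Sketch of the whole-body vibration metrics named in the abstract, computed
# from already frequency-weighted triaxial acceleration signals (m/s^2).
import numpy as np

def running_rms(a, fs, window_s=1.0):
    """Running rms over a sliding window (used for the MTVV)."""
    n = max(1, int(window_s * fs))
    kernel = np.ones(n) / n
    return np.sqrt(np.convolve(a**2, kernel, mode="valid"))

def wbv_metrics(ax, ay, az, fs, k=(1.4, 1.4, 1.0)):
    a = np.vstack([ax, ay, az])
    T = a.shape[1] / fs                                   # exposure duration, s
    rms = np.sqrt(np.mean(a**2, axis=1))                  # basic evaluation a_w,rms
    av = np.sqrt(np.sum((np.array(k) * rms) ** 2))        # vector sum
    crest = np.max(np.abs(a), axis=1) / rms               # crest factor
    mtvv = np.array([running_rms(x, fs).max() for x in a])  # max transient value
    vdv = (np.sum(a**4, axis=1) / fs) ** 0.25             # vibration dose value
    return {
        "rms": rms,
        "vector_sum": av,
        "crest_factor": crest,
        "MTVV_over_aw": mtvv / rms,
        "VDV_over_awT14": vdv / (rms * T**0.25),
    }

def seat_factor(seat_rms, floor_rms):
    """SEAT factor: seat rms / floor rms; values above 1 mean amplification."""
    return seat_rms / floor_rms
```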
Abstract:
Doppler Optical Coherence Tomography (DOCT) is a biomedical imaging technique that allows simultaneous structural imaging and flow monitoring inside biological tissues and materials with micrometer-scale spatial resolution. It has recently been applied to the characterization of microfluidic systems. Structural and flow images of novel microfluidic platforms for cytotoxicologic applications were obtained with a real-time, near-infrared spectral-domain DOCT system. Characteristics such as flow homogeneity in the chamber, one of the most important parameters for cell culture, are investigated. OCT and DOCT images were used to monitor flow inside a specific platform that relies on microchannel division for better flow homogeneity. In particular, the evolution of the flow profile at the transition between the microchannel structure and the chamber is studied.
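The abstract does not restate how velocity is recovered from the Doppler signal; for orientation, the standard phase-resolved DOCT relation is given below. The symbols are generic and not quoted from the paper.

```latex
% Phase-resolved Doppler OCT relation (standard textbook form, assumed here):
% flow velocity from the phase difference between successive A-scans.
v = \frac{\lambda_0 \, \Delta\varphi}{4 \pi \, n \, T \cos\theta}
% \lambda_0: center wavelength, \Delta\varphi: inter-A-scan phase difference,
% n: refractive index, T: time between A-scans, \theta: Doppler angle.
```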
Abstract:
In this paper, we propose an intelligent method, named the Novelty Detection Power Meter (NodePM), to detect novelties in electronic equipment monitored by a smart grid. Considering the entropy of each monitored device, calculated from a Markov chain model, the proposed method identifies novelties through a machine learning algorithm. To this end, the NodePM is integrated into a platform for remote monitoring of energy consumption, which consists of a wireless sensor network (WSN). It should be stressed that, unlike many related works that are evaluated in simulated environments, our experiments were conducted in real environments. The results show that the NodePM reduces the power consumption of the monitored equipment by 13.7%. In addition, the NodePM detects novelties more efficiently than an approach from the literature, surpassing it in the different scenarios of all evaluations that were carried out.
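As a rough illustration only (the NodePM internals are not described beyond the sentence above), a Markov-chain entropy of a monitored device could be estimated along the following lines. The discretization into 10 power bins and the synthetic readings are assumptions made for the example, not the paper's parameters.

```python
# Sketch: fit a Markov chain to discretized power readings of one device and
# compute its entropy rate; a large shift of this value over time would be a
# candidate "novelty" signal.
import numpy as np

def transition_matrix(states, n_states):
    """Maximum-likelihood transition probabilities from a discrete state sequence."""
    P = np.zeros((n_states, n_states))
    for s, s_next in zip(states[:-1], states[1:]):
        P[s, s_next] += 1
    row_sums = P.sum(axis=1, keepdims=True)
    # rows never visited fall back to a uniform distribution
    return np.divide(P, row_sums, out=np.full_like(P, 1.0 / n_states),
                     where=row_sums > 0)

def entropy_rate(P):
    """Entropy rate H = -sum_i pi_i sum_j P_ij log2 P_ij of the fitted chain."""
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
    pi = pi / pi.sum()
    logP = np.where(P > 0, np.log2(np.where(P > 0, P, 1.0)), 0.0)
    return float(-np.sum(pi[:, None] * P * logP))

# Example with synthetic wattage readings (assumption, for illustration only).
readings = np.random.default_rng(0).normal(60.0, 5.0, size=1000)
states = np.digitize(readings, bins=np.linspace(40, 80, 9))   # 10 states: 0..9
H = entropy_rate(transition_matrix(states, n_states=10))
```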
Abstract:
Recent advancements in cloud computing have enabled the proliferation of distributed applications, which require management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing environmental conditions and numbers of users, application performance might suffer, leading to Service Level Agreement (SLA) violations and inefficient use of hardware resources. We introduce a system for managing the complexity of scaling applications composed of multiple services, using mechanisms based on fulfillment of SLAs. We present how service monitoring information can be used in conjunction with service level objectives, predictions, and correlations between performance indicators to optimize the allocation of services belonging to distributed applications. We validate our models using experiments and simulations involving a distributed enterprise information system. We show how discovering correlations between application performance indicators can serve as a basis for creating refined service level objectives, which can then be used for scaling the application and improving its overall performance under similar conditions.
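As a purely illustrative reading of the correlation idea, the sketch below derives candidate refined SLOs from monitored indicators. The metric names, the synthetic data, and the 0.8 correlation cutoff are assumptions for the example, not the paper's models.

```python
# Sketch: find indicators strongly correlated with the SLA-bound metric; these
# become candidates for refined service level objectives that can drive scaling
# before the SLA itself is violated.
import numpy as np

def refined_slo_candidates(metrics: dict, target: str, threshold: float = 0.8):
    """Return indicators whose |Pearson r| with the target metric exceeds threshold."""
    y = np.asarray(metrics[target], dtype=float)
    candidates = {}
    for name, values in metrics.items():
        if name == target:
            continue
        r = np.corrcoef(np.asarray(values, dtype=float), y)[0, 1]
        if abs(r) >= threshold:
            candidates[name] = r
    return candidates

# Usage with synthetic monitoring data (names and values are hypothetical).
rng = np.random.default_rng(1)
cpu = rng.uniform(20, 95, 200)
metrics = {
    "response_time_ms": 2.0 * cpu + rng.normal(0, 5, 200),   # SLA-bound indicator
    "cpu_pct": cpu,
    "queue_len": rng.poisson(3, 200).astype(float),
}
print(refined_slo_candidates(metrics, target="response_time_ms"))
```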
Abstract:
AIM: To investigate the acute effects of stochastic resonance whole body vibration (SR-WBV) training in order to identify possible explanations for preventive effects against musculoskeletal disorders. METHODS: Twenty-three healthy female students participated in this quasi-experimental pilot study. Acute physiological and psychological effects of SR-WBV training were examined using electromyography of the descending trapezius (TD) muscle, heart rate variability (HRV), different skin parameters (temperature, redness and blood flow) and self-report questionnaires. All subjects performed a sham SR-WBV training at a low intensity (2 Hz with noise level 0) and a verum SR-WBV training at a higher intensity (6 Hz with noise level 4). They were tested before, during and after the training. Conclusions were drawn on the basis of analysis of variance. RESULTS: Twenty-three healthy female students participated in this study (age = 22.4 ± 2.1 years; body mass index = 21.6 ± 2.2 kg/m2). Muscular activity of the TD and energy expenditure rose during verum SR-WBV compared to baseline and sham SR-WBV (all P < 0.05). Muscular relaxation after verum SR-WBV was higher than at baseline and after sham SR-WBV (all P < 0.05). During verum SR-WBV the levels of HRV were similar to those observed during sham SR-WBV. The same applies to most of the skin characteristics, while microcirculation of the skin of the middle back was higher during verum than during sham SR-WBV (P < 0.001). Skin redness showed significant changes over the three measurement points only in the middle back area (P = 0.022), with a significant rise from baseline to verum SR-WBV (0.86 ± 0.25 perfusion units; P = 0.008). The self-reported chronic pain grade indicators of pain, stiffness, well-being, and muscle relaxation showed a mixed pattern across conditions. Muscle and joint stiffness (P = 0.018) and muscular relaxation changed significantly from baseline to the different conditions of SR-WBV (P < 0.001). Moreover, muscle relaxation after verum SR-WBV was higher than after sham SR-WBV (P < 0.05). CONCLUSION: Verum SR-WBV stimulated musculoskeletal activity in young healthy individuals while cardiovascular activation was low. Training of musculoskeletal capacity and an immediate increase in musculoskeletal relaxation are potential mediators of pain reduction in preventive trials.
Abstract:
The identification of plausible causes for water body status deterioration will be much easier if it can build on available, reliable, extensive and comprehensive biogeochemical monitoring data (preferably aggregated in a database). A plausible identification of such causes is a prerequisite for well-informed decisions on which mitigation or remediation measures to take. In this chapter, a rationale for an extended monitoring programme is first provided and compared to the programme required by the Water Framework Directive (WFD). This proposal includes a list of relevant parameters that are needed for an integrated, a priori status assessment. Secondly, a few sophisticated statistical tools are described that subsequently allow for the estimation of the magnitude of impairment as well as the likely relative importance of different stressors in a multiply stressed environment. The advantages and restrictions of these rather complicated analytical methods are discussed. Finally, the use of Decision Support Systems (DSS) is advocated with regard to the specific WFD implementation requirements.
Abstract:
Detector uniformity is a fundamental performance characteristic of all modern gamma camera systems, and ensuring a stable, uniform detector response is critical for maintaining clinical images that are free of artifacts. For these reasons, the assessment of detector uniformity is one of the most common activities associated with a successful clinical quality assurance program in gamma camera imaging. The evaluation of this parameter, however, is often unclear because it is highly dependent upon acquisition conditions, reviewer expertise, and the application of somewhat arbitrary limits that do not characterize the spatial location of the non-uniformities. Furthermore, as the goal of any robust quality control program is the determination of significant deviations from standard or baseline conditions, clinicians and vendors often neglect the temporal nature of detector degradation (1). This thesis describes the development and testing of new methods for monitoring detector uniformity. These techniques provide more quantitative, sensitive, and specific feedback to the reviewer so that he or she may be better equipped to identify performance degradation prior to its manifestation in clinical images. The methods exploit the temporal nature of detector degradation and spatially segment distinct regions of non-uniformity using multi-resolution decomposition. These techniques were tested on synthetic phantom data using different degradation functions, as well as on experimentally acquired time series of flood images with induced, progressively worsening defects present within the field of view. The sensitivity of conventional, global figures of merit for detecting changes in uniformity was evaluated and compared to that of the new image-space techniques. The image-space algorithms provide a reproducible means of detecting regions of non-uniformity before any single flood image has a NEMA uniformity value in excess of 5%. Their sensitivity was found to depend on the size and magnitude of the non-uniformities, as well as on the nature of the cause of the non-uniform region. A trend analysis of the conventional figures of merit demonstrated their sensitivity to shifts in detector uniformity. Because the image-space algorithms are computationally efficient, they should be used concomitantly with the trending of the global figures of merit in order to provide the reviewer with a richer assessment of gamma camera detector uniformity characteristics.
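For context, the conventional global figure of merit referred to above can be computed roughly as below. This is a simplified sketch of NEMA-style integral uniformity; the smoothing kernel and the omission of useful-field-of-view masking and pixel binning are simplifications rather than the thesis implementation.

```python
# Sketch: NEMA-style integral uniformity of a flood-field image,
# (max - min) / (max + min) * 100%, after a light 3x3 smoothing.
import numpy as np
from scipy.ndimage import convolve

def integral_uniformity(flood: np.ndarray) -> float:
    """Return integral uniformity (%) of a 2-D flood image."""
    # 3x3 weighted smoothing commonly applied before the min/max search
    kernel = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float) / 16.0
    smoothed = convolve(flood.astype(float), kernel, mode="nearest")
    return 100.0 * (smoothed.max() - smoothed.min()) / (smoothed.max() + smoothed.min())
```

Because a small or localized defect may only push this single global number past the 5% action level once it is already pronounced, spatially resolved, trend-aware methods such as those proposed in the thesis can flag degradation earlier.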
Clutter elimination for deep clinical optoacoustic imaging using localised vibration tagging (LOVIT)
Abstract:
This paper investigates a novel method which allows clutter elimination in deep optoacoustic imaging. Clutter significantly limits imaging depth in clinical optoacoustic imaging, when irradiation optics and ultrasound detector are integrated in a handheld probe for flexible imaging of the human body. Strong optoacoustic transients generated at the irradiation site obscure weak signals from deep inside the tissue, either directly by propagating towards the probe, or via acoustic scattering. In this study we demonstrate that signals of interest can be distinguished from clutter by tagging them at the place of origin with localised tissue vibration induced by the acoustic radiation force in a focused ultrasonic beam. We show phantom results where this technique allowed almost full clutter elimination and thus strongly improved contrast for deep imaging. Localised vibration tagging by means of acoustic radiation force is especially promising for integration into ultrasound systems that already have implemented radiation force elastography.
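Without reproducing the paper's processing chain, the tagging idea can be caricatured as a frame difference: only echoes displaced by the radiation-force push change between the two acquisitions, so unchanged clutter cancels. The function below is a deliberately crude sketch of that idea under this assumption, not the LOVIT algorithm itself.

```python
# Crude sketch of the vibration-tagging principle (assumed processing, not the
# paper's method): subtract optoacoustic frames acquired without and with the
# acoustic-radiation-force push so that only signals from the tagged focal
# region survive the difference.
import numpy as np

def tagged_signal_estimate(frame_no_push: np.ndarray, frame_push: np.ndarray) -> np.ndarray:
    return frame_push - frame_no_push
```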
Abstract:
Increasing commercial pressures on land are provoking fundamental and far-reaching changes in the relationships between people and land. Much of the knowledge on land-oriented investment projects currently comes from the media. Although this provides a good starting point, lack of transparency and rapidly changing contexts mean that such information is often unreliable. The International Land Coalition, in partnership with Oxfam Novib, the Centre de coopération internationale en recherche agronomique pour le développement (CIRAD), the University of Pretoria, the Centre for Development and Environment of the University of Bern (CDE), and GIZ, has started to compile an inventory of land-related investments. This project aims to better understand the extent, trends and impacts of land-related investments by supporting an ongoing and systematic stocktaking exercise of the various investment projects currently taking place worldwide. It involves a large number of organizations and individuals working in areas where land transactions are being made and able to provide details of such investments. The project monitors land transactions in rural areas that imply a transformation of land use rights from communities and smallholders to commercial use and that are made by both domestic and foreign investors (private actors, governments, government-backed private investors). The focus is on investments for food or agrofuel production, timber extraction, carbon trading, mineral extraction, conservation and tourism. A novel way of using ICT to document land acquisitions in a spatially explicit way, based on an approach called “crowdsourcing”, is being developed. This approach will allow actors to share information and knowledge directly and at any time on a public platform, where it will be scrutinized for reliability and cross-checked against other sources. Up to now, over 1200 deals have been recorded across 96 countries. Details of these transactions have been classified in a matrix and distributed to over 350 contacts worldwide for verification. The verified information has been geo-referenced and represented in two global maps. This is an open database enabling a continued monitoring exercise and the improvement of data accuracy. More information will be released over time. The opportunity lies in overcoming the constraints of incomplete information by proposing a new way of collecting, enhancing and sharing information and knowledge in a more democratic and transparent manner. The intention is to develop an interactive knowledge platform where any interested person can share and access information on land deals, their links to the stakeholders involved, and their embedding in a geographical context. By making use of new ICT technologies that are increasingly within the reach of local stakeholders, as well as open-access and web-based spatial information systems, it will become possible to create a dynamic database containing spatially explicit data. Data fed in by a large number of stakeholders, increasingly also by means of new mobile ICT technologies, will open up new opportunities to analyse, monitor and assess highly dynamic trends of land acquisition and rural transformation.
Abstract:
In recent years, Geographic Information Systems (GIS) have increasingly been used in a wide array of application contexts for development cooperation in lowlands and mountain areas. When used for planning, implementation, and monitoring, GIS is a versatile and highly efficient tool, particularly in mountain areas characterized by great spatial diversity and inaccessibility. However, the establishment and application of GIS in mountain regions generally presents considerable technical challenges. Moreover, it is necessary to address specific institutional and organizational issues regarding implementation.
Abstract:
It is a challenge to measure the impact of releasing data to the public, since the effects may not be directly linked to particular open data activities, or substantial impact may only occur several years after publishing the data. This paper proposes a framework to assess the impact of releasing open data by applying the Social Return on Investment (SROI) approach. SROI was developed for organizations intended to generate social and environmental benefits, thus fitting the purpose of most open data initiatives. We link the four steps of SROI (input, output, outcome, impact) with the 14 high-value data categories of the G8 Open Data Charter to create a matrix of open data examples, activities, and impacts in each of the data categories. This Impact Monitoring Framework helps data providers to navigate the impact space of open data, laying out the conceptual basis for further research.
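As a toy illustration of the SROI arithmetic underlying the framework (all figures invented), the SROI ratio is the discounted value of the monetized outcomes divided by the value of the inputs invested in releasing the data.

```python
# Hypothetical worked example of an SROI ratio (numbers are assumptions).
annual_benefit = 50_000.0   # monetized outcome per year attributed to the release
years = 5
discount_rate = 0.035
investment = 120_000.0      # cost of publishing and maintaining the dataset

present_value = sum(annual_benefit / (1 + discount_rate) ** t
                    for t in range(1, years + 1))
sroi_ratio = present_value / investment   # roughly 1.9 : 1 with these numbers
```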
Abstract:
Due to its extraordinary biodiversity and rapid deforestation, north-eastern Madagascar is a conservation hotspot of global importance. Reducing shifting cultivation is a high priority for policy-makers and conservationists; however, spatially explicit evidence of shifting cultivation is lacking due to the difficulty of mapping it with common remote sensing methods. To overcome this challenge, we adopted a landscape mosaic approach to assess the changes between natural forests, shifting cultivation and permanent cultivation systems at the regional level from 1995 to 2011. Our study confirmed that shifting cultivation is still being used to produce subsistence rice throughout the region, but there is a trend of intensification away from shifting cultivation towards permanent rice production, especially near protected areas. While large continuous forest exists today only in the core zones of protected areas, the agricultural matrix is still dominated by a dense cover of tree crops and smaller forest fragments. We believe that this evidence makes a crucial contribution to the development of interventions to prevent further conversion of forest to agricultural land while improving local land users' well-being.
Abstract:
Syndromic surveillance (SyS) systems currently exploit various sources of health-related data, most of which are collected for purposes other than surveillance (e.g. economic). Several European SyS systems use data collected during meat inspection for syndromic surveillance of animal health, as some diseases may be more easily detected post-mortem than at their point of origin or during the ante-mortem inspection upon arrival at the slaughterhouse. In this paper we use simulation to evaluate the performance of a quasi-Poisson regression (also known as an improved Farrington) algorithm for the detection of disease outbreaks during post-mortem inspection of slaughtered animals. When the algorithm was parameterized based on retrospective analyses of 6 years of historic data, the probability of detection was satisfactory for large (range 83-445 cases) outbreaks but poor for small (range 20-177 cases) outbreaks. Varying the amount of historical data used to fit the algorithm can help increase the probability of detection for small outbreaks. However, while the use of a 0·975 quantile generated a low false-positive rate, in most cases more than 50% of outbreak cases had already occurred at the time of detection. The high variance observed in the whole-carcass condemnation time series, and the lack of flexibility in the temporal distribution of simulated outbreaks resulting from the low (monthly) reporting frequency, constitute major challenges for early detection of outbreaks in the livestock population based on meat inspection data. Reporting frequency should be increased in the future to improve the timeliness of the SyS system, while increased sensitivity may be achieved by integrating meat inspection data into a multivariate system that simultaneously evaluates multiple sources of data on livestock health.
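A heavily simplified sketch of the detection step is given below: it fits a quasi-Poisson baseline with a trend and annual seasonality to historic monthly counts and flags the current count if it exceeds an approximate upper bound. The covariates and the normal-approximation threshold are simplifications of, not a substitute for, the improved Farrington algorithm evaluated in the paper.

```python
# Sketch: quasi-Poisson baseline for monthly condemnation counts and an
# approximate upper detection bound near the 0.975 level.
import numpy as np
import statsmodels.api as sm

def detection_threshold(history: np.ndarray, quantile_z: float = 1.96) -> float:
    """Upper bound for the next monthly count from a quasi-Poisson fit."""
    t = np.arange(len(history))
    X = sm.add_constant(np.column_stack([
        t,                                   # linear trend
        np.sin(2 * np.pi * t / 12),          # annual seasonality (monthly data)
        np.cos(2 * np.pi * t / 12),
    ]))
    fit = sm.GLM(history, X, family=sm.families.Poisson()).fit()
    phi = max(1.0, fit.pearson_chi2 / fit.df_resid)        # over-dispersion factor
    t_next = len(history)
    x_next = np.array([1.0, t_next,
                       np.sin(2 * np.pi * t_next / 12),
                       np.cos(2 * np.pi * t_next / 12)])
    mu = float(np.exp(x_next @ fit.params))                 # expected baseline count
    return mu + quantile_z * np.sqrt(phi * mu)               # approximate upper bound

# An alarm is raised when the observed count exceeds the threshold:
# alarm = observed_count > detection_threshold(past_counts)
```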