937 results for Satellite Monitoring Systems


Relevance:

30.00%

Publisher:

Abstract:

Brain injury due to lack of oxygen or impaired blood flow around the time of birth may cause long-term neurological dysfunction or, in severe cases, death. Treatment needs to be initiated as soon as possible and tailored to the nature of the injury to achieve the best outcomes. The Electroencephalogram (EEG) currently provides the best insight into neurological activity, but its interpretation presents a formidable challenge for neurophysiologists. Moreover, such expertise is not widely available, particularly around the clock in a typical busy Neonatal Intensive Care Unit (NICU). An automated computerized system for detecting and grading the severity of brain injuries could therefore be of great help to medical staff in diagnosing and initiating treatment on time. In this study, automated systems for detecting neonatal seizures and grading the severity of Hypoxic-Ischemic Encephalopathy (HIE) using EEG and Heart Rate (HR) signals are presented. A great deal of contextual and temporal information is present in EEG and HR signals when they are examined at longer time scales. Systems developed in the past exploited this information either at a very early stage, before any intelligent processing block, or at a very late stage, where much of that information is lost. This work has particularly focused on developing a system that incorporates contextual information at the middle (classifier) level. This is achieved by using dynamic classifiers that process sequences of feature vectors rather than one feature vector at a time.
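As an illustration of the dynamic-classifier idea described above, the following is a minimal sketch assuming a hidden Markov model per class (the abstract does not specify the exact classifier used); it labels a whole sequence of EEG feature vectors by maximum log-likelihood, so temporal context influences the decision.

```python
# Minimal sketch: one Gaussian HMM per class over EEG feature-vector
# sequences. The class names and data layout are illustrative.
import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_class_models(seizure_seqs, background_seqs, n_states=3):
    """Fit one HMM per class on lists of (n_frames, n_features) arrays."""
    models = {}
    for label, seqs in (("seizure", seizure_seqs), ("background", background_seqs)):
        X = np.vstack(seqs)               # stack all training sequences
        lengths = [len(s) for s in seqs]  # per-sequence frame counts
        m = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        m.fit(X, lengths)
        models[label] = m
    return models

def classify_sequence(models, seq):
    """Label a whole feature-vector sequence by maximum log-likelihood."""
    return max(models, key=lambda label: models[label].score(seq))
```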

Relevance:

30.00%

Publisher:

Abstract:

Over the past few years, logging has evolved from simple printf statements to more complex and widely used logging libraries. Today, logging information is used to support various development activities such as fixing bugs, analyzing the results of load tests, monitoring performance and transferring knowledge. Recent research has examined how to improve logging practices by informing developers what to log and where to log. Furthermore, the strong dependence on logging has led to the development of logging libraries that have reduced the intricacies of logging, which has resulted in an abundance of log information. Two recent challenges have emerged as modern software systems start to treat logging as a core aspect of their software: 1) infrastructural challenges, due to the plethora of logging libraries available today, and 2) processing challenges, due to the large number of log processing tools that ingest logs and produce useful information from them. In this thesis, we explore these two challenges. We first explore the infrastructural challenges that arise from the plethora of logging libraries available today. As systems evolve, their logging infrastructure has to evolve with them (commonly by migrating to new logging libraries). We explore logging library migrations within Apache Software Foundation (ASF) projects and find that close to 14% of the projects within the ASF migrate their logging libraries at least once. For processing challenges, we explore the different factors that can affect the likelihood of a logging statement changing in the future in four open source systems, namely ActiveMQ, Camel, Cloudstack and Liferay. Such changes are likely to negatively impact the log processing tools that must be updated to accommodate them. We find that 20%-45% of the logging statements within the four systems are changed at least once. We construct random forest classifiers and Cox models to determine the likelihood of both just-introduced and long-lived logging statements changing in the future. We find that file ownership, developer experience, log density and SLOC are important factors in determining the stability of logging statements.
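A minimal sketch of the classifier side of this analysis, assuming a hypothetical extracted dataset with the feature names the abstract lists (not the thesis' exact pipeline):

```python
# Minimal sketch: random forest predicting whether a logging statement
# will change. The CSV file and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("logging_statements.csv")  # hypothetical mined dataset
features = ["file_ownership", "developer_experience", "log_density", "sloc"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["changed"], test_size=0.2, random_state=42)

clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(X_train, y_train)
print("accuracy:", clf.score(X_test, y_test))
# Feature importances indicate which factors drive statement stability.
print(dict(zip(features, clf.feature_importances_)))
```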

Relevance:

30.00%

Publisher:

Abstract:

Recreational fisheries in North America are valued between $47.3 billion and $56.8 billion. Fisheries managers must make strategic decisions based on sound science and knowledge of population ecology to effectively conserve populations. Competitive fishing, in the form of tournaments, has become an important part of recreational fisheries and is common on large waterbodies, including the Great Lakes. Black Bass, Micropterus spp., are top predators and among the most sought-after species in competitive catch-and-release tournaments. This study investigated catch-and-release tournaments as an assessment tool, using mark-recapture, for Largemouth Bass (>305 mm) populations in the Tri Lakes and the Bay of Quinte, part of the eastern basin of Lake Ontario. The population in the Tri Lakes (1999-2002) was estimated to be stable at 21,928-29,780 fish, and the population in the Bay of Quinte (2012-2015) was estimated at 31,825-54,029 fish. Survival in the Tri Lakes varied throughout the study period, from 31%-54%, while survival in the Bay of Quinte remained stable at 63%. Differences in survival may be due to differences in fishing pressure, as 34-46% of the Largemouth Bass population on the Tri Lakes is harvested annually and only 19% of catch was attributed to tournament angling. Many biological issues still surround catch-and-release tournaments, particularly concerning displacement from initial capture sites. Most past studies focused on small inland lakes and coastal areas, displacing bass relatively short distances. My study displaced Largemouth and Smallmouth Bass up to 100 km and found very low rates of return; only 1 of 18 Largemouth Bass returned 15 km and 1 of 18 Smallmouth Bass returned 135 km. Both species remained near the release sites for an average of approximately 2 weeks prior to dispersing. Tournament organizers should consider the use of satellite release locations to facilitate dispersal and prevent stockpiling at the release site. Catch-and-release tournaments proved to be a valuable tool for assessing population variables and the effects of long-distance displacement through the use of mark-recapture and acoustic telemetry on large lake systems.
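The abstract does not state which mark-recapture estimator was used; as an illustration, a standard closed-population estimator (Chapman's modification of Lincoln-Petersen) looks like this:

```python
# Minimal sketch of a standard mark-recapture abundance estimator
# (Chapman's modification of Lincoln-Petersen); illustrative only.
def chapman_estimate(marked, caught, recaptured):
    """Chapman estimator: N ~ (M + 1)(C + 1) / (R + 1) - 1."""
    return (marked + 1) * (caught + 1) / (recaptured + 1) - 1

# Made-up numbers: 500 bass marked at tournaments, 400 examined later,
# 6 of them carrying marks.
print(round(chapman_estimate(500, 400, 6)))  # point estimate of abundance
```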

Relevance:

30.00%

Publisher:

Abstract:

Model Driven Engineering uses the principle that code can be generated automatically from software models, potentially saving development time and cost. With this methodology, a system's structure and behaviour can be expressed in more abstract, high-level terms, without some of the accidental complexity that the use of a general-purpose language can bring. Models are the actual implementation of the system, unlike in traditional software development, where models are often used for documentation purposes only. However, once the code is generated from the model, testing and debugging activities tend to happen at the code level, and the model is not updated. We believe that monitoring at the model level could facilitate quality assurance activities, as errors are detected in an early phase of development. In this thesis, we create a monitoring configuration for PapyrusRT, an open source model driven engineering tool in Eclipse. We support the run-time monitoring of UML-RT elements with a tracing tool called LTTng. We annotate the model with monitoring information that the code generator uses to add tracepoint statements for the corresponding elements. We provide the option of a timing specification to discover latency errors on the model. We validate the results by creating and tracing real-time models in PapyrusRT.
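A minimal sketch of the latency check such a timing specification enables, assuming a simplified (timestamp, event, message id) record layout rather than PapyrusRT's actual generated tracepoints; real LTTng traces would be read with a tool such as Babeltrace:

```python
# Minimal sketch: verify a latency bound between two trace events, e.g.
# a message send and the matching receive. Event names and the record
# layout are assumptions for illustration.
def check_latency(events, start_event, end_event, max_latency_ns):
    """Yield message ids whose start-to-end latency exceeds the bound."""
    starts = {}
    for ts, name, msg_id in sorted(events):
        if name == start_event:
            starts[msg_id] = ts
        elif name == end_event and msg_id in starts:
            if ts - starts.pop(msg_id) > max_latency_ns:
                yield msg_id

trace = [(100, "send", 1), (150, "recv", 1), (200, "send", 2), (900, "recv", 2)]
print(list(check_latency(trace, "send", "recv", 500)))  # -> [2]
```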

Relevance:

30.00%

Publisher:

Abstract:

Using the NEODAAS-Dundee AVHRR receiving station (Scotland), NEODAAS-Plymouth can provide calibrated brightness temperature data to end users or interim users in near-real time. Between 2000 and 2009 these data were used for volcano hot spot detection, reporting and time-averaged discharge rate dissemination during effusive crises at Mount Etna and Stromboli (Italy). Data were passed via FTP, within an hour of image generation, to the hot spot detection system maintained at the Hawaii Institute of Geophysics and Planetology (HIGP, University of Hawaii at Manoa, Honolulu, USA). Final product generation and quality control were completed manually at HIGP once a day, to provide information to onsite monitoring agencies for incorporation into their daily reporting duties to Italian Civil Protection. Here we describe the processing and dissemination chain, which was designed to provide timely, usable, quality-controlled and relevant information for ‘one voice’ reporting by the responsible monitoring agencies.
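A minimal sketch of one common AVHRR hot-spot test (not necessarily the HIGP system's exact algorithm): flag pixels whose mid-infrared (band 3, ~3.9 µm) brightness temperature exceeds the thermal-infrared (band 4, ~11 µm) value by more than a threshold.

```python
# Minimal sketch: band 3 minus band 4 brightness-temperature test for
# volcanic hot-spot pixels. The threshold value is illustrative.
import numpy as np

def detect_hotspots(bt3, bt4, delta_threshold_k=10.0):
    """Return a boolean mask of candidate hot-spot pixels.

    bt3, bt4: 2-D arrays of calibrated brightness temperatures in kelvin.
    """
    return (bt3 - bt4) > delta_threshold_k

bt3 = np.array([[290.0, 315.0], [291.0, 289.0]])
bt4 = np.array([[289.0, 295.0], [290.0, 288.5]])
print(detect_hotspots(bt3, bt4))  # only the anomalously hot pixel is flagged
```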

Relevance:

30.00%

Publisher:

Abstract:

Evaluating the performance of ocean-colour retrievals of total chlorophyll-a concentration requires direct comparison with concomitant and co-located in situ data. For global comparisons, these in situ match-ups should ideally be representative of the distribution of total chlorophyll-a concentration in the global ocean. The oligotrophic gyres constitute the majority of oceanic water, yet are under-sampled due to their inaccessibility and are under-represented in global in situ databases. The Atlantic Meridional Transect (AMT) is one of only a few programmes that consistently sample oligotrophic waters. In this paper, we used a spectrophotometer on two AMT cruises (AMT19 and AMT22) to continuously measure absorption by particles in the water of the ship's flow-through system. From these optical data, continuous total chlorophyll-a concentrations were estimated with high precision and accuracy along each cruise and used to evaluate the performance of ocean-colour algorithms. We conducted the evaluation using level 3 binned ocean-colour products, and used the high spatial and temporal resolution of the underway system to maximise the number of match-ups on each cruise. Statistical comparisons show a significant improvement in the performance of satellite chlorophyll algorithms over previous studies, with root mean square errors on average less than half (~ 0.16 in log10 space) those reported previously using global datasets (~ 0.34 in log10 space). This improved performance is likely due to the use of continuous absorption-based chlorophyll estimates, which are highly accurate, sample spatial scales more comparable with satellite pixels, and minimise human error. Previous comparisons might have reported higher errors due to regional biases in datasets and methodological inconsistencies between investigators. Furthermore, our comparison showed an underestimate in satellite chlorophyll at low concentrations in 2012 (AMT22), likely due to a small bias in satellite remote-sensing reflectance data. Our results highlight the benefits of using underway spectrophotometric systems for evaluating satellite ocean-colour data and underline the importance of maintaining in situ observatories that sample the oligotrophic gyres.
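A minimal sketch of the evaluation statistic quoted above, root mean square error computed in log10 space as is conventional for chlorophyll-a match-ups (example values are made up):

```python
# Minimal sketch: log10-space RMSE between satellite and in situ
# chlorophyll-a concentrations.
import numpy as np

def rmse_log10(satellite_chl, in_situ_chl):
    """RMSE of log10 chlorophyll-a (inputs in mg m^-3, same shape)."""
    d = np.log10(np.asarray(satellite_chl)) - np.log10(np.asarray(in_situ_chl))
    return float(np.sqrt(np.mean(d ** 2)))

# Illustrative match-up values only:
print(rmse_log10([0.08, 0.12, 1.5], [0.10, 0.11, 1.2]))
```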

Relevance:

30.00%

Publisher:

Abstract:

Over the last decade, ocean sunfish movements have been monitored worldwide using various satellite tracking methods. This study reports the near-real-time monitoring of fine-scale (< 10 m) sunfish behaviour. The study was conducted in southern Portugal in May 2014 and used satellite tags together with underwater and surface robotic vehicles to measure both the movements and the contextual environment of the fish. A total of four individuals were tracked using custom-made GPS satellite tags providing geolocation estimates at fine-scale resolution. These accurate positions further informed sunfish areas of restricted search (ARS), which were directly correlated with steep thermal frontal zones. Simultaneously, on two occasions, an Autonomous Underwater Vehicle (AUV) video-recorded the path of the tracked fish and detected buoyant particles in the water column. Importantly, the densities of these particles were also directly correlated with steep thermal gradients. Thus, both sunfish foraging behaviour (ARS) and possibly prey densities were found to be influenced by analogous environmental conditions. In addition, the dynamic structure of the water transited by the tracked individuals was described by a Lagrangian modelling approach. The model informed the distribution of zooplankton in the region, both horizontally and in the water column, and the resultant simulated densities correlated positively with the sunfish ARS behaviour estimator (rs = 0.184, p < 0.001). The model also revealed that tracked fish displace opportunistically with respect to subsurface current flow. Thus, we show how physical forcing and current structure provide a rationale for a predator’s fine-scale behaviour observed over two weeks in May 2014.
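The reported rs suggests Spearman's rank correlation; a minimal sketch of that comparison, with illustrative arrays standing in for the matched model densities and ARS estimator values:

```python
# Minimal sketch: Spearman's rank correlation between simulated
# zooplankton density and an ARS behaviour estimator at matched points.
from scipy.stats import spearmanr

simulated_zooplankton = [0.2, 0.9, 0.4, 1.3, 0.7, 0.1]  # illustrative
ars_estimator = [0.1, 0.8, 0.5, 0.9, 0.4, 0.2]          # illustrative

r_s, p_value = spearmanr(simulated_zooplankton, ars_estimator)
print(f"r_s = {r_s:.3f}, p = {p_value:.3f}")
```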

Relevance:

30.00%

Publisher:

Abstract:

This paper is an overview of the development and application of computer vision for the Structural Health Monitoring (SHM) of bridges. A brief explanation of SHM is provided, followed by a breakdown of the stages of computer vision techniques, separated into laboratory and field trials. Qualitative evaluations and comparisons of these methods are provided, along with proposed guidelines for new vision-based SHM systems.
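As an illustration of one common vision-based displacement-measurement building block (an example of ours, not taken from the paper), normalised cross-correlation template matching in OpenCV:

```python
# Minimal sketch: track a bridge target's position across video frames
# with template matching; displacement is the change in position.
import cv2

def track_displacement(frames, template):
    """Return per-frame (x, y) positions of the best template match.

    frames: iterable of BGR images; template: grayscale patch of the target.
    """
    positions = []
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        result = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
        _, _, _, max_loc = cv2.minMaxLoc(result)
        positions.append(max_loc)
    return positions
# Vertical displacement in pixels is the change in y between frames; a
# camera calibration step converts pixels to millimetres.
```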

Relevance:

30.00%

Publisher:

Abstract:

Network security monitoring remains a challenge. As global networks scale up in terms of traffic volume and speed, effective attribution of cyber attacks is increasingly difficult. The problem is compounded by a combination of other factors, including the architecture of the Internet, multi-stage attacks and increasing volumes of non-productive traffic. This paper proposes to shift the focus of security monitoring from the source to the target. Simply put, resources devoted to detection and attribution should be redeployed to efficiently monitor for targeting and prevention of attacks. The effort of detection should aim to determine whether a node is under attack and, if so, effectively prevent the attack. This paper contributes by systematically reviewing the structural, operational and legal reasons underlying this argument, and presents empirical evidence to support a shift away from attribution in favour of a target-centric monitoring approach. A carefully deployed set of experiments is presented and a detailed analysis of the results is provided.
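A minimal sketch of the target-centric idea, illustrative rather than the paper's experimental setup: aggregate flows per destination and flag nodes whose inbound fan-in jumps well above their own baseline.

```python
# Minimal sketch: "is this node being targeted?" rather than
# "who is attacking?". Thresholds and baselines are illustrative.
from collections import defaultdict

def targeted_nodes(flows, baseline, factor=5.0):
    """flows: iterable of (src, dst) pairs for one monitoring window.
    baseline: dict of dst -> typical distinct-source count."""
    fan_in = defaultdict(set)
    for src, dst in flows:
        fan_in[dst].add(src)
    return [dst for dst, srcs in fan_in.items()
            if len(srcs) > factor * baseline.get(dst, 1.0)]

flows = [("a", "web"), ("b", "web"), ("c", "web"), ("d", "web"), ("a", "db")]
print(targeted_nodes(flows, baseline={"web": 0.5, "db": 2.0}))  # -> ['web']
```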

Relevance:

30.00%

Publisher:

Abstract:

Stealthy attackers move patiently through computer networks, taking days, weeks or months to accomplish their objectives in order to avoid detection. As networks scale up in size and speed, monitoring for such attack attempts is increasingly a challenge. This paper presents an efficient monitoring technique for stealthy attacks. It investigates the feasibility of the proposed method under a number of different test cases and examines how the design of the network affects detection. A methodological way of tracing anonymous stealthy activities to their approximate sources is also presented. Bayesian fusion along with traffic sampling is employed as a data reduction method. The proposed method is able to monitor stealthy activities using sampling rates of 10-20% without degrading the quality of detection.
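A minimal sketch of Bayesian evidence fusion over sampled traffic (the abstract does not give the paper's exact model): each monitoring window yields a binary alert from the 10-20% packet sample, and a posterior that the node is under a slow, stealthy attack is updated sequentially.

```python
# Minimal sketch: sequential Bayes update from per-window alerts.
# Prior and likelihoods are illustrative placeholders.
def fuse(alerts, prior=0.01, p_alert_attack=0.6, p_alert_benign=0.05):
    """Return posterior P(attack) after fusing binary alert observations."""
    posterior = prior
    for alert in alerts:
        like_a = p_alert_attack if alert else 1 - p_alert_attack
        like_b = p_alert_benign if alert else 1 - p_alert_benign
        num = like_a * posterior
        posterior = num / (num + like_b * (1 - posterior))
    return posterior

# Sparse alerts spread over many windows still drive the posterior up:
print(fuse([1, 0, 0, 1, 0, 1, 0, 0, 1]))
```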

Relevance:

30.00%

Publisher:

Abstract:

The BlackEnergy malware targeting critical infrastructures has a long history. It evolved over time from a simple DDoS platform into quite sophisticated plug-in-based malware. The plug-in architecture has a persistent malware core with easily installable attack-specific modules for DDoS, spamming, info-stealing, remote access, boot-sector formatting, etc. BlackEnergy has been involved in several high-profile cyber-physical attacks, including the recent Ukraine power grid attack in December 2015. This paper investigates the evolution of BlackEnergy and its cyber attack capabilities. It presents a basic cyber attack model used by BlackEnergy for targeting industrial control systems. In particular, the paper analyzes the cyber threats of BlackEnergy for synchrophasor-based systems, which are used for real-time control and monitoring functionalities in the smart grid. Several BlackEnergy-based attack scenarios have been investigated by exploiting the vulnerabilities in two widely used synchrophasor communication standards: (i) IEEE C37.118 and (ii) IEC 61850-90-5. Specifically, the paper addresses reconnaissance, DDoS, man-in-the-middle and replay/reflection attacks on IEEE C37.118 and IEC 61850-90-5. Further, the paper investigates protection strategies for the detection and prevention of BlackEnergy-based cyber-physical attacks.
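As an illustration of one detection idea for the replay attacks mentioned above (ours, not the paper's scheme): parse the IEEE C37.118 frame header and flag frames whose SOC/FRACSEC timestamps fail to increase per device, since a replayed frame's timestamp repeats or moves backwards.

```python
# Minimal sketch: per-device timestamp monotonicity check on IEEE C37.118
# frames. Header layout assumed per the standard: SYNC, FRAMESIZE, IDCODE
# (2 bytes each), then SOC and FRACSEC (4 bytes each), big-endian.
import struct

def parse_header(frame):
    sync, size, idcode, soc, fracsec = struct.unpack(">HHHII", frame[:14])
    # Low 3 bytes of FRACSEC carry the fraction of second.
    return idcode, soc, fracsec & 0x00FFFFFF

def replay_suspects(frames):
    """Return indices of frames whose timestamp did not advance."""
    last = {}
    suspects = []
    for i, frame in enumerate(frames):
        idcode, soc, frac = parse_header(frame)
        if idcode in last and last[idcode] >= (soc, frac):
            suspects.append(i)
        last[idcode] = (soc, frac)
    return suspects
```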

Relevance:

30.00%

Publisher:

Abstract:

There have been over 3000 bridge weigh-in-motion (B-WIM) installations in 25 countries worldwide, which has led to vast improvements in the post-processing of B-WIM systems since their introduction in the 1970s. This paper introduces a new low-power B-WIM system using fibre optic sensors (FOS). The system consisted of a series of FOS attached to the soffit of an existing integral bridge with a single span of 19 m. The site selection criteria and the full installation process are detailed in the paper. A method of calibration using live traffic at the bridge site was adopted, and based on this calibration the accuracy of the system was determined.
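B-WIM post-processing commonly builds on Moses' algorithm, which recovers axle weights by least-squares fitting of the measured strain record to shifted copies of a calibrated influence line; a minimal sketch under that assumption (the paper's exact method is not given in the abstract):

```python
# Minimal sketch of the classical B-WIM weight calculation: build a design
# matrix of per-axle shifted influence lines and solve by least squares.
import numpy as np

def axle_weights(strain, influence_line, axle_offsets):
    """strain: measured bridge response over time (1-D array).
    influence_line: calibrated response to a unit axle load.
    axle_offsets: per-axle sample delays, from vehicle speed and spacing.
    Returns one estimated weight per axle."""
    strain = np.asarray(strain, dtype=float)
    influence_line = np.asarray(influence_line, dtype=float)
    n = len(strain)
    A = np.zeros((n, len(axle_offsets)))
    for j, off in enumerate(axle_offsets):
        end = min(n, off + len(influence_line))
        A[off:end, j] = influence_line[:end - off]  # shifted copy per axle
    weights, *_ = np.linalg.lstsq(A, strain, rcond=None)
    return weights
```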