990 results for Cloud Detection
Abstract:
We report the first in situ measurements of neutral deuterium originating in the local interstellar medium (LISM) in Earth’s orbit. These measurements were performed with the IBEX-Lo camera on NASA’s Interstellar Boundary Explorer (IBEX) satellite. All data from the spring observation periods of 2009 through 2011 have been analysed. In the three years of the IBEX mission time, the observation geometry and orbit allowed for a total observation time of 115.3 days for the LISM. However, because of the spinning of the spacecraft and the stepping through eight energy channels, the interstellar wind was only observed for a total time of 1.44 days, in which 2 counts of interstellar deuterium were collected. We report a conservative number here because the possibility of systematic error or additional noise, though eliminated in our analysis to the best of our knowledge, means the detection is supported only at the 1-sigma level. From these observations, we derive a ratio D/H = (5.8 ± 4.4) × 10^-4 at 1 AU. After modelling the transport and loss of D and H from the termination shock to Earth’s orbit, we find that our result of D/H_LISM = (1.6 ± 1.2) × 10^-5 agrees with D/H_LIC = (1.6 ± 0.4) × 10^-5 for the local interstellar cloud. This weak interstellar signal is extracted from a strong terrestrial background signal consisting of sputter products from the sensor’s conversion surface. As a reference, we accurately measure the terrestrial D/H ratio in these sputtered products and then discriminate against this terrestrial background source. Because the D and H signal at Earth’s orbit diminishes with rising solar activity, owing to photoionisation losses and increased photon pressure, our result demonstrates that in situ measurements of interstellar deuterium in the inner heliosphere are only possible during solar minimum conditions.
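As an order-of-magnitude check (an illustration added here, not part of the paper's analysis), the Poisson counting uncertainty of a 2-count detection already accounts for most of the quoted relative error on the 1 AU ratio:

$$ \frac{\sigma_{N_D}}{N_D} = \frac{1}{\sqrt{N_D}} = \frac{1}{\sqrt{2}} \approx 0.71, \qquad \frac{4.4}{5.8} \approx 0.76. $$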
Abstract:
The spatial and temporal patterns of fog and low clouds along the South-Western African coast are characterized based on an evaluation of Meteosat SEVIRI satellite data. A technique for the detection of fog/low clouds in the region is introduced, and validated using 1 year of CALIOP cloud lidar products, showing reliable performance. The frequency of fog and low cloud in the study area is analyzed by systematic application of the technique to all available Meteosat SEVIRI scenes from 2004 to 2009, for 7:00 UTC and 14:00 UTC. The highest frequencies are encountered in the area around Walvis Bay, with a peak in the summer months. Fog and low clouds clear by 14:00 UTC almost everywhere over land.
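The detection scheme itself is not detailed in this abstract; the minimal Python/NumPy sketch below (hypothetical array names) only illustrates the two aggregation steps mentioned above: building per-pixel frequency-of-occurrence maps from per-scene detection masks, and scoring detections against collocated reference labels such as the CALIOP products.

```python
import numpy as np

def frequency_map(masks):
    """Fraction of scenes with fog/low cloud per pixel.

    masks : (n_scenes, rows, cols) boolean array, one SEVIRI scene per slice,
            True where the detection scheme flagged fog/low cloud.
    """
    return np.asarray(masks, dtype=float).mean(axis=0)

def validation_scores(detected, reference):
    """Simple skill scores against collocated reference labels (e.g. CALIOP).

    detected, reference : 1-D boolean arrays over collocated cases.
    Returns probability of detection and false-alarm ratio.
    """
    hits = np.sum(detected & reference)
    misses = np.sum(~detected & reference)
    false_alarms = np.sum(detected & ~reference)
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return pod, far

# e.g. separate frequency maps for the 07:00 UTC and 14:00 UTC scenes:
# freq_07 = frequency_map(masks_07utc)
# freq_14 = frequency_map(masks_14utc)
```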
Abstract:
This paper assesses the along-strike variation of active bedrock fault scarps using long-range terrestrial laser scanning (t-LiDAR) data in order to determine the distribution behaviour of scarp height and to subsequently calculate long-term throw-rates. Five faults on Crete which display spectacular limestone fault scarps have been studied using high-resolution digital elevation model (HRDEM) data. We scanned several hundred square metres of the fault system, including the footwall, fault scarp and hanging wall of the investigated fault segment. The vertical displacement and the dip of the scarp were extracted every metre along the strike of the detected fault segment based on the processed HRDEM. The scarp variability was analysed using statistical and morphological methods. The analysis was done in a geographical information system (GIS) environment. Results show a normal distribution for the scanned fault scarp's vertical displacement. Based on this, the mean value of height was chosen to define the authentic vertical displacement. Consequently, the scarp can be divided into sections above, below and within the range of the mean (within one standard deviation), and the modifications of vertical displacement can be quantified. Therefore, the fault segment can be subdivided into areas which are influenced by external modification such as erosion and sedimentation processes. Moreover, to describe and measure the variability of vertical displacement along the strike of the fault, the semi-variance was calculated with the variogram method. This method is used to determine how much influence the external processes have had on the vertical displacement. By combining the morphological and statistical results, the fault can be subdivided into areas with high external influences and areas with authentic fault scarps, which have little or no external influence. This subdivision is necessary for long-term throw-rate calculations, because without this differentiation the calculated rates would be misleading and the activity of a fault would be incorrectly assessed, with significant implications for seismic hazard assessment, since fault slip-rate data govern earthquake recurrence. Furthermore, by using this workflow, areas with minimal external influences can be determined, not only for throw-rate calculations, but also for selecting sample sites for absolute dating techniques such as cosmogenic nuclide dating. The main outcomes of this study include: i) there is no direct correlation between the fault's mean vertical displacement and dip (R² less than 0.31); ii) without subdividing the scanned scarp into areas with differing amounts of external influence, the along-strike variability of vertical displacement is ±35%; iii) when the scanned scarp is subdivided, the variation of the vertical displacement of the authentic scarp (exposed by earthquakes only) is in a range of ±6% (this varies from fault to fault between 7 and 12%); iv) the calculated long-term throw-rates (since 13 ka) for four scarps in Crete, using the authentic vertical displacement, are 0.35 ± 0.04 mm/yr at Kastelli 1, 0.31 ± 0.01 mm/yr at Kastelli 2, 0.85 ± 0.06 mm/yr at the Asomatos fault (Sellia) and 0.55 ± 0.05 mm/yr at the Lastros fault.
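As an illustration of the statistical workflow described above (a sketch under assumed inputs, not the authors' code), the Python snippet below classifies per-metre scarp heights relative to one standard deviation of the mean and converts the mean authentic displacement into a long-term throw-rate; for instance, a mean displacement of about 4.5 m accumulated over 13 ka gives roughly 0.35 mm/yr, the order of the Kastelli 1 rate quoted above.

```python
import numpy as np

def classify_scarp(vertical_displacement, age_ka=13.0):
    """Illustrative post-processing of along-strike scarp measurements.

    vertical_displacement : 1-D array of scarp heights (m), one value per
                            metre along strike (extracted from the HRDEM).
    age_ka                : assumed scarp age in thousands of years (13 ka here).
    Returns the per-metre class (-1 below, 0 within, +1 above one standard
    deviation of the mean) and the long-term throw-rate in mm/yr.
    """
    v = np.asarray(vertical_displacement, dtype=float)
    mean, std = v.mean(), v.std()
    classes = np.where(v > mean + std, 1, np.where(v < mean - std, -1, 0))
    throw_rate_mm_per_yr = mean * 1000.0 / (age_ka * 1000.0)  # m -> mm, ka -> yr
    return classes, throw_rate_mm_per_yr
```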
Abstract:
The popularity of the MapReduce programming model has increased interest in the research community in its improvement. Among other directions, fault tolerance, and concretely the failure-detection issue, seems to be a crucial one, but it has not yet reached a satisfactory level. Motivated by this, I decided to devote my main research during this period to a prototype system architecture of the MapReduce framework with a new failure-detection service, comprising both an analytical (theoretical) part and an implementation part. I am confident that this work should lead the way for further contributions on failure detection in NoSQL application frameworks and in cloud storage systems in general.
Abstract:
In this paper we present an innovative technique to tackle the problem of automatic road sign detection and tracking using an on-board stereo camera. It involves a continuous 3D analysis of the road sign during the whole tracking process. Firstly, a color- and appearance-based model is applied to generate road sign candidates in both stereo images. A sparse disparity map between the left and right images is then created for each candidate by using contour-based and SURF-based matching in the far and short range, respectively. Once the map has been computed, the correspondences are back-projected to generate a cloud of 3D points, and the best-fit plane is computed through RANSAC, ensuring robustness to outliers. Temporal consistency is enforced by means of a Kalman filter, which exploits the intrinsic smoothness of the 3D camera motion in traffic environments. Additionally, the estimation of the plane makes it possible to correct deformations due to perspective, thus easing further sign classification.
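The plane-fitting step can be sketched as a standard RANSAC plane fit (an illustrative Python/NumPy version with placeholder tolerance and iteration values, not the authors' implementation):

```python
import numpy as np

def ransac_plane(points, n_iters=200, inlier_tol=0.05, seed=0):
    """Illustrative RANSAC plane fit to a cloud of back-projected 3D points.

    points     : (N, 3) array of 3D points from stereo correspondences.
    inlier_tol : maximum point-to-plane distance (same units as the points).
    Returns (normal, d) for the plane n.x + d = 0 and the inlier mask.
    """
    rng = np.random.default_rng(seed)
    best_inliers, best_plane = None, None
    for _ in range(n_iters):
        i, j, k = rng.choice(len(points), size=3, replace=False)
        # plane through three randomly sampled points
        n = np.cross(points[j] - points[i], points[k] - points[i])
        norm = np.linalg.norm(n)
        if norm < 1e-9:                 # degenerate (collinear) sample
            continue
        n = n / norm
        d = -np.dot(n, points[i])
        dist = np.abs(points @ n + d)   # point-to-plane distances
        inliers = dist < inlier_tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (n, d)
    return best_plane, best_inliers
```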
Abstract:
In this paper we propose an innovative method for the automatic detection and tracking of road traffic signs using an onboard stereo camera. It involves a combination of monocular and stereo analysis strategies to increase the reliability of the detections, such that it can boost the performance of any traffic sign recognition scheme. Firstly, an adaptive color- and appearance-based detection is applied at single-camera level to generate a set of traffic sign hypotheses. In turn, stereo information allows for sparse 3D reconstruction of potential traffic signs through a SURF-based matching strategy. Namely, the plane that best fits the cloud of 3D points traced back from feature matches is estimated using a RANSAC-based approach to improve robustness to outliers. Temporal consistency of the 3D information is ensured through a Kalman-based tracking stage. This also allows for the generation of a predicted 3D traffic sign model, which is in turn used to enhance the previously mentioned color-based detector through a feedback loop, thus improving detection accuracy. The proposed solution has been tested with real sequences under several illumination conditions and in both urban areas and highways, achieving very high detection rates in challenging environments, including rapid motion and significant perspective distortion.
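The temporal-consistency stage can be sketched as a linear Kalman filter; the version below is illustrative only (a constant-velocity model and placeholder noise levels that the abstract does not specify) and tracks the 3D centre of a detected sign, with the predict step providing the 3D prediction that would feed the color-based detector through the feedback loop.

```python
import numpy as np

class SignTracker3D:
    """Illustrative constant-velocity Kalman filter for a sign's 3D centre.

    State: [x, y, z, vx, vy, vz]; measurement: the 3D centre estimated from
    the fitted plane at each frame. Noise levels are placeholder values.
    """
    def __init__(self, dt=1 / 25, q=0.5, r=0.1):
        self.x = np.zeros(6)
        self.P = np.eye(6)
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)                 # position += velocity * dt
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])
        self.Q = q * np.eye(6)                          # process noise
        self.R = r * np.eye(3)                          # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:3]                               # predicted centre for the feedback loop

    def update(self, z):
        y = z - self.H @ self.x                         # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)        # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
```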
Abstract:
3D sensors provide valuable information for mobile robotic tasks like scene classification or object recognition, but these sensors often produce noisy data that makes it impossible to apply classical keypoint detection and feature extraction techniques directly. Therefore, noise removal and downsampling have become essential steps in 3D data processing. In this work, we propose the use of a 3D filtering and downsampling technique based on a Growing Neural Gas (GNG) network. The GNG method is able to deal with outliers present in the input data. These features allow 3D spaces to be represented, obtaining an induced Delaunay triangulation of the input space. Experiments show how state-of-the-art keypoint detectors improve their performance when the GNG output representation is used as input data. Descriptors extracted at the improved keypoints achieve better matching in robotics applications such as 3D scene registration.
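For concreteness, a minimal version of the GNG loop (after Fritzke's algorithm) is sketched below in Python/NumPy; it is an illustration with placeholder parameters rather than the authors' implementation, and it omits some housekeeping such as removing isolated nodes.

```python
import numpy as np

def growing_neural_gas(points, max_nodes=100, n_iter=20000,
                       eps_b=0.05, eps_n=0.006, age_max=50,
                       lam=100, alpha=0.5, d=0.995, seed=0):
    """Minimal Growing Neural Gas for point-cloud downsampling (illustrative).

    points : (N, 3) float array of (possibly noisy) 3D samples.
    Returns the node positions and the edge set (the induced topology).
    """
    rng = np.random.default_rng(seed)
    # start with two nodes placed on random input samples
    nodes = [points[rng.integers(len(points))].astype(float),
             points[rng.integers(len(points))].astype(float)]
    error = [0.0, 0.0]
    edges = {}                       # frozenset({i, j}) -> age

    def neighbors(i):
        return [j for e in edges for j in e if i in e and j != i]

    for step in range(1, n_iter + 1):
        x = points[rng.integers(len(points))]
        # find the two nearest nodes
        dists = [np.sum((x - w) ** 2) for w in nodes]
        s1, s2 = np.argsort(dists)[:2]
        # age the winner's edges and accumulate its error
        for e in list(edges):
            if s1 in e:
                edges[e] += 1
        error[s1] += dists[s1]
        # move the winner and its topological neighbors toward the sample
        nodes[s1] += eps_b * (x - nodes[s1])
        for j in neighbors(s1):
            nodes[j] += eps_n * (x - nodes[j])
        # refresh / create the edge between the two winners, drop old edges
        edges[frozenset((s1, s2))] = 0
        edges = {e: a for e, a in edges.items() if a <= age_max}
        # periodically insert a node where the accumulated error is largest
        if step % lam == 0 and len(nodes) < max_nodes:
            q = int(np.argmax(error))
            nbrs = neighbors(q)
            if nbrs:
                f = max(nbrs, key=lambda j: error[j])
                r = len(nodes)
                nodes.append(0.5 * (nodes[q] + nodes[f]))
                error[q] *= alpha
                error[f] *= alpha
                error.append(error[q])
                del edges[frozenset((q, f))]
                edges[frozenset((q, r))] = 0
                edges[frozenset((f, r))] = 0
        error = [e * d for e in error]   # global error decay

    return np.asarray(nodes), list(edges)
```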
Abstract:
Recent advances in airborne Light Detection and Ranging (LIDAR) technology allow rapid and inexpensive measurements of topography over large areas. Airborne LIDAR systems usually return a 3-dimensional cloud of point measurements from reflective objects scanned by the laser beneath the flight path. This technology is becoming a primary method for extracting information about different kinds of geometrical objects, such as high-resolution digital terrain models (DTMs), buildings and trees. In the past decade, LIDAR has attracted more and more interest from researchers in the fields of remote sensing and GIS. Compared to traditional data sources, such as aerial photography and satellite images, LIDAR measurements are not influenced by sun shadow and relief displacement. However, the voluminous data pose a new challenge for automated extraction of geometrical information from LIDAR measurements, because many raster image processing techniques cannot be directly applied to irregularly spaced LIDAR points. In this dissertation, a framework is proposed to extract information about different kinds of geometrical objects, such as terrain and buildings, from LIDAR automatically. These objects are essential to numerous applications such as flood modeling, landslide prediction and hurricane animation. The framework consists of several intuitive algorithms. Firstly, a progressive morphological filter was developed to detect non-ground LIDAR measurements. By gradually increasing the window size and elevation difference threshold of the filter, the measurements of vehicles, vegetation, and buildings are removed, while ground data are preserved. Then, building measurements are identified from the non-ground measurements using a region-growing algorithm based on a plane-fitting technique. Raw footprints for the segmented building measurements are derived by connecting boundary points and are further simplified and adjusted by several proposed operations to remove noise, which is caused by irregularly spaced LIDAR measurements. To reconstruct 3D building models, the raw 2D topology of each building is first extracted and then further adjusted. Since the adjusting operations for simple building models do not work well on 2D topology, a 2D snake algorithm is proposed to adjust the 2D topology. The 2D snake algorithm consists of newly defined energy functions for topology adjustment and a linear algorithm to find the minimal energy value of 2D snake problems. Data sets from urbanized areas including large institutional, commercial, and small residential buildings were employed to test the proposed framework. The results demonstrated that the proposed framework achieves very good performance.
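The ground/non-ground separation step can be sketched as follows; this is a simplified illustration that operates on an already gridded minimum-elevation surface rather than on the raw points, with placeholder parameter values.

```python
import numpy as np
from scipy import ndimage

def progressive_morphological_filter(z, cell=1.0, max_window=20,
                                     slope=0.3, dh0=0.3, dh_max=3.0):
    """Illustrative progressive morphological filter on a gridded surface.

    z          : 2-D array of minimum elevations per grid cell (a simple DSM).
    cell       : grid cell size in metres.
    slope, dh0 : assumed terrain slope and initial elevation-difference threshold
                 used to grow the threshold with the window size.
    Returns a boolean mask of cells classified as non-ground.
    """
    nonground = np.zeros(z.shape, dtype=bool)
    surface = z.copy()
    window = 3
    while window <= max_window:
        # morphological opening removes objects smaller than the window
        opened = ndimage.grey_opening(surface, size=(window, window))
        # elevation-difference threshold grows with the window, capped at dh_max
        dh = min(dh0 + slope * (window - 1) * cell, dh_max)
        nonground |= (surface - opened) > dh
        surface = opened
        window = 2 * window - 1          # progressively enlarge the window
    return nonground
```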
Abstract:
We present Dithen, a novel computation-as-a-service (CaaS) cloud platform specifically tailored to the parallel execution of large-scale multimedia tasks. Dithen handles the upload/download of both multimedia data and executable items, the assignment of compute units to multimedia workloads, and the reactive control of the available compute units to minimize the cloud infrastructure cost under deadline-abiding execution. Dithen combines three key properties: (i) the reactive assignment of individual multimedia tasks to available computing units according to availability and predetermined time-to-completion constraints; (ii) optimal resource estimation based on Kalman-filter estimates; (iii) the use of additive-increase multiplicative-decrease (AIMD) algorithms (well known as the resource-management mechanism of the Transmission Control Protocol) for the control of the number of units servicing workloads. The deployment of Dithen over Amazon EC2 spot instances is shown to be capable of processing more than 80,000 video transcoding, face detection and image processing tasks (equivalent to the processing of more than 116 GB of compressed data) for less than $1 in billing cost from EC2. Moreover, the proposed AIMD-based control mechanism, in conjunction with the Kalman estimates, is shown to provide more than a 27% reduction in EC2 spot instance cost against methods based on reactive resource estimation. Finally, Dithen is shown to offer a 38% to 500% reduction of the billing cost against the current state of the art in CaaS platforms on Amazon EC2 (Amazon Lambda and Amazon Autoscale). A baseline version of Dithen is currently available at dithen.com.
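The AIMD control idea can be sketched in a few lines; this is an illustrative rule with hypothetical parameter names, not Dithen's actual controller:

```python
def aimd_update(units, backlog_seconds, deadline_seconds,
                increase=1, decrease=0.5, min_units=1, max_units=1000):
    """Illustrative AIMD rule for scaling the number of compute units.

    units            : current number of compute units in service.
    backlog_seconds  : estimated time to finish the queued workload with the
                       current units (e.g. from a Kalman-filter estimate).
    deadline_seconds : time remaining before the workload deadline.
    """
    if backlog_seconds > deadline_seconds:
        units += increase              # additive increase: deadline at risk
    else:
        units = int(units * decrease)  # multiplicative decrease: over-provisioned
    return max(min_units, min(units, max_units))
```

The additive step conservatively adds capacity whenever the estimated backlog threatens the deadline, while the multiplicative step rapidly releases units once the workload is comfortably ahead of schedule.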
Abstract:
This paper provides an overview of IDS types and how they work, as well as configuration considerations and issues that affect them. Advanced methods of increasing the performance of an IDS are explored, such as specification-based IDS for protecting Supervisory Control And Data Acquisition (SCADA) and cloud networks. By reviewing varied studies, ranging from configuration issues and specific problems to custom techniques and cutting-edge research, this paper also provides a reference for others interested in learning about and developing IDS solutions. Intrusion detection is an area requiring much study in order to provide solutions that satisfy evolving services and the networks and systems that support them. This paper aims to be a reference on IDS technologies for researchers and developers interested in the field of intrusion detection.
Abstract:
In recent years, we have witnessed the growth of the Internet of Things paradigm, with its increased pervasiveness in our everyday lives. The possible applications are diverse: from a smartwatch able to measure the heartbeat and communicate it to the cloud, to a device that triggers an event when we approach an exhibit in a museum. Present in many of these applications is the Proximity Detection task: for instance, the heartbeat could be measured only when the wearer is near a well-defined location for medical purposes, or the tourist attraction must be triggered only if someone is very close to it. Indeed, the ability of an IoT device to sense the presence of other devices nearby and calculate the distance to them can be considered the cornerstone of various applications, motivating research on this fundamental topic. The energy constraints of IoT devices are often in contrast with the need for continuous operation to sense the environment and to achieve highly accurate distance measurements from neighbours, thus making the design of Proximity Detection protocols a challenging task.
Abstract:
To assess binocular detection grating acuity using the LEA GRATINGS test to establish age-related norms in healthy infants during their first 3 months of life. In this prospective, longitudinal study of healthy infants with clear red reflex at birth, responses to gratings were measured at 1, 2, and 3 months of age using LEA gratings at a distance of 28 cm. The results were recorded as detection grating acuity values, which were arranged in frequency tables and converted to a one-octave scale for statistical analysis. For the repeated measurements, analysis of variance (ANOVA) was used to compare the detection grating acuity results between ages. A total of 133 infants were included. The binocular responses to gratings showed development toward higher mean values and spatial frequencies, ranging from 0.55 ± 0.70 cycles per degree (cpd), or 1.74 ± 0.21 logMAR, in month 1 to 3.11 ± 0.54 cpd, or 0.98 ± 0.16 logMAR, in month 3. Repeated ANOVA indicated differences among grating acuity values in the three age groups. The LEA GRATINGS test allowed assessment of detection grating acuity and its development in a cohort of healthy infants during their first 3 months of life.
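For reference, the grating acuity values quoted above are consistent with the logMAR values under the standard conversion in which 30 cycles per degree corresponds to logMAR 0 (the abstract does not state the exact conversion used, so this is an assumed reconstruction):

$$ \mathrm{logMAR} = \log_{10}\!\left(\frac{30\ \mathrm{cpd}}{\mathrm{acuity\ in\ cpd}}\right), \qquad \log_{10}\!\left(\frac{30}{0.55}\right) \approx 1.74, \qquad \log_{10}\!\left(\frac{30}{3.11}\right) \approx 0.98. $$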
Abstract:
A novel capillary electrophoresis method using capacitively coupled contactless conductivity detection is proposed for the determination of the biocide tetrakis(hydroxymethyl)phosphonium sulfate. The feasibility of the electrophoretic separation of this biocide was attributed to the formation of an anionic complex between the biocide and borate ions in the background electrolyte. Evidence of this complex formation was provided by ¹¹B NMR spectroscopy. A linear relationship (R² = 0.9990) between the peak area of the complex and the biocide concentration (50-900 μmol/L) was found. The limit of detection and limit of quantification were 15.0 and 50.1 μmol/L, respectively. The proposed method was applied to the determination of tetrakis(hydroxymethyl)phosphonium sulfate in commercial formulations, and the results were in good agreement with those obtained by the standard iodometric titration method. The method was also evaluated for the analysis of tap water and cooling water samples treated with the biocide. The results of the recovery tests at three concentration levels (300, 400, and 600 μmol/L) varied from 75 to 99%, with a relative standard deviation no higher than 9%.
Abstract:
Infections of the central nervous system (CNS) present a diagnostic problem for which an accurate laboratory diagnosis is essential. Invasive practices, such as cerebral biopsy, have been replaced by polymerase chain reaction (PCR) diagnosis using cerebrospinal fluid (CSF) as the reference method. Tests on DNA extracted from plasma are noninvasive, thus avoiding all of the collateral effects and patient risks associated with CSF collection. This study aimed to determine whether plasma can replace CSF in nested PCR analysis for the detection of CNS human herpesvirus (HHV) diseases by analysing the proportion of patients whose CSF nested PCR results were positive for CNS HHV and who also had the same organism identified by plasma nested PCR. In this study, CSF DNA was used as the gold standard, and nested PCR was performed on both types of samples. Fifty-two patients with symptoms of nervous system infection underwent CSF and blood collection. For the eight HHVs, one positive DNA result (in plasma and/or CSF nested PCR) was considered an active HHV infection, whereas the occurrence of two or more HHVs in the same sample was considered a coinfection. HHV infections were detected in 27/52 (51.9%) of the CSF samples and in 32/52 (61.5%) of the plasma samples, a difference that was not significant; thus, nested PCR can be performed on plasma instead of CSF. In conclusion, these findings suggest that plasma is a useful material for diagnosis in cases where it is difficult to perform a CSF puncture.
Abstract:
The aim of this study was to develop a methodology using Raman hyperspectral imaging and chemometric methods for the identification of pre- and post-blast explosive residues on banknote surfaces. The explosives studied were of military, commercial and propellant uses. After acquisition of the hyperspectral image, independent component analysis (ICA) was applied to extract the pure spectra and the distributions of the corresponding image constituents. The performance of the methodology was evaluated by the explained variance and the lack of fit of the models, by comparing the ICA-recovered spectra with the reference spectra using correlation coefficients, and by the presence of rotational ambiguity in the ICA solutions. The methodology was applied to forensic samples to solve an automated teller machine explosion case. Independent component analysis proved to be a suitable curve-resolution method, achieving performance equivalent to that of multivariate curve resolution with alternating least squares (MCR-ALS). At low concentrations, MCR-ALS presents some limitations, as it did not provide the correct solution. The detection limit of the methodology presented in this study was 50 μg cm⁻².
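The ICA step can be sketched as follows (an illustrative Python/scikit-learn example on a hypothetical unfolded hyperspectral cube, not the authors' processing chain):

```python
import numpy as np
from sklearn.decomposition import FastICA

def unmix_hyperspectral(cube, n_components=4, seed=0):
    """Illustrative ICA unmixing of a Raman hyperspectral image.

    cube : (rows, cols, n_wavenumbers) array. It is unfolded into a
    (pixels x wavenumbers) matrix; ICA then recovers candidate pure-component
    spectra and their concentration-like spatial distribution maps.
    """
    rows, cols, bands = cube.shape
    X = cube.reshape(rows * cols, bands)
    ica = FastICA(n_components=n_components, random_state=seed, max_iter=1000)
    scores = ica.fit_transform(X)    # (pixels, components): distribution maps
    spectra = ica.mixing_.T          # (components, bands): recovered spectra
    maps = scores.reshape(rows, cols, n_components)
    return spectra, maps
```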