963 results for automatic target detection
Abstract:
SAR (Synthetic Aperture Radar) and ISAR (Inverse SAR) are high-resolution coherent radar techniques capable of providing a map of the target's radar cross-section in the range-azimuth spatial domain. The goal of both techniques is to achieve a finer azimuth resolution by generating a synthetic aperture from the relative motion between radar and target. Imaging radars complement conventional optical and infrared systems, especially under adverse weather conditions. Conventional SAR and ISAR systems are designed to illuminate targets with a line of sight between sensor and target. For this reason, they perform worse in complex scenarios, such as forests or urban environments, where multipath returns are superimposed on the direct echoes coming from the targets. These returns are known as "ghost images", since they mask the true targets and give rise to poor visual quality, greatly complicating target detection. The problem of multipath mitigation in radar imaging is thus of both theoretical and practical relevance. In this Doctoral Thesis, the concept of Time Reversal (TR) is used to improve the visual quality of SAR and ISAR images by removing the "ghost images" caused by multipath propagation (the TR-SAR and TR-ISAR algorithms, respectively). However, before applying these novel multipath mitigation techniques, the geometric problem associated with multipath must be solved. Focusing on improving the performance of TR-ISAR, a series of advanced signal processing techniques is implemented before and after the time-reversal-based stage (the core of this Thesis).
The former (pre-processing techniques) are related to multilook averaging, time-frequency transforms and the Radon transform, while the latter (post-processing techniques) comprise a set of super-resolution algorithms. In short, all of them can be seen as added value to the TR concept, rather than as independent techniques. In summary, the designed time-reversal-based algorithm, together with some of the proposed signal processing techniques, should not be overlooked if high-quality ISAR images are to be obtained in scenarios with severe multipath. Indeed, the resulting images can be useful for subsequent Automatic Target Recognition (ATR) schemes. As a proof of concept, both simulated and experimental data obtained from high-resolution radars are used to verify the proposed methods.
Abstract:
The concept of service-oriented architecture has been extensively explored in software engineering, since it produces architectures made up of several interconnected modules that are easy to reuse when building new systems. This approach to design would be impossible without interconnection mechanisms such as REST (Representational State Transfer) services, which allow module communication while minimizing coupling. However, this low coupling brings disadvantages, such as a lack of transparency, which makes it difficult to systematically create tests without knowledge of the inner workings of a system. In this article, we present an automatic error detection system for REST services, based on a statistical analysis of the responses produced by multiple service invocations. Thus, a service can be systematically tested without knowing its full specification. The method can find errors in REST services which could not be identified by means of traditional testing methods, and provides limited testing coverage for services whose response format is unknown. It can also be useful as a complement to other testing mechanisms.
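As a rough sketch of the statistical idea (the concrete statistic and thresholds below are assumptions for illustration, not the paper's actual method), repeated invocations of the same endpoint can be compared and the ones whose responses deviate strongly from the bulk can be flagged:

```python
import statistics

def flag_anomalous_responses(responses, z_threshold=3.0):
    """Flag invocations whose response deviates from the bulk of repeated calls.

    `responses` is a list of (status_code, body) pairs collected from repeated
    invocations of one REST endpoint. Returns the indices of suspect responses.
    """
    lengths = [len(body) for _, body in responses]
    mean = statistics.mean(lengths)
    stdev = statistics.pstdev(lengths)
    anomalies = []
    for idx, (status, body) in enumerate(responses):
        # Non-2xx status codes are flagged outright; for the rest, a simple
        # z-score test on body length stands in for a richer statistical model.
        if not 200 <= status < 300:
            anomalies.append(idx)
        elif stdev > 0 and abs(len(body) - mean) / stdev > z_threshold:
            anomalies.append(idx)
    return anomalies
```

In practice the analysis would also consider response structure, not just length, but the sketch shows how testing can proceed without the service's full specification.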
Abstract:
At early stages in visual processing cells respond to local stimuli with specific features such as orientation and spatial frequency. Although the receptive fields of these cells have been thought to be local and independent, recent physiological and psychophysical evidence has accumulated, indicating that the cells participate in a rich network of local connections. Thus, these local processing units can integrate information over much larger parts of the visual field; the pattern of their response to a stimulus apparently depends on the context presented. To explore the pattern of lateral interactions in human visual cortex under different context conditions we used a novel chain lateral masking detection paradigm, in which human observers performed a detection task in the presence of different length chains of high-contrast-flanked Gabor signals. The results indicated a nonmonotonic relation of the detection threshold with the number of flankers. Remote flankers had a stronger effect on target detection when the space between them was filled with other flankers, indicating that the detection threshold is caused by dynamics of large neuronal populations in the neocortex, with a major interplay between excitation and inhibition. We considered a model of the primary visual cortex as a network consisting of excitatory and inhibitory cell populations, with both short- and long-range interactions. The model exhibited a behavior similar to the experimental results throughout a range of parameters. Experimental and modeling results indicated that long-range connections play an important role in visual perception, possibly mediating the effects of context.
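A minimal sketch of an excitatory-inhibitory rate model of the kind described, Euler-integrated to a steady state (all weights, inputs and time constants below are illustrative placeholders, not the paper's fitted parameters):

```python
import math

def sigmoid(x):
    """Saturating gain function bounding population activity to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def simulate_ei(steps=2000, dt=0.01,
                w_ee=12.0, w_ei=10.0, w_ie=9.0, w_ii=3.0,
                ext_e=1.0, ext_i=0.5, tau_e=1.0, tau_i=1.0):
    """Euler-integrate coupled excitatory (E) and inhibitory (I) rate equations."""
    e, i = 0.1, 0.1
    for _ in range(steps):
        de = (-e + sigmoid(w_ee * e - w_ei * i + ext_e)) / tau_e
        di = (-i + sigmoid(w_ie * e - w_ii * i + ext_i)) / tau_i
        e += dt * de
        i += dt * di
    return e, i
```

The interplay between the recurrent excitation (w_ee) and the inhibitory feedback (w_ei, w_ie) is what produces the nonmonotonic, context-dependent behavior the experiments probe.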
Abstract:
Paper presented at EVACES 2011, 4th International Conference on Experimental Vibration Analysis for Civil Engineering Structures, Varenna (Lecco), Italy, October 3-5, 2011.
Abstract:
This study was carried out to detect differences in locomotion and feeding behavior in lame (group L; n = 41; gait score ≥ 2.5) and non-lame (group C; n = 12; gait score ≤ 2) multiparous Holstein cows in a cross-sectional study design. A model for automatic lameness detection was created, using data from accelerometers attached to the hind limbs and noseband sensors attached to the head. Each cow's gait was videotaped and scored on a 5-point scale before and after a period of 3 consecutive days of behavioral data recording. The mean value from 3 independent experienced observers was taken as the definite gait score and considered to be the gold standard. For statistical analysis, data from the noseband sensor and one of the two accelerometers per cow (randomly selected), from 2 out of 3 randomly selected days, were used. For comparison between group L and group C, the t-test, the Aspin-Welch test and the Wilcoxon test were used. The sensitivity and specificity for lameness detection were determined with logistic regression and ROC analysis. Group L, compared to group C, had significantly lower eating and ruminating time, fewer eating chews, ruminating chews and ruminating boluses, longer lying time and lying bout duration, lower standing time, fewer standing and walking bouts, fewer, slower and shorter strides, and a lower walking speed. The model considering the number of standing bouts and walking speed was the best predictor of cows being lame, with a sensitivity of 90.2% and specificity of 91.7%. Sensitivity and specificity of the lameness detection model were considered to be very high, even without the use of halter data. It was concluded that, under the conditions of the study farm, accelerometer data were suitable for accurately distinguishing between lame and non-lame dairy cows, even in cases of slight lameness with a gait score of 2.5.
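The best predictor reported combines the number of standing bouts and walking speed in a logistic regression. A hedged sketch of such a model follows; the coefficients are purely illustrative placeholders, not the study's fitted values:

```python
import math

def lameness_probability(standing_bouts, walking_speed_mps,
                         b0=4.0, b_bouts=-0.15, b_speed=-3.0):
    """Logistic-regression-style lameness score.

    Lame cows in the study showed fewer standing bouts and a lower walking
    speed, so both predictors enter with negative (illustrative) coefficients:
    lower values raise the predicted probability of lameness.
    """
    z = b0 + b_bouts * standing_bouts + b_speed * walking_speed_mps
    return 1.0 / (1.0 + math.exp(-z))
```

With real data, the coefficients would be estimated by maximum likelihood and the decision threshold chosen from the ROC curve to trade off the reported sensitivity and specificity.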
Abstract:
The world's largest fossil oyster reef, formed by the giant oyster Crassostrea gryphoides and located in Stetten (north of Vienna, Austria), is studied by Harzhauser et al., 2015, 2016, and Djuricic et al., 2016. Digital documentation of this unique geological site is provided by terrestrial laser scanning (TLS) at the millimeter scale. Obtaining meaningful results is not merely a matter of data acquisition with a suitable device; it requires proper planning, data management, and postprocessing. Terrestrial laser scanning technology has a high potential for providing precise 3D mapping that serves as the basis for automatic object detection in different scenarios; however, it faces challenges in the presence of large amounts of data and the irregular geometry of an oyster reef. We provide a detailed description of the techniques and strategy used for data collection and processing in Djuricic et al., 2016. The use of laser scanning provided the ability to measure surface points of an estimated 46,840 shells. These oyster specimens are up to 60 cm long, and their surfaces are modeled with a high accuracy of 1 mm. In addition to the laser scanning measurements, more than 300 photographs were captured, and an orthophoto mosaic was generated with a ground sampling distance (GSD) of 0.5 mm. This high-resolution 3D information and the photographic texture serve as the basis for ongoing and future geological and paleontological analyses. Moreover, they provide unprecedented documentation for conservation issues at a unique natural heritage site.
Abstract:
In this paper, we describe an algorithm that automatically detects and labels peaks I-VII of the normal, suprathreshold auditory brainstem response (ABR). The algorithm proceeds in three stages, with the option of a fourth: (1) all candidate peaks and troughs in the ABR waveform are identified using zero crossings of the first derivative, (2) peaks I-VII are identified from these candidate peaks based on their latency and morphology, (3) if required, peaks II and IV are identified as points of inflection using zero crossings of the second derivative, and (4) interpeak troughs are identified before peak latencies and amplitudes are measured. The performance of the algorithm was estimated on a set of 240 normal ABR waveforms recorded using a stimulus intensity of 90 dBnHL. When compared to an expert audiologist, the algorithm correctly identified the major ABR peaks (I, III and V) in 96-98% of the waveforms and the minor ABR peaks (II, IV, VI and VII) in 45-83% of waveforms. Whilst peak II was correctly identified in only 83% and peak IV in 77% of waveforms, it was shown that 5% of the peak II identifications and 31% of the peak IV identifications came as a direct result of allowing these peaks to be found as points of inflection. Copyright (C) 2005 S. Karger AG, Basel.
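Stage (1), finding candidate peaks and troughs as zero crossings of the first derivative, can be sketched on a sampled waveform as follows (a minimal illustration of the idea, not the published implementation):

```python
def candidate_peaks_and_troughs(waveform):
    """Stage 1 of the algorithm: candidate peaks and troughs are located at
    zero crossings of the first (discrete) derivative of the waveform."""
    peaks, troughs = [], []
    diff = [waveform[k + 1] - waveform[k] for k in range(len(waveform) - 1)]
    for k in range(1, len(diff)):
        if diff[k - 1] > 0 and diff[k] <= 0:
            peaks.append(k)      # rising-to-falling crossing: local maximum
        elif diff[k - 1] < 0 and diff[k] >= 0:
            troughs.append(k)    # falling-to-rising crossing: local minimum
    return peaks, troughs
```

Stage (2) would then select peaks I-VII from these candidates by latency and morphology, and stage (3) would apply the same crossing logic to the second derivative to catch peaks that appear only as inflection points.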
Abstract:
Task classification is introduced as a method for the evaluation of monitoring behaviour in different task situations. On the basis of an analysis of different monitoring tasks, a task classification system comprising four task 'dimensions' is proposed. The perceptual speed and flexibility of closure categories, which are identified with signal discrimination type, comprise the principal dimension in this taxonomy, the others being sense modality, the time course of events, and source complexity. It is also proposed that decision theory provides the most complete method for the analysis of performance in monitoring tasks. Several different aspects of decision theory in relation to monitoring behaviour are described. A method is also outlined whereby both accuracy and latency measures of performance may be analysed within the same decision theory framework. Eight experiments and an organizational study are reported. The results show that a distinction can be made between the perceptual efficiency (sensitivity) of a monitor and his criterial level of response, and that in most monitoring situations, there is no decrement in efficiency over the work period, but an increase in the strictness of the response criterion. The range of tasks exhibiting either or both of these performance trends can be specified within the task classification system. In particular, it is shown that a sensitivity decrement is only obtained for 'speed' tasks with a high stimulation rate. A distinctive feature of 'speed' tasks is that target detection requires the discrimination of a change in a stimulus relative to preceding stimuli, whereas in 'closure' tasks, the information required for the discrimination of targets is presented at the same point in time. In the final study, the specification of tasks yielding sensitivity decrements is shown to be consistent with a task classification analysis of the monitoring literature.
It is also demonstrated that the signal type dimension has a major influence on the consistency of individual differences in performance in different tasks. The results provide an empirical validation for the 'speed' and 'closure' categories, and suggest that individual differences are not completely task specific but are dependent on the demands common to different tasks. Task classification is therefore shown to enable improved generalizations to be made of the factors affecting 1) performance trends over time, and 2) the consistency of performance in different tasks. A decision theory analysis of response latencies is shown to support the view that criterion shifts are obtained in some tasks, while sensitivity shifts are obtained in others. The results of a psychophysiological study also suggest that evoked potential latency measures may provide temporal correlates of criterion shifts in monitoring tasks. Among other results, the finding that the latencies of negative responses do not increase over time is taken to invalidate arousal-based theories of performance trends over a work period. An interpretation in terms of expectancy, however, provides a more reliable explanation of criterion shifts. Although the mechanisms underlying the sensitivity decrement are not completely clear, the results rule out 'unitary' theories such as observing response and coupling theory. It is suggested that an interpretation in terms of the memory data limitations on information processing provides the most parsimonious explanation of all the results in the literature relating to sensitivity decrement. Task classification therefore enables the refinement and selection of theories of monitoring behaviour in terms of their reliability in generalizing predictions to a wide range of tasks. It is thus concluded that task classification and decision theory provide a reliable basis for the assessment and analysis of monitoring behaviour in different task situations.
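The separation of perceptual sensitivity from the criterial level of response corresponds to the standard signal detection theory indices d' and c, both computable from hit and false-alarm rates:

```python
from statistics import NormalDist

def sensitivity_and_criterion(hit_rate, false_alarm_rate):
    """Signal detection theory indices for a monitoring task.

    d' (sensitivity) measures perceptual efficiency independently of bias;
    c (criterion) measures the strictness of the response criterion. A
    criterion shift over the work period changes c while leaving d' intact.
    """
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    d_prime = z(hit_rate) - z(false_alarm_rate)
    criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))
    return d_prime, criterion
```

The reported pattern of no efficiency decrement but an increasingly strict criterion would appear as a stable d' alongside a rising c across successive blocks of the work period.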
Abstract:
Terrestrial remote sensing imagery involves the acquisition of information from the Earth's surface without physical contact with the area under study. Among the remote sensing modalities, hyperspectral imaging has recently emerged as a powerful passive technology. This technology has been widely used in the fields of urban and regional planning, water resource management, environmental monitoring, food safety, counterfeit drug detection, oil spill and other types of chemical contamination detection, biological hazard prevention, and target detection for military and security purposes [2-9]. Hyperspectral sensors sample the reflected solar radiation from the Earth's surface in the portion of the spectrum extending from the visible region through the near-infrared and mid-infrared (wavelengths between 0.3 and 2.5 µm) in hundreds of narrow (of the order of 10 nm) contiguous bands [10]. This high spectral resolution can be used for object detection and for discriminating between different objects based on their spectral characteristics [6]. However, this huge spectral resolution yields large amounts of data to be processed. For example, the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) [11] collects a 512 (along track) × 614 (across track) × 224 (bands) × 12 (bits) data cube in 5 s, corresponding to about 140 MB. Similar data collection rates are achieved by other spectrometers [12]. Such huge data volumes put stringent requirements on communications, storage, and processing. The problem of signal subspace identification of hyperspectral data represents a crucial first step in many hyperspectral processing algorithms such as target detection, change detection, classification, and unmixing. The identification of this subspace enables a correct dimensionality reduction (DR), yielding gains in data storage and retrieval and in computational time and complexity.
Additionally, DR may also improve algorithm performance, since it reduces data dimensionality without losses in the useful signal components. The computation of statistical estimates is a relevant example of the advantages of DR, since the number of samples required to obtain accurate estimates increases drastically with the dimensionality of the data (the Hughes phenomenon) [13].
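The AVIRIS data-volume figure quoted above can be checked with a few lines of arithmetic; the ~140 MB value matches the cube when each 12-bit sample is stored as a 16-bit word (a common storage convention, assumed here), while the raw 12-bit precision corresponds to about 106 MB:

```python
def cube_megabytes(samples, lines, bands, bits_per_sample):
    """Raw size of one hyperspectral data cube in megabytes (1 MB = 10**6 bytes)."""
    total_bits = samples * lines * bands * bits_per_sample
    return total_bits / 8 / 1e6

# 512 x 614 x 224 samples: ~105.6 MB at 12 bits per sample,
# ~140.8 MB when each sample is stored in a 16-bit word.
```

At roughly 140 MB per 5-second cube, a single flight line quickly reaches tens of gigabytes, which is what motivates the dimensionality reduction discussed above.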
Abstract:
Continuous technological evolution is benefiting our lives to a great extent. The evolution of the Internet of Things and the deployment of wireless sensor networks are making it possible to have more connectivity between people and the devices used extensively in our daily lives. Almost every discipline of daily life, including the health sector, transportation, agriculture, etc., is benefiting from these technologies. There is great potential for research and refinement in the health sector, as the current system very often depends on manual evaluations conducted by clinicians. There is no automatic system for patient health monitoring and assessment, which results in incomplete and less reliable health information. The Internet of Things has great potential to benefit health care applications through automated and remote assessment, monitoring and identification of diseases. Acute pain is the main reason people visit hospitals. An automatic pain detection system based on the Internet of Things with wireless devices can make assessment and relief significantly more efficient. The contribution of this research work is to propose a pain assessment method based on physiological parameters. The physiological parameters chosen for this study are heart rate, electrocardiography, breathing rate and galvanic skin response. As a first step, the relation between these physiological parameters and the acute pain experienced by the test persons is evaluated. The electrocardiography data collected from the test persons are analyzed to extract interbeat intervals. This evaluation clearly demonstrates specific patterns and trends in these parameters as a consequence of pain. This parametric behavior is then used to assess and identify pain intensity by implementing machine learning algorithms. Support vector machines are used to classify these parameters as influenced by different pain intensities.
The classification results, with good accuracy rates for two and three levels of pain intensity, show a clear indication of pain and demonstrate the feasibility of this pain assessment method. An improved approach building on this research work could be implemented by using both physiological parameters and electromyography data of facial muscles for classification.
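The interbeat-interval extraction step can be sketched as follows, assuming R-peak timestamps have already been detected in the ECG signal (the peak detector itself is outside this sketch):

```python
def interbeat_intervals(r_peak_times_s):
    """Interbeat intervals (in ms) from successive ECG R-peak timestamps (in s)."""
    return [round((t2 - t1) * 1000.0, 1)
            for t1, t2 in zip(r_peak_times_s, r_peak_times_s[1:])]

def mean_heart_rate_bpm(ibi_ms):
    """Mean heart rate in beats per minute from a list of interbeat intervals."""
    return 60000.0 / (sum(ibi_ms) / len(ibi_ms))
```

Features derived from these intervals (mean, variability, trends under pain stimulation) are the kind of inputs that would then be fed to the support vector machine classifier.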
Abstract:
Intensification of permafrost disturbances such as active layer detachments (ALDs) and retrogressive thaw slumps (RTS) has been observed across the circumpolar Arctic. These features are indicators of unstable conditions stemming from recent climate warming and permafrost degradation. In order to understand the processes interacting to give rise to these features, a multidisciplinary approach is required, i.e., one spanning the interactions between geomorphology, hydrology, vegetation and ground thermal conditions. The goal of this research is to detect and map permafrost disturbance, predict landscape controls over disturbance and determine approaches for monitoring disturbance, all with the goal of contributing to the mitigation of permafrost hazards. Permafrost disturbance inventories were created by applying semi-automatic change detection techniques to IKONOS satellite imagery collected at the Cape Bounty Arctic Watershed Observatory (CBAWO). These methods provide a means to estimate the spatial distribution of permafrost disturbances for a given area for use as an input in susceptibility modelling. Permafrost disturbance susceptibility models were then developed using generalized additive and generalized linear models (GAM, GLM) fitted to disturbed and undisturbed locations and relevant GIS-derived predictor variables (slope, potential solar radiation, elevation). These models successfully delineated areas across the landscape that were susceptible to disturbances locally and regionally when transferred to an independent validation location. Permafrost disturbance susceptibility models are a first-order assessment of landscape susceptibility and are promising for designing land management strategies for remote permafrost regions. Additionally, geomorphic patterns associated with higher susceptibility provide important knowledge about processes associated with the initiation of disturbances.
Permafrost degradation was analyzed at the CBAWO using differential interferometric synthetic aperture radar (DInSAR). Active-layer dynamics were interpreted using inter-seasonal and intra-seasonal displacement measurements and highlight the importance of hydroclimatic factors on active layer change. Collectively, these research approaches contribute to permafrost monitoring and the assessment of landscape-scale vulnerability in order to develop permafrost disturbance mitigation strategies.
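The DInSAR displacement measurements rest on the standard repeat-pass relation between unwrapped interferometric phase and line-of-sight displacement, d = φλ/(4π):

```python
import math

def los_displacement_mm(unwrapped_phase_rad, wavelength_mm):
    """Line-of-sight displacement from an unwrapped differential phase.

    For repeat-pass DInSAR the two-way path means one full fringe (2*pi of
    phase) corresponds to half a wavelength of line-of-sight motion:
    d = phase * wavelength / (4 * pi).
    """
    return unwrapped_phase_rad * wavelength_mm / (4.0 * math.pi)
```

For example, at a C-band wavelength of about 56 mm, one full fringe corresponds to 28 mm of line-of-sight displacement, which is the scale of seasonal active-layer subsidence such interferograms can resolve.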
Abstract:
Introduction: Having a diagnosis of occupational health and safety conditions in the country makes it possible to create strategies to minimize the problems of the working population. In Colombia there is the observatory of the Instituto Nacional de Salud; however, none of its topics includes information and analysis on the health and safety of the working population. Objective: To determine the health conditions of the population attended at the IPS SALUD OCUPACIONAL DE LOS ANDES LDTA in the city of Bogotá during 2015. Materials and methods: A pilot test of the occupational health and safety observatory was carried out by means of a cross-sectional study, using a database of patients evaluated at the IPS SALUD OCUPACIONAL DE LOS ANDES LDTA in Bogotá D.C., containing information on occupational medical examinations performed in 2015 on the ISISMAWEB platform, with a representative sample of 1,923 records. The variables included were sociodemographic and occupational characteristics, the most prevalent paraclinical tests recorded as altered, the diagnoses and assessments issued for the studied population, and the personal recommendations given by the company's management system. A descriptive analysis was performed, and the Chi-squared test was used to study interactions. Results: 62.1% of the population were men, with a mean age of 34.8 years (SD 10.521). 41.5% had secondary education. The most frequently performed medical evaluation was the pre-employment examination (30.5% of cases). Plant and machine operators and assemblers accounted for 27.9% of positions, and, at the other extreme, mid-level professionals in financial and administrative operations for 0.5%. The most frequently issued ICD-10 diagnosis was code Z100 (occupational health examination), with 15.8%, followed by unspecified disorder of refraction (H527), with 9.0%.
Regarding the general recommendations, the most frequent was periodic examination, with 30%. The most frequent preventive recommendation was musculoskeletal, with 36.5%. The most prevalent epidemiological surveillance (SVE) recommendations were ergonomic, with 40.7%. Associations (p<0.05) were found among the variables education level, gender and socioeconomic stratum. Conclusions: Data collection mechanisms should be optimized to make their evaluation and association more feasible. There is significant under-reporting of second diagnoses, associated with paraclinical tests not being recorded. This study proposes a model to follow in order to develop the national occupational health and safety observatory.
Abstract:
Nowadays, World Heritage Sites (WHS) have been facing new challenges, partially due to different tourism consumption patterns. As highlighted in a considerable number of studies, visits to these sites are largely justified by this prestigious classification, and motivations are closely associated with their cultural aspects and the quality of the overall environment (among others, Marujo et al., 2012). However, a diversity of tourist profiles has been underlined in the literature. Starting from the results obtained in a previous study of cultural tourists' profiles, conducted during 2009 in the city of Évora, Portugal, our intent is to compare those results with a recent survey of visitors to the same city. The recognition of Évora by UNESCO in 1986 as a World Heritage site has fostered not only the preservation of heritage but also the tourist promotion of the town. This study compares and examines tourist profiles with regard to tourist expenditure patterns in Évora. A total of 450 surveys were distributed in 2009, and more recently, in 2015, the same number of surveys was collected. Chi-squared Automatic Interaction Detection (CHAID) was applied to model the consumption patterns of domestic and international visitors, based on socio-demographic and trip characteristics, length of stay and the degree of satisfaction with pull factors. CHAID allowed the population to be classified into groups able to describe the dependent variable, average daily tourist expenditure. Results revealed different patterns of daily average expenditure between the years 2009 and 2015, even though preliminary results revealed no significant variations in socio-demographic and trip characteristics among the visitors' core profile. Local authorities should be aware of this changing expenditure behavior of cultural visitors and should formulate strategies accordingly. Policy and managerial recommendations are discussed.
Abstract:
This study investigates tourists' expenditure patterns in the city of Évora, a World Heritage Site (WHS) classified by UNESCO. Chi-squared automatic interaction detection (CHAID) was chosen, allowing the identification of distinct segments based on expenditure patterns. Visitors' expenditure patterns have proven to be a pertinent element for a broader understanding of visitor behaviour at cultural destinations, and were revealed to be increasing over the years studied.
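At the core of CHAID is a Pearson chi-squared independence test used to decide which predictor splits to keep. The statistic itself can be computed from a contingency table as follows (a generic sketch of the test, not the studies' specific implementation):

```python
def chi_squared(table):
    """Pearson chi-squared statistic for a contingency table (list of rows).

    CHAID cross-tabulates each candidate predictor against the response,
    computes this statistic, and keeps the split with the most significant
    departure from independence.
    """
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat
```

A table of perfectly proportional counts yields a statistic of zero (no association), while strongly diagonal counts yield a large statistic, signalling a segment worth splitting on.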
Abstract:
In cataract surgery, the eye's natural lens is removed because it has gone opaque and no longer allows clear vision. To maintain the eye's optical power, a new artificial lens must be inserted. Called an Intraocular Lens (IOL), it needs to be modelled in order to have the correct refractive power to substitute for the natural lens. Calculating the refractive power of this substitution lens requires precise anterior eye chamber measurements. An interferometry instrument, the AC Master from Zeiss Meditec AG, was in use for half a year to perform these measurements. A Low Coherence Interferometry (LCI) measurement beam is aligned with the eye's optical axis for precise measurements of anterior eye chamber distances. The eye follows a fixation target in order to align the visual axis with the optical axis. Performance problems occurred, however, at this step. Therefore, it was necessary to develop a new procedure that ensures better alignment between the eye's visual and optical axes, allowing a more user-friendly and versatile procedure, and eventually automating the whole process. With this instrument, the alignment between the eye's optical and visual axes is detected when Purkinje reflections I and III overlap as the eye follows a fixation target. In this project, image analysis is used to detect the positions of these Purkinje reflections, eventually automatically detecting when they overlap. Automatic detection of the third Purkinje reflection of an eye following a fixation target is possible with some restrictions. Each pair of detected third Purkinje reflections is used to automatically calculate an acceptable starting position for the fixation target, required for precise measurements of anterior eye chamber distances.
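A minimal sketch of the bright-spot detection underlying Purkinje-reflection localization (the threshold, the strict local-maximum rule and the plain-list image representation are assumptions for illustration; the project's actual image analysis is more involved):

```python
def bright_spots(image, threshold):
    """Locate bright local maxima, candidate Purkinje reflections, in a
    grayscale image given as a list of rows of pixel intensities.

    A pixel is a spot if it exceeds `threshold` and is strictly brighter
    than all 8 neighbours. Returns (x, y) coordinates; overlap of two
    reflections can then be tested by comparing spot positions.
    """
    h, w = len(image), len(image[0])
    spots = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            v = image[y][x]
            if v < threshold:
                continue
            neighbours = [image[y + dy][x + dx]
                          for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                          if (dy, dx) != (0, 0)]
            if all(v > n for n in neighbours):
                spots.append((x, y))
    return spots
```

Tracking the detected first and third Purkinje spots frame by frame, and declaring alignment when their distance drops below a tolerance, is the kind of overlap criterion the automated procedure would apply.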