925 results for Signature Verification, Forgery Detection, Fuzzy Modeling


Relevance: 30.00%

Publisher:

Abstract:

When observers are presented with two visual targets appearing in the same position in close temporal proximity, a marked reduction in detection performance for the second target has often been reported, the so-called attentional blink (AB) phenomenon. Several studies have found that P300 amplitudes decrease during the attentional blink period in parallel with second-target detection performance. However, whether these parallel courses of second-target performance and corresponding P300 amplitudes result from the same underlying mechanisms has remained unclear. The aim of our study was therefore to investigate whether the mechanisms underlying the AB can be assessed by fixed-links modeling and whether this kind of assessment reveals the same, or at least related, processes in the behavioral and electrophysiological data. On both levels of observation, three highly similar processes could be identified: an increasing, a decreasing, and a U-shaped trend. Corresponding processes from the behavioral and electrophysiological data were substantially correlated, with the two U-shaped trends showing the strongest association with each other. Our results provide evidence for the assumption that, as assessed by fixed-links models, the same mechanisms underlie attentional blink task performance at the electrophysiological and behavioral levels.

Purpose: To investigate the dosimetric properties of an electronic portal imaging device (EPID) for electron beam detection and to evaluate its potential for quality assurance (QA) of modulated electron radiotherapy (MERT). Methods: A commercially available EPID was used to detect electron beams shaped by a photon multileaf collimator (MLC) at a source-surface distance of 70 cm. The fundamental dosimetric properties, such as reproducibility, dose linearity, field size response, energy response, and saturation, were investigated for electron beams. A new method to acquire the flood field for the EPID calibration was tested. For validation purposes, profiles of open fields and various MLC fields (square and irregular) were measured with a diode in water and compared to the EPID measurements. Finally, in order to use the EPID for QA of MERT delivery, a method was developed to reconstruct EPID two-dimensional (2D) dose distributions at a water-equivalent depth of 1.5 cm. Comparisons were performed with film measurements for static and dynamic monoenergy fields as well as for multienergy fields composed of several segments of different electron energies. Results: The advantageous dosimetric properties of the EPID already known for photons, namely reproducibility and linearity with dose and dose rate, were found to hold for electron detection as well. The flood-field calibration method proved effective, and the EPID was capable of accurately reproducing the dose measured in water at 1.0 cm depth for 6 MeV, 1.3 cm for 9 MeV, and 1.5 cm for 12, 15, and 18 MeV. The deviations between the output factors measured with the EPID and in water at these depths were within ±1.2% for all energies, with a mean deviation of 0.1%. The average gamma pass rate (criteria: 1.5%, 1.5 mm) for profile comparisons between the EPID and measurements in water was better than 99% for all energies considered in this study. When comparing the reconstructed EPID 2D dose distributions at 1.5 cm depth to film measurements, the gamma pass rate (criteria: 2%, 2 mm) was better than 97% for all tested cases. Conclusions: This study demonstrates the high potential of the EPID for electron dosimetry and, in particular, confirms that it can serve as an efficient verification tool for MERT delivery.

This two-part comparative study examines the similarities and differences between the Jones and Stokes–Mueller formalisms when modeling polarized light propagation with numerical simulations of the Monte Carlo type. In this first part, we review the theoretical concepts that concern light propagation and detection with both pure and partially/totally unpolarized states. The latter case, involving fluctuations or "depolarizing effects," is of special interest here: Jones and Stokes–Mueller are equally apt to model such effects and are expected to yield identical results. In a second, ensuing paper, empirical evidence is provided by means of numerical experiments using both formalisms.
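
The Stokes–Mueller machinery for partially polarized states is easy to sketch numerically. Below, the textbook Mueller matrix of an ideal linear polarizer is applied to an unpolarized Stokes vector; the matrices and numbers are standard results, not simulation output from this study:

```python
import numpy as np

def linear_polarizer(theta):
    """Mueller matrix of an ideal linear polarizer at angle theta (radians)."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return 0.5 * np.array([
        [1.0,   c,     s,   0.0],
        [c,   c * c, c * s, 0.0],
        [s,   c * s, s * s, 0.0],
        [0.0, 0.0,   0.0,   0.0],
    ])

def degree_of_polarization(S):
    """DOP = sqrt(Q^2 + U^2 + V^2) / I for a Stokes vector S = (I, Q, U, V)."""
    return np.sqrt(S[1]**2 + S[2]**2 + S[3]**2) / S[0]

unpolarized = np.array([1.0, 0.0, 0.0, 0.0])  # natural (fully depolarized) light
S_out = linear_polarizer(0.0) @ unpolarized   # horizontal polarizer
```

Passing unpolarized light through the polarizer halves the intensity and yields a fully polarized output state, the kind of partially/totally depolarized bookkeeping that pure Jones vectors cannot express without an explicit ensemble average.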

Short-range nucleon-nucleon correlations (NN SRC) in nuclei carry important information on nuclear structure and dynamics. NN SRC have been extensively probed through two-nucleon knockout reactions in both pion and electron scattering experiments. We report here on the detection of two-nucleon knockout events from neutrino interactions and discuss topological features that may involve NN SRC content in the target argon nuclei. The ArgoNeuT detector in the Main Injector neutrino beam at Fermilab has recorded a sample of 30 fully reconstructed charged-current events in which the leading muon is accompanied by a pair of protons at the interaction vertex, 19 of which have both protons above the Fermi momentum of the Ar nucleus. Of these 19 events, four are found with the two protons in a strictly back-to-back, high-momentum configuration directly observed in the final state; these can be associated with pionless nucleon-resonance mechanisms involving a pre-existing short-range correlated np pair in the nucleus. Another four of the remaining 15 events have a reconstructed back-to-back configuration of an np pair in the initial state, a signature compatible with a one-body quasi-elastic interaction on a neutron in an SRC pair. The detection of these two subsamples of the collected (mu- + 2p) events suggests that mechanisms directly involving nucleon-nucleon SRC pairs in the nucleus are active and can be efficiently explored in neutrino-argon interactions with LAr TPC technology.
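
The back-to-back topology described above can be quantified by the opening angle between the two reconstructed proton momenta. The sketch below uses invented momenta, not ArgoNeuT data, and a rough ~250 MeV/c figure for the argon Fermi momentum; the cut value is likewise illustrative:

```python
import numpy as np

def opening_angle_cos(p1, p2):
    """Cosine of the opening angle between two momentum vectors."""
    return np.dot(p1, p2) / (np.linalg.norm(p1) * np.linalg.norm(p2))

def is_back_to_back(p1, p2, cos_cut=-0.95):
    """Flag a nearly anti-parallel pair (opening angle close to 180 degrees)."""
    return opening_angle_cos(p1, p2) < cos_cut

# Hypothetical reconstructed proton momenta in MeV/c, both above a
# ~250 MeV/c Fermi momentum, in a back-to-back configuration.
p_a = np.array([300.0, 10.0, 5.0])
p_b = np.array([-290.0, -15.0, 0.0])
pair_is_src_like = is_back_to_back(p_a, p_b)
```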

The new computing paradigm known as cognitive computing attempts to imitate the human capabilities of learning, problem solving, and considering things in context. To do so, an application (a cognitive system) must learn from its environment, for example by interacting with various interfaces. These interfaces can run the gamut from sensors to humans to databases. Accessing data through such interfaces allows the system to conduct cognitive tasks that can support humans in decision-making or problem-solving processes. Cognitive systems can be integrated into various domains (e.g., medicine or insurance). For example, a cognitive system in a city can collect data, learn from various data sources, and then attempt to connect these sources to provide real-time optimization of subsystems within the city (e.g., the transportation system). In this study, we provide a methodology for integrating a cognitive system that allows data to be verbalized, making the causalities and hypotheses generated by the cognitive system more understandable to humans. We abstract a city subsystem (passenger flow for a taxi company) by applying fuzzy cognitive maps (FCMs). FCMs can be used as a mathematical tool for modeling complex systems built as directed graphs, with concepts (e.g., policies, events, and/or domains) as nodes and causalities as edges. As a verbalization technique, we introduce the restriction-centered theory of reasoning (RCT). RCT addresses the imprecision inherent in language by introducing restrictions. Using this underlying combinatorial design, our approach can handle large data sets from complex systems and make the output understandable to humans.
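
As a sketch of how an FCM propagates causal influence over such a directed graph, the toy model below iterates the standard sigmoid update x_{t+1} = f(W^T x_t). The concepts and weights are invented for illustration and are not taken from the paper:

```python
import numpy as np

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

def fcm_step(state, W, lam=1.0):
    """One FCM update: each concept aggregates its weighted causal inputs."""
    return sigmoid(W.T @ state, lam)

def fcm_run(state, W, steps=50):
    for _ in range(steps):
        state = fcm_step(state, W)
    return state

# Toy causal graph for a taxi passenger-flow scenario (illustrative
# concepts and weights): 0 = rain, 1 = taxi demand, 2 = waiting time.
W = np.zeros((3, 3))
W[0, 1] = 0.8   # rain increases demand
W[1, 2] = 0.6   # demand increases waiting time
state = np.array([1.0, 0.0, 0.0])   # activate "rain"
final = fcm_run(state, W)
```

Iterating the map drives the concept activations to a fixed point, which is the "what-if" inference an FCM provides before any verbalization step.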

The attentional blink (AB) is a fundamental limitation of the ability to select relevant information from irrelevant information. It can be observed in the detection rate in an AB task as well as in the corresponding P300 amplitude of the event-related potential. In previous research, however, correlations between these two levels of observation were weak and rather inconsistent. A possible explanation for this finding is that multiple processes underlie the AB and thus obscure a possible relationship between AB-related detection rate and the corresponding P300 amplitude. The present study investigated this assumption by applying a fixed-links modeling approach to represent behavioral individual differences in the AB as a latent variable. Concurrently, this approach enabled us to control for additional sources of variance in AB performance by deriving two additional latent variables. The correlation between the latent variable reflecting behavioral individual differences in AB magnitude and a corresponding latent variable derived from the P300 amplitude was high (r=.70). Furthermore, this correlation was considerably stronger than the correlations of other behavioral measures of the AB magnitude with their psychophysiological counterparts (all rs<.40). Our findings clearly indicate that systematically disentangling the various sources of variance with the fixed-links modeling approach is a promising way to investigate behavioral individual differences in the AB and their possible psychophysiological correlates.
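
The fixed-links idea of constraining loadings to predefined trends can be illustrated, in a much-simplified form, by projecting per-lag scores onto fixed constant, linear, and quadratic (U-shaped) basis vectors. This least-squares sketch uses simulated data and is not the authors' latent-variable estimator:

```python
import numpy as np

# Fixed-links models fix factor loadings to predefined trends. Sketch:
# decompose per-lag detection scores into fixed constant, linear and
# quadratic (U-shaped) components by ordinary least squares.
lags = np.arange(1, 9, dtype=float)
t = (lags - lags.mean()) / lags.std()            # centered, scaled lag
X = np.column_stack([np.ones_like(t), t, t**2])  # constant, linear, U-shaped

rng = np.random.default_rng(0)
true_beta = np.array([0.7, 0.05, 0.08])          # U-shaped AB curve (invented)
y = X @ true_beta + rng.normal(0.0, 0.01, size=t.size)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # recovered trend weights
```

A positive quadratic weight recovers the U-shaped AB signature; in the full fixed-links model, the analogous weights are latent variables estimated per person on both the behavioral and P300 data.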

The use of intensity modulated radiotherapy (IMRT) treatments necessitates a significant amount of patient-specific quality assurance (QA). This research investigated the precision and accuracy of Kodak EDR2 film measurements for IMRT verifications, the use of comparisons between 2D dose calculations and measurements to improve treatment plan beam models, and the dosimetric impact of delivery errors. New measurement techniques and software were developed and used clinically at M. D. Anderson Cancer Center. The software implemented two new dose comparison parameters, the 2D normalized agreement test (NAT) and the scalar NAT index. A single-film calibration technique using multileaf collimator (MLC) delivery was developed. EDR2 film's optical density response was found to be sensitive to several factors: radiation time, the length of time between exposure and processing, and phantom material. The precision of EDR2 film measurements was found to be better than 1%. For IMRT verification, EDR2 film measurements agreed with ion chamber results to 2%/2 mm accuracy for single-beam fluence map verifications and to 5%/2 mm for transverse plane measurements of complete plan dose distributions. The same system was used to quantitatively optimize the radiation field offset and MLC transmission beam modeling parameters for Varian MLCs. While scalar dose comparison metrics can work well for optimization purposes, the influence of external parameters on the dose discrepancies must be minimized. The ability of 2D verifications to detect delivery errors was tested with simulated data. The dosimetric characteristics of delivery errors were compared to patient-specific clinical IMRT verifications. For the clinical verifications, the NAT index and the percentage of pixels failing the gamma index were exponentially distributed and depended on the measurement phantom but not the treatment site. Delivery errors affecting all beams in the treatment plan were flagged by the NAT index, although delivery errors impacting only one beam could not be differentiated from routine clinical verification discrepancies. Clinical use of this system will flag outliers, allow physicists to examine their causes, and perhaps improve the level of agreement between radiation dose distribution measurements and calculations. The principles used to design and evaluate this system are extensible to future multidimensional dose measurements and comparisons.
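
A minimal 1D version of the gamma index used in such comparisons can be sketched as follows. The profiles are illustrative Gaussians, not measured dose data; clinical implementations operate in 2D/3D with interpolation:

```python
import numpy as np

def gamma_index_1d(x_ref, d_ref, x_eval, d_eval, dd=0.02, dta=2.0):
    """1D gamma index. dd: dose criterion as a fraction of the maximum
    reference dose; dta: distance-to-agreement criterion in mm."""
    d_norm = dd * d_ref.max()
    gammas = np.empty(x_ref.size)
    for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
        dist_term = ((x_eval - xr) / dta) ** 2
        dose_term = ((d_eval - dr) / d_norm) ** 2
        gammas[i] = np.sqrt(np.min(dist_term + dose_term))
    return gammas

# Illustrative profiles: the evaluated profile is the reference
# Gaussian shifted by 0.5 mm, well inside a 2%/2 mm criterion.
x = np.linspace(-20.0, 20.0, 201)        # position in mm
ref = np.exp(-x**2 / 50.0)
ev = np.exp(-(x - 0.5) ** 2 / 50.0)
pass_rate = np.mean(gamma_index_1d(x, ref, x, ev) <= 1.0)
```

A 0.5 mm shift passes everywhere under 2%/2 mm, whereas a uniform 20% dose error pushes points near the peak above gamma = 1, which is the behavior a percent-of-pixels-failing statistic summarizes.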

Hodgkin's disease (HD) is a cancer of the lymphatic system. Survivors of HD face a variety of adverse late effects, among which second primary tumors (SPTs) are one of the most serious. This dissertation aims to model time-to-SPT in the presence of death and HD relapses during follow-up. The model is designed to handle a mixture phenomenon of SPT and the influence of death. Relapses of HD are adjusted for as a covariate. A proportional hazards framework is used to define the SPT intensity function, which includes an exponential term to estimate the effects of explanatory variables. Death as a competing risk is considered under different scenarios, depending on which terminal event comes first. The Newton-Raphson method is used to obtain the parameter estimates. The proposed method is applied to a real data set containing a group of HD patients. Several risk factors for the development of SPTs are identified, and the findings are noteworthy for the development of healthcare guidelines that may lead to the early detection or prevention of SPTs.
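
The Newton-Raphson step used for the parameter estimates can be sketched on a toy likelihood. Here an exponential event-rate parameter is fitted to synthetic follow-up times; this stands in for, and is far simpler than, the dissertation's SPT intensity model:

```python
import numpy as np

def newton_raphson(score, hessian, theta0, tol=1e-10, max_iter=100):
    """Newton-Raphson iteration for a scalar parameter: repeatedly
    subtract score/hessian of the log-likelihood until it stabilizes."""
    theta = theta0
    for _ in range(max_iter):
        step = score(theta) / hessian(theta)
        theta -= step
        if abs(step) < tol:
            break
    return theta

# Synthetic follow-up times (years); MLE of an exponential event rate.
times = np.array([2.0, 5.0, 1.5, 7.0, 3.5])
n, total = times.size, times.sum()
score = lambda lam: n / lam - total      # d logL / d lambda
hessian = lambda lam: -n / lam**2        # d2 logL / d lambda2
lam_hat = newton_raphson(score, hessian, theta0=0.3)
```

For this one-parameter model the fixed point is the closed-form estimate n / sum(t); in the dissertation's multi-parameter intensity model the same update runs on the full score vector and Hessian matrix.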

Marine sediments are the main sink in the oceanic phosphorus (P) cycle. The activity of benthic microorganisms is decisive for the regeneration, reflux, or burial of inorganic phosphate (Pi), which has a strong impact on marine productivity. Recent formation of phosphorites on the continental shelf and a succession of different sedimentary environments make the Benguela upwelling system a prime region for studying the role of microbes in P biogeochemistry. The oxygen isotope signature of pore-water phosphate (δ18OP) carries characteristic information on microbial P cycling: intracellular turnover of phosphorylated biomolecules results in isotopic equilibrium with ambient water, while enzymatic regeneration of Pi from organic matter produces distinct offsets from equilibrium. The balance of these two processes is the major control on δ18OP. Our study assesses the importance of microbial P cycling relative to regeneration of Pi from organic matter along a transect across the Namibian continental shelf and slope by combining pore-water chemistry (sulfate, sulfide, ferrous iron, Pi), steady-state turnover rate modeling, and oxygen isotope geochemistry of Pi. We found δ18OP values ranging from 12.8‰ to 26.6‰, both in equilibrium and in pronounced disequilibrium with water. Our data show a trend towards regeneration signatures (disequilibrium) under low mineralization activity and low Pi concentrations, and towards microbial turnover signatures (equilibrium) under high mineralization activity and high Pi concentrations. These findings are opposite to observations from water-column studies, where regeneration signatures were found to coincide with high mineralization activity and high Pi concentrations. It appears that preferential Pi regeneration in marine sediments does not necessarily coincide with a disequilibrium δ18OP signature. We propose that microbial Pi uptake strategies, which are controlled by Pi availability, are decisive for the alteration of the isotope signature. This hypothesis is supported by the observation of efficient microbial Pi turnover (equilibrium signatures) in the phosphogenic sediments of the Benguela upwelling system.

Recently, vision-based advanced driver-assistance systems (ADAS) have received renewed interest as a means of enhancing driving safety. In particular, owing to their high performance-to-cost ratio, monocular camera systems are emerging as the main focus of this field of work. In this paper we present a novel on-board road modeling and vehicle detection system, which is part of the result of the European I-WAY project. The system relies on a robust estimation of the perspective of the scene, which adapts to the dynamics of the vehicle and generates a stabilized rectified image of the road plane. This rectified plane is used by a recursive Bayesian classifier, which classifies pixels as belonging to different classes corresponding to the elements of interest in the scenario. This stage works as an intermediate layer that isolates subsequent modules, since it absorbs the inherent variability of the scene. The system has been tested on-road in different scenarios, including varied illumination and adverse weather conditions, and the results have proved remarkable even for such complex scenarios.
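
The perspective-removal step that produces a rectified road-plane image is typically implemented with a plane-to-plane homography. The sketch below solves for a homography from four made-up point correspondences; the calibration numbers are invented and are not the I-WAY system's:

```python
import numpy as np

def apply_homography(H, pts):
    """Map Nx2 pixel coordinates through a 3x3 homography H."""
    pts_h = np.hstack([pts, np.ones((pts.shape[0], 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# Illustrative correspondences: the image of a road trapezoid (pixels)
# maps to a rectangle in the bird's-eye rectified domain, where lane
# markings become parallel.
src = np.array([[300., 400.], [340., 400.], [100., 600.], [540., 600.]])
dst = np.array([[0., 0.], [40., 0.], [0., 200.], [40., 200.]])

# Solve for H from the 4 correspondences (DLT with h33 fixed to 1).
A, b = [], []
for (x, y), (u, v) in zip(src, dst):
    A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
    A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
h = np.linalg.solve(np.array(A), np.array(b))
H = np.append(h, 1.0).reshape(3, 3)
```

In a full pipeline the same H warps every pixel of the frame (e.g., with OpenCV's warpPerspective); here it is only applied to the four defining points.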

This thesis addresses on-road vehicle detection and tracking with a monocular vision system. This problem has attracted the attention of the automotive industry and the research community, as it is the first step towards driver assistance and collision avoidance systems and, ultimately, autonomous driving. Although much effort has been devoted to it in recent years, no completely satisfactory solution has yet been devised, and it therefore remains an active research issue. The main challenges for vision-based vehicle detection and tracking are the high variability among vehicles, the dynamically changing background due to camera motion, and the real-time processing requirement. In this thesis, a unified approach using statistical methods is presented for vehicle detection and tracking that tackles these issues. The approach is divided into three primary tasks, i.e., vehicle hypothesis generation, hypothesis verification, and vehicle tracking, which are performed sequentially. Nevertheless, the exchange of information between processing blocks is fostered so that the maximum degree of adaptation to changes in the environment can be achieved and the computational cost is reduced. Two complementary strategies are proposed to address the first task, hypothesis generation, based respectively on appearance and geometry analysis. To this end, the use of a rectified domain in which the perspective is removed from the original image is especially interesting, as it allows for fast image scanning and efficient hypothesis generation. The final vehicle candidates are produced using a collaborative framework between the original and the rectified domains. A supervised classification strategy is adopted for the verification of the hypothesized vehicle locations. In particular, state-of-the-art methods for feature extraction are evaluated and new descriptors are proposed by exploiting knowledge of vehicle appearance.
Due to the lack of appropriate public databases, a new database is generated, and the classification performance of the descriptors is extensively tested on it. Finally, a methodology for the fusion of the different classifiers is presented and the best-performing combinations are discussed. The core of the proposed approach is a Bayesian tracking framework using particle filters. Contributions are made to its three key elements: the inference algorithm, the dynamic model, and the observation model. In particular, a Markov chain Monte Carlo method is proposed for sampling, which circumvents the exponential complexity increase of traditional particle filters, thus making joint multiple-vehicle tracking affordable. In addition, the aforementioned rectified domain allows for the definition of a constant-velocity dynamic model, since it preserves the smooth motion of vehicles on highways. Finally, a multiple-cue observation model is proposed that not only accounts for vehicle appearance but also integrates the available information from the preceding processing blocks. The proposed approach runs in near real time on a general-purpose PC and delivers outstanding results compared to traditional methods.
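
The MCMC-based sampling idea can be sketched with a generic random-walk Metropolis-Hastings step. The target below is a toy 1-D density with made-up numbers, not the thesis's joint multi-vehicle model:

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_samples, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings sampler for a 1-D target density."""
    rng = np.random.default_rng(seed)
    x, logp = x0, log_target(x0)
    samples = np.empty(n_samples)
    for i in range(n_samples):
        prop = x + rng.normal(0.0, step)          # symmetric proposal
        logp_prop = log_target(prop)
        if np.log(rng.random()) < logp_prop - logp:  # accept/reject
            x, logp = prop, logp_prop
        samples[i] = x                            # keep current state
    return samples

# Toy target: posterior over a vehicle's lateral lane position,
# modeled as a Gaussian centered at 1.2 m with sd 0.3 m (illustrative).
log_target = lambda x: -0.5 * ((x - 1.2) / 0.3) ** 2
samples = metropolis_hastings(log_target, x0=0.0, n_samples=20000)
```

Replacing the importance-sampling/resampling step of a particle filter with moves of this kind is what keeps the joint multi-vehicle state space tractable, since the chain length grows far more gently than the particle count required by traditional filters.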

Here, a novel and efficient strategy for moving-object detection by non-parametric modeling on smart cameras is presented. Whereas the background is modeled using only color information, the foreground model combines color and spatial information. The application of a particle filter allows the spatial information to be updated and provides a priori information about the areas to analyze in the following images, enabling a significant reduction in the computational requirements and improving the segmentation results.
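
A bootstrap particle filter of the kind used to update the foreground's spatial information can be sketched in a few lines. The measurements below are synthetic positions of a hypothetical object, not the paper's smart-camera implementation:

```python
import numpy as np

def particle_filter_step(particles, weights, measurement, rng,
                         motion_std=2.0, meas_std=3.0):
    """One bootstrap particle filter step: predict, weight, resample."""
    # Predict: random-walk motion model over (x, y) pixel positions.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: weight particles by a Gaussian measurement likelihood.
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = np.exp(-0.5 * d2 / meas_std**2)
    weights /= weights.sum()
    # Resample (multinomial) to focus particles on likely regions.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

rng = np.random.default_rng(1)
particles = rng.uniform(0.0, 100.0, size=(500, 2))   # spread over the frame
weights = np.full(500, 1.0 / 500)
# Hypothetical object moving diagonally across the image.
for t in range(20):
    z = np.array([10.0 + 3.0 * t, 20.0 + 2.0 * t])
    particles, weights = particle_filter_step(particles, weights, z, rng)
estimate = particles.mean(axis=0)
```

The resampled particle cloud is exactly the "a priori information about the areas to analyze" the abstract mentions: subsequent frames only need to be inspected where particles concentrate.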

A new method for detecting microcalcifications in regions of interest (ROIs) extracted from digitized mammograms is proposed. The top-hat transform is a technique based on mathematical morphology operations and is used here to perform contrast enhancement of the microcalcifications. To improve microcalcification detection, a novel image sub-segmentation approach based on the possibilistic fuzzy c-means algorithm is used. From the original ROIs, window-based features, such as the mean and standard deviation, were extracted and used as an input vector to a classifier based on an artificial neural network that identifies patterns belonging to microcalcifications and healthy tissue. Our results show that the proposed method is a good alternative for automatically detecting microcalcifications, because this stage is an important part of early breast cancer detection.
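
The white top-hat transform used for contrast enhancement is the image minus its morphological opening: bright structures smaller than the structuring element survive while the background is suppressed. A self-contained sketch on a synthetic ROI (not a real mammogram) follows:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def erode(img, k):
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    return sliding_window_view(padded, (k, k)).min(axis=(2, 3))

def dilate(img, k):
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    return sliding_window_view(padded, (k, k)).max(axis=(2, 3))

def white_tophat(img, k=5):
    """White top-hat: image minus its morphological opening with a
    flat k x k structuring element. Keeps small bright details."""
    opening = dilate(erode(img, k), k)
    return img - opening

# Synthetic ROI: a smooth background ramp plus a 2-pixel bright spot
# standing in for a microcalcification.
img = np.fromfunction(lambda i, j: 0.2 + 0.001 * (i + j), (32, 32))
img[10, 10] += 0.5
img[10, 11] += 0.5
enhanced = white_tophat(img, k=5)
```

The smooth background is removed almost entirely while the small bright spot is preserved, which is exactly the contrast enhancement the sub-segmentation stage then operates on.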

When we try to analyze and control a system whose model was obtained only from input/output data, model accuracy is essential. On the other hand, to make the procedure practical, the modeling stage must be computationally efficient. In this regard, this paper presents the application of an extended Kalman filter for the parametric adaptation of a fuzzy model.
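
A minimal version of such parametric adaptation can be sketched with a two-rule zero-order Takagi–Sugeno model whose consequent constants are updated by a Kalman/EKF recursion. The membership functions, data, and noise settings are invented for illustration; because the output is linear in these parameters, the EKF Jacobian here is exact:

```python
import numpy as np

def gauss_mf(x, c, s):
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def firing(x):
    """Normalized firing strengths of two fixed Gaussian antecedents
    (only the consequent constants are adapted in this sketch)."""
    w = np.array([gauss_mf(x, -1.0, 1.0), gauss_mf(x, 1.0, 1.0)])
    return w / w.sum()

def ekf_update(theta, P, x, y, R=0.01):
    """One EKF step for the parameters theta of y = firing(x) @ theta."""
    H = firing(x)                        # observation Jacobian (1 x 2)
    S = H @ P @ H + R                    # innovation variance
    K = P @ H / S                        # Kalman gain
    theta = theta + K * (y - H @ theta)  # correct with the residual
    P = P - np.outer(K, H @ P)           # shrink parameter covariance
    return theta, P

# Adapt to input/output data generated by true consequents (0.5, 2.0).
rng = np.random.default_rng(0)
theta, P = np.zeros(2), np.eye(2) * 10.0
true = np.array([0.5, 2.0])
for _ in range(200):
    x = rng.uniform(-2.0, 2.0)
    y = firing(x) @ true
    theta, P = ekf_update(theta, P, x, y)
```

Each sample costs one small matrix update, which is what makes this kind of recursive adaptation cheap enough for online identification from input/output data.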

We present a novel approach for detecting severe obstructive sleep apnea (OSA) cases by introducing non-linear analysis into sustained speech characterization. The proposed scheme was designed to provide additional information to our baseline system, built on top of state-of-the-art cepstral-domain modeling techniques, with the aim of improving accuracy rates. This new information is only lightly correlated with our previous MFCC modeling of sustained speech and uncorrelated with the information in our continuous speech modeling scheme. Tests were performed to evaluate the improvement for our detection task, based on sustained speech alone as well as combined with a continuous speech classifier, resulting in a 10% relative reduction in classification error for the former and a 33% relative reduction for the fused scheme. These results encourage us to consider the existence of non-linear effects in OSA patients' voices, and to look for tools that could further improve short-time analysis.
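
The cepstral-domain front end referred to above can be sketched with a plain real cepstrum on a synthetic sustained vowel. This uses numpy only; a production MFCC pipeline would add a mel filter bank, a DCT, and frame stacking:

```python
import numpy as np

def real_cepstrum(frame):
    """Real cepstrum of a windowed speech frame: the inverse FFT of the
    log magnitude spectrum. Low-order coefficients capture the smooth
    spectral envelope that cepstral (MFCC-style) front ends model."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(frame.size)))
    return np.fft.irfft(np.log(spectrum + 1e-10))

# Synthetic "sustained vowel": two harmonics at 100 Hz and 200 Hz.
sr = 8000
t = np.arange(0, 0.064, 1.0 / sr)        # one 512-sample frame
frame = np.sin(2 * np.pi * 100 * t) + 0.5 * np.sin(2 * np.pi * 200 * t)
ceps = real_cepstrum(frame)
features = ceps[:13]                     # low-order envelope coefficients
```

Non-linear measures of the kind the abstract proposes would be computed on the same frames and appended to such cepstral features before classification.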