961 results for CENTERBAND-ONLY DETECTION


Relevance: 30.00%

Abstract:

In Switzerland, sarcoptic mange is frequent in free-ranging wild carnivores, but until recent years no cases had been recorded in wild ungulates. Since 2010, cases have been observed in wild boar in the cantons of Solothurn, Tessin and Thurgau. Here, we report the detection of mange-like skin lesions in wild boar by photo-trapping and the post-mortem findings in six culled animals presenting different stages of the disease. Potential sources of infection include mangy red foxes, outdoor domestic pigs and wild boar from surrounding countries. Disease spread in the wild boar population may become relevant not only for wildlife but also for domestic pig health in the future if piggeries' biosecurity is insufficient to prevent interactions with wild boar.

Relevance: 30.00%

Abstract:

Fasciola hepatica, also called the large liver fluke, is a trematode which can infect most mammals. Monitoring the infection rate of snails, which function as intermediate hosts and harbour larval stages of F. hepatica, is an important component of epidemiological studies on fascioliasis. For this purpose, DNA probes were generated which can be used for the detection of F. hepatica larvae in snails. Four highly repetitive DNA fragments were cloned in a plasmid vector and tested by Southern blot hybridization against the DNA of various trematodes for specificity and sensitivity. The probes Fhr-I, Fhr-II and Fhr-III hybridized only to F. hepatica DNA. Fhr-IV contained ribosomal RNA gene sequences and cross-hybridized with the DNA of various other trematode species. Squash blot analysis showed that the different probes were able to detect the parasite larvae in trematode-infected snails, even as isolated single larvae. No signals were obtained in squash blots of uninfected snails. Probes Fhr-I, Fhr-II and Fhr-III are thus useful specific tools for studying the epidemiology of fascioliasis. The probe Fhr-IV, because of its broader spectrum, can be used to detect the larvae of a wide range of trematode species of waterbirds, which are the causative agents of swimmer's itch.

Relevance: 30.00%

Abstract:

The cause of infection in about a third of all travelers' diarrhea patients studied is not identified. Stools from these patients test negative for the known enteric pathogens and are classified as pathogen-negative stools. We proposed that this third of diarrhea patients might include not only presently unknown pathogens but also known pathogens that go undetected. Conventionally, a probability sample of five E. coli colonies is used to detect enterotoxigenic E. coli (ETEC) and other diarrhea-producing E. coli from stool cultures. We compared this conventional method of using five E. coli colonies with the use of up to twenty E. coli colonies. Testing up to fifteen E. coli colonies detected about twice as many ETEC as testing five colonies: when the number of colonies tested was increased from 5 to 15, the detection of ETEC increased from 19.0% to 38.8%. The sensitivity of the assay with 5 E. coli colonies was statistically significantly different from the sensitivity of the assay with 10 E. coli colonies, suggesting that at least 10 colonies of E. coli should be tested for the detection of ETEC.
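A simple way to see why testing more colonies roughly doubles ETEC detection is to model each picked colony as an independent draw: the probability that at least one of n colonies is ETEC is 1 - (1 - p)^n, where p is the fraction of ETEC among the patient's E. coli colonies. The sketch below is purely illustrative; the value of p is an assumption, not a figure from the study.

```python
# Illustrative sketch: probability of detecting ETEC when testing n colonies,
# assuming each colony is independently ETEC with probability p.
# The value of p below is an assumption for illustration, not from the study.

def detection_probability(p: float, n: int) -> float:
    """P(at least one of n sampled colonies is ETEC)."""
    return 1.0 - (1.0 - p) ** n

if __name__ == "__main__":
    p = 0.10  # assumed per-colony ETEC fraction in a shedding patient
    for n in (5, 10, 15, 20):
        print(f"n = {n:2d} colonies -> detection probability = {detection_probability(p, n):.2f}")
```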

Relevance: 30.00%

Abstract:

The relationship between serum cholesterol and cancer incidence was investigated in the population of the Hypertension Detection and Follow-up Program (HDFP). The HDFP was a multi-center trial designed to test the effectiveness of a stepped program of medication in reducing mortality associated with hypertension. Over 10,000 participants, ages 30-69, were followed with clinic and home visits for a minimum of five years. Cancer incidence was ascertained from existing study documents, which included hospitalization records, autopsy reports and death certificates. During the five years of follow-up, 286 new cancer cases were documented. The distribution of sites and total number of cases were similar to those predicted using rates from the Third National Cancer Survey. A non-fasting baseline serum cholesterol level was available for most participants. Age-, sex-, and race-specific five-year cancer incidence rates were computed for each cholesterol quartile. Rates were also computed by smoking status, education status, and percent-ideal-weight quartiles. In addition, these and other factors were investigated with the use of the multiple logistic model. For all cancers combined, a significant inverse relationship existed between baseline serum cholesterol levels and cancer incidence. Previously documented associations between smoking, education and cancer were also demonstrated but did not account for the relationship between serum cholesterol and cancer. The relationship was more evident in males than in females, but this was felt to reflect the different distribution of specific cancer sites in the two sexes. The inverse relationship existed for all specific sites investigated (except breast), although statistical significance was reached only for prostate carcinoma. Analyses excluding cases diagnosed during the first two years of follow-up still yielded an inverse relationship. Life table analysis indicated that competing risks during the period of follow-up did not account for the existence of an inverse relationship. It is concluded that a weak inverse relationship does exist between serum cholesterol and cancer incidence for many, but not all, cancer sites. This relationship is not due to confounding by other known cancer risk factors, competing risks, or persons entering the study with undiagnosed cancer. Not enough information is available at present to determine whether this relationship is causal, and further research is suggested.
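A minimal sketch of the kind of quartile-specific rate computation described above, assuming a flat table of participants with a baseline cholesterol value and a five-year incident-cancer indicator; the column names and the toy values are hypothetical, not the HDFP data.

```python
# Sketch (hypothetical column names and toy data): crude five-year cancer incidence
# by baseline cholesterol quartile, in the spirit of the analysis described above.
import pandas as pd

df = pd.DataFrame({
    "cholesterol": [180, 210, 250, 300, 195, 230, 270, 310],  # mg/dL, toy values
    "cancer":      [0,   0,   1,   0,   0,   1,   0,   1],    # incident case within 5 years
})

# Assign cholesterol quartiles and compute the crude incidence proportion per quartile.
df["chol_quartile"] = pd.qcut(df["cholesterol"], q=4, labels=["Q1", "Q2", "Q3", "Q4"])
rates = df.groupby("chol_quartile", observed=True)["cancer"].mean()
print(rates)  # in the study these would be age-, sex- and race-specific rates
```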

Relevance: 30.00%

Abstract:

The relationship between the degree of diastolic blood pressure (DBP) reduction and mortality was examined among hypertensives, ages 30-69, in the Hypertension Detection and Follow-up Program (HDFP). The HDFP was a multi-center community-based trial, which followed 10,940 hypertensive participants for five years. One-year survival was required for inclusion in this investigation since the one-year annual visit was the first occasion on which change in blood pressure could be measured for all participants. During the subsequent four years of follow-up on 10,052 participants, 568 deaths occurred. For levels of change in DBP and for categories of variables related to mortality, the crude mortality rate was calculated. Time-dependent life tables were also calculated so as to utilize the available blood pressure data over time. In addition, the Cox life table regression model, extended to take into account both time-constant and time-dependent covariates, was used to examine the relationship between change in blood pressure over time and mortality. The results of the time-dependent life table and time-dependent Cox life table regression analyses supported the existence of a quadratic function modeling the relationship between DBP reduction and mortality, even after adjusting for other risk factors. The minimum mortality hazard ratio, based on a particular model, occurred at a DBP reduction of 22.6 mm Hg (standard error = 10.6) in the whole population and 8.5 mm Hg (standard error = 4.6) in the baseline DBP stratum 90-104. Beyond this reduction, there was a small increase in the risk of death. There was no evidence of a quadratic function after fitting the same model using systolic blood pressure. Methodologic issues involved in studying a particular degree of blood pressure reduction were considered. The confidence interval around the change corresponding to the minimum hazard ratio was wide, and the obtained blood pressure level should not be interpreted as a goal for treatment. Blood pressure reduction was attributed not only to pharmacologic therapy but also to regression to the mean and to other unknown factors unrelated to treatment. Therefore, the surprising results of this study do not provide direct implications for treatment but strongly suggest replication in other populations.
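If the log hazard ratio is modeled as a quadratic in the DBP reduction x, say b1·x + b2·x², the reduction minimizing the hazard is x* = -b1/(2·b2). The sketch below simply evaluates that expression; the coefficients are placeholders, not the fitted HDFP values.

```python
# Sketch: location of the minimum of a quadratic log-hazard in DBP reduction.
# log HR(x) = b1*x + b2*x**2  ->  minimized at x* = -b1 / (2*b2) when b2 > 0.
# The coefficients below are placeholders, not the HDFP estimates.
import math

def minimizing_reduction(b1: float, b2: float) -> float:
    if b2 <= 0:
        raise ValueError("no interior minimum unless the quadratic term is positive")
    return -b1 / (2.0 * b2)

b1, b2 = -0.04, 0.001   # hypothetical coefficients per mm Hg and per (mm Hg)^2
x_star = minimizing_reduction(b1, b2)
print(f"DBP reduction at minimum hazard: {x_star:.1f} mm Hg")
print(f"hazard ratio there vs. no change: {math.exp(b1 * x_star + b2 * x_star**2):.2f}")
```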

Relevance: 30.00%

Abstract:

Documenting changes in distribution is necessary for understanding species' responses to environmental change, but data on species distributions are heterogeneous in accuracy and resolution. Combining different data sources and methodological approaches can fill gaps in knowledge about the dynamic processes driving changes in species-rich but data-poor regions. We combined recent bird survey data from the Neotropical Biodiversity Mapping Initiative (NeoMaps) with historical distribution records to estimate potential changes in the distribution of eight species of Amazon parrots in Venezuela. Using environmental covariates and presence-only data from museum collections and the literature, we first used maximum likelihood to fit a species distribution model (SDM) estimating a historical maximum probability of occurrence for each species. We then used recent NeoMaps survey data to build single-season occupancy models (OM) with the same environmental covariates, as well as with time- and effort-dependent detectability, resulting in estimates of the current probability of occurrence. We finally calculated the disagreement between predictions as a matrix of the probability of change in the state of occurrence. Our results suggested negative changes for the only restricted, threatened species, Amazona barbadensis, which has been independently confirmed with field studies. Two of the three remaining widespread species that were detected, Amazona amazonica and Amazona ochrocephala, also had a high probability of negative changes in northern Venezuela, but results were not conclusive for Amazona farinosa. The four remaining species were undetected in recent field surveys; three of these were most probably absent from the survey locations (Amazona autumnalis, Amazona mercenaria and Amazona festiva), while the fourth (Amazona dufresniana) requires more intensive targeted sampling to estimate its current status. Our approach is unique in taking full advantage of available but limited data, and in detecting a high probability of change even for rare and patchily distributed species. However, it is presently limited to species meeting the strong assumptions required for maximum-likelihood estimation with presence-only data, including very high detectability and representative sampling of the historical distribution.
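The disagreement between the historical and current predictions can be summarized per cell as the probability that a historically occupied cell is now unoccupied, and vice versa. A minimal sketch, assuming the two models are independent and yield per-cell occurrence probabilities as arrays (the values below are hypothetical):

```python
# Sketch: per-cell probability of change in occurrence state, assuming independent
# historical (SDM) and current (occupancy model) occurrence probabilities.
import numpy as np

p_hist = np.array([0.9, 0.7, 0.2, 0.5])   # hypothetical historical P(occurrence)
p_now  = np.array([0.3, 0.6, 0.1, 0.8])   # hypothetical current P(occurrence)

p_loss = p_hist * (1.0 - p_now)   # present historically, absent now
p_gain = (1.0 - p_hist) * p_now   # absent historically, present now
p_change = p_loss + p_gain        # any change of state

for i, (l, g, c) in enumerate(zip(p_loss, p_gain, p_change)):
    print(f"cell {i}: P(loss)={l:.2f}  P(gain)={g:.2f}  P(change)={c:.2f}")
```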

Relevance: 30.00%

Abstract:

Speed enforcement on public roadways is important for guaranteeing road safety and reducing the number and severity of traffic accidents. Traditionally, this task has been partially solved using radar and/or laser technologies and, more recently, using video-camera-based systems. All these systems have significant shortcomings that have yet to be overcome. The main drawback of classical Doppler radar technology is that the velocity measurement fails when several vehicles are in the radar's beam. Modern radar systems are able to measure the speed of, and range to, a vehicle; however, this is not enough to discriminate the lane in which the vehicle is driving. The limitation of several vehicles in the beam is overcome using laser technology. However, laser systems have another important limitation: they cannot measure the speed of several vehicles simultaneously. Novel video-camera systems, based on license plate identification, solve the previous drawbacks, but they can only measure average speed, never top speed. This paper studies the feasibility of using an interferometric linear frequency modulated continuous wave (LFMCW) radar to improve top-speed enforcement on roadways. Two different systems based on down-the-road and across-the-road radar configurations are presented. The main advantage of the proposed solutions is that they can simultaneously measure the speed, range, and lane of several vehicles, allowing the unambiguous identification of offenders. A detailed analysis of the operation and accuracy of these solutions is reported. In addition, the feasibility of the proposed techniques has been demonstrated with simulations and real experiments using a Ka-band interferometric radar developed by our research group.
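For a linear FMCW radar, the textbook relations tie the beat frequency of a sweep to range and the Doppler shift to radial speed. The sketch below evaluates them for assumed waveform parameters; the carrier, bandwidth and sweep duration are illustrative values, not the authors' design.

```python
# Sketch: basic LFMCW relations for range and radial speed.
# Waveform parameters are illustrative assumptions, not the system in the paper.
C = 3.0e8          # speed of light, m/s

f_c = 34.0e9       # carrier frequency (Ka band), Hz
B   = 150.0e6      # sweep bandwidth, Hz
T   = 1.0e-3       # sweep duration, s

def range_from_beat(f_beat: float) -> float:
    """Target range from the beat frequency of one sweep (static target)."""
    return C * f_beat * T / (2.0 * B)

def speed_from_doppler(f_doppler: float) -> float:
    """Radial speed from the Doppler shift observed between sweeps."""
    return f_doppler * C / (2.0 * f_c)

print(f"range for a 100 kHz beat: {range_from_beat(100e3):.1f} m")
print(f"speed for a 7.9 kHz Doppler shift: {speed_from_doppler(7.9e3) * 3.6:.1f} km/h")
```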

Relevance: 30.00%

Abstract:

This thesis addresses on-road vehicle detection and tracking with a monocular vision system. This problem has attracted the attention of the automotive industry and the research community as it is the first step for driver assistance and collision avoidance systems and for eventual autonomous driving. Although much effort has been devoted to it in recent years, no satisfactory solution has yet been devised and thus it remains an active research issue. The main challenges for vision-based vehicle detection and tracking are the high variability among vehicles, the dynamically changing background due to camera motion, and the real-time processing requirement. In this thesis, a unified approach using statistical methods is presented for vehicle detection and tracking that tackles these issues. The approach is divided into three primary tasks, i.e., vehicle hypothesis generation, hypothesis verification, and vehicle tracking, which are performed sequentially. Nevertheless, the exchange of information between processing blocks is fostered so that the maximum degree of adaptation to changes in the environment can be achieved and the computational cost is alleviated. Two complementary strategies are proposed to address the first task, i.e., hypothesis generation, based respectively on appearance and geometry analysis. To this end, the use of a rectified domain in which the perspective is removed from the original image is especially interesting, as it allows for fast image scanning and coarse hypothesis generation. The final vehicle candidates are produced using a collaborative framework between the original and the rectified domains. A supervised classification strategy is adopted for the verification of the hypothesized vehicle locations. In particular, state-of-the-art methods for feature extraction are evaluated and new descriptors are proposed by exploiting the knowledge on vehicle appearance. Due to the lack of appropriate public databases, a new database is generated and the classification performance of the descriptors is extensively tested on it. Finally, a methodology for the fusion of the different classifiers is presented and the best combinations are discussed. The core of the proposed approach is a Bayesian tracking framework using particle filters. Contributions are made on its three key elements: the inference algorithm, the dynamic model and the observation model. In particular, the use of a Markov chain Monte Carlo method is proposed for sampling, which circumvents the exponential complexity increase of traditional particle filters, thus making joint multiple-vehicle tracking affordable. On the other hand, the aforementioned rectified domain allows for the definition of a constant-velocity dynamic model since it preserves the smooth motion of vehicles on highways. Finally, a multiple-cue observation model is proposed that not only accounts for vehicle appearance but also integrates the available information from the analysis in the previous blocks. The proposed approach is shown to run in near real time on a general-purpose PC and to deliver outstanding results compared to traditional methods.
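As an illustration of why the rectified (bird's-eye) domain admits a constant-velocity dynamic model, the sketch below propagates vehicle particles with state [x, y, vx, vy] in that domain through a linear transition plus Gaussian noise. The state layout, frame rate and noise scale are assumptions, not the thesis' exact parametrization.

```python
# Sketch: constant-velocity state propagation in a rectified (bird's-eye) domain,
# as might be used inside a particle-filter dynamic model. The state layout and
# noise scale are illustrative assumptions.
import numpy as np

def propagate(particles: np.ndarray, dt: float, sigma: float, rng) -> np.ndarray:
    """particles: (N, 4) array of [x, y, vx, vy] in ground-plane coordinates."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    noise = rng.normal(scale=sigma, size=particles.shape)
    return particles @ F.T + noise

rng = np.random.default_rng(0)
particles = np.tile([0.0, 20.0, 0.5, -1.0], (100, 1))   # 100 particles, common init
particles = propagate(particles, dt=1 / 25, sigma=0.05, rng=rng)
print(particles.mean(axis=0))   # mean state after one frame at an assumed 25 fps
```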

Relevance: 30.00%

Abstract:

Here, a novel and efficient strategy for moving object detection by non-parametric modeling on smart cameras is presented. Whereas the background is modeled using only color information, the foreground model combines color and spatial information. The application of a particle filter allows the spatial information to be updated and provides a priori information about the areas to analyze in the following images, enabling an important reduction in the computational requirements and improving the segmentation results.
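A minimal sketch of a non-parametric (kernel density) color background model of the kind referred to above: each pixel keeps a small set of past color samples, and a new pixel is labeled foreground when its kernel density under those samples is low. The kernel bandwidth and decision threshold are assumptions for illustration.

```python
# Sketch: per-pixel kernel-density background model (color only), in the spirit of
# non-parametric background subtraction. Bandwidth and threshold are assumed values.
import numpy as np

def background_likelihood(pixel: np.ndarray, samples: np.ndarray, h: float) -> float:
    """KDE of the current color under stored background samples (Gaussian kernel)."""
    d2 = np.sum((samples - pixel) ** 2, axis=1)          # squared color distances
    return float(np.mean(np.exp(-d2 / (2.0 * h ** 2))))  # unnormalized is enough to threshold

samples = np.array([[120, 130, 125], [118, 132, 124], [121, 129, 127]], dtype=float)
for candidate in ([119.0, 131.0, 125.0], [30.0, 200.0, 40.0]):
    lik = background_likelihood(np.asarray(candidate), samples, h=10.0)
    label = "background" if lik > 0.1 else "foreground"
    print(candidate, f"likelihood={lik:.3f} ->", label)
```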

Relevance: 30.00%

Abstract:

A stress-detection system based on physiological signals is proposed. Concretely, galvanic skin response (GSR) and heart rate (HR) are used to provide information on the state of mind of an individual, owing to their nonintrusiveness and noninvasiveness. Furthermore, specific psychological experiments were designed to properly induce stress in individuals in order to acquire a database for training, validating, and testing the proposed system. The system is based on fuzzy logic and describes the behavior of an individual under stressing stimuli in terms of HR and GSR. The stress-detection accuracy obtained is 99.5% when acquiring HR and GSR over a period of 10 s, and success rates above 90% are still achieved when that acquisition period is decreased to 3-5 s. Finally, this paper proposes that accurate stress detection requires only two physiological signals, namely HR and GSR, and shows that the proposed stress-detection system is suitable for real-time applications.
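A toy sketch of the fuzzy-logic idea: triangular membership functions over normalized HR and GSR and a pair of min/max rules combined into a stress score. The membership shapes, rules and thresholds are assumptions for illustration only, not the system identified in the paper.

```python
# Toy sketch of a fuzzy stress score from normalized HR and GSR (roughly 0..1).
# Membership functions and rules are illustrative assumptions.

def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def stress_score(hr: float, gsr: float) -> float:
    hr_high, gsr_high = tri(hr, 0.4, 1.0, 1.6), tri(gsr, 0.4, 1.0, 1.6)
    hr_low,  gsr_low  = tri(hr, -0.6, 0.0, 0.6), tri(gsr, -0.6, 0.0, 0.6)
    stressed = min(hr_high, gsr_high)   # rule: HR high AND GSR high -> stressed
    relaxed  = max(hr_low, gsr_low)     # rule: HR low OR GSR low -> relaxed
    return stressed / (stressed + relaxed) if (stressed + relaxed) > 0 else 0.5

print(stress_score(hr=0.9, gsr=0.8))   # close to 1: likely stressed
print(stress_score(hr=0.1, gsr=0.2))   # close to 0: likely relaxed
```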

Relevance: 30.00%

Abstract:

The localization of persons in indoor environments is nowadays an open problem. There are partial solutions based on the deployment of a network of sensors (Local Positioning Systems or LPS). Other solutions only require the installation of an inertial sensor on the person's body (Pedestrian Dead-Reckoning or PDR). PDR solutions integrate the signals coming from an Inertial Measurement Unit (IMU), which usually contains 3 accelerometers and 3 gyroscopes. The main problem of PDR is the accumulation of positioning errors due to the drift caused by noise in the sensors. This paper presents a PDR solution that incorporates a drift correction method based on detecting the access ramps usually found in buildings. The ramp correction method is implemented over a PDR framework that uses an Inertial Navigation System (INS) algorithm and an IMU attached to the person's foot. Unlike other approaches that use external sensors to correct the drift error, we only use one IMU on the foot. To detect a ramp, the slope of the terrain on which the user is walking and the change in height sensed when moving forward are estimated from the IMU. After detection, the ramp is checked for association with one of the ramps stored in a database. For each associated ramp, a position correction is fed into the Kalman Filter in order to refine the INS-PDR solution. Drift-free localization is achieved, with positioning errors below 2 meters for 1,000-meter-long routes in a building with a few ramps.
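A minimal sketch of the ramp-detection idea described above: per stride, the slope is estimated from the height change and the horizontal displacement given by the INS, and a stride is flagged as being on a ramp when the slope magnitude falls within a plausible band for access ramps. The slope band and the stride data are assumed values, not the paper's parameters.

```python
# Sketch: flag strides taken on a ramp from INS-derived displacement per stride.
# The slope band for an access ramp and the toy stride data are assumptions.
import math

def stride_slope_deg(dz: float, dxy: float) -> float:
    """Slope of one stride from height change dz and horizontal displacement dxy."""
    return math.degrees(math.atan2(dz, dxy))

def on_ramp(dz: float, dxy: float, lo: float = 3.0, hi: float = 10.0) -> bool:
    """A stride is 'on a ramp' if its slope magnitude lies in [lo, hi] degrees."""
    return lo <= abs(stride_slope_deg(dz, dxy)) <= hi

strides = [(0.01, 1.4), (0.12, 1.3), (0.13, 1.3), (0.00, 1.5)]   # (dz m, dxy m), toy data
print([on_ramp(dz, dxy) for dz, dxy in strides])   # middle strides look like a ramp
```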

Relevance: 30.00%

Abstract:

This paper presents a technique for achieving a class of optimizations related to the reduction of checks within cycles. The technique uses both Program Transformation and Abstract Interpretation. After a first pass of an abstract interpreter which detects simple invariants, program transformation is used to build a hypothetical situation that simplifies some predicates that should be executed within the cycle. This transformation implements the heuristic hypothesis that once conditional tests hold they may continue doing so recursively. Specialized versions of predicates are generated to detect and exploit those cases in which the invariance may hold. Abstract interpretation is then used again to verify the truth of such hypotheses and confirm the proposed simplification. This allows optimizations that go beyond those possible with only one pass of the abstract interpreter over the original program, as is normally the case. It also allows selective program specialization using a standard abstract interpreter not specifically designed for this purpose, thus simplifying the design of this already complex module of the compiler. In the paper, a class of programs amenable to such optimization is presented, along with some examples and an evaluation of the proposed techniques in some application areas such as floundering detection and reducing run-time tests in automatic logic program parallelization. The analysis of the examples presented has been performed automatically by an implementation of the technique using existing abstract interpretation and program transformation tools.
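The following Python fragment is only a loose, language-shifted analogue of the optimization described above for logic programs: a check that is invariant across loop iterations is decided once in a specialized version, so the loop body becomes check-free. In the paper the corresponding hypothesis is verified statically by a second abstract-interpretation pass; here it is simply made explicit in the specialized function.

```python
# Loose Python analogue (not the paper's logic-programming setting) of removing an
# invariant check from a loop by generating a specialized, check-free version.

def scale_all(values, factor):
    out = []
    for v in values:
        if factor != 0:                 # invariant check re-executed every iteration
            out.append(v / factor)
        else:
            out.append(float("inf"))
    return out

def scale_all_specialized(values, factor):
    # The invariant is decided once; the loop body carries no check. In the paper
    # this kind of hypothesis is confirmed by abstract interpretation, not assumed.
    if factor != 0:
        return [v / factor for v in values]
    return [float("inf")] * len(values)

print(scale_all([2.0, 4.0, 6.0], 2.0))
print(scale_all_specialized([2.0, 4.0, 6.0], 2.0))   # same result, fewer run-time checks
```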

Relevance: 30.00%

Abstract:

Inertial sensors (accelerometers and gyroscopes) have been gradually embedded in the devices that people use in their daily lives thanks to their miniaturization. Nowadays all smartphones have at least one embedded magnetometer and accelerometer, and the most up-to-date ones also contain gyroscopes and barometers. This, together with the fact that the penetration of smartphones is growing steadily, has made possible the design of systems that rely on the information gathered by wearable sensors (in the future contained in smart textiles) or by inertial sensors embedded in a smartphone. The role of these sensors has become key to the development of context-aware and ambient-intelligence applications. Some examples are the monitoring of rehabilitation exercises, the provision of information related to the place that the user is visiting, or the interaction with objects by gesture recognition. The work of this thesis contributes to exploring to which extent this kind of sensors can be useful to support activity recognition and pedestrian tracking, which have been proven to be essential for these applications. Regarding the recognition of the activity that a user performs, the use of sensors embedded in a smartphone (proximity and light sensors, gyroscopes, magnetometers and accelerometers) has been explored. The activities that are detected belong to the group known as 'atomic' activities (e.g. walking at different paces, running, standing), that is, activities or movements that are part of more complex activities such as doing the dishes or commuting. Simple, well-known classifiers that can run embedded in a smartphone have been tested, such as Naïve Bayes, Decision Tables and Decision Trees. In addition, another aim is to estimate the on-body position in which the user is carrying the mobile phone. The objective is not only to choose a classifier trained with data from the corresponding position in order to enhance the classification, but also to trigger actions. Finally, the performance of the different classifiers is analysed, taking into consideration different features and numbers of sensors; the computational and memory load of the classifiers is also measured. On the other hand, an algorithm based on step counting has been proposed. The acceleration information is provided by an accelerometer placed on the foot. The aim is to detect the activity that the user is performing together with an estimate of the distance covered. The step-counting strategy is based on detecting minima and their corresponding maxima; although the counting strategy is not innovative (it includes time windows and amplitude thresholds to prevent under- or overestimation), no user-specific information is required. The field of indoor pedestrian tracking is of interest due to the lack of a localization standard for this kind of environment. A loosely-coupled centralized Extended Kalman Filter has been proposed to perform the fusion of inertial and position measurements. Zero-velocity updates have been applied whenever the foot is detected to be placed on the ground. The results have been obtained in indoor environments using a triangulation algorithm based on RSS measurements, and GPS outdoors. Finally, some applications have been designed to test the usefulness of the work. The first one is called the 'Activity Monitor', whose aim is to prevent sedentary behaviours and to modify habits in order to achieve desired activity levels. Two different versions of the application have been implemented. The first one uses the activity estimation based on the step-counting algorithm, which has been integrated in an OSGi mobile framework acquiring the data from a Bluetooth accelerometer placed on the foot of the individual. The second one uses activity classifiers embedded in an Android smartphone. On the other hand, the design of a 'Travel Logbook' has been planned. The input of this application is the information provided by the activity and localization modules, external databases (e.g. pictures, points of interest, weather) and mobile embedded and virtual sensors (agenda, camera, etc.). The aim is to detect important events in the journey and gather the information necessary to store it as a journal page.
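A minimal sketch of the peak/valley step-counting strategy summarized above: a step is counted when a local maximum of the acceleration magnitude exceeds an amplitude threshold and is sufficiently separated from the previous step. The threshold and minimum spacing are assumed values, not the thesis' tuned parameters, and the test signal is a toy oscillation.

```python
# Sketch: threshold-and-window step counter over acceleration-magnitude samples.
# The amplitude threshold and the minimum gap between steps are assumed values.
import math

def count_steps(acc_mag, threshold=1.5, min_gap=10):
    """acc_mag: acceleration magnitude samples (gravity removed), uniform sampling."""
    steps, last_step = 0, -min_gap
    for i in range(1, len(acc_mag) - 1):
        is_peak = acc_mag[i] > acc_mag[i - 1] and acc_mag[i] > acc_mag[i + 1]
        if is_peak and acc_mag[i] > threshold and i - last_step >= min_gap:
            steps += 1
            last_step = i
    return steps

# Toy oscillation standing in for the foot acceleration while walking.
signal = [2.0 * math.sin(2 * math.pi * i / 40) for i in range(500)]
print(count_steps(signal))
```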

Relevance: 30.00%

Abstract:

The pH response of GaN/AlInN/AlN/GaN ion-sensitive field effect transistors (ISFETs) on Si substrates has been characterized. We analyzed the variation of the surface potential (ΔVsp/ΔpH) and current (ΔIds/ΔpH) with solution pH in devices with the same indium content (17%, in-plane lattice-matched to GaN) and different AlInN thicknesses (6 nm and 10 nm), and compared the results with the literature. The shrinkage of the barrier, which has the effect of increasing the transconductance of the device, makes the two-dimensional electron gas (2DEG) density at the interface very sensitive to changes at the surface. Although the surface potential sensitivity to pH is similar in the two devices, the current change with pH (ΔIds/ΔpH), when biasing the ISFET through a Ag/AgCl reference electrode, is almost 50% higher in the device with the 6 nm AlInN barrier than in the device with the 10 nm barrier. When measuring the current response (ΔIds/ΔpH) without a reference electrode, the device with the thinner AlInN layer has a larger response than the thicker one, by a factor of 140%, and that current response without a reference electrode is only 22% lower than its maximum response obtained using the reference electrode.
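To first order, the drain-current response of an ISFET follows the surface-potential response through the transconductance, ΔIds/ΔpH ≈ gm · ΔVsp/ΔpH, which is why a thinner barrier (higher gm) yields a larger current response for a similar surface-potential sensitivity. The sketch below only evaluates that relation for assumed numbers, not the measured device values.

```python
# Sketch: first-order relation between surface-potential pH sensitivity and
# drain-current pH sensitivity through the transconductance. Numbers are assumed.

def current_sensitivity(gm_mS: float, dVsp_dpH_mV: float) -> float:
    """Return dIds/dpH in microamps per pH unit (gm in mS, dVsp/dpH in mV/pH)."""
    return gm_mS * dVsp_dpH_mV   # mS * mV = uA

thin_barrier  = current_sensitivity(gm_mS=1.5, dVsp_dpH_mV=55.0)   # hypothetical 6 nm device
thick_barrier = current_sensitivity(gm_mS=1.0, dVsp_dpH_mV=55.0)   # hypothetical 10 nm device
print(f"thin/thick current-response ratio: {thin_barrier / thick_barrier:.2f}")
```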

Relevance: 30.00%

Abstract:

In order to achieve total selectivity in electrical distribution networks, it is of great importance to analyze the fault currents in ungrounded power systems. This information helps grant selectivity in electrical distribution networks by ensuring that only the faulted line or feeder is removed from service. In the present work a new selective and directional protection method for ungrounded power systems is evaluated. The new method measures only fault currents to detect earth faults and uses a directional criterion to determine the line under faulty conditions. The main contribution of this new technique is that it can detect earth faults in outgoing lines at any type of substation, avoiding the possible mismatch of traditional directional earth-fault relays. The detection technique is based on comparing the direction of a reference current with the direction of the capacitive earth-fault currents at all the feeders connected to the same busbars. The new method has been validated through computer simulations. The results for the different cases studied are remarkable, proving the validity and usefulness of the new method.
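A minimal sketch of the directional comparison described above: each feeder's zero-sequence (capacitive) current phasor is compared against a reference current phasor, and a feeder whose current direction opposes the reference is flagged as the faulted one. The phasor values and the 90-degree decision rule are illustrative assumptions, not the paper's exact criterion.

```python
# Sketch: flag the faulted feeder by comparing the direction of each feeder's
# zero-sequence current against a reference phasor. Values and the +/-90 degree
# decision rule are illustrative assumptions.
import cmath
import math

def is_faulted(i_feeder: complex, i_ref: complex) -> bool:
    """Faulted if the feeder current points 'against' the reference (angle > 90 deg)."""
    angle = cmath.phase(i_feeder / i_ref)      # angle of feeder current w.r.t. reference
    return abs(angle) > math.pi / 2

i_ref = cmath.rect(1.0, math.radians(90))      # reference current phasor (assumed)
feeders = {
    "line A": cmath.rect(0.4, math.radians(95)),    # healthy: roughly aligned with reference
    "line B": cmath.rect(0.6, math.radians(80)),    # healthy
    "line C": cmath.rect(1.8, math.radians(-95)),   # faulted: opposes the reference
}
for name, i in feeders.items():
    print(name, "-> faulted" if is_faulted(i, i_ref) else "-> healthy")
```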