922 results for change detection analysis
Abstract:
Background: Analysis of exhaled volatile organic compounds (VOCs) in breath is an emerging approach for cancer diagnosis, but little is known about its potential use as a biomarker for colorectal cancer (CRC). We investigated whether a combination of VOCs could distinguish CRC patients from healthy volunteers. Methods: In a pilot study, we prospectively analyzed the breath exhalations of 38 CRC patients and 43 healthy controls, all scheduled for colonoscopy, older than 50, and in the average-risk category. The samples were ionized and analyzed using Secondary ElectroSpray Ionization (SESI) coupled with a Time-of-Flight Mass Spectrometer (SESI-MS). After a minimum of 2 hours of fasting, volunteers exhaled deeply into the system. Each test requires three soft exhalations and takes less than ten minutes. No breath condensate or collection is required, and VOC masses are detected in real time, which also allows a spirometric profile to be analyzed along with the VOCs. A new sampling system precludes ambient air from entering the system, reducing background contamination by an overall factor of ten. Potential confounding variables from the patient or the environment that could interfere with the results were analyzed. Results: 255 VOCs with masses ranging from 30 to 431 Da were identified in the exhaled breath. Using a classification technique based on the ROC curve for each VOC, a set of 9 biomarkers discriminating CRC patients from healthy volunteers was obtained, showing an average recognition rate of 81.94%, a sensitivity of 87.04%, and a specificity of 76.85%. Conclusions: A combination of qualitative and quantitative analysis of VOCs in exhaled breath could be a powerful diagnostic tool for the average-risk CRC population. These results should be interpreted with caution, as many endogenous or exogenous contaminants could act as confounding variables. On-line analysis with SESI-MS is less time-consuming and requires no sample preparation.
We are now recruiting for a new pilot study that includes breath-cleaning procedures and incorporates spirometric analysis into the post-processing algorithms, to better control for confounding variables.
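The biomarker selection step described above, ranking each VOC by its individual ROC behaviour and keeping the most discriminative ones as the panel, can be sketched as follows. The data, feature count, and panel size below are toy values invented for illustration, not the study's:

```python
import random

def auc(neg, pos):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic."""
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

def rank_biomarkers(controls, patients, k=9):
    """Score each VOC feature by how far its AUC is from 0.5 (chance level)
    and keep the top-k features as the biomarker panel."""
    n_features = len(controls[0])
    scores = []
    for j in range(n_features):
        a = auc([c[j] for c in controls], [p[j] for p in patients])
        scores.append((abs(a - 0.5), j))
    scores.sort(reverse=True)
    return [j for _, j in scores[:k]]

# Toy data: feature 0 is discriminative, feature 1 is pure noise.
random.seed(0)
controls = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(40)]
patients = [[random.gauss(2, 1), random.gauss(0, 1)] for _ in range(40)]
panel = rank_biomarkers(controls, patients, k=1)
```

A real panel would then feed a downstream classifier; here the ranking alone illustrates the per-VOC ROC idea.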
Abstract:
This thesis proposes a comprehensive approach to the monitoring and management of Quality of Experience (QoE) in multimedia delivery services over IP. It addresses the problem of preventing, detecting, measuring, and reacting to QoE degradations under the constraints of a service provider: the solution must scale to a wide IP network delivering individual media streams to thousands of users. The proposed monitoring solution is called QuEM (Qualitative Experience Monitoring). It is based on detecting degradations in the network Quality of Service (packet losses, bandwidth drops, etc.) and mapping each degradation event to a qualitative description of its effect on the perceived Quality of Experience (audio mutes, video artifacts, etc.). This mapping relies on analysis of the transport and Network Abstraction Layer information of the coded stream, and characterizes the most relevant defects observed in this kind of service: screen freezes, macroblocking, audio mutes, video quality drops, delays, and service outages. The results have been validated by subjective quality assessment tests. The methodology used for those tests was itself designed to mimic as closely as possible the viewing conditions of a real user of such services: the impairments under evaluation are introduced at random points in a continuous video sequence.
Based on the monitoring solution, several applications have also been proposed: an unequal error protection system that gives stronger protection to the parts of the stream most critical for the QoE, a solution that applies the same principles to minimize the impact of incomplete segment downloads in HTTP Adaptive Streaming, and a selective scrambling algorithm that ciphers only the most sensitive parts of the media stream. A fast channel change application is also presented, together with a discussion of how to apply the previous results and concepts in a 3D video scenario.
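The core QuEM idea, mapping a transport-level degradation event to a qualitative QoE label, might look roughly like the sketch below. The event fields and decision rules are illustrative assumptions, not the thesis' actual classifier:

```python
def classify_impairment(event):
    """Map a transport-level degradation event to a qualitative QoE label:
    the perceived effect depends on what the lost data carried."""
    kind = event["kind"]            # "packet_loss" or "bandwidth_drop"
    stream = event.get("stream")    # "audio" or "video"
    if kind == "bandwidth_drop":
        return "video quality drop"
    if stream == "audio":
        return "audio mute"
    # Video packet loss: losing a reference frame stalls decoding until the
    # next random access point; losing part of a frame causes macroblocking.
    if event.get("frame_type") in ("I", "IDR"):
        return "screen freeze"
    return "macroblocking"

events = [
    {"kind": "packet_loss", "stream": "audio"},
    {"kind": "packet_loss", "stream": "video", "frame_type": "IDR"},
    {"kind": "packet_loss", "stream": "video", "frame_type": "B"},
    {"kind": "bandwidth_drop", "stream": "video"},
]
labels = [classify_impairment(e) for e in events]
```

The real system derives these decisions from transport and Network Abstraction Layer inspection rather than pre-labeled event dictionaries.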
Abstract:
FBGs are excellent strain sensors because of their small size and multiplexing capability. Tens to hundreds of sensors may be embedded into a structure, as has already been demonstrated. Nevertheless, they only afford strain measurements at local points, so unless the damage affects the strain readings in a distinguishable manner, it will go undetected. This paper shows the experimental results obtained on the wing of a UAV, instrumented with 32 FBGs, before and after small damages were introduced. Principal Component Analysis (PCA) is a multivariate analysis technique that reduces a complex data set to a lower dimension, revealing hidden patterns that underlie it. The PCA algorithm was able to distinguish the damage cases, even for small cracks.
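A minimal sketch of this style of PCA-based damage detection: learn the dominant strain pattern from baseline sensor readings, then flag measurements with a large residual (a Q-statistic) outside that subspace. The sensor count, load model, and noise levels below are invented for illustration, not the paper's setup:

```python
import random

def first_pc(X, iters=200):
    """Dominant principal component of the rows of X, by power iteration."""
    m = len(X[0])
    mu = [sum(r[j] for r in X) / len(X) for j in range(m)]
    Xc = [[r[j] - mu[j] for j in range(m)] for r in X]
    v = [1.0] * m
    for _ in range(iters):
        # w = (Xc^T Xc) v, computed without forming the covariance matrix
        proj = [sum(r[j] * v[j] for j in range(m)) for r in Xc]
        w = [sum(proj[i] * Xc[i][j] for i in range(len(Xc))) for j in range(m)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return mu, v

def q_statistic(x, mu, v):
    """Squared residual of x after projecting onto the baseline subspace."""
    xc = [x[j] - mu[j] for j in range(len(x))]
    t = sum(xc[j] * v[j] for j in range(len(x)))
    return sum((xc[j] - t * v[j]) ** 2 for j in range(len(x)))

random.seed(1)
# Baseline: strain at 4 FBGs dominated by one load pattern plus noise.
baseline = []
for _ in range(100):
    load = random.gauss(0, 1)
    baseline.append([load * c + random.gauss(0, 0.05) for c in (1.0, 0.8, 0.6, 0.4)])
mu, v = first_pc(baseline)
healthy = [1.0, 0.8, 0.6, 0.4]
damaged = [1.0, 0.8, 0.1, 0.4]   # a local crack distorts sensor 3's reading
```

A healthy reading lies in the learned subspace (tiny Q); a damaged one does not.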
Abstract:
In this paper we present an innovative technique to tackle the problem of automatic road sign detection and tracking using an on-board stereo camera. It involves a continuous 3D analysis of the road sign during the whole tracking process. First, a color- and appearance-based model is applied to generate road sign candidates in both stereo images. A sparse disparity map between the left and right images is then created for each candidate by using contour-based and SURF-based matching in the far and short range, respectively. Once the map has been computed, the correspondences are back-projected to generate a cloud of 3D points, and the best-fit plane is computed through RANSAC, ensuring robustness to outliers. Temporal consistency is enforced by means of a Kalman filter, which exploits the intrinsic smoothness of the 3D camera motion in traffic environments. Additionally, the estimated plane makes it possible to correct deformations due to perspective, thus easing further sign classification.
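The RANSAC plane-fitting step can be sketched as below: repeatedly fit a plane to a random 3-point sample and keep the hypothesis with the most inliers. Point counts, noise levels, and the inlier tolerance are arbitrary toy values, not the paper's:

```python
import random

def plane_from_points(p, q, r):
    """Plane (unit normal n, offset d) through three points, n.x + d = 0."""
    u = [q[i] - p[i] for i in range(3)]
    v = [r[i] - p[i] for i in range(3)]
    n = [u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0]]
    norm = sum(c * c for c in n) ** 0.5
    if norm == 0:                      # degenerate (collinear) sample
        return None
    n = [c / norm for c in n]
    return n, -sum(n[i] * p[i] for i in range(3))

def ransac_plane(points, iters=200, tol=0.02):
    """Best-fit plane by RANSAC: the hypothesis with most inliers wins."""
    best, best_inliers = None, []
    for _ in range(iters):
        fit = plane_from_points(*random.sample(points, 3))
        if fit is None:
            continue
        n, d = fit
        inliers = [p for p in points
                   if abs(sum(n[i] * p[i] for i in range(3)) + d) < tol]
        if len(inliers) > len(best_inliers):
            best, best_inliers = (n, d), inliers
    return best, best_inliers

random.seed(2)
# Sign surface near z = 5 with small noise, plus gross matching outliers.
cloud = [[random.uniform(-1, 1), random.uniform(-1, 1), 5 + random.gauss(0, 0.005)]
         for _ in range(80)]
cloud += [[random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(0, 10)]
          for _ in range(20)]
(normal, d), inliers = ransac_plane(cloud)
```

The recovered normal is what allows the perspective correction mentioned in the abstract.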
Abstract:
Nonlinear analysis tools for studying and characterizing the dynamics of physiological signals have gained popularity, mainly because tracking sudden alterations in the inherent complexity of biological processes might be an indicator of altered physiological states. Typically, in order to perform an analysis with such tools, the physiological variables that describe the biological process under study are used to reconstruct its underlying dynamics. For that goal, a procedure called time-delay or uniform embedding is usually employed. Nonetheless, there is evidence of its inability to deal with non-stationary signals, such as those recorded from many physiological processes. To handle this drawback, this paper evaluates the utility of non-conventional time series reconstruction procedures based on non-uniform embedding, applying them to automatic pattern recognition tasks. The paper compares a state-of-the-art non-uniform approach with a novel scheme that fuses embedding and feature selection at once, searching for better reconstructions of the dynamics of the system. Results are also compared with two classic uniform embedding techniques. The goal is thus to compare uniform and non-uniform reconstruction techniques, including the one proposed in this work, for pattern recognition in biomedical signal processing tasks. Once the state space is reconstructed, the scheme characterizes it with three classic nonlinear dynamic features (Largest Lyapunov Exponent, Correlation Dimension, and Recurrence Period Density Entropy), while classification is carried out by means of a simple k-nn classifier. In order to test its generalization capabilities, the approach was tested on three different physiological databases (Speech Pathologies, Epilepsy, and Heart Murmurs).
In terms of the accuracy obtained in automatically detecting the presence of pathologies, and for the three types of biosignals analyzed, the non-uniform techniques used in this work slightly outperformed the uniform methods, suggesting their usefulness for characterizing non-stationary biomedical signals in pattern recognition applications. Moreover, in view of the results obtained and its low computational load, the proposed technique appears well suited to the applications under study.
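Two of the classic building blocks mentioned above, a uniform time-delay embedding and a plain k-nn vote, can be sketched as follows (the non-uniform variants and the three nonlinear features are not shown; the feature space in the k-nn demo is hypothetical):

```python
import math
from collections import Counter

def embed(x, dim, tau):
    """Uniform time-delay embedding: each state vector stacks `dim`
    samples of the series spaced `tau` steps apart (Takens reconstruction)."""
    n = len(x) - (dim - 1) * tau
    return [[x[i + j * tau] for j in range(dim)] for i in range(n)]

def knn_predict(train, query, k=3):
    """Plain k-nearest-neighbour vote over (feature_vector, label) pairs."""
    ranked = sorted(train, key=lambda fl: sum((a - b) ** 2
                                              for a, b in zip(fl[0], query)))
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# Embed a toy signal: 100 samples of a slow sine.
series = [math.sin(0.1 * i) for i in range(100)]
states = embed(series, dim=3, tau=5)

# Classify a point in an (already extracted) 2-D feature space.
train = [([0.0, 0.0], "normal"), ([0.0, 1.0], "normal"),
         ([5.0, 5.0], "pathological"), ([5.0, 6.0], "pathological")]
label = knn_predict(train, [0.0, 0.5], k=3)
```

In the paper's pipeline the embedded states would first be summarized by the three nonlinear features before reaching the k-nn stage.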
Abstract:
The use of a common processing environment for different powder foods in industry has increased the risk of finding peanut traces in powder foods. The analytical methods commonly used for the detection of peanut, such as enzyme-linked immunosorbent assay (ELISA) and real-time polymerase chain reaction (RT-PCR), offer high specificity and sensitivity, but they are destructive and time-consuming and require highly skilled experimenters. Here, the feasibility of NIR hyperspectral imaging (HSI) is studied for the detection of peanut traces down to 0.01% by weight. A principal component analysis (PCA) was carried out on a dataset of peanut and flour spectra. The obtained loadings were applied to HSI images of wheat flour samples adulterated with peanut traces. As a result, the HSI images were reduced to score images with enhanced contrast between peanut and flour particles. Finally, a threshold was fixed on the score images to obtain a binary classification image, and the percentage of peanut adulteration was compared with the percentage of pixels identified as peanut particles. This study allowed the detection of peanut traces down to 0.01% and the quantification of peanut adulteration from 10% to 0.1% with a coefficient of determination (r²) of 0.946. These results show the feasibility of using HSI systems for the detection of peanut traces in conjunction with chemical procedures, such as RT-PCR and ELISA, to facilitate enhanced quality-control surveillance on food-product processing lines.
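The projection-and-threshold pipeline (loadings, score image, binary image, adulteration percentage) reduces to a few lines. The tiny cube and loading vector below are made up for illustration; real HSI cubes have hundreds of bands and the loading comes from the PCA:

```python
def score_image(cube, loading):
    """Project each pixel spectrum onto a PCA loading vector,
    collapsing the hyperspectral cube to a single score image."""
    return [[sum(s * l for s, l in zip(pixel, loading)) for pixel in row]
            for row in cube]

def peanut_fraction(scores, threshold):
    """Binary classification of the score image: fraction of pixels
    whose score exceeds the threshold (flagged as peanut)."""
    flagged = total = 0
    for row in scores:
        for s in row:
            total += 1
            flagged += s > threshold
    return flagged / total

# Toy 2x3 image with 2 spectral bands; the hypothetical loading
# responds to band 2, where peanut is assumed to reflect strongly.
cube = [[[0.1, 0.9], [0.1, 0.1], [0.1, 0.1]],
        [[0.2, 0.8], [0.1, 0.1], [0.1, 0.2]]]
loading = [0.0, 1.0]
scores = score_image(cube, loading)
frac = peanut_fraction(scores, threshold=0.5)
```

The flagged-pixel percentage is the quantity the study regresses against the true adulteration percentage.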
Abstract:
Climate Change, Water Scarcity in Agriculture and the Country-Level Economic Impacts: A Multimarket Analysis. Abstract: Agriculture could be one of the economic sectors most vulnerable to the impacts of climate change in the coming decades. Considering the critical role that water plays in agricultural production, any shock in water availability will have great implications for agricultural production, land allocation, and agricultural prices. In this paper, an agricultural multimarket model is developed to analyze climate change impacts in developing countries, accounting for the uncertainty associated with those impacts. The model has a structure flexible enough to represent local conditions, resource availability, and market conditions. The results suggest different economic consequences of climate change depending on the specific activity, with many distributional effects across regions.
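As a toy illustration of the mechanism such a model captures, consider a single linear market cleared before and after a water-scarcity shock to supply. The real multimarket model links many products and regions through cross-price effects; every coefficient here is invented:

```python
def equilibrium(a, b, c, d):
    """Clear one linear market: demand q = a - b*p equals supply q = c + d*p."""
    p = (a - c) / (b + d)
    return p, a - b * p

# Hypothetical crop market, baseline vs. a water-scarcity year: scarcity
# shifts the supply intercept down (less can be grown at any price),
# so the equilibrium price rises and the traded quantity falls.
p0, q0 = equilibrium(a=100, b=2, c=10, d=3)    # normal water availability
p1, q1 = equilibrium(a=100, b=2, c=-20, d=3)   # water shock
```

The distributional effects the paper reports arise when many such markets, with different shock sizes, interact.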
Abstract:
The effects of global change on forests are one of the major concerns of the 21st century. Some of the potential impacts of global change on forest growth, productivity, biodiversity, or changes in species assembly and spatial distribution may have great ecological and economic consequences. The detection and monitoring of these effects are among the major challenges that scientists and forest managers face nowadays. Based on the comparison of historical series of the Spanish National Forest Inventory (NFI), this thesis tries to shed light on some of the impacts driven by recent socio-economic and environmental changes on our forest ecosystems. Firstly, it presents an innovative methodology based on geostatistical techniques that allows the comparison of different NFI cycles regardless of the different sampling methods used in each of them (Chapter 3). This methodology, in conjunction with other statistical techniques, makes it possible to analyze changes in the spatial distribution and population dynamics of forest species along different geographic gradients. Through its application, the thesis presents some of the first evidence of changes in species distribution along altitudinal and latitudinal gradients in the Iberian Peninsula. The analysis of these findings, together with species population dynamics and demographic rates, will help test some biogeographical hypotheses on forests under climate change scenarios in areas of particular vulnerability (Chapters 3, 4 and 5). Finally, by comparing successive cycles of NFI permanent plots, the thesis deepens our knowledge of the patterns and processes associated with the recent spread of invasive species in the forest ecosystems of north-western Iberia, one of the areas most affected by the invasion of alien species at the national scale (Chapter 6).
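One elementary ingredient of such cycle comparisons, the shift in a species' mean plot elevation between two inventories, can be sketched as follows. The plot records, field names, and species are hypothetical; the thesis' geostatistical methodology is far richer than this single statistic:

```python
from statistics import mean

def elevation_shift(cycle1, cycle2, species):
    """Mean-elevation shift of a species' occupied plots between two
    inventory cycles; a positive value suggests upslope displacement."""
    e1 = [p["elev"] for p in cycle1 if species in p["species"]]
    e2 = [p["elev"] for p in cycle2 if species in p["species"]]
    return mean(e2) - mean(e1)

# Hypothetical plots: the species disappears from its lowest plot
# between cycles, so its mean occupied elevation moves upslope.
ifn2 = [{"elev": 800, "species": {"F. sylvatica"}},
        {"elev": 1000, "species": {"F. sylvatica"}},
        {"elev": 1200, "species": {"F. sylvatica"}}]
ifn3 = [{"elev": 800, "species": set()},
        {"elev": 1000, "species": {"F. sylvatica"}},
        {"elev": 1200, "species": {"F. sylvatica"}}]
shift = elevation_shift(ifn2, ifn3, "F. sylvatica")
```

In practice the comparison must also correct for the differing sampling designs of the cycles, which is what the geostatistical machinery is for.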
Abstract:
This thesis focuses on the analysis of two complementary aspects of cybercrime (i.e., crime perpetrated over the network for financial gain): the infected machines used to obtain economic profit from crime through different actions (e.g., click fraud, DDoS, spam), and the infrastructure of servers used to manage these machines (e.g., C&C, exploit servers, monetization servers, redirectors). The first part investigates the threat exposure of victim computers. For this analysis we used the metadata contained in WINE-BR, a Symantec dataset. This dataset contains installation metadata for executable files (e.g., file hash, publisher, installation date, file name, file version) from 8.4 million Windows users. We associated this metadata with the vulnerabilities in the National Vulnerability Database (NVD) and the Open Sourced Vulnerability Database (OSVDB) in order to track vulnerability decay over time, and to observe how quickly users patch their systems and hence their exposure to potential attacks. We identified three factors that can influence the patching activity of victim computers: shared code, user type, and exploits. We present two novel attacks against shared code and an analysis of how user expertise and exploit availability influence patching activity. For the 80 vulnerabilities in our database that affect code shared between two applications, the time between patch releases for the different applications is up to 118 days (with a median of 11 days). The second part proposes new active probing techniques to detect and analyze malicious server infrastructures. We leverage active probing to detect malicious servers on the Internet, starting with the analysis and detection of exploit server operations. We define an operation as the set of servers that are controlled by the same people and possibly participate in the same infection campaign. We analyzed a total of 500 exploit servers over a period of one year, finding that 2/3 of the operations had a single server while 1/3 had multiple servers. We extended the exploit server detection technique to other server types (e.g., C&C, monetization servers, redirectors) and achieved Internet-scale probing for the different categories of malicious servers. These new techniques have been incorporated into a new tool called CyberProbe. To detect these servers we developed a novel technique called Adversarial Fingerprint Generation, a methodology for building a unique request-response model that identifies a server family (i.e., the type and operation to which a server belongs). Starting from a malware sample and an active server of a given family, CyberProbe can generate a valid fingerprint to detect all live servers of that family. We performed 11 Internet-wide scans, detecting 151 malicious servers; 75% of these servers were unknown to public databases of malicious servers. Another issue that arises while detecting malicious servers is that some of them may be hidden behind a silent reverse proxy. To measure the prevalence of this network configuration and to improve the capabilities of CyberProbe, we developed RevProbe, a new tool that detects reverse proxies by leveraging leakage in the configuration of reverse web proxies. RevProbe identifies that 16% of the active malicious IP addresses analyzed correspond to reverse proxies, that 92% of them are silent (compared with 55% of benign reverse proxies), and that they are mainly used for load balancing across multiple servers.
ABSTRACT In this dissertation we investigate two fundamental aspects of cybercrime: the infection of machines used to monetize the crime, and the malicious server infrastructures that are used to manage the infected machines. In the first part of this dissertation, we analyze how fast software vendors apply patches to secure client applications, identifying shared code as an important factor in patch deployment. Shared code is code present in multiple programs. When a vulnerability affects shared code, the usual linear vulnerability life cycle is no longer adequate to describe how patch deployment takes place. In this work we show the consequences of shared code vulnerabilities and demonstrate two novel attacks that can be used to exploit this condition. In the second part of this dissertation we analyze malicious server infrastructures; our contributions are: a technique to cluster exploit server operations, a tool named CyberProbe to perform large-scale detection of different malicious server categories, and RevProbe, a tool that detects silent reverse proxies. We start by identifying exploit server operations, that is, exploit servers managed by the same people. We investigate a total of 500 exploit servers over a period of more than 13 months. We collected malware from these servers together with all the metadata related to the communication with them. From this metadata we extracted features to group together servers managed by the same entity (i.e., an exploit server operation), discovering that 2/3 of the operations have a single server while 1/3 have multiple servers. Next, we present CyberProbe, a tool that detects different malicious server types through a novel technique called adversarial fingerprint generation (AFG). The idea behind CyberProbe's AFG is to run a piece of malware and observe its network communication towards malicious servers. It then replays this communication to the malicious server and outputs a fingerprint (i.e., a port selection function, a probe generation function, and a signature generation function). Once the fingerprint is generated, CyberProbe scans the Internet with it and finds all the servers of a given family. We performed a total of 11 Internet-wide scans, finding 151 new servers starting from 15 seed servers. This gives CyberProbe a tenfold amplification factor. Moreover, we compared CyberProbe with existing blacklists on the Internet, finding that only 40% of the servers detected by CyberProbe were listed. To enhance the capabilities of CyberProbe we developed RevProbe, a reverse proxy detection tool that can be integrated with CyberProbe to allow precise detection of silent reverse proxies used to hide malicious servers. RevProbe leverages leakage-based detection techniques to detect whether a malicious server is hidden behind a silent reverse proxy, and to reveal the infrastructure of servers behind it. At the core of RevProbe is the analysis of differences in traffic obtained by interacting with a remote server.
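A fingerprint in the spirit of the adversarial fingerprint generation output (a port to probe, a request to replay, and a signature the response must match) can be sketched as below. The URL path, header name, and regex are invented for illustration and do not correspond to any real malware family or to CyberProbe's actual fingerprint format:

```python
import re

# Hypothetical family fingerprint: which port to probe, what request to
# send, and a signature the response must match to flag the server.
FINGERPRINT = {
    "port": 8080,
    "probe": b"GET /gate.php?id=probe HTTP/1.1\r\nHost: %s\r\n\r\n",
    "signature": re.compile(rb"X-Bot-Config: v\d+"),
}

def matches(fingerprint, response):
    """Classify a raw server response against a family fingerprint."""
    return fingerprint["signature"].search(response) is not None

benign = b"HTTP/1.1 404 Not Found\r\n\r\n"
c2like = b"HTTP/1.1 200 OK\r\nX-Bot-Config: v42\r\n\r\n"
```

The scanner would send the probe to every host on the chosen port and keep only the hosts whose responses match the signature.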
Abstract:
Frequency Response Analysis is a well-known technique for the diagnosis of power transformers. Currently, this technique is being investigated for application to rotating electrical machines. This paper presents significant results on the application of Frequency Response Analysis to fault detection in the field winding of synchronous machines with static excitation. First, the influence of the rotor position on the frequency response is evaluated. Second, some relevant test results are shown regarding ground fault and inter-turn fault detection in field windings at standstill. The influence of the fault resistance value is also taken into account. This paper also studies the applicability of Frequency Response Analysis to fault detection in field windings while rotating. This is an important feature because some defects only appear when the machine runs at rated speed. Several laboratory test results show the applicability of this fault detection technique in field windings at full speed with no excitation current.
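A minimal way to compare two FRA magnitude traces is a correlation-based deviation index, a common practice in FRA interpretation though not necessarily this paper's metric. The traces below are synthetic:

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length traces."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def fra_deviation(baseline_db, test_db):
    """Deviation index between two FRA magnitude traces (dB vs frequency):
    near 0 for identical shapes, growing as resonances shift or appear."""
    return 1.0 - pearson(baseline_db, test_db)

# Synthetic magnitude traces over 200 frequency points.
baseline = [10 * math.sin(0.05 * i) - 20 for i in range(200)]
healthy = [v + 0.01 for v in baseline]                  # repeatability noise
faulted = list(baseline)
faulted[120:140] = [v + 6 for v in faulted[120:140]]    # local resonance shift
```

Comparing the index against a threshold calibrated at standstill (per rotor position) is one plausible way to use it; an inter-turn fault would show up as a localized trace change like the one simulated here.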
Abstract:
A methodology, fluorescence-intensity distribution analysis, has been developed for confocal microscopy studies in which the fluorescence intensity of a sample with a heterogeneous brightness profile is monitored. An adjustable formula modeling the spatial brightness distribution and the technique of generating functions for calculating theoretical photon count number distributions serve as the two cornerstones of the methodology. The method permits the simultaneous determination of the concentrations and specific brightness values of a number of individual fluorescent species in solution. Accordingly, we present an extremely sensitive tool for monitoring the interaction of fluorescently labeled molecules or other microparticles with their respective biological counterparts, which should find wide application in the life sciences, medicine, and drug discovery. Its potential is demonstrated by studying the hybridization of 5′-(6-carboxytetramethylrhodamine)-labeled and nonlabeled complementary oligonucleotides and the subsequent cleavage of the DNA hybrids by restriction enzymes.
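The physical effect the method exploits (molecule-number fluctuations make photon counts wider than Poisson, with concentration and brightness entering the distribution differently) can be illustrated with a small simulation. The species parameters are arbitrary, and this Monte Carlo sketch stands in for the paper's generating-function calculation:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's Poisson sampler (adequate for small rates)."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= l:
            return k
        k += 1

def photon_counts(species, n_windows, rng):
    """Per-window photon counts for a mixture of fluorescent species.
    species: list of (mean occupancy c, specific brightness q) pairs.
    Fluctuating molecule numbers broaden the count distribution beyond
    Poisson; fitting that broadening separates each (c, q) pair."""
    counts = []
    for _ in range(n_windows):
        k = 0
        for c, q in species:
            for _ in range(poisson(c, rng)):   # molecules in the volume
                k += poisson(q, rng)           # photons from one molecule
        counts.append(k)
    return counts

rng = random.Random(3)
counts = photon_counts([(1.0, 2.0), (0.2, 8.0)], 5000, rng)
m = sum(counts) / len(counts)
var = sum((c - m) ** 2 for c in counts) / len(counts)
# Compound-Poisson theory: mean = sum c*q = 3.6; var = sum c*q*(1+q) = 20.4.
```

A dim-but-concentrated species and a bright-but-rare species can produce the same mean intensity yet very different count distributions, which is why the distribution, not just the mean, is informative.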
Abstract:
Objective: To identify which aspects of socioeconomic change were associated with the steep decline in life expectancy in Russia between 1990 and 1994.
Abstract:
Early detection is an effective means of reducing cancer mortality. Here, we describe a highly sensitive high-throughput screen that can identify panels of markers for the early detection of solid tumor cells disseminated in peripheral blood. The method is a two-step combination of differential display and high-sensitivity cDNA arrays. In a primary screen, differential display identified 170 candidate marker genes differentially expressed between breast tumor cells and normal breast epithelial cells. In a secondary screen, high-sensitivity arrays assessed expression levels of these genes in 48 blood samples, 22 from healthy volunteers and 26 from breast cancer patients. Cluster analysis identified a group of 12 genes that were elevated in the blood of cancer patients. Permutation analysis of individual genes defined five core genes (P ≤ 0.05, permax test). As a group, the 12 genes generally distinguished accurately between healthy volunteers and patients with breast cancer. Mean expression levels of the 12 genes were elevated in 77% (10 of 13) untreated invasive cancer patients, whereas cluster analysis correctly classified volunteers and patients (P = 0.0022, Fisher's exact test). Quantitative real-time PCR confirmed array results and indicated that the sensitivity of the assay (1:2 × 10⁸ transcripts) was sufficient to detect disseminated solid tumor cells in blood. Expression-based blood assays developed with the screening approach described here have the potential to detect and classify solid tumor cells originating from virtually any primary site in the body.
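The permutation analysis of an individual gene can be sketched as a standard two-sample permutation test on the difference of group means. The expression values below are toy numbers, not data from the study, and the permax procedure additionally corrects across genes:

```python
import random

def permutation_test(group_a, group_b, n_perm=2000, rng=None):
    """Two-sample permutation test on the difference of means: how often
    does a random relabelling give a gap at least as large as observed?"""
    rng = rng or random.Random(0)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        a, b = pooled[:len(group_a)], pooled[len(group_a):]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)   # add-one to avoid a zero p-value

# Toy expression levels for one gene: elevated in patients vs volunteers.
volunteers = [1.0, 1.2, 0.9, 1.1, 1.0, 0.8, 1.1, 0.9]
patients   = [2.1, 1.9, 2.4, 2.0, 2.2, 1.8, 2.3, 2.0]
p_value = permutation_test(volunteers, patients)
```

A gene passing such a test at P ≤ 0.05 after multiplicity correction would qualify as one of the "core" markers in the study's terminology.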
Abstract:
Climate change is critically impacting the environment and economy at the local level. County governments have an opportunity to adopt climate change policies that address local environmental and economic concerns. The Colorado counties of Boulder, Gunnison, and Pitkin have all adopted some form of climate change policy. Some components of each of these policies are more effective than others in terms of economic, environmental, and community benefits. An effective climate change policy clearly states specific cost analyses, environmental impacts at the local level, the relationship between those impacts and the community, and the economic benefits of policy adoption. This Capstone project addresses specific cost and energy analyses and provides a beneficial policy framework for county governments.