957 results for detectors
Abstract:
We are investigating the performance of a data acquisition system for time-of-flight PET, based on LYSO crystal slabs and 64-channel silicon photomultiplier (SiPM) matrices (1.2 cm² of active area each). Measurements have been performed to test the timing capability of the detection system (SiPM matrices coupled to a LYSO slab and the read-out electronics) with both test signals and a radioactive source.
Abstract:
In this paper, the authors provide a methodology for designing nonparametric permutation tests and, in particular, nonparametric rank tests for detection applications. In the first part of the paper, the authors develop the optimization theory of both permutation and rank tests in the Neyman-Pearson sense; in the second part, they carry out a comparative performance analysis of the permutation and rank tests (detectors) against parametric ones in radar applications. First, a brief review of some contributions on nonparametric tests is given. Then, the optimum permutation and rank tests are derived. Finally, a performance analysis of the corresponding detectors is carried out by Monte Carlo simulation, and the results are shown as curves of detection probability versus signal-to-noise ratio.
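As an illustration of the kind of Monte Carlo study described above, the sketch below estimates the detection probability of a simple one-sample rank detector that compares the rank of the cell under test within a set of noise-only reference cells against a threshold chosen for a desired false-alarm probability. The Gaussian noise model, the number of reference cells and all parameter values are assumptions made for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rank_detector_pd(snr_db, n_ref=16, pfa=0.1, n_trials=20_000):
    # Under H0 the rank of the cell under test among n_ref noise-only references is
    # uniform on {0, ..., n_ref}, so an integer threshold T yields
    # Pfa = (n_ref + 1 - T) / (n_ref + 1); rounding up keeps the achieved Pfa <= pfa.
    threshold = int(np.ceil((n_ref + 1) * (1.0 - pfa)))
    amplitude = 10.0 ** (snr_db / 20.0)           # signal amplitude for unit-variance noise
    ref = rng.normal(size=(n_trials, n_ref))      # noise-only reference cells
    test = rng.normal(size=n_trials) + amplitude  # cell under test (H1: signal + noise)
    ranks = (ref < test[:, None]).sum(axis=1)     # rank statistic of the cell under test
    return float(np.mean(ranks >= threshold))     # Monte Carlo detection probability

for snr_db in range(0, 16, 3):
    print(f"SNR = {snr_db:2d} dB -> Pd ~ {rank_detector_pd(snr_db):.2f}")
```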
Abstract:
Outline:
• Motivation and aim
• Complement waveguide data on silica
• Optical data in quartz
• Detailed analysis, i.e. both fluence kinetics and resolution
• Efficiency of irradiation and analysis, samples, time...
• Experimental set-up description
• Reflectance procedure
• Options: light source (lasers, white light...), detectors, configurations
• Results and discussion
• Comparison of amorphous and crystalline phases
Abstract:
In this project, noise analysis techniques are applied to characterize the dynamic response of several temperature sensors, both platinum resistance temperature detectors (RTDs) and thermocouples. These sensors are essential for the proper functioning of nuclear power plants and need to be monitored to guarantee accurate measurements. Noise analysis techniques are passive: they do not affect plant operation and allow in situ monitoring of the sensors. Since temperature sensors can be treated as first-order systems, the main parameter to monitor is the response time, which can be obtained for each probe by means of techniques in the frequency domain (spectral analysis) or in the time domain (autoregressive models). Besides response time estimation, a statistical characterization of the probes is also carried out. The goal is to understand the behavior of the sensors and to monitor them so that faults can be diagnosed even at an incipient stage.
Abstract:
It is a known fact that noise analysis is a suitable method for sensor performance surveillance. In particular, monitoring the response time of a sensor is an efficient way to anticipate failures and to have the opportunity to prevent them. In this work the response times of several sensors of Trillo NPP are estimated by means of noise analysis. The procedure consists of modeling each sensor with autoregressive methods and obtaining the desired parameter by analyzing the response of the model when a ramp is simulated as the input signal. Core-exit thermocouples and in-core self-powered neutron detectors are the main sensors analyzed, but other plant sensors are studied as well. Since several measurement campaigns have been carried out, it has also been possible to analyze the evolution of the estimated parameters over more than one fuel cycle. Sensitivity studies on the sampling frequency of the signals and its influence on the response time are also included. Calculations and analysis have been done within the framework of a collaboration agreement between the Trillo NPP operator (CNAT) and the School of Mines of Madrid.
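The ramp-response procedure summarized above can be sketched in a few lines. The snippet below is only an illustration of the idea, with an assumed model order and sampling rate rather than values from the study: an autoregressive model is fitted to a sensor noise record, the normalized AR filter is driven with a unit ramp, and the settled input/output lag is read off as the response time.

```python
import numpy as np
from scipy.signal import lfilter

def fit_ar(x, order):
    """Least-squares fit of x[n] = sum_k a[k] * x[n-k-1] + e[n] (mean removed)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    a, *_ = np.linalg.lstsq(X, x[order:], rcond=None)
    return a

def ramp_response_time(noise_record, order=20, fs=10.0, n_ramp=5000):
    """Response time (s) of the sensor represented by an AR model of its noise record."""
    a = fit_ar(noise_record, order)
    denom = np.concatenate(([1.0], -a))     # A(z) = 1 - sum_k a[k] z^-k
    gain = denom.sum()                      # normalize the filter to unit DC gain
    t = np.arange(n_ramp) / fs              # time axis (s) for the simulated ramp input
    y = lfilter([gain], denom, t)           # ramp response of the fitted model
    # once the transient has settled, y(t) ~ t - tau for a quasi first-order sensor,
    # so the steady input/output lag is the response time
    tail = n_ramp // 10
    return float(np.mean(t[-tail:] - y[-tail:]))
```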
Abstract:
Diabetes mellitus is the set of alterations caused by a defect in the amount of secreted insulin or by a suboptimal use of it. It is a direct cause of short-, medium- and long-term complications that reduce the quality of life and the life expectancy of people with diabetes. Diabetes mellitus is currently one of the most important health problems. Its prevalence has tripled in the past 20 years, and almost 300 million people are expected to have diabetes by 2025. This increased prevalence, together with the morbidity and mortality associated with its micro- and macrovascular complications, makes diabetes a burden on health systems, their financial resources and their professionals, turning the disease into an individual and public health problem of enormous proportions. There is currently no cure for this disease, so the therapeutic goal of diabetes treatment focuses on normalizing blood glucose, trying to minimize hyper- and hypoglycemic events and avoiding, or at least delaying, the appearance and progression of vascular complications, which are the main cause of morbidity and mortality among people with diabetes. Adequate diabetes control requires an individualized treatment that considers many factors for each patient (age, physical activity, eating habits, presence of complications related or unrelated to diabetes, cultural factors, etc.). However, in the short term, the two most influential variables the patient can act on to manage his/her glycemic level are the administered insulin and the diet. Both present a delay between the time of application and the onset of action, associated with their absorption. For this reason, the ability to predict the evolution of the glycemic profile in the near future helps the patient to make appropriate decisions to maintain good control of the disease and to avoid risky situations. This is the goal of glucose prediction in diabetes: to anticipate the evolution of the glycemic profile in the near future so that the patient can adapt his/her lifestyle and corrective actions in such a way that blood glucose levels approach those of a healthy person, thus avoiding the symptoms and complications of poor control. The recent emergence of continuous glucose monitoring systems has provided new alternatives. The availability of an exhaustive record of the variations of the glycemic profile, with a sampling period of between one and five minutes, has favored new models that seek to predict blood glucose using only previous glucose measurements, or at least significantly reducing the input information required by the algorithms. By requiring less intervention from the patient, new application possibilities open up for glucose predictors, making their use feasible in real time: as decision support systems, as detectors of risky situations, or integrated into automatic control algorithms. In this doctoral thesis, different glucose prediction algorithms are proposed for patients with diabetes, based on the information recorded by a continuous glucose monitoring system and incorporating information on the administered insulin and carbohydrate intake.
The proposed algorithms have been evaluated in simulation and using patient data recorded in different clinical studies. To this end, a comprehensive methodology has been developed to characterize the performance of the prediction models from every point of view: accuracy, delay, noise and ability to detect risky situations. The necessary simulation tools have been developed, and the patient databases have been analyzed and prepared. One of the proposed algorithms has also been tested to verify the validity of real-time prediction in a clinical scenario. The tools needed to carry out the defined experimental protocol, in which the patient consults the prediction on demand and retains control over his/her metabolic variables, have also been developed. This experiment has made it possible to assess the impact of using glucose prediction on glycemic control.
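To make the evaluation axes mentioned above concrete, the sketch below shows one plausible way of scoring a predictor's output against the reference CGM trace: RMSE for accuracy, the cross-correlation lag for delay, and the fraction of reference hypoglycemic samples that the prediction also flags. The 5-minute sampling period, the 70 mg/dl threshold and the function itself are illustrative assumptions, not the methodology of the thesis.

```python
import numpy as np

def evaluate_prediction(y_ref, y_pred, ts_min=5.0, hypo_mg_dl=70.0):
    """Score a glucose prediction against the reference CGM trace (same length, mg/dl)."""
    y_ref = np.asarray(y_ref, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)

    rmse = float(np.sqrt(np.mean((y_pred - y_ref) ** 2)))            # accuracy

    # delay: lag (in minutes) maximizing the cross-correlation of the mean-removed
    # signals; a positive value means the prediction trails the reference
    a = y_ref - y_ref.mean()
    b = y_pred - y_pred.mean()
    lag = int(np.argmax(np.correlate(b, a, mode="full"))) - (len(a) - 1)
    delay_min = lag * ts_min

    # hypoglycemia detection: fraction of reference samples below the threshold
    # that the prediction also flags
    ref_hypo = y_ref < hypo_mg_dl
    pred_hypo = y_pred < hypo_mg_dl
    sensitivity = float(np.sum(ref_hypo & pred_hypo) / max(int(np.sum(ref_hypo)), 1))

    return {"rmse_mg_dl": rmse, "delay_min": delay_min, "hypo_sensitivity": sensitivity}
```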
Abstract:
This paper reports on design studies concerning a moderator concept which aims to maximize the time-averaged flux. The idea is to provide neutron spectra showing two clear maxima, with peaks arising from leakage from both the cryogenic and the thermal moderators. Such a concept, known as a bi-spectral moderator (Mezei, 2006 [1]), while proven on a reactor source, has only been examined for the ESS 2003 proposal (Filges et al., 2003 [2]), which featured a different design than the current ESS. This paper thus reports on a baseline design for such a moderator concept and shows that it can provide substantial gains in count rates for those applications not requiring high resolution in time-of-flight.
Abstract:
The ESS-Bilbao facility, hosted by the University of the Basque Country (UPV/EHU), envisages the operation of a high-current proton accelerator delivering beams with energies up to 50 MeV. The time-averaged proton current will be 2.25 mA, delivered in 1.5 ms proton pulses with a repetition rate of 20 Hz. This beam will feed a neutron source based upon the Be(p,n) reaction, which will enable the provision of relevant neutron experimentation capabilities. The neutron source baseline concept consists of a rotating beryllium target cooled by water. The target structure will comprise a rotatable disk made of 6061-T6 aluminium alloy holding 20 beryllium plates. Heat dissipation from the target relies upon a distribution of coolant-flow channels. The practical implementation of such a concept is described here, with emphasis on the thermo-mechanical optimization of the beryllium plates, the chosen coolant distribution system and the mechanical behavior of the assembly.
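As a quick consistency check of the quoted beam parameters (assuming the whole time-averaged current is carried within the 1.5 ms pulses), the duty cycle, peak current and average beam power follow directly:

```python
# Illustrative arithmetic only: quantities implied by the parameters quoted above.
duty_cycle = 1.5e-3 * 20             # pulse length (s) x repetition rate (Hz) = 0.03
peak_current = 2.25e-3 / duty_cycle  # A -> 0.075 A, i.e. 75 mA within the pulse
avg_beam_power = 2.25e-3 * 50e6      # A x V (50 MeV protons) -> 112.5 kW on target
print(duty_cycle, peak_current, avg_beam_power)
```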
Abstract:
The European Spallation Source-Bilbao (ESS-Bilbao) project plans to build an accelerator facility, compliant with the ESS-AB requirements, able to drive several experimental stations for research purposes with intense proton beams: currents up to 75 mA, a final energy of 50 MeV, a pulse length of 1.5 ms and repetition rates up to 50 Hz. The accelerator will also drive a compact neutron source which will provide useful neutron beams to carry out experiments on moderator optimization, neutron optics devices and general neutron instrumentation, as well as preparation work for experiments to be carried out by neutron beam users at the large facilities.
Abstract:
The concept of an unreliable failure detector was introduced by Chandra and Toueg as a mechanism that provides information about process failures. This mechanism has been used to solve several agreement problems, such as the consensus problem. In this paper, algorithms that implement failure detectors in partially synchronous systems are presented. First, two simple algorithms of the weakest class allowing the consensus problem to be solved, namely the Eventually Strong class (⋄S), are presented. While the first algorithm is wait-free, the second algorithm is f-resilient, where f is a known upper bound on the number of faulty processes. Both algorithms guarantee that, eventually, all the correct processes agree permanently on a common correct process, i.e. they also implement a failure detector of the class Omega (Ω). They are also shown to be optimal in terms of the number of communication links used forever. Additionally, a wait-free algorithm that implements a failure detector of the Eventually Perfect class (⋄P) is presented. This algorithm is shown to be optimal in terms of the number of bidirectional links used forever.
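As background, the sketch below illustrates the classic heartbeat-with-increasing-timeouts idea behind eventually perfect (⋄P) failure detectors in partially synchronous systems; it is a generic illustration with assumed timeout values, not a rendition of the algorithms proposed in the paper.

```python
import time

class EventuallyPerfectDetector:
    """Heartbeat-based failure detector module for one process monitoring its peers."""

    def __init__(self, processes, initial_timeout=1.0, increment=0.5):
        self.timeout = {p: initial_timeout for p in processes}   # per-process timeout (s)
        self.last_heartbeat = {p: time.monotonic() for p in processes}
        self.suspected = set()
        self.increment = increment

    def on_heartbeat(self, p):
        """Called whenever a heartbeat message from process p is delivered."""
        if p in self.suspected:
            # the suspicion was premature (slow process or link), so enlarge the timeout;
            # under partial synchrony this eventually stops false suspicions
            self.suspected.discard(p)
            self.timeout[p] += self.increment
        self.last_heartbeat[p] = time.monotonic()

    def check(self):
        """Periodically called; returns the current set of suspected processes."""
        now = time.monotonic()
        for p, last in self.last_heartbeat.items():
            if p not in self.suspected and now - last > self.timeout[p]:
                self.suspected.add(p)
        return set(self.suspected)
```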
Abstract:
A PET imaging system demonstrator based on LYSO crystal arrays coupled to SiPM matrices is under construction at the University and INFN of Pisa. Two SiPM matrices, each composed of 8×8 SiPM pixels with a 1.5 mm pitch, have been coupled one-to-one to a LYSO crystal array and read out by a custom electronics system. Front-end ASICs were used to read 8 channels of each matrix. Data from each front-end were multiplexed and sent to a DAQ board for digital conversion; a motherboard collects the data and communicates with a host computer through a USB port for storage and off-line data processing. In this paper we show the first preliminary tomographic image of a point-like radioactive source acquired with part of the two detection heads in time coincidence.
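For illustration, the snippet below sketches a simple way to pair events from the two detection heads in time coincidence: a two-pointer sweep over sorted timestamps keeps pairs closer than a coincidence window. The 10 ns window and the function name are assumptions for the example, not details of the Pisa DAQ.

```python
def find_coincidences(t_head_a, t_head_b, window_ns=10.0):
    """Pair events from two heads whose timestamps (sorted, in ns) differ by < window_ns."""
    pairs = []
    j = 0
    for i, ta in enumerate(t_head_a):
        # advance j past all head-B events that are too early for this head-A event
        while j < len(t_head_b) and t_head_b[j] < ta - window_ns:
            j += 1
        if j < len(t_head_b) and abs(t_head_b[j] - ta) < window_ns:
            pairs.append((i, j))
    return pairs
```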
Abstract:
Providing security to the emerging field of ambient intelligence will be difficult if we rely only on existing techniques, given the dynamic and heterogeneous nature of these systems. Moreover, the security demands of these systems are expected to grow, as many applications will require accurate context modeling. In this work we propose an enhancement to the reputation systems traditionally deployed for securing such systems. Different anomaly detectors are combined using the immunological paradigm to optimize reputation system performance in response to evolving security requirements. As an example, the experiments show how a combination of detectors based on unsupervised techniques (self-organizing maps and genetic algorithms) can help to significantly reduce the global response time of the reputation system. The proposed solution offers many benefits: scalability, fast response to adversarial activities, ability to detect unknown attacks, high adaptability, and a high capacity for detecting and confining attacks. For these reasons, we believe that our solution is capable of coping with the dynamism of ambient intelligence systems and their growing security demands.
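As a generic illustration of detector fusion feeding a reputation system (an assumption-based sketch, not the immune-inspired combination proposed in the paper), a weighted combination of anomaly scores can drive the reputation update as follows:

```python
from typing import Callable, Dict, List

def combined_score(detectors: List[Callable[[dict], float]],
                   weights: List[float], event: dict) -> float:
    """Weighted average of the individual detector scores (each assumed to lie in [0, 1])."""
    return sum(w * d(event) for w, d in zip(weights, detectors)) / sum(weights)

def update_reputation(reputation: Dict[str, float], node: str, score: float,
                      threshold: float = 0.6, penalty: float = 0.2,
                      reward: float = 0.02) -> None:
    """Lower a node's reputation when the fused score flags an anomaly, else let it recover."""
    if score >= threshold:
        reputation[node] = max(0.0, reputation[node] - penalty)
    else:
        reputation[node] = min(1.0, reputation[node] + reward)
```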
Abstract:
In this study, we present a structural and optoelectronic characterization of high-dose Ti-implanted Si subsequently pulsed-laser melted (Ti-supersaturated Si). Time-of-flight secondary ion mass spectrometry analysis reveals that the theoretical Mott limit has been surpassed after the laser process, and transmission electron microscopy images show a good lattice reconstruction. Optical characterization shows strong sub-band-gap absorption related to the high Ti concentration. Photoconductivity measurements show that Ti-supersaturated Si presents a spectral response orders of magnitude higher than unimplanted Si at energies below the band gap. We conclude that the observed below-band-gap photoconductivity cannot be attributed to structural defects produced by the fabrication processes, and we suggest that both the absorption coefficient of the new material and the lifetime of photoexcited carriers have been enhanced due to the presence of a high Ti concentration. This remarkable result proves that Ti-supersaturated Si is a promising material for both infrared detectors and high-efficiency photovoltaic devices.
Abstract:
Using CMOS transistors for terahertz detection is currently a disruptive technology that offers the direct integration of a terahertz detector with video preamplifiers. The detectors are based on the resistive mixer concept, and their performance mainly depends on the following parameters: the type of antenna, the electrical parameters (gate-to-drain capacitance and channel length of the CMOS device) and the foundry. Two different 300 GHz detectors are discussed: a single-transistor detector with a broadband antenna and a differential pair driven by a resonant patch antenna.