18 results for The near-poor

at Universidad Politécnica de Madrid


Relevance:

100.00%

Publisher:

Abstract:

An accurate characterization of near-region propagation of radio waves inside tunnels is of practical importance for the design and planning of advanced communication systems. However, there is not yet a consensus on the propagation mechanism in this region: some authors claim that propagation follows the free-space model, while others interpret it with the multi-mode waveguide model. This paper clarifies the situation in the near-region of arched tunnels by analytically modeling the division point between the two propagation mechanisms. The procedure combines propagation theory with three-dimensional solid geometry. Three groups of measurements are employed to verify the model in different tunnels at different frequencies. Furthermore, simplified models of the division point in five specific application situations are derived to facilitate the use of the model. The results could help deepen insight into the propagation mechanism within tunnel environments.
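The division point the paper models can be illustrated numerically: the free-space mechanism loses power logarithmically with distance, while the multi-mode waveguide mechanism adds a fixed coupling loss plus a roughly linear modal attenuation, so the division point is where the two loss curves cross. A minimal sketch (the coupling loss and attenuation rate below are illustrative assumptions, not values from the paper):

```python
import math

def free_space_loss_db(d, freq_hz):
    """Friis free-space path loss (dB) at distance d metres."""
    lam = 3e8 / freq_hz
    return 20 * math.log10(4 * math.pi * d / lam)

def waveguide_loss_db(d, alpha_db_per_m, coupling_db):
    """Multi-mode waveguide loss: a fixed mode-coupling loss plus a
    linear modal attenuation (both values are illustrative assumptions)."""
    return coupling_db + alpha_db_per_m * d

def dividing_point(freq_hz, alpha_db_per_m, coupling_db=60.0, step=1.0, d_max=5000.0):
    """First distance (m) where free-space loss exceeds the waveguide loss;
    beyond it the waveguide mechanism gives the smaller (dominant) loss."""
    d = step
    while d <= d_max:
        if free_space_loss_db(d, freq_hz) >= waveguide_loss_db(d, alpha_db_per_m, coupling_db):
            return d
        d += step
    return None
```

Raising the frequency steepens the free-space curve at a given distance, so the crossing moves; in the full model the division point also depends on the tunnel geometry, which this sketch omits.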

Relevance:

100.00%

Publisher:

Abstract:

Article on railway communications. Abstract: Along with the increase in operating frequencies in advanced radio communication systems used inside tunnels, the location of the break point moves further and further away from the transmitter. This means that the near region lengthens considerably, and may even occupy the whole propagation cell or the entire length of some short tunnels. This study first analyses the propagation loss resulting from the free-space mechanism and from the multi-mode waveguide mechanism in the near region of circular tunnels. Then, by jointly employing propagation theory and three-dimensional solid geometry, a general analytical model of the dividing point between the two propagation mechanisms is presented for the first time. The model is validated by a wide range of measurement campaigns in different tunnels at different frequencies. Finally, simplified formulae for the dividing point in some application situations are discussed. The results can help grasp the essence of the propagation mechanism inside tunnels.

Relevance:

100.00%

Publisher:

Abstract:

Along with the increase in operating frequencies in advanced radio communication systems, the near-region inside tunnels lengthens considerably and may even occupy the whole propagation cell or the entire length of some short tunnels. This paper analytically models the propagation mechanisms and their dividing point in the near-region of arbitrary cross-sectional tunnels for the first time. First, the propagation losses owing to the free-space mechanism and the multimode waveguide mechanism are modeled. Then, by jointly employing propagation theory and three-dimensional solid geometry, a general model is presented for the dividing point between the two propagation mechanisms; notably, it can be applied to tunnels of arbitrary cross section. The general dividing-point model is then specified for rectangular, circular and arched tunnels. Five groups of measurements are used to validate the model in different tunnels at different frequencies. Finally, to facilitate the use of the model, simplified analytical solutions for the dividing point in five specific application situations are derived. The results could help deepen insight into the propagation mechanisms in tunnels.
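For the rectangular case, a widely quoted closed-form rule of thumb (consistent in spirit with, but not necessarily identical to, the simplified solutions the paper derives) places the end of the near region at max(w², h²)/λ, which makes the frequency dependence described above explicit:

```python
def rectangular_dividing_point(width_m, height_m, freq_hz):
    """Rule-of-thumb near-region length of a rectangular tunnel:
    max(w^2, h^2) / lambda. Illustrative approximation, not the
    paper's exact dividing-point model."""
    lam = 3e8 / freq_hz
    return max(width_m ** 2, height_m ** 2) / lam

# The near region lengthens with frequency, as the abstract notes:
# a 5 m x 5 m tunnel at 0.9, 2.4 and 5.8 GHz.
lengths = [rectangular_dividing_point(5, 5, f) for f in (0.9e9, 2.4e9, 5.8e9)]
```

At 5.8 GHz the rule already gives several hundred metres, i.e. the near region can indeed span an entire short tunnel.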

Relevance:

90.00%

Publisher:

Abstract:

This thesis contributes to the analysis and design of printed reflectarray antennas. The main part of the work focuses on the analysis of dual offset antennas comprising two reflectarray surfaces, one acting as sub-reflector and the other as main reflector. These configurations introduce additional complexity in several respects compared with conventional dual offset reflectors, but they also offer many degrees of freedom that can be used to improve the electrical performance of the antenna. The thesis is organized in four parts: the development of an analysis technique for dual-reflectarray antennas; a preliminary validation of the methodology using equivalent reflector systems as reference antennas; a more rigorous validation of the software tool by manufacturing and testing a dual-reflectarray antenna demonstrator; and the practical design of dual-reflectarray systems for applications that show the potential of this kind of configuration to scan the beam and to generate contoured beams. In the first part, a general tool has been implemented to analyze high-gain antennas constructed from two flat reflectarray structures. The classic reflectarray analysis, based on the Method of Moments (MoM) under the local periodicity assumption, is used for both the sub- and main reflectarrays, taking into account the angle of incidence on each reflectarray element. The incident field on the main reflectarray is computed from the field radiated by all the elements of the sub-reflectarray. Two approaches have been developed: one employs a simple approximation to reduce the computation time, while the other does not, but in many cases offers improved accuracy. The approximation consists of computing the reflected field on each main-reflectarray element only once for all the fields radiated by the sub-reflectarray elements, assuming the response is the same because the only difference is a small variation in the angle of incidence.
This approximation is very accurate when the main-reflectarray elements show relatively small sensitivity to the angle of incidence. An extension of the analysis technique has been implemented to study dual-reflectarray antennas whose main reflectarray is printed on a parabolic, or in general curved, surface. In many dual-reflectarray configurations the reflectarray elements are in the near field of the feed horn. To account for the near field radiated by the horn, the incident field on each reflectarray element is computed using a spherical mode expansion. In this region the angles of incidence are moderately wide, and they are considered in the analysis to better calculate the actual incident field on the sub-reflectarray elements. This technique improves the accuracy of the predicted co- and cross-polar patterns and antenna gain with respect to ideal feed models. In the second part, as a preliminary validation, the proposed analysis method was used to design a dual-reflectarray antenna that emulates previous dual-reflector antennas in Ku- and W-bands including a reflectarray as sub-reflector. The results for the dual-reflectarray antenna compare very well with those of the parabolic reflector with reflectarray sub-reflector: radiation patterns, antenna gain and efficiency are practically the same when the main parabolic reflector is replaced by a flat reflectarray. The gain is reduced by only a few tenths of a dB as a result of the ohmic losses in the reflectarray. The phase adjustment on two surfaces provided by the dual-reflectarray configuration can be used to improve the antenna performance in applications requiring multiple beams, beam scanning or shaped beams. Third, a very challenging dual-reflectarray antenna demonstrator was designed, manufactured and tested for a more rigorous validation of the analysis technique.
In the proposed antenna configuration the feed, the sub-reflectarray and the main reflectarray are in the near field of one another, so the conventional far-field approximations are not suitable for its analysis. This geometry serves as a benchmark for the proposed analysis tool under very stringent conditions. Some aspects of the analysis technique that improve its accuracy are also discussed, including a novel method to reduce the inherent cross-polarization introduced mainly by grounded patch arrays. It has been verified that cross-polarization in offset reflectarrays can be significantly reduced by properly adjusting the patch dimensions so as to produce an overall cancellation of the cross-polar field. The patch dimensions are adjusted not only to provide the phase distribution required to shape the beam, but also to exploit the zero crossings of the cross-polarization components. The last part of the thesis deals with direct applications of the described technique, which is directly applicable to the design of contoured-beam antennas for DBS applications, where the cross-polarization requirements are very stringent. The beam shaping is achieved by synthesizing the phase distribution on the main reflectarray, while the sub-reflectarray emulates an equivalent hyperbolic sub-reflector. Dual-reflectarray antennas can also scan the beam over small angles about boresight. Two possible architectures for a Ku-band antenna are described, based on a dual planar reflectarray configuration that provides electronic beam scanning in a limited angular range. In the first architecture, beam scanning is achieved by introducing phase control in the elements of the sub-reflectarray while the main reflectarray is passive.
A second alternative is also studied, in which beam scanning is produced using 1-bit control on the main reflectarray, while a passive sub-reflectarray is designed to provide a large focal distance within a compact configuration. The system aims to develop a solution for bi-directional satellite links for emergency communications. In both architectures, the objective is to provide compact optics that are simple to fold and deploy.
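The phase distributions synthesized throughout the thesis follow the standard reflectarray design rule: each element must compensate the spherical path from the feed and add a progressive phase that steers the reflected beam toward the desired direction. A minimal sketch of that rule (geometry, frequency and function names are assumptions for illustration, not the thesis design):

```python
import math

C = 3e8  # speed of light, m/s

def required_phases(elements, feed, beam_dir, freq_hz):
    """Phase shift (rad, in [0, 2*pi)) each element of a planar reflectarray
    must introduce so the reflected field forms a plane wave toward the unit
    vector beam_dir:  phi_i = k * (|r_i - r_feed| - r_i . beam_dir)."""
    k = 2 * math.pi * freq_hz / C
    phases = []
    for p in elements:
        d_feed = math.dist(p, feed)  # spherical wavefront path from the feed
        proj = sum(pc * bc for pc, bc in zip(p, beam_dir))
        phases.append((k * (d_feed - proj)) % (2 * math.pi))
    return phases
```

For a broadside beam with an on-axis feed the phase is symmetric about the feed axis, which is a quick sanity check on any implementation.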

Relevance:

90.00%

Publisher:

Abstract:

The emerging use of real-time 3D multimedia applications imposes strict quality of service (QoS) requirements on both access and core networks. These requirements, and their impact on providing end-to-end 3D videoconferencing services, have been studied within the Spanish-funded VISION project, where different scenarios were implemented showing an agile stereoscopic video call that might be offered to the general public in the near future. In view of the requirements, we designed an integrated access and core converged network architecture which provides the requested QoS to end-to-end IP sessions. Novel functional blocks are proposed to control core optical networks, the functionality of the standard ones is redefined, and the signaling is improved to better meet the requirements of future multimedia services. An experimental test-bed was also deployed to assess the feasibility of the solution. In this test-bed, set-up and release of end-to-end sessions meeting specific QoS requirements are demonstrated, and the impact of QoS degradation is quantified in terms of the quality degradation perceived by the user. In addition, scalability results show that the proposed signaling architecture can cope with a large number of requests while introducing almost negligible delay.

Relevance:

90.00%

Publisher:

Abstract:

We explore the near-field concentration properties of dielectric spheroidal scatterers with sizes close to the wavelength, using an analytical separation-of-variables method. Such particles act as mesoscopic lenses whose physical parameters are optimized here for maximum scattered light enhancement in photovoltaic applications.
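The optimization sweep behind such a study can be illustrated for the spherical special case of a spheroid with classic Mie theory. The Bohren-Huffman recurrences below are a standard textbook algorithm (not the paper's spheroidal separation-of-variables method) and return the extinction and scattering efficiencies that a parameter sweep over size and refractive index would evaluate:

```python
import math

def mie_efficiencies(x, m):
    """Extinction and scattering efficiencies (Qext, Qsca) of a homogeneous
    sphere with size parameter x = 2*pi*a/lambda and relative index m,
    via the classic Bohren--Huffman recurrences."""
    nmax = int(round(x + 4 * x ** (1 / 3) + 2))
    mx = m * x
    # Logarithmic derivative D_n(mx) by (stable) downward recurrence.
    nstart = nmax + 15
    D = [0j] * (nstart + 1)
    for n in range(nstart, 0, -1):
        D[n - 1] = n / mx - 1 / (D[n] + n / mx)
    # Riccati-Bessel functions psi, chi by upward recurrence; xi = psi - i*chi.
    psi_nm1, psi_n = math.sin(x), math.sin(x) / x - math.cos(x)
    chi_nm1, chi_n = math.cos(x), math.cos(x) / x + math.sin(x)
    xi_nm1, xi_n = complex(psi_nm1, -chi_nm1), complex(psi_n, -chi_n)
    qext = qsca = 0.0
    for n in range(1, nmax + 1):
        fa = D[n] / m + n / x
        fb = D[n] * m + n / x
        a_n = (fa * psi_n - psi_nm1) / (fa * xi_n - xi_nm1)
        b_n = (fb * psi_n - psi_nm1) / (fb * xi_n - xi_nm1)
        qext += (2 * n + 1) * (a_n + b_n).real
        qsca += (2 * n + 1) * (abs(a_n) ** 2 + abs(b_n) ** 2)
        # Advance the recurrences: f_{n+1} = (2n+1)/x * f_n - f_{n-1}.
        psi_np1 = (2 * n + 1) / x * psi_n - psi_nm1
        chi_np1 = (2 * n + 1) / x * chi_n - chi_nm1
        psi_nm1, psi_n = psi_n, psi_np1
        chi_nm1, chi_n = chi_n, chi_np1
        xi_nm1, xi_n = xi_n, complex(psi_n, -chi_n)
    return 2 / x ** 2 * qext, 2 / x ** 2 * qsca
```

For a lossless (real-index) particle Qext must equal Qsca, which provides a built-in consistency check.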

Relevance:

90.00%

Publisher:

Abstract:

The Bioinstrumentation Laboratory belongs to the Centre for Biomedical Technology (CTB) of the Technical University of Madrid, and its main objective is to provide the scientific community with devices and techniques for the characterization of micro- and nanostructures and, consequently, to find their best biomedical applications. Hyperthermia (Greek for "overheating") is defined as the phenomenon that occurs when a body is exposed to an energy-generating source that can produce a rise in temperature (42-45 °C) for a given time [1]. Specifically, the aim of the hyperthermia methods used in the Bioinstrumentation Laboratory is the development of thermal therapies, some of them using different kinds of nanoparticles, to kill cancer cells while reducing the damage to healthy tissues. Optical hyperthermia is based on noble metal nanoparticles and laser irradiation. These nanoparticles have immense potential for the development of cancer therapies on account of their Surface Plasmon Resonance (SPR) enhanced light scattering and absorption. In a short period of time the absorbed light is converted into localized heat, so these characteristics can be exploited to heat up tumor cells and induce cell death [2]. The laboratory has an optical hyperthermia device based on a continuous-wave laser used to kill glioblastoma cells (line 1321N1) in the presence of gold nanorods (Figure 1a). The laser wavelength is 808 nm because light penetrates deeper into tissue in the near-infrared region. The first optical hyperthermia results show that laser irradiation produces cell death in experimental samples of the glioblastoma cell line with gold nanorods, but is not able to decrease the viability of cancer cells in samples without the nanorods (Figure 1b) [3].
Magnetic hyperthermia is generated through changes of the magnetic induction in magnetic nanoparticles (MNPs) embedded in a viscous medium. Figure 2 shows a schematic design of the AC induction hyperthermia device for magnetic fluids, manufactured at the Bioinstrumentation Laboratory. The first block comprises two stages: signal selection, with the frequency adjustable from 9 kHz to 2 MHz, and a linear output of up to 1500 W. The second block is where the magnetic field is generated (5 mm, 10 turns). The third block is the control software, where the user sets the initial parameters and which displays the temperature response of the MNPs to the applied magnetic field [4-8]. The Bioinstrumentation Laboratory, in collaboration with the Mexican company MRI-DT, has recently opened a new research line on nuclear magnetic resonance hyperthermia, based on patent US 7,423,429 B2 owned by this company. This investigation relies on the use of clinical MRI equipment not only for diagnosis but also for therapy [9]. The idea rests on two main facts: magnetic resonance imaging can cause focal heating [10], and healthy and cancer cells differ in resonant frequency [11]. To heat only the cancer cells when the whole body is irradiated, the specific resonant frequency of the target must be determined using the information contained in the spectra of the area of interest. A special RF pulse sequence is then applied to produce a fast excitation and relaxation mechanism that raises the temperature of the tumor, causing cell death or a metabolic malfunction that stops cell division.
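The hyperthermia target window (42-45 °C, a rise of roughly 5-8 K above body temperature) can be related to nanoparticle heating power with a simple adiabatic energy balance, ΔT = SAR·m_np·t / (m_sample·c). The numbers below are illustrative assumptions, not measurements from the laboratory:

```python
def adiabatic_temperature_rise(sar_w_per_g, np_mass_g, sample_mass_g,
                               seconds, c_j_per_g_k=4.186):
    """Idealized temperature rise (K) of a water-like sample heated by
    nanoparticles with the given specific absorption rate (W per gram of
    nanoparticles), ignoring all heat losses."""
    power_w = sar_w_per_g * np_mass_g
    return power_w * seconds / (sample_mass_g * c_j_per_g_k)
```

For example, nanoparticles with a SAR of 100 W/g at 10 mg in 1 g of medium would heat it by about 14 K in one minute under these assumptions, comfortably reaching the hyperthermia window; in practice perfusion and conduction lower this considerably.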

Relevance:

90.00%

Publisher:

Abstract:

With the rising prices of retail electricity and the decreasing cost of PV technology, grid parity with commercial electricity will soon become a reality in Europe. This fact, together with less attractive PV feed-in tariffs in the near future and incentives to promote self-consumption, suggests that new operation modes for PV distributed generation should be explored, different from the traditional approach based solely on maximizing the electricity exported to the grid. In this paper we study the effects of Active Demand-Side Management (ADSM) and storage systems on the amount of locally consumed electrical energy. Smart metering is experiencing a growth in Europe and the United States, but the possibilities of its use are still uncertain; in our system we propose its use to manage the storage and to let users know their electrical power and energy balances. ADSM has many previously studied benefits but also important challenges, and this paper presents an ADSM implementation example that proposes a solution to them. The work has been developed on a prototype of a self-sufficient solar house called "MagicBox", equipped with grid connection, PV generation, lead-acid batteries, controllable appliances and smart metering. We carried out simulations for long-term experiments (yearly studies) and real measurements for short- and mid-term experiments (daily and weekly studies). Results show the relationship between the electricity flows and the storage capacity, which is not linear and becomes an important design criterion.
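The relationship between storage capacity and locally consumed energy can be reproduced with a toy hourly energy balance: surplus PV charges the battery (with losses) and deficits discharge it. A sketch with made-up hourly profiles, not MagicBox data:

```python
def self_consumption_ratio(pv_kwh, load_kwh, capacity_kwh, eff=0.9):
    """Fraction of the load served by local PV, directly or via a simple
    battery model (one-way charging efficiency, no self-discharge)."""
    soc = 0.0               # battery state of charge, kWh
    supplied_locally = 0.0
    for pv, load in zip(pv_kwh, load_kwh):
        direct = min(pv, load)          # PV consumed on the spot
        supplied_locally += direct
        surplus, deficit = pv - direct, load - direct
        charged = min(surplus * eff, capacity_kwh - soc)  # store surplus
        soc += charged
        discharged = min(deficit, soc)  # cover remaining load from storage
        soc -= discharged
        supplied_locally += discharged
    return supplied_locally / sum(load_kwh)
```

Sweeping `capacity_kwh` with such a model exposes the non-linearity the abstract mentions: beyond the capacity needed to shift the daily surplus, extra storage adds nothing.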

Relevance:

90.00%

Publisher:

Abstract:

Time-resolved reflectance spectroscopy can be used to assess nondestructively the bulk (rather than the superficial) optical properties of highly diffusive media. A fully automated system for time-resolved reflectance spectroscopy was used to evaluate the absorption and transport scattering spectra of fruits in the red and near-infrared regions. In particular, data were collected in the range 650-1000 nm from three varieties of apples and from peaches, kiwifruits, and tomatoes. The absorption spectra were usually dominated by the water peak near 970 nm, whereas chlorophyll was detected at 675 nm. For all species the scattering decreased progressively with increasing wavelength. A best fit to the water and chlorophyll absorption line shapes and to Mie theory permitted the estimation of the water and chlorophyll content and the average size of the scattering centers in the bulk of intact fruits.
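The "best fit to water and chlorophyll absorption line shapes" is, at its core, a linear unmixing of the measured absorption spectrum into known component spectra. A minimal least-squares sketch with synthetic line shapes (the actual work also fits Mie theory for the scattering spectrum, omitted here):

```python
def unmix_two_components(measured, comp_a, comp_b):
    """Least-squares concentrations (c_a, c_b) such that
    measured ~ c_a*comp_a + c_b*comp_b, via the 2x2 normal equations."""
    aa = sum(a * a for a in comp_a)
    bb = sum(b * b for b in comp_b)
    ab = sum(a * b for a, b in zip(comp_a, comp_b))
    am = sum(a * m for a, m in zip(comp_a, measured))
    bm = sum(b * m for b, m in zip(comp_b, measured))
    det = aa * bb - ab * ab
    return (am * bb - bm * ab) / det, (bm * aa - am * ab) / det
```

With noise-free synthetic data the fit recovers the mixing coefficients exactly, which is a convenient correctness check before applying it to real spectra.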

Relevance:

90.00%

Publisher:

Abstract:

CO2 capture and storage (CCS) projects are presently being developed to reduce the emission of anthropogenic CO2 into the atmosphere. CCS technologies are expected to account for 20% of the CO2 reduction by 2050. One of the main concerns about CCS is whether CO2 will remain confined within the geological formation into which it is injected, since post-injection CO2 migration on the time scale of years, decades and centuries is not well understood. Theoretically, CO2 can be retained at depth i) as a supercritical fluid (physical trapping), ii) as a fluid slowly migrating in an aquifer along a long flow path (hydrodynamic trapping), iii) dissolved into groundwater (solubility trapping), and iv) as precipitated secondary carbonates. Carbon dioxide will be injected in the near future (2012) at Hontomín (Burgos, Spain) within the framework of the Compostilla EEPR project, led by the Fundación Ciudad de la Energía (CIUDEN). In order to detect leakage during the operational stage, a pre-injection geochemical baseline is presently being developed. In this work a geochemical monitoring design is presented to provide information about the feasibility of CO2 storage at depth.

Relevance:

90.00%

Publisher:

Abstract:

A simplified CFD wake model based on the actuator-disk concept is used to simulate the wind turbine, which is represented by an actuator disk upon which a distribution of forces, defined as axial momentum sources, is applied to the incoming flow. The rotor is assumed to be uniformly loaded, with the exerted forces a function of the incident wind speed, the thrust coefficient and the rotor diameter. The model is validated against experimental measurements of the wind speed deficit downwind of a wind turbine. Validation against turbulence intensity measurements will follow in the near future.
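The uniform loading described above reduces to the classic actuator-disk thrust relation T = ½ρA·C_T·U∞², distributed over the disk cells as an axial momentum source. A minimal sketch (turbine parameters are illustrative):

```python
import math

def actuator_disk_thrust(u_inf, rotor_diameter, ct, rho=1.225):
    """Thrust (N) on a uniformly loaded actuator disk:
    T = 0.5 * rho * A * Ct * U_inf^2."""
    area = math.pi * (rotor_diameter / 2) ** 2
    return 0.5 * rho * area * ct * u_inf ** 2

def axial_momentum_source(u_inf, rotor_diameter, ct, disk_thickness, rho=1.225):
    """Force per unit volume (N/m^3) applied over the disk cells when the
    thrust is spread uniformly across a disk of the given thickness."""
    area = math.pi * (rotor_diameter / 2) ** 2
    return actuator_disk_thrust(u_inf, rotor_diameter, ct, rho) / (area * disk_thickness)
```

In a CFD solver the momentum source is added to the axial momentum equation in every cell covered by the disk; integrating it back over the disk volume must recover the total thrust.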

Relevance:

90.00%

Publisher:

Abstract:

The prevalence of allergies has been increasing since the mid-twentieth century; they are currently estimated to affect around 2-8% of the population, but the underlying causes of this increase remain elusive. Understanding the mechanism by which a harmless protein becomes capable of inducing an allergic response provides the basis to prevent and treat these diseases. Although the characterization of relevant allergens has led to improved clinical management and has helped to clarify the basic mechanisms of allergic reactions, a long road remains toward establishing the structural basis of allergenicity and cross-reactivity. The aim of this thesis was to characterize the molecular basis of the allergenicity of model proteins belonging to two panallergen families (lipid transfer proteins, LTPs, and thaumatin-like proteins, TLPs) in order to identify the mechanisms that mediate sensitization and cross-reactivity, and thereby improve both the diagnosis and the treatment of allergy. Two strategies were followed: studies of cross-reactivity among members of panallergen families, and molecular studies of the contribution of cofactors to the induction of the allergic response by these panallergens. Following the first strategy, we studied the cross-reactivity among members of the LTP and TLP families using peach allergy as a model. Similarly, we characterized the sensitization profiles to wheat allergens in baker's asthma, the most relevant occupational disease of allergic origin. These studies were performed by standardizing allergen microarray assays and analyzing the results with graph theory. Regarding the second strategy, we analyzed the interaction of food allergens with cells of the human and murine immune systems and with the mucosal epithelium, examining the importance of ligands and molecules co-transported with the allergens in the development of Th2 responses. To this end, Pru p 3, an nsLTP (non-specific lipid transfer protein) and the major peach allergen, was selected as a model, and the contribution of its ligand to the induction of an allergic response was studied. Moreover, we analyzed the role of pathogen-associated immune-activating molecules in the induction of food allergy, selecting the kiwi-Alternaria system as a model and studying the role of Alt a 1, the major allergen of this fungus, in the development of sensitization to Act d 2, the major kiwi allergen.
In summary, this work presents innovative research providing useful results for improving diagnosis and leading to further research on allergy and the final clarification of the mechanisms that characterize this disease.
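The microarray-plus-graph-theory analysis can be sketched as follows: allergens are nodes, and an edge is drawn when two allergens' IgE-response profiles across patients correlate strongly, so that connected components suggest cross-reactivity clusters. A toy illustration with invented IgE values (only the allergen names are borrowed from the text):

```python
from itertools import combinations

def cross_reactivity_graph(ige_profiles, threshold=0.8):
    """Undirected graph linking allergens whose per-patient IgE response
    profiles have Pearson correlation >= threshold. Input: dict mapping
    allergen name -> list of IgE values (one per patient)."""
    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy) if sx and sy else 0.0
    edges = []
    for a, b in combinations(sorted(ige_profiles), 2):
        r = pearson(ige_profiles[a], ige_profiles[b])
        if r >= threshold:
            edges.append((a, b, round(r, 3)))
    return edges
```

On real microarray data one would also correct for multiple testing and work with the full weighted graph rather than a hard threshold.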

Relevance:

90.00%

Publisher:

Abstract:

Many advantages can be gained by combining finite and boundary elements. This is the case, for example, in unbounded field problems, where boundary elements can provide the appropriate conditions to represent the infinite domain while finite elements are suited to the more complex properties of the near domain. There are, however, also disadvantages, for instance the loss of symmetry of the finite element stiffness matrix when the combination is made. On the other hand, with the strong irruption of parallel processing, domain decomposition techniques are attracting the interest of numerous scientists. They make it possible to split the resolution of a problem into several subproblems. This is beneficial in BEM-FEM combinations, as the loss of symmetry is avoided and each technique is applied separately. Naturally, the correct application of these techniques requires suitable transmission conditions at the interface between the BEM and FEM domains. In this paper, a parallel method based on the Steklov-Poincaré interface operator is presented.
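The interface-operator idea can be illustrated on a toy problem: −u″ = 0 on [0, 2], split at x = 1, where one subdomain (say, the FEM side) is solved with a Dirichlet trace at the interface and the other (say, the BEM side) returns its trace under the matching flux, with a relaxed update of the interface value. Since the 1D subdomain solves are available in closed form, the whole Dirichlet-Neumann iteration fits in a few lines (a sketch of the idea, not the paper's formulation):

```python
def dirichlet_neumann_interface(theta=0.3, tol=1e-10, max_iter=200):
    """Toy Dirichlet-Neumann (Steklov-Poincare) iteration for -u'' = 0 on
    [0, 2] with u(0) = 0, u(2) = 1, interface at x = 1.  Left subdomain:
    u_L = lam * x (Dirichlet trace lam), flux u_L'(1) = lam.  Right
    subdomain with that Neumann flux and u(2) = 1 has trace 1 - lam.
    A relaxed update drives the two traces together."""
    lam = 0.0  # initial guess for the interface value u(1)
    for k in range(max_iter):
        flux_left = lam                 # closed-form left (Dirichlet) solve
        trace_right = 1.0 - flux_left   # closed-form right (Neumann) solve
        new_lam = theta * trace_right + (1 - theta) * lam
        if abs(new_lam - lam) < tol:
            return new_lam, k + 1
        lam = new_lam
    return lam, max_iter
```

The iteration converges to u(1) = 0.5, the trace of the global solution u(x) = x/2, for any relaxation parameter 0 < theta < 1; in the BEM-FEM setting the closed-form solves are replaced by the respective subdomain solvers running in parallel.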

Relevance:

90.00%

Publisher:

Abstract:

La diabetes mellitus es el conjunto de alteraciones provocadas por un defecto en la cantidad de insulina secretada o por un aprovechamiento deficiente de la misma. Es causa directa de complicaciones a corto, medio y largo plazo que disminuyen la calidad y las expectativas de vida de las personas con diabetes. La diabetes mellitus es en la actualidad uno de los problemas más importantes de salud. Ha triplicado su prevalencia en los últimos 20 anos y para el año 2025 se espera que existan casi 300 millones de personas con diabetes. Este aumento de la prevalencia junto con la morbi-mortalidad asociada a sus complicaciones micro y macro-vasculares convierten la diabetes en una carga para los sistemas sanitarios, sus recursos económicos y sus profesionales, haciendo de la enfermedad un problema individual y de salud pública de enormes proporciones. De momento no existe cura a esta enfermedad, de modo que el objetivo terapéutico del tratamiento de la diabetes se centra en la normalización de la glucemia intentando minimizar los eventos de hiper e hipoglucemia y evitando la aparición o al menos retrasando la evolución de las complicaciones vasculares, que constituyen la principal causa de morbi-mortalidad de las personas con diabetes. Un adecuado control diabetológico implica un tratamiento individualizado que considere multitud de factores para cada paciente (edad, actividad física, hábitos alimentarios, presencia de complicaciones asociadas o no a la diabetes, factores culturales, etc.). Sin embargo, a corto plazo, las dos variables más influyentes que el paciente ha de manejar para intervenir sobre su nivel glucémico son la insulina administrada y la dieta. Ambas presentan un retardo entre el momento de su aplicación y el comienzo de su acción, asociado a la absorción de los mismos. 
Por este motivo la capacidad de predecir la evolución del perfil glucémico en un futuro cercano, ayudara al paciente a tomar las decisiones adecuadas para mantener un buen control de su enfermedad y evitar situaciones de riesgo. Este es el objetivo de la predicción en diabetes: adelantar la evolución del perfil glucémico en un futuro cercano para ayudar al paciente a adaptar su estilo de vida y sus acciones correctoras, con el propósito de que sus niveles de glucemia se aproximen a los de una persona sana, evitando así los síntomas y complicaciones de un mal control. La aparición reciente de los sistemas de monitorización continua de glucosa ha proporcionado nuevas alternativas. La disponibilidad de un registro exhaustivo de las variaciones del perfil glucémico, con un periodo de muestreo de entre uno y cinco minutos, ha favorecido el planteamiento de nuevos modelos que tratan de predecir la glucemia utilizando tan solo las medidas anteriores de glucemia o al menos reduciendo significativamente la información de entrada a los algoritmos. El hecho de requerir menor intervención por parte del paciente, abre nuevas posibilidades de aplicación de los predictores de glucemia, haciéndose viable su uso en tiempo real, como sistemas de ayuda a la decisión, como detectores de situaciones de riesgo o integrados en algoritmos automáticos de control. En esta tesis doctoral se proponen diferentes algoritmos de predicción de glucemia para pacientes con diabetes, basados en la información registrada por un sistema de monitorización continua de glucosa así como incorporando la información de la insulina administrada y la ingesta de carbohidratos. Los algoritmos propuestos han sido evaluados en simulación y utilizando datos de pacientes registrados en diferentes estudios clínicos. 
Para ello se ha desarrollado una amplia metodología, que trata de caracterizar las prestaciones de los modelos de predicción desde todos los puntos de vista: precisión, retardo, ruido y capacidad de detección de situaciones de riesgo. Se han desarrollado las herramientas de simulación necesarias y se han analizado y preparado las bases de datos de pacientes. También se ha probado uno de los algoritmos propuestos para comprobar la validez de la predicción en tiempo real en un escenario clínico. Se han desarrollado las herramientas que han permitido llevar a cabo el protocolo experimental definido, en el que el paciente consulta la predicción bajo demanda y tiene el control sobre las variables metabólicas. Este experimento ha permitido valorar el impacto sobre el control glucémico del uso de la predicción de glucosa. ABSTRACT Diabetes mellitus is the set of alterations caused by a defect in the amount of secreted insulin or a suboptimal use of insulin. It causes complications in the short, medium and long term that affect the quality of life and reduce the life expectancy of people with diabetes. Diabetes mellitus is currently one of the most important health problems. Prevalence has tripled in the past 20 years and estimations point out that it will affect almost 300 million people by 2025. Due to this increased prevalence, as well as to morbidity and mortality associated with micro- and macrovascular complications, diabetes has become a burden on health systems, their financial resources and their professionals, thus making the disease a major individual and a public health problem. There is currently no cure for this disease, so that the therapeutic goal of diabetes treatment focuses on normalizing blood glucose events. The aim is to minimize hyper- and hypoglycemia and to avoid, or at least to delay, the appearance and development of vascular complications, which are the main cause of morbidity and mortality among people with diabetes. 
A suitable, individualized and controlled treatment for diabetes involves many factors that need to be considered for each patient: age, physical activity, eating habits, presence of complications related or unrelated to diabetes, cultural factors, etc. However, in the short term, the two most influential variables that the patient can act on to manage his/her glycemic levels are the administered insulin doses and the diet. Both suffer from a delay between the time of application and the onset of the action associated with their absorption. Therefore, the ability to predict the evolution of the glycemic profile in the near future could help the patient to make appropriate decisions to maintain good control of his/her disease and to avoid risky situations. Hence, the main goal of glucose prediction in diabetes is to anticipate the evolution of glycemic profiles in the near future. This would assist the patient in adapting his/her lifestyle and in taking corrective actions so that blood glucose levels approach those of a healthy person, consequently avoiding the symptoms and complications of poor glucose control. The recent emergence of continuous glucose monitoring systems has provided new alternatives in this field. The availability of continuous records of changes in glycemic profiles (with a sampling period of one to five minutes) has enabled the design of new models that seek to predict blood glucose using automatically read glucose measurements only (or, at least, significantly reducing the data that must be input manually to the algorithms). Because they require less intervention by the patient, such predictors open new application possibilities, making their use feasible in real time: in decision support systems, in hypo- and hyperglycemia detectors, integrated into automated control algorithms, etc. In this thesis, different glucose prediction algorithms are proposed for patients with diabetes.
These are based on information recorded by a continuous glucose monitoring system and incorporate information on the administered insulin and carbohydrate intakes. The proposed algorithms have been evaluated in silico and using patients' data recorded in different clinical trials. A complete methodology has been developed to characterize the performance of the predictive models from every point of view: accuracy, delay, noise and ability to detect hypo- and hyperglycemia. In addition, simulation tools and patient databases have been deployed. One of the proposed algorithms has additionally been evaluated in terms of real-time prediction performance in a clinical scenario in which the patient checked his/her glucose predictions on demand and retained control over his/her metabolic variables. This made it possible to assess the impact of using glucose prediction on glycemic control. The tools to carry out the defined experimental protocols were also developed in this thesis.
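The abstract does not reproduce the specific model equations. As a hedged illustration only, a minimal predictor of the general kind it describes (forecasting glucose a few CGM samples ahead from past measurements alone) can be sketched as a least-squares autoregressive (AR) model, together with the simplest of the evaluation metrics mentioned above (RMSE). All function names and the AR formulation are illustrative assumptions, not the thesis algorithms:

```python
import numpy as np

def fit_ar(series, order=6):
    """Least-squares fit of AR coefficients to a CGM history (mg/dL).
    coeffs[0] multiplies the oldest sample in each window."""
    X = np.column_stack([series[i:len(series) - order + i] for i in range(order)])
    y = series[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def predict_ahead(series, coeffs, steps):
    """Iterate the AR model to forecast `steps` samples ahead."""
    window = list(series[-len(coeffs):])
    for _ in range(steps):
        window.append(float(np.dot(coeffs, window[-len(coeffs):])))
    return window[len(coeffs):]

def rmse(pred, actual):
    """Root-mean-square error between predicted and reference profiles."""
    pred, actual = np.asarray(pred), np.asarray(actual)
    return float(np.sqrt(np.mean((pred - actual) ** 2)))
```

With a 5-minute sampling period, `steps=6` would correspond to a 30-minute prediction horizon; real evaluations would also quantify the effective delay of the prediction and its noise, as the methodology above indicates.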

Relevância:

90.00% 90.00%

Publicador:

Resumo:

Theoretical models for the thermal response of vertical geothermal boreholes often assume that the characteristic time of variation of the heat injection rate is much larger than the characteristic diffusion time across the borehole. In this case, heat transfer inside the borehole and in its immediate surroundings is quasi-steady in the first approximation, while unsteady effects enter only in the far field. Previous studies have exploited this disparity of time scales, incorporating approximate matching conditions to couple the near-borehole region with the outer unsteady temperature field. In the present work matched asymptotic expansion techniques are used to analyze the heat transfer problem, delivering a rigorous derivation of the true matching condition between the two regions and of the correct definition of the network of thermal resistances that represents the quasi-steady solution near the borehole. Additionally, an apparent temperature due to the unsteady far field is identified that needs to be taken into account by the near-borehole region for the correct computation of the heat injection rate. This temperature differs from the usual mean borehole temperature employed in the literature.
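The two-region structure described in the abstract can be made concrete with the classical building blocks it refers to; the exact matching condition derived in the paper is not reproduced here, and the formulas below are the standard textbook forms, stated only as an illustrative sketch:

```latex
% Quasi-steady near-borehole region: a network of thermal resistances
% relates the fluid temperature T_f to the borehole-wall temperature T_b
% for a heat injection rate q per unit length:
T_f - T_b = q \, R_b
% Unsteady far field: infinite line-source solution, with ground thermal
% conductivity k, diffusivity \alpha, and undisturbed temperature T_\infty:
T(r,t) - T_\infty = \frac{q}{4\pi k} \, E_1\!\left(\frac{r^2}{4\alpha t}\right)
% The quasi-steady approximation near the borehole requires the diffusion
% time across the borehole radius r_b to be short compared with the
% characteristic time t_q of variations of q:
\frac{r_b^2}{\alpha} \ll t_q
```

The paper's contribution, per the abstract, is the rigorous asymptotic matching of these two descriptions and the identification of an apparent far-field temperature, which differs from the mean borehole temperature commonly used in place of it.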