25 results for application to medical science
at Universidad Politécnica de Madrid
Abstract:
We present a framework specially designed to deal with structurally complex data, where all individuals have the same structure, as is the case in many medical domains. A structurally complex individual may be composed of any type of single-valued or multi-valued attributes, including, for example, time series. These attributes are structured according to domain-dependent hierarchies. Our aim is to generate reference models of population groups. These models represent the population archetype and are very useful for supporting important tasks such as diagnosis, fraud detection, analysis of patient evolution, identification of control groups, etc.
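As a minimal illustration of the reference-model idea described above, the sketch below aggregates a group of flat individuals into one archetype (mean for numeric attributes, mode for categorical ones). The patient records and attribute names are invented for illustration; real individuals in the framework would also carry hierarchical and time-series attributes.

```python
# Toy "reference model" of a population group: one prototype value per
# attribute (mean for numeric, mode for categorical attributes).
from collections import Counter

def reference_model(individuals):
    model = {}
    for attr in individuals[0]:
        values = [ind[attr] for ind in individuals]
        if all(isinstance(v, (int, float)) for v in values):
            model[attr] = sum(values) / len(values)      # numeric: mean
        else:
            model[attr] = Counter(values).most_common(1)[0][0]  # mode
    return model

patients = [
    {"age": 60, "sex": "F", "dx": "diabetes"},
    {"age": 70, "sex": "F", "dx": "diabetes"},
    {"age": 65, "sex": "M", "dx": "hypertension"},
]
archetype = reference_model(patients)
```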
Abstract:
A particle accelerator is any device that uses electromagnetic fields to impart energy to charged particles (typically electrons or ionized atoms), accelerating and/or energizing them up to the level required for its purpose. The applications of particle accelerators are countless, ranging from a common TV CRT, through medical X-ray devices, to the large ion colliders used to probe the smallest details of matter. Engineering applications include ion implantation devices for producing better semiconductors and materials with remarkable properties. Materials that must withstand irradiation in future nuclear fusion plants also benefit from particle accelerators. A particle accelerator requires many devices for its correct operation. The most important are the particle sources; the guiding, focusing and correcting magnets; the radiofrequency accelerating cavities; the fast deflection devices; the beam diagnostic mechanisms; and the particle detectors. Historically, most fast particle deflection devices have been built using copper coils and ferrite cores, which could produce a relatively fast magnetic deflection but needed large voltages and currents to counteract the high coil inductance, yielding a response in the microsecond range. Beam stability considerations and the new range of energies and sizes of present-day accelerators and their rings require new devices featuring improved wakefield behaviour and faster response (in the nanosecond range). This can only be achieved by an electromagnetic deflection device based on a transmission line. The electromagnetic deflection device (strip-line kicker) produces a transverse displacement of a particle beam travelling close to the speed of light, in order to extract the particles to another experiment or to inject them into a different accelerator. The deflection is carried out by means of two short pulses of opposite phase.
The diversion of the particles is exerted by the integrated Lorentz force of the electromagnetic field travelling along the kicker. This Thesis presents a detailed calculation, manufacturing and test methodology for strip-line kicker devices. The methodology is then applied to two real cases, which are fully designed, built, tested and finally installed in the CTF3 accelerator facility at CERN (Geneva). Analytical and numerical calculations, both in 2D and 3D, are detailed, starting from the basic specifications, in order to obtain a conceptual design. Time-domain and frequency-domain calculations are developed in the process using different FDM and FEM codes. The following concepts, among others, are analyzed: scattering parameters, resonating higher-order modes, wakefields, etc. Several contributions are presented in the calculation process dealing specifically with strip-line kicker devices fed by electromagnetic pulses. Materials and components typically used for the fabrication of these devices are analyzed in the manufacturing section. Mechanical supports and connections of electrodes are also detailed, presenting some interesting contributions on these concepts. The electromagnetic and vacuum tests are then analyzed. These tests are required to ensure that the manufactured devices fulfil the specifications. Finally, and only from the analytical point of view, the strip-line kickers are studied together with a pulsed power supply based on solid-state power switches (MOSFETs). Solid-state technology applied to pulsed power supplies is introduced, and several circuit topologies are modelled and simulated to obtain fast pulses with good flat-tops.
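The integrated Lorentz-force deflection mentioned above can be sketched numerically. The function below estimates the small-angle kick on an ultra-relativistic beam from matched electric and magnetic field contributions over the electrode length; all numbers are illustrative, not the CTF3 design values.

```python
# Sketch of the integrated Lorentz kick on an ultra-relativistic particle
# traversing a strip-line kicker (illustrative values only).
Q_E = 1.602176634e-19   # elementary charge [C]
C0 = 2.99792458e8       # speed of light [m/s]

def transverse_kick(e_field, b_field, length, beam_energy_ev):
    """Deflection angle [rad] from the integrated Lorentz force.

    For a particle moving at ~c along z, the transverse force is
    F = q*(E + v x B); in a matched strip-line both terms add,
    so F ~ q*(E + c*B) over the electrode length.
    """
    force = Q_E * (e_field + C0 * b_field)   # transverse force [N]
    kick_momentum = force * length / C0      # impulse over transit time L/c
    p_beam = beam_energy_ev * Q_E / C0       # ultra-relativistic p = E/c
    return kick_momentum / p_beam            # small-angle deflection

# Example: 1 MV/m electric field, matched magnetic contribution,
# 0.5 m electrodes, 200 MeV beam
angle = transverse_kick(1e6, 1e6 / C0, 0.5, 200e6)
```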
Abstract:
The acquisition of technical, contextual and behavioral competences is a prerequisite for the sustainable development and strengthening of rural communities. Mapping the territorial status of these competences helps to design the necessary learning, so its inclusion in planning processes is useful for decision making. The article discusses the application of visual representation of competences in a rural development project with Aymara women's communities in Peru. The results show an improvement in transparency and dialogue, resulting in more successful project management and a strengthening of social organization.
Abstract:
Determining the isotopic content of spent nuclear fuel as accurately as possible is gaining importance due to its safety and economic implications. Since higher burnups are nowadays achievable through increased initial enrichments, more efficient burnup strategies within the reactor cores and the extension of irradiation periods, establishing and improving computational methodologies is mandatory in order to carry out reliable criticality and isotopic prediction calculations. Several codes (WIMSD5, SERPENT 1.1.7, SCALE 6.0, MONTEBURNS 2.0 and MCNP-ACAB) and methodologies are tested here and compared against consolidated benchmarks (OECD/NEA pin cell moderated with light water) with the purpose of validating them and reviewing the state of isotopic prediction capabilities. These preliminary comparisons suggest what can generally be expected of these codes when applied to real problems. In the present paper, SCALE 6.0 and MONTEBURNS 2.0 are used to model the same reported geometries, material compositions and burnup histories of cycles 7-11 of the Spanish Vandellós II reactor and to reproduce measured isotopic compositions after irradiation and decay times. We analyze comparisons between the measurements and each code's results for several grades of geometrical modelling detail, using different libraries and cross-section treatment methodologies. The power and flux normalization method implemented in MONTEBURNS 2.0 is discussed, and a new normalization strategy is developed for the selected problems; further options are included to reproduce the temperature distributions of the materials within the fuel assemblies, and a new code is introduced to automate series of simulations and manage material information between them. In order to have a realistic confidence level in the prediction of spent fuel isotopic content, we have estimated uncertainties using our MCNP-ACAB system.
This depletion code, which combines the neutron transport code MCNP and the inventory code ACAB, propagates the uncertainties into the nuclide inventory, assessing the potential impact of uncertainties in the basic nuclear data: cross-sections, decay data and fission yields.
Abstract:
Wind power time series usually show complex dynamics mainly due to non-linearities related to the wind physics and the power transformation process in wind farms. This article provides an approach to the incorporation of observed local variables (wind speed and direction) to model some of these effects by means of statistical models. To this end, a benchmarking between two different families of varying-coefficient models (regime-switching and conditional parametric models) is carried out. The case of the offshore wind farm of Horns Rev in Denmark has been considered. The analysis is focused on one-step ahead forecasting and a time series resolution of 10 min. It has been found that the local wind direction contributes to model some features of the prevailing winds, such as the impact of the wind direction on the wind variability, whereas the non-linearities related to the power transformation process can be introduced by considering the local wind speed. In both cases, conditional parametric models showed a better performance than the one achieved by the regime-switching strategy. The results attained reinforce the idea that each explanatory variable allows the modelling of different underlying effects in the dynamics of wind power time series.
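The conditional parametric idea can be sketched as a one-step autoregressive predictor whose coefficients vary smoothly with wind direction, estimated by locally weighted least squares; the kernel, bandwidth and data below are illustrative assumptions, not the Horns Rev configuration.

```python
# Conditional parametric AR(1) sketch: fit p_t = a(dir) + b(dir)*p_{t-1}
# locally around a given wind direction dir0 [deg].
import math

def fit_local(power, direction, dir0, bandwidth=30.0):
    sw = swx = swy = swxx = swxy = 0.0
    for t in range(1, len(power)):
        # circular distance in direction, Gaussian kernel weight
        d = min(abs(direction[t] - dir0), 360.0 - abs(direction[t] - dir0))
        w = math.exp(-0.5 * (d / bandwidth) ** 2)
        x, y = power[t - 1], power[t]
        sw += w; swx += w * x; swy += w * y
        swxx += w * x * x; swxy += w * x * y
    b = (swxy - swx * swy / sw) / (swxx - swx * swx / sw)
    a = (swy - b * swx) / sw
    return a, b

# Synthetic perfectly persistent series decaying by factor 0.8 each step
a, b = fit_local([0.8 ** i for i in range(20)], [180.0] * 20, 180.0)
```

For the synthetic series the fitted coefficients recover the generating dynamics (b near 0.8, a near 0); with real data the coefficients would change with `dir0`, which is the point of the conditional parametric family.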
Abstract:
Program specialization optimizes programs for known values of the input. It is often the case that the set of possible input values is unknown, or this set is infinite. However, a form of specialization can still be performed in such cases by means of abstract interpretation, specialization then being with respect to abstract values (substitutions), rather than concrete ones. We study the multiple specialization of logic programs based on abstract interpretation. This involves in principle, and based on information from global analysis, generating several versions of a program predicate for different uses of such predicate, optimizing these versions, and, finally, producing a new, "multiply specialized" program. While multiple specialization has received theoretical attention, little previous evidence exists on its practicality. In this paper we report on the incorporation of multiple specialization in a parallelizing compiler and quantify its effects. A novel approach to the design and implementation of the specialization system is proposed. The resulting implementation techniques result in identical specializations to those of the best previously proposed techniques but require little or no modification of some existing abstract interpreters. Our results show that, using the proposed techniques, the resulting "abstract multiple specialization" is indeed a relevant technique in practice. In particular, in the parallelizing compiler application, a good number of run-time tests are eliminated and invariants extracted automatically from loops, resulting generally in lower overheads and in several cases in increased speedups.
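A toy sketch of the multiple-specialization step, assuming for illustration that the analysis output is simply a set of run-time checks known to succeed at each abstract call pattern (the actual system works on abstract substitutions over logic programs): one version of the predicate is emitted per pattern, with the redundant checks removed.

```python
# Multiple specialization sketch: from analysis information (which checks
# are guaranteed to succeed at each call pattern), emit one optimized
# version per distinct abstract call pattern.
def specialize(generic_checks, call_patterns):
    versions = {}
    for name, known_to_succeed in call_patterns.items():
        # keep only the run-time tests not implied by the abstract info
        versions[name] = [c for c in generic_checks
                          if c not in known_to_succeed]
    return versions

checks = ["is_bound(X)", "is_bound(Y)", "is_list(Z)"]
patterns = {
    "p_version1": {"is_bound(X)", "is_bound(Y)"},  # X, Y known ground here
    "p_version2": set(),                           # nothing known
}
versions = specialize(checks, patterns)
# p_version1 keeps only the list test; p_version2 keeps all three
```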
Abstract:
This paper describes a two-part methodology for managing the risk posed by water supply variability to irrigated agriculture. First, an econometric model is used to explain the variation in the production value of irrigated agriculture. The explanatory variables include an index of irrigation water availability (surface storage levels), a price index representative of the crops grown in each geographical unit, and a time variable. The model corrects for autocorrelation and is applied to 16 Spanish provinces that are representative in terms of irrigated agriculture. In the second part, the fitted models are used for the economic evaluation of drought risk. Inflow variability in the hydrological system servicing each province is used to perform ex-ante evaluations of economic output for the upcoming irrigation season. The model's error and the probability distribution functions (PDFs) of the reservoirs' storage variations are used to generate Monte Carlo (Latin Hypercube) simulations of agricultural output 7 and 3 months prior to the irrigation season. The results of these simulations illustrate the different risk profiles of each management unit, which depend on farm productivity and on the probability distribution function of water inflow to reservoirs. The potential for ex-ante drought impact assessments is demonstrated. By complementing hydrological models, this method can assist water managers and decision-makers in managing reservoirs.
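The Latin Hypercube step can be sketched as follows, assuming for illustration a uniform storage-index distribution and invented linear model coefficients (the paper's fitted econometric models and inflow PDFs are province-specific).

```python
# Latin Hypercube sampling: one draw per equal-probability stratum of the
# input distribution, pushed through a fitted production-value model.
import random

def latin_hypercube(n, inverse_cdf, seed=0):
    rng = random.Random(seed)
    strata = [(i + rng.random()) / n for i in range(n)]  # one u per stratum
    rng.shuffle(strata)
    return [inverse_cdf(u) for u in strata]

def output_value(storage_index, price_index, a=50.0, b=1.2, c=0.8):
    # invented linear stand-in for the fitted econometric model
    return a + b * storage_index + c * price_index

# Illustrative uniform storage index on [20, 100]
samples = latin_hypercube(1000, lambda u: 20.0 + 80.0 * u)
outputs = [output_value(s, price_index=100.0) for s in samples]
mean = sum(outputs) / len(outputs)
```

Because each stratum is sampled exactly once, the LHS mean converges much faster than plain Monte Carlo for the same number of draws.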
Abstract:
Old-growth trees play a very important role in the maintenance of biodiversity in forests. However, no clear definition is yet available to help identify them, since tree age is usually not recorded in National Forest Inventories. The aim of this study was to develop and test a new method to identify old-growth trees using a species-specific threshold for tree diameter in National Forest Inventories. Different nonlinear mixed models for the diameter-age relationship were generated using data from the Spanish Forest Inventory in order to identify the most appropriate one for Aleppo pine in its south-western distribution area. The asymptote of the optimal model indicates the threshold diameter for defining an old-growth tree. Additionally, five site index curves were examined to analyze the influence of site quality on these models.
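The asymptote-as-threshold idea can be sketched with a Chapman-Richards curve, a common nonlinear model form for diameter-age relationships; the parameter values below are invented for illustration, not the fitted Aleppo pine values.

```python
# Chapman-Richards diameter-age curve: d(t) = A * (1 - exp(-k*t))^m.
# The asymptote A is read as the species-specific old-growth threshold.
import math

def chapman_richards(t, A, k, m):
    return A * (1.0 - math.exp(-k * t)) ** m

A, k, m = 60.0, 0.02, 1.3       # invented parameters; A in cm
threshold = A                   # threshold diameter = model asymptote
d200 = chapman_richards(200, A, k, m)   # an old tree is close to A
```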
Abstract:
The educational platform Virtual Science Hub (ViSH) has been developed as part of the GLOBAL excursion European project. ViSH (http://vishub.org/) is a portal where teachers and scientists interact to create virtual excursions to science infrastructures. The main motivation behind the project was to connect teachers - and in consequence their students - to scientific institutions and the wide range of infrastructures and resources they work with. Thus the idea of a hub was born that would allow the two worlds of scientists and teachers to connect and to innovate science teaching. The core of the ViSH's concept design is based on virtual excursions, which allow for a number of pedagogical models to be applied. According to our internal definition, a virtual excursion is a tour through some digital context by teachers and pupils on a given topic that is attractive and has an educational purpose. Inquiry-based, project-based and problem-based learning are the most prominent approaches that a virtual excursion may serve. The domain-specific resources and scientific infrastructures currently available on the ViSH focus on life sciences, nanotechnology, biotechnology, grid and volunteer computing. The virtual excursion approach allows an easy combination of these resources into interdisciplinary teaching scenarios. In addition, social networking features support the users in collaborating and communicating in relation to these excursions and thus create a community of interest for innovative science teaching. The design and development phases were performed following a participatory design approach. An important aspect in this process was to create design partnerships amongst all actors involved - researchers, developers, infrastructure providers, teachers, social scientists, and pedagogical experts - early in the project.
A joint sense of ownership was created, and important changes were implemented in the ViSH during the conceptual phase thanks to early user feedback. Technology-wise, the ViSH is based on the latest web technologies in order to make it cross-platform compatible, so that it works on several operating systems such as Windows, Mac or Linux, and multi-device accessible, on desktop, tablet and mobile devices. The platform has been developed in HTML5, the latest standard for web development, assuring that it can run on any modern browser. In addition to the social networking features, a core element of the ViSH is the virtual excursions editor. It is a web tool that allows teachers and scientists to create rich mash-ups of learning resources provided by the e-Infrastructures (i.e. remote laboratories and live webcams). These rich mash-ups can be presented in either slides or flashcards format. Taking advantage of the supported web architecture, additional powerful components have been integrated, such as a recommendation engine that provides personalized suggestions about educational content or interesting users, and a videoconference tool to enhance real-time collaboration, MashMeTV (http://www.mashme.tv/).
Abstract:
Modeling is an essential tool for the development of atmospheric emission abatement measures and air quality plans. Most often these plans are related to urban environments with high emission density and population exposure. However, air quality modeling in urban areas is a rather challenging task. As environmental standards become more stringent (e.g. European Directive 2008/50/EC), more reliable and sophisticated modeling tools are needed to simulate measures and plans that may effectively tackle air quality exceedances, common in large urban areas across Europe, particularly for NO2. This also implies that emission inventories must satisfy a number of conditions such as consistency across the spatial scales involved in the analysis, consistency with the emission inventories used for regulatory purposes and versatility to match the requirements of different air quality and emission projection models. This study reports the modeling activities carried out in Madrid (Spain) highlighting the atmospheric emission inventory development and preparation as an illustrative example of the combination of models and data needed to develop a consistent air quality plan at urban level. These included a series of source apportionment studies to define contributions from the international, national, regional and local sources in order to understand to what extent local authorities can enforce meaningful abatement measures. Moreover, source apportionment studies were conducted in order to define contributions from different sectors and to understand the maximum feasible air quality improvement that can be achieved by reducing emissions from those sectors, thus targeting emission reduction policies to the most relevant activities. Finally, an emission scenario reflecting the effect of such policies was developed and the associated air quality was modeled.
Abstract:
Fuel cycles are designed with the aim of obtaining the highest possible amount of energy. Since higher burnup values are reached, it is necessary to improve our disposal designs, traditionally based on the conservative assumption that they contain fresh fuel. The criticality calculations involved must consider burnup, making the most of the experimental and computational capabilities developed, respectively, to measure and predict the isotopic content of the spent nuclear fuel. These high-burnup scenarios encourage a review of the computational tools to find possible weaknesses in the nuclear data libraries, in the methodologies applied and in their range of applicability. Experimental measurements of spent nuclear fuel provide the perfect framework for benchmarking the most well-known and established codes in both industry and academic research. For the present paper, SCALE 6.0/TRITON and MONTEBURNS 2.0 have been chosen to follow the isotopic content of four samples irradiated in the Spanish Vandellós-II pressurized water reactor up to burnup values ranging from 40 GWd/MTU to 75 GWd/MTU. By comparison with the experimental data reported for these samples, we can probe the applicability of these codes to high-burnup problems. We have developed new computational tools within MONTEBURNS 2.0 that make it possible to handle an irradiation history including geometrical and positional changes of the samples within the reactor core. This paper describes the irradiation scenario against which the mentioned codes and our capabilities are to be benchmarked.
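Benchmarking of this kind typically reduces to calculated-to-experimental (C/E) ratios per nuclide; a minimal sketch with invented concentrations (not the Vandellós-II measurements):

```python
# C/E ratio per nuclide: values near 1.0 indicate good code prediction.
def c_over_e(calculated, measured):
    return {nuc: calculated[nuc] / measured[nuc] for nuc in measured}

# Invented isotopic concentrations [g per g of fuel], for illustration only
measured   = {"U235": 5.1e-3, "Pu239": 6.0e-3, "Cs137": 1.6e-3}
calculated = {"U235": 5.3e-3, "Pu239": 5.8e-3, "Cs137": 1.6e-3}
ratios = c_over_e(calculated, measured)
```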
Abstract:
The MobiGuide system provides patients with personalized decision support tools, based on computerized clinical guidelines, in a mobile environment. The generic capabilities of the system will be demonstrated in the clinical domain of Gestational Diabetes (GD). This paper presents a methodology to identify personalized recommendations obtained from the analysis of the GD guideline. We added a conceptual parallel part to the formalization of the GD guideline, called the "parallel workflow", that allows the patient's personal context and preferences to be considered. As a result of analysing the GD guideline and eliciting medical knowledge, we identified three different types of personalized advice (therapy, measurements and upcoming events) that will be implemented to guide patients at home, supported by the MobiGuide system. These results will be essential to determine the distribution of functionalities between mobile and server decision support capabilities.
Abstract:
The application of conservation treatments, such as consolidation and protection, has been shown to be ineffective in many cases, and even harmful. Evaluation studies should be a mandatory task, ideally before and after the intervention, but both tasks are complex and unusual in the case of archaeological heritage. This study focuses mainly on analyzing changes in the petrophysical properties of stone material from the archaeological sites of Merida (Spain), evaluating, both on site and in the laboratory, the effects of the different conservation treatments applied in past interventions, through the integration of different non-destructive techniques (NDT) and portable analysis devices available at the Institute of Geosciences (CSIC-UCM). These techniques allow not only the assessment of effectiveness and alteration processes, but also the monitoring of treatment durability, focused mainly on the 1996 intervention in the case of the Roman Theater, as well as various punctual interventions from the 90s to date in the House of Mitreo. The studies carried out on the archaeological sites of Merida allow us to compare outcomes and also to check the limitations of the equipment used. In this paper we discuss the use of some of these techniques, their integration and their limits for the assessment of conservation treatments, showing some examples from the Merida case study.
Abstract:
The aim of this paper is to develop a probabilistic modeling framework for the segmentation of structures of interest from a collection of atlases. Given a subset of atlases registered to the target image for a particular Region of Interest (ROI), a statistical model of appearance and shape is computed for fusing the labels. Segmentations are obtained by minimizing an energy function associated with the proposed model, using a graph-cut technique. We test different label fusion methods on publicly available MR images of human brains.
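As a point of reference for the label-fusion comparison, the simplest baseline (majority voting across registered atlases) can be sketched as follows; the toy binary labels are invented for illustration.

```python
# Majority-vote label fusion: each voxel takes the label most atlases
# agree on (atlases are assumed already registered to the target image).
from collections import Counter

def majority_vote(atlas_labels):
    """atlas_labels: list of equal-length label lists, one per atlas."""
    fused = []
    for voxel_votes in zip(*atlas_labels):
        fused.append(Counter(voxel_votes).most_common(1)[0][0])
    return fused

labels = majority_vote([[0, 1, 1, 0],
                        [0, 1, 0, 0],
                        [1, 1, 1, 0]])
# fused label per voxel by majority across the three atlases
```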
Abstract:
An inverse optimization strategy was developed to determine single crystal properties from experimental results on the mechanical behavior of polycrystals. The polycrystal behavior was obtained by means of the finite element simulation of a representative volume element of the microstructure, in which the dominant slip and twinning systems were included in the constitutive equation of each grain. The inverse problem was solved by means of the Levenberg-Marquardt method, which provided an excellent fit to the experimental results. The iterative optimization process followed a hierarchical scheme in which simple representative volume elements were used initially, followed by more realistic ones to reach the final optimum solution, leading to important reductions in computer time. The new strategy was applied to identify the initial and saturation critical resolved shear stresses and the hardening modulus of the active slip systems and extension twinning in a textured AZ31 Mg alloy. The results were in general agreement with the data in the literature but also showed some differences. These were partly explained by the higher accuracy of the new optimization strategy, but it was also shown that the number of independent experimental stress-strain curves used as input is critical to reaching an accurate solution to the inverse optimization problem. It was concluded that at least three independent stress-strain curves are necessary to determine the single crystal behavior from polycrystal tests in the case of highly textured Mg alloys.
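A minimal sketch of the inverse-optimization loop, using a hand-rolled Levenberg-Marquardt iteration on a toy linear hardening law in place of the crystal-plasticity finite element model; the parameter names (a toy CRSS `tau0` and hardening modulus `h`) and all data are invented for illustration.

```python
# Levenberg-Marquardt fit of two model parameters to "experimental" points.
def levenberg_marquardt(model, params, xs, ys, iters=50, lam=1e-3):
    p = list(params)
    for _ in range(iters):
        r = [model(x, p) - y for x, y in zip(xs, ys)]   # residuals
        # forward-difference Jacobian, one column per parameter
        J = []
        for x in xs:
            row = []
            for j in range(len(p)):
                q = list(p); q[j] += 1e-6
                row.append((model(x, q) - model(x, p)) / 1e-6)
            J.append(row)
        # solve the damped normal equations (J^T J + lam*I) dp = -J^T r
        # (2x2 case written out by hand)
        a = sum(J[i][0] * J[i][0] for i in range(len(xs))) + lam
        b = sum(J[i][0] * J[i][1] for i in range(len(xs)))
        d = sum(J[i][1] * J[i][1] for i in range(len(xs))) + lam
        g0 = -sum(J[i][0] * r[i] for i in range(len(xs)))
        g1 = -sum(J[i][1] * r[i] for i in range(len(xs)))
        det = a * d - b * b
        p[0] += (d * g0 - b * g1) / det
        p[1] += (a * g1 - b * g0) / det
    return p

stress = lambda strain, p: p[0] + p[1] * strain   # toy hardening law
xs = [0.0, 0.01, 0.02, 0.04]
ys = [stress(x, [80.0, 1500.0]) for x in xs]      # synthetic "experiment"
tau0, h = levenberg_marquardt(stress, [50.0, 500.0], xs, ys)
```

In the actual strategy each residual evaluation is a full polycrystal finite element simulation, which is why the hierarchical scheme (coarse representative volume elements first) pays off.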