28 results for Optimisation of methods
at Universidad Politécnica de Madrid
Abstract:
At present there is a large body of literature on the advantages and disadvantages of different methods for the statistical and dynamical downscaling of climate variables projected by climate models. Less attention has been paid to indirect variables, such as runoff, which play a significant role in evaluating the impact of climate change on hydrological systems. Runoff shows a much greater bias in climate models than climate variables such as temperature or precipitation. It is therefore very important to identify the methods that minimize bias while downscaling runoff from the gridded results of climate models to the basin scale.
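For instance, candidate downscaling methods can be compared by the relative bias of their basin-averaged runoff against observations. The sketch below illustrates this kind of screening; the method names and data arrays are hypothetical placeholders, not those used in the study.

```python
import numpy as np

def relative_bias(simulated, observed):
    """Relative bias of simulated basin-scale runoff vs. observations."""
    return (np.mean(simulated) - np.mean(observed)) / np.mean(observed)

# Hypothetical monthly runoff series (mm) for one basin.
observed = np.array([55.0, 60.0, 48.0, 70.0, 65.0, 52.0])
candidates = {
    "raw_gcm_grid": np.array([80.0, 95.0, 70.0, 100.0, 90.0, 85.0]),
    "statistical_downscaling": np.array([58.0, 66.0, 50.0, 74.0, 68.0, 57.0]),
    "dynamical_downscaling": np.array([62.0, 71.0, 55.0, 80.0, 72.0, 60.0]),
}

# Rank the candidate methods by absolute relative bias.
for name, sim in sorted(candidates.items(),
                        key=lambda kv: abs(relative_bias(kv[1], observed))):
    print(f"{name}: relative bias = {relative_bias(sim, observed):+.2%}")
```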
Abstract:
This paper presents a study of the effectiveness of three different algorithms for the parallelization of logic programs based on compile-time detection of independence among goals. The algorithms are embedded in a complete parallelizing compiler, which incorporates different abstract-interpretation-based program analyses. The complete system shows that the task of automatic program parallelization is practical. The trade-offs involved in using each of the algorithms in this task are studied experimentally, their weaknesses are identified, and possible improvements are discussed.
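As a rough illustration of the underlying idea (not of the paper's actual algorithms), two goals can be run in parallel under strict independence when they share no unbound variables. The sketch below checks this syntactically for goals given as name/argument pairs, assuming all arguments are free variables.

```python
def independent(goal_a, goal_b):
    """Strict-independence test: goals that share no variables may run in parallel.
    Goals are (name, argument-variable) tuples; every argument is assumed unbound."""
    _, args_a = goal_a
    _, args_b = goal_b
    return set(args_a).isdisjoint(args_b)

# Hypothetical clause body: p(X, Y) :- q(X), r(Y), s(X, Y).
q = ("q", ["X"])
r = ("r", ["Y"])
s = ("s", ["X", "Y"])

print(independent(q, r))  # True  -> q and r can be annotated for parallel execution
print(independent(q, s))  # False -> s shares X with q and must wait for it
```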
Abstract:
Statistical methods for the analysis of PSIR MRI
Abstract:
Today's motivation for autonomous systems research stems from the fact that networked environments have reached a level of complexity and heterogeneity that makes their control and management by human administrators alone more and more difficult. The optimisation of performance metrics for the air traffic management system, as in other networked systems, has become more complex with the increasing number of flights, capacity constraints, environmental factors and safety regulations. It is anticipated that a new structure of planning layers and the introduction of higher levels of automation will reduce complexity and optimise the performance metrics of the air traffic management system. This paper discusses the complexity of optimising air traffic management performance metrics and proposes a way forward based on higher levels of automation.
Abstract:
The analysis of large amounts of data is a field with many years of research behind it, centred on extracting significant values so that data become easier to understand and interpret. The analysis of interdependence between time series is an important line of research within it, mainly as a result of advances in the characterization of dynamical systems from the signals they produce. In medicine, many studies try to understand the behaviour of the brain, its mode of operation and its internal connections. The human brain comprises approximately 10^11 neurons, each of which makes about 10^3 synaptic connections. This huge number of connections between individual processing elements provides the fundamental substrate for neuronal ensembles to become transiently synchronized or functionally connected. A similar complex network configuration and dynamics can also be found at the macroscopic scales of systems neuroscience and brain imaging. The emergence of dynamically coupled cell assemblies represents the neurophysiological substrate for cognitive functions such as perception, learning and thinking.
Understanding the complex network organization of the brain on the basis of neuroimaging data represents one of the most formidable challenges for systems neuroscience. Brain connectivity is an elusive concept that refers to different interrelated aspects of brain organization: structural connectivity, functional connectivity (FC) and effective connectivity (EC). Structural connectivity refers to the network of physical connections linking sets of neurons; it is the anatomical structure of brain networks. FC, in contrast, refers to the statistical dependence between the signals stemming from two distinct units within a nervous system, while EC refers to the causal interactions between them. This research opens the door to addressing brain-related diseases such as Parkinson's disease, senile dementia and mild cognitive impairment. One of the most important projects associated with research on Alzheimer's and other diseases is the European Blue Brain project, in which the Center for Biomedical Technology (CTB) of Universidad Politecnica de Madrid (UPM) participates. The CTB researchers have developed a magnetoencephalography (MEG) data processing tool that allows data to be visualised and analysed in an intuitive way. This tool is named HERMES, and it is presented in this document.
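As a minimal illustration of functional connectivity (and not of HERMES itself, which implements many more estimators), FC between recorded channels is often quantified by a zero-lag Pearson correlation matrix; the sketch below computes one for synthetic multichannel data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic MEG-like recording: 5 channels x 1000 samples.
n_channels, n_samples = 5, 1000
data = rng.standard_normal((n_channels, n_samples))
data[1] += 0.7 * data[0]   # make channels 0 and 1 statistically dependent

# Functional connectivity as the matrix of pairwise Pearson correlations.
fc = np.corrcoef(data)
print(np.round(fc, 2))     # entry (0, 1) stands out against the background
```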
Abstract:
This paper includes the experimental study, analysis, redesign and subsequent testing of the parts of a closed-circuit, low-speed wind tunnel that are relevant in terms of total pressure loss. The objective is to lower the energy consumption of this system for given conditions in the test chamber, so as to reduce operational costs. In order to achieve this objective, several tasks were performed, as the text shows in its different parts. For these tasks the ETSIAE wind tunnel was used, although the results of this work can be extrapolated to any wind tunnel with the same characteristics. Part II presents a preliminary theoretical study of the general running of a closed-circuit, low-speed wind tunnel, as well as the procedure followed to conduct experimental tests for obtaining the total pressure loss in its parts. Results from these tests and their analysis are included in this part. Part III analyses the influence of corner 1 on the pressure loss. As discussed in that part, corner 1 contributes greatly to the total pressure loss of the wind tunnel; therefore, it is the first part that should be modified in order to improve the performance of the wind tunnel. In Part IV, an optimised guide vane is designed in order to reduce the pressure loss in corner 1 of the wind tunnel. The MISES software is used to achieve this goal by selecting the optimum guide vane. So that the new guide vane can be introduced into wind tunnels at affordable cost, an ease-of-construction criterion is maintained during design; for this reason, the guide vane consists of simple aerodynamic contours. Part V includes some possible improvements to the proposed guide vane, in order to evaluate whether there is room for improvement in its design. Finally, Part VI includes the tests that were conducted in the wind tunnel with the new guide vane cascade and the analysis of their results, in order to assess whether the proposed design fulfils the requirement of lowering the total pressure loss in the wind tunnel. Part VII gathers the main ideas resulting from the whole work.
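As a back-of-the-envelope aid (not taken from the paper), the contribution of each circuit component is commonly expressed as a loss coefficient: the total-pressure drop across the component divided by a reference dynamic pressure. The sketch below sums hypothetical component coefficients referred to the test-section dynamic pressure.

```python
# Total pressure loss budget of a closed-circuit wind tunnel (illustrative values only).
rho = 1.225                       # air density, kg/m^3
v_test = 40.0                     # test-section velocity, m/s
q_test = 0.5 * rho * v_test**2    # reference dynamic pressure, Pa

# Hypothetical loss coefficients referred to q_test; corners often dominate.
loss_coefficients = {
    "test_section": 0.02,
    "corner_1": 0.06,
    "corner_2": 0.05,
    "diffuser": 0.04,
    "settling_chamber_and_contraction": 0.03,
}

total_k = sum(loss_coefficients.values())
print(f"Total loss coefficient: {total_k:.2f}")
print(f"Total pressure loss: {total_k * q_test:.1f} Pa")
print(f"Corner 1 share: {loss_coefficients['corner_1'] / total_k:.0%}")
```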
Abstract:
There is increasing pressure on developers to produce usable systems, which requires the use of appropriate methods to support user-centred design during development. There is currently no consistent advice on which methods are appropriate in which circumstances, so the selection of methods relies on individual experience and expertise. Considerable effort is required to collate information from various sources and to understand the applicability of each method in a particular situation. Usability Planner is a tool aimed at supporting the selection of the most appropriate methods depending on project and organizational constraints. Many of the rules employed are derived from ISO standards, complemented with rules from the authors' experience.
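A rule-based selector of this kind can be pictured as filtering a catalogue of methods against project constraints. The sketch below is a hypothetical illustration of the idea only; it is not Usability Planner's actual rule base nor the ISO-derived rules.

```python
# Hypothetical catalogue of user-centred design methods and applicability conditions.
methods = [
    {"name": "heuristic evaluation", "min_budget": 1, "needs_users": False, "stages": {"design", "evaluation"}},
    {"name": "usability testing",    "min_budget": 3, "needs_users": True,  "stages": {"evaluation"}},
    {"name": "contextual inquiry",   "min_budget": 2, "needs_users": True,  "stages": {"requirements"}},
]

def recommend(stage, budget_level, users_available):
    """Return the methods whose applicability rules match the project constraints."""
    return [m["name"] for m in methods
            if stage in m["stages"]
            and budget_level >= m["min_budget"]
            and (users_available or not m["needs_users"])]

print(recommend(stage="evaluation", budget_level=2, users_available=False))
# -> ['heuristic evaluation']
```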
Abstract:
Patent and trademark offices that are run according to the principles of new management have an inherent need for dependable forecasting data when planning capacity and service levels. For the Spanish Office of Patents and Trademarks to carry out efficient planning of its resource needs, it requires methods that allow it to predict changes in the number of patent and trademark applications over different time horizons. The approach to predicting the time series of Spanish patent and trademark applications (1979-2009) was based on the use of different short-term time-series prediction techniques. The methods used can be grouped into two specific areas: regression models of trends and time series models. The results of this study show that it is possible to model the series of patent and trademark applications with different models, especially ARIMA, with satisfactory model fit and relatively low error.
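As an illustration of the ARIMA approach (with a made-up series and an arbitrary model order, not the orders fitted in the study), the statsmodels library can fit the model and produce a short-horizon forecast:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical annual counts of applications, 1979-2009 (synthetic trend + noise).
years = pd.period_range("1979", "2009", freq="Y")
rng = np.random.default_rng(1)
counts = pd.Series(2000 + 60 * np.arange(len(years)) + rng.normal(0, 120, len(years)),
                   index=years)

# Fit an ARIMA(1,1,1) model (order chosen for illustration only) and forecast 3 years ahead.
model = ARIMA(counts, order=(1, 1, 1))
result = model.fit()
print(result.forecast(steps=3))
```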
Abstract:
This poster raises the issue of a research work oriented to the storage, retrieval, representation and analysis of dynamic GI, taking into account the semantic, the temporal and the spatiotemporal components. We intend to define a set of methods, rules and restrictions for the adequate integration of these components into the primary elements of GI: theme, location and time [1]. We intend to establish and incorporate three new structures (layers) into the core of data storage by using mark-up languages: a semantic-temporal structure, a geosemantic structure, and an incremental spatiotemporal structure. The ultimate objective is the modelling and representation of the dynamic nature of geographic features, establishing mechanisms to store geometries enriched with a temporal structure (regardless of space) and a set of semantic descriptors detailing and clarifying the nature of the represented features and their temporality. Thus, data would be provided with the capability of pinpointing and expressing their own basic and temporal characteristics, enabling them to interact with each other according to their context and to the time and meaning relationships that could eventually be established.
Abstract:
A number of thrombectomy devices using a variety of methods have now been developed to facilitate clot removal. We present research involving one such experimental device recently developed in the UK, the 'GP' Thrombus Aspiration Device (GPTAD). This device has the potential to bring about the extraction of a thrombus. Although the device is at a relatively early stage of development, the results look encouraging. In this work we present an analysis and modeling of the GPTAD by means of the bond graph technique, which appears to be a highly effective method of simulating the device under a variety of conditions. Such modeling is useful for optimizing the GPTAD and predicting the result of clot extraction. The aim of this simulation model is to obtain the minimum pressure necessary to extract the clot and to verify that both the pressure and the time required to complete the clot extraction are realistic for use in clinical situations and are consistent with any experimentally obtained data. We therefore consider aspects of rheology and mechanics in our modeling.
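As a crude order-of-magnitude check (not the bond graph model itself, and with purely illustrative parameter values), the minimum aspiration pressure can be estimated from a static force balance between the pressure difference acting on the clot cross-section and the adhesion/friction force holding the clot against the vessel wall:

```python
import math

# Illustrative parameters only; not data from the GPTAD study.
vessel_diameter = 3e-3        # m
clot_length = 10e-3           # m
wall_shear_strength = 400.0   # Pa, assumed clot-wall adhesion/friction stress

cross_section = math.pi * (vessel_diameter / 2) ** 2     # m^2
wall_area = math.pi * vessel_diameter * clot_length      # m^2

# Static balance: dP * A_cross >= tau_wall * A_wall  =>  dP_min
dp_min = wall_shear_strength * wall_area / cross_section  # Pa
print(f"Minimum aspiration pressure ~ {dp_min / 1000:.1f} kPa "
      f"({dp_min / 101325:.2f} atm)")
```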
Abstract:
This poster raises the issue of a research work oriented to the storage, retrieval, representation and analysis of dynamic GI, taking into account the semantic, the temporal and the spatiotemporal components. We intend to define a set of methods, rules and restrictions for the adequate integration of these components into the primary elements of GI: theme, location and time [1]. We intend to establish and incorporate three new structures (layers) into the core of data storage by using mark-up languages: a semantic-temporal structure, a geosemantic structure, and an incremental spatiotemporal structure. The ultimate objective is the modelling and representation of the dynamic nature of geographic features, establishing mechanisms to store geometries enriched with a temporal structure (regardless of space) and a set of semantic descriptors detailing and clarifying the nature of the represented features and their temporality. Thus, data would be provided with the capability of pinpointing and expressing their own basic and temporal characteristics, enabling them to interact with each other according to their context and to the time and meaning relationships that could eventually be established.
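One way to picture such a layered, mark-up-based encoding is a feature record that carries its geometry together with semantic and temporal descriptors. The element names below are hypothetical, chosen only to illustrate the idea, not the actual schema proposed in the work.

```python
import xml.etree.ElementTree as ET

# Hypothetical mark-up for a dynamic geographic feature with a geometry layer,
# a semantic-temporal layer and a spatiotemporal increment.
feature = ET.Element("feature", id="road-42", theme="transport")
geom = ET.SubElement(feature, "geometry", srs="EPSG:4326")
geom.text = "LINESTRING(-3.70 40.41, -3.69 40.42)"

sem = ET.SubElement(feature, "semanticTemporal")
ET.SubElement(sem, "descriptor", name="status", validFrom="2010-01-01").text = "under construction"
ET.SubElement(sem, "descriptor", name="status", validFrom="2012-06-01").text = "open to traffic"

delta = ET.SubElement(feature, "spatioTemporalIncrement", at="2012-06-01")
delta.text = "geometry extended by 120 m at the eastern end"

print(ET.tostring(feature, encoding="unicode"))
```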
Abstract:
The objective of this paper is to address the methodological process of a teaching strategy for training in project management complexity in postgraduate programs. The proposal is made up of different methods —intuitive, comparative, deductive, case study, problem-solving and Project-Based Learning— and different activities inside and outside the classroom. This integration of methods motivated the use of the concept of "learning strategy". The strategy has two phases: firstly, the integration of the competences —technical, behavioral and contextual— in real projects; and secondly, a learning activity oriented towards a higher level of knowledge: evaluating the complexity of project management in real situations. Both the competences in the learning strategy and the Project Complexity Evaluation are based on the ICB of the IPMA. The learning strategy is applied in an international postgraduate program —an Erasmus Mundus Master of Science— with the participation of five universities of the European Union. This master program is the fruit of a cooperative experience involving one Educational Innovation Group of the UPM (GIE-Project), two research groups of the UPM and collaboration with other agents external to the university. Some reflections on the experience and the main success factors of the learning strategy are presented in the paper.
Abstract:
The objective of this paper is to address the methodological process of a teaching strategy for training in project management complexity in postgraduate programs. The proposal is made up of different methods —intuitive, comparative, deductive, case study, problem-solving and Project-Based Learning— and different activities inside and outside the classroom. This integration of methods motivated the use of the concept of "learning strategy". The strategy has two phases: firstly, the integration of the competences —technical, behavioral and contextual— in real projects; and secondly, a learning activity oriented towards a higher level of knowledge: evaluating the complexity of project management in real situations. Both the competences in the learning strategy and the Project Complexity Evaluation are based on the ICB of the IPMA. The learning strategy is applied in an international postgraduate program —an Erasmus Mundus Master of Science— with the participation of five universities of the European Union. This master program is the fruit of a cooperative experience involving one Educational Innovation Group of the UPM (GIE-Project), two research groups of the UPM and collaboration with other agents external to the university. Some reflections on the experience and the main success factors of the learning strategy are presented in the paper.
Abstract:
High-resolution monochromated electron energy loss spectroscopy (EELS) at subnanometric spatial resolution and <200 meV energy resolution has been used to assess the valence-band properties of a distributed Bragg reflector multilayer heterostructure composed of InAlN lattice-matched to GaN. This work thoroughly presents the collection of methods and computational tools put together for this task, among them zero-loss-peak subtraction and nonlinear fitting tools, and theoretical modeling of the electron scattering distribution. EELS analysis allows a great amount of information to be retrieved: the indium concentration in the InAlN layers is monitored through the local plasmon energy position and calculated using a bowing-parameter version of Vegard's law. A dielectric characterization of the InAlN and GaN layers has also been performed through Kramers-Kronig analysis of the valence-EELS data, allowing the band gap energy to be measured and giving insight into the polytypism of the GaN layers.
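For illustration, a bowing-parameter version of Vegard's law interpolates the plasmon energy between the binary endpoints, and the measured plasmon energy can be inverted numerically to give the indium fraction. The endpoint energies and bowing parameter below are placeholders, not the values calibrated in the study.

```python
import numpy as np

# Placeholder parameters (eV); not the calibrated values from the paper.
E_ALN, E_INN, BOWING = 21.0, 15.0, 1.0

def plasmon_energy(x):
    """Vegard-type interpolation with a bowing term for In(x)Al(1-x)N."""
    return x * E_INN + (1 - x) * E_ALN - BOWING * x * (1 - x)

def indium_fraction(e_measured):
    """Invert the relation: solve b*x^2 + (E_InN - E_AlN - b)*x + (E_AlN - E_meas) = 0."""
    roots = np.roots([BOWING, E_INN - E_ALN - BOWING, E_ALN - e_measured])
    return next(r.real for r in roots if abs(r.imag) < 1e-9 and 0.0 <= r.real <= 1.0)

x = indium_fraction(19.8)
print(f"Estimated In fraction: {x:.3f}")
print(f"Check: E_p({x:.3f}) = {plasmon_energy(x):.2f} eV")
```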
Abstract:
In recent years significant efforts have been devoted to the development of advanced data analysis tools both to predict the occurrence of disruptions and to investigate the operational spaces of devices, with the long-term goal of advancing the understanding of the physics of these events and of preparing for ITER. On JET, the latest generation of the disruption predictor, called APODIS, has been deployed in the real-time network during the latest campaigns with the new metallic wall. Even though it was trained only on discharges with the carbon wall, it has reached very good performance, with both missed alarms and false alarms of the order of a few percent (and strategies to improve the performance have already been identified). Since predicting the type of disruption is also considered very important for optimising the mitigation measures, a new clustering method, based on the geodesic distance on a probabilistic manifold, has been developed. This technique allows automatic classification of an incoming disruption with a success rate better than 85%. Various other manifold learning tools, particularly Principal Component Analysis and Self-Organising Maps, are also producing very interesting results in the comparative analysis of the JET and ASDEX Upgrade (AUG) operational spaces, on the route to developing predictors capable of extrapolating from one device to another.
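As a toy illustration of one ingredient mentioned above (and not of APODIS or the clustering method themselves), Principal Component Analysis can project a set of discharge feature vectors onto a low-dimensional operational space. The feature set and data below are invented.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)

# Hypothetical per-discharge features (e.g. plasma current, locked-mode amplitude,
# internal inductance, density): 200 discharges x 4 features, standardised.
features = rng.standard_normal((200, 4))
features = (features - features.mean(axis=0)) / features.std(axis=0)

# Project the operational space onto its first two principal components.
pca = PCA(n_components=2)
projected = pca.fit_transform(features)

print("Explained variance ratios:", np.round(pca.explained_variance_ratio_, 2))
print("First discharge in PC space:", np.round(projected[0], 2))
```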