43 results for exceedance probabilities
Abstract:
This paper presents a program written in Matlab-Octave for simulating the time evolution of student curricula, i.e., how students pass their subjects over time until graduation. From the simulations, the program computes the academic performance rates for the subjects of the study plan for each semester, as well as the overall rates, which are (a) the efficiency rate, defined as the ratio of the number of students passing the exam to the number of students who registered for it, and (b) the success rate, defined as the ratio of the number of students passing the exam to the number of students who not only registered for it but actually took it. Additionally, we compute the rates for the bachelor academic degree established for Spain by the National Quality Evaluation and Accreditation Agency (ANECA): the graduation rate (the percentage of students who finish as scheduled in the plan or taking one extra year) and the efficiency rate (the percentage of credits that a graduating student has actually taken). The simulation is driven by the probabilities of passing all the subjects in the study plan. The application of the simulator to Polytech students in Madrid, where requirements for passing are especially stiff in first- and second-year subjects, is particularly relevant for analyzing student cohorts and the probabilities of students finishing in the minimum of four years, taking one extra year, two extra years, and so forth. It is a very useful tool when designing new study plans. The probability distribution of the random variable "number of semesters a student takes to complete the curriculum and graduate" is difficult or even unfeasible to obtain analytically, all the more so when uncertainty in parameter estimation is incorporated. This is why we apply Monte Carlo simulation, which not only illustrates the stochastic process but also provides a method for computation. The stochastic simulator is proving to be a useful tool for identifying the subjects that most influence the distribution of the number of semesters to curriculum completion, and subsequently for decision making on curriculum planning and passing standards in the university. Simulations are performed through a graphical interface in which the results are also presented in appropriate figures. The project was funded by the Call for Innovation in Education Projects of Universidad Politécnica de Madrid (UPM) through a project of its school Escuela Técnica Superior de Ingenieros Industriales (ETSII) during the period September 2010–September 2011.
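A minimal sketch of the Monte Carlo idea described above, assuming independent per-attempt pass probabilities and a hypothetical toy study plan (the subject names and numbers are illustrative, not the paper's data):

```python
import random

# Hypothetical per-attempt pass probabilities for a toy study plan.
PASS_PROB = {"Algebra": 0.55, "Calculus": 0.50, "Physics": 0.60, "Programming": 0.75}

def semesters_to_graduate(max_semesters=20):
    """Simulate one student: every pending subject is attempted each
    semester until passed; return the semester count at graduation."""
    pending = set(PASS_PROB)
    for semester in range(1, max_semesters + 1):
        pending = {s for s in pending if random.random() >= PASS_PROB[s]}
        if not pending:
            return semester
    return max_semesters  # censored: did not finish within the horizon

# Monte Carlo estimate of the distribution of semesters to graduation.
runs = [semesters_to_graduate() for _ in range(10_000)]
for k in sorted(set(runs)):
    print(f"P(graduate in {k} semesters) = {runs.count(k) / len(runs):.3f}")
```

The actual simulator additionally models prerequisites, registration rules and per-semester scheduling, which this sketch omits.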
Abstract:
Intermittency is a continuous route from regular to chaotic behaviour, in which a signal alternates chaotic bursts with quasi-regular periods called laminar phases, driven by the so-called reinjection probability density function (RPD). This paper introduces a new technique to obtain the RPD for type-II and type-III intermittency. The new RPD is more general than the classical one and includes the classical RPD as a particular case. The probability distributions of the laminar lengths, the average laminar lengths and the characteristic relations are determined, with and without a lower bound on the reinjection, in agreement with numerical simulations. Finally, the effect of noise on intermittency is analyzed, and a method to obtain the noisy RPD is developed by extending the procedure used in the noiseless case. The analytical results show good agreement with numerical simulations.
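As an illustration of the laminar-length statistics mentioned above, a minimal sketch that detects laminar phases in a time series by thresholding; the signal, the threshold c and the detection rule are generic placeholders, not the paper's model:

```python
import numpy as np

def laminar_lengths(x, c=0.05):
    """Lengths of maximal runs with |x_n| < c, taken as laminar phases."""
    lengths, run = [], 0
    for v in x:
        if abs(v) < c:
            run += 1
        elif run:
            lengths.append(run)
            run = 0
    if run:
        lengths.append(run)
    return np.array(lengths)

# Toy signal standing in for an intermittent series.
rng = np.random.default_rng(0)
x = rng.normal(scale=0.2, size=100_000)
L = laminar_lengths(x)
print("average laminar length:", L.mean())
# A normalized histogram of L approximates the laminar length distribution.
```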
Abstract:
All activities of an organization involve risks that should be managed. The risk management process aids decision making by taking account of uncertainty and the possibility of future events or circumstances (intended or unintended) and their effects on agreed objectives. With that idea, a new ISO standard has been drawn up: the recently issued ISO 31010 provides a structured process that identifies how objectives may be affected, and analyses risk in terms of consequences and their probabilities before deciding whether further treatment is required. In this lecture, that ISO standard is adapted to open pit blasting operations, focusing on environmental effects, which can be managed properly. The technique used is Fault Tree Analysis (FTA), applied to all possible scenarios, providing blasting professionals with the tools to identify, analyze and manage environmental effects in blasting operations. The lecture can also help to minimize each effect through case-by-case study. This paper can also be useful to project managers and Occupational Health and Safety (OH&S) departments, because blasting operations can be evaluated and compared with one another to determine the risks that should be managed in different case studies. The environmental effects studied are ground vibrations, flyrock and air overpressure (airblast). Blasting operations are sometimes carried out near populated areas, where environmental effects may impose severe limitations on the use of explosives. In those cases, where these factors approach certain limits, national standards and regulations have to be applied.
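For readers unfamiliar with FTA, a minimal sketch of how a top-event probability combines through OR/AND gates, assuming independent basic events; the event names and probabilities below are invented for illustration, not taken from the lecture:

```python
def gate_or(*p):
    """P(at least one of several independent events occurs)."""
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def gate_and(*p):
    """P(all independent events occur)."""
    out = 1.0
    for pi in p:
        out *= pi
    return out

# Illustrative basic events for a flyrock scenario (made-up numbers).
p_poor_stemming, p_burden_error, p_no_exclusion_zone = 0.05, 0.02, 0.01
p_flyrock_incident = gate_and(gate_or(p_poor_stemming, p_burden_error),
                              p_no_exclusion_zone)
print(f"P(flyrock incident) = {p_flyrock_incident:.5f}")
```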
Abstract:
In the present work we report theoretical Stark widths and shifts calculated using the Griem semi-empirical approach for 237 spectral lines of Mg III. Data are presented for an electron density of 10¹⁷ cm⁻³ and temperatures T = 0.5–10.0 × 10⁴ K. The matrix elements used in these calculations have been determined from 23 configurations of Mg III: 2s²2p⁶, 2s²2p⁵3p, 2s²2p⁵4p, 2s²2p⁵4f and 2s²2p⁵5f for even parity, and 2s²2p⁵ns (n = 3–6), 2s²2p⁵nd (n = 3–9), 2s²2p⁵5g and 2s2p⁶np (n = 3–8) for odd parity. For the intermediate coupling (IC) calculations, we use the standard method of least-squares fitting to experimental energy levels by means of the Cowan computer code. In order to test the matrix elements used in our calculations, we also present calculated values of 70 transition probabilities of Mg III spectral lines and 14 calculated radiative lifetimes of Mg III levels. There is good agreement between our calculations and the experimental radiative lifetimes. Spectral lines of Mg III are relevant in astrophysics and also play an important role in the spectral analysis of laboratory plasmas. Theoretical trends of the Stark broadening parameters versus temperature are presented for the relevant lines. No previous values of these Stark parameters could be found in the literature.
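Because electron-impact Stark widths in this regime scale approximately linearly with electron density, values tabulated at the reference density of 10¹⁷ cm⁻³ can be rescaled to other plasma conditions. A minimal sketch, with made-up tabulated widths and simple linear interpolation in temperature standing in for the actual Griem calculation:

```python
import numpy as np

N_E_REF = 1e17  # cm^-3, reference electron density of the tables

# Hypothetical tabulated widths (in Å) versus T (in 10^4 K) for one line.
T_tab = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
w_tab = np.array([0.30, 0.25, 0.21, 0.17, 0.14])

def width_at(T, n_e):
    """Interpolate the width in T, then apply the linear density scaling."""
    return np.interp(T, T_tab, w_tab) * (n_e / N_E_REF)

print(width_at(3.0, 5e16))  # width at T = 3e4 K, n_e = 5e16 cm^-3
```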
Abstract:
In this paper, the fusion of probabilistic knowledge-based classification rules and learning automata theory is proposed, resulting in a set of probabilistic classification rules with self-learning capability. The probabilities of the classification rules change dynamically, guided by a supervised reinforcement process aimed at obtaining optimum classification accuracy. This novel classifier is applied to the automatic recognition of digital images of visual landmarks for the autonomous navigation of an unmanned aerial vehicle (UAV) developed by the authors. The classification accuracy of the proposed classifier, and its comparison with well-established pattern recognition methods, is finally reported.
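A minimal sketch of a linear reward-inaction style update, one standard reinforcement scheme in learning automata theory, applied to rule probabilities; the learning rate and rule set are illustrative, and the paper's exact scheme may differ:

```python
def reward_update(probs, chosen, lr=0.1):
    """Linear reward-inaction step: on a correct classification, shift
    probability mass toward the rule that fired; mass stays normalized."""
    return [p + lr * (1.0 - p) if i == chosen else p * (1.0 - lr)
            for i, p in enumerate(probs)]

probs = [0.25, 0.25, 0.25, 0.25]        # initial rule probabilities
probs = reward_update(probs, chosen=2)  # rule 2 classified correctly
print(probs, sum(probs))                # probabilities still sum to 1
```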
Abstract:
We present an approach to dynamically adapt the language models (LMs) used by a speech recognizer that is part of a spoken dialogue system. We have developed a grammar generation strategy that automatically adapts the LMs using the semantic information that the user provides (represented as dialogue concepts), together with information regarding the intentions of the speaker (inferred by the dialogue manager and represented as dialogue goals). We carry out the adaptation as a linear interpolation between a background LM and one or more of the LMs associated with the dialogue elements (concepts or goals) addressed by the user. The interpolation weights between those models are automatically estimated at each dialogue turn, using measures such as the posterior probabilities of concepts and goals, estimated as part of the inference procedure that determines the actions to be carried out. We propose two approaches to handling the LMs related to concepts and goals: in the first we estimate one LM for each of them, whereas in the second we apply several clustering strategies to group together those elements that share common properties, and estimate one LM per cluster. Our evaluation shows how the system can estimate a dynamic model adapted to each dialogue turn, which improves the performance of the speech recognition (up to 14.82% relative improvement) and, in turn, both the language understanding and the dialogue management tasks.
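The adaptation step described above amounts to a per-turn linear interpolation of LM probabilities. A minimal sketch with unigram dictionaries standing in for full LMs; the vocabulary, models and weights are illustrative, with the weights assumed to come from the concept/goal posteriors:

```python
def interpolate_lms(lms, weights):
    """P(w) = sum_i w_i * P_i(w), given per-turn interpolation weights."""
    assert abs(sum(weights) - 1.0) < 1e-9
    vocab = set().union(*lms)
    return {w: sum(wt * lm.get(w, 0.0) for wt, lm in zip(weights, lms))
            for w in vocab}

background = {"the": 0.5, "a": 0.3, "flight": 0.2}   # background LM
concept_lm = {"flight": 0.6, "madrid": 0.4}           # LM tied to one concept
# The weight 0.3 stands in for the posterior probability of the concept.
adapted = interpolate_lms([background, concept_lm], [0.7, 0.3])
print(adapted)
```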
Abstract:
Ulmus laevis Pallas is an endangered species in the Iberian Peninsula. In order to propose adequate management guidelines for its conservation, this PhD thesis advances the knowledge of the species' ecology in the region. First, the species' natural distribution was studied in relation to soil properties. Results show that U. minor Mill. has a higher root ferric reductase activity and proton extrusion capability than U. laevis, and maintains better nutrient homeostasis when grown under iron-limiting conditions. These differences in root Fe acquisition efficiency help to explain the distribution of the two species in the Iberian Peninsula, where U. laevis is restricted to acid or moderately acid soils, whereas U. minor can grow in both acid and basic soils. Second, the xylem anatomy and hydraulic traits of U. laevis were studied. These proved favourable for growth under high water availability, but highly susceptible to drought-stress cavitation; U. laevis is the Iberian elm most vulnerable to cavitation, so climate aridification and the loss of water tables pose a risk to its populations. Spatial genetic structure and diversity were evaluated in two of the largest U. laevis populations in Spain in order to assess their recovery capabilities. These populations maintain diversity levels similar to or slightly higher than European populations, despite having undergone a prolonged genetic bottleneck during the glaciations and recent population size reductions. At present, inbreeding does not represent a risk for these populations. Seed production, dispersal and predation were assessed in the Valdelatas elm grove (Madrid). Despite U. laevis samaras being winged nuts, the wind disperses them only short distances from the mother tree (<30 m), and seed shadow models show that non-mast years provide very few chances for the stand to regenerate, owing to their low full-seed flux. Empty samaras deceive pre- and post-dispersal predators, increasing full-seed survival probabilities, so empty fruit production may be an adaptive trait that increases plant fitness. Finally, human-induced changes in water-table levels and river regulation may negatively affect U. laevis seed dispersal and regeneration establishment. The long-term conservation and expansion of this species in the Iberian Peninsula requires the recovery of water tables and of natural hydrological regimes, since flooding removes pre-existing vegetation and deposits mud, creating the ideal conditions for seedling establishment.
Abstract:
Fruit damage during harvesting and handling is a long-standing problem, particularly for susceptible fruits such as peaches and apricots. The resulting mechanical damage is a combination of fruit properties and damage-inflicting effects of the procedures and the equipment. Nine packing lines in the region of Murcia (SE Spain) were tested with the aid of two different-sized IS-100 electronic fruits. The probabilities of impacts above three preset thresholds (50 g, 100 g and 150 g) were calculated for each transfer point. Fruit-packing line interaction tests were also performed in order to study the real incidence of packing lines on natural produce: apricots (1 variety), peaches (3 varieties), lemons (1 variety) and oranges (3 varieties). Bruising in handled and unhandled fruit samples was compared.
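The threshold statistics reported above are empirical exceedance probabilities. A minimal sketch of the per-transfer-point computation; the recorded peak accelerations below are made up for illustration:

```python
def exceedance_probs(impacts_g, thresholds=(50, 100, 150)):
    """Fraction of recorded impacts whose peak acceleration (in g)
    exceeds each preset threshold."""
    n = len(impacts_g)
    return {t: sum(a > t for a in impacts_g) / n for t in thresholds}

# Made-up peak accelerations recorded at one transfer point.
transfer_point = [32, 74, 121, 45, 160, 88, 53, 99, 140, 61]
print(exceedance_probs(transfer_point))
```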
Abstract:
During the last 40 years, decision support tools for plant species selection in ecological restoration in Spain have been based on species distribution models (also called ecological niche models), which estimate the probability of occurrence of a species as a function of environmental predictors (e.g., climate, soil). In this thesis some methodological improvements are proposed to contribute to better predictive performance of such models, given the data currently available in Spain and focusing on the application of the models to species selection for ecological restoration. Fine-grained species distribution data are required to train models at the scale of ecological restoration projects, but such data are not always available for every species, whereas coarse-grained data are available for almost every species in Spain. A recalibration method is proposed that updates a coarse-grained logistic regression model using a new fine-grained updating sample. The method yields acceptable predictive performance with reasonably small updating samples (25 occurrences of the species), in contrast with the much larger samples (more than 100 occurrences) required by a conventional modeling approach that discards the coarse-grained data. The choice of statistical method may have a dramatic effect on model performance, and comparisons of methods have therefore received much attention in the last decade. Previous studies reported poorer performance for logistic regression than for novel methods such as maximum entropy models. The results of this thesis show that the observed difference arises because maximum entropy models include regularization techniques while the versions of logistic regression compared did not: once regularization is added to logistic regression through a penalization procedure, the differences in performance disappear. Penalized logistic regression may therefore be considered on a par with the best-performing methods, such as maximum entropy, for modeling species distributions. Usually, species distribution models do not include soil-related predictors, because direct measurements of soil physical or chemical properties are often lacking. The inclusion of coarse-grained soil data from national or continental soil maps is a reasonable alternative: the results of this thesis suggest that the predictive performance of fine-grained species distribution models improves slightly, but statistically significantly, after including soil predictors from coarse-grained soil maps. Model validation is a key stage in the development of any empirical model, including species distribution models. The usual validation evaluates model performance species by species, i.e., comparing observed presences or absences with predicted probabilities over a set of sites. This kind of evaluation does not answer a key question in ecological restoration projects: which n species are the most suitable for the site to be restored? A validation method adapted to this question is proposed that estimates the ability of a set of models to discriminate between the species present at, and absent from, a given site. The method has been successfully applied to the validation of 188 distribution models of woody species aimed at supporting species selection for ecological restoration in Spain. The proposed methodological improvements enhance the predictive performance of species distribution models applied to species selection in ecological restoration, and also increase the number of species for which a decision-supporting model can be fitted.
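A minimal sketch of the penalized logistic regression baseline discussed above, using scikit-learn's L1-regularized solver on synthetic presence/absence data; the predictors, coefficients and data are placeholders, not the thesis dataset:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Synthetic environmental predictors (e.g., temperature, rainfall, soil pH).
X = rng.normal(size=(300, 3))
# Synthetic presence/absence driven mostly by the first two predictors.
logit = 1.2 * X[:, 0] - 0.8 * X[:, 1]
y = (logit + rng.logistic(size=300) > 0).astype(int)

# The L1 penalty plays the role of the regularization that closes the gap
# with maximum entropy methods; C controls the penalty strength.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
model.fit(X, y)
print("coefficients:", model.coef_)  # sparse under a strong penalty
print("P(presence) at a new site:",
      model.predict_proba([[0.5, -0.2, 0.1]])[0, 1])
```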
Abstract:
Expert knowledge is used to assign probabilities to events in many risk analysis models. However, experts sometimes find it hard to provide specific values for these probabilities, preferring to express vague or imprecise terms that are mapped onto a previously defined fuzzy number scale. The rigidity of these scales introduces bias into the probability elicitation process and does not allow experts to express their probabilistic judgments adequately. We present an interactive method for eliciting from experts a fuzzy number that represents their probabilistic judgment for a given event, along with a quality measure of the judgment, useful in a final information-filtering and sensitivity analysis process.
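A minimal sketch of representing an expert's imprecise probability judgment as a triangular fuzzy number, one common choice for such scales; the class, membership function and values are illustrative, not the paper's elicitation method:

```python
class TriangularFuzzyNumber:
    """Fuzzy probability judgment (a, m, b): support [a, b], mode m,
    assuming a < m < b within [0, 1]."""
    def __init__(self, a, m, b):
        assert 0 <= a < m < b <= 1
        self.a, self.m, self.b = a, m, b

    def membership(self, x):
        """Degree to which a crisp probability x matches the judgment."""
        if self.a < x <= self.m:
            return (x - self.a) / (self.m - self.a)
        if self.m < x < self.b:
            return (self.b - x) / (self.b - self.m)
        return 0.0

# "Fairly unlikely": an expert's judgment centered at 0.2.
judgment = TriangularFuzzyNumber(0.1, 0.2, 0.35)
print(judgment.membership(0.25))  # partial agreement with p = 0.25
```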
Abstract:
Purpose: A fully three-dimensional (3D), massively parallelizable list-mode ordered-subsets expectation-maximization (LM-OSEM) reconstruction algorithm has been developed for high-resolution PET cameras. System response probabilities are calculated online from a set of parameters derived from Monte Carlo simulations. The shape of the system response for a given line of response (LOR) has been shown to be asymmetrical around the LOR. This work focuses on the development of efficient region-search techniques to sample the system response probabilities, suitable for asymmetric kernel models, including elliptical Gaussian models that allow for high accuracy and high parallelization efficiency. The novel region-search scheme using variable kernel models is applied in the proposed PET reconstruction algorithm. Methods: A novel region-search technique is used to sample the probability density function over a small dynamic subset of the field of view that constitutes the region of response (ROR). The ROR is identified around the LOR by searching for any voxel within a dynamically calculated contour. The contour condition is currently defined as a fixed threshold over the posterior probability, and arbitrary kernel models can be applied using a numerical approach. The processing of the LORs is distributed in batches among the available computing devices; individual LORs are then processed within different processing units. In this way, both multicore and multiple many-core processing units can be exploited efficiently. Tests have been conducted with probability models that take into account noncollinearity, positron range and crystal penetration effects, producing tubes of response with varying elliptical sections whose axes are a function of the crystal thickness and the angle of incidence of the given LOR. The algorithm treats the probability model as a 3D scalar field defined within a reference system aligned with the ideal LOR. Results: The new technique provides superior image quality in terms of signal-to-noise ratio compared with the histogram-mode method based on precomputed system matrices available for a commercial small-animal scanner. Reconstruction times can be kept low with the use of multicore and many-core architectures, including multiple graphics processing units. Conclusions: A highly parallelizable LM reconstruction method has been proposed, based on Monte Carlo simulations and new parallelization techniques aimed at improving the reconstruction speed and image signal-to-noise ratio of a given OSEM algorithm. The method has been validated using simulated and real phantoms. A particular advantage of the new method is the possibility of dynamically defining the cut-off threshold over the calculated probabilities, allowing direct control of the trade-off between speed and quality during reconstruction.
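For reference, the update that a list-mode OSEM reconstruction iterates has the following standard form; this is a generic statement of the algorithm, not the paper's exact implementation, with $a_{ij}$ denoting the system response probability that the region-search scheme samples over the ROR:

```latex
% List-mode OSEM update for voxel j, over the subset S_m of detected events:
\lambda_j^{(m+1)} = \frac{\lambda_j^{(m)}}{s_j}
  \sum_{i \in S_m} \frac{a_{ij}}{\sum_k a_{ik}\,\lambda_k^{(m)}},
% where a_{ij} = P(detect event i | emission in voxel j) and
% s_j = \sum_i a_{ij} is the sensitivity of voxel j.
```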
Abstract:
This paper describes a theoretical model, based primarily on transaction costs, for comparing the various tendering mechanisms used for transportation public-private partnership (PPP) projects. In particular, the model contrasts negotiated procedures with the open procedure, as defined by current European Union legislation on public tendering. The model includes both ex ante transaction costs (borne during the tendering stage) and ex post transaction costs (such as enforcement costs, renegotiation costs, and costs arising from litigation between partners), explaining the trade-off between them. Generally speaking, the open procedure is assumed to imply lower ex ante transaction costs, while the negotiated procedure reduces the probability that contingencies not foreseen in the contract will appear, hence diminishing the expected value of ex post transaction costs. The balance between ex ante and ex post transaction costs is therefore the main criterion for deciding whether the open or the negotiated procedure is optimal. Notwithstanding, empirical evidence currently exists only for ex ante transaction costs in transportation infrastructure projects; this evidence shows a relevant difference between the two procedures in ex ante costs, favouring the open procedure. The model developed in this paper also demonstrates that a greater degree of contract complexity does not unequivocally favour the use of a negotiated procedure. Only for very innovative projects, where important dimensions of the quality of the asset or service are not verifiable, may we observe an advantage in favour of the negotiated procedure. The bottom line is that we find it difficult to justify the use of negotiated procedures in most transportation PPP contracts, especially in the field of roads. Nevertheless, the field remains open for future empirical work on the levels of transaction costs borne ex post in PPP contracts, as well as on the probabilities of such costs appearing under each of the procurement procedures.
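The trade-off described above can be summarized by a simple decision criterion; this is a schematic restatement of the model with hypothetical notation, not the paper's formal statement:

```latex
% Choose the negotiated procedure iff its total expected transaction
% cost is lower than that of the open procedure:
TC^{\mathrm{neg}}_{ex\,ante} + \mathbb{E}\left[TC^{\mathrm{neg}}_{ex\,post}\right]
  \;<\; TC^{\mathrm{open}}_{ex\,ante} + \mathbb{E}\left[TC^{\mathrm{open}}_{ex\,post}\right]
```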
Abstract:
In order to show the choice of transparency as the guiding principle of the accreditation process, this article evaluates its influence on the fundamental subprocess of self-evaluation, thereby confirming that transparency is an essential tool for the continuous improvement of academic processes and of educational quality management. Transparency fosters educational innovation and permits the sustainability of the continuous accreditation process over time, resulting in a greater probability of university self-regulation through the systemization of the process, with the objective of continuous improvement of university degree programs. The article analyzes the influence of transparency on each activity of the self-evaluation process according to the Peruvian accreditation model, prepared under the total quality approach, as a reference for other accreditation models, proposing concrete transparency actions and evaluating their influence on the stakeholder groups in the self-evaluation process, as well as on the efficiency and effectiveness of the process. It is concluded that transparency has a positive influence on the training of human capital and on the formation of the university's organizational culture, facilitating dissemination, understanding and involvement of the stakeholder groups in the continuous improvement of accreditation activities and increasing their acceptance of change and commitment to the process. It is confirmed that transparency contributes to increasing the efficiency index of the self-evaluation process by reducing operating costs through the adequate, accessible and timely contribution of information by the stakeholders and the optimization of the time spent gathering relevant information. In addition, it is concluded that transparency contributes to increasing the effectiveness index of self-evaluation by facilitating the achievement of its objectives through a synthetic, useful and reliable interpretation of the education situation and the formulation of feasible improvement plans based on the adequacy, relevance, visibility, pertinence and truthfulness of the information analyzed.
Abstract:
Risk awareness is very important when designing and building structures, above all in the presence of natural disasters. This thesis presents the considerations and recommendations to take into account when designing structures in tsunami-prone areas, after studying all the hazards and possible solutions through three different techniques, applying the LOGRO method (Líder de Organización de Gestión de Riesgos y Oportunidades, i.e., risk and opportunity management leadership). One might think that the only possible solution to this kind of disaster is not to build in vulnerable areas, but that is not feasible. It must be taken into account that not all tsunamis are of the same magnitude or intensity, and this work considers all of them, with different performance levels for each situation. Moreover, there are heavily urbanized coastal areas where it would be impossible to encourage relocation, and tourist areas where proximity to the coast is what matters most. The aim of the results obtained in this research is to reduce tsunami risk by minimizing both the probabilities of occurrence and the consequences.
Abstract:
Driven by the latest discoveries enabled by recent technological advances and space missions, the study of asteroids has awakened the interest of the scientific community, and asteroid missions have become very popular in recent years (Hayabusa, Dawn, OSIRIS-REx, ARM, AIM-DART, ...), motivated by their outstanding scientific interest. Asteroids are fundamental constituents in the evolution of the Solar System, can be seen as vast concentrations of valuable natural resources, and are also considered strategic targets for the future of space exploration. It has long been hypothesized that small near-Earth objects (NEOs) could be captured and delivered to the vicinity of the Earth, allowing affordable access to them for in-situ science, resource utilization and other purposes. On the other side of the balance, asteroids are often seen as potential planetary hazards, since impacts with the Earth happen all the time, and an asteroid large enough could eventually trigger catastrophic events. In spite of the severity of such occurrences, they are also extremely hard to predict: the rich dynamical behaviour of asteroids, their complex modeling and observational uncertainties make it exceptionally challenging to predict their future positions accurately enough. This becomes particularly relevant when asteroids experience close encounters with the Earth, and more so when these happen recurrently. In such situations, where mitigation measures may need to be taken, it is of paramount importance to be able to estimate their trajectories and collision probabilities accurately. As a consequence, advanced tools are needed to model their dynamics and predict their orbits precisely, as well as new technological concepts to manipulate their orbits should it become necessary. The goal of this thesis is to provide new methods, techniques and solutions to address these challenges. The contributions fall into two areas: one devoted to the numerical propagation of asteroids, and another to asteroid deflection and capture concepts. Hence, the first part of the dissertation presents novel advances applicable to the high-accuracy dynamical propagation of near-Earth asteroids using regularization and perturbation techniques, with special emphasis on the DROMO method, whereas the second part presents pioneering ideas for asteroid retrieval missions and discusses the use of an "ion beam shepherd" (IBS) for asteroid deflection purposes.