901 results for Surveillance and Reconnaissance


Relevance:

80.00%

Publisher:

Abstract:

Modern embedded applications typically integrate a multitude of functionalities with potentially different criticality levels into a single system. Without appropriate preconditions, the integration of mixed-criticality subsystems can lead to a significant and potentially unacceptable increase of engineering and certification costs. A promising solution is to incorporate mechanisms that establish multiple partitions with strict temporal and spatial separation between the individual partitions. In this approach, subsystems with different levels of criticality can be placed in different partitions and can be verified and validated in isolation. The MultiPARTES FP7 project aims at supporting mixed-criticality integration for embedded systems based on virtualization techniques for heterogeneous multicore processors. A major outcome of the project is the MultiPARTES XtratuM, an open source hypervisor designed as a generic virtualization layer for heterogeneous multicore platforms. MultiPARTES evaluates the developed technology through selected use cases from the offshore wind power, space, visual surveillance, and automotive domains. The impact of MultiPARTES on the targeted domains will also be discussed. In a number of ongoing research initiatives (e.g., RECOMP, ARAMIS, MultiPARTES, CERTAINTY), mixed-criticality integration is considered on multicore processors. Key challenges are the combination of software virtualization and hardware segregation and the extension of partitioning mechanisms to jointly address significant non-functional requirements (e.g., time, energy and power budgets, adaptivity, reliability, safety, security, volume, weight, etc.) along with development and certification methodology.
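The temporal separation mentioned above is typically obtained with a static cyclic schedule in which every partition owns fixed time windows inside a repeating major frame, so an overrunning low-criticality partition cannot consume the budget of a high-criticality one. The sketch below only illustrates this idea; the partition names, durations and scheduling loop are invented and do not represent the XtratuM configuration format or API.

from dataclasses import dataclass

@dataclass
class Partition:
    name: str          # subsystem hosted in this partition
    criticality: str   # e.g. "high" (safety-relevant) or "low"
    window_ms: int     # fixed execution window inside the major frame

# Hypothetical mixed-criticality schedule: names and durations are invented.
MAJOR_FRAME = [
    Partition("control_application", "high", 5),
    Partition("video_surveillance", "low", 3),
    Partition("logging", "low", 2),
]

def run_major_frame(frame):
    """Dispatch each partition for exactly its pre-assigned window.

    The budget is enforced by the scheduler, not by the partition itself,
    which is what provides temporal separation between criticality levels.
    """
    t = 0
    for p in frame:
        print(f"t={t:3d} ms: partition '{p.name}' ({p.criticality}) "
              f"runs for {p.window_ms} ms")
        t += p.window_ms
    return t  # length of the major frame

if __name__ == "__main__":
    period = run_major_frame(MAJOR_FRAME)
    print(f"major frame repeats every {period} ms")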

Relevance:

80.00%

Publisher:

Abstract:

Using the Bayesian approach as the model selection criterion, the main purpose of this study is to establish a practical road accident model that provides better interpretation and prediction performance. For this purpose we use a structural explanatory model with an autoregressive error term. The model estimation is carried out through Bayesian inference, and the best model is selected based on goodness-of-fit measures. To cross-validate the model estimation, a further prediction analysis was performed. As the road safety measure, the number of fatal accidents in Spain during 2000-2011 was employed. The results of the variable selection process show that the factors explaining fatal road accidents are mainly exposure, economic factors, and surveillance and legislative measures. The model selection shows that the impact of economic factors on fatal accidents during the period under study has been higher than that of surveillance and legislative measures.
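The "structural explanatory model with an autoregressive error term" mentioned above can be written generically as follows; the concrete covariates and lag order used in the study are not specified here, so this is only an illustrative form:

y_t = \beta_0 + \sum_{k=1}^{K} \beta_k x_{k,t} + u_t, \qquad u_t = \rho\, u_{t-1} + \varepsilon_t, \qquad \varepsilon_t \sim \mathcal{N}(0, \sigma^2),

where y_t is the annual count of fatal accidents, the x_{k,t} are the explanatory factors (exposure, economic variables, surveillance and legislative measures), Bayesian inference places priors on (\beta, \rho, \sigma^2), and candidate models are compared through goodness-of-fit measures.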

Relevance:

80.00%

Publisher:

Abstract:

This thesis addresses methodologies for computing the collision risk of satellites. Minimising the collision risk must be approached from two distinct points of view. From the operational point of view, it is necessary to filter, among all the objects sharing space with an operational satellite, those that may present an encounter. Since the orbits of the operational object and of the object involved in the collision are not perfectly known, the encounter geometry and the collision risk must be evaluated. Depending on that geometry or risk, an avoidance manoeuvre may be necessary to prevent the collision. Such manoeuvres involve a fuel consumption that affects the orbit maintenance capability and therefore the useful life of the satellite. Consequently, the fuel needed over the lifetime of a satellite must be estimated at the mission design phase for a correct definition of its useful life, especially for satellites orbiting in heavily populated orbital regimes. Both aspects, mission design and operational aspects related to collision risk, are addressed in this thesis and are summarised in Figure 3. Regarding the mission design aspects (lower part of the figure), it is necessary to evaluate statistically the characteristics of the space object population and the theories that allow computing the mean number of events encountered by a mission and its capability to reduce collision risk. These two aspects define the most appropriate procedures to reduce the collision risk in the operational phase. This aspect is addressed starting from the theory described in [Sánchez-Ortiz, 2006]T.14 and implemented by the author of this thesis in the ARES tool [Sánchez-Ortiz, 2004b]T.15, provided by ESA for the evaluation of collision avoidance strategies. This theory is extended in this thesis to consider the characteristics of the orbital data available in the operational phases of a satellite (section 4.3.3). In addition, the theory has been extended to consider the maximum collision risk when the uncertainty of the orbits of catalogued objects is not known (as is the case for TLE data), and the case in which only catastrophic collision risk is of interest (section 4.3.2.3). These improvements have been included in the new version of ARES [Domínguez-González and Sánchez-Ortiz, 2012b]T.12, made available through [SDUP, 2014]R.60. In the operational phase, the catalogues providing orbital data of space objects are routinely processed to identify possible encounters, which are analysed with collision risk computation algorithms in order to propose avoidance manoeuvres. Currently there is a single source of public data, the TLE (Two Line Elements) catalogue. In addition, the American Joint Space Operation Center (JSpOC) provides conjunction alert messages (CSM) when the American surveillance system identifies a possible encounter. Depending on the data used in the operational phase (TLE or CSM), the avoidance strategy may differ owing to the characteristics of that information. The main characteristics of the available data (regarding the accuracy of the orbital information) must be known in order to estimate the possible collision events encountered by a satellite over its useful life.
In the case of TLE data, whose orbital accuracy is not provided, accuracy information derived from a statistical analysis can also be used in the operational process as well as in mission design. When CSM data are used as the basis of collision avoidance operations, the orbital accuracy of the two objects involved is known. These characteristics have been analysed in detail, statistically evaluating the properties of both types of data. Once that analysis was concluded, the impact of using TLE or CSM data on satellite operations was analysed (section 5.1). This analysis has been published in a peer-reviewed journal [Sánchez-Ortiz, 2015b]T.3. It provides recommendations for different missions (satellite size and orbital regime) regarding the collision avoidance strategies required to reduce the collision risk significantly. For example, for a satellite in a Sun-synchronous orbit in the LEO regime, the typical and widely used ACPL (Accepted Collision Probability Level) value is 10⁻⁴. This value is not adequate when collision avoidance schemes are based on TLE data: in that case the risk reduction capability is practically null (owing to the large uncertainties of TLE data), even for short prediction times. To achieve a significant risk reduction, an ACPL of around 10⁻⁶ or lower would be needed, producing about 10 alarms per year per satellite (for one-day predictions) or 100 alarms per year (for three-day predictions). The main conclusion is therefore the unsuitability of TLE data for computing collision events. In contrast, using CSM data, thanks to their better orbital accuracy, a significant risk reduction can be obtained with an ACPL of around 10⁻⁴ (considering 3-day predictions). Even 5-day predictions can be considered with an ACPL of around 10⁻⁵. Longer prediction times can also be used (7 days), with a 90% risk reduction and about 5 alarms per year (with 5-day predictions, the number of manoeuvres remains at about 2 per year). The dynamics in GEO differ from the LEO case, leading to a slower growth of the orbital uncertainties with propagation time. In contrast, the uncertainties derived from orbit determination are worse than in LEO owing to the differences in observation capabilities between the two orbital regimes. In addition, the prediction times considered for LEO may not be appropriate for a GEO satellite (since its orbital period is longer). In this case, using TLE data, a significant risk reduction is only achieved with small ACPL values, producing one alarm per year when collision events are predicted one day in advance (too short a time to implement avoidance manoeuvres). More suitable ACPL values lie between 5·10⁻⁸ and 10⁻⁷, well below the values used in the current operations of most GEO missions (again, basing collision avoidance strategies on TLE data is not recommended in this orbital regime). CSM data allow an appropriate risk reduction with ACPL between 10⁻⁵ and 10⁻⁴ for short and medium prediction times (10⁻⁵ is recommended for 5- or 7-day predictions). The number of manoeuvres performed would be about one in 10 years of mission.
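For context, the ACPL values quoted above are thresholds on a collision probability that, in the usual short-term (high relative velocity) encounter model, is obtained by integrating the relative-position error distribution over the combined hard-body circle in the encounter plane (B-plane). The expression below is the generic textbook form, not necessarily the exact formulation used in this thesis:

P_c = \frac{1}{2\pi\sqrt{\det C}} \iint_{x^2 + y^2 \le R^2} \exp\!\left( -\tfrac{1}{2} (\mathbf{r} - \boldsymbol{\mu})^{\mathsf{T}} C^{-1} (\mathbf{r} - \boldsymbol{\mu}) \right) dx\, dy,

where R is the combined radius of the two objects, \boldsymbol{\mu} the predicted miss vector and C the combined position covariance, both projected onto the B-plane; an avoidance manoeuvre is considered when P_c exceeds the chosen ACPL.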
Note that these computations were performed for a satellite of about 2 m radius. In the future, other space surveillance systems (such as ESA's SSA programme) will provide additional catalogues of space objects with the aim of reducing the collision risk of satellites. To define such surveillance systems, it is necessary to identify the catalogue performances required as a function of the intended risk reduction. The catalogue characteristics that mainly affect this capability are the coverage (number of objects included in the catalogue, limited mainly by the minimum object size resulting from the limitations of the sensors used) and the accuracy of the orbital data (derived from the sensor performance in terms of measurement accuracy and the capability to re-observe the objects). The result of this analysis (section 5.2) has been published in a peer-reviewed journal [Sánchez-Ortiz, 2015a]T.2. This analysis was not initially foreseen in the thesis, and it shows how the theory described here, initially defined to support mission design (upper part of Figure 1), has been extended and can be applied for other purposes, such as the dimensioning of a space surveillance system (lower part of Figure 1). The main difference between the two analyses is that the cataloguing capabilities (accuracy and size of the observed objects) are treated as variables to be modified in the design of a surveillance system, whereas they are fixed in the case of a mission design. Regarding the outputs of the analysis, all the quantities computed in a statistical collision risk analysis are relevant for mission design (with the aim of determining the avoidance strategy and the amount of fuel to be used), whereas for the design of a surveillance system the most important aspects are the number of manoeuvres and false alarms (reliability of the system) and the risk reduction capability (effectiveness of the system). In addition, a space surveillance system should be characterised by its capability to avoid catastrophic collisions (thus preventing a dramatic growth of the space debris population), whereas the design of a mission must consider all types of encounters, since an operator is interested in avoiding both catastrophic and lethal collisions. From the analysis of the performances required of a space surveillance system (size of the objects to be catalogued and orbital accuracy), it is concluded that both aspects must be set differently for the different orbital regimes. In LEO it is necessary to observe objects down to 5 cm radius, whereas in GEO this requirement is relaxed to 100 cm in order to cover catastrophic collisions. The main reason for this difference lies in the different relative velocities between objects in the two orbital regimes. Regarding orbital accuracy, it must be very good in LEO in order to reduce the number of false alarms, whereas medium accuracies can be considered in higher orbital regimes. Regarding the operational aspects of collision risk determination, several algorithms exist for computing the risk between two space objects.
Figure 2 summarises the collision risk algorithm cases and how they are addressed in this thesis. Spherical objects are usually assumed in order to simplify the risk computation (case A). This case is widely covered in the literature and is not analysed in detail in this thesis; an example case is provided in section 4.2. Considering the real shape of the objects (case B) allows the risk to be computed more accurately. A new algorithm is defined in this thesis to compute the collision risk when at least one of the objects is considered complex (section 4.4.2). This algorithm allows the collision risk to be computed for objects formed by a set of boxes, and it has been presented at several international conferences. To evaluate its performance, its results have been compared with a Monte Carlo analysis defined to handle collisions between boxes adequately (section 4.1.2.3), since the simple collision checks applicable to spherical objects are not applicable to this case. This Monte Carlo analysis is taken as the truth when assessing the results of the algorithm; the comparison is presented in section 4.4.4. For satellites that cannot be considered spherical, the use of a model of the satellite geometry makes it possible to discard events that are not real collisions, or to estimate more precisely the risk associated with an event. The use of these algorithms with complex geometries is more relevant for objects of large dimensions, given the current orbital accuracy performances. In the future, if surveillance systems improve and orbits become known more precisely, considering the real geometry of satellites will become increasingly relevant. Section 5.4 presents an example for a large system (a satellite with a tether). Additionally, if the two objects involved in the collision have low relative velocity (and simple geometry, case C in Figure 2), most algorithms are not applicable, requiring dedicated implementations for this particular case. In this thesis, one such algorithm from the literature [Patera, 2001]R.26 has been analysed to determine its suitability for different types of events (section 4.5). The evaluation against a Monte Carlo analysis is provided in section 4.5.2. After this analysis, it has been deemed adequate for handling low-velocity collisions. In particular, it has been concluded that the use of dedicated low-velocity algorithms is necessary depending on the size of the collision volume projected onto the encounter plane (B-plane) and on the size of the uncertainty associated with the relative position vector between the two objects. For large uncertainties these algorithms become more necessary, since the duration of the interval in which the error ellipsoids of the two objects can intersect is longer. This algorithm has been tested in combination with the collision algorithm for objects with complex geometries. The result of that analysis shows that the algorithm can easily be extended to consider different types of collision risk computation algorithms (section 4.5.3).
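To illustrate the kind of Monte Carlo reference computation described above for non-spherical objects, the following is a minimal sketch (it is not the algorithm developed in the thesis nor the CORAM implementation): the relative position of a spherical object with respect to a single axis-aligned box is sampled from a Gaussian distribution, and the fraction of samples for which the sphere overlaps the box estimates the collision probability. All numerical values are invented for illustration.

import numpy as np

rng = np.random.default_rng(0)

def mc_collision_probability(mu, cov, box_half_sizes, sphere_radius, n=200_000):
    """Toy Monte Carlo estimate of the collision probability between a
    box-shaped object (axis-aligned, centred at the origin) and a spherical
    object whose relative position is Gaussian-distributed.

    mu, cov        : mean and covariance of the relative position (3-vector, 3x3)
    box_half_sizes : half-lengths of the box along x, y, z
    sphere_radius  : radius of the second object
    """
    samples = rng.multivariate_normal(mu, cov, size=n)
    # Closest point of the box to each sampled sphere centre: clamp the
    # sample onto the box, then measure the residual distance.
    closest = np.clip(samples, -box_half_sizes, box_half_sizes)
    dist = np.linalg.norm(samples - closest, axis=1)
    return float(np.mean(dist <= sphere_radius))

if __name__ == "__main__":
    p = mc_collision_probability(
        mu=np.array([5.0, 0.0, 0.0]),             # metres, invented values
        cov=np.diag([25.0, 25.0, 4.0]),            # metres^2, invented values
        box_half_sizes=np.array([2.0, 1.0, 1.0]),  # a 4 m x 2 m x 2 m box
        sphere_radius=0.5,
    )
    print(f"estimated collision probability: {p:.4f}")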
Both algorithms, together with the Monte Carlo method for complex geometries, have been implemented in the ESA operational tool CORAM, which is used to evaluate collision risk in the routine activities of ESA-operated satellites [Sánchez-Ortiz, 2013a]T.11. This fact shows the interest and relevance of the developed algorithms for improving satellite operations. These algorithms have been presented at several international conferences [Sánchez-Ortiz, 2013b]T.9, [Pulido, 2014]T.7, [Grande-Olalla, 2013]T.10, [Pulido, 2014]T.5, [Sánchez-Ortiz, 2015c]T.1.

ABSTRACT

This document addresses methodologies for computing the collision risk of a satellite. Two different approaches need to be considered for collision risk minimisation. On an operational basis, it is necessary to sieve out, among all objects sharing space with an operational satellite, those that may approach it. As the orbits of both the satellite and the eventual collider are not perfectly known but only estimated, the encounter geometry and the actual risk of collision must be evaluated. On the basis of the encounter geometry or the risk, a manoeuvre may be required to avoid the conjunction. Such manoeuvres consume fuel otherwise available for mission orbit maintenance and may thus reduce the satellite's operational lifetime. The avoidance manoeuvre fuel budget must therefore be estimated at the mission design phase for a better estimation of mission lifetime, especially for satellites orbiting in very populated orbital regimes. These two aspects, mission design and operational collision risk aspects, are summarised in Figure 3 and covered throughout this thesis. The bottom part of the figure identifies the aspects to be considered in the mission design phase (statistical characterisation of the space object population and the theory for computing the mean number of events and the risk reduction capability), which define the most appropriate collision avoidance approach for the mission operational phase. This part of the work starts from the theory described in [Sánchez-Ortiz, 2006]T.14 and implemented by this author in the ARES tool [Sánchez-Ortiz, 2004b]T.15, provided by ESA for the evaluation of collision avoidance approaches. This methodology has now been extended to account for the particular features of the data sets available in the operational environment (section 4.3.3). Additionally, the formulation has been extended to allow evaluating maximum-risk computation approaches when orbital uncertainty is not available (as in the TLE case) and when only catastrophic collisions are of interest (section 4.3.2.3). These improvements to the theory have been included in the new version of the ESA ARES tool [Domínguez-González and Sánchez-Ortiz, 2012b]T.12, available through [SDUP, 2014]R.60. In the operational phase, the actual catalogue data are processed on a routine basis with adequate collision risk computation algorithms to propose a conjunction avoidance manoeuvre optimised for every event; the optimisation of manoeuvres on an operational basis is not addressed in this document. Currently, the American Two Line Elements (TLE) catalogue is the only public source of data providing orbits of objects in space with which to identify eventual conjunction events. Additionally, Conjunction Summary Messages (CSM) are provided by the Joint Space Operation Center (JSpOC) when the American system identifies a possible collision between satellites and debris.
Depending on the data used for collision avoidance evaluation, the conjunction avoidance approach may differ. The main features of the currently available data need to be analysed (with regard to accuracy) in order to estimate the encounters likely to be found along the mission lifetime. In the case of TLE data, which are not provided with accuracy information, operational collision avoidance may also be based on statistical accuracy information such as that used in the mission design approach. This is not the case for CSM data, which include the state vectors and orbital accuracy of the two involved objects. This aspect has been analysed in detail and is described in this document, evaluating in a statistical way the characteristics of both data sets with regard to the main aspects related to collision avoidance. Once the analysis of the data sets was completed, the impact of those features on the most convenient avoidance approaches was investigated (section 5.1). This analysis is published in a peer-reviewed journal [Sánchez-Ortiz, 2015b]T.3. The analysis provides recommendations for different mission types (satellite size and orbital regime) regarding the most appropriate collision avoidance approach for relevant risk reduction. The risk reduction capability depends strongly on the accuracy of the catalogue used to identify eventual collisions. Approaches based on CSM data are recommended over the TLE-based approach. Some approaches based on the maximum risk associated with envisaged encounters are shown to report a very large number of events, which makes them unsuitable for operational activities. Accepted Collision Probability Levels (ACPL) are recommended for the definition of avoidance strategies for different mission types. For example, for a LEO satellite in the Sun-synchronous regime, the typically used ACPL value of 10⁻⁴ is not suitable for collision avoidance schemes based on TLE data. In this case the risk reduction capacity is almost null (owing to the large uncertainties associated with TLE data sets), even for short time-to-event values. For a significant reduction of risk when using TLE data, an ACPL on the order of 10⁻⁶ (or lower) seems to be required, producing about 10 warnings per year and mission (if one-day-ahead events are considered) or 100 warnings per year (for three-day-ahead estimations). Thus, the main conclusion from these results is the lack of feasibility of TLE data for a proper collision avoidance approach. On the contrary, for CSM data, and owing to the better accuracy of the orbital information compared with TLE, an ACPL on the order of 10⁻⁴ allows the risk to be reduced significantly. This is true for events estimated up to 3 days ahead. Even events 5 days ahead can be considered, but ACPL values down to 10⁻⁵ should then be used. Longer prediction times can also be considered (7 days) for a risk reduction of about 90%, at the cost of a larger number of warnings, up to 5 events per year, whereas a 5-day prediction keeps the manoeuvre rate at about 2 manoeuvres per year. The dynamics of GEO orbits differ from those in LEO, resulting in a slower growth of orbit uncertainty over time. On the contrary, uncertainties at short prediction times in this orbital regime are larger than in LEO owing to the differences in observation capabilities.
Additionally, it must be taken into account that the short prediction times feasible in LEO may not be appropriate for a GEO mission, as the orbital period is much longer in this regime. In the case of TLE data sets, a significant reduction of risk is only achieved for small ACPL values, producing about one warning event per year if warnings are raised one day in advance of the event (too short for any reaction to be considered). Suitable ACPL values would lie between 5·10⁻⁸ and 10⁻⁷, well below the values used in current operations for most GEO missions (TLE-based collision avoidance strategies at this regime are not recommended). On the contrary, CSM data allow a good reduction of risk with ACPL between 10⁻⁵ and 10⁻⁴ for short and medium prediction times; 10⁻⁵ is recommended for prediction times of five or seven days. The number of events raised for a suitable warning time of seven days would be about one in a 10-year mission. It must be noted that these results correspond to a 2 m radius spacecraft; the impact of satellite size is also analysed within the thesis. In the future, other Space Situational Awareness systems (such as the ESA SSA programme) may provide additional catalogues of objects in space with the aim of reducing the risk. It is necessary to investigate the performances required of those catalogues to allow such risk reduction. The main performance aspects are the coverage (objects included in the catalogue, mainly limited by a minimum object size derived from sensor performances) and the accuracy of the orbital data needed to evaluate the conjunctions reliably (derived from sensor performance in terms of object observation frequency and measurement accuracy). The results of these investigations (section 5.2) are published in a peer-reviewed journal [Sánchez-Ortiz, 2015a]T.2. This aspect was not initially foreseen as an objective of the thesis, but it shows how the theory described here, initially defined for mission design with regard to avoidance manoeuvre fuel allocation (upper part of figure 1), has been extended and serves additional purposes such as dimensioning a Space Surveillance and Tracking (SST) system (bottom part of the figure). The main difference between the two approaches is that the catalogue features are fixed parameters of the theory in the satellite mission design case, whereas they become inputs to be varied in the SST design case. Regarding the outputs, all the quantities computed by the statistical conjunction analysis are of importance for mission design (with the objective of a proper global avoidance strategy definition and fuel allocation), whereas for SST design the most relevant aspects are the manoeuvre and false alarm rates (defining a reliable system) and the risk reduction capability (driving the effectiveness of the system). Regarding the methodology for computing the risk, the SST system shall be driven by its capacity to provide the means to avoid catastrophic conjunction events (avoiding a dramatic increase of the debris population), whereas satellite mission design should consider all types of encounters, as the operator is interested in avoiding both lethal and catastrophic collisions. From the analysis of the SST features (object coverage and orbital uncertainty) required for a reliable system, it is concluded that these two characteristics must be imposed differently for the different orbital regimes, as the population level differs depending on the orbit type.
Coverage values range from 5 cm for the very populated LEO regime up to 100 cm for the GEO region. The difference in this requirement derives mainly from the relative velocity of the encounters in those regimes. Regarding the orbital knowledge of the catalogues, very accurate information is required for objects in the LEO region in order to limit the number of false alarms, whereas intermediate orbital accuracy can be considered for higher orbital regimes. Regarding operational collision avoidance approaches, several algorithms are used to evaluate the collision risk between a pair of objects. Figure 2 provides a summary of the different collision risk algorithm cases and indicates how they are covered in this document. The typical case with high relative velocity is well covered in the literature for spherical objects (case A), with a large number of available algorithms that are not analysed in detail in this work; only a sample case is provided in section 4.2. If complex geometries are considered (case B), a more realistic risk evaluation can be computed. A new approach for the evaluation of risk in the case of complex geometries is presented in this thesis (section 4.4.2) and has been presented at several international conferences. The developed algorithm allows the risk to be evaluated for complex objects formed by a set of boxes. A dedicated Monte Carlo method has also been described (section 4.1.2.3) and implemented to allow the evaluation of the actual collisions over a large number of simulation shots. These Monte Carlo runs are considered the truth for comparison with the algorithm results (section 4.4.4). For spacecraft that cannot be considered as spheres, taking the real geometry of the objects into account makes it possible to discard events that are not real conjunctions, or to estimate with greater reliability the risk associated with the event. This is of particular importance for large spacecraft, since the position uncertainty of current catalogues is not small enough to make a difference for objects below metre size. As tracking systems improve and the orbits of catalogued objects become known more precisely, the importance of considering the actual shapes of the objects will grow. The particular case of a very large system (such as a tethered satellite) is analysed in section 5.4. Additionally, if the two colliding objects have low relative velocity (and simple geometries, case C in Figure 2), the most common collision risk algorithms fail and adequate theories need to be applied. In this document, a low-relative-velocity algorithm presented in the literature [Patera, 2001]R.26 is described and evaluated (section 4.5). Evaluation through comparison with a Monte Carlo approach is provided in section 4.5.2. The main conclusion of this analysis is the suitability of this algorithm for the most common encounter characteristics, and thus it is selected as adequate for collision risk estimation. Its performance is evaluated in order to characterise when it can be safely used over a large variety of encounter characteristics. In particular, it is found that the need for dedicated algorithms depends on both the size of the collision volume in the B-plane and the miss-distance uncertainty. For large uncertainties these algorithms become more relevant, since for small uncertainties the encounter duration during which the covariance ellipsoids intersect is shorter.
Additionally, its application to the case of complex satellite geometries is assessed (case D in Figure 2) by integrating the algorithm developed in this thesis with Patera's formulation for low relative velocity encounters. The results of this analysis show that the algorithm can easily be extended to a collision risk estimation process suitable for complex-geometry objects (section 4.5.3). The two algorithms, together with the Monte Carlo method, have been implemented in the ESA operational tool CORAM, which is used for the evaluation of collision risk of ESA-operated missions [Sánchez-Ortiz, 2013a]T.11. This fact shows the interest and relevance of the developed algorithms for the improvement of satellite operations. The algorithms have been presented at several international conferences [Sánchez-Ortiz, 2013b]T.9, [Pulido, 2014]T.7, [Grande-Olalla, 2013]T.10, [Pulido, 2014]T.5, [Sánchez-Ortiz, 2015c]T.1.

Relevance:

80.00%

Publisher:

Abstract:

INTRODUCTION: Visceral leishmaniasis (VL) is a neglected disease that affects millions of people worldwide and constitutes a serious public health problem. OBJECTIVES: To describe, in time and space, the dispersal of Lutzomyia longipalpis and the expansion of VL in the state of São Paulo (SP), and to identify factors associated with these processes. METHODS: Descriptive, ecological and survival analysis studies were carried out. Information on the vector and on the cases was obtained from the Superintendência de Controle de Endemias and from the Sistema de Informações de Agravos de Notificação for the period 1997 to 2014. The study area comprised the 645 municipalities of SP. Thematic and flow maps were produced, and VL incidence, mortality and case fatality in humans were calculated. Survival analysis techniques (Kaplan-Meier curves and Cox regression) were used to identify factors associated with the dispersal of the vector and the expansion of VL. RESULTS: Following the detection of Lu. longipalpis in Araçatuba in 1997, the first autochthonous canine case (CVL) (1998) and the first autochthonous human case (HVL) (1999) occurred in SP. By 2014, the presence of the vector had been detected in 173 (26.8%) municipalities, CVL in 108 (16.7%) and HVL in 84 (13%). The expansion of the three phenomena occurred from northwest to southeast and proceeded at constant speeds. In the São José do Rio Preto region, the dispersal of the vector occurred through neighbourhood with previously infested municipalities, the expansion of VL was related to the municipalities serving as microregion seats, and the disease occurred with greater intensity in the peripheral areas of the municipalities. The presence of the Marechal Rondon highway and the border with Mato Grosso do Sul were factors associated with the occurrence of the three events, as were the presence of the Euclides da Cunha highway for the presence of the vector and canine cases, and the presence of prisons for human cases. CONCLUSIONS: The dispersal of the vector and of VL in SP began in 1997 near the border with the state of Mato Grosso do Sul, advanced from northwest to southeast along the route of the Marechal Rondon highway, and occurred in arithmetic progression, with the microregion seats of SP playing a leading role in this process. The autochthonous occurrence of CVL and HVL followed the detection of Lu. longipalpis in Araçatuba and its spread across SP, rather than starting from the places where it was already present. The use of survival analysis made it possible to identify factors associated with the dispersal of the vector and the expansion of VL. The results of this study may be useful for improving VL surveillance and control activities, in order to delay its expansion and/or mitigate its effects when it occurs.
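For reference, the survival analysis tools named in the methods have the following standard forms; in this setting the units at risk would be the municipalities and the "event" their first vector detection or first autochthonous VL case (an interpretation inferred from the abstract, not stated explicitly):

\hat{S}(t) = \prod_{t_i \le t} \left( 1 - \frac{d_i}{n_i} \right) \quad \text{(Kaplan-Meier estimator)}

h(t \mid \mathbf{x}) = h_0(t)\, \exp(\beta_1 x_1 + \dots + \beta_p x_p) \quad \text{(Cox proportional hazards model)}

where d_i is the number of units experiencing the event at time t_i, n_i the number still at risk, and each hazard ratio \exp(\beta_k) quantifies how strongly a factor such as proximity to the Marechal Rondon highway or to the Mato Grosso do Sul border is associated with earlier occurrence.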

Relevance:

80.00%

Publisher:

Abstract:

This research seeks to highlight an understanding of space grounded in the link between the body and the complex, unstable environment it inhabits. On the one hand, the body, with its relations, actions and affects, stands as the key tool for understanding the dynamics of contemporary spatial production; on the other, the inclusion of the human scale makes grand narratives fade in favour of a series of sensitive bonds that re-humanise architecture and attend to the sensations of the individual. Thus a narrative emerges that places the body as protagonist and in which its relation to spatiality fluctuates between submission, violence and harmony with other bodies and with space. As a case study, the most abject and violent of constructions has been taken: the Auschwitz concentration camp, in an attempt to re-think it from the affects of the individual and to problematise contemporary relations and actions which, according to the philosopher Michel Foucault, drift towards paths of invisible control and violence. This task draws on a multiplicity of cartographies as an instrument of knowledge to make architectural narratives visible through the body, with the aim both of analysing situations that have occurred and of acting upon new spatial opportunities.

Relevance:

80.00%

Publisher:

Abstract:

Dune landscapes are highly dynamic systems, owing to the proximity of the sea and the extreme mobility of the sandy substrate, and they are expected to be severely affected by global environmental change. Dunes are sand deposits created by aeolian processes and carry a very characteristic vegetation. These sand deposits, together with the vegetation, form an essential barrier against the advance of the sea during spring high tides and storms. In Portugal, the degradation of coastal ecosystems is of great concern. The problem of invasive alien species has worsened, increasing the pressure on native plants. Although this problem is not the only cause of coastal ecosystem degradation, this work aims to publicise the dune plants of the Matosinhos coastal zone, to raise awareness of the protection and conservation of the dunes, and to warn that several invasive plants rapidly colonise open spaces, threatening the stability of coastal ecosystems. Detailed knowledge of these ecosystems will allow the application of surveillance and monitoring processes, as well as the ecological restoration of degraded dune areas.

Relevance:

80.00%

Publisher:

Abstract:

The financial and economic crisis has hit Europe in its core. While the crisis may not have originated in the European Union, it has laid bare structural weaknesses in the EU’s policy framework. Both public finances and the banking sector have been heavily affected. For a long time, the EU failed to take into account sufficiently the perverse link that existed between the two. Negative evolutions in one field of the crisis often dragged along the other in its downward spiral. In June 2012, in the early hours of a yet another EU Summit, the leaders of the eurozone finally decided to address the link between the banking and sovereign debt crises. Faced with soaring public borrowing costs in Spain and Italy, they decided to allow for the direct European recapitalisation of banks when the Member State itself would no longer be in a position to do so. In exchange, supervision of the banking sector would be lifted to the European level by means of a Single Supervisory Mechanism. The Single Supervisory Mechanism, or SSM in the EU jargon, is a first step in the broader revision of policies towards banks in Europe. The eventual goal is the creation of a Banking Union, which is to carry out effective surveillance and – if needed – crisis management of the banking sector. The SSM is to rely on national supervisors and the ECB, with the ECB having final authority on the matter. The involvement of the latter made it clear that the SSM would be centred on the eurozone – while it is to remain open to other Member States willing to join. Due to the ongoing problems and the link between the creation of the SSM and the recapitalisation of banks, the SSM became one of the key legislative priorities of the EU. In December 2012, Member States reached an agreement on the design of the SSM. After discussions with the European Parliament (which were still ongoing at the time of writing), the process towards making the SSM operational can be initiated. The goal is to have the SSM fully up and running in the first half of 2014. The decisions that were taken in June 2012 are likely to have had a bigger impact than the eurozone’s Heads of State and Government could have realised at the time, for two important reasons. On the one hand, creating the SSM necessitates a full Banking Union and therefore shared risk. On the other hand, the decisions improved the ECB’s perception of the willingness of governments to take far-reaching measures. This undoubtedly played a significant role in the creation of the Outright Monetary Transactions programme by the ECB, which has led to a substantial easing of the crisis in the short term. These short-term gains should now be matched with a stable long-term framework for bank supervision and crisis management. The agreement on the SSM should be the first step in the direction of this goal. This paper provides an analysis of the SSM and its role in the creation of a Banking Union. The paper starts with a reminder of why the EU decided to put in place the SSM (§1) and the state of play of the ongoing negotiations on the SSM (§2). Subsequently, the supervisory responsibilities of the SSM are detailed, including its scope and the division of labour between the national supervisors and the ECB (§3). The internal functioning of the SSM (§4) and its relation to the other supervisors are discussed afterwards (§5). As mentioned earlier, the SSM is part of a wider move towards a Banking Union. Therefore, this paper sheds light on the other building blocks of this ambitious project (§6).
The transition towards the Banking Union is important and will prove to be a bumpy ride. Before formulating a number of conclusions, this Working Paper therefore provides an overview of the planned road ahead (§7).

Relevance:

80.00%

Publisher:

Abstract:

Fiscal consolidation is essential to ensure the sustainability of eurozone countries’ public debt. However, as a principle, consolidation should not be pursued at a pace unnecessarily undermining growth in the short term. Repeated downward revisions of growth call for the use of the flexibility foreseen in the EU fiscal framework. The Commission should adapt the deadlines for fiscal correction to prevent excessive, pro-cyclical adjustment in 2013. In turn, adequate surveillance and coordination must ensure structural adjustments constitute the core of fiscal consolidation plans.

Relevance:

80.00%

Publisher:

Abstract:

Apprehending pirates in the Indian Ocean is one thing. Defeating the networks through which smugglers traffic migrants through North Africa is quite another. The European Union’s new naval force deployment in the Mediterranean - EUNAVFOR MED - drew criticism from international partners and the general public alike when plans for a “boat-sinking” operation were unveiled, raising fears about unacceptable levels of violence and collateral damage - a European version of Mexico’s drug war. Yet the problems of EUNAVFOR MED lie less in clumsy public diplomacy than in the perilous mismatch between its stated objectives and the absence of a clear strategy and mandate, and this creates both operational and political risks for member states. Phase 1 of the operation (surveillance and assessment) has begun with no legal mandate to carry out the crucial phases 2 and 3 (seek and destroy), whose military planning and outcomes remain undetermined. Despite these limitations, the naval force could nevertheless mark a turning point in the EU’s security narrative, because it means that the Union is finally addressing the threats to security and the humanitarian tragedies in its southern neighbourhood.

Relevance:

80.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

80.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance:

80.00%

Publisher:

Abstract:

This thesis is concerned with understanding how Emergency Management Agencies (EMAs) influence public preparedness for mass evacuation across seven countries. Due to the lack of cross-national research (Tierney et al., 2001), there is little knowledge of EMAs' perspectives on and approaches to the governance of public preparedness. This thesis seeks to address this gap through cross-national research that explores and contributes towards understanding the governance of public preparedness. The research draws upon the risk communication (Wood et al., 2011; Tierney et al., 2001), social marketing (Marshall et al., 2007; Kotler and Lee, 2008; Ramaprasad, 2005), risk governance (Walker et al., 2010, 2013; Kuhlicke et al., 2011; IRGC, 2005, 2007; Renn et al., 2011; Klinke and Renn, 2012), risk society (Beck, 1992, 1999, 2002) and governmentality (Foucault, 1978, 2003, 2009) literature to explain this governance and how EMAs responsibilize the public for their preparedness. EMAs from seven countries (Belgium, Denmark, Germany, Iceland, Japan, Sweden, the United Kingdom) explain how they prepare their public for mass evacuation in response to different types of risk. A cross-national (Hantrais, 1999) interpretive research approach, using qualitative methods including semi-structured interviews, documents and observation, was used to collect data. The data analysis process (Miles and Huberman, 1999) identified how the concepts of risk, knowledge and responsibility are critical for theorising how EMAs influence public preparedness for mass evacuation. The key findings grounded in these concepts include:
- Theoretically, risk is multi-functional in the governance of public preparedness. It regulates behaviour, enables surveillance and acts as a technique of exclusion.
- EMAs' knowledge, how it influences their assessment of risk, and how they share the responsibility for public preparedness across institutions and the public are key to the governance of public preparedness for mass evacuation. This resulted in a form of public segmentation common to all countries, whereby the public were prepared unequally.
- EMAs use their prior knowledge and assessments of risk to target public preparedness in response to particular known hazards. However, this strategy places the non-targeted public at greater risk in relation to unknown hazards, such as a man-made disaster.
- A cross-national conceptual framework identifies four distinctive governance practices (exclusionary, informing, involving and influencing) that are utilised to influence public preparedness.
- The uncertainty associated with particular types of risk limits the application of social marketing as a strategy for influencing the public to take responsibility, and can potentially increase the risk to the public.

Relevance:

80.00%

Publisher:

Abstract:

National surveys are important tools for public health surveillance and thus key elements in monitoring health conditions and system performance. In the field of oral health, such surveys began with the oral health surveys of 1986 and 1996 and continued with the SBBrasil Project in 2003. The 2010 edition of SBBrasil is the principal oral health surveillance strategy for the production of primary data. In order to contribute to this discussion, this article proposes: (a) to present and discuss the Brazilian experience with nationwide oral health surveys and (b) to discuss the use of the data in health surveillance models. One can conclude that oral health surveys in Brazil hold great potential as a tool for health services and academia. Such surveys have shown evident potential for verifying trends in the oral health profile, as well as for producing valid indicators for use in health services.

Relevance:

80.00%

Publisher:

Abstract:

Health promotion stands in direct opposition to the biomedical model and is built through intersectoral action, with collective and interdisciplinary approaches that consider subjects in their life contexts. Building healthy territories is promoting health, which necessarily includes intersectoral coordination and community mobilisation. The health and education sectors can work together to promote health, developing articulated actions and practices that involve subjects in the territory where they live and work. This study aimed to design and experience health promotion strategies in a school and in Family Basic Health Units in Uberlândia - MG, based on intersectoral relationships and community mobilisation. The methodological route was action research, or intervention research, because the ideas were applied to solve problems through collective action while the research was being carried out. The research began at the Municipal School of Basic Education Prof. Eurico Silva, with the deployment of the Health Centre to carry out surveillance and health promotion with the active participation of students, involving all subjects of the school - students, teachers and other staff - in the context of everyday life, which extends beyond the school walls, reaching the families and social groups in the community to which they belong. The health observatory came into being with the establishment of working groups, which at first were "healthy eating" and "drug-free world" and, later, "dengue". The themes were chosen by the participants of the Health Centre, each of whom joined the group of their preference. The second part of the research began with the approximation between the Health Centre and the health units (UBS and BFHU). The proposal was that the schools and the health units together should undertake prevention and health promotion, combating Aedes aegypti through intersectoral coordination and community mobilisation. For this, the involvement of the ACS, ACE and ASE workers and of the nurse coordinator of the Health Unit was crucial in creating community networks in the territory. A training course was conducted with all BFHU and UBS teams on the following subjects: home visits, community mobilisation and intersectoral coordination. At this stage, it was the Health Units that were to approach the schools, in order to build community networks to fight Aedes aegypti in each territory. The results and the scope of this experience could only be achieved because the Health Surveillance Board and the Primary Care Coordination embraced the proposal and helped in its implementation. It remains to continue consolidating this way of working in the primary care health units and the elementary schools, and to replicate the Health Centre's experience in other schools. The conclusion of this work is that schools and health care facilities, together with intersectoral coordination and community mobilisation supported by community networks, can carry out prevention and health promotion based on a health model that considers the social determinants of health and overcomes the hygienist/sanitarist model.

Relevance:

80.00%

Publisher:

Abstract:

This thesis aims to understand the issue of forced marriage as experienced by immigrant women living in Quebec, and the political, legislative and social responses to it. More specifically, it seeks to bring to light the diversity of situations and meanings covered by the notion of forced marriage, in an attempt to draw out elements of definition and understanding. The thesis also aims to identify the specific consequences of a forced marriage for immigrant women living in Quebec and, finally, to analyse the political, legislative and social responses to forced marriage in Canada and Quebec aimed at preventing it, detecting it and protecting its victims in an intercultural context. Based on a corpus of ten interviews with immigrant women living in, having lived in, or threatened with a forced marriage, and eighteen key informants working with them from different fields of practice (police, justice, health, social and community services), an intersectional analysis revealed the full complexity of forced marriages, due in particular to the interrelations between systems of oppression and multiple vulnerabilities. The literature review and our results indicate that certain elements characterise forced marriages. First, the preservation of patriarchal honour, which problematises and controls women's behaviour, particularly with regard to their sexual but also their social life. Second, the fact that forced marriage is a means of pursuing interests that are more often collective than individual, a collective dimension that will necessarily have to be taken into account in the solutions to this problem. Third, the role of women (mothers, mothers-in-law and other women of the cultural community of belonging) in arranging marriages, but also in the surveillance and control of everything the other women do. Fourth, the potential for multiple aggressors, including the community itself, in the acts of violence committed before, during and, where applicable, after the marriage, another dimension that must also be taken into account in intervention. Fifth, the potential for sexual exploitation (marital rape, forced pregnancies), physical exploitation (ill-treatment, injuries), psychological exploitation (pressure, manipulation) or economic exploitation (forced labour, deprivation of financial autonomy). Taken together, these results made it possible to identify certain needs related to intervention, in terms of prevention, detection and protection of the victims of forced marriage.