885 results for Risk levels
Abstract:
As with all forms of transport, the safety of air travel is of paramount importance. With the projected increases in European air traffic over the next decade and beyond, it is clear that the risk of accidents needs to be assessed and carefully monitored on a continuing basis. The present thesis is aimed at the development of a comprehensive collision risk model as a method of assessing the European en-route risk, due to all causes and across all dimensions within the airspace. The major constraint in developing appropriate monitoring methodologies and tools to assess the level of safety in en-route airspaces, where controllers monitor air traffic by means of radar surveillance and provide aircraft with tactical instructions, lies in the estimation of the operational risk. The operational risk estimate normally relies on incident reports provided by the air navigation service providers (ANSPs). This thesis proposes a new and innovative approach to assessing the aircraft safety level based exclusively on the processing and analysis of radar tracks. The proposed methodology has been designed to complement the information collected in the accident and incident databases, thereby providing robust information on air traffic factors and safety metrics inferred from the in-depth assessment of all proximate events. The 3-D CRM methodology is implemented in a prototype tool in MATLAB© that automatically analyzes the aircraft tracks and flight plan data recorded by the Radar Data Processing (RDP) systems and identifies and analyzes all proximate events (conflicts, potential conflicts and potential collisions) within a given time span and volume of airspace.
Currently, the 3-D CRM prototype is being adapted and integrated into AENA's Performance Monitoring Tool (PERSEO) to complement the information provided by the ATM accident and incident databases and to enhance the monitoring of, and the provision of evidence about, the levels of safety.
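The core of the event-detection step described above, scanning recorded radar tracks for aircraft pairs that simultaneously violate the en-route separation minima (5 NM horizontally, 1000 ft vertically), can be sketched as follows. The thesis prototype is written in MATLAB; this is an illustrative Python sketch with a hypothetical track data layout, not the 3-D CRM implementation itself.

```python
import numpy as np

def detect_proximity_events(tracks, h_sep_nm=5.0, v_sep_ft=1000.0):
    """Flag the time samples at which a pair of aircraft simultaneously
    violates the horizontal and vertical separation minima.
    tracks: {callsign: array of shape (T, 3)} with time-synchronized
    samples of (x_nm, y_nm, altitude_ft)."""
    events = []
    ids = list(tracks)
    for i in range(len(ids)):
        for j in range(i + 1, len(ids)):
            a, b = tracks[ids[i]], tracks[ids[j]]
            h_dist = np.hypot(a[:, 0] - b[:, 0], a[:, 1] - b[:, 1])
            v_dist = np.abs(a[:, 2] - b[:, 2])
            for t in np.flatnonzero((h_dist < h_sep_nm) & (v_dist < v_sep_ft)):
                events.append((ids[i], ids[j], int(t), h_dist[t], v_dist[t]))
    return events
```

A real implementation would additionally classify each flagged sample into conflict, potential conflict or potential collision, which this sketch does not attempt.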
Abstract:
Urban gardening has spread worldwide in recent years, as it enhances food security and self-supply and promotes community integration. However, urban soils are significantly enriched in trace elements relative to background levels. Exposure to the soil in urban gardens may therefore result in adverse health effects, depending on the degree of contact during gardening, infant recreational activities and the ingestion of vegetables grown in them. In order to evaluate this potential risk, 36 composite samples were collected from the top 20 cm of the soil of 6 urban gardens in Madrid. The aqua regia (pseudo-total) and glycine-extractable (bioaccessible) concentrations of Co, Cr, Cu, Ni, Pb and Zn were determined by atomic absorption spectrophotometry. Additionally, pH, texture, Fe, Ca and Mn concentrations, and organic matter and calcium carbonate contents were determined in all urban gardens, and their influence on trace element bioaccessibility was analyzed.
Abstract:
Physical fitness is a variable that is gaining prominence, especially from the health perspective. The improvement in quality of life experienced in recent years in developed societies has led to greater life expectancy, so that more and more people are living longer. This rapid growth of the population over 60 years of age means that a group practically forgotten by scientific research in the field of physical activity and sport is becoming increasingly important, with the aim of helping to fulfil the saying "do not only add years to life, but also add life to years".
The principal aim of this thesis was to assess physical fitness levels in Spanish elderly people over 65 years of age, and to analyze the relationship between physical fitness, its determinants, and other aspects of health such as body composition and cognitive status. In order to establish future public health policies in relation to physical activity and active ageing, it is necessary to identify the baseline physical fitness levels of the elderly population in Spain and their determinants. The work is based on data from the EXERNET multi-center study ("Multi-center Study for the Evaluation of Fitness Levels and their Relationship to Healthy Lifestyles in Non-institutionalized Spanish Elderly") and on data from two studies conducted in institutionalized elderly people. A total of 3136 independently living, non-institutionalized elderly from 6 regions of Spain and 153 elderly people institutionalized in nursing homes of the Region of Madrid were analyzed.
The main outcomes of this thesis are: a) Sex- and age-specific reference values and percentile curves for each fitness test were established for independent, non-institutionalized Spanish elderly. b) Elderly men showed better fitness levels than women, except in the flexibility tests, and there is a trend toward decreasing physical fitness in both sexes as age increases. c) Lower levels of functional fitness were associated with increased perceived problems. d) The minimum functional fitness level at which older adults perceive problems in their activities of daily living (ADLs) is similar for both sexes. e) Higher levels of physical fitness were associated with a reduced risk of sarcopenic obesity and with better perceived health among the elderly. f) The elderly with sarcopenic obesity have lower functional capacity than their healthy counterparts. g) Higher strength values were associated with better cognitive status, with cognitive status being the variable that most influences strength deterioration, even more than sex and age.
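The normative values and percentile curves in result (a) amount to tabulating empirical percentiles of each fitness test per sex and age band. A minimal sketch of that tabulation, with hypothetical age bands and data layout rather than the EXERNET protocol itself:

```python
import numpy as np

def percentile_table(ages, scores, bins, pcts=(5, 25, 50, 75, 95)):
    """Empirical percentile table for one fitness test.
    Sex-specific tables are built by calling this once per sex.
    Returns {age_band: {percentile: value}}."""
    table = {}
    for lo, hi in zip(bins[:-1], bins[1:]):
        in_band = scores[(ages >= lo) & (ages < hi)]
        if in_band.size:
            table[f"{lo}-{hi - 1}"] = dict(
                zip(pcts, np.percentile(in_band, pcts)))
    return table
```

Plotting the 5th to 95th values of each band against band midpoints yields the percentile curves.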
Abstract:
Colombia is one of the largest per-capita mercury polluters in the world as a consequence of its artisanal gold mining activities. The severity of this problem in terms of potential health effects was evaluated by means of a probabilistic risk assessment carried out in the twelve departments (or provinces) of Colombia with the largest gold production. The two exposure pathways included in the risk assessment were inhalation of elemental Hg vapors and ingestion of fish contaminated with methylmercury. Exposure parameters for the adult population (especially rates of fish consumption) were obtained from nation-wide surveys, and concentrations of Hg in air and of methylmercury in fish were gathered from previous scientific studies. Fish consumption varied between departments and ranged from 0 to 0.3 kg d⁻¹. Average concentrations of total mercury in fish (70 data points) ranged from 0.026 to 3.3 µg g⁻¹. A total of 550 individual measurements of Hg in workshop air (ranging from below the detection limit (DL) to 1 mg m⁻³) and 261 measurements of Hg in outdoor air (ranging from <DL to 0.652 mg m⁻³) were used to generate the probability distributions used as concentration terms in the calculation of risk. All but two of the distributions of Hazard Quotients (HQ) associated with ingestion of Hg-contaminated fish for the twelve regions evaluated presented median values higher than the threshold value of 1, and the 95th percentiles ranged from 4 to 90. In the case of exposure to Hg vapors, minimum values of HQ for the general population exceeded 1 in all the towns included in this study, and the HQs for miner-smelters burning the amalgam are two orders of magnitude higher, reaching values of 200 for the 95th percentile.
Even acknowledging the conservative assumptions included in the risk assessment and the uncertainties associated with it, its results clearly reveal the exorbitant levels of risk endured not only by miner-smelters but also by the general population of artisanal gold mining communities in Colombia.
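A probabilistic assessment of this kind propagates distributions of concentration, intake rate and body weight into a distribution of Hazard Quotients whose percentiles can then be reported. A minimal Monte Carlo sketch of the fish-ingestion pathway, with hypothetical lognormal and uniform parameters; only the methylmercury oral reference dose of 1e-4 mg/kg-day (US EPA) is a standard value:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
# Hypothetical distribution parameters, not those fitted in the study
conc = rng.lognormal(mean=np.log(0.5), sigma=0.8, size=N)  # mg Hg / kg fish
intake = rng.uniform(0.0, 0.3, size=N)                     # kg fish / day
bw = rng.normal(70.0, 10.0, size=N).clip(40.0)             # body weight, kg
rfd = 1e-4  # US EPA oral reference dose for methylmercury, mg/kg-day

hq = conc * intake / (bw * rfd)       # Hazard Quotient per simulated adult
p50, p95 = np.percentile(hq, [50, 95])
```

Medians and 95th percentiles per department, as quoted above, come from running this with each department's fitted distributions.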
Abstract:
Pseudo-total (i.e. aqua regia extractable) and gastric-bioaccessible (i.e. glycine + HCl extractable) concentrations of Ca, Co, Cr, Cu, Fe, Mn, Ni, Pb and Zn were determined in a total of 48 samples collected from six community urban gardens of different characteristics in the city of Madrid (Spain). Calcium carbonate appears to be the soil property that determines the bioaccessibility of a majority of those elements, and the lack of influence of organic matter, pH and texture can be explained by their low levels in the samples (organic matter) or their narrow range of variation (pH and texture). A conservative risk assessment with bioaccessible concentrations in two scenarios, i.e. adult urban farmers and children playing in urban gardens, revealed acceptable levels of risk, but with large differences between urban gardens depending on their history of land use and their proximity to busy areas in the city center. Only in a worst-case scenario in which children who use urban gardens as recreational areas also eat the produce grown in them would the risk exceed the limits of acceptability.
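A scenario-based assessment of this kind rests on the standard average-daily-dose and hazard-quotient formulation, with the bioaccessible rather than pseudo-total concentration as the exposure term. A sketch with entirely hypothetical parameter values, not those of the study:

```python
def soil_ingestion_hq(c_bio, ir_soil, ef, ed, bw, at, rfd):
    """Hazard quotient for incidental soil ingestion using the
    bioaccessible (glycine-extractable) concentration.
    c_bio mg/kg, ir_soil kg soil/day, ef days/yr, ed yr, bw kg, at days,
    rfd mg/kg-day."""
    add = c_bio * ir_soil * ef * ed / (bw * at)  # average daily dose
    return add / rfd

# Hypothetical child-play scenario: 200 mg soil/day, 180 days/yr, 6 years,
# with an illustrative reference dose
hq_child = soil_ingestion_hq(c_bio=100.0, ir_soil=2e-4, ef=180, ed=6,
                             bw=15.0, at=6 * 365, rfd=3.5e-3)
```

An HQ below 1 is read as acceptable; the worst-case scenario mentioned above adds a produce-ingestion term to the same dose framework.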
Abstract:
In the present paper, one year of PM10 and PM2.5 data from roadside and urban background monitoring stations in Athens (Greece), Madrid (Spain) and London (UK) are analysed in relation to other air pollutants (NO, NO2, NOx, CO, O3 and SO2) and several meteorological parameters (wind velocity, temperature, relative humidity, precipitation, solar radiation and atmospheric pressure), in order to investigate the sources and factors affecting particulate pollution in large European cities. Principal component and regression analyses are used to quantify the contribution of both combustion and non-combustion sources to the observed PM10 and PM2.5 levels. The analysis reveals that the EU legislated PM10 and PM2.5 limit values are frequently breached, forming a potential public health hazard in the areas studied. The seasonal variability patterns of particulates vary among cities and sites, with Athens and Madrid presenting higher PM10 concentrations during the warm period, suggesting a larger relative contribution of secondary and natural particles during hot and dry days. It is estimated that the contribution of non-combustion sources varies substantially among cities, sites and seasons, ranging between 38-67% and 40-62% in London, 26-50% and 20-62% in Athens, and 31-58% and 33-68% in Madrid, for PM10 and PM2.5 respectively. Higher contributions from non-combustion sources are found at urban background sites in all three cities, whereas at the traffic sites the seasonal differences are smaller. In addition, the non-combustion fraction of both particle metrics is higher during the warm season at all sites. On the whole, the analysis provides evidence of the substantial impact of non-combustion sources on local air quality in all three cities.
While vehicular exhaust emissions carry a large part of the risk posed to human health by particle exposure, it is most likely that mitigation measures designed for their reduction will have a major effect only at traffic sites, and additional measures will be necessary for the control of background levels. However, efforts in mitigation strategies should always focus on optimal health effects.
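One common way to read a non-combustion share out of a regression, in the spirit of the analysis above, is to regress PM on a combustion tracer such as NOx and interpret the intercept as the PM level that persists at zero combustion signal. A minimal sketch of that idea, not the paper's exact model:

```python
import numpy as np

def noncombustion_share(pm, nox):
    """Least-squares fit pm = slope * nox + intercept; the intercept is
    read as PM from non-combustion sources, and its ratio to the mean
    PM level gives the non-combustion share."""
    A = np.column_stack([nox, np.ones_like(nox)])
    (slope, intercept), *_ = np.linalg.lstsq(A, pm, rcond=None)
    return intercept / pm.mean()
```

Running this per city, site and season with the measured series would yield ranges like the 38-67% figures quoted above.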
Abstract:
In this paper we focus on the selection of safeguards in a fuzzy risk analysis and management methodology for information systems (IS). Assets are connected by dependency relationships, and a failure of one asset may affect other assets. After computing impact and risk indicators associated with previously identified threats, we identify and apply safeguards to reduce risks in the IS by minimizing the transmission probabilities of failures throughout the asset network. However, as safeguards have associated costs, the aim is to select the safeguards that minimize costs while keeping the risk within acceptable levels. To do this, we propose a dynamic programming-based method that incorporates simulated annealing to tackle the resulting optimization problems.
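The annealing step of such a selection problem can be sketched as follows: a 0/1 vector over candidate safeguards, total cost as the objective, and a penalty whenever the residual risk exceeds the acceptable level. The cost and risk functions here are hypothetical placeholders; the paper couples this search with dynamic programming over the asset dependency network, which this sketch omits.

```python
import math, random

def select_safeguards(costs, risk_fn, risk_limit, iters=5000, t0=1.0, seed=1):
    """Pick a 0/1 safeguard vector minimising total cost subject to
    risk_fn(selection) <= risk_limit, via simulated annealing with a
    penalty for infeasible states."""
    rnd = random.Random(seed)
    n = len(costs)

    def energy(sel):
        cost = sum(c for c, s in zip(costs, sel) if s)
        return cost + 1e3 * max(0.0, risk_fn(sel) - risk_limit)

    cur = [1] * n                       # start fully protected (feasible)
    cur_e = energy(cur)
    best, best_e = cur[:], cur_e
    for k in range(iters):
        t = t0 * (1 - k / iters) + 1e-9  # linear cooling schedule
        cand = cur[:]
        cand[rnd.randrange(n)] ^= 1      # flip one safeguard on/off
        cand_e = energy(cand)
        if cand_e < cur_e or rnd.random() < math.exp((cur_e - cand_e) / t):
            cur, cur_e = cand, cand_e
            if cur_e < best_e:
                best, best_e = cur[:], cur_e
    return best, best_e
```

With a risk function that decreases as safeguards are added, the search trades the cost of each safeguard against the penalty for leaving the risk above the acceptable level.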
Abstract:
Container terminals are complex systems in which a large number of economic actors interact to provide high-quality services under rigid planning schedules and economic objectives. The so-called next-generation terminals are conceived to serve the new mega-vessels, which demand productivity rates of up to 300 moves/hour. These terminals need to satisfy high standards because competition among terminals is fierce. Ensuring reliability in berth scheduling is key to attracting clients, as well as to reducing to a minimum the time that vessels stay in port. Operations planning is therefore more complex than before, and the tolerances for errors are smaller. In this context, operational disturbances must be reduced to a minimum. The main sources of operational disruption, and thus of uncertainty, are identified and characterized in this study. A series of external drivers interact with the infrastructure and/or the operations, triggering failure or stoppage modes.
The latter may lead not only to operational delays but also to collateral and reputational damage or loss of time (especially management time), all of which implies an impact for the terminal. In the near future, the monitoring of operational variables has great potential to bring a qualitative improvement to the operations management and planning models of terminals, whose level of automation is increasing. The combination of expert criteria with instruments that provide short- and long-run data is fundamental for the development of decision-support tools, since they will then be adapted to the real climatic and operational conditions that exist on site. For the short term, a methodology is proposed to obtain forecasts of operational parameters in container terminals. To this end, a case study is presented in which forecasts of vessel performance are obtained; this work is based entirely on data provided by a semi-automated container terminal in Spain. On the other hand, it is analyzed how to manage, evaluate and mitigate the effect of operational disruptions in the long term by means of risk assessment, an interesting approach to evaluating the effect of uncertain but likely events on the long-term throughput of the terminal. In addition, a definition of operational risk in port facilities is proposed, along with a discussion of the terms that best represent the nature of the activities involved, and finally, guidelines to manage the results obtained are provided.
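Short-term forecasting of an operational rate such as gross crane moves/hour per vessel call can be illustrated with simple exponential smoothing. This is a generic sketch of one-step-ahead forecasting, not the methodology proposed in the thesis:

```python
def ses_forecast(series, alpha=0.3):
    """One-step-ahead simple exponential smoothing forecast.
    alpha close to 1 tracks recent calls; alpha close to 0 averages
    over the whole history."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level
```

In practice the smoothing constant would be fitted to the terminal's historical records, and richer models would add regressors for the disruption drivers identified in the study.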
Abstract:
This thesis addresses methodologies for computing the collision risk of satellites. Collision risk minimisation must be approached from two different points of view. From the operational point of view, it is necessary to filter, among all the objects sharing space with an operational satellite, those that may present an encounter. Since the orbits of the operational object and of the potential collider are not perfectly known, the encounter geometry and the collision risk must be evaluated. Depending on that geometry or risk, an avoidance manoeuvre may be required to prevent the collision. Such manoeuvres consume propellant, which affects the orbit-maintenance capability and hence the useful life of the satellite. The propellant needed over the lifetime of a satellite must therefore be estimated at the mission design stage for a correct definition of its useful life, especially for satellites orbiting in densely populated orbital regimes. Both aspects, mission design and the operational aspects related to collision risk, are addressed in this thesis and are summarised in Figure 3. Regarding the mission design aspects (lower part of the figure), it is necessary to evaluate statistically the characteristics of the space object population and the theories that allow the mean number of events encountered by a mission, and its capacity to reduce collision risk, to be computed. These two aspects define the most appropriate procedures for reducing collision risk in the operational phase. This aspect is addressed starting from the theory described in [Sánchez-Ortiz, 2006]T.14 and implemented by the author of this thesis in the ARES tool [Sánchez-Ortiz, 2004b]T.15, provided by ESA for the evaluation of collision avoidance strategies.
This theory is extended in this thesis to consider the characteristics of the orbital data available during the operational phases of a satellite (Section 4.3.3). In addition, the theory has been extended to consider the maximum collision risk when the uncertainty of the orbits of catalogued objects is not known (as is the case for TLE data), and the case where only catastrophic collision risk is to be considered (Section 4.3.2.3). These improvements have been included in the new version of ARES [Domínguez-González and Sánchez-Ortiz, 2012b]T.12, made available through [SDUP, 2014]R.60. In the operational phase, the catalogues providing orbital data of space objects are routinely processed to identify possible encounters, which are analysed with collision risk computation algorithms in order to propose avoidance manoeuvres. Currently there is a single source of public data, the TLE (Two Line Elements) catalogue. In addition, the American Joint Space Operations Center (JSpOC) provides conjunction summary messages (CSM) when the US surveillance system identifies a possible encounter. Depending on the data used in the operational phase (TLE or CSM), the avoidance strategy may differ owing to the characteristics of that information. The main characteristics of the available data (regarding the accuracy of the orbital data) must be known in order to estimate the possible collision events encountered by a satellite over its lifetime. In the case of TLE data, whose orbital accuracy is not provided, accuracy information derived from a statistical analysis can also be used in the operational process as well as in mission design. When CSM data are used as the basis of collision avoidance operations, the orbital accuracy of the two objects involved is known.
These characteristics have been analysed in detail, statistically evaluating the features of both types of data. Once that analysis was concluded, the impact of using TLE or CSM data on satellite operations was analysed (Section 5.1). This analysis has been published in a specialised journal [Sánchez-Ortiz, 2015b]T.3. In it, recommendations are provided for different missions (satellite size and orbital regime) regarding collision avoidance strategies to reduce the collision risk significantly. For example, for a satellite in a sun-synchronous orbit in the LEO regime, the typical ACPL value in widespread use is 10⁻⁴. This value is not adequate when collision avoidance schemes are based on TLE data: in that case the risk reduction capability is practically null (owing to the large uncertainties of the TLE data), even for short prediction times. To achieve a significant risk reduction, an ACPL of around 10⁻⁶ or lower would be needed, producing about 10 alarms per satellite per year (considering one-day predictions) or 100 alarms per year (with three-day predictions). The main conclusion is therefore the unsuitability of TLE data for the computation of collision events. In contrast, using CSM data, thanks to their better orbital accuracy, a significant risk reduction can be obtained with an ACPL of around 10⁻⁴ (considering 3-day predictions). Even 5-day predictions can be considered with an ACPL of around 10⁻⁵, and even longer prediction times can be used (7 days) with a 90% risk reduction and about 5 alarms per year (for 5-day predictions, the number of manoeuvres remains at about 2 per year). The dynamics in GEO differ from the LEO case and make the growth of orbital uncertainties with propagation time smaller.
On the other hand, the uncertainties derived from orbit determination are worse than in LEO because of the differences in observation capabilities between the two orbital regimes. Moreover, the prediction times considered for LEO may not be appropriate for a GEO satellite (since it has a longer orbital period). In this case, using TLE data, a significant risk reduction is only achieved with small ACPL values, producing one alarm per year when collision events are predicted one day ahead (too short a time to implement avoidance manoeuvres). More adequate ACPL values lie between 5·10⁻⁸ and 10⁻⁷, far below the values used in the current operations of most GEO missions (again, basing collision avoidance strategies on TLE data is not recommended in this orbital regime). CSM data allow an appropriate risk reduction with ACPL between 10⁻⁵ and 10⁻⁴ for short and medium prediction times (10⁻⁵ is recommended for 5- or 7-day predictions). The number of manoeuvres performed would be about one in 10 years of mission. It should be noted that these computations were made for a satellite of about 2 metres in radius. In the future, other space surveillance systems (such as ESA's SSA programme) will provide additional catalogues of space objects with the aim of reducing the collision risk of satellites. To define such surveillance systems, it is necessary to identify the catalogue performance required as a function of the intended risk reduction.
The catalogue characteristics that mainly affect that capability are the coverage (the number of objects included in the catalogue, limited mainly by the minimum object size given the limitations of the sensors used) and the accuracy of the orbital data (derived from the sensor performance regarding measurement accuracy and the capability to re-observe the objects). The result of this analysis (Section 5.2) has been published in a specialised journal [Sánchez-Ortiz, 2015a]T.2. This analysis was not initially foreseen in the thesis, and it shows how the theory described here, initially defined to support mission design (upper part of Figure 1), has been extended and can be applied to other purposes, such as the sizing of a space surveillance system (lower part of Figure 1). The main difference between the two analyses lies in considering the cataloguing capabilities (accuracy and size of observed objects) as a variable to be adjusted in the design of a surveillance system, whereas they are fixed in the case of a mission design. Regarding the outputs generated, all the quantities computed in a statistical collision risk analysis are important for mission design (with the aim of determining the avoidance strategy and the amount of propellant to be used), whereas for the design of a surveillance system the most important aspects are the number of manoeuvres and false alarms (reliability of the system) and the risk reduction capability (effectiveness of the system).
Additionally, a space surveillance system must be characterised by its capability to avoid catastrophic collisions (thus preventing a dramatic increase of the space debris population), whereas a mission design must consider all types of encounters, since an operator is interested in avoiding both catastrophic and lethal collisions. From the analysis of the performance (size of objects to be catalogued and orbital accuracy) required of a space surveillance system, it is concluded that both aspects must be set differently for the different orbital regimes. In LEO it becomes necessary to observe objects down to 5 cm in radius, whereas in GEO this requirement is relaxed to 100 cm to cover catastrophic collisions. The main reason for this difference comes from the different relative velocities between objects in the two orbital regimes. Regarding orbital accuracy, it must be very good in LEO in order to reduce the number of false alarms, whereas in higher orbital regimes medium accuracies can be considered. Regarding the operational aspects of collision risk determination, several algorithms exist for computing the risk between two space objects. Figure 2 summarises the cases in terms of collision risk computation algorithms and how they are addressed in this thesis. Objects are normally considered spherical to simplify the risk computation (Case A). This case is widely covered in the literature and is not analysed in detail in this thesis; an example is provided in Section 4.2. Considering the real shape of the objects (Case B) allows the risk to be computed more accurately. A new algorithm is defined in this thesis to compute the collision risk when at least one of the objects is considered complex (Section 4.4.2).
Dicho algoritmo permite calcular el riesgo de colisión para objetos formados por un conjunto de cajas, y se ha presentado en varias conferencias internacionales. Para evaluar las prestaciones de dicho algoritmo, sus resultados se han comparado con un análisis de Monte Carlo que se ha definido para considerar colisiones entre cajas de manera adecuada (sección 4.1.2.3), pues la búsqueda de colisiones simples aplicables para objetos esféricos no es aplicable a este caso. Este análisis de Monte Carlo se considera la verdad a la hora de calcular los resultados del algoritmos, dicha comparativa se presenta en la sección 4.4.4. En el caso de satélites que no se pueden considerar esféricos, el uso de un modelo de la geometría del satélite permite descartar eventos que no son colisiones reales o estimar con mayor precisión el riesgo asociado a un evento. El uso de estos algoritmos con geometrías complejas es más relevante para objetos de dimensiones grandes debido a las prestaciones de precisión orbital actuales. En el futuro, si los sistemas de vigilancia mejoran y las órbitas son conocidas con mayor precisión, la importancia de considerar la geometría real de los satélites será cada vez más relevante. La sección 5.4 presenta un ejemplo para un sistema de grandes dimensiones (satélite con un tether). Adicionalmente, si los dos objetos involucrados en la colisión tienen velocidad relativa baja (y geometría simple, Caso C en la Figura 2), la mayor parte de los algoritmos no son aplicables requiriendo implementaciones dedicadas para este caso particular. En esta tesis, uno de estos algoritmos presentado en la literatura [Patera, 2001]R.26 se ha analizado para determinar su idoneidad en distintos tipos de eventos (sección 4.5). La evaluación frete a un análisis de Monte Carlo se proporciona en la sección 4.5.2. Tras este análisis, se ha considerado adecuado para abordar las colisiones de baja velocidad. 
En particular, se ha concluido que el uso de algoritmos dedicados para baja velocidad son necesarios en función del tamaño del volumen de colisión proyectado en el plano de encuentro (B-plane) y del tamaño de la incertidumbre asociada al vector posición entre los dos objetos. Para incertidumbres grandes, estos algoritmos se hacen más necesarios pues la duración del intervalo en que los elipsoides de error de los dos objetos pueden intersecar es mayor. Dicho algoritmo se ha probado integrando el algoritmo de colisión para objetos con geometrías complejas. El resultado de dicho análisis muestra que este algoritmo puede ser extendido fácilmente para considerar diferentes tipos de algoritmos de cálculo de riesgo de colisión (sección 4.5.3). Ambos algoritmos, junto con el método Monte Carlo para geometrías complejas, se han implementado en la herramienta operacional de la ESA CORAM, que es utilizada para evaluar el riesgo de colisión en las actividades rutinarias de los satélites operados por ESA [Sánchez-Ortiz, 2013a]T.11. Este hecho muestra el interés y relevancia de los algoritmos desarrollados para la mejora de las operaciones de los satélites. Dichos algoritmos han sido presentados en varias conferencias internacionales [Sánchez-Ortiz, 2013b]T.9, [Pulido, 2014]T.7,[Grande-Olalla, 2013]T.10, [Pulido, 2014]T.5, [Sánchez-Ortiz, 2015c]T.1. ABSTRACT This document addresses methodologies for computation of the collision risk of a satellite. Two different approaches need to be considered for collision risk minimisation. On an operational basis, it is needed to perform a sieve of possible objects approaching the satellite, among all objects sharing the space with an operational satellite. As the orbits of both, satellite and the eventual collider, are not perfectly known but only estimated, the miss-encounter geometry and the actual risk of collision shall be evaluated. 
On the basis of the encounter geometry or the risk, a manoeuvre may be required to avoid the conjunction. Such manoeuvres consume fuel otherwise reserved for mission orbit maintenance and may therefore reduce the satellite's operational lifetime. Thus, the avoidance manoeuvre fuel budget should be estimated at the mission design phase for a better estimate of mission lifetime, especially for satellites orbiting in densely populated orbital regimes. These two aspects, mission design and operational collision risk, are summarised in Figure 3 and covered throughout this thesis. The bottom part of the figure identifies the aspects to be considered at the mission design phase (statistical characterisation of the space object population and the theory for computing the mean number of events and the risk reduction capability), which define the most appropriate collision avoidance approach at the mission operational phase. This part of the work starts from the theory described in [Sánchez-Ortiz, 2006]T.14 and implemented by this author in the ARES tool [Sánchez-Ortiz, 2004b]T.15 provided by ESA for the evaluation of collision avoidance approaches. This methodology has now been extended to account for the particular features of the data sets available in an operational environment (section 4.3.3). Additionally, the formulation has been extended to allow evaluating risk computation approaches when orbital uncertainty is not available (as in the TLE case) and when only catastrophic collisions are under study (section 4.3.2.3). These improvements to the theory have been included in the new version of the ESA ARES tool [Domínguez-González and Sánchez-Ortiz, 2012b]T.12, available through [SDUP, 2014]R.60. At the operational phase, the real catalogue data is processed on a routine basis with adequate collision risk computation algorithms to propose a conjunction avoidance manoeuvre optimised for every event. 
The optimisation of manoeuvres on an operational basis is not addressed in this document. Currently, the American Two-Line Element (TLE) catalogue is the only public source of orbit data for objects in space with which to identify potential conjunction events. Additionally, a Conjunction Summary Message (CSM) is provided by the Joint Space Operations Center (JSpOC) when the American system identifies a possible collision between satellites and debris. Depending on the data used for collision avoidance evaluation, the conjunction avoidance approach may differ. The main features of the currently available data need to be analysed (with regard to accuracy) in order to estimate the encounters likely to be found over the mission lifetime. In the case of TLE, as these data are not provided with accuracy information, operational collision avoidance may also rely on statistical accuracy information, as in the mission design approach. This is not the case for CSM data, which include the state vectors and orbital accuracy of the two objects involved. This aspect has been analysed in detail in this document, statistically characterising both data sets with regard to the main aspects relevant to collision avoidance. Once the analysis of the data sets was completed, the impact of those features on the most convenient avoidance approaches was investigated (section 5.1). This analysis is published in a peer-reviewed journal [Sánchez-Ortiz, 2015b]T.3. The analysis provides recommendations for different mission types (satellite size and orbital regime) regarding the most appropriate collision avoidance approach for relevant risk reduction. The risk reduction capability depends strongly on the accuracy of the catalogue used to identify potential collisions. Approaches based on CSM data are recommended over the TLE-based approach. 
Some approaches based on the maximum risk associated with the envisaged encounters are shown to report a very large number of events, which makes them unsuitable for operational activities. Accepted Collision Probability Levels (ACPL) are recommended for defining the avoidance strategies for different mission types. For example, for a LEO satellite in the Sun-synchronous regime, the typically used ACPL value of 10⁻⁴ is not suitable for collision avoidance schemes based on TLE data: the risk reduction capacity is almost null, due to the large uncertainties associated with TLE data sets even for short times to event. For a significant risk reduction when using TLE data, an ACPL on the order of 10⁻⁶ (or lower) appears to be required, producing about 10 warnings per year and mission (if one-day-ahead events are considered) or 100 warnings per year (for three-day-ahead estimations). Thus, the main conclusion from these results is that TLE data are not feasible for a proper collision avoidance approach. On the contrary, for CSM data, thanks to the better accuracy of the orbital information compared with TLE, an ACPL on the order of 10⁻⁴ allows the risk to be significantly reduced. This holds for events estimated up to 3 days ahead. Even 5-day-ahead events can be considered, but ACPL values down to 10⁻⁵ should then be used. Still larger prediction times (7 days) can be considered for a risk reduction of about 90%, at the cost of a larger number of warnings, up to 5 events per year, whereas 5-day prediction keeps the manoeuvre rate at 2 manoeuvres per year. The dynamics of GEO orbits differ from those in LEO, resulting in a slower growth of orbit uncertainty over time. On the other hand, uncertainties at short prediction times in this orbital regime are larger than in LEO, due to differences in observation capabilities. 
Additionally, it must be taken into account that the short prediction times feasible in LEO may not be appropriate for a GEO mission, as the orbital period is much larger in this regime. In the case of TLE data sets, a significant risk reduction is achieved only for small ACPL values, producing about one warning event per year if warnings are raised one day in advance of the event (too short for any reaction to be considered). Suitable ACPL values would lie between 5·10⁻⁸ and 10⁻⁷, well below the values normally used in current operations for most GEO missions (TLE-based collision avoidance strategies are not recommended in this regime). On the contrary, CSM data allow a good risk reduction with an ACPL between 10⁻⁵ and 10⁻⁴ for short and medium prediction times; 10⁻⁵ is recommended for prediction times of five or seven days. The number of events raised for a suitable warning time of seven days would be about one in a 10-year mission. It must be noted that these results correspond to a 2 m radius spacecraft; the impact of satellite size is also analysed within the thesis. In the future, other Space Situational Awareness (SSA, ESA programme) systems may provide additional catalogues of objects in space with the aim of reducing the risk. It is necessary to investigate the performance required of those catalogues to enable such risk reduction. The main performance aspects are coverage (the objects included in the catalogue, mainly limited by a minimum object size derived from sensor performance) and the accuracy of the orbital data needed to evaluate conjunctions reliably (derived from sensor performance in terms of object observation frequency and measurement accuracy). The results of these investigations (section 5.2) are published in a peer-reviewed journal [Sánchez-Ortiz, 2015a]T.2. 
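The ACPL-based screening described above amounts to comparing each predicted event's collision probability against a threshold chosen per orbital regime and data source. A minimal sketch, using the threshold values quoted in this abstract; the function name, dictionary layout, and example probabilities are illustrative assumptions, not part of the thesis:

```python
# Sketch of ACPL-based conjunction screening. The thresholds are the
# values quoted in the text; names and structure are hypothetical.

# ACPL thresholds keyed by (orbital regime, data source).
ACPL = {
    ("LEO", "CSM"): 1e-4,   # effective up to ~3-day-ahead predictions
    ("LEO", "TLE"): 1e-6,   # needed for any significant risk reduction
    ("GEO", "CSM"): 1e-5,   # recommended for 5- to 7-day predictions
    ("GEO", "TLE"): 1e-7,   # upper end of the 5e-8 .. 1e-7 range
}

def warrants_manoeuvre(collision_probability, regime, source):
    """Return True if the event's probability exceeds the ACPL."""
    return collision_probability > ACPL[(regime, source)]

# Example: a CSM-reported LEO event with P = 3e-4 exceeds the 1e-4 ACPL,
# while the same-magnitude TLE threshold would reject a 3e-7 event.
print(warrants_manoeuvre(3e-4, "LEO", "CSM"))   # True
print(warrants_manoeuvre(3e-7, "LEO", "TLE"))   # False
```

The trade-off the abstract describes lives entirely in these numbers: lowering the ACPL raises the warning (and manoeuvre) rate, while raising it erodes the risk reduction achieved.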
This line of investigation was not initially foreseen as an objective of the thesis, but it shows how the theory described here, initially defined for mission design with regard to avoidance manoeuvre fuel allocation (upper part of Figure 1), has been extended to serve additional purposes, such as dimensioning a Space Surveillance and Tracking (SST) system (lower part of Figure 1). The main difference between the two approaches is whether the catalogue features are fixed parameters of the theory (for the satellite mission design case) or inputs to the analysis (for the SST design case). Regarding the outputs, all the quantities computed by the statistical conjunction analysis are important for mission design (with the objective of properly defining the global avoidance strategy and fuel allocation), whereas for SST design the most relevant aspects are the manoeuvre and false alarm rates (defining a reliable system) and the risk reduction capability (driving the effectiveness of the system). Regarding the methodology for computing the risk, the SST system should be driven by its capacity to provide the means to avoid catastrophic conjunction events (avoiding a dramatic increase of the population), whereas satellite mission design should consider all types of encounters, as the operator is interested in avoiding both lethal and catastrophic collisions. From the analysis of the SST features (object coverage and orbital uncertainty) required for a reliable system, it is concluded that these two characteristics must be imposed differently for the different orbital regimes, as the population level depends on the orbit type. Coverage values range from 5 cm for the very populated LEO regime up to 100 cm for the GEO region. The difference in this requirement derives mainly from the relative velocity of encounters in those regimes. 
Regarding the orbital knowledge in the catalogues, very accurate information is required for objects in the LEO region in order to limit the number of false alarms, whereas intermediate orbital accuracy can be accepted for higher orbital regimes. Regarding operational collision avoidance approaches, several algorithms are used to evaluate the collision risk between a pair of objects. Figure 2 summarises the different collision risk algorithm cases and indicates how they are covered in this document. The typical case of high relative velocity between spherical objects (case A) is well covered in the literature, with a large number of available algorithms, and is not analysed in detail in this work; only a sample case is provided in section 4.2. If complex geometries are considered (case B), a more realistic risk evaluation can be performed. A new approach for evaluating risk in the case of complex geometries is presented in this thesis (section 4.4.2) and has been presented at several international conferences. The developed algorithm evaluates the risk for complex objects formed by a set of boxes. A dedicated Monte Carlo method has also been described (section 4.1.2.3) and implemented to detect the actual collisions among a large number of simulation shots. These Monte Carlo runs are taken as the truth against which the algorithm results are compared (section 4.4.4). For spacecraft that cannot be treated as spheres, considering the real geometry of the objects makes it possible to discard events that are not real conjunctions, or to estimate the risk associated with an event more reliably. This is of particular importance for large spacecraft, as the position uncertainty in current catalogues is not small enough to make a difference for objects below metre size. 
As tracking systems improve and the orbits of catalogued objects become known more precisely, considering the actual shapes of the objects will become increasingly relevant. The particular case of a very large system (a tethered satellite) is analysed in section 5.4. Additionally, if the two colliding objects have low relative velocity (and simple geometries, case C in Figure 2), the most common collision risk algorithms fail and adequate theories need to be applied. In this document, a low relative velocity algorithm from the literature [Patera, 2001]R.26 is described and evaluated (section 4.5); an evaluation against a Monte Carlo approach is provided in section 4.5.2. The main conclusion of this analysis is that the algorithm is suitable for the most common encounter characteristics, and it is therefore selected as adequate for collision risk estimation. Its performance is evaluated to characterise when it can be safely used across a large variety of encounter characteristics. In particular, it is found that the need for dedicated algorithms depends on both the size of the collision volume in the B-plane and the miss-distance uncertainty. For large uncertainties the need for such algorithms is greater, since for small uncertainties the portion of the encounter during which the covariance ellipsoids intersect is shorter. Additionally, its application to complex satellite geometries is assessed (case D in Figure 2) by integrating the algorithm developed in this thesis with Patera's formulation for low relative velocity encounters. The results of this analysis show that the algorithm can easily be extended into a collision risk estimation process suitable for complex geometry objects (section 4.5.3). The two algorithms, together with the Monte Carlo method, have been implemented in the operational tool CORAM for ESA, which is used to evaluate the collision risk of ESA-operated missions [Sánchez-Ortiz, 2013a]T.11. 
This fact shows the interest and relevance of the developed algorithms for improving satellite operations. The algorithms have been presented at several international conferences [Sánchez-Ortiz, 2013b]T.9, [Pulido, 2014]T.7, [Grande-Olalla, 2013]T.10, [Pulido, 2014]T.5, [Sánchez-Ortiz, 2015c]T.1.
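The Monte Carlo "truth" approach used above to validate the analytical algorithms can be illustrated for the simplest configuration, case A (two spherical objects at high relative velocity): sample the relative position in the encounter plane from its Gaussian uncertainty and count the fraction of samples falling within the combined hard-body radius. This is a minimal sketch under simplifying assumptions (uncorrelated, equal-sigma 2-D uncertainty); the function name and all numerical values are illustrative, not taken from the thesis:

```python
# Minimal Monte Carlo sketch for case A: two spherical objects whose
# relative position in the B-plane is Gaussian-distributed. A "hit" is
# a sample within the combined hard-body radius of the two spheres.
import math
import random

def mc_collision_probability(miss, sigma, hard_body_radius, n=200_000, seed=1):
    """Estimate collision probability by counting in-radius samples."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        # 2-D Gaussian sample of the relative position in the B-plane
        # (uncorrelated components with equal sigma, for simplicity).
        x = rng.gauss(miss[0], sigma)
        y = rng.gauss(miss[1], sigma)
        if math.hypot(x, y) <= hard_body_radius:
            hits += 1
    return hits / n

# Example: 200 m nominal miss distance, 100 m uncertainty per axis,
# 10 m combined hard-body radius (all values illustrative).
p = mc_collision_probability((200.0, 0.0), 100.0, 10.0)
print(f"estimated collision probability ~ {p:.1e}")
```

The thesis's dedicated Monte Carlo (section 4.1.2.3) replaces the in-radius test with a box-against-box intersection search, which is precisely why the spherical shortcut above is inapplicable to complex geometries.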
Resumo:
We hypothesized that feeding pregnant rats a high-fat diet would increase both circulating 17β-estradiol (E2) levels in the dams and the risk of developing carcinogen-induced mammary tumors among their female offspring. Pregnant rats were fed isocaloric diets containing 12% or 16% (low fat) or 43% or 46% (high fat) of calories from corn oil, which primarily contains the n − 6 polyunsaturated fatty acid (PUFA) linoleic acid, throughout pregnancy. The plasma concentrations of E2 were significantly higher in pregnant females fed a high n − 6 PUFA diet. The female offspring of these rats were fed laboratory chow from birth onward and, when exposed to 7,12-dimethylbenz(a)anthracene, had a significantly higher mammary tumor incidence (60% vs. 30%) and shorter latency for tumor appearance (11.4 ± 0.5 weeks vs. 14.2 ± 0.6 weeks) than the offspring of the low-fat mothers. The high-fat offspring also reached puberty at a younger age, and their mammary glands contained significantly higher numbers of the epithelial structures that are the targets for malignant transformation. Comparable changes in puberty onset, mammary gland morphology, and tumor incidence were observed in the offspring of rats treated daily with 20 ng of E2 during pregnancy. These data, if extrapolated to humans, may explain the link among diet, early puberty onset, mammary parenchymal patterns, and breast cancer risk, and indicate that in utero exposure to a diet high in n − 6 PUFA and/or estrogenic stimuli may be critical in affecting breast cancer risk.
Resumo:
Recent studies indicated that hyperactivity of the hypothalamo-pituitary-adrenal system is a considerable risk factor for the precipitation of affective disorders, most notably of major depression. The mechanism by which this hyperactivity eventually leads to clinical symptoms of depression is unknown. In the present animal study, we tested one possible mechanism, i.e., that long-term exposure to high corticosterone levels alters functional responses to serotonin in the hippocampus, an important area in the etiology of depression. Rats were injected daily for 3 weeks with a high dose of corticosterone; electrophysiological responses to serotonin were recorded intracellularly from CA1 pyramidal neurons in vitro. We observed that daily injections with corticosterone gradually attenuate the membrane hyperpolarization and resistance decrease mediated by serotonin-1A receptors. We next used single-cell antisense RNA amplification from identified CA1 pyramidal neurons to resolve whether the functional deficits in serotonin responsiveness are accompanied by decreased expression levels of the serotonin-1A receptor. It appeared that expression of serotonin-1A receptors in CA1 pyramidal cells is not altered; this result was supported by in situ hybridization. Expression of corticosteroid receptors in the same cells, particularly of the high-affinity mineralocorticoid receptor, was significantly reduced after long-term corticosterone treatment. The present findings indicate that prolonged elevation of the corticosteroid concentration, a possible causal factor for major depression in humans, gradually attenuates responsiveness to serotonin without necessarily decreasing serotonin-1A receptor mRNA levels in pyramidal neurons. These functional changes may occur by a posttranscriptional mechanism or by transcriptional regulation of genes other than the serotonin-1A receptor gene itself.
Resumo:
Recent epidemiological studies show a strong reduction in the incidence of Alzheimer's disease in patients treated with cholesterol-lowering statins. Moreover, elevated Aβ42 levels and the ɛ4 allele of the lipid-carrier apolipoprotein E are regarded as risk factors for sporadic and familial Alzheimer's disease. Here we demonstrate that the widely used cholesterol-lowering drugs simvastatin and lovastatin reduce intracellular and extracellular levels of Aβ42 and Aβ40 peptides in primary cultures of hippocampal neurons and mixed cortical neurons. Likewise, guinea pigs treated with high doses of simvastatin showed a strong and reversible reduction of cerebral Aβ42 and Aβ40 levels in the cerebrospinal fluid and brain homogenate. These results suggest that lipids are playing an important role in the development of Alzheimer's disease. Lowered levels of Aβ42 may provide the mechanism for the observed reduced incidence of dementia in statin-treated patients and may open up avenues for therapeutic interventions.
Resumo:
Poverty increases children's exposure to stress, elevating their risk for developing patterns of heightened sympathetic and parasympathetic stress reactivity. Repeated patterns of high sympathetic activation and parasympathetic withdrawal place children at risk for anxiety disorders. This study evaluated whether providing social support to preschool-age children during mildly stressful situations helps reduce reactivity, and whether this effect partly depends on children's previously assessed baseline reactivity patterns. The Biological Sensitivity to Context (BSC) theory proposes that highly reactive children may be more sensitive than less reactive children to all environmental influences, including social support. In contrast, conventional physiological reactivity (CPR) theory contends that highly reactive children are more vulnerable to the impact of stress but are less receptive to the potential benefits present within their social environments. In this study, baseline autonomic reactivity patterns were measured. Children were then randomly assigned to a high-support or neutral control condition, and the effect of social support on autonomic response patterns was assessed. Results revealed an interaction between baseline reactivity profiles and experimental condition. Children with patterns of high-reactivity reaped more benefits from the social support in the experimental condition than did their less reactive peers. Highly reactive children experienced relatively less reactivity reduction in the neutral condition while experiencing relatively greater reactivity reduction in the support condition. Despite their demonstrated stability over time, reactivity patterns are also quite susceptible to change at this age; therefore understanding how social support ameliorates reactivity will further efforts to avert stable patterns of high-reactivity among children with high levels of stress, ultimately reducing risk for anxiety disorders.
Resumo:
Although non-suicidal self-injury (NSSI) engagement is a growing public health concern, little research has documented the developmental precursors to NSSI in longitudinal studies using youth samples. This study aimed to expand upon previous research on groups of NSSI engagement in a population-based sample of youth using multi-wave data. Moreover, this study examined whether chronic peer and romantic stress, the serotonin transporter gene (5-HTTLPR), parenting behaviors, and negative attributional style predicted NSSI group membership, as well as the role of sex and grade. Participants were 549 youth beginning in the 3rd, 6th, and 9th grades at the baseline assessment. NSSI was assessed across 7 waves of data. Chronic peer and romantic stress, 5-HTTLPR, parenting behaviors, and negative attributional style were assessed at baseline. Growth mixture models, conducted to test the latent trajectory of NSSI groups, did not converge. Three NSSI groups were therefore created manually according to classifications determined a priori: no NSSI (85.1%), episodic NSSI (8.5%), and repeated NSSI (6.4%). Chronic peer and romantic stress, sex, and grade differentiated the no-NSSI vs. repeated-NSSI groups and the episodic-NSSI vs. repeated-NSSI groups. Specifically, higher levels of stress, being female, and being in higher grades were related to repeated NSSI. 5-HTTLPR differentiated the no-NSSI vs. repeated-NSSI groups, such that carrying the short allele of 5-HTTLPR was related to repeated NSSI. Exploratory analyses revealed that the relationship between attributional style and NSSI group was moderated by grade. This study suggests that chronic interpersonal peer and romantic stress is an important factor placing youth at greater risk for repeatedly engaging in NSSI.
Resumo:
This paper proposes the model of an Innovative Monitoring Network of properly interconnected nodes, an Information and Communication Technology (ICT) solution for the preventive maintenance of historical centres based on early warnings. The protection of historical centres generally ranges from large-scale monitoring down to local monitoring, and it could be supported by a single ICT solution. In more detail, the model of a virtually organised monitoring system could enable automated analyses that issue graded alert levels. An adequate ICT tool would allow a monitoring network to be defined for shared processing of data and results. A possible retrofit solution could then be planned for pilot cases shared among the nodes of the network, on the basis of a suitable procedure drawing on a retrofit catalogue. The final objective is to provide the model of an innovative tool to identify hazards, damage, and possible retrofit solutions for historical centres, offering easy early-warning support for stakeholders. The action could proactively target the needs and requirements of users, such as decision makers responsible for damage mitigation and for safeguarding cultural heritage assets.
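The graded alert levels that such a monitoring node might issue can be sketched as a simple threshold mapping from a sensor reading to a level shared across the network. Everything here is a hypothetical illustration, the abstract does not specify sensors, thresholds, or level names:

```python
# Hypothetical sketch of a monitoring node's graded alert levels: a
# sensor reading (e.g. crack displacement in mm) is mapped to a level
# against site-specific thresholds. All values and names are
# illustrative assumptions, not taken from the paper.

ALERT_THRESHOLDS = [          # (lower bound in mm, level), checked in order
    (5.0, "evacuation"),
    (2.0, "warning"),
    (0.5, "attention"),
    (0.0, "normal"),
]

def alert_level(reading_mm):
    """Map a node's sensor reading to a graded alert level."""
    for lower_bound, level in ALERT_THRESHOLDS:
        if reading_mm >= lower_bound:
            return level
    return "normal"

# The network could then share each node's level for joint analysis.
readings = {"node_A": 0.2, "node_B": 2.7}
print({node: alert_level(r) for node, r in readings.items()})
# {'node_A': 'normal', 'node_B': 'warning'}
```

In the proposed network, the shared processing of such levels across nodes is what would turn local measurements into an early warning for the whole historical centre.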