854 results for dioxin risk reduction
Abstract:
Technology involving genetic modification of crops has the potential to contribute to rural poverty reduction in many developing countries. Thus far, pesticide-producing Bacillus thuringiensis (Bt) varieties of cotton have been the main GM crops under cultivation in developing nations. Several studies have evaluated the farm-level performance of Bt varieties in comparison to conventional ones by estimating production technology, and have mostly found Bt technology to be very successful in raising output and/or reducing pesticide input. However, the production risk properties of this technology have not been studied, although they are likely to be important to risk-averse smallholders. This study investigates the output risk aspects of Bt technology by estimating two 'flexible risk' production function models allowing technology to independently affect the mean and higher moments of output. The first is the popular Just-Pope model and the second is a more general 'damage control' flexible risk model. The models are applied to cross-sectional data on South African smallholders, some of whom used Bt varieties. The results show no evidence that a 'risk-reduction' claim can be made for Bt technology. Indeed, there is some evidence to support the notion that the technology increases output risk, implying that simple (expected) profit computations used in past evaluations may overstate true benefits.
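For readers unfamiliar with the Just-Pope specification, the sketch below shows the standard two-step (feasible GLS) estimation of a model y = Xβ + h(Xα)^{1/2}ε, in which a technology dummy can shift the mean and the variance of output separately; the sign of the dummy's coefficient in the variance equation indicates whether the technology is risk-increasing or risk-reducing. The data and variable names are synthetic stand-ins, not the study's South African data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
# Synthetic smallholder data: two log inputs and a Bt-adoption dummy (illustrative only)
X = sm.add_constant(np.column_stack([
    rng.normal(size=n),          # log labour
    rng.normal(size=n),          # log pesticide
    rng.integers(0, 2, size=n),  # Bt dummy
]))
beta_true = np.array([1.0, 0.4, 0.2, 0.3])
alpha_true = np.array([-1.0, 0.0, 0.0, 0.5])   # Bt raises output variance here by design
y = X @ beta_true + np.exp(X @ alpha_true / 2) * rng.normal(size=n)

# Step 1: OLS for the mean function
ols = sm.OLS(y, X).fit()
# Step 2: regress log squared residuals on X to estimate the variance function
aux = sm.OLS(np.log(ols.resid**2 + 1e-12), X).fit()
# Step 3: FGLS re-estimation of the mean, weighting by the inverse fitted variance
weights = 1.0 / np.exp(aux.fittedvalues)
fgls = sm.WLS(y, X, weights=weights).fit()

print("variance-equation coefficient on Bt:", aux.params[3])  # positive => risk-increasing
```

In this synthetic setup the Bt dummy is built to be variance-increasing, mirroring the direction of the finding reported above.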
Abstract:
This paper investigates the potential benefits and limitations of equal- and value-weighted diversification, using the UK institutional property market as an example. To achieve this, it uses the largest sample of actual property returns currently available (392 properties), covering the period 1981 to 1996. Two approaches are adopted to evaluate these issues: first, an analysis of the correlations within sectors and regions; and second, simulations of property portfolios of increasing size constructed both naively and with value-weighting. Using these methods, it is shown that the extent of possible risk reduction is limited because of the high positive correlations between assets in any portfolio, even when naively diversified. It is also shown that portfolios exhibit high levels of variability around the average risk, suggesting that previous work seriously understates the number of properties needed to achieve a satisfactory level of diversification. The results have implications for the development and maintenance of a property portfolio because they indicate that the achievable level of risk reduction depends upon the availability of assets, the weighting system used and the investor's risk tolerance.
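The portfolio simulation the abstract describes can be illustrated with a short sketch: draw random (naive) portfolios of increasing size, compute each portfolio's time-series risk, and track both the average risk and its dispersion across simulations. The returns and capital values below are synthetic placeholders, since the actual 392-property data set is not public.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical stand-in for 392 properties x 16 annual returns (1981-1996)
returns = rng.normal(0.09, 0.10, size=(392, 16))
values = rng.lognormal(mean=0.0, sigma=1.0, size=392)   # capital values for weighting

def portfolio_risk(idx, value_weighted):
    """Std dev of the weighted portfolio's annual return series."""
    if value_weighted:
        w = values[idx] / values[idx].sum()
    else:
        w = np.full(len(idx), 1.0 / len(idx))
    return (w @ returns[idx]).std(ddof=1)

for size in (1, 5, 10, 20, 50, 100, 200):
    sims = [portfolio_risk(rng.choice(392, size, replace=False), value_weighted=False)
            for _ in range(500)]
    # 'spread' is the variability around the average risk the abstract highlights
    print(f"n={size:3d}  mean risk={np.mean(sims):.4f}  spread={np.std(sims):.4f}")
```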
Abstract:
Understanding complex social-ecological systems, and anticipating how they may respond to rapid change, requires an approach that incorporates environmental, social, economic, and policy factors, usually in a context of fragmented data availability. We employed fuzzy cognitive mapping (FCM) to integrate these factors in the assessment of future wildfire risk in the Chiquitania region, Bolivia. In this region, dealing with wildfires is becoming increasingly challenging due to reinforcing feedbacks between multiple drivers. We conducted semi-structured interviews and constructed different FCMs in focus groups to understand the regional dynamics of wildfire from diverse perspectives. We used FCM modelling to evaluate possible adaptation scenarios in the context of future drier climatic conditions. Scenarios also considered a possible failure to respond in time to the emergent risk. This approach showed great potential to support decision-making for risk management. It helped identify key forcing variables and generated insights into the potential risks and trade-offs of different strategies. All scenarios showed increased wildfire risk in the event of more droughts. The 'Hands-off' scenario resulted in amplified impacts driven by intensifying trends, particularly affecting agricultural production. The 'Fire management' scenario, which adopted a bottom-up approach to improve controlled burning, showed fewer trade-offs between wildfire risk reduction and production than the 'Fire suppression' scenario. The findings highlighted the importance of considering strategies that involve all actors who use fire, and the need to nest these strategies for a more systemic approach to managing wildfire risk. The FCM model could be used as a decision-support tool and serve as a 'boundary object' to facilitate collaboration and the integration of different forms of knowledge and perceptions of fire in the region. This approach also has the potential to support decisions in other dynamic frontier landscapes around the world that are facing an increased risk of large wildfires.
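A minimal sketch of the FCM machinery the study relies on: concepts hold activation levels in [0, 1], a signed weight matrix encodes causal influence, and the map is iterated through a squashing function until it settles; scenarios are run by clamping driver concepts. The four concepts and weights below are hypothetical, not the maps elicited in the Chiquitania focus groups.

```python
import numpy as np

def fcm_run(W, x0, clamped=None, steps=50, lam=1.0):
    """Iterate an FCM: x_{t+1} = sigmoid(x_t + W.T @ x_t), with optional clamped drivers."""
    x = x0.copy()
    for _ in range(steps):
        x = 1.0 / (1.0 + np.exp(-lam * (x + W.T @ x)))
        if clamped:
            for i, v in clamped.items():
                x[i] = v   # hold scenario drivers fixed
    return x

# Hypothetical 4-concept map; W[i, j] is the influence of concept i on concept j
concepts = ["drought", "uncontrolled_burning", "fire_management", "wildfire_risk"]
W = np.array([
    [0.0,  0.3, 0.0,  0.6],   # drought promotes burning escapes and risk
    [0.0,  0.0, 0.0,  0.7],   # uncontrolled burning raises risk
    [0.0, -0.6, 0.0, -0.4],   # management curbs burning and risk
    [0.0,  0.0, 0.2,  0.0],   # risk feeds back into management effort
])
baseline = fcm_run(W, np.full(4, 0.5))
drier = fcm_run(W, np.full(4, 0.5), clamped={0: 0.9})   # 'more droughts' scenario
print(dict(zip(concepts, np.round(drier - baseline, 3))))
```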
Abstract:
Standard models of moral hazard predict a negative relationship between risk and incentives, but empirical work has not confirmed this prediction. In this paper, we propose a model with adverse selection followed by moral hazard, where effort and the degree of risk aversion are private information of an agent who can control the mean and the variance of profits. For a given contract, more risk-averse agents supply more effort in risk reduction. If the marginal utility of incentives decreases with risk aversion, more risk-averse agents prefer lower-incentive contracts; thus, in the optimal contract, incentives are positively correlated with endogenous risk. In contrast, if risk aversion is high enough, the possibility of reduction in risk makes the marginal utility of incentives increasing in risk aversion and, in this case, risk and incentives are negatively related.
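A stylized mean-variance rendering of the mechanism, under CARA utility and a linear contract w = a + bπ (an illustration of the logic, not necessarily the authors' exact specification): with productive effort e and risk-reduction effort s, the agent's certainty equivalent and first-order conditions are

\[
\mathrm{CE}(e,s) = a + b\,\mu(e) - c(e,s) - \tfrac{r}{2}\,b^{2}\sigma^{2}(s),
\qquad
b\,\mu'(e) = c_{e}(e,s),
\qquad
-\tfrac{r}{2}\,b^{2}\,\frac{d\sigma^{2}}{ds} = c_{s}(e,s).
\]

Since dσ²/ds < 0, the marginal return to risk-reduction effort scales with the risk-aversion parameter r, so for a given contract more risk-averse agents supply more effort in risk reduction, which drives the positive risk-incentive relation described above.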
Abstract:
Vaquero AR, Ferreira NE, Omae SV, Rodrigues MV, Teixeira SK, Krieger JE, Pereira AC. Using gene-network landscape to dissect genotype effects of TCF7L2 genetic variant on diabetes and cardiovascular risk. Physiol Genomics 44: 903-914, 2012. First published August 7, 2012; doi:10.1152/physiolgenomics.00030.2012. The single nucleotide polymorphism (SNP) within the TCF7L2 gene, rs7903146, is, to date, the most significant genetic marker associated with Type 2 diabetes mellitus (T2DM) risk. Nonetheless, its functional role in disease pathology is poorly understood. The aim of the present study was to investigate, in vascular smooth muscle cells from 92 patients undergoing aortocoronary bypass surgery, the contribution of this SNP to T2DM using expression-level and expression-correlation comparison approaches, which were visually represented as gene interaction networks. Initially, the expression levels of 41 genes (seven TCF7L2 splice forms and 40 other T2DM-relevant genes) were compared between the rs7903146 wild-type (CC) and T2DM-risk (CT + TT) genotype groups. Next, we compared the expression correlation patterns of these 41 genes between groups to observe whether the relationships between genes differed. Five TCF7L2 splice forms and nine genes showed significant expression differences between groups. The RXR alpha gene showed the most altered expression correlation pattern with other genes. Therefore, T2DM risk alleles appear to influence the expression of TCF7L2 splice forms in vascular smooth muscle cells, and the RXR alpha gene emerges as a candidate treatment target for risk reduction in individuals at high risk of developing T2DM, especially those harboring TCF7L2 risk genotypes.
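The expression-correlation comparison between genotype groups can be illustrated with a Fisher z-test for the difference between two independent correlations; a sketch with hypothetical numbers (the group sizes and correlations below are not taken from the paper):

```python
import numpy as np
from scipy import stats

def corr_diff_z(r1, n1, r2, n2):
    """Fisher z-test for the difference between two independent Pearson correlations."""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)       # Fisher transform
    se = np.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))     # SE of the difference
    z = (z1 - z2) / se
    return z, 2 * stats.norm.sf(abs(z))           # two-sided p-value

# Hypothetical example: a gene-pair correlation in CC carriers vs CT+TT carriers
z, p = corr_diff_z(r1=0.62, n1=40, r2=0.05, n2=52)
print(f"z = {z:.2f}, p = {p:.4f}")
```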
Abstract:
Microalbuminuria is an established risk factor for renal disease, especially in the diabetic population. Recent studies have shown that microalbuminuria also has highly relevant predictive value for cardiovascular morbidity and mortality. Cardiovascular risk increases continuously and markedly with albuminuria, from normal levels through to overt proteinuria. This association is independent of other "classical" cardiovascular risk factors such as hypertension, hyperlipidemia or smoking. Furthermore, it has predictive value not only for patients with diabetic or renal disease, but also for hypertensive individuals and the general population. Angiotensin-converting enzyme inhibitors and angiotensin receptor blockers have been shown to display not only reno- but also cardioprotective effects. Their unique ability to lower albuminuria by 40% is related to a significant risk reduction in cardiovascular mortality. New clinical trials are needed to define "normal" albuminuria levels and how low we should go.
Abstract:
BACKGROUND: Though guidelines emphasize low-density lipoprotein cholesterol (LDL-C) lowering as an essential strategy for cardiovascular risk reduction, achieving target levels may be difficult. PATIENTS AND METHODS: The authors conducted a prospective, controlled, open-label trial examining the effectiveness and safety of high-dose fluvastatin or a standard dosage of simvastatin plus ezetimibe, both with an intensive guideline-oriented cardiac rehabilitation program, in achieving the new ATP III LDL-C targets in patients with proven coronary artery disease. 305 consecutive patients were enrolled in the study and divided into two groups: simvastatin (40 mg/d) plus ezetimibe (10 mg/d), and fluvastatin only (80 mg/d). Patients in both study groups received the treatment for 21 days in addition to nonpharmacological measures, including advanced physical, dietary, psychosocial, and educational activities. RESULTS: After 21 days of treatment, a significant reduction in LDL-C relative to initial values was found in both study groups; however, the reduction was significantly stronger in the simvastatin plus ezetimibe group: simvastatin plus ezetimibe decreased LDL-C to a mean level of 57.7 +/- 1.7 mg/dl, while fluvastatin achieved a reduction to 84.1 +/- 2.4 mg/dl (p < 0.001). In the simvastatin plus ezetimibe group, 95% of the patients reached the target level of LDL-C < 100 mg/dl, a significantly higher percentage than in patients treated with fluvastatin alone (75%; p < 0.001). The greater effectiveness of simvastatin plus ezetimibe was even more impressive for the optional goal of LDL-C < 70 mg/dl (75% vs. 32%; p < 0.001). There was no difference in the occurrence of adverse events between the two groups. CONCLUSION: Simvastatin 40 mg/d plus ezetimibe 10 mg/d, on the background of a guideline-oriented standardized intensive cardiac rehabilitation program, can reach 95% effectiveness in achieving challenging goals (LDL-C < 100 mg/dl) with lipid-lowering medication in patients at high cardiovascular risk.
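The headline comparison of target-attainment rates (95% vs. 75%) can be checked with a two-proportion z-test; in the sketch below the arm sizes are assumed to be roughly equal halves of the 305 patients, which the abstract does not state:

```python
from statsmodels.stats.proportion import proportions_ztest

# Assumed arm sizes (the paper reports 305 patients in total, split unspecified)
n_combo, n_fluva = 153, 152
hit_combo = round(0.95 * n_combo)   # reached LDL-C < 100 mg/dl on simvastatin + ezetimibe
hit_fluva = round(0.75 * n_fluva)   # reached the target on fluvastatin alone

z, p = proportions_ztest([hit_combo, hit_fluva], [n_combo, n_fluva])
print(f"z = {z:.2f}, p = {p:.2g}")   # consistent with the reported p < 0.001
```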
Abstract:
Starting with an overview of losses due to mountain hazards in the Russian Federation and the European Alps, the question is raised of why a substantial number of events are still recorded despite considerable efforts in hazard mitigation and risk reduction. The main reason for this paradox is the lack of a dynamic, risk-based approach, and it is shown that these dynamics have different roots: firstly, by neglecting climate change and systems dynamics, the development of hazard scenarios remains based on the static approach of design events; secondly, due to economic development and population dynamics, the exposed elements at risk are subject to spatial and temporal changes. These issues are discussed with respect to temporal and spatial demands. As a result, it is shown how risk is dynamic on both long-term and short-term scales, which has to be acknowledged in the risk concept if that concept is targeted at a sustainable development of mountain regions. A conceptual model is presented that can be used for dynamical risk assessment, and different management strategies illustrate how this model may be put into practice. Furthermore, the interconnectedness and interaction between hazard and risk are addressed in order to enhance prevention, the level of protection and the degree of preparedness.
Abstract:
The purpose of this dissertation was to examine the relationship between key psychosocial and behavioral components of the Transtheoretical Model and the Theory of Reasoned Action for sexual risk reduction in a population of crack cocaine smokers and sex workers not in drug treatment. The first study examined the association between two principal constructs of the Transtheoretical Model, the processes of change and the stages of change for condom use, in this high-risk population. In the analysis of variance for all respondents, the overall F-test revealed that people in different stages have different levels of experiential process use, F(3,317) = 17.79, p = 0.0001, and different levels of behavioral process use, F(3,317) = 28.59, p = 0.0001. For the experiential processes, there was a significant difference between the precontemplation/contemplation stage and both the action and maintenance stages. The second study explored the relationship between the Theory of Reasoned Action “beliefs” and the stages of change in the same population. In the analysis of variance for all participants, the results indicate that people in different stages valued the positive beliefs differently, F(3,502) = 15.38, p = .0001, but did not value the negative beliefs differently, F(3,502) = 2.08, p = .10. The third study explored differences in stage of change by gender, partner type, drug use, and HIV status. Three discriminant functions emerged, with a combined χ2(12) = 139.57, p < .0001. The loading matrix of correlations between predictors and discriminant functions demonstrates that the strongest predictor for distinguishing between the precontemplation/contemplation stage and the preparation, action, and maintenance stages (first function) is partner type (.962). The loadings on the second discriminant function suggest that once partner type has been accounted for, ever having HIV/AIDS (.935) was the best predictor for distinguishing between the first three stages and the maintenance stage. These studies demonstrate that behavioral change theories can contribute important insight to researchers and program planners attempting to alter HIV risk behavior in high-risk populations.
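The stage-group comparisons reported here are one-way ANOVAs; the sketch below reproduces the design of the first F-test, F(3, 317), on synthetic scores (group sizes chosen only so the degrees of freedom match; means and SDs are invented):

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(2)
# Hypothetical process-of-change scores for four stage-of-change groups (N = 321)
pc_cont = rng.normal(2.8, 0.8, 120)   # precontemplation/contemplation
prep    = rng.normal(3.1, 0.8, 70)    # preparation
action  = rng.normal(3.5, 0.8, 65)    # action
maint   = rng.normal(3.6, 0.8, 66)    # maintenance

F, p = f_oneway(pc_cont, prep, action, maint)
print(f"F(3, {120 + 70 + 65 + 66 - 4}) = {F:.2f}, p = {p:.4f}")
```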
Abstract:
BACKGROUND Prior epidemiologic studies suggest an inverse relation between diabetes and glioma risk, but the underlying mechanisms, including the use of antidiabetic drugs, are unknown. METHODS We therefore performed a matched case-control analysis using the Clinical Practice Research Datalink (CPRD). We identified incident glioma cases diagnosed between 1995 and 2012 and matched each case with 10 controls on age, gender, calendar time, general practice, and years of active history in the CPRD. We performed conditional logistic regression to estimate odds ratios (ORs) with 95% CIs, adjusted for body mass index and smoking. RESULTS We identified 2005 cases and 20 050 controls. Diabetes was associated with a decreased risk of glioma (OR = 0.74; 95% CI = 0.60-0.93), particularly glioblastoma (OR = 0.69; 95% CI = 0.51-0.94). The glioblastoma risk reduction was markedly pronounced among diabetic men (OR = 0.60; 95% CI = 0.40-0.90), most apparently for those with diabetes of long duration (OR for >5 vs 0 y = 0.46; 95% CI = 0.26-0.82) or poor glycemic control (OR for HbA1c ≥8 vs <6.5% = 0.20; 95% CI = 0.06-0.70). In contrast, the effect of diabetes on glioblastoma risk was absent among women (OR = 0.85; 95% CI = 0.53-1.36). No significant associations with glioma were found for use of metformin (OR for ≥30 vs 0 prescriptions = 0.72; 95% CI = 0.38-1.39), sulfonylureas (OR = 0.71; 95% CI = 0.39-1.30), or insulin (OR = 0.79; 95% CI = 0.37-1.69). CONCLUSIONS Antidiabetic treatment appears to be unrelated to glioma risk, but long diabetes duration and elevated HbA1c were both associated with decreased glioma risk. The stronger findings in men than in women suggest low androgen levels concurrent with diabetes as a biologic mechanism.
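As a quick numeracy check on results of this kind, an odds ratio and its Wald confidence interval come directly from a logistic regression coefficient and its standard error; the sketch below backs the implied standard error out of the reported glioma estimate (OR = 0.74; 95% CI = 0.60-0.93):

```python
import numpy as np

def odds_ratio_ci(beta, se, crit=1.96):
    """OR and Wald 95% CI from a (conditional) logistic regression coefficient."""
    return np.exp(beta), np.exp(beta - crit * se), np.exp(beta + crit * se)

# Back out the SE implied by the reported interval, then reproduce it
beta = np.log(0.74)
se = (np.log(0.93) - np.log(0.60)) / (2 * 1.96)
print(odds_ratio_ci(beta, se))   # ~ (0.74, 0.59, 0.92), matching up to rounding
```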
Abstract:
Background. Primary liver cancer, the majority of which is hepatocellular carcinoma (HCC), is the third most common cause of cancer mortality. It has one of the worst prognoses, with an overall 5-year survival of only 5-6%. Hepatocellular carcinoma shows wide variation in geographic distribution, and there are marked differences in incidence between races and genders. Countries with previously low rates, including the US, have seen the incidence of HCC double during the past two decades. Even though the incidence of HCC is higher in males than in females, female hormones, especially estrogens, have been postulated to play a role in the development of hepatocellular carcinoma at the molecular level. Despite the frequent use of oral contraceptive pills (OCP) and, previously, hormone replacement therapy (HRT), their role in HCC development has not been studied thoroughly. We aim to examine the association between exogenous hormone intake (oral contraceptives and post-menopausal hormone replacement therapy) and the development of HCC. Methods. This study is part of an ongoing hospital-based case-control study conducted at the Department of Gastrointestinal Oncology at The University of Texas M. D. Anderson Cancer Center. From January 2005 to January 2008, a total of 77 women with pathologically confirmed hepatocellular carcinoma (cases) and 277 healthy women (controls) were included in the investigation. Information about the use of hormonal contraceptives, hormone replacement therapy, and risk factors for hepatocellular cancer was collected by personal interview. Univariate and multivariate logistic regression analyses were performed to estimate crude odds ratios (OR) and adjusted odds ratios (AOR). Results. We found a statistically significant protective effect of HRT use on the development of HCC, AOR = 0.42 (95% CI, 0.21, 0.81). The significance was observed for estrogen replacement, AOR = 0.43 (95% CI, 0.22, 0.83), but not for progesterone replacement, AOR = 0.49 (95% CI, 0.10, 2.35). On the other hand, any hormonal contraceptive use, which encompasses oral contraceptive pills, implants and injections, did not reach statistical significance in either the crude OR = 0.58 (95% CI, 0.33, 1.01) or the AOR = 0.56 (95% CI, 0.26, 1.18). Conclusions. As corroborated by previous studies, HRT confers a 58% HCC risk reduction among American women. The more important question of the association between hormonal contraceptives and HCC remains controversial. Further studies are warranted to explore the mechanism of the protective effect of HRT and the relationship between hormonal contraception and HCC.
Abstract:
This study described home infusion techniques and practices, measured the perceived risk of HIV and hepatitis transmission to self and others, and measured the outcome expectancy of following risk reduction guidelines for 90 hemophilia patients and/or their infusion assistants. It also assessed general knowledge of HIV and hepatitis information for the same population. The study subjects were hemophilia patients or their infusion assistants from the Gulf States Hemophilia Center in Houston, the El Paso Satellite Hemophilia Clinic in El Paso, or Texas members of the Women Outreach Network of the National Hemophilia Foundation (WONN) group. Each subject was interviewed either by telephone or in person. The questionnaire, developed for the study, consisted of 60 items. These items assessed general demographics of the patients and assistants, including questions about their training to do infusions as well as their actual practices; measured the assistants' perceived personal risk of HIV or hepatitis transmission; measured the perceived risk of transmitting HIV or hepatitis to others, for both assistants and self-infusers; and measured the outcome expectancy of following recommended risk reduction guidelines, also for both groups. The theoretical framework used assumed that perceived risk and outcome expectancy would be predictive of behavior. The findings did not support this theory. Instead, they suggest that infusion behavior is habitual in nature; most respondents perform exactly the same behavior for every infusion. Since none of the selected variables were predictive of compliance behavior for home infusion, teaching should be directed towards mastery learning, that is, learning that will incorporate the correct behavior into a habitual pattern of home infusion.
Abstract:
The study aim was to determine whether using automated side loader (ASL) trucks in higher proportions, compared to other types of trucks, for residential waste collection results in lower injury rates (from all causes). The primary hypothesis was that the risk of injury was lower for workers who work with ASL trucks than for workers who work with other types of trucks used in residential waste collection. To test this hypothesis, data were collected from one of the nation's largest companies in the solid waste management industry. Different local operating units (i.e., facilities) in the company used different types of trucks to varying degrees, which created a special opportunity to examine refuse collection injuries and illnesses and the risk-reduction potential of ASL trucks. The study design was ecological and analyzed end-of-year data provided by the company for calendar year 2007. During 2007, there were a total of 345 facilities providing residential services; each facility represented one observation. The dependent variable, injury and illness rate, was defined as a facility's total case incidence rate (TCIR) recorded in accordance with federal OSHA requirements for the year 2007. The TCIR is the rate of total recordable injury and illness cases per 100 full-time workers. The independent variable, percent of ASL trucks, was calculated by dividing the number of ASL trucks by the total number of residential trucks at each facility. Multiple linear regression models were estimated for the impact of the percent of ASL trucks on TCIR per facility. Adjusted analyses included three covariates: median number of hours worked per week for residential workers; median number of months of work experience for residential workers; and median age of residential workers. All analyses were performed with the statistical software Stata IC (version 11.0). The analyses included three approaches to classifying the exposure, percent of ASL trucks. The first approach included two levels of exposure: (1) 0% and (2) >0 - <100%. The second approach included three levels: (1) 0%, (2) ≥1 - <100%, and (3) 100%. The third approach included six levels to improve detection of a dose-response relationship: (1) 0%, (2) 1 to <25%, (3) 25 to <50%, (4) 50 to <75%, (5) 75 to <100%, and (6) 100%. None of the relationships between injury and illness rate and percent ASL truck exposure levels was statistically significant (i.e., p < 0.05), even after adjustment for all three covariates. In summary, the present study found some indication of a risk-reduction effect of ASL trucks, but it was not statistically significant. The covariates showed a varied yet more modest impact on the injury and illness rate, and again none of the relationships between the injury and illness rate and the covariates was statistically significant (i.e., p < 0.05). As an ecological study, the present study has the limitations inherent in such designs and warrants replication in an individual-level cohort design; stronger conclusions are not warranted.
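For concreteness, the two headline variables have simple closed forms; the OSHA total case incidence rate uses the 200,000-hour convention (100 full-time workers at 40 hours/week for 50 weeks). The facility figures below are invented for illustration:

```python
def tcir(recordable_cases: int, hours_worked: float) -> float:
    """OSHA total case incidence rate: recordable cases per 100 full-time workers
    (100 workers x 40 h/week x 50 weeks = 200,000 hours)."""
    return recordable_cases * 200_000 / hours_worked

def pct_asl(asl_trucks: int, total_residential_trucks: int) -> float:
    """Exposure measure: share of a facility's residential fleet that is ASL."""
    return 100.0 * asl_trucks / total_residential_trucks

# Hypothetical facility: 6 recordable cases over 410,000 hours; 14 of 20 trucks are ASL
print(tcir(6, 410_000), pct_asl(14, 20))
```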
Abstract:
This document addresses methodologies for computing the collision risk of a satellite. Collision risk minimisation must be approached from two different standpoints. On an operational basis, it is necessary to sieve, from all the objects sharing space with an operational satellite, those that may present an encounter. As the orbits of both the satellite and the potential collider are only estimated rather than perfectly known, the encounter geometry and the actual collision risk must be evaluated. Based on that geometry or risk, a manoeuvre may be required to avoid the conjunction. Such manoeuvres consume fuel budgeted for orbit maintenance and may thus reduce the satellite's operational lifetime. The avoidance-manoeuvre fuel budget must therefore be estimated at the mission design phase for a better estimation of the mission lifetime, especially for satellites orbiting in very populated orbital regimes. These two aspects, mission design and operational collision risk, are summarised in Figure 3 and covered in this thesis. The bottom part of the figure identifies the aspects to be considered in the mission design phase (statistical characterisation of the space-object population and the theory for computing the mean number of encountered events and the risk-reduction capability), which define the most appropriate collision avoidance approach for the operational phase. This part is covered by starting from the theory described in [Sánchez-Ortiz, 2006]T.14 and implemented by this author in the ARES tool [Sánchez-Ortiz, 2004b]T.15, provided by ESA for the evaluation of collision avoidance approaches. This methodology has now been extended to account for the particular features of the data sets available in the operational environment (section 4.3.3). Additionally, the formulation has been extended to allow evaluating maximum collision risk when orbital uncertainty is not available (as in the TLE case) and when only catastrophic collisions are under study (section 4.3.2.3). These improvements have been included in the new version of the ESA ARES tool [Domínguez-González and Sánchez-Ortiz, 2012b]T.12, available through [SDUP,2014]R.60. In the operational phase, catalogue data are processed on a routine basis with adequate collision risk computation algorithms to propose conjunction avoidance manoeuvres optimised for every event. The optimisation of manoeuvres on an operational basis is not addressed in this document. Currently, the American Two-Line Element (TLE) catalogue is the only public source of orbital data for identifying potential conjunction events. Additionally, a Conjunction Summary Message (CSM) is provided by the Joint Space Operation Center (JSpOC) when the American surveillance system identifies a possible collision between satellites and debris.
Depending on the data used for collision avoidance evaluation (TLE or CSM), the avoidance approach may differ. The main features of the currently available data need to be analysed (with regard to accuracy) in order to estimate the encounters a satellite may face along its mission lifetime. In the case of TLE, as these data are not provided with accuracy information, operational collision avoidance may also rely on statistical accuracy information such as that used in the mission design approach. This is not the case for CSM data, which include the state vectors and orbital accuracies of the two objects involved. This aspect has been analysed in detail, evaluating statistically the characteristics of both data sets with regard to the main aspects of collision avoidance. Once this analysis was completed, the impact of those features on the most suitable avoidance approaches was investigated (section 5.1). This analysis is published in a peer-reviewed journal [Sánchez-Ortiz, 2015b]T.3. It provides recommendations for different mission types (satellite size and orbital regime) on the most appropriate collision avoidance approach for relevant risk reduction. The risk-reduction capability depends strongly on the accuracy of the catalogue used to identify potential collisions, and approaches based on CSM data are recommended over TLE-based ones. Some approaches based on the maximum risk associated with envisaged encounters are shown to report a very large number of events, which makes them unsuitable for operational activities. Accepted Collision Probability Levels (ACPL) are recommended for defining the avoidance strategies of different mission types. For example, for a LEO satellite in the Sun-synchronous regime, the typically used ACPL value of 10^-4 is not suitable for collision avoidance schemes based on TLE data: the risk-reduction capacity is almost null (due to the large uncertainties associated with TLE data sets, even for short times to event). For a significant risk reduction when using TLE data, an ACPL on the order of 10^-6 (or lower) seems to be required, producing about 10 warnings per year and mission (if one-day-ahead events are considered) or 100 warnings per year (for three-day-ahead estimations). Thus, the main conclusion from these results is that TLE data are not adequate for a proper collision avoidance approach. On the contrary, for CSM data, thanks to the better accuracy of the orbital information compared with TLE, an ACPL on the order of 10^-4 allows the risk to be reduced significantly for events estimated up to 3 days ahead. Even 5-day-ahead events can be considered, but ACPL values down to 10^-5 should then be used. Larger prediction times can also be considered (7 days) for a risk reduction of about 90%, at the cost of a larger number of warnings, up to 5 events per year, whereas 5-day predictions keep the manoeuvre rate at about 2 manoeuvres per year. The dynamics of GEO orbits differ from those of LEO, resulting in a slower growth of orbit uncertainty over time. In contrast, uncertainties at short prediction times in this regime are larger than in LEO due to differences in observation capabilities.
Additionally, it has to be taken into account that the short prediction times feasible in LEO may not be appropriate for a GEO mission, as the orbital period is much longer in this regime. In the case of TLE data sets, a significant risk reduction is only achieved for small ACPL values, producing about one warning event per year if warnings are raised one day in advance of the event (too short a time for any reaction to be considered). Suitable ACPL values would lie between 5×10^-8 and 10^-7, well below the values used in current operations for most GEO missions (TLE-based collision avoidance strategies are not recommended in this regime). On the contrary, CSM data allow a good risk reduction with an ACPL between 10^-5 and 10^-4 for short and medium prediction times; 10^-5 is recommended for prediction times of five or seven days. The number of events raised for a suitable warning time of seven days would be about one in a 10-year mission. It must be noted that these results correspond to a spacecraft of about 2 m radius; the impact of satellite size is also analysed in the thesis. In the future, other Space Situational Awareness (SSA) systems, such as the ESA SSA programme, may provide additional catalogues of objects in space with the aim of reducing the risk. It is necessary to investigate the performances required of those catalogues to allow such risk reduction. The main performance aspects are coverage (the objects included in the catalogue, mainly limited by a minimum object size derived from sensor performance) and the accuracy of the orbital data needed to evaluate conjunctions reliably (derived from sensor performance in terms of observation frequency and measurement accuracy). The results of these investigations (section 5.2) are published in a peer-reviewed journal [Sánchez-Ortiz, 2015a]T.2. This aspect was not initially foreseen as an objective of the thesis, but it shows how the theory described here, initially defined for mission design with regard to avoidance-manoeuvre fuel allocation (upper part of Figure 1), can be extended to serve additional purposes, such as dimensioning a Space Surveillance and Tracking (SST) system (bottom part of Figure 1). The main difference between the two analyses is that the catalogue features (accuracy and size of observable objects) are fixed inputs in the mission design case, whereas they are variables to be determined in the SST design case. Regarding the outputs, all the quantities computed in the statistical conjunction analysis are important for mission design (with the objective of defining the avoidance strategy and the fuel allocation), whereas for SST design the most relevant aspects are the manoeuvre and false-alarm rates (defining the reliability of the system) and the risk-reduction capability (driving its effectiveness). Regarding the methodology for computing risk, an SST system should be driven by its capacity to avoid catastrophic conjunction events (thus avoiding a dramatic increase of the debris population), whereas satellite mission design should consider all types of encounters, as the operator is interested in avoiding both lethal and catastrophic collisions. From the analysis of the SST features (object coverage and orbital uncertainty) required for a reliable system, it is concluded that these two characteristics must be imposed differently for the different orbital regimes, as the population level depends on the orbit type.
Coverage requirements range from objects of 5 cm radius in the very populated LEO regime up to 100 cm in the GEO region, to cover catastrophic collisions. This difference derives mainly from the relative velocities of encounters in the two regimes. Regarding the orbital knowledge in the catalogues, very accurate information is required for objects in the LEO region in order to limit the number of false alarms, whereas intermediate orbital accuracy can be accepted for higher orbital regimes. Regarding operational collision avoidance approaches, several algorithms exist for evaluating the collision risk of a pair of objects. Figure 2 summarises the different collision risk algorithm cases and indicates how they are covered in this document. The typical case of high relative velocity and spherical objects (case A) is well covered in the literature, with a large number of available algorithms; it is not analysed in detail in this work, and only a sample case is provided in section 4.2. If complex geometries are considered (case B), a more realistic risk evaluation can be computed. A new approach for evaluating risk in the case of complex geometries is presented in this thesis (section 4.4.2) and has been presented at several international conferences. The developed algorithm evaluates the collision risk for complex objects formed by a set of boxes. A dedicated Monte Carlo method has also been described (section 4.1.2.3) and implemented to detect actual collisions among a large number of simulation shots; these Monte Carlo runs are considered the truth against which the algorithm results are compared (section 4.4.4). For spacecraft that cannot be considered spheres, accounting for the real geometry of the objects makes it possible to discard events that are not real conjunctions, or to estimate the risk associated with an event more reliably. This is of particular importance for large spacecraft, as the position uncertainties in current catalogues are not small enough to make a difference for objects below metre size. As tracking systems improve and the orbits of catalogued objects become more precisely known, considering the actual shapes of the objects will become increasingly relevant. The particular case of a very large system (a tethered satellite) is analysed in section 5.4. Additionally, if the two colliding objects have a low relative velocity (and simple geometries, case C in Figure 2), the most common collision risk algorithms fail and adequate theories need to be applied. In this document, a low-relative-velocity algorithm from the literature [Patera, 2001]R.26 is described and evaluated (section 4.5); its evaluation against a Monte Carlo approach is provided in section 4.5.2. The main conclusion of this analysis is that the algorithm is suitable for the most common encounter characteristics, and it is therefore selected for collision risk estimation. Its performance is assessed to characterise when it can be safely used. In particular, the need for dedicated low-velocity algorithms is found to depend on both the size of the collision volume projected onto the encounter plane (B-plane) and the miss-distance uncertainty: for large uncertainties such algorithms become more necessary, since the interval during which the error ellipsoids of the two objects can intersect is longer.
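For orientation, the simplest of these cases (spherical objects at high relative velocity, case A) reduces to integrating a bivariate Gaussian over the combined-radius circle in the B-plane, which is straightforward to estimate by Monte Carlo. The sketch below is this textbook formulation with made-up encounter numbers, not the thesis's complex-geometry or low-velocity algorithms:

```python
import numpy as np

def collision_probability_mc(mu, cov, combined_radius, n=1_000_000, seed=0):
    """Monte Carlo estimate of the spherical-object (case A) collision probability:
    the chance that the B-plane miss vector, distributed N(mu, cov), falls inside
    the circle of the two objects' combined radius."""
    rng = np.random.default_rng(seed)
    samples = rng.multivariate_normal(mu, cov, size=n)
    return np.mean(np.hypot(samples[:, 0], samples[:, 1]) < combined_radius)

# Hypothetical encounter: ~78 m nominal miss distance, anisotropic position
# uncertainty, and a 2 m + 1 m combined radius (all values illustrative)
mu = np.array([50.0, 60.0])               # metres, in the encounter B-plane
cov = np.diag([40.0**2, 25.0**2])         # 1-sigma errors of 40 m and 25 m
print(collision_probability_mc(mu, cov, combined_radius=3.0))
```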
Additionally, its application to the case of complex satellite geometries is assessed (case D in Figure 2) by integrating the algorithm developed in this thesis with Patera's formulation for low-relative-velocity encounters. The results of this analysis show that the algorithm can easily be extended into a collision risk estimation process suitable for complex-geometry objects (section 4.5.3). The two algorithms, together with the Monte Carlo method for complex geometries, have been implemented in the ESA operational tool CORAM, which is used to evaluate the collision risk of ESA-operated missions [Sánchez-Ortiz, 2013a]T.11. This fact shows the interest and relevance of the developed algorithms for the improvement of satellite operations. The algorithms have been presented at several international conferences [Sánchez-Ortiz, 2013b]T.9, [Pulido, 2014]T.7, [Grande-Olalla, 2013]T.10, [Pulido, 2014]T.5, [Sánchez-Ortiz, 2015c]T.1.