967 resultados para Low frequency


Relevância:

60.00%

Publicador:

Resumo:

The Hall Effect Thruster (HET) is a type of satellite electric propulsion device developed independently by the USA and the former USSR starting in the 1960s. Development continued largely out of public view in the Soviet Union during the 1970s and reached technological maturity in the 1980s. In the 1990s the advanced state of this Russian technology became known in Western countries, which quickly resumed the analysis and development of modern Hall thrusters. Today several companies in the USA, Russia and Europe manufacture Hall thrusters for operational use. Their main applications are low-thrust propulsion of interplanetary probes, orbit raising of satellites and station-keeping of geostationary satellites. Despite this well-proven in-flight record, however, the physics of the Hall thruster is still not completely understood. Over the last two decades large efforts have been devoted to understanding the physics of Hall Effect thrusters, yet the so-called anomalous diffusion, shorthand for an excessive electron conductivity along the thruster, cannot be explained by classical collisional theories. One commonly accepted explanation is the existence of azimuthal oscillations with correlated plasma-density and electric-field fluctuations. In fact, there is experimental evidence of an azimuthal oscillation in the low frequency range (a few kHz). This oscillation, usually called the spoke, was first detected empirically by Janes and Lowder in the 1960s, and several more recent experiments have shown the same type of oscillation in various modern Hall thrusters. Given the frequency range, ionization is likely to drive the spoke oscillation, as it does the breathing-mode oscillation. In the high frequency range (a few MHz), electron-drift azimuthal oscillations have been detected in recent experiments, in line with the oscillations measured by Esipchuk and Tilinin in the 1970s. Although these low and high frequency azimuthal oscillations have been known for some time, the physics behind them is still unclear and their possible relation to the anomalous diffusion process remains unknown. This work analyses, theoretically and through computer simulations, the possible relation between the azimuthal oscillations and the anomalous electron transport in HETs. To achieve this main objective, two approaches are considered: local linear stability analyses and global linear stability analyses. Local linear stability analyses make it possible to identify the dominant terms promoting the oscillations, but they do not properly account for the axial variation of the plasma properties along the thruster. Global linear stability analyses, on the other hand, do account for these axial variations and make it possible to determine how the azimuthal oscillations are promoted and their possible relation to the electron transport.
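As a hedged illustration of the second approach mentioned above: after discretising the linearised equations about an axially varying base state, a global linear stability analysis reduces to a generalised eigenvalue problem A q = omega B q, whose complex eigenvalues give oscillation frequencies and growth rates. The sketch below solves such a problem with placeholder matrices; no Hall-thruster physics is encoded, and the matrix size, operators and sign convention are assumptions for illustration only.

import numpy as np
from scipy.linalg import eig

n = 50                                   # number of discretisation nodes (placeholder)
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))          # stand-in for the discretised linearised operator
B = np.eye(n)                            # stand-in for the mass matrix

omega, modes = eig(A, B)                 # complex eigenvalues omega and eigenvectors
growth = omega.imag                      # with q ~ exp(-i*omega*t), Im(omega) > 0 is unstable
k = np.argmax(growth)
print(f"most unstable mode: omega = {omega[k]:.3f} (growth rate {growth[k]:.3f})")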

Relevância:

60.00%

Publicador:

Resumo:

La presente investigación tiene como objetivo principal diseñar un Modelo de Gestión de Riesgos Operacionales (MGRO) según las Directrices de los Acuerdos II y III del Comité de Supervisión Bancaria de Basilea del Banco de Pagos Internacionales (CSBB-BPI). Se considera importante realizar un estudio sobre este tema dado que son los riesgos operacionales (OpR) los responsables en gran medida de las últimas crisis financieras mundiales y por la dificultad para detectarlos en las organizaciones. Se ha planteado un modelo de gestión subdividido en dos vías de influencias. La primera acoge el paradigma holístico en el que se considera que hay múltiples maneras de percibir un proceso cíclico, así como las herramientas para observar, conocer y entender el objeto o sujeto percibido. La segunda vía la representa el paradigma totalizante, en el que se obtienen datos tanto cualitativos como cuantitativos, los cuales son complementarios entre si. Por otra parte, este trabajo plantea el diseño de un programa informático de OpR Cualitativo, que ha sido diseñado para determinar la raíz de los riesgos en las organizaciones y su Valor en Riesgo Operacional (OpVaR) basado en el método del indicador básico. Aplicando el ciclo holístico al caso de estudio, se obtuvo el siguiente diseño de investigación: no experimental, univariable, transversal descriptiva, contemporánea, retrospectiva, de fuente mixta, cualitativa (fenomenológica y etnográfica) y cuantitativa (descriptiva y analítica). La toma de decisiones y recolección de información se realizó en dos fases en la unidad de estudio. En la primera se tomó en cuenta la totalidad de la empresa Corpoelec-EDELCA, en la que se presentó un universo estadístico de 4271 personas, una población de 2390 personas y una unidad de muestreo de 87 personas. Se repitió el proceso en una segunda fase, para la Central Hidroeléctrica Simón Bolívar, y se determinó un segundo universo estadístico de 300 trabajadores, una población de 191 personas y una muestra de 58 profesionales. Como fuentes de recolección de información se utilizaron fuentes primarias y secundarias. Para recabar la información primaria se realizaron observaciones directas, dos encuestas para detectar las áreas y procesos con mayor nivel de riesgos y se diseñó un cuestionario combinado con otra encuesta (ad hoc) para establecer las estimaciones de frecuencia y severidad de pérdidas operacionales. La información de fuentes secundarias se extrajo de las bases de datos de Corpoelec-EDELCA, de la IEA, del Banco Mundial, del CSBB-BPI, de la UPM y de la UC at Berkeley, entre otras. Se establecieron las distribuciones de frecuencia y de severidad de pérdidas operacionales como las variables independientes y el OpVaR como la variable dependiente. No se realizó ningún tipo de seguimiento o control a las variables bajo análisis, ya que se consideraron estas para un instante especifico y solo se determinan con la finalidad de establecer la existencia y valoración puntual de los OpR en la unidad de estudio. El análisis cualitativo planteado en el MGRO, permitió detectar que en la unidad de investigación, el 67% de los OpR detectados provienen de dos fuentes principales: procesos (32%) y eventos externos (35%). Adicionalmente, la validación del MGRO en Corpoelec-EDELCA, permitió detectar que el 63% de los OpR en la organización provienen de tres categorías principales, siendo los fraudes externos los presentes con mayor regularidad y severidad de pérdidas en la organización. 
La exposición al riesgo se determinó fundamentándose en la adaptación del concepto de OpVaR que generalmente se utiliza para series temporales y que en el caso de estudio presenta la primicia de aplicarlo a datos cualitativos transformados con la escala Likert. La posibilidad de utilizar distribuciones de probabilidad típicas para datos cuantitativos en distribuciones de frecuencia y severidad de pérdidas con datos de origen cualitativo fueron analizadas. Para el 64% de los OpR estudiados se obtuvo que la frecuencia tiene un comportamiento semejante al de la distribución de probabilidad de Poisson y en un 55% de los casos para la severidad de pérdidas se obtuvo a las log-normal como las distribuciones de probabilidad más comunes, con lo que se concluyó que los enfoques sugeridos por el BCBS-BIS para series de tiempo son aplicables a los datos cualitativos. Obtenidas las distribuciones de frecuencia y severidad de pérdidas, se convolucionaron estas implementando el método de Montecarlo, con lo que se obtuvieron los enfoques de distribuciones de pérdidas (LDA) para cada uno de los OpR. El OpVaR se dedujo como lo sugiere el CSBB-BPI del percentil 99,9 o 99% de cada una de las LDA, obteniéndose que los OpR presentan un comportamiento similar al sistema financiero, resultando como los de mayor peligrosidad los que se ubican con baja frecuencia y alto impacto, por su dificultad para ser detectados y monitoreados. Finalmente, se considera que el MGRO permitirá a los agentes del mercado y sus grupos de interés conocer con efectividad, fiabilidad y eficiencia el status de sus entidades, lo que reducirá la incertidumbre de sus inversiones y les permitirá establecer una nueva cultura de gestión en sus organizaciones. ABSTRACT This research has as main objective the design of a Model for Operational Risk Management (MORM) according to the guidelines of Accords II and III of the Basel Committee on Banking Supervision of the Bank for International Settlements (BCBS- BIS). It is considered important to conduct a study on this issue since operational risks (OpR) are largely responsible for the recent world financial crisis and due to the difficulty in detecting them in organizations. A management model has been designed which is divided into two way of influences. The first supports the holistic paradigm in which it is considered that there are multiple ways of perceiving a cyclical process and contains the tools to observe, know and understand the subject or object perceived. The second way is the totalizing paradigm, in which both qualitative and quantitative data are obtained, which are complementary to each other. Moreover, this paper presents the design of qualitative OpR software which is designed to determine the root of risks in organizations and their Operational Value at Risk (OpVaR) based on the basic indicator approach. Applying the holistic cycle to the case study, the following research design was obtained: non- experimental, univariate, descriptive cross-sectional, contemporary, retrospective, mixed-source, qualitative (phenomenological and ethnographic) and quantitative (descriptive and analytical). Decision making and data collection was conducted in two phases in the study unit. The first took into account the totality of the Corpoelec-EDELCA company, which presented a statistical universe of 4271 individuals, a population of 2390 individuals and a sampling unit of 87 individuals. 
The process was repeated in a second phase for the Simon Bolivar Hydroelectric Power Plant, for which a second statistical universe of 300 workers, a population of 191 people and a sample of 58 professionals were determined. Both primary and secondary sources of information were used. The primary information was obtained through direct observations, two surveys designed to identify the areas and processes with the highest level of risk, and a questionnaire combined with an ad hoc survey to estimate the frequency and severity of operational losses. The secondary information was extracted from the databases of Corpoelec-EDELCA, the IEA, the World Bank, the BCBS-BIS, the UPM and UC Berkeley, among others. The operational loss frequency and severity distributions were taken as the independent variables and OpVaR as the dependent variable. No monitoring or control of the variables under analysis was performed, as they were considered at a single point in time and determined only to establish the existence and point-in-time valuation of the OpR in the study unit. The qualitative analysis proposed in the MORM showed that, in the research unit, 67% of the detected OpR come from two main sources: processes (32%) and external events (35%). Additionally, validation of the MORM in Corpoelec-EDELCA showed that 63% of the OpR in the organization fall into three main categories, with external fraud occurring most regularly and causing the greatest loss severity in the organization. Risk exposure was determined by adapting the concept of OpVaR, generally used for time series, to qualitative data transformed with the Likert scale, which is a novel feature of this case study. The possibility of using probability distributions that are typical for quantitative data to model loss frequency and loss severity distributions built from qualitative data was analyzed. For 64% of the OpR studied the frequency behaves similarly to a Poisson probability distribution, and in 55% of the cases the loss severity is best described by a log-normal distribution, from which it was concluded that the approaches suggested by the BCBS-BIS for time series can be applied to qualitative data. Once the loss frequency and severity distributions were obtained, they were convolved using the Monte Carlo method, yielding the loss distribution approach (LDA) for each of the OpR. The OpVaR was derived, as suggested by the BCBS-BIS, from the 99.9th or 99th percentile of each LDA. The OpR were found to behave much as in the financial system, the most dangerous being those with low frequency and high impact, because of the difficulty of detecting and monitoring them. Finally, the MORM is expected to allow market players and their stakeholders to know the status of their entities effectively, reliably and efficiently, which will reduce the uncertainty of their investments and enable them to establish a new management culture in their organizations.
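As a minimal sketch of the loss distribution approach (LDA) and OpVaR computation summarised above, assuming a Poisson frequency and log-normal severity with invented parameters (they are not the figures estimated in the study):

import numpy as np

rng = np.random.default_rng(42)
n_years = 100_000                 # simulated years of operational losses
lam = 12.0                        # mean number of loss events per year (assumed)
mu, sigma = 9.0, 1.2              # log-normal severity parameters (assumed)

counts = rng.poisson(lam, size=n_years)                                       # frequency distribution
annual_loss = np.array([rng.lognormal(mu, sigma, k).sum() for k in counts])   # Monte Carlo convolution

print(f"OpVaR 99.9%: {np.percentile(annual_loss, 99.9):,.0f}")
print(f"OpVaR 99.0%: {np.percentile(annual_loss, 99.0):,.0f}")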

Relevância:

60.00%

Publicador:

Resumo:

La presente Tesis constituye un avance en el conocimiento de los efectos de la variabilidad climática en los cultivos en la Península Ibérica (PI). Es bien conocido que la temperatura del océano, particularmente de la región tropical, es una de las variables más convenientes para ser utilizado como predictor climático. Los océanos son considerados como la principal fuente de almacenamiento de calor del planeta debido a la alta capacidad calorífica del agua. Cuando se libera esta energía, altera los regímenes globales de circulación atmosférica por mecanismos de teleconexión. Estos cambios en la circulación general de la atmósfera afectan a la temperatura, precipitación, humedad, viento, etc., a escala regional, los cuales afectan al crecimiento, desarrollo y rendimiento de los cultivos. Para el caso de Europa, esto implica que la variabilidad atmosférica en una región específica se asocia con la variabilidad de otras regiones adyacentes y/o remotas, como consecuencia Europa está siendo afectada por los patrones de circulaciones globales, que a su vez, se ven afectados por patrones oceánicos. El objetivo general de esta tesis es analizar la variabilidad del rendimiento de los cultivos y su relación con la variabilidad climática y teleconexiones, así como evaluar su predictibilidad. Además, esta Tesis tiene como objetivo establecer una metodología para estudiar la predictibilidad de las anomalías del rendimiento de los cultivos. El análisis se centra en trigo y maíz como referencia para otros cultivos de la PI, cultivos de invierno en secano y cultivos de verano en regadío respectivamente. Experimentos de simulación de cultivos utilizando una metodología en cadena de modelos (clima + cultivos) son diseñados para evaluar los impactos de los patrones de variabilidad climática en el rendimiento y su predictibilidad. La presente Tesis se estructura en dos partes: La primera se centra en el análisis de la variabilidad del clima y la segunda es una aplicación de predicción cuantitativa de cosechas. La primera parte está dividida en 3 capítulos y la segundo en un capitulo cubriendo los objetivos específicos del presente trabajo de investigación. Parte I. Análisis de variabilidad climática El primer capítulo muestra un análisis de la variabilidad del rendimiento potencial en una localidad como indicador bioclimático de las teleconexiones de El Niño con Europa, mostrando su importancia en la mejora de predictibilidad tanto en clima como en agricultura. Además, se presenta la metodología elegida para relacionar el rendimiento con las variables atmosféricas y oceánicas. El rendimiento de los cultivos es parcialmente determinado por la variabilidad climática atmosférica, que a su vez depende de los cambios en la temperatura de la superficie del mar (TSM). El Niño es el principal modo de variabilidad interanual de la TSM, y sus efectos se extienden en todo el mundo. Sin embargo, la predictibilidad de estos impactos es controversial, especialmente aquellos asociados con la variabilidad climática Europea, que se ha encontrado que es no estacionaria y no lineal. Este estudio mostró cómo el rendimiento potencial de los cultivos obtenidos a partir de datos de reanálisis y modelos de cultivos sirve como un índice alternativo y más eficaz de las teleconexiones de El Niño, ya que integra las no linealidades entre las variables climáticas en una única serie temporal. 
Las relaciones entre El Niño y las anomalías de rendimiento de los cultivos son más significativas que las contribuciones individuales de cada una de las variables atmosféricas utilizadas como entrada en el modelo de cultivo. Además, la no estacionariedad entre El Niño y la variabilidad climática europea se detectan con mayor claridad cuando se analiza la variabilidad de los rendimiento de los cultivos. La comprensión de esta relación permite una cierta predictibilidad hasta un año antes de la cosecha del cultivo. Esta predictibilidad no es constante, sino que depende tanto la modulación de la alta y baja frecuencia. En el segundo capítulo se identifica los patrones oceánicos y atmosféricos de variabilidad climática que afectan a los cultivos de verano en la PI. Además, se presentan hipótesis acerca del mecanismo eco-fisiológico a través del cual el cultivo responde. Este estudio se centra en el análisis de la variabilidad del rendimiento de maíz en la PI para todo el siglo veinte, usando un modelo de cultivo calibrado en 5 localidades españolas y datos climáticos de reanálisis para obtener series temporales largas de rendimiento potencial. Este estudio evalúa el uso de datos de reanálisis para obtener series de rendimiento de cultivos que dependen solo del clima, y utilizar estos rendimientos para analizar la influencia de los patrones oceánicos y atmosféricos. Los resultados muestran una gran fiabilidad de los datos de reanálisis. La distribución espacial asociada a la primera componente principal de la variabilidad del rendimiento muestra un comportamiento similar en todos los lugares estudiados de la PI. Se observa una alta correlación lineal entre el índice de El Niño y el rendimiento, pero no es estacionaria en el tiempo. Sin embargo, la relación entre la temperatura del aire y el rendimiento se mantiene constante a lo largo del tiempo, siendo los meses de mayor influencia durante el período de llenado del grano. En cuanto a los patrones atmosféricos, el patrón Escandinavia presentó una influencia significativa en el rendimiento en PI. En el tercer capítulo se identifica los patrones oceánicos y atmosféricos de variabilidad climática que afectan a los cultivos de invierno en la PI. Además, se presentan hipótesis acerca del mecanismo eco-fisiológico a través del cual el cultivo responde. Este estudio se centra en el análisis de la variabilidad del rendimiento de trigo en secano del Noreste (NE) de la PI. La variabilidad climática es el principal motor de los cambios en el crecimiento, desarrollo y rendimiento de los cultivos, especialmente en los sistemas de producción en secano. En la PI, los rendimientos de trigo son fuertemente dependientes de la cantidad de precipitación estacional y la distribución temporal de las mismas durante el periodo de crecimiento del cultivo. La principal fuente de variabilidad interanual de la precipitación en la PI es la Oscilación del Atlántico Norte (NAO), que se ha relacionado, en parte, con los cambios en la temperatura de la superficie del mar en el Pacífico Tropical (El Niño) y el Atlántico Tropical (TNA). La existencia de cierta predictibilidad nos ha animado a analizar la posible predicción de los rendimientos de trigo en la PI utilizando anomalías de TSM como predictor. Para ello, se ha utilizado un modelo de cultivo (calibrado en dos localidades del NE de la PI) y datos climáticos de reanálisis para obtener series temporales largas de rendimiento de trigo alcanzable y relacionar su variabilidad con anomalías de la TSM. 
Los resultados muestran que El Niño y la TNA influyen en el desarrollo y rendimiento del trigo en el NE de la PI, y estos impactos depende del estado concurrente de la NAO. Aunque la relación cultivo-TSM no es igual durante todo el periodo analizado, se puede explicar por un mecanismo eco-fisiológico estacionario. Durante la segunda mitad del siglo veinte, el calentamiento (enfriamiento) en la superficie del Atlántico tropical se asocia a una fase negativa (positiva) de la NAO, que ejerce una influencia positiva (negativa) en la temperatura mínima y precipitación durante el invierno y, por lo tanto, aumenta (disminuye) el rendimiento de trigo en la PI. En relación con El Niño, la correlación más alta se observó en el período 1981 -2001. En estas décadas, los altos (bajos) rendimientos se asocian con una transición El Niño - La Niña (La Niña - El Niño) o con eventos de El Niño (La Niña) que están finalizando. Para estos eventos, el patrón atmosférica asociada se asemeja a la NAO, que también influye directamente en la temperatura máxima y precipitación experimentadas por el cultivo durante la floración y llenado de grano. Los co- efectos de los dos patrones de teleconexión oceánicos ayudan a aumentar (disminuir) la precipitación y a disminuir (aumentar) la temperatura máxima en PI, por lo tanto el rendimiento de trigo aumenta (disminuye). Parte II. Predicción de cultivos. En el último capítulo se analiza los beneficios potenciales del uso de predicciones climáticas estacionales (por ejemplo de precipitación) en las predicciones de rendimientos de trigo y maíz, y explora métodos para aplicar dichos pronósticos climáticos en modelos de cultivo. Las predicciones climáticas estacionales tienen un gran potencial en las predicciones de cultivos, contribuyendo de esta manera a una mayor eficiencia de la gestión agrícola, seguridad alimentaria y de subsistencia. Los pronósticos climáticos se expresan en diferentes formas, sin embargo todos ellos son probabilísticos. Para ello, se evalúan y aplican dos métodos para desagregar las predicciones climáticas estacionales en datos diarios: 1) un generador climático estocástico condicionado (predictWTD) y 2) un simple re-muestreador basado en las probabilidades del pronóstico (FResampler1). Los dos métodos se evaluaron en un caso de estudio en el que se analizaron los impactos de tres escenarios de predicciones de precipitación estacional (predicción seco, medio y lluvioso) en el rendimiento de trigo en secano, sobre las necesidades de riego y rendimiento de maíz en la PI. Además, se estimó el margen bruto y los riesgos de la producción asociada con las predicciones de precipitación estacional extremas (seca y lluviosa). Los métodos predWTD y FResampler1 usados para desagregar los pronósticos de precipitación estacional en datos diarios, que serán usados como inputs en los modelos de cultivos, proporcionan una predicción comparable. Por lo tanto, ambos métodos parecen opciones factibles/viables para la vinculación de los pronósticos estacionales con modelos de simulación de cultivos para establecer predicciones de rendimiento o las necesidades de riego en el caso de maíz. El análisis del impacto en el margen bruto de los precios del grano de los dos cultivos (trigo y maíz) y el coste de riego (maíz) sugieren que la combinación de los precios de mercado previstos y la predicción climática estacional pueden ser una buena herramienta en la toma de decisiones de los agricultores, especialmente en predicciones secas y/o localidades con baja precipitación anual. 
Estos métodos permiten cuantificar los beneficios y riesgos de los agricultores ante una predicción climática estacional en la PI. Por lo tanto, seríamos capaces de establecer sistemas de alerta temprana y diseñar estrategias de adaptación del manejo del cultivo para aprovechar las condiciones favorables o reducir los efectos de condiciones adversas. La utilidad potencial de esta Tesis es la aplicación de las relaciones encontradas para predicción de cosechas de la próxima campaña agrícola. Una correcta predicción de los rendimientos podría ayudar a los agricultores a planear con antelación sus prácticas agronómicas y todos los demás aspectos relacionados con el manejo de los cultivos. Esta metodología se puede utilizar también para la predicción de las tendencias futuras de la variabilidad del rendimiento en la PI. Tanto los sectores públicos (mejora de la planificación agrícola) como privados (agricultores, compañías de seguros agrarios) pueden beneficiarse de esta mejora en la predicción de cosechas. ABSTRACT The present thesis constitutes a step forward in advancing of knowledge of the effects of climate variability on crops in the Iberian Peninsula (IP). It is well known that ocean temperature, particularly the tropical ocean, is one of the most convenient variables to be used as climate predictor. Oceans are considered as the principal heat storage of the planet due to the high heat capacity of water. When this energy is released, it alters the global atmospheric circulation regimes by teleconnection1 mechanisms. These changes in the general circulation of the atmosphere affect the regional temperature, precipitation, moisture, wind, etc., and those influence crop growth, development and yield. For the case of Europe, this implies that the atmospheric variability in a specific region is associated with the variability of others adjacent and/or remote regions as a consequence of Europe being affected by global circulations patterns which, in turn, are affected by oceanic patterns. The general objective of this Thesis is to analyze the variability of crop yields at climate time scales and its relation to the climate variability and teleconnections, as well as to evaluate their predictability. Moreover, this Thesis aims to establish a methodology to study the predictability of crop yield anomalies. The analysis focuses on wheat and maize as a reference crops for other field crops in the IP, for winter rainfed crops and summer irrigated crops respectively. Crop simulation experiments using a model chain methodology (climate + crop) are designed to evaluate the impacts of climate variability patterns on yield and its predictability. The present Thesis is structured in two parts. The first part is focused on the climate variability analyses, and the second part is an application of the quantitative crop forecasting for years that fulfill specific conditions identified in the first part. This Thesis is divided into 4 chapters, covering the specific objectives of the present research work. Part I. Climate variability analyses The first chapter shows an analysis of potential yield variability in one location, as a bioclimatic indicator of the El Niño teleconnections with Europe, putting forward its importance for improving predictability in both climate and agriculture. It also presents the chosen methodology to relate yield with atmospheric and oceanic variables. 
Crop yield is partially determined by atmospheric climate variability, which in turn depends on changes in sea surface temperature (SST). El Niño is the leading mode of interannual SST variability, and its impacts extend worldwide. Nevertheless, the predictability of these impacts is controversial, especially those associated with European climate variability, which have been found to be non-stationary and non-linear. The study showed how potential crop yield obtained from reanalysis data and crop models serves as an alternative and more effective index of El Niño teleconnections, because it integrates the non-linearities among the climate variables into a single time series. The relationships between El Niño and crop yield anomalies are more significant than the individual contributions of each of the atmospheric variables used as input to the crop model. Additionally, the non-stationarities between El Niño and European climate variability are detected more clearly when crop-yield variability is analyzed. Understanding this relationship allows some predictability up to one year before the crop is harvested. This predictability is not constant, but depends on both high and low frequency modulation. The second chapter identifies the oceanic and atmospheric patterns of climate variability affecting summer cropping systems in the IP, and presents hypotheses about the eco-physiological mechanism behind the crop response. It focuses on an analysis of maize yield variability in the IP over the whole twentieth century, using a crop model calibrated at five contrasting Spanish locations and reanalysis climate datasets to obtain long time series of potential yield. The study tests the use of reanalysis data for obtaining time series of simulated crop yield that depend only on climate for the whole region, and uses these yields to analyze the influence of oceanic and atmospheric patterns. The results show good reliability of the reanalysis data. The spatial distribution of the leading principal component of yield variability shows similar behaviour over all the studied locations in the IP. The linear correlation between the El Niño index and yield is remarkably strong, although this relationship is non-stationary in time, whereas the air temperature-yield relationship remains stable over time, with the strongest influence during the grain-filling period. Regarding atmospheric patterns, the summer Scandinavian pattern has a significant influence on yield in the IP. The third chapter identifies the oceanic and atmospheric patterns of climate variability affecting winter cropping systems in the IP, again presenting hypotheses about the eco-physiological mechanism behind the crop response. It focuses on an analysis of rainfed wheat yield variability in the IP. Climate variability is the main driver of changes in crop growth, development and yield, especially in rainfed production systems. In the IP, wheat yields are strongly dependent on the seasonal rainfall amount and on the temporal distribution of rainfall during the growing season. The major source of interannual precipitation variability in the IP is the North Atlantic Oscillation (NAO), which has been related in part to changes in Tropical Pacific (El Niño) and Tropical North Atlantic (TNA) sea surface temperature. The existence of some predictability has encouraged us to analyze the possible predictability of wheat yield in the IP using SST anomalies as predictors.
For this purpose, a crop model calibrated for sites in the Northeast of the IP and reanalysis climate datasets were used to obtain long time series of attainable wheat yield and to relate their variability to SST anomalies. The results show that El Niño and the TNA influence rainfed wheat development and yield in the IP, and that these impacts depend on the concurrent state of the NAO. Although the crop-SST relationships do not hold equally over the whole analyzed period, they can be explained by a stationary eco-physiological mechanism. During the second half of the twentieth century, a positive (negative) TNA index is associated with a negative (positive) phase of the NAO, which exerts a positive (negative) influence on minimum temperature (Tmin) and precipitation (Prec) during winter and thus increases (decreases) wheat yield in the IP. In relation to El Niño, the highest correlation takes place in the period 1981-2001. For these decades, high (low) yields are associated with El Niño to La Niña (La Niña to El Niño) transitions or with El Niño (La Niña) events that are ending. For these events, the associated regional atmospheric pattern resembles the NAO, which also directly influences the maximum temperature (Tmax) and the precipitation experienced by the crop during flowering and grain filling. The combined effects of the two teleconnection patterns help to increase (decrease) rainfall and to decrease (increase) Tmax in the IP, and hence to increase (decrease) wheat yield. Part II. Crop forecasting. The last chapter analyses the potential benefits of using seasonal climate forecasts (of precipitation) for wheat and maize yield prediction, and explores methods to apply such climate forecasts to crop models. Seasonal climate prediction has significant potential to contribute to the efficiency of agricultural management and to food and livelihood security. Climate forecasts come in different forms, but all of them are probabilistic. Two methods were evaluated and applied for disaggregating a seasonal climate forecast into daily weather realizations: 1) a conditioned stochastic weather generator (predictWTD) and 2) a simple forecast probability resampler (FResampler1). The two methods were evaluated in a case study in which the impacts of three scenarios of seasonal rainfall forecasts on rainfed wheat yield, and on the irrigation requirements and yields of maize in the IP, were analyzed. In addition, the economic margins and production risks associated with extreme scenarios of seasonal rainfall forecasts (dry and wet) were estimated. The predictWTD and FResampler1 methods used for disaggregating the seasonal rainfall forecast into the daily data needed by the crop simulation models provided comparable predictability. Both methods therefore seem feasible options for linking seasonal forecasts with crop simulation models to establish yield forecasts or, in the case of maize, irrigation water requirements. The analysis of the impact of grain prices for both crops and of maize irrigation costs on the gross margin suggests that combining expected market prices with the seasonal climate forecast can be a good tool for farmers' decision-making, especially for dry forecasts and/or locations with low annual precipitation. These methodologies would make it possible to quantify the benefits and risks of a seasonal climate forecast for farmers in the IP. We would therefore be able to establish early warning systems and to design crop management adaptation strategies that take advantage of favorable conditions or reduce the effect of adverse conditions.
The potential usefulness of this Thesis is to apply the relationships found to crop forecasting on the next cropping season, suggesting opportunity time windows for the prediction. The methodology can be used as well for the prediction of future trends of IP yield variability. Both public (improvement of agricultural planning) and private (decision support to farmers, insurance companies) sectors may benefit from such an improvement of crop forecasting.
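To make the second disaggregation method more concrete, the sketch below resamples whole historical seasons according to forecast tercile probabilities, in the spirit of the FResampler1 idea described above; it is not the actual tool, and the historical record, rainfall totals and forecast probabilities are invented for illustration.

import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1981, 2011)                                        # hypothetical historical record
seasonal_rain = rng.gamma(shape=4.0, scale=60.0, size=years.size)   # placeholder seasonal totals (mm)

# classify each historical year into terciles (0 = dry, 1 = near-normal, 2 = wet)
t1, t2 = np.percentile(seasonal_rain, [100 / 3, 200 / 3])
category = np.digitize(seasonal_rain, [t1, t2])

def resample_years(p_dry, p_normal, p_wet, n=500):
    """Draw analogue years so tercile frequencies match the forecast probabilities."""
    cats = rng.choice([0, 1, 2], size=n, p=[p_dry, p_normal, p_wet])
    return np.array([rng.choice(years[category == c]) for c in cats])

analogues = resample_years(0.6, 0.3, 0.1)        # example: a 'dry' seasonal forecast
print(np.bincount(category[np.searchsorted(years, analogues)]) / analogues.size)

Each resampled analogue year would then supply the daily weather series passed to the crop simulation model.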

Relevância:

60.00%

Publicador:

Resumo:

The three-dimensional wall-bounded open cavity may be considered a simplified geometry found in industrial applications such as landing gear bays or slotted flaps on aircraft. Understanding the complex three-dimensional flow structure that surrounds this geometry is therefore of major industrial interest. In light of previous investigations of this kind of flow, there is sufficient evidence that the lateral walls have a strong influence on the flow features and hence on the instability modes. Nevertheless, even though there is a large body of literature on cavity flows, most studies assume that the flow is two-dimensional and spanwise-periodic; the flow over a realistic open cavity should also be considered. This thesis presents an investigation of a three-dimensional wall-bounded open cavity with geometric ratio 6:2:1. To this end, three-dimensional Direct Numerical Simulation (DNS) and global linear instability analyses have been performed. The linear instability analysis reveals that the onset of the first instability in this open cavity occurs around Recr = 1080. The three-dimensional shear layer mode, which has a complex structure, is shown to be the most unstable mode. It is noteworthy that the flow pattern of this high-frequency shear layer mode is similar to the unstable oscillations observed in the supercritical case. DNS of the cavity flow was carried out at different Reynolds numbers, from the steady state until a nonlinear saturated state was obtained. The comparison of kinetic-energy time histories reveals a clearly dominant energetic mode that shifts between low-frequency and high-frequency oscillations. A complete picture of the flow patterns from the subcritical cases to the supercritical case has been established. The flow structure in the supercritical case, Re = 1100, resembles typical wake-shedding instability oscillations combined with the lateral motion already present in the subcritical cases, and this flow pattern is also similar to experimental observations. In order to validate the linear instability analysis results, the topology of composite flow fields, reconstructed by linear superposition of the three-dimensional base flow and its leading three-dimensional global eigenmodes, has been studied. The instantaneous wall streamlines of these composite flows display the distinct influence region of each eigenmode. Attention has been focused on the leading high-frequency shear layer mode; the composite flow fields have been fully characterized with respect to the downstream wave shedding. The three-dimensional shear layer mode is shown to give rise to a typical wake-shedding instability with lateral motions occurring downstream, in good agreement with the experimental results. Moreover, the spanwise-periodic open cavity with the same length-to-depth ratio has also been studied; because of the side walls, its most unstable linear mode differs from that of the real three-dimensional cavity flow. The structural sensitivity of the unstable global mode is analyzed in a flow control context. Adjoint-based sensitivity analysis has been employed to localize the receptivity region, where the flow is most sensitive to momentum forcing and mass injection. Because of the non-normality of the linearized Navier-Stokes equations, the direct and adjoint fields show a large spatial separation. The strongest sensitivity region is located at the upstream lip of the three-dimensional cavity; this numerical finding is in agreement with experimental observations. Finally, a prototype of a passive flow control strategy is applied.
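A hedged sketch of the adjoint-based step described above, under the common approximation that structural sensitivity can be estimated from the pointwise overlap of the leading direct and adjoint global modes; the modes below are random placeholders rather than cavity eigenmodes, and the grid is assumed.

import numpy as np

ny, nx = 64, 192                                   # placeholder grid
rng = np.random.default_rng(3)
u_dir = rng.standard_normal((ny, nx)) + 1j * rng.standard_normal((ny, nx))   # direct mode
u_adj = rng.standard_normal((ny, nx)) + 1j * rng.standard_normal((ny, nx))   # adjoint mode

sensitivity = np.abs(u_adj) * np.abs(u_dir)        # pointwise overlap as sensitivity proxy
sensitivity /= sensitivity.max()                   # normalise to [0, 1]
j, i = np.unravel_index(np.argmax(sensitivity), sensitivity.shape)
print(f"peak structural sensitivity at grid point (j={j}, i={i})")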

Relevância:

60.00%

Publicador:

Resumo:

En este proyecto se ha realizado el estudio del campo acústico de un estudio de grabación, en concreto el estudio de grabación “Sadman”, situado en Madrid. El estudio viene motivado por una serie de problemas de resonancia en frecuencias graves de la sala de control del estudio, lo cual genera algunas alteraciones desagradables cuando se realiza una escucha por el sistema de altavoces en dicha sala. Además de estudiar este problema, se han estudiado otros parámetros acústicos para poder plantear, si fuese necesario, posibles mejoras que faciliten las labores que se realizan en la sala. Para realizar tanto el estudio del campo acústico de la sala como su modelado se han utilizado herramientas tales como AutoCAD, Ease, Dirac y Spectraplus. Palabras clave: Acústica, modos de vibración, frecuencia, resonancia, Dirac, Ease, AutoCAD, Spectraplus, absorbente, difusor. ABSTRACT. This project studies the acoustics of a recording studio, the "Sadman" studio located in Madrid. The investigation was motivated by a low frequency resonance problem in the control room that generated undesirable acoustic disturbances when the main monitoring system was running. Apart from the modal frequency problem, several other acoustic parameters were investigated so that improvements to the acoustic field could be proposed if required. Tools such as AutoCAD, Ease, Dirac and Spectraplus were used to understand and model the issues encountered during the investigation.
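A quick sketch of the modal calculation that underlies the low-frequency resonance problem discussed above: the eigenfrequencies of a rigid-walled rectangular room. The room dimensions below are assumed values, not those of the Sadman control room.

from itertools import product
import math

c = 343.0                      # speed of sound in air, m/s
Lx, Ly, Lz = 6.0, 4.5, 3.0     # assumed room dimensions in metres

modes = []
for nx, ny, nz in product(range(4), repeat=3):
    if nx == ny == nz == 0:
        continue
    f = (c / 2.0) * math.sqrt((nx / Lx) ** 2 + (ny / Ly) ** 2 + (nz / Lz) ** 2)
    modes.append((f, (nx, ny, nz)))

for f, idx in sorted(modes)[:8]:       # lowest eight modes, where colouration problems cluster
    print(f"{idx}: {f:6.1f} Hz")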

Relevância:

60.00%

Publicador:

Resumo:

La perspectiva del arquitecto en calidad ambiental, y salud en un contexto sostenible, se amplía al considerar las radiaciones electromagnéticas no ionizantes en el diseño arquitectónico. En ese sentido, además del confort higrotérmico, acústico, lumínico y de la calidad del aire, se podría considerar el confort electromagnético de un lugar. Dado que existe gran controversia en cuales han de ser los límites de exposición a radiaciones electromagnéticas no ionizantes, establezco como punto de referencia los valores límite más restrictivos, que son los recomendados por la norma SBM-2008, desarrollada por el Institut für Baubiologie & Oekologie Neubeuern (IBN)1. Se plantean como hipótesis que podemos modificar el entorno electromagnético con materiales de construcción y geometría; y que determinados trazados geométricos tienen la capacidad de reducir el impacto de los campos electromagnéticos sobre los organismos vivos. El objetivo consiste en demostrar experimentalmente que podemos trabajar sobre la calidad ambiental electromagnética de un espacio, a través de la elección de materiales de construcción y trazados geométricos, intentando demostrar que existe una relación causa - efecto entre ambos. La metodología plantea tres aproximaciones experimentales, cada una con un tipo de radiación electromagnética, pues se pretende abarcar las situaciones que comúnmente se pueden presentar en un entorno habitado, ya sea urbano o rural. La primera aproximación trata sobre las alteraciones del campo geomagnético natural (nT / m) provocadas por los materiales de construcción. Utilizo el geomagnetómetro BPM 2010, para realizar un ensayo con cuatro tipos de materiales de distinta procedencia: origen vegetal muy poco procesado (corcho aglomerado negro) y más procesado (OSB), origen derivado del petróleo (tablero rígido de poliuretano) y de origen mineral metálico (chapa minionda). De la lectura de los datos se observa relación causa-efecto entre los materiales de construcción estudiados y las modificaciones que pueden ejercer sobre el campo magnético de un lugar. A continuación se estudia el entorno de radiación electromagnética artificial a baja frecuencia (3 Hz a 3 kHz) y a alta frecuencia, (800 MHz a 10 GHz) en vivienda y en oficina utilizando unas geometrías concretas: las tarjetas de corrección de radiaciones. Estas tarjetas se ubican en paramentos verticales y horizontales de un espacio sometido a radiación propia de un entorno urbano. Se concluye que en una habitación inciden múltiples variables simultáneas muy difíciles de trabajar por separado y que aparentemente no se pueden identificar cambios significativos en las mediciones con y sin las tarjetas de corrección de radiaciones. A continuación estudio el entorno de radiación electromagnética artificial a baja frecuencia asociada a la red de distribución eléctrica. Para poder ver cómo este entorno electromagnético lo podemos modificar, utilizo las tarjetas de corrección de radiaciones ubicadas en relación directa con organismos vivos, por un lado germinados de semillas de haba mungo sometidas a campos electromagnéticos complejos a alta y baja frecuencia, propios de una oficina; y por otro lado germinados de semillas de haba mungo, sometidas a campos electromagnéticos puros a 50 Hz, sin influencias de radiación a alta frecuencia. 
Se concluye que se observa relación causa - efecto entre los trazados geométricos estudiados y su capacidad para reducir el impacto de los campos electromagnéticos a altas y bajas frecuencias sobre las semillas de haba mungo. También utilizo las tarjetas de corrección de radiaciones en un ensayo normalizado en el laboratorio de bioelectromagnetismo del Hospital Universitario Ramón y Cajal, con células de neuroblastoma humano. Se concluye que se observa relación causa - efecto entre los trazados geométricos estudiados y su capacidad para reducir el impacto de los campos electromagnéticos de 50 Hz Y 100 μT sobre células de neuroblastoma humano y además disminuyen la velocidad de proliferación celular respecto del grupo de células de control. Finalmente se estudia el entorno de radiación electromagnética artificial a alta frecuencia, asociado a comunicaciones inalámbricas. Para ello realizo simulaciones con el software CST Studio, sobre las tarjetas de corrección de radiaciones a alta frecuencia. A la luz de los datos se observa relación causa - efecto entre el trazado geométrico estudiado y su capacidad para reducir radiaciones electromagnéticas de alta frecuencia. Se comprueba además que, las tarjetas de corrección de radiaciones disminuyen la intensidad de la radiación acercándose a los límites de exposición establecidos por el instituto de la biología de la construcción alemán, que podrían estar señalando los estándares de biocompatibilidad. ABSTRACT The perspective of the architect in environmental quality, and health in a sustainable context is extended to consider non-ionizing electromagnetic radiation in architectural design. In that sense, besides the hygrothermal, acoustic, lighting and air quality comfort, the electromagnetic comfort of an indoor space could be considered. There is still great controversy about which should be the limits of exposure to nonionizing electromagnetic radiation, as a benchmark, the more restrictive limits are considered, by the SBM- 2008 standard, developed by the Institut für Baubiologie & Oekologie Neubeuern (IBN). The hypotheses that arise are the following: the electromagnetic environment can be modified by using certain construction materials and geometry; and certain geometric design have the ability to reduce the impact of electromagnetic fields on living organisms. The aim is to demonstrate experimentally that we can work on electromagnetic environmental quality of a indoor space, by using certain construction materials and geometric design, trying to demonstrate a cause - effect relationship between them. The methodology raises three experimental approaches, each with a type of radiation, it is intend to cover situations commonly may occur in an inhabited environment, whether urban or rural. The first approach discusses the alteration of the natural magnetic field (nT / m) caused by the building materials. Geomagnetometre BPM 2010 is used for conducting a test with four types of materials from different sources: vegetable origin less processing (black agglomerate cork) and vegetable origin more processed (OSB), petroleum origin (rigid polyurethane board) and metallic origin (miniwave plate). It is observed across the data information that exist cause-effect relationship between the construction materials studied and the modifications that they can exercise on the magnetic field of a place. 
The environment of artificial electromagnetic radiation at low frequency (3 Hz to 3 kHz) and high frequency (800 MHz to 10 GHz) is then studied in a dwelling and in an office, using specific geometries: the radiation correction cards. These cards are placed on vertical and horizontal surfaces of an indoor space exposed to the radiation typical of an urban environment. The conclusion is that an indoor space is affected by multiple simultaneous variables that are very difficult to isolate, and no significant changes could be identified between measurements taken with and without the radiation correction cards. The artificial low-frequency electromagnetic environment associated with the electricity distribution network is then studied. To see how this electromagnetic environment can be modified, the radiation correction cards are placed in direct relation to living organisms: on the one hand, germinating mung bean seeds exposed to complex electromagnetic fields at low and high frequency, typical of an office; and on the other hand, germinating mung bean seeds exposed to pure 50 Hz electromagnetic fields, free from high frequency radiation. A cause-effect relationship is observed between the geometric designs studied and their ability to reduce the impact on the mung bean seeds of electromagnetic fields at high and low frequencies. The radiation correction cards were also used in a standardized test in the bioelectromagnetics laboratory of the Ramón y Cajal University Hospital, on human neuroblastoma cells. A cause-effect relationship is observed between the geometric designs studied and their ability to reduce the impact of 50 Hz, 100 μT electromagnetic fields on human neuroblastoma cells; the cards also decrease the rate of cell proliferation with respect to the control group of cells. Finally, the artificial high-frequency electromagnetic radiation environment associated with wireless communications is studied. Simulations with the CST Studio software were carried out to determine the behaviour of the radiation correction cards at high frequency. The data show a cause-effect relationship between the geometric design studied and its ability to reduce the levels of high-frequency electromagnetic radiation. It is also verified that the radiation correction cards reduce the intensity of the radiation towards the exposure limits established by the Institut für Baubiologie & Oekologie Neubeuern (IBN), which could be indicative of biocompatibility standards.
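As an illustration of the kind of cause-effect comparison reported above, a minimal sketch of a two-sample test on a growth measure for seedlings exposed to a 50 Hz field with and without the correction cards; the data and the choice of a Welch t-test are assumptions made for illustration, not the statistical protocol of the thesis.

import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
growth_without_cards = rng.normal(loc=42.0, scale=6.0, size=30)   # sprout length in mm (synthetic)
growth_with_cards = rng.normal(loc=47.0, scale=6.0, size=30)      # sprout length in mm (synthetic)

t_stat, p_value = stats.ttest_ind(growth_with_cards, growth_without_cards,
                                  equal_var=False)                # Welch two-sample t-test
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")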

Relevância:

60.00%

Publicador:

Resumo:

Financial management emerged at the beginning of the nineteenth century, together with the consolidation of large corporations and the formation of national markets in the United States, whereas in Brazil the first studies appeared only in the second half of the twentieth century. Since then the country has managed to consolidate several centres of research excellence, to train a significant group of senior researchers and to expand the research areas in the field; nevertheless, there are still few studies that attempt to portray the characteristics of scientific productivity in Finance. Seeking to contribute to a better understanding of the productive behaviour of this area, the present research studies its scientific output, in the form of digital articles published in 24 well-regarded national journals classified in the Qualis/CAPES strata A2, B1 and B2 of the Administration, Accounting and Tourism area. To this end, Bradford's law, Price's law of elitism and Lotka's law are applied. Bradford's law identifies three productivity zones, with the core formed by three journals, one of which is classified in the Qualis/CAPES B2 stratum, which highlights the limitation of a selection based solely on the Qualis/CAPES classification. For Price's law of elitism, whether by straight or complete counting, we did not identify the behaviour of an elite similar to that predicted by the theory, and a large number of authors have only one publication. Applying the Generalized Inverse Power model, estimated by Ordinary Least Squares (OLS), we find that researcher productivity, when measured by straight counting, conforms to Lotka's law at the α = 0.01 significance level; with complete counting, however, we cannot confirm the hypothesis of homogeneity of the distributions. Moreover, with both counting methods the productivity exponent n is greater than 2, and therefore the productivity of finance researchers is lower than that predicted by the theory.
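A minimal sketch of the Generalized Inverse Power / OLS step described above: Lotka's law a_x = C / x^n (number of authors with x papers) fitted by ordinary least squares on the log-log transformed counts. The author counts below are invented for illustration, not the data observed in the study.

import numpy as np

papers = np.arange(1, 9)                               # x: papers per author
authors = np.array([280, 60, 22, 11, 6, 4, 2, 1])      # a_x: hypothetical author counts

X = np.column_stack([np.ones_like(papers, dtype=float), np.log(papers)])
y = np.log(authors)
(beta0, beta1), *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS fit of log(a_x) = beta0 + beta1*log(x)

n_exponent = -beta1
C = np.exp(beta0)
print(f"Lotka exponent n = {n_exponent:.2f}, C = {C:.1f}")
# n > 2, as reported above, means productivity falls off faster than the classical Lotka value n = 2.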

Relevância:

60.00%

Publicador:

Resumo:

We measured coherence between the electroencephalogram at different scalp sites while human subjects performed delayed response tasks. The tasks required the retention of either verbalizable strings of characters or abstract line drawings. In both types of tasks, a significant enhancement in coherence in the θ range (4–7 Hz) was found between prefrontal and posterior electrodes during 4-s retention intervals. During 6-s perception intervals, far fewer increases in θ coherence were found. Also in other frequency bands, coherence increased; however, the patterns of enhancement made a relevance for working memory processes seem unlikely. Our results suggest that working memory involves synchronization between prefrontal and posterior association cortex by phase-locked, low frequency (4–7 Hz) brain activity.
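A minimal sketch of the core measurement described above: magnitude-squared coherence between a prefrontal and a posterior channel, averaged over the theta band (4-7 Hz). The signals are synthetic stand-ins for real EEG, and the sampling rate and window length are assumptions.

import numpy as np
from scipy.signal import coherence

fs = 250.0                                   # sampling rate in Hz (assumed)
t = np.arange(0, 4.0, 1.0 / fs)              # a 4-s retention interval
rng = np.random.default_rng(0)
shared_theta = np.sin(2 * np.pi * 6.0 * t)   # common 6 Hz component shared by both channels
prefrontal = shared_theta + rng.standard_normal(t.size)
posterior = shared_theta + rng.standard_normal(t.size)

f, coh = coherence(prefrontal, posterior, fs=fs, nperseg=256)
theta = (f >= 4) & (f <= 7)
print(f"mean theta-band coherence: {coh[theta].mean():.2f}")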

Relevância:

60.00%

Publicador:

Resumo:

We have recently found that okadaic acid, which shows strong inhibitory activity on protein serine/threonine phosphatases and tumor-promoting activity in vivo and in vitro, induces minisatellite mutation (MSM). Human tumors and chemically induced counterparts in experimental animals are also sometimes associated with MSM. In the present study, we demonstrated minisatellite (MS) instability in severe combined immunodeficiency (SCID) cells in which the DNA-dependent protein kinase catalytic subunit (DNA-PKcs) is impaired. Cells from a SCID fibroblast cell line transformed by simian virus 40 large tumor antigen, SC3VA2, and from an embryonal SCID fibroblast cell line, SC1K, were cloned and propagated to 107 to 108 cells, and then subjected to subcloning. After propagation of each subclone to 107 to 108 cells, DNA samples were digested with HinfI and analyzed by Southern blotting using the Pc-1 MS sequence as a probe. Under low-stringency conditions, about 40 MS bands were detected, with 45% ± 6% and 37% ± 3% of SC3VA2 and SC1K cells, respectively, having MSM. In contrast, cells from the RD13B2 cell line, which was established from SCVA2 by introducing human chromosome 8q fragments, on which DNA-PKcs is known to reside, to complement the SCID phenotype, showed a very low frequency of MSM (3% ± 3%). The high frequencies of MSM in SC3VA2 and SC1K were significant, with no difference between the two. The present study clearly demonstrates that MS instability exists in SCID fibroblasts, suggesting that DNA-PKcs might be involved in the stable maintenance of MS sequences in the genome.
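A hedged sketch of the comparison implied above, testing whether the minisatellite mutation frequency differs between SCID-derived subclones and the DNA-PKcs-complemented RD13B2 subclones; the subclone counts are hypothetical, chosen only to roughly match the reported percentages.

from scipy.stats import fisher_exact

# each row: [subclones with MSM, subclones without MSM]
sc3va2 = [9, 11]      # ~45% mutated (assumed 20 subclones)
rd13b2 = [1, 19]      # ~5% mutated (assumed 20 subclones)

odds_ratio, p_value = fisher_exact([sc3va2, rd13b2])
print(f"odds ratio = {odds_ratio:.1f}, p = {p_value:.4f}")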

Relevância:

60.00%

Publicador:

Resumo:

The computations involved in the processing of a visual scene invariably involve the interactions among neurons throughout all of visual cortex. One hypothesis is that the timing of neuronal activity, as well as the amplitude of activity, provides a means to encode features of objects. The experimental data from studies on cat [Gray, C. M., Konig, P., Engel, A. K. & Singer, W. (1989) Nature (London) 338, 334–337] support a view in which only synchronous (no phase lags) activity carries information about the visual scene. In contrast, theoretical studies suggest, on the one hand, the utility of multiple phases within a population of neurons as a means to encode independent visual features and, on the other hand, the likely existence of timing differences solely on the basis of network dynamics. Here we use widefield imaging in conjunction with voltage-sensitive dyes to record electrical activity from the virtually intact, unanesthetized turtle brain. Our data consist of single-trial measurements. We analyze our data in the frequency domain to isolate coherent events that lie in different frequency bands. Low frequency oscillations (<5 Hz) are seen in both ongoing activity and activity induced by visual stimuli. These oscillations propagate parallel to the afferent input. Higher frequency activity, with spectral peaks near 10 and 20 Hz, is seen solely in response to stimulation. This activity consists of plane waves and spiral-like waves, as well as more complex patterns. The plane waves have an average phase gradient of ≈π/2 radians/mm and propagate orthogonally to the low frequency waves. Our results show that large-scale differences in neuronal timing are present and persistent during visual processing.
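A sketch of the plane-wave analysis summarised above: band-pass a row of pixels around the ~10 Hz spectral peak, extract the instantaneous phase with the Hilbert transform, and fit phase against position to estimate the spatial phase gradient. The "recording" here is a synthetic travelling wave rather than optical data, and the sampling rate and geometry are assumed.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 200.0                                     # frames per second (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
x_mm = np.linspace(0.0, 4.0, 40)               # pixel positions along one row, in mm
true_gradient = np.pi / 2                      # rad/mm, the value to recover

# synthetic 10 Hz plane wave travelling across the row, plus noise
rng = np.random.default_rng(5)
data = np.sin(2 * np.pi * 10.0 * t[None, :] - true_gradient * x_mm[:, None])
data += 0.3 * rng.standard_normal(data.shape)

b, a = butter(4, [8.0, 12.0], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, data, axis=1)
phase = np.angle(hilbert(filtered, axis=1))    # instantaneous phase for each pixel

mid = t.size // 2                              # compare phases across pixels at one time frame
gradient = np.polyfit(x_mm, np.unwrap(phase[:, mid]), 1)[0]
print(f"estimated phase gradient: {abs(gradient):.2f} rad/mm")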

Relevância:

60.00%

Publicador:

Resumo:

The role of channel inactivation in the molecular mechanism of calcium (Ca2+) channel block by phenylalkylamines (PAA) was analyzed by designing mutant Ca2+ channels that carry the high affinity determinants of the PAA receptor site [Hockerman, G. H., Johnson, B. D., Scheuer, T., and Catterall, W. A. (1995) J. Biol. Chem. 270, 22119–22122] but inactivate at different rates. Use-dependent block by PAAs was studied after expressing the mutant Ca2+ channels in Xenopus oocytes. Substitution of single putative pore-orientated amino acids in segment IIIS6 by alanine (F-1499-A, F-1500-A, F-1510-A, I-1514-A, and F-1515-A) gradually slowed channel inactivation and simultaneously reduced inhibition of barium currents (IBa) by (−)D600 upon depolarization by 100 ms steps at 0.1 Hz. This apparent reduction in drug sensitivity was only evident if test pulses were applied at a low frequency of 0.1 Hz and almost disappeared at the frequency of 1 Hz. (−)D600 slowed IBa recovery after maintained membrane depolarization (1–3 sec) to a comparable extent in all channel constructs. A drug-induced delay in the onset of IBa recovery from inactivation suggests that PAAs promote the transition to a deep inactivated channel conformation. These findings indicate that apparent PAA sensitivity of Ca2+ channels is not only defined by drug interaction with its receptor site but also crucially dependent on intrinsic gating properties of the channel molecule. A molecular model for PAA-Ca2+ channel interaction that accounts for the relationship between drug induced inactivation and channel block by PAA is proposed.
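As a rough illustration of why block appears use-dependent, the sketch below runs a toy two-state model in which each 100-ms pulse blocks a fixed fraction of the available channels and blocked/inactivated channels recover exponentially between pulses; at 1 Hz the inter-pulse interval is too short for full recovery, so block accumulates over the train, whereas at 0.1 Hz it barely does. All rate parameters are invented, not fitted to the (−)D600 data.

import math

def peak_current_fraction(freq_hz, n_pulses=30, block_per_pulse=0.15, tau_s=2.0):
    """Available fraction (peak current) seen at the last pulse of a train."""
    interval = 1.0 / freq_hz - 0.1           # recovery time between 100-ms pulses
    available = 1.0
    peaks = []
    for _ in range(n_pulses):
        peaks.append(available)                                     # current at this pulse
        available *= (1.0 - block_per_pulse)                        # block developed during the pulse
        blocked = 1.0 - available
        available = 1.0 - blocked * math.exp(-interval / tau_s)     # partial recovery before next pulse
    return peaks[-1]

for f in (0.1, 1.0):
    print(f"{f:>4} Hz: {peak_current_fraction(f):.2f} of the current remaining")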

Relevância:

60.00%

Publicador:

Resumo:

The fungal pathogen Ustilago hordei causes the covered smut disease of barley and oats. Mating and pathogenicity in this fungus are controlled by the MAT locus, which contains two distinct gene complexes, a and b. In this study, we tagged the a and b regions with the recognition sequence for the restriction enzyme I-SceI and determined that the distance between the complexes is 500 kb in a MAT-1 strain and 430 kb in a MAT-2 strain. Characterization of the organization of the known genes within the a and b gene complexes provided evidence for nonhomology and sequence inversion between MAT-1 and MAT-2. Antibiotic-resistance markers also were used to tag the a gene complex in MAT-1 strains (phleomycin) and the b gene complex in MAT-2 strains (hygromycin). Crosses were performed with these strains and progeny resistant to both antibiotics were recovered at a very low frequency, suggesting that recombination is suppressed within the MAT region. Overall, the chromosome homologues carrying the MAT locus of U. hordei share features with primitive sex chromosomes, with the added twist that the MAT locus also controls pathogenicity.

Relevância:

60.00%

Publicador:

Resumo:

Spontaneous magnetoencephalographic activity was recorded in awake, healthy human controls and in patients suffering from neurogenic pain, tinnitus, Parkinson's disease, or depression. Compared with controls, patients showed increased low-frequency θ rhythmicity, in conjunction with a widespread and marked increase of coherence among high- and low-frequency oscillations. These data indicate the presence of a thalamocortical dysrhythmia, which we propose is responsible for all the above mentioned conditions. This coherent θ activity, the result of a resonant interaction between thalamus and cortex, is due to the generation of low-threshold calcium spike bursts by thalamic cells. The presence of these bursts is directly related to thalamic cell hyperpolarization, brought about by either excess inhibition or disfacilitation. The emergence of positive clinical symptoms is viewed as resulting from ectopic γ-band activation, which we refer to as the “edge effect.” This effect is observable as increased coherence between low- and high-frequency oscillations, probably resulting from inhibitory asymmetry between high- and low-frequency thalamocortical modules at the cortical level.

Relevância:

60.00%

Publicador:

Resumo:

The low frequency of precursor cells specific for any particular antigen (Ag) makes it difficult to characterize preimmune T cell receptor (TCR) repertoires and to understand repertoire selection during an immune response. We have undertaken a combined adoptive transfer single-cell PCR approach to probe the Ag-specific preimmune repertoires of individual mice. Our strategy was to inject paired irradiated recipient mice with normal spleen cells prepared from individual donors and to compare the TCR repertoires subsequently selected during a CD8 response to a defined model Ag. We found that although some TCRs were shared, the TCR repertoires selected by mice receiving splenocytes from the same donor were not identical in terms of the TCRs selected and their relative frequencies. Our results together with computer simulations imply that individual mice express distinct Ag-specific preimmune TCR repertoires composed of expanded clones and that selection by Ag is a random process.
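A hedged sketch of the kind of computer simulation mentioned above: a donor's antigen-specific precursor pool is modelled as a small set of clones of unequal size, each of two recipients samples cells at random from that pool, and the overlap between the clonotypes they select is compared. Clone numbers and sizes are invented for illustration.

import numpy as np

rng = np.random.default_rng(11)
n_clones = 40
clone_sizes = rng.geometric(p=0.15, size=n_clones)        # some clones are expanded in the donor
p = clone_sizes / clone_sizes.sum()                       # sampling probability per clone

def recipient_repertoire(n_cells=60):
    """Clonotypes recovered by one recipient after transfer of n_cells precursors."""
    return set(rng.choice(n_clones, size=n_cells, p=p))

rep_a, rep_b = recipient_repertoire(), recipient_repertoire()
shared = len(rep_a & rep_b)
print(f"recipient A: {len(rep_a)} clonotypes, recipient B: {len(rep_b)}, shared: {shared}")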

Relevância:

60.00%

Publicador:

Resumo:

Testicular cancers respond favorably to chemotherapy with the platinum-containing drug cis-diamminedichloroplatinum(II) (cisplatin). One factor that could explain the efficacy of cisplatin is the low frequency of p53 mutations observed in this tumor type. The present study examines the p53-mediated responses in murine testicular teratocarcinoma cells exposed to the drug. Cisplatin treatment of teratocarcinoma cells with a wild-type p53 gene resulted in accumulation of the p53 protein through posttranscriptional mechanisms; induction of p53-target genes was also observed. Drug treatment resulted in rapid apoptosis in p53-wild-type cells but not in p53−/− teratocarcinoma cells. In the latter cells, cisplatin exposure caused prolonged cell cycle arrest accompanied by induction of the p21 gene. Clonogenic assays demonstrated that the p53 mutation did not confer resistance to cisplatin. These experiments suggest that cisplatin inhibits cellular proliferation of testicular teratocarcinoma cells by two possible mechanisms, p53-dependent apoptosis and p53-independent cell cycle arrest.