988 results for Hydén, Holger
Abstract:
The goal of this dissertation is to characterize and further develop a class of interferometric measurement instruments. Modulating the optical path length (OPLM) in the reference arm of an interferometric measurement system is a versatile approach. It is suitable for measuring surface profiles with a resolution down to the sub-nm range over a measurement range of up to 100 micrometres. When a static object is measured, the modulation in the reference arm produces a periodic interference pattern at the detector; this is illustrated schematically in a figure of the thesis. When the distance between the object and the instrument changes, the change in distance can be derived from the shift of the phase and/or the envelope of the interference pattern. Within this work, two functional OPLM measurement systems were developed, built and tested. They demonstrate that the OPLM approach can cover a broad range of applications with an optical measurement; however, the systems also reveal the limitations of the OPLM approach. The systems are based on a point measurement with a fibre-coupled probe and on a line measurement with a line-scan camera. To achieve high lateral resolution, the line-scan camera is combined with a microscope. For areal measurements, the object and the sensor must be moved relative to each other; a theory is therefore developed describing under which boundary conditions moving objects can be resolved by an OPLM measurement system. This theory is subsequently verified experimentally and confirmed. Several proven algorithms already exist for evaluating the interference signals produced by the modulation of the optical path length; these are examined for their suitability and compared with algorithms developed in this work. The central challenges in designing OPLM interferometers are also discussed, in particular how the choice of the actuator for the OPLM affects the entire measurement system. For both measurement systems, key components such as the analogue electronics and the actuators, as well as their operation, are explained. It is described in detail how an OPLM measurement system must be characterized and calibrated in order to deliver measurement values that are as reliable as possible. Finally, the capabilities of the two systems are demonstrated with example measurements and their measurement accuracy is characterized.
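As a minimal sketch of the generic interferometric relation implied above, the following assumes a reflective double-pass geometry and an illustrative wavelength (neither is stated in the abstract); it is not the evaluation algorithm developed in the thesis:

```python
import numpy as np

# Hypothetical illustration: converting an interferometric phase shift into a
# height change for a reflective (double-pass) setup. Generic textbook relation,
# not the thesis's own evaluation algorithm.
WAVELENGTH_NM = 633.0  # assumed HeNe-like source; not specified in the abstract

def phase_to_height(phase_rad: np.ndarray) -> np.ndarray:
    """Map unwrapped interference phase (rad) to surface height change (nm)."""
    unwrapped = np.unwrap(phase_rad)
    # Double pass: a height change dz shifts the optical path by 2*dz,
    # i.e. dphi = 4*pi*dz/lambda  =>  dz = dphi*lambda/(4*pi)
    return unwrapped * WAVELENGTH_NM / (4.0 * np.pi)

# Example: a quarter-wave phase step corresponds to lambda/8 ≈ 79 nm of height.
print(phase_to_height(np.array([0.0, np.pi / 2])))
```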
Abstract:
This study set out to demonstrate the importance of the citizen as an agent of oversight and control of State institutions, and to show that the media facilitate the articulation of demands that have been incubated and matured in spaces of political discussion and debate at the local and regional level. For this purpose, the case of the Rebelión de los Forajidos, which took place in Quito, Ecuador, was chosen.
Abstract:
Monograph on the history of vocational training in Europe. Abstract taken partially from the publication.
Abstract:
In the history of contemporary Ecuador, since the first national indigenous uprising in 1990, the indigenous movement has emerged as a new social actor that overturns its social condition and the role traditionally assigned to it by Ecuadorian society and the State, and that seeks to win a public space denied to it for many centuries owing to the indigenous situation of exclusion and social marginalization. From 1990 onward, in a decade characterized in Ecuador by an accumulation of new indigenous mobilizations and uprisings of national scope, the indigenous movement becomes a new and necessary social actor, since it assumes a political role that goes beyond its organizational base and, through a new discourse and political praxis, seeks to express a social sentiment that no longer fits the decadent discourse and political practice of trade unionism or of the traditional political parties. The State, for its part, had not been representing the interests of the Indigenous Peoples and Nationalities of Ecuador either, even though some governments created certain offices for indigenous affairs within the Ministerio de Bienestar Social. The State itself had not, since its original formation, altered its fundamental uninational principle, conceived from the ideology of racial mestizaje. Faced with the long-standing exclusion of the Indigenous Peoples promoted by the Ecuadorian State, and with more immediate developments such as structural adjustment, the privatizing modernization of the State and the intervention of international policies, the indigenous movement assumed a leading role in the 1990s through substantially indigenous proposals seeking recognition of Ecuador as a pluricultural, multi-ethnic and plurinational country, in the sense of democratizing Ecuadorian society and the State. On this level, indigenous and Black participation became necessary, in the spirit of "Nunca más un Ecuador sin nosotros" (Never again an Ecuador without us). The research I propose is set in this context. During the 1990s the Indigenous Peoples and Nationalities, mediated by the indigenous movement, sought on the one hand a closer relationship with the State through dialogue and political negotiation, and on the other hand the structuring of new relations between indigenous peoples and the State through the creation of institutions such as the Comisión Coordinadora de Asuntos Indígenas; the Secretaría Nacional de Asuntos Indígenas y Minorías Étnicas (SENAIME); the Consejo Nacional de Planificación y Desarrollo de los Pueblos Indígenas y Negros del Ecuador (CONPLADEIN); and the Consejo de Desarrollo de las Nacionalidades y Pueblos del Ecuador (CODENPE). The objective of the research is to analyse what each indigenous mobilization before the Ecuadorian State achieved during the indigenous events of 1990, 1992, 1994 and 1997, and to examine the emergence of a new institutional framework between the indigenous peoples and the Ecuadorian State from 1990 to 1998. Within this framework, what I want to determine is: what did each mobilization before the Ecuadorian State achieve during the indigenous events of 1990, 1992, 1994 and 1997, and how did a new institutional framework between the indigenous peoples and the State emerge after each mobilization?
Abstract:
Presents reviews of the following books: César Montaño Galarza and Juan Carlos Mogrovejo Jaramillo, Derecho tributario municipal ecuatoriano: fundamentos y práctica, Quito, Universidad Andina Simón Bolívar / Corporación Editora Nacional, 2014. -- Álvaro Renato Mejía Salazar, Los medios de impugnación ante el proceso y el procedimiento contemporáneo, Quito, Ed. Legales, 2013, 157 pp. -- Holger Paúl Córdova, Los derechos sin poder popular. Presente y futuro de la participación, comunicación e información, Quito, Centro Andino de Estudios Estratégicos y Centro de Estudios Construyendo Ciudadanía y Democracia - ISPCI-UCE, 2013.
Abstract:
While selenium (Se) is an essential micronutrient for humans, epidemiological studies have raised concern that supranutritional Se intake may increase the risk of developing Type 2 diabetes mellitus (T2DM). We aimed to determine the impact of Se, at a dose and from a source frequently ingested by humans, on markers of insulin sensitivity and signalling. Male pigs were fed either a Se-adequate (0.17 mg Se/kg) or a Se-supranutritional (0.50 mg Se/kg; high-Se) diet. After 16 weeks of intervention, fasting plasma insulin and cholesterol levels were non-significantly increased in the high-Se pigs, whereas fasting glucose concentrations did not differ between the two groups. In skeletal muscle of high-Se pigs, glutathione peroxidase activity was increased, gene expression of forkhead box O1 transcription factor and peroxisome proliferator-activated receptor-γ coactivator 1 was increased, and gene expression of the glycolytic enzyme pyruvate kinase was decreased. In visceral adipose tissue of high-Se pigs, mRNA levels of sterol regulatory element-binding transcription factor 1 were increased, and the phosphorylation of Akt, AMP-activated kinase and mitogen-activated protein kinases was affected. In conclusion, dietary Se oversupply may affect the expression and activity of proteins involved in energy metabolism in major insulin target tissues, though this is probably not sufficient to induce diabetes.
Abstract:
A set of random variables is exchangeable if its joint distribution function is invariant under permutation of the arguments. The concept of exchangeability is discussed, with a view towards potential application in evaluating ensemble forecasts. It is argued that the paradigm of ensembles being an independent draw from an underlying distribution function is probably too narrow; allowing ensemble members to be merely exchangeable might be a more versatile model. The question is discussed whether established methods of ensemble evaluation need alteration under this model, with reliability being given particular attention. It turns out that the standard methodology of rank histograms can still be applied. As a first application of the exchangeability concept, it is shown that the method of minimum spanning trees to evaluate the reliability of high-dimensional ensembles is mathematically sound.
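As an illustration of the rank-histogram methodology referred to above, here is a minimal sketch, assuming a historical archive of ensemble-verification pairs stored as NumPy arrays; function and variable names are illustrative, not taken from the paper:

```python
import numpy as np

def rank_histogram(ensembles: np.ndarray, verifications: np.ndarray,
                   rng=None) -> np.ndarray:
    """Count how often the verification falls into each of the K+1 bins
    defined by the ordered ensemble members (ties broken at random).

    ensembles: shape (n_cases, K); verifications: shape (n_cases,).
    """
    rng = rng or np.random.default_rng()
    K = ensembles.shape[1]
    counts = np.zeros(K + 1, dtype=int)
    for members, obs in zip(ensembles, verifications):
        below = int(np.sum(members < obs))
        ties = int(np.sum(members == obs))
        # Random tie-breaking keeps the histogram flat for exchangeable members.
        counts[below + rng.integers(ties + 1)] += 1
    return counts

# For exchangeable ensemble members and verification, each bin is equally likely:
rng = np.random.default_rng(0)
ens = rng.normal(size=(100_000, 9))
obs = rng.normal(size=100_000)
print(rank_histogram(ens, obs, rng))  # roughly 10_000 counts in each of the 10 bins
```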
Abstract:
In a recent paper, Mason et al. propose a reliability test of ensemble forecasts for a continuous, scalar verification. As noted in the paper, the test relies on a very specific interpretation of ensembles, namely, that the ensemble members represent quantiles of some underlying distribution. This quantile interpretation is not the only interpretation of ensembles, another popular one being the Monte Carlo interpretation. Mason et al. suggest estimating the quantiles in this situation; however, this approach is fundamentally flawed. Errors in the quantile estimates are not independent of the exceedance events, and consequently the conditional exceedance probability (CEP) curves are not constant, violating a fundamental assumption of the test. The test would therefore reject reliable forecasts with a probability much higher than the nominal test size.
Abstract:
An ensemble forecast is a collection of runs of a numerical dynamical model, initialized with perturbed initial conditions. In modern weather prediction, for example, ensembles are used to retrieve probabilistic information about future weather conditions. In this contribution, we are concerned with ensemble forecasts of a scalar quantity (say, the temperature at a specific location). We consider the event that the verification is smaller than the smallest, or larger than the largest, ensemble member. We call these events outliers. If a K-member ensemble accurately reflected the variability of the verification, outliers should occur with a base rate of 2/(K + 1). In operational forecast ensembles, though, this frequency is often found to be higher. We study the predictability of outliers and find that, exploiting information available from the ensemble, forecast probabilities for outlier events can be calculated which are more skilful than the unconditional base rate. We prove this analytically for statistically consistent forecast ensembles. Further, the analytical results are compared to the predictability of outliers in an operational forecast ensemble by means of model output statistics. We find the analytical and empirical results to agree both qualitatively and quantitatively.
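A short simulation sketch of the 2/(K + 1) base rate quoted above, assuming a statistically consistent ensemble whose members and verification are i.i.d. draws from the same distribution; this is illustrative only and not the paper's model-output-statistics analysis:

```python
import numpy as np

# Consistent ensemble: members and verification drawn from the same distribution.
rng = np.random.default_rng(42)
K, n_cases = 9, 200_000

members = rng.normal(size=(n_cases, K))
verification = rng.normal(size=n_cases)

# Outlier: verification falls below the smallest or above the largest member.
is_outlier = (verification < members.min(axis=1)) | (verification > members.max(axis=1))
print(is_outlier.mean())   # empirical outlier frequency
print(2 / (K + 1))         # theoretical base rate: 0.2
```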
Abstract:
The application of forecast ensembles to probabilistic weather prediction has spurred considerable interest in their evaluation. Such ensembles are commonly interpreted as Monte Carlo ensembles, meaning that the ensemble members are perceived as random draws from a distribution. Under this interpretation, a reasonable property to ask for is statistical consistency, which demands that the ensemble members and the verification behave like draws from the same distribution. A widely used technique to assess the statistical consistency of a historical dataset is the rank histogram, which uses as its criterion the number of times the verification falls between consecutive members of the ordered ensemble. Ensemble evaluation is rendered more specific by stratification, which means that ensembles satisfying a certain condition (e.g., a certain meteorological regime) are evaluated separately. Fundamental relationships between Monte Carlo ensembles, their rank histograms, and random sampling from the probability simplex according to the Dirichlet distribution are pointed out. Furthermore, the possible benefits and complications of ensemble stratification are discussed. The main conclusion is that a stratified Monte Carlo ensemble might appear inconsistent with the verification even though the original (unstratified) ensemble is consistent; the apparent inconsistency is merely a result of the stratification. Stratified rank histograms are thus not necessarily flat. This result is demonstrated by perfect-ensemble simulations and supplemented by mathematical arguments. Possible methods to avoid or remove the artifacts that stratification induces in the rank histogram are suggested.
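To make the stratification effect concrete, here is a toy perfect-ensemble sketch (an assumed setup, not the paper's experiments): the unstratified rank histogram is approximately flat, but stratifying on a condition computed from the ensemble itself produces a sloped histogram.

```python
import numpy as np

rng = np.random.default_rng(1)
K, n_cases = 9, 200_000

mu = rng.normal(size=n_cases)                        # forecast situation
members = rng.normal(mu[:, None], 1.0, size=(n_cases, K))
obs = rng.normal(mu, 1.0)                            # same distribution as members

ranks = (members < obs[:, None]).sum(axis=1)         # rank of verification, 0..K

def normalized_hist(r):
    return np.bincount(r, minlength=K + 1) / len(r)

print(normalized_hist(ranks))                        # approximately uniform
stratum = members.mean(axis=1) > 0.0                 # condition depends on the ensemble
print(normalized_hist(ranks[stratum]))               # sloped: low ranks over-represented
```

In the stratum, member sets with unusually high values are over-selected relative to the verification, so the non-flat histogram reflects the stratification rather than any inconsistency of the original ensemble.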
Abstract:
We present the first climate prediction of the coming decade made with multiple models, initialized with prior observations. This prediction accrues from an international activity to exchange decadal predictions in near real-time, in order to assess differences and similarities, provide a consensus view to prevent over-confidence in forecasts from any single model, and establish current collective capability. We stress that the forecast is experimental, since the skill of the multi-model system is as yet unknown. Nevertheless, the forecast systems used here are based on models that have undergone rigorous evaluation and individually have been evaluated for forecast skill. Moreover, it is important to publish forecasts to enable open evaluation, and to provide a focus on climate change in the coming decade. Initialized forecasts of the year 2011 agree well with observations, with a pattern correlation of 0.62 compared to 0.31 for uninitialized projections. In particular, the forecast correctly predicted La Niña in the Pacific, and warm conditions in the north Atlantic and USA. A similar pattern is predicted for 2012 but with a weaker La Niña. Indices of Atlantic multi-decadal variability and Pacific decadal variability show no signal beyond climatology after 2015, while temperature in the Niño3 region is predicted to warm slightly by about 0.5 °C over the coming decade. However, uncertainties are large for individual years and initialization has little impact beyond the first 4 years in most regions. Relative to uninitialized forecasts, initialized forecasts are significantly warmer in the north Atlantic sub-polar gyre and cooler in the north Pacific throughout the decade. They are also significantly cooler in the global average and over most land and ocean regions out to several years ahead. However, in the absence of volcanic eruptions, global temperature is predicted to continue to rise, with each year from 2013 onwards having a 50 % chance of exceeding the current observed record. Verification of these forecasts will provide an important opportunity to test the performance of models and our understanding and knowledge of the drivers of climate change.
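The pattern correlation quoted above is not defined in the abstract; a common choice is an area-weighted centred anomaly correlation, sketched below with illustrative random arrays in place of real gridded anomalies:

```python
import numpy as np

# Hedged sketch of an area-weighted (cos-latitude) centred pattern correlation,
# a common way to compare a forecast anomaly map with the observed one. The
# paper does not spell out its exact definition; this is an assumed, generic form.
def pattern_correlation(forecast: np.ndarray, observed: np.ndarray,
                        lat_deg: np.ndarray) -> float:
    """forecast, observed: (n_lat, n_lon) anomaly fields; lat_deg: (n_lat,) latitudes."""
    w = np.cos(np.deg2rad(lat_deg))[:, None] * np.ones_like(forecast)
    w = w / w.sum()
    f = forecast - (w * forecast).sum()   # remove area-weighted means
    o = observed - (w * observed).sum()
    return float((w * f * o).sum() / np.sqrt((w * f**2).sum() * (w * o**2).sum()))

# Illustrative usage (random fields; real use would pass gridded anomalies):
rng = np.random.default_rng(0)
lat = np.linspace(-87.5, 87.5, 72)
print(pattern_correlation(rng.normal(size=(72, 144)), rng.normal(size=(72, 144)), lat))
```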
Abstract:
This paper provides an update on research in the relatively new and fast-moving field of decadal climate prediction, and addresses the use of decadal climate predictions not only for potential users of such information but also for improving our understanding of processes in the climate system. External forcing influences the predictions throughout, but its contribution to predictive skill becomes dominant once most of the improved skill from initialization with observations has vanished, after about six to nine years. Recent multi-model results suggest that there is relatively more decadal predictive skill in the North Atlantic, western Pacific, and Indian Oceans than in other regions of the world oceans. Aspects of decadal variability of SSTs, like the mid-1970s shift in the Pacific, the mid-1990s shift in the northern North Atlantic and western Pacific, and the early-2000s hiatus, are better represented in initialized hindcasts than in uninitialized simulations. There is evidence of higher skill in initialized multi-model ensemble decadal hindcasts than in single-model results, with multi-model initialized predictions of near-term climate showing somewhat less global warming than uninitialized simulations. Some decadal hindcasts have shown statistically reliable predictions of surface temperature over various land and ocean regions for lead times of up to 6-9 years, but this needs to be investigated in a wider set of models. As in the early days of El Niño-Southern Oscillation (ENSO) prediction, improvements to models will reduce the need for bias adjustment and increase the reliability, and thus the usefulness, of decadal climate predictions in the future.