889 results for Space-time intervention
Abstract:
This thesis presents a method for the numerical solution of the two-dimensional shallow water equations, which model the flow behaviour of bodies of water whose surface extent is much larger than their depth. These equations describe the gravity-driven temporal evolution of a given initial state in bodies of water with a free surface. This class includes problems such as the behaviour of waves on shallow beaches or the movement of a flood wave in a river. These examples clearly show the need to account for the influence of topography and for the treatment of wet/dry transitions in the scheme. This dissertation presents a finite-volume scheme, of high accuracy in regions of sufficient water depth, for numerically computing the temporal evolution of the solution of the two-dimensional shallow water equations from given initial and boundary conditions on an unstructured grid. The scheme is able to account for the influence of topographic source terms on the flow and to balance this influence exactly against the numerical fluxes in so-called "lake at rest" steady states. The scheme is based on a first-order finite-volume approach, which is extended by a WENO reconstruction using a least-squares method and a so-called space-time expansion, with the goal of obtaining a scheme of arbitrarily high order. The Riemann problems arising in the scheme are solved with the Riemann solver of Chinnayya, LeRoux, and Seguin (1999), which takes the influence of topography on the flow into account. The thesis proves that the coefficients of the reconstruction polynomials computed by the WENO scheme approximate the spatial derivatives of the function being reconstructed with an accuracy matching the order of the scheme.
It is likewise proved that the coefficients of the polynomial resulting from the space-time expansion approximate the spatial and temporal derivatives of the solution of the initial value problem. Furthermore, the well-balancedness of the scheme is proved for arbitrarily high numerical order. For the treatment of wet/dry transitions, a method of order reduction depending on water depth and cell size is proposed. This is necessary to avoid negative values of the water depth in the computation, which can occur as a consequence of oscillations of the space-time polynomial. Numerical results confirming the theoretical order of the scheme are presented, along with examples demonstrating the excellent properties of the overall scheme on challenging problems.
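For orientation, the two-dimensional shallow water equations with a topographic source term that such a scheme discretises can be written in the conventional form below (a sketch in standard notation, not transcribed from the thesis):

```latex
\partial_t h + \nabla\cdot(h\mathbf{u}) = 0, \qquad
\partial_t(h\mathbf{u}) + \nabla\cdot\!\left(h\,\mathbf{u}\otimes\mathbf{u}
  + \tfrac{1}{2}\,g\,h^{2}\,\mathbf{I}\right) = -\,g\,h\,\nabla b,
```

where h is the water depth, u the depth-averaged velocity, g the gravitational acceleration, and b the bottom topography. The "lake at rest" steady state is u = 0 with h + b constant; a well-balanced scheme preserves this state exactly at the discrete level.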
Abstract:
The genetic information and data emerging today from human genome research demand the development of software tools capable of processing the large amount of information available. Most genetic data come from equipment that simultaneously analyzes hundreds or thousands of polymorphisms or genetic variants, and from new, higher-throughput laboratory techniques which, together, make more information available in a short time. This situation creates the need for new software tools capable of handling this growing volume of genetic data. In population genetics, although software tools exist that process data and facilitate analysis, they have limitations: some require users to know programming languages in order to enter the data, others do not perform all the required estimates, and still others are limited in the amount of data they can import or handle. In some cases there is redundancy, since two or more tools must be used to process a single genetic data set. The aim of this work is the development of a software tool based on common desktop applications, in this case Microsoft Excel®, that resolves all the problems and limitations described above. The set of subroutines that make up Lustro overcomes these issues and presents results in a simple, familiar, easy-to-operate environment, thereby simplifying the user's adaptation to the program; without prior training, the genetic information of interest can be processed in a short time.
Abstract:
This work aims to implement a logistics improvement that adds value, increases efficiency, and improves the storage and distribution processes, inventory control management, and industrial safety of the company YOKOMOTOS. An in-depth study was carried out of the company's current situation and problems, with the goal of producing results that set the company apart in the motorcycle spare-parts market and earn it greater prestige and recognition across Latin America. Possible solutions were likewise identified to mitigate these problems by improving processes in the areas of storage, inventory systems, and industrial safety. Several pilot tests were run to assess the feasibility of the proposed solutions, analyzing space, time, and cost. Finally, the best solution was implemented and adjusted to the company's requirements, improving the storage and distribution processes and adding value to its supply chain.
Abstract:
This paper presents the results of research I have been conducting for several years on El Otoño del Patriarca. This research underwent an important shift concerning the aims of a philosophical reading of a literary text. That shift was the consequence of, on the one hand, a displacement in how the relations among the dimensions of language, power, time, and space opened up by the narration are determined, and on the other, a different conception of the discursive formation of these relations in El Otoño del Patriarca as a narrative, oral, and poetic work.
Abstract:
Abstract taken from the publication
Abstract:
The author offers a sociological analysis of Pareja's novel, grounded in the criteria of the critic Lucien Goldmann. The narratological categories that serve as a guide are the narrator, space, time, and the characters. He dwells on the three spaces that Pareja projects in the novel: the countryside; the city (a sphere identified with poverty, theft, and abuse by the authorities); and the outside world (barely referential, yet weighing on the beginnings of the country's modernization). He reviews the characters and the author's mental structure. He concludes that the text proposes a different conception of the art of the novel, in step with the European avant-garde, even though it depicts a social reality that does not change (Baldomera was born poor and moves between the brothel, the bar, the hospital, and, finally, jail). The text rises above mere testimonial document thanks to the design of its protagonist: despite her imposing physique, and although she personifies misfortune, Baldomera also expresses the values of fidelity, maternal love, and solidarity.
Abstract:
Presented herein is an experimental design that allows the effects of several radiative forcing factors on climate to be estimated as precisely as possible from a limited suite of atmosphere-only general circulation model (GCM) integrations. The forcings include the combined effect of observed changes in sea surface temperatures, sea ice extent, stratospheric (volcanic) aerosols, and solar output, plus the individual effects of several anthropogenic forcings. A single linear statistical model is used to estimate the forcing effects, each of which is represented by its global mean radiative forcing. The strong collinearity in time between the various anthropogenic forcings poses a technical problem that is overcome through the design of the experiment. This design uses every combination of anthropogenic forcings rather than a few highly replicated ensembles, the approach more common in climate studies. Not only is this design highly efficient for a given number of integrations, but it also allows the estimation of (nonadditive) interactions between pairs of anthropogenic forcings. The simulated land surface air temperature changes since 1871 have been analyzed. The changes in natural and oceanic forcing, which itself contains some forcing from anthropogenic and natural influences, have the most influence. For the global mean, increasing greenhouse gases and the indirect aerosol effect had the largest anthropogenic effects. An interaction between these two anthropogenic effects was also found in the atmosphere-only GCM. This interaction is similar in magnitude to the individual effects of changing tropospheric and stratospheric ozone concentrations or to the direct (sulfate) aerosol effect. Various diagnostics are used to evaluate the fit of the statistical model.
For the global mean, this shows that the land temperature response is proportional to the global mean radiative forcing, reinforcing the use of radiative forcing as a measure of climate change. The diagnostic tests also show that the linear model was suitable for analyses of land surface air temperature at each GCM grid point. Therefore, the linear model provides precise estimates of the space-time signals for all forcing factors under consideration. For simulated 50-hPa temperatures, results show that tropospheric ozone increases have contributed to stratospheric cooling over the twentieth century almost as much as changes in well-mixed greenhouse gases.
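The full-factorial design described above can be sketched as follows (an illustrative reconstruction, not the authors' code): one integration is run for every on/off combination of k anthropogenic forcings, and main effects plus pairwise interactions are then estimated in a single least-squares fit.

```python
import numpy as np

def factorial_design(k):
    """All 2**k on/off combinations of k forcings, one row per integration."""
    return np.array([[(i >> j) & 1 for j in range(k)]
                     for i in range(2 ** k)], dtype=float)

def fit_effects(X, y):
    """Least-squares estimates of intercept, main effects, and pairwise interactions."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, j] for j in range(k)]
    pairs = [(a, b) for a in range(k) for b in range(a + 1, k)]
    cols += [X[:, a] * X[:, b] for a, b in pairs]   # interaction columns
    A = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta, pairs

# Synthetic "temperature responses" for k = 3 forcings: effect of forcing 0 is
# 0.8, of forcing 1 is -0.4, plus a 0-by-1 interaction of 0.2 (all hypothetical).
X = factorial_design(3)
y = 0.1 + 0.8 * X[:, 0] - 0.4 * X[:, 1] + 0.2 * X[:, 0] * X[:, 1]
beta, pairs = fit_effects(X, y)
```

Because the full factorial with pairwise interactions yields a full-rank design matrix (8 runs, 7 coefficients for k = 3), the fit separates effects that would be collinear in time in a purely observational regression.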
Abstract:
Channel estimation is a key issue in MIMO systems. In recent years, many papers on subspace (SS)-based blind channel estimation have been published. In this paper, combining the SS method with a space-time coding scheme, we propose a novel blind channel estimation method for MIMO systems. Simulation results demonstrate the effectiveness of the method.
Abstract:
When the orthogonal space-time block code (STBC), or the Alamouti code, is applied to a multiple-input multiple-output (MIMO) communications system, optimum reception can be achieved by a simple signal decoupling at the receiver. The performance, however, deteriorates significantly in the presence of co-channel interference (CCI) from other users. In this paper, this CCI problem is overcome by applying independent component analysis (ICA), a blind source separation algorithm. This is based on the fact that, if the transmission data from every transmit antenna are mutually independent, they can be effectively separated at the receiver on the principle of blind source separation; equivalently, the CCI is suppressed. Although it is not required by the ICA algorithm itself, a small amount of training data is necessary to eliminate the phase and order ambiguities at the ICA outputs, leading to a semi-blind approach. Numerical simulations are also shown to verify the proposed ICA approach in the multiuser MIMO system.
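The "simple signal decoupling" of the Alamouti code mentioned above can be sketched as follows (an illustrative, noise-free, single-receive-antenna example; not the paper's code): two symbols are sent over two antennas in two slots, and each symbol is recovered by a linear combination of the two received samples.

```python
import numpy as np

def alamouti_decouple(r1, r2, h1, h2):
    """Recover s1, s2 from slot-1/slot-2 receptions r1, r2, given channel gains h1, h2."""
    g = abs(h1) ** 2 + abs(h2) ** 2
    s1_hat = (np.conj(h1) * r1 + h2 * np.conj(r2)) / g
    s2_hat = (np.conj(h2) * r1 - h1 * np.conj(r2)) / g
    return s1_hat, s2_hat

# Hypothetical channel gains and QPSK-like symbols.
h1, h2 = 0.8 + 0.3j, -0.5 + 0.9j
s1, s2 = 1 + 1j, -1 + 1j
r1 = h1 * s1 + h2 * s2                       # slot 1: antennas send (s1, s2)
r2 = -h1 * np.conj(s2) + h2 * np.conj(s1)    # slot 2: antennas send (-s2*, s1*)
s1_hat, s2_hat = alamouti_decouple(r1, r2, h1, h2)
```

The orthogonality of the code makes the cross terms cancel exactly, which is why no joint detection is needed; CCI from other users breaks this cancellation, motivating the ICA stage described in the abstract.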
Abstract:
This paper introduces perspex algebra, which is being developed as a common representation of geometrical knowledge. A perspex can currently be interpreted in one of four ways. First, the algebraic perspex is a generalization of matrices; it provides the most general representation for all of the interpretations of a perspex. The algebraic perspex can be used to describe arbitrary sets of coordinates. The remaining three interpretations of the perspex are all related to square matrices and operate in a Euclidean model of projective space-time, called perspex space. Perspex space differs from the usual Euclidean model of projective space in that it contains the point at nullity. It is argued that the point at nullity is necessary for a consistent account of perspective in top-down vision. Second, the geometric perspex is a simplex in perspex space. It can be used as a primitive building block for shapes, or as a way of recording landmarks on shapes. Third, the transformational perspex describes linear transformations in perspex space that provide the affine and perspective transformations in space-time. It can be used to match a prototype shape to an image, even in so-called 'accidental' views where the depth of an object disappears from view, or an object stays in the same place across time. Fourth, the parametric perspex describes the geometric and transformational perspexes in terms of parameters that are related to everyday English descriptions. The parametric perspex can be used to obtain both continuous and categorical perception of objects. The paper ends with a discussion of issues related to using a perspex to describe logic.
Abstract:
In this paper we consider boundary integral methods applied to boundary value problems for the positive definite Helmholtz-type problem −ΔU + α²U = 0 in a bounded or unbounded domain, with the parameter α real and possibly large. Applications arise in the implementation of space-time boundary integral methods for the heat equation, where α is proportional to 1/√Δt and Δt is the time step. The corresponding layer potentials arising from this problem depend nonlinearly on the parameter α and have kernels which become highly peaked as α → ∞, causing standard discretization schemes to fail. We propose a new collocation method with a robust convergence rate as α → ∞. Numerical experiments on a model problem verify the theoretical results.
Abstract:
Using the formalism of the Ruelle response theory, we study how the invariant measure of an Axiom A dynamical system changes as a result of adding noise, and describe how the stochastic perturbation can be used to explore the properties of the underlying deterministic dynamics. We first find the expression for the change in the expectation value of a general observable when a white noise forcing is introduced in the system, both in the additive and in the multiplicative case. We also show that the difference between the expectation value of the power spectrum of an observable in the stochastically perturbed case and of the same observable in the unperturbed case is equal to the variance of the noise times the square of the modulus of the linear susceptibility describing the frequency-dependent response of the system to perturbations with the same spatial patterns as the considered stochastic forcing. This provides a conceptual bridge between the change in the fluctuation properties of the system due to the presence of noise and the response of the unperturbed system to deterministic forcings. Using Kramers-Kronig theory, it is then possible to derive the real and imaginary part of the susceptibility and thus deduce the Green function of the system for any desired observable. We then extend our results to rather general patterns of random forcing, from the case of several white noise forcings, to noise terms with memory, up to the case of a space-time random field. Explicit formulas are provided for each relevant case analysed. As a general result, we find, using an argument of positive-definiteness, that the power spectrum of the stochastically perturbed system is larger at all frequencies than the power spectrum of the unperturbed system. We provide an example of application of our results by considering the spatially extended chaotic Lorenz 96 model. 
These results clarify the property of stochastic stability of SRB measures in Axiom A flows, provide tools for analysing stochastic parameterisations and the related closure ansätze to be implemented in modelling studies, and introduce new ways to study the response of a system to external perturbations. Taking into account the chaotic hypothesis, we expect that our results have practical relevance for a more general class of systems than those belonging to Axiom A.
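The spectral relation stated in the abstract above can be written compactly as (a sketch in assumed notation, for a single white-noise forcing of variance σ² acting with a fixed spatial pattern):

```latex
P^{\sigma}_{A}(\omega) \;=\; P^{0}_{A}(\omega) \;+\; \sigma^{2}\,\bigl|\chi_{A}(\omega)\bigr|^{2},
```

where P⁰_A and Pσ_A are the power spectra of the observable A in the unperturbed and stochastically perturbed systems and χ_A is the linear susceptibility to perturbations with the same spatial pattern as the forcing. Since the added term is non-negative, the perturbed spectrum is larger at every frequency, which is the positive-definiteness argument mentioned in the abstract.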
Abstract:
We reconsider the theory of the linear response of non-equilibrium steady states to perturbations. We first show that by using a general functional decomposition for space-time dependent forcings, we can define elementary susceptibilities that allow the response of the system to general perturbations to be constructed. Starting from the definition of the SRB measure, we then study the consequences of taking different sampling schemes for analysing the response of the system. We show that only a specific choice of the time horizon for evaluating the response of the system to a general time-dependent perturbation yields the formula first presented by Ruelle. We also discuss the special case of periodic perturbations, showing that when they are taken into consideration the sampling can be fine-tuned to make the definition of the correct time horizon immaterial. Finally, we discuss the implications of our results for strategies for analysing the outputs of numerical experiments, providing a critical review of a formula proposed by Reick.
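In the standard notation assumed here (not transcribed from the paper), the linear response discussed above takes the form

```latex
\delta\langle A\rangle(t) \;=\; \int_{-\infty}^{\infty} G_{A}(t-s)\,f(s)\,\mathrm{d}s,
\qquad
\chi_{A}(\omega) \;=\; \int_{0}^{\infty} G_{A}(t)\,e^{-i\omega t}\,\mathrm{d}t,
```

for a perturbation with time modulation f and a fixed spatial pattern, where G_A is the causal Green function of the observable A (G_A(t) = 0 for t < 0). The choice of time horizon examined in the abstract determines over which window this convolution is actually sampled in a numerical experiment.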
Abstract:
The separate effects of ozone depleting substances (ODSs) and greenhouse gases (GHGs) on forcing circulation changes in the Southern Hemisphere extratropical troposphere are investigated using a version of the Canadian Middle Atmosphere Model (CMAM) that is coupled to an ocean. Circulation-related diagnostics include zonal wind, tropopause pressure, Hadley cell width, jet location, annular mode index, precipitation, wave drag, and eddy fluxes of momentum and heat. As expected, the tropospheric response to the ODS forcing occurs primarily in austral summer, with past (1960-99) and future (2000-99) trends of opposite sign, while the GHG forcing produces more seasonally uniform trends with the same sign in the past and future. In summer the ODS forcing dominates past trends in all diagnostics, while the two forcings contribute nearly equally but oppositely to future trends. The ODS forcing produces a past surface temperature response consisting of cooling over eastern Antarctica, and is the dominant driver of past summertime surface temperature changes when the model is constrained by observed sea surface temperatures. For all diagnostics, the response to the ODS and GHG forcings is additive: that is, the linear trend computed from the simulations using the combined forcings equals (within statistical uncertainty) the sum of the linear trends from the simulations using the two separate forcings. Space-time spectra of eddy fluxes and the spatial distribution of transient wave drag are examined to assess the viability of several recently proposed mechanisms for the observed poleward shift in the tropospheric jet.
The Asian summer monsoon: an intercomparison of CMIP5 vs. CMIP3 simulations of the late 20th century
Abstract:
The boreal summer Asian monsoon has been evaluated in 25 Coupled Model Intercomparison Project-5 (CMIP5) and 22 CMIP3 GCM simulations of the late 20th Century. Diagnostics and skill metrics have been calculated to assess the time-mean, climatological annual cycle, interannual variability, and intraseasonal variability. Progress has been made in modeling these aspects of the monsoon, though there is no single model that best represents all of these aspects of the monsoon. The CMIP5 multi-model mean (MMM) is more skillful than the CMIP3 MMM for all diagnostics in terms of the skill of simulating pattern correlations with respect to observations. Additionally, for rainfall/convection the MMM outperforms the individual models for the time mean, the interannual variability of the East Asian monsoon, and intraseasonal variability. The pattern correlation of the time (pentad) of monsoon peak and withdrawal is better simulated than that of monsoon onset. The onset of the monsoon over India is typically too late in the models. The extension of the monsoon over eastern China, Korea, and Japan is underestimated, while it is overestimated over the subtropical western/central Pacific Ocean. The anti-correlation between anomalies of all-India rainfall and Niño-3.4 sea surface temperature is overly strong in CMIP3 and typically too weak in CMIP5. For both the ENSO-monsoon teleconnection and the East Asian zonal wind-rainfall teleconnection, the MMM interannual rainfall anomalies are weak compared to observations. Though simulation of intraseasonal variability remains problematic, several models show improved skill at representing the northward propagation of convection and the development of the tilted band of convection that extends from India to the equatorial west Pacific. The MMM also well represents the space-time evolution of intraseasonal outgoing longwave radiation anomalies. 
Caution is necessary when using GPCP and CMAP rainfall to validate (1) the time-mean rainfall, as there are systematic differences over ocean and land between these two data sets, and (2) the timing of monsoon withdrawal over India, where the smooth southward progression seen in India Meteorological Department data is better realized in CMAP data compared to GPCP data.
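The pattern-correlation skill metric used throughout the intercomparison above can be sketched as follows (an illustrative, unweighted version; operational metrics typically apply area weighting, which is omitted here):

```python
import numpy as np

def pattern_correlation(model, obs):
    """Centered spatial (pattern) correlation between two 2-D fields."""
    m = model - model.mean()
    o = obs - obs.mean()
    return float((m * o).sum() / np.sqrt((m ** 2).sum() * (o ** 2).sum()))

# Toy fields: the "model" reproduces the observed pattern up to an offset and
# scale, so the pattern correlation is 1 even though the absolute values differ.
obs = np.array([[1.0, 2.0], [3.0, 4.0]])
model = 2.0 * obs + 5.0
r = pattern_correlation(model, obs)
```

Because the metric removes the spatial mean and normalizes amplitude, it rewards getting the spatial structure right rather than the mean bias, which is why it is a common headline skill score for multi-model comparisons.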