29 results for Partial oxalate method
in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
We show that if performance measures in a stochastic scheduling problem satisfy a set of so-called partial conservation laws (PCL), which extend the previously studied generalized conservation laws (GCL), then the problem is solved optimally by a priority-index policy for an appropriate range of linear performance objectives, where the optimal indices are computed by a one-pass adaptive-greedy algorithm based on Klimov's. We further apply this framework to investigate the indexability property of restless bandits introduced by Whittle, obtaining the following results: (1) we identify a class of restless bandits (PCL-indexable) which are indexable; membership in this class is tested through a single run of the adaptive-greedy algorithm, which also computes the Whittle indices when the test is positive; this provides a tractable sufficient condition for indexability; (2) we further identify the class of GCL-indexable bandits, which includes classical bandits, having the property that they are indexable under any linear reward objective. The analysis is based on the so-called achievable region method, as the results follow from new linear programming formulations for the problems investigated.
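As a hedged, generic illustration of what a Whittle index is (this is not the paper's adaptive-greedy algorithm; the transition matrices, rewards and parameter values below are made up), the sketch estimates the index of a single restless arm by bisection on the passivity subsidy, assuming the arm is indexable.

```python
import numpy as np

def q_values(P_a, P_p, r_a, r_p, lam, beta=0.9, tol=1e-9):
    """Discounted value iteration for one arm with passivity subsidy `lam`.
    Returns the action values (active, passive) in every state."""
    V = np.zeros(len(r_a))
    while True:
        Q_a = r_a + beta * (P_a @ V)          # play the arm
        Q_p = r_p + lam + beta * (P_p @ V)    # rest the arm, collect the subsidy
        V_new = np.maximum(Q_a, Q_p)
        if np.max(np.abs(V_new - V)) < tol:
            return Q_a, Q_p
        V = V_new

def whittle_index(P_a, P_p, r_a, r_p, state, lo=-50.0, hi=50.0, iters=60):
    """Bisection on the subsidy until the arm is indifferent between acting
    and resting in `state` (assumes indexability of the arm)."""
    for _ in range(iters):
        lam = 0.5 * (lo + hi)
        Q_a, Q_p = q_values(P_a, P_p, r_a, r_p, lam)
        if Q_a[state] > Q_p[state]:
            lo = lam    # subsidy too small: still worth acting
        else:
            hi = lam    # subsidy large enough: resting preferred
    return 0.5 * (lo + hi)

# Toy 3-state arm with illustrative data (not from the paper).
P_active  = np.array([[0.7, 0.2, 0.1], [0.3, 0.5, 0.2], [0.1, 0.3, 0.6]])
P_passive = np.array([[0.9, 0.1, 0.0], [0.2, 0.7, 0.1], [0.0, 0.2, 0.8]])
r_active  = np.array([1.0, 2.0, 4.0])
r_passive = np.zeros(3)
print([round(whittle_index(P_active, P_passive, r_active, r_passive, s), 3)
       for s in range(3)])
```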
Abstract:
The development of liquid-crystal panels for use in commercial equipment has been aimed at improving the pixel resolution and the display efficiency. Among other outcomes, these improvements have led to a reduction in the thickness of such devices, which involves a loss of phase modulation. We propose a modification of the classical phase-only filter to permit displays on VGA liquid-crystal panels with constant amplitude modulation and less than 2π phase modulation. The method was tested experimentally in an optical setup.
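As a rough illustration of the underlying idea (not the specific modification proposed in the paper), a classical phase-only filter keeps only the spectral phase of the reference scene; the sketch below additionally compresses that phase into a modulation depth below 2π, mimicking the limited range of the panel. The function name and the `max_phase` value are assumptions.

```python
import numpy as np

def phase_only_filter(reference, max_phase=1.8 * np.pi):
    """Phase-only filter with a restricted modulation range.
    `reference` is a real 2-D scene; the returned filter has unit amplitude
    and a phase compressed from (-pi, pi] into the available depth."""
    F = np.fft.fft2(reference)
    phase = np.angle(F)                   # full 2*pi phase of the spectrum
    phase *= max_phase / (2 * np.pi)      # shrink to the panel's modulation depth
    return np.exp(1j * phase)             # constant (unit) amplitude, phase only
```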
Abstract:
In this work we analyze the behavior of complex information in the Fresnel domain, taking into account the limited capability of current liquid-crystal devices to display complex transmittance values when used as holographic displays. To carry out this analysis we compute the reconstruction of Fresnel holograms at several distances using the different parts of the complex distribution (real and imaginary parts, amplitude and phase), as well as using the full complex information adjusted with a method that combines two configurations of the devices in an adding architecture. The RMS error between the amplitude of these reconstructions and the original amplitude is used to evaluate the quality of the displayed information. The error analysis shows different behavior for the reconstructions using the different parts of the complex distribution and for those using the combined two-device method. Better reconstructions are obtained when using two devices whose configurations densely cover the complex plane when added. Simulated and experimental results are also presented.
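A minimal sketch of this kind of numerical experiment follows; the transfer-function form of the Fresnel propagator, the wavelength, distance and sampling values are assumptions for illustration, not the paper's setup. It propagates a phase-only version of a hologram and compares amplitudes with an RMS error.

```python
import numpy as np

def fresnel_propagate(field, wavelength, z, dx):
    """Fresnel propagation over distance z in the paraxial transfer-function
    form; dx is the sampling pitch of the square field."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

def rms_amplitude_error(reconstruction, original):
    """RMS error between reconstructed and original amplitudes."""
    return np.sqrt(np.mean((np.abs(reconstruction) - np.abs(original))**2))

# Toy object; build a hologram by back-propagation, then reconstruct it
# from the full complex field and from its phase only.
rng = np.random.default_rng(1)
obj = rng.random((256, 256))
holo = fresnel_propagate(obj, 633e-9, -0.2, 8e-6)
full = fresnel_propagate(holo, 633e-9, 0.2, 8e-6)
phase_only = fresnel_propagate(np.exp(1j * np.angle(holo)), 633e-9, 0.2, 8e-6)
print(rms_amplitude_error(full, obj), rms_amplitude_error(phase_only, obj))
```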
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
Research project carried out during a stay at the Centro Internacional de Métodos Computacionales en Ingeniería (CIMEC), Argentina, between February and April 2007. The numerical simulation of mixing problems by means of the Particle Finite Element Method (PFEM) is the framework of study of a forthcoming doctoral thesis. PFEM is a method developed jointly by CIMEC and the Centre Internacional de Mètodes Numèrics en l'Enginyeria (CIMNE-UPC), based on solving the Navier-Stokes equations in a Lagrangian formulation. The mesher has been implemented and developed by Dr. Nestor Calvo, a researcher at CIMEC; the development of the computation module is the thesis work of the grant holder, and the correct interaction between the two parts is essential for obtaining valid results. This report explains the main aspects of the mesher that were modified (geometric refinement criteria) and the changes introduced in the computation module (PETSc library, predictor-corrector algorithm) during the stay at CIMEC. Finally, the results obtained for a problem of two immiscible fluids with heat transfer are presented.
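The abstract mentions a predictor-corrector algorithm in the computation module. Purely as a generic illustration of the predictor-corrector structure (a Heun step for an ODE system, not the PFEM fractional-step solver itself; the example system is made up), one time step looks like:

```python
import numpy as np

def heun_step(f, t, y, dt):
    """One predictor-corrector (Heun) step for dy/dt = f(t, y)."""
    y_pred = y + dt * f(t, y)                             # explicit predictor
    return y + 0.5 * dt * (f(t, y) + f(t + dt, y_pred))   # trapezoidal corrector

# Illustrative test problem: a lightly damped oscillator.
f = lambda t, y: np.array([y[1], -y[0] - 0.1 * y[1]])
y = np.array([1.0, 0.0])
for k in range(1000):
    y = heun_step(f, k * 1e-2, y, 1e-2)
print(y)
```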
Abstract:
We propose a mixed finite element method for a class of nonlinear diffusion equations, based on their interpretation as gradient flows in optimal transportation metrics. We introduce an appropriate linearization of the optimal transport problem, which leads to a mixed symmetric formulation. This formulation preserves the maximum principle for the semi-discrete scheme as well as for the fully discrete scheme for a certain class of problems. In addition, solutions of the mixed formulation maintain exponential convergence in relative entropy towards the steady state in the case of a nonlinear Fokker-Planck equation with uniformly convex potential. We demonstrate the behavior of the proposed scheme with 2D simulations of the porous medium equation and of blow-up questions in the Patlak-Keller-Segel model.
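For orientation, the gradient-flow structure referred to here can be written out explicitly; these are standard facts about Wasserstein gradient flows (shown for the porous medium and the linear Fokker-Planck cases), not formulas quoted from the paper.

```latex
% Wasserstein gradient-flow structure (illustrative)
\[
\partial_t \rho = \nabla\!\cdot\!\Bigl(\rho\,\nabla \tfrac{\delta E}{\delta \rho}\Bigr),
\qquad
\begin{cases}
E(\rho)=\dfrac{1}{m-1}\displaystyle\int \rho^{m}\,dx
   &\Rightarrow\ \partial_t\rho=\Delta\rho^{m} \quad\text{(porous medium)},\\[2ex]
E(\rho)=\displaystyle\int \rho\log\rho\,dx+\int \rho V\,dx
   &\Rightarrow\ \partial_t\rho=\Delta\rho+\nabla\!\cdot\!(\rho\nabla V) \quad\text{(linear Fokker--Planck)}.
\end{cases}
\]
% For a uniformly convex potential, D^2 V >= lambda*Id with lambda > 0,
% the relative entropy decays exponentially:
\[
E(\rho(t)) - E(\rho_\infty) \;\le\; e^{-2\lambda t}\,\bigl(E(\rho(0)) - E(\rho_\infty)\bigr).
\]
```

This exponential decay of the relative entropy is the continuous-level property that the mixed scheme is designed to reproduce at the discrete level.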
Abstract:
Minimal models for the explanation of decision-making in computational neuroscience are based on the analysis of the evolution of the average firing rates of two interacting neuron populations. While these models typically lead to a multi-stable scenario for the basic derived dynamical systems, noise is an important feature of the model, accounting for finite-size effects and the robustness of decisions. These stochastic dynamical systems can be analyzed by carefully studying their associated Fokker-Planck partial differential equation. In particular, we discuss existence, positivity and uniqueness of the solution of the stationary equation, as well as of the time-evolving problem. Moreover, we prove convergence of the solution to the stationary state, which represents the probability distribution of finding the neuron families in each of the decision states characterized by their average firing rates. Finally, we propose a numerical scheme allowing for simulations of the Fokker-Planck equation, which are in agreement with those obtained recently by a moment method applied to the stochastic differential system. Our approach leads to a more detailed analytical and numerical study of this decision-making model in computational neuroscience.
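The stochastic rate model whose law such a Fokker-Planck equation describes can be illustrated with a minimal Euler-Maruyama simulation of two mutually inhibiting populations; the response function, connectivity, input and noise values below are assumptions for illustration, not the parameters analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def phi(x):
    """Sigmoidal population response (illustrative choice)."""
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative parameters: self-excitation, cross-inhibition, common input,
# noise strength, time step and horizon.
w_self, w_cross, I0, sigma, dt, T = 6.0, 6.0, -2.0, 0.1, 1e-3, 20.0

nu = np.array([0.5, 0.5])      # average firing rates of the two populations
for _ in range(int(T / dt)):
    drift = -nu + phi(w_self * nu - w_cross * nu[::-1] + I0)
    nu = nu + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(2)

print(nu)   # typically settles near one of the two asymmetric decision states
```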
Abstract:
The general objective of the project "Vulnerabilidad costera a múltiples agentes. Aplicación al litoral Catalán" is to develop and validate a methodology for quantitatively assessing the vulnerability of sedimentary coasts to the main processes that govern their behaviour. Within this context, and over a period of six months, work has focused on the project's main partial objective: the development of a set of indicators of coastal vulnerability to physical processes, more specifically the derivation of an index of coastal vulnerability to storms. To this end, the spatial and temporal variability of the intensity of storm-induced coastal processes along the Catalan coast has been analysed, taking into account only the contribution of the wave characteristics. Measured and simulated wave data from three sites distributed along the Catalan coast have been integrated to obtain time series of the intensity of the three most relevant storm-driven coastal processes (sediment transport, erosion and flooding). The results show no significant trends in the time series of the processes studied. On the other hand, the analysis of the 5-year moving averages of the anomalies of these processes reveals significant positive trends in sediment transport and erosion for the northern and southern parts of the coast, and in flooding for the southern part. Regarding spatial variability, the results show that the southern zone is the most vulnerable to coastal erosion and sediment transport, whereas the northern part is the most vulnerable to flooding.
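A small hedged sketch of the kind of time-series post-processing described (anomalies about the long-term mean, a 5-year running mean and a least-squares linear trend); the function name, the synthetic data and the trend estimator are assumptions, not the project's actual vulnerability index.

```python
import numpy as np

def anomaly_trend(years, intensity, window=5):
    """Anomalies about the long-term mean, a centred `window`-year running
    mean of those anomalies, and a least-squares linear trend (units/year)."""
    anomalies = intensity - intensity.mean()
    running = np.convolve(anomalies, np.ones(window) / window, mode="valid")
    slope, _ = np.polyfit(years, intensity, 1)
    return anomalies, running, slope

# Illustrative yearly storm-intensity series (synthetic data with a weak trend).
years = np.arange(1960, 2008)
intensity = 1.0 + 0.004 * (years - 1960) \
            + np.random.default_rng(2).normal(0.0, 0.2, years.size)
print(anomaly_trend(years, intensity)[2])   # recovered slope, close to 0.004/yr
```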
Abstract:
This paper explores two major issues from biophysical and historical viewpoints. We examine land management, which we define as the long-term maintenance of land fertility in relation to agriculture, fishery and forestry. We also explore humans' positive role as agents aiming to reinforce harmonious material circulation within the land. Liebig's view of nature, agriculture and land emphasizes the maintenance of long-term land fertility, based on his agronomical thought that the circulation of matter in agricultural fields must be maintained with manure as much as possible. The thoughts of several classical economists on nature, agriculture and land are reassessed from Liebig's viewpoint. Then, the land management problem is discussed at a much more fundamental level, to understand the necessary conditions for life in relation to land management. This point is analyzed in terms of two mechanisms: entropy disposal on the earth, and material circulation against the gravitational field. Finally, the historical example of the metropolis of Edo shows that there is yet another necessary condition for the sustainable management of land, based on the creation of harmonious material cycles among cities, farmland, forests and surrounding sea areas, in which humans play a vital role as agents.
Abstract:
The studies of Giacomo Becattini concerning the notion of the "Marshallian industrial district" have led to a revolution in the field of economic development around the world. The paper offers an interpretation of the methodology adopted by Becattini, whose roots are clearly Marshallian. Becattini proposes a return to economics as a complex social science that operates in historical time. We adopt a Schumpeterian approach to method in economic analysis in order to highlight the similarities between Marshall's and Becattini's approaches. Finally, the paper uses the distinction between logical time, real time and historical time, which enables us to study the "localized" economic process in a Becattinian way.