1000 results for Combustió -- Models matemàtics
Abstract:
We derive analytical expressions for the propagation speed of downward combustion fronts of thin solid fuels with a background flow initially at rest. The classical combustion model for thin solid fuels, which consists of five coupled reaction-convection-diffusion equations, is here reduced to a single equation with the gas temperature as the only variable. To do so, we apply a two-zone combustion model that divides the system into a preheating region and a pyrolyzing region. The speed of the combustion front is obtained after matching the temperature and its derivative at the location that separates the two regions. We also derive a simplified version of this analytical expression expected to be valid for a wide range of cases. Flame front velocities predicted by our analytical expressions agree well with experimental data found in the literature for a large variety of cases and substantially improve on the results obtained from a previous well-known analytical expression.
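The two-zone matching step can be sketched generically (the symbols below, a front speed $U$, thermal diffusivity $\alpha$, pyrolysis temperature $T_p$ and an interface at $x=0$, are illustrative placeholders, not the paper's notation). In the preheating region there is no reaction term, so in the frame moving with the front:

```latex
% Preheating region (x < 0): convection balances diffusion
U\,\frac{dT}{dx} = \alpha\,\frac{d^{2}T}{dx^{2}}
  \quad\Longrightarrow\quad
T(x) = T_\infty + (T_p - T_\infty)\,e^{U x/\alpha}, \qquad x < 0.
% Matching temperature and gradient at the interface x = 0:
T(0^-) = T(0^+) = T_p, \qquad
\left.\frac{dT}{dx}\right|_{0^-}
  = \frac{U}{\alpha}\,(T_p - T_\infty)
  = \left.\frac{dT}{dx}\right|_{0^+}.
```

Equating the preheating-side gradient with the gradient computed from the pyrolyzing-region solution yields a single algebraic (eigenvalue) condition that fixes $U$.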
Abstract:
Report for the scientific sojourn carried out at the University of New South Wales from February to June 2007. Two different biogeochemical models are coupled to a three-dimensional configuration of the Princeton Ocean Model (POM) for the Northwestern Mediterranean Sea (Ahumada and Cruzado, 2007). The first biogeochemical model (BLANES) is the three-dimensional version of the model described by Bahamon and Cruzado (2003) and computes the nitrogen fluxes through six compartments using semi-empirical descriptions of biological processes. The second biogeochemical model (BIOMEC) is the biomechanical NPZD model described in Baird et al. (2004), which uses a combination of physiological and physical descriptions to quantify the rates of planktonic interactions. Physical descriptions include, for example, the diffusion of nutrients to phytoplankton cells and the encounter rate of predators and prey. The link between physical and biogeochemical processes in both models is expressed by the advection-diffusion of the non-conservative tracers. The similarities in the mathematical formulation of the biogeochemical processes in the two models are exploited to determine the parameter set for the biomechanical model that best fits the parameter set used in the first model. Three years of integration have been carried out for each model to reach the so-called perpetual year run for biogeochemical conditions. Outputs from both models are averaged monthly and then compared to chlorophyll remote sensing images obtained from the MERIS sensor.
Abstract:
Fixed delays in neuronal interactions arise through synaptic and dendritic processing. Previous work has shown that such delays, which play an important role in shaping the dynamics of networks of large numbers of spiking neurons with continuous synaptic kinetics, can be taken into account with a rate model through the addition of an explicit, fixed delay. Here we extend this work to account for arbitrary symmetric patterns of synaptic connectivity and generic nonlinear transfer functions. Specifically, we conduct a weakly nonlinear analysis of the dynamical states arising via primary instabilities of the stationary uniform state. In this way we determine analytically how the nature and stability of these states depend on the choice of transfer function and connectivity. While this dependence is, in general, nontrivial, we make use of the smallness of the ratio of the delay in neuronal interactions to the effective time constant of integration to arrive at two general observations of physiological relevance: (1) fast oscillations are always supercritical for realistic transfer functions; (2) traveling waves are preferred over standing waves given plausible patterns of local connectivity.
Abstract:
In this paper we propose a parsimonious regime-switching approach to model the correlations between assets: the threshold conditional correlation (TCC) model. This method allows the dynamics of the correlations to change from one state (or regime) to another as a function of observable transition variables. Our model is similar in spirit to Silvennoinen and Teräsvirta (2009) and Pelletier (2006), but with the appealing feature that it does not suffer from the curse of dimensionality. In particular, estimation of the parameters of the TCC involves a simple grid search procedure. In addition, it is easy to guarantee a positive definite correlation matrix because the TCC estimator is given by the sample correlation matrix, which is positive definite by construction. The methodology is illustrated by evaluating the behaviour of international equities, government bonds and major exchange rates, first separately and then jointly. We also test and allow for different parts of the correlation matrix to be governed by different transition variables. For this, we estimate a multi-threshold TCC specification. Further, we evaluate the economic performance of the TCC model against a constant conditional correlation (CCC) estimator using a Diebold-Mariano type test. We conclude that threshold correlation modelling gives rise to a significant reduction in portfolio variance.
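The grid-search-plus-sample-correlation idea can be sketched in a few lines (illustrative only: the single-threshold search and the Gaussian likelihood below are assumptions, not the authors' exact specification):

```python
# Minimal sketch of a threshold conditional correlation (TCC) fit.
import numpy as np

def gauss_ll(x, R):
    """Gaussian log-likelihood of standardized observations under correlation R."""
    z = (x - x.mean(axis=0)) / x.std(axis=0)
    _, logdet = np.linalg.slogdet(R)
    quad = np.einsum('ij,jk,ik->', z, np.linalg.inv(R), z)
    return -0.5 * (len(z) * logdet + quad)

def tcc_fit(returns, transition, candidates, min_obs=10):
    """Grid-search a threshold on the transition variable.  Per-regime
    correlations are sample correlation matrices, hence positive
    semidefinite by construction."""
    best = None
    for c in candidates:
        low, high = returns[transition <= c], returns[transition > c]
        if len(low) < min_obs or len(high) < min_obs:
            continue
        R_low = np.corrcoef(low, rowvar=False)
        R_high = np.corrcoef(high, rowvar=False)
        ll = gauss_ll(low, R_low) + gauss_ll(high, R_high)
        if best is None or ll > best[0]:
            best = (ll, c, R_low, R_high)
    return best
```

On simulated data whose correlation jumps from 0 to 0.8 when the transition variable crosses zero, the grid search recovers a threshold close to zero and two clearly distinct regime correlation matrices.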
Abstract:
The introduction of an infective-infectious period on the geographic spread of epidemics is considered in two different models. The classical evolution equations arising in the literature are generalized and the existence of epidemic wave fronts is revisited. The asymptotic speed is obtained and improves previous results for the Black Death plague.
Abstract:
Populations of phase oscillators interacting globally through a general coupling function f(x) have been considered. We analyze the conditions required to ensure the existence of a Lyapunov functional, giving closed-form expressions for it in terms of a generating function. We have also proposed a family of exactly solvable models with singular couplings, showing that it is possible to map the synchronization phenomenon into other physical problems. In particular, the stationary solutions of the least singular coupling considered, f(x) = sgn(x), have been found analytically in terms of elliptic functions. This last case is one of the few nontrivial models for synchronization dynamics that can be analytically solved.
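The singular coupling f(x) = sgn(x) is easy to probe numerically with a crude Euler integration (a toy check of the synchronizing behaviour, not the paper's analytical elliptic-function solution):

```python
# Globally coupled identical phase oscillators with f(x) = sgn(x).
import numpy as np

def simulate_sgn_coupling(theta0, K=1.0, dt=0.01, steps=2000):
    """Euler steps of  dtheta_i/dt = (K/N) * sum_j sgn(theta_j - theta_i)
    for N identical oscillators, written in the co-rotating frame."""
    theta = np.array(theta0, dtype=float)
    N = len(theta)
    for _ in range(steps):
        diff = theta[None, :] - theta[:, None]   # entry (i, j) = theta_j - theta_i
        theta = theta + dt * (K / N) * np.sign(diff).sum(axis=1)
    return theta

def order_parameter(theta):
    """Kuramoto order parameter r = |<exp(i*theta)>|; r -> 1 at full synchrony."""
    return abs(np.exp(1j * theta).mean())
```

Starting from phases spread over roughly two radians, each oscillator drifts at a constant rate toward the cluster and the order parameter approaches 1.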
Abstract:
In any case, I would like this lecture to be just what I said: a brief lesson on the importance of differential equations. I will discuss them from the point of view of models, that is, of the phenomena they model. And I will try to explain that, despite their ancient origin, all of them still pose new and interesting problems today, from both a theoretical and a practical point of view.
Abstract:
The present paper is aimed at providing a general strategic overview of the existing theoretical models that have applications in the field of financial innovation. Whereas most financial developments have relied upon traditional economic tools, a new stream of research is defining a novel paradigm in which mathematical models from diverse scientific disciplines are being applied to conceptualize and explain economic and financial behavior. Indeed, terms such as 'econophysics' or 'quantum finance' have recently appeared to embrace efforts in this direction. As a first contact with such research, the project will present a brief description of some of the main theoretical models that have applications in finance and economics, and will try to present, if possible, potential new applications to particular areas in financial analysis, or new applicable models. As a result, emphasis will be put on the implications of this research for the financial sector and its future dynamics.
Abstract:
Background: Design of newly engineered microbial strains for biotechnological purposes would greatly benefit from the development of realistic mathematical models for the processes to be optimized. Such models can then be analyzed and, with the development and application of appropriate optimization techniques, one could identify the modifications that need to be made to the organism in order to achieve the desired biotechnological goal. As appropriate models to perform such an analysis are necessarily non-linear and typically non-convex, finding their global optimum is a challenging task. Canonical modeling techniques, such as Generalized Mass Action (GMA) models based on the power-law formalism, offer a possible solution to this problem because they have a mathematical structure that enables the development of specific algorithms for global optimization.
Results: Based on the GMA canonical representation, we have developed in previous work a highly efficient optimization algorithm and a set of related strategies for understanding the evolution of adaptive responses in cellular metabolism. Here, we explore the possibility of recasting kinetic non-linear models into an equivalent GMA model, so that global optimization on the recast GMA model can be performed. With this technique, optimization is greatly facilitated and the results are transposable to the original non-linear problem. This procedure is straightforward for a particular class of non-linear models known as Saturable and Cooperative (SC) models, which extend the power-law formalism to deal with saturation and cooperativity.
Conclusions: Our results show that recasting non-linear kinetic models into GMA models is indeed an appropriate strategy that helps overcome some of the numerical difficulties that arise during the global optimization task.
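The recasting idea can be illustrated on the simplest saturable rate law, Michaelis-Menten kinetics (a toy sketch under assumed parameter values; the paper treats the more general SC formalism). Introducing the auxiliary variable w = K + S turns the saturable rate into a pure product of power laws, i.e. a GMA term:

```python
import numpy as np

V, K = 1.0, 0.5   # assumed illustrative parameters

def euler(f, x0, dt=1e-3, steps=4000):
    """Forward-Euler integration, returning the full trajectory."""
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(steps):
        x = x + dt * np.asarray(f(x))
        traj.append(x.copy())
    return np.array(traj)

# Original saturable kinetics: dS/dt = -V * S / (K + S)
orig = euler(lambda x: [-V * x[0] / (K + x[0])], [2.0])

# Recast GMA form: with w = K + S the rate becomes the power-law product
# V * S**1 * w**-1, and dw/dt = dS/dt because K is a constant.
def gma(x):
    S, w = x
    rate = V * S * w**-1.0
    return [-rate, -rate]

recast = euler(gma, [2.0, K + 2.0])   # initial condition w(0) = K + S(0)
```

Both systems produce the same S(t), so an optimizer that exploits the recast GMA structure is, in effect, solving the original kinetic problem.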
Abstract:
Optimization models in metabolic engineering and systems biology typically focus on optimizing a single criterion, usually the synthesis rate of a metabolite of interest or the rate of growth. Connectivity and non-linear regulatory effects, however, make it necessary to consider multiple objectives in order to identify useful strategies that balance out different metabolic issues. This is a fundamental aspect, as optimization of maximum yield in a given condition may involve unrealistic values in other key processes. Due to the difficulties associated with detailed non-linear models, analyses using stoichiometric descriptions and linear optimization methods have become rather popular in systems biology. However, despite being useful, these approaches fail to capture the intrinsic nonlinear nature of the underlying metabolic systems and the regulatory signals involved. Targeting more complex biological systems requires the application of global optimization methods to non-linear representations. In this work we address the multi-objective global optimization of metabolic networks that are described by a special class of models based on the power-law formalism: the generalized mass action (GMA) representation. Our goal is to develop global optimization methods capable of efficiently dealing with several biological criteria simultaneously. In order to overcome the numerical difficulties of dealing with multiple criteria in the optimization, we propose a heuristic approach based on the epsilon-constraint method that reduces the computational burden of generating a set of Pareto optimal alternatives, each achieving a unique combination of objective values. To facilitate the post-optimal analysis of these solutions and narrow down their number prior to being tested in the laboratory, we explore the use of Pareto filters that identify the preferred subset of enzymatic profiles. We demonstrate the usefulness of our approach by means of a case study that optimizes the ethanol production in the fermentation of Saccharomyces cerevisiae.
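The epsilon-constraint sweep and the subsequent Pareto filtering can be shown on a toy discrete two-objective problem (nothing below is the paper's GMA network; it only illustrates the mechanics of the method):

```python
import numpy as np

def epsilon_constraint(points, n_eps=9):
    """Maximize f1 subject to f2 >= eps, sweeping eps over the range of f2.
    `points` is an (n, 2) array of feasible (f1, f2) vectors, both maximized."""
    f2 = points[:, 1]
    frontier = []
    for eps in np.linspace(f2.min(), f2.max(), n_eps):
        feasible = points[f2 >= eps]
        frontier.append(feasible[np.argmax(feasible[:, 0])])
    return np.unique(np.array(frontier), axis=0)

def pareto_filter(points):
    """Keep only non-dominated points (q dominates p if q >= p componentwise
    with at least one strict inequality)."""
    keep = [p for p in points
            if not any((q >= p).all() and (q > p).any() for q in points)]
    return np.array(keep)
```

For example, with feasible objective vectors (x, 1 - x**2) for x on a grid in [0, 1], each eps value yields one Pareto-optimal alternative, and the filter confirms none of them dominates another.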
Abstract:
The aim of this study is to define a new statistic, PVL, based on the relative distance between the likelihood associated with the simulation replications and the likelihood of the conceptual model. Our results, coming from several simulation experiments of a clinical trial, show that the range of the PVL statistic can be a good measure of stability for establishing when a computational model verifies the underlying conceptual model. PVL also improves the analysis of simulation replications because only one statistic is associated with all the simulation replications. The study also presents several verification scenarios, obtained by altering the simulation model, that show the usefulness of PVL. Further simulation experiments suggest that a 0 to 20% range may define adequate limits for the verification problem, if considered from the viewpoint of an equivalence test.
Application of DEA to profit analysis in a vertically forward-integrated system
Abstract:
In this paper, three DEA models are designed for a production system whose components are arranged in series and vertically integrated forward. The first model seeks to optimize the profits of the aggregate system, as well as their improvement in each of the subsystems. The second model, in addition to the previous objective, includes transfer constraints on the specific resources associated with each subsystem, and the third model estimates the variation interval for the transfer prices of the intermediate inputs between the two subsystems. The models have been programmed and simulated in the GAMS software using data generated by a Cobb-Douglas production function for the intermediate inputs and final outputs.
Abstract:
This research relates the solvency margin that life insurers are required by regulation to accredit with the equity that any company must hold in order to carry out its activity. We analyze the impact that the remuneration of these resources has on the marketing of insurance itself, demonstrating its inverse relationship with the interest rate guaranteed in the contracts. We also analyze the impact that possible changes in interest rates may have on the remuneration that can be offered on these resources, and we propose an equation that incorporates all these aspects, verifying that the relationships currently offered in the academic literature are particular cases of the general equation, cases that we show to incorporate rather restrictive implicit hypotheses.
Abstract:
Markowitz portfolio theory (1952) has induced research into the efficiency of portfolio management. This paper studies existing nonparametric efficiency measurement approaches for single-period portfolio selection from a theoretical perspective and generalises currently used efficiency measures into the full mean-variance space. To this end, we introduce the efficiency improvement possibility function (a variation on the shortage function), study its axiomatic properties in the context of the Markowitz efficient frontier, and establish a link to the indirect mean-variance utility function. This framework allows distinguishing between portfolio efficiency and allocative efficiency. Furthermore, it permits retrieving information about the revealed risk aversion of investors. The efficiency improvement possibility function thus provides a more general framework for gauging the efficiency of portfolio management using nonparametric frontier envelopment methods based on quadratic optimisation.
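The geometry involved can be sketched numerically: the distance from a portfolio to the unconstrained Markowitz frontier along a direction that simultaneously contracts variance and expands mean return (the data, the direction vector and the bisection search below are illustrative assumptions, not the paper's estimator):

```python
import numpy as np

def frontier_variance(r, mean, cov):
    """Minimum variance attainable at target return r on the unconstrained
    (fully invested) Markowitz frontier, in closed form."""
    inv = np.linalg.inv(cov)
    one = np.ones(len(mean))
    A, B, C = one @ inv @ one, one @ inv @ mean, mean @ inv @ mean
    D = A * C - B * B
    return (A * r * r - 2 * B * r + C) / D

def improvement_possibility(var0, mu0, mean, cov, g_var=1.0, g_mu=1.0, tol=1e-10):
    """Largest delta such that (var0 - delta*g_var, mu0 + delta*g_mu) is still
    attainable, found by bisection; delta = 0 means the portfolio is efficient."""
    lo, hi = 0.0, 10.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if var0 - mid * g_var >= frontier_variance(mu0 + mid * g_mu, mean, cov):
            lo = mid
        else:
            hi = mid
    return lo
```

For three uncorrelated assets with means (0.05, 0.07, 0.10) and variances (0.04, 0.05, 0.09), the lopsided portfolio w = (0.8, 0.1, 0.1) sits strictly inside the frontier, so the measure returns a positive delta.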
Abstract:
This paper investigates the selection of governance forms in interfirm collaborations taking into account the predictions from transaction costs and property rights theories. Transaction costs arguments are often used to justify the introduction of hierarchical controls in collaborations, but the ownership dimension of going from "contracts" to "hierarchies" has been ignored in the past, and with it the so-called "costs of ownership". The theoretical results, tested with a sample of collaborations in which Spanish firms participate, indicate that the costs of ownership may offset the benefits of hierarchical controls and therefore limit their diffusion. Evidence is also reported of possible complementarities between reputation effects and forms of ownership that go together with hierarchical controls (i.e. joint ventures), in contrast with the generally assumed substitutability between the two.