984 results for Multiscale stochastic modelling
Abstract:
This analysis was stimulated by the real data analysis problem of household expenditure data. The full dataset contains expenditure data for a sample of 1224 households. The expenditure is broken down at 2 hierarchical levels: 9 major levels (e.g. housing, food, utilities etc.) and 92 minor levels. There are also 5 factors and 5 covariates at the household level. Not surprisingly, there are a small number of zeros at the major level, but many zeros at the minor level. The question is how best to model the zeros. Clearly, models that try to add a small amount to the zero terms are not appropriate in general, as at least some of the zeros are clearly structural, e.g. alcohol/tobacco for households that are teetotal. The key question then is how to build suitable conditional models. For example, is the sub-composition of spending excluding alcohol/tobacco similar for teetotal and non-teetotal households? In other words, we are looking for sub-compositional independence. Also, what determines whether a household is teetotal? Can we assume that it is independent of the composition? In general, whether a household is teetotal will clearly depend on the household-level variables, so we need to be able to model this dependence. The other tricky question is that with zeros on more than one component, we need to be able to model dependence and independence of zeros on the different components. Lastly, while some zeros are structural, others may not be; for example, for expenditure on durables, it may be chance as to whether a particular household spends money on durables within the sample period.
This would clearly be distinguishable if we had longitudinal data, but may still be distinguishable by looking at the distribution, on the assumption that random zeros will usually occur in situations where any non-zero expenditure is not small. While this analysis is based on economic data, the ideas carry over to many other situations, including geological data, where minerals may be missing for structural reasons (similar to alcohol), or missing because they occur only in random regions which may be missed in a sample (similar to the durables).
Abstract:
Animal dispersal in a fragmented landscape depends on the complex interaction between landscape structure and animal behavior. To better understand how individuals disperse, it is important to explicitly represent the properties of organisms and the landscape in which they move. A common approach to modelling dispersal is to represent the landscape as a grid of equal-sized cells and then simulate individual movement as a correlated random walk. This approach uses an a priori scale of resolution, which limits the representation of all landscape features and of how different dispersal abilities are modelled. We develop a vector-based landscape model coupled with an object-oriented model for animal dispersal. In this spatially explicit dispersal model, landscape features are defined based on their geographic and thematic properties, and dispersal is modelled through consideration of an organism's behavior, movement rules and searching strategies (such as visual cues). We present the model's underlying concepts, its ability to adequately represent landscape features, and simulations of dispersal according to different dispersal abilities. We demonstrate the potential of the model by simulating two virtual species in a real Swiss landscape. This illustrates the model's ability to simulate complex dispersal processes and provides information about dispersal, such as colonization probability and the spatial distribution of the organism's path.
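The grid-based baseline this abstract refers to, simulating individual movement as a correlated random walk, can be sketched in a few lines. This is a generic illustration rather than the authors' implementation; the step length and turning-angle spread are hypothetical parameters:

```python
import math
import random

def correlated_random_walk(n_steps, step_len=1.0, turn_sd=0.5, seed=0):
    """Simulate a 2-D correlated random walk: each new heading equals the
    previous heading plus a normally distributed turning angle, so small
    turn_sd produces straighter, more directed paths."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    heading = rng.uniform(0.0, 2.0 * math.pi)
    path = [(x, y)]
    for _ in range(n_steps):
        heading += rng.gauss(0.0, turn_sd)
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
        path.append((x, y))
    return path

path = correlated_random_walk(100)
```

Varying `turn_sd` is the usual way such models encode different dispersal abilities, which is exactly the knob the vector-based approach aims to replace with explicit behavioral rules.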
Abstract:
There are two principal chemical concepts that are important for studying the natural environment. The first is thermodynamics, which describes whether a system is at equilibrium or can change spontaneously through chemical reactions. The second is how fast chemical reactions take place once they start (kinetics, or the rate of chemical change). In this work we examine a natural system in which both thermodynamic and kinetic factors are important in determining the abundance of NH4+, NO2- and NO3- in superficial waters. Samples were collected in the Arno Basin (Tuscany, Italy), a system in which natural and anthropic effects both contribute to strongly modifying the chemical composition of the water. Thermodynamic modelling based on the reduction-oxidation reactions involving the passage NH4+ -> NO2- -> NO3- under equilibrium conditions has made it possible to determine the Eh redox potential values that characterise the state of each sample and, consequently, of the fluid environment from which it was drawn. Just as pH expresses the concentration of H+ in solution, the redox potential expresses the tendency of an environment to receive or supply electrons. In this context, oxic environments, such as those of river systems, are said to have a high redox potential because O2 is available as an electron acceptor. Principles of thermodynamics and chemical kinetics yield a model that often does not completely describe the reality of natural systems. Chemical reactions may indeed fail to achieve equilibrium because the products escape from the site of the reaction, or because the reactions involved in the transformation are very slow, so that non-equilibrium conditions persist for long periods. Moreover, reaction rates can be sensitive to poorly understood catalytic effects or to surface effects, while variables such as concentration (a large number of chemical species can coexist and interact concurrently), temperature and pressure can have large gradients in natural systems.
Taking this into account, data from 91 water samples have been modelled using statistical methodologies for compositional data. The application of log-contrast analysis has yielded statistical parameters that can be correlated with the calculated Eh values. In this way, natural conditions under which chemical equilibrium is hypothesised, as well as underlying fast reactions, are compared with those described by a stochastic approach.
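Log-contrast analysis operates on log-ratio transforms of a composition rather than on raw concentrations. A minimal sketch of the centred log-ratio (clr) transform that such analyses build on (a generic illustration, not the authors' exact procedure):

```python
import math

def clr(composition):
    """Centred log-ratio transform of a composition of strictly positive
    parts: subtract the mean log from each log-part, mapping the simplex
    into real space where standard statistics apply."""
    logs = [math.log(part) for part in composition]
    mean_log = sum(logs) / len(logs)
    return [l - mean_log for l in logs]
```

The clr coordinates always sum to zero, reflecting the one constraint (constant total) that makes raw compositional parts unsuitable for ordinary correlation analysis.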
Abstract:
The resistance of mosquitoes to chemical insecticides is threatening vector control programmes worldwide. Cytochrome P450 monooxygenases (CYPs) are known to play a major role in insecticide resistance, allowing resistant insects to metabolize insecticides at a higher rate. Among them, members of the mosquito CYP6Z subfamily, like Aedes aegypti CYP6Z8 and its Anopheles gambiae orthologue CYP6Z2, have been frequently associated with pyrethroid resistance. However, their role in the pyrethroid degradation pathway remains unclear. In the present study, we created a genetically modified yeast strain overexpressing Ae. aegypti cytochrome P450 reductase and CYP6Z8, thereby producing the first mosquito P450-CPR (NADPH-cytochrome P450-reductase) complex in a yeast recombinant system. The results of the present study show that: (i) CYP6Z8 metabolizes PBAlc (3-phenoxybenzoic alcohol) and PBAld (3-phenoxybenzaldehyde), common pyrethroid metabolites produced by carboxylesterases, producing PBA (3-phenoxybenzoic acid); (ii) CYP6Z8 transcription is induced by PBAlc, PBAld and PBA; (iii) An. gambiae CYP6Z2 metabolizes PBAlc and PBAld in the same way; (iv) PBA is the major metabolite produced in vivo and is excreted without further modification; and (v) in silico modelling of substrate-enzyme interactions supports a similar role of other mosquito CYP6Zs in pyrethroid degradation. By playing a pivotal role in the degradation of pyrethroid insecticides, mosquito CYP6Zs thus represent good targets for mosquito-resistance management strategies.
Abstract:
First: a continuous-time version of Kyle's model (Kyle 1985) of asset pricing with asymmetric information, known as Back's model (Back 1992), is studied. A larger class of price processes and of noise traders' processes is considered. The price process, as in Kyle's model, is allowed to depend on the path of the market order. The noise traders' process is an inhomogeneous Lévy process. Solutions are found via the Hamilton-Jacobi-Bellman equations. When the insider is risk-neutral, the price pressure is constant and there is no equilibrium in the presence of jumps. If the insider is risk-averse, there is no equilibrium in the presence of either jumps or drifts. The case in which the release time is unknown is also analysed. A general relation is established between the problem of finding an equilibrium and the enlargement of filtrations. A random announcement time is also considered; in that case the market is not fully efficient, and an equilibrium exists if the sensitivity of prices with respect to global demand decreases over time in accordance with the distribution of the random time. Second: power variations. The asymptotic behavior of the power variation of processes of the form integral_0^t u(s-) dS(s) is considered, where S is an alpha-stable process with stability index 0 < alpha < 2 and the integral is an Itô integral. Stable convergence of the corresponding fluctuations is established. These results provide statistical tools to infer the process u from discrete observations. Third: a bond market is studied in which the short rate r(t) evolves as an integral of g(t-s)sigma(s) with respect to W(ds), where g and sigma are deterministic and W is the stochastic Wiener measure. Processes of this type are particular cases of ambit processes and are in general not semimartingales.
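The realized power variation these limit theorems concern is straightforward to compute from discrete observations. A minimal sketch, illustrated on a simulated Brownian path (the boundary case alpha = 2, where the p = 2 power variation recovers the quadratic variation); all names are hypothetical:

```python
import math
import random

def power_variation(path, p):
    """p-th realized power variation of a discretely observed path:
    the sum of |increment|**p over consecutive observations."""
    return sum(abs(b - a) ** p for a, b in zip(path, path[1:]))

# Simulate a Brownian path on [0, 1] with n steps (Gaussian increments
# of variance dt); for Brownian motion the quadratic variation over
# [0, 1] converges to 1 as n grows.
rng = random.Random(1)
n = 10_000
dt = 1.0 / n
path = [0.0]
for _ in range(n):
    path.append(path[-1] + rng.gauss(0.0, math.sqrt(dt)))

qv = power_variation(path, 2)
```

For stable processes with alpha < 2 the normalization and the limit change with p, which is precisely what the fluctuation results in the abstract characterize.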
Abstract:
In this work we develop a viscoelastic bar element that can handle multiple rheological laws with non-linear elastic and non-linear viscous material models. The bar element is built by joining an elastic and a viscous bar in series, constraining the middle node position to the bar axis with a reduction method, and statically condensing the internal degrees of freedom. We apply the methodology to the modelling of reversible softening with stiffness recovery, both in 2D and 3D, a phenomenology also experimentally observed during stretching cycles on epithelial lung cell monolayers.
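In one dimension, an elastic and a viscous bar joined in series form the classical Maxwell element. A linear, explicit-Euler sketch of its stress response under constant strain rate (a toy illustration, not the paper's non-linear finite-element formulation; parameter names are hypothetical):

```python
def maxwell_stress(strain_rate, dt, n_steps, E=1.0, eta=1.0, sigma0=0.0):
    """Stress history of a 1-D linear Maxwell element (spring of modulus E
    in series with a dashpot of viscosity eta) under a constant total
    strain rate, integrated with explicit Euler steps of size dt:
        dsigma/dt = E * (strain_rate - sigma / eta)
    The stress relaxes toward the steady-state value eta * strain_rate."""
    sigma = sigma0
    history = [sigma]
    for _ in range(n_steps):
        sigma += dt * E * (strain_rate - sigma / eta)
        history.append(sigma)
    return history
```

Statically condensing the internal (middle-node) degree of freedom, as the element in the paper does, is what reduces the series pair to a single effective constitutive update of this kind at each bar.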
Abstract:
There are many factors that influence the day-ahead market bidding strategies of a generation company (GenCo) in the current energy market framework. Environmental policy issues have become increasingly important for fossil-fuelled power plants and have to be considered in their management, giving rise to emission limitations. This work investigates the influence of both the allowances and emission reduction plan and the incorporation of medium-term derivatives commitments on the optimal generation bidding strategy in the day-ahead electricity market. Two different technologies have been considered: coal thermal units, a high-emission technology, and combined cycle gas turbine units, a low-emission technology. The Iberian Electricity Market and the Spanish National Emissions and Allocation Plans are the framework for dealing with environmental issues in day-ahead market bidding strategies. To address emission limitations, some of the standard risk management methodologies developed for financial markets, such as Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR), have been extended. This study offers electricity generation utilities a mathematical model to determine, for each of their generation units, the individual optimal generation bid to the wholesale electricity market that maximizes the long-run profits of the utility while abiding by the Iberian Electricity Market rules, the environmental restrictions set by the EU Emission Trading Scheme, and the restrictions set by the Spanish National Emissions Reduction Plan. The economic implications for a GenCo of including the environmental restrictions of these National Plans are analyzed and the most remarkable results are presented.
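The VaR and CVaR measures extended in this work have a simple historical-simulation form over a set of scenario losses. A minimal sketch (generic, not the paper's extended emission-constrained version; all names are hypothetical):

```python
def var_cvar(losses, alpha=0.95):
    """Historical-simulation VaR and CVaR at confidence level alpha.
    VaR is the empirical alpha-quantile of the loss distribution; CVaR
    is the average loss in the worst (1 - alpha) tail, so CVaR >= VaR."""
    ordered = sorted(losses)
    idx = min(round(alpha * len(ordered)), len(ordered) - 1)
    var = ordered[idx]
    tail = ordered[idx:]
    cvar = sum(tail) / len(tail)
    return var, cvar
```

CVaR is the quantity usually preferred in bidding optimization because, unlike VaR, it is coherent and can be embedded in a linear program over scenario profits.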
Abstract:
In this paper we propose a parsimonious regime-switching approach to model the correlations between assets: the threshold conditional correlation (TCC) model. This method allows the dynamics of the correlations to change from one state (or regime) to another as a function of observable transition variables. Our model is similar in spirit to Silvennoinen and Teräsvirta (2009) and Pelletier (2006), but with the appealing feature that it does not suffer from the curse of dimensionality. In particular, estimation of the parameters of the TCC involves a simple grid-search procedure. In addition, it is easy to guarantee a positive definite correlation matrix, because the TCC estimator is given by the sample correlation matrix, which is positive definite by construction. The methodology is illustrated by evaluating the behaviour of international equities, government bonds and major exchange rates, first separately and then jointly. We also test for, and allow, different parts of the correlation matrix to be governed by different transition variables; for this, we estimate a multi-threshold TCC specification. Further, we evaluate the economic performance of the TCC model against a constant conditional correlation (CCC) estimator using a Diebold-Mariano-type test. We conclude that threshold correlation modelling gives rise to a significant reduction in the portfolio's variance.
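The regime-split estimator behind a TCC-style model is easy to illustrate: choose a threshold on an observable transition variable, split the sample into regimes, and take the sample correlation within each regime. A toy sketch with a hypothetical threshold-selection rule (the paper's grid-search criterion may differ):

```python
import math

def corr(x, y):
    """Sample correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def tcc_fit(x, y, z, candidates):
    """Grid search over candidate thresholds c on transition variable z:
    split the sample into regimes z <= c and z > c and compute the sample
    correlation of (x, y) inside each regime.  As a stand-in selection
    rule, keep the threshold that maximizes the absolute difference
    between the two regime correlations."""
    best = None
    for c in candidates:
        lo = [(a, b) for a, b, s in zip(x, y, z) if s <= c]
        hi = [(a, b) for a, b, s in zip(x, y, z) if s > c]
        if len(lo) < 3 or len(hi) < 3:
            continue  # skip splits with too few observations per regime
        r_lo = corr([a for a, _ in lo], [b for _, b in lo])
        r_hi = corr([a for a, _ in hi], [b for _, b in hi])
        gap = abs(r_hi - r_lo)
        if best is None or gap > best[0]:
            best = (gap, c, r_lo, r_hi)
    return best
```

Because each regime estimate is itself a sample correlation matrix, positive definiteness holds by construction, which is the feature the abstract highlights.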
Abstract:
It can be assumed that the composition of Mercury's thin gas envelope (exosphere) is related to the composition of the planet's crustal materials. If this relationship holds, then inferences regarding the bulk chemistry of the planet might be made from a thorough exospheric study. The most vexing of all unsolved problems is the uncertainty in the source of each component. Historically, it has been believed that H and He come primarily from the solar wind, while Na and K originate from volatilized materials partitioned between Mercury's crust and meteoritic impactors. The processes that eject atoms and molecules into the exosphere of Mercury are generally considered to be thermal vaporization, photon-stimulated desorption (PSD), impact vaporization, and ion sputtering. Each of these processes has its own temporal and spatial dependence. The exosphere is strongly influenced by Mercury's highly elliptical orbit and rapid orbital speed. As a consequence, the surface undergoes large fluctuations in temperature and experiences differences of insolation with longitude. We will discuss these processes, but focus more on the expected surface composition and on solar wind particle sputtering, which releases material such as Ca and other elements from the surface minerals, and discuss the relevance of composition modelling.
Abstract:
The main objective of this paper is to develop a methodology that takes into account the human factor extracted from the database used by recommender systems, and which makes it possible to resolve the specific problems of prediction and recommendation. In this work, we propose to extract the user's human values scale from the user database, to improve suitability in open environments such as recommender systems. For this purpose, the methodology is applied to the user's data after interaction with the system. The methodology is exemplified with a case study.
Abstract:
Reverse transcriptase (RT) is a multifunctional enzyme in the human immunodeficiency virus (HIV)-1 life cycle and represents a primary target for drug discovery efforts against HIV-1 infection. Two classes of RT inhibitors, the nucleoside RT inhibitors (NRTIs) and the non-nucleoside RT inhibitors, are prominently used in highly active antiretroviral therapy in combination with other anti-HIV drugs. However, the rapid emergence of drug-resistant viral strains has limited the success rate of anti-HIV agents. Computational methods are a significant part of the drug design process and indispensable to study drug resistance. In this review, recent advances in computer-aided drug design for the rational design of new compounds against HIV-1 RT using methods such as molecular docking, molecular dynamics, free energy calculations, quantitative structure-activity relationships, pharmacophore modelling and absorption, distribution, metabolism, excretion and toxicity prediction are discussed. Successful applications of these methodologies are also highlighted.
Abstract:
In South America, yellow fever (YF) is an established infectious disease that has been identified outside of its traditional endemic areas, affecting human and nonhuman primate (NHP) populations. In the epidemics that occurred in Argentina between 2007 and 2009, several outbreaks affecting humans and howler monkeys (Alouatta spp.) were reported, highlighting the importance of this disease in the context of conservation medicine and public health policies. Considering the lack of information about YF dynamics in New World NHP, our main goal was to apply modelling tools to better understand YF transmission dynamics among endangered brown howler monkey (Alouatta guariba clamitans) populations in northeastern Argentina. Two complementary modelling tools were used to evaluate brown howler population dynamics in the presence of the disease: Vortex, a stochastic demographic simulation model, and Outbreak, a stochastic disease epidemiology simulation. The baseline model of YF disease epidemiology predicted a very high probability of population decline over the next 100 years. We believe the modelling approach discussed here is a reasonable description of the disease and its effects on the howler monkey population and can be useful to support evidence-based decision-making to guide actions at a regional level.
Abstract:
This article summarizes the results published in a December 2006 report of the ISS (Istituto Superiore di Sanità) on a mathematical model developed by a working group that includes researchers from the Universities of Trento, Pisa and Rome and the National Institutes of Health (Istituto Superiore di Sanità, ISS), to assess and measure the impact of the transmission and control of an influenza pandemic.