968 results for Stochastic dynamic programming (SDP)
Abstract:
The dynamic properties of helix 12 in the ligand binding domain of nuclear receptors are a major determinant of AF-2 domain activity. We investigated the molecular and structural basis of helix 12 mobility, as well as the involvement of individual residues with regard to peroxisome proliferator-activated receptor alpha (PPARalpha) constitutive and ligand-dependent transcriptional activity. Functional assays of the activity of PPARalpha helix 12 mutants were combined with free energy molecular dynamics simulations. The agreement between the results from these approaches allows us to make robust claims concerning the mechanisms that govern helix 12 functions. Our data support a model in which PPARalpha helix 12 transiently adopts a relatively stable active conformation even in the absence of a ligand. This conformation provides the interface for the recruitment of a coactivator and results in constitutive activity. The receptor agonists stabilize this conformation and increase PPARalpha transcription activation potential. Finally, we disclose important functions of residues in PPARalpha AF-2, which determine the positioning of helix 12 in the active conformation in the absence of a ligand. Substitution of these residues suppresses PPARalpha constitutive activity, without changing PPARalpha ligand-dependent activation potential.
Abstract:
Hepatitis C virus (HCV) replicates its genome in a membrane-associated replication complex (RC). Specific membrane alterations, designated membranous webs, represent predominant sites of HCV RNA replication. The principles governing HCV RC and membranous web formation are poorly understood. Here, we used replicons harboring a green fluorescent protein (GFP) insertion in nonstructural protein 5A (NS5A) to study HCV RCs in live cells. Two distinct patterns of NS5A-GFP were observed. (i) Large structures, representing membranous webs, showed restricted motility, were stable over many hours, were partitioned among daughter cells during cell division, and displayed a static internal architecture without detectable exchange of NS5A-GFP. (ii) In contrast, small structures, presumably representing small RCs, showed fast, saltatory movements over long distances. Both populations were associated with endoplasmic reticulum (ER) tubules, but only small RCs showed ER-independent, microtubule (MT)-dependent transport. We suggest that this MT-dependent transport sustains two distinct RC populations, which are both required during the HCV life cycle.
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of one of the biggest financial markets, the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in the so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast into a well-defined scientific predictor if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to models involving stochastic differential equations, such as the well-known mean-reverting Ornstein-Uhlenbeck equation and its variations. A model for forecasting an economic or financial magnitude may be properly defined with scientific rigor and yet lack any economic value, making it useless from a practical point of view. This is why this project could not be complete without a backtesting of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is the reason why we place emphasis on the calibration of the strategies' parameters to adapt them to the prevailing market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be performed at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented from scratch in MATLAB as part of this thesis. No other mathematical or statistical software was used.
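As a purely illustrative companion to the pairs-trading discussion above (not part of the thesis, which implements everything in MATLAB), the following minimal Python sketch simulates a mean-reverting spread as an Ornstein-Uhlenbeck process via Euler-Maruyama and derives a simple z-score entry/exit signal. All function names, parameter values and thresholds are arbitrary assumptions.

```python
import numpy as np

def simulate_ou(theta, mu, sigma, x0, dt, n_steps, seed=0):
    """Euler-Maruyama simulation of an Ornstein-Uhlenbeck spread:
    dX_t = theta * (mu - X_t) dt + sigma dW_t."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for t in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))
        x[t + 1] = x[t] + theta * (mu - x[t]) * dt + sigma * dw
    return x

def pairs_trading_signal(spread, entry_z=2.0, exit_z=0.5):
    """Toy mean-reversion rule: short the spread when it is rich,
    long when it is cheap, flat near its mean."""
    z = (spread - spread.mean()) / spread.std()
    position = np.zeros_like(z)
    for t in range(1, len(z)):
        if z[t] > entry_z:
            position[t] = -1            # spread unusually high: short it
        elif z[t] < -entry_z:
            position[t] = 1             # spread unusually low: long it
        elif abs(z[t]) < exit_z:
            position[t] = 0             # back near the mean: close out
        else:
            position[t] = position[t - 1]  # otherwise hold the position
    return z, position

if __name__ == "__main__":
    # One trading day of minute bars (390 minutes), arbitrary parameters.
    spread = simulate_ou(theta=5.0, mu=0.0, sigma=0.3, x0=0.5, dt=1 / 390, n_steps=390)
    z, pos = pairs_trading_signal(spread)
    print(pos[:20])
```

In a real backtest the in-sample mean and standard deviation would have to be estimated on a rolling window rather than on the full series, which is precisely the calibration issue the thesis emphasizes.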
Abstract:
There are many factors that influence the day-ahead market bidding strategies of a generation company (GenCo) in the current energy market framework. Environmental policy issues have become more and more important for fossil-fuelled power plants and have to be considered in their management, giving rise to emission limitations. This work investigates the influence of both the allowances and emission reduction plans and the incorporation of medium-term derivative commitments on the optimal generation bidding strategy in the day-ahead electricity market. Two different technologies have been considered: coal thermal units, a high-emission technology, and combined cycle gas turbine units, a low-emission technology. The Iberian Electricity Market and the Spanish National Emissions and Allocation Plans provide the framework for dealing with environmental issues in the day-ahead market bidding strategies. To address emission limitations, some of the standard risk management methodologies developed for financial markets, such as Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR), have been extended. This study offers electricity generation utilities a mathematical model to determine, for each of their generation units, the individual optimal generation bid to the wholesale electricity market that maximizes the long-run profits of the utility while abiding by the Iberian Electricity Market rules, the environmental restrictions set by the EU Emission Trading Scheme, and the restrictions set by the Spanish National Emissions Reduction Plan. The economic implications for a GenCo of including the environmental restrictions of these National Plans are analyzed and the most remarkable results are presented.
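For reference, the two risk measures mentioned above have the following standard definitions for a loss variable L at confidence level α; this is the textbook (Rockafellar-Uryasev) formulation of CVaR, not necessarily the exact extension developed in the study:

```latex
\mathrm{VaR}_{\alpha}(L) = \inf\{\,\ell \in \mathbb{R} : \Pr(L \le \ell) \ge \alpha \,\},
\qquad
\mathrm{CVaR}_{\alpha}(L) = \min_{\ell \in \mathbb{R}} \left\{ \ell + \frac{1}{1-\alpha}\,\mathbb{E}\big[(L-\ell)^{+}\big] \right\},
```

where (L - ℓ)^+ = max(L - ℓ, 0); under mild conditions the minimizer ℓ* coincides with VaR_α(L), and CVaR_α(L) is the expected loss conditional on losses exceeding that level.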
Abstract:
This paper studies the limits of discrete-time repeated games with public monitoring. We solve and characterize the Abreu, Milgrom and Pearce (1991) problem. We find that for the "bad" ("good") news model, the lower (higher) magnitude events suggest cooperation, i.e., zero punishment probability, while the higher (lower) magnitude events suggest defection, i.e., punishment with probability one. Public correlation is used to connect these two sets of signals and to make the enforceability constraint bind. The dynamic and limit behavior of the punishment probabilities under variations in the discount rate and in the time interval between periods is characterized, as well as the limit payoffs for all these scenarios (we also introduce uncertainty in the time domain). One set of the obtained limits is, to the best of my knowledge, new; the other coincides with Fudenberg and Levine (2007) and Fudenberg and Olszewski (2011), with the exception that we clearly state the precise informational conditions under which the limit converges from above, converges from below, or degenerates. JEL: C73, D82, D86. KEYWORDS: Repeated Games, Frequent Monitoring, Random Public Monitoring, Moral Hazard, Stochastic Processes.
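To fix ideas, here is a generic version of the enforceability condition in this class of models, written for strongly symmetric strategies with a punishment probability φ(y) attached to each public signal y; the notation is mine and not necessarily the paper's:

```latex
(1-\delta)\,\big[u(D,C) - u(C,C)\big] \;\le\; \delta\,(v - w)\sum_{y} \big[\pi(y \mid D,C) - \pi(y \mid C,C)\big]\,\varphi(y),
```

where δ is the discount factor, v the cooperative continuation value, w the punishment value, and π(y | ·) the distribution of the public signal given the action profile. The "bad news" versus "good news" distinction concerns which signals make the likelihood difference π(y | D,C) - π(y | C,C) large, and hence which signals should carry the punishment probability.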
Abstract:
The Medicon Valley cluster is located in the binational Øresund region spanning Denmark and Sweden, which includes the university city of Lund and Sweden's third-largest city, Malmö (see Figure 1). In 2000, these two national parts were physically connected by the establishment of the 18-kilometre-long Øresund fixed link (bridge and tunnel).
Abstract:
Hepatitis C virus (HCV) NS3-4A is a membrane-associated multifunctional protein harboring serine protease and RNA helicase activities. It is an essential component of the HCV replication complex and a prime target for antiviral intervention. Here, we show that membrane association and structural organization of HCV NS3-4A are ensured in a cooperative manner by two membrane-binding determinants. We demonstrate that the N-terminal 21 amino acids of NS4A form a transmembrane alpha-helix that may be involved in intramembrane protein-protein interactions important for the assembly of a functional replication complex. In addition, we demonstrate that amphipathic helix alpha(0), formed by NS3 residues 12-23, serves as a second essential determinant for membrane association of NS3-4A, allowing proper positioning of the serine protease active site on the membrane. These results allowed us to propose a dynamic model for the membrane association, processing, and structural organization of NS3-4A on the membrane. This model has implications for the functional architecture of the HCV replication complex, proteolytic targeting of host factors, and drug design.
Abstract:
The evaluation of large projects raises well-known difficulties because, by definition, they modify the current price system; their public evaluation presents additional difficulties because they also modify the shadow prices that exist without the project. This paper first analyzes the basic methodologies applied until the late 1980s, based on the integration of projects into optimization models or, alternatively, on iterative procedures with information exchange between two organizational levels. The newer methodologies applied afterwards are based on variational inequalities, bilevel programming, and linear or nonlinear complementarity. Their foundations and different applications related to project evaluation are explored. As a matter of fact, these new tools are closely related to each other and can treat more complex cases involving, for example, the reaction of agents to policies or the existence of multiple agents in an environment characterized by common functions representing demands or constraints on polluting emissions.
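As a point of reference for the newer methodologies listed above, a generic bilevel program (in notation of my own choosing, not the paper's) takes the form:

```latex
\begin{aligned}
\min_{x \in X}\;& F\big(x, y^{*}(x)\big) \\
\text{s.t. }\;& G\big(x, y^{*}(x)\big) \le 0, \\
& y^{*}(x) \in \arg\min_{y \in Y}\; \big\{\, f(x, y) : g(x, y) \le 0 \,\big\},
\end{aligned}
```

where the upper level can be read as the planner choosing the project or policy x, and the lower level as the equilibrium reaction y*(x) of the agents; replacing the lower-level problem by its optimality conditions leads to the complementarity and variational-inequality formulations mentioned in the abstract.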
Abstract:
Theories on social capital and on social entrepreneurship have mainly highlighted the capacity of social capital to generate enterprises and to foster good relations between third sector organizations and the public sector. This paper considers social capital in a specific third sector enterprise; here, multi-stakeholder social cooperatives are seen, at the same time, as results, creators and incubators of social capital. In the particular enterprises that identify themselves as community social enterprises, social capital, both as organizational and relational capital, is fundamental: SCEs arise from, but also produce and disseminate, social capital. This paper aims to improve the understanding of how relational social capital is built and how helpful relations drawn from other arenas, where they were created, are refined and sometimes transferred to other settings, where their role is carried further (often within non-profit, horizontally and vertically arranged groups that share resources and relations). To represent this perspective, we use a qualitative system dynamics approach in which social capital is measured using proxies. Cooperation of volunteers, customers, community leaders and local third sector organizations is fundamental to establish trust relations between local public authorities and cooperatives. These relations help the latter to maintain long-term contracts with local authorities as providers of social services and enable them to add innovation to their services by developing experiences and management models and by maintaining an interchange with civil servants regarding these matters. The long-term relations and the organizational relations linking SCEs and public organizations help to create and to renew social capital. Thus, multi-stakeholder cooperatives originated via social capital developed in third sector organizations produce new social capital within the cooperatives themselves and between different cooperatives (the entrepreneurial components of the third sector) and the public sector. In their entrepreneurial life, cooperatives have to counteract the "working drift," as a result of which only workers remain as members of the cooperative while other stakeholders leave the organization. Those who are not workers in the cooperative are (stake)holders with "weak ties," who are nevertheless fundamental in making a workers' cooperative an authentic multi-stakeholder social cooperative. To maintain multi-stakeholder governance and the relations with the third sector and civil society, social cooperatives have to reinforce participation and dialogue with civil society through ongoing efforts to include people who bring social proposals. We try to represent these processes in a system dynamics model applied to local cooperatives, measuring the social capital created by the social cooperative through proxies, such as the number of volunteers and the strength of cooperation with public institutions. Using a reverse-engineering approach, we can identify the determinants of the creation of social capital and thereby support governance that creates social capital.
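Purely as an illustration of what a system-dynamics representation with proxy variables can look like, here is a toy Python sketch with invented stocks, flows and parameters; it is not the authors' model, and the numerical assumptions carry no empirical content.

```python
import numpy as np

def simulate_sd(years=20, dt=0.25):
    """Toy stock-flow model: social capital grows with volunteers and with
    cooperation with public institutions, and decays when not renewed."""
    steps = int(years / dt)
    social_capital = np.empty(steps)
    volunteers = np.empty(steps)
    social_capital[0], volunteers[0] = 10.0, 50.0
    cooperation_intensity = 0.6   # proxy for ties with public institutions (0..1), assumed constant
    for t in range(1, steps):
        recruitment = 0.05 * social_capital[t - 1]    # social capital attracts new volunteers
        attrition = 0.08 * volunteers[t - 1]
        creation = 0.04 * volunteers[t - 1] + 2.0 * cooperation_intensity
        decay = 0.10 * social_capital[t - 1]          # stand-in for the "working drift" (toy assumption)
        volunteers[t] = volunteers[t - 1] + dt * (recruitment - attrition)
        social_capital[t] = social_capital[t - 1] + dt * (creation - decay)
    return volunteers, social_capital

if __name__ == "__main__":
    v, sc = simulate_sd()
    print(f"final volunteers: {v[-1]:.1f}, final social capital: {sc[-1]:.1f}")
```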
Abstract:
We construct a dynamic theory of civil conflict hinging on inter-ethnic trust and trade. The model economy is inhabited by two ethnic groups. Inter-ethnic trade requires imperfectly observed bilateral investments, and one group has to form beliefs about the average propensity to trade of the other group. Since conflict disrupts trade, the onset of a conflict signals that the aggressor has a low propensity to trade. Agents observe the history of conflicts and update their beliefs over time, transmitting them to the next generation. The theory yields a set of testable predictions. First, war is a stochastic process whose frequency depends on the state of endogenous beliefs. Second, the probability of future conflicts increases after each conflict episode. Third, "accidental" conflicts that do not reflect economic fundamentals can lead to a permanent breakdown of trust, plunging a society into a vicious cycle of recurrent conflicts (a war trap). The incidence of conflict can be reduced by policies abating cultural barriers, fostering inter-ethnic trade and human capital, and shifting beliefs. Coercive peace policies such as peacekeeping forces or externally imposed regime changes, in contrast, have no persistent effects.
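As a purely numerical illustration of the mechanism described above (not the authors' model; the functional forms and parameter values are invented), the toy Python simulation below lets conflict probability fall with inter-ethnic trust, lets conflict episodes erode trust, and lets peaceful periods rebuild it slowly, so that an "accidental" conflict can push the system into recurrent conflict.

```python
import numpy as np

def simulate_war_trap(periods=200, trust0=0.8, noise=0.05, seed=1):
    """Toy illustration of a war trap: low trust raises conflict risk,
    conflict lowers trust, peaceful trade restores it only gradually."""
    rng = np.random.default_rng(seed)
    trust = np.empty(periods)
    conflict = np.zeros(periods, dtype=bool)
    trust[0] = trust0
    for t in range(1, periods):
        # Conflict probability: decreasing in trust, plus an "accidental"
        # component unrelated to fundamentals.
        p_conflict = 0.02 + 0.5 * (1.0 - trust[t - 1]) + noise * rng.random()
        conflict[t] = rng.random() < p_conflict
        if conflict[t]:
            trust[t] = 0.7 * trust[t - 1]   # conflict is read as a signal of low propensity to trade
        else:
            trust[t] = trust[t - 1] + 0.02 * (1.0 - trust[t - 1])  # trade slowly rebuilds trust
    return trust, conflict

if __name__ == "__main__":
    trust, conflict = simulate_war_trap()
    print(f"conflict frequency: {conflict.mean():.2f}, final trust: {trust[-1]:.2f}")
```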
Abstract:
Nessie is an Autonomous Underwater Vehicle (AUV) created by a team of students at Heriot-Watt University to compete in the Student Autonomous Underwater Competition, Europe (SAUC-E) in August 2006. The main objective of the project is to obtain the dynamic equations of the robot, i.e., its dynamic model. With it, the behaviour of the robot is easier to understand, and movement tests can be run on a computer without the physical robot, which saves time, batteries and money and protects the robot from water leaking inside it. The objective of the second part of this project is to design a control system for Nessie using the model.
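For context, the dynamic model sought for an underwater vehicle is usually written in the standard matrix form of marine-vehicle dynamics (e.g., Fossen's formulation); this is the generic template, not necessarily the exact model identified for Nessie:

```latex
M\dot{\nu} + C(\nu)\,\nu + D(\nu)\,\nu + g(\eta) = \tau, \qquad \dot{\eta} = J(\eta)\,\nu,
```

where ν is the body-fixed velocity vector, η the position and orientation in the earth-fixed frame, M the inertia matrix (including added mass), C(ν) the Coriolis and centripetal matrix, D(ν) the hydrodynamic damping matrix, g(η) the restoring forces and moments, and τ the control forces and moments. The control-design part of the project then amounts to choosing τ so that η follows a desired trajectory.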
Abstract:
Ground-penetrating radar (GPR) has the potential to provide valuable information on hydrological properties of the vadose zone because of its strong sensitivity to soil water content. In particular, recent evidence has suggested that the stochastic inversion of crosshole GPR data within a coupled geophysical-hydrological framework may allow for effective estimation of subsurface van Genuchten-Mualem (VGM) parameters and their corresponding uncertainties. An important and still unresolved issue, however, is how to best integrate GPR data into a stochastic inversion in order to estimate the VGM parameters and their uncertainties, thus improving hydrological predictions. Recognizing the importance of this issue, the aim of the research presented in this thesis was first to introduce a fully Bayesian Markov chain Monte Carlo (MCMC) strategy to perform the stochastic inversion of steady-state GPR data and thereby estimate the VGM parameters and their uncertainties. Within this study, the choice of the prior parameter probability distributions, from which potential model configurations are drawn and tested against observed data, was also investigated. Analysis of both synthetic and field data collected at the Eggborough (UK) site indicates that the geophysical data alone contain valuable information regarding the VGM parameters. However, significantly better results are obtained when these data are combined with a realistic, informative prior. A subsequent study explores in detail the dynamic infiltration case, specifically to what extent time-lapse ZOP GPR data, collected during a forced infiltration experiment at the Arrenaes field site (Denmark), can help to quantify VGM parameters and their uncertainties using the MCMC inversion strategy. The findings indicate that the stochastic inversion of time-lapse GPR data does indeed allow for a substantial refinement in the inferred posterior VGM parameter distributions. In turn, this significantly improves knowledge of the hydraulic properties, which are required to predict hydraulic behaviour. Finally, another aspect that needed to be addressed involved the comparison of time-lapse GPR data collected under different infiltration conditions (i.e., natural loading and forced infiltration conditions) to estimate the VGM parameters using the MCMC inversion strategy. The results show that, for the synthetic example, considering data collected during a forced infiltration test helps to better refine soil hydraulic properties compared to data collected under natural infiltration conditions. When investigating data collected at the Arrenaes field site, further complications arose due to model error, showing the importance of also including a rigorous analysis of the propagation of model error with time and depth when considering time-lapse data. Although the efforts in this thesis were focused on GPR data, the corresponding findings are likely to have general applicability to other types of geophysical data and field environments. Moreover, the obtained results give confidence for future developments in the integration of geophysical data with stochastic inversions to improve the characterization of the unsaturated zone, but they also reveal important issues linked with stochastic inversions, namely model errors, that should definitely be addressed in future research.
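For reference, the van Genuchten-Mualem relations whose parameters are the targets of the inversions described above are, in their standard form (the exact parameterization used in the thesis may differ in details such as the pore-connectivity exponent l):

```latex
S_e(h) = \frac{\theta(h) - \theta_r}{\theta_s - \theta_r} = \left[\,1 + |\alpha h|^{\,n}\right]^{-m}, \qquad m = 1 - \tfrac{1}{n},
\qquad
K(S_e) = K_s\, S_e^{\,l} \left[\, 1 - \left(1 - S_e^{1/m}\right)^{m} \right]^{2},
```

where θ_r and θ_s are the residual and saturated water contents, α and n shape parameters, K_s the saturated hydraulic conductivity, and l the pore-connectivity parameter (commonly set to 0.5).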
The role of energetic value in dynamic brain response adaptation during repeated food image viewing.
Abstract:
The repeated presentation of simple objects as well as biologically salient objects can cause the adaptation of behavioral and neural responses during the visual categorization of these objects. Mechanisms of response adaptation during repeated food viewing are of particular interest for better understanding food intake beyond energetic needs. Here, we measured visual evoked potentials (VEPs) and conducted neural source estimations in response to initial and repeated presentations of high-energy and low-energy foods as well as non-food images. The results of our study show that the behavioral and neural responses to food and food-related objects are not uniformly affected by repetition. While the repetition of images displaying low-energy foods and non-food items modulated VEPs as well as their underlying neural sources and increased behavioral categorization accuracy, the responses to high-energy images remained largely invariant between initial and repeated encounters. Brain mechanisms engaged when viewing images of high-energy foods thus appear less susceptible to repetition effects than responses to low-energy and non-food images. This finding is likely related to the superior reward value of high-energy foods and might be one reason why high-energy foods in particular are indulged in despite potentially detrimental health consequences.