25 results for Coupling and Integration of Hydrologic Models II

at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance:

100.00%

Publisher:

Abstract:

The main purpose of this work is to give a survey of the main monotonicity properties of queueing processes based on the coupling method. The literature on this topic is quite extensive, and we do not consider all of its aspects. Our more concrete goal is to select the most interesting basic monotonicity results and give simple and elegant proofs. We also give new (or revised) proofs of a few important monotonicity properties for the queue-size and workload processes in both single-server and multi-server systems. The paper is organized as follows. In Section 1, the basic notions and results on the coupling method are given. Section 2 contains known coupling results for renewal processes, with a focus on the construction of synchronized renewal instants for a superposition of independent renewal processes. In Section 3, we present basic monotonicity results for the queue-size and workload processes. We consider both discrete- and continuous-time queueing systems with single and multiple servers. Less-known results on the monotonicity of queueing processes with dependent service times and interarrival times are also presented. Section 4 is devoted to the monotonicity of general Jackson-type queueing networks with Markovian routing. This section is based on the notable paper [17]. Finally, Section 5 contains elements of the stability analysis of regenerative queues and networks, where coupling and monotonicity results play a crucial role in establishing minimal sufficient stability conditions. In addition, we present some new monotonicity results for tandem networks.
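
As an illustration of the coupling argument behind these monotonicity results, the following sketch (illustrative code, not taken from the survey) couples two single-server FIFO queues on the same arrival stream, with the service times of the second queue dominating those of the first pathwise; the Lindley recursion then preserves the order of the waiting times sample path by sample path, and hence in distribution.

import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Common interarrival times: the coupling makes both queues see the same arrivals.
interarrivals = rng.exponential(1.0, size=n)

# Coupled service times with S2 >= S1 pathwise, so queue 2 is stochastically "slower".
s1 = rng.exponential(0.7, size=n)
s2 = 1.2 * s1

def lindley_waiting_times(service, interarrival):
    # Waiting times of a FIFO single-server queue via the Lindley recursion
    # W_{k+1} = max(W_k + S_k - T_k, 0).
    w = np.zeros(len(service))
    for k in range(len(service) - 1):
        w[k + 1] = max(w[k] + service[k] - interarrival[k], 0.0)
    return w

w1 = lindley_waiting_times(s1, interarrivals)
w2 = lindley_waiting_times(s2, interarrivals)

# Monotonicity holds pathwise under this coupling, hence also for the distributions.
assert np.all(w1 <= w2 + 1e-12)
print("mean waiting times:", w1.mean(), "<=", w2.mean())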

Relevance:

100.00%

Publisher:

Abstract:

This paper is a sequel to ``Normal forms, stability and splitting of invariant manifolds I. Gevrey Hamiltonians", in which we gave a new construction of resonant normal forms with an exponentially small remainder for near-integrable Gevrey Hamiltonians at a quasi-periodic frequency, using a method of periodic approximations. In this second part we focus on finitely differentiable Hamiltonians, and we derive normal forms with a polynomially small remainder. As applications, we obtain a polynomially large upper bound on the stability time for the evolution of the action variables and a polynomially small upper bound on the splitting of invariant manifolds for hyperbolic tori.
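
For orientation only (the notation below is a generic near-integrable setting, not necessarily the one used in the paper), the statements concern Hamiltonians of the form

\[
H(\theta, I) = h(I) + f(\theta, I), \qquad (\theta, I) \in \mathbb{T}^n \times \mathbb{R}^n,
\]

with a perturbation f of size \varepsilon and finite smoothness C^k. A resonant normal form with a remainder that is polynomially small in \varepsilon then yields confinement of the actions, |I(t) - I(0)| small for times |t| of order \varepsilon^{-a}, and a splitting of the invariant manifolds of hyperbolic tori of size of order \varepsilon^{b}, with exponents a, b > 0 depending on k.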

Relevance:

100.00%

Publisher:

Abstract:

The aim of this study is to define a new statistic, PVL, based on the relative distance between the likelihood associated with the simulation replications and the likelihood of the conceptual model. Our results, obtained from several simulation experiments of a clinical trial, show that the range of the PVL statistic can be a good measure of stability for establishing when a computational model verifies the underlying conceptual model. PVL also improves the analysis of simulation replications because a single statistic is associated with all the replications. The study also presents several verification scenarios, obtained by altering the simulation model, that show the usefulness of PVL. Further simulation experiments suggest that a 0 to 20% range may define adequate limits for the verification problem when it is considered from the viewpoint of an equivalence test.
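
The abstract does not give the exact definition of PVL, so the following is only an illustration of the general idea of comparing replication likelihoods against a reference likelihood; the formula, the Poisson outcome model and the 0-20% reading are assumptions of this sketch, not the paper's definitions.

import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)

# Conceptual model: patient event counts follow a Poisson law with this rate.
true_rate = 4.0
n_patients = 200
reference = rng.poisson(true_rate, size=n_patients)   # data generated by the conceptual model

# Replications produced by the computational model under verification (slightly biased here).
replications = [rng.poisson(4.2, size=n_patients) for _ in range(30)]

def log_lik(sample, rate):
    return poisson.logpmf(sample, rate).sum()

# Hypothetical relative-distance statistic between each replication's likelihood
# and the conceptual model's likelihood, both evaluated under the conceptual rate.
ref_ll = log_lik(reference, true_rate)
rel_dist = np.array([abs(log_lik(rep, true_rate) - ref_ll) / abs(ref_ll)
                     for rep in replications])

pvl_like = 100.0 * rel_dist   # expressed as a percentage
print(f"range: {pvl_like.min():.1f}% .. {pvl_like.max():.1f}%")
print("within a 0-20% band:", bool(np.all(pvl_like <= 20.0)))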

Relevance:

100.00%

Publisher:

Abstract:

Quantitative or algorithmic trading is the automation of investment decisions that obey a fixed or dynamic set of rules to determine trading orders. It has grown to account for up to 70% of the trading volume on some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped under the so-called technical models and the latter under so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as a well-defined scientific predictor if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. The concept of pairs trading, or market-neutral strategy, is on the other hand fairly simple; however, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reversion equation and its variations. A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor but could also lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtesting of the mentioned strategies. Conducting a useful and realistic backtesting is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is the reason why we emphasize the calibration of the strategies' parameters to adapt to the prevailing market conditions. We find that the parameters of the technical models are more volatile than their counterparts in the market-neutral strategies, and that calibration must be done at a high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtesting with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis. No other mathematical or statistical software was used.
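
As a toy illustration of the mean-reversion side (not the thesis's MATLAB implementation, and using synthetic rather than Bloomberg data), the sketch below builds a spread from a cointegrated pair of prices, fits a discrete-time Ornstein-Uhlenbeck (AR(1)) model to it, and generates entry signals from the z-score; each signal depends only on information up to the current bar, i.e. it is F_t-measurable.

import numpy as np

rng = np.random.default_rng(2)
n = 2_000

# Synthetic cointegrated pair: a common random walk plus a mean-reverting spread.
common = np.cumsum(rng.normal(0.0, 0.5, n))
spread_true = np.zeros(n)
for t in range(1, n):
    spread_true[t] = 0.95 * spread_true[t - 1] + rng.normal(0.0, 0.3)
price_a = 100.0 + common + spread_true
price_b = 100.0 + common

# Hedge ratio by ordinary least squares: price_a ~ beta * price_b + const.
beta = np.polyfit(price_b, price_a, 1)[0]
spread = price_a - beta * price_b

# Fit the AR(1) / discrete-time Ornstein-Uhlenbeck model s_t = a + phi * s_{t-1} + eps_t.
phi, a = np.polyfit(spread[:-1], spread[1:], 1)
mean = a / (1.0 - phi)
resid = spread[1:] - (a + phi * spread[:-1])
sigma_eq = resid.std() / np.sqrt(1.0 - phi ** 2)   # stationary standard deviation of the spread

# Trading signal from the z-score; it uses only data available up to the current bar.
z = (spread - mean) / sigma_eq
position = np.where(z > 2.0, -1, np.where(z < -2.0, 1, 0))   # short/long the spread
pnl = position[:-1] * np.diff(spread)
print("hedge ratio:", round(beta, 3), " total toy P&L:", round(pnl.sum(), 2))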

Relevance:

100.00%

Publisher:

Abstract:

Background: Single nucleotide polymorphisms (SNPs) are the most frequent type of sequence variation between individuals and represent a promising tool for finding genetic determinants of complex diseases and understanding the differences in drug response. In this regard, it is of particular interest to study the effect of non-synonymous SNPs in the context of biological networks such as cell signalling pathways. UniProt provides curated information about the functional and phenotypic effects of sequence variation, including SNPs, as well as about mutations of protein sequences. However, no strategy has been developed to integrate this information with biological networks, with the ultimate goal of studying the impact of the functional effect of SNPs on the structure and dynamics of biological networks. Results: First, we identified the different challenges posed by the integration of the phenotypic effect of sequence variants and mutations with biological networks. Second, we developed a strategy for the combination of data extracted from public resources, such as UniProt, NCBI dbSNP, Reactome and BioModels. We generated attribute files containing phenotypic and genotypic annotations for the nodes of biological networks, which can be imported into network visualization tools such as Cytoscape. These resources allow the mapping and visualization of mutations and natural variations of human proteins and their phenotypic effects on biological networks (e.g. signalling pathways, protein-protein interaction networks, dynamic models). Finally, an example of the use of sequence variation data in the dynamics of a network model is presented. Conclusion: In this paper we present a general strategy for the integration of pathway and sequence variation data for visualization, analysis and modelling purposes, including the study of the functional impact of protein sequence variations on the dynamics of signalling pathways. This is of particular interest when the SNP or mutation is known to be associated with disease. We expect that this approach will help in the study of the functional impact of disease-associated SNPs on the behaviour of cell signalling pathways, which ultimately will lead to a better understanding of the mechanisms underlying complex diseases.
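
A minimal sketch of the kind of node attribute file described here (the accessions, variant identifiers and effect annotations below are placeholders, not curated data; a real pipeline would pull them from UniProt and dbSNP): it writes a tab-separated table that can be imported in Cytoscape and mapped onto the protein nodes of a pathway.

import csv

# UniProt accession -> list of (dbSNP id, protein change, annotated effect).
# All entries are illustrative placeholders.
variants = {
    "P04637": [("rs0000001", "p.R175H", "loss of function (placeholder annotation)")],
    "P01112": [("rs0000002", "p.G12V", "constitutive activation (placeholder annotation)")],
    "P38398": [],   # protein with no annotated non-synonymous variant in this toy set
}

with open("node_attributes.tsv", "w", newline="") as fh:
    writer = csv.writer(fh, delimiter="\t")
    writer.writerow(["uniprot_ac", "n_variants", "variant_ids", "phenotypic_effect"])
    for accession, entries in variants.items():
        ids = ";".join(v[0] for v in entries) or "NA"
        effects = ";".join(f"{v[1]}: {v[2]}" for v in entries) or "NA"
        writer.writerow([accession, len(entries), ids, effects])

print("Wrote node_attributes.tsv (import as a node attribute table in Cytoscape).")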

Relevance:

100.00%

Publisher:

Abstract:

In this paper I review a series of theoretical concepts that are relevant for the integrated assessment of agricultural sustainability but that are not generally included in the curriculum of the various scientific disciplines dealing with the quantitative analysis of agriculture. I first illustrate, with plain narratives and concrete examples, that sustainability is an extremely complex issue requiring the simultaneous consideration of several aspects, which cannot be reduced to a single indicator of performance. Next, I justify this obvious need for multi-criteria analysis with theoretical concepts dealing with the epistemological predicament of complexity, starting from classic philosophical lessons and arriving at recent developments in complex system theory, in particular Rosen's theory of the modelling relation, which is essential for analyzing the quality of any quantitative representation. The implications of these theoretical concepts are then illustrated with applications of multi-criteria analysis to the sustainability of agriculture. I wrap up by pointing out the crucial difference between "integrated assessment" and "integrated analysis". An integrated analysis is a set of indicators and analytical models generating an analytical output. An integrated assessment is much more than that. It is about finding an effective way to deal with three key issues: (i) legitimacy – how to handle the unavoidable existence of legitimate but contrasting points of view about the different meanings given by social actors to the word "development"; (ii) pertinence – how to handle in a coherent way scientific analyses referring to different scales and dimensions; and (iii) credibility – how to handle the unavoidable existence of uncertainty and genuine ignorance when dealing with the analysis of future scenarios.

Relevance:

100.00%

Publisher:

Abstract:

DNA methylation has an important impact on normal cell physiology, so any defect in this mechanism may be related to the development of various diseases. In this project we are interested in identifying epigenetically modified genes, in general controlled by processes related to DNA methylation, by means of a new strategy combining proteomic and genomic analyses. First, two-dimensional difference gel electrophoresis (2-DIGE) protein analyses of extracts obtained from HCT-116 wt cells and cells with a double knockout of DNMT1 and DNMT3b (DKO) revealed 34 proteins overexpressed under DNMT depletion. Of five genes with higher transcript levels in DKO cells compared with HCT-116 wt, only AKR1B1, UCHL1 and VIM are methylated in HCT-116. As expected, this DNA methylation is lost in DKO cells. The methylation of the VIM and UCHL1 promoters in some cancer samples has already been reported, so further studies focused on AKR1B1. Regulation of AKR1B1 expression by DNA methylation of its promoter region seems to occur specifically in colon cancer cell lines, which was confirmed by the DNA methylation status and expression analyses performed on 32 different cancer cell lines (including colon, breast, lymphoma, leukemia, neuroblastoma, glioma and lung cancer cell lines) as well as normal colon and normal lymphocyte samples. AKR1B1 expression was rescued after treatment with a DNA demethylating agent (AZA) in 5 colon cancer cell lines, supporting the epigenetic regulation of the candidate gene. The methylation status of the rest of the genes identified in the proteomic analysis was checked by methylation-specific PCR (MSP), and all appeared to be unmethylated. Similar research has also been carried out by means of a Mecp2-null mouse model; for 14 selected candidate genes, analyses of expression levels, methylation status and MeCP2 interaction with promoters are currently being performed.

Relevance:

100.00%

Publisher:

Abstract:

The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data presents a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system's architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms inside a single environment that offers clinicians the tools to analyze and interpret patient data and make use of knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm-related data due to the system's ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers are able to seamlessly access and work on data that is distributed across multiple sites in a secure way, in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.

Relevance:

100.00%

Publisher:

Abstract:

This paper points out an empirical puzzle that arises when an RBC economy with a job matching function is used to model unemployment. The standard model can generate sufficiently large cyclical fluctuations in unemployment, or a sufficiently small response of unemployment to labor market policies, but it cannot do both. Variable search and separation, finite UI benefit duration, efficiency wages, and capital all fail to resolve this puzzle. However, both sticky wages and match-specific productivity shocks help the model reproduce the stylized facts: both make the firm's flow of surplus more procyclical, thus making hiring more procyclical too.
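
For readers unfamiliar with the mechanism, a textbook DMP-style matching function and job creation condition (a generic formulation, not necessarily this paper's exact specification) make the link between the firm's flow of surplus and hiring explicit:

\[
M(u_t, v_t) = \mu\, u_t^{\alpha} v_t^{1-\alpha}, \qquad \theta_t = \frac{v_t}{u_t}, \qquad q(\theta_t) = \frac{M(u_t, v_t)}{v_t},
\]
\[
\frac{c}{q(\theta_t)} = \mathbb{E}_t\, \beta \left[ p_{t+1} - w_{t+1} + (1 - s)\, \frac{c}{q(\theta_{t+1})} \right],
\]

where c is the vacancy posting cost, s the separation rate, and p - w the firm's flow of surplus. Anything that makes p - w more procyclical (sticky wages prevent w from rising one-for-one with p; match-specific productivity shocks likewise raise the surplus in expansions) raises the right-hand side in booms, and therefore makes vacancy posting, and hence hiring, more procyclical.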

Relevance:

100.00%

Publisher:

Abstract:

Research, teaching and service are the main activities carried out in almost all European universities. Previous research, which has been mainly centred on North American universities, has found solid results indicating that research and teaching are not equally valued when deciding on faculty promotion. This conclusion creates a potential conflict for accounting academics over how to distribute working time in order to accomplish personal career objectives. This paper presents the results of a survey carried out in two European countries, Spain and the United Kingdom, which was intended to explore the opinions and personal experience of accounting academics working in these countries. Specifically, we focus on the following issues: (i) the impact of teaching and service on time available for research; (ii) the integration of teaching and research; (iii) the perceived value of teaching and research for career success; and (iv) the interaction between professional accounting and accounting research. The results show that both in Spain and in the United Kingdom there is a conflict between teaching and research, which has its origin in the importance attached to research activities in promotion decisions. It also seems evident that, so far, the conflict is being resolved in favour of research to the detriment of teaching.

Relevance:

100.00%

Publisher:

Abstract:

This paper theoretically and empirically documents a puzzle that arises when an RBC economy with a job matching function is used to model unemployment. The standard model can generate sufficiently large cyclical fluctuations in unemployment, or a sufficiently small response of unemployment to labor market policies, but it cannot do both. Variable search and separation, finite UI benefit duration, efficiency wages, and capital all fail to resolve this puzzle. However, either sticky wages or match-specific productivity shocks can improve the model's performance by making the firm's flow of surplus more procyclical, which makes hiring more procyclical too.

Relevance:

100.00%

Publisher:

Abstract:

This line of research of my group aims to establish a silicon technological platform in the field of photonics allowing the development of a wide set of applications. In particular, what is still lacking in silicon photonics is an efficient and integrable light source such as an LED or a laser. Nanocrystals in silicon oxide or nitride matrices have recently been demonstrated as competitive materials for both active components (electrically and optically driven light emitters and optical amplifiers) and passive ones (waveguides and modulators). The final goal is the achievement of a complete integration of electronic and optical functions on the same CMOS chip. The first part of this paper will introduce the structural and optical properties of LEDs fabricated from silicon nanostructures. The second will treat the interaction of such nanocrystals with rare-earth elements (Er), which leads to an efficient hybrid system emitting in the third window of optical fibers. I will present the fabrication and assessment of optical waveguide amplifiers at 1.54 µm, for which we have recently been able to demonstrate optical gain in waveguides made from sputtered silicon suboxide materials.

Relevance:

100.00%

Publisher:

Abstract:

The intensity correlation functions C(t) for the colored-gain-noise model of dye lasers are analyzed and compared with those for the loss-noise model. For correlation times τ larger than the deterministic relaxation time td, we show with the use of the adiabatic approximation that the C(t) values coincide for both models. For small correlation times we use a method that provides explicit expressions of non-Markovian correlation functions, approximating short- and long-time behaviors simultaneously. Comparison with numerical simulations shows excellent agreement in both the short- and long-time regimes. It is found that, when the correlation time of the noise increases, the differences between the gain- and loss-noise models tend to disappear. The decay of C(t) for both models can be described by a time scale that approaches the deterministic relaxation time. However, in contrast with the loss-noise model, a secondary time scale remains at large times for the gain-noise model, which could allow one to distinguish between the two models.
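
Purely as an illustration of how such correlation functions can be estimated numerically (a generic Langevin toy model with Ornstein-Uhlenbeck gain noise, not the specific dye-laser equations analyzed in the paper), the following sketch simulates the intensity, estimates the normalized correlation C(t), and reports a crude decay time for a few noise correlation times.

import numpy as np

rng = np.random.default_rng(3)

def intensity_correlation(tau, a0=1.0, D=0.05, dt=1e-3, steps=400_000, max_lag=3_000):
    # Toy model dI/dt = 2*I*(a0 + p(t) - I), with p(t) an Ornstein-Uhlenbeck
    # process of correlation time tau; returns lags and the normalized
    # correlation C(t) = <dI(0) dI(t)> / <dI^2> from the stationary trajectory.
    xi = rng.normal(size=steps)
    I, p = a0, 0.0                      # start near the deterministic steady state I* = a0
    traj = np.empty(steps)
    for k in range(steps):
        p += (-p / tau) * dt + np.sqrt(2.0 * D * dt) / tau * xi[k]
        I += 2.0 * I * (a0 + p - I) * dt
        I = max(I, 1e-12)               # the intensity stays positive
        traj[k] = I
    dI = traj[steps // 4:] - traj[steps // 4:].mean()   # drop the transient, center
    var = np.dot(dI, dI) / len(dI)
    lags = np.arange(max_lag)
    C = np.array([np.dot(dI[: len(dI) - L], dI[L:]) / (len(dI) - L) for L in lags]) / var
    return lags * dt, C

for tau in (0.05, 0.5, 2.0):
    t, C = intensity_correlation(tau)
    below = np.flatnonzero(C < np.exp(-1.0))
    decay = t[below[0]] if below.size else float("nan")
    print(f"tau = {tau:4.2f}  ->  estimated 1/e decay time of C(t): {decay:.3f}")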

Relevance:

100.00%

Publisher:

Abstract:

The material presented in these notes covers the sessions Modelling of electromechanical systems, Passive control theory I and Passive control theory II of the II EURON/GEOPLEX Summer School on Modelling and Control of Complex Dynamical Systems. We start with a general description of what an electromechanical system is from a network modelling point of view. Next, a general formulation in terms of PHDS is introduced, and some of the previous electromechanical systems are rewritten in this formalism. Power converters, which are variable structure systems (VSS), can also be given a PHDS form. We conclude the modelling part of these lectures with a rather complex example showing the interconnection of subsystems from several domains, namely an arrangement to temporarily store the surplus energy in a section of a metropolitan transportation system based on dc motor vehicles, using either arrays of supercapacitors or an electrically powered flywheel. The second part of the lectures addresses the control of PHD systems. We first present the idea of control as the power-preserving interconnection of a plant and a controller, together with the dissipation obstacle that limits this approach. Next we discuss how to circumvent this obstacle and present the basic ideas of Interconnection and Damping Assignment (IDA) passivity-based control of PHD systems.
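
For reference, the input-state-output port-Hamiltonian (PHDS) form used in this kind of modelling, and the matching condition underlying IDA-PBC, can be written as follows (a generic formulation; the notes develop the details):

\[
\dot{x} = \big[ J(x) - R(x) \big] \nabla H(x) + g(x)\, u, \qquad
y = g(x)^{\top} \nabla H(x),
\]

with J = -J^T the interconnection matrix, R = R^T >= 0 the dissipation matrix and H the total stored energy, so that dH/dt <= y^T u (passivity). IDA-PBC seeks a state feedback u = β(x) such that the closed loop is again port-Hamiltonian,

\[
\dot{x} = \big[ J_d(x) - R_d(x) \big] \nabla H_d(x),
\]

with a desired energy function H_d having a minimum at the target equilibrium; β(x) is obtained from the matching equation

\[
\big[ J_d(x) - R_d(x) \big] \nabla H_d(x) = \big[ J(x) - R(x) \big] \nabla H(x) + g(x)\, \beta(x).
\]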