989 results for Linear integrated circuits
Abstract:
A new algorithm called the parameterized expectations approach (PEA) for solving dynamic stochastic models under rational expectations is developed and its advantages and disadvantages are discussed. This algorithm can, in principle, approximate the true equilibrium arbitrarily well. Also, this algorithm works from the Euler equations, so that the equilibrium does not have to be cast in the form of a planner's problem. Monte Carlo integration and the absence of grids on the state variables cause the computation costs not to go up exponentially when the number of state variables or the exogenous shocks in the economy increase. As an application we analyze an asset pricing model with endogenous production. We analyze its implications for the time dependence of volatility of stock returns and the term structure of interest rates. We argue that this model can generate hump-shaped term structures.
Abstract:
Electron microscopy was used to monitor the fate of reconstituted nucleosome cores during in vitro transcription of long linear and supercoiled multinucleosomic templates by the prokaryotic T7 RNA polymerase and the eukaryotic RNA polymerase II. Transcription by T7 RNA polymerase disrupted the nucleosomal configuration in the transcribed region, while nucleosomes were preserved upstream of the transcription initiation site and in front of the polymerase. Nucleosome disruption was independent of the topology of the template, linear or supercoiled, and of the presence or absence of nucleosome positioning sequences in the transcribed region. In contrast, the nucleosomal configuration was preserved during transcription from the vitellogenin B1 promoter with RNA polymerase II in a rat liver total nuclear extract. However, the persistence of nucleosomes on the template was not RNA polymerase II-specific, but was dependent on another activity present in the nuclear extract. This was demonstrated by addition of the extract to the T7 RNA polymerase transcription reaction, which resulted in retention of the nucleosomal configuration. This nuclear activity, also found in HeLa cell nuclei, is heat sensitive and could not be substituted by nucleoplasmin, chromatin assembly factor (CAF-I) or a combination thereof. Altogether, these results identify a novel nuclear activity, called herein transcription-dependent chromatin stabilizing activity I or TCSA-I, which may be involved in a nucleosome transfer mechanism during transcription.
Abstract:
The problems arising in the logistics of commercial distribution are complex and involve several players and decision levels. One important decision is related to the design of the routes to distribute the products in an efficient and inexpensive way. This article explores three different distribution strategies: the first strategy corresponds to the classical vehicle routing problem; the second is a master route strategy with daily adaptations; and the third is a strategy that takes into account cross-functional planning through a multi-objective model with two objectives. All strategies are analyzed in a multi-period scenario. A metaheuristic based on Iterated Local Search is used to solve the models related to each strategy. A computational experiment is performed to evaluate the three strategies with respect to the two objectives. The cross-functional planning strategy leads to solutions that put in practice the coordination between functional areas and better meet business objectives.
Abstract:
The least-squares optimization problem is presented as an important class of unconstrained minimization problems. The importance of this class of problems derives from its well-known applications to parameter estimation in the context of regression analysis and of the solution of systems of nonlinear equations. A review of linear least-squares optimization methods and of some known linearization techniques is presented. The main gradient-based methods used for general nonlinear problems are studied: Newton's method and its modifications, including the most widely used quasi-Newton methods (DFP and BFGS). Gradient-based methods specific to least-squares problems are then introduced: Gauss-Newton and Levenberg-Marquardt. A variety of examples selected from the literature is presented to test the different methods using MATLAB routines. A comparative analysis of the algorithms, based on these computational experiments, exhibits the advantages and disadvantages of the different methods.
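The Gauss-Newton and Levenberg-Marquardt iterations discussed in this abstract can be sketched in a few lines. The following is a minimal, hypothetical illustration (the abstract's own tests use MATLAB routines, not this code) that fits a two-parameter exponential model to noiseless synthetic data; the damping update rule shown is one common choice among several:

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, x0, max_iter=200, lam=1e-3, tol=1e-10):
    """Minimise 0.5*||residual(x)||^2 with a basic Levenberg-Marquardt loop."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        J = jacobian(x)
        g = J.T @ r                    # gradient of the least-squares objective
        if np.linalg.norm(g) < tol:
            break
        # Damped normal equations: lam -> 0 recovers Gauss-Newton,
        # large lam gives a small gradient-descent-like step.
        step = np.linalg.solve(J.T @ J + lam * np.eye(x.size), -g)
        if np.sum(residual(x + step) ** 2) < np.sum(r ** 2):
            x = x + step
            lam *= 0.5                 # good step: trust the model more
        else:
            lam *= 2.0                 # bad step: increase damping
    return x

# Hypothetical example: fit y = a*exp(b*t), true (a, b) = (2.0, -1.5).
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * t)
res = lambda p: p[0] * np.exp(p[1] * t) - y
jac = lambda p: np.column_stack([np.exp(p[1] * t), p[0] * t * np.exp(p[1] * t)])
p_hat = levenberg_marquardt(res, jac, x0=[1.0, 0.0])
```

Because the synthetic data are noiseless, this is a zero-residual problem, the regime where Gauss-Newton-type steps converge fastest.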
Abstract:
The statistical properties of inflation and, in particular, its degree of persistence and stability over time are a subject of intense debate and no consensus has been achieved yet. The goal of this paper is to analyze this controversy using a general approach, with the aim of providing a plausible explanation for the existing contradictory results. We consider the inflation rates of 21 OECD countries which are modelled as fractionally integrated (FI) processes. First, we show analytically that FI can appear in inflation rates after aggregating individual prices from firms that face different costs of adjusting their prices. Then, we provide robust empirical evidence supporting the FI hypothesis using both classical and Bayesian techniques. Next, we estimate impulse response functions and other scalar measures of persistence, achieving an accurate picture of this property and its variation across countries. It is shown that the application of some popular tools for measuring persistence, such as the sum of the AR coefficients, could lead to erroneous conclusions if fractional integration is present. Finally, we explore the existence of changes in inflation inertia using a novel approach. We conclude that the persistence of inflation is very high (although non-permanent) in most post-industrial countries and that it has remained basically unchanged over the last four decades.
Abstract:
We present a new unifying framework for investigating throughput-WIP (Work-in-Process) optimal control problems in queueing systems, based on reformulating them as linear programming (LP) problems with special structure: we show that if a throughput-WIP performance pair in a stochastic system satisfies the Threshold Property we introduce in this paper, then we can reformulate the problem of optimizing a linear objective of throughput-WIP performance as a (semi-infinite) LP problem over a polygon with special structure (a threshold polygon). The strong structural properties of such polygons explain the optimality of threshold policies for optimizing linear performance objectives: their vertices correspond to the performance pairs of threshold policies. We analyze in this framework the versatile input-output queueing intensity control model introduced by Chen and Yao (1990), obtaining a variety of new results, including (a) an exact reformulation of the control problem as an LP problem over a threshold polygon; (b) an analytical characterization of the Min WIP function (giving the minimum WIP level required to attain a target throughput level); (c) an LP Value Decomposition Theorem that relates the objective value under an arbitrary policy with that of a given threshold policy (thus revealing the LP interpretation of Chen and Yao's optimality conditions); (d) diminishing returns and invariance properties of throughput-WIP performance, which underlie threshold optimality; (e) a unified treatment of the time-discounted and time-average cases.
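A concrete, hypothetical instance of a threshold policy, much simpler than Chen and Yao's intensity-control model but enough to see the throughput-WIP trade-off: an M/M/1 queue that admits arrivals only while fewer than L jobs are present. The performance pair follows from the stationary distribution of the truncated birth-death chain:

```python
def threshold_performance(lam, mu, L):
    """Throughput and mean WIP of an M/M/1 queue under a threshold policy:
    arrivals (rate lam) are admitted only when fewer than L jobs are present;
    service rate is mu.  States 0..L form a truncated birth-death chain with
    stationary weights rho**n, where rho = lam/mu."""
    rho = lam / mu
    weights = [rho ** n for n in range(L + 1)]
    Z = sum(weights)
    p = [w / Z for w in weights]
    throughput = mu * (1.0 - p[0])          # server is busy whenever n >= 1
    wip = sum(n * pn for n, pn in enumerate(p))
    return throughput, wip

# Raising the threshold buys throughput at an increasing WIP cost:
# for lam = mu = 1 the pairs are (1/2, 1/2), (2/3, 1), (3/4, 3/2), ...
pairs = [threshold_performance(1.0, 1.0, L) for L in (1, 2, 3)]
```

The diminishing marginal throughput per unit of extra WIP along this family is the elementary analogue of the diminishing-returns property (d) in the abstract.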
Abstract:
We present an exact test for whether two random variables that have known bounds on their support are negatively correlated. The alternative hypothesis is that they are not negatively correlated. No assumptions are made on the underlying distributions. We show by example that the Spearman rank correlation test, the competing exact test of correlation in nonparametric settings, rests on an additional assumption on the data generating process without which it is not valid as a test for correlation. We then show how to test for the significance of the slope in a linear regression analysis that involves a single independent variable and where outcomes of the dependent variable belong to a known bounded set.
Abstract:
This paper presents a test of the predictive validity of various classes of QALY models (i.e., linear, power and exponential models). We first estimated TTO utilities for 43 EQ-5D chronic health states and next these states were embedded in health profiles. The chronic TTO utilities were then used to predict the responses to TTO questions with health profiles. We find that the power QALY model clearly outperforms the linear and exponential QALY models. The optimal power coefficient is 0.65. Our results suggest that TTO-based QALY calculations may be biased. This bias can be avoided by using a power QALY model.
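To make the difference between the model classes concrete, here is a small hypothetical sketch (not from the paper). It assumes the power model values a constant-quality episode of length t as u * t**r, with the 0.65 coefficient taken from the abstract, and compares it with the linear model on a three-year profile:

```python
def qaly_linear(utilities):
    """Linear QALY model: each year lived at quality u contributes u."""
    return sum(utilities)

def qaly_power(utilities, r=0.65):
    """Power QALY model: living t years at constant quality u is worth
    u * t**r, so year k of a profile contributes u_k * (k**r - (k-1)**r)."""
    return sum(u * (k ** r - (k - 1) ** r)
               for k, u in enumerate(utilities, start=1))

profile = [0.8, 0.8, 0.8]       # three years in a chronic state valued 0.8
linear = qaly_linear(profile)   # 0.8 * 3 = 2.4
power = qaly_power(profile)     # 0.8 * 3**0.65, about 1.63
```

Because t**0.65 grows more slowly than t, the power model weights later years less, which is the mechanism behind the bias in linear TTO-based QALY calculations that the abstract points to.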
Abstract:
This paper introduces the approach of using Total Unduplicated Reach and Frequency (TURF) analysis to design a product line through a binary linear programming model. This improves the efficiency of the search for the solution to the problem compared to the algorithms that have been used to date. The results obtained through our exact algorithm are presented, and the method proves extremely efficient both in obtaining optimal solutions and in computing time for very large instances of the problem at hand. Furthermore, the proposed technique enables the model to be improved in order to overcome the main drawbacks presented by TURF analysis in practice.
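The TURF objective itself is easy to state in code. The sketch below is a hypothetical brute-force version for tiny instances only; the paper's contribution is precisely an exact binary-LP formulation that scales far beyond what enumeration can handle:

```python
from itertools import combinations

def turf(reach_matrix, k):
    """Exhaustive TURF: choose k products maximising unduplicated reach.
    reach_matrix[i][j] is truthy if respondent i would buy product j.
    Enumeration is only viable for small instances; a binary LP solves
    the same objective at scale."""
    n_products = len(reach_matrix[0])
    best_reach, best_combo = -1, None
    for combo in combinations(range(n_products), k):
        # A respondent is reached if at least one chosen product suits them.
        reach = sum(1 for row in reach_matrix if any(row[j] for j in combo))
        if reach > best_reach:
            best_reach, best_combo = reach, combo
    return best_reach, best_combo

# Four respondents, three products; product 0 alone reaches two respondents,
# so any optimal pair includes it and covers three of the four.
matrix = [[1, 0, 0],
          [1, 0, 0],
          [0, 1, 0],
          [0, 0, 1]]
reach, combo = turf(matrix, k=2)
```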
3D seismic facies characterization and geological patterns recognition (Australian North West Shelf)
Abstract:
EXECUTIVE SUMMARY
This PhD research, funded by the Swiss Sciences Foundation, is principally devoted to enhancing the recognition, visualisation and characterization of geobodies through innovative 3D seismic approaches. A series of case studies from the Australian North West Shelf ensures the development of reproducible integrated 3D workflows and gives new insight into local and regional stratigraphic as well as structural issues. This project was initiated in the year 2000 at the Geology and Palaeontology Institute of the University of Lausanne (Switzerland). Several collaborations ensured the improvement of technical approaches as well as the assessment of geological models.
- Investigations into the Timor Sea structural style were carried out at the Tectonics Special Research Centre of the University of Western Australia and in collaboration with Woodside Energy in Perth.
- The seismic analysis and attribute classification approach was initiated with Schlumberger Oilfield Australia in Perth; assessments and enhancements of the integrated seismic approaches benefited from collaborations with scientists at Schlumberger Stavanger Research (Norway).
Adapting and refining "linear" exploration techniques, a conceptual "helical" 3D seismic approach has been developed. In order to investigate specific geological issues, this approach, integrating seismic attributes and visualisation tools, has been refined and adjusted, leading to the development of two specific workflows:
- A stratigraphic workflow focused on the recognition of geobodies and the characterization of depositional systems. Additionally, it can support the modelling of subsidence and, incidentally, help constrain the hydrocarbon maturity of a given area.
- A structural workflow used to quickly and accurately define major and secondary fault systems.
The integration of the 3D structural interpretation results enables analysis of the fault-network kinematics, which can affect hydrocarbon trapping mechanisms. The application of these integrated workflows brings new insight into two complex settings on the Australian North West Shelf and yields striking stratigraphic and structural outcomes. The stratigraphic workflow enables the 3D characterization of the Late Palaeozoic glacial depositional system on the Mermaid Nose (Dampier Subbasin, Northern Carnarvon Basin), which presents similarities with the glacial facies along the Neotethys margin up to Oman (chapter 3.1). A subsidence model reveals the Phanerozoic geodynamic evolution of this area (chapter 3.2) and emphasizes two distinct modes of regional extension for the Palaeozoic (Neotethys opening) and the Mesozoic (abyssal plains opening). The structural workflow is used to define the structural evolution of the Laminaria High area (Bonaparte Basin). Following a regional structural characterization of the Timor Sea (chapter 4.1), a thorough analysis of the Mesozoic fault architecture reveals a local rotation of the stress field and the development of reverse structures (flower structures) in an extensional setting, which form potential hydrocarbon traps (chapter 4.2). The definition of the complex Neogene structural architecture, together with the fault kinematic analysis and a plate-flexure model (chapter 4.3), suggests that the Miocene to Pleistocene reactivation phases recorded at the Laminaria High most probably result from the oblique normal reactivation of the underlying Mesozoic fault planes. This episode is associated with the deformation of the subducting Australian plate. Based on these results, three papers were published in international journals and two additional publications will be submitted. Additionally, this research led to several communications at international conferences.
Although the different workflows presented in this research have been primarily developed and used for the analysis of specific stratigraphic and structural geobodies on the Australian North West Shelf, similar integrated 3D seismic approaches will have applications in hydrocarbon exploration and production, for instance improving the recognition of potential source rocks, secondary migration pathways, additional traps or reservoir breaching mechanisms. The new elements brought by this research further highlight that 3D seismic data contain a tremendous amount of hidden geological information waiting to be revealed, which will undoubtedly bring new insight into the depositional systems, structural evolution and geohistory both of areas reputed to be well explored and constrained and of others yet to be constrained. The further development of 3D texture attributes highlighting specific features of the seismic signal, the integration of quantitative analysis of stratigraphic and structural processes, the automation of the interpretation workflow, and the formal definition of "seismo-morphologic" characteristics of a wide range of geobodies from various environments would represent challenging continuations of the present research. The 21st century will most probably represent a transition period between fossil and other alternative energies. The next generation of seismic interpreters prospecting for hydrocarbons will undoubtedly face new challenges, mostly due to the shortage of obvious and easy targets. They will probably have to keep integrating techniques and geological processes in order to further capitalise on the seismic data and define new potential. Imagination and creativity will most certainly be among the most important qualities required of such geoscientists.
Abstract:
We develop a mathematical programming approach for the classical PSPACE-hard restless bandit problem in stochastic optimization. We introduce a hierarchy of n (where n is the number of bandits) increasingly stronger linear programming relaxations, the last of which is exact and corresponds to the (exponential size) formulation of the problem as a Markov decision chain, while the other relaxations provide bounds and are efficiently computed. We also propose a priority-index heuristic scheduling policy from the solution to the first-order relaxation, where the indices are defined in terms of optimal dual variables. In this way we propose a policy and a suboptimality guarantee. We report results of computational experiments that suggest that the proposed heuristic policy is nearly optimal. Moreover, the second-order relaxation is found to provide strong bounds on the optimal value.
Abstract:
Research on judgment and decision making presents a confusing picture of human abilities. For example, much research has emphasized the dysfunctional aspects of judgmental heuristics, and yet other findings suggest that these can be highly effective. A further line of research has modeled judgment as resulting from "as if" linear models. This paper illuminates the distinctions between these approaches by providing a common analytical framework based on the central theoretical premise that understanding human performance requires specifying how characteristics of the decision rules people use interact with the demands of the tasks they face. Our work synthesizes the analytical tools of lens model research with novel methodology developed to specify the effectiveness of heuristics in different environments, and allows direct comparisons between the different approaches. We illustrate with both theoretical analyses and simulations. We further link our results to the empirical literature by a meta-analysis of lens model studies and estimate both human and heuristic performance in the same tasks. Our results highlight the trade-off between linear models and heuristics. Whereas the former are cognitively demanding, the latter are simple to use. However, they require knowledge, and thus maps, of when and which heuristic to employ.