932 results for C68 - Computable General Equilibrium Models


Relevance:

30.00%

Publisher:

Abstract:

PLANNER is a formalism for proving theorems and manipulating models in a robot. The formalism is built out of a number of problem-solving primitives together with a hierarchical multiprocess backtrack control structure. Statements can be asserted and perhaps later withdrawn as the state of the world changes. Under the backtrack control structure, the hierarchy of activations of previously executed functions is maintained, so that it is possible to revert to any previous state. Thus programs can easily manipulate elaborate, tentative hypothetical states. In addition, PLANNER uses multiprocessing so that there can be multiple loci of change in state. Goals can be established and dismissed when they are satisfied. The deductive system of PLANNER is subordinate to the hierarchical control structure in order to maintain the desired degree of control. The use of a general-purpose matching language as the basis of the deductive system increases the flexibility of the system. Instead of explicitly naming procedures in calls, procedures can be invoked implicitly by patterns of what the procedure is supposed to accomplish. The language is being applied to solve problems faced by a robot, to write special-purpose routines from goal-oriented language, to express and prove properties of procedures, to abstract procedures from protocols of their actions, and as a semantic base for English.
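PLANNER itself was a Lisp-era formalism, and none of the code below is Hewitt's; purely as a hedged sketch of the two mechanisms this abstract emphasizes (pattern-directed invocation and chronological backtracking), a miniature goal-driven interpreter might look like the following, with all facts, rules and helper names invented for illustration:

```python
# Miniature goal-driven interpreter: procedures are indexed by the goal
# pattern they achieve, and the generator protocol supplies backtracking.
import itertools

FACTS = [("on", "a", "b"), ("on", "b", "table")]

# Each rule: (goal pattern it achieves, list of subgoals).
# Tokens starting with "?" are variables.
RULES = [
    (("above", "?x", "?y"), [("on", "?x", "?y")]),
    (("above", "?x", "?y"), [("on", "?x", "?z"), ("above", "?z", "?y")]),
]

_fresh = itertools.count()

def rename(term, suffix):
    """Standardize a rule's variables apart from the caller's."""
    return tuple(t + suffix if t.startswith("?") else t for t in term)

def walk(t, env):
    while t.startswith("?") and t in env:
        t = env[t]
    return t

def unify(a, b, env):
    env = dict(env)
    for x, y in zip(a, b):
        x, y = walk(x, env), walk(y, env)
        if x == y:
            continue
        if x.startswith("?"):
            env[x] = y
        elif y.startswith("?"):
            env[y] = x
        else:
            return None            # clash: triggers backtracking
    return env

def achieve(goals, env):
    """Yield every binding that satisfies all goals; procedures are invoked
    implicitly, by the pattern of what they are supposed to accomplish."""
    if not goals:
        yield env
        return
    first, rest = goals[0], list(goals[1:])
    for fact in FACTS:             # try stored assertions first
        if len(fact) == len(first) and (e := unify(first, fact, env)) is not None:
            yield from achieve(rest, e)
    for head, body in RULES:       # then pattern-invoked procedures
        suffix = "_" + str(next(_fresh))
        head = rename(head, suffix)
        if len(head) == len(first) and (e := unify(first, head, env)) is not None:
            yield from achieve([rename(g, suffix) for g in body] + rest, e)

# Is block "a" above the table? (True via the transitive rule.)
print(any(True for _ in achieve([("above", "a", "table")], {})))
```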

Relevance:

30.00%

Publisher:

Abstract:

An investigation in innovation management and entrepreneurial management is conducted in this thesis. The aim of the research is to explore changes in innovation styles during the transformation from a start-up company to a more mature phase of business, and, in a second step, to predict future sustainability and the probability of success. As businesses grow in revenue, corporate size and functional complexity, various triggers, supporters and drivers affect innovation and a company's success. In a comprehensive study, more than 200 innovative and technology-driven companies have been examined and compared to identify patterns at different performance levels. All of them were founded under the same formal requirements of the Munich Business Plan Competition - a research approach which allowed a unique snapshot that otherwise only long-term studies would be able to provide. The general objective was to identify the correlation between different factors, as well as different dimensions, and the incremental and radical innovations realised. The 12 hypotheses to be tested were derived from a comprehensive literature review. The relevant academic and practitioner literature on entrepreneurial, innovation and knowledge management, as well as social network theory, revealed that the concept of innovation has evolved significantly over the last decade. A review of over 15 innovation models/frameworks contributed to understanding what innovation in context means and what its dimensions are. It appears that the complex theories of innovation can be described by the increasing extent of social ingredients in the explanation of innovativeness. Originally based on tangible forms of capital, and on the necessity of market pull and technology push, innovation management is today integrated in a larger system. Therefore, two research instruments were developed to explore the changes in innovation styles. The Innovation Management Audits (IMA Start-up and IMA Mature) provided statements related to product/service development, innovativeness in various typologies, resources for innovations, innovation capabilities in conjunction with knowledge and management, and social networks, as well as the measurement of outcomes, to generate high-quality data for further exploration. In obtaining results, the mature companies were clustered into the performance levels low, average and high, while the start-up companies were kept as one cluster. Firstly, the analysis exposed that knowledge, the process of acquiring knowledge, interorganisational networks and resources for innovations are the most important driving factors for innovation and success. Secondly, the actual change of innovation style provides new insights into the importance of focusing on sustaining success and innovation in 16 key areas. Thirdly, a detailed overview of triggers, supporters and drivers for innovation and success in each dimension supports decision makers in steering their company in the right direction. Fourthly, a critical review of contemporary strategic management in conjunction with the findings provides recommendations on how to apply well-known management tools. Last but not least, the Munich cluster is analysed, providing an estimation of the success probability of the different performance clusters and start-up companies. For this analysis of the probability of success, the newly developed, statistically and qualitatively validated ICP Model (Innovativeness, Capabilities & Potential) has been applied.
While the model was primarily developed to evaluate the probability of success of companies, it applies equally to measuring innovativeness in order to identify the impact of various strategic initiatives within small or large enterprises. The main findings of the model are that competitor and customer orientation, together with acquiring knowledge, are important for incremental and radical innovation. Formal and interorganisational networks are important to foster innovation, but informal networks appear to be detrimental to innovation. Testing the ICP model over the long term is recommended as one subject of further research. Another is to investigate some of the more intangible aspects of innovation management, such as the attitude and motivation of managers.

Relevance:

30.00%

Publisher:

Abstract:

The extremes of exercise capacity and health are considered a complex interplay between genes and the environment. In general, the study of animal models has proven critical for deep mechanistic exploration that provides guidance for focused, hypothesis-driven discovery in humans. Hypotheses underlying molecular mechanisms of disease and gene/tissue function can be tested in rodents in order to generate sufficient evidence to resolve and progress our understanding of human biology. Here we provide examples of three alternative uses of rodent models that have been applied successfully to advance knowledge that bridges our understanding of the connection between exercise capacity and health status. Firstly, we review the strong association between exercise capacity and all-cause morbidity and mortality in humans through artificial selection on low and high exercise performance in the rat and the consequent generation of the "energy transfer hypothesis". Secondly, we review specific transgenic and knock-out mouse models that replicate the human disease condition and performance. These include human glycogen storage diseases (McArdle and Pompe) and α-actinin-3 deficiency. Together these rodent models provide an overview of the advancement of molecular knowledge required for clinical translation. Continued study of these models, in conjunction with human association studies, will be critical to resolving the complex gene-environment interplay linking exercise capacity, health, and disease.

Relevance:

30.00%

Publisher:

Abstract:

M. H. Lee, "On Models, Modelling and the Distinctive Nature of Model-Based Reasoning," AI Communications, 12(3), pp. 127-137, 1999.

Relevance:

30.00%

Publisher:

Abstract:

Many deterministic models with hysteresis have been developed in the areas of economics, finance, terrestrial hydrology and biology. These models lack any stochastic element, which can often have a strong effect in these areas. In this work, stochastically driven closed-loop systems with hysteresis-type memory are studied. This type of system is presented as a possible stochastic counterpart to deterministic models in these areas. Some price dynamics models are presented as motivation for the development of this type of model. Numerical schemes for solving this class of stochastic differential equations are developed in order to examine the prototype models presented. As a means of further testing the developed numerical schemes, the behaviour near equilibrium of coupled ordinary differential equations, where the time derivative of the Preisach operator is included in one of the equations, is examined numerically. A model of two-phenotype bacteria is also presented. This model is examined to explore memory effects and related hysteresis effects in the area of biology. The memory effects found in this model are similar to those found in the non-ideal relay. This non-ideal-relay-type behaviour is used to model a colony of bacteria with multiple switching thresholds. This model contains a Preisach-type memory with a variable Preisach weight function. For this multi-threshold model, pattern formation in the distribution of the phenotypes among the available thresholds is shown numerically.
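As a hedged aside (not code from the thesis), the non-ideal relay mentioned above can be simulated directly; the sketch below drives it with an Ornstein-Uhlenbeck input via the Euler-Maruyama scheme, with all parameter values invented for illustration:

```python
# Non-ideal relay with switching thresholds alpha < beta, driven by a
# stochastic input simulated with the Euler-Maruyama scheme.
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = -0.5, 0.5        # lower/upper switching thresholds
theta, sigma = 1.0, 0.8        # OU mean reversion and noise intensity
dt, n = 1e-3, 200_000

x = 0.0                        # stochastic input
r = -1.0                       # relay output, starts in the "down" state
xs, rs = np.empty(n), np.empty(n)
for i in range(n):
    # Euler-Maruyama step for dX = -theta*X dt + sigma dW
    x += -theta * x * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    # Non-ideal relay: switch up above beta, down below alpha, and
    # remember the previous state in between (the memory effect).
    if x >= beta:
        r = 1.0
    elif x <= alpha:
        r = -1.0
    xs[i], rs[i] = x, r

# Fraction of time spent in the "up" state reflects the hysteresis memory.
print("fraction up:", (rs > 0).mean())
```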

Relevance:

30.00%

Publisher:

Abstract:

Understanding how dynamic ecological communities respond to anthropogenic drivers of change, such as habitat loss and fragmentation, climate change and the introduction of alien species, requires a theoretical framework able to predict community dynamics. At present there is a lack of empirical data that can be used to inform and test predictive models, which means that much of our knowledge regarding the response of ecological communities to perturbations is obtained from theoretical analyses and simulations. This thesis is composed of two strands of research: an empirical experiment conducted to inform the scaling of intraspecific and interspecific interaction strengths in a three-species food chain, and a series of theoretical analyses of the changes to equilibrium biomass abundances following press perturbations. The empirical experiment is a consequence of the difficulties faced when parameterising the intraspecific interaction strengths in a Lotka-Volterra model. A modification of the dynamic index is used alongside the original dynamic index to estimate intraspecific and interspecific interaction strengths in a three-species food chain. The theoretical analyses focused on the effect of press perturbations of focal species on the equilibrium biomass densities of all species in the community; these perturbations allow for the quantification of a species' total net effect. A strong and consistent positive relationship was found between a species' body size and its total net effect, for a set of 97 synthetic food webs and also for the Ythan Estuary and Tuesday Lake food webs (empirically described food webs). It is shown that ecological constraints (due to allometric scaling) on the magnitude of entries in the community matrix cause the patterns observed in the inverse community matrix, and thus explain the relationship between a species' body mass and its total net effect in a community.
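A hedged sketch of the press-perturbation calculation referred to above: at a stable Lotka-Volterra equilibrium with community (Jacobian) matrix A, the long-term response of equilibrium densities to a sustained unit input to species j is the j-th column of -A^{-1}. The three-species chain and its coefficients below are invented for illustration:

```python
import numpy as np

# Community matrix for a resource -> consumer -> predator chain (assumed
# stable): diagonal entries are intraspecific (self-limitation) terms,
# off-diagonal entries are interspecific interaction strengths.
A = np.array([
    [-1.0, -0.5,  0.0],   # resource: self-limited, eaten by consumer
    [ 0.4, -0.3, -0.6],   # consumer: eats resource, eaten by predator
    [ 0.0,  0.3, -0.2],   # predator: eats consumer, self-limited
])

net_effects = -np.linalg.inv(A)   # entry (i, j): effect on i of pressing j

# Total net effect of species j: summed absolute effect on the community.
total = np.abs(net_effects).sum(axis=0)
for name, t in zip(["resource", "consumer", "predator"], total):
    print(f"{name:9s} total net effect: {t:.3f}")
```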

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we examine exchange rates in Vietnam's transitional economy. Evidence of long-run equilibrium is established in most cases through a single cointegrating vector among the endogenous variables that determine the real exchange rates. This supports relative PPP, in which the error-correction terms of the system can be combined linearly into a stationary process, reducing deviation from PPP in the long run. Restricted coefficient vectors β' = (1, 1, -1) for the real exchange rates of the currencies in question are not rejected. This evidence for relative PPP adds to the findings of many researchers, including Flôres et al. (1999), Lee (1999), Johnson (1990), Culver and Papell (1999), and Cuddington and Liang (2001). Instead of testing different time series against a common base currency, we use different base currencies (USD, GBP, JPY and EUR). By doing so we ask whether the theory posits significant differences against any one base currency. We have found consensus, given inevitable technical differences, even with the smaller data sample for EUR. Speeds of convergence to PPP and adjustment are faster compared with results from other studies for developed economies, using both observed and bootstrapped half-life (HL) measures. Perhaps a better explanation is the adjustment from the hyperinflation period, after which the theory indicates that the adjusting process actually accelerates. We observe that deviation appears to have been large in the early stages of the reform, mostly overvaluation. Over time, its correction took place, leading significant deviations to disappear gradually.
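As a hedged aside on the half-life (HL) convergence measure mentioned above: if the deviation from PPP behaves like an AR(1) process q_t = rho * q_{t-1} + e_t with 0 < rho < 1, the half-life of a shock is ln(0.5) / ln(rho). The series below is simulated purely to illustrate the computation:

```python
import numpy as np

rng = np.random.default_rng(1)
rho_true = 0.9
q = np.zeros(500)                 # simulated deviation from PPP
for t in range(1, 500):
    q[t] = rho_true * q[t - 1] + rng.standard_normal()

# OLS estimate of rho from the simulated deviation series.
rho_hat = (q[:-1] @ q[1:]) / (q[:-1] @ q[:-1])
half_life = np.log(0.5) / np.log(rho_hat)
print(f"rho_hat = {rho_hat:.3f}, half-life = {half_life:.1f} periods")
```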

Relevance:

30.00%

Publisher:

Abstract:

Empirical modeling of high-frequency currency market data reveals substantial evidence for nonnormality, stochastic volatility, and other nonlinearities. This paper investigates whether an equilibrium monetary model can account for nonlinearities in weekly data. The model incorporates time-nonseparable preferences and a transaction cost technology. Simulated sample paths are generated using Marcet's parameterized expectations procedure. The paper also develops a new method for estimation of structural economic models. The method forces the model to match (under a GMM criterion) the score function of a nonparametric estimate of the conditional density of observed data. The estimation uses weekly U.S.-German currency market data, 1975-90.
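The score-matching estimator described here is involved; as a loose, hedged stand-in for the general simulation-based idea (choosing parameters so that simulated data match observed features under a GMM-style criterion), a toy simulated-method-of-moments fit of an AR(1) coefficient might look like the following. This is not the paper's estimator, and all names and values are invented:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)

def simulate_ar1(phi, n, rng):
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

def moments(x):
    # Features to match: variance and first-order autocovariance.
    return np.array([x.var(), np.cov(x[:-1], x[1:])[0, 1]])

observed = simulate_ar1(0.7, 2000, rng)       # stand-in for real data
m_obs = moments(observed)

def criterion(phi):
    # Fixed simulation seed (common random numbers) keeps the objective
    # smooth in phi.
    sim = simulate_ar1(phi, 20_000, np.random.default_rng(3))
    g = moments(sim) - m_obs                  # moment discrepancies
    return g @ g                              # identity weighting matrix

res = minimize_scalar(criterion, bounds=(0.0, 0.99), method="bounded")
print(f"estimated phi = {res.x:.3f}")
```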

Relevance:

30.00%

Publisher:

Abstract:

This paper considers forecasting the conditional mean and variance from a single-equation dynamic model with autocorrelated disturbances following an ARMA process, and innovations with time-dependent conditional heteroskedasticity as represented by a linear GARCH process. Expressions for the minimum MSE predictor and the conditional MSE are presented. We also derive the formula for all the theoretical moments of the prediction error distribution from a general dynamic model with GARCH(1, 1) innovations. These results are then used in the construction of ex ante prediction confidence intervals by means of the Cornish-Fisher asymptotic expansion. An empirical example relating to the uncertainty of the expected depreciation of foreign exchange rates illustrates the usefulness of the results.
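A hedged sketch of the two ingredients the abstract combines: the h-step-ahead conditional variance of a GARCH(1,1) process, and a Cornish-Fisher adjustment of the normal quantile using the skewness and excess kurtosis of the prediction-error distribution. Parameter values are illustrative, not estimates from the paper's data:

```python
import numpy as np
from scipy.stats import norm

omega, alpha, beta = 0.05, 0.10, 0.85     # GARCH(1,1) parameters
sigma2_next = 0.9                          # one-step-ahead variance
persistence = alpha + beta
sigma2_bar = omega / (1.0 - persistence)   # unconditional variance

def garch_var_forecast(h):
    """h-step-ahead conditional variance, h >= 1: the forecast decays
    geometrically from sigma2_next toward the unconditional variance."""
    return sigma2_bar + persistence ** (h - 1) * (sigma2_next - sigma2_bar)

def cornish_fisher_quantile(p, skew, exkurt):
    """Quantile of a standardized distribution via the Cornish-Fisher
    expansion (third- and fourth-order terms)."""
    z = norm.ppf(p)
    return (z
            + (z**2 - 1) * skew / 6
            + (z**3 - 3*z) * exkurt / 24
            - (2*z**3 - 5*z) * skew**2 / 36)

# 95% ex ante prediction interval for the h-step-ahead innovation, with
# illustrative skewness and excess kurtosis of the prediction error.
h, skew, exkurt = 5, 0.0, 1.2
sd = np.sqrt(garch_var_forecast(h))
q_lo = cornish_fisher_quantile(0.025, skew, exkurt)
q_hi = cornish_fisher_quantile(0.975, skew, exkurt)
print(f"h={h}: 95% interval [{sd * q_lo:.3f}, {sd * q_hi:.3f}]")
```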

Relevance:

30.00%

Publisher:

Abstract:

This paper provides a root-n consistent, asymptotically normal weighted least squares estimator of the coefficients in a truncated regression model. The distribution of the errors is unknown and permits general forms of unknown heteroskedasticity. Also provided is an instrumental variables based two-stage least squares estimator for this model, which can be used when some regressors are endogenous, mismeasured, or otherwise correlated with the errors. A simulation study indicates that the new estimators perform well in finite samples. Our limiting distribution theory includes a new asymptotic trimming result addressing the boundary bias in first-stage density estimation without knowledge of the support boundary.
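The paper's estimator is specialized; as a hedged motivating sketch, the simulation below shows why ordinary least squares fails under truncation: only observations with y above the truncation point are sampled, so the errors in the retained sample are correlated with the regressor. All values are invented:

```python
import numpy as np

rng = np.random.default_rng(4)
n, beta0, beta1 = 100_000, -1.0, 2.0

x = rng.uniform(0, 2, n)
# Heteroskedastic errors of unknown form, as in the paper's setting.
e = rng.standard_normal(n) * (0.5 + 0.5 * x)
y = beta0 + beta1 * x + e

keep = y > 0                      # truncated sampling: y <= 0 never observed
X = np.column_stack([np.ones(keep.sum()), x[keep]])
ols = np.linalg.lstsq(X, y[keep], rcond=None)[0]
print(f"true slope {beta1}, OLS slope on truncated sample {ols[1]:.3f}")
```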

Relevance:

30.00%

Publisher:

Abstract:

We discuss a general approach to dynamic sparsity modeling in multivariate time series analysis. Time-varying parameters are linked to latent processes that are thresholded to induce zero values adaptively, providing natural mechanisms for dynamic variable inclusion/selection. We discuss Bayesian model specification, analysis and prediction in dynamic regressions, time-varying vector autoregressions, and multivariate volatility models using latent thresholding. Application to a topical macroeconomic time series problem illustrates some of the benefits of the approach in terms of statistical and economic interpretations as well as improved predictions.
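A hedged sketch of the latent thresholding mechanism described above (not the authors' code): a time-varying coefficient follows a latent AR(1) process, and the effective coefficient is set to zero whenever its magnitude falls below a threshold d, which is what produces dynamic variable inclusion/selection. Values are invented:

```python
import numpy as np

rng = np.random.default_rng(5)
T, phi, tau, d = 300, 0.98, 0.05, 0.3

beta = np.zeros(T)                 # latent AR(1) coefficient process
for t in range(1, T):
    beta[t] = phi * beta[t - 1] + tau * rng.standard_normal()

# Latent thresholding: the coefficient is active only when |beta_t| >= d.
b_eff = np.where(np.abs(beta) >= d, beta, 0.0)

included = (b_eff != 0).mean()
print(f"fraction of time the predictor is included: {included:.2f}")
```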

Relevance:

30.00%

Publisher:

Abstract:

This paper studies two models of two-stage processing with no-wait in process. The first model is the two-machine flow shop, and the other is the assembly model. For both models we consider the problem of minimizing the makespan, provided that the setup and removal times are separated from the processing times. Each of these scheduling problems is reduced to the Traveling Salesman Problem (TSP). We show that, in general, the assembly problem is NP-hard in the strong sense. On the other hand, the two-machine flow shop problem reduces to the Gilmore-Gomory TSP, and is solvable in polynomial time. The same holds for the assembly problem under some reasonable assumptions. Using these and existing results, we provide a complete complexity classification of the relevant two-stage no-wait scheduling models.
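A hedged sketch of the reduction named above, with setup and removal times omitted for brevity: in a no-wait two-machine flow shop, the minimum gap between starting consecutive jobs i then j on machine 1 is d(i, j) = max(a_i, a_i + b_i - a_j), so minimizing the makespan over job sequences is a TSP over these distances. The instance below is invented and solved by brute force:

```python
from itertools import permutations

# (a_j, b_j): processing times on machines 1 and 2; values are illustrative.
jobs = [(3, 2), (1, 4), (2, 2), (4, 1)]

def delay(i, j):
    """Minimum start-time gap on machine 1 between consecutive jobs i, j:
    machine 1 must be free (a_i), and job j's machine-2 operation must not
    start before job i's finishes (a_i + b_i - a_j)."""
    ai, bi = jobs[i]
    aj, _ = jobs[j]
    return max(ai, ai + bi - aj)

def makespan(seq):
    total = sum(delay(i, j) for i, j in zip(seq, seq[1:]))
    a_last, b_last = jobs[seq[-1]]
    return total + a_last + b_last

best = min(permutations(range(len(jobs))), key=makespan)
print("best sequence:", best, "makespan:", makespan(best))
```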

Relevance:

30.00%

Publisher:

Abstract:

Mathematical models of straight-grate pellet induration processes have been developed and carefully validated by a number of workers over the past two decades. However, the subsequent exploitation of these models in process optimization is less clear, and obviously requires a sound understanding of how the key factors control the operation. In this article, we show how a thermokinetic model of pellet induration, validated against operating data from one of the Iron Ore Company of Canada (IOCC) lines in Canada, can be exploited in process optimization from the perspective of fuel efficiency, production rate, and product quality. Most existing processes are restricted in the options available for process optimization. Here, we review the role of each of the drying (D), preheating (PH), firing (F), after-firing (AF), and cooling (C) phases of the induration process. We then use the induration process model to evaluate whether the first drying zone is best run on the up-draft or down-draft gas-flow stream, and we optimize the on-gas temperature profile in the hood of the PH, F, and AF zones to reduce the burner fuel by at least 10 pct over the long term. Finally, we consider how efficient and flexible the process could be if some of the structural constraints were removed (i.e., addressed at the design stage). The analysis suggests it should be possible to reduce the burner fuel load by 35 pct, easily increase production by 5+ pct, and improve pellet quality.

Relevance:

30.00%

Publisher:

Abstract:

Belief revision is a well-researched topic within AI. We argue that the new model of distributed belief revision discussed here is suitable for general modelling of judicial decision making, alongside the extant approach known from jury research. The new approach to belief revision is of general interest whenever attitudes to information are to be simulated within a multi-agent environment, with agents holding local beliefs yet interacting with, and influencing, other agents who are deliberating collectively. In the approach proposed, it is the entire group of agents, not an external supervisor, who integrate the different opinions. This is achieved through an election mechanism. The principle of "priority to the incoming information", as known from AI models of belief revision, is problematic when applied to factfinding by a jury. The present approach incorporates a computable model for local belief revision such that a principle of recoverability is adopted. By this principle, any previously held belief must belong to the current cognitive state if it is consistent with it. For the purposes of jury simulation such a model calls for refinement, yet we claim it constitutes a valid basis for an open system where other AI functionalities (or outer stimuli) could attempt to handle other aspects of the deliberation that are more specific to legal narrative, to argumentation in court, and then to the debate among the jurors.
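A hedged sketch of the recoverability principle described above (the representation and all names are invented for illustration): beliefs are propositional literals, incoming information gets priority, and any previously held belief consistent with the resulting state is restored:

```python
from typing import Set

def consistent(beliefs: Set[str], belief: str) -> bool:
    """A literal conflicts only with its negation ("p" vs "-p")."""
    negation = belief[1:] if belief.startswith("-") else "-" + belief
    return negation not in beliefs

def revise(current: Set[str], history: Set[str], incoming: str) -> Set[str]:
    # Priority to the incoming information: drop whatever contradicts it.
    state = {b for b in current if consistent({incoming}, b)} | {incoming}
    # Recoverability: restore every past belief consistent with the new state
    # (order-dependence among mutually inconsistent past beliefs is ignored
    # in this sketch).
    for past in history:
        if consistent(state, past):
            state.add(past)
    return state

history = {"guilty", "had_motive"}     # beliefs held at some earlier point
current = {"-guilty"}                  # "guilty" was since withdrawn
print(revise(current, history, "witness_reliable"))
# Output contains "had_motive" (recovered) but not "guilty", which
# contradicts the currently held "-guilty".
```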