845 results for Theory of economic-mathematical models
Abstract:
In this thesis we develop a new generative model of social networks belonging to the family of time-varying networks. Correctly modelling the mechanisms that shape the growth of a network and the dynamics of edge activation and inactivation is of central importance in network science. Indeed, generative models that mimic the real-world dynamics of contacts in social networks make it possible to forecast the outcome of an epidemic process, optimize an immunization campaign, or spread information optimally among individuals. This task can now be tackled by taking advantage of the recent availability of large-scale, high-quality, time-resolved datasets. This wealth of digital data has deepened our understanding of the structure and properties of many real-world networks, and the empirical evidence of a temporal dimension in networks has prompted a paradigm shift from a static representation of graphs to a time-varying one. In this work we exploit the Activity-Driven paradigm (a modelling tool belonging to the family of time-varying networks) to develop a general dynamical model that encodes two fundamental mechanisms shaping the topology and temporal structure of social networks: social capital allocation and burstiness. The former accounts for the fact that individuals do not invest their time and social interactions at random but rather allocate them toward already known nodes of the network; the latter accounts for the heavy-tailed distribution of inter-event times in social networks. We empirically measure the properties of these two mechanisms in seven real-world datasets and develop a data-driven model, which we solve analytically. We then check the results against numerical simulations and test our predictions on real-world datasets, finding good agreement between the two. Moreover, we find and characterize a non-trivial interplay between burstiness and social capital allocation in the parameter phase space. Finally, we present a novel approach to the development of a complete generative model of time-varying networks. This model is inspired by Kauffman's adjacent possible theory and is based on a generalized version of the Pólya urn. Remarkably, most of the complex and heterogeneous features of real-world social networks are naturally reproduced by this dynamical model, together with many higher-order topological properties (clustering coefficient, community structure, etc.).
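To make the two mechanisms concrete, the following Python sketch simulates a toy activity-driven network with a memory (reinforcement) rule: an active node re-contacts an already-known neighbour with a probability that grows with the size of its ego network, and otherwise explores a node it has never met. The activity distribution, parameter names and the specific reinforcement rule are illustrative assumptions, not the exact specification used in the thesis.

```python
import random

import numpy as np


def simulate_adn_with_memory(N=1000, T=500, m=2, gamma=2.1, eps=1e-3, c=1.0, seed=0):
    """Toy activity-driven network with a memory (reinforcement) mechanism."""
    rng = np.random.default_rng(seed)
    # Power-law activities on [eps, 1]: p(a) ~ a^(-gamma), sampled by inverse CDF.
    u = rng.random(N)
    a = (eps ** (1 - gamma) + u * (1 - eps ** (1 - gamma))) ** (1 / (1 - gamma))

    contacts = [set() for _ in range(N)]   # ego network accumulated by each node
    events = []                            # (t, i, j) contact events

    for t in range(T):
        active = np.nonzero(rng.random(N) < a)[0]
        for i in active:
            for _ in range(m):
                k = len(contacts[i])
                if k > 0 and rng.random() < k / (k + c):
                    # Social-capital allocation: reinforce an existing tie.
                    j = random.choice(tuple(contacts[i]))
                else:
                    # Exploration: contact a node never met before.
                    j = int(rng.integers(N))
                    while j == i or j in contacts[i]:
                        j = int(rng.integers(N))
                contacts[i].add(j)
                contacts[j].add(int(i))
                events.append((t, int(i), j))
    return contacts, events


contacts, events = simulate_adn_with_memory()
degrees = [len(c) for c in contacts]
print(f"{len(events)} contact events, mean integrated degree {np.mean(degrees):.2f}")
```

Adding a bursty (heavy-tailed) inter-activation time on top of this reinforcement rule would give the second ingredient discussed above; the sketch keeps only the memory mechanism for brevity.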
Abstract:
This paper examines the strategic implications of resource allocation models (RAMs). Four interrelated aspects of resource allocation are discussed: degree of centralisation, locus of strategic direction, cross-subsidy, and locus of control. The paper begins with a theoretical overview of these concepts, locating the study in the contexts of both the strategic management literature and the university. The concepts are then examined empirically, drawing upon a longitudinal study of three UK universities: Warwick, the London School of Economics and Political Science (LSE), and Oxford Brookes. Findings suggest that RAMs are historically and culturally situated within the context of each university, and that this is associated with different patterns of strategic direction and forms of strategic control. As such, the RAM in use may be less a matter of best practice than one of internal fit. The paper concludes with some implications for theory and practice by discussing the potential trajectories of each type of RAM.
Abstract:
Previous developments in the opportunism-independent theory of the firm are either restricted to special cases or are derived from the capabilities or resource-based perspective. However, a more general opportunism-independent approach can be developed, based on the work of Demsetz and Coase, which is nevertheless contractual in nature. This depends on 'direction', that is, deriving economic value by permitting one set of actors to direct the activities of another, and of non-human factors of production. Direction helps to explain not only firm boundaries and organisation, but also the existence of firms, without appealing to opportunism or moral hazard. The paper also considers the extent to which it is meaningful to speak of 'contractual' theories in the absence of opportunism, and whether this analysis can be extended beyond the employment contract to encompass ownership of assets by the firm. © The Author 2005. Published by Oxford University Press on behalf of the Cambridge Political Economy Society. All rights reserved.
Abstract:
There has been a revival of interest in economic techniques for measuring the value of a firm, notably the use of economic value added (EVA) as a measure of the value delivered to shareholders. This technique, based upon the concept of economic value equating to total value, is founded upon the assumptions of classical liberal economic theory. Such techniques have been criticised both for the level of adjustment to published accounts needed to make them work and for their validity in measuring value in a meaningful context. This paper critiques economic value added as a means of calculating changes in shareholder value, contrasting it with more traditional techniques of measuring value added. It uses the company Severn Trent plc as a worked example in order to evaluate and contrast the techniques in action. The paper demonstrates discrepancies between the results calculated using economic value added analysis and those reported using conventional accounting measures. It considers the merits of the respective techniques in explaining shareholder and managerial behaviour, and the problems of using such techniques when considering the wider stakeholder concept of value. It concludes that the economic value added technique has merits when compared with traditional accounting measures of performance, but that it does not provide the universal panacea claimed by its proponents.
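For orientation, the textbook form of the metric under discussion is

\[
\mathrm{EVA} \;=\; \mathrm{NOPAT} \;-\; \mathrm{WACC}\times\text{invested capital},
\]

whereas a traditional value added statement works, roughly, from

\[
\text{value added} \;=\; \text{sales} \;-\; \text{bought-in goods and services}.
\]

These are the generic formulations only; the specific accounting adjustments applied to Severn Trent plc in the paper, which are part of its critique, are not reproduced here.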
Abstract:
We present a mean field theory of code-division multiple access (CDMA) systems with error-control coding. On the basis of the relation between the free energy and mutual information, we obtain an analytical expression of the maximum spectral efficiency of the coded CDMA system, from which a mean field description of the coded CDMA system is provided in terms of a bank of scalar Gaussian channels whose variances in general vary at different code symbol positions. Regular low-density parity-check (LDPC)-coded CDMA systems are also discussed as an example of the coded CDMA systems.
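Schematically, the mean-field (decoupling) picture described above replaces the multiuser channel by a bank of independent scalar Gaussian channels, one per code symbol position; the notation below is illustrative rather than the paper's exact formulation:

\[
y_t \;=\; x_t + n_t, \qquad n_t \sim \mathcal{N}\!\left(0,\sigma_t^2\right), \qquad t = 1,\dots,n,
\]

where the effective noise variances \(\sigma_t^2\) may differ across symbol positions \(t\), as stated in the abstract.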
Abstract:
The thesis began as a study of new firm formation. Preliminary research suggested that the infant death rate was a closely related problem, and the search was for a theory of new firm formation which would explain both. The thesis finds theories of exit and entry inadequate in this respect and focusses instead on theories of entrepreneurship, particularly those which concentrate on entrepreneurship as an agent of change. The role of information is found to be fundamental to economic change, and an understanding of information generation and dissemination, and of the nature and direction of information flows, is postulated to lead coterminously to an understanding of entrepreneurship and economic change. The economics of information is applied to theories of entrepreneurship and some testable hypotheses are derived. The testing relies on establishing and measuring the information bases of the founders of new firms and then testing for certain hypothesised differences between the information bases of survivors and non-survivors. No theory of entrepreneurship is likely to be straightforwardly testable, and many postulates have to be established to bring the theory to a testable stage. A questionnaire is used to gather information from a sample of firms taken from a new micro-dataset established as part of the work of the thesis. Discriminant analysis establishes the variables which best distinguish between survivors and non-survivors. The variables which emerge as important discriminators are consistent with the theory which the analysis is testing. While there are alternative interpretations of the important variables, collective consistency with the theory under test is established. The thesis concludes with an examination of the implications of the theory for policy towards stimulating new firm formation.
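As a purely illustrative sketch of the kind of discriminant analysis described, run on synthetic placeholder data rather than the thesis's micro-dataset or questionnaire variables, one might write:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic placeholder data: three hypothetical "information base" variables
# (e.g. breadth of information sources, industry experience, prior contacts).
rng = np.random.default_rng(1)
n_per_group = 30
survivors = rng.normal(loc=[1.0, 0.5, 0.0], scale=1.0, size=(n_per_group, 3))
failures = rng.normal(loc=[0.0, 0.0, 0.0], scale=1.0, size=(n_per_group, 3))
X = np.vstack([survivors, failures])
y = np.array([1] * n_per_group + [0] * n_per_group)   # 1 = survivor

lda = LinearDiscriminantAnalysis().fit(X, y)
# The discriminant coefficients indicate which variables contribute most to
# separating survivors from non-survivors in this toy data set.
print("discriminant coefficients:", lda.scalings_.ravel())
print("in-sample classification accuracy:", lda.score(X, y))
```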
Abstract:
Over recent years, evidence has been accumulating in favour of the importance of long-term information as a variable which can affect the success of short-term recall. Lexicality, word frequency, imagery and meaning have all been shown to augment short-term recall performance. Two competing theories as to the causes of this long-term memory influence are outlined and tested in this thesis. The first approach is the order-encoding account, which ascribes the effect to the usage of resources at encoding, hypothesising that word lists which require less effort to process will benefit from increased levels of order encoding, in turn enhancing recall success. The alternative view, trace redintegration theory, suggests that order is automatically encoded phonologically, and that long-term information can only influence the interpretation of the resultant memory trace. The free recall experiments reported here attempted to determine the importance of order encoding as a facilitatory framework and to determine the locus of the effects of long-term information in free recall. Experiments 1 and 2 examined the effects of word frequency and semantic categorisation over a filled delay, and Experiments 3 and 4 did the same for immediate recall. Free recall was improved by both long-term factors tested. Order information was not used over a short filled delay, but was evident in immediate recall. Furthermore, it was found that both long-term factors increased the amount of order information retained. Experiment 5 induced an order-encoding effect over a filled delay, leaving a picture of short-term processes which are closely associated with long-term processes, and which fit conceptions of short-term memory as part of language processing rather better than either the encoding-based or the retrieval-based models. Experiments 6 and 7 aimed to determine to what extent phonological processes were responsible for the pattern of results observed. Articulatory suppression affected the encoding of order information where speech rate had no direct influence, suggesting that it is ease of lexical access which is the most important factor in the influence of long-term memory on immediate recall tasks. The evidence presented in this thesis does not offer complete support for either the retrieval-based account or the order-encoding account of long-term influence. Instead, the evidence sits best with models that are based upon language processing. The path urged for future research is to find ways in which this diffuse model can be better specified, and which can take account of the versatility of the human brain.
Abstract:
Oxygen is a crucial molecule for cellular function. When oxygen demand exceeds supply, the oxygen-sensing pathway centred on the hypoxia-inducible factor (HIF) is switched on and promotes adaptation to hypoxia by up-regulating genes involved in angiogenesis, erythropoiesis and glycolysis. The regulation of HIF is tightly modulated through intricate regulatory mechanisms. Notably, its protein stability is controlled by the oxygen-sensing prolyl hydroxylase domain (PHD) enzymes and its transcriptional activity is controlled by the asparaginyl hydroxylase FIH (factor inhibiting HIF-1). To probe the complexity of hypoxia-induced HIF signalling, efforts in mathematical modelling of the pathway have been under way for around a decade. In this paper, we review the existing mathematical models developed to describe and explain specific behaviours of the HIF pathway and how they have contributed new insights into our understanding of the network. Topics for modelling included the switch-like response to a decreasing oxygen gradient, the role of microenvironmental factors, the regulation by FIH, and the temporal dynamics of the HIF response. We will also discuss the technical aspects, extent and limitations of these models. Recently, the HIF pathway has been implicated in other disease contexts, such as hypoxic inflammation and cancer, through crosstalk with pathways like NF-κB and mTOR. We will examine how future mathematical modelling and simulation of interlinked networks can aid in understanding HIF behaviour in complex pathophysiological situations. Ultimately this would allow the identification of new pharmacological targets in different disease settings.
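As a purely illustrative example of the style of model being reviewed (not one of the published models), oxygen-dependent, PHD-mediated degradation of HIF-α is often captured by a saturating decay term of the form

\[
\frac{d[\text{HIF-}\alpha]}{dt} \;=\; k_{s} \;-\; k_{d}\,\frac{[\mathrm{O_2}]}{K_{m}+[\mathrm{O_2}]}\,[\text{HIF-}\alpha] \;-\; k_{b}\,[\text{HIF-}\alpha],
\]

with constitutive synthesis \(k_{s}\), oxygen-dependent (PHD-mediated) degradation \(k_{d}\) and residual oxygen-independent turnover \(k_{b}\), so that HIF-α accumulates as \([\mathrm{O_2}]\) falls. The switch-like and temporal behaviours discussed in the review require additional feedback and regulatory terms beyond this minimal sketch.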
Abstract:
This article explores the growing perception, prompted by the eurozone crisis, of Germany as a hegemonic power in the European Union. The article explores the realignments in the power balance within the European Union (EU) by making an original application of the insights from the literature on hegemony. It reviews the evidence for Germany playing a hegemonic role, but then emphasizes three sets of constraints. First, German pre-eminence is largely confined to the economic sphere. Even in this area Germany has not acted fully in line with the role ascribed by hegemonic stability theory. Second, its pre-eminence in the EU encounters problems of international legitimacy. Third, growing constraints arising from German domestic politics further hamper playing the role of hegemon. In consequence, Germany is intrinsically a reluctant hegemon: one whose economic leadership is recognized but politically contested. The conclusion considers the significance of these findings on the EU's most important member state. © 2013 Taylor & Francis.
Abstract:
The importance of solving the problem of spatial-temporal dynamics analysis within the economic security systems of different economic entities is substantiated. Various methods and approaches to carrying out such analysis in economic security systems are considered, and a basis for a generalized analysis of spatial-temporal dynamics in economic systems is proposed.
Abstract:
Here we study the triples of integers (d, g, r) such that, on a smooth projective curve of genus g, there exists a stable vector bundle of rank r and degree d which is spanned by its global sections.
Abstract:
The purpose of this paper is (1) to highlight some recent and heretofore unpublished results in the theory of multiplier sequences and (2) to survey some open problems in this area of research. For the sake of clarity of exposition, we have grouped the problems in three subsections, although several of the problems are interrelated. For the reader’s convenience, we have included the pertinent definitions, cited references and related results, and in several instances, elucidated the problems by examples.
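For readers outside the area, the classical notion in question (going back to Pólya and Schur) can be stated as follows; the survey's own definitions are not reproduced here:

\[
\{\gamma_k\}_{k\ge 0}\subset\mathbb{R}\ \text{is a multiplier sequence if, whenever}\quad
p(x)=\sum_{k=0}^{n} a_k x^{k}\ \text{has only real zeros, so does}\quad
\sum_{k=0}^{n} \gamma_k a_k x^{k}.
\]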
Abstract:
In the specific area of software engineering (SE) for self-adaptive systems (SASs) there is a growing research awareness of the synergy between SE and artificial intelligence (AI). However, only a few significant results have been published so far. In this paper, we propose a novel and formal Bayesian definition of surprise as the basis for quantitative analysis to measure degrees of uncertainty and deviations of self-adaptive systems from normal behavior. A surprise measures how observed data affects the models or assumptions of the world during runtime. The key idea is that a "surprising" event can be defined as one that causes a large divergence between the belief distributions prior to and posterior to the event occurring. In such a case the system may decide either to adapt accordingly or to flag that an abnormal situation is happening. In this paper, we discuss possible applications of the Bayesian theory of surprise for the case of self-adaptive systems using Bayesian dynamic decision networks. Copyright © 2014 ACM.
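A minimal sketch of the surprise measure described above, using a simple Beta-Bernoulli belief rather than the paper's Bayesian dynamic decision networks; the threshold and variable names are illustrative assumptions:

```python
from scipy.special import digamma, gammaln


def beta_kl(a1, b1, a0, b0):
    """KL divergence KL( Beta(a1, b1) || Beta(a0, b0) ), in nats."""
    ln_beta = lambda a, b: gammaln(a) + gammaln(b) - gammaln(a + b)
    return (ln_beta(a0, b0) - ln_beta(a1, b1)
            + (a1 - a0) * digamma(a1)
            + (b1 - b0) * digamma(b1)
            + (a0 - a1 + b0 - b1) * digamma(a1 + b1))


# Prior belief about how often a monitored event occurs (expects ~20%).
a_prior, b_prior = 2.0, 8.0

# Runtime observations: 9 occurrences in 10 trials -- unusual under the prior.
occurrences, non_occurrences = 9, 1
a_post, b_post = a_prior + occurrences, b_prior + non_occurrences

# "Surprise" = divergence between posterior and prior belief distributions.
surprise = beta_kl(a_post, b_post, a_prior, b_prior)
print(f"Bayesian surprise: {surprise:.3f} nats")

SURPRISE_THRESHOLD = 1.0  # illustrative threshold, not taken from the paper
if surprise > SURPRISE_THRESHOLD:
    print("Abnormal situation flagged: the system may decide to adapt.")
```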
Abstract:
Chlamydia is a common sexually transmitted infection that has potentially serious consequences unless detected and treated early. The health service in the UK offers clinic-based testing for chlamydia but uptake is low. Identifying the predictors of testing behaviours may inform interventions to increase uptake. Self-tests for chlamydia may facilitate testing and treatment in people who avoid clinic-based testing. Self-testing and being tested by a health care professional (HCP) involve two contrasting contexts that may influence testing behaviour. However, little is known about how predictors of behaviour differ as a function of context. In this study, theoretical models of behaviour were used to assess factors that may predict intention to test in two different contexts: self-testing and being tested by a HCP. Individuals searching for or reading about chlamydia testing online were recruited using Google Adwords. Participants completed an online questionnaire that addressed previous testing behaviour and measured constructs of the Theory of Planned Behaviour and Protection Motivation Theory, which propose a total of eight possible predictors of intention. The questionnaire was completed by 310 participants. Sufficient data for multiple regression were provided by 102 and 118 respondents for self-testing and testing by a HCP respectively. Intention to self-test was predicted by vulnerability and self-efficacy, with a trend-level effect for response efficacy. Intention to be tested by a HCP was predicted by vulnerability, attitude and subjective norm. Thus, intentions to carry out two testing behaviours with very similar goals can have different predictors depending on test context. We conclude that interventions to increase self-testing should be based on evidence specifically related to test context.