15 results for Theory of economic-mathematical models
in Aston University Research Archive
Abstract:
This paper formulates several mathematical models for simultaneously determining the optimal sequence of component placements and the assignment of component types to feeders (the integrated scheduling problem) for a type of surface mount technology placement machine called the sequential pick-and-place (PAP) machine. A PAP machine has multiple stationary feeders storing components, a stationary working table holding a printed circuit board (PCB), and a movable placement head that picks up components from the feeders and places them onto the board. The objective of the integrated problem is to minimize the total distance traveled by the placement head. Two integer nonlinear programming models are formulated first. Each is then equivalently converted into an integer linear form. The models for the integrated problem are verified with two commercial packages. In addition, a hybrid genetic algorithm previously developed by the authors is adopted to solve the models. The algorithm not only generates optimal solutions quickly for small-sized problems, but also outperforms the genetic algorithms developed by other researchers in terms of total traveling distance.
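The objective function described above can be illustrated with a small sketch that evaluates the head's total travel for a candidate solution. All names, coordinates and the Euclidean metric below are illustrative assumptions; the paper's actual models encode this objective inside integer (non)linear programs rather than as a direct evaluation function.

```python
import math

def total_travel(sequence, feeder_of, feeder_xy, point_xy, start=(0.0, 0.0)):
    """Total Euclidean distance travelled by the placement head for a given
    placement sequence and component-type-to-feeder assignment.

    sequence  -- list of (component_type, placement_point) pairs, in order
    feeder_of -- assignment of each component type to a feeder (a decision variable)
    feeder_xy -- fixed feeder positions; point_xy -- fixed PCB placement positions
    (Illustrative objective only; the metric and layout are assumptions.)
    """
    dist = 0.0
    head = start
    for comp_type, point in sequence:
        feeder = feeder_xy[feeder_of[comp_type]]  # travel to feeder and pick up
        dist += math.dist(head, feeder)
        target = point_xy[point]                  # travel to the board and place
        dist += math.dist(feeder, target)
        head = target
    return dist
```

The integrated problem then amounts to searching jointly over `sequence` and `feeder_of` for the assignment that minimises this total, which is what the integer programming models and the hybrid genetic algorithm do.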
Abstract:
High velocity oxyfuel (HVOF) thermal spraying is one of the most significant developments in the thermal spray industry since the original plasma spray technique. The first investigation deals with the combustion and discrete particle models within the general-purpose commercial CFD code FLUENT, used to solve the combustion of kerosene and couple the motion of fuel droplets with the gas flow dynamics in a Lagrangian fashion. The effects of liquid fuel droplets on the thermodynamics of the combusting gas flow are examined thoroughly, showing that the combustion process of kerosene is independent of the initial fuel droplet sizes. The second analysis addresses a full water-cooling numerical model, which can assist in optimising thermal performance or in determining the best method for heat removal without the cost of building physical prototypes. The numerical results indicate that the water flow rate and direction have a noticeable influence on the cooling efficiency but no noticeable effect on the gas flow dynamics within the thermal spraying gun. The third investigation deals with the development and implementation of discrete phase particle models. The results indicate that most powder particles are not melted upon hitting the substrate to be coated. The oxidation model confirms that HVOF guns can produce metallic coatings with low oxidation within the typical stand-off distance of about 30 cm. Physical properties such as porosity, microstructure, surface roughness and adhesion strength of coatings produced by droplet deposition in a thermal spray process are determined to a large extent by the dynamics of deformation and solidification of the particles impinging on the substrate. Therefore, one of the objectives of this study is to present a complete numerical model of droplet impact and solidification. The modelling results show that solidification of droplets is significantly affected by the thermal contact resistance and the substrate surface roughness.
Abstract:
Understanding the process of economic growth has been called the ultimate objective of economics. It has also been likened to an elusive quest, like the Holy Grail or the Elixir of Life (Easterly 2001). Taking on such a quest requires ingenuity and perseverance. Even small insights along the way can bring major benefits to millions of people; small mistakes can do the reverse. Economies which achieve large increases in output over extended periods of time not only enable rapid increases in standards of living, but also undergo dramatic changes in the economic, political and social landscape. For example, the USA is estimated to have produced approximately 30 times as much in 1999 as it did in 1899. This sustained economic growth meant that in 1999 the USA had an average income per capita of US$34,100. In contrast, sub-Saharan Africa had an average income of US$490. Understanding these vast income differences, produced over many decades, is the elusive quest. The aim of this survey is to explain how economists try to understand the process of economic growth. To make the task manageable, the focus is on major issues and current debates. Models and conceptual frameworks are discussed in section III. Section IV summarises empirical studies, with a particular focus on econometric studies of groups of countries. This is not to say that case studies of single countries are not valuable, but space precludes covering everything. The following section sets out some facts about economic growth and, hopefully, motivates the further effort needed to tackle the theory and econometrics.
Abstract:
This paper examines the strategic implications of resource allocation models (RAMs). Four interrelated aspects of resource allocation are discussed: degree of centralisation, locus of strategic direction, cross-subsidy, and locus of control. The paper begins with a theoretical overview of these concepts, locating the study in the contexts of both the strategic management literature and the university. The concepts are then examined empirically, drawing upon a longitudinal study of three UK universities: Warwick, the London School of Economics and Political Science (LSE), and Oxford Brookes. Findings suggest that RAMs are historically and culturally situated within the context of each university, and that this is associated with different patterns of strategic direction and forms of strategic control. As such, the RAM in use may be less a matter of best practice than one of internal fit. The paper concludes with some implications for theory and practice by discussing the potential trajectories of each type of RAM.
Abstract:
Previous developments in the opportunism-independent theory of the firm are either restricted to special cases or are derived from the capabilities or resource-based perspective. However, a more general opportunism-independent approach can be developed, based on the work of Demsetz and Coase, which is nevertheless contractual in nature. This depends on 'direction', that is, deriving economic value by permitting one set of actors to direct the activities of another, and of non-human factors of production. Direction helps to explain not only firm boundaries and organisation, but also the existence of firms, without appealing to opportunism or moral hazard. The paper also considers the extent to which it is meaningful to speak of 'contractual' theories in the absence of opportunism, and whether this analysis can be extended beyond the employment contract to encompass ownership of assets by the firm. © The Author 2005. Published by Oxford University Press on behalf of the Cambridge Political Economy Society. All rights reserved.
Abstract:
There has been a revival of interest in economic techniques for measuring the value of a firm to shareholders, notably the use of economic value added (EVA). This technique, based upon the concept of economic value equating to total value, is founded upon the assumptions of classical liberal economic theory. Such techniques have been criticised both for the level of adjustment to published accounts needed to make them work and for their validity in measuring value in a meaningful context. This paper critiques economic value added as a means of calculating changes in shareholder value, contrasting it with more traditional techniques of measuring value added. It uses the company Severn Trent plc as an actual example in order to evaluate and contrast the techniques in action. The paper demonstrates discrepancies between the calculated results from economic value added analysis and those reported using conventional accounting measures. It considers the merits of the respective techniques in explaining shareholder and managerial behaviour, and the problems of using such techniques within the wider stakeholder concept of value. It concludes that the economic value added technique has merits when compared with traditional accounting measures of performance, but that it does not provide the universal panacea claimed by its proponents.
Abstract:
We present a mean field theory of code-division multiple access (CDMA) systems with error-control coding. On the basis of the relation between the free energy and mutual information, we obtain an analytical expression of the maximum spectral efficiency of the coded CDMA system, from which a mean field description of the coded CDMA system is provided in terms of a bank of scalar Gaussian channels whose variances in general vary at different code symbol positions. Regular low-density parity-check (LDPC)-coded CDMA systems are also discussed as an example of the coded CDMA systems.
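The mean field description above reduces the coded CDMA system to a bank of scalar Gaussian channels with position-dependent variances. The sketch below uses only the standard scalar Gaussian channel capacity formula; the `equivalent_bank_rate` helper, its parameters and the averaging over symbol positions are illustrative assumptions, not the paper's derivation.

```python
import math

def gaussian_channel_capacity(snr):
    """Capacity, in bits per channel use, of a scalar Gaussian channel:
    C = (1/2) * log2(1 + SNR)."""
    return 0.5 * math.log2(1.0 + snr)

def equivalent_bank_rate(signal_power, noise_variances):
    """Mean rate over a bank of scalar Gaussian channels whose noise
    variances vary across code symbol positions, mimicking the mean field
    picture of the coded system (illustrative parameters)."""
    rates = [gaussian_channel_capacity(signal_power / v) for v in noise_variances]
    return sum(rates) / len(rates)
```

With equal variances the bank collapses to a single scalar channel; unequal variances model the position-dependent effective noise that the mean field analysis predicts.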
Abstract:
The thesis began as a study of new firm formation. Preliminary research suggested that the death rate of infant firms was a closely related problem, and the search was for a theory of new firm formation which would explain both. The thesis finds theories of exit and entry inadequate in this respect and focuses instead on theories of entrepreneurship, particularly those which concentrate on entrepreneurship as an agent of change. The role of information is found to be fundamental to economic change, and an understanding of information generation and dissemination, and of the nature and direction of information flows, is postulated to lead coterminously to an understanding of entrepreneurship and economic change. The economics of information is applied to theories of entrepreneurship and some testable hypotheses are derived. The testing relies on establishing and measuring the information bases of the founders of new firms and then testing for certain hypothesised differences between the information bases of survivors and non-survivors. No theory of entrepreneurship is likely to be straightforwardly testable and many postulates have to be established to bring the theory to a testable stage. A questionnaire is used to gather information from a sample of firms taken from a new micro-data set established as part of the work of the thesis. Discriminant analysis establishes the variables which best distinguish between survivors and non-survivors. The variables which emerge as important discriminators are consistent with the theory which the analysis is testing. While there are alternative interpretations of the important variables, collective consistency with the theory under test is established. The thesis concludes with an examination of the implications of the theory for policy towards stimulating new firm formation.
Abstract:
Over recent years, evidence has been accumulating in favour of the importance of long-term information as a variable which can affect the success of short-term recall. Lexicality, word frequency, imagery and meaning have all been shown to augment short-term recall performance. Two competing theories as to the causes of this long-term memory influence are outlined and tested in this thesis. The first approach is the order-encoding account, which ascribes the effect to the usage of resources at encoding, hypothesising that word lists which require less effort to process will benefit from increased levels of order encoding, in turn enhancing recall success. The alternative view, trace redintegration theory, suggests that order is automatically encoded phonologically, and that long-term information can only influence the interpretation of the resultant memory trace. The free recall experiments reported here attempted to determine the importance of order encoding as a facilitatory framework and to determine the locus of the effects of long-term information in free recall. Experiments 1 and 2 examined the effects of word frequency and semantic categorisation over a filled delay, and Experiments 3 and 4 did the same for immediate recall. Free recall was improved by both long-term factors tested. Order information was not used over a short filled delay, but was evident in immediate recall. Furthermore, both long-term factors increased the amount of order information retained. Experiment 5 induced an order-encoding effect over a filled delay, leaving a picture of short-term processes which are closely associated with long-term processes, and which fit conceptions of short-term memory as part of language processing rather better than either the encoding-based or the retrieval-based models. Experiments 6 and 7 aimed to determine to what extent phonological processes were responsible for the pattern of results observed. Articulatory suppression affected the encoding of order information where speech rate had no direct influence, suggesting that ease of lexical access is the most important factor in the influence of long-term memory on immediate recall tasks. The evidence presented in this thesis does not offer complete support for either the retrieval-based account or the order-encoding account of long-term influence. Instead, the evidence sits best with models based upon language processing. The path urged for future research is to find ways in which this diffuse model can be better specified, and which can take account of the versatility of the human brain.
Abstract:
Oxygen is a crucial molecule for cellular function. When oxygen demand exceeds supply, the oxygen sensing pathway centred on the hypoxia inducible factor (HIF) is switched on and promotes adaptation to hypoxia by up-regulating genes involved in angiogenesis, erythropoiesis and glycolysis. The regulation of HIF is tightly modulated through intricate regulatory mechanisms. Notably, its protein stability is controlled by the oxygen sensing prolyl hydroxylase domain (PHD) enzymes and its transcriptional activity is controlled by the asparaginyl hydroxylase FIH (factor inhibiting HIF-1). To probe the complexity of hypoxia-induced HIF signalling, efforts in mathematical modelling of the pathway have been underway for around a decade. In this paper, we review the existing mathematical models developed to describe and explain specific behaviours of the HIF pathway and how they have contributed new insights into our understanding of the network. Topics for modelling included the switch-like response to a decreasing oxygen gradient, the role of microenvironmental factors, the regulation by FIH and the temporal dynamics of the HIF response. We will also discuss the technical aspects, extent and limitations of these models. Recently, the HIF pathway has been implicated in other disease contexts, such as hypoxic inflammation and cancer, through crosstalk with pathways like NF-κB and mTOR. We will examine how future mathematical modelling and simulation of interlinked networks can aid in understanding HIF behaviour in complex pathophysiological situations. Ultimately this would allow the identification of new pharmacological targets in different disease settings.
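As a minimal illustration of the kind of model reviewed here, the sketch below simulates HIF accumulation with constant production balanced against oxygen-dependent, PHD-mediated degradation. The Michaelis-Menten form and all rate constants are assumptions chosen for illustration, not parameters taken from any specific published model.

```python
def simulate_hif(oxygen, k_prod=1.0, k_deg=2.0, km=0.5, dt=0.01, t_end=50.0):
    """Approximate steady-state HIF level under a given oxygen tension for the
    minimal model (illustrative constants, forward-Euler integration):

        dH/dt = k_prod - k_deg * (O2 / (O2 + km)) * H

    PHD activity, and hence HIF degradation, scales with oxygen, so lowering
    oxygen stabilises HIF.
    """
    h, t = 0.0, 0.0
    while t < t_end:
        degradation = k_deg * (oxygen / (oxygen + km)) * h
        h += dt * (k_prod - degradation)
        t += dt
    return h
```

Running the model at high versus low oxygen reproduces the basic qualitative behaviour the reviewed models build on: HIF stays low in normoxia and accumulates under hypoxia.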
Abstract:
This article explores the growing perception, prompted by the eurozone crisis, of Germany as a hegemonic power in the European Union. The article explores the realignments in the power balance within the European Union (EU) by making an original application of the insights from the literature on hegemony. It reviews the evidence for Germany playing a hegemonic role, but then emphasizes three sets of constraints. First, German pre-eminence is largely confined to the economic sphere. Even in this area Germany has not acted fully in line with the role ascribed by hegemonic stability theory. Second, its pre-eminence in the EU encounters problems of international legitimacy. Third, growing constraints arising from German domestic politics further hamper playing the role of hegemon. In consequence, Germany is intrinsically a reluctant hegemon: one whose economic leadership is recognized but politically contested. The conclusion considers the significance of these findings on the EU's most important member state. © 2013 Taylor & Francis.
Abstract:
In the specific area of software engineering (SE) for self-adaptive systems (SASs) there is growing research awareness of the synergy between SE and artificial intelligence (AI). However, only a few significant results have been published so far. In this paper, we propose a novel and formal Bayesian definition of surprise as the basis for quantitative analysis to measure degrees of uncertainty and deviations of self-adaptive systems from normal behavior. A surprise measures how observed data affect the models or assumptions of the world during runtime. The key idea is that a "surprising" event can be defined as one that causes a large divergence between the belief distributions prior to and posterior to the event occurring. In such a case the system may decide either to adapt accordingly or to flag that an abnormal situation is happening. We discuss possible applications of the Bayesian theory of surprise for the case of self-adaptive systems using Bayesian dynamic decision networks. Copyright © 2014 ACM.
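The definition above can be sketched directly for a discrete hypothesis space: surprise is the Kullback-Leibler divergence between the belief distribution after an observation and the one before it. The two-hypothesis "normal/abnormal" example in the usage note is an illustrative assumption, not taken from the paper.

```python
import math

def bayes_posterior(prior, likelihoods):
    """Bayes update over a discrete hypothesis space: posterior proportional
    to prior times likelihood of the observed event under each hypothesis."""
    unnorm = [p * l for p, l in zip(prior, likelihoods)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

def surprise(prior, likelihoods):
    """Bayesian surprise of an event, in bits: KL divergence from the prior
    belief distribution to the posterior the event induces. Zero when the
    event leaves beliefs unchanged; large for belief-shifting events."""
    post = bayes_posterior(prior, likelihoods)
    return sum(q * math.log2(q / p) for p, q in zip(prior, post) if q > 0)
```

For instance, with a uniform prior over "normal" and "abnormal" operation, an event that is nine times more likely under "normal" yields a posterior of (0.9, 0.1) and about 0.53 bits of surprise; an uninformative event yields zero, and a runtime monitor could flag adaptation whenever surprise exceeds a threshold.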
Abstract:
Chlamydia is a common sexually transmitted infection that has potentially serious consequences unless detected and treated early. The health service in the UK offers clinic-based testing for chlamydia but uptake is low. Identifying the predictors of testing behaviours may inform interventions to increase uptake. Self-tests for chlamydia may facilitate testing and treatment in people who avoid clinic-based testing. Self-testing and being tested by a health care professional (HCP) involve two contrasting contexts that may influence testing behaviour. However, little is known about how predictors of behaviour differ as a function of context. In this study, theoretical models of behaviour were used to assess factors that may predict intention to test in two different contexts: self-testing and being tested by a HCP. Individuals searching for or reading about chlamydia testing online were recruited using Google Adwords. Participants completed an online questionnaire that addressed previous testing behaviour and measured constructs of the Theory of Planned Behaviour and Protection Motivation Theory, which propose a total of eight possible predictors of intention. The questionnaire was completed by 310 participants. Sufficient data for multiple regression were provided by 102 and 118 respondents for self-testing and testing by a HCP respectively. Intention to self-test was predicted by vulnerability and self-efficacy, with a trend-level effect for response efficacy. Intention to be tested by a HCP was predicted by vulnerability, attitude and subjective norm. Thus, intentions to carry out two testing behaviours with very similar goals can have different predictors depending on test context. We conclude that interventions to increase self-testing should be based on evidence specifically related to test context.
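The kind of analysis described, regressing intention on a small set of theoretical constructs, can be sketched with a minimal ordinary-least-squares routine. The routine itself is standard; the tiny dataset in the test is synthetic and purely illustrative, not the study's data.

```python
def ols(X, y):
    """Ordinary least squares: solve the normal equations (X'X)b = X'y by
    Gaussian elimination with partial pivoting. Adequate for a handful of
    predictors (e.g. TPB/PMT constructs); add a column of ones to X for an
    intercept. Returns one coefficient per column of X."""
    n, k = len(X), len(X[0])
    xtx = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(k)]
           for i in range(k)]
    xty = [sum(X[r][i] * y[r] for r in range(n)) for i in range(k)]
    for col in range(k):                       # forward elimination
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * k                           # back substitution
    for i in range(k - 1, -1, -1):
        beta[i] = (xty[i] - sum(xtx[i][j] * beta[j]
                                for j in range(i + 1, k))) / xtx[i][i]
    return beta
```

In a study like the one described, each row of `X` would hold one respondent's scores on the predictor constructs and `y` their intention score, fitted separately for the two test contexts.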
Abstract:
Previous work has demonstrated that planning behaviours may be more adaptive than avoidance strategies in driving self-regulation, but ways of encouraging planning have not been investigated. The efficacy of an intervention based on an extended theory of planned behaviour (TPB) plus implementation intentions, designed to promote planning self-regulation in drivers across the lifespan, was tested. An age-stratified group of participants (N=81, aged 18-83 years) was randomly assigned to an experimental or control condition. The intervention prompted specific goal setting with action planning and barrier identification. Goal setting was carried out using an agreed behavioural contract. Baseline and follow-up measures of TPB variables, self-reported driving self-regulation behaviours (avoidance and planning) and mobility goal achievements were collected using postal questionnaires. Like many previous efforts to change behaviour by changing its predictors using models such as the TPB, the intervention did not significantly change any of the model components. However, more than 90% of participants achieved their primary driving goal, and self-regulation planning as measured on a self-regulation inventory was marginally improved. The study demonstrates the distinction between pre-decisional (motivational) components and post-decisional goal enactment, and offers promise for the role of self-regulation planning and implementation intentions in assisting drivers to achieve their mobility goals and promoting safer driving across the lifespan, even in the context of unchanging beliefs such as perceived risk or driver anxiety.