48 results for Internationalisation Process Theory
Abstract:
In this paper we review recent theoretical approaches for analysing the dynamics of on-line learning in multilayer neural networks using methods adopted from statistical physics. The analysis is based on monitoring a set of macroscopic variables from which the generalisation error can be calculated. A closed set of dynamical equations for the macroscopic variables is derived analytically and solved numerically. The theoretical framework is then employed for defining optimal learning parameters and for analysing the incorporation of second order information into the learning process using natural gradient descent and matrix-momentum based methods. We will also briefly explain an extension of the original framework for analysing the case where training examples are sampled with repetition.
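The on-line learning setting analysed above can be illustrated with a small numerical simulation. This is only an illustrative sketch, not the paper's analytical framework: a "student" soft committee machine is trained on fresh examples labelled by a random "teacher", with tanh activations standing in for the units analysed in the paper, and the generalisation error monitored by Monte Carlo estimation; all sizes and the learning rate are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 50, 2                          # input dimension, hidden units (arbitrary)
teacher = rng.standard_normal((K, N)) / np.sqrt(N)
student = 0.01 * rng.standard_normal((K, N))   # nearly untrained student
eta = 0.5                             # learning rate

def phi(W, x):
    # soft committee machine output: equally weighted tanh hidden units
    return np.tanh(W @ x).sum()

def gen_error(student, teacher, n_test=2000):
    # Monte Carlo estimate of the generalisation error
    X = rng.standard_normal((n_test, N))
    ys = np.tanh(X @ student.T).sum(axis=1)
    yt = np.tanh(X @ teacher.T).sum(axis=1)
    return float(np.mean((ys - yt) ** 2) / 2)

history = []
for t in range(20000):
    x = rng.standard_normal(N)        # a fresh example at every step (on-line)
    delta = phi(student, x) - phi(teacher, x)
    # gradient of the squared error with respect to each hidden unit's weights
    grad = delta * (1 - np.tanh(student @ x) ** 2)[:, None] * x[None, :]
    student -= (eta / N) * grad
    if t % 5000 == 0:
        history.append(gen_error(student, teacher))
```

The `history` list plays the role of the monitored macroscopic quantity: the generalisation error falls as training proceeds, which is what the analytical macroscopic equations track without simulation.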
Abstract:
A sieve plate distillation column has been constructed and interfaced to a minicomputer with the necessary instrumentation for dynamic, estimation and control studies with special bearing on low-cost and noise-free instrumentation. A dynamic simulation of the column with a binary liquid system has been compiled using deterministic models that include fluid dynamics via Brambilla's equation for tray liquid holdup calculations. The simulation predictions have been tested experimentally under steady-state and transient conditions. The simulator's predictions of the tray temperatures have shown reasonably close agreement with the measured values under steady-state conditions and in the face of a step change in the feed rate. A method of extending linear filtering theory to highly nonlinear systems with very nonlinear measurement functional relationships has been proposed and tested by simulation on binary distillation. The simulation results have proved that the proposed methodology can overcome the typical instability problems associated with the Kalman filters. Three extended Kalman filters have been formulated and tested by simulation. The filters have been used to refine a much simplified model sequentially and to estimate parameters such as the unmeasured feed composition using information from the column simulation. It is first assumed that corrupted tray composition measurements are made available to the filter and then corrupted tray temperature measurements are accessed instead. The simulation results have demonstrated the powerful capability of the Kalman filters to overcome the typical hardware problems associated with the operation of on-line analyzers in relation to distillation dynamics and control by, in effect, replacing them. A method of implementing estimator-aided feedforward (EAFF) control schemes has been proposed and tested by simulation on binary distillation.
The results have shown that the EAFF scheme provides much better control and energy conservation than the conventional feedback temperature control in the face of a sustained step change in the feed rate or multiple changes in the feed rate, composition and temperature. Further extensions of this work are recommended as regards simulation, estimation and EAFF control.
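The extended Kalman filter machinery the abstract applies to the column can be sketched in miniature. The scalar system below is invented for illustration (it is not the distillation model); it only shows the predict / linearise / update cycle an EKF performs on corrupted measurements of a nonlinear system.

```python
import numpy as np

# Illustrative scalar system (NOT the column model):
#   x_{k+1} = 0.9 x_k + w_k,   z_k = x_k + 0.05 x_k^3 + v_k
rng = np.random.default_rng(1)
Q, R = 0.01, 0.1                      # process / measurement noise variances

def h(x):
    return x + 0.05 * x ** 3          # mildly nonlinear measurement function

def ekf_step(x_hat, P, z):
    # predict through the (linear) state model
    x_pred = 0.9 * x_hat
    P_pred = 0.81 * P + Q
    # update: linearise h around the prediction (the "extended" part)
    H = 1 + 0.15 * x_pred ** 2
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - h(x_pred))
    P_new = (1 - K * H) * P_pred
    return x_new, P_new

x, x_hat, P = 2.0, 0.0, 1.0           # true state, estimate, estimate variance
errs = []
for _ in range(300):
    x = 0.9 * x + rng.normal(0, Q ** 0.5)     # true state evolves
    z = h(x) + rng.normal(0, R ** 0.5)        # corrupted measurement
    x_hat, P = ekf_step(x_hat, P, z)
    errs.append(abs(x - x_hat))
```

The estimate tracks the state despite the corrupted measurements, which is the role the thesis's filters play for unmeasured feed composition, with tray compositions or temperatures as the corrupted observations.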
Abstract:
Purpose – In the 1990s, a growing number of companies adopted value-based management (VBM) techniques in the UK. The purpose of this paper is to explore the motivations for the adoption or non-adoption of VBM for managing a business. Design/methodology/approach – An interview-based study of 37 large UK companies. Insights from diffusion theory and institutional theory are utilised to theorise these motivations. Findings – It was found that the rate of adoption of VBM in the sample companies does follow the classical S-shape. It also suggests that the supply-side of the diffusion process, most notably the role played by consultants, was an influence on many companies. This was not, however, a sufficient condition for companies to adopt the technique. The research also finds evidence of relocation diffusion, as several adopters are influenced by new officers, for example chief executive officers and finance directors, importing VBM techniques that they have used in organizations within which they have previously worked. Research limitations/implications – It is quite a small scale study and further work would be needed to develop the findings. Practical implications – Understanding and theorising the adoption of new management techniques will help understand the management of a business. Originality/value – This research adds further evidence to the value of studying management accounting, and more specifically management accounting change, in practice. It shows the developments in the adoption of a new technique and hence how a technique becomes accepted in practice.
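The "classical S-shape" of the adoption rate reported in the findings is commonly modelled with a logistic curve. A minimal sketch follows; the steepness `k` and midpoint `t0` are illustrative values, not parameters fitted to the study's data.

```python
import math

def cumulative_adopters(t, k=1.0, t0=5.0):
    # logistic S-curve: slow early adoption, rapid middle, saturating tail
    return 1.0 / (1.0 + math.exp(-k * (t - t0)))

# fraction of companies having adopted VBM at times t = 0..10 (illustrative)
curve = [cumulative_adopters(t) for t in range(11)]
```

The curve is strictly increasing and symmetric about its midpoint, where half the eventual adopters have adopted, which is the shape diffusion theory predicts.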
Abstract:
What does endogenous growth theory tell about regional economies? Empirics of R&D worker-based productivity growth, Regional Studies. Endogenous growth theory emerged in the 1990s as ‘new growth theory’ accounting for technical progress in the growth process. This paper examines the role of research and development (R&D) workers underlying the Romer model (1990) and its subsequent modifications, and compares it with a model based on the accumulation of human capital engaged in R&D. Cross-section estimates of the models against productivity growth of European regions in the 1990s suggest that each R&D worker has a unique set of knowledge while his/her contributions are enhanced by knowledge sharing within a region as well as spillovers from other regions in proximity.
Abstract:
This paper explains how strategic planning is able to deliver strategic integration within organizations. While communication and participation within planning processes are perceived to have an integrative effect, we argue that these effects are unlikely to arise simply from bringing people together. Rather, we suggest that, given the varying interests of actors in different business units, integration will only arise from active negotiations and compromises between these actors. The paper is based upon a case of strategic planning in a multinational that was attempting to develop greater strategic integration across Europe. Drawing upon an activity theory framework, we examine how a common strategy emerges over time through modifications to the planning process and to different actors’ roles within it. The findings are used to develop a process model that shows how different business unit characteristics of planning experience and relative power shape different experiences of communication and participation activities and different processes for achieving integration. The paper concludes with a discussion of how this process model contributes to the literature on strategic planning, political processes of strategy-making, and strategy-as-practice.
Abstract:
The literature on the potential use of liquid ammonia as a solvent for the extraction of aromatic hydrocarbons from mixtures with paraffins, and the application of reflux, has been reviewed. Reference is made to extractors suited to this application. A pilot scale extraction plant was designed comprising a 5 cm diameter by 125 cm high, 50-stage Rotating Disc Contactor with 2 external settlers. Provision was made for operation with, or without, reflux at a pressure of 10 bar and ambient temperature. The solvent recovery unit consisted of an evaporator, compressor and condenser in a refrigeration cycle. Two systems were selected for study, Cumene-n-Heptane-Ammonia and Toluene-Methylcyclohexane-Ammonia. Equilibrium data for the first system were determined experimentally in a specially designed equilibrium bomb. A technique was developed to withdraw samples under pressure for analysis by chromatography and titration. The extraction plant was commissioned with a kerosine-water system; detailed operating procedures were developed based on a Hazard and Operability Study. Experimental runs were carried out with both ternary ammonia systems. With the system Toluene-Methylcyclohexane-Ammonia, the extraction plant and the solvent recovery facility operated satisfactorily and safely in accordance with the operating procedures. Experimental data gave reasonable agreement with theory. Recommendations are made for further work with the plant.
Abstract:
Gain insight into crucial British mental health approaches for LGB individuals. There is very little collaborative literature between LGB-affirmative psychologists and psychotherapists in the United States and the United Kingdom. British Lesbian, Gay, and Bisexual Psychologies: Theory, Research, and Practice may well be a crucial beginning step in building dialogue between these two countries on important LGB psychotherapy developments. Leading authorities comprehensively examine the latest studies and effective therapies for LGB individuals in the United Kingdom. Practitioners will discover an extensive survey of the most current developments to supplement their own work, while educators and students will find diverse expert perspectives on which to consider and broaden their own viewpoints. This unique book offers an informative introduction to British psychosocial perspectives on theory, research, and practice. British Lesbian, Gay, and Bisexual Psychologies provides a critical exploration of the recent history of LGB psychology and psychotherapy in the United Kingdom, focusing on key publications and outlining the current terrain. Other chapters are organized into two thematic sections. The first section explores theoretical frameworks in United Kingdom therapeutic practice, while the second section examines sexual minority identities and their needs for support and community. 
Topics in British Lesbian, Gay, and Bisexual Psychologies include:
- similarities and differences between LGBT psychology and psychotherapy in the United States and United Kingdom
- gay affirmative therapy (GAT) as a positive framework
- existential-phenomenological approach to psychotherapy
- core issues in the anxiety about whether or not to "come out"
- object relations theory
- exploring homo-negativity in the therapeutic process
- aspects of psychotherapy that lesbians and gay men find helpful
- research into how the mainstreaming of lesbian and gay culture has affected the lives of LGB individuals
- study into LGB youth issues
- difficulties of gay men with learning disabilities, with suggestions on how to offer the best psychological service
- a study on gay athletes' experiences of coming out in a heterosexist world
British Lesbian, Gay, and Bisexual Psychologies takes a needed step toward sharing valuable psychosocial perspectives between countries. This useful, enlightening text is perfect for educators, students, psychologists, psychotherapists, and counselors working in the field of sexuality.
Abstract:
It is generally assumed when using Bayesian inference methods for neural networks that the input data contains no noise. For real-world (errors-in-variables) problems this is clearly an unsafe assumption. This paper presents a Bayesian neural network framework which accounts for input noise provided that a model of the noise process exists. In the limit where the noise process is small and symmetric it is shown, using the Laplace approximation, that this method adds an extra term to the usual Bayesian error bar which depends on the variance of the input noise process. Further, by treating the true (noiseless) input as a hidden variable, and sampling this jointly with the network's weights using a Markov chain Monte Carlo method, it is demonstrated that it is possible to infer the regression over the noiseless input. This leads to the possibility of training an accurate model of a system using less accurate, or more uncertain, data. This is demonstrated on both the synthetic noisy sine wave problem and a real problem of inferring the forward model for a satellite radar backscatter system used to predict sea surface wind vectors.
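The extra error-bar term described above can be written down directly: for small symmetric input noise of variance s2_x, the predictive variance gains approximately (df/dx)^2 * s2_x on top of the usual Bayesian term. A minimal one-dimensional illustration follows; using f = sin (so df/dx = cos) is an arbitrary stand-in for the trained regression, not the paper's network.

```python
import numpy as np

def predictive_variance(df_dx, x, s2_model, s2_x):
    # s2_model: the usual Bayesian error bar at x
    # (df/dx)^2 * s2_x: the extra term from small symmetric input noise
    return s2_model + df_dx(x) ** 2 * s2_x

df_dx = np.cos                     # derivative of the stand-in regression sin(x)
x = 0.0
base = predictive_variance(df_dx, x, s2_model=0.01, s2_x=0.0)
noisy = predictive_variance(df_dx, x, s2_model=0.01, s2_x=0.04)
# at x = 0 the slope is cos(0) = 1, so the full input-noise variance is added;
# where the regression is flat, input noise contributes almost nothing
```

This makes the qualitative behaviour clear: error bars widen most where the regression is steep, since there a small input perturbation moves the output the furthest.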
The transformational implementation of JSD process specifications via finite automata representation
Abstract:
Conventional structured methods of software engineering are often based on the use of functional decomposition coupled with the Waterfall development process model. This approach is argued to be inadequate for coping with the evolutionary nature of large software systems. Alternative development paradigms, including the operational paradigm and the transformational paradigm, have been proposed to address the inadequacies of this conventional view of software development, and these are reviewed. JSD is presented as an example of an operational approach to software engineering, and is contrasted with other well-documented examples. The thesis shows how aspects of JSD can be characterised with reference to formal language theory and automata theory. In particular, it is noted that Jackson structure diagrams are equivalent to regular expressions and can be thought of as specifying corresponding finite automata. The thesis discusses the automatic transformation of structure diagrams into finite automata using an algorithm adapted from compiler theory, and then extends the technique to deal with areas of JSD which are not strictly formalisable in terms of regular languages. In particular, an elegant and novel method for dealing with so-called recognition (or parsing) difficulties is described. Various applications of the extended technique are described. They include a new method of automatically implementing the dismemberment transformation; an efficient way of implementing inversion in languages lacking a goto-statement; and a new in-the-large implementation strategy.
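The correspondence noted above, Jackson structure diagrams as regular expressions, can be sketched with Thompson's construction, the standard compiler-theory algorithm for turning regular expressions into finite automata. The tuple encoding below ('seq' / 'sel' / 'iter' for sequence, selection and iteration) is an invented stand-in for structure diagrams, not JSD notation, and the algorithm is the textbook construction rather than the thesis's adapted version.

```python
EPS = None  # epsilon (empty) transition marker

def compile_nfa(node, trans, counter):
    # returns (start, accept); trans maps state -> [(symbol, next_state), ...]
    def new_state():
        s = counter[0]; counter[0] += 1
        trans.setdefault(s, [])
        return s
    if isinstance(node, str):                    # leaf: one input symbol
        s, t = new_state(), new_state()
        trans[s].append((node, t))
        return s, t
    kind, *kids = node
    if kind == 'seq':                            # sequence = concatenation
        parts = [compile_nfa(k, trans, counter) for k in kids]
        for (_, a), (b, _) in zip(parts, parts[1:]):
            trans[a].append((EPS, b))
        return parts[0][0], parts[-1][1]
    if kind == 'sel':                            # selection = alternation
        s, t = new_state(), new_state()
        for k in kids:
            ks, kt = compile_nfa(k, trans, counter)
            trans[s].append((EPS, ks)); trans[kt].append((EPS, t))
        return s, t
    if kind == 'iter':                           # iteration = Kleene star
        s, t = new_state(), new_state()
        ks, kt = compile_nfa(kids[0], trans, counter)
        trans[s] += [(EPS, ks), (EPS, t)]
        trans[kt] += [(EPS, ks), (EPS, t)]
        return s, t
    raise ValueError(kind)

def eps_closure(states, trans):
    stack, seen = list(states), set(states)
    while stack:
        for sym, t in trans[stack.pop()]:
            if sym is EPS and t not in seen:
                seen.add(t); stack.append(t)
    return seen

def accepts(diagram, word):
    trans, counter = {}, [0]
    start, accept = compile_nfa(diagram, trans, counter)
    current = eps_closure({start}, trans)
    for ch in word:
        moved = {t for s in current for sym, t in trans[s] if sym == ch}
        current = eps_closure(moved, trans)
    return accept in current

# a header record followed by zero or more credit-or-debit records
diagram = ('seq', 'h', ('iter', ('sel', 'c', 'd')))
```

Running the automaton then checks whether an input stream conforms to the structure diagram, e.g. `accepts(diagram, 'hcd')` holds while `accepts(diagram, 'cd')` does not.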
Abstract:
Despite the voluminous studies written about organisational innovation over the last 30-40 years, our understanding of this phenomenon continues to be inconsistent and inconclusive (Wolfe, 1994). An assessment of the theoretical and methodological issues influencing the explanatory utility of many studies has led scholars (e.g. Slappendel, 1996) to re-evaluate the assumptions used to ground studies. Building on these criticisms the current study contributes to the development of an interactive perspective of organisational innovation. This work contributes empirically and theoretically to an improved understanding of the innovation process and the interaction between the realm of action and the mediating effects of pre-existing contingencies, i.e. social control, economic exchange and the communicability of knowledge (Scarbrough, 1996). Building on recent advances in institutional theory (see Barley, 1986; 1990; Barley and Tolbert, 1997) and critical theory (Morrow, 1994; Sayer, 1992) the study aims to demonstrate, via longitudinal intensive research, the process through which ideas are translated into reality. This is significant because, despite a growing recognition of the implicit link between the strategic conduct of actors and the institutional realm in organisational analysis, there are few examples that theorise and empirically test these connections. By assessing an under-researched example of technology transfer, the government's Teaching Company Scheme (TCS), this project provides a critique of the innovation process that contributes to theory and our appreciation of change in the UK government's premier technology transfer scheme (QR, 1996). Critical moments during the translation of ideas illustrate how elements that are linked to social control, economic exchange and communicability mediate the innovation process. Using analytical categories, i.e. contradiction, slippage and dysfunctionality, these are assessed in relation to the actions (coping strategies) of programme members over a two-year period. Drawing on Giddens' (1995) notion of the duality of structure, this study explores the nature of the relationship between the task environment and institutional environment, demonstrating how and why knowledge is both an enabler and a barrier to organisational innovation.
Abstract:
This thesis is concerned with the management processes involved in complex strategic decisions in organisations. The research has sought to explore these processes by taking as its focus the reconstruction of decision processes a) on the basis of an historical study of an industry and in particular a major company in that industry; and b) the perception and understanding of strategic decision processes and change by managers involved in companies in that industry. The main body of analysis and theoretical contributions arises from the detailed analysis of extended depth interviews with managers carried out in 1980 and 1983 which trace thirteen years of the strategic development of a firm. In so doing, extensive use is made of verbatim accounts by managers of events and their interpretation of events. This is then compared with data gathered from similar interviews with managers of two other companies and examined in the light of existing research and theory in the field. The thesis both provides a detailed insight into the processes associated with the identification and resolution of complex strategic issues and also generates a body of theory concerning the mechanisms by which strategic decisions and the processes of strategic change are interwoven with the cultural and political fabric of organisations. The thesis is divided into four parts. The first part deals with the background to the research, providing a fuller summary of the purpose, structure and content of the thesis and a discussion of relevant previous research and the methodology employed herein. The second part mainly provides case studies of the industry and the main company studied. The third part is a detailed presentation and analysis of data. The fourth part is a synthesis of the findings and consolidation of the theoretical interpretation advanced in the thesis.
Abstract:
Over recent years, evidence has been accumulating in favour of the importance of long-term information as a variable which can affect the success of short-term recall. Lexicality, word frequency, imagery and meaning have all been shown to augment short-term recall performance. Two competing theories as to the causes of this long-term memory influence are outlined and tested in this thesis. The first approach is the order-encoding account, which ascribes the effect to the usage of resources at encoding, hypothesising that word lists which require less effort to process will benefit from increased levels of order encoding, in turn enhancing recall success. The alternative view, trace redintegration theory, suggests that order is automatically encoded phonologically, and that long-term information can only influence the interpretation of the resultant memory trace. The free recall experiments reported here attempted to determine the importance of order encoding as a facilitatory framework and to determine the locus of the effects of long-term information in free recall. Experiments 1 and 2 examined the effects of word frequency and semantic categorisation over a filled delay, and Experiments 3 and 4 did the same for immediate recall. Free recall was improved by both long-term factors tested. Order information was not used over a short filled delay, but was evident in immediate recall. Furthermore, it was found that both long-term factors increased the amount of order information retained. Experiment 5 induced an order-encoding effect over a filled delay, leaving a picture of short-term processes which are closely associated with long-term processes, and which fit conceptions of short-term memory being part of language processes rather better than either the encoding or the retrieval-based models. Experiments 6 and 7 aimed to determine to what extent phonological processes were responsible for the pattern of results observed.
Articulatory suppression affected the encoding of order information where speech rate had no direct influence, suggesting that it is ease of lexical access which is the most important factor in the influence of long-term memory on immediate recall tasks. The evidence presented in this thesis does not offer complete support for either the retrieval-based account or the order encoding account of long-term influence. Instead, the evidence sits best with models that are based upon language-processing. The path urged for future research is to find ways in which this diffuse model can be better specified, and which can take account of the versatility of the human brain.
Abstract:
This investigation is in two parts, theory and experimental verification. (1) Theoretical Study: In this study it is, for obvious reasons, necessary to analyse the concept of formability first. For the purpose of the present investigation it is sufficient to define the four aspects of formability as follows: (a) the formability of the material at a critical section, (b) the formability of the material in general, (c) process efficiency, (d) proportional increase in surface area. A method of quantitative assessment is proposed for each of the four aspects of formability. The theoretical study also includes the distinction between coaxial and non-coaxial strains which occur, respectively, in axisymmetrical and unsymmetrical forming processes, and the inadequacy of the circular grid system for the assessment of formability is explained in the light of this distinction. (2) Experimental Study: As one of the bases of the experimental work, the determination of the end point of a forming process, which sets the limit to the formability of the work material, is discussed. The effects of three process parameters on draw-in are shown graphically. Then the delay of fracture in sheet metal forming resulting from draw-in is analysed in kinematical terms, namely, through the radial displacements, the radial and the circumferential strains, and the projected thickness of the workpiece. Through the equilibrium equation of the membrane stresses, the effect on the shape of the unsupported region of the workpiece, and hence the position of the critical section, is explained. Then, the effect of draw-in on the four aspects of formability is discussed throughout this investigation. The triangular coordinate system is used to present and analyse the triaxial strains involved. This coordinate system has the advantage of showing all three principal strains in a material simultaneously, as well as representing clearly the many types of strains involved in sheet metal work.
Abstract:
Diagnosing faults in wastewater treatment, like diagnosis of most problems, requires bi-directional plausible reasoning. This means that both predictive (from causes to symptoms) and diagnostic (from symptoms to causes) inferences have to be made, depending on the evidence available, in reasoning towards the final diagnosis. The use of computer technology for the purpose of diagnosing faults in the wastewater process has been explored, and a rule-based expert system was initiated. It was found that such an approach has serious limitations in its ability to reason bi-directionally, which makes it unsuitable for diagnostic tasks under conditions of uncertainty. The probabilistic approach known as Bayesian Belief Networks (BBNs) was then critically reviewed, and was found to be well-suited for diagnosis under uncertainty. The theory and application of BBNs are outlined. A full-scale BBN for the diagnosis of faults in a wastewater treatment plant based on the activated sludge system has been developed in this research. Results from the BBN show good agreement with the predictions of wastewater experts. It can be concluded that BBNs are far superior to rule-based systems based on certainty factors in their ability to diagnose faults and predict system behaviour in complex operating systems with inherently uncertain behaviour.
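The bi-directional reasoning described above can be illustrated with the smallest possible belief network: one fault node and one symptom node. The fault name and all probabilities below are invented for illustration, not taken from the thesis's plant model; predictive inference reads the model forwards from cause to symptom, while diagnostic inference inverts it with Bayes' rule.

```python
# Two-node belief network: Fault (sludge bulking) -> Symptom (high turbidity).
# All numbers are illustrative.
p_fault = 0.1                 # prior: P(bulking)
p_sym_given_fault = 0.9       # P(high turbidity | bulking)
p_sym_given_ok = 0.2          # P(high turbidity | no bulking)

# predictive inference (cause -> symptom): marginalise over the fault
p_symptom = p_fault * p_sym_given_fault + (1 - p_fault) * p_sym_given_ok

# diagnostic inference (symptom -> cause): Bayes' rule on the observed symptom
p_fault_given_sym = p_fault * p_sym_given_fault / p_symptom
```

Observing the symptom raises the fault probability from the 0.1 prior to 1/3, and the same network supports both directions of inference, which is exactly what a certainty-factor rule base struggles to do.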
Abstract:
This thesis is a theoretical study of the accuracy and usability of models that attempt to represent the environmental control system of buildings in order to improve environmental design. These models have evolved from crude representations of a building and its environment through to an accurate representation of the dynamic characteristics of the environmental stimuli on buildings. Each generation of models has had its own particular influence on built form. This thesis analyses the theory, structure and data of such models in terms of their accuracy of simulation and therefore their validity in influencing built form. The models are also analysed in terms of their compatibility with the design process and hence their ability to aid designers. The conclusions are that such models are unlikely to improve environmental performance since: (a) the models can only be applied to a limited number of building types; (b) they can only be applied to a restricted number of the characteristics of a design; (c) they can only be employed after many major environmental decisions have been made; (d) the data used in models is inadequate and unrepresentative; (e) models do not account for occupant interaction in environmental control. It is argued that further improvements in the accuracy of simulation of environmental control will not significantly improve environmental design. This is based on the premise that strategic environmental decisions are made at the conceptual stages of design whereas models influence the detailed stages of design. It is hypothesised that if models are to improve environmental design it must be through the analysis of building typologies, which provides a method of feedback between models and the conceptual stages of design.
Field studies are presented to describe a method by which typologies can be analysed and a theoretical framework is described which provides a basis for further research into the implications of the morphology of buildings on environmental design.