32 results for Recursive logit

in Aston University Research Archive


Relevance:

20.00%

Publisher:

Abstract:

Recently there has been an outburst of interest in extending topographic maps of vectorial data to more general data structures, such as sequences or trees. However, there is no general consensus as to how best to process sequences using topographic maps, and this topic remains an active focus of neurocomputational research. The representational capabilities and internal representations of the models are not well understood. Here, we rigorously analyze a generalization of the self-organizing map (SOM) for processing sequential data, recursive SOM (RecSOM) (Voegtlin, 2002), as a nonautonomous dynamical system consisting of a set of fixed input maps. We argue that contractive fixed-input maps are likely to produce Markovian organizations of receptive fields on the RecSOM map. We derive bounds on parameter β (weighting the importance of importing past information when processing sequences) under which contractiveness of the fixed-input maps is guaranteed. Some generalizations of SOM contain a dynamic module responsible for processing temporal contexts as an integral part of the model. We show that Markovian topographic maps of sequential data can be produced using a simple fixed (nonadaptable) dynamic module externally feeding a standard topographic model designed to process static vectorial data of fixed dimensionality (e.g., SOM). However, by allowing trainable feedback connections, one can obtain Markovian maps with superior memory depth and topography preservation. We elaborate on the importance of non-Markovian organizations in topographic maps of sequential data. © 2006 Massachusetts Institute of Technology.
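The dynamical-systems view taken in this abstract can be sketched numerically. The following is a hypothetical illustration (the weights, alpha and beta values are invented, not taken from the paper) of the RecSOM fixed-input map and the merging of past activation profiles under a small beta, the contractive regime the paper analyzes:

```python
import numpy as np

def recsom_activation(x, y_prev, W, C, alpha=2.0, beta=0.1):
    """One application of the RecSOM fixed-input map: unit i scores input x
    against its feed-forward weight W[i] and the previous activation profile
    y_prev against its context weight C[i]; beta weights the importance of
    past information."""
    d = alpha * np.sum((x - W) ** 2, axis=1) + beta * np.sum((y_prev - C) ** 2, axis=1)
    return np.exp(-d)

rng = np.random.default_rng(0)
n_units, dim = 10, 3
W = rng.random((n_units, dim))       # feed-forward weights (illustrative)
C = rng.random((n_units, n_units))   # recurrent context weights (illustrative)
x = rng.random(dim)                  # one fixed input symbol

# With beta small, the fixed-input map is contractive: iterating it from two
# different past activation profiles drives them together, the mechanism
# behind Markovian organization of receptive fields.
y1, y2 = rng.random(n_units), rng.random(n_units)
gap0 = float(np.linalg.norm(y1 - y2))
for _ in range(50):
    y1 = recsom_activation(x, y1, W, C)
    y2 = recsom_activation(x, y2, W, C)
gap = float(np.linalg.norm(y1 - y2))
```

Raising beta weakens this contraction, which is exactly why the paper derives bounds on beta under which contractiveness is guaranteed.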

Relevance:

20.00%

Publisher:

Abstract:

Recently, there has been considerable research activity in extending topographic maps of vectorial data to more general data structures, such as sequences or trees. However, the representational capabilities and internal representations of the models are not well understood. We rigorously analyze a generalization of the Self-Organizing Map (SOM) for processing sequential data, Recursive SOM (RecSOM [1]), as a non-autonomous dynamical system consisting of a set of fixed input maps. We show that contractive fixed input maps are likely to produce Markovian organizations of receptive fields on the RecSOM map. We derive bounds on parameter $\beta$ (weighting the importance of importing past information when processing sequences) under which contractiveness of the fixed input maps is guaranteed.

Relevance:

10.00%

Publisher:

Abstract:

Increasing mail-survey response using monetary incentives is a proven method, but not one that is cost-effective in every population. This paper tackles the questions of whether monetary incentives are worth using and how large the inducement should be, by testing a logit model of the impact of prepaid monetary incentives on response rates in consumer and organizational mail surveys. The results support their use and show that the inducement value has a significant impact on the effect size. Importantly, no significant differences were found between consumer and organizational populations. A cost-benefit model is developed to estimate the optimum incentive when attempting to minimize overall survey costs for a given sample size. © 2006 Operational Research Society Ltd. All rights reserved.
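The logit and cost-benefit reasoning in this abstract can be sketched as follows. The coefficients, mailing cost and incentive grid are purely illustrative assumptions, not estimates from the paper:

```python
import math

def response_rate(incentive, b0=-1.2, b1=0.9):
    """Logit model of response: p = 1 / (1 + exp(-(b0 + b1*incentive))).
    b0 and b1 are invented illustrative coefficients."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * incentive)))

def total_cost(incentive, target_n, mail_cost=1.5):
    """Expected cost of achieving target_n completed surveys: mailings
    needed = target_n / p, each costing postage plus the prepaid incentive."""
    p = response_rate(incentive)
    return (target_n / p) * (mail_cost + incentive)

# Grid-search the incentive that minimises overall survey cost for a fixed
# achieved sample size: the trade-off the paper's cost-benefit model formalises.
incentives = [i / 10 for i in range(0, 51)]  # 0.00 .. 5.00
best = min(incentives, key=lambda i: total_cost(i, target_n=400))
```

The interior optimum arises because a larger incentive raises the per-mailing cost but shrinks the number of mailings needed.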

Relevance:

10.00%

Publisher:

Abstract:

Innovation events – the introduction of new products or processes – represent the end of a process of knowledge sourcing and transformation. They also represent the beginning of a process of exploitation which may result in an improvement in the performance of the innovating business. This recursive process of knowledge sourcing, transformation and exploitation comprises the innovation value chain. Modelling the innovation value chain for a large group of manufacturing firms in Ireland and Northern Ireland highlights the drivers of innovation, productivity and firm growth. In terms of knowledge sourcing, we find strong complementarity between horizontal, forwards, backwards, public and internal knowledge sourcing activities. Each of these forms of knowledge sourcing also makes a positive contribution to innovation in both products and processes although public knowledge sources have only an indirect effect on innovation outputs. In the exploitation phase, innovation in both products and processes contributes positively to company growth, with product innovation having a short-term ‘disruption’ effect on labour productivity. Modelling the complete innovation value chain highlights the structure and complexity of the process of translating knowledge into business value and emphasises the role of skills, capital investment and firms’ other resources in the value creation process.

Relevance:

10.00%

Publisher:

Abstract:

This paper looks at how a strategic plan is constructed through a communicative process. Drawing on Ricoeur’s concepts of decontextualization and recontextualization, we conceptualize strategic planning activities as being constituted through the iterative and recursive relationship of talk and text. Based on an in-depth case study, our findings show how multiple actors engage in a formal strategic planning process which is manifested in a written strategy document. This document is thus central in the iterative talk to text cycles. As individuals express their interpretations of the current strategic plan in talk, they are able to make amendments to the text that then shape future textual versions of the plan. This iterative cycle is repeated until a final plan is agreed. We develop our findings into a model of the communication process that explains how texts become more authoritative over time and, in doing so, how they inscribe power relationships and social order within organizations. These findings contribute to the literature on the purposes of largely institutionalized processes of strategic planning and to the literature on organization as a communications process.

Relevance:

10.00%

Publisher:

Abstract:

Innovation events - the introduction of new products or processes - represent the end of a process of knowledge sourcing and transformation. They also represent the beginning of a process of exploitation which may result in an improvement in the performance of the innovating business. This recursive process of knowledge sourcing, transformation and exploitation we call the innovation value chain. Modelling the innovation value chain for a large group of manufacturing firms in Ireland and Northern Ireland highlights the drivers of innovation, productivity and firm growth. In terms of knowledge sourcing, we find strong complementarity between horizontal, forwards, backwards, public and internal knowledge sourcing activities. Each of these forms of knowledge sourcing also makes a positive contribution to innovation in both products and processes although public knowledge sources have only an indirect effect on innovation outputs. In the exploitation phase, innovation in both products and processes contributes positively to company growth, with product innovation having a short-term ‘disruption’ effect on labour productivity. Modelling the complete innovation value chain highlights the structure and complexity of the process of translating knowledge into business value and emphasises the role of skills, capital investment and firms’ other resources in the value creation process.

Relevance:

10.00%

Publisher:

Abstract:

Operators can become confused when diagnosing faults in a process plant that is in operation. This may prevent remedial actions being taken before hazardous consequences occur. The work in this thesis proposes a method to aid plant operators in systematically finding the causes of any fault in the process plant. A computer aided fault diagnosis package has been developed for use on the widely available IBM PC compatible microcomputer. The program displays a coloured diagram of a fault tree on the VDU of the microcomputer, so that the operator can see the link between the fault and its causes. The consequences of the fault and the causes of the fault are also shown to provide a warning of what may happen if the fault is not remedied. The cause and effect data needed by the package are obtained from a hazard and operability (HAZOP) study on the process plant. The result of the HAZOP study is recorded as cause and symptom equations which are translated into a data structure and stored in the computer as a file for the package to access. Probability values are assigned to the events that constitute the basic causes of any deviation. From these probability values, the a priori probabilities of occurrence of other events are evaluated. A top-down recursive algorithm, called TDRA, for evaluating the probability of every event in a fault tree has been developed. From the a priori probabilities, the conditional probabilities of the causes of the fault are then evaluated using Bayes' conditional probability theorem. The posterior probability values can then be used by the operators to check the causes of the fault in an orderly manner. The package has been tested using the results of a HAZOP study on a pilot distillation plant. The results from the test show how easy it is to trace the chain of events that leads to the primary cause of a fault. This method could be applied in a real process environment.
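The fault-tree probability evaluation described here can be illustrated with a minimal sketch. The recursion below is a generic top-down evaluation over AND/OR gates with a Bayesian ranking of one candidate cause; it is an illustration of the approach, not the thesis's actual TDRA code, and the tree and probabilities are invented:

```python
def event_prob(node):
    """Recursively evaluate the a priori probability of a fault-tree event,
    assuming independent basic events."""
    if "prob" in node:                 # basic event with an assigned probability
        return node["prob"]
    child_ps = [event_prob(c) for c in node["children"]]
    if node["gate"] == "AND":          # all causes must occur together
        p = 1.0
        for cp in child_ps:
            p *= cp
        return p
    q = 1.0                            # OR gate: at least one cause occurs
    for cp in child_ps:
        q *= (1.0 - cp)
    return 1.0 - q

tree = {"gate": "OR", "children": [
    {"gate": "AND", "children": [{"prob": 0.1}, {"prob": 0.2}]},  # pump fails AND valve stuck
    {"prob": 0.05},                                               # sensor drift
]}
p_top = event_prob(tree)   # a priori probability of the top-level fault

# Bayes' theorem then ranks candidate causes once the fault is observed:
# P(cause | fault) = P(fault | cause) * P(cause) / P(fault).
p_cause = 0.05              # P(sensor drift)
p_fault_given_cause = 1.0   # drift alone triggers the fault through the OR gate
posterior = p_fault_given_cause * p_cause / p_top
```

The operator would check causes in decreasing order of such posterior values.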

Relevance:

10.00%

Publisher:

Abstract:

Reflecting changes in the nature of governance, some have questioned whether Public Administration is now an historical anachronism. While a legitimate debate exists between sceptics and optimists, this special issue demonstrates grounds for optimism by indicating the continuing diversity and adaptability of the field of Public Administration. In this introduction, we first sketch the variety of intellectual traditions which comprise the field of modern Public Administration. We then consider institutional challenges facing the subject given considerable pressures towards disciplinary fragmentation, and ideological challenges arising from a new distrust of public provision in the UK. Despite these challenges, Public Administration continues to provide a framework to analyse the practice of government and governance, governing institutions and traditions, and their wider sociological context. It can also directly inform policy reform - even if this endeavour can have its own pitfalls and pratfalls for the 'engaged' academic. We further suggest that, rather than lacking theoretical rigour, new approaches are developing that recognise the structural and political nature of the determinants of public administration. Finally, we highlight the richness of modern comparative work in Public Administration. Researchers can usefully look beyond the Atlantic relationship for theoretical enhancement and also consider more seriously the recursive and complex nature of international pressures on public administration. © The Author(s) 2012.

Relevance:

10.00%

Publisher:

Abstract:

Sensitive and precise radioimmunoassays for insulin and glucagon have been established. Although it was possible to employ similar precepts to the development of both hormone assays, the establishment of a reliable glucagon radioimmunoassay was complicated by the poor immunogenicity and instability of the peptide. Thus, unlike insulin antisera which were prepared by monthly injection of guinea pigs with crystalline insulin emulsified in adjuvant, the successful production of glucagon antisera was accomplished by immunisation of rabbits and guinea pigs with glucagon covalently linked to bovine plasma albumin. The conventional chloramine-T iodination with purification by gel chromatography was only suitable for the production of labelled insulin. Quality tracer for use in the glucagon radioimmunoassay was prepared by trace iodination, with subsequent purification of monoiodinated glucagon by anion exchange chromatography. Separation of free and antibody bound moieties by coated charcoal was applicable to both hormone assays, and a computerised data processing system, relying on logit-log transformation, was used to analyse all assay results. The assays were employed to evaluate the regulation of endocrine pancreatic function and the role of insulin and glucagon in the pathogenesis of the obese hyperglycaemic syndrome in mice. In the homozygous (ob/ob) condition, mice of the Birmingham strain were characterised by numerous abnormalities of glucose homeostasis, several of which were detected in heterozygous (ob/+) mice. Obese mice exhibited pancreatic alpha cell dysfunction and hyperglucagonaemia. Investigation of this defect revealed a marked insensitivity of an insulin dependent glucose sensing mechanism that inhibited glucagon secretion. 
Although circulating glucagon was of minor importance in the maintenance of hyperinsulinaemia, lack of suppression of alpha cell function by glucose and insulin contributed significantly to both the insulin insensitivity and the hyperglycaemia of obese mice.
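The logit-log transformation mentioned in the assay data processing can be sketched as a standard-curve fit: the sigmoid dose-response curve is linearised as logit(B/B0) against log(dose) and inverted to read unknown samples. The dose-response values below are hypothetical, not the thesis's measurements:

```python
import math

def logit(p):
    return math.log(p / (1.0 - p))

# Illustrative standard-curve data: dose of unlabelled hormone vs. the
# bound fraction B/B0 (invented values for the sketch).
doses = [0.5, 1.0, 2.0, 4.0, 8.0]
b_over_b0 = [0.85, 0.72, 0.55, 0.38, 0.22]

# Logit-log transformation linearises the curve: logit(B/B0) = a + b*log(dose),
# fitted here by ordinary least squares.
xs = [math.log(d) for d in doses]
ys = [logit(y) for y in b_over_b0]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx

def estimate_dose(bound_fraction):
    """Invert the fitted line to read an unknown sample's dose."""
    return math.exp((logit(bound_fraction) - a) / b)
```

The slope is negative because bound tracer falls as the competing unlabelled hormone dose rises.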

Relevance:

10.00%

Publisher:

Abstract:

This thesis explores translating well-written sequential programs in a subset of the Eiffel programming language - without syntactic or semantic extensions - into parallelised programs for execution on a distributed architecture. The main focus is on constructing two object-oriented models: a theoretical self-contained model of concurrency which enables a simplified second model for implementing the compiling process. There is a further presentation of principles that, if followed, maximise the potential levels of parallelism. Model of Concurrency. The concurrency model is designed to be a straightforward target for mapping sequential programs onto, thus making them parallel. It aids the compilation process by providing a high level of abstraction, including a useful model of parallel behaviour which enables easy incorporation of message interchange, locking, and synchronization of objects. Further, the model is sufficiently complete that a compiler can be, and has been, practically built. Model of Compilation. The compilation-model's structure is based upon an object-oriented view of grammar descriptions and capitalises on both a recursive-descent style of processing and abstract syntax trees to perform the parsing. A composite-object view with an attribute grammar style of processing is used to extract sufficient semantic information for the parallelisation (i.e. code-generation) phase. Programming Principles. The set of principles presented is based upon information hiding, sharing and containment of objects and the dividing up of methods on the basis of a command/query division. When followed, the level of potential parallelism within the presented concurrency model is maximised. Further, these principles naturally arise from good programming practice. Summary. In summary this thesis shows that it is possible to compile well-written programs, written in a subset of Eiffel, into parallel programs without any syntactic additions or semantic alterations to Eiffel: i.e.
no parallel primitives are added, and the parallel program is modelled to execute with equivalent semantics to the sequential version. If the programming principles are followed, a parallelised program achieves the maximum level of potential parallelisation within the concurrency model.
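The recursive-descent, AST-building style of the compilation model can be illustrated on a toy grammar. The thesis parses an Eiffel subset; the arithmetic grammar below is an invented stand-in showing only the parsing style:

```python
# Toy recursive-descent parser building an abstract syntax tree.
# Grammar (illustrative only):
#   expr   -> term (('+' | '-') term)*
#   term   -> factor ('*' factor)*
#   factor -> NUMBER | '(' expr ')'

def parse(tokens):
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat(tok):
        nonlocal pos
        assert peek() == tok, f"expected {tok!r}, got {peek()!r}"
        pos += 1

    def factor():
        nonlocal pos
        if peek() == "(":
            eat("(")
            node = expr()
            eat(")")
            return node
        value = peek()          # a number token
        pos += 1
        return ("num", value)

    def term():
        node = factor()
        while peek() == "*":
            eat("*")
            node = ("*", node, factor())
        return node

    def expr():
        node = term()
        while peek() in ("+", "-"):
            op = peek()
            eat(op)
            node = (op, node, term())
        return node

    tree = expr()
    assert peek() is None, "trailing tokens"
    return tree

ast = parse([1, "+", 2, "*", "(", 3, "-", 4, ")"])
```

Each grammar rule becomes one mutually recursive function, and the returned tuples form the AST that a later phase (semantic analysis or code generation) would walk.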

Relevance:

10.00%

Publisher:

Abstract:

Geometric information relating to most engineering products is available in the form of orthographic drawings or 2D data files. For many recent computer based applications, such as Computer Integrated Manufacturing (CIM), these data are required in the form of a sophisticated model based on Constructive Solid Geometry (CSG) concepts. A recent novel technique in this area transfers 2D engineering drawings directly into a 3D solid model called `the first approximation'. In many cases, however, this does not represent the real object. In this thesis, a new method is proposed and developed to enhance this model. This method uses the notion of expanding an object in terms of other solid objects, which are either primitive or first approximation models. To achieve this goal, in addition to the prepared subroutine to calculate the first approximation model of input data, two other wireframe models are found for extraction of sub-objects. One is the wireframe representation on input, and the other is the wireframe of the first approximation model. A new fast method is developed for the latter special case wireframe, which is named the `first approximation wireframe model'. This method avoids the use of a solid modeller. Detailed descriptions of algorithms and implementation procedures are given. In these techniques utilisation of dashed line information is also considered in improving the model. Different practical examples are given to illustrate the functioning of the program. Finally, a recursive method is employed to automatically modify the output model towards the real object. Some suggestions for further work are made to increase the domain of objects covered, and provide a commercially usable package. It is concluded that the current method promises the production of accurate models for a large class of objects.
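The CSG notion of expanding an object in terms of other solid objects can be reduced to a minimal point-membership sketch. The predicate-based API below is an invented illustration, not the B-rep/wireframe representation the thesis actually works with:

```python
# Solids as point-membership predicates, combined with CSG operators.

def box(xmin, ymin, zmin, xmax, ymax, zmax):
    """Axis-aligned box primitive: True if point p lies inside."""
    return lambda p: (xmin <= p[0] <= xmax and
                      ymin <= p[1] <= ymax and
                      zmin <= p[2] <= zmax)

def union(a, b):
    return lambda p: a(p) or b(p)

def difference(a, b):
    return lambda p: a(p) and not b(p)

# An L-shaped bracket: a base block with one corner block removed,
# then a small lug added on top (all dimensions invented).
bracket = union(
    difference(box(0, 0, 0, 4, 4, 1), box(2, 2, 0, 4, 4, 1)),
    box(0, 0, 1, 1, 1, 2),
)

inside = bracket((1, 1, 0.5))    # in the remaining part of the base block
removed = bracket((3, 3, 0.5))   # in the subtracted corner
```

Composing primitives this way mirrors how the thesis refines the first approximation model toward the real object by adding and subtracting sub-objects.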

Relevance:

10.00%

Publisher:

Abstract:

How does a firm choose a proper mode of foreign direct investment (FDI) for entering a foreign market? Which mode of entry performs better? What are the performance implications of joint venture (JV) ownership structure? These important questions face a multinational enterprise (MNE) that decides to enter a foreign market. However, few studies have been conducted on such issues, and no consistent or conclusive findings have been generated, especially with respect to China. This thesis is composed of five chapters, providing corresponding answers to the questions given above. Specifically, Chapter One is an overall introductory chapter. Chapter Two is about the choice of entry mode of FDI in China. Chapter Three examines the relationship between four main entry modes and performance. Chapter Four explores the performance implications of JV ownership structure. Chapter Five is an overall concluding chapter. These empirical studies are based on the most recent and richest data that has never been explored in previous studies. It contains information on 11,765 foreign-invested enterprises in China in seven manufacturing industries in 2000, 10,757 in 1999, and 10,666 in 1998. The four FDI entry modes examined include wholly-owned enterprises (WOEs), equity joint ventures (EJVs), contractual joint ventures (CJVs), and joint stock companies (JSCs). In Chapter Two, a multinomial logit model is established, and techniques of multiple linear regression analysis are employed in Chapters Three and Four. It was found that MNEs, under the conditions of a good investment environment, large capital commitment and small cultural distance, prefer the WOE strategy. If these conditions are not met, the EJV mode would be of greater use. The relative propensity to pursue the CJV mode increases with a good investment environment, small capital commitment, and small cultural distance. 
JSCs are not favoured by MNEs when the investment environment improves and when affiliates are located in the coastal areas. MNEs have been found to have a greater preference for an EJV as a mode of entry into the Chinese market in all industries. It is also found that in terms of return on assets (ROA) and asset turnover, WOEs perform the best, followed by EJVs, CJVs, and JSCs. Finally, minority-owned EJVs or JSCs are found to outperform their majority-owned counterparts in terms of ROA and asset turnover.
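The multinomial logit model of entry-mode choice can be sketched as follows. The coefficients are invented for illustration and are not the chapter's estimates; only the structure (mode-specific utilities passed through a softmax) reflects the technique:

```python
import numpy as np

modes = ["WOE", "EJV", "CJV", "JSC"]
# Covariates: [investment_environment, capital_commitment, cultural_distance]
# One illustrative coefficient vector per entry mode.
betas = np.array([
    [ 1.0,  0.8, -0.9],   # WOE: good environment, large capital, small distance
    [-0.2, -0.5,  0.6],   # EJV: the fallback when WOE conditions fail
    [ 0.7, -0.8, -0.5],   # CJV
    [-0.9,  0.1,  0.0],   # JSC: disfavoured as the environment improves
])

def choice_probs(x):
    """Multinomial logit: P(mode m | x) = softmax(betas @ x)[m]."""
    u = betas @ x
    e = np.exp(u - u.max())   # numerically stable softmax
    return e / e.sum()

# Good environment, large capital commitment, small cultural distance:
p = choice_probs(np.array([1.0, 1.0, 0.1]))
preferred = modes[int(np.argmax(p))]
```

Estimation in practice would fit the betas by maximum likelihood on the firm-level data; the sketch only shows how fitted coefficients translate covariates into choice probabilities.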

Relevance:

10.00%

Publisher:

Abstract:

This thesis investigates the pricing-to-market (PTM) behaviour of the UK export sector. Unlike previous studies, this study econometrically tests for seasonal unit roots in the export prices prior to estimating PTM behaviour. Prior studies have seasonally adjusted the data automatically. This study's results show that monthly export prices contain little evidence of seasonal unit roots, implying that there is a loss of information about the data generating process of the series when estimating PTM using seasonally-adjusted data. Prior studies have also ignored the econometric properties of the data despite the existence of ARCH effects in such data. The standard approach has been to estimate PTM models using Ordinary Least Squares (OLS). For this reason, both EGARCH and GJR-EGARCH (hereafter GJR) estimation methods are used to estimate both a standard and an Error Correction Model (ECM) of PTM. The results indicate that PTM behaviour varies across UK sectors. The variables used in the PTM models are co-integrated and an ECM is a valid representation of pricing behaviour. The study also finds that the price adjustment is slower when the analysis is performed on real prices, i.e., data that are adjusted for inflation. There is strong evidence of auto-regressive conditional heteroscedasticity (ARCH) effects, meaning that the PTM parameter estimates of prior studies have been inefficiently estimated. Surprisingly, there is very little evidence of asymmetry. This suggests that exporters appear to PTM at a relatively constant rate. This finding might also explain the failure of prior studies to find evidence of asymmetric exposure in foreign exchange (FX) rates. This study also provides a cross-sectional analysis to explain the implications of the observed PTM of producers' marginal cost, market share and product differentiation. The cross-sectional regressions are estimated using OLS, Generalised Method of Moments (GMM) and Logit estimations. 
Overall, the results suggest that market share affects PTM positively. Exporters with smaller market share are more likely to operate PTM. Alternatively, product differentiation is negatively associated with PTM. So industries with highly differentiated products are less likely to adjust their prices. However, marginal costs seem not to be significantly associated with PTM. Exporters perform PTM to limit the FX rate effect pass-through to their foreign customers, but they also avoid exploiting PTM to the full, since doing so can substantially reduce their profits.
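The GJR asymmetry that distinguishes these estimates from a plain GARCH model can be sketched as a variance recursion: negative shocks raise next-period variance by an extra gamma term. Parameter values and data below are illustrative, not the thesis's estimates:

```python
import numpy as np

def gjr_garch_variance(eps, omega=0.05, alpha=0.08, gamma=0.10, beta=0.85):
    """GJR-GARCH(1,1) conditional variance recursion:
    sigma2[t] = omega + alpha*eps[t-1]^2
              + gamma*eps[t-1]^2 * 1(eps[t-1] < 0)   # leverage/asymmetry term
              + beta*sigma2[t-1]
    Parameters are invented for illustration."""
    sigma2 = np.empty_like(eps)
    sigma2[0] = np.var(eps)   # initialise at the sample variance
    for t in range(1, len(eps)):
        leverage = gamma * eps[t - 1] ** 2 * (eps[t - 1] < 0)
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + leverage + beta * sigma2[t - 1]
    return sigma2

rng = np.random.default_rng(1)
eps = rng.standard_normal(500)           # simulated shocks
sigma2 = gjr_garch_variance(eps)

# Asymmetry check: a negative shock of the same magnitude implies a higher
# conditional variance next period than a positive one, by exactly gamma*eps^2.
s_neg = gjr_garch_variance(np.array([-1.0, 0.0]))[1]
s_pos = gjr_garch_variance(np.array([1.0, 0.0]))[1]
```

In practice the parameters would be fitted by quasi-maximum likelihood rather than fixed as here.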

Relevance:

10.00%

Publisher:

Abstract:

Corporate restructuring is perceived as a challenge to research. Prior studies do not provide conclusive evidence regarding the effects of restructuring. Given these inconclusive findings, this research attempts to examine the effects of restructuring events amongst UK listed firms. The sample firms are listed on the LSE and the London AIM stock exchange. Only completed restructuring transactions are included in the study. The time horizon extends from 1999 to 2003. A three-year floating window is assigned to examine the sample firms. The key enquiry is to scrutinise the ex post effects of restructuring on performance and value measures of firms in contrast to a criteria-matched non-restructured sample. A cross-sectional study employing logit estimates is undertaken to examine the firm characteristics of the restructuring samples. Further, additional parameters, i.e. Conditional Volatility and Asymmetry, are generated under the GJR-GARCH estimate and reiterated in the logit models to capture time-varying heteroscedasticity of the samples. This research incorporates most forms of restructuring, while prior studies have examined only certain forms. In particular, these studies have made limited attempts to examine different restructuring events simultaneously. In addition to the logit analysis, an event study is adopted to evaluate the announcement effect of restructuring under both the OLS and GJR-GARCH estimates, supplementing our prior results. By engaging a composite empirical framework, our estimation method permits a full appreciation of the restructuring effect. The study provides evidence that restructurings have a non-trivial, significant positive effect. There is some evidence that the response differs with the type of restructuring, particularly when the event study is applied. The results establish that performance measures, i.e. 
Operating Profit Margin, Return on Equity, Return on Assets, Growth, Size, Profit Margin and Shareholders' Ownership, indicate consistent and significant increases. However, Leverage and Asset Turnover suggest a more modest influence of restructuring across the sample period. Similarly, value measures, i.e. Abnormal Returns, Return on Equity and Cash Flow Margin, suggest sizeable improvement. A notable characteristic seen consistently throughout the analysis is the decreasing proportion of Systematic Risk. Consistent with these findings, Conditional Volatility and Asymmetry exhibit a similar trend. The event study analysis suggests that, on average, the market perceives restructuring favourably and shareholders experience significant and systematic positive gains.
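The event-study component can be sketched in its OLS market-model form: estimate the firm's relation to the market over an estimation window, then cumulate abnormal returns (actual minus predicted) over the event window. The return series below are simulated purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
r_m = rng.normal(0.0, 0.01, 250)                       # market returns (simulated)
r_i = 0.001 + 1.2 * r_m + rng.normal(0, 0.005, 250)    # firm returns (simulated)
r_i[240:] += 0.01                                      # announcement effect injected at t=240

# OLS market model fitted on the estimation window: r_i = a + b * r_m.
est_m, est_i = r_m[:200], r_i[:200]
b = np.cov(est_i, est_m)[0, 1] / np.var(est_m, ddof=1)
a = est_i.mean() - b * est_m.mean()

# Abnormal returns over the event window, and their cumulative sum (CAR).
abnormal = r_i[240:] - (a + b * r_m[240:])
car = float(abnormal.sum())
```

A GJR-GARCH version, as used in the study, would additionally model the conditional variance of the residuals instead of assuming it constant.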