36 results for Gradient-based approaches
in Aston University Research Archive
Abstract:
This study presents some quantitative evidence from a number of simulation experiments on the accuracy of the productivity growth estimates derived from growth accounting (GA) and frontier-based methods (namely data envelopment analysis-, corrected ordinary least squares-, and stochastic frontier analysis-based Malmquist indices) under various conditions. These include the presence of technical inefficiency, measurement error, misspecification of the production function (for the GA and parametric approaches) and increased input and price volatility from one period to the next. The study finds that the frontier-based methods usually outperform GA, but the overall performance varies by experiment. Parametric approaches generally perform best when there is no functional form misspecification, but their accuracy greatly diminishes otherwise. The results also show that the deterministic approaches perform adequately even under conditions of (modest) measurement error; when measurement error becomes larger, the accuracy of all approaches (including the stochastic ones) deteriorates rapidly, to the point that their estimates could be considered unreliable for policy purposes.
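For orientation, the growth-accounting baseline compared in this study amounts to computing a Solow residual: output growth minus cost-share-weighted input growth. The sketch below is purely illustrative (it is not the authors' simulation code), and all names and numbers in it are hypothetical.

```python
import numpy as np

def ga_tfp_growth(output, inputs, cost_shares):
    """Growth-accounting (Solow residual) TFP growth between two periods.

    output:      array of shape (2,) with output in periods t and t+1
    inputs:      array of shape (2, k) with k input quantities per period
    cost_shares: array of shape (k,) with input cost shares (summing to 1)
    """
    d_log_y = np.log(output[1]) - np.log(output[0])   # output growth
    d_log_x = np.log(inputs[1]) - np.log(inputs[0])   # growth of each input
    return d_log_y - np.dot(cost_shares, d_log_x)     # residual = TFP growth

# Toy example: one output, two inputs (labour, capital) with 0.6/0.4 cost shares.
y = np.array([100.0, 104.0])
x = np.array([[50.0, 20.0],
              [51.0, 20.5]])
shares = np.array([0.6, 0.4])
print(ga_tfp_growth(y, x, shares))  # ~0.017, i.e. roughly 1.7% TFP growth
```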
Abstract:
Relationship-based approaches to leadership (e.g., Leader–Member Exchange theory) currently represent one of the most popular approaches to understanding workplace leadership. Although the concept of “relationship” is central to these approaches, generally this has not been well articulated and is often conceptualized simply in terms of relationship quality between the leader and the follower. In contrast, research in the wider relationship science domain provides a more detailed exposition of relationships and how they form and develop. We propose that research and methodology developed in relationship science (i.e., close relationships) can enhance understanding of the leader–follower relationship and therefore advance theory in this area. To address this issue, we organize our review in two areas. First, we examine how a social cognitive approach to close relationships can benefit an understanding of the leader–follower relationship (in terms of structure, content, and processes). Second, we show how the research designs and methodologies that have been developed in relationship science can be applied to understand better the leader–follower relationship. The cross-fertilization of research from the close relationships literature to understanding the leader–follower relationship provides new insights into leadership processes and potential avenues for further research. Copyright © 2013 John Wiley & Sons, Ltd.
Abstract:
Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.
Abstract:
The research described in this PhD thesis focuses on proteomics approaches to study the effect of oxidation on the modification status and protein-protein interactions of PTEN, a redox-sensitive phosphatase involved in a number of cellular processes including metabolism, apoptosis, cell proliferation, and survival. While direct evidence of a redox regulation of PTEN and its downstream signaling has been reported, the effect of cellular oxidative stress or direct PTEN oxidation on PTEN structure and interactome is still poorly defined. In a first study, GST-tagged PTEN was directly oxidized over a range of hypochlorous acid (HOCl) concentrations, assayed for phosphatase activity, and oxidative post-translational modifications (oxPTMs) were quantified using LC-MS/MS-based label-free methods. In a second study, GST-tagged PTEN was prepared in a reduced and reversibly H2O2-oxidized form, immobilized on a resin support and incubated with HCT116 cell lysate to capture PTEN-interacting proteins, which were analyzed by LC-MS/MS and comparatively quantified using label-free methods. In parallel experiments, HCT116 cells transfected with a GFP-tagged PTEN were treated with H2O2 and PTEN-interacting proteins were immunoprecipitated using standard methods. Several high-abundance HOCl-induced oxPTMs were mapped, including those taking place at amino acids known to be important for PTEN phosphatase activity and protein-protein interactions, such as Met35, Tyr155, Tyr240 and Tyr315. A PTEN redox interactome was also characterized, which identified a number of PTEN-interacting proteins that vary with the reversible inactivation of PTEN caused by H2O2 oxidation. These included new PTEN interactors as well as the redox proteins peroxiredoxin-1 (Prdx1) and thioredoxin (Trx), which are known to be involved in the recycling of the PTEN active site following H2O2-induced reversible inactivation. The results suggest that the oxidative modification of PTEN causes functional alterations in PTEN structure and interactome, with fundamental implications for the PTEN signaling role in many cellular processes, such as those involved in the pathophysiology of disease and ageing.
Abstract:
Most object-based approaches to Geographical Information Systems (GIS) have concentrated on the representation of geometric properties of objects in terms of fixed geometry. In our road traffic marking application domain, we have a requirement to represent the static locations of the road markings but also to enforce the associated regulations, which are typically geometric in nature. For example, a give-way line of a pedestrian crossing in the UK must be within 1100-3000 mm of the edge of the crossing pattern. In previous studies of the application of spatial rules (often called 'business logic') in GIS, emphasis has been placed on the representation of topological constraints and data integrity checks. There is very little GIS literature that describes models for geometric rules, although there are some examples in the Computer Aided Design (CAD) literature. This paper introduces some of the ideas from so-called variational CAD models to the GIS application domain, and extends these using a Geography Markup Language (GML)-based representation. In our application we have an additional requirement: the geometric rules are often changed and vary from country to country, so they should be represented in a flexible manner. In this paper we describe an elegant solution to the representation of geometric rules, such as requiring lines to be offset from other objects. The method uses a feature-property model embraced in GML 3.1 and extends the possible relationships in feature collections to permit the application of parameterized geometric constraints to sub-features. We show the parametric rule model we have developed and discuss the advantage of using simple parametric expressions in the rule base. We discuss the possibilities and limitations of our approach and relate our data model to GML 3.1. © 2006 Springer-Verlag Berlin Heidelberg.
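To make the idea of a parameterised geometric rule concrete, the following pure-Python sketch checks a give-way-line offset rule of the kind described above (1100-3000 mm from the crossing edge). It is an illustration only, not the GML-based representation developed in the paper, and all names in it are invented.

```python
from dataclasses import dataclass

@dataclass
class OffsetRule:
    """Parameterised geometric rule: a feature must lie within
    [min_offset, max_offset] (mm) of a reference feature."""
    min_offset: float
    max_offset: float

def point_segment_distance(p, a, b):
    """Minimum distance from point p to segment a-b (2D tuples, mm)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    cx, cy = ax + t * dx, ay + t * dy
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5

def check_offset(rule, line, reference):
    """Check every vertex of `line` against the nearest segment of `reference`."""
    for p in line:
        d = min(point_segment_distance(p, reference[i], reference[i + 1])
                for i in range(len(reference) - 1))
        if not (rule.min_offset <= d <= rule.max_offset):
            return False
    return True

# Give-way line must be 1100-3000 mm from the crossing edge (UK example in the text).
rule = OffsetRule(min_offset=1100, max_offset=3000)
crossing_edge = [(0, 0), (4000, 0)]
give_way_line = [(0, 1500), (4000, 1500)]
print(check_offset(rule, give_way_line, crossing_edge))  # True: 1500 mm offset
```

Because the rule is held as parameters rather than fixed geometry, the same checker can be reconfigured per country simply by changing the offset values, which is the flexibility requirement the paper highlights.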
Abstract:
Software development methodologies are becoming increasingly abstract, progressing from low-level assembly and implementation languages such as C and Ada, to component-based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model-driven approaches emphasise the role of higher-level models and notations, and embody a process of automatically deriving lower-level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data-driven application systems. The combination of empowered data formats and high-level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low-level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component-based software development and game development technologies in order to define novel component technologies for the description of data-driven component-based applications. The thesis makes explicit contributions to the fields of component-based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. The thesis also proposes a number of developments in dynamic data-driven application systems in order to further empower the role of data in this field.
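As a rough illustration of the data-driven, component-based style described above (not the Fluid project's actual technology), the sketch below instantiates and runs components purely from a content description; the registry, classes and JSON schema are all hypothetical.

```python
import json

# Hypothetical component registry: maps type names in the content data
# to classes that implement the runtime behaviour.
class Renderer:
    def __init__(self, width, height):
        self.width, self.height = width, height
    def run(self):
        print(f"rendering at {self.width}x{self.height}")

class PhysicsEngine:
    def __init__(self, gravity):
        self.gravity = gravity
    def run(self):
        print(f"simulating physics with gravity={self.gravity}")

REGISTRY = {"renderer": Renderer, "physics": PhysicsEngine}

# The application is assembled purely from data: the same runtime can host
# very different applications by swapping this self-describing content file.
content = json.loads("""
{
  "components": [
    {"type": "renderer", "params": {"width": 1280, "height": 720}},
    {"type": "physics",  "params": {"gravity": 9.81}}
  ]
}
""")

app = [REGISTRY[c["type"]](**c["params"]) for c in content["components"]]
for component in app:
    component.run()
```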
Abstract:
In April 2009, Google Images added a filter for narrowing search results by colour. Several other systems for searching image databases by colour were also released around this time. These colour-based image retrieval systems enable users to search image databases either by selecting colours from a graphical palette (i.e., query-by-colour), by drawing a representation of the colour layout sought (i.e., query-by-sketch), or both. It was comments left by readers of online articles describing these colour-based image retrieval systems that provided us with the inspiration for this research. We were surprised to learn that the underlying query-based technology used in colour-based image retrieval systems today remains remarkably similar to that of systems developed nearly two decades ago. Discovering this ageing retrieval approach, as well as uncovering a large user demographic requiring image search by colour, made us eager to research more effective approaches for colour-based image retrieval. In this thesis, we detail two user studies designed to compare the effectiveness of systems adopting similarity-based visualisations, query-based approaches, or a combination of both, for colour-based image retrieval. In contrast to query-based approaches, similarity-based visualisations display and arrange database images so that images with similar content are located closer together on screen than images with dissimilar content. This removes the need for queries, as users can instead visually explore the database using interactive navigation tools to retrieve images from the database. As we found existing evaluation approaches to be unreliable, we describe how we assessed and compared systems adopting similarity-based visualisations, query-based approaches, or both, meaningfully and systematically using our Mosaic Test - a user-based evaluation approach in which evaluation study participants complete an image mosaic of a predetermined target image using the colour-based image retrieval system under evaluation.
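For context, the long-standing query-by-colour approach referred to above can be sketched as ranking database images by how well their colour histograms overlap with a palette-selected query colour. The following is a minimal, hypothetical illustration, not the systems evaluated in the thesis.

```python
import numpy as np

def colour_histogram(image, bins=8):
    """Joint RGB histogram (bins^3 cells), normalised to sum to 1.
    `image` is an (H, W, 3) uint8 array."""
    pixels = image.reshape(-1, 3).astype(float)
    hist, _ = np.histogramdd(pixels, bins=(bins, bins, bins),
                             range=((0, 256), (0, 256), (0, 256)))
    return hist.ravel() / hist.sum()

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1 means identical colour distributions."""
    return np.minimum(h1, h2).sum()

def query_by_colour(query_rgb, database):
    """Rank database images by how much of the query colour they contain.
    `query_rgb` is a single (r, g, b) colour picked from a palette."""
    query_image = np.full((1, 1, 3), query_rgb, dtype=np.uint8)
    q = colour_histogram(query_image)
    scores = [(name, histogram_intersection(q, colour_histogram(img)))
              for name, img in database.items()]
    return sorted(scores, key=lambda s: s[1], reverse=True)

# Toy database: a red-ish and a blue-ish image.
db = {
    "sunset": np.full((16, 16, 3), (230, 20, 20), dtype=np.uint8),
    "ocean":  np.full((16, 16, 3), (20, 20, 230), dtype=np.uint8),
}
print(query_by_colour((255, 0, 0), db))  # "sunset" ranks first for a red query
```

A similarity-based visualisation, by contrast, would arrange the whole database on screen by histogram similarity and let the user navigate it directly, with no query at all.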
Abstract:
Image database visualisations, in particular mapping-based visualisations, provide an interesting approach to accessing image repositories as they are able to overcome some of the drawbacks associated with retrieval-based approaches. However, making a mapping-based approach work efficiently on large remote image databases has yet to be explored. In this paper, we present the Web-Based Images Browser (WBIB), a novel system that efficiently employs image pyramids to reduce bandwidth requirements so that users can interactively explore large remote image databases. © 2013 Authors.
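The image-pyramid idea can be illustrated with a minimal sketch: store progressively downsampled copies of each image so that a remote browser only fetches the resolution matching its current zoom. The code below is a toy (simple 2x2 average pooling in NumPy), not the WBIB implementation.

```python
import numpy as np

def build_pyramid(image, min_size=64):
    """Return a list of progressively half-resolution copies of `image`
    (an (H, W, 3) array), finest level first, stopping at `min_size`."""
    levels = [image]
    while min(levels[-1].shape[:2]) // 2 >= min_size:
        h, w, c = levels[-1].shape
        h2, w2 = h // 2 * 2, w // 2 * 2          # crop to an even size
        block = levels[-1][:h2, :w2].reshape(h2 // 2, 2, w2 // 2, 2, c)
        levels.append(block.mean(axis=(1, 3)).astype(image.dtype))  # 2x2 average pooling
    return levels

# A remote browser would request only the level matching its current zoom,
# keeping bandwidth roughly proportional to what is actually displayed.
img = np.random.randint(0, 256, size=(1024, 768, 3), dtype=np.uint8)
for level, im in enumerate(build_pyramid(img)):
    print(level, im.shape)   # (1024, 768, 3), (512, 384, 3), (256, 192, 3), ...
```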
Abstract:
This paper takes a critical and evaluative stance toward micro-activity-based approaches to understanding strategy. It argues that such approaches bring with them important theoretical and empirical challenges. The paper argues against a tendency to reductionism without equal emphasis on the contextual influences that bound micro-strategising. Finally, the paper argues for a more international and comparative approach to micro-strategy studies than has currently been the case.
Abstract:
A simple method for training the dynamical behavior of a neural network is derived. It is applicable to any training problem in discrete-time networks with arbitrary feedback. The algorithm resembles back-propagation in that an error function is minimized using a gradient-based method, but the optimization is carried out in the hidden part of state space either instead of, or in addition to, weight space. Computational results are presented for some simple dynamical training problems, one of which requires response to a signal 100 time steps in the past.
Abstract:
A simple method for training the dynamical behavior of a neural network is derived. It is applicable to any training problem in discrete-time networks with arbitrary feedback. The method resembles back-propagation in that it is a least-squares, gradient-based optimization method, but the optimization is carried out in the hidden part of state space instead of weight space. A straightforward adaptation of this method to feedforward networks offers an alternative to training by conventional back-propagation. Computational results are presented for simple dynamical training problems, with varied success. The failures appear to arise when the method converges to a chaotic attractor. A patch-up for this problem is proposed. The patch-up involves a technique for implementing inequality constraints which may be of interest in its own right.
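To ground the comparison with back-propagation made in the two abstracts above, the sketch below trains a small discrete-time recurrent network on a toy delayed-response task using plain gradient descent in weight space (ordinary backpropagation through time). It deliberately does not implement the papers' state-space method; it only illustrates the conventional gradient-based baseline they contrast with, using a much shorter delay than the 100 steps mentioned, and all sizes and learning rates are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy delayed-response task: the scalar output at the final step must equal
# the input presented `delay` steps earlier; the other inputs are noise.
delay, T, n_hidden = 5, 8, 16

def make_sequence():
    x = rng.normal(size=(T, 1))
    target = x[T - 1 - delay, 0]        # the signal to remember
    return x, target

# Recurrent network with feedback: h_t = tanh(Wxh x_t + Whh h_{t-1} + bh).
Wxh = rng.normal(scale=0.3, size=(n_hidden, 1))
Whh = rng.normal(scale=0.3, size=(n_hidden, n_hidden))
bh = np.zeros(n_hidden)
Why = rng.normal(scale=0.3, size=(1, n_hidden))
by = np.zeros(1)

def forward(x):
    hs = [np.zeros(n_hidden)]
    for t in range(T):
        hs.append(np.tanh(Wxh @ x[t] + Whh @ hs[-1] + bh))
    y = Why @ hs[-1] + by               # read out only at the final step
    return hs, y[0]

lr = 0.05
for step in range(2000):
    x, target = make_sequence()
    hs, y = forward(x)
    err = y - target                    # squared-error loss: 0.5 * err**2

    # Backpropagation through time: gradients of the loss w.r.t. the weights.
    dWhy = err * hs[-1][None, :]
    dby = np.array([err])
    dh = err * Why[0]
    dWxh, dWhh, dbh = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(bh)
    for t in reversed(range(T)):
        dz = (1.0 - hs[t + 1] ** 2) * dh
        dWxh += dz[:, None] * x[t][None, :]
        dWhh += dz[:, None] * hs[t][None, :]
        dbh += dz
        dh = Whh.T @ dz

    for W, dW in ((Wxh, dWxh), (Whh, dWhh), (bh, dbh), (Why, dWhy), (by, dby)):
        W -= lr * dW                    # plain gradient-descent step in weight space

    if step % 500 == 0:
        print(step, 0.5 * err ** 2)     # monitor the (noisy, per-sequence) loss
```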
Abstract:
This technical report builds on previous reports to derive the likelihood and its derivatives for a Gaussian Process with a covariance function based on the modified Bessel function. The full derivation is shown. The likelihood (with gradient information) can be used in maximum likelihood procedures (i.e. gradient-based optimisation) and in Hybrid Monte Carlo sampling (i.e. within a Bayesian framework).
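As a hedged illustration of how such a likelihood and its gradient feed a gradient-based optimiser (this is not the report's derivation), the sketch below uses the Matérn 3/2 member of the modified-Bessel covariance family, which has a closed form, and checks the analytic length-scale gradient of the Gaussian-process log marginal likelihood against a finite difference. All hyperparameter values and data are arbitrary.

```python
import numpy as np

def matern32(r, sigma_f, ell):
    """Matérn 3/2 covariance (closed form of the modified-Bessel family for nu = 3/2)."""
    a = np.sqrt(3.0) * r / ell
    return sigma_f**2 * (1.0 + a) * np.exp(-a)

def dmatern32_dell(r, sigma_f, ell):
    """Derivative of the Matérn 3/2 covariance w.r.t. the length-scale ell."""
    a = np.sqrt(3.0) * r / ell
    return sigma_f**2 * (a**2 / ell) * np.exp(-a)

def log_marginal_likelihood_and_grad(x, y, sigma_f, ell, sigma_n):
    """GP log marginal likelihood and its gradient w.r.t. ell.

    L       = -0.5 y^T K^-1 y - 0.5 log|K| - n/2 log(2 pi),  K = K_f + sigma_n^2 I
    dL/dell = 0.5 tr((alpha alpha^T - K^-1) dK/dell),         alpha = K^-1 y
    """
    n = len(x)
    r = np.abs(x[:, None] - x[None, :])
    K = matern32(r, sigma_f, ell) + sigma_n**2 * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    log_lik = (-0.5 * y @ alpha
               - np.log(np.diag(L)).sum()
               - 0.5 * n * np.log(2.0 * np.pi))
    K_inv = np.linalg.solve(L.T, np.linalg.solve(L, np.eye(n)))
    dK = dmatern32_dell(r, sigma_f, ell)
    grad_ell = 0.5 * np.trace((np.outer(alpha, alpha) - K_inv) @ dK)
    return log_lik, grad_ell

# Check the analytic gradient against a central finite difference at ell = 1.0.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 5.0, 30)
y = np.sin(x) + 0.1 * rng.normal(size=x.size)
eps = 1e-5
lik, grad = log_marginal_likelihood_and_grad(x, y, sigma_f=1.0, ell=1.0, sigma_n=0.1)
lik_p, _ = log_marginal_likelihood_and_grad(x, y, 1.0, 1.0 + eps, 0.1)
lik_m, _ = log_marginal_likelihood_and_grad(x, y, 1.0, 1.0 - eps, 0.1)
print(grad, (lik_p - lik_m) / (2 * eps))   # the two numbers should agree closely
```

In a maximum-likelihood procedure one would hand `log_marginal_likelihood_and_grad` to a gradient-based optimiser; in Hybrid Monte Carlo the same gradient drives the simulated dynamics.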
Abstract:
Purpose: The purpose of this paper is to describe how the application of systems thinking to designing, managing and improving business processes has resulted in a new and unique holonic-based process modeling methodology known as process orientated holonic modeling. Design/methodology/approach: The paper describes key systems thinking axioms that are built upon in an overview of the methodology; the techniques are described using an example taken from a large organization designing and manufacturing capital goods equipment operating within a complex and dynamic environment. These were produced in an 18 month project, using an action research approach, to improve quality and process efficiency. Findings: The findings of this research show that this new methodology can support process depiction and improvement in industrial sectors which are characterized by environments of high variety and low volume (e.g. projects, such as the design and manufacture of a radar system or a hybrid production process) which do not provide repetitive learning opportunities. In such circumstances, the methodology has not only been able to deliver holonic-based process diagrams but also to transfer strategic vision from top management to middle and operational levels without being reductionistic. Originality/value: This paper will be of interest to organizational analysts looking at large complex projects who require a methodology that does not confine them to thinking reductionistically in "task-breakdown" based approaches. The novel ideas in this paper have great impact on the way analysts should perceive organizational processes. Future research is applying the methodology in similar environments in other industries. © Emerald Group Publishing Limited.
Abstract:
The research developed in this thesis explores the sensing and inference of human movement in a dynamic way, as opposed to conventional measurement systems, which are only concerned with discrete evaluations of stimuli in sequential time. Typically, conventional approaches are used to infer the dynamic movement of the body, such as vision and motion tracking devices, with either a human diagnosis or a complex image processing algorithm to classify the movement. This research is therefore the first of its kind to attempt to provide a movement-classifying algorithm through the use of minimal sensing points; the application for this novel system is to classify human movement during a golf swing. There are two main categories of force sensing. Firstly, array-type systems, consisting of many sensing elements, are the most commonly researched and commercially available. Secondly, reduced force sensing element systems (RFSES), also known as distributive systems, have only recently been exploited in the academic world. The fundamental difference between these systems is that array systems handle the data captured from each sensor as unique outputs and suffer the effects of resolution. The effect of resolution is the error in the load position measurement between sensing elements, as the output is quantized in terms of position. This can be compared to a reduced sensor element system that maximises the data received through the coupling of data from a distribution of sensing points to describe the output in discrete time. This can also be extended to a coupling of transients in the time domain to describe an activity or dynamic movement. It is the RFSES that is to be examined and exploited in the commercial sector due to its advantages over array-based approaches, such as reduced design complexity, computational complexity and cost.
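To illustrate the resolution argument above (only as a toy, not the thesis's golf-swing system), the sketch below compares an array-style estimate, which reports the position of the strongest of a few hypothetical sensing elements and is therefore quantised to the element spacing, with a distributive-style estimate that couples all element outputs into one interpolated position. The element layout and response model are invented for illustration.

```python
import numpy as np

# Four sensing elements along a 300 mm beam (hypothetical spacing).
element_positions = np.array([0.0, 100.0, 200.0, 300.0])

def sensor_readings(load_position, load=1.0, spread=60.0):
    """Simulated element outputs for a point load: each element responds more
    strongly the closer it is to the load (a stand-in for real beam mechanics)."""
    return load * np.exp(-0.5 * ((element_positions - load_position) / spread) ** 2)

def array_estimate(readings):
    """Array-style output: report the position of the strongest element,
    so the estimate is quantised to the element spacing."""
    return element_positions[np.argmax(readings)]

def distributive_estimate(readings):
    """Distributive (coupled) output: combine all elements into one continuous
    estimate, here a simple intensity-weighted centroid."""
    return float(np.dot(readings, element_positions) / readings.sum())

true_position = 130.0
r = sensor_readings(true_position)
print(array_estimate(r))         # 100.0  (quantised to the nearest element)
print(distributive_estimate(r))  # ~130   (interpolated between elements)
```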