29 results for nonlocal theories and models
in Aston University Research Archive
Abstract:
Leadership categorisation theory suggests that followers rely on a hierarchical cognitive structure in perceiving leaders and the leadership process, consisting of three levels: superordinate, basic and subordinate. The predominant view is that followers rely on Implicit Leadership Theories (ILTs) at the basic level in making judgments about managers. The thesis examines whether this presumption holds by proposing and testing two competing conceptualisations: the congruence between basic-level ILTs (general leader) and perceptions of the actual manager, and the congruence between subordinate-level ILTs (job-specific leader) and perceptions of the actual manager. The conceptualisation at the job-specific level builds on context-related assertions of the ILT explanatory models: leadership categorisation, information processing and connectionist network theories. Further, the thesis addresses the effects of ILT congruence at the group level. The hypothesised model suggests that Leader-Member Exchange (LMX) acts as a mediator between ILT congruence and outcomes. Three studies examined the proposed model. The first was cross-sectional, with 175 students reporting on work experience during a one-year industrial placement. The second was longitudinal, with a sample of 343 students engaging in a business simulation in groups with formal leadership. The final study was a cross-sectional survey across several organisations with a sample of 178. A novel approach was taken to congruence analysis: the hypothesised models were tested using Latent Congruence Modelling (LCM), which accounts for measurement error and overcomes most of the limitations of traditional approaches. The first two studies confirm the traditionally theorised view that employees rely on basic-level ILTs in making judgments about their managers, and show that LMX mediates the relationship between ILT congruence and work-related outcomes (performance, job satisfaction, well-being, task satisfaction, intragroup conflict, group satisfaction, team realness, team-member exchange, group performance). The third study confirms this pattern with conflict, well-being, self-rated performance and commitment as outcomes.
Abstract:
A significant body of scholarly and practitioner-based research has developed in recent years that seeks both to theorize upon and to empirically measure the competitiveness of regions. However, the disparate and fragmented nature of this work has left the various analyses and measurement methodologies without a substantive theoretical foundation. The aim of this paper is to place the regional competitiveness discourse within the context of theories of economic growth, and more particularly those concerning regional economic growth. It is argued that regional competitiveness models are usually implicitly constructed in the lineage of endogenous growth frameworks, whereby deliberate investments in factors such as human capital and knowledge are considered to be key drivers of growth differentials. This leads to the suggestion that regional competitiveness can be usefully defined as the capacity and capability of regions to achieve economic growth relative to other regions at a similar overall stage of economic development, usually within their own nation or continental bloc. The paper further assesses future avenues for theoretical and methodological exploration, highlighting the role of institutions, resilience and well-being in understanding how the competitiveness of regions influences their long-term evolution.
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis takes as its starting point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in estimating these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data-flow fan-in/out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
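The classic metrics named in this abstract are defined by simple formulas over raw counts. As a minimal sketch of what extending them involves, the snippet below computes Halstead's measures and McCabe's cyclomatic complexity in Python; the Prolog counting rules in the comments are illustrative assumptions, not the thesis's actual redefinitions.

```python
import math

def halstead(n1, n2, N1, N2):
    """Halstead's measures from distinct (n1, n2) and total (N1, N2)
    operator/operand counts."""
    vocabulary = n1 + n2
    length = N1 + N2
    volume = length * math.log2(vocabulary)
    difficulty = (n1 / 2) * (N2 / n2)
    return {"vocabulary": vocabulary, "length": length,
            "volume": volume, "difficulty": difficulty,
            "effort": difficulty * volume}

def mccabe(decision_points):
    """Cyclomatic complexity of a single-entry, single-exit routine:
    binary decision points plus one."""
    return decision_points + 1

# Hypothetical Prolog mapping (illustrative only): count functors and
# built-ins as operators, atoms/variables as operands, and treat each
# extra clause, ';' and '->' as a decision point.
print(halstead(n1=12, n2=9, N1=40, N2=31))
print(mccabe(decision_points=5))
```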
Abstract:
As a discipline, supply chain management (SCM) has traditionally been concerned primarily with the procurement, processing, movement and sale of physical goods. However, an important class of products has emerged - digital products - which cannot be described as physical, as they do not obey commonly understood physical laws. They possess neither mass nor volume, and they require no energy in their manufacture or distribution. With the Internet, they can be distributed at speeds unimaginable in the physical world, and every copy produced is a 100% perfect duplicate of the original version. Furthermore, the ease with which digital products can be replicated has few analogues in the physical world. This paper assesses the effect of non-physicality on one such product - software - in relation to the practice of SCM. It explores the challenges that arise when managing the software supply chain and how practitioners are addressing these challenges. Using a two-pronged exploratory approach that combines a review of the literature on software management with direct interviews with software distribution practitioners, a number of key challenges associated with software supply chains are uncovered, along with responses to these challenges. The paper proposes a new model for software supply chains that takes into account the non-physicality of the product being delivered. Central to this model are the replacement of physical flows with flows of intellectual property, the growing importance of innovation over duplication, and the increased centrality of the customer in the entire process. Hybrid physical/digital supply chains are discussed and a framework for practitioners concerned with software supply chains is presented.
Abstract:
This paper critically evaluates the paradigm, theory and methodology that dominate research on related party transactions (RPTs). The literature has debated whether RPTs are a facet of the conflict of interest between majority and minority shareholders, or whether they are normal, efficient transactions that help firms achieve better asset utilization. The literature has also shown wide interest in the association between corporate governance and RPTs, particularly because, under agency theory, corporate governance as a monitoring tool is assumed to impede the negative consequences of RPTs and to ensure they are conducted to achieve better asset utilization.
Abstract:
Cognitive linguistics scholars argue that metaphor is fundamentally a conceptual process of mapping one domain of experience onto another. The study of metaphor in the context of Translation Studies has not, unfortunately, kept pace with discoveries about the nature and role of metaphor in the cognitive sciences. This study aims primarily to fill part of this gap in knowledge. Specifically, the thesis is an attempt to explore some implications of the conceptual theory of metaphor for translation. Because the study of metaphor in translation also rests on views about the nature of translation, the thesis first presents a general overview of the discipline of Translation Studies, describing the major models of translation. The study (in Chapter Two) then discusses the major traditional theories of metaphor (comparison, substitution and interaction theories) and shows how the ideas of those theories were adopted in specific translation studies of metaphor. After that, the study presents a detailed account of the conceptual theory of metaphor and some hypothetical implications for the study of metaphor in translation from the perspective of cognitive linguistics. The data and methodology are presented in Chapter Four. A novel classification of conceptual metaphor is presented which distinguishes between different source domains of conceptual metaphors: physical, human-life and intertextual. It is suggested that each source domain places different demands on translators. The major sources of data for this study are (1) the translations, produced by the Foreign Broadcast Information Service (FBIS), a translation service of the Central Intelligence Agency (CIA) in the United States of America, of a number of speeches by the Iraqi president Saddam Hussein during the Gulf Crisis (1990-1991), and (2) official (governmental) Omani translations of National Day speeches of Sultan Qaboos bin Said of Oman.
Abstract:
How are innovative new business models established if organizations constantly compare themselves against existing criteria and expectations? The objective is to address this question from the perspective of innovators and their ability to redefine established expectations and evaluation criteria. The research questions ask whether there are discernible patterns of discursive action through which innovators theorize institutional change, and what role such theorizations play in mobilizing support and realizing change projects. These questions are investigated through a case study of a critical area of enterprise computing software: Java application servers. In the present case, business practices and models were already well established among incumbents, with critical market areas allocated to a few dominant firms. Fringe players started experimenting with a new business approach of selling services around freely available open-source application servers. While most new players struggled, one new entrant succeeded in leading incumbents to adopt and compete on the new model. The case demonstrates that innovative and substantially new models and practices become established in organizational fields when innovators are able to redefine expectations and evaluation criteria within the field. The study addresses the theoretical paradox of embedded agency: actors who are embedded in prevailing institutional logics and structures find it hard to perceive potentially disruptive opportunities that fall outside existing ways of doing things, and changing those logics and structures requires strategic and institutional work aimed at overcoming barriers to innovation. The study addresses this problem through the lens of (new) institutional theory, using a discourse methodology to trace the process through which innovators were able to establish a new social and business model in the field.
Abstract:
Recent developments in service-oriented and distributed computing have created exciting opportunities for the integration of models in service chains to create the Model Web. This offers the potential for orchestrating web data and processing services in complex chains: a flexible approach which exploits the increased access to products and tools, and the scalability offered by the Web. However, the uncertainty inherent in data and models must be quantified and communicated in an interoperable way in order for its effects to be effectively assessed as errors propagate through complex automated model chains. We describe a proposed set of tools for handling, characterizing and communicating uncertainty in this context, and show how they can be used to 'uncertainty-enable' Web Services in a model chain. An example implementation is presented which combines environmental and publicly contributed data to produce estimates of sea-level air pressure, with estimates of uncertainty which incorporate the effects of model approximation as well as the uncertainty inherent in the observational and derived data.
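To make the propagation idea concrete, here is a minimal Monte Carlo sketch of an 'uncertainty-enabled' chain in Python. The station height, noise levels and the ~0.12 hPa/m reduction rule of thumb are illustrative assumptions; the paper's actual services and encodings are not specified in the abstract.

```python
import random
import statistics

def station_pressure_sample():
    """Hypothetical observational input: station pressure in hPa with
    Gaussian measurement noise (numbers invented for illustration)."""
    return random.gauss(1012.0, 1.5)

def reduce_to_sea_level(p_station, height_m):
    """Toy stand-in for a processing service in the chain; ~0.12 hPa per
    metre of station height is a common rule of thumb near sea level."""
    return p_station + 0.12 * height_m

def model_error_sample():
    """Hypothetical model-approximation error term."""
    return random.gauss(0.0, 0.8)

# Monte Carlo propagation: push whole samples through the service chain
# and summarise the output ensemble, rather than passing a single value.
ensemble = [reduce_to_sea_level(station_pressure_sample(), height_m=55.0)
            + model_error_sample()
            for _ in range(10_000)]
print(f"sea-level pressure: {statistics.mean(ensemble):.1f} "
      f"+/- {statistics.stdev(ensemble):.1f} hPa")
```

An interoperable service would exchange such ensembles (or distribution parameters) between chain steps instead of bare point values, so the final estimate carries both observational and model uncertainty.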
Abstract:
Feminist Translation Studies: Local and Transnational Perspectives situates feminist translation as political activism. Chapters highlight the multiple agendas and visions of feminist translation and the different political voices and cultural heritages through which it speaks across times and places, addressing the question of how both literary and nonliterary discourses migrate and contribute to local and transnational processes of feminist knowledge building and political activism. This collection does not pursue a narrow, fixed definition of feminism that is based solely on (Eurocentric or West-centric) gender politics—rather, Feminist Translation Studies: Local and Transnational Perspectives seeks to expand our understanding of feminist action not only to include feminist translation as resistance against multiple forms of domination, but also to rethink feminist translation through feminist theories and practices developed in different geohistorical and disciplinary contexts. In so doing, the collection expands the geopolitical, sociocultural and historical scope of the field from different disciplinary perspectives, pointing towards a more transnational, interdisciplinary and overtly political conceptualization of translation studies.
Abstract:
Surface compositional changes in GaAs due to RF plasmas of different gases have been investigated by XPS, and etch rates were measured using AFM. Angular-resolved XPS (ARXPS) was also employed for depth analysis of the composition of the surface layers. An important part of this study was the determination of oxide thickness from XPS data. The study of surface-plasma interaction was undertaken by correlating the results of surface analysis with plasma diagnosis. Different experiments were designed to measure accurately the binding energies associated with the Ga 3d, Ga 2p3/2 and Ga LMM peaks using XPS analysis, and to propose identifications in terms of the oxides of GaAs. Along with GaAs wafers, reference compounds such as metallic Ga and Ga2O3 powder were used. A separate study aimed at identifying the oxides formed on the GaAs surface during and after plasma processing was undertaken. Surface compositional changes after plasma treatment and prior to surface analysis are considered, with particular reference to the oxides formed in air on the activated surface. Samples exposed to ambient air for different periods of time, and also to pure oxygen, were analysed. Models of surface processes were proposed to explain the stoichiometry changes observed with the inert and reactive plasmas used. To help with understanding the mechanisms responsible for surface effects during plasma treatment, computer simulation using the SRIM code was also undertaken. Based on the simulation and experimental results, models of surface phenomena are proposed. The experimental and simulated results are discussed in relation to current theories and the published results of other authors. The experimental errors introduced by impurities, and also by data acquisition and processing, are also evaluated.
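A worked relation helps to show how oxide thickness can be extracted from XPS data. One standard first-order model (an assumption here; the thesis's exact method is not given in the abstract) treats the oxide as a uniform overlayer that exponentially attenuates the substrate signal:

```latex
I_s(d,\theta) = I_s^{\infty}\,\exp\!\left(-\frac{d}{\lambda\cos\theta}\right)
\qquad\Longrightarrow\qquad
d = \lambda\cos\theta\,\ln\frac{I_s^{\infty}}{I_s(d,\theta)}
```

where \(I_s\) is the GaAs substrate intensity measured through the oxide, \(I_s^{\infty}\) the intensity from a clean surface, \(\lambda\) the inelastic mean free path of the photoelectrons in the oxide, and \(\theta\) the emission angle from the surface normal. Varying \(\theta\), as in ARXPS, is what gives the depth sensitivity mentioned above.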
Abstract:
The fast spread of the Internet and increasing service demands are leading to radical changes in the structure and management of the underlying telecommunications systems. Active networks (ANs) offer the ability to program the network on a per-router, per-user, or even per-packet basis, and thus promise greater flexibility than current networks. For this new network paradigm to be widely accepted, a number of issues need to be solved, and management of the active network is one of the challenges. This thesis investigates an adaptive management solution based on a genetic algorithm (GA). The solution uses a bacterium-inspired distributed GA running on the active nodes within an active network to provide adaptive management for the network, especially for the service provision problems associated with future networks. The thesis also reviews the concepts, theories and technologies associated with the management solution. By exploring the implementation of these active nodes in hardware, the thesis demonstrates the possibility of implementing GA-based adaptive management in the real networks in use today. The concurrent programming language Handel-C is used to describe the design, and a reconfigurable computing platform based on an FPGA processing element is used for the hardware implementation. The experimental results demonstrate both the feasibility of the hardware implementation and the efficiency of the proposed management solution.
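As a minimal sketch of the kind of GA such a management solution might run (written in Python rather than Handel-C, and with a made-up fitness function, since the abstract does not give the thesis's actual objective), consider evolving which active nodes should host a service:

```python
import random

NODES = 16           # active nodes in the (hypothetical) network
POP, GENS = 30, 60   # GA parameters chosen arbitrarily for the sketch
demand = [random.random() for _ in range(NODES)]  # per-node service demand

def fitness(genome):
    """Toy objective: reward serving demand, penalise deployed copies.
    This stands in for the thesis's service-provision objective."""
    served = sum(d for g, d in zip(genome, demand) if g)
    return served - 0.3 * sum(genome)

def mutate(genome, rate=0.05):
    return [1 - g if random.random() < rate else g for g in genome]

def crossover(a, b):
    cut = random.randrange(1, NODES)  # single-point crossover
    return a[:cut] + b[cut:]

pop = [[random.randint(0, 1) for _ in range(NODES)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]          # truncation selection
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP - len(parents))]
    pop = parents + children

best = max(pop, key=fitness)
print("deploy service on nodes:", [i for i, g in enumerate(best) if g])
```

In the distributed, bacterium-inspired variant described in the thesis, each node would hold part of the population and exchange genetic material with neighbours rather than running this loop centrally.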
Abstract:
In induction machines the tooth-frequency losses due to permeance variation constitute a significant portion of the total loss. In order to predict and estimate these losses it is essential to obtain a clear understanding of the no-load distribution of the air gap magnetic field and the magnitude of the flux pulsation in both stator and rotor teeth. The existing theories and methods by which the air gap permeance variation in a doubly slotted structure is calculated are either empirical or restricted. The main objective of this thesis is to obtain a detailed analysis of the no-load air gap magnetic field distribution and the effect of air gap geometry on the magnitude and waveform of the tooth flux pulsation. The detailed theoretical and experimental analysis of flux distribution presented in this thesis not only leads to a better understanding of the distribution of no-load losses but also provides a theoretical basis for calculating the losses with greater accuracy.
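For orientation, one common first-order way of writing the air gap permeance of a doubly slotted structure (a textbook-style approximation of the kind the thesis goes beyond, not its own analysis) multiplies the stator and rotor slot-harmonic series:

```latex
\Lambda(\theta,t) \approx \Lambda_0
\left[1+\sum_{m\ge 1} a_m \cos(m Z_s \theta)\right]
\left[1+\sum_{n\ge 1} b_n \cos\bigl(n Z_r(\theta-\omega_r t)\bigr)\right]
```

where \(Z_s\) and \(Z_r\) are the stator and rotor slot numbers and \(\omega_r\) the rotor angular speed. The flux pulsation seen by a stator tooth then occurs at the rotor slot-passing (tooth) frequencies, e.g. \(f = n Z_r \omega_r / 2\pi\).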
Abstract:
This research focuses on two groups of local companies, namely high-growth local companies and other local companies, to examine and compare the influence of utilising governmental initiatives, servicing foreign MNCs and internationalisation on their strategic planning processes. The thesis argues that an organisation's approach to strategic planning is determined not only by internal influences, namely firm size and the planning behaviour and attitude of the entrepreneur, as revealed in the literature, but also by external influences. The theoretical contribution of this research is to examine this unique situation in Singapore and to test the robustness of the conventional models of planning in smaller companies. As a result of the external influences, this study reveals that local companies are more likely to undertake much more formal strategic planning than the conventional Western literature and models would indicate. High-growth local companies, in comparison, had undertaken a more formal and rigorous strategic planning process than other local companies.
Abstract:
This research project focused upon the design strategies adopted by expert and novice designers. It was based upon a desire to compare the design problem-solving strategies of novices, in this case Key Stage 3 pupils studying technology within the United Kingdom National Curriculum, with designers who could be considered to have developed expertise. The findings helped to provide insights into potential teaching strategies to suit novice designers. Verbal protocols were made as samples of expert and novice designers solved a design problem and talked aloud as they worked. The verbalisations were recorded on videotape. The protocols were transcribed and segmented, with each segment assigned to a predetermined coding system representing a model of design problem solving. The results of the encoding were analysed, and consideration was also given to the general design strategies and heuristics used by the expert and novice designers. The drawings and models produced during the generation of the protocols were also analysed and considered. A number of significant differences between the problem-solving strategies adopted by the expert and novice designers were identified. First, differences were observed in the way expert and novice designers used the problem statement and solution validation during the process. Differences were also identified in the way holistic solutions were generated near the start of the process, and in the cycles of exploration and the processes of integration. The way design and technological knowledge was used provided further insight into the differences between experts and novices, as did the role of drawing and modelling during the process. In more general terms, differences were identified in the heuristics and overall design strategies adopted by the expert and novice designers. These findings provided a basis for discussing teaching strategies appropriate for novice designers. Finally, opportunities for future research were discussed.