45 results for development need

em Aston University Research Archive


Relevance: 30.00%

Publisher:

Abstract:

Recent discussion of the knowledge-based economy draws increasing attention to the role that the creation and management of knowledge play in economic development. Development of human capital, the principal mechanism for knowledge creation and management, has become a central issue for policy-makers and practitioners at the regional as well as the national level. Facing competition both within and across nations, regional policy-makers view human capital development as a key to strengthening the position of their economies in the global market. Against this background, the aim of this study is to go some way towards answering the question of whether, and how, investment in education and vocational training at the regional level provides these territorial units with comparative advantages. The study reviews the literature in economics and economic geography on economic growth (Chapter 2). In the growth model literature, human capital has gained increasing recognition as a key production factor alongside physical capital and labour. Although they leave technical progress as an exogenous factor, neoclassical Solow-Swan models have improved their estimates through the inclusion of human capital. In contrast, endogenous growth models place investment in research at centre stage in accounting for technical progress; as a result, they often focus upon research workers, who embody high-order human capital, as a key variable in their framework. A recurring question is how human capital facilitates economic growth: is it the level of its stock or its accumulation that influences the rate of growth? In addition, these economic models are criticised in the economic geography literature for their failure to consider spatial aspects of economic development, and particularly for their lack of attention to tacit knowledge and the urban environments that facilitate the exchange of such knowledge.
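The stock-versus-accumulation question is usually anchored in the human-capital-augmented Solow model; Mankiw, Romer and Weil's well-known steady-state formulation can be written as follows (the symbols are the textbook ones, not necessarily those used in this study):

```latex
% Steady-state income per worker in the human-capital-augmented Solow model.
% s_k, s_h: investment rates in physical and human capital; n: labour-force
% growth; g: technical progress; \delta: depreciation; \alpha, \beta: output
% elasticities of physical and human capital.
\ln\frac{Y}{L} = \ln A_0 + g t
  + \frac{\alpha}{1-\alpha-\beta}\ln s_k
  + \frac{\beta}{1-\alpha-\beta}\ln s_h
  - \frac{\alpha+\beta}{1-\alpha-\beta}\ln(n+g+\delta)
```

In this specification the investment (accumulation) rates $s_k$ and $s_h$, not the existing capital stocks, determine steady-state income, which is the distinction the empirical analysis below turns on.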
Our empirical analysis of European regions (Chapter 3) shows that investment by individuals in human capital formation has distinct patterns. Regions with a higher level of investment in tertiary education tend to have a larger concentration of information and communication technology (ICT) sectors (including provision of ICT services and manufacture of ICT devices and equipment) and research functions. Not surprisingly, regions with major metropolitan areas, where higher education institutions are located, show a high enrolment rate for tertiary education, suggesting a possible link to the demand from high-order corporate functions located there. Furthermore, the rate of human capital development (at the level of vocational upper secondary education) appears to have a significant association with the level of entrepreneurship in emerging industries such as ICT-related services and ICT manufacturing, whereas no such association is found with traditional manufacturing industries. In general, a high level of investment by individuals in tertiary education is found in those regions that accommodate high-tech industries and high-order corporate functions such as research and development (R&D). These functions are supported by the urban infrastructure and public science base, which facilitate the exchange of tacit knowledge; such regions also enjoy a low unemployment rate. However, the existing stock of human and physical capital in regions with a high level of urban infrastructure does not lead to a high rate of economic growth. Our empirical analysis demonstrates that the rate of economic growth is determined by the accumulation of human and physical capital, not by the level of their existing stocks. We found no significant effects of scale that would favour regions with a larger stock of human capital.
The primary policy implication of our study is that, in order to facilitate economic growth, education and training need to supply human capital at a faster pace than simply replenishing it as it disappears from the labour market. Given the significant impact of high-order human capital (such as business R&D staff in our case study) as well as the increasingly fast pace of technological change that makes human capital obsolete, a concerted effort needs to be made to facilitate its continuous development.

Relevance: 30.00%

Publisher:

Abstract:

Purpose - This paper provides a deeper examination of the fundamentals of commonly used techniques - such as coefficient alpha and factor analysis - in order to link the techniques used by marketing and social researchers more strongly to their underlying psychometric and statistical rationale.
Design/methodology/approach - A wide-ranging review and synthesis of psychometric and other measurement literature, both within and outside the marketing field, is used to illuminate and reconsider a number of misconceptions that seem to have evolved in marketing research.
Findings - Marketing scholars have generally concentrated on reporting what are essentially arbitrary figures, such as coefficient alpha, without fully understanding what those figures imply. It is argued that, if the link between theory and technique is not clearly understood, use of psychometric measure-development tools actually risks detracting from the validity of the measures rather than enhancing it.
Research limitations/implications - The focus on one stage of a particular form of measure development could be seen as rather specialised. The paper also runs the risk of increasing the amount of dogma surrounding measurement, which runs contrary to its own spirit.
Practical implications - Researchers may need to spend more time interpreting measurement results. Rather than simply appealing to precedent, one needs to understand the link between measurement theory and actual technique.
Originality/value - This paper presents psychometric measurement and item analysis theory in an easily understandable format, and offers an important set of conceptual tools for researchers in many fields. © Emerald Group Publishing Limited.
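As a reminder of what a figure such as coefficient alpha actually computes, the standard Cronbach formula, alpha = k/(k-1) * (1 - sum of item variances / variance of the totals), can be sketched in a few lines (the data here are illustrative, not from the paper):

```python
import statistics

def cronbach_alpha(items):
    """Coefficient (Cronbach's) alpha. `items` is a list of per-item score
    lists over the same respondents; assumes >= 2 items and non-constant
    totals (otherwise the total variance is zero)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]          # per-respondent sums
    item_var = sum(statistics.pvariance(item) for item in items)
    return k / (k - 1) * (1 - item_var / statistics.pvariance(totals))

# Three perfectly consistent items give alpha = 1.0 regardless of content --
# one illustration of why a high alpha alone says little about validity.
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]])
```

Note that alpha rises mechanically with inter-item covariance and with the number of items, which is one reason the paper cautions against treating a reported threshold value as meaningful in itself.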

Relevance: 30.00%

Publisher:

Abstract:

This thesis opens the discussion of corporate social responsibility (CSR) with a review of literature on the conceptualisation, determinants, and returns of organisational CSR engagement. The case is made for the need to draw attention to the micro-level of CSR, and consequently to focus on employee social responsibility at multiple levels of analysis. To further research efforts in this area, the prerequisite of a behavioural measure of employee social responsibility is acknowledged. Accordingly, the subsequent chapters outline the process of scale development and validation, resulting in a robust, reliable and valid employee social responsibility scale. This scale is then put to use in a field study, which confirms the noteworthy roles of the antecedent and boundary conditions of transformational leadership, assigned CSR priority, and CSR climate at the group and individual levels. The directionality of these relationships is subsequently examined in a time-lagged investigation set within a simulated business environment. The thesis collates and discusses the contributions of the findings from the research series, which highlight a consistent three-way interaction effect of transformational leadership, assigned CSR priority and CSR climate. Finally, various avenues for future research are outlined, given the infancy of the micro-level study of employee social responsibility.

Relevance: 30.00%

Publisher:

Abstract:

Purpose - Despite the increasing sophistication of new product development (NPD) research, reliance on traditional approaches to studying NPD has left several areas in need of further research. The authors address some of these gaps, especially the limited focus on consumer brands, the evaluation criteria used across different project-review points in the NPD process, and the distinction between "kills", "successes", and "failures". They also investigate how screening criteria change across project-review points, using real-time NPD projects.
Design/methodology/approach - A postal survey generated 172 usable questionnaires from a sample of European, North American, Far Eastern and Australian consumer packaged-goods firms, providing data on 314 new product projects covering different development and post-commercialisation review points.
Findings - The results confirm that acceptance-rejection criteria vary through the NPD process. However, financial criteria dominate at all project-review points. Initial screening is coarse, focusing predominantly on financial criteria. Fit with organisational, product, brand, promotional, and market requirements dominates at the detailed screen and pre-development evaluation points. At pre-launch, decision-makers focus on product, brand, and promotional criteria. Commercial fit, production synergies, and the reliability of the firm's market intelligence are significant discriminators in the post-launch review. Moreover, the importance of marketing and channel issues makes the criteria for screening consumer brands different from those of industrial markets.
Originality/value - The study, although largely descriptive and based on a relatively small sample of consumer goods firms, offers new insights into NPD project evaluation behaviour. Future, larger-scale investigations covering a broader spectrum of consumer product sectors are needed to validate the results and to explain the reasons behind managers' decisions. © Emerald Group Publishing Limited.

Relevance: 30.00%

Publisher:

Abstract:

The object of this work was to further develop the idea introduced by Muaddi et al. (1981), which enables some of the disadvantages of earlier destructive adhesion test methods to be overcome. The test is non-destructive in nature, but it does need to be calibrated against a destructive method. Adhesion is determined by measuring the effect of plating on internal friction. This is achieved by determining the damping of vibrations of a resonating specimen before and after plating; the level of adhesion was considered by the above authors to influence the degree of damping. In the major portion of the research work the electrodeposited metal was Watts nickel, which is ductile in nature and is therefore suitable for peel adhesion testing. The base metals chosen were aluminium alloys S1C and HE9, as it is relatively easy to produce varying levels of adhesion between the substrate and electrodeposited coating by choosing the appropriate process sequence. S1C alloy is commercially pure aluminium and was used to produce good adhesion; HE9 is a more difficult alloy to plate and was chosen to produce poorer adhesion. The "Modal Testing" method used for studying vibrations was investigated as a possible means of evaluating adhesion, but was not successful, and so research was concentrated on the "Q" meter. The method based on the use of a "Q" meter involves exciting vibrations in a sample, interrupting the driving signal, and counting the number of oscillations of the freely decaying vibrations between two known, preselected amplitudes. It was not possible to reconstruct a working instrument from Muaddi's thesis (1982), as it contained either a serious error or incomplete information. Hence a modified "Q" meter had to be designed and constructed, but it proved difficult to resonate non-magnetic materials such as aluminium, so a comparison before and after plating could not be made.
A new "Q" meter was then developed based on an impulse technique. A regulated miniature hammer was used to excite the test piece at the fundamental mode instead of an electronic hammer, and test pieces were supported at the two predetermined nodal points using nylon threads. The instrument developed was not very successful at detecting changes due to good and poor pretreatments given before plating; however, it was more sensitive to changes at the surface, such as room-temperature oxidation. Statistical analysis of test results from untreated aluminium alloys showed that the instrument is not always consistent; the variation was even larger when readings were taken on different days. Although aluminium is said to form protective oxides at room temperature, there was evidence that the aluminium surface changes continuously due to film formation, growth and breakdown. Nickel-plated and zinc alloy immersion-coated samples also showed variation in Q with time. In order to prove that the variations in Q were mainly due to surface oxidation, aluminium samples were lacquered and anodised. Such treatments enveloped the active surfaces that react with the environment, and the Q variation with time was almost eliminated, especially after hard anodising. The instrument detected major differences between different untreated aluminium substrates. Q values also decreased progressively as coating thicknesses were increased, and the instrument was able to detect changes in Q due to heat treatment of aluminium alloys.
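The principle behind the counting "Q" meter can be expressed with the standard free-decay relation for a lightly damped resonator, A_n = A_0 exp(-pi n / Q): the more cycles counted while the envelope falls between the two preset amplitudes, the higher the Q and the lower the damping. A minimal sketch under that textbook assumption (the function name is hypothetical, not from the thesis):

```python
import math

def q_from_count(n_cycles, amp_high, amp_low):
    """Q factor of a freely decaying resonator, from the number of
    oscillations counted while the amplitude falls from amp_high to amp_low:
        A_n = A_0 * exp(-pi * n / Q)   =>   Q = pi * n / ln(amp_high / amp_low)
    """
    return math.pi * n_cycles / math.log(amp_high / amp_low)
```

On the Muaddi-style argument described above, poorer adhesion raises damping, so fewer cycles survive between the two thresholds and the computed Q drops; comparing counts before and after plating is then a relative measure of the coating's effect on internal friction.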

Relevance: 30.00%

Publisher:

Abstract:

In recent years it has become increasingly common for companies to improve their competitiveness and find new markets by extending their operations through international new product development collaborations involving technology transfer. Technology development, cost reduction and market penetration are the foci of such collaborative operations, the aim being to improve the competitive position of both partners. In this paper, technology transfer through collaborative new product development in the machine tool sector provides a typical example of such partnerships. The research evidence on which the paper is based includes longitudinal case studies and questionnaire surveys of machine tool manufacturers in both the UK and China. The case of a UK machine tool company and its Chinese partner illustrates the operational development of a successful collaboration. The paper concludes that a phased co-ordination of commercial, technical and strategic interactions between the two partners is essential for such collaborations to work. In particular, the need to transfer marketing know-how is emphasised, this having been identified as an area of weakness among technology acquirers in China.

Relevance: 30.00%

Publisher:

Abstract:

Scenario planning is a strategy tool of growing popularity in both academia and practice. Current teaching of scenario planning is largely based on existing literature that uses scenario planning to develop strategies for the future, primarily by assessing perceived macro-external environmental uncertainties. However, a body of literature hitherto ignored by scenario planning researchers suggests that perceived environmental uncertainty (PEU) also operates in the micro-external (industry) environment and in the internal environment of the organisation. This paper reviews the most dominant theories of the scenario planning process, demonstrates the need to consider PEU theory within scenario planning, and presents how this can be done. The scope of the paper is to enhance the scenario planning process as a tool taught for strategy development. A case vignette, developed from published scenarios, demonstrates the potential of the proposed process.

Relevance: 30.00%

Publisher:

Abstract:

To elucidate the structures of organic molecules in solution using pulse FT NMR, heteronuclear pulse sequence experiments that probe carbon-13 (13C) and proton (1H) spin systems are invaluable. The one-dimensional, insensitive-nucleus-detected PENDANT experiment is in popular use for structure determination via one-bond 13C-1H scalar couplings. PENDANT provides the desired increase in 13C signal-to-noise ratio and, unlike many other pulse sequence experiments (e.g., refocused INEPT and DEPT), allows the simultaneous detection of 13C quaternary nuclei. The first chapter herein details the characterisation of PENDANT and the successful rectification of spectral anomalies that occur when it is used without proton broadband decoupling. Multiple-bond (long-range) 13C-1H scalar coupling correlations can yield important bonding information. When the molecule under scrutiny is free of proton spectral crowding, and more sensitive 'inverse' pulse sequence experiments are not available, one may use insensitive-nucleus-detected, long-range selective one-dimensional correlation methods rather than more time-consuming and insensitive multidimensional analogues. To this end a novel long-range selective one-dimensional correlation pulse sequence experiment has been invented. Based on PENDANT, the new experiment is shown to rival the popular selective INEPT technique because it can determine the same correlations while simultaneously detecting isolated 13C quaternary nuclei; INEPT cannot do this, potentially leaving other important quaternary nuclei undetected. The novel sequence has been modified further to yield a second novel experiment that simultaneously yields selective 13C transient nOe data. Consequently, the need to perform the two experiments back-to-back is removed and the experimental time reduced. Finally, the SNARE pulse sequence was further developed. SNARE reduces experimental time by accelerating the relaxation of the protons on which the pulse sequences to which it is appended rely. It is shown, contrary to the original publication, that relaxation time savings can be derived from negative nOes.

Relevance: 30.00%

Publisher:

Abstract:

The objective of the research described in this report was to observe the first ever in-situ sonochemical reaction in the NMR spectrometer in the megahertz region of ultrasound. Several reactions were investigated as potential systems for a sonochemical reaction followed by NMR spectroscopy. The primary problem when applying ultrasound to a chemical reaction is heating: ultrasound causes the liquid to move and produces 'hot spots', raising the sample temperature. The problem was confronted by producing a device to counteract this effect and so remove the need to account for heating. However, the design of the device limited the length of time for which it would function, whereas longer reaction times were required to enable observations in the NMR spectrometer. The first and most obvious reactions attempted were those of the well-known ultrasonic dosimeter. Such a reaction would, in principle, enable the author to observe a reaction and simultaneously determine the exact power entering the system, for direct comparison of results. Unfortunately, in order to monitor the reactions in the NMR spectrometer the reactant concentrations had to be significantly increased, which resulted in a notable increase in reaction time, making the experiment too lengthy to follow in the time allocated. The Diels-Alder reaction is probably one of the most thoroughly investigated reaction systems in chemistry, and it was to this that the author turned her attention. Previous authors have carried out ultrasonic investigations, with considerable success, of the reaction of anthracene with maleic anhydride, and it was this reaction that was attempted next. The first ever sonochemically enhanced reaction using ultrasound of megahertz (MHz) frequency was successfully carried out as bench experiments. Owing to the nature of the component reactants the product precipitated from the solution, and because the reaction could only be monitored through product formation, it was not possible to observe it in the NMR spectrometer. The solvolysis of 2-chloro-2-methylpropane was then examined in various solvent systems, of which the most suitable was determined to be aqueous 2-methylpropan-2-ol. The reaction was successfully enhanced by the application of ultrasound and monitored in-situ in the NMR spectrometer. Product formation in the ultrasonic reaction exceeded that of the traditional thermal reaction by a factor of 1.4 to 2.9, depending on the reaction conditions investigated. An investigation into the effect of sonication upon a large biological molecule, aqueous lysozyme, was also carried out. An easily observed effect upon the sample was noted, but no explanation for it could be established.

Relevance: 30.00%

Publisher:

Abstract:

Information systems are corporate resources; therefore information systems development must be aligned with corporate strategy. This thesis proposes that effective strategic alignment of information systems requires information systems development, information systems planning and strategic management to be united. Literature in these areas is examined, breaching the academic boundaries which separate them, to contribute a synthesised approach to the strategic alignment of information systems development. Previous work in information systems planning has extended information systems development techniques, such as data modelling, into strategic planning activities, neglecting techniques of strategic management. Examination of strategic management in this thesis identifies parallel trends in strategic management and information systems development; the premises of the learning school of strategic management are similar to those of soft systems approaches to information systems development. It is therefore proposed that strategic management can be supported by a soft systems approach. Strategic management tools and techniques frame individual views of a strategic situation; soft systems approaches can integrate these diverse views to explore the internal and external environments of an organisation. The information derived from strategic analysis justifies the need for an information system and provides a starting point for information systems development. This is demonstrated by a composite framework which enables each information system to be justified according to its direct contribution to corporate strategy. The proposed framework was developed through action research conducted in a number of organisations of varying types. This suggests that the framework can be widely used to support the strategic alignment of information systems development, thereby contributing to organisational success.

Relevance: 30.00%

Publisher:

Abstract:

The increasing cost of developing complex software systems has created a need for tools which aid software construction. One area in which significant progress has been made is with the so-called Compiler Writing Tools (CWTs); these aim at automated generation of various components of a compiler and hence at expediting the construction of complete programming language translators. A number of CWTs are already in quite general use, but investigation reveals significant drawbacks with current CWTs, such as lex and yacc. The effective use of a CWT typically requires a detailed technical understanding of its operation and involves tedious and error-prone input preparation. Moreover, CWTs such as lex and yacc address only a limited aspect of the compilation process; for example, actions necessary to perform lexical symbol valuation and abstract syntax tree construction must be explicitly coded by the user. This thesis presents a new CWT called CORGI (COmpiler-compiler from Reference Grammar Input) which deals with the entire "front-end" component of a compiler; this includes the provision of necessary data structures and routines to manipulate them, both generated from a single input specification. Compared with earlier CWTs, CORGI has a higher-level and hence more convenient user interface, operating on a specification derived directly from a "reference manual" grammar for the source language. Rather than developing a compiler-compiler from first principles, CORGI has been implemented by building a further shell around two existing compiler construction tools, namely lex and yacc. CORGI has been demonstrated to perform efficiently in realistic tests, both in terms of speed and the effectiveness of its user interface and error-recovery mechanisms.
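For readers unfamiliar with what a compiler "front-end" comprises, the following toy lexer and recursive-descent parser shows the kind of tokenising and abstract-syntax-tree construction that tools like lex, yacc and CORGI help automate. It is hand-written in Python for a hypothetical expression grammar; it is not CORGI's input or output format.

```python
import re

# Lexing: split source text into (kind, value) tokens.
TOKEN = re.compile(r"\s*(?:(\d+)|(.))")

def tokenize(src):
    tokens = []
    for num, op in TOKEN.findall(src):
        if num:
            tokens.append(("NUM", int(num)))      # lexical symbol valuation
        elif op.strip():
            tokens.append((op, op))               # single-char operators
    return tokens + [("EOF", None)]

# Parsing: recursive descent over the grammar, building an AST of tuples.
class Parser:
    def __init__(self, tokens):
        self.tokens, self.pos = tokens, 0

    def peek(self):
        return self.tokens[self.pos][0]

    def eat(self, kind):
        tok = self.tokens[self.pos]
        assert tok[0] == kind, f"expected {kind}, got {tok[0]}"
        self.pos += 1
        return tok

    def expr(self):      # expr := term ('+' term)*
        node = self.term()
        while self.peek() == "+":
            self.eat("+")
            node = ("add", node, self.term())
        return node

    def term(self):      # term := factor ('*' factor)*
        node = self.factor()
        while self.peek() == "*":
            self.eat("*")
            node = ("mul", node, self.factor())
        return node

    def factor(self):    # factor := NUM | '(' expr ')'
        if self.peek() == "NUM":
            return ("num", self.eat("NUM")[1])
        self.eat("(")
        node = self.expr()
        self.eat(")")
        return node

def parse(src):
    return Parser(tokenize(src)).expr()
```

A front-end generator takes a grammar specification and emits code of this shape; the drawback CORGI targets is that with lex and yacc alone, the token-valuation and AST-building actions above must still be coded by hand.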

Relevance: 30.00%

Publisher:

Abstract:

The traditional waterfall software life cycle model has several weaknesses. One problem is that a working version of a system is unavailable until a late stage in the development; any omissions and mistakes in the specification undetected until that stage can be costly to maintain. The operational approach which emphasises the construction of executable specifications can help to remedy this problem. An operational specification may be exercised to generate the behaviours of the specified system, thereby serving as a prototype to facilitate early validation of the system's functional requirements. Recent ideas have centred on using an existing operational method such as JSD in the specification phase of object-oriented development. An explicit transformation phase following specification is necessary in this approach because differences in abstractions between the two domains need to be bridged. This research explores an alternative approach of developing an operational specification method specifically for object-oriented development. By incorporating object-oriented concepts in operational specifications, the specifications have the advantage of directly facilitating implementation in an object-oriented language without requiring further significant transformations. In addition, object-oriented concepts can help the developer manage the complexity of the problem domain specification, whilst providing the user with a specification that closely reflects the real world and so the specification and its execution can be readily understood and validated. A graphical notation has been developed for the specification method which can capture the dynamic properties of an object-oriented system. A tool has also been implemented comprising an editor to facilitate the input of specifications, and an interpreter which can execute the specifications and graphically animate the behaviours of the specified systems.

Relevance: 30.00%

Publisher:

Abstract:

Software development methodologies are becoming increasingly abstract, progressing from low level assembly and implementation languages such as C and Ada, to component based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model driven approaches emphasise the role of higher level models and notations, and embody a process of automatically deriving lower level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data driven application systems. The combination of empowered data formats and high level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component based software development and game development technologies in order to define novel component technologies for the description of data driven component based applications. The thesis makes explicit contributions to the fields of component based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. 
The thesis also proposes a number of developments in dynamic data driven application systems in order to further empower the role of data in this field.
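The idea of content data driving component assembly can be sketched roughly as follows. The registry, component names and XML schema below are invented for illustration and do not come from the Fluid project:

```python
import xml.etree.ElementTree as ET

# Hypothetical component registry: the content data, not the code, decides
# which components are instantiated and how they are configured.
REGISTRY = {}

def component(cls):
    REGISTRY[cls.__name__] = cls
    return cls

@component
class Renderer:
    def __init__(self, width, height):
        self.size = (int(width), int(height))

@component
class Physics:
    def __init__(self, gravity):
        self.gravity = float(gravity)

def load(xml_text):
    """Instantiate the components named in a self-describing content file."""
    root = ET.fromstring(xml_text)
    return [REGISTRY[el.tag](**el.attrib) for el in root]

SCENE = """<app>
  <Renderer width="640" height="480"/>
  <Physics gravity="9.81"/>
</app>"""
components = load(SCENE)
```

Changing the application's behaviour then means editing the content data rather than recompiling the software, which is the sense in which game-style "empowered" data formats drive runtime behaviour.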

Relevance: 30.00%

Publisher:

Abstract:

The aim of the research is to develop an e-business selection framework for small and medium enterprises (SMEs) by integrating established planning techniques. The research is case based, comprising four case studies carried out in the printing industry to evaluate the framework. Two of the companies are from Singapore, and the other two are from Guangzhou and Jinan, China. To establish the need for an e-business selection framework for SMEs, extensive literature reviews were carried out in the areas of e-business, business planning frameworks, SMEs and the printing industry. An e-business selection framework is then proposed by integrating three established techniques: the Balanced Scorecard (BSC), Value Chain Analysis (VCA) and Quality Function Deployment (QFD). The newly developed selection framework was pilot tested on a published case study before being evaluated in the four case study companies. The case study methodology was chosen for its ability to integrate the diverse data collection techniques required to generate the BSC, VCA and QFD for the selection framework. The findings of the case studies revealed that BSC, VCA and QFD can be integrated seamlessly to complement each other's strengths in e-business planning. The eight-step methodology of the selection framework provides SMEs with a step-by-step approach to e-business through structured planning. The project has also provided better understanding of, and deeper insights into, SMEs in the printing industry.
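At its core, a QFD-style selection step reduces to a weighted-scoring matrix: candidate e-business options are scored against criteria (here, BSC-like perspectives) and ranked by weighted total. The weights, option names and scores below are invented for illustration and are not the framework's own:

```python
# Hypothetical criterion weights (BSC-style perspectives, summing to 1.0)
# and candidate e-business options scored 1-5 against each criterion.
weights = {"financial": 0.4, "customer": 0.3, "process": 0.2, "learning": 0.1}
options = {
    "web_storefront": {"financial": 4, "customer": 5, "process": 3, "learning": 2},
    "online_proofing": {"financial": 3, "customer": 4, "process": 5, "learning": 4},
}

def score(option):
    """Weighted total of an option's criterion scores."""
    return sum(weights[c] * option[c] for c in weights)

ranked = sorted(options, key=lambda name: score(options[name]), reverse=True)
```

In the actual framework the weights would come from the BSC and VCA analyses rather than being set by hand, which is the sense in which the three techniques feed one another.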

Relevance: 30.00%

Publisher:

Abstract:

Industry cluster policies are a current trend in local economic development programmes and represent a major shift from traditional approaches. This trend has been coupled with an increasing interest in the new media industry as a significant focus for regional development strategies. In England, clusters and the new media industry have therefore come to be seen as important tools in promoting local and regional economic development. This study aimed to ascertain the success of these policies, taking the Birmingham new media industry as its case. In addition to an extensive review of the literature, semi-structured interviews were conducted with new media firms and with Business Support Agencies (BSAs) offering programmes to promote the development of the new media industry cluster. The key findings of the thesis are that the concerns of new media firms when choosing their location do not conform to industry cluster theory. Moreover, close geographical proximity of the firms does not mean there is collaboration, and any costs saved as a result of proximity to similar firms are at present seen as irrelevant because of the type of products they offer. Building trust between firms is the key to developing the new media industry cluster, and the BSAs can act as brokers and provide neutral ground to develop it. The key policy recommendations are that the new media industry is continually changing, and research must continuously track and analyse cluster dynamics in order to be aware of emerging trends and future developments that can positively or negatively affect the cluster. Policy makers need to keep in mind that there is no uniform tool kit for fostering the different sectors in cluster development. It is also important for them to win the support and trust of new media firms, since this is key to the success of the cluster. When cluster programmes are introduced, their benefits must be explained to firms more effectively in order to encourage participation. The general conclusions of the thesis are that clusters are a potentially important tool in local economic development policy and that the new media industry has considerable growth potential. The kinds of relationships which cluster theory suggests develop between firms do not, as yet, appear to exist within the new media cluster. There are, however, steps that the BSAs can take to encourage their development; the BSAs need to ensure that they establish an environment that enables growth of the industry.