13 results for commodity
in Aston University Research Archive
Abstract:
The thesis examines the effects of the privatisation process on productivity, competitiveness and performance in two major Brazilian steel companies, which were privatised between 1991 and 1993. The case study method was adopted in this research because of its strengths as a technique allowing in-depth examination of the privatisation process, the context in which it happened and its effects on the companies. The thesis develops a company analysis framework consisting of three components: management, competitiveness/productivity and performance, and examines the evidence on the companies within this framework. The research indicates that there is no straightforward relationship between privatisation, competitiveness and performance. There were many significant differences in the management and technological capabilities, products and performance of the two companies, and these largely shaped the effects of privatisation on each company. Company Alpha's strengths in technological and management capabilities and its high value-added products explain its strong productivity and financial performance during and after privatisation. Company Beta's performance was weak before privatisation and remained weak immediately after. Before privatisation, weaknesses in management, commodity-type low value-added products and a shortage of funds for investment were the major problems. These were compounded by greater government interference. Despite major restructuring, the poor performance has continued after privatisation, largely because the company has not been able to improve its productivity sufficiently to be cost-competitive in commodity-type markets. Both companies state that their strategies have changed significantly. They claim to be more responsive to market conditions and customers and are attempting to develop closer links with major customers.
It is not possible to assess the consequences of these changes in the short time that has elapsed since privatisation, but Alpha appears to be more effective in developing a coherent strategy because of its strengths. Both companies accelerated their programmes of organisational restructuring and workforce reduction during the privatisation process to improve productivity and performance. Alpha has attained standards comparable to major international steel companies. Beta has had to make much bigger organisational changes and cuts in its labour force, but its productivity levels still remain low in comparison with Alpha and with international competitors.
Abstract:
Academic researchers have followed closely the interest of companies in establishing industrial networks by studying aspects such as social interaction and contractual relationships. But what patterns underlie the emergence of industrial networks, and what support should research provide for practitioners? Firstly, it appears that manufacturing is becoming a commodity rather than a unique capability, which accounts especially for low-technology approaches in downstream parts of the network, for example in assembly operations. Secondly, the increased tendency towards specialization has forced other, upstream, parts of industrial networks to introduce advanced manufacturing technologies to supply niche markets. Thirdly, the capital market for investments in capacity, and the trade in manufacturing as a commodity, dominate resource allocation to a larger extent than was previously the case. Fourthly, there is a continuous move towards more loosely connected entities comprising manufacturing networks. More traditional concepts, such as the “keiretsu” and “chaebol” networks of some Asian economies, do not sufficiently support the demands now being placed on networks. Research should address these four fundamental challenges to prepare for the industrial networks of 2020 and beyond.
Abstract:
The most significant environmental change to support people who want to give up smoking is the legislation to ban smoking in public places. Following Scotland in March 2006, and Wales and Northern Ireland in April 2007, England moves one step closer to being smoke free on 1 July 2007, when it becomes illegal to smoke in almost every enclosed public place and workplace. Social marketing will be used to support this health-promoting policy and will become more prominent in the design of health promotion campaigns of the future. Social marketing is not a new approach to promoting health, but its adoption by the Government does represent a paradigm shift in the challenge to change public opinion and social norms. As a result, some behaviours, like smoking or excessive alcohol consumption, will no longer be socially acceptable. The Department of Health has decided that social marketing should be used in England to guide all future health promotion efforts directed at achieving behavioural goals. This paradigm shift was announced in Chapter 2 of the “Choosing health” White Paper with its emphasis on the consumer, noting that a wide range of lifestyle choices are marketed to people, although health as a commodity itself has not been marketed. The DoH has an internal social marketing development unit to integrate social marketing principles into its work and ensure that providers deliver. The National Centre for Social Marketing has funding to provide ongoing support and to build capacity and capability in the workforce. This article describes the distinguishing features of the social marketing approach. It seeks to answer some questions. Is this really a new idea, a paradigm shift, or simply a change in terminology? What do the marketing principles offer that is new, or are they merely familiar ideas repackaged in marketing jargon? Will these principles be more effective than current health promotion practice and, if so, how do they work?
Finally, what are the implications for community pharmacy?
Abstract:
Very large spatially-referenced datasets, for example those derived from satellite-based sensors which sample across the globe or from large monitoring networks of individual sensors, are becoming increasingly common and more widely available for use in environmental decision making. In large or dense sensor networks, huge quantities of data can be collected over small time periods. In many applications the generation of maps, or predictions at specific locations, from the data in (near) real-time is crucial. Geostatistical operations such as interpolation are vital in this map-generation process, and in emergency situations the resulting predictions need to be available almost instantly, so that decision makers can make informed decisions and define risk and evacuation zones. It is also helpful, when analysing data in less time-critical applications, for example when interacting directly with the data for exploratory analysis, that the algorithms are responsive within a reasonable time frame. Performing geostatistical analysis on such large spatial datasets can present a number of problems, particularly where maximum likelihood estimation is used. Although the storage requirements scale only linearly with the number of observations in the dataset, the computational complexity in terms of memory and speed scales quadratically and cubically, respectively. Most modern commodity hardware has at least two processor cores, if not more. Other mechanisms for parallel computation, such as Grid-based systems, are also becoming increasingly available. However, there currently seems to be little interest in exploiting this extra processing power within the context of geostatistics. In this paper we review the existing parallel approaches for geostatistics.
By recognising that different natural parallelisms exist and can be exploited depending on whether the dataset is sparsely or densely sampled with respect to the range of variation, we introduce two contrasting novel implementations of parallel algorithms based on approximating the data likelihood, extending the methods of Vecchia [1988] and Tresp [2000]. Using parallel maximum likelihood variogram estimation and parallel prediction algorithms, we show that computational time can be significantly reduced. We demonstrate this with both sparsely and densely sampled data on a variety of architectures, ranging from the common dual-core processor found in many modern desktop computers to large multi-node supercomputers. To highlight the strengths and weaknesses of the different methods we employ synthetic data sets, and go on to show how the methods allow maximum likelihood based inference on the exhaustive Walker Lake data set.
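The scaling described above (linear storage, quadratic memory, cubic time in the number of observations) and the block-wise likelihood approximations in the spirit of Vecchia [1988] and Tresp [2000] can be sketched with a minimal example. This is not the authors' implementation: the squared-exponential covariance, the parameter values and the thread-based parallelism are all assumptions for illustration only.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def gaussian_loglik(y, X, variance, length_scale, nugget):
    """Exact Gaussian log-likelihood with a squared-exponential covariance.
    Building K costs O(n^2) memory; the Cholesky factorisation costs O(n^3) time."""
    d = np.abs(X[:, None] - X[None, :])                    # pairwise distances (1-D locations)
    K = variance * np.exp(-(d / length_scale) ** 2) + nugget * np.eye(len(y))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L, y)                          # L^{-1} y
    # log|K| = 2 * sum(log diag L), hence the -sum(log diag L) term below
    return -0.5 * (alpha @ alpha) - np.log(np.diag(L)).sum() - 0.5 * len(y) * np.log(2 * np.pi)

def block_loglik(y, X, variance, length_scale, nugget, n_blocks=4):
    """Approximate the log-likelihood as a sum of independent block likelihoods,
    evaluated in parallel: each block costs O((n/b)^3) instead of O(n^3)."""
    ys = np.array_split(y, n_blocks)
    Xs = np.array_split(X, n_blocks)
    with ThreadPoolExecutor() as pool:                     # NumPy releases the GIL in linalg
        parts = pool.map(
            lambda blk: gaussian_loglik(blk[0], blk[1], variance, length_scale, nugget),
            zip(ys, Xs))
    return sum(parts)

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 100, 400))                      # synthetic 1-D sample locations
y = rng.standard_normal(400)
exact = gaussian_loglik(y, X, 1.0, 5.0, 0.1)
approx = block_loglik(y, X, 1.0, 5.0, 0.1, n_blocks=4)
```

With a single block the approximation reduces to the exact likelihood; with more blocks the cross-block covariance is discarded, which is what buys the parallel speed-up at the cost of approximation error.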
Abstract:
A systematic analysis is presented of the economic consequences of the abnormally high concentration of Zambia's exports on a commodity whose price is exceptionally unstable. Zambian macro-economic variables in the post-independence years are extensively documented, showing acute instability and decline, particularly after the energy price revolution and the collapse of copper prices. The relevance of stabilization policies designed to correct short-term disequilibrium is questioned. It is, therefore, a pathological case study of externally induced economic instability, complementing other studies in this area which use cross-country analysis of a few selected variables. After a survey of theory and issues pertaining to development, finance and stabilization, the emergence of domestic and foreign financial constraints on the Zambian economy is described. The world copper industry is surveyed and an examination of commodity and world trade prices concludes that copper showed the highest degree of price instability. Specific aspects of Zambia's economy identified for detailed analysis include: its unprofitable mining industry, external payments disequilibrium, a constrained government budget, potentially inflationary monetary growth, and external indebtedness. International comparisons are used extensively, but major copper exporters are subjected to closer scrutiny. An appraisal of policy options concludes the study.
Abstract:
Academic researchers have followed closely the interest of companies in establishing industrial networks by studying aspects such as social interaction and contractual relationships. But what patterns underlie the emergence of industrial networks, and what support should research provide for practitioners? First, it appears that manufacturing is becoming a commodity rather than a unique capability, which accounts especially for low-technology approaches in downstream parts of the network, for example, in assembly operations. Second, the increased tendency towards specialisation has forced other, upstream, parts of industrial networks to introduce advanced manufacturing technologies for niche markets. Third, the capital market for investments in capacity, and the trade in manufacturing as a commodity, dominate resource allocation to a larger extent than was previously the case. Fourth, there is a continuous move towards more loosely connected entities comprising manufacturing networks. Finally, in these networks, concepts for supply chain management should address collaboration and information technology that supports decentralised decision-making, in particular to address sustainable and green supply chains. More traditional concepts, such as the keiretsu and chaebol networks of some Asian economies, do not sufficiently support the demands now being placed on networks. Research should address these five fundamental challenges to prepare for the industrial networks of 2020 and beyond. © 2010 Springer-Verlag London.
Abstract:
Book review: Marketinggeschichte: die Genese einer modernen Sozialtechnik [Marketing history: The genesis of a modern social technique], edited by Hartmut Berghoff, Frankfurt/Main, Campus Verlag, 2007, 409 pp., illus., €30.00 (paperback), ISBN 978-3-593-38323-1. This edited volume is the result of a workshop at Göttingen University in 2006 and combines a number of different approaches to research into the history of marketing in Germany's economy and society. The majority of contributions loosely focus on the occurrence of a ‘marketing revolution’ in the 1970s, which ties in with interpretations of the Americanisation of German business. This revolution replaced the indigenous German idea of Absatzwirtschaft (the economics of sales) with the American-influenced idea of Marketing, which was less functionally oriented and more strategic, and which aimed to connect processes within the firm in order to allow a greater focus on the consumer. The entire volume is framed by Hartmut Berghoff's substantial and informative introduction, which introduces a number of actors and trends beyond the content of the volume. Throughout the various contributions, authors provide explanations of the timing and nature of marketing revolutions. Alexander Engel identifies an earlier revolution in the marketing of dyes, which underwent major change with the emergence of chemical dyes. While the natural dyestuff had been a commodity, with producers removed from consumers via a global network of traders, chemical dyes were products and were branded at an early stage. This was a fundamental change in the nature of production and sales. As Roman Rossfeld shows in his contribution on the Swiss chocolate industry (which focuses almost exclusively on Suchard), even companies that produced non-essential consumer goods which had always required some measure of labelling grappled for years with the need to develop fewer, higher-impact brands, as well as an efficient sales operation.
A good example of the classical ‘marketing revolution’ of the 1970s is the German automobile industry. Ingo Köhler convincingly argues that the crisis situation of German car manufacturers – the change from a seller's to a buyer's market, the appreciation of the German mark which undermined exports, the oil crises coupled with higher inflation and greater frugality of consumers, and the emergence of new competitors – led companies to refocus from production to the demands of the consumer. While he highlights the role of Ford in responding most rapidly to these problems, he does not address whether the multinational was potentially transferring American knowledge to the German market. Similarly, Paul Erker illustrates that a marketing revolution in transport and logistics happened much later, because the market remained highly regulated until the 1980s. Both Paul Erker and Uwe Spiekermann, in their contributions, present comparisons of two different sectors or companies (the tire manufacturer Continental and the logistics company Dachser, and agriculture and trade, respectively). In both cases, however, it remains unclear why these examples were chosen for comparison, as both seem to have little in common and are not always effectively used to demonstrate differences. The weakest section of the book is the development of marketing as an academic discipline. The attempt at sketching the phases in the evolution of marketing as an academic discipline by Ursula Hansen and Matthias Bode opens with an undergraduate-level explanation of the methodology of historical periodisation that seems extraneous. Considerably stronger is the section on the wider societal impact of marketing, and Anja Kruke shows how the new techniques of opinion research were accepted by politics and business – surprisingly, more readily by politicians than by their commercial counterparts.
In terms of contemporary personalities, Hans Domizlaff emerges as one fascinating figure of German marketing history, to whom several contributors refer and whose career at the German cigarette manufacturer Reemtsma is critically analysed by Tino Jacobs. Domizlaff was Germany's own ‘marketing guru’, whose successful campaigns led to the wide-ranging reception of his ideas about the nature of good branding and marketing. These are variously described as intuitive, elitist, and sachlich, a German concept of a sober, fact-based, and ‘no frills’ approach. Domizlaff did not believe in market research. Rather, he saw the genius of the individual advertiser as key to intuitively ascertaining the people's moods, wishes, and desires. This seems to have made him peculiarly suited to the tastes of the German middle class, according to Thomas Mergel's contribution on the nature of political marketing in the republic. Especially in politics, any form of hard sales tactics was severely frowned upon and considered to demean the citizen as incapable of making an informed choice, a mentality that he dates back to the traditions of nineteenth-century liberalism. Part of this disdain for ‘selling politics like toothpaste’ was also founded on the highly effective use of branding by the National Socialists, who identified their party through the use of an increasingly standardised image of Adolf Hitler and the swastika. Alexander Schug builds on previous research that criticised the simplistic notion of Hitler's charisma as the only explanation of the party's popular success, and distances his approach from those who see it in terms of propaganda and demagogy. He argues that the NSDAP used the tools of advertising and branding precisely because they had to introduce their new ideology into a political marketplace dominated by more established parties.
In this they were undoubtedly successful, more so than they intended: as bakers sold swastika cookies and butchers formed Führer heads out of lard, the NSDAP sought to regain control over the now effectively iconic images that constituted their brand, which was in danger of being trivialised and devalued. Key to understanding the history of marketing in Germany is, on the one hand, the exchange of ideas with the United States and, on the other, the impact of national-socialist policies, and the question of whether they were a force of modernisation or retardation. The general argument in the volume appears to favour the latter explanation. In the 1930s, some of the leading marketing experts emigrated to the USA, leaving German academia and business isolated. The aftermath of the Second World War left a country that needed to increase production to satisfy consumer demand, and there was little interest in advanced sales techniques. Although the Nazis were progressive in applying new marketing methods to their political campaign, this retarded the adoption of sales techniques in politics for a long time. Germany saw the development of idiosyncratic approaches by people like Domizlaff in the 1930s and 1940s, when it lost some leading thinkers, and only engaged with American marketing conceptions in the 1960s and 1970s, when consumers eventually became more important than producers.
Abstract:
Academia has followed the interest of companies in establishing industrial networks by studying aspects such as social interaction and contractual relationships. But what patterns underlie the emergence of industrial networks, and what support should research provide for practitioners? Firstly, it seems that manufacturing is becoming a commodity rather than a unique capability, which accounts especially for low-technology approaches in downstream parts of the network, for example in assembly operations. Secondly, the increased tendency to specialize forces other parts of industrial networks to introduce advanced manufacturing technologies for niche markets. Thirdly, the capital market for investments in capacity and the trade in manufacturing as a commodity dominate resource allocation to a larger extent. Fourthly, there will be a continuous move toward more loosely connected entities forming manufacturing networks. More traditional concepts, like keiretsu and chaebol networks, do not sufficiently support this transition. Research should address these fundamental challenges to prepare for the industrial networks of 2020 and beyond.
Abstract:
The first resonant-cavity time-division-multiplexed (TDM) fiber Bragg grating sensor interrogation system is reported. This novel design uses a pulsed semiconductor optical amplifier in a cyclic manner to function as the optical source, amplifier, and modulator. Compatible with a range of standard wavelength detection techniques, this optically gated TDM system allows interrogation of low reflectivity "commodity" sensors spaced just 2 m apart, using a single active component. Results demonstrate an exceptional optical signal-to-noise ratio of 36 dB, a peak signal power of over +7 dBm, and no measurable crosstalk between sensors. Temperature tuning shows that the system is fully stable with a highly linear response. © 2004 IEEE.
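The 2 m sensor spacing reported above sets the time window available for gating the reflections from adjacent gratings apart. A back-of-envelope sketch of that window, and of what a 36 dB optical signal-to-noise ratio means in linear terms, follows; the silica-fibre group index is an assumed typical value, not a figure from the paper.

```python
C = 299_792_458.0          # speed of light in vacuum, m/s
N_GROUP = 1.468            # typical group index of silica fibre (assumption)

def round_trip_delay(spacing_m: float) -> float:
    """Time separating reflections from gratings spacing_m apart:
    the pulse travels the extra spacing twice (out and back)."""
    return 2 * spacing_m * N_GROUP / C

def db_to_ratio(db: float) -> float:
    """Convert a power ratio expressed in dB to a linear ratio."""
    return 10 ** (db / 10)

dt = round_trip_delay(2.0)   # ~20 ns window for the 2 m spacing reported
osnr = db_to_ratio(36.0)     # 36 dB OSNR is roughly a 4000:1 power ratio
```

The roughly 20 ns separation indicates the gating speed the pulsed semiconductor optical amplifier must achieve to resolve such closely spaced sensors.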
Abstract:
The quest for sustainable resources to meet the demands of a rapidly rising global population, while mitigating the risks of rising CO2 emissions and associated climate change, represents a grand challenge for humanity. Biomass offers the most readily implemented and low-cost solution for sustainable transportation fuels, and the only non-petroleum route to organic molecules for the manufacture of bulk, fine and speciality chemicals and polymers. To be considered truly sustainable, biomass must be derived from resources which do not compete with agricultural land use for food production, or compromise the environment (e.g. via deforestation). Potential feedstocks include waste lignocellulosic or oil-based materials derived from plant or aquatic sources, with the so-called biorefinery concept offering the co-production of biofuels, platform chemicals and energy; analogous to today's petroleum refineries which deliver both high-volume/low-value (e.g. fuels and commodity chemicals) and low-volume/high-value (e.g. fine/speciality chemicals) products, thereby maximizing biomass valorization. This article addresses the challenges to catalytic biomass processing and highlights recent successes in the rational design of heterogeneous catalysts facilitated by advances in nanotechnology and the synthesis of templated porous materials, as well as the use of tailored catalyst surfaces to generate bifunctional solid acid/base materials or tune hydrophobicity.
Abstract:
Much has been written about the potential impact of the Lean-Agile paradigm on a firm's supply chain performance. However, most existing studies mainly point out that Lean is for cost reduction, whereas Agility is for attaining flexibility. There are few empirical studies in the literature that examine how the Lean-Agile paradigm impacts supply chain performance. This paper aims to address this gap by studying the influence of the Lean and Agility paradigms on the delivery performance of a single-commodity supply chain in the aerospace industry. Data were collected from four separate 'Rigid pipes' supply chains to study how manufacturing alignment impacts delivery performance. Implications of the study for practitioners and academia are discussed and future research is outlined.
Abstract:
The rising number of disaster victims globally is posing a complex challenge for disaster management authorities. Moreover, to accomplish a successful transition between preparedness and response, it is important to consider the different features inherent to each type of disaster. Floods are among the most frequent and harmful disasters, hence the need to develop a disaster-preparedness tool for efficient and effective flood management. The purpose of the article is to introduce a method to simultaneously define the proper location of shelters and distribution centers, along with the allocation of prepositioned goods and the distribution decisions required to supply flood victims. The tool combines a raster geographical information system (GIS) with an optimization model. The GIS determines the flood hazard of the city areas in order to assess the flood situation and to discard floodable facilities. Then, the multi-commodity multimodal optimization model is solved to obtain the Pareto frontier of two criteria: distance and cost. The methodology was applied to a case study of the 2007 flood in Villahermosa, Mexico, and the results were compared to an optimized scenario of the guidelines followed by Mexican authorities, concluding that the value of the performance measures was improved using the developed method. Furthermore, the results showed that it is possible to provide adequate care for affected people with fewer facilities than the current approach, and demonstrated the advantages of considering more than one distribution center for relief prepositioning.
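The Pareto frontier over the two criteria (distance and cost, both minimised) can be illustrated with a toy dominance filter: a candidate layout survives only if no other candidate is at least as good on both criteria and strictly better on one. The (distance, cost) pairs below are hypothetical, not results from the case study or the paper's optimization model.

```python
def pareto_front(points):
    """Return the non-dominated (distance, cost) pairs when both
    criteria are minimised. A point p is dominated if some other
    point q is no worse than p on both coordinates."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)
        if not dominated:
            front.append(p)
    return sorted(front)

# Hypothetical (distance_km, cost) outcomes for candidate shelter/DC layouts
solutions = [(10, 500), (12, 400), (15, 350), (11, 450), (20, 360)]
front = pareto_front(solutions)   # (20, 360) is dominated by (15, 350)
```

Presenting the frontier rather than a single optimum lets decision makers trade shorter relief distances against lower cost, which is the role the Pareto analysis plays in the method above.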