866 results for global industry classification standard
Abstract:
We have applied a number of objective statistical techniques to define homogeneous climatic regions for the Pacific Ocean, using COADS (Woodruff et al. 1987) monthly sea surface temperature (SST) for 1950-1989 as the key variable. The basic data comprised all global 4°x4° latitude/longitude boxes with enough data available to yield reliable long-term means of monthly mean SST. An R-mode principal components analysis of these data, following a technique first used by Stidd (1967), yields information about harmonics of the annual cycles of SST. We used the spatial coefficients (one for each 4-degree box and eigenvector) as input to a K-means cluster analysis to classify the gridbox SST data into 34 global regions, of which 20 comprise the Pacific and Indian oceans. Seasonal time series were then produced for each of these regions. For comparison purposes, the variance spectrum of each regional anomaly time series was calculated. Most of the significant spectral peaks occur near the biennial (2.1-2.2 years) and ENSO (~3-6 years) time scales in the tropical regions. Decadal-scale fluctuations are important in the mid-latitude ocean regions.
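The two-stage pipeline this abstract describes (R-mode PCA on monthly SST climatologies, then K-means on the spatial component coefficients) can be sketched as follows. This is a minimal illustration on synthetic data: the box count, number of components, and number of clusters are placeholders, not the values used with the actual COADS dataset.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic stand-in for the gridded data: 200 boxes x 12 monthly means.
n_boxes, n_months = 200, 12
months = np.arange(n_months)
phase = rng.uniform(0, 2 * np.pi, n_boxes)
amp = rng.uniform(1, 5, n_boxes)
# Each box's annual cycle: a first-harmonic cosine plus noise.
sst = (amp[:, None] * np.cos(2 * np.pi * months / 12 - phase[:, None])
       + rng.normal(0, 0.3, (n_boxes, n_months)))

# R-mode PCA: eigenvectors capture harmonics of the annual cycle, and
# the scores give one spatial coefficient per box per eigenvector.
pca = PCA(n_components=4)
scores = pca.fit_transform(sst)

# K-means on the spatial coefficients groups boxes into climatic regions.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0)
regions = kmeans.fit_predict(scores)

print(scores.shape)       # (200, 4)
print(len(regions))       # 200, one region label per box
```

Regional time series would then follow by averaging the anomalies of all boxes sharing a region label.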
Abstract:
The Ugandan fishery, heavily influenced by the emergence of global markets, is extremely dynamic. In recent years a major export trade, principally in Nile perch fillets from Lake Victoria, has expanded markedly. The growth of this factory-based processing industry has had a marked impact on the pre-existing artisanal fishery, which has become increasingly dependent on supplying the export market instead of its traditional local small-scale markets. The industrial fishery developed as a response to the liberalisation of the management of the Ugandan economy and the consequent opening up of the export markets in North America and Europe. The emergence of the export industry has resulted in the creation of a dual structure in the fisheries sector, with the Nile perch catching and processing chain operating to European standards, whilst the artisanal sub-sector still utilises traditional methods. This dual structure is a potential source of disadvantage to the artisanal fishery, which has command over fewer financial assets than the export fishery.
Abstract:
The generation of new medicinal products is both a contributor to global economic growth and a source of valuable benefits to human health. Given their direct responsibility for public health, regulatory authorities monitor closely both the development and exploitation of the underlying technologies and the products derived from them. The manner in which such regulation is implemented can result in regulators constraining or facilitating the generation of new products. This paper will study as an example the impact of EU Risk Management Plans (EU-RMPs), which have been mandatory for the approval of new medicines since 2005, on both the industry and regulatory authorities. In interviews, the responses of those who had experience of the implementation of EU-RMPs were mixed. Although the benefits of a more structured and predictable approach to the evaluation of risk were appreciated, some respondents perceived the regulation as an excessive burden on their organisations. The exploration of factors that influence how EU-RMP regulation affects individual firms provides new insights for both regulators and managers, and demonstrates one aspect of the complexity of the process by which new medicinal products are brought to market. © 2010 IEEE.
Abstract:
Purpose: This paper aims to improve understanding of how to manage global network operations from an engineering perspective. Design/methodology/approach: This research adopted a theory-building approach based on case studies. Grounded in the existing literature, the theoretical framework was refined and enriched through nine in-depth case studies in the industry sectors of aerospace, automotive, defence, and electrical and electronics. Findings: This paper demonstrates the main value creation mechanisms of global network operations along the engineering value chain. Typical organisational features to support the value creation mechanisms are captured, and the key issues in engineering network design and operations are presented with an overall framework. Practical implications: Evidenced by a series of pilot applications, outputs of this research can help companies to improve the performance of their current engineering networks and design new engineering networks to better support their global businesses and customers in a systematic way. Originality/value: Issues about the design and operations of global engineering networks (GEN) are poorly understood in the existing literature, in contrast to their apparent importance in value creation and realisation. To address this knowledge gap, this paper introduces the concept of the engineering value chain to highlight the potential of a value chain approach to the exploration of engineering activities in a complex business context. At the same time, it develops an overall framework for managing GEN along the engineering value chain. This improves our understanding of engineering in industrial value chains and extends the theoretical understanding of GEN by integrating engineering network theories and value chain concepts. © Emerald Group Publishing Limited.
Abstract:
The amount of original imaging information produced yearly during the last decade has experienced tremendous growth in all industries due to the technological breakthroughs in digital imaging and electronic storage capabilities. This trend is affecting the construction industry as well, where digital cameras and image databases are gradually replacing traditional photography. Owners demand complete site photograph logs, and engineers store thousands of images for each project to use in a number of construction management tasks like monitoring an activity's progress and keeping evidence of the "as built" in case any disputes arise. So far, retrieval is performed manually, with the user being responsible for image classification according to specific rules that serve a limited number of construction management tasks. New methods that, with the guidance of the user, can automatically classify and retrieve construction site images are being developed and promise to remove the heavy burden of manually indexing images. In this paper, both the existing methods and a novel image retrieval method developed by the authors for the classification and retrieval of construction site images are described and compared. Specifically, a number of examples are deployed in order to present their advantages and limitations. The results from this comparison demonstrate that the content-based image retrieval method developed by the authors can reduce the overall time spent on the classification and retrieval of construction images while providing the user with the flexibility to retrieve images according to different classification schemes.
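The abstract does not detail the authors' retrieval method, but the general idea of content-based image retrieval can be illustrated with a simple colour-histogram index. Everything here is an illustrative assumption (toy random images, 8-bin histograms, histogram-intersection similarity), not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for construction-site photos: 20 random 32x32 RGB images.
images = rng.integers(0, 256, size=(20, 32, 32, 3))

def colour_histogram(img, bins=8):
    # Per-channel histogram, normalised: a simple global content signature.
    h = [np.histogram(img[..., c], bins=bins, range=(0, 256))[0]
         for c in range(3)]
    h = np.concatenate(h).astype(float)
    return h / h.sum()

# Index every image once; queries then need no manual classification.
index = np.array([colour_histogram(im) for im in images])

def retrieve(query_img, top_k=3):
    q = colour_histogram(query_img)
    # Histogram intersection: higher means more similar colour content.
    sims = np.minimum(index, q).sum(axis=1)
    return np.argsort(sims)[::-1][:top_k]

hits = retrieve(images[0])
print(hits)  # the query image itself (index 0) should rank first
```

Real systems add texture and shape features on top of colour, but the index-then-rank structure is the same.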
Abstract:
Our society is addicted to steel. Global demand for steel has risen to 1.4 billion tonnes a year and is set to at least double by 2050, while the steel industry generates nearly a tenth of the world's energy-related CO₂ emissions. Meeting our 2050 climate change targets would require a 75% reduction in CO₂ emissions for every tonne of steel produced, and finding credible solutions is proving a challenge. The starting point for understanding the environmental impacts of steel production is to accurately map the global steel supply chain and identify the biggest steel flows where actions can be directed to deliver the largest impact. In this paper we present a map of global steel, which for the first time traces steel flows from steelmaking, through casting, forming, and rolling, to the fabrication of final goods. The diagram reveals the relative scale of steel flows and shows where efforts to improve energy and material efficiency should be focused.
Abstract:
Identifying strategies for reducing greenhouse gas emissions from steel production requires a comprehensive model of the sector, but previous work has either failed to consider the whole supply chain or considered only a subset of possible abatement options. In this work, a global mass flow analysis is combined with process emissions intensities to allow forecasts of future steel sector emissions under all abatement options. Scenario analysis shows that global capacity for primary steel production is already near to a peak and that, if sectoral emissions are to be reduced by 50% by 2050, the last required blast furnace will be built by 2020. Emissions reduction targets cannot be met by energy and emissions efficiency alone, but deploying material efficiency provides sufficient extra abatement potential.
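The basic accounting behind such a forecast (sector emissions = route output × emissions intensity, with efficiency levers applied) can be sketched with toy numbers. All figures below are illustrative assumptions, not values from the paper.

```python
# Assumed route intensities, t CO2 per t steel (illustrative only).
PRIMARY_INTENSITY = 2.3    # ore-based blast-furnace route
SECONDARY_INTENSITY = 0.5  # scrap-based electric-arc route

def sector_emissions(demand_mt, primary_share,
                     efficiency_gain=0.0, material_efficiency=0.0):
    """Annual sector emissions (Mt CO2) for a given demand scenario.

    efficiency_gain cuts the emissions intensity of each route;
    material_efficiency cuts the steel needed for the same service.
    """
    effective_demand = demand_mt * (1 - material_efficiency)
    primary = effective_demand * primary_share
    secondary = effective_demand * (1 - primary_share)
    intensity_factor = 1 - efficiency_gain
    return (primary * PRIMARY_INTENSITY
            + secondary * SECONDARY_INTENSITY) * intensity_factor

# Baseline 2050: demand roughly doubles, primary route still dominant.
base_2050 = sector_emissions(2800, primary_share=0.6)
# Abated 2050: more scrap, 20% intensity gain, 30% material efficiency.
abated_2050 = sector_emissions(2800, primary_share=0.4,
                               efficiency_gain=0.2,
                               material_efficiency=0.3)
print(abated_2050 < 0.5 * base_2050)  # True with these toy parameters
```

The point the abstract makes falls out of the arithmetic: with intensity gains alone the baseline cannot halve, so the material-efficiency term is what closes the gap.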
Abstract:
We present in two parts an assessment of global manufacturing. In the first part, we review economic development, pollution, and carbon emissions from a country perspective, tracking the rise of China and other developing countries. The results show not only a rise in the economic fortunes of the newly industrializing nations, but also a significant rise in global pollution, particularly air pollution and CO2 emissions largely from coal use, which alter and even reverse previous global trends. In the second part, we change perspective and quantitatively evaluate two important technical strategies to reduce pollution and carbon emissions: energy efficiency and materials recycling. We subdivide the manufacturing sector on the basis of the five major subsectors that dominate energy use and carbon emissions: (a) iron and steel, (b) cement, (c) plastics, (d) paper, and (e) aluminum. The analysis identifies technical constraints on these strategies, but by combined and aggressive action, industry should be able to balance increases in demand with these technical improvements. The result would be high but relatively flat energy use and carbon emissions. The review closes by demonstrating the consequences of extrapolating trends in production and carbon emissions and suggesting two options for further environmental improvement: materials efficiency and demand reduction. © 2013 by Annual Reviews. All rights reserved.
Abstract:
© Springer International Publishing Switzerland 2015. Making sound asset management decisions, such as whether to replace or maintain an ageing underground water pipe, is critical to ensure that organisations maximise the performance of their assets. These decisions are only as good as the data that supports them, and hence many asset management organisations are in desperate need to improve the quality of their data. This chapter reviews the key academic research on data quality (DQ) and information quality (IQ) (used interchangeably in this chapter) in asset management, combines this with the current DQ problems faced by asset management organisations in various business sectors, and presents a classification of the most important DQ problems that need to be tackled by asset management organisations. In this research, eleven semi-structured interviews were carried out with asset management professionals in a range of business sectors in the UK. The problems described in the academic literature were cross-checked against the problems found in industry. In order to support asset management professionals in solving these problems, we categorised them into seven different DQ dimensions used in the academic literature, so that it is clear how these problems fit within the standard frameworks for assessing and improving data quality. Asset management professionals can therefore now use these frameworks to underpin their DQ improvement initiatives while focussing on the most critical DQ problems.
Abstract:
Inspired by the human visual cognition mechanism, this paper first presents a scene classification method based on an improved standard model feature. Compared with state-of-the-art efforts in scene classification, the newly proposed method is more robust, more selective, and of lower complexity. These advantages are demonstrated by two sets of experiments on both our own database and standard public ones. Furthermore, occlusion and disorder problems in scene classification for video surveillance are also studied for the first time in this paper.
Abstract:
We introduce and explore an approach to estimating statistical significance of classification accuracy, which is particularly useful in scientific applications of machine learning where high dimensionality of the data and the small number of training examples render most standard convergence bounds too loose to yield a meaningful guarantee of the generalization ability of the classifier. Instead, we estimate statistical significance of the observed classification accuracy, or the likelihood of observing such accuracy by chance due to spurious correlations of the high-dimensional data patterns with the class labels in the given training set. We adopt permutation testing, a non-parametric technique previously developed in classical statistics for hypothesis testing in the generative setting (i.e., comparing two probability distributions). We demonstrate the method on real examples from neuroimaging studies and DNA microarray analysis and suggest a theoretical analysis of the procedure that relates the asymptotic behavior of the test to the existing convergence bounds.
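The permutation procedure described above is straightforward to sketch: re-estimate cross-validated accuracy under shuffled labels to build the null distribution, then report the fraction of permutations that match or beat the observed accuracy. The classifier, data sizes, and permutation count below are illustrative choices, not those of the paper.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# High-dimensional, small-sample regime: 40 examples, 500 features.
# Labels are independent of the data, so above-chance accuracy here
# would be exactly the spurious correlation the test guards against.
X = rng.normal(size=(40, 500))
y = rng.integers(0, 2, size=40)

def cv_accuracy(X, y):
    clf = KNeighborsClassifier(n_neighbors=3)
    return cross_val_score(clf, X, y, cv=5).mean()

observed = cv_accuracy(X, y)

# Null distribution: accuracy when the labels are randomly permuted.
n_perm = 100
null = np.array([cv_accuracy(X, rng.permutation(y))
                 for _ in range(n_perm)])

# Add-one p-value: chance of seeing accuracy this high under the null.
p_value = (np.sum(null >= observed) + 1) / (n_perm + 1)
print(0.0 < p_value <= 1.0)  # True by construction
```

scikit-learn also ships this procedure as `sklearn.model_selection.permutation_test_score`; the loop above just makes the mechanics explicit.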
Abstract:
Wilkinson, Jane, Performing the Local and the Global: The Theatre Festivals of Lake Constance (Peter Lang, 2007), pp. 286.
Abstract:
Nearest neighbor classification using shape context can yield highly accurate results in a number of recognition problems. Unfortunately, the approach can be too slow for practical applications, and thus approximation strategies are needed to make shape context practical. This paper proposes a method for efficient and accurate nearest neighbor classification in non-Euclidean spaces, such as the space induced by the shape context measure. First, a method is introduced for constructing a Euclidean embedding that is optimized for nearest neighbor classification accuracy. Using that embedding, multiple approximations of the underlying non-Euclidean similarity measure are obtained, at different levels of accuracy and efficiency. The approximations are automatically combined to form a cascade classifier, which applies the slower approximations only to the hardest cases. Unlike typical cascade-of-classifiers approaches, which are applied to binary classification problems, our method constructs a cascade for a multiclass problem. Experiments with a standard shape data set indicate that a speed-up of two to three orders of magnitude is gained over the standard shape context classifier, with minimal losses in classification accuracy.
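A simplified two-stage version of this idea (a cheap Euclidean embedding filters candidates, and the expensive exact measure is applied only to the survivors) can be sketched as follows. The distance function, the reference-object embedding, and all sizes are stand-in assumptions; the paper's embedding is learned and its cascade has more stages.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for an expensive non-Euclidean measure (shape context in
# the paper): a deliberately non-trivial symmetric distance.
def exact_distance(a, b):
    return float(np.abs(np.sort(a) - np.sort(b)).sum())

n, dim = 300, 20
db = rng.normal(size=(n, dim))       # database "shapes"
labels = rng.integers(0, 4, n)       # 4-class problem

# Lipschitz-style embedding: distances to a few reference objects give
# cheap Euclidean coordinates that approximate the exact measure.
refs = db[rng.choice(n, 8, replace=False)]
def embed(x):
    return np.array([exact_distance(x, r) for r in refs])
db_emb = np.array([embed(x) for x in db])

def classify(query, shortlist=10):
    # Stage 1 (fast approximation): rank by distance in embedding space.
    d_emb = np.linalg.norm(db_emb - embed(query), axis=1)
    cand = np.argsort(d_emb)[:shortlist]
    # Stage 2 (slow, exact): apply the true measure only to the shortlist.
    d_exact = [exact_distance(query, db[i]) for i in cand]
    return labels[cand[int(np.argmin(d_exact))]]

pred = classify(rng.normal(size=dim))
print(int(pred) in range(4))  # True: a valid class label
```

The cost saving comes from stage 1: only `shortlist + len(refs)` exact-distance calls per query instead of one per database item.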
Abstract:
How do humans rapidly recognize a scene? How can neural models capture this biological competence to achieve state-of-the-art scene classification? The ARTSCENE neural system classifies natural scene photographs by using multiple spatial scales to efficiently accumulate evidence for gist and texture. ARTSCENE embodies a coarse-to-fine Texture Size Ranking Principle whereby spatial attention processes multiple scales of scenic information, ranging from global gist to local properties of textures. The model can incrementally learn and predict scene identity by gist information alone and can improve performance through selective attention to scenic textures of progressively smaller size. ARTSCENE discriminates 4 landscape scene categories (coast, forest, mountain and countryside) with up to 91.58% correct on a test set, outperforms alternative models in the literature which use biologically implausible computations, and outperforms component systems that use either gist or texture information alone. Model simulations also show that adjacent textures form higher-order features that are also informative for scene recognition.