42 results for Triple Bottom Line Approach
in Aston University Research Archive
Abstract:
The number of research papers linking sustainability with supply chain management is increasing around the world. The purpose of this paper is to analyse how publications in Brazil address the relationship between sustainability and supply chain management. The methodology applied consists of five major steps: (1) selection of databases and journals, (2) selection of the papers, (3) reading of the papers' abstracts to select only papers related to business and sustainability, (4) qualitative and quantitative analysis of the selected papers' abstracts to identify the main sustainability dimension and aspect addressed, and finally, (5) an evaluation of experts' responses to a questionnaire in the field of sustainability and supply chain in Brazil. The literature review covered 120 Brazilian academic journals, in which 124 papers related to sustainability, business management and companies, published from 2008 to 2013, were identified. When considered against the traditional Triple Bottom Line approach, the results of the analysis show that sustainability research in Brazil focuses on the environmental dimension while SCM research focuses on the economic dimension. Additional insights are provided by integrating the governance dimension into the analysis to underline which actions and policies are discussed in the Brazilian literature at a corporate level. The consultation of experts in the field of sustainability in Brazil was aimed at better understanding the results of the literature review. One of the main conclusions is that there are large opportunities to increase publications about sustainability and SCM in the country.
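As a rough illustration of how a step like (4) might be operationalised, here is a minimal sketch in Python, assuming purely hypothetical keyword lists rather than the authors' actual coding scheme:

```python
# Minimal sketch: tag an abstract with its dominant Triple Bottom Line
# dimension by counting keyword hits. The keyword lists are illustrative
# placeholders, not the coding scheme used in the paper.
from collections import Counter

KEYWORDS = {
    "environmental": ["environment", "emission", "waste", "recycling", "green"],
    "economic": ["cost", "profit", "efficiency", "competitiveness", "economic"],
    "social": ["community", "labour", "health", "equity", "social"],
}

def dominant_dimension(abstract: str) -> str:
    text = abstract.lower()
    hits = Counter({dim: sum(text.count(kw) for kw in kws)
                    for dim, kws in KEYWORDS.items()})
    dim, count = hits.most_common(1)[0]
    return dim if count > 0 else "unclassified"

# A corpus could then be summarised with:
# Counter(dominant_dimension(a) for a in abstracts)
```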
Abstract:
This research study illustrates the importance of sustainable purchasing practices for organizations in the U.S. distribution industry and answers several important questions: what is the current awareness of U.S. organizations regarding sustainable purchasing practices; to what extent are U.S. organizations evaluating, selecting, and retaining suppliers based upon sustainable purchasing practices; and to what extent are sustainable purchasing practices being implemented by the U.S. organizations under study? With an ever-increasing global economy, it is critically important for organizations to put sustainability practices in place; the biggest impact an organization can make is often in its purchasing department. The study begins by explaining the reasoning for conducting the research, then builds the reader's understanding of sustainability in a supply chain environment. It then moves to how sustainable purchasing can be an advantageous method for bringing about "triple bottom line" savings to an organization. This section is followed by the researcher's methodology and the final results of a survey conducted to examine the current awareness and implementation of sustainable purchasing practices among the U.S. plumbing, heating, cooling and piping (PHCP) distribution firms that participated in the study.
Abstract:
Block copolymers are versatile designer macromolecules where a “bottom-up” approach can be used to create tailored materials with unique properties. These simple building blocks allow us to create actuators that convert energy from a variety of sources (such as chemical, electrical and heat) into mechanical energy. In this review we will discuss the advantages and potential pitfalls of using block copolymers to create actuators, putting emphasis on the ways in which these materials can be synthesised and processed. Particular attention will be given to the theoretical background of microphase separation and how the phase diagram can be used during the design process of actuators. Different types of actuation will be discussed throughout.
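For orientation, the mean-field result the review's theoretical background builds on (a textbook relation, not a finding of this paper) is that a symmetric diblock copolymer microphase-separates once the product of the Flory-Huggins interaction parameter and the degree of polymerisation exceeds a critical value:

\[
(\chi N)_{\mathrm{ODT}} \approx 10.5 \quad (f = 0.5), \qquad \chi \approx A + \frac{B}{T},
\]

so the order-disorder transition, and with it the region of the phase diagram an actuator operates in, can be tuned through chain length $N$, block fraction $f$ and temperature $T$.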
Abstract:
The application of any e-Solution promises significant returns. In particular, using internet technologies both within enterprises and across the supply (value) chain provides real opportunity, not only for operational improvement but also for innovative strategic positioning. However, significant questions obscure potential investment: how any value will actually be created and, importantly, how this value will be shared across the value chain are not clear. This paper will describe a programme of research that is developing an enterprise simulator that will provide a more fundamental understanding of the impact of e-Solutions across operational supply chains, in terms of both standard operational and financial measures of performance. An efficient supply chain reduces total costs of operations by sharing accurate real-time information and coordinating inter-organizational business processes. This form of electronic link between organizations is known as business-to-business (B2B) e-Business. The financial measures go beyond simple cost calculations to real bottom-line performance by modelling the financial transactions that business processes generate. The paper will show how this enterprise simulator allows for a complete supply chain to be modelled in this way across four key applications: control system design, virtual enterprises, pan-supply-chain performance metrics and supporting e-Supply-chain design methodology.
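Purely as an illustration of the modelling idea, and not the enterprise simulator itself, the sketch below couples each operational event in a toy two-tier supply chain to a financial transaction, so that an operational measure (fill rate) and a bottom-line measure (net cash) emerge from the same simulated processes; all parameter values are invented:

```python
# Toy two-tier supply chain: every operational event (sale, replenishment)
# also generates a financial transaction. Figures are illustrative only.
import random

def simulate(periods=52, unit_cost=6.0, unit_price=10.0, holding_cost=0.2,
             reorder_point=40, order_qty=100, seed=1):
    random.seed(seed)
    inventory, cash = 60, 0.0
    total_demand = total_sold = 0
    for _ in range(periods):
        demand = random.randint(10, 30)
        sold = min(demand, inventory)
        total_demand += demand
        total_sold += sold
        inventory -= sold
        cash += sold * unit_price          # revenue from customer sales
        cash -= inventory * holding_cost   # cost of carrying stock
        if inventory <= reorder_point:     # B2B replenishment order
            inventory += order_qty         # assume immediate delivery
            cash -= order_qty * unit_cost  # payment to the supplier
    return {"fill_rate": round(total_sold / total_demand, 3),
            "net_cash": round(cash, 2)}

print(simulate())
```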
Abstract:
Purpose - This paper aims to demonstrate the need for an improved understanding of the opportunities offered by privacy online. This is contextualized in the case of supermarket purchases of food in particular, often described as an intimate and personal choice. In the case of grocery shopping, the "intimacy" may be at the household level between members and/or between e-grocers' food offerings and their other "non-food" related services. Design/methodology/approach - This paper draws upon social practice theory research, retailing and consumer behaviour in order to develop a conceptual framework for understanding the value of positive privacy. The research uses 39 in-depth interviews of e-grocery shoppers in the area of Portsmouth (UK). Findings - This paper suggests a framework for embedding elements of positive privacy into retailing strategy as a driver for growth in the e-grocery sector. Three meta-themes requiring different approaches to privacy are uncovered. Positive privacy is dynamic and contextual at the consumer/household levels as well as for product/e-grocery brands. Research limitations/implications - This paper advocates the building of long-term sustainable relationships through sharing, offering, and exchange of information rather than pure technological chasing of data. Originality/value - A consumer-centred bottom-up approach is employed, demonstrating the value of two-way dialogues with consumers on sensitive issues. E-grocery is used as an illustration that involves regular re-purchase of a basket of staple goods over a long period of time, where privacy becomes a latent long-term concern. © Emerald Group Publishing Limited.
Abstract:
A history of government drug regulation and the relationship between the pharmaceutical companies in the U.K. and the licensing authority is outlined. Phases of regulatory stringency are identified, with the formation of the Committees on Safety of Drugs and Medicines viewed as watersheds. A study of the impact of government regulation on industrial R&D activities focuses on the effects on the rate and direction of new product innovation. A literature review examines the decline in new chemical entity innovation. Regulations are cited as a major but not singular cause of the decline. Previous research attempting to determine the causes of such a decline on an empirical basis is reviewed and the methodological problems associated with such research are identified. The U.K.-owned sector of the British pharmaceutical industry is selected for a study employing a bottom-up approach allowing disaggregation of data. A historical background to the industry is provided, with each company analysed on a case study basis. Variations between companies regarding the policies adopted for R&D are emphasised. The process of drug innovation is described in order to determine possible indicators of the rate and direction of inventive and innovative activity. All possible indicators are considered and their suitability assessed. R&D expenditure data for the period 1960-1983 are subsequently presented as an input indicator. Intermediate output indicators are treated in a similar way and patent data are identified as a readily available and useful source. The advantages and disadvantages of using such data are considered. Using interview material, patenting policies for most of the U.K. companies are described, providing a background for a patent-based study. Sources of patent data are examined with an emphasis on computerised systems. A number of searches using a variety of sources are presented. Patent family size is examined as a possible indicator of an invention's relative importance. The patenting activity of the companies over the period 1960-1983 is given and the variation between companies is noted. The relationship between patent data and the other indicators used is analysed using statistical methods, resulting in an apparent lack of correlation. An alternative approach taking into account variations in company policy and phases in research activity indicates a stronger relationship between patenting activity, R&D expenditure and NCE output over the period. The relationship is not apparent at an aggregated company level. Some evidence is presented for a relationship between phases of regulatory stringency and inventive and innovative activity, but the importance of other factors is emphasised.
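As a minimal sketch of the kind of indicator comparison described, the snippet below correlates a company's annual patent filings with its R&D expenditure; the figures are invented placeholders, not data from the thesis:

```python
# Illustrative only: Pearson correlation between annual patent counts and
# R&D expenditure for a single company. Values are placeholders.
from statistics import correlation  # available from Python 3.10

rd_spend = [4.1, 4.8, 5.5, 6.0, 7.2, 8.1]  # R&D spend per year (arbitrary units)
patents = [12, 15, 14, 19, 22, 25]          # patent filings per year

print(f"Pearson r = {correlation(rd_spend, patents):.2f}")
```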
Abstract:
Current British government economic development policy emphasises regional and sub-regional scale, multi-agent initiatives that form part of national frameworks to encourage a 'bottom up' approach to economic development. An emphasis on local multi-agent initiatives was also the mission of Training and Enterprise Councils (TECs). Using new survey evidence, this article tracks the progress of a number of initiatives established under the TECs, using the TEC Discretionary Fund as an example. It assesses the ability of successor bodies to be more effective in promoting local economic development. Survey evidence is used to confirm that many projects previously set up by the TECs continue to operate successfully under new partnership arrangements. However, as new structures have developed and policy has become more centralised, it is less likely that similar local initiatives will be developed in future. There is evidence to suggest that with the end of the TECs a gap has emerged in the institutional infrastructure for local economic development, particularly with regard to workforce development. Much will depend in future on how the Regional Development Agencies deploy their growing power and resources.
Abstract:
The incentive dilemma refers to a situation in which incentives are offered but do not work as intended. The authors suggest that, in an interorganizational context, whether a principal-provided incentive works is a function of how it is evaluated by an agent: for its contribution to the agent's bottom line (instrumental evaluation) and for the extent to which it is strategically aligned with the agent's direction (congruence evaluation). To further understand when incentives work, the influence of two key contextual variables, industry volatility and dependence, is examined. A field study featuring 57 semi-structured depth interviews and 386 responses from twin surveys in the information technology and brewing industries provides data for hypothesis testing. When and whether incentives work is demonstrated by certain conditions under which the agent's evaluation of an incentive has positive or negative effects on its compliance and active representation. Further, some outcomes are reversed in the high-volatility condition. © 2013 Academy of Marketing Science.
Abstract:
The human and material cost of type 2 diabetes is a cause of increasing concern for health professionals, representative organisations and governments worldwide. The scale of morbidity and mortality has led the United Nations to issue a resolution on diabetes, calling for national policies for prevention, treatment and care. There is clearly an urgent need for a concerted response from all interested parties at the community, national and international level to work towards the goals of the resolution and create effective, sustainable treatment models, care systems and prevention strategies. Action requires both a 'bottom-up' approach of public awareness campaigns and pressure from healthcare professionals, coupled with a 'top-down' drive for change, via partnerships with governments, third sector (non-governmental) organisations and other institutions. In this review, we examine how existing collaborative initiatives serve as examples for those seeking to implement change in health policy and practice in the quest to alleviate the health and economic burden of diabetes. Efforts are underway to provide continuous and comprehensive care models for those who already have type 2 diabetes; in some cases, national plans extend to prevention strategies in attempts to improve overall public health. In the spirit of partnership, collaborations with governments that incorporate sustainability, long-term goals and a holistic approach continue to be a driving force for change. It is now critical to maintain this momentum and use the growing body of compelling evidence to educate, inform and deliver a long-term, lasting impact on patient and public health worldwide. © 2007 The Authors.
Abstract:
Microfinance has been developed as an alternative solution to the global poverty alleviation effort over the last 30 years. Microfinance institutions (MFIs) have the unique characteristic of facing double bottom line objectives: outreach to the poor and financial sustainability. This study proposes a two-stage analysis to measure the performance of Islamic microfinance institutions (IMFIs) by comparing them to conventional MFIs. First, we develop a Data Envelopment Analysis (DEA) framework to measure MFIs' efficiency against their double bottom line objectives, i.e. in terms of social and financial efficiency. In the second stage, non-parametric tests are used to compare the performance and identify factors that contribute to the efficiency of IMFIs and MFIs.
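A minimal sketch of the first-stage idea, an input-oriented CCR DEA model solved as a linear program, is given below; the choice of inputs and outputs and all figures are hypothetical, not the study's specification:

```python
# Input-oriented CCR DEA efficiency for each decision-making unit (DMU),
# solved as a linear program. Data are hypothetical placeholders.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0],     # inputs, e.g. operating cost per MFI
              [1.0, 2.0, 1.5]])    #         staff per MFI
Y = np.array([[10.0, 14.0, 12.0],  # outputs, e.g. active borrowers (social)
              [3.0, 4.0, 5.0]])    #          gross loan portfolio (financial)

def ccr_efficiency(o):
    """Efficiency score theta for DMU o (theta = 1 means efficient)."""
    n = X.shape[1]
    c = np.r_[1.0, np.zeros(n)]                   # minimise theta
    A_in = np.c_[-X[:, [o]], X]                   # sum(lam*x) <= theta * x_o
    A_out = np.c_[np.zeros((Y.shape[0], 1)), -Y]  # sum(lam*y) >= y_o
    b = np.r_[np.zeros(X.shape[0]), -Y[:, o]]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

print([round(ccr_efficiency(o), 3) for o in range(X.shape[1])])
```

Social and financial efficiency scores would come from running two such models with the corresponding output sets, after which non-parametric tests (e.g. a Mann-Whitney U test) can compare IMFIs with conventional MFIs.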
Abstract:
As the pressure continues to grow on Diamond and the world's synchrotrons for higher throughput of diffraction experiments, new and novel techniques are required for presenting micron-dimension crystals to the X-ray beam. Currently this task is both labour-intensive and primarily a serial process. Diffraction measurements typically take milliseconds, but sample preparation and presentation can reduce throughput to as few as four measurements an hour. With beamline waiting times as long as two years, it is of key importance for researchers to capitalize on available beam time, generating as much data as possible. Other approaches detailed in the literature [1] [2] [3] are very much skewed towards using robotics to automate the actions of existing manual protocols. The work detailed here is the development and discussion of a bottom-up approach relying on standing surface acoustic wave (SSAW) self-assembly, including material selection, microfluidic integration and tuning of the acoustic cavity to order the protein crystals.
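For orientation (a standard acoustofluidics relation rather than a result of this work), the pressure nodes of a standing surface acoustic wave, towards which particles with positive acoustic contrast migrate, are spaced half a wavelength apart, so the pitch of the ordering template is set by the drive frequency:

\[
d_{\text{node}} = \frac{\lambda_{\mathrm{SAW}}}{2} = \frac{c_{\mathrm{SAW}}}{2f},
\]

where $c_{\mathrm{SAW}}$ is the surface wave speed on the piezoelectric substrate and $f$ the excitation frequency; tuning the acoustic cavity amounts to matching this spacing to the desired crystal arrangement.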
Abstract:
Purpose: The purpose of this paper is to present the application of logical framework analysis (LFA) for implementing continuous quality improvement (CQI) across multiple settings in a tertiary care hospital. Design/methodology/approach: This study adopts a multiple case study approach. LFA is implemented within three diverse settings, namely, an intensive care unit, a surgical ward, and an acute in-patient psychiatric ward. First, problem trees are developed in order to determine the root causes of quality issues specific to the three settings. Second, objective trees are formed suggesting solutions to the quality issues. Third, a project plan template using the logical framework (LOGFRAME) is created for each setting. Findings: This study shows substantial improvement in quality across the three settings. LFA proved effective in analysing quality issues and suggesting improvement measures objectively. Research limitations/implications: This paper applies LFA in specific, albeit diverse, settings in one hospital. For validation purposes, it would be ideal to analyse other settings within the same hospital, as well as in several hospitals. It also adopts a bottom-up approach, which could be triangulated with other sources of data. Practical implications: LFA enables top management to obtain an integrated view of performance. It also provides a basis for further quantitative research on quality management through the identification of key performance indicators and facilitates the development of a business case for improvement. Originality/value: LFA is a novel approach for the implementation of CQI programs. Although LFA has been used extensively for project development to source funds from development banks, its application in quality improvement within healthcare projects is scant.
Abstract:
In this paper we review recent theoretical approaches for analysing the dynamics of on-line learning in multilayer neural networks using methods adopted from statistical physics. The analysis is based on monitoring a set of macroscopic variables from which the generalisation error can be calculated. A closed set of dynamical equations for the macroscopic variables is derived analytically and solved numerically. The theoretical framework is then employed for defining optimal learning parameters and for analysing the incorporation of second order information into the learning process using natural gradient descent and matrix-momentum based methods. We will also briefly explain an extension of the original framework for analysing the case where training examples are sampled with repetition.
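For orientation, the macroscopic variables in this framework are typically the student-teacher overlap order parameters (stated here in their standard form rather than as this paper's full derivation): for student weight vectors $\mathbf{w}_i$ and teacher vectors $\mathbf{B}_n$ in $N$ dimensions,

\[
Q_{ik} = \frac{\mathbf{w}_i \cdot \mathbf{w}_k}{N}, \qquad
R_{in} = \frac{\mathbf{w}_i \cdot \mathbf{B}_n}{N}, \qquad
T_{nm} = \frac{\mathbf{B}_n \cdot \mathbf{B}_m}{N},
\]

and in the large-$N$ limit the generalisation error depends on the weights only through $Q$, $R$ and $T$, which is what allows a closed set of deterministic equations of motion to be written down and solved.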
Abstract:
Online learning is discussed from the viewpoint of Bayesian statistical inference. By replacing the true posterior distribution with a simpler parametric distribution, one can define an online algorithm as a repetition of two steps: an update of the approximate posterior when a new example arrives, and an optimal projection into the parametric family. Choosing this family to be Gaussian, we show that the algorithm achieves asymptotic efficiency. An application to learning in single-layer neural networks is given.
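A compact way to write the two steps for a Gaussian family (the generic moment-matching form, shown here for illustration rather than in the paper's exact notation): with current approximation $q_t(\mathbf{w}) = \mathcal{N}(\boldsymbol{\mu}_t, \Sigma_t)$, a new example $(\mathbf{x}_t, y_t)$ and normaliser $Z_t = \int p(y_t \mid \mathbf{x}_t, \mathbf{w})\, q_t(\mathbf{w})\, d\mathbf{w}$, projecting the updated posterior back onto the Gaussian family gives

\[
\boldsymbol{\mu}_{t+1} = \boldsymbol{\mu}_t + \Sigma_t \, \nabla_{\boldsymbol{\mu}} \ln Z_t, \qquad
\Sigma_{t+1} = \Sigma_t + \Sigma_t \left( \nabla_{\boldsymbol{\mu}} \nabla_{\boldsymbol{\mu}}^{\top} \ln Z_t \right) \Sigma_t .
\]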