822 results for Value chain


Relevance:

20.00%

Publisher:

Abstract:

IEC 61850 Process Bus technology has the potential to improve the cost, performance and reliability of substation design. Substantial costs associated with copper wiring (design, documentation, construction, commissioning and troubleshooting) can be reduced with the application of digital Process Bus technology, especially technology based upon international standards. An IEC 61850-9-2 based sampled value Process Bus is an enabling technology for the application of Non-Conventional Instrument Transformers (NCIT). Retaining the output of the NCIT in its native digital form, rather than converting it to an analogue output, allows for improved transient performance, dynamic range, safety and reliability, and reduced cost. In this paper we report on a pilot installation using NCITs communicating across a switched Ethernet network using the UCAIug Implementation Guideline for IEC 61850-9-2 (9-2 Light Edition or 9-2LE). This system was commissioned in a 275 kV Line Reactor bay at Powerlink Queensland’s Braemar substation in 2009, with sampled value protection IEDs 'shadowing' the existing protection system. The results of commissioning tests and twelve months of service experience using a Fibre Optic Current Transformer (FOCT) from Smart Digital Optics (SDO) are presented, including the response of the system to fault conditions. A number of remaining issues to be resolved to enable wide-scale deployment of NCITs and IEC 61850-9-2 Process Bus technology are also discussed.
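
As a rough companion to the abstract above (not material from the paper itself), the sketch below shows how one power-frequency cycle of a 9-2LE style current channel could be encoded as scaled integer counts; the 80 samples-per-cycle rate and the 1 mA-per-count scaling are assumptions drawn from the 9-2LE guideline, and all names are illustrative.

```python
# Illustrative sketch only: encodes one power-frequency cycle of phase current
# as IEC 61850-9-2LE style integer samples (assumed scaling: 1 mA per count,
# 80 samples per nominal cycle). Names and values are examples, not a vendor API.
import math

F_NOMINAL = 50          # Hz, assumed nominal frequency
SAMPLES_PER_CYCLE = 80  # 9-2LE protection-class rate
I_SCALE = 0.001         # assumed 1 mA per integer count

def encode_current_cycle(i_rms_amperes: float) -> list[int]:
    """Return one cycle of phase current as scaled integer counts."""
    peak = i_rms_amperes * math.sqrt(2)
    samples = []
    for n in range(SAMPLES_PER_CYCLE):
        value = peak * math.sin(2 * math.pi * n / SAMPLES_PER_CYCLE)
        samples.append(int(round(value / I_SCALE)))
    return samples

if __name__ == "__main__":
    cycle = encode_current_cycle(i_rms_amperes=400.0)  # e.g. 400 A primary current
    print(len(cycle), "samples; first five counts:", cycle[:5])
```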

Relevance:

20.00%

Publisher:

Abstract:

A key concern in the field of contemporary fashion/textiles design is the emergence of ‘fast fashion’: best explained as "buy it Friday, wear it Saturday and throw it away on Sunday" (O'Loughlin, 2007). In this contemporary retail atmosphere of “pile it high: sell it cheap” and “quick to market”, even designer goods have achieved a throwaway status. This modern culture of consumerism is the antithesis of sustainability and is creating a dilemma around sustainable practice for designers and producers in these disciplines (de Blas, 2010). Design researchers, including those in textiles/fashion, have begun to explore a key question of the 21st century in order to create a vision and rationale for their disciplines: can products be designed to have added value to the consumer and hence contribute to a more sustainable industry? Fashion Textiles Design has much to answer for in contributing to the problems of unsustainable practices on a global scale in design, production and waste. However, designers within this field also have great potential to contribute to practical ‘real world’ solutions.

This paper provides an overview of some of the design and technological developments from the fashion/textiles industry, endorsing a model where designers and technicians use their transferable skills for wellbeing rather than desire. Smart materials in the form of responsive and adaptive fibres and fabrics, combined with electro-active devices and ICT, are increasingly shaping many aspects of society, particularly in the leisure industry, and interactive consumer products are ever more visible in healthcare. Combinations of biocompatible delivery devices with biosensing elements can create early warning and monitoring systems that sense, analyse and actuate, and that can be linked to data logging and patient records via intelligent networks. Patient-sympathetic, ‘smart’ fashion/textiles applications based on interdisciplinary expertise utilising textiles design and technology are emerging. An analysis of a series of case studies demonstrates the potential of fashion textiles design practitioners to exploit the concept of value adding through technological garment and textiles applications and enhancement for health and wellbeing, and in doing so contribute to a more sustainable future fashion/textiles design industry.

Relevance:

20.00%

Publisher:

Abstract:

Proposed transmission smart grids will use a digital platform for the automation of substations operating at voltage levels of 110 kV and above. The IEC 61850 series of standards, released in parts over the last ten years, provides a specification for substation communications networks and systems. These standards, along with IEEE Std 1588-2008 Precision Time Protocol version 2 (PTPv2) for precision timing, are recommended by both the IEC Smart Grid Strategy Group and the NIST Framework and Roadmap for Smart Grid Interoperability Standards for substation automation. IEC 61850-8-1 and IEC 61850-9-2 provide an interoperable solution to support multi-vendor digital process bus solutions, allowing for the removal of potentially lethal voltages and damaging currents from substation control rooms, a reduction in the amount of cabling required in substations, and the adoption of non-conventional instrument transformers (NCITs). IEC 61850, PTPv2 and Ethernet are three complementary protocol families that together define the future of sampled value (SV) digital process connections for smart substation automation. This paper describes a specific test and evaluation system that uses real-time simulation, protection relays, PTPv2 time clocks and artificial network impairment, and that is being used to investigate technical impediments to the adoption of SV process bus systems by transmission utilities. Knowing the limits of a digital process bus, especially when sampled values and NCITs are included, will enable utilities to make informed decisions regarding the adoption of this technology.
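
As a rough illustration of the kind of impairment testing described above (not the authors' test harness), the following sketch adds random latency to a sampled-value frame stream and counts frames that miss an assumed relay buffering deadline; the deadline, delay model and all parameter values are hypothetical.

```python
# Minimal sketch: models artificial network impairment on a sampled-value stream
# (nominally 4000 frames/s at 50 Hz, 80 samples per cycle) by drawing a random
# latency per frame and counting frames that arrive after an assumed deadline.
import random

DEADLINE_US = 3000   # assumed relay buffering deadline, microseconds (illustrative)

def simulate_late_frames(n_frames: int, base_delay_us: float,
                         jitter_us: float, seed: int = 1) -> float:
    """Return the fraction of frames whose total delay exceeds the deadline."""
    rng = random.Random(seed)
    late = 0
    for _ in range(n_frames):
        delay = base_delay_us + rng.expovariate(1.0 / jitter_us)  # mean jitter_us
        if delay > DEADLINE_US:
            late += 1
    return late / n_frames

if __name__ == "__main__":
    for jitter in (100, 500, 1000):
        frac = simulate_late_frames(100_000, base_delay_us=200, jitter_us=jitter)
        print(f"mean jitter {jitter} us -> {frac:.4%} of frames late")
```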

Relevance:

20.00%

Publisher:

Abstract:

Infrastructure capacity management is the process of ensuring optimal provision of infrastructure assets to support business operations. Effectiveness in this process will enable infrastructure asset owners and their stakeholders to receive full value from their investment. Management research has shown that an organisation can only achieve business value when it has the right capabilities. This paradigm can also be applied to infrastructure capacity management. With competing needs for limited organisational resources, the challenge for infrastructure organisations is to identify and invest their limited resources to develop the right capabilities in the management of their infrastructure capacity. Using a multiple case study approach, the challenges faced in the management of infrastructure asset capacity and the approaches that can be adopted to overcome these challenges were explored. Conceptualising the approaches adopted by the case participants, the findings suggest that infrastructure organisations must strengthen their stakeholder connectivity capability in order to effectively manage the capacity of their infrastructure assets.

Relevance:

20.00%

Publisher:

Abstract:

A Wireless Sensor Network (WSN) is a set of sensors that are integrated with a physical environment. These sensors are small in size, and capable of sensing physical phenomena and processing them. They communicate in a multihop manner, due to a short radio range, to form an Ad Hoc network capable of reporting network activities to a data collection sink. Recent advances in WSNs have led to several new promising applications, including habitat monitoring, military target tracking, natural disaster relief, and health monitoring. A current sensor node, such as the MICA2, uses a 16 bit, 8 MHz Texas Instruments MSP430 micro-controller with only 10 KB RAM, 128 KB program space and 512 KB external flash memory to store measurement data, and is powered by two AA batteries. Due to these unique specifications and a lack of tamper-resistant hardware, devising security protocols for WSNs is complex. Previous studies show that data transmission consumes much more energy than computation. Data aggregation can greatly help to reduce this consumption by eliminating redundant data. However, aggregators are under the threat of various types of attacks. Among them, node compromise is usually considered one of the most challenging for the security of WSNs. In a node compromise attack, an adversary physically tampers with a node in order to extract the cryptographic secrets. This attack can be very harmful depending on the security architecture of the network. For example, when an aggregator node is compromised, it is easy for the adversary to change the aggregation result and inject false data into the WSN.

The contributions of this thesis to the area of secure data aggregation are manifold. We firstly define security for data aggregation in WSNs. In contrast with existing secure data aggregation definitions, the proposed definition covers the unique characteristics that WSNs have. Secondly, we analyze the relationship between security services and adversarial models considered in existing secure data aggregation in order to provide a general framework of required security services. Thirdly, we analyze existing cryptographic-based and reputation-based secure data aggregation schemes. This analysis covers the security services provided by these schemes and their robustness against attacks. Fourthly, we propose a robust reputation-based secure data aggregation scheme for WSNs. This scheme minimizes the use of heavy cryptographic mechanisms. The security advantages provided by this scheme are realized by integrating aggregation functionalities with: (i) a reputation system, (ii) estimation theory, and (iii) a change detection mechanism. We have shown that this addition helps defend against most of the security attacks discussed in this thesis, including the On-Off attack. Finally, we propose a secure key management scheme in order to distribute essential pairwise and group keys among the sensor nodes. The design idea of the proposed scheme is the combination of Lamport's reverse hash chain with the usual hash chain to provide both past and future key secrecy. The proposal avoids the delivery of the whole value of a new group key during a group key update; instead, only half of the value is transmitted from the network manager to the sensor nodes. This way, the compromise of a pairwise key alone does not lead to the compromise of the group key. The new pairwise key in our scheme is determined by Diffie-Hellman based key agreement.
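
One possible reading of the combined hash-chain idea, offered as an interpretation rather than the thesis's exact protocol, is sketched below: the group key for each epoch mixes a locally evolved forward hash chain (past key secrecy) with an element of a Lamport reverse hash chain released by the network manager (future key secrecy), so only that released "half" needs to travel over the network.

```python
# Illustrative interpretation only (not the thesis's exact protocol): the epoch
# group key combines a locally evolved forward chain with a Lamport reverse-chain
# element broadcast by the network manager. All names and seeds are hypothetical.
import hashlib

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_reverse_chain(seed: bytes, n: int) -> list[bytes]:
    """Manager side: precompute H^n(seed), ..., H(seed), seed and release them in
    that order, so already-released elements cannot yield the still-secret ones."""
    chain = [seed]
    for _ in range(n):
        chain.append(H(chain[-1]))
    return chain[::-1]

def next_forward(f_prev: bytes) -> bytes:
    """Node side: forward chain evolves locally and is never transmitted."""
    return H(b"fwd" + f_prev)

def group_key(f_i: bytes, r_i: bytes) -> bytes:
    return H(f_i + r_i)

if __name__ == "__main__":
    reverse = build_reverse_chain(b"manager-secret", n=10)
    f = b"node-initial-forward-secret"
    for epoch in range(3):
        f = next_forward(f)                      # local "half": past key secrecy
        gk = group_key(f, reverse[epoch])        # broadcast "half": future key secrecy
        print(f"epoch {epoch}: group key {gk.hex()[:16]}...")
```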

Relevance:

20.00%

Publisher:

Abstract:

Purpose: The purpose of this paper is to explain variations in discretionary information shared between buyers and key suppliers. The paper also aims to examine how the extent of information shared affects buyers’ performance in terms of resource usage, output, and flexibility.

Design/methodology/approach: The data for the paper comprise 221 Finnish and Swedish non-service companies obtained through a mail survey. The hypothesized relationships were tested using partial least squares modelling with reflective and formative constructs.

Findings: The results of the study suggest that (environmental and demand) uncertainty and interdependency can to some degree explain the extent of information shared between a buyer and key supplier. Furthermore, information sharing improves buyers’ performance with respect to resource usage, output, and flexibility.

Research limitations/implications: A limitation of the paper relates to the data, which only included buyers. A better approach would have been to collect data from both buyers and key suppliers.

Practical implications: Companies face a wide range of supply chain solutions that enable and encourage collaboration across organizations. This paper suggests a more selective and balanced approach toward adopting the solutions offered, as the benefits are contingent on a number of factors such as uncertainty. Also, the risks of information sharing are far too high for a one-size-fits-all approach.

Originality/value: The paper illustrates the applicability of transaction cost theory to the contemporary era of e-commerce. With this finding, transaction cost economics can provide a valuable lens with which to view and interpret interorganizational information sharing, a topic that has received much attention in recent years.

Relevance:

20.00%

Publisher:

Abstract:

This paper focuses on data exchange relationships and ways to improve collaboration in the supply chain. Initially, the paper examines the information needs and alternatives in supply chain management. In the second part, the paper identifies different sets of factors that are likely to influence information sharing with suppliers, from the manufacturers’ point of view. Results from a survey of the Finnish manufacturing industry show that manufacturers provided substantial information on demand data, production schedules, and inventories to their suppliers. Respondents perceived delivery performance, measured by the timeliness, accuracy, and defect rate of deliveries, as the primary incentive for supplier collaboration. On the other hand, supplier image and the market in which the supplier operates were found to be less relevant in determining the intensity of collaboration.

Relevance:

20.00%

Publisher:

Abstract:

The economic environment of today can be characterized as highly dynamic and competitive, if not in constant flux. Globalization and the Information Technology (IT) revolution are perhaps the main contributing factors to this observation. While companies have to some extent adapted to the current business environment, new pressures such as the recent increase in environmental awareness and its likely effects on regulations are underway. Hence, in the light of market and competitive pressures, companies must constantly evaluate and, if necessary, update their strategies to sustain and increase the value they create for shareholders (Hunt and Morgan, 1995; Christopher and Towill, 2002). One way to create greater value is to become more efficient in producing and delivering goods and services to customers, which can lead to a strategy known as cost leadership (Porter, 1980). Even though Porter (1996) notes that in the long run cost leadership may not be a sufficient strategy for competitive advantage, operational efficiency is certainly necessary and should therefore be on the agenda of every company.

Better workflow management, technology, and resource utilization can lead to greater internal operational efficiency, which explains why, for example, many companies have recently adopted Enterprise Resource Planning (ERP) systems: integrated software that streamlines business processes. However, as more and more companies approach internal operational excellence, the focus for finding inefficiencies and cost saving opportunities is moving beyond the boundaries of the firm. Today many firms in the supply chain are engaging in collaborative relationships with customers, suppliers, and third parties (services) in an attempt to cut down on costs related to, for example, inventory and production, as well as to facilitate synergies. Thus, recent years have witnessed fluidity and blurring of organizational boundaries (Coad and Cullen, 2006).

The IT revolution of the late 1990s has played an important role in bringing organizations closer together. In their efforts to become more efficient, companies first integrated their information systems to speed up transactions such as ordering and billing. Later, collaboration on a multidimensional scale including logistics, production, and Research & Development became evident as companies expected substantial benefits from collaboration. However, one could also argue that the recent popularity of the concepts falling under Supply Chain Management (SCM), such as Vendor Managed Inventory and Collaborative Planning, Forecasting, and Replenishment, owes much to the marketing efforts of software vendors and consultants who provide these solutions. Nevertheless, reports from professional organizations as well as academia indicate that the trend towards interorganizational collaboration is gaining wider ground. For example, the ARC Advisory Group, a research organization on supply chain solutions, estimated that the market for SCM, which includes various kinds of collaboration tools and related services, would grow at an annual rate of 7.4% during the years 2004-2008, reaching $7.4 billion in 2008 (Engineeringtalk 2004).

Relevance:

20.00%

Publisher:

Abstract:

One of Cultural Studies' most important contributions to academic thinking about culture is the acceptance as axiomatic that we must not simply accept traditional value hierarchies in relation to cultural objects (see, for example, McGuigan, 1992: 157; Brunsdon, 1997: 5; Wark, 2001). Since Richard Hoggart and Raymond Williams took popular culture as a worthy object of study, Cultural Studies practitioners have accepted that the terms in which cultural debate had previously been conducted involved a category error. Opera is not 'better' than pop music, we believe in Cultural Studies - 'better for what?', we would ask. Similarly, Shakespeare is not 'better' than Mills and Boon, unless you can specify the purpose for which you want to use the texts. Shakespeare is indeed better than Mills and Boon for understanding seventeenth century ideas about social organisation; but Mills and Boon is unquestionably better than Shakespeare if you want slightly scandalous, but ultimately reassuring representations of sexual intercourse. The reason that we do not accept traditional hierarchies of cultural value is that we know that the culture that is commonly understood to be 'best' also happens to be that which is preferred by the most educated and most materially well-off people in any given culture (Bourdieu, 1984: 1-2; Ross, 1989: 211). We can interpret this information in at least two ways. On the one hand, it can be read as proving that the poorer and less well-educated members of a society do indeed have tastes which are innately less worthwhile than those of the material and educational elite. On the other hand, this information can be interpreted as demonstrating that the cultural and material elite publicly represent their own tastes as being the only correct ones. In Cultural Studies, we tend to favour the latter interpretation. We reject the idea that cultural objects have innate value, in terms of beauty, truth, excellence, simply 'there' in the object. That is, we reject 'aesthetic' approaches to culture (Bourdieu, 1984: 6, 485; Hartley, 1994: 6). In this, Cultural Studies is similar to other postmodern institutions, where high and popular culture can be mixed in ways unfamiliar to modernist culture (Sim, 1992: 1; Jameson, 1998: 100). So far, so familiar.

Relevance:

20.00%

Publisher:

Abstract:

This volume breaks new ground by approaching Socially Responsible Investment (SRI) as an explicitly ethical practice in financial markets. The work explains the philosophical and practical shortcomings of ‘long term shareholder value’ and the origins and conceptual structure of SRI, and links its pursuit to both its deeper philosophical foundations and the broader, multi-dimensional global movement towards greater social responsibility in global markets. Interviews with fund managers in the Australian SRI sector generate recommendations for better integrating ethics into SRI practice via ethically informed engagement with invested companies, and an in-depth discussion of the central practical SRI issue of fiduciary responsibility strengthens the case in favour of SRI. The practical and ethical theoretical perspectives are then brought together to sketch out an achievable ideal for SRI worldwide, in which those who are involved in investment and business decisions become part of an ‘ethical chain’ of decision makers linking the ultimate owners of capital with the business executives who frame, advocate and implement business strategies. In between there are investment advisors, fund managers, business analysts and boards. The problem lies in the fact that the ultimate owners are discouraged from considering their own values, or even their own long term interests, whilst the others often look only to short term interests. The solution lies in the latter recognising themselves as links in the ethical chain.

Relevance:

20.00%

Publisher:

Abstract:

Engineering asset management (EAM) is a broad discipline, and EAM functions and processes are characterized by their distributed nature. However, engineering asset management nowadays mostly relies on self-maintained experiential rule bases and periodic maintenance, which lacks a collaborative engineering approach. This research proposes a collaborative environment integrated by a service center with domain expertise such as diagnosis, prognosis, and asset operations. The collaborative maintenance chain combines asset operation sites, the service center (i.e., the maintenance operation coordinator), the system provider, first-tier collaborators, and maintenance part suppliers. Meanwhile, to realize the automation of communication and negotiation among organizations, the multi-agent system (MAS) technique is applied to enhance the entire service level. During the MAS design process, this research combines the Prometheus MAS modeling approach with Petri-net modeling methodology and the Unified Modeling Language to visualize and rationalize the design of the MAS. The major contributions of this research include developing a Petri-net enabled Prometheus MAS modeling methodology and constructing a collaborative agent-based maintenance chain framework for integrated EAM.
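
To make the Petri-net element concrete, here is a minimal token-firing sketch with hypothetical places and transitions for a single maintenance request moving through such a chain; it is not the paper's model, only an illustration of expressing chain interactions as pre- and post-conditions.

```python
# Minimal Petri-net sketch (hypothetical places and transitions, not the paper's
# model): tokens track a maintenance request as it moves between the asset site,
# the service center and a parts supplier in a collaborative maintenance chain.
marking = {"fault_reported": 1, "diagnosed": 0, "parts_available": 1,
           "repair_scheduled": 0, "asset_restored": 0}

# Each transition maps to (pre-conditions, post-conditions) with token weights.
transitions = {
    "diagnose": ({"fault_reported": 1}, {"diagnosed": 1}),
    "schedule": ({"diagnosed": 1, "parts_available": 1}, {"repair_scheduled": 1}),
    "repair":   ({"repair_scheduled": 1}, {"asset_restored": 1}),
}

def enabled(name: str) -> bool:
    pre, _ = transitions[name]
    return all(marking[p] >= w for p, w in pre.items())

def fire(name: str) -> None:
    """Consume tokens from the pre-places and produce tokens in the post-places."""
    pre, post = transitions[name]
    if not enabled(name):
        raise ValueError(f"transition {name!r} is not enabled")
    for p, w in pre.items():
        marking[p] -= w
    for p, w in post.items():
        marking[p] = marking.get(p, 0) + w

if __name__ == "__main__":
    for t in ("diagnose", "schedule", "repair"):
        fire(t)
        print(f"after {t}: {marking}")
```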

Relevance:

20.00%

Publisher:

Abstract:

A pervasive and puzzling feature of banks’ Value-at-Risk (VaR) is its abnormally high level, which leads to excessive regulatory capital. A possible explanation for the tendency of commercial banks to overstate their VaR is that they incompletely account for the diversification effect among broad risk categories (e.g., equity, interest rate, commodity, credit spread, and foreign exchange). By underestimating the diversification effect, banks’ proprietary VaR models produce overly prudent market risk assessments. In this paper, we examine empirically the validity of this hypothesis using actual VaR data from major US commercial banks. In contrast to the VaR diversification hypothesis, we find that US banks show no sign of systematic underestimation of the diversification effect. In particular, the diversification effects used by banks are very close to (and quite often larger than) our empirical diversification estimates. A direct implication of this finding is that individual VaRs for each broad risk category, just like aggregate VaRs, are biased risk assessments.
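
The diversification effect at issue can be illustrated with hypothetical figures: it is the gap between the sum of stand-alone risk-category VaRs and the disclosed aggregate VaR, as in the sketch below (all numbers invented).

```python
# Hypothetical numbers only: the diversification effect implied when a bank's
# aggregate VaR is smaller than the sum of its stand-alone risk-category VaRs.
category_var = {            # stand-alone VaR by broad risk category, $ millions
    "equity": 40.0,
    "interest_rate": 55.0,
    "commodity": 10.0,
    "credit_spread": 25.0,
    "foreign_exchange": 15.0,
}
aggregate_var = 95.0        # disclosed firm-wide VaR, $ millions (assumed)

undiversified = sum(category_var.values())
diversification_effect = undiversified - aggregate_var
effect_pct = diversification_effect / undiversified

print(f"sum of stand-alone VaRs: {undiversified:.1f}")
print(f"aggregate VaR:           {aggregate_var:.1f}")
print(f"diversification effect:  {diversification_effect:.1f} ({effect_pct:.1%})")
```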

Relevance:

20.00%

Publisher:

Abstract:

In this paper we study both the level of Value-at-Risk (VaR) disclosure and the accuracy of the disclosed VaR figures for a sample of US and international commercial banks. To measure the level of VaR disclosures, we develop a VaR Disclosure Index that captures many different facets of market risk disclosure. Using panel data over the period 1996–2005, we find an overall upward trend in the quantity of information released to the public. We also find that Historical Simulation is by far the most popular VaR method. We assess the accuracy of VaR figures by studying the number of VaR exceedances and whether actual daily VaRs contain information about the volatility of subsequent trading revenues. Unlike the level of VaR disclosure, the quality of VaR disclosure shows no sign of improvement over time. We find that VaR computed using Historical Simulation contains very little information about future volatility.
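
A minimal sketch of the two ingredients discussed above, a Historical Simulation VaR and an exceedance count, applied to synthetic data; the window length, confidence level and P&L series are assumptions, not the paper's sample.

```python
# Minimal sketch: 99% one-day VaR by Historical Simulation over a rolling window,
# plus a count of exceedance days (realised loss exceeding the prior VaR estimate).
# The window, confidence level and synthetic P&L are illustrative assumptions.
import math
import random

def hs_var(pnl_window: list[float], confidence: float = 0.99) -> float:
    """Historical Simulation VaR: a loss quantile of the empirical P&L window."""
    losses = sorted(-x for x in pnl_window)          # losses as positive numbers
    idx = math.ceil(confidence * len(losses)) - 1    # simple order-statistic rule
    return losses[idx]

def count_exceedances(pnl: list[float], window: int = 250) -> int:
    """Count days on which the realised loss exceeds the VaR from prior days."""
    exceedances = 0
    for t in range(window, len(pnl)):
        if -pnl[t] > hs_var(pnl[t - window:t]):
            exceedances += 1
    return exceedances

if __name__ == "__main__":
    rng = random.Random(0)
    pnl = [rng.gauss(0.0, 1.0) for _ in range(1000)]  # synthetic daily trading P&L
    print("exceedances over", 1000 - 250, "days:", count_exceedances(pnl))
```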