349 results for money-equivalent value (MEV)


Relevance: 20.00%

Abstract:

The Government of Indonesia (GoI) increasingly relies on private sector financing to build and operate infrastructure through public-private partnership (PPP) schemes. However, PPP does not automatically provide the optimal financing solution, because of value for money (VFM) issues: the procurement authority must show whether a PPP proposal is the solution that delivers the best VFM outcome. This paper presents a literature review comparing quantitative VFM methodologies for PPP infrastructure project procurement in Indonesia and Australia. In Australia, the Public Sector Comparator (PSC) is used to assess a project's potential VFM quantitatively. In Indonesia, the PSC has not been applied; the PPP procurement authority tends to use a conventional project evaluation method that ignores risk. Unlike conventional price bid evaluation, the PSC enables a financial comparison that includes costs, gains, and risks. Because the construction of the PSC rests primarily on a risk management approach, it can facilitate risk negotiation between the parties involved. The study indicates that the quantitative VFM methodology of the PSC is potentially applicable to the water supply sector in Indonesia. Various supporting regulations are available that emphasise the importance of VFM and risk management in infrastructure investment. However, the study also reveals a number of challenges that need to be anticipated, such as the need for a more comprehensive PPP policy at both central and local government levels, a more specific legal instrument for the bid evaluation method, and institutional capacity development in PPP units at the local level.
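To make the comparison concrete, here is a minimal sketch of a PSC-based VFM test, assuming the widely used four-component PSC structure (raw PSC plus competitive neutrality, transferable risk and retained risk); the structure is a common convention from PSC guidance material, not a method stated in this abstract, and all figures are hypothetical:

```python
# Minimal sketch of a quantitative VFM comparison using a Public Sector
# Comparator (PSC), assuming the standard four-component PSC structure.
# All figures are hypothetical, in present-value terms.

def psc_total(raw_psc, competitive_neutrality, transferable_risk, retained_risk):
    """Risk-adjusted PSC: expected whole-of-life cost of public delivery."""
    return raw_psc + competitive_neutrality + transferable_risk + retained_risk

def vfm(psc, ppp_bid, ppp_retained_risk):
    """Positive result: the PPP bid offers value for money over public delivery."""
    return psc - (ppp_bid + ppp_retained_risk)

psc = psc_total(raw_psc=100.0, competitive_neutrality=5.0,
                transferable_risk=20.0, retained_risk=10.0)   # PSC = 135.0
saving = vfm(psc, ppp_bid=115.0, ppp_retained_risk=10.0)      # VFM = +10.0
print(f"PSC = {psc:.1f}, VFM of PPP bid = {saving:+.1f}")
```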

Relevance: 20.00%

Abstract:

The use of social networking sites (SNS) by online citizens to share photos, update friends, play games and connect with the world has exploded, with SNS and blogs now eclipsing email traffic (eMarketer 2009). Just one popular application on one SNS (Farmville on Facebook) has acquired more than 63 million users since its launch in June 2009 (Marketing 2009). The major global social networks are Facebook, Twitter, YouTube and MySpace, with Facebook claiming that it passed 350 million users in November (Marketing 2009). As usage increases and competition intensifies, the major sites must strategically position themselves to develop a competitive advantage in order to maintain or grow their share of the pie. So how do the major SNS position their brands, and do users perceive significant differences among the big players? This presentation answers these questions by reporting the results of an empirical study of SNS usage by Australian adults. As with other brands, aligning brand positioning strategies with user knowledge and perceptions of SNS is an important ingredient in achieving success (Keller 1993). Furthermore, we compare the types of value for three different SNS to identify the relationships between the value derived by users and the stated positioning of each site.

Relevance: 20.00%

Abstract:

Many cities worldwide face the prospect of major transformation as the world moves towards a global information order. In this new era, urban economies are being radically altered by dynamic processes of economic and spatial restructuring. The result is the creation of 'informational cities' or, under their newer and more popular name, 'knowledge cities'. For the last two centuries, social production was primarily understood and shaped by neo-classical economic thought that recognised only three factors of production: land, labor and capital. Knowledge, education, and intellectual capacity were secondary, if not incidental, factors. Human capital was assumed to be either embedded in labor or just one of numerous categories of capital. In recent decades, it has become apparent that knowledge is sufficiently important to deserve recognition as a fourth factor of production. Knowledge and information, and the social and technological settings for their production and communication, are now seen as keys to development and economic prosperity. The rise of knowledge-based opportunity has, in many cases, been accompanied by a concomitant decline in traditional industrial activity. The replacement of physical commodity production by more abstract forms of production (e.g. information, ideas, and knowledge) has, however paradoxically, reinforced the importance of central places and led to the formation of knowledge cities. Knowledge is produced, marketed and exchanged mainly in cities. Knowledge cities therefore aim to assist decision-makers in making their cities compatible with the knowledge economy and thus able to compete with other cities. Knowledge cities enable their citizens to foster knowledge creation, knowledge exchange and innovation, and encourage the continuous creation, sharing, evaluation, renewal and updating of knowledge. To compete nationally and internationally, cities need knowledge infrastructure (e.g. universities, research and development institutes); a concentration of well-educated people; technological, mainly electronic, infrastructure; and connections to the global economy (e.g. international companies and finance institutions for trade and investment). Moreover, they must possess the people and things necessary for the production of knowledge and, as importantly, function as breeding grounds for talent and innovation. The economy of a knowledge city creates high value-added products using research, technology, and brainpower. The private and public sectors value knowledge, spend money on its discovery and dissemination and, ultimately, harness it to create goods and services. Although many cities call themselves knowledge cities, currently only a few cities around the world (e.g. Barcelona, Delft, Dublin, Montreal, Munich, and Stockholm) have earned that label. Many other cities aspire to the status of knowledge city through urban development programs that target knowledge-based urban development; examples include Copenhagen, Dubai, Manchester, Melbourne, Monterrey, Singapore, and Shanghai.

Knowledge-Based Urban Development

To date, the development of most knowledge cities has proceeded organically, as a dependent and derivative effect of global market forces. Urban and regional planning has responded slowly, and sometimes not at all, to the challenges and opportunities of the knowledge city. That is changing, however. Knowledge-based urban development potentially brings both economic prosperity and a sustainable socio-spatial order; its goal is to produce and circulate abstract work. The globalization of the world in the last decades of the twentieth century was a dialectical process. On the one hand, as the tyranny of distance was eroded, economic networks of production and consumption were constituted at a global scale. At the same time, spatial proximity remained as important as ever, if not more so, for knowledge-based urban development. Mediated by information and communication technology, personal contact, and the medium of tacit knowledge, organizational and institutional interactions are still closely associated with spatial proximity, and the clustering of knowledge production is essential for fostering innovation and wealth creation. The social benefits of knowledge-based urban development extend beyond aggregate economic growth. On the one hand is the possibility of a particularly resilient form of urban development, secured in a network of connections anchored at local, national, and global coordinates. On the other hand, quality of place and life, defined by the level of public services (e.g. health and education) and by the conservation and development of the cultural, aesthetic and ecological values that give cities their character and attract or repel the creative class of knowledge workers, is a prerequisite for successful knowledge-based urban development. The goal is a secure economy in a human setting: in short, smart growth or sustainable urban development.

Relevance: 20.00%

Abstract:

IEC Technical Committee 57 (TC57) has published a series of standards and technical reports for "Communication networks and systems for power utility automation" as the IEC 61850 series. Sampled value (SV) process buses allow for the removal of potentially lethal voltages and damaging currents from substation control rooms and marshalling kiosks, reduce the amount of cabling required in substations, and facilitate the adoption of non-conventional instrument transformers. IEC 61850-9-2 provides an interoperable solution to support multi-vendor process bus solutions. A time synchronisation system is required for an SV process bus; however, the details are not defined in IEC 61850-9-2. IEEE Std 1588-2008, Precision Time Protocol version 2 (PTPv2), provides the greatest accuracy of network-based time transfer systems, with timing errors of less than 100 ns achievable. PTPv2 is proposed by the IEC Smart Grid Strategy Group to synchronise IEC 61850 based substation automation systems. IEC 61850-9-2, PTPv2 and Ethernet are three complementary protocols that together define the future of sampled value digital process connections in substations. The suitability of PTPv2 for use with SV is evaluated, with preliminary results indicating that steady-state performance is acceptable (jitter < 300 ns), and that extremely stable grandmaster oscillators are required to ensure SV timing requirements are met when recovering from loss of external synchronisation (such as GPS).
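The last point can be made concrete with a back-of-envelope holdover calculation. A minimal sketch follows, assuming a pure frequency-offset clock model (drift = fractional frequency error x elapsed time) and a +/-1 us SV timing budget; both the budget and the oscillator figures are illustrative assumptions, not values from the study:

```python
# How long can a grandmaster free-run after losing GPS before exceeding an
# assumed +/-1 us sampled-value timing budget? Pure frequency-offset model;
# budget and oscillator stabilities are illustrative assumptions.

BUDGET_S = 1e-6  # assumed SV synchronisation budget (+/-1 us)

def holdover_seconds(fractional_freq_error: float, budget_s: float = BUDGET_S) -> float:
    """Elapsed time until accumulated drift consumes the timing budget."""
    return budget_s / fractional_freq_error

for name, ffe in [("TCXO, 1 ppm", 1e-6),
                  ("OCXO, 0.01 ppm", 1e-8),
                  ("Rubidium, 0.0001 ppm", 1e-10)]:
    print(f"{name:22s} -> budget exhausted after {holdover_seconds(ffe):10.1f} s")
```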

Relevance: 20.00%

Abstract:

IEC 61850 process bus technology has the potential to reduce the cost and improve the performance and reliability of substation designs. Substantial costs associated with copper wiring (design, documentation, construction, commissioning and troubleshooting) can be reduced with the application of digital process bus technology, especially technology based upon international standards. An IEC 61850-9-2 based sampled value process bus is an enabling technology for the application of Non-Conventional Instrument Transformers (NCIT). Retaining the output of the NCIT in its native digital form, rather than converting it to an analogue output, allows for improved transient performance, dynamic range, safety and reliability, and reduced cost. In this paper we report on a pilot installation using NCITs communicating across a switched Ethernet network using the UCAIug Implementation Guideline for IEC 61850-9-2 (9-2 Light Edition, or 9-2LE). The system was commissioned in a 275 kV line reactor bay at Powerlink Queensland's Braemar substation in 2009, with sampled value protection IEDs 'shadowing' the existing protection system. The results of commissioning tests and twelve months of service experience using a Fibre Optic Current Transformer (FOCT) from Smart Digital Optics (SDO) are presented, including the response of the system to fault conditions. A number of issues that remain to be resolved to enable wide-scale deployment of NCITs and IEC 61850-9-2 process bus technology are also discussed.
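As an aside on what 'native digital form' looks like on the wire, here is a minimal sketch of converting raw 9-2LE sample counts to primary quantities; the scale factors (1 mA per count for current, 10 mV per count for voltage) are the commonly cited 9-2LE conventions and should be read as assumptions here, as should the example counts:

```python
# Sketch of scaling raw IEC 61850-9-2LE sampled-value counts to primary
# quantities. Scale factors are the commonly cited 9-2LE conventions
# (assumptions here, not values taken from the paper).

CURRENT_SCALE_A_PER_COUNT = 0.001   # 1 mA per least-significant bit
VOLTAGE_SCALE_V_PER_COUNT = 0.01    # 10 mV per least-significant bit

def decode_current(raw_count: int) -> float:
    """Primary current in amperes from a signed 32-bit SV sample."""
    return raw_count * CURRENT_SCALE_A_PER_COUNT

def decode_voltage(raw_count: int) -> float:
    """Primary phase-to-earth voltage in volts from a signed 32-bit SV sample."""
    return raw_count * VOLTAGE_SCALE_V_PER_COUNT

print(decode_current(1_200_000))    # 1200.0 A
print(decode_voltage(15_875_000))   # 158750.0 V, near 275 kV / sqrt(3)
```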

Relevance: 20.00%

Abstract:

A key concern in the field of contemporary fashion/textiles design is the emergence of 'fast fashion', best explained as "buy it Friday, wear it Saturday and throw it away on Sunday" (O'Loughlin, 2007). In this contemporary retail atmosphere of "pile it high, sell it cheap" and "quick to market", even designer goods have achieved throwaway status. This modern culture of consumerism is the antithesis of sustainability and poses a dilemma for designers and producers in these disciplines who seek sustainable practice (de Blas, 2010). Design researchers, including those in textiles/fashion, have begun to explore what is a key question of the 21st century in order to create a vision and rationale for their disciplines: can products be designed to add value for the consumer and hence contribute to a more sustainable industry? Fashion textiles design has much to answer for in contributing to the problems of unsustainable practices on a global scale in design, production and waste. However, designers within this field also have great potential to contribute practical 'real world' solutions.

This paper provides an overview of some of the design and technological developments from the fashion/textiles industry, endorsing a model where designers and technicians use their transferable skills for wellbeing rather than desire. Smart materials in the form of responsive and adaptive fibres and fabrics, combined with electro-active devices and ICT, are increasingly shaping many aspects of society, particularly in the leisure industry, and interactive consumer products are ever more visible in healthcare. Combining biocompatible delivery devices with bio-sensing elements can create early warning and monitoring systems that analyse, sense and actuate, and that can be linked to data logging and patient records via intelligent networks. Patient-sympathetic, 'smart' fashion/textiles applications based on interdisciplinary expertise in textiles design and technology are emerging. An analysis of a series of case studies demonstrates the potential of fashion textiles design practitioners to exploit the concept of value adding through technological garment and textiles applications and enhancement for health and wellbeing, and in doing so contribute to a more sustainable future for the fashion/textiles design industry.

Relevance: 20.00%

Abstract:

Background: The Current Population Survey (CPS) and the American Time Use Survey (ATUS) use the 2002 census occupation system to classify workers into 509 separate occupations arranged into 22 major occupational categories. Methods: We describe the methods and rationale for assigning detailed MET estimates to occupations, and present population estimates of the intensities of occupational activity (comparing outputs generated using previously published summary MET estimates against the detailed MET estimates) from the 2003 ATUS data, comprising 20,720 respondents, 5,323 of whom (2,917 males and 2,406 females) reported working 6+ hours at their primary occupation on their assigned reporting day. Results: Analysis using the summary MET estimates placed 4% more workers in sedentary occupations, 6% more in light-intensity, 7% fewer in moderate-intensity, and 3% fewer in vigorous-intensity occupations compared to using the detailed MET estimates. The detailed estimates are more sensitive to identifying individuals who do any occupational activity that is moderate or vigorous in intensity, resulting in fewer workers in sedentary and light-intensity occupations. Conclusions: Since the CPS/ATUS regularly captures occupation data, it will be possible to track the prevalence of the different intensity levels of occupations. Updates will be required as occupational classification systems are inevitably adjusted in the future.
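For illustration, intensity categories can be assigned from a MET estimate using the conventional public health cutpoints (sedentary <= 1.5, light 1.6-2.9, moderate 3.0-5.9, vigorous >= 6.0 METs); a minimal sketch, with the cutpoints and example occupations assumed rather than taken from the paper:

```python
# Sketch of classifying an occupation's intensity from a MET estimate,
# using conventional MET cutpoints (standard public health conventions,
# assumed here rather than taken from the paper).

def intensity_category(met: float) -> str:
    if met <= 1.5:
        return "sedentary"
    elif met < 3.0:
        return "light"
    elif met < 6.0:
        return "moderate"
    return "vigorous"

# Hypothetical detailed MET estimates for a few example occupations:
for occupation, met in [("software developer", 1.5),
                        ("retail salesperson", 2.5),
                        ("construction laborer", 4.0),
                        ("firefighter", 8.0)]:
    print(f"{occupation:22s} {met:4.1f} METs -> {intensity_category(met)}")
```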

Relevance: 20.00%

Abstract:

Proposed transmission smart grids will use a digital platform for the automation of substations operating at voltage levels of 110 kV and above. The IEC 61850 series of standards, released in parts over the last ten years, provides a specification for substation communications networks and systems. These standards, along with IEEE Std 1588-2008 Precision Time Protocol version 2 (PTPv2) for precision timing, are recommended by both the IEC Smart Grid Strategy Group and the NIST Framework and Roadmap for Smart Grid Interoperability Standards for substation automation. IEC 61850-8-1 and IEC 61850-9-2 provide an interoperable solution to support multi-vendor digital process bus solutions, allowing for the removal of potentially lethal voltages and damaging currents from substation control rooms, a reduction in the amount of cabling required in substations, and the adoption of non-conventional instrument transformers (NCITs). IEC 61850, PTPv2 and Ethernet are three complementary protocol families that together define the future of sampled value (SV) digital process connections for smart substation automation. This paper describes a test and evaluation system that uses real-time simulation, protection relays, PTPv2 time clocks and artificial network impairment to investigate technical impediments to the adoption of SV process bus systems by transmission utilities. Knowing the limits of a digital process bus, especially one that includes sampled values and NCITs, will enable utilities to make informed decisions regarding the adoption of this technology.
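One such limit is easy to quantify: a time-transfer error of dt seconds produces a phase-measurement error of 360 * f * dt degrees at nominal frequency f. A minimal sketch (the mapping itself is basic physics; the example offsets and the 50 Hz system are illustrative):

```python
# Back-of-envelope link between time-synchronisation error and phase error
# for sampled values: phase_error_deg = 360 * f * dt. Example offsets and
# the 50 Hz nominal frequency are illustrative assumptions.

F_NOMINAL_HZ = 50.0          # nominal power system frequency

def phase_error_deg(time_error_s: float, f_hz: float = F_NOMINAL_HZ) -> float:
    """Phase measurement error caused by a time-sync offset."""
    return 360.0 * f_hz * time_error_s

for dt in (100e-9, 300e-9, 1e-6):   # 100 ns, 300 ns, 1 us offsets
    print(f"{dt*1e9:6.0f} ns offset -> {phase_error_deg(dt):.4f} deg at 50 Hz")
```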

Relevance: 20.00%

Abstract:

The Texas Transportation Commission ("the Commission") is responsible for planning and making policies for the location, construction, and maintenance of a comprehensive system of highways and public roads in Texas. In order for the Commission to carry out its legislative mandate, the Texas Constitution requires that most revenue generated by motor vehicle registration fees and motor fuel taxes be used for constructing and maintaining public roadways and for other designated purposes. The Texas Department of Transportation (TxDOT) assists the Commission in executing state transportation policy. It is the responsibility of the legislature to appropriate money for TxDOT's operation and maintenance expenses. All money authorized to be appropriated for TxDOT's operations must come from the State Highway Fund (also known as Fund 6, Fund 006, or Fund 0006), and the Commission can then use the balance in the fund to fulfill its responsibilities. However, the value of the revenue received in Fund 6 is not keeping pace with the growing demand for transportation infrastructure in Texas. Additionally, the diversion of revenue to non-transportation uses now exceeds $600 million per year. As shown in Figure 1.1, revenues and expenditures of the State Highway Fund per vehicle mile traveled (VMT) in Texas have remained almost flat since 1993. In the meantime, construction cost inflation has risen more than 100%, effectively halving the real value of expenditure.
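The 'effectively halving' claim is simple deflation arithmetic: flat nominal spending divided by a doubled construction cost index leaves half the real purchasing power. A minimal sketch with illustrative figures (the dollar amounts are not from the report):

```python
# Worked example: flat nominal spending per VMT against a doubled
# construction cost index halves real purchasing power.
# Dollar figures are illustrative, not from the report.

nominal_per_vmt_1993 = 0.02      # hypothetical $/VMT, 1993
nominal_per_vmt_now = 0.02       # flat nominal spending
construction_cost_index = 2.0    # costs up more than 100% since 1993

real_per_vmt_now = nominal_per_vmt_now / construction_cost_index
print(f"Real spending power: {real_per_vmt_now / nominal_per_vmt_1993:.0%} of 1993 level")
# -> 50% of 1993 level: flat nominal revenue buys half as much roadway.
```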

Relevance: 20.00%

Abstract:

One of Cultural Studies' most important contributions to academic thinking about culture is the acceptance as axiomatic that we must not simply accept traditional value hierarchies in relation to cultural objects (see, for example, McGuigan, 1992: 157; Brunsdon, 1997: 5; Wark, 2001). Since Richard Hoggart and Raymond Williams took popular culture as a worthy object of study, Cultural Studies practitioners have accepted that the terms in which cultural debate had previously been conducted involved a category error. Opera is not 'better' than pop music, we believe in Cultural Studies; 'better for what?', we would ask. Similarly, Shakespeare is not 'better' than Mills and Boon, unless you can specify the purpose for which you want to use the texts. Shakespeare is indeed better than Mills and Boon for understanding seventeenth-century ideas about social organisation; but Mills and Boon is unquestionably better than Shakespeare if you want slightly scandalous, but ultimately reassuring, representations of sexual intercourse. The reason we do not accept traditional hierarchies of cultural value is that we know that the culture commonly understood to be 'best' also happens to be that preferred by the most educated and most materially well-off people in any given culture (Bourdieu, 1984: 1-2; Ross, 1989: 211). We can interpret this information in at least two ways. On the one hand, it can be read as proving that the poorer and less well-educated members of a society do indeed have tastes which are innately less worthwhile than those of the material and educational elite. On the other hand, it can be interpreted as demonstrating that the cultural and material elite publicly represent their own tastes as the only correct ones. In Cultural Studies, we tend to favour the latter interpretation. We reject the idea that cultural objects have innate value, in terms of beauty, truth or excellence, simply 'there' in the object. That is, we reject 'aesthetic' approaches to culture (Bourdieu, 1984: 6, 485; Hartley, 1994: 6). In this, Cultural Studies is similar to other postmodern institutions, where high and popular culture can be mixed in ways unfamiliar to modernist culture (Sim, 1992: 1; Jameson, 1998: 100). So far, so familiar.

Relevance: 20.00%

Abstract:

A pervasive and puzzling feature of banks' Value-at-Risk (VaR) is its abnormally high level, which leads to excessive regulatory capital. A possible explanation for the tendency of commercial banks to overstate their VaR is that they incompletely account for the diversification effect among broad risk categories (e.g., equity, interest rate, commodity, credit spread, and foreign exchange). By underestimating the diversification effect, banks' proprietary VaR models would produce overly prudent market risk assessments. In this paper, we examine empirically the validity of this hypothesis using actual VaR data from major US commercial banks. In contrast to the VaR diversification hypothesis, we find that US banks show no sign of systematic underestimation of the diversification effect. In particular, the diversification effects used by banks are very close to (and quite often larger than) our empirical diversification estimates. A direct implication of this finding is that individual VaRs for each broad risk category, just like aggregate VaRs, are biased risk assessments.
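To fix ideas, the diversification effect compares the sum of stand-alone category VaRs with the VaR of the aggregate position. A minimal sketch, assuming a correlation-based aggregation of category VaRs (a common textbook approach, not necessarily the paper's estimator; the VaR figures and the 0.3 cross-category correlation are hypothetical):

```python
# Diversification effect among broad risk categories under a
# correlation-based aggregation of category VaRs. All inputs hypothetical.
import numpy as np

var_by_category = np.array([40.0, 30.0, 20.0, 15.0, 10.0])  # equity, IR, ...
corr = np.full((5, 5), 0.3)        # assumed cross-category correlation
np.fill_diagonal(corr, 1.0)

sum_of_vars = var_by_category.sum()
diversified_var = np.sqrt(var_by_category @ corr @ var_by_category)
diversification_effect = 1.0 - diversified_var / sum_of_vars

print(f"Sum of stand-alone VaRs : {sum_of_vars:.1f}")
print(f"Diversified VaR         : {diversified_var:.1f}")
print(f"Diversification effect  : {diversification_effect:.1%}")
```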

Relevance: 20.00%

Abstract:

In this paper we study both the level of Value-at-Risk (VaR) disclosure and the accuracy of the disclosed VaR figures for a sample of US and international commercial banks. To measure the level of VaR disclosures, we develop a VaR Disclosure Index that captures many different facets of market risk disclosure. Using panel data over the period 1996–2005, we find an overall upward trend in the quantity of information released to the public. We also find that Historical Simulation is by far the most popular VaR method. We assess the accuracy of VaR figures by studying the number of VaR exceedances and whether actual daily VaRs contain information about the volatility of subsequent trading revenues. Unlike the level of VaR disclosure, the quality of VaR disclosure shows no sign of improvement over time. We find that VaR computed using Historical Simulation contains very little information about future volatility.
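For reference, Historical Simulation VaR is simply a loss quantile of the empirical P&L distribution, and an exceedance occurs whenever the realised loss is worse than the previous day's VaR. A minimal sketch with hypothetical data (the 99% level and 250-day window are common regulatory conventions, assumed here):

```python
# Historical Simulation VaR and exceedance counting, the two ingredients
# discussed above. Daily P&L data is hypothetical (simulated).
import numpy as np

rng = np.random.default_rng(0)
pnl = rng.standard_t(df=5, size=750)          # hypothetical daily P&L history

def historical_var(pnl_window: np.ndarray, confidence: float = 0.99) -> float:
    """VaR as the loss quantile of the empirical P&L distribution (positive number)."""
    return -np.percentile(pnl_window, 100 * (1 - confidence))

# Rolling one-day-ahead VaR on a 250-day window, then count exceedances:
window = 250
exceedances = 0
for t in range(window, len(pnl)):
    var_t = historical_var(pnl[t - window:t])
    if pnl[t] < -var_t:                        # loss worse than predicted VaR
        exceedances += 1

days = len(pnl) - window
print(f"{exceedances} exceedances over {days} days "
      f"(expected ~{0.01 * days:.0f} at the 99% level)")
```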