794 results for economic approach
Abstract:
Water environments are greatly valued in urban areas as ecological and aesthetic assets. However, it is the water environment that is most adversely affected by urbanisation. Urban land use, coupled with anthropogenic activities, alters the stream flow regime and degrades water quality, with urban stormwater being a significant source of pollutants. Unfortunately, urban water pollution is difficult to evaluate in terms of conventional monetary measures. True costs extend beyond the immediate human population or the physical boundaries of the urban area and affect the function of surrounding ecosystems. Current approaches for handling stormwater pollution and water quality issues in urban landscapes are limited, as they are primarily focused on ‘end-of-pipe’ solutions. These approaches are commonly based on insufficient design knowledge, faulty value judgements or inadequate consideration of full life cycle costs. It is in this context that the adoption of a triple bottom line approach is advocated to safeguard urban water quality. The problem of degradation of urban water environments can only be remedied through innovative planning, water sensitive engineering design and the foresight to implement sustainable practices. Sustainable urban landscapes must be designed to match the triple bottom line needs of the community: starting with ecosystem services such as the water cycle, then addressing social and immediate ecosystem health needs, and finally the economic performance of the catchment. This calls for a cultural change towards urban water resources rather than the current piecemeal, single-issue approach. This paper discusses the challenges in safeguarding urban water environments and the limitations of current approaches. It then explores the opportunities offered by integrating innovative planning practices with water engineering concepts into a single cohesive framework to protect valuable urban ecosystem assets.
Finally, a series of recommendations are proposed for protecting urban water resources within the context of a triple bottom line approach.
Abstract:
Multi-disciplinary approaches to complex problems are becoming more common – they enable criteria manifested in distinct (and potentially conflicting) domains to be jointly balanced and satisfied. In this paper we present airport terminals as a case study which requires multi-disciplinary knowledge in order to balance conflicting security, economic and passenger-driven needs and correspondingly enhance the design, management and operation of airport terminals. The need for a truly multi-disciplinary scientific approach which integrates information, process, people, technology and space domains is highlighted through a brief discussion of two challenges currently faced by airport operators. The paper outlines the approach taken by this project, detailing the aims and objectives of each of seven diverse research programs.
Abstract:
China has made great progress in constructing comprehensive legislative and judicial infrastructures to protect intellectual property rights (IPRs). But levels of enforcement remain low. Estimates suggest that 90% of film and music products consumed in China are ‘pirated’, and in 2009 81% of the infringing goods seized at the US border originated from China. Despite heavy criticism over its failure to enforce IPRs, key areas of China’s creative industries (CIs), including film, mobile-music, fashion and animation, are developing rapidly. This paper explores how the rapid expansion of China’s creative economy might be reconciled with conceptual approaches that view the CIs in terms of creativity inputs and IP outputs. It argues that an evolutionary understanding of copyright’s role in creative innovation might better explain China’s experiences and provide more general insights into the nature of the creative industries and the policies most likely to promote growth in this sector of the economy.
Abstract:
This paper considers the scope to develop an approach to the spatial dimensions of media and culture that is informed by cultural-economic geography. I refer to cultural-economic geography as that strand of research in the field of geography informed, on the one hand, by the ‘cultural turn’ in both geographical and economic thought, which focuses on the relationship between space, knowledge and identity in the spheres of production and consumption, and, on the other, by work by geographers that has sought to map the scale and significance of the cultural or creative industries as new drivers of the global economy. The paper considers the extent to which this work enables those engaged with urban cultural policy to get beyond some of the impasses that have arisen with the development of “creative cities” policies derived from the work of authors such as Richard Florida, as well as from the business management literature on clusters. It frames these debates in the context of recent work by Michael Curtin on media capitals, and the question of whether cities in East Asia can emerge as media capitals from outside of the US-Europe-dominated transnational cultural axis.
Abstract:
In this chapter we present a case study set in Beloi, a fishing village located on Ataúro Island, 30 km across the sea from Díli, capital of Timor-Leste (East-Timor). We explore the tension between tourism development, food security and marine conservation in a developing country context. In order to better understand the relationships between the social, ecological and economic issues that arise in tourism planning we use an approach and associated methodology based on storytelling, complexity theory and concept mapping. Through testing scenarios with this methodology we hope to evaluate which trade-offs are acceptable to local people in return for the hoped-for economic boost from increased tourist visitation and associated developments.
Abstract:
Modern society has come to expect electrical energy on demand, while many of the facilities in power systems are aging beyond repair and maintenance. The risk of failure increases as equipment ages and can pose serious consequences for continuity of electricity supply. As the equipment used in high voltage power networks is very expensive, it may not be economically feasible to purchase and store spares in a warehouse for extended periods of time. On the other hand, there is normally a significant lead time between ordering equipment and receiving it. This situation has created considerable interest in the evaluation and application of probability methods for aging plant and the provision of spares in bulk supply networks, and can be of particular importance for substations. Quantitative adequacy assessment of substation and sub-transmission power systems is generally done using a contingency enumeration approach, which includes the evaluation of contingencies and their classification based on selected failure criteria. The problem is very complex because of the need to model in detail the operation of substation and sub-transmission equipment using network flow evaluation and to consider multiple levels of component failures. In this thesis a new model associated with aging equipment is developed that combines the standard treatment of random failures with a specific model for aging failures. This technique is applied to include and examine the impact of aging equipment on the reliability of bulk supply loads and distribution network consumers over a defined range of planning years. The power system risk indices depend on many factors, such as the actual physical network configuration and operation, the aging condition of the equipment, and the relevant constraints.
The impact and importance of equipment reliability on power system risk indices in a network with aging facilities contains valuable information for utilities to better understand network performance and the weak links in the system. In this thesis, algorithms are developed to measure the contribution of individual equipment to the power system risk indices, as part of a novel risk analysis tool. A new cost-worth approach was also developed that supports early decisions in planning replacement activities for non-repairable aging components, in order to maintain an economically acceptable level of system reliability. The concepts, techniques and procedures developed in this thesis are illustrated numerically using published test systems. It is believed that the methods and approaches presented substantially improve the accuracy of risk predictions by explicitly considering the effect of equipment entering a period of increased risk of non-repairable failure.
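The thesis's own aging-failure model is not reproduced in the abstract, but the core idea, that a component's near-term failure probability rises with age once wear-out sets in, can be sketched as a simple competing-risks combination of a constant random failure rate and a Weibull wear-out hazard. The parameter values below are illustrative assumptions only, not figures from the thesis:

```python
import math

def survival(t, lam_random, beta, eta):
    """Survival probability at age t for a component subject to both a
    constant random failure rate (lam_random) and Weibull aging failures
    with shape beta and scale eta, treated as competing risks."""
    return math.exp(-lam_random * t) * math.exp(-((t / eta) ** beta))

def failure_prob_in_interval(t0, t1, lam_random, beta, eta):
    """Probability that a component still healthy at age t0 fails before t1."""
    s0 = survival(t0, lam_random, beta, eta)
    s1 = survival(t1, lam_random, beta, eta)
    return 1.0 - s1 / s0

# Illustrative parameters: 1% per-year random failure rate, Weibull
# wear-out with shape 3 and scale 40 years.
young = failure_prob_in_interval(5, 10, 0.01, 3.0, 40.0)   # ages 5 -> 10
old = failure_prob_in_interval(20, 25, 0.01, 3.0, 40.0)    # ages 20 -> 25
```

Because the Weibull hazard rises with age (shape > 1), the 20-year-old component has a higher five-year failure probability than the 5-year-old one, which is exactly the effect that makes explicit aging models matter for spares and replacement planning.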
Abstract:
This thesis articulates a methodology that can be applied to the analysis and design of underlying organisational structures and processes that will consistently and effectively address ‘wicked problems’ in forestry (the most difficult class of problems that we can conceptualise: problems which consist of ‘clusters’ of problems that cannot be solved in isolation from one another, and which include sociopolitical and moral-spiritual issues (Rittel and Webber 1973)). This transdisciplinary methodology has been developed from the perspective of institutional economics, synthesised with perspectives from ecological economics and system dynamics. The institutionalist policymaking framework provides an approach for the explicit development of holistic policy. An illustrative application of this framework to the wicked problem of forestry in southern Tasmania demonstrates the applicability of the approach in the Australian context. To date, all attempts to seek solutions to that prevailing wicked problem set have relied on non-reflexive, partial and highly reductionist thinking. A formal assessment of the prevailing governance and process arrangements applying to that particular forestry industry has been undertaken using the social fabric matrix. This methodology lies at the heart of the institutionalist policymaking framework, and allows for the systematic exploration of elaborately complex causal links and relationships, such as are present in southern Tasmania. Some possible attributes of an alternative approach to forest management, one that sustains the ecological, social and economic values of forests, have been articulated as indicative of the alternative policy and management outcomes that real-world application of this transdisciplinary, discursive and reflexive framework may crystallise. Substantive and lasting solutions to wicked problems need to be formed endogenously, that is, from within the system.
The institutionalist policymaking framework is a vehicle through which this endogenous creation of solutions to wicked problems may be realised.
Abstract:
In the networked, information-driven world that we now inhabit, the ability to access and reuse information, data and culture is a key ingredient of social, economic and cultural innovation. As government holds enormous amounts of publicly funded material that can be released to the public without breaching the law, it should move to implement policies that will allow better access to and reuse of that information, knowledge and culture. The Queensland Government Information Licensing Framework (GILF) Project is one of the first projects in the world to systematically approach this issue and should be consulted as a best practice model.
Abstract:
This special issue presents an excellent opportunity to study applied epistemology in public policy. This is an important task because the arena of public policy is the social domain in which macro conditions for ‘knowledge work’ and ‘knowledge industries’ are defined and created. We argue that knowledge-related public policy has become overly concerned with creating the politico-economic parameters for the commodification of knowledge. Our policy scope is broader than that of Fuller (1988), who emphasizes the need for a social epistemology of science policy. We extend our focus to a range of policy documents that include communications, science, education and innovation policy (collectively called knowledge-related public policy in acknowledgement of the fact that there is no defined policy silo called ‘knowledge policy’), all of which are central to policy concerned with the ‘knowledge economy’ (Rooney and Mandeville, 1998). However, what we will show here is that, as Fuller (1995) argues, ‘knowledge societies’ are not industrial societies permeated by knowledge, but that knowledge societies are permeated by industrial values. Our analysis is informed by an autopoietic perspective. Methodologically, we approach it from a sociolinguistic position that acknowledges the centrality of language to human societies (Graham, 2000). Here, what we call ‘knowledge’ is posited as a social and cognitive relationship between persons operating on and within multiple social and non-social (or, crudely, ‘physical’) environments. Moreover, knowing, we argue, is a sociolinguistically constituted process. Further, we emphasize that the evaluative dimension of language is most salient for analysing contemporary policy discourses about the commercialization of epistemology (Graham, in press). 
Finally, we provide a discourse analysis of a sample of exemplary texts drawn from a 1.3 million-word corpus of knowledge-related public policy documents that we compiled from local, state, national and supranational legislatures throughout the industrialized world. Our analysis exemplifies a propensity in policy for resorting to technocratic, instrumentalist and anti-intellectual views of knowledge. We argue that what underpins these patterns is a commodity-based conceptualization of knowledge, itself underpinned by an axiology of narrowly economic imperatives at odds with the very nature of knowledge. The commodity view of knowledge, therefore, is flawed in its ignorance of the social systemic properties of knowing.
Abstract:
Multivariate volatility forecasts are an important input in many financial applications, in particular portfolio optimisation problems. Given the number of models available and the range of loss functions to discriminate between them, selecting the optimal forecasting model is challenging. The aim of this thesis is to thoroughly investigate how effective many commonly used statistical (MSE and QLIKE) and economic (portfolio variance and portfolio utility) loss functions are at discriminating between competing multivariate volatility forecasts. An analytical investigation of the loss functions is performed to determine whether they identify the correct forecast as the best forecast. This is followed by an extensive simulation study that examines the ability of the loss functions to consistently rank forecasts, and their statistical power within tests of predictive ability. For the tests of predictive ability, the model confidence set (MCS) approach of Hansen, Lunde and Nason (2003, 2011) is employed. An empirical study then investigates whether the simulation findings hold in a realistic setting. In light of these earlier studies, a major empirical study seeks to identify the set of superior multivariate volatility forecasting models from 43 models that use either daily squared returns or realised volatility to generate forecasts. This study also assesses how the choice of volatility proxy affects the ability of the statistical loss functions to discriminate between forecasts. Analysis of the loss functions shows that QLIKE, MSE and portfolio variance can discriminate between multivariate volatility forecasts, while portfolio utility cannot. An examination of the effective loss functions shows that they can all identify the correct forecast at a point in time; however, their ability to discriminate between competing forecasts varies.
QLIKE is identified as the most effective loss function, followed by portfolio variance and then MSE. The major empirical analysis reports that the optimal set of multivariate volatility forecasting models includes forecasts generated from both daily squared returns and realised volatility. Furthermore, it finds that the volatility proxy affects the statistical loss functions’ ability to discriminate between forecasts in tests of predictive ability. These findings deepen our understanding of how to choose between competing multivariate volatility forecasts.
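For intuition about the statistical loss functions named above, the univariate forms of MSE and QLIKE can be sketched as below. This is a simplification (the thesis concerns multivariate forecasts), and the function names, model labels and numbers are illustrative assumptions, not the thesis's implementation:

```python
import math

def mse_loss(proxy, forecast):
    """Mean squared error contribution: squared distance between a
    volatility proxy (e.g. a squared return) and a variance forecast."""
    return (proxy - forecast) ** 2

def qlike_loss(proxy, forecast):
    """QLIKE loss: log(h) + proxy / h, minimised in expectation when the
    forecast h equals the true conditional variance."""
    return math.log(forecast) + proxy / forecast

# Rank two hypothetical variance forecasts against a noisy proxy.
proxy = 1.3  # e.g. one day's squared return
forecasts = {"model_a": 1.05, "model_b": 1.8}
ranked = sorted(forecasts, key=lambda m: qlike_loss(proxy, forecasts[m]))
```

Averaging such losses over many days, and testing whether the differences are statistically meaningful, is what procedures like the model confidence set formalise.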
Abstract:
In response to the need to leverage private finance, and to the lack of competition in some parts of the Australian public sector major infrastructure market, especially in very large economic infrastructure procured using Public Private Partnerships, the Australian Federal government has demonstrated its desire to attract new sources of in-bound foreign direct investment (FDI) into the Australian construction market. This paper reports on progress towards an investigation into the determinants of multinational contractors’ willingness to bid for Australian public sector major infrastructure projects, which is designed to give an improved understanding of matters surrounding FDI into the Australian construction sector. The research deploys Dunning’s eclectic theory for the first time in the context of in-bound FDI by multinational contractors bidding as head contractors for Australian major infrastructure public sector projects. Elsewhere, the authors have developed Dunning’s principal hypothesis associated with his eclectic framework to suit the context of this research and to address a weakness in that hypothesis: it is based on a nominal approach to the factors in the eclectic framework and fails to speak to their relative explanatory power. In this paper, an approach to reviewing and analysing secondary data, as part of the first stage of this research, is developed and illustrated with respect to the selected sector (roads, bridges and tunnels) in Australia (as the host location) and one of the selected home countries (Spain). In conclusion, some tentative thoughts are offered in anticipation of the completion of the first stage investigation, in terms of the extent to which this stage, based on secondary data only, might suggest the relative importance of the factors in the eclectic framework.
It is noted that more robust conclusions are expected from the future planned stages of the research, and these stages, which include primary data, are briefly outlined. Finally, beyond the theoretical contributions expected from the overall approach taken to developing and testing Dunning’s framework, other expected contributions concerning research method and practical implications are mentioned.
Abstract:
This paper reviews diversity in knowledge management (KM) from a cultural perspective; it argues that culturally embedded theories and practices influence the practice of knowledge management. It further presents and analyses several case studies, in particular a case study of Islamic culture focusing on its traditional approach to both Islamic knowledge and management. The analysis of this case reveals the cultural challenges that emerge in the process of applying essentially Western management theories within an Islamic culture, with particular reference to knowledge management theories. The paper concludes that the concept of knowledge management must take into account the diversity of the national culture in which the organization exists, and that the concept will benefit from a diversity perspective rather than a universality perspective.
Abstract:
The motivation for this study stems from the results reported in the Excellence in Research for Australia (ERA) 2010 report, which showed that only 12 universities performed research at or above international standards, with the Group of Eight (G8) universities filling the top eight spots. While the performance of universities was based on the number of research outputs, total research income and other quantitative indicators, efficiency or productivity was not considered. The objectives of this paper are twofold: first, to review the research performance of 37 Australian universities using the data envelopment analysis (DEA) bootstrap approach of Simar and Wilson (2007); second, to determine the drivers of productivity by regressing the efficiency scores against a set of environmental variables.
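DEA in general requires solving a linear program per decision-making unit, and the Simar and Wilson (2007) bootstrap adds a bias-correction on top of that; neither is reproduced here. In the special case of one input and one output under constant returns to scale, however, CCR efficiency reduces to each unit's output/input ratio relative to the best ratio in the sample, which conveys the basic idea. The data below are hypothetical:

```python
def ccr_efficiency(units):
    """CCR (constant returns to scale) DEA efficiency for the special case
    of one input and one output: each decision-making unit's output/input
    ratio divided by the best ratio observed in the sample, so the most
    productive unit scores 1.0 and the rest score between 0 and 1."""
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

# Hypothetical data: research income ($m) as the input, weighted
# publication count as the output.
universities = {
    "uni_a": (10.0, 50.0),
    "uni_b": (8.0, 48.0),
    "uni_c": (12.0, 36.0),
}
scores = ccr_efficiency(universities)  # uni_b is the efficient frontier
```

With multiple inputs and outputs the ratio generalises to an optimally weighted sum, which is where the linear programming formulation comes in.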
Abstract:
Much has been written about the Internet’s potential to enhance international market growth opportunities for SMEs. However, the literature is vague as to how Internet usage and the application of Internet marketing (also known as Internet marketing intensity) affect firm international market growth. This paper examines the level and role of the Internet in the international operations of a sample of 218 Australian SMEs with international customers. The study shows evidence of a statistical relationship between Internet usage and Internet marketing intensity, which in turn leads to international market growth in terms of increased sales from new customers in new countries, new customers in existing countries and existing customers.
Abstract:
The purpose of this paper is to show how project management governance is addressed through the use of a specific meta-method. Governance is defined here by two criteria: accountability and performance. Accountability is promoted through transparency, and performance is promoted by responsive and responsible decision-making. From a systemic perspective, transparency and decision-making involve having information and tacit or explicit knowledge, as well as an understanding of the context, the different parameters and variables, their interactions and the conditions of change. Although this method of methods was built through a heuristic process spanning 25 years of research and consulting activity, it seems appropriate to set out its foundations. I first clarify my epistemological position and the notion of project and project management as Art and Science, which leads me to define a "Be"/"Have" posture in this regard. The main theoretical roots of the MAP Method are then exposed: Boisot's Social Learning Cycle, praxeology and the theory of convention. Next, the main characteristics of the method and the 17 methods and tools constituting the MAP "tool box" are introduced with regard to the project management governance perspective. Finally, I discuss the integration of two managerial modes (operational and project modes) and the consequences in terms of governance in a specific socio-techno-economic project/context ecosystem.