301 results for "Source to sinks"


Relevance: 30.00%

Abstract:

The notion of designing with change constitutes a fundamental and foundational theoretical premise for much of landscape architecture, notably through its engagement with ecology, particularly since the work of Ian McHarg in the 1960s and his key text Design with Nature. However, while most if not all texts in landscape architecture cite this engagement with change theoretically, few go any further than citation; when they do, their methods seem fixated on empirical, quantitative scientific tools rather than on the tools of design in an architectural sense, as implied by the name of the discipline: landscape architecture.

Relevance: 30.00%

Abstract:

In the rapidly growing knowledge economy, the talent and creativity of those around us will be increasingly decisive in shaping economic opportunity. Creativity can be described as the ability to produce new and original ideas and things; in other words, it is any act, idea, or product that changes an existing domain or transforms an existing domain into a new one. From an economic perspective, creativity can be considered the generation of new ideas, which is a major source of innovation and new economic activities. As urban regions have become the localities of key knowledge precincts and knowledge clusters across the globe, the link between a range of new technologies and the development of ‘creative urban regions’ (CURs) has come to the fore. In this sense, creativity has become a buzz concept in knowledge-economy research and policy circles. It has spawned ‘creative milieus,’ ‘creative industries,’ ‘creative cities,’ the ‘creative class,’ and ‘creative capital.’ Hence, creativity has become a key concept on the agenda of city managers, development agents, and planners as they search for new forms of urban and economic development. CURs provide vast opportunities for knowledge production and spillover, which lead to the formation of knowledge cities. Urban information and communication technology (ICT) developments support the transformation of cities into knowledge cities. This book, a companion volume to Knowledge-Based Urban Development: Planning and Applications in the Information Era (also published by IGI Global), focuses on some of these developments. The Foreword and Afterword are written by respected senior academic researchers Robert Stimson of the University of Queensland, Australia, and Zorica Nedovic-Budic of the University of Illinois at Urbana-Champaign, USA. The book is divided into four sections, each dealing with selected aspects of information and communication technologies and creative urban regions.

Relevance: 30.00%

Abstract:

Despite the numerous observations that dynamic capabilities lie at the source of competitive advantage, we still have limited knowledge as to how access to firm-based resources and changes to these affect the development of dynamic capabilities. In this paper, we examine founder human capital, access to employee human capital, access to technological expertise, access to other specific expertise, and access to two types of tangible resources in a sample of new firms in Sweden. We empirically measure four dynamic capabilities and find that the nature and effect of resources employed in the development of these capabilities vary greatly. For the most part, there are positive effects stemming from access to particular resources. However, for some resources, such as access to employee human capital and access to financial capital, unexpected negative effects also appear. This study therefore provides statistical evidence as to the varying role of resources in capability development. Importantly, we also find that changes in resource bases have more influential roles in the development of dynamic capabilities than the resource stock variables that were measured at an earlier stage of firm development. This provides empirical support for the notion of treating the firm as a dynamic flow of resources as opposed to a static stock. This finding also highlights the importance of longitudinal designs in studies of dynamic capability development. Further recommendations for future empirical studies of dynamic capabilities are presented.

Relevance: 30.00%

Abstract:

Key topics: Since the birth of the open source movement in the mid-1980s, open source software has become more and more widespread. Among others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow the software's use in exchange for a fee, open source licenses grant users broader rights such as free use, copying, modification and distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance that underlies such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007), collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008), issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006), public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006), technological competition issues related to standard battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007), and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users.
On this topic, articles show that a commercial activity based on open source software is possible, as they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activities: providers of packaged open source solutions, IT services & software engineering firms, and open source software publishers. However, the business model implications differ for each of these categories: the activities of providers of packaged solutions and of IT services & software engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as this issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. The literature so far identifies and depicts only two generic types of business models for open source software publishers: the business model of ''bundling'' (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are: (1) to explore in which contexts the two generic business models described in the literature can be implemented successfully, and (2) to depict an additional business model for open source software publishers which can be used in a different context. To do so, this paper draws upon an explorative case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager.
It aims at depicting the process of IdealX's search for the appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of ''mutualisation'', which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the business model of bundling, the dual licensing business model and the business model of mutualisation) as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes further than the traditional concept of business model used by scholars in the open source literature. In this article, a business model is not considered merely as a way of generating revenue (a ''revenue model'' (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenbloom, 2002; Teece, 2007). Consequently, this paper analyses business models in terms of these two components.

Relevance: 30.00%

Abstract:

In the case of industrial relations research, particularly that which sets out to examine practices within workplaces, the best way to study this real-life context is to work for the organisation. Studies conducted by researchers working within the organisation comprise some of the (broad) field’s classic research (cf. Roy, 1954; Burawoy, 1979). Participant and non-participant ethnographic research provides an opportunity to investigate workplace behaviour beyond the scope of questionnaires and interviews. However, we suggest that the data collected outside a workplace can be just as important as the data collected inside the organisation’s walls. In recent years the introduction of anti-smoking legislation in Australia has meant that people who smoke cigarettes are no longer allowed to do so inside buildings. Not only are smokers forced outside to engage in their habit, but they have to smoke prescribed distances from doorways, or in some workplaces outside the property line. This chapter considers the importance of cigarette-smoking employees in ethnographic research. Through data collected across three separate research projects, the chapter argues that smokers, as social outcasts in the workplace, can provide a wealth of important research data. We suggest that smokers also appear more likely to provide stories that contradict the ‘management’ or ‘organisational’ position. Thus, within the haze of smoke, researchers can uncover a level of discontent with the ‘corporate line’ presented inside the workplace. There are several aspects to the increased propensity of smokers to provide a contradictory or discontented story. It may be that the researcher is better able to establish a rapport with smokers, as there is a removal of the artificial wall a researcher presents as an outsider. It may also be that a research location physically outside the boundaries of the organisation provides workers with the freedom to express their discontent. 
The authors offer no definitive answers; rather, this chapter is intended to extend our knowledge of workplace research by highlighting the methodological value of using smokers as research subjects. We present the experience of three separate case studies where interactions with cigarette smokers have provided either important organisational data or, alternatively, a means of entering what Cunnison (1966) referred to as the ‘gossip circle’. The final section of the chapter draws on the evidence to demonstrate how the community of smokers, as social outcasts, is valuable in investigating workplace issues. For researchers and practitioners, these social outcasts may very well prove to be an important barometer of employee attitudes; attitudes that perhaps cannot be measured through traditional staff surveys.

Relevance: 30.00%

Abstract:

Alvin Toffler’s image of the prosumer (1970, 1980, 1990) continues to significantly influence our understanding of the user-led, collaborative processes of content creation which are today labelled “social media” or “Web 2.0”. A closer look at Toffler’s own description of his prosumer model reveals, however, that it remains firmly grounded in the mass media age: the prosumer is clearly not the self-motivated creative originator and developer of new content who can today be observed in projects ranging from open source software through Wikipedia to Second Life, but simply a particularly well-informed, and therefore both particularly critical and particularly active, consumer. The highly specialised, high-end consumers who exist in areas such as hi-fi or car culture are far more representative of the ideal prosumer than the participants in non-commercial (or as yet non-commercial) collaborative projects. Expecting Toffler’s 1970s model of the prosumer to describe these 21st-century phenomena was, of course, always unrealistic. To describe the creative and collaborative participation which today characterises user-led projects such as Wikipedia, terms such as ‘production’ and ‘consumption’ are no longer particularly useful – even in laboured constructions such as ‘commons-based peer production’ (Benkler, 2006) or ‘p2p production’ (Bauwens, 2005). In the user communities participating in such forms of content creation, roles as consumers and users have long begun to be inextricably interwoven with those of producer and creator: users are always already also able to be producers of the shared information collection, regardless of whether they are aware of that fact – they have taken on a new, hybrid role which may be best described as that of a produser (Bruns, 2008).
Projects which build on such produsage can be found in areas from open source software development through citizen journalism to Wikipedia, and beyond this also in multi-user online computer games, filesharing, and even in communities collaborating on the design of material goods. While addressing a range of different challenges, they nonetheless build on a small number of universal key principles. This paper documents these principles and indicates the possible implications of this transition from production and prosumption to produsage.

Relevance: 30.00%

Abstract:

We assess the increase in particle number emissions from motor vehicles driving at steady speed when forced to stop and accelerate from rest. Considering the example of a signalized pedestrian crossing on a two-way single-lane urban road, we use a complex line source method to calculate the total emissions produced by a specific number and mix of light petrol cars and diesel passenger buses, and show that total emissions during a red light are significantly higher than while the light remains green. Replacing two cars with one bus increased the emissions by over an order of magnitude. Considering these large differences, we conclude that the importance attached to particle number emissions in traffic management policies should be reassessed in the future.
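
At its core, the red-versus-green comparison described above is a weighted sum of per-vehicle emission rates over each signal phase. The sketch below illustrates only that bookkeeping; the function name and all emission factors are hypothetical placeholder values, not the paper's measurements or its complex line source method.

```python
def total_emissions(fleet, phase_rates, duration_s):
    """Total particle number emitted by a vehicle fleet during one signal phase.

    fleet: dict vehicle type -> number of vehicles
    phase_rates: dict vehicle type -> emission rate in this phase (particles/s)
    duration_s: phase duration (s)
    """
    return duration_s * sum(count * phase_rates[v] for v, count in fleet.items())

# Placeholder per-vehicle rates (particles/s) -- hypothetical, NOT the paper's data:
red_rates = {"petrol_car": 5e11, "diesel_bus": 5e13}    # stop-and-accelerate
green_rates = {"petrol_car": 5e10, "diesel_bus": 5e12}  # steady cruise

fleet = {"petrol_car": 10, "diesel_bus": 1}
red = total_emissions(fleet, red_rates, duration_s=60)
green = total_emissions(fleet, green_rates, duration_s=60)
print(red > green)  # -> True: the red phase dominates under these assumed rates
```

With rates like these, a single diesel bus contributes far more than the petrol cars combined, which is consistent with the order-of-magnitude effect the abstract reports for replacing two cars with one bus.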

Relevance: 30.00%

Abstract:

We present a new penalty-based genetic algorithm for the multi-source and multi-sink minimum vertex cut problem, and illustrate the algorithm's usefulness with two real-world applications. We prove that, by exploiting some domain-specific knowledge, the genetic algorithm always produces a feasible solution. The genetic algorithm has been implemented on the example applications and evaluated to show how well it scales as the problem size increases.
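
A minimal sketch of how such a penalty-based GA can be structured, assuming a bit-per-vertex encoding in which infeasible cuts (those leaving some source connected to some sink) incur a penalty larger than any feasible cut. The operators and parameters here are illustrative, not the paper's actual algorithm or its feasibility-guaranteeing mechanism.

```python
import random
from collections import deque

def still_connected(adj, removed, sources, sinks):
    """BFS from the sources, skipping removed vertices: can any sink be reached?"""
    seen = {s for s in sources if s not in removed}
    queue = deque(seen)
    while queue:
        u = queue.popleft()
        if u in sinks:
            return True
        for v in adj[u]:
            if v not in removed and v not in seen:
                seen.add(v)
                queue.append(v)
    return False

def ga_min_vertex_cut(adj, sources, sinks, pop_size=30, generations=200, seed=0):
    """Penalty-based GA sketch: one bit per removable vertex (1 = in the cut)."""
    rng = random.Random(seed)
    verts = sorted(set(adj) - set(sources) - set(sinks))
    penalty = len(verts) + 1  # worse than removing every removable vertex

    def fitness(bits):
        removed = {v for v, b in zip(verts, bits) if b}
        cost = len(removed)
        if still_connected(adj, removed, sources, sinks):
            cost += penalty  # infeasible cut: a source still reaches a sink
        return cost

    pop = [[rng.randint(0, 1) for _ in verts] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        nxt = [bits[:] for bits in pop[:pop_size // 2]]  # elitist truncation
        while len(nxt) < pop_size:
            p1, p2 = rng.sample(nxt, 2)
            cx = rng.randrange(1, len(verts)) if len(verts) > 1 else 0
            child = p1[:cx] + p2[cx:]                    # one-point crossover
            child[rng.randrange(len(verts))] ^= 1        # single-bit mutation
            nxt.append(child)
        pop = nxt
    best = min(pop, key=fitness)
    return {v for v, b in zip(verts, best) if b}

# Toy example: one source s, one sink t, joined through vertices a and b;
# the unique minimum vertex cut is {a, b}.
adj = {"s": ["a", "b"], "a": ["s", "t"], "b": ["s", "t"], "t": ["a", "b"]}
cut = ga_min_vertex_cut(adj, sources={"s"}, sinks={"t"})
print(sorted(cut))  # -> ['a', 'b']
```

Because the penalty exceeds the cost of removing every removable vertex, any feasible individual always outranks any infeasible one, so the population is driven toward feasibility before it is driven toward minimality.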

Relevance: 30.00%

Abstract:

The process of structural health monitoring (SHM) involves monitoring a structure over a period of time using appropriate sensors, extracting damage-sensitive features from the measurements made by the sensors, and analysing these features to determine the current state of the structure. Various techniques are available for structural health monitoring, and acoustic emission (AE) is one technique finding increasing use. Acoustic emission waves are the stress waves generated by the mechanical deformation of materials. AE waves produced inside a structure can be recorded by means of sensors attached to the surface, and analysis of these recorded signals can locate and assess the extent of damage. This paper describes preliminary studies on the application of the AE technique to health monitoring of bridge structures. Crack initiation or structural damage will result in wave propagation in the solid, and this can take place in various forms. Propagation of these waves is likely to be affected by the dimensions, surface properties and shape of the specimen; this, in turn, will affect source localization. Various laboratory test results on source localization, using pencil lead break tests, will be presented. The results from the tests can be expected to enhance knowledge of the acoustic emission process and aid the development of an effective bridge structure diagnostics system.
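
For a one-dimensional specimen instrumented with two sensors, the classic linear location formula recovers the source position from the difference in arrival times and an assumed wave speed. The sketch below uses that textbook relationship only; the 5000 m/s speed and the distances are illustrative values, not results from the tests described above.

```python
def linear_ae_location(spacing, wave_speed, dt):
    """Two-sensor linear (1-D) AE source location.

    spacing: distance between the two sensors (m)
    wave_speed: assumed AE wave propagation speed in the specimen (m/s)
    dt: arrival time at sensor 1 minus arrival time at sensor 2 (s)
    Returns the source distance from sensor 1 (m).
    """
    return (spacing + wave_speed * dt) / 2.0

# Illustrative pencil-lead break 0.3 m from sensor 1, sensors 1 m apart,
# assuming a 5000 m/s wave speed: t1 = 0.3/5000 s, t2 = 0.7/5000 s.
x = linear_ae_location(1.0, 5000.0, -8.0e-5)
print(round(x, 6))  # -> 0.3
```

The formula makes plain why the wave-speed assumption matters: any error in the assumed speed shifts the computed location in proportion to the measured time difference, which is exactly the sensitivity the pencil-lead-break calibration tests probe.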

Relevance: 30.00%

Abstract:

The multi-level current reinjection concept described in the literature is well known to produce high-quality AC current waveforms in high-power, high-voltage self-commutating current source converters. This paper proposes a novel reinjection circuit capable of producing a 7-level reinjection current. It is shown that this reinjection current effectively increases the pulse number of the converter to 72. PSCAD/EMTDC simulations validate the proposed concept, illustrating its effectiveness on both the AC and DC sides of the converter.
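
The figures quoted are consistent with a 12-pulse base converter whose pulse number is multiplied by the number of reinjection steps, i.e. by (m - 1) for an m-level reinjection current. This multiplicative reading is an interpretive assumption on our part, sketched below, rather than a derivation taken from the paper.

```python
def effective_pulse_number(base_pulse, levels):
    """Effective pulse number of a base converter combined with multi-level
    current reinjection, assuming each of the (levels - 1) reinjection steps
    multiplies the base pulse number (an interpretive assumption, not a
    result quoted from the paper)."""
    return base_pulse * (levels - 1)

print(effective_pulse_number(12, 7))  # -> 72, matching the figure quoted above
```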

Relevance: 30.00%

Abstract:

An informed citizenry is essential to the effective functioning of democracy. In most modern liberal democracies, citizens have traditionally looked to the media as the primary source of information about socio-political matters. In our increasingly mediated world, it is critical that audiences be able to effectively and accurately use the media to meet their information needs. Media literacy, the ability to access, understand, evaluate and create media content, is therefore a vital skill for a healthy democracy. The past three decades have seen the rapid expansion of the information environment, particularly through Internet technologies, and media usage patterns have changed dramatically as a result. Blogs and websites are now popular sources of news and information, and for some sections of the population are likely to be the first, and possibly only, information source accessed when information is required. What are the implications for media literacy in such a diverse and changing information environment? The Alexandria Manifesto stresses the link between libraries, a well-informed citizenry and effective governance, so how do these changes impact on libraries? This paper considers the role libraries can play in developing media-literate communities, and explores the ways in which traditional media literacy training may be expanded to better equip citizens for new media technologies. Drawing on original empirical research, this paper highlights a key shortcoming of existing media literacy approaches: that of overlooking the importance of needs identification as an initial step in media selection. Self-awareness of one's actual information need is not automatic, as can be witnessed daily at reference desks in libraries the world over. Citizens very often do not know what it is that they need when it comes to information.
Without this knowledge, selecting the most appropriate information source from the vast range available becomes an uncertain, possibly even random, enterprise. Incorporating reference interview-style training into media literacy education, whereby individuals develop the skills to interrogate themselves regarding their underlying information needs, will enhance media literacy approaches. This increased focus on the needs of the individual will also push media literacy education towards a more constructivist methodology. The paper also stresses the importance of media literacy training for adults: media literacy education received in school or even university cannot be expected to retain its relevance over time in our rapidly evolving information environment. Further, constructivist teaching approaches highlight the importance of context to the learning process; thus it may be more effective to offer media literacy education relating to news media use to adults, whilst school-based approaches focus on the types of media more relevant to young people, such as entertainment media. Librarians are ideally placed to offer such community-based media literacy education for adults. Through their training and practice of the reference interview, they already understand how to identify underlying information needs. Further, libraries are placed within community contexts, where the everyday practice of media literacy occurs. As the Alexandria Manifesto makes clear, libraries have a role to play in fostering media literacy within their communities.

Relevance: 30.00%

Abstract:

To allocate and size capacitors in a distribution system, this paper employs an optimization algorithm called Discrete Particle Swarm Optimization (DPSO). The objective is to minimize the transmission line loss cost plus the capacitor cost. During the optimization procedure, the bus voltages, the feeder currents and the reactive power flowing back to the source side must be maintained within standard levels. To validate the proposed method, the semi-urban distribution system connected to bus 2 of the Roy Billinton Test System (RBTS) is used. This 37-bus distribution system has 22 loads located on the secondary side of a distribution substation (33/11 kV). By reducing the transmission line loss in a standard system in which line losses account for only about 6.6 percent of total power, the capabilities of the proposed technique are validated.
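
A minimal sketch of the discrete (binary) PSO idea, with one bit per candidate bus indicating whether a capacitor is installed. The sigmoid velocity-to-probability update follows the standard binary PSO formulation; the toy objective below merely stands in for the paper's loss-plus-capacitor-cost calculation and its voltage, current and reactive power constraints, which would be enforced inside the cost function.

```python
import math
import random

def binary_pso(cost, n_bits, swarm=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal binary PSO: each particle is a bit-list (capacitor on/off per bus)."""
    rng = random.Random(seed)
    X = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(swarm)]
    V = [[0.0] * n_bits for _ in range(swarm)]
    pbest = [x[:] for x in X]                     # personal best positions
    pcost = [cost(x) for x in X]
    g = min(range(swarm), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]          # global best
    for _ in range(iters):
        for i in range(swarm):
            for d in range(n_bits):
                v = (w * V[i][d]
                     + c1 * rng.random() * (pbest[i][d] - X[i][d])
                     + c2 * rng.random() * (gbest[d] - X[i][d]))
                V[i][d] = max(-6.0, min(6.0, v))         # clamp velocity
                prob = 1.0 / (1.0 + math.exp(-V[i][d]))  # sigmoid -> P(bit = 1)
                X[i][d] = 1 if rng.random() < prob else 0
            c = cost(X[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = X[i][:], c
                if c < gcost:
                    gbest, gcost = X[i][:], c
    return gbest, gcost

# Hypothetical stand-in objective: pretend we know which buses benefit from a
# capacitor, and penalise any deviation from that pattern.
target = [1, 0, 1, 1, 0]
best, best_cost = binary_pso(lambda x: sum((a - b) ** 2 for a, b in zip(x, target)),
                             n_bits=5)
print(best_cost)  # -> 0 once the pattern is found
```

In a realistic setting, the cost function would run a load-flow calculation and add penalty terms whenever a bus voltage, feeder current or reverse reactive power limit is violated, exactly as the abstract describes.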

Relevance: 30.00%

Abstract:

Human-specific Bacteroides HF183 (HS-HF183), human-specific Enterococcus faecium esp (HS-esp), human-specific adenovirus (HS-AVs) and human-specific polyomavirus (HS-PVs) assays were evaluated in freshwater, seawater and distilled water to detect fresh sewage. The sewage-spiked water samples were also tested for the concentrations of traditional fecal indicators (i.e., Escherichia coli, enterococci and Clostridium perfringens) and enteric viruses such as enteroviruses (EVs), sapoviruses (SVs) and torquetenoviruses (TVs). The overall host-specificity of the HS-HF183 marker in differentiating between humans and other animals was 98%, while the HS-esp, HS-AVs and HS-PVs markers showed 100% host-specificity. All the human-specific markers showed >97% sensitivity in detecting human fecal pollution. E. coli, enterococci and C. perfringens were detected up to sewage dilutions of 10^-5, 10^-4 and 10^-3 respectively. HS-esp, HS-AVs, HS-PVs, SVs and TVs were detected up to a sewage dilution of 10^-4, whilst EVs were detected up to a dilution of 10^-5. The ability of the HS-HF183 marker to detect fresh sewage was 3–4 orders of magnitude higher than that of the HS-esp and viral markers. The ability to detect fresh sewage in freshwater, seawater and distilled water matrices was similar for the human-specific bacterial and viral markers. Based on our data, it appears that human-specific molecular markers are sensitive measures of fresh sewage pollution, with the HS-HF183 marker the most sensitive in terms of detecting fresh sewage. However, the presence of the HS-HF183 marker in environmental waters may not necessarily indicate the presence of enteric viruses, due to its high abundance in sewage compared to enteric viruses. More research is required on the persistence of these markers in environmental water samples in relation to traditional fecal indicators and enteric pathogens.

Relevance: 30.00%

Abstract:

This paper reports the application of multi-criteria decision making techniques, PROMETHEE and GAIA, and receptor models, PCA/APCS and PMF, to data from an air monitoring site located on the campus of Queensland University of Technology in Brisbane, Australia, and operated by the Queensland Environmental Protection Agency (QEPA). The data consisted of the concentrations of 21 chemical species and meteorological data collected between 1995 and 2003. PROMETHEE/GAIA separated the samples into those collected when leaded and unleaded petrol were used to power vehicles in the region. The number and source profiles of the factors obtained from the PCA/APCS and PMF analyses were compared; there are noticeable differences in the outcomes, possibly because of the non-negative constraints imposed on the PMF analysis. While PCA/APCS identified 6 sources, PMF reduced the data to 9 factors. Each factor had a distinctive composition suggesting that motor vehicle emissions, controlled burning of forests, secondary sulphate, sea salt and road dust/soil were the most important sources of fine particulate matter at the site. The most plausible locations of the sources were identified by combining the results obtained from the receptor models with meteorological data. The study demonstrated the potential benefits of combining results from multi-criteria decision making analysis with those from receptor models to gain insights that could enhance the development of air pollution control measures.
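
PROMETHEE II itself reduces to pairwise preference comparisons aggregated into positive, negative and net outranking flows. A minimal sketch using the "usual" (strict) preference function is shown below; the sample data are hypothetical and unrelated to the QUT monitoring data or the paper's actual criteria.

```python
def promethee_ii(scores, weights, minimize=frozenset()):
    """Minimal PROMETHEE II ranking using the 'usual' (strict) preference function.

    scores: dict alternative -> list of criterion values; higher is better
    unless the criterion index appears in `minimize`.
    """
    alts = list(scores)
    n = len(alts)
    wsum = sum(weights)

    def pref(a, b):
        """Weighted share of criteria on which a strictly beats b."""
        total = 0.0
        for j, w in enumerate(weights):
            d = scores[a][j] - scores[b][j]
            if j in minimize:
                d = -d
            if d > 0:
                total += w
        return total / wsum

    flows = {}
    for a in alts:
        plus = sum(pref(a, b) for b in alts if b != a) / (n - 1)   # leaving flow
        minus = sum(pref(b, a) for b in alts if b != a) / (n - 1)  # entering flow
        flows[a] = plus - minus                                    # net flow
    return sorted(alts, key=flows.get, reverse=True), flows

# Hypothetical sampling days scored on two pollutant concentrations (lower = better):
samples = {"day1": [12.0, 30.0], "day2": [8.0, 25.0], "day3": [15.0, 40.0]}
order, flows = promethee_ii(samples, [0.5, 0.5], minimize={0, 1})
print(order)  # -> ['day2', 'day1', 'day3']
```

GAIA then projects the same per-criterion preference information onto a plane via principal components, which is what allowed the study to separate the leaded- and unleaded-petrol sample groups visually.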

Relevance: 30.00%

Abstract:

The impact of the Internet on our lives has been pervasive. People are increasingly turning to the social interaction available on the Internet to satisfy their needs, whether these are professional or personal. The Internet offers users fast access to social contacts such as online chat groups and discussion lists, helping us to make connections with others. Online communities are increasingly used by teachers for professional support, guidance and inspiration. These are often organised around subject areas and offer teachers opportunities to develop both personally and professionally. Online communities may serve as a source of continuous professional development for teachers, as they are able to deliver authentic and personalised opportunities for learning. This paper presents the findings of a study conducted on three online communities for teachers. It explores the nature of online community membership and offers some conclusions regarding their potential as a source of professional learning for teachers.