997 results for anthropogenic source
Abstract:
The notion of designing with change is a foundational theoretical premise for much of landscape architecture, notably through the discipline's engagement with ecology, particularly since the work of Ian McHarg in the 1960s and his key text Design with Nature. However, while most if not all texts in landscape architecture cite this engagement with change in theory, few go further than citation, and when they do, their methods seem fixated on empirical, quantitative scientific tools rather than on the tools of design, in an architectural sense, as implied by the name of the discipline: landscape architecture.
Abstract:
The measurement of submicrometre (diameter < 1.0 µm) and ultrafine (diameter < 0.1 µm) particle number concentrations has attracted attention over the last decade, because the potential health impacts associated with exposure to these particles can be more significant than those due to exposure to larger particles. At present, ultrafine particles are not regularly monitored and have yet to be incorporated into air quality monitoring programs. As a result, very few studies have analysed long-term and spatial variations in ultrafine particle concentration, and none have done so in Australia. To address this gap, the aim of this research was to investigate long-term trends and seasonal variations in particle number concentrations in Brisbane, Australia. Data collected over a five-year period were analysed using weighted regression models. Monthly mean concentrations in the morning (6:00-10:00) and the afternoon (16:00-19:00) were plotted against time in months, using the monthly variances as the weights. Over the five-year period, submicrometre and ultrafine particle concentrations increased in the morning by 105.7% and 81.5% respectively, whereas in the afternoon there was no significant trend. The morning concentrations were associated with fresh traffic emissions and the afternoon concentrations with the background. The statistical tests applied to the seasonal models, on the other hand, indicated that there was no seasonal component. The spatial variation in size distribution across a large urban area was investigated using particle number size distribution data collected at nine different locations during different campaigns. The size distributions were represented by their modal structures and cumulative size distributions. Particle number peaked at around 30 nm, except at an isolated site dominated by diesel trucks, where it peaked at around 60 nm.
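To illustrate the weighted regression step, the following is a minimal sketch of fitting a linear trend to synthetic monthly means, using inverse-variance weights as an assumed weighting scheme (the study's exact model specification is not reproduced in this abstract):

```python
import numpy as np

def weighted_trend(months, means, variances):
    """Weighted least squares fit of a linear trend: each monthly mean
    is weighted by the inverse of its monthly variance (a common WLS
    choice; the paper's exact weighting may differ)."""
    w = 1.0 / np.asarray(variances, dtype=float)
    X = np.column_stack([np.ones_like(months, dtype=float), months])
    wsqrt = np.sqrt(w)
    # Solve the weighted normal equations via least squares on sqrt(w)-scaled data.
    coef, *_ = np.linalg.lstsq(wsqrt[:, None] * X, wsqrt * means, rcond=None)
    intercept, slope = coef
    return intercept, slope

# Synthetic example: 60 monthly means with a mild upward trend.
rng = np.random.default_rng(0)
t = np.arange(60)
y = 10_000 + 50 * t + rng.normal(0, 300, size=60)
var = np.full(60, 300.0 ** 2)
b0, b1 = weighted_trend(t, y, var)
```

With equal variances this reduces to ordinary least squares; the weighting matters when some monthly means are noisier than others.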
It was found that ultrafine particles contributed 82%-90% of the total particle number. At the sites dominated by petrol vehicles, nanoparticles (< 50 nm) contributed 60%-70% of the total particle number, and at the site dominated by diesel trucks they contributed 50%. Although the sampling campaigns took place during different seasons and were of varying duration, these variations had no effect on the particle size distributions. The results suggested that the distributions were instead shaped by differences in traffic composition and distance from the road. To investigate the occurrence of nucleation events, that is, secondary particle formation from gaseous precursors, particle size distribution data collected over a 13-month period during five different campaigns were analysed. The study area was a complex urban environment influenced by anthropogenic and natural sources. The study introduced a new application of time series differencing for the identification of nucleation events. To evaluate the conditions favourable to nucleation, the meteorological conditions and gaseous concentrations prior to and during nucleation events were recorded. Gaseous concentrations did not exhibit a clear pattern of change. It was also found that nucleation was associated with sea breezes and long-range transport. The implication of these findings is that, whilst vehicles are the most important source of ultrafine particles, sea breezes and aged gaseous emissions play a more important role in secondary particle formation in the study area.
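The differencing idea can be sketched with a hypothetical, much-simplified detector: flag a candidate nucleation event wherever the first difference of a particle-count series jumps well above its typical size (the paper's actual procedure is more involved than this):

```python
import numpy as np

def flag_bursts(counts, k=3.0):
    """Flag sudden rises in a particle-count time series via first
    differencing: an increment more than k standard deviations above
    the mean increment marks a candidate event."""
    d = np.diff(counts)
    thresh = d.mean() + k * d.std()
    return np.where(d > thresh)[0] + 1  # index of the sample after the jump

# Synthetic series: flat background with one step-like burst at index 50.
series = np.full(100, 1000.0)
series[50:] += 5000.0
events = flag_bursts(series)
```

Differencing removes the slowly varying background, so only abrupt increases in new-particle counts survive the threshold.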
Abstract:
Key topics: Since the birth of the Open Source movement in the mid-1980s, open source software has become more and more widespread. Amongst others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow use of the software in exchange for a fee, open source licenses grant users additional rights such as free use, free copying, free modification and free distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the systems of governance that underlie such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007), collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008), issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006), public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006), technological competition issues related to standards battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007), and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users.
On this topic, articles show that a commercial activity based on open source software is possible, and describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activities: providers of packaged open source solutions, IT services and software engineering firms, and open source software publishers. However, the business model implications differ for each of these categories: providers of packaged solutions and IT services and software engineering firms base their activities on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software itself. This paper focuses on open source software publishers' business models, as the issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. To date, the literature identifies and depicts only two generic types of business models for open source software publishers: bundling business models (Pal and Madanmohan, 2002; Dahlander, 2004) and dual licensing business models (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are (1) to explore the contexts in which the two generic business models described in the literature can be implemented successfully, and (2) to depict an additional business model for open source software publishers which can be used in a different context. To do so, this paper draws upon an explorative case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager.
It aims at depicting the process of IdealX's search for an appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of ''mutualisation'', which is applicable in a different context. Results and implications: This article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the bundling business model, the dual licensing business model and the mutualisation business model) as well as the conditions under which each can be successfully implemented (regarding the type of product developed and the competencies of the firm). The paper also goes beyond the traditional concept of business model used by scholars in the open source literature. Here, a business model is considered not only as a way of generating income (a ''revenue model'' (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenbloom, 2002; Teece, 2007). Consequently, this paper analyses the business models from the point of view of these two components.
Abstract:
In the case of industrial relations research, particularly research that sets out to examine practices within workplaces, the best way to study this real-life context is to work for the organisation. Studies conducted by researchers working within the organisation comprise some of the (broad) field’s classic research (cf. Roy, 1954; Burawoy, 1979). Participant and non-participant ethnographic research provides an opportunity to investigate workplace behaviour beyond the scope of questionnaires and interviews. However, we suggest that the data collected outside a workplace can be just as important as the data collected inside the organisation’s walls. In recent years the introduction of anti-smoking legislation in Australia has meant that people who smoke cigarettes are no longer allowed to do so inside buildings. Not only are smokers forced outside to engage in their habit, but they have to smoke at prescribed distances from doorways, or in some workplaces beyond the property line. This chapter considers the importance of cigarette-smoking employees in ethnographic research. Through data collected across three separate research projects, the chapter argues that smokers, as social outcasts in the workplace, can provide a wealth of important research data. We suggest that smokers also appear more likely to provide stories that contradict the ‘management’ or ‘organisational’ position. Thus, within the haze of smoke, researchers can uncover a level of discontent with the ‘corporate line’ presented inside the workplace. There are several possible explanations for the increased propensity of smokers to provide a contradictory or discontented story. It may be that the researcher is better able to establish a rapport with smokers, as the artificial wall a researcher presents as an outsider is removed. It may also be that a research location physically outside the boundaries of the organisation gives workers the freedom to express their discontent.
The authors offer no definitive answers; rather, this chapter is intended to extend our knowledge of workplace research by highlighting the methodological value of using smokers as research subjects. We present the experience of three separate case studies in which interactions with cigarette smokers provided either important organisational data or a means of entering what Cunnison (1966) referred to as the ‘gossip circle’. The final section of the chapter draws on the evidence to demonstrate how the community of smokers, as social outcasts, is valuable in investigating workplace issues. For researchers and practitioners, these social outcasts may very well prove to be an important barometer of employee attitudes; attitudes that perhaps cannot be measured through traditional staff surveys.
Abstract:
We present a new penalty-based genetic algorithm for the multi-source and multi-sink minimum vertex cut problem, and illustrate the algorithm’s usefulness with two real-world applications. We prove that, by exploiting domain-specific knowledge, the genetic algorithm always produces a feasible solution. The algorithm has been implemented on the example applications and evaluated to show how well it scales as the problem size increases.
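As a hedged illustration of how a penalty-based fitness function for this problem might look (the encoding, penalty value and feasibility check below are assumptions for illustration, not the paper's actual design):

```python
from collections import deque

def separates(adj, removed, sources, sinks):
    """BFS from the sources in the graph with `removed` vertices deleted;
    the candidate is feasible if no sink is reachable."""
    seen = set(s for s in sources if s not in removed)
    q = deque(seen)
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in removed and v not in seen:
                seen.add(v)
                q.append(v)
    return not (seen & set(sinks))

def fitness(adj, candidate, sources, sinks, penalty=100):
    """Penalty-based cost: cut size, plus a large penalty whenever the
    candidate fails to disconnect every source from every sink, so
    infeasible individuals are driven out of the population."""
    cost = len(candidate)
    if not separates(adj, candidate, sources, sinks):
        cost += penalty
    return cost

# Tiny example: s -> a -> t and s -> b -> t; {a, b} is a vertex cut.
adj = {"s": ["a", "b"], "a": ["t"], "b": ["t"], "t": []}
good = fitness(adj, {"a", "b"}, ["s"], ["t"])  # feasible: cost 2
bad = fitness(adj, {"a"}, ["s"], ["t"])        # infeasible: 1 + 100
```

A full GA would evolve bit-vector candidates under this cost; the penalty steers the search toward feasible cuts of minimum size.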
Abstract:
The process of structural health monitoring (SHM) involves monitoring a structure over a period of time using appropriate sensors, extracting damage-sensitive features from the measurements made by those sensors, and analysing these features to determine the current state of the structure. Various techniques are available for structural health monitoring, and acoustic emission (AE) is one that is finding increasing use. Acoustic emission waves are the stress waves generated by the mechanical deformation of materials. AE waves produced inside a structure can be recorded by sensors attached to its surface, and analysis of the recorded signals can locate damage and assess its extent. This paper describes preliminary studies on the application of the AE technique to health monitoring of bridge structures. Crack initiation or structural damage results in wave propagation within the solid, which can take various forms. Propagation of these waves is likely to be affected by the dimensions, surface properties and shape of the specimen; this, in turn, will affect source localisation. Various laboratory test results on source localisation, using pencil lead break tests, are presented. The results from these tests can be expected to aid in enhancing knowledge of the acoustic emission process and in developing an effective bridge diagnostics system.
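As a simple illustration of AE source localisation, the classical two-sensor, one-dimensional arrival-time-difference formula can be sketched as follows (an idealised textbook case assuming a constant wave speed; the dispersion and geometry effects discussed above are ignored):

```python
def linear_ae_source(L, v, dt):
    """1-D source location between two surface sensors at x = 0 and x = L.
    dt is the arrival time at sensor 0 minus the arrival time at sensor L;
    v is the assumed constant wave speed. From t0 = x/v and tL = (L - x)/v,
    dt = (2x - L)/v, so x = (L + v*dt)/2."""
    return (L + v * dt) / 2.0

# Pencil-lead break at x = 0.3 m on a 1 m specimen with v = 5000 m/s:
# arrivals t0 = 0.3/5000 s and tL = 0.7/5000 s, so dt = -8e-5 s.
x = linear_ae_source(1.0, 5000.0, (0.3 - 0.7) / 5000.0)
```

Real plate or beam tests require dispersion-aware wave speeds and 2-D sensor arrays, which is precisely why the laboratory characterisation described above matters.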
Abstract:
The multi-level current reinjection concept described in the literature is well known to produce high-quality AC current waveforms in high-power, high-voltage self-commutating current source converters. This paper proposes a novel reinjection circuit capable of producing a 7-level reinjection current. It is shown that this reinjection current effectively increases the pulse number of the converter to 72. PSCAD/EMTDC simulations validate the functionality of the proposed concept, illustrating its effectiveness on both the AC and DC sides of the converter.
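To give a feel for what a 7-level waveform looks like, the sketch below quantises a reference sinusoid onto seven equally spaced levels (purely illustrative: the actual reinjection current in the paper is derived from the converter topology, not from a sinusoid):

```python
import numpy as np

def seven_level(signal, amplitude=1.0):
    """Quantise a reference waveform onto 7 equally spaced levels
    (steps -3..+3), the staircase shape a multi-level reinjection
    circuit uses to approximate its ideal reinjection current."""
    step = amplitude / 3.0
    return step * np.clip(np.round(signal / step), -3, 3)

t = np.linspace(0, 1, 1000, endpoint=False)
ref = np.sin(2 * np.pi * t)
stair = seven_level(ref)
levels = np.unique(stair)  # the 7 distinct staircase levels
```

More levels mean a finer staircase, which is the mechanism behind the increased effective pulse number reported above.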
Abstract:
This paper reports the application of the multicriteria decision making techniques PROMETHEE and GAIA, and the receptor models PCA/APCS and PMF, to data from an air monitoring site located on the campus of Queensland University of Technology in Brisbane, Australia, operated by the Queensland Environmental Protection Agency (QEPA). The data consisted of the concentrations of 21 chemical species and meteorological data collected between 1995 and 2003. PROMETHEE/GAIA separated the samples into those collected when leaded and when unleaded petrol were used to power vehicles in the region. The number and source profiles of the factors obtained from the PCA/APCS and PMF analyses were compared. There are noticeable differences in the outcomes, possibly because of the non-negativity constraints imposed on the PMF analysis: while PCA/APCS identified 6 sources, PMF resolved 9 factors. Each factor had a distinctive composition suggesting that motor vehicle emissions, controlled burning of forests, secondary sulphate, sea salt and road dust/soil were the most important sources of fine particulate matter at the site. The most plausible locations of the sources were identified by combining the results obtained from the receptor models with meteorological data. The study demonstrated the potential benefits of combining results from multicriteria decision making analysis with those from receptor models to gain insights that could enhance the development of air pollution control measures.
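The factor-analysis step can be illustrated with a minimal PCA sketch on synthetic data (the species profiles and dimensions below are invented for illustration; the study used 21 species and dedicated receptor-modelling software):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic "source profiles" over 6 chemical species, e.g. a
# traffic-like and a sea-salt-like signature (purely illustrative).
profiles = np.array([[5.0, 3.0, 1.0, 0.2, 0.1, 0.0],
                     [0.1, 0.2, 0.5, 2.0, 4.0, 3.0]])
contrib = rng.uniform(0, 1, size=(200, 2))          # daily source strengths
X = contrib @ profiles + rng.normal(0, 0.01, size=(200, 6))  # observed data

# PCA via SVD on the centred data matrix: the number of dominant
# singular values estimates the number of sources, as in PCA/APCS.
Xc = X - X.mean(axis=0)
s = np.linalg.svd(Xc, compute_uv=False)
explained = s**2 / (s**2).sum()
n_sources = int((explained > 0.01).sum())
```

PMF differs in constraining both factor profiles and contributions to be non-negative, which is one reason the two models can resolve different numbers of factors, as noted above.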
Abstract:
The impact of the Internet on our lives has been pervasive. People are increasingly turning to the social interaction available on the Internet to satisfy their needs, whether professional or personal. The Internet offers users fast access to social contacts such as online chat groups and discussion lists, helping us to make connections with others. Online communities are increasingly used by teachers for professional support, guidance and inspiration. These are often organised around subject areas and offer teachers opportunities to develop both personally and professionally. Online communities can serve as a source of continuing professional development for teachers, as they are able to deliver authentic and personalised opportunities for learning. This paper presents the findings of a study of three online communities for teachers. It explores the nature of online community membership and offers some conclusions regarding their potential as a source of professional learning for teachers.
Abstract:
In this paper, we consider a modified anomalous subdiffusion equation with a nonlinear source term, for describing processes that become less anomalous as time progresses through the inclusion of a second fractional time derivative acting on the diffusion term. A new implicit difference method is constructed. Its stability and convergence are discussed using a new energy method. Finally, some numerical examples are given; the numerical results demonstrate the effectiveness of the theoretical analysis.
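For orientation, one representative form of such an equation in the fractional-diffusion literature is given below; the coefficients, symbols and source term are assumptions here, as the abstract does not reproduce the paper's exact formulation:

```latex
\frac{\partial u(x,t)}{\partial t} =
  \left( A\,\frac{\partial^{1-\alpha}}{\partial t^{1-\alpha}}
       + B\,\frac{\partial^{1-\beta}}{\partial t^{1-\beta}} \right)
  \frac{\partial^{2} u(x,t)}{\partial x^{2}}
  + f(u, x, t),
\qquad 0 < \alpha,\ \beta < 1,
```

with the fractional time derivatives taken in the Riemann-Liouville sense; the second fractional derivative term is what moderates the anomalous character of the diffusion as time progresses.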
Abstract:
In this paper, we consider a variable-order fractional advection-diffusion equation with a nonlinear source term on a finite domain. Explicit and implicit Euler approximations for the equation are proposed, and the stability and convergence of these methods are discussed. Moreover, we present a fractional method of lines, a matrix transfer technique, and an extrapolation method for the equation. Some numerical examples are given, and the results demonstrate the effectiveness of the theoretical analysis.
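For orientation, a representative form of a variable-order fractional advection-diffusion equation is sketched below; the coefficients, the operator convention and the order function are assumptions here, not the paper's exact formulation:

```latex
\frac{\partial u(x,t)}{\partial t} =
  -\,v(x,t)\,\frac{\partial u(x,t)}{\partial x}
  + \kappa(x,t)\,\frac{\partial^{\alpha(x,t)} u(x,t)}{\partial x^{\alpha(x,t)}}
  + f(u, x, t),
\qquad 1 < \alpha(x,t) \le 2,
```

posed on a finite domain with appropriate initial and boundary conditions; the spatially and temporally varying order \(\alpha(x,t)\) is what distinguishes the variable-order setting from constant-order fractional diffusion.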
Abstract:
In this paper, a fixed-switching-frequency closed-loop modulation of a voltage-source inverter (VSI) under digital implementation of the modulation process is analyzed and characterized. The sampling frequency of the digital processor is considered to be an integer multiple of the modulation switching frequency. An expression for the modulation design parameter is developed for smooth modulation at a fixed switching frequency. The effects of varying the sampling frequency, switching frequency, and modulation index on the switching condition under closed loop are analyzed. It is shown that the switching condition determined from a continuous-time analysis of the closed-loop modulation will still ensure smooth modulation upon digital implementation. However, the stability properties need to be tested prior to digital implementation, as they deteriorate at lower sampling frequencies. The maximum closed-loop modulation index should be used when determining the design parameters for smooth modulation. In particular, a detailed analysis has been carried out by varying the control gain in the sliding-mode control of a two-level VSI. The proposed analysis of the closed-loop modulation of the VSI has been verified for the operation of a distribution static compensator, and the theoretical results are validated experimentally on both single- and three-phase systems.