518 results for source encoder identification
Abstract:
The notion of designing with change is a foundational theoretical premise for much of landscape architecture, notably through the discipline's engagement with ecology, particularly since the work of Ian McHarg in the 1960s and his key text Design with Nature. However, while most if not all texts in landscape architecture cite this engagement with change in theory, few go further than citation, and when they do, their methods seem fixated on empirical, quantitative scientific tools rather than the tools of design in an architectural sense, as implied by the discipline's name, landscape architecture.
Abstract:
The measurement of submicrometre (diameter < 1.0 μm) and ultrafine (diameter < 0.1 μm) particle number concentrations has attracted attention over the last decade because the potential health impacts associated with exposure to these particles can be more significant than those due to exposure to larger particles. At present, ultrafine particles are not regularly monitored and are yet to be incorporated into air quality monitoring programs. As a result, very few studies have analysed long-term and spatial variations in ultrafine particle concentration, and none have done so in Australia. To address this gap in scientific knowledge, the aim of this research was to investigate the long-term trends and seasonal variations in particle number concentrations in Brisbane, Australia. Data collected over a five-year period were analysed using weighted regression models. Monthly mean concentrations in the morning (6:00-10:00) and the afternoon (16:00-19:00) were plotted against time in months, using the monthly variance as the weights. During the five-year period, submicrometre and ultrafine particle concentrations increased in the morning by 105.7% and 81.5% respectively, whereas in the afternoon there was no significant trend. The morning concentrations were associated with fresh traffic emissions and the afternoon concentrations with the background. The statistical tests applied to the seasonal models, on the other hand, indicated that there was no seasonal component. The spatial variation in size distribution in a large urban area was investigated using particle number size distribution data collected at nine different locations during different campaigns. The size distributions were represented by the modal structures and cumulative size distributions. Particle number peaked at around 30 nm, except at an isolated site dominated by diesel trucks, where it peaked at around 60 nm.
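The weighted-regression step described above can be sketched in a few lines: monthly mean concentrations are regressed against a month index with weights derived from the monthly variance. The numbers below are hypothetical, and inverse-variance weighting is assumed here as one common convention; the study's exact weighting scheme may differ.

```python
def wls_fit(x, y, w):
    """Weighted least-squares fit of y = a + b*x with weights w."""
    sw = sum(w)
    xb = sum(wi * xi for wi, xi in zip(w, x)) / sw
    yb = sum(wi * yi for wi, yi in zip(w, y)) / sw
    num = sum(wi * (xi - xb) * (yi - yb) for wi, xi, yi in zip(w, x, y))
    den = sum(wi * (xi - xb) ** 2 for wi, xi in zip(w, x))
    b = num / den
    return yb - b * xb, b  # intercept, slope

# Hypothetical monthly mean concentrations; weights taken as the
# inverse of the monthly variance.
months = [0, 1, 2, 3, 4, 5]
means = [10.0, 10.8, 12.1, 12.9, 14.2, 15.0]
variances = [1.0, 1.2, 0.8, 1.1, 0.9, 1.0]
weights = [1.0 / v for v in variances]
intercept, slope = wls_fit(months, means, weights)
```

A positive fitted slope corresponds to the kind of increasing morning trend reported in the abstract.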
It was found that ultrafine particles contributed 82%-90% of the total particle number. At the sites dominated by petrol vehicles, nanoparticles (< 50 nm) contributed 60%-70% of the total particle number, and at the site dominated by diesel trucks they contributed 50%. Although the sampling campaigns took place during different seasons and were of varying duration, these variations did not have an effect on the particle size distributions. The results suggested that the distributions were instead affected by differences in traffic composition and distance to the road. To investigate the occurrence of nucleation events, that is, secondary particle formation from gaseous precursors, particle size distribution data collected over a 13-month period during five different campaigns were analysed. The study area was a complex urban environment influenced by anthropogenic and natural sources. The study introduced a new application of time series differencing for the identification of nucleation events. To evaluate the conditions favourable to nucleation, the meteorological conditions and gaseous concentrations prior to and during nucleation events were recorded. Gaseous concentrations did not exhibit a clear pattern of change. It was also found that nucleation was associated with sea breezes and long-range transport. The implication of this finding is that whilst vehicles are the most important source of ultrafine particles, sea breezes and aged gaseous emissions play a more important role in secondary particle formation in the study area.
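The core of the time-series-differencing idea can be illustrated very simply: first-difference the concentration series and flag sample-to-sample jumps above a threshold as candidate nucleation bursts. This is a minimal sketch, not the study's actual procedure; the function name, threshold and data are all hypothetical.

```python
def flag_bursts(series, threshold):
    """First-difference a concentration series and flag rapid rises.

    A step increase between consecutive samples that exceeds
    `threshold` is treated as a candidate nucleation burst; the
    returned indices mark where each burst begins.
    """
    diffs = [b - a for a, b in zip(series, series[1:])]
    return [i + 1 for i, d in enumerate(diffs) if d > threshold]

# Hypothetical particle number concentrations (particles/cm^3).
conc = [8000, 8200, 8100, 8300, 21000, 25000, 24000, 9000, 8800]
events = flag_bursts(conc, threshold=5000)  # → [4]
```

A real analysis would add constraints (e.g. sustained elevation, growth in the nucleation-mode diameter), but the differencing step is the detection trigger.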
Abstract:
Vibration-based damage identification methods examine changes in primary modal parameters or in quantities derived from them. As one method may have advantages over another under particular circumstances, a multi-criteria approach is proposed. Case studies are conducted separately on beam, plate and plate-on-beam structures. Using numerically simulated modal data obtained through finite element analysis software, algorithms based on changes in flexibility and strain energy before and after damage are developed and used as indices for assessing the state of structural health. Results show that the proposed multi-criteria method is effective in identifying damage in these structures.
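One of the two indices mentioned, the flexibility change, can be sketched from its standard modal form: the flexibility matrix is approximated as F = Σ φᵢφᵢᵀ/ωᵢ², and the per-DOF index is the maximum absolute column change between the healthy and damaged states. This is a generic textbook formulation, not the paper's specific algorithm, and the example data are hypothetical.

```python
def flexibility(freqs, modes):
    """Modal flexibility F = sum over modes of phi phi^T / w^2, built
    from circular frequencies w [rad/s] and mass-normalised mode
    shape vectors phi (a truncated-mode approximation)."""
    n = len(modes[0])
    F = [[0.0] * n for _ in range(n)]
    for w, phi in zip(freqs, modes):
        for j in range(n):
            for k in range(n):
                F[j][k] += phi[j] * phi[k] / w ** 2
    return F

def flexibility_change(freqs_h, modes_h, freqs_d, modes_d):
    """Per-DOF damage index: maximum absolute column change of F
    between the healthy (h) and damaged (d) states."""
    Fh = flexibility(freqs_h, modes_h)
    Fd = flexibility(freqs_d, modes_d)
    n = len(Fh)
    return [max(abs(Fd[j][k] - Fh[j][k]) for j in range(n)) for k in range(n)]

# Hypothetical 3-DOF data: damage lowers the natural frequencies slightly.
freqs_h = [10.0, 26.0]
modes_h = [[0.5, 1.0, 0.5], [1.0, 0.0, -1.0]]
freqs_d = [9.2, 25.5]
modes_d = [[0.52, 1.0, 0.5], [1.0, 0.05, -1.0]]
index = flexibility_change(freqs_h, modes_h, freqs_d, modes_d)
```

Because low-frequency modes dominate the 1/ω² sum, the index is sensitive to exactly the stiffness losses that damage introduces, which is why flexibility-based indices pair well with strain-energy-based ones in a multi-criteria scheme.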
Abstract:
Although the benefits of service orientation are prevalent in the literature, the review, analysis, and evaluation of 30 existing service analysis approaches presented in this paper show that a comprehensive approach to the identification and analysis of both business services and supporting software services is missing. Based on this evaluation of existing approaches and on additional sources, we close this gap by proposing an integrated, consolidated approach to business and software service analysis that combines and extends the strengths of the examined methodologies.
Abstract:
Key topics: Since the birth of the Open Source movement in the mid-1980s, open source software has become more and more widespread. Among others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow use of the software in exchange for a fee, open source licenses grant users more rights, such as free use, copying, modification and distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance that underlies such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007), collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008), issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006), public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006), technological competition issues related to standard battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007), and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users.
On this topic, articles show that commercial activity based on open source software is possible, as they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activity: providers of packaged open source solutions, IT services and software engineering firms, and open source software publishers. However, the business model implications differ for each of these categories: the activities of providers of packaged solutions and of IT services and software engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as this issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. Finally, the literature identifies and depicts only two generic types of business models for open source software publishers: the ''bundling'' business model (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are: (1) to explore the contexts in which the two generic business models described in the literature can be implemented successfully, and (2) to depict an additional business model for open source software publishers which can be used in a different context. To do so, this paper draws upon an explorative case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager.
It aims at depicting IdealX's search for an appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which these generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of ''mutualisation'', which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the bundling business model, the dual licensing business model and the mutualisation business model) as well as the conditions under which each can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes beyond the traditional concept of business model used by scholars in the open source literature. In this article, a business model is considered not only as a way of generating income (a ''revenue model'' (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenbloom, 2002; Teece, 2007). Consequently, this paper analyses business models from the point of view of these two components.
Abstract:
In the case of industrial relations research, particularly research that sets out to examine practices within workplaces, the best way to study this real-life context is to work for the organisation. Studies conducted by researchers working within the organisation comprise some of the (broad) field's classic research (cf. Roy, 1954; Burawoy, 1979). Participant and non-participant ethnographic research provides an opportunity to investigate workplace behaviour beyond the scope of questionnaires and interviews. However, we suggest that data collected outside a workplace can be just as important as data collected inside the organisation's walls. In recent years the introduction of anti-smoking legislation in Australia has meant that people who smoke cigarettes are no longer allowed to do so inside buildings. Not only are smokers forced outside to engage in their habit, but they have to smoke at prescribed distances from doorways, or in some workplaces outside the property line. This chapter considers the importance of cigarette-smoking employees in ethnographic research. Through data collected across three separate research projects, the chapter argues that smokers, as social outcasts in the workplace, can provide a wealth of important research data. We suggest that smokers also appear more likely to provide stories that contradict the management or organisational position. Thus, within the haze of smoke, researchers can uncover a level of discontent with the corporate line presented inside the workplace. There are several possible explanations for the increased propensity of smokers to provide a contradictory or discontented story. It may be that the researcher is better able to establish a rapport with smokers, as there is a removal of the artificial wall a researcher presents as an outsider. It may also be that a research location physically outside the boundaries of the organisation provides workers with the freedom to express their discontent.
The authors offer no definitive answers; rather, this chapter is intended to extend our knowledge of workplace research by highlighting the methodological value of using smokers as research subjects. We present the experience of three separate case studies in which interactions with cigarette smokers provided either important organisational data or, alternatively, a means of entering what Cunnison (1966) referred to as the ''gossip circle''. The final section of the chapter draws on the evidence to demonstrate how the community of smokers, as social outcasts, is valuable in investigating workplace issues. For researchers and practitioners, these social outcasts may very well prove to be an important barometer of employee attitudes; attitudes that perhaps cannot be measured through traditional staff surveys.
Abstract:
An estimation of costs for maintenance and rehabilitation is subject to variation due to the uncertainties of input parameters. This paper presents the results of an analysis to identify the input parameters that affect the prediction of variation in road deterioration. Road data obtained from 1688 km of a national highway located in the tropical northeast of Queensland, Australia were used in the analysis. Data were analysed using a probability-based method, the Monte Carlo simulation technique, and HDM-4's roughness prediction model. The results indicated that, among the input parameters, the variability of pavement strength, rut depth, annual equivalent axle load and initial roughness affected the variability of the predicted roughness. The second part of the paper presents an analysis assessing the variation in cost estimates due to the combined variability of the identified critical input parameters.
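The Monte Carlo step can be sketched generically: sample the uncertain inputs (pavement strength, rut depth, axle loading, initial roughness) from assumed distributions, push each sample through a roughness model, and summarise the spread of the predictions. The model below is a deliberately simplified stand-in, not HDM-4's roughness equation, and every distribution parameter is hypothetical.

```python
import random

def roughness_model(strength, rut, axle_load, iri0):
    """Toy stand-in for a roughness progression model (NOT HDM-4):
    weaker pavement, deeper rutting, heavier axle loading and higher
    initial roughness all increase the predicted roughness."""
    return iri0 + 0.05 * axle_load / strength + 0.02 * rut

def monte_carlo(n=10_000, seed=1):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        strength = max(rng.gauss(5.0, 0.8), 1.0)  # structural number (clamped)
        rut = rng.gauss(6.0, 1.5)                 # rut depth, mm
        axle = rng.gauss(1.2, 0.3)                # equivalent axle load, MESA/year
        iri0 = rng.gauss(3.0, 0.4)                # initial roughness, IRI m/km
        samples.append(roughness_model(strength, rut, axle, iri0))
    mean = sum(samples) / n
    sd = (sum((s - mean) ** 2 for s in samples) / (n - 1)) ** 0.5
    return mean, sd

mean_iri, sd_iri = monte_carlo()
```

Repeating the simulation with one input held fixed at its mean shows how much of the output spread that input contributes, which is essentially how critical parameters are screened.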
Abstract:
The manufacture, construction and use of buildings and building materials make a significant environmental impact internally (inside the building), locally (in the neighbourhood) and globally. Life cycle assessment (LCA) methodology is applied to evaluate the environmental impact of buildings and building materials. One of the major applications of LCA is to identify the key issues of a product system from cradle to grave. Key issues identified in an LCA point one in the right direction when assessing the environmental aspects of a product system and also help to identify areas for improving the environmental performance of a product. The purpose of this paper is to suggest two methods for identifying key issues using an integrated tool (LCADesign), which has been developed to provide a method of determining the best alternative for reducing the environmental impacts of a building or building materials, and to compare both methods in a case study. This paper assists designers and marketers of buildings and building materials in their decision making by providing information on the activities or alternatives identified as key issues for environmental impacts.
Abstract:
We present a new penalty-based genetic algorithm for the multi-source, multi-sink minimum vertex cut problem, and illustrate the algorithm's usefulness with two real-world applications. We prove that the genetic algorithm always produces a feasible solution by exploiting some domain-specific knowledge. The genetic algorithm has been implemented on the example applications and evaluated to show how well it scales as the problem size increases.
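The general shape of a penalty-based GA for this problem can be sketched as follows: one bit per removable (non-terminal) vertex, with infeasible cuts (ones that leave some source connected to some sink) paying a penalty larger than any possible cut size. This is an illustrative sketch only; unlike the paper's algorithm, it does not guarantee feasibility through domain-specific knowledge, and the operators, parameters and example graph are all hypothetical.

```python
import random

def separates(adj, removed, sources, sinks):
    """DFS from the sources avoiding `removed`: True if no sink is reachable."""
    seen = {s for s in sources if s not in removed}
    stack = list(seen)
    while stack:
        v = stack.pop()
        if v in sinks:
            return False
        for u in adj[v]:
            if u not in removed and u not in seen:
                seen.add(u)
                stack.append(u)
    return True

def ga_vertex_cut(adj, sources, sinks, pop_size=40, gens=120, seed=0):
    rng = random.Random(seed)
    inner = [v for v in adj if v not in sources and v not in sinks]
    penalty = len(inner) + 1  # infeasibility always costs more than any cut

    def cost(bits):
        removed = {v for v, b in zip(inner, bits) if b}
        c = sum(bits)
        if not separates(adj, removed, sources, sinks):
            c += penalty
        return c

    population = [[rng.randint(0, 1) for _ in inner] for _ in range(pop_size)]
    for _ in range(gens):
        population.sort(key=cost)
        nxt = population[:2]                                   # elitism
        while len(nxt) < pop_size:
            a, b = rng.sample(population[: pop_size // 2], 2)  # truncation selection
            child = [x if rng.random() < 0.5 else y for x, y in zip(a, b)]
            if rng.random() < 0.3:                             # bit-flip mutation
                i = rng.randrange(len(inner))
                child[i] ^= 1
            nxt.append(child)
        population = nxt
    best = min(population, key=cost)
    return {v for v, b in zip(inner, best) if b}

# Tiny example: every source-to-sink path passes through vertex "c".
adj = {
    "s1": ["a"], "s2": ["a", "b"],
    "a": ["s1", "s2", "c"], "b": ["s2", "c"],
    "c": ["a", "b", "t1", "t2"],
    "t1": ["c"], "t2": ["c"],
}
cut = ga_vertex_cut(adj, sources={"s1", "s2"}, sinks={"t1", "t2"})
```

A repair operator that converts infeasible bitstrings into feasible cuts would be one way to obtain the feasibility guarantee the abstract mentions.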
Abstract:
The process of structural health monitoring (SHM) involves monitoring a structure over a period of time using appropriate sensors, extracting damage-sensitive features from the measurements made by those sensors, and analysing these features to determine the current state of the structure. Various techniques are available for the structural health monitoring of structures, and acoustic emission (AE) is one that is finding increasing use. Acoustic emission waves are the stress waves generated by the mechanical deformation of materials. AE waves produced inside a structure can be recorded by sensors attached to its surface, and analysis of these recorded signals can locate and assess the extent of damage. This paper describes preliminary studies on the application of the AE technique to the health monitoring of bridge structures. Crack initiation or structural damage results in wave propagation in solids, which can take place in various forms. Propagation of these waves is likely to be affected by the dimensions, surface properties and shape of the specimen, and this, in turn, will affect source localization. Various laboratory test results are presented on source localization using pencil lead break tests. The results from these tests can be expected to aid in enhancing knowledge of the acoustic emission process and in developing an effective bridge structure diagnostic system.
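In its simplest form, AE source localization rests on arrival-time differences. For a one-dimensional member with two sensors, the standard time-of-arrival relation gives the source position in closed form; the sketch below shows that textbook case (the geometry and wave speed are hypothetical, and real bridge members require 2-D or 3-D arrays and dispersion corrections).

```python
def locate_1d(length, dt, wave_speed):
    """Linear (two-sensor) AE source localization.

    Sensors sit at x = 0 and x = length; dt = t0 - t1 is the arrival-time
    difference between them. With t0 = x / v and t1 = (length - x) / v,
    solving for the source position gives x = (length + v * dt) / 2.
    """
    return (length + wave_speed * dt) / 2

# Pencil-lead break at 0.3 m on a 1.0 m bar, wave speed 5000 m/s:
# t0 = 0.3/5000 s, t1 = 0.7/5000 s, so dt = -8e-5 s.
x = locate_1d(1.0, -8e-5, 5000.0)  # → 0.3
```

The specimen effects the abstract describes (dimensions, surface, shape) enter through the effective wave speed and path, which is why pencil lead breaks at known positions are used to calibrate before locating real events.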
Abstract:
The engagement behaviour of 1,524 student-enrolments (students) in five first-year units was monitored, and 608 (39.9%) were classified as at risk using the criterion of not submitting, or failing, their first assignment. Of these, 327 (53.8%) were successfully contacted (i.e., spoken to by phone) and provided with advice and/or referral to learning and personal support services, while the remaining 281 (46.2%) could not be contacted. Nine hundred and sixteen students (60.1%) were classified as not at risk. Overall, the at-risk group who were contacted achieved significantly higher end-of-semester final grades than, and persisted (completed the unit) at more than twice the rate of, the at-risk group who were not contacted. There were variations among the units, which were explained by the timing of the first assignment, specific teaching-learning processes and the structure of the curriculum. Implications for curriculum design and for supporting first-year students within a personal, social and academic framework are discussed.
Abstract:
The multi-level current reinjection concept described in the literature is well known to produce high-quality AC current waveforms in high-power, high-voltage self-commutating current source converters. This paper proposes a novel reinjection circuit capable of producing a 7-level reinjection current. It is shown that this reinjection current effectively increases the pulse number of the converter to 72. PSCAD/EMTDC simulation validates the functionality of the proposed concept, illustrating its effectiveness on both the AC and DC sides of the converter.
Abstract:
An informed citizenry is essential to the effective functioning of democracy. In most modern liberal democracies, citizens have traditionally looked to the media as the primary source of information about socio-political matters. In our increasingly mediated world, it is critical that audiences be able to effectively and accurately use the media to meet their information needs. Media literacy, the ability to access, understand, evaluate and create media content, is therefore a vital skill for a healthy democracy. The past three decades have seen the rapid expansion of the information environment, particularly through Internet technologies, and media usage patterns have changed dramatically as a result. Blogs and websites are now popular sources of news and information, and are for some sections of the population likely to be the first, and possibly only, information source accessed when information is required. What are the implications for media literacy in such a diverse and changing information environment? The Alexandria Manifesto stresses the link between libraries, a well-informed citizenry and effective governance, so how do these changes impact on libraries? This paper considers the role libraries can play in developing media-literate communities, and explores the ways in which traditional media literacy training may be expanded to better equip citizens for new media technologies. Drawing on original empirical research, this paper highlights a key shortcoming of existing media literacy approaches: that of overlooking the importance of needs identification as an initial step in media selection. Self-awareness of one's actual information need is not automatic, as can be witnessed daily at reference desks in libraries the world over. Citizens very often do not know what it is that they need when it comes to information.
Without this knowledge, selecting the most appropriate information source from the vast range available becomes an uncertain, possibly even random, enterprise. Incorporating reference-interview-style training into media literacy education, whereby individuals develop the skills to interrogate themselves about their underlying information needs, will enhance media literacy approaches. This increased focus on the needs of the individual will also push media literacy education towards a more constructivist methodology. The paper also stresses the importance of media literacy training for adults. Media literacy education received in school or even university cannot be expected to retain its relevance over time in our rapidly evolving information environment. Further, constructivist teaching approaches highlight the importance of context to the learning process; thus it may be more effective to offer media literacy education relating to news media use to adults, whilst school-based approaches focus on types of media more relevant to young people, such as entertainment media. Librarians are ideally placed to offer such community-based media literacy education for adults. They already understand, through their training and practice of the reference interview, how to identify underlying information needs. Further, libraries are placed within community contexts, where the everyday practice of media literacy occurs. The Alexandria Manifesto stresses the link between libraries, a well-informed citizenry and effective governance. It is clear that libraries have a role to play in fostering media literacy within their communities.