412 results for Averaging Principle
Abstract:
This paper outlines the methods and outcomes of a study into equity management strategies in Australian private sector organisations reporting to the Equal Opportunity for Women in the Workplace Agency. Reports from 1976 organisations indicate eleven key factors characterising equity management in Australia. The study highlights differences within previously identified social structural policies, temperamental and opportunity policies and identifies a further policy type, categorised as “support policies”. Differences have also been identified in relation to distribution structures, suggesting that gender is not the sole consideration in determining equity management strategies. The principle of distribution also figures strongly in equity management implementation.
Abstract:
In the early 1990s the University of Salford was typical of most pre-1992 universities in that, whilst students provided much of its income, little attention was paid to pedagogy. As Warren Piper (1994) observed, university teachers were professional in their subject areas but generally did not seek to acquire a pedagogy of HE. This was the case at Salford. Courses were efficiently run but only a minority of staff were engaged in actively considering learning and teaching issues. Instead staff time was spent on research and commercial activity.

In the mid-1990s the teaching environment began to change significantly. As well as the Dearing report and the advent of the QAA and teaching quality reviews, Salford was already experiencing changes in the characteristics of its student body. Widening access was on our agenda before it was so predominant nationally. With increasing numbers and heterogeneity of students as well as these external factors, new challenges were facing the University and its teaching domain.

This paper describes how a culture which values teaching, learning and pedagogic inquiry is being created in the university. It then focuses on parts of this process specific to the Faculty of Business and Informatics, namely the Faculty's Learning and Teaching Research Network and the establishment of the Centre for Construction Education in the School of Construction and Property Management.

The Faculty of Business and Informatics' Learning and Teaching Research Network aims to raise the profile, quality and volume of pedagogic research across the five schools in the faculty. The initiative is targeted at all academics regardless of previous research experience. We hope to grow and nurture research potential where it exists and to acknowledge and use the existing expertise of subject-based researchers in collaborative ventures. We work on the principle that people are delighted to share what they know but need appreciation and feedback for doing so. A further aim is to surface and celebrate the significant amount of tacit knowledge in the area of pedagogy, evidenced by the strength of student and employer feedback in many areas of the faculty's teaching.

The Faculty embraces generic and core management expertise but also includes applied management disciplines in information systems and construction and property management, where internationally leading research activities and networked centres of excellence have been established. Drawing from this experience, and within the context of the Faculty network, a Centre for Construction Education is being established with key international external partners to develop a sustainable business model of an enterprising pedagogic centre that can undertake useful research to underpin teaching in the Faculty whilst offering sustainable business services to allow it to benefit from pump-priming grant funding.

Internal and external networking are important elements in our plans and ongoing work. Key to this are our links with the LTSN subject centres (BEST and CEBE) and the LTSN generic centre. The paper discusses networking as a concept and gives examples of practices which have proved useful in this context.

The academic influences on our approach are also examined. Dixon's (2000) work examining how a range of companies succeed through internal knowledge sharing has provided a range of transferable practices. We also examine the notion of dialogue in this context, defined by Ballantyne (1999) as 'The interactive human process of reasoning together which comes into being through interactions based on spontaneity or need and is enabled by trust'. Social constructionist principles of Practical Authorship (Shotter, 1993; Pavlica, Holman and Thorpe, 1998) have also proved useful in developing our perspective on learning and knowledge creation within our community of practice.
Abstract:
The release of ultrafine particles (UFP) from laser printers and office equipment was analyzed using a particle counter with high time resolution (FMPS; Fast Mobility Particle Sizer), together with appropriate mathematical models. Measurements were carried out in a 1 m³ chamber, a 24 m³ chamber and an office. The time-dependent emission rates were calculated for these environments using a deconvolution model, after which the total number of emitted particles was calculated. The total numbers of released particles were found to be independent of the environmental parameters and were therefore, in principle, appropriate for the comparison of different printers. On the basis of the time-dependent emission rates, “initial burst” emitters and constant emitters could also be distinguished. In the case of an “initial burst” emitter, comparison with other devices is generally affected by strong variations between individual measurements. When conducting exposure assessments for UFP in an office, the spatial distribution of the particles also had to be considered. In this work, the spatial distribution was predicted on a case-by-case basis using CFD simulation.
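As a rough illustration of the deconvolution step mentioned above, the minimal Python sketch below recovers a time-dependent emission rate from chamber concentration data using a simple well-mixed mass-balance model, dC/dt = E(t)/V - k_loss*C. It is not the authors' model; the chamber volume, loss coefficient and concentration profile are invented assumptions, not values from the study.

import numpy as np

def emission_rate(t, c, volume_m3, k_loss_per_s):
    """Estimate E(t) in particles/s from a concentration series c (#/cm^3)."""
    c_per_m3 = c * 1e6                    # convert #/cm^3 to #/m^3
    dc_dt = np.gradient(c_per_m3, t)      # numerical time derivative
    return volume_m3 * (dc_dt + k_loss_per_s * c_per_m3)

def total_emitted(t, c, volume_m3, k_loss_per_s):
    """Integrate E(t) over the measurement (trapezoidal rule) to get the total particle number."""
    e = emission_rate(t, c, volume_m3, k_loss_per_s)
    return np.sum(0.5 * (e[1:] + e[:-1]) * np.diff(t))

# Illustrative use with synthetic data for a 1 m^3 chamber.
t = np.linspace(0, 600, 601)                      # 10 minutes at 1 s resolution
c = 5e3 * np.exp(-((t - 60) / 90) ** 2) + 100     # a burst-like concentration profile
print(total_emitted(t, c, volume_m3=1.0, k_loss_per_s=1e-3))

The total obtained this way is, in principle, insensitive to ventilation and deposition provided the loss coefficient is characterised correctly, which is what makes such totals suitable for comparing printers across different test environments.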
Abstract:
'The Millennial Adolescent' offers contemporary, stimulating insights for those currently teaching as well as those preparing to teach. This book investigates the characteristics of Generation Y, using students' own voices, generational theory and case studies. The text is structured around the principle that effective teachers need to know who they are teaching as well as what to teach, how to teach it, and how to assess the outcome. Using generational theory, 'The Millennial Adolescent' investigates the characteristics of Generation Y, or the Millennial Generation, and points out what all teachers need to know about working with this current generation of students, who are described in a number of ways: digital natives, team-oriented, confident, multi-taskers, high achievers, and a generation unlike any other. The book contains well-known frameworks for developing understandings about adolescents, blended and contrasted with a contemporary socio-cultural construction of adolescence, set in our particular time, era and society. This book reflects the uniqueness of Australian contexts, while connecting with international trends and global patterns. Engaging and full of insights, this book is essential reading for all professionals dealing with adolescents.
Abstract:
This dissertation by publication, which focuses on gender and the Australian federal parliament, has resulted in the submission of three refereed journal articles. Data for the study were obtained from 30 semi-structured interviews undertaken in 2006 with fifteen (15) male and fifteen (15) female members of the Australian parliament. The first of the articles is methodological and has been accepted for publication in the Australian Journal of Political Science. The paper argues that feminist political science is guided by five important principles. These are placing gender at the centre of the research, giving emphasis to women's voice, challenging the public/private divide, using research to transform society and taking a reflexive approach to positionality. It is the latter principle, the importance of taking a reflexive approach to research, that I explore in the paper. Through drawing on my own experiences as a member of the House of Representatives (Forde 1987-1996), I reflexively investigate the intersections between my background and my identity as a researcher. The second of the articles views the data through the lens of Acker's (1990) notion of the 'gendered organization', which posits that there are four dimensions by which organizations are gendered. These are via the division of labour, through symbols, images and ideologies, by workplace interactions and through the gendered components of individual identity. In this paper, which has been submitted to the British Journal of Political Science, each of Acker's (1990) dimensions is examined in terms of the data from interviews with male and female politicians. The central question investigated is thus: to what extent does the Australian parliament conform to Acker's (1990) concept of the 'gendered organization'? The third of the papers focuses specifically on data from interviews with the 15 male politicians and investigates how they view gender equality and the Australian parliament. The article, which has been submitted to the European Journal of Political Science, asks to what extent contemporary male politicians view the Australian parliament as gendered. Discourse analysis, that is, 'ways of viewing' (Bacchi, 1999, p. 40), is used as an approach to analyse the data. Three discursive frameworks by which male politicians view gender in the Australian parliament are identified. These are: that the parliament is gendered as masculine but this is unavoidable; that the parliament is gendered as feminine and women are actually advantaged; and that the parliament is gender neutral and gender is irrelevant. It is argued that collectively these framing devices operate to mask the many constraints which exist to marginalise women from political participation and to undermine attempts to address women's political disadvantage as political participants. The article concludes by highlighting the significance of the paper beyond the Australian context and calling for further research which names and critiques political men and their discourses on gender and parliamentary practices and processes.
Abstract:
Background: Zoonotic schistosomiasis japonica is a major public health problem in China. Bovines, particularly water buffaloes, are thought to play a major role in the transmission of schistosomiasis to humans in China. Preliminary results (1998–2003) of a praziquantel (PZQ)-based pilot intervention study we undertook provided proof of principle that water buffaloes are major reservoir hosts for S. japonicum in the Poyang Lake region, Jiangxi Province. Methods and Findings: Here we present the results of a cluster-randomised intervention trial (2004–2007) undertaken in Hunan and Jiangxi Provinces, with increased power and more general applicability to the lake and marshland regions of southern China. The trial involved four matched pairs of villages, with one village within each pair randomly selected as a control (human PZQ treatment only), leaving the other as the intervention (human and bovine PZQ treatment). A sentinel cohort of people to be monitored for new infections for the duration of the study was selected from each village. Results showed that combined human and bovine chemotherapy with PZQ had a greater effect on human incidence than human PZQ treatment alone. Conclusions: The results from this study, supported by previous experimental evidence, confirm that bovines are the major reservoir host of human schistosomiasis in the lake and marshland regions of southern China, and reinforce the rationale for the development and deployment of a transmission-blocking anti-S. japonicum vaccine targeting bovines.
Abstract:
This document provides a review of international and national practices in investment decision support tools in road asset management. Efforts were concentrated on identifying the analytic frameworks, evaluation methodologies and criteria adopted by current tools. Emphasis was also given to how current approaches support Triple Bottom Line decision-making. Benefit Cost Analysis and Multiple Criteria Analysis are the principal methodologies for supporting decision-making in Road Asset Management. The complexity of the applications shows significant differences in international practices. There is continuing discussion amongst practitioners and researchers regarding which one is more appropriate in supporting decision-making. It is suggested that the two approaches should be regarded as complementary rather than competitive means. Multiple Criteria Analysis may be particularly helpful in the early stages of project development, such as strategic planning. Benefit Cost Analysis is used most widely for project prioritisation and for selecting the final project from amongst a set of alternatives.

The Benefit Cost Analysis approach is a useful tool for investment decision-making from an economic perspective. An extension of the approach, which includes social and environmental externalities, is currently used to support Triple Bottom Line decision-making in the road sector. However, attention should be given to several issues in its application. First of all, there is a need to reach a degree of commonality in considering social and environmental externalities, which may be achieved by aggregating best practices. At different decision-making levels, the detail with which externalities are considered should differ. It is intended to develop a generic framework to coordinate the range of existing practices. The standard framework will also be helpful in reducing double counting, which appears in some current practices. Caution should also be exercised regarding the methods of determining the value of social and environmental externalities. A number of methods, such as market price, resource costs and Willingness to Pay, are found in the review. The use of unreasonable monetisation methods in some cases has discredited Benefit Cost Analysis in the eyes of decision makers and the public. Some social externalities, such as employment and regional economic impacts, are generally omitted in current practices. This is due to the lack of information and credible models. It may be appropriate to consider these externalities in qualitative form in a Multiple Criteria Analysis. Consensus has been reached on considering noise and air pollution in international practices; however, Australian practices generally omit these externalities. Equity is an important consideration in Road Asset Management. The considerations are either between regions or between social groups defined by, for example, income, age, gender or disability. In current practice, there is no well-developed quantitative measure for equity issues. More research is needed to target this issue.

Although Multiple Criteria Analysis has been used for decades, there is no generally accepted framework for the choice of modelling methods and the treatment of various externalities. The result is that different analysts are unlikely to reach consistent conclusions about a policy measure. In current practices, some favour methods which are able to prioritise alternatives, such as Goal Programming, Goal Achievement Matrix and the Analytic Hierarchy Process. Others simply present the various impacts to decision-makers to characterise the projects. Weighting and scoring systems are critical in most Multiple Criteria Analyses; however, the processes of assessing weights and scores have been criticised as highly arbitrary and subjective. It is essential that the process be as transparent as possible. Obtaining weights and scores by consulting local communities is a common practice, but is likely to result in bias towards local interests. An interactive approach has the advantage of helping decision-makers elaborate their preferences; however, the computational burden may cause decision-makers to lose interest during the solution process for a large-scale problem, say a large state road network. Current practices tend to use cardinal or ordinal scales to measure non-monetised externalities. Distorted valuations can occur where variables measured in physical units are converted to scales. For example, if decibels of noise are converted to a scale of -4 to +4 with a linear transformation, the difference between 3 and 4 may represent a far greater increase in discomfort to people than the increase from 0 to 1. It is therefore suggested that different weights be assigned to individual scores. Due to overlapping goals, the problem of double counting also appears in some Multiple Criteria Analyses. The situation can be improved by carefully selecting and defining investment goals and criteria. Other issues, such as the treatment of time effects and the incorporation of risk and uncertainty, have been given scant attention in current practices. This report suggests establishing a common analytic framework to deal with these issues.
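As a concrete illustration of the weighted-scoring step in a Multiple Criteria Analysis, the sketch below ranks three hypothetical road projects against five criteria. The criteria, weights and scores are invented assumptions, not values from the review; in practice the weights and scores would come from the (ideally transparent) elicitation process discussed above.

import numpy as np

criteria = ["agency cost", "travel time", "safety", "noise", "air quality"]
weights = np.array([0.30, 0.25, 0.20, 0.15, 0.10])   # should sum to 1

# Rows are candidate projects, columns are scores on each criterion
# (0-10 scale, higher is better after converting cost-type criteria).
scores = np.array([
    [6, 7, 5, 4, 6],   # Project A
    [8, 5, 6, 7, 5],   # Project B
    [5, 8, 7, 6, 7],   # Project C
])

totals = scores @ weights           # weighted sum for each project
for i in np.argsort(-totals):       # best first
    print(f"Project {'ABC'[i]}: weighted score = {totals[i]:.2f}")

Note that the linear weighted sum inherits exactly the scale-distortion problem described above: if a physical variable such as noise is mapped linearly onto the -4 to +4 scale, equal score differences are treated as equally important even when the underlying changes in discomfort are not.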
Abstract:
Key topics: Since the birth of the Open Source movement in the mid-1980s, open source software has become more and more widespread. Amongst others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow use of the software in exchange for a fee, open source licenses grant users more rights, such as free use, copying, modification and distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance that underlies such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007), collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008), issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006), public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006), technological competition issues related to standard battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007), and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users. On this topic, articles show that a commercial activity based on open source software is possible, as they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies. Open source-based companies encompass a wide range of firms with different categories of activities: providers of packaged open source solutions, IT Services & Software Engineering firms, and open source software publishers. However, the business model implications are different for each of these categories: providers of packaged solutions and IT Services & Software Engineering firms base their activities on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as this issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. The literature identifies and depicts only two generic types of business models for open source software publishers: the business model of 'bundling' (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are: (1) to explore in which contexts the two generic business models described in the literature can be implemented successfully and (2) to depict an additional business model for open source software publishers which can be used in a different context.
To do so, this paper draws upon an exploratory case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager. It aims at depicting the process of IdealX's search for an appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of 'mutualisation', which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the business model of bundling, the dual licensing business model and the business model of mutualisation) as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes further than the traditional concept of business model used by scholars in the open source literature. In this article, a business model is not only considered as a way of generating income (a 'revenue model' (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenbloom, 2002; Teece, 2007). Consequently, this paper analyses business models from the point of view of these two components.
Abstract:
Construction is an information-intensive industry in which the accuracy and timeliness of information is paramount. It is observed that the main communication issue in construction is to provide a method to exchange data between the site operation, the site office and the head office. The information needs under consideration are time critical to assist in maintaining or improving efficiency at the jobsite. Without appropriate computing support this may increase the difficulty of problem solving. Many researchers focus their research on the usage of mobile computing devices in the construction industry, and they believe that mobile computers have the potential to solve some of the construction problems that reduce overall productivity. However, to date very limited observation has been conducted in terms of the deployment of mobile computers for construction workers on-site. Providing field workers with accurate, reliable and timely information at the location where it is needed supports effectiveness and efficiency at the job site. Bringing a new technology into the construction industry requires not only a better understanding of the application, but also proper preparation for the allocation of resources such as people and investment. With this in mind, an accurate analysis is needed to provide a clear idea of the overall costs and benefits of the new technology. A cost benefit analysis is a method of evaluating the relative merits of a proposed investment project in order to achieve an efficient allocation of resources. It is a way of identifying, portraying and assessing the factors which need to be considered in making rational economic choices. In principle, a cost benefit analysis is a rigorous, quantitative and data-intensive procedure, which requires identification of all potential effects, categorisation of these effects as costs and benefits, quantitative estimation of the extent of each cost and benefit associated with an action, translation of these into a common metric such as dollars, discounting of future costs and benefits into the terms of a given year, and summation of all costs and benefits to see which is greater. Even though many cost benefit analysis methodologies are available for general assessment, there is no specific methodology that can be applied to analysing the costs and benefits of the application of mobile computing devices on the construction site. Hence, the proposed methodology in this document is predominantly adapted from Baker et al. (2000), Department of Finance (1995), and Office of Investment Management (2005). The methodology is divided into four main stages and then detailed into ten steps. The methodology is provided for the CRC CI 2002-057-C Project: Enabling Team Collaboration with Pervasive and Mobile Computing and can be seen in detail in Section 3.
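To make the discounting and comparison step concrete, the minimal sketch below computes the net present value and benefit-cost ratio for a hypothetical mobile-computing deployment. All cash flows and the 7% discount rate are invented assumptions, not figures from the CRC CI project.

def present_value(cash_flows, rate):
    """Discount a list of yearly amounts (year 0 first) to a present value."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

# Hypothetical five-year horizon: up-front hardware/software cost plus yearly
# support, against yearly productivity benefits at the jobsite.
costs = [120_000, 20_000, 20_000, 20_000, 20_000]
benefits = [0, 60_000, 65_000, 70_000, 70_000]
rate = 0.07

pv_costs = present_value(costs, rate)
pv_benefits = present_value(benefits, rate)
print(f"NPV = {pv_benefits - pv_costs:,.0f}")
print(f"Benefit-cost ratio = {pv_benefits / pv_costs:.2f}")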
Abstract:
A small group of companies including Intel, Microsoft, and Cisco have used "platform leadership" with great effect as a means for driving innovation and accelerating market growth within their respective industries. Prior research in this area emphasizes that trust plays a critical role in the success of this strategy. However, many of the categorizations of trust discussed in the literature tend to ignore or undervalue the fact that trust and power are often functionally equivalent, and that the coercion of weaker partners is sometimes misdiagnosed as collaboration. In this paper, I use case study data focusing on Intel's shift from ceramic/wire-bonded packaging to organic/C4 packaging to characterize the relationships between Intel and its suppliers, and to determine if these links are based on power in addition to trust. The case study shows that Intel's platform leadership strategy is built on a balance of both trust and a relatively benevolent form of power that is exemplified by the company's "open kimono" principle, through which Intel insists that suppliers share detailed financial data and highly proprietary technical information to achieve mutually advantageous objectives. By explaining more completely the nature of these inter-firm linkages, this paper usefully extends our understanding of how platform leadership is maintained by Intel, and contributes to the literature by showing how trust and power can be used simultaneously within an inter-firm relationship in a way that benefits all of the stakeholders.
Abstract:
Since 1995 the buildingSMART International Alliance for Interoperability (buildingSMART) has developed a robust standard called the Industry Foundation Classes (IFC). IFC is an object-oriented data model with a related file format that has facilitated the efficient exchange of data in the development of building information models (BIM). The Cooperative Research Centre for Construction Innovation has contributed to the international effort in the development of the IFC standard, and specifically the reinforced concrete part of the latest IFC 2x3 release. The Industry Foundation Classes have been endorsed by the International Organization for Standardization as a Publicly Available Specification (PAS) under the label ISO/PAS 16739. For more details, go to http://www.tc184-sc4.org/About_TC184-SC4/About_SC4_Standards/ The current IFC model covers the building itself to a useful level of detail. The next stage of development for the IFC standard is where the building meets the ground (terrain) and civil and external works such as pavements, retaining walls, bridges and tunnels. With the current focus in Australia on infrastructure projects over the next 20 years, a logical extension to this standard is in the area of site and civil works. This proposal recognises that there is an existing body of work on the specification of road representation data. In particular, LandXML is recognised, as are TransXML in the broader context of transportation and CityGML in the common interfacing of city maps, buildings and roads. Examination of interfaces between IFC and these specifications is therefore within the scope of this project. That such interfaces can be developed has already been demonstrated in principle within the IFC for Geographic Information Systems (GIS) project. National road standards that are already in use should be carefully analysed and contacts established in order to gain from this knowledge. The Object Catalogue for the Road Transport Sector (OKSTRA) should be noted as an example. It is also noted that buildingSMART Norway has submitted a proposal
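As a small, hedged illustration of what working with the IFC object model looks like in practice (not part of the project described above), the sketch below opens an IFC 2x3 file with the open-source ifcopenshell library and lists the walls it contains; the file name is hypothetical.

import ifcopenshell

model = ifcopenshell.open("office_building.ifc")   # hypothetical example file
print("Schema:", model.schema)                     # e.g. IFC2X3

# Every IFC entity is an instance of a class from the object-oriented model,
# e.g. IfcWall, IfcBeam, IfcSlab; query by type and print basic attributes.
for wall in model.by_type("IfcWall"):
    print(wall.GlobalId, wall.Name)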
Abstract:
Damage localization induced by strain softening can be predicted by the direct minimization of a global energy function. This article concerns the computational strategy for implementing this principle for softening materials such as concrete. Instead of using heuristic global optimization techniques, our strategies are a hybrid of local optimization methods with a path-finding approach to ensure a global optimum. With admissible nodal displacements as independent variables, it is easy to deal with the geometric (mesh) constraint conditions. The direct search optimization methods recover the localized solutions for a range of softening lattice models which are representative of quasi-brittle structures.
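The following minimal sketch illustrates the general idea of recovering localisation by direct energy minimisation, under assumptions of our own rather than the authors' formulation: a 1D chain of lattice elements with an exponentially softening energy density, a prescribed total elongation, and repeated random restarts of a local optimiser as a crude stand-in for the path-finding strategy the article describes. Parameters are illustrative only.

import numpy as np
from scipy.optimize import minimize

n_el = 10          # number of lattice elements in the chain
f_t = 1.0          # tensile strength (illustrative units)
eps0 = 0.01        # softening length-scale parameter
u_total = 0.2      # prescribed total elongation (sum of element strains)

def stored_energy(strains):
    # Softening law sigma = f_t * exp(-eps/eps0); the energy is its integral.
    return np.sum(f_t * eps0 * (1.0 - np.exp(-strains / eps0)))

constraint = {"type": "eq", "fun": lambda e: np.sum(e) - u_total}
bounds = [(0.0, None)] * n_el

# Several random starts of a local optimiser, a simple surrogate for the
# hybrid local-optimisation / path-finding strategy described above.
rng = np.random.default_rng(1)
best = None
for _ in range(10):
    e0 = rng.dirichlet(np.ones(n_el)) * u_total
    res = minimize(stored_energy, e0, method="SLSQP",
                   bounds=bounds, constraints=[constraint])
    if best is None or res.fun < best.fun:
        best = res

print(np.round(best.x, 4))   # typically one element carries nearly all the strain

Because the softening energy is concave in each element strain, the global minimiser concentrates the deformation in as few elements as possible, which is the localisation behaviour the abstract refers to.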
Abstract:
This study established that the core principle underlying the categorisation of activities has the potential to provide more comprehensive outcomes than the recognition of activities, because it takes into consideration activities other than directional locomotion.
Abstract:
The purpose of this chapter is to discuss the relationship between crime and morality, with a specific focus on crimes against morality. While we argue that all crimes have a general moral basis, being condemned as wrong or bad and proscribed by society, there is a specific group of offences in modern democratic nations labelled crimes against morality. Included within this group are offences related to prostitution, pornography and homosexuality. What do these crimes have in common? Most clearly, they tend to have a sexual basis and are often argued to do sexual harm, in a moral and/or psychological sense as well as physically. Conversely, they are often argued to be victimless crimes, especially when the acts occur between consenting adults. Finally, they are considered essentially private acts, yet they often occur, and are regulated, in the public domain. Most importantly, each of these crimes against morality has only relatively recently (i.e. in the past 150 years) become identified and regulated by the state as a criminal offence.