921 results for INVARIANCE-PRINCIPLE
Abstract:
The integration of new immigrants poses a challenge, particularly in sub-state nations. Citizens living in these contexts are more likely to perceive immigrants as potential political and cultural threats. However, minority ethnic and religious groups do not all represent the same degree of threat. This study seeks to determine whether francophone Quebec citizens perceive the various minority ethnic and religious groups differently, and whether they hold more negative attitudes toward these groups than other Canadians do. To the extent that such negative attitudes exist, the study seeks to understand whether they are based primarily on racial prejudice or on cultural concerns. Drawing on national and provincial data, the results show that francophone Quebecers are more negative than other Canadians toward religious minorities, but not toward racial minorities, and that these negative attitudes are grounded mainly in concerns about secularism and cultural security. The antipathy toward certain minorities observed within Quebec's francophone majority thus appears to be directed at specific groups, and to rest on principles that are cultural rather than racial in nature.
Abstract:
The Posttraumatic Growth Inventory (PTGI) is frequently used to assess positive changes following a traumatic event. The aim of this study is to examine the factor structure and latent mean invariance of the PTGI. A sample of 205 women diagnosed with breast cancer (M age = 54.3, SD = 10.1) and 456 adults who had experienced a range of adverse life events (M age = 34.9, SD = 12.5) were recruited to complete the PTGI and a socio-demographic questionnaire. We used Confirmatory Factor Analysis (CFA) to test the factor structure and multi-sample CFA to examine the invariance of the PTGI between the two groups. The goodness of fit of the five-factor model is satisfactory for the breast cancer sample (χ2(175) = 396.265; CFI = .884; NFI = .813; RMSEA [90% CI] = .079 [.068, .089]) and good for the non-clinical sample (χ2(172) = 574.329; CFI = .931; NFI = .905; RMSEA [90% CI] = .072 [.065, .078]). The multi-sample CFA shows that the unconstrained model fits both groups equally well, but the model with constrained factor loadings is not invariant across groups. The findings support the original five-factor structure and the multidimensional nature of posttraumatic growth (PTG). Regarding invariance between the samples, the factor structure of the PTGI and its other parameters (i.e., factor loadings, variances, and covariances) are not invariant across the breast cancer and non-clinical samples.
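As a check on the fit statistics quoted above, RMSEA can be recomputed from each model's chi-square, degrees of freedom and sample size via the standard formula RMSEA = sqrt(max(chi2 - df, 0) / (df * (N - 1))). A minimal sketch in Python, using only the values reported in the abstract:

```python
from math import sqrt

def rmsea(chi2, df, n):
    """Root Mean Square Error of Approximation from chi-square, df, sample size."""
    return sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Values reported for the two samples
print(round(rmsea(396.265, 175, 205), 3))  # breast cancer sample: 0.079
print(round(rmsea(574.329, 172, 456), 3))  # non-clinical sample: 0.072
```

Both values round to the RMSEAs reported in the abstract (.079 and .072), so the quoted statistics are internally consistent.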
Abstract:
The idea of spacecraft formations, flying in tight configurations with maximum baselines of a few hundred metres in low-Earth orbits, has generated widespread interest over the last several years. Nevertheless, controlling the motion of spacecraft in formation poses difficulties, such as high on-board computing demands and collision avoidance requirements, which escalate as the number of units in the formation increases, as complicated nonlinear effects are imposed on the dynamics, and as uncertainty arises from imperfect knowledge of system parameters. These requirements have led to the need for reliable linear and nonlinear controllers in terms of relative and absolute dynamics. The objective of this thesis is, therefore, to introduce new control methods that allow spacecraft in formation, with circular or elliptical reference orbits, to execute safe autonomous manoeuvres efficiently. These controllers are distinguished from the bulk of the literature in that they merge guidance laws never before applied to spacecraft formation flying with collision avoidance capabilities in a single control strategy. For this purpose, three control schemes are presented: linear optimal regulation, linear optimal estimation and adaptive nonlinear control. In general terms, the proposed control approaches command the dynamical behaviour of one or several followers with respect to a leader so as to asymptotically track a time-varying nominal trajectory (TVNT), while the threat of collision between the followers is reduced by repelling accelerations obtained from the collision avoidance scheme (CAS) during periods of closest proximity. Linear optimal regulation is achieved through a Riccati-based tracking controller.
Within this control strategy, the controller provides guidance and tracking toward a desired TVNT, optimising fuel consumption through a Riccati procedure with a finite-horizon cost function defined in terms of the desired TVNT, while repelling accelerations generated by the CAS ensure evasive actions between the elements of the formation. The relative dynamics model, suitable for circular and eccentric low-Earth reference orbits, is based on the Tschauner-Hempel equations and includes a control input and a nonlinear term corresponding to the CAS repelling accelerations. Linear optimal estimation is built on the forward-in-time separation principle. This controller encompasses two stages: regulation and estimation. The first stage requires the design of a full-state feedback controller using the state vector reconstructed by the estimator. The second stage requires the design of an additional dynamical system, the estimator, which obtains the states that cannot be measured in order to approximately reconstruct the full state vector. The separation principle then states that an observer built for a known input can also be used to estimate the state of the system and to generate the control input. This allows the observer and the feedback to be designed independently, exploiting the advantages of linear quadratic regulator theory, in order to estimate the states of a dynamical system subject to model and sensor uncertainty. The relative dynamics are described by the linear system used in the previous controller, with a control input and nonlinearities entering via the CAS repelling accelerations during collision avoidance events. Moreover, sensor uncertainty is added to the control process by considering carrier-phase differential GPS (CDGPS) velocity measurement error. Finally, an adaptive control law capable of delivering superior closed-loop performance compared to certainty-equivalence (CE) adaptive controllers is presented.
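The abstract does not specify the functional form of the CAS repelling accelerations; one common choice in the formation-flying literature is an artificial-potential term that activates only inside a safety sphere around each neighbour. A hypothetical sketch under that assumption (the gain `k` and activation radius `d0` are illustrative values, not taken from the thesis):

```python
import numpy as np

def repelling_accel(r_i, r_j, k=1e-3, d0=50.0):
    """Repulsive acceleration on follower i due to follower j, active only
    when the separation d falls below the activation radius d0 (metres)."""
    dr = r_i - r_j
    d = np.linalg.norm(dr)
    if d >= d0:
        return np.zeros(3)
    # Magnitude grows as the separation shrinks; direction pushes i away from j.
    return k * (1.0 / d - 1.0 / d0) * dr / d**3

# Outside the safety sphere: no action; inside: pushed away from the neighbour
print(repelling_accel(np.array([100.0, 0, 0]), np.zeros(3)))  # zero vector
print(repelling_accel(np.array([10.0, 0, 0]), np.zeros(3)))   # positive x-component
```

Because the term vanishes identically outside `d0`, it only perturbs the nominal tracking controller during the periods of closest proximity, as described above.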
A novel non-certainty-equivalence controller based on the Immersion and Invariance paradigm for close-manoeuvring spacecraft formation flying in both circular and elliptical low-Earth reference orbits is introduced. The proposed control scheme achieves stabilization by immersing the plant dynamics into a target dynamical system (or manifold) that captures the desired dynamical behaviour. The key feature of this methodology is the addition of a new term to the classical certainty-equivalence control approach that, in conjunction with the parameter update law, is designed to achieve adaptive stabilization. This parameter has the ultimate task of shaping the manifold into which the adaptive system is immersed. The stability of the controller is proven via a Lyapunov-based analysis and Barbalat's lemma. To evaluate the controller designs, test cases based on the physical and orbital features of the Prototype Research Instruments and Space Mission Technology Advancement (PRISMA) mission are implemented, extending the number of elements in the formation to scenarios with reconfigurations and on-orbit position switching in elliptical low-Earth reference orbits. An extensive analysis and comparison of the performance of the controllers in terms of total Δv and fuel consumption, with and without the effects of the CAS, is presented. These results show that the three proposed controllers allow the followers to asymptotically track the desired nominal trajectory and, additionally, the simulations that include the CAS show an effective decrease in collision risk during the manoeuvre.
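For a circular reference orbit, the Tschauner-Hempel equations reduce to the constant-coefficient Clohessy-Wiltshire model, and a Riccati-based regulator of the kind described above can be sketched in a few lines. This is a generic infinite-horizon LQR illustration, not the thesis's tracking controller: the mean motion `n` and the weights `Q` and `R` are placeholder values, not PRISMA parameters.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

n = 1.13e-3  # mean motion of a ~500 km LEO reference orbit, rad/s (illustrative)

# Clohessy-Wiltshire relative dynamics: state [x, y, z, vx, vy, vz], 3-axis thrust
A = np.block([
    [np.zeros((3, 3)), np.eye(3)],
    [np.array([[3 * n**2, 0, 0], [0, 0, 0], [0, 0, -n**2]]),
     np.array([[0, 2 * n, 0], [-2 * n, 0, 0], [0, 0, 0]])],
])
B = np.vstack([np.zeros((3, 3)), np.eye(3)])

Q = np.eye(6)         # state weight (placeholder)
R = 1e4 * np.eye(3)   # control weight (placeholder; penalises thrust, i.e. fuel)

P = solve_continuous_are(A, B, Q, R)   # Riccati solution
K = np.linalg.solve(R, B.T @ P)        # optimal gain, u = -K x

# The closed loop A - B K is asymptotically stable
print(np.linalg.eigvals(A - B @ K).real.max() < 0)  # True
```

The follower's relative state then decays to the origin (or, with a feedforward term, tracks a nominal trajectory), which is the circular-orbit special case of the tracking problem described in the abstract.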
Abstract:
At what point in reading development does literacy impact object recognition and orientation processing? Is it specific to mirror images? To answer these questions, forty-six 5- to 7-year-old preschoolers and first graders performed two same-different tasks differing in the matching criterion (orientation-based versus shape-based, i.e., orientation-independent) on geometric shapes and letters. On orientation-based judgments, first graders outperformed preschoolers, who had the strongest difficulty with mirrored pairs. On shape-based judgments, first graders were slower for mirrored than identical pairs, and even slower than preschoolers. This mirror cost emerged with letter knowledge. Only first graders showed worse shape-based judgments for mirrored and rotated pairs of reversible (e.g., b-d; b-q) than nonreversible (e.g., e-ә) letters, indicating readers' difficulty in ignoring orientation contrasts relevant to letters.
Authentic Leadership Questionnaire: invariance between samples of Brazilian and Portuguese employees
Abstract:
The Authentic Leadership Questionnaire (ALQ) is used to assess authentic leadership (AL). Although ALQ is often used in empirical research, cross-cultural studies with this measure are scarce. Aiming to contribute to filling this gap, this study assesses the invariance of the ALQ measure between samples of Brazilian (N = 1019) and Portuguese (N = 842) employees. A multi-group confirmatory factor analysis was performed, and the results showed the invariance of the first- and second-order factor models between the Brazilian and Portuguese samples. The results are discussed considering their cultural setting, with the study’s limitations and future research directions being pointed out.
Abstract:
This paper outlines the methods and outcomes of a study into equity management strategies in Australian private sector organisations reporting to the Equal Opportunity for Women in the Workplace Agency. Reports from 1976 organisations indicate eleven key factors characterising equity management in Australia. The study highlights differences within previously identified social-structural, temperamental and opportunity policies, and identifies a further policy type, categorised as "support policies". Differences have also been identified in relation to distribution structures, suggesting that gender is not the sole consideration in determining equity management strategies. The principle of distribution also figures strongly in equity management implementation.
Abstract:
In the early 1990s the University of Salford was typical of most pre-1992 universities in that, whilst students provided much of its income, little attention was paid to pedagogy. As Warren Piper (1994) observed, university teachers were professional in their subject areas but generally did not seek to acquire a pedagogy of HE. This was the case at Salford. Courses were efficiently run, but only a minority of staff were actively engaged in considering learning and teaching issues. Instead, staff time was spent on research and commercial activity.

In the mid-1990s the teaching environment began to change significantly. As well as Dearing, the advent of the QAA and teaching quality reviews, Salford was already experiencing changes in the characteristics of its student body. Widening access was on our agenda before it became so predominant nationally. With increasing numbers and heterogeneity of students, as well as these external factors, new challenges were facing the University and its teaching domain.

This paper describes how a culture which values teaching, learning and pedagogic inquiry is being created in the university. It then focuses on parts of this process specific to the Faculty of Business and Informatics, namely the Faculty's Learning and Teaching Research Network and the establishment of the Centre for Construction Education in the School of Construction and Property Management.

The Faculty of Business and Informatics' Learning and Teaching Research Network aims to raise the profile, quality and volume of pedagogic research across the five schools in the faculty. The initiative is targeted at all academics regardless of previous research experience. We hope to grow and nurture research potential where it exists and to acknowledge and use the existing expertise of subject-based researchers in collaborative ventures. We work on the principle that people are delighted to share what they know but need appreciation and feedback for doing so.

A further aim is to surface and celebrate the significant amount of tacit knowledge in the area of pedagogy, evidenced by the strength of student and employer feedback in many areas of the faculty's teaching.

The Faculty embraces generic and core management expertise but also includes applied management disciplines in information systems and construction and property management, where internationally leading research activities and networked centres of excellence have been established. Drawing from this experience, and within the context of the Faculty network, a Centre for Construction Education is being established with key international external partners to develop a sustainable business model of an enterprising pedagogic centre that can undertake useful research to underpin teaching in the Faculty whilst offering sustainable business services to allow it to benefit from pump-priming grant funding.

Internal and external networking are important elements in our plans and ongoing work. Key to this are our links with the LTSN subject centres (BEST and CEBE) and the LTSN generic centre. The paper discusses networking as a concept and gives examples of practices which have proved useful in this context.

The academic influences on our approach are also examined. Dixon's (2000) work examining how a range of companies succeed through internal knowledge sharing has provided a range of transferable practices. We also examine the notion of dialogue in this context, defined by Ballantyne (1999) as 'the interactive human process of reasoning together which comes into being through interactions based on spontaneity or need and is enabled by trust'. Social constructionist principles of Practical Authorship (Shotter, 1993; Pavlica, Holman and Thorpe, 1998) have also proved useful in developing our perspective on learning and knowledge creation within our community of practice.
Abstract:
The release of ultrafine particles (UFP) from laser printers and office equipment was analyzed using a particle counter (FMPS; Fast Mobility Particle Sizer) with high time resolution, as well as appropriate mathematical models. Measurements were carried out in a 1 m³ chamber, a 24 m³ chamber and an office. The time-dependent emission rates were calculated for these environments using a deconvolution model, after which the total amount of emitted particles was calculated. The total amounts of released particles were found to be independent of the environmental parameters and were therefore, in principle, appropriate for the comparison of different printers. On the basis of the time-dependent emission rates, "initial burst" emitters and constant emitters could also be distinguished. In the case of an "initial burst" emitter, comparison with other devices is generally affected by strong variations between individual measurements. When conducting exposure assessments for UFP in an office, the spatial distribution of the particles also has to be considered. In this work, the spatial distribution was predicted on a case-by-case basis using CFD simulation.
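The deconvolution step described above can be illustrated with a single well-mixed chamber balance, dC/dt = S(t)/V - kC, inverted for the emission rate S(t) from the measured concentration series. A minimal discrete sketch (the chamber volume, loss rate and source strength are invented illustrative numbers, not the measured values from the study):

```python
V, k, dt = 1.0, 0.5, 0.1   # chamber volume (m^3), total loss rate (1/min), step (min)
S_true = 1.0e9             # constant emission rate (particles/min), illustrative

# Forward-simulate the particle concentration C (particles/m^3) in the chamber
C = [0.0]
for _ in range(100):
    C.append(C[-1] + dt * (S_true / V - k * C[-1]))

# Deconvolution: recover the emission rate from the "measured" concentrations
S_est = [V * ((C[i + 1] - C[i]) / dt + k * C[i]) for i in range(len(C) - 1)]
print(max(abs(s - S_true) for s in S_est) < 1e-3)  # True: source recovered
```

With real FMPS data the loss rate k (ventilation plus deposition) must be estimated from the concentration decay after the source stops, which is what makes the total emitted amount independent of the measurement environment.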
Abstract:
'The Millennial Adolescent' offers contemporary, stimulating insights for those currently teaching as well as those preparing to teach. The book investigates the characteristics of Generation Y using students' own voices, generational theory and case studies. The text is structured around the principle that effective teachers need to know who they are teaching as well as what to teach, how to teach it, and how to assess the outcome. Using generational theory, 'The Millennial Adolescent' investigates the characteristics of Generation Y, or the Millennial Generation, and points out what all teachers need to know about working with this current generation of students, who are described in a number of ways: digital natives, team oriented, confident, multi-taskers, high achievers, and a generation unlike any other. The book contains well-known frameworks for developing understandings about adolescents, blended and contrasted with a contemporary socio-cultural construction of adolescence set in our particular time, era and society. The book reflects the uniqueness of Australian contexts while connecting with international trends and global patterns. Engaging and full of insights, it is essential reading for all professionals dealing with adolescents.
Abstract:
This dissertation by publication, which focuses on gender and the Australian federal parliament, has resulted in the submission of three refereed journal articles. Data for the study were obtained from 30 semi-structured interviews undertaken in 2006 with fifteen (15) male and fifteen (15) female members of the Australian parliament. The first of the articles is methodological and has been accepted for publication in the Australian Journal of Political Science. The paper argues that feminist political science is guided by five important principles: placing gender at the centre of the research, giving emphasis to women's voice, challenging the public/private divide, using research to transform society, and taking a reflexive approach to positionality. It is the latter principle, the importance of taking a reflexive approach to research, which I explore in the paper. Drawing on my own experiences as a member of the House of Representatives (Forde, 1987-1996), I reflexively investigate the intersections between my background and my identity as a researcher. The second of the articles views the data through the lens of Acker's (1990) notion of the 'gendered organization', which posits that organizations are gendered along four dimensions: the division of labour; symbols, images and ideologies; workplace interactions; and the gendered components of individual identity. In this paper, which has been submitted to the British Journal of Political Science, each of Acker's (1990) dimensions is examined in terms of the data from interviews with male and female politicians. The central question investigated is thus: to what extent does the Australian parliament conform to Acker's (1990) concept of the 'gendered organization'? The third of the papers focuses specifically on data from interviews with the 15 male politicians and investigates how they view gender equality and the Australian parliament.
The article, which has been submitted to the European Journal of Political Science, asks to what extent contemporary male politicians view the Australian parliament as gendered. Discourse analysis, that is, 'ways of viewing' (Bacchi, 1999, p. 40), is used as the approach to analyse the data. Three discursive frameworks through which male politicians view gender in the Australian parliament are identified: that the parliament is gendered as masculine but this is unavoidable; that the parliament is gendered as feminine and women are actually advantaged; and that the parliament is gender neutral and gender is irrelevant. It is argued that collectively these framing devices operate to mask the many constraints which marginalise women from political participation and to undermine attempts to address women's disadvantage as political participants. The article concludes by highlighting the significance of the paper beyond the Australian context and by calling for further research which names and critiques political men and their discourses on gender and parliamentary practices and processes.
Abstract:
Background: Zoonotic schistosomiasis japonica is a major public health problem in China. Bovines, particularly water buffaloes, are thought to play a major role in the transmission of schistosomiasis to humans in China. Preliminary results (1998-2003) of a praziquantel (PZQ)-based pilot intervention study we undertook provided proof of principle that water buffaloes are major reservoir hosts for S. japonicum in the Poyang Lake region, Jiangxi Province. Methods and Findings: Here we present the results of a cluster-randomised intervention trial (2004-2007) undertaken in Hunan and Jiangxi Provinces, with increased power and more general applicability to the lake and marshland regions of southern China. The trial involved four matched pairs of villages, with one village within each pair randomly selected as a control (human PZQ treatment only), leaving the other as the intervention (human and bovine PZQ treatment). A sentinel cohort of people, monitored for new infections for the duration of the study, was selected from each village. Results showed that combined human and bovine chemotherapy with PZQ had a greater effect on human incidence than human PZQ treatment alone. Conclusions: The results from this study, supported by previous experimental evidence, confirm that bovines are the major reservoir host of human schistosomiasis in the lake and marshland regions of southern China, and reinforce the rationale for the development and deployment of a transmission-blocking anti-S. japonicum vaccine targeting bovines.
Abstract:
This document provides a review of international and national practices in investment decision support tools in road asset management. Efforts were concentrated on identifying the analytic frameworks, evaluation methodologies and criteria adopted by current tools. Emphasis was also given to how current approaches support Triple Bottom Line decision-making. Benefit Cost Analysis and Multiple Criteria Analysis are the principal methodologies supporting decision-making in Road Asset Management. The complexity of the applications shows significant differences in international practices. There is continuing discussion amongst practitioners and researchers as to which is more appropriate for supporting decision-making. It is suggested that the two approaches should be regarded as complementary rather than competing means. Multiple Criteria Analysis may be particularly helpful in the early stages of project development, such as strategic planning. Benefit Cost Analysis is used most widely for project prioritisation and for selecting the final project from amongst a set of alternatives. The Benefit Cost Analysis approach is a useful tool for investment decision-making from an economic perspective. An extension of the approach, which includes social and environmental externalities, is currently used to support Triple Bottom Line decision-making in the road sector. However, attention should be given to several issues in its application. First of all, there is a need to reach a degree of commonality in considering social and environmental externalities, which may be achieved by aggregating best practices. At different decision-making levels, the detail of consideration of the externalities should differ. It is intended to develop a generic framework to coordinate the range of existing practices. The standard framework will also help reduce double counting, which appears in some current practices.
Caution is also needed regarding the methods of determining the value of social and environmental externalities. A number of methods, such as market price, resource costs and Willingness to Pay, are found in the review. The use of unreasonable monetisation methods in some cases has discredited Benefit Cost Analysis in the eyes of decision-makers and the public. Some social externalities, such as employment and regional economic impacts, are generally omitted in current practices, owing to the lack of information and credible models. It may be appropriate to consider these externalities in qualitative form in a Multiple Criteria Analysis. Consensus has been reached on considering noise and air pollution in international practices; however, Australian practices have generally omitted these externalities. Equity is an important consideration in Road Asset Management. The considerations are either between regions or between social groups, defined by income, age, gender, disability, and so on. In current practice there is no well-developed quantitative measure for equity issues, and more research is needed on this front. Although Multiple Criteria Analysis has been used for decades, there is no generally accepted framework for the choice of modelling methods and the treatment of the various externalities. The result is that different analysts are unlikely to reach consistent conclusions about a policy measure. In current practice, some favour methods which are able to prioritise alternatives, such as Goal Programming, the Goal Achievement Matrix and the Analytic Hierarchy Process; others simply present the various impacts to decision-makers to characterise the projects. Weighting and scoring systems are critical in most Multiple Criteria Analyses. However, the processes of assessing weights and scores have been criticised as highly arbitrary and subjective, so it is essential that the process be as transparent as possible.
Obtaining weights and scores by consulting local communities is a common practice, but it is likely to result in bias towards local interests. An interactive approach has the advantage of helping decision-makers elaborate their preferences; however, the computational burden may cause decision-makers to lose interest during the solution process for a large-scale problem, such as a large state road network. Current practices tend to use cardinal or ordinal scales to measure non-monetised externalities. Distorted valuations can occur where variables measured in physical units are converted to scales. For example, if decibels of noise are converted to a scale of -4 to +4 by a linear transformation, the difference between 3 and 4 may represent a far greater increase in discomfort to people than the increase from 0 to 1. It is therefore suggested that different weights be assigned to individual scores. Due to overlapping goals, the problem of double counting also appears in some Multiple Criteria Analyses. The situation can be improved by carefully selecting and defining investment goals and criteria. Other issues, such as the treatment of time effects and the incorporation of risk and uncertainty, have been given scant attention in current practices. This report suggests establishing a common analytic framework to deal with these issues.
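The weighted-sum aggregation that underlies most Multiple Criteria Analysis can be sketched in a few lines (the options, criteria, weights and scores below are invented for illustration, not taken from the review):

```python
# Hypothetical road-investment options scored on three criteria (higher is better)
options = {
    "upgrade A": {"cost": 3, "safety": 4, "noise": 1},
    "upgrade B": {"cost": 4, "safety": 2, "noise": 3},
}
weights = {"cost": 0.5, "safety": 0.3, "noise": 0.2}

def weighted_score(scores):
    """Classic weighted-sum MCA aggregation of criterion scores."""
    return sum(weights[c] * s for c, s in scores.items())

for name, scores in options.items():
    print(name, weighted_score(scores))
# upgrade A: 0.5*3 + 0.3*4 + 0.2*1 = 2.9
# upgrade B: 0.5*4 + 0.3*2 + 0.2*3 = 3.2
```

A linear mapping from a physical unit such as decibels onto the criterion scores implicitly assumes that each scale step represents an equal increment of impact, which is exactly the distortion the review warns about.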
Abstract:
Key topics: Since the birth of the Open Source movement in the mid-80s, open source software has become more and more widespread. Amongst others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow the software's use in exchange for a fee, open source licenses grant users more rights, such as the free use, free copying, free modification and free distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance that underlies such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007); collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008); issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006); public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006); technological competition issues related to standards battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007); and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users.
On this topic, articles show that a commercial activity based on open source software is possible, as they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open-source-based companies, which encompass a wide range of firms with different categories of activities: providers of packaged open source solutions, IT Services & Software Engineering firms, and open source software publishers. However, the business model implications are different for each of these categories: the activities of providers of packaged solutions and of IT Services & Software Engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as this issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. The literature identifies and depicts only two generic types of business models for open source software publishers: the ''bundling'' business model (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are (1) to explore the contexts in which the two generic business models described in the literature can be implemented successfully and (2) to depict an additional business model for open source software publishers which can be used in a different context. To do so, this paper draws upon an explorative case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager.
It aims at depicting the process of IdealX's search for an appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of ''mutualisation'', which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the bundling business model, the dual licensing business model and the mutualisation business model) as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes beyond the traditional concept of business model used by scholars in the open source literature. In this article, a business model is not only considered as a way of generating income (a ''revenue model'' (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenblum, 2002; Teece, 2007). Consequently, this paper analyses business models from the point of view of these two components.
Abstract:
Construction is an information-intensive industry in which the accuracy and timeliness of information is paramount. It is observed that the main communication issue in construction is providing a method to exchange data between the site operation, the site office and the head office. The information needs under consideration are time-critical for maintaining or improving efficiency at the jobsite, and without appropriate computing support the difficulty of problem solving may increase. Many researchers have focused on the use of mobile computing devices in the construction industry, believing that mobile computers have the potential to solve some of the construction problems that reduce overall productivity. However, to date very limited observation has been conducted of the deployment of mobile computers by construction workers on site. Providing field workers with accurate, reliable and timely information at the location where it is needed supports effectiveness and efficiency at the job site. Bringing a new technology into the construction industry requires not only a better understanding of the application, but also proper preparation for the allocation of resources such as people and investment. With this in mind, an accurate analysis is needed to provide a clear idea of the overall costs and benefits of the new technology. A cost benefit analysis is a method of evaluating the relative merits of a proposed investment project in order to achieve an efficient allocation of resources. It is a way of identifying, portraying and assessing the factors which need to be considered in making rational economic choices.
In principle, a cost benefit analysis is a rigorous, quantitative and data-intensive procedure, which requires identification of all potential effects, categorisation of these effects as costs or benefits, quantitative estimation of the extent of each cost and benefit associated with an action, translation of these into a common metric such as dollars, discounting of future costs and benefits into the terms of a given year, and summation of all costs and benefits to see which is greater. Even though many cost benefit analysis methodologies are available for general assessment, there is no specific methodology that can be applied to analysing the costs and benefits of the application of mobile computing devices on the construction site. Hence, the methodology proposed in this document is predominantly adapted from Baker et al. (2000), the Department of Finance (1995), and the Office of Investment Management (2005). The methodology is divided into four main stages, detailed in ten steps. The methodology was developed for CRC CI Project 2002-057-C: Enabling Team Collaboration with Pervasive and Mobile Computing, and can be seen in detail in Section 3.
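The discount-and-sum step described above reduces to a short calculation. A minimal sketch with invented cash flows and an assumed 7% discount rate (none of these figures come from the project):

```python
def present_value(flows, rate):
    """Discount a list of yearly amounts (year 0 first) to present-value terms."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

# Hypothetical mobile-computing deployment: up-front cost, then yearly benefits
costs    = [100000, 10000, 10000, 10000]   # dollars per year
benefits = [0, 60000, 60000, 60000]
rate = 0.07                                # assumed discount rate

pv_costs = present_value(costs, rate)
pv_benefits = present_value(benefits, rate)
print(pv_benefits / pv_costs)   # benefit-cost ratio; > 1 favours the investment
```

Comparing the discounted totals (or their ratio) is the final "see which is greater" step; the choice of discount rate and of the year-0 baseline must be held constant across all alternatives being compared.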