921 results for theory-building
Abstract:
Emerging sciences, such as conceptual cost estimating, seem to have to go through two phases. The first phase involves reducing the field of study down to its basic ingredients - from systems development to technological development (techniques) to theoretical development. The second phase operates in the opposite direction, building up techniques from theories, and systems from techniques. Cost estimating is clearly and distinctly still in the first phase. A great deal of effort has been put into the development of both manual and computer-based cost estimating systems during this first phase and, to a lesser extent, the development of a range of techniques that can be used (see, for instance, Ashworth & Skitmore, 1986). Theoretical developments have not, as yet, been forthcoming. All theories need the support of some observational data, and cost estimating is not likely to be an exception. These data do not need to be complete in order to build theories. Just as it is possible to construct an image of a prehistoric animal such as the brontosaurus from only a few key bones and relics, so a theory of cost estimating may possibly be founded on a few factual details. The eternal argument between empiricists and deductionists is that, just as theories need factual support, so we need theories in order to know what facts to collect. In cost estimating, the basic facts of interest concern accuracy, the cost of achieving this accuracy, and the trade-off between the two. When cost estimating theories do begin to emerge, it is highly likely that these relationships will be central features. This paper presents some of the facts we have been able to acquire regarding one part of this relationship - accuracy, and its influencing factors. Although some of these factors, such as the amount of information used in preparing the estimate, will have cost consequences, we have not yet reached the stage of quantifying these costs.
Indeed, as will be seen, many of the factors do not involve any substantial cost considerations. The absence of any theory is reflected in the arbitrary manner in which the factors are presented. Rather, the emphasis here is on the consideration of purely empirical data concerning estimating accuracy. The essence of good empirical research is to minimize the role of the researcher in interpreting the results of the study. Whilst space does not allow a full treatment of the material in this manner, the principle has been adopted as closely as possible, presenting results in an uncleaned and unbiased way. In most cases the evidence speaks for itself. The first part of the paper reviews most of the empirical evidence that we have located to date. Knowledge of any work done but omitted here would be most welcome. The second part of the paper presents an analysis of some recently acquired data pertaining to this growing subject.
Abstract:
This paper describes the instigation and development of an expert system to aid in the strategic planning of construction projects. The paper consists of four parts - the origin of the project, the development of the concepts needed for the proposed system, the building of the system itself, and an assessment of its performance. The origin of the project is outlined, starting with the Japanese commitment to 5th generation computing together with the increasing local reaction to theory-based prescriptive research in the field. The subsequent development of activities via the Alvey Commission and the RICS, in conjunction with Salford University, is traced, culminating in the proposal and execution of the first major expert system to be built for the UK construction industry, subsequently recognised as one of the most successful of the expert system projects commissioned under the Alvey programme.
Abstract:
We report a comprehensive theoretical study of the reaction of methane with the Fe4 cluster. This Letter gains insight into the mechanism of the reaction and indicates that the Fe4 cluster has a strong catalytic effect on the activation of methane. In detail, the results show that the cleavage of the first C–H bond is both an energetically and kinetically favourable process, and that the breaking of the second C–H bond is the rate-determining step. Moreover, our Letter demonstrates that the cluster size of iron can not only determine the catalytic activity towards methane but also control the product selectivity.
Abstract:
Dynamic capability theory asserts that the learning capabilities of construction organisations influence the degree to which value-for-money (VfM) is achieved on collaborative projects. However, little research has been conducted to verify this relationship. The evidence is particularly limited within the empirical context of infrastructure delivery in Australia. Primarily drawing on the theoretical perspectives of the resource-based view of the firm (e.g. Barney 1991), dynamic capabilities (e.g. Helfat et al. 2007), absorptive capacity (e.g. Lane et al. 2006) and knowledge management (e.g. Nonaka 1994), this paper conceptualises learning capability as a knowledge-based dynamic capability. Learning capability builds on the micro-foundations of higher-order learning routines, which are deliberately developed by construction organisations for managing collaborative projects. Based on this conceptualisation of learning capability, an exploratory case study was conducted. The study investigated the operational and higher-order learning routines adopted by a project alliance team to successfully achieve VfM. The case study demonstrated that the learning routines of the alliance project were developed and modified by the continual joint learning activities of participant organisations. Project-level learning routines were found to significantly influence the development of organisational-level learning routines. In turn, the learning outcomes generated from the alliance project appeared to significantly influence the development of project management routines and contractual arrangements applied by the participant organisations in subsequent collaborative projects. The case study findings imply that the higher-order learning routines that underpin the learning capability of construction organisations have the potential to influence the VfM achieved on both current and future collaborative projects.
Abstract:
Computer games have become a commonplace but engaging activity among students. They enjoy playing computer games as they can perform larger-than-life activities virtually, such as jumping from great heights, flying planes, and racing cars; actions that are otherwise not possible in real life. Computer games also offer user interactivity, which gives them a certain appeal. Considering this appeal, educators should consider integrating computer games into student learning and encouraging students to author computer games of their own. It is thought that students can be engaged in learning by authoring and using computer games and can also gain essential skills such as collaboration, teamwork, problem solving and deductive reasoning. The research in this study revolves around building student engagement through the task of authoring computer games. The study aims to demonstrate how the creation and sharing of student-authored educational games might facilitate student engagement and how ICT (information and communication technology) plays a supportive role in student learning. Results from this study may lead to the broader integration of computer games into student learning and contribute to similar studies. In this qualitative case study, based in a state school in a low socio-economic area west of Brisbane, Australia, students who had authored computer games as part of their ICT learning were selected from both junior and senior secondary classes. Senior secondary students (Year 12 ICT) were given the task of programming the games, which were to be based on Mathematics learning topics, while the junior secondary students (Year 8 ICT) were given the task of creating multimedia elements for the games. A Mathematics teacher volunteered to assist in the project and provided guidance on the inclusion of suitable Mathematics curricular content into these computer games.
The student-authored computer games were then used to support another group of Year 8 Mathematics students to learn the topics of Area, Volume and Time. Data was collected through interviews, classroom observations and artefacts. The teacher researcher, acting in the role of ICT teacher, coordinated with the students and the Mathematics teacher to conduct this study. An instrumental case study was applied as the research methodology, and Third Generation Activity Theory served as the theoretical framework for this study. Data was analysed adopting qualitative coding procedures. Findings of this study indicate that having students author and play computer games promoted student engagement and that ICT played a supportive role in learning and allowed students to gain certain essential skills. Although this study suggests integrating computer games to support classroom learning, it cannot be presumed that computer games are an immediate solution for promoting student engagement.
Abstract:
The green building trend has increased rapidly worldwide in recent decades as a means of addressing growing concerns over climate change and global warming and of reducing the impact of the building industry on the environment. A significant contribution in Australia is the use of a series of rating tools by the Green Building Council Australia (GBCA) for the certification of various types of buildings. This paper reviews the use of the Green Star system in Australian building construction, and investigates the potential challenges involved in acquiring the certification of Australian buildings by critically analysing a database of the most recently certified GBCA projects. The results show that management-related credits and innovation-related credits are the easiest and the most difficult, respectively, to obtain. Additionally, 6-Star green buildings achieve significantly higher points than other certified buildings in the Energy category. In contrast, 4-Star green buildings achieve more points in the Material category than 5- and 6-Star buildings. The study offers a useful reference for both property developers and project teams to obtain a better understanding of the rating scheme and consequently the effective preparation of certification documentation.
Abstract:
The building sector is the dominant consumer of energy and therefore a major contributor to anthropogenic climate change. The rapid generation of photorealistic, 3D environment models with incorporated surface temperature data has the potential to improve thermographic monitoring of building energy efficiency. In pursuit of this goal, we propose a system which combines a range sensor with a thermal-infrared camera. Our proposed system can generate dense 3D models of environments with both appearance and temperature information, and is the first such system to be developed using a low-cost RGB-D camera. The proposed pipeline processes depth maps successively, forming an ongoing pose estimate of the depth camera and optimizing a voxel occupancy map. Voxels are assigned 4 channels representing estimates of their true RGB and thermal-infrared intensity values. Poses corresponding to each RGB and thermal-infrared image are estimated through a combination of timestamp-based interpolation and pre-determined knowledge of the extrinsic calibration of the system. Raycasting is then used to color the voxels to represent both visual appearance using RGB, and an estimate of the surface temperature. The output of the system is a dense 3D model which can simultaneously represent both RGB and thermal-infrared data using one of two alternative representation schemes. Experimental results demonstrate that the system is capable of accurately mapping difficult environments, even in complete darkness.
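The timestamp-based pose interpolation step described in this abstract can be sketched in a few lines (a minimal illustration only, not the authors' implementation: it assumes linear interpolation of translation and spherical linear interpolation of orientation between the two nearest depth-frame poses, and all function and variable names are hypothetical):

```python
import numpy as np

def slerp(q0, q1, alpha):
    """Spherical linear interpolation between unit quaternions q0 and q1."""
    q0, q1 = q0 / np.linalg.norm(q0), q1 / np.linalg.norm(q1)
    dot = np.dot(q0, q1)
    if dot < 0.0:          # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:       # nearly parallel: fall back to linear interpolation
        q = q0 + alpha * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)
    return (np.sin((1 - alpha) * theta) * q0
            + np.sin(alpha * theta) * q1) / np.sin(theta)

def interpolate_pose(t, pose_a, pose_b):
    """Estimate the camera pose at image timestamp t between two
    timestamped depth-frame pose estimates.

    Each pose is (timestamp, translation 3-vector, quaternion [x, y, z, w]).
    """
    ta, pa, qa = pose_a
    tb, pb, qb = pose_b
    alpha = (t - ta) / (tb - ta)                 # fractional position in time
    position = (1 - alpha) * pa + alpha * pb     # lerp the translation
    orientation = slerp(qa, qb, alpha)           # slerp the rotation
    return position, orientation
```

In a full pipeline, the interpolated pose would then be composed with the pre-determined extrinsic calibration to obtain the RGB or thermal-infrared camera pose for raycasting.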
Abstract:
The practical number of charge carriers loaded is crucial to the evaluation of the capacity performance of carbon-based electrodes in service, and cannot be easily addressed experimentally. In this paper, we report a density functional theory study of charge carrier adsorption onto zigzag edge-shaped graphene nanoribbons (ZGNRs), both pristine and incorporating edge substitution with boron, nitrogen or oxygen atoms. All edge substitutions are found to be energetically favorable, especially in oxidized environments. The maximal loading of protons onto the substituted ZGNR edges obeys a rule of [8-n-1], where n is the number of valence electrons of the edge-site atom constituting the adsorption site. Hence, a maximum charge loading is achieved with boron substitution. This result correlates in a transparent manner with the electronic structure characteristics of the edge atom. The boron edge atom, characterized by the most empty p band, facilitates, more than the other substitutional cases, the accommodation of valence electrons transferred from the ribbon upon proton adsorption. This result not only further confirms the possibility of enhancing the charge storage performance of carbon-based electrochemical devices through chemical functionalization but also, more importantly, provides the physical rationale for further design strategies.
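The [8-n-1] counting rule stated in this abstract is easy to tabulate (a sketch only: the valence-electron counts are standard, but the rule itself is the empirical result reported above, and the function name is hypothetical):

```python
# Valence electrons (n) of the edge-site atom determine the maximal
# proton loading per edge site via the reported [8 - n - 1] rule.
VALENCE = {"B": 3, "C": 4, "N": 5, "O": 6}

def max_proton_loading(edge_atom):
    """Maximal number of protons an edge site can accommodate: 8 - n - 1."""
    n = VALENCE[edge_atom]
    return 8 - n - 1

for atom in ("B", "C", "N", "O"):
    print(atom, max_proton_loading(atom))
# Boron (n = 3) gives the maximum loading of 4, consistent with the
# reported result that boron substitution maximizes charge loading.
```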
Abstract:
Heteroatom doping on the edge of graphene may serve as an effective way to tune the chemical activity of carbon-based electrodes with respect to charge carrier transfer in an aqueous environment. In a step towards developing a mechanistic understanding of this phenomenon, we explore herein mechanisms of proton transfer from aqueous solution to pristine and doped graphene edges utilizing density functional theory. Atomic B-, N-, and O-doped edges as well as the native graphene edge are examined, displaying varying proton affinities and effective interaction ranges with the H3O+ charge carrier. Our study shows that the doped edges characterized by more dispersive orbitals, namely boron and nitrogen, demonstrate more energetically favourable charge carrier exchange compared with oxygen, which features more localized orbitals. Extended calculations are carried out to examine proton transfer from the hydronium ion in the presence of explicit water, with results indicating that the basic mechanistic features of the simpler model are unchanged.
Abstract:
Parametric and generative modelling methods are ways of making computer models more flexible and of formalising domain-specific knowledge. At present, no open standard exists for the interchange of parametric and generative information. The Industry Foundation Classes (IFC), an open standard for interoperability in building information models, are presented as the basis for an open standard in parametric modelling. The advantage of allowing parametric and generative representations is that the early design process can accommodate more iteration, and changes can be implemented more quickly than with traditional models. This paper begins with a formal definition of what constitutes parametric and generative modelling methods and then proceeds to describe an open standard in which the interchange of components could be implemented. As an illustrative example of generative design, Frazer’s ‘Reptiles’ project from 1968 is reinterpreted.
Abstract:
This thematic issue on education and the politics of becoming focuses on how a Multiple Literacies Theory (MLT) plugs into practice in education. MLT does this by creating an assemblage between discourse, text, resonance and sensations. What does this produce? Becoming AND how one might live are the product of an assemblage (May, 2005; Semetsky, 2003). In this paper, MLT is the approach that explores the connection between educational theory and practice through the lens of an empirical study of multilingual children acquiring multiple writing systems simultaneously. The introduction explicates discourse, text, resonance, sensation and becoming. The second section introduces certain Deleuzian concepts that plug into MLT. The third section serves as an introduction to MLT. The fourth section is devoted to the study by way of a rhizoanalysis. Finally, drawing on the concept of the rhizome, this article exits with potential lines of flight opened by MLT. These are becomings which highlight the significance of this work in terms of transforming not only how literacies are conceptualized, especially in minority language contexts, but also how one might live.
Abstract:
This project was a step forward in developing and evaluating a novel mathematical model that can deduce the meaning of words from their use in language. This model can be applied to a wide range of natural language applications, including the information seeking process most of us undertake on a daily basis.
Abstract:
Boards of directors are key governance mechanisms in organizations and fulfill two main tasks: monitoring managers and firm performance, and providing advice and access to resources. In spite of a wealth of research, much remains unknown about how boards attend to the two tasks. This study investigates whether organizational (firm profitability) and environmental factors (industry regulation) affect board task performance. The data combine CEOs' responses to a questionnaire and archival data from a sample of large Italian firms. Findings show that past firm performance is negatively associated with board monitoring and advice tasks; greater industry regulation enhances perceived board task performance; and board monitoring and advice tasks tend to reinforce each other, despite their theoretical and practical distinction.
Abstract:
Links between the built environment and human behaviour have long been of interest to those involved in the fields of urban planning and architecture, but direct assessments of the links between three-dimensional building façade form and human behaviour are rare. Much work has been completed on subjects’ responses to the aesthetics of architectural frontages, but this has generally been conducted using two-dimensional images of structures and in no way assesses human responses when in the presence of these structures. This research set about observing the behaviour of individuals and groups in the public realm and recording their reactions to architecture which has a distinct three-dimensional character, with particular reference to the street-level façade. The behaviour was recorded and quantified, indicating that there are significant differences in human behaviour around these various types of architecture.