924 results for Specification
Abstract:
In architectural design and the construction industry, there is insufficient evidence about the way designers collaborate in their normal working environments using both traditional and digital media. It is this gap in empirical evidence that the CRC project “Team Collaboration in High Bandwidth Virtual Environments” addresses. The project is primarily, but not exclusively, concerned with the conceptual stages of design carried out by professional designers working in different offices. The aim is to increase opportunities for communication and interaction between people in geographically distant locations in order to improve the quality of collaboration. To understand the practical implications of introducing new digital tools into working practices, research into how designers work collaboratively using both traditional and digital media is being undertaken. This will involve a series of empirical studies in the workplaces of the industry partners in the project. The studies of collaboration processes will provide empirical results that will lead to more effective use of virtual environments in design and construction processes. The report describes the research approach, the industry study, the methods for data collection and analysis, and the foundation research methodologies. A distinctive aspect is that the research has been devised to enable field studies to be undertaken in a live industrial environment where the participant designers carry out real projects alongside their colleagues and in familiar locations. There are two basic research objectives: the first is to obtain evidence about design practice that will inform the architecture and construction industries about the impact and potential benefit of using digital collaboration technologies; the second is to add to long-term research knowledge of human cognitive and behavioural processes based on real-world data.
To achieve this, the research methods must be able to acquire a rich and heterogeneous set of data from design activities as they are carried out in the normal working environment. This places different demands upon the data collection and analysis methods from those of laboratory studies, where controlled conditions are required. To address this, the research approach that has been adopted is ethnographic in nature and case study-based. The plan is to carry out a series of in-depth studies in order to provide baseline results for future research across a wider community of user groups. An important objective has been to develop a methodology that will produce valid, significant and transferable results. The research will contribute to knowledge about how architectural design and the construction industry may benefit from the introduction of leading-edge collaboration technologies. The outcomes will provide a sound foundation for the production of guidelines for the assessment of high bandwidth tools and their future deployment. The knowledge will form the basis for the specification of future collaboration products and collaboration processes. This project directly addresses the industry-identified focus on cultural change, image, e-project management, and innovative methods.
Abstract:
The quality of office indoor environments is taken to comprise those factors that impact occupants' health and well-being and, by consequence, their productivity. Indoor Environment Quality (IEQ) can be characterised by four indicators:
• Indoor air quality indicators
• Thermal comfort indicators
• Lighting indicators
• Noise indicators
Within each indicator there are specific metrics that can be used to determine an acceptable quality of indoor environment based on existing knowledge and best practice. Examples of these metrics are: indoor air levels of pollutants or odorants; operative temperature and its control; radiant asymmetry; task lighting; glare; and ambient noise. The way in which these metrics impact occupants is not fully understood, especially where multiple metrics may interact in their impacts. While the potential cost of lost productivity from poor IEQ has been estimated to exceed building operation costs, the level of impact and the relative significance of the above four indicators are largely unknown. They are, however, key factors in the sustainable operation or refurbishment of office buildings. This paper presents a methodology for assessing indoor environment quality (IEQ) in office buildings, and indicators with related metrics for high performance and occupant comfort. These are intended for integration into the specification of sustainable office buildings as key factors to ensure a high degree of occupant habitability without this being impaired by other sustainability factors. The assessment methodology was applied in a case study on IEQ in Australia's first ‘six star’ sustainable office building, Council House 2 (CH2), located in the centre of Melbourne. The CH2 building was designed and built with a specific focus on sustainability and the provision of a high quality indoor environment for occupants. Actual IEQ performance was assessed in this study by field assessment after construction and occupancy.
For comparison, the methodology was applied to a 30-year-old conventional building adjacent to CH2 which housed the same or similar occupants and activities. The impact of IEQ on occupant productivity will be reported in a separate future paper.
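The metric-against-threshold style of assessment described above can be sketched in a few lines. This is a minimal illustration only; the metric names and acceptance bands below are hypothetical placeholders, not the paper's actual criteria:

```python
def assess_ieq(measurements, acceptable_ranges):
    """Check measured IEQ metrics against acceptable ranges and
    report which metrics pass.

    `acceptable_ranges` maps a metric name to an (low, high) band.
    """
    report = {}
    for metric, value in measurements.items():
        lo, hi = acceptable_ranges[metric]
        report[metric] = lo <= value <= hi
    return report

# Hypothetical acceptance bands for three metrics of the kind
# named in the abstract (values are illustrative, not the study's).
acceptable = {
    "operative_temp_C": (20.0, 24.0),
    "co2_ppm": (0.0, 800.0),
    "ambient_noise_dBA": (0.0, 45.0),
}
```

In a real assessment each band would come from the relevant standard or best-practice guideline, and the per-metric results would be rolled up into the four indicator groups.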
Abstract:
Forensic analysis requires the acquisition and management of many different types of evidence, including individual disk drives, RAID sets, network packets, memory images, and extracted files. Often the same evidence is reviewed by several different tools or examiners in different locations. We propose a backwards-compatible redesign of the Advanced Forensic Format, an open, extensible file format for storing and sharing evidence, arbitrary case-related information and analysis results among different tools. The new specification, termed AFF4, is designed to be simple to implement, and is built upon the well-supported ZIP file format specification. Furthermore, the AFF4 implementation is backwards compatible with existing AFF files.
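The idea of layering an evidence container on the ZIP format can be illustrated with a short sketch. This is not the AFF4 layout itself (AFF4 names streams by URI and stores structured metadata alongside them); the member names, JSON side-file and hashing scheme here are assumptions for illustration only:

```python
import hashlib
import json
import zipfile

def store_evidence(container_path, name, data, metadata):
    """Append an evidence stream and its metadata to a ZIP container.

    A ZIP-based container lets any standard ZIP tool list and extract
    the streams, which is the interoperability point the format makes.
    """
    record = dict(metadata)
    record["sha256"] = hashlib.sha256(data).hexdigest()
    with zipfile.ZipFile(container_path, "a") as zf:
        zf.writestr(name, data)
        zf.writestr(name + ".info.json", json.dumps(record))

def load_evidence(container_path, name):
    """Read a stream back and verify it against its stored hash."""
    with zipfile.ZipFile(container_path, "r") as zf:
        data = zf.read(name)
        record = json.loads(zf.read(name + ".info.json"))
    if hashlib.sha256(data).hexdigest() != record["sha256"]:
        raise ValueError("evidence stream corrupted")
    return data, record
```

Because ZIP appends new members rather than rewriting the archive, evidence and analysis results from different tools can accumulate in one container over time.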
Abstract:
Since 1995 the International Alliance for Interoperability (buildingSMART) has developed a robust standard called the Industry Foundation Classes (IFC). IFC is an object-oriented data model with a related file format that has facilitated the efficient exchange of data in the development of building information models (BIM). The Cooperative Research Centre for Construction Innovation has contributed to the international effort in the development of the IFC standard, and specifically the reinforced concrete part of the latest IFC 2x3 release. The Industry Foundation Classes have been endorsed by the International Organization for Standardization as a Publicly Available Specification (PAS) under the label ISO/PAS 16739. For more details, go to http://www.tc184-sc4.org/About_TC184-SC4/About_SC4_Standards/ The current IFC model covers the building itself to a useful level of detail. The next stage of development for the IFC standard is where the building meets the ground (terrain), and civil and external works such as pavements, retaining walls, bridges and tunnels. With the current focus in Australia on infrastructure projects over the next 20 years, a logical extension to this standard was in the area of site and civil works. This proposal recognises that there is an existing body of work on the specification of road representation data. In particular, LandXML is recognised, as are TransXML in the broader context of transportation and CityGML in the common interfacing of city maps, buildings and roads. Examination of interfaces between IFC and these specifications is therefore within the scope of this project. That such interfaces can be developed has already been demonstrated in principle within the IFC for Geographic Information Systems (GIS) project. National road standards that are already in use should be carefully analysed, and contacts established, in order to gain from this knowledge.
The Object Catalogue for the Road Transport Sector (OKSTRA) should be noted as an example. It is also noted that buildingSMART Norway has submitted a proposal.
Abstract:
The generic IS-success constructs first identified by DeLone and McLean (1992) continue to be widely employed in research. Yet recent work by Petter et al. (2007) has cast doubt on the validity of many mainstream constructs employed in IS research over the past three decades, critiquing the almost universal conceptualization and validation of these constructs as reflective when in many studies the measures appear to have been implicitly operationalized as formative. Cited examples of proper specification of the DeLone and McLean constructs are few, particularly in light of their extensive employment in IS research. This paper introduces a four-stage formative construct development framework: Conceive > Operationalize > Respond > Validate (CORV). Employing the CORV framework in an archival analysis of research published in top outlets from 1985 to 2007, the paper explores the extent of possible problems with past IS research due to potential misspecification of the four application-related success dimensions: Individual-Impact, Organizational-Impact, System-Quality and Information-Quality. Results suggest major concerns where there is a mismatch between the Respond and Validate stages. A general dearth of attention to the Operationalize and Respond stages in methodological writings is also observed.
Abstract:
Current regulatory requirements on data privacy make it increasingly important for enterprises to be able to verify and audit their compliance with their privacy policies. Traditionally, a privacy policy is written in a natural language. Such policies inherit the potential ambiguity, inconsistency and misinterpretation of natural text. Hence, formal languages are emerging to allow a precise specification of enforceable privacy policies that can be verified. The EP3P language is one such formal language. An EP3P privacy policy of an enterprise consists of many rules. Given the semantics of the language, there may exist rules in the ruleset which can never be used; these rules are referred to as redundant rules. Redundancies adversely affect privacy policies in several ways. Firstly, redundant rules reduce the efficiency of operations on privacy policies. Secondly, they may misdirect the policy auditor when determining the outcome of a policy. Therefore, in order to address these deficiencies, it is important to identify and resolve redundancies. This thesis introduces the concept of a minimal privacy policy: a policy that is free of redundancy. The essential component for maintaining the minimality of privacy policies is to determine the effects of the rules on each other. Hence, redundancy detection and resolution frameworks are proposed. Pair-wise redundancy detection is the central concept in these frameworks, and it suggests a pair-wise comparison of the rules in order to detect redundancies. In addition, the thesis introduces a policy management tool that assists policy auditors in performing several operations on an EP3P privacy policy while maintaining its minimality. Formal results comparing alternative notions of redundancy, and how these would affect the tool, are also presented.
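The pair-wise comparison at the heart of the detection frameworks can be sketched as follows. The rule structure here is a simplification invented for illustration (real EP3P rules also carry conditions and obligations, which make coverage checking harder than simple set inclusion):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    """Simplified privacy rule: which users may perform which
    purposes on which data, with an allow/deny effect."""
    users: frozenset
    data: frozenset
    purposes: frozenset
    effect: str  # "allow" or "deny"

def covers(a, b):
    """True if rule `a` makes rule `b` redundant: same effect, and
    `a` applies to every request that `b` applies to."""
    return (a.effect == b.effect
            and a.users >= b.users
            and a.data >= b.data
            and a.purposes >= b.purposes)

def redundant_rules(ruleset):
    """Pair-wise detection: return the rules covered by another rule.
    When two rules cover each other, only the later one is flagged."""
    out = []
    for i, b in enumerate(ruleset):
        for j, a in enumerate(ruleset):
            if i == j:
                continue
            if covers(a, b) and (not covers(b, a) or j < i):
                out.append(b)
                break
    return out
```

Removing every flagged rule yields a policy in which no remaining rule is subsumed by another, i.e. a minimal policy in the sense of the thesis.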
Abstract:
The protection of privacy has gained considerable attention recently. In response to this, new privacy protection systems are being introduced. SITDRM is one such system that protects private data through the enforcement of licenses provided by consumers. Prior to supplying data, data owners are expected to construct a detailed license for the potential data users. A license specifies who may have what type of access to the protected data, and under what conditions. The specification of a license by a data owner binds the enterprise data handling to the consumer's privacy preferences. However, licenses are very detailed, may reveal the internal structure of the enterprise, and need to be kept synchronised with the enterprise privacy policy. To deal with this, we employ the Platform for Privacy Preferences Language (P3P) to communicate enterprise privacy policies to consumers and enable them to easily construct data licenses. A P3P policy is more abstract than a license, allows data owners to specify the purposes for which data are being collected, and directly reflects the privacy policy of an enterprise.
Abstract:
Context: The School of Information Technology at QUT has recently undertaken a major restructuring of its Bachelor of Information Technology (BIT) course. The aims of this restructuring include reducing first-year attrition and providing an attractive degree course that meets both student and industry expectations. Emphasis has been placed on the first semester, in the context of retaining students, by introducing a set of four units that complement one another and provide introductory material on technology, programming and related skills, and generic skills that will aid the students throughout their undergraduate course and in their careers. This discussion relates to one of these four first-semester units, namely Building IT Systems. The aim of this unit is for students to create small Information Technology (IT) systems that use programming or scripting and databases, either as standalone applications or as web applications. In the prior history of teaching introductory computer programming at QUT, programming was taught as a stand-alone subject, and the integration of computer applications with other systems such as databases and networks was not undertaken until students had been given a thorough grounding in those topics as well. Feedback has indicated that students do not believe that working with a database requires programming skills. In fact, the teaching of the building blocks of computer applications has been compartmentalised, with topics taught in isolation from each other. The teaching of introductory computer programming has been an industry requirement of IT degree courses, as many jobs require at least some knowledge of the topic. Yet computer programming is not a skill that all students have equal capabilities of learning (Bruce et al., 2004), and this is clearly shown by the volume of publications dedicated to this topic in the literature over a broad period of time (Eckerdal & Berglund, 2005; Mayer, 1981; Winslow, 1996).
The teaching of this introductory material has been done in much the same way over the past thirty years. During the period in which introductory computer programming courses have been taught at QUT, a number of different programming languages and programming paradigms have been used, and different approaches to teaching and learning have been attempted in an effort to find the golden thread that would allow students to learn this complex topic. Unfortunately, computer programming is not a skill that can be learnt in one semester. Some basics can be learnt, but the skill can take many years to master (Norvig, 2001). Faculty data have typically shown a bimodal distribution of results for students undertaking introductory programming courses, with a high proportion of students receiving a high mark and a high proportion receiving a low or failing mark. This indicates that there are students who understand and excel with the introductory material, while another group struggle to understand the concepts and practices required to translate a specification or problem statement into a computer program that achieves what is being requested. The consequence of a large group of students failing the introductory programming course has been a high level of attrition amongst first-year students. This attrition level does not provide good continuity in student numbers in later years of the degree program, and the current approach is not seen as sustainable.
Abstract:
This research investigates wireless intrusion detection techniques for detecting attacks on IEEE 802.11i Robust Secure Networks (RSNs). Despite using a variety of comprehensive preventative security measures, RSNs remain vulnerable to a number of attacks. The failure of preventative measures to address all RSN vulnerabilities dictates the need for a comprehensive monitoring capability to detect all attacks on RSNs and also to proactively address potential security vulnerabilities by detecting security policy violations in the WLAN. This research proposes novel wireless intrusion detection techniques to address these monitoring requirements, and also studies the correlation of the generated alarms across wireless intrusion detection system (WIDS) sensors, and across the detection techniques themselves, for greater reliability and robustness. The specific outcomes of this research are:
• A comprehensive review of the outstanding vulnerabilities and attacks in IEEE 802.11i RSNs.
• A comprehensive review of the wireless intrusion detection techniques currently available for detecting attacks on RSNs.
• Identification of the drawbacks and limitations of the currently available wireless intrusion detection techniques in detecting attacks on RSNs.
• Development of three novel wireless intrusion detection techniques for detecting RSN attacks and security policy violations in RSNs.
• Development of algorithms for each novel intrusion detection technique to correlate alarms across the distributed sensors of a WIDS.
• Development of an algorithm for automatic attack scenario detection using cross-detection-technique correlation.
• Development of an algorithm to automatically assign priority to the detected attack scenario using cross-detection-technique correlation.
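The cross-sensor alarm correlation idea can be illustrated with a minimal sketch. The alarm fields, time window and confidence labels below are assumptions invented for illustration, not the thesis's actual algorithms:

```python
def correlate(alarms, window=5.0):
    """Merge alarms that report the same attack on the same target
    within `window` seconds into incidents, across WIDS sensors.

    An incident confirmed by several distinct sensors is labelled
    higher-confidence than one seen by a single sensor.
    """
    incidents = []
    for alarm in sorted(alarms, key=lambda a: a["time"]):
        for inc in incidents:
            if (inc["attack"] == alarm["attack"]
                    and inc["target"] == alarm["target"]
                    and alarm["time"] - inc["last"] <= window):
                inc["sensors"].add(alarm["sensor"])
                inc["last"] = alarm["time"]
                break
        else:
            incidents.append({"attack": alarm["attack"],
                              "target": alarm["target"],
                              "sensors": {alarm["sensor"]},
                              "last": alarm["time"]})
    for inc in incidents:
        inc["confidence"] = "high" if len(inc["sensors"]) > 1 else "low"
    return incidents
```

Correlating across detection techniques (rather than sensors) would follow the same grouping pattern, keyed on which technique raised each alarm.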
Abstract:
Privacy enhancing protocols (PEPs) are a family of protocols that allow secure exchange and management of sensitive user information. They are important in preserving users’ privacy in today’s open environment. Proof of the correctness of PEPs is necessary before they can be deployed. However, the traditional provable security approach, though well established for verifying cryptographic primitives, is not applicable to PEPs. We apply the formal method of Coloured Petri Nets (CPNs) to construct an executable specification of a representative PEP, namely the Private Information Escrow Bound to Multiple Conditions Protocol (PIEMCP). Formal semantics of the CPN specification allow us to reason about various security properties of PIEMCP using state space analysis techniques. This investigation provides us with preliminary insights for modeling and verification of PEPs in general, demonstrating the benefit of applying the CPN-based formal approach to proving the correctness of PEPs.
Abstract:
The overall research aims to develop a standardised instrument to measure the impacts resulting from contemporary Information Systems (IS). The research adopts the IS-Impact measurement model, introduced by Gable et al. (2008), as its theoretical foundation, and applies the extension strategy described by Berthon et al. (2002), extending both the theory and the context, where the new context is the Human Resource (HR) system. The research will be conducted in two phases, the exploratory phase and the specification phase. The purpose of this paper is to present the findings of the exploratory phase, in which 134 respondents from a major Australian university were involved. The findings support the credibility of most of the existing IS-Impact model. However, some textual data may suggest new measures for the IS-Impact model, while the low response rate for, or avoidance of, some measures may suggest their elimination from the model.
Abstract:
Standards are designed to promote the interoperability of products and systems by enabling different parties to develop technologies that can be used together. There is an increasing expectation in many technical communities, including open source communities, that standards will be ‘open’. However, standards are subject to legal rights which impact not only their development but also their implementation. Of central importance are intellectual property rights: technical standards may incorporate patented technologies, while the specification documents of standards are protected by copyright. This article provides an overview of the processes by which standards are developed and considers the concept of ‘interoperability’, the meaning of the term ‘open standard’ and how open standards contribute to interoperability. It explains how intellectual property rights operate in relation to standards and how they can be managed to create standards that are open, not only during their development, but also in implementation.
Abstract:
This paper reports on the development of specifications for an on-board mass monitoring (OBM) application for regulatory requirements in Australia. An earlier paper reported on the feasibility study and pilot testing program conducted prior to the specification development [1]. Lessons from the pilot were used to refine the testing process, and a full-scale testing program was conducted from July to October 2008. The results from the full-scale test and their evidentiary implications are presented in this report. The draft specification for an evidentiary on-board mass monitoring application is currently under development.
Abstract:
A common optometric problem is to specify the eye's ocular aberrations in terms of Zernike coefficients and to reduce that specification to a prescription for the optimum sphero-cylindrical correcting lens. The typical approach is first to reconstruct wavefront phase errors from measurements of wavefront slopes obtained by a wavefront aberrometer. This paper applies a new method to this clinical problem that does not require wavefront reconstruction. Instead, we base our analysis on axial wavefront vergence as inferred directly from wavefront slopes. The result is a wavefront vergence map that is similar to the axial power maps in corneal topography and hence has the potential to be favoured by clinicians. We use our new set of orthogonal Zernike slope polynomials to systematically analyse details of the vergence map, analogous to Zernike analysis of wavefront maps. The result is a vector of slope coefficients that describes fundamental aberration components. Three different methods for reducing slope coefficients to a sphero-cylindrical prescription in power vector form are compared and contrasted. When the original wavefront contains only second-order aberrations, the vergence map is a function of meridian only and the power vectors from all three methods are identical. Differences between the methods begin to appear as higher-order aberrations are included, in which case the wavefront vergence map is more complicated. Finally, we discuss the advantages and limitations of the vergence map representation of ocular aberrations.
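For the second-order-only case on which all three methods agree, the reduction to a power vector has a well-known closed form. The sketch below shows that standard conversion from Zernike wavefront coefficients to (M, J0, J45); it is not the paper's slope-based derivation, which reaches the same second-order result by a different route:

```python
import math

def power_vector(c2m2, c20, c2p2, pupil_radius_mm):
    """Convert second-order Zernike wavefront coefficients (in
    microns, OSA/ANSI normalisation) into a sphero-cylindrical
    power vector (M, J0, J45) in dioptres.

    c2m2, c20, c2p2 are the coefficients of Z(2,-2), Z(2,0), Z(2,2);
    pupil_radius_mm is the analysis pupil radius in millimetres.
    """
    r2 = pupil_radius_mm ** 2
    M = -4.0 * math.sqrt(3.0) * c20 / r2    # spherical equivalent
    J0 = -2.0 * math.sqrt(6.0) * c2p2 / r2   # with/against-the-rule astigmatism
    J45 = -2.0 * math.sqrt(6.0) * c2m2 / r2  # oblique astigmatism
    return M, J0, J45
```

From the power vector, the conventional prescription follows as sphere = M + cylinder/2, cylinder = -2*sqrt(J0**2 + J45**2), axis = atan2(J45, J0)/2.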
Abstract:
This article presents a survey of authorisation models and considers their ‘fitness-for-purpose’ in facilitating information sharing. Network-supported information sharing is an important technical capability that underpins collaboration in support of dynamic and unpredictable activities such as emergency response, national security, infrastructure protection, supply chain integration and emerging business models based on the concept of a ‘virtual organisation’. The article argues that present authorisation models are inflexible and poorly scalable in such dynamic environments because they assume that the future needs of the system can be predicted, which in turn justifies the use of persistent authorisation policies. The article outlines the motivation and requirements for a new flexible authorisation model that addresses the needs of information sharing. It proposes that a flexible and scalable authorisation model must allow an explicit specification of the objectives of the system, and that access decisions be made through a late trade-off analysis between these explicit objectives. A research agenda for the proposed Objective-based Access Control concept is presented.
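The late trade-off between explicit objectives can be caricatured in a few lines: rather than matching a request against a fixed policy, each request is scored against weighted objective functions at decision time. The objectives, weights and threshold below are invented for illustration and are not from the article:

```python
def decide(request, objectives, threshold=0.0):
    """Late trade-off access decision: grant the request if the
    weighted sum of objective utilities meets the threshold.

    `objectives` is a list of (weight, utility_function) pairs,
    evaluated against the request only when the decision is needed.
    """
    score = sum(weight * utility(request) for weight, utility in objectives)
    return score >= threshold

# Hypothetical objectives for an emergency-response scenario:
# sharing utility rewards access during an active incident, while
# a confidentiality objective penalises releasing sensitive data.
objectives = [
    (1.0, lambda r: 1.0 if r["incident_active"] else 0.0),
    (-1.5, lambda r: r["sensitivity"]),  # 0.0 (public) .. 1.0 (secret)
]
```

The point of the sketch is that no persistent allow/deny rule exists for any particular subject or object; changing the system's objectives immediately changes every subsequent decision.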