Abstract:
I am suspicious of tools without a purpose - tools that are not developed in response to a clearly defined problem. Of course tools without a purpose can still be useful. However, the development of first-generation CAD was seriously impeded because the solution came before the problem. We are in danger of repeating this mistake if we do not clarify the nature of the problem that we are trying to solve with the next generation of tools. Back in the 1980s I used to add a postscript slide at the end of CAD conference presentations, and the applause would invariably turn to concern. The slide simply asked: can anyone remember what it was about design that needed aiding before we had computer-aided design?
Abstract:
John Frazer's architectural work is inspired by living and generative processes. Both evolutionary and revolutionary, it explores information ecologies and the dynamics of the spaces between objects. Fuelled by an interest in the cybernetic work of Gordon Pask and Norbert Wiener, and the possibilities of the computer and the "new science" it has facilitated, Frazer and his team of collaborators have conducted a series of experiments that utilize genetic algorithms, cellular automata, emergent behaviour, complexity and feedback loops to create a truly dynamic architecture. Frazer studied at the Architectural Association (AA) in London from 1963 to 1969, and later became unit master of Diploma Unit 11 there. He was subsequently Director of Computer-Aided Design at the University of Ulster - a post he held while writing An Evolutionary Architecture in 1995 - and a lecturer at the University of Cambridge. In 1983 he co-founded Autographics Software Ltd, which pioneered microprocessor graphics. Frazer was awarded a personal chair at the University of Ulster in 1984. In Frazer's hands, architecture becomes machine-readable, formally open-ended and responsive. His work as computer consultant to Cedric Price's Generator Project of 1976 (see p84) led to the development of a series of tools and processes; these have resulted in projects such as the Calbuild Kit (1985) and the Universal Constructor (1990). These subsequent computer-orientated architectural machines are makers of architectural form beyond the full control of the architect-programmer. Frazer makes much reference to the multi-celled relationships found in nature, and their ongoing morphosis in response to continually changing contextual criteria.
He defines the elements that describe his evolutionary architectural model thus: "A genetic code script, rules for the development of the code, mapping of the code to a virtual model, the nature of the environment for the development of the model and, most importantly, the criteria for selection." In setting out these parameters for designing evolutionary architectures, Frazer goes beyond the usual notions of architectural beauty and aesthetics. Nevertheless his work is not without an aesthetic: some pieces are a frenzy of mad wire, while others have a modularity that is reminiscent of biological form. Algorithms form the basis of Frazer's designs. These algorithms determine a variety of formal results dependent on the nature of the information they are given. His work, therefore, is always dynamic, always evolving and always different. Designing with algorithms is also critical to other architects featured in this book, such as Marcos Novak (see p150). Frazer has made an unparalleled contribution to defining architectural possibilities for the twenty-first century, and remains an inspiration to architects seeking to create responsive environments. Architects were initially slow to pick up on the opportunities that the computer provides. These opportunities are both representational and spatial: computers can help architects draw buildings and, more importantly, they can help architects create varied spaces, both virtual and actual. Frazer's work was groundbreaking in this respect, and well before its time.
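The elements Frazer lists - code script, development rules, a mapped virtual model, an environment and selection criteria - are the standard ingredients of a genetic algorithm. The following is a minimal sketch of that loop, not Frazer's actual system: the target profile standing in for the environment, and the decoding of genes into module heights, are invented for illustration.

```python
import random

random.seed(42)

TARGET = [3, 1, 4, 1, 5, 2, 6, 3]  # assumed environmental criterion (illustrative)
GENES = len(TARGET)

def develop(code):
    """Rules for development: map the genetic code to a virtual model (heights)."""
    return [g % 8 for g in code]

def fitness(code):
    """Selection criterion: closeness of the developed model to the target."""
    return -sum(abs(m - t) for m, t in zip(develop(code), TARGET))

def evolve(generations=200, pop_size=30):
    pop = [[random.randrange(8) for _ in range(GENES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # selection
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, GENES)
            child = a[:cut] + b[cut:]             # crossover
            if random.random() < 0.2:             # mutation
                child[random.randrange(GENES)] = random.randrange(8)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(develop(best))
```

The point of the sketch is the division of labour Frazer describes: the designer sets the code, the rules and the criteria, while the form itself emerges beyond direct control.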
Abstract:
The majority of the world's citizens now live in cities. Although urban planning can thus be thought of as a field with significant ramifications for the human condition, many practitioners feel that it has reached a crossroads in thought leadership between traditional practice and a new, more participatory and open approach. Conventional ways to engage people in participatory planning exercises are limited in reach and scope. At the same time, socio-cultural trends and technology innovation offer opportunities to re-think the status quo in urban planning. Neogeography introduces tools and services that allow non-geographers to use advanced geographical information systems. Similarly, is there potential for the emergence of a neo-planning paradigm in which urban planning is carried out through active civic engagement aided by Web 2.0 and new media technologies, thus redefining the role of practicing planners? This paper traces a number of evolving links between urban planning, neogeography, and information and communication technology. Two significant trends - participation and visualisation - with direct implications for urban planning are discussed. Combining advanced participation and visualisation features, the popular virtual reality environment Second Life is then introduced as a test bed to explore a planning workshop and an integrated software event framework to assist narrative generation. We discuss an approach to harness and analyse narratives using virtual reality logging to make transparent how users understand and interpret proposed urban designs.
Abstract:
This paper presents aspects of a longitudinal study in the design and practice of Internet meetings between farmers, their advisors and researchers in rural Australia. It reports on the use of Microsoft NetMeeting (NM) by a group of agricultural researchers from Australia's CSIRO (Commonwealth Scientific and Industrial Research Organisation) for regular meetings, over nine years, with farmers and their commercial advisers. It describes lessons drawn from this experience about the conditions under which telecollaborative tools, such as NM and video conferencing, are likely to be both useful and used.
Abstract:
Denial-of-service attacks (DoS) and distributed denial-of-service attacks (DDoS) attempt to temporarily disrupt users or computer resources, causing service unavailability to legitimate users in the internetworking system. The most common type of DoS attack occurs when adversaries flood a large amount of bogus data to interfere with or disrupt the service on the server. The attack can be either a single-source attack, which originates at only one host, or a multi-source attack, in which multiple hosts coordinate to flood a large number of packets to the server. Cryptographic mechanisms in authentication schemes are one approach to helping the server validate malicious traffic. Since authentication in key establishment protocols requires the verifier to spend some resources before successfully detecting bogus messages, adversaries might be able to exploit this flaw to mount an attack that overwhelms the server's resources. The attacker is able to perform this kind of attack because many key establishment protocols incorporate strong authentication at the beginning phase, before they can identify the attacks. This is an example of the DoS threats in most key establishment protocols: they have been implemented to support confidentiality and data integrity, but do not carefully consider other security objectives, such as availability. The main objective of this research is to design denial-of-service-resistant mechanisms in key establishment protocols. In particular, we focus on the design of cryptographic protocols related to key establishment protocols that implement client puzzles to protect the server against resource exhaustion attacks. Another objective is to extend formal analysis techniques to include DoS resistance. Basically, the formal analysis approach is used not only to analyse and verify the security of a cryptographic scheme carefully, but also to help in the design stage of new protocols with a high level of security guarantee.
In this research, we focus on an analysis technique using Meadows' cost-based framework, and we implement a DoS-resistant model using Coloured Petri Nets. Meadows' cost-based framework was proposed directly to assess denial-of-service vulnerabilities in cryptographic protocols using mathematical proof, while Coloured Petri Nets are used to model and verify communication protocols using interactive simulations. In addition, Coloured Petri Nets are able to help the protocol designer to clarify and reduce some inconsistencies in the protocol specification. Therefore, the second objective of this research is to explore vulnerabilities in existing DoS-resistant protocols, as well as to extend a formal analysis approach into our new framework for improving DoS resistance and evaluating the performance of the newly proposed mechanism. In summary, the specific outcomes of this research include the following results:
1. A taxonomy of denial-of-service-resistant strategies and techniques used in key establishment protocols;
2. A critical analysis of existing DoS-resistant key exchange and key establishment protocols;
3. An implementation of Meadows's cost-based framework using Coloured Petri Nets for modelling and evaluating DoS-resistant protocols; and
4. The development of new, efficient and practical DoS-resistant mechanisms to improve the resistance to denial-of-service attacks in key establishment protocols.
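The client-puzzle idea the abstract relies on can be illustrated with a minimal hash-based sketch: the server issues a cheap-to-verify puzzle, and the client must spend work (proportional to the difficulty) before the server commits any expensive authentication resources. The construction below is a generic proof-of-work illustration, not the specific mechanism of this thesis; parameter names are invented.

```python
import hashlib
import os

def make_puzzle(difficulty_bits=12):
    """Server side: issue a fresh random nonce; generation and later
    verification cost only one hash each."""
    return os.urandom(16), difficulty_bits

def leading_zero_bits(digest):
    """Count leading zero bits of a hash digest."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        bits += 8 - byte.bit_length()  # zeros in the first non-zero byte
        break
    return bits

def solve_puzzle(nonce, difficulty_bits):
    """Client side: brute-force search; expected cost ~2**difficulty_bits hashes."""
    counter = 0
    while True:
        candidate = nonce + counter.to_bytes(8, "big")
        if leading_zero_bits(hashlib.sha256(candidate).digest()) >= difficulty_bits:
            return counter
        counter += 1

def verify(nonce, difficulty_bits, solution):
    """Server side: one hash suffices to check the claimed solution."""
    candidate = nonce + solution.to_bytes(8, "big")
    return leading_zero_bits(hashlib.sha256(candidate).digest()) >= difficulty_bits

nonce, k = make_puzzle(12)
sol = solve_puzzle(nonce, k)
print(verify(nonce, k, sol))
```

The asymmetry - cheap to issue and verify, costly to solve - is what lets a server under attack ration its resources before running the protocol's expensive authentication steps.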
Abstract:
Health Information Systems (HIS) make extensive use of Information and Communication Technologies (ICT). The use of ICT aids in improving the quality and efficiency of healthcare services by making healthcare information available at the point of care (Goldstein, Groen, Ponkshe, and Wine, 2007). The increasing availability of healthcare data presents security and privacy issues which have not yet been fully addressed (Liu, Caelli, May, and Croll, 2008a). Healthcare organisations have to comply with the security and privacy requirements stated in laws, regulations and ethical standards, while managing healthcare information. Protecting the security and privacy of healthcare information is a very complex task (Liu, May, Caelli and Croll, 2008b). In order to simplify the complexity of providing security and privacy in HIS, appropriate information security services and mechanisms have to be implemented. Solutions at the application layer have already been implemented in HIS such as those existing in healthcare web services (Weaver et al., 2003). In addition, Discretionary Access Control (DAC) is the most commonly implemented access control model to restrict access to resources at the OS layer (Liu, Caelli, May, Croll and Henricksen, 2007a). Nevertheless, the combination of application security mechanisms and DAC at the OS layer has been stated to be insufficient in satisfying security requirements in computer systems (Loscocco et al., 1998). This thesis investigates the feasibility of implementing Security Enhanced Linux (SELinux) to enforce a Role-Based Access Control (RBAC) policy to help protect resources at the Operating System (OS) layer. SELinux provides Mandatory Access Control (MAC) mechanisms at the OS layer. These mechanisms can contain the damage from compromised applications and restrict access to resources according to the security policy implemented. 
The main contribution of this research is to provide a modern framework to implement and manage SELinux in HIS. The proposed framework introduces SELinux Profiles to restrict access permissions over the system resources to authorised users. The feasibility of using SELinux profiles in HIS was demonstrated through the creation of a prototype, which was subjected to various attack scenarios. The prototype was also subjected to testing during emergency scenarios, where changes to the security policies had to be made on the spot. Attack scenarios were based on vulnerabilities common at the application layer. SELinux demonstrated that it could effectively contain attacks at the application layer and provide adequate flexibility during emergency situations. However, even with the use of current tools, the development of SELinux policies can be very complex. Further research has to be made in order to simplify the management of SELinux policies and access permissions. In addition, SELinux-related technologies, such as the Policy Management Server by Tresys Technologies, need to be researched in order to provide solutions at different layers of protection.
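The RBAC idea underlying the framework can be shown with a toy model (this is an illustration of the access-control concept, not SELinux itself or the thesis's profiles): users hold roles, roles carry permissions over resource types, and anything not explicitly allowed is denied - the default-deny stance that mandatory access control enforces. All role and resource names below are invented.

```python
# Role -> set of (resource_type, action) permissions (hypothetical policy).
ROLE_PERMISSIONS = {
    "doctor":  {("patient_record", "read"), ("patient_record", "write")},
    "nurse":   {("patient_record", "read")},
    "auditor": {("audit_log", "read")},
}

# User -> set of assigned roles (hypothetical assignments).
USER_ROLES = {
    "alice": {"doctor"},
    "bob":   {"nurse"},
}

def access_allowed(user, resource, action):
    """Default-deny: access is granted only if some role of the user
    explicitly carries the (resource, action) permission."""
    for role in USER_ROLES.get(user, set()):
        if (resource, action) in ROLE_PERMISSIONS.get(role, set()):
            return True
    return False

print(access_allowed("alice", "patient_record", "write"))  # doctor may write
print(access_allowed("bob", "patient_record", "write"))    # nurse may not
```

In real SELinux the policy is compiled and enforced in the kernel, so a compromised application cannot bypass it the way it could bypass checks coded inside the application itself - which is the containment property the thesis tests.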
Abstract:
Engineering graduates of today face a working environment that assumes global mobility in the labour market. This challenge creates a demand, amongst universities worldwide, to increase the globalisation of educational programs and context, and to increase and support the mobility of students through mechanisms such as student exchange and double masters degrees. Engineering student mobility from Australia is low, with only a few Engineering Faculties encouraging students to go internationally. This comparative study of universities in Australia and Europe, drawing on feedback from students who have been on exchange or propose to go on exchange, from employers and from faculty, addresses the motivators and barriers to student mobility and exchange from the perspectives of the university, faculty, students and employers. Recommendations will be presented on how student mobility and exchange can be improved, and how mechanisms such as double Masters Degrees, dual accreditation and Erasmus Mundus 2009-2013 can be utilised to improve student mobility.
Abstract:
The building life cycle process is complex and prone to fragmentation as it moves through its various stages. The number of participants, and the diversity, specialisation and isolation both in space and time of their activities, have dramatically increased over time. The data generated within the construction industry has become increasingly overwhelming. Most currently available computer tools for the building industry have offered productivity improvements in the transmission of graphical drawings and textual specifications, without addressing more fundamental changes in building life cycle management. Facility managers and building owners are primarily concerned with highlighting areas of existing or potential maintenance problems in order to be able to improve building performance, satisfying occupants and minimising turnover, especially the operational cost of maintenance. In doing so, they collect large amounts of data that is stored in the building's maintenance database. The work described in this paper is targeted at adding value to the design and maintenance of buildings by turning maintenance data into information and knowledge. Data mining technology presents an opportunity to significantly increase the rate at which the volumes of data generated through the maintenance process can be turned into useful information. This can be done using classification algorithms to discover patterns and correlations within a large volume of data. This paper presents how data mining techniques can be applied to the maintenance data of buildings to identify the impediments to better performance of building assets. It demonstrates what sorts of knowledge can be found in maintenance records. The benefits to the construction industry lie in turning passive data in databases into knowledge that can improve the efficiency of the maintenance process and of future designs that incorporate that maintenance knowledge.
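A small sketch of the kind of pattern discovery the paper describes: mining maintenance records for attribute combinations that recur above a support threshold. The records, field names and threshold below are invented for illustration; the paper itself applies classification algorithms to a real maintenance database.

```python
from collections import Counter

# Hypothetical maintenance records; a real study would read these from
# the building's maintenance database.
records = [
    {"asset": "HVAC", "fault": "leak", "season": "summer"},
    {"asset": "HVAC", "fault": "leak", "season": "summer"},
    {"asset": "HVAC", "fault": "electrical", "season": "winter"},
    {"asset": "lift", "fault": "electrical", "season": "winter"},
    {"asset": "lift", "fault": "electrical", "season": "winter"},
]

def frequent_pairs(records, min_support=2):
    """Count co-occurring (field=value, field=value) pairs across records
    and keep those meeting the minimum support."""
    counts = Counter()
    for rec in records:
        items = sorted(f"{k}={v}" for k, v in rec.items())
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                counts[(items[i], items[j])] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

patterns = frequent_pairs(records)
print(patterns)
```

Even this crude frequency count surfaces the sort of correlation (e.g. a fault type clustering on one asset class or season) that would feed back into maintenance planning and future designs.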
Abstract:
Sleeper is an 18'00" musical work for live performer and laptop computer which exists as both a live performance work and a recorded work for audio CD. The work has been presented at a range of international performance events and survey exhibitions. These include the 2003 International Computer Music Conference (Singapore) where it was selected for CD publication, Variable Resistance (San Francisco Museum of Modern Art, USA), and i.audio, a survey of experimental sound at the Performance Space, Sydney. The source sound materials are drawn from field recordings made in acoustically resonant spaces in the Australian urban environment, amplified and acoustic instruments, radio signals, and sound synthesis procedures. The processing techniques blur the boundaries between, and exploit, the perceptual ambiguities of de-contextualised and processed sound. The work thus challenges the arbitrary distinctions between sound, noise and music and attempts to reveal the inherent musicality in so-called non-musical materials via digitally re-processed location audio. Thematically the work investigates Paul Virilio’s theory that technology ‘collapses space’ via the relationship of technology to speed. Technically this is explored through the design of a music composition process that draws upon spatially and temporally dispersed sound materials treated using digital audio processing technologies. One of the contributions to knowledge in this work is a demonstration of how disparate materials may be employed within a compositional process to produce music through the establishment of musically meaningful morphological, spectral and pitch relationships. This is achieved through the design of novel digital audio processing networks and a software performance interface. 
The work explores, tests and extends the music perception theories of 'reduced listening' (Schaeffer, 1967) and 'surrogacy' (Smalley, 1997), by demonstrating how, through specific audio processing techniques, sounds may be shifted away from 'causal' listening contexts towards abstract aesthetic listening contexts. In doing so, it demonstrates how various time and frequency domain processing techniques may be used to achieve this shift.
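A minimal frequency-domain sketch of the kind of processing described: transform a short signal with a discrete Fourier transform, smear its spectral detail, and resynthesise. The signal and the crude neighbour-averaging "blur" are illustrative only - they are not the piece's actual processing networks - but they show the basic mechanism by which spectral manipulation can weaken the cues that causal listening relies on.

```python
import cmath
import math

def dft(signal):
    """Naive discrete Fourier transform (fine for short illustrative signals)."""
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def idft(spectrum):
    """Inverse DFT, returning the real part of each sample."""
    n = len(spectrum)
    return [sum(spectrum[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

def spectral_blur(spectrum):
    """Average each bin with its neighbours, smearing spectral detail."""
    n = len(spectrum)
    return [(spectrum[k - 1] + spectrum[k] + spectrum[(k + 1) % n]) / 3
            for k in range(n)]

n = 64
tone = [math.sin(2 * math.pi * 5 * t / n) for t in range(n)]  # a pure tone
blurred = idft(spectral_blur(dft(tone)))
print(len(blurred))
```

Real-time versions of such transforms (typically FFT-based) are the building blocks of the processing networks and performance interface the work describes.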
Abstract:
Existing, widely known environmental assessment models, primarily those for Life Cycle Assessment of manufactured products and buildings, were reviewed to grasp their characteristics, since the past several years have seen a significant increase in interest and research activity in the development of building environmental assessment methods. Each method or tool was assessed under the headings of description, data requirement, end-use, assessment criteria (scale of assessment and scoring/weighting system) and present status.
Abstract:
The construction industry is categorised as an information-intensive industry and described as one of the most important industries in any developed country, facing a period of rapid and unparalleled change (Industry Science Resources 1999; Love P.E.D., Tucker S.N. et al. 1996). Project communications are becoming increasingly complex, with a growing need and fundamental drive to collaborate electronically at project level and beyond (Olesen K. and Myers M.D. 1999; Thorpe T. and Mead S. 2001; CITE 2003). Yet the industry is also identified as having a considerable lack of knowledge and awareness about innovative information and communication technology (ICT) and web-based communication processes, systems and solutions which may prove beneficial in the procurement, delivery and life cycle of projects (NSW Government 1998; Kajewski S. and Weippert A. 2000). The Internet has arguably revolutionised the way in which information is stored, exchanged and viewed, opening new avenues for business which only a decade ago were deemed almost inconceivable (DCITA 1998; IIB 2002). In an attempt to put these 'new avenues of business' into perspective, this report provides an overall 'snapshot' of current public and private construction industry sector opportunities and practices in the implementation and application of web-based ICT tools, systems and processes (e-Uptake). Research found that even with a reserved uptake, the construction industry and its participating organisations are making concerted efforts (fortunately with positive results) in taking up innovative forms of doing business via the internet, including e-Tendering (making it possible to manage the entire tender-letting process electronically and online) (Anumba C.J. and Ruikar K. 2002; ITCBP 2003).
Furthermore, Government (often a key client within the construction industry), with its increased tendency to transact its business electronically, undoubtedly has an effect on how various private industry consultants, contractors, suppliers, etc. do business (Murray M. 2003) - by offering a wide range of (current and anticipated) e-facilities and services, including e-Tendering (Ecommerce 2002). Overall, doing business electronically is found to have a profound impact on the way today's construction businesses operate, streamlining existing processes, with the growth in innovative tools, such as e-Tender, offering the construction industry new responsibilities and opportunities for all parties involved (ITCBP 2003). It is therefore important that these opportunities should be accessible to as many construction industry businesses as possible (The Construction Confederation 2001). Historically, there is a considerable exchange of information between various parties during a tendering process, where accuracy and efficiency of documentation is critical. Traditionally this process is either paper-based (involving large volumes of supporting tender documentation), or conducted via a number of stand-alone, non-compatible computer systems, usually costly to both the client and contractor. As such, having a standard electronic exchange format that allows all parties involved in an electronic tender process to access one system only, via the Internet, saves both time and money, eliminates transcription errors and increases the speed of bid analysis (The Construction Confederation 2001). Supporting this research project's aims and objectives, researchers set out to determine today's construction industry 'current state-of-play' in relation to e-Tendering opportunities. The report also provides brief introductions to several Australian and international e-Tender systems identified during this investigation.
e-Tendering, in its simplest form, is described as the electronic publishing, communicating, accessing, receiving and submitting of all tender-related information and documentation via the internet, thereby replacing the traditional paper-based tender processes and achieving a more efficient and effective business process for all parties involved (NT Government 2000; NSW Department of Commerce 2003; NSW Government 2003). Although most of the e-Tender websites investigated at the time maintain that their tendering processes and capabilities are 'electronic', research shows these 'e-Tendering' systems vary from being reasonably advanced to more 'basic' electronic tender notification and archiving services for various industry sectors. Research also indicates an e-Tender system should have a number of basic features and capabilities, including:
• All tender documentation is distributed via a secure web-based tender system, thereby avoiding the need for collating paperwork and couriers.
• The client/purchaser should be able to upload a notice and/or invitation to tender onto the system.
• Notification is sent out electronically (usually via email) for suppliers to download the information and return their responses electronically (online).
• During the tender period, updates and queries are exchanged through the same e-Tender system.
• The client/purchaser should only be able to access the tenders after the deadline has passed.
• All tender-related information is held in a central database, which should be easily searchable and fully audited, with all activities recorded.
• It is essential that tender documents are not read or submitted by unauthorised parties.
• Users of the e-Tender system are to be properly identified and registered via controlled access. In simple terms, security has to be as good as, if not better than, a manual tender process. Data is to be encrypted and users authenticated by means such as digital signatures, electronic certificates or smartcards.
• All parties must be assured that no 'undetected' alterations can be made to any tender.
• The tenderer should be able to amend the bid right up to the deadline, whilst the client/purchaser cannot obtain access until the submission deadline has passed.
• The e-Tender system may also include features such as a database of service providers with spreadsheet-based pricing schedules, which can make it easier for a potential tenderer to electronically prepare and analyse a tender.
Research indicates the efficiency of an e-Tender process is well supported internationally, with a significant number of similar e-Tender benefits identified during this investigation. Both construction industry and Government participants generally agree that the implementation of an automated e-Tendering process or system enhances the overall quality, timeliness and cost-effectiveness of a tender process, and provides a more streamlined method of receiving, managing and submitting tender documents than the traditional paper-based process. On the other hand, whilst there are undoubtedly many more barriers challenging the successful implementation and adoption of an e-Tendering system or process, researchers have also identified a range of challenges and perceptions that seem to hinder the uptake of this innovative approach to tendering electronically. A central concern seems to be that of security when industry organisations have to use the Internet for electronic information transfer. As a result, when it comes to e-Tendering, industry participants insist these innovative tendering systems are developed to ensure the utmost security and integrity.
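The integrity requirement above - that no undetected alteration can be made to a submitted tender - can be illustrated with a minimal message-authentication sketch. Real e-Tender systems use digital signatures and electronic certificates as the report notes; the shared-key HMAC below is only the simplest stand-in for that idea, and the document content is invented.

```python
import hashlib
import hmac
import os

def tag_tender(key, document):
    """Compute an authentication tag over the tender document bytes."""
    return hmac.new(key, document, hashlib.sha256).hexdigest()

def verify_tender(key, document, tag):
    """Constant-time check that the document matches its tag."""
    return hmac.compare_digest(tag_tender(key, document), tag)

key = os.urandom(32)                                  # shared secret (illustrative)
bid = b"Tender 42: total price AUD 1,250,000"         # hypothetical submission
tag = tag_tender(key, bid)

print(verify_tender(key, bid, tag))                                   # untouched bid
print(verify_tender(key, bid.replace(b"250", b"150"), tag))           # altered bid
```

A signature-based scheme additionally binds the tag to the tenderer's identity and supports non-repudiation, which is why certificates and smartcards, rather than shared keys, appear in the systems reviewed.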
Finally, if Australian organisations continue to explore the competitive 'dynamics' of the construction industry without realising the current and future trends and benefits of adopting innovative processes, such as e-Tendering, this will limit their globalising opportunities to expand into overseas markets and allow the continuation of international firms successfully entering local markets. As such, researchers believe increased knowledge, awareness and successful implementation of innovative systems and processes raises great expectations regarding their contribution towards 'stimulating' the globalisation of electronic procurement activities, and improving overall business and project performances throughout the construction industry sectors and the overall marketplace (NSW Government 2002; Harty C. 2003; Murray M. 2003; Pietroforte R. 2003). Achieving the successful integration of an innovative e-Tender solution with an existing/traditional process can be complex and, if not done correctly, could lead to failure (Bourn J. 2002).
Abstract:
This document provides a review of international and national practices in investment decision support tools in road asset management. Efforts were concentrated on identifying the analytic frameworks, evaluation methodologies and criteria adopted by current tools. Emphasis was also given to how current approaches support Triple Bottom Line decision-making. Benefit Cost Analysis and Multiple Criteria Analysis are the principal methodologies for supporting decision-making in Road Asset Management. The complexity of the applications shows significant differences in international practices. There is continuing discussion amongst practitioners and researchers regarding which is more appropriate for supporting decision-making. It is suggested that the two approaches should be regarded as complementary rather than competitive means. Multiple Criteria Analysis may be particularly helpful in the early stages of project development, such as strategic planning. Benefit Cost Analysis is used most widely for project prioritisation and for selecting the final project from amongst a set of alternatives. The Benefit Cost Analysis approach is a useful tool for investment decision-making from an economic perspective. An extension of the approach, which includes social and environmental externalities, is currently used to support Triple Bottom Line decision-making in the road sector. However, attention should be given to several issues in the applications. First of all, there is a need to reach a degree of commonality in considering social and environmental externalities, which may be achieved by aggregating the best practices. At different decision-making levels, the detail of consideration of the externalities should differ. It is intended to develop a generic framework to coordinate the range of existing practices. The standard framework will also be helpful in reducing double counting, which appears in some current practices.
Caution should also be exercised regarding the methods of determining the value of social and environmental externalities. A number of methods, such as market price, resource costs and Willingness to Pay, are found in the review. The use of unreasonable monetisation methods in some cases has discredited Benefit Cost Analysis in the eyes of decision makers and the public. Some social externalities, such as employment and regional economic impacts, are generally omitted in current practices. This is due to the lack of information and credible models. It may be appropriate to consider these externalities in qualitative form in a Multiple Criteria Analysis. Consensus has been reached on considering noise and air pollution in international practices; however, Australian practices have generally omitted these externalities. Equity is an important consideration in Road Asset Management. The considerations are either between regions, or between social groups defined by income, age, gender, disability, etc. In current practice, there is no well-developed quantitative measure for equity issues; more research is needed to target this issue. Although Multiple Criteria Analysis has been used for decades, there is no generally accepted framework for the choice of modelling methods and various externalities. The result is that different analysts are unlikely to reach consistent conclusions about a policy measure. In current practices, some favour using methods which are able to prioritise alternatives, such as Goal Programming, Goal Achievement Matrix and the Analytic Hierarchy Process. Others simply present the various impacts to decision-makers to characterise the projects. Weighting and scoring systems are critical in most Multiple Criteria Analysis. However, the processes of assigning weights and scores have been criticised as highly arbitrary and subjective. It is essential that the process be as transparent as possible.
Obtaining weights and scores by consulting local communities is a common practice, but is likely to result in bias towards local interests. The interactive approach has the advantage of helping decision-makers elaborate their preferences. However, the computational burden may result in loss of interest among decision-makers during the solution process of a large-scale problem, such as a large state road network. Current practices tend to use cardinal or ordinal scales to measure non-monetised externalities. Distorted valuations can occur where variables measured in physical units are converted to scales. For example, if decibels of noise are converted to a scale of -4 to +4 with a linear transformation, the difference between 3 and 4 represents a far greater increase in discomfort to people than the increase from 0 to 1. It is therefore suggested to assign different weights to individual scores. Due to overlapping goals, the problem of double counting also appears in some Multiple Criteria Analyses. The situation can be improved by carefully selecting and defining investment goals and criteria. Other issues, such as the treatment of time effects and the incorporation of risk and uncertainty, have been given scant attention in current practices. This report suggests establishing a common analytic framework to deal with these issues.
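The weighted scoring step of Multiple Criteria Analysis, including the report's suggestion of weighting individual scale points rather than treating the -4 to +4 scale as linear, can be sketched as follows. The projects, criteria, weights and the non-linear noise-scale values below are all invented for illustration.

```python
# Hypothetical criterion weights (must sum to 1 for a weighted average).
CRITERIA_WEIGHTS = {"cost": 0.4, "safety": 0.35, "noise": 0.25}

# Non-linear value for each point of the -4..+4 noise scale, reflecting that
# a step from 3 to 4 means far more discomfort than a step from 0 to 1.
NOISE_SCALE_VALUE = {-4: -10, -3: -6, -2: -3, -1: -1,
                     0: 0, 1: 1, 2: 3, 3: 6, 4: 10}

def project_score(scores):
    """Weighted sum of criterion scores; the noise criterion is first mapped
    through the non-linear scale instead of being used raw."""
    total = 0.0
    for criterion, weight in CRITERIA_WEIGHTS.items():
        raw = scores[criterion]
        value = NOISE_SCALE_VALUE[raw] if criterion == "noise" else raw
        total += weight * value
    return total

# Two hypothetical road projects scored on the -4..+4 scale per criterion.
projects = {
    "bypass":   {"cost": -2, "safety": 4, "noise": 3},
    "widening": {"cost": 1, "safety": 2, "noise": -1},
}

ranked = sorted(projects, key=lambda p: project_score(projects[p]), reverse=True)
print(ranked)
```

The sketch also makes the report's criticisms concrete: the ranking is entirely driven by the chosen weights and scale values, which is why transparency in how they are set matters.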