800 results for Information technology Australia Management
Abstract:
Quality in construction projects should be regarded as the fulfilment of the expectations of the contributors involved in such projects. Although a significant number of quality practices have been introduced within the industry, attaining reasonable levels of quality in construction projects continues to be an ongoing problem. To date, some research into the introduction and improvement of quality practices and stakeholder management has been undertaken, but no major studies have comprehensively examined how greater consideration of stakeholders’ perspectives on quality can contribute to final project quality outcomes. This paper examines the requirements for developing a framework that leads to more effective involvement of stakeholders in quality planning and practices, thus ultimately contributing to higher quality outcomes for construction projects. Through an extensive literature review, it highlights various perceptions of quality, categorizes quality issues with particular focus on benefits and shortcomings, and examines the viewpoints of major stakeholders on project quality. It proposes a set of criteria to be used as the basis for a quality practice improvement framework, which will provide project managers and owners with the information and strategic direction required to achieve their own and their stakeholders’ targets for the implementation of quality practices, leading to improved quality outcomes on future projects.
Abstract:
Defence organisations perform information security evaluations to confirm that electronic communications devices are safe to use in security-critical situations. Such evaluations include tracing all possible dataflow paths through the device, but this process is tedious and error-prone, so automated reachability analysis tools are needed to make security evaluations faster and more accurate. Previous research has produced a tool, SIFA, for dataflow analysis of basic digital circuitry, but it cannot analyse dataflow through microprocessors embedded within the circuit, since this depends on the software they run. We have developed a static analysis tool that produces SIFA-compatible dataflow graphs from embedded microcontroller programs written in C. In this paper we present a case study which shows how this new capability supports combined hardware and software dataflow analyses of a security-critical communications device.
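The core idea of deriving dataflow graphs from C source can be sketched as follows; this toy extractor and its edge format are illustrative assumptions, far simpler than the actual tool described in the abstract:

```python
import re

# Assumed illustration: derive dataflow-graph edges from simple C
# assignment statements, so that a transitive path such as
# input -> key -> out can later be traced through the program.
def dataflow_edges(c_lines):
    """For each 'lhs = expr;' statement, emit an edge (var, lhs) for
    every identifier read in expr."""
    edges = set()
    for line in c_lines:
        m = re.match(r"\s*(\w+)\s*=\s*(.+);", line)
        if m:
            lhs, rhs = m.group(1), m.group(2)
            for var in re.findall(r"[A-Za-z_]\w*", rhs):
                edges.add((var, lhs))
    return edges

prog = ["key = input;", "out = key ^ mask;"]
print(dataflow_edges(prog))  # edges for input->key, key->out, mask->out
```

A real analyser must of course handle control flow, pointers and function calls; this sketch only shows the shape of the graph such a tool emits.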
Abstract:
Data flow analysis techniques can be used to help assess threats to data confidentiality and integrity in security-critical program code. However, a fundamental weakness of static analysis techniques is that they overestimate the ways in which data may propagate at run time. Discounting large numbers of these false-positive data flow paths wastes an information security evaluator's time and effort. Here we show how to automatically eliminate some false-positive data flow paths by precisely modelling how classified data is blocked by certain expressions in embedded C code. We present a library of detailed data flow models of individual expression elements and an algorithm for introducing these components into conventional data flow graphs. The resulting models can be used to accurately trace byte-level or even bit-level data flow through expressions that are normally treated as atomic. This allows us to identify expressions that safely downgrade their classified inputs and thereby eliminate false-positive data flow paths from the security evaluation process. To validate the approach we have implemented and tested it in an existing data flow analysis toolkit.
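The kind of bit-level blocking model described above can be illustrated with a minimal sketch; the function names and propagation rules below are assumptions for illustration, not the expression library presented in the paper:

```python
# Assumed illustration: bit-level taint propagation through two C-like
# expression elements. A coarse analysis marks any expression that reads
# classified data as classified; a bit-level model can prove that some
# expressions block the flow entirely, eliminating false positives.

def taint_and(taint_x, mask):
    # x & mask: input bits reach the result only where mask has a 1 bit
    return taint_x & mask

def taint_xor_self(taint_x):
    # x ^ x is identically 0, so no input bit reaches the result
    return 0

classified = 0xFF                        # every bit of the input is classified
print(hex(taint_and(classified, 0x0F)))  # only the low nibble still flows
print(taint_and(classified, 0x00))       # all flow blocked: path can be discarded
print(taint_xor_self(classified))        # expression safely downgrades its input
```

Under a coarse model all three expressions would be flagged as carrying classified data; the bit-level rules discharge the last two paths outright.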
Abstract:
This article presents a novel approach to confidentiality violation detection based on taint marking. Information flows are dynamically tracked between applications and objects of the operating system such as files, processes and sockets. A confidentiality policy is defined by labelling sensitive information and defining which information may leave the local system through network exchanges. Furthermore, per-application profiles can be defined to restrict the sets of information each application may access and/or send through the network. In previous work, we focused on the use of mandatory access control mechanisms for information flow tracking. In the present work, we have extended the previous information flow model to track network exchanges, and we are able to define a policy attached to network sockets. We show an example application of this extension in the context of a compromised web browser: our implementation detects a confidentiality violation when the browser attempts to leak private information to a remote host over the network.
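A minimal sketch of such taint marking with a per-socket policy might look like the following; the class, labels and object names are invented for illustration, and the actual system operates at the operating-system level rather than in application code:

```python
# Invented illustration of dynamic taint marking with a network policy:
# reads propagate labels into a process, and a send is checked against
# the set of labels the socket is allowed to export.
class TaintTracker:
    def __init__(self):
        self.labels = {}          # object name -> set of taint marks
        self.socket_policy = {}   # socket name -> marks allowed to leave

    def read(self, proc, obj):
        # information flows from obj into proc: union the taint marks
        self.labels.setdefault(proc, set()).update(self.labels.get(obj, set()))

    def send(self, proc, sock):
        # violation if proc carries a mark the socket may not export
        leaked = self.labels.get(proc, set()) - self.socket_policy.get(sock, set())
        if leaked:
            raise PermissionError(f"confidentiality violation: {sorted(leaked)}")

t = TaintTracker()
t.labels["/home/user/private.txt"] = {"secret"}   # sensitive file is labelled
t.socket_policy["browser->remote"] = {"public"}   # socket may export only public data
t.read("browser", "/home/user/private.txt")       # browser becomes tainted
# t.send("browser", "browser->remote")            # would raise PermissionError
```

This mirrors the compromised-browser scenario in the abstract: once the browser has read the labelled file, any attempt to send over a socket whose policy excludes that label is flagged.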
Abstract:
The ICT degrees in most Australian universities have a sequence of up to three programming subjects, or units. BABELnot is an ALTC-funded project that will document the academic standards associated with those three subjects in the six participating universities and, if possible, at other universities. This will necessitate the development of a rich framework for describing the learning goals associated with programming. It will also be necessary to benchmark exam questions that are mapped onto this framework. As part of the project, workshops are planned for ACE 2012, ICER 2012 and ACE 2013, to elicit feedback from the broader Australasian computing education community, and to disseminate the project’s findings. The purpose of this paper is to introduce the project to that broader Australasian computing education community and to invite their active participation.
Abstract:
Video surveillance using Closed Circuit Television (CCTV) cameras is one of the fastest-growing areas in the field of security technologies. However, existing video surveillance systems are still not at a stage where they can be used for crime prevention. These systems rely heavily on human observers and are therefore limited by factors such as fatigue and monitoring capability over long periods of time. This work attempts to address these problems by proposing an automatic suspicious behaviour detection system which utilises contextual information. Contextual information is utilised via three main components: a context space model, a data stream clustering algorithm, and an inference algorithm. The use of contextual information is still limited in the domain of suspicious behaviour detection, yet it is nearly impossible to correctly understand human behaviour without considering the context in which it is observed. This work presents experiments using video feeds taken from the CAVIAR dataset and from a camera mounted on one of the buildings (Z-Block) at the Queensland University of Technology, Australia. These experiments show that by exploiting contextual information, the proposed system is able to make more accurate detections, especially of behaviours which are suspicious only in some contexts while being normal in others. Moreover, this information gives critical feedback to system designers for refining the system.
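The context dependence of suspicion can be illustrated with a deliberately tiny sketch; the behaviour/context pairs below are invented and far simpler than the context space model the work actually proposes:

```python
# Invented illustration: the same observed behaviour is judged
# differently depending on the context in which it is observed.
NORMAL_IN_CONTEXT = {
    ("loitering", "bus_stop"),      # waiting at a bus stop is expected
    ("running", "sports_field"),    # running on a sports field is expected
}

def is_suspicious(behaviour, context):
    # a behaviour is suspicious only when it is not normal for its context
    return (behaviour, context) not in NORMAL_IN_CONTEXT

print(is_suspicious("loitering", "bus_stop"))   # normal in this context
print(is_suspicious("loitering", "car_park"))   # suspicious elsewhere
```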
Abstract:
Throughout the twentieth century the economies of the Middle East rose and fell many times in response to the external environment, including European de-colonization and the US and former USSR competing to provide military and economic aid after World War II. Throughout these upheavals the Middle East has remained internationally significant politically and economically, not least for the region's large reserves of oil and gas, as discussed in the Introduction to this volume. In recent decades, Western nations have moved to invest in the rapidly developing technology, tourism and education industries that have proliferated in the Middle East. For its part, Iran has been the world's fourth-largest provider of petroleum and second-largest provider of natural gas and, despite years of political unrest, has expanded rapidly into information technology and telecommunications. Increased involvement in the global economy has meant that Iran has invested heavily in education and training and moved to modernize its management practices. Hitherto there has been little academic research into management in either Western or local organizations in Iran. This chapter seeks to address that gap in knowledge by exploring business leadership in Iran, with particular reference to cultural and institutional impacts.
Abstract:
Several authors stress that data is a crucial foundation for operational, tactical and strategic decisions (e.g., Redman 1998, Tee et al. 2007). Data provides the basis for decision making, as data collection and processing are typically associated with reducing uncertainty in order to make more effective decisions (Daft and Lengel 1986). While the first series of Information Systems/Information Technology (IS/IT) investments in organizations improved data collection, restricted computational capacity and limited processing power created challenges (Simon 1960). Fifty years on, capacity and processing problems are increasingly less relevant; in fact, the opposite problem exists. Determining data relevance and usefulness is complicated by increased data capture and storage capacity, as well as continual improvements in information processing capability. As the IT landscape changes, businesses are inundated with ever-increasing volumes of data from both internal and external sources, available on both an ad-hoc and a real-time basis. More data, however, does not necessarily translate into more effective and efficient organizations, nor does it increase the likelihood of better or timelier decisions. This raises questions about what data managers require to assist their decision-making processes.
Abstract:
At the previous conference in this series, Corney, Lister and Teague presented research results showing relationships between code writing, code tracing and code explaining, from as early as week 3 of semester. We concluded that the problems some students face in learning to program start very early in the semester. In this paper we report on our replication of that experiment at two institutions, one of which is the institution of the original study. In some cases we did not find the same relationship between explaining code and writing code, but we believe this was because our teachers discussed the code in lectures between the two tests. Apart from that exception, our replication results at both institutions are consistent with the original study.
Abstract:
Recent research on novice programmers has suggested that they pass through neo-Piagetian stages: sensorimotor, preoperational and concrete operational stages, before eventually reaching programming competence at the formal operational stage. This paper presents empirical results in support of this neo-Piagetian perspective. The major novel contributions of this paper are empirical results for some exam questions aimed at testing novices for the concrete operational abilities to reason with quantities that are conserved, processes that are reversible, and properties that hold under transitive inference. While the questions we used had been proposed earlier by Lister, he did not present any data on how students performed on these questions. Our empirical results demonstrate that many students struggle to answer these problems, despite their apparent simplicity. We then compare student performance on these questions with their performance on six "explain in plain English" questions.
Abstract:
Information Technology and its relationship to organisational performance has been a subject of continued interest to researchers and other stakeholders in developing countries. While there is agreement that IT does contribute to performance, and we are steadily expanding our knowledge of which factors enable better leveraging of IT resources in organisations, we have done little to understand how these factors interact with technology to produce improved performance. This paper suggests examining the interaction between organisational resources and technology through a structurational lens, which recognises the recursive interaction between technology and people in the presence of social practices and the norms that inform those practices. An ethnographic approach to understanding this interaction between technology and resources is suggested, aiming to provide richer insights into the nature of the environment that promotes better use of IT resources in developing countries. Such insights could provide IT users in developing countries with at least an initial conception of the “IT usage platform” that they could promote in their organisations to leverage the most from their IT resources.
Abstract:
An online survey of recent ICT graduates in the workplace was carried out as part of a recent project funded by the Australian Learning and Teaching Council. The survey was concerned with the ICT curriculum in relation to workplace job requirements and university preparation for these requirements. The survey contained quantitative and qualitative components, and findings from the former have been published (Koppi et al., 2009). This paper reports on a quantitative comparison of responses from graduates who had workplace experience and those who did not, and a qualitative analysis of text responses from all ICT graduates to open-ended questions concerning the curriculum and their perceived university preparation for the workplace. The overwhelming response from ICT graduates in the workplace was for more industry-related learning. These industry relationships included industry involvement, workplace learning and business experience, up-to-date teaching and technologies, practical applications, and real-world activities. A closer relationship between academia and industry was strongly advocated by ICT graduates in the workplace.
Abstract:
Service oriented architecture (SOA) is an architectural style for building software systems based on services. Especially in those scenarios where services implement business processes, complex conversations between the services occur. Service choreographies are a means to capture all interaction obligations and constraints from a global perspective. This article introduces choreographies as an important artifact for SOA, compares them to service orchestrations and surveys existing languages for modeling them.
Abstract:
In this study, a feasibility analysis was carried out, on the basis of laboratory data and the resources available in Bangladesh, for a pyrolysis process converting solid tire wastes into pyrolysis oil, solid char and gases. The process considered for detailed analysis was a fixed-bed, fire-tube-heated pyrolysis reactor system. A comparative techno-economic assessment was carried out in US$ for three plant sizes: medium commercial scale (144 tons/day), small commercial scale (36 tons/day) and pilot scale (3.6 tons/day). The assessment showed that the medium commercial scale plant was economically feasible, with a lower unit production cost than the small commercial and pilot scale plants, for the production of crude pyrolysis oil that could be used as boiler fuel and for the production of upgraded liquid products.
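The scale effect behind that conclusion can be sketched with a simple unit-cost model; the cost figures below are invented placeholders, not the study's data:

```python
# Illustrative only, with invented numbers: fixed daily costs are spread
# over more output in a larger plant, so the unit production cost falls
# with capacity even when the variable cost per ton stays the same.
def unit_cost(capacity_tpd, fixed_cost_per_day, variable_cost_per_ton):
    return fixed_cost_per_day / capacity_tpd + variable_cost_per_ton

plants = {                       # name: (tons/day, hypothetical fixed US$/day)
    "pilot": (3.6, 900.0),
    "small": (36.0, 4000.0),
    "medium": (144.0, 10000.0),
}
for name, (cap, fixed) in plants.items():
    print(name, round(unit_cost(cap, fixed, 50.0), 2))  # US$/ton of feedstock
```

With any such numbers where fixed costs grow more slowly than capacity, the medium plant's unit cost comes out lowest, which matches the direction of the study's finding.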