530 results for Green Information Technology


Relevance:

80.00%

Publisher:

Abstract:

Prefabricated housing innovations have the potential to reduce the environmental impact of construction by improving efficiency and quality. The current paper systematically summarises the published evidence since 1990 describing the barriers and drivers affecting the uptake of prefabricated housing innovations. These are discussed in relation to a ‘Project-Based Product Framework’ which considers multiple stakeholders, including builders and other intermediaries, suppliers and end-users, as well as the broader policy context and technical issues. The framework facilitated identification of central issues such as the prevalent business and cultural resistance associated with process changes; the potential for efficiency and quality improvements and cost savings; the simultaneous risks and benefits of close supplier-builder relationships; and negative user perceptions of prefabricated houses. Although there is a lack of evidence regarding the effects of regulations and government policies on prefabrication uptake, there are indications of the positive potential of financial and social incentives. Directions for further research include understanding how to: manage the industry’s transition to prefabricated houses; appropriately compare prefabricated housing to traditional housing on cost, efficiency and quality measures; reconcile the differing perspectives of various stakeholders; quantify and identify the perspectives of the potential end-user population; and manage the interface between the emerging industry and information technology improvements.

Relevance:

80.00%

Publisher:

Abstract:

Efficient Error-Propagating Block Chaining (EPBC) is a block cipher mode intended to simultaneously provide confidentiality and integrity protection for messages. Mitchell’s analysis pointed out a weakness in the EPBC integrity mechanism that can be used in a forgery attack. This paper identifies and corrects a flaw in Mitchell’s analysis of EPBC, and presents further attacks on the EPBC integrity mechanism.

Relevance:

80.00%

Publisher:

Abstract:

The Distributed Network Protocol v3.0 (DNP3) is one of the most widely used protocols for controlling national infrastructure. Widely used interactive packet manipulation tools, such as Scapy, have not yet been augmented to parse and create DNP3 frames (Biondi 2014). In this paper we extend Scapy to include DNP3, allowing us to perform attacks on DNP3 in real time. Our contribution builds on East et al. (2009), who proposed a range of possible attacks on DNP3. We implemented several of these attacks to validate our DNP3 extension to Scapy, then executed them against real-world equipment. We present our results, showing that many of these theoretical attacks would be unsuccessful in an Ethernet-based network.
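
The abstract does not reproduce the DNP3 layer definitions added to Scapy, so the sketch below is only a minimal illustration of how such an extension could look: a simplified Scapy layer for the DNP3 data-link header (fixed 0x0564 start bytes, length, control, little-endian destination and source addresses, header CRC), bound to TCP port 20000. The class name, field layout and default values are assumptions for illustration, not the authors’ implementation.

```python
# Minimal sketch of a custom Scapy layer for the DNP3 data-link header.
# The layout is simplified and the class/field names are illustrative;
# the paper's actual extension is not reproduced here.
from scapy.packet import Packet, bind_layers
from scapy.fields import XShortField, ByteField, LEShortField
from scapy.layers.inet import TCP

class DNP3LinkSketch(Packet):
    name = "DNP3-Link-Sketch"
    fields_desc = [
        XShortField("start", 0x0564),    # fixed start bytes of a DNP3 frame
        ByteField("length", 5),          # frame length field
        ByteField("control", 0xC4),      # link control byte (illustrative default)
        LEShortField("destination", 1),  # destination address (little-endian)
        LEShortField("source", 2),       # source address (little-endian)
        XShortField("crc", 0x0000),      # header CRC (not computed in this sketch)
    ]

# DNP3 over TCP commonly uses port 20000.
bind_layers(TCP, DNP3LinkSketch, dport=20000)
bind_layers(TCP, DNP3LinkSketch, sport=20000)

if __name__ == "__main__":
    frame = DNP3LinkSketch(destination=10, source=1)
    frame.show()
```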

Relevance:

80.00%

Publisher:

Abstract:

Every university in Australia has a set of policies that guide the institution in its educational practices; however, these policies are often developed in isolation from one another. Now imagine a space where policies are evidence-based, refined annually, cohesively interrelated, and meet stakeholders’ needs. Is this happenstance or the result of good planning? Culturally, Queensland University of Technology (QUT) is a risk-averse institution that takes pride in its financial solvency and is always keen to know “how are we going?” With a twenty-year history of annual reporting that assures the quality of course performance through multiple lines of evidence, QUT’s Learning and Teaching Unit went one step further and strategically aligned a suite of policies that take stakeholders’ needs into consideration, are developed in collaboration with other areas across the institution, and use multiple lines of evidence to inform curriculum decision-making. In QUT’s experience, strategic planning can lead to policy that is designed to meet stakeholders’ needs, not manage them; where decision-making is supported by evidence, not rhetoric; where all feedback is incorporated, not ignored; and where policies are cohesively interrelated, not isolated. While many may call this ‘policy nirvana’, QUT has positioned itself to demonstrate good educational practice through Reframe, its evaluation framework. In this case, best practice was achieved through the application of a theory of change and a design-led logic model that allows the approach to be transferred to other institutions with different cultural specificities. The evaluation approach follows Seldin’s (2003) notion of offering depth and breadth in the evaluation framework, along with Berk’s (2005) concept of multiple lines of evidence. In summary, this paper offers university executives, academics, and planning and quality staff an opportunity to understand the critical steps that lead to the strategic planning and design of evidence-based educational policy that positions a university for best practice in learning and teaching.

Relevance:

80.00%

Publisher:

Abstract:

In recent years, multidimensional data have attracted increasing attention from researchers seeking to build better recommender systems. Additional metadata provides algorithms with more detail for understanding the interaction between users and items. While neighbourhood-based Collaborative Filtering (CF) approaches and latent factor models tackle this task effectively in various ways, each utilizes only a partial structure of the data. In this paper, we delve into the different types of relations in the data to understand the interaction between users and items more holistically. We propose a generic multidimensional CF fusion approach for top-N item recommendation. The proposed approach is capable of incorporating not only localized user-user and item-item relations but also the latent interaction between all dimensions of the data. Experimental results show significant improvements in recommendation accuracy by the proposed approach.
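
The fusion approach itself is not described in the abstract. As a point of reference for the neighbourhood-based CF baseline it mentions, the sketch below scores top-N items with item-item cosine similarity on a toy implicit-feedback matrix; the function name and data are invented for illustration, and this is not the authors’ multidimensional method.

```python
# Minimal item-item collaborative filtering sketch for top-N recommendation.
# Illustrates only the neighbourhood-based baseline mentioned in the abstract,
# not the proposed multidimensional fusion approach.
import numpy as np

def topn_item_cf(ratings, user, n=3):
    """ratings: users x items implicit-feedback matrix (0/1); returns top-n item indices."""
    # Cosine similarity between item columns.
    norms = np.linalg.norm(ratings, axis=0) + 1e-9
    sim = (ratings.T @ ratings) / np.outer(norms, norms)
    np.fill_diagonal(sim, 0.0)

    # Score items by summing similarity to the user's consumed items.
    seen = ratings[user] > 0
    scores = sim[:, seen].sum(axis=1)
    scores[seen] = -np.inf          # never re-recommend seen items
    return np.argsort(scores)[::-1][:n]

if __name__ == "__main__":
    R = np.array([[1, 1, 0, 0, 1],
                  [0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0]], dtype=float)
    print(topn_item_cf(R, user=0, n=2))
```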

Relevance:

80.00%

Publisher:

Abstract:

Industrial control systems (ICS) have been moving from dedicated communications to switched and routed corporate networks, making it likely that these devices are exposed to the Internet. Many ICS have been designed with few or weak security features, making them vulnerable to attack. Recently, several tools have been developed that can scan the Internet, including ZMap, Masscan and Shodan. However, little in-depth analysis has been done to compare these Internet-wide scanning techniques, and few Internet-wide scans have been conducted targeting ICS devices and protocols. In this paper we present a taxonomy of Internet-wide scanning, a comparison of three popular network scanning tools, and a framework for conducting Internet-wide scans.
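
None of the tools’ interfaces are shown in the abstract. Purely as an illustration of the primitive underlying Internet-wide scanning, the sketch below performs a single TCP connect check (here against the registered Modbus/TCP port 502 commonly exposed by ICS devices); real scanners such as ZMap and Masscan instead use stateless asynchronous SYN probing, and any probing should only be run against hosts you are authorised to test. The host, function name and timeout are placeholder assumptions.

```python
# Minimal sketch of the primitive behind Internet-wide scanning: a single
# TCP connect probe. Only run against hosts you are authorised to test.
import socket

def tcp_port_open(host, port, timeout=1.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # 502/tcp is the registered Modbus/TCP port commonly used by ICS devices.
    print(tcp_port_open("192.0.2.1", 502))  # TEST-NET address as a placeholder
```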

Relevance:

80.00%

Publisher:

Abstract:

Understanding the dynamics of disease spread is essential in contexts such as estimating the load on medical services, as well as in risk assessment and intervention policies against large-scale epidemic outbreaks. However, most of the information is available only after the outbreak itself, and preemptive assessment is far from trivial. Here, we report on an agent-based model developed to investigate such epidemic events in a stylised urban environment. For most diseases, infection of a new individual may occur through casual contact in crowds as well as through repeated interactions with social partners such as work colleagues or family members. Our model therefore accounts for both phenomena. Given the scale of the system, efficient parallel computing is required. In this presentation, we focus on aspects related to parallelisation for large network generation and massively multi-agent simulations.
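
The model itself and its parallelisation are not detailed in the abstract; the sketch below only illustrates, on a toy population with invented parameters, the two transmission channels described above: random casual contacts in crowds and repeated contacts with a fixed set of social partners.

```python
# Minimal agent-based sketch of the two transmission channels described in
# the abstract: random casual contacts and repeated contacts with fixed
# social partners. Parameters and network are toy values, not the model's.
import random

N = 1000
P_CASUAL, P_PARTNER, P_RECOVER = 0.02, 0.10, 0.05
CASUAL_CONTACTS = 5

state = ["S"] * N
state[0] = "I"
partners = [random.sample(range(N), 3) for _ in range(N)]  # fixed social circle

for day in range(100):
    new_state = list(state)
    for i, s in enumerate(state):
        if s != "I":
            continue
        # Casual contacts: a few random encounters in crowds.
        for j in random.sample(range(N), CASUAL_CONTACTS):
            if state[j] == "S" and random.random() < P_CASUAL:
                new_state[j] = "I"
        # Repeated contacts: the agent's fixed social partners.
        for j in partners[i]:
            if state[j] == "S" and random.random() < P_PARTNER:
                new_state[j] = "I"
        if random.random() < P_RECOVER:
            new_state[i] = "R"
    state = new_state

print({s: state.count(s) for s in "SIR"})
```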

Relevance:

80.00%

Publisher:

Abstract:

Despite significant improvements in capacity-distortion performance, computationally efficient capacity control is still lacking in recent watermarking schemes. In this paper, we propose an efficient capacity control framework that treats watermarking capacity control as the process of maintaining “acceptable” distortion and running time while attaining the required capacity. Analysis and experimental results on capacity control are reported to address practical aspects of the watermarking capacity problem in dynamic (size) payload embedding.

Relevance:

80.00%

Publisher:

Abstract:

Business Process Management (BPM) (Dumas et al. 2013) investigates how organizations function and how they can be improved on the basis of their business processes. The starting point for BPM is that organizational performance is a function of process performance. Thus, BPM proposes a set of methods, techniques and tools to discover, analyze, implement, monitor and control business processes, with the ultimate goal of improving these processes. Most importantly, BPM is not just an organizational management discipline. BPM also studies how technology, and particularly information technology, can effectively support the process improvement effort. In the past two decades the field of BPM has been the focus of extensive research, whose scope continues to grow and which advances technology in various directions. The main international forum for state-of-the-art research in this field is the International Conference on Business Process Management, or “BPM” for short—an annual meeting of the aca ...

Relevance:

80.00%

Publisher:

Abstract:

Over the last three to four decades, modern commercial agricultural practices in Asia involving chemical inputs (fertilisers and pesticides) have been associated with increases in food production never witnessed before, especially under Green Revolution technology in South Asia. This has also involved large-scale increases in commercial vegetable cropping. However, the heavy reliance on chemical inputs to bring about these increases in food production is not without problems. In many countries where commercial agriculture is widespread, a visible correlation is evident between higher productivity and high artificial input use on the one hand, and environmental degradation and human ill-health on the other. In this chapter, we focus on the impact of chemical inputs, in particular pesticides, on the environment and on human health in South Asia, with special reference to Sri Lanka...

Relevance:

80.00%

Publisher:

Abstract:

Decision-making is such an integral aspect of routine health care that the ability to make the right decisions at crucial moments can lead to improvements in patient health. Evidence-based practice (EBP), the paradigm used to make those informed decisions, relies on current best evidence from systematic research such as randomized controlled trials (RCTs). Limitations of the outcomes of RCTs, such as the “quantity” and “quality” of the evidence generated, have lowered healthcare professionals’ confidence in using EBP. An alternative paradigm, Practice-Based Evidence, has evolved, the key being evidence drawn from practice settings. Through the use of health information technology, electronic health records (EHRs) capture relevant clinical practice “evidence”. A data-driven approach is proposed to capitalize on the benefits of EHRs. The issues of data privacy, security and integrity are mitigated by an information accountability concept. A data warehouse architecture completes the data-driven approach by integrating health data from the multi-source systems unique to the healthcare environment.
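
The warehouse architecture is not specified in the abstract; the sketch below merely illustrates the integration idea by joining two hypothetical source-system extracts on a shared patient identifier. All table and column names are invented for illustration.

```python
# Minimal sketch of integrating records from two hypothetical source systems
# into one analysis-ready table, in the spirit of the data warehouse layer
# described in the abstract. All table and column names are invented.
import pandas as pd

admissions = pd.DataFrame({
    "patient_id": [101, 102, 103],
    "admit_date": ["2015-01-04", "2015-01-06", "2015-01-09"],
    "ward": ["cardiology", "oncology", "cardiology"],
})
lab_results = pd.DataFrame({
    "patient_id": [101, 101, 103],
    "test": ["troponin", "creatinine", "troponin"],
    "value": [0.4, 88.0, 0.1],
})

# A simple conformed table joining both sources on the shared patient key.
warehouse = admissions.merge(lab_results, on="patient_id", how="left")
print(warehouse)
```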

Relevance:

80.00%

Publisher:

Abstract:

Background: As the increasing adoption of information technology continues to offer better distant medical services, the distribution of, and remote access to, digital medical images over public networks continues to grow significantly. Such use of medical images raises serious concerns for their continuous security protection, which digital watermarking has shown great potential to address.

Methods: We present a content-independent embedding scheme for medical image watermarking. We observe that the perceptual content of medical images varies widely with their modalities. Recent medical image watermarking schemes are image-content dependent and thus may suffer from inconsistent embedding capacity and visual artefacts. To attain the image content-independent embedding property, we generalise the RONI (region of non-interest to the medical professionals) selection process and use it for embedding by utilising RONI’s least significant bit-planes. The proposed scheme thus avoids the need for RONI segmentation, which incurs capacity and computational overheads.

Results: Our experimental results demonstrate that the proposed embedding scheme performs consistently over a dataset of 370 medical images spanning 7 different modalities. Experimental results also show that state-of-the-art reversible schemes can perform inconsistently across different modalities of medical images. Our scheme has MSSIM (Mean Structural SIMilarity) larger than 0.999 with a deterministically adaptable embedding capacity.

Conclusions: Our proposed image-content-independent embedding scheme is modality-wise consistent, and maintains good image quality in the RONI while keeping all other pixels in the image untouched. Thus, with an appropriate watermarking framework (i.e., with consideration of watermark generation, embedding and detection functions), our proposed scheme can be viable for multi-modality medical image applications and distant medical services such as teleradiology and eHealth.
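
The exact RONI generalisation and embedding functions are not given in the abstract. The sketch below illustrates the general idea under an assumed setup: payload bits are written into the least significant bits of a fixed border region of a greyscale image (standing in for the RONI), and every other pixel is left untouched. The border-as-RONI choice, function name and parameters are assumptions, not the authors’ scheme.

```python
# Minimal LSB-embedding sketch in the spirit of the abstract: payload bits are
# written into the least significant bits of a fixed border region (treated
# here as the RONI), and every other pixel is left untouched.
import numpy as np

def embed_in_border_lsb(image, payload_bits, border=8):
    """image: 2-D uint8 array; payload_bits: iterable of 0/1."""
    out = image.copy()
    mask = np.zeros(image.shape, dtype=bool)
    mask[:border, :] = mask[-border:, :] = True
    mask[:, :border] = mask[:, -border:] = True

    idx = np.flatnonzero(mask)                 # embeddable pixel positions (RONI)
    bits = np.fromiter(payload_bits, dtype=np.uint8)
    if bits.size > idx.size:
        raise ValueError("payload exceeds RONI capacity")

    flat = out.ravel()
    flat[idx[:bits.size]] = (flat[idx[:bits.size]] & 0xFE) | bits
    return out

if __name__ == "__main__":
    img = np.full((64, 64), 120, dtype=np.uint8)
    marked = embed_in_border_lsb(img, [1, 0, 1, 1, 0, 0, 1, 0])
    print(int(np.abs(marked.astype(int) - img.astype(int)).max()))  # at most 1
```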

Relevance:

80.00%

Publisher:

Abstract:

Cloud computing has significantly impacted a broad range of industries, but these technologies and services have been absorbed throughout the marketplace unevenly. Some industries have moved aggressively towards cloud computing, while others have moved much more slowly. For the most part, the energy sector has approached cloud computing in a measured and cautious way, with progress often in the form of private cloud solutions rather than public ones, or hybridized information technology systems that combine cloud and existing non-cloud architectures. By moving towards cloud computing in a very slow and tentative way, however, the energy industry may prevent itself from reaping the full benefit that a more complete migration to the public cloud has brought about in several other industries. This short communication is accordingly intended to offer a high-level overview of cloud computing, and to put forward the argument that the energy sector should make a more complete migration to the public cloud in order to unlock the major system-wide efficiencies that cloud computing can provide. Also, assets within the energy sector should be designed with as much modularity and flexibility as possible so that they are not locked out of cloud-friendly options in the future.

Relevance:

80.00%

Publisher:

Abstract:

STIMulate is a support-for-learning program at the Queensland University of Technology in Brisbane, Australia. The program provides assistance in mathematics, science and information technology for undergraduate students. This paper develops personas - archetypal users - that represent the attitudes and motivations of students who utilise STIMulate (in particular, the IT stream). Seven personas were developed based on interviews with Peer Learning Facilitators (PLFs), experienced students who have excelled in the relevant subject areas. The personas were then validated by a PLF focus group. Developing the personas enabled us to better understand the characteristics and needs of the students using the STIMulate program, allowing a more critical analysis of the quality of the service provided.

Relevance:

80.00%

Publisher:

Abstract:

Particle swarm optimization (PSO), a population-based algorithm, has recently been applied to multi-robot systems. Although the algorithm has been used to solve many optimization problems, it has some drawbacks when applied to multi-robot search systems that must find a target in a search space containing large static obstacles. One of these defects is premature convergence: a property of basic PSO is that, as particles spread through a search space, over time they tend to converge into a small area. This shortcoming is also evident in a multi-robot search system, particularly when large static obstacles prevent the robots from finding the target easily; as time increases, the robots converge into a small area that may not contain the target and become trapped there.

Another shortcoming is that basic PSO cannot guarantee global convergence of the algorithm. In other words, particles initially explore different areas, but in some cases they are not good at exploiting promising areas, which increases the search time.

This study proposes a method based on the PSO technique for a multi-robot system searching for a target in a space containing large static obstacles. The method not only overcomes the premature convergence problem but also establishes an efficient balance between exploration and exploitation and guarantees global convergence, reducing the search time by combining PSO with a local search method such as A-star.

To validate the effectiveness and usefulness of the algorithms, a simulation environment was developed for conducting experiments in different scenarios and reporting the results. These experiments demonstrate that the proposed method overcomes the premature convergence problem and guarantees global convergence.
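
The hybrid algorithm (modified PSO combined with A-star local search) is not specified in the abstract. The sketch below shows only the standard global-best PSO velocity and position update that such a method builds on, applied to a toy 2-D objective; obstacle handling, the A-star coupling and the convergence modifications are not reproduced, and all parameters are illustrative.

```python
# Minimal global-best PSO sketch on a toy 2-D objective, showing the standard
# velocity/position update the abstract's method builds on. Obstacle handling,
# the A-star local search, and the convergence modifications are not reproduced.
import numpy as np

rng = np.random.default_rng(0)
W, C1, C2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients (toy values)
N_PARTICLES, N_ITERS = 20, 100
target = np.array([7.0, -3.0])     # stands in for the searched-for target

def cost(x):
    return np.linalg.norm(x - target, axis=-1)

pos = rng.uniform(-10, 10, (N_PARTICLES, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
gbest = pbest[np.argmin(cost(pbest))].copy()

for _ in range(N_ITERS):
    r1, r2 = rng.random((2, N_PARTICLES, 1))
    vel = W * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
    pos = pos + vel
    better = cost(pos) < cost(pbest)
    pbest[better] = pos[better]
    gbest = pbest[np.argmin(cost(pbest))].copy()

print(gbest, cost(gbest))
```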