Abstract:
Choi et al. recently proposed OHLCAP (One-Way Hash based Low-Cost Authentication Protocol), an efficient RFID authentication protocol for ubiquitous computing environments. However, this paper reveals that the protocol has several security weaknesses: 1) traceability based on the leakage of counter information, 2) vulnerability to an impersonation attack by maliciously updating a random number, and 3) traceability based on a physically attacked tag. Finally, a security-enhanced group-based authentication protocol is presented.
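The abstract does not give OHLCAP's actual message flow, but the class of protocol it belongs to, one-way-hash challenge-response tag authentication, can be sketched as follows. The function names, the shared key, and the use of SHA-256 are illustrative assumptions for this sketch, not the actual protocol; in particular, a real design must ensure responses are unlinkable across sessions, since a response computable from static tag values is exactly the kind of traceability weakness the paper describes.

```python
import hashlib
import os

def h(data: bytes) -> bytes:
    """One-way hash function (SHA-256 here as a stand-in)."""
    return hashlib.sha256(data).digest()

# Hypothetical shared state: reader and tag both hold the tag's secret key.
tag_secret = b"tag-secret-key"

def reader_challenge() -> bytes:
    # The reader sends a fresh random nonce to prevent replay attacks.
    return os.urandom(16)

def tag_response(secret: bytes, nonce: bytes) -> bytes:
    # The tag proves knowledge of the secret without revealing it;
    # the fresh nonce makes each response session-dependent.
    return h(secret + nonce)

def reader_verify(secret: bytes, nonce: bytes, response: bytes) -> bool:
    return response == h(secret + nonce)

nonce = reader_challenge()
resp = tag_response(tag_secret, nonce)
print(reader_verify(tag_secret, nonce, resp))
```

The counter-leakage and malicious-update attacks in the paper target state (counters, random numbers) that this minimal sketch omits, which is precisely why the full protocol needs the group-based enhancement the authors present.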
Abstract:
Online social networking has become one of the most popular Internet applications of the modern era, giving users access to information that other Internet-based applications cannot. Although many popular online social networking sites are geared towards entertainment, sharing information can benefit the healthcare industry in terms of both efficiency and effectiveness. Yet the ability to share personal information, the very factor that has made online social networks so popular, is itself a major obstacle from an information security and privacy standpoint. Healthcare can benefit from online social networking if systems are implemented such that sensitive patient information is safeguarded from ill exposure. At the same time, in an industry such as healthcare, where the availability of information is crucial for better decision making, information must be made available to the appropriate parties when they require it. Hence the traditional mechanisms for information security and privacy protection may not be suitable for healthcare. In this paper we propose a solution for privacy enhancement in online healthcare social networks through the use of an information accountability mechanism.
Abstract:
Business Process Management (BPM) is a top priority in organisations and is rapidly proliferating as an emerging discipline in practice. However, current studies show a lack of appropriately skilled BPM professionals in the field and a dearth of opportunities to develop BPM expertise. This paper analyses the gap between available BPM-related education in Australia and required BPM capabilities. BPM courses offered by Australian universities and training institutions have been critically analysed and mapped against leading BPM capability frameworks to determine how well current BPM education and training offerings in Australia actually address the core capabilities required of BPM professionals. The outcomes reported here can be used by Australian universities and training institutions to better align and position their training materials with the required BPM capabilities. They could also benefit individuals looking for a systematic and in-depth understanding of BPM capabilities and training.
Abstract:
This paper derives from research-in-progress intending both Design Research (DR) and Design Science (DS) outputs: the former a management decision tool based on IS-Impact (Gable et al. 2008) kernel theory; the latter methodological learnings derived from synthesis of the literature and reflection on the DR ‘case study’ experience. The paper introduces a generic, detailed and pragmatic DS ‘Research Roadmap’, or methodology, deriving at this stage primarily from synthesis and harmonisation of relevant concepts identified through systematic archival analysis of the related literature. The scope of the Roadmap has also been influenced by the parallel aim of undertaking DR that applies and further evolves the Roadmap. The Roadmap is presented in response to the dearth of detailed guidance available to novice researchers in Design Science Research (DSR) and, though preliminary, is expected to evolve and gradually be substantiated through experience of its application. A key distinction of the Roadmap from other DSR methods is its breadth of coverage of published DSR concepts and activities: its detail and scope. It represents a useful synthesis and integration of otherwise highly disparate DSR-related concepts.
Abstract:
Background: In order to provide insights into the complex biochemical processes inside a cell, modelling approaches must find a balance between achieving an adequate representation of the physical phenomena and keeping the associated computational cost within reasonable limits. This issue is particularly pressing when spatial inhomogeneities have a significant effect on a system's behaviour. In such cases, a spatially resolved stochastic method can better portray the biological reality, but the corresponding computer simulations can in turn be prohibitively expensive. Results: We present a method that incorporates spatial information by means of tailored, probability-distributed time delays. These distributions can be obtained directly from a single in silico experiment, or from a suitable set of in vitro experiments, and are subsequently fed into a delay stochastic simulation algorithm (DSSA), achieving a good compromise between computational cost and a much more accurate representation of spatial processes such as molecular diffusion and translocation between cell compartments. Additionally, we present a novel alternative approach based on delay differential equations (DDEs) that can be used in scenarios of high molecular concentrations and low noise propagation. Conclusions: Our proposed methodologies accurately capture and incorporate certain spatial processes into temporal stochastic and deterministic simulations, increasing their accuracy at low computational cost. This is of particular importance given that the time spans of cellular processes are generally larger (possibly by several orders of magnitude) than those achievable by current spatially resolved stochastic simulators. Hence, our methodology allows users to explore cellular scenarios under the effects of diffusion and stochasticity over time spans that were, until now, simply unfeasible. Our methodologies are supported by theoretical considerations on the different modelling regimes, i.e. spatial vs. delay-temporal, as indicated by the corresponding Master Equations and presented elsewhere.
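The abstract does not specify the authors' DSSA variant, but the core idea, a Gillespie-style simulation in which a reaction initiates immediately while its product appears only after a sampled delay, can be sketched for a single reaction A → B. The gamma-distributed delay standing in for a measured transit-time distribution, and all rate parameters, are illustrative assumptions:

```python
import heapq
import math
import random

def dssa(a0, k, delay_sampler, t_end, seed=0):
    """Minimal delay-SSA sketch for A -> B, where each B appears only
    after a stochastic delay drawn from delay_sampler (standing in for
    e.g. a diffusion or translocation time distribution)."""
    rng = random.Random(seed)
    t, A, B = 0.0, a0, 0
    pending = []  # min-heap of times at which a delayed B materialises
    while t < t_end:
        rate = k * A
        # Exponential waiting time to the next reaction initiation.
        t_next = t + (rng.expovariate(rate) if rate > 0 else math.inf)
        # Deliver delayed products completing before the next firing.
        # (The delayed channel does not change A's propensity, so the
        # sampled t_next stays valid by memorylessness.)
        while pending and pending[0] <= min(t_next, t_end):
            t = heapq.heappop(pending)
            B += 1
        if t_next > t_end:
            break
        t = t_next
        A -= 1  # the reaction initiates now ...
        heapq.heappush(pending, t + delay_sampler(rng))  # ... B is delayed
    return A, B

# Gamma-distributed delays as a stand-in for a delay distribution that
# the paper would obtain from in silico or in vitro experiments.
final_A, final_B = dssa(a0=100, k=1.0,
                        delay_sampler=lambda r: r.gammavariate(2.0, 0.5),
                        t_end=50.0)
print(final_A, final_B)  # copies conserved: final_A + final_B <= 100
```

The point of the design is that the spatial transport step never has to be simulated explicitly; it is collapsed into the sampled delay, which is where the computational saving over a fully spatially resolved simulator comes from.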
Abstract:
This paper presents a fault diagnosis method based on an adaptive neuro-fuzzy inference system (ANFIS) in combination with decision trees. Classification and regression trees (CART), one of the decision tree methods, are used as a feature selection procedure to select pertinent features from the data set. The crisp rules obtained from the decision tree are then converted to fuzzy if-then rules, which are employed to identify the structure of the ANFIS classifier. A hybrid of back-propagation and the least-squares algorithm is utilized to tune the parameters of the membership functions. In order to evaluate the proposed algorithm, data sets obtained from the vibration and current signals of induction motors are used. The results indicate that the CART–ANFIS model has potential for fault diagnosis of induction motors.
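The feature selection step can be illustrated with the criterion CART actually uses: rank each feature by the best Gini-impurity reduction a single threshold split on it can achieve. This is a self-contained sketch of that ranking, not the authors' pipeline; the toy "vibration feature" data and class labels are invented for illustration:

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_split_gain(xs, ys):
    """Best Gini gain achievable by one threshold on a single feature."""
    base = gini(ys)
    best = 0.0
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        w = len(left) / len(ys)
        gain = base - (w * gini(left) + (1 - w) * gini(right))
        best = max(best, gain)
    return best

def rank_features(X, y):
    """Rank feature indices by single-split Gini gain (CART-style)."""
    n_feat = len(X[0])
    gains = [best_split_gain([row[f] for row in X], y) for f in range(n_feat)]
    return sorted(range(n_feat), key=lambda f: -gains[f]), gains

# Toy data: feature 0 separates the classes cleanly; feature 1 is noise.
X = [[0.1, 5.0], [0.2, 1.0], [0.9, 4.8], [0.8, 1.2]]
y = ["healthy", "healthy", "faulty", "faulty"]
order, gains = rank_features(X, y)
print(order, gains)
```

In the paper's scheme, the thresholds of the retained splits would then seed the fuzzy if-then rules (e.g. "if feature 0 is low then healthy") whose membership-function parameters ANFIS subsequently tunes.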
Abstract:
Increasingly, national and international governments have a strong mandate to develop national e-health systems to enable the delivery of much-needed healthcare services. Research is therefore needed into appropriate security and reliance structures for the development of health information systems that must comply with governmental and similar obligations. The protection of e-health information security is critical to the successful implementation of any e-health initiative. To address this, this paper proposes a security architecture for index-based e-health environments, according to the broad outline of Australia’s National E-health Strategy and the National E-health Transition Authority (NEHTA)’s Connectivity Architecture. The proposal, however, could equally be applied to any distributed, index-based health information system that references disparate health information systems. The practicality of the proposed security architecture is supported through an experimental demonstration. The successful completion of this prototype demonstrates the comprehensibility of the proposed architecture, and the clarity and feasibility of its system specifications, in enabling ready development of such a system. The test vehicle has also indicated a number of parameters that need to be considered in the design of any national index-based e-health system with reasonable levels of system security. This paper also identifies the need to evaluate the levels of education, training, and expertise required to create such a system.
Abstract:
Expert elicitation is the process of determining what expert knowledge is relevant to support a quantitative analysis and then eliciting that knowledge in a form that supports analysis or decision-making. The credibility of the overall analysis therefore relies on the credibility of the elicited knowledge. This, in turn, is determined by the rigor of the design and execution of the elicitation methodology, as well as by its clear communication to ensure transparency and repeatability. It is difficult to establish rigor when the elicitation methods are not documented, as often occurs in ecological research. In this chapter, we describe software that can be combined with a well-structured elicitation process to improve the rigor of expert elicitation and the documentation of its results.
Abstract:
Recently, a constraints-led approach has been promoted as a framework for understanding how children and adults acquire movement skills for sport and exercise (see Davids, Button & Bennett, 2008; Araújo et al., 2004). The aim of a constraints-led approach is to identify the nature of the interacting constraints that influence skill acquisition in learners. In this chapter the main theoretical ideas behind a constraints-led approach are outlined to assist practical applications by sports practitioners and physical educators in a non-linear pedagogy (see Chow et al., 2006, 2007). To achieve this goal, the chapter examines implications for some of the typical challenges facing sport pedagogists and physical educators in the design of learning programmes.
Abstract:
In this thesis, I advance the understanding of information technology (IT) governance research and corporate governance research by considering the question “How do boards govern IT?” The importance of IT to business has increased over the last decade, but there has been little academic research focused on boards and their role in the governance of IT (Van Grembergen, De Haes and Guldentops, 2004). Most of the research on information technology governance (ITG) has focused on advancing the understanding and measurement of the components of the ITG model (Buckby, Best & Stewart, 2008; Wilkin & Chenhall, 2010), a model recommended by the IT Governance Institute (2003) as ‘best practice’ for boards to use in governing IT. IT governance is considered to be the responsibility of the board and is said to form an important subset of an organisation’s corporate governance processes (Borth & Bradley, 2008). Boards need to govern IT as a result of the large capital investment in IT resources and organisations’ high dependency on IT. Van Grembergen, De Haes and Guldentops (2004) and De Haes and Van Grembergen (2009) indicate that corporate governance matters cannot be effectively discharged unless IT is governed properly, and call for further specific research on the role of the board in ITG. Researchers also indicate that the link between corporate governance and IT governance has been neglected (Borth & Bradley, 2008; Musson & Jordan, 2005; Bhattacharjya & Chang, 2008). This thesis addresses this gap in the ITG literature by providing a bridge between the ITG and corporate governance literatures. My thesis uses a critical realist epistemology and a mixed-method approach to gather insights into my research question. In the first phase of my research I develop a survey instrument to assess whether boards consider the components of the ITG model in governing IT. The results of this first study indicated that directors do not conceptualise their role in governing IT using the elements of the ITG model. I therefore moved to consider whether prominent corporate governance theories might elucidate how boards govern IT. In the second phase of the research, I used a qualitative, inductive, case-based study to assess whether agency, stewardship and resource dependence theories explain how boards govern IT in Australian universities. As the first in-depth study of university IT governance processes, my research contributes to the ITG field by revealing that Australian university board governance of IT is characterised by a combination of agency theory and stewardship theory behaviours and processes. The study also identified strong links between a university’s IT structure and evidence of agency and stewardship theories. This link provides insight into the structures element of the emerging enterprise governance of IT framework (Van Grembergen, De Haes & Guldentops, 2004; De Haes & Van Grembergen, 2009; Van Grembergen & De Haes, 2009b; Ko & Fink, 2010). My research makes an important contribution to governance research by identifying a key link between the corporate governance and ITG literatures and by providing insight into board IT governance processes. The research conducted in my thesis should encourage future researchers to continue to explore the links between corporate and IT governance research.
Abstract:
Most web service discovery systems use keyword-based search algorithms and, although partially successful, sometimes fail to satisfy users’ information needs. This has given rise to several semantics-based approaches that aim to go beyond simple attribute matching and capture the semantics of services. However, the results reported in the literature vary, and in many cases are worse than those obtained by keyword-based systems. We believe the accuracy of the mechanisms used to extract tokens from the non-natural-language sections of WSDL files directly affects the performance of these techniques, because some of them are more sensitive to noise. In this paper three existing tokenization algorithms are evaluated and a new algorithm that outperforms all the algorithms found in the literature is introduced.
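The paper's new algorithm is not described in the abstract, but the task it addresses can be made concrete: WSDL operation and type names are identifiers like `getHTTPResponse_v2Code`, not natural language, so a tokenizer must split on case changes, underscores, digits, and acronym runs. Below is a baseline regex tokenizer of the kind such evaluations compare against; the heuristics are a common convention, not the paper's method:

```python
import re

def tokenize_identifier(name: str) -> list:
    """Split a WSDL-style identifier (camelCase, PascalCase, snake_case,
    digits, acronym runs) into lower-case word tokens."""
    # First split on underscores and any non-word separators.
    parts = re.split(r"[_\W]+", name)
    tokens = []
    for part in parts:
        # Then split camelCase, keeping acronym runs such as "HTTP"
        # together while detaching a trailing capitalised word.
        for m in re.finditer(
                r"[A-Z]+(?=[A-Z][a-z])|[A-Z]?[a-z]+|[A-Z]+|\d+", part):
            tokens.append(m.group(0).lower())
    return tokens

print(tokenize_identifier("getHTTPResponse_v2Code"))
```

A tokenizer that mishandles cases like the acronym boundary above would emit noisy tokens (`httpr`, `esponse`), which is exactly the kind of noise the paper argues degrades downstream semantic matching.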
Abstract:
There are many applications in aeronautics with strong couplings between disciplines. One practical example is Unmanned Aerial Vehicle (UAV) automation, where there is strong coupling between operational constraints, aerodynamics, vehicle dynamics, mission planning and path planning. UAV path planning can be done either online or offline. The current state of online path-planning optimisation on UAVs with high-performance computation is not at the same level as its ground-based offline counterpart, mainly due to the volume, power and weight limitations of the UAV; some small UAVs do not have the computational power needed for some optimisation and path-planning tasks. In this paper, we describe an optimisation method that can be applied to Multi-disciplinary Design Optimisation problems and UAV path-planning problems. Hardware-based design optimisation techniques are used. The power and physical limitations of a UAV, which may not be a problem in PC-based solutions, can be addressed by utilising a Field Programmable Gate Array (FPGA) as an algorithm accelerator. The inevitable latency produced by the iterative process of an Evolutionary Algorithm (EA) is concealed by exploiting the parallelism within the dataflow paradigm of the EA on an FPGA architecture. Results compare software PC-based solutions and hardware-based solutions for benchmark mathematical problems as well as a simple real-world engineering problem. Results also indicate the practicality of the method, which can be used for more complex single- and multi-objective coupled problems in aeronautical applications.
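The FPGA dataflow implementation cannot be reproduced in a few lines, but the iterative EA loop whose latency it conceals can be sketched in software. This is a generic (not the authors') evolutionary algorithm with tournament selection, Gaussian mutation and elitism, run on a standard benchmark; every parameter value is an illustrative assumption. Each generation's selection and mutation of the population is exactly the independent, per-individual work an FPGA can pipeline in parallel:

```python
import random

def evolve(fitness, dim, pop_size=40, gens=200, sigma=0.3,
           bounds=(-5.0, 5.0), seed=1):
    """Minimal EA sketch: binary tournament selection, Gaussian
    mutation clipped to bounds, and elitism. Minimises `fitness`."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)]
           for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(gens):
        children = []
        for _ in range(pop_size):
            # Binary tournament: the fitter of two random parents breeds.
            a, b = rng.sample(pop, 2)
            parent = a if fitness(a) < fitness(b) else b
            # Gaussian mutation, clipped back into the search bounds.
            children.append([min(hi, max(lo, x + rng.gauss(0.0, sigma)))
                             for x in parent])
        pop = children
        pop[0] = best  # elitism: the best-so-far survives each generation
        best = min(pop + [best], key=fitness)
    return best

# Benchmark: sphere function, global minimum 0 at the origin.
sphere = lambda v: sum(x * x for x in v)
best = evolve(sphere, dim=3)
print(sphere(best))
```

On a CPU the `for _ in range(pop_size)` loop runs serially; the paper's point is that on an FPGA those per-child evaluations stream through a hardware pipeline, hiding the per-iteration latency that makes online EA-based path planning hard on small UAVs.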