948 results for Information Model
Abstract:
Encryption is a well-established technology for protecting sensitive data. However, once data are encrypted they can no longer be easily queried, and database performance depends on how the sensitive data are encrypted. In this paper we review conventional encryption methods that support only partial querying and propose an encryption method for numerical data that can be queried efficiently. The proposed system includes the design of the service scenario and the associated metadata.
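The abstract does not describe the scheme itself, so the following is only an illustrative sketch of one common way to make an encrypted numeric column range-queryable: bucketization, where the exact value is encrypted (here with AES-GCM from the Python cryptography package) and only a coarse bucket index is stored as query metadata. The column semantics and bucket boundaries are hypothetical.

import os, bisect
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Hypothetical bucket boundaries for a salary column (kept as query metadata).
BUCKETS = [0, 20_000, 40_000, 60_000, 80_000, 100_000]

key = AESGCM.generate_key(bit_length=128)
aead = AESGCM(key)

def encrypt_value(value: int) -> dict:
    """Encrypt the exact value; store only a coarse bucket id in the clear."""
    nonce = os.urandom(12)
    ct = aead.encrypt(nonce, str(value).encode(), None)
    return {"bucket": bisect.bisect_right(BUCKETS, value), "nonce": nonce, "ct": ct}

def range_query(rows, low, high):
    """Server side: filter candidate rows by bucket id only (no plaintext seen)."""
    lo_b = bisect.bisect_right(BUCKETS, low)
    hi_b = bisect.bisect_right(BUCKETS, high)
    candidates = [r for r in rows if lo_b <= r["bucket"] <= hi_b]
    # Client side: decrypt the candidates and apply the exact predicate.
    return [v for v in (int(aead.decrypt(r["nonce"], r["ct"], None)) for r in candidates)
            if low <= v <= high]

rows = [encrypt_value(v) for v in (15_000, 35_000, 52_000, 91_000)]
print(range_query(rows, 30_000, 60_000))   # -> [35000, 52000]

The trade-off this sketch illustrates is typical of queryable encryption: coarser buckets leak less about the plaintext but force the client to decrypt and discard more false-positive candidates.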
Abstract:
The new model of North Island Cenozoic palaeogeography developed by Kamp et al. has a range of important implications for the evolution of New Zealand terrestrial taxa over the past 30 Ma. Key aspects include the prolonged isolation of the biota on the North Island landmass from the larger and more diverse greater South Island, and the founding of North Island taxa from the potentially unusual ecosystem of a small island around Northland. The prolonged period of isolation is expected to have generated deep phylogenetic splits within taxa present on both islands, and an important current aim should be to identify such signals in surviving endemics to start building a picture of the historical phylogeography and inferred ecology of both islands through the Cenozoic. Given the potential differences in founding terrestrial species and climatic conditions, it seems likely that the ecology may have been very different between the North and South Islands. New genetic data from the 10 or so species of extinct moa suggest that the radiation of moa was much more recent than previously suggested, and reveal a complex pattern that is inferred to result from the interplay of the Cenozoic biogeography, marine barriers, and glacial cycles.
Abstract:
Enterprise architecture (EA) management has become an intensively discussed approach to managing enterprise transformations. Despite the popularity and potential of EA, both researchers and practitioners lament a lack of knowledge about the realization of benefits from EA. To determine the benefits from EA, we explore the various dimensions of EA benefit realization and report on the development of a validated and robust measurement instrument. In this paper, we test the reliability and construct validity of the EA benefit realization model (EABRM), which we designed based on the DeLone & McLean IS success model and findings from exploratory interviews. A confirmatory factor analysis confirms that five distinct and individually important dimensions have an impact on the benefits derived from EA: EA artefact quality, EA infrastructure quality, EA service quality, EA culture, and EA use. The analysis presented in this paper shows that the EA benefit realization model is an instrument with strong reliability and validity.
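The abstract reports reliability and construct-validity testing but not the computations; as a minimal sketch of one standard reliability check (not necessarily the authors' procedure), Cronbach's alpha for a single dimension can be computed from item responses. The item names and response data below are hypothetical.

import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses to three "EA service quality" items on a 1-7 scale.
responses = pd.DataFrame({
    "svc_q1": [5, 6, 4, 7, 5, 6],
    "svc_q2": [5, 7, 4, 6, 5, 6],
    "svc_q3": [4, 6, 5, 7, 5, 7],
})
print(round(cronbach_alpha(responses), 2))   # values above roughly 0.7 are usually read as acceptable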
Abstract:
Successful firms use business model innovation to rethink the way they do business and to transform industries. However, current research on business model innovation lacks theoretical underpinnings and is in need of new insights. The objective of this paper is to advance our understanding of both the business model concept and business model innovation based on service logic as a foundation for customer value and value creation. We present and discuss a rationale for business models based on 'service logic', with service as a value-supporting process, and compare it with a business model based on 'goods logic', with goods as value-supporting resources. The implications for each of the business model dimensions (customer, value proposition, organizational architecture, and revenue model) are described and discussed in detail.
Abstract:
An enhanced mill extraction model has been developed to calculate mill performance parameters and to predict the extraction performance of a milling unit. The model takes into account the fibre suspended in juice streams and calculates filling ratio, reabsorption factor, imbibition coefficient, and separation efficiency using more complete definitions than those used in previous extraction models. A mass balance model is used to determine the fibre, brix and moisture mass flows between milling units so that a complete milling train, including the return stream from the juice screen, is modelled. Model solutions are presented to determine the effect on brix extraction of different levels of fibre in juice and of the efficiency of fibre separation in the juice screen. The model provides more accurate results than earlier models, leading to a better understanding and improvement of the milling process.
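The paper's actual equations are not reproduced in the abstract; the sketch below only illustrates the kind of component-wise (fibre, brix, moisture) mass balance it refers to, for a single milling unit with hypothetical stream names and flow figures.

from dataclasses import dataclass

@dataclass
class Stream:
    """Mass flows in t/h for the three components tracked by the balance."""
    fibre: float
    brix: float
    water: float

def balance_residual(inputs: list[Stream], outputs: list[Stream]) -> Stream:
    """Component-wise (in - out); all three residuals should be ~0 if the balance closes."""
    return Stream(
        fibre=sum(s.fibre for s in inputs) - sum(s.fibre for s in outputs),
        brix=sum(s.brix for s in inputs) - sum(s.brix for s in outputs),
        water=sum(s.water for s in inputs) - sum(s.water for s in outputs),
    )

# Hypothetical figures for one milling unit (t/h).
feed        = Stream(fibre=50.0, brix=45.0, water=205.0)
imbibition  = Stream(fibre=0.0,  brix=0.0,  water=90.0)
bagasse_out = Stream(fibre=49.0, brix=9.0,  water=61.0)
juice_out   = Stream(fibre=1.0,  brix=36.0, water=234.0)   # includes fibre suspended in juice

print(balance_residual([feed, imbibition], [bagasse_out, juice_out]))
print(f"brix extraction: {juice_out.brix / feed.brix:.1%}")  # 80.0% for these figures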
Abstract:
Information and communication technology (ICT) systems are almost ubiquitous in the modern world. It is hard to identify any industry, or for that matter any part of society, that is not in some way dependent on these systems and their continued secure operation. The security of information infrastructures, at both an organisational and a societal level, is therefore of critical importance. Information security risk assessment is an essential part of ensuring that these systems are appropriately protected and positioned to deal with a rapidly changing threat environment. The complexity of these systems and their interdependencies, however, introduces a similar complexity into the information security risk assessment task. This complexity suggests that information security risk assessment cannot, optimally, be undertaken manually. Information security risk assessment for individual components of the information infrastructure can be aided by the use of a software tool, a type of simulation, which concentrates on modelling failure rather than normal operation. Avoiding the modelling of the operational system once again reduces the complexity of the assessment task. The use of such a tool provides the opportunity to reuse information in many different ways by developing a repository of relevant information to aid in risk assessment and management as well as governance and compliance activities. Widespread use of such a tool allows the risk models developed for individual information infrastructure components to be connected in order to build a model of information security exposures across the entire information infrastructure. In this thesis, conceptual and practical aspects of risk and its underlying epistemology are analysed to produce a model suitable for application to information security risk assessment. Based on this work, prototype software has been developed to explore these concepts for information security risk assessment. Initial work has been carried out to investigate the use of this software for information security compliance and governance activities. Finally, an initial concept for extending this approach across an information infrastructure is presented.
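The prototype software itself is not described here; as a toy illustration of modelling failure rather than normal operation, the sketch below propagates a single component failure through a hypothetical dependency graph to list the exposed parts of the infrastructure.

from collections import deque

# Hypothetical dependency graph: component -> components that depend on it.
DEPENDENTS = {
    "power":      ["network", "db-server"],
    "network":    ["db-server", "web-portal"],
    "db-server":  ["web-portal", "billing"],
    "web-portal": [],
    "billing":    [],
}

def impacted(initial_failure: str) -> set[str]:
    """Breadth-first propagation: everything reachable from the failed component is exposed."""
    seen = {initial_failure}
    queue = deque([initial_failure])
    while queue:
        current = queue.popleft()
        for dep in DEPENDENTS.get(current, []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen

print(sorted(impacted("network")))   # ['billing', 'db-server', 'network', 'web-portal']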
Abstract:
The ability to detect unusual events in surveillance footage as they happen is a highly desirable feature for a surveillance system. However, this problem remains challenging in crowded scenes due to occlusions and the clustering of people. In this paper, we propose using the Distributed Behavior Model (DBM), which has been widely used in computer graphics, for video event detection. Our approach does not rely on object tracking and is robust to camera movements. We use sparse coding for classification and test our approach on various datasets. Our proposed approach outperforms a state-of-the-art method that uses the social force model and Latent Dirichlet Allocation.
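The abstract mentions sparse coding for classification without giving details, so the following is only an illustrative sketch of a common formulation: learn a dictionary from features of normal footage and score a test feature by its sparse reconstruction error, treating a large residual as unusual. It uses scikit-learn with synthetic, hypothetical feature vectors.

import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning, SparseCoder

rng = np.random.default_rng(0)

# Hypothetical 64-d motion descriptors extracted from normal training footage.
normal_features = rng.normal(size=(500, 64))

# Learn an overcomplete dictionary of "normal" patterns.
dico = MiniBatchDictionaryLearning(n_components=128, alpha=1.0, random_state=0)
D = dico.fit(normal_features).components_

coder = SparseCoder(dictionary=D, transform_algorithm="omp", transform_n_nonzero_coefs=5)

def reconstruction_error(x: np.ndarray) -> float:
    """Anomaly score: residual after sparse reconstruction with the normal dictionary."""
    code = coder.transform(x.reshape(1, -1))
    return float(np.linalg.norm(x - code @ D))

normal_test = rng.normal(size=64)
unusual_test = rng.normal(loc=5.0, size=64)   # shifted distribution stands in for an unusual event
print(reconstruction_error(normal_test) < reconstruction_error(unusual_test))  # typically True here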
Abstract:
A total histological grade does not necessarily distinguish between different manifestations of cartilage damage or degeneration. An accurate and reliable histological assessment method is required to separate normal and pathological tissue within a joint during treatment of degenerative joint conditions, and to sub-classify the latter in meaningful ways. The Modified Mankin method may be adaptable for this purpose. We investigated how much detail may be lost by assigning one composite score/grade to represent different degenerative components of the osteoarthritic condition. We used four ovine injury models (sham surgery, anterior cruciate ligament/medial collateral ligament instability, simulated anatomic anterior cruciate ligament reconstruction, and meniscal removal) to induce different degrees and potentially different 'types' (mechanisms) of osteoarthritis. Articular cartilage was systematically harvested, prepared for histological examination and graded in a blinded fashion using a Modified Mankin grading method. The results showed that the possible permutations of cartilage damage were substantial and far more varied than the intended use of current histological grading systems allows. Of 1352 cartilage specimens graded, 234 different manifestations of potential histological damage were observed across 23 potential individual grades of the Modified Mankin grading method. The results presented here show that the components underlying current composite histological grades may contain additional information that could potentially discern different stages or mechanisms of cartilage damage and degeneration in a sheep model. This approach may be applicable to other grading systems.
Abstract:
Client puzzles are cryptographic problems that are neither easy nor hard to solve. Most puzzles are based on either number theoretic or hash inversion problems. Hash-based puzzles are very efficient but have so far been proven secure only in the random oracle model; number theoretic puzzles, while secure in the standard model, tend to be inefficient. In this paper, we solve the problem of constructing cryptographic puzzles that are secure in the standard model and are very efficient. We present an efficient number theoretic puzzle that satisfies the puzzle security definition of Chen et al. (ASIACRYPT 2009). To prove the security of our puzzle, we introduce a new variant of the interval discrete logarithm assumption which may be of independent interest, and show this new problem to be hard under reasonable assumptions. Our experimental results show that, for a 512-bit modulus, the solution verification time of our proposed puzzle can be up to 50x and 89x faster than the Karame-Capkun puzzle and Rivest et al.'s time-lock puzzle, respectively. In particular, the solution verification time of our puzzle is only 1.4x slower than that of Chen et al.'s efficient hash-based puzzle.
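The paper's number theoretic construction is not given in the abstract; purely to illustrate the hash-based puzzles it is compared against, the sketch below implements a minimal hash inversion puzzle that is moderately expensive to solve yet verifiable with a single hash. The difficulty parameter is an arbitrary choice for the example.

import hashlib, os, itertools

DIFFICULTY_BITS = 18   # expected ~2**18 hash attempts to solve

def new_puzzle() -> bytes:
    """Server issues a fresh random challenge."""
    return os.urandom(16)

def leading_zero_bits(digest: bytes) -> int:
    value = int.from_bytes(digest, "big")
    return 256 - value.bit_length() if value else 256

def solve(challenge: bytes) -> int:
    """Client searches for a nonce whose hash with the challenge has enough leading zero bits."""
    for nonce in itertools.count():
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if leading_zero_bits(digest) >= DIFFICULTY_BITS:
            return nonce

def verify(challenge: bytes, nonce: int) -> bool:
    """Server verifies with a single hash, which is what makes hash-based puzzles so cheap to check."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return leading_zero_bits(digest) >= DIFFICULTY_BITS

challenge = new_puzzle()
nonce = solve(challenge)          # costs roughly 2**DIFFICULTY_BITS hashes on average
assert verify(challenge, nonce)   # costs one hash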
Abstract:
While the studio environment has been promoted as an ideal educational setting for project-based disciplines, few qualitative studies have been undertaken in a comprehensive way (Bose, 2007). This study responds to this need by adopting a Grounded Theory methodology within a qualitative comparative approach. The research aims to explore the limitations and benefits of a face-to-face (f2f) design studio as well as a virtual design studio (VDS), as experienced by architecture students and educators at an Australian university, in order to find the optimal combination for a blended environment that maximizes learning. The main outcome is a holistic, multidimensional blended model that is sufficiently flexible to adapt to various settings, in the process facilitating constructivist learning through self-determination, self-management, and personalization of the learning environment.
Abstract:
Building a Web 2.0 site does not necessarily ensure its success. We aim to better understand what improves the success of a site by drawing insight from biologically inspired design patterns. Web 2.0 sites provide a mechanism for human interaction, enabling powerful intercommunication between massive volumes of users. Early Web 2.0 site providers that were previously dominant are being succeeded by newer sites providing innovative social interaction mechanisms. Understanding which site traits contribute to this success drives research into Web site mechanics, using models to describe the associated social networking behaviour. Some of these models attempt to show how the volume of users provides self-organisation and self-contextualisation of content. One model describing such coordinated environments is stigmergy, a term originally used to describe coordinated insect behaviour. This paper explores how exploiting stigmergy can provide a valuable mechanism for identifying and analysing online user behaviour, particularly given that user freedom of choice is restricted by the functionality the web site provides. This will aid in building better collaborative Web sites and improve their collaborative processes.
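Stigmergy means coordination through traces left in a shared environment; as a toy sketch (not a model from the paper), the loop below shows how each view reinforcing an item's visibility, combined with trace decay, lets popular content self-organise to the top of a listing. Item names and parameters are hypothetical.

import random

# Each view leaves a "trace" that biases what later users are shown.
items = {"howto-guide": 1.0, "meme-thread": 1.0, "news-post": 1.0}
EVAPORATION = 0.95   # old traces decay, keeping the ranking responsive

random.seed(1)
for _ in range(1000):
    # Probability of an item being shown is proportional to its accumulated trace.
    pick = random.choices(list(items), weights=list(items.values()))[0]
    items[pick] += 1.0                                       # the visit reinforces the trace
    items = {k: v * EVAPORATION for k, v in items.items()}   # evaporation step

print(sorted(items.items(), key=lambda kv: -kv[1]))   # positive feedback concentrates attention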