Abstract:
As computers approach the physical limits of information storable in memory, new methods will be needed to further improve information storage and retrieval. We propose a quantum-inspired, vector-based approach, which offers a contextually dependent mapping from subsymbolic to symbolic representations of information. If implemented computationally, this approach would provide an exceptionally high density of information storage without the traditionally required physical increase in storage capacity. The approach is inspired by the structure of human memory and incorporates elements of Gärdenfors' Conceptual Space approach and Humphreys et al.'s matrix model of memory.
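The matrix-model idea referenced above can be sketched in a few lines: cue-target associations are superposed as outer products in a single matrix, and multiplying by a cue approximately recovers its target. This is a minimal illustration of the general storage scheme, not the paper's proposal; all dimensions and vectors here are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_vector(dim: int) -> np.ndarray:
    """Random unit vector standing in for a distributed item representation."""
    v = rng.normal(size=dim)
    return v / np.linalg.norm(v)

dim = 256  # illustrative dimensionality
cues = [random_vector(dim) for _ in range(3)]
targets = [random_vector(dim) for _ in range(3)]

# Store each cue-target pair as an outer product, superposed in one matrix.
memory = sum(np.outer(t, c) for c, t in zip(cues, targets))

# Retrieval: multiplying by a cue approximately recovers its target,
# because random high-dimensional cues are nearly orthogonal.
retrieved = memory @ cues[0]
similarity = retrieved @ targets[0] / np.linalg.norm(retrieved)
```

Because the three stored pairs share one fixed-size matrix, capacity grows with vector dimensionality rather than with the number of physical storage cells, which is the intuition the abstract appeals to.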
Abstract:
Understanding the business value of IT has mostly been studied in developed countries, but because most investment in developing countries is derived from external sources, the influence of that investment on business value is likely to be different. We test this notion using a two-layer model: we examine the impact of IT investments on firm processes, and the relationship of these processes to firm performance in a developing country. Our findings suggest that investment in different areas of IT positively relates to improvements in intermediate business processes, and that these intermediate business processes positively relate to the overall financial performance of firms in a developing country.
Abstract:
Good automatic detection of anomalous human behaviour is one of the goals of research on smart surveillance systems. Automatic detection addresses several human-factor issues underlying existing surveillance systems. To create such a detection system, contextual information needs to be considered, because context is required to correctly understand human behaviour. Unfortunately, the use of contextual information is still limited in existing approaches to automatic anomalous human behaviour detection. This paper proposes a context space model which has two benefits: (a) it provides guidelines for system designers to select information which can be used to describe context; (b) it enables a system to distinguish between different contexts. A comparative analysis is conducted between a context-based system which employs the proposed context space model and a system implemented based on one of the existing approaches. The comparison is applied to a scenario constructed using video clips from the CAVIAR dataset. The results show that the context-based system outperforms the other system, because the context space model allows the system to consider knowledge learned from the relevant context only.
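One way to picture distinguishing between contexts, as the abstract describes, is to treat each context as a point in a feature space and assign an observation to the nearest known context. The feature set below (time of day, crowd density) and the context names are illustrative assumptions, not the paper's actual model.

```python
import math

# Illustrative context attributes: (hour of day, crowd density in [0, 1]).
# These names and features are assumptions for the sketch only.
contexts = {
    "morning_rush": (8.0, 0.9),
    "night_quiet": (23.0, 0.1),
}

def distance(a, b):
    """Euclidean distance between two points in the context space."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_context(observation):
    """Assign an observation to the nearest known context."""
    return min(contexts, key=lambda name: distance(contexts[name], observation))
```

A behaviour that is normal in one region of this space (a crowd at 8 a.m.) can then be flagged as anomalous in another (the same crowd at 11 p.m.), which is the kind of context sensitivity the paper argues for.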
Abstract:
This book provides the much needed international dimension on the payoffs of information technology investments. The bulk of the research on the impact of information technology investments has been undertaken in developed economies, mainly the United States. This research provides an alternative dimension: a developing-country perspective on how information technology investments impact organizations. Secondly, there has been much debate and controversy over how to measure information technology investment payoffs. This research uses an innovative two-stage model, proposing that information technology investments will first impact processes and that improvement in the processes will then impact performance. In doing so, it considers sectors of information technology investment rather than treating it as one whole. Finally, almost all prior studies in this area have considered only the tangible impact of information technology investments. This research proposes that the benefits can only be better understood by looking at both the tangible and intangible benefits.
Abstract:
With the advent of social web initiatives, some argued that these new emerging tools might be useful for tacit knowledge sharing by providing interactive and collaborative technologies. However, there is still a paucity of literature on how and what social media might contribute to facilitating tacit knowledge sharing. Therefore, this paper theoretically investigates and maps social media concepts and characteristics against the requirements of tacit knowledge creation and sharing. Through a systematic literature review, five major requirements were found that need to be present in an environment involving tacit knowledge sharing. These requirements were analyzed against social media concepts and characteristics to see how they map together. The results show that social media can satisfy some of the main requirements of tacit knowledge sharing. The relationships are illustrated in a conceptual framework, and further empirical studies are suggested to corroborate the findings of this study.
Abstract:
Experts’ views and commentary are highly respected in every discipline. However, unlike traditional disciplines such as medicine, mathematics and engineering, Information Systems (IS) expertise is difficult to define, even though seeking expert advice and views is common in the areas of IS project management, system implementation and evaluation. This research-in-progress paper attempts to understand the characteristics of IS experts through a comprehensive literature review of analogous disciplines and then derives a formative research model with three main constructs. A validated construct of IS expertise will have a wide range of implications for research and practice.
Abstract:
Encryption is a well-established technology for protecting sensitive data. However, once encrypted, the data can no longer be easily queried, and database performance depends on how the sensitive data are encrypted. In this paper we review conventional encryption methods, which can only be partially queried, and propose an encryption method for numerical data that can be queried effectively. The proposed system includes the design of the service scenario and the metadata.
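One common way to make encrypted numerical data range-queryable is to store a coarse bucket index as metadata next to each ciphertext: the server filters rows by bucket, and the exact range check happens after decryption. The sketch below illustrates that metadata idea only; it is not the paper's scheme, and the XOR "cipher" is a deliberately insecure placeholder for a real encryption algorithm.

```python
import hashlib

SECRET = b"demo-key"  # illustrative only; a real system needs proper key management

def toy_encrypt(value: int) -> bytes:
    """Placeholder cipher: XOR with a key-derived pad (NOT secure)."""
    pad = hashlib.sha256(SECRET + b"pad").digest()
    raw = value.to_bytes(8, "big")
    return bytes(a ^ b for a, b in zip(raw, pad))

def toy_decrypt(ct: bytes) -> int:
    pad = hashlib.sha256(SECRET + b"pad").digest()
    return int.from_bytes(bytes(a ^ b for a, b in zip(ct, pad)), "big")

def bucket(value: int, width: int = 100) -> int:
    """Coarse bucket index stored as query metadata beside the ciphertext."""
    return value // width

# Server-side table: (bucket, ciphertext) pairs; values stay encrypted at rest.
table = [(bucket(v), toy_encrypt(v)) for v in [42, 150, 160, 999]]

def range_query(lo: int, hi: int):
    """Filter candidate rows by bucket index, then decrypt and refine exactly."""
    candidates = [ct for b, ct in table if bucket(lo) <= b <= bucket(hi)]
    return sorted(v for v in map(toy_decrypt, candidates) if lo <= v <= hi)
```

The bucket width trades query performance against information leakage: wider buckets reveal less about each value but force the client to decrypt more false-positive rows.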
Abstract:
The new model of North Island Cenozoic palaeogeography developed by Kamp et al. has a range of important implications for the evolution of New Zealand terrestrial taxa over the past 30 Ma. Key aspects include the prolonged isolation of the biota on the North Island landmass from the larger and more diverse greater South Island, and the founding of North Island taxa from the potentially unusual ecosystem of a small island around Northland. The prolonged period of isolation is expected to have generated deep phylogenetic splits within taxa present on both islands, and an important current aim should be to identify such signals in surviving endemics to start building a picture of the historical phylogeography, and inferred ecology, of both islands through the Cenozoic. Given the potential differences in founding terrestrial species and climatic conditions, it seems likely that the ecology may have been very different between the North and South Islands. New genetic data from the 10 or so species of extinct moa suggest that the radiation of moa was much more recent than previously suggested, and reveal a complex pattern that is inferred to result from the interplay of the Cenozoic biogeography, marine barriers, and glacial cycles.
Abstract:
Enterprise architecture (EA) management has become an intensively discussed approach to manage enterprise transformations. Despite the popularity and potential of EA, both researchers and practitioners lament a lack of knowledge about the realization of benefits from EA. To determine the benefits from EA, we explore the various dimensions of EA benefit realization and report on the development of a validated and robust measurement instrument. In this paper, we test the reliability and construct validity of the EA benefit realization model (EABRM), which we have designed based on the DeLone & McLean IS success model and findings from exploratory interviews. A confirmatory factor analysis confirms the existence of an impact of five distinct and individually important dimensions on the benefits derived from EA: EA artefact quality, EA infrastructure quality, EA service quality, EA culture, and EA use. The analysis presented in this paper shows that the EA benefit realization model is an instrument that demonstrates strong reliability and validity.
Abstract:
Successful firms use business model innovation to rethink the way they do business and to transform industries. However, current research on business model innovation lacks theoretical underpinnings and is in need of new insights. The objective of this paper is to advance our understanding of both the business model concept and business model innovation, based on service logic as a foundation for customer value and value creation. We present and discuss a rationale for business models based on 'service logic', with service as a value-supporting process, and compare it with a business model based on 'goods logic', with goods as value-supporting resources. The implications for each of the business model dimensions (customer, value proposition, organizational architecture and revenue model) are described and discussed in detail.
Abstract:
An enhanced mill extraction model has been developed to calculate mill performance parameters and to predict the extraction performance of a milling unit. The model takes into account the fibre suspended in juice streams and calculates filling ratio, reabsorption factor, imbibition coefficient, and separation efficiency using more complete definitions than those used in previous extraction models. A mass balance model is used to determine the fibre, brix and moisture mass flows between milling units so that a complete milling train, including the return stream from the juice screen, is modelled. Model solutions are presented to determine the effect of different levels of fibre in juice and efficiency of fibre separation in the juice screen on brix extraction. The model provides more accurate results than earlier models leading to better understanding and improvement of the milling process.
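The mass-balance bookkeeping described above can be illustrated with the simplest possible case: at steady state, the brix entering a milling unit with the cane either leaves in the juice or remains in the bagasse. The function and figures below are an illustrative single-unit sketch, not the enhanced multi-unit model of the paper.

```python
def brix_extraction(brix_in_cane: float, brix_in_bagasse: float) -> float:
    """Fraction of incoming brix recovered in the juice stream, from a
    steady-state mass balance: brix in = brix out in juice + brix in bagasse.
    Inputs are mass flows in consistent units (e.g. t/h)."""
    brix_in_juice = brix_in_cane - brix_in_bagasse
    return brix_in_juice / brix_in_cane

# Illustrative numbers: 14 t/h of brix enters with the cane,
# 1.4 t/h leaves in the bagasse, so 12.6 t/h is extracted in the juice.
extraction = brix_extraction(14.0, 1.4)  # 0.9, i.e. 90% brix extraction
```

The full model in the paper chains such balances across every milling unit (including the juice-screen return stream), which is what lets it resolve fibre, brix and moisture flows for a complete milling train.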
Abstract:
Information and communication technology (ICT) systems are almost ubiquitous in the modern world. It is hard to identify any industry, or for that matter any part of society, that is not in some way dependent on these systems and their continued secure operation. Therefore the security of information infrastructures, on both an organisational and a societal level, is of critical importance. Information security risk assessment is an essential part of ensuring that these systems are appropriately protected and positioned to deal with a rapidly changing threat environment. The complexity of these systems and their interdependencies, however, introduces a similar complexity to the information security risk assessment task. This complexity suggests that information security risk assessment cannot, optimally, be undertaken manually. Information security risk assessment for individual components of the information infrastructure can be aided by the use of a software tool, a type of simulation, which concentrates on modelling failure rather than normal operation. Avoiding the modelling of the operational system again reduces the complexity of the assessment task. The use of such a tool provides the opportunity to reuse information in many different ways by developing a repository of relevant information to aid in risk assessment and management as well as governance and compliance activities. Widespread use of such a tool allows the risk models developed for individual information infrastructure components to be connected, in order to develop a model of information security exposures across the entire information infrastructure. In this thesis, conceptual and practical aspects of risk and its underlying epistemology are analysed to produce a model suitable for application to information security risk assessment.
Based on this work, prototype software has been developed to explore these concepts for information security risk assessment. Initial work has been carried out to investigate the use of this software for information security compliance and governance activities. Finally, an initial concept for extending the use of this approach across an information infrastructure is presented.
Abstract:
The ability to detect unusual events in surveillance footage as they happen is a highly desirable feature for a surveillance system. However, this problem remains challenging in crowded scenes due to occlusions and the clustering of people. In this paper, we propose using the Distributed Behavior Model (DBM), which has been widely used in computer graphics, for video event detection. Our approach does not rely on object tracking, and is robust to camera movements. We use sparse coding for classification, and test our approach on various datasets. Our proposed approach outperforms a state-of-the-art method which uses the social force model and Latent Dirichlet Allocation.
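The classification step mentioned above, using coding against a learned dictionary, can be sketched by reconstructing a feature vector with a per-class dictionary and picking the class with the smallest reconstruction error. The sketch uses plain least squares instead of true sparse coding, and the dictionaries, dimensions and class names are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative per-class dictionaries: columns are basis atoms. In a real
# system these would be learned from training footage, not drawn at random.
dictionaries = {
    "normal": rng.normal(size=(20, 5)),
    "anomalous": rng.normal(size=(20, 5)),
}

def classify(feature: np.ndarray) -> str:
    """Reconstruct the feature with each class dictionary (least squares
    here, where sparse coding would enforce sparse coefficients) and pick
    the class whose dictionary reconstructs it best."""
    def residual(D: np.ndarray) -> float:
        coef, *_ = np.linalg.lstsq(D, feature, rcond=None)
        return float(np.linalg.norm(feature - D @ coef))
    return min(dictionaries, key=lambda c: residual(dictionaries[c]))

# A feature lying in the span of the "normal" dictionary is labelled normal.
sample = dictionaries["normal"] @ rng.normal(size=5)
```

The same residual idea extends to anomaly detection with a single "normal" dictionary: events whose reconstruction error exceeds a threshold are flagged as unusual.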
Abstract:
A total histological grade does not necessarily distinguish between different manifestations of cartilage damage or degeneration. An accurate and reliable histological assessment method is required to separate normal and pathological tissue within a joint during treatment of degenerative joint conditions and to sub-classify the latter in meaningful ways. The Modified Mankin method may be adaptable for this purpose. We investigated how much detail may be lost by assigning one composite score/grade to represent different degenerative components of the osteoarthritic condition. We used four ovine injury models (sham surgery, anterior cruciate ligament/medial collateral ligament instability, simulated anatomic anterior cruciate ligament reconstruction and meniscal removal) to induce different degrees and potentially 'types' (mechanisms) of osteoarthritis. Articular cartilage was systematically harvested, prepared for histological examination and graded in a blinded fashion using a Modified Mankin grading method. Results showed that the possible permutations of cartilage damage were significant and far more varied than current histological grading systems are intended to capture. Of 1352 cartilage specimens graded, 234 different manifestations of potential histological damage were observed across 23 potential individual grades of the Modified Mankin grading method. The results presented here show that current composite histological grading may conceal additional information that could potentially discern different stages or mechanisms of cartilage damage and degeneration in a sheep model. This approach may be applicable to other grading systems.
Abstract:
Client puzzles are cryptographic problems that are neither easy nor hard to solve. Most puzzles are based on either number-theoretic or hash-inversion problems. Hash-based puzzles are very efficient but so far have been shown secure only in the random oracle model; number-theoretic puzzles, while secure in the standard model, tend to be inefficient. In this paper, we solve the problem of constructing cryptographic puzzles that are secure in the standard model and are very efficient. We present an efficient number-theoretic puzzle that satisfies the puzzle security definition of Chen et al. (ASIACRYPT 2009). To prove the security of our puzzle, we introduce a new variant of the interval discrete logarithm assumption which may be of independent interest, and show this new problem to be hard under reasonable assumptions. Our experimental results show that, for a 512-bit modulus, the solution verification time of our proposed puzzle can be up to 50x and 89x faster than the Karame-Capkun puzzle and Rivest et al.'s time-lock puzzle, respectively. In particular, the solution verification time of our puzzle is only 1.4x slower than that of Chen et al.'s efficient hash-based puzzle.
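For readers unfamiliar with client puzzles, the hash-based variety the abstract compares against can be sketched in a few lines: the solver brute-forces a nonce until a hash has enough leading zero bits, while the verifier checks the solution with a single hash. This is a generic hash-inversion puzzle sketch, not the paper's number-theoretic construction, and the parameter choices are illustrative.

```python
import hashlib
from itertools import count

def leading_zero_bits(digest: bytes) -> int:
    """Number of leading zero bits in a byte string."""
    value = int.from_bytes(digest, "big")
    return len(digest) * 8 - value.bit_length()

def solve(challenge: bytes, difficulty: int) -> int:
    """Brute-force a nonce so that H(challenge || nonce) has `difficulty`
    leading zero bits; expected work grows as 2**difficulty."""
    for nonce in count():
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if leading_zero_bits(digest) >= difficulty:
            return nonce

def verify(challenge: bytes, nonce: int, difficulty: int) -> bool:
    """Verification costs a single hash, which is why cheap verification
    is the benchmark that puzzle constructions compete on."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return leading_zero_bits(digest) >= difficulty
```

The asymmetry between solving (tunable, moderately expensive) and verifying (one hash) is what makes client puzzles useful against denial-of-service attacks; the paper's contribution is achieving comparable verification cost with standard-model security.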