969 results for Information Attacks


Relevance:

20.00%

Publisher:

Abstract:

NIST's forthcoming Advanced Hash Standard (AHS) competition to select the SHA-3 hash function requires that each candidate hash function submission include at least one construction supporting the FIPS 198 HMAC application. As part of its evaluation, NIST aims to select either a candidate hash function that is more resistant to known side channel attacks (SCA) when plugged into HMAC, or one that has an alternative MAC mode more resistant to known SCA than the other submitted alternatives. In response, we perform differential power analysis (DPA) on possible smart card implementations of some recently proposed MAC alternatives to NMAC (a fully analyzed variant of HMAC) and HMAC, and of NMAC/HMAC versions of some recently proposed hash and compression function modes. We show that the recently proposed BNMAC and KMDP MAC schemes are even weaker than NMAC/HMAC against DPA attacks, whereas multi-lane NMAC, EMD MAC and the keyed wide-pipe hash offer similar security to NMAC against DPA attacks. Our DPA attacks do not work on the NMAC settings of the MDC-2, Grindahl and MAME compression functions.
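
For context, FIPS 198 HMAC wraps the underlying hash function in two nested calls keyed with padded, XOR-masked variants of the secret key; it is this keyed inner/outer structure that DPA attacks on smart card implementations target. Below is a minimal sketch of the construction, assuming SHA-256 as the underlying hash purely for illustration (the competition candidates would be SHA-3 submissions instead):

```python
import hashlib
import hmac  # standard-library implementation, used here only for the sanity check

def hmac_sha256(key: bytes, message: bytes) -> bytes:
    """Minimal FIPS 198-style HMAC over SHA-256 (illustrative; use hmac.new in practice)."""
    block_size = 64                           # SHA-256 processes 64-byte blocks
    if len(key) > block_size:                 # over-long keys are hashed first
        key = hashlib.sha256(key).digest()
    key = key.ljust(block_size, b"\x00")      # then padded to one full block
    ipad = bytes(b ^ 0x36 for b in key)       # inner key pad
    opad = bytes(b ^ 0x5C for b in key)       # outer key pad
    inner = hashlib.sha256(ipad + message).digest()
    return hashlib.sha256(opad + inner).digest()

assert hmac_sha256(b"key", b"msg") == hmac.new(b"key", b"msg", hashlib.sha256).digest()
```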

Relevance:

20.00%

Publisher:

Abstract:

This paper reports a study that explores the experience of students majoring in technology and design in an undergraduate education degree. It examines their experiences in finding and using information for a practical assignment. To map the variation in the students' experience, the study uses a qualitative, interpretive approach to analyse the data, which was collected via one-to-one interviews. The analysis yielded five themes through which technology education students find and use information: interaction with others; experience (past and new); formal educational learning; the real world; and incidental occurrences. The intentions and strategies that shape the students' approaches to finding and using information are discussed, as are the implications for teaching practice.

Relevance:

20.00%

Publisher:

Abstract:

New digital media surrounds us. Little is known, however, about the influence of technology devices such as tablets (e.g. iPads) and smartphones on young children's lives in home and school settings, and about what this means for them throughout their schooling and beyond. Most research to date has focused on children aged six years and older, and much less (with a few exceptions) on preschool-aged children. This article draws on parent interviews to show how family members engage with technology as part of the flow of everyday life. Only time and an increased understanding of everyday practices will reveal the real value and scope of using digital media.

Relevance:

20.00%

Publisher:

Abstract:

Business Process Management describes a holistic management approach for the systematic design, modeling, execution, validation, monitoring and improvement of organizational business processes. Traditionally, most attention within this community has been given to control-flow aspects, i.e., the ordering and sequencing of business activities, often in isolation from the context in which these activities occur. In this paper, we propose an approach that allows executable process models to be integrated with Geographic Information Systems. This approach enables process models to take geospatial and other geographic aspects into account explicitly, both during the modeling phase and during execution. We contribute a structured modeling methodology, based on the well-known Business Process Model and Notation standard, which is formalized by means of a mapping to executable Colored Petri nets. We illustrate the feasibility of our approach by means of a sustainability-focused case example of a process with important ecological concerns.
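
As an illustration of what taking geospatial aspects into account at execution time could mean, the hypothetical sketch below guards a process activity with a point-in-polygon check; the activity name, region and coordinates are invented for illustration and are not taken from the paper's case example:

```python
# Hypothetical sketch: a process activity guarded by a geospatial predicate.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: is (x, y) inside the polygon given as [(x0, y0), ...]?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

protected_wetland = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)]

def start_dredging_activity(site_x, site_y):
    """Only enable the activity if the work site lies outside the protected area."""
    if point_in_polygon(site_x, site_y, protected_wetland):
        return "blocked: site is inside an ecologically protected region"
    return "activity enabled"

print(start_dredging_activity(2.0, 1.5))   # blocked
print(start_dredging_activity(6.0, 1.0))   # activity enabled
```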

Relevance:

20.00%

Publisher:

Abstract:

Focus groups are a popular qualitative research method for information systems researchers. However, compared with the abundance of research articles and handbooks on planning and conducting focus groups, there is surprisingly little research on how to analyse focus group data. Moreover, the few articles that specifically address focus group analysis are all in fields other than information systems, and offer little specific guidance for information systems researchers. Further, even the studies that exist in other fields do not provide a systematic and integrated procedure for analysing both focus group ‘content’ and ‘interaction’ data. As the focus group is a valuable method for answering the research questions of many IS studies (in business, government and society contexts), we believe that more attention should be paid to this method in IS research. This paper offers a systematic and integrated procedure for qualitative focus group data analysis in information systems research.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a field study of the Queensland Information Technology and Telecommunications Industry Strategy (QITIS) and of the Information Industries Board (IIB), a joint industry-state government body established in 1992 to oversee the implementation of that strategy for the development of the IT&T industry in Queensland. The aim of the study was to analyse differing stakeholder perspectives on the strategy and on its implementation by the IIB. The study forms part of a longer-term review that aims to develop methodologies for selecting appropriate strategies for the IT&T industry and for evaluating strategy outcomes.

Relevance:

20.00%

Publisher:

Abstract:

This thesis introduces a method of applying Bayesian Networks to combine information from a range of data sources for effective decision support systems. It develops a set of techniques for the development, validation, visualisation and application of complex systems models, with a working demonstration in an Australian airport environment. The methods presented here provide a modelling approach that produces highly flexible, informative and applicable interpretations of a system's behaviour under uncertain conditions. These end-to-end techniques are applied to the development of model-based dashboards to support operators and decision makers in the multi-stakeholder airport environment, providing both interpretations of the system's behaviour under uncertainty and a measure of confidence in those interpretations.
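
As a toy illustration of the kind of information fusion Bayesian networks provide, the sketch below combines two noisy, conditionally independent observations into a posterior belief by direct enumeration; the variables and probabilities are invented and are not the thesis's airport model:

```python
# Toy Bayesian-network fusion by exhaustive enumeration (illustrative only).

# Prior on a delay event, and two observations that are conditionally
# independent given the delay state.
p_delay = {True: 0.2, False: 0.8}
p_queue_alert_given_delay = {True: 0.7, False: 0.1}   # P(queue sensor fires | delay)
p_weather_bad_given_delay = {True: 0.6, False: 0.2}   # P(bad weather report | delay)

def posterior_delay(queue_alert: bool, weather_bad: bool) -> float:
    """P(Delay = True | observations), computed by enumerating Delay."""
    def joint(delay: bool) -> float:
        pq = p_queue_alert_given_delay[delay]
        pw = p_weather_bad_given_delay[delay]
        return (p_delay[delay]
                * (pq if queue_alert else 1 - pq)
                * (pw if weather_bad else 1 - pw))
    num = joint(True)
    return num / (num + joint(False))

print(f"P(delay | both sources alarming) = {posterior_delay(True, True):.3f}")
print(f"P(delay | both sources quiet)    = {posterior_delay(False, False):.3f}")
```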

Relevance:

20.00%

Publisher:

Abstract:

Objective: To estimate the relative inpatient costs of hospital-acquired conditions. Methods: Patient level costs were estimated using computerized costing systems that log individual utilization of inpatient services and apply sophisticated cost estimates from the hospital's general ledger. Occurrence of hospital-acquired conditions was identified using an Australian ‘condition-onset' flag for diagnoses not present on admission. These were grouped to yield a comprehensive set of 144 categories of hospital-acquired conditions to summarize data coded with ICD-10. Standard linear regression techniques were used to identify the independent contribution of hospital-acquired conditions to costs, taking into account the case-mix of a sample of acute inpatients (n = 1,699,997) treated in Australian public hospitals in Victoria (2005/06) and Queensland (2006/07). Results: The most costly types of complications were post-procedure endocrine/metabolic disorders, adding AU$21,827 to the cost of an episode, followed by MRSA (AU$19,881) and enterocolitis due to Clostridium difficile (AU$19,743). Aggregate costs to the system, however, were highest for septicaemia (AU$41.4 million), complications of cardiac and vascular implants other than septicaemia (AU$28.7 million), acute lower respiratory infections, including influenza and pneumonia (AU$27.8 million) and UTI (AU$24.7 million). Hospital-acquired complications are estimated to add 17.3% to treatment costs in this sample. Conclusions: Patient safety efforts frequently focus on dramatic but rare complications with very serious patient harm. Previous studies of the costs of adverse events have provided information on ‘indicators’ of safety problems rather than the full range of hospital-acquired conditions. Adding a cost dimension to priority-setting could result in changes to the focus of patient safety programmes and research. Financial information should be combined with information on patient outcomes to allow for cost-utility evaluation of future interventions.
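
The attribution idea behind the regression can be sketched as follows: episode cost is regressed on case-mix covariates plus a 0/1 flag for a hospital-acquired condition, and the flag's coefficient estimates the marginal cost the condition adds. The data below is synthetic and the single covariate is a stand-in for a full case-mix adjustment:

```python
# Sketch of the cost-attribution idea with synthetic data (not the study's data).
import numpy as np

rng = np.random.default_rng(0)
n = 5_000
age = rng.integers(20, 90, n)                        # crude case-mix proxy
complication = rng.random(n) < 0.05                  # hospital-acquired condition flag
cost = 3_000 + 40 * age + 20_000 * complication + rng.normal(0, 2_000, n)

# Design matrix: intercept, case-mix covariate, complication indicator.
X = np.column_stack([np.ones(n), age, complication.astype(float)])
coef, *_ = np.linalg.lstsq(X, cost, rcond=None)

print(f"estimated marginal cost of the complication: AU${coef[2]:,.0f}")
```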

Relevance:

20.00%

Publisher:

Abstract:

Initial attempts to obtain lattice-based signatures were closely related to reducing a vector modulo the fundamental parallelepiped of a secret basis (as in GGH [9] or NTRUSign [12]). This approach leaked some information on the secret, namely the shape of the parallelepiped, which has been exploited in practical attacks [24]. NTRUSign was an extremely efficient scheme, so there has been noticeable interest in developing countermeasures to the attacks, but with little success [6]. In [8], Gentry, Peikert and Vaikuntanathan proposed a randomized version of Babai's nearest plane algorithm such that the distribution of a reduced vector modulo a secret parallelepiped depends only on the size of the basis used. Using this algorithm and generating large, close-to-uniform public keys, they managed to obtain provably secure GGH-like lattice-based signatures. Recently, Stehlé and Steinfeld obtained a provably secure scheme very close to NTRUSign [26] (from a theoretical point of view). In this paper we present an alternative approach to sealing the leak of NTRUSign. Instead of modifying the lattices and algorithms used, we compute a classic leaky NTRUSign signature and hide it with Gaussian noise using techniques present in Lyubashevsky's signatures. Our main contributions are thus a set of strong NTRUSign parameters, obtained by taking into account the latest known attacks against the scheme, and a statistical way to hide the leaky NTRU signature so that this particular instantiation of a CVP-based signature scheme becomes zero-knowledge and secure against forgeries, based on the worst-case hardness of the Õ(N^1.5)-Shortest Independent Vector Problem over NTRU lattices. Finally, we give a set of concrete parameters to gauge the efficiency of the obtained signature scheme.
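
The "hide it with Gaussian noise" step follows the rejection-sampling idea from Lyubashevsky's signatures: add a fresh Gaussian mask to the secret-dependent value and accept the result only with a probability chosen so that the output distribution is independent of the secret. The one-dimensional toy below illustrates the mechanism only; it uses none of the paper's actual parameters or lattice structure:

```python
import math
import random

def hide_with_gaussian_noise(v: float, sigma: float, M: float) -> float:
    """One-dimensional toy of Lyubashevsky-style rejection sampling.

    v is the secret-dependent value (the 'leaky' part of a signature).  The
    returned z is distributed as a centred Gaussian regardless of v, so it
    reveals essentially nothing about the secret.  M must be large enough that
    the acceptance ratio exceeds 1 only with negligible probability.
    """
    while True:
        y = random.gauss(0.0, sigma)          # fresh masking noise
        z = y + v                              # candidate; without rejection it leaks v
        # Accept with probability  rho_sigma(z) / (M * rho_sigma(z - v)).
        log_ratio = (-z * z + (z - v) * (z - v)) / (2.0 * sigma * sigma)
        if random.random() < min(1.0, math.exp(log_ratio) / M):
            return z

# The empirical mean stays near 0 whatever v is, unlike the unmasked y + v.
samples = [hide_with_gaussian_noise(v=5.0, sigma=50.0, M=1.7) for _ in range(20_000)]
print(sum(samples) / len(samples))
```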

Relevance:

20.00%

Publisher:

Abstract:

This article provides a general review of the literature on the nature and role of empathy in social interaction for information professionals working in a variety of information and knowledge environments. Relational agency theory (Edwards, 2005) is used as a framework to re-conceptualize education for empathic social interaction between information professionals and their clients. Past, present and future issues relevant to empathic interaction in information and knowledge management are discussed in the context of three shifts identified from the literature: (a) the continued increase in communication channels, both physical and virtual, for reference, information and research services; (b) the transition from the information age to the conceptual age; and (c) the growing need for understanding of the affective paradigm in the information and knowledge professions. Findings from the literature review on the relationships between empathy and information behavior, social networking, knowledge management and information and knowledge services are presented. Findings are discussed in relation to the development of guidelines for the affective education and training of information and knowledge professionals and the potential use of virtual learning software such as Second Life in developing empathic communication skills.

Relevance:

20.00%

Publisher:

Abstract:

The NLM stream cipher, designed by Hoon Jae Lee, Sang Min Sung and Hyeong Rag Kim, is a strengthened version of the LM summation generator that combines linear and non-linear feedback shift registers. In recent works, the NLM cipher has been used for message authentication in lightweight communication over wireless sensor networks and for RFID authentication protocols. This work analyses the security of the NLM stream cipher and of the NLM-MAC scheme that is built on top of the NLM cipher. We first show that the NLM cipher suffers from two major weaknesses that lead to key recovery and forgery attacks. We prove that the internal state of the NLM cipher can be recovered with time complexity of about n^(log 7) × 2, where the total length of the internal state is 2·n + 2 bits. The attack needs about n² key-stream bits. We also show that an adversary is able to forge any MAC tag very efficiently given only one (MAC tag, ciphertext) pair. The proposed attacks are practical and break the scheme with a negligible error probability.
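
For readers unfamiliar with the underlying design, the LM generator builds on the classic summation generator, which combines the outputs of feedback shift registers through integer addition with carry. The sketch below shows that generic construction with two small LFSRs; it is illustrative only and is not the actual NLM specification:

```python
def lfsr_stream(state, taps, nbits):
    """Fibonacci LFSR: yields one output bit per step (bit 0 is the output)."""
    while True:
        out = state & 1
        feedback = 0
        for t in taps:                            # XOR of the tapped bit positions
            feedback ^= (state >> t) & 1
        state = (state >> 1) | (feedback << (nbits - 1))
        yield out

def summation_generator(stream_a, stream_b):
    """Combine two bit streams via integer addition with carry (Rueppel style)."""
    carry = 0
    while True:
        a, b = next(stream_a), next(stream_b)
        yield a ^ b ^ carry                       # keystream bit = sum bit of a + b + carry
        carry = (a & b) | (carry & (a ^ b))       # carry out of the addition

reg_a = lfsr_stream(0b1011011, taps=(0, 1), nbits=7)   # feedback polynomial x^7 + x + 1
reg_b = lfsr_stream(0b10011, taps=(0, 2), nbits=5)     # feedback polynomial x^5 + x^2 + 1
keystream = summation_generator(reg_a, reg_b)
print([next(keystream) for _ in range(16)])
```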

Relevance:

20.00%

Publisher:

Abstract:

The DeLone and McLean (D&M) model (2003) has been broadly used and is generally recognised as a useful model for gauging the success of IS implementations. However, it is not without limitations. In this study, we evaluate a model that extends the D&M model and attempts to address some of its limitations by providing a more complete measurement model of systems success. To that end, we augment the D&M (2003) model with three variables: business value, institutional trust, and future readiness. We propose that the addition of these variables allows systems success to be assessed at both the systems level and the business level. Consequently, we develop a measurement model rather than a structural or predictive model of systems success.

Relevance:

20.00%

Publisher:

Abstract:

Introduction. Social media is becoming a vital source of information in disaster or emergency situations. While a growing number of studies have explored the use of social media in natural disasters by emergency staff, military personnel, media and other professionals, very few studies have investigated the use of social media by members of the public. The purpose of this paper is to explore citizens' information experiences in social media during times of natural disaster. Method. A qualitative research approach was applied. Data was collected via in-depth interviews. Twenty-five people who used social media during a natural disaster in Australia participated in the study. Analysis. Audio recordings of interviews and interview transcripts provided the empirical material for data analysis. Data was analysed using structural and focussed coding methods. Results. Eight key themes depicting various aspects of participants' information experience during a natural disaster were uncovered by the study: connected; wellbeing; coping; help; brokerage; journalism; supplementary; and characteristics. Conclusion. This study contributes insights into social media's potential for developing community disaster resilience and promotes discussion about the value of civic participation in social media when such circumstances occur. These findings also contribute to our understanding of information experience as a new object of informational research.

Relevance:

20.00%

Publisher:

Abstract:

An increasing number of people seek health advice on the web using search engines; this poses challenging problems for current search technologies. In this paper we report an initial study of the effectiveness of current search engines in retrieving relevant information for diagnostic medical circumlocutory queries, i.e., queries issued by people seeking information about their health condition using a description of the symptoms they observe (e.g. hives all over body) rather than the medical term (e.g. urticaria). Such queries frequently arise when people are unfamiliar with a domain or its language, and they are common among health information seekers attempting to self-diagnose or self-treat. Our analysis reveals that current search engines are not equipped to effectively satisfy such information needs; this can have potentially harmful outcomes for people's health. Our results advocate for more research into developing information retrieval methods to support such complex information needs.

Relevance:

20.00%

Publisher:

Abstract:

Over the last few years, investigations of human epigenetic profiles have identified the key elements of change to be histone modifications, stable and heritable DNA methylation, and chromatin remodeling. These factors determine gene expression levels and characterise conditions leading to disease. In order to extract information embedded in long DNA sequences, data mining and pattern recognition tools are widely used, but efforts to date have been limited with respect to analyzing epigenetic changes and their role as catalysts in disease onset. Useful insight, however, can be gained by investigating the associated dinucleotide distributions. The focus of this paper is to explore the frequencies of specific dinucleotides across defined regions within the human genome, and to identify new patterns between epigenetic mechanisms and DNA content. Signal processing methods, including Fourier and wavelet transforms, are employed and the principal results are reported.
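
A minimal sketch of the two ingredients mentioned above, dinucleotide frequency counting over a region and a Fourier view of a numeric encoding of the same sequence, is given below; the sequence is random and merely stands in for a genomic region of interest:

```python
import itertools
import random

import numpy as np

random.seed(1)
seq = "".join(random.choice("ACGT") for _ in range(10_000))

# Dinucleotide (2-mer) frequencies over all overlapping positions in the region.
pairs = ["".join(p) for p in itertools.product("ACGT", repeat=2)]
counts = {p: 0 for p in pairs}
for i in range(len(seq) - 1):
    counts[seq[i:i + 2]] += 1
total = len(seq) - 1
freqs = {p: counts[p] / total for p in pairs}
print("CpG (CG) frequency:", freqs["CG"])

# Spectral view: indicator track of C/G positions, then its magnitude spectrum.
gc_track = np.array([1.0 if base in "CG" else 0.0 for base in seq])
spectrum = np.abs(np.fft.rfft(gc_track - gc_track.mean()))
print("dominant period (bp):", len(gc_track) / (1 + np.argmax(spectrum[1:])))
```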