866 results for Information Security, Safe Behavior, Users’ behavior, Brazilian users, threats


Relevance:

100.00%

Publisher:

Abstract:

These proceedings focus on the various aspects of advances in future information communication technology and its applications, present the latest issues and progress in the area, and are applicable to both researchers and professionals. They are based on the 2013 International Conference on Future Information & Communication Engineering (ICFICE 2013), held in Shenyang, China, from June 24 to 26, 2013. The conference is open to participants from all over the world, and participation from the Asia-Pacific region is particularly encouraged. The focus of the conference is on all technical aspects of electronics, information, and communications. ICFICE-13 will provide an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of FICE. In addition, the conference will publish high-quality papers closely related to the various theories and practical applications in FICE. Furthermore, we expect that the conference and its publications will be a trigger for further related research and technology improvements in this important subject. "This work was supported by the NIPA (National IT Industry Promotion Agency) of Korea Grant funded by the Korean Government (Ministry of Science, ICT & Future Planning)."

Relevance:

100.00%

Publisher:

Abstract:

Advances in Information and Communication Technologies have the potential to improve many facets of modern healthcare service delivery. The implementation of electronic health records systems is a critical part of an eHealth system. Despite the potential gains, there are several obstacles that limit the wider development of electronic health record systems. Among these are the perceived threats to the security and privacy of patients’ health data, and a widely held belief that these cannot be adequately addressed. We hypothesise that the major concerns regarding eHealth security and privacy cannot be overcome through the implementation of technology alone. Human dimensions must be considered when analysing the provision of the three fundamental information security goals: confidentiality, integrity and availability. A sociotechnical analysis to establish the information security and privacy requirements when designing and developing a given eHealth system is important and timely. A framework that accommodates consideration of the legislative requirements and human perspectives in addition to the technological measures is useful in developing a measurable and accountable eHealth system. Successful implementation of this approach would enable the possibilities, practicalities and sustainabilities of proposed eHealth systems to be realised.

Relevance:

100.00%

Publisher:

Abstract:

Recent road safety statistics show that the decades-long downward trend in fatalities is stopping and stagnating. Statistics further show that crashes are mostly driven by human error, compared with other factors such as environmental conditions and mechanical defects. Within human error, the dominant source is perceptive error, which represents about 50% of the total. The next two sources, interpretation and evaluation, together with perception account for more than 75% of human-error-related crashes. These statistics show that helping drivers perceive and understand their environment better, or supplementing them when they are clearly at fault, is a path to sound assessment of road risk and, as a consequence, to further decreasing fatalities. To address this problem, currently deployed driving assistance systems combine more and more information from diverse sources (sensors) to enhance the driver's perception of the environment. However, because of inherent limitations in range and field of view, these systems' perception remains largely limited to a small interest zone around a single vehicle. Such limitations can be overcome by enlarging the interest zone through a cooperative process. Cooperative Systems (CS), a specific subset of Intelligent Transportation Systems (ITS), aim to compensate for local systems' limitations by combining embedded information technology and intervehicular communication technology (IVC). With CS, information sources are no longer limited to a single vehicle. From this distribution arises the concept of extended, or augmented, perception. Augmented perception extends an actor's perceptive horizon beyond its "natural" limits by fusing information not only from multiple in-vehicle sensors but also from remote sensors. The end result of an augmented perception and data fusion chain is known as an augmented map: a repository where any relevant information about objects in the environment, and the environment itself, can be stored in a layered architecture. This thesis aims to demonstrate that augmented perception performs better than non-cooperative approaches and that it can be used to successfully identify road risk. We found it necessary to evaluate the performance of augmented perception in order to better understand its limitations. Indeed, while many promising results have already been obtained, the feasibility of building an augmented map from exchanged local perception information, and then using this information beneficially for road users, has not yet been thoroughly assessed, nor have the limitations of augmented perception and its underlying technologies. Most notably, many questions remain unanswered as to IVC performance and its ability to deliver an appropriate quality of service to support life-saving critical systems. This is especially true as the road environment is a complex, highly variable setting with many sources of imperfection and error, not limited to IVC. We first provide a discussion of these limitations and a performance model built to incorporate them, created from empirical data collected on test tracks. Our results are more pessimistic than the existing literature, suggesting that IVC limitations have been underestimated. We then develop a new CS-applications simulation architecture.
This architecture is used to obtain new results on the safety benefits of a cooperative safety application (EEBL), and then to support further study of augmented perception. We first confirm earlier results on the reduction in the number of crashes, but raise doubts about the benefits in terms of crash severity. In the next step, we implement an augmented perception architecture tasked with creating an augmented map. Our aim is a generalist architecture that can use many different types of sensors to create the map and is not limited to any specific application. The data association problem is tackled with a multiple hypothesis tracking (MHT) approach based on belief theory. Augmented and single-vehicle perception are then compared in a reference driving scenario for risk assessment, taking into account the IVC limitations obtained earlier, and we show their impact on the augmented map's performance. Our results show that augmented perception performs better than non-cooperative approaches, almost tripling the advance warning time before a crash. IVC limitations appear to have no significant effect on this performance, although this may hold only for our specific scenario. Finally, we propose a new approach that uses augmented perception to identify road risk through a surrogate: near-miss events. A CS-based approach is designed and validated to detect near-miss events, and is then compared with a non-cooperative approach based on vehicles equipped with local sensors only. The cooperative approach shows a significant improvement in the number of events that can be detected, especially at higher rates of system deployment.
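The augmented map described above can be pictured, under stated assumptions, as a layered store that keeps the freshest known observation of each object regardless of whether it came from a local sensor or over IVC. The Python sketch below is illustrative only: the Observation fields, the single dynamic-objects layer, and the perceptive_horizon measure are assumptions made for exposition, not the thesis's actual data model or fusion chain.

```python
# A minimal sketch (assumed structure, not the thesis implementation) of an
# "augmented map": a layered repository that fuses object observations from
# local in-vehicle sensors with observations received over IVC.

from dataclasses import dataclass, field


@dataclass
class Observation:
    object_id: str      # identifier agreed on by the data-association step
    position: tuple     # (x, y) in a common reference frame, metres
    speed: float        # m/s
    source: str         # "local" or "ivc"
    timestamp: float    # seconds


@dataclass
class AugmentedMap:
    # One layer per information type; here a single "dynamic objects" layer.
    layers: dict = field(default_factory=lambda: {"dynamic_objects": {}})

    def insert(self, obs: Observation) -> None:
        """Keep the freshest observation for each object, whatever its source."""
        layer = self.layers["dynamic_objects"]
        current = layer.get(obs.object_id)
        if current is None or obs.timestamp > current.timestamp:
            layer[obs.object_id] = obs

    def perceptive_horizon(self, ego_position: tuple) -> float:
        """Distance to the farthest known object: grows as IVC data is fused."""
        objects = self.layers["dynamic_objects"].values()
        if not objects:
            return 0.0
        ex, ey = ego_position
        return max(((o.position[0] - ex) ** 2 + (o.position[1] - ey) ** 2) ** 0.5
                   for o in objects)


# Usage: a remote vehicle reported over IVC extends the horizon well beyond
# what the local sensor alone could see.
amap = AugmentedMap()
amap.insert(Observation("car-12", (60.0, 0.0), 22.0, "local", 10.0))
amap.insert(Observation("car-47", (240.0, 3.5), 25.0, "ivc", 10.1))
print(amap.perceptive_horizon((0.0, 0.0)))  # ~240 m instead of 60 m
```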

Relevance:

100.00%

Publisher:

Abstract:

Over the last decade, the majority of existing search techniques have been either keyword-based or category-based, resulting in unsatisfactory effectiveness. Meanwhile, studies have shown that more than 80% of users prefer personalized search results. As a result, many studies have devoted a great deal of effort (referred to as collaborative filtering) to investigating personalized notions for enhancing retrieval performance. One of the fundamental yet most challenging steps is to capture precise user information needs. Most Web users are inexperienced or lack the capability to express their needs properly, whereas existing retrieval systems are highly sensitive to vocabulary. Researchers have increasingly proposed the use of ontology-based techniques to improve current mining approaches. These techniques are able not only to refine search intentions within specific generic domains, but also to access new knowledge by tracking semantic relations. In recent years, some researchers have attempted to build ontological user profiles from discovered user background knowledge. The knowledge is drawn from both global and local analyses, which aim to produce tailored ontologies from a group of concepts. However, a key problem that has not been addressed is how to accurately match diverse local information to universal global knowledge. This research conducts a theoretical study on the use of personalized ontologies to enhance text mining performance. The objective is to understand user information needs as a "bag of concepts" rather than a bag of words. The concepts are gathered from a general world knowledge base, the Library of Congress Subject Headings. To return desirable search results, a novel ontology-based mining approach is introduced to discover accurate search intentions and learn personalized ontologies as user profiles. The approach can not only pinpoint users' individual intentions within a rough hierarchical structure, but can also interpret their needs through a set of acknowledged concepts. Alongside the global and local analyses, a solid concept-matching approach is carried out to address the mismatch between local information and world knowledge. Relevance features produced by the Relevance Feature Discovery model are chosen as representatives of local information. These features have been proven the best alternative to user queries for avoiding ambiguity, and consistently outperform the features extracted by other filtering models. The two proposed approaches are both evaluated in a scientific evaluation with the standard Reuters Corpus Volume 1 testing set. A comprehensive comparison is made with a number of state-of-the-art baseline models, including TF-IDF, Rocchio, Okapi BM25, the deployed Pattern Taxonomy Model, and an ontology-based model. The results indicate that top precision can be improved remarkably with the proposed ontology mining approach, and that the matching approach is successful, achieving significant improvements on most information filtering measures. This research contributes to the fields of ontological filtering, user profiling, and knowledge representation. The related outputs are critical when systems are expected to return proper mining results and provide personalized services. The scientific findings have the potential to facilitate the design of advanced preference mining models, which impact people's daily lives.
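As a rough illustration of the "bag of concepts" idea described above, the sketch below scores documents against a user profile expressed as weighted subject concepts instead of raw words. The concept names, weights, and term-to-concept mapping are invented placeholders, not drawn from the thesis or from the Library of Congress Subject Headings.

```python
# A minimal, illustrative sketch of scoring documents against a user profile
# represented as a weighted "bag of concepts" rather than a bag of words.

# Personalized ontology fragment: concept -> interest weight for one user.
user_profile = {
    "Computer security": 0.9,
    "Data protection": 0.7,
    "Privacy": 0.5,
}

# A toy mapping from surface terms to subject concepts (stands in for the
# global knowledge base / local-to-global matching step).
term_to_concept = {
    "encryption": "Computer security",
    "firewall": "Computer security",
    "gdpr": "Data protection",
    "anonymity": "Privacy",
}


def concept_score(document_terms, profile, mapping):
    """Sum the profile weights of concepts that the document's terms map to."""
    matched = {mapping[t] for t in document_terms if t in mapping}
    return sum(profile.get(c, 0.0) for c in matched)


doc_a = ["encryption", "firewall", "latency"]
doc_b = ["anonymity", "gdpr"]
print(concept_score(doc_a, user_profile, term_to_concept))  # 0.9
print(concept_score(doc_b, user_profile, term_to_concept))  # 1.2
```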

Relevance:

100.00%

Publisher:

Abstract:

A fundamental part of many authentication protocols which authenticate a party to a human involves the human recognizing or otherwise processing a message received from that party. Examples include typical implementations of Verified by Visa, in which a message previously stored by the human at a bank is sent back by the bank to authenticate itself to the human, or the expectation that humans will recognize or verify an extended validation certificate in an HTTPS context. This paper presents general definitions and building blocks for the modelling and analysis of human recognition in authentication protocols, allowing the creation of proofs for protocols which include humans. We cover both generalized trawling attacks and human-specific targeted attacks. As examples of the range of uses of our construction, we use the model presented in this paper to prove the security of a mutual authentication login protocol and a human-assisted device pairing protocol.
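A minimal sketch of the recognition step this paper formalizes, assuming a Verified-by-Visa-style flow: the server first replays a phrase the human registered earlier, and the human only releases a credential after recognizing it. The phrase, password and checks below are hypothetical placeholders, not the paper's formal model or proofs.

```python
# A minimal sketch (assumed flow, not the paper's model) of the "recognize a
# stored message" step: the server proves itself to the human by replaying a
# secret phrase registered at enrolment, before the human reveals a credential.

import hmac
import hashlib

# Registered during enrolment (server side); values are hypothetical.
STORED_PHRASE = "green teapot on tuesday"
PASSWORD_HASH = hashlib.sha256(b"correct horse battery staple").hexdigest()


def server_shows_phrase() -> str:
    """Server -> human: the message the human is expected to recognize."""
    return STORED_PHRASE


def human_recognizes(shown: str, remembered: str) -> bool:
    """Human-side check; this recognition step is what the paper models."""
    return shown == remembered


def server_verifies(password: str) -> bool:
    """Server-side credential check once the human has authenticated the server."""
    candidate = hashlib.sha256(password.encode()).hexdigest()
    return hmac.compare_digest(candidate, PASSWORD_HASH)


# Protocol run: the human only sends the password after recognition succeeds.
if human_recognizes(server_shows_phrase(), "green teapot on tuesday"):
    print(server_verifies("correct horse battery staple"))  # True
```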

Relevance:

100.00%

Publisher:

Abstract:

For the past several decades, cryptographers have consistently provided us with stronger and more capable primitives and protocols that have found many applications in security systems in everyday life. One of the central tenets of cryptographic design is that, whereas a system’s architecture ought to be public and open to scrutiny, the keys on which it depends — long, utterly random, unique strings of bits — will be perfectly preserved by their owner, and yet nominally inaccessible to foes.

Relevance:

100.00%

Publisher:

Abstract:

Rakaposhi is a synchronous stream cipher that uses three main components: a non-linear feedback shift register (NLFSR), a dynamic linear feedback shift register (DLFSR) and a non-linear filtering function (NLF). The NLFSR consists of 128 bits and is initialised by the secret key K. The DLFSR holds 192 bits and is initialised by an initial vector (IV). The NLF takes 8-bit inputs and returns a single output bit. This work identifies weaknesses and properties of the cipher. The main observation is that the initialisation procedure has the so-called sliding property, which can be used to launch distinguishing and key-recovery attacks. The distinguisher needs four observations of related (K, IV) pairs. The key-recovery algorithm can discover the secret key K after observing 2^9 pairs of (K, IV). Based on the proposed related-key attack, the number of related (K, IV) pairs is 2^((128+192)/4). The cipher is further studied when the registers enter short cycles. When the NLFSR is set to all ones, the cipher degenerates to a linear feedback shift register with a non-linear filter; consequently, the initial state (and the secret key and IV) can be recovered with complexity 2^63.87. If the DLFSR is set to all zeros, the NLF reduces to a low-non-linearity filter function. As a result, the cipher is insecure, allowing an adversary to distinguish it from a random cipher after 2^17 observations of keystream bits. There is also a key-recovery algorithm that finds the secret key with complexity 2^54.
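The general register-plus-filter structure described above can be sketched as follows. The feedback function, filter function, tap positions and initialisation in this sketch are placeholders chosen for illustration; they are not Rakaposhi's actual functions, and the DLFSR is omitted entirely.

```python
# A minimal sketch of the generic NLFSR-plus-filter structure of a stream
# cipher. The feedback and filter functions below are placeholders; they are
# NOT Rakaposhi's actual functions, taps, or initialisation procedure.

def nlfsr_step(state):
    """128-bit register: shift left and feed back a (toy) non-linear bit."""
    fb = state[0] ^ (state[3] & state[17]) ^ state[45]   # placeholder taps
    return state[1:] + [fb]


def nlf(bits8):
    """Toy 8-bit -> 1-bit non-linear filter (placeholder)."""
    return (bits8[0] & bits8[1]) ^ bits8[2] ^ (bits8[5] & bits8[7])


def keystream(state_bits, n):
    """Generate n keystream bits from an (already initialised) 128-bit state."""
    state = list(state_bits)
    assert len(state) == 128
    out = []
    for _ in range(n):
        out.append(nlf([state[i] for i in (0, 7, 19, 40, 61, 80, 101, 127)]))
        state = nlfsr_step(state)
    return out


print(keystream([0, 1] * 64, 16))  # 16 keystream bits from a toy state
```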

Relevance:

100.00%

Publisher:

Abstract:

Early works on Private Information Retrieval (PIR) focused on minimizing the necessary communication overhead. They seemed to achieve this goal, but at the expense of query response time. To mitigate this weakness, protocols with secure coprocessors were introduced. They achieve optimal communication complexity and better online processing complexity. Unfortunately, all secure coprocessor-based PIR protocols require heavy periodic preprocessing. In this paper, we propose a new protocol, which is free from periodic preprocessing while offering optimal communication complexity and almost optimal online processing complexity. The proposed protocol is proven to be secure.
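For readers unfamiliar with PIR, the classic two-server XOR scheme below illustrates the basic idea of retrieving a record without revealing its index to either (non-colluding) server; it is included only as background and is not the coprocessor-based protocol proposed in this paper.

```python
# A minimal sketch of the classic two-server XOR-based PIR scheme, included
# only to illustrate the communication/processing trade-off discussed above;
# it is NOT the secure coprocessor-based protocol proposed in the paper.

import secrets

DATABASE = [0, 1, 1, 0, 1, 0, 0, 1]  # n one-bit records held by both servers


def client_queries(n, target_index):
    """Split the query: a random subset S and S with the target index flipped."""
    q1 = [secrets.randbelow(2) for _ in range(n)]
    q2 = q1.copy()
    q2[target_index] ^= 1
    return q1, q2


def server_answer(db, query):
    """Each server returns the XOR of the records selected by its query."""
    acc = 0
    for bit, selected in zip(db, query):
        if selected:
            acc ^= bit
    return acc


q1, q2 = client_queries(len(DATABASE), target_index=5)
a1 = server_answer(DATABASE, q1)
a2 = server_answer(DATABASE, q2)
# XOR of the two answers recovers the record; each query alone looks random.
print(a1 ^ a2 == DATABASE[5])  # True
```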

Relevance:

100.00%

Publisher:

Abstract:

This paper provides a detailed description of the current Australian e-passport implementation and presents a formal verification using the model-checking tools CASPER/CSP/FDR. We highlight security issues present in the current e-passport implementation and identify new threats that arise when an e-passport system is integrated with automated processing systems such as SmartGate. Because the current e-passport specification does not provide adequate security goals, we identify and describe a set of security goals for the evaluation of e-passport protocols in order to perform a rational security analysis. Our analysis confirms existing security issues that were previously informally identified and presents weaknesses that exist in the current e-passport implementation.

Relevance:

100.00%

Publisher:

Abstract:

This thesis investigates the use of building information models for access control and security applications in critical infrastructures and complex building environments. It examines current problems in security management for physical and logical access control and proposes novel solutions that exploit the detailed information available in building information models. The project was carried out as part of the Airports of the Future Project, and the research was modelled on real-world problems identified in collaboration with our industry partners in the project.

Relevance:

100.00%

Publisher:

Abstract:

Distributed Network Protocol Version 3 (DNP3) is the de facto communication protocol for power grids. Standards-based interoperability among devices has made the protocol useful to other infrastructures such as water, sewage, oil and gas. DNP3 is designed to facilitate interaction between master stations and outstations. In this paper, we apply a formal modelling methodology called Coloured Petri Nets (CPN) to create an executable model representation of the DNP3 protocol. The model facilitates analysis of the protocol to ensure that it behaves as expected. We also illustrate how to verify and validate the behaviour of the protocol, using the CPN model and the corresponding state space tool to determine whether there are insecure states. With this approach, we were able to identify a Denial of Service (DoS) attack against the DNP3 protocol.
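The state-space analysis idea can be illustrated in miniature by exhaustively exploring a toy transition system and flagging states deemed insecure. The states, transitions and insecure set below are invented for exposition and do not correspond to the paper's CPN model of DNP3.

```python
# A minimal sketch of the state-space idea behind the CPN analysis: enumerate
# every reachable protocol state and flag insecure ones. The toy transition
# system below stands in for a master/outstation exchange; it is not the
# paper's CPN model.

from collections import deque

TRANSITIONS = {
    "idle":                ["request_sent"],
    "request_sent":        ["response_ok", "response_dropped"],
    "response_ok":         ["idle"],
    "response_dropped":    ["request_sent", "unconfirmed_command"],
    "unconfirmed_command": [],
}

INSECURE = {"unconfirmed_command"}   # e.g. a state an attacker can force


def reachable_insecure(start):
    """Breadth-first search over the state space, collecting insecure states."""
    seen, frontier, found = {start}, deque([start]), set()
    while frontier:
        state = frontier.popleft()
        if state in INSECURE:
            found.add(state)
        for nxt in TRANSITIONS.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return found


print(reachable_insecure("idle"))  # {'unconfirmed_command'}
```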

Relevance:

100.00%

Publisher:

Abstract:

Vehicular Ad-hoc Networks (VANETs) can make roads safer, cleaner, and smarter. They can offer a wide range of services, both safety- and non-safety-related. Many safety-related VANET applications are real-time and mission-critical and require strict guarantees of security and reliability. Even non-safety-related multimedia applications, which will play an important role in the future, will require security support. Lack of such security and privacy in VANETs is one of the key hindrances to their widespread deployment. An insecure and unreliable VANET can be more dangerous than a system without VANET support, so it is essential to make sure that "life-critical safety" information is secure enough to rely on. Securing VANETs while appropriately protecting the privacy of drivers and vehicle owners is a very challenging task. In this work we summarize the attacks on VANETs and the corresponding security requirements and challenges. We also present the most popular generic security policies, which are based on prevention as well as detection methods. Many VANET applications require system-wide security support rather than security at individual layers of the VANET protocol stack, so we review existing work from the perspective of a holistic approach to security. Finally, we provide some possible future directions for achieving system-wide as well as privacy-friendly security in VANETs.

Relevance:

100.00%

Publisher:

Abstract:

Wireless Sensor Networks (WSNs) are employed in numerous applications in areas including the military, ecology, and health, for example to monitor sensitive information such as personnel positions in a building; as a result, WSNs need security. However, several restrictions, such as low computational capability, small memory, limited energy resources, and the unreliable channels over which WSNs communicate, make it difficult to apply security and protection mechanisms in WSNs. It is essential to protect WSNs from malicious attacks in hostile environments. Designing a security plan for such networks, given their various resource limitations and the distinctive characteristics of wireless sensor networks, is a considerable challenge. This article is an extensive review of WSN security problems recently examined by researchers, and provides a better understanding of future directions for WSN security.

Relevance:

100.00%

Publisher:

Abstract:

Information security and privacy in the healthcare domain is a complex and challenging problem for computer scientists, social scientists, law experts and policy makers. Appropriate healthcare provision requires specialized knowledge, is information intensive and much patient information is of a particularly sensitive nature. Electronic health record systems provide opportunities for information sharing which may enhance healthcare services, for both individuals and populations. However, appropriate information management measures are essential for privacy preservation...

Relevance:

100.00%

Publisher:

Abstract:

We have previously suggested that three proof requirements are essential for a sustainable land registration system: proof of identity, proof of ownership and authority to deal. Our attention in this article is drawn to the security framework that surrounds these requirements. We ask whether the introduction of the Property Exchange of Australia (PEXA) and its underpinning regulatory regime will meet the concerns that we have in relation to them. In drawing out some problems with PEXA, we then offer an innovative idea, sourced from the transfer of equities, that could serve to generate discussion on how we can ensure the Torrens system of land registration is sustainable for another 160 years. We also canvass some more incremental suggestions that evolve out of what we currently do, as well as outlining some comparative, externally sourced ideas as to how the transfer and ownership of land can be made safer for all citizens. Such a goal is imperative when land transfer and secure property ownership are critical components of the economic infrastructure of a modern society.