812 results for Information privacy and security
Abstract:
Information privacy requirements of patients and the information requirements of healthcare providers (HCPs) are competing concerns. Reaching a balance between these requirements has proven difficult but is crucial to the success of eHealth systems. Traditional approaches to information management have been preventive measures which either allow or deny access to information. We believe this approach is inappropriate for a domain such as healthcare. We contend that introducing information accountability (IA) to eHealth systems can achieve the aforementioned balance without the need for rigid information control. IA is a fairly new concept in computer science; hence, there are no unambiguously accepted principles as yet, but the concept delivers promising advantages for information management in a robust manner. Accountable-eHealth (AeH) systems are eHealth systems which use IA principles as the measure for privacy and information management. AeH systems face three main impediments: technological, social and ethical, and legal. In this paper, we present the AeH model and focus on the legal aspects of AeH systems in Australia. We investigate current legislation available in Australia regarding health information management and identify future legal requirements if AeH systems are to be implemented in Australia.
Abstract:
Mandatory data breach notification laws are a novel statutory solution in relation to organizational protections of personal information. They require organizations which have suffered a breach of security involving personal information to notify those persons whose information may have been affected. These laws originated in the state-based legislatures of the United States during the last decade and have subsequently garnered worldwide legislative interest. Despite their perceived utility, mandatory data breach notification laws have several conceptual and practical concerns that limit the scope of their applicability, particularly in relation to existing information privacy law regimes. We outline these concerns, and in doing so, we contend that while mandatory data breach notification laws have many useful facets, their utility as an 'add-on' to enhance the failings of current information privacy law frameworks should not necessarily be taken for granted.
Abstract:
Background: Enabling women to make informed decisions is a crucial component of consumer-focused maternity care. Current evidence suggests that health care practitioners’ communication of care options may not facilitate patient involvement in decision-making. The aim of this study was to investigate the effect of specific variations in health caregiver communication on women’s preferences for induction of labor for prolonged pregnancy. Methods: A convenience sample of 595 female participants read a hypothetical scenario in which an obstetrician discusses induction of labor with a pregnant woman. Information provided on induction and the degree of encouragement for the woman’s involvement in decision-making was manipulated to create four experimental conditions. Participants indicated preference with respect to induction, their perceptions of the quality of information received, and other potential moderating factors. Results: Participants who received information that was directive in favor of medical intervention were significantly more likely to prefer induction than those given nondirective information. No effect of level of involvement in decision-making was found. Participants’ general trust in doctors moderated the relationship between health caregiver communication and preferences for induction, such that the influence of information provided on preferences for induction differed across levels of involvement in decision-making for women with a low trust in doctors, but not for those with high trust. Many women were not aware of the level of information required to make an informed decision. Conclusions: Our findings highlight the potential value of strategies such as patient decision aids and health care professional education to improve the quality of information available to women and their capacity for informed decision-making during pregnancy and birth. (BIRTH 39:3 September 2012)
Abstract:
Chatrooms, for example Internet Relay Chat, are generally multi-user, multi-channel and multi-server chat systems which run over the Internet and provide a protocol for real-time text-based conferencing between users all over the world. While a well-trained human observer is able to understand who is chatting with whom, there are no efficient and accurate automated tools to determine the groups of users conversing with each other. A precursor to analysing evolving cyber-social phenomena is to first determine what the conversations are and which groups of chatters are involved in each conversation. We consider this problem in this paper. We propose an algorithm to discover all groups of users that are engaged in conversation. Our algorithm is based on a statistical model of a chatroom that is founded on our experience with real chatrooms. Our approach does not require any semantic analysis of the conversations; rather, it is based purely on the statistical information contained in the sequence of posts. We improve accuracy by applying graph algorithms to clean the statistical information. We present experimental results which indicate that one can automatically determine the conversing groups in a chatroom purely on the basis of statistical analysis.
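The abstract's idea of grouping chatters without semantic analysis can be illustrated with a deliberately simplified sketch (not the authors' actual algorithm): link two users whenever their posts repeatedly fall within a short time window, then take connected components of the resulting graph as conversing groups. The `window` and `min_count` parameters are illustrative assumptions.

```python
from collections import defaultdict
from itertools import combinations

def conversing_groups(posts, window=30.0, min_count=2):
    """Infer conversing groups from a sequence of (user, timestamp) posts.

    Hypothetical simplification of the statistical approach: two users are
    linked when their posts fall within `window` seconds of each other at
    least `min_count` times; connected components of the resulting graph
    are taken as conversations.  No message content (semantics) is used.
    """
    # Count temporal co-occurrences between pairs of users.
    pair_counts = defaultdict(int)
    for (u1, t1), (u2, t2) in combinations(posts, 2):
        if u1 != u2 and abs(t1 - t2) <= window:
            pair_counts[frozenset((u1, u2))] += 1

    # Keep only pairs that co-occur often enough (the "cleaning" step).
    adj = defaultdict(set)
    for pair, n in pair_counts.items():
        if n >= min_count:
            a, b = tuple(pair)
            adj[a].add(b)
            adj[b].add(a)

    # Connected components = inferred conversing groups.
    seen, groups = set(), []
    for user in adj:
        if user in seen:
            continue
        stack, comp = [user], set()
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(adj[u] - comp)
        seen |= comp
        groups.append(comp)
    return groups
```

With two interleaved pairs of chatters posting in disjoint time ranges, the sketch separates them into two groups purely from post timing.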
Abstract:
The growth of online services, such as eBanks and WebMails, in which users are verified by a username and password, is increasingly exploited by identity theft procedures. Identity theft is a fraud in which someone pretends to be someone else in order to steal money or obtain other benefits. To overcome the problem of identity theft, an additional security layer is required. Within the last decades, verifying users at login based on their keystroke dynamics has been proposed. Thus, an imposter has to be able to type in a similar way to the real user, in addition to having the username and password. However, verifying users upon login is not enough, since a logged-in station or mobile device is vulnerable to imposters when the user leaves her machine. Thus, continuous verification of users based on their activities is required. Over the last decade there has been growing interest in and use of biometric tools; however, these are often costly and require additional hardware. Behavioral biometrics, in which users are verified based on their keyboard and mouse activities, potentially presents a good solution. In this paper we discuss the problem of identity theft and propose behavioral biometrics as a solution. We survey existing studies, list the challenges, and propose solutions.
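A minimal sketch of the keystroke-dynamics idea, under strong simplifying assumptions: enroll a user from inter-key timing intervals, then verify a new sample with a z-score test on its mean interval. Real systems use richer features (dwell and flight times per key pair) and classifiers; the threshold and feature here are illustrative only.

```python
import statistics

def enroll(timings):
    """Build a toy keystroke-dynamics profile (mean and standard
    deviation of inter-key intervals, in seconds) from enrollment data.
    Illustrative sketch, not the method of any particular study."""
    return statistics.mean(timings), statistics.stdev(timings)

def verify(profile, timings, threshold=2.0):
    """Accept the user if the sample's mean interval lies within
    `threshold` standard deviations of the enrolled mean."""
    mu, sigma = profile
    z = abs(statistics.mean(timings) - mu) / sigma
    return z <= threshold
```

Run continuously over a sliding window of recent keystrokes, such a check could flag an imposter after the legitimate user walks away, which is the continuous-verification scenario the abstract motivates.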
Abstract:
This work identifies the limitations of n-way data analysis techniques in multidimensional stream data, such as Internet chat room communications data, and establishes a link between data collection and the performance of these techniques. Its contributions are twofold. First, it extends data analysis to multiple dimensions by constructing n-way data arrays known as high-order tensors. Chat room tensors are generated by a simulator which collects and models actual communication data. The accuracy of the model is determined by the Kolmogorov-Smirnov goodness-of-fit test, which compares the simulation data with the observed (real) data. Second, a detailed computational comparison is performed to test several data analysis techniques, including SVD [1], and multi-way techniques including Tucker1, Tucker3 [2], and Parafac [3].
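The Kolmogorov-Smirnov test the abstract relies on reduces to comparing empirical CDFs. A self-contained sketch of the two-sample statistic (the distance only, without the p-value the full test also provides):

```python
def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    difference between the empirical CDFs of the two samples.  The
    abstract uses this test to compare simulated chat room data with
    observed (real) data; small values indicate a good fit."""
    a, b = sorted(sample_a), sorted(sample_b)
    points = sorted(set(a) | set(b))
    d = 0.0
    for x in points:
        cdf_a = sum(v <= x for v in a) / len(a)
        cdf_b = sum(v <= x for v in b) / len(b)
        d = max(d, abs(cdf_a - cdf_b))
    return d
```

Identical samples give a statistic of 0 and fully disjoint samples give 1; in practice one would use a library routine such as SciPy's `ks_2samp`, which also returns the significance level.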
Abstract:
This work investigates the accuracy and efficiency tradeoffs between centralized and collective (distributed) algorithms for (i) sampling, and (ii) n-way data analysis techniques in multidimensional stream data, such as Internet chatroom communications. Its contributions are threefold. First, we use the Kolmogorov-Smirnov goodness-of-fit test to show that statistical differences between real data obtained by collective sampling in the time dimension from multiple servers and that obtained from a single server are insignificant. Second, we show using the real data that collective data analysis of 3-way data arrays (users x keywords x time), known as high-order tensors, is more efficient than centralized algorithms with respect to both space and computational cost. Furthermore, we show that this gain is obtained without loss of accuracy. Third, we examine the sensitivity of collective construction and analysis of high-order data tensors to the choice of server selection and sampling window size. We construct 4-way tensors (users x keywords x time x servers) and analyze them to show the impact of server and window size selections on the results.
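The 3-way arrays the abstract analyses can be pictured as counts indexed by (user, keyword, time-slot). A hedged sketch of how such a sparse tensor might be assembled from raw posts, with the slot width as an assumed parameter; the Tucker/Parafac decompositions themselves would then be run on this array with a multi-way analysis library.

```python
from collections import defaultdict

def build_chat_tensor(posts, window=60.0):
    """Build a sparse 3-way tensor (user x keyword x time-slot) from
    chat posts given as (user, text, timestamp) triples.  Time is
    discretised into slots of `window` seconds; each entry counts how
    often a user posted a keyword in a slot.  Field names and the
    whitespace tokeniser are illustrative assumptions."""
    tensor = defaultdict(int)
    for user, text, t in posts:
        slot = int(t // window)
        for word in text.lower().split():
            tensor[(user, word, slot)] += 1
    return dict(tensor)
```

Adding a fourth index for the originating server yields the 4-way (users x keywords x time x servers) tensors used to study server and window-size selection.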
Abstract:
Secure communications in distributed Wireless Sensor Networks (WSN) operating under adversarial conditions necessitate efficient key management schemes. In the absence of a priori knowledge of the post-deployment network configuration, and due to limited resources at sensor nodes, key management schemes cannot be based on post-deployment computations. Instead, a list of keys, called a key-chain, is distributed to each sensor node before deployment. For secure communication, either two nodes should have a key in common in their key-chains, or they should establish a key through a secure path on which every link is secured with a key. We first provide a comparative survey of well-known key management solutions for WSN. Probabilistic, deterministic and hybrid key management solutions are presented, and they are compared based on their security properties and resource usage. We provide a taxonomy of solutions, and identify trade-offs in them to conclude that there is no one-size-fits-all solution. Second, we design and analyze deterministic and hybrid techniques to distribute pair-wise keys to sensor nodes before deployment. We present novel deterministic and hybrid approaches based on combinatorial design theory and graph theory for deciding how many and which keys to assign to each key-chain before the sensor network deployment. Performance and security of the proposed schemes are studied both analytically and computationally. Third, we address the key establishment problem in WSN, which requires that key agreement algorithms without authentication be executed over a secure path. The length of the secure path impacts the power consumption and the initialization delay for a WSN before it becomes operational. We formulate the key establishment problem as a constrained bi-objective optimization problem, break it into two sub-problems, and show that they are both NP-Hard and MAX-SNP-Hard. Having established these inapproximability results, we focus on addressing the authentication problem that prevents key agreement algorithms from being used directly over a wireless link. We present a fully distributed algorithm where each pair of nodes can establish a key with authentication by using their neighbors as witnesses.
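The key-chain idea can be made concrete with a sketch of the probabilistic predistribution family the survey covers (the thesis's own deterministic schemes use combinatorial designs instead): each node draws a random key-chain from a common pool before deployment, and two nodes can talk directly only if their chains intersect.

```python
import random

def assign_key_chains(pool_size, chain_size, num_nodes, seed=0):
    """Draw a random key-chain for each node from a common key pool
    before deployment (probabilistic predistribution).  Keys are
    represented by their pool indices; parameters are illustrative."""
    rng = random.Random(seed)
    pool = range(pool_size)
    return [frozenset(rng.sample(pool, chain_size)) for _ in range(num_nodes)]

def shared_key(chain_a, chain_b):
    """Two nodes can secure a link directly iff their key-chains
    intersect; otherwise a key must be established over a secure path.
    Returns one common key index, or None if there is none."""
    common = chain_a & chain_b
    return min(common) if common else None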
Abstract:
This report looks at opportunities in relation to what is either already available or starting to take off in Information and Communication Technology (ICT). ICT focuses on the entire system of information, communication, processes and knowledge within an organisation, and on how technology can be implemented to serve the information and communication needs of people and organisations. An ICT system involves a combination of work practices, information, people and a range of technologies and applications organised to make the business or organisation fully functional and efficient, and to accomplish goals in an organisation. Our focus is on vocational, work-based education in New Zealand. It is not about eLearning, although we briefly touch on the topic. We provide a background on vocational education in New Zealand, cover what we consider to be key trends impacting work-based, vocational education and training (VET), and offer practical suggestions for leveraging better value from ICT initiatives across the main activities of an Industry Training Organisation (ITO). We use a learning value chain approach to demonstrate the main functions ITOs engage in and also use this approach as the basis for developing and prioritising an ICT strategy. Much of what we consider in this report is applicable to the wider tertiary education sector as it relates to life-long learning. We consider ICT as an enabler that: a) connects education businesses (all types, including tertiary education institutions) to learners, their career decisions and their learning, and b) enables those same businesses to run more efficiently. We suggest that these two sets of activities be considered as interconnected parts of the same education or training business ICT strategy.
Abstract:
Media and Information Literacy is the focus of several teaching and research projects at Queensland University of Technology and there is particular emphasis placed on digital technologies and how they are used for communication, information use and learning in formal contexts such as schools. Research projects are currently taking place in several locations where investigators are collecting data on approaches to the use of digital media tools like cameras and editing systems, tablet computers and video games. This complements QUT’s teacher preparation courses, including preparation to implement UNESCO’s Online Course in Media and Information Literacy and Intercultural Dialogue in 2013. This work takes place in the context of projects occurring at the National level in Australia that continue to promote Media and Information Literacy.
Abstract:
Iris based identity verification is highly reliable but it can also be subject to attacks. Pupil dilation or constriction stimulated by the application of drugs are examples of sample presentation security attacks which can lead to higher false rejection rates. Suspects on a watch list can potentially circumvent the iris based system using such methods. This paper investigates a new approach using multiple parts of the iris (instances) and multiple iris samples in a sequential decision fusion framework that can yield robust performance. Results are presented and compared with the standard full iris based approach for a number of iris degradations. An advantage of the proposed fusion scheme is that the trade-off between detection errors can be controlled by setting parameters such as the number of instances and the number of samples used in the system. The system can then be operated to match security threat levels. It is shown that for optimal values of these parameters, the fused system also has a lower total error rate.
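The sequential fusion scheme can be illustrated with a hedged sketch: scores from multiple iris instances and samples are examined in order, the claim is accepted as soon as enough scores pass, and rejected as soon as acceptance becomes impossible. The threshold and accept count are invented parameters standing in for the ones the paper tunes to match security threat levels.

```python
def sequential_fusion(scores, threshold=0.35, min_accepts=3):
    """Sequentially fuse match scores from multiple iris instances and
    samples (lower score = better match).  Accept once `min_accepts`
    scores fall at or below `threshold`; reject early once that target
    can no longer be reached.  Parameter values are illustrative."""
    accepts = 0
    for i, score in enumerate(scores):
        if score <= threshold:
            accepts += 1
        if accepts >= min_accepts:
            return True          # enough evidence: early accept
        remaining = len(scores) - i - 1
        if accepts + remaining < min_accepts:
            return False         # target unreachable: early reject
    return False
```

Raising `min_accepts` or the number of instances and samples trades higher false rejection for lower false acceptance, which is the controllable error trade-off the abstract describes.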
'Information in context': co-designing workplace structures and systems for organizational learning
Abstract:
With the aim of advancing professional practice through a better understanding of how to create workplace contexts that cultivate individual and collective learning through situated 'information in context' experiences, this paper presents insights gained from three North American collaborative design (co-design) implementations. In the current project at the Auraria Library in Denver, Colorado, USA, participants use collaborative information practices to redesign face-to-face and technology-enabled communication, decision-making, and planning systems. Design processes are described and results to date presented, within an appreciative framework which values information sharing and enables knowledge creation through shared leadership.
Abstract:
The Council of Australian Governments (COAG) in 2003 gave in-principle approval to a best-practice report recommending a holistic approach to managing natural disasters in Australia, incorporating a move from a traditional response-centric approach to a greater focus on mitigation, recovery and resilience, with community well-being at the core. Since that time, there have been a range of complementary developments that have supported the COAG-recommended approach. Developments have been administrative, legislative and technological, both in reaction to the COAG initiative and resulting from regular natural disasters. This paper reviews the characteristics of the spatial data becoming increasingly available in Federal, state and regional jurisdictions with respect to their fitness for purpose for disaster planning and mitigation and for strengthening community resilience. In particular, Queensland foundation spatial data, which are increasingly accessible to the public under the provisions of the Right to Information Act 2009, the Information Privacy Act 2009, and recent open data reform initiatives, are evaluated. The Fitzroy River catchment and floodplain is used as a case study for the review undertaken. The catchment covers an area of 142,545 km², the largest river catchment flowing to the eastern coast of Australia. The Fitzroy River basin experienced extensive flooding during the 2010–2011 Queensland floods. The basin is an area of important economic, environmental and heritage values and contains significant infrastructure critical for the mining and agricultural sectors, the two most important economic sectors for Queensland. Consequently, the spatial datasets for this area play a critical role in disaster management and in protecting critical infrastructure essential for economic and community well-being.
The foundation spatial datasets are assessed for disaster planning and mitigation purposes using data quality indicators such as resolution, accuracy, integrity, validity and audit trail.
Abstract:
This paper presents the results from a study of information behaviors, with specific focus on information organization-related behaviors, conducted as part of a larger daily diary study with 34 participants. The findings indicate that organization of information in everyday life is a problematic area due to various factors. The self-evident one is the inter-subjectivity between the person who may have organized the information and the person looking for that same information (Berlin et al., 1993). Increasingly, though, we are not just looking for information within collections that have been designed by someone else, but within our own personal collections of information, which frequently include books, electronic files, photos, records, documents, desktops, web bookmarks, and portable devices. The passage of time between when we categorized or classified the information and the time when we look for the same information poses several problems of intra-subjectivity, or the difference between our own past and present perceptions of the same information. Information searching, and hence the retrieval of information from one's own collection in everyday life, involves spatial and temporal coordination with one's own past selves in a sort of cognitive and affective time travel, just as organizing information is a form of anticipatory coordination with one's future information needs. This has implications for finding information and also for personal information management.
Abstract:
Numerous statements and declarations have been made over recent decades in support of open access to research data. The growing recognition of the importance of open access to research data has been accompanied by calls on public research funding agencies and universities to facilitate better access to publicly funded research data so that it can be re-used and redistributed as public goods. International and inter-governmental bodies such as the ICSU/CODATA, the OECD and the European Union are strong supporters of open access to and re-use of publicly funded research data. This thesis focuses on the research data created by university researchers in Malaysian public universities whose research activities are funded by the Federal Government of Malaysia. Malaysia, like many countries, has not yet formulated a policy on open access to and re-use of publicly funded research data. Therefore, the aim of this thesis is to develop a policy to support the objective of enabling open access to and re-use of publicly funded research data in Malaysian public universities. Policy development is very important if the objective of enabling open access to and re-use of publicly funded research data is to be successfully achieved. In developing the policy, this thesis identifies a myriad of legal impediments arising from intellectual property rights, confidentiality, privacy and national security laws, novelty requirements in patent law and lack of a legal duty to ensure data quality. Legal impediments such as these have the effect of restricting, obstructing, hindering or slowing down the objective of enabling open access to and re-use of publicly funded research data. A key focus in the formulation of the policy was the need to resolve the various legal impediments that have been identified. This thesis analyses the existing policies and guidelines of Malaysian public universities to ascertain to what extent the legal impediments have been resolved. 
An international perspective is adopted by making a comparative analysis of the policies of public research funding agencies and universities in the United Kingdom, the United States and Australia to understand how they have dealt with the identified legal impediments. These countries have led the way in introducing policies which support open access to and re-use of publicly funded research data. As well as proposing a policy supporting open access to and re-use of publicly funded research data in Malaysian public universities, this thesis provides procedures for the implementation of the policy and guidelines for addressing the legal impediments to open access and re-use.