962 results for Standard information
Abstract:
This tutorial is primarily based on the IEEE eHealth technical committee Newsletter published in March 2013. Its main focus is information privacy management in eHealth through information accountability. The tutorial covers the three main aspects of a proposed information accountability framework for eHealth: social aspects, technical aspects and legal aspects. Following a brief introduction to the problem domain and context, we present the tutorial in these three main components. The tutorial is intended to run for half a day.
Abstract:
Information Technology (IT) is successfully applied in a diverse range of fields. Though the field of Medical Informatics is more than three decades old, it shows very slow progress compared to many other fields in which the application of IT is growing rapidly. Spending on IT in health care is shooting up, but the road to successful use of IT in health care has not been easy. This paper discusses the barriers to the successful adoption of information technology in clinical environments and outlines the different approaches used by various countries and organisations to tackle the issues successfully. Investing financial and other resources to overcome the barriers to the successful adoption of Health Information Technology (HIT) is highly important to realise the dream of a future healthcare system in which each customer has a secure, private Electronic Health Record (EHR) that is available whenever and wherever needed, enabling the highest degree of coordinated medical care based on the latest medical knowledge and evidence. The paper also reviews barriers to HIT arising from organisational alignment: whether leadership, consistent with its stated values, is willing to consider HIT as a determinant factor in its decision-making processes. The review concludes that organisational accountability and readiness are central to agreeing to technology implementation.
Abstract:
Health Informatics is an intersection of information technology and several disciplines of medicine and health care. It sits at the common frontiers of health care services, including patient-centric, process-driven and procedure-centric care. From the information technology perspective, it can be viewed as the application of computing to medical and/or health processes for delivering better health care solutions. In spite of the exaggerated hype, this field is having a major impact on health care solutions, in particular health care delivery, decision making, medical devices and allied health care industries. It also affords enormous research opportunities for new methodological development. Despite the obvious connections between Medical Informatics, Nursing Informatics and Health Informatics, most of the methodologies and approaches used in Health Informatics have so far originated from health system management, care aspects and medical diagnostics. This paper explores the reasoning for a domain knowledge analysis that would establish Health Informatics as a domain and have it recognised as an intellectual discipline in its own right.
Abstract:
Availability of health information is rapidly increasing, and the expansion and proliferation of health information is inevitable. The Electronic Healthcare Record, Electronic Medical Record and Personal Health Record are at the core of this trend and are required for appropriate and practicable exchange and sharing of health information. However, it is becoming increasingly recognised that it is essential to preserve patient privacy and information security when utilising sensitive information for clinical, management and administrative processes. Furthermore, the usability of emerging healthcare applications is also becoming a growing concern. This paper proposes a novel approach for integrating consideration of information accountability with a perspective from usability engineering that can be applied when developing healthcare information technology applications. A social networking use case in healthcare information exchange will be presented in the context of our approach.
Abstract:
The INEX workshop is concerned with evaluating the effectiveness of XML retrieval systems. In 2004 a natural language query task was added to the INEX Ad hoc track. Standard INEX Ad hoc topic titles are specified in NEXI -- a simplified and restricted subset of XPath, with a similar feel, and yet with a distinct IR flavour and interpretation. The syntax of NEXI is rigid and it imposes some limitations on the kind of information need that it can faithfully capture. At INEX 2004 the NLP question to be answered was simple -- is it practical to use a natural language query that is the equivalent of the formal NEXI title? The results of this experiment are reported and some information on the future direction of the NLP task is presented.
Abstract:
This report describes the available functionality and use of the ClusterEval evaluation software. It implements novel and standard measures for the evaluation of cluster quality. This software has been used at the INEX XML Mining track and in the MediaEval Social Event Detection task.
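The abstract does not detail ClusterEval's novel measures. As an illustration of the kind of standard cluster-quality measure such software implements, purity can be sketched in a few lines of Python (a generic textbook measure, not ClusterEval's own code):

```python
from collections import Counter

def purity(clusters, labels):
    """Purity: fraction of items that carry the majority
    ground-truth label of the cluster they were assigned to."""
    total = len(labels)
    correct = 0
    for cluster in set(clusters):
        members = [labels[i] for i, c in enumerate(clusters) if c == cluster]
        correct += Counter(members).most_common(1)[0][1]
    return correct / total

# Two clusters: cluster 0 is pure, cluster 1 has a majority of 'b'.
print(purity([0, 0, 1, 1, 1], ['a', 'a', 'b', 'b', 'a']))  # 0.8
```

Purity alone is easy to game (assigning every item to its own cluster yields 1.0), which is one reason evaluation tracks combine several measures.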
Abstract:
This thesis is the result of an investigation into information privacy management in eHealth. It explores the applicability of accountability measures as a means of protecting eHealth consumers' privacy. The thesis presents a new concept of Accountable eHealth Systems for achieving a balance between the information privacy concerns of eHealth consumers and the information access requirements of healthcare professionals, and explores the social, technological and implementation aspects involved in such a system.
Abstract:
Clinicians often report that currently available methods to assess older patients, including standard clinical consultations, do not elicit the information necessary to make an appropriate cancer treatment recommendation for older cancer patients. An increasingly popular way of assessing the potential of older patients to cope with chemotherapy is a Comprehensive Geriatric Assessment. What constitutes Comprehensive Geriatric Assessment, however, is open to interpretation and varies from one setting to another. Furthermore, Comprehensive Geriatric Assessment’s usefulness as a predictor of fitness for chemotherapy and as a determinant of actual treatment is not well understood. In this article, we analyse how Comprehensive Geriatric Assessment was developed for use in a large cancer service in an Australian capital city. Drawing upon Actor–Network Theory, our findings reveal how, during its development, Comprehensive Geriatric Assessment was made both a tool and a science. Furthermore, we briefly explore the tensions that we experienced as scholars who analyse medico-scientific practices and as practitioner–designers charged with improving the very tools we critique. Our study contributes towards geriatric oncology by scrutinising the medicalisation of ageing, unravelling the practices of standardisation and illuminating the multiplicity of ‘fitness for chemotherapy’.
Abstract:
Iris-based identity verification is highly reliable, but it can also be subject to attacks. Pupil dilation or constriction stimulated by the application of drugs is an example of a sample presentation security attack which can lead to higher false rejection rates. Suspects on a watch list can potentially circumvent an iris-based system using such methods. This paper investigates a new approach using multiple parts of the iris (instances) and multiple iris samples in a sequential decision fusion framework that can yield robust performance. Results are presented and compared with the standard full-iris approach for a number of iris degradations. An advantage of the proposed fusion scheme is that the trade-off between detection errors can be controlled by setting parameters such as the number of instances and the number of samples used in the system. The system can then be operated to match security threat levels. It is shown that, for optimal values of these parameters, the fused system also has a lower total error rate.
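The abstract does not give the exact fusion rule. As a hedged sketch of the general idea behind sequential decision fusion over multiple instances and samples (the threshold and required count below are hypothetical parameters, not the paper's values):

```python
def sequential_fusion(scores, threshold=0.5, required=3):
    """Toy sequential decision fusion: evaluate match scores from
    multiple iris instances/samples one at a time and accept as
    soon as `required` of them exceed `threshold`."""
    accepted = 0
    for used, score in enumerate(scores, start=1):
        if score > threshold:
            accepted += 1
        if accepted >= required:
            return True, used   # early acceptance, remaining samples unused
    return False, len(scores)

# Genuine-like scores: the third acceptance arrives at the 4th sample.
print(sequential_fusion([0.8, 0.4, 0.7, 0.9, 0.6]))  # (True, 4)
```

Raising `required` or `threshold` makes false acceptance rarer at the cost of more rejections, which mirrors the error trade-off the paper describes controlling via the number of instances and samples.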
Abstract:
This research studies information systems that adapt to the context in which they are used and provides recommendations on how the design of such systems can be improved. This thesis covers the problem of context-awareness via two case studies in the insurance and transportation industries. The study highlights shortcomings in the understanding of the relationship between information systems and context. Furthermore, it presents a new, theory-informed approach to design, and provides guidance for system developers seeking to implement context-aware information systems.
Abstract:
The contemporary working environment is being rapidly reshaped by technological, industrial and political forces. Increased global competitiveness and an emphasis on productivity have led to the appearance of alternative methods of employment, such as part-time, casual and itinerant work, allowing greater flexibility. This allows for the development of a core permanent staff and the simultaneous utilisation of casual staff according to business needs. Flexible workers across industries are generally referred to as the non-standard workforce, and full-time permanent workers as the standard workforce. Even though labour flexibility favours the employer, increased opportunity for flexible work has been embraced by women for many reasons, including the gender struggle for greater economic independence and social equality. Consequently, the largely female nursing industry, both nationally and internationally, has been caught up in this wave of change. This ageing workforce has been at the forefront of the push for flexibility, with recent figures showing almost half the nursing workforce is employed in a non-standard capacity. In part, this has allowed women to fulfil caring roles outside their work, to ease off nearing retirement and to supplement the family income. More significantly, however, flexibility has developed as an economic management initiative, as a strategy for cost constraint. The result has been the development of a dual workforce and, as suggested by Pocock, Buchanan and Campbell (2004), associated deep-seated resentment and the marginalisation of part-time and casual workers by their full-time colleagues and managers. Additionally, as nursing currently faces serious recruitment and retention problems, there is an urgent need to understand the factors underlying present discontent in the nursing profession. There is an identified gap in nursing knowledge surrounding the issues relating to recruitment and retention.
Communication involves speaking, listening, reading and writing and is an interactive process which is central to the lives of humans. Workplace communication refers to human interaction, information technology, and multimedia and print. It is the means of relationship building between workers, management and their external environment, and is critical to organisational effectiveness. Communication and language are integral to nursing performance (Hall, 2005); in a twenty-four-hour service, however, increasing fragmentation due to part-time and casual work in the nursing industry means that effective communication management has become increasingly difficult. More broadly, it is known that disruption to communication systems impacts negatively on consumer outcomes. To address this gap in understanding how nurses view their contemporary nursing world, an interpretative ethnographic study, which progressed to a critical ethnographic study, based on the conceptual framework of constructionism and interpretativism, was used. The study site was a division within an acute health care facility, and the relationship between the increasing casualisation of the nursing workforce and the experiences of communication of standard and non-standard nurses was explored. For this study, full-time standard nurses were those employed to work in a specific unit for forty hours per week. Non-standard nurses were those employed part-time in specific units or those employed as relief pool nurses to cover shift shortfalls where needed. Nurses employed by external agencies but required to fill in for shifts at the facility were excluded from this research. This study involved an analysis of observational, interview and focus group data of standard and non-standard nurses within this facility.
Three analytical findings (the organisation of nursing work, constructing the casual nurse as other, and the function of space) situate communication within a broader discussion about non-standard work and organisational culture. The study results suggest that a significant culture of marginalisation exists for nurses who work in a non-standard capacity, that this affects communication for nurses, and that it has implications for the quality of patient care. The discussion draws on the seven elements of marginalisation described by Hall, Stephen and Melius (1994). The arguments propose that these elements underpin a culture which supports remnants of the historically gendered stereotype of "the good nurse", and that these cultural values contribute to practices and behaviour which marginalise all nurses, particularly those who work less than full-time. Gender inequality is argued to be at the heart of marginalising practices because of the long-standing subordination of nurses by the powerful medical profession, paralleling the historical subordination of women in society. This has denied nurses adequate representation and voice in decision making. The new knowledge emanating from this study extends current knowledge of factors surrounding recruitment and retention and, as such, contributes to an understanding of the current and complex nursing environment.
Abstract:
This poster summarises the current findings from STRC's Integrated Traveller Information research domain, which aims for accurate and reliable travel time prediction and the optimisation of multimodal trips. Three selected discussions follow: (a) fundamental understanding of the use of Bluetooth MAC Scanners (BMS) for travel time estimation; (b) integration of multiple sources (loops and Bluetooth) for travel time and density estimation; (c) an architecture for an online and predictive multimodal trip planner.
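The BMS idea in (a) rests on re-identifying the same Bluetooth device at two scanners and differencing its detection timestamps. A minimal sketch, with hypothetical MAC addresses and timestamps (not data from the poster):

```python
def travel_times(upstream, downstream):
    """Match MAC addresses detected at an upstream and a downstream
    Bluetooth scanner and difference their timestamps (seconds).
    Devices seen at only one scanner contribute nothing."""
    return {mac: downstream[mac] - upstream[mac]
            for mac in upstream.keys() & downstream.keys()}

up = {"aa:01": 100.0, "aa:02": 105.0, "aa:03": 110.0}
down = {"aa:01": 190.0, "aa:02": 210.0}          # aa:03 never re-detected
print(sorted(travel_times(up, down).items()))
# [('aa:01', 90.0), ('aa:02', 105.0)]
```

In practice such estimates need filtering (detection zones are wide, devices may stop en route), which is part of what makes the "fundamental understanding" in (a) non-trivial.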
Abstract:
Availability has become a primary goal of information security and is as significant as other goals, in particular confidentiality and integrity. Maintaining the availability of essential services on the public Internet is an increasingly difficult task in the presence of sophisticated attackers. Attackers may abuse the limited computational resources of a service provider, and thus managing computational costs is a key strategy for achieving the goal of availability. In this thesis we focus on cryptographic approaches for managing computational costs, in particular computational effort. We focus on two cryptographic techniques: computational puzzles in cryptographic protocols and secure outsourcing of cryptographic computations. This thesis contributes to the area of cryptographic protocols in the following ways. First, we propose the most efficient puzzle scheme based on modular exponentiations which, unlike previous schemes of the same type, involves only a few modular multiplications for solution verification; our scheme is provably secure. We then introduce a new efficient gradual authentication protocol by integrating a puzzle into a specific signature scheme. Our software implementation results for the new authentication protocol show that our approach is more efficient and effective than the traditional RSA signature-based one and improves the DoS resilience of the Secure Socket Layer (SSL) protocol, the most widely used security protocol on the Internet. Our next contributions relate to capturing a specific property that enables secure outsourcing of cryptographic tasks in partial decryption. We formally define the property of (non-trivial) public verifiability for general encryption schemes, key encapsulation mechanisms (KEMs), and hybrid encryption schemes, encompassing public-key, identity-based, and tag-based encryption flavors.
We show that some generic transformations and concrete constructions enjoy this property, and then present a new public-key encryption (PKE) scheme that has this property together with a proof of security under standard assumptions. Finally, we combine puzzles with PKE schemes to enable delayed decryption in applications such as e-auctions and e-voting. For this we first introduce the notion of effort-release PKE (ER-PKE), encompassing the well-known timed-release encryption and encapsulated key escrow techniques. We then present a security model for ER-PKE and a generic construction of ER-PKE complying with our security notion.
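The thesis's modular-exponentiation puzzle is not reproduced in this abstract. The asymmetry it relies on, many sequential modular multiplications to solve but only a little work to verify with trapdoor knowledge, is the same one behind the classic Rivest-Shamir-Wagner time-lock construction, sketched here with toy parameters:

```python
def solve_puzzle(x, t, n):
    """Solver: t sequential modular squarings, computing x^(2^t) mod n.
    Without the factorisation of n there is no known shortcut."""
    y = x
    for _ in range(t):
        y = (y * y) % n
    return y

def fast_verify(x, t, p, q):
    """Verifier: knowing n = p*q (the trapdoor), reduce the exponent
    modulo phi(n) first, so the check costs one modular exponentiation."""
    n, phi = p * q, (p - 1) * (q - 1)
    e = pow(2, t, phi)
    return pow(x, e, n)

p, q = 1009, 1013                 # toy primes; real parameters are far larger
print(solve_puzzle(7, 1000, p * q) == fast_verify(7, 1000, p, q))  # True
```

The reduction via Euler's theorem (valid here since gcd(7, n) = 1) is what makes verification cheap for the party holding the trapdoor while forcing everyone else through the sequential squarings.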
Abstract:
Process models expressed in BPMN typically rely on a small subset of all available symbols. In our 2008 study, we examined the composition of these subsets and found that the distribution of BPMN symbols in practice closely resembles the frequency distribution of words in natural language. Based on our findings, we offered some suggestions on how to make the use of BPMN more manageable and also outlined ideas for the further development of BPMN. Since its publication, this paper has provoked spirited debate in the BPM practitioner community, prompted the definition of a modeling standard in the US government, and helped shape the next generation of the BPMN standard.
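To make the word-frequency analogy concrete: under a Zipf-like distribution, the frequency of the symbol at rank r is roughly proportional to 1/r, so rank times frequency stays roughly constant. The sketch below uses illustrative counts, not the 2008 study's data:

```python
from collections import Counter

# Hypothetical symbol counts for a BPMN model collection,
# chosen to follow an approximately Zipfian 1/rank shape.
counts = Counter({
    "task": 1000, "sequence flow": 500, "end event": 333,
    "start event": 250, "xor gateway": 200, "and gateway": 167,
    "pool": 143, "data object": 125, "message flow": 111,
    "timer event": 100,
})

# If the distribution is Zipfian, rank * frequency is near-constant.
for rank, (symbol, freq) in enumerate(counts.most_common(), start=1):
    print(f"{rank:2d}  {symbol:14s} {freq:5d}   rank*freq = {rank * freq}")
```

With these numbers every rank-frequency product lands near 1000, which is the signature the study reports for real model collections: a handful of symbols do almost all the work, with a long tail of rarely used ones.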
Abstract:
Denial-of-service (DoS) attacks are a growing concern for networked services like the Internet. In recent years, major Internet e-commerce and government sites have been disabled due to various DoS attacks. A common form of DoS attack is a resource depletion attack, in which an attacker tries to overload the server's resources, such as memory or computational power, rendering the server unable to service honest clients. A promising way to deal with this problem is for a defending server to identify and segregate malicious traffic as early as possible. Client puzzles, also known as proofs of work, have been shown to be a promising tool to thwart DoS attacks in network protocols, particularly in authentication protocols. In this thesis, we design efficient client puzzles and propose a stronger security model to analyse client puzzles. We revisit a few key establishment protocols to analyse their DoS-resilience properties and strengthen them using existing and novel techniques. Our contributions in the thesis are manifold. We propose an efficient client puzzle whose security holds in the standard model under new computational assumptions. Assuming the presence of powerful DoS attackers, we find a weakness in the most recent security model proposed to analyse client puzzles, and this study leads us to introduce a better security model for analysing client puzzles. We demonstrate the utility of our new security definitions by constructing two stronger hash-based client puzzles. We also show that, using stronger client puzzles, any protocol can be converted into a provably secure DoS-resilient key exchange protocol. In other contributions, we analyse the DoS-resilience properties of network protocols such as Just Fast Keying (JFK) and Transport Layer Security (TLS). In the JFK protocol, we identify a new DoS attack by applying Meadows' cost-based framework to analyse DoS-resilience properties. We also prove that the original security claim of JFK does not hold.
Then we combine an existing technique to reduce the server's cost and prove that the new variant of JFK achieves perfect forward secrecy (a property not achieved by the original JFK protocol) and is secure under the original security assumptions of JFK. Finally, we introduce a novel cost-shifting technique which reduces the computation cost of the server significantly, and we employ the technique in the most important network protocol, TLS, to analyse the security of the resultant protocol. We also observe that the cost-shifting technique can be incorporated into any Diffie-Hellman based key exchange protocol to reduce the Diffie-Hellman exponential cost of a party by one multiplication and one addition.
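Hash-based client puzzles of the general kind discussed in the thesis can be illustrated with a generic find-a-partial-preimage construction (a minimal sketch of the standard idea, not the thesis's provably stronger puzzles): the server spends one hash to verify what costs the client about 2^k hashes to solve.

```python
import hashlib
import itertools
import os

def new_puzzle(difficulty_bits=16):
    """Server side: issue a fresh random nonce and a difficulty."""
    return os.urandom(16), difficulty_bits

def solve(nonce, difficulty_bits):
    """Client side: brute-force a solution s such that
    SHA-256(nonce || s) has `difficulty_bits` leading zero bits."""
    target = 1 << (256 - difficulty_bits)
    for s in itertools.count():
        digest = hashlib.sha256(nonce + s.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return s

def verify(nonce, difficulty_bits, s):
    """Server side: a single hash checks the claimed solution."""
    digest = hashlib.sha256(nonce + s.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

nonce, bits = new_puzzle()
s = solve(nonce, bits)          # ~2^16 hashes on average for the client
print(verify(nonce, bits, s))   # one hash for the server: True
```

Raising `difficulty_bits` lets a defending server scale the client's work to the current threat level, which is the lever the puzzle-based DoS countermeasures in the thesis formalise and strengthen.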