251 results for Entity Authentication
Abstract:
This study contributes to the understanding of the role of financial reserves in sustaining nonprofit organisations. Recognising the limited recent Australian research in the area of nonprofit financial vulnerability, it specifically examines financial reserves held by signatories to the Code of Conduct of the Australian Council for International Development (ACFID) for the years 2006 to 2010. As this period includes the Global Financial Crisis, it presents a unique opportunity to observe the role of savings in a period of heightened financial threats to sustainability. The need for nonprofit entities to maintain reserves, while appearing intuitively evident, is neither unanimously accepted nor supported by established theoretical constructs. Some early frameworks attempt to explain the savings behaviour of nonprofit organisations and its role in organisational sustainability. Where researchers have considered the issue, its treatment has usually been either purely descriptive or, alternatively, peripheral to a broader attempt to predict financial vulnerability. Given the importance of nonprofit entities to civil society, the sustainability of these organisations during times of economic contraction, such as the recent Global Financial Crisis, is a significant issue. Widespread failure of nonprofits, or even the perception of failure, would directly affect not only those individuals who access their public goods and services, but would also damage public confidence in both government and the sector's ability to manage and achieve its purpose. This study attempts to 'shine a light' on the paradox inherent in considering nonprofit savings. On the one hand, a prevailing public view is that nonprofit organisations should not hoard and, indeed, should spend all of their funds on the direct achievement of their purposes. Against this is the common-sense need for a financial buffer, if only to allow for the day-to-day contingencies of pay rises and cost increases. At the entity level, the extent of reserves accumulated (or not) is an important consideration for Management Boards. The general public is also interested in knowing the level of funds held by nonprofits, both as a measure of their commitment to purpose and as an indicator of their effectiveness. There is a need to communicate the level and prevalence of reserve holdings, balancing the prudent hedging of uncertainty against a sense of resource hoarding in the minds of donors. Finally, funders (especially governments) are interested in knowing the appropriate level of reserves to facilitate the ongoing sustainability of the sector. This is particularly so where organisations are involved in the provision of essential public goods and services. At a scholarly level, the study seeks to provide a rationale for this behaviour within the context of appropriate theory. At a practical level, the study seeks to give an indication of the drivers for savings and the actual levels of reserves held within the sector studied, as well as an indication of whether the presence of reserves mitigated the effects of financial turmoil during the Global Financial Crisis. The argument is not whether there is a need to ensure the sustainability of nonprofits, but rather how this is to be done and whether the holding of reserves (net assets) is an essential element in achieving this.
While the study offers no simple answers, it does appear that the organisations studied present as two groups: the 'savers', who build reserves and keep 'money in the bank', and the 'spender-delivers', who put their resources 'on the ground'. To progress an understanding of this dichotomy, the study suggests a need to move from its current approach to one which more closely explores accounts-based empirical evidence on donor attitudes and nonprofit Management Board strategy.
Abstract:
Purpose – The purpose of this paper is to identify stakeholders' expectations of the information to be conveyed in local authorities' annual reports and to develop an index of best practice performance reporting. Design/methodology/approach – The paper describes the development of a disclosure index emphasizing the public interest aspect of reporting and the need to provide relevant and meaningful information to stakeholders. The index was crafted from a public accountability perspective and based on the expectations of stakeholders as reconciled and validated by a Delphi panel of experts. Findings – The wide scope of information that was identified as being important for disclosure by local authorities is consistent with the public accountability paradigm, which requires the reporting of comprehensive information (both financial and non-financial) about the condition, performance, activities and progress of the entity. Originality/value – The research posits a model of best practice performance reporting for Malaysian, and other, local authorities to meet the need for greater accountability by these entities.
Abstract:
The use of the Trusted Platform Module (TPM) is becoming increasingly popular in many security systems. To access objects protected by the TPM (such as cryptographic keys), several cryptographic protocols, such as the Object Specific Authorization Protocol (OSAP), can be used. Given the sensitivity and importance of the objects protected by the TPM, the security of this protocol is vital. Formal methods allow a precise and complete analysis of cryptographic protocols, such that their security properties can be asserted with high assurance. Unfortunately, formal verification of these protocols is limited, despite the abundance of formal tools that one can use. In this paper, we demonstrate the use of Coloured Petri Nets (CPN), a type of formal technique, to formally model OSAP. Using this model, we then verify the authentication property of this protocol using the state space analysis technique. The results of the analysis demonstrate that, as reported by Chen and Ryan, the authentication property of OSAP can be violated.
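The paper's verification is performed over a full CPN model; purely as a rough illustration of the state space technique itself (not the authors' model of OSAP), the Python sketch below enumerates every reachable state of a toy two-message protocol and checks an authentication predicate in each. All state names and transitions here are invented for illustration.

```python
# Minimal illustration of state space analysis: exhaustively enumerate the
# reachable states of a toy two-message protocol and check that an
# authentication predicate holds in every state. This is NOT the authors'
# CPN model of OSAP; it only sketches the verification technique.
from collections import deque

# A state is (initiator_phase, responder_phase, intruder_knows_secret).
INITIAL = ("idle", "idle", False)

def successors(state):
    init, resp, leaked = state
    if init == "idle":
        yield ("sent_request", resp, leaked)   # initiator sends its request
    if init == "sent_request" and resp == "idle":
        yield (init, "authenticated", leaked)  # responder accepts the request
        yield (init, "authenticated", True)    # ...or an intruder replays it
    if resp == "authenticated" and init == "sent_request":
        yield ("done", resp, leaked)           # initiator completes the run

def violates_authentication(state):
    init, resp, leaked = state
    # Property: the responder must never reach "authenticated" while the
    # intruder has learned the session secret.
    return resp == "authenticated" and leaked

# Breadth-first exploration of the full state space.
seen, queue, violations = {INITIAL}, deque([INITIAL]), []
while queue:
    state = queue.popleft()
    if violates_authentication(state):
        violations.append(state)
    for nxt in successors(state):
        if nxt not in seen:
            seen.add(nxt)
            queue.append(nxt)

print(f"explored {len(seen)} states, {len(violations)} violating")
```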
Abstract:
The cost implications of physical resources such as land and buildings are significant for organisations. Management entities in both the government and private sectors have often ignored the importance of, and the contribution gained from, physical resources in their organisations. These resources are precious assets that can generate income if properly managed. This paper aims to explore current trends in space management internationally, from both government and private sector perspectives. A case study is conducted to examine the level of effectiveness of space management in one of the government institutions in Malaysia. The findings from the case study are compared with current international trends in space management. The study will enrich the current understanding of space management in government properties, compare the level of space management effectiveness of government properties in Malaysia with international trends, and propose suggestions to improve current space management practices for Malaysian government properties.
Abstract:
Background: Recent clinical studies have demonstrated an emerging subgroup of head and neck cancers that are virally mediated. This disease appears to be a distinct clinical entity, with patients presenting younger and with more advanced nodal disease, having lower tobacco and alcohol exposure, and having highly radiosensitive tumours. This means they are living longer, often with the debilitating functional side effects of treatment. The primary objective of this study was to determine how virally mediated nasopharyngeal and oropharyngeal cancers respond to radiation therapy treatment. The aim was to determine risk categories and corresponding adaptive treatment management strategies to proactively manage these patients. Method/Results: 121 patients with virally mediated, node-positive nasopharyngeal or oropharyngeal cancer who received radiotherapy treatment with curative intent between 2005 and 2010 were studied. Relevant patient demographics including age, gender, diagnosis, TNM stage, pre-treatment nodal size and dose delivered were recorded. Each patient's treatment plan was reviewed to determine if another computed tomography (re-CT) scan was performed and at what time point (dose/fraction) this occurred. The justification for this re-CT was determined using four categories: tumour and/or nodal regression, weight loss, both, or other. Patients who underwent a re-CT were further investigated to determine whether a new plan was calculated. If a re-plan was performed, the dosimetric effect was quantified by comparing dose volume histograms of planning target volumes and critical structures from the actual treatment delivered and the original treatment plan. Preliminary results demonstrated that 25/121 (20.7%) patients required a re-CT and that these re-CTs were performed between fractions 20 and 25 of treatment. The justification for these re-CTs consisted of a combination of tumour and/or nodal regression and weight loss. 16 of the 25 (13.2% of all patients) had a replan calculated. 9 (7.4%) of these replans were implemented clinically due to the resultant dosimetric effect calculated. The data collected from this assessment was statistically analysed to identify the major determining factors for patients to undergo a re-CT and/or replan. Specific factors identified included nodal size and timing of the required intervention (i.e. how and when a plan is to be adapted). This data was used to generate specific risk profiles that will form the basis of a biologically guided adaptive treatment management strategy for virally mediated head and neck cancer. Conclusion: Preliminary data indicates that virally mediated head and neck cancers respond significantly during radiation treatment (tumour and/or nodal regression and weight loss). Implications of this response are the potential underdosing or overdosing of the tumour and/or surrounding critical structures. This could lead to sub-optimal patient outcomes and compromised quality of life. Consequently, the development of adaptive treatment strategies that improve organ sparing for this patient group is important to ensure delivery of the prescribed dose to the tumour volume whilst minimizing the dose received by surrounding critical structures. This could reduce side effects and improve overall patient quality of life. The risk profiles and associated adaptive treatment approaches developed in this study will be tested prospectively in the clinical setting in Phase 2 of this investigation.
Abstract:
This paper presents a graph-based method for weighting medical concepts in documents for the purposes of information retrieval. Medical concepts are extracted from free-text documents using a state-of-the-art technique that maps n-grams to concepts from the SNOMED CT medical ontology. In our graph-based concept representation, concepts are vertices in a graph built from a document, and edges represent associations between concepts. This representation naturally captures dependencies between concepts, an important requirement for interpreting medical text and a feature lacking in bag-of-words representations. We apply existing graph-based term weighting methods to weight medical concepts. Using concepts rather than terms addresses vocabulary mismatch and encapsulates terms belonging to a single medical entity within a single concept. In addition, we further extend previous graph-based approaches by injecting domain knowledge that estimates the importance of a concept within the global medical domain. Retrieval experiments on the TREC Medical Records collection show our method outperforms both term and concept baselines. More generally, this work provides a means of integrating background knowledge contained in medical ontologies into data-driven information retrieval approaches.
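The abstract names existing graph-based term weighting methods without detailing one; as a hedged sketch of the general approach, the following applies a personalized PageRank over a concept association graph, using a hypothetical domain-importance prior as the teleport distribution. The concept IDs, edges and prior values are invented.

```python
# Sketch: weight concepts by running PageRank over a concept association
# graph, biasing the teleport step with a (hypothetical) domain-importance
# prior. Concept IDs and the edge list here are illustrative only.
def concept_pagerank(edges, prior, damping=0.85, iters=50):
    nodes = sorted({n for e in edges for n in e} | set(prior))
    out = {n: [] for n in nodes}
    for u, v in edges:                        # undirected association graph
        out[u].append(v)
        out[v].append(u)
    total = sum(prior.get(n, 1e-9) for n in nodes)
    teleport = {n: prior.get(n, 1e-9) / total for n in nodes}
    rank = dict(teleport)                     # start from the prior
    for _ in range(iters):
        new = {n: (1 - damping) * teleport[n] for n in nodes}
        for u in nodes:
            if out[u]:
                share = damping * rank[u] / len(out[u])
                for v in out[u]:
                    new[v] += share
            else:                             # dangling node: spread via prior
                for v in nodes:
                    new[v] += damping * rank[u] * teleport[v]
        rank = new
    return rank

# Toy document graph over SNOMED-like concept IDs (all values invented).
edges = [("C_heart", "C_failure"), ("C_failure", "C_dyspnoea"),
         ("C_heart", "C_dyspnoea"), ("C_aspirin", "C_heart")]
prior = {"C_heart": 0.4, "C_failure": 0.3, "C_dyspnoea": 0.2, "C_aspirin": 0.1}
for concept, w in sorted(concept_pagerank(edges, prior).items(),
                         key=lambda kv: -kv[1]):
    print(f"{concept}: {w:.3f}")
```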
Abstract:
Shared services are increasingly prevalent in practice, their introduction potentially entailing substantive and highly consequential organizational redesign. Yet attention to the structural arrangements of shared services has been limited. This study explores the types of structural arrangements for shared services that are observed in practice, and the salient dimensions along which those types can be usefully differentiated. Through inductive attention to the shared services literature, and content analysis of 36 secondary case studies of shared services in the higher education sector, three salient dimensions emerged: (1) the existence or not of a separate organizational entity, (2) an intra- or inter-organizational sharing boundary, and (3) involvement or not of a third party. Each dimension being dichotomous yields 2³ combinations, or eight shared services structural arrangement types. Each of the eight structural arrangement types is defined and demonstrated through case examples. The typology offers clarity around shared services structural arrangements. It can serve as a useful analytical tool for researchers investigating the phenomenon further, and for practitioners considering the introduction or further development of shared services arrangements. Important follow-on research is also suggested.
Abstract:
To protect health information security, cryptography plays an important role in establishing confidentiality, authentication, integrity and non-repudiation. Keys used for encryption/decryption and digital signing must be managed in a safe, secure, effective and efficient fashion. The certificate-based Public Key Infrastructure (PKI) scheme may seem to be a common way to support information security; however, so far there is still a lack of successful large-scale certificate-based PKI deployments in the world. In addressing the limitations of the certificate-based PKI scheme, this paper proposes a non-certificate-based key management scheme for a national e-health implementation. The proposed scheme eliminates certificate management and complex certificate validation procedures while still maintaining security. It is also believed that this study will add a new dimension to the provision of security for the protection of health information in a national e-health environment.
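The abstract does not spell out the proposed construction; purely as a toy of the general non-certificate-based shape (a trusted key generation centre derives keys directly from identities, so no certificate is needed to bind identity to key material), the sketch below uses HMAC-based derivation. Real identity-based or certificateless schemes rely on pairing-based public-key cryptography; this is not a secure substitute, and every name in it is hypothetical.

```python
# Toy illustration of the non-certificate-based idea: a trusted key
# generation centre (KGC) derives each user's key from their identity,
# eliminating certificate issuance, validation chains and revocation lists.
# The HMAC construction is illustrative only; it is NOT a secure
# identity-based or certificateless scheme.
import hmac, hashlib, secrets

class KeyGenerationCentre:
    def __init__(self):
        self._master_secret = secrets.token_bytes(32)  # never leaves the KGC

    def issue_key(self, identity: str) -> bytes:
        # Deterministic per-identity key, delivered to the user over a
        # secure channel at enrolment time.
        return hmac.new(self._master_secret, identity.encode(),
                        hashlib.sha256).digest()

kgc = KeyGenerationCentre()
doctor_key = kgc.issue_key("provider:dr-lee@hospital.example")  # hypothetical ID
# Authentication tag over a health record, verifiable without any
# certificate lookup or validation procedure:
tag = hmac.new(doctor_key, b"patient-1234:discharge-summary",
               hashlib.sha256).hexdigest()
print(tag)
```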
Abstract:
Increasing use of computerized systems in our daily lives creates new adversarial opportunities, and complex mechanisms are required to keep pace with the rapid development of new attacks. Behavioral biometrics appear to be one of the promising responses to these attacks. However, as it is a relatively new research area, specific frameworks for the evaluation and development of behavioral biometrics solutions do not yet exist. In this paper we present the conception of a generic framework and runtime environment which will enable researchers to develop, evaluate and compare their behavioral biometrics solutions through repeatable experiments, under the same conditions and with the same data.
Abstract:
In the modern connected world, pervasive computing has become a reality. Thanks to the ubiquity of mobile computing devices and emerging cloud-based services, users stay permanently connected to their data. This introduces a slew of new security challenges, including the problems of multi-device key management and single-sign-on architectures. One solution to these problems is the utilization of secure side-channels for authentication, including the visual channel as a proof of vicinity. However, existing approaches often assume confidentiality of the visual channel, or provide only insufficient means of mitigating a man-in-the-middle attack. In this work, we introduce QR-Auth, a two-step, 2D-barcode-based authentication scheme for mobile devices which aims specifically at key management and key sharing across devices in a pervasive environment. It requires minimal user interaction and therefore provides better usability than most existing schemes, without compromising security. We show how our approach fits into existing authorization delegation and one-time-password generation schemes, and that it is resilient to man-in-the-middle attacks.
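QR-Auth's actual message flow is not given in the abstract; the sketch below is a hedged illustration of the generic pattern such schemes follow: the server issues a short-lived, MAC-protected token, the enrolled device renders it as the 2D barcode, and the new device scans and redeems it over the authenticated server channel. The payload format, field names and lifetime are invented, not QR-Auth's.

```python
# Hedged sketch of 2D-barcode-based device pairing (not the actual QR-Auth
# protocol): the server issues a short-lived, MAC-protected token; the
# enrolled device displays it as a QR code; the new device scans it and
# presents it back to the server to be bound to the account.
import base64, hashlib, hmac, json, secrets, time

SERVER_KEY = secrets.token_bytes(32)   # held by the authentication server

def issue_pairing_token(account: str, ttl: int = 60) -> str:
    body = json.dumps({"acct": account,
                       "nonce": secrets.token_hex(8),
                       "exp": int(time.time()) + ttl}).encode()
    mac = hmac.new(SERVER_KEY, body, hashlib.sha256).digest()
    # This string is what the enrolled device would render as a QR code.
    return (base64.urlsafe_b64encode(body).decode() + "." +
            base64.urlsafe_b64encode(mac).decode())

def redeem_pairing_token(token: str) -> str:
    body_b64, mac_b64 = token.split(".")
    body = base64.urlsafe_b64decode(body_b64)
    good = hmac.new(SERVER_KEY, body, hashlib.sha256).digest()
    if not hmac.compare_digest(base64.urlsafe_b64decode(mac_b64), good):
        raise ValueError("forged token")
    claims = json.loads(body)
    if time.time() > claims["exp"]:
        # A short lifetime bounds the window for man-in-the-middle replay.
        raise ValueError("token expired")
    return claims["acct"]              # new device is now bound to this account

qr_payload = issue_pairing_token("alice@example.org")
print("enrol new device for:", redeem_pairing_token(qr_payload))
```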
Abstract:
Internet services are an important part of daily activities for most of us. These services come with sophisticated authentication requirements which may not be manageable for average Internet users. The management of secure passwords, for example, creates an extra overhead which is often neglected for usability reasons. Furthermore, password-based approaches apply only to initial logins and do not protect against unlocked-workstation attacks. In this paper, we provide a non-intrusive identity verification scheme based on behavioral biometrics, where keystroke dynamics based on free text are used continuously to verify the identity of a user in real time. We improve existing keystroke-dynamics-based verification schemes in four aspects. First, we improve scalability by using a constant number of users, instead of the whole user space, to verify the identity of a target user. Second, we provide an adaptive user model which enables our solution to take changes in user behavior into consideration in the verification decision. Third, we identify a new distance measure which enables us to verify the identity of a user with shorter text. Fourth, we decrease the number of false results. Our solution is evaluated on a data set collected from users while they were interacting with their mailboxes during their daily activities.
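The paper's new distance measure is not defined in the abstract; as a sketch of the family it belongs to, the following computes a relative, order-based distance over shared digraph latencies, in the spirit of classic free-text keystroke measures: two samples are close when their shared digraphs have similar speed rankings, which tolerates overall tempo changes. Digraphs and timings are toy values.

```python
# Sketch of a relative, order-based distance over free-text keystroke
# timings (not necessarily the paper's new measure): two samples are close
# when their shared digraphs have similar speed rankings.
from statistics import mean

def digraph_means(latencies):
    # latencies: {"th": [112, 98, ...], ...} flight times in milliseconds
    return {dg: mean(ts) for dg, ts in latencies.items()}

def relative_distance(sample_a, sample_b):
    shared = set(sample_a) & set(sample_b)
    n = len(shared)
    if n < 2:
        return 1.0                         # not enough evidence: max distance
    ma, mb = digraph_means(sample_a), digraph_means(sample_b)
    rank_a = {dg: i for i, dg in enumerate(sorted(shared, key=ma.get))}
    rank_b = {dg: i for i, dg in enumerate(sorted(shared, key=mb.get))}
    disorder = sum(abs(rank_a[dg] - rank_b[dg]) for dg in shared)
    return disorder / (n * n // 2)         # 0 = same ordering, 1 = maximal

# Toy profiles: enrolled user vs a fresh typing sample.
enrolled = {"th": [105, 99], "he": [80, 84], "in": [130, 125], "er": [95, 90]}
probe = {"th": [110], "he": [78], "in": [140], "er": [99], "qu": [200]}
print(f"distance: {relative_distance(enrolled, probe):.2f}")  # small = same typist
```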
Abstract:
Nowadays people rely heavily on the Internet for information and knowledge. Wikipedia is an online multilingual encyclopaedia that contains a very large number of detailed articles covering most written languages. It is often considered to be a treasury of human knowledge. It includes extensive hypertext links between documents of the same language for easy navigation. However, the pages in different languages are rarely cross-linked except for direct equivalent pages on the same subject in different languages. This could pose serious difficulties to users seeking information or knowledge from different lingual sources, or where there is no equivalent page in one language or another. In this thesis, a new information retrieval task, cross-lingual link discovery (CLLD), is proposed to tackle the problem of the lack of cross-lingual anchored links in a knowledge base such as Wikipedia. In contrast to traditional information retrieval tasks, cross-lingual link discovery algorithms actively recommend a set of meaningful anchors in a source document and establish links to documents in an alternative language. In other words, cross-lingual link discovery is a way of automatically finding hypertext links between documents in different languages, which is particularly helpful for knowledge discovery in different language domains. This study is specifically focused on Chinese / English link discovery (C/ELD), a special case of the cross-lingual link discovery task that involves natural language processing (NLP), cross-lingual information retrieval (CLIR) and cross-lingual link discovery. To assess the effectiveness of CLLD, a standard evaluation framework is also proposed. The evaluation framework includes topics, document collections, a gold standard dataset, evaluation metrics, and toolkits for run pooling, link assessment and system evaluation. With the evaluation framework, the performance of CLLD approaches and systems can be quantified. This thesis contributes to research on natural language processing and cross-lingual information retrieval in CLLD as follows: 1) a new simple but effective Chinese segmentation method, n-gram mutual information, is presented for determining the boundaries of Chinese text; 2) a voting mechanism for named entity translation is demonstrated to achieve high precision in English / Chinese machine translation; 3) a link mining approach that mines the existing link structure for anchor probabilities achieves encouraging results in suggesting cross-lingual Chinese / English links in Wikipedia. This approach was examined, in experiments carried out as part of the study, for better automatic generation of cross-lingual links. The overall major contribution of this thesis is the provision of a standard evaluation framework for cross-lingual link discovery research. Such a framework is important in CLLD evaluation because it helps in benchmarking the performance of various CLLD systems and in identifying good CLLD realisation approaches. The evaluation methods and the evaluation framework described in this thesis have been utilised to quantify system performance in the NTCIR-9 Crosslink task, the first information retrieval track of its kind.
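Of the contributions listed, the n-gram mutual information segmentation method lends itself to a compact sketch: estimate pointwise mutual information between adjacent characters from a corpus and cut word boundaries where cohesion is weak. The corpus, smoothing and threshold below are toy choices, not the thesis's.

```python
# Sketch of mutual-information-based segmentation for unsegmented text:
# estimate pointwise mutual information (PMI) between adjacent characters
# from a corpus, then place word boundaries where PMI falls below a
# threshold. The corpus and threshold here are toy values.
import math
from collections import Counter

def train_pmi(corpus: str):
    chars = Counter(corpus)
    bigrams = Counter(corpus[i:i + 2] for i in range(len(corpus) - 1))
    n_uni, n_bi = sum(chars.values()), sum(bigrams.values())
    def pmi(a: str, b: str) -> float:
        p_ab = bigrams.get(a + b, 0.5) / n_bi          # light smoothing
        p_a = chars.get(a, 1) / n_uni
        p_b = chars.get(b, 1) / n_uni
        return math.log2(p_ab / (p_a * p_b))
    return pmi

def segment(text: str, pmi, threshold: float = 0.0):
    words, start = [], 0
    for i in range(len(text) - 1):
        if pmi(text[i], text[i + 1]) < threshold:      # weak cohesion: cut
            words.append(text[start:i + 1])
            start = i + 1
    words.append(text[start:])
    return words

corpus = "北京大学生活动中心北京大学北京生活"   # toy training text
pmi = train_pmi(corpus)
print(segment("北京大学生活动中心", pmi))
```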
Abstract:
There are different ways to authenticate humans, which is an essential prerequisite for access control. The authentication process can be subdivided into three categories that rely on something someone i) knows (e.g. a password), and/or ii) has (e.g. a smart card), and/or iii) is (biometric features). Besides classical attacks on password solutions and the risk that identity-related objects can be stolen, traditional biometric solutions have their own disadvantages, such as the requirement for expensive devices, the risk of stolen bio-templates, etc. Moreover, existing approaches usually perform the authentication process only once, at the start of a session. Non-intrusive and continuous monitoring of user activities emerges as a promising solution for hardening the authentication process, adding a further category: iii-2) how someone behaves. In recent years, various keystroke-dynamics behavior-based approaches have been published that are able to authenticate humans based on their typing behavior. The majority focus on so-called static text approaches, where users are requested to type a previously defined text. Relatively few techniques are based on free text approaches that allow transparent monitoring of user activities and provide continuous verification. Unfortunately, only a few solutions are deployable in application environments under realistic conditions. Unsolved problems include, for instance, scalability, high response times and high error rates. The aim of this work is the development of behavioral-based verification solutions. Our main requirement is to deploy these solutions under realistic conditions within existing environments in order to enable transparent, free-text-based continuous verification of active users with low error rates and response times.
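As an illustration of the continuous, free-text verification loop described above (with placeholder scoring and thresholds, not this work's parameters), the sketch below fuses per-window scores with an exponentially weighted moving average so that a single noisy window dents, but does not immediately end, a trusted session.

```python
# Sketch of continuous verification over free text: each new window of
# keystroke events is scored against the enrolled profile, and a smoothed
# trust level decides whether the session stays active. The scoring
# function and thresholds are placeholders.
class ContinuousVerifier:
    def __init__(self, score_window, alpha=0.3, lock_below=0.4):
        self.score_window = score_window   # returns 1.0 = certainly genuine
        self.alpha = alpha                 # smoothing: weight of newest window
        self.lock_below = lock_below       # trust level that locks the session
        self.trust = 1.0                   # session starts fully trusted

    def observe(self, window) -> bool:
        s = self.score_window(window)
        # Exponentially weighted moving average: one bad window only dents
        # the trust level; a run of bad windows drives it below the bar.
        self.trust = self.alpha * s + (1 - self.alpha) * self.trust
        return self.trust >= self.lock_below   # False means re-authenticate

# Placeholder scorer: pretend short inter-key latencies match the profile.
verifier = ContinuousVerifier(lambda w: 1.0 if sum(w) / len(w) < 150 else 0.2)
for window in ([120, 130, 110], [125, 140, 118], [400, 390, 410]):
    print(verifier.observe(window), round(verifier.trust, 2))
```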
Abstract:
The Session Initiation Protocol (SIP) was developed to provide advanced voice services over IP networks. SIP unites the telephony and data worlds, permitting telephone calls to be transmitted over intranets and the Internet. Increases in network performance and new mechanisms for guaranteed quality of service encourage this consolidation, which provides toll-cost savings. Security comes up as one of the most important issues when voice communication and critical voice applications are considered. Not only the security methods provided by traditional telephony systems, but also additional methods, are required to overcome the security risks introduced by public IP networks. SIP considers the security problems of such a consolidation and provides a security framework. Several security methods are defined within the SIP specifications and extensions, but the suggested methods cannot solve all the security problems of SIP systems with various system requirements. In this thesis, a Kerberos-based solution is proposed for SIP security problems, including SIP authentication and privacy. The proposed solution aims to establish a flexible and scalable SIP system that will provide the desired level of security for voice communications and critical telephony applications.
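The thesis's protocol details are not in the abstract; the toy sketch below (using the third-party cryptography package's Fernet for symmetric sealing) shows only the core Kerberos idea being borrowed: an authentication server that shares long-term keys with both the SIP client and the SIP proxy issues a session key twice, once sealed for the client and once inside a ticket sealed for the proxy, so no password crosses the network. Real Kerberos adds a ticket-granting service, realms, timestamps and replay caches.

```python
# Toy Kerberos-style issuance (NOT real Kerberos): an auth server shares
# long-term keys with the SIP client and SIP proxy, and hands out a fresh
# session key twice: sealed for the client, and inside a ticket sealed for
# the proxy. All names and message formats are invented.
import json, time
from cryptography.fernet import Fernet

client_key = Fernet.generate_key()   # shared: auth server <-> SIP client
proxy_key = Fernet.generate_key()    # shared: auth server <-> SIP proxy

def auth_server_issue(user: str):
    session_key = Fernet.generate_key()
    ticket = Fernet(proxy_key).encrypt(json.dumps(
        {"user": user, "skey": session_key.decode(),
         "expires": time.time() + 300}).encode())
    for_client = Fernet(client_key).encrypt(session_key)
    return for_client, ticket

def proxy_accept_register(ticket: bytes, authenticator: bytes):
    t = json.loads(Fernet(proxy_key).decrypt(ticket))
    assert time.time() < t["expires"], "ticket expired"
    # Client proves knowledge of the session key carried inside the ticket:
    who = Fernet(t["skey"].encode()).decrypt(authenticator).decode()
    assert who == t["user"], "authenticator does not match ticket"
    return f"SIP REGISTER accepted for {who}"

for_client, ticket = auth_server_issue("alice@sip.example")
session_key = Fernet(client_key).decrypt(for_client)
authenticator = Fernet(session_key).encrypt(b"alice@sip.example")
print(proxy_accept_register(ticket, authenticator))
```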
Abstract:
Secure communications in distributed Wireless Sensor Networks (WSN) operating under adversarial conditions necessitate efficient key management schemes. In the absence of a priori knowledge of post-deployment network configuration, and due to limited resources at sensor nodes, key management schemes cannot be based on post-deployment computations. Instead, a list of keys, called a key-chain, is distributed to each sensor node before deployment. For secure communication, either two nodes should have a key in common in their key-chains, or they should establish a key through a secure path on which every link is secured with a key. We first provide a comparative survey of well-known key management solutions for WSN. Probabilistic, deterministic and hybrid key management solutions are presented, and they are compared based on their security properties and resource usage. We provide a taxonomy of solutions, and identify trade-offs in them to conclude that there is no one-size-fits-all solution. Second, we design and analyze deterministic and hybrid techniques to distribute pair-wise keys to sensor nodes before deployment. We present novel deterministic and hybrid approaches based on combinatorial design theory and graph theory for deciding how many and which keys to assign to each key-chain before the sensor network deployment. Performance and security of the proposed schemes are studied both analytically and computationally. Third, we address the key establishment problem in WSN, in which key agreement algorithms that lack authentication must be executed over a secure path. The length of the secure path impacts the power consumption and the initialization delay for a WSN before it becomes operational. We formulate the key establishment problem as a constrained bi-objective optimization problem, break it into two sub-problems, and show that they are both NP-Hard and MAX-SNP-Hard. Having established inapproximability results, we focus on addressing the authentication problem that prevents key agreement algorithms from being used directly over a wireless link. We present a fully distributed algorithm where each pair of nodes can establish a key with authentication by using their neighbors as witnesses.
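As one concrete instance of the combinatorial design approach (whether it is the thesis's exact construction is not stated in the abstract), the sketch below assigns key-chains as the lines of a projective plane of prime order q: a pool of q² + q + 1 keys supports as many nodes, each holding q + 1 keys, with any two nodes guaranteed to share exactly one key, so no secure-path establishment is needed.

```python
# Sketch of combinatorial key pre-distribution: key-chains are the lines
# of a projective plane of prime order q. With a pool of q*q + q + 1 key
# identifiers, each of up to q*q + q + 1 nodes stores q + 1 keys, and ANY
# two nodes share exactly one key. (The abstract does not say this is the
# thesis's exact construction.)
def projective_plane_keychains(q: int):
    # Points: (x, y) affine points, ("slope", m) points at infinity, ("inf",).
    chains = []
    for m in range(q):                        # lines y = m*x + b
        for b in range(q):
            chains.append({(x, (m * x + b) % q) for x in range(q)}
                          | {("slope", m)})
    for c in range(q):                        # vertical lines x = c
        chains.append({(c, y) for y in range(q)} | {("inf",)})
    chains.append({("slope", m) for m in range(q)} | {("inf",)})  # line at infinity
    return chains                             # one key-chain per node

chains = projective_plane_keychains(q=5)      # q must be prime here
print(len(chains), "nodes,", len(chains[0]), "keys per node")
# Every pair of nodes shares exactly one key identifier:
assert all(len(a & b) == 1
           for i, a in enumerate(chains) for b in chains[i + 1:])
```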