866 results for Information Security, Safe Behavior, Users' behavior, Brazilian users, threats


Relevance: 100.00%

Abstract:

It is not uncommon for enterprises today to face the demand to integrate many different, and possibly heterogeneous, systems which are generally independently designed and developed, to allow seamless access. In effect, the integration of these systems results in one large whole that must, at the same time, maintain local autonomy and continue working as an independent entity. This problem has introduced a new distributed architecture called federated systems. The most challenging issue in federated systems is how members can cooperate efficiently while preserving their autonomy, especially their security autonomy. This thesis addresses this issue. It reviews the evolution of the concept of federated systems and discusses their organisational characteristics as well as the security issues that remain in existing approaches. The thesis examines how delegation can be used as a means to achieve better security, especially authorisation, while maintaining autonomy for the participating members of the federation. A delegation taxonomy is proposed as one of the main contributions. The major contribution of this thesis is the study and design of a mechanism to support delegation within and between multiple security domains with constraint management capability. A novel delegation framework is proposed comprising two modules: a Delegation Constraint Management module and a Policy Management module. The first module is designed to effectively create, track and manage delegation constraints, especially for delegation processes that require re-delegation (indirect delegation). It employs two algorithms: one to trace the root authority of a delegation constraint chain, and one to prevent potential conflicts when creating a delegation constraint chain. The first module is designed for conflict prevention, not conflict resolution.
The second module is designed to support the first via a policy comparison capability. Its major function is to provide the delegation framework with the capability to compare policies and constraints (written in the format of a policy). The module is an extension of Lin et al.'s work on policy filtering and policy analysis. Throughout the thesis, case studies are used as examples to illustrate the discussed concepts. These two modules are designed to capture one of the most important aspects of the delegation process: the relationships between delegation transactions and the involved constraints, which are not well addressed by existing approaches. This contribution is significant because these relationships provide the information needed to track and enforce the involved delegation constraints and therefore play a vital role in maintaining and enforcing security for transactions across multiple security domains.
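The chain-tracing and conflict-prevention roles of the first module can be sketched in a few lines. This is an illustrative sketch only, not the thesis's own design: the class, field names, and the subset rule used for conflict prevention are all assumptions made for the example.

```python
# Hypothetical sketch: tracing the root authority of a delegation
# constraint chain and preventing a conflicting re-delegation.
# Names and the subset rule are illustrative, not the thesis's API.

class DelegationConstraint:
    def __init__(self, issuer, delegatee, permissions, parent=None):
        self.issuer = issuer                  # domain issuing the constraint
        self.delegatee = delegatee            # domain receiving the delegation
        self.permissions = set(permissions)
        self.parent = parent                  # previous link in the chain (None at the root)

def trace_root(constraint):
    """Walk the chain back to its root authority."""
    while constraint.parent is not None:
        constraint = constraint.parent
    return constraint.issuer

def can_redelegate(parent, permissions):
    """Conflict prevention: a re-delegation may not grant more than its parent."""
    return set(permissions) <= parent.permissions

root = DelegationConstraint("HospitalA", "LabB", {"read", "write"})
sub = DelegationConstraint("LabB", "ClinicC", {"read"}, parent=root)

print(trace_root(sub))                   # root authority of the chain: HospitalA
print(can_redelegate(root, {"read"}))    # allowed: subset of the parent's rights
print(can_redelegate(root, {"delete"}))  # blocked before the chain is extended
```

Checking a proposed link before it is appended is what makes this prevention rather than resolution: the conflicting constraint is never created.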

Relevance: 100.00%

Abstract:

Security and privacy in electronic health record systems have been hindering the growth of e-health systems since their emergence. The development of policies that satisfy the security and privacy requirements of the different stakeholders in healthcare has proven to be difficult. However, these requirements must be met if the systems developed are to succeed in achieving their intended goals. Access control is a fundamental security barrier for securing data in healthcare information systems. In this paper we present an access control model for electronic health records. We address patient privacy requirements, confidentiality of private information, and the need for health professionals to have flexible access to electronic health records. We carefully combine three existing access control models and present a novel access control model for EHRs which satisfies these requirements.
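The abstract does not name the three combined models, so the following is only a generic illustration of layering access-control checks (a role layer, a patient-consent layer, and a purpose-of-use layer); every name and rule here is a hypothetical stand-in, not the paper's model.

```python
# Illustrative sketch of a layered EHR access decision.
# The roles, consent sets, and purposes are made-up examples.

ROLE_PERMISSIONS = {
    "physician": {"read", "write"},
    "nurse": {"read"},
}

def can_access(role, action, patient_consents, requester, purpose, allowed_purposes):
    if action not in ROLE_PERMISSIONS.get(role, set()):
        return False                       # role-based layer
    if requester not in patient_consents:
        return False                       # patient-consent layer
    return purpose in allowed_purposes     # purpose-of-use layer

print(can_access("nurse", "read", {"dr_a", "nurse_b"}, "nurse_b",
                 "treatment", {"treatment", "emergency"}))  # True
print(can_access("nurse", "write", {"nurse_b"}, "nurse_b",
                 "treatment", {"treatment"}))               # False: role lacks write
```

Composing the layers with conjunction is one simple design choice; an actual model may resolve conflicts between layers differently.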

Relevance: 100.00%

Abstract:

Accurate and efficient thermal-infrared (IR) camera calibration is important for advancing computer vision research within the thermal modality. This paper presents an approach for geometrically calibrating individual and multiple cameras in both the thermal and visible modalities. The proposed technique can be used to correct for lens distortion and to simultaneously reference both visible and thermal-IR cameras to a single coordinate frame. The most popular existing approach for the geometric calibration of thermal cameras uses a printed chessboard heated by a flood lamp and is comparatively inaccurate and difficult to execute. Additionally, software toolkits provided for calibration either are unsuitable for this task or require substantial manual intervention. A new geometric mask with high thermal contrast and not requiring a flood lamp is presented as an alternative calibration pattern. Calibration points on the pattern are then accurately located using a clustering-based algorithm which utilizes the maximally stable extremal region detector. This algorithm is integrated into an automatic end-to-end system for calibrating single or multiple cameras. The evaluation shows that using the proposed mask achieves a mean reprojection error up to 78% lower than that using a heated chessboard. The effectiveness of the approach is further demonstrated by using it to calibrate two multiple-camera multiple-modality setups. Source code and binaries for the developed software are provided on the project Web site.
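The mean reprojection error used to evaluate the calibration is the average Euclidean distance between the detected calibration points and their reprojections under the estimated camera model. A minimal sketch, with made-up point values:

```python
# Mean reprojection error: average Euclidean distance between detected
# calibration points and their reprojected positions. Values are illustrative.
import math

def mean_reprojection_error(detected, reprojected):
    return sum(math.dist(d, r) for d, r in zip(detected, reprojected)) / len(detected)

detected = [(10.0, 10.0), (20.0, 10.0), (10.0, 20.0)]
reprojected = [(10.5, 10.0), (20.0, 10.5), (10.0, 20.0)]
print(mean_reprojection_error(detected, reprojected))  # 0.333...
```

A "78% lower" error as reported above would mean this average, measured with the proposed mask, is roughly a fifth of the value obtained with a heated chessboard.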

Relevance: 100.00%

Abstract:

Large margin learning approaches, such as support vector machines (SVM), have been successfully applied to numerous classification tasks, especially automatic facial expression recognition. The risk of such approaches, however, is their sensitivity to large margin losses caused by noisy training examples and outliers, which are a common problem in affective computing (i.e., manual coding at the frame level is tedious, so coarse labels are normally assigned). In this paper, we leverage the relaxation of the parallel-hyperplanes constraint and propose the use of modified correlation filters (MCF). The MCF is similar in spirit to SVMs and correlation filters, but with the key difference of optimizing only a single hyperplane. We demonstrate the superiority of MCF over current techniques on a battery of experiments.
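The single-hyperplane idea can be illustrated with the closed-form regularised least-squares fit used by classical correlation filters: solve (XᵀX + λI)w = Xᵀy for one weight vector, with no parallel-hyperplanes constraint. This toy sketch is not the paper's MCF formulation; the two-feature data and the regulariser λ are made up.

```python
# Toy single-hyperplane fit by regularised least squares (the closed form
# behind classical correlation filters), for 2-feature data. Illustrative only.

def fit_single_hyperplane(X, y, lam=0.1):
    """Solve (X^T X + lam*I) w = X^T y for 2 features via the 2x2 closed form."""
    a = sum(x[0] * x[0] for x in X) + lam
    b = sum(x[0] * x[1] for x in X)
    d = sum(x[1] * x[1] for x in X) + lam
    c1 = sum(x[0] * yi for x, yi in zip(X, y))
    c2 = sum(x[1] * yi for x, yi in zip(X, y))
    det = a * d - b * b
    return ((d * c1 - b * c2) / det, (a * c2 - b * c1) / det)

X = [(1.0, 2.0), (2.0, 1.0), (-1.0, -2.0), (-2.0, -1.0)]
y = [1.0, 1.0, -1.0, -1.0]      # class labels as regression targets
w = fit_single_hyperplane(X, y)
# One hyperplane separates both classes by the sign of w.x:
print(all((w[0] * x[0] + w[1] * x[1] > 0) == (yi > 0) for x, yi in zip(X, y)))
```

Because the loss is squared rather than hinge-based, a mislabelled frame shifts the fit smoothly instead of forcing a margin violation, which is one intuition for robustness to coarse labels.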

Relevance: 100.00%

Abstract:

This paper introduces the Weighted Linear Discriminant Analysis (WLDA) technique, based upon the weighted pairwise Fisher criterion, for the purposes of improving i-vector speaker verification in the presence of high intersession variability. By taking advantage of the speaker discriminative information that is available in the distances between pairs of speakers clustered in the development i-vector space, the WLDA technique is shown to provide an improvement in speaker verification performance over traditional Linear Discriminant Analysis (LDA) approaches. A similar approach is also taken to extend the recently developed Source Normalised LDA (SNLDA) into Weighted SNLDA (WSNLDA) which, similarly, shows an improvement in speaker verification performance in both matched and mismatched enrolment/verification conditions. Based upon the results presented within this paper using the NIST 2008 Speaker Recognition Evaluation dataset, we believe that both WLDA and WSNLDA are viable as replacement techniques to improve the performance of LDA and SNLDA-based i-vector speaker verification.
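The weighted pairwise Fisher criterion builds the between-class scatter from pairs of class means, with a weight that decays with the distance between the pair so that close, easily confused speaker pairs dominate the objective. The one-dimensional sketch below uses an inverse-squared-distance weight purely for illustration; it is not necessarily the weighting function chosen in the paper.

```python
# 1-D sketch of a weighted pairwise between-class scatter:
# sum over pairs of w(d_ij) * (m_i - m_j)^2. The weight function and
# the class means are illustrative assumptions.

def weighted_between_scatter_1d(class_means, weight=lambda d: 1.0 / (d * d)):
    s = 0.0
    for i in range(len(class_means)):
        for j in range(i + 1, len(class_means)):
            d = abs(class_means[i] - class_means[j])
            s += weight(d) * d * d
    return s

means = [0.0, 0.5, 5.0]
print(weighted_between_scatter_1d(means))  # every pair contributes equally here
```

With the unweighted Fisher criterion the distant pair (0.0, 5.0) would dominate the scatter; the decaying weight instead equalises contributions, so the projection is not spent separating speakers who are already far apart.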

Relevance: 100.00%

Abstract:

IT Governance (ITG) adoption remains a relevant topic of study. While extensive research has looked into the drivers and critical success factors of ITG practice, there seems to be a lack of interest in identifying the barriers to its adoption. This study reports on a survey conducted, first, to provide primary data suggesting that ITG adoption and maturity levels are still low, especially in a developing country like Malaysia, and, second, to provide initial empirical support for model development. The results supported our assumptions that: (1) ITG adoption and maturity levels are still relatively low in Malaysia, justifying Malaysia as a suitable case; and (2) organizational factors, environmental factors and characteristics of the innovation, as identified from the literature, may serve as possible barriers to adoption.

Relevance: 100.00%

Abstract:

ZnO nanowires are normally exposed to an oxygen atmosphere to achieve high performance in UV photodetection. In this work we present results on a UV photodetector fabricated using a flexible ZnO nanowire sheet embedded in polydimethylsiloxane (PDMS), a gas-permeable polymer, which shows a reproducible UV photoresponse and enhanced photoconduction. The PDMS coating results in a reduced response speed compared to that of a ZnO nanowire film in air: the rise speed is slightly reduced, while the decay time is prolonged by about a factor of four. We conclude that oxygen molecules diffusing in the PDMS are responsible for the UV photoresponse.

Relevance: 100.00%

Abstract:

In the recent past, there have been social issues arising when sensitive personal data in medical databases were exposed. Sensitive personal data should be protected, and access to it must be accounted for. Protecting sensitive information is possible by encrypting it; the challenge is then querying the encrypted information when making decisions, since querying encrypted data is, in practice, a tedious task. We therefore present a more effective method using bucket-index and Bloom-filter technology. We find that our proposed method shows comparatively low memory usage and fast performance. Simulation approaches applying these data encryption techniques to improve healthcare decision-making processes are presented in this paper as a case scenario.
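The filtering idea can be illustrated with a minimal Bloom filter: a bit array plus several hash functions gives membership tests with no false negatives and a tunable false-positive rate, so most non-matching encrypted buckets can be skipped without decryption. The sizes, hash construction, and keys below are illustrative, not the paper's parameters.

```python
# Minimal Bloom filter sketch. Parameters and keys are illustrative.
import hashlib

class BloomFilter:
    def __init__(self, size=1024, hashes=3):
        self.size, self.hashes = size, hashes
        self.bits = [False] * size

    def _positions(self, item):
        # Derive k bit positions from salted SHA-256 digests.
        for i in range(self.hashes):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.size

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = True

    def might_contain(self, item):
        # False means definitely absent; True means possibly present.
        return all(self.bits[p] for p in self._positions(item))

bf = BloomFilter()
for record_key in ["patient:1001", "patient:1002"]:
    bf.add(record_key)
print(bf.might_contain("patient:1001"))  # True: never a false negative
print(bf.might_contain("patient:9999"))  # almost certainly False
```

In a bucket-index scheme, one such filter per bucket lets the query planner decrypt only buckets whose filter answers "possibly present".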

Relevance: 100.00%

Abstract:

The thesis presented in this paper is that the land fraud committed by Matthew Perrin in Queensland and inflicted upon Roger Mildenhall in Western Australia demonstrates the need for urgent procedural reform to the conveyancing process. Should this not occur, then calls to reform the substantive principles of the Torrens system will be heard throughout the jurisdictions that adopt title by registration, particularly in those places where immediate indefeasibility is still the norm. This paper closely examines the factual matrix behind both of these frauds, and asks what steps should have been taken to prevent them occurring. With 2012 bringing Australian legislation embedding a national e-conveyancing system and a new Land Transfer Act for New Zealand, we ask what legislative measures should be introduced to minimise the potential for such fraud. In undertaking this study, we reflect on whether the activities of Perrin and the criminals responsible for stealing Mildenhall's land would have succeeded under the present system for automated registration utilised in New Zealand.

Relevance: 100.00%

Abstract:

Modelling activities in crowded scenes is very challenging as object tracking is not robust in complicated scenes and optical flow does not capture long range motion. We propose a novel approach to analyse activities in crowded scenes using a “bag of particle trajectories”. Particle trajectories are extracted from foreground regions within short video clips using particle video, which estimates long range motion in contrast to optical flow which is only concerned with inter-frame motion. Our applications include temporal video segmentation and anomaly detection, and we perform our evaluation on several real-world datasets containing complicated scenes. We show that our approaches achieve state-of-the-art performance for both tasks.
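The "bag" representation can be sketched as quantising each trajectory to a codeword and describing a clip by the codeword histogram. Here the quantiser is simply the dominant motion direction of each trajectory, a made-up stand-in for the codebook clustering actually used in such systems.

```python
# Illustrative "bag of trajectories": quantise each trajectory to a
# codeword (here, its dominant direction) and histogram the codewords.
# The quantiser is a stand-in for a learned codebook.

def dominant_direction(traj):
    dx = traj[-1][0] - traj[0][0]
    dy = traj[-1][1] - traj[0][1]
    if abs(dx) >= abs(dy):
        return "right" if dx >= 0 else "left"
    return "down" if dy >= 0 else "up"

def bag_of_trajectories(trajectories):
    hist = {}
    for t in trajectories:
        word = dominant_direction(t)
        hist[word] = hist.get(word, 0) + 1
    return hist

# Three particle trajectories from one short clip (start and end points).
clip = [[(0, 0), (5, 1)], [(2, 2), (7, 2)], [(3, 0), (3, 6)]]
print(bag_of_trajectories(clip))  # {'right': 2, 'down': 1}
```

Because the histogram discards trajectory order and position, clips can be compared directly for segmentation, and rare histograms flagged as anomalies.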

Relevance: 100.00%

Abstract:

Systems, methods and articles for determining anomalous user activity are disclosed. Data representing a transaction activity corresponding to a plurality of user transactions can be received and user transactions can be grouped according to types of user transactions. The transaction activity can be determined to be anomalous in relation to the grouped user transactions based on a predetermined parameter.
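One simple reading of this grouped-anomaly scheme is to compare each transaction against the statistics of its own type, flagging it when it deviates from the group mean by more than a predetermined threshold. The deviation measure, threshold, and amounts below are illustrative assumptions, not the patent's claimed method.

```python
# Sketch: flag a transaction as anomalous relative to its type's group
# when it lies more than `threshold` standard deviations from the group
# mean. Threshold and data are illustrative.
import statistics

def is_anomalous(amount, group_amounts, threshold=3.0):
    mean = statistics.mean(group_amounts)
    stdev = statistics.pstdev(group_amounts)
    if stdev == 0:
        return amount != mean
    return abs(amount - mean) / stdev > threshold

groceries = [40.0, 55.0, 48.0, 52.0, 45.0]   # past transactions of one type
print(is_anomalous(50.0, groceries))   # False: typical amount for this type
print(is_anomalous(500.0, groceries))  # True: far outside the group
```

Grouping by type is what makes the check meaningful: a $500 grocery purchase is anomalous even though $500 might be routine in another transaction type.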

Relevance: 100.00%

Abstract:

This paper investigates the effects of limited speech data in the context of speaker verification using a probabilistic linear discriminant analysis (PLDA) approach. Being able to reduce the length of required speech data is important to the development of automatic speaker verification systems in real-world applications. When sufficient speech is available, previous research has shown that heavy-tailed PLDA (HTPLDA) modeling of speakers in the i-vector space provides state-of-the-art performance; however, the robustness of HTPLDA to limited speech resources in development, enrolment and verification is an important issue that has not yet been investigated. In this paper, we analyze speaker verification performance with regard to the duration of utterances used both for speaker evaluation (enrolment and verification) and for score normalization and PLDA modeling during development. Two different approaches to total-variability representation are analyzed within the PLDA approach to show improved performance in short-utterance mismatched evaluation conditions and in conditions for which insufficient speech resources are available for adequate system development. The results presented within this paper using the NIST 2008 Speaker Recognition Evaluation dataset suggest that the HTPLDA system can continue to achieve better performance than Gaussian PLDA (GPLDA) as evaluation utterance lengths are decreased. We also highlight the importance of matching durations for score normalization and PLDA modeling to the expected evaluation conditions. Finally, we found that a pooled total-variability approach to PLDA modeling can achieve better performance than the traditional concatenated total-variability approach for short utterances in mismatched evaluation conditions and in conditions for which insufficient speech resources are available for adequate system development.

Relevance: 100.00%

Abstract:

This paper investigates the use of the dimensionality-reduction techniques weighted linear discriminant analysis (WLDA) and weighted median Fisher discriminant analysis (WMFD) before probabilistic linear discriminant analysis (PLDA) modeling, for the purpose of improving speaker verification performance in the presence of high inter-session variability. Recently it was shown that WLDA techniques can provide improvement over traditional linear discriminant analysis (LDA) for channel compensation in i-vector based speaker verification systems. We show in this paper that the speaker discriminative information available in the distances between pairs of speakers clustered in the development i-vector space can also be exploited in heavy-tailed PLDA modeling by using the weighted discriminant approaches prior to PLDA modeling. Based upon the results presented within this paper using the NIST 2008 Speaker Recognition Evaluation dataset, we believe that WLDA and WMFD projections before PLDA modeling can provide an improved approach when compared to uncompensated PLDA modeling for i-vector based speaker verification systems.

Relevance: 100.00%

Abstract:

The digital humanities are growing rapidly in response to a rise in Internet use. What humanists mostly work on, and which forms much of the contents of our growing repositories, are digital surrogates of originally analog artefacts. But is the data model upon which many of those surrogates are based – embedded markup – adequate for the task? Or does it in fact inhibit reusability and flexibility? To enhance interoperability of resources and tools, some changes to the standard markup model are needed. Markup could be removed from the text and stored in standoff form. The versions of which many cultural heritage texts are composed could also be represented externally, and computed automatically. These changes would not disrupt existing data representations, which could be imported without significant data loss. They would also enhance automation and ease the increasing burden on the modern digital humanist.
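The standoff proposal above can be made concrete in a few lines: the markup lives as (start, end, tag) offsets over an unmodified plain-text base, so several annotation layers, including overlapping ones, can coexist over the same text. The text and tag names are illustrative.

```python
# Standoff markup sketch: annotations are (start, end, tag) character
# offsets over a plain-text base, rather than tags embedded in the text.

text = "Call me Ishmael."
annotations = [
    (0, 16, "sentence"),   # spans the whole text
    (8, 15, "persName"),   # overlaps the sentence span freely
]

def spans_for(tag):
    """Recover the text covered by every annotation with the given tag."""
    return [text[s:e] for s, e, t in annotations if t == tag]

print(spans_for("persName"))  # ['Ishmael']
print(spans_for("sentence"))  # ['Call me Ishmael.']
```

Because the base text is never rewritten, layers from different tools or editions can be added, removed, or exchanged independently, which is the interoperability gain the passage argues for; the cost is that offsets must be updated if the base text changes.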

Relevance: 100.00%

Abstract:

Information technology (IT) has been playing a powerful role in creating a competitive advantage for organisations over the past decades. This role has become proportionally greater over time as expectations for IT investments to drive business opportunities keep on rising. However, this reliance on IT has also raised concerns about regulatory compliance, governance and security. IT governance (ITG) audit leverages the skills of IS/IT auditors to ensure that IT initiatives are in line with the business strategies. ITG audit emerged as part of performance audit to provide an assessment of the effective implementation of ITG. This research attempts to empirically examine the ITG audit challenges in the Australian public sector. Based on literature research and Delphi research, this paper provides insights regarding the impact of, and required effort to address these challenges. The authors also present the ten major ITG audit challenges facing Australian public sector organisations today.