877 results for Cheese authentication


Relevance: 10.00%

Abstract:

Imperfect; wanting the volume entitled, "Paper, by Prof. Archer, Printing, by Joseph Hatton, etc."

Relevance: 10.00%

Abstract:

Also included in the Annual report of the Ontario Department of Agriculture; see that report for later numbers.

Relevance: 10.00%

Abstract:

Mode of access: Internet.

Relevance: 10.00%

Abstract:

There is some evidence that dietary factors may modify the risk of squamous cell carcinoma (SCC) of the skin, but the association between food intake and SCC has not been evaluated prospectively. We examined the association between food intake and SCC incidence among 1,056 randomly selected adults living in an Australian sub-tropical community. Measurement-error-corrected estimates of intake in 15 food groups were derived from a validated food frequency questionnaire in 1992. Associations with SCC risk were assessed using Poisson and negative binomial regression applied to the number of persons affected and to tumour counts, respectively, based on incident, histologically confirmed tumours occurring between 1992 and 2002. After multivariable adjustment, none of the food groups was significantly associated with SCC risk. Stratified analysis in participants with a past history of skin cancer showed a decreased risk of SCC tumours for high intakes of green leafy vegetables (RR = 0.45, 95% CI = 0.22-0.91; p for trend = 0.02) and an increased risk for high intake of unmodified dairy products (RR = 2.53, 95% CI = 1.15-5.54; p for trend = 0.03). Food intake was not associated with SCC risk in persons who had no past history of skin cancer. These findings suggest that consumption of green leafy vegetables may help prevent development of subsequent SCCs of the skin among people with previous skin cancer and that consumption of unmodified dairy products, such as whole milk, cheese and yoghurt, may increase SCC risk in susceptible persons.
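
As a rough illustration of the count-regression approach described above, the sketch below fits a negative binomial model to hypothetical tumour counts with statsmodels. The data, variable names, and covariates are invented, and the measurement-error correction used in the study is omitted.

```python
# Hypothetical sketch: count regression of SCC tumour counts on a food-group
# exposure, loosely mirroring the negative binomial analysis described above.
# Column names and values are illustrative, not taken from the original study.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "tumours":       [0, 0, 1, 3, 0, 2, 0, 1, 0, 4],   # incident SCC tumour counts
    "dairy_tertile": [1, 2, 3, 3, 1, 3, 2, 2, 1, 3],   # unmodified dairy intake tertile
    "age":           [52, 61, 70, 66, 45, 73, 58, 49, 55, 68],
    "male":          [1, 0, 1, 1, 0, 1, 0, 1, 0, 1],
})

X = sm.add_constant(df[["dairy_tertile", "age", "male"]])
fit = sm.GLM(df["tumours"], X, family=sm.families.NegativeBinomial()).fit()

# Exponentiated coefficients approximate rate ratios per tertile increase.
print(np.exp(fit.params))
print(np.exp(fit.conf_int()))  # 95% confidence intervals on the rate-ratio scale
```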

Relevance: 10.00%

Abstract:

Systems biology is based on computational modelling and simulation of large networks of interacting components. Models may be intended to capture processes, mechanisms, components and interactions at different levels of fidelity. Input data are often large and geographically dispersed, and may require the computation to be moved to the data, not vice versa. In addition, complex system-level problems require collaboration across institutions and disciplines. Grid computing can offer robust, scalable solutions for distributed data, compute and expertise. We illustrate some of the range of computational and data requirements in systems biology with three case studies: one requiring large computation but small data (orthologue mapping in comparative genomics), a second involving complex terabyte data (the Visible Cell project) and a third that is both computationally and data-intensive (simulations at multiple temporal and spatial scales). Authentication, authorisation and audit systems do not currently scale well and may present bottlenecks for distributed collaboration, particularly where outcomes may be commercialised. Challenges remain in providing lightweight standards to facilitate the penetration of robust, scalable grid-type computing into diverse user communities to meet the evolving demands of systems biology.

Relevance: 10.00%

Abstract:

Security protocols preserve essential properties, such as confidentiality and authentication, of electronically transmitted data. However, such properties cannot be directly expressed or verified in contemporary formal methods. Via a detailed example, we describe the phases needed to formalise and verify the correctness of a security protocol in the state-oriented Z formalism.

Relevance: 10.00%

Abstract:

Security protocols are often modelled at a high level of abstraction, potentially overlooking implementation-dependent vulnerabilities. Here we use the Z specification language's rich set of data structures to formally model potentially ambiguous messages that may be exploited in a 'type flaw' attack. We then show how to formally verify whether or not such an attack is actually possible in a particular protocol using Z's schema calculus.
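
To make the notion of a 'type flaw' concrete, here is a small illustrative sketch (not the Z model itself): two differently typed fields collapse to the same untyped wire encoding, whereas a type-tagged encoding removes the ambiguity. All names and the encoding are made up for illustration.

```python
# Hypothetical sketch of the idea behind a 'type flaw': at the abstract level a
# protocol distinguishes field types (nonce vs. key), but a naive wire encoding
# collapses them into indistinguishable bit strings, so a replayed nonce can be
# accepted where a key is expected. Names and encodings are illustrative only.
import os
from dataclasses import dataclass

@dataclass(frozen=True)
class Nonce:
    value: bytes

@dataclass(frozen=True)
class Key:
    value: bytes

def naive_encode(*fields) -> bytes:
    """Concatenate field values with no type tags -- the source of ambiguity."""
    return b"".join(f.value for f in fields)

def tagged_encode(*fields) -> bytes:
    """Prefix each field with a one-byte type tag, ruling out the confusion."""
    tags = {Nonce: b"\x01", Key: b"\x02"}
    return b"".join(tags[type(f)] + f.value for f in fields)

nb = os.urandom(16)                          # 16 random bytes used as a nonce
msg_with_nonce = naive_encode(Nonce(nb))
msg_with_key   = naive_encode(Key(nb))       # attacker replays the nonce as a key

print(msg_with_nonce == msg_with_key)                      # True: type flaw possible
print(tagged_encode(Nonce(nb)) == tagged_encode(Key(nb)))  # False: ambiguity removed
```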

Relevance: 10.00%

Abstract:

For the last several years, mobile device and platform security threats, including those arising from wireless networking technology, have been top security issues. There has been a departure from the automatic anti-virus software of traditional PC defense towards risk management (authentication and encryption), compliance, and disaster recovery, driven by polymorphic viruses and malware, as the primary activities within many organizations and government services alike. This chapter covers research in Turkey, where e-government officially started in 2008, as a reflection of the current market, and presents the situation in an emerging country and the resistance encountered while engaging with mobile and e-government interfaces. The authors contend that research is needed to understand more precisely the security threats and, above all, potential solutions for sustaining future intention to use m-government services. Finally, beyond the success or failure of m-government initiatives, the mechanisms related to public administration mobile technical capacity building and security issues are discussed.

Relevance: 10.00%

Abstract:

The advent of personal communication systems within the last decade has depended upon the utilization of advanced digital schemes for source and channel coding and for modulation. The inherent digital nature of the communications processing has allowed the convenient incorporation of cryptographic techniques to implement security in these communications systems. There are various security requirements, of both the service provider and the mobile subscriber, which may be provided for in a personal communications system. Such security provisions include the privacy of user data, the authentication of communicating parties, the provision for data integrity, and the provision for both location confidentiality and party anonymity. This thesis is concerned with an investigation of the private-key and public-key cryptographic techniques pertinent to the security requirements of personal communication systems, and an analysis of the security provisions of Second-Generation personal communication systems is presented. Particular attention has been paid to the properties of the cryptographic protocols which have been employed in current Second-Generation systems. It has been found that certain security-related protocols implemented in the Second-Generation systems have specific weaknesses. A theoretical evaluation of these protocols has been performed using formal analysis techniques, and certain assumptions made during the development of the systems are shown to contribute to the security weaknesses. Various attack scenarios which exploit these protocol weaknesses are presented. The Fiat-Shamir zero-knowledge cryptosystem is presented as an example of how asymmetric-algorithm cryptography may be employed as part of an improved security solution. Various modifications to this cryptosystem have been evaluated and their critical parameters are shown to be capable of being optimized to suit particular applications. The implementation of such a system using current smart card technology has been evaluated.
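
For readers unfamiliar with the scheme, the following toy sketch shows one commit/challenge/response round of the Fiat-Shamir zero-knowledge identification protocol. The modulus and secret are deliberately tiny and insecure, and this is not the modified cryptosystem evaluated in the thesis.

```python
# Hypothetical toy sketch of one round of Fiat-Shamir zero-knowledge
# identification. The modulus is far too small to be secure; it only
# illustrates the commit / challenge / response structure.
import secrets

p, q = 61, 53                 # toy primes (a real system uses large, secret primes)
n = p * q

s = 17                        # prover's secret
v = pow(s, 2, n)              # public value registered with the verifier

def prove_one_round(challenge_bit: int) -> bool:
    r = secrets.randbelow(n - 1) + 1
    x = pow(r, 2, n)                        # prover's commitment
    y = (r * pow(s, challenge_bit, n)) % n  # response: r, or r*s mod n
    # Verifier's check: y^2 == x * v^e (mod n)
    return pow(y, 2, n) == (x * pow(v, challenge_bit, n)) % n

# Repeating the round halves a cheating prover's success probability each time.
print(all(prove_one_round(secrets.randbelow(2)) for _ in range(20)))  # True
```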

Relevance: 10.00%

Abstract:

Classical studies of area summation measure contrast detection thresholds as a function of grating diameter. Unfortunately, (i) this approach is compromised by retinal inhomogeneity and (ii) it potentially confounds summation of signal with summation of internal noise. The Swiss cheese stimulus of T. S. Meese and R. J. Summers (2007) and the closely related Battenberg stimulus of T. S. Meese (2010) were designed to avoid these problems by keeping target diameter constant and modulating interdigitated checks of first-order carrier contrast within the stimulus region. This approach has revealed a contrast integration process with greater potency than the classical model of spatial probability summation. Here, we used Swiss cheese stimuli to investigate the spatial limits of contrast integration over a range of carrier frequencies (1–16 c/deg) and raised plaid modulator frequencies (0.25–32 cycles/check). Subthreshold summation for interdigitated carrier pairs remained strong (~4 to 6 dB) up to 4 to 8 cycles/check. Our computational analysis of these results implied linear signal combination (following square-law transduction) over either (i) 12 carrier cycles or more or (ii) 1.27 deg or more. Our model has three stages of summation: short-range summation within linear receptive fields, medium-range integration to compute contrast energy for multiple patches of the image, and long-range pooling of the contrast integrators by probability summation. Our analysis legitimizes the inclusion of widespread integration of signal (and noise) within hierarchical image processing models. It also confirms the individual differences in the spatial extent of integration that emerge from our approach.
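
Below is a minimal sketch of the kind of three-stage pooling described above: square-law transduction, linear summation of energy within local patches, and long-range Minkowski pooling as a stand-in for probability summation. The exponent and all numbers are illustrative, not the fitted parameters of the study.

```python
# Hypothetical sketch of a three-stage pooling model: linear receptive-field
# outputs are squared (square-law transduction), summed within local patches to
# give contrast energy, and patch energies are combined by Minkowski pooling.
import numpy as np

def pooled_response(filter_outputs: np.ndarray, patch_labels: np.ndarray,
                    beta: float = 4.0) -> float:
    energy = filter_outputs ** 2                                  # square-law transduction
    patch_energy = np.array([energy[patch_labels == p].sum()      # linear summation of
                             for p in np.unique(patch_labels)])   # energy within a patch
    return float(np.sum(patch_energy ** beta) ** (1.0 / beta))    # long-range Minkowski pooling

rng = np.random.default_rng(0)
outputs = 0.5 + 0.1 * rng.standard_normal(64)   # toy receptive-field responses
labels = np.repeat(np.arange(8), 8)             # eight interdigitated patches
print(pooled_response(outputs, labels))
```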

Relevance: 10.00%

Abstract:

In recent years, interest in digital watermarking has grown significantly. Indeed, the use of digital watermarking techniques is seen as a promising means of protecting the intellectual property rights of digital data and ensuring its authentication. Thus, a significant research effort has been devoted to the study of practical watermarking systems, in particular for digital images. In this thesis, a practical and principled approach to the problem is adopted. Several aspects of practical watermarking schemes are investigated. First, a power-constraint formulation of the problem is presented. Then, a new analysis of quantisation effects on the information rate of digital watermarking schemes is proposed and compared to other approaches suggested in the literature. Subsequently, a new information embedding technique, based on quantisation, is put forward and its performance evaluated. Finally, the influence of image data representation on the performance of practical schemes is studied, along with a new representation based on independent component analysis.
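
As a generic illustration of quantisation-based embedding (in the spirit of quantisation index modulation, not the specific technique proposed in the thesis), the following sketch hides one bit per sample by choosing between two interleaved quantisation lattices.

```python
# Hypothetical sketch of generic quantisation-based embedding: a bit is hidden
# by quantising each sample onto one of two interleaved lattices, and recovered
# by checking which lattice the (possibly distorted) sample is closer to.
import numpy as np

DELTA = 8.0  # quantisation step; larger steps are more robust but more visible

def embed_bit(sample: float, bit: int) -> float:
    offset = bit * DELTA / 2.0
    return DELTA * np.round((sample - offset) / DELTA) + offset

def extract_bit(sample: float) -> int:
    # Decide which of the two interleaved lattices the sample is closer to.
    d0 = abs(sample - embed_bit(sample, 0))
    d1 = abs(sample - embed_bit(sample, 1))
    return 0 if d0 <= d1 else 1

pixels = np.array([123.0, 64.0, 200.0, 35.0])
bits = [1, 0, 1, 1]
marked = np.array([embed_bit(x, b) for x, b in zip(pixels, bits)])
noisy = marked + np.random.uniform(-1.5, 1.5, size=marked.shape)  # mild distortion
print([extract_bit(x) for x in noisy])   # recovers [1, 0, 1, 1]: noise stays below DELTA/4
```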

Relevance: 10.00%

Abstract:

To extend our understanding of the early visual hierarchy, we investigated the long-range integration of first- and second-order signals in spatial vision. In our first experiment we performed a conventional area summation experiment where we varied the diameter of (a) luminance-modulated (LM) noise and (b) contrast-modulated (CM) noise. Results from the LM condition replicated previous findings with sine-wave gratings in the absence of noise, consistent with long-range integration of signal contrast over space. For CM, the summation function was much shallower than for LM, suggesting, at first glance, that the signal integration process was spatially less extensive than for LM. However, an alternative possibility was that the high spatial frequency noise carrier for the CM signal was attenuated by peripheral retina (or cortex), thereby impeding our ability to observe area summation of CM in the conventional way. To test this, we developed the "Swiss cheese" stimulus of Meese and Summers (2007) in which signal area can be varied without changing the stimulus diameter, providing some protection against inhomogeneity of the retinal field. Using this technique and a two-component subthreshold summation paradigm we found that (a) CM is spatially integrated over at least five stimulus cycles (possibly more), (b) spatial integration follows square-law signal transduction for both LM and CM and (c) the summing device integrates over spatially interdigitated LM and CM signals when they are co-oriented, but not when cross-oriented. The spatial pooling mechanism that we have identified would be a good candidate component for a module involved in representing visual textures, including their spatial extent.

Relevance: 10.00%

Abstract:

In a Ubiquitous Consumer Wireless World (UCWW) environment the authentication, authorization and accounting (AAA) policies and business services are provisioned, administered and managed by third-party AAA service providers (3P-AAA-SPs), which are independent of the wireless access network providers (ANPs). In this environment the consumer can freely choose any suitable ANP, based on his/her own preferences. This new AAA infrastructural arrangement necessitates assessing the impact on, and re-thinking the design, structure and location of, 'charging and billing' (C&B) functions and services. This paper addresses C&B issues in UCWW, proposing potential architectural solutions for C&B realization. Implementation approaches for these novel solutions are addressed, together with a software testbed for validation and performance evaluation.

Relevance: 10.00%

Abstract:

Distributed and/or composite web applications are driven by intercommunication via web services, which employ application-level protocols such as SOAP. However, these protocols usually rely on classic HTTP for transport. HTTP is quite efficient at what it was designed for, delivering web page content, but it was never intended to carry complex web-service-oriented communication. Today there exist modern protocols that are much better suited to the job. One such candidate is XMPP. It is an XML-based, asynchronous, open protocol that has built-in security and authentication mechanisms and utilizes a network of federated servers. Sophisticated asynchronous multi-party communication patterns can be established, effectively aiding web service developers. This paper's purpose is to show, through facts, comparisons, and practical examples, that XMPP is not only better suited than HTTP to serve as middleware for web service protocols, but can also contribute to the overall development state of web services.
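
A minimal sketch of the transport idea follows: a SOAP envelope carried inside an XMPP message stanza rather than posted over HTTP. The JIDs and the payload operation are made up, and the wrapping shown is illustrative rather than a complete SOAP-over-XMPP binding.

```python
# Hypothetical sketch: build a minimal SOAP 1.2 envelope and wrap it in an
# XMPP <message/> stanza addressed to a (made-up) service JID, using only the
# standard library. Illustrative only; not a complete SOAP-over-XMPP binding.
import xml.etree.ElementTree as ET

SOAP_NS = "http://www.w3.org/2003/05/soap-envelope"

# Minimal SOAP envelope with an illustrative payload element.
envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
ET.SubElement(body, "getQuote", {"symbol": "XMPP"})   # hypothetical operation

# Wrap it in an XMPP message stanza instead of an HTTP POST body.
stanza = ET.Element("message", {
    "to": "quotes.service@example.org",
    "from": "client@example.org/web",
    "type": "normal",
})
stanza.append(envelope)

print(ET.tostring(stanza, encoding="unicode"))
```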

Relevance: 10.00%

Abstract:

Background: During the last decade the use of ECG recordings in biometric recognition studies has increased. The ECG's characteristics make it suitable for subject identification: it is unique, present in all living individuals, and hard to forge. However, in spite of the great number of approaches found in the literature, no agreement exists on the most appropriate methodology. This study aimed to provide a survey of the techniques used so far in ECG-based human identification. Specifically, a pattern recognition perspective is proposed here, providing a unifying framework to appreciate previous studies and, hopefully, guide future research. Methods: We searched for papers on the subject from the earliest available date using relevant electronic databases (Medline, IEEEXplore, Scopus, and Web of Knowledge). The following terms were used in different combinations: electrocardiogram, ECG, human identification, biometric, authentication and individual variability. The electronic sources were last searched on 1st March 2015. Our selection included published research in peer-reviewed journals, book chapters and conference proceedings. The search was restricted to English-language documents. Results: 100 pertinent papers were found. The number of subjects involved in the journal studies ranges from 10 to 502, ages range from 16 to 86, and both male and female subjects are generally present. The number of analysed leads varies, as do the recording conditions. Identification performance differs widely, as does verification rate. Many studies refer to publicly available databases (the Physionet ECG databases repository) while others rely on proprietary recordings, making them difficult to compare. As a measure of overall accuracy we computed a weighted average of the identification rate, and of the equal error rate in authentication scenarios. The identification rate was 94.95% and the equal error rate 0.92%. Conclusions: Biometric recognition is a mature field of research. Nevertheless, the use of physiological signal features, such as ECG traits, needs further improvement. ECG features have the potential to be used in daily activities such as access control and patient handling, as well as in wearable electronics applications. However, some barriers still limit its growth. Further analysis should address the use of single-lead recordings and the study of features that do not depend on the recording sites (e.g. fingers, hand palms). Moreover, it is expected that new techniques will be developed that combine fiducial and non-fiducial features in order to capture the best of both approaches. ECG recognition in pathological subjects is also worthy of additional investigation.
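
As an illustration of the pooling behind the overall accuracy figures quoted above, the following sketch computes a subject-weighted average of per-study identification rates; the study sizes and rates are invented.

```python
# Hypothetical sketch: weighted average of per-study identification rates,
# weighted by the number of subjects in each study. Numbers are illustrative,
# not the studies surveyed above.
study_results = [
    # (subjects, identification rate in %)
    (20, 98.0),
    (90, 96.5),
    (268, 94.0),
    (502, 94.5),
]

total_subjects = sum(n for n, _ in study_results)
weighted_rate = sum(n * rate for n, rate in study_results) / total_subjects
print(f"Weighted identification rate: {weighted_rate:.2f}%")
```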