243 results for Verification


Relevance: 10.00%

Publisher:

Abstract:

Literally, the word compliance suggests conformity in fulfilling official requirements. The thesis presents the results of the analysis and design of a class of protocols called compliant cryptologic protocols (CCP). The thesis presents a notion of compliance in cryptosystems that is conducive as a cryptologic goal. CCP are employed in security systems used by at least two mutually mistrusting sets of entities. The individuals in the sets of entities only trust the design of the security system and any trusted third party the security system may include. Such a security system can be thought of as a broker between the mistrusting sets of entities. In order to provide confidence in operation for the mistrusting sets of entities, CCP must provide compliance verification mechanisms. These mechanisms are employed either by all the entities or by a set of authorised entities in the system to verify that the behaviour of the various participating entities complies with the rules of the system. It is often stated that confidentiality, integrity and authentication are the primary interests of cryptology. It is evident from the literature that authentication mechanisms employ confidentiality and integrity services to achieve their goal. Therefore, the fundamental services that any cryptographic algorithm may provide are confidentiality and integrity only. Since controlling the behaviour of the entities is not a feasible cryptologic goal, the verification of the confidentiality of any data is a futile cryptologic exercise. For example, there exists no cryptologic mechanism that would prevent an entity from willingly or unwillingly exposing its private key corresponding to a certified public key. The confidentiality of the data can only be assumed. Therefore, any verification in cryptologic protocols must take the form of integrity verification mechanisms. Thus, compliance verification must take the form of integrity verification in cryptologic protocols. A definition of compliance that is conducive as a cryptologic goal is presented as a guarantee on the confidentiality and integrity services. The definitions are employed to provide a classification mechanism for various message formats in a cryptologic protocol. The classification assists in the characterisation of protocols, which in turn provides a focus for the goals of the research. The resulting concrete goal of the research is the study of those protocols that employ message formats to provide restricted confidentiality and universal integrity services to selected data. The thesis proposes an informal technique to understand, analyse and synthesise the integrity goals of a protocol system. The thesis contains a study of key recovery, electronic cash, peer-review, electronic auction, and electronic voting protocols. All these protocols contain message formats that provide restricted confidentiality and universal integrity services to selected data. The study of key recovery systems aims to achieve robust key recovery relying only on the certification procedure and without the need for tamper-resistant system modules. The result of this study is a new technique for the design of key recovery systems called hybrid key escrow. The thesis identifies a class of compliant cryptologic protocols called secure selection protocols (SSP). The uniqueness of this class of protocols is the similarity in the goals of the member protocols, namely peer-review, electronic auction and electronic voting.
The problem statement describing the goals of these protocols contains a tuple, (I, D), where I usually refers to the identity of a participant and D usually refers to the data selected by the participant. SSP are concerned with providing a confidentiality service to the tuple, to hide the relationship between I and D, and an integrity service to the tuple after its formation, to prevent its modification. The thesis provides a schema to solve instances of SSP by employing electronic cash technology. The thesis makes a distinction between electronic cash technology and electronic payment technology. It treats electronic cash technology as a certification mechanism that allows participants to obtain a certificate on their public key without revealing the certificate or the public key to the certifier. The thesis abstracts the certificate and the public key as a data structure called the anonymous token. It proposes design schemes for the peer-review, e-auction and e-voting protocols by employing the schema with the anonymous token abstraction. The thesis concludes by providing a variety of problem statements for future research that would further enrich the literature.
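
The anonymous token idea (a certificate on a public key obtained without revealing either to the certifier) is closely related to the blind signatures used in electronic cash. The sketch below is a toy Chaum-style RSA blind signature in Python, not the thesis's own construction; the key sizes are deliberately tiny and insecure, and all names are illustrative only.

```python
import hashlib
import math

# Toy certifier RSA key (insecure, illustration only)
p, q = 10007, 10009
n, e = p * q, 65537
d = pow(e, -1, (p - 1) * (q - 1))         # certifier's private exponent

def hash_to_int(data: bytes) -> int:
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

# Participant: blind a digest of their public key before sending it for certification.
participant_pubkey = b"participant public key bytes"
m = hash_to_int(participant_pubkey)
r = 12345
assert math.gcd(r, n) == 1                # blinding factor must be invertible mod n
blinded = (m * pow(r, e, n)) % n          # the certifier only ever sees this value

# Certifier: signs the blinded value without learning m.
blind_sig = pow(blinded, d, n)

# Participant: unblind to obtain a certificate (signature) on m.
s = (blind_sig * pow(r, -1, n)) % n

# Anonymous token = (public key, unblinded certificate); anyone can verify it,
# but the certifier cannot link the token back to the blinded request it signed.
assert pow(s, e, n) == m
```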

Relevance: 10.00%

Publisher:

Abstract:

Keyword Spotting is the task of detecting keywords of interest within continuous speech. The applications of this technology range from call centre dialogue systems to covert speech surveillance devices. Keyword spotting is particularly well suited to data mining tasks such as real-time keyword monitoring and unrestricted vocabulary audio document indexing. However, to date, many keyword spotting approaches have suffered from poor detection rates, high false alarm rates, or slow execution times, thus reducing their commercial viability. This work investigates the application of keyword spotting to data mining tasks. The thesis makes a number of major contributions to the field of keyword spotting. The first major contribution is the development of a novel keyword verification method named Cohort Word Verification. This method combines high level linguistic information with cohort-based verification techniques to obtain dramatic improvements in verification performance, in particular for the problematic short duration target word class. The second major contribution is the development of a novel audio document indexing technique named Dynamic Match Lattice Spotting. This technique augments lattice-based audio indexing principles with dynamic sequence matching techniques to provide robustness to erroneous lattice realisations. The resulting algorithm obtains significant improvement in detection rate over lattice-based audio document indexing while still maintaining extremely fast search speeds. The third major contribution is the study of multiple verifier fusion for the task of keyword verification. The reported experiments demonstrate that substantial improvements in verification performance can be obtained through the fusion of multiple keyword verifiers. The research focuses on combinations of speech background model based verifiers and cohort word verifiers. The final major contribution is a comprehensive study of the effects of limited training data for keyword spotting. This study is performed with consideration as to how these effects impact the immediate development and deployment of speech technologies for non-English languages.
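
The verifier-fusion contribution can be made concrete with a minimal sketch: each verifier emits a confidence score for a putative keyword occurrence, and a weighted combination of those scores is compared against a threshold. This is only a generic linear fusion example with assumed names, not the specific fusion scheme evaluated in the thesis.

```python
def fuse_and_verify(scores, weights, threshold):
    """Accept a putative keyword occurrence if the fused verifier score clears the threshold.
    scores and weights are parallel lists, one entry per verifier
    (e.g. a speech-background-model verifier and a cohort word verifier)."""
    fused = sum(w * s for w, s in zip(weights, scores))
    return fused >= threshold

# Hypothetical example: two verifiers, with the cohort word verifier weighted more heavily.
print(fuse_and_verify(scores=[0.62, 0.87], weights=[0.4, 0.6], threshold=0.7))
```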

Relevance: 10.00%

Publisher:

Abstract:

In this paper, the commonly used switching schemes for sliding mode control of power converters are analyzed and designed in the frequency domain. The particular application of a distribution static compensator (DSTATCOM) in voltage control mode is investigated in a power distribution system. Tsypkin's method and the describing function are used to obtain the switching conditions for two-level and three-level voltage source inverters. Magnitude conditions of the carrier signals are developed for robust switching of the inverter under a carrier-based modulation scheme of sliding mode control. The existence of border collision bifurcation is identified so that the complex switching states of the inverter can be avoided. The load bus voltage of an unbalanced three-phase nonstiff radial distribution system is controlled using the proposed carrier-based design. The results are validated through PSCAD/EMTDC simulation studies and with a scaled laboratory model of a DSTATCOM developed for experimental verification.
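
As a rough illustration of the frequency-domain machinery mentioned above, the sketch below evaluates the describing function of an ideal relay (the textbook approximation of a two-level switching element) and a simple hysteresis-band switching rule. It is a generic example under assumed parameter names, not the paper's Tsypkin analysis or its DSTATCOM controller.

```python
import math

def relay_describing_function(amplitude, vdc):
    """Describing function of an ideal relay with output levels +/- vdc,
    evaluated for a sinusoidal input of the given amplitude: N(A) = 4*vdc / (pi*A)."""
    return 4.0 * vdc / (math.pi * amplitude)

def two_level_switch(sliding_surface, band, previous_output, vdc):
    """Hysteresis-band switching for a two-level inverter leg:
    change state only when the sliding surface leaves the band, otherwise hold it."""
    if sliding_surface > band:
        return +vdc
    if sliding_surface < -band:
        return -vdc
    return previous_output

# Example: equivalent relay gain for a 1 V switching-surface oscillation with a 400 V dc bus.
print(relay_describing_function(amplitude=1.0, vdc=400.0))
```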

Relevance: 10.00%

Publisher:

Abstract:

Purpose: The purpose of this study was to determine whether adiposity affects the attainment of VO2max. Methods: Sixty-seven male and 68 female overweight (body mass index (BMI) = 25-29.9 kg·m-2) and obese (BMI >= 30 kg·m-2) participants undertook a graded treadmill test to volitional exhaustion (phase 1) followed by a verification test (phase 2) to determine the proportion who could achieve a plateau in VO2 and other "maximal" markers (RER, lactate, HR, RPE). Results: At the end of phase 1, 46% of the participants reached a plateau in VO2, 83% increased HR to within 11 beats of age-predicted maximum, 89% reached an RER of >=1.15, 70% reached a blood lactate concentration of >=8 mmol·L-1, and 74% reached an RPE of >=18. No significant differences between genders and between BMI groups were found with the exception of blood lactate concentration (males = 84% vs females = 56%, P < 0.05). Neither gender nor fatness predicted the number of other markers attained, and attainment of other markers did not differentiate whether a VO2 plateau was achieved. The verification test (phase 2) revealed that an additional 52 individuals (39%) who did not exhibit a plateau in VO2 in phase 1 had no further increase in VO2 in phase 2 despite an increase in workload. Conclusions: These findings indicate that the absence of a plateau in VO2 alone is not indicative of a failure to reach a true maximal VO2 and that individuals with excessive body fat are no less likely than "normal-weight" individuals to exhibit a plateau in VO2 provided that the protocol is appropriate and encouragement to exercise to maximal exertion is provided.

Relevance: 10.00%

Publisher:

Abstract:

Queensland University of Technology (QUT) completed an Australian National Data Service (ANDS) funded "Seeding the Commons Project" to contribute metadata to Research Data Australia. The project employed two Research Data Librarians from October 2009 through to July 2010. Technical support for the project was provided by QUT's High Performance Computing and Research Support Specialists.

The project identified and described QUT's category 1 (ARC/NHMRC) research datasets. Metadata for the research datasets was stored in QUT's Research Data Repository (Architecta Mediaflux). Metadata suitable for inclusion in Research Data Australia was made available to the Australian Research Data Commons (ARDC) in RIF-CS format.

Several workflows and processes were developed during the project. 195 data interviews took place in connection with 424 separate research activities, which resulted in the identification of 492 datasets.

The project had a high level of technical support from QUT's High Performance Computing and Research Support Specialists, who developed the Research Data Librarian interface to the data repository. This interface enabled manual entry of interview data and dataset metadata, and the creation of relationships between repository objects. The Research Data Librarians mapped the QUT metadata repository fields to RIF-CS, and an application was created by the HPC and Research Support Specialists to generate RIF-CS files for harvest by the ARDC.

This poster will focus on the workflows and processes established for the project, including:

• Interview processes and instruments
• Data ingest from existing systems (including mapping to RIF-CS)
• Data entry and the Data Librarian interface to Mediaflux
• Verification processes
• Mapping and creation of RIF-CS for the ARDC
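
The final mapping step (creation of RIF-CS for the ARDC) can be sketched as below: a dataset description is serialised into a minimal RIF-CS-style collection record. The element and attribute names follow my reading of the RIF-CS collection schema and should be checked against the ANDS documentation; the sample values are hypothetical and not drawn from the project.

```python
import xml.etree.ElementTree as ET

RIFCS_NS = "http://ands.org.au/standards/rif-cs/registryObjects"

def dataset_to_rifcs(key, group, source, title, description):
    """Serialise one dataset description into a minimal RIF-CS-style collection record."""
    ET.register_namespace("", RIFCS_NS)
    root = ET.Element(f"{{{RIFCS_NS}}}registryObjects")
    obj = ET.SubElement(root, f"{{{RIFCS_NS}}}registryObject", {"group": group})
    ET.SubElement(obj, f"{{{RIFCS_NS}}}key").text = key
    ET.SubElement(obj, f"{{{RIFCS_NS}}}originatingSource").text = source
    coll = ET.SubElement(obj, f"{{{RIFCS_NS}}}collection", {"type": "dataset"})
    name = ET.SubElement(coll, f"{{{RIFCS_NS}}}name", {"type": "primary"})
    ET.SubElement(name, f"{{{RIFCS_NS}}}namePart").text = title
    ET.SubElement(coll, f"{{{RIFCS_NS}}}description", {"type": "brief"}).text = description
    return ET.tostring(root, encoding="unicode")

# Hypothetical record of the kind captured during a data interview.
print(dataset_to_rifcs(
    key="qut.edu.au/dataset/0001",
    group="Queensland University of Technology",
    source="https://www.qut.edu.au",
    title="Example research dataset",
    description="Metadata captured during a data interview.",
))
```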

Relevance: 10.00%

Publisher:

Abstract:

Before 2001, most Africans immigrating to Australia were white South Africans and Zimbabweans who arrived as economic and family-reunion migrants (Cox, Cooper & Adepoju, 1999). Black African communities are a more recent addition to the Australian landscape, with most entering Australia as refugees after 2001. African refugees are a particularly disadvantaged immigrant group, which the Department of Immigration and Multicultural Affairs (in the Community Relations Commission of New South Wales, 2006, p. 23) suggests requires high levels of settlement support. Decision makers and settlement service providers need settlement data on these communities so that they can be effective in planning, budgeting and delivering support where it is most needed. Settlement data are also useful for determining the challenges that these communities face in trying to establish themselves in resettlement. There has, however, been no verification of existing secondary data sources, nor any previous formal study of African refugee settlement geography in Southeast Queensland. This research addresses the knowledge gap by using a mixed-method approach to identify and describe the distribution and population size of eight African communities in Southeast Queensland, examine secondary migration patterns in these communities and assess the relationship between these geographic features and housing, a critical factor in successful settlement. Significant discrepancies exist between the primary data gathered in the study and existing secondary data relating to the population size and distribution of the communities. Results also reveal a tension between the socio-cultural forces and the housing and economic imperatives driving secondary migration in the communities, and a general lack of engagement by African refugees with structured support networks. These findings have a wide range of implications for policy and for groups that provide settlement support to these communities.

Relevance: 10.00%

Publisher:

Abstract:

Software used by architectural and industrial designers has moved from being a tool for drafting towards use in verification, simulation, project management and remote project sharing. In more advanced models, parameters for the designed object can be adjusted so that a family of variations can be produced rapidly. With advances in computer aided design technology, numerous design options can now be generated and analyzed in real time. However, the use of digital tools to support design as an activity is still at an early stage and has largely been limited in functionality with regard to the design process. To date, major CAD vendors have not developed an integrated tool that is able to both leverage specialized design knowledge from various discipline domains (known as expert knowledge systems) and support the creation of design alternatives that satisfy different forms of constraints. We propose that evolutionary computing and machine learning be linked with parametric design techniques to record and respond to a designer's own way of working and design history. It is expected that this will lead to results that impact future work on design support systems (ergonomics and interface) as well as implicit constraint and problem definition for problems that are difficult to quantify.
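
The proposal to link evolutionary computing with parametric design can be made concrete with a small sketch: a parametric model is reduced to a bounded parameter vector and a simple genetic algorithm searches it against a designer-supplied fitness function. The fitness function and parameter names below are hypothetical placeholders, not part of the original text.

```python
import random

def evolve(fitness, bounds, pop_size=30, generations=50, mutation_rate=0.2):
    """Minimal genetic algorithm over a bounded real-valued parameter vector."""
    def random_individual():
        return [random.uniform(lo, hi) for lo, hi in bounds]

    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)          # higher fitness is better
        parents = population[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]      # blend crossover
            for i, (lo, hi) in enumerate(bounds):            # Gaussian mutation, clamped
                if random.random() < mutation_rate:
                    child[i] = min(hi, max(lo, child[i] + random.gauss(0, 0.1 * (hi - lo))))
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

# Hypothetical parametric facade panel: (width, depth), preferring wide, shallow panels.
best = evolve(fitness=lambda p: p[0] - 0.5 * p[1], bounds=[(0.5, 3.0), (0.1, 1.0)])
print(best)
```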

Relevance: 10.00%

Publisher:

Abstract:

Immediate indefeasibility is the cornerstone of the Torrens system of land registration. However, when combined with the apparent ease with which forged mortgages become registered, the operation of this doctrine can come into question. This article argues that, rather than questioning indefeasibility, the focus should be on the verification of the identity of parties to land transactions. Whilst no system can ever be infallible, it is suggested that by correctly imposing the responsibility for identity verification on the appropriate individual, the Torrens system can retain immediate indefeasibility as its paramount principle, yet achieve the optimum level of fairness in terms of allocation of responsibility and loss. With a new era of electronic conveyancing about to begin, the framework suggested here provides a model for minimising the risks of forged mortgages and appropriately allocating the loss.

Relevance: 10.00%

Publisher:

Abstract:

Human mesenchymal stem cells (hMSCs) possess great therapeutic potential for the treatment of bone disease and fracture non-union. Too often however, in vitro evidence alone of the interaction between hMSCs and the biomaterial of choice is used as justification for continued development of the material into the clinic. Clearly for hMSC-based regenerative medicine to be successful for the treatment of orthopaedic trauma, it is crucial to transplant hMSCs with a suitable carrier that facilitates their survival, optimal proliferation and osteogenic differentiation in vitro and in vivo. This motivated us to evaluate the use of polycaprolactone-20% tricalcium phosphate (PCL-TCP) scaffolds produced by fused deposition modeling for the delivery of hMSCs. When hMSCs were cultured on the PCL-TCP scaffolds and imaged by a combination of phase contrast, scanning electron and confocal laser microscopy, we observed five distinct stages of colonization over a 21-day period that were characterized by cell attachment, spreading, cellular bridging, the formation of a dense cellular mass and the accumulation of a mineralized extracellular matrix when induced with osteogenic stimulants. Having established that PCL-TCP scaffolds are able to support hMSC proliferation and osteogenic differentiation, we next tested the in vivo efficacy of hMSC-loaded PCL-TCP scaffolds in nude rat critical-sized femoral defects. We found that fluorescently labeled hMSCs survived in the defect site for up to 3 weeks post-transplantation. However, only 50% of the femoral defects treated with hMSCs responded favorably as determined by new bone volume. As such, we show that verification of hMSC viability and differentiation in vitro is not sufficient to predict the efficacy of transplanted stem cells to consistently promote bone formation in orthotopic defects in vivo.

Relevance: 10.00%

Publisher:

Abstract:

This research shows that gross pollutant traps (GPTs) continue to play an important role in preventing visible street waste—gross pollutants—from contaminating the environment. The demand for these GPTs calls for stringent quality control and this research provides a foundation to rigorously examine the devices. A novel and comprehensive testing approach to examine a dry sump GPT was developed. The GPT is designed with internal screens to capture gross pollutants—organic matter and anthropogenic litter. This device has not been previously investigated. Apart from the review of GPTs and gross pollutant data, the testing approach includes four additional aspects to this research, which are: field work and an historical overview of street waste/stormwater pollution, calibration of equipment, hydrodynamic studies and gross pollutant capture/retention investigations. This work is the first comprehensive investigation of its kind and provides valuable practical information for the current research and any future work pertaining to the operations of GPTs and management of street waste in the urban environment. Gross pollutant traps—including patented and registered designs developed by industry—have specific internal configurations and hydrodynamic separation characteristics which demand individual testing and performance assessments. Stormwater devices are usually evaluated by environmental protection agencies (EPAs), professional bodies and water research centres. In the USA, the American Society of Civil Engineers (ASCE) and the Environmental Water Resource Institute (EWRI) are examples of professional and research organisations actively involved in these evaluation/verification programs. These programs largely rely on field evaluations alone that are limited in scope, mainly for cost and logistical reasons. In Australia, evaluation/verification programs of new devices in the stormwater industry are not well established. The current limitations in the evaluation methodologies of GPTs have been addressed in this research by establishing a new testing approach. This approach uses a combination of physical and theoretical models to examine in detail the hydrodynamic and capture/retention characteristics of the GPT. The physical model consisted of a 50% scale model GPT rig with screen blockages varying from 0 to 100%. This rig was placed in a 20 m flume and various inlet and outflow operating conditions were modelled on observations made during the field monitoring of GPTs. Due to infrequent cleaning, the retaining screens inside the GPTs were often observed to be blocked with organic matter. Blocked screens can radically change the hydrodynamic and gross pollutant capture/retention characteristics of a GPT as shown from this research. This research involved the use of equipment, such as acoustic Doppler velocimeters (ADVs) and dye concentration (Komori) probes, which were deployed for the first time in a dry sump GPT. Hence, it was necessary to rigorously evaluate the capability and performance of these devices, particularly in the case of the custom made Komori probes, about which little was known. The evaluation revealed that the Komori probes have a frequency response of up to 100 Hz —which is dependent upon fluid velocities—and this was adequate to measure the relevant fluctuations of dye introduced into the GPT flow domain. The outcome of this evaluation resulted in establishing methodologies for the hydrodynamic measurements and gross pollutant capture/retention experiments. 
The hydrodynamic measurements consisted of point-based acoustic Doppler velocimeter (ADV) measurements, flow field particle image velocimetry (PIV) capture, head loss experiments and computational fluid dynamics (CFD) simulation. The gross pollutant capture/retention experiments included the use of anthropogenic litter components, tracer dye and custom modified artificial gross pollutants. Anthropogenic litter was limited to tin cans, bottle caps and plastic bags, while the artificial pollutants consisted of 40 mm spheres with a range of four buoyancies. The hydrodynamic results led to the definition of global and local flow features. The gross pollutant capture/retention results showed that when the internal retaining screens are fully blocked, the capture/retention performance of the GPT rapidly deteriorates. The overall results showed that the GPT will operate efficiently until at least 70% of the screens are blocked, particularly at high flow rates. This important finding indicates that cleaning operations could be more effectively planned when the GPT capture/retention performance deteriorates. At lower flow rates, the capture/retention performance trends were reversed. There is little difference in the poor capture/retention performance between a fully blocked GPT and a partially filled or empty GPT with 100% screen blockages. The results also revealed that the GPT is designed with an efficient high flow bypass system to avoid upstream blockages. The capture/retention performance of the GPT at medium to high inlet flow rates is close to maximum efficiency (100%). With regard to the design appraisal of the GPT, a raised inlet offers a better capture/retention performance, particularly at lower flow rates. Further design appraisals of the GPT are recommended.
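
Operating conditions for a 50% scale physical model are normally derived from prototype flows using Froude similarity, the standard scaling law for free-surface hydraulic models. The abstract does not state which scaling law was used, so the sketch below is only an assumption-labelled illustration of how model flow rates and velocities would be obtained from field observations.

```python
def froude_scaled_flow(prototype_flow_m3s, length_scale=0.5):
    """Under Froude similarity, discharge scales with the length ratio to the power 2.5."""
    return prototype_flow_m3s * length_scale ** 2.5

def froude_scaled_velocity(prototype_velocity_ms, length_scale=0.5):
    """Velocities scale with the square root of the length ratio."""
    return prototype_velocity_ms * length_scale ** 0.5

# Example: a 0.5 m3/s prototype inlet flow corresponds to roughly 0.088 m3/s in a 50% model.
print(froude_scaled_flow(0.5), froude_scaled_velocity(1.2))
```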

Relevance: 10.00%

Publisher:

Abstract:

In a digital world, users’ Personally Identifiable Information (PII) is normally managed with a system called an Identity Management System (IMS). There are many types of IMSs. There are situations when two or more IMSs need to communicate with each other (such as when a service provider needs to obtain some identity information about a user from a trusted identity provider). There could be interoperability issues when the communicating parties use different types of IMS. To facilitate interoperability between different IMSs, an Identity Meta System (IMetS) is normally used. An IMetS can, at least theoretically, join various types of IMSs to make them interoperable and give users the illusion that they are interacting with just one IMS. However, due to the complexity of an IMS, attempting to join various types of IMSs is a technically challenging task, let alone assessing how well an IMetS manages to integrate these IMSs. The first contribution of this thesis is the development of a generic IMS model called the Layered Identity Infrastructure Model (LIIM). Using this model, we develop a set of properties that an ideal IMetS should provide. This idealized form is then used as a benchmark to evaluate existing IMetSs. Different types of IMS provide varying levels of privacy protection support. Unfortunately, as observed by Jøsang et al. (2007), there is insufficient privacy protection in many of the existing IMSs. In this thesis, we study and extend a type of privacy enhancing technology known as an Anonymous Credential System (ACS). In particular, we extend the ACS which is built on the cryptographic primitives proposed by Camenisch, Lysyanskaya, and Shoup. We call this system the Camenisch, Lysyanskaya, Shoup - Anonymous Credential System (CLS-ACS). The goal of CLS-ACS is to let users be as anonymous as possible. Unfortunately, CLS-ACS has problems, including (1) the concentration of power in a single entity - known as the Anonymity Revocation Manager (ARM) - who, if malicious, can trivially reveal a user’s PII (resulting in an illegal revocation of the user’s anonymity), and (2) poor performance due to the resource-intensive cryptographic operations required. The second and third contributions of this thesis are the proposal of two protocols that reduce the trust dependencies on the ARM during users’ anonymity revocation. Both protocols distribute trust from the ARM to a set of n referees (n > 1), resulting in a significant reduction of the probability of an anonymity revocation being performed illegally. The first protocol, called the User Centric Anonymity Revocation Protocol (UCARP), allows a user’s anonymity to be revoked in a user-centric manner (that is, the user is aware that his/her anonymity is about to be revoked). The second protocol, called the Anonymity Revocation Protocol with Re-encryption (ARPR), allows a user’s anonymity to be revoked by a service provider in an accountable manner (that is, there is a clear mechanism to determine which entity can eventually learn - and possibly misuse - the identity of the user). The fourth contribution of this thesis is the proposal of a protocol called the Private Information Escrow bound to Multiple Conditions Protocol (PIEMCP). This protocol is designed to address the performance issue of CLS-ACS by applying CLS-ACS in a federated single sign-on (FSSO) environment.
Our analysis shows that PIEMCP can both reduce the number of expensive modular exponentiation operations required and lower the risk of illegal revocation of users’ anonymity. Finally, the protocols proposed in this thesis are complex and need to be formally evaluated to ensure that their required security properties are satisfied. In this thesis, we use Coloured Petri nets (CPNs) and their corresponding state space analysis techniques. All of the protocols proposed in this thesis have been formally modeled and verified using these formal techniques. Therefore, the fifth contribution of this thesis is a demonstration of the applicability of CPNs and their corresponding analysis techniques in modeling and verifying privacy enhancing protocols. To our knowledge, this is the first time that CPNs have been comprehensively applied to model and verify privacy enhancing protocols. From our experience, we also propose several CPN modeling approaches, including the modeling of complex cryptographic primitives (such as zero-knowledge proof protocols), attack parameterization, and others. The proposed approaches can be applied to other security protocols, not just privacy enhancing protocols.
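
The idea of distributing the ARM's revocation power across n referees can be illustrated, in a very reduced form, by threshold secret sharing: a revocation capability (here a single secret value) is split so that no single referee can recover it alone. This is a generic Shamir sharing sketch with toy parameters, not the UCARP or ARPR protocols themselves.

```python
import random

PRIME = 2 ** 127 - 1  # a Mersenne prime, large enough for a toy secret

def share(secret, n, t):
    """Split `secret` into n shares, any t of which reconstruct it (Shamir, over GF(PRIME))."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation of the sharing polynomial at x = 0."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % PRIME
                den = den * (xj - xm) % PRIME
        secret = (secret + yj * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Five referees, any three of whom can jointly recover the revocation secret.
shares = share(secret=123456789, n=5, t=3)
print(reconstruct(shares[:3]) == 123456789)
```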

Relevance: 10.00%

Publisher:

Abstract:

In this paper, the performance of voltage-source converter-based shunt and series compensators used for load voltage control in electrical power distribution systems has been analyzed and compared when a nonlinear load is connected across the load bus. The comparison has been made based on the closed-loop frequency response characteristics of the compensated distribution system. A distribution static compensator (DSTATCOM) as a shunt device and a dynamic voltage restorer (DVR) as a series device are considered in the voltage-control mode for the comparison. The power-quality problems which these compensators address include voltage sags/swells, load voltage harmonic distortions, and unbalancing. The effect of various system parameters on the control performance of the compensators can be studied using the proposed analysis. In particular, the performance of the two compensators is compared for the strong ac supply (stiff source) and weak ac supply (non-stiff source) distribution systems. The experimental verification of the analytical results derived has been obtained using a laboratory model of the single-phase DSTATCOM and DVR. A generalized converter topology using a cascaded multilevel inverter has been proposed for the medium-voltage distribution system. Simulation studies have been performed in the PSCAD/EMTDC software to verify the results in the three-phase system.

Relevance: 10.00%

Publisher:

Abstract:

Petri nets are often used to model and analyze workflows. Many workflow languages have been mapped onto Petri nets in order to provide formal semantics or to verify correctness properties. Typically, the so-called Workflow nets are used to model and analyze workflows and variants of the classical soundness property are used as a correctness notion. Since many workflow languages have cancelation features, a mapping to workflow nets is not always possible. Therefore, it is interesting to consider workflow nets with reset arcs. Unfortunately, soundness is undecidable for workflow nets with reset arcs. In this paper, we provide a proof and insights into the theoretical limits of workflow verification.
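
For ordinary workflow nets (without reset arcs) the classical soundness property can be checked by explicit state-space exploration, which makes the contrast with the reset-arc case concrete. The sketch below is a naive, bounded exploration for small nets; the place and transition names are illustrative, and it does not handle reset arcs, for which the paper shows soundness is undecidable.

```python
from collections import deque

def reachable_markings(transitions, initial, bound=10_000):
    """Breadth-first exploration of the reachability graph of a small Petri net.
    A marking is a frozenset of (place, tokens) pairs with non-zero token counts;
    transitions maps a name to a (consume, produce) pair of place -> tokens dicts."""
    seen, queue = {initial}, deque([initial])
    while queue and len(seen) < bound:
        marking = dict(queue.popleft())
        for consume, produce in transitions.values():
            if all(marking.get(p, 0) >= k for p, k in consume.items()):
                nxt = dict(marking)
                for p, k in consume.items():
                    nxt[p] -= k
                for p, k in produce.items():
                    nxt[p] = nxt.get(p, 0) + k
                frozen = frozenset((p, k) for p, k in nxt.items() if k)
                if frozen not in seen:
                    seen.add(frozen)
                    queue.append(frozen)
    return seen

def is_sound(transitions, source="i", sink="o"):
    """Classical soundness for a bounded workflow net: the final marking [o] is reachable
    from every reachable marking, completion is proper, and no transition is dead."""
    initial, final = frozenset({(source, 1)}), frozenset({(sink, 1)})
    reachable = reachable_markings(transitions, initial)
    option_to_complete = all(final in reachable_markings(transitions, m) for m in reachable)
    proper_completion = all(m == final for m in reachable if dict(m).get(sink, 0) > 0)
    no_dead_transitions = all(
        any(all(dict(m).get(p, 0) >= k for p, k in consume.items()) for m in reachable)
        for consume, _ in transitions.values()
    )
    return option_to_complete and proper_completion and no_dead_transitions

# A trivially sound net: i --t1--> p --t2--> o
print(is_sound({"t1": ({"i": 1}, {"p": 1}), "t2": ({"p": 1}, {"o": 1})}))
```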

Relevance: 10.00%

Publisher:

Abstract:

The purpose of this work is to validate and automate the use of DYNJAWS, a new component module (CM) in the BEAMnrc Monte Carlo (MC) user code. The DYNJAWS CM simulates dynamic wedges and can be used in three modes: dynamic, step-and-shoot and static. The step-and-shoot and dynamic modes require an additional input file defining the positions of the jaw that constitutes the dynamic wedge, at regular intervals during its motion. A method for automating the generation of the input file is presented which will allow for more efficient use of the DYNJAWS CM. Wedged profiles have been measured and simulated for 6 and 10 MV photons at three field sizes (5 cm x 5 cm, 10 cm x 10 cm and 20 cm x 20 cm), four wedge angles (15, 30, 45 and 60 degrees), at dmax and at 10 cm depth. Results of this study show agreement between the measured and the MC profiles to within 3% of absolute dose or 3 mm distance to agreement for all wedge angles at both energies and depths. The gamma analysis suggests that the dynamic mode is more accurate than the step-and-shoot mode. The DYNJAWS CM is an important addition to the BEAMnrc code and will enable the MC verification of patient treatments involving dynamic wedges.
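
The automation described above amounts to sampling the moving jaw's position at regular intervals of delivered monitor units and writing those samples to the CM's input file. The sketch below uses an idealised linear jaw sweep and a stand-in CSV output, because the real dynamic-wedge trajectory (the vendor's segmented treatment table) and the exact DYNJAWS file syntax are not reproduced here; both would need to be taken from the BEAMnrc user manual.

```python
import csv

def jaw_positions(field_size_cm, n_intervals=20):
    """Sample the moving jaw position at regular fractional-MU intervals for an idealised
    dynamic wedge whose jaw sweeps linearly from fully open to fully closed.
    Real dynamic wedges follow a vendor-defined, non-linear segmented treatment table."""
    half = field_size_cm / 2.0
    samples = []
    for i in range(n_intervals + 1):
        mu_fraction = i / n_intervals
        moving_jaw = half - 2.0 * half * mu_fraction   # +half (open) -> -half (closed)
        samples.append((round(mu_fraction, 4), round(moving_jaw, 4), -half))
    return samples

def write_stand_in_input_file(path, samples):
    """Write the samples as CSV; the actual DYNJAWS CM expects its own text format,
    documented in the BEAMnrc user manual, so this file is a placeholder only."""
    with open(path, "w", newline="") as handle:
        writer = csv.writer(handle)
        writer.writerow(["mu_fraction", "moving_jaw_cm", "fixed_jaw_cm"])
        writer.writerows(samples)

write_stand_in_input_file("dynjaws_positions.csv", jaw_positions(field_size_cm=10.0))
```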

Relevance: 10.00%

Publisher:

Abstract:

Gradual authentication is a principle proposed by Meadows as a way to tackle denial-of-service attacks on network protocols by gradually increasing the confidence in clients before the server commits resources. In this paper, we propose an efficient method that allows a defending server to authenticate its clients gradually with the help of some fast-to-verify measures. Our method integrates hash-based client puzzles along with a special class of digital signatures supporting fast verification. Our hash-based client puzzle provides finer granularity of difficulty and is proven secure in the puzzle difficulty model of Chen et al. (2009). We integrate this with the fast-verification digital signature scheme proposed by Bernstein (2000, 2008). These schemes can be up to 20 times faster for client authentication compared to RSA-based schemes. Our experimental results show that, in the Secure Sockets Layer (SSL) protocol, fast verification digital signatures can provide a 7% increase in connections per second compared to RSA signatures, and our integration of client puzzles with client authentication imposes no performance penalty on the server since puzzle verification is a part of signature verification.
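
A hash-based client puzzle of the general kind described above can be sketched in a few lines: the server issues a nonce and a difficulty, the client searches for a value whose hash together with the nonce has enough leading zero bits, and the server verifies with a single hash. This is only the generic construction; the paper's puzzle offers finer-grained difficulty and a proof in the model of Chen et al. (2009), neither of which is reproduced here.

```python
import hashlib
import itertools
import os

def make_puzzle(difficulty_bits=16):
    """Server side: issue a fresh nonce and a difficulty level."""
    return os.urandom(16), difficulty_bits

def leading_zero_bits(digest):
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        bits += 8 - byte.bit_length()
        break
    return bits

def solve_puzzle(nonce, difficulty_bits):
    """Client side: brute-force search, expected work about 2**difficulty_bits hashes."""
    for counter in itertools.count():
        candidate = counter.to_bytes(8, "big")
        if leading_zero_bits(hashlib.sha256(nonce + candidate).digest()) >= difficulty_bits:
            return candidate

def verify_puzzle(nonce, difficulty_bits, solution):
    """Server side: a single hash suffices, so verification stays cheap under load."""
    return leading_zero_bits(hashlib.sha256(nonce + solution).digest()) >= difficulty_bits

nonce, bits = make_puzzle()
solution = solve_puzzle(nonce, bits)
print(verify_puzzle(nonce, bits, solution))
```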