392 results for TRUST-REGION ALGORITHM


Relevance:

20.00%

Publisher:

Abstract:

Composite web services comprise several component web services. When a composite web service is executed centrally, a single web service engine is responsible for coordinating the execution of the components, which may create a bottleneck and degrade the overall throughput of the composite service when there are a large number of service requests. Potentially this problem can be handled by decentralizing execution of the composite web service, but this raises the issue of how to partition a composite service into groups of component services such that each group can be orchestrated by its own execution engine while ensuring acceptable overall throughput of the composite service. Here we present a novel penalty-based genetic algorithm to solve the composite web service partitioning problem. Empirical results show that our new algorithm outperforms existing heuristic-based solutions.
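
As a concrete illustration of the penalty idea, the sketch below scores a candidate partition by its inter-engine communication cost plus a penalty for overloading any engine, and evolves partitions with a bare-bones genetic loop. The instance data, capacity constraint, penalty weight and operators are illustrative assumptions, not details from the paper.

```python
import random

# Toy instance of the partitioning problem: pairwise communication
# volumes between components, a fixed number of engines, and an assumed
# per-engine capacity limit.
N_COMPONENTS, N_ENGINES, MAX_PER_ENGINE = 8, 3, 4
random.seed(1)
COMM = [[0 if i == j else random.randint(1, 9) for j in range(N_COMPONENTS)]
        for i in range(N_COMPONENTS)]   # communication volume between pairs
PENALTY = 100                           # weight of the constraint penalty

def fitness(chrom):
    """Lower is better: inter-group traffic plus penalised overloads."""
    cost = sum(COMM[i][j]
               for i in range(N_COMPONENTS)
               for j in range(i + 1, N_COMPONENTS)
               if chrom[i] != chrom[j])  # traffic crossing engine boundaries
    for e in range(N_ENGINES):           # penalty term for overloaded engines
        cost += PENALTY * max(0, chrom.count(e) - MAX_PER_ENGINE)
    return cost

def mutate(chrom):
    child = chrom[:]
    child[random.randrange(N_COMPONENTS)] = random.randrange(N_ENGINES)
    return child

# Bare-bones (mu + lambda) evolutionary loop.
pop = [[random.randrange(N_ENGINES) for _ in range(N_COMPONENTS)]
       for _ in range(30)]
for _ in range(200):
    pop.sort(key=fitness)
    pop = pop[:15] + [mutate(random.choice(pop[:15])) for _ in range(15)]
pop.sort(key=fitness)
print("best cost:", fitness(pop[0]), "partition:", pop[0])
```

The penalty term lets infeasible partitions survive early generations while steering the search toward feasible, low-traffic groupings.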

Relevance:

20.00%

Publisher:

Abstract:

In cloud computing, resource allocation and scheduling of multiple composite web services is an important challenge. This is especially so in a hybrid cloud, where some free resources may be available from private clouds alongside fee-paying resources from public clouds. Meeting this challenge involves two classical computational problems. One is assigning resources to each of the tasks in the composite web service. The other is scheduling the allocated resources when each resource may be used by more than one task and may be needed at different points in time. In addition, Quality-of-Service issues, such as execution time and running cost, must be considered. Existing approaches to resource allocation and scheduling in public clouds and grid computing are not applicable to this new problem. This paper presents a random-key genetic algorithm that solves this new resource allocation and scheduling problem. Experimental results demonstrate the effectiveness and scalability of the algorithm.
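
For readers unfamiliar with random-key encodings, the sketch below shows the core idea under simplified assumptions: each task carries a random key in [0, 1), the keys are sorted to obtain a scheduling priority, and a greedy decoder assigns each task to the cheapest earliest-available resource. The task durations, resource prices, combined objective and the omission of task precedence are all simplifications, not details from the paper.

```python
import random

# Illustrative instance: tasks of a composite service with durations,
# and resources with per-hour prices (0.0 models a free private-cloud
# resource).  All names and numbers are assumptions for the sketch.
TASKS = {"t1": 4, "t2": 2, "t3": 3, "t4": 1}          # task -> duration
RESOURCES = {"priv1": 0.0, "pub1": 0.5, "pub2": 0.9}  # resource -> cost/hour

def decode(keys):
    """Random-key decoding: one key per task fixes a scheduling priority,
    then each task greedily takes the resource that frees up earliest,
    breaking ties by price."""
    order = sorted(TASKS, key=lambda t: keys[t])       # priority order
    free_at = {r: 0.0 for r in RESOURCES}
    cost = makespan = 0.0
    for t in order:
        r = min(RESOURCES, key=lambda r: (free_at[r], RESOURCES[r]))
        free_at[r] += TASKS[t]
        cost += TASKS[t] * RESOURCES[r]
        makespan = max(makespan, free_at[r])
    return makespan + cost                             # combined QoS objective

random.seed(7)
pop = [{t: random.random() for t in TASKS} for _ in range(40)]
for _ in range(100):                                   # minimal GA loop
    pop.sort(key=decode)
    parents = pop[:20]
    pop = parents + [{t: random.choice(parents)[t] for t in TASKS}
                     for _ in range(20)]               # gene-wise recombination
pop.sort(key=decode)
print("best objective:", decode(pop[0]))
```

Because every key vector decodes to a feasible schedule, standard crossover and mutation never produce invalid offspring, which is the main attraction of random-key encodings for scheduling problems.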

Relevance:

20.00%

Publisher:

Abstract:

Cloud computing has become a principal medium for hosting Software as a Service (SaaS), as it can provide the scalability a SaaS requires. One of the challenges in hosting a SaaS is the placement process, which has to consider the interactions among the SaaS's components as well as the interactions between those components and its data components. Previous research has tackled this problem using a classical genetic algorithm (GA) approach. This paper proposes a cooperative coevolutionary algorithm (CCEA) approach. The CCEA has been implemented and evaluated, and the results show that it produces higher-quality solutions than the GA.
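
A minimal sketch of the cooperative-coevolution idea, assuming the natural decomposition into two species (placements of application components and placements of data components) that are each evaluated against the other species' current best individual. The instance data and operators are illustrative, not taken from the paper.

```python
import random

# Illustrative SaaS instance (assumed): traffic between each application
# component and each data component, to be placed on a set of VMs.
N_APP, N_DATA, N_VMS = 4, 3, 3
random.seed(3)
TRAFFIC = [[random.randint(0, 5) for _ in range(N_DATA)]
           for _ in range(N_APP)]       # app i <-> data j traffic volume

def cost(app_place, data_place):
    """Traffic that crosses VM boundaries; lower is better."""
    return sum(TRAFFIC[i][j]
               for i in range(N_APP) for j in range(N_DATA)
               if app_place[i] != data_place[j])

def evolve(pop, collaborator, place_len):
    """One generation of one species, scored jointly with the other
    species' best individual (the cooperative-coevolution step)."""
    key = (lambda ind: cost(ind, collaborator)) if place_len == N_APP \
          else (lambda ind: cost(collaborator, ind))
    pop.sort(key=key)
    elite = pop[: len(pop) // 2]
    children = []
    for _ in elite:                      # mutate copies of elite members
        child = random.choice(elite)[:]
        child[random.randrange(place_len)] = random.randrange(N_VMS)
        children.append(child)
    return elite + children

apps = [[random.randrange(N_VMS) for _ in range(N_APP)] for _ in range(20)]
datas = [[random.randrange(N_VMS) for _ in range(N_DATA)] for _ in range(20)]
for _ in range(100):
    apps = evolve(apps, datas[0], N_APP)    # best data individual collaborates
    datas = evolve(datas, apps[0], N_DATA)  # best app individual collaborates
print("best joint cost:", cost(apps[0], datas[0]))
```

Splitting the placement decision into cooperating species shrinks each species' search space, which is one intuition for why a CCEA can outperform a single monolithic GA here.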

Relevance:

20.00%

Publisher:

Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on computation, transmission, and storage costs. This decomposition structure is based on analysis of the information-packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean-square criterion as well as the sensitivities of human vision.

To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model's shape parameter is formulated to estimate the model parameters. A noise-shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and the scaling factor. In the lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach. In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made, and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage.

Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project them onto the lattice's outermost shell, while properly maintaining a small codebook size. It also resolves the wedge-region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training or multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images.
For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high-quality reconstructed images with better compression ratios than other available algorithms. To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structure-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local dominant ridge directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating-average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
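
As an illustration of the coefficient-modelling step, the sketch below fits a generalized Gaussian distribution (GGD) to wavelet coefficients by solving a least-squares problem on a nonlinear function of the shape parameter. The moment-ratio function used here is the standard moment-matching choice and is an assumption about the thesis's exact formulation.

```python
import numpy as np
from scipy.special import gamma
from scipy.optimize import least_squares

def ggd_ratio(beta):
    """(E|X|)^2 / E[X^2] for a zero-mean GGD with shape parameter beta."""
    return gamma(2.0 / beta) ** 2 / (gamma(1.0 / beta) * gamma(3.0 / beta))

def fit_ggd(coeffs):
    """Estimate GGD shape (beta) and scale (alpha) from subband samples
    by matching the empirical moment ratio via least squares."""
    m1 = np.mean(np.abs(coeffs))
    m2 = np.mean(coeffs ** 2)
    sol = least_squares(lambda b: ggd_ratio(b[0]) - m1 ** 2 / m2,
                        x0=[1.0], bounds=(0.05, 10.0))
    beta = sol.x[0]
    alpha = m1 * gamma(1.0 / beta) / gamma(2.0 / beta)
    return beta, alpha

# Demo on synthetic Laplacian data, which is a GGD with beta = 1.
rng = np.random.default_rng(0)
beta, alpha = fit_ggd(rng.laplace(scale=1.0, size=50_000))
print(f"estimated shape {beta:.2f} (expect ~1.0), scale {alpha:.2f}")
```

The fitted shape and scale per subband are exactly the quantities a noise-shaping bit allocator and a lattice quantizer designed around source statistics would consume.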

Relevance:

20.00%

Publisher:

Abstract:

Literally, the word compliance suggests conformity in fulfilling official requirements. This thesis presents the results of the analysis and design of a class of protocols called compliant cryptologic protocols (CCP), together with a notion of compliance in cryptosystems that is suitable as a cryptologic goal. CCP are employed in security systems used by at least two mutually mistrusting sets of entities. The individuals in these sets trust only the design of the security system and any trusted third party the system may include. Such a security system can be thought of as a broker between the mistrusting sets of entities. In order to provide confidence in operation for the mistrusting sets of entities, CCP must provide compliance verification mechanisms. These mechanisms are employed either by all the entities or by a set of authorised entities in the system to verify that the behaviour of the various participating entities complies with the rules of the system.

It is often stated that confidentiality, integrity and authentication are the primary interests of cryptology. It is evident from the literature that authentication mechanisms employ confidentiality and integrity services to achieve their goal; therefore, the fundamental services that any cryptographic algorithm may provide are confidentiality and integrity only. Since controlling the behaviour of the entities is not a feasible cryptologic goal, the verification of the confidentiality of any data is a futile cryptologic exercise. For example, there exists no cryptologic mechanism that would prevent an entity from willingly or unwillingly exposing the private key corresponding to a certified public key. The confidentiality of data can only be assumed. Therefore, any verification in cryptologic protocols must take the form of integrity verification mechanisms, and compliance verification must likewise take the form of integrity verification.

A definition of compliance suitable as a cryptologic goal is presented as a guarantee on the confidentiality and integrity services. The definitions are employed to provide a classification mechanism for the various message formats in a cryptologic protocol. The classification assists in the characterisation of protocols, which provides a focus for the goals of the research. The resulting concrete goal of the research is the study of those protocols that employ message formats to provide restricted confidentiality and universal integrity services to selected data. The thesis proposes an informal technique to understand, analyse and synthesise the integrity goals of a protocol system. It contains a study of key recovery, electronic cash, peer-review, electronic auction and electronic voting protocols, all of which contain message formats that provide restricted confidentiality and universal integrity services to selected data. The study of key recovery systems aims to achieve robust key recovery relying only on the certification procedure, without the need for tamper-resistant system modules; the result of this study is a new technique for the design of key recovery systems called hybrid key escrow. The thesis identifies a class of compliant cryptologic protocols called secure selection protocols (SSP). The uniqueness of this class of protocols lies in the similarity of the goals of the member protocols, namely peer-review, electronic auction and electronic voting.
The problem statement describing the goals of these protocols contains a tuple (I, D), where I usually refers to the identity of a participant and D to the data selected by that participant. SSP aim to provide a confidentiality service for the tuple, hiding the relationship between I and D, and an integrity service for the tuple after its formation, preventing its modification. The thesis provides a schema to solve instances of SSP by employing electronic cash technology. The thesis makes a distinction between electronic cash technology and electronic payment technology: it treats electronic cash technology as a certification mechanism that allows participants to obtain a certificate on their public key without revealing the certificate or the public key to the certifier. The thesis abstracts the certificate and the public key as a data structure called an anonymous token. It proposes design schemes for the peer-review, e-auction and e-voting protocols by employing the schema with the anonymous token abstraction. The thesis concludes by providing a variety of problem statements for future research that would further enrich the literature.
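
The anonymous token described above is the kind of object a blind-signature scheme produces; reading it that way is an assumption, but the textbook RSA blind-signature sketch below shows the mechanism it implies: a certifier signs a participant's value without ever seeing it. Toy key size and no hashing or padding, both of which real systems require.

```python
import math, random

# Toy RSA key for the certifier (demo values only).
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))     # certifier's private exponent

m = 1234                              # participant's value, e.g. a key digest

# Participant blinds m with a random factor r coprime to n.
while True:
    r = random.randrange(2, n)
    if math.gcd(r, n) == 1:
        break
blinded = (m * pow(r, e, n)) % n

# Certifier signs the blinded value without learning m.
blind_sig = pow(blinded, d, n)        # equals m^d * r (mod n)

# Participant unblinds; the result is a valid signature on m itself.
sig = (blind_sig * pow(r, -1, n)) % n
assert pow(sig, e, n) == m            # anyone can verify the token
print("anonymous token:", (m, sig))
```

The pair (m, sig) plays the role of the anonymous token: it verifies under the certifier's public key, yet the certifier cannot link it to the blinded value it actually signed.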

Relevance:

20.00%

Publisher:

Abstract:

When the supply voltages are balanced and sinusoidal, load compensation can give source currents with both unity power factor (UPF) and perfect harmonic cancellation (PHC). Under distorted supply voltages, however, achieving both UPF and PHC currents is not possible; the two goals contradict each other, so an optimal trade-off between these two important compensation goals must be found. This paper presents an optimal control algorithm for load compensation under unbalanced and distorted supply voltages. In this algorithm, the source currents are compensated for reactive and imbalance components, with harmonic distortion held within set limits. By satisfying the harmonic distortion limits and the power balance, the algorithm yields the source currents that provide the maximum achievable power factor. Detailed MATLAB simulation results are presented to demonstrate the performance of the proposed optimal control algorithm.
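
A minimal sketch of the trade-off the paper targets, under strong simplifying assumptions (a three-harmonic voltage, source currents taken in phase with each voltage harmonic, illustrative numbers): maximise the power factor subject to power balance and a current-THD limit.

```python
import numpy as np
from scipy.optimize import minimize

V = np.array([230.0, 18.0, 12.0])     # rms voltage: fundamental, 5th, 7th
P_LOAD = 10_000.0                     # active power to deliver (W), assumed
THD_MAX = 0.05                        # allowed current THD, assumed

def neg_pf(I):
    """Negative power factor: P / (Vrms * Irms), to be minimised."""
    return -(V @ I) / (np.linalg.norm(V) * np.linalg.norm(I))

cons = [
    {"type": "eq",   "fun": lambda I: V @ I - P_LOAD},            # power balance
    {"type": "ineq", "fun": lambda I: THD_MAX * I[0]
                                      - np.linalg.norm(I[1:])},   # THD limit
]
res = minimize(neg_pf, x0=[P_LOAD / V[0], 0.1, 0.1],
               bounds=[(1e-3, None)] * 3, constraints=cons)
I = res.x
print("source currents (A):", np.round(I, 2))
print("power factor: %.4f, THD: %.3f"
      % (-res.fun, np.linalg.norm(I[1:]) / I[0]))
```

With this voltage, the unconstrained UPF solution (currents proportional to the voltage waveform) would exceed a 5% current THD, so the constraint binds and the optimiser returns the best power factor achievable within the limit, mirroring the trade-off the paper describes.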

Relevance:

20.00%

Publisher:

Abstract:

One major gap in transportation system safety management is the ability to assess the safety ramifications of design changes, both for new road projects and for modifications to existing roads. To fill this need, FHWA and its many partners are developing a safety forecasting tool, the Interactive Highway Safety Design Model (IHSDM), to be used by roadway design engineers, safety analysts, and planners throughout the United States. As such, the statistical models embedded in IHSDM will need to forecast safety impacts under a wide range of roadway configurations and environmental conditions, for a wide range of driver populations, and to capture elements of driving risk across states. One of the IHSDM algorithms developed by FHWA and its contractors forecasts accidents on rural road segments and at rural intersections. The methodological approach is to use predictive models for specific base conditions, with traffic volume as the sole explanatory variable for crashes, and then to apply regional or state calibration factors and accident modification factors (AMFs) to estimate the impact on accidents of geometric characteristics that differ from the base model conditions. In the majority of past approaches, AMFs are derived from parameter estimates associated with the explanatory variables. A recent study for FHWA used a multistate database to examine in detail the use of the algorithm with the base model-AMF approach, and explored alternative base model forms as well as full models that include nontraffic-related variables and other approaches to estimating AMFs. That research effort is reported here. The results support the IHSDM methodology.
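
The predictive chain described above composes multiplicatively, which a short sketch makes concrete: a base safety performance function driven by traffic volume only, a state calibration factor, and one AMF per geometric departure from base conditions. The base-model form and every coefficient below are illustrative assumptions, not values from the study.

```python
import math

def base_crashes_per_year(aadt, length_mi, a=-0.312, b=0.54):
    """Base model: crashes/yr = exp(a) * AADT^b * L (illustrative form,
    traffic volume as the sole explanatory variable)."""
    return math.exp(a) * aadt ** b * length_mi

def predicted_crashes(aadt, length_mi, calibration, amfs):
    """Base prediction x state calibration x product of AMFs."""
    n = base_crashes_per_year(aadt, length_mi) * calibration
    for amf in amfs:
        n *= amf                     # each AMF scales for one departure
    return n

# Example: 2-mile rural segment, 5,000 AADT, state calibration 1.1,
# narrow lanes (AMF 1.15) and flat terrain (AMF 0.95) -- all made up.
print(round(predicted_crashes(5000, 2.0, 1.1, [1.15, 0.95]), 2))
```

The study's question, in these terms, is whether deriving the AMF values from full-model parameter estimates or from alternative base-model forms changes the product enough to matter.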

Relevance:

20.00%

Publisher:

Abstract:

Establishing a nationwide Electronic Health Record (EHR) system has become a primary objective for many countries around the world, including Australia, in order to improve the quality of healthcare while at the same time decreasing its cost. Doing so will require federating the large number of patient data repositories currently in use throughout the country. However, implementation of EHR systems is being hindered by several obstacles, among them concerns about data privacy and trustworthiness. Current IT solutions fail to satisfy patients' privacy desires and do not provide a trustworthiness measure for medical data. This thesis starts with the observation that existing EHR system proposals suffer from five serious shortcomings that affect patients' privacy and safety and medical practitioners' trust in EHR data: accuracy and privacy concerns over linking patients' existing medical records; the inability of patients to control who accesses their private data; the inability to protect against inferences about patients' sensitive data; the lack of a mechanism for evaluating the trustworthiness of medical data; and the failure of current healthcare workflow processes to capture and enforce patients' privacy desires.

Following an action research method, this thesis addresses the above shortcomings by, firstly, proposing an architecture for linking electronic medical records in an accurate and private way in which patients are given control over what information can be revealed about them. This is accomplished by extending the structure and protocols introduced in federated identity management to link a patient's EHR to his or her existing medical records using pseudonym identifiers. Secondly, a privacy-aware access control model is developed to satisfy patients' privacy requirements. The model integrates three standard access control models in a way that gives patients access control over their private data and ensures that legitimate uses of EHRs are not hindered. Thirdly, a probabilistic approach for detecting and restricting inference channels arising from publicly available medical data is developed to guard against indirect access to a patient's private data. This approach is based on a Bayesian network and the causal probabilistic relations that exist between medical data fields; the resulting definitions and algorithms show how an inference channel can be detected and restricted to satisfy patients' expressed privacy goals. Fourthly, a medical data trustworthiness assessment model is developed to evaluate the quality of medical data by assessing the trustworthiness of its sources (e.g. a healthcare provider or medical practitioner). In this model, Beta and Dirichlet reputation systems are used to collect reputation scores about medical data sources, and these are used to compute the trustworthiness of medical data via subjective logic. Finally, healthcare workflow management processes are extended to capture and enforce patients' privacy policies. This is accomplished by developing a conceptual model that introduces new workflow notions to make the workflow management system aware of a patient's privacy requirements. These extensions are then implemented in the YAWL workflow management system.
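
As an illustration of the trustworthiness step, the sketch below uses the standard mapping from a Beta reputation system to a subjective-logic opinion (Jøsang's formulation): positive and negative ratings about a data source become a (belief, disbelief, uncertainty) triple whose probability expectation serves as the trust score. The rating counts are invented.

```python
def opinion(r, s, base_rate=0.5):
    """Map r positive / s negative ratings to a subjective-logic opinion."""
    k = r + s + 2.0                    # +2 from the non-informative Beta prior
    b, d, u = r / k, s / k, 2.0 / k    # belief, disbelief, uncertainty
    expectation = b + base_rate * u    # trust score in [0, 1]
    return b, d, u, expectation

# A well-rated healthcare provider vs. a new, unrated practitioner.
print(opinion(48, 2))    # high belief, little uncertainty
print(opinion(0, 0))     # no evidence: all uncertainty, score = base rate
```

Keeping uncertainty explicit is the point of the subjective-logic representation: a source with no rating history is distinguishable from one with a mixed history, even when their expected scores coincide.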