922 results for estimation of distribution algorithms


Relevance: 100.00%

Abstract:

Extensive use of the Internet, coupled with the rapid growth of e-commerce and m-commerce, has created a huge demand for information security. The Secure Socket Layer (SSL) protocol is the most widely used security protocol on the Internet that meets this demand. It provides protection against eavesdropping, tampering and forgery. The cryptographic algorithms RC4 and HMAC have been used to achieve security services such as confidentiality and authentication in SSL. However, recent attacks against RC4 and HMAC have raised questions about the confidence placed in these algorithms. Hence, two novel cryptographic algorithms, MAJE4 and MACJER-320, have been proposed as substitutes for them. The focus of this work is to demonstrate the performance of these new algorithms and to suggest them as dependable alternatives that satisfy the need for security services in SSL. The performance evaluation has been carried out using a practical implementation method.
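The MAJE4 and MACJER-320 algorithms themselves are not publicly available, so as a minimal sketch of the kind of "practical implementation" benchmark the abstract describes, the harness below times the standard-library HMAC (with SHA-256 as a stand-in primitive) and reports throughput; the key size, payload size and round count are illustrative assumptions, not the thesis's parameters.

```python
import hmac, hashlib, time

def mac_throughput(key: bytes, payload: bytes, rounds: int = 200) -> float:
    """Return HMAC-SHA256 throughput in MB/s over `rounds` passes."""
    start = time.perf_counter()
    for _ in range(rounds):
        hmac.new(key, payload, hashlib.sha256).digest()
    elapsed = time.perf_counter() - start
    return (len(payload) * rounds) / (elapsed * 1024 * 1024)

# Illustrative: a 32-byte key over 64 KiB messages.
mbps = mac_throughput(b"\x01" * 32, b"\x00" * 65536)
```

The same loop, swapped over each candidate MAC, gives a like-for-like throughput comparison on one machine.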

Relevance: 100.00%

Abstract:

The overall focus of the thesis is the systematics, germplasm evaluation, and pattern of distribution and abundance of the freshwater fishes of Kerala (India). Biodiversity is the measure of the variety of life. With the signing of the Convention on Biological Diversity, countries acquired both the right and the responsibility to conserve and utilize their diverse resources for the betterment of mankind in a sustainable way. South-East Asia, along with Africa and South America, is considered to be among the most biodiversity-rich regions in the world. The tremendous potential associated with the sustainable utilization of the fish germplasm resources of the various river systems of Kerala for food, aquaculture and ornamental purposes has to be fully tapped for the economic upliftment of the fishing community and for the equitable sharing of benefits among mankind, without compromising the conservation of rare and unique fish germplasm resources for future generations. The study was carried out from April 2000 to December 2004. Twenty-five major river systems of Kerala were surveyed for fish fauna to delineate the pattern of distribution and abundance of fishes, both seasonally and geographically. The results of the germplasm inventory and evaluation of fish species are presented both for the state as a whole and river-wise. The evaluation of fish species for commercial utilization revealed that, of the 145 species recorded, 76 are ornamental, 47 are food fishes and 22 are cultivable; 21 species are strictly endemic to the rivers of Kerala. The revalidation of the biodiversity status of the fishes, assessed on the basis of IUCN criteria, is alarming: a high percentage of the fishes (59 spp.) belong to the threatened category, comprising 8 critically endangered (CR), 36 endangered (EN) and 15 vulnerable (VU) species. The results of the present study indicate the existence of several fish species new to science in the streams and rivulets located in remote forest areas, and new, dedicated surveys are therefore required to bring to light species new to science and new distributional records for these river systems. The germplasm evaluation revealed that Kerala harbours many potentially valuable endemic ornamental and cultivable fishes. It is imperative to utilize these species sustainably to improve the aquaculture production and aquarium trade of the country, which would fetch more income and generate employment.
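As a quick consistency check on the figures reported above (all counts are taken directly from the abstract):

```python
# IUCN threat-category counts reported in the abstract.
iucn = {"CR": 8, "EN": 36, "VU": 15}
threatened = sum(iucn.values())         # should equal the 59 spp. cited
total_species = 145
threatened_share = round(100 * threatened / total_species, 1)  # percent
```

The three categories do sum to the 59 threatened species cited, about 40.7% of the 145 species recorded.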

Relevance: 100.00%

Abstract:

The present study is intended to provide a new scientific approach to the cost engineering problems encountered in the chemical industries of our nation. The problem addressed is the cost estimation of equipment, especially pressure vessels, when setting up chemical industries. The present study attempts to develop a model for such cost estimation, which in turn, it is hoped, will go a long way towards solving this and related problems in forecasting the cost of setting up chemical plants.
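The abstract does not give the form of the proposed model. A common starting point in equipment cost estimation, sketched here with illustrative numbers only, is capacity-exponent scaling (the classical "six-tenths rule"); the thesis's own model may differ.

```python
def scaled_cost(base_cost: float, base_capacity: float,
                target_capacity: float, exponent: float = 0.6) -> float:
    """Capacity-exponent ("six-tenths rule") cost scaling:
    C2 = C1 * (S2 / S1) ** n.  Values here are illustrative."""
    return base_cost * (target_capacity / base_capacity) ** exponent

# A vessel of twice the capacity costs ~2**0.6 = 1.52x, not 2x.
cost = scaled_cost(100_000.0, 10.0, 20.0)
```

The sub-linear exponent captures the economy of scale typical of pressure vessels and similar process equipment.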

Relevance: 100.00%

Abstract:

So far, in the bivariate setup, the analysis of lifetime (failure time) data with multiple causes of failure has been done by treating each cause of failure separately, with failures from other causes considered as independent censoring. This approach is unrealistic in many situations. For example, in the analysis of mortality data on married couples, one would be interested in comparing the hazards for the same cause of death as well as in checking whether death due to one cause is important for the partner's risk of death from other causes. In reliability analysis, one often has systems with more than one component, and many systems, subsystems and components have more than one cause of failure. The design of high-reliability systems generally requires that the individual system components have extremely high reliability, even after long periods of time. Knowledge of the failure behaviour of a component can lead to savings in its cost of production and maintenance and, in some cases, to the preservation of human life. For the purpose of improving reliability, it is necessary to identify the cause of failure down to the component level. By treating each cause of failure separately, with failures from other causes considered as independent censoring, the analysis of lifetime data would be incomplete. Motivated by this, we introduce a new approach for the analysis of bivariate competing risk data using the bivariate vector hazard rate of Johnson and Kotz (1975).
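For reference, the vector hazard rate of Johnson and Kotz (1975) for a bivariate survival function \(S(t_1, t_2)\) is commonly written componentwise as

\[
\mathbf{h}(t_1, t_2) = \left( -\frac{\partial}{\partial t_1} \log S(t_1, t_2),\; -\frac{\partial}{\partial t_2} \log S(t_1, t_2) \right),
\]

each component giving the instantaneous failure rate of one unit conditional on both units having survived up to \((t_1, t_2)\). This is a standard textbook form of the definition; the abstract does not reproduce the formula itself.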

Relevance: 100.00%

Abstract:

An accurate mass formula at finite temperature has been used to obtain a more precise estimation of temperature effects on fission barriers calculated within the liquid drop model.


Relevance: 100.00%

Abstract:

Urbanization refers to the process by which an increasing proportion of a population lives in cities and suburbs. Urbanization fuels the alteration of the land use/land cover pattern of a region, including an increase in built-up area, leading to imperviousness of the ground surface. With increasing urbanization and population pressure, the impervious areas in cities are growing fast. An impervious surface is an anthropogenically modified surface that prevents water from infiltrating into the soil. Surface imperviousness mapping is important for studies related to water cycling, water quality, soil erosion, flood water drainage, non-point source pollution, the urban heat island effect and urban hydrology. The present study estimates the Total Impervious Area (TIA) of the city of Kochi using a high-resolution satellite image (LISS IV, 5 m resolution). Additionally, the study maps the Effective Impervious Area (EIA) by coupling the capabilities of GIS and remote sensing. A land use/land cover map of the study area was prepared from the LISS IV image acquired for the year 2012, and the classes were merged to prepare a map showing pervious and impervious areas. Supervised Maximum Likelihood Classification (MLC), a simple but accurate method for image classification, was used in calculating the TIA, and an overall classification accuracy of 86.33% was obtained. Water bodies are 100% pervious, whereas urban built-up areas are 100% impervious. Further, based on the percentage of imperviousness, the Total Impervious Area is categorized into various classes.
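Once a classified land use/land cover raster is in hand, the TIA step reduces to counting pixels in the impervious classes. A minimal sketch with NumPy follows; the class codes and the toy 3x3 raster are hypothetical, not the study's actual classification scheme.

```python
import numpy as np

# Hypothetical class codes for a classified raster:
# 0 = water (pervious), 1 = vegetation (pervious), 2 = built-up (impervious).
classified = np.array([[2, 2, 1],
                       [0, 2, 1],
                       [1, 1, 2]])

impervious_classes = [2]
impervious_mask = np.isin(classified, impervious_classes)

# Total Impervious Area as a percentage of all classified pixels.
tia_percent = 100.0 * impervious_mask.sum() / classified.size
```

Multiplying the pixel count by the ground area of one pixel (25 m² at 5 m resolution) converts the same mask into TIA in absolute units.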

Relevance: 100.00%

Abstract:

This work projects photoluminescence (PL) as an alternative technique for estimating the order of resistivity of zinc oxide (ZnO) thin films. ZnO thin films deposited using chemical spray pyrolysis (CSP), with deposition parameters such as solvent, spray rate and precursor pH varied, have been used for this study. Variation in the deposition conditions has a tremendous impact on the luminescence properties as well as on the resistivity. Two emissions could be recorded for all samples: the near-band-edge emission (NBE) at 380 nm and the deep-level emission (DLE) at ~500 nm, which are competing in nature. It is observed that the ratio of the DLE intensity to the NBE intensity (the DLE/NBE ratio) can be reduced by controlling oxygen incorporation in the sample. Electrical measurements indicate that restricting oxygen incorporation reduces resistivity considerably. The variation of the DLE/NBE ratio and of the resistivity across samples prepared under different deposition conditions is similar in nature, and the DLE/NBE ratio was always less than the resistivity by an order of magnitude for all samples. Thus, from PL measurements alone, the order of resistivity of the samples can be estimated.
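The DLE/NBE ratio the abstract relies on can be extracted from a recorded PL spectrum by comparing peak intensities in the two emission bands. The sketch below uses a synthetic two-Gaussian spectrum with made-up amplitudes and widths; only the band centres (380 nm and ~500 nm) come from the abstract.

```python
import numpy as np

# Synthetic PL spectrum: wavelength (nm) vs. intensity (arb. units).
wavelength = np.linspace(350, 600, 501)
spectrum = (np.exp(-((wavelength - 380) / 8.0) ** 2) * 1.0      # NBE band
            + np.exp(-((wavelength - 500) / 30.0) ** 2) * 0.4)  # DLE band

def band_intensity(wl, spec, center, half_width):
    """Peak intensity within a window around `center` (nm)."""
    window = (wl > center - half_width) & (wl < center + half_width)
    return spec[window].max()

dle_nbe_ratio = (band_intensity(wavelength, spectrum, 500, 40)
                 / band_intensity(wavelength, spectrum, 380, 20))
```

Tracking this ratio across deposition conditions, per the abstract, tracks the order of the resistivity.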

Relevance: 100.00%

Abstract:

Software systems are progressively being deployed in many facets of human life. The implications of the failure of such systems have a varied impact on their customers. The fundamental aspect that supports a software system is a focus on quality. Reliability describes the ability of a system to function under a specified environment for a specified period of time, and is used to measure quality objectively. Evaluating the reliability of a computing system involves computing both hardware and software reliability. Most earlier works focused on software reliability with no consideration for the hardware parts, or vice versa. However, a complete estimation of the reliability of a computing system requires these two elements to be considered together, and thus demands a combined approach. The present work focuses on this and presents a model for evaluating the reliability of a computing system. The method involves identifying failure data for the hardware and software components and building a model on it to predict reliability. To develop such a model, the focus is on systems based on Open Source Software, since there is an increasing trend towards their use and only a few studies have been reported on modelling and measuring the reliability of such products. The present work includes a thorough study of the role of Free and Open Source Software, an evaluation of reliability growth models, and presents an integrated model for predicting the reliability of a computational system. The developed model has been compared with existing models and its usefulness is discussed.
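The abstract does not specify how the hardware and software terms are combined. A minimal sketch, under the simplifying assumptions of independent failures and constant failure rates (a series system with exponential reliability), is:

```python
import math

def reliability(failure_rate: float, t: float) -> float:
    """Exponential reliability R(t) = exp(-lambda * t)."""
    return math.exp(-failure_rate * t)

def system_reliability(hw_rate: float, sw_rate: float, t: float) -> float:
    """Series combination assuming independent hardware and software
    failures: R_sys(t) = R_hw(t) * R_sw(t)."""
    return reliability(hw_rate, t) * reliability(sw_rate, t)

# Illustrative rates (failures per hour) over a 1000-hour mission.
r = system_reliability(1e-5, 5e-5, 1000.0)
```

A reliability growth model of the kind the thesis evaluates would replace the constant software rate with one that decreases as faults are found and fixed.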

Relevance: 100.00%

Abstract:

Correlation energies for all isoelectronic sequences of 2 to 20 electrons and Z = 2 to 25 are obtained by taking differences between theoretical total energies from Dirac-Fock calculations and experimental total energies. These are purely relativistic correlation energies, because relativistic and QED effects have already been accounted for. The theoretical as well as the experimental values are analysed critically in order to obtain values as accurate as possible. The correlation energies obtained show an essentially consistent behaviour from Z = 2 to 17. For Z > 17, inconsistencies occur, indicating errors in the experimental values, which become very large for Z > 25.
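The working definition implied by the abstract can be written, for a sequence with \(N\) electrons and nuclear charge \(Z\), as

\[
E_{\mathrm{corr}}(Z, N) = E_{\mathrm{exp}}^{\mathrm{tot}}(Z, N) - E_{\mathrm{DF}}^{\mathrm{tot}}(Z, N),
\]

where \(E_{\mathrm{DF}}^{\mathrm{tot}}\) is the Dirac-Fock total energy (the notation is ours; the Dirac-Fock reference already contains the relativistic effects, and QED effects are accounted for per the abstract, so the residual difference is the relativistic correlation energy).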

Relevance: 100.00%

Abstract:

We describe the key role played by partial evaluation in the Supercomputing Toolkit, a parallel computing system for scientific applications that effectively exploits the vast amount of parallelism exposed by partial evaluation. The Supercomputing Toolkit parallel processor and its associated partial evaluation-based compiler have been used extensively by scientists at MIT, and have made possible recent results in astrophysics showing that the motion of the planets in our solar system is chaotically unstable.
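The Toolkit's partial-evaluation-based compiler operates on Scheme scientific code and is not reproduced here. As a language-agnostic illustration of the underlying idea, the sketch below specializes a power function on a statically known exponent, performing all exponent-dependent work once, at "compile time", and emitting residual code in which the loop has been fully unrolled.

```python
def specialize_power(n: int):
    """A tiny partial evaluator: given a static exponent n, emit a
    residual function of x alone with the multiplication fully unrolled."""
    body = " * ".join(["x"] * n) if n > 0 else "1"
    src = f"def power_{n}(x):\n    return {body}\n"
    namespace = {}
    exec(src, namespace)   # specialization happens here, once
    return namespace[f"power_{n}"]

cube = specialize_power(3)   # residual code: return x * x * x
value = cube(4)              # -> 64, with no loop at run time
```

Applied to a numerical integrator whose structure is fixed, the same idea exposes the straight-line arithmetic that the Toolkit's parallel processor then schedules.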

Relevance: 100.00%

Abstract:

The capability of estimating the walking direction of people would be useful in many applications, such as those involving autonomous cars and robots. We introduce an approach for estimating the walking direction of people from images, based on learning the correct classification of a still image using SVMs. We find that the performance of the system can be improved by classifying each image of a walking sequence and combining the outputs of the classifier. Experiments were performed to evaluate our system and to estimate the trade-off between the number of images in a walking sequence and performance.
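The combination step described above can be sketched as a vote over per-frame classifier outputs. The direction labels and the sample sequence below are illustrative assumptions, not the paper's actual class set or combination rule.

```python
from collections import Counter

# Hypothetical per-frame SVM predictions over one walking sequence,
# each a discrete walking-direction label.
frame_predictions = ["N", "NE", "N", "N", "E", "N", "NE"]

def combine_by_vote(predictions):
    """Majority vote across the frames of a sequence."""
    return Counter(predictions).most_common(1)[0][0]

direction = combine_by_vote(frame_predictions)
```

Lengthening the sequence gives the vote more evidence per decision at the cost of latency, which is exactly the trade-off the experiments measure.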

Relevance: 100.00%

Abstract:

Interaction effects are usually modeled by means of moderated regression analysis. Structural equation models with non-linear constraints make it possible to estimate interaction effects while correcting for measurement error. Of the various specifications, Jöreskog and Yang's (1996, 1998), likely the most parsimonious, has been chosen and further simplified. Up to now, only direct effects have been specified, thus wasting much of the capability of the structural equation approach. This paper presents and discusses an extension of Jöreskog and Yang's specification that can handle direct, indirect and interaction effects simultaneously. The model is illustrated by a study of the effects of an interactive style of use of budgets on both company innovation and performance.
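For contrast with the structural equation approach, the baseline moderated regression mentioned above fits a product term alongside the main effects. A minimal sketch on simulated data (all coefficients and the noise level are made up for illustration; unlike the SEM specification, this does not correct for measurement error):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)   # predictor
z = rng.normal(size=n)   # moderator
# Simulated outcome with a true interaction effect b3 = 0.5.
y = 1.0 + 2.0 * x + 0.8 * z + 0.5 * x * z + rng.normal(scale=0.1, size=n)

# Moderated regression: y = b0 + b1*x + b2*z + b3*(x*z) + e
design = np.column_stack([np.ones(n), x, z, x * z])
coeffs, *_ = np.linalg.lstsq(design, y, rcond=None)
b3_hat = coeffs[3]   # estimated interaction effect
```

When x and z are measured with error, the product term inherits and compounds that error, which is the bias the constrained SEM specification is designed to remove.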

Relevance: 100.00%

Abstract:

We present a computer vision system that couples omnidirectional vision with structured light, with the aim of obtaining depth information over a 360-degree field of view. The approach proposed in this article combines an omnidirectional camera with a panoramic laser projector. The article shows how the sensor is modelled, and its accuracy is demonstrated by means of experimental results. The proposed sensor provides useful information for robot navigation applications, pipe inspection, 3D scene modelling, etc.
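Depth recovery with structured light rests on triangulation between the laser and the camera. A minimal planar sketch with hypothetical geometry follows; the article's calibrated omnidirectional model (mirror plus panoramic projector) is considerably more involved.

```python
import math

def depth_from_triangulation(baseline: float, camera_angle: float,
                             laser_angle: float) -> float:
    """Planar triangulation: given the baseline between laser and camera
    and the two ray angles (radians, measured from the baseline), return
    the camera-to-point range via the law of sines."""
    point_angle = math.pi - camera_angle - laser_angle
    return baseline * math.sin(laser_angle) / math.sin(point_angle)

# Illustrative: 0.2 m baseline, camera ray at 45 degrees, laser ray
# perpendicular to the baseline.
d = depth_from_triangulation(0.2, math.pi / 4, math.pi / 2)
```

Repeating this for every azimuth of the panoramic laser stripe yields the 360-degree depth profile the sensor is built for.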