990 results for Estimation errors


Relevance:

20.00%

Publisher:

Abstract:

The present study is intended to provide a new scientific approach to the solution of the cost engineering problems encountered in the chemical industries of our nation. The problem is that of cost estimation of equipment, especially of pressure vessels, when setting up chemical industries. The present study attempts to develop a model for such cost estimation, which in turn is hoped to go a long way towards solving this and related problems in forecasting the cost of setting up chemical plants.
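The model itself is not reproduced in this abstract; as a hedged illustration of the kind of correlation equipment-cost models are typically built on, the classic power-law cost-capacity relationship can be sketched as follows (the coefficient a and exponent b are hypothetical parameters fitted to historical vessel-cost data, not values from this study):

```latex
% Power-law cost-capacity correlation (the "six-tenths rule" when b ~ 0.6):
%   C = estimated equipment cost, S = size/capacity parameter of the vessel
%   a, b = parameters fitted to historical cost data (hypothetical here)
C = a\,S^{\,b}, \qquad
\frac{C_2}{C_1} = \left(\frac{S_2}{S_1}\right)^{\!b}, \quad b \approx 0.6
```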

Relevance:

20.00%

Publisher:

Abstract:

The results of an investigation on the limits of the random errors contained in the basic data of Physical Oceanography, and their propagation through the computational procedures, are presented in this thesis. It also suggests a method which increases the reliability of the derived results. The thesis is presented in eight chapters, including the introductory chapter. Chapter 2 discusses the general theory of errors that is relevant in the context of the propagation of errors in Physical Oceanographic computations. The error components contained in the independent oceanographic variables, namely temperature, salinity and depth, are delineated and quantified in chapter 3. Chapter 4 discusses and derives the magnitude of errors in the computation of the dependent oceanographic variables (density in situ, σt, specific volume and specific volume anomaly) due to the propagation of errors contained in the independent oceanographic variables. The errors propagated into the computed values of the derived quantities, namely dynamic depth and relative currents, are estimated and presented in chapter 5. Chapter 6 reviews the existing methods for the identification of the level of no motion and suggests a method for the identification of a reliable zero reference level. Chapter 7 discusses the available methods for the extension of the zero reference level into shallow regions of the oceans and suggests a new, more reliable method. A procedure of graphical smoothing of dynamic topographies between the error limits to provide more reliable results is also suggested in this chapter. Chapter 8 deals with the computation of the geostrophic current from these smoothed values of dynamic heights, with reference to the selected zero reference level. The summary and conclusions are also presented in this chapter.
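The propagation analysis described in chapters 2 to 5 rests on first-order error propagation; as a minimal sketch (the thesis may refine it), for a derived quantity f computed from the independently measured variables T, S and z it reads:

```latex
% First-order propagation of independent random errors into a derived
% quantity f(T, S, z), e.g. density in situ or specific volume anomaly:
\sigma_f^2 = \left(\frac{\partial f}{\partial T}\right)^{2}\sigma_T^2
           + \left(\frac{\partial f}{\partial S}\right)^{2}\sigma_S^2
           + \left(\frac{\partial f}{\partial z}\right)^{2}\sigma_z^2
```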

Relevance:

20.00%

Publisher:

Abstract:

So far, in the bivariate set-up, the analysis of lifetime (failure time) data with multiple causes of failure has been done by treating each cause of failure separately, with failures from other causes considered as independent censoring. This approach is unrealistic in many situations. For example, in the analysis of mortality data on married couples, one would be interested to compare the hazards for the same cause of death as well as to check whether death due to one cause is more important for the partner's risk of death from other causes. In reliability analysis, one often has systems with more than one component, and many systems, subsystems and components have more than one cause of failure. Design of high-reliability systems generally requires that the individual system components have extremely high reliability, even after long periods of time. Knowledge of the failure behaviour of a component can lead to savings in its cost of production and maintenance and, in some cases, to the preservation of human life. For the purpose of improving reliability, it is necessary to identify the cause of failure down to the component level. By treating each cause of failure separately, with failures from other causes considered as independent censoring, the analysis of lifetime data would be incomplete. Motivated by this, we introduce a new approach for the analysis of bivariate competing risk data using the bivariate vector hazard rate of Johnson and Kotz (1975).
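For reference, the hazard-gradient definition of Johnson and Kotz (1975) on which the approach builds can be stated as follows (a minimal statement of the standard definition; the competing-risks extension is developed in the work itself):

```latex
% Vector (gradient) hazard rate for a bivariate lifetime (T_1, T_2)
% with joint survival function \bar{S}(t_1, t_2) = P(T_1 > t_1, T_2 > t_2):
h(t_1, t_2) = \bigl(h_1(t_1, t_2),\, h_2(t_1, t_2)\bigr), \qquad
h_i(t_1, t_2) = -\frac{\partial}{\partial t_i} \log \bar{S}(t_1, t_2), \quad i = 1, 2
```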

Relevance:

20.00%

Publisher:

Abstract:

This thesis investigates the potential use of zero-crossing information for speech sample estimation. It provides a new method to estimate speech samples using composite zero-crossings. A simple linear interpolation technique is developed for this purpose. By using this method, the A/D converter can be avoided in a speech coder. The newly proposed zero-crossing sampling theory is supported with results of computer simulations using real speech data. The thesis also presents two methods for voiced/unvoiced classification. One of these methods is based on a distance measure which is a function of the short-time zero-crossing rate and short-time energy of the signal. The other is based on the attractor dimension and entropy of the signal. Of these two methods, the first is simple and requires only very few computations compared to the other. This method is used in a later chapter to design an enhanced Adaptive Transform Coder. The later part of the thesis addresses a few problems in Adaptive Transform Coding and presents an improved ATC. The transform coefficient with maximum amplitude is considered as 'side information'. This enables more accurate bit assignment and step-size computation. A new bit reassignment scheme is also introduced in this work. Finally, an ATC which applies switching between the Discrete Cosine Transform and the Discrete Walsh-Hadamard Transform for voiced and unvoiced speech segments, respectively, is presented. Simulation results are provided to show the improved performance of the coder.
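The thesis's own distance measure is not reproduced in this abstract; as a minimal sketch of the general idea (the prototype values and the Euclidean distance are assumptions, not the thesis's formula), a frame can be labelled by comparing its short-time zero-crossing rate and energy against voiced/unvoiced prototypes:

```python
import numpy as np

def short_time_features(frame):
    """Short-time zero-crossing rate and log energy of one speech frame."""
    zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2.0  # crossings per sample
    energy = np.log10(np.sum(frame ** 2) + 1e-12)          # log short-time energy
    return np.array([zcr, energy])

# Hypothetical class prototypes: voiced speech tends to low ZCR / high energy,
# unvoiced speech to high ZCR / low energy (values are illustrative only).
VOICED = np.array([0.05, -1.0])
UNVOICED = np.array([0.30, -3.0])

def classify_frame(frame):
    """Label a frame voiced/unvoiced by distance to the two prototypes."""
    f = short_time_features(frame)
    d_v = np.linalg.norm(f - VOICED)
    d_u = np.linalg.norm(f - UNVOICED)
    return "voiced" if d_v < d_u else "unvoiced"
```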

Relevance:

20.00%

Publisher:

Abstract:

Measurement is the act, or the result, of a quantitative comparison between a given quantity and a quantity of the same kind chosen as a unit. It is generally agreed that all measurements contain errors. In a measuring system where a human being takes the measurement with a measuring instrument using a preset process, the measurement error could be due to the instrument, the process or the human being involved. The first part of the study is devoted to understanding the human errors in measurement. For that, selected person-related and work-related factors that could affect measurement errors have been identified. Though these are well known, the exact extent of the error and the extent of the effect of different factors on human errors in measurement are less reported. Characterization of human errors in measurement is done by conducting an experimental study using different subjects, where the factors were changed one at a time and the measurements made by them recorded. From the pre-experiment survey research studies, it is observed that the respondents could not give correct answers to questions related to the correct values [extent] of human-related measurement errors. This confirmed the fears expressed regarding lack of knowledge about the extent of human-related measurement errors among professionals associated with quality. In the post-experiment phase of the survey study, however, it is observed that the answers regarding the extent of human-related measurement errors improved significantly, since the answer choices were provided based on the experimental study. It is hoped that this work will help users of measurement in practice to better understand and manage the phenomenon of human-related errors in measurement.
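A compact way to frame the three error sources named above is an additive variance decomposition; this is a common modelling assumption (independence of the sources), not a formula quoted from the study:

```latex
% Assuming independent error contributions from instrument, process
% and human operator, the total measurement error variance decomposes as
\sigma^2_{\text{total}} = \sigma^2_{\text{instrument}}
  + \sigma^2_{\text{process}} + \sigma^2_{\text{human}}
```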

Relevance:

20.00%

Publisher:

Abstract:

An accurate mass formula at finite temperature has been used to obtain a more precise estimation of temperature effects on fission barriers calculated within the liquid drop model.

Relevance:

20.00%

Publisher:

Abstract:

Underwater target localization and tracking attract tremendous research interest due to the various impediments to the estimation task caused by the noisy ocean environment. This thesis envisages the implementation of a prototype automated system for underwater target localization, tracking and classification using passive listening buoy systems and target identification techniques. An autonomous three-buoy system has been developed and field trials have been conducted successfully. Inaccuracies in the localization results, due to changes in environmental parameters, measurement errors and theoretical approximations, are reduced using a Kalman filter approach. Simulation studies have been conducted for the tracking of targets under different scenarios, even in maneuvering situations. The system can also be used for classifying unknown targets by extracting features of the noise emanations from the targets.
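The thesis's exact filter formulation is not given in this abstract; as a minimal sketch of the Kalman refinement step it names, under assumed matrices (constant-velocity motion model, buoy-derived 2-D position fixes as measurements, illustrative noise levels):

```python
import numpy as np

dt = 1.0  # update interval in seconds (assumed)

# Constant-velocity model: state = [x, y, vx, vy]
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],      # only position fixes are observed
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)             # process noise covariance (assumed)
R = 25.0 * np.eye(2)             # measurement noise of the buoy fixes (assumed)

def kalman_step(x, P, z):
    """One predict/update cycle refining a noisy localization fix z = [x, y]."""
    # Predict forward one interval
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the buoy-derived position fix
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P
```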

Relevance:

20.00%

Publisher:

Abstract:

The present study is an attempt to highlight the problem of typographical errors in OPACs. The errors made while typing catalogue entries, as well as while importing bibliographical records from other libraries, exist unnoticed by librarians, resulting in the non-retrieval of available records and affecting the quality of OPACs. This paper follows previous research on the topic, mainly by Jeffrey Beall and Terry Ballard. The word "management" was chosen from the list of words likely to be misspelled identified by previous research. It was found that the word is wrongly entered in several forms in local, national and international OPACs, justifying Ballard's observation that typos occur almost everywhere. Though many corrective measures have been proposed and are in use, the study asserts that human effort is needed to get rid of the problem. The paper is also an invitation to library professionals and system designers to construct a strategy to solve the issue.
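The paper does not describe a detection tool; as a hedged sketch of how such misspelled forms could be flagged automatically (the similarity threshold and the use of difflib are assumptions, not the paper's method):

```python
import difflib

TARGET = "management"

def likely_typos(index_terms, threshold=0.8):
    """Flag catalogue index terms that are probably misspellings of TARGET.

    A term counts as a likely typo if it differs from TARGET but its
    similarity ratio to TARGET exceeds the (assumed) threshold.
    """
    hits = []
    for term in index_terms:
        t = term.lower()
        if t != TARGET and difflib.SequenceMatcher(None, t, TARGET).ratio() >= threshold:
            hits.append(term)
    return hits

print(likely_typos(["managment", "mangement", "management", "menagement", "marketing"]))
# -> ['managment', 'mangement', 'menagement']
```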

Relevance:

20.00%

Publisher:

Abstract:

A new localization approach to increase the navigational capabilities and object manipulation of autonomous mobile robots, based on an encoded infrared sheet-of-light beacon system which provides position errors smaller than 0.02 m, is presented in this paper. To achieve this minimal position error, a resolution enhancement technique has been developed that utilises inbuilt odometric/optical-flow sensor information. The system respects strong low-cost constraints by using an innovative assembly for the digitally encoded infrared transmitter. For better guidance of mobile robot vehicles, an online traffic-signalling capability is also incorporated. Other added features are its low computational complexity and online localization capability, all without any estimation uncertainty. The constructional details, experimental results and computational methodologies of the system are also described.
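The paper's actual decoding scheme is not given in this abstract; as a loose sketch of the coarse-plus-fine idea it describes (the beacon map, geometry and values below are entirely hypothetical), each encoded light sheet supplies an absolute fix and the odometry supplies the fine offset since the last crossing:

```python
# Hypothetical map: decoded beacon code -> known position of that light
# sheet along the track, in metres (assumed geometry, illustrative values).
BEACON_POSITIONS = {0b1010: 1.00, 0b1011: 2.00, 0b1100: 3.00}

def refined_position(last_code, odometry_since_crossing):
    """Coarse fix from the decoded beacon, refined by dead-reckoned offset."""
    coarse = BEACON_POSITIONS[last_code]      # absolute fix at the sheet
    return coarse + odometry_since_crossing   # odometric/optical-flow offset

print(refined_position(0b1011, 0.013))  # -> 2.013
```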

Relevance:

20.00%

Publisher:

Abstract:

Urbanization refers to the process in which an increasing proportion of a population lives in cities and suburbs. Urbanization fuels the alteration of the land use/land cover pattern of a region, including an increase in built-up area, leading to imperviousness of the ground surface. With increasing urbanization and population pressures, the impervious areas in cities are increasing fast. An impervious surface is an anthropogenically modified surface that prevents water from infiltrating into the soil. Surface imperviousness mapping is important for studies related to water cycling, water quality, soil erosion, flood water drainage, non-point source pollution, the urban heat island effect and urban hydrology. The present study estimates the Total Impervious Area (TIA) of the city of Kochi using a high-resolution satellite image (LISS IV, 5 m resolution). Additionally, the study maps the Effective Impervious Area (EIA) by coupling the capabilities of GIS and Remote Sensing. A land use/land cover map of the study area was prepared from the LISS IV image acquired for the year 2012. The classes were merged to prepare a map showing pervious and impervious areas. Supervised Maximum Likelihood Classification (supervised MLC), a simple but accurate method for image classification, is used in calculating TIA, and an overall classification accuracy of 86.33% was obtained. Water bodies are 100% pervious, whereas urban built-up areas are 100% impervious. Further, based on the percentage of imperviousness, the Total Impervious Area is categorized into various classes.
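The study names supervised MLC as the classifier; as a minimal sketch of how Gaussian maximum likelihood classification assigns each pixel (the training statistics, band count and function names are assumptions):

```python
import numpy as np

def mlc_classify(pixels, class_means, class_covs):
    """Gaussian maximum likelihood classification of multispectral pixels.

    pixels: (N, B) array of B-band pixel vectors.
    class_means: list of (B,) mean vectors, one per training class.
    class_covs: list of (B, B) covariance matrices, one per class.
    Returns the index of the most likely class for each pixel.
    """
    scores = []
    for mu, cov in zip(class_means, class_covs):
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        d = pixels - mu
        # Gaussian log-likelihood up to a constant (discriminant function):
        # g_k(x) = -0.5 ln|C_k| - 0.5 (x - m_k)^T C_k^{-1} (x - m_k)
        g = -0.5 * logdet - 0.5 * np.einsum("ij,jk,ik->i", d, inv, d)
        scores.append(g)
    return np.argmax(np.stack(scores, axis=1), axis=1)
```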

Relevance:

20.00%

Publisher:

Abstract:

This work projects photoluminescence (PL) as an alternative technique to estimate the order of resistivity of zinc oxide (ZnO) thin films. ZnO thin films, deposited using chemical spray pyrolysis (CSP) by varying deposition parameters such as solvent, spray rate, pH of the precursor, and so forth, have been used for this study. Variation in the deposition conditions has a tremendous impact on the luminescence properties as well as on resistivity. Two emissions could be recorded for all samples: the near-band-edge emission (NBE) at 380 nm and the deep-level emission (DLE) at ~500 nm, which are competing in nature. It is observed that the ratio of intensities of DLE to NBE (I_DLE/I_NBE) can be reduced by controlling oxygen incorporation in the sample. Electrical measurements indicate that restricting oxygen incorporation reduces resistivity considerably. The variation of I_DLE/I_NBE and of resistivity for samples prepared under different deposition conditions is similar in nature. I_DLE/I_NBE was always lower than the resistivity value by an order of magnitude for all samples. Thus, from PL measurements alone, the order of resistivity of the samples can be estimated.
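As a minimal sketch of how the diagnostic ratio can be read off a recorded PL spectrum (the function name and peak-window heuristic are assumptions, not the paper's procedure):

```python
import numpy as np

def dle_nbe_ratio(wavelength_nm, intensity, nbe=380.0, dle=500.0, window=10.0):
    """Ratio of deep-level to near-band-edge PL emission intensities.

    Peak intensities are read as the maximum within +/- window nm of the
    nominal emission wavelengths (window width is an assumption).
    """
    def peak(center):
        mask = np.abs(wavelength_nm - center) <= window
        return intensity[mask].max()
    return peak(dle) / peak(nbe)
```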

Relevance:

20.00%

Publisher:

Abstract:

Software systems are progressively being deployed in many facets of human life. The implications of the failure of such systems have a varied impact on their customers. The fundamental aspect that supports a software system is a focus on quality. Reliability describes the ability of a system to function under a specified environment for a specified period of time, and is used to objectively measure quality. Evaluation of the reliability of a computing system involves computation of both hardware and software reliability. Most earlier works focused on software reliability with no consideration for hardware parts, or vice versa. However, a complete estimation of the reliability of a computing system requires these two elements to be considered together, and thus demands a combined approach. The present work focuses on this and presents a model for evaluating the reliability of a computing system. The method involves identifying the failure data for hardware components and software components, and building a model based on it to predict reliability. To develop such a model, focus is given to systems based on Open Source Software, since there is an increasing trend towards its use and only a few studies have been reported on the modeling and measurement of the reliability of such products. The present work includes a thorough study of the role of Free and Open Source Software, an evaluation of reliability growth models, and presents an integrated model for the prediction of the reliability of a computational system. The developed model has been compared with existing models and its usefulness is discussed.
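The work's integrated model is not reproduced in this abstract; as a minimal sketch of the combined idea under common textbook assumptions (series structure, exponential hardware lifetimes, and the Goel-Okumoto software reliability growth model as a stand-in for whichever growth model the work adopts):

```python
import math

def hardware_reliability(x, failure_rate):
    """Exponential-lifetime hardware reliability R_hw(x) = exp(-lambda * x)."""
    return math.exp(-failure_rate * x)

def software_reliability(t, x, a, b):
    """Goel-Okumoto NHPP: probability of no software failure in (t, t + x].

    m(t) = a * (1 - exp(-b * t)) is the expected cumulative failure count;
    a (total expected faults) and b (detection rate) come from fitted
    failure data (illustrative values below).
    """
    m = lambda s: a * (1.0 - math.exp(-b * s))
    return math.exp(-(m(t + x) - m(t)))

def system_reliability(t, x, failure_rate, a, b):
    """Series combination: the system survives iff hardware and software do."""
    return hardware_reliability(x, failure_rate) * software_reliability(t, x, a, b)

# Illustrative (made-up) parameters: a 1000 h mission after 500 h of testing
print(system_reliability(t=500.0, x=1000.0, failure_rate=1e-5, a=120.0, b=0.01))
```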