959 results for processing method


Relevance: 30.00%

Abstract:

This thesis is an outcome of investigations carried out on the development of an Artificial Neural Network (ANN) model to implement the 2-D DFT at high speed. A new definition of the 2-D DFT relation is presented. This definition enables the DFT computation to be organized in stages involving only real additions, except at the final stage of computation. The number of stages is always fixed at 4. Two different strategies are proposed: 1) a visual representation of 2-D DFT coefficients, and 2) a neural network approach. The visual representation scheme can be used to compute, analyze and manipulate 2-D signals such as images in the frequency domain in terms of symbols derived from the 2x2 DFT, which in turn can be represented in terms of real data. This approach allows signals to be analyzed in the frequency domain even without computing the DFT coefficients. A hierarchical neural network model is developed to implement the 2-D DFT. Presently, this model is capable of implementing the 2-D DFT for a particular order N such that ((N))4 = 2, i.e., N modulo 4 equals 2. The model can be extended to implement the 2-D DFT for any order N up to a set maximum limited by hardware constraints. The reported method shows potential for implementing the 2-D DFT in hardware as a VLSI/ASIC.
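As an illustrative sketch (not the thesis's actual ANN implementation), the building block of the symbol scheme, the 2x2 DFT of real data, reduces to signed sums of the four samples: every twiddle factor is +1 or -1, so no multiplications are needed.

```python
import numpy as np

def dft2x2(block):
    """2x2 DFT of a real block using only additions and subtractions.

    For a real 2x2 input [[a, b], [c, d]], every twiddle factor is
    +1 or -1, so each coefficient is a signed sum of the samples.
    """
    a, b = block[0]
    c, d = block[1]
    return np.array([
        [a + b + c + d, a - b + c - d],
        [a + b - c - d, a - b - c + d],
    ], dtype=float)

block = np.array([[1.0, 2.0], [3.0, 4.0]])
# Agrees with a general-purpose FFT on the same block (imaginary part is zero).
assert np.allclose(dft2x2(block), np.fft.fft2(block).real)
```

Larger transforms are then assembled from these real-valued 2x2 symbols.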

Relevance: 30.00%

Abstract:

The use of short fibers as reinforcing fillers in rubber composites is on an increasing trend. They are popular due to the possibility of obtaining anisotropic properties, ease of processing and economy. In the preparation of these composites, short fibers are incorporated on two-roll mixing mills or in internal mixers, a highly energy- and time-intensive process. This calls for the development of less energy-intensive and less time-consuming processes for incorporating and distributing short fibers in the rubber matrix. One such method is to incorporate the fibers at the latex stage. The present study primarily aims to optimize the preparation of short fiber-natural rubber composites by latex stage compounding and to evaluate the resulting composites in terms of mechanical, dynamic mechanical and thermal properties. A synthetic fiber (nylon) and a natural fiber (coir) are used to evaluate the advantages of processing through the latex stage. To extract the full reinforcing potential of the coir fibers, the macro fibers are converted to micro fibers through chemical and mechanical means. The thesis is presented in 7 chapters.

Relevance: 30.00%

Abstract:

This thesis investigated the potential use of Linear Predictive Coding in speech communication applications. A Modified Block Adaptive Predictive Coder is developed, which reduces the computational burden and complexity without sacrificing speech quality compared to the conventional adaptive predictive coding (APC) system. For this, changes in the evaluation methods have been evolved. This method differs from the usual APC system in that the difference between the true and the predicted value is not transmitted. This allows the high-order predictor in the transmitter section of a predictive coding system to be replaced by a simple delay unit, which makes the transmitter quite simple. Also, the block length used in processing the speech signal is adjusted relative to the pitch period of the signal being processed, rather than choosing a constant length as hitherto done by other researchers. The efficiency of the newly proposed coder has been supported with results of computer simulation using real speech data. Three methods for voiced/unvoiced/silent/transition classification have been presented. The first is based on energy, zero-crossing rate and the periodicity of the waveform. The second method uses the normalised correlation coefficient as the main parameter, while the third utilizes a pitch-dependent correlation factor. The third algorithm, which gives the minimum error probability, is chosen in a later chapter to design the modified coder. The thesis also presents a comparative study between the autocorrelation and the covariance methods used in the evaluation of the predictor parameters. It has been shown that the autocorrelation method is superior to the covariance method with respect to filter stability and also in an SNR sense, though the increase in gain is only small.
The Modified Block Adaptive Coder switches from pitch prediction to spectrum prediction when the speech segment changes from a voiced or transition region to an unvoiced region. The experiments conducted in coding, transmission and simulation used speech samples from Malayalam and English phrases. Proposals for a speaker recognition system and a phoneme identification system have also been outlined towards the end of the thesis.
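A minimal sketch of the first classification method, using frame energy and zero-crossing rate (the thresholds below are illustrative assumptions, not the thesis's tuned parameters, and the periodicity check is omitted):

```python
import numpy as np

def classify_frame(frame, energy_thresh=1e-4, zcr_thresh=0.25):
    """Classify a speech frame as silent, voiced or unvoiced.

    energy_thresh and zcr_thresh are illustrative values only.
    zcr is the fraction of samples at which the signal changes sign.
    """
    energy = np.mean(frame ** 2)
    zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2
    if energy < energy_thresh:
        return "silent"
    # Voiced speech is quasi-periodic and low-frequency: few zero crossings.
    return "voiced" if zcr < zcr_thresh else "unvoiced"

fs = 8000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 100 * t)                        # voiced-like
noise = 0.1 * np.random.default_rng(0).standard_normal(fs)  # unvoiced-like
```

A transition class would additionally track how these measures change between adjacent frames.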

Relevance: 30.00%

Abstract:

In this thesis, different techniques for image analysis of high-density microarrays have been investigated. Most existing image analysis techniques require prior knowledge of image-specific parameters and direct user intervention for microarray image quantification. The objective of this research work was to develop a fully automated image analysis method capable of accurately quantifying the intensity information from high-density microarray images. The method should be robust against the noise and contamination that commonly occur at different stages of microarray development.

Relevance: 30.00%

Abstract:

Rays, belonging to the class Elasmobranchii, constitute a major fishery in many Indian states such as Tamil Nadu, Gujarat, Andhra Pradesh, Kerala and Maharashtra. The estimated landings are 21,700 tonnes per annum. Even though the meat of rays is nutritious and free from bones and spines, there is little demand for it fresh due to its high urea content. The landings are mainly used for salt curing, which fetches only very low prices for the producers. Urea nitrogen constituted the major component (50.8%) of the non-protein nitrogen of the meat. An attempt has been made to standardize the processing steps to reduce the urea levels in the meat before freezing by using simple techniques such as dipping the fillets in stagnant chilled water, in chilled running water, and in stirred chilled running water. It was found that dipping the meat in stirred running water for two hours reduced its urea level by 62%. The yield of the lateral fin fillets and caudal fin fillets varies with the size of the ray. The drip loss during frozen storage was found to be highest in samples frozen after the urea-removal treatment in stirred running water; samples treated in stagnant chilled water had the lowest drip loss. The total nitrogen was highest in samples treated in stagnant chilled water and lowest in samples treated in stirred running water. The overall acceptability was high in the case of samples treated with stirred running water and frozen stored.

Relevance: 30.00%

Abstract:

The presence of microcalcifications in mammograms can be considered an early indication of breast cancer. A fast fractal block coding method to model mammograms for detecting the presence of microcalcifications is presented in this paper. The conventional fractal image coding method takes an enormous amount of time during the fractal block encoding procedure. In the proposed method, the image is divided into shade and non-shade blocks based on the dynamic range, and only non-shade blocks are encoded using the fractal encoding technique. Since the number of image blocks in the matching domain search pool is considerably reduced, a saving of 97.996% of the encoding time is obtained as compared to the conventional fractal coding method for modeling mammograms. The modeled mammograms are used for detecting microcalcifications, and a diagnostic efficiency of 85.7% is obtained for the 28 mammograms used.
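A sketch of the shade/non-shade pre-screening step, assuming 8x8 blocks and an illustrative dynamic-range threshold (the paper's actual block size and threshold are not given here):

```python
import numpy as np

def split_shade_blocks(image, block_size=8, range_thresh=16):
    """Separate low-dynamic-range ('shade') blocks, which can be
    represented by their mean intensity alone, from non-shade blocks
    that enter the costly fractal domain-matching search."""
    shade, non_shade = [], []
    h, w = image.shape
    for y in range(0, h, block_size):
        for x in range(0, w, block_size):
            blk = image[y:y + block_size, x:x + block_size]
            if blk.max() - blk.min() <= range_thresh:
                shade.append(((y, x), float(blk.mean())))
            else:
                non_shade.append(((y, x), blk))
    return shade, non_shade

# Toy image: flat left half (shade), steep ramp right half (non-shade).
img = np.zeros((16, 16))
img[:, 8:] = np.tile(np.arange(8) * 32.0, (16, 1))
shade, non_shade = split_shade_blocks(img)
```

Only the `non_shade` list would be passed to the fractal encoder, which is where the reported encoding-time saving comes from.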

Relevance: 30.00%

Abstract:

Sensitisation of natural rubber latex by the addition of a small quantity of an anionic surfactant prior to the addition of a coacervant results in quick coagulation. Natural rubber prepared by this novel coagulation method shows improved raw rubber characteristics, better cure characteristics in gum and carbon black filled compounds, and improved mechanical properties compared to conventionally coagulated natural rubber. Compounds based on dried masterbatches prepared by incorporating fluffy carbon black in different forms of soap-sensitised natural rubber latices, such as fresh latex, preserved field latex, centrifuged latex and a blend of preserved field latex and skim latex, show improved cure characteristics and vulcanizate properties compared to an equivalent conventional dry rubber-fluffy carbon black based compound. The latex masterbatch based vulcanizates show a higher level of crosslinking and better dispersion of filler. Vulcanizates based on fresh natural rubber latex-dual filler masterbatches, containing a blend of carbon black and silica prepared by the modified coagulation process, show very good mechanical and dynamic properties that could be correlated to a low rolling resistance. The carbon black/silica/nanoclay tri-filler fresh natural rubber latex masterbatch based vulcanizates show improved mechanical properties as the proportion of nanoclay is increased up to 5 phr. The fresh natural rubber latex based carbon black-silica masterbatch/polybutadiene blend vulcanizates show superior mechanical and dynamic properties compared to equivalent compound vulcanizates prepared from dry natural rubber-filler (conventional dry mix)/polybutadiene blends.

Relevance: 30.00%

Abstract:

Summary - Cooking banana is one of the most important crops in Uganda; it is a staple food and a source of household income in rural areas. The most common cooking banana is locally called matooke, a Musa sp. triploid acuminata genome group (AAA-EAHB). It is perishable and traded in fresh form, leading to very high postharvest losses (22-45%). This is attributed to non-uniform harvest maturity, poor handling, bulk transportation and a lack of value addition/processing technologies, which are currently the main challenges for trade, export and diversified utilization of matooke. Drying is one of the oldest technologies employed in the processing of agricultural produce. A lot of research has been carried out on the drying of fruits and vegetables, but little information is available on matooke. Drying matooke and milling it to flour extends its shelf-life and is an important means of overcoming the above challenges. Raw matooke flour is a generic flour developed to improve the shelf stability of the fruit and to find alternative uses. It is rich in starch (80-85% db) and consequently has high potential as a calorie resource base. It possesses good properties for both food and non-food industrial use. Some effort has been made to commercialize the processing of matooke, but there is still limited information on its processing into flour.
It was imperative to carry out an in-depth study to bridge the following gaps: the lack of accurate information on the maturity window within which matooke for processing into flour can be harvested, which leads to non-uniform quality of matooke flour; the lack of information on the moisture sorption isotherm for matooke, from which the minimum equilibrium moisture content in relation to temperature and relative humidity is obtainable, below which dry matooke would be microbiologically shelf-stable; and the lack of information on the drying behavior of matooke and standardized processing parameters in relation to the physicochemical properties of the flour. The main objective of the study was to establish the optimum harvest maturity window and optimize the processing parameters for obtaining standardized, microbiologically shelf-stable matooke flour with good starch quality attributes. This research was designed to: i) establish the optimum harvest maturity window within which matooke can be harvested to produce a consistent quality of matooke flour, ii) establish the sorption isotherms for matooke, iii) establish the effect of process parameters on the drying characteristics of matooke, iv) optimize the drying process parameters, v) validate the models of maturity and optimum process parameters, and vi) standardize process parameters for commercial processing of matooke. Samples were obtained from a banana plantation at the Presidential Initiative on Banana Industrial Development (PIBID) Technology Business Incubation Center (TBI) at Nyaruzunga - Bushenyi in Western Uganda. A completely randomized design (CRD) was employed in selecting the banana stools from which samples for the experiments were picked. The cultivar Mbwazirume, which is soft-cooking and commonly grown in Bushenyi, was selected for the study. The static gravitation method recommended by the COST 90 Project (Wolf et al., 1985) was used for the determination of moisture sorption isotherms.
A research dryer was developed for this research. All experiments were carried out in laboratories at TBI. The physiological maturity of matooke cv. Mbwazirume at Bushenyi is 21 weeks. The optimum harvest maturity window for commercial processing of matooke flour (Raw Tooke Flour - RTF) at Bushenyi is between 15 and 21 weeks. The finger-weight model is recommended for farmers to estimate harvest maturity, and the combined model of finger weight and pulp-peel ratio is recommended for commercial processors. Matooke isotherms exhibited type II curve behavior, which is characteristic of foodstuffs. The GAB model best described all the adsorption and desorption moisture isotherms. For commercial processing of matooke, in order to obtain a microbiologically shelf-stable dry product, it is recommended to dry it to a moisture content at or below 10% (wb). The hysteresis phenomenon was exhibited by the moisture sorption isotherms for matooke. The isosteric heat of sorption for both adsorption and desorption isotherms increased with decreasing moisture content. The total isosteric heat of sorption for matooke ranged from 4,586 to 2,386 kJ/kg for the adsorption isotherm and from 18,194 to 2,391 kJ/kg for the desorption isotherm, for equilibrium moisture contents from 0.3 to 0.01 (db) respectively. The minimum energy required for drying matooke from 80% to 10% (wb) is 8,124 kJ/kg of water removed, implying that the minimum energy required for drying 1 kg of fresh matooke from 80% to 10% (wb) is 5,793 kJ. The drying of matooke takes place in three steps: the warm-up period and two falling-rate periods. The drying rate constant varied with the processing parameters, and the effective diffusivity ranged from 1.5E-10 to 8.27E-10 m²/s. The activation energy (Ea) for matooke was 16.3 kJ/mol (1,605 kJ/kg).
Comparing the activation energy (Ea) with the net isosteric heat of sorption for the desorption isotherm (qst = 1,297.62 kJ/kg) at 0.1 kg water/kg dry matter indicated that Ea was higher than qst, suggesting that moisture molecules travel in liquid form in matooke slices. The total color difference (ΔE*) between the fresh and dry samples was lowest for a thickness of 7 mm, followed by an air velocity of 6 m/s, and then a drying air temperature of 70°C. A drying system controlled by the set surface product temperature reduced the drying time by 50% compared to one controlled by the set drying-air temperature. The processing parameters did not have a significant effect on the physicochemical and quality attributes, suggesting that any drying air temperature can be used in the initial stages of drying as long as the product temperature does not exceed the gelatinization temperature of matooke (72°C). The optimum processing parameters for single-layer drying of matooke are: thickness 3 mm, air temperature 70°C, dew point temperature 18°C and air velocity 6 m/s in overflow mode. From a practical point of view, it is recommended that commercial processing of matooke employ multi-layer drying with a loading capacity of at most 7 kg/m², thickness 3 mm, air temperature 70°C, dew point temperature 18°C and air velocity 6 m/s in overflow mode.
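The GAB model that best described the matooke sorption data has the standard three-parameter form; the parameter values below are illustrative placeholders, not the fitted values from the study.

```python
def gab_moisture(aw, m0, c, k):
    """GAB sorption isotherm: equilibrium moisture content (dry basis)
    as a function of water activity aw. m0 is the monolayer moisture
    content; c and k are the GAB energy constants."""
    return m0 * c * k * aw / ((1.0 - k * aw) * (1.0 - k * aw + c * k * aw))

# Illustrative parameters only (not the matooke fit): a type II curve
# rises steeply at low aw, flattens, then rises again at high aw.
curve = [gab_moisture(aw / 10, m0=0.08, c=10.0, k=0.75) for aw in range(1, 10)]
```

With c > 2 and 0 < k < 1 the curve is sigmoidal (type II), consistent with the isotherm shape reported for matooke.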

Relevance: 30.00%

Abstract:

This paper describes a method for obtaining the most relevant contours of an image. The method integrates local contour information from the chromatic components H, S and I, taking into account a criterion of coherence of the local contour orientation values obtained from each component. The process is based on parametrizing the local contours (magnitude and orientation values) pixel by pixel from the H, S and I images. This is carried out individually for each chromatic component. If the dispersion of the orientation values obtained for a chromatic component is high, that component loses relevance. A final processing step integrates the extracted contours of the three chromatic components, generating the so-called integrated contours image.
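The orientation-coherence criterion can be sketched as a circular dispersion measure per chromatic component; this is an illustrative formulation, not the paper's exact parametrization.

```python
import numpy as np

def orientation_dispersion(angles):
    """Dispersion of local contour orientations, in [0, 1]:
    0 = fully coherent, 1 = uniformly scattered. Angles are doubled
    so that theta and theta + pi count as the same orientation."""
    c = np.mean(np.cos(2.0 * angles))
    s = np.mean(np.sin(2.0 * angles))
    return 1.0 - float(np.hypot(c, s))

def integrate_contours(channel_mags, channel_angles):
    """Fuse H, S and I contour magnitudes, down-weighting components
    whose orientation field is incoherent (high dispersion)."""
    weights = np.array([1.0 - orientation_dispersion(a) for a in channel_angles])
    weights /= weights.sum()
    return sum(w * m for w, m in zip(weights, channel_mags))
```

A component whose local orientations scatter uniformly gets weight near zero, matching the paper's "loses relevance" rule.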

Relevance: 30.00%

Abstract:

The aim of this study was to compare contrast visual processing of concentric sinusoidal grating stimuli between adolescents and adults. The study included 20 volunteers divided into two groups: 10 adolescents aged 13-19 years (M=16.5, SD=1.65) and 10 adults aged 20-26 years (M=21.8, SD=2.04). Contrast sensitivity was measured at spatial frequencies of 0.6, 2.5, 5 and 20 cycles per degree of visual angle (cpd) using the psychophysical method of two-alternative forced choice (2AFC). A one-way ANOVA showed a significant difference between the groups: F(4, 237)=3.74, p<.05. The post-hoc Tukey HSD test showed significant differences at the frequencies of 0.6 cpd (p<.05) and 20 cpd (p<.05). Thus, the results showed that the sensory mechanisms that render contrast behave differently in adolescents and adults. These results are useful to better characterize and comprehend human vision development.

Relevance: 30.00%

Abstract:

In image processing, segmentation algorithms constitute one of the main focuses of research. In this paper, new image segmentation algorithms based on a hard version of the information bottleneck method are presented. The objective of this method is to extract a compact representation of a variable, considered the input, with minimal loss of mutual information with respect to another variable, considered the output. First, we introduce a split-and-merge algorithm based on the definition of an information channel between a set of regions (input) of the image and the intensity histogram bins (output). From this channel, the maximization of the mutual information gain is used to optimize the image partitioning. Then, the merging process of the regions obtained in the previous phase is carried out by minimizing the loss of mutual information. From the inversion of the above channel, we also present a new histogram clustering algorithm based on the minimization of the mutual information loss, where now the input variable represents the histogram bins and the output is given by the set of regions obtained from the above split-and-merge algorithm. Finally, we introduce two new clustering algorithms which show how the information bottleneck method can be applied to the registration channel obtained when two multimodal images are correctly aligned. Different experiments on 2-D and 3-D images show the behavior of the proposed algorithms.
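The region-to-histogram-bin channel can be summarized by its mutual information; a minimal sketch, assuming a joint count table as input rather than the paper's actual data structures:

```python
import numpy as np

def mutual_information(joint_counts):
    """Mutual information I(R; B), in bits, of the channel defined by
    a joint table joint_counts[region, bin]."""
    p = joint_counts / joint_counts.sum()
    pr = p.sum(axis=1, keepdims=True)   # marginal over regions
    pb = p.sum(axis=0, keepdims=True)   # marginal over bins
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (pr @ pb)[nz])))

# An independent channel carries no information; a deterministic
# region-to-bin mapping over n outcomes carries log2(n) bits.
mi_flat = mutual_information(np.ones((2, 2)))
mi_diag = mutual_information(np.eye(2))
```

Splits are then chosen to maximize the gain in this quantity, and merges to minimize its loss.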

Relevance: 30.00%

Abstract:

Realistic rendering of animation is known to be an expensive processing task when physically-based global illumination methods are used to improve illumination details. This paper presents an acceleration technique to compute animations in radiosity environments. The technique is based on an interpolated approach that exploits temporal coherence in radiosity. A fast global Monte Carlo pre-processing step is introduced into the computation of the whole animated sequence to select important frames. These are fully computed and used as a base for the interpolation of the entire sequence. The approach is completely view-independent: once the illumination is computed, it can be visualized by any animated camera. Results show significant speed-ups, indicating that the technique could be an interesting alternative to deterministic methods for computing non-interactive radiosity animations for moderately complex scenarios.
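The interpolation step can be sketched as a per-patch linear blend between two fully computed key frames (the actual technique selects the key frames via the Monte Carlo pre-process; linear blending is a simplifying assumption here):

```python
import numpy as np

def interpolate_radiosity(key_a, key_b, n_between):
    """Linearly interpolate per-patch radiosity vectors between two
    key frames, exploiting temporal coherence of the illumination."""
    frames = []
    for i in range(1, n_between + 1):
        t = i / (n_between + 1)
        frames.append((1.0 - t) * key_a + t * key_b)
    return frames

# Three in-between frames for a 3-patch scene whose radiosity
# rises from 0 to 4 between the two key frames.
frames = interpolate_radiosity(np.zeros(3), np.full(3, 4.0), 3)
```

Since radiosity is stored per patch rather than per pixel, the interpolated solution remains view-independent.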

Relevance: 30.00%

Abstract:

The main objective of this thesis was the integration of microstructure information into synoptic descriptors of turbulence that reflect the mixing processes. Turbulent patches are intermittent in space and time, but they represent the dominant process for mixing. In this work, the properties of turbulent patches were considered the potential input for integrating the physical microscale measurements. The development of a method for integrating the properties of the turbulent patches required solving three main questions: a) how can turbulent patches be detected from the microstructure measurements?; b) which are the most relevant properties of the turbulent patches?; and c) once an interval of time has been selected, what kind of synoptic parameters could best reflect the occurrence and properties of the turbulent patches? The answers to these questions were the final specific objectives of this thesis.

Relevance: 30.00%

Abstract:

Recent interest in the validation of general circulation models (GCMs) has been devoted to objective methods. A small number of authors have used the direct synoptic identification of phenomena together with a statistical analysis to perform objective comparisons between various datasets. This paper describes a general method for performing the synoptic identification of phenomena that can be used for an objective analysis of atmospheric, or oceanographic, datasets obtained from numerical models and remote sensing. Methods usually associated with image processing have been used to segment the scene and to identify suitable feature points to represent the phenomena of interest. This is performed for each time level. A technique from dynamic scene analysis is then used to link the feature points to form trajectories. The method is fully automatic and should be applicable to a wide range of geophysical fields. An example is shown of results obtained from this method using data from a run of the Universities Global Atmospheric Modelling Project GCM.
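The trajectory-forming step can be sketched as greedy nearest-neighbour association of feature points across consecutive time levels; this is a simplified stand-in for the dynamic scene analysis technique used in the paper, with an illustrative distance gate.

```python
import numpy as np

def link_trajectories(frames, max_dist=5.0):
    """Link feature points (x, y) across time levels into trajectories.
    Each trajectory greedily claims the nearest unclaimed point in the
    next frame, provided it lies within max_dist (an assumed gate)."""
    trajectories = [[p] for p in frames[0]]
    for points in frames[1:]:
        remaining = list(points)
        for traj in trajectories:
            if not remaining:
                break
            last = np.asarray(traj[-1], dtype=float)
            dists = [np.linalg.norm(last - np.asarray(p, dtype=float))
                     for p in remaining]
            i = int(np.argmin(dists))
            if dists[i] <= max_dist:
                traj.append(remaining.pop(i))
    return trajectories

# Two features drifting over three time levels.
frames = [[(0, 0), (10, 10)], [(1, 0), (10, 11)], [(2, 0), (10, 12)]]
tracks = link_trajectories(frames)
```

Points that fail the distance gate start no continuation, so trajectories naturally terminate when a phenomenon disappears.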

Relevance: 30.00%

Abstract:

Different optimization methods can be employed to optimize a numerical estimate for the match between an instantiated object model and an image. In order to take advantage of gradient-based optimization methods, perspective inversion must be used in this context. We show that convergence can be very fast by extrapolating to maximum goodness-of-fit with Newton's method. This approach is related to methods which either maximize a similar goodness-of-fit measure without use of gradient information, or else minimize distances between projected model lines and image features. Newton's method combines the accuracy of the former approach with the speed of convergence of the latter.
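A one-dimensional sketch of the extrapolation idea, using a hypothetical quadratic goodness-of-fit with known derivatives (not the paper's perspective-inversion formulation): Newton's method drives the gradient of the fit measure to zero.

```python
def newton_maximize(grad, hess, x0, tol=1e-10, max_iter=50):
    """Newton iteration for a stationary point of a scalar objective:
    x <- x - grad(x) / hess(x). Near a maximum (hess < 0) this
    extrapolates directly toward the peak."""
    x = x0
    for _ in range(max_iter):
        step = grad(x) / hess(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Hypothetical goodness-of-fit f(x) = -(x - 3)^2, peaked at x = 3.
# Because f is exactly quadratic, Newton converges in a single step.
x_star = newton_maximize(lambda x: -2.0 * (x - 3.0), lambda x: -2.0, x0=0.0)
```

The fast convergence for near-quadratic fit measures is what combines the accuracy of gradient-free maximization with the speed of line-distance minimization.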