997 results for Complexity processing


Relevance: 20.00%

Abstract:

In the present work, the author has designed and developed both types of solar air heaters, namely porous and non-porous collectors. The developed solar air heaters were subjected to different air mass flow rates in order to standardize the flow per unit area of the collector. Particular attention was given to investigating the performance of solar air heaters fitted with baffles. The output obtained from experiments on pilot models also helped in installing a solar air heating system for industrial drying applications. Apart from these, various types of solar dryers for small- and medium-scale drying applications were also built. The feasibility of a latent heat thermal energy storage system based on a phase change material was also studied. The application of a solar greenhouse for drying industrial effluent was analysed in the present study, and a solar greenhouse was developed. The effectiveness of computational fluid dynamics (CFD) in the field of solar air heaters was also analysed. The thesis is divided into eight chapters.

Relevance: 20.00%

Abstract:

The use of short fibers as reinforcing fillers in rubber composites is increasing. They are popular owing to the possibility of obtaining anisotropic properties, ease of processing and economy. In the preparation of these composites, short fibers are incorporated on two-roll mixing mills or in internal mixers, a highly energy-intensive and time-consuming process. This calls for less energy- and time-intensive processes for incorporating and distributing short fibers in the rubber matrix. One such method is to incorporate the fibers at the latex stage. The present study primarily aims to optimize the preparation of short fiber-natural rubber composites by latex-stage compounding and to evaluate the resulting composites in terms of mechanical, dynamic mechanical and thermal properties. A synthetic fiber (nylon) and a natural fiber (coir) are used to evaluate the advantages of processing through the latex stage. To extract the full reinforcing potential of the coir fibers, the macro fibers are converted to micro fibers through chemical and mechanical means. The thesis is presented in seven chapters.

Relevance: 20.00%

Abstract:

The assembly job shop scheduling problem (AJSP) is one of the most complicated combinatorial optimization problems, involving the simultaneous scheduling of the processing and assembly operations of complex structured products. The problem becomes even more complicated when a combination of two or more optimization criteria is considered. This thesis addresses an assembly job shop scheduling problem with multiple objectives: simultaneously minimizing makespan and total tardiness. Two approaches, a weighted approach and a Pareto approach, are used for solving the problem. However, it is difficult to achieve an optimal solution with traditional optimization approaches owing to the high computational complexity. Two metaheuristic techniques, genetic algorithm and tabu search, are therefore investigated. Three algorithms based on these two metaheuristics, covering both the weighted approach and the Pareto approach, are proposed for the multi-objective assembly job shop scheduling problem (MOAJSP). A new pairing mechanism is developed for the crossover operation in the genetic algorithm, leading to improved solutions and faster convergence. The performance of the proposed algorithms is evaluated on a set of test problems and the results are reported. They reveal that the weighted-approach algorithms are feasible and effective for solving MOAJSP instances according to the weight assigned to each objective criterion, and that the Pareto-approach algorithms are capable of producing a number of good Pareto-optimal scheduling plans for MOAJSP instances.
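The two strategies named in the abstract can be illustrated with a minimal sketch (not the thesis's actual algorithms): a weighted-sum fitness that scalarizes the two criteria, and a Pareto-dominance test that filters candidate schedules evaluated as (makespan, total tardiness) pairs. All names and numbers below are illustrative.

```python
def weighted_fitness(makespan, tardiness, w_makespan=0.5, w_tardiness=0.5):
    """Weighted approach: scalarize the two criteria into one value to minimize."""
    return w_makespan * makespan + w_tardiness * tardiness

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (both criteria minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Pareto approach: keep only non-dominated (makespan, tardiness) vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Four hypothetical schedules, evaluated as (makespan, total_tardiness).
schedules = [(40, 12), (38, 20), (40, 15), (45, 5)]
front = pareto_front(schedules)  # (40, 15) is dominated by (40, 12)
```

With these toy values the front contains three schedules; changing the weights in `weighted_fitness` would instead single out one preferred trade-off.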

Relevance: 20.00%

Abstract:

The source, fate and diagenetic pathway of sedimentary organic matter in estuaries are difficult to delineate due to the complexity of organic matter sources, intensive physical mixing and biological processes. A combination of bulk organic matter techniques and molecular biomarkers has proved successful in explaining organic matter dynamics in estuaries. The basic requirements for these multi-proxy approaches are that (i) the sources have significantly differing characteristics, (ii) there are sufficient tracers to delineate all sources, and (iii) organic matter degradation and processing have little, similar or predictable effects on end-member characteristics. Although abundant research has attempted to tackle the difficulties related to the source and fate of organic matter in estuarine systems, our understanding of Indian estuaries remains limited and rather inconsistent. The Cochin estuary is the largest among the many extensive estuarine systems along the southwest coast of India, supporting as much biological productivity and diversity as a tropical rain forest. In this study, we have used a combination of bulk geochemical parameters and different groups of molecular biomarkers to define organic matter sources and thereby identify the various biogeochemical processes acting along the salinity gradient of the Cochin estuary.

Relevance: 20.00%

Abstract:

Interfacing of various subjects generates new fields of study and research that help advance human knowledge. One of the latest such fields is neurotechnology, an effective amalgamation of neuroscience, physics, biomedical engineering and computational methods. Neurotechnology provides a platform for physicists, neurologists and engineers to interact and to break methodology- and terminology-related barriers. Advancements in computational capability and the wider scope of applications of nonlinear dynamics and chaos in complex systems have enhanced the study of neurodynamics. However, there remains a need for an effective dialogue among physicists, neurologists and engineers. Applications of computer-based technology in medicine, through signal and image processing, the creation of clinical databases to help clinicians, etc., are widely acknowledged. Such synergy between widely separated disciplines may enhance the effectiveness of existing diagnostic methods. One recent method in this direction is the analysis of the electroencephalogram using methods from nonlinear dynamics. This thesis is an effort to understand the functional aspects of the human brain by studying the electroencephalogram. The algorithms and other methods developed in the present work can be interfaced with a digital EEG machine to unfold the information hidden in the signal; ultimately, this can be used as a diagnostic tool.
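As a minimal sketch of the kind of nonlinear-dynamics analysis mentioned above (not the thesis's actual algorithms), a standard first step is delay-coordinate embedding, which reconstructs state-space vectors from a scalar time series such as an EEG channel before measures like correlation dimension or entropy are computed. The series and parameters below are toy values.

```python
def delay_embed(signal, dim, tau):
    """Build delay vectors x_i = (s_i, s_{i+tau}, ..., s_{i+(dim-1)*tau})."""
    n = len(signal) - (dim - 1) * tau  # number of complete vectors
    return [tuple(signal[i + j * tau] for j in range(dim)) for i in range(n)]

# Toy stand-in for one EEG channel's samples.
series = [0.0, 0.8, 1.0, 0.3, -0.5, -1.0, -0.6, 0.2]
vectors = delay_embed(series, dim=3, tau=2)
# Each tuple is one point in the reconstructed state space.
```

Real analyses would choose the embedding dimension and delay from the data (e.g. via false nearest neighbours and mutual information) rather than fixing them by hand.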

Relevance: 20.00%

Abstract:

Broiler chicken is gaining popularity among consumers in India. Since poultry is recognised as a leading food vehicle for Salmonella contamination, the prevalence and distribution of Salmonella serotypes in broiler chickens and the processing environments of retail outlets were studied. In the present study, 214 samples of broiler chicken and 311 environmental samples from cages were analysed for the presence of Salmonella. Among the various body parts of live chicken analysed, prevalence varied from 1.4% in the cloaca to 6.9% in the crop region. Environmental samples from the cages showed a higher prevalence of Salmonella, ranging from 0 to 16.67%. Apart from Salmonella enteritidis, the predominant serotype in both the chickens and the environmental samples, other serotypes such as S. bareilly, S. cerro, S. mbandaka and S. molade were also encountered. The results call for strict hygiene standards in retail broiler chicken processing outlets.

Relevance: 20.00%

Abstract:

An important step in residue number system (RNS) based signal processing is the conversion of the signal into the residue domain. Many implementations of this conversion have been proposed for various goals; one of them is direct conversion from an analogue input. A novel approach to analogue-to-residue conversion is proposed in this research, using the popular Sigma-Delta analogue-to-digital converter (SD-ADC). In this approach, the front end is the same as in a traditional SD-ADC, using a Sigma-Delta (SD) modulator with appropriate dynamic range, but the filtering is done by a filter implemented in RNS arithmetic. Hence, the natural output of the filter is an RNS representation of the input signal. The resolution, conversion speed, hardware complexity and cost of implementation of the proposed SD-based analogue-to-residue converter are compared with those of existing analogue-to-residue converters based on Nyquist-rate ADCs.
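The residue arithmetic the converter relies on can be sketched as follows: a value is represented by its remainders modulo a set of pairwise-coprime moduli, and reconstructed via the Chinese Remainder Theorem. The moduli here are illustrative choices, not those of the proposed converter.

```python
MODULI = (7, 8, 9)  # pairwise coprime; dynamic range M = 7*8*9 = 504

def to_residues(x, moduli=MODULI):
    """Forward conversion: x -> (x mod m1, x mod m2, ...)."""
    return tuple(x % m for m in moduli)

def from_residues(residues, moduli=MODULI):
    """Reverse conversion via the Chinese Remainder Theorem."""
    M = 1
    for m in moduli:
        M *= m
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        # pow(Mi, -1, m) is the modular inverse of Mi mod m (Python 3.8+).
        x += r * Mi * pow(Mi, -1, m)
    return x % M

digits = to_residues(123)        # (4, 3, 6)
assert from_residues(digits) == 123
```

The appeal for filtering, as the abstract notes, is that additions and multiplications act independently on each small residue channel, so an RNS filter's natural output is already in residue form.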

Relevance: 20.00%

Abstract:

Several oral vaccination studies have been undertaken to evoke better protection against white spot syndrome virus (WSSV), a major shrimp pathogen. Formalin-inactivated virus and the WSSV envelope protein VP28 have been suggested as candidate vaccine components, but their uptake mechanism upon oral delivery had not been elucidated. In this study, the fate of these components and of live WSSV, orally intubated into black tiger shrimp (Penaeus monodon), was investigated by immunohistochemistry, employing antibodies specific for VP28 and haemocytes. The midgut was identified as the most prominent site of WSSV uptake and processing. The truncated recombinant VP28 (rec-VP28), formalin-inactivated virus (IVP) and live WSSV follow an identical uptake route, suggested to be receptor-mediated endocytosis, which starts with adherence of luminal antigens to the apical layers of the gut epithelium. Processing of internalized antigens takes place in endo-lysosomal compartments, leading to the formation of supra-nuclear vacuoles. However, the majority of WSSV antigens escape these compartments and are transported to the intercellular space via transcytosis. Accumulation of the transcytosed antigens in the connective tissue initiates aggregation and degranulation of haemocytes. Finally, the antigens exiting the midgut appear to reach the haemolymph. The nearly identical uptake pattern of the different WSSV antigens suggests that receptors on the apical membrane of shrimp enterocytes recognize rec-VP28 efficiently. Hence, the truncated VP28 can be considered suitable for oral vaccination, provided digestion in the foregut can be bypassed.

Relevance: 20.00%

Abstract:

Rays, belonging to the class Elasmobranchii, constitute a major fishery in many Indian states such as Tamil Nadu, Gujarat, Andhra Pradesh, Kerala and Maharashtra, with estimated landings of 21,700 tonnes per annum. Even though the meat of rays is nutritious and free from bones and spines, there is little demand for the fresh meat due to its high urea content. The landings are mainly used for salt curing, which fetches only very low prices for the producers. Urea nitrogen constitutes the major component (50.8%) of the non-protein nitrogen of the meat. An attempt has been made to standardize processing steps to reduce the urea level in the meat before freezing, using simple techniques: dipping the fillets in stagnant chilled water, in chilled running water, and in stirred chilled running water. Dipping the meat in stirred running water for two hours reduced its urea level by 62%. The yield of the lateral fin and caudal fin fillets varies with the size of the ray. Drip loss during frozen storage was found to be highest in samples frozen after treatment for urea removal by stirring in running water; samples treated in stagnant chilled water had the lowest drip loss. Total nitrogen was highest in samples treated in stagnant chilled water and lowest in samples treated in stirred running water. Overall acceptability was highest for samples treated in stirred running water and frozen stored.

Relevance: 20.00%

Abstract:

The date palm Phoenix dactylifera has played an important role in people's day-to-day life for the last 7000 years. Today, the worldwide production, utilization and industrialization of dates are continuously increasing, since date fruits have earned great importance in human nutrition owing to their rich content of essential nutrients. Tons of date palm fruit wastes are discarded daily by date processing industries, leading to environmental problems; wastes such as date pits represent on average 10% of the date fruit. Thus, there is an urgent need to find suitable applications for this waste. In spite of several studies on date palm cultivation, its utilization and the scope for therapeutic applications of date fruit, very few reviews are available, and they are limited to the chemistry and pharmacology of date fruits and to the phytochemical composition, nutritional significance and potential health benefits of date fruit consumption. In this context, the present review discusses the prospects of valorizing these date fruit processing by-products and wastes through fermentation and enzyme processing technologies, towards total utilization of this valuable commodity for the production of biofuels, biopolymers, biosurfactants, organic acids, antibiotics, industrial enzymes and other industrial chemicals.

Relevance: 20.00%

Abstract:

Modeling nonlinear systems using the Volterra series is a century-old method, but practical realizations were long hampered by hardware inadequate for the increased computational complexity it entails. Interest has recently been renewed in designing and implementing filters that can model much of the polynomial nonlinearity inherent in practical systems. The key advantage of resorting to the Volterra power series for this purpose is that the resulting nonlinear filters can work in parallel with existing LTI systems, yielding improved performance. This paper describes the inclusion of a quadratic predictor (nonlinearity of order 2) alongside a linear predictor in an analog source coding system. Analog coding schemes generally ignore the source generation mechanism and focus on high-fidelity reconstruction at the receiver. The widely used differential pulse code modulation (DPCM) scheme for speech transmission uses a linear predictor to estimate the next value of the input speech signal, but this linear system does not account for the inherent nonlinearities in speech signals arising from multiple reflections in the vocal tract. A quadratic predictor is therefore designed and implemented in parallel with the linear predictor to yield improved mean square error performance. The augmented speech coder is tested on speech signals transmitted over an additive white Gaussian noise (AWGN) channel.
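A second-order (quadratic) Volterra predictor of the general kind described above can be sketched as follows: the estimate of the next sample combines a linear kernel h1 over past samples with a quadratic kernel h2 over products of past samples. The kernel values here are made-up placeholders, not coefficients from the paper.

```python
def quadratic_predict(past, h1, h2):
    """Estimate x_hat[n] from past samples (past[0] = x[n-1]):
    x_hat[n] = sum_i h1[i]*past[i] + sum_{i,j} h2[i][j]*past[i]*past[j]."""
    linear = sum(h1[i] * past[i] for i in range(len(h1)))
    quad = sum(h2[i][j] * past[i] * past[j]
               for i in range(len(h2)) for j in range(len(h2[i])))
    return linear + quad

past = [0.5, -0.25]                # two most recent samples (toy values)
h1 = [0.9, -0.1]                   # linear (first-order) kernel
h2 = [[0.2, 0.0], [0.0, 0.1]]      # quadratic (second-order) kernel
x_hat = quadratic_predict(past, h1, h2)
```

In a DPCM-style coder of the sort the paper describes, the quantized difference between the actual sample and such a prediction is what gets transmitted; running the quadratic term in parallel with the linear one leaves the existing linear path untouched.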

Relevance: 20.00%

Abstract:

This paper discusses the implementation details of a child-friendly, good-quality English text-to-speech (TTS) system that is phoneme-based and concatenative, easy to set up and use, and requires little memory. Direct waveform concatenation and linear prediction coding (LPC) are used. Most existing TTS systems are unit-selection based and use standard speech databases available in neutral adult voices. Here, reduced memory is achieved by concatenating phonemes and by replacing phonetic wave files with their LPC coefficients. Linguistic analysis, rather than signal processing techniques, was used to reduce the algorithmic complexity. A sufficient degree of customization and generalization catering to the needs of the child user is included through provisions for vocabulary and voice selection. Prosody has also been incorporated. This inexpensive TTS system was implemented in MATLAB, with the synthesis presented through a graphical user interface (GUI), making it child-friendly. It can be used not only as an interesting language-learning aid for the normal child but also as a speech aid for the vocally disabled child. The quality of the synthesized speech was evaluated using the mean opinion score (MOS).
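The direct waveform concatenation idea can be illustrated with a toy sketch (not the paper's MATLAB implementation): each phoneme maps to a stored sample sequence, and an utterance is synthesized by joining the sequences in order. Phoneme labels and sample values are placeholders; the real system additionally stores LPC coefficients instead of raw waveforms to save memory.

```python
# Hypothetical mini-database: phoneme label -> recorded samples.
PHONEME_WAVES = {
    "k":  [0.1, 0.3, 0.2],
    "ae": [0.5, 0.7, 0.6, 0.4],
    "t":  [0.2, 0.1],
}

def synthesize(phonemes):
    """Concatenate the stored waveform of each phoneme in sequence."""
    out = []
    for p in phonemes:
        out.extend(PHONEME_WAVES[p])
    return out

samples = synthesize(["k", "ae", "t"])  # toy rendering of "cat"
```

A production concatenative system would also smooth the joins between units and apply prosody (pitch and duration adjustment), which this sketch omits.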

Relevance: 20.00%

Abstract:

The super-resolution problem is an inverse problem: producing a high-resolution (HR) image from one or more low-resolution (LR) observations. It involves upsampling the image, thereby increasing the maximum spatial frequency, and removing degradations that arise during image capture, namely aliasing and blurring. The work presented in this thesis is based on learning-based single-image super-resolution. In learning-based super-resolution algorithms, a training set or database of available HR images is used to construct the HR image corresponding to an image captured with an LR camera. In the training set, images are stored as patches or as coefficients of feature representations such as the wavelet transform or the DCT. Single-frame image super-resolution can be used in applications where a database of HR images is available; its advantage is that by skilfully creating a database of suitable training images, one can improve the quality of the super-resolved image. A new super-resolution method based on the wavelet transform is developed which outperforms conventional wavelet-based methods and standard interpolation methods. Super-resolution techniques based on a skewed anisotropic transform, the directionlet transform, are developed to convert a small low-resolution image into a large high-resolution one. The super-resolution algorithm not only increases the size but also reduces the degradations introduced during image capture. This method outperforms the standard interpolation methods and the wavelet methods, both visually and in terms of SNR, and artifacts such as aliasing and ringing are also eliminated. The super-resolution methods are implemented using both critically sampled and oversampled directionlets. Since the conventional directionlet transform is computationally complex, a lifting scheme is used for its implementation; the resulting single-image super-resolution method reduces computational complexity and thereby computation time. The quality of the super-resolved image depends on the wavelet basis used, so a study is conducted on the effect of different wavelets on the single-image super-resolution method. Finally, the new method, implemented on grey images, is extended to colour and noisy images.
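The core learning-based idea described above can be sketched minimally (this is not the thesis's wavelet/directionlet method): a database of (LR patch, HR patch) pairs is searched for the training LR patch nearest to an observed one, and its stored HR counterpart is used in the reconstruction. Patches here are tiny 1-D toy vectors; a real system uses 2-D image patches or transform coefficients.

```python
# Hypothetical training database: (low-res patch, high-res patch) pairs.
TRAINING_PAIRS = [
    ([0.0, 0.5], [0.0, 0.2, 0.5, 0.5]),
    ([0.5, 1.0], [0.5, 0.7, 1.0, 1.0]),
]

def nearest_hr_patch(lr_patch):
    """Return the HR patch whose LR key has the smallest squared distance."""
    def dist(pair):
        return sum((a - b) ** 2 for a, b in zip(lr_patch, pair[0]))
    return min(TRAINING_PAIRS, key=dist)[1]

hr = nearest_hr_patch([0.4, 0.9])  # closest to the second training pair
```

The quality of such a reconstruction depends directly on how well the training database covers the patches that actually occur, which is the advantage the text attributes to skilfully chosen training images.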

Relevance: 20.00%

Abstract:

Sensitisation of natural rubber latex by the addition of a small quantity of an anionic surfactant prior to the addition of a coacervant results in quick coagulation. Natural rubber prepared by this novel coagulation method shows improved raw rubber characteristics, better cure characteristics in gum and carbon black filled compounds, and improved mechanical properties compared with conventionally coagulated natural rubber. Compounds based on dried masterbatches, prepared by incorporating fluffy carbon black into different forms of soap-sensitised natural rubber latices (fresh latex, preserved field latex, centrifuged latex, and a blend of preserved field latex and skim latex), show improved cure characteristics and vulcanizate properties compared with an equivalent conventional dry rubber-fluffy carbon black compound. The latex masterbatch based vulcanizates show a higher level of crosslinking and better dispersion of filler. Vulcanizates based on fresh natural rubber latex dual-filler masterbatches containing a blend of carbon black and silica, prepared by the modified coagulation process, show very good mechanical and dynamic properties that can be correlated with low rolling resistance. The carbon black/silica/nanoclay tri-filler fresh natural rubber latex masterbatch vulcanizates show improved mechanical properties as the proportion of nanoclay increases up to 5 phr. Fresh natural rubber latex based carbon black-silica masterbatch/polybutadiene blend vulcanizates show superior mechanical and dynamic properties compared with equivalent vulcanizates prepared from dry natural rubber-filler (conventional dry mix)/polybutadiene blends.

Relevance: 20.00%

Abstract:

In natural languages with a high degree of word-order freedom, syntactic phenomena such as dependencies (subordinations) or valencies do not depend on the word order (that is, on the positions of the individual words). This means that some permutations of sentences in these languages are, in an important sense, syntactically equivalent. Here we study this phenomenon formally. Various types of j-monotonicity for restarting automata can serve as parameters for the degree of word-order freedom and for the complexity of word order in sentences (languages). We combine two types of parameters on computations of restarting automata: (1) the degree of j-monotonicity, and (2) the number of rewrites per cycle. We study these notions formally in order to obtain an adequate tool for modelling and comparing formal descriptions of (natural) languages with different degrees of word-order freedom and word-order complexity.