939 results for compression wood
Abstract:
Inventory of goods and chattels belonging to Samuel Wood and delivered to the Honourable William Dickson (double-sided, handwritten page), Dec. 10, 1828.
Abstract:
Thesis digitized by the Records Management and Archives Division of the Université de Montréal
Abstract:
Rubber solutions were prepared and used for bonding wood pieces. The effect of varying the chlorinated natural rubber (CNR) and phenol-formaldehyde (PF) resin content of the adhesive solutions on lap shear strength was determined. Natural rubber and neoprene-based adhesive solutions were compared for their lap shear strength. The storage stability of the prepared adhesive was determined, and the change in lap shear strength before and after immersion in cold water, hot water, acid, and alkali was tested. The bonding character of these adhesives was compared with different commercially available solution adhesives. The room-temperature aging resistance of wood joints was also determined. In all the studies, the adhesive prepared in the laboratory was found to be superior to the commercial adhesives.
Abstract:
Faculty of Marine Sciences, Cochin University of Science and Technology
Abstract:
Faculty of Marine Sciences, Cochin University of Science and Technology
Abstract:
This research work was carried out to characterize the wastes from the natural rubber and rubber wood processing industries and to investigate their utilization for biomethanation. Environmental contamination is an inevitable consequence of human activity. The liquid and solid wastes from natural-rubber-based industries were characterized and their use for the production of biogas investigated, with a view to conserving conventional energy and mitigating environmental degradation. The rubber tree (Hevea brasiliensis Muell. Arg.) is the most important commercial source of natural rubber in India. Recently, pollution from rubber processing factories has become very serious due to the introduction of modern methods and centralized group processing practices. The possibility of using the spent slurry as organic manure is discussed. At the 10 percent level of PSD, the activity of cellulolytic, acid-producing, proteolytic, lipolytic and methanogenic bacteria was greatest in the middle stage of methanogenesis. The liquid wastes from rubber processing (SPE), used as diluents in combination with PSD, promoted greater biogas production with a high methane content in the gas. The factors that favour methane production, such as TS, VS, cellulose and hemicellulose degradation, were enhanced in this treatment, which led to higher methane biogenesis. The results further highlight ways and means to use agricultural wastes as alternative sources of energy.
Abstract:
The work is intended to study the following important aspects of document image processing and to develop new methods: (1) segmentation of document images using an adaptive interval-valued neuro-fuzzy method; (2) improvement of the segmentation procedure using a Simulated Annealing technique; (3) development of optimized compression algorithms using a Genetic Algorithm and a parallel Genetic Algorithm; (4) feature extraction of document images; (5) development of interval-valued (IV) fuzzy rules. This work also supports feature extraction and foreground/background identification. The proposed work incorporates evolutionary and hybrid methods for the segmentation and compression of document images. A study of the different neural networks used in image processing and of developments in the area of fuzzy logic is also carried out in this work.
Abstract:
Shrimp grow-out systems under zero-water-exchange mode demand constant remediation of total ammonia nitrogen (TAN) and NO₂⁻–N to protect the crop. To address this issue, an inexpensive and user-friendly technology using immobilized nitrifying bacterial consortia (NBC) as bioaugmentors has been developed and proposed for adoption in shrimp culture systems. Indigenous NBC stored at 4 °C were activated at room temperature (28 °C) and cultured in a 2 L bench-top fermentor. The consortia, after enumeration by epifluorescence microscopy, were immobilized on delignified wood particles (300–1500 μm) of the softwood tree Ailanthus altissima, having a surface area of 1.87 m² g⁻¹. Selection of wood particles as the substratum was based on adsorption of NBC onto the particles, biofilm formation, and their subsequent nitrification potential. Immobilization could be achieved within 72 h with an initial cell density of 1×10⁵ cells mL⁻¹. On experimenting with the lowest dosage of 0.2 g (wet weight) of immobilized NBC in 20 L of seawater, a TAN removal rate of 2.4 mg L⁻¹ within three days was observed. An NBC immobilization device could be developed for on-site generation of the bioaugmentor preparation as required. The product of immobilization never exhibited a lag phase when transferred to fresh medium. The extent of nitrification in a simulated system was two times the rate observed in the control systems, suggesting efficacy in real-life situations. The products of nitrification in all experiments were undetectable due to denitrifying potency, which made the NBC an ideal option for biological nitrogen removal. The immobilized NBC thus generated has been named TANOX (Total Ammonia Nitrogen Oxidizer).
Abstract:
Extending IPv6 to IEEE 802.15.4-based Low power Wireless Personal Area Networks requires efficient header compression mechanisms to adapt to their limited bandwidth, memory and energy constraints. This paper presents an experimental evaluation of an improved header compression scheme which provides better compression of IPv6 multicast addresses and UDP port numbers compared to existing mechanisms. This scheme outperforms the existing compression mechanism in terms of data throughput of the network and energy consumption of nodes. It enhances throughput by up to 8% and reduces transmission energy of nodes by about 5%.
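The abstract does not reproduce the encoding details of the improved scheme. As a hedged illustration of the general 6LoWPAN idea it builds on (RFC 6282-style context elision), the Python sketch below drops the bytes of an IPv6 address that the receiver can rebuild from shared context; the names and the single link-local context are illustrative, not the paper's actual mechanism.

```python
import ipaddress

# Illustrative shared context: both endpoints know the fe80::/64 prefix,
# so those 8 bytes never need to travel over the constrained link.
LINK_LOCAL_PREFIX = bytes.fromhex("fe80000000000000")

def compress_address(addr: str) -> bytes:
    """Return only the address bytes the receiver cannot infer."""
    raw = ipaddress.IPv6Address(addr).packed        # full 16 bytes
    if raw[:8] == LINK_LOCAL_PREFIX:
        return raw[8:]                              # 8-byte interface ID only
    return raw                                      # no context match: send all

def decompress_address(payload: bytes) -> str:
    """Rebuild the full address from the compressed form."""
    if len(payload) == 8:                           # prefix was elided
        payload = LINK_LOCAL_PREFIX + payload
    return str(ipaddress.IPv6Address(payload))
```

Halving the address bytes on every packet is where the throughput and energy gains of such schemes come from: fewer bytes transmitted per IEEE 802.15.4 frame.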
Abstract:
This work proposes a parallel genetic algorithm for compressing scanned document images. A fitness function based on the Hausdorff distance determines the terminating condition. The algorithm also helps to locate the text lines. A higher compression ratio is achieved with less distortion.
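The abstract names a Hausdorff-distance fitness function but gives no formula. A minimal sketch of the symmetric Hausdorff distance between two point sets, with an illustrative (assumed, not the authors') fitness wrapper that rewards reconstructions whose ink pixels lie close to the original's:

```python
import math

def directed_hausdorff(a, b):
    """Greatest distance from any point of a to its nearest point in b."""
    return max(min(math.dist(p, q) for q in b) for p in a)

def hausdorff_distance(a, b):
    """Symmetric Hausdorff distance between two 2-D point sets."""
    return max(directed_hausdorff(a, b), directed_hausdorff(b, a))

def fitness(original_pixels, reconstructed_pixels):
    """Illustrative fitness: smaller shape distance scores higher (max 1.0)."""
    return 1.0 / (1.0 + hausdorff_distance(original_pixels, reconstructed_pixels))
```

The brute-force nested min/max is O(|a|·|b|); a real coder over page-sized pixel sets would use a distance transform instead, but the quantity computed is the same.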
Abstract:
In this paper, an improved technique for evolving wavelet coefficients refined for compression and reconstruction of fingerprint images is presented. The FBI fingerprint compression standard [1, 2] uses the cdf 9/7 wavelet filter coefficients. The lifting scheme is an efficient way to represent classical wavelets with fewer filter coefficients [3, 4]. Here a Genetic Algorithm (GA) is used to evolve better lifting filter coefficients for the cdf 9/7 wavelet, to compress and reconstruct fingerprint images with better quality. Since the lifting filter coefficients are few in number compared to the corresponding classical wavelet filter coefficients, they are evolved at a faster rate using the GA. A better reconstructed image quality in terms of Peak Signal-to-Noise Ratio (PSNR) is achieved with the best lifting filter coefficients evolved for a compression ratio of 16:1. These evolved coefficients perform well for other compression ratios as well.
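The lifting structure the GA searches over can be sketched directly. The constants below are the classical cdf 9/7 lifting values, i.e. the baseline a GA would try to improve; the boundary handling by index clamping is an illustrative simplification, not necessarily the authors' choice.

```python
import numpy as np

# Classical cdf 9/7 lifting constants: only five numbers parameterize the
# whole wavelet, which is why a GA can search this space quickly.
ALPHA, BETA = -1.586134342, -0.052980118
GAMMA, DELTA = 0.882911076, 0.443506852
ZETA = 1.149604398

def fwd_97(x):
    """One level of the cdf 9/7 lifting transform (even-length 1-D signal)."""
    s, d = x[0::2].astype(float), x[1::2].astype(float)
    n = len(d)
    for i in range(n):                           # predict 1
        d[i] += ALPHA * (s[i] + s[min(i + 1, n - 1)])
    for i in range(n):                           # update 1
        s[i] += BETA * (d[max(i - 1, 0)] + d[i])
    for i in range(n):                           # predict 2
        d[i] += GAMMA * (s[i] + s[min(i + 1, n - 1)])
    for i in range(n):                           # update 2
        s[i] += DELTA * (d[max(i - 1, 0)] + d[i])
    return s * ZETA, d / ZETA                    # approximation, detail

def inv_97(s, d):
    """Exact inverse: undo the lifting steps in reverse order."""
    s, d = s / ZETA, d * ZETA
    n = len(d)
    for i in range(n):                           # undo update 2
        s[i] -= DELTA * (d[max(i - 1, 0)] + d[i])
    for i in range(n):                           # undo predict 2
        d[i] -= GAMMA * (s[i] + s[min(i + 1, n - 1)])
    for i in range(n):                           # undo update 1
        s[i] -= BETA * (d[max(i - 1, 0)] + d[i])
    for i in range(n):                           # undo predict 1
        d[i] -= ALPHA * (s[i] + s[min(i + 1, n - 1)])
    x = np.empty(2 * n)
    x[0::2], x[1::2] = s, d
    return x
```

Whatever values a GA substitutes for the five constants, the transform stays perfectly invertible, because the inverse simply reverses each lifting step; the GA only changes how well energy is compacted into the approximation band.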
Abstract:
In this article, techniques are presented for faster evolution of wavelet lifting coefficients for fingerprint image compression (FIC). In addition to increasing the computational speed by 81.35%, the coefficients performed much better than those reported in the literature. Generally, full-size images are used for evolving wavelet coefficients, which is time consuming. To overcome this, in this work, wavelets were evolved with resized, cropped, resized-average and cropped-average images. On comparing the peak signal-to-noise ratios (PSNR) offered by the evolved wavelets, it was found that the cropped images outperformed the resized images and are on par with the results reported to date. Wavelet lifting coefficients evolved from an average of four 256×256 centre-cropped images took less than one fifth of the evolution time reported in the literature and produced an improvement of 1.009 dB in average PSNR. Improvement in average PSNR was observed for other compression ratios (CR) and for degraded images as well. The proposed technique gave better PSNR at various bit rates with the set partitioning in hierarchical trees (SPIHT) coder. These coefficients also performed well with other fingerprint databases.
Abstract:
This paper explains the Genetic Algorithm (GA) evolution of an optimized wavelet that surpasses the cdf 9/7 wavelet for fingerprint compression and reconstruction. Optimized wavelets have already been evolved in previous works in the literature, but those approaches are highly computationally complex and time consuming. Therefore, in this work, a simple approach is taken to reduce the computational complexity of the evolution algorithm. Coefficients evolved from a training set of three 32×32 cropped images performed much better than those reported in the literature: an average improvement of 1.0059 dB in PSNR over the classical cdf 9/7 wavelet was achieved across 80 fingerprint images, and the computational speed was increased by 90.18%. The coefficients evolved for a compression ratio (CR) of 16:1 also yielded better average PSNR at other CRs. Improvement in average PSNR was observed for degraded and noisy images as well.
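PSNR, the quality measure quoted throughout these abstracts, is straightforward to state in code. A minimal version, where `peak=255.0` assumes 8-bit images:

```python
import numpy as np

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means better reconstruction."""
    err = np.asarray(original, float) - np.asarray(reconstructed, float)
    mse = np.mean(err ** 2)
    if mse == 0:
        return float("inf")                  # identical images
    return 10.0 * np.log10(peak ** 2 / mse)
```

Because PSNR is logarithmic, the ~1 dB gains reported above correspond to roughly a 20% reduction in mean squared error at the same compression ratio.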
Abstract:
The thesis explores the area of still-image compression. Image compression techniques can be broadly classified into lossless and lossy compression. The most common lossy compression techniques are based on transform coding, vector quantization and fractals. Transform coding is the simplest of these and generally employs reversible transforms such as the DCT and DWT. The Mapped Real Transform (MRT) is an evolving integer transform based on real additions alone. The present research work aims at developing new image compression techniques based on MRT. Most transform coding techniques employ fixed-block-size image segmentation, usually 8×8. Hence, fixed-block-size transform coding is implemented using MRT, and the merits and demerits are analyzed for both 8×8 and 4×4 blocks. The N² unique MRT coefficients for each block are computed using templates. Considering the merits and demerits of fixed-block-size transform coding, a hybrid form of these techniques is implemented and is found to perform better than the fixed-block-size coders; this suggests that making the block size adaptive can improve performance further. In adaptive-block-size coding, the block size may vary from the size of the image down to 2×2, so computing MRT using templates is impractical due to memory requirements. An adaptive transform coder based on the Unique MRT (UMRT), a compact form of MRT, is therefore implemented to obtain better performance in terms of PSNR and HVS quality. The suitability of MRT for vector quantization of images is then examined, and a UMRT-based Classified Vector Quantization (CVQ) is implemented, in which the edges in the images are identified and classified by a UMRT-based criterion. Based on the above experiments, a new technique named "MRT-based Adaptive Transform Coder with Classified Vector Quantization (MATC-CVQ)" is developed. Its performance is evaluated and compared against existing techniques, including standard JPEG and Shapiro's well-known Embedded Zerotree Wavelet (EZW) coder; the proposed technique is found to give better performance for the majority of images.
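The MRT itself is not defined in the abstract, so no faithful implementation can be given here. As a generic illustration of the fixed 8×8 block transform coding loop the thesis describes, the sketch below uses an orthonormal DCT-II matrix as a stand-in transform and keeps only the largest coefficients per block, which is the lossy round trip any block transform coder performs.

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II basis matrix (a stand-in; MRT is not shown here)."""
    k, i = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    c[0] /= np.sqrt(2.0)                       # DC row normalization
    return c

def code_block(block, keep=16):
    """Transform an 8x8 block, zero all but the `keep` largest coefficients,
    and invert: the basic fixed-block-size transform coding round trip."""
    C = dct_matrix(8)
    coeffs = C @ block @ C.T                   # separable 2-D transform
    thresh = np.sort(np.abs(coeffs).ravel())[-keep]
    coeffs = np.where(np.abs(coeffs) >= thresh, coeffs, 0.0)
    return C.T @ coeffs @ C                    # reconstruct the block
```

Keeping 16 of 64 coefficients gives a 4:1 reduction before entropy coding; the thesis's hybrid and adaptive coders vary the block size and coefficient selection instead of fixing them as this sketch does.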