915 results for compression reinforcement


Relevance: 20.00%

Publisher:

Abstract:

The focus of this thesis is text data compression based on the fundamental coding scheme known as the American Standard Code for Information Interchange (ASCII). The research objective is the development of software algorithms that achieve significant compression of text data. Past and current compression techniques were thoroughly reviewed to allow a proper comparison between the compression results of the proposed technique and those of existing ones. The research problem stems from the need to achieve higher compression of text files in order to save valuable memory space and increase the transmission rate of these files. It was deemed necessary that the compression algorithm to be developed be effective even for small files and able to handle uncommon words, which are dynamically added to the dictionary once they are encountered. A critical design aspect of this compression technique is its compatibility with existing compression techniques: the developed algorithm can be used in conjunction with existing techniques to yield even higher compression ratios. This thesis demonstrates these capabilities and outcomes, and the research objective of achieving a higher compression ratio is attained.
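The dictionary mechanism described above, in which uncommon words are added dynamically as they are encountered, can be illustrated with a classic adaptive scheme. The sketch below is a minimal LZ78-style coder, not the thesis's algorithm; the function names are illustrative.

```python
# Minimal LZ78-style dictionary coder, given as an illustrative sketch of
# adaptive dictionary compression (NOT the thesis's algorithm): phrases not
# yet in the dictionary are added dynamically as they are encountered.

def lz78_compress(text):
    """Encode text as (dictionary_index, next_char) pairs."""
    dictionary = {"": 0}          # phrase -> index; index 0 is the empty phrase
    output, phrase = [], ""
    for ch in text:
        candidate = phrase + ch
        if candidate in dictionary:
            phrase = candidate    # keep extending the current phrase
        else:
            output.append((dictionary[phrase], ch))
            dictionary[candidate] = len(dictionary)   # learn the new phrase
            phrase = ""
    if phrase:                    # flush a trailing phrase, if any
        output.append((dictionary[phrase], ""))
    return output

def lz78_decompress(pairs):
    """Invert lz78_compress by rebuilding the same dictionary."""
    dictionary = [""]
    out = []
    for idx, ch in pairs:
        entry = dictionary[idx] + ch
        out.append(entry)
        dictionary.append(entry)
    return "".join(out)
```

Repeated phrases shrink to single index/character pairs, so compression improves as the dictionary grows, even on fairly small inputs.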

Relevance: 20.00%

Publisher:

Abstract:

Sustainability has become a central concern worldwide, and organizations increasingly seek to embed this philosophy in their processes, whether in products or services. In the present work, eco-composites were manufactured with an animal fiber (dog wool) that is currently discarded into the environment without any use. The project phases consisted of an initial treatment of the fibers with an alkaline solution (NaOH) at 0.05 mol/L to remove impurities, followed by the development of methods to combine these fibers (reinforcement) with castor oil polyurethane (matrix) into eco-composites at different fiber proportions (5%, 10%, 15% and 20%). Fiber properties were evaluated by SEM, XRD and FTIR analyses. The composites were produced by compression molding with dimensions of 30x30x1 cm. For characterization of the composites, the following tests were performed: mechanical tests (tensile, compression, Shore A hardness) according to the relevant standards, as well as water absorption, moisture regain and biodegradation tests. Thermal properties of the fibers and composites were assessed by TG, DSC, thermal conductivity, resistivity, heat capacity and thermal resistance measurements. Analysis of the results showed that the composite reinforced with 20% fiber exhibited the best thermal performance among the composites, and dimensional stability comparable to commercial thermal insulation. A balance in moisture absorption was also observed, with the highest absorption rate occurring in this same sample (20%). The micrographs show regions where the fibers interact with the polyurethane, filling the empty spaces. The hardness and compression tests indicate that, with an increasing percentage of fiber, the material becomes stiffer, so that a higher stress is required for forming.
Thus, across the tests performed on the eco-composites, the sample with the highest percentage of fiber reinforcement showed the best performance among the eco-composites, reaching values very close to those of the pure PU.

Relevance: 20.00%

Publisher:

Abstract:

Medical imaging technologies are experiencing a growth in usage and image resolution, namely in diagnostic systems that require a large set of images, like CT or MRI. Furthermore, legal restrictions impose that these scans be archived for several years. These facts have led to increasing storage costs in medical image databases and institutions. Thus, a demand for more efficient compression tools, used for archiving and communication, is arising. Currently, the DICOM standard, which makes recommendations for medical communications and image compression, recommends lossless encoders such as JPEG, RLE, JPEG-LS and JPEG2000. However, none of these encoders includes inter-slice prediction in its algorithm. This dissertation presents research work on medical image compression using the MRP encoder, one of the most efficient lossless image compression algorithms. Several processing techniques are proposed to adapt the input medical images to the encoder's characteristics. Two of these techniques, namely changing the alignment of slices for compression and a pixel-wise difference predictor, increased the compression efficiency of MRP by up to 27.9%. Inter-slice prediction support was also added to MRP, using uni- and bi-directional techniques, and the pixel-wise difference predictor was added to the algorithm. Overall, the compression efficiency of MRP was improved by 46.1%. These techniques allow for compression ratio savings of 57.1% compared to DICOM encoders and 33.2% compared to HEVC RExt Random Access, making MRP the most efficient of the encoders under study.
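The pixel-wise difference idea can be illustrated with a minimal sketch. This is not the MRP codec itself, just the underlying principle: adjacent slices of a CT or MRI volume are highly correlated, so slice-to-slice residuals are typically far more compressible than the raw slices.

```python
import numpy as np

# Illustrative sketch of pixel-wise inter-slice differencing (not the MRP
# codec itself): the first slice is stored as-is and every later slice is
# replaced by its difference from the previous one. The transform is exactly
# invertible, so a lossless coder applied to the residuals stays lossless.

def interslice_residuals(volume):
    """volume: (slices, h, w) integer array -> first slice plus slice diffs."""
    residuals = np.empty_like(volume)
    residuals[0] = volume[0]                  # first slice stored as-is
    residuals[1:] = volume[1:] - volume[:-1]  # each later slice as a difference
    return residuals

def reconstruct(residuals):
    """Exactly invert interslice_residuals via a running sum along slices."""
    return np.cumsum(residuals, axis=0)
```

In a real codec, the residual volume would then be handed to the entropy-coding stage instead of the raw pixels.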

Relevance: 20.00%

Publisher:

Abstract:

Medical imaging technology and applications are continuously evolving, dealing with images of increasing spatial and temporal resolutions, which allow easier and more accurate medical diagnosis. However, this increase in resolution demands a growing amount of data to be stored and transmitted. Despite the high coding efficiency achieved by the most recent image and video coding standards in lossy compression, they are not well suited for quality-critical medical image compression, where either near-lossless or lossless coding is required. In this dissertation, two different approaches to improve lossless coding of volumetric medical images, such as Magnetic Resonance and Computed Tomography, were studied and implemented using the latest standard, High Efficiency Video Coding (HEVC). In the first approach, the use of geometric transformations to perform inter-slice prediction was investigated. In the second approach, a pixel-wise prediction technique based on least-squares prediction, which exploits inter-slice redundancy, was proposed to extend the current HEVC lossless tools. Experimental results show a bitrate reduction of between 45% and 49% when compared with the DICOM recommended encoders, and of 13.7% when compared with standard HEVC.
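The least-squares inter-slice prediction idea can be sketched as follows. This is a hedged illustration, not the dissertation's HEVC extension: each pixel of the current slice is predicted from the 3x3 neighbourhood of the co-located pixel in the previous slice, with the nine weights fitted by ordinary least squares over the whole slice.

```python
import numpy as np

# Hedged sketch of least-squares inter-slice prediction (not the exact HEVC
# extension of the dissertation): predict each pixel of `curr` from the 3x3
# neighbourhood of the co-located pixel in `prev`, fitting the 9 weights by
# ordinary least squares over the whole slice.

def ls_predict_slice(prev, curr):
    """Return the least-squares prediction of `curr` from `prev`."""
    h, w = prev.shape
    pad = np.pad(prev.astype(float), 1, mode="edge")
    # Design matrix: one row per pixel, the 9 neighbourhood samples as features.
    cols = [pad[dy:dy + h, dx:dx + w].ravel() for dy in range(3) for dx in range(3)]
    X = np.stack(cols, axis=1)
    weights, *_ = np.linalg.lstsq(X, curr.ravel().astype(float), rcond=None)
    return (X @ weights).reshape(h, w)
```

In a codec, only the residual `curr - prediction` (plus the nine fitted weights) would then be entropy-coded.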

Relevance: 20.00%

Publisher:

Abstract:

We propose two new approaches to enhance the spectral compression process arising from nonlinear pulse propagation in an optical fibre. We numerically show that an additional sinusoidal temporal phase modulation of the pulse enables efficient reduction of the intensity level of side lobes in the spectrum. Another strategy is to select a regime of propagation in which normal group-velocity dispersion reshapes the initial stretched pulse to a near-Fourier-transform-limited rectangular waveform.

Relevance: 20.00%

Publisher:

Abstract:

We propose a new, simple approach to enhance the spectral compression process arising from nonlinear pulse propagation in an optical fiber. We numerically show that an additional sinusoidal temporal phase modulation of the pulse enables efficient reduction of the intensity level of the side lobes in the spectrum that are produced by the mismatch between the initial linear negative chirp of the pulse and the nonlinear positive chirp induced by self-phase modulation. A remarkable increase in both the extent of spectral narrowing and the quality of the compressed spectrum is afforded by the proposed approach across a wide range of experimentally accessible parameters.
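The basic narrowing mechanism can be illustrated numerically with a dispersionless Kerr-medium sketch (all parameter values are illustrative, not taken from the paper): a Gaussian pulse whose linear chirp is matched to the self-phase-modulation chirp at the pulse centre exhibits a markedly narrower spectrum after the nonlinear phase is applied.

```python
import numpy as np

# Toy numerical sketch of SPM-induced spectral compression in a dispersionless
# Kerr medium (parameter values are illustrative, not from the paper).

def rms_width(x, p):
    """RMS width of a distribution p over axis x."""
    p = p / p.sum()
    mean = (x * p).sum()
    return np.sqrt(((x - mean) ** 2 * p).sum())

t = np.linspace(-20.0, 20.0, 4096)
dt = t[1] - t[0]
T0, phi_max = 1.0, 3.0  # pulse duration and peak nonlinear phase (illustrative)

# Chirped Gaussian whose linear chirp matches the SPM chirp at the pulse centre.
field_in = np.exp(-t**2 / (2 * T0**2)) * np.exp(1j * (phi_max / T0**2) * t**2)
# Kerr effect: the intensity-dependent phase cancels the chirp near the centre.
field_out = field_in * np.exp(1j * phi_max * np.exp(-t**2 / T0**2))
# The paper's additional sinusoidal phase correction would be applied here,
# e.g. field_out *= np.exp(1j * eps * np.cos(omega_m * t)), to suppress the
# residual side lobes (eps and omega_m are hypothetical parameters).

freq = np.fft.fftshift(np.fft.fftfreq(t.size, d=dt))
spec_in = np.abs(np.fft.fftshift(np.fft.fft(field_in))) ** 2
spec_out = np.abs(np.fft.fftshift(np.fft.fft(field_out))) ** 2
w_in, w_out = rms_width(freq, spec_in), rms_width(freq, spec_out)
```

The residual, uncancelled chirp in the pulse wings is what generates the spectral side lobes that the sinusoidal phase modulation targets.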

Relevance: 20.00%

Publisher:

Abstract:

We propose a simple approach to enhance the spectral compression arising from nonlinear pulse propagation in a Kerr medium. We numerically show that an additional sinusoidal temporal phase modulation enables efficient reduction of the intensity level of spectral side lobes.

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND: Previous research has found accumulating evidence for atypical reward processing in autism spectrum disorders (ASD), particularly in the context of social rewards. Yet, this line of research has focused largely on positive social reinforcement, while little is known about the processing of negative reinforcement in individuals with ASD. METHODS: The present study examined neural responses to social negative reinforcement (a face displaying negative affect) and non-social negative reinforcement (monetary loss) in children with ASD relative to typically developing children, using functional magnetic resonance imaging (fMRI). RESULTS: We found that children with ASD demonstrated hypoactivation of the right caudate nucleus while anticipating non-social negative reinforcement and hypoactivation of a network of frontostriatal regions (including the nucleus accumbens, caudate nucleus, and putamen) while anticipating social negative reinforcement. In addition, activation of the right caudate nucleus during non-social negative reinforcement was associated with individual differences in social motivation. CONCLUSIONS: These results suggest that atypical responding to negative reinforcement in children with ASD may contribute to social motivational deficits in this population.

Relevance: 20.00%

Publisher:

Abstract:

A substantial amount of information on the Internet is present in the form of text. The value of this semi-structured and unstructured data has been widely acknowledged, with consequent scientific and commercial exploitation. The ever-increasing data production, however, pushes data analytic platforms to their limits. This thesis proposes techniques for more efficient textual big data analysis suitable for the Hadoop analytic platform. This research explores the direct processing of compressed textual data. The focus is on developing novel compression methods with a number of desirable properties to support text-based big data analysis in distributed environments. The novel contributions of this work include the following. Firstly, a Content-aware Partial Compression (CaPC) scheme is developed. CaPC makes a distinction between informational and functional content, in which only the informational content is compressed. Thus, the compressed data is made transparent to existing software libraries, which often rely on functional content to work. Secondly, a context-free bit-oriented compression scheme (Approximated Huffman Compression) based on the Huffman algorithm is developed. This uses a hybrid data structure that allows pattern searching in compressed data in linear time. Thirdly, several modern compression schemes have been extended so that the compressed data can be safely split with respect to logical data records in distributed file systems. Furthermore, an innovative two-layer compression architecture is used, in which each compression layer is appropriate for the corresponding stage of data processing. Peripheral libraries are developed that seamlessly link the proposed compression schemes to existing analytic platforms and computational frameworks, and also make the use of the compressed data transparent to developers. The compression schemes have been evaluated for a number of standard MapReduce analysis tasks using a collection of real-world datasets. In comparison with existing solutions, they have shown substantial improvement in performance and significant reduction in system resource requirements.
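The CaPC idea of compressing only informational content while leaving functional content untouched can be sketched minimally. The code below is an illustration in that spirit, not the thesis implementation; it assumes the input words never look like the generated codes.

```python
import re

# Illustrative sketch in the spirit of Content-aware Partial Compression
# (CaPC), NOT the thesis implementation: "informational" tokens (words) are
# replaced by short dictionary codes, while "functional" bytes (the delimiters
# that record-splitting tools rely on) pass through untouched, so the encoded
# text still splits on '\t' and '\n' exactly like the original.
# Assumption: input words never look like the generated codes (e.g. "#0").

DELIMS = ("", " ", "\t", "\n")

def capc_encode(text):
    codes, encoded = {}, []
    for token in re.split(r"([\t\n ])", text):   # split but keep delimiters
        if token in DELIMS:
            encoded.append(token)                # functional content: pass through
        else:
            codes.setdefault(token, f"#{len(codes)}")
            encoded.append(codes[token])         # informational content: coded
    return "".join(encoded), {v: k for k, v in codes.items()}

def capc_decode(encoded, table):
    return "".join(table.get(tok, tok) for tok in re.split(r"([\t\n ])", encoded))
```

Because every tab and newline survives encoding verbatim, a distributed file system can still split the compressed data at record boundaries without decoding it first.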

Relevance: 20.00%

Publisher:

Abstract:

Traditional heuristic approaches to the examination timetabling problem normally utilize a stochastic method during optimization for the selection of the next examination to be considered for timetabling within the neighbourhood search process. This paper presents a technique whereby the stochastic method is augmented with information from a weighted list gathered during the initial adaptive construction phase, with the purpose of intelligently directing examination selection. In addition, a reinforcement learning technique has been adapted to identify the most effective portions of the weighted list in terms of facilitating the greatest potential for overall solution improvement. The technique is tested against the 2007 International Timetabling Competition datasets, with solutions generated within the time frame specified by the competition organizers. The results are better than those of the competition winner on seven of the twelve datasets, while being competitive on the remaining five. This paper also shows experimentally how using reinforcement learning has improved upon our previous technique.
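A hedged sketch of the selection mechanism described above (the class name and update rule are illustrative, not the paper's): examinations are drawn with probability proportional to a weight, and a simple reinforcement-style update shifts weight toward examinations whose selection actually improved the solution.

```python
import random

# Illustrative reinforcement-weighted selection, NOT the paper's algorithm:
# a roulette-wheel draw over per-examination weights, with a reward/penalty
# update applied according to whether the resulting move improved the
# timetable. The reward and penalty values are arbitrary placeholders.

class WeightedSelector:
    def __init__(self, exams, reward=1.0, penalty=0.1):
        self.exams = list(exams)
        self.weights = {e: 1.0 for e in self.exams}   # start uniform
        self.reward, self.penalty = reward, penalty

    def pick(self, rng=random):
        """Roulette-wheel selection over the current weights."""
        r = rng.uniform(0, sum(self.weights.values()))
        for exam in self.exams:
            r -= self.weights[exam]
            if r <= 0:
                return exam
        return self.exams[-1]

    def feedback(self, exam, improved):
        """Reinforce choices that improved the solution; decay the rest."""
        if improved:
            self.weights[exam] += self.reward
        else:
            self.weights[exam] = max(0.1, self.weights[exam] - self.penalty)
```

Over many iterations, examinations whose moves keep paying off dominate the draw, which is the "intelligently directing examination selection" effect the abstract describes.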

Relevance: 20.00%

Publisher:

Abstract:

Behavior of granular material subjected to repeated-load triaxial compression tests is characterized by a model based on rate process theory. Starting from the Arrhenius equation of chemical kinetics, the relationship of temperature, shear stress, normal stress and volume change to deformation rate is developed. The proposed model equation includes these factors as a product of exponential terms. An empirical relationship between deformation and the cube root of the number of stress applications at constant temperature and normal stress is combined with the rate equation to yield an integrated relationship of temperature, deviator stress, confining pressure and number of deviator stress applications to axial strain. The experimental program consists of 64 repeated-load triaxial compression tests: 52 on untreated crushed stone and 12 on the same crushed stone treated with 4% asphalt cement. Results were analyzed with multiple linear regression techniques and show substantial agreement with the model equations. Experimental results fit the rate equation somewhat better than the integrated equation when all variable quantities are considered. The coefficient of the temperature term gives the activation enthalpy, which is about 4.7 kilocalories/mole for untreated material and 39.4 kilocalories/mole for asphalt-treated material. This indicates that the activation enthalpy of the untreated material is about that of the pore fluid. The proportionality coefficient of deviator stress may be used to measure flow unit volume. The volumes thus determined for untreated and asphalt-treated material are not substantially different. This may be coincidental, since comparison with flow unit volumes reported by others indicates that flow unit volume is related to the gradation of untreated material. The flow unit volume of asphalt-treated material may be related to asphalt cement content.
The proposed model equations provide a more rational basis for further studies of factors affecting deformation of granular materials under stress similar to that in pavement subjected to transient traffic loads.
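For reference, the Arrhenius equation the model starts from, together with a hedged reconstruction of the general shape of the rate equation described above (the coefficients alpha, beta and a, and the sign of the normal-stress term, are illustrative guesses inferred only from the description, not the thesis's actual equation):

```latex
% Arrhenius equation from chemical kinetics (the stated starting point):
k = A \exp\!\left(-\frac{\Delta H}{RT}\right)
% A product-of-exponential-terms rate equation of the kind described, with
% illustrative coefficients for shear stress \tau and normal stress \sigma_n:
\dot{\varepsilon} = A \exp\!\left(-\frac{\Delta H}{RT}\right)\exp(\alpha\tau)\,\exp(-\beta\sigma_n)
% Empirical cube-root law in the number of stress applications N:
\varepsilon = a\,N^{1/3}
```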

Relevance: 20.00%

Publisher:

Abstract:

Spinal cord injury (SCI) is a devastating neurological disorder that affects thousands of people each year. Although significant progress has been made in recent decades in understanding the molecular and cellular events underlying the nervous damage, spinal cord injury is still a highly disabling condition for which there is no curative therapy. People affected by spinal cord injuries manifest temporary or permanent dysfunction or loss of motor, sensory and/or autonomic functions, depending on the level of the spinal lesion. Currently, the incidence of this type of injury is approximately 15-40 cases per million people worldwide; its main causes are road accidents, falls, interpersonal violence and sports. In this work we hypothesized that hyaluronic acid (HA) is one of the components of the scar tissue formed after a compressive SCI, that it is likely synthesized by perilesional glial cells, and that it might support the permeation of the glial scar during the late phase of SCI. Much focus is currently drawn to the recovery of CNS function, which is made impossible after SCI by the high content of sulfated proteoglycans in the extracellular matrix. Rebalancing the ratio between these proteoglycans and hyaluronic acid could be one experimental therapy to re-permeate the glial scar tissue formed after SCI, making axonal regrowth and functional recovery possible. We therefore established a model of spinal cord compression in mice and studied the glial scar tissue, particularly through the characterization of the expression of enzymes related to HA metabolism and the resulting HA concentration at different distances from the lesion epicenter. Our results show that the lesion induced in mice is similar to human lesions in terms of histological features and behavioral outcomes, but these animals demonstrate an impressive spontaneous reorganization of the spinal cord tissue after injury that allows partial recovery of CNS functions. As regards the glial scar, changes were recorded at the level of mRNA expression of HA-metabolizing enzymes: after injury there was a decreased expression of HA synthases 1-2 (HAS1-2) and an increased expression of HAS3 mRNA, as well as of the enzymes responsible for HA catabolism, HYAL1-2. However, the amount of HA measured by ELISA was unchanged after injury, so this result cannot be explained by the changes in enzyme expression alone. At two weeks after SCI, we found HA synthesized by reactive astrocytes and probably by other cells such as microglia, as suggested by the co-localization of HA with GFAP+ and IBA1+ cells.