822 results for Compression rates
Abstract:
The year-on-year growth in computer processing power has made it possible to process spectral images, which are more detailed than grayscale and RGB colour images, in reasonable time and without great cost. The problem, however, is that storage and transmission media have not kept pace with processing power. The solution to this problem is to compress spectral images for storage and transmission. This work presents a method in which a spectral image is compressed in two stages: first by clustering with a self-organizing map (SOM), and then by continuing the compression with conventional methods. The compression ratios obtained are significant while the distortion remains tolerable. The work was carried out at the Research Laboratory of Information Processing of the Department of Information Technology at Lappeenranta University of Technology, as part of a larger research project on image compression.
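To make the two-stage idea concrete, the sketch below vector-quantizes each pixel spectrum with a SOM codebook and then entropy-codes the index map with a conventional compressor. It is only an illustration under assumed parameters (an 8 x 8 map, zlib as the second stage, the third-party minisom package), not the implementation described in the thesis.

```python
import numpy as np
import zlib
from minisom import MiniSom  # third-party SOM implementation (assumed choice)

def compress_spectral_image(cube, som_rows=8, som_cols=8, iters=5000):
    """cube: H x W x B spectral image. Stage 1: SOM codebook; stage 2: zlib on the index map."""
    h, w, b = cube.shape
    spectra = cube.reshape(-1, b).astype(float)
    som = MiniSom(som_rows, som_cols, b, sigma=1.0, learning_rate=0.5)
    som.train_random(spectra, iters)                       # learn the spectral codebook
    winners = np.array([som.winner(s) for s in spectra])   # best-matching unit per pixel
    indices = (winners[:, 0] * som_cols + winners[:, 1]).astype(np.uint8)
    packed = zlib.compress(indices.tobytes())              # conventional second-stage coding
    ratio = cube.nbytes / (len(packed) + som.get_weights().nbytes)
    return som.get_weights(), packed, ratio
```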
Abstract:
Nowadays many electronic devices, such as cellphones, digital cameras, video cameras, and digital televisions, support digital video. Raw video, however, requires a huge amount of data, millions of bits, to be represented as captured. Storing it in this primary form would demand enormous disk space, and transmitting it would demand enormous bandwidth, so video compression is essential to make storage and transmission feasible. Motion estimation is a technique used in the video coder that exploits the temporal redundancy present in video sequences to reduce the amount of data needed to represent the information. This work presents a hardware architecture for a motion estimation module for high-resolution video compliant with the H.264/AVC standard. H.264/AVC is the most advanced video coding standard, with several new features that allow it to achieve high compression rates. The architecture presented in this work was designed for a high degree of data reuse; the adopted data reuse scheme reduces the bandwidth required to execute motion estimation. Motion estimation is responsible for the largest share of the gains obtained with the H.264/AVC standard, so this module is essential for the performance of the final video coder. This work is part of the Rede H.264 project, which aims to develop Brazilian technology for the Brazilian System of Digital Television.
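As background for the module described above, the snippet below shows the basic full-search, SAD-based block matching that a motion estimator performs. It is a software illustration only, not the hardware architecture or the data-reuse scheme of this work, and the block size and search range are assumptions.

```python
import numpy as np

def sad(a, b):
    """Sum of absolute differences between two equally sized blocks."""
    return int(np.abs(a.astype(np.int32) - b.astype(np.int32)).sum())

def best_motion_vector(ref, cur, y, x, block=16, search=8):
    """Exhaustively search +/- `search` pixels in the reference frame for the block at (y, x)."""
    target = cur[y:y + block, x:x + block]
    best_cost, best_mv = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ry, rx = y + dy, x + dx
            if 0 <= ry and 0 <= rx and ry + block <= ref.shape[0] and rx + block <= ref.shape[1]:
                cost = sad(target, ref[ry:ry + block, rx:rx + block])
                if best_cost is None or cost < best_cost:
                    best_cost, best_mv = cost, (dy, dx)
    return best_mv, best_cost
```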
Abstract:
Different methods of cutting fluid application were used in turning a difficult-to-machine steel (SAE EV-8). A semi-synthetic cutting fluid was applied using the conventional method, minimum quantity of cutting fluid (MQCF), and pulverization; a vegetable-oil lubricant was also applied by the minimum quantity method (MQL). Thereafter, a high-pressure cutting fluid jet (3.0 MPa) was applied individually to the following regions: the chip-tool interface, the top surface of the chip, and the tool-workpiece contact. Two further methods were used: a combination of conventional application with the chip-tool interface jet and, finally, the three jets applied simultaneously. To carry out these tests, a high-pressure system was set up using a piston pump to generate the cutting fluid jet, a Venturi for fluid application (MQCF and MQL), and a nozzle for cutting fluid pulverization. The output variables analyzed included tool life, surface roughness, cutting tool temperature, cutting force, chip form, chip compression rate, and machined specimen microstructure. Tool life increases and cutting force decreases with the application of the cutting fluid jet, mainly when it is directed at the chip-tool interface. Excluding the jet methods, the conventional method appears more efficient than the other low-pressure methods. © (2013) Trans Tech Publications, Switzerland.
Abstract:
The Morse code, invented in 1838 for use in telegraphy, is one of the first examples of the practical use of data compression [1]: the most common letters of the alphabet are assigned shorter codes than the rest. From 1940 onward, following the development of information theory and the creation of the first computers, compressing information has been a constant and fundamental challenge for researchers of all kinds. The greater our understanding of the meaning of information, the greater our success at compressing it. In the case of multimedia information, its nature allows lossy compression, reaching compression rates impossible for lossless algorithms. These "recent" lossy algorithms have mainly been based on transforming the information to the frequency domain and discarding part of the information in that domain. Transforming to the frequency domain has advantages but also involves unavoidable computational costs.
This thesis introduces a new multimedia compression algorithm called "LHE" (Logarithmical Hopping Encoding) that does not require a transformation to the frequency domain but works in the spatial domain. This makes LHE a linear algorithm of reduced computational complexity. The results of the algorithm are promising, outperforming the JPEG standard in both quality and speed. The algorithm builds on the physiological response of the human eye to light: the eye, like the other senses, responds to the logarithm of the signal, in accordance with Weber's law. The algorithm consists of several stages. One is the measurement of "perceptual relevance", a new metric that allows us to measure how relevant the information is in the subject's mind and, on that basis, to degrade its content to a greater or lesser extent through what I have called "elastic downsampling". The elastic downsampling stage is an unprecedented new technique in digital image processing: it takes more or fewer samples in different areas of an image according to their perceptual relevance. This thesis takes the first steps toward what may become a new standard multimedia compression format (image, video, and audio), free of patents and with high performance in both speed and quality.
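The toy functions below only illustrate the Weber-law intuition the abstract appeals to: prediction errors are mapped onto a small set of logarithmically (geometrically) spaced "hops", so equal perceptual steps correspond to multiplicative luminance steps. The hop values and ratio are invented for illustration; this is not the LHE coder itself, nor its elastic downsampling stage.

```python
import numpy as np

def hop_values(smallest_hop=4.0, ratio=2.0, hops_per_side=4):
    """Positive hop amplitudes growing geometrically, e.g. 4, 8, 16, 32 (assumed values)."""
    return smallest_hop * ratio ** np.arange(hops_per_side)

def quantize_error(prediction, sample, hops=hop_values()):
    """Snap the prediction error to the nearest hop, mimicking a logarithmic response."""
    err = float(sample) - float(prediction)
    candidates = np.concatenate(([-h for h in hops[::-1]], [0.0], hops))
    return float(candidates[np.argmin(np.abs(candidates - err))])
```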
Abstract:
Automatic Text Summarization has been shown to be useful for Natural Language Processing tasks such as Question Answering or Text Classification, and for related fields of computer science such as Information Retrieval. Since Geographical Information Retrieval can be considered an extension of Information Retrieval, summary generation could be integrated into these systems as an intermediate stage whose purpose is to reduce document length. In this manner, the access time for information searching is improved while relevant documents are still retrieved. Therefore, in this paper we propose the generation of two types of summaries (generic and geographical), applying several compression rates, in order to evaluate their effectiveness for the Geographical Information Retrieval task. The evaluation was carried out using GeoCLEF as the evaluation framework and following an Information Retrieval perspective, without considering the geo-reranking phase commonly used in these systems. Although single-document summarization did not perform well in general, the slight improvements obtained for some types of the proposed summaries, particularly those based on geographical information, lead us to believe that integrating Text Summarization with Geographical Information Retrieval may be beneficial; consequently, the experimental set-up developed in this work serves as a basis for further investigations in this field.
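A minimal sketch of a compression-rate-driven extractive summarizer is given below. The term-frequency scoring is an assumption made for illustration; the paper's generic and geographical summarizers are not reproduced here, only the way a compression rate controls summary length.

```python
import re
from collections import Counter

def summarize(text, compression_rate=0.5):
    """Keep the highest-scoring sentences so that (1 - compression_rate) of them remain."""
    sentences = [s.strip() for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]
    tf = Counter(re.findall(r'\w+', text.lower()))
    scored = [(sum(tf[w] for w in re.findall(r'\w+', s.lower())) / max(len(s.split()), 1), i)
              for i, s in enumerate(sentences)]
    keep = max(1, round(len(sentences) * (1 - compression_rate)))
    chosen = sorted(i for _, i in sorted(scored, reverse=True)[:keep])
    return ' '.join(sentences[i] for i in chosen)
```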
Abstract:
Compressed covariance sensing using quadratic samplers is gaining increasing interest in the recent literature. The covariance matrix often plays the role of a sufficient statistic in many signal and information processing tasks. However, owing to the large dimension of the data, it may become necessary to obtain a compressed sketch of the high-dimensional covariance matrix to reduce the associated storage and communication costs. Nested sampling has been proposed in the past as an efficient sub-Nyquist sampling strategy that enables perfect reconstruction of the autocorrelation sequence of Wide-Sense Stationary (WSS) signals, as though they were sampled at the Nyquist rate. The key idea behind nested sampling is to exploit properties of the difference set that naturally arises in the quadratic measurement model associated with covariance compression. In this thesis, we focus on developing novel versions of nested sampling for low-rank Toeplitz covariance estimation and for phase retrieval, where the latter problem finds many applications in high-resolution optical imaging, X-ray crystallography, and molecular imaging. The problem of low-rank compressive Toeplitz covariance estimation is first shown to be fundamentally related to that of line spectrum recovery. In the absence of noise, this connection can be exploited to develop a particular kind of sampler, called the Generalized Nested Sampler (GNS), that can achieve optimal compression rates. In the presence of bounded noise, we develop a regularization-free algorithm that provably leads to stable recovery of the high-dimensional Toeplitz matrix from its order-wise minimal sketch acquired using a GNS. Contrary to existing TV-norm and nuclear-norm based reconstruction algorithms, our technique does not use any tuning parameters, which can be of great practical value. The idea of nested sampling also finds a surprising use in the problem of phase retrieval, which has been of great interest in recent times for its convex formulation via PhaseLift. By using another modified version of nested sampling, namely the Partial Nested Fourier Sampler (PNFS), we show that, with probability one, it is possible to achieve a certain conjectured lower bound on the necessary measurement size. Moreover, for sparse data, an l1-minimization based algorithm is proposed that can lead to stable phase retrieval using an order-wise minimal number of measurements.
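The difference-set property that nested sampling relies on can be seen in a few lines: a small two-level nested array generates every lag up to about n1·n2 from only n1 + n2 physical samples. This is the textbook two-level construction, shown only as background; the GNS and PNFS samplers developed in the thesis are generalizations of it.

```python
import numpy as np

def nested_positions(n1, n2):
    """Two-level nested array: n1 dense samples plus n2 sparse samples at spacing n1 + 1."""
    dense = np.arange(1, n1 + 1)
    sparse = (n1 + 1) * np.arange(1, n2 + 1)
    return np.concatenate((dense, sparse))

def difference_set(positions):
    """All pairwise differences; for a nested array these fill a contiguous range of lags."""
    return np.unique(positions[:, None] - positions[None, :])

pos = nested_positions(3, 3)   # 6 physical samples: [1, 2, 3, 4, 8, 12]
lags = difference_set(pos)     # every integer lag from -11 to 11
```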
Abstract:
Over the last decade, there has been a trend for water utility companies to make water distribution networks more intelligent by incorporating IoT technologies, in order to improve their quality of service, reduce water waste, minimize maintenance costs, and so on. Current state-of-the-art solutions use expensive, power-hungry deployments to monitor and transmit water network states periodically in order to detect anomalous behaviours such as water leakage and bursts. However, more than 97% of water network assets are far from mains power and are often in geographically remote, underpopulated areas, which makes current approaches unsuitable for the next generation of more dynamic, adaptive water networks. Battery-driven wireless sensor/actuator based solutions are theoretically the perfect choice to support next-generation water distribution. In this paper, we present an end-to-end water leak localization system which exploits edge processing and enables the use of battery-driven sensor nodes. Our system combines a lightweight edge anomaly detection algorithm based on compression rates with an efficient localization algorithm based on graph theory. The edge anomaly detection and localization elements of the system produce a timely and accurate localization result and reduce communication by 99% compared to traditional periodic communication. We evaluated our schemes by deploying non-intrusive sensors measuring vibration data on a real-world water test rig on which controlled leakage and burst scenarios were implemented.
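The fragment below is one plausible reading of a compression-rate-based edge test (assumed here to mean the compressibility of a vibration window under a generic compressor such as zlib); the threshold and window handling are illustrative, not the deployed algorithm.

```python
import zlib
import numpy as np

def compression_rate(window):
    """Ratio of raw to compressed size for a window of vibration samples."""
    raw = np.asarray(window, dtype=np.float32).tobytes()
    return len(raw) / len(zlib.compress(raw))

def is_anomalous(window, baseline_rate, tolerance=0.2):
    """Flag windows that are markedly less compressible than the learned baseline."""
    return compression_rate(window) < baseline_rate * (1.0 - tolerance)
```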
Abstract:
Purpose: To analyse prospectively the long-term results of Gamma Knife surgery (GKS) in patients with trigeminal neuralgia (TN) secondary to a megadolichobasilar artery (MBA). Methods: Between December 1992 and November 2010, 33 consecutive patients presenting with TN secondary to MBA underwent GKS and were followed prospectively at Timone University Hospital. Follow-up was at least 1 year in 29 patients. The median age was 74.90 years (range, 51 to 90). GKS was typically performed using MR and CT imaging guidance and a single 4-mm isocenter. The median prescription dose (at the 100% isodose) was 90 Gy (range, 80 to 90). The target was placed on the cisternal portion of the Vth nerve. Clinical and dosimetric parameters were analyzed. GKS was the first surgical procedure in 23 patients (79.31%). Results: The median follow-up period was 46.12 months (range, 12.95 to 157.93). All 29 patients (100%) were initially pain free, within a median time of 13.5 days (range, 0 to 240). The probability of remaining pain free at 0.5, 1, and 2 years was 93.1%, 79.3%, and 75.7%, respectively, the curve reaching its plateau at this point. Seven patients (24.13%) experienced a recurrence, with a median delay of 10.75 months (range, 3.77 to 12.62). The actuarial rate of recurrence was not higher than in our population with essential TN, although atypical pain was associated with a much higher risk of recurrence (HR = 6.92, p = 0.0117). The actuarial rate of hypoesthesia was 4.3% at 0.5 years, reached 13% at 1 year, and remained stable up to 12 years, with a median delay of onset of 7 (5-12) months. Female patients had a statistically much lower probability of developing facial numbness (p = 0.03). No patient reported bothersome hypoesthesia. Conclusion: Retrogasserian, high-dose GKS turned out to be very safe, with only 13.04% hypoesthesia, which was never disabling (0%), while achieving high-quality pain control. The majority of the patients demonstrated a prolonged effect of radiosurgery in the absence of any trigeminal nerve disturbance.
Abstract:
In this article, techniques are presented for faster evolution of wavelet lifting coefficients for fingerprint image compression (FIC). In addition to increasing computational speed by 81.35%, the evolved coefficients performed much better than the coefficients reported in the literature. Generally, full-size images are used for evolving wavelet coefficients, which is time consuming. To overcome this, in this work wavelets were evolved with resized, cropped, resized-average and cropped-average images. On comparing the peak signal-to-noise ratios (PSNR) offered by the evolved wavelets, it was found that the cropped images outperformed the resized images and are on par with the results reported to date. Wavelet lifting coefficients evolved from an average of four 256 × 256 centre-cropped images took less than one fifth of the evolution time reported in the literature and produced an improvement of 1.009 dB in average PSNR. Improvements in average PSNR were also observed for other compression ratios (CR) and for degraded images. The proposed technique gave better PSNR at various bit rates with the set partitioning in hierarchical trees (SPIHT) coder. These coefficients also performed well with other fingerprint databases.
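The PSNR values quoted above follow the standard definition for 8-bit images; a direct computation is:

```python
import numpy as np

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB between an original and a reconstructed image."""
    mse = np.mean((original.astype(np.float64) - reconstructed.astype(np.float64)) ** 2)
    return float('inf') if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```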
Abstract:
We compare the use of plastically compressed collagen gels with conventional collagen gels as scaffolds onto which corneal limbal epithelial cells (LECs) are seeded to construct an artificial corneal epithelium. LECs were isolated from bovine corneas (limbus), seeded onto either conventional uncompressed or novel compressed collagen gels, and grown in culture. Scanning electron microscopy (SEM) showed that fibers within the uncompressed gel were loose and irregularly ordered, whereas the fibers within the compressed gel were densely packed and more evenly arranged. Quantitative analysis of LEC expansion across the surface of the two gels showed similar growth rates (p > 0.05). Under SEM, LECs expanded on uncompressed gels showed a rough and heterogeneous morphology, whereas on the compressed gel the cells displayed a smooth and homogeneous morphology. Transmission electron microscopy (TEM) showed the compressed scaffold to contain collagen fibers of regular diameter and similar orientation, resembling collagen fibers within the normal cornea. TEM and light microscopy also showed that cell–cell and cell–matrix attachment, stratification, and cell density were superior for LECs expanded upon compressed collagen gels. This study demonstrates that the compressed collagen gel is an excellent biomaterial scaffold, highly suited to the construction of an artificial corneal epithelium and a significant improvement upon conventional collagen gels.
Abstract:
Trauma or degenerative diseases such as osteonecrosis may cause bone loss whose recovery is promised by a "tissue engineering" approach. This strategy involves the use of stem cells, grown on adequate biocompatible/bioresorbable hosting templates (usually called scaffolds) and cultured in specific dynamic environments provided by differentiation-inducing actuators (usually called bioreactors), to produce implantable tissue constructs. The purpose of this thesis is to evaluate, by finite element modeling of flow/compression-induced deformation, alginate scaffolds intended for bone tissue engineering. This work was conducted at the Biomechanics Laboratory of the Institute of Biomedical and Neural Engineering of Reykjavik University, Iceland. Comsol Multiphysics 5.1 simulations were carried out to approximate the loads on alginate 3D matrices under perfusion, compression, and perfusion+compression, while varying the alginate pore size and the flow/compression regimen. The results of the simulations show that the shear forces in the scaffold matrix increase consistently with increasing flow and load, and decrease with increasing pore size. Flow and load rates suggested for proper osteogenic cell differentiation are reported.
Abstract:
The need for a stronger and more durable building material is becoming more important as the structural engineering field expands and challenges the behavioral limits of current materials. One of the demands for stronger material is rooted in the effects that dynamic loading has on a structure. High strain rates, on the order of 10¹ s⁻¹ to 10³ s⁻¹, though only a small part of the overall range of loading rates that can occur over a structure's life (anywhere between 10⁻⁸ s⁻¹ and 10⁴ s⁻¹), have very important effects when considering dynamic loading on a structure. High strain rates such as these can cause the material and structure to behave differently than at slower strain rates, which necessitates testing materials under such loading to understand their behavior. Ultra-high-performance concrete (UHPC), a relatively new material in the U.S. construction industry, exhibits many enhanced strength and durability properties compared to standard normal-strength concrete. However, the use of this material for high strain rate applications requires an understanding of UHPC's dynamic properties under corresponding loads. One such dynamic property is the increase in compressive strength under high strain rate load conditions, quantified as the dynamic increase factor (DIF). This factor allows a designer to relate the dynamic compressive strength back to the static compressive strength, which generally is a well-established property. Previous research establishes the relationships for the concept of DIF in design. The generally accepted methodology for obtaining high strain rates to study the enhanced behavior of compressive material strength is the split Hopkinson pressure bar (SHPB). In this research, 83 Cor-Tuf UHPC specimens were tested in dynamic compression using an SHPB at Michigan Technological University. The specimens were separated into two categories, ambient cured and thermally treated, with aspect ratios of 0.5:1, 1:1, and 2:1 within each category. There was statistically no significant difference in mean DIF for the aspect ratios and cure regimes considered in this study; DIFs ranged from 1.85 to 2.09. Failure modes were observed to be mostly Type 2, Type 4, or combinations thereof for all specimen aspect ratios when classified according to ASTM C39 fracture pattern guidelines. The Comité Euro-International du Béton (CEB) model for DIF versus strain rate does not accurately predict the DIF for the UHPC data gathered in this study. Additionally, a measurement system analysis was conducted to observe variance within the measurement system, and a general linear model analysis was performed to examine the interaction and main effects that aspect ratio, cannon pressure, and cure method have on the maximum dynamic stress.
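For reference, the dynamic increase factor quoted above is simply the ratio of the strain-rate-dependent dynamic compressive strength to the quasi-static compressive strength,

\mathrm{DIF}(\dot{\varepsilon}) = \frac{f_{c,\mathrm{dyn}}(\dot{\varepsilon})}{f_{c,\mathrm{stat}}},

so the reported range of 1.85 to 2.09 means the measured dynamic strength was roughly twice the static strength.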
Abstract:
BACKGROUND CONTEXT: The Swiss Federal Office of Public Health mandated a nationwide health technology assessment registry for balloon kyphoplasty (BKP) for decision making on reimbursement of these interventions. The early results of the registry led to permanent coverage of BKP by basic health insurance, and the documentation was continued for further evidence generation. PURPOSE: This analysis reports on the 1-year results of patients after BKP treatment. STUDY DESIGN: Prospective multicenter observational case series. PATIENT SAMPLE: Data on 625 cases with 819 treated vertebrae were documented from March 2005 to May 2012. OUTCOME MEASURES: Surgeon-administered outcome instruments were the primary intervention form for BKP and the follow-up form; patient self-reported measures were the EuroQol-5D questionnaire, the North American Spine Society outcome instrument/Core Outcome Measures Index (including a visual analog scale), and a comorbidity questionnaire. Outcome measures were back pain, medication, quality of life (QoL), cement extrusions, and new fractures within the first postoperative year. METHODS: Data were recorded preoperatively and at 3- to 6-month and 1-year follow-ups. The Wilcoxon signed-rank test was used for comparison of pre- with postoperative measurements. Multivariate logistic regression was used to identify factors with a significant influence on the outcome. RESULTS: Seventy percent of patients were women, with a mean age of 71 years (range, 18-91 years); the mean age of men was 65 years (range, 15-93 years). Significant and clinically relevant reduction of back pain, improvement of QoL, and reduction of painkiller consumption were seen within the first postoperative year. Preoperative back pain decreased from 69.3 to 29.0 at the 3- to 6-month follow-up and remained unchanged at 1 year. Consequently, QoL improved from 0.23 to 0.71 and 0.75 at the same follow-up intervals. The overall vertebra-based cement extrusion rates with and without extrusions into intervertebral discs were 22.1% and 15.3%, respectively. There were five symptomatic cement extrusions with radiculopathy (0.8%). A new vertebral fracture within a year of the BKP surgery was observed in 18.4% of the patients. CONCLUSIONS: The results of the largest observational study of BKP so far are consistent with published randomized trials and systematic reviews. In this routine health care setting, BKP is safe and effective in reducing pain, improving QoL, and lowering painkiller consumption, and it has an acceptable rate of cement extrusions. Postoperative outcomes show clear and significant clinical improvement at early follow-up that remains stable during the first postoperative year.
Abstract:
An AZ31 rolled sheet alloy has been tested at dynamic strain rates at 250 °C, up to various intermediate strains before failure, in order to investigate the predominant deformation and restoration mechanisms. In particular, tests have been carried out in compression along the rolling direction (RD), in tension along the RD, and in compression along the normal direction (ND). It has been found that dynamic recrystallization (DRX) takes place despite the limited diffusion occurring at the high strain rates investigated. The DRX mechanisms and kinetics depend on the operative deformation mechanisms and thus vary for different loading modes (tension, compression) as well as for different relative orientations between the loading axis and the c-axes of the grains. In particular, DRX is enhanced by the operation of 〈c + a〉 slip, since cross-slip and climb take place more readily than for other slip systems, and thus the formation of high-angle boundaries is easier. DRX is also clearly promoted by twinning.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06