53 results for Histogram processing
Abstract:
In this paper, we describe several techniques for detecting the tonic pitch value in Indian classical music. In Indian music, the raga is the basic melodic framework, and it is built on the tonic; tonic detection is therefore fundamental for any melodic analysis of Indian classical music. This work explores tonic detection by processing the pitch histograms of Indian classical music. The processing of pitch histograms using group delay functions, and its ability to amplify certain traits of Indian music in the pitch histogram, is discussed. Three strategies to detect the tonic are proposed: the concert method, the template matching method and the segmented histogram method. The concert method exploits the fact that the tonic is constant over a piece/concert. The template matching and segmented histogram methods use two properties to detect the tonic of individual pieces: (i) the tonic is always present in the background, and (ii) some notes are less inflected and dominant. All three methods yield good results for Carnatic music (90-100% accuracy), while for Hindustani music the template method works best, provided the vādi and samvādi notes for a given piece are known (85%).
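As a rough illustration of this histogram-based pipeline (a minimal numpy sketch, not the paper's method; the function, its parameters and the omission of the group-delay sharpening are all assumptions), a tonic candidate can be read off as the most salient histogram peak:

    import numpy as np

    def tonic_candidate(pitch_cents, bin_width=10):
        # Build a pitch histogram over the observed range (values in cents).
        lo, hi = pitch_cents.min(), pitch_cents.max()
        bins = np.arange(lo, hi + bin_width, bin_width)
        hist, edges = np.histogram(pitch_cents, bins=bins)
        # The paper sharpens the histogram with group delay functions before
        # peak picking; this sketch simply takes the tallest bin.
        k = int(np.argmax(hist))
        return 0.5 * (edges[k] + edges[k + 1])  # candidate tonic, in cents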
Abstract:
This note describes ParallelKnoppix, a bootable CD that allows creation of a Linux cluster in very little time. An experienced user can create a cluster ready to execute MPI programs in less than 10 minutes. The computers used may be heterogeneous machines of the IA-32 architecture. When the cluster is shut down, all machines except one remain in their original state, and the last can be returned to its original state by deleting a single directory. The system thus provides a means of using non-dedicated computers to create a cluster. An example session is documented.
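For illustration, the kind of MPI program such a cluster executes can be sketched with mpi4py (the use of Python's mpi4py here is an assumption; ParallelKnoppix itself is language-agnostic):

    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()              # id of this process
    size = comm.Get_size()              # total processes in the cluster

    # Each process reports its host; rank 0 gathers and prints them.
    names = comm.gather(MPI.Get_processor_name(), root=0)
    if rank == 0:
        print("cluster of %d processes on: %s" % (size, sorted(set(names))))

Launched, for example, with "mpirun -np 8 python hello.py" once the cluster is up.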
Abstract:
High hydrostatic pressure is being increasingly investigated in food processing. It causes microbial inactivation and therefore extends the shelf life and enhances the safety of food products. Yeasts, molds, and vegetative cells of bacteria can be inactivated by pressures in the range of 200 to 700 MPa. Microorganisms are more or less sensitive to pressure depending on several factors, such as the type, the strain and the phase or state of the cells; in general, Gram-positive organisms are more resistant than Gram-negative ones. High-pressure processing modifies the permeability of the cell membrane and ion exchange, and causes changes in morphology and biochemical reactions, protein denaturation and inhibition of genetic mechanisms. High pressure has been used successfully to extend the shelf life of high-acid foods such as refrigerated fruit juices, jellies and jams. There is now increasing interest in using this technology to extend the shelf life of low-acid foods such as different types of meat products.
Abstract:
The effects of high pressure on the composition of food products have not been evaluated extensively, so it is necessary to consider the possible effects on the basis of the changes that high pressure induces in biomolecules. The main effect on proteins is denaturation: covalent bonds are not affected, but hydrogen bonds and hydrophobic and intermolecular interactions are modified or destroyed. High pressure can also modify the activity of some enzymes. If so, proteolysis and lipolysis may be more or less intense, and the content of free amino acids and fatty acids will differ, which could affect the bioavailability of these compounds. Low pressures (100 MPa) have been shown to activate some (monomeric) enzymes, while higher pressures induce loss of enzyme activity. However, some enzymes are very stable (e.g., lipase, at roughly 600-1000 MPa); lipoxygenase is less stable, and there is little information about the effects on antioxidant enzymes. Another important issue is the influence of high pressure on oxidation susceptibility, which could modify the lipid composition if the degree of oxidation is higher or lower than in the traditional product. Pressure damages cell membranes, favouring contact between substrates and enzymes, exposing membrane fatty acids to oxidation and reducing the efficiency of vitamin E. These effects can also affect protein oxidation. In this study, different compounds were analysed to establish the differences between non-treated and high-pressure-treated products.
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
Bone regeneration is a process studied by experts around the world. These experts study materials capable of accelerating the formation of bone tissue in areas where bone defects have occurred. After a given period following the application of the materials under study to the area lacking bone tissue, images of that area are obtained, and the expert evaluates by visual inspection whether the bone has regenerated well or not. The problem with this evaluation method is that it requires an expert whose assessment is subjective and hard to quantify, which can lead to disagreement between experts. In order to exploit the same images the expert relies on to evaluate the bone-regeneration capacity of the materials under study, a quantitative analysis of bone regeneration based on image processing is proposed. The algorithm designed is able to classify mandible images into good and poor regeneration by parameterising the grey-level histogram of the image, addressing the lack of objectivity of the evaluation method and removing the need for an expert to perform it.
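A minimal sketch of such a histogram parameterisation (the features and the threshold below are illustrative assumptions, not the exact parameters of this work):

    import numpy as np

    def grey_histogram_features(img):
        # Normalised grey-level histogram of an 8-bit image.
        hist, _ = np.histogram(img, bins=256, range=(0, 256), density=True)
        levels = np.arange(256)
        mean = float((levels * hist).sum())
        var = float(((levels - mean) ** 2 * hist).sum())
        return mean, var

    def classify_regeneration(img, mean_threshold=120.0):
        # Hypothetical rule: denser (regenerated) bone appears brighter.
        mean, _ = grey_histogram_features(img)
        return "good" if mean > mean_threshold else "poor"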
Abstract:
Configuration of a development environment in the Eclipse IDE. Introduction to GIS: uses, applications and examples. Getting to know the gvSIG tool. Getting to know the most widespread Open Geospatial Consortium (OGC) standards, in particular the Web Processing Service (WPS). Analysis, design and development of a client able to consume WPS services.
Abstract:
Study of the standards defined by the Open Geospatial Consortium, and more specifically of the Web Processing Service (WPS) standard. The project also had a practical component, consisting of the design and development of a client able to consume Web services built according to WPS, integrated into the gvSIG platform.
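As an illustration of the client side, the first call such a client issues is a GetCapabilities request; a minimal sketch using the standard OGC KVP binding (the endpoint URL is a placeholder):

    from urllib.parse import urlencode
    from urllib.request import urlopen

    def wps_get_capabilities(endpoint):
        # WPS 1.0.0 GetCapabilities over HTTP GET (KVP binding).
        params = urlencode({
            "service": "WPS",
            "request": "GetCapabilities",
            "acceptversions": "1.0.0",
        })
        with urlopen(endpoint + "?" + params) as resp:
            return resp.read().decode("utf-8")  # raw capabilities XML

    # capabilities = wps_get_capabilities("http://example.org/wps")  # placeholder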
Abstract:
Using the case study of the semantic search engine of the Organic.Edunet portal, the article shows how the use of closed technologies in building advanced data-visualisation interfaces hinders their development and evolution. The article also shows how the use of open technologies, combined with techniques for measuring and assessing application usability, makes it possible to detect interface problems, propose solutions or alternatives, and implement them quickly.
Abstract:
In the context of the round table, the following topics related to image colour processing will be discussed: a historical point of view, including the studies of Aguilonius, Gerritsen, Newton and Maxwell; the CIE standard (Commission Internationale de l'Eclairage); colour models such as RGB and HSI; colour segmentation based on the HSI model; industrial applications; and a summary and discussion. At the end, video images showing the robustness of colour compared with B/W images will be presented.
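For reference, the RGB-to-HSI conversion underlying that segmentation follows the standard textbook formulas (this numpy sketch is illustrative; the round table's exact variant may differ):

    import numpy as np

    def rgb_to_hsi(r, g, b, eps=1e-8):
        # r, g, b: arrays normalised to [0, 1].
        i = (r + g + b) / 3.0
        s = 1.0 - np.minimum(np.minimum(r, g), b) / (i + eps)
        num = 0.5 * ((r - g) + (r - b))
        den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + eps
        h = np.arccos(np.clip(num / den, -1.0, 1.0))
        h = np.where(b > g, 2.0 * np.pi - h, h)  # hue in [0, 2*pi)
        return h, s, i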
Abstract:
A major obstacle to processing images of the ocean floor comes from the absorption and scattering effects of light in the aquatic environment. Due to the absorption of natural light, underwater vehicles often require artificial light sources attached to them to provide adequate illumination. Unfortunately, these flashlights tend to illuminate the scene in a nonuniform fashion and, as the vehicle moves, induce shadows in the scene. For this reason, the application of standard computer vision techniques to underwater imaging requires dealing first with these lighting problems. This paper analyses and compares existing methodologies for dealing with low-contrast, nonuniform illumination in underwater image sequences. The reviewed techniques include: (i) study of the illumination-reflectance model, (ii) local histogram equalization, (iii) homomorphic filtering, and (iv) subtraction of the illumination field. Several experiments on real data have been conducted to compare the different approaches.
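Of the reviewed techniques, homomorphic filtering is compact enough to sketch (a minimal numpy version under illustrative parameters, not the paper's implementation): take logs so illumination becomes additive, attenuate low frequencies in the Fourier domain, and exponentiate back:

    import numpy as np

    def homomorphic_filter(img, cutoff=0.05, gamma_l=0.5, gamma_h=1.5):
        # img: 2-D greyscale array; parameters are illustrative.
        log_img = np.log1p(img.astype(np.float64))
        F = np.fft.fftshift(np.fft.fft2(log_img))
        rows, cols = img.shape
        u = np.arange(rows) - rows / 2.0
        v = np.arange(cols) - cols / 2.0
        d2 = u[:, None] ** 2 + v[None, :] ** 2
        d2 = d2 / d2.max()  # normalised squared distance from the DC term
        # Gaussian high-emphasis filter: gamma_l at DC, gamma_h at high freq.
        H = gamma_l + (gamma_h - gamma_l) * (1.0 - np.exp(-d2 / (2 * cutoff ** 2)))
        out = np.fft.ifft2(np.fft.ifftshift(F * H)).real
        return np.expm1(out)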
Abstract:
In this paper, an information-theoretic framework for image segmentation is presented. The approach is based on the information channel that goes from the image intensity histogram to the regions of the partitioned image, and it allows us to define a new family of segmentation methods that maximize the mutual information of the channel. First, a greedy top-down algorithm which partitions an image into homogeneous regions is introduced. Second, a histogram quantization algorithm which clusters color bins in a greedy bottom-up way is defined. Finally, the resulting regions of the partitioning algorithm can optionally be merged using the quantized histogram.
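The quantity being maximized can be sketched directly (an illustrative numpy estimate of the channel's mutual information from a joint histogram, not the paper's partitioning or quantization algorithms):

    import numpy as np

    def channel_mutual_information(intensity, labels, bins=256):
        # I(B; R) between intensity bins B and region labels R, in bits.
        joint, _, _ = np.histogram2d(intensity.ravel(), labels.ravel(),
                                     bins=[bins, int(labels.max()) + 1])
        p = joint / joint.sum()
        pb = p.sum(axis=1, keepdims=True)  # marginal over intensity bins
        pr = p.sum(axis=0, keepdims=True)  # marginal over regions
        nz = p > 0
        return float((p[nz] * np.log2(p[nz] / (pb @ pr)[nz])).sum())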
Abstract:
In the present work, microstructure improvement using FSP (Friction Stir Processing) is studied. In the first part of the work, the microstructure improvement of as-cast A356 is demonstrated; tensile tests were applied to check the increase in ductility, but the expected results could not be achieved. In the second part, the microstructure improvement of a fusion weld in 1050 aluminium alloy is presented, with hardness tests carried out to prove the improvements in mechanical properties. In the third and last part, the microstructure improvement of 1050 aluminium alloy is achieved, and the mechanical property improvements induced by FSP are discussed. The influence of tool traverse speed on microstructure and mechanical properties is also discussed; hardness tests and recrystallization theory enabled us to determine this influence.
Abstract:
The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data presents a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system's architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms inside a single environment that offers clinicians the tools to analyze and interpret patient data and make use of knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm-related data due to the system's ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers are able to seamlessly access and work on data that is distributed across multiple sites in a secure way, in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.
Abstract:
The spectral efficiency achievable with joint processing of pilot and data symbol observations is compared with that achievable through the conventional (separate) approach of first estimating the channel on the basis of the pilot symbols alone, and subsequently detecting the data symbols. Studied on the basis of a mutual information lower bound, joint processing is found to provide a non-negligible advantage relative to separate processing, particularly for fast fading. It is shown that, regardless of the fading rate, only a very small number of pilot symbols (at most one per transmit antenna and per channel coherence interval) should be transmitted if joint processing is allowed.
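The conventional separate receiver is easy to sketch; the following toy numpy example (the SNR, constellation and single-pilot setup are illustrative assumptions) estimates the channel from one pilot and then equalizes a data symbol, whereas joint processing would detect the data from the pilot and data observations together:

    import numpy as np

    rng = np.random.default_rng(0)
    snr = 10.0                                   # linear SNR, illustrative
    noise = lambda: (rng.normal() + 1j * rng.normal()) / np.sqrt(2 * snr)
    h = (rng.normal() + 1j * rng.normal()) / np.sqrt(2)  # Rayleigh channel

    # Separate processing: least-squares channel estimate from one pilot...
    pilot = 1.0 + 0.0j
    y_pilot = h * pilot + noise()
    h_hat = y_pilot / pilot

    # ...then coherent detection of a data symbol using that estimate.
    qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
    x = rng.choice(qpsk)
    y_data = h * x + noise()
    x_hat = qpsk[np.argmin(np.abs(y_data - h_hat * qpsk))]  # nearest symbol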