927 results for graphics processor
Abstract:
Overview: This article was published in the journal Computerized Medical Imaging and Graphics (CMIG). The goal of this article is to register vertebrae extracted from MR images with vertebrae extracted from X-ray images of scoliotic patients, taking into account the non-rigid deformations due to the change of posture between these two modalities. To this end, a registration method based on an articulated model is proposed. This method was compared with a rigid registration by computing the error on landmark points, as well as by computing the difference between the Cobb angle before and after registration. An additional validation of the registration method presented here can be found in Appendix A. This work will serve as a first step in the fusion of MR, X-ray and TP images of the full trunk. This article therefore verifies hypothesis 1 described in Section 3.2.1.
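For illustration, below is a minimal Python/NumPy sketch of the rigid-registration baseline that the articulated model is compared against: a least-squares (Kabsch) alignment of corresponding vertebral landmarks, followed by the RMS landmark error used for evaluation. The landmark arrays are hypothetical stand-ins, not data from the study.

```python
import numpy as np

def rigid_register(src, dst):
    """Return rotation R and translation t minimizing ||R @ src_i + t - dst_i||."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst.mean(0) - R @ src.mean(0)
    return R, t

rng = np.random.default_rng(0)
mr_landmarks = rng.normal(size=(17, 3))           # e.g. one point per vertebra
xray_landmarks = mr_landmarks + 0.05              # stand-in for the X-ray side
R, t = rigid_register(mr_landmarks, xray_landmarks)
err = np.linalg.norm(mr_landmarks @ R.T + t - xray_landmarks, axis=1)
print("RMS landmark error:", np.sqrt((err ** 2).mean()))
```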
Abstract:
clRNG and clProbDist are two application programming interfaces (APIs) that we developed for the generation of uniform and non-uniform random numbers on parallel computing devices in the OpenCL environment. The first interface makes it possible to create, on a central computer (host), stream objects regarded as parallel virtual generators that can be used both on the host and on the parallel devices (graphics processing units, multicore CPUs, etc.) to generate sequences of random numbers. The second interface also makes it possible to generate, on these units, random variates according to various continuous and discrete probability distributions. In this thesis, we review basic notions about random number generators, describe heterogeneous systems, and cover techniques for parallel random number generation. We also present the different models composing the architecture of the OpenCL environment and detail the structure of the APIs we developed. For clRNG we distinguish the functions that create streams, the functions that generate uniform random variates, and those that manipulate stream states. clProbDist contains the functions for generating non-uniform random variates via the inversion technique, as well as functions that return various statistics of the implemented distributions. We evaluate these APIs with two simulations, implementing a simplified inventory model and a financial option example. Finally, we provide experimental results on the performance of the implemented generators.
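The abstract describes two ideas that a short sketch can make concrete: independent per-work-item streams, and non-uniform generation by inverting the target distribution's CDF. The Python sketch below illustrates the technique only; it is not the clRNG/clProbDist C API, and the Poisson demand model is an assumed stand-in for the inventory example.

```python
import numpy as np
from scipy.stats import poisson

n_streams = 4                        # one virtual generator per work item
seeds = np.random.SeedSequence(42).spawn(n_streams)
streams = [np.random.default_rng(s) for s in seeds]

# Uniform generation: each stream yields its own independent subsequence.
u = np.array([s.random(3) for s in streams])

# Inversion: map each uniform u to F^{-1}(u) of the target distribution,
# here a Poisson(10) standing in for the inventory-model demand.
demand = poisson.ppf(u, mu=10)
print(demand)
```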
Abstract:
The synthesis of so-called photorealistic images requires numerically evaluating the way light and matter physically interact, which, despite the impressive and ever-growing computing power we enjoy today, is still far from a trivial task for our computers. This is largely due to the way we represent objects: to reproduce the subtle interactions that lead to the perception of detail, phenomenal amounts of geometry must be modeled. At render time, this complexity inexorably leads to heavy input/output requests which, coupled with the evaluation of complex filtering operators, make the computation times required to produce artifact-free images wholly unreasonable. To overcome these limitations under current constraints, a multiscale representation of matter must be derived. In this thesis, we build such a representation for matter whose interface corresponds to a displaced surface, a configuration that is typically modeled with height maps in computer graphics. We derive our representation in the context of microfacet theory (originally designed to model the reflectance of rough surfaces), which we first present and then extend in two steps. First, we make the theory applicable across several observation scales by generalizing it to non-centered microfacet statistics. Second, we derive an inversion procedure capable of reconstructing microfacet statistics from the reflectance responses of an arbitrary material in retroreflective configurations. We show how this augmented theory can be leveraged to derive a general and efficient operator for approximate height-map resampling that (a) preserves the anisotropy of light transport at any resolution, (b) can be applied prior to rendering and stored in MIP maps to drastically reduce the number of input/output requests, and (c) considerably simplifies per-pixel filtering operations, all of which results in shorter rendering times. To validate and demonstrate the effectiveness of our operator, we synthesize antialiased photorealistic images and compare them to reference images. In addition, we provide a complete C++ implementation throughout the dissertation to facilitate reproduction of the results. We conclude with a discussion of the limitations of our approach and of the remaining obstacles to deriving an even more general multiscale representation of matter.
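As a rough illustration of prefiltered height-map statistics (the general idea behind storing the operator's output in MIP maps), the sketch below downsamples per-texel slope moments, in the spirit of LEAN mapping; it is not the operator derived in the thesis, and the height map is random.

```python
import numpy as np

def slope_moments_mip(height):
    """Build MIP levels of first/second slope moments of a height map."""
    gy, gx = np.gradient(height)                 # fine-scale slopes
    feats = np.stack([gx, gy, gx*gx, gy*gy, gx*gy])
    levels = [feats]
    while levels[-1].shape[1] > 1:
        f = levels[-1]
        # average 2x2 blocks: moments are linear, so they filter exactly
        f = 0.25 * (f[:, ::2, ::2] + f[:, 1::2, ::2] +
                    f[:, ::2, 1::2] + f[:, 1::2, 1::2])
        levels.append(f)
    return levels

H = np.random.default_rng(1).normal(size=(64, 64))
for lvl, f in enumerate(slope_moments_mip(H)):
    var_x = f[2] - f[0] * f[0]                   # slope variance per texel
    print(f"level {lvl}: mean slope variance = {var_x.mean():.4f}")
```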
Abstract:
This thesis introduces the octree and addresses the full range of problems encountered while building an imaging system based on octrees. An efficient bottom-up recursive algorithm, and its iterative counterpart, for the raster-to-octree conversion of CAT scan slices is presented; to improve the speed of generating the octree from the slices, the possibility of exploiting the inherent parallelism in the conversion program is also explored. An octree node, which stores the volume information of a cube, often holds only the average density, which can lead to a "patchy" distribution of density during image reconstruction. To alleviate this problem, the thesis explores the use of vector quantization (VQ) to represent the information contained within a cube. Considering how easily compression can be accommodated while octrees are generated from CAT scan slices, the use of wavelet transforms to produce the compressed information in a cube is proposed, and the modified octree-generation algorithm is shown to accommodate wavelet compression easily. Rendering the information stored in octree form is a complex task, chiefly because of the requirement to display volumetric information: the rays traced from each cube in the octree sum the density en route, accounting for the opacities and transparencies produced by variations in density.
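A minimal sketch of the bottom-up construction idea follows: a cubic density volume is recursively split, and a node becomes a leaf when its contents are homogeneous. The tolerance test and toy volume are illustrative assumptions.

```python
import numpy as np

def build_octree(vol, tol=1e-6):
    """Return ('leaf', mean) or ('node', [8 children]) for a 2^k-sized cube."""
    if vol.shape[0] == 1 or np.ptp(vol) <= tol:   # homogeneous: merge to leaf
        return ('leaf', float(vol.mean()))
    h = vol.shape[0] // 2
    kids = [build_octree(vol[x:x+h, y:y+h, z:z+h], tol)
            for x in (0, h) for y in (0, h) for z in (0, h)]
    return ('node', kids)

vol = np.zeros((8, 8, 8))
vol[:4, :4, :4] = 1.0                 # toy stand-in for a stacked CAT volume
tree = build_octree(vol)
print(tree[0], len(tree[1]))          # a node with 8 (here all leaf) children
```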
Abstract:
Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals; this can significantly improve software quality and is still a challenging field. This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making it more effective at early detection of software bugs that are otherwise hard to detect, using static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, thus improving both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae and their compliance is tested individually in all possible execution paths of the application programs.
An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding the optimum data allocation to banked memory, resulting in a minimum number of bank-switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active memory bank state transition corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly from machine-code patterns, which drastically reduces state-space creation and contributes to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards correct use of difficult microcontroller features in developing embedded systems.
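As a toy illustration of the rule-checking idea, the sketch below enumerates all paths through a small control-flow graph and tests each instruction sequence against one stipulated rule. The rule and code fragments are invented for illustration and are not taken from the actual PIC16F87X rule set.

```python
cfg = {0: [1, 2], 1: [3], 2: [3], 3: []}             # basic-block successors
blocks = {0: ['MOVLW 0x20'], 1: ['BSF STATUS,RP0'],  # block -> instructions
          2: ['NOP'], 3: ['MOVWF TRISB']}

def paths(node=0, acc=()):
    """Enumerate every execution path from the entry block to an exit."""
    acc += (node,)
    if not cfg[node]:
        return [acc]
    return [p for nxt in cfg[node] for p in paths(nxt, acc)]

def violates(seq):
    # invented rule: a write to a TRIS register must follow a bank select
    bank1 = False
    for ins in seq:
        if ins == 'BSF STATUS,RP0':
            bank1 = True
        if ins.startswith('MOVWF TRIS') and not bank1:
            return True
    return False

for p in paths():
    seq = [ins for b in p for ins in blocks[b]]
    print(p, 'VIOLATION' if violates(seq) else 'ok')
```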
Abstract:
India is the largest producer and processor of cashew in the world; the export value of cashew was about Rs 2,600 crore during 2004-05. Kerala is the main processing and exporting centre for cashew, and within Kerala most of the cashew processing factories are located in Kollam district. The industry provides a livelihood for about 6-7 lakh employees and farmers, giving it national importance. In Kollam district alone, more than 2.5 lakh employees are directly involved in the industry, which comes to about 10 per cent of the population of the district, of which 95 per cent are women workers. It is a fact that any amount received by a woman worker will be utilized directly for the benefit of the family, and hence the link to family welfare is quite clear. Even though the Government of Kerala has incorporated the Kerala State Cashew Development Corporation (KSCDC) and the Kerala State Cashew Workers Apex Industrial Co-operative Society (CAPEX) to develop the cashew industry, the cashew industry and its ancillary industries did not grow as expected. In this context, an attempt has been made to analyze the problems and potential of the industry so as to make it viable and sustainable for perpetual employment and income generation as well as the overall development of Kollam district.
Abstract:
Decimal multiplication is an integral part of financial, commercial, and internet-based computations. This paper presents a novel double-digit decimal multiplication (DDDM) technique that offers low latency and high throughput. The design performs two digit multiplications simultaneously in one clock cycle. Double-digit fixed-point decimal multipliers for 7-digit, 16-digit and 34-digit operands are simulated using Leonardo Spectrum from Mentor Graphics Corporation with an ASIC library. The paper also presents area and delay comparisons for these fixed-point multipliers on Xilinx, Altera, Actel and QuickLogic FPGAs. The multiplier design can be extended to support decimal floating-point multiplication for the IEEE 754-2008 standard.
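A behavioural sketch of the double-digit idea is given below: the multiplier is consumed two decimal digits per simulated clock cycle, with both digit-products formed in the same cycle. This models the arithmetic only, not the hardware design evaluated in the paper.

```python
def dddm(a, b):
    """Multiply two non-negative decimal integers two digits per 'cycle'."""
    pairs = []
    while b or not pairs:
        pairs.append(b % 100)                 # one pair of decimal digits
        b //= 100
    total = 0
    for cycle, pair in enumerate(pairs):
        units, tens = pair % 10, pair // 10
        pp = a * units + a * tens * 10        # two digit-products per cycle
        total += pp * 100 ** cycle            # shift by two decimal places
        print(f"cycle {cycle}: digits ({tens},{units}) partial product {pp}")
    return total

assert dddm(9876543, 1234567) == 9876543 * 1234567
```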
Abstract:
This paper proposes a region-based image retrieval system using the local colour and texture features of image subregions. The regions of interest (ROIs) are roughly identified by segmenting the image into fixed partitions, finding the edge map and applying morphological dilation. The colour and texture features of the ROIs are computed from the histograms of the quantized HSV colour space and the grey-level co-occurrence matrix (GLCM), respectively. Each ROI of the query image is compared with the same number of ROIs of the target image, arranged in descending order of white-pixel density in the regions, using the Euclidean distance measure for similarity computation. Preliminary experimental results show that the proposed method provides better retrieval results than some of the existing methods.
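For illustration, the sketch below computes the two per-region feature sets named above, a quantized HSV histogram and grey-level co-occurrence statistics, and compares them with the Euclidean distance. Region extraction (fixed partitions, edge map, dilation) is omitted, and the region arrays are random stand-ins.

```python
import numpy as np

def colour_hist(region_hsv, bins=(8, 3, 3)):
    """Quantized HSV histogram of a region (values assumed in [0, 1])."""
    h = np.histogramdd(region_hsv.reshape(-1, 3), bins=bins,
                       range=((0, 1), (0, 1), (0, 1)))[0]
    return h.ravel() / h.sum()

def glcm_features(region_grey, levels=8):
    """Contrast and energy from a horizontal-offset co-occurrence matrix."""
    q = (region_grey * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)  # offset (0, 1)
    glcm /= glcm.sum()
    i, j = np.indices(glcm.shape)
    return np.array([((i - j) ** 2 * glcm).sum(), (glcm ** 2).sum()])

rng = np.random.default_rng(0)
fa = np.concatenate([colour_hist(rng.random((32, 32, 3))),
                     glcm_features(rng.random((32, 32)))])
fb = np.concatenate([colour_hist(rng.random((32, 32, 3))),
                     glcm_features(rng.random((32, 32)))])
print("ROI distance:", np.linalg.norm(fa - fb))
```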
Abstract:
This paper presents methods for moving object detection in airborne video surveillance. Motion segmentation in this scenario is usually difficult because of the small size of the objects, the motion of the camera, and inconsistency in the detected object shape. Here we present a motion segmentation system for moving-camera video, based on background subtraction. Adaptive background building is used to take advantage of backgrounds created from the most recent frames. The proposed system offers a CPU-efficient alternative to conventional batch-processing-based background subtraction systems. We further refine the segmented motion by mean-shift-based mode association.
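A minimal sketch of the adaptive background idea follows: the model is a running average biased towards the most recent frames, and foreground is whatever departs from it by more than a threshold. The frames and parameters below are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
bg_model, alpha, thresh = None, 0.1, 0.25      # adaptation rate and threshold

for t in range(30):
    frame = rng.normal(0.5, 0.02, size=(48, 64))    # synthetic aerial frame
    if t > 20:
        frame[20:28, 30:40] += 0.5                  # a small object appears
    if bg_model is None:
        bg_model = frame.copy()
    mask = np.abs(frame - bg_model) > thresh        # background subtraction
    bg_model = (1 - alpha) * bg_model + alpha * frame  # adapt to recent frames
    if mask.any():
        print(f"frame {t}: {mask.sum()} foreground pixels")
```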
Abstract:
In this paper, we propose a handwritten character recognition system for the Malayalam language. The feature extraction phase consists of gradient and curvature calculation and dimensionality reduction using Principal Component Analysis. Directional information from the arc tangent of the gradient is used as the gradient feature, and the strength of the gradient in the curvature direction is used as the curvature feature. The proposed system uses a combination of the gradient and curvature features in reduced dimension as the feature vector. For classification, the discriminative power of the Support Vector Machine (SVM) is evaluated. The results reveal that an SVM with a Radial Basis Function (RBF) kernel yields the best performance, with accuracies of 96.28% and 97.96% on two different datasets. This is the highest accuracy ever reported on these datasets.
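A condensed sketch of the described pipeline follows: gradient-direction features from the arc tangent of the image gradient, PCA for dimensionality reduction, and an RBF-kernel SVM (using scikit-learn). The random images and labels are placeholders for real character data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC

def gradient_feature(img, n_bins=8):
    """Magnitude-weighted histogram of gradient directions."""
    gy, gx = np.gradient(img)
    theta = np.arctan2(gy, gx)                 # direction from arc tangent
    mag = np.hypot(gx, gy)                     # gradient strength
    hist, _ = np.histogram(theta, bins=n_bins,
                           range=(-np.pi, np.pi), weights=mag)
    return hist / (hist.sum() + 1e-9)

rng = np.random.default_rng(0)
X = np.array([gradient_feature(rng.random((32, 32))) for _ in range(100)])
y = rng.integers(0, 2, size=100)               # stand-in class labels
X_red = PCA(n_components=4).fit_transform(X)   # dimensionality reduction
clf = SVC(kernel='rbf').fit(X_red, y)
print("training accuracy:", clf.score(X_red, y))
```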
Abstract:
In this paper, we propose a multispectral analysis system using wavelet-based Principal Component Analysis (PCA) to improve brain tissue classification from MRI images. Global transforms like PCA often neglect significant small-abnormality details when dealing with massive amounts of multispectral data. To resolve this issue, the input dataset is expanded with detail coefficients from multisignal wavelet analysis. PCA is then applied to the new dataset to perform feature analysis. Finally, unsupervised classification with the Fuzzy C-Means clustering algorithm is used to measure the improvement in the reproducibility and accuracy of the results. A detailed comparative analysis of the classified tissues against those from conventional PCA is also carried out. The proposed method yielded a good improvement in the classification of small abnormalities, with high sensitivity/accuracy values of 98.9/98.3 for clinical analysis. Experimental results from synthetic and clinical data recommend the new method as a promising approach to brain tissue analysis.
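The feature-expansion step can be sketched as below: each multispectral signal is augmented with its wavelet detail coefficients before PCA (PyWavelets is assumed). K-means stands in for the Fuzzy C-Means step, which scikit-learn does not provide.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
signals = rng.normal(size=(200, 64))      # 200 voxels x 64 spectral samples

expanded = []
for s in signals:
    cA, cD = pywt.wavedec(s, 'db2', level=1)  # approximation + detail coeffs
    expanded.append(np.concatenate([s, cD]))  # keep small-detail information
X = np.asarray(expanded)

feats = PCA(n_components=3).fit_transform(X)  # feature analysis
labels = KMeans(n_clusters=3, n_init=10).fit_predict(feats)
print(np.bincount(labels))
```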
Abstract:
We have investigated the effects of swift heavy ion irradiation on thermally evaporated, 44 nm thick, amorphous Co77Fe23 thin films on silicon substrates using 100 MeV Ag7+ ions at fluences of 1 × 10¹¹, 1 × 10¹², 1 × 10¹³, and 3 × 10¹³ ions/cm². The structural modifications upon swift heavy ion irradiation were investigated using glancing-angle X-ray diffraction. The surface morphological evolution of the thin films with irradiation was studied using atomic force microscopy. Power spectral density analysis was used to correlate the roughness variation with the structural modifications observed by X-ray diffraction. Magnetic measurements were carried out using vibrating sample magnetometry, and the observed variation in the coercivity of the irradiated films is explained on the basis of stress relaxation. Magnetic force microscopy images were analysed using the Scanning Probe Image Processor software. These results are in agreement with those obtained using vibrating sample magnetometry. The magnetic and structural properties are correlated.
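For the power spectral density analysis mentioned above, a small sketch of a radially averaged PSD of a height image follows; the synthetic height map is a stand-in for AFM data.

```python
import numpy as np

def radial_psd(height):
    """Radially averaged power spectral density of a square height image."""
    psd2d = np.abs(np.fft.fftshift(np.fft.fft2(height))) ** 2
    n = height.shape[0]
    y, x = np.indices(psd2d.shape)
    r = np.hypot(x - n // 2, y - n // 2).astype(int)
    # average the 2-D PSD over rings of constant spatial frequency
    return np.bincount(r.ravel(), psd2d.ravel()) / np.bincount(r.ravel())

h = np.random.default_rng(0).normal(size=(128, 128))
psd = radial_psd(h)
print("PSD at low / high spatial frequency:", psd[1], psd[60])
```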
Abstract:
The pedicle screw insertion technique has revolutionized the surgical treatment of spinal fractures and spinal disorders. Although X-ray fluoroscopy based navigation is popular, there is a risk of prolonged exposure to X-ray radiation, and systems with lower radiation risk are generally quite expensive. The position and orientation of the drill are clinically very important in pedicle screw fixation. In this paper, the position and orientation of a marker on the drill are determined using pattern-recognition-based methods with geometric features obtained from an input video sequence captured by a CCD camera. A search is then performed on the preprocessed video frames to obtain the exact position and orientation of the drill. Animated graphics showing the instantaneous position and orientation of the drill are then overlaid on the processed video for real-time drill control and navigation.
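As a toy illustration of one step in such a pipeline, the sketch below estimates the in-plane orientation of a bright marker from second-order image moments. Real marker geometry, camera calibration and tracking are omitted.

```python
import numpy as np

def marker_orientation(frame, thresh=0.5):
    """Centroid and principal-axis angle of the bright pixels in a frame."""
    ys, xs = np.nonzero(frame > thresh)          # marker pixels
    xc, yc = xs.mean(), ys.mean()                # centroid = marker position
    mu20 = ((xs - xc) ** 2).mean()
    mu02 = ((ys - yc) ** 2).mean()
    mu11 = ((xs - xc) * (ys - yc)).mean()
    angle = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    return (xc, yc), np.degrees(angle)

frame = np.zeros((64, 64))
for i in range(30):                              # a thin bar at ~45 degrees
    frame[16 + i, 16 + i] = 1.0
pos, ang = marker_orientation(frame)
print(f"marker at {pos}, orientation {ang:.1f} deg")
```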
Abstract:
Bank switching in embedded processors with a partitioned memory architecture results in code size as well as run-time overhead. This work presents an algorithm, and its application, to assist the compiler in eliminating the redundant bank-switching code it introduces and in deciding the optimum data allocation to banked memory. A relation matrix formed for the memory bank state transition corresponding to each bank selection instruction is used for the detection of redundant code. Data allocation to memory is done by considering all possible permutations of memory banks and combinations of data. The compiler output corresponding to each data mapping scheme is subjected to a static machine-code analysis which identifies the one with the minimum number of bank-switching instructions. Even though the method is compiler independent, the algorithm utilizes certain architectural features of the target processor. A prototype based on PIC16F87X microcontrollers is described. The method scales well to larger numbers of memory banks and to other architectures, so that high-performance compilers can integrate this technique for efficient code generation. The technique is illustrated with an example.
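A minimal sketch of the redundancy detection follows: walk a straight-line code sequence while tracking the active memory bank, and flag bank-select instructions that re-select the already active bank. The instruction encoding is a simplified assumption, not actual PIC16F87X machine code.

```python
def find_redundant_bank_selects(code):
    """Indices of bank-select instructions that do not change the bank state."""
    active, redundant = None, []
    for idx, (op, arg) in enumerate(code):
        if op == 'BANKSEL':
            if arg == active:          # state unchanged: instruction is dead
                redundant.append(idx)
            active = arg
    return redundant

code = [('BANKSEL', 1), ('MOVWF', 'TRISB'),
        ('BANKSEL', 1), ('MOVWF', 'TRISA'),   # bank 1 already active
        ('BANKSEL', 0), ('MOVWF', 'PORTB')]
print("redundant bank switches at:", find_redundant_bank_selects(code))
```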
Abstract:
Research in the area of geopolymers has been gaining momentum over the past 20 years. Studies confirm that geopolymer concrete has good compressive strength, tensile strength, flexural strength, modulus of elasticity and durability, properties comparable with those of OPC concrete. There are many occasions where concrete is exposed to elevated temperatures, such as fire, exposure from furnaces and thermal processors, nuclear exposure, etc. In such cases, understanding the behaviour of concrete and structural members exposed to elevated temperatures is vital. Even though many research reports are available on the behaviour of OPC concrete at elevated temperatures, there is limited information on the behaviour of geopolymer concrete after exposure to elevated temperatures. A preliminary study was carried out for the selection of a mix proportion. The important variables considered in the present study include the alkali/fly ash ratio, percentage of total aggregate content, fine aggregate to total aggregate ratio, molarity of sodium hydroxide, sodium silicate to sodium hydroxide ratio, curing temperature and curing period. The influence of the different variables on the engineering properties of geopolymer concrete was investigated. A study of the interface shear strength of reinforced and unreinforced geopolymer concrete as well as OPC concrete was also carried out. The engineering properties of fly ash based geopolymer concrete after exposure to elevated temperatures (ambient to 800 °C) were studied and the results were compared with those of conventional concrete. Scanning Electron Microscope analysis, Fourier Transform Infrared analysis, X-ray powder diffractometer analysis and thermogravimetric analysis of geopolymer mortar or paste at ambient temperature and after exposure to elevated temperature were also carried out in the present research work. An experimental study was conducted on geopolymer concrete beams after exposure to elevated temperatures (ambient to 800 °C); their load-deflection characteristics, ductility and moment-curvature behaviour were investigated. Based on the present study, the major conclusions may be summarized as follows. There is a definite proportion of the various ingredients that achieves maximum strength properties: geopolymer concrete with a total aggregate content of 70% by volume, a fine aggregate to total aggregate ratio of 0.35, NaOH molarity of 10, a Na2SiO3/NaOH ratio of 2.5 and an alkali to fly ash ratio of 0.55 gave the maximum compressive strength in the present study. Early strength development in geopolymer concrete can be achieved by proper selection of the curing temperature and period: with 24 hours of curing at 100 °C, 96.4% of the 28th-day cube compressive strength was achieved in 7 days in the present study. The interface shear strength of geopolymer concrete is lower than that of OPC concrete: a reduction of 33% and 29% was observed for unreinforced and reinforced geopolymer specimens respectively. The interface shear strength of geopolymer concrete can be approximately estimated as 50% of the value obtained from the available equations for the interface shear strength of ordinary Portland cement concrete (the method used in Mattock and ACI).
Fly ash based geopolymer concrete undergoes a high rate of strength loss (compressive strength, tensile strength and modulus of elasticity) during its early heating period (up to 200 °C) compared to OPC concrete. At temperature exposures beyond 600 °C, the unreacted crystalline materials in geopolymer concrete transform into an amorphous state and undergo polymerization, so there is no further strength loss (compressive strength, tensile strength and modulus of elasticity) in geopolymer concrete, whereas OPC concrete continues to lose its strength properties at a faster rate beyond a temperature exposure of 600 °C. At present no equation is available to predict the strength properties of geopolymer concrete after exposure to elevated temperatures. Based on the study carried out, new equations have been proposed to predict the residual strengths (cube compressive strength, split tensile strength and modulus of elasticity) of geopolymer concrete after exposure to elevated temperatures (up to 800 °C). These equations could be used for material modelling until better refined equations are available. Compared to OPC concrete, geopolymer concrete shows better resistance against surface cracking when exposed to elevated temperatures: in the present study, while OPC concrete started developing cracks at 400 °C, geopolymer concrete did not show any visible cracks up to 600 °C and developed only minor cracks at an exposure temperature of 800 °C. Geopolymer concrete beams develop cracks at earlier load stages if they are exposed to elevated temperatures. Even though the material strength of geopolymer concrete does not decrease beyond 600 °C, the flexural strength of the corresponding beams reduces rapidly after 600 °C temperature exposure, primarily due to the rapid loss of the strength of the steel. With increasing temperature, the curvature at the yield point of geopolymer concrete beams increases and the ductility thereby reduces; in the present study, compared to the ductility at ambient temperature, the ductility of geopolymer concrete beams reduced by 63.8% at 800 °C temperature exposure. Appropriate equations have been proposed to predict the service-load crack width of geopolymer concrete beams exposed to elevated temperatures. These equations could be used to limit the service load on geopolymer concrete beams exposed to elevated temperatures (up to 800 °C) for a predefined crack width (between 0.1 mm and 0.3 mm) or vice versa. The moment-curvature relationship of geopolymer concrete beams at ambient temperature is similar to that of RCC beams and can be predicted using the strain compatibility approach. Once exposed to an elevated temperature, however, the strain compatibility approach underestimates the curvature of geopolymer concrete beams between the first cracking and yielding points.