35 results for graphics processor

in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance:

20.00%

Publisher:

Abstract:

This paper proposes and describes an architecture that allows both the engineer and the programmer to define and quantify which peripherals of a microcontroller matter for a particular project. Each application requires different types of peripherals. In this study, we verified the feasibility of emulating peripheral behavior on dedicated CPUs. These CPUs hold a RAM memory into which code written specifically for them, representing the behavior of some target peripheral, is loaded and executed. We believe the proposed architecture provides greater flexibility in the use of microcontrollers, since these "dedicated hardware components" are not bound to a single fixed function but are hardware capable of adapting itself to the needs of each project. This research was grounded in a comparative study of four current microcontrollers. Preliminary tests using VHDL and FPGAs were performed.
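
As a toy illustration of the idea, the sketch below (Python, with hypothetical names such as SoftPeripheral) shows a peripheral whose behavior comes from a small program loaded into its private RAM rather than from fixed silicon; the actual study was carried out in VHDL on FPGAs.

```python
# Minimal sketch (hypothetical names): a "soft peripheral" whose behavior is
# defined by a small program loaded into its private RAM, as the abstract
# describes for the emulation CPUs. The real work used VHDL and FPGAs.

class SoftPeripheral:
    def __init__(self, ram_size=256):
        self.ram = [0] * ram_size   # code space for the behavior program
        self.regs = {"count": 0, "flag": 0}

    def load(self, program):
        """Load a behavior program (list of opcodes) into RAM."""
        self.ram[:len(program)] = program

    def step(self, pc):
        """Execute one instruction; return the next program counter."""
        op = self.ram[pc]
        if op == 0x01:              # INC: emulate a timer tick
            self.regs["count"] += 1
        elif op == 0x02:            # CMP: raise a flag on overflow, like a timer IRQ
            if self.regs["count"] >= 10:
                self.regs["flag"] = 1
        return (pc + 1) % len(self.ram)

# A timer-like peripheral emerges from the loaded program, not from fixed silicon.
timer = SoftPeripheral()
timer.load([0x01, 0x02] * 16)
pc = 0
for _ in range(40):
    pc = timer.step(pc)
print(timer.regs)   # {'count': 16, 'flag': 1}
```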

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVE: To develop the instrumentation and software for wide-angle corneal topography using the traditional Placido disc. The goal is to allow the mapping of a larger region of the cornea with corneal topographers that use the Placido technique, through a simple adaptation of the target. METHODS: Using the traditional Placido disc of a conventional corneal topographer, 9 LEDs (Light Emitting Diodes) were fitted to the conical housing so that the volunteer patient could fixate in different directions. For each direction, Placido images were digitized and processed to form, by means of an algorithm involving sophisticated elements of computer graphics, a complete three-dimensional map of the entire cornea. RESULTS: The results presented in this work show that a region up to 100% larger can be mapped using this technique, allowing the clinician to map almost up to the limbus of the cornea. Results are presented here for a spherical calibration surface and also for an in vivo cornea with a high degree of astigmatism, showing curvature and elevation. CONCLUSION: It is believed that this new technique can improve several procedures, such as contact lens fitting and algorithms for customized ablations for hyperopia, among others.
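
A minimal sketch of the merging step implied above, under the assumption that each gaze direction yields an elevation point cloud that can be rotated into the central-fixation frame; the paper's actual reconstruction algorithm is more sophisticated.

```python
# Hypothetical data layout: elevation points captured at each gaze direction
# are rotated into the central-fixation frame and concatenated into a single
# wide-angle cloud.
import numpy as np

def rotation_y(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def merge_views(views):
    """views: list of (Nx3 point array, gaze angle about the y axis in radians)."""
    merged = [pts @ rotation_y(ang).T for pts, ang in views]
    return np.vstack(merged)

# Central view plus one eccentric view, 10 degrees temporally (toy data).
central = np.random.rand(100, 3)
temporal = np.random.rand(100, 3)
cloud = merge_views([(central, 0.0), (temporal, np.radians(10))])
print(cloud.shape)  # (200, 3)
```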

Relevance:

10.00%

Publisher:

Abstract:

Objectives: To analyze the effects of low-level laser therapy (LLLT), 670 nm, with doses of 4 and 7 J/cm², on the repair of surgical wounds covered by occlusive dressings. Background Data: The effect of LLLT on the healing process of covered wounds is not well defined. Materials and Methods: For the histologic analysis with HE staining, 50 male Wistar rats were submitted to surgical incisions and divided into 10 groups (n=5): control; stimulated with 4 and 7 J/cm² daily, for 7 and 14 days, with or without occlusion. Reepithelialization and the numbers of leukocytes, fibroblasts, and fibrocytes were obtained with an image processor. For the biomechanical analysis, 25 rats were submitted to a surgical incision and divided into five groups (n=5): treated for 14 days with and without occlusive dressing, and the sham group. Samples of the lesions were collected and submitted to the tensile test. One-way analysis of variance was performed, followed by post hoc analysis: a Tukey test was used on the biomechanical data, and the Tamhane test on the histologic data. A significance level of 5% was chosen (p ≤ 0.05). Results: The 4 and 7 J/cm² laser, with and without occlusive dressing, did not significantly alter the reepithelialization rate of the wounds. The 7 J/cm² laser significantly reduced the number of leukocytes. The number of fibroblasts was higher in the groups treated with laser for 7 days, significantly so in the covered 4 J/cm² laser group. Conclusions: Greater interference of the laser-treatment procedure was noted with 7 days of stimulation, and the occlusive dressing did not alter its biostimulatory effects.
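
For readers who want to reproduce this kind of analysis, a hedged sketch of the statistical pipeline (one-way ANOVA followed by a Tukey post hoc test) is shown below, using invented fibroblast counts rather than the study's data.

```python
# One-way ANOVA followed by a Tukey post hoc test, as described in the
# abstract; the counts below are made up for illustration only.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

control = np.array([12, 14, 11, 13, 12])
laser4 = np.array([18, 20, 17, 19, 21])   # 4 J/cm2, 7 days (invented values)
laser7 = np.array([16, 15, 17, 18, 16])   # 7 J/cm2, 7 days (invented values)

# One-way ANOVA across the three groups at the 5% significance level.
f_stat, p_value = f_oneway(control, laser4, laser7)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

if p_value <= 0.05:
    values = np.concatenate([control, laser4, laser7])
    groups = ["control"] * 5 + ["4J"] * 5 + ["7J"] * 5
    print(pairwise_tukeyhsd(values, groups, alpha=0.05))
```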

Relevance:

10.00%

Publisher:

Abstract:

In Natural Language Processing (NLP) symbolic systems, several linguistic phenomena, for instance the thematic role relationships between sentence constituents, such as AGENT, PATIENT, and LOCATION, can be accounted for by employing a rule-based grammar. Another approach to NLP relies on the connectionist model, which has the benefits of learning, generalization, and fault tolerance, among others. A third option merges the two previous approaches into a hybrid one: a symbolic thematic theory is used to supply the connectionist network with initial knowledge. Inspired by neuroscience, we propose a symbolic-connectionist hybrid system called BIOθPRED (BIOlogically plausible thematic (θ) symbolic-connectionist PREDictor), designed to reveal the thematic grid assigned to a sentence. Its connectionist architecture takes as input a featural representation of the words (based on the verb/noun WordNet classification and on the classical semantic microfeature representation) and produces as output the thematic grid assigned to the sentence. BIOθPRED is designed to "predict" the thematic (semantic) roles assigned to words in a sentence context, employing a biologically inspired training algorithm and architecture, and adopting a psycholinguistic view of thematic theory.
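
The sketch below is a simple stand-in, not the paper's biologically inspired architecture: a small off-the-shelf classifier mapping invented semantic-microfeature vectors to thematic role labels, just to make the input/output mapping concrete.

```python
# Stand-in for the described input/output mapping: microfeature vectors for a
# sentence constituent go in, a thematic role label comes out. The features
# and training data are invented for illustration.
import numpy as np
from sklearn.neural_network import MLPClassifier

# Toy microfeatures: [animate, human, concrete, motion, location]
X = np.array([
    [1, 1, 1, 1, 0],   # "boy" as subject of "run"  -> AGENT
    [0, 0, 1, 0, 0],   # "ball" as object of "kick" -> PATIENT
    [0, 0, 1, 0, 1],   # "park" with "in"           -> LOCATION
    [1, 0, 1, 1, 0],   # "dog" as subject of "bite" -> AGENT
])
y = ["AGENT", "PATIENT", "LOCATION", "AGENT"]

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.predict([[1, 1, 1, 0, 0]]))  # another animate argument
```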

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a framework for building medical training applications using virtual reality, together with a tool that helps instantiate the framework's classes. The main purpose is to make it easier to build virtual reality applications in the medical training area, targeting systems that simulate biopsy exams and providing deformation, collision detection, and stereoscopy functionalities. Instantiating the classes allows quick implementation of tools for such a purpose, reducing errors and offering low cost due to the use of open source tools. Using the instantiation tool, the process of building applications is fast and easy, so computer programmers can obtain an initial application and adapt it to their needs. The tool allows the user to include, delete, and edit parameters in the chosen functionalities, as well as to store these parameters for future use. In order to verify the efficiency of the framework, some case studies are presented.
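
A small sketch (with an entirely hypothetical API) of the kind of instantiation the tool automates: choosing functionalities, editing their parameters, and storing them for future use.

```python
# Hypothetical builder API mirroring the tool's workflow: the user picks
# functionalities (deformation, collision detection, stereoscopy), edits
# their parameters, and persists them for reuse.
import json

class TrainingAppBuilder:
    def __init__(self):
        self.features = {}

    def include(self, name, **params):
        self.features[name] = params        # add or edit a functionality

    def delete(self, name):
        self.features.pop(name, None)

    def save(self, path):
        with open(path, "w") as f:          # persist parameters for future use
            json.dump(self.features, f, indent=2)

builder = TrainingAppBuilder()
builder.include("deformation", stiffness=0.8, mesh="liver.obj")
builder.include("collision_detection", method="bounding_spheres")
builder.include("stereoscopy", eye_separation_mm=65)
builder.save("biopsy_sim.json")
```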

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a compact embedded fuzzy system for three-phase induction-motor scalar speed control. The control strategy consists of keeping the voltage/frequency ratio of the induction-motor supply source constant. A fuzzy-control system is built on a digital signal processor, which uses the speed error and the speed-error variation to change both the fundamental voltage amplitude and the frequency of a sinusoidal pulsewidth modulation inverter. An alternative optimized method for embedded fuzzy-system design is also proposed. The controller's performance with respect to reference and load-torque variations is evaluated through experimental results. A comparative analysis with a conventional proportional-integral controller is also presented.
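
A minimal sketch of the scalar V/f strategy follows; the fuzzy inference is replaced by a crisp rule with guessed gains, whereas the paper implements a full fuzzy system on a DSP.

```python
# V/f scalar control loop: the controller adjusts inverter frequency from the
# speed error and its variation, and the voltage amplitude tracks the
# frequency to keep V/f constant. The "fuzzy" block here is a crisp stand-in.
import numpy as np

V_F_RATIO = 220.0 / 60.0          # keep V/f constant (220 V at 60 Hz)

def fuzzy_delta_f(error, d_error):
    """Crisp stand-in for the fuzzy inference (gains are guesses)."""
    e = np.clip(error / 100.0, -1, 1)      # normalized speed error (rpm)
    de = np.clip(d_error / 50.0, -1, 1)    # normalized error variation
    return 2.0 * e + 0.5 * de              # frequency increment in Hz

f, speed, prev_error = 30.0, 1500.0, 0.0
reference = 1700.0                          # rpm
for step in range(20):
    error = reference - speed
    f += fuzzy_delta_f(error, error - prev_error)
    v = V_F_RATIO * f                       # voltage amplitude follows frequency
    prev_error = error
    speed += 0.1 * (f * 30 - speed)         # toy first-order motor response
print(f"f = {f:.1f} Hz, V = {v:.1f} V, speed = {speed:.0f} rpm")
```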

Relevance:

10.00%

Publisher:

Abstract:

Since the 1990s, several large companies have been publishing nonfinancial performance reports. Focusing initially on the physical environment, these reports evolved to consider social relations as well as data on the firm's economic performance. A few mining companies pioneered this trend, and in recent years some of them have incorporated the three dimensions of sustainable development, publishing so-called sustainability reports. This article reviews 31 reports published between 2001 and 2006 by four major mining companies. A set of 62 assessment items organized into six categories (namely context and commitment, management, environmental, social, and economic performance, and accessibility and assurance) was selected to guide the review. The items were derived from the international literature and recommended best practices, including the Global Reporting Initiative G3 framework. A content analysis was performed using the report as the sampling unit and phrases, graphics, or tables containing certain information as data collection units. A basic rating scale (0 or 1) was used to note the presence or absence of information, and a final percentage score was obtained for each report. Results show a clear evolution in the reports' comprehensiveness and depth. The categories "accessibility and assurance" and "economic performance" had the lowest scores and do not present a clear evolution trend in the period, whereas the categories "context and commitment" and "social performance" presented the best results and regular improvement; the category "environmental performance," despite not reaching the highest scores, also featured constant evolution. The description of data measurement techniques, together with more comprehensive third-party verification, are the items most in need of improvement.
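
The 0/1 rating and percentage scoring can be made concrete with a short sketch; the item names below are invented examples, not the study's 62 assessment items.

```python
# Each item is rated 0 or 1 per report and aggregated into a percentage score
# per category and overall, as the abstract describes. Item names are invented.
ITEMS = {
    "context and commitment": ["ceo_statement", "sustainability_policy"],
    "environmental performance": ["ghg_emissions", "water_use"],
    "accessibility and assurance": ["third_party_verification"],
}

def score_report(found):
    """found: set of items present in the report (phrases, graphics, tables)."""
    per_category = {
        cat: 100.0 * sum(i in found for i in items) / len(items)
        for cat, items in ITEMS.items()
    }
    total = sum(len(v) for v in ITEMS.values())
    overall = 100.0 * sum(i in found for v in ITEMS.values() for i in v) / total
    return per_category, overall

cats, overall = score_report({"ceo_statement", "ghg_emissions", "water_use"})
print(cats, f"overall = {overall:.0f}%")
```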

Relevance:

10.00%

Publisher:

Abstract:

A simplex-lattice statistical design was employed to study an optimization method for a preservative system in an ophthalmic suspension of dexamethasone and polymyxin B. The assay matrix generated 17 formulas differentiated by the preservatives and EDTA (disodium ethylenediaminetetraacetate), the independent variables being: X1 = chlorhexidine digluconate (0.010% w/v); X2 = phenylethanol (0.500% w/v); X3 = EDTA (0.100% w/v). The dependent variable was the D-value obtained from the microbial challenge of the formulas, calculated by modeling the microbial killing process with an exponential function. The analysis of the dependent variable, performed using the software Design Expert/W, produced cubic equations, with terms selected by a stepwise adjustment method, for the challenge microorganisms: Pseudomonas aeruginosa, Burkholderia cepacia, Staphylococcus aureus, Candida albicans, and Aspergillus niger. Besides the mathematical expressions, response surfaces and contour graphics were obtained for each assay. The contour graphs were overlaid to identify a region containing the most adequate formulas (graphic strategy), represented by: X1 = 0.10 (0.001% w/v); X2 = 0.80 (0.400% w/v); X3 = 0.10 (0.010% w/v). Additionally, in order to minimize the response (D-value), a numerical strategy based on the desirability function was used, resulting in the following combination of independent variables: X1 = 0.25 (0.0025% w/v); X2 = 0.75 (0.375% w/v); X3 = 0. The formulas derived from the two strategies (graphic and numerical) were submitted to microbial challenge, and the experimental D-values were compared to the theoretical D-values calculated from the cubic equations. The D-values were similar in all the assays except the one related to Staphylococcus aureus. This microorganism, like Pseudomonas aeruginosa, showed intense susceptibility to the formulas regardless of the preservative and EDTA concentrations. The formulas derived from both the graphic and numerical strategies met the criteria recommended by the official method. It was concluded that the proposed model allowed the optimization of the formulas with respect to their preservation.
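
Under the usual log-linear kill model, the D-value is the time required for a one-log10 reduction in survivors, so it can be estimated from a linear fit, as in the sketch below (invented counts):

```python
# D-value from an exponential killing model: fit log10(survivors) vs. time;
# the D-value is the time for a 1-log10 reduction. Data below are invented.
import numpy as np

t = np.array([0, 6, 24, 48, 168])                 # hours after challenge
counts = np.array([1e6, 3e5, 1e4, 2e2, 1e0])      # CFU/mL survivors

slope, intercept = np.polyfit(t, np.log10(counts), 1)
D_value = -1.0 / slope                            # hours per log10 reduction
print(f"D-value = {D_value:.1f} h")
```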

Relevance:

10.00%

Publisher:

Abstract:

We have used various computational methodologies, including molecular dynamics, density functional theory, virtual screening, ADMET predictions, and molecular interaction field studies, to design and analyze four novel potential inhibitors of farnesyltransferase (FTase). Evaluation of two proposals regarding their potential as drugs and as lead compounds indicated them to be novel, promising FTase inhibitors with theoretically interesting pharmacotherapeutic profiles when compared to the very active and most cited FTase inhibitors with reported activity data, which are launched drugs or compounds in clinical tests. One of our two proposals appears to be the more promising drug candidate and FTase inhibitor, but both derivative molecules indicate potentially very good pharmacotherapeutic profiles in comparison with Tipifarnib and Lonafarnib, two reference pharmaceuticals. Two other proposals were selected with virtual screening approaches and investigated by LIS, suggesting novel and alternative scaffolds for the design of future potential FTase inhibitors. Such compounds can be explored as promising molecules with which to initiate a research protocol aimed at discovering novel anticancer drug candidates targeting farnesyltransferase, in the fight against cancer.

Relevance:

10.00%

Publisher:

Abstract:

Time-averaged conformations of (±)-1-[3,4-(methylenedioxy)phenyl]-2-methylaminopropane hydrochloride (MDMA, "ecstasy") in D₂O, and of its free base and trifluoroacetate in CDCl₃, were deduced from their ¹H NMR spectra and used to calculate their conformer distribution. Their rotational potential energy surface (PES) was calculated at the RHF/6-31G(d,p), B3LYP/6-31G(d,p), B3LYP/cc-pVDZ, and AM1 levels. Solvent effects were evaluated using the polarizable continuum model. The NMR and theoretical studies showed that, in the free base, the N-methyl group and the ring are preferentially trans. This preference is stronger in the salts and corresponds to the X-ray structure of the hydrochloride. However, the energy barriers separating these forms are very low. The X-ray diffraction crystal structures of the anhydrous salt and its monohydrate differed mainly in the trans or cis relationship of the N-methyl group to the α-methyl, although these two forms interconvert freely in solution.
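
As a worked example of turning a computed PES into a conformer distribution, the sketch below applies Boltzmann weighting at 298 K to illustrative relative energies (not the paper's values):

```python
# Boltzmann conformer distribution from relative energies (e.g., taken from
# the rotational PES). Energies below are illustrative, not from the paper.
import numpy as np

R = 8.314e-3            # kJ/(mol K)
T = 298.15
dE = np.array([0.0, 2.5, 4.0])          # kJ/mol relative to the trans form

weights = np.exp(-dE / (R * T))
populations = weights / weights.sum()
for name, p in zip(["trans", "gauche+", "gauche-"], populations):
    print(f"{name}: {100 * p:.1f}%")
```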

Relevance:

10.00%

Publisher:

Abstract:

We present a new technique for obtaining model fittings to very long baseline interferometric images of astrophysical jets. The method minimizes a performance function proportional to the sum of the squared differences between the model and observed images. The model image is constructed by summing N_s elliptical Gaussian sources, each characterized by six parameters: two-dimensional peak position, peak intensity, eccentricity, amplitude, and orientation angle of the major axis. We present results for the fitting of two benchmark jets: the first constructed from three individual Gaussian sources, the second formed by five Gaussian sources. Both jets were analyzed by our cross-entropy technique in finite and infinite signal-to-noise regimes, with the background noise chosen to mimic that found in interferometric radio maps. The images were constructed to simulate most of the conditions encountered in interferometric images of active galactic nuclei. We show that the cross-entropy technique is capable of recovering the parameters of the sources with an accuracy similar to that obtained with the very traditional Astronomical Image Processing System (AIPS) task IMFIT when the image is relatively simple (e.g., few components). For more complex interferometric maps, our method displays superior performance in recovering the parameters of the jet components. Our methodology is also able to determine quantitatively the number of individual components present in an image. An additional application of the cross-entropy technique to a real image of a BL Lac object is shown and discussed. Our results indicate that our cross-entropy model-fitting technique should be used in situations involving the analysis of complex emission regions having more than three sources, even though it is substantially slower than current model-fitting tasks (at least 10,000 times slower on a single processor, depending on the number of sources to be optimized). As with any model fitting performed in the image plane, caution is required when analyzing images constructed from a poorly sampled (u, v) plane.
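
A sketch of the model image and performance function follows; the exact parameterization of the six parameters is an assumption, and the cross-entropy optimizer itself is omitted:

```python
# Sum of elliptical Gaussians scored against an observed map by the sum of
# squared residuals, as the abstract describes. The parameterization
# (x0, y0, peak, semimajor axis, eccentricity, position angle) is an
# assumption; the paper optimizes this with a cross-entropy method.
import numpy as np

def elliptical_gaussian(X, Y, x0, y0, peak, a, ecc, theta):
    b = a * np.sqrt(1.0 - ecc**2)                # semiminor axis
    dx, dy = X - x0, Y - y0
    u = dx * np.cos(theta) + dy * np.sin(theta)  # rotate into major-axis frame
    v = -dx * np.sin(theta) + dy * np.cos(theta)
    return peak * np.exp(-0.5 * ((u / a) ** 2 + (v / b) ** 2))

def model_image(X, Y, sources):
    return sum(elliptical_gaussian(X, Y, *s) for s in sources)

def performance(observed, X, Y, sources):
    """Sum of squared differences between model and observed images."""
    return np.sum((model_image(X, Y, sources) - observed) ** 2)

# Toy 64x64 map with two components plus noise.
X, Y = np.meshgrid(np.arange(64), np.arange(64))
truth = [(20, 30, 1.0, 4.0, 0.6, 0.3), (45, 35, 0.5, 3.0, 0.8, 1.2)]
observed = model_image(X, Y, truth) + 0.01 * np.random.randn(64, 64)
print(performance(observed, X, Y, truth))   # near the noise floor
```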

Relevance:

10.00%

Publisher:

Abstract:

We report the use of optical coherence tomography (OCT) to detect and quantify the demineralization process induced by S. mutans biofilm in human third molars. Artificial lesions were induced by an S. mutans microbiological culture, and the samples (N = 50) were divided into groups according to the demineralization time: 3, 5, 7, 9, and 11 days. The OCT system was implemented using a light source delivering an average power of 96 μW in the sample arm, with spectral characteristics allowing 23 μm of axial resolution. The images were produced with lateral scan steps of 10 μm and analyzed individually. From the evaluation of these images, lesion depth was calculated as a function of demineralization time. The depth of the lesion in the root dentine increased from 70 μm to 230 μm (corrected by the enamel refractive index, 1.62 at 856 nm), depending on the exposure time. The lesion depth in root dentine was correlated with the demineralization time, showing that it follows a geometric progression like a bacterial growth law. [Figure: progression of lesion depth in root dentine as a function of exposure time, following a geometric progression like a bacterial growth law.]
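
Since a geometric progression in depth means log(depth) is linear in time, the growth rate can be recovered with a log-linear fit, as sketched below with illustrative depths spanning the reported 70-230 μm range:

```python
# Log-linear fit of lesion depth vs. exposure time: a geometric progression
# implies log(depth) grows linearly. Depths below are invented, roughly
# spanning the reported 70-230 um range.
import numpy as np

days = np.array([3, 5, 7, 9, 11])
depth_um = np.array([70, 95, 130, 175, 230])     # illustrative values

rate, log_d0 = np.polyfit(days, np.log(depth_um), 1)
print(f"growth factor per day = {np.exp(rate):.2f}")
```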

Relevance:

10.00%

Publisher:

Abstract:

Visualization of high-dimensional data requires a mapping to a visual space. Whenever the goal is to preserve similarity relations, a frequent strategy is to use 2D projections, which afford intuitive interactive exploration, e.g., by users locating and selecting groups and gradually drilling down to individual objects. In this paper, we propose a framework for projecting high-dimensional data to 3D visual spaces, based on a generalization of the Least-Square Projection (LSP). We compare projections to 2D and 3D visual spaces both quantitatively and through a user study considering certain exploration tasks. The quantitative analysis confirms that 3D projections outperform 2D projections in terms of precision. The user study indicates that certain tasks can be more reliably and confidently answered with 3D projections. Nonetheless, as 3D projections are displayed on 2D screens, interaction is more difficult. Therefore, we incorporate suitable interaction functionalities into a framework that supports 3D transformations, predefined optimal 2D views, coordinated 2D and 3D views, and hierarchical 3D cluster definition and exploration. For visually encoding data clusters in a 3D setup, we employ color coding of projected data points as well as four types of surface renderings. A second user study evaluates the suitability of these visual encodings. Several examples illustrate the framework's applicability for both visual exploration of multidimensional abstract (non-spatial) data and the feature space of multi-variate spatial data.

Relevance:

10.00%

Publisher:

Abstract:

Point placement strategies aim at mapping data points represented in higher dimensions to bi-dimensional spaces and are frequently used to visualize relationships amongst data instances. They have been valuable tools for the analysis and exploration of data sets of various kinds. Many conventional techniques, however, do not behave well when the number of dimensions is high, as in the case of document collections. Later approaches handle that shortcoming, but may cause too much clutter to allow flexible exploration to take place. In this work we present a novel hierarchical point placement technique capable of dealing with these problems. While good grouping and separation of data with high similarity are maintained without increasing the computational cost, its hierarchical structure lends itself both to exploration at various levels of detail and to handling data in subsets, improving analysis capability and also allowing the manipulation of larger data sets.

Relevance:

10.00%

Publisher:

Abstract:

The problem of projecting multidimensional data into lower dimensions has been pursued by many researchers due to its potential application to data analyses of various kinds. This paper presents a novel multidimensional projection technique based on least square approximations. The approximations compute the coordinates of a set of projected points based on the coordinates of a reduced number of control points with defined geometry. We name the technique Least Square Projections (LSP). From an initial projection of the control points, LSP defines the positioning of their neighboring points through a numerical solution that aims at preserving a similarity relationship between the points, given by a metric in the m-dimensional space. In order to perform the projection, only a small number of distance calculations are necessary, and no repositioning of the points is required to obtain a final solution with satisfactory precision. The results show the capability of the technique to form groups of points by degree of similarity in 2D. We illustrate that capability through its application to mapping collections of textual documents from varied sources, a strategic yet difficult application. LSP is faster and more accurate than other existing high-quality methods, particularly where it was mostly tested, that is, for mapping text sets.
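
A compact sketch of the LSP idea follows: control points are pinned at given 2D coordinates and every other point is asked to sit at the average of its neighbors' positions, yielding a least-squares system (neighborhood construction and weighting here are simplified assumptions):

```python
# Least-squares projection in the spirit of LSP: Laplacian-style rows ask each
# point to sit at the mean of its mD neighbors; constraint rows pin the
# control points. Neighborhood construction and weighting are simplified.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def lsp(data, ctrl_idx, ctrl_2d, k=8):
    n = len(data)
    nbrs = NearestNeighbors(n_neighbors=k + 1).fit(data)
    _, idx = nbrs.kneighbors(data)               # idx[i, 0] is i itself

    # Laplacian-style rows: x_i - mean(neighbors of i) = 0
    L = np.zeros((n + len(ctrl_idx), n))
    for i in range(n):
        L[i, i] = 1.0
        L[i, idx[i, 1:]] = -1.0 / k
    # Constraint rows pinning the control points to their given 2D positions.
    b = np.zeros((n + len(ctrl_idx), 2))
    for r, (c, pos) in enumerate(zip(ctrl_idx, ctrl_2d)):
        L[n + r, c] = 1.0
        b[n + r] = pos
    return np.linalg.lstsq(L, b, rcond=None)[0]  # n x 2 projected coordinates

data = np.random.rand(200, 50)                   # 200 points in 50 dimensions
ctrl = [0, 50, 100, 150]
ctrl_2d = [(0, 0), (1, 0), (0, 1), (1, 1)]       # e.g., from a quick initial projection
proj = lsp(data, ctrl, ctrl_2d)
print(proj.shape)                                # (200, 2)
```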