927 results for graphics processor
Abstract:
In symbolic Natural Language Processing (NLP) systems, several linguistic phenomena, for instance the thematic role relationships between sentence constituents such as AGENT, PATIENT, and LOCATION, can be accounted for by a rule-based grammar. Another approach to NLP uses the connectionist model, which has the benefits of learning, generalization, and fault tolerance, among others. A third option merges the two previous approaches into a hybrid one: a symbolic thematic theory is used to supply the connectionist network with initial knowledge. Inspired by neuroscience, a symbolic-connectionist hybrid system called BIO theta PRED (BIOlogically plausible thematic (theta) symbolic-connectionist PREDictor) is proposed, designed to reveal the thematic grid assigned to a sentence. Its connectionist architecture comprises, as input, a featural representation of the words (based on the verb/noun WordNet classification and on the classical semantic microfeature representation) and, as output, the thematic grid assigned to the sentence. BIO theta PRED is designed to "predict" the thematic (semantic) roles assigned to words in a sentence context, employing a biologically inspired training algorithm and architecture and adopting a psycholinguistic view of thematic theory.
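The abstract describes the network only at a high level. As a purely illustrative sketch (not the paper's biologically inspired training algorithm or architecture), a plain feedforward classifier mapping a word's semantic-microfeature vector to a thematic-role label could look as follows; the role set, feature count, and layer sizes are assumptions:

```python
# Minimal sketch (not BIO theta PRED itself): a plain feedforward classifier
# mapping a word's semantic-microfeature vector to a thematic-role label such
# as AGENT, PATIENT, or LOCATION. Sizes and labels are illustrative assumptions.
import numpy as np

ROLES = ["AGENT", "PATIENT", "LOCATION"]           # assumed label set
N_FEATURES = 32                                    # assumed microfeature count

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(N_FEATURES, 16))  # input -> hidden
W2 = rng.normal(scale=0.1, size=(16, len(ROLES)))  # hidden -> roles

def predict_role(microfeatures: np.ndarray) -> str:
    """Return the most probable thematic role for one word's feature vector."""
    hidden = np.tanh(microfeatures @ W1)
    logits = hidden @ W2
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return ROLES[int(np.argmax(probs))]

# Example: a prediction from the (untrained) network for one random word vector.
print(predict_role(rng.random(N_FEATURES)))
```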
Abstract:
This paper presents a framework for building medical training applications using virtual reality, together with a tool that assists in instantiating the framework's classes. The main purpose is to make it easier to build virtual reality applications for medical training, targeting systems that simulate biopsy exams and providing deformation, collision detection, and stereoscopy functionalities. Instantiating the classes allows quick implementation of tools for this purpose, reducing errors and keeping costs low through the use of open-source tools. With the instantiation tool, the process of building applications is fast and easy, so programmers can obtain an initial application and adapt it to their needs. The tool allows the user to include, delete, and edit parameters in the chosen functionalities, as well as to store these parameters for future use. To verify the efficiency of the framework, some case studies are presented.
Abstract:
This paper presents a compact embedded fuzzy system for three-phase induction-motor scalar speed control. The control strategy consists of keeping the voltage-frequency ratio of the induction-motor supply source constant. A fuzzy-control system is built on a digital signal processor, which uses the speed error and the speed-error variation to change both the fundamental voltage amplitude and the frequency of a sinusoidal pulsewidth-modulation inverter. An alternative optimized method for embedded fuzzy-system design is also proposed. The controller performance, with respect to reference and load-torque variations, is evaluated through experimental results. A comparative analysis with a conventional proportional-integral controller is also presented.
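As a hedged sketch of the control idea (not the paper's tuned, DSP-embedded design), a small Mamdani-style fuzzy controller can take the speed error and its variation, output a frequency increment, and then let the voltage amplitude track the frequency so the V/f ratio stays constant; membership functions, the rule table, and the V/f gain below are illustrative assumptions:

```python
# Toy scalar V/f control step with a 3x3 fuzzy rule table (illustrative only).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Linguistic terms Negative, Zero, Positive on a normalized universe around [-1, 1].
TERMS = {"N": (-2.0, -1.0, 0.0), "Z": (-1.0, 0.0, 1.0), "P": (0.0, 1.0, 2.0)}
# Rule table: (error term, delta-error term) -> crisp frequency increment [Hz].
RULES = {("N", "N"): -2.0, ("N", "Z"): -1.0, ("N", "P"): 0.0,
         ("Z", "N"): -1.0, ("Z", "Z"):  0.0, ("Z", "P"): 1.0,
         ("P", "N"):  0.0, ("P", "Z"):  1.0, ("P", "P"): 2.0}

def fuzzy_freq_increment(error, d_error):
    """Weighted-average defuzzification over the rule table."""
    num = den = 0.0
    for (e_t, de_t), df in RULES.items():
        w = tri(error, *TERMS[e_t]) * tri(d_error, *TERMS[de_t])
        num += w * df
        den += w
    return num / den if den else 0.0

V_PER_HZ = 220.0 / 60.0                              # assumed rated voltage / rated frequency

f = 30.0                                             # current inverter frequency [Hz]
f += fuzzy_freq_increment(error=0.4, d_error=-0.1)   # normalized controller inputs
v = V_PER_HZ * f                                     # constant V/f scalar law
print(f"frequency = {f:.2f} Hz, voltage amplitude = {v:.1f} V")
```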
Abstract:
Since the 1990s several large companies have been publishing nonfinancial performance reports. Focusing initially on the physical environment, these reports evolved to consider social relations, as well as data on the firm's economic performance. A few mining companies pioneered this trend, and in recent years some of them have incorporated the three dimensions of sustainable development, publishing so-called sustainability reports. This article reviews 31 reports published between 2001 and 2006 by four major mining companies. A set of 62 assessment items organized in six categories (namely context and commitment, management, environmental, social and economic performance, and accessibility and assurance) was selected to guide the review. The items were derived from the international literature and recommended best practices, including the Global Reporting Initiative G3 framework. A content analysis was performed using the report as the sampling unit, and phrases, graphics, or tables containing certain information as data collection units. A basic rating scale (0 or 1) was used to note the presence or absence of information, and a final percentage score was obtained for each report. Results show a clear evolution in the reports' comprehensiveness and depth. The categories "accessibility and assurance" and "economic performance" had the lowest scores and do not present a clear evolution trend over the period, whereas the categories "context and commitment" and "social performance" presented the best results and regular improvement; the category "environmental performance," despite not reaching the highest scores, also featured constant evolution. Description of data measurement techniques and more comprehensive third-party verification are the items most in need of improvement.
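The scoring scheme described above reduces to a binary checklist. A minimal sketch of that computation is shown below; the item names, categories, and values are made-up examples, not the study's 62 items:

```python
# Percentage score per report from binary (0/1) presence of assessment items,
# overall and broken down by category. Sample items are illustrative only.
from collections import defaultdict

def score_report(items):
    """items maps item name -> (category, 0 or 1); returns (overall %, per-category %)."""
    per_cat = defaultdict(list)
    for category, present in items.values():
        per_cat[category].append(present)
    overall = 100.0 * sum(p for _, p in items.values()) / len(items)
    by_category = {c: 100.0 * sum(v) / len(v) for c, v in per_cat.items()}
    return overall, by_category

sample = {
    "GHG emissions disclosed":          ("environmental performance", 1),
    "Third-party assurance statement":  ("accessibility and assurance", 0),
    "Stakeholder engagement described": ("social performance", 1),
}
print(score_report(sample))
```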
Abstract:
A simplex-lattice statistical design was employed to study an optimization method for a preservative system in an ophthalmic suspension of dexamethasone and polymyxin B. The assay matrix generated 17 formulas, differentiated by the preservatives and EDTA (disodium ethylenediaminetetraacetate), with the independent variables: X1 = chlorhexidine digluconate (0.010% w/v); X2 = phenylethanol (0.500% w/v); X3 = EDTA (0.100% w/v). The dependent variable was the D-value obtained from the microbial challenge of the formulas, calculated by modeling the microbial killing process with an exponential function. The analysis of the dependent variable, performed using the software Design Expert/W, yielded cubic equations with terms selected by a stepwise fitting method for the challenge microorganisms: Pseudomonas aeruginosa, Burkholderia cepacia, Staphylococcus aureus, Candida albicans, and Aspergillus niger. Besides the mathematical expressions, the response surfaces and contour plots were obtained for each assay. The contour plots were overlaid to identify a region containing the most adequate formulas (graphic strategy), with the representatives: X1 = 0.10 (0.001% w/v); X2 = 0.80 (0.400% w/v); X3 = 0.10 (0.010% w/v). Additionally, to minimize the responses (D-values), a numerical strategy based on the desirability function was used, which resulted in the following combination of independent variables: X1 = 0.25 (0.0025% w/v); X2 = 0.75 (0.375% w/v); X3 = 0. The formulas derived from the two strategies (graphic and numerical) were submitted to microbial challenge, and the experimental D-value was compared to the theoretical D-value calculated from the cubic equation. The two D-values were similar in all assays except the one related to Staphylococcus aureus. This microorganism, as well as Pseudomonas aeruginosa, was highly susceptible to the formulas regardless of the preservative and EDTA concentrations. Both formulas derived from the graphic and numerical strategies met the criteria recommended by the official method. It was concluded that the proposed model allowed the optimization of the formulas with respect to their preservation.
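Under the exponential (log-linear) kill model used above, log10 N(t) = log10 N0 - t/D, so the D-value is the time for a one-log (90%) reduction. A minimal sketch of how such a D-value can be estimated from challenge-test counts is shown below; the survivor counts are invented for illustration, not the study's data:

```python
# Estimate a D-value by least-squares fit of log10(survivor count) versus time.
import numpy as np

def d_value(times_h, counts_cfu):
    """Slope of log10(counts) vs time under log-linear kill; D = -1 / slope."""
    slope, _intercept = np.polyfit(times_h, np.log10(counts_cfu), 1)
    return -1.0 / slope

times = np.array([0.0, 6.0, 24.0, 48.0])        # hours after inoculation
counts = np.array([1e6, 2.5e5, 4.0e3, 1.6e1])   # surviving CFU/mL (made-up values)
print(f"D-value ~ {d_value(times, counts):.1f} h")
```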
Abstract:
We have used various computational methodologies, including molecular dynamics, density functional theory, virtual screening, ADMET predictions, and molecular interaction field studies, to design and analyze four novel potential inhibitors of farnesyltransferase (FTase). Evaluation of two of the proposals, regarding their potential as drug and lead compounds, has indicated them as promising novel FTase inhibitors, with theoretically interesting pharmacotherapeutic profiles when compared to the highly active and most cited FTase inhibitors with reported activity data, which are launched drugs or compounds in clinical tests. One of our two proposals appears to be the more promising drug candidate and FTase inhibitor, but both derivative molecules show potentially very good pharmacotherapeutic profiles in comparison with Tipifarnib and Lonafarnib, two reference pharmaceuticals. Two other proposals were selected with virtual screening approaches and investigated by LIS, which suggests novel and alternative scaffolds for designing future potential FTase inhibitors. Such compounds can be explored as promising molecules to initiate a research protocol aimed at discovering novel anticancer drug candidates targeting farnesyltransferase. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
Time-averaged conformations of (+/-)-1-[3,4-(methylenedioxy)phenyl]-2-methylaminopropane hydrochloride (MDMA, "ecstasy") in D2O, and of its free base and trifluoroacetate in CDCl3, were deduced from their 1H NMR spectra and used to calculate their conformer distribution. Their rotational potential energy surface (PES) was calculated at the RHF/6-31G(d,p), B3LYP/6-31G(d,p), B3LYP/cc-pVDZ and AM1 levels. Solvent effects were evaluated using the polarizable continuum model. The NMR and theoretical studies showed that, in the free base, the N-methyl group and the ring are preferentially trans. This preference is stronger in the salts and corresponds to the X-ray structure of the hydrochloride. However, the energy barriers separating these forms are very low. The X-ray diffraction crystal structures of the anhydrous salt and its monohydrate differed mainly in the trans or cis relationship of the N-methyl group to the α-methyl, although these two forms interconvert freely in solution. (C) 2007 Elsevier Inc. All rights reserved.
Abstract:
The demand for more pixels is beginning to be met as manufacturers increase the native resolution of projector chips. Tiling several projectors still offers a solution to augment the pixel capacity of a display. However, problems of color and illumination uniformity across projectors need to be addressed as well as the computer software required to drive such devices. We present the results obtained on a desktop-size tiled projector array of three D-ILA projectors sharing a common illumination source. A short throw lens (0.8:1) on each projector yields a 21-in. diagonal for each image tile; the composite image on a 3×1 array is 3840×1024 pixels with a resolution of about 80 dpi. The system preserves desktop resolution, is compact, and can fit in a normal room or laboratory. The projectors are mounted on precision six-axis positioners, which allow pixel level alignment. A fiber optic beamsplitting system and a single set of red, green, and blue dichroic filters are the key to color and illumination uniformity. The D-ILA chips inside each projector can be adjusted separately to set or change characteristics such as contrast, brightness, or gamma curves. The projectors were then matched carefully: photometric variations were corrected, leading to a seamless image. Photometric measurements were performed to characterize the display and are reported here. This system is driven by a small PC cluster fitted with graphics cards and running Linux. It can be scaled to accommodate an array of 2×3 or 3×3 projectors, thus increasing the number of pixels of the final image. Finally, we present current uses of the display in fields such as astrophysics and archaeology (remote sensing).
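The quoted figures are easy to cross-check: a 3x1 array of 3840x1024 pixels implies 1280x1024 (5:4) per tile, and a 21-inch diagonal per tile then gives roughly 80 dpi. A quick arithmetic sketch:

```python
# Sanity-check the tiled-display figures: tile resolution inferred from the
# 3840x1024 composite, dpi derived from the quoted 21-inch tile diagonal.
import math

tile_w, tile_h = 1280, 1024            # per-tile resolution (3840/3 x 1024)
diagonal_in = 21.0                     # quoted tile diagonal
cols, rows = 3, 1

width_in = diagonal_in * tile_w / math.hypot(tile_w, tile_h)
print(cols * tile_w, "x", rows * tile_h, "pixels")   # 3840 x 1024
print(f"~{tile_w / width_in:.0f} dpi")               # ~78 dpi, i.e. about 80 dpi
```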
Abstract:
One of the challenges in scientific visualization is to generate software libraries suitable for the large-scale data emerging from tera-scale simulations and instruments. We describe the efforts currently under way at SDSC and NPACI to address these challenges. The scope of the SDSC project spans data handling, graphics, visualization, and scientific application domains. Components of the research focus on the following areas: intelligent data storage, layout and handling, using an associated “Floor-Plan” (metadata); performance optimization on parallel architectures; extension of SDSC’s scalable, parallel, direct volume renderer to allow perspective viewing; and interactive rendering of fractional images (“imagelets”), which facilitates the examination of large datasets. These concepts are coordinated within a data-visualization pipeline, which operates on component data blocks sized to fit within the available computing resources. A key feature of the scheme is that the metadata, which tag the data blocks, can be propagated and applied consistently. This is possible at the disk level; in distributing the computations across parallel processors; in “imagelet” composition; and in feature tagging. The work reflects the emerging challenges and opportunities presented by the ongoing progress in high-performance computing (HPC) and the deployment of the data, computational, and visualization Grids.
Abstract:
Background. Age-related motor slowing may reflect motor programming deficits, poorer movement execution, or mere strategic preferences for online guidance of movement. We controlled for such preferences by limiting the extent to which movements could be programmed. Methods. Twenty-four young and 24 older adults performed a line drawing task that allowed movements to be prepared in advance in one case (i.e., a cue initially available indicating target location) and not in another (i.e., no cue initially available as to target location). Participants connected large or small targets illuminated by light-emitting diodes upon a graphics tablet that sampled pen tip position at 200 Hz. Results. Older adults had disproportionate difficulty initiating movement when prevented from programming in advance. Older adults produced slower, less efficient movements, particularly when prevented from programming under greater precision requirements. Conclusions. The slower movements of older adults do not simply reflect a preference for online control, as older adults have less efficient movements when forced to reprogram their movements. Age-related motor slowing kinematically resembles that seen in patients with cerebellar dysfunction.
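As a hedged sketch of the kind of kinematic measures obtainable from a tablet sampling pen-tip position at 200 Hz (not the study's actual analysis pipeline), the tangential velocity profile, peak velocity, and a threshold-based movement time can be computed as follows; the threshold and the synthetic stroke are assumptions:

```python
# Basic stroke kinematics from 200 Hz pen-tip samples: tangential velocity,
# peak velocity, and movement time from a simple velocity threshold.
import numpy as np

FS = 200.0                                               # sampling rate [Hz]
t = np.arange(0, 1.0, 1 / FS)
x = 100.0 * (t - np.sin(2 * np.pi * t) / (2 * np.pi))    # smooth synthetic stroke [mm]
y = np.zeros_like(x)

speed = np.hypot(np.gradient(x, 1 / FS), np.gradient(y, 1 / FS))  # mm/s
moving = speed > 0.05 * speed.max()                               # 5%-of-peak threshold
onset = int(np.argmax(moving))
offset = len(moving) - 1 - int(np.argmax(moving[::-1]))

print(f"peak velocity ~ {speed.max():.0f} mm/s")
print(f"movement time ~ {(offset - onset) / FS * 1000:.0f} ms")
```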
Abstract:
In this and a preceding paper, we provide an introduction to the Fujitsu VPP range of vector-parallel supercomputers and to some of the computational chemistry software available for the VPP. Here, we consider the implementation and performance of seven popular chemistry application packages. The codes discussed range from classical molecular dynamics to semiempirical and ab initio quantum chemistry. All have evolved from sequential codes, and have typically been parallelised using a replicated data approach. As such they are well suited to the large-memory/fast-processor architecture of the VPP. For one code, CASTEP, a distributed-memory data-driven parallelisation scheme is presented. (C) 2000 Published by Elsevier Science B.V. All rights reserved.
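The replicated-data approach mentioned above keeps a full copy of the system on every processor and sums partial results across processors. A toy sketch of that pattern in Python with mpi4py is shown below; it is not any of the packages discussed (which are compiled codes), and the pair potential and system are invented for illustration:

```python
# Replicated-data pattern: every rank holds all coordinates, computes forces for
# its own slice of atoms, then the partial force arrays are summed across ranks.
# Run with e.g.: mpiexec -n 4 python replicated_data_sketch.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_atoms = 64
rng = np.random.default_rng(42)           # same seed on every rank -> replicated data
pos = rng.random((n_atoms, 3)) * 10.0

local_forces = np.zeros_like(pos)
for i in range(rank, n_atoms, size):      # round-robin decomposition over atoms
    r = pos - pos[i]
    d2 = (r * r).sum(axis=1)
    d2[i] = np.inf                        # skip self-interaction
    local_forces[i] = (r / d2[:, None] ** 2).sum(axis=0)   # toy pair force

forces = np.zeros_like(local_forces)
comm.Allreduce(local_forces, forces, op=MPI.SUM)  # all ranks receive the full force array

if rank == 0:
    print("total |F| =", np.linalg.norm(forces))
```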
Abstract:
EXAFS spectra of [(HC(Ph2PO)3)2Cu](ClO4)2·2H2O have been measured at room temperature. These show that the CuO6 unit is tetragonally elongated, rather than having the compressed tetragonal geometry previously inferred from the X-ray crystal structure determination.
Abstract:
Although planning is important for the functioning of patients with dementia of the Alzheimer Type (DAT), little is known about response programming in DAT. This study used a cueing paradigm coupled with quantitative kinematic analysis to document the preparation and execution of movements made by a group of 12 DAT patients and their age- and sex-matched controls. Participants connected a series of targets placed upon a WACOM SD420 graphics tablet, in response to the pattern of illumination of a set of light-emitting diodes (LEDs). In one condition, participants could programme the upcoming movement, whilst in another they were forced to reprogramme this movement on-line (i.e. they were not provided with advance information about the location of the upcoming target). DAT patients were found to have programming deficits, taking longer to initiate movements, particularly in the absence of cues. While problems spontaneously programming a movement might cause a greater reliance upon on-line guidance, when both groups were required to guide the movement on-line, DAT patients continued to show slower and less efficient movements, implying declining sensori-motor function; these differences were not simply due to strategy or medication status. (C) 1997 Elsevier Science Ltd.
Abstract:
This study aimed to quantify the efficiency and smoothness of voluntary movement in Huntington's disease (HD) by the use of a graphics tablet that permits analysis of movement profiles. In particular, we aimed to ascertain whether a concurrent task (digit span) would affect the kinematics of goal-directed movements. Twelve patients with HD and their matched controls performed 12 vertical zig-zag movements, with both left and right hands (with and without the concurrent task), to large or small circular targets over long or short extents. The concurrent task was associated with shorter movement times and reduced right-hand superiority. Patients with HD were overall slower, especially with long strokes, and had similar peak velocities for both small and large targets, whereas controls could better accommodate differences in target size. Patients with HD spent more time decelerating, especially with small targets, whereas controls allocated more nearly equal proportions of time to the acceleration and deceleration phases of movement, especially with large targets. Short strokes were generally more force efficient than long strokes, especially so for either hand in either group in the absence of the concurrent task, and for the right hand in its presence. With the concurrent task, however, the left hand's behavior changed differentially for the two groups; for patients with HD, it became more force efficient with short strokes and even less efficient with long strokes, whereas for controls, it became more efficient with long strokes. Controls may be able to divert attention away from the inferior left hand, increasing its automaticity, whereas patients with HD, because of the disease, may be forced to engage even more online visual control under the demands of a concurrent task. Patients with HD may thus become increasingly reliant on terminal visual guidance, which indicates an impairment in constructing and refining an internal representation of the movement necessary for its effective execution. Basal ganglia dysfunction may impair the ability to use internally generated cues to guide movement.
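The acceleration/deceleration split discussed above can be computed by dividing a stroke's velocity profile at the peak-velocity sample. A minimal sketch is shown below; the synthetic, strongly skewed profile (most of its time spent decelerating) is an illustrative assumption, not the patients' data:

```python
# Fraction of a stroke spent decelerating, given its tangential-velocity profile.
import numpy as np

def deceleration_fraction(speed):
    """Fraction of the stroke's samples that fall after the peak-velocity sample."""
    peak = int(np.argmax(speed))
    return (len(speed) - 1 - peak) / (len(speed) - 1)

t = np.arange(0, 0.6, 1 / 200.0)                  # 200 Hz samples over 0.6 s
speed = 300.0 * (t / 0.1) * np.exp(1 - t / 0.1)   # skewed toy profile, peak at t = 0.1 s
print(f"time decelerating: {100 * deceleration_fraction(speed):.0f}% of the stroke")
```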
Abstract:
Objectives: This study adopted a concurrent task design and aimed to quantify the efficiency and smoothness of voluntary movement in Tourette's syndrome via the use of a graphics tablet which permits analysis of movement profiles. In particular, the aim was to ascertain whether a concurrent task (digit span) would affect the kinematics of goal-directed movements, and whether patients with Tourette's syndrome would exhibit abnormal functional asymmetries compared with their matched controls. Methods: Twelve patients with Tourette's syndrome and their matched controls performed 12 vertical zig-zag movements, with both left and right hands (with and without the concurrent task), to large or small targets over long or short extents. Results: With short strokes, controls showed the predicted right-hand superiority in movement time more strongly than patients with Tourette's syndrome, who instead showed greater hand symmetry with short strokes. The right hand of controls was less force efficient with long strokes and more force efficient with short strokes, whereas either hand of patients with Tourette's syndrome was equally force efficient, irrespective of stroke length, with an overall performance profile similar to but better than that of the controls' left hand. The concurrent task, however, increased the force efficiency of the right hand in patients with Tourette's syndrome and of the left hand in controls. Conclusions: Patients with Tourette's syndrome, compared with controls, were not impaired in the performance of fast, goal-directed movements such as aiming at targets; in certain respects they performed better than controls. The findings clearly add to the growing literature on anomalous lateralisation in Tourette's syndrome, which may be explained by the recently reported loss of normal basal ganglia asymmetries in that disorder.