993 results for Graphical processing units


Relevance:

30.00%

Publisher:

Abstract:

Thermoplastic starch (TPS) was modified with ascorbic acid and citric acid by melt processing of native starch with glycerol as plasticizer in an intensive batch mixer at 160 °C. It was found that the molar mass decreases with acid content and processing time, causing a reduction in the melting temperature (Tm). As shown by X-ray diffraction and DSC measurements, crystallinity was not changed by the reaction with the organic acids. The Tm depression with falling molar mass was interpreted in terms of the concentration of chain-end units, which act as diluents. FTIR did not show any appreciable change in the chemical composition of the starch, leading to the conclusion that the main changes observed were produced by the variation in the molar mass of the material. We demonstrated that it is possible to decrease the melt viscosity without adding more plasticizer, thus avoiding side effects such as increased water affinity or relevant changes in the dynamic mechanical properties. (C) 2010 Elsevier B.V. All rights reserved.
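The end-group argument is usually made quantitative with Flory's melting-point depression relation for finite chains; the block below is a textbook sketch of that reasoning, not an equation quoted from the paper itself:

```latex
% Flory's melting-point depression for finite chains, chain ends acting as diluent
% (standard textbook relation, offered as a sketch of the abstract's argument):
\frac{1}{T_m} - \frac{1}{T_m^0} = \frac{2R}{\Delta H_u}\,\frac{1}{\bar{x}_n}
% T_m^0      : equilibrium melting point of an infinitely long chain
% \Delta H_u : enthalpy of fusion per repeat unit
% \bar{x}_n  : number-average degree of polymerization
```

As acid-catalysed chain scission lowers the molar mass, the number-average degree of polymerization falls and the predicted Tm drops, consistent with the trend reported above.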

Relevance:

30.00%

Publisher:

Abstract:

No fully effective treatment has been developed since the discovery of Chagas' disease. Since drug-resistant Trypanosoma cruzi strains are emerging and the current therapy is effective in the acute phase but has various adverse side effects, more studies are needed to characterize the susceptibility of T. cruzi to new drugs. Pre-mRNA maturation in trypanosomatids occurs through a process called trans-splicing, an unusual RNA processing reaction that converts polycistronic transcription units into individual mRNAs: a short spliced leader transcript (SL RNA) is trans-spliced onto the acceptor pre-mRNA, giving origin to the mature mRNA. Cubebin derivatives seem to provide treatments with fewer collateral effects than benznidazole and showed similar or better trypanocidal activities. Therefore, the interference of the cubebin derivatives (-)-6,6'-dinitrohinokinin (DNH) and (-)-hinokinin (HQ) with mRNA processing was evaluated using permeable cells of T. cruzi (Y and BOL (Bolivia) strains), followed by an RNase protection reaction. These substances appear to interfere with some step of RNA transcription, promoting alterations in RNA synthesis, even though the RNA processing mechanism still occurs. Furthermore, HQ presented better activity against the parasites than DNH, and the BOL strain appears to be more resistant than the Y strain.

Relevance:

30.00%

Publisher:

Abstract:

Dissertation submitted for the degree of Doctor in Sustainable Chemistry

Relevance:

30.00%

Publisher:

Abstract:

To evaluate PCR-mediated detection of Mycobacterium tuberculosis DNA as a procedure for the diagnosis of tuberculosis in individuals attending ambulatory services in Primary Health Units of the City Tuberculosis Program in Rio de Janeiro, Brazil, sputum samples were collected and subjected to a DNA extraction procedure using silica-guanidinium thiocyanate. This procedure has been described as highly efficient for the extraction of different kinds of nucleic acids from bacteria and clinical samples. Upon comparing PCR results with the number of acid-fast bacilli, no direct relation was observed between the number of bacilli present in the sample and PCR positivity. Part of the processed samples was therefore spiked with pure M. tuberculosis DNA, and inhibition of the PCR was verified in 22 out of 36 (61%) of the samples, demonstrating that the extraction procedure as originally described should not be used for PCR analysis of sputum samples.

Relevance:

30.00%

Publisher:

Abstract:

Traditionally, braided river research has considered flow, sediment transport processes and, more recently, vegetation dynamics in relation to river morphodynamics. However, when considering the development of woody vegetated patches over a time scale of decades, we must consider the extent to which soil-forming processes, particularly those related to soil organic matter, affect the alluvial geomorphic-vegetation system. Here we quantify the soil organic matter processing (humification) that occurs on young alluvial landforms. We sampled different geomorphic units, ranging from the active river channel to established river terraces, in a braided river system. For each geomorphic unit, soil pits were used to sample sediment/soil layers that were analysed in terms of grain size (<2 mm) and organic matter quantity and quality (Rock-Eval method). A principal components analysis was used to identify patterns in the dataset. Results suggest that during the succession from bare river gravels to a terrace soil, there is a transition from small amounts of external organic matter supplied by sedimentation processes (e.g. organic matter transported in suspension and deposited on bars) to large amounts of autogenic in situ organic matter production due to plant colonisation. This appears to change the time scale and pathways of alluvial succession (bio-geomorphic succession). However, this process is complicated by the ongoing possibility of local sedimentation, which can isolate surface layers from the exogenic supply via aggradation, and by erosion, which tends to create fresh deposits upon which organic matter processing must restart. The result is a complex pattern of organic matter states as well as a general lack of any clear chronosequence within the active river corridor. This state reflects the continual battle between deposition events that can isolate organic matter from the surface, erosion events that can destroy accumulating organic matter, and the early ecosystem processes necessary to assist the co-evolution of soil and vegetation. A key question emerges over the extent to which the fresh organic matter deposited in the active zone is capable of transforming the local geochemical environment sufficiently to accelerate soil development.
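As an illustration of the analysis style described, a principal components analysis over per-layer descriptors could be set up as below; the data layout, variable names and use of scikit-learn are assumptions made for the sketch, not details from the study:

```python
# Minimal PCA sketch over soil-layer descriptors (hypothetical data layout).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows: sampled sediment/soil layers; columns: grain-size and Rock-Eval-style
# organic-matter descriptors (column meanings are illustrative, not the study's).
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 4))  # e.g. [% fines <2 mm, TOC, HI, OI]

X_std = StandardScaler().fit_transform(X)  # PCA on standardised variables
pca = PCA(n_components=2)
scores = pca.fit_transform(X_std)

print("explained variance ratio:", pca.explained_variance_ratio_)
# Plotting `scores` coloured by geomorphic unit (channel, bar, terrace) would
# reveal succession patterns of the kind discussed in the abstract.
```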

Relevance:

30.00%

Publisher:

Abstract:

This paper discusses the qualitative comparative evaluation performed on the results of two machine translation systems with different approaches to the processing of multi-word units. It proposes a solution for overcoming the difficulties multi-word units present to machine translation by adopting a methodology that combines the lexicon-grammar approach with the OpenLogos ontology and semantico-syntactic rules. The paper also discusses the importance of qualitative evaluation metrics to correctly evaluate the performance of machine translation engines with regard to multi-word units.

Relevance:

30.00%

Publisher:

Abstract:

We studied the action of high pressure processing on the inactivation of two foodborne pathogens, Staphylococcus aureus ATCC 6538 and Salmonella enteritidis ATCC 13076, suspended in a culture medium and inoculated into caviar samples. The baroresistance of the two pathogens in a tryptic soy broth suspension at a concentration of 10^8-10^9 colony-forming units/ml was tested for continuous and cycled pressurization in the 150- to 550-MPa range and for 15-min treatments at room temperature. Increasing the number of cycles reduced the pressure level required to totally inactivate both microorganisms in the tryptic soy broth suspension, whereas the effect of different treatment times on complete inactivation of the microorganisms inoculated into caviar was similar.
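Inactivation in such studies is conventionally expressed as a decimal (log10) reduction of the colony-forming-unit count; the sketch below shows that arithmetic with assumed counts, not data from this study:

```python
# Log-reduction arithmetic for microbial inactivation (illustrative counts only).
import math

n0 = 5e8  # hypothetical initial count, within the 1e8-1e9 CFU/ml range used
n = 2e2   # hypothetical surviving count after a pressure treatment

log_reduction = math.log10(n0 / n)
print(f"{log_reduction:.1f}-log reduction")  # ~6.4 for these assumed counts

# "Total inactivation" of a 1e8-1e9 CFU/ml suspension therefore corresponds
# to at least an 8- to 9-log reduction.
```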

Relevance:

30.00%

Publisher:

Abstract:

No fully effective treatment has been developed since the discovery of Chagas' disease by Carlos Chagas in 1909. Since drug-resistant Trypanosoma cruzi strains are emerging and the current therapy is effective in the acute phase but has various adverse side effects, more studies are needed to characterize the susceptibility of T. cruzi to new drugs. Many natural and/or synthetic substances showing trypanocidal activity have been used, even though they are not likely to be turned into clinically approved drugs. Originally, drug screening was performed using natural products, with only limited knowledge of the molecular mechanisms involved in the development of diseases. Trans-splicing, an unusual RNA processing reaction that occurs in nematodes and trypanosomes, converts polycistronic transcription units into individual mRNAs: a short spliced leader transcript (SL RNA) is trans-spliced onto the acceptor pre-mRNA, giving origin to the mature mRNA. In the present study, permeable cells of T. cruzi epimastigote forms (Y, BOL and NCS strains) were treated to evaluate the interference of two drugs (hydroxymethylnitrofurazone, NFOH-121, and nitrofurazone) with the trans-splicing reaction, using silver-stained PAGE analysis. Both drugs induced a significant reduction in RNA processing at concentrations from 5 to 12.5 µM. These data agree with the biological findings, since the number of parasites decreased, especially with NFOH-121. The proposed methodology allows a rapid and cost-effective screening strategy for detecting drug interference in the trans-splicing mechanism of T. cruzi.

Relevance:

30.00%

Publisher:

Abstract:

This lexical decision study with eye tracking of Japanese two-kanji-character words investigated the order in which a whole two-character word and its morphographic constituents are activated in the course of lexical access, the relative contributions of the left and the right characters in lexical decision, the depth to which semantic radicals are processed, and how nonlinguistic factors affect lexical processes. Mixed-effects regression analyses of response times and subgaze durations (i.e., first-pass fixation time spent on each of the two characters) revealed joint contributions of morphographic units at all levels of the linguistic structure with the magnitude and the direction of the lexical effects modulated by readers’ locus of attention in a left-to-right preferred processing path. During the early time frame, character effects were larger in magnitude and more robust than radical and whole-word effects, regardless of the font size and the type of nonwords. Extending previous radical-based and character-based models, we propose a task/decision-sensitive character-driven processing model with a level-skipping assumption: Connections from the feature level bypass the lower radical level and link up directly to the higher character level.
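For readers less familiar with the method, a mixed-effects regression of response times might be set up as in the sketch below; the file name, column names, predictors and by-participant random-intercept structure are illustrative assumptions, and the published analyses are considerably richer:

```python
# Sketch of a mixed-effects model of response times (hypothetical columns).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("lexical_decision.csv")  # hypothetical file, one row per trial

# Log RT predicted by whole-word, character and radical frequencies, with
# by-participant random intercepts. Only the modelling style is illustrated.
model = smf.mixedlm(
    "log_rt ~ word_freq + char_freq_left + char_freq_right + radical_freq",
    data=df,
    groups=df["participant"],
)
result = model.fit()
print(result.summary())
```

The same formula interface would apply to the subgaze durations, fitted separately for the left and right characters.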

Relevance:

30.00%

Publisher:

Abstract:

Almost everyone sketches. People use sketches day in and day out in many different and heterogeneous fields, for example to share their thoughts and clarify ambiguous interpretations. The media used to sketch vary from analog tools like flipcharts to digital tools like smartboards. Whereas analog tools usually suffer from insufficient editing capabilities like cut/copy/paste, digital tools support these scenarios well. Digital tools can be grouped into informal and formal tools. Informal tools can be understood as simple drawing environments, whereas formal tools offer sophisticated support to create, optimize and validate diagrams of a certain application domain. Most digital formal tools force users to stick to a concrete syntax and editing workflow, limiting the user's creativity. For that reason, many people first sketch their ideas using the flexibility of analog or digital informal tools. Subsequently, the sketch is "portrayed" in an appropriate digital formal tool. This work presents Scribble, a highly configurable and extensible sketching framework that can dynamically inject sketching features into existing graphical diagram editors based on Eclipse GEF. This combines the flexibility of informal tools with the power of formal tools without extra effort: no additional code is required to augment a GEF editor with sophisticated sketching features. Scribble recognizes drawn elements as well as handwritten text and automatically generates the corresponding domain elements. A local training data library is created dynamically by incrementally learning shapes drawn by the user. Training data can be shared with others using the WebScribble web application, which was created as part of this work.
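Incremental, template-based shape learning of the kind described is often built from stroke resampling plus nearest-template matching, in the spirit of the $1 recognizer (here without rotation normalisation); the sketch below is a generic illustration of that idea and is not Scribble's actual implementation:

```python
# Generic template-based stroke recogniser sketch (not Scribble's algorithm).
import math

def resample(stroke, n=32):
    """Resample a stroke (list of (x, y) tuples) to n evenly spaced points."""
    d = [0.0]  # cumulative arc length at each original point
    for a, b in zip(stroke, stroke[1:]):
        d.append(d[-1] + math.dist(a, b))
    out = []
    for k in range(n):
        target = d[-1] * k / (n - 1)
        i = next(j for j in range(1, len(d)) if d[j] >= target)
        seg = d[i] - d[i - 1]
        t = 0.0 if seg == 0 else (target - d[i - 1]) / seg
        (ax, ay), (bx, by) = stroke[i - 1], stroke[i]
        out.append((ax + t * (bx - ax), ay + t * (by - ay)))
    return out

def normalise(pts):
    """Translate the centroid to the origin and scale into a unit box."""
    cx = sum(x for x, _ in pts) / len(pts)
    cy = sum(y for _, y in pts) / len(pts)
    pts = [(x - cx, y - cy) for x, y in pts]
    s = max(max(abs(x), abs(y)) for x, y in pts) or 1.0
    return [(x / s, y / s) for x, y in pts]

templates = {}  # label -> list of normalised examples (learned incrementally)

def learn(label, stroke):
    """Add a user-drawn example to the local training library."""
    templates.setdefault(label, []).append(normalise(resample(stroke)))

def recognise(stroke):
    """Return (best_label, distance) against all learned templates."""
    probe = normalise(resample(stroke))
    def dist(tpl):
        return sum(math.dist(p, q) for p, q in zip(probe, tpl)) / len(probe)
    return min(((lab, min(dist(t) for t in ts)) for lab, ts in templates.items()),
               key=lambda pair: pair[1])

learn("line", [(0, 0), (10, 10)])
learn("corner", [(0, 0), (0, 10), (10, 10)])
print(recognise([(1, 1), (9, 11)]))  # -> ('line', small distance)
```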

Relevance:

30.00%

Publisher:

Abstract:

A long development time is needed from the design to the implementation of an AUV. During the first steps, simulation plays an important role, since it allows preliminary versions of the control system to be developed and integrated. Once the robot is ready, the control systems are implemented, tuned and tested. The use of a real-time simulator can help close the gap between off-line simulation and real testing on the implemented robot. When properly interfaced with the robot hardware, a real-time graphical simulation in a "hardware in the loop" configuration allows the implemented control system to be tested while running on the actual robot hardware. Hence, the development time is drastically reduced. This paper overviews the field of graphical simulators used for AUV development and proposes a classification. It also presents NEPTUNE, a multi-vehicle, real-time, graphical simulator based on OpenGL that allows hardware-in-the-loop simulations.
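A hardware-in-the-loop setup of the kind described boils down to a simulator loop that exchanges commands and simulated sensor data with the robot hardware in real time; everything in the sketch below (ports, addresses, message layout, toy dynamics) is an illustrative assumption, not NEPTUNE's interface:

```python
# Skeletal hardware-in-the-loop simulation loop (illustrative; not NEPTUNE).
import socket
import struct
import time

DT = 0.1  # simulator step (s); the loop must keep real-time pace

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 5000))         # hypothetical port for thruster commands
sock.settimeout(DT)
robot_addr = ("192.168.0.10", 5001)  # hypothetical robot hardware address

state = [0.0] * 6  # toy vehicle state: x, y, z, roll, pitch, yaw

while True:
    t0 = time.monotonic()
    try:
        data, _ = sock.recvfrom(1024)
        cmd = struct.unpack_from("4d", data)  # 4 thruster setpoints
    except (socket.timeout, struct.error):
        cmd = (0.0, 0.0, 0.0, 0.0)            # no valid command this cycle
    state[0] += DT * cmd[0]  # placeholder dynamics; a real vehicle model goes here
    sock.sendto(struct.pack("6d", *state), robot_addr)  # simulated sensors out
    time.sleep(max(0.0, DT - (time.monotonic() - t0)))  # hold the real-time rate
```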

Relevance:

30.00%

Publisher:

Abstract:

When performing data fusion, one often measures where targets were and then wishes to deduce where targets currently are. There has been recent research on the processing of such out-of-sequence data. This research has culminated in the development of a number of algorithms for solving the associated tracking problem. This paper reviews these different approaches in a common Bayesian framework and proposes an architecture that orthogonalises the data association and out-of-sequence problems such that any combination of solutions to these two problems can be used together. The emphasis is not on advocating one approach over another on the basis of computational expense, but rather on understanding the relationships among the algorithms so that any approximations made are explicit. Results for a multi-sensor scenario involving out-of-sequence data association are used to illustrate the utility of this approach in a specific context.
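A useful baseline against which out-of-sequence algorithms are judged is to buffer measurements, insert a late one at its correct timestamp, and re-run the filter over the ordered history; the sketch below illustrates this with a generic one-dimensional Kalman filter, a textbook construction rather than the paper's architecture:

```python
# Buffered re-processing baseline for out-of-sequence measurements (1-D Kalman).
import bisect

def kf_step(x, p, z, q=1e-2, r=1e-1):
    """One predict+update step of a 1-D random-walk Kalman filter."""
    p = p + q              # predict: add process noise
    k = p / (p + r)        # Kalman gain
    return x + k * (z - x), (1 - k) * p

class BufferedTracker:
    def __init__(self, x0=0.0, p0=1.0):
        self.x0, self.p0 = x0, p0
        self.log = []  # time-ordered (timestamp, measurement) pairs

    def insert(self, t, z):
        # A late (out-of-sequence) measurement lands at its correct slot...
        bisect.insort(self.log, (t, z))
        # ...and the filter is simply re-run over the ordered history.
        # (A real tracker would also propagate over the time gaps.)
        x, p = self.x0, self.p0
        for _, zi in self.log:
            x, p = kf_step(x, p, zi)
        return x, p

trk = BufferedTracker()
trk.insert(1.0, 0.9)
trk.insert(3.0, 1.2)
print(trk.insert(2.0, 1.0))  # out-of-sequence: arrives after the t=3 data
```

This baseline is exact but grows more expensive as the history lengthens, which is precisely the cost that the more sophisticated out-of-sequence algorithms reviewed here aim to avoid.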

Relevance:

30.00%

Publisher:

Abstract:

Background: Expression microarrays are increasingly used to obtain large-scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, to design experiments and to analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results: We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intra-array variability is small (only around 2% of the mean log signal), while interarray variability from replicate array measurements has a standard deviation (SD) of around 0.5 log2 units (6% of the mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison with expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identify an underlying structure which reflects some of the key biological variables that define the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years by a variety of operators. Conclusions: This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform. The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and interarray variability, and a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further, more extensive studies of the systems biology of eukaryotic cells.
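The paper's pipeline is implemented as an R function; the toy numpy sketch below conveys the flavour of the core steps (log2 transformation, per-array normalisation, replicate-based SD estimates) with entirely synthetic data and an assumed gene-by-array layout:

```python
# Toy rendering of the pipeline's core steps (the paper's version is in R).
import numpy as np

rng = np.random.default_rng(1)
# Signal matrix: rows = genes, columns = replicate arrays (synthetic data).
signal = rng.lognormal(mean=6, sigma=1.5, size=(10_000, 4))

log_signal = np.log2(signal)

# Simple per-array normalisation: subtract each array's median so replicate
# arrays share a common centre (the paper's transformation steps are richer).
normalised = log_signal - np.median(log_signal, axis=0)

# Replicate-based variability estimates, analogous in spirit to the reported
# interarray SD of ~0.5 log2 units:
per_gene_sd = normalised.std(axis=1, ddof=1)
print("median interarray SD (log2 units):", np.median(per_gene_sd))
```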

Relevance:

30.00%

Publisher:

Abstract:

Derivational morphological processes allow us to create new words (e.g. the verb punish yields the noun punishment) from base forms. The number of steps from the basic unit to the derived word often varies (e.g., nationality is derived via nation > national > nationality), requiring the processing of multiple surface morphemes. Here we report the first study to investigate morphological processing where derivational steps are not overtly marked (e.g., bridge-N > bridge-V), i.e., zero-derivation (Aronoff, 1980). We compared the processing of one-step (soaking) and two-step (bridging) derived forms, both of which engaged regions associated with morphological processing, such as the left inferior frontal gyrus (LIFG). Critically, activation was also more pronounced for two-step than for one-step forms. Since both types of derived words have the same surface structure, our findings suggest that morphological processing is based on underlying morphological complexity, independent of overt affixation. This study is the first to provide evidence for the processing of zero derivation, and demonstrates that morphological processing cannot be reduced to surface form-based segmentation.

Relevance:

30.00%

Publisher:

Abstract:

A constraint satisfaction problem is a classical artificial intelligence paradigm characterized by a set of variables (each with an associated domain of possible values) and a set of constraints that specify relations among subsets of these variables. Solutions are assignments of values to all variables that satisfy all the constraints. Many real-world problems may be modelled by means of constraints. The range of problems that can use this representation is very diverse and embraces areas like resource allocation, scheduling, timetabling and vehicle routing. Constraint programming is a form of declarative programming in the sense that, instead of specifying a sequence of steps to execute, it relies on properties of the solutions to be found, which are explicitly defined by constraints. The idea of constraint programming is to solve problems by stating constraints which must be satisfied by the solutions, and it is based on specialized constraint solvers that take advantage of constraints to search for solutions. The success and popularity of complex problem-solving tools can be greatly enhanced by the availability of friendly user interfaces. User interfaces cover two fundamental areas: receiving information from the user and communicating it to the system; and getting information from the system and delivering it to the user. Despite their potential impact, adequate user interfaces are uncommon in constraint programming in general. The main goal of this project is to develop a graphical user interface that makes it possible to represent constraint satisfaction problems intuitively. The idea is to visually represent the variables of the problem, their domains and the problem constraints, and to enable the user to interact with an adequate constraint solver to process the constraints and compute the solutions. Moreover, the graphical interface should be capable of configuring the solver's parameters and presenting solutions in an appealing, interactive way. As a proof of concept, the developed application, GraphicalConstraints, focuses on continuous constraint programming, which deals with real-valued variables and numerical constraints (equations and inequalities). RealPaver, a state-of-the-art solver for continuous domains, was used in the application. The graphical interface supports all stages of constraint processing, from the design of the constraint network to the presentation of the final feasible-space solutions as 2D or 3D boxes.
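The continuous constraint solving that RealPaver performs can be illustrated with a minimal interval branch-and-prune loop: boxes whose interval enclosure excludes the constraint are pruned, sufficiently small boxes are reported as feasible 2D boxes, and the remainder are split. This is a generic sketch of the technique, not RealPaver's algorithm or code from GraphicalConstraints:

```python
# Minimal interval branch-and-prune for x^2 + y^2 = 1 on [-2, 2]^2.
# Generic illustration of continuous constraint solving; not RealPaver code.

def f_range(xlo, xhi, ylo, yhi):
    """Interval enclosure of x^2 + y^2 over a box."""
    def sq(lo, hi):
        cands = (lo * lo, hi * hi)
        return (0.0 if lo <= 0.0 <= hi else min(cands)), max(cands)
    (a, b), (c, d) = sq(xlo, xhi), sq(ylo, yhi)
    return a + c, b + d

def solve(box, eps=0.05):
    xlo, xhi, ylo, yhi = box
    lo, hi = f_range(xlo, xhi, ylo, yhi)
    if lo > 1.0 or hi < 1.0:
        return []                      # prune: box cannot satisfy x^2 + y^2 = 1
    if xhi - xlo < eps and yhi - ylo < eps:
        return [box]                   # small enough: report as a feasible box
    if xhi - xlo >= yhi - ylo:         # otherwise split the widest dimension
        m = (xlo + xhi) / 2
        return solve((xlo, m, ylo, yhi), eps) + solve((m, xhi, ylo, yhi), eps)
    m = (ylo + yhi) / 2
    return solve((xlo, xhi, ylo, m), eps) + solve((xlo, xhi, m, yhi), eps)

boxes = solve((-2.0, 2.0, -2.0, 2.0))
print(len(boxes), "boxes cover the unit circle; e.g.", boxes[0])
```

The resulting list of small boxes is exactly the kind of 2D feasible-space output that the abstract describes the interface rendering.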