959 results for GRAPHICS
Abstract:
The current technological trend is to find materials that combine low weight with satisfactory mechanical properties, which has made composite materials a very attractive research topic worldwide. Because of their heterogeneous nature, these materials show scatter in mechanical test results, especially under cyclic loading. It is therefore important to predict their fatigue strength by statistical analysis, since fatigue causes approximately 90% of failures in structural components. The present work investigated the fatigue behaviour of the Twill/Cycom 890 composite, a carbon-fibre-reinforced material with a polymeric resin matrix, manufactured by the RTM (Resin Transfer Molding) process. All samples were tested at different tensile stress levels, in triplicate, so that the results could be compared. The statistical analysis was conducted with the two-parameter Weibull distribution, and the fatigue life results for the composite were then evaluated. Weibull plots were used to determine the scale and shape parameters. The S-N curve for the Twill/Cycom composite was drawn and indicates the number of cycles at which the first damage occurs in this material. The probability of failure was related to material reliability, as shown in plots for the different tensile levels and fatigue lives. In addition, the laminate was evaluated by ultrasonic inspection, which showed uniform impregnation. Fractographic analysis by SEM revealed the failure mechanisms of polymeric composites associated with cyclic loading.
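As a worked illustration of the statistical step, the sketch below fits a two-parameter Weibull distribution to fatigue-life data at one stress level and evaluates the corresponding reliability; it is a minimal sketch in Python using SciPy, and the cycle counts are hypothetical placeholders, not data from the thesis.

import numpy as np
from scipy.stats import weibull_min

# Hypothetical cycles-to-failure for one tensile level (triplicate test)
cycles_to_failure = np.array([12500.0, 18300.0, 22100.0])

# floc=0 pins the location parameter at zero, giving the two-parameter form
shape, _, scale = weibull_min.fit(cycles_to_failure, floc=0)

# Reliability R(N) = exp[-(N/scale)**shape]; probability of failure = 1 - R(N)
N = np.linspace(1e3, 5e4, 200)
reliability = np.exp(-(N / scale) ** shape)
print(f"shape (beta) = {shape:.2f}, scale (eta) = {scale:.0f} cycles")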
Abstract:
In proton therapy, the energy deposited by secondary particles originating from nuclear inelastic processes (n, 2H, 3H, 3He and α) makes a contribution to the total dose that deserves discussion. In the calculations used for routine treatment planning, the delivered dose is computed assuming that the proton loses energy by ionization and/or Coulomb excitation; the contribution of inelastic processes associated with nuclear reactions is not considered. Estimates exist only for pure materials or materials of simple composition (water, for example), because of the difficulty of handling targets made up of different materials. In this project we use the Monte Carlo method, employing the code MCNPX v2.50 (Monte Carlo N-Particle eXtended), to present results on the contribution of secondary particles to the total dose. A cylindrical phantom composed of cortical bone was implemented for proton beams between 100 and 200 MeV. From the results obtained it was possible to generate plots analysing the dose deposition with and without nuclear interactions, the multiplicity and the percentage of deposited dose for each secondary particle, and the radial dispersion of neutrons in the material.
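Independently of MCNPX itself, the comparison described above can be sketched as below: given two depth-dose profiles, one computed with nuclear interactions and one without, estimate the secondary-particle contribution at each depth. The arrays are hypothetical stand-ins for tally output, not results from this work.

import numpy as np

depth_cm = np.linspace(0.0, 20.0, 5)                      # placeholder depths
dose_full = np.array([0.80, 0.90, 1.10, 2.50, 0.10])      # with nuclear interactions (made up)
dose_em_only = np.array([0.75, 0.86, 1.05, 2.45, 0.10])   # ionization/Coulomb only (made up)

# Fraction of the total dose attributable to nuclear (inelastic) processes
nuclear_fraction = 1.0 - dose_em_only / dose_full
for d, f in zip(depth_cm, nuclear_fraction):
    print(f"depth {d:5.1f} cm: nuclear contribution {100 * f:4.1f} % of total dose")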
Abstract:
Through my experience as a teaching assistant in a free drawing course, I seek to understand the intentions and needs that children and young people bring to drawing, so as to work with it in a way more consistent with the moment and the wishes of each student. For this I draw on Edith Derdyk's studies of children's drawing and on the observation, in my own teaching practice, of the relationships that children and young people establish with drawing. I also reflect on drawing in some of its particulars, especially from the viewpoint of certain artists and scholars of drawing practice. At the end of the study I propose suggestions for practices to be developed with children and young people: various ways of working with drawing through experimental forms of graphic construction, so that students can add these unique experiences and broaden their understanding of drawing and of its graphic possibilities. Inspired above all by Derdyk's considerations on experimental drawing, I conclude that through experimentation we can offer young people a different way of thinking about drawing and about the possibilities of conceiving it.
Abstract:
With the rapid growth of Web applications in various fields of knowledge, the term Web service has come into prominence: it refers to services of different origins and purposes, offered over local networks and, in some cases, also available on the Internet. Because this kind of architecture performs data processing on the server side, it is very attractive for running complex and slow applications and processes, which is the case for most visualization algorithms. VTK is a library intended for visualization and features a large variety of methods and algorithms for this purpose, but its graphics engine demands processing capacity. Combining these two resources can bring interesting results and contribute to performance improvements when using the VTK library. This is what is investigated in this project, through tests and analysis of the communication overhead.
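As an illustration of the server-side rendering such a Web service could wrap, the sketch below uses VTK's Python bindings to render a scene offscreen and write the frame to an image. The scene (a cone) and the output file name are placeholders; the abstract does not specify the service interface, so only the rendering step is shown.

import vtk

# Build a trivial pipeline: source -> mapper -> actor -> renderer
cone = vtk.vtkConeSource()
mapper = vtk.vtkPolyDataMapper()
mapper.SetInputConnection(cone.GetOutputPort())
actor = vtk.vtkActor()
actor.SetMapper(mapper)

renderer = vtk.vtkRenderer()
renderer.AddActor(actor)

window = vtk.vtkRenderWindow()
window.SetOffScreenRendering(1)    # render on the server, no display needed
window.AddRenderer(renderer)
window.SetSize(400, 300)
window.Render()

# Grab the frame buffer and save it; a Web service would return these bytes
to_image = vtk.vtkWindowToImageFilter()
to_image.SetInput(window)
to_image.Update()
writer = vtk.vtkPNGWriter()
writer.SetFileName("frame.png")
writer.SetInputConnection(to_image.GetOutputPort())
writer.Write()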
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The survey we conducted and present in this text is a preliminary study of the shape of asteroids, which allowed us to understand a little more about the dynamics around these bodies, given that the images we have of asteroids show them to be highly irregular. In this work the asteroid is modelled by the polyhedron method, which reproduces the irregular shape of the body with very good accuracy. Through the study of gravitational potential models for non-spherical bodies, the implementation of computational algorithms, and numerical simulations, a preliminary analysis was performed of the shapes of the asteroids 4179 Toutatis, 6489 Golevka, 2063 Bacchus, 1620 Geographos and 1998 ML14, as well as of their regions of stability and instability, and we computed the coefficients of the gravitational potential. The work can be extended not only to other asteroids but also to other non-spherical bodies, contributing to studies of the origin and evolution of the solar system, and perhaps of the origin of the Earth, and to new technologies for modelling and mapping non-spherical bodies. The main results were obtained by analysing plots of the shape and planification of the asteroids, which confirmed how irregular these bodies are and showed a non-homogeneous mass distribution. We observe the behaviour of the zero-velocity curves and equipotential curves, as well as their respective surfaces. We also compute some values of the gravitational potential and the spherical harmonic coefficients of each object. Furthermore, we find possible equilibrium points of the asteroids, except for 4179 Toutatis, and analyse their stability.
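For reference, the spherical harmonic coefficients mentioned above are the C_{nm} and S_{nm} of the usual expansion of the exterior gravitational potential of a non-spherical body; the form below is the standard convention, not necessarily the exact normalization adopted in this work.

U(r,\phi,\lambda) = \frac{GM}{r}\left[ 1 + \sum_{n=2}^{\infty}\sum_{m=0}^{n} \left(\frac{R}{r}\right)^{n} P_{nm}(\sin\phi)\,\bigl( C_{nm}\cos m\lambda + S_{nm}\sin m\lambda \bigr) \right]

Here R is a reference radius, P_{nm} are the associated Legendre functions, and (r, \phi, \lambda) are the body-fixed radius, latitude and longitude.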
Abstract:
A study was conducted to determine the optimal collection time for the urea breath test labelled with the stable isotope 13C. Patients were selected before undergoing endoscopy at the Endoscopy Section of the University Hospital of Botucatu - SP, and a screening was performed to determine which patients were willing and able to participate. Before the endoscopy, a baseline breath sample was collected from each patient, who then ingested the labelled urea. Breath samples were collected in duplicate every 2.5 minutes up to 30.0 minutes, and then every 5.0 minutes up to 45.0 minutes. The samples were analysed in an isotope-ratio mass spectrometer located at the Stable Isotopes Center, Institute of Biosciences, UNESP - Botucatu campus. The data were studied and arranged as plots for better interpretation of the results. Based on the results obtained, it was determined that a waiting time of 15.0 minutes before collecting the breath sample is sufficient for an accurate and effective diagnosis.
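A minimal sketch of the kind of analysis involved in choosing a collection time is given below: it computes the delta-over-baseline (DOB) at each collection time and reports the earliest time it exceeds a diagnostic cutoff. The enrichment values and the cutoff are illustrative assumptions, not data from this study.

# Hypothetical delta-13C readings (per mil) by collection time in minutes
baseline = 0.5
samples = {2.5: 0.9, 5.0: 1.8, 7.5: 2.9, 10.0: 3.8, 12.5: 4.6, 15.0: 5.2}
cutoff = 4.0   # assumed diagnostic DOB cutoff, for illustration only

for minute, delta in sorted(samples.items()):
    dob = delta - baseline            # delta over baseline
    if dob >= cutoff:
        print(f"earliest positive reading at {minute} min (DOB = {dob:.1f})")
        break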
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Graduate program in Molecular Biophysics - IBILCE
Abstract:
In situ megascale hydraulic diffusivities (D) of a confined loess aquifer were estimated at various scales (10 ≤ L ≤ 1500 m) with a finite-difference model, and laboratory microscale diffusivities of a loess sample were estimated by empirical formulas. A scatter plot reveals that D fits a single power function of L, provided that the microscale diffusivities are assigned to L = 1 m and that the differences in diffusivity observed between the micro- and megascales are attributed to medium heterogeneity, appraised through variations in the curvature and slope of natural hydraulic head waves propagating through the aquifer. Subsequently, a general power relationship between D and L is defined in which the base and exponent terms account for the aquifer storage capability under a confined flow regime, for the microscale hydraulic conductivity and specific yield of the loess, and for the changes in curvature and slope of the hydraulic head waves relative to the values defined at unit scale.
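The scale dependence described above is a power law of the form D = a · L^b; the sketch below shows how such a relationship could be fitted by linear regression in log-log space. The (L, D) pairs are placeholders, not values from the paper.

import numpy as np

L = np.array([1.0, 10.0, 100.0, 500.0, 1500.0])    # scale L (m), hypothetical
D = np.array([0.02, 0.30, 4.0, 25.0, 90.0])        # diffusivity D (m^2/s), hypothetical

# log D = log a + b log L  ->  straight line in log-log space
b, log_a = np.polyfit(np.log(L), np.log(D), 1)
a = np.exp(log_a)
print(f"D ≈ {a:.3g} * L^{b:.2f}")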
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Relevance feedback approaches have been established as an important tool for interactive search, enabling users to express their needs. However, given the growth of the available multimedia collections, the user effort required by these methods tends to increase as well, demanding approaches that reduce the need for user interaction. In this context, this paper proposes a semi-supervised learning algorithm for relevance feedback to be used in image retrieval tasks. The proposed semi-supervised algorithm uses supervised and unsupervised approaches simultaneously: while a supervised step is performed using the information collected from the user feedback, an unsupervised step exploits the intrinsic dataset structure, which is represented in terms of ranked lists of images. Several experiments were conducted for different image retrieval tasks involving shape, color, and texture descriptors and different datasets. The proposed approach was also evaluated on multimodal retrieval tasks, considering visual and textual descriptors. Experimental results demonstrate the effectiveness of the proposed approach.
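The interplay of the two steps can be sketched as below. This is an illustrative scheme only, not the specific algorithm of the paper: the supervised step applies the user's labels directly, while the unsupervised step propagates relevance through the ranked lists (nearest neighbours) of the labelled-relevant images.

import numpy as np

def feedback_round(dist_matrix, query, relevant, irrelevant, top_k=10):
    # One illustrative relevance-feedback round.
    # dist_matrix: (n, n) pairwise distances; query: query image index;
    # relevant / irrelevant: image indices labelled by the user this round.
    score = -dist_matrix[query].astype(float)            # higher = closer to the query

    # Supervised step: use the user's labels directly
    score[list(relevant)] += 2.0
    score[list(irrelevant)] -= 2.0

    # Unsupervised step: exploit the dataset structure through ranked lists --
    # images appearing high in the ranked list of a labelled-relevant image
    # receive a rank-weighted bonus even though the user never inspected them
    for r in relevant:
        neighbours = np.argsort(dist_matrix[r])[1:top_k + 1]
        score[neighbours] += np.arange(top_k, 0, -1) / top_k

    return np.argsort(-score)                            # new ranked list for the next round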
Abstract:
In the pattern recognition research field, Support Vector Machines (SVM) have been an effective tool for classification purposes, being successfully employed in many applications. The SVM input data are transformed into a high-dimensional space, using some kernel function, where linear separation is more likely. However, there are some computational drawbacks associated with SVM. One of them is the computational burden required to find the most adequate parameters of the kernel mapping for each non-linearly separable input data space, which affects the performance of the SVM. This paper introduces the Polynomial Powers of Sigmoid for SVM kernel mapping and shows its advantages over well-known kernel functions using real and synthetic datasets.
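The abstract does not give the Polynomial Powers of Sigmoid formula, so the sketch below only shows how a custom kernel of that general flavour (an integer power of a sigmoid response, an assumed illustrative form) can be plugged into scikit-learn's SVC as a callable.

import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_moons

def pps_like_kernel(X, Y, gamma=0.5, coef0=1.0, degree=2):
    # Assumed illustrative form: (tanh(gamma * <x, y> + coef0)) ** degree
    return np.tanh(gamma * X @ Y.T + coef0) ** degree

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
clf = SVC(kernel=pps_like_kernel).fit(X, y)   # SVC accepts a callable kernel
print("training accuracy:", clf.score(X, y))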