10 results for dynamic geometry software
at the Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
This work focuses on the creation and application of dynamic simulation software to study the hard metal (WC-Co) structure. The hardware platform used to exploit GPU processing capacity was a GeForce 9600 GT together with the PhysX engine, originally created to make games more realistic. The software simulates the three-dimensional carbide structure inside a cubic box in which the tungsten carbide (WC) grains are modeled as triangular prisms and truncated triangular prisms. The program proved effective in verification tests, ranging from the calculation of parameter measures to its capacity to increase the number of particles simulated dynamically. It was possible to investigate both the mean values and the distributions of the stereological parameters used to characterize the carbide structure through cutting planes. Based on the cutting planes of the analyzed structures, we investigated the linear intercepts, area intercepts, and section perimeters of the intercepted grains, as well as the binder phase of the structure, by calculating the mean value and distribution of the mean free path. Since the literature shows, almost consensually, that the distribution of linear intercepts is lognormal, this suggests that the grain size distribution is also lognormal. Thus, a routine was added to the program that made a more detailed investigation of this issue possible. We observed that, for certain values of the parameters that define the shape and size of the prismatic grain, it is possible to obtain linear intercept distributions that approach the lognormal shape. For a number of simulations, we observed that the distribution curves of the linear and area intercepts, as well as the section perimeters, are consistent with static computer simulation studies of these parameters.
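As an aside for readers who want to reproduce the lognormality check described above, the sketch below shows one way it could be done. It is a hypothetical illustration with synthetic data, not the thesis code; the sample is drawn from a lognormal only to exercise the test.

```python
# Hypothetical sketch: test whether simulated linear-intercept lengths
# measured on cutting planes approach a lognormal distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Placeholder for intercept lengths from the simulated WC structure;
# drawn here from a lognormal purely to illustrate the procedure.
intercepts = rng.lognormal(mean=0.0, sigma=0.4, size=5000)

# Fit a lognormal (location fixed at zero) and run a goodness-of-fit test.
shape, loc, scale = stats.lognorm.fit(intercepts, floc=0.0)
ks_stat, p_value = stats.kstest(intercepts, "lognorm", args=(shape, loc, scale))

print(f"mean intercept = {intercepts.mean():.3f}")
print(f"KS statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")
```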
Abstract:
Several authors have shown the need to understand the technological structuring process in contemporary firms. From this perspective, the software industry is a very important element because it provides products and services directly to organizations from many fields. The Brazilian software industry, in particular, has peculiarities that distinguish it from the industries of developed countries, which makes understanding it even more relevant. There is evidence that local firms adopt different strategies and structural configurations to enter a market naturally dominated by large multinational firms. This study therefore aims to understand not only the structural configurations assumed by domestic firms but also the dynamics and processes that lead to these different configurations. To do so, this PhD dissertation investigates the institutional environment, its entities, and the isomorphic movements through an exploratory, descriptive, and explanatory multiple-case study. Eight software development companies from Recife's information technology cluster were visited; a questionnaire was applied and an interview was conducted with one of each firm's key professionals. Although the study is predominantly qualitative, part of the data was analyzed through charts and graphs, providing an overview of the companies and their environment that proved very useful to the interpretation of the interviews. As a result, it was found that the companies are structured around business models that are hybrids of two ideal types of software development company: the software factory and the technology-based company. Regarding the development process, there is a balanced distribution between the traditional and agile development paradigms. Among the traditional methodologies, the Rational Unified Process (RUP) predominates; Scrum is the most used methodology among the organizations that follow the Agile Manifesto's principles. Regarding the structuring process, each institutional entity acts in a way that generates different isomorphic pressures. Emphasis was given to entities such as customers, research agencies, clusters, market-leading businesses, public universities, incubators, software industry associations, technology vendors, development tool suppliers, and the managers' schooling and background, because they relate closely to the software firms. In this relationship, a dual and bilateral influence was found. Finally, the structuring level of the organizational field was identified as low, which gives organizational actors room to act independently.
Abstract:
This work proposes a model to investigate the use of a cylindrical antenna in the thermal recovery of high-viscosity oil through electromagnetic radiation. The antenna has a simple geometry, of the adapted dipole type, and can be modelled using Maxwell's equations. Wavelet transforms are used as basis functions and applied in conjunction with the method of moments to obtain the current distribution on the antenna. The electric field, power, and temperature distributions are carefully calculated to analyze the antenna as an electromagnetic heating source. The energy performance is analyzed through thermo-fluid dynamic simulations at field scale, carried out by adapting the Steam, Thermal, and Advanced Processes Reservoir Simulator (STARS) by the Computer Modelling Group (CMG). The proposed model and the numerical results obtained are stable and show good agreement with results reported in the specialized literature.
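The wavelet/method-of-moments combination mentioned above can be illustrated with a generic sketch: assemble the moment-method impedance system Z I = V, transform it into an orthonormal Haar wavelet basis, threshold the small entries, and solve. The kernel, excitation, and sizes below are placeholders chosen for illustration only, not the antenna model used in this work.

```python
# Hypothetical sketch of a wavelet-accelerated method-of-moments solve.
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar transform matrix of size n x n (n a power of 2)."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])                    # scaling functions
    bottom = np.kron(np.eye(n // 2), [1.0, -1.0])   # wavelets
    return np.vstack([top, bottom]) / np.sqrt(2.0)

n = 64                                   # number of basis functions
x = np.linspace(0.0, 1.0, n)             # segment centres along the antenna
Z = 1.0 / (np.abs(x[:, None] - x[None, :]) + 1e-2)  # placeholder kernel
V = np.ones(n)                           # placeholder excitation

W = haar_matrix(n)
Zw = W @ Z @ W.T                         # impedance matrix in the wavelet domain
Vw = W @ V
Zw[np.abs(Zw) < 1e-3 * np.abs(Zw).max()] = 0.0      # discard small entries

Iw = np.linalg.solve(Zw, Vw)
I = W.T @ Iw                             # current-distribution coefficients
print(I[:5])
```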
Abstract:
Spacecraft move at high speeds and undergo abrupt changes in acceleration, so an onboard GPS receiver can calculate navigation solutions only if the Doppler effect is taken into account during the acquisition and tracking of the satellite signals. For a receiver subject to such dynamics to cope with the frequency shifts produced by this effect, it is imperative to adjust its acquisition bandwidth and to increase its tracking loop to a higher order. This work presents the changes made to the software of the GPS Orion, an open-architecture receiver produced by GEC Plessey Semiconductors (nowadays Zarlink), to make it able to generate navigation fixes for vehicles under high dynamics, especially Low Earth Orbit satellites. The GPS Architect development system, sold by the same company, supported the modifications. The work also presents the characteristics of GPS Monitor Aerospace, a computational tool developed to monitor, through graphics, the navigation fixes calculated by the GPS receiver. Although it was not possible to simulate the software modifications implemented in the receiver under high dynamics, the receiver was observed to work in stationary tests, which were also verified in the new interface. This work also presents the results of the GPS Receiver for Aerospace Applications experiment, obtained with the receiver's participation in a suborbital mission, Operation Maracati 2, in December 2010, using a digital second-order carrier tracking loop. Although an incident moments before the launch hindered the effective navigation of the receiver, the experiment was observed to work properly, acquiring new satellites and tracking them during the VSB-30 rocket flight.
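A back-of-the-envelope calculation helps explain why the acquisition bandwidth must be widened for such a receiver: the carrier Doppler shift scales with the line-of-sight velocity, and a LEO platform adds several kilometres per second to it. The velocities below are illustrative assumptions, not values taken from the mission.

```python
# Rough illustration of carrier Doppler shift on GPS L1 for different users.
C = 299_792_458.0        # speed of light, m/s
F_L1 = 1_575.42e6        # GPS L1 carrier frequency, Hz

def doppler_shift(los_velocity_m_s):
    """Carrier Doppler shift for a given line-of-sight velocity."""
    return F_L1 * los_velocity_m_s / C

# Stationary user: Doppler comes mostly from GPS satellite motion (~0.9 km/s LOS, assumed).
print(f"ground user : ~{doppler_shift(900):.0f} Hz")
# LEO receiver: the orbital speed (~7.5 km/s, assumed) adds tens of kHz.
print(f"LEO receiver: ~{doppler_shift(7500 + 900):.0f} Hz")
```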
Abstract:
Progressing cavity pumping systems have been increasingly employed in the petroleum industry because of their capacity to lift highly viscous oils or fluids with high concentrations of sand or other solid particles. A Progressing Cavity Pump (PCP) consists, basically, of a rotor (a metallic device similar to an eccentric screw) and a stator (a steel tube internally lined with a double helix, which may be rigid or deformable/elastomeric). In general, it is subjected to a combination of the well pressure and the pressure generated by the pumping process itself. In elastomeric PCPs, this combined load compresses the stator and creates, or enlarges, the clearance between rotor and stator, reducing the sealing between their cavities. This opening of the sealing region produces what is known as fluid slip, or slippage, reducing the efficiency of the PCP pumping system. This research therefore aims to develop a transient three-dimensional computational model that, based on single-lobe PCP kinematics, is able to simulate the fluid-structure interaction that occurs inside metallic and elastomeric PCPs. The main goal is to evaluate the dynamic behaviour of PCP efficiency based on detailed, instantaneous information about the velocity, pressure, and deformation fields in their interior. To reach these goals (development and use of the model), it was also necessary to develop a methodology for generating dynamic, mobile, and deformable computational meshes representing the fluid and structural regions of a PCP. This intermediate step proved to be the biggest challenge in building and running the computational model, owing to the complex kinematics and critical geometry of this type of pump (different helix angles between rotor and stator as well as large length-scale aspect ratios). The dynamic mesh generation and the simultaneous evaluation of the deformations suffered by the elastomer are carried out by subroutines written in Fortran 90 that interact dynamically with the ANSYS CFX fluid dynamics software. Since a linear elastic structural model is employed to evaluate the elastomer deformations, no CAE package for structural analysis is required; an initial proposal for dynamic simulation using hyperelastic models in the ANSYS software is nevertheless also presented. The results produced with the present methodology (mesh generation, flow simulation in metallic PCPs, and simulation of fluid-structure interaction in elastomeric PCPs) are validated through comparison with experimental results reported in the literature. It is expected that the development and application of such a computational model will provide a more detailed picture of the flow dynamics within metallic and elastomeric PCPs, so that better control systems can be implemented for artificial lift by PCP.
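As a rough orientation for the slippage mechanism described above, and not the fluid-structure model developed in the thesis, the slip through the rotor-stator clearance is often approximated as pressure-driven flow in a thin rectangular gap, Q_slip = w h^3 dP / (12 mu L). The sketch below evaluates this classical estimate with purely illustrative numbers.

```python
# Classical thin-gap (plane Poiseuille) estimate of PCP slippage; all
# numbers are illustrative assumptions, not data from this research.
def slip_flow(gap_height_m, gap_width_m, seal_length_m, delta_p_pa, viscosity_pa_s):
    """Volumetric slip flow through a thin rectangular clearance."""
    return (gap_width_m * gap_height_m**3 * delta_p_pa) / (12.0 * viscosity_pa_s * seal_length_m)

# Assumed: 0.1 mm gap, 10 mm wide seal line, 50 mm long, 5 bar differential, oil of 0.5 Pa.s.
q = slip_flow(1e-4, 1e-2, 5e-2, 5e5, 0.5)
print(f"slip flow ~ {q * 1000 * 3600:.2f} L/h")
```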
Abstract:
The maintenance and evolution of software systems has become a very critical task in recent years due to the diversity and high demand of features, devices, and users. Understanding and analyzing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite to avoid the deterioration of their quality during evolution. This thesis proposes an automated approach for analyzing the variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources (commits and issues) of performance variation in scenarios during the evolution of software systems. The approach defines four phases: (i) preparation, choosing the scenarios and preparing the target releases; (ii) dynamic analysis, determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis, processing and comparing the dynamic analysis results of different releases; and (iv) repository mining, identifying issues and commits associated with the detected performance variation. Empirical studies were performed to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains in order to automatically identify source code elements with performance variation and the changes that affected those elements during an evolution. This study analyzed three systems: (i) SIGAA, a web system for academic management; (ii) ArgoUML, a UML modeling tool; and (iii) Netty, a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In this study, 21 releases (seven of each system) were analyzed, totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket, and 9 for Jetty. In addition, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a performance regression model was developed to indicate properties of commits that are most likely to cause performance degradation. Overall, 997 commits were mined, of which 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the ROC (Receiver Operating Characteristic) curve of the regression model is 60%, which means that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
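To make phases (iii) and (iv) more concrete, the sketch below flags a scenario whose measured execution times differ significantly between two releases and then lists the commits that entered the repository between the corresponding tags. File names, tags, and the significance test are illustrative assumptions, not the framework's actual tooling.

```python
# Hypothetical sketch of variation analysis plus repository mining.
import json
import subprocess
from scipy import stats

def significant_variation(times_old, times_new, alpha=0.05):
    """Mann-Whitney U test on repeated execution-time measurements of a scenario."""
    _, p_value = stats.mannwhitneyu(times_old, times_new, alternative="two-sided")
    return p_value < alpha

def commits_between(repo_path, old_tag, new_tag):
    """Commits that entered the code base between two release tags."""
    out = subprocess.run(
        ["git", "-C", repo_path, "log", "--oneline", f"{old_tag}..{new_tag}"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.splitlines()

# Placeholder measurement files, one list of execution times per release.
times_old = json.load(open("scenario_times_release_A.json"))
times_new = json.load(open("scenario_times_release_B.json"))

if significant_variation(times_old, times_new):
    for commit in commits_between(".", "vA", "vB"):
        print(commit)
```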