212 results for Code optimization
Abstract:
The aim of this work was the development of a computer code for the simulation and analysis of atomic spectra from databases built from the literature. Four routines were created that can be useful for spectroscopic studies of the atomic processes involved in laser isotope separation. In the first routine, Possible Transitions, the program checks the electron transitions allowed from a given energy level of an atom in the database, considering the selection rules for an electric dipole transition. The second routine, Transition Locator, finds the possible electronic transitions within a user-specified spectral region. The Spectra Simulator routine creates simulated spectra through the graphical application gnuplot using Lorentzian curves, and finally the Electronic Temperature routine determines the electronic excitation temperature of the atom through the Boltzmann plot method. To test the reliability of the program, experimental emission spectra were obtained from a hollow cathode discharge of dysprosium with argon as the buffer gas. The discharge was operated at different currents and inert-gas pressures. The spectra obtained were processed with the routines developed (Transition Locator and Spectra Simulator), and the electronic excitation temperatures of the dysprosium atoms under the different discharge conditions were calculated with the Electronic Temperature routine. The results showed that the electronic excitation temperature of the neutral dysprosium atoms in the hollow cathode discharge increases both with the current applied to the cathode and with the buffer-gas pressure. The coefficients of determination, R², obtained by the Electronic Temperature routine from the linear fit of the Boltzmann plot method were greater... (Complete abstract: click electronic access below)
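The Boltzmann plot step described above can be sketched as follows. This is a minimal illustration, not the thesis code; the line data passed in (intensity I, wavelength λ, statistical weight g, transition probability A, upper-level energy E) are assumed to come from the user's database:

```python
import numpy as np

def excitation_temperature(E_upper_eV, intensity, wavelength_nm, g, A):
    """Boltzmann plot: ln(I*lam/(g*A)) is linear in the upper-level
    energy E with slope -1/(k_B*T); fit a line and invert the slope."""
    E = np.asarray(E_upper_eV, dtype=float)
    k_B = 8.617333262e-5  # Boltzmann constant in eV/K
    y = np.log(np.asarray(intensity, float) * np.asarray(wavelength_nm, float)
               / (np.asarray(g, float) * np.asarray(A, float)))
    slope, intercept = np.polyfit(E, y, 1)
    T = -1.0 / (k_B * slope)
    # coefficient of determination R^2 of the linear fit
    residuals = y - (slope * E + intercept)
    r2 = 1.0 - np.sum(residuals**2) / np.sum((y - y.mean())**2)
    return T, r2
```

The R² returned here plays the same role as the determination coefficient the abstract reports for the linear fit.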
Abstract:
This dissertation has as its main theme the use of mathematical models for process optimization. The current scenario of fierce competition for the consumer market demands improvements that raise the performance of the process as a whole, whether by reducing costs or by increasing efficiency or effectiveness. The use of methodologies to support this process is therefore becoming increasingly attractive. Methodologies developed in the past are being revisited and improved. One example is the desirability approach, the object of the present study, which was developed in the 1980s and has been refined over time. To understand and evaluate this methodology, the desirability function was applied to three cases taken from scientific papers, all based on Design of Experiments (DOE), using the Solver tool (Excel®) and the desirability function (Minitab®). Thus, in addition to studying the methodology, it was possible to compare the performance of the optimization tools in different situations. From the results of this study, it was possible to validate the superiority of one of the models studied in a fair comparison.
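The desirability methodology studied above scores each response on a 0-to-1 scale and combines the scores through a geometric mean (the Derringer-Suich formulation). A minimal sketch; the bounds, target, and exponents below are hypothetical:

```python
import numpy as np

def desirability_target(y, low, target, high, s=1.0, t=1.0):
    """Two-sided (target-is-best) individual desirability: 0 outside
    [low, high], 1 at the target, power-law ramps in between."""
    if y <= low or y >= high:
        return 0.0
    if y <= target:
        return ((y - low) / (target - low)) ** s
    return ((high - y) / (high - target)) ** t

def overall_desirability(ds):
    """Composite desirability: geometric mean of the individual values."""
    ds = np.asarray(ds, dtype=float)
    if np.any(ds == 0.0):
        return 0.0  # any fully undesirable response vetoes the solution
    return float(np.exp(np.mean(np.log(ds))))
```

An optimizer (Solver, Minitab's response optimizer, etc.) then searches the factor space for the settings that maximize the composite value.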
Abstract:
National and international standards require that radiation levels be kept below permitted limits. Accordingly, the ICRP [1] (International Commission on Radiological Protection) requires optimization methods to ensure that the public is exposed to the lowest achievable radiation levels. As an optimization approach, theoretical and semi-empirical approximations can determine the X-ray spectrum, which is fundamental for energy diagnostics, for estimating radiation doses to patients, and for designing shielding models. Suitable radioprotection methods have been developed in medical physics, in areas such as nuclear medicine, radiotherapy, and diagnostic radiology. One of the semi-empirical methods in use is the TBC model, which is able to reproduce and calculate the spectra generated by a tungsten anode. With the modified TBC model it is also possible to meet the requirements of the protective barriers used in radiology, taking into account the arbitrary voltage waveform and the additional filtration in the spectrum generation, neither of which is present in the original model. In addition, the generated spectrum is calibrated so that the TBC model represents the quantity and behavior of typical radiation. A revision of the TBC model is therefore carried out by implementing it in Matlab and comparing it with the results obtained with the MCNP-5 code using the Monte Carlo method. The results are quite satisfactory, in both quantitative and qualitative terms of the beams. For the calibration, the spectra generated by the modified TBC model are analyzed in Mathcad and in Matlab under the same conditions. The generated spectra show the same behavior, differing by up to 12% in the values found for half-value layers, homogeneity coefficient, and effective energy.
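Beam-quality quantities such as the half-value layer compared above can be obtained from a computed spectrum by bisection on the transmitted fraction. This is a generic sketch assuming simple fluence weighting and user-supplied attenuation coefficients, not the TBC/Matlab implementation:

```python
import numpy as np

def hvl_mm(fluence, mu_per_mm):
    """First half-value layer (mm): the absorber thickness that halves
    the fluence-weighted transmission of a polychromatic beam."""
    phi = np.asarray(fluence, dtype=float)
    mu = np.asarray(mu_per_mm, dtype=float)   # linear attenuation per mm

    def transmission(x_mm):
        return float(np.sum(phi * np.exp(-mu * x_mm)) / np.sum(phi))

    lo, hi = 0.0, 1000.0  # bisection bracket in mm
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if transmission(mid) > 0.5:
            lo = mid  # more than half still transmitted: thicker absorber
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

The homogeneity coefficient the abstract mentions is the ratio of the first to the second half-value layer, which can be computed the same way with the 0.5 threshold replaced by 0.25 for the combined thickness.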
Abstract:
This work approaches, in a simplified manner, the analysis of an aircraft's trajectory through the three main flight phases, climb, cruise, and descent, in terms of fuel consumption and elapsed time. From this analysis a tool is developed that aims to optimize the flight-planning procedure, providing an altitude that either favors fuel saving during the trip or minimizes the trip time. The choice of altitude is the operator's decision, made to meet the operational needs of each trip, with the results provided by the tool serving as a first approximation of the flight profile that also brings out the economic aspects of each possible decision. Since the aeronautical market has singular problems, such as flight-altitude optimization, highly customized solutions are needed, and these often cannot cover every restriction of each operator and its kind of operation. In the case of executive aircraft, operators usually do not have the engineering and logistics support, compared with large airline companies, to analyze all the exceptions of each singular operation, often creating waste that could be avoided with the tool described in this work.
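The altitude recommendation described above amounts to comparing climb + cruise + descent totals across candidate altitudes. A minimal sketch; the profile data structure here is a hypothetical stand-in for the tool's performance tables, not its actual interface:

```python
def best_cruise_altitude(profiles, minimize="fuel"):
    """Pick the altitude whose summed phase cost is lowest.

    `profiles` maps altitude (ft) -> {phase: {"fuel_kg": ..., "time_min": ...}}
    for the three phases climb, cruise, and descent.
    """
    key = "fuel_kg" if minimize == "fuel" else "time_min"

    def total(alt):
        return sum(phase[key] for phase in profiles[alt].values())

    return min(profiles, key=total)
```

Swapping the objective between fuel and time mirrors the two optimization goals the abstract gives the operator.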
Abstract:
Meeting consumer needs is currently one of the most important goals: consumers are looking for quality products at more affordable prices, so companies must keep improving their products and services. In this scenario, superalloys are increasingly gaining market share because of their excellent properties, especially their ability to operate at elevated temperatures, which can provide greater efficiency for engines that must work at high temperatures. As a counterpart to this advantage, superalloys have low machinability, so analysis of the machining process is important to make them more widely applicable. This work aims at optimizing the cylindrical turning process of the Nimonic 80A superalloy in order to improve product quality, using the Taguchi method with an L16 orthogonal array. The cutting length was defined as the response variable, and six factors that could influence its variation were analyzed: cutting speed, feed, depth of cut, insert type, lubrication, and material hardness. The results showed that feed, insert type, and lubrication are significant and influence the process: for an optimized process the feed should be set at 0.12 mm/rev, the insert used should be the CP250, and lubrication should be abundant. The analysis of the results also shows the efficiency and reliability of the method, which produced consistent results.
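The Taguchi analysis above ranks factor levels by their mean signal-to-noise (S/N) ratio. A minimal sketch assuming a larger-the-better response; the data below are illustrative, not the study's L16 measurements:

```python
import numpy as np

def sn_larger_the_better(y):
    """Taguchi S/N ratio for a larger-the-better response:
    -10*log10(mean(1/y^2)); higher S/N means a more robust setting."""
    y = np.asarray(y, dtype=float)
    return float(-10.0 * np.log10(np.mean(1.0 / y**2)))

def factor_effects(levels, sn):
    """Mean S/N per level of one factor across the orthogonal-array runs;
    the level with the highest mean S/N is the preferred setting."""
    levels = np.asarray(levels)
    sn = np.asarray(sn, dtype=float)
    return {lv: float(np.mean(sn[levels == lv])) for lv in np.unique(levels)}
```

Factors whose level means differ widely (here judged by ANOVA in the study) are the significant ones, which is how feed, insert type, and lubrication were singled out.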
Abstract:
Suitable computational tools make it possible to build applications that link information to its physical location and represent it in visual, interactive schemes, effectively harnessing the power of visual communication. This allows the user to synthesize information in a simple and efficient way. Such applications fall under the definition of a Geographic Information System (GIS). A GIS comprises many concepts and tools whose main purpose is collecting, storing, viewing, and processing spatial data to obtain the information needed for decision making. Within this context, this paper presents the conception and implementation of a control system for urban forestry through the integration of free and open-source software. The conception arose from the needs of an environmental project developed by the Agriculture's House of the city of Regente Feijó, whose main objectives are the cataloguing and management of the municipality's urban afforestation. Given this diversity of concepts, the challenge in building the system is integrating the platforms involved in all stages: collection and storage of data, including maps and other spatial information; operations on the stored information; and obtaining and graphically visualizing the results. After implementation, it was possible to improve the system users' capacity for perception in information analysis and to facilitate the decision-making process.
Abstract:
This paper applies Modern Portfolio Theory (MPT) using measures of central position to model the returns of an investment fund of a renowned financial institution and compares the results with the mean-CVaR model. Measuring risk and return is increasingly important for investors who wish to minimize losses and thus maximize potential gains, taking sudden scenario changes into account. We present the concepts of the investment funds used as data and investigate measures of central position to determine which measure is most suitable. MatLab is used to assemble the efficient frontier of assets under the proposed central-position-CVaR measure. After obtaining the frontier, it was possible to compare it with the mean-CVaR model, whose effectiveness is already established. Finally, we analyze the viability of the proposed model in portfolio management.
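The CVaR used above can be estimated by historical simulation, and a frontier traced by sweeping portfolio weights. A minimal two-asset sketch, not the paper's MatLab implementation; the return series are assumed to be historical observations:

```python
import numpy as np

def cvar(losses, alpha=0.95):
    """Historical CVaR (expected shortfall): mean loss at or beyond the
    alpha-quantile (the VaR) of the empirical loss distribution."""
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, alpha)
    return float(losses[losses >= var].mean())

def two_asset_frontier(r1, r2, n=101):
    """(weight, expected return, CVaR) triples for portfolios
    w*r1 + (1-w)*r2 over a grid of weights w in [0, 1]."""
    r1 = np.asarray(r1, dtype=float)
    r2 = np.asarray(r2, dtype=float)
    pts = []
    for w in np.linspace(0.0, 1.0, n):
        port = w * r1 + (1.0 - w) * r2
        pts.append((float(w), float(port.mean()), cvar(-port)))  # loss = -return
    return pts
```

Keeping, for each return level, only the weight with the lowest CVaR yields the efficient frontier compared in the paper.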
Abstract:
Inflammation is basically a tissue-protective response whose function is to defend the organism against microorganisms and toxins that cause cell injury. The inflammatory response produces the four cardinal signs, pain, edema, heat, and redness, and may subsequently lead to loss of function of the tissue or organ. Non-steroidal anti-inflammatory drugs (NSAIDs) act by altering the activity of prostaglandins (PGs) through inhibition of the constitutive (COX-1) and induced (COX-2) cyclooxygenase isoforms. Non-selective inhibition of these enzymes (by classical NSAIDs) produces serious adverse reactions, such as gastrointestinal reactions, in predisposed patients and/or those who use them for prolonged periods. Second-generation NSAIDs (selective COX-2 inhibitors) have also proved toxic and may cause cardiovascular alterations. Accordingly, the design of new compounds with anti-inflammatory activity and devoid of toxicity remains a challenge for pharmaceutical and medicinal chemistry. Two derivatives of ibuprofen (a classical NSAID) were previously obtained at Lapdesf by Castro (2008) and Vizioli (2006), showing anti-inflammatory activity in models of acute (paw edema) and chronic (ulcerative colitis) inflammation without gastroulceration (VIZIOLI, 2009). In view of these results, the present study aimed at preparing the compounds and carrying out preclinical single-dose (acute) and repeated-dose toxicity studies in mice, together with behavioral analyses, histopathology of the organs, and biochemical assays. The behavioral, biochemical, and histopathological findings indicate that no toxicity was observed in the models studied, except for the group receiving a single intraperitoneal dose of 2000 mg/kg of the Lapdesf ibu-tau compound.
Abstract:
The main objective of this work was to optimize a method for the determination of fatty acid methyl esters, namely methyl octanoate, methyl palmitate, methyl stearate, methyl oleate, methyl linoleate, and methyl linolenate, in blood plasma samples from mice. The method proved very suitable for the analysis, yielding the following linear coefficients: 0.9992, 0.9989, 0.9996, 0.9995, and 0.9999 for methyl linoleate, methyl oleate, methyl palmitate, methyl stearate, and methyl octanoate, respectively. Esterification of the samples gave clean chromatograms without interfering peaks. The results obtained were as expected from the diet of the mice.
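Linear coefficients like those reported above come from least-squares calibration curves relating standard concentration to detector response. A minimal sketch with hypothetical concentration/peak-area pairs:

```python
import numpy as np

def calibration_fit(conc, area):
    """Fit the calibration line area = a*conc + b by least squares and
    return (slope, intercept, R^2), where R^2 measures linearity."""
    conc = np.asarray(conc, dtype=float)
    area = np.asarray(area, dtype=float)
    a, b = np.polyfit(conc, area, 1)
    predicted = a * conc + b
    ss_res = np.sum((area - predicted) ** 2)
    ss_tot = np.sum((area - area.mean()) ** 2)
    return float(a), float(b), float(1.0 - ss_res / ss_tot)
```

Unknown samples are then quantified by inverting the fitted line, conc = (area - b) / a.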
Abstract:
X-ray fluorescence analysis (XRF) is an important technique for the qualitative and quantitative determination of the chemical components of a sample. It is based on measuring the intensity of the characteristic radiation emitted by the elements of the sample after proper excitation. One modality of this technique is total-reflection X-ray fluorescence (TXRF). In TXRF, the refraction angle of the incident beam tends to zero and the refracted beam is tangent to the sample-support interface; thus there is a minimum angle of incidence below which there is no refracted beam and all the incident radiation undergoes total reflection. Since the technique is applied to very small samples in a thin-film format, self-absorption effects should not be very relevant. In this study, we evaluated the feasibility of using the MCNPX (Monte Carlo N-Particle eXtended) code to simulate a measurement carried out with the TXRF technique. The quality of the response of a TXRF spectroscopy system using synchrotron radiation as the excitation beam was verified for a simple setup by retrieving the characteristic energies and the concentrations of the elements in the sample. The data-processing steps after obtaining the excitation spectra were the same as in a real experiment, including the determination of the sensitivity curve for the simulated system. The difference between the theoretical and simulated Kα characteristic energies for the different elements was lower than 1%. The concentrations obtained for the elements of the sample had relatively high errors (between 6 and 60%), mainly because of a lack of knowledge of some realistic physical parameters of the sample, such as density. Nonetheless, this result does not preclude the use of the MCNPX code for this type of application.
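TXRF quantification with a sensitivity curve commonly normalizes each element's net peak counts by its sensitivity relative to an internal standard of known concentration. Whether this exact scheme matches the study's processing is not stated, so the sketch below is a generic illustration with hypothetical values:

```python
def txrf_concentration(counts, sensitivity, counts_ref, sens_ref, conc_ref):
    """Relative TXRF quantification against an internal standard:

        C_i = (N_i / S_i) / (N_ref / S_ref) * C_ref

    where N is the net peak count and S the elemental sensitivity
    taken from the system's sensitivity curve."""
    return (counts / sensitivity) / (counts_ref / sens_ref) * conc_ref
```

Errors in the assumed sensitivities or sample parameters (such as the density mentioned above) propagate directly into the recovered concentrations.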
Abstract:
Impaction of the upper canines is considered the second most frequent type of impaction and is associated with important esthetic and functional limitations. Among the treatment strategies described in the literature, the most commonly used are extraction of the primary canines and surgical exposure followed by orthodontic traction, which requires an adequate interdisciplinary approach. The aim of this case report is to draw the clinician's attention to the possibility of adapting the segmented arch technique to manage a case of canine impaction.