1000 results for Simulation in Statistics
Abstract:
This work presents multi-element geochemical results for stream sediments in the state of São Paulo, obtained through the institutional project of the Geological Survey of Brazil (Serviço Geológico do Brasil) entitled "Levantamento Geoquímico de Baixa Densidade no Brasil" (Low-Density Geochemical Survey of Brazil). Analytical data from 1422 stream-sediment samples, obtained by ICP-MS (Inductively Coupled Plasma Mass Spectrometry) for 32 chemical elements (Al, Ba, Be, Ca, Ce, Co, Cr, Cs, Cu, Fe, Ga, Hf, K, La, Mg, Mn, Mo, Nb, Ni, P, Pb, Rb, Sc, Sn, Sr, Th, Ti, U, V, Y, Zn and Zr), were processed and examined through univariate and multivariate statistical analysis. Univariate statistical treatment of the data yielded geochemical background values for the 32 elements across the entire state of São Paulo. Georeferenced analysis of the single-element geochemical distributions revealed the geological compartmentalization of the area. The two main geological provinces of the state of São Paulo, the Paraná Basin and the Crystalline Complex, stand out clearly in most of the geochemical distributions. Geological units of greater expression, such as the Serra Geral Formation and the Bauru Group, were also clearly highlighted. Other geochemical features indicated possibly contaminated areas and unmapped geological units. The application of multivariate statistical methods to the geochemical data with 24 variables (Al, Ba, Ce, Co, Cr, Cs, Cu, Fe, Ga, La, Mn, Nb, Ni, Pb, Rb, Sc, Sr, Th, Ti, U, V, Y, Zn and Zr) made it possible to define the main geochemical signatures and associations present throughout the state of São Paulo and to correlate them with the main lithological domains. Q-mode cluster analysis yielded eight groups of geochemically correlatable samples which, when georeferenced, reproduced the main geological compartments of the state: the Crystalline Complex, the Itararé and Passa Dois Groups, the Serra Geral Formation, and the Bauru and Caiuá Groups. Multigroup discriminant analysis statistically confirmed the classification of the groups formed by the cluster analysis and identified the main discriminant variables: Fe, Co, Sc, V and Cu. Principal component analysis, carried out together with factor analysis under varimax rotation, yielded the main multivariate factors and their respective elemental associations. Georeferencing the multivariate factor scores delimited the areas where the elemental associations occur and produced multivariate maps for the whole state. Finally, it is concluded that the statistical methods applied are indispensable for the treatment, presentation and interpretation of geochemical data. Furthermore, based on an integrated view of the results obtained, this work recommends: (1) carrying out low-density geochemical surveys throughout the country as a priority, since they are highly effective in defining regional backgrounds and delimiting geochemical provinces of metallogenetic and environmental interest; and (2) carrying out continuous geological mapping at an adequate scale (larger than 1:100,000) in areas that point to the possible existence of units not shown on current geological maps.
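As an illustrative aside, the multivariate workflow described above (Q-mode clustering, PCA, and varimax-rotated factor analysis) can be sketched in Python. This is a hedged reconstruction, not the author's code: it assumes a samples-by-elements concentration matrix, uses random stand-in data, and relies on scikit-learn and SciPy.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA, FactorAnalysis

    # Stand-in data; the study used 1422 samples x 24 elements.
    rng = np.random.default_rng(0)
    X = rng.lognormal(size=(200, 24))

    # Standardize log-concentrations, as is common for geochemical data.
    Z = StandardScaler().fit_transform(np.log(X))

    # Q-mode hierarchical clustering: group samples by geochemical similarity.
    tree = linkage(Z, method="ward")
    groups = fcluster(tree, t=8, criterion="maxclust")  # eight groups, as in the study

    # PCA plus varimax-rotated factor analysis for elemental associations.
    pca = PCA(n_components=5).fit(Z)
    fa = FactorAnalysis(n_components=5, rotation="varimax").fit(Z)
    scores = fa.transform(Z)  # factor scores, which the study maps geographically
    print(pca.explained_variance_ratio_.round(2), groups[:10], scores.shape)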
Abstract:
This project developed a computational method for determining the best intensifying screen-film combination for mammographic examinations through the study of their sensitometric characteristics. The software, developed in the Delphi environment for Windows, displays on the computer screen the image that would be obtained with each type of screen-film combination, using images of phantoms and of real breasts. Given the wide range of factors that influence the final mammographic image, such as magnification, the characteristics of the films and intensifying screens, and the condition of the film processor, the proposed method can provide a broad evaluation of the quality of mammographic imaging systems in a simple, fast and automatic way, through computational simulation procedures. The simulation investigated the influence that a given recording system exerts on image quality, making it possible to know in advance the final image to be obtained with different equipment and recording systems. Among the systems investigated, three films (Kodak Min R 2000, Fuji UM MA-HC and Fuji ADM) and two intensifying screens (Kodak Min R 2000 and Fuji AD Mammo Fine), the combination that presented the best results, with the best image quality and the lowest patient exposure, was the Kodak Min R 2000 screen with the Kodak Min R 2000 film.
Abstract:
The network-on-chip (NoC) paradigm emerged to allow a high degree of integration among the many cores of systems-on-chip (SoCs), whose communication is traditionally based on buses. NoCs are defined as a structure of switches and point-to-point channels that interconnect the intellectual property (IP) cores of a SoC, providing a communication platform among them. Wireless networks-on-chip (WiNoCs) are an evolutionary approach to the NoC concept that enables the adoption of NoC routing mechanisms together with wireless technologies, aiming to optimize traffic flows, reduce wiring and operate alongside traditional NoCs, reducing the load on buses. The use of dynamic routing within wireless networks-on-chip allows selective shutdown of parts of the hardware, which reduces energy consumption. However, choosing where to place a wireless link in a NoC is a complex task, given that the nodes are traffic bridges that cannot be switched off without potentially breaking a pre-established route. Besides providing an overview of NoC architectures and of the state of the art of the emerging WiNoC paradigm, this work also proposes an evaluation method based on the well-established ns-2 simulator, whose goal is to test hybrid NoC/WiNoC scenarios. With this approach it is possible to evaluate different WiNoC parameters associated with routing, application and the number of nodes involved in hierarchical networks. By analyzing such simulations it is also possible to investigate which routing strategy is most suitable for a given usage scenario, which is relevant when choosing the spatial arrangement of the nodes in a NoC. The experiments carried out comprise a study of the operating dynamics of ad hoc wireless routing protocols in a hierarchical WiNoC topology, followed by an analysis of network size and traffic patterns in the WiNoC.
Abstract:
e-Learning platforms are increasingly used in distance education, a fact directly related to their ability to let students attend courses from anywhere. Within the scope of e-Learning platforms there is an especially interesting group: adaptive platforms, which tend to replace the (face-to-face) teacher through interactivity, variability of content, automation, and the capacity to solve problems and simulate educational behaviors. The ADAPT project (an adaptive e-Learning platform) consists of the creation of one such platform, implementing intelligent tutoring, problem solving based on past experience, genetic algorithms and link mining. It is in the area of link mining that this dissertation arises, documenting the development of four distinct modules: the first module is a search engine for suggesting alternative content; the second module identifies changes in learning style; the third module is a data-analysis platform that implements several data-mining and statistical techniques to provide teachers/tutors with important information that would not be visible without such techniques; finally, the last module is a recommender system that suggests to students the most suitable articles based on the queries of students with similar profiles. This thesis documents the development of prototypes for each of these modules. The tests performed for each module show that the methodologies used are valid and viable.
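As a hedged illustration of the last module, a minimal user-based recommender can be built from cosine similarity between student access profiles. The matrix and function below are hypothetical sketches, not taken from ADAPT.

    import numpy as np

    # Hypothetical student-article matrix: rows are students, columns are
    # articles, entries count how often a student consulted an article.
    access = np.array([
        [3.0, 0.0, 1.0, 0.0],
        [2.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 2.0, 3.0],
    ])

    def recommend(student, k=2):
        # Cosine similarity between the target student and every student.
        norms = np.linalg.norm(access, axis=1) + 1e-12
        sims = (access @ access[student]) / (norms * norms[student])
        sims[student] = 0.0  # ignore self-similarity
        # Score articles by similar students' usage; hide already-seen items.
        scored = sims @ access
        scored[access[student] > 0] = -np.inf
        return np.argsort(scored)[::-1][:k]

    print(recommend(0))  # article indices suggested for student 0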
Abstract:
The distribution and mobilization of fluid in a porous medium depend on capillary, gravity and viscous forces. In oil fields, enhanced oil recovery processes alter the balance and relative importance of these forces to increase the oil recovery factor. In the case of the gas-assisted gravity drainage (GAGD) process, it is important to understand the physical mechanisms that mobilize oil through the interaction of these forces. For this reason, several authors have developed laboratory physical models and core floods of GAGD to study the role of these forces through dimensionless groups, and these models showed conclusive results. However, numerical simulation models have not been used for this type of study. Therefore, the objective of this work is to study the behavior of capillary, viscous and gravity forces in the GAGD process and their influence on the oil recovery factor through a 2D numerical simulation model. To analyze the interplay of these forces, dimensionless groups reported in the literature were used, namely the capillary number (Nc), the Bond number (Nb) and the gravity number (Ng), in order to determine the effectiveness of each force relative to the others. The results obtained from the numerical simulation were also compared with those reported in the literature. The results showed that before breakthrough, the lower the injection flow rate, the more oil recovery is increased by the capillary force, and after breakthrough, the higher the injection flow rate, the more oil recovery is increased by the gravity force. Good agreement was found between the results obtained in this research and those published in the literature. The simulation results indicated that before gas breakthrough, higher oil recoveries were obtained at lower Nc and Nb and, after gas breakthrough, higher oil recoveries were obtained at lower Ng. The numerical models are consistent with the results reported in the literature.
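For orientation, the three dimensionless groups named above have common textbook forms, sketched below in Python. Definitions vary between authors, so these expressions are stated as assumptions rather than the exact forms used in the thesis.

    import math

    def capillary_number(mu, v, sigma):
        """Nc = viscous/capillary: mu [Pa.s], v [m/s], sigma [N/m]."""
        return mu * v / sigma

    def bond_number(drho, g, L, sigma):
        """Nb = gravity/capillary: drho [kg/m3], L is a pore-scale length [m]."""
        return drho * g * L**2 / sigma

    def gravity_number(k, drho, g, mu, v):
        """Ng = gravity/viscous: k is permeability [m2]."""
        return k * drho * g / (mu * v)

    # Illustrative values: water-like fluid, slow displacement, 1 darcy rock.
    mu, v, sigma = 1e-3, 1e-5, 0.03
    drho, g, L, k = 200.0, 9.81, 1e-4, 1e-12
    print(capillary_number(mu, v, sigma), bond_number(drho, g, L, sigma),
          gravity_number(k, drho, g, mu, v))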
Abstract:
Primary processing on natural gas platforms such as the Mexilhão Field (PMXL-1) in the Santos Basin, where monoethylene glycol (MEG) is used to inhibit the formation of hydrates, presents operational problems caused by salt scale in the MEG recovery unit. A bibliographic search and analysis of salt-solubility data in mixed solvents, namely water and MEG, indicate that experimental reports are available for only a relatively restricted number of the ionic species present in produced water, such as NaCl and KCl. The aim of this study was to develop a method for calculating salt solubilities in mixed-solvent mixtures, specifically NaCl or KCl in aqueous mixtures of MEG. The calculation method extends the Pitzer model, using the Lorimer approach, to aqueous systems containing a salt and a second solvent (MEG). The Python language, in the Eclipse Integrated Development Environment (IDE), was used to create the computational applications. The results indicate the feasibility of the proposed calculation method for a systematic series of solubility data for salt (NaCl or KCl) in aqueous mixtures of MEG at various temperatures. Moreover, the tool developed in Python proved suitable for parameter estimation and simulation purposes.
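The parameter-estimation step mentioned at the end can be illustrated with SciPy. The model and data below are hypothetical stand-ins (the actual work fits Pitzer/Lorimer equations, which are considerably more involved); only the fitting mechanics are shown.

    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical NaCl solubility in water-MEG mixtures: mass fraction of MEG
    # in the salt-free solvent vs. solubility in g salt per 100 g solvent.
    x_meg = np.array([0.0, 0.2, 0.4, 0.6, 0.8])
    sol   = np.array([35.9, 30.1, 24.6, 19.3, 14.2])

    def mixed_solvent_model(x, a, b):
        # Simple exponential mixed-solvent correlation, an illustrative
        # stand-in for the Pitzer/Lorimer treatment used in the thesis.
        return a * np.exp(-b * x)

    params, cov = curve_fit(mixed_solvent_model, x_meg, sol, p0=(36.0, 1.0))
    print("fitted parameters:", params)
    print("predicted solubility at 50% MEG:", mixed_solvent_model(0.5, *params))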
Abstract:
This work analyzes the integration of thermal-energy simulation into the design process during its early stages, based on six practical cases. It aims to schematize the integration process, identifying the contributions of thermal-energy analyses at each design phase and the parameters with the highest impact on building performance. The simulations were run in DesignBuilder, an energy tool built on the validated EnergyPlus engine. This tool was chosen for its flexible, user-friendly graphical interface for modeling and output assessment, including parametric simulation to compare design alternatives. The six case studies comprise three architectural and three retrofit projects, with the author taking part in the simulations as a consultant or as a designer. The case studies were selected based on the designers' commitment to achieving performance goals and their willingness to share the process from the early pre-design analyses onward, allowing the whole process to be schematized and the design decisions to be supported with quantifications, including energy targets. Integrating thermal-energy performance analyses is feasible from the early stages onward, except when only a short time is available to run the simulations. The simulation contributions are most important during the sketch and detailing phases. The pre-design phase can be assisted by reliable bioclimatic guidelines. It was verified that every case study had two dominant design variables governing overall performance; these variables differ according to the building characteristics and always coincide with the local bioclimatic strategies. The earlier alternatives are introduced, the more readily they are incorporated into the design. Simulation proved very useful: to demonstrate options and convince the architects; to quantify cost-benefit ratios and payback periods for the retrofit designer; and to let the simulation consultant confirm the desired result and report the performance to the client.
Abstract:
The city of Natal has significant daylight availability, although its use is not systematically explored in school architecture. In this context, this research aims to establish procedures for analyzing daylight performance in school design in Natal-RN. The method of analysis comprises computing the Visible Sky Factor (VSF), running simulations, and analyzing the results. The annual variation of daylight behavior requires the adoption of dynamic simulation as the data procedure. The classrooms were modeled in SketchUp and simulated in Daysim, and the results were assessed in Microsoft Excel spreadsheets. The classroom dimensions are 7.20 m x 7.20 m, with window-to-wall ratios (WWR) of 20%, 40% and 50% and with different shading devices: standard horizontal overhang, sloped overhang, standard horizontal overhang with side view protection, standard horizontal overhang with a dropped edge, standard horizontal overhang with three horizontal louvers, double standard horizontal overhang, and double standard horizontal overhang with three horizontal louvers, plus the use of a light shelf in half of the models with WWR of 40% and 50%. The data were organized in spreadsheets with two UDI intervals: between 300 lux and 2000 lux, and between 300 lux and 3000 lux. The simulation was performed with the 2009 weather file for the city of Natal-RN. The graphical outputs are illuminance curves, isolines of UDI between 300 lux and 2000 lux, and tables with indices of glare occurrence and of UDI between 300 lux and 3000 lux. The best UDI 300-2000 lux performance was found in Phase 1 for models with WWR of 20%, and in Phase 2 for models with WWR of 40% and 50% with light shelf. The best UDI 300-3000 lux performance was found in Phase 1 for models with WWR of 20%, and of 40% with light shelf, and in Phase 2 for models with WWR of 40% and 50% with light shelf. The outputs show that daylight quality depends mainly on the efficacy of the shading system in avoiding glare, which determines daylight discomfort. The bioclimatic recommendation of large openings with partial shading (an opening receiving direct sunlight) resulted in illuminance levels higher than the acceptable upper threshold. Increasing the shading percentage (from 73% to 91%) in medium-size openings (WWR 40% and 50%) reduced or eliminated glare without compromising the daylight zone depth (7.20 m). The passive zone was determined for classrooms with satisfactory daylight performance, and a daylight-zone-depth rule of thumb was calculated as the ratio between daylight zone depth and window height for the different opening sizes. The ratio ranged from 1.54 to 2.57 for WWR of 20%, 40% and 50%, respectively. Glare in the passive area was reduced or eliminated with a light shelf or with awning-window shading.
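The UDI metric used above is simple to compute from an annual illuminance series such as Daysim produces. A minimal sketch, with synthetic data standing in for simulation output:

    import numpy as np

    # Hypothetical hourly illuminance series for one sensor point (lux).
    rng = np.random.default_rng(1)
    lux = rng.gamma(shape=2.0, scale=600.0, size=3650)

    def udi(lux, low=300.0, high=2000.0):
        """Fraction of occupied hours with illuminance inside [low, high]."""
        lux = np.asarray(lux)
        return np.mean((lux >= low) & (lux <= high))

    print("UDI 300-2000 lx:", udi(lux))
    print("UDI 300-3000 lx:", udi(lux, high=3000.0))
    print("hours above 3000 lx (glare proxy):", np.mean(lux > 3000.0))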
Abstract:
The hospital is a place of complex actions, where several activities serving the population are performed, such as medical appointments, exams, surgeries, emergency care, and admission to wards and ICUs. These activities are intertwined with the anxiety, impatience, despair and distress of patients and their families, issues involving the emotional balance both of the professionals who provide services and of the people under their care. The healthcare crisis in Brazil worsens every year and today constitutes a major problem for private hospitals. The number of patients arriving at emergency departments increases progressively and, in contrast, the supply of hospital beds does not grow in the same proportion, causing overcrowding, declines in the quality of care delivered to patients, a drain of healthcare professionals, and difficulty in managing the beds. This work presents a study that seeks to create an alternative tool to support bed management in a private hospital. It also seeks to identify potential issues or deficiencies and, accordingly, make changes to the patient flow to increase service capacity, thereby reducing costs without compromising the quality of the services provided. The tool used was discrete-event computational simulation, with the aim of identifying the main parameters to be considered for proper modeling of this system. The study took as reference the admissions unit of a private hospital, based on the current scenario, in which its rooms are at saturation level in terms of occupancy rate. The proposed reallocation of beds aims to meet the growing demand for surgeries and hospital admissions observed by the current administration.
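A minimal discrete-event sketch of bed occupancy, in the spirit of the study, can be written with the SimPy library. The arrival and length-of-stay parameters below are assumptions for illustration, not the hospital's data.

    import random
    import simpy

    BEDS, ARRIVAL_MEAN_H, STAY_MEAN_H = 20, 4.0, 72.0  # assumed parameters

    def patient(env, beds, waits):
        arrival = env.now
        with beds.request() as req:
            yield req                          # wait for a free bed
            waits.append(env.now - arrival)    # record waiting time
            yield env.timeout(random.expovariate(1.0 / STAY_MEAN_H))

    def arrivals(env, beds, waits):
        while True:
            yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN_H))
            env.process(patient(env, beds, waits))

    random.seed(42)
    env = simpy.Environment()
    beds = simpy.Resource(env, capacity=BEDS)
    waits = []
    env.process(arrivals(env, beds, waits))
    env.run(until=24 * 365)                    # simulate one year, in hours
    print(f"patients: {len(waits)}, mean wait: {sum(waits)/len(waits):.1f} h")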
Abstract:
The heavy fraction of crude oil can be used for numerous purposes, e.g. to obtain lubricating oils. In this context, many researchers have been studying alternatives for separating crude oil components, among which molecular distillation may be mentioned. Molecular distillation is a forced-evaporation technique that differs from the other conventional processes in the literature. It can be classified as a special case of distillation under high vacuum, with pressures that reach extremely low values, of the order of 0.1 Pa. The evaporation and condensation surfaces must be separated by a distance of the order of magnitude of the mean free path of the evaporated molecules; that is, evaporated molecules easily reach the condenser because they find a route without obstacles, which is desirable. Thus, the main contribution of this work is the simulation of falling-film molecular distillation for crude oil mixtures. The crude oil was characterized using UniSim® Design R430 and Aspen HYSYS® V8.5. The results of this characterization were used in Microsoft® Excel® spreadsheets to calculate the physicochemical (thermodynamic and transport) properties of the residue of an oil sample. Based on these estimated properties and on boundary conditions suggested by the literature, the equations for the temperature and concentration profiles were solved by the implicit finite-difference method using the Visual Basic® for Applications (VBA) language for Excel®. The resulting temperature profile was consistent with that reproduced from the literature, with a slight deviation in its initial values because the oil studied is lighter than that of the literature, while the concentration profiles were effective in showing that the concentration of the more volatile components decreases and that of the less volatile components increases along the length of the evaporator. In accordance with the transport phenomena present in the process, the velocity profile tends to increase to a peak and then decrease, and the film thickness decreases, both as functions of the evaporator length. It is concluded that the simulation code in Visual Basic® (VBA) is a final product of this work that can be applied to the molecular distillation of petroleum and other similar mixtures.
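The implicit finite-difference scheme mentioned above can be illustrated generically. The sketch below solves a 1D transient diffusion-type profile with a tridiagonal solve; it is a Python stand-in under simplifying assumptions, not the thesis's VBA code or its full film equations.

    import numpy as np
    from scipy.linalg import solve_banded

    # Backward-Euler (implicit) finite differences for a 1D diffusion profile.
    n, L, D, dt, steps = 50, 1.0, 1e-3, 0.1, 200   # assumed parameters
    dx = L / (n - 1)
    r = D * dt / dx**2

    T = np.full(n, 300.0)   # initial profile
    T[0] = 400.0            # hot boundary (e.g., heated evaporator wall)

    # Tridiagonal system: -r*T[i-1] + (1+2r)*T[i] - r*T[i+1] = T_old[i],
    # with identity rows enforcing fixed (Dirichlet) boundary values.
    ab = np.zeros((3, n))
    ab[0, 2:]   = -r                 # superdiagonal
    ab[1, 1:-1] = 1.0 + 2.0 * r      # main diagonal, interior nodes
    ab[1, 0] = ab[1, -1] = 1.0       # boundary rows: identity
    ab[2, :-2]  = -r                 # subdiagonal

    for _ in range(steps):
        T = solve_banded((1, 1), ab, T.copy())

    print(T[:5])                     # temperature near the hot wall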
Abstract:
Understanding the occurrence and flow of groundwater in the subsurface is of fundamental importance in the exploitation of water, as is knowledge of the whole associated hydrogeological context. Given the nature of sedimentary aquifers, these factors are primarily controlled by the geometry of the pore system. Thus, microstructural characterization, including the interconnectivity of the system, is essential to know the macroscopic properties of the reservoir rock, porosity and permeability, which can be statistically characterized by two-dimensional analysis. The latter is carried out on a computing platform, using images of thin sections of reservoir rock, allowing the prediction of effective porosity and hydraulic conductivity. For the Barreiras Aquifer, such parameters are derived primarily from the interpretation of aquifer tests, a practice that usually involves fairly complex logistics in terms of the equipment and personnel required, in addition to a high operating cost. Digital image analysis and processing is therefore presented as an alternative tool for the characterization of hydraulic parameters, proving to be a practical and inexpensive method. The methodology is based on a workflow involving sampling, preparation of thin sections and their respective images, segmentation and geometric characterization, three-dimensional reconstruction, and flow simulation. In this research, computational image analysis of rock thin sections showed aquifer storage coefficients ranging from 0.035 to 0.12, with an average of 0.076, while the hydrogeological substrate (associated with the top of the carbonate sequence that does not outcrop in the region) presents effective porosities of the order of 2%. For the transport regime, the methodology yields results below those found in the literature for hydraulic conductivity, with mean values of 1.04 x 10^-6 m/s and fluctuations between 2.94 x 10^-6 m/s and 3.61 x 10^-8 m/s, probably due to the larger scale of the study and the heterogeneity of the medium studied.
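The segmentation step of the workflow can be illustrated with scikit-image: threshold a grayscale thin-section image and take the pore-pixel fraction as a 2D porosity estimate. The image below is synthetic; real micrographs also need preprocessing that is omitted here.

    import numpy as np
    from skimage.filters import threshold_otsu

    # Synthetic bimodal "thin section": ~10% dark pore pixels in a bright matrix.
    rng = np.random.default_rng(2)
    pore_mask = rng.random((256, 256)) < 0.10
    img = np.where(pore_mask,
                   rng.normal(0.2, 0.05, (256, 256)),   # pore space (dark)
                   rng.normal(0.7, 0.05, (256, 256)))   # mineral matrix (bright)

    t = threshold_otsu(img)   # automatic global threshold
    pores = img < t           # dark pixels treated as pore space
    print(f"estimated 2D porosity: {pores.mean():.3f}")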
Abstract:
Intense precipitation events (IPE) have been causing great social and economic losses in the affected regions. In the Amazon, these events can have serious impacts, primarily on populations living on the margins of its countless rivers, because when water levels are elevated, floods and/or inundations are generally observed. Thus, the main objective of this research is to study IPE through Extreme Value Theory (EVT), to estimate return periods of these events, and to identify the regions of the Brazilian Amazon where IPE have the largest values. The study used daily rainfall data from the hydrometeorological network managed by the National Water Agency (Agência Nacional de Água) and from the Meteorological Data Bank for Education and Research (Banco de Dados Meteorológicos para Ensino e Pesquisa) of the National Institute of Meteorology (Instituto Nacional de Meteorologia), covering the period 1983-2012. First, homogeneous rainfall regions were determined through cluster analysis, using the hierarchical agglomerative Ward method. Then synthetic series representing the homogeneous regions were created. Next, EVT was applied to these series through the Generalized Extreme Value (GEV) distribution and the Generalized Pareto Distribution (GPD). The goodness of fit of these distributions was evaluated with the Kolmogorov-Smirnov test, which compares the empirical cumulative distributions with the theoretical ones. Finally, the composition technique was used to characterize the prevailing atmospheric patterns associated with the occurrence of IPE. The results suggest that the Brazilian Amazon has six homogeneous rainfall regions. More severe IPE are expected to occur in the south and on the Amazon coast. More intense rainfall events are expected during the rainy or transition seasons of each sub-region, with total daily precipitation of 146.1, 143.1 and 109.4 mm (GEV) and 201.6, 209.5 and 152.4 mm (GPD), at least once a year, in the south, on the coast and in the northwest of the Brazilian Amazon, respectively. For southern Amazonia, the composition analysis revealed that IPE are associated with the configuration and formation of the South Atlantic Convergence Zone. Along the coast, intense precipitation events are associated with mesoscale systems such as squall lines. In northwestern Amazonia, IPE are apparently associated with the Intertropical Convergence Zone and/or local convection.
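A minimal sketch of the GEV fitting, the Kolmogorov-Smirnov check, and return-level estimation, using SciPy with synthetic annual maxima in place of the gauge data:

    import numpy as np
    from scipy import stats

    # Hypothetical annual maximum daily rainfall (mm) for one homogeneous
    # region; the study used 1983-2012 station data.
    rng = np.random.default_rng(3)
    annual_max = stats.genextreme.rvs(c=-0.1, loc=90, scale=25,
                                      size=30, random_state=rng)

    # Fit the GEV and check goodness of fit with Kolmogorov-Smirnov.
    c, loc, scale = stats.genextreme.fit(annual_max)
    ks = stats.kstest(annual_max, "genextreme", args=(c, loc, scale))
    print("KS p-value:", round(ks.pvalue, 3))

    # Return level for a T-year return period: the 1 - 1/T quantile.
    for T in (2, 10, 50):
        level = stats.genextreme.ppf(1 - 1 / T, c, loc, scale)
        print(f"{T}-year return level: {level:.1f} mm")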
Abstract:
In the oil industry, oil and gas pipelines are commonly used to transport production fluids over long distances. Pipeline maintenance relies on the analysis of several tools, among which the most widely used are pipeline inspection cells, popularly known as PIGs. Among the variants on the market, the instrumented PIG has significant relevance: through its numerous on-board sensors, it can detect faults or potential failures along the inspected line. Despite its versatility, the instrumented PIG suffers from speed variations, which impair the readings of the sensors embedded in it. Considering that the PIG moves with the production fluid, one way to control its speed is to control the flow of the fluid through pressure control, either by reducing the produced flow rate, which reduces the overall production of fluid in the ducts, or by using a restrictive element (valve) installed on the PIG itself. The flow-rate/pressure-drop characteristic of restrictive elements such as orifice plates is usually deduced from the ideal energy equation (Bernoulli's equation), with the losses later corrected, normally through experimental tests. Thus, with the objective of controlling the fluid flow passing through the PIG, a solenoid-actuated valve shutter was developed. This configuration allows easy control and stabilization of the flow adjustment, with a corresponding response in the pressure drop between upstream and downstream of the restriction. A test bench was assembled for better determination of the flow coefficients, composed of a duct with an internal diameter of four inches, a set of shutters arranged on a plate, and pressure gauges for measuring the pressure drop during the tests. The line was pressurized and, based on the pressure drop, it was possible to draw a curve characterizing the flow coefficient of the control valve prototype and to simulate its operation in a mock-up, resulting in a PIG speed reduction of approximately 68%.
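For context, a standard way to express a valve's flow coefficient from bench data is the incompressible sizing relation Kv = Q * sqrt(SG / dP), with Q in m3/h and dP in bar. The sketch below uses hypothetical readings; the study characterized its own prototype curve.

    import math

    def flow_coefficient(q_m3h, dp_bar, sg=1.0):
        """Kv from flow rate Q [m3/h], pressure drop dP [bar], specific gravity SG."""
        return q_m3h * math.sqrt(sg / dp_bar)

    # Hypothetical bench readings: (flow rate, measured pressure drop).
    readings = [(10.0, 0.40), (15.0, 0.92), (20.0, 1.65)]
    for q, dp in readings:
        print(f"Q = {q:5.1f} m3/h, dP = {dp:.2f} bar -> Kv = {flow_coefficient(q, dp):.1f}")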