956 results for Mathematical Techniques--Error Analysis
Abstract:
An integrated mathematical model for the kinetics of multicomponent adsorption on microporous carbon was developed. Transport in this bidisperse solid is represented by balance equations in the macropore and micropore phases, in which gas-phase diffusion dominates the mass transfer in the macropores, with the phenomenological diffusivities represented by the generalized Maxwell-Stefan (GMS) formulation. Viscous flow also contributes to the macropore fluxes and is included in the MS expressions. Diffusion of the adsorbed phase controls the mass transfer in the micropore phase, which is also described in a similar way by the MS method. The adsorption isotherms are represented by a new heterogeneous modified vacancy solution theory formulation of adsorption, which has proved to be a robust method for adsorption on activated carbons. The model is applied to the coadsorption and codesorption of C2H6 and C3H8 on Ajax and Norit carbons, as well as to displacement on Ajax carbon. The effect of viscous flow in the macropore phase is not significant for the cases studied. The model accurately predicts the overshoot behavior and roll-up of C2H6 during coadsorption. The prediction for the heavier compound C3H8 is always satisfactory, though at higher C3H8 mole fractions the overshoot extent of C2H6 is overpredicted, possibly due to the neglect of heat effects.
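For orientation, the generalized Maxwell-Stefan relations referred to above take the following standard textbook form for an n-component mixture (a generic statement of the formulation; the paper's extensions for viscous flow and the adsorbed phase are not reproduced here):

```latex
% Generalized Maxwell-Stefan relations: the chemical-potential gradient
% driving species i is balanced by friction against every other species j.
-\frac{x_i}{RT}\,\nabla\mu_i
   = \sum_{\substack{j=1\\ j\neq i}}^{n}
     \frac{x_j\,N_i - x_i\,N_j}{c_t\,D_{ij}},
   \qquad i = 1,\dots,n
```

Here the x_i are mole fractions, the N_i molar fluxes, c_t the total concentration, and the D_{ij} the Maxwell-Stefan diffusivities.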
Abstract:
This study analyzed regional inequality in Espírito Santo through the research strand known as New Economic Geography (NEG). One way to conduct this analysis is to study the relationship between wage differentials and market potential. More precisely, the study sought to verify the impact of second-nature geographic factors – market potential – on average municipal wages. Initially, an Exploratory Spatial Data Analysis showed that wages are higher near regions with high market potential (the coast/RMGV). Using spatial statistics and spatial econometrics techniques, it was possible to observe, for the years 2000 and 2010, the existence of a spatial wage structure in Espírito Santo. The autoregressive error coefficient was positive and statistically significant, indicating the SEM (spatial error model) as the most appropriate model for the spatial effects. The results also indicate that not only educational factors affect wages; second-nature geographic factors have an even larger effect than the former. It is concluded, as the core NEG model demonstrates, that market forces alone do not always lead to an equilibrium that equalizes incomes; on the contrary, they lead to the formation of a center-periphery structure with persistent income differences between regions. Additionally, the municipalities with the highest wages, the highest market potential, and the best social indicators are those located on the state's coast, more precisely the municipalities near the RMGV. The study therefore reinforces the need to devise strategies that foster the creation of new centralities in Espírito Santo in order to reduce regional inequalities. The work joins a group of several other studies that have analyzed inequality and productive concentration in Espírito Santo. Its contribution lies in the use of the NEG theoretical framework, which had not yet been applied to the state, and in the use of spatial statistics and spatial econometrics techniques.
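For context, the spatial error model selected above has the standard specification below (a textbook form; the study's actual covariates and weights matrix are not reproduced here):

```latex
% Spatial error model (SEM): spatial dependence enters through the error
% term via the spatial weights matrix W; \lambda is the autoregressive
% error coefficient reported as positive and significant.
y = X\beta + u, \qquad
u = \lambda W u + \varepsilon, \qquad
\varepsilon \sim N(0, \sigma^2 I)
```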
Abstract:
Whey is a by-product of cheese manufacturing, obtained either by acidification or by an enzymatic process. Under suitable conditions, the milk casein aggregates into a gel that, once cut, induces the separation and release of the whey. It is used in many forms throughout the food industry and is rich in lactose, mineral salts, and proteins. Dehydration is one of the main processes used for whey processing and transformation. In view of this, the objective of this work was to evaluate the influence of the drying methods – freeze-drying, foam-mat drying (at temperatures of 40, 50, 60, 70, and 80 °C), and spray-drying (at temperatures of 55, 60, 65, 70, and 75 °C) – on the moisture, protein, color, and solubility of the whey, as well as to study its drying process. The whey was obtained and dehydrated after concentration by reverse osmosis, testing 11 treatments in 3 replicates under a completely randomized design. The results showed that the mathematical model with the best fit was the Page model, with an adjusted coefficient of determination above 0.98 and a regression standard error below 0.04 at all temperatures for the foam-mat method. For the freeze-drying method, the respective values were 0.9975 and 0.01612. From this, a generalized mathematical model could be built, with a coefficient of determination of 0.9888. For foam-mat drying, it was observed that as the drying air temperature increases, the drying time decreases and the effective diffusion coefficient increases. However, the reduction in drying time between temperature intervals decreases as the temperature rises. The activation energy for diffusion in the whey drying process was 26.650 kJ/mol, and for all physicochemical and technological evaluations the analysis of variance yielded a significant F value (p < 0.05), indicating that at least one contrast between treatment means is significant.
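For reference, the Page thin-layer drying model fitted above, and the Arrhenius-type relation commonly used to extract the activation energy from the effective diffusivities, take the standard forms (textbook statements, not reproduced from the thesis itself):

```latex
% Page model: MR is the dimensionless moisture ratio, t the drying time,
% and k, n the fitted drying constants.
MR = \frac{M_t - M_e}{M_0 - M_e} = \exp\!\left(-k\,t^{\,n}\right)

% Arrhenius-type dependence of the effective diffusivity on the absolute
% drying-air temperature T, from which the activation energy E_a
% (26.650 kJ/mol here) is obtained.
D_{\mathrm{eff}} = D_0 \exp\!\left(-\frac{E_a}{R\,T}\right)
```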
Abstract:
Within the development of motor vehicles, crash safety (e.g. occupant protection, pedestrian protection, low-speed damageability) is one of the most important attributes. To meet increasing requirements under shorter cycle times and rising pressure to reduce costs, car manufacturers keep intensifying their use of virtual development tools such as those in the domain of Computer Aided Engineering (CAE). For crash simulations, the explicit finite element method (FEM) is applied. The accuracy of the simulation process is highly dependent on the accuracy of the simulation model, including the midplane mesh. One of the roughest approximations typically made concerns the actual part thickness, which in reality can vary locally; for reasons of complexity, however, a single constant thickness value is almost always defined for the entire part. For precise fracture analysis within FEM, on the other hand, correct treatment of the thickness is a key enabler. Thus, availability of per-element thickness information, which does not exist explicitly in the FEM model, can significantly contribute to an improved crash simulation quality, especially regarding fracture prediction. Even though the thickness is not explicitly available from the FEM model, it can be inferred from the original CAD geometric model through geometric calculations. This paper proposes and compares two thickness estimation algorithms based on ray tracing and nearest neighbour 3D range searches. A systematic quantitative analysis of the accuracy of both algorithms is presented, as well as a thorough identification of particular geometric arrangements under which their accuracy can be compared. These results enable the identification of each technique's weaknesses and point towards a new, integrated approach to the problem that linearly combines the estimates produced by each algorithm.
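As a rough illustration of the nearest-neighbour variant (a minimal sketch under assumed inputs, not the paper's implementation): given midplane element centroids and a point sampling of the CAD outer surfaces, the local thickness can be approximated as twice the distance from each centroid to the nearest surface sample.

```python
from scipy.spatial import cKDTree

def estimate_thickness(midplane_centroids, surface_points):
    """Approximate per-element thickness as twice the distance from each
    midplane element centroid to the nearest sampled point on the CAD
    outer surface. A simplification of the 3D range-search approach
    described above; both arguments are (N, 3) arrays of coordinates."""
    tree = cKDTree(surface_points)             # spatial index over the surface samples
    dist, _ = tree.query(midplane_centroids)   # nearest-neighbour distance per centroid
    return 2.0 * dist                          # the midplane sits halfway through the wall
```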
Abstract:
Graphical user interfaces (GUIs) are critical components of today's software. Given their increased relevance, the correctness and usability of GUIs are becoming essential. This paper describes the latest results in the development of our tool to reverse engineer the GUI layer of interactive computing systems. We use static analysis techniques to generate models of the user interface behaviour from source code. Models help in graphical user interface inspection by allowing designers to concentrate on its more important aspects. One particular type of model that the tool is able to generate is the state machine. The paper shows how graph theory can be useful when applied to these models. A number of metrics and algorithms are used in the analysis of aspects of the user interface's quality. The ultimate goal of the tool is to enable the analysis of interactive systems through inspection of their GUIs' source code.
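A minimal sketch of the kind of graph-theoretic analysis described above, assuming a GUI state machine represented as a directed graph (the networkx library and the example states are our stand-ins, not the tool's actual representation):

```python
import networkx as nx

# A hypothetical GUI state machine: nodes are interface states,
# edges are user actions/events that move between states.
gui = nx.DiGraph()
gui.add_edges_from([
    ("Login", "Main"), ("Main", "Settings"), ("Settings", "Main"),
    ("Main", "Help"), ("Help", "Main"), ("Main", "Logout"),
])

# Simple structural metrics of the kind used to assess a GUI model.
n_states = gui.number_of_nodes()
n_transitions = gui.number_of_edges()
# Cyclomatic complexity E - N + 2P (P = number of weakly connected parts).
p = nx.number_weakly_connected_components(gui)
complexity = n_transitions - n_states + 2 * p
# States with no outgoing transitions are potential dead ends.
dead_ends = [s for s in gui.nodes if gui.out_degree(s) == 0]

print(n_states, n_transitions, complexity, dead_ends)
```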
Abstract:
Current software development often relies on non-trivial coordination logic for combining autonomous services, possibly running on different platforms. As a rule, however, such a coordination layer is strongly woven into the application at the source code level. Therefore, its precise identification becomes a major methodological (and technical) problem and a challenge to any program understanding or refactoring process. The approach introduced in this paper resorts to slicing techniques to extract coordination data from source code. Such data are captured in a specific dependency graph structure from which a coordination model can be recovered, either in the form of an Orc specification or as a collection of code fragments corresponding to typical coordination patterns identified in the system. Tool support is also discussed.
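As a toy illustration of slicing over a dependency graph (a sketch of the general idea only; the paper's dependency-graph structure and its Orc extraction are not reproduced): a backward slice from a coordination-relevant node is simply the set of nodes that can reach it.

```python
import networkx as nx

# Hypothetical program dependency graph: nodes are statements, and an
# edge a -> b means statement b depends on statement a.
pdg = nx.DiGraph()
pdg.add_edges_from([
    ("read_config", "open_channel"),
    ("open_channel", "send_request"),
    ("send_request", "await_reply"),
    ("log_banner", "print_version"),   # unrelated to coordination
])

def backward_slice(graph, criterion):
    """All statements the slicing criterion transitively depends on,
    plus the criterion itself."""
    return nx.ancestors(graph, criterion) | {criterion}

# Slice on the statement that performs the service interaction:
print(backward_slice(pdg, "await_reply"))
# -> {'read_config', 'open_channel', 'send_request', 'await_reply'} (set order varies)
```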
Abstract:
What sort of component coordination strategies emerge in a software integration process? How can such strategies be discovered and further analysed? How close are they to the coordination component of the envisaged architectural model that was supposed to guide the integration process? This paper introduces a framework in which such questions can be discussed and illustrates its use by describing part of a real case study. The approach is based on a methodology that enables the semi-automatic discovery of coordination patterns from source code, combining generalized slicing techniques and graph manipulation.
Abstract:
A color model representation allows any defined color spectrum of visible light, i.e. light with a wavelength between 400 nm and 700 nm, to be characterized in a quantitative manner. To accomplish this, each model, or color space, is associated with a function that maps the spectral power distribution of the visible electromagnetic radiation into a space defined by a set of discrete values that quantify the color components composing the model. Some color spaces are sensitive to changes in lighting conditions; others preserve certain chromatic features, remaining immune to these changes. It therefore becomes necessary to identify the strengths and weaknesses of each model in order to justify the adoption of particular color spaces in image processing and analysis techniques. This chapter addresses the topic of digital imaging and its main standards and formats. We then establish a mathematical model of the image acquisition sensor's response, which enables the various color spaces to be assessed with the aim of determining their invariance to illumination changes.
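A small sketch of the kind of invariance discussed above (a toy check using Python's standard colorsys module; uniform intensity scaling is only a crude stand-in for a real illumination change):

```python
import colorsys

# Uniformly scaling the intensity of an RGB color (a crude model of a
# change in illumination) leaves the hue component of HSV unchanged.
base = (0.8, 0.4, 0.2)                 # hypothetical RGB color in [0, 1]
dimmed = tuple(0.5 * c for c in base)  # same surface under weaker light

h1, s1, v1 = colorsys.rgb_to_hsv(*base)
h2, s2, v2 = colorsys.rgb_to_hsv(*dimmed)

print(h1 == h2)   # True: hue survives the intensity change
print(v1, v2)     # value does not: it tracks the illumination
```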
Abstract:
A hierarchical matrix is an efficient data-sparse representation of a matrix, especially useful for large-dimensional problems. It consists of low-rank subblocks, leading to low memory requirements as well as inexpensive computational costs. In this work, we discuss the use of the hierarchical matrix technique in the numerical solution of a large-scale eigenvalue problem arising from a finite-rank discretization of an integral operator. The operator is of convolution type; it is defined through the first exponential-integral function and is hence weakly singular. We develop analytical expressions for the approximate degenerate kernels and deduce upper bounds on the error of these approximations. Some computational results illustrating the efficiency and robustness of the approach are presented.
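For orientation, a degenerate-kernel approximation of the kind mentioned above replaces the kernel by a finite separable expansion (a generic form; the paper's specific expansions and error bounds are not reproduced here):

```latex
% Degenerate-kernel idea: the weakly singular convolution kernel, defined
% through the first exponential-integral function E_1, is approximated by
% a finite separable (finite-rank) expansion, which turns the eigenvalue
% problem for the integral operator into a finite-dimensional one.
k(x,y) \;\approx\; k_r(x,y) = \sum_{i=1}^{r} a_i(x)\, b_i(y)
```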
Abstract:
Graphical user interfaces (GUIs) are critical components of today's open source software. Given their increased relevance, the correctness and usability of GUIs are becoming essential. This paper describes the latest results in the development of our tool to reverse engineer the GUI layer of interactive computing open source systems. We use static analysis techniques to generate models of the user interface behavior from source code. Models help in graphical user interface inspection by allowing designers to concentrate on its more important aspects. One particular type of model that the tool is able to generate is the state machine. The paper shows how graph theory can be useful when applied to these models. A number of metrics and algorithms are used in the analysis of aspects of the user interface's quality. The ultimate goal of the tool is to enable the analysis of interactive systems through inspection of their GUIs' source code.
Abstract:
Background: Regulating mechanisms of branching morphogenesis in fetal rat lung explants have been an essential tool for molecular research. This work presents a new methodology to accurately quantify the epithelium, the outer contour, and the peripheral airway buds of lung explants during cellular development from microscopic images. Methods: The outer contour was defined using an adaptive, multi-scale threshold algorithm whose level was automatically calculated based on an entropy maximization criterion. The inner lung epithelium was defined by a clustering procedure that groups small image regions according to the minimum description length principle and local statistical properties. Finally, the number of peripheral buds was counted as the number of branch ends in a skeletonized image of the inner lung epithelium. Results: The time for lung branching morphometric analysis was reduced by 98% compared with the manual method. The best results were obtained in the first two days of cellular development, with smaller standard deviations. No significant differences were found between the automatic and manual results on any culture day. Conclusions: The proposed method introduces a series of advantages related to its intuitive use and accuracy, making the technique suitable for images with different lighting characteristics and allowing a reliable comparison between different researchers.
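The final bud-counting step can be illustrated with a small sketch (a generic recipe using skimage/scipy as stand-ins; the paper's segmentation pipeline that would produce the binary epithelium mask is not shown):

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def count_skeleton_endpoints(mask):
    """Count branch ends of a skeletonized binary shape: after thinning,
    an endpoint is a skeleton pixel with exactly one skeleton neighbour."""
    skeleton = skeletonize(mask.astype(bool))
    # For each pixel, count its 8-connected skeleton neighbours.
    kernel = np.array([[1, 1, 1],
                       [1, 0, 1],
                       [1, 1, 1]])
    neighbours = convolve(skeleton.astype(int), kernel, mode="constant")
    endpoints = skeleton & (neighbours == 1)
    return int(endpoints.sum())
```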