854 results for "Algoritmos computacionais" (computational algorithms)
Abstract:
Brazil's vast territory hinders the installation and maintenance of instruments for measuring solar radiation, making it necessary to develop and apply models capable of producing reliable estimates for the many activities that depend on such data. In most cases, these estimates are obtained from the Ångström equation. Based on this model, this project aimed to estimate global solar radiation at Presidente Prudente-SP, Brazil, using daily data from 1999 to 2007. The solar radiation data were extracted from the paper tapes of a bimetallic actinograph (Robitsch) recorded daily at the meteorological station of the Faculty of Science and Technology, UNESP. These tapes were scanned, yielding digital images with (x, y) coordinate pairs (x = time; y = solar radiation, cal/(min·cm²)). The daily global solar radiation is the area under the curve in each image, computed by computational algorithms. Once the values needed for the Ångström equation had been acquired, the constants a and b were determined by linear regression between Rg/R0 (global solar radiation over solar radiation on a horizontal surface at the top of the atmosphere), as ordinate, and n/N (number of sunshine hours over day length in hours), as abscissa. The slope of the fitted line gives the constant b, and the intercept gives the constant a. The estimated values were compared to the observed ones using the Kolmogorov-Smirnov test, which showed that the model can be accepted. The resulting equation for estimating global solar radiation is: Rg = R0 (0.2662 + 0.3592 n/N).
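As an illustration of the regression step described above, a minimal Python sketch (variable names are hypothetical; the study's tape digitization and area-under-curve code is not shown):

```python
import numpy as np

def fit_angstrom(rg, r0, n_sun, day_len):
    """Fit the Angstrom equation Rg = R0 (a + b n/N) by linear regression
    of Rg/R0 (ordinate) on n/N (abscissa), as in the abstract."""
    x = np.asarray(n_sun) / np.asarray(day_len)   # n/N: sunshine ratio
    y = np.asarray(rg) / np.asarray(r0)           # Rg/R0: clearness index
    b, a = np.polyfit(x, y, 1)                    # slope -> b, intercept -> a
    return a, b

def estimate_rg(r0, n_sun, day_len, a=0.2662, b=0.3592):
    """Estimate global solar radiation with the constants reported for
    Presidente Prudente-SP."""
    return r0 * (a + b * n_sun / day_len)
```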
Abstract:
The development of computational algorithms for obtaining particle size distributions in dispersions from real-time, in-line spectroscopic sensor data will enable a variety of applications, such as monitoring the properties of industrial cutting fluids, following polymerization processes, treating effluents, and atmospheric sensing. The present study aims to implement and compare techniques for solving inversion problems, developing algorithms that provide the particle size distribution in dispersions from UV-Vis-NIR (ultraviolet, visible, and near-infrared) spectroscopy data. Four techniques were implemented, one of them an alternative method with no inversion step. The methods that relied on an inversion technique showed how difficult it is to obtain good-quality droplet size distributions (DSD), while the alternative method proved to be the most efficient and reliable. This study is part of a cooperative program between the University of São Paulo and the University of Bremen called BRAGECRIM (Brazilian German Cooperative Research Initiative in Manufacturing) and is funded by FAPESP, CAPES, FINEP, and CNPq (Brazil) and DFG (Germany).
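The abstract does not name the four techniques; as one common example of an inversion step for this kind of problem, here is a Tikhonov-regularized least-squares sketch, assuming a kernel matrix A whose columns are the spectra of monodisperse size classes (e.g., from a scattering model such as Mie theory):

```python
import numpy as np

def tikhonov_psd(A, s, lam=1e-3):
    """Recover a discretized particle size distribution f from a measured
    extinction spectrum s, given a kernel matrix A (spectra of the size
    classes). Solves min ||A f - s||^2 + lam ||f||^2 via the normal
    equations, then clips negatives, since a distribution must be >= 0."""
    n = A.shape[1]
    f = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ s)
    return np.clip(f, 0.0, None)
```

The regularization weight lam trades fidelity to the spectrum against smoothness of the recovered distribution; ill-conditioning of A is exactly what makes such inversions fragile, as the abstract observes.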
Abstract:
The objective of this research was to investigate the monthly, seasonal, annual, and interdecadal climatology of reference evapotranspiration (ETo) in the state of Acre, in order to better understand its spatial and temporal variability and identify possible trends in the region. The study was conducted with data from the municipalities of Rio Branco (the state capital), Tarauacá, and Cruzeiro do Sul over a 30-year period (1985-2014), using monthly data from surface weather stations of the National Institute of Meteorology. First, the meteorological data were checked for consistency, and gaps in the time series were filled by means of multivariate techniques. Statistical tests for trend (Mann-Kendall) and homogeneity were then performed; the magnitude of the trend was estimated with Sen's estimator, and computational algorithms containing parametric and non-parametric two-sample tests were used to identify the year from which the trend became significant. Finally, analysis of variance (ANOVA) was applied to verify whether there were significant differences in mean annual evapotranspiration between locations. The indirect Penman-Monteith method, as parameterized by FAO, was used to calculate ETo. Descriptive statistics showed mean annual ETo of 3.80, 2.92, and 2.86 mm day⁻¹ for Rio Branco, Tarauacá, and Cruzeiro do Sul, respectively, with a marked seasonal pattern, minimum in June and maximum in October; Rio Branco showed the strongest signal (largest amplitude), while Cruzeiro do Sul showed the highest variability among the studied locations. ANOVA indicated that the annual means differ statistically at the 1% significance level between locations, except between Cruzeiro do Sul and Tarauacá, whose annual means showed no statistically significant difference. For all three locations, the 2000s showed the highest ETo values, associated with warmer waters of the North Atlantic basin, and the 1980s the lowest, associated with cooler waters of that basin. The Mann-Kendall test and Sen's estimator revealed an increasing trend in seasonal reference evapotranspiration (autumn, winter, and spring) of about 0.11 mm per decade, which became statistically significant from 1990, 1996, and 2001 onward for Cruzeiro do Sul, Tarauacá, and Rio Branco, respectively. Trend analysis of the meteorological parameters showed a positive trend, at the 5% significance level, for mean temperature, minimum temperature, and solar radiation.
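A minimal sketch of the two trend tools named above, the Mann-Kendall test and Sen's slope estimator, for a yearly series (tie corrections omitted for brevity):

```python
import numpy as np
from scipy import stats

def mann_kendall_sen(y):
    """Mann-Kendall trend test and Sen's slope for a 1-D series y
    (e.g., annual ETo values). Returns the normalized statistic Z, its
    two-sided p-value, and Sen's slope in units of y per time step."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    # Mann-Kendall S statistic: sign of every pairwise difference
    s = sum(np.sign(y[j] - y[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0       # variance of S, no ties
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - stats.norm.cdf(abs(z)))
    # Sen's slope: median of all pairwise slopes
    slopes = [(y[j] - y[i]) / (j - i) for i in range(n - 1) for j in range(i + 1, n)]
    return z, p, np.median(slopes)
```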
Abstract:
Fluorescent proteins are an essential tool in many fields of biology, since they allow us to watch the development of structures and dynamic processes of cells in living tissue with the aid of fluorescence microscopy. Optogenetics is another technique currently in wide use in neuroscience. In general, it makes it possible to activate or deactivate neurons by shining light of specific wavelengths onto cells carrying light-sensitive ion channels, and it can be combined with fluorescent proteins. This dissertation has two main objectives. First, we study the interaction of light with mouse brain tissue as applied to optogenetic experiments. In this step, we model absorption and scattering effects using mouse brain tissue characteristics and Kubelka-Munk theory, for specific wavelengths, as a function of light penetration depth within the tissue. Furthermore, we model temperature variations using the finite element method to solve Pennes' bioheat equation, with the aid of the COMSOL Multiphysics Modeling Software 4.4, simulating light-stimulation protocols typically used in optogenetics. Second, we develop computational algorithms to reduce the exposure of neurons to the light radiation needed to visualize their emitted fluorescence. At this stage, we describe image processing techniques developed for fluorescence microscopy that reduce the exposure of brain samples to the continuous light responsible for fluorochrome excitation. The techniques track, in real time, a region of interest (ROI) and replace the fluorescence emitted by the cells with a virtual mask, produced by overlaying the tracked ROI on previously stored fluorescence information, preserving cell location independently of the exposure time to fluorescent light. In summary, this dissertation investigates and describes the effects of light radiation on brain tissue in the context of optogenetics, in addition to providing a computational tool for fluorescence microscopy experiments that reduces image bleaching and photodamage caused by intense exposure of fluorescent cells to light.
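The Kubelka-Munk two-flux model mentioned above has a closed-form transmittance as a function of depth; a minimal sketch follows, where the scattering and absorption coefficients are placeholder values, not the dissertation's fitted ones:

```python
import numpy as np

def kubelka_munk_transmittance(z, S, K):
    """Fraction of incident light transmitted to depth z in tissue under
    the Kubelka-Munk two-flux model.

    z : depth within the tissue (mm)
    S : scattering coefficient per unit depth (1/mm)
    K : absorption coefficient per unit depth (1/mm)
    """
    a = 1.0 + K / S
    b = np.sqrt(a**2 - 1.0)
    return b / (a * np.sinh(b * S * z) + b * np.cosh(b * S * z))

# Example: relative intensity profile over the first millimeter of tissue
# (placeholder coefficients for illustration only)
depths = np.linspace(0.0, 1.0, 11)
profile = kubelka_munk_transmittance(depths, S=11.2, K=0.07)
```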
Abstract:
The purpose of this work is to demonstrate and assess a simple algorithm for automatically estimating the most salient region in an image, with possible applications in computer vision. The algorithm exploits the connection between color dissimilarities in an image and the image's most salient region, and it avoids using image priors. Pixel dissimilarity is an informal function of the distance between a specific pixel's color and the colors of other pixels in the image. We examine the relation between pixel color dissimilarity and salient region detection on the MSRA1K image dataset and propose a simple algorithm for salient region detection through random pixel color dissimilarity. We define dissimilarity by accumulating the distance between each pixel and a sample of n other random pixels, in the CIELAB color space. An important result is that random dissimilarity between each pixel and just one other pixel (n = 1) is enough to create adequate saliency maps when combined with a median filter, with average performance competitive with other related methods in the saliency detection literature. The assessment was performed by means of precision-recall curves. The idea is inspired by the human attention mechanism, which is able to select a few specific regions to focus on, a biological system that the computer vision community aims to emulate. We also review some of the history of selective attention.
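A minimal sketch of the described method for the n = 1 case, assuming scikit-image and SciPy are available (function and parameter names are illustrative, not the author's):

```python
import numpy as np
from scipy.ndimage import median_filter
from skimage import color, io

def random_dissimilarity_saliency(path, n=1, size=5, seed=0):
    """Saliency map from random pixel color dissimilarity: each pixel
    accumulates its CIELAB distance to n randomly chosen pixels, and a
    median filter smooths the resulting map."""
    rng = np.random.default_rng(seed)
    lab = color.rgb2lab(io.imread(path)[..., :3])
    h, w, _ = lab.shape
    flat = lab.reshape(-1, 3)
    sal = np.zeros(h * w)
    for _ in range(n):
        idx = rng.integers(0, h * w, size=h * w)   # one random partner per pixel
        sal += np.linalg.norm(flat - flat[idx], axis=1)
    sal = median_filter(sal.reshape(h, w), size=size)
    return (sal - sal.min()) / (sal.max() - sal.min() + 1e-12)  # to [0, 1]
```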
Abstract:
Histology, the study of tissues, is one of the key areas of biology that has allowed huge scientific advances. Since histological analysis is a demanding, meticulous, and time-consuming task, it is important to take advantage of existing computational tools and algorithms to assist it, making the process faster and enabling the discovery of information that may not be visible at first. The main goal of this dissertation is to ascertain whether an animal was subjected to the ingestion of a xenobiotic. With this in mind, image processing and segmentation techniques were applied to images of kidney tissue from healthy rats and from rats that had ingested the xenobiotic. From these images, several features of the renal corpuscle were extracted which, after being analyzed by various classification algorithms, showed that it is possible to determine, with a low degree of uncertainty, whether or not the animal ingested the xenobiotic.
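The abstract does not name the classification algorithms used; as a hypothetical illustration of the final step, a cross-validated random forest over the extracted corpuscle features (file names below are placeholders):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# X: one row of morphological features per renal corpuscle (e.g., area,
# perimeter, eccentricity, extracted after segmentation); y: 1 if the
# animal ingested the xenobiotic, 0 otherwise. Both files are hypothetical.
X = np.load("corpuscle_features.npy")
y = np.load("labels.npy")

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```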
Abstract:
Geotechnical engineering is one of the major areas of civil engineering; it studies the interaction between man-made constructions, or natural phenomena, and the geological environment, which in the vast majority of cases consists of partially saturated soils. The performance of works such as stabilization, dam containment, retaining walls, foundations, and roads therefore depends on a correct prediction of water flow within the soil. However, since the regions studied for water-flow prediction commonly span areas on the order of square kilometers, the solution of the mathematical models requires very large computational meshes, leading to serious limitations in computational memory and processing time. To overcome these limitations, efficient numerical methods must be employed; in particular, iterative methods for large sparse nonlinear and linear systems must be used in this type of application. Given the relevance of the topic, this research approximated a solution to Richards' partial differential equation by the finite volume method in two dimensions, employing the Picard and Newton methods with improved computational efficiency. To this end, Krylov-subspace iterative techniques for solving linear systems with preconditioning matrices were used through the numerical library Portable, Extensible Toolkit for Scientific Computation (PETSc). The results indicate that when Richards' equation is solved with the PICARD-KRYLOV approach, regardless of the soil evaluation model, the best combination for solving the linear systems is the stabilized biconjugate gradient method with the SOR preconditioner; when the van Genuchten equations are used, however, the conjugate gradient method combined with the SOR preconditioner should be chosen. When the NEWTON-KRYLOV approach is adopted, the stabilized biconjugate gradient method is the most efficient for solving the linear system of the Newton step, and the block Jacobi preconditioner should be preferred. Finally, there is evidence that the PICARD-KRYLOV method can be more advantageous than the NEWTON-KRYLOV method when applied to the solution of Richards' partial differential equation.
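A minimal petsc4py sketch of the linear solve inside one Picard iteration, using the combination the study found best (BiCGSTAB with SOR preconditioning); the tridiagonal matrix below is a toy stand-in, not the study's finite-volume system:

```python
from petsc4py import PETSc

# Toy 1-D tridiagonal system standing in for one linearized Picard step
# of Richards' equation (the real system is assembled by finite volumes).
n = 100
A = PETSc.Mat().createAIJ([n, n], nnz=3)
for i in range(n):
    A.setValue(i, i, 2.0)
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()

b = A.createVecLeft(); b.set(1.0)   # right-hand side
x = A.createVecRight()              # solution vector

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setType('bcgs')                 # stabilized biconjugate gradients (BiCGSTAB)
ksp.getPC().setType('sor')          # SOR preconditioner
ksp.setTolerances(rtol=1e-8)
ksp.solve(b, x)
```

Swapping `'bcgs'` for `'cg'` gives the conjugate gradient variant the study recommends with the van Genuchten equations, and `'sor'` for `'bjacobi'` gives the block Jacobi preconditioner preferred in the NEWTON-KRYLOV case.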