852 results for "Initial data problem"
Abstract:
Leprosy is a chronic, granulomatous infectious disease of slow course caused by Mycobacterium leprae. The bacillus mainly attacks the peripheral nerves, causing lesions of the face, hands, and feet that can produce severe physical disabilities and contribute to the establishment of deforming patterns and impairments. The claw-hand lesion is a sequela observed in patients with upper-limb involvement; it is highly disabling, hindering these individuals' activities of daily living and consequently harming their quality of life and personal satisfaction. Such lesions affect the patient's broader life context, contributing to psycho-emotional changes in addition to the stigma of the disease itself. The occupational therapy intervention described here uses low-cost assistive technology to support the activities of daily living of patients with claw hand, aiming to minimize the functional deficits observed when functional adaptations are used in everyday activities such as feeding, personal hygiene, and dressing. The intervention applied an occupational therapy assessment protocol, the Canadian Occupational Performance Measure (COPM), which measures the patient's degree of performance and satisfaction in activities of daily living. The protocol was first applied to collect data on the patients' activities of daily living performed without assistive technology resources. Its application was based on the definition of five problems common to all participants, revealing very low performance and satisfaction scores for the assessed activities.
Subsequently, the adaptations developed for each patient were prescribed, fabricated, and trained, totaling one hundred and twenty (120) devices. The same protocol was then reapplied to the same patients, addressing the same problems, after a training period with the functional adaptations, and the data collected in the first and second COPM were compared. Relative to the initial data, the second COPM assessment showed a significant increase in the patients' performance and satisfaction scores, as well as functional gains. The study concludes that the proposed occupational therapy intervention using low-cost assistive technology devices (adaptations) is feasible, yields satisfactory results, and favors broad social reach owing to the reduced cost of the devices developed.
Abstract:
We discuss the one-sided Green's function, associated with an initial value problem and the two-sided Green's function related to a boundary value problem. We present a specific calculation associated with a differential equation with constant coefficients. For both problems, we also present the Laplace integral transform as another methodology to calculate these Green's functions and conclude which is the most convenient one. An incursion in the so-called fractional Green's function is also presented. As an example, we discuss the isotropic harmonic oscillator.
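As a concrete illustration of the method named in the abstract (a sketch in standard notation, not taken from the text), the one-sided Green's function of the harmonic oscillator follows directly from the Laplace transform:

```latex
% Impulsive source at t = \tau, causal (quiescent) initial conditions:
G''(t) + \omega^2 G(t) = \delta(t - \tau), \qquad G(0) = G'(0) = 0.

% Laplace transform in t:
(s^2 + \omega^2)\,\hat{G}(s) = e^{-s\tau}
\quad\Longrightarrow\quad
\hat{G}(s) = \frac{e^{-s\tau}}{s^2 + \omega^2}.

% Inversion (second shifting theorem) gives the one-sided Green's function:
G(t;\tau) = \frac{1}{\omega}\,\sin\!\bigl(\omega(t - \tau)\bigr)\,H(t - \tau),
```

where $H$ is the Heaviside step function; the shift $e^{-s\tau}$ is what encodes causality, which is why the Laplace transform is natural for initial value problems.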
Abstract:
The present work reviews studies on the dynamics of the Prometheus-Pandora system, in particular those dealing with the anomalous behaviours observed in its components, identified as angular lags in these satellites' orbits. Initially, a general description is presented, contextualizing the main characteristics of this system. The main publications on the subject are analyzed and commented on, in chronological order, showing the advances made in the knowledge of this dynamics. An analysis of the initial conditions used by Goldreich and Rappaport (2003a, b) and Cruz (2004), obtained from observations made by the Voyager 1 and 2 spacecraft and by the Hubble Space Telescope, is carried out in order to reproduce their results; however, no clear conclusion about the values used was reached. The tests adopted in the analysis are those of Cruz (2004), who reproduced the results and offered a new explanation for the origin of the observed angular lags. The adopted methodology involves the numerical integration of the equations of motion of the system, including the zonal harmonics J2, J4 and J6 of Saturn's gravitational potential. A fundamental consideration in this study is the use of geometric elements instead of osculating elements. The set of initial data that best reproduces the results of Goldreich and Rappaport (2003a, b) and Cruz (2004) was found.
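To illustrate the kind of force model such an integration involves, here is a minimal sketch of a point-mass acceleration plus the J2 zonal term. The constants and the function name are assumptions for illustration; the actual study also includes J4, J6, and the satellite interactions.

```python
import math

# Assumed illustrative values for Saturn (not taken from the abstract)
MU_SATURN = 3.793e7   # km^3/s^2, gravitational parameter
R_SATURN = 60268.0    # km, equatorial radius
J2 = 0.016298         # dimensionless J2 zonal harmonic

def accel_point_mass_plus_j2(x, y, z, mu=MU_SATURN, req=R_SATURN, j2=J2):
    """Acceleration (km/s^2) at (x, y, z) from a point mass plus its
    J2 oblateness perturbation, in the body-centered equatorial frame."""
    r2 = x * x + y * y + z * z
    r = math.sqrt(r2)
    # Keplerian two-body term: -mu * r_vec / r^3
    ax, ay, az = (-mu * c / r**3 for c in (x, y, z))
    # Standard Cartesian form of the J2 perturbation
    k = 1.5 * j2 * mu * req**2 / r**5
    zr2 = 5.0 * z * z / r2
    ax += -k * x * (1.0 - zr2)
    ay += -k * y * (1.0 - zr2)
    az += -k * z * (3.0 - zr2)
    return ax, ay, az
```

In the equatorial plane (z = 0) the J2 term adds a purely radial inward pull, which is the leading effect on the precession of near-equatorial satellites such as Prometheus and Pandora.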
Abstract:
The Ricci flow is an analytical tool, analogous to the heat equation in geometry: a diffusive process that acts on Riemannian metrics and can thus be used in mathematics to understand the topology of manifolds and in the study of geometric theories. The Ricci curvature also plays an important role in the General Theory of Relativity, a geometric theory in which it is the dominant term in the Einstein field equations. The main objectives of the present work are to develop and apply Ricci flow techniques to general relativity, taking a three-dimensional asymptotically flat Riemannian metric as a set of initial data for the Einstein equations, and to establish relations and comparisons between them.
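For reference, the evolution equation behind this approach, in standard notation not taken from the abstract, is:

```latex
% Ricci flow: the metric evolves by (minus twice) its Ricci curvature,
% a nonlinear analogue of the heat equation u_t = \Delta u.
\frac{\partial g_{ij}}{\partial t} = -2\,R_{ij}.
```

Fixed points of the flow are Ricci-flat metrics ($R_{ij} = 0$), which is one reason the flow interacts naturally with vacuum solutions of the Einstein equations.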
Abstract:
This paper investigates the conceptions about the relation between genotype and phenotype held by Biological Sciences undergraduate students who take part in a research group in Epistemology of Biology. In an initial data collection, ideas restricted to a genes-and-environment relation, without considering the organism and its life history, were evident. During the group discussions on the topic, however, other statements emerged involving further concepts, such as molecular interactions, chance, the organism, and Developmental Biology. The analysis of the conceptual (re)constructions that emerged in the group allowed the proposition and development of an explanatory model for the relation between genotype and phenotype.
Abstract:
In this paper we continue the development of the differential calculus started in Aragona et al. (Monatsh. Math. 144: 13-29, 2005). Guided by the so-called sharp topology and the interpretation of Colombeau generalized functions as point functions on generalized point sets, we introduce the notion of membranes and extend the definition of integrals, given in Aragona et al. (Monatsh. Math. 144: 13-29, 2005), to integrals defined on membranes. We use this to prove a generalized version of the Cauchy formula and to obtain the Goursat Theorem for generalized holomorphic functions. A number of results from classical differential and integral calculus, like the inverse and implicit function theorems and Green's theorem, are transferred to the generalized setting. Further, we indicate that solution formulas for transport and wave equations with generalized initial data can be obtained as well.
Abstract:
Abstract Background A large number of probabilistic models used in sequence analysis assign non-zero probability values to most input sequences. The most common way to decide whether a given probability is sufficient is Bayesian binary classification, in which the probability of the model characterizing the sequence family of interest is compared to that of an alternative probability model. A null model can be used as the alternative model; this is the scoring technique used by sequence analysis tools such as HMMER, SAM and INFERNAL. The most prevalent null models are position-independent residue distributions, including the uniform distribution, the genomic distribution, the family-specific distribution and the target sequence distribution. This paper presents a study evaluating the impact of the choice of null model on the final classification result. In particular, we are interested in minimizing the number of false predictions in a classification, a crucial issue for reducing the costs of biological validation. Results In all tests, the target null model presented the lowest number of false positives when random sequences were used as a test. The study was performed on DNA sequences using GC content as the measure of compositional bias, but the results should also hold for protein sequences. To broaden the applicability of the results, the study used randomly generated sequences; previous studies were performed on amino acid sequences, used only one probabilistic model (HMM) and a specific benchmark, and lacked more general conclusions about the performance of null models. Finally, a benchmark test with P. falciparum confirmed these results. Conclusions Of the evaluated models, those best suited for classification are the uniform model and the target model. However, the uniform model presents a GC bias that can cause more false positives for candidate sequences with extreme compositional bias, a characteristic not described in previous studies. In these cases the target model is more dependable for biological validation due to its higher specificity.
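The scoring scheme the abstract describes can be sketched in a few lines. This is a minimal illustration, not the implementation used by HMMER, SAM or INFERNAL; the function names and the +1 pseudocounts are assumptions.

```python
import math

def log_odds(seq, model_probs, null_probs):
    """Log-odds score of seq under a position-independent residue model
    versus a null model; positive scores favour the family model."""
    return sum(math.log(model_probs[r]) - math.log(null_probs[r]) for r in seq)

# Uniform null model over DNA residues
uniform_null = {base: 0.25 for base in "ACGT"}

def target_null(seq):
    """Null model estimated from the target sequence's own composition,
    with +1 pseudocounts so every residue has non-zero probability."""
    n = len(seq) + 4
    return {base: (seq.count(base) + 1) / n for base in "ACGT"}

# A GC-rich family model scored on a GC-rich target: the uniform null
# inflates the score, while the target null cancels the compositional bias.
model = {"A": 0.1, "C": 0.4, "G": 0.4, "T": 0.1}
seq = "GCGCGC"
score_uniform = log_odds(seq, model, uniform_null)   # positive
score_target = log_odds(seq, model, target_null(seq))  # ~0
```

The example shows in miniature the GC-bias effect the paper measures: under the uniform null, extreme GC content alone raises the score, producing the extra false positives the abstract reports.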
Abstract:
[EN] In this paper we show that a classic optical flow technique by Nagel and Enkelmann can be regarded as an early anisotropic diffusion method with a diffusion tensor. We introduce three improvements into the model formulation: we avoid inconsistencies caused by centering the brightness term and the smoothness term in different images; we use a linear scale-space focusing strategy from coarse to fine scales to avoid convergence to physically irrelevant local minima; and we create an energy functional that is invariant under linear brightness changes. Applying a gradient descent method to the resulting energy functional leads to a system of diffusion-reaction equations. We prove that this system has a unique solution under realistic assumptions on the initial data, and we present an efficient linear implicit numerical scheme in detail. Our method creates flow fields with 100% density over the entire image domain, it is robust under a large range of parameter variations, and it can recover displacement fields far beyond the typical one-pixel limits that are characteristic of many differential methods for determining optical flow. We show that it performs better than the classic optical flow methods with 100% density evaluated by Barron et al. (1994). Our software is available from the Internet.
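The full Nagel-Enkelmann scheme is beyond a short sketch, but the "linear implicit" idea can be illustrated on the simplest case: one backward-Euler step of 1-D linear diffusion, solved with a tridiagonal (Thomas) solve. All names and parameters here are illustrative, not taken from the paper.

```python
def implicit_diffusion_step(u, dt, alpha):
    """One backward-Euler (linear implicit) step of u_t = alpha * u_xx on a
    1-D grid (dx = 1) with homogeneous Neumann boundaries. Solves
    (I - dt*alpha*L) u_new = u with the Thomas algorithm; unconditionally
    stable, so dt is not limited by the explicit CFL bound."""
    n = len(u)
    r = dt * alpha
    # Tridiagonal coefficients of (I - r*L), Neumann boundary rows
    a = [0.0] + [-r] * (n - 1)                              # sub-diagonal
    b = [1.0 + r] + [1.0 + 2.0 * r] * (n - 2) + [1.0 + r]   # diagonal
    c = [-r] * (n - 1) + [0.0]                              # super-diagonal
    d = list(u)
    # Forward elimination
    for i in range(1, n):
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    # Back substitution
    x = [0.0] * n
    x[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (d[i] - c[i] * x[i + 1]) / b[i]
    return x

# A unit spike diffuses in one large implicit step: total mass is
# conserved (Neumann boundaries) and the peak is smoothed out.
u = [0.0] * 7
u[3] = 1.0
out = implicit_diffusion_step(u, dt=1.0, alpha=1.0)
```

In the paper's setting the same structure appears per row/column of the image with a spatially varying diffusion tensor instead of the scalar `alpha`, plus the reaction term from the brightness constancy constraint.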