937 results for Lagrange interpolation


Relevance: 10.00%

Publisher:

Abstract:

This work is divided into two distinct parts. The first part is a study of the metal-organic framework UiO-66(Zr), where the aim was to determine the force field that best describes the adsorption equilibrium properties of two different gases, methane and carbon dioxide. The second part focuses on the topology of single-wall carbon nanotubes for ethane adsorption; here the aim was to simplify the solid-fluid force field model as much as possible in order to increase the computational efficiency of the Monte Carlo simulations. The choice of both adsorbents rests on their potential use in adsorption processes such as carbon dioxide capture and storage, natural gas storage, separation of biogas components, and olefin/paraffin separations. The adsorption studies on the two porous materials were performed by molecular simulation using the grand canonical Monte Carlo (μ,V,T) method over the temperature range 298-343 K and the pressure range 0.06-70 bar. Calibration curves of pressure and density as a function of chemical potential and temperature for the three adsorbates under study were obtained by Monte Carlo simulation in the canonical ensemble (N,V,T); a polynomial fit and interpolation of the resulting data allowed the pressure and gas density to be determined at any chemical potential. The adsorption equilibria of methane and carbon dioxide in UiO-66(Zr) were simulated and compared with the experimental data obtained by Jasmina H. Cavka et al. The results show that the best force field for both gases is a chargeless united-atom force field based on the TraPPE model. Using this validated force field it was possible to estimate the isosteric heats of adsorption and the Henry constants.
In the grand canonical Monte Carlo simulations of carbon nanotubes, we conclude that the fastest runs are obtained with a force field that approximates the nanotube as a smooth cylinder; this approximation gives execution times 1.6 times faster than typical atomistic runs.
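The calibration step described above can be sketched as follows. All numerical values are illustrative placeholders, not simulation output, and the quadratic fit in log-space is an assumed functional form:

```python
import numpy as np

# Hypothetical (chemical potential, pressure, density) points standing in for
# the output of canonical (N,V,T) Monte Carlo runs at a single temperature.
mu = np.array([-38.0, -36.0, -34.0, -32.0, -30.0])   # kJ/mol (made up)
pressure = np.array([0.08, 0.45, 2.1, 8.9, 31.0])    # bar (made up)
density = np.array([0.03, 0.18, 0.81, 3.2, 10.5])    # kg/m3 (made up)

# Quadratic fit of log-pressure and log-density against mu, a stand-in for
# the "polynomial fit and interpolation" step.
p_coef = np.polyfit(mu, np.log(pressure), 2)
d_coef = np.polyfit(mu, np.log(density), 2)

def pressure_at(mu_query):
    """Interpolated pressure (bar) at an arbitrary chemical potential."""
    return np.exp(np.polyval(p_coef, mu_query))

def density_at(mu_query):
    """Interpolated gas density (kg/m3) at an arbitrary chemical potential."""
    return np.exp(np.polyval(d_coef, mu_query))

print(pressure_at(-33.0), density_at(-33.0))
```

Fitting the logarithms keeps the interpolated pressure and density positive at every chemical potential, which matters when the calibration spans several orders of magnitude.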


Introduction: The objective of this study was to analyze the spatial behavior of the occurrence of trachoma cases detected in the city of Bauru, state of São Paulo, Brazil, in 2006, and to use the information collected to set priority areas for the optimization of health resources. Methods: The trachoma cases identified in 2006 were georeferenced. The data evaluated were: the schools attended by the trachoma cases, data from the 2000 Census, census tract, type of housing, water supply conditions, income distribution, and the educational level of household heads. Descriptive spatial analysis and kernel density estimation were carried out with the Google Earth® and TerraView® software. Each area was studied by interpolating the density surfaces of the events, making clusters easier to recognize. Results: Of the 66 cases detected, only one (1.5%) was not a resident of the city's outskirts. A positive association was detected between trachoma cases and the percentage of household heads with income below three minimum wages and with less than eight years of schooling. Conclusions: The spatial distribution of trachoma cases coincided with the areas of greatest social inequality in the city of Bauru. The micro-areas identified are those that should be prioritized in the rationalization of health resources. The trachoma cases detected may also be used as a performance indicator for priority micro-area health programs.
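The kernel density step can be sketched generically as follows. The coordinates and cluster sizes below are synthetic, not the Bauru data:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Synthetic case coordinates forming two clusters (illustrative only).
cases = np.vstack([
    rng.normal(loc=(-49.06, -22.35), scale=0.01, size=(50, 2)),
    rng.normal(loc=(-49.02, -22.30), scale=0.01, size=(16, 2)),
]).T                                  # shape (2, n), as gaussian_kde expects

kde = gaussian_kde(cases)

# Evaluate the density surface on a grid; high-density cells flag clusters.
lon = np.linspace(-49.10, -48.98, 60)
lat = np.linspace(-22.40, -22.26, 60)
LON, LAT = np.meshgrid(lon, lat)
density = kde(np.vstack([LON.ravel(), LAT.ravel()])).reshape(LON.shape)
hot = np.unravel_index(density.argmax(), density.shape)
print("densest cell:", LON[hot], LAT[hot])
```

The grid of density values is what a GIS tool renders as a heat-map surface, with the peaks marking the micro-areas to prioritize.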


Doctoral Programme in Mathematics and Applications.


Species distribution modeling has relevant implications for biodiversity studies, conservation decision making, and knowledge of the ecological requirements of species. The aim of this study was to evaluate whether forest inventories can improve the estimation of occurrence probability and identify the limits of the potential distribution and the habitat preference of a group of timber tree species. The environmental predictor variables were elevation, slope, aspect, normalized difference vegetation index (NDVI) and height above the nearest drainage (HAND). To estimate the species distributions we used the maximum entropy method (Maxent). In comparison with a random distribution, using topographic variables and the vegetation index as features, Maxent predicted the geographical distribution of the studied species with an average accuracy of 86%. Elevation and NDVI were the most important variables. There were limitations to interpolating the models to non-sampled locations that lie outside the elevation gradient associated with the occurrence data, covering approximately 7% of the basin area. Ceiba pentandra (samaúma), Castilla ulei (caucho) and Hura crepitans (assacu) are more likely to occur near watercourses. Clarisia racemosa (guariúba), Amburana acreana (cerejeira), Aspidosperma macrocarpon (pereiro), Apuleia leiocarpa (cumaru cetim), Aspidosperma parvifolium (amarelão) and Astronium lecointei (aroeira) can also occur in upland forest on well-drained soils. This modeling approach has potential for application to other, still less studied, tropical species, especially those under logging pressure.
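Maxent itself is a specialized maximum-entropy estimator; as a rough stand-in, presence/background modeling is often sketched with a plain logistic regression on environmental predictors. Everything below — predictors, weights, data — is synthetic, not the study's inventory:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic standardized predictors: elevation, slope, NDVI, HAND (made up).
n = 400
X = rng.normal(size=(n, 4))
true_w = np.array([1.5, -0.5, 2.0, -1.0])        # invented "habitat" weights
p_true = 1.0 / (1.0 + np.exp(-(X @ true_w)))
y = (rng.random(n) < p_true).astype(float)        # 1 = presence, 0 = background

# Plain logistic regression by gradient descent (a stand-in for Maxent).
w = np.zeros(4)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    w -= 0.1 * (X.T @ (p - y)) / n                # average NLL gradient step

def occurrence_probability(features):
    """Estimated probability of occurrence for one predictor vector."""
    return 1.0 / (1.0 + np.exp(-(features @ w)))

print(np.round(w, 2))
```

Evaluating `occurrence_probability` on a raster of predictor values yields the occurrence-probability surface that is then thresholded or mapped.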


In this article we develop a specification technique for building the multiplicative time-varying GARCH models of Amado and Teräsvirta (2008, 2013). The variance is decomposed into an unconditional and a conditional component, such that the unconditional variance component is allowed to evolve smoothly over time. This nonstationary component is defined as a linear combination of logistic transition functions with time as the transition variable. The appropriate number of transition functions is determined by a sequence of specification tests, and for that purpose a coherent modelling strategy based on statistical inference is presented. The strategy relies heavily on Lagrange multiplier-type misspecification tests, which are easily implemented as they are entirely based on auxiliary regressions. Finite-sample properties of the strategy and tests are examined by simulation. The modelling strategy is illustrated in practice with two real examples: an empirical application to daily exchange rate returns and another to daily coffee futures returns.
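The nonstationary component described above — a linear combination of logistic transition functions of rescaled time — can be sketched as follows; the parameter values are illustrative choices, not estimates from the article:

```python
import numpy as np

def logistic_transition(t, gamma, c):
    """G(t; gamma, c): smooth logistic transition in rescaled time t = i/T."""
    return 1.0 / (1.0 + np.exp(-gamma * (t - c)))

# Unconditional variance component g_t = delta0 + sum_l delta_l * G_l(t).
# delta and (gamma, c) values below are made up for illustration.
T = 1000
t = np.arange(1, T + 1) / T
g = (0.5
     + 1.0 * logistic_transition(t, gamma=25.0, c=0.3)
     + 0.8 * logistic_transition(t, gamma=40.0, c=0.7))

# A multiplicative TV-GARCH return would then be r_t = sqrt(g_t * h_t) * z_t,
# with h_t a standard GARCH conditional variance process.
print(g[0], g[-1])
```

With both transitions increasing, g_t rises smoothly from about 0.5 to about 2.3 over the sample, which is the kind of slow level shift the unconditional component is meant to absorb.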


In this paper we consider the approximate computation of isospectral flows based on finite integration methods (FIM) with radial basis function (RBF) interpolation, and develop a new algorithm. Our method preserves the symmetry of the solutions. Numerical experiments demonstrate that our algorithm produces solutions of higher accuracy than the second-order Runge-Kutta (RK2) method.
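As a generic sketch of the RBF interpolation building block (not the paper's FIM algorithm), one can interpolate a smooth test function with Gaussian radial basis functions; the shape parameter below is an arbitrary choice:

```python
import numpy as np

def rbf_interpolator(x_nodes, f_nodes, eps=10.0):
    """Gaussian RBF interpolant through (x_nodes, f_nodes); eps is the shape parameter."""
    phi = lambda r: np.exp(-(eps * r) ** 2)
    # Interpolation matrix A[i, j] = phi(|x_i - x_j|); solve A c = f.
    A = phi(np.abs(x_nodes[:, None] - x_nodes[None, :]))
    coeffs = np.linalg.solve(A, f_nodes)
    return lambda x: phi(np.abs(np.asarray(x)[:, None] - x_nodes[None, :])) @ coeffs

# Interpolate sin(2*pi*x) from 21 nodes on [0, 1].
x_nodes = np.linspace(0.0, 1.0, 21)
f_nodes = np.sin(2 * np.pi * x_nodes)
interp = rbf_interpolator(x_nodes, f_nodes)

x_test = np.linspace(0.0, 1.0, 101)
err = np.max(np.abs(interp(x_test) - np.sin(2 * np.pi * x_test)))
print("max interpolation error:", err)
```

The Gaussian kernel gives a symmetric positive definite interpolation matrix, so the linear system is always solvable; the trade-off is that smaller shape parameters improve accuracy but worsen conditioning.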


"Series title: SpringerBriefs in Applied Sciences and Technology, ISSN 2191-530X"


OBJECTIVE: To analyze trends in the risk of death from circulatory diseases (CD) in 13 Brazilian states from 1980 to 1998. METHODS: Mortality data for CD, ischemic heart disease (IHD) and cerebrovascular diseases (CbVD) in the 13 states were obtained from the Ministry of Health. Population estimates for 1980 to 1998 were calculated by interpolation, using Lagrange's method, based on the 1970, 1980 and 1991 Censuses and the 1996 population count. Trends were analyzed with a multiple linear regression model. RESULTS: Mortality from CD showed a downward trend in most states. Among men, increases were observed in Pernambuco for all age groups, in Goiás from age forty, and in Bahia and Mato Grosso from age fifty. Among women, increases were observed in Mato Grosso from age thirty, in Pernambuco from age forty, and in Goiás between ages thirty and 49; in the other age groups in Goiás the increase was slight. For IHD, mortality increased in all age groups in Mato Grosso and Pernambuco, and from age forty in Bahia, Goiás and Pará. For CbVD, mortality increased in all age groups in Mato Grosso and Pernambuco, and from age forty in Bahia and Goiás. CONCLUSION: A substantial increase in the risk of death from circulatory diseases was observed in the less developed states of Brazil.
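The population estimation step — Lagrange interpolation through the census years 1970, 1980, 1991 and 1996 — can be sketched as follows. The population counts are invented placeholders, not the official figures:

```python
# Lagrange interpolation through census years, mirroring the intercensal
# population estimation described above. Counts are hypothetical.
years = [1970, 1980, 1991, 1996]
pops = [1_500_000, 1_900_000, 2_400_000, 2_600_000]  # made-up populations

def lagrange(x, xs, ys):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)  # Lagrange basis factor
        total += term
    return total

# Intercensal estimate for, e.g., 1985:
print(round(lagrange(1985, years, pops)))
```

With four census points the interpolant is a cubic, which passes exactly through each census count and fills in every intermediate year.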


Relying on original and well-known interpolation formulas, the peculiarities of the behavior of the spectral function densities in the transition regions between sublayers of phase space are analyzed for the cases of a neutral and a conducting atmosphere. The results of the numerical computations are presented in the form of graphs.


Elliptic differential equations, finite element method, mortar element method, streamline diffusion FEM, upwind method, numerical method, error estimate, interpolation operator, grid generation, adaptive refinement


The main object of the present paper is to give formulas and methods which enable us to determine the minimum number of repetitions or of individuals necessary to guarantee, to some extent, the success of an experiment. The theoretical basis of all the processes is essentially the following. Knowing the frequency p of the desired events and q of the undesired events, we may calculate the frequency of all possible combinations to be expected in n repetitions by expanding the binomial (p + q)^n. Having determined which of these combinations we want to avoid, we calculate their total frequency and select the exponent n of the binomial in such a way that this total frequency is equal to or smaller than the accepted limit of precision:

n! p^n [ (1/n!)(q/p)^n + (1/(1!(n-1)!))(q/p)^(n-1) + (1/(2!(n-2)!))(q/p)^(n-2) + (1/(3!(n-3)!))(q/p)^(n-3) + ... ] <= P_lim ----(1b)

There is no absolute limit of precision, since its value depends not only upon psychological factors in our judgement but is at the same time a function of the number of repetitions. For this reason I have proposed (1,56) two relative values, one equal to 1-5n as the lowest value of probability and the other equal to 1-10n as the highest value of improbability, leaving between them what may be called the "region of doubt". However, these formulas cannot be applied in our case, since the number n is just the unknown quantity. Thus we have to use, instead of the more exact values of these two formulas, the conventional limits P_lim = 0.05 (precision 5%), 0.01 (precision 1%) and 0.001 (precision 0.1%). The binomial formula as explained above (cf. formula 1, pg. 85), however, is of rather limited applicability owing to the excessive calculation required, and we thus have to use approximations as substitutes.
We may use, without loss of precision, the following approximations: a) the normal or Gaussian distribution, when the expected frequency p has any value between 0.1 and 0.9 and n is at least greater than ten; b) the Poisson distribution, when the expected frequency p is smaller than 0.1. Tables V to VII show, for some special cases, that these approximations are very satisfactory. The practical solution of the problems stated in the introduction can now be given:
A) What is the minimum number of repetitions necessary to avoid that any one of a treatments, varieties, etc. may be accidentally always the best, or the best and second best, or the first, second and third best, or finally one of the m best? Using the first term of the binomial, we obtain for n:

n = log P_lim / log(m/a) = log P_lim / (log m - log a) ----(5)

B) What is the minimum number of individuals necessary in order that a certain type, expected with frequency p, may appear in at least one, two, three, or a = m + 1 individuals? 1) For p between 0.1 and 0.9, using the Gaussian approximation and requiring that the expected number np, diminished by delta times its standard deviation, still reaches m = a - 1, we have:

b = delta sqrt((1 - p)/p),  c = m/p ----(7)
n = [(b + sqrt(b^2 + 4c))/2]^2,  n' = 1/p,  n(cor) = n + n' ----(8)

The correction n' is used when p has a value between 0.25 and 0.75. The Greek letter delta represents here the unilateral limits of the Gaussian distribution for the three conventional limits of precision: 1.64, 2.33 and 3.09, respectively. If we are only interested in having at least one individual, m becomes equal to zero and the formula reduces to:

n = b^2 = delta^2 (1 - p)/p,  n' = 1/p,  n(cor) = n + n' ----(9)

2) If p is smaller than 0.1 we may use Table 1 to find the mean m of a Poisson distribution and determine n = m/p.
C) What is the minimum number of individuals necessary for distinguishing two frequencies p1 and p2? 1) When p1 and p2 are values between 0.1 and 0.9 we have:

n = [delta (sqrt(p1(1 - p1)) + sqrt(p2(1 - p2))) / (p1 - p2)]^2,  n' = 1/(p1 - p2),  n(cor) = n + n' ----(13)

We again use the unilateral limits of the Gaussian distribution. The correction n' should be used if at least one of the values p1 or p2 lies between 0.25 and 0.75. A more complicated formula may be used where we want to increase the precision:

b = delta (sqrt(p1(1 - p1)) + sqrt(p2(1 - p2))),  c = m/(p1 - p2)
n = [(b + sqrt(b^2 + 4c))/2]^2,  n' = 1/(p1 - p2) ----(14)

2) When both p1 and p2 are smaller than 0.1 we determine the quotient of p1 and p2 and find the corresponding number m2 of a Poisson distribution in Table 2. The value of n is then given by:

n = m2/p2 ----(15)

D) What is the minimum number necessary for distinguishing three or more frequencies p1, p2, p3? 1) If the frequencies p1, p2, p3 are values between 0.1 and 0.9, we solve the individual equations and use the highest value of n thus determined:

n(1,2) = [delta (sqrt(p1(1 - p1)) + sqrt(p2(1 - p2))) / (p1 - p2)]^2, etc. ----(16)

Delta now represents the bilateral limits of the Gaussian distribution: 1.96, 2.58 and 3.29. 2) No table was prepared for the relatively rare case of a comparison of three or more frequencies below 0.1; in such cases extremely high numbers would be required.
E) A process is given which serves to solve two problems of an informative nature: a) if a special type appears among n individuals with a frequency p(obs), what may be the corresponding ideal value of p(exp); or b) if we study samples of n individuals and expect a certain type with a frequency p(exp), what may be the extreme limits of p(obs) in individual families? 1) If we are dealing with values between 0.1 and 0.9 we may use Table 3. To solve the first question we select the horizontal line for p(obs), determine which column corresponds to our value of n, and find the respective value of p(exp) by interpolating between columns. To solve the second problem we start with the column for p(exp) and find the horizontal line for the given value of n, either directly or by interpolation. 2) For frequencies smaller than 0.1 we use Table 4 and transform the fractions p(exp) and p(obs) into numbers of a Poisson series by multiplying by n. To solve the first problem, we find the line in which the lower Poisson limit is equal to m(obs) and transform the corresponding value of m into the frequency p(exp) by dividing by n. The observed frequency may thus be a chance deviate of any value between 0.0... and the value obtained by dividing the tabulated m by n. In the second case we first transform the expectation p(exp) into a value of m and find, in the horizontal line corresponding to m(exp), the extreme values of m, which must then be transformed into values of p(obs) by dividing by n.
F) Partial and progressive tests may be recommended in all cases where there is a lack of material, or where loss of time is less important than the cost of large-scale experiments, since in many cases the minimum number necessary to guarantee the results within the limits of precision is rather large. One should not forget that the minimum number really represents at the same time a maximum number, necessary only if one takes into consideration essentially the unfavorable variations; smaller numbers may frequently already give satisfactory results. For instance, by definition we know that a frequency p means that we expect one individual in every total of 1/p. If there were no chance variations, this number 1/p would be sufficient, and if there were favorable variations a still smaller number might yield one individual of the desired type. Thus, trusting to luck, one may start the experiment with numbers smaller than the minimum calculated according to the formulas given above and increase the total until the desired result is obtained, which may well be before the "minimum number" is reached. Some concrete examples of this partial or progressive procedure are given from our genetic experiments with maize.
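The exact binomial version of problem (B) can be sketched directly: search for the smallest n whose binomial tail probability falls under the chosen precision limit. The function name and example values are illustrative:

```python
import math

# Smallest n such that a type with frequency p fails to appear in at least
# a = m + 1 individuals with probability <= p_lim. Exact binomial counting,
# with no normal or Poisson approximation.
def minimum_n(p, m, p_lim):
    n = m + 1
    while True:
        # P(at most m successes in n trials)
        tail = sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
                   for k in range(m + 1))
        if tail <= p_lim:
            return n
        n += 1

# At least one individual (m = 0) of a type with frequency 0.25, precision 5%:
print(minimum_n(0.25, 0, 0.05))  # -> 11, since 0.75**11 <= 0.05 < 0.75**10
```

For m = 0 the loop reduces to the condition (1 - p)^n <= P_lim, the first term of the binomial expansion used in problem (A); the approximations in the paper exist because this exact count was laborious by hand.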


The author proves that the equation

| n        Sum Z^x       Sum y        |
| Sum Z^x  Sum Z^(2x)    Sum y Z^x    | = 0,
| Sum xZ^x Sum xZ^(2x)   Sum xy Z^x   |

where Z = 10^(-cq) and q is a numerical constant, used by Pimentel Gomes and Malavolta in several articles for the interpolation of Mitscherlich's equation y = A [1 - 10^(-c(x + b))] by the least squares method, always has a zero of order three at Z = 1. Therefore the equation A0 Z^m + A1 Z^(m-1) + ... + Am = 0 obtained from that determinant can be divided by (Z - 1)^3. This property provides a good check on the correctness of the computations and facilitates the solution of the equation.
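The divisibility property can be checked numerically for one made-up data set by assembling the determinant with polynomial arithmetic and dividing by (Z - 1)^3. The matrix layout below is an assumption, reconstructed from the least-squares normal equations of y = A[1 - 10^(-c(x+b))], since the original typesetting was garbled:

```python
import numpy as np
from numpy.polynomial import polynomial as P

# Coefficient arrays are in ascending powers of Z. The x, y data are made up.
x = np.array([0, 1, 2, 3])
y = np.array([1.0, 2.0, 2.5, 2.8])
n = len(x)

def power_sum(weights, powers, degree):
    """Coefficients of sum_i weights[i] * Z**powers[i]."""
    c = np.zeros(degree + 1)
    for w, k in zip(weights, powers):
        c[k] += w
    return c

deg = 2 * x.max()
M = [
    [np.array([float(n)]), power_sum(np.ones(n), x, deg), np.array([y.sum()])],
    [power_sum(np.ones(n), x, deg), power_sum(np.ones(n), 2 * x, deg), power_sum(y, x, deg)],
    [power_sum(x, x, deg), power_sum(x, 2 * x, deg), power_sum(x * y, x, deg)],
]

def det3(m):  # cofactor expansion with polynomial multiplication
    d2 = lambda a, b, c, d: P.polysub(P.polymul(a, d), P.polymul(b, c))
    t1 = P.polymul(m[0][0], d2(m[1][1], m[1][2], m[2][1], m[2][2]))
    t2 = P.polymul(m[0][1], d2(m[1][0], m[1][2], m[2][0], m[2][2]))
    t3 = P.polymul(m[0][2], d2(m[1][0], m[1][1], m[2][0], m[2][1]))
    return P.polyadd(P.polysub(t1, t2), t3)

D = det3(M)
# Divide by (Z - 1)**3 = -1 + 3Z - 3Z**2 + Z**3; the remainder should vanish.
quotient, remainder = P.polydiv(D, np.array([-1.0, 3.0, -3.0, 1.0]))
print("remainder:", np.max(np.abs(remainder)))
```

A remainder at rounding-error level confirms the triple root at Z = 1 for this data set, which is exactly the computational check the author proposes.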


We quantify the long-time behavior of a system of (partially) inelastic particles in a stochastic thermostat by means of the contractivity of a suitable metric in the set of probability measures. Existence, uniqueness, boundedness of moments and regularity of a steady state are derived from this basic property. The solutions of the kinetic model are proved to converge exponentially as t→ ∞ to this diffusive equilibrium in this distance metrizing the weak convergence of measures. Then, we prove a uniform bound in time on Sobolev norms of the solution, provided the initial data has a finite norm in the corresponding Sobolev space. These results are then combined, using interpolation inequalities, to obtain exponential convergence to the diffusive equilibrium in the strong L¹-norm, as well as various Sobolev norms.
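The final step — combining decay in a weak distance with uniform Sobolev bounds — typically rests on an interpolation inequality of the following schematic form; the exponents and the choice of negative-order norm are illustrative assumptions, not the paper's exact statement:

```latex
% For -r < s < k, with theta in (0,1) chosen so that
% s = (1 - theta)(-r) + theta k:
\[
\|f(t)-f_\infty\|_{H^{s}}
  \le \|f(t)-f_\infty\|_{H^{-r}}^{\,1-\theta}\,
      \|f(t)-f_\infty\|_{H^{k}}^{\,\theta}.
\]
```

Since the first factor is controlled by the exponentially contracting weak metric while the second stays bounded by the uniform Sobolev estimate, the product, and hence the intermediate norm, also decays exponentially.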


In this paper we study basic properties of the weighted Hardy space for the unit disc with the weight function satisfying Muckenhoupt's (Aq) condition, and study related approximation problems (expansion, moment and interpolation) with respect to two incomplete systems of holomorphic functions in this space.