550 results for SMOOTHING SPLINES


Relevance:

10.00%

Abstract:

A new and complete method is proposed for the analysis of acetone in exhaled air, involving collection with preconcentration in water, chemical derivatization, and electrochemical determination assisted by a new signal-processing algorithm. In the recent literature, exhaled acetone has been evaluated as a biomarker for the non-invasive monitoring of clinical conditions such as diabetes and heart failure, hence the relevance of this proposal. Among the amines that react with acetone to form electroactive imines, studied by polarography in the middle of the last century, glycine showed the best set of characteristics for defining a determination method based on square-wave voltammetry without the need for oxygen removal (25 Hz, 20 mV amplitude, 5 mV step, mercury drop electrode). The reaction medium, composed of glycine (2 mol·L-1) in NaOH (1 mol·L-1), also served as the electrolyte, and the imine reduction peak at -1.57 V vs. Ag|AgCl constituted the analytical signal. For signal processing, an innovative algorithm was developed and evaluated, based on baseline interpolation by Bézier curve fitting and fitting of a Gaussian to the peak. This combination allowed the recognition and quantification of relatively low, broad peaks sitting on a noisy baseline with pronounced curvature, a situation in which conventional methods fail and spline-type curves proved less appropriate. The algorithm (available at http://github.com/batistagl/chemapps) was implemented using an open-source matrix algebra program integrated directly with the potentiostat control software. To demonstrate that the instrument's native capabilities can be extended generally through integration with external programming in the Octave language (open source), three-dimensional chronocoulometry was also implemented, with visualization of the processed results as 3-D perspective mesh projections viewed from any angle. The electrochemical determination of acetone in aqueous phase, assisted by the Bézier-curve-based algorithm, is fast and automatic, has a detection limit of 3.5·10-6 mol·L-1 (0.2 mg·L-1), and covers a linear range that meets the requirements of exhaled-air analysis. Acetaldehyde, commonly present in exhaled air, especially after consumption of alcoholic beverages, gives rise to a voltammetric peak at -1.40 V, circumventing an interference that hampers several other methods published in the literature and opening the possibility of simultaneous determination. Results obtained with real samples agree with those obtained by the spectrophotometric method that has been in routine use since its improvement in the author's master's dissertation. Relative to that dissertation, the geometry of the collection device was also optimized, so as to concentrate the acetone in a smaller volume of cold water and to provide greater comfort to the patient. The complete method presented here, encompassing the improved sampling device and the new, effective algorithm for automatic processing of voltammetric signals, is ready to be applied. Evolution toward a portable analyzer depends on improving the detection limit and on easier preparation of solid (screen-printed) electrodes bearing a mercury film, since bismuth and boron-doped diamond electrodes, among others, showed no response.
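
The baseline-plus-peak step described above can be illustrated with a minimal Python sketch, assuming a cubic Bézier baseline fitted to the peak-free regions of the voltammogram and a Gaussian fitted to the corrected peak; the helper names and the peak window are hypothetical, and this is not the author's Octave implementation from http://github.com/batistagl/chemapps.

```python
# Hypothetical illustration of Bezier-baseline correction plus Gaussian peak fit.
import numpy as np
from scipy.optimize import curve_fit

def cubic_bezier(t, p0, p1, p2, p3):
    """Evaluate a cubic Bezier (Bernstein-basis) curve at t in [0, 1]."""
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

def fit_baseline(potential, current, peak_window):
    """Fit the Bezier control points to the points outside the peak window."""
    t = (potential - potential.min()) / np.ptp(potential)
    mask = (potential < peak_window[0]) | (potential > peak_window[1])
    coeffs, _ = curve_fit(cubic_bezier, t[mask], current[mask])
    return cubic_bezier(t, *coeffs)

def gaussian(x, height, center, width):
    return height * np.exp(-0.5 * ((x - center) / width) ** 2)

def quantify_peak(potential, current, peak_window):
    """Return height and position of the Gaussian fitted to the corrected peak."""
    corrected = current - fit_baseline(potential, current, peak_window)
    p0 = [corrected.max(), potential[np.argmax(corrected)], 0.05]
    (height, center, _), _ = curve_fit(gaussian, potential, corrected, p0=p0)
    return height, center
```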

Relevance:

10.00%

Abstract:

We present a modelling method to estimate the 3-D geometry and location of homogeneously magnetized sources from magnetic anomaly data. As input information, the procedure needs the parameters defining the magnetization vector (intensity, inclination and declination) and the Earth's magnetic field direction. When these two vectors are expected to be different in direction, we propose to estimate the magnetization direction from the magnetic map. Then, using this information, we apply an inversion approach based on a genetic algorithm which finds the geometry of the sources by seeking the optimum solution from an initial population of models in successive iterations through an evolutionary process. The evolution consists of three genetic operators (selection, crossover and mutation), which act on each generation, and a smoothing operator, which looks for the best fit to the observed data and a solution consisting of plausible compact sources. The method allows the use of non-gridded, non-planar and inaccurate anomaly data and non-regular subsurface partitions. In addition, neither constraints for the depth to the top of the sources nor an initial model are necessary, although previous models can be incorporated into the process. We show the results of a test using two complex synthetic anomalies to demonstrate the efficiency of our inversion method. The application to real data is illustrated with aeromagnetic data of the volcanic island of Gran Canaria (Canary Islands).
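
A minimal sketch of the evolutionary loop described above is given below, assuming a binary cell model (1 = magnetized cell), a generic forward operator, and a crude neighbour-averaging smoothing step; none of these choices comes from the paper itself.

```python
# Sketch of a genetic inversion loop: selection, crossover, mutation, smoothing.
import numpy as np

rng = np.random.default_rng(0)

def fitness(model, forward, observed):
    """Misfit between predicted and observed anomaly (lower is better)."""
    return np.sum((forward(model) - observed) ** 2)

def evolve(population, forward, observed, generations=200,
           mutation_rate=0.01, elite=2):
    for _ in range(generations):
        scores = np.array([fitness(m, forward, observed) for m in population])
        order = np.argsort(scores)
        parents = population[order[: len(population) // 2]]   # selection
        children = []
        while len(children) < len(population) - elite:
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, a.size)                      # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(child.size) < mutation_rate      # mutation
            child[flip] = 1 - child[flip]
            # crude smoothing operator: average over neighbouring cells
            child[1:-1] = np.round((child[:-2] + child[1:-1] + child[2:]) / 3)
            children.append(child)
        population = np.vstack([population[order[:elite]], children])
    return population[np.argmin([fitness(m, forward, observed) for m in population])]
```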

Relevance:

10.00%

Abstract:

This research analyzes the contribution of five input variables and optimizes the thermo-hydraulic performance of louvered-fin heat exchangers combined with delta-winglet vortex generators. The thermo-hydraulic performance of two distinct geometries, here named GEO1 and GEO2, was evaluated. Smoothing Spline ANOVA was used to assess the contribution of the input parameters to heat transfer and pressure drop. Considering automotive applications, Reynolds numbers of 120 and 240, based on the hydraulic diameter, were investigated. The results indicated that the louver angle is the largest contributor to the increase in friction factor for GEO1 and GEO2 at both Reynolds numbers. At the lower Reynolds number, the most important parameter for heat transfer was the louver angle for both geometries. At the higher Reynolds number, the angle of attack of the vortex generators positioned in the first row is the largest contributor to heat transfer for GEO1, whereas for GEO2 the angle of attack of the vortex generators in the first row was as important as the louver angles. Although the analyzed geometries can be regarded as compound heat transfer enhancement techniques, no relevant interactions between the louver angle and the vortex-generator parameters were observed. The optimization process uses NSGA-II (Non-Dominated Sorting Genetic Algorithm) combined with artificial neural networks. The results showed that adding vortex generators to GEO1 increased heat transfer by 21% and 23%, with pressure-drop increases of 24.66% and 36.67% for the lower and higher Reynolds numbers, respectively. For GEO2, heat transfer increased by 13% and 15%, with pressure-drop increases of 20.33% and 23.70% for the lower and higher Reynolds numbers, respectively. The solutions optimized for the Colburn factor showed that the heat transfer behind the first and second rows of vortex generators has the same order of magnitude at both Reynolds numbers. The flow patterns and heat transfer characteristics of the optimized solutions exhibited particular behaviors, different from those found when the two heat transfer enhancement techniques are applied separately.
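
As a hedged illustration of the multi-objective step, the sketch below implements the non-dominated sorting idea at the heart of NSGA-II for two competing objectives, maximizing the Colburn factor j and minimizing the friction factor f; the numbers are random placeholders, not results from this work.

```python
# Pareto (non-dominated) filtering of candidate designs on (j, f).
import numpy as np

def pareto_front(j_factor, f_factor):
    """Return indices of designs not dominated by any other design."""
    n = len(j_factor)
    nondominated = []
    for i in range(n):
        dominated = False
        for k in range(n):
            if (j_factor[k] >= j_factor[i] and f_factor[k] <= f_factor[i]
                    and (j_factor[k] > j_factor[i] or f_factor[k] < f_factor[i])):
                dominated = True
                break
        if not dominated:
            nondominated.append(i)
    return nondominated

rng = np.random.default_rng(1)
j = rng.uniform(0.005, 0.020, size=50)        # placeholder Colburn factors
f = 0.5 * j + rng.uniform(0.0, 0.01, 50)      # placeholder friction factors
print(pareto_front(j, f))
```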

Relevance:

10.00%

Abstract:

Several studies have analyzed discretionary accruals to address earnings-smoothing behaviors in the banking industry. We argue that the characteristic link between accruals and earnings may be nonlinear, since both the incentives to manipulate income and the practical way to do so depend partially on the relative size of earnings. Given a sample of 15,268 US banks over the period 1996–2011, the main results in this paper suggest that, depending on the size of earnings, bank managers tend to engage in earnings-decreasing strategies when earnings are negative (“big-bath”), use earnings-increasing strategies when earnings are positive, and use provisions as a smoothing device when earnings are positive and substantial (“cookie-jar” accounting). This evidence, which cannot be explained by the earnings-smoothing hypothesis, is consistent with the compensation theory. Neglecting nonlinear patterns in the econometric modeling of these accruals may lead to misleading conclusions regarding the characteristic strategies used in earnings management.
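
A minimal sketch with synthetic data (not the paper's bank sample or its exact specification) shows how a piecewise-linear spline in earnings, with assumed knots at zero and at a "substantial earnings" threshold, can capture slope changes that a single linear accrual-earnings term averages away.

```python
# Synthetic illustration of a nonlinear (piecewise-linear) accrual-earnings link.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
earnings = rng.normal(0.01, 0.02, 2000)

hinge = lambda x, knot: np.maximum(x - knot, 0.0)
X = np.column_stack([
    earnings,              # slope for negative earnings
    hinge(earnings, 0.0),  # slope change at zero (big-bath vs. income-increasing)
    hinge(earnings, 0.03), # slope change for substantial earnings (cookie-jar)
])

# Synthetic accruals generated with regime-dependent slopes plus noise.
accruals = (-0.5 * earnings + 1.2 * hinge(earnings, 0.0)
            - 0.9 * hinge(earnings, 0.03) + rng.normal(0, 0.005, earnings.size))

linear = sm.OLS(accruals, sm.add_constant(earnings)).fit()
spline = sm.OLS(accruals, sm.add_constant(X)).fit()
print(linear.rsquared, spline.rsquared)  # the nonlinear specification fits markedly better
```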

Relevance:

10.00%

Abstract:

Visual information is increasingly being used in a great number of applications to guide joint structures. This paper proposes an image-based controller that allows guidance of a joint structure when its number of degrees of freedom is greater than that required for the task. In this case, the controller resolves the redundancy by combining two different tasks: the primary task performs the guidance using image information, and the secondary task determines the most adequate posture of the joint structure, resolving the possible joint redundancy with respect to the task performed in image space. The proposed method also employs a smoothing Kalman filter both to detect the moments when abrupt changes occur in the tracked trajectory and to estimate and compensate for these changes. Furthermore, a direct visual control approach is proposed that integrates the visual information provided by this smoothing Kalman filter, which permits correct tracking when the measurements are noisy. All these contributions are integrated in an application that requires tracking the faces of children with Asperger syndrome.
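
A minimal sketch of a Kalman filter with an innovation-based abrupt-change test for one tracked image coordinate is shown below; the constant-velocity model, the noise values and the 3-sigma gate are assumptions for illustration, not the authors' controller or filter design.

```python
# 1-D constant-velocity Kalman filter with an innovation test for abrupt changes.
import numpy as np

def track(measurements, dt=1/30, q=1e-3, r=2.0, gate=3.0):
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    H = np.array([[1.0, 0.0]])              # only position is observed
    Q = q * np.eye(2)
    R = np.array([[r]])
    x = np.array([[measurements[0]], [0.0]])
    P = np.eye(2)
    filtered, jumps = [], []
    for k, z in enumerate(measurements):
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # innovation and abrupt-change test
        S = H @ P @ H.T + R
        nu = z - (H @ x)[0, 0]
        if abs(nu) > gate * np.sqrt(S[0, 0]):
            jumps.append(k)                 # abrupt change flagged at sample k
        # update
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K * nu
        P = (np.eye(2) - K @ H) @ P
        filtered.append(x[0, 0])
    return np.array(filtered), jumps
```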

Relevance:

10.00%

Abstract:

A MATLAB-based computer code has been developed for the simultaneous wavelet analysis and filtering of several environmental time series, particularly focused on the analysis of cave monitoring data. The continuous wavelet transform, the discrete wavelet transform and the discrete wavelet packet transform have been implemented to provide a fast and precise time–period examination of the time series at different period bands. Moreover, statistical methods to examine the relation between two signals have been included. Finally, entropy-of-curves and spline-based methods have also been developed for segmenting and modeling the analyzed time series. Together, these methods provide a user-friendly and fast program for environmental signal analysis, with useful, practical and understandable results.
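
A small Python analogue of the band-wise decomposition (using PyWavelets rather than the MATLAB code described above) is sketched below; the wavelet family, the number of levels and the synthetic hourly series are assumptions.

```python
# Split a monitoring series into period bands with the discrete wavelet transform.
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.arange(0, 365, 1 / 24)                                        # hourly samples, one year
signal = np.sin(2 * np.pi * t) + 0.2 * rng.standard_normal(t.size)   # daily cycle + noise

coeffs = pywt.wavedec(signal, wavelet='db4', level=6)

# Reconstruct the contribution of each band separately by zeroing the others.
bands = []
for i in range(len(coeffs)):
    kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    bands.append(pywt.waverec(kept, wavelet='db4')[: signal.size])
```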

Relevance:

10.00%

Abstract:

The Remez penalty and smoothing algorithm (RPSALG) is a unified framework for penalty and smoothing methods for solving min-max convex semi-infinite programming problems, whose convergence was analyzed in a previous paper by three of the authors. In this paper we consider a partial implementation of RPSALG for solving ordinary convex semi-infinite programming problems. Each iteration of RPSALG involves two types of auxiliary optimization problems: the first consists of obtaining an approximate solution of some discretized convex problem, while the second requires solving a non-convex optimization problem whose objective function is the parametric constraint, with the parameter as the variable. In this paper we tackle the latter problem with a variant of the cutting angle method called ECAM, a global optimization procedure for solving Lipschitz programming problems. We implement different variants of RPSALG and compare them with the only publicly available SIP solver, NSIPS, on a battery of test problems.
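
The two auxiliary problems can be illustrated with a hedged sketch of a plain discretization/exchange scheme for a toy convex SIP, with a dense grid search standing in for the ECAM global solver; this illustrates the problem structure only and is not the RPSALG implementation compared in the paper.

```python
# Toy convex SIP: min f(x) s.t. g(x, t) <= 0 for all t in T.
import numpy as np
from scipy.optimize import minimize

f = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2        # convex objective
g = lambda x, t: x[0] * np.cos(t) + x[1] * np.sin(t) - 1.0  # parametric constraint
T = np.linspace(0.0, 2 * np.pi, 2000)                       # discretized index set

x = np.array([2.0, 2.0])
active = [0.0]                                               # initial finite subset of T
for _ in range(20):
    cons = [{'type': 'ineq', 'fun': (lambda x, t=t: -g(x, t))} for t in active]
    x = minimize(f, x, constraints=cons, method='SLSQP').x   # discretized convex problem
    t_star = T[np.argmax(g(x, T))]                           # most violated parameter value
    if g(x, t_star) <= 1e-8:
        break                                                # feasible for all sampled t
    active.append(t_star)
print(x)   # converges to the projection of (2, 1) onto the unit disk
```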

Relevance:

10.00%

Abstract:

Through a number of practical examples, we argue that geometric knowledge and, in particular, knowledge of conic curves and their applications can strengthen designers' project work, lower hardware and software costs in teaching and in professional practice, reduce the need for sophisticated and expensive tools, and lessen the need for constant updating of technological resources and for software that requires specialized and, above all, lengthy training. Our aim is to contribute to the recognition of the importance of studying these curves and the surfaces they generate, especially in the teaching of Geometry in Design courses. Indeed, by systematizing knowledge that already exists in other fields, such as architecture and engineering, by deepening the adaptation of properties of the conics and of knowledge from areas such as analytic or projective geometry into the language of geometric constructions, and by contributing suggestions for new constructions, it is possible to develop the ability of designers and design students to solve problems in project work, in technical representation and in external communication with non-experts.

Relevance:

10.00%

Abstract:

The purpose of this thesis is to develop new methods for detecting change points and/or trends. After a brief theoretical introduction to splines, several change-point detection methods already available in the literature are presented. New change-point detection methods that use splines and Bayesian statistics are then introduced. In addition, to clarify where the Bayesian method comes from, an introduction to Bayesian theory is provided. Using simulations, we compare the power of all these methods. Still using simulations, a more in-depth analysis of the most effective new method is carried out. This method is then applied to real data. A brief conclusion summarizes the thesis.
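
As a hedged, non-Bayesian sketch of the spline idea (not the thesis's actual procedure), the code below detects a single break by fitting a linear spline with a candidate knot at each time point and comparing the best such fit against a straight line via BIC; the data are synthetic.

```python
# Single change-point detection via a linear spline with a data-driven knot.
import numpy as np

def bic(y, fitted, n_params):
    n = y.size
    rss = np.sum((y - fitted) ** 2)
    return n * np.log(rss / n) + n_params * np.log(n)

def detect_break(t, y):
    X0 = np.column_stack([np.ones_like(t), t])
    base = X0 @ np.linalg.lstsq(X0, y, rcond=None)[0]
    best_bic, best_knot = bic(y, base, 2), None          # None means "no break"
    for knot in t[2:-2]:
        X = np.column_stack([np.ones_like(t), t, np.maximum(t - knot, 0.0)])
        fit = X @ np.linalg.lstsq(X, y, rcond=None)[0]
        b = bic(y, fit, 3)
        if b < best_bic:
            best_bic, best_knot = b, knot
    return best_knot

rng = np.random.default_rng(0)
t = np.arange(200.0)
y = 0.02 * t + 0.08 * np.maximum(t - 120, 0) + rng.normal(0, 0.5, t.size)
print(detect_break(t, y))   # should report a knot near t = 120
```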

Relevance:

10.00%

Abstract:

Countries in a monetary union can adjust to shocks either through internal or external mechanisms. We quantitatively assess for the European Union a number of relevant mechanisms suggested by Mundell’s optimal currency area theory, and compare them to the United States. For this purpose, we update a number of empirical analyses in the economic literature that identify (1) the size of asymmetries across countries and (2) the magnitude of insurance mechanisms relative to similar mechanisms and compare results for the European Monetary Union (EMU) with those obtained for the US. To study the level of synchronization between EMU countries we follow Alesina et al. (2002) and Barro and Tenreyro (2007). To measure the effect of an employment shock on employment levels, unemployment rates and participation rates we perform an analysis based on Blanchard and Katz (1992) and Decressin and Fatas (1995). We measure consumption smoothing through capital markets, fiscal transfers and savings, using the approach by Asdrubali et al. (1996) and Afonso and Furceri (2007). To analyze risk sharing through a common safety net for banks we perform a rudimentary simulation analysis.

Relevance:

10.00%

Abstract:

In order to stabilise and improve their income situation, rural households are strongly encouraged to diversify their activities both within and outside the agricultural sector. Often, however, this advice is only moderately pursued. This paper addresses issues of rural household income diversification in the case of Poland. It investigates returns from rural household income strategies using propensity score matching methods and extensive datasets spanning 1998-2008. Results suggest that returns from combining farm and off-farm activities were lower than returns from concentrating on farming or on self-employment outside agriculture. This differential is stable over time although returns from diversification have relatively improved after Poland’s accession to the EU. This is also visible in the fact that since 2006 returns from combining farm and off-farm activities have evened with returns from relying solely on hired off-farm labour, thus smoothing the difference observed before the accession. Further, over the analysed period, households pursuing the diversification strategy performed better than those relying solely on unearned income. Finally, in general, the income in households combining farm and off-farm activities was higher than in those combining two off-farm income sources.
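
A self-contained sketch of propensity score matching on synthetic data (not the Polish household surveys used in the paper) is given below: a logistic propensity model, one-to-one nearest-neighbour matching on the score, and a simple mean comparison for the matched sample; all variable names and numbers are placeholders.

```python
# Propensity score matching on synthetic household data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 2000
covariates = rng.normal(size=(n, 3))                   # e.g. farm size, education, age
treated = rng.binomial(1, 1 / (1 + np.exp(-covariates @ [0.8, -0.5, 0.3])))
income = 10 + covariates @ [1.0, 0.5, 0.2] + 0.4 * treated + rng.normal(0, 1, n)

score = LogisticRegression().fit(covariates, treated).predict_proba(covariates)[:, 1]
controls = np.flatnonzero(treated == 0)
nn = NearestNeighbors(n_neighbors=1).fit(score[controls].reshape(-1, 1))
_, idx = nn.kneighbors(score[treated == 1].reshape(-1, 1))
matched_controls = controls[idx.ravel()]

att = income[treated == 1].mean() - income[matched_controls].mean()
print(f"estimated return to diversification (ATT): {att:.2f}")
```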

Relevance:

10.00%

Abstract:

The policy of the European Union (EU) towards Taiwan has mostly been analysed either as a by-product of EU-China relations or with reference to the general lack of a European geopolitical approach towards East Asia. By adopting a lobbying approach which focusses on Taiwan’s different ‘channels of influence’ within the complex European foreign policy system in Brussels, this study provides new insights into the functioning of EU-Taiwan relations. It also sheds new light on the implications of the radical change in Taiwanese diplomacy after 2008, when Chen Shui-bian’s assertive and identity-based diplomacy was replaced with the Kuomintang’s new dogma of ‘workable diplomacy’. Based on semi-guided interviews with Taiwanese and European actors, this paper examines why Taiwanese lobbying in Brussels, albeit very active and professional, is not salient enough to meet the challenges arising from the overwhelming Chinese competition and from the increasing proliferation of regional trade agreements – with active EU participation – in the Asia-Pacific region. It argues that the pragmatic ‘workable diplomacy’ approach, while smoothing out working-level relations between Taiwan and the EU, fails to attract a sufficient degree of political and public attention in Europe to the Taiwan question and thus fosters the neglect of Taiwan by European foreign policy-makers. The main challenge faced by Taiwanese diplomacy, however, is not simply one of convincing through technical arguments, but one of agenda setting, that is, of redefining European priorities in Taiwan’s favour.

Relevance:

10.00%

Abstract:

nlcheck is a simple diagnostic tool that can be used after fitting a model to quickly check the linearity assumption for a given predictor. nlcheck categorizes the predictor into bins, refits the model including dummy variables for the bins, and then performs a joint Wald test for the added parameters. Alternatively, nlcheck can use linear splines for the adaptive model. Support for discrete variables is also provided. Optionally, nlcheck also displays a graph of the adjusted linear predictions from the original model and the adaptive model.
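
A rough Python analogue of the binning-and-joint-test idea (this is not the Stata nlcheck command itself) could look like the sketch below, using quantile bins and an F-test for the added dummies; the dataset is synthetic and the helper name nl_check is hypothetical.

```python
# Linearity check: compare a linear fit with a fit that adds bin dummies.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def nl_check(df, outcome, predictor, n_bins=5):
    df = df.copy()
    df['_bin'] = pd.qcut(df[predictor], q=n_bins, labels=False, duplicates='drop')
    restricted = smf.ols(f'{outcome} ~ {predictor}', data=df).fit()
    adaptive = smf.ols(f'{outcome} ~ {predictor} + C(_bin)', data=df).fit()
    f_stat, p_value, _ = adaptive.compare_f_test(restricted)  # joint test of added terms
    return f_stat, p_value

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 500)
data = pd.DataFrame({'x': x, 'y': np.log1p(x) + rng.normal(0, 0.1, x.size)})
print(nl_check(data, 'y', 'x'))   # small p-value: the relation is not linear in x
```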

Relevance:

10.00%

Abstract:

1. Recent studies demonstrated the sensitivity of northern forest ecosystems to changes in the amount and duration of snow cover at annual to decadal time scales. However, the consequences of snowfall variability remain uncertain for ecological variables operating at longer time scales, especially the distributions of forest communities. 2. The Great Lakes region of North America offers a unique setting to examine the long-term effects of variable snowfall on forest communities. Lake-effect snow produces a three-fold gradient in annual snowfall over tens of kilometres, and dramatic edaphic variations occur among landform types resulting from Quaternary glaciations. We tested the hypothesis that these factors interact to control the distributions of mesic (dominated by Acer saccharum, Tsuga canadensis and Fagus grandifolia) and xeric forests (dominated by Pinus and Quercus spp.) in northern Lower Michigan. 3. We compiled pre-European-settlement vegetation data and overlaid these data with records of climate, water balance and soil, onto Landtype Association polygons in a geographical information system. We then used multivariate adaptive regression splines to model the abundance of mesic vegetation in relation to environmental controls. 4. Snowfall is the most predictive among the five variables retained by our model, and it affects model performance 29% more than soil texture, the second most important variable. The abundance of mesic trees is high on fine-textured soils regardless of snowfall, but it increases with snowfall on coarse-textured substrates. Lake-effect snowfall also determines the species composition within mesic forests. The weighted importance of A. saccharum is significantly greater than that of T. canadensis or F. grandifolia within the lake-effect snowbelt, whereas T. canadensis is more plentiful outside the snowbelt. These patterns are probably driven by the influence of snowfall on soil moisture, nutrient availability and fire return intervals. 5. Our results imply that a key factor dictating the spatio-temporal patterns of forest communities in the vast region around the Great Lakes is how the lake-effect snowfall regime responds to global change. Snowfall reductions will probably cause a major decrease in the abundance of ecologically and economically important species, such as A. saccharum.
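
The hinge-function basis that multivariate adaptive regression splines build can be illustrated with the hedged sketch below; the knot at 180 cm, the variables and the synthetic response are placeholders, not the fitted model from this study.

```python
# MARS-style basis: piecewise-linear hinges and an interaction, fitted by least squares.
import numpy as np

rng = np.random.default_rng(0)
snowfall = rng.uniform(100, 350, 500)          # annual snowfall (placeholder range, cm)
sand_frac = rng.uniform(0.2, 0.9, 500)         # coarseness of the soil (placeholder)

# Synthetic response: mesic abundance rises with snowfall mainly on coarse soils.
abundance = (0.4 + 0.002 * np.maximum(snowfall - 180, 0) * sand_frac
             + rng.normal(0, 0.05, 500))

basis = np.column_stack([
    np.ones_like(snowfall),
    np.maximum(snowfall - 180, 0),               # hinge: (snowfall - 180)+
    np.maximum(180 - snowfall, 0),               # mirrored hinge: (180 - snowfall)+
    np.maximum(snowfall - 180, 0) * sand_frac,   # interaction with soil coarseness
])
coef, *_ = np.linalg.lstsq(basis, abundance, rcond=None)
print(coef)
```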

Relevance:

10.00%

Abstract:

Modeling of self-similar traffic is performed for a queuing system of G/M/1/K type using the Weibull distribution. To study the self-similar traffic, a simulation model is developed with the SIMULINK package in the MATLAB environment, and the traffic is approximated on the basis of spline functions. Modeling is carried out for a QS of W/M/1/K type with the Weibull distribution; the initial data are a Hurst parameter H = 0.65, a shape parameter of the distribution curve α ≈ 0.7 and a distribution parameter β ≈ 0.0099. Considering that self-similar traffic is characterized by the presence of "splashes" (bursts) and long-term dependence between request arrival times, under the given initial data it is reasonable to use linear interpolation splines.
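
A minimal sketch consistent with the setup described above: Weibull-distributed inter-arrival times with the stated shape, and a linear interpolation spline through the arrival-count curve; the scale convention for β and the sampling grid are assumptions.

```python
# Bursty (heavy-tailed) arrivals from a Weibull distribution, then linear interpolation.
import numpy as np

shape = 0.7              # alpha from the abstract
scale = 1 / 0.0099       # assumed scale convention for the stated beta
rng = np.random.default_rng(0)

inter_arrivals = scale * rng.weibull(shape, size=10_000)   # heavy-tailed gaps produce bursts
arrival_times = np.cumsum(inter_arrivals)
counts = np.arange(1, arrival_times.size + 1)

# Linear interpolation spline of the arrival-count curve on a regular grid.
grid = np.linspace(0, arrival_times[-1], 5000)
interpolated = np.interp(grid, arrival_times, counts)
```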