972 results for Classical methods


Relevance:

60.00%

Publisher:

Abstract:

An analytical method is developed for solving an inverse problem for Helmholtz's equation associated with two semi-infinite incompressible fluids of different variable refractive indices, separated by a plane interface. The unknowns of the inverse problem are: (i) the refractive indices of the two fluids, (ii) the ratio of the densities of the two fluids, and (iii) the strength of an acoustic source assumed to be situated at the interface of the two fluids. These are determined from the pressure on the interface produced by the acoustic source. The effect of the surface tension force at the interface is taken into account in this paper. The application of the proposed analytical method to solve the inverse problem is also illustrated with several examples. In particular, exact solutions of two direct problems are first derived using standard classical methods which are then used in our proposed inverse method to recover the unknowns of the corresponding inverse problems. The results are found to be in excellent agreement.
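The governing equations can be sketched schematically. The forms below are the standard acoustic ones and are an assumption here; the paper's exact interface condition, which incorporates surface tension, is not reproduced:

```latex
% Helmholtz equations in the two half-spaces, with variable
% refractive indices n_1(\mathbf{x}) and n_2(\mathbf{x}):
\nabla^2 p_j + k^2\, n_j^2(\mathbf{x})\, p_j = 0, \qquad j = 1, 2.
% Continuity of normal velocity across the plane interface z = 0
% brings in the density ratio \rho_1/\rho_2:
\frac{1}{\rho_1}\frac{\partial p_1}{\partial z}
  = \frac{1}{\rho_2}\frac{\partial p_2}{\partial z}
  \quad \text{on } z = 0.
```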

Relevance:

60.00%

Publisher:

Abstract:

Despite the expenditure of huge amounts of money and human effort, the Green Revolution has largely failed to benefit the vast majority of the rural poor in Africa: smallholding farmers who sell little, if any, of what they grow and rely almost entirely on natural soil fertility, rainfall, and traditional broodstock and seed varieties. New approaches to food production and income generation in rural areas must be found if this sector of the agricultural community is to be assisted. Integrated resources management (IRM) in general, and integrated agriculture-aquaculture (IAA) in particular, may offer some solutions in cases where the classical methods of improving farm output have failed and/or proved unsustainable.

Relevance:

60.00%

Publisher:

Abstract:

The analysis of foundations under dynamic loading is ever-present in industrial projects. It is a field little explored in geotechnical engineering, and relatively little information is available in Brazil in general. The most common way of performing these analyses is to simplify the structural model through the use of springs. It is known that these reaction coefficients vary considerably and that this design approach can, in some cases, prove unsafe or lead to unnecessary overdesign. There is therefore a need for a more rigorous evaluation using soil-structure interaction, in which the springs commonly used in conventional vibration analyses are replaced by the real stiffness of the soil treated as a continuum, discretized by the finite element method. This dissertation analyzes the problem using the dynamics module of the Plaxis 2D program. In this type of analysis, besides modeling the soil as a continuum, it becomes possible to introduce boundary conditions specific to the problem under study and multiple soil layers, whether horizontal or inclined, as well as absorbing boundaries capable of preventing the spurious reflection of waves incident on the limits of the finite element mesh, thereby modeling the loss of energy by radiation more adequately. This dissertation compares experimental measurements and efficient solutions from classical vibration methods with the response obtained by the FEM, showing quite satisfactory results for both the classical methods and the FEM.
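The spring-coefficient simplification described above can be illustrated with a classical closed-form spring: Lysmer's analog for vertical vibration of a rigid circular foundation on an elastic halfspace. All numerical inputs below are hypothetical, not values from the dissertation:

```python
import math

def vertical_spring_stiffness(G, r0, nu):
    """Lysmer's analog spring for vertical vibration of a rigid
    circular foundation on an elastic halfspace:
    K_z = 4 * G * r0 / (1 - nu).

    G  : soil shear modulus (Pa)
    r0 : foundation radius (m)
    nu : Poisson's ratio of the soil
    """
    return 4.0 * G * r0 / (1.0 - nu)

def natural_frequency_hz(K, m):
    """Undamped natural frequency of the equivalent SDOF system."""
    return math.sqrt(K / m) / (2.0 * math.pi)

# hypothetical machine foundation: G = 50 MPa, radius 1.5 m,
# nu = 0.3, total vibrating mass 40 t
K = vertical_spring_stiffness(50e6, 1.5, 0.3)  # N/m
f = natural_frequency_hz(K, 40e3)              # Hz
```

The FEM approach discussed in the abstract replaces exactly this kind of single spring constant with the discretized continuum stiffness of the soil.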

Relevance:

60.00%

Publisher:

Abstract:

In many engineering situations, empirical design formulations based on field data and professional experience are used, which lends a markedly subjective character to the standard design methodology. This research addresses the various methods for obtaining the loads generated in buried pipes subjected to dynamic and static loads, and their subsequent reassessment through numerical modeling with the Plaxis 3D program. The non-conventional analytical methods were compared with the standard calculation method, which proved to have good accuracy even without considering other important factors such as the strength contribution from soil cohesion and its deformability. The numerical modeling demonstrated the conservatism of the Marston method and the under-design of the prism load-spreading approach due to local effects caused by the adoption of minimum cover and high dynamic surcharge. It was also observed, through 3D modeling, that the combined use of the two classical methods yields results within reason. It was further verified, as a result of this research, that the proposed modified classical method allows a better approximation of the load reaching the pipe.
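The two classical loads compared above admit a compact sketch. The trench formula is Marston's textbook form; the value of K·μ′ and all numerical inputs are illustrative assumptions, not values from the dissertation:

```python
import math

def marston_trench_load(gamma, B_d, H, K_mu=0.13):
    """Marston load per unit length of pipe for the trench condition:
    W = C_d * gamma * B_d**2, with
    C_d = (1 - exp(-2*K_mu*H/B_d)) / (2*K_mu).

    gamma : soil unit weight (kN/m^3)
    B_d   : trench width at the top of the pipe (m)
    H     : height of fill above the pipe crown (m)
    K_mu  : product of the lateral earth-pressure ratio K and the
            wall-friction coefficient mu' (0.13 is a textbook value
            for saturated clay; site-specific values differ)
    """
    C_d = (1.0 - math.exp(-2.0 * K_mu * H / B_d)) / (2.0 * K_mu)
    return C_d * gamma * B_d ** 2  # kN per metre of pipe

def prism_load(gamma, B_d, H):
    """Prism (soil-column) load, which ignores trench-wall friction."""
    return gamma * B_d * H

# hypothetical trench: 18 kN/m^3 soil, 1.0 m wide, 3.0 m of cover
w_marston = marston_trench_load(18.0, 1.0, 3.0)
w_prism = prism_load(18.0, 1.0, 3.0)
```

For the same geometry the trench formula is bounded above by the prism load, since wall friction carries part of the soil column; the abstract's comparison against Plaxis 3D concerns how each idealization fares against the full numerical model.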

Relevance:

60.00%

Publisher:

Abstract:

Porphyrin metabolic disruption from exposure to xenobiotic contaminants such as heavy metals, dioxins, and aromatic hydrocarbons can elicit overproduction of porphyrins. Measurement of porphyrin levels, when used in conjunction with other diagnostic assays, can help elucidate an organism's physiological condition and provide evidence for exposure to certain toxicants. A sensitive microplate fluorometric assay has been optimized for detecting total porphyrin levels in detergent-solubilized protein extracts from symbiotic, dinoflagellate-containing cnidarian tissues. The denaturing buffer used in this modified assay contains a number of potentially interfering components (e.g., sodium dodecyl sulfate (SDS), dithiothreitol (DTT), protease inhibitors, and chlorophyll from the symbiotic zooxanthellae), which required examination and validation. The buffer components were examined and validated for use in this porphyrin assay, while the use of a specific spectrofluorometric filter (excitation 400 ± 15 nm; emission 600 ± 20 nm) minimized chlorophyll interference. The detection limit for this assay is 10 fmol of total porphyrin per μg of total soluble protein, and linearity is maintained up to 5000 fmol. The ability to measure total porphyrins in an SDS protein extract now allows a single extract to be used in multiple assays. This is an advantage over classical methods, particularly when tissue samples are limiting, as is often the case with coral due to availability and collection permit restrictions.

Relevance:

60.00%

Publisher:

Abstract:

Copyright © 2014 by the International Machine Learning Society (IMLS). All rights reserved. Classical methods such as Principal Component Analysis (PCA) and Canonical Correlation Analysis (CCA) are ubiquitous in statistics. However, these techniques are only able to reveal linear relationships in data. Although nonlinear variants of PCA and CCA have been proposed, these are computationally prohibitive at large scale. In a separate strand of recent research, randomized methods have been proposed to construct features that help reveal nonlinear patterns in data. For basic tasks such as regression or classification, random features exhibit little or no loss in performance, while achieving drastic savings in computational requirements. In this paper we leverage randomness to design scalable new variants of nonlinear PCA and CCA; our ideas extend to key multivariate analysis tools such as spectral clustering or LDA. We demonstrate our algorithms through experiments on real-world data, on which we compare against the state-of-the-art. A simple R implementation of the presented algorithms is provided.
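A minimal sketch of the general random-features recipe: random Fourier features that approximate an RBF kernel, followed by ordinary linear PCA in the lifted space. This mirrors the idea behind randomized nonlinear PCA, not the paper's exact algorithm, and all names below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_fourier_features(X, n_features=200, gamma=1.0):
    """Map X into a space where inner products approximate the RBF
    kernel exp(-gamma * ||x - y||^2) (the Rahimi-Recht construction)."""
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

def randomized_nonlinear_pca(X, k=2, n_features=200, gamma=1.0):
    """Approximate kernel PCA: lift the data with random features,
    then run plain linear PCA (via SVD) in the lifted space."""
    Z = random_fourier_features(X, n_features, gamma)
    Z = Z - Z.mean(axis=0)              # centre in feature space
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    return Z @ Vt[:k].T                 # scores on the top-k components

X = rng.normal(size=(100, 5))
scores = randomized_nonlinear_pca(X, k=2)
```

The computational appeal is that the lift is a single matrix product, so the overall cost stays linear in the number of samples, unlike exact kernel PCA.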

Relevance:

60.00%

Publisher:

Abstract:

The complete internal transcribed spacer 1 (ITS1), 5.8S ribosomal DNA, and ITS2 region of the ribosomal DNA from 60 specimens belonging to two closely related bucephalid digeneans (Dollfustrema vaneyi and Dollfustrema hefeiensis) from different localities, hosts, and microhabitat sites were cloned to examine the level of sequence variation and to assess the utility of these markers in species identification and phylogeny estimation. Our data show that these molecular markers can help discriminate the two species, which are morphologically very close and difficult to separate by classical methods. We found 21 haplotypes defined by 44 polymorphic positions in 38 individuals of D. vaneyi, and 16 haplotypes defined by 43 polymorphic positions in 22 individuals of D. hefeiensis. There are no shared haplotypes between the two species. Haplotype diversity, but not nucleotide diversity, is similar between the two species. Phylogenetic analyses reveal two robustly supported clades, one corresponding to D. vaneyi and the other to D. hefeiensis. However, the population structures of the two species appear incongruent and show no geographic or host-specific structure, further indicating that the two species may have had a more complex evolutionary history than expected.

Relevance:

60.00%

Publisher:

Abstract:

© 2015 John P. Cunningham and Zoubin Ghahramani. Linear dimensionality reduction methods are a cornerstone of analyzing high-dimensional data, due to their simple geometric interpretations and typically attractive computational properties. These methods capture many data features of interest, such as covariance, dynamical structure, correlation between data sets, input-output relationships, and margin between data classes. Methods have been developed with a variety of names and motivations in many fields, and perhaps as a result the connections between all these methods have not been highlighted. Here we survey methods from this disparate literature as optimization programs over matrix manifolds. We discuss principal component analysis, factor analysis, linear multidimensional scaling, Fisher's linear discriminant analysis, canonical correlations analysis, maximum autocorrelation factors, slow feature analysis, sufficient dimensionality reduction, undercomplete independent component analysis, linear regression, distance metric learning, and more. This optimization framework gives insight into some rarely discussed shortcomings of well-known methods, such as the suboptimality of certain eigenvector solutions. Modern techniques for optimization over matrix manifolds enable a generic linear dimensionality reduction solver, which accepts as input the data and an objective to be optimized, and returns as output an optimal low-dimensional projection of the data. This simple optimization framework further allows straightforward generalizations and novel variants of classical methods, which we demonstrate here by creating an orthogonal-projection canonical correlations analysis. More broadly, this survey and generic solver suggest that linear dimensionality reduction can move toward becoming a blackbox, objective-agnostic numerical technology.
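The manifold-optimization view can be sketched for the simplest case: PCA as maximizing tr(MᵀCM) over matrices M with orthonormal columns (the Stiefel manifold), solved here by gradient ascent with a QR retraction. This is an illustrative sketch, not the survey's generic solver:

```python
import numpy as np

rng = np.random.default_rng(0)

def pca_on_stiefel(X, k=2, steps=300, lr=0.5):
    """PCA as an optimization program over a matrix manifold:
    maximize tr(M^T C M) subject to M^T M = I, via Euclidean
    gradient ascent followed by a QR retraction onto the manifold."""
    Xc = X - X.mean(axis=0)
    C = Xc.T @ Xc / len(Xc)              # sample covariance
    M, _ = np.linalg.qr(rng.normal(size=(C.shape[0], k)))
    for _ in range(steps):
        M = M + lr * (2.0 * C @ M)       # gradient of tr(M^T C M)
        M, _ = np.linalg.qr(M)           # retract to orthonormal columns
    return M

X = rng.normal(size=(200, 6))
M = pca_on_stiefel(X, k=2)

# the recovered subspace should match the top-2 eigenvectors of C
C = np.cov((X - X.mean(axis=0)).T, bias=True)
V = np.linalg.eigh(C)[1][:, ::-1][:, :2]
overlap = np.linalg.norm(V.T @ M)        # sqrt(2) when subspaces coincide
```

Swapping the objective (e.g. for CCA or LDA) while keeping the same retraction is exactly the kind of genericity the survey's framework highlights.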

Relevance:

60.00%

Publisher:

Abstract:

In addition to the classical methods that have frequently been used for interpolating the spatial patterns of soil properties, namely kriging, Inverse Distance Weighting (IDW) and splines, a relatively more accurate surface modelling technique has been developed in recent years: high accuracy surface modelling (HASM). It has been used in numerical tests, DEM construction, and the interpolation of climate and ecosystem changes. In this paper, HASM was applied to interpolate soil pH in a red soil region of Jiangxi Province, China, to assess its feasibility for soil property interpolation. Soil pH was measured on 150 samples of topsoil (0-20 cm) for the interpolation and for comparing the performance of HASM, kriging, IDW and splines. The mean errors (MEs) of the interpolations indicate little bias in the interpolation of soil pH by any of the four techniques. HASM has a smaller mean absolute error (MAE) and root mean square error (RMSE) than kriging, IDW and splines. HASM remains the most accurate when the mean rank and the standard deviation of the ranks are used to avoid outlier effects in assessing the prediction performance of the four methods. Therefore, HASM can be considered an accurate alternative for interpolating soil properties. Further research is needed to combine HASM with ancillary variables to improve interpolation performance and to develop a user-friendly algorithm that can be implemented in a GIS package. (C) 2009 Elsevier B.V. All rights reserved.
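Of the four interpolators compared, IDW is the simplest to sketch, together with the ME/MAE/RMSE metrics the paper uses. The sample coordinates and pH values below are synthetic stand-ins, not the Jiangxi data:

```python
import numpy as np

rng = np.random.default_rng(1)

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    """Inverse Distance Weighting: each prediction is a weighted mean
    of the known values, with weights proportional to 1/d**power."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)
    return (w @ z_known) / w.sum(axis=1)

# synthetic stand-in for 150 topsoil pH samples: a gentle spatial
# trend plus noise, on coordinates in a 100 m x 100 m plot
xy = rng.uniform(0.0, 100.0, size=(150, 2))
ph = 5.0 + 0.01 * xy[:, 0] + rng.normal(scale=0.1, size=150)

# hold out 30 samples to compute the error metrics used in the paper
train_xy, test_xy = xy[:120], xy[120:]
train_z, test_z = ph[:120], ph[120:]
pred = idw(train_xy, train_z, test_xy)

me = np.mean(pred - test_z)                    # mean error (bias)
mae = np.mean(np.abs(pred - test_z))           # mean absolute error
rmse = np.sqrt(np.mean((pred - test_z) ** 2))  # root mean square error
```

The same held-out metrics can be computed for any interpolator, which is how the four methods are ranked in the study.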

Relevance:

60.00%

Publisher:

Abstract:

The purpose of this project is to present selected violin pieces by Paul Hindemith (1895-1963) against a backdrop of the diverse styles and traditions that he integrated in his music. For this dissertation project, selected violin sonatas by Hindemith were performed in three recitals alongside pieces by other German and Austro-German composers; the recitals were also recorded for archival purposes. The first recital, performed with pianist David Ballena on December 10, 2005, in Gildenhorn Recital Hall at the University of Maryland, College Park, included the Violin Sonata Op. 11, No. 1 (1918) by Paul Hindemith, the Sonatina in D Major, Op. 137 (1816) by Franz Schubert, and the Sonata in E-flat Major, Op. 18 (1887) by Richard Strauss. The second recital, performed with pianist David Ballena on May 9, 2006, in Gildenhorn Recital Hall at the University of Maryland, included the Sonata in E Minor, KV 304 (1778) by Wolfgang Amadeus Mozart, the Sonata in E (1935) by Paul Hindemith, the Romance for Violin and Orchestra No. 1 in G Major (1800-1802) by Ludwig van Beethoven, and the Sonata for Violin and Piano in A Minor, Op. 105 (1851) by Robert Schumann. The third recital, performed with David Ballena and Kai-Ching Chang on November 10, 2006, in Ulrich Recital Hall at the University of Maryland, included the Violin Sonata Op. 12, No. 1 in D Major (1798) by Ludwig van Beethoven, the Sonata for Violin and Harpsichord No. 4 in C Minor, BWV 1017 (1720) by J.S. Bach, and the Violin Sonata Op. 11, No. 2 (1918) by Paul Hindemith. For each dissertation recital, I chose a piece by Hindemith as the core of the program and then selected pieces by other composers with a similar key, texture, number of movements, or mood. Although his pieces used some classical methods of composition, Hindemith added his own distinct style: an extension of chromaticism, prominent use of the interval of the fourth, chromatic alteration of diatonic scale degrees, and non-traditional cadences.
Hindemith left behind a legacy of multi-dimensional, innovative music capable of expressing both the old and the new aesthetics.

Relevance:

60.00%

Publisher:

Abstract:

In a recent Letter to the Editor (J Rao, D Delande and K T Taylor 2001 J. Phys. B: At. Mol. Opt. Phys. 34 L391-9) we made a brief first report of our quantal and classical calculations for the hydrogen atom in crossed electric and magnetic fields at constant scaled energy and constant scaled electric field strength. A principal point of that communication was our statement that each and every peak in the Fourier transform of the scaled quantum photo-excitation spectrum, for scaled energy value ε = -0.586 538 871028 43 and scaled electric field value f̃ = 0.068 537 846 207 618 71, could be identified with the scaled action value of a found and mapped-out closed orbit up to a scaled action of 20. In this follow-up paper, besides presenting full details of our quantum and classical methods, we set out the scaled action values of all 317 closed orbits involved, together with the geometries of many.

Relevance:

60.00%

Publisher:

Abstract:

Measuring the structural similarity of graphs is a challenging and outstanding problem. Most classical approaches, the so-called exact graph matching methods, are based on graph or subgraph isomorphism relations between the underlying graphs. In contrast to these methods, in this paper we introduce a novel approach to measuring the structural similarity of directed and undirected graphs that is mainly based on margins of feature vectors representing the graphs. We introduce novel graph similarity and dissimilarity measures, provide some of their properties, and analyze their algorithmic complexity. We find that the computational complexity of our measures is polynomial in the graph size and, hence, significantly better than that of classical methods such as exact graph matching, which are NP-complete. Numerically, we provide some examples of our measure and compare the results with the well-known graph edit distance. (c) 2006 Elsevier Inc. All rights reserved.
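The flavour of a polynomial-time, feature-vector-based graph comparison can be sketched with a degree-histogram feature. This is an illustrative stand-in for the paper's margin-based measures, with hypothetical names throughout:

```python
from collections import Counter

def degree_histogram(edges, n):
    """Feature vector: hist[d] = number of vertices with degree d.
    Computable in O(|E| + n) -- polynomial, unlike isomorphism tests."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    hist = [0] * n
    for node in range(n):
        hist[deg[node]] += 1
    return hist

def graph_dissimilarity(edges1, n1, edges2, n2):
    """L1 distance between padded degree histograms. A score of 0
    does NOT imply isomorphism, but identical graphs always score 0."""
    h1, h2 = degree_histogram(edges1, n1), degree_histogram(edges2, n2)
    m = max(len(h1), len(h2))
    h1 += [0] * (m - len(h1))
    h2 += [0] * (m - len(h2))
    return sum(abs(a - b) for a, b in zip(h1, h2))

# a 3-vertex path P3 versus a triangle C3
path = [(0, 1), (1, 2)]
tri = [(0, 1), (1, 2), (2, 0)]
d_same = graph_dissimilarity(path, 3, path, 3)  # 0
d_diff = graph_dissimilarity(path, 3, tri, 3)   # 4
```

Like the paper's measures, this is cheap to compute but only a pseudo-metric on graphs; the graph edit distance mentioned above is strictly more discriminative and far more expensive.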

Relevance:

60.00%

Publisher:

Abstract:

Doctoral Thesis, Neurology, Faculdade de Medicina, Universidade de Lisboa, 2014

Relevance:

60.00%

Publisher:

Abstract:

This paper presents a mechanically verified implementation of an algorithm for deciding the equivalence of Kleene algebra terms within the Coq proof assistant. The algorithm decides equivalence of two given regular expressions through an iterated process of testing the equivalence of their partial derivatives and does not require the construction of the corresponding automata. Recent theoretical and experimental research provides evidence that this method is, on average, more efficient than the classical methods based on automata. We present some performance tests, comparisons with similar approaches, and also introduce a generalization of the algorithm to decide the equivalence of terms of Kleene algebra with tests. The motivation for the work presented in this paper is to use the developed libraries as trusted frameworks for carrying out certified program verification.
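The derivative-based decision procedure can be sketched in miniature. This sketch uses Brzozowski derivatives with light simplification rather than the paper's Antimirov partial derivatives, is not mechanically verified, and the naive loop is only guaranteed to terminate on small examples like those below:

```python
# A regex is a nested tuple: ('empty',), ('eps',), ('chr', c),
# ('cat', r, s), ('alt', r, s), ('star', r).
EMPTY, EPS = ('empty',), ('eps',)

def char(c): return ('chr', c)
def cat(r, s): return ('cat', r, s)
def alt(r, s): return ('alt', r, s)
def star(r): return ('star', r)

def nullable(r):
    """Does r accept the empty word?"""
    tag = r[0]
    if tag in ('eps', 'star'): return True
    if tag in ('empty', 'chr'): return False
    if tag == 'cat': return nullable(r[1]) and nullable(r[2])
    return nullable(r[1]) or nullable(r[2])  # alt

def simp(r):
    """Light algebraic simplification to keep derivatives small."""
    tag = r[0]
    if tag == 'cat':
        a, b = simp(r[1]), simp(r[2])
        if EMPTY in (a, b): return EMPTY
        if a == EPS: return b
        if b == EPS: return a
        return ('cat', a, b)
    if tag == 'alt':
        a, b = simp(r[1]), simp(r[2])
        if a == EMPTY: return b
        if b == EMPTY: return a
        return a if a == b else ('alt', a, b)
    if tag == 'star':
        a = simp(r[1])
        return EPS if a in (EMPTY, EPS) else ('star', a)
    return r

def deriv(r, c):
    """Brzozowski derivative of r with respect to character c."""
    tag = r[0]
    if tag in ('empty', 'eps'): return EMPTY
    if tag == 'chr': return EPS if r[1] == c else EMPTY
    if tag == 'alt': return alt(deriv(r[1], c), deriv(r[2], c))
    if tag == 'star': return cat(deriv(r[1], c), r)
    # cat: derive the head; if it is nullable, also derive the tail
    d = cat(deriv(r[1], c), r[2])
    return alt(d, deriv(r[2], c)) if nullable(r[1]) else d

def equivalent(r, s, alphabet):
    """Bisimulation over pairs of simplified derivatives: r and s
    differ iff some reachable pair disagrees on nullability."""
    seen, todo = set(), [(simp(r), simp(s))]
    while todo:
        a, b = todo.pop()
        if (a, b) in seen: continue
        seen.add((a, b))
        if nullable(a) != nullable(b): return False
        for c in alphabet:
            todo.append((simp(deriv(a, c)), simp(deriv(b, c))))
    return True

a = char('a')
ok_pos = equivalent(cat(a, star(a)), cat(star(a), a), 'a')  # both denote a+
ok_neg = equivalent(a, cat(a, a), 'a')
```

The paper's use of Antimirov partial derivatives yields finite sets of derivatives without ad hoc simplification, which is what makes the Coq termination and correctness proofs tractable.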

Relevance:

60.00%

Publisher:

Abstract:

The objective of this research is to analyze the internal organization of a polling firm through the lens of the workplace spheres identified by Bélanger, Giles and Murray (2004): production management, work organization, and the employment relationship. More specifically, we seek to understand how the firm under study manages organizational flexibility and what impact this management has on the three spheres of work. The analysis uses a case-study methodology and draws on various kinds of material: occasional observations, informal interviews, administrative databases, and the evaluation reports of telephone interviews conducted by the interviewers. The analysis of the results likewise combines more classical methods, such as correlations, with graphical representations and qualitative analyses. The analysis identifies an operating logic at work in the different spheres of employment: extensive standardization of work processes (in production management), reduced room for manoeuvre (in work organization), and non-recognition of the interviewers' expertise (in the employment relationship). The contradictions identified in the analysis, between the spheres of employment and the objectives of flexibility, show that the structures put in place block, to some extent, the capacity for initiative and adaptation that flexibility demands. The research showed that what is asked of interviewers reflects both the demands of flexibility, as observed in this thesis, and social expectations regarding survey methodology. Everything suggests that these demands can cap employee performance.
Keywords: call centres, interviewers, polling firms, organizational flexibility, production management, work organization, employment relationship, emotional labour.