63 results for sparse matrix technique
Abstract:
Graph pebbling is a network model for studying whether or not a given supply of discrete pebbles can satisfy a given demand via pebbling moves. A pebbling move across an edge of a graph takes two pebbles from one endpoint and places one pebble at the other endpoint; the other pebble is lost in transit as a toll. It has been shown that deciding whether a supply can meet a demand on a graph is NP-complete. The pebbling number of a graph is the smallest t such that every supply of t pebbles can satisfy every demand of one pebble. Deciding if the pebbling number is at most k is $\Pi_2^P$-complete. In this paper we develop a tool, called the Weight Function Lemma, for computing upper bounds and sometimes exact values for pebbling numbers with the assistance of linear optimization. With this tool we are able to calculate the pebbling numbers of much larger graphs than was possible with previous algorithms, and much more quickly as well. We also obtain results for many families of graphs, in many cases by hand, with much simpler and remarkably shorter proofs than those given in previously existing arguments (certificates typically of size at most the number of vertices times the maximum degree), especially for highly symmetric graphs. Here we apply the Weight Function Lemma to several specific graphs, including the Petersen, Lemke, 4th weak Bruhat, Lemke squared, and two random graphs, as well as to a number of infinite families of graphs, such as trees, cycles, graph powers of cycles, cubes, and some generalized Petersen and Coxeter graphs. This partly answers a question of Pachter et al. by computing the pebbling exponent of cycles to within an asymptotically small range. It is conceivable that this method yields an approximation algorithm for graph pebbling.
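To make the pebbling-move rule concrete, here is a minimal Python sketch (not the paper's Weight Function Lemma or its linear-optimization machinery) that brute-forces the decision problem described above on a tiny graph: can a given distribution of pebbles place one pebble on a target vertex? The graph encoding and names are illustrative.

    # Brute-force check (illustrative only): can `dist` place a pebble on `target`
    # via pebbling moves (remove two pebbles from u, add one to a neighbour of u)?
    def reachable(graph, dist, target, _seen=None):
        """graph: dict vertex -> list of neighbours; dist: dict vertex -> pebble count."""
        if _seen is None:
            _seen = set()
        if dist.get(target, 0) >= 1:
            return True
        key = tuple(sorted(dist.items()))
        if key in _seen:                      # already explored; pebble totals only shrink
            return False
        _seen.add(key)
        for u, p in list(dist.items()):
            if p >= 2:
                for v in graph[u]:
                    new = dict(dist)
                    new[u] -= 2
                    new[v] = new.get(v, 0) + 1
                    if reachable(graph, new, target, _seen):
                        return True
        return False

    # Example: on a path a-b-c, 4 pebbles on a can reach c, but 3 cannot.
    path = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
    print(reachable(path, {"a": 4}, "c"))   # True
    print(reachable(path, {"a": 3}, "c"))   # False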
Abstract:
Factor analysis, as a frequent technique for multivariate data inspection, is widely used also for compositional data analysis. The usual way is to use a centered logratio (clr) transformation to obtain the random vector y of dimension D. The factor model is then y = Λf + e (1), with the factors f of dimension k ≪ D, the error term e, and the loadings matrix Λ. Using the usual model assumptions (see, e.g., Basilevsky, 1994), the factor analysis model (1) can be written as Cov(y) = ΛΛ^T + ψ (2), where ψ = Cov(e) has a diagonal form. The diagonal elements of ψ as well as the loadings matrix Λ are estimated from an estimate of Cov(y). Consider observed clr-transformed data Y as realizations of the random vector y. Outliers or deviations from the idealized model assumptions of factor analysis can severely affect the parameter estimation. As a way out, robust estimation of the covariance matrix of Y will lead to robust estimates of Λ and ψ in (2); see Pison et al. (2003). Well-known robust covariance estimators with good statistical properties, like the MCD or the S-estimators (see, e.g., Maronna et al., 2006), rely on a full-rank data matrix Y, which is not the case for clr-transformed data (see, e.g., Aitchison, 1986). The isometric logratio (ilr) transformation (Egozcue et al., 2003) solves this singularity problem. The data matrix Y is transformed to a matrix Z by using an orthonormal basis of lower dimension. Using the ilr-transformed data, a robust covariance matrix C(Z) can be estimated. The result can be back-transformed to the clr space by C(Y) = V C(Z) V^T, where the matrix V with orthonormal columns comes from the relation between the clr and the ilr transformation. Now the parameters in model (2) can be estimated (Basilevsky, 1994) and the results have a direct interpretation, since the links to the original variables are still preserved. The above procedure will be applied to data from geochemistry. Our special interest is in comparing the results with those of Reimann et al. (2002) for the Kola project data.
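The back-transformation step can be sketched in a few lines of numpy. The ilr basis construction below is one standard choice, and scikit-learn's MinCovDet stands in for the MCD estimator; both are assumptions for illustration, not necessarily the implementation used by the authors.

    import numpy as np
    from sklearn.covariance import MinCovDet

    def clr(X):
        """Centered logratio transform of compositions (rows of X, strictly positive)."""
        L = np.log(X)
        return L - L.mean(axis=1, keepdims=True)

    def ilr_basis(D):
        """D x (D-1) matrix V with orthonormal columns spanning the clr hyperplane."""
        A = np.eye(D) - np.full((D, D), 1.0 / D)       # projector onto the sum-zero subspace
        Q, _ = np.linalg.qr(A)
        return Q[:, :D - 1]                            # orthonormal columns, V^T 1 = 0

    rng = np.random.default_rng(0)
    X = rng.dirichlet(np.ones(5), size=200)            # toy compositional data, D = 5

    Y = clr(X)                                         # clr scores (rank-deficient)
    V = ilr_basis(X.shape[1])
    Z = Y @ V                                          # ilr scores (full rank)

    C_Z = MinCovDet(random_state=0).fit(Z).covariance_ # robust covariance in ilr space
    C_Y = V @ C_Z @ V.T                                # back-transform: C(Y) = V C(Z) V^T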
Abstract:
This paper presents the implementation details of a coded structured light system for rapid shape acquisition of unknown surfaces. Such techniques are based on the projection of patterns onto a measuring surface and grabbing images of every projection with a camera. By analyzing the pattern deformations that appear in the images, 3D information about the surface can be calculated. The implemented technique projects a unique pattern, so it can be used to measure moving surfaces. The structure of the pattern is a grid where the colors of the slits are selected using a De Bruijn sequence. Moreover, since both axes of the pattern are coded, the cross points of the grid have two codewords (which permits reconstructing them very precisely), while pixels belonging to horizontal and vertical slits also have a codeword. Different sets of colors are used for horizontal and vertical slits, so the resulting pattern is invariant to rotation. Therefore, the alignment constraint between camera and projector assumed by many authors is not necessary.
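As a rough illustration of the colour-coding idea, the following Python sketch generates a De Bruijn sequence and uses it to assign slit colours so that any window of three consecutive slits is unique; the palette and window length are illustrative assumptions, not the parameters of the implemented system.

    def de_bruijn(k, n):
        """De Bruijn sequence B(k, n): every length-n word over k symbols appears once (cyclically)."""
        a = [0] * k * n
        seq = []
        def db(t, p):
            if t > n:
                if n % p == 0:
                    seq.extend(a[1:p + 1])
            else:
                a[t] = a[t - p]
                db(t + 1, p)
                for j in range(a[t - p] + 1, k):
                    a[t] = j
                    db(t + 1, t)
        db(1, 1)
        return seq

    colors = ["red", "green", "blue", "yellow"]        # illustrative palette, not the paper's
    code = de_bruijn(len(colors), 3)                   # any 3 consecutive slits form a unique codeword
    slit_colors = [colors[c] for c in code]
    print(len(code), slit_colors[:6])                  # 64 coloured slits for one axis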
Abstract:
Epipolar geometry is a key concept in computer vision, and estimating the fundamental matrix is the only way to compute it. This article surveys several methods of fundamental matrix estimation, which have been classified into linear methods, iterative methods and robust methods. All of these methods have been programmed and their accuracy analysed using real images. A summary, accompanied by experimental results, is given.
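One of the linear methods typically covered in such surveys, the normalized eight-point algorithm, can be sketched with numpy as follows; the function and variable names are illustrative.

    import numpy as np

    def normalize(pts):
        """Translate to the centroid and scale so the mean distance from the origin is sqrt(2)."""
        c = pts.mean(axis=0)
        d = np.sqrt(((pts - c) ** 2).sum(axis=1)).mean()
        s = np.sqrt(2) / d
        T = np.array([[s, 0, -s * c[0]],
                      [0, s, -s * c[1]],
                      [0, 0, 1]])
        ph = np.column_stack([pts, np.ones(len(pts))])
        return (T @ ph.T).T, T

    def fundamental_8pt(x1, x2):
        """Estimate F from N >= 8 correspondences x1 <-> x2 (N x 2 arrays), so that x2' F x1 = 0."""
        p1, T1 = normalize(x1)
        p2, T2 = normalize(x2)
        # Each correspondence gives one row of the linear system A f = 0.
        A = np.column_stack([
            p2[:, 0] * p1[:, 0], p2[:, 0] * p1[:, 1], p2[:, 0],
            p2[:, 1] * p1[:, 0], p2[:, 1] * p1[:, 1], p2[:, 1],
            p1[:, 0], p1[:, 1], np.ones(len(p1)),
        ])
        _, _, Vt = np.linalg.svd(A)
        F = Vt[-1].reshape(3, 3)
        # Enforce rank 2 by zeroing the smallest singular value.
        U, S, Vt = np.linalg.svd(F)
        F = U @ np.diag([S[0], S[1], 0]) @ Vt
        return T2.T @ F @ T1                           # undo the normalization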
Abstract:
This doctoral project is an interdisciplinary work aimed at obtaining new functional nanocomposites (NCs) synthesized from ion-exchange polymeric materials modified with metal nanoparticles (NPMs) of different compositions. The materials developed are evaluated for two possible applications: 1) as catalysts for organic reactions of current interest (palladium-based NCs), and 2) bactericidal applications in the treatment of domestic or industrial water (silver-based NCs). The development of nanomaterials is of great current interest because of their special properties, the exploitation of which is the driving force behind the fabrication of new NCs. Polymer Stabilized Metal Nanoparticles (PSNPM) have been prepared by the in-situ Inter-matrix synthesis (IMS) technique, which consists of the sequential loading of the functional groups of the polymeric matrices with metal ions and their subsequent chemical reduction inside the ion-exchange polymeric matrix. Stabilization in polymeric matrices prevents self-aggregation, one of the main known problems of NPs. For the development of this methodology, different types of ion-exchange polymeric matrices have been used: Sulfonated PolyEtherEtherKetone (SPEEK) membrane, as well as polypropylene-based synthetic fibres with different types of functional groups, which allow their use as filters in the disinfection of aqueous solutions or as catalytic material. Over the course of the project, the final nanocomposite material has been progressively optimized for the applications of interest, in terms of the activity and functionality of the nanoparticles and the stability of the nanocomposite. Thus, the synthesis of NPs stabilized in ion-exchange resins has been optimized by screening different types of resins and evaluating them in industrial applications of interest.
Abstract:
The quantitative estimation of Sea Surface Temperatures from fossil assemblages is a fundamental issue in palaeoclimatic and palaeoceanographic investigations. The Modern Analogue Technique, a widely adopted method based on direct comparison of fossil assemblages with modern coretop samples, was revised with the aim of conforming it to compositional data analysis. The new CODAMAT method was developed by adopting the Aitchison metric as distance measure. Modern coretop datasets are characterised by a large amount of zeros. The zero replacement was carried out by adopting a Bayesian approach based on a posterior estimation of the parameter of the multinomial distribution. The number of modern analogues from which to reconstruct the SST was determined by means of a multiple approach, considering the proxies correlation matrix, the Standardized Residual Sum of Squares and the Mean Squared Distance. This new CODAMAT method was applied to the planktonic foraminiferal assemblages of a core recovered in the Tyrrhenian Sea. Key words: Modern analogues, Aitchison distance, Proxies correlation matrix, Standardized Residual Sum of Squares
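The core of a modern-analogue estimate under the Aitchison metric can be sketched as follows; the crude zero replacement, the number of analogues and the toy data are illustrative assumptions, not the Bayesian replacement and selection criteria used in CODAMAT.

    import numpy as np

    def clr(x):
        g = np.exp(np.mean(np.log(x)))
        return np.log(x / g)

    def aitchison_distance(x, y):
        """Aitchison distance = Euclidean distance between clr-transformed compositions."""
        return np.linalg.norm(clr(x) - clr(y))

    def sst_from_analogues(fossil, coretops, coretop_sst, k=5, eps=1e-5):
        """Average the SSTs of the k modern coretops closest to the fossil assemblage."""
        f = np.where(fossil > 0, fossil, eps)                # crude zero replacement (not Bayesian)
        d = np.array([aitchison_distance(f, np.where(c > 0, c, eps)) for c in coretops])
        nearest = np.argsort(d)[:k]
        return coretop_sst[nearest].mean()

    rng = np.random.default_rng(1)
    coretops = rng.dirichlet(np.ones(10), size=300)          # toy modern assemblages
    coretop_sst = rng.uniform(5, 28, size=300)               # toy SSTs (degrees C)
    fossil = rng.dirichlet(np.ones(10))
    print(sst_from_analogues(fossil, coretops, coretop_sst))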
Abstract:
In this paper, a novel rank estimation technique for trajectory-based motion segmentation within the Local Subspace Affinity (LSA) framework is presented. This technique, called Enhanced Model Selection (EMS), is based on the relationship between the estimated rank of the trajectory matrix and the affinity matrix built by LSA. The results on synthetic and real data show that, without any a priori knowledge, EMS automatically provides an accurate and robust rank estimation, improving the accuracy of the final motion segmentation.
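For intuition, a generic SVD-based rank estimate of a trajectory matrix (an energy-threshold model selection, not the EMS criterion itself) might look as follows; the toy data and threshold are assumptions for illustration.

    import numpy as np

    def estimate_rank(W, energy=0.99):
        """Smallest r such that the top r singular values capture `energy` of the spectrum."""
        s = np.linalg.svd(W, compute_uv=False)
        cumulative = np.cumsum(s ** 2) / np.sum(s ** 2)
        return int(np.searchsorted(cumulative, energy) + 1)

    # Toy trajectory matrix: 2F x P matrix of P feature points tracked over F frames,
    # built from two independent low-rank motions plus small noise.
    rng = np.random.default_rng(2)
    F, P = 30, 40
    W = np.hstack([rng.standard_normal((2 * F, 4)) @ rng.standard_normal((4, P // 2)),
                   rng.standard_normal((2 * F, 4)) @ rng.standard_normal((4, P // 2))])
    W += 1e-3 * rng.standard_normal(W.shape)
    print(estimate_rank(W))                             # close to 8 for this toy example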
Abstract:
In accordance with the general objectives of the project and the work plan foreseen for this year, cellulose fibres and microfibres were obtained from two sources: plant cellulose from pine and eucalyptus, and bacterial cellulose. The microfibrils have been used as reinforcing material for the fabrication of composite materials based on natural rubber, polycaprolactone and polyvinyl alcohol. The samples were produced by the casting technique in an aqueous medium at room temperature and were characterized in terms of their mechanical, physical and thermal properties. It was observed that, in general, the addition of cellulose microfibrils to the polymeric matrices produces a substantial improvement in the mechanical properties of the material compared with the unreinforced polymer. The results can be summarized as follows: 1. Fabrication of composite materials based on natural rubber and cellulose fibres. Cellulose fibres and nanofibres were obtained, chemically modified and used as reinforcement in a rubber matrix. The results showed improved mechanical properties, mainly in the composites reinforced with nanofibres. 2. Production of cellulose whiskers and their use as reinforcing material in a polycaprolactone matrix. Cellulose whiskers were obtained from bleached pulp. Their addition to a polycaprolactone matrix produced composites with mechanical properties superior to those of the matrix, with good dispersion of the whiskers. 3. Production of bacterial cellulose fibres and cellulose nanofibres, their isolation and their use in a polyvinyl alcohol matrix. Bacterial cellulose was obtained from the bacterium Gluconacetobacter xylinum. In addition, cellulose nanofibres were produced from bleached eucalyptus. Bacterial cellulose as a reinforcing material did not produce significant improvements in the mechanical properties of the matrix; in contrast, notable improvements were observed with the nanofibres as reinforcement.
Abstract:
In the present work, microstructure improvement using FSP (Friction Stir Processing) is studied. In the first part of the work, the microstructure improvement of as-cast A356 is demonstrated. Tensile tests were carried out to check the increase in ductility; however, the expected results could not be achieved. In the second part, the microstructure improvement of a fusion weld in 1050 aluminium alloy is presented. Hardness tests were carried out to prove the mechanical property improvements. In the third and last part, the microstructure improvement of 1050 aluminium alloy is achieved, and the mechanical property improvements induced by FSP are discussed. The influence of tool traverse speed on microstructure and mechanical properties is also discussed. Hardness tests and recrystallization theory enabled us to determine this influence.
Abstract:
The applicability of the protein phosphatase inhibition assay (PPIA) to the determination of okadaic acid (OA) and its acyl derivatives in shellfish samples has been investigated, using a recombinant PP2A and a commercial one. Mediterranean mussel, wedge clam, Pacific oyster and flat oyster have been chosen as model species. Shellfish matrix loading limits for the PPIA have been established, according to the shellfish species and the enzyme source. A synergistic inhibitory effect has been observed in the presence of OA and shellfish matrix, which has been overcome by the application of a correction factor (0.48). Finally, Mediterranean mussel samples obtained from Ría de Arousa during a DSP closure associated with Dinophysis acuminata, determined as positive by the mouse bioassay, have been analysed with the PPIAs. The OA equivalent contents provided by the PPIAs correlate satisfactorily with those obtained by liquid chromatography–tandem mass spectrometry (LC–MS/MS).
Abstract:
The algebraic equality is proved between Jennrich's (1970) asymptotic $X^2$ test for the equality of correlation matrices and a Wald test statistic derived from Neudecker and Wesselman's (1990) expression of the asymptotic variance matrix of the sample correlation matrix.
Abstract:
Asymptotic chi-squared test statistics for testing the equality of moment vectors are developed. The test statistics proposed are generalized Wald test statistics that specialize for different settings by inserting an appropriate asymptotic variance matrix of sample moments. Scaled test statistics are also considered for dealing with situations of non-iid sampling. The specialization is carried out for testing the equality of multinomial populations, and the equality of variance and correlation matrices for both normal and non-normal data. When testing the equality of correlation matrices, a scaled version of the normal theory chi-squared statistic is proven to be an asymptotically exact chi-squared statistic in the case of elliptical data.
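A two-sample Wald statistic for the equality of mean vectors, as a simple special case of the generalized Wald construction described above, might be sketched as follows; the plug-in covariances and the toy data are illustrative assumptions, not the paper's general setting.

    import numpy as np
    from scipy.stats import chi2

    def wald_equality_test(m1, V1, n1, m2, V2, n2):
        """W = d' (V1/n1 + V2/n2)^{-1} d with d = m1 - m2; W ~ chi2(len(d)) under H0."""
        d = m1 - m2
        W = d @ np.linalg.solve(V1 / n1 + V2 / n2, d)
        p_value = chi2.sf(W, df=len(d))
        return W, p_value

    # Toy example: compare the mean vectors of two samples.
    rng = np.random.default_rng(3)
    x1 = rng.standard_normal((200, 3))
    x2 = rng.standard_normal((150, 3)) + 0.1
    W, p = wald_equality_test(x1.mean(axis=0), np.cov(x1, rowvar=False), len(x1),
                              x2.mean(axis=0), np.cov(x2, rowvar=False), len(x2))
    print(W, p)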
Abstract:
Graphical displays which show inter-sample distances are important for the interpretation and presentation of multivariate data. Except when the displays are two-dimensional, however, they are often difficult to visualize as a whole. A device, based on multidimensional unfolding, is described for presenting some intrinsically high-dimensional displays in fewer, usually two, dimensions. This goal is achieved by representing each sample by a pair of points, say $R_i$ and $r_i$, so that a theoretical distance between the $i$-th and $j$-th samples is represented twice, once by the distance between $R_i$ and $r_j$ and once by the distance between $R_j$ and $r_i$. Self-distances between $R_i$ and $r_i$ need not be zero. The mathematical conditions for unfolding to exhibit symmetry are established. Algorithms for finding approximate fits, not constrained to be symmetric, are discussed and some examples are given.
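A toy least-squares fit of such an unfolding representation, placing a pair of points per sample so that the distance from $R_i$ to $r_j$ approximates a target distance $d_{ij}$, might look as follows in Python; this unconstrained gradient fit is only an illustration, not one of the algorithms discussed in the paper.

    import numpy as np

    def unfold(D, dim=2, steps=2000, lr=0.01, seed=0):
        """Fit points R_i, r_i in `dim` dimensions so that ||R_i - r_j|| approximates D[i, j]."""
        n = D.shape[0]
        rng = np.random.default_rng(seed)
        R = rng.standard_normal((n, dim))
        r = rng.standard_normal((n, dim))
        for _ in range(steps):
            diff = R[:, None, :] - r[None, :, :]           # diff[i, j] = R_i - r_j
            dist = np.linalg.norm(diff, axis=2) + 1e-9
            g = (dist - D) / dist                          # d(stress)/d(dist), up to a factor of 2
            grad_R = (g[:, :, None] * diff).sum(axis=1)
            grad_r = -(g[:, :, None] * diff).sum(axis=0)
            R -= lr * grad_R
            r -= lr * grad_r
        return R, r

    # Toy target distances from random points in 5-D; note the fit need not be symmetric.
    rng = np.random.default_rng(4)
    X = rng.standard_normal((10, 5))
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    R, r = unfold(D)
    fit = np.linalg.norm(R[:, None, :] - r[None, :, :], axis=2)
    print(np.abs(fit - D).mean())                          # average absolute misfit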