961 results for Singular Curve
Abstract:
Woodcut-engraved initials.
Abstract:
Canonical Correlation Analysis for Interpreting Airborne Laser Scanning Metrics along the Lorenz Curve of Tree Size Inequality
Abstract:
This thesis develops a method for the reconstruction of incomplete experimental databases of more than two dimensions. The general idea is the iterative application of the high-order singular value decomposition to the incomplete database. The new method is inspired by the gappy reconstruction method for two-dimensional databases invented by Everson and Sirovich (1995), which was in turn improved by Beckers and Rixen (2003) and, independently, by Venturi and Karniadakis (2004). The method is further adapted to handle both the noise typical of experimental databases and structured databases whose information does not form a perfect hyperrectangle. A three-dimensional toy database, obtained from a transcendental function, is used to calibrate and illustrate the method. An exhaustive study of the behaviour of the method and its variants then follows for several aerodynamic databases: three three-dimensional databases containing the pressure distribution over a wing, one generated by a semi-analytical method in order to study different types of spatial discretization, and the other two resulting from numerical models computed with CFD. Finally, the method is applied to an experimental database of more than three dimensions containing force measurements for a Prandtl (box-wing) configuration, obtained in a wind-tunnel test campaign that explored a wide space of geometric parameters of the configuration and therefore produced a database in which the information is sparse. ABSTRACT A method based on an iterative application of high order singular value decomposition is derived for the reconstruction of missing data in multidimensional databases.
The method is inspired by a seminal gappy reconstruction method for two-dimensional databases invented by Everson and Sirovich (1995) and improved by Beckers and Rixen (2003) and Venturi and Karniadakis (2004). In addition, the method is adapted to treat both noisy and structured-but-nonrectangular databases. The method is calibrated and illustrated using a three-dimensional toy model database that is obtained by discretizing a transcendental function. The performance of the method is tested on three aerodynamic databases for the flow past a wing, one obtained by a semi-analytical method and two resulting from computational fluid dynamics. The method is finally applied to an experimental database consisting of a non-exhaustive parameter space measurement of forces for a box-wing configuration.
Abstract:
Preliminary study of the facsimile edition of the book published in 1743. Series: Fondo antiguo de la Escuela Técnica Superior de Arquitectura de Madrid; 7
Abstract:
This thesis presents a new method for filtering errors out of multidimensional databases. The method requires no a priori information about the nature of the errors. In particular, the errors need not be small, randomly distributed, or zero-mean. The only requirement is that they be uncorrelated with the clean information in the database. The new method is based on an improved extension of the basic gappy reconstruction method (able to reconstruct missing information at known positions in a multidimensional database) invented by Everson and Sirovich (1995). The improved gappy reconstruction method evolves into a two-step error-filtering method: first, (a) it identifies the positions in the database affected by errors, and then (b) it reconstructs the information at those positions, treating it as unknown. The resulting method filters O(1) errors efficiently, whether they are random or systematic, and whether their distribution in the database is concentrated or spread out. The method is first illustrated on a two-dimensional toy-model database resulting from discretizing a transcendental function. Some practical applications are then presented for two three-dimensional aerodynamic databases, computed with CFD, containing the pressure distribution over a wing at several angles of attack. ABSTRACT A method is presented to filter errors out in multidimensional databases. The method does not require any a priori information about the nature of the errors. In particular, the errors need not be small, random, or zero-mean. Instead, they are only required to be relatively uncorrelated with the clean information contained in the database.
The method is based on an improved extension of a seminal iterative gappy reconstruction method (able to reconstruct lost information at known positions in the database) due to Everson and Sirovich (1995). The improved gappy reconstruction method is evolved into a two-step error-filtering method: it is adapted to first (a) identify the error locations in the database and then (b) reconstruct the information at these locations by treating the associated data as gappy data. The resulting method filters out O(1) errors in an efficient fashion, both when these are random and when they are systematic, and both when they are concentrated and when they are spread along the database. The performance of the method is first illustrated using a two-dimensional toy-model database resulting from discretizing a transcendental function, and then tested on two CFD-calculated, three-dimensional aerodynamic databases containing the pressure coefficient on the surface of a wing for varying values of the angle of attack. A more general performance analysis is presented, with the intention of quantifying both the degree of randomness the method tolerates while still performing correctly and the size of error it can detect. Lastly, some improvements of the method are proposed, each with its respective verification.
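The two-step scheme described above, flag suspect locations and then refill them as gaps, can be sketched in a minimal two-dimensional form. The rank, outlier threshold, and names below are illustrative assumptions for this sketch, not the thesis's actual algorithm.

```python
import numpy as np

def filter_errors(data, rank=1, thresh=3.0, n_iter=100):
    """Two-step sketch: (a) flag entries whose residual against a
    rank-truncated SVD is an outlier, then (b) treat those entries as
    gaps and refill them by iterated low-rank reconstruction."""
    U, s, Vt = np.linalg.svd(data, full_matrices=False)
    approx = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    resid = data - approx
    mask = np.abs(resid) < thresh * resid.std()        # (a) keep plausible entries
    filled = np.where(mask, data, data[mask].mean())   # seed the flagged positions
    for _ in range(n_iter):                            # (b) gappy refill
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        filled = np.where(mask, data, (U[:, :rank] * s[:rank]) @ Vt[:rank])
    return filled
```

Because the errors are uncorrelated with the clean low-rank content, they show up as large residuals against the truncated reconstruction even when they are O(1) and systematic, which is what makes step (a) possible without prior knowledge of their statistics.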
Abstract:
We describe the use of singular value decomposition in transforming genome-wide expression data from genes × arrays space to reduced diagonalized “eigengenes” × “eigenarrays” space, where the eigengenes (or eigenarrays) are unique orthonormal superpositions of the genes (or arrays). Normalizing the data by filtering out the eigengenes (and eigenarrays) that are inferred to represent noise or experimental artifacts enables meaningful comparison of the expression of different genes across different arrays in different experiments. Sorting the data according to the eigengenes and eigenarrays gives a global picture of the dynamics of gene expression, in which individual genes and arrays appear to be classified into groups of similar regulation and function, or similar cellular state and biological phenotype, respectively. After normalization and sorting, the significant eigengenes and eigenarrays can be associated with observed genome-wide effects of regulators, or with measured samples, in which these regulators are overactive or underactive, respectively.
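The normalization step described above, filtering out selected eigengene and eigenarray pairs, amounts to zeroing the corresponding singular values in the SVD of the genes-by-arrays matrix and reconstructing. A minimal sketch (the function name and arguments are illustrative, not from the paper):

```python
import numpy as np

def filter_modes(X, discard):
    """Zero the singular values of the listed eigengene-eigenarray pairs
    of a genes-by-arrays matrix X and reconstruct without them."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s = s.copy()
    s[list(discard)] = 0.0   # drop e.g. a mode inferred to represent noise
    return (U * s) @ Vt      # data expressed without the discarded modes
```

Sorting the rows and columns of `X` by their projections onto the leading rows of `Vt` (eigengenes) and columns of `U` (eigenarrays) then gives the global grouping of genes and arrays the abstract describes.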
Abstract:
We give conditions that rule out formation of sharp fronts for certain two-dimensional incompressible flows. We show that a necessary condition of having a sharp front is that the flow has to have uncontrolled velocity growth. In the case of the quasi-geostrophic equation and two-dimensional Euler equation, we obtain estimates on the formation of semi-uniform fronts.
Abstract:
The phase transition for turbulent diffusion, reported by Avellaneda and Majda [Avellaneda, M. & Majda, A. J. (1994) Philos. Trans. R. Soc. London A 346, 205-233, and several earlier papers], is traced to a modeling assumption in which the energy spectrum of the turbulent fluid is singularly dependent on the viscosity in the inertial range. Phenomenological models of turbulence and intermittency, by contrast, require that the energy spectrum be independent of the viscosity in the inertial range. When the energy spectrum is assumed to be consistent with the phenomenological models, there is no phase transition for turbulent diffusion.
Abstract:
We prove global existence of nonnegative solutions to one-dimensional degenerate parabolic problems containing a singular term. We also show the global quenching phenomenon for L¹ initial data. Moreover, the free boundary problem is considered in this paper.
Abstract:
Perylene bisimides (PBIs) are n-type semiconducting and photogenerating materials widely used in a variety of optoelectronic devices. Particularly interesting are PBIs that are simultaneously water-soluble and liquid-crystalline (PBI-W+LC) and, thus, attractive for the development of high-performing easily processable applications in biology and “green” organic electronics. In this work, singular temperatures connected to charge transport mechanism transitions in a PBI-W+LC derivative are determined with high accuracy by means of temperature-dependent photocurrent studies. These singular temperatures include not only the ones observed at 60 and 110 °C, corresponding to phase transition temperatures from crystalline to liquid-crystalline (LC) and from LC to the isotropic phase, respectively, as confirmed by differential scanning calorimetry (DSC), but also a transition at 45 °C, not observed by DSC. By analyzing the photocurrent dependence simultaneously on temperature and on light intensity, this transition is interpreted as a change from monomolecular to bimolecular recombination. These results might be useful for other semiconducting photogenerating materials, not necessarily PBIs or even organic semiconductors, which also show transport behavior changes at singular temperatures not connected with structural or phase transitions.
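The monomolecular-versus-bimolecular distinction drawn above is commonly read off the power-law exponent of photocurrent versus light intensity: an exponent near 1 suggests monomolecular recombination and near 0.5 bimolecular. A small illustrative fit under that rule of thumb, not the authors' actual analysis:

```python
import numpy as np

def recombination_exponent(intensity, photocurrent):
    """Slope of log(photocurrent) vs log(intensity); ~1 suggests
    monomolecular recombination, ~0.5 bimolecular (rule-of-thumb sketch)."""
    slope, _ = np.polyfit(np.log(intensity), np.log(photocurrent), 1)
    return slope
```

Tracking this exponent while sweeping temperature is one way a recombination-regime change could reveal itself at a singular temperature that DSC does not see.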
Abstract:
A contribution in the framework of the International Conference on Architecture and Urbanism from the perspective of women architects, held at the Escuela Técnica Superior de Arquitectura de Madrid in December 2008, on the role played in the discipline by architecture and urbanism biennials, the role of women in architecture, and the role of women architects in these biennials, in particular Rosa Grena Kliass, Sofía von Ellrichshausen with the Casa Poli, and Carme Pinós with the Torre Cube, all of them award winners at these events.
Abstract:
Mercury intrusion porosimetry (MIP) has been widely used to evaluate the quality of concrete through the pore size distribution parameters. Two of these parameters are the critical pore diameter (Dcrit) and the percentage of the most interconnected net of pores compared to the total volume of pores. Some researchers consider Dcrit as the diameter obtained from the inflexion point of the cumulative mercury intrusion curve while others consider Dcrit as the diameter obtained from the point of abrupt variation in the same curve. This study aims to analyze two groups of concretes of varying w/c ratios, one cast with pozzolanic cement and another with high initial strength cement, in order to determine which of these diameters feature a better correlation with the quality parameters of the concretes. The concrete quality parameters used for the evaluations were (1) the w/c ratios and (2) chloride diffusion coefficients measured at approximately 90 days. MIP cumulative distributions of the same concretes were also measured at about 90 days, and Dcrit values were determined (1) from the point of abrupt variation and alternatively, (2) from the inflexion point of each of these plots. It was found that Dcrit values measured from the point of abrupt variation were useful indicators of the quality of the concrete, but the Dcrit values based on the inflexion points were not. Hence, it is recommended that Dcrit and the percentage of the most interconnected volume of pores should be obtained considering the point of abrupt variation of the cumulative curve of pore size distribution.
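One simple numerical reading of the "point of abrupt variation" recommended above is the diameter at which the cumulative intrusion curve takes its largest single step, i.e. the peak of the incremental intrusion. A sketch under that assumption (function and variable names are illustrative):

```python
import numpy as np

def dcrit_abrupt(diameters, cumulative):
    """Critical pore diameter read at the point of abrupt variation:
    the diameter reached by the largest single step of the cumulative
    intrusion curve (diameters listed in intrusion order, descending)."""
    i = int(np.argmax(np.abs(np.diff(cumulative))))
    return diameters[i + 1]   # diameter at which the abrupt step completes
```

The inflection-point alternative would instead locate where the second derivative of the cumulative curve changes sign, which the study above found to correlate poorly with w/c ratio and chloride diffusivity.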
Abstract:
The Pharmacopea de la Armada, the work of Leandro de Vega, published in 1760 for the use of physicians and surgeons aboard the ships and in the hospitals of the Spanish navy throughout the 18th century, is considered the first Spanish naval pharmacopoeia. Its author defines it as a «catalogue of medicaments pertaining to medical diseases», in essence a compendium of formulas for preparing the medicines most useful to the seafarers of the time, for both internal and external use. This article analyses the work in depth, acknowledging its relevance as a primary source, placing it in its historical context, detailing the contents of its treatises, and providing biographical notes on its author as well as information on the professional position and mission of its intended readers.
Abstract:
At present, the market is severely mispricing Greece’s sovereign risk relative to the country’s fundamentals. As a result of the mispricing, financial intermediation in Greece has become dysfunctional and the privatisation of state-owned assets has stalled. This mispricing is partially due to an illiquid and fragmented government yield curve. A well-designed public liability management exercise can lead to a more efficient pricing of Greece’s government bonds and thereby help restore stable and affordable financing for the country’s private sector, which is imperative in order to overcome Greece’s deep recession. This paper proposes three measures to enhance the functioning of the Greek government debt market: i) Greece should issue a new five-year bond, ii) it should consolidate the 20 individual series of government bonds into four liquid securities and iii) it should offer investors a swap of these newly created bonds into dollar-denominated securities. Each of these measures would be beneficial to the Hellenic Republic, since the government would be able to reduce the face value and the net present value of its debt stock. Furthermore, this exercise would facilitate the resumption of market access, which is a necessary condition for continuous multilateral disbursements to Greece.