26 results for reproducibility of calculated surfaces


Abstract:

The technical improvement and new applications of Infrared Thermography (IRT) with healthy subjects should be accompanied by results about the reproducibility of IRT measurements in different population groups. In addition, there is a clear need for more software to analyze IRT images of human beings. Therefore, the objectives of this study were: firstly, to investigate the reproducibility of skin temperature (Tsk) in overweight and obese subjects using IRT in different Regions of Interest (ROI), time points and side-to-side differences (ΔT); and secondly, to check the reliability of a new software package called Termotracker®, specialized in the analysis of IRT images of human beings. Methods: 22 overweight and obese males (11) and females (11) (age: 41.51±7.76 years; height: 1.65±0.09 m; weight: 82.41±11.81 kg; BMI: 30.17±2.58 kg/m²) were assessed in two consecutive thermograms (5 seconds in between) by the same observer, using an infrared camera (FLIR T335, Sweden) to obtain 4 IRT images of the whole body. 11 ROIs were selected using Termotracker® to analyze their reproducibility and reliability through Intra-class Correlation Coefficient (ICC) and Coefficient of Variation (CV) values. Results: The reproducibility of the side-to-side differences (ΔT) between two consecutive thermograms was very high in all ROIs (mean ICC = 0.989), and excellent between two computers (mean ICC = 0.998). The reliability of the software was very high in all ROIs (mean ICC = 0.999). Intra-examiner reliability when analyzing the same subjects in two consecutive thermograms was also very high (mean ICC = 0.997). CV values of the different ROIs were around 2%. Conclusions: Skin temperature in overweight subjects showed excellent reproducibility across consecutive thermograms. The reproducibility of thermal asymmetries (ΔT) was also good, but it was influenced by several factors that should be further investigated. Termotracker® achieved excellent reliability results and is a reliable and objective software tool for analyzing IRT images of human beings.
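As a rough illustration of the reproducibility statistics quoted above, the sketch below computes a one-way ICC and a within-subject CV for two repeated measurements per subject. The data, the ICC form (the abstract does not state which one was used) and the function names are assumptions, not the study's analysis code.

import numpy as np

def icc_oneway(measure1, measure2):
    # One-way random-effects ICC(1,1) for two repeated measurements per subject
    # (illustrative only; the study may have used a different ICC form)
    data = np.column_stack([measure1, measure2])          # shape (n_subjects, 2)
    n, k = data.shape
    subject_means = data.mean(axis=1)
    grand_mean = data.mean()
    ms_between = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)
    ms_within = np.sum((data - subject_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

def within_subject_cv(measure1, measure2):
    # Within-subject coefficient of variation (%), a common reproducibility index
    data = np.column_stack([measure1, measure2])
    within_sd = np.sqrt(np.mean(data.var(axis=1, ddof=1)))
    return 100.0 * within_sd / data.mean()

# Hypothetical skin-temperature readings (degrees C) for one ROI, two consecutive thermograms
t1 = np.array([31.2, 30.8, 32.0, 31.5, 30.9])
t2 = np.array([31.3, 30.7, 31.9, 31.6, 31.0])
print(icc_oneway(t1, t2), within_subject_cv(t1, t2))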

Abstract:

This thesis discusses correction methods that compensate for the variation of lighting conditions in colour image and video applications. These variations are such that Computer Vision algorithms that use colour features to describe objects often fail. Three research questions are formulated that define the framework of the thesis. The first question addresses the similarities in photometric behaviour between images of dissimilar adjacent surfaces. Based on the analysis of the image formation model in dynamic situations, this thesis proposes a model that predicts the colour variations of a region of an image from the variations of the surrounding regions. The proposed model is called the Quotient Relational Model of Regions. This model is valid when the light sources illuminate all of the surfaces included in the model; these surfaces are placed close to each other and have similar orientations; and they are primarily Lambertian. Under these circumstances, the photometric response of a region can be related to the others by a linear combination. No previous work proposing such a relational model was found in the scientific literature. The second question goes a step further and examines whether those similarities can be used to correct unknown photometric variations in an unknown region from known adjacent regions. A method called Linear Correction Mapping is proposed, which is capable of providing an affirmative answer under the circumstances characterised above. A training stage is required to determine the parameters of the model. The method, which initially works for a single camera, is extended to cover multi-camera architectures with non-overlapping fields of view. To this end, only a few image samples of the same object acquired by all of the cameras are required. Furthermore, the correction mapping accounts both for lighting variations and for changes in the camera exposure settings. Every image correction method fails when the image of the object to be corrected is overexposed or its signal-to-noise ratio is very low. Thus, the third question asks whether the acquisition process can be controlled so as to obtain an optimal exposure under uncontrolled lighting conditions. A Camera Exposure Control method is proposed that is capable of maintaining a suitable exposure provided that the lighting variations can be captured within the dynamic range of the camera. Each of the proposed methods was evaluated individually. The experimental methodology consisted of first selecting scenarios that cover representative situations in which the methods are theoretically valid. Linear Correction Mapping was validated in three object re-identification applications (vehicles, faces and persons) that use the colour distribution of the objects as a feature. Camera Exposure Control was tested in an outdoor parking scenario. In addition, several performance indicators were defined to objectively compare the results with other relevant state-of-the-art correction and auto-exposure methods. The results of the evaluation demonstrated that the proposed methods outperform the compared ones in most situations. Based on the obtained results, the answers to the research questions are affirmative, although under limited circumstances: the hypotheses concerning prediction, prediction-based correction and auto-exposure are feasible in the situations identified throughout the thesis, but they cannot be guaranteed in general. Furthermore, the work presented raises new questions and scientific challenges, which are highlighted as future research work.
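As a rough sketch of the idea behind a linear correction mapping — not the thesis implementation — the snippet below fits, by least squares, a linear mapping that predicts a target region's colour from the observed colours of known adjacent regions, and then applies it to a new frame. All data, shapes and names are assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Training data: per-frame mean RGB of 3 known reference regions (stacked) and 1 target region
n_frames = 200
refs = rng.uniform(0.2, 0.9, size=(n_frames, 3 * 3))            # 3 regions x RGB
true_w = rng.uniform(-0.5, 0.5, size=(3 * 3, 3))
target = refs @ true_w + 0.01 * rng.normal(size=(n_frames, 3))   # target region RGB

# Training stage: estimate the linear mapping by ordinary least squares
w, *_ = np.linalg.lstsq(refs, target, rcond=None)

# Correction stage: predict the target region's appearance in a new frame
new_refs = rng.uniform(0.2, 0.9, size=(1, 3 * 3))
predicted_target_rgb = new_refs @ w
print(predicted_target_rgb)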

Abstract:

Here we show that potassium-doped tungsten foil should be preferred to pure tungsten foil when considering tungsten laminate pipes for structural divertor applications. Potassium-doped tungsten materials are well known from the light-bulb industry and show enhanced creep and recrystallization behaviour, which can be explained by the formation of potassium-filled bubbles that surround the elongated grains and interlock the microstructure. In this way, the ultra-fine grained (UFG) microstructure of tungsten foil can be stabilized, and with it the extraordinary mechanical properties of the foil in terms of ductility, toughness, brittle-to-ductile transition, and radiation resistance. In this paper we present the results of three-point bending tests performed at room temperature on pure tungsten and potassium-doped tungsten foils annealed at 800, 900, 1000, 1100, 1200, 1300, 1400, 1600, 1800, 2000, 2200, and 2400 °C for 1 h in vacuum. The microstructural assessment covers hardness measurements and analyses of the fractured surfaces, as well as a comparison of the microstructures by optical microscopy. The results show a positive effect of potassium doping compared to pure tungsten foil and demonstrate the potential of the doped foil.
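For context, the quantities usually reported from a three-point bending test follow from standard beam theory; the expressions below are generic textbook formulas, not taken from the paper, with F the applied load, L the support span, b the foil width, h the foil thickness and \delta the mid-span deflection:

\sigma_f = \frac{3 F L}{2 b h^{2}}, \qquad \varepsilon_f = \frac{6\, \delta\, h}{L^{2}}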

Abstract:

Reflectance anisotropy spectroscopy (RAS) was employed to determine the optimal specific molar flow of Sb needed to grow GaInP with a given order parameter by MOVPE. The RAS signatures of GaInP surfaces exposed to different Sb/P molar flow ratios were recorded, and the RAS peak at 3.02 eV provided a feature that is sensitive to the amount of Sb on the surface. The range of Sb/P ratios over which Sb acts as a surfactant was determined from the RAS intensity of this peak, and GaInP layers were then grown using different Sb/P ratios. The order parameter of the resulting layers was measured by PL at 20 K. This procedure may be extended to the calibration of surfactant-mediated growth of other materials exhibiting characteristic RAS signatures.
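For reference, the long-range order parameter \eta is commonly estimated from the low-temperature PL peak energy through the quadratic dependence of the bandgap reduction on \eta; this relation and the reference values it requires come from the general GaInP ordering literature, not from this abstract:

\Delta E_g(\eta) = \eta^{2}\, \Delta E_g(\eta{=}1) \quad \Rightarrow \quad \eta = \sqrt{\frac{E_g(\eta{=}0) - E_{PL}}{\Delta E_g(\eta{=}1)}}

where E_g(\eta{=}0) is the bandgap of fully disordered GaInP and \Delta E_g(\eta{=}1) the bandgap reduction of the fully ordered alloy, both taken from the literature.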

Abstract:

The possibility of designing and manufacturing biomedical microdevices with multiple length-scale geometries can help to promote special interactions both with their environment and with surrounding biological systems. These interactions aim to enhance biocompatibility and overall performance by using biomimetic approaches. In this paper, we present a design and manufacturing procedure for obtaining multi-scale biomedical microsystems based on the combination of two additive manufacturing processes: a conventional laser writer to manufacture the overall device structure, and a direct laser writer based on two-photon polymerization to yield the finer details. The process stands out for its versatility, accuracy and manufacturing speed, and allows for the manufacture of microsystems and implants with overall sizes of up to several millimeters and with details down to sub-micrometric structures. As an application example, we have focused on manufacturing a biomedical microsystem to analyze the impact of microtextured surfaces on cell motility. This process yielded a significant increase in precision and manufacturing speed when compared with more conventional rapid prototyping procedures.

Abstract:

We discuss three geometric constructions and their relations, namely the offset, the conchoid and the pedal construction. The offset surface F_d of a given surface F is the set of points at fixed normal distance d from F. The conchoid surface G_d of a given surface G is obtained by increasing the radius function by d with respect to a given reference point O. There is a nice relation between offsets and conchoids: the pedal surfaces of a family of offset surfaces are a family of conchoid surfaces. Since this relation is birational, a family of rational offset surfaces corresponds to a family of rational conchoid surfaces and vice versa. We present the theoretical principles of this mapping and apply it to ruled surfaces and quadrics. Since these surfaces have rational offsets and conchoids, their pedal and inverse pedal surfaces are new classes of rational conchoid surfaces and rational offset surfaces.
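In coordinates, and taking the reference point O as the origin, the three constructions can be sketched as follows (the notation is assumed here, not the paper's). For a parametrised surface F(u,v) with unit normal n(u,v):

F_d(u,v) = F(u,v) + d\, n(u,v)    % offset surface at normal distance d

G_d(u,v) = \left(1 + \frac{d}{\lVert G(u,v) \rVert}\right) G(u,v)    % conchoid surface with respect to O

P(u,v) = \bigl(F(u,v) \cdot n(u,v)\bigr)\, n(u,v)    % pedal surface: foot of the perpendicular from O to the tangent plane of F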

Abstract:

Virtual acoustic reality techniques are powerful tools for recovering the acoustical heritage of historic buildings. Through acoustic modelling and auralization techniques it is possible to reconstruct the sound of buildings that have disappeared or been significantly modified over the years, provided the original geometry and the acoustic characteristics of their surfaces are known. This paper shows the results of a research project whose goal is the virtual recovery of the sound of the Hispanic Rite, the rite celebrated by Christians of the Iberian Peninsula before the imposition of the Roman Rite in the mid-eleventh century. For this purpose, acoustic models of a series of Pre-Romanesque churches were made. These acoustic models represent the churches in their original state, following the reconstruction hypotheses proposed by leading researchers in medieval liturgical archaeology. Multichannel anechoic recordings of several pieces of the music of the Hispanic Rite were carried out using a spherical array composed of 31 microphones. Finally, static and dynamic auralizations were developed, involving the different liturgical configurations that were usual in this rite.
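At its core, an auralization convolves an anechoic recording with a room impulse response (RIR) computed from the acoustic model. The snippet below is a generic, single-channel illustration of that step, with a synthetic RIR standing in for a simulated one; all signals and parameters are placeholders, not project data.

import numpy as np
from scipy.signal import fftconvolve

fs = 48_000                                  # sample rate in Hz (assumed)
anechoic = np.random.randn(2 * fs)           # stand-in for one anechoic channel
rir = np.zeros(fs)                           # stand-in for a simulated room impulse response
rir[0] = 1.0                                 # direct sound
rir[int(0.05 * fs)] = 0.5                    # one early reflection at 50 ms
tail = np.random.randn(fs) * np.exp(-np.linspace(0.0, 8.0, fs))
rir += 0.02 * tail                           # crude exponential reverberant tail

auralized = fftconvolve(anechoic, rir)       # reverberant (auralized) signal
auralized /= np.max(np.abs(auralized))       # normalise before playback or export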

Abstract:

Petrophysical properties, such as porosity, permeability, density or anisotropy, determine the alterability of stone surfaces from archaeological sites and, therefore, the future preservation of the material. Other properties, such as surface roughness or color, may reveal changes due to alteration processes, whether natural or man-induced, for example by conservation treatments. The application of conservation treatments may alter some of these properties, forcing the stone surface to re-adapt to the new conditions, which could generate new deterioration processes. In this study, the changes resulting from the application of consolidating and hydrophobic treatments on stone materials from the Roman Theatre (marble and granite) and the Mitreo's House (mural painting and mosaics), both archaeological sites in Merida (Spain), are analyzed. The use of portable field devices allows us to perform analyses both on site and in the laboratory, comparing treated and untreated samples. The treatments consisted of synthetic resins, both consolidating (such as tetraethoxysilane, TEOS) and hydrophobic products. The results confirm that undesirable changes may occur, with consequences ranging from purely aesthetic variations to physical, chemical and mechanical damage. This also allows us to identify limitations in the use of these techniques for the evaluation of conservation treatments.
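As an illustration of how a treatment-induced color change can be quantified from portable spectrophotometer readings, the sketch below applies the CIE76 CIELAB color difference; the choice of formula, the readings and the threshold comment are assumptions for illustration, not values from the study.

import math

def delta_e_cie76(lab_before, lab_after):
    # CIE76 color difference between two CIELAB measurements (L*, a*, b*)
    dL, da, db = (b - a for a, b in zip(lab_before, lab_after))
    return math.sqrt(dL * dL + da * da + db * db)

# Hypothetical readings of a granite surface before and after a hydrophobic treatment
print(delta_e_cie76((62.1, 1.4, 8.2), (60.5, 1.9, 10.0)))   # about 2.5, near the perceptibility threshold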

Abstract:

Sexpartite vaults constitute one of the most interesting chapters in European Gothic architecture. Originally, the use of the square cross-ribbed vault was limited to relatively small spaces, but when the need arose to cover spaces of considerable size, a new vault with very peculiar characteristics appeared. This new vault was a cross-ribbed vault reinforced in the centre by a rib parallel to the transverse ribs, which effectively divided the vault in half. This configuration breaks the side arch into two fragments, creating a pair of windows on each side. The volumetry of these vaults is extremely complex, and the difficulties involved in their construction perhaps explain why they were abandoned in favour of the simple cross-ribbed vault, now with rectangular sections. The existence of the sexpartite vault barely lasted more than fifty years, from the end of the 12th century to the beginning of the 13th. Towards the end of the 19th century Viollet-le-Duc gave a succinct explanation of this type of vault, and A. Choisy later devoted some pages to the French sexpartite vault; since then, the subject has only been broached in a few references in later studies on Gothic architecture. However, despite its short period of existence, the sexpartite vault spread throughout Europe and was used to build important vaulting. Viollet-le-Duc's sexpartite vault could be considered the prototype of them all, although the studies we have conducted so far lead us to affirm that there is a wide variety of such vaults, with different volumetric spaces and different construction strategies. Therefore, we believe that this chapter of international Gothic deserves further study applying the knowledge and resources available today. This paper explores the most significant European sexpartite vaults. New measurement technology has led to a revolution in research into the history of construction, allowing studies to be conducted that were hitherto impossible. Thorough data collection using a total station and photogrammetry has enabled us to identify the stereotomy of the voussoirs, tas-de-charges and keystones, as well as the bonding of the surfaces of the severies. A comparison of the construction techniques employed in the different vaults studied reveals common construction features as well as aspects that are specific to each country. Thus we are able to establish the relationship between sexpartite vaults in different European countries and their influence on each other.

Abstract:

The reproducibility of scientific studies and results is a goal that every scientist must pursue when announcing research outcomes. The rise of computational science, as a way of conducting empirical studies using mathematical models and simulations, has opened a new range of challenges in this context. The adoption of workflows as a way of detailing the scientific procedure of these experiments, along with the experimental data conservation initiatives undertaken during recent decades, has partially eased this problem. However, in order to fully address it, the conservation and reproducibility of the computational equipment associated with scientific workflows must also be considered. The wide range of software and hardware resources required to execute a scientific workflow implies that a comprehensive description detailing what those resources are and how they must be configured is necessary. In this thesis we address the reproducibility of execution environments for scientific workflows by documenting them using a formal model, which can later be used to obtain an equivalent environment. To this end, we propose a set of semantic models for representing and relating the relevant concepts of those environments, together with a set of tools that use these models to generate a description of the infrastructure, and an algorithmic process that consumes these descriptions to derive a new execution environment specification, which can be enacted into an equivalent environment using virtualization techniques. These contributions have been applied to a representative set of scientific experiments, belonging to different scientific domains and each exposing different software and hardware requirements. The obtained results demonstrate the feasibility of the proposed approach, successfully reproducing the target experiments under different virtualization environments.
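To make the idea concrete, the toy sketch below captures an execution environment as a structured description and derives a virtualization specification (here, Dockerfile text) from it. The thesis relies on formal semantic models and its own tooling, so every field, name and the target format here are assumptions made purely for illustration.

# Illustrative sketch only: not the thesis' semantic models or tools.
environment = {
    "base_os": "ubuntu:20.04",
    "hardware": {"min_cores": 4, "min_ram_gb": 8},
    "software": [
        {"package": "python3", "version": "3.8"},
        {"package": "openmpi-bin", "version": "4.0"},
    ],
    "workflow_command": "python3 run_workflow.py",   # hypothetical entry point
}

def to_dockerfile(env: dict) -> str:
    # Derive a minimal virtualization spec (Dockerfile text) from the description
    lines = [f"FROM {env['base_os']}"]
    pkgs = " ".join(s["package"] for s in env["software"])
    lines.append(f"RUN apt-get update && apt-get install -y {pkgs}")
    lines.append(f"CMD {env['workflow_command']}")
    return "\n".join(lines)

print(to_dockerfile(environment))
# Hardware requirements (min_cores, min_ram_gb) would be checked against the
# target virtualization platform before enacting the specification.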

Abstract:

A high-fidelity virtual tool for the numerical simulation of low-velocity impact damage in unidirectional composite laminates is proposed. A continuum material model for the simulation of intraply damage phenomena is implemented in a numerical scheme as a user subroutine of the commercially available Abaqus finite element package. Delaminations are simulated using cohesive surfaces. The use of structured meshes aligned with the fiber directions allows the physically sound simulation of matrix cracks parallel to the fibers and of their interaction with the development of delaminations. The implementation of element erosion criteria and the application of intraply and interlaminar friction allow for the simulation of fiber splits and their entanglement, which in turn results in permanent indentation of the impacted laminate. It is shown that this simulation strategy gives sound results for impact energies below and above the Barely Visible Impact Damage threshold, up to laminate perforation conditions.
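As a minimal illustration of the kind of intraply stiffness degradation such a continuum damage model performs — not the actual user subroutine — the sketch below implements a one-dimensional damage law with linear softening and irreversible damage; the modulus, onset strain and failure strain are invented values.

import numpy as np

E, eps0, epsf = 130e3, 0.011, 0.04          # assumed modulus (MPa), onset and failure strains

def update(strain, d_old):
    # Return (stress, damage); damage only grows (irreversibility)
    if strain <= eps0:
        d = d_old
    else:
        d = min(1.0, epsf * (strain - eps0) / (strain * (epsf - eps0)))
        d = max(d, d_old)
    return (1.0 - d) * E * strain, d         # degraded stress: (1 - d) * E * strain

d = 0.0
for eps in np.linspace(0.0, 0.05, 11):       # monotonic loading ramp
    stress, d = update(eps, d)
    print(f"eps={eps:.3f}  stress={stress:7.1f} MPa  d={d:.2f}")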