30 results for Polyester and vinylester thermoset matrices
Abstract:
A small Positron Emission Tomography demonstrator based on LYSO slabs and Silicon Photomultiplier matrices is under construction at the University and INFN of Pisa. In this paper we present the characterization results of the read-out electronics and of the detection system. Two SiPM matrices, each composed of 8 × 8 SiPM pixels at 1.5 mm pitch, were coupled one-to-one to a LYSO crystal array. Custom front-end ASICs were used to read the 64 channels of each matrix. Data from each front-end were multiplexed and sent to a DAQ board for digital conversion; a motherboard collects the data and communicates with a host computer through a USB port. Specific tests were carried out on the system in order to assess its performance. Furthermore, we have measured some of the most important parameters of the system for PET applications.
Abstract:
A nonlinear implicit finite element model for the solution of the two-dimensional (2-D) shallow water equations, based on a Galerkin formulation of the 2-D estuarine hydrodynamic equations, has been developed. Spatial discretization has been achieved with isoparametric Lagrangian elements, and the element matrices are obtained by Simpson numerical integration. For time integration of the model, several finite-difference schemes have been used: the iterative Crank-Nicolson method provides superior accuracy and allows the largest time step Δt, whereas central-difference time integration computes faster. The model has been tested on several examples to check its accuracy and its advantages in the computation and handling of matrices. Finally, an application to the Bay of Santander is also presented.
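The accuracy/stability trade-off mentioned above (an implicit Crank-Nicolson scheme tolerating a large Δt versus a cheaper explicit step) can be illustrated on a 1-D diffusion toy problem. This is a hedged sketch, not the paper's 2-D shallow water model: the equation, grid, step ratio `r` and a forward (explicit) step standing in for the faster explicit integration are all illustrative assumptions.

```python
import numpy as np

def second_difference(n):
    """Tridiagonal 1-D second-difference matrix (dense here for brevity)."""
    return -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)

def crank_nicolson_step(u, r):
    """One Crank-Nicolson step for u_t = u_xx with r = dt/dx**2:
    (I - r/2 A) u_new = (I + r/2 A) u_old  (implicit, unconditionally stable)."""
    A = second_difference(u.size)
    return np.linalg.solve(np.eye(u.size) - 0.5 * r * A,
                           (np.eye(u.size) + 0.5 * r * A) @ u)

def explicit_step(u, r):
    """Explicit (forward) step; cheap per step but stable only for r <= 1/2."""
    return u + r * second_difference(u.size) @ u

# 19 interior points on [0, 1]; r = 2 exceeds the explicit stability limit.
x = np.linspace(0.0, 1.0, 21)[1:-1]
u0 = np.sin(np.pi * x) + 1e-3 * np.sin(19 * np.pi * x)
u_cn, u_ex = u0.copy(), u0.copy()
for _ in range(50):
    u_cn = crank_nicolson_step(u_cn, r=2.0)
    u_ex = explicit_step(u_ex, r=2.0)
print(np.abs(u_cn).max(), np.abs(u_ex).max())  # CN stays bounded, explicit blows up
```

With the same large time step the Crank-Nicolson iterate remains bounded while the explicit iterate diverges, which is why the implicit scheme can trade per-step cost for a much larger Δt.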
Abstract:
Here, a novel and efficient moving object detection strategy based on non-parametric modeling is presented. Whereas the foreground is modeled by combining color and spatial information, the background model is constructed exclusively from color information, greatly reducing the computational and memory requirements. Estimating the background and foreground covariance matrices allows us to obtain compact moving regions while reducing the number of false detections. Additionally, a tracking strategy provides a priori knowledge about the spatial position of the moving objects, which improves the performance of the Bayesian classifier.
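As a rough illustration of how estimated covariance matrices enter a Bayesian pixel classification step, the sketch below labels a single RGB sample by comparing Gaussian log-likelihoods under a background and a foreground model. Note the hedge: the paper's models are non-parametric, whereas this sketch uses simple Gaussians, and the means and covariances are invented toy values.

```python
import numpy as np

def mahalanobis_sq(x, mean, cov):
    """Squared Mahalanobis distance of a color sample to a model."""
    d = x - mean
    return float(d @ np.linalg.inv(cov) @ d)

# Toy models (illustrative values only): a grey-ish background and a
# red-ish foreground; in practice the covariances are estimated from data.
bg_mean, bg_cov = np.array([120.0, 120.0, 120.0]), 25.0 * np.eye(3)
fg_mean, fg_cov = np.array([200.0, 40.0, 40.0]), 100.0 * np.eye(3)

def classify(pixel):
    """Pick the model with the larger Gaussian log-likelihood,
    -0.5 * (Mahalanobis^2 + log det cov) up to a shared constant."""
    ll_bg = -0.5 * (mahalanobis_sq(pixel, bg_mean, bg_cov)
                    + np.log(np.linalg.det(bg_cov)))
    ll_fg = -0.5 * (mahalanobis_sq(pixel, fg_mean, fg_cov)
                    + np.log(np.linalg.det(fg_cov)))
    return "foreground" if ll_fg > ll_bg else "background"

print(classify(np.array([118.0, 122.0, 119.0])))  # near the background mean
print(classify(np.array([205.0, 45.0, 38.0])))    # near the foreground mean
```

The covariance term is what makes the decision sensitive to how tightly each model clusters in color space, which is the role the estimated covariance matrices play in reducing false detections.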
Abstract:
This article examines, from the energy viewpoint, a new lightweight, slim, highly energy-efficient, light-transmitting envelope system that provides for seamless, free-form designs in architectural projects. The research was based on envelope components already on the market, especially components implemented with granular silica gel insulation, as this is the most effective translucent thermal insulation available today. The tests run on these materials revealed that no single one has all the features required of the new envelope model, although some do have properties that could be exploited to generate this envelope, namely the vacuum chamber of vacuum insulated panels (VIP), the monolithic aerogel used as insulation in some prototypes, and reinforced polyester barriers. By combining these three design components (the high-performance thermal insulation of a vacuum chamber with monolithic silica gel insulation, and the free-form design potential provided by materials like reinforced polyester and epoxy resins), we have been able to define and test a new, variable-geometry, energy-saving envelope system.
Abstract:
The adhesives used in marine environments are subject to particular chemical conditions, mainly characterised by an elevated chloride ion content and intermittent wetting/drying cycles, among others. These conditions can limit the use of adhesives due to the degradation processes they experience. In this work, the chemical degradation of two different polymers, polyurethane and vinylester, was studied under immersion in natural seawater for different periods of time. The diffusion coefficients and concentration profiles of water through the thickness of the adhesives were obtained. Microstructural changes in the polymer due to the action of water were observed by SEM, and the chemical degradation of the polymer was monitored with Fourier transform infrared spectroscopy (FTIR) and differential scanning calorimetry (DSC). The degradation of the mechanical properties of the adhesive was determined by creep tests on Mixed Cantilever Beam (MCB) specimens at different temperatures. After 180 days of immersion of the specimens, it was concluded that the strain-dependent J-integral value implies a loss of stiffness of 51% and a decrease in the failure load of 59% for the adhesive tested.
Abstract:
This article presents software for determining the statistical behavior of qualitative survey data previously transformed into quantitative data with a Likert scale. The main intention is to offer users a useful tool for obtaining statistical characteristics and forecasts of financial risks in a fast and simple way. Additionally, this paper presents a definition of operational risk. The article also explains different techniques for conducting surveys with a Likert scale (Avila, 2008) to capture expert opinion through the transformation of qualitative data into quantitative data. It is easy to interpret a single expert's opinion about a risk, but when users have many surveys and matrices it becomes difficult to obtain results, because common data must be compared; a representative statistical value must then be extracted from the common data to obtain the weight of each risk. Finally, this article describes the development of the Qualitative Operational Risk Software (QORS), which has been designed to determine the root of risks in organizations and their value at operational risk, OpVaR (Jorion, 2008; Chernobai et al., 2008), when the input data comes from expert opinion and the associated matrices.
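A minimal sketch of the kind of Likert-to-quantitative transformation and risk-weight extraction described above. The mapping, function name and normalisation are illustrative assumptions, not the QORS implementation.

```python
import statistics

# Map a 5-point Likert scale to numeric scores (illustrative mapping).
LIKERT = {"very low": 1, "low": 2, "medium": 3, "high": 4, "very high": 5}

def risk_weight(responses):
    """Convert expert answers to scores and return a representative
    statistic (here the mean, normalised to [0, 1]) as the risk weight."""
    scores = [LIKERT[r.lower()] for r in responses]
    return statistics.mean(scores) / max(LIKERT.values())

survey = ["High", "Very high", "Medium", "High"]
print(round(risk_weight(survey), 2))  # mean of [4, 5, 3, 4] is 4.0 -> 0.8
```

Aggregating many such surveys into a matrix of weights is precisely the step the abstract says becomes hard to do by hand, and which the software automates.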
Abstract:
Gamma detectors based on monolithic scintillator blocks coupled to APD matrices have proved to be a good alternative to pixelated ones for PET scanners. They provide comparable spatial resolution, improve the sensitivity and simplify the mechanical design of the system. In this study we evaluate, by means of Geant4-based simulations, the possibility of replacing the APDs with SiPMs. Several commercial matrices of light sensors coupled to LYSO:Ce monolithic blocks have been simulated and compared. Regarding the spatial resolution and linearity of the detector, SiPMs with high photodetection efficiency could become an advantageous replacement for the APDs.
Abstract:
Polymer nanocomposites, specifically nanoclay-reinforced polymers, have attracted great interest as matrix materials for high-temperature composite applications. Nanocomposites require relatively low dispersant loads to achieve significant property enhancements. These enhancements are mainly a consequence of the interfacial effects that result from dispersing the silicate nanolayers in the polymer matrix, together with the high in-plane strength, stiffness and aspect ratio of the lamellar nanoparticles. Montmorillonite (MMT) clay modified with organic onium ions with long alkyl chains, such as the Cloisites, has been widely used to obtain nanocomposites. Reactive groups in the organic onium ions can form chemical bonds with the polymer matrix, which favours a very high degree of exfoliation of the clay platelets in the nanocomposite (1,2).
Abstract:
Reducing duplication in ex-situ collections is complicated and requires good-quality genetic markers. This study was conducted to assess the value of endosperm proteins and SSRs for validating potential duplicates and monitoring intra-accession variability. Fifty durum wheat (Triticum turgidum ssp. durum) accessions grouped into 23 potential duplicates, previously characterised for 30 agro-morphological traits, were analysed for gliadin and high molecular weight glutenin (HMWG) subunit alleles, total protein, and 24 SSRs covering a wide genome area. Similarity and dissimilarity matrices were generated from the protein and SSR alleles. For accessions heterogeneous at the gliadins, the percent pattern homology (PH) between gliadin patterns and Nei's coefficient of genetic identity (I) were computed. Eighteen duplicates identical for proteins showed fewer than 3 unshared SSR alleles, or none. For heterogeneous accessions, PH and I values lower than 80 clearly identified off-types with more than 3 unshared SSRs. Only those biotypes differing in no more than one protein-coding locus were confirmed with SSRs. Good concordance among proteins, morphological traits, and SSRs was detected. However, the discrepancy in similarity detected in some cases shows that it is advisable to evaluate redundancy through distinct approaches. Protein analysis together with SSR data is very useful for identifying duplicates, biotypes, closely related genotypes, and contaminations.
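Nei's coefficient of genetic identity used above has a simple closed form, I = Jxy / sqrt(Jx * Jy), where Jx, Jy and Jxy are the sums of squared and cross-multiplied allele frequencies over loci. A small sketch with invented allele frequencies (toy values, not data from this study):

```python
import math

def nei_identity(p, q):
    """Nei's (1972) coefficient of genetic identity between two
    populations, from allele frequencies p and q at the same loci:
    I = Jxy / sqrt(Jx * Jy)."""
    jxy = sum(pi * qi for pi, qi in zip(p, q))
    jx = sum(pi * pi for pi in p)
    jy = sum(qi * qi for qi in q)
    return jxy / math.sqrt(jx * jy)

# Identical frequency profiles give I = 1; diverging profiles give I < 1.
p = [0.6, 0.3, 0.1]
print(round(nei_identity(p, p), 3))
print(round(nei_identity(p, [0.1, 0.3, 0.6]), 3))
```

Values of I near 1 flag likely duplicates, while low values flag off-types, which matches the thresholding logic the abstract describes.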
Abstract:
The bonding quality of epoxy-glued timber and glass fibre reinforced polymers (GFRP) was evaluated by means of a compression loading shear test. Three timber species (Radiata pine, Laricio pine and Oak) and two kinds of GFRP (plates and rods made with polyester resin reinforced with mat and roving glass fibre) were glued and tested using three epoxy formulations. The increase in shear strength with age after the setting of the epoxy formulations and the effect of surface roughness on timber and GFRP gluing (planing of the timber surface and prior sanding of the GFRP) were studied. It can be concluded that the mechanical properties of these products make them suitable for use in the reinforcement of deteriorated timber structures, and that a rough timber surface is preferable to a planed one, while prior sanding of GFRP surfaces is not advantageous.
Abstract:
Purpose: This paper presents the elements and evaluation methods that should be included in a framework for evaluating the achievements and impacts of transport projects supported under the EC Framework Programmes (FP). Further, the paper discusses the potential of such an evaluation framework for producing recommendations on future transport research and policy objectives, as well as mutual learning as a basis for strategic long-term planning. Methods: The paper describes the two-dimensional evaluation methodology developed in the course of the FP7 METRONOME project. The dimensions are: (1) achievement of project objectives and targets at different levels, and (2) research project impacts according to four impact groups. The methodology uses four complementary approaches in evaluation, namely evaluation matrices, coordinator questionnaires, lead-user interviews and workshops. Results: Based on testing the methodology with a sample of FP5 and FP6 projects, the main results relating to the rationale, implementation and achievements of FP projects are presented. In general, achievement of objectives in both FPs was good. The strongest impacts were identified within the impact group of management and co-ordination. The scientific and end-user impacts of the projects were also adequate, but the wider societal impacts were quite modest. The paper concludes with a discussion of both the theoretical and practical implications of the proposed methodology, and by presenting some relevant future research needs.
Abstract:
This study forms part of wider research conducted under an EU 7th Framework Programme project (COmputationally Driven design of Innovative CEment-based materials, CODICE). The ultimate aim is the multi-scale modelling of the variations in mechanical performance of degraded and non-degraded cementitious matrices. The model is being experimentally validated by hydrating the main phases present in Portland cement, tri-calcium silicate (T1-C3S) and di-calcium silicate (β-C2S), and their blends. The present paper discusses micro- and nanoscale studies of the cementitious skeletons forming during the hydration of C3S, C2S and 70%/30% blends of C3S/C2S and C2S/C3S at a water/cement ratio of 0.4. The hydrated pastes were characterized at different curing ages with 29Si NMR, SEM/TEM/EDS, BET, and nanoindentation. The findings served as a basis for the micro- and nanoscale characterization of the hydration products formed, especially the C-S-H gels. Differences in composition, structure and mechanical behaviour (nanoindentation) were identified, depending on whether the gels formed in C3S or C2S pastes. The C3S gels had more compact morphologies, a smaller BET-N2 specific surface area and lower porosity than the gels from C2S-rich pastes. The nanoindentation results appear to indicate that the various C-S-H phases formed in hydrated C3S and C2S have the same mechanical properties as those formed in Portland cement paste. Compared to the C3S sample, the hydrated C2S specimen was dominated by the loose-packed (LP) and low-density (LD) C-S-H phases, and had a much lower content of the high-density (HD) C-S-H phase.
Abstract:
The thesis "Self-similar Measures on the Plane, Moments and Hessenberg Matrices" (MEDIDAS AUTOSEMEJANTES EN EL PLANO, MOMENTOS Y MATRICES DE HESSENBERG) lies at the intersection of geometric measure theory, the theory of orthogonal polynomials and operator theory. The work studies measures with compact support on the complex plane from the point of view of the associated infinite moment and Hessenberg matrices, which represent them in the theory of orthogonal polynomials. More precisely, it concentrates on self-similar measures, the equilibrium measures defined by an iterated function system (IFS). Self-similar sets have the geometric property of decomposing into a union of pieces similar to the complete set. These pieces may overlap: when the overlap is small, Hutchinson's theory [Hut81] works well, but without restrictions it fails. The overlapping problem consists in controlling the measure of this overlap. The complexity of the problem is exemplified by the infinite convolutions of Bernoulli distributions, which turn out to be an example of self-similar measures in the real case. As early as 1935, Jessen and Wintner [JW35] posed this problem; far from being simple, it has been studied for more than seventy-five years, and the main questions posed by Garsia in 1962 [Gar62] remain unsolved. The interest this problem has attracted, together with its complexity, is demonstrated by the many publications dealing with related questions; see, for example, [JW35], [Erd39], [PS96], [Ma00], [Ma96], [Sol98], [Mat95], [Sim05], [JKS07], [JKS11].
In the first chapter we introduce in detail self-similar measures in the complex plane and iterated function systems, together with the concepts of measure theory needed to describe them. Next, we introduce the necessary tools from the theory of orthogonal polynomials, infinite matrices and operators. In the second and third chapters we translate the geometric properties of self-similar measures to the moment and Hessenberg matrices, respectively. From these results we describe algorithms to compute these matrices from the corresponding IFS. Concretely, we obtain explicit formulas and approximation algorithms for the moments and moment matrices of fractal measures, based on a fixed-point theorem for matrices. Moreover, using techniques from operator theory, we extend to the complex plane the results obtained by G. Mantica [Ma00, Ma96] in the real case. This result is the basis for defining a stable algorithm that approximates the Hessenberg matrix associated with a fractal measure, and for obtaining exact finite sections of Hessenberg matrices associated with a sum of measures. In the last chapter we consider more general measures μ and study the asymptotic behaviour of the eigenvalues of a Hermitian moment matrix and its impact on the properties of the associated measure. The central result shows that if the associated polynomials are dense in L2(μ), then the minimum eigenvalue of the finite sections of the moment matrix of the measure necessarily tends to zero.
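The result on the minimum eigenvalue of the finite sections can be illustrated numerically in a simpler, classical real-line setting than the thesis's complex-plane framework: for Lebesgue measure on [0, 1] (where the polynomials are dense in L2), the finite sections of the moment matrix are Hilbert matrices, and their smallest eigenvalue indeed tends to zero. A hedged sketch:

```python
import numpy as np

def moment_matrix(n):
    """n x n section of the moment matrix of Lebesgue measure on [0, 1]:
    M[i, j] = integral of x^(i+j) dx = 1 / (i + j + 1), the Hilbert matrix."""
    i, j = np.indices((n, n))
    return 1.0 / (i + j + 1)

# The smallest eigenvalue of the finite sections decays (very fast here),
# consistent with the density of polynomials in L2 of this measure.
mins = [np.linalg.eigvalsh(moment_matrix(n)).min() for n in (2, 4, 8)]
print(mins)
```

The rapid decay is also why such moment matrices are notoriously ill-conditioned, which motivates the stable algorithms developed in the thesis.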
Abstract:
Pragmatism is the leading motivation for regularization. We can understand regularization as a modification of the maximum-likelihood estimator so that a reasonable answer can be given in an unstable or ill-posed situation. Typical examples arise when fitting parametric or non-parametric models with more parameters than data, or when estimating large covariance matrices. Regularization is also commonly used to improve the bias-variance tradeoff of an estimation. The definition of regularization is thus quite general and, although the introduction of a penalty is probably the most popular type, it is just one of multiple forms of regularization. In this dissertation we focus on applications of regularization for obtaining sparse or parsimonious representations, where only a subset of the inputs is used. A particular form of regularization, L1-regularization, plays a key role in reaching sparsity. Most of the contributions presented here revolve around L1-regularization, although other forms of regularization are explored (also pursuing sparsity in some sense). In addition to presenting a compact review of L1-regularization and its applications in statistics and machine learning, we devise methodology for regression, supervised classification and structure induction of graphical models. Within the regression paradigm, we focus on kernel smoothing learning, proposing techniques for kernel design that are suitable for high-dimensional settings and sparse regression functions. We also present an application of regularized regression techniques to modeling the response of biological neurons. The supervised classification advances deal, on the one hand, with the application of regularization for obtaining a naïve Bayes classifier and, on the other hand, with a novel algorithm for brain-computer interface design that uses group regularization in an efficient manner.
Finally, we present a heuristic for inducing the structure of Gaussian Bayesian networks using L1-regularization as a filter.
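As a minimal illustration of how L1-regularization induces sparsity (only a subset of the inputs survives), the sketch below solves a lasso problem by cyclic coordinate descent with soft-thresholding on synthetic data. This is a generic textbook algorithm, not the dissertation's methodology, and all data and parameters are invented.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the L1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_sweeps=200):
    """Lasso by cyclic coordinate descent:
    minimise (1/2n) * ||y - X w||^2 + lam * ||w||_1."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_sweeps):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]   # residual excluding feature j
            rho = X[:, j] @ r / n
            w[j] = soft_threshold(rho, lam) / col_sq[j]
    return w

# Synthetic data: only the first two of five inputs are relevant.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=200)
w = lasso_cd(X, y, lam=0.1)
print(np.round(w, 2))  # irrelevant coefficients are driven exactly to zero
```

The soft-threshold step is what sets small coefficients exactly to zero, which is the mechanism behind the sparse representations pursued throughout the dissertation.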
Abstract:
Tree tomato (Solanum betaceum) is an Andean small tree cultivated for its juicy fruits. Little information is available on the characterization of genetic resources and the breeding of this neglected crop. We studied the molecular diversity, with AFLP markers using 11 primer combinations, of a collection of 25 S. betaceum accessions belonging to four cultivar groups, most of which had been previously characterized morphologically, as well as one accession of the wild relative S. cajanumense. A total of 197 AFLP fragments were scored, of which 84 (43%) were polymorphic. When S. cajanumense was excluded from the analysis, the number of polymorphic AFLP fragments was 78 (40%). Unique AFLP fingerprints were obtained for every accession, but no AFLP fragments specific and universal to any of the four cultivar groups were found. The total genetic diversity (HT) of the cultivated accessions was HT = 0.2904, while for cultivar groups it ranged from HT = 0.1846 in the orange group to HT = 0.2498 in the orange pointed group. Genetic differentiation among cultivar groups (GST) was low (GST = 0.2248), which was matched by low genetic distances among cultivar groups. The diversity of collections from Ecuador, which we hypothesize is a center of diversity for tree tomato, was similar to that from other origins (HT = 0.2884 and HT = 0.2645, respectively). Cluster and PCoA analyses clearly separated wild S. cajanumense from the cultivated species. However, materials of different cultivar groups and origins were intermingled in both analyses. The Mantel test correlation coefficient between the matrices of morphological and AFLP distances was low (-0.024) and non-significant. Overall, the results show that wide diversity is present in each of the cultivar groups, indicate that Ecuador may be regarded as a center of accumulation of diversity for this crop, and confirm that AFLP and morphological characterization data are complementary. 
The results obtained are of value for the conservation of genetic resources and breeding of tree tomato, as an assessment of the genetic diversity and relationships among differen