968 results for DENSITY-MATRICES


Relevance:

20.00%

Publisher:

Abstract:

This paper presents the results of a study on the use of rice husk ash (RHA) for property modification of high density polyethylene (HDPE). Rice husk is a waste product of the rice processing industry; it is widely used as a fuel, which results in large quantities of RHA. The RHA was characterized by X-ray diffraction (XRD), inductively coupled plasma atomic emission spectroscopy (ICP-AES), light-scattering-based particle size analysis, Fourier transform infrared spectroscopy (FTIR) and scanning electron microscopy (SEM). Most reports suggest that RHA, when blended directly with polymers lacking polar groups, does not substantially improve the properties of the polymer. In this study RHA is blended with HDPE in the presence of a compatibilizer. The compatibilized HDPE-RHA blend has a tensile strength about 18% higher than that of virgin HDPE, and its elongation at break is also higher. TGA studies reveal that both uncompatibilized and compatibilized HDPE-RHA composites have excellent thermal stability. The results show that RHA is a valuable reinforcing material for HDPE and that the environmental pollution arising from RHA can be eliminated profitably by this technique.

Relevance:

20.00%

Publisher:

Abstract:

Increasing amounts of plastic waste in the environment have become a problem of gigantic proportions. The case of linear low-density polyethylene (LLDPE) is especially significant, as it is widely used for packaging and other applications. This synthetic polymer is normally not biodegradable until it is broken down into low-molecular-mass fragments that can be assimilated by microorganisms. Blends of non-biodegradable polymers with biodegradable commercial polymers such as poly(vinyl alcohol) (PVA) can reduce the volume of plastic waste when they undergo partial degradation, and the remaining fragments stand a greater chance of biodegrading within a much shorter span of time. In this investigation, LLDPE was blended with different proportions of PVA (5–30%) in a torque rheometer. Mechanical, thermal, and biodegradation studies were carried out on the blends. The biodegradability of the LLDPE/PVA blends was studied in two environments over a period of 15 weeks: (1) a culture medium containing Vibrio sp. and (2) a soil environment. Blends exposed to the culture medium degraded more than those exposed to the soil environment. Changes in the properties of the LLDPE/PVA blends before and after degradation were monitored using Fourier transform infrared spectroscopy, differential scanning calorimetry (DSC) for crystallinity, and scanning electron microscopy (SEM) for surface morphology, among other techniques. Percentage crystallinity decreased as the PVA content increased, and biodegradation resulted in an increase of crystallinity in the LLDPE/PVA blends. The results show that partial biodegradation of the blends has occurred, holding promise for an eventually biodegradable product.
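The crystallinity values mentioned above are typically derived from the DSC melting enthalpy. The following is a minimal sketch of that calculation, assuming the usual normalisation to the polyethylene fraction of the blend and a reference enthalpy of roughly 293 J/g for fully crystalline polyethylene; both the reference value and the example numbers are illustrative assumptions, not data reported in the paper.

```python
# Hedged sketch: percent crystallinity of the LLDPE fraction from a DSC melting
# enthalpy, Xc (%) = dH_m / (w_LLDPE * dH_100) * 100. The reference enthalpy
# dH_100 (~293 J/g for fully crystalline PE) and the example blend are assumptions.

def percent_crystallinity(dh_melt_j_per_g: float,
                          lldpe_weight_fraction: float,
                          dh_100_j_per_g: float = 293.0) -> float:
    """Percent crystallinity of the LLDPE fraction from its DSC melting enthalpy."""
    return 100.0 * dh_melt_j_per_g / (lldpe_weight_fraction * dh_100_j_per_g)

# Example: a hypothetical LLDPE/PVA 80/20 blend with a measured melting enthalpy of 95 J/g
print(f"Xc = {percent_crystallinity(95.0, 0.80):.1f} %")
```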

Relevance:

20.00%

Publisher:

Abstract:

Upgrading two widely used standard plastics, polypropylene (PP) and high density polyethylene (HDPE), and generating a variety of useful engineering materials based on their blends have been the main objectives of this study. The upgrading was effected using nanomodifiers and/or fibrous modifiers. PP and HDPE were selected for modification because of their attractive inherent properties and wide spectrum of use. Blending is an engineered method of producing new materials with tailor-made properties that combine the advantages of both constituents: PP contributes high tensile and flexural strength, while HDPE acts as an impact modifier in the resulting blend. Hence an optimized blend of PP and HDPE was selected as the matrix material for upgrading, and nanokaolinite clay and E-glass fibre were chosen as modifiers. In the first stage of the work, the mechanical, thermal, morphological, rheological, dynamic mechanical and crystallization characteristics of polymer nanocomposites prepared from the PP/HDPE blend and differently surface-modified nanokaolinite clays were analyzed. In the second stage, the effect of the simultaneous inclusion of nanokaolinite clay (both N100A and N100) and short glass fibres was investigated. The presence of the nanofiller increased the properties of the hybrid composites to a greater extent than those of the microcomposites. In the last stage, micromechanical modeling of both the nano and hybrid composites was carried out to analyze their behavior under load-bearing conditions. These theoretical analyses indicate that the polymer-nanoclay interfacial characteristics partially converge to a state of perfect interfacial bonding (Takayanagi model) with an iso-stress (Reuss, IROM) response. In the case of the hybrid composites, the experimental data follow the trend of the Halpin-Tsai model. This implies that matrix and filler experience different amounts of strain, and that the interfacial adhesion between filler and matrix, and also between the two fillers, plays a vital role in determining the modulus of the hybrid composites. A significant observation from this study is that the high fibre loading usually required for efficient reinforcement of polymers can be substantially reduced when the nanofiller is present alongside a much lower fibre content. Hybrid composites with both nanokaolinite clay and micron-sized E-glass fibre as reinforcements in a PP/HDPE matrix will generate a novel class of high-performance, cost-effective engineering materials.
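For reference, the two micromechanical models named above, the Reuss inverse rule of mixtures (iso-stress bound) and the Halpin-Tsai equation, can be written down in a few lines. This is a minimal illustrative sketch: the moduli, fibre aspect ratio and volume fractions are assumed example values, not data from the thesis.

```python
# Hedged sketch of the Reuss (iso-stress, IROM) bound and the Halpin-Tsai estimate
# for the longitudinal modulus of an aligned short-fibre composite.
# All numerical inputs below are illustrative assumptions.

def reuss_modulus(E_m: float, E_f: float, V_f: float) -> float:
    """Iso-stress (Reuss/IROM) lower-bound estimate of the composite modulus."""
    return 1.0 / (V_f / E_f + (1.0 - V_f) / E_m)

def halpin_tsai_modulus(E_m: float, E_f: float, V_f: float, aspect_ratio: float) -> float:
    """Halpin-Tsai estimate for aligned short fibres (longitudinal direction)."""
    xi = 2.0 * aspect_ratio               # shape factor, xi = 2 (l/d)
    eta = (E_f / E_m - 1.0) / (E_f / E_m + xi)
    return E_m * (1.0 + xi * eta * V_f) / (1.0 - eta * V_f)

# Example: hypothetical PP/HDPE matrix (1.2 GPa) reinforced with E-glass fibres (72 GPa)
E_m, E_f = 1.2, 72.0                      # GPa
for V_f in (0.05, 0.10, 0.20):
    print(f"V_f={V_f:.2f}  Reuss={reuss_modulus(E_m, E_f, V_f):.2f} GPa  "
          f"Halpin-Tsai={halpin_tsai_modulus(E_m, E_f, V_f, aspect_ratio=20):.2f} GPa")
```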

Relevance:

20.00%

Publisher:

Abstract:

For several years Latin America has been undergoing an epochal transformation. The neoliberal model is in crisis. The policies of the "Washington Consensus" and the dictum of a globalization beyond political control are increasingly being questioned. Out of the turn to the left, new alternative policy concepts have developed. This working paper examines the relationship between social movements, ideologies and governments. In her diagnosis, Maristella Svampa works out the ambivalent characteristics of the current change in Latin America. This is followed by an analytical approach to the various ideological traditions that shape the resistance sector. Finally, the analysis of the four most important tendencies presents some of the most important data on the region. These tendencies include the advance of indigenous struggles, the consolidation of new forms of struggle, the reactivation of the national-popular tradition, and the return of "Desarrollismo". The latter is supported by progressive as well as more conservative-neoliberal governments.

Relevance:

20.00%

Publisher:

Abstract:

This doctoral thesis develops and applies an accurate method for determining ground-state properties of strongly correlated electrons in lattice models. In lattice density functional theory (LDFT), the single-particle density matrix γ is the fundamental variable. On the basis of a generalized Hohenberg-Kohn theorem, the ground-state energy E_gs = E[γ_gs] = min_γ E[γ] is obtained by minimizing the energy functional E[γ] over all physical, i.e. representable, γ. The energy functional can be split into two contributions: the kinetic-energy functional T[γ], whose linear dependence on γ is known exactly, and the correlation-energy functional W[γ], whose dependence on γ is not known explicitly. Finding accurate approximations to W[γ] constitutes the actual challenge of this thesis. Part of this work builds on previous studies in which an approximation to W[γ] for the Hubbard model was derived from scaling hypotheses and exact analytical results for the dimer. However, that ansatz is limited to spin-independent and homogeneous systems. To extend the range of applicability of LDFT, we develop three different approaches to deriving W[γ] that allow the study of systems with broken symmetry. First, the existing scaling functional is extended to systems with charge transfer. A systematic study of the dependence of W[γ] on the charge distribution yields scaling properties similar to those of the homogeneous case. An extension to the Hubbard model on bipartite lattices is then derived and applied to both finite and infinite systems with repulsive and attractive interactions, and the high accuracy of this functional is demonstrated. It proves difficult, however, to transfer this approach to more complex systems, since the evaluation of W[γ] treats the system as a whole. To overcome this problem, we derive a further approximation based on local scaling properties. This functional is formulated locally with respect to the lattice sites and is therefore applicable to any kind of ordered or disordered Hamiltonian with local interactions. As applications we investigate the metal-insulator transition in the ionic Hubbard model in one and two dimensions, as well as in one-dimensional Hubbard chains with nearest- and next-nearest-neighbour hopping. Finally, we develop a numerical scheme for computing W[γ] based on exact diagonalizations of an effective many-body Hamiltonian describing a cluster embedded in an effective medium. This effective Hamiltonian depends on the density matrix γ and yields approximations to W[γ] whose quality improves systematically with increasing cluster size. The formulation is spin-dependent and allows a straightforward generalization to correlated multi-orbital systems, for example the spd Hamiltonian. Moreover, it incorporates the effects of short-range charge and spin fluctuations into the functional. For the Hubbard model, the accuracy of the method is illustrated by comparison with Bethe-ansatz results (1D) and quantum Monte Carlo simulations (2D). The thesis closes with an outlook on relevant future developments of this theory.
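To make the role of the dimer concrete: the ground state of the Hubbard dimer, which underlies the scaling approximations to W[γ] mentioned above, follows from a small exact diagonalization. The sketch below is an illustrative reconstruction under the usual conventions (hopping t, on-site interaction U, half filling in the S_z = 0 sector), not code from the thesis.

```python
# Hedged sketch: exact diagonalization of the half-filled Hubbard dimer in the
# S_z = 0 sector spanned by {|up,dn>, |dn,up>, |updn,0>, |0,updn>}, compared with
# the analytic ground-state energy (U - sqrt(U^2 + 16 t^2)) / 2.
import numpy as np

def hubbard_dimer_ground_state(t: float, U: float):
    """Return the ground-state energy and the average double occupancy per site."""
    H = np.array([[0.0, 0.0, -t,  -t ],
                  [0.0, 0.0,  t,   t ],
                  [-t,   t,   U,  0.0],
                  [-t,   t,  0.0,  U ]])
    eigvals, eigvecs = np.linalg.eigh(H)
    gs = eigvecs[:, 0]
    # double occupancy: weight on the two doubly occupied basis states, per site
    docc = 0.5 * (gs[2] ** 2 + gs[3] ** 2)
    return eigvals[0], docc

t, U = 1.0, 4.0
E_num, docc = hubbard_dimer_ground_state(t, U)
E_exact = 0.5 * (U - np.sqrt(U ** 2 + 16 * t ** 2))
print(f"E_gs (numerical) = {E_num:.6f}, E_gs (analytic) = {E_exact:.6f}, <d> = {docc:.4f}")
```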

Relevance:

20.00%

Publisher:

Abstract:

The main focus of this PhD thesis is the growth of III-V semiconductor nanostructures (quantum dots (QDs) and quantum dashes) on silicon substrates using the molecular beam epitaxy (MBE) technique. The investigation of the influence of the major growth parameters on their basic properties (density, geometry, composition, size, etc.) and the systematic characterization of their structural and optical properties form the core of the research work. The monolithic integration of III-V optoelectronic devices with silicon electronic circuits could bring enormous prospects for the existing semiconductor technology. Our challenging approach is to combine the superior passive optical properties of silicon with the superior optical emission properties of III-V materials by reducing the amount of III-V material to the very limit of the active region. Different heteroepitaxial integration approaches have been investigated to overcome the materials issues between III-V and Si. These include the self-assembled growth of InAs and InGaAs QDs in silicon and GaAs matrices directly on flat silicon substrates, the site-controlled growth of (GaAs/In0.15Ga0.85As/GaAs) QDs on pre-patterned Si substrates, and the direct growth of GaP on Si using migration enhanced epitaxy (MEE) and MBE growth modes. An efficient ex-situ buffered-HF (BHF) and in-situ surface cleaning sequence, based on atomic hydrogen (AH) cleaning at 500 °C combined with thermal oxide desorption in the temperature range of 700-900 °C, has been established. Complete oxide removal was confirmed by streaky reflection high-energy electron diffraction (RHEED) patterns indicating a smooth 2D surface reconstruction prior to MBE growth. The evolution of size, density and shape of the QDs is characterized ex situ by atomic force microscopy (AFM) and transmission electron microscopy (TEM). The InAs QD density increases strongly from 10^8 to 10^11 cm^-2 at V/III ratios in the range of 15-35 (beam equivalent pressure values). InAs QD formation is not observed at temperatures of 500 °C and above. Growth experiments on (111) substrates show orientation-dependent QD formation behaviour; a significant shape and size transition towards elongated InAs quantum dots and dashes is observed on the (111) orientation and at a higher indium growth rate of 0.3 ML/s. 2D strain mapping derived from high-resolution TEM of InAs QDs embedded in a silicon matrix confirmed semi-coherent and fully relaxed QDs embedded in a defect-free silicon matrix. The strain is relieved by dislocation loops localized exclusively along the InAs/Si interfaces and by partial dislocations with stacking faults inside the InAs clusters. The site-controlled growth of GaAs/In0.15Ga0.85As/GaAs nanostructures has been demonstrated for the first time with 1 μm spacing and very low nominal deposition thicknesses, directly on pre-patterned Si without the use of a SiO2 mask. A thin planar GaP layer was successfully grown by migration enhanced epitaxy (MEE) to initiate a planar GaP wetting layer at the polar/non-polar interface, which acts as a virtual GaP substrate for the subsequent GaP MBE growth, giving a total thickness of 50 nm; the best root-mean-square (RMS) roughness value was as good as 1.3 nm. These results are highly encouraging for the realization of III-V optical devices on silicon for potential applications.

Relevance:

20.00%

Publisher:

Abstract:

We formulate density estimation as an inverse operator problem. We then use convergence results of empirical distribution functions to true distribution functions to develop an algorithm for multivariate density estimation. The algorithm is based upon a Support Vector Machine (SVM) approach to solving inverse operator problems. The algorithm is implemented and tested on simulated data from different distributions and different dimensionalities, Gaussians and Laplacians in $R^2$ and $R^{12}$. A comparison in performance is made with Gaussian Mixture Models (GMMs). Our algorithm does as well as or better than the GMMs for the simulations tested and has the added advantage of being automated with respect to parameters.
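The inverse-operator formulation can be illustrated without the SVM machinery: expand the density in kernels centred on the data and require the mixture's cumulative distribution function to match the empirical one. The sketch below is a simplified one-dimensional stand-in, using non-negative least squares instead of the paper's SVM solver and an arbitrary bandwidth; it is not the authors' algorithm.

```python
# Hedged sketch of the inverse-operator idea: fit kernel weights so that the
# mixture CDF matches the empirical CDF at the sample points. Bandwidth and the
# NNLS solver are illustrative assumptions.
import numpy as np
from scipy.optimize import nnls
from scipy.stats import norm

def inverse_operator_density_1d(x: np.ndarray, bandwidth: float = 0.3):
    """Return (centres, weights) of a Gaussian mixture whose CDF matches the empirical CDF."""
    x = np.sort(x)
    n = len(x)
    F_emp = (np.arange(1, n + 1) - 0.5) / n          # empirical CDF at the sample points
    # A[i, j] = CDF of the j-th kernel evaluated at x[i] (the "operator" applied to the density)
    A = norm.cdf((x[:, None] - x[None, :]) / bandwidth)
    w, _ = nnls(A, F_emp)                            # non-negative weights
    w /= w.sum()                                     # normalise so the density integrates to 1
    return x, w

def density(t, centres, weights, bandwidth=0.3):
    return np.sum(weights * norm.pdf((t - centres) / bandwidth) / bandwidth)

rng = np.random.default_rng(0)
sample = rng.normal(loc=0.0, scale=1.0, size=200)
centres, weights = inverse_operator_density_1d(sample)
print(f"estimated p(0) = {density(0.0, centres, weights):.3f}  (true value ~ 0.399)")
```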

Relevance:

20.00%

Publisher:

Abstract:

In this paper we focus on the problem of estimating a bounded density using a finite combination of densities from a given class. We consider the maximum likelihood estimation (MLE) procedure and the greedy procedure described by Li and Barron. Approximation and estimation bounds are given for both methods. We extend and improve upon the estimation results of Li and Barron, and in particular prove an $O(\frac{1}{\sqrt{n}})$ bound on the estimation error which does not depend on the number of densities in the estimated combination.
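For orientation, the greedy procedure builds the mixture one component at a time, blending each new component into the current estimate with a decreasing step size. The sketch below is an illustrative one-dimensional version with fixed-width Gaussian components, a candidate grid for the new component's centre, and a step schedule of 2/(k+1); these choices are assumptions made for the example, not the exact setup analysed by Li and Barron.

```python
# Hedged sketch of a greedy mixture-building procedure: at each step one new
# Gaussian component is added, chosen from a candidate grid by log-likelihood.
# Grid, bandwidth and step schedule are illustrative assumptions.
import numpy as np
from scipy.stats import norm

def greedy_mixture_density(x: np.ndarray, n_components: int = 10, sigma: float = 0.5):
    """Greedily build a finite mixture of fixed-width Gaussians by likelihood improvement."""
    candidates = np.linspace(x.min(), x.max(), 50)           # candidate component centres
    f = np.zeros_like(x)                                     # current mixture density at the data
    centres, weights = [], []
    for k in range(1, n_components + 1):
        alpha = 1.0 if k == 1 else 2.0 / (k + 1)
        best_c, best_ll, best_f = None, -np.inf, None
        for c in candidates:                                 # pick the best single new component
            phi = norm.pdf(x, loc=c, scale=sigma)
            f_new = (1.0 - alpha) * f + alpha * phi
            ll = np.sum(np.log(f_new + 1e-300))
            if ll > best_ll:
                best_c, best_ll, best_f = c, ll, f_new
        f = best_f
        weights = [(1.0 - alpha) * w for w in weights] + [alpha]
        centres.append(best_c)
    return np.array(centres), np.array(weights)

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 0.7, 300), rng.normal(2, 0.7, 300)])
c, w = greedy_mixture_density(data)
print("component centres:", np.round(c, 2))
print("weights:", np.round(w, 3))
```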

Relevance:

20.00%

Publisher:

Abstract:

High-density, uniform GaN nanodot arrays with controllable size have been synthesized using template-assisted selective growth. GaN nanodots with average diameters of 40 nm, 80 nm and 120 nm were selectively grown by metalorganic chemical vapor deposition (MOCVD) on a nano-patterned SiO2/GaN template. The nanoporous SiO2 on the GaN surface was created by inductively coupled plasma (ICP) etching using an anodic aluminum oxide (AAO) template as a mask. This selective regrowth results in highly crystalline GaN nanodots, as confirmed by high-resolution transmission electron microscopy. The narrow size distribution and uniform spatial positioning of the nanoscale dots offer potential advantages over self-assembled dots grown in the Stranski–Krastanow mode.

Relevance:

20.00%

Publisher:

Abstract:

Compositional data analysis motivated the introduction of a complete Euclidean structure in the simplex of D parts. This was based on the early work of J. Aitchison (1986) and completed recently when the Aitchison distance in the simplex was associated with an inner product and orthonormal bases were identified (Aitchison and others, 2002; Egozcue and others, 2003). A partition of the support of a random variable generates a composition by assigning the probability of each interval to a part of the composition. One can imagine refining the partition further and further, so that the probability density comes to represent a kind of continuous composition of probabilities in a simplex of infinitely many parts. This intuitive idea leads to a Hilbert space of probability densities by generalizing the Aitchison geometry for compositions in the simplex to the set of probability densities.
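The finite-D version of this construction is easy to make concrete: bin two densities on the same partition, treat the interval probabilities as compositions, and compare them with the clr-based Aitchison geometry. The two normal densities and the ten-part partition in the sketch below are illustrative choices, not examples from the paper.

```python
# Hedged sketch: binning a density over a partition of its support gives a
# composition, and the clr coordinates provide the Aitchison inner product and
# distance that equip these compositions with a Euclidean structure.
import numpy as np
from scipy.stats import norm

def composition_from_density(dist, edges):
    """Interval probabilities of the partition, closed so that the parts sum to 1."""
    parts = np.diff(dist.cdf(edges))
    return parts / parts.sum()

def clr(x):
    """Centred log-ratio coordinates of a composition."""
    logx = np.log(x)
    return logx - logx.mean()

edges = np.linspace(-3.0, 3.0, 11)                 # partition of the support into D = 10 parts
p = composition_from_density(norm(0.0, 1.0), edges)
q = composition_from_density(norm(0.5, 1.2), edges)

inner = float(np.dot(clr(p), clr(q)))              # Aitchison inner product
dist = float(np.linalg.norm(clr(p) - clr(q)))      # Aitchison distance
print(f"Aitchison inner product: {inner:.4f}, Aitchison distance: {dist:.4f}")
```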

Relevance:

20.00%

Publisher:

Abstract:

In a seminal paper, Aitchison and Lauder (1985) introduced classical kernel density estimation techniques in the context of compositional data analysis. They gave two options for the choice of the kernel to be used in the kernel estimator. One of these kernels is based on the use of the alr transformation on the simplex S^D together with the normal distribution on R^{D-1}. However, these authors themselves recognized that this method has some deficiencies. A method for overcoming these difficulties, based on recent developments in compositional data analysis and multivariate kernel estimation theory and combining the ilr transformation with a normal density with a full bandwidth matrix, was recently proposed in Martín-Fernández, Chacón and Mateu-Figueras (2006). Here we present an extensive simulation study that compares both methods in practice, thus exploring the finite-sample behaviour of both estimators.
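The alr-plus-normal option can be written compactly: transform the data to R^(D-1), apply an ordinary multivariate normal kernel there, and pull the estimate back to the simplex with the Jacobian of the transformation. The sketch below uses a single scalar bandwidth and a synthetic Dirichlet sample purely for illustration; the simulation study in the paper uses different settings.

```python
# Hedged sketch of an alr-based kernel density estimator on the simplex.
# A single scalar bandwidth h is an illustrative simplification of the
# full bandwidth matrix discussed above.
import numpy as np

def alr(x):
    """Additive log-ratio transform, last part as the divisor."""
    x = np.asarray(x, dtype=float)
    return np.log(x[..., :-1] / x[..., -1:])

def alr_kde(x0, data, h=0.4):
    """KDE of a composition x0 in S^D from compositional data, via the alr transform."""
    y0, Y = alr(x0), alr(data)
    d = Y.shape[1]                                   # dimension D - 1
    diffs = (Y - y0) / h
    kernel = np.exp(-0.5 * np.sum(diffs ** 2, axis=1)) / ((2 * np.pi) ** (d / 2) * h ** d)
    density_alr = kernel.mean()
    jacobian = 1.0 / np.prod(x0)                     # |det(d alr / d x)| = 1 / (x_1 ... x_D)
    return density_alr * jacobian

rng = np.random.default_rng(2)
# illustrative 3-part compositional sample drawn from a Dirichlet distribution
data = rng.dirichlet([4.0, 2.0, 3.0], size=500)
x0 = np.array([0.45, 0.22, 0.33])
print(f"estimated density at {x0}: {alr_kde(x0, data):.3f}")
```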

Relevance:

20.00%

Publisher:

Abstract:

Functional Data Analysis (FDA) deals with samples in which a whole function is observed for each individual. A particular case of FDA arises when the observed functions are density functions, which are also an example of infinite-dimensional compositional data. In this work we compare several methods of dimensionality reduction for this particular type of data: functional principal component analysis (PCA), with or without a previous data transformation, and multidimensional scaling (MDS) for different inter-density distances, one of them taking into account the compositional nature of density functions. The different methods are applied to both artificial and real data (household income distributions).
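One of the transformation-then-PCA pipelines can be sketched in a few lines: discretise each density on a common grid, apply the centred log-ratio (clr) transformation to respect its compositional nature, and run an ordinary PCA on the resulting coordinates. The synthetic log-normal "income" densities below are an illustrative stand-in, not the household income data used in the paper.

```python
# Hedged sketch: clr transformation of discretised densities followed by plain PCA.
# The grid, the sample size and the log-normal densities are illustrative choices.
import numpy as np

def clr_rows(P):
    """Row-wise centred log-ratio transform of discretised densities (rows sum to 1)."""
    L = np.log(P)
    return L - L.mean(axis=1, keepdims=True)

def pca(X, n_components=2):
    """Plain PCA via SVD of the centred data matrix; returns scores and explained variance ratio."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]
    evr = (s ** 2) / np.sum(s ** 2)
    return scores, evr[:n_components]

rng = np.random.default_rng(3)
grid = np.linspace(0.1, 10, 60)                       # common evaluation grid
densities = []
for _ in range(40):                                   # 40 synthetic "income" densities
    mu, sigma = rng.normal(1.0, 0.2), rng.uniform(0.4, 0.8)
    pdf = np.exp(-(np.log(grid) - mu) ** 2 / (2 * sigma ** 2)) / (grid * sigma * np.sqrt(2 * np.pi))
    densities.append(pdf / pdf.sum())                 # close each density to a discrete composition
P = np.array(densities)
scores, evr = pca(clr_rows(P))
print("explained variance ratio of the first two clr-PCA components:", np.round(evr, 3))
```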

Relevance:

20.00%

Publisher:

Abstract:

We present a new approach to modelling and classifying breast parenchymal tissue. Given a mammogram, we first discover the distribution of the different tissue densities in an unsupervised manner, and second, we use this tissue distribution to perform the classification. We achieve this using a classifier based on local descriptors and probabilistic Latent Semantic Analysis (pLSA), a generative model from the statistical text literature. We studied the influence of different descriptors, such as texture and SIFT features, at the classification stage, showing that textons outperform SIFT in all cases. Moreover, we demonstrate that pLSA automatically extracts meaningful latent aspects, generating a compact tissue representation based on their densities that is useful for discrimination in mammogram classification. We show the results of tissue classification over the MIAS and DDSM datasets and compare our method with approaches that have classified the same datasets, showing a better performance of our proposal.

Relevance:

20.00%

Publisher:

Abstract:

It has been shown that the accuracy of mammographic abnormality detection methods is strongly dependent on breast tissue characteristics, with a dense breast drastically reducing detection sensitivity. In addition, breast tissue density is widely accepted to be an important risk indicator for the development of breast cancer. Here, we describe the development of an automatic breast tissue classification methodology, which can be summarized in a number of distinct steps: 1) segmentation of the breast area into fatty versus dense mammographic tissue; 2) extraction of morphological and texture features from the segmented breast areas; and 3) the use of a Bayesian combination of a number of classifiers. The evaluation, based on a large number of cases from two different mammographic data sets, shows a strong correlation ( and 0.67 for the two data sets) between automatic and expert-based Breast Imaging Reporting and Data System mammographic density assessment.

Relevance:

20.00%

Publisher:

Abstract:

A recent trend in digital mammography is computer-aided diagnosis systems, which are computerised tools designed to assist radiologists. Most of these systems are used for the automatic detection of abnormalities. However, recent studies have shown that their sensitivity decreases significantly as the density of the breast increases, and this dependence is method specific. In this paper we propose a new approach to the classification of mammographic images according to their breast parenchymal density. Our classification uses information extracted from segmentation results and is based on the underlying breast tissue texture. Classification performance was evaluated on a large set of digitised mammograms, using different classifiers and a leave-one-out methodology. The results demonstrate the feasibility of estimating breast density using image processing and analysis techniques.
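To make the evaluation protocol concrete, the sketch below wires a simple texture descriptor into a classifier scored with leave-one-out cross-validation. The descriptor, the kNN classifier, and the synthetic "segmented regions" are all illustrative assumptions; they are not the features, classifiers or data used in the paper.

```python
# Hedged sketch of a leave-one-out evaluation of texture-based tissue classification.
# The tiny hand-rolled descriptor and the synthetic regions are stand-ins only.
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def texture_features(region: np.ndarray) -> np.ndarray:
    """Very small texture descriptor: mean, std, and horizontal/vertical gradient energy."""
    gx = np.diff(region, axis=1)
    gy = np.diff(region, axis=0)
    return np.array([region.mean(), region.std(), (gx ** 2).mean(), (gy ** 2).mean()])

# synthetic stand-in data: 60 "segmented breast regions" belonging to 3 density classes
rng = np.random.default_rng(4)
X, y = [], []
for label, (mu, sigma) in enumerate([(0.3, 0.05), (0.5, 0.10), (0.7, 0.15)]):
    for _ in range(20):
        region = rng.normal(mu, sigma, size=(32, 32))
        X.append(texture_features(region))
        y.append(label)
X, y = np.array(X), np.array(y)

scores = cross_val_score(KNeighborsClassifier(n_neighbors=3), X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy on the synthetic data: {scores.mean():.2f}")
```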