998 results for INTERMEDIATE DENSITY


Relevance: 20.00%

Publisher:

Abstract:

The Kr 4s-electron photoionization cross section as a function of the exciting-photon energy in the range between 30 eV and 90 eV was calculated using the configuration interaction (CI) technique in intermediate coupling. In the calculations the 4p spin-orbit interaction and corrections due to higher orders of perturbation theory (the so-called correlational decrease of the Coulomb interaction) were considered. Energies of Kr II states were calculated and agree with spectroscopic data to within 10 meV. For some of the Kr II states, new assignments were suggested on the basis of the largest component among the calculated CI wavefunctions.

Abstract:

The classical scattering cross section of two colliding nuclei at intermediate and relativistic energies is reevaluated. The influence of retardation and magnetic-field effects is taken into account. Corrections due to electron screening as well as due to attractive nuclear forces are discussed. This paper is an addendum to [1].

Abstract:

Type and rate of fertilizers influence the level of soil organic carbon (Corg) and total nitrogen (Nt) markedly, but the effect on C and N partitioning into different pools is open to question. The objectives of the present work were to: (i) quantify the impact of fertilizer type and rate on labile, intermediate and passive C and N pools by using a combination of biological, chemical and mathematical methods; (ii) explain previously reported differences in the soil organic matter (SOM) levels between soils receiving farmyard manure with or without biodynamic preparations by using Corg time series and information on SOM partitioning; and (iii) quantify the long-term and short-term dynamics of SOM in density fractions and microbial biomass as affected by fertilizer type and rate, and determine the incorporation of crop residues into labile SOM fractions. Samples were taken from a sandy Cambisol from the long-term fertilization trial in Darmstadt, Germany, founded in 1980. The nine treatments (four field replicates) were: straw incorporation plus application of mineral fertilizer (MSI) and application of rotted farmyard manure with (DYN) or without (FYM) addition of biodynamic preparations, each at high (140-150 kg N ha-1 year-1; MSIH, DYNH, FYMH), medium (100 kg N ha-1 year-1; MSIM, DYNM, FYMM) and low (50-60 kg N ha-1 year-1; MSIL, DYNL, FYML) rates. The main findings were: (i) The stocks of Corg (t ha-1) were affected by fertilizer type and rate and increased in the order MSIL (23.6), MSIM (23.7), MSIH (24.2) < FYML (25.3) < FYMM (28.1), FYMH (28.1). Stocks of Nt were affected in the same way (C/N ratio: 11). Storage of C and N in the modelled labile pools (turnover times: 462 and 153 days for C and N, respectively) was not influenced by the type of fertilizer (FYM and MSI) but depended significantly (p ≤ 0.05) on the application rate and ranged from 1.8 to 3.2 t C ha-1 (7-13% of Corg) and from 90 to 140 kg N ha-1 (4-5% of Nt).
In the calculated intermediate pool (C/N ratio 7), stocks of C were markedly higher in FYM treatments (15-18 t ha-1) compared to MSI treatments (12-14 t ha-1). This showed that differences in SOM stocks in the sandy Cambisol induced by fertilizer rate may be short-lived in case of changing management, but differences induced by fertilizer type may persist for decades. (ii) Crop yields, estimated C inputs (1.5 t ha-1 year-1) with crop residue, microbial biomass C (Cmic, 118-150 mg kg-1), microbial biomass N (17-20 mg kg-1) and labile C and N pools did not differ significantly between FYM and DYN treatments. However, labile C increased linearly with application rate (R2 = 0.53) from 7 to 11% of Corg. This also applied to labile N (3.5 to 4.9% of Nt). The higher contents of Corg in DYN treatments had existed since 1982, when the first sampling was conducted for all individual treatments. Contents of Corg in DYN and FYM treatments have converged slightly since then. Furthermore, at least 30% of the difference in Corg was located in the passive pool, where a treatment effect could be excluded. Therefore, the reported differences in Corg contents most likely existed since the beginning of the experiment and, as a single factor of biodynamic agriculture, application of biodynamic preparations had no effect on SOM stocks. (iii) Stocks of SOM, light fraction organic C (LFOC, ρ ≤ 2.0 g cm-3), light fraction organic N and Cmic decreased in the order FYMH > FYML > MSIH, MSIL for all sampling dates in 2008 (March, May, September, December). However, statistical significance of treatment effects differed between the dates, probably due to differences in the spatial variation throughout the year. The high proportion of LFOC in total Corg stocks (45-55%) highlighted the importance of selective preservation of OM as a stabilization mechanism in this sandy Cambisol.
The apparent turnover time of LFOC was between 21 and 32 years, which agreed well with studies covering substantially longer periods of vegetation change than ours. Overall, both approaches, (I) the combination of incubation, chemical fractionation and simple modelling and (II) the density fractionation, provided complementary information on the partitioning of SOM into pools of different stability. The density fractionation showed that differences in Corg stocks between FYM and MSI treatments were mainly located in the light fraction, i.e. induced by higher recalcitrance of the organic input in the FYM treatments. Moreover, the combination of biological, chemical and mathematical methods indicated that effects of fertilizer rate on total Corg and Nt stocks may be short-lived, but that the effect of fertilizer type may persist for longer time spans in the sandy Cambisol.
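The labile pool above is characterized by its turnover time; as an illustration of how such a first-order pool behaves, here is a minimal sketch. The 462-day turnover time comes from the abstract, while the initial stock and daily input rate are purely illustrative assumptions, not values from the study.

```python
def pool_dynamics(c0, inputs, turnover_days, days):
    """First-order pool model: dC/dt = inputs - C / turnover (daily Euler steps)."""
    k = 1.0 / turnover_days        # decay rate per day
    c = c0
    for _ in range(days):
        c += inputs - k * c        # daily input minus first-order decay
    return c

# Labile C pool with a 462-day turnover time (from the abstract);
# c0 (t C ha-1) and the daily input are illustrative assumptions.
c_end = pool_dynamics(c0=2.5, inputs=0.004, turnover_days=462.0, days=3650)

# For constant inputs the pool relaxes toward the steady state inputs * turnover.
steady = 0.004 * 462.0
```

After ten simulated years (many multiples of the turnover time), the pool has essentially reached its steady state, which is why rate-induced differences in labile stocks can fade quickly once management changes.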

Abstract:

In this doctoral thesis, an accurate method for determining ground-state properties of strongly correlated electrons in the framework of lattice models is developed and applied. In lattice density functional theory (LDFT), the single-particle density matrix γ is the fundamental variable. On the basis of a generalized Hohenberg-Kohn theorem, the ground-state energy E_gs[γ_gs] = min_γ E[γ] is obtained by minimizing the energy functional E[γ] over all physical, i.e. representable, γ. The energy functional can be split into two contributions: the kinetic-energy functional T[γ], whose linear dependence on γ is known exactly, and the correlation-energy functional W[γ], whose dependence on γ is not explicitly known. Finding accurate approximations to W[γ] constitutes the actual challenge of this thesis. Part of this work builds on previous studies in which an approximation to the functional W[γ] for the Hubbard model was derived, based on scaling hypotheses and exact analytical results for the dimer. However, this ansatz is limited to spin-independent and homogeneous systems. To broaden the scope of LDFT, we develop three different approaches to deriving W[γ] that enable the study of systems with broken symmetry. First, the existing scaling functional is extended to systems with charge transfer. A systematic investigation of the dependence of the functional W[γ] on the charge distribution yields scaling properties similar to those of the homogeneous case. Subsequently, an extension to the Hubbard model on bipartite lattices is derived and applied to both finite and infinite systems with repulsive and attractive interactions. The high accuracy of this functional is demonstrated.
However, transferring this ansatz to more complex systems proves difficult, since the computation of W[γ] treats the system as a whole. To overcome this problem, we derive a further approximation based on local scaling properties. This functional is formulated locally with respect to the lattice sites and is therefore applicable to any kind of ordered or disordered Hamiltonian with local interactions. As applications, we investigate the metal-insulator transition in the ionic Hubbard model in one and two dimensions, as well as in one-dimensional Hubbard chains with nearest- and next-nearest-neighbor hopping. Finally, we develop a numerical scheme for computing W[γ] based on exact diagonalizations of an effective many-body Hamiltonian describing a cluster surrounded by an effective medium. This effective Hamiltonian depends on the density matrix γ and allows the derivation of approximations to W[γ] whose quality improves systematically with increasing cluster size. The formulation is spin-dependent and permits a direct generalization to correlated systems with several orbitals, such as the spd Hamiltonian. Moreover, it incorporates the effects of short-range charge and spin fluctuations into the functional. For the Hubbard model, the accuracy of the method is illustrated by comparison with Bethe-ansatz results (1D) and quantum Monte Carlo simulations (2D). Finally, an outlook on relevant future developments of this theory is given.
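The approximations above are anchored in exact analytical results for the Hubbard dimer. As a hedged illustration (not code from the thesis), the half-filled two-site Hubbard model can be diagonalized directly and checked against the well-known closed-form ground-state energy:

```python
import numpy as np

def hubbard_dimer_gs(t, U):
    """Ground-state energy of the half-filled two-site Hubbard model (Sz = 0 sector).

    Basis: {|up,dn>, |dn,up>, |updn,0>, |0,updn>}. A symmetric sign convention
    is used for the hopping elements; the spectrum is unchanged by this choice.
    """
    H = np.array([[0.0, 0.0,  -t,  -t],
                  [0.0, 0.0,  -t,  -t],
                  [ -t,  -t,   U, 0.0],
                  [ -t,  -t, 0.0,   U]])
    return np.linalg.eigvalsh(H)[0]   # eigvalsh returns ascending eigenvalues

# Exact closed form for comparison: E_gs = (U - sqrt(U^2 + 16 t^2)) / 2.
t, U = 1.0, 4.0
e_num = hubbard_dimer_gs(t, U)
e_exact = 0.5 * (U - np.sqrt(U**2 + 16 * t**2))
```

Exact dimer results like this one are what the scaling hypotheses for W[γ] are calibrated against.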

Abstract:

General-purpose computing devices allow us to (1) customize computation after fabrication and (2) conserve area by reusing expensive active circuitry for different functions in time. We define RP-space, a restricted domain of the general-purpose architectural space focused on reconfigurable computing architectures. Two dominant features differentiate reconfigurable from special-purpose architectures and account for most of the area overhead associated with RP devices: (1) instructions which tell the device how to behave, and (2) flexible interconnect which supports task-dependent dataflow between operations. We can characterize RP-space by the allocation and structure of these resources and compare the efficiencies of architectural points across broad application characteristics. Conventional FPGAs fall at one extreme end of this space, and their efficiency ranges over two orders of magnitude across the space of application characteristics. Understanding RP-space and its consequences allows us to pick the best architecture for a task and to search for more robust design points in the space. Our DPGA, a fine-grained computing device which adds small, on-chip instruction memories to FPGAs, is one such design point. For typical logic applications and finite-state machines, a DPGA can implement tasks in one-third the area of a traditional FPGA. TSFPGA, a variant of the DPGA which focuses on heavily time-switched interconnect, achieves circuit densities close to the DPGA, while reducing typical physical mapping times from hours to seconds. Rigid, fabrication-time organization of instruction resources significantly narrows the range of efficiency for conventional architectures. To avoid this performance brittleness, we developed MATRIX, the first architecture to defer the binding of instruction resources until run-time, allowing the application to organize resources according to its needs.
Our focus MATRIX design point is based on an array of 8-bit ALU and register-file building blocks interconnected via a byte-wide network. With today's silicon, a single chip MATRIX array can deliver over 10 Gop/s (8-bit ops). On sample image processing tasks, we show that MATRIX yields 10-20x the computational density of conventional processors. Understanding the cost structure of RP-space helps us identify these intermediate architectural points and may provide useful insight more broadly in guiding our continual search for robust and efficient general-purpose computing structures.

Abstract:

We formulate density estimation as an inverse operator problem. We then use convergence results of empirical distribution functions to true distribution functions to develop an algorithm for multivariate density estimation. The algorithm is based upon a Support Vector Machine (SVM) approach to solving inverse operator problems. The algorithm is implemented and tested on simulated data from different distributions and different dimensionalities, Gaussians and Laplacians in $R^2$ and $R^{12}$. A comparison in performance is made with Gaussian Mixture Models (GMMs). Our algorithm does as well as or better than the GMMs for the simulations tested and has the added advantage of being automated with respect to parameters.
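The SVM estimator itself is not specified in this summary. Purely as an illustration of the underlying task, recovering a density from samples and comparing the estimate with the truth, here is a plain Gaussian kernel density estimator; the bandwidth and the simulated data are illustrative assumptions.

```python
import numpy as np

def gaussian_kde(samples, x, bandwidth):
    """Evaluate a 1-D Gaussian kernel density estimate at the points x."""
    diffs = (x[:, None] - samples[None, :]) / bandwidth
    kernels = np.exp(-0.5 * diffs**2) / np.sqrt(2.0 * np.pi)
    return kernels.mean(axis=1) / bandwidth   # average of kernels, rescaled

rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, size=2000)    # draw from a standard Gaussian
x = np.linspace(-4.0, 4.0, 81)

est = gaussian_kde(samples, x, bandwidth=0.3)
true = np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)
mean_abs_err = np.abs(est - true).mean()     # pointwise accuracy on the grid
```

Note the bandwidth here is hand-picked; the automation with respect to such parameters is exactly the advantage the abstract claims for the SVM approach.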

Abstract:

In this paper we focus on the problem of estimating a bounded density using a finite combination of densities from a given class. We consider the Maximum Likelihood Procedure (MLE) and the greedy procedure described by Li and Barron. Approximation and estimation bounds are given for the above methods. We extend and improve upon the estimation results of Li and Barron, and in particular prove an $O(1/\sqrt{n})$ bound on the estimation error which does not depend on the number of densities in the estimated combination.
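The greedy procedure of Li and Barron builds the mixture one component at a time. The sketch below is a simplified numpy rendition: the fixed candidate grid, the shared component width, and the mixing weight 1/k at step k are simplifying assumptions, not the paper's exact scheme.

```python
import numpy as np

def greedy_mixture(data, means, sigma, k_max):
    """Greedy mixture estimation in the spirit of Li & Barron: at step k,
    add the candidate component (with mixing weight 1/k) that maximizes
    the log-likelihood of the running mixture at the data points."""
    def comp(m):  # Gaussian component density evaluated at the data points
        return np.exp(-0.5 * ((data - m) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    f = np.zeros_like(data)   # running mixture density at the data points
    chosen = []
    for k in range(1, k_max + 1):
        alpha = 1.0 / k
        lls = [np.log((1 - alpha) * f + alpha * comp(m) + 1e-300).sum()
               for m in means]
        best = means[int(np.argmax(lls))]
        f = (1 - alpha) * f + alpha * comp(best)
        chosen.append(best)
    return f, chosen

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2.0, 0.5, 500), rng.normal(2.0, 0.5, 500)])
f, chosen = greedy_mixture(data, means=np.linspace(-4, 4, 33), sigma=0.5, k_max=4)
```

On this bimodal sample, the greedy steps quickly place components near both modes at -2 and +2; the paper's bound says the estimation error of such finite combinations does not grow with the number of components.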

Abstract:

High-density, uniform GaN nanodot arrays with controllable size have been synthesized by using template-assisted selective growth. GaN nanodots with average diameters of 40 nm, 80 nm and 120 nm were selectively grown by metalorganic chemical vapor deposition (MOCVD) on a nano-patterned SiO2/GaN template. The nanoporous SiO2 on the GaN surface was created by inductively coupled plasma (ICP) etching using an anodic aluminum oxide (AAO) template as a mask. This selective regrowth results in highly crystalline GaN nanodots, as confirmed by high-resolution transmission electron microscopy. The narrow size distribution and uniform spatial positioning of the nanoscale dots offer potential advantages over self-assembled dots grown in the Stranski–Krastanow mode.

Abstract:

Compositional data analysis motivated the introduction of a complete Euclidean structure in the simplex of D parts. This was based on the early work of J. Aitchison (1986) and completed recently when the Aitchison distance in the simplex was associated with an inner product and orthonormal bases were identified (Aitchison and others, 2002; Egozcue and others, 2003). A partition of the support of a random variable generates a composition by assigning the probability of each interval to a part of the composition. One can imagine that the partition can be refined and the probability density would represent a kind of continuous composition of probabilities in a simplex of infinitely many parts. This intuitive idea leads to a Hilbert space of probability densities by generalizing the Aitchison geometry for compositions in the simplex to the set of probability densities.
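The Euclidean structure referred to above can be made concrete with the centered log-ratio (clr) transform, under which the Aitchison distance becomes an ordinary Euclidean distance (the ilr bases of Egozcue and others are orthonormal coordinates of the same geometry). A minimal sketch with illustrative compositions:

```python
import numpy as np

def clr(x):
    """Centered log-ratio transform of a composition (positive parts)."""
    lx = np.log(x)
    return lx - lx.mean()          # clr coordinates always sum to zero

def aitchison_dist(x, y):
    """Aitchison distance = Euclidean distance between the clr images."""
    return np.linalg.norm(clr(x) - clr(y))

x = np.array([0.2, 0.3, 0.5])      # illustrative 3-part compositions
y = np.array([0.4, 0.4, 0.2])

d = aitchison_dist(x, y)
# The distance is invariant under rescaling (closure): only ratios matter.
d_scaled = aitchison_dist(10.0 * x, y)
```

The scale invariance shown in the last line is the key property that lets the construction pass from finitely many parts to densities, where only ratios of density values are meaningful.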

Abstract:

In a seminal paper, Aitchison and Lauder (1985) introduced classical kernel density estimation techniques in the context of compositional data analysis. Indeed, they gave two options for the choice of the kernel to be used in the kernel estimator. One of these kernels is based on the use of the alr transformation on the simplex SD jointly with the normal distribution on RD-1. However, these authors themselves recognized that this method has some deficiencies. A method for overcoming these difficulties, based on recent developments in compositional data analysis and multivariate kernel estimation theory, combining the ilr transformation with the use of the normal density with a full bandwidth matrix, was recently proposed in Martín-Fernández, Chacón and Mateu-Figueras (2006). Here we present an extensive simulation study that compares both methods in practice, thus exploring the finite-sample behaviour of both estimators.
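As a hedged sketch of the ilr-based route (using a scalar bandwidth rather than the full bandwidth matrix of the cited method), a kernel estimator for 3-part compositions can be written as follows; the Dirichlet sample is illustrative.

```python
import numpy as np

def ilr(x):
    """One standard ilr basis for 3-part compositions (others differ by rotation)."""
    z1 = np.log(x[..., 0] / x[..., 1]) / np.sqrt(2.0)
    z2 = np.log(x[..., 0] * x[..., 1] / x[..., 2] ** 2) / np.sqrt(6.0)
    return np.stack([z1, z2], axis=-1)

def simplex_kde(samples, query, h):
    """KDE for compositional data: Gaussian kernel in ilr coordinates.
    A scalar bandwidth h is a simplification of the full bandwidth matrix."""
    zs, zq = ilr(samples), ilr(query)
    diffs = (zq[:, None, :] - zs[None, :, :]) / h
    k = np.exp(-0.5 * (diffs ** 2).sum(-1)) / (2.0 * np.pi * h * h)
    return k.mean(axis=1)          # density in ilr coordinates at the query points

rng = np.random.default_rng(2)
raw = rng.dirichlet([5.0, 3.0, 2.0], size=500)   # illustrative compositions
dens = simplex_kde(raw, raw[:10], h=0.3)
```

Working in ilr coordinates keeps the estimate consistent with the simplex geometry, which is precisely what the alr-plus-normal kernel of Aitchison and Lauder fails to guarantee.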

Abstract:

Functional Data Analysis (FDA) deals with samples in which a whole function is observed for each individual. A particular case of FDA arises when the observed functions are density functions, which are also an example of infinite-dimensional compositional data. In this work we compare several methods of dimensionality reduction for this particular type of data: functional principal component analysis (PCA) with or without a previous data transformation, and multidimensional scaling (MDS) for different inter-density distances, one of them taking into account the compositional nature of density functions. The different methods are applied to both artificial and real data (household income distributions).
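One of the transformation-based routes mentioned above can be sketched by discretizing the densities, applying a clr-type log transform, and running PCA via the SVD. The sample of Gaussian density curves below is illustrative, not the household income data of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(-3.0, 3.0, 61)
dx = x[1] - x[0]

# A sample of 40 density curves: Gaussians with varying means and widths.
mus = rng.uniform(-1.0, 1.0, 40)
sds = rng.uniform(0.6, 1.4, 40)
dens = np.exp(-0.5 * ((x[None, :] - mus[:, None]) / sds[:, None]) ** 2)
dens /= dens.sum(axis=1, keepdims=True) * dx      # normalize on the grid

# clr-type transform: the compositional treatment of densities,
# removing the arbitrary per-curve scale before PCA.
clr = np.log(dens) - np.log(dens).mean(axis=1, keepdims=True)

# Functional PCA via SVD of the sample-centered clr curves.
centered = clr - clr.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()               # variance-explained ratios
```

For Gaussian curves, the clr images are quadratic polynomials, so two principal components capture essentially all variation; real income distributions would of course need more.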

Abstract:

We present a new approach to modelling and classifying breast parenchymal tissue. Given a mammogram, we first discover the distribution of the different tissue densities in an unsupervised manner, and second, we use this tissue distribution to perform the classification. We achieve this using a classifier based on local descriptors and probabilistic Latent Semantic Analysis (pLSA), a generative model from the statistical text literature. We studied the influence of different descriptors, such as texture and SIFT features, at the classification stage, showing that textons outperform SIFT in all cases. Moreover, we demonstrate that pLSA automatically extracts meaningful latent aspects, generating a compact tissue representation based on their densities that is useful for mammogram classification. We show the results of tissue classification over the MIAS and DDSM datasets, and compare our method with approaches that classified these same datasets, showing the better performance of our proposal.

Abstract:

It has been shown that the accuracy of mammographic abnormality detection methods is strongly dependent on breast tissue characteristics, where a dense breast drastically reduces detection sensitivity. In addition, breast tissue density is widely accepted to be an important risk indicator for the development of breast cancer. Here, we describe the development of an automatic breast tissue classification methodology, which can be summarized in a number of distinct steps: 1) the segmentation of the breast area into fatty versus dense mammographic tissue; 2) the extraction of morphological and texture features from the segmented breast areas; and 3) the use of a Bayesian combination of a number of classifiers. The evaluation, based on a large number of cases from two different mammographic data sets, shows a strong correlation ( and 0.67 for the two data sets) between automatic and expert-based Breast Imaging Reporting and Data System mammographic density assessment.
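Step 3, the Bayesian combination of classifiers, can be illustrated with a simple product-of-posteriors rule under a classifier-independence assumption; the four-class setup and all the numbers below are illustrative, not values from the paper.

```python
import numpy as np

def bayes_combine(posteriors, prior):
    """Combine the class posteriors of independent classifiers with the
    product rule: p(c | all) ∝ prior(c) * Π_i [ p(c | clf_i) / prior(c) ].
    Computed in log space for numerical stability."""
    posteriors = np.asarray(posteriors)            # shape (n_classifiers, n_classes)
    log_p = np.log(prior) + (np.log(posteriors) - np.log(prior)).sum(axis=0)
    p = np.exp(log_p - log_p.max())                # shift before exponentiating
    return p / p.sum()

# Illustrative: two classifiers scoring four density classes with a flat prior.
prior = np.array([0.25, 0.25, 0.25, 0.25])
clf1 = np.array([0.60, 0.20, 0.10, 0.10])
clf2 = np.array([0.50, 0.30, 0.10, 0.10])
combined = bayes_combine([clf1, clf2], prior)
```

When the classifiers agree, the combined posterior is sharper than either input, which is the point of combining several weak tissue classifiers in the methodology above.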

Abstract:

A recent trend in digital mammography is computer-aided diagnosis systems, which are computerised tools designed to assist radiologists. Most of these systems are used for the automatic detection of abnormalities. However, recent studies have shown that their sensitivity is significantly decreased as the density of the breast increases. This dependence is method specific. In this paper we propose a new approach to the classification of mammographic images according to their breast parenchymal density. Our classification uses information extracted from segmentation results and is based on the underlying breast tissue texture. Classification performance was assessed on a large set of digitised mammograms. The evaluation involves different classifiers and uses a leave-one-out methodology. Results demonstrate the feasibility of estimating breast density using image processing and analysis techniques.

Abstract:

In coronary artery disease (CAD), genetic, sociocultural, environmental and racial factors beyond the major cardiovascular risk factors may influence its presentation. The impact of race on the severity of coronary disease in the foreign patients referred to our service is unknown. Objectives: To compare the severity of multivessel CAD in a population of patients from the Antilles and national patients, matched by the Framingham score. Methods: We conducted a cross-sectional study comparing Colombian patients with patients from the Dutch Antilles with similar risk factors according to the Framingham score, classifying them into low, intermediate, high and very high risk groups. All had severe multivessel CAD documented by coronary angiography between January 2009 and June 2011. Patients with a history of previous percutaneous or surgical intervention were excluded. Results: 115 international and 115 national patients were enrolled. The male-to-female ratio was 3:1. The risk-group proportions were 2.5% low, 15% intermediate, 19.3% high and 63.4% very high risk. The Syntax Score was 14.3 +/- 7.4 in national patients and 22.2 +/- 10.5 in international patients (p = 0.002). Conclusions: Patients from the Dutch Antilles evaluated at our institution showed greater severity of coronary disease than a national population with similar risk factors. These findings suggest an influence of race and genetic factors on the severity and extent of CAD.