116 results for "Partial autocorrelations / spectral density"
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Geometric parameters of binary (1:1) PdZn and PtZn alloys with the CuAu-L10 structure were calculated with a density functional method. Based on the total energies, the two alloys are predicted to have equal formation energies. Calculated surface energies of the PdZn and PtZn alloys show that the (111) and (100) surfaces, which expose stoichiometric layers, are more stable than the (001) and (110) surfaces, which comprise alternating Pd (Pt) and Zn layers. The surface energies of the alloys lie between the surface energies of the individual components, but they differ from their composition-weighted averages. Compared with the pure metals, the valence d-band widths and the Pd or Pt partial densities of states at the Fermi level are dramatically reduced in the PdZn and PtZn alloys. The local valence d-band densities of states of Pd and Pt in the alloys resemble that of metallic Cu, suggesting that the similar catalytic performance of these systems can be related to this similarity in the local electronic structures.
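For orientation, the d-band centre and width invoked here are conventionally summarised by the first two moments of the projected d-density of states ρ_d; these standard definitions are not spelled out in the abstract:

```latex
\varepsilon_d = \frac{\int \varepsilon\,\rho_d(\varepsilon)\,\mathrm{d}\varepsilon}
                     {\int \rho_d(\varepsilon)\,\mathrm{d}\varepsilon},
\qquad
W_d = \left(\frac{\int (\varepsilon-\varepsilon_d)^2\,\rho_d(\varepsilon)\,\mathrm{d}\varepsilon}
                 {\int \rho_d(\varepsilon)\,\mathrm{d}\varepsilon}\right)^{1/2}
```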
Abstract:
Urban density analysis is used to examine the spatial distribution of population within urban areas, and it is quite useful for planning public services. In this article, sixteen classical functional forms of the relationship between density and distance are studied for the Barcelona metropolitan region and its eleven subcentres.
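As a minimal illustration of this family of models, the best-known functional form is the negative exponential, D(x) = D0 e^(-g x), which can be fitted by ordinary least squares after log-linearisation. The sketch below uses hypothetical density and distance data, not the paper's:

```python
import numpy as np

# Hypothetical data: tract density (people/km^2) and distance to the CBD (km).
distance = np.array([1.0, 2.5, 4.0, 6.0, 9.0, 12.0, 18.0])
density = np.array([21000, 15500, 9800, 6200, 3900, 2100, 900])

# Negative exponential model D(x) = D0 * exp(-g * x), linearised as
# ln D = ln D0 - g * x and fitted by ordinary least squares.
slope, intercept = np.polyfit(distance, np.log(density), deg=1)
D0, gradient = np.exp(intercept), -slope
print(f"central density D0 = {D0:.0f}, density gradient g = {gradient:.3f} per km")
```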
Abstract:
The presence of subcentres cannot be captured by an exponential function. Cubic spline functions seem more appropriate for depicting the polycentric pattern of modern urban systems. Using data from the Barcelona Metropolitan Region, two possible procedures for delimiting population subcentres are discussed: one takes an estimated derivative equal to zero, the other a density gradient equal to zero. It is argued that, when using a cubic spline function, a delimitation strategy based on derivatives is more appropriate than one based on gradients, because the estimated density can be negative in sections with very low densities and few observations, leading to sudden changes in the estimated gradients. It is also argued that taking a zero second derivative as the delimitation criterion captures a more restricted subcentre area than taking a zero first derivative. This methodology can also be used to delimit intermediate rings.
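A minimal sketch of the derivative-based criterion is given below. The data are hypothetical (a monocentric profile plus one secondary peak), and the zeros of the first derivative are located by a simple sign-change scan rather than the authors' exact procedure:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical density-distance profile with a secondary peak (a subcentre).
distance = np.linspace(0, 30, 60)
density = (12000 * np.exp(-0.25 * distance)
           + 3000 * np.exp(-0.5 * (distance - 18) ** 2)
           + np.random.default_rng(0).normal(0, 100, distance.size))

# Cubic smoothing spline of density on distance.
spline = UnivariateSpline(distance, density, k=3, s=len(distance) * 100**2)

# Candidate subcentre peaks: zeros of the first derivative, found by
# scanning a fine grid for sign changes.
grid = np.linspace(0, 30, 3000)
d1 = spline.derivative(1)(grid)
crossings = grid[:-1][np.sign(d1[:-1]) != np.sign(d1[1:])]
print("stationary points of the estimated density:", np.round(crossings, 2))
```

The same scan applied to spline.derivative(2) would implement the stricter second-derivative criterion discussed in the abstract.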
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
This paper aims at providing a Bayesian parametric framework to tackle the accessibility problem across space in urban theory. Adopting continuous variables in a probabilistic setting, we are able to associate the distribution density with Kendall's tau index and to replicate the general issues related to the role of proximity in a more general context. In addition, by referring to the Beta and Gamma distributions, we are able to introduce a differentiation feature in each spatial unit without relying on any a priori definition of territorial units. We also provide an empirical application of our theoretical setting to study the density distribution of the population across Massachusetts.
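For reference (the abstract does not define the index), the standard population version of Kendall's tau for a continuous pair (X, Y), with (X', Y') an independent copy, is:

```latex
\tau = \Pr\!\big[(X - X')(Y - Y') > 0\big] \;-\; \Pr\!\big[(X - X')(Y - Y') < 0\big].
```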
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
This paper explores two major issues from biophysical and historical viewpoints. We examine land management, which we define as the long-term maintenance of land fertility in relation to agriculture, fishery and forestry. We also explore humans' positive role as agents aiming to reinforce harmonious material circulation within the land. Liebig's view of nature, agriculture and land emphasizes the maintenance of long-term land fertility, based on his agronomical thought that the circulation of matter in agricultural fields must be maintained with manure as much as possible. The thoughts of several classical economists on nature, agriculture and land are reassessed from Liebig's viewpoint. The land management problem is then discussed at a much more fundamental level, to understand the necessary conditions for life in relation to land management. This point is analyzed in terms of two mechanisms: entropy disposal on the earth, and material circulation against the gravitational field. Finally, the historical example of the metropolis of Edo shows that there is yet another necessary condition for the sustainable management of land, based on the creation of harmonious material cycles among cities, farmland, forests and surrounding sea areas, in which humans play a vital role as agents.
Gaussian estimates for the density of the non-linear stochastic heat equation in any space dimension
Abstract:
In this paper, we establish lower and upper Gaussian bounds for the probability density of the mild solution to the stochastic heat equation with multiplicative noise and in any space dimension. The driving perturbation is a Gaussian noise which is white in time with some spatially homogeneous covariance. These estimates are obtained using tools of the Malliavin calculus. The most challenging part is the lower bound, which is obtained by adapting a general method developed by Kohatsu-Higa to the underlying spatially homogeneous Gaussian setting. Both lower and upper estimates have the same form: a Gaussian density with a variance which is equal to that of the mild solution of the corresponding linear equation with additive noise.
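Schematically (the abstract does not display the bounds; the constants c1, c2, C1, C2 and the centring m below are generic), both estimates take the form of a Gaussian density whose variance σ²_{t,x} is that of the mild solution of the corresponding linear equation with additive noise:

```latex
\frac{c_1}{\sigma_{t,x}}\exp\!\left(-\frac{(y - m)^2}{c_2\,\sigma_{t,x}^{2}}\right)
\;\le\; p_{t,x}(y) \;\le\;
\frac{C_1}{\sigma_{t,x}}\exp\!\left(-\frac{(y - m)^2}{C_2\,\sigma_{t,x}^{2}}\right),
```

where p_{t,x} denotes the density of the solution u(t, x) at a fixed point (t, x).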
Abstract:
Functional Data Analysis (FDA) deals with samples where a whole function is observed for each individual. A particular case of FDA arises when the observed functions are density functions, which are also an example of infinite-dimensional compositional data. In this work we compare several methods of dimensionality reduction for this particular type of data: functional principal components analysis (PCA), with or without a previous data transformation, and multidimensional scaling (MDS) for different inter-density distances, one of them taking into account the compositional nature of density functions. The different methods are applied to both artificial and real data (household income distributions).
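One transformation-then-PCA route can be sketched in a few lines. The centred log-ratio (clr) step below is one plausible choice of "previous data transformation" (the abstract does not name it), applied to hypothetical densities discretised on a common grid:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical sample: 50 income densities evaluated on a common grid of
# 40 points, each row normalised to sum to one.
rng = np.random.default_rng(1)
raw = np.abs(rng.normal(1, 0.3, (50, 40))) + 0.05
densities = raw / raw.sum(axis=1, keepdims=True)

# Centred log-ratio transform: log density minus its row mean, which maps
# the compositional constraint away before applying ordinary PCA.
log_d = np.log(densities)
clr = log_d - log_d.mean(axis=1, keepdims=True)

pca = PCA(n_components=2).fit(clr)
scores = pca.transform(clr)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
```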
Abstract:
Compositional data analysis motivated the introduction of a complete Euclidean structure in the simplex of D parts. This was based on the early work of J. Aitchison (1986) and completed recently, when the Aitchison distance in the simplex was associated with an inner product and orthonormal bases were identified (Aitchison and others, 2002; Egozcue and others, 2003). A partition of the support of a random variable generates a composition by assigning the probability of each interval to a part of the composition. One can imagine that the partition can be refined, so that the probability density would represent a kind of continuous composition of probabilities in a simplex of infinitely many parts. This intuitive idea leads to a Hilbert space of probability densities, obtained by generalizing the Aitchison geometry for compositions in the simplex to the set of probability densities.
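For context, the inner product referred to here can be written for two compositions x, y in the simplex of D parts as below; the generalization sketched in the abstract replaces the finite sums by integrals over the support of the densities:

```latex
\langle \mathbf{x}, \mathbf{y} \rangle_a
  = \frac{1}{2D}\sum_{i=1}^{D}\sum_{j=1}^{D}
    \ln\frac{x_i}{x_j}\,\ln\frac{y_i}{y_j}.
```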
Abstract:
A recent trend in digital mammography is computer-aided diagnosis systems, which are computerised tools designed to assist radiologists. Most of these systems are used for the automatic detection of abnormalities. However, recent studies have shown that their sensitivity is significantly decreased as the density of the breast increases. This dependence is method specific. In this paper we propose a new approach to the classification of mammographic images according to their breast parenchymal density. Our classification uses information extracted from segmentation results and is based on the underlying breast tissue texture. Classification performance was evaluated on a large set of digitised mammograms. Evaluation involves different classifiers and uses a leave-one-out methodology. Results demonstrate the feasibility of estimating breast density using image processing and analysis techniques.
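The evaluation protocol can be illustrated generically. This is not the authors' pipeline: the features below are placeholder first-order texture statistics standing in for the segmentation-based features of the paper, and the classifier is an arbitrary choice:

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical feature matrix: one row per mammogram, with simple texture
# statistics (e.g. mean, variance, histogram entropy) as placeholder features.
rng = np.random.default_rng(2)
X = rng.normal(size=(60, 3))
y = rng.integers(0, 4, 60)  # four density classes, BI-RADS style

clf = KNeighborsClassifier(n_neighbors=5)
correct = 0
for train, test in LeaveOneOut().split(X):
    # Each mammogram is held out once; the classifier is refitted on the rest.
    clf.fit(X[train], y[train])
    correct += int(clf.predict(X[test])[0] == y[test][0])
print(f"leave-one-out accuracy: {correct / len(y):.2f}")
```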
Abstract:
In this paper, different recovery methods, applied at different network layers and time scales, are used to enhance network reliability. Each layer deploys its own fault-management methods; however, current recovery methods are applied to only a specific layer. New protection schemes, based on the proposed partial disjoint path algorithm, are defined in order to avoid protection duplication in a multi-layer scenario. The new protection schemes also encompass shared segment backup computation and shared-risk link group identification. A complete set of experiments demonstrates the efficiency of the proposed methods relative to previous ones, in terms of the resources used to protect the network, the failure recovery time and the request rejection ratio.
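A common building block for such schemes is computing a backup path that avoids the working path where possible. The sketch below is an illustrative heuristic, not the paper's algorithm: links on the working path are penalised rather than removed, so the backup is only partially disjoint when full disjointness is impossible:

```python
import networkx as nx

# Small illustrative topology with unit link costs.
G = nx.Graph()
G.add_edges_from([("A", "B"), ("B", "C"), ("C", "D"),
                  ("A", "E"), ("E", "F"), ("F", "D"), ("B", "F")])
nx.set_edge_attributes(G, 1, "weight")

# Working path for the demand A -> D.
working = nx.shortest_path(G, "A", "D", weight="weight")
working_links = set(zip(working, working[1:]))

# Penalise links used by the working path instead of removing them, so a
# backup is still found (sharing some links) when full disjointness fails.
H = G.copy()
for u, v in working_links:
    H[u][v]["weight"] = 1000
backup = nx.shortest_path(H, "A", "D", weight="weight")
print("working:", working, "backup:", backup)
```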
Abstract:
In a seminal paper, Aitchison and Lauder (1985) introduced classical kernel density estimation techniques in the context of compositional data analysis. Indeed, they gave two options for the choice of the kernel to be used in the kernel estimator. One of these kernels is based on the use of the alr transformation on the simplex S^D jointly with the normal distribution on R^(D-1). However, these authors themselves recognized that this method has some deficiencies. A method for overcoming these difficulties, based on recent developments in compositional data analysis and multivariate kernel estimation theory, combining the ilr transformation with the use of the normal density with a full bandwidth matrix, was recently proposed in Martín-Fernández, Chacón and Mateu-Figueras (2006). Here we present an extensive simulation study that compares both methods in practice, thus exploring the finite-sample behaviour of both estimators.
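A minimal sketch of the ilr-plus-full-bandwidth route for three-part compositions follows. The orthonormal basis below is one standard choice, not necessarily the one used in the cited paper, and scipy's gaussian_kde scales a full, data-driven covariance bandwidth matrix:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical 3-part compositional sample (rows sum to one).
rng = np.random.default_rng(3)
comp = rng.dirichlet([4, 2, 3], size=200)

# Isometric log-ratio (ilr) coordinates for a 3-part composition,
# using one standard orthonormal basis of the simplex.
z1 = np.log(comp[:, 0] / comp[:, 1]) / np.sqrt(2)
z2 = np.log(comp[:, 0] * comp[:, 1] / comp[:, 2] ** 2) / np.sqrt(6)
Z = np.vstack([z1, z2])

# Kernel density estimate in ilr coordinates with a full bandwidth matrix.
kde = gaussian_kde(Z)
print("density at the ilr image of (1/3, 1/3, 1/3):", kde([[0.0], [0.0]]))
```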
Abstract:
One of the criticisms leveled at the model of the dispersed city found all over the world is its unarticulated, random and undifferentiated nature. To test this idea in the Barcelona Metropolitan Region, we estimated the impact of the urban spatial structure (CBD, subcentres and transportation infrastructures) on population density and commuting distance. The results are unfavorable to the hypothesis of the increasing destructuring of cities, given that the explanatory capacity of both functions improves over time, both with and without other control variables.
Abstract:
Customer satisfaction and retention are key issues for organizations in today's competitive marketplace. As such, much research and revenue has been invested in developing accurate ways of assessing consumer satisfaction at both the macro (national) and micro (organizational) level, facilitating comparisons in performance both within and between industries. Since the introduction of the national customer satisfaction indices (CSI), partial least squares (PLS) has been used to estimate the CSI models in preference to structural equation models (SEM) because it does not rely on strict assumptions about the data. However, this choice was based on some misconceptions about the use of SEMs and does not take into consideration more recent advances in SEM, including estimation methods that are robust to non-normality and missing data. In this paper, both SEM and PLS approaches were compared by evaluating perceptions of the Isle of Man Post Office's products and customer service using a CSI format. The new robust SEM procedures were found to be advantageous over PLS. Product quality was found to be the only driver of customer satisfaction, while image and satisfaction were the only predictors of loyalty, thus arguing for the specificity of postal services.