58 results for "lattice parameters"
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
The electronic structure of the wurtzite-type phase of aluminum nitride has been investigated by means of periodic ab initio Hartree-Fock calculations. The binding energy, lattice parameters (a,c), and the internal coordinate (u) have been calculated. All structural parameters are in excellent agreement with the experimental data. The electronic structure and bonding in AlN are analyzed by means of density-of-states projections and electron-density maps. The calculated values of the bulk modulus, its pressure derivative, the optical-phonon frequencies at the center of the Brillouin zone, and the full set of elastic constants are in good agreement with the experimental data.
Abstract:
The possible coexistence of ferromagnetism and charge/orbital order in Bi3/4Sr1/4MnO3 has been investigated. The manganite Bi0.75Sr0.25MnO3, with commensurate charge balance, undergoes an electronic transition at TCO ~ 600 K that produces a long-range modulation with double periodicity along the a and c axes, and an unusual anisotropic evolution of the lattice parameters. The previously proposed ferromagnetic properties of this new ordered phase were studied by magnetometry and diffraction techniques. In zero field the magnetic structure is globally antiferromagnetic, ruling out the appearance of spontaneous ferromagnetism. However, the application of magnetic fields produces a continuous, progressive canting of the moments, inducing a ferromagnetic phase even for relatively small fields (H << 1 T). Application of pulsed high fields produces a remarkable and reversible spin polarization (under 30 T, the ferromagnetic moment is ~3 μB/Mn, without any sign of charge-order melting). The coexistence of ferromagnetism and charge order at low and very high fields is a remarkable property of this system.
Abstract:
We have analyzed a two-dimensional lattice-gas model of cylindrical molecules that can exhibit four possible orientations. The Hamiltonian of the model contains positional and orientational energy interaction terms. The ground state of the model has been investigated on the basis of Karl's theorem, and Monte Carlo simulation results have confirmed the predicted ground state. With appropriate values of the Hamiltonian parameters, the model is able to reproduce both a smectic-nematic-like transition and a nematic-isotropic-like transition. We have also analyzed the phase diagram of the system by mean-field techniques and Monte Carlo simulations. Mean-field calculations agree qualitatively well with the Monte Carlo results but overestimate the transition temperatures.
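The Metropolis Monte Carlo scheme that such a lattice-gas study relies on can be sketched as follows. The lattice size, coupling constants, and pair-energy form below are illustrative assumptions, not the paper's actual Hamiltonian; the only structural choices taken from the abstract are the four orientations and the separate positional and orientational interaction terms.

```python
import math
import random

L = 16          # lattice side (assumed)
J_POS = 1.0     # positional interaction strength (assumed)
J_OR = 0.5      # orientational interaction strength (assumed)
BETA = 1.0      # inverse temperature (assumed)

# site state: None = vacancy, or an orientation in {0, 1, 2, 3}
lattice = [[random.choice([None, 0, 1, 2, 3]) for _ in range(L)]
           for _ in range(L)]

def pair_energy(a, b):
    """Energy of a nearest-neighbour pair (None = vacancy)."""
    if a is None or b is None:
        return 0.0
    e = -J_POS            # occupied neighbours attract (positional term)
    if a == b:
        e -= J_OR         # aligned orientations lower the energy (orientational term)
    return e

def site_energy(i, j):
    """Interaction energy of site (i, j) with its four neighbours (periodic)."""
    s = lattice[i][j]
    return sum(pair_energy(s, lattice[(i + di) % L][(j + dj) % L])
               for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))

def metropolis_step():
    """Propose a new state for a random site; accept with Metropolis probability."""
    i, j = random.randrange(L), random.randrange(L)
    old = lattice[i][j]
    e_old = site_energy(i, j)
    lattice[i][j] = random.choice([None, 0, 1, 2, 3])
    de = site_energy(i, j) - e_old
    if de > 0 and random.random() >= math.exp(-BETA * de):
        lattice[i][j] = old    # reject the move

for _ in range(10_000):
    metropolis_step()
```

Sweeping BETA while measuring orientational and positional order parameters is what would trace out the smectic-nematic-like and nematic-isotropic-like transitions the abstract mentions.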
Abstract:
We consider a two-dimensional lattice coupled by a nearest-neighbor interaction potential of power type. The existence of infinitely many periodic solutions is shown using minimax methods.
Abstract:
In recent years mobile telephony has seen a reduction in handset size thanks to the miniaturization of filters at microwave frequencies. The most widely used band-pass filters are based on SAW technology; however, they are incompatible with silicon technologies and their performance degrades above 3 GHz, so current research focuses on BAW technology. The two conventional filter architectures based on electrically connected BAW resonators are the ladder and the lattice. This project, however, studies the half-lattice topology, which offers better performance and smaller dimensions. To that end, the filter design equations are derived and then used to implement the filter from the center frequency and the fractional bandwidth.
Abstract:
Low concentrations of elements in geochemical analyses have the peculiarity of being compositional data and, for a given level of significance, are likely to be beyond the capabilities of laboratories to distinguish between minute concentrations and complete absence, thus preventing laboratories from reporting extremely low concentrations of the analyte. Instead, what is reported is the detection limit, which is the minimum concentration that conclusively differentiates between presence and absence of the element. A spatially distributed exhaustive sample is employed in this study to generate unbiased sub-samples, which are further censored to observe the effect that different detection limits and sample sizes have on the inference of population distributions starting from geochemical analyses having specimens below detection limit (non-detects). The isometric logratio transformation is used to convert the compositional data in the simplex to samples in real space, thus allowing the practitioner to properly borrow from the large source of statistical techniques valid only in real space. The bootstrap method is used to numerically investigate the reliability of inferring several distributional parameters employing different forms of imputation for the censored data. The case study illustrates that, in general, best results are obtained when imputations are made using the distribution best fitting the readings above detection limit, and exposes the problems of other more widely used practices. When the sample is spatially correlated, it is necessary to combine the bootstrap with stochastic simulation.
Abstract:
One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which there are components which are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies, and ecological abundance studies. Devices such as nonzero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second how the unit available is distributed among the non-zero parts. In this paper we suggest two such models, an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the question of estimability of parameters, the nature of the computational process for the estimation of both the incidence and compositional parameters caused by the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential zero-compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
Abstract:
A systolic array to implement lattice-reduction-aided linear detection is proposed for a MIMO receiver. The lattice reduction algorithm and the ensuing linear detections are operated in the same array, which can be hardware-efficient. The all-swap lattice reduction (ASLR) algorithm is considered for the systolic design. ASLR is a variant of the LLL algorithm that processes all lattice basis vectors within one iteration. Lattice-reduction-aided linear detectors based on the ASLR and LLL algorithms have very similar bit-error-rate performance, while ASLR is more time-efficient in the systolic array, especially for systems with a large number of antennas.
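The kind of basis reduction that LLL and ASLR perform can be illustrated in the two-dimensional case by Lagrange-Gauss reduction, which alternates size reduction against the shorter vector with swaps. This is a didactic sketch of the underlying idea, not the ASLR systolic algorithm itself; the input basis is a made-up example.

```python
def norm2(v):
    """Squared Euclidean norm of a 2D vector."""
    return v[0] * v[0] + v[1] * v[1]

def gauss_reduce(b1, b2):
    """Lagrange-Gauss reduction of a 2D integer lattice basis: returns a
    basis of the same lattice whose vectors are as short as possible."""
    if norm2(b1) > norm2(b2):
        b1, b2 = b2, b1
    while True:
        # size reduction: subtract the nearest-integer projection of b2 on b1
        mu = round((b1[0] * b2[0] + b1[1] * b2[1]) / norm2(b1))
        b2 = (b2[0] - mu * b1[0], b2[1] - mu * b1[1])
        if norm2(b2) >= norm2(b1):
            return b1, b2
        b1, b2 = b2, b1   # swap and continue, as in LLL

# a skewed basis of the integer lattice Z^2 (illustrative)
v1, v2 = gauss_reduce((1, 1), (3, 2))
```

In a lattice-reduction-aided detector the same idea is applied to the columns of the MIMO channel matrix: detecting in the reduced, better-conditioned basis and mapping back greatly improves the error rate of plain linear detection.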
Abstract:
For the standard kernel density estimate, it is known that one can tune the bandwidth such that the expected L1 error is within a constant factor of the optimal L1 error (obtained when one is allowed to choose the bandwidth with knowledge of the density). In this paper, we pose the same problem for variable bandwidth kernel estimates, where the bandwidths are allowed to depend upon the location. We show in particular that for positive kernels on the real line, for any data-based bandwidth, there exists a density for which the ratio of expected L1 error over optimal L1 error tends to infinity. Thus, the problem of tuning the variable bandwidth in an optimal manner is "too hard". Moreover, from the class of counterexamples exhibited in the paper, it appears that placing conditions on the densities (monotonicity, convexity, smoothness) does not help.
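The two estimators being compared can be sketched directly from their definitions. The data points and the location-dependent bandwidth function h(x) below are arbitrary illustrations of "bandwidth allowed to depend on the location", not constructions from the paper.

```python
import math

def gauss(u):
    """Standard Gaussian kernel."""
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def kde(x, data, h):
    """Fixed-bandwidth kernel density estimate f_n(x) = (1/nh) sum K((x - X_i)/h)."""
    return sum(gauss((x - xi) / h) for xi in data) / (len(data) * h)

def kde_variable(x, data, h_of_x):
    """Variable-bandwidth (balloon-type) estimate: h depends on the query point x."""
    h = h_of_x(x)
    return sum(gauss((x - xi) / h) for xi in data) / (len(data) * h)

data = [-1.2, -0.4, 0.1, 0.3, 1.5]                    # illustrative sample
f_fixed = kde(0.0, data, h=0.5)
f_var = kde_variable(0.0, data, h_of_x=lambda x: 0.3 + 0.2 * abs(x))
```

The paper's negative result says that no data-based rule for choosing such an h(x) can keep the expected L1 error within a constant factor of the optimum uniformly over all densities.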
Abstract:
Most methods for small-area estimation are based on composite estimators derived from design- or model-based methods. A composite estimator is a linear combination of a direct and an indirect estimator with weights that usually depend on unknown parameters which need to be estimated. Although model-based small-area estimators are usually based on random-effects models, the assumption of fixed effects is at face value more appropriate. Model-based estimators are justified by the assumption of random (interchangeable) area effects; in practice, however, areas are not interchangeable. In the present paper we empirically assess the quality of several small-area estimators in the setting in which the area effects are treated as fixed. We consider two settings: one that draws samples from a theoretical population, and another that draws samples from an empirical population of a labor force register maintained by the National Institute of Social Security (NISS) of Catalonia. We distinguish two types of composite estimators: a) those that use weights that involve area-specific estimates of bias and variance; and b) those that use weights that involve a common variance and a common squared bias estimate for all the areas. We assess their precision and discuss alternatives to optimizing composite estimation in applications.
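The composite estimator described above has a simple closed form. The sketch below uses the usual MSE-minimising weight for an unbiased direct estimator combined with a biased but stable indirect one; the numbers are hypothetical, not from the NISS register.

```python
def composite(direct, indirect, var_direct, sq_bias_indirect):
    """Composite small-area estimate: w * direct + (1 - w) * indirect.
    With the direct estimator unbiased (variance V) and the indirect one
    biased (squared bias B, variance neglected), the MSE-minimising
    weight is w = B / (V + B)."""
    w = sq_bias_indirect / (var_direct + sq_bias_indirect)
    return w * direct + (1 - w) * indirect

# hypothetical area: noisy direct estimate, stable but biased indirect one
est = composite(direct=102.0, indirect=95.0,
                var_direct=16.0, sq_bias_indirect=9.0)
```

The paper's two estimator families differ in where V and B come from: computed per area (type a) or pooled across all areas (type b).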
Abstract:
Many dynamic revenue management models divide the sale period into a finite number of periods T and assume, invoking a fine-enough grid of time, that each period sees at most one booking request. These Poisson-type assumptions restrict the variability of the demand in the model, but researchers and practitioners have been willing to overlook this for the benefit of tractability of the models. In this paper, we criticize this model from another angle. Estimating the discrete finite-period model poses problems of indeterminacy and non-robustness: arbitrarily fixing T leads to arbitrary control values, while estimating T from data adds an additional layer of indeterminacy. To counter this, we first propose an alternative finite-population model that avoids the problem of fixing T and allows a wider range of demand distributions, while retaining the useful marginal-value properties of the finite-period model. The finite-population model still requires jointly estimating the market size and the parameters of the customer purchase model without observing no-purchases. Estimation of market size when no-purchases are unobservable has rarely been attempted in the marketing or revenue management literature. Indeed, we point out that it is akin to the classical statistical problem of estimating the parameters of a binomial distribution with unknown population size and success probability, and hence likely to be challenging. However, when the purchase probabilities are given by a functional form such as a multinomial-logit model, we propose an estimation heuristic that exploits the specification of the functional form, the variety of the offer sets in a typical RM setting, and qualitative knowledge of arrival rates. Finally, we perform simulations to show that the estimator is very promising in obtaining unbiased estimates of the population size and the model parameters.
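The multinomial-logit purchase model mentioned above gives choice probabilities in a simple closed form, with a no-purchase option that is never observed directly. The utilities below are made-up numbers for illustration; the estimation heuristic itself is not reproduced here.

```python
import math

def mnl_probs(utilities):
    """Multinomial-logit choice probabilities with a no-purchase option of
    utility 0: P(j) = exp(u_j) / (1 + sum_k exp(u_k)).
    The last entry of the returned list is the no-purchase probability."""
    expu = [math.exp(u) for u in utilities]
    denom = 1.0 + sum(expu)
    return [e / denom for e in expu] + [1.0 / denom]

# two offered products with illustrative utilities
probs = mnl_probs([0.5, -0.2])
```

Because the no-purchase probability is determined by the same utilities that govern observed purchases, varying the offer set over time gives the leverage the heuristic exploits to separate market size from purchase probabilities.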
Abstract:
uvby H-beta photometry has been obtained for a sample of 93 selected main-sequence A stars. The purpose was to determine accurate effective temperatures, surface gravities, and absolute magnitudes for an individual determination of ages and parallaxes, to be included in a more extensive work analyzing the kinematic properties of A V stars. Several calibrations and methods to determine the above-mentioned parameters have been reviewed, allowing the design of a new algorithm for their determination. The results obtained using this procedure were tested in a previous paper using uvby H-beta data from the Hauck and Mermilliod catalogue, comparing the resulting temperatures, surface gravities, and absolute magnitudes with empirical determinations of these parameters.
Abstract:
This paper estimates a model of airline competition for the Spanish air transport market. I test the explanatory power of alternative oligopoly models with capacity constraints. In addition, I analyse the degree of density economies. Results show that Spanish airlines' conduct follows a price-leadership scheme, so it is less competitive than the Cournot solution. I also find evidence that thin routes can be considered natural monopolies.