889 results for Data structure


Relevance: 30.00%

Abstract:

Relativistic multiconfiguration Dirac-Fock (MCDF) wavefunctions coupled to good angular momentum J have been calculated for low-lying states of Ba I and Ba II. These wavefunctions are compared with semiempirical ones derived from experimental atomic energy levels. Significantly better agreement is obtained when close-lying configurations are included in the MCDF wavefunctions. Calculations of the electronic part of the field isotope shift lead to very good agreement with electronic factors derived from experimental data. Furthermore, the slopes of the lines in a King-plot analysis of many of the optical lines are predicted accurately by these calculations. However, the MCDF wavefunctions appear not to be accurate enough to reproduce the experimental magnetic dipole and electric quadrupole hyperfine-structure constants.
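
A King plot of the kind referred to above relates the modified isotope shifts of two transitions linearly across isotope pairs, the slope giving the ratio of the electronic field-shift factors. A minimal sketch, using placeholder shift values rather than the paper's barium data:

```python
import numpy as np

# Hypothetical modified isotope shifts (GHz) for two optical transitions,
# one value per isotope pair (placeholder numbers, not data from the paper).
mu_shift_line_a = np.array([1.02, 1.35, 1.71, 2.10])  # reference transition
mu_shift_line_b = np.array([0.55, 0.73, 0.92, 1.13])  # transition under study

# A King plot is a straight-line fit of one transition's modified shifts
# against the other's; the slope equals the ratio F_b / F_a of the electronic
# field-shift factors, which is what the MCDF calculations predict.
slope, intercept = np.polyfit(mu_shift_line_a, mu_shift_line_b, deg=1)
print(f"King-plot slope F_b/F_a ≈ {slope:.3f}, intercept ≈ {intercept:.3f}")
```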

Relevance: 30.00%

Abstract:

Results of Dirac-Slater discrete variational calculations for the group 4, 5, and 6 highest chlorides, including those of elements 104, 105, and 106, have shown that the groups are not identical with respect to trends in electronic structure and bonding. The charge-density distribution data show that, notwithstanding the basic increase in covalency within the groups, this increase diminishes in going from group 4 to group 6. As a result, E106Cl₆ will be less stable toward thermal decomposition than WCl₆, which is confirmed by an estimated low E106-Cl bond energy. A ΔH_form of -90.3 ± 6 kcal/mol is obtained for E106Cl₆ in the gas phase, which is indicative of a very low stability of this compound. The stability of the maximum oxidation state is shown to decrease in the direction E104(+4) > E105(+5) > E106(+6).

Relevance: 30.00%

Abstract:

While most data analysis and decision support tools use numerical aspects of the data, Conceptual Information Systems focus on their conceptual structure. This paper discusses how both approaches can be combined.

Relevance: 30.00%

Abstract:

Lasers play an important role in medical, sensing, and data-storage devices. This thesis focuses on the design, technology development, fabrication, and characterization of hybrid ultraviolet vertical-cavity surface-emitting lasers (UV VCSELs) with an organic laser-active material and inorganic distributed Bragg reflectors (DBRs). Multilayer structures with different layer thicknesses, refractive indices, and absorption coefficients of the inorganic materials were studied using theoretical model calculations. During the simulations, structure parameters such as materials and thicknesses were varied. This procedure was repeated several times during the design optimization process, including feedback from technology and characterization.

Two types of VCSEL devices were investigated. The first is an index-coupled structure consisting of bottom and top dielectric DBR mirrors. Between them lies the cavity, which contains the active region and defines the spectral gain profile. In this configuration the maximum electric field is concentrated in the cavity and can destroy the chemical structure of the active material. The second type is a so-called complex-coupled VCSEL, in which the active material is placed not only in the cavity but also in parts of the DBR structure. The simulations show that such a distribution of the active material reduces the pumping power required to reach the lasing threshold. High efficiency is achieved by substituting the active material for the high-refractive-index dielectric in the periods closest to the cavity.

The inorganic materials for the DBR mirrors were deposited by plasma-enhanced chemical vapor deposition (PECVD) and dual ion beam sputtering (DIBS). Extensive optimization of the technological processes was performed; all processes were carried out in Class 1 and Class 10,000 clean rooms. The optical properties and thicknesses of the layers were measured in situ by spectroscopic ellipsometry and spectroscopic reflectometry. The surface roughness was analyzed by atomic force microscopy (AFM), and images of the devices were taken with a scanning electron microscope (SEM). The silicon dioxide (SiO2) and silicon nitride (Si3N4) layers deposited by PECVD show defects in the material structure and have higher absorption in the ultraviolet range than layers deposited by ion beam deposition (IBD). This results in low reflectivity of the DBR mirrors and also degrades the optical properties of the VCSEL devices. However, PECVD currently has the advantage over IBD that the stress in the layers can be tuned and compensated. An Ionsys 1000 sputtering machine from Roth & Rau was used for the deposition of silicon dioxide (SiO2), silicon nitride (Si3N4), aluminum oxide (Al2O3), and zirconium dioxide (ZrO2). The chamber is equipped with a main (sputter) ion source and an assist ion source. The dielectric materials were optimized by introducing additional oxygen and nitrogen into the chamber. DBR mirrors with different material combinations were deposited. The measured optical properties of the fabricated multilayer structures show excellent agreement with the results of the theoretical model calculations. The layers deposited by sputtering show high compressive stress.

As the active region, a novel organic material with spiro-linked molecules is used. Two different materials were evaporated using a dye evaporation machine in the clean room of the department Makromolekulare Chemie und Molekulare Materialien (mmCmm). The Spiro-Octopus-1 organic material has its maximum emission at λ_emission = 395 nm, and the Spiro-Pphenal at λ_emission = 418 nm. Both have a high refractive index and can be combined with low-refractive-index materials such as silicon dioxide (SiO2). The sputtering method yields excellent optical quality of the deposited materials and high reflectance of the multilayer structures. The bottom DBR mirrors for all VCSEL devices were deposited by DIBS, whereas the top DBR mirrors were deposited either by PECVD or by a combination of PECVD and DIBS. The fabricated VCSEL structures were optically pumped by a nitrogen laser at λ_pumping = 337 nm, and the emission was measured with a spectrometer. Emission of the VCSEL structures at wavelengths of 392 nm and 420 nm is observed.
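
The multilayer design calculations described in the abstract are commonly carried out with the transfer-matrix method. A minimal normal-incidence sketch of the DBR reflectance calculation, assuming illustrative UV refractive indices for a Si3N4/SiO2 quarter-wave stack (the indices, pair count, and design wavelength are placeholders, not the thesis design):

```python
import numpy as np

def dbr_reflectance(n_layers, lam0, lam, n_in=1.0, n_sub=1.5):
    """Normal-incidence reflectance of a quarter-wave DBR via transfer matrices.

    n_layers : list of refractive indices, first layer on the incidence side
    lam0     : design (centre) wavelength; each layer is lam0 / (4 n) thick
    lam      : wavelength at which the reflectance is evaluated
    """
    M = np.eye(2, dtype=complex)
    for n in n_layers:
        d = lam0 / (4.0 * n)                 # quarter-wave layer thickness
        delta = 2.0 * np.pi * n * d / lam    # phase thickness of the layer
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])        # stack characteristic values
    r = (n_in * B - C) / (n_in * B + C)      # amplitude reflection coefficient
    return abs(r) ** 2

# Illustrative 20-pair Si3N4/SiO2 stack centred at 395 nm (rough UV index
# values assumed for the sketch; absorption is neglected).
stack = [2.05, 1.48] * 20                    # high index first, then low
print(f"R(395 nm) ≈ {dbr_reflectance(stack, 395e-9, 395e-9):.4f}")
```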

Relevance: 30.00%

Abstract:

The main task of this work has been to investigate the effects of anisotropy on the propagation of seismic waves in the upper mantle below Germany and adjacent areas. Refraction and reflection seismic experiments have proven the existence of upper-mantle anisotropy and its influence on the propagation of Pn waves. The 3D tomographic investigations of the crust and upper mantle performed here, which take the influence of anisotropy into account, close a gap in the investigations for Europe. These investigations were carried out with the SSH inversion program of Prof. Dr. M. Koch, which computes the seismic structure and the hypocenters simultaneously. A dataset of recordings from the years 1975 to 2003 was available, with a total of 60249 P- and 54212 S-phase readings from 10028 seismic events.

First, a careful analysis of the residuals (RES, the difference between calculated and observed arrival times) was carried out, which confirmed the existence of anisotropy for Pn phases. The sinusoidal distribution found there was compensated by extending the SSH program with an elliptical correction of the Pn velocities, defined by a slow axis and a perpendicular fast axis of given azimuth. The azimuth of the fast axis was fixed by the simultaneous inversion at 25°-27°, with a variation of the velocities of ±2.5 about an average value of 8 km/s. This new value differs from the value of 35° found in the initial residual analysis, a consequence of the hypocenters being recomputed together with the structure. Applying the elliptical correction resulted in a better fit of the vertically layered 1D model, compared with the results of previous seismological experiments and 1D and 2D investigations. The optimal result of the 1D inversion was used as the starting model for the 3D inversions, which compute the three-dimensional seismic structure of the crust and upper mantle. The simultaneous inversion improved both the relocation of the hypocenters and the reconstruction of the seismic structure, in agreement with the geology and tectonics described by other investigations.

The results for the seismic structure and the relocations were confirmed by several tests. First, synthetic traveltime data were computed with an anisotropic variation and inverted with and without the anisotropic correction. Further tests with randomly perturbed hypocenters and traveltime data were performed to verify the influence of the initial values on the relocation accuracy and on the seismic structure, and to test for further improvement from the anisotropic correction. Finally, the results were applied to the 2004 Waldkirch earthquake, comparing the isotropic and anisotropic relocations with the initial optimal one to check for improvement.
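
The sinusoidal azimuthal pattern mentioned above is conventionally modelled as a cos(2·azimuth) variation of the apparent Pn velocity, and the elliptical correction follows from fitting its mean, amplitude, and phase. A minimal sketch on synthetic data (all values are made up; the fitting inside the SSH program may differ):

```python
import numpy as np

# Synthetic apparent Pn velocities versus propagation azimuth, following the
# cos(2*azimuth) pattern that weak azimuthal anisotropy produces (numbers are
# invented for the sketch; they are not the thesis data).
rng = np.random.default_rng(0)
az = rng.uniform(0.0, 2.0 * np.pi, 200)        # azimuths in radians
fast = np.deg2rad(26.0)                        # assumed fast-axis azimuth
v = 8.0 + 0.2 * np.cos(2.0 * (az - fast)) + 0.05 * rng.standard_normal(200)

# Linear least squares for v ≈ c + a*cos(2 az) + b*sin(2 az); the fitted
# mean, amplitude, and phase define the elliptical velocity correction.
G = np.column_stack([np.ones_like(az), np.cos(2 * az), np.sin(2 * az)])
(c, a, b), *_ = np.linalg.lstsq(G, v, rcond=None)
amp = np.hypot(a, b)
theta_fast = np.rad2deg(0.5 * np.arctan2(b, a)) % 180.0
print(f"mean ≈ {c:.2f} km/s, amplitude ≈ {amp:.2f} km/s, "
      f"fast axis ≈ {theta_fast:.1f}°")
```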

Relevance: 30.00%

Abstract:

The component structure of a 34-item scale measuring different aspects of job satisfaction was investigated among extension officers in North West Province, South Africa. A simple random sampling technique was used to select 40 extension officers, from whom data were collected. A structured questionnaire consisting of 34 job-satisfaction items and 10 personal-characteristic items was administered to the extension officers. Items on job satisfaction were measured at the interval level and analyzed with Principal Component Analysis. Most of the respondents (82.5%) were male and between 40 and 45 years old; 85% were married, and 87.5% held a diploma as their educational qualification. Furthermore, 54% of the households had between 4 and 6 persons, and 75% of respondents were Christians. The majority of the extension officers lived in their job area (82.5%), while 80% covered at least 3 communities and 3 farmer groups. In terms of the number of farmers covered, only 40% of the extension officers covered more than 500 farmers, and 45% travelled more than 40 km to reach their farmers. From the job-satisfaction items, 9 components were extracted, indicating areas of job satisfaction among extension officers: in-service training, research policies, communicating recommended practices, financial support for self and family, quality of technical help, opportunity to advance education, management and control of operations, the rewarding system, and sanctions. The results have several implications for motivating extension officers toward high job performance, especially with a large number of clients and a small number of extension agents.
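
A minimal sketch of the component extraction step on standardized Likert-type items. The data below are random placeholders, and the eigenvalue-greater-than-one (Kaiser) retention rule is an assumption; the paper does not state which rule produced its 9 components:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical responses: 40 officers x 34 job-satisfaction items on a
# 5-point scale (random placeholder data, not the survey itself).
rng = np.random.default_rng(1)
X = rng.integers(1, 6, size=(40, 34)).astype(float)

# Standardize the items, then extract principal components; retain those
# with eigenvalue > 1 (Kaiser criterion, assumed here for illustration).
Z = StandardScaler().fit_transform(X)
pca = PCA().fit(Z)
eigenvalues = pca.explained_variance_
n_keep = int(np.sum(eigenvalues > 1.0))
print(f"components retained: {n_keep}")
print("loadings of item 1 on the retained components:",
      np.round(pca.components_[:n_keep, 0], 2))
```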

Relevance: 30.00%

Abstract:

We present a statistical image-based shape + structure model for Bayesian visual hull reconstruction and 3D structure inference. The 3D shape of a class of objects is represented by sets of contours from silhouette views simultaneously observed from multiple calibrated cameras. Bayesian reconstructions of new shapes are then estimated using a prior density constructed with a mixture model and probabilistic principal components analysis. We show how the use of a class-specific prior in a visual hull reconstruction can reduce the effect of segmentation errors from the silhouette extraction process. The proposed method is applied to a data set of pedestrian images, and improvements in the approximate 3D models under various noise conditions are shown. We further augment the shape model to incorporate structural features of interest; unknown structural parameters for a novel set of contours are then inferred via the Bayesian reconstruction process. Model matching and parameter inference are done entirely in the image domain and require no explicit 3D construction. Our shape model enables accurate estimation of structure despite segmentation errors or missing views in the input silhouettes, and works even with only a single input view. Using a data set of thousands of pedestrian images generated from a synthetic model, we can accurately infer the 3D locations of 19 joints on the body based on observed silhouette contours from real images.
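
As a loose illustration of how a learned class-specific prior can suppress segmentation noise, the sketch below replaces the paper's mixture-of-PPCA prior with a single linear subspace fitted by ordinary PCA and "reconstructs" a corrupted vector by projecting onto it; the data are random placeholders, not silhouette contours:

```python
import numpy as np
from sklearn.decomposition import PCA

# Stand-in for the class-specific prior: a PPCA-like linear subspace fitted
# to vectorized training contours (the paper uses a mixture model plus
# probabilistic PCA; one component is kept here for brevity).
rng = np.random.default_rng(2)
basis = rng.standard_normal((5, 200))                  # 5 latent directions
train = rng.standard_normal((500, 5)) @ basis          # synthetic "contours"
prior = PCA(n_components=5).fit(train)

# Reconstruction of a corrupted observation: project onto the learned
# subspace and map back, which pulls the vector toward the class prior and
# suppresses noise orthogonal to it (the analogue of segmentation errors).
clean = train[0]
noisy = clean + 0.5 * rng.standard_normal(200)
restored = prior.inverse_transform(prior.transform(noisy[None, :]))[0]
print("error before:", np.linalg.norm(noisy - clean).round(2),
      " after:", np.linalg.norm(restored - clean).round(2))
```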

Relevance: 30.00%

Abstract:

Modeling and predicting co-occurrences of events is a fundamental problem of unsupervised learning. In this contribution we develop a statistical framework for analyzing co-occurrence data in a general setting where elementary observations are joint occurrences of pairs of abstract objects from two finite sets. The main challenge for statistical models in this context is to overcome the inherent data sparseness and to estimate the probabilities for pairs which were rarely observed or even unobserved in a given sample set. Moreover, it is often of considerable interest to extract grouping structure or to find a hierarchical data organization. A novel family of mixture models is proposed which explains the observed data by a finite number of shared aspects or clusters. This provides a common framework for statistical inference and structure discovery and also includes several recently proposed models as special cases. Adopting the maximum likelihood principle, EM algorithms are derived to fit the model parameters. We develop improved versions of EM which largely avoid overfitting problems and overcome the inherent locality of EM-based optimization. Among the broad variety of possible applications, e.g., in information retrieval, natural language processing, data mining, and computer vision, we have chosen document retrieval, the statistical analysis of noun/adjective co-occurrence, and the unsupervised segmentation of textured images to test and evaluate the proposed algorithms.
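
The basic aspect model described above can be fitted with plain EM as follows. This sketch implements the standard updates for P(x, y) = Σ_z P(z) P(x|z) P(y|z) on a random count matrix and omits the paper's improved (e.g., annealed) EM variants:

```python
import numpy as np

# EM for a simple aspect (mixture) model of co-occurrence counts N[x, y].
# Real uses would plug in document-word or noun-adjective counts; the matrix
# here is a random placeholder.
rng = np.random.default_rng(3)
N = rng.poisson(1.0, size=(30, 40)).astype(float)   # co-occurrence counts
K = 4                                               # number of aspects z

# Random normalized initialization of the model parameters.
Pz = np.full(K, 1.0 / K)
Px_z = rng.random((K, 30)); Px_z /= Px_z.sum(axis=1, keepdims=True)
Py_z = rng.random((K, 40)); Py_z /= Py_z.sum(axis=1, keepdims=True)

for _ in range(100):
    # E-step: responsibilities P(z | x, y), shape (K, 30, 40).
    joint = Pz[:, None, None] * Px_z[:, :, None] * Py_z[:, None, :]
    post = joint / joint.sum(axis=0, keepdims=True)
    # M-step: re-estimate the parameters from expected counts.
    weighted = post * N[None, :, :]
    Pz = weighted.sum(axis=(1, 2)); Pz /= Pz.sum()
    Px_z = weighted.sum(axis=2); Px_z /= Px_z.sum(axis=1, keepdims=True)
    Py_z = weighted.sum(axis=1); Py_z /= Py_z.sum(axis=1, keepdims=True)

print("aspect weights P(z):", np.round(Pz, 3))
```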

Relevance: 30.00%

Abstract:

One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which there are components which are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies, and ecological abundance studies. Devices such as nonzero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second determining how the unit available is distributed among the non-zero parts. In this paper we suggest two such models, an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the question of estimability of parameters, the nature of the computational process for the estimation of both the incidence and compositional parameters caused by the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential zero-compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
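
A simulation sketch of the two-stage idea, with stage 1 as independent Bernoulli incidence and stage 2 as an additive-logistic-normal composition on the non-zero parts. All probabilities and covariances below are illustrative assumptions, not estimates from the paper:

```python
import numpy as np

# Stage 1 decides which parts are zero (independent Bernoulli incidence);
# stage 2 spreads the unit over the non-zero parts via a logistic-normal.
rng = np.random.default_rng(4)
p_present = np.array([0.95, 0.9, 0.6, 0.4, 0.3])   # incidence probabilities

incidence = rng.random(5) < p_present               # stage 1: zero pattern
k = int(incidence.sum())

comp = np.zeros(5)
if k > 1:
    # k - 1 additive-logratio coordinates, last non-zero part as reference.
    u = rng.multivariate_normal(np.zeros(k - 1), 0.5 * np.eye(k - 1))
    expu = np.exp(np.append(u, 0.0))
    comp[incidence] = expu / expu.sum()             # closed to sum 1
elif k == 1:
    comp[incidence] = 1.0

print("incidence:", incidence.astype(int), " composition:", np.round(comp, 3))
```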

Relevance: 30.00%

Abstract:

One of the disadvantages of old age is that there is more past than future: this, however, may be turned into an advantage if the wealth of experience and, hopefully, wisdom gained in the past can be reflected upon and throw some light on possible future trends. To an extent, then, this talk is necessarily personal, certainly nostalgic, but also self critical and inquisitive about our understanding of the discipline of statistics. A number of almost philosophical themes will run through the talk: search for appropriate modelling in relation to the real problem envisaged, emphasis on sensible balances between simplicity and complexity, the relative roles of theory and practice, the nature of communication of inferential ideas to the statistical layman, the inter-related roles of teaching, consultation and research. A list of keywords might be: identification of sample space and its mathematical structure, choices between transform and stay, the role of parametric modelling, the role of a sample space metric, the underused hypothesis lattice, the nature of compositional change, particularly in relation to the modelling of processes. While the main theme will be relevance to compositional data analysis we shall point to substantial implications for general multivariate analysis arising from experience of the development of compositional data analysis…

Relevance: 30.00%

Abstract:

As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent (essential zeros) or because it is below the detection limit (rounded zeros). Because the second kind of zero is usually understood as "a trace too small to measure", it seems reasonable to replace it by a suitable small value, and this has been the traditional approach. As stated, e.g., by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts (and thus the metric properties) should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is "natural" in the sense that it recovers the "true" composition if the replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved. As a generalization of the multiplicative replacement, a substitution method for missing values in compositional data sets is introduced in the same paper.
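
A minimal sketch of the multiplicative replacement strategy for rounded zeros summarised above: zeros become a small imputed value δ, and the non-zero parts are scaled multiplicatively so the composition still sums to 1, leaving the ratios (and hence the covariance structure) of the non-zero subcomposition unchanged. The value of δ is an arbitrary assumption here:

```python
import numpy as np

def multiplicative_replacement(x, delta=0.001):
    """Replace zeros in a composition by delta, rescaling non-zero parts."""
    x = np.asarray(x, dtype=float)
    x = x / x.sum()                      # close the composition to 1
    zeros = x == 0.0
    # Zeros -> delta; non-zero parts shrink by the total imputed mass.
    return np.where(zeros, delta, x * (1.0 - delta * zeros.sum()))

comp = np.array([0.55, 0.30, 0.15, 0.0])     # toy composition with one zero
r = multiplicative_replacement(comp)
print(np.round(r, 4), " sum =", r.sum())
# Ratios among non-zero parts are preserved: r[0]/r[1] == 0.55/0.30.
```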

Relevance: 30.00%

Abstract:

The first discussion of compositional data analysis is attributable to Karl Pearson, in 1897. However, notwithstanding the recent developments on the algebraic structure of the simplex, more than twenty years after Aitchison's idea of log-transformations of closed data, the scientific literature is again full of statistical treatments of this type of data using traditional methodologies. This is particularly true in environmental geochemistry where, besides the problem of closure, the spatial structure (dependence) of the data has to be considered. In this work we propose the use of log-contrast values, obtained by a simplicial principal component analysis, as indicators of given environmental conditions. The investigation of the log-contrast frequency distributions makes it possible to point out the statistical laws able to generate the values and to govern their variability. The changes, if compared, for example, with the mean values of the random variables assumed as models, or with other reference parameters, allow monitors to be defined for assessing the extent of possible environmental contamination. A case study on running and ground waters from the Chiavenna Valley (Northern Italy), using Na⁺, K⁺, Ca²⁺, Mg²⁺, HCO₃⁻, SO₄²⁻, and Cl⁻ concentrations, is illustrated.
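
A minimal sketch of log-contrast values obtained from a centred-logratio (clr) principal component analysis, on synthetic compositions rather than the Chiavenna Valley data:

```python
import numpy as np

# Made-up water-chemistry compositions: 100 samples x 7 "ion" parts, each
# row closed to sum 1 (placeholder data, not the case-study measurements).
rng = np.random.default_rng(5)
X = rng.dirichlet(alpha=[8, 4, 6, 3, 5, 2, 4], size=100)

clr = np.log(X) - np.log(X).mean(axis=1, keepdims=True)   # clr transform
clr_c = clr - clr.mean(axis=0)                            # centre the data
_, s, Vt = np.linalg.svd(clr_c, full_matrices=False)

# Vt[0] has coefficients summing to ~0, i.e., it defines a log-contrast;
# the scores are the log-contrast values whose distribution is studied.
scores = clr_c @ Vt[0]
print("first log-contrast coefficients:", np.round(Vt[0], 3))
print("score variance:", round(scores.var(), 4))
```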

Relevance: 30.00%

Abstract:

Simpson's paradox, also known as the amalgamation or aggregation paradox, appears when dealing with proportions. Proportions are by construction parts of a whole, which can be interpreted as compositions if we assume they only carry relative information. The Aitchison inner-product space structure of the simplex, the sample space of compositions, explains the appearance of the paradox, given that amalgamation is a nonlinear operation within that structure. Here we propose to use balances, which are specific elements of this structure, to analyse situations where the paradox might appear. With the proposed approach we find that the centre of the tables analysed provides a natural way to compare them, one which by construction avoids the possibility of a paradox. Key words: Aitchison geometry, geometric mean, orthogonal projection
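
For reference, a balance between two groups of parts of a composition takes the form b = sqrt(rs/(r+s)) · ln(g(x_num)/g(x_den)), where r and s are the group sizes and g denotes the geometric mean. A small sketch on a toy composition:

```python
import numpy as np

def balance(x, num_idx, den_idx):
    """Balance between two groups of parts of a composition x."""
    x = np.asarray(x, dtype=float)
    r, s = len(num_idx), len(den_idx)
    g_num = np.exp(np.log(x[num_idx]).mean())   # geometric mean, numerator
    g_den = np.exp(np.log(x[den_idx]).mean())   # geometric mean, denominator
    return np.sqrt(r * s / (r + s)) * np.log(g_num / g_den)

x = np.array([0.1, 0.3, 0.4, 0.2])              # toy composition
print(round(balance(x, [0, 1], [2, 3]), 4))
```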

Relevance: 30.00%

Abstract:

Planners in public and private institutions would like coherent forecasts of the components of age-specific mortality, such as causes of death. This has been difficult to achieve because the relative values of the forecast components often fail to behave in a way that is coherent with historical experience. In addition, when the group forecasts are combined the result is often incompatible with an all-groups forecast. It has been shown that cause-specific mortality forecasts are pessimistic when compared with all-cause forecasts (Wilmoth, 1995). This paper abandons the conventional approach of using log mortality rates and forecasts the density of deaths in the life table. Since these values obey a unit sum constraint for both conventional single-decrement life tables (only one absorbing state) and multiple-decrement tables (more than one absorbing state), they are intrinsically relative rather than absolute values across decrements as well as ages. Using the methods of Compositional Data Analysis pioneered by Aitchison (1986), death densities are transformed into the real space so that the full range of multivariate statistics can be applied, then back-transformed to positive values so that the unit sum constraint is honoured. The structure of the best-known single-decrement mortality-rate forecasting model, devised by Lee and Carter (1992), is expressed in compositional form and the results from the two models are compared. The compositional model is extended to a multiple-decrement form and used to forecast mortality by cause of death for Japan.
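
A condensed sketch of the compositional analogue of the Lee-Carter structure: transform the death-density compositions (each year's life-table deaths, summing to 1 across ages) with the clr, extract the leading SVD mode, forecast the time index with a random walk with drift, and back-transform to the simplex. The input is synthetic, and the details (clr versus alr, jump-off choices) are simplifying assumptions:

```python
import numpy as np

# Synthetic "death densities": 30 years x 20 age groups, rows sum to 1.
rng = np.random.default_rng(6)
D = rng.dirichlet(alpha=np.linspace(1, 10, 20), size=30)

clr = np.log(D) - np.log(D).mean(axis=1, keepdims=True)    # to real space
a = clr.mean(axis=0)                                       # mean age pattern
U, s, Vt = np.linalg.svd(clr - a, full_matrices=False)
k = U[:, 0] * s[0]                                         # time index k_t
b = Vt[0]                                                  # age loadings

drift = (k[-1] - k[0]) / (len(k) - 1)                      # RW-with-drift fit
k_fut = k[-1] + drift * np.arange(1, 11)                   # 10-year forecast

clr_fut = a + np.outer(k_fut, b)                           # forecast, clr space
D_fut = np.exp(clr_fut)
D_fut /= D_fut.sum(axis=1, keepdims=True)                  # back to the simplex
print("forecast densities sum to:", np.round(D_fut.sum(axis=1)[:3], 6))
```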

Relevance: 30.00%

Abstract:

In this paper we use the most representative models in the literature on the term structure of interest rates. In particular, we explore affine one-factor models and polynomial-type approximations such as that of Nelson and Siegel. Our empirical application considers monthly data for the USA and Colombia for estimation and forecasting. We find that the affine models do not provide adequate performance either in-sample or out-of-sample. On the contrary, parsimonious models such as Nelson-Siegel perform adequately in-sample; however, out-of-sample they are not able to systematically improve upon a random-walk baseline forecast.
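
The Nelson-Siegel curve referred to above is y(τ) = β₀ + β₁(1 − e^(−τ/λ))/(τ/λ) + β₂[(1 − e^(−τ/λ))/(τ/λ) − e^(−τ/λ)]; with the decay parameter λ fixed, the betas follow from ordinary least squares. A sketch with illustrative maturities, yields, and λ (not the paper's data):

```python
import numpy as np

def ns_design(tau, lam):
    """Nelson-Siegel regressors: level, slope, and curvature loadings."""
    x = tau / lam
    f1 = (1.0 - np.exp(-x)) / x          # slope loading
    f2 = f1 - np.exp(-x)                 # curvature loading
    return np.column_stack([np.ones_like(tau), f1, f2])

tau = np.array([0.25, 0.5, 1, 2, 5, 10], dtype=float)   # maturities (years)
y = np.array([4.1, 4.3, 4.6, 5.0, 5.5, 5.8])            # observed yields (%)
lam = 1.5                                               # assumed decay value

betas, *_ = np.linalg.lstsq(ns_design(tau, lam), y, rcond=None)
print("level, slope, curvature betas:", np.round(betas, 3))
print("fitted 10y yield:", round(ns_design(tau, lam)[-1] @ betas, 3))
```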