5 results for complex structures up to isometry
at Universitätsbibliothek Kassel, Universität Kassel, Germany
Abstract:
The chemical elements up to Z = 172 are calculated with a relativistic Hartree-Fock-Slater program that takes the effect of the extended nucleus into account. Predictions of the binding energies, the X-ray spectra, and the number of electrons inside the nucleus are given for the inner electron shells. The predicted chemical behaviour is discussed for all elements between Z = 104 and 120 and compared with previously known extrapolations. For the elements Z = 121-172, predictions of their chemistry and a proposal for the continuation of the Periodic Table are given. The eighth chemical period ends with Z = 164, located below mercury in the Periodic Table. The ninth period starts with an alkali and an alkaline-earth metal and, like the second and third periods, ends immediately afterwards with a noble gas at Z = 172.
Abstract:
Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and the resulting land-use patterns. An essential methodology for studying and quantifying such interactions is provided by land-use models. Through the application of land-use models, it is possible to analyze the complex structure of linkages and feedbacks and to determine the relevance of driving forces. Modeling land use and land-use changes has a long tradition. On the regional scale in particular, a variety of models for different regions and research questions has been created. Modeling capabilities grow with steady advances in computer technology, driven on the one hand by increasing computing power and on the other by new methods in software development, e.g. object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, is introduced and discussed. Particular features of SITE are its notably extended capability to integrate models and its strict separation of application and implementation. These features enable the efficient development, testing and use of integrated land-use models. On its system side, SITE provides generic data structures (grid, grid cells, attributes etc.) and takes over responsibility for their administration. By means of a scripting language (Python) that has been extended with language features specific to land-use modeling, these data structures can be accessed and manipulated by modeling applications. The scripting language interpreter is embedded in SITE.
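The grid and cell data structures described above can be pictured with a minimal Python sketch. The class and method names below are hypothetical illustrations of the idea, not the actual SITE API:

```python
class Cell:
    """A grid cell holding arbitrary named attributes (land-use class, elevation, ...)."""
    def __init__(self, **attrs):
        self.attrs = dict(attrs)

class Grid:
    """A minimal regular grid of cells; SITE's real, framework-managed
    data structures are far richer than this sketch."""
    def __init__(self, rows, cols, **defaults):
        self.cells = [[Cell(**defaults) for _ in range(cols)]
                      for _ in range(rows)]

    def apply(self, rule):
        # apply a modeling rule to every cell of the grid
        for row in self.cells:
            for cell in row:
                rule(cell)

# toy land-use rule: convert every forest cell to agriculture
grid = Grid(2, 3, landuse="forest")

def convert(cell):
    if cell.attrs["landuse"] == "forest":
        cell.attrs["landuse"] = "agriculture"

grid.apply(convert)
```

In the real framework, such rules would be written in the embedded scripting language, while the framework itself manages the underlying storage.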
The integration of sub-models can be achieved via the scripting language or through a generic interface provided by SITE. Furthermore, functionalities important for land-use modeling, such as model calibration, model tests and analysis support for simulation results, have been integrated into the generic framework. During the implementation of SITE, particular emphasis was placed on expandability, maintainability and usability. Along with the modeling framework, a land-use model for the analysis of the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics over the historical period from 1981 to 2002. In addition, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace gas emissions, the DAYCENT agro-ecosystem model was integrated. In this case study, it could be shown that land-use changes in the Indonesian research area were mainly characterized by the expansion of agricultural areas at the expense of natural forest. For this reason, the situation had to be interpreted as unsustainable, even though increased agricultural use implied economic improvements and higher farmers' incomes. Due to the importance of model calibration, it was explicitly addressed in the SITE architecture through the introduction of a specific component. The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal, or at least adequate, solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function is typically a map comparison algorithm capable of comparing a simulation result to a reference map.
Several map optimization and map comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure-of-merit map comparison measure as the objective function. The time period for the calibration ranged from 1981 to 2002, and respective reference land-use maps were compiled for this period. It could be shown that efficient automated model calibration with SITE is possible. Nevertheless, the selection of the calibration parameters required detailed knowledge of the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and resulting losses in income from coffee cultivation were analyzed and quantified under the assumption of four different deforestation scenarios. For this task, an empirical model describing the dependence of bee pollination, and the resulting coffee fruit set, on the distance to the closest natural forest was integrated. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously. This results in a reduction of coffee yields of up to 18% and a loss of net revenues per hectare of up to 14%. However, the study also showed that ecological and economic values can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
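The figure-of-merit measure used as the calibration objective can be sketched in a few lines of Python. The sketch below follows the common land-change definition (correctly predicted change divided by the union of observed and simulated change); the one-dimensional toy "maps" are purely illustrative and not data from the study:

```python
def figure_of_merit(initial, reference, simulated):
    """Figure-of-merit comparison of categorical maps, given as
    equal-length sequences of category labels per cell.

    A: misses (observed change predicted as persistence)
    B: hits (observed change predicted correctly)
    C: observed change predicted as change to the wrong category
    D: false alarms (observed persistence predicted as change)
    Returns B / (A + B + C + D), in [0, 1]; higher is better.
    """
    A = B = C = D = 0
    for ini, ref, sim in zip(initial, reference, simulated):
        obs_change = ref != ini
        sim_change = sim != ini
        if obs_change and not sim_change:
            A += 1
        elif obs_change and sim_change and ref == sim:
            B += 1
        elif obs_change and sim_change:
            C += 1
        elif not obs_change and sim_change:
            D += 1
    total = A + B + C + D
    return B / total if total else 1.0

# toy 1-D "maps": forest (F) partly converting to agriculture (A)
initial   = list("FFFFFF")
reference = list("AAFFFF")   # observed end state
simulated = list("AFFFAF")   # model output: one hit, one miss, one false alarm
print(figure_of_merit(initial, reference, simulated))  # 1/3
```

In a calibration loop, a genetic algorithm would propose parameter sets, run the land-use model, and rank candidates by this score against the reference map.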
Abstract:
A key argument for modeling knowledge in ontologies is the easy re-use and re-engineering of that knowledge. However, besides consistency checking, current ontology engineering tools provide only basic functionalities for analyzing ontologies. Since ontologies can be considered as (labeled, directed) graphs, graph analysis techniques are a suitable answer to this need. Graph analysis has been performed by sociologists for over 60 years, resulting in the vivid research area of Social Network Analysis (SNA). While social network structures in general currently receive high attention in the Semantic Web community, there are only very few SNA applications so far, and virtually none for analyzing the structure of ontologies. In this paper we illustrate the benefits of applying SNA to ontologies and the Semantic Web, and discuss which research topics arise on the edge between the two areas. In particular, we discuss how different notions of centrality describe the core content and structure of an ontology. From the rather simple notion of degree centrality, through betweenness centrality, to the more complex eigenvector centrality based on Hermitian matrices, we illustrate the insights these measures provide on two ontologies which differ in purpose, scope, and size.
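The simpler of the centrality notions mentioned above can be sketched on a toy graph. The adjacency matrix below is an invented stand-in for an ontology graph, and power iteration is only one standard way to approximate eigenvector centrality:

```python
def degree_centrality(adj):
    """Degree centrality: a node's degree normalized by n-1."""
    n = len(adj)
    return [sum(row) / (n - 1) for row in adj]

def eigenvector_centrality(adj, iters=100):
    """Eigenvector centrality via power iteration on a symmetric
    adjacency matrix (normalized so the largest entry is 1)."""
    n = len(adj)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(adj[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = max(w) or 1.0
        v = [x / norm for x in w]
    return v

# toy undirected graph: node 0 is a hub linked to all others
adj = [
    [0, 1, 1, 1],
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
]
print(degree_centrality(adj))
print(eigenvector_centrality(adj))
```

Both measures rank the hub node highest, but eigenvector centrality additionally rewards being connected to other well-connected nodes, which is what makes it useful for identifying the core concepts of an ontology.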
Abstract:
Mesh generation is an important step in many numerical methods. We present the "Hierarchical Graph Meshing" (HGM) method as a novel approach to mesh generation, based on algebraic graph theory. The HGM method can be used to systematically construct configurations exhibiting multiple hierarchies and complex symmetry characteristics. The hierarchical description of structures provided by the HGM method can be exploited to increase the efficiency of multiscale and multigrid methods. In this paper, the HGM method is employed for the systematic construction of super carbon nanotubes of arbitrary order, which present a pertinent example of structurally and geometrically complex, yet highly regular, structures. The HGM algorithm is computationally efficient and exhibits good scaling characteristics. In particular, it scales linearly for super carbon nanotube structures and works much faster than geometry-based methods employing neighborhood search algorithms. Its modular character makes it conducive to automation. For the generation of a mesh, the information about the geometry of the structure in a given configuration is added in a way that relates geometric symmetries to structural symmetries. The intrinsically hierarchical description of the resulting mesh greatly reduces the effort of determining mesh hierarchies for multigrid and multiscale applications and helps to exploit symmetry-related methods in the mechanical analysis of complex structures.
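The abstract does not detail the algebraic graph operations behind HGM. As a loosely related illustration of how graph operations can generate regular, hierarchically structured meshes without any geometric neighborhood search, the sketch below builds the Cartesian product of two graphs given as node and edge lists (all names are hypothetical and this is not the HGM algorithm itself):

```python
def cartesian_product(nodes_a, edges_a, nodes_b, edges_b):
    """Cartesian product of two graphs.

    Product nodes are pairs (a, b); two pairs are adjacent if they
    agree in one coordinate and are adjacent in the other. The edge
    count is |E_a|*|V_b| + |E_b|*|V_a|, so construction is linear in
    the output size, with no geometric search required.
    """
    nodes = [(a, b) for a in nodes_a for b in nodes_b]
    edges = []
    for (u, v) in edges_a:            # copy graph A once per node of B
        for b in nodes_b:
            edges.append(((u, b), (v, b)))
    for (u, v) in edges_b:            # copy graph B once per node of A
        for a in nodes_a:
            edges.append(((a, u), (a, v)))
    return nodes, edges

# two single-edge graphs P2; their product is the 4-cycle C4
nodes, edges = cartesian_product([0, 1], [(0, 1)], [0, 1], [(0, 1)])
print(len(nodes), len(edges))
```

Iterating such products or substitutions yields nested structures whose hierarchy levels are known by construction, which is the kind of property multigrid and multiscale methods can exploit.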
Abstract:
Sensing with electromagnetic waves at frequencies in the terahertz range is a very attractive investigative method, with applications in fundamental research and industrial settings. Many sources and detectors are available today; however, most of these systems are bulky and have to be operated in controlled environments such as laboratories. In 1993, Dyakonov and Shur suggested that plasma waves developing in field-effect transistors can be used to emit and detect THz radiation. Later on, it was shown that these plasma waves lead to rectification and allow for the construction of efficient detectors. In contrast to the prediction that these plasma waves would lead to promising new solid-state sources, only a few weak sources are known to date. This work studies THz plasma waves in semiconductor devices using the Monte Carlo method in order to resolve this issue. A fast Monte Carlo solver was developed, implementing a nonparabolic band structure representation of the semiconductors used. By investigating simplified field-effect transistors, it was found that under equilibrium conditions the plasma frequency follows the analytical predictions. However, no current oscillations were found at room temperature or with a current flowing in the channel. For more complex structures consisting of ungated and gated regions, it was found that the plasma frequency follows the value predicted by the dispersion relation of neither the gated nor the ungated device.
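For orientation on why such transistors reach the THz range at all: in the Dyakonov-Shur picture, plasma waves in a gated channel travel at s = sqrt(e*U0/m*), where U0 is the gate voltage swing and m* the effective mass, and the fundamental resonance of a channel of length L is f0 = s/(4L). The material parameters below are illustrative GaAs-like values, not results from this work:

```python
import math

E_CHARGE = 1.602176634e-19   # elementary charge, C
M0 = 9.1093837015e-31        # electron rest mass, kg

def plasma_velocity(gate_swing_v, eff_mass_rel):
    """Dyakonov-Shur plasma wave velocity s = sqrt(e*U0/m*), in m/s."""
    return math.sqrt(E_CHARGE * gate_swing_v / (eff_mass_rel * M0))

def fundamental_freq(gate_swing_v, eff_mass_rel, channel_len_m):
    """Fundamental plasma resonance of a gated channel: f0 = s / (4L), in Hz."""
    return plasma_velocity(gate_swing_v, eff_mass_rel) / (4.0 * channel_len_m)

# illustrative GaAs-like channel: m* = 0.067 m0, U0 = 0.2 V, L = 100 nm
f0 = fundamental_freq(0.2, 0.067, 100e-9)
print(f"{f0 / 1e12:.2f} THz")
```

With sub-micron gate lengths, f0 lands in the low terahertz range, which is what makes field-effect transistors attractive as compact THz emitters and detectors.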