997 results for dynamic phenomena


Relevance:

20.00%

Publisher:

Abstract:

In this paper, we study some dynamic generalized information measures between a true distribution and an observed (weighted) distribution, useful in life-length studies. Further, some bounds and inequalities related to these measures are also studied.
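As an illustration of the distinction between a true and an observed (weighted) distribution, the sketch below (my own toy example, not code from the paper) builds a length-biased version of a discrete lifetime distribution and measures the discrepancy with the Kullback-Leibler divergence, the classical special case of such information measures:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence between two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def length_biased(p, x):
    """Observed (weighted) distribution with weight w(x) = x: p_w(x) ~ x * p(x)."""
    w = [xi * pi for xi, pi in zip(x, p)]
    total = sum(w)
    return [wi / total for wi in w]

x = [1, 2, 3, 4]
p = [0.4, 0.3, 0.2, 0.1]   # true lifetime distribution
pw = length_biased(p, x)   # weighted distribution an observer would record
d = kl_divergence(p, pw)   # discrepancy introduced by the weighting
```

The length bias shifts probability mass toward larger lifetimes, and the divergence quantifies how far the observed distribution drifts from the true one.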

Relevance:

20.00%

Publisher:

Abstract:

In this paper, the residual Kullback–Leibler discrimination information measure is extended to conditionally specified models. The extension is used to characterize some bivariate distributions. These distributions are also characterized in terms of proportional hazard rate models and weighted distributions. Moreover, we also obtain some bounds for this dynamic discrimination function by using the likelihood ratio order and some preceding results.
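As a numerical illustration of the residual discrimination idea (a toy sketch of mine, not the paper's derivations): for exponential lifetimes the residual distributions are again exponential by memorylessness, so the residual Kullback-Leibler measure is independent of the age t, which direct evaluation of the defining integral confirms:

```python
import math

def residual_kl_exponential(lam, mu, t):
    """Residual KL discrimination between Exp(lam) and Exp(mu) at age t.
    By memorylessness it is independent of t: log(lam/mu) + mu/lam - 1."""
    return math.log(lam / mu) + mu / lam - 1.0

def residual_kl_numeric(lam, mu, t, upper=60.0, n=60000):
    """Trapezoidal evaluation of the defining integral over the residual
    lifetime u = x - t, using the residual densities of Exp(lam), Exp(mu)."""
    h = upper / n
    total = 0.0
    for i in range(n + 1):
        u = i * h
        ft = lam * math.exp(-lam * u)  # residual density of Exp(lam) at age t
        gt = mu * math.exp(-mu * u)    # residual density of Exp(mu) at age t
        w = 0.5 if i in (0, n) else 1.0
        total += w * ft * math.log(ft / gt) * h
    return total
```

For non-exponential models the residual measure does depend on t, which is what makes it a dynamic discrimination function.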

Relevance:

20.00%

Publisher:

Abstract:

Recently, cumulative residual entropy (CRE) has been found to be a new measure of information that parallels Shannon's entropy (see Rao et al. [Cumulative residual entropy: A new measure of information, IEEE Trans. Inform. Theory 50(6) (2004), pp. 1220–1228] and Asadi and Zohrevand [On the dynamic cumulative residual entropy, J. Stat. Plann. Inference 137 (2007), pp. 1931–1941]). Motivated by this finding, in this paper we introduce a generalized measure of it, namely cumulative residual Renyi entropy, and study its properties. We also examine it in relation to some applied problems such as weighted and equilibrium models. Finally, we extend this measure to the bivariate set-up and prove certain characterizing relationships to identify different bivariate lifetime models.
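A small numerical sketch (my own, with an exponential survival function as the test case, not code from the paper) of the two quantities discussed above: the CRE of Rao et al., which integrates the survival function against its logarithm, and its Renyi-type generalization:

```python
import math

def cre(sf, upper=40.0, n=80000):
    """Cumulative residual entropy: -integral of sf(x)*log(sf(x)) dx,
    approximated by a left Riemann sum over [0, upper]."""
    h = upper / n
    total = 0.0
    for i in range(1, n):
        s = sf(i * h)
        if 0.0 < s < 1.0:
            total -= s * math.log(s) * h
    return total

def crre(sf, alpha, upper=40.0, n=80000):
    """Cumulative residual Renyi entropy:
    log(integral of sf(x)**alpha dx) / (1 - alpha), alpha != 1."""
    h = upper / n
    integral = sum(sf(i * h) ** alpha * h for i in range(1, n))
    return math.log(integral) / (1.0 - alpha)

def sf(x):
    """Survival function of an Exp(2) lifetime: sf(x) = exp(-2x)."""
    return math.exp(-2.0 * x)

# For Exp(lam): CRE = 1/lam and CRRE_alpha = log(1/(alpha*lam)) / (1 - alpha)
```

The closed forms for the exponential case provide a quick sanity check of the numerical routines.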

Relevance:

20.00%

Publisher:

Abstract:

In this article, we study some relevant information divergence measures, viz. Renyi divergence and Kerridge's inaccuracy measures. These measures are extended to conditionally specified models, and they are used to characterize some bivariate distributions using the concepts of weighted and proportional hazard rate models. Moreover, some bounds are obtained for these measures using the likelihood ratio order.
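The two measures named above have simple discrete forms; the sketch below (my own illustration with invented distributions, not taken from the article) computes both:

```python
import math

def renyi_divergence(p, q, alpha):
    """Renyi divergence of order alpha (alpha > 0, alpha != 1) between
    discrete distributions p and q; the limit alpha -> 1 recovers KL."""
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1.0)

def kerridge_inaccuracy(p, q):
    """Kerridge's inaccuracy -sum p(x) log q(x): the Shannon entropy of p
    plus the KL divergence of p from the (possibly wrong) model q."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]   # "true" distribution
q = [0.4, 0.4, 0.2]   # misspecified model
```

The inaccuracy is minimized when the model q coincides with p, in which case it reduces to the Shannon entropy of p.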

Relevance:

20.00%

Publisher:

Abstract:

For the theoretical investigation of local phenomena (adsorption at surfaces, defects or impurities within a crystal, etc.) one can assume that the effects caused by the local disturbance are limited to the neighbouring particles. With this model, well known as the cluster approximation, an infinite system can be simulated by a much smaller segment of the surface (a cluster). The size of this segment varies strongly between systems. Calculations on the convergence of the bond distance and binding energy of an aluminum atom adsorbed on an Al(100) surface showed that more than 100 atoms are necessary for a sufficient description of surface properties. With a fully quantum-mechanical approach, however, systems of this size cannot be calculated because of the demands on computer memory and processor speed. We therefore developed an embedding procedure for the simulation of surfaces and solids, in which the whole system is partitioned into several parts that are treated differently: the inner part (the cluster), located near the site of the adsorbate, is calculated fully self-consistently and is embedded into an environment, while the influence of the environment on the cluster enters the relativistic Kohn-Sham equations as an additional, external potential. The procedure is based on density functional theory, which means that the choice of the electronic density of the environment determines the quality of the embedding. The environment density was modelled in three different ways: from atomic densities; from densities transferred from a large preceding calculation without embedding; and from copied bulk densities. The embedding procedure was tested on the atomic adsorption of Al on Al(100) and Cu on Cu(100). The result was that, if the environment is chosen appropriately for the Al system, only 9 embedded atoms are needed to reproduce the results of exact slab calculations.
For the Cu system, calculations without the embedding procedure were performed first, showing that 60 atoms already suffice as a surface cluster. Using the embedding procedure, the same values were obtained with only 25 atoms. This is a substantial improvement, considering that the calculation time increases cubically with the number of atoms. With the embedding method, infinite systems can be treated by molecular methods. Additionally, the program code was extended with the ability to run molecular-dynamics simulations. Besides the previous fixed-core calculations, it is now also possible to investigate the structures of small clusters and surfaces. A first application was the adsorption of Cu on Cu(100): we calculated the relaxed positions of the atoms located close to the adsorption site and afterwards performed the fully quantum-mechanical calculation of this system, repeating the procedure for different distances to the surface. Thus a realistic adsorption process could be examined for the first time. It should be noted that, when doing the Cu reference calculations (without embedding), we began to parallelize the entire program code; only because of this were the investigations of the 100-atom Cu surface clusters possible. Owing to the good efficiency of both the parallelization and the developed embedding procedure, we will be able to apply this combination in the future. It will then be possible to bring results of fully relativistic molecular calculations to bear on these areas, which will be especially interesting for the regime of heavy systems.
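The savings quoted above follow directly from the cubic scaling of the calculation time with atom count; a short sketch (my own arithmetic, using the atom counts from the text) makes the comparison explicit:

```python
def relative_cost(n_atoms, n_ref):
    """Relative self-consistent cost, assuming time ~ (number of atoms)**3."""
    return (n_atoms / n_ref) ** 3

# 60-atom bare surface cluster vs. 25 embedded atoms (Cu on Cu(100)):
speedup_cu = relative_cost(60, 25)
# 100-atom cluster (Al case without embedding) vs. 9 embedded atoms:
speedup_al = relative_cost(100, 9)
```

Even the Cu case yields roughly an order-of-magnitude saving, which is why the embedding made the larger systems tractable.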

Relevance:

20.00%

Publisher:

Abstract:

In this work, the interaction between a photon and an electron in the strong Coulomb field of an atomic nucleus is investigated, using radiative electron capture in collisions of highly charged particles as an example. In recent years, this charge-exchange process has been studied extensively, both experimentally and theoretically, in particular for relativistic ion–atom collisions. The focus was mainly on total and differential cross sections. More recently, spin and polarization effects as well as correlation effects in these collision processes have been discussed increasingly. These are expected to react very sensitively to relativistic effects in the collision, providing an excellent method for determining them. Moreover, such measurements could also indirectly allow the polarization of the ion beam to be determined, which would open up new experimental possibilities in both atomic and nuclear physics. In this dissertation, these first investigations of spin, polarization and correlation effects are summarized systematically. Density-matrix theory provides the appropriate method for this. With this method, the general equations for two-step recombination are then derived. In this process, an electron is first captured radiatively into an excited state, which in a second step decays to the ground state under emission of a second (characteristic) photon. These equations can, of course, be extended to arbitrary multi-step as well as one-step processes. For direct electron capture into the ground state, the linear polarization of the recombination photons was investigated. It was shown that this provides a means of determining the polarization of the particles in the entrance channel of the heavy-ion collision. Calculations for recombination with bare U92+ projectiles show, for example, that the spin polarization of the incident electrons leads to a rotation of the linear polarization of the emitted photons out of the scattering plane. This polarization rotation can be measured with newly developed position- and polarization-sensitive solid-state detectors, yielding a method for measuring the polarization of the incident electrons and of the ion beam. K-shell recombination is a simple example of a one-step process. The best-known example of two-step recombination is electron capture into the 2p3/2 state of a bare ion followed by the Lyman-α1 decay (2p3/2 → 1s1/2). Within density-matrix theory, both the angular distribution and the linear polarization of the characteristic photons were investigated. Both (measurable) quantities are considerably influenced by the interference of the E1 (electric dipole) channel with the much weaker M2 (magnetic quadrupole) channel. For the angular distribution of the Lyman-α1 decay in hydrogen-like uranium, this E1–M2 mixing leads to a 30% effect. Taking this interference into account resolves the previously existing discrepancy between theory and experiment for the alignment of the 2p3/2 state. In addition to these one-particle cross sections (measurement of the capture photon or of the characteristic photon), the correlation between the two was also calculated. These correlations should be observable in X-ray coincidence measurements. The focus of these investigations was on the photon–photon angular correlation, which is the easiest to measure experimentally. In this work, detailed calculations of the coincident X-ray angular distributions were carried out for electron capture into the 2p3/2 state of the bare uranium ion and the subsequent Lyman-α1 transition. As already mentioned, the angular distribution of the characteristic photon depends not only on the angle of the recombination photon but also strongly on the spin polarization of the incident particles. This opens up a second possibility for measuring the polarization of the incident ion beam or of the incident electrons.

Relevance:

20.00%

Publisher:

Abstract:

Context awareness, dynamic reconfiguration at runtime and heterogeneity are key characteristics of future distributed systems, particularly in ubiquitous and mobile computing scenarios. The main contributions of this dissertation are theoretical as well as architectural concepts facilitating information exchange and fusion in heterogeneous and dynamic distributed environments. Our main focus is on bridging the heterogeneity issues and, at the same time, considering uncertain, imprecise and unreliable sensor information in information fusion and reasoning approaches. A domain ontology is used to establish a common vocabulary for the exchanged information. We thereby explicitly support different representations for the same kind of information and provide Inter-Representation Operations that convert between them. Special account is taken of the conversion of associated meta-data that express uncertainty and impreciseness. The Unscented Transformation, for example, is applied to propagate Gaussian normal distributions across highly non-linear Inter-Representation Operations. Uncertain sensor information is fused using the Dempster-Shafer Theory of Evidence as it allows explicit modelling of partial and complete ignorance. We also show how to incorporate the Dempster-Shafer Theory of Evidence into probabilistic reasoning schemes such as Hidden Markov Models in order to be able to consider the uncertainty of sensor information when deriving high-level information from low-level data. For all these concepts we provide architectural support as a guideline for developers of innovative information exchange and fusion infrastructures that are particularly targeted at heterogeneous dynamic environments. Two case studies serve as proof of concept. The first case study focuses on heterogeneous autonomous robots that have to spontaneously form a cooperative team in order to achieve a common goal. 
The second case study is concerned with an approach for user activity recognition which serves as baseline for a context-aware adaptive application. Both case studies demonstrate the viability and strengths of the proposed solution and emphasize that the Dempster-Shafer Theory of Evidence should be preferred to pure probability theory in applications involving non-linear Inter-Representation Operations.
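The fusion step described above can be illustrated with Dempster's rule of combination; the sketch below (my own minimal example; the activity labels are invented and not from the case studies) combines two mass functions and shows how mass assigned to the whole frame expresses partial ignorance:

```python
def combine_dempster(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets; conflicting mass (empty intersections) is
    discarded and the remainder renormalized."""
    combined = {}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are contradictory")
    k = 1.0 - conflict
    return {s: w / k for s, w in combined.items()}

# Two sensors over the frame {'walk', 'run'}; mass on the whole frame
# models partial ignorance, which a single probability cannot express.
frame = frozenset({"walk", "run"})
m1 = {frozenset({"walk"}): 0.6, frame: 0.4}
m2 = {frozenset({"walk"}): 0.5, frozenset({"run"}): 0.2, frame: 0.3}
m = combine_dempster(m1, m2)
```

After combination, the evidence for "walk" dominates while some mass remains on the full frame, reflecting the residual ignorance of both sources.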

Relevance:

20.00%

Publisher:

Abstract:

Temporal changes in odor concentration are vitally important to many animals as they orient and navigate in their environment. How are such temporal changes detected? Within the scope of the present work, an accurate stimulation and analysis system was developed to examine the dynamics of physiological properties of Drosophila melanogaster olfactory receptor organs. Subsequently, a new method for delivering odor stimuli was tested and used to present the first dynamic characterization of olfactory receptors at the level of single neurons. Initially, recordings of the whole antenna were conducted while stimulating with different odors. The odor delivery system allowed the dynamic characterization of the whole fly antenna, including its sensilla and receptor neurons. Based on the electroantennogram data obtained, a new odor delivery method called the digital sequence method was developed. In addition, the degree of accuracy was enhanced, initially using electroantennograms and later recordings of odorant receptor cells at the single-sensillum level. This work shows for the first time that different odors evoke different responses within one neuron depending on the chemical structure of the odor. The present work offers new insights into the dynamic properties of olfactory transduction in Drosophila melanogaster and describes the time-dependent parameters underlying these properties.

Relevance:

20.00%

Publisher:

Abstract:

In the field of structural dynamics, computer-aided model validation techniques are now widely used. Experimental modal data are used to correct a numerical model for further analyses. Nevertheless, the validated model represents only the dynamic behavior of the structure that was tested. In reality, there are many factors that inevitably lead to varying modal test results: changing environmental conditions during a test, slightly different test setups, a test on a nominally identical but different structure (e.g. from series production), etc. Before a stochastic simulation can be carried out, a set of assumptions must be made for the random variables involved. Consequently, an inverse method is needed that makes it possible to identify a stochastic model from experimental modal data. This work describes the development of a parameter-based approach for identifying stochastic simulation models in the field of structural dynamics. The developed method relies on first-order sensitivities, with which the parameter means and covariances of the numerical model can be determined from stochastic experimental modal data.
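The final step can be sketched in a few lines (my own toy example with a square, invertible 2×2 sensitivity matrix and invented values; the actual method works with general modal sensitivities): linearizing the modal data as lam ≈ lam0 + S(theta − theta0) lets both the parameter mean and covariance be recovered from the measured mean and covariance:

```python
def transpose(A):
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(len(v))) for i in range(len(A))]

def invert_2x2(M):
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def identify_stochastic_model(S, lam0, theta0, lam_mean, lam_cov):
    """First-order sensitivity sketch: with lam ~ lam0 + S (theta - theta0),
    map the measured modal mean and covariance back to the parameters."""
    Sinv = invert_2x2(S)
    shift = matvec(Sinv, [lm - l0 for lm, l0 in zip(lam_mean, lam0)])
    theta_mean = [t0 + d for t0, d in zip(theta0, shift)]
    theta_cov = matmul(matmul(Sinv, lam_cov), transpose(Sinv))
    return theta_mean, theta_cov

# two eigenfrequencies, two stiffness-like parameters (all values invented)
S = [[2.0, 0.0], [0.0, 4.0]]          # sensitivity d(lam)/d(theta) at theta0
mean, cov = identify_stochastic_model(
    S, lam0=[10.0, 20.0], theta0=[1.0, 1.0],
    lam_mean=[11.0, 22.0], lam_cov=[[0.04, 0.0], [0.0, 0.16]])
```

The covariance propagates through the inverse sensitivity in sandwich form, the standard first-order uncertainty propagation.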

Relevance:

20.00%

Publisher:

Abstract:

Brazil has been increasing its importance in agricultural markets. The reasons are well known: the relative abundance of land, the increasing technology used in crops, and the development of the agribusiness sector, which allows for a fast response to price stimuli. The elasticity of acreage response to increases in expected return is estimated for soybeans in a dynamic (long-term) error correction model. Regarding yield patterns, a large variation in the yearly rates of growth in yield is observed, climate probably being the main source of this variation, which results in 'good' and 'bad' years. In South America, special attention should be given to the El Niño and La Niña phenomena, both said to have important effects on rainfall patterns and consequently on yield. The influence of El Niño and La Niña in historical data is examined, and some ways of estimating the impact of climate on soybean and corn yields are proposed. Possible implications of climate change are also considered.
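The error-correction setup can be sketched with the classic Engle-Granger two-step procedure on simulated data (my own toy simulation with invented parameter values, not the paper's estimates): a long-run cointegrating regression of acreage on expected return, followed by a short-run regression of acreage changes on the lagged disequilibrium:

```python
import random

def ols(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return my - b * mx, b

random.seed(42)
beta_true, alpha_true = 1.5, -0.5        # long-run elasticity, adjustment speed
x, y = [0.0], [0.0]
for _ in range(600):
    x.append(x[-1] + random.gauss(0, 1))  # nonstationary expected-return proxy
    dy = alpha_true * (y[-1] - beta_true * x[-2]) + random.gauss(0, 0.1)
    y.append(y[-1] + dy)                  # acreage corrects toward equilibrium

# Step 1 (long run): cointegrating regression acreage ~ expected return
a_hat, beta_hat = ols(x, y)
resid = [yi - a_hat - beta_hat * xi for xi, yi in zip(x, y)]

# Step 2 (short run): acreage changes respond to last period's disequilibrium
dy_series = [y[t] - y[t - 1] for t in range(1, len(y))]
_, alpha_hat = ols(resid[:-1], dy_series)
```

The slope from step 1 estimates the long-run elasticity, while the negative coefficient from step 2 estimates the speed at which acreage corrects back toward equilibrium.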

Relevance:

20.00%

Publisher:

Abstract:

Almost everyone sketches. People use sketches day in and day out in many different and heterogeneous fields, for example to share their thoughts and to clarify ambiguous interpretations. The media used to sketch vary from analog tools like flipcharts to digital tools like smartboards. Whereas analog tools usually suffer from insufficient editing capabilities like cut/copy/paste, digital tools support these scenarios well. Digital tools can be grouped into informal and formal tools. Informal tools can be understood as simple drawing environments, whereas formal tools offer sophisticated support to create, optimize and validate diagrams of a certain application domain. Most digital formal tools force users to stick to a concrete syntax and editing workflow, limiting the user's creativity. For that reason, many people first sketch their ideas using the flexibility of analog or digital informal tools. Subsequently, the sketch is "portrayed" in an appropriate digital formal tool. This work presents Scribble, a highly configurable and extensible sketching framework which allows sketching features to be dynamically injected into existing graphical diagram editors based on Eclipse GEF. This combines the flexibility of informal tools with the power of formal tools without any effort: no additional code is required to augment a GEF editor with sophisticated sketching features. Scribble recognizes drawn elements as well as handwritten text and automatically generates the corresponding domain elements. A local training-data library is created dynamically by incrementally learning shapes drawn by the user. Training data can be shared with others using the WebScribble web application, which was created as part of this work.

Relevance:

20.00%

Publisher:

Abstract:

This research aims to understand the fundamental dynamic behavior of servo-controlled machinery in response to various types of sensory feedback. As an example of such a system, we study robot force control, a scheme which promises to greatly expand the capabilities of industrial robots by allowing manipulators to interact with uncertain and dynamic tasks. Dynamic models are developed which allow the effects of actuator dynamics, structural flexibility, and workpiece interaction to be explored in the frequency and time domains. The models are used first to explain the causes of robot force control instability, and then to find methods of improving performance.