980 results for Bivariate Exponential
Abstract:
In this article we introduce some structural relationships between weighted and original variables in the context of the maintainability function and the reversed repair rate. Furthermore, we prove characterization theorems for specific models such as the power, exponential, Pareto II, beta, and Pearson system of distributions, using the relationships between the original and weighted random variables.
Abstract:
In this paper, we study the relationship between the failure rate and the mean residual life of doubly truncated random variables. Accordingly, we develop characterizations for the exponential, Pareto II, and beta distributions. Further, we generalize the identities for the Pearson and the exponential family of distributions given respectively in Nair and Sankaran (1991) and Consul (1995). Applications of these measures in the context of length-biased models are also explored.
Abstract:
Recently, cumulative residual entropy (CRE) has been found to be a new measure of information that parallels Shannon's entropy (see Rao et al. [Cumulative residual entropy: A new measure of information, IEEE Trans. Inform. Theory. 50(6) (2004), pp. 1220–1228] and Asadi and Zohrevand [On the dynamic cumulative residual entropy, J. Stat. Plann. Inference 137 (2007), pp. 1931–1941]). Motivated by this finding, in this paper we introduce a generalized measure of it, namely cumulative residual Rényi entropy, and study its properties. We also examine it in relation to some applied problems such as weighted and equilibrium models. Finally, we extend this measure to the bivariate set-up and prove certain characterizing relationships to identify different bivariate lifetime models.
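For orientation, the CRE of Rao et al. and the Rényi-type generalization can be written as follows (the exact form of the generalized measure should be taken from the paper itself; this is one common form):

```latex
\mathcal{E}(X) = -\int_0^{\infty} \bar{F}(x)\,\log \bar{F}(x)\,dx,
\qquad
\mathcal{E}_{\alpha}(X) = \frac{1}{1-\alpha}\,\log \int_0^{\infty} \bar{F}^{\alpha}(x)\,dx,
\quad \alpha > 0,\ \alpha \neq 1,
```

where \bar{F} = 1 - F is the survival function of the non-negative random variable X.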
Abstract:
Recently, the reciprocal subtangent has been used as a useful tool to describe the behaviour of a density curve. Motivated by this, in the present article we extend the concept to weighted models. Characterization results are proved for models such as the gamma, Rayleigh, equilibrium, residual lifetime, and proportional hazards models. An identity under the weighted distribution is also obtained when the reciprocal subtangent takes the form of a general class of distributions. Finally, an extension of the reciprocal subtangent for weighted models in the bivariate and multivariate cases is introduced and some useful results are proved.
Abstract:
Multimodal imaging agents that combine magnetic and fluorescent imaging capabilities are desirable for their high spatial and temporal resolution. In the present work, we report the synthesis of multifunctional fluorescent ferrofluids using iron oxide as the magnetic core and rhodamine B as the fluorochrome shell. The core–shell structure was designed in such a way that fluorescence quenching due to the inner magnetic core was minimized by an intermediate layer of silica. The intermediate passive layer of silica was realized by a novel method which involves the esterification reaction between the epoxy group of prehydrolysed 3-glycidoxypropyltrimethoxysilane and the surfactant over iron oxide. The as-synthesized ferrofluids have a high saturation magnetization in the range of 62–65 emu/g and were found to emit light at a wavelength of 640 nm (excitation at 446 nm). Time-resolved lifetime decay analysis showed a bi-exponential decay pattern with an increase in the decay lifetime in the presence of the intermediate silica layer. Cytotoxicity studies confirmed the cell viability of these materials. In vitro MRI imaging showed a high contrast when these multimodal nanoprobes were employed, and the R2 relaxivity of these samples was found to be 334 mM−1s−1, which reveals their high potential as a T2 contrast enhancing agent.
Abstract:
One of the interesting consequences of Einstein's General Theory of Relativity is the existence of black hole solutions. Until Hawking's work in the 1970s, it was believed that black holes are perfectly black. General Relativity describes black holes as objects which absorb both matter and radiation crossing the event horizon, a surface through which even light cannot escape. The horizon acts as a one-sided membrane that allows particles to pass in only one direction, towards the centre of the black hole. Every particle absorbed by a black hole increases its mass, and thus the size of the event horizon also grows. Hawking showed in the 1970s that when quantum mechanical laws are applied to black holes, they are not perfectly black but can emit radiation; a black hole therefore has a temperature, known as the Hawking temperature. In this thesis we study some aspects of black holes in f(R) gravity and in Einstein's General Theory of Relativity. The scattering of a scalar field in this background spacetime, studied in the first chapter, shows that the extended black hole scatters scalar waves and has a scattering cross section; applying the tunneling mechanism, we obtain the Hawking temperature of this black hole. In the following chapter we investigate the quasinormal properties of the extended black hole. We study electromagnetic and scalar perturbations in this spacetime and find that the black hole frequencies are complex and show exponential damping, indicating that the black hole is stable against these perturbations. In the present study we show that black holes not only exist in modified gravities but also share the properties of black hole spacetimes in General Relativity. 2+1-dimensional (three-dimensional) black holes are simplified examples of the more complicated four-dimensional black holes.
Such models are therefore known as toy models for the four-dimensional black holes of General Relativity. We study some properties of these black holes in the Einstein model (General Relativity). A three-dimensional black hole known as MSW is taken for our study: its thermodynamics and spectroscopy are studied, the area spectrum, which is equispaced, is obtained, and different thermodynamic properties are derived. The Dirac perturbation of this three-dimensional black hole is studied and the resulting quasinormal spectrum obtained. The quasinormal frequencies are tabulated, and their values show an exponential damping of oscillations, indicating that the black hole is stable against massless Dirac perturbations. In General Relativity almost all solutions contain singularities; the cosmological solution and the various black hole solutions of Einstein's field equations contain singularities. Regular black hole solutions are solutions of Einstein's equations that have no singularity at the origin: they possess an event horizon but no central singularity. Such a solution was first put forward by Bardeen, and Hayward proposed a similar regular black hole solution. We study the thermodynamics and spectroscopy of the Hayward regular black hole and obtain its thermodynamic properties and area spectrum; the area spectrum is a function of the horizon radius. The entropy-heat capacity curve has a discontinuity at some value of entropy, showing a phase transition.
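For context, the Hawking temperature mentioned above takes, for a Schwarzschild black hole of mass M in General Relativity, the well-known form

```latex
T_{H} = \frac{\hbar c^{3}}{8 \pi G M k_{B}},
```

so larger black holes are colder; the thesis obtains the analogous temperature for the extended black hole via the tunneling mechanism.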
Abstract:
The starting point of this dissertation is a method developed by V. Maz'ya for approximating a given function f : Rn → R by a linear combination fh of radial, smooth, exponentially decaying basis functions which, in contrast to splines, form only an approximate partition of unity and thus define a method that does not converge as h → 0. This method became known under the name Approximate Approximations. It turns out, however, that this lack of convergence is irrelevant in practice, since the error between f and the approximation fh can be pushed below the machine precision of today's computers by a suitable choice of parameters. Moreover, the method has great advantages in the numerical solution of Cauchy problems of the form Lu = f with a suitable linear partial differential operator L in Rn. If the right-hand side f is approximated by fh, explicit formulas for the corresponding approximate volume potentials uh can be given in many cases, involving only one-dimensional integration (e.g. the error function). For the numerical solution of boundary value problems, the method developed by Maz'ya had not previously been used, apart from heuristic and experimental considerations concerning the so-called boundary point method. This is where the dissertation comes in. On the basis of radial basis functions, a new approximation method is developed which transfers the advantages of the method Maz'ya developed for Cauchy problems to the numerical solution of boundary value problems. As representative examples, the interior Dirichlet problem for the Laplace equation and for the Stokes equations in R2 are treated, with convergence analyses carried out and error estimates given for each of the individual approximation steps.
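In the Gaussian case, the quasi-interpolant behind approximate approximations has the form familiar from the Maz'ya-Schmidt literature (the notation here is generic and serves only as an illustration):

```latex
f_h(x) = (\pi \mathcal{D})^{-n/2} \sum_{m \in \mathbb{Z}^{n}} f(mh)\,
\exp\!\left( -\frac{|x - mh|^{2}}{\mathcal{D} h^{2}} \right),
```

where enlarging the parameter \mathcal{D} drives the non-convergent saturation error below machine precision, as described above.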
Abstract:
Five laboratory incubation experiments were carried out to assess the salinity-induced changes in the microbial use of sugarcane filter cake added to soil. The first laboratory experiment was carried out to test the hypothesis that the lower content of fungal biomass in a saline soil reduces the decomposition of a complex organic substrate in comparison with a non-saline soil under acidic conditions. Three different rates (0.5, 1.0, and 2.0%) of sugarcane filter cake were added to both soils and incubated for 63 days at 30°C. In the saline control soil without amendment, cumulative CO2 production was 70% greater than in the corresponding non-saline control soil, but the formation of inorganic N did not differ between these two soils. However, nitrification was inhibited in the saline soil. The increase in cumulative CO2 production on adding filter cake was similar in both soils, corresponding to 29% of the filter cake C at all three addition rates. The increases in microbial biomass C and biomass N were also linearly related to the amount of filter cake added, although the increase was slightly higher for both properties in the saline soil. In contrast to microbial biomass, the absolute increase in ergosterol content in the saline soil was on average only half that in the non-saline soil, and it showed strong temporal changes during the incubation: a strong initial increase after adding the filter cake was followed by a rapid decline. The addition of filter cake led to immobilisation of inorganic N in both soils. This immobilisation was not expected, because the total C-to-total N ratio of the filter cake was below 13 and the organic C-to-organic N ratio in the 0.5 M K2SO4 extract of this material was even lower, at 9.2. The immobilisation was considerably higher in the saline soil than in the non-saline soil. The N immobilisation capacity of sugarcane filter cake should be considered when this material is applied to arable sites at high rates.
The second incubation experiment was carried out to examine the N-immobilizing effect of sugarcane filter cake (C/N ratio of 12.4) and to investigate whether mixing it with compost (C/N ratio of 10.5) has any synergistic effects on C and N mineralization after incorporation into the soil. Approximately 19% of the compost C added and 37% of the filter cake C were evolved as CO2, assuming that the amendments had no effects on the decomposition of soil organic C. However, only 28% of the added filter cake C was lost according to the total C and d13C values. Filter cake and compost initially contained significant concentrations of inorganic N, which was nearly completely immobilized between day 7 and day 14 of the incubation in most cases. After day 14, N re-mineralization occurred at an average rate of 0.73 µg N g-1 soil d-1 in most amendment treatments, paralleling the N mineralization rate of the non-amended control without significant difference. No significant net N mineralization from the amendment N occurred in any of the amendment treatments in comparison with the control. The addition of compost and filter cake resulted in a linear increase in microbial biomass C with increasing amounts of C added. This increase was not affected by differences in substrate quality, especially the three times larger content of K2SO4-extractable organic C in the sugarcane filter cake. In most amendment treatments, microbial biomass C and biomass N increased until the end of the incubation. No synergistic effects could be observed in the mixture treatments of compost and sugarcane filter cake. The third, 42-day incubation experiment was conducted to answer the questions of whether the decomposition of sugarcane filter cake also results in immobilization of nitrogen in a saline alkaline soil, and whether mixing sugarcane filter cake with glucose (adjusted to a C/N ratio of 12.5 with (NH4)2SO4) changes its decomposition.
The relative percentage of CO2 evolved increased from 35% of the added C in the pure 0.5% filter cake treatment to 41% in the 0.5% filter cake + 0.25% glucose treatment and to 48% in the 0.5% filter cake + 0.5% glucose treatment. The three amendment treatments led to immediate increases in microbial biomass C and biomass N within 6 h, which persisted until the end of the incubation only in the pure filter cake treatment. The fungal cell-membrane component ergosterol initially showed an over-proportional increase in relation to microbial biomass C, which fully disappeared by the end of the incubation. Cellulase activity showed a 5-fold increase after filter cake addition, which was not increased further by the additional glucose amendment; it then declined exponentially to values around 4% of the initial value in all treatments. The amount of inorganic N immobilized from day 0 to day 14 increased with the amount of C added in comparison with the control treatment. From day 14 onwards, the immobilized N was re-mineralized at rates between 1.31 and 1.51 µg N g-1 soil d-1 in the amendment treatments, more than double the rate of the control treatment. This means that the re-mineralization rate is independent of the actual size of the microbial residues pool and also independent of the size of the soil microbial biomass. Other, unknown soil properties seem to form a soil-specific gate for the release of inorganic N. The fourth incubation experiment was carried out with the objective of assessing the effects of salt additions containing different anions (Cl-, SO42-, HCO3-) on the microbial use of sugarcane filter cake and dhancha leaves amended to inoculated sterile quartz sand. In the subsequent fifth experiment, the objective was to assess the effects of inoculum and temperature on the decomposition of sugarcane filter cake.
In the fourth experiment, sugarcane filter cake led to significantly lower respiration rates, lower contents of extractable C and N, and lower contents of microbial biomass C and N than dhancha leaves, but to a higher respiratory quotient (RQ) and a higher content of the fungal biomarker ergosterol. The RQ was significantly increased after salt addition when comparing the average of all salinity treatments with the control. Differences in anion composition had no clear effects on the RQ values. In the fifth experiment, the rise in temperature from 20 to 40°C increased the CO2 production rate by a factor of 1.6, the O2 consumption rate by a factor of 1.9, and the ergosterol content by 60%. In contrast, the content of microbial biomass N decreased by 60% and the RQ by 13%. The effects of inoculation with a saline soil were in most cases negative and did not indicate a better adaptation of these organisms to salinity. The general effects of anion composition on microbial biomass and activity indices were small and inconsistent. Only the fraction of 0.5 M K2SO4-extractable C and N in non-fumigated soil was consistently increased in the 1.2 M NaHCO3 treatment of both experiments. In contrast to the small salinity effects, the quality of the substrate had overwhelming effects on microbial biomass and activity indices, especially on the fungal part of the microbial community.
Abstract:
A recurrent iterated function system (RIFS) is a generalization of an IFS and provides non-self-affine fractal sets that are closer to natural objects. In general, its attractor is not a continuous surface in R3. A recurrent fractal interpolation surface (RFIS) is an attractor of an RIFS which is the graph of a bivariate continuous interpolation function. We introduce a general method of generating recurrent interpolation surfaces which are attractors of RIFSs for any data set on a grid.
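The construction is easier to see in the simpler, one-dimensional special case: Barnsley's fractal interpolation function, whose graph is the attractor of an ordinary (non-recurrent) IFS. The data points and vertical scaling factors below are illustrative assumptions, not taken from the paper:

```python
import random

random.seed(1)

# Interpolation data (x_i, y_i) and vertical scaling factors |d_i| < 1.
pts = [(0.0, 0.0), (0.5, 1.0), (1.0, 0.0)]
d = [0.4, 0.4]

x0, y0 = pts[0]
xN, yN = pts[-1]

# Affine maps w_i(x, y) = (a_i x + e_i, c_i x + d_i y + f_i), chosen so that
# w_i sends the endpoints (x0, y0), (xN, yN) to (x_i, y_i), (x_{i+1}, y_{i+1}).
maps = []
for i in range(len(pts) - 1):
    (xl, yl), (xr, yr) = pts[i], pts[i + 1]
    a = (xr - xl) / (xN - x0)
    e = (xN * xl - x0 * xr) / (xN - x0)
    c = (yr - yl - d[i] * (yN - y0)) / (xN - x0)
    f = (xN * yl - x0 * yr - d[i] * (xN * y0 - x0 * yN)) / (xN - x0)
    maps.append((a, e, c, d[i], f))

# Chaos game: the orbit fills the graph of the interpolating function.
x, y = 0.3, 0.3
orbit = []
for _ in range(10000):
    a, e, c, di, f = random.choice(maps)
    x, y = a * x + e, c * x + di * y + f
    orbit.append((x, y))
```

An RIFS additionally restricts which map may follow which (via a connection matrix), which is what allows non-self-affine attractors and, in the bivariate case, interpolation surfaces in R3.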
Abstract:
This dissertation introduces and investigates systems of parallel communicating restarting automata (PCRA systems for short). Two well-known concepts from formal language theory and automata theory are combined: the model of restarting automata and the so-called PC systems (systems of parallel communicating components). A PCRA system consists of finitely many restarting automata which, on the one hand, perform local computations in parallel and independently of one another and, on the other hand, may communicate with each other. Communication follows a fixed protocol realized by means of special communication states. An essential feature of the communication structure in systems of cooperating components is whether communication is centralized or non-centralized. While in a non-centralized communication structure every component may communicate with every other component, in a centralized communication structure all communication takes place exclusively with a designated master component. One of the main results of this work shows that centralized systems and non-centralized systems have the same computational power (which is not the case for PC systems in general). Moreover, using multicast or broadcast communication in addition to point-to-point communication does not increase the computational power either. Furthermore, the expressive power of PCRA systems is investigated and compared with that of PC systems of finite automata and with that of multi-head automata.
PC systems of finite automata are known to have the same expressive power as one-way multi-head automata and form a lower bound for the expressive power of PCRA systems with one-way components. In fact, PCRA systems are stronger than PC systems of finite automata even when the components, taken individually, have the same expressive power, i.e. characterize the regular languages. For PCRA systems with two-way components, the language classes of two-way multi-head automata in the deterministic and in the nondeterministic case are shown as lower bounds; these correspond to the well-known complexity classes L (deterministic logarithmic space) and NL (nondeterministic logarithmic space). The class of context-sensitive languages is shown as an upper bound. In addition, extensions of restarting automata are considered (the non-forgetting property and the shrinking property), which increase the computational power of individual components but do not increase the power of systems. The language classes characterized by PCRA systems are closed under various language operations, and some of these classes are even abstract families of languages (so-called AFLs). Finally, problems specific to PCRA systems are examined for decidability. It is shown that emptiness, universality, inclusion, equivalence, and finiteness are not even semi-decidable for systems with two restarting automata of the weakest type. The word problem is shown to be decidable in quadratic time in the deterministic case and in exponential time in the nondeterministic case.
Abstract:
The aim of this work is to find simple formulas for the moments μ_n for all families of classical orthogonal polynomials listed in the book by Koekoek, Lesky and Swarttouw. The generating functions or exponential generating functions for those moments are given.
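To fix notation (a generic illustration, not taken from the work itself): for an orthogonality measure dμ, the moments and their exponential generating function are

```latex
\mu_n = \int x^{n}\, d\mu(x),
\qquad
\sum_{n \ge 0} \mu_n \frac{t^{n}}{n!};
```

for the Hermite case with the standard normal weight, \mu_{2n} = (2n-1)!!, \mu_{2n+1} = 0, and the exponential generating function is e^{t^{2}/2}.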
Abstract:
Computing the chromatic polynomial of a graph, introduced by Birkhoff in 1912, is well known to be an NP-complete problem. This holds all the more for its generalization to the bivariate chromatic polynomial introduced by Dohmen, Pönitz and Tittmann in 2003. A recursion formula presented by Averbouch, Godlin and Makowsky in 2008 generally causes exponential computational effort when applied repeatedly. The goal of this dissertation was therefore to find simplifications for computing the bivariate chromatic polynomial for special classes of graphs. The following results were obtained: for unions of stars, for complete graphs from which the edges of stars with pairwise distinct vertices have been deleted, for special split graphs, and for complete multipartite graphs, recursion-free formulas for computing the bivariate chromatic polynomial with linearly bounded running time were found. Furthermore, methods for reducing general split graphs, certain bipartite graphs, and complete multipartite graphs are presented. For the latter, a recursion formula found here proves to be an effective method thanks to its polynomially bounded running time. Moreover, in a section on separators in graphs it is shown that the special case of separating cliques, which is very simple in the univariate case, requires very complex methods in the bivariate case. A connection between the bivariate chromatic polynomial and the matching polynomial is established for complete graphs from which the edges of stars with pairwise distinct vertices have been removed, as well as for bicliques.
In addition, this dissertation provides some investigations of the trivariate chromatic polynomial, which goes back to White (2011) and is a further generalization of the bivariate chromatic polynomial. It is shown that its computation is already quite complicated even for simple classes of graphs. This remains true even if the individual coefficients are split off as bivariate polynomials and computed separately. Finally, the thesis provides implementations of many of the results in the computer algebra system Mathematica, offering numerous possibilities for independent experiments.
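For concreteness, the bivariate chromatic polynomial of Dohmen, Pönitz and Tittmann can be evaluated by brute force for tiny graphs (a sketch under the standard definition: x colors in total, the first y of them "proper", and adjacent vertices may not share a proper color):

```python
from itertools import product

def bivariate_chromatic(vertices, edges, x, y):
    """Brute-force P(G; x, y): count colorings with colors 1..x in which
    no edge is monochromatic in a proper color 1..y (improper colors
    y+1..x may be shared across edges)."""
    count = 0
    for col in product(range(1, x + 1), repeat=len(vertices)):
        c = dict(zip(vertices, col))
        if all(not (c[u] == c[v] and c[u] <= y) for u, v in edges):
            count += 1
    return count

# K_2 satisfies P(K_2; x, y) = x^2 - y; a quick check at x = 5, y = 3:
val = bivariate_chromatic(["a", "b"], [("a", "b")], 5, 3)
```

At y = x the definition reduces to the usual chromatic polynomial, which is one way to sanity-check such a brute-force evaluator.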
Abstract:
Traditional inventory models focus on risk-neutral decision makers, i.e., characterizing replenishment strategies that maximize expected total profit, or equivalently, minimize expected total cost over a planning horizon. In this paper, we propose a framework for incorporating risk aversion in multi-period inventory models as well as multi-period models that coordinate inventory and pricing strategies. In each case, we characterize the optimal policy for various measures of risk that have been commonly used in the finance literature. In particular, we show that the structure of the optimal policy for a decision maker with exponential utility functions is almost identical to the structure of the optimal risk-neutral inventory (and pricing) policies. Computational results demonstrate the importance of this approach not only to risk-averse decision makers, but also to risk-neutral decision makers with limited information on the demand distribution.
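The contrast between a risk-neutral objective and an exponential-utility (CARA) objective can be illustrated with a single-period newsvendor sketch; all numbers here are hypothetical, and the paper itself treats multi-period models:

```python
import math
import random

random.seed(0)

# Hypothetical newsvendor: unit cost c, price p, demand ~ Uniform{0,...,100}.
c, p, risk_tol = 4.0, 10.0, 50.0
demands = [random.randint(0, 100) for _ in range(5000)]

def profit(q, d):
    # Revenue on sold units minus procurement cost of the order quantity q.
    return p * min(q, d) - c * q

def expected_profit(q):
    return sum(profit(q, d) for d in demands) / len(demands)

def expected_utility(q):
    # Exponential (CARA) utility u(w) = -exp(-w / risk_tol).
    return sum(-math.exp(-profit(q, d) / risk_tol) for d in demands) / len(demands)

q_neutral = max(range(101), key=expected_profit)  # critical fractile, near 60
q_averse = max(range(101), key=expected_utility)  # orders no more than q_neutral
```

The risk-averse order quantity never exceeds the risk-neutral one in this setting, because the exponential utility penalizes the large losses that occur when demand falls far short of the order.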
Abstract:
The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P), together with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, very elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information, such as Bayesian updating, combination of likelihoods, and robust M-estimation functions, are simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turn out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
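The simplex-level operations that the abstract generalizes can be sketched in a few lines (a minimal illustration of the Aitchison geometry itself, not of the A2(P) construction):

```python
import math

def clr(x):
    """Centered log-ratio transform: log of each part minus the mean log."""
    logs = [math.log(v) for v in x]
    m = sum(logs) / len(logs)
    return [l - m for l in logs]

def aitchison_dist(x, y):
    """Aitchison distance = Euclidean distance between clr coordinates."""
    return math.dist(clr(x), clr(y))

def perturb(x, y):
    """Perturbation, the 'addition' of the Aitchison vector space:
    componentwise product, re-closed to sum to 1."""
    z = [a * b for a, b in zip(x, y)]
    s = sum(z)
    return [v / s for v in z]
```

In A2(Pprior), Bayesian updating is exactly such a perturbation: the posterior is the perturbation of the prior by the likelihood.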
Abstract:
The preceding two editions of CoDaWork included talks on the possible treatment of densities as infinite compositions: Egozcue and Díaz-Barrero (2003) extended the Euclidean structure of the simplex to a Hilbert space structure on the set of densities within a bounded interval, and van den Boogaart (2005) generalized this to the set of densities bounded by an arbitrary reference density. From the many variations of the Hilbert structures available, we work with three cases. For bounded variables, a basis derived from Legendre polynomials is used. For variables with a lower bound, we standardize them with respect to an exponential distribution and express their densities as coordinates in a basis derived from Laguerre polynomials. Finally, for unbounded variables, a normal distribution is used as the reference, and coordinates are obtained with respect to a Hermite-polynomial-based basis. To get the coordinates, several approaches can be considered. A numerical accuracy problem occurs if one estimates the coordinates directly by using discretized scalar products. We therefore propose to use a weighted linear regression approach, where all k-order polynomials are used as predictor variables and weights are proportional to the reference density. Finally, for the case of 2nd-order Hermite polynomials (normal reference) and 1st-order Laguerre polynomials (exponential reference), one can also derive the coordinates from their relationships to the classical mean and variance. Apart from these theoretical issues, this contribution focuses on the application of this theory to two main problems in sedimentary geology: the comparison of several grain size distributions, and the comparison, among different rocks, of the empirical distribution of a property measured on a batch of individual grains from the same rock or sediment, such as their composition.
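The weighted-regression idea can be sketched for the normal-reference (Hermite) case: a hypothetical target density (a shifted normal) is expanded in the probabilists' Hermite polynomials He_1, He_2 by regressing log(f/phi) on the basis, with weights proportional to the reference density phi. The grid, target density, and tolerances are illustrative assumptions, not taken from the contribution:

```python
import math

# Reference density: standard normal phi. Target f: a hypothetical N(0.5, 1).
phi = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
mu, sigma = 0.5, 1.0
f = lambda x: math.exp(-((x - mu) / sigma) ** 2 / 2) / (sigma * math.sqrt(2 * math.pi))

# Probabilists' Hermite polynomials He_1(x) = x and He_2(x) = x^2 - 1.
basis = [lambda x: x, lambda x: x * x - 1]

# Weighted linear regression on a grid: response log(f/phi),
# weights proportional to the reference density phi.
grid = [-5 + 0.01 * i for i in range(1001)]
w = [phi(x) for x in grid]
y = [math.log(f(x) / phi(x)) for x in grid]
X = [[b(x) for b in basis] for x in grid]

# Solve the 2x2 normal equations (X' W X) beta = X' W y by hand.
a11 = sum(wi * r[0] * r[0] for wi, r in zip(w, X))
a12 = sum(wi * r[0] * r[1] for wi, r in zip(w, X))
a22 = sum(wi * r[1] * r[1] for wi, r in zip(w, X))
b1 = sum(wi * r[0] * yi for wi, r, yi in zip(w, X, y))
b2 = sum(wi * r[1] * yi for wi, r, yi in zip(w, X, y))
det = a11 * a22 - a12 * a12
beta = [(a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det]
```

Here log(f/phi) = 0.5 x - 0.125, so the He_1 coordinate recovers the mean shift 0.5 and the He_2 coordinate vanishes, matching the remark that the low-order coordinates relate to the classical mean and variance.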