963 results for PAIR DISTRIBUTION FUNCTION


Abstract:

Multivariate lifetime data arise in various forms, including recurrent event data, when individuals are followed to observe the sequence of occurrences of a certain type of event, and correlated lifetimes, when an individual is followed for the occurrence of two or more types of events or when distinct individuals have dependent event times. In most studies there are covariates, such as treatments, group indicators, individual characteristics, or environmental conditions, whose relationship to lifetime is of interest. This leads to a consideration of regression models. The well-known Cox proportional hazards model and its variations based on marginal hazard functions, as employed in the literature for the analysis of multivariate survival data, are not sufficient to explain the complete dependence structure of a pair of lifetimes on the covariate vector. Motivated by this, in Chapter 2 we introduced a bivariate proportional hazards model using the vector hazard function of Johnson and Kotz (1975), in which the covariates under study have different effects on the two components of the vector hazard function. The proposed model is useful in real-life situations for studying the dependence structure of a pair of lifetimes on the covariate vector. The well-known partial likelihood approach is used for the estimation of the parameter vectors. In Chapter 3 we introduced a bivariate proportional hazards model for gap times of recurrent events; the model incorporates both marginal and joint dependence of the distribution of gap times on the covariate vector. In many fields of application, the mean residual life function is considered a superior concept to the hazard function. Motivated by this, in Chapter 4 we considered a new semi-parametric model, the bivariate proportional mean residual life model, to assess the relationship between mean residual life and covariates for gap times of recurrent events. The counting process approach is used for the inference procedures for the gap times of recurrent events. In many survival studies, the distribution of lifetime may depend on the distribution of censoring time. In Chapter 5 we introduced a proportional hazards model for duration times and developed inference procedures under dependent (informative) censoring. In Chapter 6 we introduced a bivariate proportional hazards model for competing risks data under right censoring. The asymptotic properties of the estimators of the parameters of the models developed in the previous chapters were studied, and the proposed models were applied to various real-life situations.
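
As a rough, hedged illustration of the kind of structure involved (the notation below is illustrative, not the thesis's own): the univariate Cox model and a bivariate analogue built on the vector hazard (h_1, h_2) of (T_1, T_2), with component-specific coefficients so that a covariate may act differently on each component,

  \lambda(t \mid Z) = \lambda_0(t)\,\exp\{\beta^{\top} Z\},
  \qquad
  h_j(t_1, t_2 \mid Z) = h_{0j}(t_1, t_2)\,\exp\{\beta_j^{\top} Z\}, \quad j = 1, 2.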

Abstract:

A simple method is presented to evaluate the effects of short-range correlations on the momentum distribution of nucleons in nuclear matter within the framework of the Green's function approach. The method provides a very efficient representation of the single-particle Green's function for a correlated system. The reliability of this method is established by comparing its results to those obtained in more elaborate calculations. The sensitivity of the momentum distribution to the nucleon-nucleon interaction and to the nuclear density is studied. The momentum distributions of nucleons in finite nuclei are derived from those in nuclear matter using a local-density approximation. These results are compared to those obtained directly for light nuclei such as ¹⁶O.
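
As a hedged sketch of the general shape such a local-density prescription takes (generic notation; the normalization convention is an assumption, not necessarily the paper's): the momentum distribution of a nucleus with point density \rho(r) is approximated by averaging the nuclear-matter result evaluated at the local Fermi momentum,

  n_A(k) \;\approx\; \frac{1}{A}\int d^3r\,\rho(r)\, n_{\mathrm{NM}}\bigl(k;\,k_F(r)\bigr),
  \qquad
  k_F(r) = \left(\tfrac{3\pi^2}{2}\,\rho(r)\right)^{1/3},

with n_NM(k; k_F) the nuclear-matter momentum distribution per nucleon.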

Abstract:

Soil microorganisms play a major role in organic matter decomposition and are consequently essential to the soil ecosystem processes that maintain the primary productivity of plants. In light of current concerns about the impact of cultivation and climate change on biodiversity and ecosystem function, it is vital to develop a thorough understanding of the microbial community ecology of our soils. In the present study we measured the depth-wise profile of microbial load in relation to important soil physicochemical characteristics (soil temperature, soil pH, moisture content, organic carbon and available NPK) of soil samples collected from the Mahatma Gandhi University Campus, Kottayam (midland region of Kerala). Soil cores (30 cm deep) were taken and separated into three 10-cm depths to examine the depth-wise distribution. Bacterial load ranged from 141×10⁵ to 271×10⁵ CFU/g (10 cm depth), from 80×10⁵ to 131×10⁵ CFU/g (20 cm depth) and from 260×10⁴ to 47×10⁵ CFU/g (30 cm depth). Fungal load varied from 124×10³ to 27×10⁴ CFU/g, from 61×10³ to 110×10³ CFU/g and from 16×10³ to 49×10³ CFU/g at 10, 20 and 30 cm respectively. Actinomycete counts ranged from 129×10³ to 60×10⁴ CFU/g (10 cm), from 70×10³ to 31×10⁴ CFU/g (20 cm) and from 14×10³ to 66×10³ CFU/g (30 cm). The study revealed a significant difference in the depth-wise distribution of microbial load and soil physicochemical properties. Bacterial, fungal and actinomycete loads showed a decreasing trend with increasing depth at all sites. Except for pH, all physicochemical properties showed a decreasing trend with increasing depth. The vertical profile of total microbial load matched the depth-wise profiles of soil nutrients and organic carbon well; that is, microbial load was highest at the soil surface, where organics and nutrients were highest.

Abstract:

Quantile functions are efficient and equivalent alternatives to distribution functions in the modeling and analysis of statistical data (see Gilchrist, 2000; Nair and Sankaran, 2009). Motivated by this, in the present paper we introduce a quantile-based Shannon entropy function. We also introduce a residual entropy function in the quantile setup and study its properties. Unlike the residual entropy function due to Ebrahimi (1996), the residual quantile entropy function determines the quantile density function uniquely through a simple relationship. The measure is used to define two nonparametric classes of distributions.
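
For reference, the standard change of variables behind such a quantile formulation: with Q the quantile function, q = Q' the quantile density, and f(Q(u)) = 1/q(u), the Shannon entropy becomes

  H(X) = -\int f(x)\,\log f(x)\,dx = \int_0^1 \log q(u)\,du,

so the entropy can be written entirely in terms of the quantile density.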

Abstract:

Di Crescenzo and Longobardi (2002) introduced a measure of uncertainty in past lifetime distributions and studied its relationship with the residual entropy function. In the present paper, we introduce a quantile version of the entropy function in past lifetime and study its properties. Unlike the measure of uncertainty given in Di Crescenzo and Longobardi (2002), the proposed measure uniquely determines the underlying probability distribution. The measure is used to study two nonparametric classes of distributions. We prove characterization theorems for some well-known quantile lifetime distributions.
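
For orientation, the past entropy of Di Crescenzo and Longobardi (2002), together with a sketch of the identity obtained by the quantile substitution t = Q(u), x = Q(p) (so that F(t) = u, dx = q(p) dp and f(Q(p)) = 1/q(p)); the second expression illustrates the kind of quantile form involved and is not necessarily the paper's own notation:

  \bar{H}(t) = -\int_0^{t} \frac{f(x)}{F(t)}\,\log\frac{f(x)}{F(t)}\,dx,
  \qquad
  \bar{H}\bigl(Q(u)\bigr) = \log u + \frac{1}{u}\int_0^{u} \log q(p)\,dp.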

Abstract:

The screening correction to the coherent pair-production cross section on the oxygen molecule has been calculated using self-consistent relativistic wave functions for the one-center and two-center Coulomb potentials. It is shown that the modification of the wave function due to molecular binding and the interference between contributions from the two atoms both have sizeable effects on the screening correction. The resulting coherent pair-production cross section, which makes up the largest part of the total atomic cross section, was used to evaluate the total nuclear absorption cross section from photon-attenuation measurements on liquid oxygen. The result agrees with cross sections for other nuclei if A-scaling is assumed. The molecular effect on the pair cross section amounts to 15% of the nuclear cross section in the Δ-resonance region.

Abstract:

In this thesis, likelihood depths, introduced by Mizera and Müller (2004), are used to develop (outlier-)robust estimators and tests for the unknown parameter of a continuous density function. The resulting procedures are then applied to three different distributions. For one-dimensional parameters, the likelihood depth of a parameter in the data set is computed as the minimum of the proportion of observations for which the derivative of the log-likelihood function with respect to the parameter is non-negative and the proportion for which this derivative is non-positive. The parameter with the greatest depth is therefore the one for which both counts are equal. This parameter is initially chosen as the estimator, since the likelihood depth is meant to measure how well a parameter fits the data set. Asymptotically, the parameter with the greatest depth is the one for which the probability that the derivative of the log-likelihood with respect to the parameter is non-negative for an observation equals one half. If this is not the case for the underlying parameter, the estimator based on the likelihood depth is biased. This thesis shows how this bias can be corrected so that the corrected estimators are consistent. To develop tests for the parameter, the simplex likelihood depth introduced by Müller (2005), which is a U-statistic, is used. It turns out that for the same distributions for which the likelihood depth yields biased estimators, the simplex likelihood depth is an unbiased U-statistic. In particular, its asymptotic distribution is known, and tests for various hypotheses can be formulated. However, for some hypotheses the shift in the depth leads to poor power of the corresponding test. Corrected tests are therefore introduced, together with conditions under which they are consistent. The thesis consists of two parts. The first part presents the general theory of the estimators and tests and proves their consistency. The second part applies the theory to three different distributions: the Weibull distribution and the Gaussian and Gumbel copulas. This shows how the procedures of the first part can be used to derive (robust) consistent estimators and tests for the unknown parameter of a distribution. Overall, it is shown that robust estimators and tests can be found for all three distributions using likelihood depths. On uncontaminated data, existing standard methods are partly superior, but the advantage of the new methods becomes apparent on contaminated data and data with outliers.
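
A minimal numerical sketch of the maximum-depth estimator described above, using an exponential density as an illustrative example (the choice of density, the grid search and all names below are illustrative assumptions, not the thesis's own):

import numpy as np

def likelihood_depth(theta, x, score):
    """Likelihood depth of a one-dimensional parameter theta in the sample x:
    the minimum of the fraction of observations with non-negative score
    (derivative of the log-likelihood w.r.t. theta) and the fraction with
    non-positive score."""
    s = score(theta, x)
    return min(np.mean(s >= 0), np.mean(s <= 0))

# Illustrative example: exponential density f(x; lam) = lam * exp(-lam * x),
# whose score is d/d(lam) log f = 1/lam - x.
def exp_score(lam, x):
    return 1.0 / lam - x

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=200)          # true lambda = 0.5

grid = np.linspace(0.05, 2.0, 400)
depths = [likelihood_depth(lam, x, exp_score) for lam in grid]
lam_hat = grid[int(np.argmax(depths))]            # maximum-depth estimate

# For the exponential, the depth is maximal when 1/lam equals the sample median,
# and P(score >= 0) = 1 - 1/e != 1/2 at the true parameter, so lam_hat is close to
# lambda / ln 2 rather than lambda: the kind of bias the thesis's correction addresses.
print(lam_hat)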

Abstract:

The algebraic-geometric structure of the simplex, known as Aitchison geometry, is used to look at the Dirichlet family of distributions from a new perspective. A classical Dirichlet density function is expressed with respect to the Lebesgue measure on real space. We propose here to replace this measure with the Aitchison measure on the simplex, and we study some properties and characteristic measures of the resulting density.
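
For reference, the classical Dirichlet density on the D-part simplex with respect to the Lebesgue measure, i.e. the starting point whose reference measure the paper proposes to replace by the Aitchison measure:

  f(x_1,\dots,x_D;\alpha) = \frac{\Gamma\!\bigl(\sum_{i=1}^{D}\alpha_i\bigr)}{\prod_{i=1}^{D}\Gamma(\alpha_i)}\,\prod_{i=1}^{D} x_i^{\alpha_i-1},
  \qquad x_i > 0,\ \ \sum_{i=1}^{D} x_i = 1.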

Abstract:

A novel test of spatial independence of the distribution of crystals or phases in rocks, based on compositional statistics, is introduced. It improves and generalises the common joins-count statistics known from map analysis in geographic information systems. Assigning phases independently to objects in R^D is modelled by a single-trial multinomial random function Z(x), where the probabilities of the phases add to one and are explicitly modelled as compositions in the K-part simplex S^K. Thus, apparent inconsistencies of tests based on the conventional joins-count statistics, and their possibly contradictory interpretations, are avoided. In practical applications we assume that the probabilities of the phases do not depend on location but are identical everywhere in the domain of definition. The model then involves the sum of r independent, identically distributed single-trial multinomial random variables, which is an r-trial multinomial random variable. The probabilities of the distribution of the r counts can be considered as a composition in the Q-part simplex S^Q. They span the so-called Hardy-Weinberg manifold H, which is proved to be a (K-1)-affine subspace of S^Q. This is a generalisation of the well-known Hardy-Weinberg law of genetics. If the assignment of phases accounts for some kind of spatial dependence, then the r-trial probabilities do not remain on H. This suggests using the Aitchison distance between the observed probabilities and H to test for dependence. Moreover, when there is spatial fluctuation of the multinomial probabilities, the observed r-trial probabilities move on H, and this shift can be used to check for such fluctuations. A practical procedure and an algorithm to perform the test have been developed, and some cases applied to simulated and real data are presented. Keywords: spatial distribution of crystals in rocks, spatial distribution of phases, joins-count statistics, multinomial distribution, Hardy-Weinberg law, Hardy-Weinberg manifold, Aitchison geometry
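
A small sketch of the Aitchison distance that the test relies on (the Euclidean distance between centred log-ratio transformed compositions); the paper's full procedure, measuring the distance from the observed r-trial probabilities to the manifold H, is only hinted at here, and the function names and example compositions are illustrative:

import numpy as np

def clr(x):
    """Centred log-ratio transform of a composition with strictly positive parts."""
    logx = np.log(np.asarray(x, dtype=float))
    return logx - logx.mean()

def aitchison_distance(x, y):
    """Aitchison distance = Euclidean distance between clr-transformed compositions."""
    return np.linalg.norm(clr(x) - clr(y))

# Example: observed composition of r-trial counts vs. a reference composition.
observed  = [0.52, 0.38, 0.10]
reference = [0.49, 0.42, 0.09]
print(aitchison_distance(observed, reference))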

Abstract:

We include solvation effects in tight-binding Hamiltonians for hole states in DNA. The corresponding linear-response parameters are derived from accurate estimates of the solvation energy calculated for several hole-charge distributions in DNA stacks. Two models are considered: (A) the correction to a diagonal Hamiltonian matrix element depends only on the charge localized on the corresponding site, and (B) in addition to this term, the reaction field due to adjacent base pairs is accounted for. We show that both schemes give very similar results. The effects of the polar medium on the hole distribution in DNA are studied. We conclude that polar surroundings essentially suppress charge delocalization in DNA, and that hole states in (GC)n sequences are localized on individual guanines.
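
A heavily simplified sketch of the kind of self-consistent scheme that model A suggests, with the diagonal site energies shifted in proportion to the hole charge on each site; all parameter values, the chain length and the linear-response form are assumptions for illustration and not the paper's actual model parameters:

import numpy as np

n_sites = 6                        # short illustrative base-pair stack
eps0    = np.zeros(n_sites)        # bare site energies (eV), assumed equal
t_hop   = -0.1                     # nearest-neighbour hopping (eV), assumed
lam     = 0.5                      # linear-response solvation constant (eV), assumed

def ground_state(eps):
    """Diagonalize the tight-binding Hamiltonian and return its lowest eigenvector
    (standing in for the hole state in this illustrative sign convention)."""
    H = np.diag(eps) + t_hop * (np.eye(n_sites, k=1) + np.eye(n_sites, k=-1))
    w, v = np.linalg.eigh(H)
    return v[:, 0]

# Self-consistency loop: site charges shift the diagonal elements (model A).
q = np.full(n_sites, 1.0 / n_sites)
for _ in range(100):
    c = ground_state(eps0 - lam * q)   # solvation stabilizes charged sites
    q_new = c**2                       # hole charge on each site
    if np.max(np.abs(q_new - q)) < 1e-8:
        break
    q = 0.5 * q + 0.5 * q_new          # damped update for stability
print(np.round(q, 3))                  # larger lam -> more localized charge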

Abstract:

Sulphate-reducing bacteria (SRB) and methanogenic archaea (MA) are important anaerobic terminal oxidisers of organic matter. However, we have little knowledge about the distribution and types of SRB and MA in the environment or the functional role they play in situ. Here we have utilised sediment slurry microcosms amended with ecologically significant substrates, including acetate and hydrogen, and with specific functional inhibitors, to identify the important SRB and MA groups at two contrasting sites on a UK estuary. Substrate and inhibitor additions had significant effects on methane production and on acetate and sulphate consumption in the slurries. By using specific 16S-targeted oligonucleotide probes we were able to link specific SRB and MA groups to the use of the added substrates. Acetate consumption in the freshwater-dominated sediments was mediated by Methanosarcinales under low-sulphate conditions and by Desulfobacter under the high-sulphate conditions that simulated a tidal incursion. In the marine-dominated sediments, acetate consumption was linked to Desulfobacter. Addition of trimethylamine, a non-competitive substrate for methanogenesis, led to a large increase in the Methanosarcinales signal in marine slurries. Desulfobulbus was linked to non-sulphate-dependent H₂ consumption in the freshwater sediments. The addition of sulphate to freshwater sediments inhibited methane production and reduced the signal from probes targeted to Methanosarcinales and Methanomicrobiales, while the addition of molybdate to marine sediments inhibited Desulfobulbus and Desulfobacterium. These data complement our understanding of the ecophysiology of the organisms detected and make a firm connection between the capabilities of species, as observed in the laboratory, and their roles in the environment.

Abstract:

The correlated k-distribution (CKD) method is widely used in the radiative transfer schemes of atmospheric models and involves dividing the spectrum into a number of bands and then reordering the gaseous absorption coefficients within each one. The fluxes and heating rates for each band may then be computed by discretizing the reordered spectrum using of order 10 quadrature points per major gas and performing a monochromatic radiation calculation for each point. In this presentation it is shown that for clear-sky longwave calculations, sufficient accuracy for most applications can be achieved without the need for bands: reordering may be performed on the entire longwave spectrum. The resulting full-spectrum correlated-k (FSCK) method requires significantly fewer monochromatic calculations than standard CKD to achieve a given accuracy. The concept is first demonstrated by comparison with line-by-line calculations for an atmosphere containing only water vapor, in which it is shown that the accuracy of heating-rate calculations improves approximately in proportion to the square of the number of quadrature points. For more than around 20 points, the root-mean-squared error flattens out at around 0.015 K/day due to the imperfect rank correlation of absorption spectra at different pressures in the profile. The spectral overlap of m different gases is treated by considering an m-dimensional hypercube in which each axis corresponds to the reordered spectrum of one of the gases. This hypercube is then divided into a number of volumes, each approximated by a single quadrature point, such that the total number of quadrature points is slightly smaller than the sum of the numbers that would be required to treat each of the gases separately. The gaseous absorptions for each quadrature point are optimized so that they minimize a cost function expressing the deviation of the heating rates and fluxes calculated by the FSCK method from line-by-line calculations for a number of training profiles. This approach is validated for atmospheres containing water vapor, carbon dioxide, and ozone, for which it is found that in the troposphere and most of the stratosphere, heating-rate errors of less than 0.2 K/day can be achieved using a total of 23 quadrature points, decreasing to less than 0.1 K/day for 32 quadrature points. It would be relatively straightforward to extend the method to include other gases.
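
A toy sketch of the reordering idea at the heart of the CKD/FSCK approach: the spectrally averaged transmittance computed line by line equals an integral over the reordered (monotonic) absorption spectrum in cumulative-probability space g, which a handful of quadrature points approximates well. The synthetic absorption spectrum and the midpoint quadrature below are illustrative assumptions, not the paper's scheme:

import numpy as np

rng = np.random.default_rng(1)

# Synthetic "line-by-line" absorption coefficients on a spectral grid (illustrative).
nu = np.linspace(0.0, 100.0, 20000)
k_nu = np.exp(rng.normal(-2.0, 2.0, nu.size))       # wide spread of line strengths

u = 0.3                                              # absorber amount (arbitrary units)

# Line-by-line spectral-mean transmittance.
T_lbl = np.mean(np.exp(-k_nu * u))

# Reorder k into a monotonic k(g) distribution and integrate over g in [0, 1]
# with a small number of quadrature points (the essence of the k-distribution method).
k_sorted = np.sort(k_nu)
g = (np.arange(k_nu.size) + 0.5) / k_nu.size
n_quad = 10
g_q = (np.arange(n_quad) + 0.5) / n_quad             # simple midpoint quadrature in g
k_q = np.interp(g_q, g, k_sorted)
T_ckd = np.mean(np.exp(-k_q * u))

print(T_lbl, T_ckd)   # the 10-point reordered integral approximates the full mean well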

Abstract:

In this paper we perform an analytical and numerical study of Extreme Value distributions in discrete dynamical systems. In this setting, recent works have shown how to obtain statistics of extremes in agreement with classical Extreme Value Theory. We pursue these investigations by giving analytical expressions for the Extreme Value distribution parameters of maps that have an absolutely continuous invariant measure. We compare these analytical results with numerical experiments in which we study the convergence to the limiting distributions using the so-called block-maxima approach, pointing out in which cases we obtain robust estimation of the parameters. For regular maps, for which mixing properties do not hold, we show that the fitting procedure to the classical Extreme Value distribution fails, as expected. However, we obtain an empirical distribution that can be explained starting from a different observable function, for which Nicolis et al. (Phys. Rev. Lett. 97(21): 210602, 2006) have found analytical results.
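
A minimal sketch of the block-maxima procedure referred to above, applied to a chaotic map with an absolutely continuous invariant measure; the map, the distance-type observable, the reference point and the block size are illustrative choices, and the GEV fit uses scipy's genextreme:

import numpy as np
from scipy.stats import genextreme

# Orbit of the fully chaotic logistic map x -> 4 x (1 - x), which has an absolutely
# continuous invariant measure; used here only as an illustrative dynamical system.
n_iter, x = 2_000_000, 0.2345
orbit = np.empty(n_iter)
for i in range(n_iter):
    orbit[i] = x
    x = 4.0 * x * (1.0 - x)

# Distance-type observable with respect to a reference point zeta on the attractor.
zeta = 0.31
obs = -np.log(np.abs(orbit - zeta))

# Block-maxima approach: split the series into blocks and fit a GEV to the maxima.
block = 10_000
maxima = obs[: (n_iter // block) * block].reshape(-1, block).max(axis=1)
c, loc, scale = genextreme.fit(maxima)   # note scipy's shape c corresponds to -xi
print(c, loc, scale)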

Abstract:

In this paper we perform an analytical and numerical study of Extreme Value distributions in discrete dynamical systems that have a singular measure. Using the block-maxima approach described in Faranda et al. [2011], we show numerically that the Extreme Value distribution for these maps can be associated with the Generalised Extreme Value family, with parameters that scale with the information dimension. The numerical analysis is performed on a few low-dimensional maps. For the middle-third Cantor set and the Sierpinski triangle, obtained using Iterated Function Systems, the experimental parameters show very good agreement with the theoretical values. For strange attractors such as the Lozi and Hénon maps, a slower convergence to the Generalised Extreme Value distribution is observed. Even in the presence of large statistics, the observed convergence is slower than for maps that have an absolutely continuous invariant measure. Nevertheless, within the computed uncertainty range, the results are in good agreement with the theoretical estimates.
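
For completeness, a sketch of how an orbit on the Sierpinski triangle can be generated with an Iterated Function System (the chaos game); block maxima of a distance observable along such an orbit would then be fitted to a GEV exactly as in the previous sketch. The vertices, orbit length and choice of reference point are illustrative:

import numpy as np

rng = np.random.default_rng(2)

# Chaos game for the Sierpinski triangle: repeatedly move halfway towards a
# randomly chosen vertex of the triangle (an IFS with three contractions).
vertices = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])
n_iter = 500_000
pts = np.empty((n_iter, 2))
p = np.array([0.1, 0.1])
for i in range(n_iter):
    v = vertices[rng.integers(3)]
    p = 0.5 * (p + v)
    pts[i] = p

# Distance observable with respect to a point on the attractor (the last orbit point,
# excluded from the series to avoid a zero distance); its block maxima can then be
# fitted to a GEV as in the previous example.
zeta = pts[-1]
obs = -np.log(np.linalg.norm(pts[:-1] - zeta, axis=1))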

Abstract:

The nucleotide sequence of a 3 kb region immediately upstream of the sef operon of Salmonella enteritidis was determined. A 1230 base pair insertion sequence which shared sequence identity (>75%) with members of the IS3 family was revealed. This element, designated IS1230, had terminal inverted repeats almost identical (90% identity) to those of Escherichia coli IS3 but, unlike other IS3-like sequences, lacked the two characteristic open reading frames which encode the putative transposase. S. enteritidis possessed only one copy of this insertion sequence, although Southern hybridisation analysis of restriction digests of genomic DNA revealed another, weakly hybridising fragment located in a region distinct from the sef operon, suggesting the presence of an IS1230 homologue. The distribution of IS1230 and IS1230-like elements was shown to be widespread amongst salmonellas, and the patterns of hybridising restriction fragments differed significantly between Salmonella serotypes; it is therefore suggested that IS1230 has potential for development as a differential diagnostic tool.