Abstract:
Using the independent-particle model as our basis, we present a scheme to reduce the complexity and computational effort of calculating inclusive probabilities in many-electron collision systems. As an example we present an application to K–K charge transfer in collisions of 2.6 MeV Ne^{9+} on Ne. We give impact-parameter-dependent probabilities for the many-particle states that can lead to KLL Auger electrons after the collision, and we compare them with experimental values.
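A worked equation may help fix ideas here. In the independent-particle model the many-electron dynamics is reduced to single-particle amplitudes, and inclusive probabilities follow from determinants of a one-particle density matrix built from those amplitudes. The following is a minimal sketch of that structure; the notation (a_{fk} for the single-particle amplitude from initially occupied orbital k to final orbital f) is assumed, not taken from the paper.

% One-particle density matrix built from single-particle amplitudes,
% summed over the N initially occupied orbitals k:
\gamma_{f f'} \;=\; \sum_{k=1}^{N} a_{f k}\, a_{f' k}^{*}

% Inclusive probability of finding q specified final orbitals
% f_1,\dots,f_q occupied, irrespective of the remaining N-q electrons:
P_{f_1 \dots f_q} \;=\; \det\!\big(\gamma_{f_i f_j}\big)_{i,j=1}^{q}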
Abstract:
Using the single-particle amplitudes from a 20-level coupled-channel calculation with ab initio relativistic self-consistent LCAO-MO Dirac-Fock-Slater energy eigenvalues and matrix elements, we calculate impact-parameter-dependent K-hole transfer probabilities within the framework of the inclusive probability formalism. As an example we show results for the heavy asymmetric collision system S^{15+} on Ar for impact energies from 4.7 to 16 MeV. The inclusive probability formalism, which reinstates the many-particle aspect of the collision system, permits qualitative and quantitative agreement with experiment that is not achieved in the single-particle picture.
Abstract:
Graphical techniques for modeling the dependencies of random variables have been explored in a variety of different areas including statistics, statistical physics, artificial intelligence, speech recognition, image processing, and genetics. Formalisms for manipulating these models have been developed relatively independently in these research communities. In this paper we explore hidden Markov models (HMMs) and related structures within the general framework of probabilistic independence networks (PINs). The paper contains a self-contained review of the basic principles of PINs. It is shown that the well-known forward-backward (F-B) and Viterbi algorithms for HMMs are special cases of more general inference algorithms for arbitrary PINs. Furthermore, the existence of inference and estimation algorithms for more general graphical models provides a set of analysis tools for HMM practitioners who wish to explore a richer class of HMM structures. Examples of relatively complex models to handle sensor fusion and coarticulation in speech recognition are introduced and treated within the graphical model framework to illustrate the advantages of the general approach.
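The forward-backward recursions mentioned above are easy to state concretely. The following Python sketch (model parameters and variable names are illustrative, not taken from the paper) computes scaled forward and backward messages for a discrete HMM and the resulting posterior state probabilities.

import numpy as np

def forward_backward(pi, A, B, obs):
    """Posterior state probabilities for a discrete HMM.

    pi : (K,)   initial state distribution
    A  : (K, K) transition matrix, A[i, j] = P(z_t = j | z_{t-1} = i)
    B  : (K, M) emission matrix,  B[i, o] = P(x_t = o | z_t = i)
    obs: sequence of observation indices
    """
    T, K = len(obs), len(pi)
    alpha = np.zeros((T, K))   # scaled forward messages
    beta = np.zeros((T, K))    # scaled backward messages
    scale = np.zeros(T)

    alpha[0] = pi * B[:, obs[0]]
    scale[0] = alpha[0].sum()
    alpha[0] /= scale[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        scale[t] = alpha[t].sum()
        alpha[t] /= scale[t]

    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]

    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)  # P(z_t | all observations)
    return gamma

# Tiny usage example with made-up parameters
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.2, 0.8]])
B = np.array([[0.9, 0.1], [0.3, 0.7]])
print(forward_backward(pi, A, B, [0, 1, 1, 0]))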
Abstract:
Compositional data analysis motivated the introduction of a complete Euclidean structure in the simplex of D parts. This was based on the early work of J. Aitchison (1986) and completed recently when the Aitchison distance in the simplex was associated with an inner product and orthonormal bases were identified (Aitchison and others, 2002; Egozcue and others, 2003). A partition of the support of a random variable generates a composition by assigning the probability of each interval to a part of the composition. One can imagine that the partition can be refined, so that the probability density would represent a kind of continuous composition of probabilities in a simplex of infinitely many parts. This intuitive idea leads to a Hilbert space of probability densities by generalizing the Aitchison geometry for compositions in the simplex to the set of probability densities.
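For reference, the finite-dimensional Aitchison structure that the abstract generalizes can be written compactly. The notation below is the usual one in compositional data analysis and is a sketch rather than a quotation from the paper.

% Centered log-ratio (clr) transform of a composition x = (x_1, ..., x_D) in the simplex,
% with g(x) the geometric mean of its parts:
\operatorname{clr}(x) = \Big(\ln\frac{x_1}{g(x)}, \dots, \ln\frac{x_D}{g(x)}\Big),
\qquad g(x) = \Big(\prod_{i=1}^{D} x_i\Big)^{1/D}

% Aitchison inner product and distance:
\langle x, y\rangle_A = \sum_{i=1}^{D} \ln\frac{x_i}{g(x)}\,\ln\frac{y_i}{g(y)},
\qquad d_A(x, y) = \big\|\operatorname{clr}(x) - \operatorname{clr}(y)\big\|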
Abstract:
The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P) and related to their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way quite elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combining statistical information, such as Bayesian updating or the combination of likelihood and robust M-estimation functions, amounts to simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turn out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
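One common way to write the generalized inner product the abstract refers to, sketched here without claiming the paper's exact normalization, is as a covariance of log-densities under the reference measure P.

% Densities f, g with respect to a reference measure P; clr-like centering:
\operatorname{clr}_P(f) = \ln f - \mathrm{E}_P[\ln f]

% Generalized Aitchison inner product on A^2(P):
\langle f, g\rangle_{A^2(P)} = \mathrm{E}_P\big[\operatorname{clr}_P(f)\,\operatorname{clr}_P(g)\big]
 = \operatorname{Cov}_P(\ln f, \ln g)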
Abstract:
In this paper, we define a new scheme to develop and evaluate protection strategies for building reliable GMPLS networks. It is based on what we have called the network protection degree (NPD). The NPD consists of an a priori evaluation, the failure sensibility degree (FSD), which provides the failure probability, and an a posteriori evaluation, the failure impact degree (FID), which determines the impact on the network in case of failure, in terms of packet loss and recovery time. Having mathematically formulated these components, experimental results demonstrate the benefits of using the NPD to enhance some current QoS routing algorithms in order to offer a certain degree of protection.
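The abstract does not reproduce the paper's formulas; the Python sketch below only illustrates the idea of combining an a priori failure probability (FSD) with an a posteriori impact measure (FID) into a single score used for path selection. The weighting scheme and the FID aggregation here are hypothetical placeholders, not the authors' definitions.

def failure_impact_degree(packet_loss, recovery_time, loss_weight=0.5):
    """Hypothetical FID: a normalized mix of packet loss and recovery time."""
    return loss_weight * packet_loss + (1.0 - loss_weight) * recovery_time

def network_protection_degree(fsd, fid, alpha=0.5):
    """Hypothetical NPD: convex combination of failure probability (FSD)
    and failure impact (FID); lower values mean a better-protected path."""
    return alpha * fsd + (1.0 - alpha) * fid

# Illustrative use inside a QoS routing loop: prefer the candidate path
# whose (hypothetical) NPD is lowest.
candidates = {
    "path_a": network_protection_degree(fsd=0.02, fid=0.30),
    "path_b": network_protection_degree(fsd=0.05, fid=0.10),
}
best = min(candidates, key=candidates.get)
print(best, candidates[best])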
Abstract:
In networks with small buffers, such as networks based on optical packet switching, the convolution approach (CA) is regarded as one of the most accurate methods for connection admission control. Admission control and resource management have been addressed in other works oriented to bursty traffic and ATM. This paper focuses on heterogeneous traffic in OPS-based networks. For heterogeneous traffic in bufferless networks, the enhanced convolution approach (ECA) is a good solution. However, both methods (CA and ECA) have a high computational cost for a large number of connections. Two new mechanisms (UMCA and ISCA), based on the Monte Carlo method, are proposed to overcome this drawback. Simulation results show that our proposals achieve a lower computational cost than the enhanced convolution approach, with a small stochastic error in the probability estimation.
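The abstract does not detail UMCA or ISCA; the following is a generic Monte Carlo sketch of the underlying idea for a bufferless link: sample the instantaneous load of heterogeneous on-off connections and estimate the probability that the offered load exceeds the link capacity. All parameter names and traffic values are illustrative.

import random

def overflow_probability(connections, capacity, samples=100_000):
    """Monte Carlo estimate of P(sum of active rates > capacity) for a
    bufferless link. Each connection is (activity_probability, peak_rate)."""
    overflows = 0
    for _ in range(samples):
        load = sum(rate for p_on, rate in connections if random.random() < p_on)
        if load > capacity:
            overflows += 1
    return overflows / samples

# Heterogeneous traffic mix (activity probability, peak rate), illustrative values
mix = [(0.3, 10.0)] * 40 + [(0.1, 50.0)] * 10
print(overflow_probability(mix, capacity=400.0))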
Abstract:
This paper presents a case study of the business longevity of the company Casa Dental Eduardo Daza Ltda, which, as a family business, defies the tendency of such firms to disappear by the third generation, as most family businesses worldwide have done. First, family businesses worldwide are contextualized and characterized through a review of the existing literature, in order to consolidate a frame of reference that allows an approximation to the state of the art in the corporate sphere of family-owned firms; their current dynamics in the Colombian economy are then analyzed, moving from the general to the particular. Next, the characteristics that mark the success of family businesses are presented: their management and governance, the succession protocol, and strategic direction toward innovation as a guarantee of longevity over time. After this contextualization, the particular features of Casa Dental Eduardo Daza Ltda are presented from a historical perspective: its corporate vicissitudes, its decline and rebirth, up to its becoming a family-business success story, supported by its economic and financial performance over the last three years, in order to define its particular traits, trends, and outlook for the immediate future. It is concluded that family businesses, as the oldest form of business organization, are widely recognized as an important and distinctive actor in the world economy, since the productive apparatus of many developed and developing countries rests on family firms. The empirical evidence points to two distinctive characteristics of family businesses: 1) in almost all countries these firms account for a large share of wealth creation; and 2) they have a high probability of disappearing. In every economy, to a greater or lesser degree, these two significant phenomena coexist: their economic importance and their vulnerability.
Abstract:
Examines the different kinds of animals that became extinct naturally long ago and those currently endangered by human activities. A simple text explains why so many have become extinct and how we know they once existed. For ages nine to twelve.
Abstract:
Abstract available in Portuguese, Spanish and French. Abstract based on that of the publication.
Abstract:
This thesis is based on the programme to reintroduce the Eurasian otter (Lutra lutra) to the Muga and Fluvià river basins (Catalonia) during the second half of the 1990s. The aims of the thesis were to demonstrate the feasibility of the reintroduction, to demonstrate its success, to study ecological and ethological aspects of the species, taking advantage of the unique opportunity offered by a "designed" population, and to determine the long-term survival probability of the population. The reintroduction of the otter to the Muga and Fluvià basins succeeded: the geographical area effectively occupied increased to 64% positive survey stations in winter 2001-02. The finding of three adult individuals born in the reintroduction area is further evidence of the programme's success. The density of individuals calculated from visual censuses was low (0.04-0.11 otters/km), but close to what can be expected in the early stages of a reintroduced population, still small in number but distributed over a large area. Post-release mortality was 22% one year after release, similar to or lower than that of other successful otter reintroduction programmes. Mortality was mainly due to road kills (56%). The activity pattern of the reintroduced otters was mainly nocturnal and crepuscular, with little diurnal activity. Their home ranges were of the same order (34.2 km) as those calculated in other European studies. The mean length of river covered by an otter in 24 hours was 4.2 km for females and 7.6 km for males. During the radio-tracking period two females bred and their movements could be studied in detail. The response of the new otter population to the seasonal fluctuations in water availability typical of Mediterranean regions was to concentrate in a smaller area during the summer drought, owing to the increase in dry stretches uninhabitable for otters because of the lack of food, which caused periodic expansions and contractions of the distribution area. The long-term persistence of the reintroduced population was studied by means of a Population Viability Analysis (PVA). The result was a low risk of population extinction over the next 100 years, and most of the simulated scenarios (65%) met the criterion of at least a 90% probability of survival. The population model indicates that a key point for ensuring the viability of the reintroduced population is the reduction of accidental mortality. In the study area, road kills cause more than 50% of mortality, and this can be reduced by building wildlife crossings, fencing some dangerous road stretches, and controlling speed on some roads. The reintroduction project developed a protocol for the capture, handling, and release of wild otters that may contain useful information for similar programmes. It also provided a unique opportunity to study an artificially designed population and to compare several methods of estimating the distribution and density of otter populations.
Finally, the reintroduction carried out in the Muga and Fluvià river basins has succeeded in creating a new otter population that persists over time, breeds regularly, and is progressively dispersing, even into new river basins.
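The PVA itself was carried out with parameters estimated in the thesis; purely as an illustration of how an extinction probability over 100 years can be obtained by simulation, here is a minimal stochastic sketch in Python. All demographic values are placeholders, not the thesis parameters.

import random

def extinction_probability(n0=20, years=100, runs=1_000,
                           annual_mortality=0.25, breeding_prob=0.5):
    """Minimal stochastic population sketch: each year every otter may die
    (natural plus accidental mortality) and each surviving female may breed.
    Returns the fraction of simulation runs in which the population hits zero."""
    extinct = 0
    for _ in range(runs):
        n = n0
        for _ in range(years):
            survivors = sum(1 for _ in range(n)
                            if random.random() > annual_mortality)
            females = survivors // 2
            births = sum(random.randint(1, 3) for _ in range(females)
                         if random.random() < breeding_prob)
            n = min(survivors + births, 100)  # crude carrying-capacity cap
            if n == 0:
                extinct += 1
                break
    return extinct / runs

# Reducing accidental mortality (folded into annual_mortality here) is the
# kind of scenario comparison a PVA explores.
print(extinction_probability(annual_mortality=0.25))
print(extinction_probability(annual_mortality=0.18))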
Abstract:
This dissertation focuses on the problem of providing mechanisms for routing point-to-point and multipoint connections in ATM networks. In general, the notion of a multipoint connection refers to connections that involve a group of more than two users. The main objective of this dissertation is to contribute to the design of efficient routing protocols with alternative routes in fully connected VP-based ATM networks for the establishment of point-to-point and multipoint VC connections. An efficient route should be computed during this connection establishment phase.
Abstract:
AEA Technology has provided an assessment of the probability of α-mode containment failure for the Sizewell B PWR. After a preliminary review of the methodologies available it was decided to use the probabilistic approach described in the paper, based on an extension of the methodology developed by Theofanous et al. (Nucl. Sci. Eng. 97 (1987) 259–325). The input to the assessment is 12 probability distributions; the bases for the quantification of these distributions are discussed. The α-mode assessment performed for the Sizewell B PWR has demonstrated the practicality of the event-tree method with input data represented by probability distributions. The assessment itself has drawn attention to a number of topics, which may be plant and sequence dependent, and has indicated the importance of melt relocation scenarios. The α-mode failure probability following an accident that leads to core melt relocation to the lower head for the Sizewell B PWR has been assessed as a few parts in 10 000, on the basis of current information. This assessment has been the first to consider elevated pressures (6 MPa and 15 MPa) besides atmospheric pressure, but the results suggest only a modest sensitivity to system pressure.
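The event-tree method with distributional inputs can be illustrated very simply: sample each uncertain input from its distribution, propagate the samples through the tree of conditional events, and read the failure probability off the resulting distribution. The toy three-branch tree and all distributions below are placeholders, not the twelve distributions used in the Sizewell B assessment.

import random

def alpha_mode_failure_sample():
    """One Monte Carlo sample through a toy three-node event tree:
    melt relocation -> energetic steam explosion -> containment failure.
    Each conditional probability is drawn from a placeholder distribution."""
    p_relocation = random.betavariate(2, 5)     # melt relocates to lower head
    p_explosion = random.betavariate(1, 20)     # in-vessel steam explosion
    p_containment = random.betavariate(1, 50)   # missile defeats containment
    return p_relocation * p_explosion * p_containment

samples = sorted(alpha_mode_failure_sample() for _ in range(100_000))
print("mean failure probability:", sum(samples) / len(samples))
print("95th percentile:", samples[int(0.95 * len(samples))])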
Abstract:
The purpose of Research Theme 4 (RT4) was to advance understanding of the basic science issues at the heart of the ENSEMBLES project, focusing on the key processes that govern climate variability and change, and that determine the predictability of climate. Particular attention was given to understanding linear and non-linear feedbacks that may lead to climate surprises, and to understanding the factors that govern the probability of extreme events. Improved understanding of these issues will contribute significantly to the quantification and reduction of uncertainty in seasonal to decadal predictions and projections of climate change. RT4 exploited the ENSEMBLES integrations (stream 1) performed in RT2A as well as undertaking its own experimentation to explore key processes within the climate system. It was working at the cutting edge of problems related to climate feedbacks, the interaction between climate variability and climate change (especially how climate change pertains to extreme events), and the predictability of the climate system on a range of time-scales. The statistical methodologies developed for extreme event analysis are new and state-of-the-art. The RT4-coordinated experiments, which have been conducted with six different atmospheric GCMs forced by common time-invariant sea surface temperature (SST) and sea-ice fields (removing some sources of inter-model variability), are designed to help to understand model uncertainty (rather than scenario or initial-condition uncertainty) in predictions of the response to greenhouse-gas-induced warming. RT4 links strongly with RT5 on the evaluation of the ENSEMBLES prediction system and feeds back its results to RT1 to guide improvements in the Earth system models and, through its research on predictability, to steer the development of methods for initialising the ensembles.