987 results for Jackson's theorem
Abstract:
The two-node tandem Jackson network serves as a convenient reference model for the analysis and testing of different methodologies and techniques in rare event simulation. In this paper we consider a new approach to efficiently estimating the probability that the content of the second buffer exceeds some high level L before the buffer empties, starting from a given state. The approach is based on a Markov additive process representation of the buffer processes, leading to an exponential change of measure to be used in an importance sampling procedure. Unlike the changes of measure proposed and studied in the recent literature, the one derived here is a function of the content of the first buffer. We prove that when the first buffer is finite, this method yields asymptotically efficient simulation for any set of arrival and service rates. In fact, the relative error is bounded independently of the level L, a result not established for any other known method. When the first buffer is infinite, we propose a natural extension of the exponential change of measure from the finite-buffer case. In this case, the relative error is shown to be bounded (independently of L) only when the second server is the bottleneck, a result known to hold for some other methods derived through large deviations analysis. When the first server is the bottleneck, experimental results using our method suggest that the relative error grows at most linearly in L.
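As a concrete illustration of the importance-sampling mechanics described above, the following minimal Python sketch estimates a level-crossing probability for a single M/M/1-type queue using the classical exponential tilt that swaps the arrival and service rates. This is a generic illustration of an exponential change of measure with likelihood-ratio correction, not the state-dependent change of measure derived in the paper.

import random

def hit_prob_is(lam, mu, L, n_runs=100_000, seed=1):
    """Importance-sampling estimate of P(hit level L before 0) for the
    embedded random walk of an M/M/1 queue started at level 1.
    Paths are sampled under the tilted measure (lam and mu swapped)
    and each path is weighted by its likelihood ratio, so the
    estimator remains unbiased."""
    random.seed(seed)
    p = lam / (lam + mu)   # up-step probability under the original measure
    q = mu / (lam + mu)    # up-step probability under the tilted measure
    total = 0.0
    for _ in range(n_runs):
        x, lr = 1, 1.0
        while 0 < x < L:
            if random.random() < q:        # sample a step under the tilt
                x, lr = x + 1, lr * p / q  # correct by the likelihood ratio
            else:
                x, lr = x - 1, lr * (1 - p) / (1 - q)
        if x == L:                         # rare event observed
            total += lr
    return total / n_runs

# Sanity check against the gambler's-ruin formula with r = mu/lam (lam < mu):
# P(hit L before 0 | start at 1) = (1 - r) / (1 - r**L).
print(hit_prob_is(0.3, 0.7, L=20))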
Abstract:
A number of authors concerned with the analysis of rock jointing have used the idea that the joint areal or diametral distribution can be linked to the trace length distribution through a theorem attributed to Crofton. This brief paper seeks to demonstrate why Crofton's theorem need not be used to link moments of the trace length distribution captured by scanline or areal mapping to the moments of the diametral distribution of joints represented as disks, and why it is in fact incorrect to do so. The valid relationships, for areal and scanline mapping, between the moments of the trace length distribution and those of the joint size distribution for joints modeled as disks are recalled and compared with those that would follow were Crofton's theorem assumed to apply. For areal mapping the relationship is fortuitously correct, but for scanline mapping it is not.
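The size-bias step at the heart of the argument can be sketched as follows (a hedged reconstruction using Cauchy's mean-chord formula, not the paper's full derivation): a disk of diameter D is intersected by an isotropic sampling plane with probability proportional to D, and, given an intersection, the mean chord (trace) length is \(\pi D/4\). Hence, for areal mapping,

\[
  \mathbb{E}[\ell]
  = \frac{\mathbb{E}\!\left[D \cdot \tfrac{\pi D}{4}\right]}{\mathbb{E}[D]}
  = \frac{\pi}{4}\,\frac{\mathbb{E}[D^{2}]}{\mathbb{E}[D]},
\]

whereas a naive, unconditional application of the chord relation would give \(\tfrac{\pi}{4}\,\mathbb{E}[D]\); under scanline mapping the sampled traces are additionally length-biased, which is where the two answers part company.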
Abstract:
Large values of the mass-to-light ratio (ϒ) in self-gravitating systems are among the most important pieces of evidence for dark matter. We propose an expression for the mass-to-light ratio in spherical systems using MOND. Results for the Coma cluster reveal that a modification of gravity, as proposed by MOND, can significantly reduce this value.
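The mechanism can be summarized in one line (a hedged sketch using the standard MOND relation with the "simple" interpolating function as an example choice; the paper's specific expression may differ). In spherical symmetry MOND relates the true gravity g to the Newtonian gravity g_N of the enclosed mass M(r) by

\[
  \mu\!\left(\frac{g}{a_0}\right) g \;=\; g_N \;=\; \frac{G\,M(r)}{r^{2}},
  \qquad
  \mu(x) = \frac{x}{1+x}, \quad a_0 \approx 1.2\times 10^{-10}\ \mathrm{m\,s^{-2}},
\]

so a purely Newtonian reading of the same kinematics infers \(M_N = g r^{2}/G\) and therefore

\[
  \frac{M_N}{M} \;=\; \frac{1}{\mu(g/a_0)} \;\ge\; 1,
\]

i.e. whenever \(g \lesssim a_0\) the MOND mass, and with it ϒ = M/L, is systematically smaller than the Newtonian estimate.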
Abstract:
We show that a self-generated set of combinatorial games, S, may not be hereditarily closed, but that strong self-generation and hereditary closure are equivalent in the universe of short games. In [13], the question "Is there a set which will give a non-distributive but modular lattice?" appears. We prove a useful necessary condition for the existence of a finite non-distributive modular L(S), and we show the existence of an S such that L(S) is modular and not distributive, exhibiting the first known example. Moreover, we prove a Representation Theorem with Games that allows the generation of all finite lattices in a game context. Finally, a computational tool for drawing lattices of games is presented.
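For readers who have not met it, the smallest modular non-distributive lattice is the diamond M3 (a bottom, three pairwise incomparable atoms, and a top), and by Birkhoff's characterization any modular non-distributive lattice, such as the L(S) above, contains a copy of it. The following short Python check is an illustrative aside, not the paper's game-theoretic construction:

from itertools import product

# The diamond M3: bottom "0", top "1", pairwise incomparable atoms a, b, c.
ELEMS = ["0", "a", "b", "c", "1"]
LEQ = {(x, y): x == "0" or y == "1" or x == y
       for x, y in product(ELEMS, repeat=2)}

def join(x, y):
    # Least upper bound: the upper bound below all other upper bounds.
    ubs = [z for z in ELEMS if LEQ[(x, z)] and LEQ[(y, z)]]
    return next(z for z in ubs if all(LEQ[(z, w)] for w in ubs))

def meet(x, y):
    # Greatest lower bound, dually.
    lbs = [z for z in ELEMS if LEQ[(z, x)] and LEQ[(z, y)]]
    return next(z for z in lbs if all(LEQ[(w, z)] for w in lbs))

# Modular law: x <= z  implies  x v (y ^ z) == (x v y) ^ z.
modular = all(join(x, meet(y, z)) == meet(join(x, y), z)
              for x, y, z in product(ELEMS, repeat=3) if LEQ[(x, z)])

# Distributive law: x ^ (y v z) == (x ^ y) v (x ^ z).
distributive = all(meet(x, join(y, z)) == join(meet(x, y), meet(x, z))
                   for x, y, z in product(ELEMS, repeat=3))

print(modular, distributive)  # prints: True False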
Abstract:
This paper presents the applicability of a reinforcement learning algorithm based on the application of Bayes' theorem. The proposed reinforcement learning algorithm is an advantageous and indispensable tool for ALBidS (Adaptive Learning strategic Bidding System), a multi-agent system whose purpose is to provide decision support to electricity market negotiating players. ALBidS uses a set of different strategies for providing decision support to market players; these strategies are used according to their probability of success in each different context. The approach proposed in this paper uses a Bayesian network to decide on the action most likely to succeed at each time, depending on past events. The performance of the proposed methodology is tested using electricity market simulations in MASCEM (Multi-Agent Simulator of Competitive Electricity Markets), which provides the means for simulating a realistic electricity market environment based on real data from electricity market operators.
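A minimal sketch of this kind of Bayesian, context-dependent strategy selection (class and method names are illustrative assumptions, not the actual ALBidS code): each (context, strategy) pair carries a Beta posterior over its success probability, updated after every observed market outcome, and the strategy with the highest posterior mean is chosen next.

from collections import defaultdict

class BayesianStrategySelector:
    """Per-(context, strategy) Beta-Bernoulli model of success probability,
    updated by Bayes' theorem from observed outcomes."""

    def __init__(self, strategies):
        self.strategies = strategies
        # Beta(1, 1) uniform prior: one pseudo-success and one pseudo-failure.
        self.alpha = defaultdict(lambda: 1)  # successes + 1
        self.beta = defaultdict(lambda: 1)   # failures + 1

    def select(self, context):
        # Choose the strategy with the highest posterior mean success rate.
        def posterior_mean(s):
            a, b = self.alpha[(context, s)], self.beta[(context, s)]
            return a / (a + b)
        return max(self.strategies, key=posterior_mean)

    def update(self, context, strategy, success):
        # Posterior update from one observed market outcome.
        if success:
            self.alpha[(context, strategy)] += 1
        else:
            self.beta[(context, strategy)] += 1

# Hypothetical usage:
# selector = BayesianStrategySelector(["trend", "regression", "game_theory"])
# choice = selector.select(context="peak_hours")
# selector.update("peak_hours", choice, success=True)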
Abstract:
Objective: To translate into Portuguese and evaluate the measurement properties of the Sunderland Scale and the Revised Cubbin & Jackson Scale, instruments designed to assess the risk of pressure ulcer development in intensive care. Methods: The study comprised the translation and adaptation of the scales into Portuguese, as well as the validation of the instruments under study. The sample consisted of 90 patients admitted to the intensive care unit. In the reliability assessment, Cronbach's alpha values of 0.702 and 0.708 were found for the Sunderland Scale and the Revised Cubbin & Jackson Scale, respectively. Criterion (predictive) validation was performed against the Braden Scale (gold standard); the main measures evaluated were sensitivity, specificity, positive predictive value, negative predictive value, and area under the curve, all calculated from the cutoff points defined by the authors. Results: The Sunderland Scale showed 60% sensitivity, 86.7% specificity, 47.4% positive predictive value, 91.5% negative predictive value, and an area under the curve of 0.86. The Revised Cubbin & Jackson Scale showed 73.3% sensitivity, 86.7% specificity, 52.4% positive predictive value, 94.2% negative predictive value, and an area under the curve of 0.91. The Braden Scale showed 100% sensitivity, 5.3% specificity, 17.4% positive predictive value, 100% negative predictive value, and an area under the curve of 0.72. Conclusion: Both instruments proved reliable and valid for use. In this sample, the Revised Cubbin & Jackson Scale obtained better predictive values for the development of pressure ulcers in intensive care.
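For reference, the predictive measures reported above are simple functions of the 2x2 confusion matrix at a given cutoff; a brief sketch follows (the area under the curve requires the full score distribution and is omitted):

def screening_metrics(tp, fp, fn, tn):
    # tp, fp, fn, tn: true/false positive and negative counts at the cutoff.
    return {
        "sensitivity": tp / (tp + fn),  # proportion of ulcer cases flagged as at risk
        "specificity": tn / (tn + fp),  # proportion of non-cases correctly cleared
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }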
Abstract:
The verification and analysis of programs with probabilistic features is a necessary task in current scientific and technological practice. The success and subsequent widespread adoption of hardware-level implementations of communication protocols and of probabilistic solutions to distributed problems make the use of stochastic agents as programming elements more than interesting. In many of these cases the use of random agents produces better and more efficient solutions; in others it provides solutions where it is impossible to find them by traditional methods. These algorithms are generally embedded in multiple hardware mechanisms, so an error in them can produce an unwanted multiplication of their harmful effects. Currently, the greatest effort in the analysis of probabilistic programs goes into the study and development of tools known as probabilistic model checkers. Given a finite model of the stochastic system, these tools automatically compute various performance measures of it. Although this can be quite useful for verifying programs, for general-purpose systems it becomes necessary to check more complete specifications bearing on the correctness of the algorithm. It would even be desirable to obtain the properties of the system automatically, in the form of invariants and counterexamples. This project aims to address the problem of static analysis of probabilistic programs through the use of deductive tools such as theorem provers and SMT solvers, which have shown their maturity and effectiveness in attacking problems of traditional programming. In order not to lose automation in the methods, we will work within the framework of "Abstract Interpretation", which provides an outline for our theoretical development. At the same time, we will put these foundations into practice through concrete implementations that use those tools.
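As a toy instance of the deductive route described above, the sketch below discharges a tiny verification condition with the Z3 SMT solver (Python API, pip install z3-solver). The program fragment and the invariant are hypothetical illustrations, not drawn from the project: one step increments x with probability p and leaves it unchanged otherwise, and we ask Z3 for a counterexample to the claim that the expected post-state never exceeds x + 1.

from z3 import Real, Solver, And, Not, unsat

x, p = Real("x"), Real("p")
expected = p * (x + 1) + (1 - p) * x   # expected value after one probabilistic step

s = Solver()
s.add(And(p >= 0, p <= 1))    # p is a probability
s.add(Not(expected <= x + 1)) # negate the invariant to search for a counterexample
assert s.check() == unsat     # unsat: no counterexample exists, the invariant holds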
Abstract:
The classical central limit theorem states the uniform convergence of the distribution functions of the standardized sums of independent and identically distributed square-integrable real-valued random variables to the standard normal distribution function. While the first versions of the central limit theorem go back to de Moivre (1730) and Laplace (1812), a systematic study of the topic started at the beginning of the last century with the fundamental work of Lyapunov (1900, 1901). By now, extensions of the central limit theorem are available for a multitude of settings, including, e.g., Banach-space-valued random variables as well as substantial relaxations of the assumptions of independence and identical distributions. Furthermore, explicit error bounds have been established, and asymptotic expansions are employed to obtain better approximations. Classical error estimates, like the famous bound of Berry and Esseen, are stated in terms of absolute moments of the random summands and therefore do not reflect a potential closeness of the distributions of the single summands to a normal distribution. Non-classical approaches take this issue into account by providing error estimates based on, e.g., pseudomoments. The latter field of investigation was initiated by the work of Zolotarev in the 1960s and is still in its infancy compared to the development of the classical theory; for example, non-classical error bounds for asymptotic expansions seem not to be available up to now ...
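For concreteness, the bound of Berry and Esseen mentioned above reads, for i.i.d. centered \(X_i\) with variance \(\sigma^2 > 0\) and \(\rho = \mathbb{E}|X_1|^3 < \infty\),

\[
  \sup_{x\in\mathbb{R}}
  \left| \, \mathbb{P}\!\left(\frac{X_1+\cdots+X_n}{\sigma\sqrt{n}} \le x\right) - \Phi(x) \, \right|
  \;\le\; \frac{C\,\rho}{\sigma^{3}\sqrt{n}}
\]

with an absolute constant C; the right-hand side involves only the absolute third moment, which is precisely the "classical" feature that pseudomoment-based bounds are designed to avoid.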
Abstract:
The main aim of this short paper is to advertise the Koosis theorem in the mathematical community, especially among those who study orthogonal polynomials. We (try to) do this by proving a new theorem about asymptotics of orthogonal polynomials for which the Koosis theorem seems to be the most natural tool. Namely, we consider the case when a Szegő measure on the unit circle is perturbed by an arbitrary measure inside the unit disk and an arbitrary Blaschke sequence of point masses outside the unit disk.
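For orientation, the two standing hypotheses can be recalled in their standard forms (a sketch of the usual definitions, not necessarily the paper's exact normalizations): a measure \(d\mu = w(\theta)\,\tfrac{d\theta}{2\pi} + d\mu_s\) on the unit circle is a Szegő measure if

\[
  \int_{0}^{2\pi} \log w(\theta)\,\frac{d\theta}{2\pi} \;>\; -\infty,
\]

and point masses at \(z_k\) with \(|z_k| > 1\) form a Blaschke sequence if

\[
  \sum_{k} \bigl(1 - |z_k|^{-1}\bigr) \;<\; \infty.
\]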