902 results for Eigenvalue Bounds
Abstract:
For a design D, define spec(D) = {|M| : M is a minimal defining set of D} to be the spectrum of minimal defining sets of D. In this note we give bounds on the size of an element of spec(D) when D is a Steiner system. We also show that the spectrum of minimal defining sets of the Steiner triple system given by the points and lines of PG(3,2) equals {16,17,18,19,20,21,22}, and point out some open questions concerning the Steiner triple systems associated with PG(n,2) in general. (C) 2002 Elsevier Science B.V. All rights reserved.
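As a concrete illustration of these notions (ours, not taken from the paper), the sketch below enumerates by backtracking all 30 Steiner triple systems STS(7) on seven labeled points and computes the spectrum of minimal defining sets of one of them; a set of blocks is a defining set when it extends to a unique STS(7), and it is minimal when no proper subset is defining. All function names are illustrative.

```python
from itertools import combinations

POINTS = range(7)
TRIPLES = list(combinations(POINTS, 3))

def all_sts7():
    """Enumerate all STS(7) on 7 labeled points: choose 7 triples so that
    every pair of points lies in exactly one triple.  Branching on the first
    uncovered pair generates each system exactly once."""
    systems = []
    def extend(blocks, covered):
        if len(blocks) == 7:                    # an STS(7) has exactly 7 blocks
            systems.append(frozenset(blocks))
            return
        pair = next(p for p in combinations(POINTS, 2) if p not in covered)
        for t in TRIPLES:
            if pair[0] in t and pair[1] in t:
                pairs = set(combinations(t, 2))
                if pairs & covered:
                    continue
                extend(blocks + [t], covered | pairs)
    extend([], set())
    return systems

def spectrum_of_minimal_defining_sets(D, systems):
    """spec(D) = sizes of minimal defining sets of the design D."""
    blocks = sorted(D)
    defining = []
    for r in range(len(blocks) + 1):
        for S in combinations(blocks, r):
            # S is defining iff exactly one system contains it
            if sum(1 for T in systems if set(S) <= T) == 1:
                defining.append(frozenset(S))
    minimal = [S for S in defining if not any(S2 < S for S2 in defining)]
    return sorted({len(S) for S in minimal})
```

Running `spectrum_of_minimal_defining_sets` on any of the 30 systems (all isomorphic to the Fano plane, the points and lines of PG(2,2)) gives its spectrum; brute force is feasible here because each design has only 2^7 block subsets.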
Abstract:
We consider discrete two-point boundary value problems of the form D^2 y(k+1) = f(kh, y(k), Dy(k)), for k = 1,...,n-1, (0,0) = G((y(0),y(n)); (Dy(1),Dy(n))), where Dy(k) = (y(k) - y(k-1))/h and h = 1/n. This arises as a finite difference approximation to y'' = f(x,y,y'), x in [0,1], (0,0) = G((y(0),y(1)); (y'(0),y'(1))). We assume that f and G = (g(0), g(1)) are continuous and fully nonlinear, that there exist pairs of strict lower and strict upper solutions for the continuous problem, and that f and G satisfy additional assumptions that are known to yield a priori bounds on, and to guarantee the existence of, solutions of the continuous problem. Under these assumptions we show that there are at least three distinct solutions of the discrete approximation which approximate solutions to the continuous problem as the grid size h goes to 0. (C) 2003 Elsevier Science Ltd. All rights reserved.
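The discretization above can be exercised on a linear model problem. The sketch below is ours, not the paper's fully nonlinear setting with boundary operator G: it solves y'' = f(x) with Dirichlet data on the same uniform grid x_k = kh, h = 1/n, via the standard second-difference scheme and the Thomas algorithm for the resulting tridiagonal system.

```python
import math

def solve_linear_bvp(f, a, b, n):
    """Finite-difference solution of y'' = f(x) on [0,1], y(0)=a, y(1)=b.
    Unknowns y_1..y_{n-1} satisfy (y_{k+1} - 2 y_k + y_{k-1})/h^2 = f(x_k);
    the tridiagonal system is solved with the Thomas algorithm."""
    h = 1.0 / n
    m = n - 1
    diag = [-2.0] * m
    rhs = [h * h * f(k * h) for k in range(1, n)]
    rhs[0] -= a          # move boundary values to the right-hand side
    rhs[-1] -= b
    # forward elimination (sub- and super-diagonal entries are all 1)
    for i in range(1, m):
        w = 1.0 / diag[i - 1]
        diag[i] -= w
        rhs[i] -= w * rhs[i - 1]
    # back substitution
    y = [0.0] * m
    y[-1] = rhs[-1] / diag[-1]
    for i in range(m - 2, -1, -1):
        y[i] = (rhs[i] - y[i + 1]) / diag[i]
    return [a] + y + [b]
```

For f(x) = -pi^2 sin(pi x) with zero boundary data the exact solution is sin(pi x), and the discrete solution converges at the expected O(h^2) rate as h goes to 0.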
Abstract:
This article presents Monte Carlo techniques for estimating network reliability. For highly reliable networks, techniques based on graph evolution models provide very good performance; however, they are known to have a significant simulation cost. An existing hybrid scheme (based on partitioning the time space) is available to speed up the simulations, but there are difficulties with optimizing the important parameter associated with this scheme. To overcome these difficulties, a new hybrid scheme (based on partitioning the edge set) is proposed in this article. The proposed scheme shows orders-of-magnitude performance improvements over the existing techniques in certain classes of networks. It also provides reliability bounds with little overhead.
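The crude estimator that such schemes improve upon can be sketched as follows (an illustrative baseline of ours, not the article's hybrid scheme): sample each edge's up/down state independently and count the fraction of samples in which the surviving edges connect all nodes. Its weakness for highly reliable networks, where failures are rare events, is exactly what motivates evolution-model and hybrid variance-reduction techniques.

```python
import random

def mc_reliability(n, edges, p, samples, seed=1):
    """Crude Monte Carlo estimate of all-terminal reliability: each edge
    works independently with probability p; return the fraction of samples
    in which the working edges connect all n nodes."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        up = [e for e in edges if rng.random() < p]
        # union-find connectivity check on the sampled subgraph
        parent = list(range(n))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]   # path halving
                x = parent[x]
            return x
        for u, v in up:
            parent[find(u)] = find(v)
        if len({find(v) for v in range(n)}) == 1:
            hits += 1
    return hits / samples

# triangle network: exact all-terminal reliability is 3 p^2 (1-p) + p^3,
# which is 0.972 for p = 0.9
est = mc_reliability(3, [(0, 1), (1, 2), (0, 2)], 0.9, 20000)
```

The standard error of this estimator scales like sqrt(R(1-R)/samples), which is why it needs enormous sample counts when the failure probability 1-R is tiny.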
Abstract:
A review of spontaneous rupture in thin films with tangentially immobile interfaces is presented that emphasizes the theoretical developments of film drainage and corrugation growth through the linearization of lubrication theory in a cylindrical geometry. Spontaneous rupture occurs when corrugations from adjacent interfaces become unstable and grow to a critical thickness. A corrugated interface is composed of a number of waveforms, and each waveform becomes unstable at a unique transition thickness. The onset of instability occurs at the maximum transition thickness, and it is shown that only upper and lower bounds of this thickness can be predicted from linear stability analysis. The upper bound is equivalent to the Frenkel criterion and is obtained from the zeroth-order approximation of the H-3 term in the evolution equation. This criterion is determined solely by the film radius, interfacial tension and Hamaker constant. The lower bound is obtained from the first-order approximation of the H-3 term in the evolution equation and is dependent on the film thinning velocity. A semi-empirical equation, referred to as the MTR equation, is obtained by combining the drainage theory of Manev et al. [J. Dispersion Sci. Technol., 18 (1997) 769] and the experimental measurements of Radoev et al. [J. Colloid Interface Sci. 95 (1983) 254] and is shown to provide accurate predictions of film thinning velocity near the critical thickness of rupture. The MTR equation permits the prediction of the lower bound of the maximum transition thickness based entirely on film radius, Plateau border radius, interfacial tension, temperature and Hamaker constant. The MTR equation extrapolates to the Reynolds equation under conditions when the Plateau border pressure is small, which provides a lower bound for the maximum transition thickness that is equivalent to the criterion of Gumerman and Homsy [Chem. Eng. Commun. 2 (1975) 27].
The relative accuracy of either bound is thought to depend on the amplitude of the hydrodynamic corrugations, and a semi-empirical correlation is also obtained that permits the amplitude to be calculated as a function of the upper and lower bounds of the maximum transition thickness. The relationship between the evolving theoretical developments is demonstrated by three film thickness master curves, which reduce to simple analytical expressions under limiting conditions when the drainage pressure drop is controlled by either the Plateau border capillary pressure or the van der Waals disjoining pressure. The master curves greatly simplify evaluation of the various theoretical predictions over the entire range of the linear approximation. Finally, it is shown that when the Frenkel criterion is used to assess film stability, recent studies reach conclusions that argue against the relevance of spontaneous rupture as a cell-opening mechanism in foams. (C) 2003 Elsevier Science B.V. All rights reserved.
Abstract:
We analyze the sequences of round-off errors of the orbits of a discretized planar rotation, from a probabilistic angle. It was shown [Bosio & Vivaldi, 2000] that for a dense set of parameters, the discretized map can be embedded into an expanding p-adic dynamical system, which serves as a source of deterministic randomness. For each parameter value, these systems can generate infinitely many distinct pseudo-random sequences over a finite alphabet, whose average period is conjectured to grow exponentially with the bit-length of the initial condition (the seed). We study some properties of these symbolic sequences, deriving a central limit theorem for the deviations between round-off and exact orbits, and obtain bounds concerning repetitions of words. We also explore some asymptotic problems computationally, verifying, among other things, that the occurrence of words of a given length is consistent with that of an abstract Bernoulli sequence.
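A minimal sketch of the kind of system studied (our simplified floor-discretized rotation x_{n+1} = floor(lambda * x_n) - x_{n-1} with lambda = 2 cos(theta); the cited construction lives on a lattice and its details differ): iterate the integer map and record the per-step round-off eta_n in [0, 1), which is the symbolic data whose statistics the abstract analyzes.

```python
import math

def roundoff_sequence(seed, lam, n_steps):
    """Iterate the discretized rotation x_{n+1} = floor(lam * x_n) - x_{n-1},
    an invertible map of Z^2 approximating the exact linear rotation
    x_{n+1} = lam * x_n - x_{n-1}, and record the round-off made at each
    step, eta_n = lam * x_n - floor(lam * x_n), which lies in [0, 1)."""
    x_prev, x = seed
    etas = []
    for _ in range(n_steps):
        t = lam * x
        etas.append(t - math.floor(t))          # discarded fractional part
        x_prev, x = x, math.floor(t) - x_prev   # discretized rotation step
    return etas
```

Statistical questions of the abstract, such as the central limit behavior of accumulated deviations or word repetitions, amount to studying this sequence of eta_n values over long orbits and many seeds.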
Abstract:
The present work evaluates the performance of MECID (Boundary Element Method with Direct Interpolation) in solving the integral term related to inertia in the Helmholtz equation, thereby allowing the eigenvalue problem to be modeled and the natural frequencies to be computed, comparing it with the results obtained by FEM (Finite Element Method) under the classical Galerkin formulation. First, some problems governed by the Poisson equation are addressed, initiating the performance comparison between the numerical methods considered here. The problems solved apply to different and important areas of engineering, such as heat transfer, electromagnetism and particular elastic problems. Numerically, the difficulties of accurately approximating more complex distributions of loads, sources or sinks inside the domain are well known for any boundary technique. Nevertheless, this work shows that, despite such difficulties, the performance of the Boundary Element Method is superior, both in computing the basic variable and its derivative. To this end, two-dimensional problems are solved concerning elastic membranes, stresses in bars under self-weight and the determination of natural frequencies in acoustic problems on closed domains, among others presented, using meshes with different degrees of refinement, with linear elements and radial basis functions for MECID and degree-one polynomial interpolation basis functions for FEM. Performance curves are generated from the mean percentage error for each mesh, demonstrating the convergence and accuracy of each method. The results are also compared with the analytical solutions, when available, for each example solved in this work.
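A boundary element code is beyond a short example, but the FEM baseline this work compares against can be illustrated (our toy problem and parameters, not the thesis's): linear-element Galerkin FEM for the smallest eigenvalue of -u'' = lambda u on [0,1] with fixed ends (exact value pi^2, i.e. the lowest natural frequency of a fixed-fixed string), using the consistent mass matrix and inverse power iteration.

```python
import math

def fem_smallest_eigenvalue(n, iters=100):
    """Smallest eigenvalue of K u = lam M u from linear-element Galerkin FEM
    for -u'' = lam u, u(0)=u(1)=0, on n equal elements; inverse power
    iteration repeatedly solves K v = M u to converge on the lowest mode."""
    h = 1.0 / n
    m = n - 1                                   # interior nodes

    def mul_K(u):                               # K = (1/h) tridiag(-1, 2, -1)
        return [(2.0 * u[i]
                 - (u[i - 1] if i > 0 else 0.0)
                 - (u[i + 1] if i < m - 1 else 0.0)) / h for i in range(m)]

    def mul_M(u):                               # M = (h/6) tridiag(1, 4, 1)
        return [(4.0 * u[i]
                 + (u[i - 1] if i > 0 else 0.0)
                 + (u[i + 1] if i < m - 1 else 0.0)) * h / 6.0
                for i in range(m)]

    def solve_K(b):                             # Thomas algorithm for K v = b
        diag = [2.0 / h] * m
        rhs = list(b)
        for i in range(1, m):
            w = (-1.0 / h) / diag[i - 1]
            diag[i] -= w * (-1.0 / h)
            rhs[i] -= w * rhs[i - 1]
        v = [0.0] * m
        v[-1] = rhs[-1] / diag[-1]
        for i in range(m - 2, -1, -1):
            v[i] = (rhs[i] + (1.0 / h) * v[i + 1]) / diag[i]
        return v

    u = [1.0] * m
    for _ in range(iters):
        v = solve_K(mul_M(u))                   # one inverse power step
        norm = math.sqrt(sum(x * x for x in v))
        u = [x / norm for x in v]
    num = sum(a * b for a, b in zip(u, mul_K(u)))
    den = sum(a * b for a, b in zip(u, mul_M(u)))
    return num / den                            # Rayleigh quotient
```

With n = 20 elements the discrete eigenvalue is about 9.89 versus pi^2 = 9.8696; the consistent-mass FEM overestimates the exact eigenvalue, and refining the mesh produces the kind of convergence curve the work reports for both methods.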
Abstract:
From a legal point of view, there is an old legal doctrine called "lifting (or disregard) of the corporate veil" which may allow, in brief, the debts of BES to be imputed to Novo Banco. It is also possible to invoke the doctrine of abuse of rights (art. 334 of the Civil Code): "The exercise of a right is illegitimate when its holder manifestly exceeds the limits imposed by good faith, good morals, or the social or economic purpose of that right." Art. 11(8) of the Penal Code applies: "8 - Division and merger do not extinguish the criminal liability of the legal person or equivalent entity, liability for the crime falling on: a) the legal person or equivalent entity into which the merger was effected; and b) the legal persons or equivalent entities resulting from the division."
Abstract:
Principal component analysis is a multivariate statistical technique used to examine the interdependence among variables. Its main feature is the ability to reduce data, and it has been used in the development of psychiatric research instruments and in the classification of psychiatric disorders. This technique was used to study the factor structure of the Adult Psychiatric Morbidity Questionnaire (QMPA). The questionnaire comprises 45 yes/no questions that identify psychiatric symptoms, service use and use of psychotropic drugs. It was administered to 6,470 individuals over 15 years of age, in representative population samples from three Brazilian cities (Brasília, São Paulo and Porto Alegre). The study aimed to compare the factor structure of the questionnaire across the three Brazilian urban regions. Seven factors were found, explaining 42.7% of the total sample variance: factor 1, anxiety/somatization (eigenvalue (EV) = 3.812, explained variance (VE) = 10.9%); factor 2, irritability/depression (EV = 2.412, VE = 6.9%); factor 3, mental deficiency (EV = 2.014, VE = 5.8%); factor 4, alcoholism (EV = 1.903, VE = 5.4%); factor 5, mood elation (EV = 1.621, VE = 4.6%); factor 6, perception disorder (EV = 1.599, VE = 4.6%); and factor 7, treatment (EV = 1.592, VE = 4.5%). The QMPA showed similar factor structures in the three cities. Based on the findings, suggestions are made to modify some questions and to exclude others in a future version of the questionnaire.
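The machinery behind the figures quoted above can be illustrated on toy data (ours, not the QMPA data): eigen-decompose a 2x2 covariance matrix in closed form and report each component's eigenvalue (EV) and its fraction of the total variance (VE), which in the study would be the eigenvalue divided by the number of questionnaire items.

```python
import math

def pca_2d(data):
    """Principal components of two-variable data: eigen-decompose the 2x2
    sample covariance matrix in closed form; returns (EV, VE) pairs,
    where VE is the fraction of total variance each component explains."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    sxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in data) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)
    # eigenvalues of [[sxx, sxy], [sxy, syy]] via the quadratic formula
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(tr * tr / 4.0 - det)
    ev1, ev2 = tr / 2.0 + disc, tr / 2.0 - disc
    return (ev1, ev1 / tr), (ev2, ev2 / tr)
```

The eigenvalues always sum to the total variance (the trace of the covariance matrix), so the VE fractions sum to one; with 45 items the seven factors of the study jointly reach 42.7%.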
Abstract:
We study the effect that flavor-changing neutral current interactions of the top quark have on the branching ratios of charged decays of the top quark. We performed an integrated analysis using Tevatron and B-factory data and, with only the further assumption that the Cabibbo-Kobayashi-Maskawa matrix is unitary, obtain very restrictive bounds on the strong and electroweak flavor-changing neutral current branching ratios: Br(t -> qX) < 4.0 x 10^(-4), where X is any vector boson and a sum over q = u, c is implied.
Abstract:
Preliminary version
Abstract:
The LHC has found hints of a Higgs particle at 125 GeV. We investigate the possibility that such a particle is a mixture of scalar and pseudoscalar states. For definiteness, we concentrate on a two-Higgs-doublet model with explicit CP violation and soft Z_2 violation. Including all Higgs production mechanisms, we determine the current constraints obtained by comparing h -> gamma gamma with h -> VV*, and comment on the information which can be gained by measurements of h -> b bbar. We find the bound |s_2| <~ 0.83 at one sigma, where |s_2| = 0 (|s_2| = 1) corresponds to a pure scalar (pure pseudoscalar) state.
Abstract:
We study the implications of the searches for H -> tau+ tau- by the ATLAS and CMS collaborations for the parameter space of the two-Higgs-doublet model (2HDM). In the 2HDM, the scalars can decay into a tau pair with a branching ratio larger than the SM one, leading to constraints on the 2HDM parameter space. We show that in model II, values of tan beta > 1.8 are definitively excluded if the pseudoscalar is in the mass range 110 GeV < m_A < 145 GeV. We also discuss the implications for the 2HDM of the recent ATLAS dimuon search for a CP-odd scalar in the mass range 4-12 GeV.
Abstract:
This paper presents a distributed model predictive control (DMPC) scheme for indoor thermal comfort that simultaneously optimizes the consumption of a limited shared energy resource. The control objective of each subsystem is to minimize the heating/cooling energy cost while keeping the indoor temperature and the power used within bounds. In a distributed, coordinated environment, the control uses multiple dynamically decoupled agents (one per subsystem/house) that aim to satisfy the coupling constraints. According to its hourly power demand profile, each house assigns a priority level that indicates how much it is willing to bid in the auction to consume the limited clean resource. This procedure allows the bid value to vary hourly and, consequently, the order in which the agents access the clean energy also varies. Besides the power constraints, every house also has thermal comfort constraints that must be fulfilled. The system is simulated with several houses in a distributed environment.
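The receding-horizon piece of such a controller can be sketched for a single house (all model parameters, the comfort band and the brute-force search over power levels are our illustrative choices; the paper's scheme additionally coordinates multiple houses through the auction): at each hour, search over short heater-power sequences, score each by energy cost plus a comfort-violation penalty, and apply only the first move.

```python
from itertools import product

def mpc_step(T, T_out, price, horizon=4, levels=(0, 1, 2, 3, 4),
             band=(19.0, 24.0), a=0.9, b=0.5, c=0.1, penalty=100.0):
    """One receding-horizon step for a single house.  Illustrative thermal
    model: T_next = a*T + c*T_out + b*u, u the heater power level.  Scores
    every power sequence over the horizon by energy cost plus a penalty for
    leaving the comfort band, and returns only the first move (MPC)."""
    best_u, best_cost = levels[0], float("inf")
    for seq in product(levels, repeat=horizon):
        Tk, cost = T, 0.0
        for u in seq:
            Tk = a * Tk + c * T_out + b * u
            cost += price * u
            cost += penalty * max(0.0, band[0] - Tk, Tk - band[1])
        if cost < best_cost:
            best_cost, best_u = cost, seq[0]
    return best_u

# closed loop over 24 hourly steps, starting inside the comfort band
T, history = 21.0, []
for hour in range(24):
    u = mpc_step(T, T_out=5.0, price=1.0)
    T = 0.9 * T + 0.1 * 5.0 + 0.5 * u       # plant follows the same model
    history.append(T)
```

In the distributed setting of the paper, the hourly `price` seen by each agent would come out of the auction, so a house with a higher bid gets cheaper access to the clean resource and heats more aggressively.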
Abstract:
We present the first version of a new tool to scan the parameter space of generic scalar potentials, ScannerS (Coimbra et al., ScannerS project, 2013). The main goal of ScannerS is to help distinguish between different patterns of symmetry breaking for each scalar potential. In this work we use it to investigate the possibility of excluding regions of the phase diagram of several versions of a complex-singlet extension of the Standard Model with future LHC results. We find that if another scalar is found, one can exclude a phase with a dark matter candidate in definite regions of the parameter space, while predicting whether a third scalar, yet to be found, must be lighter or heavier. The first version of the code is publicly available and contains various generic core routines for tree-level vacuum stability analysis, as well as implementations of collider bounds, dark matter constraints, electroweak precision constraints and tree-level unitarity.
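As a self-contained illustration of the kind of tree-level vacuum-stability cut such scans implement (a toy potential and conditions of our choosing, not ScannerS code): sample the quartic couplings of a Higgs-singlet potential at random and keep only the points where the potential is bounded from below.

```python
import math
import random

def scan_points(n, seed=7):
    """Random scan over the quartic couplings of the illustrative potential
    V ~ lam_h h^4 + lam_s s^4 + lam_hs h^2 s^2.  Writing a = h^2, b = s^2,
    V > 0 for all field directions iff lam_h > 0, lam_s > 0 and
    lam_hs > -2*sqrt(lam_h*lam_s); points are split on that condition."""
    rng = random.Random(seed)
    accepted, rejected = [], []
    for _ in range(n):
        lam_h = rng.uniform(-1.0, 1.0)
        lam_s = rng.uniform(-1.0, 1.0)
        lam_hs = rng.uniform(-2.0, 2.0)
        ok = (lam_h > 0 and lam_s > 0
              and lam_hs > -2.0 * math.sqrt(lam_h * lam_s))
        (accepted if ok else rejected).append((lam_h, lam_s, lam_hs))
    return accepted, rejected
```

A real scan layers further cuts (collider bounds, dark matter relic density, electroweak precision, unitarity) on top of the surviving points in exactly this accept/reject fashion.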
Abstract:
The possibility of creating the baryon asymmetry at the electroweak phase transition in the minimal supersymmetric standard model is considered for the case when right-handed squarks are much lighter than left-handed ones. It is shown that the usual requirement v(T_c)/T_c >~ 1 for baryogenesis can be satisfied in a range of the parameters of the model consistent with present experimental bounds.