438 results for theorems


Relevance:

10.00%

Publisher:

Abstract:

This research focuses on the behaviour and collapse of masonry arch bridges. Recent decades have seen increasing interest in this structural type, which is still present and in use despite the passage of time and the evolution of transport. Several strategies have been developed over time to simulate the response of such structures, yet even today there is no generally accepted standard method for the assessment of masonry arch bridges. The aim of this thesis is to compare the principal analytical and numerical methods in the literature on case studies, highlighting their strengths and weaknesses. Three methods are examined: i) the Thrust Line Analysis Method; ii) the Mechanism Method; iii) the Finite Element Method. The Thrust Line Analysis Method and the Mechanism Method are analytical methods derived from two of the fundamental theorems of plastic analysis, while the Finite Element Method is a numerical method that uses different discretization strategies to analyze the structure. Each method is applied to the case studies through computer-based implementations that allow a user-friendly application of the principles described. A particular closed-form approach based on an elasto-plastic material model, developed by Belgian researchers, is also studied. To compare the three methods, two case studies have been analyzed: i) a generic single-span masonry arch bridge; ii) a real masonry arch bridge, the Clemente Bridge, built over the Savio River in Cesena. In all the analyses performed, the models are two-dimensional, so that results are comparable across the methods examined. The methods are compared with each other in terms of collapse load and hinge positions.

Relevance:

10.00%

Publisher:

Abstract:

In the present dissertation we consider Feynman integrals in the framework of dimensional regularization. As all such integrals can be expressed in terms of scalar integrals, we focus on this latter kind of integral in its Feynman parametric representation and study its mathematical properties, applying graph theory, algebraic geometry and number theory. The three main topics are the graph-theoretic properties of the Symanzik polynomials, the termination of the sector decomposition algorithm of Binoth and Heinrich, and the arithmetic nature of the Laurent coefficients of Feynman integrals.

The integrand of an arbitrary dimensionally regularized scalar Feynman integral can be expressed in terms of the two well-known Symanzik polynomials. We give a detailed review of the graph-theoretic properties of these polynomials. By the matrix-tree theorem, the first of these polynomials can be constructed from the determinant of a minor of the generic Laplacian matrix of a graph. Using a generalization of this theorem, the all-minors matrix-tree theorem, we derive a new relation which furthermore relates the second Symanzik polynomial to the Laplacian matrix of a graph.

Starting from the Feynman parametric representation, the sector decomposition algorithm of Binoth and Heinrich serves for the numerical evaluation of the Laurent coefficients of an arbitrary Feynman integral in the Euclidean momentum region. This widely used algorithm contains an iterated step, consisting of an appropriate decomposition of the domain of integration and the deformation of the resulting pieces. This procedure leads to a disentanglement of the overlapping singularities of the integral. By giving a counter-example we exhibit the problem that this iterative step of the algorithm does not terminate in every possible case. We solve this problem by presenting an appropriate extension of the algorithm which is guaranteed to terminate. This is achieved by mapping the iterative step to an abstract combinatorial problem known as Hironaka's polyhedra game. We present a publicly available implementation of the improved algorithm. Furthermore we explain the relationship of the sector decomposition method to the resolution of singularities of a variety, given by a sequence of blow-ups, in algebraic geometry.

Motivated by the connection between Feynman integrals and topics of algebraic geometry, we consider the set of periods as defined by Kontsevich and Zagier. This special set of numbers contains the set of multiple zeta values and certain values of polylogarithms, which in turn are known to appear in results for Laurent coefficients of certain dimensionally regularized Feynman integrals. By use of the extended sector decomposition algorithm we prove a theorem implying that the Laurent coefficients of an arbitrary Feynman integral are periods if the masses and kinematical invariants take values in the Euclidean momentum region. The statement is formulated for an even more general class of integrals, allowing for an arbitrary number of polynomials in the integrand.
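The matrix-tree construction mentioned in the abstract can be made concrete. The sketch below is my own illustration, not code from the thesis: it evaluates the first Symanzik polynomial U of a graph at given Feynman parameters by taking the determinant of the reduced Laplacian with edge weights 1/x_e and multiplying by the product of all parameters, which by the matrix-tree theorem equals the sum over spanning trees T of the product of parameters of edges not in T.

```python
from math import prod

def det(M):
    """Determinant by Gaussian elimination with partial pivoting."""
    M = [row[:] for row in M]
    n, d = len(M), 1.0
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        if p != c:
            M[c], M[p] = M[p], M[c]
            d = -d
        d *= M[c][c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n):
                M[r][k] -= f * M[c][k]
    return d

def first_symanzik(n, edges, x):
    """First Symanzik polynomial U evaluated at Feynman parameters x,
    via the matrix-tree theorem: U = (prod_e x_e) * det(L'), where L'
    is the Laplacian with edge weights 1/x_e, row/column 0 deleted."""
    m = n - 1
    L = [[0.0] * m for _ in range(m)]
    for (i, j), xe in zip(edges, x):
        w = 1.0 / xe
        for a, sa in ((i, 1), (j, -1)):
            for b, sb in ((i, 1), (j, -1)):
                if a > 0 and b > 0:          # delete vertex 0
                    L[a - 1][b - 1] += sa * sb * w
    return prod(x) * det(L)

# one-loop bubble: two vertices joined by two propagators;
# the spanning trees are the single edges, so U = x0 + x1
print(first_symanzik(2, [(0, 1), (0, 1)], [2.0, 3.0]))  # 5.0
```

For the one-loop triangle (three vertices, three propagators) the same call returns x0 + x1 + x2, matching the three two-edge spanning trees.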

Relevance:

10.00%

Publisher:

Abstract:

This is a Research B project for the University of Bologna, within the civil engineering Laurea Magistrale course at UNIBO. The main purpose of this research is to promote another way of explaining, analysing and presenting certain civil engineering topics to students worldwide, through theory, modelling and photographs. The basic idea is divided into three steps. The first is to present and analyse the theoretical parts: a detailed treatment of the theory, combined with theorems, explanations, examples and exercises, covers this step. In the second, a physical model clarifies the concepts discussed in the theory by showing how structures work or fail; the modelling can reproduce, at scale, the behaviour of many elements used in real structures. After these two steps, an exhibition of photographs from the real world, with comments, gives engineers the chance to observe all this theoretical and laboratory material in many different cases. For example, many civil engineers know about wind action on structures, but many of them have never seen the extraordinary behaviour of the Tacoma bridge 'dancing with the wind'. What I have produced is not a book, but an investigation of how this three-step presentation of some mechanical concepts could be helpful. This approach is different and new and, in my opinion, important, because it helps students to go deeper into the science and also offers new ideas and inspiration. This way of teaching can be used in all lessons, especially technical ones; I hope that one day textbooks will adopt this kind of presentation.

Relevance:

10.00%

Publisher:

Abstract:

The semileptonic decay K^± → π^0 μ^± ν is a suitable channel for determining the CKM matrix element |V_us|. The hadronic matrix element of this decay is described by two dimensionless form factors f_±(t), which depend on the momentum transfer t = (p_K − p_π)^2 to the lepton pair. For the determination of |V_us|, the form factors serve as important parameters in the computation of the phase-space integral of this decay. A precise measurement of the form factors is additionally motivated by the fact that the NA48 result deviates from the other measurements by the KLOE, KTeV and ISTRA+ experiments. Data from a 2004 running period of the NA48/2 experiment with an open trigger were analysed. From these I selected 1.8 million K^±_μ3 decay candidates with a background fraction below 0.1%. The form factors were determined from the two-dimensional Dalitz distribution of the data, after correcting it for detector acceptance and radiative effects. The parameter-dependent theoretical function was fitted to this distribution with a chi-square method. For the quadratic, pole and dispersive parametrizations the resulting form factors are:

λ_0 = (14.82 ± 1.67_stat ± 0.62_sys) × 10^−3
λ'_+ = (25.53 ± 3.51_stat ± 1.90_sys) × 10^−3
λ''_+ = (1.40 ± 1.30_stat ± 0.48_sys) × 10^−3
m_S = 1204.8 ± 32.0_stat ± 11.4_sys MeV/c^2
m_V = 877.4 ± 11.1_stat ± 11.2_sys MeV/c^2
ln C = 0.1871 ± 0.0088_stat ± 0.0031_sys ± 0.0056_ext
Λ_+ = (25.42 ± 0.73_stat ± 0.73_sys ± 1.52_ext) × 10^−3

The results agree well with the measurements of the KLOE, KTeV and ISTRA+ experiments and allow an improvement of the global fit of the form factors. With the dispersive parametrization of the form factors, and using the Callan-Treiman theorem, it is possible to determine a value for f_+(0). The result is: f_+(0) = 0.987 ± 0.011_(NA48/2) ± 0.008_ext. The value obtained for f_+(0) agrees well within errors with the previous measurements by KTeV, KLOE and ISTRA+, but deviates by almost two standard deviations from the theoretical prediction.
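For readers unfamiliar with the parametrizations being fitted, here is a minimal numerical sketch (my own illustration, not the analysis code): the standard pole form f(t) = f(0)·m^2/(m^2 − t) is assumed, evaluated with the pole masses quoted above and f(0) normalized to 1.

```python
def pole_form_factor(t, m, f0=1.0):
    """Pole parametrization f(t) = f0 * m^2 / (m^2 - t); t in MeV^2, m in MeV."""
    return f0 * m**2 / (m**2 - t)

m_V = 877.4   # vector pole mass from the fit, MeV/c^2
m_S = 1204.8  # scalar pole mass from the fit, MeV/c^2

t = 100.0**2  # momentum transfer t = (100 MeV)^2
print(pole_form_factor(t, m_V), pole_form_factor(t, m_S))
```

Below the pole (t < m^2) the form factor rises monotonically from its value f(0) at zero momentum transfer.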

Relevance:

10.00%

Publisher:

Abstract:

In this thesis we consider a class of second-order partial differential operators with non-negative characteristic form and smooth coefficients. The main assumptions on the relevant operators are hypoellipticity and the existence of a well-behaved global fundamental solution. We first make a deep analysis of the L-Green function for arbitrary open sets and of its applications to Riesz-type representation theorems for L-subharmonic and L-superharmonic functions. Then we prove an inverse mean value theorem characterizing the superlevel sets of the fundamental solution by means of L-harmonic functions. Furthermore, we establish a Lebesgue-type result showing the role of the mean-integral operator in solving the homogeneous Dirichlet problem related to L in the Perron-Wiener sense. Finally, we compare Perron-Wiener and weak variational solutions of the homogeneous Dirichlet problem, under specific hypotheses on the boundary datum.

Relevance:

10.00%

Publisher:

Abstract:

The purpose of this study is to analyse the regularity of a differential operator, the Kohn Laplacian, in two settings: the Heisenberg group and strongly pseudoconvex CR manifolds. The Heisenberg group is defined as a space of dimension 2n+1 equipped with a group product. It can be seen in two different ways: as a Lie group and as the boundary of the Siegel upper half space. On the Heisenberg group there exists the tangential CR complex; from it we define its adjoint and the Kohn Laplacian. We then obtain estimates for the Kohn Laplacian and establish its solvability and hypoellipticity. To state L^p and Hölder estimates, we discuss homogeneous distributions. In the second part we work with a manifold M of real dimension 2n+1. We say that M is a CR manifold if certain properties are satisfied; moreover, a CR manifold M is strongly pseudoconvex if the Levi form defined on M is positive definite. Since we show that the Heisenberg group is a model for strongly pseudoconvex CR manifolds, we look for an osculating Heisenberg structure in a neighborhood of a point of M, and we want this structure to vary smoothly from point to point. To that end, we define normal coordinates and study their properties. We also examine different normal coordinates in the case of a real hypersurface with an induced CR structure. Finally, we again define the CR complex, its adjoint and the Laplacian operator on M. We study these new operators, proving subelliptic estimates; for this we do not need M to be strongly pseudoconvex, but only to satisfy the weaker Z(q) and Y(q) conditions. This yields local regularity theorems for the Laplacian and shows its hypoellipticity on M.

Relevance:

10.00%

Publisher:

Abstract:

The physical content of General Relativity is expressed by the Equivalence Principle, which establishes the equivalence of geometry and gravitation. The theory predicts the existence of black holes, the simplest macroscopic objects existing in nature: they are described by just a few parameters, whose variations obey laws analogous to those of thermodynamics. Black-hole thermodynamics is put on solid ground by quantum mechanics, through the phenomenon known as Hawking radiation. These results shed light on a possible quantum theory of gravitation, but to this day such a theory remains far off. In this thesis we set out to study black holes in both their classical and quantum aspects. The first two chapters are devoted to the main theoretical results achieved in the field: in particular we dwell on the singularity theorems, the laws of black-hole mechanics, and Hawking radiation. The third chapter, which extends the discussion of singularities, presents the theory of non-singular black holes, conceived as an effective model for the removal of singularities. Finally, the fourth chapter explores the further consequences of quantum mechanics for black-hole dynamics, through the notion of entanglement entropy.

Relevance:

10.00%

Publisher:

Abstract:

Justification logics are refinements of modal logics in which modalities are replaced by justification terms. They are connected to modal logics via so-called realization theorems. We present a syntactic proof of a single realization theorem that uniformly connects all the normal modal logics formed from the axioms d, t, b, 4, and 5 with their justification counterparts. The proof employs cut-free nested sequent systems together with Fitting's realization merging technique. We further strengthen the realization theorem for KB5 and S5 by showing that the positive introspection operator is superfluous.

Relevance:

10.00%

Publisher:

Abstract:

Chapter 1 introduces the basic tools and machinery used within this thesis, and touches on some historical uses and background; the majority of the definitions are contained in this chapter as well. In Chapter 2 we consider the question of whether one can decompose λ copies of the monochromatic K_v into copies of K_k such that each copy of K_k contains at most one edge from each K_v; this is called a proper edge coloring (Hurd, Sarvate [29]). The majority of this chapter is a wide variety of examples explaining the constructions used in Chapters 3 and 4. In Chapters 3 and 4 we investigate how to properly color BIBD(v, k, λ) for k = 4 and 5. Besides direct constructions of relatively small BIBDs, we also prove some generalized constructions used within. In Chapter 5 we discuss an alternative solution to Chapters 3 and 4: a purely graph-theoretical approach using matchings, augmenting paths, and theorems about the edge-chromatic number is used to develop a theorem that then covers all possible cases. We also discuss how this method performs compared to the methods of Chapters 3 and 4. In Chapter 6 we switch topics to Latin rectangles that have the same number of symbols as, and a matrix of equivalent size to, Latin squares. Suppose ab = n^2. We define an equitable Latin rectangle as an a × b matrix on a set of n symbols where each symbol appears either ⌊b/n⌋ or ⌈b/n⌉ times in each row of the matrix and either ⌊a/n⌋ or ⌈a/n⌉ times in each column. Two equitable Latin rectangles are orthogonal in the usual way. Denote a set of k mutually orthogonal a × b equitable Latin rectangles as a k-MOELR(a, b; n). We show that there exists a k-MOELR(a, b; n) for all a, b, n where k is at least 3, with some exceptions.
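The equitable condition defined above is easy to verify mechanically. The sketch below is my own illustration (the example rectangle is not taken from the thesis): it checks that every symbol occurs ⌊b/n⌋ or ⌈b/n⌉ times in each row and ⌊a/n⌋ or ⌈a/n⌉ times in each column of an a × b rectangle on n symbols with ab = n^2.

```python
from collections import Counter

def is_equitable(R, n):
    """Check the equitable condition for an a x b rectangle on n symbols."""
    a, b = len(R), len(R[0])
    def balanced(line, m):
        counts = Counter(line)
        lo, hi = m // n, -(-m // n)   # floor(m/n), ceil(m/n)
        return all(counts.get(s, 0) in (lo, hi) for s in range(n))
    cols = [[R[i][j] for i in range(a)] for j in range(b)]
    return all(balanced(r, b) for r in R) and all(balanced(c, a) for c in cols)

# a = 2, b = 8, n = 4 (so ab = n^2); every symbol appears twice per row,
# and each column holds two distinct symbols (0 or 1 occurrences each)
R = [[0, 1, 2, 3, 0, 1, 2, 3],
     [1, 2, 3, 0, 2, 3, 0, 1]]
print(is_equitable(R, 4))  # True
```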

Relevance:

10.00%

Publisher:

Abstract:

We prove analogs of classical almost sure dimension theorems for Euclidean projection mappings in the first Heisenberg group, equipped with a sub-Riemannian metric.

Relevance:

10.00%

Publisher:

Abstract:

We generalize uniqueness theorems for non-extremal black holes with three mutually independent Killing vector fields in five-dimensional minimal supergravity in order to account for the existence of non-trivial two-cycles in the domain of outer communication. The black hole space-times we consider may contain multiple disconnected horizons and be asymptotically flat or asymptotically Kaluza–Klein. We show that in order to uniquely specify the black hole space-time, besides providing its domain structure and a set of asymptotic and local charges, it is necessary to measure the magnetic fluxes that support the two-cycles as well as fluxes in the two semi-infinite rotation planes of the domain diagram.

Relevance:

10.00%

Publisher:

Abstract:

We study representations of MV-algebras -- equivalently, unital lattice-ordered abelian groups -- through the lens of Stone-Priestley duality, using canonical extensions as an essential tool. Specifically, the theory of canonical extensions implies that the (Stone-Priestley) dual spaces of MV-algebras carry the structure of topological partial commutative ordered semigroups. We use this structure to obtain two different decompositions of such spaces, one indexed over the prime MV-spectrum, the other over the maximal MV-spectrum. These decompositions yield sheaf representations of MV-algebras, using a new and purely duality-theoretic result that relates certain sheaf representations of distributive lattices to decompositions of their dual spaces. Importantly, the proofs of the MV-algebraic representation theorems that we obtain in this way are distinguished from the existing work on this topic by the following features: (1) we use only basic algebraic facts about MV-algebras; (2) we show that the two aforementioned sheaf representations are special cases of a common result, with potential for generalizations; and (3) we show that these results are strongly related to the structure of the Stone-Priestley duals of MV-algebras. In addition, using our analysis of these decompositions, we prove that MV-algebras with isomorphic underlying lattices have homeomorphic maximal MV-spectra. This result is an MV-algebraic generalization of a classical theorem by Kaplansky stating that two compact Hausdorff spaces are homeomorphic if, and only if, the lattices of continuous [0, 1]-valued functions on the spaces are isomorphic.

Relevance:

10.00%

Publisher:

Abstract:

The usual Skolemization procedure, which removes strong quantifiers by introducing new function symbols, is in general unsound for first-order substructural logics defined based on classes of complete residuated lattices. However, it is shown here (following similar ideas of Baaz and Iemhoff for first-order intermediate logics in [1]) that first-order substructural logics with a semantics satisfying certain witnessing conditions admit a “parallel” Skolemization procedure where a strong quantifier is removed by introducing a finite disjunction or conjunction (as appropriate) of formulas with multiple new function symbols. These logics typically lack equivalent prenex forms. Also, semantic consequence does not in general reduce to satisfiability. The Skolemization theorems presented here therefore take various forms, applying to the left or right of the consequence relation, and to all formulas or only prenex formulas.
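To fix ideas, the contrast with ordinary Skolemization can be written schematically (my own notation, not taken from the paper, using the familiar example of removing an existential under a universal): ordinary Skolemization introduces one fresh function symbol, while the parallel variant described in the abstract introduces several fresh function symbols joined by a finite disjunction (dually, a conjunction).

```latex
% ordinary Skolemization: one fresh function symbol f
\forall x\,\exists y\,\varphi(x,y)
  \;\rightsquigarrow\;
\forall x\,\varphi\bigl(x, f(x)\bigr)

% parallel Skolemization: fresh symbols f_1,\dots,f_m, finite disjunction
\forall x\,\exists y\,\varphi(x,y)
  \;\rightsquigarrow\;
\forall x\,\bigvee_{i=1}^{m} \varphi\bigl(x, f_i(x)\bigr)
```

In the substructural setting the admissible direction (left or right of the consequence relation, disjunction or conjunction) depends on the witnessing conditions satisfied by the semantics, as the abstract notes.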

Relevance:

10.00%

Publisher:

Abstract:

The purpose of this research was to determine whether principles from organizational theory could be used as a framework to compare and contrast safety interventions developed by for-profit industry for the period 1986–1996. A literature search of electronic databases and a manual search of journals and local university libraries' book stacks were conducted for safety interventions developed by for-profit businesses. To maintain a constant regulatory environment, the business sectors of nuclear power, aviation and non-profits were excluded. Safety intervention evaluations were screened for scientific merit. Leavitt's model from organization theory was updated to include safety climate and renamed the Updated Leavitt's Model. In all, 8000 safety citations were retrieved; 525 met the inclusion criteria, 255 met the organizational safety intervention criteria, and 50 met the scientific merit criteria. Most came from non-public-health journals. These 50 were categorized by the Updated Leavitt's Model according to where within the organizational structure the intervention took place. Evidence tables were constructed for descriptive comparison. The interventions clustered in the areas of social structure, safety climate, the interaction between social structure and participants, and the interaction between technology and participants. No interventions were found in the interactions between social structure and technology, goals and technology, or participants and goals. Despite the scientific merit criteria, many studies still had significant design weaknesses. Five interventions tested for statistical significance, but none commented on the power of their study. Empirical studies based on safety climate theorems had the most rigorous designs; these studies attempted to randomize subjects to avoid bias.

This work highlights the utility of the Updated Leavitt's Model, a model from organizational theory, as a framework for comparing safety interventions. It also highlights the need for better study design in future trials of safety interventions.

Relevance:

10.00%

Publisher:

Abstract:

The history of the logistic function since its introduction in 1838 is reviewed, and the logistic model for a polychotomous response variable is presented with a discussion of the assumptions involved in its derivation and use. Following this, the maximum likelihood estimators for the model parameters are derived, along with a Newton-Raphson iterative procedure for their evaluation. A rigorous mathematical derivation of the limiting distribution of the maximum likelihood estimators is then presented using a characteristic-function approach. An appendix with theorems (and proofs) on the asymptotic normality of sample sums when the observations are not identically distributed supports the presentation of the asymptotic properties of the maximum likelihood estimators. Finally, two applications of the model are presented using data from the Hypertension Detection and Follow-up Program, a prospective, population-based, randomized trial of treatment for hypertension. The first application compares the risk of five-year mortality from cardiovascular causes with that from noncardiovascular causes; the second compares risk factors for fatal or nonfatal coronary heart disease with those for fatal or nonfatal stroke.
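The Newton-Raphson procedure mentioned in the abstract can be sketched for the binary special case of the logistic model (the polychotomous version iterates the same score/information update over the remaining logit equations). This is my own illustrative implementation, not the dissertation's; the toy data are invented.

```python
import numpy as np

def logistic_mle(X, y, iters=25):
    """Newton-Raphson for binary logistic regression:
    beta <- beta + (X' W X)^{-1} X'(y - p), with W = diag(p(1 - p))."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        score = X.T @ (y - p)                       # gradient of log-likelihood
        info = X.T @ (X * (p * (1 - p))[:, None])   # observed information
        beta += np.linalg.solve(info, score)
    return beta

# small non-separable example: intercept + one covariate
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.0, 0.0, 1.0, 0.0, 1.0, 1.0])
X = np.column_stack([np.ones_like(x), x])
print(logistic_mle(X, y))
```

At convergence the score vector vanishes, which is the usual check that a maximum likelihood estimate has been reached.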