956 results for Theorem of calculus


Relevance:

30.00%

Publisher:

Abstract:

Despite the fact that the integral form of the equations of classical electrodynamics is well known, the same is not true for non-Abelian gauge theories. The aim of the present paper is threefold. First, we present the integral form of the classical Yang-Mills equations in the presence of sources. Second, we use it to solve the long-standing problem of constructing conserved charges, for any field configuration, which are invariant under general gauge transformations and not only under transformations that go to a constant at spatial infinity. The construction is based on concepts in loop spaces and on a generalization of the non-Abelian Stokes theorem for two-form connections. The third goal of the paper is to present the integral form of the self-dual Yang-Mills equations and calculate the conserved charges associated with them. The charges are explicitly evaluated for the cases of monopoles, dyons, instantons and merons, and we show that in many cases those charges must be quantized. Our results are important for the understanding of global properties of non-Abelian gauge theories.
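For orientation (this reminder is not from the paper itself), the Abelian benchmark being generalized is Gauss's law, whose integral form already yields a gauge-invariant conserved charge,

\[
Q \;=\; \oint_{\partial V} \vec{E}\cdot d\vec{S} \;=\; \int_{V} \rho \, d^{3}x ,
\]

whereas in a non-Abelian theory the field strength transforms covariantly rather than invariantly under gauge transformations, so a naive surface integral of $F_{\mu\nu}$ is not gauge invariant; this is the difficulty that the loop-space construction and the generalized non-Abelian Stokes theorem mentioned above are meant to address.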

Relevance:

30.00%

Publisher:

Abstract:

Our objective here is to prove that the uniform convergence of a sequence of Kurzweil integrable functions implies the convergence of the sequence formed by their corresponding integrals.
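A standard way to state the result being proved (phrasing mine, consistent with the abstract): if each $f_n:[a,b]\to\mathbb{R}$ is Kurzweil (Henstock) integrable and $f_n \to f$ uniformly on $[a,b]$, then $f$ is Kurzweil integrable and

\[
\lim_{n\to\infty} \int_a^b f_n \;=\; \int_a^b f .
\]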

Relevance:

30.00%

Publisher:

Abstract:

This paper analyzes concepts of independence and assumptions of convexity in the theory of sets of probability distributions. The starting point is Kyburg and Pittarelli's discussion of "convex Bayesianism" (in particular their proposals concerning E-admissibility, independence, and convexity). The paper offers an organized review of the literature on independence for sets of probability distributions; new results on graphoid properties and on the justification of "strong independence" (using exchangeability) are presented. Finally, the connection between Kyburg and Pittarelli's results and recent developments on the axiomatization of non-binary preferences, and its impact on "complete" independence, are described.
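As a toy illustration of one of the notions under discussion (the acts, utilities, and credal set below are hypothetical, not taken from the paper): an act is E-admissible when it maximizes expected utility under at least one distribution in the set.

```python
import numpy as np

# Toy credal set: three probability distributions over two states of nature.
credal_set = np.array([
    [0.2, 0.8],
    [0.5, 0.5],
    [0.7, 0.3],
])

# Utility of each act (rows) in each state of nature (columns).
utilities = np.array([
    [10.0, 0.0],  # risky act
    [4.0, 4.0],   # safe act
    [3.0, 2.0],   # dominated act
])

# An act is E-admissible if it maximizes expected utility under at least
# one distribution in the credal set.
expected = utilities @ credal_set.T  # shape: (acts, distributions)
e_admissible = {
    int(act)
    for dist in range(credal_set.shape[0])
    for act in np.flatnonzero(np.isclose(expected[:, dist], expected[:, dist].max()))
}
print("E-admissible acts:", sorted(e_admissible))  # [0, 1]: the dominated act never wins
```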

Relevance:

30.00%

Publisher:

Abstract:

A reaction-diffusion equation with variable diffusivity and non-linear flux boundary condition is considered. The goal is to give sufficient conditions on the diffusivity function for the nonexistence, and also for the existence, of nonconstant stable stationary solutions. Applications of the main nonexistence result are given.
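The abstract does not write the problem out; a prototypical problem of the kind described (the particular $a$, $f$, $g$ and domain below are assumptions made only for illustration) is

\[
u_t = \operatorname{div}\!\big(a(x)\nabla u\big) + f(u) \ \text{in } \Omega, \qquad a(x)\,\frac{\partial u}{\partial n} = g(u) \ \text{on } \partial\Omega ,
\]

with stationary solutions solving the corresponding elliptic problem and stability understood with respect to the parabolic flow; the paper's sufficient conditions are stated in terms of the diffusivity $a$.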

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To estimate the pretest probability of Cushing's syndrome (CS) diagnosis by a Bayesian approach using intuitive clinical judgment. MATERIALS AND METHODS: Physicians were asked, at seven endocrinology meetings, to answer three questions: "Based on your personal expertise, after obtaining clinical history and physical examination, without using laboratory tests, what is your probability of diagnosing Cushing's Syndrome?"; "For how long have you been practicing Endocrinology?"; and "Where do you work?". A Bayesian beta regression, implemented in the WinBUGS software, was employed. RESULTS: We obtained 294 questionnaires. The mean pretest probability of CS diagnosis was 51.6% (95% CI: 48.7-54.3). The probability was directly related to experience in endocrinology, but not to the place of work. CONCLUSION: The pretest probability of CS diagnosis was estimated using a Bayesian methodology. Although pretest likelihood can be context-dependent, experience based on years of practice may help the practitioner to diagnose CS. Arq Bras Endocrinol Metab. 2012;56(9):633-7
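A minimal sketch of how such a pretest probability is typically used downstream (the 0.93 sensitivity and 0.95 specificity below are illustrative assumptions, not figures from the paper): Bayes' theorem in odds-likelihood form converts the clinicians' mean pretest probability into a post-test probability once a test result is available.

```python
def posttest_probability(pretest_prob, sensitivity, specificity, positive=True):
    """Update a pretest probability with a test result via Bayes' theorem (odds form)."""
    lr = sensitivity / (1 - specificity) if positive else (1 - sensitivity) / specificity
    pretest_odds = pretest_prob / (1 - pretest_prob)
    posttest_odds = pretest_odds * lr
    return posttest_odds / (1 + posttest_odds)

# Mean pretest probability elicited in the paper (51.6%), combined with purely
# illustrative test characteristics (sensitivity 0.93, specificity 0.95).
print(round(posttest_probability(0.516, 0.93, 0.95, positive=True), 3))  # ~0.95
```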

Relevance:

30.00%

Publisher:

Abstract:

A systematic approach to model nonlinear systems using norm-bounded linear differential inclusions (NLDIs) is proposed in this paper. The resulting NLDI model is suitable for the application of linear control design techniques and, therefore, it is possible to fulfill certain specifications for the underlying nonlinear system, within an operating region of interest in the state-space, using a linear controller designed for this NLDI model. Hence, a procedure to design a dynamic output feedback controller for the NLDI model is also proposed in this paper. One of the main contributions of the proposed modeling and control approach is the use of the mean-value theorem to represent the nonlinear system by a linear parameter-varying model, which is then mapped into a polytopic linear differential inclusion (PLDI) within the region of interest. To avoid the combinatorial problem that is inherent to polytopic models for medium- and large-sized systems, the PLDI is transformed into an NLDI, and the whole process is carried out ensuring that all trajectories of the underlying nonlinear system are also trajectories of the resulting NLDI within the operating region of interest. Furthermore, it is also possible to choose a particular structure for the NLDI parameters to reduce the conservatism in the representation of the nonlinear system by the NLDI model, and this feature is another important contribution of this paper. Once the NLDI representation of the nonlinear system is obtained, the paper proposes the application of a linear control design method to this representation. The design is based on quadratic Lyapunov functions and formulated as a search problem over a set of bilinear matrix inequalities (BMIs), which is solved using a two-step separation procedure that maps the BMIs into a set of corresponding linear matrix inequalities. Two numerical examples are given to demonstrate the effectiveness of the proposed approach.
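In broad strokes (notation mine, following the usual LDI literature rather than the paper's): for $\dot{x} = f(x)$ with $f(0)=0$, the mean-value theorem applied row by row gives $f_i(x) = \nabla f_i(\xi_i(x))^{\top} x$ for points $\xi_i$ on the segment joining $0$ and $x$, i.e. a linear parameter-varying model $\dot{x} = A(\delta(t))\,x$; enclosing the admissible matrices $A(\delta)$ in a polytope yields the PLDI, and bounding the variation instead by a norm-bounded uncertainty yields the NLDI

\[
\dot{x} \in \big\{ \big(A + B\,\Delta(t)\,C\big)\,x \;:\; \|\Delta(t)\| \le 1 \big\}.
\]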

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this paper is to present some fixed point theorems for Meir-Keeler contractions in a complete metric space endowed with a partial order. MSC: 47H10.
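For reference (standard definition, not restated in the abstract): a self-map $T$ of a metric space $(X,d)$ is a Meir-Keeler contraction if

\[
\forall \varepsilon > 0 \ \exists \delta > 0: \quad \varepsilon \le d(x,y) < \varepsilon + \delta \ \Longrightarrow \ d(Tx,Ty) < \varepsilon ,
\]

which in particular forces $d(Tx,Ty) < d(x,y)$ whenever $x \neq y$; the setting of the paper additionally equips $X$ with a partial order.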

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this paper is to investigate the existence and uniqueness of positive solutions for the following fractional boundary value problem: $D_{0^+}^{\alpha} u(t) + f(t, u(t)) = 0$, $0 < t < 1$, $u(0) = u(1) = u'(0) = 0$, where $2 < \alpha \le 3$ and $D_{0^+}^{\alpha}$ is the Riemann-Liouville fractional derivative. Our analysis relies on a fixed-point theorem in partially ordered metric spaces. The autonomous case of this problem was studied in the paper [Zhao et al., Abs. Appl. Anal., to appear], but in Zhao et al. (to appear), the question of uniqueness of the solution is not treated. We also present some examples where we compare our results with the ones obtained in Zhao et al. (to appear). 2010 Mathematics Subject Classification: 34B15
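For the reader's convenience (standard definition, not part of the abstract): with $n-1 < \alpha \le n$, here $n = 3$, the Riemann-Liouville fractional derivative is

\[
D_{0^+}^{\alpha} u(t) \;=\; \frac{1}{\Gamma(n-\alpha)}\,\frac{d^n}{dt^n}\int_0^t (t-s)^{\,n-\alpha-1}\,u(s)\,ds .
\]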

Relevance:

30.00%

Publisher:

Abstract:

We establish the existence and uniqueness of a positive and nondecreasing solution to a singular boundary value problem for a class of nonlinear fractional differential equations. Our analysis relies on a fixed point theorem in partially ordered sets.
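A representative fixed-point theorem of the kind invoked, in the spirit of Ran and Reurings (2004) and Nieto and Rodríguez-López (2005) (this is not necessarily the exact statement used in the paper): let $(X,\le)$ be a partially ordered set equipped with a complete metric $d$, and let $T:X\to X$ be nondecreasing with

\[
d(Tx,Ty) \;\le\; k\, d(x,y) \quad \text{for all comparable } x, y \ \text{and some } k \in [0,1).
\]

If there exists $x_0$ with $x_0 \le Tx_0$, and $T$ is continuous (or the space satisfies a suitable order-convergence condition), then $T$ has a fixed point; additional comparability hypotheses yield uniqueness.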

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this paper is to provide sufficient conditions for the existence of a unique best proximity point for Geraghty contractions. Our paper provides an extension of a result due to Geraghty (Proc. Am. Math. Soc. 40:604-608, 1973).
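Two standard ingredients, recalled for context (not restated in the abstract): a map $T$ is a Geraghty contraction if $d(Tx,Ty) \le \beta(d(x,y))\,d(x,y)$ for some $\beta:[0,\infty)\to[0,1)$ satisfying $\beta(t_n)\to 1 \Rightarrow t_n \to 0$; and for $T:A\to B$ between nonempty subsets of a metric space, a best proximity point is a point $x^* \in A$ with

\[
d(x^*, Tx^*) \;=\; d(A,B) \;:=\; \inf\{\, d(a,b) : a \in A,\ b \in B \,\}.
\]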

Relevance:

30.00%

Publisher:

Abstract:

The main aim of this Ph.D. dissertation is the study of clustering dependent data by means of copula functions, with particular emphasis on microarray data. Copula functions are a popular multivariate modeling tool in every field where multivariate dependence is of great interest, and their use in clustering has not yet been investigated. The first part of this work contains a review of the literature on clustering methods, copula functions and microarray experiments. The attention focuses on the K-means (Hartigan, 1975; Hartigan and Wong, 1979), the hierarchical (Everitt, 1974) and the model-based (Fraley and Raftery, 1998, 1999, 2000, 2007) clustering techniques, because their performance is compared. Then, the probabilistic interpretation of Sklar's theorem (Sklar, 1959), estimation methods for copulas like the Inference for Margins (Joe and Xu, 1996), and the Archimedean and Elliptical copula families are presented. In the end, applications of clustering methods and copulas to genetic and microarray experiments are highlighted. The second part contains the original contribution proposed. A simulation study is performed in order to evaluate the performance of the K-means and the hierarchical bottom-up clustering methods in identifying clusters according to the dependence structure of the data generating process. Different simulations are performed by varying different conditions (e.g., the kind of margins (distinct, overlapping and nested) and the value of the dependence parameter), and the results are evaluated by means of different measures of performance. In light of the simulation results and of the limits of the two investigated clustering methods, a new clustering algorithm based on copula functions (‘CoClust’ in brief) is proposed. The basic idea, the iterative procedure of the CoClust and the description of the written R functions with their output are given. The CoClust algorithm is tested on simulated data (by varying the number of clusters, the copula models, the dependence parameter value and the degree of overlap of margins) and is compared with the performance of model-based clustering by using different measures of performance, like the percentage of well-identified numbers of clusters and the percentage of non-rejection of H0 on the dependence parameter. It is shown that the CoClust algorithm makes it possible to overcome all the observed limits of the other investigated clustering techniques and is able to identify clusters according to the dependence structure of the data, independently of the degree of overlap of margins and the strength of the dependence. The CoClust uses a criterion based on the maximized log-likelihood function of the copula and can virtually account for any possible dependence relationship between observations. Many peculiar characteristics are shown for the CoClust, e.g. its capability of identifying the true number of clusters and the fact that it does not require a starting classification. Finally, the CoClust algorithm is applied to the real microarray data of Hedenfalk et al. (2001), both to the gene expressions observed in three different cancer samples and to the columns (tumor samples) of the whole data matrix.
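The criterion named in the abstract, choosing groupings by the maximized copula log-likelihood, can be sketched on a toy bivariate case; the Gaussian copula, the rank-based pseudo-observations and all numbers below are illustrative assumptions, not the CoClust algorithm itself, which the thesis builds on top of a criterion of this kind.

```python
import numpy as np
from scipy import stats, optimize

def gaussian_copula_loglik(rho, u, v):
    """Log-likelihood of pseudo-observations (u, v) under a Gaussian copula."""
    x, y = stats.norm.ppf(u), stats.norm.ppf(v)
    r2 = rho ** 2
    return np.sum(
        -0.5 * np.log(1 - r2)
        + (2 * rho * x * y - r2 * (x ** 2 + y ** 2)) / (2 * (1 - r2))
    )

def fit_rho(u, v):
    """Maximize the copula log-likelihood over the dependence parameter rho."""
    res = optimize.minimize_scalar(
        lambda r: -gaussian_copula_loglik(r, u, v), bounds=(-0.99, 0.99), method="bounded"
    )
    return res.x, -res.fun

# Toy data: two positively dependent margins, ranked into pseudo-observations.
rng = np.random.default_rng(0)
z = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=200)
u = stats.rankdata(z[:, 0]) / (len(z) + 1)
v = stats.rankdata(z[:, 1]) / (len(z) + 1)
rho_hat, loglik = fit_rho(u, v)
print(f"fitted rho = {rho_hat:.2f}, maximized copula log-likelihood = {loglik:.1f}")
```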

Relevance:

30.00%

Publisher:

Abstract:

My aim is to develop a theory of cooperation within the organization and to test it empirically. Drawing upon social exchange theory, social identity theory, the idea of collective intentions, and social constructivism, the main assumption of my work is that both cooperation and the organization itself are continually shaped and restructured by the actions, judgments, and symbolic interpretations of the parties involved. Therefore, I propose that the decision to cooperate, expressed, say, as an intention to cooperate, reflects and depends on a three-step social process shaped by the interpretations of the actors involved. The first step entails an instrumental evaluation of cooperation in terms of social exchange. In the second step, this “social calculus” is translated into cognitive, emotional and evaluative reactions directed toward the organization. Finally, once the identification process is completed and membership awareness is established, I propose that individuals will start to think largely in terms of “We” instead of “I”. Self-goals are redefined at the collective level, and the outcomes for self, others, and the organization become practically interchangeable. I decided to apply my theory to an important cooperative problem in management research: knowledge exchange within organizations. Hence, I conducted a quantitative survey among the members of the virtual community “www.borse.it” (n=108). Within this community, members freely decide to exchange their knowledge about the stock market among themselves. Because of the confirmatory requirements and the structural complexity of the theory proposed (i.e., the proposal that instrumental evaluations will induce social identity and this in turn will cause collective intentions), I use Structural Equation Modeling to test all hypotheses in this dissertation. The empirical survey-based study found support for the theory of cooperation proposed in this dissertation. The findings suggest that an appropriate conceptualization of the decision to exchange knowledge is one where collective intentions depend proximally on social identity (i.e., cognitive identification, affective commitment, and evaluative engagement) with the organization, and this identity depends on instrumental evaluations of cooperators (i.e., perceived value of the knowledge received, assessment of past reciprocity, expected reciprocity, and expected social outcomes of the exchange). Furthermore, I find that social identity fully mediates the effects of instrumental motives on collective intentions.
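Schematically (variable names and path labels are mine, not the dissertation's), the full-mediation claim tested with structural equation modeling amounts to

\[
\text{Identity} = a\,\text{Instrumental} + e_1, \qquad \text{Intentions} = b\,\text{Identity} + c'\,\text{Instrumental} + e_2,
\]

with the direct path $c'$ statistically indistinguishable from zero, so that instrumental evaluations affect collective intentions only through social identity.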

Relevance:

30.00%

Publisher:

Abstract:

The aim of this thesis is to go through different approaches for proving expressiveness properties of several concurrent languages. We analyse four different calculi, exploiting a different technique for each one. We begin with the analysis of a synchronous language: we explore the expressiveness of a fragment of CCS! (a variant of Milner's CCS where replication is considered instead of recursion) w.r.t. the existence of faithful encodings (i.e. encodings that respect the behaviour of the encoded model without introducing unnecessary computations) of models of computability strictly less expressive than Turing machines, namely grammars of types 1, 2 and 3 in the Chomsky hierarchy. We then move to asynchronous languages and study full abstraction for two Linda-like languages. Linda can be considered as the asynchronous version of CCS plus a shared memory (a multiset of elements) used for storing messages. After defining a denotational semantics based on traces, we obtain fully abstract semantics for both languages by using suitable abstractions that identify different traces which do not correspond to different behaviours. Since the ability of one of the two variants to recognise multiple occurrences of messages in the store (which accounts for an increase in expressiveness) is reflected in a less complex abstraction, we then study other languages where multiplicity plays a fundamental role. We consider the language CHR (Constraint Handling Rules), a language which uses multi-headed (guarded) rules, and we prove that multiple heads augment the expressive power of the language. Indeed, we show that restricting to rules whose head contains at most n atoms generates a hierarchy of languages with increasing expressiveness (i.e. the CHR language allowing at most n atoms in the heads is more expressive than the language allowing at most m atoms, with m < n). The last calculus we consider, the κ-calculus, is a formalism for modelling molecular biology where molecules are terms with internal state and sites, bonds are represented by shared names labelling sites, and reactions are represented by rewriting rules. Depending on the shape of the rewriting rules, several dialects of the calculus can be obtained. We analyse the expressive power of some of these dialects by focusing on decidability and undecidability of problems like reachability and coverability.
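To make "multi-headed rule" concrete, here is a hypothetical Python simulation of the classic two-headed CHR gcd rule operating on a multiset store; this is not code from the thesis, only an illustration of a rule that must match two constraints in the store at once.

```python
from collections import Counter

def gcd_step(store: Counter) -> bool:
    """Apply one two-headed step: pick constraints gcd(N), gcd(M) with 0 < N <= M
    and replace gcd(M) by gcd(M - N).  Returns True if a rule fired."""
    items = sorted(store.elements())
    for i, n in enumerate(items):
        for m in items[i + 1:]:
            if 0 < n <= m:
                store[m] -= 1
                store[m - n] += 1
                return True
    return False

# Toy store of gcd/1 constraints, in the spirit of the classic CHR gcd program.
store = Counter({12: 1, 8: 1})
while gcd_step(store):
    pass
# Drop the gcd(0) constraint to read off the answer.
print([v for v in store.elements() if v > 0])  # -> [4]
```

The hierarchy result stated above says, roughly, that allowing more atoms in the head genuinely adds expressive power, so simultaneous matching of this kind cannot in general be reduced to single-headed rules.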

Relevance:

30.00%

Publisher:

Abstract:

Human reasoning is a fascinating and complex cognitive process that can be applied in different research areas such as philosophy, psychology, law and finance. Unfortunately, developing supporting software (for those different areas) able to cope with such complex reasoning is difficult and requires a suitable abstract logical formalism. In this thesis we aim to develop a program whose job is to evaluate a theory (a set of rules) w.r.t. a goal and to provide results such as "the goal is derivable from the KB (of the theory)". In order to achieve this goal we need to analyse different logics and choose the one that best meets our needs. In logic, usually, we try to determine whether a given conclusion is logically implied by a set of assumptions T (theory). However, when we deal with logic programming we need an efficient algorithm in order to find such implications. In this work we use a logic rather similar to human logic. Indeed, human reasoning requires an extension of first-order logic able to reach a conclusion from premises that are not definitely true and belong to an incomplete set of knowledge. Thus, we implemented a defeasible logic framework able to manipulate defeasible rules. Defeasible logic is a non-monotonic logic designed by Nute for efficient defeasible reasoning (see Chapter 2). Such applications are useful in the legal domain, especially if they offer an implementation of an argumentation framework that provides a formal modelling of the game. Roughly speaking, let the theory be the set of laws and the key claim be the conclusion that one of the parties wants to prove (and the other wants to defeat); by adding dynamic assertion of rules, namely facts put forward by the parties, we can then play an argumentative challenge between two players and decide whether the conclusion is provable or not, depending on the strategies performed by the players. Implementing a game model requires one more meta-interpreter able to evaluate the defeasible logic framework; indeed, according to Gödel's theorem (see page 127), we cannot evaluate the meaning of a language using the tools provided by the language itself, but need a meta-language able to manipulate the object language. Thus, rather than a simple meta-interpreter, we propose a Meta-level containing different Meta-evaluators. The first has been explained above, the second is needed to perform the game model, and the last is used to change game execution and tree derivation strategies.
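As a rough, hypothetical sketch (not the meta-interpreters developed in the thesis) of what evaluating a theory with defeasible rules against a goal involves, here is a much-simplified defeasible-provability check on the classic penguin example; it assumes an acyclic theory and omits most of the real proof theory (e.g. team defeat and the full proof conditions).

```python
def neg(lit):
    """Return the negation of a literal (prefix ~)."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def definitely(lit, facts, strict):
    """+Delta: provable from the facts and strict rules alone."""
    return lit in facts or any(
        all(definitely(b, facts, strict) for b in body)
        for head, body in strict if head == lit
    )

def defeasibly(lit, facts, strict, defeasible, superior):
    """Very simplified defeasible provability (acyclic theories only)."""
    if definitely(lit, facts, strict):
        return True
    if definitely(neg(lit), facts, strict):
        return False
    supporters = [r for r in defeasible if r[1] == lit]
    attackers = [r for r in defeasible if r[1] == neg(lit)]

    def applicable(body):
        return all(defeasibly(b, facts, strict, defeasible, superior) for b in body)

    for _, _, s_body in supporters:
        if not applicable(s_body):
            continue
        # Every applicable attacker must be beaten by an applicable superior supporter.
        if all(
            not applicable(a_body)
            or any((w_name, a_name) in superior and applicable(w_body)
                   for w_name, _, w_body in supporters)
            for a_name, _, a_body in attackers
        ):
            return True
    return False

# Classic example: penguins are birds; birds normally fly; penguins normally do not.
facts = {"penguin"}
strict = [("bird", ["penguin"])]                   # penguin -> bird
defeasible = [("r_fly", "flies", ["bird"]),        # bird => flies
              ("r_nofly", "~flies", ["penguin"])]  # penguin => ~flies
superior = {("r_nofly", "r_fly")}                  # the penguin rule beats the bird rule

print(defeasibly("flies", facts, strict, defeasible, superior))   # False
print(defeasibly("~flies", facts, strict, defeasible, superior))  # True
```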

Relevance:

30.00%

Publisher:

Abstract:

Interactive theorem provers are tools designed for the certification of formal proofs developed by means of man-machine collaboration. Formal proofs obtained in this way cover a large variety of logical theories, ranging from the branches of mainstream mathematics to the field of software verification. The border between these two worlds is marked by results in theoretical computer science and proofs related to the metatheory of programming languages. This last field, which is an obvious application of interactive theorem proving, nonetheless poses a serious challenge to the users of such tools, due both to the particularly structured way in which these proofs are constructed, and to difficulties related to the management of notions typical of programming languages, like variable binding. This thesis is composed of two parts, discussing our experience in the development of the Matita interactive theorem prover and its use in the mechanization of the metatheory of programming languages. More specifically, part I covers:
- the results of our effort in providing a better framework for the development of tactics for Matita, in order to make their implementation and debugging easier, also resulting in much clearer code;
- a discussion of the implementation of two tactics, providing infrastructure for the unification of constructor forms and the inversion of inductive predicates; we point out interactions between induction and inversion and provide an advancement over the state of the art.
In the second part of the thesis, we focus on aspects related to the formalization of programming languages. We describe two works of ours:
- a discussion of basic issues we encountered in our formalizations of part 1A of the POPLmark challenge, where we apply the extended inversion principles we implemented for Matita;
- a formalization of an algebraic logical framework, posing more complex challenges, including multiple binding and a form of hereditary substitution; this work adopts, for the encoding of binding, an extension of Masahiko Sato's canonical locally named representation we designed during our visit to the Laboratory for Foundations of Computer Science at the University of Edinburgh, under the supervision of Randy Pollack.
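Matita's tactic language is not reproduced here; as a language-neutral illustration of what inversion of an inductive predicate does, the following minimal sketch in Lean 4 (names chosen for the example) shows case analysis on an impossible index closing the goal by unifying that index against each constructor.

```lean
-- A minimal indexed inductive predicate.
inductive Even : Nat → Prop where
  | zero : Even 0
  | step (n : Nat) : Even n → Even (n + 2)

-- "Inversion": case analysis on h unifies the index 1 against the possible
-- indices 0 and n + 2; both unifications fail, so every case is discarded
-- and the impossible hypothesis closes the goal.
example (h : Even 1) : False := by
  cases h
```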