1000 results for Anàlisi combinatòria (combinatorial analysis)
Abstract:
We study situations of allocating positions or jobs to students or workers based on priorities. An example is the assignment of medical students to hospital residencies on the basis of one or several entrance exams. For markets without couples, e.g., for "undergraduate student placement," acyclicity is a necessary and sufficient condition for the existence of a fair and efficient placement mechanism (Ergin, 2002). We show that in the presence of couples, which introduces complementarities into the students' preferences, acyclicity is still necessary, but not sufficient (Theorem 4.1). A second necessary condition (Theorem 4.2) is "priority-togetherness" of couples. A priority structure that satisfies both necessary conditions is called pt-acyclic. For student placement problems where all quotas are equal to one we characterize pt-acyclicity (Lemma 5.1) and show that it is a sufficient condition for the existence of a fair and efficient placement mechanism (Theorem 5.1). If in addition to pt-acyclicity we require "reallocation-" and "vacancy-fairness" for couples, the so-called dictator-bidictator placement mechanism is the unique fair and efficient placement mechanism (Theorem 5.2). Finally, for general student placement problems, we show that pt-acyclicity may not be sufficient for the existence of a fair and efficient placement mechanism (Examples 5.4, 5.5, and 5.6). We identify a sufficient condition such that the so-called sequential placement mechanism produces a fair and efficient allocation (Theorem 5.3).
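As a rough illustration of the acyclicity notion mentioned above, the sketch below brute-forces Ergin cycles in the unit-quota case, assuming the textbook definition from Ergin (2002): a cycle consists of schools a, b and students i, j, k with i above j above k at a and k above i at b. The function name and the toy priority profile are hypothetical, and the code is not taken from the paper.

```python
from itertools import permutations

def has_ergin_cycle(priorities):
    """Brute-force check for an Ergin cycle when every quota equals one.

    `priorities` maps each school to a list of students ordered from highest
    to lowest priority.  With unit quotas, a cycle is a pair of schools a, b
    and students i, j, k such that i >_a j >_a k and k >_b i.
    """
    for a, b in permutations(priorities, 2):
        rank_a = {s: r for r, s in enumerate(priorities[a])}
        rank_b = {s: r for r, s in enumerate(priorities[b])}
        for i, j, k in permutations(rank_a, 3):
            if rank_a[i] < rank_a[j] < rank_a[k] and i in rank_b and k in rank_b:
                if rank_b[k] < rank_b[i]:
                    return True
    return False

# Toy profile: school A ranks i > j > k while school B ranks k > i, so a cycle exists.
print(has_ergin_cycle({"A": ["i", "j", "k"], "B": ["k", "i"]}))  # True
```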
Abstract:
"Vegeu el resum a l'inici del document del fitxer adjunt."
Abstract:
We construct generating trees with one, two, and three labels for some classes of permutations avoiding generalized patterns of length 3 and 4. These trees are built by adding at each level an entry to the right end of the permutation, which allows us to incorporate the adjacency condition on some entries in an occurrence of a generalized pattern. We use these trees to find functional equations for the generating functions enumerating these classes of permutations with respect to different parameters. In several cases we solve them using the kernel method and some ideas of Bousquet-Mélou [2]. We obtain refinements of known enumerative results and find new ones.
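To make the objects concrete, the following minimal sketch counts, by brute force, permutations avoiding one generalized (vincular) pattern of length 3, namely 1-32, where the last two entries of an occurrence must be adjacent. The chosen pattern is just an example of the kind of class the generating trees handle, and the helper names are hypothetical.

```python
from itertools import permutations

def contains_1_32(p):
    """True if p contains the generalized pattern 1-32: positions a < b with
    p[b] > p[b+1] (the '3' and '2' entries must be adjacent) and p[a] < p[b+1]."""
    return any(p[b] > p[b + 1] and any(p[a] < p[b + 1] for a in range(b))
               for b in range(1, len(p) - 1))

# Brute-force counts of permutations of length n avoiding 1-32.
for n in range(1, 8):
    avoiders = sum(1 for p in permutations(range(1, n + 1)) if not contains_1_32(p))
    print(n, avoiders)
```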
Abstract:
To a finite graph there corresponds a free partially commutative group, with the given graph as its commutation graph. In this paper we construct an orthogonality theory for graphs and their corresponding free partially commutative groups. The theory developed here provides tools for the study of the structure of partially commutative groups, their universal theory, and their automorphism groups. In particular, the theory is applied in this paper to the centraliser lattice of such groups.
Abstract:
Counting labelled planar graphs, and typical properties of random labelled planar graphs, have received much attention recently. We start the process here of extending these investigations to graphs embeddable on any fixed surface S. In particular we show that the labelled graphs embeddable on S have the same growth constant as for planar graphs, and the same holds for unlabelled graphs. Also, if we pick a graph uniformly at random from the graphs embeddable on S which have vertex set {1, ..., n}, then with probability tending to 1 as n → ∞, this random graph either is connected or consists of one giant component together with a few nodes in small planar components.
Abstract:
One of the main implications of the efficient market hypothesis (EMH) is that expected future returns on financial assets are not predictable if investors are risk neutral. In this paper we argue that financial time series offer more information than this hypothesis seems to suggest. In particular, we postulate that runs of very large returns can be predictable for small time periods. In order to prove this we propose a TAR(3,1)-GARCH(1,1) model that is able to describe two different types of extreme events: a first type generated by large-uncertainty regimes, where runs of extremes are not predictable, and a second type where extremes come from isolated dread/joy events. This model is new in the literature on nonlinear processes. Its novelty resides in two features that make it different from previous TAR methodologies: the regimes are motivated by the occurrence of extreme values, and the threshold variable is defined by the shock affecting the process in the preceding period. In this way the model is able to uncover dependence and clustering of extremes in high as well as in low volatility periods. The model is tested with data from General Motors stock prices corresponding to two crises that had a substantial impact on financial markets worldwide: the Black Monday of October 1987 and September 11, 2001. By analyzing the periods around these crises we find evidence of statistical significance of our model, and thereby of predictability of extremes, for September 11 but not for Black Monday. These findings support the hypotheses of a big negative event producing runs of negative returns in the first case, and of the burst of a worldwide stock market bubble in the second.
JEL classification: C12; C15; C22; C51
Keywords and phrases: asymmetries, crises, extreme values, hypothesis testing, leverage effect, nonlinearities, threshold models
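A minimal simulation sketch of the model class described above may help: a three-regime threshold AR(1) whose regime is selected by the previous-period shock, driven by GARCH(1,1) innovations. All parameter values, thresholds, and the function name are illustrative placeholders, not the specification or estimates from the paper.

```python
import numpy as np

def simulate_tar3_garch11(n, thresholds=(-2.0, 2.0), phis=(0.3, 0.0, -0.3),
                          omega=0.05, alpha=0.1, beta=0.85, seed=0):
    """Simulate an illustrative TAR(3,1)-GARCH(1,1) path.

    The AR(1) coefficient switches among three regimes according to the
    previous-period shock eps[t-1] (the threshold variable), and the shocks
    follow a GARCH(1,1): sigma2[t] = omega + alpha*eps[t-1]**2 + beta*sigma2[t-1].
    """
    rng = np.random.default_rng(seed)
    r = np.zeros(n)
    eps = np.zeros(n)
    sigma2 = np.full(n, omega / (1.0 - alpha - beta))  # start at unconditional variance
    lo, hi = thresholds
    for t in range(1, n):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
        # Regime chosen by the previous shock: lower tail, middle, upper tail.
        if eps[t - 1] < lo:
            phi = phis[0]
        elif eps[t - 1] > hi:
            phi = phis[2]
        else:
            phi = phis[1]
        eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
        r[t] = phi * r[t - 1] + eps[t]
    return r, eps, np.sqrt(sigma2)

returns, shocks, vol = simulate_tar3_garch11(1000)
print(returns[:5], vol[:5])
```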
Abstract:
"Vegeu el resum a l'inici del document del fitxer adjunt."
Abstract:
A new expression for the characteristic function of log-spot in the Heston model is presented. This expression more clearly exhibits its properties as an analytic characteristic function and allows us to compute the exact domain of the moment generating function. This result is then applied to the volatility smile at extreme strikes and to the control of the moments of spot. We also give a factorization of the moment generating function as a product of Bessel-type factors, and an approximating sequence to the law of log-spot is deduced.
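For reference, a sketch of the widely used closed form of the Heston characteristic function of log-spot (in a numerically stable formulation) is given below; this is the standard textbook expression, not the new expression derived in the paper, and the parameter names follow the usual conventions.

```python
import numpy as np

def heston_cf(u, T, S0, v0, kappa, theta, sigma, rho, r=0.0):
    """Characteristic function E[exp(i*u*log(S_T))] in the Heston model,
    written in the numerically stable ("little trap") formulation.
    Parameters: v0 initial variance, kappa mean reversion, theta long-run
    variance, sigma vol-of-vol, rho correlation, r risk-free rate."""
    iu = 1j * u
    d = np.sqrt((rho * sigma * iu - kappa) ** 2 + sigma ** 2 * (iu + u ** 2))
    g = (kappa - rho * sigma * iu - d) / (kappa - rho * sigma * iu + d)
    exp_dT = np.exp(-d * T)
    C = r * iu * T + kappa * theta / sigma ** 2 * (
        (kappa - rho * sigma * iu - d) * T
        - 2.0 * np.log((1.0 - g * exp_dT) / (1.0 - g))
    )
    D = (kappa - rho * sigma * iu - d) / sigma ** 2 * (1.0 - exp_dT) / (1.0 - g * exp_dT)
    return np.exp(C + D * v0 + iu * np.log(S0))

# Sanity check: at u = 0 the characteristic function equals 1.
print(heston_cf(0.0, T=1.0, S0=100.0, v0=0.04, kappa=2.0, theta=0.04, sigma=0.3, rho=-0.7))
```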
Abstract:
Using recent results on the behavior of multiple Wiener-Itô integrals based on Stein's method, we prove Hsu-Robbins and Spitzer's theorems for sequences of correlated random variables related to the increments of the fractional Brownian motion.
Abstract:
Approximate Quickselect, a simple modification of the well known Quickselect algorithm for selection, can be used to efficiently find an element with rank k in a given range [i..j], out of n given elements. We study basic cost measures of Approximate Quickselect by computing exact and asymptotic results for the expected number of passes, comparisons and data moves during the execution of this algorithm. The key element appearing in the analysis of Approximate Quickselect is a trivariate recurrence that we solve in full generality. The general solution of the recurrence proves to be very useful, as it allows us to tackle several related problems, besides the analysis that originally motivated us. In particular, we have been able to carry out a precise analysis of the expected number of moves of the ith element when selecting the jth smallest element with standard Quickselect, where we are able to give both exact and asymptotic results. Moreover, we can apply our general results to obtain exact and asymptotic results for several parameters in binary search trees, namely the expected number of common ancestors of the nodes with rank i and j, the expected size of the subtree rooted at the least common ancestor of the nodes with rank i and j, and the expected distance between the nodes of ranks i and j.
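A minimal sketch of the Approximate Quickselect idea as described in the abstract: standard Quickselect partitioning, but the recursion stops as soon as a pivot's rank falls anywhere inside the target range [i..j]. This is an illustration of the algorithm being analyzed, not the authors' exact pseudocode or cost accounting.

```python
import random

def approximate_quickselect(a, i, j):
    """Return some element of a whose rank (1-based, ascending) lies in [i..j]."""
    lo, hi = 0, len(a) - 1
    while True:
        p = random.randint(lo, hi)
        a[p], a[hi] = a[hi], a[p]          # move the random pivot to the end
        pivot, store = a[hi], lo
        for k in range(lo, hi):            # partition around the pivot
            if a[k] < pivot:
                a[k], a[store] = a[store], a[k]
                store += 1
        a[store], a[hi] = a[hi], a[store]
        rank = store + 1                   # 1-based rank of the pivot in the array
        if i <= rank <= j:
            return a[store]                # stop as soon as the rank lands in [i..j]
        elif rank < i:
            lo = store + 1
        else:
            hi = store - 1

data = [9, 1, 7, 3, 5, 8, 2, 6, 4]
print(approximate_quickselect(data, 3, 5))  # some element of rank 3, 4 or 5
```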
Abstract:
This paper shows that certain quotients of entire functions are characteristic functions. Under some conditions, we provide expressions for the densities of such characteristic functions, which turn out to be generalized Dirichlet series, which in turn can be expressed as infinite linear combinations of exponential or Laplace densities. We apply these results to several examples.
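As orientation, a classical example of the phenomenon: sin(t)/t is a quotient of entire functions and is the characteristic function of the uniform law on [-1, 1]. The short check below compares a Monte Carlo estimate of that characteristic function with sin(t)/t; it is a generic illustration, not one of the paper's examples.

```python
import numpy as np

# Classical example of a quotient of entire functions that is a characteristic
# function: sin(t)/t = E[exp(i t U)] for U uniform on [-1, 1].
rng = np.random.default_rng(0)
u = rng.uniform(-1.0, 1.0, size=200_000)

for t in (0.5, 1.0, 3.0, 7.0):
    empirical = np.mean(np.exp(1j * t * u))   # Monte Carlo estimate of the CF
    exact = np.sin(t) / t                     # quotient of entire functions
    print(f"t={t}: MC={empirical.real:+.4f}  sin(t)/t={exact:+.4f}")
```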
Abstract:
"Vegeu el resum a l'inici del document del fitxer adjunt."
Abstract:
The work studies a general multiserver queue in which the service time of an arriving customer and the next interarrival period may depend on both the current waiting time and the server assigned to the arriving customer. Stability of the system is proved under general assumptions on the predetermined distributions describing the model. The proof exploits a combination of the Markov property of the workload process with a regenerative property of the process. The key idea leading to stability is a characterization of the limit behavior of the forward renewal process generated by regenerations. Extensions of the basic model are also studied.
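A minimal simulation sketch of one member of the model class: a multiserver queue in which the service time and the next interarrival time depend on the arriving customer's waiting time and on the server assigned. The specific dependence functions and parameters below are invented placeholders chosen only to illustrate the state dependence; they are not the assumptions of the paper.

```python
import numpy as np

def simulate_state_dependent_queue(n_customers, c=2, seed=0):
    """Simulate a c-server queue where the service time and the next
    interarrival time depend on the arriving customer's waiting time and on
    the server assigned (customers join the server with the least workload)."""
    rng = np.random.default_rng(seed)
    w = np.zeros(c)                      # current workload of each server
    waits = np.empty(n_customers)
    for t in range(n_customers):
        server = int(np.argmin(w))       # assigned server: least remaining work
        wait = w[server]                 # waiting time of the arriving customer
        # Placeholder dependences: service slows down after long waits and is
        # slower on higher-indexed servers; congestion stretches interarrivals.
        service = rng.exponential(1.0 + 0.1 * wait) * (1.0 + 0.2 * server)
        interarrival = rng.exponential(1.2 + 0.05 * wait)
        w[server] += service
        w = np.maximum(w - interarrival, 0.0)   # advance time to the next arrival
        waits[t] = wait
    return waits

waits = simulate_state_dependent_queue(10_000)
print("mean waiting time:", waits.mean())
```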
Abstract:
We investigate in this note the dynamics of a one-dimensional Keller-Segel type model on the half-line. In contrast to the classical configuration, the chemical production term is located on the boundary. We prove, under suitable assumptions, the following dichotomy, which is reminiscent of the two-dimensional Keller-Segel system: solutions are global if the mass is below the critical mass, they blow up in finite time above the critical mass, and they converge to some equilibrium at the critical mass. Entropy techniques are presented which aim at providing quantitative convergence results for the subcritical case. This note is completed with a brief introduction to a more realistic model (still one-dimensional).