541 results for Schoenberg Conjecture


Relevance:

10.00%

Publisher:

Abstract:

We study the Hawking radiation of a (4+n)-dimensional Schwarzschild black hole embedded in a space-time with a positive cosmological constant. The greybody factors and energy emission rates of scalars, fermions, bosons, and gravitons are calculated over the full range of energy. Valuable information on the dimensions and curvature of space-time is revealed. Furthermore, we investigate the entropy radiated and lost by black holes. We find their ratio to be near 1, in favor of Bekenstein's conjecture.
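For context, the entropy in question is the Bekenstein–Hawking entropy, which in four dimensions takes the standard form below; the (4+n)-dimensional case generalizes the horizon area and Newton's constant and is not reproduced here:

```latex
S_{BH} = \frac{k_B c^{3} A}{4 G \hbar}
```

where A is the area of the event horizon.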

Relevance:

10.00%

Publisher:

Abstract:

In many different spatial discrimination tasks, such as in determining the sign of the offset in a vernier stimulus, the human visual system exhibits hyperacuity-level performance by evaluating spatial relations with the precision of a fraction of a photoreceptor's diameter. We propose that this impressive performance depends in part on a fast learning process that uses relatively few examples and occurs at an early processing stage in the visual pathway. We show that this hypothesis is plausible by demonstrating that it is possible to synthesize, from a small number of examples of a given task, a simple (HyperBF) network that attains the required performance level. We then verify with psychophysical experiments some of the key predictions of our conjecture. In particular, we show that fast stimulus-specific learning indeed takes place in the human visual system and that this learning does not transfer between two slightly different hyperacuity tasks.
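The cited HyperBF construction can be loosely illustrated with a plain Gaussian radial-basis-function network fit from a handful of examples; the toy task, the 1-D offset discrimination, and all parameters below are illustrative stand-ins, not the paper's:

```python
import numpy as np

# Minimal Gaussian RBF network in the spirit of the HyperBF networks
# mentioned above: trained from a small number of examples of a toy
# discrimination task (the sign of a 1-D offset).

def rbf_features(x, centers, sigma=0.1):
    # Gaussian activations for each (input, center) pair
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * sigma ** 2))

# 20 example offsets and their signs (the quantity to be discriminated)
x = np.linspace(-1, 1, 21)
x = x[x != 0]                       # drop the ambiguous zero offset
y = np.sign(x)

Phi = rbf_features(x, x)            # centers placed at the training examples
w = np.linalg.lstsq(Phi, y, rcond=None)[0]   # output weights

preds = np.sign(rbf_features(x, x) @ w)
print((preds == y).mean())          # fraction classified correctly
```

With centers at the examples themselves, the network interpolates the few training labels exactly, which is the sense in which a small example set suffices here.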

Relevance:

10.00%

Publisher:

Abstract:

Olusanya, O. (2004). Double Jeopardy Without Parameters: Re-characterization in International Criminal Law. Series Supranational Criminal Law: Capita Selecta, volume 2. Antwerp: Intersentia. RAE2008

Relevance:

10.00%

Publisher:

Abstract:

We give an explicit and easy-to-verify characterization of when a family of subsets of finite total orders (in general infinitely many of them) is uniformly definable by a single first-order formula. From this characterization we derive immediately that Beth's definability theorem does not hold in any class of finite total orders, and that McColm's first conjecture is true for all classes of finite total orders. Another consequence is a natural 0-1 law for definable subsets of finite total orders, expressed as a statement about the possible densities of first-order definable subsets.

Relevance:

10.00%

Publisher:

Abstract:

One- and two-dimensional cellular automata which are known to be fault-tolerant are very complex. On the other hand, only very simple cellular automata have actually been proven to lack fault tolerance, i.e., to be mixing. The latter either have large noise probability ε or belong to the small family of two-state nearest-neighbor monotonic rules which includes local majority voting. For a certain simple automaton L called the soldiers rule, this problem has intrigued researchers for the last two decades, since L is clearly more robust than local voting: in the absence of noise, L eliminates any finite island of perturbation from an initial configuration of all 0's or all 1's. The same holds for K, a 4-state monotonic variant of L called two-line voting. We will prove that the probabilistic cellular automata Kε and Lε asymptotically lose all information about their initial state when subject to small, strongly biased noise. The mixing property trivially implies that the systems are ergodic. The finite-time information-retaining quality of a mixing system can be represented by its relaxation time Relax(⋅), which measures the time before the onset of significant information loss. This is known to grow as (1/ε)^c for noisy local voting. The impressive error-correction ability of L has prompted some researchers to conjecture that Relax(Lε) = 2^(c/ε). We prove the tight bound 2^(c1 log²(1/ε)) < Relax(Lε) < 2^(c2 log²(1/ε)) for a biased error model; the same holds for Kε. Moreover, the lower bound is independent of the bias assumption. The strong bias assumption makes it possible to apply sparsity/renormalization techniques, the main tools of our investigation, used earlier in the opposite context of proving fault-tolerance.
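The gap between the three growth rates, polynomial for local voting, quasi-polynomial for the proved bound, and exponential for the refuted conjecture, can be made concrete numerically. The constants below are placeholders; only the functional forms come from the result quoted above:

```python
import math

# Compare relaxation-time growth rates as the noise probability eps
# shrinks. The constants c are illustrative placeholders.
def relax_voting(eps, c=1.0):
    return (1 / eps) ** c                       # polynomial in 1/eps

def relax_soldiers(eps, c=1.0):
    return 2 ** (c * math.log2(1 / eps) ** 2)   # quasi-polynomial (proved)

def relax_conjectured(eps, c=1.0):
    return 2 ** (c / eps)                       # the conjectured exponential

for eps in (1e-1, 1e-2):
    print(relax_voting(eps), relax_soldiers(eps), relax_conjectured(eps))
```

Already at ε = 0.01 the proved 2^(c log²(1/ε)) bound dwarfs (1/ε)^c while remaining vastly below 2^(c/ε).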

Relevance:

10.00%

Publisher:

Abstract:

We present what we believe to be the first thorough characterization of live streaming media content delivered over the Internet. Our characterization of over five million requests spanning a 28-day period is done at three increasingly granular levels, corresponding to clients, sessions, and transfers. Our findings support two important conclusions. First, we show that the nature of interactions between users and objects is fundamentally different for live versus stored objects. Access to stored objects is user driven, whereas access to live objects is object driven. This reversal of active/passive roles of users and objects leads to interesting dualities. For instance, our analysis underscores a Zipf-like profile for user interest in a given object, which is to be contrasted to the classic Zipf-like popularity of objects for a given user. Also, our analysis reveals that transfer lengths are highly variable and that this variability is due to the stickiness of clients to a particular live object, as opposed to structural (size) properties of objects. Second, based on observations we make, we conjecture that the particular characteristics of live media access workloads are likely to be highly dependent on the nature of the live content being accessed. In our study, this dependence is clear from the strong temporal correlations we observed in the traces, which we attribute to the synchronizing impact of live content on access characteristics. Based on our analyses, we present a model for live media workload generation that incorporates many of our findings, and which we implement in GISMO [19].
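The Zipf-like profile of user interest described above can be illustrated with a toy rank-share computation; the exponent is an assumption for illustration, not a value measured in the study:

```python
import numpy as np

# Toy Zipf-like interest profile: the k-th ranked client accounts for
# a share of requests proportional to 1/k**alpha. alpha is illustrative.
alpha = 1.0
ranks = np.arange(1, 1001)
shares = ranks.astype(float) ** -alpha
shares /= shares.sum()

# Heavy concentration at the head of the ranking
print(shares[:10].sum())   # share of requests from the 10 most active ranks
```

With 1000 ranks and α = 1, the top 10 ranks alone carry roughly 40% of the total, the hallmark of a Zipf-like profile.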

Relevance:

10.00%

Publisher:

Abstract:

The distributed outstar, a generalization of the outstar neural network for spatial pattern learning, is introduced. In the outstar, signals from a source node cause weights to learn and recall arbitrary patterns across a target field of nodes. The distributed outstar replaces the outstar source node with a source field of arbitrarily many nodes, whose activity pattern may be arbitrarily distributed or compressed. Learning proceeds according to a principle of atrophy due to disuse, whereby a path weight decreases in joint proportion to the transmitted path signal and the degree of disuse of the target node. During learning, the total signal to a target node converges toward that node's activity level. Weight changes at a node are apportioned according to the distributed pattern of converging signals. Three synaptic transmission functions, defined by a product rule, a capacity rule, and a threshold rule, are examined for this system. The three rules are computationally equivalent when source field activity is maximally compressed, or winner-take-all. When source field activity is distributed, catastrophic forgetting may occur. Only the threshold rule solves this problem. Analysis of spatial pattern learning by distributed codes thereby leads to the conjecture that the unit of long-term memory in such a system is an adaptive threshold, rather than the multiplicative path weight widely used in neural models.
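The convergence property described above ("the total signal to a target node converges toward that node's activity level") can be simulated with a toy update in which weight changes are apportioned by the source signals. The specific update rule below is a simplification invented for illustration; it is not any of the paper's three transmission rules:

```python
import numpy as np

# Toy sketch of distributed-outstar convergence: the total signal to
# each target node is driven toward that node's activity level, with
# weight changes apportioned across converging paths by source activity.
rng = np.random.default_rng(1)
s = np.array([0.6, 0.3, 0.1])        # distributed source field activity
x = np.array([0.2, 0.8])             # target field activity pattern
W = rng.uniform(0, 1, size=(3, 2))   # path weights (source x target)

lr = 0.5
for _ in range(200):
    T = s @ W                        # total signal per target node
    W += lr * np.outer(s, x - T)     # drive T toward x, apportioned by s

print(np.round(s @ W, 3))            # approximately the target activities
```

After training, the summed signal reaching each target node matches that node's activity, regardless of how the source activity is distributed.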

Relevance:

10.00%

Publisher:

Abstract:

It is a neural network truth universally acknowledged, that the signal transmitted to a target node must be equal to the product of the path signal times a weight. Analysis of catastrophic forgetting by distributed codes leads to the unexpected conclusion that this universal synaptic transmission rule may not be optimal in certain neural networks. The distributed outstar, a network designed to support stable codes with fast or slow learning, generalizes the outstar network for spatial pattern learning. In the outstar, signals from a source node cause weights to learn and recall arbitrary patterns across a target field of nodes. The distributed outstar replaces the outstar source node with a source field of arbitrarily many nodes, whose activity pattern may be arbitrarily distributed or compressed. Learning proceeds according to a principle of atrophy due to disuse, whereby a path weight decreases in joint proportion to the transmitted path signal and the degree of disuse of the target node. During learning, the total signal to a target node converges toward that node's activity level. Weight changes at a node are apportioned according to the distributed pattern of converging signals. Three types of synaptic transmission, a product rule, a capacity rule, and a threshold rule, are examined for this system. The three rules are computationally equivalent when source field activity is maximally compressed, or winner-take-all. When source field activity is distributed, catastrophic forgetting may occur. Only the threshold rule solves this problem. Analysis of spatial pattern learning by distributed codes thereby leads to the conjecture that the optimal unit of long-term memory in such a system is a subtractive threshold, rather than a multiplicative weight.

Relevance:

10.00%

Publisher:

Abstract:

In 1938, in Düsseldorf, the Nazis put on an exhibit entitled “Entartete Musik” (degenerate music), which included composers on the basis of their “racial origins” (i.e. Jews), or because of the “modernist style” of their music. Performance, publication, broadcast, or sale of music by composers deemed “degenerate” was forbidden by law throughout the Third Reich. Among these composers were some of the most prominent composers of the first half of the twentieth century. They included Stravinsky, Schoenberg, Webern, Berg, Mahler, Ernst Krenek, George Gershwin, Kurt Weill, Erwin Schulhoff, and others. The music of nineteenth-century composers of Jewish origin, such as Mendelssohn and Meyerbeer, was also officially proscribed. In each of the three recitals for this project, significant works were performed by composers who were included in this exhibition, namely, Mendelssohn, Webern, Berg, Weill, and Hans Gal. In addition, as an example of self-censorship, a work of Karl Amadeus Hartmann was included. Hartmann chose “internal exile” by refusing to allow performance of his works in Germany during the Nazi regime. One notable exception to the above categories was a work by Beethoven that was presented as a bellwether of the relationship between music and politics. The range of styles and genres in these three recitals indicates the degree to which Nazi musical censorship cut a wide swath across Europe’s musical life with devastating consequences for its music and culture.

Relevance:

10.00%

Publisher:

Abstract:

Variation, or the re-working of existing musical material, has consistently attracted the attention of composers and performers throughout the history of Western music. In three recorded recitals at the University of Maryland School of Music, this dissertation project explores a diverse range of expressive possibilities for the violin in seven types of variation form in Austro-German works for violin from the 17th through the 20th centuries. The first program, consisting of Baroque Period works performed on a period instrument, includes the divisions on “John come kiss me now” from The Division Violin by Thomas Baltzar (1631 – 1663), constant-bass variations in Sonatae Unarum Fidium by Johann Heinrich von Schmelzer (1623 – 1680), arbitrary variation in the Sonata for Violin and Continuo in E Major, Op. 1, No. 12 “Roger” by Georg Friedrich Händel (1685 – 1759), and French Double style, melodic-outline variation in the Partita for Unaccompanied Violin in B Minor by Johann Sebastian Bach (1685 – 1750). Theme and Variations, a popular Classical Period format, is represented by the Sonata for Piano and Violin in G Major, K. 379, by Wolfgang Amadeus Mozart (1756 – 1791) and the Sonata for Violin and Piano No. 9 in A Major, Op. 47, the “Kreutzer,” by Ludwig van Beethoven (1770 – 1827). The Fantasy for Piano and Violin in C Major, D. 934, by Franz Schubert (1797 – 1828) represents the 19th-century fantasia variation. In these pieces, the piano and violin parts are densely interwoven, having equal importance. Many 20th-century composers incorporated diverse types of variation in their works; the third recital program comprises serial variation in the Phantasy for Violin and Piano, Op. 47, of Arnold Schoenberg (1874 – 1951); a strict form of melodic-outline variation in the Sonate für Violine allein, Op. 31, No. 2, of Paul Hindemith (1895 – 1963); and ostinato variation in Johan Halvorsen’s (1864 – 1935) Passacaglia for Violin and Viola, after G. F. Handel’s Passacaglia from the Harpsichord Suite No. 7 in G Minor. Pianist Audrey Andrist, harpsichordist Sooyoung Jung, and violist Dong-Wook Kim assisted in these performances.

Relevance:

10.00%

Publisher:

Abstract:

This study sought to understand the phenomenon of faculty involvement in indirect cost under-recovery. The focus of the study was on public research university STEM (science, technology, engineering and mathematics) faculty, and their perspectives on, and behavior towards, a higher education fiscal policy. The explanatory scheme was derived from anthropological theory, and incorporated organizational culture, faculty socialization, and political bargaining models in the conceptual framework. This study drew on two key assumptions. The first assumption was that faculty understanding of, and behavior toward, indirect cost recovery represents values, beliefs, and choices drawn from the distinct professional socialization and distinct culture of faculty. The second assumption was that when faculty and institutional administrators are in conflict over indirect cost recovery, the resultant formal administrative decision comes about through political bargaining over critical resources. The research design was a single site, qualitative case study with a focus on learning the meaning of the phenomenon as understood by the informants. In this study the informants were tenured and tenure track research university faculty in the STEM fields who were highly successful at obtaining Federal sponsored research funds, with individual sponsored research portfolios of at least one million dollars. The data consisted of 11 informant interviews, bolstered by documentary evidence. The findings indicated that faculty socialization and organizational culture were the most dominant themes, while political bargaining emerged as significantly less prominent. Public research university STEM faculty are most concerned about the survival of their research programs and the discovery facilitated by their research programs. They resort to conjecture when confronted by the issue of indirect cost recovery. 
The findings suggest that institutional administrators should place less emphasis on compliance and hierarchy when working with expert professionals such as science faculty. Instead, a more effective focus might be on communication and clarity in budget processes and organizational decision-making, and on critical administrative support that can relieve faculty administrative burdens. For higher education researchers, the findings suggest that we need to create more sophisticated models to help us understand organizations dependent on expert professionals.

Relevance:

10.00%

Publisher:

Abstract:

Relative Evidential Supports (RES) was developed and justified several years ago as a non-numeric apparatus that allows us to compare evidential supports for alternative conclusions when making a decision. An extension of the RES concept of pairwise balancing and trading-off of evidence, called Graded Relative Evidence (GRE), is reported here; it keeps the basic RES features of simplicity and perspicacity but enriches its modelling fidelity by permitting modest and intuitive variations in degrees of outweighing (which the essentially binary RES does not). The formal justification rests simply on linkages to RES and to the Dempster–Shafer theory of evidence. The use of this simple extension is illustrated, and to a small degree further justified empirically, by application to a topical scientific debate about what is here called the Congo Crossover Conjecture. This decision-making instance is chosen because of the wealth of evidence that has been accumulated on both sides of the debate and the range of evidence strengths manifested in it. The conjecture is that the advent of AIDS was in the late 1950s in the Congo, when a vaccine for polio was allegedly cultivated in the kidneys of chimpanzees, which allowed the AIDS infection to cross over to humans from primates. © 2005 Springer.
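Since GRE's formal justification links to Dempster–Shafer theory, a minimal sketch of Dempster's rule of combination may help; the frame of discernment and the mass values below are invented for illustration and are not drawn from the Congo Crossover debate:

```python
from itertools import product

# Dempster's rule of combination over a tiny frame of discernment.
# Focal elements are frozensets; the mass assignments are illustrative.
C, N = "crossover", "natural"
theta = frozenset({C, N})            # the full frame (total ignorance)

m1 = {frozenset({C}): 0.6, theta: 0.4}   # one body of evidence
m2 = {frozenset({N}): 0.5, theta: 0.5}   # a conflicting body of evidence

combined, conflict = {}, 0.0
for (a, wa), (b, wb) in product(m1.items(), m2.items()):
    inter = a & b
    if inter:
        combined[inter] = combined.get(inter, 0.0) + wa * wb
    else:
        conflict += wa * wb          # mass assigned to the empty set

# Normalize by 1 - K, where K is the total conflicting mass
combined = {k: v / (1 - conflict) for k, v in combined.items()}
print(conflict, combined)
```

The normalization step is what RES-style pairwise balancing abstracts away from numerically: conflicting evidence is traded off rather than quantified.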

Relevance:

10.00%

Publisher:

Abstract:

Distorted-wave Born approximation calculations for Ps formation in positron impact on He, Ne, Ar, Kr and Xe are reported for the energy range up to 200 eV. Capture into the n = 1, 2 and 3 states of Ps is calculated explicitly, and 1/n³ scaling is used to estimate capture into states with n > 3. The calculations for the heavier noble gases allow for capture not only from the outer np⁶ shell of the atom but also from the first inner ns² shell. However, the inner-shell capture is found to be very small. Although by no means unambiguous, the calculations provide some support for the conjecture of Laricchia et al. [J. Phys. B 35 (2002) 2525] that the double-peak and shoulder structures observed experimentally for Ps formation in Ar, Kr and Xe arise from formation in excited states. © 2004 Elsevier B.V. All rights reserved.
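The 1/n³ extrapolation works as follows: given explicitly computed capture cross-sections for n = 1, 2, 3, the contribution of higher states is estimated by scaling the n = 3 value. The σ values below are placeholders, not the paper's computed cross-sections:

```python
# Estimate a total Ps-formation cross-section using 1/n^3 scaling for
# capture into states with n > 3. The sigma values are placeholders.
sigma = {1: 1.00, 2: 0.14, 3: 0.04}   # explicit n = 1, 2, 3 (arb. units)

# sigma_n ~ sigma_3 * (3/n)^3 for n > 3; sum the tail to convergence
tail = sum(sigma[3] * (3 / n) ** 3 for n in range(4, 10_000))

total = sum(sigma.values()) + tail
print(total)
```

Because the tail sum converges like Σ 1/n³, the states above n = 3 add only a small correction to the explicitly computed terms.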

Relevance:

10.00%

Publisher:

Abstract:

A beam splitter is a simple, readily available device which can act to entangle output optical fields. We show that a necessary condition for the fields at the output of the beam splitter to be entangled is that the pure input states exhibit nonclassical behavior. We generalize this proof for arbitrary (pure or impure) Gaussian input states. Specifically, nonclassicality of the input Gaussian fields is a necessary condition for entanglement of the field modes with the help of a beam splitter. We conjecture that this is a general property of beam splitters: Nonclassicality of the inputs is a necessary condition for entangling fields in a beam splitter.
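For reference, a lossless 50:50 beam splitter mixes the input mode operators as below, in one common phase convention (other conventions differ by phase factors):

```latex
\hat{a}_{\mathrm{out}} = \frac{\hat{a} + \hat{b}}{\sqrt{2}}, \qquad
\hat{b}_{\mathrm{out}} = \frac{\hat{a} - \hat{b}}{\sqrt{2}}
```

It is this linear mixing of the two input modes that can convert nonclassicality of the inputs into entanglement between the outputs.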

Relevance:

10.00%

Publisher:

Abstract:

The reality of volunteering is extremely complex, to the point that defining and characterizing volunteer work is difficult, given the wide variety of interpretations, motivations, sociodemographic variables, and cultural aspects that shape the profile of volunteers. The aim of this work is to analyze the joint influence of certain sociodemographic variables, as well as of secular or traditional cultural values, on the profile of volunteers in Europe. In addition, we investigate which variables steer volunteers toward one particular type of volunteering or another. To this end, a logistic regression methodology was applied to the information available in the European Values Study. The results help establish a characterization of volunteering in Europe, and they confirm the influence of cultural values, first, on whether or not volunteer work is undertaken and, second, on the choice these people make of the type of activity to which they are committed. By analyzing two types of volunteering with supposedly very different motivations, we conclude that there is a group of values that influence both, although the direction and intensity with which they do so differ; moreover, some values do or do not influence participation in volunteer work depending on the specific type in question.
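A minimal sketch of the kind of logistic regression used in the study above, fit on invented data; the covariates, coefficients, and sample are hypothetical, not drawn from the European Values Study:

```python
import numpy as np

# Toy logistic regression fit by gradient descent: probability of
# doing volunteer work from invented sociodemographic covariates.
rng = np.random.default_rng(2)
n = 500
X = np.column_stack([
    np.ones(n),                       # intercept
    rng.normal(0, 1, n),              # e.g. a standardized age variable
    rng.integers(0, 2, n),            # e.g. secular (1) vs traditional (0)
])
true_beta = np.array([-1.0, 0.5, 1.2])
p = 1 / (1 + np.exp(-X @ true_beta))
y = rng.random(n) < p                 # volunteer yes/no

beta = np.zeros(3)
for _ in range(2000):
    grad = X.T @ (1 / (1 + np.exp(-X @ beta)) - y) / n
    beta -= 0.5 * grad

print(np.round(beta, 2))              # estimates near true_beta
```

The fitted coefficients recover the signs of the generating values, which is the form the study's conclusions take: direction and intensity of each value's influence on volunteering.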