939 results for Computational music theory
Abstract:
Answer Set Programming (ASP) is a popular framework for modelling combinatorial problems. However, ASP cannot easily be used for reasoning about uncertain information. Possibilistic ASP (PASP) is an extension of ASP that combines possibilistic logic and ASP. In PASP a weight is associated with each rule, where this weight is interpreted as the certainty with which the conclusion can be established when the body is known to hold. As such, it allows us to model and reason about uncertain information in an intuitive way. In this paper we present new semantics for PASP in which rules are interpreted as constraints on possibility distributions. Special models of these constraints are then identified as possibilistic answer sets. In addition, since ASP is a special case of PASP in which all rules are entirely certain, we obtain a new characterization of ASP in terms of constraints on possibility distributions. This allows us to uncover a new form of disjunction, called weak disjunction, that has not been previously considered in the literature. In addition to introducing and motivating the semantics of weak disjunction, we also pinpoint its computational complexity. In particular, while the complexity of most reasoning tasks coincides with that of standard disjunctive ASP, we find that brave reasoning for programs with weak disjunctions is easier.
Abstract:
We show that a self-generated set of combinatorial games, S, may not be hereditarily closed, but strong self-generation and hereditary closure are equivalent in the universe of short games. In [13], the question “Is there a set which will give a non-distributive but modular lattice?” appears. A useful necessary condition for the existence of a finite non-distributive modular L(S) is proved. We show the existence of an S such that L(S) is modular and not distributive, exhibiting the first known example. Moreover, we prove a Representation Theorem with Games that allows the generation of all finite lattices in a game context. Finally, a computational tool for drawing lattices of games is presented.
Abstract:
Statistical techniques are fundamental in science, and linear regression analysis is perhaps one of the most widely used methodologies. It is well known from the literature that, under certain conditions, linear regression is an extremely powerful statistical tool. Unfortunately, in practice, some of those conditions are rarely satisfied, and the regression models become ill-posed, precluding the application of the traditional estimation methods. This work presents some contributions to maximum entropy theory in the estimation of ill-posed models, in particular in the estimation of linear regression models with small samples affected by collinearity and outliers. The research is developed along three lines, namely the estimation of technical efficiency with state-contingent production frontiers, the estimation of the ridge parameter in ridge regression, and, finally, new developments in maximum entropy estimation. In the estimation of technical efficiency with state-contingent production frontiers, the work shows a better performance of the maximum entropy estimators relative to the maximum likelihood estimator. This good performance is notable in models with few observations per state and in models with a large number of states, which are commonly affected by collinearity. It is hoped that the use of maximum entropy estimators will contribute to the much-desired increase in empirical work with these production frontiers. In ridge regression, the greatest challenge is the estimation of the ridge parameter. Although numerous procedures are available in the literature, none outperforms all the others. In this work, a new estimator of the ridge parameter is proposed, combining ridge trace analysis with maximum entropy estimation.
The results obtained in simulation studies suggest that this new estimator is one of the best procedures available in the literature for estimating the ridge parameter. The Leuven maximum entropy estimator is based on the least squares method, on Shannon entropy, and on concepts from quantum electrodynamics. This estimator overcomes the main criticism levelled at the generalized maximum entropy estimator, since it dispenses with the supports for the parameters and errors of the regression model. In this work, new contributions to maximum entropy theory in the estimation of ill-posed models are presented, based on the Leuven maximum entropy estimator, information theory, and robust regression. The estimators developed show good performance in linear regression models with small samples affected by collinearity and outliers. Finally, some computational codes for maximum entropy estimation are presented, thereby contributing to an increase in the scarce computational resources currently available.
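The ridge-parameter problem discussed in this abstract can be illustrated with a minimal sketch. This is plain ridge regression with a grid of candidate k values (a basic ridge trace), not the thesis's maximum-entropy procedure; the function name, data, and grid are all illustrative:

```python
# Minimal ridge regression sketch for a two-predictor model:
# beta(k) = (X'X + k I)^{-1} X'y, solved in closed form for 2x2 systems.
# A ridge trace inspects how beta(k) behaves over a grid of k values.

def ridge_beta(X, y, k):
    # Accumulate X'X (with the ridge penalty k on the diagonal) and X'y.
    a = sum(r[0] * r[0] for r in X) + k
    b = sum(r[0] * r[1] for r in X)
    d = sum(r[1] * r[1] for r in X) + k
    g0 = sum(r[0] * yi for r, yi in zip(X, y))
    g1 = sum(r[1] * yi for r, yi in zip(X, y))
    det = a * d - b * b
    # Closed-form inverse of the symmetric 2x2 matrix [[a, b], [b, d]].
    return ((d * g0 - b * g1) / det, (a * g1 - b * g0) / det)

# Nearly collinear predictors make the k = 0 (least squares) solution
# unstable; the trace shows the coefficients shrinking as k grows.
X = [(1.0, 1.01), (2.0, 1.98), (3.0, 3.02), (4.0, 3.97)]
y = [2.0, 4.0, 6.0, 8.0]
trace = {k: ridge_beta(X, y, k) for k in (0.0, 0.1, 1.0)}
```

Since y depends exactly on the first predictor here, least squares recovers (2, 0), while increasing k shrinks the coefficient vector, which is the behaviour a ridge trace is used to inspect.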
Abstract:
The present paper studies the probability of ruin of an insurer when excess-of-loss reinsurance with reinstatements is applied. In the setting of the classical Cramér-Lundberg risk model, piecewise deterministic Markov processes are used to describe the free surplus process in this more general situation. It is shown that the finite-time ruin probability is both the solution of a partial integro-differential equation and the fixed point of a contractive integral operator. We exploit the latter representation to develop and implement a recursive algorithm for numerical approximation of the ruin probability that involves high-dimensional integration. Furthermore, we study the behavior of the finite-time ruin probability under various levels of initial surplus and security loadings, and compare the efficiency of the numerical algorithm with the computational alternative of stochastic simulation of the risk process. (C) 2011 Elsevier Inc. All rights reserved.
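The stochastic-simulation alternative mentioned at the end of this abstract can be sketched for the plain Cramér-Lundberg model (no reinsurance): premiums accrue continuously, claims arrive as a Poisson process, and ruin can only occur at claim instants. The parameter values below are illustrative, not taken from the paper:

```python
import random

def finite_time_ruin_prob(u, c, lam, mean_claim, horizon,
                          n_paths=20000, seed=1):
    """Monte Carlo estimate of the finite-time ruin probability in the
    classical Cramer-Lundberg model: claims arrive as a Poisson(lam)
    process, claim sizes are exponential with the given mean, and
    premiums accrue continuously at rate c from initial surplus u."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, claims = 0.0, 0.0
        while True:
            t += rng.expovariate(lam)              # next claim arrival
            if t > horizon:
                break                              # survived the horizon
            claims += rng.expovariate(1.0 / mean_claim)
            if u + c * t - claims < 0.0:           # check only at claims
                ruined += 1
                break
    return ruined / n_paths

# Higher initial surplus should lower the ruin probability; the premium
# rate c = 1.5 > lam * mean_claim = 1.0 gives a positive safety loading.
p_low = finite_time_ruin_prob(u=0.0, c=1.5, lam=1.0, mean_claim=1.0, horizon=10.0)
p_high = finite_time_ruin_prob(u=10.0, c=1.5, lam=1.0, mean_claim=1.0, horizon=10.0)
```

The paper's recursive fixed-point algorithm targets the same quantity; the simulation above is the baseline it is compared against, which converges only at the usual Monte Carlo rate.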
Abstract:
The topic of this thesis is marginal/minority popular music and the question of identity; the term "marginal/minority" specifically refers to members of racial and cultural minorities who are socially and politically marginalized. The thesis argument is that popular music produced by members of cultural and racial minorities establishes cultural identity and resists racist discourse. Three marginal/minority popular music artists and their songs have been chosen for analysis in support of the argument: Gil Scott-Heron's "Gun," Tracy Chapman's "Fast Car" and Robbie Robertson's "Sacrifice." The thesis will draw from two fields of study: popular music and postcolonialism. Within the area of popular music, Theodor Adorno's "Standardization" theory is the focus. Within the area of postcolonialism, this thesis concentrates on two specific topics: 1) Stuart Hall's and Homi Bhabha's overlapping perspectives that identity is a process of cultural signification, and 2) Homi Bhabha's concept of the "Third Space." For Bhabha (1995a), the Third Space defines cultures in the moment of their use, at the moment of their exchange. The idea of identities arising out of cultural struggle suggests that identity is a process as opposed to a fixed center, an enclosed totality. Cultures arise from historical memory, and memory has no center. Historical memory is de-centered, and thus cultures are also de-centered; they are not enclosed totalities. This is what Bhabha means by the "hybridity" of culture: cultures are not unitary totalities; they are ways of knowing and speaking about a reality that is in constant flux. In this regard, the language of "Otherness" depends on suppressing or marginalizing the productive capacity of culture in the act of enunciation.
The Third Space represents a strategy of enunciation that disrupts, interrupts and dislocates the dominant discursive construction of US and THEM (a construction explained by Hall's concept of binary oppositions, detailed in Chapter 2). Bhabha uses the term "enunciation" as a linguistic metaphor for how cultural differences are articulated through discourse and thus how differences are discursively produced. Like Hall, Bhabha views culture as a process of understanding and of signification, because Bhabha sees traditional cultures' struggle against colonizing cultures as transforming them. Adorno's theory of Standardization will be understood as a theoretical position of Western authority. The thesis will argue that Adorno's theory rests on the assumption that there is an "essence" to music, an essence that Adorno rationalizes as structure/form. The thesis will demonstrate that constructing music as possessing an essence is connected to ideology and power, and in this regard Adorno's Standardization theory is a discourse of White Western power. It will be argued that "essentialism" is at the root of the Western "rationalization" of music, and that the definition of what constitutes music is an extension of Western racist "discourses" of the Other. The methodological framework of the thesis entails a) applying semiotics to each of the three songs examined and b) applying Bhabha's model of the Third Space to each of the songs. In this thesis, semiotics specifically refers to Stuart Hall's retheorized semiotics, which recognizes the dual function of semiotics in the analysis of marginal racial/cultural identities: it simultaneously represents embedded racial/cultural stereotypes and the marginal racial/cultural first person voice that disavows and thus reinscribes stereotyped identities. (Here, and throughout this thesis, "first person voice" is used not to denote the voice of the songwriter, but rather the collective voice of a marginal racial/cultural group.)
This dual function fits with Hall's and Bhabha's idea that cultural identity emerges out of cultural antagonism, cultural struggle. Bhabha's Third Space is also applied to each of the songs to show that cultural "struggle" between colonizers and colonized produces cultural hybridities, musically expressed as fusions of styles/sounds. The purpose of combining semiotics and postcolonialism in the three songs to be analyzed is to show that marginal popular music, produced by members of cultural and racial minorities, establishes cultural identity and resists racist discourse by overwriting identities of racial/cultural stereotypes with identities shaped by the first person voice enunciated in the Third Space, to produce identities of cultural hybridities. Semiotic codes of embedded "Black" and "Indian" stereotypes in each song's musical and lyrical text will be read and shown to be overwritten by the semiotic codes of the first person voice, which are decoded with the aid of postcolonial concepts such as "ambivalence," "hybridity" and "enunciation."
Abstract:
The aim of this MA thesis is to demonstrate how corporate concentration within the global music industry specifically affects the Canadian music industry's ability to compete for its own national audience as well as audiences worldwide. Federal public policies, regulatory regimes and subsidies are considered within the context of the structure of the global marketplace which is, in effect, an oligopoly controlled by four major corporations. Through an extensive literature review of political economy theory, Canadian public policies and music studies, as well as personal interviews conducted with Canadian musicians, entrepreneurs and public servants, I will situate my research within the body of political economy theory; present a detailed report of the structure of the global music industry; address the key players within the industry; describe the relationship between the major corporations and the independent companies operating in the industry; discuss how new technologies affect said relationships; consider the effectiveness of Canadian public policies in safeguarding the national music industry; and recommend steps that can be taken to remedy the shortcomings of Federal policies and regulatory regimes.
Abstract:
The use of theory to understand and facilitate catalytic enantioselective organic transformations involving copper and hydrobenzoin derivatives is reported. Section A details the use of theory to predict, facilitate, and understand a copper-promoted amino-oxygenation reaction reported by Chemler et al. Using Density Functional Theory (DFT), employing the hybrid B3LYP functional and a LanL2DZ/6-31G(d) basis set, the mechanistic details were studied on an N-tosyl-o-allylaniline and an α-methyl-γ-alkenyl sulfonamide substrate. The results suggest the N-C bond formation proceeds via a cis-aminocupration, and not through a radical-type mechanism. Additionally, the diastereoselection observed with the α-methyl-γ-alkenyl sulfonamide arises from avoidance of unfavourable steric interactions between the methyl substituent and the N-protecting group. Section B details the computationally guided experimental investigation of two hydrobenzoin derivatives as ligands/catalysts, as well as the attempted synthesis of a third hydrobenzoin derivative. The bis-boronic acid derived from hydrobenzoin was successful as a Lewis acid catalyst in the Biginelli reaction and the Conia-ene reaction, but provided only racemic products. The chiral diol derived from hydrobenzoin successfully increased the rate of the addition of diethylzinc to benzaldehyde in the presence of titanium tetraisopropoxide; however, poor enantioinduction was observed. Notably, the observed reactivity was successfully predicted by theoretical calculations.
Abstract:
Understanding how stem and progenitor cells choose between alternative cell fates is a major challenge in developmental biology. Efforts to tackle this problem have been hampered by the scarcity of markers that can be used to predict cell division outcomes. Here we present a computational method, based on algorithmic information theory, to analyze dynamic features of living cells over time. Using this method, we asked whether rat retinal progenitor cells (RPCs) display characteristic phenotypes before undergoing mitosis that could foretell their fate. We predicted whether RPCs will undergo a self-renewing or terminal division with 99% accuracy, or whether they will produce two photoreceptors or another combination of offspring with 87% accuracy. Our implementation can segment, track and generate predictions for 40 cells simultaneously on a standard computer at 5 min per frame. This method could be used to isolate cell populations with specific developmental potential, enabling previously impossible investigations.
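Algorithmic information (Kolmogorov complexity) is uncomputable, so practical methods in this vein typically substitute a real compressor. The sketch below is not the paper's actual cell-tracking method; it only illustrates the compression-based similarity idea on toy "dynamic feature" traces, using the normalized compression distance:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance, a practical stand-in for the
    (uncomputable) algorithmic-information distance between two signals:
    similar signals compress well together, dissimilar ones do not."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy feature traces: a highly repetitive signal vs. an irregular one.
regular = bytes(50 * [10, 20, 30, 40])
irregular = bytes((i * 37 + (i * i) % 11) % 256 for i in range(200))

d_same = ncd(regular, regular)    # near 0: redundant with itself
d_diff = ncd(regular, irregular)  # larger: little shared structure
```

In a classifier, such pairwise distances between feature traces could feed a nearest-neighbour rule; the paper's implementation additionally handles segmentation and tracking of the live-cell imagery.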
Abstract:
In image synthesis, reproducing the complex effects of light on translucent materials such as wax, marble, or skin contributes greatly to the realism of an image. Unfortunately, this additional realism is computationally expensive. Models based on diffusion theory aim to reduce this cost by simulating the physical behaviour of subsurface light transport while imposing smoothness constraints on the incident and outgoing light. An important component of these models is their use in hierarchically evaluating the numerical integral of the illumination over the surface of an object. This thesis first reviews the current literature on realistic rendering of translucency, before investigating in greater depth the application and extension of diffusion models in image synthesis. We propose and evaluate a new hierarchical numerical integration technique that uses a novel frequency analysis of the outgoing and incident light to efficiently adapt the sampling rate during integration. We apply this theory to several state-of-the-art diffusion models, offering a potential improvement to their efficiency and accuracy.
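The core idea of hierarchical integration with an adaptive sampling rate can be shown in one dimension. This sketch uses a standard recursive adaptive Simpson rule driven by a local error estimate, not the thesis's frequency analysis of incident and outgoing light; names and tolerances are illustrative:

```python
import math

def adaptive_integrate(f, a, b, tol=1e-6):
    """Recursive adaptive quadrature: subdivide only where the local
    error estimate is large, concentrating samples where the integrand
    varies quickly -- the hierarchical refinement idea in miniature."""
    def simpson(lo, hi):
        mid = 0.5 * (lo + hi)
        return (hi - lo) / 6.0 * (f(lo) + 4.0 * f(mid) + f(hi))

    def recurse(lo, hi, whole, eps):
        mid = 0.5 * (lo + hi)
        left, right = simpson(lo, mid), simpson(mid, hi)
        if abs(left + right - whole) < 15.0 * eps:   # local error small enough
            return left + right
        return recurse(lo, mid, left, eps / 2.0) + recurse(mid, hi, right, eps / 2.0)

    return recurse(a, b, simpson(a, b), tol)

val = adaptive_integrate(math.exp, 0.0, 1.0, tol=1e-8)   # ≈ e - 1
```

In the rendering setting the integrand is the illumination over a surface and the refinement criterion comes from frequency bounds rather than a Simpson error estimate, but the subdivide-where-needed structure is the same.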
Abstract:
The synthesis of so-called photorealistic images requires numerically evaluating how light and matter physically interact, which, despite the impressive and ever-increasing computing power we enjoy today, is still far from a trivial task for our computers. This is largely due to the way we represent objects: to reproduce the subtle interactions that lead to the perception of detail, phenomenal amounts of geometry must be modelled. At render time, this complexity inexorably leads to heavy input/output requests which, coupled with the evaluation of complex filtering operators, make the computation times required to produce flawless images entirely unreasonable. To overcome these limitations under current constraints, a multiscale representation of matter must be derived. In this thesis, we build such a representation for matter whose interface is a perturbed surface, a configuration generally built from elevation maps in computer graphics. We derive our representation within the framework of microfacet theory (originally designed to model the reflectance of rough surfaces), which we first present and then extend in two steps. First, we make the theory applicable across several observation scales by generalizing it to the statistics of non-centred microfacets. Second, we derive an inversion procedure capable of reconstructing microfacet statistics from the reflection responses of an arbitrary material in retroreflective configurations.
We show how this augmented theory can be exploited to derive a general and efficient operator for the approximate resampling of elevation maps that (a) preserves the anisotropy of light transport at any resolution, (b) can be applied before rendering and stored in MIP maps to drastically reduce the number of input/output requests, and (c) considerably simplifies per-pixel filtering operations, all of which leads to shorter rendering times. To validate and demonstrate the effectiveness of our operator, we synthesize antialiased photorealistic images and compare them with reference images. In addition, we provide a complete C++ implementation throughout the dissertation to facilitate the reproduction of the results obtained. We conclude with a discussion of the limitations of our approach, as well as the obstacles that remain before an even more general multiscale representation of matter can be derived.
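For contrast with the operator described above, here is the naive baseline it improves on: building a MIP chain of a square height map by plain 2x2 averaging. Naive averaging preserves the mean elevation but discards the slope statistics, which is precisely what the dissertation's microfacet-statistics operator is designed to retain. Function and variable names are illustrative:

```python
def build_mip_chain(height, size):
    """Build a MIP chain of a size x size height map by 2x2 averaging.
    Each level halves the resolution; the mean elevation is preserved
    but the distribution of slopes (hence anisotropy) is flattened."""
    levels = [height]
    while size > 1:
        size //= 2
        prev = levels[-1]
        nxt = [[(prev[2 * i][2 * j] + prev[2 * i][2 * j + 1] +
                 prev[2 * i + 1][2 * j] + prev[2 * i + 1][2 * j + 1]) / 4.0
                for j in range(size)] for i in range(size)]
        levels.append(nxt)
    return levels

# A 4x4 checkerboard of elevations: strong variation at the fine level.
heights = [[float((i + j) % 2) for j in range(4)] for i in range(4)]
chain = build_mip_chain(heights, 4)
```

After one averaging step the checkerboard becomes a constant 0.5 everywhere: the mean survives, the surface roughness is gone, illustrating why a statistics-aware prefiltering operator is needed.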
Abstract:
A new approach, the multipole theory (MT) method, is presented for the computation of cutoff wavenumbers of waveguides partially filled with dielectric. The MT formulation of the eigenvalue problem of an inhomogeneous waveguide is derived. Representative computational examples, including dielectric-rod-loaded rectangular and double-ridged waveguides, are given to validate the theory and to demonstrate its efficiency.
Abstract:
The FT-IR spectrum of quinoline-2-carbaldehyde benzoyl hydrazone (HQb·H2O) was recorded and analyzed. The synthesis and crystal structure data are also described. The vibrational wavenumbers were examined theoretically with the Gaussian03 package of programs at the HF/6-31G(d) and B3LYP/6-31G(d) levels of theory. The data obtained from the vibrational wavenumber calculations are used to assign the vibrational bands obtained in the infrared spectrum of the studied molecule. The first hyperpolarizability, infrared intensities and Raman activities are reported. The calculated first hyperpolarizability is comparable with the reported values of similar derivatives, making the compound an attractive object for future studies of non-linear optics. The geometrical parameters of the title compound obtained from XRD studies are in agreement with the calculated values. The changes in the C–N bond lengths suggest an extended π-electron delocalization over the quinoline and hydrazone moieties, which is responsible for the non-linearity of the molecule.
Abstract:
Using the independent particle model as our basis, we present a scheme to reduce the complexity and computational effort of calculating inclusive probabilities in many-electron collision systems. As an example we present an application to K–K charge transfer in collisions of 2.6 MeV Ne⁹⁺ on Ne. We are able to give impact-parameter-dependent probabilities for many-particle states which could lead to KLL Auger electrons after the collision, and we compare with experimental values.
Abstract:
We show that optimizing a quantum gate for an open quantum system requires the time evolution of only three states, irrespective of the dimension of the Hilbert space. This represents a significant reduction in computational resources compared to the complete basis of Liouville space that is commonly believed necessary for this task. The reduction is based on two observations: the target is not a general dynamical map but a unitary operation, and the time evolution of two properly chosen states is sufficient to distinguish any two unitaries. We illustrate gate optimization employing a reduced set of states for a controlled phase gate with trapped atoms as qubit carriers and an iSWAP gate with superconducting qubits.
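The linear-algebra fact behind the second observation can be illustrated on a single qubit: two states that span the space determine a linear operator, whereas one state does not. This toy sketch ignores global phase and the open-system machinery of the paper; all names are illustrative:

```python
def apply(u, psi):
    """Apply a 2x2 matrix u to a two-component state vector psi."""
    return [u[0][0] * psi[0] + u[0][1] * psi[1],
            u[1][0] * psi[0] + u[1][1] * psi[1]]

def agree_on(u, v, states, tol=1e-12):
    """Do u and v act identically on every state in `states`?"""
    return all(abs(a - b) <= tol
               for s in states
               for a, b in zip(apply(u, s), apply(v, s)))

# |0> and |+> together span the qubit space, so agreement on both pins
# down the operator by linearity; agreement on |0> alone does not.
ket0 = [1.0 + 0.0j, 0.0 + 0.0j]
ketplus = [2 ** -0.5 + 0.0j, 2 ** -0.5 + 0.0j]
identity = [[1.0, 0.0], [0.0, 1.0]]
pauli_z = [[1.0, 0.0], [0.0, -1.0]]
```

Here the identity and the Pauli-Z gate agree on |0> but are told apart by |+>, mirroring why a properly chosen pair of states suffices to distinguish unitaries during optimization.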