958 results for Fundamental Decoherence
Abstract:
Big Brother Watch and others have filed a complaint against the United Kingdom under the European Convention on Human Rights, alleging a violation of Article 8, the right to privacy. The complaint concerns the NSA affair and UK-based surveillance activities operated by the secret services. The question is whether it will be declared admissible and, if so, whether the European Court of Human Rights will find a violation. This article discusses three possible challenges for these types of complaints and analyses whether the current privacy paradigm is still adequate in view of the development known as Big Data.
Abstract:
The in-medium physics of heavy quarkonium is an ideal proving ground for our ability to connect knowledge about the fundamental laws of physics to phenomenological predictions. One possible route to take is to attempt a description of heavy quark bound states at finite temperature through a Schrödinger equation with an instantaneous potential. Here we review recent progress in devising a comprehensive approach to define such a potential from first principles QCD and extract its, in general complex, values from non-perturbative lattice QCD simulations. Based on the theory of open quantum systems we will show how to interpret the role of the imaginary part in terms of spatial decoherence by introducing the concept of a stochastic potential. Shortcomings as well as possible paths for improvement are discussed.
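The Schrödinger-equation picture described in this abstract can be written schematically as follows. This is a generic form used in the open-quantum-systems literature on quarkonium, not an equation quoted from the paper itself:

```latex
% Schrodinger equation for the quarkonium wavefunction \psi(t,r)
% with an instantaneous, in general complex, potential V(r):
i\,\partial_t \psi(t,r) \;=\; \left[\,2 m_Q - \frac{\nabla^2}{m_Q} + V(r)\right]\psi(t,r),
\qquad V(r) = \mathrm{Re}\,V(r) + i\,\mathrm{Im}\,V(r).
```

Here $m_Q$ denotes the heavy-quark mass. A negative $\mathrm{Im}\,V(r)$ damps the off-diagonal elements of the density matrix, which is the spatial decoherence the abstract interprets via a stochastic potential.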
Abstract:
Spike timing dependent plasticity (STDP) is a phenomenon in which the precise timing of spikes affects the sign and magnitude of changes in synaptic strength. STDP is often interpreted as the comprehensive learning rule for a synapse: the "first law" of synaptic plasticity. This interpretation is made explicit in theoretical models in which the total plasticity produced by complex spike patterns results from a superposition of the effects of all spike pairs. Although such models are appealing for their simplicity, they can fail dramatically. For example, the measured single-spike learning rule between hippocampal CA3 and CA1 pyramidal neurons does not predict the existence of long-term potentiation, one of the best-known forms of synaptic plasticity. Layers of complexity have been added to the basic STDP model to repair predictive failures, but they have been outstripped by experimental data. We propose an alternate first law: neural activity triggers changes in key biochemical intermediates, which act as a more direct trigger of plasticity mechanisms. One particularly successful model uses intracellular calcium as the intermediate and can account for many observed properties of bidirectional plasticity. In this formulation, STDP is not itself the basis for explaining other forms of plasticity, but is instead a consequence of changes in the biochemical intermediate, calcium. Eventually a mechanism-based framework for learning rules should include other messengers, discrete change at individual synapses, spread of plasticity among neighboring synapses, and priming of hidden processes that change a synapse's susceptibility to future change. Mechanism-based models provide a rich framework for the computational representation of synaptic plasticity.
Abstract:
Quarks were introduced 50 years ago, opening the road towards our understanding of the elementary constituents of matter and their fundamental interactions. Since then, spectacular progress has been made, with important discoveries that led to the establishment of the Standard Theory, which accurately describes the basic constituents of observable matter, namely quarks and leptons, interacting through the exchange of three fundamental forces: the weak, electromagnetic and strong force. Particle physics is now entering a new era driven by the quest to understand the composition of our Universe, including the unobservable (dark) matter, the hierarchy of masses and forces, the unification of all fundamental interactions with gravity in a consistent quantum framework, and several other important questions. A candidate theory providing answers to many of these questions is string theory, which replaces the notion of point particles by extended objects, such as closed and open strings. In this short note, I give a brief overview of string unification, describe in particular how quarks and leptons can emerge, and discuss possible predictions for particle physics and cosmology that could test these ideas.
Abstract:
by Clementina de Rothschild. Translated from the German. [Electronic resource]
Abstract:
INTRODUCTION This paper focuses exclusively on experimental models with ultra-high dilutions (i.e. beyond 10^(-23)) that have been submitted to replication scrutiny. It updates previous surveys, considers suggestions made by the research community and compares the state of replication in 1994 with that in 2015. METHODS Following a literature search, biochemical, immunological, botanical, cell-biological and zoological studies on ultra-high dilutions (potencies) were included. Reports were grouped into initial studies, laboratory-internal, multicentre and external replications. Repetition could yield comparable, zero, or opposite results. The null hypothesis was that test and control groups would not be distinguishable (zero effect). RESULTS A total of 126 studies were found, of which 28 were initial studies. When all 98 replicative studies were considered, 70.4% (i.e. 69) reported a result comparable to that of the initial study, 20.4% (20) a zero effect and 9.2% (9) an opposite result. Both for the studies until 1994 and the studies from 1995 to 2015, the null hypothesis (dominance of zero results) should be rejected. Furthermore, the odds of finding a comparable result are generally higher than those of finding an opposite result. Although this is true for all three types of replication studies, the fraction of comparable studies diminishes from laboratory-internal (82.9% in total) to multicentre (75%) to external (48.3%), while the fraction of opposite results rises from 4.9% to 10.7% to 13.8%. Furthermore, it became obvious that the probability of an external replication producing comparable results is higher for models that had already been further scrutinised by the initial researchers. CONCLUSIONS We found 28 experimental models which underwent replication. In total, 24 models were replicated with comparable results, 12 models with zero effect, and 6 models with opposite results. Five models were externally reproduced with comparable results. We encourage further replications of studies in order to learn more about the model systems used.
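The percentages in the RESULTS section can be recomputed directly from the reported counts (69 + 20 + 9 = 98 replicative studies). A minimal check, using only the numbers stated in the abstract:

```python
# Recompute the replication fractions reported in the abstract
# from the raw counts it gives (rounded to one decimal place).
total_replications = 98
comparable, zero, opposite = 69, 20, 9

# The three outcome categories must account for every replicative study.
assert comparable + zero + opposite == total_replications

def pct(n, total=total_replications):
    """Percentage of `total`, rounded to one decimal place."""
    return round(100 * n / total, 1)

print(pct(comparable))  # 70.4
print(pct(zero))        # 20.4
print(pct(opposite))    # 9.2
```

The recomputed values match the 70.4% / 20.4% / 9.2% figures quoted in the text.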
Abstract:
Alveolar echinococcosis, caused by the tapeworm Echinococcus multilocularis, is one of the most severe parasitic diseases in humans and represents one of the 17 neglected diseases prioritised by the World Health Organisation (WHO) in 2012. Considering the major medical and veterinary importance of this parasite, the phylogeny of the genus Echinococcus is of considerable importance; yet, despite numerous efforts with both mitochondrial and nuclear data, it has remained unresolved. The genus is clearly complex, and this is one of the reasons for the incomplete understanding of its taxonomy. Although taxonomic studies have recognised E. multilocularis as a separate entity from the Echinococcus granulosus complex and other members of the genus, it would be premature to draw firm conclusions about the taxonomy of the genus before the phylogeny of the whole genus is fully resolved. The recent sequencing of E. multilocularis and E. granulosus genomes opens new possibilities for performing in-depth phylogenetic analyses. In addition, whole genome data provide the possibility of inferring phylogenies based on a large number of functional genes, i.e. genes that trace the evolutionary history of adaptation in E. multilocularis and other members of the genus. Moreover, genomic data open new avenues for studying the molecular epidemiology of E. multilocularis: genotyping studies with larger panels of genetic markers allow the genetic diversity and spatial dynamics of parasites to be evaluated with greater precision. There is an urgent need for international coordination of genotyping of E. multilocularis isolates from animals and human patients. This could be fundamental for a better understanding of the transmission of alveolar echinococcosis and for designing efficient healthcare strategies.
Abstract:
No abstract available.
Abstract:
Affiliation: Mercadante, Facundo. Universidad Nacional de Cuyo. Facultad de Ciencias Políticas y Sociales