866 results for "active and passive quantum error correction"


Relevance: 100.00%

Abstract:

The present thesis is divided into two main research areas: Classical Cosmology and (Loop) Quantum Gravity. The first part concerns cosmological models with one phantom and one scalar field, which provide the 'super-accelerated' scenario not excluded by observations, thus exploring alternatives to the standard LambdaCDM scenario. The second part concerns the spinfoam approach to (Loop) Quantum Gravity, which is an attempt to provide a 'sum-over-histories' formulation of gravitational quantum transition amplitudes. The research presented here focuses on the face amplitude of a generic spinfoam model for Quantum Gravity.

Relevance: 100.00%

Abstract:

This thesis focuses on techniques for reliable transmission of multimedia content in streaming and broadcasting applications, targeting video content in particular. The design of efficient error-control mechanisms to enhance the reliability of video transmission systems has been addressed, considering cross-layer and multi-layer/multi-dimensional channel coding techniques to cope with bit errors as well as packet erasures. Mechanisms for unequal time interleaving have been designed as a viable solution to reduce the impact of errors and erasures by acting on the time diversity of the data flow, thus enhancing robustness against correlated channel impairments. In order to account for the factors affecting the physical-layer channel when evaluating the performance of FEC schemes, an ad-hoc error-event model has been devised. In addition, the impact of error correction/protection techniques on the quality perceived by consumers of video services, as well as techniques for objective/subjective quality evaluation, have been studied. The applicability and value of the proposed techniques have been tested against the practical constraints and requirements of real system implementations.
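
As a toy illustration of the time-diversity idea behind the interleaving mechanisms described above (a generic block interleaver, not the unequal time-interleaving scheme developed in the thesis), the sketch below spreads a burst of consecutive channel errors across several codewords:

```python
# Toy block interleaver: write symbols row-by-row into a rows x cols
# matrix and read them out column-by-column. A burst of consecutive
# channel errors of length <= rows then hits at most one symbol per row
# (i.e., per codeword), which a modest FEC code can correct.

def interleave(symbols, rows, cols):
    assert len(symbols) == rows * cols
    matrix = [symbols[r * cols:(r + 1) * cols] for r in range(rows)]
    return [matrix[r][c] for c in range(cols) for r in range(rows)]

def deinterleave(symbols, rows, cols):
    assert len(symbols) == rows * cols
    columns = [symbols[c * rows:(c + 1) * rows] for c in range(cols)]
    return [columns[c][r] for r in range(rows) for c in range(cols)]

if __name__ == "__main__":
    data = list(range(12))               # 3 codewords of 4 symbols each
    tx = interleave(data, rows=3, cols=4)
    tx[4:7] = ["X"] * 3                  # burst of 3 consecutive errors
    rx = deinterleave(tx, rows=3, cols=4)
    print(rx)  # each group of 4 symbols contains at most one corrupted symbol
```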

Relevance: 100.00%

Abstract:

The space environment has always been one of the most challenging for communications, at both the physical and the network layer. Concerning the latter, the most common challenges are the lack of continuous network connectivity, very long delays, and relatively frequent losses. Because of these problems, the standard TCP/IP suite protocols are hardly applicable. Moreover, reliability is fundamental in space scenarios: it is usually not tolerable to lose important information, or to receive it with a very large delay, because of a challenging transmission channel. In terrestrial protocols such as TCP, reliability is obtained by means of an ARQ (Automatic Retransmission reQuest) mechanism, which, however, performs poorly when the transmission channel has long delays. At the physical layer, Forward Error Correction (FEC) codes, based on the insertion of redundant information, are an alternative way to ensure reliability. On binary channels, where single bits are flipped by channel noise, redundancy bits can be exploited to recover the original information. On binary erasure channels, where bits are not flipped but lost, redundancy can still be used to recover the original information. FEC codes designed for this purpose are usually called Erasure Codes (ECs). It is worth noting that ECs, primarily studied for binary channels, can also be used at upper layers, i.e. applied to packets instead of bits, offering a very interesting alternative to the usual ARQ methods, especially in the presence of long delays. The Licklider Transmission Protocol (LTP) is a protocol created to add reliability to Delay-Tolerant Networking (DTN) and to obtain better performance on long-delay links. The aim of this thesis is the application of ECs to LTP.
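
As a minimal sketch of packet-level erasure coding (a toy single-parity code, not the specific ECs applied to LTP in the thesis), one redundancy packet lets the receiver rebuild any single packet lost on an erasure channel:

```python
# Toy packet-level erasure code: one XOR parity packet per block of k
# data packets. On an erasure channel the receiver knows *which* packet
# is missing, so a single erasure per block can be rebuilt by XOR-ing
# the surviving packets. Practical schemes (e.g., Reed-Solomon or
# LDPC-based erasure codes) extend this idea to several repair packets.

from functools import reduce

def xor_packets(packets):
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), packets)

def encode(data_packets):
    """Return the data packets plus one parity packet (all equal length)."""
    return list(data_packets) + [xor_packets(data_packets)]

def decode(received):
    """received: list where at most one entry is None (erased)."""
    if None not in received:
        return received[:-1]                       # nothing lost
    i = received.index(None)
    block = list(received)
    block[i] = xor_packets([p for p in received if p is not None])
    return block[:-1]

if __name__ == "__main__":
    data = [b"AAAA", b"BBBB", b"CCCC"]
    tx = encode(data)
    tx[1] = None                                   # one packet lost in transit
    print(decode(tx))                              # [b'AAAA', b'BBBB', b'CCCC']
```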

Relevance: 100.00%

Abstract:

Energy-dependent intestinal calcium absorption is important for the maintenance of calcium and bone homeostasis, especially when dietary calcium supply is restricted. The active form of vitamin D, 1,25-dihydroxyvitamin D3 [1,25(OH)2D3], is a crucial regulator of this process and increases the expression of the transient receptor potential vanilloid 6 (Trpv6) calcium channel, which mediates calcium transfer across the intestinal apical membrane. Genetic inactivation of Trpv6 in mice (Trpv6-/-) showed, however, that TRPV6 is redundant for intestinal calcium absorption when dietary calcium content is normal/high, and that passive diffusion likely contributes to maintaining normal serum calcium levels. On the other hand, Trpv6 inactivation impaired the increase in intestinal calcium transport following calcium restriction, yet without resulting in hypocalcemia. A possible explanation is that normocalcemia is maintained at the expense of bone homeostasis, the hypothesis investigated here. We thoroughly analyzed the bone phenotype of Trpv6-/- mice receiving a normal (approximately 1%) or low (approximately 0.02%) calcium diet from weaning onwards, using micro-computed tomography, histomorphometry and serum parameters. When dietary supply of calcium was normal, Trpv6 inactivation did not affect growth plate morphology, bone mass or remodeling parameters in young adult or aging mice. Restricting dietary calcium had no effect on serum calcium levels and resulted in a comparable reduction in bone mass accrual in Trpv6+/+ and Trpv6-/- mice (by 35% and 45%, respectively). This decrease in bone mass was associated with a similar increase in bone resorption, whereas serum osteocalcin levels and the amount of unmineralized bone matrix were significantly increased only in Trpv6-/- mice. Taken together, our findings indicate that TRPV6 contributes to intestinal calcium transport when dietary calcium supply is limited, and in this condition indirectly regulates bone formation and/or mineralization.

Relevance: 100.00%

Abstract:

Passive states of quantum systems are states from which no system energy can be extracted by any cyclic (unitary) process. Gibbs states of all temperatures are passive. Strong local (SL) passive states are defined to allow any general quantum operation, but the operation is required to be local, being applied only to a specific subsystem. Any mixture of eigenstates in a system-dependent neighborhood of a nondegenerate entangled ground state is found to be SL passive. In particular, Gibbs states are SL passive with respect to a subsystem only at or below a critical system-dependent temperature. SL passivity is associated in many-body systems with the presence of ground state entanglement in a way suggestive of collective quantum phenomena such as quantum phase transitions, superconductivity, and the quantum Hall effect. The presence of SL passivity is detailed for some simple spin systems where it is found that SL passivity is neither confined to systems of only a few particles nor limited to the near vicinity of the ground state.
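
For reference, the ordinary (global) notion of passivity used above has a compact textbook characterization; the following states the standard definition rather than anything specific to the SL construction:

```latex
\[
  \operatorname{Tr}\!\bigl(U \rho\, U^{\dagger} H\bigr) \;\ge\; \operatorname{Tr}(\rho H)
  \qquad \text{for all unitaries } U,
\]
which holds if and only if
\[
  \rho = \sum_n p_n \,\lvert E_n\rangle\langle E_n\rvert ,
  \qquad E_m > E_n \;\Rightarrow\; p_m \le p_n .
\]
```

Gibbs states rho_beta = exp(-beta H)/Tr exp(-beta H) satisfy this for every inverse temperature beta >= 0 and are therefore passive, as stated above.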

Relevance: 100.00%

Abstract:

We used differential GPS measurements from a 13-station GPS network spanning Santa Ana Volcano and Coatepeque Caldera to characterize the inter-eruptive activity and tectonic movements near these two active and potentially hazardous features. Caldera-forming events occurred between 70 and 40 ka, and eruptive activity at the Santa Ana/Izalco volcanoes occurred as recently as 2005. Twelve differential stations were surveyed for 1 to 2 hours on a monthly basis from February through September 2009 and tied to a centrally located continuous GPS station, which serves as the reference site for this volcanic network. Repeatabilities of the averages from 20-minute sessions taken over 20 hours or longer range from 2 to 11 mm in the horizontal (north and east) components of the inter-station baselines, suggesting a lower detection limit for the horizontal components of any short-term tectonic or volcanic deformation. Repeatabilities of the vertical baseline component range from 12 to 34 mm. Analysis of the precipitable water vapor in the troposphere suggests that tropospheric decorrelation, as a function of baseline length and variable site elevations, is the most likely source of vertical error. Differential motions of the 12 sites relative to the continuous reference site reveal inflation from February through July at several sites surrounding the caldera, with vertical displacements ranging from 61 mm to 139 mm, followed by a lower-magnitude deflation event on 1.8-7.4 km-long baselines. Uplift rates for the inflationary period reach 300 mm/yr, with 1-sigma uncertainties of ±26-119 mm/yr. Only one other station outside the caldera exhibits a similar deformation trend, suggesting a localized source. The results suggest that differential GPS measurements from short-duration occupations over short baselines can be a useful monitoring tool at sub-tropical volcanoes and calderas.
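
As a rough arithmetic cross-check (assuming the February-July inflation episode spans roughly five to six months, an assumption not stated explicitly above), the quoted peak uplift rate is consistent with the largest observed vertical displacement:

```latex
\[
  \dot{u} \;\approx\; \frac{139~\mathrm{mm}}{(5\ \text{to}\ 6)/12~\mathrm{yr}}
  \;\approx\; 280\ \text{to}\ 330~\mathrm{mm/yr},
\]
```

which is compatible with the reported peak rate of about 300 mm/yr.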

Relevance: 100.00%

Abstract:

This dissertation concerns the intersection of three areas of discrete mathematics: finite geometries, design theory, and coding theory. The central theme is the power of finite geometry designs, which are constructed from the points and t-dimensional subspaces of a projective or affine geometry. We use these designs to construct and analyze combinatorial objects which inherit their best properties from these geometric structures. A central question in the study of finite geometry designs is Hamada’s conjecture, which proposes that finite geometry designs are the unique designs with minimum p-rank among all designs with the same parameters. In this dissertation, we will examine several questions related to Hamada’s conjecture, including the existence of counterexamples. We will also study the applicability of certain decoding methods to known counterexamples. We begin by constructing an infinite family of counterexamples to Hamada’s conjecture. These designs are the first infinite class of counterexamples for the affine case of Hamada’s conjecture. We further demonstrate how these designs, along with the projective polarity designs of Jungnickel and Tonchev, admit majority-logic decoding schemes. The codes obtained from these polarity designs attain error-correcting performance which is, in certain cases, equal to that of the finite geometry designs from which they are derived. This further demonstrates the highly geometric structure maintained by these designs. Finite geometries also help us construct several types of quantum error-correcting codes. We use relatives of finite geometry designs to construct infinite families of q-ary quantum stabilizer codes. We also construct entanglement-assisted quantum error-correcting codes (EAQECCs) which admit a particularly efficient and effective error-correcting scheme, while also providing the first general method for constructing these quantum codes with known parameters and desirable properties. Finite geometry designs are used to give exceptional examples of these codes.
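
To illustrate the voting principle behind one-step majority-logic decoding (a toy example on the length-3 repetition code, not the finite geometry or polarity designs discussed above), each bit is judged by a set of parity checks that are orthogonal on it, i.e., every other bit enters at most one of the checks:

```python
# One-step majority-logic decoding sketch. For each bit position we are
# given J parity-check sums that are orthogonal on that position (every
# other position appears in at most one of the J checks). If more than
# J/2 of the checks fail, the bit is declared erroneous and flipped;
# such a decoder corrects up to floor(J/2) errors.

def majority_logic_decode(word, orthogonal_checks):
    """word: list of 0/1 bits; orthogonal_checks[i]: index sets orthogonal on bit i."""
    decoded = list(word)
    for i, checks in enumerate(orthogonal_checks):
        failed = sum(sum(word[j] for j in check) % 2 for check in checks)
        if failed > len(checks) / 2:
            decoded[i] ^= 1          # majority of checks vote "error" on bit i
    return decoded

if __name__ == "__main__":
    # [3,1] repetition code: for bit i, the two checks x_i + x_j (j != i)
    # are orthogonal on bit i, so J = 2 and one error is correctable.
    checks = [[[0, 1], [0, 2]], [[1, 0], [1, 2]], [[2, 0], [2, 1]]]
    received = [1, 0, 1]             # codeword 111 with an error on bit 1
    print(majority_logic_decode(received, checks))   # -> [1, 1, 1]
```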

Relevance: 100.00%

Abstract:

This paper is a summary of the main contributions of the PhD thesis published in [1]. The main research contributions of the thesis are driven by the research question of how to design simple, yet efficient and robust, run-time adaptive resource allocation schemes within the communication stack of Wireless Sensor Network (WSN) nodes. The thesis addresses several problem domains, with contributions on different layers of the WSN communication stack. The main contributions can be summarized as follows. First, a novel run-time adaptive MAC protocol is introduced, which stepwise allocates the power-hungry radio interface in an on-demand manner when the encountered traffic load requires it. Second, the thesis outlines a methodology for robust, reliable and accurate software-based energy estimation, calculated at network run-time on the sensor node itself. Third, the thesis evaluates several Forward Error Correction (FEC) strategies to adaptively allocate the correctional power of Error Correcting Codes (ECCs) to cope with temporally and spatially variable bit error rates. Fourth, in the context of TCP-based communications in WSNs, the thesis evaluates distributed caching and local retransmission strategies to overcome the performance-degrading effects of packet corruption and transmission failures when transmitting data over multiple hops. The performance of all developed protocols is evaluated on a self-developed real-world WSN testbed, where they achieve superior performance over selected existing approaches, especially where traffic load and channel conditions are subject to rapid variations over time.
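
As a hypothetical sketch of the general idea behind the third contribution (function names, packet size, target probability and the binomial channel model are all illustrative assumptions, not taken from the thesis), a node could pick the correctional power of its ECC from the current bit-error-rate estimate:

```python
# Hypothetical adaptive-FEC policy: choose the smallest error-correction
# capability t such that a packet of n bits, seen through a channel with
# estimated bit error rate `ber`, is decodable (<= t bit errors) with at
# least the target probability. Stronger codes cost more redundancy and
# energy, so the node only "buys" as much correctional power as needed.

from math import comb

def packet_success_prob(n_bits, ber, t):
    """P(at most t bit errors among n_bits independent bits flipped w.p. ber)."""
    return sum(comb(n_bits, k) * ber**k * (1 - ber)**(n_bits - k)
               for k in range(t + 1))

def choose_correction_capability(n_bits, ber, target=0.99, t_max=16):
    for t in range(t_max + 1):
        if packet_success_prob(n_bits, ber, t) >= target:
            return t
    return t_max   # channel too bad: fall back to the strongest available code

if __name__ == "__main__":
    for ber in (1e-5, 1e-3, 1e-2):
        t = choose_correction_capability(n_bits=512, ber=ber)
        print(f"BER={ber:.0e} -> use a code correcting t={t} bit errors per packet")
```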

Relevance: 100.00%

Abstract:

Cell competition is the short-range elimination of slow-dividing cells through apoptosis when they are confronted with a faster-growing population. It is based on the comparison of relative cell fitness between neighboring cells and is a striking example of tissue adaptability that could play a central role in developmental error correction and cancer progression, in both Drosophila melanogaster and mammals. Cell competition has led to the discovery of multiple pathways that affect cell fitness and drive cell elimination. The diversity of these pathways could reflect unrelated phenomena, yet recent evidence suggests some common wiring and the existence of a bona fide fitness comparison pathway.

Relevance: 100.00%

Abstract:

Abelian and non-Abelian gauge theories are of central importance in many areas of physics. In condensed matter physics, Abelian U(1) lattice gauge theories arise in the description of certain quantum spin liquids. In quantum information theory, Kitaev's toric code is a Z(2) lattice gauge theory. In particle physics, Quantum Chromodynamics (QCD), the non-Abelian SU(3) gauge theory of the strong interactions between quarks and gluons, is nonperturbatively regularized on a lattice. Quantum link models extend the concept of lattice gauge theories beyond the Wilson formulation, and are well suited for both digital and analog quantum simulation using ultracold atomic gases in optical lattices. Since quantum simulators do not suffer from the notorious sign problem, they open the door to studies of the real-time evolution of strongly coupled quantum systems, which are impossible with classical simulation methods. A plethora of interesting lattice gauge theories suggests itself for quantum simulation, which should allow us to address very challenging problems, ranging from confinement and deconfinement, or chiral symmetry breaking and its restoration at finite baryon density, to color superconductivity and the real-time evolution of heavy-ion collisions, first in simpler model gauge theories and ultimately in QCD.
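
For concreteness, the Z(2) lattice-gauge character of Kitaev's toric code mentioned above is visible directly in its standard Hamiltonian:

```latex
\[
  H \;=\; -\sum_{v} A_v \;-\; \sum_{p} B_p,
  \qquad
  A_v = \prod_{i \in \mathrm{star}(v)} \sigma^x_i,
  \qquad
  B_p = \prod_{i \in \partial p} \sigma^z_i ,
\]
```

where the spins live on the links of a square lattice, the vertex operators A_v generate the local Z_2 gauge transformations (Gauss law), the plaquette operators B_p play the role of the magnetic flux, all A_v and B_p commute, and the code space is their common +1 eigenspace.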

Relevance: 100.00%

Abstract:

BACKGROUND Accurate needle placement is crucial for the success of percutaneous radiological needle interventions. We compared three guiding methods using an optical-based navigation system: freehand; a stereotactic aiming device with active depth control; and a stereotactic aiming device with passive depth control. METHODS For each method, 25 punctures were performed on a non-rigid phantom. Five 1 mm metal screws were used as targets. Time requirements were recorded, and target positioning errors (TPE) were measured on control scans as the distance between needle tip and target. RESULTS Time requirements were reduced using the aiming device with passive depth control. The Euclidean TPE was similar for each method (4.6 ± 1.2 to 4.9 ± 1.7 mm). However, the lateral component was significantly lower when an aiming device was used (2.3 ± 1.3 to 2.8 ± 1.6 mm with an aiming device vs 4.2 ± 2.0 mm without). DISCUSSION Using an aiming device may increase the lateral accuracy of navigated needle insertion.

Relevance: 100.00%

Abstract:

In numerous intervention studies and education field trials, random assignment to treatment occurs in clusters rather than at the level of observation. This departure from random assignment of individual units may be due to logistics, political feasibility, or ecological validity. Data within the same cluster or grouping are often correlated. Application of traditional regression techniques, which assume independence between observations, to clustered data produces consistent parameter estimates; however, such estimators are often inefficient compared to methods which incorporate the clustered nature of the data into the estimation procedure (Neuhaus 1993). Multilevel models, also known as random effects or random components models, can be used to account for the clustering of data by estimating higher-level (group) as well as lower-level (individual) variation. Designing a study in which the unit of observation is nested within higher-level groupings requires the determination of sample sizes at each level. This study investigates, for a 3-level repeated measures design, the effect of various sampling strategies on the parameter estimates when the outcome variable of interest follows a Poisson distribution. Results of this study suggest that second-order PQL estimation produces the least biased estimates in the 3-level multilevel Poisson model, followed by first-order PQL and then second- and first-order MQL. The MQL estimates of both fixed and random parameters are generally satisfactory when the level-2 and level-3 variation is less than 0.10. However, as the higher-level error variance increases, the MQL estimates become increasingly biased. If convergence of the estimation algorithm is not obtained by the PQL procedure and the higher-level error variance is large, the estimates may be significantly biased; in this case, bias correction techniques such as bootstrapping should be considered as an alternative procedure. For larger sample sizes, structures with 20 or more units sampled at the levels with normally distributed random errors produced more stable estimates with less sampling variance than structures with an increased number of level-1 units. For small sample sizes, sampling fewer units at the level with Poisson variation produces less sampling variation; however, this criterion is no longer important when sample sizes are large. Reference: Neuhaus J (1993). "Estimation efficiency and tests of covariate effects with clustered binary data." Biometrics, 49, 989-996.
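
For reference, a generic three-level Poisson model of the kind analyzed above can be written as follows (a standard specification; the dissertation's exact covariate and random-effects structure may differ):

```latex
\[
  y_{ijk} \mid \lambda_{ijk} \sim \mathrm{Poisson}(\lambda_{ijk}),
  \qquad
  \log \lambda_{ijk} \;=\; \mathbf{x}_{ijk}^{\top}\boldsymbol{\beta}
      \;+\; u_{jk} \;+\; v_{k},
\]
\[
  u_{jk} \sim \mathcal{N}\!\left(0, \sigma^2_{u}\right), \qquad
  v_{k} \sim \mathcal{N}\!\left(0, \sigma^2_{v}\right),
\]
```

for repeated measurement i on individual j within cluster k; the level-2 and level-3 variances sigma^2_u and sigma^2_v are the quantities whose size (above or below roughly 0.10) drives the relative bias of the MQL and PQL estimators reported above.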

Relevance: 100.00%

Abstract:

A quantum critical point (QCP) is a singularity in the phase diagram arising because of quantum mechanical fluctuations. The exotic properties of some of the most enigmatic physical systems, including unconventional metals and superconductors, quantum magnets and ultracold atomic condensates, have been related to the importance of critical quantum and thermal fluctuations near such a point. However, direct and continuous control of these fluctuations has been difficult to realize, and complete thermodynamic and spectroscopic information is required to disentangle the effects of quantum and classical physics around a QCP. Here we achieve this control in a high-pressure, high-resolution neutron scattering experiment on the quantum dimer material TlCuCl3. By measuring the magnetic excitation spectrum across the entire quantum critical phase diagram, we illustrate the similarities between quantum and thermal melting of magnetic order. We prove the critical nature of the unconventional longitudinal (Higgs) mode of the ordered phase by damping it thermally. We demonstrate the development of two types of criticality, quantum and classical, and use their static and dynamic scaling properties to conclude that quantum and thermal fluctuations can behave largely independently near a QCP.

Relevance: 100.00%

Abstract:

We examine the time-series relationship between housing prices in Los Angeles, Las Vegas, and Phoenix. First, temporal Granger causality tests reveal that Los Angeles housing prices cause housing prices in Las Vegas (directly) and Phoenix (indirectly). In addition, Las Vegas housing prices cause housing prices in Phoenix. Los Angeles housing prices prove exogenous in a temporal sense, and Phoenix housing prices do not cause prices in the other two markets. Second, we calculate out-of-sample forecasts in each market, using various vector autoregressive (VAR) and vector error-correction (VEC) models, as well as Bayesian, spatial, and causality versions of these models with various priors. Different specifications provide superior forecasts in the different cities. Finally, we consider the ability of these time-series models to provide accurate out-of-sample predictions of the turning points in housing prices that occurred in 2006:Q4. Recursive forecasts, where the sample is updated each quarter, provide reasonably good forecasts of turning points.
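
As a sketch of the basic Granger-causality and recursive VAR forecasting workflow described above (the file name, column names, and lag order are placeholders; statsmodels is one common implementation, not necessarily what the authors used):

```python
# Sketch: pairwise Granger-causality test plus recursive one-step-ahead
# VAR forecasts for house-price growth series, using statsmodels.
# Data source, column names, and the lag order are illustrative only.

import pandas as pd
from statsmodels.tsa.api import VAR
from statsmodels.tsa.stattools import grangercausalitytests

prices = pd.read_csv("house_prices.csv", index_col=0, parse_dates=True)
growth = prices[["los_angeles", "las_vegas", "phoenix"]].pct_change().dropna()

# Does lagged LA price growth help predict Las Vegas price growth?
# (tests whether the second column Granger-causes the first)
grangercausalitytests(growth[["las_vegas", "los_angeles"]], maxlag=4)

# Recursive forecasts: re-estimate the VAR each period on the sample to date
# and forecast one step ahead, mimicking the updating scheme described above.
forecasts = []
for t in range(40, len(growth)):
    fit = VAR(growth.iloc[:t]).fit(4)
    forecasts.append(fit.forecast(growth.values[t - fit.k_ar:t], steps=1)[0])
```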

Relevance: 100.00%

Abstract:

A limiting factor in the accuracy and precision of U/Pb zircon dates is accurate correction for initial disequilibrium in the 238U and 235U decay chains. The longest-lived, and therefore most abundant, intermediate daughter product in the 235U decay chain is 231Pa (T1/2 = 32.71 ka), and the partitioning behavior of Pa in zircon is not well constrained. Here we report high-precision thermal ionization mass spectrometry (TIMS) U-Pb zircon data from two samples from Ocean Drilling Program (ODP) Hole 735B, which show evidence for incorporation of excess 231Pa during zircon crystallization. The most precise analyses from the two samples have consistent Th-corrected 206Pb/238U dates, with weighted means of 11.9325 ± 0.0039 Ma (n = 9) and 11.920 ± 0.011 Ma (n = 4), but distinctly older 207Pb/235U dates that vary from 12.330 ± 0.048 Ma to 12.140 ± 0.044 Ma and from 12.03 ± 0.24 Ma to 12.40 ± 0.27 Ma, respectively. If the excess 207Pb is due to variable initial excess 231Pa, calculated initial (231Pa)/(235U) activity ratios for the two samples range from 5.6 ± 1.0 to 9.6 ± 1.1 and from 3.5 ± 5.2 to 11.4 ± 5.8. The data from the more precisely dated sample yield estimated DPa(zircon)/DU(zircon) of 2.2-3.8 and 5.6-9.6, assuming a (231Pa)/(235U) of the melt equal to the global average of recently erupted mid-ocean ridge basaltic glasses or to secular equilibrium, respectively. High-precision ID-TIMS analyses from nine additional samples from Hole 735B and nearby Hole 1105A suggest similar partitioning. The lower range of DPa(zircon)/DU(zircon) is consistent with ion microprobe measurements of 231Pa in zircons from Holocene and Pleistocene rhyolitic eruptions (Schmitt (2007; doi:10.2138/am.2007.2449) and Schmitt (2011; doi:10.1146/annurev-earth-040610-133330)). The data suggest that 231Pa is preferentially incorporated during zircon crystallization over a range of magmatic compositions, and excess initial 231Pa may be more common in zircons than acknowledged. The degree of initial disequilibrium in the 235U decay chain suggested by the data from this study, and by other recent high-precision datasets, leads to resolvable discordance in high-precision dates of Cenozoic to Mesozoic zircons. Minor discordance in zircons of this age may therefore reflect initial excess 231Pa and does not require either inheritance or Pb loss.
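
For context, the way excess initial 231Pa biases the 207Pb/235U system follows from the standard disequilibrium-corrected age equation (a textbook form for an initially unsupported 231Pa excess, not a formula quoted from this study):

```latex
\[
  \frac{^{207}\mathrm{Pb}^{*}}{^{235}\mathrm{U}}
  \;=\; \bigl(e^{\lambda_{235} t} - 1\bigr)
  \;+\; \frac{\lambda_{235}}{\lambda_{231}}\,\bigl(f_{\mathrm{Pa}} - 1\bigr),
  \qquad
  f_{\mathrm{Pa}} = \left[\frac{(^{231}\mathrm{Pa})}{(^{235}\mathrm{U})}\right]_{\mathrm{initial}} ,
\]
```

so an initial activity ratio f_Pa > 1 adds a fixed amount of unsupported 207Pb, biasing the apparent 207Pb/235U date old while leaving the 206Pb/238U date unaffected; this is the sense in which the older 207Pb/235U dates above record excess 231Pa rather than inheritance or Pb loss.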