954 results for Delsarte-McEliece Theorem
Abstract:
The recent controversy over quantum-dot dephasing mechanisms (pure versus inelastic) is re-examined by isolating the quantum dots from their substrate, using the appropriate limits of the ionization energy theory and the quantum adiabatic theorem. When the phonons in the quantum dots are isolated adiabatically from the phonons in the substrate, elastic or pure dephasing becomes the dominant mechanism. On the other hand, when the phonons from the substrate are non-adiabatically coupled to the quantum dots, the inelastic dephasing process takes over. This switch-over is due to the different elemental compositions of the quantum dots and their substrate. We also provide an unambiguous analysis of why GaAs/AlGaAs quantum dots may exhibit only pure dephasing, whereas in InAs/GaAs quantum dots inelastic dephasing is the dominant mechanism. It is shown that the elemental composition of both the quantum dots and the substrate plays an important role in evaluating the dephasing mechanisms of quantum dots.
Abstract:
Tobacco smoking, alcohol drinking, and occupational exposure to polycyclic aromatic hydrocarbons are the major proven risk factors for human head and neck squamous-cell cancer (HNSCC). Research on gene-environment interactions in HNSCC has focused mainly on genes encoding enzymes that metabolize tobacco-smoke constituents and on repair enzymes. To investigate the role of genetically determined individual predispositions in enzymes of xenobiotic metabolism and in repair enzymes, under the exogenous risk factor tobacco smoke, in the carcinogenesis of HNSCC, we conducted a case-control study of 312 cases and 300 noncancer controls. We focused on the impact of 22 sequence variations in CYP1A1, CYP1B1, CYP2E1, ERCC2/XPD, GSTM1, GSTP1, GSTT1, NAT2, NQO1, and XRCC1. To assess relevant main and interactive effects of polymorphic genes on susceptibility to HNSCC, we used statistical models such as logic regression and a Bayesian version of logic regression. In subgroup analysis of nonsmokers, main effects of the ERCC2 (Lys751Gln) C/C genotype and the combined ERCC2 (Arg156Arg) C/A and A/A genotypes were predominant. When stratifying for smokers, the data revealed main effects of the combined CYP1B1 (Leu432Val) C/G and G/G genotypes, followed by the CYP1B1 (Leu432Val) G/G genotype and the CYP2E1 (-70G>T) G/T genotype. When fitting logistic regression models including relevant main effects and interactions in smokers, we found relevant associations with HNSCC for the CYP1B1 (Leu432Val) C/G genotype combined with the CYP2E1 (-70G>T) G/T genotype (OR, 10.84; 95% CI, 1.64-71.53) as well as the CYP1B1 (Leu432Val) G/G genotype combined with the GSTM1 null/null genotype (OR, 11.79; 95% CI, 2.18-63.77). The findings underline the relevance of polymorphic CYP1B1 genotypes combined with exposure to tobacco smoke.
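The associations above are reported as odds ratios with 95% confidence intervals. As a reminder of how such figures arise from a 2x2 exposure table, here is a minimal Python sketch of the standard Woolf (log-odds-ratio) interval; the counts in the example are hypothetical and not taken from this study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf-type 95% CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - z * se_log_or)
    upper = math.exp(math.log(odds_ratio) + z * se_log_or)
    return odds_ratio, lower, upper

# Hypothetical counts for one genotype-by-smoking stratum (illustrative only).
print(odds_ratio_ci(40, 15, 120, 180))
```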
Abstract:
This research aims to explore and identify political risks on a large infrastructure project in an exaggerated environment, to ascertain whether sufficient objective information can be gathered by project managers to utilise risk modelling techniques. During the study the author proposes a new definition of political risk, performs a detailed project study of the Neelum Jhelum Hydroelectric Project in Pakistan, implements a probabilistic model using the principle of decomposition and Bayes' theorem, and answers the question: was it possible for project managers to obtain all the relevant objective data to implement a probabilistic model?
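For reference, the two ingredients named here, decomposition over a set of scenarios and Bayes' theorem, combine in the standard way shown below (generic probability notation, not the thesis's own symbols):

```latex
% Posterior probability of scenario H_i given evidence E, with the evidence
% decomposed over mutually exclusive, exhaustive scenarios H_1, ..., H_n
P(H_i \mid E) = \frac{P(E \mid H_i)\, P(H_i)}{\sum_{j=1}^{n} P(E \mid H_j)\, P(H_j)}
```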
Abstract:
Background Although the detrimental impact of major depressive disorder (MDD) at the individual level has been described, its global epidemiology remains unclear given limitations in the data. Here we present a modelled epidemiological profile of MDD that deals with heterogeneity in the data, enforces internal consistency between epidemiological parameters, and makes estimates for world regions with no empirical data. These estimates were used to quantify the burden of MDD for the Global Burden of Disease Study 2010 (GBD 2010). Method Analyses drew on data from our existing literature review of the epidemiology of MDD. DisMod-MR, the latest version of the generic disease modelling system redesigned as a Bayesian meta-regression tool, derived prevalence by age, year and sex for 21 regions. Prior epidemiological knowledge and study- and country-level covariates were used to adjust sub-optimal raw data. Results There were over 298 million cases of MDD globally at any point in time in 2010, with the highest proportion of cases occurring between 25 and 34 years of age. Global point prevalence was very similar across time (4.4% (95% uncertainty: 4.2–4.7%) in 1990 and 4.4% (4.1–4.7%) in 2005 and 2010), but was higher in females (5.5% (5.0–6.0%)) than in males (3.2% (3.0–3.6%)) in 2010. Regions in conflict had higher prevalence than those without conflict. The annual incidence of an episode of MDD followed a similar age and regional pattern to prevalence but was about one and a half times higher, consistent with an average duration of 37.7 weeks. Conclusion We were able to integrate available data, including those from high-quality surveys and sub-optimal studies, into a model adjusting for known methodological sources of heterogeneity. We were also able to estimate the epidemiology of MDD in regions with no available data. This informed GBD 2010 and provides the public-health field with a clearer understanding of the global distribution of MDD.
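The stated relation between incidence and prevalence follows from the usual steady-state identity; a back-of-the-envelope check (not a calculation from the paper) shows the two figures quoted above are mutually consistent:

```latex
\text{prevalence} \approx \text{incidence} \times \bar{D},
\qquad \bar{D} = 37.7\ \text{weeks} \approx 0.72\ \text{years}
\;\Rightarrow\;
\text{incidence} \approx \frac{\text{prevalence}}{0.72} \approx 1.4 \times \text{prevalence}.
```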
Abstract:
Background Depressive disorders were a leading cause of burden in the Global Burden of Disease (GBD) 1990 and 2000 studies. Here, we analyze the burden of depressive disorders in GBD 2010 and present severity proportions; burden by country, region, age, sex, and year; and the burden of depressive disorders as a risk factor for suicide and ischemic heart disease. Methods and Findings Burden was calculated for major depressive disorder (MDD) and dysthymia. A systematic review of epidemiological data was conducted, and the data were pooled using a Bayesian meta-regression. Disability weights from population survey data quantified the severity of health loss from depressive disorders. These weights were used to calculate years lived with disability (YLDs) and disability-adjusted life years (DALYs). Separate DALYs were estimated for suicide and ischemic heart disease attributable to depressive disorders. Depressive disorders were the second leading cause of YLDs in 2010. MDD accounted for 8.2% (5.9%-10.8%) of global YLDs and dysthymia for 1.4% (0.9%-2.0%). Depressive disorders were a leading cause of DALYs even though no mortality was attributed to them as the underlying cause. MDD accounted for 2.5% (1.9%-3.2%) of global DALYs and dysthymia for 0.5% (0.3%-0.6%). There was more regional variation in burden for MDD than for dysthymia, with higher estimates in females and in adults of working age. Whilst burden increased by 37.5% between 1990 and 2010, this was due to population growth and ageing. MDD explained 16 million suicide DALYs and almost 4 million ischemic heart disease DALYs. This attributable burden would increase the overall burden of depressive disorders from 3.0% (2.2%-3.8%) to 3.8% (3.0%-4.7%) of global DALYs. Conclusions GBD 2010 identified depressive disorders as a leading cause of burden. MDD was also a contributor to the burden allocated to suicide and ischemic heart disease. These findings emphasize the importance of including depressive disorders as a public-health priority and implementing cost-effective interventions to reduce their burden.
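For context, the GBD metrics referred to here relate as follows (standard GBD definitions): since no deaths are assigned to depressive disorders as the underlying cause, their direct DALYs consist of YLDs only, and the suicide and ischemic heart disease burden is added on top as attributable years of life lost.

```latex
\text{DALY} = \text{YLD} + \text{YLL},
\qquad \text{YLL}_{\text{direct}} = 0
\;\Rightarrow\; \text{DALY}_{\text{direct}} = \text{YLD}.
```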
Abstract:
We present a systematic, practical approach to developing risk prediction systems, suitable for use with large databases of medical information. An important part of this approach is a novel feature selection algorithm which uses the area under the receiver operating characteristic (ROC) curve to measure the expected discriminative power of different sets of predictor variables. We describe this algorithm and use it to select variables to predict risk of a specific adverse pregnancy outcome: failure to progress in labour. Neural network, logistic regression and hierarchical Bayesian risk prediction models are constructed, all of which achieve close to the limit of performance attainable on this prediction task. We show that better prediction performance requires more discriminative clinical information rather than improved modelling techniques. It is also shown that better diagnostic criteria in clinical records would greatly assist the development of systems to predict risk in pregnancy.
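As one way to picture AUC-driven variable selection, the sketch below greedily adds the predictor that most improves cross-validated ROC AUC of a logistic regression model. It is an illustrative stand-in, not the authors' algorithm, and the stopping rule and model choice are assumptions.

```python
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def forward_select_by_auc(X, y, max_features=5, cv=5):
    """Greedily add the predictor (a column of the NumPy array X) that most
    improves cross-validated ROC AUC of a logistic regression model."""
    selected, remaining = [], list(range(X.shape[1]))
    best_auc = 0.5  # AUC of an uninformative model
    while remaining and len(selected) < max_features:
        scores = {}
        for j in remaining:
            cols = selected + [j]
            model = LogisticRegression(max_iter=1000)
            scores[j] = cross_val_score(model, X[:, cols], y,
                                        scoring="roc_auc", cv=cv).mean()
        j_best = max(scores, key=scores.get)
        if scores[j_best] <= best_auc + 1e-4:  # stop when AUC no longer improves
            break
        best_auc = scores[j_best]
        selected.append(j_best)
        remaining.remove(j_best)
    return selected, best_auc
```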
Abstract:
Real-world cryptographic protocols such as the widely used Transport Layer Security (TLS) protocol support many different combinations of cryptographic algorithms (called ciphersuites) and simultaneously support different versions. Recent advances in provable security have shown that most modern TLS ciphersuites are secure authenticated and confidential channel establishment (ACCE) protocols, but these analyses generally focus on single ciphersuites in isolation. In this paper we extend the ACCE model to cover protocols with many different sub-protocols, capturing both multiple ciphersuites and multiple versions, and define a security notion for secure negotiation of the optimal sub-protocol. We give a generic theorem showing how secure negotiation follows, under some additional conditions, from the authentication property of secure ACCE protocols. Using this framework, we analyse the security of ciphersuite negotiation and of three variants of version negotiation in TLS, including a recently proposed mechanism for detecting fallback attacks.
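To make the negotiation problem concrete, here is a small, purely illustrative Python sketch of server-side fallback detection: a client retrying at a lowered version attaches a fallback indicator, and the server aborts if it could have negotiated something higher. The types, names, and version labels are hypothetical and do not follow the TLS wire format; they only mirror the general idea of the downgrade-detection mechanisms analysed in the paper.

```python
from dataclasses import dataclass

# Protocol versions ordered from oldest to newest (illustrative labels).
VERSIONS = ["TLS1.0", "TLS1.1", "TLS1.2"]

@dataclass
class ClientHello:
    offered_version: str   # highest version the client offers in this attempt
    fallback_signal: bool  # set when the client is retrying at a lowered version

def negotiate(server_max: str, hello: ClientHello) -> str:
    """Pick a version, refusing suspicious fallbacks (illustrative logic only)."""
    if hello.fallback_signal and VERSIONS.index(hello.offered_version) < VERSIONS.index(server_max):
        # The client downgraded even though the server supports something newer:
        # treat this as a possible downgrade attack and abort the handshake.
        raise ConnectionError("inappropriate fallback detected")
    return min(server_max, hello.offered_version, key=VERSIONS.index)

# A client that genuinely only speaks TLS1.1 negotiates normally...
print(negotiate("TLS1.2", ClientHello("TLS1.1", fallback_signal=False)))  # TLS1.1
# ...but a fallback retry below the server's maximum would be rejected:
# negotiate("TLS1.2", ClientHello("TLS1.1", fallback_signal=True))  # raises
```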
Abstract:
To further investigate susceptibility loci identified by genome-wide association studies, we genotyped 5,500 SNPs across 14 associated regions in 8,000 samples from a control group and 3 diseases: type 2 diabetes (T2D), coronary artery disease (CAD) and Graves' disease. Using Bayes' theorem, we defined credible sets of SNPs that were 95% likely, based on posterior probability, to contain the causal disease-associated SNPs. In 3 of the 14 regions, TCF7L2 (T2D), CTLA4 (Graves' disease) and CDKN2A-CDKN2B (T2D), much of the posterior probability rested on a single SNP, and in 4 other regions (CDKN2A-CDKN2B (CAD) and CDKAL1, FTO and HHEX (T2D)) the 95% sets were small, thereby excluding most SNPs as potentially causal. Very few SNPs in our credible sets had annotated functions, illustrating the limitations in understanding the mechanisms underlying susceptibility to common diseases. Our results also show the value of more detailed mapping to target sequences for functional studies.
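The credible-set construction described above amounts to ranking variants by posterior probability of being causal and keeping the smallest set whose cumulative probability reaches 95%. A minimal Python sketch follows; the SNP identifiers and posterior values are made-up illustrations (in practice the posteriors are derived from per-SNP Bayes factors under a single-causal-variant assumption).

```python
def credible_set(posteriors, threshold=0.95):
    """posteriors: dict mapping SNP id -> posterior probability (summing to ~1).
    Returns the smallest set of SNPs whose cumulative probability >= threshold."""
    ranked = sorted(posteriors.items(), key=lambda kv: kv[1], reverse=True)
    chosen, cumulative = [], 0.0
    for snp, prob in ranked:
        chosen.append(snp)
        cumulative += prob
        if cumulative >= threshold:
            break
    return chosen

example = {"rs0001": 0.62, "rs0002": 0.21, "rs0003": 0.13, "rs0004": 0.04}
print(credible_set(example))  # ['rs0001', 'rs0002', 'rs0003']
```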
Abstract:
Let $G = (V, E)$ be a finite, simple and undirected graph. For $S \subseteq V$, let $\delta(S, G) = \{(u, v) \in E : u \in S \text{ and } v \in V \setminus S\}$ be the edge boundary of $S$. Given an integer $i$, $1 \le i \le |V|$, let the edge isoperimetric value of $G$ at $i$ be defined as $b_e(i, G) = \min_{S \subseteq V,\ |S| = i} |\delta(S, G)|$. The edge isoperimetric peak of $G$ is defined as $b_e(G) = \max_{1 \le j \le |V|} b_e(j, G)$. Let $b_v(G)$ denote the vertex isoperimetric peak, defined in a corresponding way. The problem of determining a lower bound for the vertex isoperimetric peak in complete t-ary trees was recently considered in [Y. Otachi, K. Yamazaki, A lower bound for the vertex boundary-width of complete k-ary trees, Discrete Mathematics, in press (doi: 10.1016/j.disc.2007.05.014)]. In this paper we provide bounds which improve those in the above cited paper. Our results can be generalized to arbitrary (rooted) trees. The depth $d$ of a tree is the number of nodes on the longest path starting from the root and ending at a leaf. In this paper we show that for a complete binary tree of depth $d$ (denoted $T_d^2$), $c_1 d \le b_e(T_d^2) \le d$ and $c_2 d \le b_v(T_d^2) \le d$, where $c_1, c_2$ are constants. For a complete t-ary tree of depth $d$ (denoted $T_d^t$) with $d \ge c \log t$, where $c$ is a constant, we show that $c_1 \sqrt{t}\, d \le b_e(T_d^t) \le t d$ and $c_2 d / \sqrt{t} \le b_v(T_d^t) \le d$, where $c_1, c_2$ are constants. At the heart of our proof is the following theorem, which works for an arbitrary rooted tree and not just for a complete t-ary tree. Let $T = (V, E, r)$ be a finite, connected and rooted tree, the root being the vertex $r$. Define a weight function $w : V \to \mathbb{N}$ where the weight $w(u)$ of a vertex $u$ is the number of its successors (including itself), and let the weight index $\eta(T)$ be defined as the number of distinct weights in the tree, i.e. $\eta(T) = |\{w(u) : u \in V\}|$. For a positive integer $k$, let $\ell(k) = |\{i \in \mathbb{N} : 1 \le i \le |V|,\ b_e(i, T) \le k\}|$. We show that $\ell(k) \le 2\binom{2\eta + k}{k}$.
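To make the weight function and weight index concrete, the short Python sketch below computes $w(u)$ (subtree sizes) and $\eta(T)$ for a rooted tree given as a child-adjacency dictionary; the example tree is a complete binary tree of depth 3, for which the distinct weights are 1, 3 and 7, hence $\eta = 3$.

```python
def subtree_weights(children, root):
    """Return {vertex: w(vertex)} for a rooted tree given as {vertex: [children]},
    where w(u) is the number of successors of u including u itself."""
    weights = {}
    def dfs(u):
        w = 1 + sum(dfs(v) for v in children.get(u, []))
        weights[u] = w
        return w
    dfs(root)
    return weights

def weight_index(children, root):
    """eta(T): the number of distinct subtree weights in the tree."""
    return len(set(subtree_weights(children, root).values()))

# Complete binary tree of depth 3 (7 vertices): weights are {1, 3, 7}, so eta = 3.
tree = {1: [2, 3], 2: [4, 5], 3: [6, 7]}
print(subtree_weights(tree, 1))  # {4: 1, 5: 1, 2: 3, 6: 1, 7: 1, 3: 3, 1: 7}
print(weight_index(tree, 1))     # 3
```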
Abstract:
The problem of separability in recent models of classical relativistic interacting particles is examined. This physical requirement is shown to be more subtle than naive separability of all the constraints defining the system: it is adequate to be able to canonically transform the time-fixing constraints from an unseparated to a separated form when clusters emerge. Viewing separability in this way, and within a specific framework, we are led to a new no-interaction theorem which states the incompatibility of nontrivial interaction with relativistic invariance, separability, and invariant world lines for more than two particles.
Abstract:
Recent work on the violent relaxation of collisionless stellar systems has been based on the notion of a wide class of entropy functions. A theorem concerning entropy increase has been proved. We draw attention to some underlying assumptions that have been ignored in the applications of this theorem to stellar dynamical problems. Once these are taken into account, the use of this theorem is at best heuristic. We present a simple counter-example.
Abstract:
It is generally known that the orbital diamagnetism of a classical system of charged particles in thermal equilibrium is identically zero (the Bohr-van Leeuwen theorem). Physically, this null result derives from the exact cancellation of the orbital diamagnetic moment associated with the complete cyclotron orbits of the charged particles by the paramagnetic moment subtended by the incomplete orbits skipping the boundary in the opposite sense. Motivated by this crucial but subtle role of the boundary, we have simulated here the case of a finite but unbounded system, namely that of a charged particle moving on the surface of a sphere in the presence of an externally applied uniform magnetic field. Following a real space-time approach based on the classical Langevin equation, we have computed the orbital magnetic moment, which now indeed turns out to be non-zero and has the diamagnetic sign. To the best of our knowledge, this is the first report of the possibility of finite classical diamagnetism in principle, and it is due to the avoided cancellation.
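The real space-time approach mentioned here integrates the Langevin equation and averages the instantaneous orbital moment over the trajectory. The Python sketch below does this for the simpler case of an unbounded plane with a perpendicular field, not the spherical geometry studied in the paper; all parameter values are arbitrary illustrations.

```python
import numpy as np

# Langevin dynamics: m dv/dt = q v x B - gamma v + sqrt(2 gamma kT) xi(t),
# with the orbital moment accumulated as m_z = (q/2)(x v_y - y v_x).
rng = np.random.default_rng(0)
q, mass, B, gamma, kT = 1.0, 1.0, 1.0, 1.0, 1.0
dt, steps = 1e-3, 200_000

r = np.zeros(2)
v = np.zeros(2)
moment_sum = 0.0

for _ in range(steps):
    lorentz = q * B * np.array([v[1], -v[0]])            # q v x B for B along z
    noise = np.sqrt(2 * gamma * kT * dt) * rng.standard_normal(2)
    v += (lorentz - gamma * v) * dt / mass + noise / mass
    r += v * dt
    moment_sum += 0.5 * q * (r[0] * v[1] - r[1] * v[0])  # instantaneous m_z

print("time-averaged orbital moment:", moment_sum / steps)
```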
Abstract:
One of the most fundamental questions in the philosophy of mathematics concerns the relation between truth and formal proof. The position according to which the two concepts are the same is called deflationism, and the opposing viewpoint substantialism. In an important result of mathematical logic, Kurt Gödel proved in his first incompleteness theorem that all consistent formal systems containing arithmetic include sentences that can neither be proved nor disproved within that system. However, such undecidable Gödel sentences can be established to be true once we expand the formal system with Alfred Tarski's semantical theory of truth, as shown by Stewart Shapiro and Jeffrey Ketland in their semantical arguments for the substantiality of truth. According to them, in Gödel sentences we have an explicit case of true but unprovable sentences, and hence deflationism is refuted. Against that, Neil Tennant has shown that instead of Tarskian truth we can expand the formal system with a soundness principle, according to which all provable sentences are assertable, and the assertability of Gödel sentences follows. This way, the relevant question is not whether we can establish the truth of Gödel sentences, but whether Tarskian truth is a more plausible expansion than a soundness principle. In this work I will argue that this problem is best approached once we think of mathematics as the full human phenomenon, and not just as consisting of formal systems. When pre-formal mathematical thinking is included in our account, we see that Tarskian truth is in fact not an expansion at all. I claim that what proof is to formal mathematics, truth is to pre-formal thinking, and the Tarskian account of semantical truth mirrors this relation accurately. However, the introduction of pre-formal mathematics is vulnerable to the deflationist counterargument that, while existing in practice, pre-formal thinking could still be philosophically superfluous if it does not refer to anything objective. Against this, I argue that all truly deflationist philosophical theories lead to the arbitrariness of mathematics. In all other philosophical accounts of mathematics there is room for a reference of the pre-formal mathematics, and the expansion to Tarskian truth can be made naturally. Hence, if we reject the arbitrariness of mathematics, I argue in this work, we must accept the substantiality of truth. Related subjects such as neo-Fregeanism will also be covered, and shown not to change the need for Tarskian truth. The only remaining route for the deflationist is to change the underlying logic so that our formal languages can include their own truth predicates, which Tarski showed to be impossible for classical first-order languages. With such logics we would have no need to expand the formal systems, and the above argument would fail. Of the alternative approaches, in this work I focus mostly on the Independence Friendly (IF) logic of Jaakko Hintikka and Gabriel Sandu. Hintikka has claimed that an IF language can include its own adequate truth predicate. I argue that while this is indeed the case, we cannot recognize the truth predicate as such within the same IF language, and the need for Tarskian truth remains. In addition to IF logic, second-order logic and Saul Kripke's approach using Kleenean logic will also be shown to fail in a similar fashion.
Abstract:
The splitting techniques for Markov chains developed by Nummelin (1978a) and Athreya and Ney (1978b) are used to derive an imbedded renewal process in Wold's point process with Markov-correlated intervals. This leads to a simple proof of renewal theorems for such processes. In particular, a key renewal theorem is proved, from which analogues of both Blackwell's and Breiman's forms of the renewal theorem can be deduced.
Abstract:
The Dirac generator formalism for relativistic Hamiltonian dynamics is reviewed, along with its extension to constraint formalism. In these theories evolution is with respect to a dynamically defined parameter, and thus time evolution involves an eleventh generator. These formulations evade the No-Interaction Theorem. But the incorporation of separability reopens the question, and together with the World Line Condition leads to a second no-interaction theorem for systems of three or more particles. Proofs are omitted, but the results of recent research in this area are highlighted.