56 results for 240500 Classical Physics


Relevance: 20.00%

Résumé: Classical cryptography is based on mathematical concepts whose security depends on the computational difficulty of inverting functions. This type of encryption is at the mercy of the computing power of computers and of the discovery of algorithms that can compute the inverses of certain mathematical functions in a "reasonable" time. Using a scheme whose security is scientifically proven is therefore indispensable, especially for critical exchanges (banking systems, governments, etc.). Quantum cryptography meets this need: its security rests on laws of quantum physics that guarantee unconditionally secure operation. However, applying and integrating quantum cryptography remains a concern for developers of this type of solution. This thesis justifies the need for quantum cryptography and shows that the cost incurred by deploying this solution is justified. It proposes a simple and practical mechanism for integrating quantum cryptography into widely used communication protocols such as PPP, IPSec and 802.11i. Application scenarios illustrate the feasibility of these solutions. A methodology for evaluating quantum-cryptography-based solutions against the Common Criteria is also proposed in this document.

Abstract: Classical cryptography is based on mathematical functions. The robustness of a cryptosystem essentially depends on the difficulty of computing the inverse of its one-way function. There is no mathematical proof establishing whether it is impossible to find the inverse of a given one-way function. It is therefore mandatory to use a cryptosystem whose security is scientifically proven (especially for banking, governments, etc.). The security of quantum cryptography, on the other hand, can be formally demonstrated: it is based on the laws of physics, which assure unconditional security. How can quantum cryptography be used and integrated into existing solutions? This thesis proposes a method to integrate quantum cryptography into existing communication protocols such as PPP, IPSec and the 802.11i protocol. It sketches out possible scenarios in order to prove feasibility and to estimate their cost. Directives and checkpoints are given to help certify quantum cryptography solutions according to the Common Criteria.
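
Deployments of this kind typically obtain their keys from a BB84-style quantum key distribution link, whose classical post-processing begins with basis sifting. The sketch below is a minimal illustration, not the thesis's integration mechanism: it shows the sifting step for an ideal, noiseless channel, and all names and parameters are illustrative assumptions. In practice, error estimation, error correction and privacy amplification would follow before the key could be handed to a protocol such as PPP, IPSec or 802.11i as key material.

```python
import secrets

# Minimal BB84-style sifting sketch (illustrative assumption). On an ideal,
# noiseless channel Bob's measured bit equals Alice's bit whenever their
# randomly chosen bases coincide, so both sides keep exactly those positions
# and discard the rest.

def random_bits(n):
    return [secrets.randbelow(2) for _ in range(n)]

def sift(alice_bits, alice_bases, bob_bases):
    # Keep only the positions where the two basis choices match.
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]

n = 1024
alice_bits = random_bits(n)
alice_bases = random_bits(n)   # 0 = rectilinear, 1 = diagonal
bob_bases = random_bits(n)

raw_key = sift(alice_bits, alice_bases, bob_bases)
print(f"sifted key length: {len(raw_key)} of {n} transmitted qubits")
```

On average about half of the transmitted qubits survive sifting; the surviving bits, once error-corrected and privacy-amplified, are what an integration layer would feed into the key management of the protocols named above, in place of or alongside classically negotiated keys.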

Relevance: 20.00%

Abstract:

BACKGROUND AND PURPOSE: Intensity-modulated radiotherapy (IMRT) credentialing for an EORTC study was performed using an anthropomorphic head phantom from the Radiological Physics Center (RPC; RPCPH). Institutions were retrospectively requested to irradiate their institutional phantom (INSTPH) using the same treatment plan, in the framework of a Virtual Phantom Project (VPP) for IMRT credentialing. MATERIALS AND METHODS: The CT data set of the institutional phantom and the measured 2D dose matrices were requested from centers and sent to a dedicated secure EORTC uploader. Data from the RPCPH and INSTPH were then centrally analyzed and inter-compared by the QA team using commercially available software (RIT; ver. 5.2; Colorado Springs, USA). RESULTS: Eighteen institutions participated in the VPP. The measurements of 6 (33%) institutions could not be analyzed centrally. All other centers passed both the VPP and the RPC ±7%/4 mm credentialing criteria. At the 5%/5 mm gamma criterion (90% of pixels passing), 11 (92%) centers passed the credentialing process with the RPCPH, compared to 12 (100%) with the INSTPH (p = 0.29). The corresponding pass rates for the 3%/3 mm gamma criterion (90% of pixels passing) were 2 (17%) and 9 (75%; p = 0.01), respectively. CONCLUSIONS: IMRT dosimetry gamma evaluations in a single plane for a H&N prospective trial using the INSTPH measurements showed agreement at the gamma index criterion of ±5%/5 mm (90% of pixels passing) for a small number of VPP measurements. Using more stringent criteria, the RPCPH and INSTPH comparisons disagreed. More data are urgently required within the framework of prospective studies.
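
The pass rates above are based on the gamma index, which scores each measured pixel by combining a dose-difference tolerance (e.g. 5%) with a distance-to-agreement tolerance (e.g. 5 mm); a pixel passes when the minimum combined deviation over the evaluated distribution is at most 1, and a plan passes when at least 90% of pixels pass. The brute-force sketch below illustrates that calculation; the function, the synthetic dose planes and the 1 mm pixel size are illustrative assumptions, not data or software from the study (which used the commercial RIT package).

```python
import numpy as np

def gamma_pass_rate(ref_dose, eval_dose, pixel_mm, dose_tol, dta_mm, threshold=0.1):
    """Brute-force 2D global gamma index (illustrative sketch, not a validated
    clinical tool). dose_tol is a fraction of the reference maximum (e.g. 0.05
    for 5%); dta_mm is the distance-to-agreement criterion in millimetres."""
    ref_max = ref_dose.max()
    dose_crit = dose_tol * ref_max
    ny, nx = ref_dose.shape
    yy, xx = np.meshgrid(np.arange(ny) * pixel_mm, np.arange(nx) * pixel_mm, indexing="ij")

    gammas = []
    for iy in range(ny):
        for ix in range(nx):
            d_ref = ref_dose[iy, ix]
            if d_ref < threshold * ref_max:      # ignore the low-dose region
                continue
            dist2 = (yy - iy * pixel_mm) ** 2 + (xx - ix * pixel_mm) ** 2
            dose_diff2 = (eval_dose - d_ref) ** 2
            gamma2 = dist2 / dta_mm ** 2 + dose_diff2 / dose_crit ** 2
            gammas.append(np.sqrt(gamma2.min()))
    return (np.array(gammas) <= 1.0).mean()

# Hypothetical example: two 50 x 50 dose planes with 1 mm pixels,
# evaluated at the 5%/5 mm criterion used in the credentialing study.
rng = np.random.default_rng(0)
ref = rng.random((50, 50)) * 2.0
ev = ref * (1 + 0.02 * rng.standard_normal(ref.shape))
print(f"pass rate at 5%/5 mm: {gamma_pass_rate(ref, ev, 1.0, 0.05, 5.0):.1%}")
```

A full search over every evaluated pixel is quadratic in image size; production tools restrict the search to a neighbourhood a few times the DTA and interpolate the evaluated dose, but the acceptance rule is the same.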

Relevance: 20.00%

Abstract:

Depth-averaged velocities and unit discharges within a 30 km reach of one of the world's largest rivers, the Rio Parana, Argentina, were simulated using three hydrodynamic models with different process representations: a reduced complexity (RC) model that neglects most of the physics governing fluid flow, a two-dimensional model based on the shallow water equations, and a three-dimensional model based on the Reynolds-averaged Navier-Stokes equations. Flow characteristics simulated using all three models were compared with data obtained by acoustic Doppler current profiler surveys at four cross sections within the study reach. This analysis demonstrates that, surprisingly, the performance of the RC model is generally equal to, and in some instances better than, that of the physics-based models in terms of the statistical agreement between simulated and measured flow properties. In addition, in contrast to previous applications of RC models, the present study demonstrates that the RC model can successfully predict measured flow velocities. The strong performance of the RC model reflects, in part, the simplicity of the depth-averaged mean flow patterns within the study reach and the dominant role of channel-scale topographic features in controlling the flow dynamics. Moreover, the very low water surface slopes that typify large sand-bed rivers enable flow depths to be estimated reliably in the RC model using a simple fixed-lid planar water surface approximation. This approach overcomes a major problem encountered in the application of RC models in environments characterised by shallow flows and steep bed gradients. The RC model is four orders of magnitude faster than the physics-based models when performing steady-state hydrodynamic calculations. However, the iterative nature of the RC model calculations implies a reduction in computational efficiency relative to some other RC models. A further implication of this is that, if used to simulate channel morphodynamics, the present RC model may offer only a marginal advantage in terms of computational efficiency over approaches based on the shallow water equations. These observations illustrate the trade-off between model realism and efficiency that is a key consideration in RC modelling. Moreover, this outcome highlights a need to rethink the use of RC morphodynamic models in fluvial geomorphology and to move away from existing grid-based approaches, such as the popular cellular automata (CA) models, that remain essentially reductionist in nature. In the case of the world's largest sand-bed rivers, this might be achieved by implementing the RC model outlined here as one element within a hierarchical modelling framework that would enable computationally efficient simulation of the morphodynamics of large rivers over millennial time scales.
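
The fixed-lid planar approximation mentioned above treats the water surface as a plane of prescribed slope, so the local flow depth is simply that plane's elevation minus the bed elevation, from which a depth-averaged velocity and unit discharge can be recovered with a resistance law. The sketch below illustrates the idea on a synthetic bed; the constant slope, the Manning coefficient and the grid are illustrative assumptions and do not come from the Rio Parana study.

```python
import numpy as np

def planar_depths(bed_elevation, ws_at_outlet, ws_slope, dx):
    """Fixed-lid depth estimate: planar water surface minus bed, dry cells clipped to zero."""
    nx = bed_elevation.shape[1]
    # The water surface rises linearly upstream; the outlet sits at the last column.
    distance_upstream = np.arange(nx)[::-1] * dx
    water_surface = ws_at_outlet + ws_slope * distance_upstream
    return np.clip(water_surface[np.newaxis, :] - bed_elevation, 0.0, None)

def manning_velocity(depth, slope, n_manning=0.025):
    """Depth-averaged velocity from a Manning-type resistance law (illustrative coefficient)."""
    return depth ** (2.0 / 3.0) * np.sqrt(slope) / n_manning

dx = 100.0                                                    # cell size (m), assumed
bed = np.random.default_rng(1).normal(0.0, 1.0, (20, 300))    # synthetic bed elevations (m)
depth = planar_depths(bed, ws_at_outlet=5.0, ws_slope=3e-5, dx=dx)
velocity = manning_velocity(depth, 3e-5)
unit_discharge = depth * velocity                             # q = h * U (m^2/s)
print(f"mean depth {depth.mean():.2f} m, mean unit discharge {unit_discharge.mean():.2f} m2/s")
```

In an actual RC model the velocities would then be used to route discharge iteratively across the grid until mass is conserved; that iteration is the step the abstract identifies as the main computational cost of this particular formulation.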

Relevance: 20.00%

Abstract:

Karyotype analysis of acute lymphoblastic leukemia (ALL) at diagnosis has provided valuable prognostic markers for treatment stratification. However, reports of cytogenetic studies of relapsed ALL samples are limited. We compared the karyotypes of 436 nonselected B-cell precursor ALL patients at initial diagnosis and of 76 patients at first relapse. We noticed a relative increase of karyotypes that did not fall into the classic ALL cytogenetic subgroups (high hyperdiploidy, t(12;21), t(9;22), 11q23, t(1;19), <45 chromosomes) in a group of 29 patients at relapse (38%) compared to 130 patients at presentation (30%). Non-classical cytogenetic aberrations in these 29 patients were mostly found on chromosomes 1, 2, 7, 9, 13, 14, and 17. We also describe six rare reciprocal translocations, three of which involved 14q32. The most frequent abnormalities were found in 9p (12/29 cases) and were associated with a marked decrease in the duration of the second remission, but not in the probability of 10-year event-free survival after relapse treatment. Of the 29 patients with non-classical cytogenetic aberrations, only 8 (28%) had been stratified to a high-risk arm on the first treatment protocol, suggesting that this subgroup might benefit from the identification of new prognostic markers in future studies.

Relevance: 20.00%

Abstract:

Invasive fungal diseases (IFDs) continue to cause considerable morbidity and mortality in patients with haematological malignancy. Diagnosis of IFD is difficult, with the sensitivity of the gold-standard tests (culture and histopathology) often reported to be low, which may at least in part be due to sub-optimal sampling or subsequent handling in the routine microbiological laboratory. Therefore, a working group of the European Conference on Infections in Leukaemia was convened in 2009 with the task of reviewing the classical diagnostic procedures and providing recommendations for their optimal use. The recommendations were presented and approved at the ECIL-3 conference in September 2009. Although new serological and molecular tests are examined in separate papers, this review focuses on sample types, microscopy and culture procedures, antifungal susceptibility testing and imaging. The performance and limitations of these procedures are discussed, and recommendations are provided on when and how to use them and how to interpret the results.

Relevance: 20.00%

Abstract:

The theory of language has occupied a special place in the history of Indian thought. Indian philosophers give particular attention to the analysis of the cognition obtained from language, known under the generic name of śābdabodha. This term is used to denote, among other things, the cognition episode of the hearer, the content of which is described in the form of a paraphrase of a sentence represented as a hierarchical structure. Philosophers submit the meanings of the component items of a sentence and their relationships to a thorough examination, and represent the content of the resulting cognition as a paraphrase centred on one meaning element, which is taken as the principal qualificand (mukhyaviśeṣya) and is qualified by the other meaning elements. This analysis is the object of continuous debate, over a period of more than a thousand years, between the philosophers of the schools of Mīmāṃsā, Nyāya (mainly in its Navya form) and Vyākaraṇa. While these philosophers are in complete agreement on the idea that the cognition of sentence meaning has a hierarchical structure, and share the concept of a single principal qualificand (qualified by other meaning elements), they strongly disagree on the question of which meaning element has this role and by which morphological item it is expressed. This disagreement is the central point of their debate and gives rise to competing versions of the theory. The Mīmāṃsakas argue that the principal qualificand is what they call bhāvanā, 'bringing into being', 'efficient force' or 'productive operation', expressed by the verbal affix and distinct from the specific procedures signified by the verbal root; the Naiyāyikas generally take it to be the meaning of the word with the first case ending, while the Vaiyākaraṇas take it to be the operation expressed by the verbal root. All the participants rely on the Pāṇinian grammar, insofar as the Mīmāṃsakas and Naiyāyikas do not compose a new grammar of Sanskrit, but they use different interpretive strategies in order to justify their views, which are often in overt contradiction with the interpretation of the Pāṇinian rules accepted by the Vaiyākaraṇas. In each of the three positions, weakness in one area is compensated by strength in another, and the cumulative force of the total argumentation shows that no position can be declared correct or overall superior to the others. This book is an attempt to understand this debate, and to show that, to make full sense of the irreconcilable positions of the three schools, one must go beyond linguistic factors and consider the very beginnings of each school's concern with the issue under scrutiny. The texts, and particularly the late texts of each school, present very complex versions of the theory, yet the key to understanding why these positions remain irreconcilable seems to lie elsewhere, in spite of extensive argumentation involving a great deal of linguistic and logical technicalities. Historically, this theory arises first in Mīmāṃsā (with Śabara and Kumārila), then in Nyāya (with Udayana), in a doctrinal and theological context, as a byproduct of the debate over Vedic authority. The Navya-Vaiyākaraṇas enter this debate last (with Bhaṭṭoji Dīkṣita and Kauṇḍa Bhaṭṭa), with the declared aim of refuting the arguments of the Mīmāṃsakas and Naiyāyikas by bringing to light the shortcomings in their understanding of Pāṇinian grammar.
The central argument has focused on the capacity of the initial contexts, with the network of issues to which the principal qualificand theory is connected, to render intelligible the presuppositions and aims behind the complex linguistic justification of the classical and late stages of this debate. Reading the debate in this light not only reveals the rationality and internal coherence of each position beyond the linguistic arguments, but also makes it possible to understand why the thinkers of the three schools have continued to hold on to three mutually exclusive positions. They are defending not only their version of the principal qualificand theory but also (though not openly acknowledged) the entire network of arguments, linguistic and/or extra-linguistic, to which this theory is connected, as well as the presuppositions and aims underlying these arguments.

Relevance: 20.00%

Abstract:

According to molecular epidemiology theory, two isolates belong to the same chain of transmission if they are similar according to a highly discriminatory molecular typing method. This has been demonstrated in outbreaks, but is rarely studied in endemic situations. Person-to-person transmission cannot be established when isolates of meticillin-resistant Staphylococcus aureus (MRSA) belong to endemically predominant genotypes. By contrast, isolates of infrequent genotypes might be more suitable for epidemiological tracking. The objective of the present study was to determine, in newly identified patients harbouring non-predominant MRSA genotypes, whether putative epidemiological links inferred from molecular typing could replace classical epidemiology in the context of a regional surveillance programme. MRSA genotypes were defined using double-locus sequence typing (DLST) combining the clfB and spa genes. A total of 1,268 non-repetitive MRSA isolates recovered between 2005 and 2006 in Western Switzerland were typed: 897 isolates (71%) belonged to four predominant genotypes, 231 (18%) to 55 non-predominant genotypes, and 140 (11%) were unique. Obvious epidemiological links were found in only 106/231 (46%) patients carrying isolates with non-predominant genotypes, suggesting that molecular surveillance identified twice as many clusters as would have been suspected from classical epidemiological links. However, not all of these molecular clusters represented person-to-person transmission. Thus, molecular typing cannot replace classical epidemiology but is complementary to it. A prospective surveillance of MRSA genotypes could help to target epidemiological tracking in order to recognise new risk factors in hospital and community settings, or the emergence of new epidemic clones.

Relevance: 20.00%

Abstract:

The paper argues that the formulation of quantum mechanics proposed by Ghirardi, Rimini and Weber (GRW) is a serious candidate for being a fundamental physical theory and explores its ontological commitments from this perspective. In particular, we propose to conceive of spatial superpositions of non-massless microsystems as dispositions or powers, more precisely propensities, to generate spontaneous localizations. We set out five reasons for this view, namely that (1) it provides for a clear sense in which quantum systems in entangled states possess properties even in the absence of definite values; (2) it vindicates objective, single-case probabilities; (3) it yields a clear transition from quantum to classical properties; (4) it makes it possible to draw a clear distinction between purely mathematical and physical structures; and (5) it grounds the arrow of time in the time-irreversible manifestation of the propensities to localize.

Relevance: 20.00%

Abstract:

While scientific realism generally assumes that successful scientific explanations yield information about reality, realists also have to admit that not all information acquired in this way is equally well warranted. Some versions of scientific realism do this by saying that explanatory posits with which we have established some kind of causal contact are better warranted than those that merely appear in theoretical hypotheses. I first explicate this distinction by considering some general criteria that permit us to distinguish causal warrant from theoretical warrant. I then apply these criteria to a specific case from particle physics, claiming that scientific realism has to incorporate the distinction between causal and theoretical warrant if it is to be an adequate stance in the philosophy of particle physics.