719 results for atomicity violation
Uncommon crown-root fracture treated with adhesive tooth fragment reattachment: 7 years of follow-up
Abstract:
Crown-root fractures account for 5% of all fractures in permanent teeth and can involve enamel, dentin, and cementum. Depending on whether there is pulpal involvement, these fractures may be classified as complicated (which are more common) or uncomplicated. The treatment depends on the level of the fracture line, root length and/or morphology, and esthetic needs. Several treatment strategies are available for esthetic and functional rehabilitation in crown-root fractures. Adhesive tooth fragment reattachment is the most conservative restorative option when the tooth fragment is available and the biological width has no or minimal violation. This article reports a case of an uncomplicated crown-root fracture in the permanent maxillary right central incisor of a young patient who received treatment with adhesive tooth fragment reattachment, preserving the anatomic characteristics of the fractured tooth after periodontal intervention. The fracture line of the fragment had an unusual shape, starting on the palatal side and extending subgingivally to the buccal side. After 7 years, the reattached coronal fragment remained in position with good esthetics, as well as clinical and radiographic signs of pulpal vitality, periodontal health, and root integrity, thus indicating success.
Abstract:
Graduate Program in Environmental Sciences - Sorocaba
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Graduate Program in Physics - IFT
Abstract:
With regard to statistical process control, univariate analysis is not sufficient for many types of company, and it becomes necessary to resort to multivariate methods. Moreover, the observations are usually assumed to be independent; violation of this hypothesis indicates the presence of autocorrelation in the process. In this work, following a basic quantitative approach for exploratory and experimental research, the objects of study are multivariate autocorrelated control charts based on Hotelling's T². ARL values were collected through simulations with a computational program written in FORTRAN, with the objective of studying the properties of the charts, in addition to comparing them with the
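The core quantities in this abstract (the Hotelling T² statistic and the average run length, ARL) can be illustrated with a minimal Monte Carlo sketch. This is not the authors' FORTRAN program, and it ignores the autocorrelation that is the paper's focus: it assumes an independent bivariate normal process with known mean, unit variances, and correlation 0.5, and estimates the in-control ARL (≈ 1/α) and the out-of-control ARL under a mean shift.

```python
import math
import random

random.seed(42)
RHO = 0.5                          # assumed correlation between the two variables
ALPHA = 0.005                      # per-sample false-alarm rate
UCL = -2.0 * math.log(ALPHA)       # chi-square(2) upper quantile (closed form for df = 2)

def sample(shift=(0.0, 0.0)):
    # Correlated bivariate normal via a Cholesky-style construction.
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x1 = z1 + shift[0]
    x2 = RHO * z1 + math.sqrt(1 - RHO**2) * z2 + shift[1]
    return x1, x2

def t2(x1, x2):
    # Hotelling T^2 for known zero mean and unit-variance covariance with corr RHO:
    # T^2 = x' S^{-1} x = (x1^2 - 2*RHO*x1*x2 + x2^2) / (1 - RHO^2)
    return (x1**2 - 2 * RHO * x1 * x2 + x2**2) / (1 - RHO**2)

def run_length(shift=(0.0, 0.0), cap=100_000):
    # Number of samples until the chart signals (T^2 exceeds the UCL).
    for t in range(1, cap + 1):
        if t2(*sample(shift)) > UCL:
            return t
    return cap

arl0 = sum(run_length() for _ in range(400)) / 400              # in-control, ~1/ALPHA = 200
arl1 = sum(run_length((1.0, 1.0)) for _ in range(400)) / 400    # shifted mean: much shorter
```

Under independence the in-control ARL is simply 1/α; the paper's contribution is precisely that autocorrelation breaks this relationship, which a simulation like the one above can be extended to quantify.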
Abstract:
The issue of multijurisdictional practice ("MJP") concerns whether, and to what extent, lawyers may practice law in states in which they are not licensed. Under current law in Nebraska and almost every other state, it may be a violation of both the ethics rules and state law for a lawyer not licensed in a state to engage in activity there that constitutes the practice of law, even on a temporary basis. This law is no longer practical or necessary, and Nebraska should now consider modifying it.
Abstract:
In this paper, we perform a thorough analysis of a spectral phase-encoded time spreading optical code division multiple access (SPECTS-OCDMA) system based on Walsh-Hadamard (W-H) codes, aiming not only at finding optimal code-set selections but also at assessing its loss of security due to crosstalk. We prove that an inadequate choice of codes can make the crosstalk between active users large enough for the data of the user of interest to be detected by another user. The proposed algorithm for code optimization targets code sets that produce the minimum bit error rate (BER) among all codes for a specific number of simultaneous users. This methodology allows us to find optimal code sets for any OCDMA system, regardless of the code family used and the number of active users. This procedure is crucial for circumventing the unexpected lack of security due to crosstalk. We also show that a SPECTS-OCDMA system based on W-H 32 (64) fundamentally limits the number of simultaneous users to 4 (8) with no security violation due to crosstalk. More importantly, we prove that only a small fraction of the available code sets is actually immune to crosstalk with acceptable BER (< 10⁻⁹), i.e., approximately 0.5% for W-H 32 with four simultaneous users, and about 1 × 10⁻⁴% for W-H 64 with eight simultaneous users.
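As an illustration of the code family involved (not of the authors' BER-based optimization algorithm), the following sketch builds the Walsh-Hadamard codewords by the standard Sylvester construction, verifies their mutual orthogonality, and counts the candidate 4-user code sets that such an optimization would have to search over for W-H 32.

```python
import math

def hadamard(n):
    """Sylvester construction: H(2k) = [[H, H], [H, -H]]; n must be a power of 2."""
    H = [[1]]
    while len(H) < n:
        H = [row + row for row in H] + [row + [-x for x in row] for row in H]
    return H

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

H32 = hadamard(32)

# Distinct Walsh-Hadamard codewords are mutually orthogonal (zero inner product);
# the crosstalk studied in the paper arises in the optical system, not from the
# ideal correlation properties shown here.
assert all(dot(H32[i], H32[j]) == 0 for i in range(32) for j in range(i + 1, 32))

# Number of ways to choose a 4-user code set from the 32 available codes.
n_sets = math.comb(32, 4)
```

With 35,960 candidate 4-code sets, the abstract's figure of roughly 0.5% immune sets corresponds to only on the order of 180 usable selections, which illustrates why an explicit search procedure is needed.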
Abstract:
We study the magneto-optical properties of monolayer graphene by means of quantum field theory methods in the framework of the Dirac model. We reveal good agreement between the Dirac model and a recent experiment on giant Faraday rotation in cyclotron resonance [23]. We also predict other regimes in which the effects are well pronounced. The general dependence of the Faraday rotation and absorption on the various parameters of the samples is revealed for both suspended and epitaxial graphene.
Abstract:
In the framework of gauged flavour symmetries, new fermions in parity symmetric representations of the standard model are generically needed for the compensation of mixed anomalies. The key point is that their masses are also protected by flavour symmetries and some of them are expected to lie way below the flavour symmetry breaking scale(s), which has to occur many orders of magnitude above the electroweak scale to be compatible with the available data from flavour changing neutral currents and CP violation experiments. We argue that, actually, some of these fermions would plausibly get masses within the LHC range. If they are taken to be heavy quarks and leptons, in (bi)-fundamental representations of the standard model symmetries, their mixings with the light ones are strongly constrained to be very small by electroweak precision data. The alternative chosen here is to exactly forbid such mixings by breaking of flavour symmetries into an exact discrete symmetry, the so-called proton-hexality, primarily suggested to avoid proton decay. As a consequence of the large value needed for the flavour breaking scale, those heavy particles are long-lived and rather appropriate for the current and future searches at the LHC for quasi-stable hadrons and leptons. In fact, the LHC experiments have already started to look for them.
Abstract:
The current cosmological dark sector (dark matter plus dark energy) is challenging our comprehension of the physical processes taking place in the Universe. Recently, some authors tried to falsify the basic underlying assumptions of such a dark matter-dark energy paradigm. In this Letter, we show that oversimplifications of the measurement process may produce false positives for any consistency test based on the globally homogeneous and isotropic Λ cold dark matter (ΛCDM) model and its expansion history based on distance measurements. In particular, when local inhomogeneity effects due to clumped matter or voids are taken into account, an apparent violation of the basic assumptions (Copernican Principle) seems to be present. Conversely, the amplitude of the deviations also probes the degree of reliability underlying the phenomenological Dyer-Roeder procedure by confronting its predictions with the accuracy of the weak lensing approach. Finally, a new method is devised to reconstruct the effects of the inhomogeneities in a ΛCDM model, and some suggestions of how to distinguish clumpiness (or void) effects from different cosmologies are discussed.