973 results for GNSS, Ambiguity resolution, Regularization, Ill-posed problem, Success probability


Relevance: 20.00%

Abstract:

Identifying hadronic molecular states and/or hadrons with multiquark components either with or without exotic quantum numbers is a long-standing challenge in hadronic physics. We suggest that studying the production of these hadrons in relativistic heavy ion collisions offers a promising resolution to this problem as yields of exotic hadrons are expected to be strongly affected by their structures. Using the coalescence model for hadron production, we find that, compared to the case of a nonexotic hadron with normal quark numbers, the yield of an exotic hadron is typically an order of magnitude smaller when it is a compact multiquark state and a factor of 2 or more larger when it is a loosely bound hadronic molecule. We further find that some of the newly proposed heavy exotic states could be produced and realistically measured in these experiments.

Relevance: 20.00%

Abstract:

The energy spectrum of an electron confined in a quantum dot (QD) with a three-dimensional anisotropic parabolic potential in a tilted magnetic field was found analytically. The theory describes exactly the mixing of in-plane and out-of-plane motions of an electron caused by a tilted magnetic field, which could be seen, for example, in the level anticrossing. For charged QDs in a tilted magnetic field we predict three strong resonant lines in the far-infrared-absorption spectra.
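
As background (our notation and gauge choice, not quoted from the paper), the effective-mass Hamiltonian usually assumed for an electron in a three-dimensional anisotropic parabolic dot in a tilted magnetic field is

```latex
% standard effective-mass model; \theta is the tilt angle that mixes in-plane and
% out-of-plane motion through the vector potential A
H = \frac{(\mathbf{p} + e\mathbf{A})^{2}}{2m^{*}}
  + \frac{m^{*}}{2}\left(\omega_{x}^{2}x^{2} + \omega_{y}^{2}y^{2} + \omega_{z}^{2}z^{2}\right),
\qquad
\mathbf{A} = \tfrac{1}{2}\,\mathbf{B}\times\mathbf{r},
\quad
\mathbf{B} = B(\sin\theta,\,0,\,\cos\theta).
```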

Relevance: 20.00%

Abstract:

We solve the operator ordering problem for the quantum continuous integrable su(1,1) Landau-Lifshitz model, and give a prescription to obtain the quantum trace identities, and the spectrum for the higher-order local charges. We also show that this method, based on operator regularization and renormalization, which guarantees quantum integrability, as well as the construction of self-adjoint extensions, can be used as an alternative to the discretization procedure, and unlike the latter, is based only on integrable representations. (C) 2010 American Institute of Physics. [doi:10.1063/1.3509374]

Relevance: 20.00%

Abstract:

We investigate the quantum integrability of the Landau-Lifshitz (LL) model and solve the long-standing problem of finding the local quantum Hamiltonian for the arbitrary n-particle sector. The particular difficulty of the LL model quantization, which arises due to the ill-defined operator product, is dealt with by simultaneously regularizing the operator product and constructing self-adjoint extensions of a very particular structure. The diagonalizability difficulties of the Hamiltonian of the LL model, due to the highly singular nature of the quantum-mechanical Hamiltonian, are also resolved in our method for the arbitrary n-particle sector. We explicitly demonstrate the consistency of our construction with the quantum inverse scattering method due to Sklyanin [Lett. Math. Phys. 15, 357 (1988)] and give a prescription to systematically construct the general solution, which explains and generalizes the puzzling results of Sklyanin for the particular two-particle sector case. Moreover, we demonstrate the S-matrix factorization and show that it is a consequence of the discontinuity conditions on the functions involved in the construction of the self-adjoint extensions.
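
As background only (standard material, not quoted from the paper), the classical equation behind the su(2) Landau-Lifshitz model is the precession equation below; the su(1,1) model treated here is its non-compact analogue, with the constraint and the cross product taken with respect to an indefinite metric.

```latex
% classical isotropic Landau-Lifshitz equation (su(2) case), stated as background only
\partial_{t}\mathbf{S} = \mathbf{S}\times\partial_{x}^{2}\mathbf{S},
\qquad \mathbf{S}\cdot\mathbf{S} = 1 .
```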

Relevance: 20.00%

Abstract:

High-resolution synchrotron x-ray diffraction measurements were performed on single-crystalline and powder samples of BiMn2O5. A linear temperature dependence of the unit cell volume was found between T_N = 38 K and 100 K, suggesting that a low-energy lattice excitation may be responsible for the lattice expansion in this temperature range. Between T* ~ 65 K and T_N, all lattice parameters showed incipient magnetoelastic effects due to short-range spin correlations. An anisotropic strain along the a direction was also observed below T*. Below T_N, a relatively large contraction of the a parameter following the square of the average sublattice magnetization of Mn was found, indicating that a second-order spin Hamiltonian accounts for the magnetic interactions along this direction. On the other hand, the more complex behaviors found for b and c suggest additional magnetic transitions below T_N and perhaps higher-order terms in the spin Hamiltonian. Polycrystalline samples grown by distinct routes and with nearly homogeneous crystal structure above T_N presented structural phase coexistence below T_N, indicating a close competition among distinct magnetostructural states in this compound.
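
The claim that the a parameter tracks the square of the sublattice magnetization can be summarized as a simple exchange-striction relation (our paraphrase, with symbols of our own choosing): a bilinear spin Hamiltonian produces a strain proportional to the spin-spin correlation, which below T_N scales as the squared order parameter.

```latex
% our paraphrase of the abstract's statement, not an equation from the paper
\frac{\Delta a(T)}{a_{0}} \;\propto\; \langle M_{\mathrm{Mn}}(T)\rangle^{2},
\qquad
\mathcal{H}_{\mathrm{spin}} = \sum_{\langle ij\rangle} J_{ij}\,\mathbf{S}_{i}\cdot\mathbf{S}_{j}.
```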

Relevance: 20.00%

Abstract:

We investigate the performance of a variant of Axelrod's model for the dissemination of culture, the Adaptive Culture Heuristic (ACH), on solving an NP-complete optimization problem, namely, the classification of binary input patterns of size F by a Boolean binary perceptron. In this heuristic, N agents, characterized by binary strings of length F which represent possible solutions to the optimization problem, are fixed at the sites of a square lattice and interact with their nearest neighbors only. The interactions are such that the agents' strings (or cultures) become more similar to the low-cost strings of their neighbors, resulting in the dissemination of these strings across the lattice. Eventually the dynamics freezes into a homogeneous absorbing configuration in which all agents exhibit identical solutions to the optimization problem. We find through extensive simulations that the probability of finding the optimal solution is a function of the reduced variable F/N^(1/4), so that the number of agents must increase with the fourth power of the problem size, N ∝ F^4, to guarantee a fixed probability of success. In this case, we find that the relaxation time to reach an absorbing configuration scales as F^6, which can be interpreted as the overall computational cost of the ACH to find an optimal set of weights for a Boolean binary perceptron, given a fixed probability of success.
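
A minimal toy sketch of this lattice dynamics is given below. The lattice size, cost function, and copy rule are our own illustrative choices, not the authors' exact specification.

```python
"""Toy sketch of an Adaptive-Culture-Heuristic-style search, loosely following the
abstract: agents on a periodic square lattice copy traits from better neighbours."""
import numpy as np

rng = np.random.default_rng(0)
F = 16                      # string (weight-vector) length
L = 6                       # lattice side, N = L*L agents
P = 2 * F                   # number of random +/-1 patterns to classify
patterns = rng.integers(0, 2, size=(P, F)) * 2 - 1
targets = rng.integers(0, 2, size=P) * 2 - 1

def cost(w):
    """Number of patterns misclassified by a sign (Boolean binary) perceptron."""
    out = np.sign(patterns @ w)
    out[out == 0] = 1
    return int(np.sum(out != targets))

agents = rng.integers(0, 2, size=(L, L, F)) * 2 - 1   # one +/-1 string per lattice site

def neighbours(i, j):
    return [((i + 1) % L, j), ((i - 1) % L, j), (i, (j + 1) % L), (i, (j - 1) % L)]

for step in range(20000):   # fixed budget here; the real dynamics runs until an absorbing state
    i, j = rng.integers(0, L, size=2)
    ni, nj = neighbours(i, j)[rng.integers(0, 4)]
    a, b = agents[i, j], agents[ni, nj]
    diff = np.flatnonzero(a != b)
    # copy one differing trait from a neighbour whose solution is at least as good
    if diff.size > 0 and cost(b) <= cost(a):
        k = rng.choice(diff)
        a[k] = b[k]

print("best cost on the lattice:", min(cost(agents[x, y]) for x in range(L) for y in range(L)))
```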

Relevance: 20.00%

Abstract:

An exciting unsolved problem in the study of high-energy processes of early-type stars concerns the physical mechanism for producing X-rays near the Be star gamma Cassiopeiae. By now we know that this source and several "gamma Cas analogs" exhibit an unusually hard thermal X-ray spectrum, compared both to normal massive stars and to the non-thermal emission of known Be/X-ray binaries. Also, its light curve is variable on almost all conceivable timescales. In this study we reanalyze a high-dispersion spectrum obtained by Chandra in 2001 and combine it with the analysis of a new (2004) spectrum and light curve obtained by XMM-Newton. We find that both spectra can be fit well with 3-4 optically thin thermal components consisting of a hot component having a temperature kT_Q ~ 12-14 keV, perhaps one with a value of ~2.4 keV, and two with well-defined values near 0.6 keV and 0.11 keV. We argue that these components arise in discrete (almost monothermal) plasmas. Moreover, they cannot be produced within an integral gas structure or by the cooling of a dominant hot process. Consistent with earlier findings, we also find that the Fe abundance arising from K-shell ions is significantly subsolar and less than the Fe abundance from L-shell ions. We also find novel properties not present in the earlier Chandra spectrum, including a dramatic decrease in the local photoelectric absorption of soft X-rays, a decrease in the strength of the Fe and possibly of the Si K fluorescence features, underpredicted lines in two ions each of Ne and N (suggesting abundances of ~1.5-3x and ~4x solar, respectively), and broadening of the strong Ne X Ly alpha and O VIII Ly alpha lines. In addition, we note certain traits in the gamma Cas spectrum that are different from those of the fairly well studied analog HD 110432; in this sense the stars have different "personalities." In particular, for gamma Cas the hot X-ray component remains nearly constant in temperature, and the photoelectric absorption of the X-ray plasmas can change dramatically. As found by previous investigators of gamma Cas, changes in flux, whether occurring slowly or in rapidly evolving flares, are only seldom accompanied by variations in hardness. Moreover, the light curve can show a "periodicity" that is due to the presence of flux minima that recur semiregularly over a few hours and which can appear again at different epochs.

Relevance: 20.00%

Abstract:

Efficient automatic protein classification is of central importance in genomic annotation. As an independent way to check the reliability of the classification, we propose a statistical approach to test whether two sets of protein domain sequences coming from two families of the Pfam database are significantly different. We model protein sequences as realizations of Variable Length Markov Chains (VLMC) and we use the context trees as a signature of each protein family. Our approach is based on a Kolmogorov-Smirnov-type goodness-of-fit test proposed by Balding et al. [Limit theorems for sequences of random trees (2008), DOI: 10.1007/s11749-008-0092-z]. The test statistic is a supremum, over the space of trees, of a function of the two samples; its computation grows, in principle, exponentially fast with the maximal number of nodes of the potential trees. We show how to transform this problem into a max-flow problem over a related graph, which can be solved using a Ford-Fulkerson algorithm in time polynomial in that number. We apply the test to 10 randomly chosen protein domain families from the seed of the Pfam-A database (high-quality, manually curated families). The test shows that the distributions of context trees coming from different families are significantly different. We emphasize that this is a novel mathematical approach to validate the automatic clustering of sequences in any context. We also study the performance of the test via simulations on Galton-Watson related processes.
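
The abstract does not spell out the graph construction, so the sketch below only illustrates the max-flow machinery it invokes: a compact Edmonds-Karp (BFS-based Ford-Fulkerson) solver run on a toy graph of our own, not on the graph built from the two context-tree samples.

```python
"""Illustrative Edmonds-Karp max-flow solver; the toy graph is ours, not the paper's."""
from collections import deque

def max_flow(capacity, source, sink):
    """capacity: dict of dicts, capacity[u][v] = edge capacity; returns the max-flow value."""
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():            # add reverse edges with zero capacity
        for v in nbrs:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        parent, queue = {source: None}, deque([source])
        while queue and sink not in parent:     # BFS for a shortest augmenting path
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        bottleneck, v = float("inf"), sink      # bottleneck capacity along the path
        while parent[v] is not None:
            bottleneck = min(bottleneck, residual[parent[v]][v])
            v = parent[v]
        v = sink                                # push the flow and update residual capacities
        while parent[v] is not None:
            u = parent[v]
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
            v = u
        flow += bottleneck

caps = {"s": {"a": 3, "b": 2}, "a": {"b": 1, "t": 2}, "b": {"t": 3}, "t": {}}
print(max_flow(caps, "s", "t"))                 # 5 on this toy graph
```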

Relevance: 20.00%

Abstract:

The width of a closed convex subset of n-dimensional Euclidean space is the distance between two parallel supporting hyperplanes. The Blaschke-Lebesgue problem consists of minimizing the volume in the class of convex sets of fixed constant width and is still open in dimension n >= 3. In this paper we describe a necessary condition that the minimizer of the Blaschke-Lebesgue problem must satisfy in dimension n = 3: we prove that the smooth components of the boundary of the minimizer have their smaller principal curvature constant and therefore are either spherical caps or pieces of tubes (canal surfaces).
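
For comparison with the open three-dimensional case, the planar problem is classical (background, not restated in the abstract): among plane convex bodies of constant width w, the Reuleaux triangle minimizes the area, with

```latex
A_{\min} = \tfrac{1}{2}\left(\pi - \sqrt{3}\right) w^{2} \approx 0.7048\, w^{2}.
```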

Relevance: 20.00%

Abstract:

The enzymatic kinetic resolution of tert-butyl 2-(1-hydroxyethyl) phenylcarbamate via lipase-catalyzed transesterification reaction was studied. We investigated several reaction conditions and the carbamate was resolved by Candida antarctica lipase B (CAL-B), leading to the optically pure (R)- and (S)-enantiomers. The enzymatic process showed excellent enantioselectivity (E > 200). (R)- and (S)-tert-butyl 2-(1-hydroxyethyl) phenylcarbamate were easily transformed into the corresponding (R)- and (S)-1-(2-aminophenyl)ethanols.
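
For reference, an enantiomeric ratio such as the E > 200 quoted here is conventionally computed from conversion and enantiomeric excess with the Chen et al. (1982) expressions; the sketch below is generic, and its numbers are placeholders rather than data from this paper.

```python
"""Enantiomeric ratio E for an irreversible kinetic resolution (Chen et al., 1982)."""
import math

def E_from_substrate(c, ee_s):
    """E from conversion c (0..1) and ee of the remaining substrate (0..1)."""
    return math.log((1 - c) * (1 - ee_s)) / math.log((1 - c) * (1 + ee_s))

def E_from_product(c, ee_p):
    """E from conversion c (0..1) and ee of the product (0..1)."""
    return math.log(1 - c * (1 + ee_p)) / math.log(1 - c * (1 - ee_p))

# placeholder example: 50% conversion with 99% ee of the product gives E of roughly 1000
print(round(E_from_product(0.50, 0.99)))
```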

Relevance: 20.00%

Abstract:

Large-scale enzymatic resolution of racemic sulcatol 2 has been useful for stereoselective biocatalysis. The reaction was fast and selective, using vinyl acetate as the acyl donor and lipase from Candida antarctica (CALB) as the catalyst. The large-scale reaction (5.0 g, 39 mmol) afforded S-(+)-sulcatol 2 and R-(+)-sulcatyl acetate 3 in high optical purity (ee > 99 per cent) and good yield (45 per cent) within a short time (40 min). Thermodynamic parameters for the chemoesterification of sulcatol 2 by vinyl acetate were evaluated. The enthalpy and Gibbs free energy of this reaction were negative, indicating that the process is exothermic and spontaneous, in agreement with the reaction obtained enzymatically.
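
The sign conventions invoked in the last sentence are the usual ones (standard thermodynamics, not specific to this paper):

```latex
\Delta G = \Delta H - T\,\Delta S,
\qquad \Delta H < 0 \;\Rightarrow\; \text{exothermic},
\qquad \Delta G < 0 \;\Rightarrow\; \text{spontaneous}.
```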

Relevance: 20.00%

Abstract:

The first problem of the Seleucid mathematical cuneiform tablet BM 34 568 calculates the diagonal of a rectangle from its sides without resorting to the Pythagorean rule. For this reason, it has been a source of discussion among specialists ever since its first publication, but so far no consensus about its mathematical meaning has been reached. This paper presents two new interpretations of the scribe's procedure, based on the assumption that he was able to reduce the problem to a standard Mesopotamian question about reciprocal numbers. These new interpretations are then linked to interpretations of the Old Babylonian tablet Plimpton 322 and to the presence of Pythagorean triples in the contexts of Old Babylonian and Hellenistic mathematics. (C) 2007 Elsevier Inc. All rights reserved.

Relevance: 20.00%

Abstract:

We consider a class of two-dimensional problems in classical linear elasticity for which material overlapping occurs in the absence of singularities. Of course, material overlapping is not physically realistic, and one possible way to prevent it uses a constrained minimization theory. In this theory, a minimization problem consists of minimizing the total potential energy of a linear elastic body subject to the constraint that the deformation field must be locally invertible. Here, we use an interior and an exterior penalty formulation of the minimization problem together with both a standard finite element method and classical nonlinear programming techniques to compute the minimizers. We compare both formulations by solving a plane problem numerically in the context of the constrained minimization theory. The problem has a closed-form solution, which is used to validate the numerical results. This solution is regular everywhere, including the boundary. In particular, we show numerical results which indicate that, for a fixed finite element mesh, the sequences of numerical solutions obtained with both the interior and the exterior penalty formulations converge to the same limit function as the penalization is enforced. This limit function yields an approximate deformation field to the plane problem that is locally invertible at all points in the domain. As the mesh is refined, this field converges to the exact solution of the plane problem.
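
The interior/exterior penalty idea can be illustrated on a generic toy problem, min f(x) subject to g(x) >= 0; the sketch below is our own example, not the paper's elastic energy functional or its local-invertibility constraint.

```python
"""Exterior vs. interior penalty formulations on a toy constrained minimum (generic example)."""
import numpy as np
from scipy.optimize import minimize

f = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2    # objective
g = lambda x: 1.0 - x[0] - x[1]                         # constraint g(x) >= 0

def exterior(x, mu):
    # penalizes violations only; iterates may leave the feasible set
    return f(x) + mu * min(0.0, g(x)) ** 2

def interior(x, mu):
    # log-barrier blows up at the boundary; iterates stay strictly feasible
    gx = g(x)
    return f(x) - mu * np.log(gx) if gx > 0 else np.inf

x_ext, x_int = np.array([0.0, 0.0]), np.array([0.0, 0.0])
for mu_ext, mu_int in [(1e1, 1e-1), (1e3, 1e-3), (1e5, 1e-5)]:   # enforce the penalties gradually
    x_ext = minimize(exterior, x_ext, args=(mu_ext,), method="Nelder-Mead").x
    x_int = minimize(interior, x_int, args=(mu_int,), method="Nelder-Mead").x

# both sequences approach the constrained minimizer (1, 0) as the penalization is enforced
print(x_ext, x_int)
```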

Relevance: 20.00%

Abstract:

This paper addresses the time-variant reliability analysis of structures with random resistance or random system parameters. It deals with the problem of a random load process crossing a random barrier level. The implications of approximating the arrival rate of the first overload by an ensemble-crossing rate are studied. The error involved in this so-called "ensemble-crossing rate" approximation is described in terms of load process and barrier distribution parameters, and in terms of the number of load cycles. Existing results are reviewed, and significant improvements involving load process bandwidth, mean-crossing frequency and time are presented. The paper shows that the ensemble-crossing rate approximation can be accurate enough for problems where load process variance is large in comparison to barrier variance, but especially when the number of load cycles is small. This includes important practical applications like random vibration due to impact loadings and earthquake loading. Two application examples are presented, one involving earthquake loading and one involving a frame structure subject to wind and snow loadings. (C) 2007 Elsevier Ltd. All rights reserved.
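
The nature of the approximation can be seen in a toy Monte Carlo comparison (our own illustration with arbitrary Gaussian load peaks and barrier, not the paper's examples): the ensemble approach averages the per-cycle crossing probability over the barrier distribution before compounding over the load cycles, whereas the reference computation conditions on the barrier first.

```python
"""Ensemble-crossing approximation vs. a barrier-conditioned reference (toy Gaussian example)."""
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n_cycles = 50
mu_S, sig_S = 1.0, 0.3                       # independent load peaks S_i
mu_R, sig_R = 2.0, 0.4                       # random barrier (resistance) R

p_cross = lambda r: norm.sf(r, loc=mu_S, scale=sig_S)   # P(S > r) for a given barrier level r
r_samples = rng.normal(mu_R, sig_R, size=200_000)

# reference: condition on the barrier, then compound over the n load cycles
pf_conditional = np.mean(1.0 - (1.0 - p_cross(r_samples)) ** n_cycles)

# ensemble-crossing approximation: average the per-cycle rate first, then compound
p_ens = np.mean(p_cross(r_samples))
pf_ensemble = 1.0 - (1.0 - p_ens) ** n_cycles

print(f"conditional: {pf_conditional:.4f}   ensemble approximation: {pf_ensemble:.4f}")
```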

Relevance: 20.00%

Abstract:

This paper presents a proposal for a Quality Management System for a generic GNSS surveying company as an alternative for improving management and service quality. As a result of the increased demand for GNSS measurements, a large number of new or restructured companies were established to operate in that market. Because GNSS surveying is a new process, some changes must be made in order to adapt the old surveying techniques and the old-fashioned management practices to the new reality. This requires a new management model based on a well-described sequence of procedures aiming at Total Quality Management for the company. The proposed Quality Management System was based on the requirements of the ISO 9000:2000 quality standard, applied to the whole company and focusing on the productive process of GNSS surveying work.