953 results for palindromic polynomial


Relevance: 10.00%

Abstract:

We introduce and analyze hp-version discontinuous Galerkin (dG) finite element methods for the numerical approximation of linear second-order elliptic boundary-value problems in three-dimensional polyhedral domains. To resolve possible corner-, edge- and corner-edge singularities, we consider hexahedral meshes that are geometrically and anisotropically refined toward the corresponding neighborhoods. Similarly, the local polynomial degrees are increased linearly and possibly anisotropically away from singularities. We design interior penalty hp-dG methods and prove that they are well-defined for problems with singular solutions and stable under the proposed hp-refinements. We establish (abstract) error bounds that will allow us to prove exponential rates of convergence in the second part of this work.
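The refinement strategy described above can be illustrated in one dimension. The sketch below is our own illustration, not from the paper (which works on 3D hexahedral meshes): it builds a σ-geometric mesh graded toward a singularity at x = 0 and assigns linearly increasing polynomial degrees away from it. The function names and default values are assumptions.

```python
# Illustrative 1D analogue of hp-refinement toward a point singularity
# at x = 0: sigma-geometric mesh grading plus linearly increasing
# polynomial degrees. Function names and defaults are ours.

def geometric_mesh(sigma=0.5, levels=5):
    """Nodes 0 < sigma^levels < ... < sigma < 1, graded toward x = 0."""
    return [0.0] + [sigma**k for k in range(levels, -1, -1)]

def linear_degrees(levels=5, slope=1):
    """Degree slope*(j+1) on the j-th element away from the singularity."""
    return [max(1, slope * (j + 1)) for j in range(levels + 1)]

mesh = geometric_mesh(0.5, 4)   # elements shrink geometrically toward 0
degs = linear_degrees(4, 1)     # degrees grow linearly away from 0
```

Each element between consecutive nodes shrinks by the factor σ toward the singularity, while its local polynomial degree grows linearly; this pairing is what drives the exponential convergence analyzed in the second part of the work.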

Relevance: 10.00%

Abstract:

The goal of this paper is to establish exponential convergence of $hp$-version interior penalty (IP) discontinuous Galerkin (dG) finite element methods for the numerical approximation of linear second-order elliptic boundary-value problems with homogeneous Dirichlet boundary conditions and piecewise analytic data in three-dimensional polyhedral domains. More precisely, we shall analyze the convergence of the $hp$-IP dG methods considered in [D. Schötzau, C. Schwab, T. P. Wihler, SIAM J. Numer. Anal., 51 (2013), pp. 1610--1633] based on axiparallel $\sigma$-geometric anisotropic meshes and $\bm{s}$-linear anisotropic polynomial degree distributions.

Relevance: 10.00%

Abstract:

A new anisotropic elastic-viscoplastic damage constitutive model for bone is proposed using an eccentric elliptical yield criterion and nonlinear isotropic hardening. A micromechanics-based multiscale homogenization scheme proposed by Reisinger et al. is used to obtain the effective elastic properties of lamellar bone. The dissipative process in bone is modeled as viscoplastic deformation coupled to damage. The model is based on an orthotropic eccentric elliptical criterion in stress space. In order to simplify material identification, an eccentric elliptical isotropic yield surface is defined in strain space and transformed to a stress-based criterion by means of the damaged compliance tensor. Viscoplasticity is implemented by means of the continuous Perzyna formulation. Damage is modeled by a scalar function of the accumulated plastic strain, D(κ), which reduces all elements of the stiffness matrix. A polynomial flow rule is proposed in order to capture the rate-dependent post-yield behavior of lamellar bone. A numerical algorithm to perform the back projection onto the rate-dependent yield surface has been developed and implemented in the commercial finite element solver Abaqus/Standard as a user subroutine UMAT. A consistent tangent operator has been derived and implemented in order to ensure quadratic convergence. Correct implementation of the algorithm, convergence, and accuracy of the tangent operator were tested by means of strain- and stress-based single-element tests. A finite element simulation of nanoindentation in lamellar bone was finally performed in order to demonstrate the capabilities of the newly developed constitutive model.
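As a rough illustration of the continuous Perzyna formulation mentioned above, the following sketch implements an explicit 1D overstress update under a prescribed strain history. It omits damage, hardening, anisotropy, and the paper's polynomial flow rule, and every parameter value is a hypothetical placeholder.

```python
# Explicit 1D Perzyna-type viscoplastic update (hypothetical parameters;
# no damage, hardening, or anisotropy).

def perzyna_1d(strain_history, E=10e3, sigma_y=100.0, eta=1e3, dt=1e-3):
    """Return the stress response to a prescribed strain history.
    Plastic flow rate is proportional to the overstress f = |sigma| - sigma_y."""
    eps_p = 0.0
    stresses = []
    for eps in strain_history:
        sigma = E * (eps - eps_p)          # elastic trial stress
        f = abs(sigma) - sigma_y           # overstress
        if f > 0.0:
            eps_p += dt * (f / eta) * (1.0 if sigma > 0 else -1.0)
            sigma = E * (eps - eps_p)
        stresses.append(sigma)
    return stresses

# Monotonic strain ramp: the response is elastic up to sigma_y, after
# which rate-dependent flow limits (but does not cap) the stress.
stresses = perzyna_1d([0.05 * i / 100 for i in range(1, 101)])
```

Unlike rate-independent plasticity, the stress may exceed the static yield stress by an overstress that depends on the loading rate, which is the behavior the paper's rate-dependent yield surface captures.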

Relevance: 10.00%

Abstract:

Two homosexual men were colonized in the urethra with Haemophilus parainfluenzae nonsusceptible to ampicillin (MIC, 8 μg/ml), amoxicillin-clavulanate (MIC, 4 μg/ml), cefotaxime (MIC, 1.5 μg/ml), cefepime (MIC, 3 μg/ml), meropenem (MIC, 0.5 μg/ml), cefuroxime, azithromycin, ciprofloxacin, tetracycline, and chloramphenicol (all MICs, ≥ 32 μg/ml). Repetitive extragenic palindromic PCR (rep-PCR) showed that the strains were indistinguishable. The isolates had amino acid substitutions in PBP3, L4, GyrA, and ParC and possessed Mef(A), Tet(M), and CatS resistance mechanisms. This is the first report of extensively drug-resistant (XDR) H. parainfluenzae.

Relevance: 10.00%

Abstract:

Proof nets provide abstract counterparts to sequent proofs modulo rule permutations; the idea being that if two proofs have the same underlying proof-net, they are in essence the same proof. Providing a convincing proof-net counterpart to proofs in the classical sequent calculus is thus an important step in understanding classical sequent calculus proofs. By convincing, we mean that (a) there should be a canonical function from sequent proofs to proof nets, (b) it should be possible to check the correctness of a net in polynomial time, (c) every correct net should be obtainable from a sequent calculus proof, and (d) there should be a cut-elimination procedure which preserves correctness. Previous attempts to give proof-net-like objects for propositional classical logic have failed at least one of the above conditions. In Richard McKinley (2010) [22], the author presented a calculus of proof nets (expansion nets) satisfying (a) and (b); the paper defined a sequent calculus corresponding to expansion nets but gave no explicit demonstration of (c). That sequent calculus, called LK∗ in this paper, is a novel one-sided sequent calculus with both additively and multiplicatively formulated disjunction rules. In this paper (a self-contained extended version of Richard McKinley (2010) [22]), we give a full proof of (c) for expansion nets with respect to LK∗, and in addition give a cut-elimination procedure internal to expansion nets – this makes expansion nets the first notion of proof-net for classical logic satisfying all four criteria.

Relevance: 10.00%

Abstract:

We consider the problem of fitting a union of subspaces to a collection of data points drawn from one or more subspaces and corrupted by noise and/or gross errors. We pose this problem as a non-convex optimization problem, where the goal is to decompose the corrupted data matrix as the sum of a clean and self-expressive dictionary plus a matrix of noise and/or gross errors. By self-expressive we mean a dictionary whose atoms can be expressed as linear combinations of themselves with low-rank coefficients. In the case of noisy data, our key contribution is to show that this non-convex matrix decomposition problem can be solved in closed form from the SVD of the noisy data matrix. The solution involves a novel polynomial thresholding operator on the singular values of the data matrix, which requires minimal shrinkage. For one subspace, a particular case of our framework leads to classical PCA, which requires no shrinkage. For multiple subspaces, the low-rank coefficients obtained by our framework can be used to construct a data affinity matrix from which the clustering of the data according to the subspaces can be obtained by spectral clustering. In the case of data corrupted by gross errors, we solve the problem using an alternating minimization approach, which combines our polynomial thresholding operator with the more traditional shrinkage-thresholding operator. Experiments on motion segmentation and face clustering show that our framework performs on par with state-of-the-art techniques at a reduced computational cost.
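The role of the thresholding step can be illustrated as follows. This sketch uses plain soft thresholding of the singular values as a stand-in; the paper derives a specific polynomial thresholding operator with minimal shrinkage, which is not reproduced here.

```python
import numpy as np

def svd_threshold(X, tau):
    """Shrink the singular values of X and reconstruct. Plain soft
    thresholding is used here as a stand-in for the paper's polynomial
    thresholding operator (which shrinks less aggressively)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return U @ np.diag(s_shrunk) @ Vt

# Thresholding at tau = 2 removes the weaker component entirely.
clean = svd_threshold(np.diag([3.0, 1.0]), 2.0)
```

Whatever operator is applied, the key point from the paper is that the non-convex decomposition is solved in closed form by acting only on the singular values of the noisy data matrix.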

Relevance: 10.00%

Abstract:

OBJECTIVE: To determine whether algorithms developed for the World Wide Web can be applied to the biomedical literature in order to identify articles that are important as well as relevant. DESIGN AND MEASUREMENTS: A direct comparison of eight algorithms: simple PubMed queries, clinical queries (sensitive and specific versions), vector cosine comparison, citation count, journal impact factor, PageRank, and machine learning based on polynomial support vector machines. The objective was to prioritize important articles, defined as being included in a pre-existing bibliography of important literature in surgical oncology. RESULTS: Citation-based algorithms were more effective than noncitation-based algorithms at identifying important articles. The most effective strategies were simple citation count and PageRank, which on average identified over six important articles in the first 100 results compared to 0.85 for the best noncitation-based algorithm (p < 0.001). The authors saw similar differences between citation-based and noncitation-based algorithms at 10, 20, 50, 200, 500, and 1,000 results (p < 0.001). Citation lag affects performance of PageRank more than simple citation count. However, in spite of citation lag, citation-based algorithms remain more effective than noncitation-based algorithms. CONCLUSION: Algorithms that have proved successful on the World Wide Web can be applied to biomedical information retrieval. Citation-based algorithms can help identify important articles within large sets of relevant results. Further studies are needed to determine whether citation-based algorithms can effectively meet actual user information needs.
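A minimal sketch of the PageRank computation used by such citation-based rankings, assuming a toy citation adjacency matrix (the article's actual corpus and parameter choices are not reproduced here):

```python
import numpy as np

def pagerank(adj, damping=0.85, iters=100):
    """Power-iteration PageRank on a citation graph.
    adj[i][j] = 1 if article i cites article j."""
    A = np.asarray(adj, dtype=float)
    n = A.shape[0]
    out = A.sum(axis=1)
    # Rows with no outgoing citations distribute their mass uniformly.
    P = np.where(out[:, None] > 0, A / np.maximum(out, 1.0)[:, None], 1.0 / n)
    r = np.full(n, 1.0 / n)
    for _ in range(iters):
        r = (1.0 - damping) / n + damping * (P.T @ r)
    return r

# Article 0 cites 1, article 1 cites 2: the most-cited article ranks highest.
ranks = pagerank([[0, 1, 0],
                  [0, 0, 1],
                  [0, 0, 0]])
```

Because rank flows along citation edges, an article cited by highly ranked articles gains rank itself, which is what distinguishes PageRank from raw citation count; both, per the study, outperform noncitation-based retrieval.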

Relevance: 10.00%

Abstract:

An integrated approach for multi-spectral segmentation of MR images is presented. The method is based on fuzzy c-means (FCM) clustering and includes bias field correction, contextual constraints on the spatial intensity distribution, and a treatment of non-spherical cluster shapes in the feature space. The bias field is modeled as a linear combination of smooth polynomial basis functions for fast computation in the clustering iterations. Regularization terms for the neighborhood continuity of intensity are added to the FCM cost functions. To reduce the computational complexity, the contextual regularizations are separated from the clustering iterations. Since the feature space is not isotropic, the distance measure adopted in the Gustafson-Kessel (G-K) algorithm is used instead of the Euclidean distance to account for the non-spherical shape of the clusters in the feature space. These algorithms are quantitatively evaluated on MR brain images using similarity measures.
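For orientation, the base FCM scheme that the method extends can be sketched as follows. This is plain fuzzy c-means with Euclidean distances; it omits the bias-field, contextual-regularization, and G-K distance terms that the paper adds on top.

```python
import numpy as np

def fcm(X, c=2, m=2.0, iters=50, seed=0):
    """Plain fuzzy c-means: returns cluster centers and the fuzzy
    membership matrix U (n points x c clusters)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m                                    # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]  # weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = d ** (-2.0 / (m - 1.0))                   # membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Two well-separated 1D intensity clusters.
X = np.array([[0.0], [0.1], [10.0], [10.1]])
centers, U = fcm(X, c=2)
```

The paper's contributions enter this loop as extra terms: a polynomial bias-field estimate multiplying the intensities, neighborhood regularizers in the cost function, and a Mahalanobis-type distance replacing the Euclidean one.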

Relevance: 10.00%

Abstract:

Expressing emotions has social functions; it provides information, affects social interactions, and shapes relationships with others. Expressing positive emotions could be a strategic tool for improving goal attainment during social interactions at work. Such effects have been found in research on social contagion, impression management, and emotion work. However, expressing emotions one does not feel entails the risk of being perceived as inauthentic. This risk may well be worth taking when the emotions felt are negative, as expressing negative emotions usually has negative effects. When experiencing positive emotions, however, expressing them authentically promises benefits, and the advantage of amplifying them is not so obvious. We postulated that expressing, and amplifying, positive emotions would foster goal attainment in social interactions at work, particularly when dealing with superiors. Analyses are based on 494 interactions involving the pursuit of a goal by 113 employees. Multilevel analyses, including polynomial analyses, show that authentic display of positive emotions supported goal attainment throughout. However, amplifying felt positive emotions promoted goal attainment only in interactions with superiors, not with colleagues. Results are discussed with regard to the importance of hierarchy for detecting, and interpreting, signs of strategic display of positive emotions.

Relevance: 10.00%

Abstract:

Intensity non-uniformity (bias field) correction, contextual constraints over the spatial intensity distribution, and non-spherical cluster shapes in the feature space are incorporated into the fuzzy c-means (FCM) algorithm for segmentation of three-dimensional multi-spectral MR images. The bias field is modeled by a linear combination of smooth polynomial basis functions for fast computation in the clustering iterations. Regularization terms for the neighborhood continuity of either intensity or membership are added to the FCM cost functions. Since the feature space is not isotropic, distance measures other than the Euclidean distance are used to account for the shape and volumetric effects of clusters in the feature space. The performance of segmentation is improved by combining the adaptive FCM scheme with the criteria used in the Gustafson-Kessel (G-K) and Gath-Geva (G-G) algorithms through the inclusion of the cluster scatter measure. The performance of this integrated approach is quantitatively evaluated on normal MR brain images using similarity measures. The improvement in the quality of segmentation obtained with our method is also demonstrated by comparing our results with those produced by FSL (FMRIB Software Library), a software package that is commonly used for tissue classification.

Relevance: 10.00%

Abstract:

Virtual colonoscopy (VC) is a minimally invasive means of identifying colorectal polyps and colorectal lesions by insufflating a patient's bowel, applying contrast agent via rectal catheter, and performing multi-detector computed tomography (MDCT) scans. The technique is recommended for colonic health screening by the American Cancer Society but not funded by the Centers for Medicare and Medicaid Services (CMS), partially because of potential risks from radiation exposure. To date, no in-vivo organ dose measurements have been performed for MDCT scans; thus, the accuracy of current dose estimates is unknown. In this study, two TLDs were affixed to the inner lumen of standard rectal catheters used in VC, and in-vivo rectal dose measurements were obtained in 6 VC patients. In order to calculate rectal dose, TLD-100 powder response was characterized at diagnostic doses such that appropriate correction factors could be determined for VC. A third-order polynomial regression with a goodness of fit of R2=0.992 was constructed from these data. Rectal dose measurements were acquired with TLDs during simulated VC within a modified anthropomorphic phantom configured to represent three sizes of patients undergoing VC. The measured rectal doses decreased exponentially with increasing phantom effective diameter, with R2=0.993 for the exponential regression model and a maximum percent coefficient of variation (%CoV) of 4.33%. In-vivo measurements yielded rectal doses that decreased exponentially with increasing patient effective diameter, in a manner that was also favorably predicted by the size-specific dose estimate (SSDE) model for all VC patients who were of similar age, body composition, and TLD placement.
The measured rectal dose within a younger patient was favorably predicted by the anthropomorphic phantom dose regression model due to similarities in the percentages of highly attenuating material at the respective measurement locations and in the placement of the TLDs. The in-vivo TLD response did not increase in %CoV with decreasing dose, and the largest %CoV was 10.0%.
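The calibration step can be sketched with NumPy's polynomial fitting. The data below are hypothetical placeholders (the abstract does not reproduce the TLD-100 measurements); only the use of a third-order polynomial regression follows the study.

```python
import numpy as np

# Hypothetical TLD calibration data (reading vs. delivered dose, mGy);
# the study's actual TLD-100 measurements are not reproduced here.
reading = np.array([1.1, 2.1, 5.4, 11.2, 23.5, 36.0])
dose = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 30.0])

# Third-order polynomial calibration curve, as in the study.
coeffs = np.polyfit(reading, dose, deg=3)
calib = np.poly1d(coeffs)

# A new TLD reading is converted to dose via the fitted curve.
estimated_dose = calib(12.0)
```

In the study, correction factors derived from such a curve converted raw TLD response into rectal dose for both the phantom and the in-vivo measurements.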

Relevance: 10.00%

Abstract:

The objects of study in this thesis are knots. More precisely, positive braid knots, which include algebraic knots and torus knots. In the first part of this thesis, we compare two classical knot invariants - the genus g and the signature σ - for positive braid knots. Our main result on positive braid knots establishes a linear lower bound for the signature in terms of the genus. In the second part of the thesis, a positive braid approach is applied to the study of the local behavior of polynomial functions from the complex affine plane to the complex numbers. After endowing polynomial function germs with a suitable topology, the adjacency problem arises: for a fixed germ f, what classes of germs g can be found arbitrarily close to f? We introduce two purely topological notions of adjacency for knots and discuss connections to algebraic notions of adjacency and the adjacency problem.

Relevance: 10.00%

Abstract:

We define an applicative theory of truth TPT which proves totality exactly for the polynomial time computable functions. TPT has natural and simple axioms since nearly all its truth axioms are standard for truth theories over an applicative framework. The only exception is the axiom dealing with the word predicate. The truth predicate can only reflect elementhood in the words for terms that have smaller length than a given word. This makes it possible to achieve the very low proof-theoretic strength. Truth induction can be allowed without any constraints. For these reasons the system TPT has the high expressive power one expects from truth theories. It allows embeddings of feasible systems of explicit mathematics and bounded arithmetic. The proof that the theory TPT is feasible is not easy. It is not possible to apply a standard realisation approach. For this reason we develop a new realisation approach whose realisation functions work on directed acyclic graphs. In this way, we can express and manipulate realisation information more efficiently.

Relevance: 10.00%

Abstract:

The spread of antibiotic-resistant bacteria through food has become a major public health concern because some important human pathogens may be transferred via the food chain. Acinetobacter baumannii is one of the most life-threatening gram-negative pathogens; multidrug-resistant (MDR) clones of A. baumannii are spreading worldwide, causing outbreaks in hospitals. However, the role of raw meat as a reservoir of A. baumannii remains unexplored. In this study, we describe for the first time the antibiotic susceptibility and fingerprint (repetitive extragenic palindromic PCR [rep-PCR] profile and sequence types [STs]) of A. baumannii strains found in raw meat retailed in Switzerland. Our results indicate that A. baumannii was present in 62 (25.0%) of 248 (CI 95%: 19.7 to 30.9%) meat samples analyzed between November 2012 and May 2013, with those derived from poultry being the most contaminated (48.0% [CI 95%: 37.8 to 58.3%]). Thirty-nine strains were further tested for antibiotic susceptibility and clonality. Strains were frequently not susceptible (intermediate and/or resistant) to third- and fourth-generation cephalosporins for human use (i.e., ceftriaxone [65%], cefotaxime [32%], ceftazidime [5%], and cefepime [2.5%]). Resistance to piperacillin-tazobactam, ciprofloxacin, colistin, and tetracycline was sporadically observed (2.5, 2.5, 5, and 5%, respectively), whereas resistance to carbapenems was not found. The strains were genetically very diverse and belonged to 29 different STs, forming 12 singletons and 6 clonal complexes (CCs), of which 3 were new (CC277, CC360, and CC347). Rep-PCR analysis further distinguished some strains of the same ST. Moreover, some A. baumannii strains from meat belonged to the clonal complexes CC32 and CC79, similar to the MDR isolates responsible for human infections. In conclusion, our findings suggest that raw meat represents a reservoir of MDR A. baumannii and may serve as a vector for the spread of these pathogens into both community and hospital settings.

Relevance: 10.00%

Abstract:

The sensitivity of the gas flow field to changes in different initial conditions has been studied for the case of a highly simplified cometary nucleus model. The nucleus model simulated a homogeneously outgassing sphere with a more active ring around an axis of symmetry. The varied initial conditions were the number density of the homogeneous region, the surface temperature, and the composition of the flow (varying amounts of H2O and CO2) from the active ring. The sensitivity analysis was performed using the Polynomial Chaos Expansion (PCE) method. Direct Simulation Monte Carlo (DSMC) was used for the flow, thereby allowing strong deviations from local thermal equilibrium. The PCE approach can produce a sensitivity analysis with only four runs per modified input parameter and allows one to study and quantify non-linear responses of measurable parameters to linear changes in the input over a wide range. Hence the PCE yields a functional relationship between the flow field properties at every point in the inner coma and the input conditions. It is shown, for example, that the velocity and the temperature of the background gas are not simply linear functions of the initial number density at the source. As expected, the main influence on each resulting flow field parameter is the corresponding initial parameter (i.e. the initial number density determines the background number density, the temperature of the surface determines the flow field temperature, etc.). However, the velocity of the flow field is also influenced by the surface temperature, while the number density is insensitive to the surface temperature in our model set-up. Another example is the change in the composition of the flow over the active area. Such changes can be seen in the velocity but, again, not in the number density.
Although this study uses only a simple test case, we suggest that the approach, when applied to a real case in 3D, should assist in identifying the sensitivity of gas parameters measured in situ by, for example, the Rosetta spacecraft to the surface boundary conditions and vice versa.
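A minimal one-parameter sketch of the "four runs per input parameter" idea: four model evaluations determine a degree-3 polynomial surrogate, which can then be differentiated for local sensitivities. The toy response function below is hypothetical, standing in for a full DSMC simulation.

```python
import numpy as np
from numpy.polynomial import polynomial as P

def model(n0):
    # Hypothetical nonlinear response (e.g. gas velocity vs. scaled source
    # number density), standing in for a DSMC run.
    return 400.0 + 50.0 * np.log(n0)

samples = np.array([0.5, 1.0, 1.5, 2.0])        # four input settings
runs = np.array([model(x) for x in samples])    # four "simulation" runs

coeffs = P.polyfit(samples, runs, deg=3)        # degree-3 surrogate

def surrogate(x):
    """Predict the response anywhere in the sampled input range."""
    return P.polyval(x, coeffs)

# The surrogate's derivative gives a local sensitivity of the output
# to the input parameter.
sens = P.polyval(1.0, P.polyder(coeffs))
```

Because four points determine a cubic exactly, the surrogate reproduces the four runs and exposes any non-linear dependence of the output on the input, which is the behavior the PCE analysis quantifies across the inner coma.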