45 results for Lipschitz trivial
Abstract:
Fractures in men are a major health issue, and data on the antifracture efficacy of therapies for osteoporosis in men are limited. We studied the effect of zoledronic acid on fracture risk among men with osteoporosis.
Abstract:
A cascading failure is a failure in a system of interconnected parts, in which the breakdown of one element can lead to the subsequent collapse of the others. The aim of this paper is to introduce a simple combinatorial model for the study of cascading failures. In particular, having in mind particle systems and Markov random fields, we consider a network of interacting urns arranged on a lattice. Every urn is Pólya-like, and its reinforcement matrix is a function not only of time (time contagion) but also of the behavior of the neighboring urns (spatial contagion), and of a random component, which can represent either mere chance or the impact of exogenous factors. In this way a non-trivial dependence structure among the urns is built, and it is used to study default avalanches over the lattice. Thanks to its flexibility and its interesting probabilistic properties, this construction may be used to model a variety of phenomena characterized by cascading failures, such as power grids and financial networks.
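A minimal simulation sketch of such a lattice of interacting Pólya-like urns (the reinforcement rule and all parameters below are illustrative assumptions, not the paper's exact model):

```python
# Toy lattice of Polya-like urns with time contagion (own draw),
# spatial contagion (neighbors' draws), and exogenous noise.
# All weights and the update rule are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

L = 20                      # lattice side length
red = np.ones((L, L))       # "failure" balls in each urn
black = np.ones((L, L))     # "healthy" balls in each urn

def step(red, black, spatial=0.5, noise=0.1):
    """One update: each urn draws a ball; its reinforcement mixes its own
    draw, the mean draw of its four neighbors, and a random component."""
    p_fail = red / (red + black)
    draws = rng.random((L, L)) < p_fail
    # mean of the four nearest neighbors, periodic boundaries
    nbrs = (np.roll(draws, 1, 0) + np.roll(draws, -1, 0) +
            np.roll(draws, 1, 1) + np.roll(draws, -1, 1)) / 4.0
    reinforce = (1 - spatial - noise) * draws + spatial * nbrs \
                + noise * rng.random((L, L))
    red += reinforce          # strengthen the "failure" color
    black += 1.0 - reinforce  # and the complementary color
    return red, black

for _ in range(100):
    red, black = step(red, black)

# fraction of urns whose failure probability exceeds 1/2: avalanche size
print((red / (red + black) > 0.5).mean())
```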
Abstract:
The intracellular protozoan parasites Theileria parva and T. annulata transform the cells they infect, inducing uncontrolled proliferation. This is not a trivial event: in addition to permanently switching on the complex pathways that govern all steps of the cell cycle, the built-in apoptotic safety mechanisms that prevent 'illegitimate' cell replication also need to be inactivated. Recent experiments show that the NF-kappa B and phosphoinositide 3-kinase (PtdIns-3K) pathways are important participants in the transformation process. I kappa B kinase (IKK), a pivotal kinase complex in the NF-kappa B pathway, is recruited to the parasite surface, where it becomes activated. The PtdIns-3K/Akt/PKB pathway is also constitutively activated in a parasite-dependent manner but, in contrast to IKK, its activation is probably not triggered by direct association with the parasite.
Abstract:
Software repositories have been getting a lot of attention from researchers in recent years. In order to analyze software repositories, it is necessary to first extract raw data from the version control and problem tracking systems. This poses two challenges: (1) extraction requires a non-trivial effort, and (2) the results depend on the heuristics used during extraction. These challenges burden researchers who are new to the community and make it difficult to benchmark software repository mining, since it is almost impossible to reproduce experiments done by another team. In this paper we present the TA-RE corpus. TA-RE collects data extracted from software repositories in order to build a collection of projects that will simplify the extraction process. Additionally, the collection can be used for benchmarking. As a first step, we propose an exchange language capable of making data sharing and reuse as simple as possible.
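The exchange language itself is not shown in the abstract. Purely as a hypothetical illustration of the kind of record such a language might share (every field name below is an invented assumption, not TA-RE's schema):

```python
# Hypothetical shape of one exchange record for a mined commit.
# TA-RE's actual schema is not given in the abstract; every field
# name here is an illustrative assumption.
import json

record = {
    "project": "example-project",
    "commit": "a3f9c2e",
    "author": "jane@example.org",
    "timestamp": "2006-05-14T10:22:31Z",
    "files": ["src/Parser.java", "test/ParserTest.java"],
    "linked_issues": [1423],          # IDs from the problem tracking system
    "extraction_heuristic": "regex-bugid-v1",  # provenance of the link
}
print(json.dumps(record, indent=2))
```

Recording the extraction heuristic alongside the data is one way such a corpus could address challenge (2), since results that depend on heuristics become comparable only when the heuristic is named.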
Abstract:
Writing unit tests for legacy systems is a key maintenance task. When writing tests for object-oriented programs, objects need to be set up and the expected effects of executing the unit under test need to be verified. If developers lack internal knowledge of a system, the task of writing tests is non-trivial. To address this problem, we propose an approach that exposes side effects detected in example runs of the system and uses these side effects to guide the developer when writing tests. We introduce a visualization called Test Blueprint, through which we identify what the required fixture is and what assertions are needed to verify the correct behavior of a unit under test. The dynamic analysis technique that underlies our approach is based both on tracing method executions and on tracking the flow of objects at runtime. To demonstrate the usefulness of our approach, we present results from two case studies.
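As a rough illustration of how observed side effects can dictate both the fixture and the assertions of a test (the class and its side effects below are invented for the example; the original work targets legacy object-oriented systems):

```python
# Sketch of a test whose fixture and assertions mirror side effects
# observed in an example run. `ShoppingCart` and its behavior are
# hypothetical stand-ins for a unit in a legacy system.
import unittest

class ShoppingCart:
    def __init__(self):
        self.items = []
        self.total = 0.0
    def add(self, name, price):
        self.items.append(name)   # side effect 1: the item list grows
        self.total += price       # side effect 2: the total changes

class CartTest(unittest.TestCase):
    def setUp(self):
        # fixture: recreate the object state seen before the traced call
        self.cart = ShoppingCart()
        self.cart.add("book", 12.0)

    def test_add_records_item_and_updates_total(self):
        self.cart.add("pen", 2.5)
        # assertions: verify exactly the side effects the trace exposed
        self.assertEqual(self.cart.items, ["book", "pen"])
        self.assertAlmostEqual(self.cart.total, 14.5)

if __name__ == "__main__":
    unittest.main()
```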
Abstract:
In this article, we develop the a priori and a posteriori error analysis of hp-version interior penalty discontinuous Galerkin finite element methods for strongly monotone quasi-Newtonian fluid flows in a bounded Lipschitz domain Ω ⊂ ℝ^d, d = 2, 3. For the a posteriori analysis, computable upper and lower bounds on the error are derived in terms of a natural energy norm, which are explicit in the local mesh size and local polynomial degree of the approximating finite element method. A series of numerical experiments illustrates the performance of the proposed a posteriori error indicators within an automatic hp-adaptive refinement algorithm.
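The abstract does not reproduce the bounds themselves; as a generic template only (not the paper's actual estimator), hp-explicit a posteriori bounds in an energy norm typically take a form like:

```latex
% Generic template: reliability (computable upper bound) with an
% estimator that is explicit in the local mesh size and degree.
\[
  \|u - u_{hp}\|_{E}
    \;\lesssim\;
  \Big( \sum_{K \in \mathcal{T}_h} \eta_K^2 \Big)^{1/2},
  \qquad
  \eta_K^2
    = \frac{h_K^2}{p_K^2}\,\|R_K\|_{0,K}^2
    + \frac{h_K}{p_K}\,\|R_{\partial K}\|_{0,\partial K}^2 ,
\]
where $h_K$ and $p_K$ are the local mesh size and polynomial degree of
element $K$, and $R_K$, $R_{\partial K}$ denote element and face residuals.
```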
Abstract:
We discuss non-geometric supersymmetric heterotic string models in D=4, in the framework of the free fermionic construction. We perform a systematic scan of models with four a priori left-right asymmetric Z2 projections and shifts. We analyze some 220 models, identifying 18 inequivalent classes and addressing variants generated by discrete torsions. They do not contain geometrical or trivial neutral moduli, apart from the dilaton. However, we show the existence of flat directions in the form of exactly marginal deformations and identify patterns of symmetry breaking where product gauge groups, realized at level one, are broken to their diagonal at higher level. We also describe an “inverse Gepner map” from Heterotic to Type II models that could be used, in certain non-geometric settings, to define “effective” topological invariants.
Abstract:
Stemmatology, or the reconstruction of the transmission history of texts, is a field that stands to gain particularly from digital methods. Many scholars already take stemmatic approaches that rely heavily on computational analysis of the collated text (e.g. Robinson and O’Hara 1996; Salemans 2000; Heikkilä 2005; Windram et al. 2008 among many others). Although there is great value in computationally assisted stemmatology, providing as it does a reproducible result and giving access to relevant methodological advances in related fields such as evolutionary biology, computational stemmatics is not without its critics. The current state of the art effectively forces scholars either to impose a preconceived judgment of the significance of textual differences (the Lachmannian or neo-Lachmannian approach, and the weighted phylogenetic approach) or to make no judgment at all (the unweighted phylogenetic approach). Some basis for judging the significance of variation is sorely needed for medieval text criticism in particular. By this, we mean that there is a need for an empirical statistical profile of the text-genealogical significance of the different sorts of variation in different sorts of medieval texts. The rules that apply to copies of Greek and Latin classics may not apply to copies of medieval Dutch story collections; the practices of copying authoritative texts such as the Bible will most likely have been different from the practices of copying the Lives of local saints and other commonly adapted texts. It is nevertheless imperative that we have a consistent, flexible, and analytically tractable model for capturing these phenomena of transmission. In this article, we present a computational model that captures most of the phenomena of text variation, and a method for analyzing one or more stemma hypotheses against the variation model. We apply this method to three ‘artificial traditions’ (i.e. texts copied under laboratory conditions by scholars to study the properties of text variation) and four genuine medieval traditions whose transmission history is known or deduced in varying degrees. Although our findings are necessarily limited by the small number of texts at our disposal, we demonstrate here some of the wide variety of calculations that can be made using our model. Certain of our results call sharply into question the utility of excluding ‘trivial’ variation such as orthographic and spelling changes from stemmatic analysis.
Abstract:
OBJECTIVE The Coherex-EU Study evaluated the safety and efficacy of PFO closure using novel in-tunnel PFO closure devices. BACKGROUND Transcatheter closure of patent foramen ovale (PFO) followed the development of transcatheter closure devices designed to patch atrial septal defects (ASDs). The Coherex FlatStent™ and FlatStent™ EF devices were designed specifically to treat PFO anatomy. METHODS A total of 95 patients with a clinical indication for PFO closure were enrolled in a prospective, multicenter, first-in-man study at six clinical sites. Thirty-six patients received the first-generation FlatStent study device, and 57 patients received the second-generation FlatStent EF study device, which was modified based on clinical experience during the first 38 cases. Two patients enrolled to receive the first-generation device did not receive one. RESULTS At 6 months post-procedure, 45% (17/38) of the intention-to-treat (ITT) cohort receiving the first-generation FlatStent device had complete closure, 26% (10/38) had a trivial residual shunt, and 29% (11/38) had a moderate to large residual shunt. In the ITT cohort receiving the second-generation FlatStent EF device, 76% (43/57) had complete closure, 12% (7/57) had a trivial shunt, and 12% (7/57) had a moderate to large shunt. Five major adverse events occurred, all without sequelae. CONCLUSION This initial study of the Coherex FlatStent/FlatStent EF PFO Closure System demonstrated the potential for in-tunnel PFO closure. The in-tunnel Coherex FlatStent EF may offer an alternative to septal repair devices for PFO closure in appropriately selected patients; however, further investigation will be necessary to establish the best use of this device.
Abstract:
Image denoising methods have been implemented in both the spatial and transform domains. Each domain has its advantages and shortcomings, and the two can complement each other. State-of-the-art methods such as block-matching 3D filtering (BM3D) therefore combine both domains. However, implementing such methods is not trivial. We offer a hybrid method that is surprisingly easy to implement and yet rivals BM3D in quality.
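Purely to illustrate the general idea of combining the two domains (a minimal sketch, not the authors' method and no rival to BM3D):

```python
# Toy hybrid denoiser: DCT hard-thresholding (transform domain)
# followed by a light median filter (spatial domain). Illustrative
# only; it does not reproduce the method of the abstract or BM3D.
import numpy as np
from scipy.fft import dctn, idctn
from scipy.ndimage import median_filter

def hybrid_denoise(img, sigma):
    # transform domain: zero coefficients below a noise-dependent threshold
    coeffs = dctn(img, norm="ortho")
    coeffs[np.abs(coeffs) < 3.0 * sigma] = 0.0
    est = idctn(coeffs, norm="ortho")
    # spatial domain: clean up artifacts left by the hard threshold
    return median_filter(est, size=3)

rng = np.random.default_rng(1)
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0                      # simple test image
noisy = clean + rng.normal(0.0, 0.1, clean.shape)
print(np.abs(hybrid_denoise(noisy, 0.1) - clean).mean())
```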
Abstract:
We investigate the 2-d O(3) model with a θ-term as a toy model for slowly walking 4-d non-Abelian gauge theories. Using the very efficient meron-cluster algorithm, an accurate investigation of the scale dependence of the renormalized coupling is carried out for different values of the vacuum angle θ. Approaching θ = π, the infrared dynamics of the 2-d O(3) model is determined by a non-trivial conformal fixed point. We provide evidence for a slowly walking behavior near the fixed point and we perform a finite-size scaling analysis of the mass gap.
Abstract:
In order to analyze software systems, it is necessary to model them. Static software models are commonly imported by parsing source code and related data. Unfortunately, building custom parsers for most programming languages is a non-trivial endeavour. This poses a major bottleneck for analyzing software systems programmed in languages for which importers do not already exist. Luckily, initial software models do not require detailed parsers, so it is possible to start analysis with a coarse-grained importer, which is then gradually refined. In this paper we propose an approach to "agile modeling" that exploits island grammars to extract initial coarse-grained models, parser combinators to enable gradual refinement of model importers, and various heuristics to recognize language structure, keywords and other language artifacts.
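As a toy sketch of the island-grammar idea (using plain regular expressions rather than parser combinators, for brevity; the patterns and example are invented, not the paper's importer):

```python
# Minimal island "parser": regex recognizers for the islands of interest
# (class and method headers) while all other text is skipped as water.
import re

ISLANDS = [
    ("class",  re.compile(r"\bclass\s+(\w+)")),
    ("method", re.compile(r"\b(\w+)\s*\([^)]*\)\s*\{")),
]

def extract_islands(source):
    """Return (kind, name, position) for every island; ignore the water."""
    found = []
    for kind, pattern in ISLANDS:
        for m in pattern.finditer(source):
            found.append((kind, m.group(1), m.start()))
    return sorted(found, key=lambda t: t[2])

code = """
class Point {
    int getX() { return x; }
    // arbitrary, unparsed water can appear anywhere
    void move(int dx, int dy) { x += dx; y += dy; }
}
"""
for kind, name, pos in extract_islands(code):
    print(kind, name, pos)
```

A coarse importer like this yields an initial model immediately; refining it then means replacing individual regexes with real (e.g. combinator-based) grammar rules, which matches the gradual-refinement workflow the abstract describes.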
Abstract:
Time-based localization techniques such as multilateration are favoured for positioning with wide-band signals. Applying the same techniques to narrow-band signals such as GSM is not so straightforward: the process is challenged by the need for synchronization accuracy and timestamp resolution, both in the nanosecond range. We propose approaches to deal with both challenges. On the one hand, we introduce a method to eliminate the negative effect of synchronization offset on time measurements. On the other hand, we obtain timestamps with nanosecond accuracy by using timing information from the signal processing chain. In a set of experiments ranging from suburban to indoor environments, we show that our proposed approaches improve the localization accuracy of TDOA approaches severalfold. We even demonstrate errors as small as 10 meters in outdoor settings with narrow-band signals.
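A minimal sketch of the underlying TDOA multilateration step, assuming idealized, synchronized receivers (a generic textbook setup, not the system evaluated in the paper):

```python
# Toy TDOA multilateration: given receiver positions and time-difference-
# of-arrival measurements (relative to receiver 0), solve for the source
# position by nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0                      # propagation speed, m/s
rx = np.array([[0.0, 0.0], [800.0, 0.0], [400.0, 700.0], [0.0, 600.0]])
true_src = np.array([350.0, 250.0])    # ground truth, for checking only

d = np.linalg.norm(rx - true_src, axis=1)
tdoa = (d[1:] - d[0]) / C              # differences vs. receiver 0
tdoa += np.random.default_rng(2).normal(0, 3e-9, tdoa.shape)  # timing noise

def residuals(p):
    r = np.linalg.norm(rx - p, axis=1)
    return (r[1:] - r[0]) / C - tdoa

est = least_squares(residuals, x0=np.array([100.0, 100.0])).x
print(est, np.linalg.norm(est - true_src))   # position error in meters
```

The 3 ns timing noise in the sketch corresponds to roughly a meter of range error, which is why the abstract's twin requirements of nanosecond synchronization and nanosecond timestamp resolution are so demanding.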
Abstract:
First, we provide the dictionary between four-dimensional gauged supergravity and type II compactifications on T6 with metric and gauge fluxes, in the absence of supersymmetry-breaking sources such as branes and orientifold planes. Second, we prove that there is a unique isotropic compactification allowing for critical points. It corresponds to a type IIA background given by a product of two 3-tori with SO(3) twists, and it results in a unique theory (gauging) with a non-semisimple gauge algebra. Besides the four known AdS solutions surviving the orientifold projection to N = 4 induced by O6-planes, this theory contains a novel AdS solution that requires non-trivial orientifold-odd fluxes and is hence a genuine critical point of the N = 8 theory.