933 results for non-trivial data structures


Relevance:

100.00%

Publisher:

Abstract:

We provide the dictionary between four-dimensional gauged supergravity and type II compactifications on T6 with metric and gauge fluxes in the absence of supersymmetry-breaking sources, such as branes and orientifold planes. We then prove that there is a unique isotropic compactification allowing for critical points. It corresponds to a type IIA background given by a product of two 3-tori with SO(3) twists and results in a unique theory (gauging) with a non-semisimple gauge algebra. Besides the known four AdS solutions surviving the orientifold projection to N = 4 induced by O6-planes, this theory contains a novel AdS solution that requires non-trivial orientifold-odd fluxes, hence being a genuine critical point of the N = 8 theory.


We generalize uniqueness theorems for non-extremal black holes with three mutually independent Killing vector fields in five-dimensional minimal supergravity in order to account for the existence of non-trivial two-cycles in the domain of outer communication. The black hole space-times we consider may contain multiple disconnected horizons and be asymptotically flat or asymptotically Kaluza–Klein. We show that in order to uniquely specify the black hole space-time, besides providing its domain structure and a set of asymptotic and local charges, it is necessary to measure the magnetic fluxes that support the two-cycles as well as fluxes in the two semi-infinite rotation planes of the domain diagram.


We introduce a version of operational set theory, OST−, without a choice operation, which has machinery for Δ0 separation based on truth functions and the separation operator, and a new kind of applicative set theory, so-called weak explicit set theory WEST, based on Gödel operations. We show that both theories and Kripke–Platek set theory KP with infinity are pairwise Π1-equivalent. We also show analogous assertions for subtheories with ∈-induction restricted in various ways and for supertheories extended by powerset, beta, limit and Mahlo operations. Whereas the upper bound is given by a refinement of inductive definition in KP, the lower bound is obtained by combining, in a specific way, realisability, (intuitionistic) forcing and negative interpretations. Thus, despite interpretability between classical theories, we make "a detour via intuitionistic theories". The combined interpretation, seen as a model construction in the sense of Visser's miniature model theory, is a new way of constructing models for classical theories and could be called the third kind of model construction ever used that is non-trivial at the level of logical connectives, after generic extension à la Cohen and Krivine's classical realisability model.


In this paper, we describe dynamic unicast to increase communication efficiency in opportunistic information-centric networks (ICN). The approach is based on broadcast requests to quickly find content, dynamically creating unicast links to content sources without the need for neighbor discovery. The links are kept temporarily as long as they deliver content and are quickly removed otherwise. Evaluations in mobile networks show that this approach maintains ICN flexibility to support seamless mobile communication and achieves up to 56.6% shorter transmission times compared to broadcast in the case of multiple concurrent requesters. Apart from that, dynamic unicast unburdens listener nodes from processing unwanted content, resulting in lower processing overhead and power consumption at these nodes. The approach can easily be integrated into existing ICN architectures using only available data structures.
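The link-management logic described above can be sketched as follows. This is a minimal illustration with hypothetical names and an assumed inactivity timeout; the paper's exact mechanism and data structures may differ.

```python
import time

# Sketch of dynamic unicast: requests start as broadcasts; once a content
# source answers, a temporary unicast link to it is recorded and reused
# while it keeps delivering, then dropped after a short inactivity timeout.

LINK_TIMEOUT = 2.0  # assumed seconds of inactivity before reverting to broadcast

class DynamicUnicast:
    def __init__(self, now=time.monotonic):
        self.links = {}   # content prefix -> (source address, last delivery time)
        self.now = now    # injectable clock, useful for testing

    def next_hop(self, prefix):
        """Return the unicast source for a prefix, or None to broadcast."""
        entry = self.links.get(prefix)
        if entry and self.now() - entry[1] < LINK_TIMEOUT:
            return entry[0]
        self.links.pop(prefix, None)  # stale link: remove, fall back to broadcast
        return None

    def content_received(self, prefix, source):
        """Refresh (or create) the unicast link on every successful delivery."""
        self.links[prefix] = (source, self.now())
```

Because links expire on their own, no explicit neighbor discovery or teardown signaling is needed, which matches the paper's claim that only already-available data structures are required.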


The all-loop anisotropic Thirring model interpolates between the WZW model and the non-Abelian T-dual of the anisotropic principal chiral model. We focus on the SU(2) case and prove that it is classically integrable by providing its Lax pair formulation. We derive its underlying symmetry current algebra and use it to show that the Poisson brackets of the spatial part of the Lax pair assume the Maillet form. In this way we procure the corresponding r and s matrices, which provide non-trivial solutions to the modified Yang–Baxter equation.


Software architecture consists of a set of design choices that can be partially expressed in the form of rules to which the implementation must conform. Architectural rules are intended to ensure properties that fulfill fundamental non-functional requirements. Verifying architectural rules is often a non-trivial activity: available tools are often not very usable and support only a narrow subset of the rules that are commonly specified by practitioners. In this paper we present a new, highly readable declarative language for specifying architectural rules. With our approach, users can specify a wide variety of rules using a single uniform notation. Rules can be tested by third-party tools by conforming to pre-defined specification templates. Practitioners can take advantage of the capabilities of a growing number of testing tools without dealing with them directly.


Even though the Standard Model with a Higgs mass mH = 125 GeV possesses no bulk phase transition, its thermodynamics still experiences a "soft point" at temperatures around T = 160 GeV, with a deviation from ideal-gas thermodynamics. Such a deviation may affect precision computations of weakly interacting dark matter relic abundances if their mass is in the few-TeV range, or leptogenesis scenarios operating in this temperature range. By making use of results from lattice simulations based on a dimensionally reduced effective field theory, we estimate the relevant thermodynamic functions across the crossover. The results are tabulated in a numerical form permitting their insertion as a background equation of state into cosmological particle production/decoupling codes. We find that Higgs dynamics induces a non-trivial "structure" visible e.g. in the heat capacity, but that in general the largest radiative corrections originate from QCD effects, reducing the energy density by a couple of percent from the free value even at T > 160 GeV.


We construct several classes of worldvolume effective actions for black holes by integrating out spatial sections of the worldvolume geometry of asymptotically flat black branes. This provides a generalisation of the blackfold approach for higher-dimensional black holes and yields a map between different effective theories, which we exploit to obtain new hydrodynamic and elastic transport coefficients via simple integrations. Using Euclidean minimal surfaces in order to decouple the fluid dynamics on different sections of the worldvolume, we obtain local effective theories for ultraspinning Myers-Perry branes and helicoidal black branes, described in terms of a stress-energy tensor, particle currents and non-trivial boost vectors. We then present and study in detail novel compact and non-compact geometries for black hole horizons in higher-dimensional asymptotically flat space-time. These include doubly-spinning black rings, black helicoids and helicoidal p-branes, as well as helicoidal black rings and helicoidal black tori in D ≥ 6.


The Solver Add-in of Microsoft Excel is widely used in courses on Operations Research and in industrial applications. Since the 2010 version of Microsoft Excel, the Solver Add-in has included a so-called evolutionary solver. We analyze how this metaheuristic can be applied to the resource-constrained project scheduling problem (RCPSP). We present an implementation of a schedule-generation scheme in a spreadsheet which, combined with the evolutionary solver, can be used to devise good feasible schedules. Our computational results indicate that with this approach, non-trivial instances of the RCPSP can be solved to optimality or near-optimality.
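The schedule-generation scheme at the heart of this approach can be sketched outside the spreadsheet. Below is a minimal serial schedule-generation scheme (SSGS) in Python; the function names and the toy instance are illustrative, not taken from the paper. The evolutionary solver's role is to search over the priority order passed to the scheme.

```python
# Minimal serial schedule-generation scheme (SSGS) for the RCPSP.
# Given a priority order, each activity is placed at its earliest
# precedence- and resource-feasible start time.

def ssgs(order, duration, demand, capacity, preds, horizon=100):
    usage = [[0] * len(capacity) for _ in range(horizon)]  # per-period resource usage
    start = {}
    for j in order:
        # earliest start: after all predecessors have finished
        t = max((start[p] + duration[p] for p in preds[j]), default=0)
        # delay until every period of the processing window is resource-feasible
        while not all(usage[tau][k] + demand[j][k] <= capacity[k]
                      for tau in range(t, t + duration[j])
                      for k in range(len(capacity))):
            t += 1
        start[j] = t
        for tau in range(t, t + duration[j]):
            for k in range(len(capacity)):
                usage[tau][k] += demand[j][k]
    return start

# Toy instance: four activities, one renewable resource of capacity 2.
duration = {1: 2, 2: 3, 3: 2, 4: 1}
demand = {1: [1], 2: [2], 3: [1], 4: [1]}
preds = {1: [], 2: [], 3: [1], 4: [2, 3]}
start = ssgs([1, 2, 3, 4], duration, demand, [2], preds)
```

In the spreadsheet setting, the priority order corresponds to the decision cells the evolutionary solver mutates, while formulas implementing the scheme compute the resulting makespan as the objective.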


We revisit the theory of null shells in general relativity, with a particular emphasis on null shells placed at horizons of black holes. We study in detail the considerable freedom available when soldering two metrics together across null hypersurfaces (such as Killing horizons) for which the induced metric is invariant under translations along the null generators. In this case the group of soldering transformations turns out to be infinite-dimensional, and these solderings create non-trivial horizon shells containing both massless matter and impulsive gravitational wave components. We also rephrase this result in the language of Carrollian symmetry groups. To illustrate this phenomenon we discuss in detail the example of shells on the horizon of the Schwarzschild black hole (with equal interior and exterior mass), uncovering a rich classical structure at the horizon and deriving an explicit expression for the general horizon shell energy-momentum tensor. In the special case of BMS-like soldering supertranslations we find a conserved shell energy that is strikingly similar to the standard expression for asymptotic BMS supertranslation charges, suggesting a direct relation between the physical properties of these horizon shells and the recently proposed BMS supertranslation hair of a black hole.


Friedreich's ataxia is caused by the expansion of the GAA•TTC trinucleotide repeat sequence located in intron 1 of the frataxin gene. The long GAA•TTC repeats are known to form several non-B DNA structures including hairpins, triplexes, parallel DNA and sticky DNA. It is therefore believed that alternative DNA structures play a role in the loss of mRNA transcript and functional frataxin protein in FRDA patients. We wanted to further elucidate the requirements for the formation and stability of sticky DNA by evaluating the structure in a plasmid-based system in vitro and in vivo in Escherichia coli. The negative supercoil density of plasmids harboring different lengths of GAA•TTC repeats, as well as either one or two repeat tracts, was studied in E. coli to determine whether plasmids containing two long tracts (≥60 repeats) in a direct repeat orientation would have a different topological effect in vivo compared to plasmids that harbored only one GAA•TTC tract or two tracts of <60 repeats. The experiments revealed that sticky DNA-forming plasmids had a lower average negative supercoil density (−σ) than all other control plasmids used that had the potential to form other non-B DNA structures such as triplexes or Z-DNA. The requirements for in vitro dissociation and reconstitution of the DNA•DNA associated region of sticky DNA were also evaluated. The results show that the two repeat tracts associate in the presence of negative supercoiling and MgCl2 or MnCl2 in a time- and concentration-dependent manner. Interaction of the repeat sequences was not observed in the absence of negative supercoiling and/or MgCl2, or in the presence of other monovalent or divalent cations, indicating that supercoiling and quite specific cations are needed for the association of sticky DNA. These are the first experiments studying a more specific role of supercoiling and cation influence on this DNA conformation.
To support our model of the topological effects of sticky DNA in plasmids, changes in sticky DNA band migration were measured with reference to the linear DNA after treatment with increasing concentrations of ethidium bromide (EtBr). The presence of independent negative supercoil domains was confirmed by this method, and these domains were found to be segregated by the DNA-DNA associated region. Sequence-specific polyamide molecules were used to test the effect of ligand binding to the GAA•TTC repeats on the inhibition of sticky DNA. The destabilization of the sticky DNA conformation in vitro through this binding of the polyamides demonstrated the first conceptual therapeutic approach for the treatment of FRDA at the DNA molecular level. Thus, examining the properties of sticky DNA formed by these long repeat tracts is important in elucidating the possible role of sticky DNA in Friedreich's ataxia.


Interaction effects are an important scientific interest in many areas of research. A common approach for investigating the interaction effect of two continuous covariates on a response variable is through a cross-product term in multiple linear regression. In epidemiological studies, the two-way analysis of variance (ANOVA) type of method has also been used to examine the interaction effect by replacing the continuous covariates with their discretized levels. However, the implications of the model assumptions of either approach have not been examined, and statistical validation has focused only on the general method, not specifically on the interaction effect. In this dissertation, we investigated the validity of both approaches based on their mathematical assumptions for non-skewed data. We showed that linear regression may not be an appropriate model when an interaction effect exists, because it implies a highly skewed distribution for the response variable. We also showed that the normality and constant-variance assumptions required by ANOVA are not satisfied in the model where the continuous covariates are replaced with their discretized levels. Therefore, naïve application of the ANOVA method may lead to incorrect conclusions. Given the problems identified above, we proposed a novel method, modified from the traditional ANOVA approach, to rigorously evaluate the interaction effect. The analytical expression of the interaction effect was derived from the conditional distribution of the response variable given the discretized continuous covariates. A testing procedure that combines the p-values from each level of the discretized covariates was developed to test the overall significance of the interaction effect. According to the simulation study, the proposed method is more powerful than least squares regression and the ANOVA method in detecting the interaction effect when the data come from a trivariate normal distribution.
The proposed method was applied to a dataset from the National Institute of Neurological Disorders and Stroke (NINDS) tissue plasminogen activator (t-PA) stroke trial, and a baseline age-by-weight interaction effect was found to be significant in predicting the change from baseline in NIHSS at Month 3 among patients who received t-PA therapy.
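The abstract does not spell out how the level-wise p-values are combined. Fisher's method is one standard combiner for independent p-values and illustrates the idea; it is an assumption here, not necessarily the dissertation's exact procedure.

```python
import math

def fisher_combine(pvalues):
    """Combine independent p-values with Fisher's method (an assumed, standard
    choice). Statistic: X = -2 * sum(ln p_i), chi-square with 2k df under H0."""
    k = len(pvalues)
    half = -sum(math.log(p) for p in pvalues)  # X/2
    # Chi-square survival function has a closed form for even df = 2k:
    # P(X > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= half / i
        total += term
    return math.exp(-half) * total

# Hypothetical level-wise p-values for the interaction effect
p_overall = fisher_combine([0.04, 0.10, 0.07])
```

With a single p-value the combiner reduces to the identity, which is a convenient sanity check on the implementation.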


ExxonMobil, a Fortune 500 oil and gas corporation, has a global workforce with employees assigned to projects in areas at risk for infectious diseases, particularly malaria. As such, the corporation has put in place a program, the Malaria Control Program (MCP), to protect the health of workers and ensure their safety in malaria-endemic zones. One component of this program is the more specific Malaria Chemoprophylaxis Compliance Program (MCCP), in which employees enroll after consenting to random drug testing for compliance with the company's chemoprophylaxis requirements. Each year, data are gathered on the number of employees working in these locations, and workers are selected randomly and tested for chemoprophylaxis compliance; the selection strives to test each eligible worker once per year. Test results that come back positive for the chemoprophylaxis drug are considered "detects", while tests that are negative for the drug, and therefore show the worker is non-compliant and at risk of severe malaria infection, are considered "non-detects". The current practice report used aggregate data to calculate statistics on test results, reflecting compliance among both employees and contractors in various malaria-endemic areas. This aggregate, non-individualized data has been compiled and reflects the effectiveness and reach of ExxonMobil's Malaria Chemoprophylaxis Compliance Program. To assess compliance, the number of non-detect test results was compared to the number of tests completed per year. The data show that over time, non-detect results have declined in both employee and contractor populations, and vary somewhat by location owing to the size and scope of the MCCP implemented in-country. Although the data indicate a positive trend for the corporation, some recommendations have been made for future implementation of the program.


In biomedical studies, the common data structures have been matched (paired) and unmatched designs. Recently, many researchers have turned to meta-analysis to obtain a better understanding of a medical treatment from several clinical datasets. The hybrid design, which combines the two data structures, raises fundamental questions for statistical methods and challenges for statistical inference. The applicable methods depend on the underlying distribution: if the outcomes are normally distributed, we would use the classic paired and two-independent-sample T-tests on the matched and unmatched cases; if not, we can apply the Wilcoxon signed rank and rank sum tests to each case. To assess an overall treatment effect in a hybrid design, we can apply the inverse-variance weighting method used in meta-analysis. In the nonparametric case, we can use a test statistic combined from two Wilcoxon test statistics; however, these two test statistics are not on the same scale. We propose the Hybrid Test Statistic based on the Hodges-Lehmann estimates of the treatment effects, which are medians on the same scale. To compare the proposed method, we use the classic meta-analysis T-test statistic on the combined estimates of the treatment effects from two T-test statistics. Theoretically, the efficiency of two unbiased estimators of a parameter is the ratio of their variances. Using the concept of Asymptotic Relative Efficiency (ARE) developed by Pitman, we derive the ARE of the hybrid test statistic relative to the classic meta-analysis T-test statistic using the Hodges-Lehmann estimators associated with the two test statistics. From several simulation studies, we calculate the empirical type I error rate and power of the test statistics. The proposed statistic provides an effective tool for evaluating and understanding the treatment effect in various public health studies as well as clinical trials.
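The Hodges-Lehmann estimates on which the proposed statistic is built are standard and easy to compute: the median of Walsh averages for the matched (paired) design, and the median of pairwise differences for the unmatched design. A minimal sketch with illustrative data (not from the dissertation):

```python
import statistics

def hl_paired(diffs):
    """One-sample Hodges-Lehmann estimate for paired data:
    median of the Walsh averages (d_i + d_j) / 2 over all i <= j."""
    walsh = [(diffs[i] + diffs[j]) / 2
             for i in range(len(diffs)) for j in range(i, len(diffs))]
    return statistics.median(walsh)

def hl_two_sample(x, y):
    """Two-sample Hodges-Lehmann estimate of the treatment effect:
    median of all pairwise differences x_i - y_j."""
    return statistics.median(xi - yi for xi in x for yi in y)

# Illustrative data: paired differences and two independent samples
effect_matched = hl_paired([1, 2, 3])
effect_unmatched = hl_two_sample([5, 6, 7], [1, 2, 3])
```

Because both estimates are medians of the treatment effect on the outcome scale, they can be pooled directly, which is the dissertation's motivation for using them instead of the raw Wilcoxon statistics.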


Empirical relationships between physical properties determined non-destructively by core logging devices and calibrated by carbonate and opal measurements determined on discrete samples allow extraction of carbonate and opal records from the non-destructive measurements in biogenic settings. Contents of detrital material can be calculated as a residual. For carbonate and opal the correlation coefficients (r) are 0.954 and −0.916 for sediment density, −0.816 and 0.845 for compressional-wave velocity, 0.908 and −0.942 for acoustic impedance, and 0.886 and −0.865 for sediment color (lightness). Carbonate contents increase in concert with increasing density and acoustic impedance, decreasing velocity and lighter sediment color. The opposite is true for opal. The advantages of deriving the sediment composition quantitatively from core logging are: (i) sampling resolution is increased significantly, (ii) non-destructive data can be gathered rapidly, and (iii) laboratory work on discrete samples can be reduced. Applied to paleoceanographic problems, this method offers the opportunity of precise stratigraphic correlations and of studying processes related to biogenic sedimentation in more detail. Density is most promising because it is most strongly affected by changes in composition.
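The calibration idea amounts to an ordinary least-squares fit of a discretely measured composition against a logged physical property, then applying the fitted relationship along the whole record. A minimal sketch; all numbers below are illustrative, not the study's data.

```python
# Fit a linear relationship between a core-logging property (here density)
# and carbonate content measured on discrete samples, then predict carbonate
# along the continuous logged record.

def linfit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Hypothetical calibration samples: wet-bulk density (g/cm^3) vs. CaCO3 (wt%)
density = [1.40, 1.55, 1.70, 1.85]
carbonate = [20.0, 45.0, 70.0, 95.0]
a, b = linfit(density, carbonate)

# Predict carbonate from the continuous logging record; detrital content
# would follow as the residual once opal is estimated analogously.
log_density = [1.50, 1.62, 1.78]
predicted = [a * d + b for d in log_density]
```

An analogous fit against opal, using the negative correlations quoted above, would complete the three-component decomposition (carbonate, opal, detrital residual).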