1000 results for Slide-rule


Relevance:

20.00%

Publisher:

Abstract:

The correlation clustering problem is a fundamental problem in both theory and practice: it involves identifying clusters of objects in a data set based on their similarity. A traditional graph-theoretic modeling of this question associates vertices with data points and indicates similarity by adjacency; clusters then correspond to cliques in the graph. The resulting optimization problem, Cluster Editing (and several variants), is very well studied algorithmically. In many situations, however, translating clusters to cliques can be somewhat restrictive. A more flexible notion would be that of a structure in which the vertices are mutually ``not too far apart'', without necessarily being adjacent. One such generalization is realized by structures called s-clubs, which are graphs of diameter at most s. In this work, we study the question of finding a set of at most k edges whose removal leaves us with a graph whose components are s-clubs. Recently, it has been shown that unless the Exponential Time Hypothesis (ETH) fails, Cluster Editing (whose components are 1-clubs) does not admit a sub-exponential time algorithm [STACS, 2013]; that is, there is no algorithm solving the problem in time 2^{o(k)} n^{O(1)}. Surprisingly, however, when the number of cliques in the output graph is restricted to d, the problem can be solved in time O(2^{O(√(dk))} + m + n). We show that this sub-exponential time algorithm for a fixed number of cliques is the exception rather than the rule. Our first result shows that, assuming the ETH, there is no algorithm solving the s-Club Cluster Edge Deletion problem in time 2^{o(k)} n^{O(1)}. We show, further, that even the problem of deleting edges to obtain a graph with d s-clubs cannot be solved in time 2^{o(k)} n^{O(1)} for any fixed s, d >= 2. This is a radical contrast from the situation established for cliques, where sub-exponential algorithms are known.
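The diameter condition that defines an s-club is straightforward to verify. As a minimal illustration (this is not part of the paper's algorithm), the following sketch checks via breadth-first search whether every component of a graph, given as an adjacency list, is an s-club:

```python
from collections import deque

def component_diameter(adj, nodes):
    """BFS from every node of a connected component; return the largest
    shortest-path distance found (the component's diameter)."""
    diam = 0
    for src in nodes:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        diam = max(diam, max(dist.values()))
    return diam

def components(adj):
    """Connected components of an adjacency-list graph."""
    seen, comps = set(), []
    for s in adj:
        if s not in seen:
            comp, q = [], deque([s])
            seen.add(s)
            while q:
                u = q.popleft()
                comp.append(u)
                for v in adj[u]:
                    if v not in seen:
                        seen.add(v)
                        q.append(v)
            comps.append(comp)
    return comps

def is_s_club_partition(adj, s):
    """True iff every connected component is an s-club (diameter <= s)."""
    return all(component_diameter(adj, c) <= s for c in components(adj))
```

For example, a path on three vertices is a 2-club but not a 1-club (a clique), which is exactly the extra flexibility the s-club generalization buys.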


Understanding the dinucleotide-sequence-directed structures of nucleic acids and their variability from experimental observation has remained ineffective due to the unavailability of statistically meaningful data. We have attempted to understand this from energy scans along the twist, roll, and slide degrees of freedom, which depend mostly on the dinucleotide sequence, using ab initio density functional theory. We have carried out stacking energy analysis in this dinucleotide parameter phase space for all ten unique dinucleotide steps in DNA and RNA using DFT-D at the ωB97X-D/6-31G(2d,2p) level, which appeared to satisfactorily explain the conformational preferences of the AU/AU step in our recent study. We show that the values of roll, slide, and twist of most of the dinucleotide sequences in crystal structures fall in the low-energy region. The minimum-energy regions with large twist values are associated with the roll and slide values of B-DNA, whereas smaller twist values correspond to higher stability for RNA and A-DNA-like conformations. Incorporation of solvent effects by the CPCM method could explain the preference shown by some sequences to occur in B-DNA or A-DNA conformations. The conformational preference of the BII sub-state in B-DNA is displayed mainly by pyrimidine-purine steps and partly by purine-purine steps. The purine-pyrimidine steps show the largest effect of the 5-methyl group of thymine on stacking energy, and the introduction of solvent reduces this effect significantly. These predicted structures and variabilities can explain the effect of sequence on DNA and RNA functionality. (c) 2014 Wiley Periodicals, Inc. Biopolymers 103: 134-147, 2015.


The trapezoidal rule, a special case of the Newmark family of algorithms, is one of the most widely used methods for transient hyperbolic problems. In this work, we show that this rule conserves linear and angular momenta and energy in the case of undamped linear elastodynamics problems, and an ``energy-like measure'' in the case of undamped acoustic problems. These conservation properties thus provide a rational basis for using this algorithm. In linear elastodynamics problems, variants of the trapezoidal rule that incorporate ``high-frequency'' dissipation are often used, since the higher frequencies, which are not approximated properly by the standard displacement-based approach, often result in unphysical behavior. Instead of modifying the trapezoidal algorithm, we propose using a hybrid finite element framework for constructing the stiffness matrix. Hybrid finite elements, which are based on a two-field variational formulation involving displacements and stresses, are known to approximate the eigenvalues much more accurately than the standard displacement-based approach, thereby either bypassing or reducing the need for high-frequency dissipation. We show this by means of several examples, in which we compare the numerical solutions obtained using the displacement-based and hybrid approaches against analytical solutions.
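The energy-conserving behaviour described above is easy to observe numerically. A minimal sketch of one step of the Newmark scheme with gamma = 1/2 and beta = 1/4 (i.e. the trapezoidal rule) for an undamped linear system M u'' + K u = 0 follows; the two-degree-of-freedom system used to exercise it is an illustrative assumption, not an example from the paper:

```python
import numpy as np

def trapezoidal_step(M, K, u, v, a, dt):
    """One step of Newmark with gamma = 1/2, beta = 1/4 (trapezoidal rule)
    for the undamped linear system M u'' + K u = 0."""
    # Displacement update u_{n+1} = u_n + dt v_n + dt^2/4 (a_n + a_{n+1}),
    # combined with M a_{n+1} + K u_{n+1} = 0, gives a linear solve:
    K_eff = 4.0 / dt**2 * M + K
    rhs = M @ (4.0 / dt**2 * u + 4.0 / dt * v + a)
    u_new = np.linalg.solve(K_eff, rhs)
    a_new = 4.0 / dt**2 * (u_new - u) - 4.0 / dt * v - a
    v_new = v + 0.5 * dt * (a + a_new)
    return u_new, v_new, a_new

def energy(M, K, u, v):
    """Total mechanical energy 0.5 v'Mv + 0.5 u'Ku."""
    return 0.5 * v @ M @ v + 0.5 * u @ K @ u
```

Stepping any undamped linear system with this scheme leaves `energy` constant up to round-off, which is the conservation property the abstract refers to.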


The disclosure of information and its misuse in Privacy Preserving Data Mining (PPDM) systems is a concern to the parties involved. In PPDM systems, data is available amongst multiple parties collaborating to achieve cumulative mining accuracy. The vertically partitioned data available with the individual parties cannot provide mining results as accurate as those of collaborative mining. To overcome the privacy issue in data disclosure, this paper describes a Key Distribution-Less Privacy Preserving Data Mining (KDLPPDM) system in which the local association rules generated by the parties are published. The association rules are securely combined into a combined rule set using the Commutative RSA algorithm, and the combined rule sets are used to classify or mine the data. The results discussed in this paper compare the accuracy of the rules generated by the C4.5-based KDLPPDM system and the C5.0-based KDLPPDM system using receiver operating characteristic (ROC) curves.
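The property that makes commutative RSA useful here is that encryption by the parties can be applied in any order. A toy sketch of that commutativity with modular exponentiation over a shared modulus (the parameters and rule encoding below are illustrative assumptions; real key sizes would be far larger and this is not a secure implementation):

```python
# Toy commutative encryption: with a shared modulus n, encrypting with
# exponent e1 then e2 gives the same ciphertext as e2 then e1, so parties
# can match encrypted rules without ever exchanging their private keys.
n = 3233             # 61 * 59, a toy RSA modulus (insecure, illustration only)
phi = 60 * 58
e1, e2 = 7, 11       # each party's private exponent, coprime to phi

def enc(m, e):
    """Encrypt integer m under exponent e: m^e mod n."""
    return pow(m, e, n)

rule_id = 123                        # an association rule encoded as an integer
once = enc(enc(rule_id, e1), e2)     # party A encrypts, then party B
twice = enc(enc(rule_id, e2), e1)    # party B encrypts, then party A
assert once == twice                 # E1(E2(x)) == E2(E1(x))
```

The commutativity follows directly from (m^{e1})^{e2} = m^{e1·e2} = (m^{e2})^{e1} mod n, which is what lets the parties combine their rule sets without a key-distribution step.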


Using the polydispersity index as an additional order parameter, we investigate the freezing/melting transition of polydisperse Lennard-Jones systems (with Gaussian polydispersity in size), especially to gain insight into the origin of the terminal polydispersity. The average inherent structure (IS) energy and the root mean square displacement (RMSD) of the solid before melting both exhibit quite similar polydispersity dependence, including a discontinuity at the solid-liquid transition point. The Lindemann ratio, obtained from the RMSD, is found to depend on temperature. At a given number density, there exists a value of the polydispersity index (δ_P) above which no crystalline solid is stable. This transition value of polydispersity (termed the transition polydispersity, δ_P) is found to depend strongly on temperature, a feature missed in hard-sphere model systems. Additionally, for a particular temperature, δ_P shifts to higher values as the number density is increased. This temperature- and number-density-dependent value of δ_P surprisingly saturates to a value that is found to be nearly the same for all temperatures, known as the terminal polydispersity (δ_TP). This value (δ_TP ≈ 0.11) is in excellent agreement with the experimental value of 0.12, but differs from the hard-sphere transition, where the limiting value is only 0.048. The terminal polydispersity δ_TP thus has a quasi-universal character. Interestingly, the bifurcation diagram obtained from non-linear integral equation theories of freezing seems to provide an explanation for the existence of a unique terminal polydispersity in polydisperse systems. The global bond orientational order parameter is calculated to obtain further insight into the mechanism of melting.
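The Lindemann ratio used above is, in its classical form, the RMSD of particles about their lattice sites divided by the nearest-neighbour distance; melting is traditionally expected when it grows to roughly 0.1. A minimal sketch under that classical definition (an assumption for illustration, not necessarily the paper's exact prescription):

```python
import numpy as np

def lindemann_ratio(positions, lattice_sites, nn_distance):
    """Classical Lindemann ratio: root-mean-square displacement of particles
    about their ideal lattice sites, divided by the nearest-neighbour distance.

    positions, lattice_sites: (N, 3) arrays; nn_distance: scalar."""
    disp = positions - lattice_sites
    rmsd = np.sqrt(np.mean(np.sum(disp**2, axis=1)))
    return rmsd / nn_distance
```

In a simulation one would average `positions` over many configurations before forming the displacement, which is where the temperature dependence discussed above enters.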


This article considers the identification of the kings in the epitomes of Manetho's Dynasty XXIII, and their function in the historiographical traditions of ancient Egypt. Despite the long-standing rejection of Manetho's Dynasty XXIII as ahistorical, it is argued here that the names preserved in Dynasty XXIII are part of an authentic historiographical tradition originating with the Kushite king Taharqa. The article goes further to suggest specific reasons why Dynasty XXIII was integrated with other king-list traditions, as well as a historical reconstruction of that process. Moreover, this analysis identifies specific functions for the names that have not yet been identified, Psammus and Zet, in Julius Africanus's version of Manetho's epitome. The argument considers the political and cultural perspective of the Kushite kings who were responsible for one branch of the king-list tradition and offers some interpretations of Kushite royal practices in light of these conclusions.


This work addresses the governance of institutions as an alternative that overcomes the judicial activism produced by the neoconstitutionalist vision of the social order.


This paper uses a structural approach based on the indirect inference principle to estimate a standard version of the new Keynesian monetary (NKM) model augmented with a term structure, using both revised and real-time data. The estimation results show that the term spread and policy inertia are both important determinants of the estimated U.S. monetary policy rule, whereas the persistence of shocks plays a small but significant role when revised and real-time data on output and inflation are both considered. More importantly, the relative importance of the term spread and persistent shocks in the policy rule, and the shock transmission mechanism, change drastically once it is taken into account that real-time data are not well behaved.


Published as an article in: Spanish Economic Review, 2008, vol. 10, issue 4, pages 251-277.


Using US data for the period 1967:5-2002:4, this paper empirically investigates the performance of an augmented version of the Taylor rule (ATR) that (i) allows for the presence of switching regimes, (ii) considers the long-short term spread in addition to the typical variables, (iii) uses an alternative monthly indicator of general economic activity suggested by Stock and Watson (1999), and (iv) considers interest rate smoothing. The estimation results show the existence of switching regimes, one characterized by low volatility and the other by high volatility. Moreover, the scale of the responses of the Federal funds rate to movements in the term spread, inflation, and the economic activity index depends on the regime. The estimation results also show robust empirical evidence that the ATR has been more stable during the term of office of Chairman Greenspan than in the pre-Greenspan period. However, a closer look at the Greenspan period shows the existence of two alternative regimes, and that the response of the Fed funds rate to inflation has not been significant during this period once the term spread is considered.
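The general form of such an augmented rule with interest-rate smoothing can be sketched as follows; the coefficient values below are conventional illustrative assumptions, not the paper's estimates:

```python
def augmented_taylor_rate(i_prev, inflation, activity, term_spread,
                          rho=0.8, r_star=2.0, pi_star=2.0,
                          a_pi=1.5, a_y=0.5, a_s=0.25):
    """Augmented Taylor rule with interest-rate smoothing:

    i_t = rho * i_{t-1}
          + (1 - rho) * [r* + pi* + a_pi (pi_t - pi*)
                         + a_y * activity_t + a_s * spread_t]

    rho is the smoothing (policy inertia) parameter; a_s scales the
    response to the long-short term spread."""
    target = (r_star + pi_star + a_pi * (inflation - pi_star)
              + a_y * activity + a_s * term_spread)
    return rho * i_prev + (1.0 - rho) * target
```

Regime switching of the kind estimated in the paper would amount to letting the coefficients (rho, a_pi, a_y, a_s) take different values in the low- and high-volatility regimes.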


In this study, we define a cost-sharing rule for cost-sharing problems. This rule is related to the serial cost-sharing rule defined by Moulin and Shenker (1992). We give some formulas and axiomatic characterizations for the new rule. The axiomatic characterizations are related to some previous ones provided by Moulin and Shenker (1994) and Albizuri (2010).
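The serial cost-sharing rule of Moulin and Shenker (1992), to which the new rule is related, can be sketched as follows (the implementation and example cost functions are illustrative; with a linear cost function each agent simply pays its own demand):

```python
def serial_cost_shares(demands, cost):
    """Serial cost-sharing rule of Moulin and Shenker (1992).

    demands: list of individual demands q_i; cost: cost function C with C(0)=0.
    With demands sorted q_1 <= ... <= q_n, define
        s_k = q_1 + ... + q_{k-1} + (n - k + 1) * q_k,   s_0 = 0,
    and agent k pays  x_k = sum_{j<=k} (C(s_j) - C(s_{j-1})) / (n - j + 1).
    Returns cost shares in the original order of `demands`."""
    n = len(demands)
    order = sorted(range(n), key=lambda i: demands[i])
    q = [demands[i] for i in order]
    s = [0.0]
    for k in range(1, n + 1):
        s.append(sum(q[:k - 1]) + (n - k + 1) * q[k - 1])
    shares_sorted, pay = [], 0.0
    for k in range(1, n + 1):
        pay += (cost(s[k]) - cost(s[k - 1])) / (n - k + 1)
        shares_sorted.append(pay)
    shares = [0.0] * n
    for pos, i in enumerate(order):
        shares[i] = shares_sorted[pos]
    return shares
```

By construction the shares always sum to C(q_1 + ... + q_n), and an agent's share depends on larger demands only through their number, which is the defining feature of the serial rule.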
