893 results for Minimal Set


Relevance: 20.00%

Abstract:

Food webs represent trophic (feeding) interactions in ecosystems. Since the late 1970s, it has been recognized that food webs have a surprisingly close relationship to interval graphs. One interpretation of food-web intervality is that trophic niche space is low-dimensional, meaning that the trophic character of a species can be expressed by one, or at most a few, quantitative traits. In a companion paper we demonstrated, by simulating a minimal food-web model, that food webs are also expected to be interval when niche space is high-dimensional. Here we characterize the fundamental mechanisms underlying this phenomenon by proving a set of rigorous conditions for food-web intervality in high-dimensional niche spaces. Our results apply to a large class of food-web models, including the special case previously studied numerically.
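
Intervality can be stated operationally: a web is interval if the species can be ordered along a single niche axis so that every consumer's diet forms a contiguous block of that ordering (the consecutive-ones property of the diet matrix). The following brute-force sketch illustrates that check; it is not code from the paper, and the toy web and function names are illustrative.

```python
# A minimal sketch of the food-web intervality check: try every ordering
# of the resource species and test whether each consumer's diet is a
# contiguous block. Brute force; fine only for tiny webs.
from itertools import permutations

def is_interval(diets, species):
    """diets: dict consumer -> set of resources; species: list of resources."""
    for order in permutations(species):
        pos = {s: i for i, s in enumerate(order)}
        ok = True
        for prey in diets.values():
            idx = sorted(pos[p] for p in prey)
            # contiguous block <=> max - min + 1 == number of prey
            if idx and idx[-1] - idx[0] + 1 != len(idx):
                ok = False
                break
        if ok:
            return order  # an ordering witnessing intervality
    return None

# Toy web: three consumers feeding on four resources.
web = {"c1": {"r1", "r2"}, "c2": {"r2", "r3"}, "c3": {"r3", "r4"}}
print(is_interval(web, ["r1", "r2", "r3", "r4"]))
```

For realistic web sizes one would use the linear-time consecutive-ones test of Booth and Lueker (PQ-trees) rather than permutation search.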

Relevance: 20.00%

Abstract:

In this work we present the theoretical framework for the solution of the time-dependent Schrödinger equation (TDSE) of atomic and molecular systems under strong electromagnetic fields, with the configuration space of the electron's coordinates separated into two regions, I and II. In region I the solution of the TDSE is obtained from an R-matrix basis-set representation of the time-dependent wave function. In region II a grid representation of the wave function is used, and propagation in space and time is carried out by the finite-difference method. A combination of basis-set and grid methods is thus put forward for tackling multiregion time-dependent problems. In both regions, a high-order explicit scheme is employed for the time propagation. While no approximation is introduced by this separation for a purely hydrogenic system, in multielectron systems the validity and usefulness of the present method rely on the basic assumption of R-matrix theory, namely that beyond a certain distance (encompassing region I) a single ejected electron is distinguishable from the other electrons of the multielectron system and evolves there (region II) effectively as a one-electron system. The method is developed in detail for single-active-electron systems and applied to the exemplar case of the hydrogen atom in an intense laser field.
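
As a loose illustration of the region II machinery (grid representation plus a high-order explicit propagator), here is a minimal one-dimensional sketch in atomic units. The soft-core potential, grid parameters, and truncation order are assumptions for illustration, not the paper's settings; the time step applies a truncated Taylor expansion of the evolution operator exp(-iH dt), which is one example of an explicit high-order scheme.

```python
# 1-D grid propagation sketch: three-point finite-difference Laplacian
# and an explicit Taylor-series step for exp(-iH dt). Atomic units.
import numpy as np

n, dx, dt, order = 512, 0.2, 0.002, 8
x = (np.arange(n) - n // 2) * dx
V = -1.0 / np.sqrt(x**2 + 1.0)            # soft-core "hydrogen-like" potential

def H(psi):
    lap = np.zeros_like(psi)
    lap[1:-1] = (psi[:-2] - 2 * psi[1:-1] + psi[2:]) / dx**2
    return -0.5 * lap + V * psi

def step(psi):
    """One explicit step: psi <- sum_k (-i dt)^k H^k psi / k!."""
    out, term = psi.copy(), psi.copy()
    for k in range(1, order + 1):
        term = (-1j * dt / k) * H(term)
        out = out + term
    return out

psi = np.exp(-x**2).astype(complex)       # arbitrary initial wave packet
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)
for _ in range(100):
    psi = step(psi)
print("norm after 100 steps:", np.sum(np.abs(psi)**2) * dx)
```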

Relevance: 20.00%

Abstract:

Power system islanding can improve the continuity of power supply. Synchronous islanded operation enables the islanded system to remain in phase with the main power system while not electrically connected, thereby avoiding out-of-synchronism re-closure. Specific consideration is required for the multiple-set scenario. In this paper a suitable island management system is proposed, with the emphasis on maximum island flexibility: passive islanding transitions are allowed to occur, facilitated by intelligent control. These transitions include island detection, identification, fragmentation, merging, and return-to-mains. Detecting these transitions while maintaining synchronous islanded operation can be challenging. The performance of this control system in the presence of a variable wind-power in-feed is also examined. A MathWorks SimPowerSystems simulation is used to investigate the performance of the island management system. The benefits of, and requirements for, energy storage, communications, and distribution system protection for this application are considered.
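
The transition set named above can be pictured as a small state machine. The sketch below is purely illustrative; the states, event names, and handler are assumptions, not the proposed island management system.

```python
# Illustrative state machine over the islanding transitions listed above.
from enum import Enum, auto

class Mode(Enum):
    GRID_CONNECTED = auto()
    ISLANDED = auto()          # synchronous islanded operation

TRANSITIONS = {
    (Mode.GRID_CONNECTED, "island_detected"): Mode.ISLANDED,
    (Mode.ISLANDED, "fragmentation"): Mode.ISLANDED,   # new, smaller island
    (Mode.ISLANDED, "merge"): Mode.ISLANDED,           # islands combine
    (Mode.ISLANDED, "return_to_mains"): Mode.GRID_CONNECTED,
}

def handle(mode, event):
    return TRANSITIONS.get((mode, event), mode)  # unknown events: hold state

mode = Mode.GRID_CONNECTED
for ev in ["island_detected", "fragmentation", "merge", "return_to_mains"]:
    mode = handle(mode, ev)
    print(ev, "->", mode.name)
```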

Relevance: 20.00%

Abstract:

Let D be the differentiation operator Df = f' acting on the Fréchet space H of all entire functions in one variable with the standard (compact-open) topology. It has been known since the 1950s that the set H(D) of hypercyclic vectors for the operator D is non-empty. We treat two questions raised by Aron, Conejero, Peris and Seoane-Sepúlveda: whether the set H(D) contains (up to the zero function) a non-trivial subalgebra of H, and whether it contains an infinite-dimensional closed linear subspace of H. In the present article both questions are answered affirmatively.
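
For reference, the standard definitions behind the abstract (the non-emptiness of H(D) is MacLane's theorem from 1952):

```latex
% D acts on the Frechet space H of entire functions; f is hypercyclic
% for D when its orbit under repeated differentiation is dense in H.
\[
  Df = f', \qquad
  f \in H(D) \iff \overline{\{\, D^{n}f : n \ge 0 \,\}} = H .
\]
% The two questions ask whether H(D), together with the zero function,
% contains a non-trivial subalgebra of H, and whether it contains an
% infinite-dimensional closed linear subspace of H.
```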

Relevance: 20.00%

Abstract:

We propose a new approach to the inversion of anisotropic P-wave data based on Monte Carlo methods combined with a multigrid approach. Simulated annealing facilitates objective minimization of the functional characterizing the misfit between observed and predicted traveltimes, as controlled by the Thomsen anisotropy parameters (epsilon, delta). Cycling between finer and coarser grids enhances the computational efficiency of the inversion, accelerating the convergence of the solution while acting as a regularization of the inverse problem. Multigrid perturbation samples the probability density function without requiring the user to adjust tuning parameters. This increases the probability that the preferred global minimum, rather than a poor local one, is attained. Undertaking multigrid refinement and Monte Carlo search in parallel produces more robust convergence than the initially more intuitive approach of completing them sequentially. We demonstrate the usefulness of the new multigrid Monte Carlo (MGMC) scheme by applying it to (a) synthetic, noise-contaminated data reflecting an isotropic subsurface of constant slowness, horizontally layered geologic media, and discrete subsurface anomalies; and (b) a crosshole seismic data set acquired by previous authors at the Reskajeage test site in Cornwall, UK. Inverted distributions of slowness (s) and the Thomsen anisotropy parameters (epsilon, delta) compare favourably with those obtained previously using a popular matrix-based method. Reconstruction of the Thomsen epsilon parameter is particularly robust compared with that of slowness and the Thomsen delta parameter, even in the face of complex subsurface anomalies. The Thomsen epsilon and delta parameters have enhanced sensitivities to bulk-fabric and fracture-based anisotropies in the TI medium at Reskajeage. Because the reconstruction of slowness (s) is intimately linked to that of epsilon and delta in the MGMC scheme, inverted images of phase velocity reflect the integrated effects of these two modes of anisotropy. The new MGMC technique thus promises to facilitate rapid inversion of crosshole P-wave data for seismic slownesses and the Thomsen anisotropy parameters, with minimal user input in the inversion process.
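
A toy sketch of the multigrid Monte Carlo idea follows: simulated annealing with a Metropolis acceptance rule over a model vector, cycling from coarse to fine grids by interpolation. The misfit function, cooling schedule, and grid sizes are illustrative assumptions, not the authors' implementation.

```python
# Toy multigrid simulated annealing: anneal on a coarse model, then
# interpolate to a finer grid and continue, which regularizes the
# inversion and speeds convergence.
import numpy as np

rng = np.random.default_rng(0)
true_model = np.sin(np.linspace(0, np.pi, 64))          # "observed" target
misfit = lambda m: np.mean((np.interp(np.linspace(0, 1, 64),
                                      np.linspace(0, 1, m.size), m)
                            - true_model) ** 2)

def anneal(m, steps, T0):
    f = misfit(m)
    for k in range(steps):
        T = T0 * (1 - k / steps) + 1e-6                  # linear cooling
        trial = m + rng.normal(0, 0.1, m.size)
        ft = misfit(trial)
        if ft < f or rng.random() < np.exp((f - ft) / T):  # Metropolis rule
            m, f = trial, ft
    return m, f

m = np.zeros(4)                                          # coarsest grid
for n in (4, 8, 16, 32):                                 # multigrid cycle
    m = np.interp(np.linspace(0, 1, n), np.linspace(0, 1, m.size), m)
    m, f = anneal(m, steps=2000, T0=0.1)
    print(f"grid {n:2d}: misfit {f:.5f}")
```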

Relevance: 20.00%

Abstract:

As a potential alternative to CMOS technology, QCA provides an interesting paradigm for both communication and computation. However, QCA's unique four-phase clocking scheme and timing constraints present serious timing issues for interconnection and feedback. In this work, a cut-set retiming design procedure is proposed to resolve these QCA timing issues. The proposed procedure accommodates QCA's unique characteristics by performing delay transfer and time scaling to reallocate existing delays, so as to achieve efficient clocking-zone assignment. Cut-set retiming makes it possible to effectively design relatively complex QCA circuits that include feedback. It exploits the characteristics of synchronization, deep pipelining and local interconnection common to both QCA and systolic architectures. As a case study, a systolic Montgomery modular multiplier is designed to illustrate the procedure. Furthermore, a nonsystolic architecture, the S27 benchmark circuit, is designed and compared with previous designs. The comparison shows that the cut-set retiming method achieves a more efficient design, with reductions of 22%, 44%, and 46% in cell count, area, and latency, respectively.
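
The delay-transfer step at the heart of cut-set retiming is easy to state for a generic synchronous dataflow graph: pick a cut splitting the nodes into sets G1 and G2, add k delays to every edge crossing the cut in one direction, and remove k from every edge crossing in the other; input/output behaviour is preserved provided no edge delay becomes negative. A minimal generic sketch (not a QCA design tool) follows.

```python
# Cut-set retiming, graph-level bookkeeping only.
def cutset_retime(edges, g1, k):
    """edges: dict (u, v) -> delay count; g1: node set on one side of the cut."""
    out = {}
    for (u, v), d in edges.items():
        if u in g1 and v not in g1:
            d += k                     # crossing G1 -> G2: add k delays
        elif u not in g1 and v in g1:
            d -= k                     # crossing G2 -> G1: remove k delays
        if d < 0:
            raise ValueError(f"negative delay on edge {(u, v)}")
        out[(u, v)] = d
    return out

# Toy feedback loop a -> b -> c -> a with delays 0, 0, 2.
edges = {("a", "b"): 0, ("b", "c"): 0, ("c", "a"): 2}
print(cutset_retime(edges, g1={"a"}, k=1))
# {('a','b'): 1, ('b','c'): 0, ('c','a'): 1} -- total loop delay unchanged
```

In the paper's setting the transferred delays correspond to QCA clocking zones; the sketch shows only the graph-level step.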

Relevance: 20.00%

Abstract:

Measuring the degree of inconsistency of a belief base is an important issue in many real-world applications. It has been increasingly recognized that deriving syntax-sensitive inconsistency measures for a belief base from its minimal inconsistent subsets is a natural way forward. Most of the current proposals along this line do not take into account the impact of the size of each minimal inconsistent subset. However, as illustrated by the well-known Lottery Paradox, as the size of a minimal inconsistent subset increases, the degree of its inconsistency decreases. Another gap in current studies in this area concerns the role of free formulas of a belief base in measuring the degree of inconsistency; this has not yet been well characterized. Adding free formulas to a belief base enlarges the set of consistent subsets of that base, and consistent subsets also have an impact on syntax-sensitive normalized measures of the degree of inconsistency: each consistent subset can be considered a distinctive plausible perspective reflected by the belief base, whilst each minimal inconsistent subset projects a distinctive view of the inconsistency. To address these two issues, we propose a normalized framework for measuring the degree of inconsistency of a belief base which unifies the impact of both consistent subsets and minimal inconsistent subsets. We also show that this normalized framework satisfies all the properties deemed necessary by common consent to characterize an intuitively satisfactory measure of the degree of inconsistency for belief bases. Finally, we use a simple but explanatory example in requirements engineering to illustrate the application of the normalized framework.
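
To make the size effect concrete, here is a toy propositional sketch (not the paper's framework): minimal inconsistent subsets (MIS) are enumerated by brute force and each contributes 1/|M|, so larger subsets count for less. The formula names and the scoring rule are illustrative.

```python
# Enumerate minimal inconsistent subsets of a tiny propositional belief
# base and score each as 1/|M|, so bigger MIS contribute less.
from itertools import combinations, product

def consistent(formulas, atoms):
    return any(all(f(dict(zip(atoms, vals))) for f in formulas)
               for vals in product([True, False], repeat=len(atoms)))

def minimal_inconsistent_subsets(base, atoms):
    mis = []
    for r in range(1, len(base) + 1):       # ascending size => minimality
        for sub in combinations(base, r):
            if not consistent([f for _, f in sub], atoms) and \
               not any(set(m) <= set(sub) for m in mis):
                mis.append(sub)
    return mis

atoms = ("p", "q")
base = [("p",      lambda v: v["p"]),
        ("not p",  lambda v: not v["p"]),
        ("q",      lambda v: v["q"])]

mis = minimal_inconsistent_subsets(base, atoms)
print([[name for name, _ in m] for m in mis])        # [['p', 'not p']]
print("measure:", sum(1 / len(m) for m in mis))      # 0.5
```

Note that q is a free formula here: it appears in no minimal inconsistent subset, which is exactly the case the normalized framework is designed to account for.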

Relevance: 20.00%

Abstract:

In this preliminary study, we investigate how inconsistency in a network intrusion detection rule set can be measured. To achieve this, we first examine the structure of these rules which are based on Snort and incorporate regular expression (Regex) pattern matching. We then identify primitive elements in these rules in order to translate the rules into their (equivalent) logical forms and to establish connections between them. Additional rules from background knowledge are also introduced to make the correlations among rules more explicit. We measure the degree of inconsistency in formulae of such a rule set (using the Scoring function, Shapley inconsistency values and Blame measure for prioritized knowledge) and compare the informativeness of these measures. Finally, we propose a new measure of inconsistency for prioritized knowledge which incorporates the normalized number of atoms in a language involved in inconsistency to provide a deeper inspection of inconsistent formulae. We conclude that such measures are useful for the network intrusion domain assuming that introducing expert knowledge for correlation of rules is feasible.

Relevance: 20.00%

Abstract:

It is increasingly recognized that identifying the degree of blame or responsibility of each formula for the inconsistency of a knowledge base (i.e. a set of formulas) is useful for making rational decisions to resolve inconsistency in that knowledge base. Most current techniques for measuring the blame of each formula with regard to an inconsistent knowledge base focus on classical knowledge bases only. Proposals for measuring the blame of formulas with regard to an inconsistent prioritized knowledge base have not yet been given much consideration, although the notion of priority is important in inconsistency-tolerant reasoning. This article investigates this issue and presents a family of measurements for the degree of blame of each formula in an inconsistent prioritized knowledge base, using the minimal inconsistent subsets of that knowledge base. First, we present a set of intuitive postulates as general criteria to characterize rational measurements of the blame of formulas in an inconsistent prioritized knowledge base. Then we present a family of measurements for the blame of each formula, guided by the principle of proportionality, one of the intuitive postulates. We also demonstrate that each of these measurements possesses the properties it ought to have. Finally, we use a simple but explanatory example in requirements engineering to illustrate the application of these measurements. Compared with related work, the postulates presented in this article consider the special characteristics of minimal inconsistent subsets as well as the priority levels of formulas. This makes them more appropriate for characterizing inconsistency measures defined from minimal inconsistent subsets for prioritized knowledge bases as well as classical knowledge bases. Correspondingly, the measures guided by these postulates can intuitively capture the inconsistency of prioritized knowledge bases.

Relevance: 20.00%

Abstract:

We propose an exchange rate model that is a hybrid of the conventional specification with monetary fundamentals and the Evans–Lyons microstructure approach. We estimate a model augmented with order-flow variables, using a unique data set: almost 100 monthly observations on interdealer order flow in dollar/euro and dollar/yen. The augmented macroeconomic, or "hybrid," model exhibits greater in-sample stability and out-of-sample forecasting improvement vis-à-vis the basic macroeconomic and random-walk specifications.
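
A minimal sketch of the hybrid specification on synthetic data, with assumed variable names: the monthly exchange-rate change is regressed by ordinary least squares on a macro-fundamentals term plus interdealer order flow.

```python
# OLS sketch of a hybrid macro + order-flow regression (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
T = 100                                        # ~100 monthly observations
d_fund = rng.normal(0, 1, T)                   # change in monetary fundamentals
flow = rng.normal(0, 1, T)                     # net interdealer order flow
d_rate = 0.3 * d_fund + 0.5 * flow + rng.normal(0, 0.2, T)

X = np.column_stack([np.ones(T), d_fund, flow])
beta, *_ = np.linalg.lstsq(X, d_rate, rcond=None)
print("const, macro, order-flow betas:", np.round(beta, 3))
```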

Relevance: 20.00%

Abstract:

Motivation: We study a stochastic method for approximating the set of local minima in partial RNA folding landscapes associated with a bounded-distance neighbourhood of folding conformations. The conformations are limited to RNA secondary structures without pseudoknots. The method aims at exploring partial energy landscapes pL induced by folding simulations and their underlying neighbourhood relations. It combines an approximation of the number of local optima devised by Garnier and Kallel (2002) with a run-time estimation for identifying sets of local optima established by Reeves and Eremeev (2004).

Results: The method is tested on nine sequences of length between 50 nt and 400 nt, which allows us to compare the results with data generated by RNAsubopt and subsequent barrier-tree calculations. On the nine sequences, the method captures on average 92% of the local minima with settings designed for a target of 95%. The run-time of the heuristic can be estimated by O(n² · D · ℓ · ln ℓ), where n is the sequence length, ℓ is the number of local minima in the partial landscape pL under consideration, and D is the maximum number of steepest-descent steps in the attraction basins associated with pL.
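
The sampling scheme underlying the method can be illustrated on a toy landscape (not RNA): draw random starting conformations, follow steepest descent to a local minimum, and collect the distinct minima reached. The actual estimate of the total count ℓ follows Garnier and Kallel (2002); everything below is an illustrative assumption.

```python
# Toy 1-D integer landscape: random restarts plus steepest descent,
# reporting distinct local minima and the maximum descent depth D.
import random

random.seed(0)
E = [random.randint(0, 50) for _ in range(200)]       # energy per state

def descend(i):
    steps = 0
    while True:
        nbrs = [j for j in (i - 1, i + 1) if 0 <= j < len(E)]
        best = min(nbrs, key=lambda j: E[j])
        if E[best] >= E[i]:
            return i, steps                           # local minimum reached
        i, steps = best, steps + 1

minima, depth = set(), 0
for _ in range(500):                                  # random restarts
    m, d = descend(random.randrange(len(E)))
    minima.add(m)
    depth = max(depth, d)
print(f"{len(minima)} distinct local minima, max descent depth {depth}")
```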

Relevance: 20.00%

Abstract:

A UV indicator/dosimeter based on benzyl viologen (BV²⁺) encapsulated in polyvinyl alcohol (PVA) is described. Upon exposure to UV light, the BV²⁺/PVA film turns a striking purple colour due to the formation of the cation radical, BV•⁺. The usual oxygen sensitivity of BV•⁺ is significantly reduced due to the very low oxygen permeability of the encapsulating polymer, PVA. Exposure of a typical BV²⁺/PVA film, for a set amount of time, to UVB light with different UV indices produces different levels of BV•⁺, as measured by the absorbance of the film at 550 nm. A plot of the change in absorbance at this wavelength, ΔAbs(550), as a function of UV index, UVI, produces a linear calibration curve which allows the film to be used as a UVB indicator, and a similar procedure could be employed to allow it to be used as a solar UVI indicator. A typical BV²⁺/PVA film generates a significant, semi-permanent (stable for > 24 h) saturated purple colour (absorbance ≈ 0.8–0.9) upon exposure to sunlight equivalent to a minimal erythemal dose associated with Caucasian skin, i.e. skin type II. The current drawbacks of the film and the possible future use of the BV²⁺/PVA film as a personal solar UV dosimeter for all skin types are briefly discussed.
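
Reading off a UVI from the film amounts to inverting the linear calibration. A minimal sketch with made-up numbers follows; the ΔAbs(550) values are hypothetical, not the paper's data.

```python
# Fit a linear calibration of delta-absorbance at 550 nm vs. UV index,
# then invert it to estimate the UVI from a measured film response.
import numpy as np

uvi = np.array([1, 2, 4, 6, 8, 10], dtype=float)        # exposure conditions
d_abs = np.array([0.08, 0.17, 0.33, 0.49, 0.66, 0.81])  # hypothetical dAbs(550)

slope, intercept = np.polyfit(uvi, d_abs, 1)            # linear calibration
print(f"dAbs(550) ~ {slope:.3f} * UVI + {intercept:.3f}")

measured = 0.40                                         # film after exposure
print("estimated UVI:", round((measured - intercept) / slope, 1))
```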