45 results for Approximate Solution


Relevance: 20.00%

Abstract:

Cobre Las Cruces is a renowned copper mining company located in Sevilla that has been experiencing unexpected problems in its wireless communications, with a direct impact on production. The main goals are therefore to improve the WiFi infrastructure, to secure it, and to detect and prevent attacks and the installation of rogue (non-authorized) APs, all integrated with the current ICT infrastructure. This project has been divided into four phases, although only two of them have been included in the TFC: the analysis of the current situation and the design of a WLAN solution. Once the analysis was finished, several weaknesses were detected: lack of connectivity and control, ignorance about the installed WiFi devices and their location and state, and, by and large, the use of weak security mechanisms. Additionally, because the working area grew and new WiFi infrastructures were added, the first phase took more time than expected. As a result of the detailed analysis, a set of goals was defined and a centralized approach able to cope with them was designed. A solution based on the 802.11i and 802.1X protocols, digital certificates, a probe system running as an IDS/IPS, and lightweight APs in conjunction with a Wireless LAN Controller are its main features.

Relevance: 20.00%

Abstract:

We present a method for analyzing the curvature (second derivatives) of the conical intersection hyperline at an optimized critical point. Our method uses the projected Hessians of the degenerate states after elimination of the two branching space coordinates, and is equivalent to a frequency calculation on a single Born-Oppenheimer potential-energy surface. Based on the projected Hessians, we develop an equation for the energy as a function of a set of curvilinear coordinates where the degeneracy is preserved to second order (i.e., the conical intersection hyperline). The curvature of the potential-energy surface in these coordinates is the curvature of the conical intersection hyperline itself, and thus determines whether one has a minimum or saddle point on the hyperline. The equation used to classify optimized conical intersection points depends in a simple way on the first- and second-order degeneracy splittings calculated at these points. As an example, for fulvene, we show that the two optimized conical intersection points of C2v symmetry are saddle points on the intersection hyperline. Accordingly, there are further intersection points of lower energy, and one of C2 symmetry - presented here for the first time - is found to be the global minimum in the intersection space.
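As a schematic illustration of the construction described above (notation is assumed here rather than taken from the paper), the energy over the intersection space can be expanded to second order with the projected Hessian:

\[
E(\bar{\mathbf{x}}) \;\approx\; E_{X} \;+\; \tfrac{1}{2}\,\bar{\mathbf{x}}^{\mathsf{T}}\,\mathbf{P}\,\mathbf{H}\,\mathbf{P}\,\bar{\mathbf{x}},
\]

where $E_{X}$ is the energy at the optimized intersection point, $\mathbf{H}$ is the Hessian of the degenerate states, $\mathbf{P}$ projects out the two branching-space directions (gradient difference and derivative coupling), and $\bar{\mathbf{x}}$ collects the remaining curvilinear intersection-space coordinates. The signs of the eigenvalues of $\mathbf{P}\mathbf{H}\mathbf{P}$ then decide whether the optimized point is a minimum (all positive) or a saddle point (at least one negative) on the hyperline.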

Relevance: 20.00%

Abstract:

Recently, the surprising result that ab initio calculations on benzene and other planar arenes at correlated MP2, MP3, configuration interaction with singles and doubles (CISD), and coupled cluster with singles and doubles levels of theory using standard Pople's basis sets yield nonplanar minima has been reported. The planar optimized structures turn out to be transition states presenting one or more large imaginary frequencies, whereas single-determinant-based methods lead to the expected planar minima and no imaginary frequencies. It has been suggested that such anomalous behavior may originate from the two-electron basis set incompleteness error. In this work, we show that the reported pitfalls can be interpreted in terms of intramolecular basis set superposition error (BSSE) effects, mostly between the C–H moieties constituting the arenes. We have carried out counterpoise-corrected optimizations and frequency calculations at the Hartree–Fock, B3LYP, MP2, and CISD levels of theory with several basis sets for a number of arenes. In all cases, correcting for intramolecular BSSE fixes the anomalous behavior of the correlated methods, whereas no significant differences are observed in the single-determinant case. Consequently, all systems studied are planar at all levels of theory. The effect of different intramolecular fragment definitions and the particular case of charged species, namely the cyclopentadienyl and indenyl anions, are also discussed.
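For reference (the standard Boys-Bernardi counterpoise scheme on which such corrections are built, not a formula quoted from the paper), with the molecule partitioned into intramolecular fragments $A$ and $B$ (for example, C–H moieties), the counterpoise-corrected energy reads

\[
E^{\mathrm{CP}} \;=\; E_{AB}(AB) \;+\; \sum_{X \in \{A,B\}} \bigl[\, E_{X}(X) - E_{X}(AB) \,\bigr],
\]

where $E_{Y}(Z)$ is the energy of subsystem $Y$ computed in basis $Z$, so $E_{X}(AB)$ places ghost basis functions on the atoms of the other fragment. The bracketed terms remove the artificial stabilization each fragment gains by borrowing the other fragment's basis functions.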

Relevance: 20.00%

Abstract:

Biplots are graphical displays of data matrices based on the decomposition of a matrix as the product of two matrices. Elements of these two matrices are used as coordinates for the rows and columns of the data matrix, with an interpretation of the joint presentation that relies on the properties of the scalar product. Because the decomposition is not unique, there are several alternative ways to scale the row and column points of the biplot, which can cause confusion amongst users, especially when software packages are not united in their approach to this issue. We propose a new scaling of the solution, called the standard biplot, which applies equally well to a wide variety of analyses such as correspondence analysis, principal component analysis, log-ratio analysis and the graphical results of a discriminant analysis/MANOVA, in fact to any method based on the singular-value decomposition. The standard biplot also handles data matrices with widely different levels of inherent variance. Two concepts taken from correspondence analysis are important to this idea: the weighting of row and column points, and the contributions made by the points to the solution. In the standard biplot one set of points, usually the rows of the data matrix, optimally represents the positions of the cases or sample units, which are weighted and usually standardized in some way unless the matrix contains values that are comparable in their raw form. The other set of points, usually the columns, is represented in accordance with its contributions to the low-dimensional solution. As for any biplot, the projections of the row points onto vectors defined by the column points approximate the centred and (optionally) standardized data. The method is illustrated with several examples to demonstrate how the standard biplot copes in different situations to give a joint map which needs only one common scale on the principal axes, thus avoiding the problem of enlarging or contracting the scale of one set of points to make the biplot readable. The proposal also solves the problem in correspondence analysis of low-frequency categories that are located on the periphery of the map, giving the false impression that they are important.
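As a minimal sketch of the generic SVD-based construction the abstract builds on (not the proposed standard-biplot scaling itself; the function name and the scaling parameter are assumptions made for illustration):

```python
import numpy as np

def svd_biplot_coords(X, alpha=1.0):
    """Generic SVD biplot coordinates for a data matrix X.

    The column-centred matrix is factored as Xc = U * S * Vt; row coordinates
    get a share S**alpha of the singular values and column coordinates get the
    remaining S**(1 - alpha), so their scalar products reproduce Xc.
    alpha=1 gives row-principal coordinates, alpha=0 column-principal ones.
    """
    Xc = X - X.mean(axis=0)                  # centre the columns
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    rows = U * s**alpha                      # row (case) coordinates
    cols = Vt.T * s**(1.0 - alpha)           # column (variable) coordinates
    return rows[:, :2], cols[:, :2]          # keep the first two dimensions

# Example: projections of row points onto column vectors approximate Xc
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
F, G = svd_biplot_coords(X, alpha=1.0)
```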

Relevance: 20.00%

Abstract:

We extend Aumann's theorem [Aumann 1987], deriving correlated equilibria as a consequence of common priors and common knowledge of rationality, by explicitly allowing for non-rational behavior. We replace the assumption of common knowledge of rationality with a substantially weaker one, joint p-belief of rationality, where agents believe the other agents are rational with probability p or more. We show that behavior in this case constitutes a kind of correlated equilibrium satisfying certain p-belief constraints, and that it varies continuously in the parameters p and, for p sufficiently close to one, with high probability is supported on strategies that survive the iterated elimination of strictly dominated strategies. Finally, we extend the analysis to characterizing rational expectations of interim types, to games of incomplete information, as well as to the case of non-common priors.
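For background (the textbook correlated-equilibrium condition of Aumann [1987] that the paper weakens, not the paper's p-belief constraints), a distribution $\mu$ over action profiles is a correlated equilibrium if, for every player $i$, every recommended action $a_i$ with $\mu(a_i) > 0$, and every deviation $a_i'$,

\[
\sum_{a_{-i}} \mu(a_{-i} \mid a_i)\,\bigl[\, u_i(a_i, a_{-i}) - u_i(a_i', a_{-i}) \,\bigr] \;\ge\; 0 ,
\]

where $u_i$ is player $i$'s payoff function and $\mu$ plays the role of the common prior.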

Relevance: 20.00%

Abstract:

Consider the problem of testing k hypotheses simultaneously. In this paper, we discuss finite and large sample theory of stepdown methods that provide control of the familywise error rate (FWE). In order to improve upon the Bonferroni method or Holm's (1979) stepdown method, Westfall and Young (1993) make effective use of resampling to construct stepdown methods that implicitly estimate the dependence structure of the test statistics. However, their methods depend on an assumption called subset pivotality. The goal of this paper is to construct general stepdown methods that do not require such an assumption. In order to accomplish this, we take a close look at what makes stepdown procedures work, and a key component is a monotonicity requirement of critical values. By imposing such monotonicity on estimated critical values (which is not an assumption on the model but an assumption on the method), it is demonstrated that the problem of constructing a valid multiple test procedure which controls the FWE can be reduced to the problem of constructing a single test which controls the usual probability of a Type 1 error. This reduction allows us to draw upon an enormous resampling literature as a general means of test construction.
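As a concrete baseline (Holm's 1979 stepdown procedure mentioned above as a point of departure, not the resampling-based method developed in the paper), a minimal sketch:

```python
import numpy as np

def holm_stepdown(pvalues, alpha=0.05):
    """Holm's (1979) stepdown procedure controlling the familywise error rate.

    Hypotheses are examined from the smallest p-value upward; the j-th ordered
    hypothesis is rejected while p_(j) <= alpha / (k - j) (0-based j), and
    testing stops at the first non-rejection. Returns a boolean array of
    rejections in the original order.
    """
    p = np.asarray(pvalues, dtype=float)
    k = p.size
    order = np.argsort(p)                    # indices of sorted p-values
    reject = np.zeros(k, dtype=bool)
    for j, idx in enumerate(order):
        if p[idx] <= alpha / (k - j):
            reject[idx] = True
        else:
            break                            # stepdown stops here
    return reject

# Example: Holm rejects at least as many hypotheses as plain Bonferroni
print(holm_stepdown([0.001, 0.011, 0.02, 0.4], alpha=0.05))
```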

Relevance: 20.00%

Abstract:

This paper studies the rate of convergence of an appropriate discretization scheme of the solution of the McKean-Vlasov equation introduced by Bossy and Talay. More specifically, we consider approximations of the distribution and of the density of the solution of the stochastic differential equation associated to the McKean-Vlasov equation. The scheme adopted here is a mixed one: Euler/weakly interacting particle system. If $n$ is the number of weakly interacting particles and $h$ is the uniform step in the time discretization, we prove that the rate of convergence of the distribution functions of the approximating sequence in the $L^1(\Omega\times \Bbb R)$ norm and in the sup norm is of the order of $\frac 1{\sqrt n} + h$, while for the densities it is of the order $h + \frac 1{\sqrt {nh}}$. This result is obtained by carefully employing techniques of Malliavin Calculus.
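A minimal sketch of the kind of mixed Euler/weakly-interacting-particle discretization discussed (the toy model, coefficients and function names are assumptions for illustration, not those of Bossy and Talay's setting):

```python
import numpy as np

def euler_particle_mckean_vlasov(b, sigma, x0, T=1.0, n=1000, steps=100, seed=0):
    """Euler scheme for a weakly interacting particle approximation of a
    one-dimensional McKean-Vlasov SDE dX_t = b(X_t, law(X_t)) dt + sigma(X_t) dW_t.

    The law of X_t is replaced by the empirical measure of n particles, so the
    drift of particle i is averaged over its interactions with all particles.
    b(x, y) is the pairwise interaction kernel, sigma(x) the diffusion.
    """
    rng = np.random.default_rng(seed)
    h = T / steps                                        # uniform time step
    X = np.full(n, float(x0))
    for _ in range(steps):
        drift = b(X[:, None], X[None, :]).mean(axis=1)   # empirical-measure drift
        dW = rng.normal(scale=np.sqrt(h), size=n)
        X = X + drift * h + sigma(X) * dW
    return X                                             # particle positions at time T

# Example: linear mean-field attraction with constant diffusion (assumed toy model)
XT = euler_particle_mckean_vlasov(lambda x, y: -(x - y), lambda x: 1.0, x0=0.0)
emp_cdf = lambda t: (XT <= t).mean()   # approximates the distribution function at time T
```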

Relevance: 20.00%

Abstract:

Creative accounting is a growing issue of interest in Spain. In this article we argue that the concept of true and fair view can limit or promote the use of creative accounting depending upon its interpretation. We review the range of meanings that true and fair view can take at an international level and compare the experience of the United Kingdom with that of Australia by analysing the use of true and fair view to limit creative accounting. Finally, we suggest lines of action to be considered by the Spanish accounting standard-setting institutions.

Relevance: 20.00%

Abstract:

In a previous paper a novel Generalized Multiobjective Multitree model (GMM-model) was proposed. This model considers for the first time multitree-multicast load balancing with splitting in a multiobjective context, whose mathematical solution is a whole Pareto optimal set that can include solutions beyond those it has been possible to find in the publications surveyed. To solve the GMM-model, in this paper a multi-objective evolutionary algorithm (MOEA) inspired by the Strength Pareto Evolutionary Algorithm (SPEA) is proposed. Experimental results considering up to 11 different objectives are presented for the well-known NSF network, with two simultaneous data flows.
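As a small illustration of the core notion the model relies on (a generic Pareto-dominance filter, not the SPEA-inspired algorithm proposed in the paper; minimization of all objectives is assumed):

```python
import numpy as np

def pareto_front(objectives):
    """Return a boolean mask of the non-dominated rows of an
    (n_solutions, n_objectives) array, with every objective minimized.

    A solution dominates another if it is no worse in every objective and
    strictly better in at least one.
    """
    F = np.asarray(objectives, dtype=float)
    n = F.shape[0]
    nondominated = np.ones(n, dtype=bool)
    for i in range(n):
        others_dominate = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        nondominated[i] = not others_dominate.any()
    return nondominated

# Example: three candidate routings evaluated on (delay, cost); the last is dominated
print(pareto_front([[2.0, 5.0], [3.0, 3.0], [4.0, 6.0]]))
```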

Relevance: 20.00%

Abstract:

The results of a crystal structure refinement of an anisotropic grandite garnet specimen with composition Gro36.4And63.6 are given. The structure obtained has orthorhombic symmetry (space group Fddd) and is compared with similar results obtained by other authors. In all cases the reduction of symmetry is due to the ordering of Fe3+ and Al in octahedral sites. Non-cubic structures of grandites are discussed in connection with optical, morphological and growth features of these minerals.

Relevance: 20.00%

Abstract:

We study under which conditions the core of a game involved in a convex decomposition of another game turns out to be a stable set of the decomposed game. Some applications and numerical examples, including the remarkable Lucas' five-player game with a unique stable set different from the core, are presented and analyzed.
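For background (standard definitions, not notation taken from the paper), in a TU game $(N, v)$ the core is the set of efficient, coalitionally rational allocations,

\[
C(v) = \Bigl\{\, x \in \mathbb{R}^{N} : \sum_{i \in N} x_i = v(N),\ \ \sum_{i \in S} x_i \ge v(S)\ \text{for all } S \subseteq N \,\Bigr\},
\]

while a von Neumann-Morgenstern stable set is a set of imputations that is internally stable (no imputation in the set dominates another in the set) and externally stable (every imputation outside the set is dominated by some imputation in the set).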

Relevance: 20.00%

Abstract:

We have performed a detailed study of the zenith angle dependence of the regeneration factor and distributions of events at SNO and SK for different solutions of the solar neutrino problem. In particular, we discuss the oscillatory behavior and the synchronization effect in the distribution for the LMA solution, the parametric peak for the LOW solution, etc. A physical interpretation of the effects is given. We suggest a new binning of events which emphasizes the distinctive features of the zenith angle distributions for the different solutions. We also find the correlations between the integrated day-night asymmetry and the rates of events in different zenith angle bins. The study of these correlations strengthens the identification power of the analysis.
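For reference (the conventional definition rather than a formula quoted from the paper), the integrated day-night asymmetry mentioned above is usually defined from the day and night event rates $D$ and $N$ as

\[
A_{DN} \;=\; 2\,\frac{N - D}{N + D}.
\]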

Relevance: 20.00%

Abstract:

The vacuum Einstein equations in five dimensions are shown to admit a solution describing a stationary asymptotically flat spacetime regular on and outside an event horizon of topology S1 × S2. It describes a rotating black ring. This is the first example of a stationary asymptotically flat vacuum solution with an event horizon of nonspherical topology. The existence of this solution implies that the uniqueness theorems valid in four dimensions do not have simple five-dimensional generalizations. It is suggested that increasing the spin of a spherical black hole beyond a critical value results in a transition to a black ring, which can have an arbitrarily large angular momentum for a given mass.