142 results for Soft computing
Abstract:
We present here a theoretical approach to compute the molecular magnetic anisotropy parameters, D(M) and E(M), for single-molecule magnets in any given spin eigenstate of the exchange spin Hamiltonian. We first describe a hybrid constant-M_S valence bond (VB) technique for solving spin Hamiltonians employing full spatial and spin symmetry adaptation, and we illustrate this technique by solving the exchange Hamiltonian of the Cu6Fe8 system. Treating the anisotropy Hamiltonian as a perturbation, we compute the D(M) and E(M) values for various eigenstates of the exchange Hamiltonian. Since the dipolar contribution to the magnetic anisotropy is negligibly small, we calculate the molecular anisotropy from the single-ion anisotropies of the metal centers. We have studied the variation of D(M) and E(M) by rotating the single-ion anisotropies in the case of the Mn12Ac and Fe8 SMMs in the ground and a few low-lying excited states of the exchange Hamiltonian. In both systems, we find that the molecular anisotropy changes drastically when the single-ion anisotropies are rotated. While in the Mn12Ac SMM the D(M) values depend strongly on the spin of the eigenstate, they are almost independent of the spin of the eigenstate in the Fe8 SMM. We also find that the D(M) value is almost insensitive to the orientation of the anisotropy of the core Mn(IV) ions. The dependence of D(M) on the energy gap between the ground and excited states in both systems has also been studied by using different sets of exchange constants.
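For context, the parameters D and E referred to in this abstract are the standard zero-field-splitting parameters; for a state of spin S the anisotropy Hamiltonian is commonly written as below (a standard textbook form, not necessarily the exact convention used in the paper):

```latex
\hat{H}_{\mathrm{aniso}} = D\left[\hat{S}_z^2 - \tfrac{1}{3}S(S+1)\right] + E\left(\hat{S}_x^2 - \hat{S}_y^2\right)
```

Here D measures the axial anisotropy and E the rhombic (transverse) anisotropy; rotating the single-ion anisotropy tensors, as the paper does, changes how the individual-ion terms project onto these molecular parameters.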
Abstract:
We study soft gluon k_t-resummation and the relevance of infrared (IR) gluons for the energy dependence of total hadronic cross-sections. In our model, consistency with the Froissart bound is directly related to the ansatz that the IR behaviour of the QCD coupling constant follows an inverse power law.
Abstract:
We compute the throughput obtained by a TCP connection in a UMTS environment. For data downloaded to a mobile terminal, the packets of each TCP connection are stored in separate queues at the base station (node B). Also, due to fragmentation of the TCP packets into Protocol Data Units (PDUs) and link-layer retransmissions of PDUs, there can be significant delays at the queue of the node B. In such a scenario, the existing models of TCP may not be sufficient. Thus, we provide a new approximate TCP model and also obtain new closed-form expressions for the mean window size. Using these, we obtain the throughput of a TCP connection, which matches simulations quite well.
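The paper's UMTS-specific model and its closed-form window-size expressions are not reproduced in this abstract. As illustrative context only, the kind of closed-form throughput estimate such models refine is the classic Mathis-style square-root approximation, sketched here:

```python
import math

def tcp_throughput_mathis(mss_bytes: float, rtt_s: float, loss_rate: float) -> float:
    """Classic square-root approximation for steady-state TCP throughput
    (Mathis et al.): T ~= (MSS / RTT) * sqrt(3 / (2 * p)).
    Returns bytes per second. This is NOT the paper's UMTS model; it is
    the baseline formula that link-layer-aware models extend."""
    return (mss_bytes / rtt_s) * math.sqrt(3.0 / (2.0 * loss_rate))

# Example: 1460-byte MSS, 200 ms RTT (plausible for UMTS), 1% packet loss
tput = tcp_throughput_mathis(1460, 0.2, 0.01)
print(f"{tput / 1000:.0f} kB/s")
```

Models like the one described above replace the fixed RTT and loss terms with queueing and PDU-retransmission delays at the node B.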
Abstract:
With technology scaling, vulnerability to soft errors in random logic is increasing. There is a need for on-line error detection and protection for logic gates even at sea level. The error checker is the key element for an on-line detection mechanism. We compare three different checkers for error detection from the point of view of area, power and false error detection rates. We find that the double sampling checker (used in Razor), is the simplest and most area and power efficient, but suffers from very high false detection rates of 1.15 times the actual error rates. We also find that the alternate approaches of triple sampling and integrate and sample method (I&S) can be designed to have zero false detection rates, but at an increased area, power and implementation complexity. The triple sampling method has about 1.74 times the area and twice the power as compared to the Double Sampling method and also needs a complex clock generation scheme. The I&S method needs about 16% more power with 0.58 times the area as double sampling, but comes with more stringent implementation constraints as it requires detection of small voltage swings.
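The trade-offs quoted in the abstract can be summarized compactly; the snippet below just tabulates the numbers stated above, normalized to the double-sampling (Razor-style) checker:

```python
# Relative cost of the three soft-error checkers, normalized to the
# double-sampling checker, using only the figures quoted in the abstract.
checkers = {
    # name: (area_ratio, power_ratio, false_to_true_detection_ratio)
    "double_sampling":      (1.00, 1.00, 1.15),  # false detections = 1.15x real errors
    "triple_sampling":      (1.74, 2.00, 0.00),  # needs complex clock generation
    "integrate_and_sample": (0.58, 1.16, 0.00),  # ~16% more power, small-swing detection
}

for name, (area, power, false_ratio) in checkers.items():
    print(f"{name:22s} area x{area:.2f}  power x{power:.2f}  false/true {false_ratio:.2f}")
```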
Abstract:
A model for total cross-sections incorporating QCD jet cross-sections and soft gluon resummation is described and compared with present data on pp and p̄p cross-sections. Predictions for the LHC are presented for different parameter sets. It is shown that they differ according to the small-x behaviour of available parton density functions.
Abstract:
Theoretical expressions for stresses and displacements have been derived for bending under a ring load of a free shell, a shell embedded in a soft medium, and a shell containing a soft core. Numerical work has been done for typical cases with an Elliott 803 digital computer, and influence lines are drawn therefrom.
Abstract:
Ion transport in a recently demonstrated promising soft-matter solid plastic-polymer electrolyte is discussed here in the context of solvent dynamics and ion association. The plastic-polymer composite electrolytes display liquid-like ionic conductivity in the solid state, compliant mechanical strength (~1 MPa), and wide electrochemical voltage stability (>= 5 V). Polyacrylonitrile (PAN) dispersed in lithium perchlorate (LiClO4)-succinonitrile (SN) was chosen as the model system for the study (abbreviated LiClO4-SN:PAN). Systematic observation of various mid-infrared isomer and ion-association bands as a function of temperature and polymer concentration shows an effective increase in trans conformer concentration along with free Li+ ion concentration. This strongly supports the view that the enhancement in LiClO4-SN:PAN ionic conductivity over the neat plastic electrolyte (LiClO4-SN) is due to increases in both charge mobility and charge concentration. The ionic conductivity and infrared spectroscopy studies are supported by Brillouin light scattering. For the LiClO4-SN:PAN composites, a peak at 17 GHz was observed in addition to the normal trans-gauche isomerism peak (as in neat SN) at 12 GHz. The fast process is attributed to increased dynamics of those SN molecules whose energy barrier for the gauche-to-trans transition is reduced by changes in temperature and polymer concentration. The observations from the ionic conductivity, spectroscopy, and light scattering studies were further supplemented by temperature-dependent 1H and 7Li nuclear magnetic resonance line-width measurements.
Abstract:
The problem is solved using the Love function and Flügge shell theory. Numerical work has been done with a computer for various values of shell geometry parameters and elastic constants.
Abstract:
We present a method to perform in situ microrheological measurements on monolayers of soft materials undergoing viscoelastic transitions under compression. Using a Langmuir trough mounted on the inverted microscope stage of a laser scanning confocal microscope, we track the motion of individual fluorescent quantum dots partly dispersed in monolayers spread at the air-water interface. From the calculated mean square displacement of the probe particles, and by extending a well-established scheme of the generalized Stokes-Einstein relation in bulk to the interface, we arrive at the viscoelastic modulus of the respective monolayers as a function of surface density. Measurements on monolayers of glassy as well as nonglassy polymers and a standard fatty acid clearly show the sensitivity of our technique to subtle variations in the viscoelastic properties of the highly confined materials under compression. Evidence for possible spatial variations of such viscoelastic properties at a given surface density for the fatty acid monolayer is also provided.
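The first step of the microrheology pipeline described above, computing the mean square displacement (MSD) from a particle track, can be sketched as follows. This is a generic time-averaged MSD on a synthetic random walk, not the paper's data or its generalized Stokes-Einstein inversion step (which additionally requires a frequency-domain analysis of the MSD):

```python
import numpy as np

def mean_square_displacement(positions: np.ndarray, lag: int) -> float:
    """Time-averaged MSD of one trajectory at a given lag.
    positions: (T, d) array of probe coordinates at equal time intervals."""
    disp = positions[lag:] - positions[:-lag]     # displacements over `lag` steps
    return float(np.mean(np.sum(disp**2, axis=1)))

# Synthetic 2D random walk as a stand-in for a quantum-dot track
rng = np.random.default_rng(0)
track = np.cumsum(rng.normal(0.0, 1.0, size=(10_000, 2)), axis=0)
msd10 = mean_square_displacement(track, lag=10)
print(msd10)  # for unit-variance steps, expect roughly lag * dims = 20
```

In the actual method, the MSD of the quantum-dot probes as a function of lag time is what feeds the interfacial generalized Stokes-Einstein relation to yield the viscoelastic modulus.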
Abstract:
The move towards IT outsourcing is the first step towards an environment where compute infrastructure is treated as a service. In utility computing, this IT service has to honor Service Level Agreements (SLAs) in order to meet the desired Quality of Service (QoS) guarantees. Such an environment requires reliable services in order to maximize the utilization of the resources and to decrease the Total Cost of Ownership (TCO). Such reliability cannot come at the cost of resource duplication, since duplication increases the TCO of the data center and hence the cost per compute unit. In this paper, we look into aspects of projecting the impact of hardware failures on the SLAs and the techniques required to take proactive recovery steps in case of a predicted failure. By maintaining health vectors of all hardware and system resources, we predict the failure probability of resources at runtime, based on observed hardware errors and failure events. This in turn influences an availability-aware middleware to take proactive action, even before the application is affected, in cases where the system and the application have low recoverability. The proposed framework has been prototyped on a system running HP-UX. Our offline analysis of the prediction system on hardware error logs indicates no more than 10% false positives. This work, to the best of our knowledge, is the first of its kind to perform an end-to-end analysis of the impact of a hardware fault on application SLAs in a live system.
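The paper's health-vector model and thresholds are not given in this abstract; the sketch below is a hypothetical minimal version of the idea, where a resource's failure probability is estimated from a sliding window of observed hardware error events and a proactive action is triggered past a threshold (all names and the 0.2 threshold are illustrative assumptions):

```python
from collections import deque

class FailurePredictor:
    """Hypothetical sketch of runtime failure prediction from a health
    vector: a sliding window of per-interval hardware error observations."""

    def __init__(self, window: int = 100, threshold: float = 0.2):
        self.events = deque(maxlen=window)  # 1 = error seen in interval, 0 = clean
        self.threshold = threshold          # illustrative trigger level

    def observe(self, error_seen: bool) -> None:
        self.events.append(1 if error_seen else 0)

    def failure_probability(self) -> float:
        return sum(self.events) / len(self.events) if self.events else 0.0

    def should_act(self) -> bool:
        # Proactive step: e.g. signal an availability-aware middleware
        # to migrate work before the application is affected.
        return self.failure_probability() > self.threshold

p = FailurePredictor(window=10, threshold=0.2)
for e in [0, 0, 1, 0, 1, 1, 0, 0, 1, 0]:
    p.observe(bool(e))
print(p.failure_probability(), p.should_act())  # 0.4 True
```

A real predictor would weight error types and recency and combine multiple resource health vectors, but the trigger structure is the same.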
Abstract:
Conformance testing focuses on checking whether an implementation under test (IUT) behaves according to its specification. Typically, testers are interested in performing targeted tests that exercise certain features of the IUT. This intention is formalized as a test purpose. The tester needs a "strategy" to reach the goal specified by the test purpose. Also, for a particular test case, the strategy should tell the tester whether the IUT has passed, failed, or deviated from the test purpose. In [8], Jeron and Morel show how to compute, for a given finite state machine specification and a test purpose automaton, a complete test graph (CTG), which represents all test strategies. In this paper, we consider the case when the specification is a hierarchical state machine and show how to compute a hierarchical CTG which preserves the hierarchical structure of the specification. We also propose an algorithm for an online test oracle which avoids the space overhead associated with the CTG.
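At the core of CTG construction is a synchronous product of the specification FSM with the test-purpose automaton: product states whose purpose component is accepting mark where the test purpose has been reached. A minimal flat (non-hierarchical) sketch of that product, with toy hand-made machines rather than anything from the paper:

```python
def synchronous_product(spec, purpose, accept, start=("s0", "p0")):
    """Explore the product of a spec FSM and a test-purpose automaton.
    Both machines are given as {(state, action): next_state} dicts; a
    product move exists only when both components accept the action.
    Returns (reachable product states, edges, accepting product states)."""
    seen, frontier, edges = {start}, [start], []
    while frontier:
        s, p = frontier.pop()
        for (src, action), dst in spec.items():
            if src != s or (p, action) not in purpose:
                continue  # action not possible in both machines
            nxt = (dst, purpose[(p, action)])
            edges.append(((s, p), action, nxt))
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    passing = {q for q in seen if q[1] in accept}
    return seen, edges, passing

# Toy spec and test purpose (illustrative only)
spec = {("s0", "a"): "s1", ("s1", "b"): "s0", ("s1", "c"): "s2"}
purpose = {("p0", "a"): "p1", ("p1", "b"): "p0", ("p1", "c"): "p_ok"}
states, edges, passing = synchronous_product(spec, purpose, accept={"p_ok"})
print(passing)  # {('s2', 'p_ok')}
```

The hierarchical construction in the paper additionally keeps the state-machine nesting intact instead of flattening it as done here, and the online oracle avoids materializing the graph at all.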
Abstract:
It is a policy of Solid State Communications' Executive Editorial Board to organize special issues from time to time on topics of current interest. The present issue focuses on soft condensed matter, a rapidly developing and diverse area of importance not only for basic science, but also for its potential applications. The ten articles in this issue are intended to give the readers a snapshot of some of the latest developments in soft condensed matter, mainly from the point of view of basic science. As the special issues are intended for a broad audience, most articles are short reviews that introduce the readers to the relevant topics. Hence this special issue can be especially helpful to readers who might not be specialists in this area but would like a quick grasp of some of the interesting research directions.
Abstract:
An important issue in the design of a distributed computing system (DCS) is the development of a suitable protocol. This paper presents an effort to systematize the protocol design procedure for a DCS. Protocol design and development can be divided into six phases: specification of the DCS, specification of protocol requirements, protocol design, specification and validation of the designed protocol, performance evaluation, and hardware/software implementation. This paper describes techniques for the second and third phases, while the first phase has been considered by the authors in their earlier work. Matrix- and set-theoretic approaches are used for the specification of a DCS and of the protocol requirements. These two formal specification techniques form the basis of a simple and straightforward procedure for the design of the protocol. The applicability of this design procedure is illustrated with an example of a computing system encountered on board a spacecraft. A Petri-net-based approach has been adopted to model the protocol. The methodology developed in this paper can be used in other DCS applications.
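The Petri-net modeling mentioned above rests on simple token-game semantics: a transition is enabled when its input places hold enough tokens, and firing it moves tokens from input to output places. A toy sketch of that mechanism (the places and the "send" transition are illustrative, not the paper's spacecraft protocol model):

```python
def enabled(marking, pre):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking.get(place, 0) >= n for place, n in pre.items())

def fire(marking, pre, post):
    """Fire one transition: consume tokens from input places,
    produce tokens in output places. Returns the new marking."""
    m = dict(marking)
    for place, n in pre.items():
        m[place] -= n
    for place, n in post.items():
        m[place] = m.get(place, 0) + n
    return m

# Toy protocol step: a ready sender deposits a message into a channel place
marking = {"sender_ready": 1, "channel": 0, "receiver_ready": 1}
send = ({"sender_ready": 1}, {"channel": 1})  # (pre-set, post-set)
if enabled(marking, send[0]):
    marking = fire(marking, *send)
print(marking)  # {'sender_ready': 0, 'channel': 1, 'receiver_ready': 1}
```

Validation of a designed protocol then amounts to exploring the markings reachable through such firings and checking for deadlocks or unbounded places.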