68 results for PROBABILITY REPRESENTATION
Abstract:
This study views each protein structure as a network of noncovalent connections between amino acid side chains. Each amino acid in a protein structure is a node, and the strength of the noncovalent interactions between two amino acids is evaluated for edge determination. The protein structure graphs (PSGs) for 232 proteins have been constructed as a function of the cutoff of the amino acid interaction strength at a few carefully chosen values. Analysis of such PSGs constructed on the basis of edge weights has shown the following: (1) the PSGs exhibit complex topological network behavior, which depends on the interaction cutoff chosen for PSG construction; (2) a transition is observed at a critical interaction cutoff, in all the proteins, as monitored by the size of the largest cluster (giant component) in the graph. Amazingly, this transition occurs within a narrow range of interaction cutoffs for all the proteins, irrespective of size or fold topology; and (3) the amino acid preferences for high connectivity (hub frequency) have been evaluated as a function of the interaction cutoff. We observe that the aromatic residues, along with arginine, histidine, and methionine, act as strong hubs at high interaction cutoffs, whereas the hydrophobic leucine and isoleucine residues join these hubs at low interaction cutoffs, forming weak hubs. The hubs identified are found to play a role in bringing together different secondary structural elements in the tertiary structure of the proteins. They are also found to contribute to the additional stability of thermophilic proteins when compared with their mesophilic counterparts and hence could be crucial for the folding and stability of the unique three-dimensional structure of proteins. Based on these results, we also predict a few residues in the thermophilic and mesophilic proteins that can be mutated to alter their thermal stability.
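As an illustration of the graph construction described above, the sketch below builds a PSG at one interaction cutoff and reports the giant-component size. The precomputed pairwise strength matrix, its normalization, and the cutoff grid are placeholders standing in for the paper's exact procedure.

```python
# Sketch: build a protein structure graph (PSG) at a given interaction
# cutoff and report the size of the largest cluster (giant component).
# Assumes `strength[i, j]` holds a precomputed pairwise interaction
# strength for residues i and j; the paper's normalization of that
# strength is not reproduced here.
import networkx as nx
import numpy as np

def psg_giant_component(strength: np.ndarray, cutoff: float) -> int:
    n = strength.shape[0]
    g = nx.Graph()
    g.add_nodes_from(range(n))          # one node per residue
    i, j = np.triu_indices(n, k=1)
    mask = strength[i, j] >= cutoff     # keep edges at or above the cutoff
    g.add_edges_from(zip(i[mask], j[mask]))
    return max(len(c) for c in nx.connected_components(g))

# Scanning a range of cutoffs exposes the transition in cluster size:
# sizes = [psg_giant_component(S, c) for c in np.linspace(0.5, 10.0, 20)]
```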
Abstract:
A set of sufficient conditions to construct λ-real-symbol Maximum Likelihood (ML) decodable STBCs was recently provided by Karmakar et al. STBCs satisfying these sufficient conditions were named Clifford Unitary Weight (CUW) codes. In this paper, the maximal rate (measured in complex symbols per channel use) of CUW codes for λ = 2^a, a ∈ ℕ, is obtained using tools from representation theory. Two algebraic constructions of codes achieving this maximal rate are also provided. One of the constructions is obtained using linear representations of finite groups, whereas the other is based on the concept of a right module algebra over non-commutative rings. To the knowledge of the authors, this is the first paper in which matrices over non-commutative rings are used to construct STBCs. An algebraic explanation is provided for the 'ABBA' construction first proposed by Tirkkonen et al. and the tensor product construction proposed by Karmakar et al. Furthermore, it is established that the four-transmit-antenna STBC originally proposed by Tirkkonen et al. based on the ABBA construction is actually a single-complex-symbol ML decodable code if the design variables are permuted and signal sets of appropriate dimensions are chosen.
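For reference, the 'ABBA' construction referred to above is usually written as two stacked Alamouti blocks; the sketch below is the textbook four-antenna layout, not necessarily the exact design variables used in the paper.

```latex
X(x_1,\dots,x_4) =
\begin{pmatrix} A & B \\ B & A \end{pmatrix},
\qquad
A = \begin{pmatrix} x_1 & x_2 \\ -x_2^* & x_1^* \end{pmatrix},
\quad
B = \begin{pmatrix} x_3 & x_4 \\ -x_4^* & x_3^* \end{pmatrix}.
```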
Abstract:
The probability that a random process crosses an arbitrary level for the first time is expressed as a Gram-Charlier series, the leading term of which is the Poisson approximation. The coefficients of this series are related to the moments of the number of level crossings. The results are applicable to both stationary and non-stationary processes. Some numerical results are presented for the response process of a linear single-degree-of-freedom oscillator under Gaussian white noise excitation.
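To make the leading term concrete: for crossings of a level b, the Poisson approximation to the survival (no-crossing) probability, combined with Rice's formula for the mean up-crossing rate of a stationary zero-mean Gaussian process, reads as follows. This is the standard form, stated here as context rather than in the paper's notation.

```latex
P_0(T) \approx \exp\!\left(-\int_0^T \nu_b^{+}(t)\,dt\right),
\qquad
\nu_b^{+} = \frac{1}{2\pi}\,\frac{\sigma_{\dot{x}}}{\sigma_x}
\exp\!\left(-\frac{b^2}{2\sigma_x^2}\right).
```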
Abstract:
Hamilton’s theory of turns for the group SU(2) is exploited to develop a new geometrical representation for polarization optics. While pure polarization states are represented by points on the Poincaré sphere, linear intensity preserving optical systems are represented by great circle arcs on another sphere. Composition of systems, and their action on polarization states, are both reduced to geometrical operations. Several synthesis problems, especially in relation to the Pancharatnam-Berry-Aharonov-Anandan geometrical phase, are clarified with the new representation. The general relation between the geometrical phase, and the solid angle on the Poincaré sphere, is established.
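The closing claim can be stated compactly: for a closed circuit of polarization states on the Poincaré sphere, the geometric phase is proportional to the enclosed solid angle Ω. In the usual convention (sign conventions vary across the literature),

```latex
\phi_g = -\tfrac{1}{2}\,\Omega .
```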
Abstract:
With the objective of better understanding the significance of New Car Assessment Program (NCAP) tests conducted by the National Highway Traffic Safety Administration (NHTSA), head-on collisions between two identical cars of different sizes and between cars and a pickup truck are studied in the present paper using LS-DYNA models. Available finite element models of a compact car (Dodge Neon), midsize car (Dodge Intrepid), and pickup truck (Chevrolet C1500) are first improved and validated by comparing the analysis-based vehicle deceleration pulses against corresponding NCAP crash test histories reported by NHTSA. In confirmation of prevalent perception, simulation-based results indicate that an NCAP test against a rigid barrier is a good representation of a collision between two similar cars approaching each other at a speed of 56.3 km/h (35 mph), both in terms of peak deceleration and intrusions. However, analyses carried out for collisions between two incompatible vehicles, such as an Intrepid or Neon against a C1500, point to the inability of the NCAP tests to represent the substantially higher intrusions in the front upper regions experienced by the cars, although peak decelerations in the cars are comparable to those observed in NCAP tests. In an attempt to improve the capability of a front NCAP test to better represent real-world crashes between incompatible vehicles, i.e., ones with contrasting ride height and lower body stiffness, two modified rigid barriers are studied. One of these barriers, which has a stepped geometry with a curved front face, leads to significantly improved correlation of intrusions in the upper regions of cars with respect to those yielded in the simulation of collisions between incompatible vehicles, while producing vehicle peak decelerations similar to those obtained in NCAP tests.
Abstract:
We report numerical and analytic results for the spatial survival probability for fluctuating one-dimensional interfaces with Edwards-Wilkinson or Kardar-Parisi-Zhang dynamics in the steady state. Our numerical results are obtained from analysis of steady-state profiles generated by integrating a spatially discretized form of the Edwards-Wilkinson equation to long times. We show that the survival probability exhibits scaling behavior in its dependence on the system size and the "sampling interval" used in the measurement for both "steady-state" and "finite" initial conditions. Analytic results for the scaling functions are obtained from a path-integral treatment of a formulation of the problem in terms of one-dimensional Brownian motion. A "deterministic approximation" is used to obtain closed-form expressions for survival probabilities from the formally exact analytic treatment. The resulting approximate analytic results provide a fairly good description of the numerical data.
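A minimal sketch of the numerical procedure described above, assuming a simple Euler-Maruyama scheme for the spatially discretized Edwards-Wilkinson equation with periodic boundaries; the parameter values and the survival-probability measurement protocol are illustrative, not those of the paper.

```python
# Sketch: Euler-Maruyama integration of a spatially discretized
# Edwards-Wilkinson equation, dh/dt = nu * d2h/dx2 + noise, on a ring.
import numpy as np

def evolve_ew(L=256, nu=1.0, D=1.0, dt=0.01, steps=100_000, seed=0):
    rng = np.random.default_rng(seed)
    h = np.zeros(L)                                  # flat initial condition
    for _ in range(steps):
        lap = np.roll(h, 1) - 2.0 * h + np.roll(h, -1)   # discrete Laplacian
        h += dt * nu * lap + np.sqrt(2.0 * D * dt) * rng.standard_normal(L)
    return h - h.mean()                              # profile about its mean

# The spatial survival probability is then estimated as the fraction of
# reference points x0 for which h keeps its sign over a window of size r.
```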
Abstract:
The probability distribution of the eigenvalues of a second-order stochastic boundary value problem is considered. The solution is characterized in terms of the zeros of an associated initial value problem. It is further shown that the probability distribution is related to the solution of a first-order nonlinear stochastic differential equation. Solutions of this equation based on the theory of Markov processes and also on the closure approximation are presented. A string with stochastic mass distribution is considered as an example for numerical work. The theoretical probability distribution functions are compared with digital simulation results. The comparison is found to be reasonably good.
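One standard route to such a first-order equation, offered here as context and not necessarily the paper's exact formulation, is the Prüfer transformation: writing u = r sin θ and u' = r cos θ for the string equation u'' + λ m(x) u = 0 yields a first-order equation for the phase alone, whose value at the right end locates the eigenvalues (i.e., the zeros of the associated initial value problem).

```latex
\theta' = \cos^2\theta + \lambda\, m(x)\,\sin^2\theta,
\qquad
\theta(0) = 0,
\qquad
\theta(1;\lambda_n) = n\pi .
```

When m(x) is stochastic, this phase equation is a first-order nonlinear stochastic differential equation, and P(λ_n ≤ λ) follows from the distribution of θ(1; λ).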
Abstract:
The different formalisms for the representation of thermodynamic data on dilute multicomponent solutions are critically reviewed. The thermodynamic consistency of the formalisms is examined and the interrelations between them are highlighted. The options and constraints in the use of the interaction parameter and Darken's quadratic formalisms for multicomponent solutions are discussed in the light of the available experimental data. A truncated Maclaurin series expansion is thermodynamically inconsistent unless special relations between interaction parameters are invoked. However, the lack of strict mathematical consistency does not affect the practical use of the formalism. Expressions for excess partial properties can be integrated along defined composition paths without significant loss of accuracy. Although thermodynamically consistent, the applicability of Darken's quadratic formalism to strongly interacting systems remains to be established by experiment.
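For orientation, the interaction parameter formalism discussed above is, to first order, Wagner's dilute-solution expansion for the activity coefficient of solute i; the first-order form is shown below, while the review also treats higher-order terms and Darken's quadratic formalism.

```latex
\ln \gamma_i = \ln \gamma_i^{\circ} + \sum_{j} \varepsilon_i^{\,j}\, x_j + \cdots
```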
Abstract:
In this paper, a novel genetic algorithm is developed that generates artificial chromosomes with probability control to solve machine scheduling problems. The genetic algorithm with artificial chromosomes (ACGA) is closely related to Evolutionary Algorithms Based on Probabilistic Models (EAPM). The artificial chromosomes are generated by a probability model that extracts gene information from the current population. ACGA is considered a hybrid algorithm because both conventional genetic operators and a probability model are integrated. The ACGA proposed in this paper further employs the "evaporation concept" applied in Ant Colony Optimization (ACO) to solve the permutation flowshop problem. The "evaporation concept" is used to reduce the effect of past experience and to explore new alternative solutions. In this paper, we propose three different methods for the probability of evaporation. This probability of evaporation is applied as soon as a job is assigned to a position in the permutation flowshop problem. Experimental results show that our ACGA with the evaporation concept performs better than several algorithms reported in the literature.
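A minimal sketch, assuming one plausible reading of the procedure: a position-by-job frequency model is evaporated, refreshed from the current population, and sampled position by position to produce an artificial chromosome. The function names, the single evaporation rate rho, and the uniform fallback are illustrative; the paper's three evaporation methods are not reproduced.

```python
# Sketch: generate one artificial chromosome for a permutation flowshop
# from a position-by-job frequency model with simple evaporation.
import numpy as np

def artificial_chromosome(population: np.ndarray, counts: np.ndarray,
                          rho: float, rng: np.random.Generator) -> np.ndarray:
    n_jobs = population.shape[1]
    counts *= (1.0 - rho)                      # evaporate past information
    for chrom in population:                   # accumulate fresh frequencies
        counts[np.arange(n_jobs), chrom] += 1.0
    child = np.empty(n_jobs, dtype=int)
    free = np.ones(n_jobs, dtype=bool)         # jobs not yet assigned
    for pos in range(n_jobs):
        p = counts[pos] * free                 # mask out assigned jobs
        p = p / p.sum() if p.sum() > 0 else free / free.sum()
        job = rng.choice(n_jobs, p=p)
        child[pos] = job
        free[job] = False
    return child
```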
Abstract:
A systematic structure analysis of the correlation functions of statistical quantum optics is carried out. From a suitably defined auxiliary two-point function we are able to identify the excited modes in the wave field. The relative simplicity of the higher-order correlation functions emerges as a byproduct, and the conditions under which these are made pure are derived. These results depend in a crucial manner on the notion of coherence indices and of unimodular coherence indices. A new class of approximate expressions for the density operator of a statistical wave field is worked out based on discrete characteristic sets. These are even more economical than the diagonal coherent state representations. An appreciation of the subtleties of quantum theory is obtained. Certain implications for the physics of light beams are cited.
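The diagonal coherent state representation mentioned in closing is the familiar Glauber-Sudarshan form, in which the density operator is expanded over coherent-state projectors:

```latex
\hat{\rho} = \int P(\alpha)\, |\alpha\rangle\langle\alpha|\, d^{2}\alpha .
```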
Abstract:
Representation and quantification of uncertainty in climate change impact studies are a difficult task. Several sources of uncertainty arise in studies of hydrologic impacts of climate change, such as those due to choice of general circulation models (GCMs), scenarios and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated, which uses a generalized uncertainty measure to combine GCM, scenario and downscaling uncertainties. The Dempster-Shafer (D-S) evidence theory is used for representing and combining uncertainty from various sources. A significant advantage of the D-S framework over the traditional probabilistic approach is that it allows for the allocation of a probability mass to sets or intervals, and can hence handle both aleatory or stochastic uncertainty, and epistemic or subjective uncertainty. This paper shows how the D-S theory can be used to represent beliefs in some hypotheses such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The D-S approach has been used in this work for information synthesis using various evidence combination rules having different conflict modeling approaches. A case study is presented for hydrologic drought prediction using downscaled streamflow in the Mahanadi River at Hirakud in Orissa, India. Projections of n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, which are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster-Shafer structure, which represents uncertainty associated with each of the SSFI-4 classifications. These uncertainties are then combined across GCMs and scenarios using various evidence combination rules given by the D-S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in projected frequencies of SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification, using an ensemble of GCMs and scenarios. Results from the D-S and Bayesian approaches are compared, and relative merits of each approach are discussed. Both approaches show an increasing probability of extreme, severe and moderate droughts and decreasing probability of normal and wet conditions in Orissa as a result of climate change.
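As a concrete anchor for the combination step, here is a small sketch of the classical Dempster rule for two basic probability assignments over a frame of discernment. The hypothesis names and masses in the usage example are invented for illustration, and the paper's alternative combination rules with different conflict handling are not reproduced.

```python
# Sketch: Dempster's rule of combination for two bpas, each a mapping
# from frozenset hypotheses to masses summing to 1. Assumes the two
# bpas are not totally conflicting (k > 0).
from itertools import product

def dempster_combine(m1: dict, m2: dict) -> dict:
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb                # mass assigned to conflict
    k = 1.0 - conflict                         # normalization constant
    return {s: w / k for s, w in combined.items()}

# Illustrative usage with two drought-state hypotheses:
# m1 = {frozenset({"drought"}): 0.6, frozenset({"drought", "wet"}): 0.4}
# m2 = {frozenset({"drought"}): 0.5, frozenset({"wet"}): 0.3,
#       frozenset({"drought", "wet"}): 0.2}
# print(dempster_combine(m1, m2))
```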
Abstract:
We present a biquadratic Lagrangian plate bending element with consistent fields for the constrained transverse shear strain functions. A technique involving expansion of the strain interpolations in terms of Legendre polynomials is used to redistribute the kinematically derived shear strain fields so that the field-consistent forms (i.e. avoiding locking) are also variationally correct (i.e. do not violate the variational norms). Also, a rational method of isoparametric Jacobian transformation is incorporated so that the constrained covariant shear strain fields are always consistent in whatever general quadrilateral form the element may take. Finally, the element is compared with another formulation that was recently published. The element is subjected to several robust benchmark tests and is found to pass all the tests efficiently.
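The Legendre redistribution described above amounts, in essence, to an orthogonal projection of the kinematically derived strain onto a lower-order consistent subspace; a generic sketch over the parent coordinate ξ ∈ [−1, 1] is given below, where the consistent order m depends on the element and is an assumption here.

```latex
\bar{\gamma}(\xi) = \sum_{k=0}^{m} a_k\, P_k(\xi),
\qquad
a_k = \frac{2k+1}{2}\int_{-1}^{1} \gamma(\xi)\, P_k(\xi)\, d\xi .
```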
Abstract:
Computer vision has seen a resurgence of parts-based representations for objects over the past few years. The parts are usually annotated beforehand for training. We present an annotation-free parts-based representation for pedestrians using Non-Negative Matrix Factorization (NMF). We show that NMF is able to capture the wide range of pose and clothing of pedestrians. We use a modified form of NMF, i.e., NMF with sparsity constraints on the factored matrices. We also make use of a Riemannian distance metric for similarity measurements in NMF space, as the basis vectors generated by NMF are not orthogonal. We show that, for a 1% drop in accuracy compared to the Histogram of Oriented Gradients (HOG) representation, we can achieve robustness to partial occlusion.
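A minimal sketch of the underlying factorization, using the standard Lee-Seung multiplicative updates; the paper's sparsity constraints and Riemannian similarity metric are deliberately omitted, so this only illustrates the parts-based decomposition itself.

```python
# Sketch: plain multiplicative-update NMF factoring a non-negative data
# matrix V (columns = vectorized pedestrian windows) as V ~ W @ H,
# where the columns of W act as parts-like basis images.
import numpy as np

def nmf(V: np.ndarray, r: int, iters: int = 200, eps: float = 1e-9,
        seed: int = 0):
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, r)) + eps        # basis images ("parts")
    H = rng.random((r, m)) + eps        # per-sample encodings
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update encodings
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update basis
    return W, H
```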