906 results for Pseudorandom permutation ensemble


Relevance: 10.00%

Abstract:

[EN] In this paper we deal with probability distributions over permutation spaces. The Mallows model is the model in use. The associated distance for permutations is the Hamming distance.

Relevance: 10.00%

Abstract:

[EN] In this paper we deal with probability distributions over permutation spaces. The probability model in use is the Mallows model. The distance for permutations that the model uses is the Ulam distance.

Relevance: 10.00%

Abstract:

Building on earlier work by Bai Yilong's group, and using the simplified coupled pattern model they proposed, this thesis further discusses several possible stress-redistribution models. Through the different stress-redistribution models we found three general laws of this class of evolution processes, which provide clues for forecasting similar dynamical processes (for example, the failure of heterogeneous media). First, using ensemble statistics, we examined the strength distribution of a large number of samples with identical macroscopic parameters. The results show that the statistics of macroscopic strength can be fitted very well by a Weibull distribution, and that the Weibull modulus depends on the system size, the stress-redistribution scheme, and the strength distribution of the mesoscopic units. Second, statistics of the energy release and damage events during the simulated evolution show that they exhibit scaling behavior, and that this scaling is mainly attributable to damage events near the catastrophe point; this indicates that the transition has certain critical characteristics. Finally, for the dynamics of our model we found clues for catastrophe forecasting. From the pattern of energy release during sample evolution, two things are meaningful: identifying the point at which the main rupture occurs (at this moment most of the energy in the system is released), and forecasting the transition point from global stability (GS) to evolution-induced catastrophe (EIC). The former can be addressed by examining the stress and damage fluctuations of the system in the GS and EIC stages: the maximum stress fluctuation in the EIC stage (which usually appears during the main rupture) is typically an order of magnitude larger than that in the GS stage, so a warning threshold on the stress fluctuation can be set to judge the evolutionary state of the system. For the latter, inspired by the load-unload response ratio (LURR) used in earthquake forecasting, we apply a small incremental perturbation to the applied stress or to damaged units in the system; from the energy released by the system before and after the perturbation and the corresponding perturbation, a critical sensitivity coefficient is obtained. This coefficient increases rapidly near the catastrophe point and drops rapidly back to about 1 after it, a feature we call critical sensitivity. Similar phenomena were obtained under different stress-redistribution models, so critical sensitivity appears to be a general feature of this class of dynamical processes, and may provide clues for forecasting the failure of heterogeneous brittle media.

Relevance: 10.00%

Abstract:

Recently, probability models on rankings have been proposed in the field of estimation of distribution algorithms (EDAs) in order to solve permutation-based combinatorial optimisation problems. In particular, distance-based ranking models, such as the Mallows and Generalized Mallows models under the Kendall's tau distance, have demonstrated their validity when solving this type of problem. Nevertheless, there are still many directions that deserve further study. In this paper, we extend the use of distance-based ranking models in the framework of EDAs by introducing new distance metrics, namely Cayley and Ulam. In order to analyse the performance of the Mallows and Generalized Mallows EDAs under the Kendall, Cayley and Ulam distances, we run them on a benchmark of 120 instances from four well-known permutation problems. The conducted experiments showed that no single metric performs best on all the problems. However, the statistical tests pointed out that the Mallows-Ulam EDA is the most stable algorithm among the studied proposals.
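The three metrics compared above have simple operational definitions: Kendall counts pairwise discordances, Cayley counts minimum transpositions, and Ulam is n minus the longest increasing subsequence of the relative permutation. A minimal Python sketch (illustrative only, not the EDA implementation from the paper; function names are our own):

```python
import bisect
from itertools import combinations


def kendall(p, q):
    """Kendall distance: number of discordant pairs (adjacent-swap moves)."""
    pos = {v: i for i, v in enumerate(q)}
    r = [pos[v] for v in p]  # p expressed in q's coordinate system
    return sum(1 for i, j in combinations(range(len(r)), 2) if r[i] > r[j])


def _relative(p, q):
    """sigma = p o q^-1, so each distance is measured against the identity."""
    qinv = [0] * len(q)
    for i, v in enumerate(q):
        qinv[v] = i
    return [p[qinv[i]] for i in range(len(p))]


def cayley(p, q):
    """Cayley distance: minimum transpositions = n minus number of cycles."""
    sigma = _relative(p, q)
    seen, cycles = [False] * len(sigma), 0
    for i in range(len(sigma)):
        if not seen[i]:
            cycles += 1
            j = i
            while not seen[j]:
                seen[j] = True
                j = sigma[j]
    return len(sigma) - cycles


def ulam(p, q):
    """Ulam distance: n minus the longest increasing subsequence of sigma."""
    sigma = _relative(p, q)
    tails = []  # patience-sorting LIS in O(n log n)
    for v in sigma:
        k = bisect.bisect_left(tails, v)
        if k == len(tails):
            tails.append(v)
        else:
            tails[k] = v
    return len(sigma) - len(tails)
```

For example, swapping the first two items of the identity gives distance 1 under all three metrics, while they diverge on longer permutations, which is what makes the choice of metric matter for the Mallows model.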

Relevance: 10.00%

Abstract:

We study the statistical properties of the image speckles produced by strong-scattering objects in the 4f optical imaging system. Using the generic expression of the complex amplitude of speckles and the approximation of the double-exponential function, we first obtain the ensemble average of the speckle intensity. Then we derive the variance of the speckle intensity based on the rotational transformation of the real and imaginary parts of the complex amplitude of speckles. We finally obtain the expression for the contrast of the speckles, which is explicitly related to the statistical parameters of the random surface and to the parameters of the imaging system. Our results are an obvious improvement over those reported in the literature, where relations involving such implicit quantities as the average size of the scattering grains of the random surface and the number of scattering grains are usually used. The results of this paper should be helpful for the characterization of random surfaces by speckle contrast.
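Operationally, the speckle contrast derived above is just the normalized standard deviation of intensity. A small sketch, under the standard assumption (not specific to this paper) that fully developed speckle has negative-exponential intensity statistics, for which the contrast approaches unity:

```python
import numpy as np


def speckle_contrast(intensity):
    """Speckle contrast C = std(I) / mean(I) of an intensity pattern."""
    i = np.asarray(intensity, dtype=float)
    return float(i.std() / i.mean())


# Fully developed speckle: intensity is negative-exponentially distributed,
# so the estimated contrast should be close to 1.
rng = np.random.default_rng(0)
c = speckle_contrast(rng.exponential(scale=1.0, size=200_000))
```

Departures of the measured contrast from this fully developed limit are what carry the information about the random surface that the paper's explicit expression makes quantitative.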

Relevance: 10.00%

Abstract:

The presented doctoral research utilizes time-resolved spectroscopy to characterize protein dynamics and folding mechanisms. We resolve millisecond-timescale folding by coupling time-resolved fluorescence energy transfer (trFRET) to a continuous flow microfluidic mixer to obtain intramolecular distance distributions throughout the folding process. We have elucidated the folding mechanisms of two cytochromes---one that exhibits two-state folding (cytochrome cb562) and one that has both a kinetic refolding intermediate ensemble and a distinct equilibrium unfolding intermediate (cytochrome c552). Our data reveal that the distinct structural features of cytochrome c552 contribute to its thermostability.

We have also investigated intrachain contact dynamics in unfolded cytochrome cb562 by monitoring electron transfer, which occurs as the heme collides with a ruthenium photosensitizer covalently bound to residues along the polypeptide. Intrachain diffusion for chemically denatured proteins proceeds on the microsecond timescale with an upper limit of 0.1 microseconds. The power-law dependence (slope = -1.5) of the rate constants on the number of peptide bonds between the heme and the Ru complex indicates that cytochrome cb562 is minimally frustrated.

In addition, we have explored the pathway dependence of electron tunneling rates between metal sites in proteins. Our research group has converted cytochrome b562 to a c-type cytochrome with the porphyrin covalently bound to cysteine sidechains. We have investigated the effects of the changes to the protein structure (i.e., increased rigidity and potential new equatorial tunneling pathways) on the electron transfer rates, measured by transient absorption, in a series of ruthenium photosensitizer-modified proteins.

Relevance: 10.00%

Abstract:

Storage systems are widely used and have played a crucial role in both consumer and industrial products, for example, personal computers, data centers, and embedded systems. However, such systems suffer from issues of cost, restricted lifetime, and reliability with the emergence of new systems and devices, such as distributed storage and flash memory, respectively. Information theory, on the other hand, provides fundamental bounds and solutions to fully utilize resources such as data density, information I/O, and network bandwidth. This thesis bridges these two topics and proposes to solve challenges in data storage using a variety of coding techniques, so that storage becomes faster, more affordable, and more reliable.

We consider the system level and study the integration of RAID schemes and distributed storage. Erasure-correcting codes are the basis of the ubiquitous RAID schemes for storage systems, where disks correspond to symbols in the code and are located in a (distributed) network. Specifically, RAID schemes are based on MDS (maximum distance separable) array codes that enable optimal storage and efficient encoding and decoding algorithms. With r redundancy symbols an MDS code can sustain r erasures. For example, consider an MDS code that can correct two erasures. It is clear that when two symbols are erased, one needs to access and transmit all the remaining information to rebuild the erasures. However, an interesting and practical question is: What is the smallest fraction of information that one needs to access and transmit in order to correct a single erasure? In Part I we will show that the lower bound of 1/2 is achievable and that the result can be generalized to codes with arbitrary number of parities and optimal rebuilding.

We consider the device level and study coding and modulation techniques for emerging non-volatile memories such as flash memory. In particular, rank modulation is a novel data representation scheme proposed by Jiang et al. for multi-level flash memory cells, in which a set of n cells stores information in the permutation induced by the different charge levels of the individual cells. It eliminates the need for discrete cell levels, as well as overshoot errors, when programming cells. In order to decrease the decoding complexity, we propose two variations of this scheme in Part II: bounded rank modulation, where only small sliding windows of cells are sorted to generate permutations, and partial rank modulation, where only part of the n cells is used to represent data. We study limits on the capacity of bounded rank modulation and propose encoding and decoding algorithms. We show that overlaps between windows increase capacity. We present Gray codes spanning all possible partial-rank states and using only "push-to-the-top" operations. These Gray codes turn out to solve an open combinatorial problem called the universal cycle, which is a sequence of integers generating all possible partial permutations.
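A minimal sketch of the basic rank-modulation representation (illustrative; the function names are our own, and the bounded and partial variants described above add windowing on top of this):

```python
def induced_permutation(levels):
    """Permutation induced by cell charge levels: cell indices listed from
    highest-charged to lowest-charged cell."""
    return tuple(sorted(range(len(levels)), key=lambda i: -levels[i]))


def push_to_top(levels, cell):
    """'Push-to-the-top': raise one cell's charge above all others.
    This is the only programming primitive the Gray codes assume, which
    is why overshoot errors are avoided (no target level to overshoot)."""
    new = list(levels)
    new[cell] = max(levels) + 1
    return new


levels = [3.0, 1.0, 2.0, 4.0]        # analog charges in n = 4 cells
perm = induced_permutation(levels)    # data is read from this ranking
levels2 = push_to_top(levels, 1)      # reprogram: cell 1 moves to the top
perm2 = induced_permutation(levels2)
```

Only the relative order of the charges carries information, so drifting analog levels do not corrupt the stored permutation as long as the ranking is preserved.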

Relevance: 10.00%

Abstract:

1000 pp. (Appendices: pp. 929-965; bibliography: pp. 965-1000). Discussion and conclusions chapters in Spanish and French.

Relevance: 10.00%

Abstract:

Dynamic properties of proteins play crucial roles in understanding protein function and molecular mechanisms within cells. In this paper, we combined total internal reflection fluorescence microscopy with oblique illumination fluorescence microscopy to directly observe the movement and localization of membrane-anchored green fluorescent proteins in living cells. Total internal reflection illumination allowed the observation of proteins in the cell membrane of living cells, since the penetration depth could be adjusted to about 80 nm, and oblique illumination allowed the observation of proteins both in the cytoplasm and in the apical membrane, which makes this combination a promising tool to investigate the dynamics of proteins throughout the whole cell. Not only have individual protein-molecule tracks been analyzed quantitatively, but cumulative probability distribution function analysis of ensemble trajectories has also been performed to reveal the mobility of proteins. Finally, single-particle tracking has served as a complement to single-molecule tracking. All the results exhibit green fluorescent protein dynamics within the cytoplasm, on the membrane, and from the cytoplasm to the plasma membrane.

Relevance: 10.00%

Abstract:

Hypervelocity impact of meteoroids and orbital debris poses a serious and growing threat to spacecraft. To study hypervelocity impact phenomena, a comprehensive ensemble of real-time, concurrently operated diagnostics has been developed and implemented in the Small Particle Hypervelocity Impact Range (SPHIR) facility. This suite of simultaneously operated instrumentation provides multiple complementary measurements that facilitate the characterization of many impact phenomena in a single experiment. The investigation of hypervelocity impact phenomena described in this work focuses on normal impacts of 1.8 mm nylon 6/6 cylinder projectiles on variable-thickness aluminum targets. The SPHIR facility two-stage light-gas gun is capable of routinely launching 5.5 mg nylon impactors to speeds of 5 to 7 km/s. Refinement of legacy SPHIR operation procedures and the investigation of first-stage pressure have improved the velocity performance of the facility, resulting in an increase in average impact velocity of at least 0.57 km/s. Results for the perforation area indicate that the considered range of target thicknesses represents multiple regimes describing the non-monotonic scaling of target perforation with decreasing target thickness. The laser side-lighting (LSL) system has been developed to provide ultra-high-speed shadowgraph images of the impact event. This novel optical technique is demonstrated to characterize the propagation velocity and two-dimensional optical density of impact-generated debris clouds. Additionally, a debris capture system is located behind the target during every experiment to provide complementary information regarding the trajectory distribution and penetration depth of individual debris particles. The utilization of a coherent, collimated illumination source in the LSL system facilitates the simultaneous measurement of impact phenomena with near-IR and UV-vis spectrograph systems.
Comparison of LSL images to concurrent IR results indicates two distinctly different phenomena. A high-speed, pressure-dependent IR-emitting cloud is observed in experiments to expand at velocities much higher than those of the debris and ejecta phenomena observed using the LSL system. In double-plate target configurations, this phenomenon is observed to interact with the rear wall several microseconds before the subsequent arrival of the debris cloud. Additionally, the dimensional analysis presented by Whitham for blast waves is shown to describe the pressure-dependent radial expansion of the observed IR-emitting phenomena. Although this work focuses on a single hypervelocity impact configuration, the diagnostic capabilities and techniques described can be used with a wide variety of impactors, materials, and geometries to investigate any number of engineering and scientific problems.

Relevance: 10.00%

Abstract:

The pure Coulomb explosions of methane clusters (CA4)n (light atom A = H or D) have been investigated with a simplified electrostatic model, for both a single cluster and an ensemble of clusters with a given cluster-size distribution. The dependence of the energy of ions produced from the explosions on cluster size and on the charge state of the carbon ions has been analysed. It is found that, unlike the average proton energy, which increases with the charge q of the carbon ions, the average deuteron energy tends to saturate as q becomes larger than 4. This implies that when the laser intensity is sufficiently high for the (CD4)n to be ionized to a charge state of (C4+D4+)n, the neutron yield from a table-top laser-driven Coulomb explosion of deuterated methane clusters (CD4)n could be increased significantly by increasing the interaction volume rather than by increasing the laser intensity to produce the higher charge state (C6+D4+)n. The flight-time spectra of the carbon ions and the light ions have also been studied.

Relevance: 10.00%

Abstract:

This thesis describes the theoretical solution and experimental verification of phase conjugation via nondegenerate four-wave mixing in resonant media. The theoretical work models the resonant medium as a two-level atomic system with the lower state of the system being the ground state of the atom. Working initially with an ensemble of stationary atoms, the density matrix equations are solved by third-order perturbation theory in the presence of the four applied electromagnetic fields, which are assumed to be nearly resonant with the atomic transition. Two of the applied fields are assumed to be non-depleted counterpropagating pump waves, while the third wave is an incident signal wave. The fourth wave is the phase conjugate wave, which is generated by the interaction of the three previous waves with the nonlinear medium. The solution of the density matrix equations gives the local polarization of the atom. The polarization is used in Maxwell's equations as a source term to solve for the propagation and generation of the signal wave and phase conjugate wave through the nonlinear medium. Studying the dependence of the phase conjugate signal on various parameters such as frequency, we show how an ultrahigh-Q, isotropically sensitive optical filter can be constructed using the phase conjugation process.

In many cases the pump waves may saturate the resonant medium so we also present another solution to the density matrix equations which is correct to all orders in the amplitude of the pump waves since the third-order solution is correct only to first-order in each of the field amplitudes. In the saturated regime, we predict several new phenomena associated with degenerate four-wave mixing and also describe the ac Stark effect and how it modifies the frequency response of the filtering process. We also show how a narrow bandwidth optical filter with an efficiency greater than unity can be constructed.

In many atomic systems the atoms are moving at significant velocities, such that the Doppler linewidth of the system is larger than the homogeneous linewidth; the latter dominates the response of the ensemble of stationary atoms. To better understand this case, the density matrix equations are solved to third order by perturbation theory for an atom of velocity v. The solution for the polarization is then integrated over the velocity distribution of the macroscopic system, which is assumed to be a Gaussian distribution of velocities, since that is an excellent model of many real systems. Using the Doppler-broadened system, we explain how a tunable optical filter can be constructed whose bandwidth is limited by the homogeneous linewidth of the atom while the tuning range of the filter extends over the entire Doppler profile.
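The velocity average described here can be sketched numerically: a Lorentzian single-atom response is weighted by a Gaussian distribution of Doppler shifts (a numerically evaluated Voigt-like profile). The linewidth values below are arbitrary illustrative numbers, not sodium's actual parameters:

```python
import numpy as np


def doppler_averaged_response(detuning, gamma, sigma, n=4001):
    """Lorentzian response (HWHM gamma) integrated over a Gaussian
    distribution of Doppler shifts (standard deviation sigma); all
    quantities share the same frequency units."""
    shifts = np.linspace(-6 * sigma, 6 * sigma, n)
    weights = np.exp(-(shifts**2) / (2 * sigma**2))
    weights /= weights.sum()                      # normalized velocity weights
    lorentzian = gamma**2 / ((detuning - shifts) ** 2 + gamma**2)
    return float(np.sum(weights * lorentzian))


# Doppler width much larger than the homogeneous width, as in a hot vapor:
gamma, sigma = 1.0, 30.0
peak = doppler_averaged_response(0.0, gamma, sigma)
wing = doppler_averaged_response(60.0, gamma, sigma)
```

Because only the narrow velocity class resonant at each detuning responds strongly, the averaged profile is much broader and weaker than the single-atom Lorentzian, which is the regime in which the homogeneous-linewidth-limited, Doppler-tunable filter operates.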

Since it is a resonant system, sodium vapor is used as the nonlinear medium in our experiments. The relevant properties of sodium are discussed in great detail. In particular, the wavefunctions of the 3S and 3P states are analyzed and a discussion of how the 3S-3P transition models a two-level system is given.

Using sodium as the nonlinear medium we demonstrate an ultrahigh-Q optical filter using phase conjugation via nondegenerate four-wave mixing as the filtering process. The filter has a FWHM bandwidth of 41 MHz and a maximum efficiency of 4 x 10^-3. However, our theoretical work and other experimental work with sodium suggest that an efficient filter with both gain and a narrower bandwidth should be quite feasible.

Relevance: 10.00%

Abstract:

Nucleic acids are a useful substrate for engineering at the molecular level. Designing the detailed energetics and kinetics of interactions between nucleic acid strands remains a challenge. Building on previous algorithms to characterize the ensemble of dilute solutions of nucleic acids, we present a design algorithm that allows optimization of structural features and binding energetics of a test tube of interacting nucleic acid strands. We extend this formulation to handle multiple thermodynamic states and combinatorial constraints to allow optimization of pathways of interacting nucleic acids. In both design strategies, low-cost estimates to thermodynamic properties are calculated using hierarchical ensemble decomposition and test tube ensemble focusing. These algorithms are tested on randomized test sets and on example pathways drawn from the molecular programming literature. To analyze the kinetic properties of designed sequences, we describe algorithms to identify dominant species and kinetic rates using coarse-graining at the scale of a small box containing several strands or a large box containing a dilute solution of strands.

Relevance: 10.00%

Abstract:

The Madden-Julian Oscillation (MJO) is a pattern of intense rainfall and associated planetary-scale circulations in the tropical atmosphere, with a recurrence interval of 30-90 days. Although the MJO was first discovered 40 years ago, it is still a challenge to simulate the MJO in general circulation models (GCMs), and even with simple models it is difficult to agree on the basic mechanisms. This deficiency is mainly due to our poor understanding of moist convection—deep cumulus clouds and thunderstorms, which occur at scales that are smaller than the resolution elements of the GCMs. Moist convection is the most important mechanism for transporting energy from the ocean to the atmosphere. Success in simulating the MJO will improve our understanding of moist convection and thereby improve weather and climate forecasting.

We address this fundamental subject by analyzing observational datasets, constructing a hierarchy of numerical models, and developing theories. Parameters of the models are taken from observation, and the simulated MJO fits the data without further adjustments. The major findings include: 1) the MJO may be an ensemble of convection events linked together by small-scale high-frequency inertia-gravity waves; 2) the eastward propagation of the MJO is determined by the difference between the eastward and westward phase speeds of the waves; 3) the planetary scale of the MJO is the length over which temperature anomalies can be effectively smoothed by gravity waves; 4) the strength of the MJO increases with the typical strength of convection, which increases in a warming climate; 5) the horizontal scale of the MJO increases with the spatial frequency of convection; and 6) triggered convection, where potential energy accumulates until a threshold is reached, is important in simulating the MJO. Our findings challenge previous paradigms, which consider the MJO as a large-scale mode, and point to ways for improving the climate models.