Abstract:
Data-flow analysis is an integral part of any aggressive optimizing compiler. We propose a framework for improving the precision of data-flow analysis in the presence of complex control flow. We initially perform data-flow analysis to determine those control-flow merges which cause the loss in data-flow analysis precision. The control-flow graph of the program is then restructured such that performing data-flow analysis on the resulting restructured graph gives more precise results. The proposed framework is both simple, involving the familiar notion of product automata, and general, since it is applicable to any forward data-flow analysis. Apart from proving that our restructuring process is correct, we also show that restructuring is effective in that it necessarily leads to more optimization opportunities. Furthermore, the framework handles the trade-off between the increase in data-flow precision and the code-size increase inherent in the restructuring. We show that determining an optimal restructuring is NP-hard, and propose and evaluate a greedy strategy. The framework has been implemented in the Scale research compiler, and instantiated for the specific problem of Constant Propagation. On the SPECINT 2000 benchmark suite we observe an average speedup of 4% in running time over the Wegman-Zadeck conditional constant propagation algorithm and 2% over a purely path-profile-guided approach.
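A minimal sketch in Python (illustrative only; this is not the paper's Scale implementation, and the three-block example and the meet function are hypothetical) of why a control-flow merge loses constant-propagation precision and how duplicating the join block, as the restructuring does, recovers it:

# Constant-propagation lattice: a value is either undefined, a known constant,
# or not-a-constant (the merge of two different constants).
BOT, NAC = "undef", "not-a-constant"

def meet(a, b):
    # Meet of two lattice values at a control-flow merge.
    if a == BOT:
        return b
    if b == BOT:
        return a
    return a if a == b else NAC

# Two predecessors of a join block J assign different constants to x:
#   B1: x = 1      B2: x = 2      J: y = x + 1
env_b1 = {"x": 1}
env_b2 = {"x": 2}

# Classical analysis merges the environments at J, x becomes not-a-constant,
# and y = x + 1 cannot be folded.
print(meet(env_b1["x"], env_b2["x"]))          # not-a-constant

# Restructuring duplicates J along each incoming edge (a copy J1 reached only
# from B1, a copy J2 reached only from B2), so no merge happens before y is
# computed and y folds to a constant on every path, at the cost of code growth.
print(env_b1["x"] + 1, env_b2["x"] + 1)        # 2 3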
Abstract:
A computationally efficient agglomerative clustering algorithm based on multilevel theory is presented. Here, the data set is divided randomly into a number of partitions. The samples of each partition are clustered separately using a hierarchical agglomerative clustering algorithm to form sub-clusters, which are then merged at higher levels to obtain the final classification. This algorithm leads to the same classification as the hierarchical agglomerative clustering algorithm when the clusters are well separated. The advantages of this algorithm are its short run time and small storage requirement. It is observed that the savings in storage space and computation time increase nonlinearly with the sample size.
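A hedged sketch of the two-level scheme in Python, using SciPy's hierarchical clustering; the partition count, Ward linkage, and the summarization of sub-clusters by their centroids are assumptions made here for illustration, not details taken from the paper:

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def multilevel_agglomerative(X, n_partitions=4, n_subclusters=10, n_clusters=3, seed=0):
    rng = np.random.default_rng(seed)
    parts = np.array_split(rng.permutation(len(X)), n_partitions)

    # Level 1: cluster each random partition separately into sub-clusters.
    centroids, members = [], []
    for idx in parts:
        Z = linkage(X[idx], method="ward")
        labels = fcluster(Z, t=n_subclusters, criterion="maxclust")
        for c in np.unique(labels):
            sub = idx[labels == c]
            centroids.append(X[sub].mean(axis=0))
            members.append(sub)

    # Level 2: merge the sub-clusters (represented by their centroids)
    # to obtain the final classification.
    top = fcluster(linkage(np.vstack(centroids), method="ward"),
                   t=n_clusters, criterion="maxclust")

    final = np.empty(len(X), dtype=int)
    for sub, label in zip(members, top):
        final[sub] = label
    return final

Because each partition holds only a fraction of the samples, the pairwise distance structures built at level 1 are much smaller than the one required for clustering the whole data set at once, which is where the storage and run-time savings come from.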
Abstract:
The applicability of a formalism involving an exponential function of the composition $x_1$ in interpreting the thermodynamic properties of alloys has been studied. The excess integral and partial molar free energies of mixing are expressed as:
$$\begin{aligned} \Delta F^{xs} &= a_0\, x_1 (1 - x_1)\, e^{b x_1} \\ RT \ln\gamma_1 &= a_0 (1 - x_1)^2 (1 + b x_1)\, e^{b x_1} \\ RT \ln\gamma_2 &= a_0\, x_1^2 (1 - b + b x_1)\, e^{b x_1} \end{aligned}$$
The equations are used in interpreting experimental data for several relatively weakly interacting binary systems. For comparison, activity coefficients obtained from the subregular model and Krupkowski's formalism have also been computed. The present equations provide a convenient description of the thermodynamic behavior of metallic solutions.
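For concreteness, a small Python helper that evaluates the three expressions above for given model parameters; the gas constant and temperature defaults are arbitrary example values, not taken from the paper:

import math

def excess_properties(a0, b, x1, R=8.314, T=1000.0):
    # Evaluate the exponential-composition formalism for a binary alloy.
    # a0 and b are the model parameters; x1 is the mole fraction of component 1.
    e = math.exp(b * x1)
    dF_xs     = a0 * x1 * (1.0 - x1) * e                          # excess free energy of mixing
    ln_gamma1 = a0 * (1.0 - x1) ** 2 * (1.0 + b * x1) * e / (R * T)
    ln_gamma2 = a0 * x1 ** 2 * (1.0 - b + b * x1) * e / (R * T)
    return dF_xs, ln_gamma1, ln_gamma2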
Abstract:
3,3',5,5'-Tetrabromo-4,4'-diaminodiphenylmethane has been synthesized and its spectral and thermal characteristics have been examined.
Abstract:
A multi-access scheme is proposed for handling priority-based messages in data communication systems through satellites. The different schemes by which time slots are allotted by the satellite are based on a ‘priority index’. The performance characteristics of the system using these schemes under different traffic conditions are discussed.
Abstract:
Abstract is not available.
Abstract:
Screen-less oscillation photography is the method of choice for recording three-dimensional X-ray diffraction data for crystals of biological macromolecules. The geometry of an oscillation camera is extremely simple. However, the manner in which the reciprocal lattice is recorded in any experiment is fairly complex. This depends on the Laue symmetry of the reciprocal lattice, the lattice type, the orientation of the crystal on the camera and to a lesser extent on the unit-cell dimensions. Exploring the relative efficiency of collecting X-ray diffraction data for different crystal orientations prior to data collection might reduce the number of films required to record most of the unique data and the consequent amount of time required for processing these films. Here algorithms are presented suitable for this purpose and results are reported for the 11 Laue groups, different lattice types and crystal orientations often employed in data collection.
Abstract:
Many novel computer architectures, such as array processors and multiprocessors, achieve high performance through the use of concurrency by exploiting variations of the von Neumann model of computation. The effective utilization of these machines makes special demands on programmers and their programming languages, such as the structuring of data into vectors or the partitioning of programs into concurrent processes. In comparison, the data flow model of computation demands only that the principle of structured programming be followed. A data flow program, often represented as a data flow graph, is a program that expresses a computation by indicating the data dependencies among operators. A data flow computer is a machine designed to take advantage of concurrency in data flow graphs by executing data-independent operations in parallel. In this paper, we discuss the design of a high-level language (DFL: Data Flow Language) suitable for data flow computers. Some sample procedures in DFL are presented. The implementation aspects are not discussed in detail since no new problems are encountered. The language DFL embodies the concepts of functional programming, but in appearance closely resembles Pascal. The language is a better vehicle than the data flow graph for expressing a parallel algorithm. The compiler has been implemented on a DEC 1090 system in Pascal.
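The DFL syntax itself is not reproduced here; instead, a small Python sketch of the execution model the language targets: an operator fires as soon as all of its operand tokens have arrived, and operators with no data dependence between them become ready together and could run in parallel on a data flow machine. The hypothetical graph below computes r = (a + b) * (a - b):

import operator

graph = {
    "add": {"op": operator.add, "inputs": ["a", "b"]},
    "sub": {"op": operator.sub, "inputs": ["a", "b"]},
    "mul": {"op": operator.mul, "inputs": ["add", "sub"]},
}

def run(graph, initial_tokens):
    tokens = dict(initial_tokens)      # name -> value whose token has arrived
    pending = set(graph)
    while pending:
        # Every node whose operands are all available is ready at the same
        # time; a data flow computer could execute these concurrently.
        ready = [n for n in pending if all(i in tokens for i in graph[n]["inputs"])]
        for n in ready:
            args = [tokens[i] for i in graph[n]["inputs"]]
            tokens[n] = graph[n]["op"](*args)
            pending.remove(n)
    return tokens

print(run(graph, {"a": 5, "b": 3})["mul"])     # (5 + 3) * (5 - 3) = 16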
Abstract:
This paper presents a Chance-constraint Programming approach for constructing maximum-margin classifiers which are robust to interval-valued uncertainty in training examples. The methodology ensures that uncertain examples are classified correctly with high probability by employing chance constraints. The main contribution of the paper is to pose the resultant optimization problem as a Second Order Cone Program by using large-deviation inequalities due to Bernstein. Apart from the support and mean of the uncertain examples, these Bernstein-based relaxations make no further assumptions on the underlying uncertainty. Classifiers built using the proposed approach are less conservative, yield higher margins, and hence are expected to generalize better than existing methods. Experimental results on synthetic and real-world datasets show that the proposed classifiers are better equipped to handle interval-valued uncertainty than state-of-the-art methods.
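The Bernstein-relaxed SOCP itself is not reconstructed here; as a simpler point of comparison, the Python sketch below (using cvxpy; all names are hypothetical) builds a worst-case robust soft-margin classifier for box/interval uncertainty, where each example may lie anywhere in [x_i - d_i, x_i + d_i] and the margin constraint is tightened by d_i·|w|:

import cvxpy as cp
import numpy as np

def robust_interval_classifier(X, y, D, C=1.0):
    # X: (n, d) interval midpoints, y: labels in {-1, +1}, D: (n, d) half-widths.
    n, d = X.shape
    w, b = cp.Variable(d), cp.Variable()
    xi = cp.Variable(n, nonneg=True)

    # Worst case over each interval box tightens the margin by D @ |w|.
    margins = cp.multiply(y, X @ w + b) - D @ cp.abs(w)
    problem = cp.Problem(cp.Minimize(0.5 * cp.sum_squares(w) + C * cp.sum(xi)),
                         [margins >= 1 - xi])
    problem.solve()
    return w.value, b.value

A chance-constrained formulation such as the paper's is less conservative than this worst-case treatment, since it only requires correct classification with high probability rather than for every point of the interval.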
Abstract:
A compilation of crystal structure data on deoxyribo- and ribonucleosides and their higher derivatives is presented. The aim of this paper is to highlight the flexibility of the deoxyribose and ribose rings. So far, the conformational parameters of the deoxyribose and ribose constituents of nucleic acids have not been analysed separately; this paper correlates the conformational parameters with the nature and puckering of the sugar. Deoxyribose puckering occurs in the C2′ endo region, while ribose puckering is observed in both the C3′ endo and C2′ endo regions. A few endocyclic and exocyclic bond angles depend on the puckering and the nature of the sugar. The majority of structures have an anti conformation about the glycosyl bond. There appears to be a dependence between the sugar puckering and the torsion angle about the C4′---C5′ bond. Such stereochemical information is useful in model-building studies of polynucleotides and nucleic acids.
Abstract:
Tetrapeptide sequences of the type Z-Pro-Y-X were obtained from the crystal structure data on 34 globular proteins, and used in an analysis of the positional preferences of the individual amino acid residues in the β-turn conformation. The effect of fixing proline as the second position residue in the tetrapeptide sequence was studied by comparing the data obtained on the positional preferences with the corresponding data obtained by Chou and Fasman using the Z-R-Y-X sequence, where no particular residue was fixed in any of the four positions. While, in general, several amino acid residues having relatively very high or very low preferences for specific positions were found to be common to both the Z-Pro-Y-X and Z-R-Y-X sequences, many significant differences were found between the two sets of data, which are to be attributed to specific interactions arising from the presence of the proline residue.
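The abstract does not spell out the normalization, but a common Chou-Fasman-style positional preference is the frequency of a residue at a given turn position divided by its overall frequency in the data set; a small Python sketch with that assumed definition (the four-residue sequences and counts are placeholders):

from collections import Counter

def positional_preferences(turn_sequences, all_residue_counts):
    # turn_sequences: 4-residue turn strings (e.g. from Z-Pro-Y-X windows);
    # all_residue_counts: Counter of residues over the whole data set.
    total = sum(all_residue_counts.values())
    n_turns = len(turn_sequences)
    prefs = {}
    for pos in range(4):
        at_pos = Counter(seq[pos] for seq in turn_sequences)
        prefs[pos] = {res: (at_pos[res] / n_turns) / (all_residue_counts[res] / total)
                      for res in at_pos}
    return prefs

Values above 1 indicate that a residue is over-represented at that turn position relative to its background frequency.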
Abstract:
Data flow computers are high-speed machines in which an instruction is executed as soon as all its operands are available. This paper describes the EXtended MANchester (EXMAN) data flow computer, which incorporates three major extensions to the basic Manchester machine: a multiple matching units scheme, an efficient implementation of array data structures, and a facility to concurrently execute reentrant routines. A simulator for the EXMAN computer has been coded in the discrete-event simulation language SIMULA 67 on the DEC 1090 system. Performance analysis studies have been conducted on the simulated EXMAN computer to study the effectiveness of the proposed extensions. The performance experiments have been carried out using three sample problems: matrix multiplication, Bresenham's line drawing algorithm, and the polygon scan-conversion algorithm.
Abstract:
The A-DNA pattern, obtained using a flat plate camera, was indexed by Fuller et al. on the basis of a C-face-centred monoclinic cell with a = 22.24 Å, b = 40.62 Å, c = 28.15 Å and β = 97.0°. A precession photograph of A-DNA, which gives an undistorted picture of the lattice, showed that the unit-cell parameters given by Fuller et al. were not quite correct. The precession photograph showed a strong meridional reflection (R = 0.00 Å−1) on the 11th layer line, but the occurrence of this meridional reflection could not be explained on the basis of the cell parameters given by Fuller et al.; using those cell parameters, the reflection which comes closest to the meridian on the 11th layer line is at R = 0.025 Å−1. However, a simple interchange of the a and b values accounted for the meridional reflection on the 11th layer line. The corrected cell parameters, refined against 28 strong spots, are a = 40.75 Å, b = 22.07 Å, c = 28.16 Å and β = 97.5°. In the new unit cell of A-DNA, the packing arrangement of the two molecules is different from that in the old one. Nonetheless, our earlier contention is reaffirmed that both right- and left-handed A-DNA are stereochemically allowed and consistent with the observed fibre pattern.
Abstract:
The average dimensions of the peptide unit have been obtained from the data reported in recent crystal structure analyses of di- and tripeptides. The bond lengths and bond angles agree with those in common use, except for the bond angle C---N---H, which is about 4° less than the accepted value, and the angle C2α---N---H which is about 4° more. The angle τ (Cα) has a mean value of 114° for glycyl residues and 110° for non-glycyl residues. Attention is directed to these mean values as observed in crystal structures, as they are relevant for model building of peptide chain structures.