217 results for Compression tension test
Abstract:
When the variation of secondary compression with log10 t is non-linear, quantification of secondary settlement through the coefficient of secondary compression, C_αε, becomes difficult and frequently leads to an underestimate of the settlement. A log10 δ versus log10 t representation of such true-compression data has the distinct advantage of exhibiting linear secondary compression behaviour over an appreciably larger time span. The slope of the secondary compression portion of the log10 e versus log10 t curve, expressed as Δ(log e)/Δ(log t) and called the 'secondary compression factor', m, proves to be a better alternative to C_αε, and the prediction of secondary settlement is improved.
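As a quick illustration of the two quantities compared above, the sketch below fits both C_αε (change in strain per log10 cycle of time) and the secondary compression factor m (slope in log-log space) to a hypothetical secondary-compression record; the data values, the strain variable, and the time window are invented for illustration and are not from the paper.

```python
import numpy as np

# Hypothetical secondary-compression record (time in minutes, vertical strain).
# Values are illustrative only, not taken from the paper.
t = np.array([100, 200, 400, 800, 1600, 3200, 6400], dtype=float)
strain = np.array([0.050, 0.056, 0.063, 0.071, 0.080, 0.090, 0.101])

# Conventional coefficient of secondary compression:
# C_alpha_eps = d(strain) / d(log10 t), assumed constant (linear fit).
C_alpha, _ = np.polyfit(np.log10(t), strain, 1)

# Secondary compression factor:
# m = d(log10 strain) / d(log10 t), i.e. the slope in log-log space.
m, _ = np.polyfit(np.log10(t), np.log10(strain), 1)

print(f"C_alpha_eps ~ {C_alpha:.4f} per log10 cycle of time")
print(f"secondary compression factor m ~ {m:.3f}")
```

When strain versus log10 t curves, a single C_αε no longer describes the record, whereas the log-log slope m stays nearly constant over a wider time span, which is the advantage the abstract describes.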
Abstract:
Sandwich structures, especially those with honeycomb and grid cores, are widely employed in aircraft structures. Closed-pore rigid syntactic foams are increasingly used as core materials in sandwich constructions because they possess a number of favourable properties. Owing to their structure and formation, syntactic foams behave differently under compression than other traditionally used core materials. In the present study, therefore, syntactic foam core sandwich constructions are evaluated for their behaviour under compression in both edgewise and flatwise orientations. The work further characterises the relative performance of two sets of sandwich materials, one with glass-epoxy skins and the other with glass/carbon hybrid-epoxy skins. As test specimens of non-standard geometry were involved, only a comparative evaluation was attempted. The experiments indicate that the nature of the reinforcement fabric in the skin has a bearing on the test results in the edgewise orientation: the initiation of a vertical crack in the central plane of the core, a typical fracture event in this kind of material, was delayed in specimens containing glass fabric in the skin. Correlations are drawn between visual observations made on the specimens during testing and post-compression microscopic examination of the fracture features.
Abstract:
The consistency of very soft sediments prevents the conventional oedometer test from being applied to study their compressibility and permeability characteristics, while existing hydraulic consolidation tests require sophisticated instrumentation and testing procedures. The present paper proposes a seepage-force-induced consolidation testing procedure for studying the compressibility and permeability behavior of soft sediments at low effective stress levels. The good agreement observed between results from the proposed method and the conventional oedometer test at overlapping effective stress levels indicates that the proposed method can satisfactorily characterize the compressibility and permeability of soft sediments at low effective stress levels.
Abstract:
We propose a physical mechanism for the triggering of starbursts in interacting spiral galaxies by shock compression of the pre-existing disk giant molecular clouds (GMCs). We show that as a disk GMC tumbles into the central region of a galaxy following a galactic tidal encounter, it undergoes radiative shock compression by the pre-existing high pressure of the central molecular intercloud medium. The shocked outer shell of the GMC becomes gravitationally unstable, resulting in a burst of star formation in the initially stable GMC. In the case of colliding galaxies with physical overlap, such as Arp 244, the cloud compression is shown to occur due to the hot, high-pressure remnant gas resulting from the collisions of atomic hydrogen gas clouds from the two galaxies. The resulting values of infrared luminosity agree with observations. The main mode of triggered star formation is via clusters of stars; thus we can naturally explain the formation of the young, luminous star clusters observed in starburst galaxies.
Abstract:
The moisture absorption and changes in compression strength of glass-epoxy (G-E) composites, without and with discrete quantities of graphite powder introduced into the resin mix prior to its spreading on specific glass fabric layers during the lay-up (stacking) sequence, form the subject matter of this report. The results point to higher moisture absorption for the graphite-bearing specimens. The strengths of the graphite-free coupons show a continuous decrease, while the filler-bearing ones show an initial rise followed by a drop at longer exposure times. Fractographic features were examined under the scanning electron microscope for an understanding of the process. The observations are explained by invoking the effect of matrix plasticization and the role of the interfacial regions.
Abstract:
Two methods based on wavelet/wavelet packet expansion to denoise and compress optical tomography data containing scattered noise are presented. In the first, the wavelet expansion coefficients of the noisy data are shrunk using a soft threshold. In the second, the data are expanded into a wavelet packet tree on which a best-basis search is performed, and the resulting coefficients are truncated on the basis of energy content. The first method efficiently denoises experimental data for scattering particle densities in the medium surrounding the object of up to 12.0 x 10^6 per cm^3, and achieves a compression ratio of approximately 8:1. The wavelet packet based method achieves compression of up to 11:1 and also exhibits reasonable noise reduction capability. Tomographic reconstructions obtained from the denoised data are presented.
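A minimal sketch of the first (soft-threshold) approach on a synthetic noisy 1-D signal, using the PyWavelets library; the wavelet family, decomposition level, and universal-threshold rule are assumptions of this sketch rather than details taken from the paper.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)

# Synthetic 1-D projection with additive noise (stand-in for tomography data).
x = np.linspace(0, 1, 1024)
clean = np.exp(-((x - 0.5) / 0.1) ** 2)
noisy = clean + 0.05 * rng.standard_normal(x.size)

# Wavelet expansion, then soft-threshold shrinkage of the detail coefficients.
wavelet, level = "db4", 5                        # assumed choices
coeffs = pywt.wavedec(noisy, wavelet, level=level)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise estimate from finest scale
thr = sigma * np.sqrt(2 * np.log(noisy.size))    # universal threshold
shrunk = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]

denoised = pywt.waverec(shrunk, wavelet)[: noisy.size]

# Crude compression ratio: total coefficients vs. those surviving shrinkage.
kept = sum(int(np.count_nonzero(c)) for c in shrunk)
total = sum(c.size for c in coeffs)
print(f"compression ratio ~ {total / kept:.1f}:1")
```

The ratio here simply counts coefficients zeroed by shrinkage; the 8:1 and 11:1 figures in the abstract refer to the paper's own experimental data.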
Abstract:
The amount of data contained in electroencephalogram (EEG) recordings is quite massive, and this places constraints on bandwidth and storage. Online transmission of the data requires a scheme that offers high performance at low computational cost. Single-channel algorithms, when applied to multichannel EEG data, fail to meet this requirement. While many methods have been proposed for multichannel ECG compression, not much work appears to have been done on multichannel EEG compression. In this paper, we present an EEG compression algorithm based on a multichannel model, which gives higher performance than other algorithms. Simulations have been performed on both normal and pathological EEG data, and a high compression ratio with very large SNR is obtained in both cases. The reconstructed signals are found to match the original signals very closely, confirming that diagnostic information is preserved during transmission.
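The abstract does not spell out the multichannel model, so the sketch below uses a common stand-in, inter-channel decorrelation by PCA followed by coarse quantization, purely to illustrate how the compression-ratio and SNR figures of merit are computed; it should not be read as the paper's algorithm, and the synthetic "EEG" is invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "EEG": 8 channels sharing two latent sources plus channel noise.
n_ch, n_samp = 8, 2048
sources = rng.standard_normal((2, n_samp))
mixing = rng.standard_normal((n_ch, 2))
eeg = mixing @ sources + 0.05 * rng.standard_normal((n_ch, n_samp))

# Decorrelate across channels (PCA via SVD) and keep the strongest components.
mean = eeg.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(eeg - mean, full_matrices=False)
k = 2                                                  # components retained (assumption)
coded = np.round(np.diag(s[:k]) @ Vt[:k] * 100) / 100  # coarse quantization

# Reconstruct and evaluate.
recon = U[:, :k] @ coded + mean
err = eeg - recon
snr_db = 10 * np.log10(np.sum(eeg**2) / np.sum(err**2))
ratio = eeg.size / coded.size                          # ignores entropy coding
print(f"compression ratio ~ {ratio:.1f}:1, SNR ~ {snr_db:.1f} dB")
```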
Abstract:
Particulate composites based on polymer matrices generally contain fillers, especially those that are abundantly available and cheap. Their inclusion, besides improving the properties, makes the system economically viable. In the present study, fly ash was tried as a filler in epoxy. The filler particle surfaces were modified using three chemical surface treatment techniques in order to bring out the effect of adhesion at the interface on the mechanical properties of these composites. Compatibilization of the filler with a silane coupling agent yielded the best compression strength values. Scanning electron microscopy (SEM) was used to characterize the composites and to supplement the mechanical test data.
Abstract:
The effect of the test gas on the flow field around a 120° apex-angle blunt cone has been investigated in a shock tunnel at a nominal Mach number of 5.75. The shock standoff distance around the blunt cone was measured by an electrical discharge technique using both carbon dioxide and air as test gases, and the forebody laminar convective heat transfer to the blunt cone was measured with platinum thin-film sensors in both environments. An increase of 10 to 15% in the measured heat transfer was observed with carbon dioxide as the test gas in comparison to air. The measured thickness of the shock layer along the stagnation streamline was 3.57 +/- 0.17 mm in air and 3.29 +/- 0.26 mm in carbon dioxide; the computed values were 3.98 mm and 3.02 mm, respectively. The increase in the measured heat transfer rates in carbon dioxide compared to air is attributed to the higher density ratio across the bow shock wave and the correspondingly thinner shock layer.
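To make the final point concrete, here is a frozen-flow, ideal-gas estimate of the normal-shock density ratio at the nominal Mach number; the specific-heat ratios are textbook values, and the calculation ignores the vibrational and real-gas effects present in the actual shock layer, so it only indicates the trend between the two gases.

```python
# Ideal-gas normal-shock density ratio: rho2/rho1 = (g+1) M^2 / ((g-1) M^2 + 2).
# Frozen-flow estimate only; the real CO2 flow relaxes toward even higher ratios.
def density_ratio(mach: float, gamma: float) -> float:
    return (gamma + 1.0) * mach**2 / ((gamma - 1.0) * mach**2 + 2.0)

M = 5.75
for gas, gamma in [("air", 1.4), ("CO2", 1.29)]:   # nominal gamma values
    print(f"{gas}: rho2/rho1 ~ {density_ratio(M, gamma):.2f}")
```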
Abstract:
We consider the breaking of a polymer molecule which is fixed at one end and acted upon by a force at the other. The polymer is assumed to be a linear chain joined together by bonds which satisfy the Morse potential. The applied force is found to modify the Morse potential so that the minimum becomes metastable, and breaking is just the decay of this metastable bond by passage over the barrier. Increasing the force distorts the potential more and more and eventually leads to the disappearance of the barrier; the limiting force at which the barrier disappears is D_e a/2, with D_e and a the parameters characterizing the Morse potential. The rate of breaking is first calculated using multidimensional quantum transition state theory, with the harmonic approximation used to account for the vibrations of all the units. This includes tunneling contributions to the rate, but is valid only above a certain critical temperature, and it yields an analytical expression for the rate of breaking. We have calculated the rate of breaking for a model which mimics polyethylene. First we calculate the rate of breaking of a single bond, ignoring the other bonds. Inclusion of the other bonds under the harmonic approximation is found to lower this rate by at most one order of magnitude. Quantum effects increase the rate of breaking and are significant only at temperatures below 150 K. At 300 K, the calculations predict a bond in polyethylene to have a lifetime of only seconds at a force which is only half the limiting force. Calculations were also done using the Lennard-Jones potential; the results for the Lennard-Jones and Morse potentials were rather different, owing to the different long-range behaviors of the two potentials. A calculation including friction was carried out at the classical level by assuming that each atom of the chain is coupled to its own collection of harmonic oscillators. Comparison of the results with the simulations of Oliveira and Taylor [J. Chem. Phys. 101, 10118 (1994)] showed our rate to be two to three orders of magnitude higher. As a possible explanation of the discrepancy, we consider the translational motion of the ends of the broken chains. Using a continuum approximation for the chain, we find that in the absence of friction the rate of the process can be limited by the rate at which the two broken ends separate from one another, and the lowering of the rate is at most a factor of 2 for the parameters used in the simulation (for polyethylene). In the presence of friction, we find that the rate can be lowered by one to two orders of magnitude, bringing our results into reasonable agreement with the simulations.
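The tilted-potential picture can be reproduced numerically: the sketch below adds a constant pulling force to a Morse potential, locates the metastable minimum and the barrier top, and shows the barrier shrinking toward zero as the force approaches D_e a/2. The well depth and width parameter are generic placeholders, not the polyethylene values used in the paper.

```python
import numpy as np

# Morse bond tilted by a constant pulling force F:
#   V(x) = D_e * (1 - exp(-a x))**2 - F * x
# The barrier disappears at F_max = D_e * a / 2 (the maximum restoring force).
D_e = 4.0    # well depth, eV             (placeholder value)
a = 2.0      # inverse width, 1/angstrom  (placeholder value)
F_max = D_e * a / 2.0

x = np.linspace(-0.3, 6.0, 20000)
for frac in (0.25, 0.5, 0.75, 0.99):
    F = frac * F_max
    V = D_e * (1.0 - np.exp(-a * x)) ** 2 - F * x
    dV = np.gradient(V, x)
    extrema = np.where(np.diff(np.sign(dV)) != 0)[0]     # minimum, then barrier top
    if len(extrema) >= 2:
        barrier = V[extrema[1]] - V[extrema[0]]          # barrier height above minimum
        print(f"F = {frac:.2f} F_max: barrier ~ {barrier:.3f} eV")
    else:
        print(f"F = {frac:.2f} F_max: no barrier (bond breaks spontaneously)")
```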
Abstract:
Syntactic foam made by mechanically mixing hollow glass spheres into an epoxy resin matrix is characterized for its compressive properties in the present study. The volume fraction of hollow spheres in the syntactic foam under investigation is kept at 67.8%. The effect of specimen aspect ratio on the failure behavior and stress-strain curve of the material is highlighted: considerable differences are noted in the macroscopic fracture features and in the stress-strain curve with variation in specimen aspect ratio, although the compressive yield strength values remain within a narrow range. Post-compression scanning electron microscopic observations, coupled with macroscopic observations taken during the tests, helped explain the differences in specimen behavior and support the proposed arguments.
Abstract:
We propose a scheme for the compression of tree structured intermediate code consisting of a sequence of trees specified by a regular tree grammar. The scheme is based on arithmetic coding, and the model that works in conjunction with the coder is automatically generated from the syntactical specification of the tree language. Experiments on data sets consisting of intermediate code trees yield compression ratios ranging from 2.5 to 8, for file sizes ranging from 167 bytes to 1 megabyte.
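A toy sketch of the underlying idea, that a regular tree grammar tells the coder which productions are syntactically possible at each node, so only the choice among those needs to be encoded. The grammar and the example tree are invented, and a static uniform model is used here for simplicity, whereas the scheme in the abstract drives an adaptive arithmetic coder from the grammar.

```python
import math

# Invented regular tree grammar: nonterminal -> list of (operator, child nonterminals).
GRAMMAR = {
    "Expr": [("add", ["Expr", "Expr"]), ("mul", ["Expr", "Expr"]),
             ("load", ["Addr"]), ("const", [])],
    "Addr": [("global", []), ("local", []), ("index", ["Addr", "Expr"])],
}

def code_length(tree, nonterminal="Expr"):
    """Bits to encode `tree` when the decoder already knows `nonterminal`.

    With a single applicable production the choice costs 0 bits; otherwise
    log2(#choices) bits, plus the cost of the children in their own contexts.
    """
    op, children = tree
    choices = GRAMMAR[nonterminal]
    idx = [kind for kind, _ in choices].index(op)
    bits = math.log2(len(choices)) if len(choices) > 1 else 0.0
    child_nts = choices[idx][1]
    return bits + sum(code_length(c, nt) for c, nt in zip(children, child_nts))

# Example intermediate-code tree: add(load(index(local, const)), const)
tree = ("add", [("load", [("index", [("local", []), ("const", [])])]),
                ("const", [])])
print(f"~{code_length(tree):.1f} bits under the grammar-driven uniform model")
```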
Abstract:
The purpose of this paper is to present exergy charts for carbon dioxide (CO2) based on the new fundamental equation of state, together with the results of a thermodynamic analysis of conventional and trans-critical vapour compression refrigeration cycles using those data. The calculation scheme is implemented on the Mathematica platform. Upper and lower bounds exist for the high-side cycle pressure for a given set of evaporating and pre-throttling temperatures, and the maximum possible exergetic efficiency for each case is determined. Empirical correlations for exergetic efficiency and COP, valid in the range of temperatures studied here, are obtained, and the exergy losses are quantified.
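As a rough companion to the cycle analysis described above, the sketch below evaluates one trans-critical vapour-compression cycle with the CoolProp property library, assuming isentropic compression and isenthalpic throttling; the evaporating temperature, gas-cooler exit temperature, high-side pressure, and dead-state temperature are arbitrary illustrative choices, and the exergetic efficiency is approximated as COP divided by the Carnot COP between the assumed reservoir temperatures.

```python
from CoolProp.CoolProp import PropsSI

# Illustrative state points (not taken from the paper).
T_evap = 273.15 - 5.0     # evaporating temperature, K
T_gc_out = 273.15 + 35.0  # gas-cooler (pre-throttling) exit temperature, K
p_high = 10.0e6           # high-side pressure, Pa
T0 = 273.15 + 30.0        # dead-state / heat-rejection temperature, K

# 1: saturated vapour leaving the evaporator.
h1 = PropsSI("H", "T", T_evap, "Q", 1, "CO2")
s1 = PropsSI("S", "T", T_evap, "Q", 1, "CO2")
# 2: isentropic compression to the high-side pressure.
h2 = PropsSI("H", "P", p_high, "S", s1, "CO2")
# 3: gas cooler exit; 4: isenthalpic expansion back to evaporator pressure.
h3 = PropsSI("H", "T", T_gc_out, "P", p_high, "CO2")
h4 = h3

cop = (h1 - h4) / (h2 - h1)
cop_carnot = T_evap / (T0 - T_evap)
print(f"COP ~ {cop:.2f}, exergetic efficiency ~ {cop / cop_carnot:.2f}")
```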
Abstract:
The problem of guessing a random string is revisited. The relationship between guessing without distortion and compression is extended to the case where the source alphabet is countably infinite. Further, a similar relationship is established for the case where distortion is allowed, by establishing a tight relationship between rate-distortion codes and guessing strategies.
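A toy numerical illustration of the guessing-compression connection for a finite alphabet (the paper's extensions to countably infinite alphabets and to the distortion case are not attempted here): the optimal guesser queries symbols in decreasing order of probability, and the resulting ρ-th guessing moment is bounded above by the exponential of the order-1/(1+ρ) Rényi entropy, as in Arikan's inequality.

```python
import numpy as np

rng = np.random.default_rng(2)

# A finite source; the probabilities are arbitrary, for illustration only.
p = rng.dirichlet(np.ones(16))
rho = 1.0

# Optimal guessing order: decreasing probability; G(x) is the rank of x.
order = np.argsort(-p)
ranks = np.empty_like(order)
ranks[order] = np.arange(1, p.size + 1)

guess_moment = np.sum(p * ranks.astype(float) ** rho)   # E[G(X)^rho]
alpha = 1.0 / (1.0 + rho)
renyi_term = np.sum(p ** alpha) ** (1.0 + rho)          # exp(rho * H_alpha(X))

print(f"E[G^rho]              ~ {guess_moment:.2f}")
print(f"(sum p^alpha)^(1+rho) ~ {renyi_term:.2f}  (upper bound for the optimal guesser)")
```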