936 results for UNIAXIAL COMPRESSION


Relevance:

20.00%

Publisher:

Abstract:

In this work, a new methodology is devised to obtain the fracture properties of nuclear fuel cladding in the hoop direction. The proposed method combines ring compression tests and a finite element method that includes a damage model based on cohesive crack theory, applied to unirradiated hydrogen-charged ZIRLO™ nuclear fuel cladding. Samples with hydrogen concentrations from 0 to 2000 ppm were tested at 20 °C. Agreement between the finite element simulations and the experimental results is excellent in all cases. The parameters of the cohesive crack model are obtained from the simulations, and the fracture energy and fracture toughness are calculated in turn. The evolution of fracture toughness in the hoop direction with hydrogen concentration (up to 2000 ppm) is reported for the first time for ZIRLO™ cladding. Additionally, the fracture micromechanisms are examined as a function of the hydrogen concentration. In the as-received samples, the micromechanism is the nucleation, growth and coalescence of voids, whereas in the samples with 2000 ppm, a combination of quasi-cleavage and plastic deformation, along with secondary microcracking, is observed.
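
The abstract does not spell out how fracture toughness is derived from the fracture energy identified with the cohesive model; a standard route, assuming plane-strain linear elastic fracture mechanics, is Irwin's relation:

```latex
% Irwin's plane-strain relation linking fracture energy G_c to fracture
% toughness K_Ic; E and \nu are Young's modulus and Poisson's ratio.
% (Standard textbook formula, not necessarily the authors' exact procedure.)
K_{Ic} = \sqrt{\frac{G_c \, E}{1 - \nu^{2}}}
```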

Relevance:

20.00%

Publisher:

Abstract:

The cyclic compression of several granular systems has been simulated with a molecular dynamics code. All the samples consisted of two-dimensional, soft, frictionless, equal-sized particles that were initially arranged on a square lattice and were compressed by randomly generated irregular walls. The compression protocols can be described by some control variables (the volume or the external force acting on the walls) and by some dimensionless factors that relate stiffness, density, diameter, damping ratio and water surface tension to the external forces, displacements and periods. Each protocol, which corresponds to a particular dynamic process, results in an arrangement with its own macroscopic features: volume (or packing ratio), coordination number, and stress; the differences between packings can be highly significant. The statistical distribution of the force-moment state of the particles (i.e. the equivalent average stress multiplied by the volume) is analyzed. Despite the lack of a statistical-mechanics framework specific to these protocols, the obtained distributions of the mean and relative deviatoric force-moment are characterized, and their nature and relation to the specific protocols are then discussed.
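
As a rough sketch of the quantity whose distribution is analyzed, the force-moment tensor of a single particle can be accumulated from its contacts (the function and variable names below are hypothetical; the paper's own implementation is not given):

```python
import numpy as np

def force_moment(contact_points, contact_forces, center):
    """Force-moment tensor of one 2D particle: the sum over contacts of
    the outer product of the branch vector (contact point minus particle
    center) with the contact force. Dividing by the particle volume gives
    the equivalent average stress (Love-Weber formula)."""
    s = np.zeros((2, 2))
    for x, f in zip(contact_points, contact_forces):
        r = np.asarray(x) - np.asarray(center)
        s += np.outer(r, f)
    return s

def mean_and_deviator(s):
    """Split the force-moment tensor into its mean (isotropic) part and
    a scalar measure of the deviatoric part."""
    p = np.trace(s) / 2.0              # mean part (2D)
    dev = s - p * np.eye(2)
    q = np.sqrt(0.5 * np.sum(dev * dev))  # scalar deviatoric measure
    return p, q
```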

Relevance:

20.00%

Publisher:

Abstract:

Pericardium is a material used when the leaflets of cardiac valves must be replaced. This work evaluates the fatigue durability of glutaraldehyde-treated calf pericardium membranes. To this end, 72 pericardium specimens were tested under physiological conditions of humidity and temperature. The specimens were first subjected to fatigue up to a prescribed number of cycles, between a minimum of 100 and a maximum of 4000, and then tested to failure in a simple uniaxial tensile test. The control specimens were subjected to a single uniaxial tensile test. It was found that the energy dissipated during the first cycles by the specimens that failed prematurely (before completing the cycling) is significantly greater than the energy dissipated by the specimens that withstood all the loading-unloading cycles.
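
A minimal sketch of how the dissipated energy per cycle can be measured from recorded loading-unloading data, assuming one has the stress-strain samples of a complete cycle (the paper's exact procedure is not described here):

```python
import numpy as np

def dissipated_energy(strain, stress):
    """Energy dissipated in one loading-unloading cycle, measured as the
    area enclosed by the stress-strain hysteresis loop. Inputs are the
    strain and stress samples of one full cycle, in traversal order."""
    eps = np.append(strain, strain[0])   # close the loop
    sig = np.append(stress, stress[0])
    # signed line integral of sigma d(eps); its magnitude is the loop area
    return abs(float(np.trapz(sig, eps)))
```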

Relevance:

20.00%

Publisher:

Abstract:

In many applications (like social or sensor networks) the information generated can be represented as a continuous stream of RDF items, where each item describes an application event (social network post, sensor measurement, etc.). In this paper we focus on compressing RDF streams. In particular, we propose an approach for lossless RDF stream compression, named RDSZ (RDF Differential Stream compressor based on Zlib). This approach takes advantage of the structural similarities among items in a stream by combining a differential item encoding mechanism with the general purpose stream compressor Zlib. Empirical evaluation using several RDF stream datasets shows that this combination produces gains in compression ratios with respect to using Zlib alone.
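
To make the idea concrete, here is a toy differential-plus-Zlib encoder over items represented as lists of triple strings. It only illustrates the principle (back-referencing triples repeated from the previous item before Zlib sees the bytes), not RDSZ's actual encoding:

```python
import zlib

def compress_rdf_stream(items):
    """Illustrative differential + Zlib compression of a stream of RDF
    items (each item a list of triple strings). Triples already present
    in the previous item are replaced by a back-reference index, which
    exploits the structural similarity between consecutive items."""
    comp = zlib.compressobj()
    out, prev = [], []
    for item in items:
        lines = []
        for t in item:
            if t in prev:
                lines.append("#%d" % prev.index(t))  # back-reference
            else:
                lines.append(t)                      # literal triple
        payload = ("\n".join(lines) + "\n\n").encode("utf-8")
        out.append(comp.compress(payload))
        out.append(comp.flush(zlib.Z_SYNC_FLUSH))    # emit this item now
        prev = item
    return b"".join(out)
```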

Relevance:

20.00%

Publisher:

Abstract:

LHE (logarithmical hopping encoding) is a computationally efficient image compression algorithm that exploits the Weber–Fechner law to encode the error between colour component predictions and the actual values of those components. More concretely, for each pixel, luminance and chrominance predictions are calculated as a function of the surrounding pixels, and the error between the predictions and the actual values is logarithmically quantised. The main advantage of LHE is that, although it achieves low-bit-rate encoding with high-quality results in terms of peak signal-to-noise ratio (PSNR) and image quality metrics, both full-reference (FSIM) and no-reference (blind/referenceless image spatial quality evaluator), its time complexity is O(n) and its memory complexity is O(1). Furthermore, an enhanced version of the algorithm is proposed, where the output codes provided by the logarithmical quantiser are used in a pre-processing stage to estimate the perceptual relevance of the image blocks. This allows the algorithm to downsample the blocks with low perceptual relevance, thus improving the compression rate. The performance of LHE is especially remarkable when the bit-per-pixel rate is low, showing much better quality, in terms of PSNR and FSIM, than JPEG and slightly lower quality than JPEG 2000, while being more computationally efficient.
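
A toy version of the core idea follows: predict each pixel from its causal neighbours and snap the prediction error to the nearest of a small set of logarithmically spaced levels. The hop ladder and the predictor below are our own placeholders; LHE's actual hop values and prediction rule are defined in the paper:

```python
import numpy as np

# Hypothetical logarithmically spaced quantisation levels ("hops") for
# the prediction error; the real LHE hop ladder differs.
HOPS = np.array([-64, -16, -4, 0, 4, 16, 64], dtype=float)

def lhe_like_encode(img):
    """Toy logarithmical-hopping encoder for a grayscale image: each
    pixel is predicted from its causal neighbours and the prediction
    error is snapped to the nearest logarithmic hop."""
    h, w = img.shape
    rec = img.astype(float).copy()        # decoder-side reconstruction
    codes = np.zeros((h, w), dtype=np.int8)
    for y in range(h):
        for x in range(w):
            left = rec[y, x - 1] if x > 0 else 128.0
            top = rec[y - 1, x] if y > 0 else 128.0
            pred = 0.5 * (left + top)
            err = float(img[y, x]) - pred
            k = int(np.argmin(np.abs(HOPS - err)))       # nearest hop
            codes[y, x] = k
            rec[y, x] = np.clip(pred + HOPS[k], 0, 255)  # keep decoder state
    return codes, rec
```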

Relevance:

20.00%

Publisher:

Abstract:

Owing to the ever-growing size of the data in many current information systems, many of the algorithms that traverse these structures lose performance when performing searches on them. Since these data are in many cases represented as node-vertex structures (graphs), the Graph500 challenge was created in 2009. Previously, challenges such as Top500 measured performance in terms of the raw computing capability of the systems, using LINPACK tests. In the case of Graph500, the measurement is carried out by executing a breadth-first search (BFS) algorithm over graphs. The BFS algorithm is one of the pillars of many other graph algorithms, such as SSSP (single-source shortest paths) or betweenness centrality, so an improvement to it would also improve the algorithms that build on it.

Problem analysis: the BFS algorithm used in high-performance computing (HPC) systems is usually a distributed-systems version of the original sequential algorithm. In this distributed version, execution starts by partitioning the graph; each of the distributed processors then computes one part and distributes its results to the other systems. Because the speed gap between processing within each node and data transfer over the interconnection network is very large (with the network at a disadvantage), quite a few approaches have been taken to reduce the performance lost in transfers. Regarding the initial graph partitioning, the traditional approach (called a 1D-partitioned graph) assigns to each node a fixed set of vertices that it will process. To reduce data traffic, another partitioning (2D) was proposed, in which the distribution is based on the edges of the graph instead of its vertices; this partitioning reduces network traffic from O(N×M) to O(log(N)). Although there have been other approaches to reducing transfers, such as an initial reordering of the vertices to add locality within nodes, or dynamic partitionings, the approach proposed in this work consists of applying recent compression techniques from large-scale data systems, such as high-volume databases or Internet search engines, to compress the data transferred between nodes.

---ABSTRACT---

The Breadth First Search (BFS) algorithm is the foundation and building block of many higher-level graph operations such as spanning trees, shortest paths and betweenness centrality. The importance of this algorithm grows every day because it is a key requirement of many data structures that are becoming popular nowadays and that are internally graph structures. When the BFS algorithm is parallelized and the data are distributed over several processors, some research shows a performance limitation introduced by the interconnection network [31]. Hence, improvements in the area of communications may benefit the global performance of this key algorithm. In this work an alternative compression mechanism is presented. It differs from existing methods in that it is aware of characteristics of the data that may benefit the compression. In addition, we perform another test to see how this algorithm (in a distributed scenario) benefits from traditional instruction-based optimizations.
Last, we review the current supercomputing techniques and the related work being done in the area.
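
As a sketch of transfer-side compression in a distributed BFS, the frontier sent to other ranks can be delta-encoded and deflated before hitting the network. The names and the concrete scheme below are our own illustration; the data-aware encoder the work proposes is more elaborate:

```python
import struct
import zlib

def pack_frontier(vertices):
    """Delta-encode a set of frontier vertex ids (sorted first) and
    compress the result with zlib before sending it to other ranks."""
    vertices = sorted(vertices)
    if not vertices:
        return zlib.compress(b"")
    deltas = [vertices[0]] + [b - a for a, b in zip(vertices, vertices[1:])]
    raw = struct.pack("<%dI" % len(deltas), *deltas)
    return zlib.compress(raw)

def unpack_frontier(blob):
    """Inverse of pack_frontier: decompress, then undo the delta coding."""
    raw = zlib.decompress(blob)
    deltas = struct.unpack("<%dI" % (len(raw) // 4), raw)
    out, acc = [], 0
    for d in deltas:
        acc += d
        out.append(acc)
    return out
```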

Relevance:

20.00%

Publisher:

Abstract:

The optimal design of a vertical cantilever beam is presented in this paper. The beam is assumed to be immersed in an elastic Winkler soil and subjected to several loads: a point force at the tip section, its self-weight and a uniformly distributed load along its length. The optimal design problem is to find the beam of a given length and minimum volume such that the resulting compressive stresses are admissible. This problem is analyzed according to linear elasticity theory and within different alternative structural models: column, Navier-Bernoulli beam-column, and Timoshenko beam-column (i.e. with shear strain), under conservative loads, typically constant-direction loads. The results obtained in each case are compared in order to evaluate the sensitivity of the numerical results to the choice of model. The beam optimal design is described by the section distribution layout (area, second moment of area, shear area, etc.) along the beam span and the corresponding beam total volume. Other situations with follower loads (the Beck and Leipholz problems), some of them very interesting from a theoretical point of view, are also discussed, leaving numerical details and results for future work.
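
The abstract does not write out the governing equation; for the Navier-Bernoulli beam-column model on a Winkler foundation it takes the standard form below (the Timoshenko variant adds shear deformability):

```latex
% Navier-Bernoulli beam-column on a Winkler foundation: w(x) is the
% lateral deflection, EI(x) the bending stiffness, N(x) the axial
% compressive force, k the Winkler modulus and q(x) the lateral load.
\frac{d^{2}}{dx^{2}}\!\left( EI(x)\,\frac{d^{2}w}{dx^{2}} \right)
  + \frac{d}{dx}\!\left( N(x)\,\frac{dw}{dx} \right) + k\,w = q(x)
```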

Relevance:

20.00%

Publisher:

Abstract:

The propagation of inhomogeneous, weakly nonlinear waves is considered in a cochlear model having two degrees of freedom that represent the transverse motions of the tectorial and basilar membranes within the organ of Corti. It is assumed that nonlinearity arises from the saturation of outer hair cell active force generation. I use multiple-scale asymptotics and treat nonlinearity as a correction to a linear hydroelastic wave. The resulting theory is used to explain experimentally observed features of the response of the cochlear partition to a pure tone, including: the amplification of the response in a healthy cochlea versus a dead one; the less-than-linear growth rate of the response to increasing sound pressure level; and the amount of distortion to be expected at high and low frequencies at basal and apical locations, respectively. I also show that the outer hair cell nonlinearity generates retrograde waves.
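
The abstract does not specify the saturating function; a typical choice in cochlear modelling is a tanh-like saturation, whose small-signal expansion already shows why the response grows less than linearly with level:

```latex
% A typical saturating nonlinearity for outer hair cell force (an
% assumed form, not stated in the abstract): F_0 is the saturation
% force and u_0 the saturation scale of the input u.
f(u) = F_0 \tanh\!\left(\frac{u}{u_0}\right)
     \approx F_0\left( \frac{u}{u_0} - \frac{1}{3}\frac{u^{3}}{u_0^{3}} \right),
     \qquad |u| \ll u_0
```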

Relevance:

20.00%

Publisher:

Abstract:

Objectives: To examine the delay in presentation, diagnosis, and treatment of malignant spinal cord compression and to define the effect of this delay on motor and bladder function at the time of treatment.

Relevance:

20.00%

Publisher:

Abstract:

Constant-pressure and constant-temperature molecular dynamics techniques have been employed to investigate the changes in structure and volume of two globular proteins, superoxide dismutase and lysozyme, under pressure. Compression (the relative change in the proteins' volumes), computed with the Voronoi technique, is closely related to the so-called protein intrinsic compressibility, estimated by sound velocity measurements. In particular, compression computed with Voronoi volumes predicts, in agreement with experimental estimates, a negative bound-water contribution to the apparent protein compression. While the use of van der Waals and molecular volumes underestimates the intrinsic compressibilities of proteins, Voronoi volumes produce results closer to experimental estimates. Remarkably, for two globular proteins of very different secondary structures, we compute identical (within statistical error) protein intrinsic compressions, as predicted by recent experimental studies. Changes in the protein interatomic distances under compression are also investigated. It is found that, on average, short distances compress less than longer ones. This nonuniform contraction underlines the peculiar nature of the structural changes due to pressure, in contrast with temperature effects, which instead produce spatially uniform changes in proteins. The structural effects observed in the simulations at high pressure can explain protein compressibility measurements carried out by fluorimetric and hole-burning techniques. Finally, the calculation of the proteins' static structure factor shows significant shifts in the peaks at short wavenumber as pressure changes. These effects might provide an alternative way to obtain information concerning the compressibilities of selected protein regions.
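
For context, in a constant-pressure, constant-temperature (NPT) simulation an apparent compressibility can also be read off the volume fluctuations via the standard fluctuation formula; this is not the Voronoi-based analysis of the paper, just a common companion estimate:

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant, J/K

def isothermal_compressibility(volumes_m3, temperature_k):
    """Isothermal compressibility from the volume fluctuations of an
    NPT trajectory: kappa_T = <dV^2> / (k_B * T * <V>)."""
    v = np.asarray(volumes_m3, dtype=float)
    return v.var() / (K_B * temperature_k * v.mean())
```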

Relevance:

20.00%

Publisher:

Abstract:

Nanometer-sized metallic necks have the unique ability to sustain extreme uniaxial loads (about 20 times greater than the bulk material). We present an experimental and theoretical study of the electronic transport properties under such extreme conditions. Conductance measurements on gold and aluminum necks show a strikingly different behavior: While gold shows the expected conductance decrease with increasing elastic elongation of the neck, aluminum necks behave in the opposite way. We have performed first-principles electronic-structure calculations which reproduce this behavior, showing that it is an intrinsic property of the bulk band structure under high uniaxial strain.
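
As background, the conductance of such atomic-scale contacts is commonly analyzed with the Landauer formula (the abstract does not state it explicitly):

```latex
% Landauer conductance of a nanocontact: the sum of the transmission
% probabilities T_n of the open conduction channels, in units of the
% conductance quantum G_0 = 2e^2/h. (Context, not quoted from the abstract.)
G = \frac{2e^{2}}{h} \sum_{n} T_n
```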

Relevance:

20.00%

Publisher:

Abstract:

The use of 3D data in mobile robotics applications provides valuable information about the robot’s environment, but the huge amount of 3D information is usually unmanageable given the robot’s storage and computing capabilities. Data compression is therefore necessary to store and manage this information while preserving as much of it as possible. In this paper, we propose a 3D lossy compression system based on plane extraction, which represents the points of each scene plane as a Delaunay triangulation plus a set of point/area information. The compression system can be customized to achieve different data compression or accuracy ratios. It also supports a color segmentation stage to preserve the original scene color information and provide a realistic scene reconstruction. The design of the method allows a fast scene reconstruction, useful for further visualization or processing tasks.
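
A minimal sketch of the per-plane compression step, assuming plane extraction (e.g. by RANSAC) has already produced the inliers and an orthonormal in-plane basis; all names here are illustrative:

```python
import numpy as np
from scipy.spatial import Delaunay

def compress_plane(points_3d, origin, u, v):
    """Project the 3D inliers of one extracted plane onto the plane's
    2D basis (u, v anchored at origin), triangulate them, and keep only
    the plane frame plus the triangulation as the compact representation."""
    p = np.asarray(points_3d) - origin
    uv = np.stack([p @ u, p @ v], axis=1)  # 2D coordinates in the plane
    tri = Delaunay(uv)
    return {"origin": origin, "u": u, "v": v,
            "vertices": uv, "triangles": tri.simplices}
```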

Relevance:

20.00%

Publisher:

Abstract:

There are a large number of image processing applications that work with different performance requirements and available resources. Recent advances in image compression focus on reducing image size and processing time, but offer no real-time solutions for trading off the time cost and quality of the resulting image, such as when transmitting the image contents of web pages. In this paper we propose a method for encoding still images, based on the JPEG standard, that allows the compression/decompression time cost and image quality to be adjusted to the needs of each application and to the bandwidth conditions of the network. The real-time control is based on a collection of adjustable parameters relating both to aspects of the implementation and to the hardware on which the algorithm runs. The proposed encoding system is evaluated in terms of compression ratio, processing delay and quality of the compressed image, compared with the standard method.
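
A toy illustration of the quality/size trade-off using a standard JPEG encoder (Pillow): try decreasing quality settings until the encoded image fits a byte budget. The quality ladder and function name are our own; the paper's controller also tunes implementation- and hardware-related parameters not modelled here:

```python
from io import BytesIO
from PIL import Image

def encode_within_budget(img, max_bytes, qualities=(95, 85, 75, 60, 45, 30)):
    """Re-encode a PIL image at decreasing JPEG quality settings until
    the compressed size fits max_bytes; returns the bytes and the
    quality used (best effort at the lowest quality if none fits)."""
    for q in qualities:
        buf = BytesIO()
        img.save(buf, format="JPEG", quality=q)
        if buf.tell() <= max_bytes:
            return buf.getvalue(), q
    return buf.getvalue(), qualities[-1]
```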

Relevance:

20.00%

Publisher:

Abstract:

The use of 3D data in mobile robotics applications provides valuable information about the robot’s environment. However, the huge amount of 3D information is usually difficult to manage because the robot’s storage and computing capabilities are insufficient. Therefore, a data compression method is necessary to store and process this information while preserving as much of it as possible. A few methods have been proposed to compress 3D information, but no consistent public benchmark exists for comparing the results (compression level, reconstruction distance error, etc.) obtained with different methods. In this paper, we propose a dataset composed of a set of 3D point clouds with different structure and texture variability for evaluating the results obtained from 3D data compression methods. We also provide useful tools for comparing compression methods, using as a baseline the results obtained by existing relevant compression methods.
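
One possible reconstruction-error metric for such a benchmark is the mean distance from each original point to its nearest neighbour in the reconstructed cloud; the sketch below illustrates this (the concrete metrics shipped with the dataset may differ):

```python
import numpy as np
from scipy.spatial import cKDTree

def reconstruction_error(original, reconstructed):
    """Mean nearest-neighbour distance from the original point cloud
    (N x 3 array) to the reconstructed one (M x 3 array)."""
    d, _ = cKDTree(reconstructed).query(original)
    return float(d.mean())
```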