994 results for Linear Viscoelastic Materials
Abstract:
Graph pebbling is a network model for studying whether or not a given supply of discrete pebbles can satisfy a given demand via pebbling moves. A pebbling move across an edge of a graph takes two pebbles from one endpoint and places one pebble at the other endpoint; the other pebble is lost in transit as a toll. It has been shown that deciding whether a supply can meet a demand on a graph is NP-complete. The pebbling number of a graph is the smallest t such that every supply of t pebbles can satisfy every demand of one pebble. Deciding whether the pebbling number is at most k is Π₂^P-complete. In this paper we develop a tool, called the Weight Function Lemma, for computing upper bounds and sometimes exact values for pebbling numbers with the assistance of linear optimization. With this tool we are able to calculate the pebbling numbers of much larger graphs than was possible with previous algorithms, and much more quickly as well. We also obtain results for many families of graphs, in many cases by hand, with much simpler and remarkably shorter proofs than those given in previously existing arguments (certificates typically of size at most the number of vertices times the maximum degree), especially for highly symmetric graphs. Here we apply the Weight Function Lemma to several specific graphs, including the Petersen, Lemke, 4th weak Bruhat, Lemke squared, and two random graphs, as well as to a number of infinite families of graphs, such as trees, cycles, graph powers of cycles, cubes, and some generalized Petersen and Coxeter graphs. This partly answers a question of Pachter et al. by computing the pebbling exponent of cycles to within an asymptotically small range. It is conceivable that this method yields an approximation algorithm for graph pebbling.
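As a minimal illustration of the pebbling move defined above (two pebbles leave one endpoint of an edge, a single pebble arrives at the other), the following Python sketch applies such moves to a toy distribution; the function and variable names are ours, not from the paper.

```python
# Sketch of the pebbling move described in the abstract: a move removes two
# pebbles from one endpoint of an edge and places a single pebble at the
# other endpoint (the second pebble is the "toll"). Names are illustrative.

def pebbling_move(distribution, u, v, edges):
    """Apply one pebbling move from u to v; return the new distribution."""
    if (u, v) not in edges and (v, u) not in edges:
        raise ValueError("u and v are not adjacent")
    if distribution.get(u, 0) < 2:
        raise ValueError("need at least two pebbles on u")
    new_dist = dict(distribution)
    new_dist[u] -= 2                          # two pebbles leave u
    new_dist[v] = new_dist.get(v, 0) + 1      # only one pebble arrives at v
    return new_dist

# Example on the path a-b-c: four pebbles on a can always reach c,
# which is exactly the quantity the pebbling number measures.
edges = {("a", "b"), ("b", "c")}
d = {"a": 4}
d = pebbling_move(d, "a", "b", edges)   # {'a': 2, 'b': 1}
d = pebbling_move(d, "a", "b", edges)   # {'a': 0, 'b': 2}
d = pebbling_move(d, "b", "c", edges)   # {'b': 0, 'c': 1}
```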
Abstract:
The problem of finding a feasible solution to a linear inequality system arises in numerous contexts. In [12] the authors proposed an algorithm, called the extended relaxation method, that solves this feasibility problem, and proved its convergence. In this paper, we consider a class of extended relaxation methods depending on a parameter and prove their convergence. Numerical experiments are provided as well.
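For orientation only: the classical relaxation method of Agmon, Motzkin and Schoenberg for a system Ax ≤ b repeatedly corrects the current point against a violated inequality, scaled by a relaxation parameter. The sketch below illustrates that basic parameterised step in Python; it is not the extended relaxation method of [12], whose modifications are described in the paper.

```python
# Classical relaxation method for finding x with A x <= b (a sketch).
# lam is the relaxation parameter: lam = 1 projects exactly onto the
# violated hyperplane, lam > 1 over-relaxes.
import numpy as np

def relaxation_method(A, b, lam=1.0, max_iter=10_000, tol=1e-9):
    """Iteratively correct x against the most violated inequality."""
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        residuals = A @ x - b
        i = int(np.argmax(residuals))
        if residuals[i] <= tol:          # all inequalities (nearly) satisfied
            return x
        a_i = A[i]
        x = x - lam * residuals[i] / (a_i @ a_i) * a_i
    return x

# Example: a small feasible 2D system.
A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([1.0, 0.0, 0.0])
x = relaxation_method(A, b)
print(x, np.all(A @ x <= b + 1e-8))
```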
Abstract:
This project presents a study of punching and hole expansion for high-yield-strength materials (AHSS; Advanced High Strength Steels), with the aim of determining which cutting process gives the best hole-expansion results. Using the hydraulic press of the Fundació Centre Tecnològic de Manresa (CTM), punching and hole-expansion tests are carried out on all the materials in order to evaluate their behaviour during hole expansion. From these results we obtain the hole-expansion-ratio curves for each material in the different expansion directions. As an alternative to this cutting process, some samples are prepared by water-jet cutting. We observe whether or not the expanded holes crack, and under which conditions they do so. Once these tests have been performed, a microhardness study is carried out with a Vickers indenter to assess how the cutting operation affects the microstructure of the material. Some hardness profiles are also measured with a nanoindenter. The project closes with a section of conclusions, an environmental assessment of the impact of carrying out the project, and its total budget.
Abstract:
Critical edition of the Catalan París e Viana, whose base witness is the 1495 Girona incunabulum held at the Royal Library of Copenhagen. The work includes a comparative analysis of the variants of the Girona edition and those of the Barcelona edition. It refutes the hypothesis of Cátedra (1986: 37-38), according to which Curial e Güelfa could depend on París e Viana, and argues that it is inappropriate to classify this chivalric novel as sentimental fiction. The notes allow the reader to trace the common literary motifs that make up the narrative.
Abstract:
OBJECTIVE: The purpose of this study was to compare the use of different variables to measure the clinical wear of two denture tooth materials in two analysis centers. METHODS: Twelve edentulous patients were provided with full dentures. Two different denture tooth materials (an experimental material and a control) were placed randomly in accordance with the split-mouth design. For wear measurements, impressions were made after an adjustment phase of 1-2 weeks and after 6, 12, 18, and 24 months. The occlusal wear of the posterior denture teeth of 11 subjects was assessed in two study centers by means of plaster replicas and 3D laser-scanning methods. In both centers sequential scans of the occlusal surfaces were digitized and superimposed. Wear was described by four different variables. Statistical analysis was performed after log-transformation of the wear data, using the Pearson and Lin correlations and a mixed linear model. RESULTS: Mean occlusal vertical wear of the denture teeth after 24 months was between 120 μm and 212 μm, depending on the wear variable and material. For three of the four variables, wear of the experimental material was statistically significantly less than that of the control. Comparison of the two study centers, however, revealed that the correlation of the wear variables between centers was only moderate, whereas strong correlation was observed among the different wear variables evaluated within each center. SIGNIFICANCE: Moderate correlation was observed for clinical wear measurements by optical 3D laser scanning in two different study centers. For the two denture tooth materials, wear measurements limited to the attrition zones led to the same qualitative assessment.
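The statistical model named in the abstract (a mixed linear model on log-transformed wear, with the patient as the natural grouping factor in a split-mouth design) can be sketched as follows; the data frame, column names, and numbers are hypothetical and only show the structure of such a fit, not the authors' analysis.

```python
# Illustrative sketch (not the authors' analysis) of fitting a mixed linear
# model to log-transformed wear data from a split-mouth design, with the
# patient as the grouping (random) factor. All data below are made up.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "patient":  [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
    "material": ["experimental", "control"] * 6,
    "wear":     [130.0, 200.0, 115.0, 185.0, 142.0, 210.0,
                 128.0, 196.0, 120.0, 178.0, 138.0, 205.0],  # micrometres
})
data["log_wear"] = np.log(data["wear"])

# Random intercept per patient; fixed effect of denture tooth material.
model = smf.mixedlm("log_wear ~ material", data, groups=data["patient"])
result = model.fit()
print(result.summary())
```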
Abstract:
We study preconditioning techniques for discontinuous Galerkin discretizations of isotropic linear elasticity problems in primal (displacement) formulation. We propose subspace correction methods based on a splitting of the vector-valued piecewise linear discontinuous finite element space that are optimal with respect to the mesh size and the Lamé parameters. The pure displacement, the mixed, and the traction-free problems are discussed in detail. We present a convergence analysis of the proposed preconditioners and include numerical examples that validate the theory and assess the performance of the preconditioners.
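For reference, the primal (displacement) formulation referred to above is the standard isotropic linear elasticity system with Lamé parameters μ and λ; optimality with respect to the Lamé parameters means, in particular, that the preconditioner does not degrade as λ → ∞ (the nearly incompressible limit). The equations below are the textbook statement, not reproduced from the paper.

```latex
% Standard primal (displacement) form of isotropic linear elasticity:
-\operatorname{div}\sigma(u) = f \quad \text{in } \Omega, \qquad
\sigma(u) = 2\mu\,\varepsilon(u) + \lambda\,\operatorname{tr}\bigl(\varepsilon(u)\bigr)\,I, \qquad
\varepsilon(u) = \tfrac{1}{2}\bigl(\nabla u + (\nabla u)^{\mathsf T}\bigr).
```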
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the scale of a field site represents a major, and as yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The main objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity logged at collocated wells and to surface resistivity measurements, which are available throughout the studied site. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. A stochastic integration of low-resolution, large-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities is then applied. The overall viability of this downscaling approach is tested and validated by comparing flow and transport simulations through the original and the upscaled hydraulic conductivity fields. Our results indicate that the proposed procedure yields remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
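The central statistical ingredient, a non-parametric kernel estimate of the joint density of collocated hydraulic and electrical conductivities from which hydraulic conductivity can be drawn conditionally on a geophysical value, can be sketched as below. This shows only that one ingredient, with synthetic data; the paper's full Bayesian sequential simulation additionally conditions on previously simulated locations.

```python
# Sketch of the kernel-density ingredient only: a Gaussian-kernel joint
# density of collocated log hydraulic conductivity (logK) and electrical
# conductivity, used to sample logK given an electrical value at an
# unsampled location. The data below are synthetic.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Hypothetical collocated borehole data (training pairs).
sigma_log = rng.normal(0.0, 1.0, 200)                  # electrical conductivity (log units)
logK = 0.8 * sigma_log + rng.normal(0.0, 0.3, 200)     # correlated hydraulic conductivity

kde = gaussian_kde(np.vstack([sigma_log, logK]))

def sample_logK_given_sigma(sigma_value, n=1, grid=np.linspace(-4, 4, 400)):
    """Draw logK values from the kernel estimate of p(logK | sigma)."""
    joint = kde(np.vstack([np.full_like(grid, sigma_value), grid]))
    p = joint / joint.sum()                 # discretized conditional density
    return rng.choice(grid, size=n, p=p)

print(sample_logK_given_sigma(1.2, n=5))
```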
Abstract:
RATIONALE AND OBJECTIVES: To determine the optimum spatial resolution when imaging peripheral arteries with magnetic resonance angiography (MRA). MATERIALS AND METHODS: Eight vessel diameters ranging from 1.0 to 8.0 mm were simulated in a vascular phantom. A total of 40 three-dimensional FLASH MRA sequences were acquired with incremental variations of field of view, matrix size, and slice thickness. The eight accurately known diameters were combined pairwise to generate 22 "exact" degrees of stenosis ranging from 42% to 87%. The diameters were then measured in the MRA images by three independent observers and with quantitative angiography (QA) software, and were used to compute the degrees of stenosis corresponding to the 22 "exact" ones. The accuracy and reproducibility of vessel diameter measurements and stenosis calculations were assessed for vessel sizes ranging from 6 to 8 mm (iliac artery), 4 to 5 mm (femoro-popliteal arteries), and 1 to 3 mm (infrapopliteal arteries). The maximum pixel dimension and slice thickness required to obtain a mean error in stenosis evaluation of less than 10% were determined by linear regression analysis. RESULTS: Mean errors in stenosis quantification were 8.8% ± 6.3% for 6- to 8-mm vessels, 15.5% ± 8.2% for 4- to 5-mm vessels, and 18.9% ± 7.5% for 1- to 3-mm vessels. Mean errors in stenosis calculation were 12.3% ± 8.2% for observers and 11.4% ± 15.1% for the QA software (P = .0342). To evaluate stenosis with a mean error of less than 10%, the maximum pixel area, the pixel size in the phase direction, and the slice thickness should be less than 1.56 mm², 1.34 mm, and 1.70 mm, respectively (voxel size 2.65 mm³), for 6- to 8-mm vessels; 1.31 mm², 1.10 mm, and 1.34 mm (voxel size 1.76 mm³) for 4- to 5-mm vessels; and 1.17 mm², 0.90 mm, and 0.90 mm (voxel size 1.05 mm³) for 1- to 3-mm vessels. CONCLUSION: Higher spatial resolution than currently used should be selected for imaging peripheral vessels.
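For clarity, the "exact" degrees of stenosis are obtained by pairing a smaller (stenotic) diameter with a larger (reference) diameter using the standard percent-diameter-stenosis definition; the snippet below only restates that arithmetic, with an arbitrary example.

```python
# Standard diameter-stenosis definition used to pair the phantom diameters
# into "exact" degrees of stenosis (stated here for clarity only).

def percent_stenosis(d_stenotic_mm, d_reference_mm):
    """Percent diameter stenosis: 100 * (1 - d/D)."""
    return 100.0 * (1.0 - d_stenotic_mm / d_reference_mm)

print(percent_stenosis(1.0, 6.0))   # ~83%, e.g. a 1.0 mm lumen in a 6.0 mm reference
```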
Abstract:
Solid-phase microextraction (SPME) has been widely used for many years in various applications, such as environmental and water samples, food and fragrance analysis, and biological fluids. The aim of this study was to propose the SPME method as an alternative to the conventional techniques used in the evaluation of worker exposure to benzene, toluene, ethylbenzene, and xylene (BTEX). Polydimethylsiloxane-Carboxen (PDMS/CAR) proved to be the most effective stationary-phase material for sorbing BTEX among the materials tested (polyacrylate, PDMS, PDMS/divinylbenzene, Carbowax/divinylbenzene). Various experimental conditions were studied in order to apply SPME to BTEX quantitation in field situations. The uptake rate of the selected fiber (75 µm PDMS/CAR) was determined for each analyte at various concentrations, relative humidities, and airflow velocities, from static (calm air) to dynamic (> 200 cm/s) conditions. The SPME method was also compared with National Institute for Occupational Safety and Health method 1501. Unlike the latter, the SPME approach fulfills the new requirement for the threshold limit value-short-term exposure limit (TLV-STEL) of 2.5 ppm (8 mg/m³) for benzene.
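As background to the uptake-rate measurements: in diffusive (time-weighted-average) SPME sampling, the airborne concentration is typically back-calculated from the mass sorbed on the fiber, the experimentally determined uptake rate, and the exposure time. The sketch below shows only that arithmetic; the numbers are purely illustrative and are not data from this study.

```python
# Back-calculation used in diffusive (TWA) sampling:  C = m / (U * t),
# with m the sorbed mass, U the uptake rate, t the sampling time.
# Illustrative numbers only, not data from the study.

def twa_concentration(mass_ng, uptake_rate_ml_per_min, time_min):
    """Concentration in ng/mL of air."""
    return mass_ng / (uptake_rate_ml_per_min * time_min)

c_ng_per_ml = twa_concentration(mass_ng=50.0,
                                uptake_rate_ml_per_min=0.8,
                                time_min=15.0)
print(c_ng_per_ml * 1000.0, "ug/m3")   # 1 ng/mL of air = 1000 ug/m3
```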
Abstract:
This paper introduces local distance-based generalized linear models. These models extend (weighted) distance-based linear models, first with the generalized linear model concept and then by localizing. Distances between individuals are the only predictor information needed to fit these models. They are therefore applicable to mixed (qualitative and quantitative) explanatory variables or when the regressor is of functional type. Models can be fitted and analysed with the R package dbstats, which implements several distance-based prediction methods.
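The underlying idea common to distance-based regression is that a distance matrix between individuals can be turned into principal coordinates (classical multidimensional scaling), which then play the role of regressors. A minimal Python sketch of that construction is given below; the actual weighted and local estimators of the paper are implemented in the R package dbstats and are not reproduced here.

```python
# Sketch of the idea behind distance-based prediction (not the dbstats code):
# convert a distance matrix into principal coordinates (classical MDS) and
# use those coordinates as regressors in an ordinary (generalized) linear model.
import numpy as np

def principal_coordinates(D, k):
    """Return k principal coordinates from a distance matrix D (classical MDS)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J                  # doubly centred Gram matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:k]                # largest eigenvalues first
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Toy example: Euclidean distances give coordinates usable as GLM regressors.
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 2))
D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
Z = principal_coordinates(D, k=2)
print(Z.shape)
```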
Abstract:
The main objective of this research work is the design of two elective modules, optional subjects within compulsory secondary education, on classical Greek culture.
Abstract:
The application of two approaches for high-throughput, high-resolution X-ray phase contrast tomographic imaging in use at the TOmographic Microscopy and Coherent rAdiology experimenTs (TOMCAT) beamline of the Swiss Light Source (SLS) is discussed and illustrated. Differential phase contrast (DPC) imaging, using a grating interferometer and a phase-stepping technique, is integrated into the beamline environment at TOMCAT in terms of fast acquisition and reconstruction of data and the ability to scan samples within an aqueous environment. The second phase contrast method is a modified transport-of-intensity approach that can yield the 3D distribution of the decrement of the refractive index of a weakly absorbing object from a single tomographic dataset. The two methods are complementary: the DPC method is characterised by higher sensitivity and moderate resolution with larger samples, whereas the modified transport-of-intensity approach is particularly suited to small specimens when high resolution (around 1 µm) is required. Both are being applied to investigations in the biological and materials science fields.
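As a generic illustration of the phase-stepping technique mentioned above (not the TOMCAT processing pipeline): at each detector pixel the intensity recorded while one grating is stepped over a period is essentially sinusoidal, and its first Fourier coefficient yields the differential phase, its modulus the visibility, and its mean the absorption.

```python
# Generic phase-stepping retrieval for grating interferometry (a sketch).
import numpy as np

def phase_stepping_retrieval(stack):
    """stack: (n_steps, ny, nx) intensities over one full stepping period."""
    F = np.fft.fft(stack, axis=0)
    a0 = np.real(F[0]) / stack.shape[0]        # mean (absorption) image
    dphi = np.angle(F[1])                      # differential phase image
    vis = 2.0 * np.abs(F[1]) / np.abs(F[0])    # fringe visibility
    return a0, dphi, vis

# Synthetic check: a known phase map is recovered from the stepping curves.
n_steps, ny, nx = 8, 4, 4
true_phi = np.linspace(-np.pi / 2, np.pi / 2, ny * nx).reshape(ny, nx)
steps = np.arange(n_steps)[:, None, None]
stack = 1.0 + 0.5 * np.cos(2 * np.pi * steps / n_steps + true_phi)
_, dphi, _ = phase_stepping_retrieval(stack)
print(np.allclose(dphi, true_phi, atol=1e-6))
```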
Abstract:
In this paper, a phenomenologically motivated, magneto-mechanically coupled, finite strain elastic framework for simulating the curing process of polymers in the presence of a magnetic load is proposed. This approach is in line with previous work by Hossain and co-workers on a finite strain curing modelling framework for purely mechanical polymer curing (Hossain et al., 2009b). The proposed thermodynamically consistent approach is independent of any particular free energy function that may be used for modelling the fully cured magneto-sensitive polymer, i.e. any phenomenological or micromechanically inspired free energy can be inserted into the main modelling framework. For the fabrication of magneto-sensitive polymers, micron-size ferromagnetic particles are mixed with the liquid matrix material in the uncured stage. The particles align in a preferred direction when a magnetic field is applied during the curing process. The polymer curing process is a complex (visco)elastic process that transforms a fluid into a solid over time. This transformation is modelled by an appropriate constitutive relation which takes into account the temporal evolution of the material parameters appearing in a particular energy function. For demonstration in this work, a frequently used energy function is chosen, namely the classical Mooney-Rivlin free energy enhanced by coupling terms. Several representative numerical examples demonstrate the capability of our approach to correctly capture common features of polymers undergoing curing processes in the presence of a coupled magneto-mechanical load.
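Purely as an illustration of the kind of energy function named above: a Mooney-Rivlin free energy whose material parameters evolve in time to represent curing, augmented by a generic magneto-mechanical coupling contribution. The concrete coupling terms and parameter evolution laws used in the paper are not reproduced here.

```latex
% Illustrative form only: Mooney--Rivlin energy with curing-evolving
% parameters c_1(t), c_2(t) and a placeholder coupling contribution.
\Psi(\mathbf{C}, \mathbf{H}, t)
  = c_1(t)\,\bigl(I_1(\mathbf{C}) - 3\bigr)
  + c_2(t)\,\bigl(I_2(\mathbf{C}) - 3\bigr)
  + \Psi_{\mathrm{cpl}}(\mathbf{C}, \mathbf{H}, t)
```

Here C is the right Cauchy-Green tensor with invariants I₁ and I₂, H the magnetic field, c₁(t) and c₂(t) material parameters that evolve as the polymer cures, and Ψ_cpl stands in for the magneto-mechanical coupling terms.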
Abstract:
Freshwater snails belonging to the genus Biomphalaria act as intermediate hosts for the trematode parasite Schistosoma mansoni in Africa and in the neotropical region. Identification of such molluscs is carried out based on morphological characters, and the presence of cercariae is verified by squeezing snails between two glass slides or by exposing them to artificial light. Sometimes, however, the material collected includes molluscs with decomposed bodies or even only empty shells, which precludes their identification and the detection of S. mansoni. Because of these difficulties, we have developed a methodology in which DNA is extracted from traces of organic material from inside shells in order to identify the molluscs by polymerase chain reaction and restriction fragment length polymorphism, and to detect S. mansoni in these snails by low-stringency polymerase chain reaction. Species-specific profiles obtained from B. glabrata, B. straminea, and B. tenagophila snails and from their shells, kept in the laboratory for ten years, were identical. S. mansoni profiles were still detectable in shell specimens up to the eighth week after removal from the aquarium.