959 results for processing method


Relevance:

30.00%

Publisher:

Abstract:

Objective. To investigate the effect of processing-induced particle alignment on the fracture behavior of four multiphase dental ceramics (one porcelain, two glass-ceramics and a glass-infiltrated-alumina composite). Methods. Disks (Ø 12 mm × 1.1 mm thick) and bars (3 mm × 4 mm × 20 mm) of each material were processed according to the manufacturer's instructions, machined and polished. Fracture toughness (K(IC)) was determined by the indentation strength method, using 3-point bending and biaxial flexure fixtures for the fracture of bars and disks, respectively. Microstructural and fractographic analyses were performed with scanning electron microscopy, energy-dispersive spectroscopy and X-ray diffraction. Results. The isotropic microstructure of the porcelain and the leucite-based glass-ceramic resulted in similar fracture toughness values regardless of specimen geometry. On the other hand, materials containing second-phase particles with high aspect ratio (the lithium disilicate glass-ceramic and the glass-infiltrated-alumina composite) showed lower fracture toughness for disk specimens than for bars. For the lithium disilicate glass-ceramic disks, particle alignment during the heat-pressing procedure resulted in an unfavorable pattern that created weak microstructural paths during the biaxial test. For the glass-infiltrated-alumina composite, microstructural analysis showed that the large alumina platelets tended to align their large surfaces perpendicular to the direction of particle deposition during slip casting of the green preforms. Significance. The fracture toughness of dental ceramics with an anisotropic microstructure should be determined by biaxial testing, since it yields lower values. (C) 2009 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
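The indentation strength estimate of K(IC) described above is commonly computed with a Chantikul-type relation. A minimal sketch, assuming the usual calibration constant η ≈ 0.59 and SI units; the paper's exact constants, units and material values are not given here, so the numbers below are illustrative:

```python
def indentation_strength_kic(E, H, sigma_f, P, eta=0.59):
    """Indentation-strength estimate of fracture toughness (Chantikul-type
    relation): K_IC = eta * (E/H)**(1/8) * (sigma_f * P**(1/3))**(3/4).

    E, H    : Young's modulus and hardness in Pa (assumed values below)
    sigma_f : post-indentation fracture strength in Pa
    P       : indentation load in N
    Returns K_IC in Pa*m**0.5 (divide by 1e6 for MPa*m**0.5).
    """
    return eta * (E / H) ** 0.125 * (sigma_f * P ** (1.0 / 3.0)) ** 0.75

# Illustrative (assumed) values for a dental glass-ceramic:
kic = indentation_strength_kic(E=70e9, H=6e9, sigma_f=120e6, P=19.6)
print(round(kic / 1e6, 2), "MPa*m^0.5")
```

Stronger specimens at the same indentation load yield a higher toughness estimate, which is the quantity the bar and disk fixtures above are compared on.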

Two hazard risk assessment matrices for the ranking of occupational health risks are described. The qualitative matrix uses qualitative measures of probability and consequence to determine risk assessment codes for hazard-disease combinations. A walk-through survey of an underground metalliferous mine and concentrator is used to demonstrate how the qualitative matrix can be applied to determine priorities for the control of occupational health hazards. The semi-quantitative matrix uses attributable risk as a quantitative measure of probability and uses qualitative measures of consequence. A practical application of this matrix is the determination of occupational health priorities using existing epidemiological studies. Calculated attributable risks from epidemiological studies of hazard-disease combinations in mining and minerals processing are used as examples. These historic response data do not reflect the risks associated with current exposures. A method using current exposure data, known exposure-response relationships and the semi-quantitative matrix is proposed for more accurate and current risk rankings.
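The semi-quantitative matrix can be sketched as a lookup that bands attributable risk into a probability score and combines it with a qualitative consequence score. All band cut-offs and code labels below are illustrative assumptions, not values from the paper:

```python
# Attributable-risk (AR) bands supply the probability axis; qualitative
# severity supplies the consequence axis; their product maps to a risk
# assessment code (RAC). Thresholds here are hypothetical.
AR_BANDS = [(0.20, 4), (0.10, 3), (0.05, 2), (0.0, 1)]   # AR cut-off -> score
CONSEQUENCE = {"negligible": 1, "marginal": 2, "critical": 3, "catastrophic": 4}

def probability_score(attributable_risk):
    for threshold, score in AR_BANDS:
        if attributable_risk >= threshold:
            return score
    return 1

def risk_assessment_code(attributable_risk, consequence):
    score = probability_score(attributable_risk) * CONSEQUENCE[consequence]
    if score >= 12:
        return "RAC 1: intolerable"
    if score >= 6:
        return "RAC 2: control required"
    if score >= 3:
        return "RAC 3: review"
    return "RAC 4: acceptable"

# e.g. a hazard-disease combination with AR = 0.15 and critical consequence:
print(risk_assessment_code(0.15, "critical"))  # RAC 2: control required
```

Rankings produced this way can then be recomputed as new exposure-response data arrive, which is the updating the abstract proposes.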

Three-dimensional (3D) synthetic aperture radar (SAR) imaging via multiple-pass processing is an extension of interferometric SAR imaging. It exploits more than two flight passes to achieve a desired resolution in elevation. In this paper, a novel approach is developed to reconstruct a 3D space-borne SAR image with multiple-pass processing. It involves image registration, phase correction and elevational imaging. An image-model-matching technique is developed for multiple-image registration, an eigenvector method is proposed for the phase correction, and the elevational imaging is conducted using a Fourier transform or a super-resolution method to enhance elevational resolution. 3D SAR images are obtained by processing simulated data and real data from the first European Remote Sensing satellite (ERS-1) with the proposed approaches.
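Once the passes are registered and phase-corrected, the Fourier-transform branch of elevational imaging amounts, per range-azimuth cell, to a spectral estimate over the per-pass complex samples. A minimal pure-Python sketch with synthetic data; the number of passes, zero-padding and scatterer position are all assumed:

```python
import cmath

def elevation_profile(pass_samples, n_bins=None):
    """Per-pixel elevational imaging as a zero-padded discrete Fourier
    transform over the (registered, phase-corrected) complex samples from
    N flight passes. This stands in for the Fourier-transform branch of
    the method; a super-resolution estimator would replace this step.
    """
    n = n_bins or len(pass_samples)
    padded = list(pass_samples) + [0j] * (n - len(pass_samples))
    return [abs(sum(padded[k] * cmath.exp(-2j * cmath.pi * k * b / n)
                    for k in range(n)))
            for b in range(n)]

# Synthetic cell: one scatterer at normalised elevation frequency 0.25,
# observed over 8 passes, zero-padded to 32 elevation bins.
samples = [cmath.exp(2j * cmath.pi * 0.25 * k) for k in range(8)]
profile = elevation_profile(samples, n_bins=32)
print(profile.index(max(profile)))  # 8, i.e. 0.25 * 32
```

The peak bin locates the scatterer in elevation; with only a few passes the DFT mainlobe is wide, which is why the abstract also considers super-resolution methods.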

The binary diffusivities of water in low molecular weight sugars (fructose, sucrose) and in a high molecular weight carbohydrate, maltodextrin (DE 11), and the effective diffusivities of water in mixtures of these sugars (sucrose, glucose, fructose) with maltodextrin (DE 11), were determined using a simplified procedure based on the Regular Regime Approach. The effective diffusivity of these mixtures exhibited both concentration and molecular weight dependence. Surface stickiness was observed in all samples during desorption, with fructose exhibiting the highest degree and maltodextrin the lowest. (C) 2002 Elsevier Science Ltd. All rights reserved.

A detailed analysis procedure is described for evaluating rates of volumetric change in brain structures based on structural magnetic resonance (MR) images. In this procedure, a series of image processing tools is employed to address the problems encountered in measuring rates of change from structural MR images. These tools include an algorithm for intensity non-uniformity correction, a robust algorithm for three-dimensional image registration with sub-voxel precision and an algorithm for brain tissue segmentation. A unique feature of the procedure is the use of a fractional volume model that has been developed to provide a quantitative measure of the partial volume effect. With this model, the fractional constituent tissue volumes are evaluated for voxels at tissue boundaries that manifest the partial volume effect, allowing tissue boundaries to be defined at a sub-voxel level and in an automated fashion. Validation studies are presented for key algorithms, including segmentation and registration. An overall assessment of the method is provided through the evaluation of the rates of brain atrophy in a group of normal elderly subjects, for whom the rate of brain atrophy due to normal aging is predictably small. An application of the method is given in Part II, where the rates of brain atrophy in various brain regions are studied in relation to normal aging and Alzheimer's disease. (C) 2002 Elsevier Science Inc. All rights reserved.
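A fractional volume model of the kind described is often a two-tissue linear mixture: a boundary voxel's intensity is a weighted average of the two pure-tissue means, so the fraction follows by inversion. A sketch under that assumption; the tissue means are hypothetical, and the paper's actual model may be more elaborate:

```python
def fractional_volume(intensity, mu_a, mu_b):
    """Two-tissue partial volume model: a boundary voxel's intensity is
    modeled as the linear mix f*mu_a + (1-f)*mu_b of the two pure-tissue
    mean intensities, so the fractional volume of tissue A is recovered
    by inverting the mix and clipping to [0, 1].
    """
    f = (intensity - mu_b) / (mu_a - mu_b)
    return min(1.0, max(0.0, f))

# e.g. hypothetical grey-matter mean 60 and CSF mean 20 (arbitrary units):
print(fractional_volume(50, mu_a=60, mu_b=20))  # 0.75 grey matter
```

Summing these fractions over boundary voxels is what lets structure volumes, and hence atrophy rates, be measured at sub-voxel precision.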

A supersweet sweet corn hybrid, Pacific H5, was planted at weekly intervals (P-1 to P-5) in spring in South-Eastern Queensland. All plantings were harvested at the same time, resulting in immature seed for the last planting (P-5). The seed was handled by three methods: manual harvest and processing (M-1), manual harvest and mechanical processing (M-2), and mechanical harvest and processing (M-3), and later graded into three sizes (small, medium and large). After eight months of storage at 12-14 °C, seed was maintained at 30 °C with bimonthly monitoring of germination for fourteen months, and seed damage was assessed at the end of this period. Seed quality was greatest for M-1 and was reduced by mechanical processing but not by mechanical harvesting. Large and medium seed had higher germination due to greater storage reserves, but also suffered more seed damage during mechanical processing. Immature seed from the premature harvest (P-5) had poor quality, especially when processed mechanically, reinforcing the need for harvested seed to be physiologically mature.

Electricity markets are complex environments with very particular characteristics. MASCEM is a market simulator developed to allow in-depth studies of the interactions between the players that take part in electricity market negotiations. This paper presents a new proposal for the definition of MASCEM players' strategies for negotiating in the market. The proposed methodology is multiagent based, using reinforcement learning algorithms to provide players with the capability to perceive changes in the environment while adapting their bid formulation according to their needs, using a set of different techniques that are at their disposal. Each agent has knowledge of a different method for defining a strategy for playing in the market; the main agent chooses the best among all of those and provides it to the market player that requests it, to be used in the market. This paper also presents a methodology to manage the efficiency/effectiveness balance of this method, to guarantee that any degradation of the simulator's processing times is kept within acceptable bounds.
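The main agent's choice among competing strategies can be sketched as a simple reinforcement learning selection rule. An epsilon-greedy bandit is a generic stand-in here, not necessarily the update rule MASCEM uses, and the strategy names are invented for illustration:

```python
import random

class StrategyChooser:
    """Epsilon-greedy selection among competing bidding strategies:
    explore a random strategy with probability epsilon, otherwise exploit
    the one with the best running-mean reward so far."""

    def __init__(self, strategies, epsilon=0.1):
        self.values = {s: 0.0 for s in strategies}   # running-mean reward
        self.counts = {s: 0 for s in strategies}
        self.epsilon = epsilon

    def choose(self):
        if random.random() < self.epsilon:
            return random.choice(list(self.values))   # explore
        return max(self.values, key=self.values.get)  # exploit

    def update(self, strategy, reward):
        self.counts[strategy] += 1
        n = self.counts[strategy]
        # incremental running mean of observed rewards
        self.values[strategy] += (reward - self.values[strategy]) / n

# Hypothetical strategy pool; epsilon=0 makes the choice deterministic here.
chooser = StrategyChooser(["naive", "regression", "game-theoretic"], epsilon=0.0)
chooser.update("regression", 1.0)
print(chooser.choose())  # regression
```

Capping exploration (or the size of the strategy pool) is one plain way to trade effectiveness against simulation time, the balance the abstract refers to.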

Alheiras are traditional smoked, fermented meat sausages produced in Portugal, with an undeniable cultural and gastronomic legacy. In this study, we assessed the nutritional value of this product, as well as the influence of different types of thermal processing. Alheiras from Mirandela were submitted to six different procedures: microwave, skillet, oven, charcoal grill, electric fryer and electric grill. Protein, fat, carbohydrate, mineral, NaCl and cholesterol contents, as well as the fatty acid profile, were evaluated. The results show that alheiras are not hypercaloric but are a nutritionally unbalanced foodstuff (high levels of proteins and lipids), and that the type of processing has a major impact on their nutritional value. Charcoal grilling is the healthiest option: less fat (12.5 g/100 g) and cholesterol (29.3 mg/100 g), corresponding to a lower caloric intake (231.8 kcal, 13% less than the raw ones). Conversely, fried alheiras presented the worst nutritional profile, with the highest levels of fat (18.1 g/100 g) and cholesterol (76.0 mg/100 g).
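Caloric values of this kind follow from the Atwater factors (4 kcal/g for protein and carbohydrate, 9 kcal/g for fat). A quick cross-check; the protein and carbohydrate figures below are illustrative assumptions, not values reported in the paper:

```python
def energy_kcal(protein_g, fat_g, carb_g):
    """Atwater factors: 4 kcal/g protein, 9 kcal/g fat, 4 kcal/g carbohydrate."""
    return 4 * protein_g + 9 * fat_g + 4 * carb_g

# Hypothetical macro split per 100 g, using the reported grilled fat value:
print(round(energy_kcal(protein_g=20.0, fat_g=12.5, carb_g=9.8), 1))  # 231.7
```

Because fat carries more than twice the energy per gram, the fat differences between processing methods dominate the caloric differences.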

Coronary artery disease (CAD) is currently one of the most prevalent diseases in the world population, and calcium deposits in coronary arteries are one direct risk factor. These can be assessed by the calcium score (CS) application, available via a computed tomography (CT) scan, which gives an accurate indication of the development of the disease. However, the ionising radiation applied to patients is high. This study aimed to optimise the acquisition protocol in order to reduce the radiation dose, and to explain the flow of procedures used to quantify CAD. The main differences in the clinical results when automated or semi-automated post-processing is used will be shown, and the epidemiology, imaging, risk factors and prognosis of the disease described. The software steps and the values that allow the risk of developing CAD to be predicted will be presented. A 64-row multidetector CT scanner with dual source and two phantoms (pig hearts) were used to demonstrate the advantages and disadvantages of the Agatston method. The tube energy was balanced. Two measurements were obtained in each of the three experimental protocols (64, 128, 256 mAs). Considerable changes appeared between the CS values relating to the protocol variation. The predefined standard protocol provided the lowest radiation dose (0.43 mGy). This study found that the variation in the radiation dose between protocols, taking into consideration the dose control systems attached to the CT equipment and image quality, was not sufficient to justify changing the default protocol provided by the manufacturer.
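The Agatston method referred to above scores each calcified lesion by its area times a density weight set by its peak attenuation in Hounsfield units. A minimal sketch; the pixel area and lesion values are illustrative, and clinical implementations also apply minimum-area and slice-weighting rules omitted here:

```python
def agatston_score(lesions, pixel_area_mm2=0.5):
    """Simplified Agatston calcium score: each lesion contributes its area
    (pixel count * pixel area in mm^2) times a density weight from its
    peak attenuation: 1 for 130-199 HU, 2 for 200-299, 3 for 300-399,
    4 for >= 400; below 130 HU nothing is counted as calcium.
    """
    def weight(peak_hu):
        if peak_hu < 130:
            return 0
        if peak_hu < 200:
            return 1
        if peak_hu < 300:
            return 2
        if peak_hu < 400:
            return 3
        return 4
    return sum(n_pixels * pixel_area_mm2 * weight(peak_hu)
               for n_pixels, peak_hu in lesions)

# Two lesions: 10 pixels peaking at 250 HU, 4 pixels peaking at 450 HU:
print(agatston_score([(10, 250), (4, 450)]))  # 10*0.5*2 + 4*0.5*4 = 18.0
```

The HU-threshold weighting is why the score is sensitive to acquisition parameters: shifting attenuation values across a band boundary changes the weight, which is consistent with the protocol-dependent CS changes the study observed.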

Naturally Occurring Radioactive Materials (NORM) are materials found naturally in the environment that contain radioactive isotopes which can cause negative health effects in the workers who handle them. Present in underground work such as mining and tunnel construction in granite zones, these materials are difficult to identify and characterize without appropriate equipment for risk evaluation. The assessment methods are exemplified with a case study of the handling and processing of phosphate rock, where significant amounts of radioactive isotopes were found and, consequently, elevated radon concentrations in enclosed spaces containing these materials. © 2015 Taylor & Francis Group, London.

The most common techniques for stress analysis and strength prediction of adhesive joints involve analytical or numerical methods such as the Finite Element Method (FEM). However, the Boundary Element Method (BEM) is an alternative numerical technique that has been successfully applied to the solution of a wide variety of engineering problems. This work evaluates the applicability of the boundary element code BEASY as a design tool for analysing adhesive joints. The linearity of peak shear and peel stresses with the applied displacement is studied and compared between BEASY and the analytical model of Frostig et al., considering a bonded single-lap joint under tensile loading. The BEM results are also compared with FEM in terms of stress distributions. To evaluate the mesh convergence of BEASY, the influence of mesh refinement on peak shear and peel stress distributions is assessed. Joint strength predictions are carried out numerically in BEASY and ABAQUS®, and analytically with the models of Volkersen, Goland and Reissner, and Frostig et al. The failure loads for each model are compared with experimental results. The preparation, processing, and mesh-creation times are compared for all models. The BEASY results presented good agreement with the conventional methods.
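The Volkersen model mentioned above admits a compact closed form in the balanced case. A sketch of the identical-adherend shear-lag solution, with all material and geometric values illustrative (real single-lap joints also carry the peel stresses that Goland and Reissner's model adds):

```python
import math

def volkersen_shear(x, P, b, c, E, t, G_a, t_a):
    """Volkersen shear-lag solution for a balanced (identical-adherend)
    single-lap joint: adhesive shear stress at position x in [-c, c],
    where 2c is the overlap length, b the joint width, P the applied
    load, E and t the adherend modulus and thickness, and G_a and t_a
    the adhesive shear modulus and thickness. SI units throughout.
    """
    omega = math.sqrt(2 * G_a / (E * t * t_a))
    # cosh profile normalised so the shear integrates back to P over the overlap
    return (P * omega / (2 * b)) * math.cosh(omega * x) / math.sinh(omega * c)

# Illustrative aluminium-adherend joint: 12.5 mm overlap, 25 mm width.
args = dict(P=1000.0, b=25e-3, c=6.25e-3, E=70e9, t=2e-3, G_a=1e9, t_a=0.2e-3)
peak = volkersen_shear(6.25e-3, **args)   # stress at the overlap end
print(round(peak / 1e6, 2), "MPa")
```

The cosh profile concentrates shear at the overlap ends, well above the average P/(2bc), which is exactly where mesh refinement matters in the BEM and FEM comparisons.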

Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, in fulfilment of the requirements for the degree of Master in Mechanical Engineering.

Hyperspectral imaging has become one of the main topics in remote sensing. Hyperspectral images comprise hundreds of spectral bands at different (almost contiguous) wavelength channels over the same area, generating large data volumes of several GB per flight. This high spectral resolution can be used for object detection and for discriminating between different objects based on their spectral characteristics. One of the main problems in hyperspectral analysis is the presence of mixed pixels, which arise when the spatial resolution of the sensor is not able to separate spectrally distinct materials. Spectral unmixing is one of the most important tasks in hyperspectral data exploitation. However, unmixing algorithms can be computationally very expensive and power-consuming, which compromises their use in applications under on-board constraints. In recent years, graphics processing units (GPUs) have evolved into highly parallel and programmable systems. Several hyperspectral imaging algorithms have been shown to benefit from this hardware, taking advantage of the extremely high floating-point performance, compact size, huge memory bandwidth, and relatively low cost of these units, which make them appealing for onboard data processing. In this paper, we propose a parallel implementation of an augmented Lagrangian based method for unsupervised hyperspectral linear unmixing on GPUs using CUDA. The method, called simplex identification via split augmented Lagrangian (SISAL), aims to identify the endmembers of a scene, i.e., it is able to unmix hyperspectral data sets in which the pure pixel assumption is violated. The efficient implementation of the SISAL method presented in this work exploits the GPU architecture at a low level, using shared memory and coalesced accesses to memory.
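SISAL itself solves a minimum-volume simplex problem, but the linear mixing model it operates on can be illustrated compactly: each pixel is a combination of endmember spectra, and with known endmembers the abundances follow from least squares. A pure-Python sketch for two hypothetical endmembers; this is the mixing model, not the SISAL algorithm:

```python
def unmix_two(y, e1, e2):
    """Least-squares abundances for the linear mixing model
    y ~ a1*e1 + a2*e2, via the 2x2 normal equations (no non-negativity
    or sum-to-one constraints, for brevity)."""
    g11 = sum(v * v for v in e1)
    g22 = sum(v * v for v in e2)
    g12 = sum(u * v for u, v in zip(e1, e2))
    b1 = sum(u * v for u, v in zip(e1, y))
    b2 = sum(u * v for u, v in zip(e2, y))
    det = g11 * g22 - g12 * g12
    return ((g22 * b1 - g12 * b2) / det, (g11 * b2 - g12 * b1) / det)

# Hypothetical 4-band endmember spectra and a 30/70 mixed pixel:
soil = [0.1, 0.4, 0.6, 0.6]
vegetation = [0.05, 0.1, 0.8, 0.3]
mixed = [0.3 * s + 0.7 * v for s, v in zip(soil, vegetation)]
a1, a2 = unmix_two(mixed, soil, vegetation)
print(round(a1, 3), round(a2, 3))  # 0.3 0.7
```

Since this per-pixel solve is independent across the millions of pixels in a scene, it maps naturally onto the one-thread-per-pixel GPU layout the paper exploits.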

Remote hyperspectral sensors collect large amounts of data per flight, usually with low spatial resolution. Since the bandwidth of the connection between the satellite/airborne platform and the ground station is limited, an onboard compression method is desirable to reduce the amount of data to be transmitted. This paper presents a parallel implementation of a compressive sensing method, called parallel hyperspectral coded aperture (P-HYCA), for graphics processing units (GPUs) using the compute unified device architecture (CUDA). This method takes into account two main properties of hyperspectral data sets, namely the high correlation existing among the spectral bands and the generally low number of endmembers needed to explain the data, which largely reduces the number of measurements necessary to correctly reconstruct the original data. Experimental results conducted using synthetic and real hyperspectral data sets on two different GPU architectures by NVIDIA, the GeForce GTX 590 and GeForce GTX TITAN, reveal that the use of GPUs can provide real-time compressive sensing performance. The achieved speedup is up to 20 times compared with the processing time of HYCA running on one core of an Intel i7-2600 CPU (3.4 GHz) with 16 GB of memory.
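The property that P-HYCA exploits (a few endmembers explain the data, so few measurements suffice) can be illustrated in miniature: if a pixel x = E·a lies in a known two-dimensional endmember subspace, two random measurements y = Φ·x determine a exactly. A sketch with illustrative spectra; this is the subspace idea only, not the P-HYCA algorithm:

```python
import random

def mat_vec(M, v):
    # matrix-vector product as plain comprehensions (pure Python)
    return [sum(m_ij * v_j for m_ij, v_j in zip(row, v)) for row in M]

def solve_2x2(A, b):
    # Cramer's rule for a 2x2 linear system A @ a = b
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(A[1][1] * b[0] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

random.seed(1)
E = [[0.2, 0.9], [0.5, 0.1], [0.7, 0.4], [0.3, 0.8], [0.6, 0.6]]  # 5 bands x 2 endmembers
a_true = [0.4, 0.6]
x = mat_vec(E, a_true)                        # true 5-band pixel spectrum
Phi = [[random.gauss(0, 1) for _ in range(5)]
       for _ in range(2)]                     # only 2 random measurements
y = mat_vec(Phi, x)
PhiE = [[sum(Phi[i][k] * E[k][j] for k in range(5)) for j in range(2)]
        for i in range(2)]
a_hat = solve_2x2(PhiE, y)                    # recover abundances from y
x_hat = mat_vec(E, a_hat)                     # reconstruct all 5 bands
print(max(abs(u - v) for u, v in zip(x, x_hat)))
```

Two numbers stand in for five, a 60% reduction even in this toy case; with hundreds of bands and a handful of endmembers the measurement savings become substantial.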

One of the main problems of hyperspectral data analysis is the presence of mixed pixels due to the low spatial resolution of such images. Linear spectral unmixing aims at inferring pure spectral signatures and their fractions at each pixel of the scene. The huge data volumes acquired by hyperspectral sensors put stringent requirements on processing and unmixing methods. This letter proposes an efficient implementation of the method called simplex identification via split augmented Lagrangian (SISAL), which exploits the graphics processing unit (GPU) architecture at a low level using the Compute Unified Device Architecture. SISAL aims to identify the endmembers of a scene, i.e., it is able to unmix hyperspectral data sets in which the pure pixel assumption is violated. The proposed implementation is performed in a pixel-by-pixel fashion, using coalesced accesses to memory and exploiting shared memory to store temporary data. Furthermore, the kernels have been optimized to minimize thread divergence, thereby achieving high GPU occupancy. The experimental results obtained for simulated and real hyperspectral data sets reveal speedups of up to 49 times, which demonstrates that the GPU implementation can significantly accelerate the method's execution over big data sets while maintaining the method's accuracy.