955 results for MESH REFINEMENT


Relevance:

10.00%

Publisher:

Abstract:

This paper reviews the current knowledge and understanding of martensitic transformations in ceramics - the tetragonal to monoclinic transformation in zirconia in particular. This martensitic transformation is the key to transformation toughening in zirconia ceramics. A very considerable body of experimental data on the characteristics of this transformation is now available. In addition, theoretical predictions can be made using the phenomenological theory of martensitic transformations. As the paper will illustrate, the phenomenological theory is capable of explaining all the reported microstructural and crystallographic features of the transformation in zirconia and in some other ceramic systems. Hence the theory, supported by experiment, can be used with considerable confidence to provide the quantitative data that is essential for developing a credible, comprehensive understanding of the transformation toughening process. A critical feature in transformation toughening is the shape strain that accompanies the transformation. This shape strain, or nucleation strain, determines whether or not the stress-induced martensitic transformation can occur at the tip of a potentially dangerous crack. If transformation does take place, then it is the net transformation strain left behind in the transformed region that provides toughening by hindering crack growth. The fracture mechanics based models for transformation toughening, therefore, depend on having a full understanding of the characteristics of the martensitic transformation and, in particular, on being able to specify both these strains. A review of the development of the models for transformation toughening shows that their refinement and improvement over the last couple of decades has been largely a result of the inclusion of more of the characteristics of the stress-induced martensitic transformation. 
The paper advances an improved model for the stress-induced martensitic transformation and the strains resulting from the transformation. This model, which separates the nucleation strain from the subsequent net transformation strain, is shown to be superior to any of the constitutive models currently available. (C) 2002 Elsevier Science Ltd. All rights reserved.

The most characteristic feature of the microstructure of a magnesium alloy containing more than a few tenths of a per cent of soluble zirconium is the zirconium-rich cores that exist in most grains. The morphology, distribution and composition of the cores observed in a Mg-0.56%Zr alloy, and of the small particles present in them, were investigated. (C) 2002 Acta Materialia Inc. Published by Elsevier Science Ltd. All rights reserved.

A range of lasers is now available for use in dentistry. This paper summarizes key current and emerging applications for lasers in clinical practice. A major diagnostic application of low power lasers is the detection of caries, using fluorescence elicited from hydroxyapatite or from bacterial by-products. Laser fluorescence is an effective method for detecting and quantifying incipient occlusal and cervical carious lesions, and with further refinement could be used in the same manner for proximal lesions. Photoactivated dye techniques have been developed which use low power lasers to elicit a photochemical reaction. Photoactivated dye techniques can be used to disinfect root canals, periodontal pockets, cavity preparations and sites of peri-implantitis. Using similar principles, more powerful lasers can be used for photodynamic therapy in the treatment of malignancies of the oral mucosa. Laser-driven photochemical reactions can also be used for tooth whitening. In combination with fluoride, laser irradiation can improve the resistance of tooth structure to demineralization, and this application is of particular benefit for susceptible sites in high caries risk patients. Laser technology for caries removal, cavity preparation and soft tissue surgery is at a high state of refinement, having had several decades of development up to the present time. Used in conjunction with or as a replacement for traditional methods, specific laser technologies are expected to become an essential component of contemporary dental practice over the next decade.

The most widely used method for predicting the onset of continuous caving is Laubscher's caving chart. A detailed examination of this method concluded that it has limitations which may affect results, particularly when dealing with stronger rock masses that are outside current experience. These limitations relate to inadequate guidelines for the adjustment factors applied to the rock mass rating (RMR), concerns about the position on the chart of critical case history data, undocumented changes to the method, and an inadequate number of data points to be confident of stability boundaries. A review was also undertaken of the application and reliability of a numerical method of assessing cavability. The review highlighted a number of issues which, at this stage, make numerical continuum methods problematic for predicting cavability, in particular their sensitivity to input parameters that are difficult to determine accurately, and their mesh dependency. An extended version of the Mathews method for open stope design was developed as an alternative method of predicting the onset of continuous caving. A number of caving case histories were collected and analyzed, and a caving boundary was delineated statistically on the Mathews stability graph. The definition of the caving boundary was aided by the existence of a large and wide-ranging stability database from non-caving mines. A caving rate model was extrapolated from the extended Mathews stability graph but could only be partially validated due to a lack of reliable data.

The Test of Mouse Proficiency (TOMP) was developed to assist occupational therapists and education professionals in assessing computer mouse competency in children from preschool to upper primary (elementary) school age. The preliminary reliability and validity of TOMP are reported in this paper. Methods used to examine the internal consistency, test-retest reliability, and criterion- and construct-related validity of the test are elaborated. In the continuing process of test refinement, these preliminary studies support to varying degrees the reliability and validity of TOMP. Recommendations for further validation of the assessment are discussed, along with indications for potential clinical application.

Multiple HLA class I alleles can bind peptides with common sequence motifs due to structural similarities in the peptide binding cleft, and these groups of alleles have been classified into supertypes. Nine major HLA supertypes have been proposed, including an A24 supertype that includes A*2301, A*2402, and A*3001. Evidence for this A24 supertype is limited to HLA sequence homology and/or similarity in peptide binding motifs for the alleles. To investigate the immunological relevance of this proposed supertype, we have examined two viral epitopes (from EBV and CMV) initially defined as HLA-A*2301-binding peptides. The data clearly demonstrate that each peptide could be recognized by CTL clones in the context of A*2301 or A*2402, thus validating the inclusion of these alleles within an A24 supertype. Furthermore, CTL responses to the EBV epitope were detectable in both A*2301(+) and A*2402(+) individuals who had been previously exposed to this virus. These data substantiate the biological relevance of the A24 supertype, and the identification of viral epitopes with the capacity to bind promiscuously across this supertype could aid efforts to develop CTL-based vaccines or immunotherapy. The degeneracy in HLA restriction displayed by some T cells in this study also suggests that the dogma of self-MHC restriction needs some refinement to accommodate foreign peptide recognition in the context of multiple supertype alleles.

Increased professionalism in rugby has elicited rapid changes in the fitness profile of elite players. Recent research, focusing on the physiological and anthropometrical characteristics of rugby players, and the demands of competition are reviewed. The paucity of research on contemporary elite rugby players is highlighted, along with the need for standardised testing protocols. Recent data reinforce the pronounced differences in the anthropometric and physical characteristics of the forwards and backs. Forwards are typically heavier, taller, and have a greater proportion of body fat than backs. These characteristics are changing, with forwards developing greater total mass and higher muscularity. The forwards demonstrate superior absolute aerobic and anaerobic power, and muscular strength. Results favour the backs when body mass is taken into account. The scaling of results to body mass can be problematic and future investigations should present results using power function ratios. Recommended tests for elite players include body mass and skinfolds, vertical jump, speed, and the multi-stage shuttle run. Repeat sprint testing is a possible avenue for more specific evaluation of players. During competition, high-intensity efforts are often followed by periods of incomplete recovery. The total work over the duration of a game is lower in the backs compared with the forwards; forwards spend greater time in physical contact with the opposition while the backs spend more time in free running, allowing them to cover greater distances. The intense efforts undertaken by rugby players place considerable stress on anaerobic energy sources, while the aerobic system provides energy during repeated efforts and for recovery. Training should focus on repeated brief high-intensity efforts with short rest intervals to condition players to the demands of the game.
Training for the forwards should emphasise the higher work rates of the game, while extended rest periods can be provided to the backs. Players should not only be prepared for the demands of competition, but also the stress of travel and extreme environmental conditions. The greater professionalism of rugby union has increased scientific research in the sport; however, there is scope for significant refinement of investigations on the physiological demands of the game, and sports-specific testing procedures.

The Timed Interval Calculus, a timed-trace formalism based on set theory, is introduced. It is extended with an induction law and a unit for concatenation, which facilitates the proof of properties over trace histories. The effectiveness of the extended Timed Interval Calculus is demonstrated via a benchmark case study, the mine pump. Specifically, a safety property relating to the operation of a mine shaft is proved, based on an implementation of the mine pump and assumptions about the environment of the mine. (C) 2002 Elsevier Science B.V. All rights reserved.

We introduce a refinement of the standard continuous variable teleportation measurement and displacement strategies. This refinement makes use of prior knowledge about the target state and the partial information carried by the classical channel when entanglement is nonmaximal. This gives an improvement in the output quality of the protocol. The strategies we introduce could be used in current continuous variable teleportation experiments.

A numerical algorithm was developed to solve the thermochemical conversion of a solid fuel. It was designed to be flexible and to depend on the reaction mechanism being represented. To this end, the system of equations characteristic of this type of problem was solved through an iterative method combined with symbolic mathematics. Because of nonlinearities in the equations, and because small particles are involved, Newton's method is applied to reduce the system of partial differential equations (PDEs) to a system of ordinary differential equations (ODEs). This reduction couples the iterative method with numerical differentiation, since it can incorporate analytical functions into the resulting ODEs. The reduced model is solved numerically using the biconjugate gradient (BiCG) technique. The model promises a high convergence rate with a low number of iterations, as well as high speed in producing the solutions of the resulting linear system. Moreover, the algorithm is independent of the mesh size. For validation, the normalized mass is computed and compared with experimental thermogravimetric values from the literature, and a test with a simplified reaction mechanism is carried out.
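The reduce-then-iterate scheme described above can be sketched with a single implicit-Euler step solved by Newton's method. This is a minimal illustration only: the one-step Arrhenius mass-loss model and its parameters (k0, Ea, reaction order n) are assumptions for the sketch, not the mechanism or values used in the work.

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def rate(T, k0=1.0e5, Ea=8.0e4):
    """Arrhenius rate constant with assumed pre-exponential and activation energy."""
    return k0 * math.exp(-Ea / (R * T))

def implicit_euler_step(m_old, T, dt, n=1.5, tol=1e-12, max_iter=50):
    """Solve g(m) = m - m_old + dt*k*m**n = 0 for the new mass via Newton's method."""
    k = rate(T)
    m = m_old  # initial guess: previous mass
    for _ in range(max_iter):
        g = m - m_old + dt * k * m ** n
        dg = 1.0 + dt * k * n * m ** (n - 1.0)  # analytical Jacobian of g
        m_next = m - g / dg
        if abs(m_next - m) < tol:
            return m_next
        m = m_next
    return m

def normalized_mass(T, dt, n_steps):
    """March the reduced ODE in time and return m/m0, as in a thermogravimetric curve."""
    m = 1.0
    for _ in range(n_steps):
        m = implicit_euler_step(m, T, dt)
    return m
```

In the actual algorithm the per-step linear systems arising from Newton's method over the full particle model would be handed to a BiCG solver; here the scalar case keeps the structure visible.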

The present work evaluates the performance of the MECID (Boundary Element Method with Direct Interpolation) in solving the integral term associated with inertia in the Helmholtz equation, thereby allowing the eigenvalue problem to be modelled and the natural frequencies to be computed, comparing it with the results obtained by the FEM (Finite Element Method) under the classical Galerkin formulation. First, several problems governed by the Poisson equation are addressed, enabling an initial performance comparison between the numerical methods considered here. The problems solved arise in different and important areas of engineering, such as heat transfer, electromagnetism, and particular elastic problems. Numerically, the difficulties of accurately approximating more complex distributions of loads, sources, or sinks inside the domain are well known for any boundary technique. Nevertheless, this work shows that, despite such difficulties, the performance of the Boundary Element Method is superior, both in computing the primary variable and its derivative. To this end, two-dimensional problems are solved concerning elastic membranes, stresses in bars under self-weight, and the determination of natural frequencies in acoustic problems in closed domains, among others, using meshes with different degrees of refinement, with linear elements based on radial basis functions for the MECID and polynomial interpolation basis functions of degree one for the FEM. Performance curves are generated by computing the mean percentage error for each mesh, demonstrating the convergence and accuracy of each method. The results are also compared with the analytical solutions, when available, for each example solved in this work.
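The performance curves rest on a mean percentage error per mesh; a minimal sketch, assuming the error is taken node-wise against the analytical solution (the skip-zero convention is an assumption, not stated in the work):

```python
def mean_percentage_error(numerical, analytical):
    """Mean absolute percentage error of a numerical solution over mesh nodes.

    Nodes where the analytical value is zero are skipped to avoid
    division by zero (an assumed convention for this sketch).
    """
    pairs = [(n, a) for n, a in zip(numerical, analytical) if a != 0.0]
    return 100.0 * sum(abs(n - a) / abs(a) for n, a in pairs) / len(pairs)
```

Evaluating this metric on successively refined meshes and plotting it against element count yields the convergence curves the abstract describes.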

Within the development of motor vehicles, crash safety (e.g. occupant protection, pedestrian protection, low speed damageability) is one of the most important attributes. In order to fulfill the increased requirements within the framework of shorter cycle times and rising pressure to reduce costs, car manufacturers keep intensifying the use of virtual development tools such as those in the domain of Computer Aided Engineering (CAE). For crash simulations, the explicit finite element method (FEM) is applied. The accuracy of the simulation process is highly dependent on the accuracy of the simulation model, including the midplane mesh. One of the roughest approximations typically made concerns the actual part thickness which, in reality, can vary locally. However, a constant thickness value is almost always defined throughout the entire part to limit model complexity. On the other hand, for precise fracture analysis within FEM, correct consideration of the thickness is one key enabler. Thus, availability of per-element thickness information, which does not exist explicitly in the FEM model, can significantly contribute to improved crash simulation quality, especially regarding fracture prediction. Even though the thickness is not explicitly available from the FEM model, it can be inferred from the original CAD geometric model through geometric calculations. This paper proposes and compares two thickness estimation algorithms, based on ray tracing and on nearest-neighbour 3D range searches. A systematic quantitative analysis of the accuracy of both algorithms is presented, as well as a thorough identification of the particular geometric arrangements under which their accuracy can be compared. These results enable the identification of each technique's weaknesses and hint towards a new, integrated approach to the problem that linearly combines the estimates produced by each algorithm.
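A minimal sketch of the nearest-neighbour flavour of thickness estimation, and of a linear blend of the two estimators: the midplane and the two bounding skins are plain point lists, a brute-force search stands in for the 3D range-search structure, and the blend weight is an assumed parameter, not a value from the paper.

```python
import math

def nearest_distance(p, cloud):
    """Distance from point p to its nearest neighbour in a point cloud.

    Brute force for clarity; a real implementation would query a spatial
    index (e.g. a k-d tree) instead.
    """
    return min(math.dist(p, q) for q in cloud)

def estimate_thickness(midplane, top_surface, bottom_surface):
    """Per-node thickness: distance to the upper plus distance to the lower skin."""
    return [nearest_distance(p, top_surface) + nearest_distance(p, bottom_surface)
            for p in midplane]

def blend_estimates(t_ray, t_nn, w=0.5):
    """Linear combination of ray-tracing and range-search estimates (assumed weight)."""
    return [w * a + (1.0 - w) * b for a, b in zip(t_ray, t_nn)]
```

For a flat plate, a midplane node midway between skins 1 mm apart recovers a thickness of 1 mm; near fillets and free edges the two estimators diverge, which is where a combined approach pays off.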

Pectus excavatum is the most common congenital deformity of the anterior chest wall, in which several ribs and the sternum grow abnormally. Nowadays, the surgical correction is carried out in children and adults through the Nuss technique. This technique has been shown to be safe, with its major drivers being cosmesis and the prevention of psychological problems and social stress. No application is currently known to predict the cosmetic outcome of pectus excavatum surgical correction. Such a tool could be used to help the surgeon and the patient when deciding the need for surgical correction. This work is a first step towards predicting the postsurgical outcome of pectus excavatum correction. Facing this goal, a point cloud of the skin surface along the thoracic wall was first determined using Computed Tomography (before surgical correction) and the Polhemus FastSCAN (after surgical correction). Then, a surface mesh was reconstructed from each of the two point clouds using a Radial Basis Function algorithm, followed by affine registration between the meshes. After registration, the surgical correction influence area (SCIA) of the thoracic wall was studied. This SCIA was used to train, test and validate artificial neural networks in order to predict the surgical outcome of pectus excavatum correction and to determine the degree of convergence of the SCIA in different patients. Often, the ANN did not converge to a satisfactory solution (each patient had its own deformity characteristics), thus invalidating the creation of a mathematical model capable of estimating, with satisfactory results, the postsurgical outcome.
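Surface reconstruction from a point cloud with Radial Basis Functions can be sketched in miniature. The thin-plate spline kernel and the omission of the usual low-order polynomial term are simplifying assumptions (the work does not name its kernel), and a toy Gaussian elimination stands in for a proper dense solver.

```python
import math

def tps(r):
    """Thin-plate spline kernel phi(r) = r^2 log r, with phi(0) = 0."""
    return r * r * math.log(r) if r > 0.0 else 0.0

def solve(A, b):
    """Tiny Gaussian elimination with partial pivoting (illustration only)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rbf_fit(points, values):
    """Fit weights so that sum_j w_j * phi(|x - x_j|) interpolates the samples."""
    A = [[tps(math.dist(p, q)) for q in points] for p in points]
    return solve(A, values)

def rbf_eval(x, points, weights):
    """Evaluate the fitted RBF surface at an arbitrary point x."""
    return sum(w * tps(math.dist(x, p)) for p, w in zip(points, weights))
```

By construction the fitted surface passes through every sample, which is what makes RBFs attractive for turning scattered scanner points into a mesh prior to registration.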

In recent years, it has become increasingly clear that neurodegenerative diseases involve protein aggregation, a process often used as a disease progression readout and to develop therapeutic strategies. This work presents an image processing tool to automatically segment, classify and quantify these aggregates and the whole 3D body of the nematode Caenorhabditis elegans. A total of 150 data set images, containing different slices, were captured with a confocal microscope from animals of distinct genetic conditions. Because of the animals' transparency, most of the slice pixels appeared dark, hampering direct reconstruction of the body volume. Therefore, for each data set, all slices were stacked into one single 2D image in order to determine a volume approximation. The gradient of this image was input to an anisotropic diffusion algorithm that uses Tukey's biweight as the edge-stopping function. The median of the resulting image histogram was used to dynamically determine a thresholding level, which allows the determination of a smoothed exterior contour of the worm and, by thinning its skeleton, the medial axis of the worm body. Based on this exterior contour diameter and the medial axis, random 3D points were then calculated to produce a volume mesh approximation. The protein aggregations were subsequently segmented based on an iso-value and blended with the resulting volume mesh. The results obtained were consistent with qualitative observations in the literature, allowing non-biased, reliable and high-throughput protein aggregate quantification. This may lead to a significant improvement in treatment planning and preventive interventions for neurodegenerative diseases.
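The robust anisotropic diffusion step can be sketched in one dimension: Tukey's biweight drives the diffusivity to zero for gradients larger than a scale sigma, so strong edges survive while smaller fluctuations are smoothed away. The parameter values here are illustrative, not the ones used in the work.

```python
def tukey_g(grad, sigma):
    """Tukey's biweight edge-stopping function: 0.5*(1-(x/sigma)^2)^2 inside
    the scale, exactly zero outside it (so diffusion stops at strong edges)."""
    if abs(grad) >= sigma:
        return 0.0
    t = 1.0 - (grad / sigma) ** 2
    return 0.5 * t * t

def diffuse_1d(u, sigma, lam=0.1, n_iter=10):
    """One-dimensional Perona-Malik-style diffusion with the Tukey stopper."""
    u = list(u)
    for _ in range(n_iter):
        v = u[:]
        for i in range(1, len(u) - 1):
            east = u[i + 1] - u[i]
            west = u[i - 1] - u[i]
            v[i] = u[i] + lam * (tukey_g(east, sigma) * east +
                                 tukey_g(west, sigma) * west)
        u = v
    return u
```

A unit step with sigma below the jump height passes through unchanged, while a small bump under the scale is flattened, which is exactly the behaviour wanted before thresholding.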

Pectus Carinatum (PC) is a chest deformity consisting of the anterior protrusion of the sternum and adjacent costal cartilages. Non-operative corrections, such as the orthotic compression brace, require prior information about the patient's chest surface to improve the overall brace fit. This paper focuses on the validation of the Kinect scanner for the modelling of an orthotic compression brace for the correction of Pectus Carinatum. To this end, a phantom chest wall surface was acquired using two scanner systems, Kinect and Polhemus FastSCAN, and compared through CT. The results show an RMS error of 3.25 mm between the CT data and the surface mesh from the Kinect sensor, and of 1.5 mm for the FastSCAN sensor.
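The reported RMS figures amount to root-mean-square nearest-point distances between the acquired surface and the CT reference. A brute-force sketch (a real pipeline would compare registered meshes and use a spatial index rather than a linear scan):

```python
import math

def rms_error(scan_points, reference_points):
    """RMS of nearest-point distances from each scan point to a reference cloud."""
    sq = []
    for p in scan_points:
        d = min(math.dist(p, q) for q in reference_points)  # nearest reference point
        sq.append(d * d)
    return math.sqrt(sum(sq) / len(sq))
```

Running this between the Kinect mesh vertices and the CT surface samples would reproduce the kind of 3.25 mm / 1.5 mm comparison the abstract reports.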