855 results for rietveld refinement


Relevance:

10.00%

Publisher:

Abstract:

Objectives: Advances in surface electromyography (sEMG) techniques provide a clear indication that refinement of electrode location relative to innervation zones (IZ) is required in order to optimise the accuracy, relevance and repeatability of the sEMG signals. The aim of this study was to identify the IZ for the sternocleidomastoid and anterior scalene muscles to provide guidelines for electrode positioning for future clinical and research applications. Methods: Eleven volunteer subjects participated in this study. Myoelectric signals were detected from the sternal and clavicular heads of the sternocleidomastoid and the anterior scalene muscles bilaterally using a linear array of 8 electrodes during isometric cervical flexion contractions. The signals were reviewed and the IZ(s) were identified, marked on the subjects' skin and measurements were obtained relative to selected anatomical landmarks. Results: The position of the IZ lay consistently around the mid-point or in the superior portion of the muscles studied. Conclusions: Results suggest that electrodes should be positioned over the lower portion of the muscle and not the mid-point, which has been commonly used in previous studies. Recommendations for sensor placement on these muscles should assist investigators and clinicians to ensure improved validity in future sEMG applications. (C) 2002 Elsevier Science Ireland Ltd. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

Accurate habitat mapping is critical to landscape ecological studies such as required for developing and testing Montreal Process indicator 1.1e, fragmentation of forest types. This task poses a major challenge to remote sensing, especially in mixed-species, variable-age forests such as dry eucalypt forests of subtropical eastern Australia. In this paper, we apply an innovative approach that uses a small section of one-metre resolution airborne data to calibrate a moderate spatial resolution model (30 m resolution; scale 1:50 000) based on Landsat Thematic Mapper data to estimate canopy structural properties in St Marys State Forest, near Maryborough, south-eastern Queensland. The approach applies an image-processing model that assumes each image pixel is significantly larger than individual tree crowns and gaps to estimate crown-cover percentage, stem density and mean crown diameter. These parameters were classified into three discrete habitat classes to match the ecology of four exudivorous arboreal species (yellow-bellied glider Petaurus australis, sugar glider P. breviceps, squirrel glider P. norfolcensis, and feathertail glider Acrobates pygmaeus), and one folivorous arboreal marsupial, the greater glider Petauroides volans. These species were targeted due to the known ecological preference for old trees with hollows, and differences in their home range requirements. The overall mapping accuracy, visually assessed against transects (n = 93) interpreted from a digital orthophoto and validated in the field, was 79% (KHAT statistic = 0.72). The KHAT statistic serves as an indicator of the extent to which the percentage correct values of the error matrix are due to ‘true’ agreement versus ‘chance’ agreement. This means that we are able to reliably report on the effect of habitat loss on target species, especially those with a large home range size (e.g. yellow-bellied glider). However, the classified habitat map failed to accurately capture the spatial patterning (e.g. patch size and shape) of stands with a trace or sub-dominance of senescent trees. This outcome makes the reporting of the effects of habitat fragmentation more problematic, especially for species with a small home range size (e.g. feathertail glider). With further model refinement and validation, however, this moderate-resolution approach offers an important, cost-effective advancement in mapping the age of dry eucalypt forests in the region.
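For readers who want to reproduce the KHAT figure from their own error matrices, the sketch below computes the kappa coefficient from a hypothetical 3-class habitat confusion matrix; the matrix values are invented for illustration and are not the data from this study.

```python
import numpy as np

# Hypothetical 3-class habitat error matrix (rows = mapped class, columns = reference class).
# The numbers are illustrative only; they are not the matrix reported in the study.
error_matrix = np.array([
    [30,  4,  2],
    [ 3, 25,  5],
    [ 2,  6, 16],
])

n = error_matrix.sum()                 # total number of validation samples
observed = np.trace(error_matrix) / n  # overall (percentage correct) agreement
# Chance agreement expected from the row and column marginals.
expected = (error_matrix.sum(axis=0) * error_matrix.sum(axis=1)).sum() / n**2

kappa = (observed - expected) / (1 - expected)  # KHAT: agreement beyond chance
print(f"overall accuracy = {observed:.2f}, KHAT = {kappa:.2f}")
```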

Relevance:

10.00%

Publisher:

Abstract:

Program compilation can be formally defined as a sequence of equivalence-preserving transformations, or refinements, from high-level language programs to assembler code. Recent models also incorporate timing properties, but the resulting formalisms are intimidatingly complex. Here we take advantage of a new, simple model of real-time refinement, based on predicate transformer semantics, to present a straightforward compilation formalism that incorporates real-time constraints. (C) 2002 Elsevier Science B.V. All rights reserved.
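Purely as an illustration of the predicate-transformer idea underlying this refinement style (the paper's real-time extension is not reproduced here), the following untimed sketch models commands as weakest-precondition transformers and checks a small refinement claim by exhaustive enumeration over a toy state space; all names and the example programs are assumptions.

```python
# Minimal predicate-transformer sketch: a command is a function mapping a
# postcondition (a predicate on states) to its weakest precondition.
# This is an untimed toy model, not the calculus used in the paper.

def assign(var, expr):
    """wp(var := expr, Q) = Q with var replaced by the value of expr."""
    return lambda post: (lambda s: post({**s, var: expr(s)}))

def seq(c1, c2):
    """wp(c1; c2, Q) = wp(c1, wp(c2, Q))"""
    return lambda post: c1(c2(post))

def refines(abstract, concrete, states, post):
    """concrete refines abstract for Q iff wp(abstract, Q) implies wp(concrete, Q)."""
    return all(concrete(post)(s) for s in states if abstract(post)(s))

# Tiny example: "x := x + 2" is refined by "x := x + 1; x := x + 1".
states = [{"x": v} for v in range(-5, 6)]
spec = assign("x", lambda s: s["x"] + 2)
impl = seq(assign("x", lambda s: s["x"] + 1),
           assign("x", lambda s: s["x"] + 1))
post = lambda s: s["x"] % 2 == 0          # an arbitrary postcondition
print(refines(spec, impl, states, post))  # True on this state space
```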

Relevance:

10.00%

Publisher:

Abstract:

This paper reviews the current knowledge and understanding of martensitic transformations in ceramics - the tetragonal to monoclinic transformation in zirconia in particular. This martensitic transformation is the key to transformation toughening in zirconia ceramics. A very considerable body of experimental data on the characteristics of this transformation is now available. In addition, theoretical predictions can be made using the phenomenological theory of martensitic transformations. As the paper will illustrate, the phenomenological theory is capable of explaining all the reported microstructural and crystallographic features of the transformation in zirconia and in some other ceramic systems. Hence the theory, supported by experiment, can be used with considerable confidence to provide the quantitative data that is essential for developing a credible, comprehensive understanding of the transformation toughening process. A critical feature in transformation toughening is the shape strain that accompanies the transformation. This shape strain, or nucleation strain, determines whether or not the stress-induced martensitic transformation can occur at the tip of a potentially dangerous crack. If transformation does take place, then it is the net transformation strain left behind in the transformed region that provides toughening by hindering crack growth. The fracture mechanics based models for transformation toughening, therefore, depend on having a full understanding of the characteristics of the martensitic transformation and, in particular, on being able to specify both these strains. A review of the development of the models for transformation toughening shows that their refinement and improvement over the last couple of decades has been largely a result of the inclusion of more of the characteristics of the stress-induced martensitic transformation. The paper advances an improved model for the stress-induced martensitic transformation and the strains resulting from the transformation. This model, which separates the nucleation strain from the subsequent net transformation strain, is shown to be superior to any of the constitutive models currently available. (C) 2002 Elsevier Science Ltd. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

The most characteristic feature of the microstructure of a magnesium alloy that contains more than a few tenths per cent soluble zirconium is the zirconium-rich cores that exist in most grains. The morphology, distribution and composition of cores observed in a Mg-0.56%Zr alloy and the small particles present in them were investigated. (C) 2002 Acta Materialia Inc. Published by Elsevier Science Ltd. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

A range of lasers is now available for use in dentistry. This paper summarizes key current and emerging applications for lasers in clinical practice. A major diagnostic application of low power lasers is the detection of caries, using fluorescence elicited from hydroxyapatite or from bacterial by-products. Laser fluorescence is an effective method for detecting and quantifying incipient occlusal and cervical carious lesions, and with further refinement could be used in the same manner for proximal lesions. Photoactivated dye techniques have been developed which use low power lasers to elicit a photochemical reaction. Photoactivated dye techniques can be used to disinfect root canals, periodontal pockets, cavity preparations and sites of peri-implantitis. Using similar principles, more powerful lasers can be used for photodynamic therapy in the treatment of malignancies of the oral mucosa. Laser-driven photochemical reactions can also be used for tooth whitening. In combination with fluoride, laser irradiation can improve the resistance of tooth structure to demineralization, and this application is of particular benefit for susceptible sites in high caries risk patients. Laser technology for caries removal, cavity preparation and soft tissue surgery is at a high state of refinement, having had several decades of development up to the present time. Used in conjunction with or as a replacement for traditional methods, it is expected that specific laser technologies will become an essential component of contemporary dental practice over the next decade.

Relevance:

10.00%

Publisher:

Abstract:

The Test of Mouse Proficiency (TOMP) was developed to assist occupational therapists and education professionals to assess computer mouse competency skills in children from preschool to upper primary (elementary) school age. The preliminary reliability and validity of TOMP are reported in this paper. Methods used to examine the internal consistency, test-retest reliability, and criterion- and construct-related validity of the test are elaborated. In the continuing process of test refinement, these preliminary studies support, to varying degrees, the reliability and validity of TOMP. Recommendations for further validation of the assessment are discussed along with indications for potential clinical application.

Relevance:

10.00%

Publisher:

Abstract:

Multiple HLA class I alleles can bind peptides with common sequence motifs due to structural similarities in the peptide binding cleft, and these groups of alleles have been classified into supertypes. Nine major HLA supertypes have been proposed, including an A24 supertype that includes A*2301, A*2402, and A*3001. Evidence for this A24 supertype is limited to HLA sequence homology and/or similarity in peptide binding motifs for the alleles. To investigate the immunological relevance of this proposed supertype, we have examined two viral epitopes (from EBV and CMV) initially defined as HLA-A*2301-binding peptides. The data clearly demonstrate that each peptide could be recognized by CTL clones in the context of A*2301 or A*2402; thus validating the inclusion of these three alleles within an A24 supertype. Furthermore, CTL responses to the EBV epitope were detectable in both A*2301(+) and A*2402(+) individuals who had been previously exposed to this virus. These data substantiate the biological relevance of the A24 supertype, and the identification of viral epitopes with the capacity to bind promiscuously across this supertype could aid efforts to develop CTL-based vaccines or immunotherapy. The degeneracy in HLA restriction displayed by some T cells in this study also suggests that the dogma of self-MHC restriction needs some refinement to accommodate foreign peptide recognition in the context of multiple supertype alleles.

Relevance:

10.00%

Publisher:

Abstract:

Increased professionalism in rugby has elicited rapid changes in the fitness profile of elite players. Recent research focusing on the physiological and anthropometrical characteristics of rugby players, and on the demands of competition, is reviewed. The paucity of research on contemporary elite rugby players is highlighted, along with the need for standardised testing protocols. Recent data reinforce the pronounced differences in the anthropometric and physical characteristics of the forwards and backs. Forwards are typically heavier, taller, and have a greater proportion of body fat than backs. These characteristics are changing, with forwards developing greater total mass and higher muscularity. The forwards demonstrate superior absolute aerobic and anaerobic power, and muscular strength. Results favour the backs when body mass is taken into account. The scaling of results to body mass can be problematic and future investigations should present results using power function ratios. Recommended tests for elite players include body mass and skinfolds, vertical jump, speed, and the multi-stage shuttle run. Repeat sprint testing is a possible avenue for more specific evaluation of players. During competition, high-intensity efforts are often followed by periods of incomplete recovery. The total work over the duration of a game is lower in the backs compared with the forwards; forwards spend greater time in physical contact with the opposition while the backs spend more time in free running, allowing them to cover greater distances. The intense efforts undertaken by rugby players place considerable stress on anaerobic energy sources, while the aerobic system provides energy during repeated efforts and for recovery. Training should focus on repeated brief high-intensity efforts with short rest intervals to condition players to the demands of the game. Training for the forwards should emphasise the higher work rates of the game, while extended rest periods can be provided to the backs. Players should not only be prepared for the demands of competition, but also the stress of travel and extreme environmental conditions. The greater professionalism of rugby union has increased scientific research in the sport; however, there is scope for significant refinement of investigations on the physiological demands of the game, and sports-specific testing procedures.
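As a hedged illustration of the "power function ratio" scaling recommended above, the sketch below fits an allometric exponent b by log-log regression on hypothetical body mass and strength data, then expresses each player's score as strength / mass**b; the numbers and variable names are invented for illustration.

```python
import numpy as np

# Hypothetical body masses (kg) and absolute strength scores (kg lifted);
# these values are illustrative, not data from the review.
mass = np.array([85.0, 92.0, 101.0, 108.0, 115.0, 118.0])
strength = np.array([150.0, 158.0, 170.0, 172.0, 181.0, 183.0])

# Fit the allometric model strength = a * mass**b via log-log least squares.
b = np.polyfit(np.log(mass), np.log(strength), 1)[0]

# Power-function ratio: a body-mass-independent expression of strength.
scaled = strength / mass**b
print(f"fitted exponent b = {b:.2f}")
print("power-function ratios:", np.round(scaled, 2))
```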

Relevance:

10.00%

Publisher:

Abstract:

The Timed Interval Calculus, a timed-trace formalism based on set theory, is introduced. It is extended with an induction law and a unit for concatenation, which facilitates the proof of properties over trace histories. The effectiveness of the extended Timed Interval Calculus is demonstrated via a benchmark case study, the mine pump. Specifically, a safety property relating to the operation of a mine shaft is proved, based on an implementation of the mine pump and assumptions about the environment of the mine. (C) 2002 Elsevier Science B.V. All rights reserved.
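The mine pump property in the paper is established by proof in the Timed Interval Calculus; purely as an informal counterpart, the sketch below checks a representative timed safety requirement ("whenever methane is detected, the pump is off within delta time units") over one sampled trace history. This is a runtime check of a single trace, not a TIC proof, and the property, trace and names are assumptions for illustration.

```python
# Informal trace check of a mine-pump-style safety property (not a TIC proof):
# whenever methane is present, the pump must be off within `delta` time units.

def safe(trace, delta):
    """trace: list of (time, methane_present, pump_on) samples, sorted by time."""
    for t, methane, _ in trace:
        if not methane:
            continue
        # Pump state at the first sample taken delta time units (or more) later.
        later = [(tt, pump) for tt, _, pump in trace if tt >= t + delta]
        if later and later[0][1]:
            return False   # pump still running too long after methane was detected
    return True

trace = [(0, False, True), (5, True, True), (6, True, False), (20, False, False)]
print(safe(trace, delta=2))  # True: pump is off within 2 time units of methane at t = 5
```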

Relevance:

10.00%

Publisher:

Abstract:

We introduce a refinement of the standard continuous variable teleportation measurement and displacement strategies. This refinement makes use of prior knowledge about the target state and the partial information carried by the classical channel when entanglement is nonmaximal. This gives an improvement in the output quality of the protocol. The strategies we introduce could be used in current continuous variable teleportation experiments.
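To give a rough numerical feel for why gain tuning helps when entanglement is nonmaximal, the sketch below evaluates the textbook Braunstein-Kimble output fidelity for teleporting a coherent state of known amplitude, as a function of the classical gain g and two-mode squeezing r (quadrature convention with vacuum variance 1). These are standard unity-convention formulas with arbitrary parameter values, not the specific measurement and displacement strategies introduced in the paper.

```python
import numpy as np

# Fidelity of CV teleportation of a known coherent state |alpha>, as a function
# of the classical gain g, for two-mode squeezing r.
# Convention: vacuum quadrature variance = 1 (covariance matrix = identity).

def fidelity(g, r, alpha):
    # Output quadrature variance: input contribution plus EPR correlation noise.
    v_out = g**2 + (1 + g**2) * np.cosh(2 * r) - 2 * g * np.sinh(2 * r)
    # Squared mean displacement error (g - 1) * alpha, in quadrature units.
    d2 = 4 * abs(alpha)**2 * (g - 1)**2
    return 2.0 / (1 + v_out) * np.exp(-d2 / (2 * (1 + v_out)))

r, alpha = 0.35, 0.5 + 0.5j            # modest squeezing, small known amplitude
gains = np.linspace(0.5, 1.2, 141)
f = fidelity(gains, r, alpha)
print(f"unity-gain fidelity = {fidelity(1.0, r, alpha):.3f}")
print(f"best gain = {gains[np.argmax(f)]:.2f}, fidelity = {f.max():.3f}")
```

With these arbitrary values the fidelity peaks at a gain below unity, which is the qualitative effect that prior knowledge of the target state allows a refined strategy to exploit.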

Relevance:

10.00%

Publisher:

Abstract:

This work evaluates the performance of the MECID (Boundary Element Method with Direct Interpolation) in solving the integral term corresponding to inertia in the Helmholtz equation, thereby allowing the eigenvalue problem to be modelled and the natural frequencies to be computed, and compares it with the results obtained by the MEF (Finite Element Method) based on the classical Galerkin formulation. First, some problems governed by the Poisson equation are addressed, which makes it possible to begin the performance comparison between the numerical methods considered here. The problems solved arise in different and important areas of engineering, such as heat transfer, electromagnetism and particular elastic problems. In numerical terms, the difficulties of accurately approximating more complex distributions of loads, sources or sinks in the interior of the domain are well known for any boundary technique. Nevertheless, this work shows that, despite such difficulties, the performance of the Boundary Element Method is superior, both in computing the basic variable and in computing its derivative. To this end, two-dimensional problems are solved concerning elastic membranes, stresses in bars under self-weight and the determination of natural frequencies of acoustic problems in closed domains, among others, using meshes with different levels of refinement, with linear elements and radial basis functions for the MECID and degree-one polynomial interpolation basis functions for the MEF. Performance curves are generated by computing the mean percentage error for each mesh, demonstrating the convergence and accuracy of each method. The results are also compared with the analytical solutions, when available, for each example solved in this work.
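To make the kind of comparison described above concrete, here is a small sketch (not the MECID or MEF code described above) that assembles linear Galerkin finite elements for the 1D Helmholtz eigenvalue problem on successively refined meshes and reports the mean percentage error of the first natural frequencies against the analytical values; the problem, meshes and names are assumptions chosen for illustration.

```python
import numpy as np
from scipy.linalg import eigh

def natural_frequencies(n_elems, n_modes=4):
    """Linear-FEM natural frequencies of a fixed-fixed 1D acoustic domain (0, 1)."""
    h = 1.0 / n_elems
    n_nodes = n_elems + 1
    K = np.zeros((n_nodes, n_nodes))
    M = np.zeros((n_nodes, n_nodes))
    ke = np.array([[1.0, -1.0], [-1.0, 1.0]]) / h      # element stiffness
    me = np.array([[2.0, 1.0], [1.0, 2.0]]) * h / 6.0  # consistent element mass
    for e in range(n_elems):
        K[e:e + 2, e:e + 2] += ke
        M[e:e + 2, e:e + 2] += me
    # Dirichlet boundary conditions: drop the first and last nodes.
    K, M = K[1:-1, 1:-1], M[1:-1, 1:-1]
    lam = eigh(K, M, eigvals_only=True)[:n_modes]      # generalized eigenvalues
    return np.sqrt(lam)

exact = np.array([1, 2, 3, 4]) * np.pi                 # exact frequencies n*pi
for n_elems in (8, 16, 32, 64):
    omega = natural_frequencies(n_elems)
    err = np.mean(np.abs(omega - exact) / exact) * 100.0
    print(f"{n_elems:3d} elements: mean percentage error = {err:.3f}%")
```

Halving the element size reduces the mean percentage error by roughly a factor of four, the expected second-order convergence of linear elements; the same error measure can then be tabulated for a boundary-element solver to build the performance curves the abstract describes.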

Relevance:

10.00%

Publisher:

Abstract:

This paper seeks to understand the use and the consequences of a Participatory Geographic Information System (PGIS) in a Mexican local community. A multilevel framework was applied, mainly influenced by two theoretical lenses – the structurationist view and the social shaping of technology – and structured in three dimensions – context, process and content – according to a contextualist logic. The results of our study offer two main contributions. The first is the refinement of the theoretical framework so that it better supports investigation of the implementation and use of Information and Communication Technology (ICT) artifacts by local communities for social and environmental purposes. The second is the extension of the existing IS (Information Systems) literature on participatory practices through the identification of conditions that help mobilize ICT as a tool for empowering local communities.

Relevance:

10.00%

Publisher:

Abstract:

In the past thirty years, a series of plans have been developed by successive Brazilian governments in a continuing effort to maximize the nation's resources for economic and social growth. This planning history has been quantitatively rich but qualitatively poor. The disjunction has stimulated Professor Mello e Souza to address himself to the problem of national planning and to offer some criticisms of Brazilian planning experience. Though political instability has obviously been a factor promoting discontinuity, his criticisms are aimed at the attitudes and strategic concepts which have sought to link planning to national goals and administration. He criticizes the fascination with techniques and plans to the exclusion of proper diagnosis of the socio-political reality, developing instruments to coordinate and carry out objectives, and creating an administrative structure centralized enough to make national decisions and decentralized enough to perform on the basis of those decisions. Thus, fixed, quantified objectives abound while the problem of functioning mechanisms for the coordinated, rational use of resources has been left unattended. Although his interest and criticism are focused on the process and experience of national planning, he recognized variation in the level and results of Brazilian planning. National plans have failed due to a faulty conception of the function of planning. Sectorial plans, save in the sector of the petroleum industry under government responsibility, have not succeeded in overcoming the problems of formulation and execution, thereby repeating old technical errors. Planning for the private sector has a somewhat brighter history due to the use of Grupos Executivos, which has enabled the planning process to transcend the formalism and tradition-bound attitudes of the regular bureaucracy. Regional planning offers two relatively successful experiences, Sudene and the strategy of the regionally oriented autarchy. Thus, planning history in Brazil is not entirely black but a certain shade of grey. The major part of the article, however, is devoted to a descriptive analysis of the national planning experience. The plans included in this analysis are: The Works and Equipment Plan (POE); The Health, Food, Transportation and Energy Plan (Salte); The Program of Goals; The Trienal Plan of Economic and Social Development; and the Plan of Governmental Economic Action (Paeg). Using these five plans for his historical experience, the author sets out a series of errors of formulation and execution by which he analyzes that experience. With respect to formulation, he speaks of a lack of elaboration of programs and projects, of coordination among diverse goals, and of provision of qualified staff and techniques. He mentions the absence of the definition of resources necessary to the financing of the plan and the inadequate quantification of sectorial and national goals due to the lack of reliable statistical information. Finally, he notes the failure to coordinate the annual budget with the multi-year plans. He sees the problems of execution as beginning in the absence of coordination between the various sectors of the public administration, the failure to develop an operative system of decentralization, the absence of any system of financial and fiscal control over execution, the difficulties imposed by the system of public accounting, and the absence of an adequate program of allocation for the liberation of resources. He ends by pointing to the failure to develop and use an integrated system of political-economic tools in a mode compatible with the objective of the plans. The body of the article analyzes national planning experience in Brazil using these lists of errors as a rough model of criticism. Several conclusions emerge from this analysis with regard to planning in Brazil and in developing countries, in general. Plans have generally been of little avail in Brazil because of the lack of a continuous, bureaucratized (in the Weberian sense) planning organization set in an instrumentally suitable administrative structure and based on thorough diagnoses of socio-economic conditions and problems. Plans have become the justification for planning. Planning has come to be conceived as a rational method of orienting the process of decisions through the establishment of a precise and quantified relation between means and ends. But this conception has led to a planning history rimmed with frustration and failure because of its rigidity in the face of flexible and changing reality. Rather, he suggests a conception of planning which understands it "as a rational process of formulating decisions about the policy, economy, and society whose only demand is that of managing the instrumentarium in a harmonious and integrated form in order to reach explicit, but not quantified ends". He calls this "planning without plans": the establishment of broad-scale tendencies through diagnosis whose implementation is carried out through an adjustable, coherent instrumentarium of political-economic tools. Administration according to a plan of multiple, integrated goals is a sound procedure if the nation's administrative machinery contains the technical development needed to control the multiple variables linked to any situation of socio-economic change. Brazil does not possess this level of refinement and any strategy of planning relevant to its problems must recognize this. The reforms which have been attempted fail to make this recognition, as is true of the conception of planning informing the Brazilian experience. Therefore, unworkable plans, ill-diagnosed, with little or no supportive instrumentarium or flexibility, have been Brazil's legacy. This legacy seems likely to continue until the conception of planning comes to live in the reality of Brazil.

Relevance:

10.00%

Publisher:

Abstract:

In the Sparse Point Representation (SPR) method, the principle is to retain the function data indicated by significant interpolatory wavelet coefficients, which are defined as interpolation errors by means of an interpolating subdivision scheme. Typically, an SPR grid is coarse in smooth regions and refined close to irregularities. Furthermore, the computation of partial derivatives of a function from the information of its SPR content is performed in two steps. The first one is a refinement procedure to extend the SPR by the inclusion of new interpolated point values in a security zone. Then, for points in the refined grid, such derivatives are approximated by uniform finite differences, using a step size proportional to each point's local scale. If required neighboring stencils are not present in the grid, the corresponding missing point values are approximated from coarser scales using the interpolating subdivision scheme. Using the cubic interpolating subdivision scheme, we demonstrate that such adaptive finite differences can be formulated in terms of a collocation scheme based on the wavelet expansion associated with the SPR. For this purpose, we prove some results concerning the local behavior of such wavelet reconstruction operators, which hold for SPR grids having appropriate structures. This statement implies that the adaptive finite difference scheme and the one using the step size of the finest level produce the same result at SPR grid points. Consequently, in addition to the refinement strategy, our analysis indicates that some care must be taken concerning the grid structure in order to keep the truncation error under a certain accuracy limit. Illustrative results are presented for numerical solutions of the 2D Maxwell equations.
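As a rough sketch of the SPR construction for a single refinement level (not the authors' implementation), the code below predicts odd-indexed samples from the even-indexed ones with the four-point (cubic) interpolating subdivision stencil, takes the prediction errors as interpolatory wavelet coefficients, and keeps only the points whose coefficients exceed a threshold; the test function, grid and threshold are arbitrary.

```python
import numpy as np

def sparse_point_representation(f_fine, threshold):
    """One level of SPR: detail = fine-grid value minus its cubic interpolating
    subdivision prediction from the coarse (even-indexed) samples."""
    coarse = f_fine[::2]
    details = np.zeros(len(f_fine) // 2)
    for k in range(len(details)):
        # Four-point (cubic) prediction of the odd sample between coarse[k] and
        # coarse[k + 1]; the stencil is clamped at the boundaries for simplicity.
        i0, i1 = max(k - 1, 0), k
        i2, i3 = min(k + 1, len(coarse) - 1), min(k + 2, len(coarse) - 1)
        pred = (-coarse[i0] + 9 * coarse[i1] + 9 * coarse[i2] - coarse[i3]) / 16.0
        details[k] = f_fine[2 * k + 1] - pred
    keep = np.abs(details) > threshold      # significant wavelet coefficients
    return details, keep

x = np.linspace(0.0, 1.0, 257)
f = np.tanh(40 * (x - 0.5))                 # smooth except near a sharp front at x = 0.5
details, keep = sparse_point_representation(f, threshold=1e-4)
print(f"kept {keep.sum()} of {keep.size} odd-indexed points (refined near the front)")
```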