926 results for Polyharmonic distortion modeling. X-parameters. Test-Bench. Planar structures. PHD
Abstract:
β-type Ti-alloy is a promising biomedical implant material as it has a low Young's modulus, but it is also known to have inferior surface hardness. Various surface treatments can be applied to enhance the surface hardness. Physical vapour deposition (PVD) and chemical vapour deposition (CVD) are two examples, but these techniques have limitations such as poor interfacial adhesion and high distortion. Laser surface treatment is a relatively new surface modification method for enhancing surface hardness, but its application is still not accepted by industry. The major problem of this process is surface melting, which results in higher surface roughness after the laser surface treatment. This paper reports the results achieved with a 100 W continuous wave (CW) fiber laser for laser surface treatment without melting the surface. Laser processing parameters were carefully selected so that the surface could be treated without surface melting and the surface finish of the component could thus be maintained. The surface and microstructural characteristics of the treated samples were examined using X-ray diffractometry (XRD), optical microscopy (OM), 3-D surface profile and contact angle measurements, and nano-indentation tests.
Abstract:
PURPOSE:
To determine the test-retest variability in perimetric, optic disc, and macular thickness parameters in a cohort of treated patients with established glaucoma.
PATIENTS AND METHODS:
In this cohort study, the authors analyzed the imaging studies and visual field tests at the baseline and 6-month visits of 162 eyes of 162 participants in the Glaucoma Imaging Longitudinal Study (GILS). They assessed the difference, expressed as the standard error of measurement, of Humphrey Field Analyzer II (HFA) Swedish Interactive Threshold Algorithm (SITA) Fast, Heidelberg Retinal Tomograph (HRT) II, and retinal thickness analyzer (RTA) parameters between the two visits and assumed that this difference was due to measurement variability, not pathologic change. A statistically significant change was defined as twice the standard error of measurement.
RESULTS:
In this cohort of treated glaucoma patients, it was found that statistically significant changes were 3.2 dB for mean deviation (MD), 2.2 dB for pattern standard deviation (PSD), 0.12 for cup shape measure, 0.26 mm² for rim area, and 32.8 μm and 31.8 μm for superior and inferior macular thickness, respectively. On the basis of these values, it was estimated that the number of potential progression events detectable in this cohort by the parameters of MD, PSD, cup shape measure, rim area, superior macular thickness, and inferior macular thickness was 7.5, 6.0, 2.3, 5.7, 3.1, and 3.4, respectively.
CONCLUSIONS:
The variability of the measurements of MD, PSD, and rim area, relative to the range of possible values, is less than the variability of cup shape measure or macular thickness measurements. Therefore, the former measurements may be more useful global measurements for assessing progressive glaucoma damage.
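The abstract does not spell out how the standard error of measurement (SEM) is computed; a common convention for test-retest data, assumed here, is SEM = SD(differences)/√2, with a significant change then taken as 2 × SEM. A minimal sketch under that assumption, using made-up between-visit differences:

import math

def sem_test_retest(differences):
    """Standard error of measurement from paired test-retest data,
    assuming the convention SEM = SD(differences) / sqrt(2)."""
    n = len(differences)
    mean = sum(differences) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in differences) / (n - 1))
    return sd / math.sqrt(2)

# Hypothetical between-visit MD differences (dB) for a few eyes:
diffs = [0.4, -1.1, 2.0, -0.3, 1.5, -2.2, 0.8]
sem = sem_test_retest(diffs)
print(f"SEM = {sem:.2f} dB, significant change = {2 * sem:.2f} dB")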
Abstract:
The evaporation of exoplanetary atmospheres is thought to be driven by high-energy irradiation. However, the actual mass-loss rates are not well constrained. Co-I Kipping has recently discovered that the star KOI-314, an M1V dwarf at a distance of 65 pc, is orbited by two Earth-sized planets, the inner one rocky and the outer one gaseous (P_orb = 14 d and 23 d). Other recent works have shown an abundance of small rocky planets in very close orbits around their host stars, suggesting that stellar high-energy irradiation evaporates away gaseous envelopes. KOI-314 is the first nearby system in which Earth-sized planets of both types are detected, allowing us to constrain the efficiency of planetary evaporation if the stellar X-ray irradiation is measured. We therefore propose a 10 ks Chandra ACIS-S pointing to determine the stellar X-ray luminosity and hardness ratio. The accuracy of the orbital solution decreases quickly due to transit-timing variations, which is why we ask for Director's Discretionary Time (DDT).
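The link between measured X-ray flux and evaporation is commonly made with the energy-limited approximation; the abstract does not state the proposal's own prescription, so the formula and all numbers below are assumptions for illustration only:

import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
EPSILON = 0.1        # assumed heating efficiency (typically 0.1-0.4)

def energy_limited_mdot(l_xuv, a, r_p, m_p):
    """Energy-limited mass-loss rate in kg/s: a standard textbook
    approximation, not the proposal's own method.
    l_xuv: stellar X-ray/EUV luminosity (W); a: orbital distance (m);
    r_p: planet radius (m); m_p: planet mass (kg)."""
    f_xuv = l_xuv / (4 * math.pi * a ** 2)       # XUV flux at the planet
    return EPSILON * math.pi * r_p ** 3 * f_xuv / (G * m_p)

# Illustrative inputs only, not measured values for KOI-314:
print(energy_limited_mdot(l_xuv=1e20, a=0.1 * 1.496e11,
                          r_p=6.4e6, m_p=6.0e24))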
Abstract:
Experimental extended X-ray absorption fine structure (EXAFS) spectra carry information about the chemical structure of metal-protein complexes. However, predicting the structure of such complexes from EXAFS spectra is not a simple task. Currently, methods such as Monte Carlo optimization or simulated annealing are used in structure refinement of EXAFS. These methods have proven somewhat successful in structure refinement but have not been successful in finding the global minima. Multiple population-based algorithms, including a genetic algorithm, a restarting genetic algorithm, differential evolution, and particle swarm optimization, are studied for their effectiveness in structure refinement of EXAFS. The oxygen-evolving complex in the S1 state is used as a benchmark for comparing the algorithms. These algorithms were successful in finding new atomic structures that produced improved calculated EXAFS spectra over atomic structures previously found.
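As a toy illustration of how one of the named optimizers (differential evolution) can refine structural parameters against a target spectrum, here is a sketch with a simple damped-sine stand-in in place of a real EXAFS calculation; the path model and bounds are invented for the example:

import numpy as np
from scipy.optimize import differential_evolution

k = np.linspace(2.0, 12.0, 200)          # photoelectron wavenumber grid

def calc_spectrum(distances):
    """Stand-in for a real EXAFS calculation: a damped sum of
    single-scattering paths, chi(k) ~ sum sin(2kR)/R^2 * exp(-0.01 k^2)."""
    chi = np.zeros_like(k)
    for r in distances:
        chi += np.sin(2.0 * k * r) / r ** 2
    return chi * np.exp(-0.01 * k ** 2)

target = calc_spectrum([1.8, 2.7, 3.3])  # synthetic "experimental" spectrum

def misfit(x):
    # Sum of squared residuals between trial and target spectra.
    return np.sum((calc_spectrum(x) - target) ** 2)

result = differential_evolution(misfit, bounds=[(1.5, 4.0)] * 3, seed=1)
print(result.x)   # recovered shell distances (in some permutation)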
Abstract:
The technique of Monte Carlo (MC) tests [Dwass (1957), Barnard (1963)] provides an attractive method of building exact tests from statistics whose finite-sample distribution is intractable but can be simulated (provided it does not involve nuisance parameters). We extend this method in two ways: first, by allowing for MC tests based on exchangeable, possibly discrete test statistics; second, by generalizing the method to statistics whose null distributions involve nuisance parameters (maximized MC tests, MMC). Simplified, asymptotically justified versions of the MMC method are also proposed, and it is shown that they provide a simple way of improving standard asymptotics and dealing with nonstandard asymptotics (e.g., unit-root asymptotics). Parametric bootstrap tests may be interpreted as a simplified version of the MMC method (without the general validity properties of the latter).
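When the null distribution is free of nuisance parameters, the MC test of Dwass and Barnard reduces to a counting rule: simulate the statistic N times under the null and set p = (1 + #{simulated ≥ observed}) / (N + 1). A minimal sketch of that rule (the paper's exchangeable/discrete and MMC refinements are not reproduced here):

import random

def mc_p_value(observed, simulate, n=99):
    """Monte Carlo p-value of Dwass/Barnard:
    p = (1 + #{simulated >= observed}) / (n + 1)."""
    exceed = sum(1 for _ in range(n) if simulate() >= observed)
    return (1 + exceed) / (n + 1)

# Toy example: test H0: mean = 0 for N(0, 1) data, squared-mean statistic.
random.seed(0)
data = [0.3, -0.1, 0.9, 0.4, 0.2, 0.7, -0.2, 0.5]
observed = (sum(data) / len(data)) ** 2

def simulate():
    sample = [random.gauss(0, 1) for _ in data]
    return (sum(sample) / len(sample)) ** 2

print(mc_p_value(observed, simulate))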
Abstract:
Heterotachy, the variation of substitution rate over time and across sites, has been shown to be a frequent phenomenon in real data. Failure to model heterotachy can potentially cause phylogenetic artifacts. Currently, several models handle heterotachy: the mixture of branch lengths (MBL) model and various forms of the covarion model. In this project, our goal is to find a model that efficiently accounts for the heterotachous signals present in the data, and thereby improve phylogenetic inference. To this end, two studies were carried out. In the first, we compare the MBL model with the covarion model and the homogeneous model using the AIC and BIC tests as well as cross-validation. From our results, we conclude that the MBL model is not needed for sites whose branch lengths differ across the whole tree, because, in real data, the heterotachous signals that interfere with phylogenetic inference are generally concentrated in a limited area of the tree. In the second study, we relax the assumption that the covarion model is homogeneous across sites and develop a mixture model based on a Dirichlet process. To evaluate different heterogeneous models, we define several posterior predictive discrepancy tests to study various aspects of molecular evolution from stochastic mappings. These tests show that the covarion mixture model combined with a gamma distribution adequately reflects variation in substitutions both within and between sites. Our research provides a detailed description of heterotachy in real data and points the way for future heterotachous models. The posterior predictive discrepancy tests provide diagnostic tools for evaluating models in detail. Moreover, our two studies reveal the non-specificity of heterogeneous models and, consequently, the presence of interactions between different heterogeneous models. Our studies strongly suggest that data contain different heterogeneous features that should be taken into account simultaneously in phylogenetic analyses.
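The first study ranks models by AIC and BIC; for reference, a minimal sketch of those two generic criteria (the phylogenetic likelihoods themselves are not computed here, and the log-likelihoods and parameter counts below are hypothetical):

import math

def aic(log_likelihood, n_params):
    """Akaike information criterion: 2k - 2 ln L."""
    return 2 * n_params - 2 * log_likelihood

def bic(log_likelihood, n_params, n_obs):
    """Bayesian information criterion: k ln n - 2 ln L."""
    return n_params * math.log(n_obs) - 2 * log_likelihood

# Hypothetical fits: homogeneous vs. covarion model on 1000 sites.
print(aic(-12500.0, 10), aic(-12430.0, 14))          # lower is better
print(bic(-12500.0, 10, 1000), bic(-12430.0, 14, 1000))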
Abstract:
One of the main challenges of radiographic interpretation lies in understanding radiographic anatomy, which is intrinsically linked to the three-dimensional arrangement of anatomical structures and to the impact of the positioning of the X-ray tube relative to those structures during image acquisition. Traditionally, radiographs obtained in standard projections are used to teach radiographic anatomy in veterinary medicine. Computed tomography (CT) shares several of the image-formation characteristics of radiography. Using a specially developed plug-in (ORS Visual ©), the matrix containing the CT images is deformed to reproduce the geometric effects specific to the positioning of the tube and detector relative to the radiographed patient, particularly magnification and distortion. To evaluate the rendering of the simulated images, different body regions of two dogs, a cat, and a horse were imaged by CT and then radiographed following standard examination protocols. To validate the educational potential of the simulations, ten board-certified radiologists blindly compared nine series of simulated radiographic images with the standard radiographic series. Several criteria were evaluated: the grade of visualization of anatomical landmarks, the realism and radiographic quality of the images, patient positioning, and the educational potential of the images for different levels of veterinary training. The overall results indicate that radiographic images simulated with this model are sufficiently representative of reality to be used in teaching radiographic anatomy in veterinary medicine.
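The plug-in's implementation is not described, but the magnification effect it reproduces follows from generic point-source projection geometry: a structure at depth z from the X-ray tube, imaged on a detector at source-to-image distance SID, is magnified by SID/z. A hedged sketch of that geometry (standard radiography, not the ORS Visual code):

def project(point, sid):
    """Project a 3-D point (x, y, z), with z its distance from the
    X-ray point source along the beam axis, onto a detector plane at
    distance sid. Magnification = sid / z, so structures nearer the
    tube (smaller z) appear larger on the detector."""
    x, y, z = point
    m = sid / z
    return (x * m, y * m)

# Two identical structures at different depths along the beam axis:
print(project((10.0, 0.0, 60.0), sid=100.0))   # farther from the tube
print(project((10.0, 0.0, 40.0), sid=100.0))   # nearer: more magnified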
Abstract:
Adolescent idiopathic scoliosis (AIS) is a three-dimensional deformity of the spine and rib cage whose cause is unknown. The pelvic girdle appears to be involved in the pathogenesis of AIS, as geometric differences of the hip bones have been observed. In particular, a rotation of the pelvis or a lateral tilt in the direction of the scoliotic curve has been demonstrated, in addition to bone distortions. It is difficult to dissociate pelvic rotation from pelvic asymmetry, because most studies rely on two-dimensional radiological data. A three-dimensional analysis of pelvic morphology in AIS patients who have not been treated with a brace or surgery would make it possible to identify the role of the pelvis in the progression of scoliosis. Seventeen girls with AIS participated in this study, for whom standing biplanar radiographs were available at the time of diagnosis by a pediatric orthopedic surgeon and at the time a brace was prescribed. Posteroanterior and lateral radiographs had been taken with the EOS® system. Twelve anatomical landmarks of the pelvis were identified on the pairs of radiographs, while four landmarks were identified on the posteroanterior radiograph only; these four could not be identified on the lateral radiograph because of the superposition of the right and left landmarks. The three-dimensional reconstruction of the pelvis was performed from the two radiographs using the first 12 bone landmarks. In total, nine three-dimensional parameters were computed to quantify pelvic asymmetry and distortion between the two time points. Two-dimensional parameters were also measured on the last four bone landmarks to document pelvic deformities relevant to clinical practice, such as axial rotation of the pelvis. To assess possible asymmetry between the hip bones, the three-dimensional parameters of the left side of the pelvis were compared with those of the right side at each time point using a paired-sample t-test. Pelvic morphology was also evaluated by repeated-measures multivariate analysis of variance (MANOVA) with two factors (side × time). We found statistically significant bone growth of the pelvis in the interval between the diagnosis of scoliosis and brace wear (p = 0.033), as well as a significant asymmetry between the left and right sides of the pelvis (p = 0.013). Regarding the two-dimensional parameters, pelvic version increased (p = 0.024) as the girls grew. Finally, the pelvis showed no distortion (p = 0.763). In conclusion, the growth of girls with adolescent idiopathic scoliosis is accompanied by morphological asymmetry between the two hip bones of the pelvis. This asymmetry, observed at the time of diagnosis, evolved up to the time the brace was prescribed. As for the two-dimensional parameters, we can conclude that backward rotation of the pelvis increased during growth, producing pelvic retroversion in the sagittal plane. However, no three-dimensional distortion of the pelvis was observed during growth.
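A minimal sketch of the left-versus-right comparison described above, using a paired-sample t-test on illustrative values (not the study's data; the repeated-measures MANOVA step is omitted):

from scipy import stats

# Hypothetical left/right values of one 3-D pelvic parameter (mm),
# one pair per patient:
left  = [51.2, 49.8, 50.5, 52.1, 48.9, 50.0, 51.7]
right = [50.1, 49.0, 49.8, 51.0, 48.2, 49.1, 50.6]

t, p = stats.ttest_rel(left, right)
print(f"t = {t:.2f}, p = {p:.3f}")  # p < 0.05 -> significant asymmetry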
Abstract:
In this Letter, a new physical model for metal-insulator-metal (MIM) CMOS capacitors is presented. In the model, the circuit parameters are derived from the physical structural details. Physical behaviors due to metal skin effect and inductance have been considered. The model has been confirmed by a 3-D EM simulator, and design rules are proposed. The model presented is scalable with capacitor geometry, allowing designers to predict and optimize the quality factor. The approach has been verified for MIM CMOS capacitors.
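The Letter's full physical model is not reproduced in the abstract; as a minimal stand-in, the quality factor a designer would predict from a series R-L-C equivalent circuit of the capacitor looks like this (my simplification with illustrative element values, not the paper's extracted parameters):

import math

def q_factor(c, r_s, l_s, freq):
    """Quality factor of a capacitor modeled as a series R-L-C branch:
    Z = R + j(wL - 1/(wC)), Q = |Im Z| / Re Z."""
    w = 2 * math.pi * freq
    x = w * l_s - 1.0 / (w * c)
    return abs(x) / r_s

# Illustrative 1 pF MIM capacitor with 0.5 ohm series resistance and
# 50 pH series inductance, evaluated at 5 GHz:
print(q_factor(c=1e-12, r_s=0.5, l_s=50e-12, freq=5e9))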
Abstract:
The thesis mainly focuses on material characterization in different environments: freely available samples in planar form, biological samples available in small quantities, and buried objects. The free-space method finds many applications in the fields of industry, medicine, and communication. As it is a non-contact method, it can be employed for monitoring the electrical properties of materials moving along a conveyor belt in real time. Also, measurement on such systems at high temperature is possible. NID theory can be applied to the characterization of thin films. Dielectric properties of thin films deposited on any dielectric substrate can be determined. In the chemical industry, the stages of a chemical reaction can be monitored online. Online monitoring is more efficient as it saves time and avoids the risk of sample collection. Dielectric contrast is one of the main factors that decides the detectability of a system. It could be noted that a dielectric object of dielectric constant 3.2 (εr of a plastic mine) placed in a medium of dielectric constant 2.56 (εr of sand) could even be detected by employing time-domain analysis of the reflected signal. This type of detection is of strategic importance as it provides a solution to the problem of clearance of non-metallic mines, whose demining using conventional techniques has proved futile. The studies on the detection of voids and leakage in pipes find many applications. The determined electrical properties of tissues can be used for numerical modeling of cells, microwave imaging, SAR tests, etc. All these techniques need the accurate determination of the dielectric constant. In the modern world, the use of cellular and other wireless communication systems is booming. At the same time, people are concerned about the hazardous effects of microwaves on living cells. The effect is usually studied on human phantom models, whose construction requires knowledge of the dielectric parameters of the various body tissues. It is in this context that the present study gains significance. The case study on biological samples shows that the properties of normal and infected body tissues are different. Even though the change in the dielectric properties of infected samples from those of normal ones may not be clear evidence of an ailment, it is an indication of some disorder. In the medical field, the free-space method may be adapted for imaging biological samples. This method can also be used in wireless technology. Evaluation of the electrical properties and attenuation of obstacles in the path of RF waves can be done using free waves. An intelligent system for controlling the power output or frequency depending on the feedback values of the attenuation may be developed. The simulation employed in GPR can be extended to explore the effects of factors such as the proportion of water content in the soil and the level and roughness of the soil on the reflected signal. This may find applications in geological exploration. In the detection of mines, a state-of-the-art technique for scanning and imaging an active minefield can be developed using GPR. The probing antenna can be attached to a robotic arm capable of three degrees of rotation, and the whole detecting system can be housed in a military vehicle. In industry, a system based on the GPR principle can be developed for monitoring liquid or gas through a pipe, as a pipe with and without the sample gives different reflection responses. It may also be implemented for the online monitoring of different stages of extraction and purification of crude petroleum in a plant. Since biological samples show fluctuations in their dielectric nature with time and other physiological conditions, more investigation in this direction should be done. Infected cells at various stages of advancement and normal cells should be analysed. The results from these comparative studies can be utilized for the detection of the onset of such diseases. By studying the properties of infected tissues at different stages, the threshold of detectability of infected cells can be determined.
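The sand-versus-plastic detectability claim can be sanity-checked with the normal-incidence Fresnel reflection coefficient between two lossless dielectrics, Γ = (√ε₁ − √ε₂)/(√ε₁ + √ε₂); a quick sketch:

import math

def reflection_coefficient(eps1, eps2):
    """Normal-incidence reflection at the boundary between two
    lossless dielectrics of relative permittivity eps1 and eps2."""
    n1, n2 = math.sqrt(eps1), math.sqrt(eps2)
    return (n1 - n2) / (n1 + n2)

# Sand (eps_r = 2.56) over a plastic mine (eps_r = 3.2):
gamma = reflection_coefficient(2.56, 3.2)
print(f"Gamma = {gamma:.3f}")  # small but nonzero -> detectable echo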
Abstract:
Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements, and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and resulting land-use patterns. An essential methodology to study and quantify such interactions is provided by land-use models. By applying land-use models, it is possible to analyze the complex structure of linkages and feedbacks and also to determine the relevance of driving forces. Modeling land use and land-use changes has a long tradition. In particular on the regional scale, a variety of models for different regions and research questions has been created. Modeling capabilities grow with steady advances in computer technology, driven on the one hand by increasing computing power and on the other by new methods in software development, e.g., object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, is introduced and discussed. Particular features of SITE are the notably extended capability to integrate models and the strict separation of application and implementation. These features enable efficient development, testing, and use of integrated land-use models. On its system side, SITE provides generic data structures (grids, grid cells, attributes, etc.) and takes responsibility for their administration. By means of a scripting language (Python) that has been extended with language features specific to land-use modeling, these data structures can be utilized and manipulated by modeling applications. The scripting language interpreter is embedded in SITE. The integration of sub-models can be achieved via the scripting language or by use of a generic interface provided by SITE. Furthermore, functionalities important for land-use modeling, such as model calibration, model testing, and analysis of simulation results, have been integrated into the generic framework. During the implementation of SITE, specific emphasis was laid on expandability, maintainability, and usability. Along with the modeling framework, a land-use model for the analysis of the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics in the historical period of 1981 to 2002. In parallel, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace gas emissions, the DAYCENT agro-ecosystem model was integrated. In this case study, it could be shown that land-use changes in the Indonesian research area were mainly characterized by the expansion of agricultural areas at the expense of natural forest. For this reason, the situation had to be interpreted as unsustainable even though increased agricultural use implied economic improvements and higher farmers' incomes. Due to the importance of model calibration, it was explicitly addressed in the SITE architecture through the introduction of a specific component.
The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal or at least adequate solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function typically is a map comparison algorithm capable of comparing a simulation result to a reference map. Several map optimization and map comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure-of-merit map comparison measure as the objective function. The time period for the calibration ranged from 1981 to 2002. For this period, respective reference land-use maps were compiled. It could be shown that efficient automated model calibration with SITE is possible. Nevertheless, the selection of the calibration parameters requires detailed knowledge about the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and resulting losses in income from coffee cultivation were analyzed and quantified under the assumption of four different deforestation scenarios. For this task, an empirical model describing the dependence of bee pollination and the resulting coffee fruit set on the distance to the closest natural forest was integrated. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously. This results in a reduction of coffee yields of up to 18% and a loss of net revenues per hectare of up to 14%. However, the study also showed that ecological and economic values can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
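The figure of merit named as the calibration objective is, in common usage, an overlap measure on change cells: hits / (hits + misses + false alarms + wrong hits). A sketch of my reading of that measure (not SITE's own code):

import numpy as np

def figure_of_merit(initial, reference, simulated):
    """Figure of merit for land-use change maps: agreement on change
    divided by all cells where either map shows change."""
    ref_change = reference != initial
    sim_change = simulated != initial
    hits = np.sum(ref_change & sim_change & (reference == simulated))
    misses = np.sum(ref_change & ~sim_change)
    false_alarms = np.sum(sim_change & ~ref_change)
    wrong = np.sum(ref_change & sim_change & (reference != simulated))
    return hits / (hits + misses + false_alarms + wrong)

# Toy 1-D "maps" with categories 0 = forest, 1 = agriculture:
initial   = np.array([0, 0, 0, 0, 0, 0])
reference = np.array([0, 1, 1, 0, 0, 1])
simulated = np.array([0, 1, 0, 1, 0, 1])
print(figure_of_merit(initial, reference, simulated))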
Abstract:
This paper studies the test-retest reliability of distortion product otoacoustic emissions (DPOAE) in newborns in a neonatal intensive care unit.
Abstract:
Ecological risk assessments must increasingly consider the effects of chemical mixtures on the environment as anthropogenic pollution continues to grow in complexity. Yet testing every possible mixture combination is impractical and unfeasible; thus, there is an urgent need for models that can accurately predict mixture toxicity from single-compound data. Currently, two models are frequently used for this purpose: concentration addition and independent action (IA). The accuracy of the predictions generated by these models is currently debated and needs to be resolved before their use in risk assessments can be fully justified. The present study addresses this issue by determining whether the IA model adequately described the toxicity of binary mixtures of five pesticides and other environmental contaminants (cadmium, chlorpyrifos, diuron, nickel, and prochloraz), each with dissimilar modes of action, on the reproduction of the nematode Caenorhabditis elegans. In three out of 10 cases, the IA model failed to describe mixture toxicity adequately, with significant synergy or antagonism being observed. In a further three cases, there was an indication of synergy, antagonism, and effect-level-dependent deviations, respectively, but these were not statistically significant. The extent of the significant deviations that were found varied, but all were such that the predicted percentage effect on reproductive output would have been wrong by 18 to 35% (i.e., the effect concentration expected to cause a 50% effect led to an 85% effect). The presence of such a high number and variety of deviations has important implications for the use of existing mixture toxicity models for risk assessments, especially where all or part of the deviation is synergistic.
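For reference, the IA model tested above combines single-compound fractional effects multiplicatively, E_mix = 1 − Π(1 − E_i). A minimal sketch, which also shows why an observed 85% effect from two compounds each at their EC50 exceeds the IA prediction:

def independent_action(effects):
    """Independent action (response addition): predicted fractional
    mixture effect from single-compound fractional effects,
    E_mix = 1 - prod(1 - E_i)."""
    surviving = 1.0
    for e in effects:
        surviving *= (1.0 - e)
    return 1.0 - surviving

# Two compounds, each causing a 50% reduction in reproduction:
print(independent_action([0.5, 0.5]))   # IA predicts 0.75, i.e. 75%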
Modeling of atmospheric effects on InSAR measurements by incorporating terrain elevation information
Abstract:
We propose an elevation-dependent calibration method to correct for the water-vapour-induced delays over Mt. Etna that affect interferometric synthetic aperture radar (InSAR) results. Water vapour delay fields are modelled from individual zenith delay estimates on a network of continuous GPS receivers. These are interpolated using simple kriging with varying local means over two domains, above and below 2 km in altitude. Test results with data from a meteorological station and 14 continuous GPS stations over Mt. Etna show that a reduction of the mean phase delay field of about 27% is achieved after the model is applied to a 35-day interferogram.
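The two-domain interpolation can be pictured with a simplified stand-in for simple kriging with varying local means: fit a zenith-delay-versus-elevation trend separately below and above the 2 km split and evaluate it as the local mean at each target point (the kriged residual field is omitted; station values are invented):

import numpy as np

def local_mean_delay(station_elev, station_delay, target_elev,
                     split=2000.0):
    """Fit a linear zenith-delay vs. elevation trend separately below
    and above the split altitude and evaluate it at target elevations.
    A simplified stand-in for simple kriging with varying local means."""
    station_elev = np.asarray(station_elev, dtype=float)
    station_delay = np.asarray(station_delay, dtype=float)
    target_elev = np.asarray(target_elev, dtype=float)
    out = np.empty_like(target_elev)
    for is_high in (False, True):
        s = (station_elev >= split) == is_high   # stations in this domain
        t = (target_elev >= split) == is_high    # targets in this domain
        coeffs = np.polyfit(station_elev[s], station_delay[s], 1)
        out[t] = np.polyval(coeffs, target_elev[t])
    return out

# Hypothetical GPS zenith wet delays (m) at station elevations (m):
elev = [100, 600, 1200, 1800, 2300, 2800]
zwd  = [0.18, 0.15, 0.11, 0.08, 0.05, 0.03]
print(local_mean_delay(elev, zwd, [900, 2500]))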