932 results for Metals - Formability - Simulation methods


Relevance:

100.00%

Publisher:

Abstract:

This dissertation deals with two specific aspects of a potential hydrogen-based energy economy, namely the problems of energy storage and energy conversion. In order to contribute to the solution of these problems, the structural and dynamical properties of two promising materials for hydrogen storage (lithium imide/amide) and proton conduction (poly[vinyl phosphonic acid]) are modeled on an atomistic scale by means of first-principles molecular dynamics simulation methods.

In the case of the hydrogen storage system lithium amide/imide (LiNH_2/Li_2NH), the focus was on the interplay of structural features and nuclear quantum effects. For these calculations, Path-Integral Molecular Dynamics (PIMD) simulations were used. The structures of these materials at room temperature were elucidated; in collaboration with an experimental group, very good agreement between calculated and experimental solid-state 1H-NMR chemical shifts was observed. Specifically, the structure of Li_2NH features a disordered arrangement of the Li lattice, which was not reported in previous studies. In addition, a persistent precession of the NH bonds was observed in our simulations. We provide evidence that this precession is the consequence of a toroid-shaped effective potential in which the protons in the material are immersed. This potential is essentially flat along the torus azimuthal angle, which might lead to important quantum delocalization effects of the protons over the torus.

On the energy conversion side, the dynamics of protons in a proton-conducting polymer (poly[vinyl phosphonic acid], PVPA) was studied by means of a steered ab initio Molecular Dynamics approach applied to a simplified polymer model. The focus was put on understanding the microscopic proton transport mechanism in polymer membranes and on characterizing the relevance of the local environment. This covers in particular the effect of water molecules, which participate in the hydrogen bonding network in the material. The results indicate that these water molecules are essential for effective proton conduction. A water-mediated Grotthuss mechanism is identified as the main contributor to proton conduction, which agrees with the experimentally observed drop in conductivity for the same material in the absence of water.

The gain in understanding of the microscopic processes and structures present in these materials can help the development of new materials with improved properties, thus contributing to the solution of problems in the implementation of fuel cells.

The report reviews the technology of free-space optical communication (FSO) and simulation methods for testing the performance of a diverged beam in this technology. In addition to the introduction, the theory of turbulence and its effect on laser propagation is reviewed. In the simulation chapter, on-off keying (OOK) modulation and a diverged beam are assumed at the transmitter, and at the receiver an avalanche photodiode (APD) converts the photon stream into an electron stream. Phase screens are adopted to simulate the effect of turbulence on the phase of the optical beam. In addition, the method of data processing is introduced and reviewed. The summary chapter gives a general account of the different beam divergences and their performance.
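Phase screens of the kind described above are commonly generated by filtering white Gaussian noise with the square root of a turbulence power spectrum. A minimal FFT-based sketch follows, assuming a Kolmogorov spectrum; the grid size, spacing and Fried parameter `r0` are illustrative choices, and normalization conventions vary between references:

```python
import numpy as np

def kolmogorov_phase_screen(n=256, dx=0.01, r0=0.1, seed=0):
    """One random phase screen (radians) on an n x n grid.

    n  : grid size in pixels
    dx : grid spacing in metres
    r0 : Fried parameter in metres (smaller r0 = stronger turbulence)
    """
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(n, d=dx)                 # spatial frequencies (1/m)
    fxx, fyy = np.meshgrid(fx, fx)
    f = np.hypot(fxx, fyy)
    f[0, 0] = np.inf                             # zero out the piston term
    # Kolmogorov phase power spectral density: 0.023 r0^(-5/3) f^(-11/3)
    psd = 0.023 * r0 ** (-5.0 / 3.0) * f ** (-11.0 / 3.0)
    df = 1.0 / (n * dx)                          # frequency spacing (1/m)
    noise = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    # shape the noise by sqrt(PSD) and transform back to the spatial domain
    return (np.fft.ifft2(noise * np.sqrt(psd) * df) * n * n).real
```

In a propagation simulation, several such screens would be placed along the path and the beam multiplied by exp(j*screen) at each one before the next free-space step.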

The electrolytic cleaning of metals by anodic methods has been known for many years. It was recognized long ago that when the temperature and concentration of the electrolyte were properly regulated, bright clean surfaces were obtained.

Modern generative manufacturing processes for internally cooled tooling offer almost unlimited freedom in the design of conformal cooling channels. This raises the demands on tool engineering and on the optimization of cooling performance. Suitable simulation methods (e.g. Computational Fluid Dynamics, CFD) are an ideal support for optimized tool design. By building virtual test rigs, design variants can be compared efficiently and inexpensively, and the costs of prototypes and rework can be reduced. In the computer model of the tool, soft sensors at arbitrary positions allow temperature-critical spots to be monitored in both the fluid and the solid domain. The benchmark carried out here compares the performance of an optimized tool insert with that of conventional cooling. The cycle-time reduction predicted in the virtual process agrees well with real experiments on the finished tools.

Relationships between mineralization, collagen orientation and indentation modulus were investigated in bone structural units from the mid-shaft of human femora using a site-matched design. Mineral mass fraction, collagen fibril angle and indentation moduli were measured in registered anatomical sites using backscattered electron imaging, polarized light microscopy and nanoindentation, respectively. Theoretical indentation moduli were calculated with a homogenization model from the quantified mineral densities and mean collagen fibril orientations. The average indentation moduli predicted from local mineralization and collagen fiber arrangement were not significantly different from the averages measured experimentally with nanoindentation (p=0.9). Surprisingly, no substantial correlation of the measured indentation moduli with tissue mineralization and/or collagen fiber arrangement was found. Nano-porosity, micro-damage, collagen cross-links, non-collagenous proteins or other parameters may affect the indentation measurements. Additional testing/simulation methods need to be considered to properly understand the variability of indentation moduli beyond the mineralization and collagen arrangement in bone structural units.
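The homogenization step can be illustrated with the crudest possible bounds, the Voigt and Reuss averages (the study uses a proper micromechanical model; the phase moduli and volume fraction below are illustrative textbook-order values, not data from the study):

```python
def voigt_reuss_bounds(e1, e2, v1):
    """Upper (Voigt) and lower (Reuss) bounds on the Young's modulus of a
    two-phase composite with phase moduli e1, e2 and volume fraction v1."""
    v2 = 1.0 - v1
    voigt = v1 * e1 + v2 * e2             # iso-strain (parallel) bound
    reuss = 1.0 / (v1 / e1 + v2 / e2)     # iso-stress (series) bound
    return voigt, reuss

# mineral (~100 GPa) and collagen (~1.5 GPa) at 45% mineral volume fraction
upper, lower = voigt_reuss_bounds(100.0, 1.5, 0.45)
```

Any admissible homogenized modulus must fall between the two bounds; tighter schemes (e.g. Mori-Tanaka or self-consistent estimates) narrow the interval considerably.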

Software-maintenance offshore outsourcing (SMOO) projects have been plagued by tedious knowledge transfer during the service transition to the vendor. Vendor engineers risk being over-strained by the high amounts of novel information, resulting in extra costs that may erode the business case behind offshoring. Although stakeholders may desire to avoid these extra costs by implementing appropriate knowledge transfer practices, little is known about how effective knowledge transfer can be designed and managed in light of the high cognitive loads in SMOO transitions. The dissertation at hand addresses this research gap by presenting and integrating four studies. The studies draw on cognitive load theory, attributional theory, and control theory, and they apply qualitative, quantitative, and simulation methods to qualitative data from eight in-depth longitudinal cases. The results suggest that the choice of appropriate learning tasks may be more central to knowledge transfer than the amount of information shared with vendor engineers. Moreover, because vendor staff may neither be able nor dare to effectively self-manage learning tasks during early transition, client-driven controls may be initially required and subsequently faded out. Collectively, the results call for people-based rather than codification-based knowledge management strategies in at least moderately specific and complex software environments.

Susceptibility of different restorative materials to toothbrush abrasion and coffee staining Objective: The aim of this study was to evaluate the susceptibility of different restorative materials to surface alterations after an aging simulation. Methods: Specimens (n=15 per material) of five different restorative materials (CER: ceramic/Vita Mark II; EMP: composite/Empress Direct; LAV: CAD/CAM composite/Lava Ultimate; COM: prefabricated composite/Componeer; VEN: prefabricated composite/Venear) were produced. Whereas CER was glazed, EMP and LAV were polished with silicon polishers, and COM and VEN were left untreated. Mean roughness (Ra and Rz) and colorimetric parameters (L*a*b*), expressed as colour change (ΔE), were measured. The specimens underwent an artificial aging procedure. After baseline measurements (M1), the specimens were successively immersed for 24 hours in coffee (M2), abraded in a toothbrushing simulator (M3), immersed in coffee (M4), abraded (M5) and repeatedly abraded (M6). After each aging procedure (M2-M6), surface roughness and colorimetric parameters were recorded. Differences between the materials regarding Ra/Rz and ΔE were analysed with a nonparametric ANOVA analysis. The level of significance was set at α=0.05. Results: The lowest roughness values were obtained for CER. A significant increase in Ra was detected for EMP, COM and VEN compared to CER. The Ra/Rz values were found to be highly significantly different for the materials and measuring times (M) (p<0.0001). Regarding ΔE, most alterations were found for EMP and COM, whereas CER and LAV remained mostly stable. The ΔE values were significantly different for the materials and M (p<0.0001). Conclusion: The ceramic and the CAD/CAM composite were the most stable materials with regard to roughness and colour change and the only materials that resulted in Ra values below 0.2 μm (the clinically relevant threshold). Venears and Componeers were more inert than the direct composite material and thus might be an alternative for extensive restorations in the aesthetic zone.
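The colour change ΔE used here is, in the CIE76 convention, simply the Euclidean distance between two measurements in L*a*b* space. A minimal sketch (the numeric values are hypothetical, not taken from the study):

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((c2 - c1) ** 2 for c1, c2 in zip(lab1, lab2)))

# a specimen drifting from (70.0, 2.0, 18.0) to (66.5, 3.0, 21.0)
# after coffee immersion (illustrative values)
shift = delta_e_cie76((70.0, 2.0, 18.0), (66.5, 3.0, 21.0))  # ≈ 4.72
```

ΔE values above roughly 3.3 are often quoted in the dental literature as clinically perceptible, which is why the stable CER and LAV groups stand out.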

Femoroacetabular impingement (FAI) is a dynamic conflict of the hip defined by a pathological, early abutment of the proximal femur onto the acetabulum or pelvis. In the past two decades, FAI has received increasing focus in both research and clinical practice as a cause of hip pain and prearthrotic deformity. Anatomical abnormalities such as an aspherical femoral head (cam-type FAI), a focal or general overgrowth of the acetabulum (pincer-type FAI), a high riding greater or lesser trochanter (extra-articular FAI), or abnormal torsion of the femur have been identified as underlying pathomorphologies. Open and arthroscopic treatment options are available to correct the deformity and to allow impingement-free range of motion. In routine practice, diagnosis and treatment planning of FAI is based on clinical examination and conventional imaging modalities such as standard radiography, magnetic resonance arthrography (MRA), and computed tomography (CT). Modern software tools allow three-dimensional analysis of the hip joint by extracting pelvic landmarks from two-dimensional antero-posterior pelvic radiographs. An object-oriented cross-platform program (Hip2Norm) has been developed and validated to standardize pelvic rotation and tilt on conventional AP pelvis radiographs. It has been shown that Hip2Norm is an accurate, consistent, reliable and reproducible tool for the correction of selected hip parameters on conventional radiographs. In contrast to conventional imaging modalities, which provide only static visualization, novel computer assisted tools have been developed to allow the dynamic analysis of FAI pathomechanics. In this context, a validated, CT-based software package (HipMotion) has been introduced. HipMotion is based on polygonal three-dimensional models of the patient’s pelvis and femur. The software includes simulation methods for range of motion, collision detection and accurate mapping of impingement areas. 
A preoperative treatment plan can be created by performing a virtual resection of any mapped impingement zones both on the femoral head-neck junction and on the acetabular rim using the same three-dimensional models. The following book chapter provides a summarized description of current computer-assisted tools for the diagnosis and treatment planning of FAI, highlighting the possibility of both static and dynamic evaluation, their reliability and reproducibility, and their applicability to routine clinical use.

The determination of size as well as power of a test is a vital part of clinical trial design. This research focuses on the simulation of clinical trial data with time-to-event as the primary outcome. It investigates the impact of different recruitment patterns and time-dependent hazard structures on the size and power of the log-rank test. A non-homogeneous Poisson process is used to simulate entry times according to the different accrual patterns. A Weibull distribution is employed to simulate survival times according to the different hazard structures. The size of the log-rank test is estimated by simulating survival times with identical hazard rates between the treatment and the control arm of the study, resulting in a hazard ratio of one. Powers of the log-rank test at specific values of the hazard ratio (≠1) are estimated by simulating survival times with different, but proportional, hazard rates for the two arms. Different shapes (constant, decreasing, or increasing) of the Weibull hazard function are also considered to assess the effect of hazard structure on the size and power of the log-rank test.
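The size estimation described above can be sketched as follows: simulate Weibull survival times with identical hazards in both arms, apply administrative censoring, and count how often the 5%-level log-rank test rejects. For brevity this sketch uses uniform accrual as a stand-in for the non-homogeneous Poisson entry process, and all parameter values are illustrative:

```python
import numpy as np

def logrank_stat(time, event, group):
    """Two-sample log-rank chi-square statistic (1 degree of freedom)."""
    o_minus_e, variance = 0.0, 0.0
    for t in np.unique(time[event == 1]):
        at_risk = time >= t
        n = at_risk.sum()
        n1 = (at_risk & (group == 1)).sum()
        d = ((time == t) & (event == 1)).sum()
        d1 = ((time == t) & (event == 1) & (group == 1)).sum()
        o_minus_e += d1 - d * n1 / n                 # observed minus expected
        if n > 1:                                    # hypergeometric variance
            variance += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return o_minus_e ** 2 / variance

def empirical_size(n_per_arm=50, shape=1.5, scale=1.0, accrual=1.0,
                   followup=2.0, n_sim=500, seed=1):
    """Fraction of simulated trials in which the 5%-level log-rank test
    rejects, under identical Weibull hazards in both arms (true size)."""
    rng = np.random.default_rng(seed)
    crit = 3.841                     # chi-square(1), 95th percentile
    group = np.repeat([0, 1], n_per_arm)
    rejections = 0
    for _ in range(n_sim):
        entry = rng.uniform(0.0, accrual, 2 * n_per_arm)   # uniform accrual
        surv = scale * rng.weibull(shape, 2 * n_per_arm)   # survival times
        censor = followup - entry    # administrative censoring at study end
        time = np.minimum(surv, censor)
        event = (surv <= censor).astype(int)
        if logrank_stat(time, event, group) > crit:
            rejections += 1
    return rejections / n_sim
```

Power at a hazard ratio other than one follows the same pattern with the scale parameter of one arm adjusted to produce proportional hazards.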

The Chinese government has committed to reaching peak carbon emissions before 2030, which requires China to implement new policies. Using a CGE model, this study conducts simulations of the functioning of an energy tax and a carbon tax and analyzes their effects on macro-economic indices. The Chinese economy is affected at an acceptable level by the two taxes: GDP loses less than 0.8% with a carbon tax of 100, 50, or 10 RMB/ton CO2 or with an energy tax of 5% of the delivery price, and the loss of real disposable personal income is smaller still. Compared with implementing a single tax, a combined carbon and energy tax induces larger emission reductions at relatively smaller economic cost. Under these taxes, the domestic competitiveness of energy-intensive industries improves. Additionally, we find that the sooner such taxes are launched, the smaller the economic costs and the more significant the achieved emission reductions.

The vulnerability of grazing livestock systems highlights the need for tools to assess and mitigate the adverse impact of drought. The recent and rapid progress in remote sensing has awakened interest in tapping into potential applications, triggering intensive efforts to develop innovations in a number of spheres. One of these areas is climate risk management, where the use of vegetation indices facilitates the assessment of drought. This research analyzes drought impacts and evaluates the potential of new technologies such as remote sensing to manage drought risk in extensive livestock systems.
Three essays in drought risk management are developed to: (i) assess the economic impact of drought on a livestock farm in the Andalusian Dehesa, (ii) build drought vulnerability maps in Chilean grazing lands, and (iii) design and evaluate the potential of an index insurance policy to address the risk of drought in grazing lands in Coquimbo, Chile. In the first essay, a dynamic and stochastic farm model is designed combining climate, agronomic, socio-economic and ecological aspects to assess drought risk. The model simulates a representative livestock farm in the Dehesa of Andalusia for the period 1999-2010. Burn analysis and MonteCarlo simulation methods are used to identify the significance of various risk sources at the farm; most notably, early summer and early winter are identified as periods of peak risk. Moreover, there is a significant time lag between climate risk and economic risk, with the latter lasting longer than the former. It is shown that intensity, frequency and duration of the drought are three crucial attributes that shape its economic impact. Sensitivity analysis of farm management strategies demonstrates that lowering the stocking rate reduces farmer exposure to drought risk but entails a reduction in the expected gross margin. The second essay, mapping drought vulnerability in Chilean grazing lands, proposes and builds an index of economic risk (IRESP) that is replicable and simple to interpret. This methodology integrates risk factors and adaptation strategies to deliver a measure of Value at Risk: the maximum expected loss in a year at the 5% significance level. Mapping IRESP provides evidence of spatial patterns and significant differences in drought vulnerability across Chilean grazing lands. Spatial autocorrelation measures reveal that systemic risk is considerably larger in the South than in the Northern or Central regions. 
Furthermore, it is shown that vulnerability is not necessarily correlated with climate risk and that adaptation strategies do matter. These results show that IRESP conveys relevant information and that vulnerability maps may be useful tools for policy design and decision-making in drought risk management. The third essay develops a stochastic model to estimate the actuarially fair premium and evaluates the potential of an indexed insurance policy to manage drought risk in Coquimbo, a relevant livestock farming region of Chile. Basis risk, the imperfect correlation of the index with farmer losses, is identified in the literature as a main limitation of index insurance. A Bayesian approach is proposed to assess the impact on basis risk of alternative guidelines in contract design: i) a cluster zoning that considers space-time aspects, ii) a guarantee period bounded to fit phenological cycles, and iii) the triggering index threshold. Results show that both the proposed zoning and the guarantee period considerably reduce basis risk, whereas the triggering index threshold has an ambiguous effect. Cluster zoning also contributes to ameliorating the systemic risk faced by the insurer. These results highlight that adequate contract design may yield a double dividend: increasing farmers’ utility on the one hand and reducing the cost of insurance on the other. An efficient contract design, coupled with advances in remote sensing and an appropriate institutional framework, is the basis for the efficient operation of an insurance program. The new technologies offer significant potential for innovation in climate risk management. Progress in this field is capturing increasing attention and may provide important social gains in developing countries and vulnerable regions, where tools to efficiently manage systemic risks such as drought may be a means to foster development.
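The Value at Risk underlying IRESP, the maximum expected loss in a year at the 5% significance level, is the 95th percentile of the simulated loss distribution. A minimal Monte Carlo sketch (the lognormal loss model and its parameters are purely illustrative, not taken from the study):

```python
import numpy as np

def value_at_risk(losses, alpha=0.05):
    """VaR at significance alpha: the (1 - alpha) quantile of losses,
    i.e. the loss level exceeded with probability alpha."""
    return float(np.quantile(losses, 1.0 - alpha))

# hypothetical simulated annual losses for one grazing zone
rng = np.random.default_rng(42)
losses = rng.lognormal(mean=3.0, sigma=0.8, size=10_000)
var_5 = value_at_risk(losses)   # maximum expected loss at 5% significance
```

Computing this per zone and mapping the result is essentially what turns the farm-level simulation into a vulnerability map.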

This paper is concerned with the study of the berth system in port terminals. The main objective is to present the management methodologies, which include empirical methods, analytical methods and simulation methods. The comparison shows that these three methods are not independent but complementary: each has advantages and limitations that depend on the type of study performed.
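The complementarity noted above can be illustrated on the simplest berth model: treat the berths as an M/M/c queue, compute the mean waiting time analytically with the Erlang-C formula, and check it against a discrete-event simulation. All rates below are hypothetical:

```python
import math
import random

def erlang_c_wait(lam, mu, c):
    """Analytical method: mean M/M/c waiting time via the Erlang-C formula."""
    a = lam / mu                      # offered load
    rho = a / c                       # utilisation, must be < 1
    tail = a ** c / (math.factorial(c) * (1.0 - rho))
    p_wait = tail / (sum(a ** k / math.factorial(k) for k in range(c)) + tail)
    return p_wait / (c * mu - lam)

def simulate_wait(lam, mu, c, n_ships=200_000, seed=7):
    """Simulation method: ships arrive as a Poisson process, queue FIFO,
    and occupy the earliest-free berth; returns the mean waiting time."""
    rng = random.Random(seed)
    free_at = [0.0] * c               # time at which each berth frees up
    t = total_wait = 0.0
    for _ in range(n_ships):
        t += rng.expovariate(lam)                    # next arrival
        b = min(range(c), key=free_at.__getitem__)   # earliest-free berth
        start = max(t, free_at[b])
        total_wait += start - t
        free_at[b] = start + rng.expovariate(mu)     # service time
    return total_wait / n_ships

# hypothetical terminal: 2 berths, 1.5 arrivals/day, 1 service/day per berth
analytical = erlang_c_wait(1.5, 1.0, 2)   # = 9/7, about 1.29 days in queue
simulated = simulate_wait(1.5, 1.0, 2)
```

For realistic terminals (tidal windows, varying ship sizes, priority rules) closed-form results are rarely available and simulation takes over, which is exactly the complementarity the paper describes.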

Nanoinformatics has recently emerged to address the need for computing applications at the nano level. In this regard, the authors have participated in various initiatives to identify its concepts, foundations and challenges. While nanomaterials open up the possibility of developing new devices in many industrial and scientific areas, they also offer breakthrough perspectives for the prevention, diagnosis and treatment of diseases. In this paper, we analyze the different aspects of nanoinformatics and suggest five research topics to help catalyze new research and development in the area, particularly focused on nanomedicine. We also encompass the use of informatics to further the biological and clinical applications of basic research in nanoscience and nanotechnology, and the related concept of an extended 'nanotype' to coalesce information related to nanoparticles. We suggest how nanoinformatics could accelerate developments in nanomedicine, similarly to what happened with the Human Genome and other -omics projects, on issues such as exchanging modeling and simulation methods and tools, linking toxicity information to clinical and personal databases, or developing new approaches for scientific ontologies, among many others.

In recent years the missing fourth circuit element, the memristor, was successfully synthesized. However, the mathematical complexity and variety of the models behind this component, in addition to the existence of convergence problems in the simulations, make the design of memristor-based applications long and difficult. In this work we present a memristor model characterization framework which supports the automated generation of subcircuit files. The proposed environment allows the designer to choose and parameterize the memristor model that best suits a given application. The framework carries out characterizing simulations in order to study possible non-convergence problems, resolving the dependence on the simulation conditions and guaranteeing the functionality and performance of the design. Additionally, the occurrence of undesirable effects related to PVT variations is also taken into account: by performing a Monte Carlo or a corner analysis, the designer becomes aware of the safety margins that assure correct device operation.
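As a flavour of what such subcircuit models compute, here is a minimal sketch of the linear ion-drift memristor model, written in Python rather than SPICE; Ron, Roff, D and mu_v are textbook-style illustrative values, not those of a characterized device:

```python
import numpy as np

def simulate_memristor(v, dt=1e-4, Ron=100.0, Roff=16e3, D=10e-9,
                       mu_v=1e-14, x0=0.5):
    """Linear ion-drift model: the state x in [0, 1] is the doped fraction
    of the device; resistance interpolates between Ron and Roff."""
    x = x0
    current = np.empty_like(v)
    for k, vk in enumerate(v):
        resistance = Ron * x + Roff * (1.0 - x)
        i = vk / resistance
        x += dt * mu_v * Ron / D ** 2 * i   # dx/dt = mu_v * Ron / D^2 * i
        x = min(max(x, 0.0), 1.0)           # hard boundary clamp
        current[k] = i
    return current

# drive with a 50 Hz sine and record the current response
t = np.arange(0.0, 0.02, 1e-4)
v = np.sin(2 * np.pi * 50 * t)
i = simulate_memristor(v)
```

Plotting i against v yields the pinched hysteresis loop characteristic of memristive devices; stiff state equations of this kind, especially near the x = 0 and x = 1 boundaries, are a typical source of the convergence problems such a framework screens for.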

Dynamic thermal management techniques require a collection of on-chip thermal sensors that imply a significant area and power overhead. Finding the optimum number of temperature monitors and their location on the chip surface to optimize accuracy is an NP-hard problem. In this work we improve the modeling of the problem by including area, power and networking constraints along with the consideration of three inaccuracy terms: spatial errors, sampling rate errors and monitor-inherent errors. The problem is solved by the simulated annealing algorithm. We apply the algorithm to a test case employing three different types of monitors to highlight the importance of the different metrics. Finally we present a case study of the Alpha 21364 processor under two different constraint scenarios.
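A stripped-down version of the simulated-annealing placement can be sketched as follows; here the objective is only a spatial-error proxy (distance from hotspots to their nearest monitor), while the area, power, networking and monitor-inherent error terms of the full model are omitted, and all coordinates are made up:

```python
import math
import random

def anneal_placement(hotspots, n_monitors=3, grid=16, steps=20_000,
                     t0=2.0, cooling=0.9995, seed=3):
    """Simulated annealing on a grid x grid die: minimise the summed
    distance from each hotspot to its nearest thermal monitor."""
    rng = random.Random(seed)

    def cost(monitors):
        return sum(min(math.dist(h, m) for m in monitors) for h in hotspots)

    mon = [(rng.randrange(grid), rng.randrange(grid))
           for _ in range(n_monitors)]
    cur_cost = cost(mon)
    best, best_cost = list(mon), cur_cost
    temp = t0
    for _ in range(steps):
        cand = list(mon)                 # move one monitor to a random cell
        cand[rng.randrange(n_monitors)] = (rng.randrange(grid),
                                           rng.randrange(grid))
        c = cost(cand)
        # accept improvements always, worsenings with Boltzmann probability
        if c < cur_cost or rng.random() < math.exp((cur_cost - c) / temp):
            mon, cur_cost = cand, c
            if c < best_cost:
                best, best_cost = list(cand), c
        temp *= cooling                  # geometric cooling schedule
    return best, best_cost
```

The full problem replaces this toy cost with the weighted sum of spatial, sampling-rate and monitor-inherent error terms subject to area, power and networking constraints, but the annealing loop is the same.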