964 results for Model combination
Abstract:
The geometries of a catchment constitute the basis for distributed, physically based numerical modeling in different geoscientific disciplines. In this paper, results from ground-penetrating radar (GPR) measurements, in the form of a 3D model of total sediment thickness and active layer thickness in a periglacial catchment in western Greenland, are presented. Using the topography, the thickness and distribution of sediments are calculated. Vegetation classification and GPR measurements are used to scale active layer thickness from local measurements to catchment-scale models. Annual maximum active layer thickness varies from 0.3 m in wetlands to 2.0 m in barren areas and areas of exposed bedrock. Maximum sediment thickness is estimated to be 12.3 m in the major valleys of the catchment. A method to correlate surface vegetation with active layer thickness is also presented. By using relatively simple methods, such as probing and vegetation classification, it is possible to upscale local point measurements to catchment-scale models in areas where the upper subsurface is relatively homogeneous. The resulting spatial model of active layer thickness can be used in combination with the sediment model as a geometrical input to further numerical-modelling studies of subsurface mass transport and hydrological flow paths in the periglacial catchment.
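A hedged sketch of the vegetation-based upscaling step described above: each mapped vegetation class is assigned a representative active layer thickness (ALT) calibrated from local probing, and a classified raster is mapped to an ALT grid. Only the wetland (0.3 m) and barren/bedrock (2.0 m) end members come from the abstract; the "shrub" class and its value are hypothetical.

```python
import numpy as np

# Representative annual maximum active layer thickness (m) per vegetation
# class. End-member values follow the abstract; "shrub" and its value are
# a hypothetical intermediate class for illustration only.
ALT_BY_CLASS = {"wetland": 0.3, "shrub": 1.0, "barren": 2.0}

def upscale_alt(class_map):
    """Map a classified vegetation raster (strings) to an ALT raster (m)."""
    lookup = np.vectorize(ALT_BY_CLASS.get)
    return lookup(class_map).astype(float)

grid = np.array([["wetland", "shrub"],
                 ["barren", "barren"]])
print(upscale_alt(grid))  # [[0.3 1.0], [2.0 2.0]]
```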
Abstract:
It is commonly understood that the observed decline in precipitation in South-West Australia during the 20th century was caused by anthropogenic factors. Candidate causes are changes to large-scale atmospheric circulation due to global warming, extensive deforestation, and anthropogenic aerosol emissions, all of which act on different spatial and temporal scales. This contribution focuses on the role of the rapid rise in aerosol emissions from anthropogenic sources in South-West Australia around 1970. An analysis of historical long-term rainfall data from the Bureau of Meteorology shows that South-West Australia as a whole experienced a gradual decline in precipitation over the 20th century. However, on smaller scales, and for the particular example of the Perth catchment area, a sudden drop in precipitation around 1970 is apparent. Modelling experiments at a convection-resolving resolution of 3.3 km using the Weather Research and Forecasting (WRF) model version 3.6.1 with the aerosol-aware Thompson-Eidhammer microphysics scheme are conducted for the period 1970-1974. A comparison of four runs with different prescribed aerosol emissions and without aerosol effects demonstrates that tripling the pre-1960s atmospheric CCN and IN concentrations can suppress precipitation by 2-9%, depending on the area and the season. This suggests that a combination of all three processes is required to account for the gradual decline in rainfall seen for greater South-West Australia and for the sudden drop observed in areas along the West Coast in the 1970s: changing atmospheric circulations, deforestation and anthropogenic aerosol emissions.
Abstract:
This data set provides a high-resolution digital elevation model (DEM) of a thermokarst depression (~7 km²) on ice-complex deposits in the Arctic Lena Delta, Siberia. The DEM is based on a geodetic field survey and was used for quantitative land surface analyses and a detailed description of the thermokarst depression morphology. Detailed morphometrical analyses, volume calculations, and solar radiation modeling were performed and statistically analyzed by Ulrich et al. (2010) to investigate the asymmetrical thermokarst depression development and directed lake migration previously proposed by Morgenstern et al. (2008). Furthermore, the high-resolution DEM in combination with satellite data allowed detailed analyses of spatial and temporal landscape changes due to thermokarst development (Günther, 2009).
Abstract:
We have recently demonstrated a biosensor based on a lattice of SU8 pillars on a 1 μm SiO2/Si wafer, interrogated by measuring vertical reflectivity as a function of wavelength. Biodetection has been proven with the combination of Bovine Serum Albumin (BSA) protein and its antibody (antiBSA). A BSA layer is attached to the pillars; the biorecognition of antiBSA produces a shift in the reflectivity curve, related to the concentration of antiBSA. A detection limit on the order of 2 ng/ml is achieved for a rhombic lattice of pillars with a lattice parameter (a) of 800 nm, a height (h) of 420 nm and a diameter (d) of 200 nm. These results correlate with calculations using the 3D finite-difference time-domain method. A simplified 2D model is proposed, consisting of a multilayer model in which the pillars are turned into a 420 nm layer with an effective refractive index obtained using a Beam Propagation Method (BPM) algorithm. Results provided by this model correlate well with the experimental data while reducing the computation time from one day to 15 minutes, giving a fast but accurate tool for optimizing the design and maximizing sensitivity, and allowing the influence of the different variables (diameter, height and lattice parameter) to be analyzed. Sensitivity is obtained for a variety of configurations, reaching a limit of detection under 1 ng/ml. The optimum design is chosen not only for its sensitivity but also for its feasibility, from both the fabrication (limited by the aspect ratio and proximity of the pillars) and the fluidic points of view. (© 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
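As a hedged illustration of such a simplified multilayer model, the sketch below computes the normal-incidence reflectance of a thin-film stack with the standard characteristic-matrix method. The effective index of the homogenised pillar layer (1.25), the substrate index, the probe wavelength, and the protein-layer parameters are illustrative assumptions, not values from the paper (there, the effective index comes from the BPM calculation).

```python
import numpy as np

def reflectance(wl_nm, layers, n_in=1.0, n_sub=3.85):
    """Normal-incidence reflectance of a thin-film stack via the
    characteristic (transfer) matrix method (Born & Wolf).
    layers: list of (refractive_index, thickness_nm) tuples."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2 * np.pi * n * d / wl_nm  # phase thickness of the layer
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])
    r = (n_in * B - C) / (n_in * B + C)
    return float(abs(r) ** 2)

# Pillar lattice homogenised to an assumed effective index of 1.25,
# 420 nm thick, on 1 um of SiO2 (n ~ 1.45) over a Si substrate.
stack = [(1.25, 420.0), (1.45, 1000.0)]
print(reflectance(633.0, stack))
# A thin adsorbed protein layer (assumed n = 1.45, 5 nm) shifts the curve:
print(reflectance(633.0, [(1.45, 5.0)] + stack))
```

Scanning `wl_nm` reproduces the kind of reflectivity-versus-wavelength curve whose shift encodes the antiBSA concentration.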
Abstract:
A combination of the Method of Moments (MoM) and a compound-slot Equivalent Circuit Model for linear array design is presented in this document. From the S matrix of the single element, the most suitable network for its characterization is analyzed and selected. Then, according to the radiation requirements of the desired array, the elements are designed and properly connected by means of the Forward Matching Procedure (FMP), which takes impedance matching into account in order to keep the input matched at the design frequency. Comparisons between HFSS simulations and MoM-FMP results are also presented. The first part of this work was introduced in (1)(2), but a summary is included here for ease of understanding.
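For orientation only: one textbook step in deriving an equivalent network from a measured or simulated S matrix is the conversion to impedance parameters, assuming a common reference impedance $Z_0$ at all ports; a T- or Π-network can then be fitted to the resulting $Z$ matrix. This is a standard relation, not the specific procedure of the paper:

$$Z = Z_0\,(I + S)\,(I - S)^{-1}$$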
Abstract:
OntoTag - A Linguistic and Ontological Annotation Model Suitable for the Semantic Web
1. INTRODUCTION. LINGUISTIC TOOLS AND ANNOTATIONS: THEIR LIGHTS AND SHADOWS
Computational Linguistics is already a consolidated research area. It builds upon the results of two other major areas, namely Linguistics and Computer Science and Engineering, and it aims at developing computational models of human language (or natural language, as it is termed in this area). Possibly, its best-known applications are the different tools developed so far for processing human language, such as machine translation systems and speech recognizers or dictation programs.
These tools for processing human language are commonly referred to as linguistic tools. Apart from the examples mentioned above, there are also other types of linguistic tools that perhaps are not so well-known, but on which most of the other applications of Computational Linguistics are built. These other types of linguistic tools comprise POS taggers, natural language parsers and semantic taggers, amongst others. All of them can be termed linguistic annotation tools.
Linguistic annotation tools are important assets. In fact, POS and semantic taggers (and, to a lesser extent, also natural language parsers) have become critical resources for the computer applications that process natural language. Hence, any computer application that has to analyse a text automatically and ‘intelligently’ will include at least a module for POS tagging. The more an application needs to ‘understand’ the meaning of the text it processes, the more linguistic tools and/or modules it will incorporate and integrate.
However, linguistic annotation tools still have some limitations, which can be summarised as follows:
1. Normally, they perform annotations only at a certain linguistic level (that is, Morphology, Syntax, Semantics, etc.).
2. They usually introduce a certain rate of errors and ambiguities when tagging. This error rate ranges from 10 percent up to 50 percent of the units annotated for unrestricted, general texts.
3. Their annotations are most frequently formulated in terms of an annotation schema designed and implemented ad hoc.
A priori, it seems that the interoperation and the integration of several linguistic tools into an appropriate software architecture could most likely solve the limitations stated in (1). Besides, integrating several linguistic annotation tools and making them interoperate could also minimise the limitation stated in (2). Nevertheless, in the latter case, all these tools should produce annotations for a common level, which would have to be combined in order to correct their corresponding errors and inaccuracies. Yet, the limitation stated in (3) prevents both types of integration and interoperation from being easily achieved.
In addition, most high-level annotation tools rely on other, lower-level annotation tools and their outputs to generate their own. For example, sense-tagging tools (operating at the semantic level) often use POS taggers (operating at a lower level, i.e., the morphosyntactic one) to identify the grammatical category of the word or lexical unit they are annotating. Accordingly, if a faulty or inaccurate low-level annotation tool is to be used by another, higher-level one in its process, the errors and inaccuracies of the former should be minimised in advance. Otherwise, these errors and inaccuracies would be transferred to (and even magnified in) the annotations of the high-level annotation tool.
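A toy, hypothetical sketch of the combination idea discussed above: several POS taggers annotate the same token stream, and a per-token majority vote corrects part of their individual errors. It assumes the taggers share a tagset, which is exactly what the ad hoc annotation schemas of limitation (3) obstruct.

```python
from collections import Counter

def combine_pos_tags(annotations):
    """Majority-vote combination of aligned per-token POS tags.

    annotations: list of tag sequences, one per tagger, aligned by token.
    Assumes a shared tagset across taggers (a strong assumption here).
    """
    combined = []
    for token_tags in zip(*annotations):
        tag, _count = Counter(token_tags).most_common(1)[0]
        combined.append(tag)
    return combined

# Three hypothetical taggers annotating the two tokens "time flies":
print(combine_pos_tags([["NN", "VBZ"], ["NN", "NNS"], ["VB", "VBZ"]]))
# -> ['NN', 'VBZ']; the second tagger's error on "flies" is outvoted.
```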
Therefore, it would be quite useful to find a way to
(i) correct or, at least, reduce the errors and the inaccuracies of lower-level linguistic tools;
(ii) unify the annotation schemas of different linguistic annotation tools or, more generally speaking, make these tools (as well as their annotations) interoperate.
Clearly, solving (i) and (ii) should ease the automatic annotation of web pages by means of linguistic tools, and their transformation into Semantic Web pages (Berners-Lee, Hendler and Lassila, 2001). Yet, as stated above, (ii) is a type of interoperability problem. There again, ontologies (Gruber, 1993; Borst, 1997) have been successfully applied thus far to solve several interoperability problems. Hence, ontologies should also help solve the aforementioned problems and limitations of linguistic annotation tools.
Thus, to summarise, the main aim of the present work was to somehow combine these separate approaches, mechanisms and tools for annotation from Linguistics and Ontological Engineering (and the Semantic Web) into a sort of hybrid (linguistic and ontological) annotation model, suitable for both areas. This hybrid (semantic) annotation model should (a) benefit from the advances, models, techniques, mechanisms and tools of these two areas; (b) minimise (and even solve, when possible) some of the problems found in each of them; and (c) be suitable for the Semantic Web. The concrete goals that helped attain this aim are presented in the following section.
2. GOALS OF THE PRESENT WORK
As mentioned above, the main goal of this work was to specify a hybrid (that is, linguistically-motivated and ontology-based) model of annotation suitable for the Semantic Web (i.e. it had to produce a semantic annotation of web page contents). This entailed that the tags included in the annotations of the model had to (1) represent linguistic concepts (or linguistic categories, as they are termed in ISO/DCR (2008)), in order for this model to be linguistically-motivated; (2) be ontological terms (i.e., use an ontological vocabulary), in order for the model to be ontology-based; and (3) be structured (linked) as a collection of ontology-based
Abstract:
Social behaviour is mainly based on swarm colonies, in which each individual shares its knowledge about the environment with other individuals to obtain optimal solutions. Such a co-operative model differs from competitive models in that individuals die and are born by combining the information of living ones. This paper presents a particle swarm optimization algorithm with differential evolution used to train a neural network, instead of the classic backpropagation algorithm. The performance of a neural network on a particular problem is critically dependent on the choice of the processing elements, the net architecture and the learning algorithm. This work concerns the development of methods for the evolutionary design of artificial neural networks, and focuses on optimizing the topology and connectivity structure of these networks.
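A minimal sketch of the general idea (population-based training of network weights in place of backpropagation): plain global-best PSO minimising the mean squared error of a small feed-forward network on XOR. The differential-evolution recombination that the paper adds is omitted here, and the architecture and all hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR toy problem
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

H = 4                      # hidden units (illustrative)
DIM = 4 * H + 1            # W1 (2xH) + b1 (H) + w2 (H) + b2 (1)

def forward(w, X):
    """One-hidden-layer network evaluated from a flat weight vector."""
    W1 = w[:2 * H].reshape(2, H)
    b1 = w[2 * H:3 * H]
    w2 = w[3 * H:4 * H]
    b2 = w[-1]
    h = np.tanh(X @ W1 + b1)
    return 1 / (1 + np.exp(-(h @ w2 + b2)))   # sigmoid output

def loss(w):
    return np.mean((forward(w, X) - y) ** 2)

# Plain global-best PSO over the flattened weight vector
n_particles, iters = 30, 300
pos = rng.normal(0.0, 1.0, (n_particles, DIM))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([loss(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, DIM))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([loss(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("MSE:", loss(gbest), "outputs:", forward(gbest, X).round(2))
```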
Abstract:
Following the Integrated Water Resources Management approach, the European Water Framework Directive requires Member States to develop water management plans at the catchment level. Those plans have to integrate the different interests and must be developed with stakeholder participation. To meet these requirements, managers need tools to assess the impacts of possible management alternatives on natural and socio-economic systems. These tools should ideally be able to address the complexity and uncertainties of the water system, while serving as a platform for stakeholder participation. The objective of our research was to develop a participatory integrated assessment model, based on the combination of a crop model, an economic model and a participatory Bayesian network, with an application in the middle Guadiana sub-basin, in Spain. The methodology is intended to capture the complexity of water management problems, incorporating the relevant sectors as well as the relevant scales involved in water management decision making. The integrated model has allowed us to test different management, market and climate change scenarios and to assess the impacts of such scenarios on the natural system (crops), on the socio-economic system (farms) and on the environment (water resources). Finally, this integrated assessment modelling process has allowed stakeholder participation, complying with the main requirements of current European water laws.
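To illustrate the kind of propagation a Bayesian network performs in such an integrated model, here is a deliberately tiny, hypothetical three-node chain (climate, crop yield, farm income) evaluated by exhaustive enumeration. The node names and probabilities are invented for illustration and bear no relation to the Guadiana application.

```python
# Hypothetical chain: climate -> yield -> income, with invented numbers.
P_dry = 0.4                                     # P(climate = dry)
P_low_yield = {"dry": 0.7, "wet": 0.2}          # P(yield = low | climate)
P_low_income = {"low": 0.8, "high": 0.1}        # P(income = low | yield)

def p_low_income():
    """Marginal P(income = low) by summing over all hidden states."""
    total = 0.0
    for climate, p_c in (("dry", P_dry), ("wet", 1 - P_dry)):
        for yld, p_y in (("low", P_low_yield[climate]),
                         ("high", 1 - P_low_yield[climate])):
            total += p_c * p_y * P_low_income[yld]
    return total

print(p_low_income())
# 0.4*0.7*0.8 + 0.4*0.3*0.1 + 0.6*0.2*0.8 + 0.6*0.8*0.1 = 0.38
```

Changing `P_dry` is the enumeration-level analogue of running a climate scenario through the network.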
Abstract:
This article presents a new material model developed with the aim of analyzing failure of blunt-notched components made of nonlinear brittle materials. The model, which combines the cohesive crack model with Hencky's theory of total deformations, is used to simulate an experimental benchmark previously carried out by the authors. This combination is achieved through the embedded crack approach. In spite of the unavailability of precise material data, the numerical predictions obtained show good agreement with the experimental results.
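For reference, Hencky's theory of total deformations relates the total strains directly to the current stresses through secant moduli; one standard J2 deformation-theory form, split into volumetric and deviatoric parts, is

$$\varepsilon_{ij} = \frac{\sigma_{kk}}{9K}\,\delta_{ij} + \frac{3\,\bar{\varepsilon}}{2\,\bar{\sigma}}\, s_{ij},$$

where $s_{ij}$ is the stress deviator, $\bar{\sigma}$ the equivalent stress, $\bar{\varepsilon}$ the equivalent total strain given by the material's nonlinear $\bar{\sigma}$–$\bar{\varepsilon}$ curve, and $K$ the bulk modulus. How the paper couples this to the cohesive crack via the embedded crack approach is not reproduced here.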
Abstract:
A coupled elastoplastic-damage constitutive model with a Lode angle dependent failure criterion for high strain and ballistic applications is presented. A Lode angle dependent function is added to the equivalent-plastic-strain-to-failure definition of the Johnson–Cook failure criterion. The weakening in the elastic law and in the Johnson–Cook-like constitutive relation implicitly introduces the Lode angle dependency into the elastoplastic behaviour. The material model is calibrated for precipitation-hardened Inconel 718 nickel-base superalloy. The combination of a Lode angle dependent failure criterion with weakened constitutive equations is shown to predict the fracture patterns of the mechanical tests performed and to provide reliable results. Additionally, the dependence of the predicted fracture patterns on mesh size was studied, showing that it is crucial for predicting such patterns.
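For orientation, the reference Johnson–Cook failure strain to which a Lode angle dependent function is added has the form below; showing the Lode term as a multiplicative factor $f(\bar{\theta})$ is a hedged guess at the form of the modification, not the paper's exact expression:

$$\bar{\varepsilon}_f = \left[D_1 + D_2\, e^{D_3 \eta}\right]\left[1 + D_4 \ln \dot{\varepsilon}^{*}\right]\left[1 + D_5\, T^{*}\right] f(\bar{\theta}),$$

with $\eta$ the stress triaxiality, $\dot{\varepsilon}^{*}$ the dimensionless plastic strain rate, $T^{*}$ the homologous temperature, and $D_1,\dots,D_5$ the calibration constants (here calibrated for Inconel 718).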
Abstract:
An innovative background modeling technique that is able to accurately segment foreground regions in RGB-D imagery (RGB plus depth) is presented in this paper. The technique is based on a Bayesian framework that efficiently fuses different sources of information to segment the foreground. In particular, the final segmentation is obtained by considering a prediction of the foreground regions, carried out by a novel Bayesian Network with a depth-based dynamic model, and by considering two independent depth- and color-based mixture-of-Gaussians background models. The efficient Bayesian combination of all these data reduces the noise and uncertainties introduced by the color and depth features and the corresponding models. As a result, more compact segmentations and refined foreground object silhouettes are obtained. Experimental results with different databases suggest that the proposed technique outperforms existing state-of-the-art algorithms.
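A heavily simplified sketch of this kind of per-pixel Bayesian fusion, assuming conditional independence of colour and depth given the label: single Gaussians stand in for the full mixtures, there is no dynamic prediction network, and the uniform foreground likelihood and all numbers are illustrative.

```python
import numpy as np

def gaussian(x, mu, var):
    """1D Gaussian density."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def foreground_posterior(color, depth, bg_color, bg_depth, prior_fg=0.3):
    """Naive-Bayes fusion of independent colour and depth background models.
    bg_color / bg_depth: (mean, variance) of per-pixel background Gaussians.
    A single mode replaces the full mixture, and the foreground likelihood
    is taken as uniform -- both simplifications of the paper's framework."""
    like_bg = gaussian(color, *bg_color) * gaussian(depth, *bg_depth)
    like_fg = 1e-3                      # uniform foreground likelihood
    num = prior_fg * like_fg
    return num / (num + (1 - prior_fg) * like_bg)

# A pixel matching the background model -> posterior near 0:
print(foreground_posterior(color=120.0, depth=2.5,
                           bg_color=(118.0, 25.0), bg_depth=(2.5, 0.01)))
# A pixel far from it in both cues -> posterior near 1:
print(foreground_posterior(color=200.0, depth=1.0,
                           bg_color=(118.0, 25.0), bg_depth=(2.5, 0.01)))
```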
Abstract:
The rise and penetration of new technologies, together with the so-called Social Web, are changing the way we access medicine. More and more patients and medical professionals are creating and consuming digital resources with clinical content over the Internet, raising the problem of how to ensure the reliability of these resources. In addition, a new concept is emerging, that of pervasive healthcare, driven by patients who demand access to healthcare services at any time and in any place. This new scenario brings with it a problem of trust in healthcare service providers. eLearning platforms are establishing themselves as a paradigm of this new Medicine 2.0, since they provide open yet controlled/supervised access to digital resources and facilitate interactions and consultations between users, making them a good approach to this ubiquitous healthcare. In these environments, the problems of reliability and trust can be solved by implementing mechanisms for the trusted recommendation of resources and people. Traditionally, eLearning platforms already include recommendation mechanisms, although these are more focused on the recommendation of resources. For the recommendation of users it is necessary to turn to more elaborate mechanisms, such as trust and reputation systems. In both cases, the recommendation of resources and the computation of users' reputation rely mainly on subjective criteria, such as the opinions of the users. In this PhD thesis we propose a new trust and reputation model that combines automatic evaluations of the digital resources in an eLearning platform with the opinions expressed by users as a result of their interactions with other users or after consuming a resource. The novelty of the approach lies in the combination of an objective component with a subjective one, seeking to mitigate the effect of possible subjective punishments by malicious users, while enriching the objective evaluations with additional information about the pedagogical capacity of the resource or the person. The result is recommendations that are always adapted to users' requirements and of the highest technical and educational quality. This new approach requires a new tool for its in-silico validation, since no existing application allows the simulation of eLearning platforms with mechanisms for the recommendation of resources and people in which, in addition, resources are evaluated objectively. This research work therefore proposes a new tool, based on the intelligent agent-oriented programming paradigm, for modelling complex user behaviours in eLearning platforms. The tool also allows simulating the operation of this kind of environment dedicated to knowledge exchange. The evaluation of the work proposed in this thesis has been carried out iteratively across different scenarios in which the system was confronted with a wide range of user behaviours.
The performance of the proposed trust and reputation model has been compared against two traditional recommendation modes: a) using only the subjective opinions of users to compute reputation and, by extension, the recommendation; and b) taking into account only the objective quality of the resource, without any reputation computation. The results obtained allow us to state that the developed model improves the recommendation offered by traditional approaches, showing greater flexibility and adaptability to different situations. Furthermore, the proposed model is able to ensure the recommendation of new users entering the system, in contrast to the null recommendation these users receive under the recommendation mode prevalent in other platforms, which base recommendation only on the opinions of other users. Finally, the intelligent agents paradigm has proven its worth for modelling complex virtual platforms oriented to knowledge exchange, and especially for modelling and simulating the behaviour of the users of these environments. The simulation tool developed has enabled the evaluation of the trust and reputation model proposed in this thesis in a wide range of different situations.
ABSTRACT
Internet is changing everything, and this revolution is especially present in traditionally offline spaces such as medicine. In recent years, health consumers and health service providers have been actively creating and consuming Web content, stimulated by the emergence of the Social Web. Reliability stands out as the main concern when accessing the overwhelming amount of information available online. Along with this new way of accessing medicine, new concepts like ubiquitous or pervasive healthcare are appearing. Trustworthiness assessment is gaining relevance: open health provisioning systems require mechanisms that help evaluate individuals' reputation, in pursuit of introducing safety into these open and dynamic environments. Technology Enhanced Learning (TEL) platforms, commonly known as eLearning platforms, arise as a paradigm of this Medicine 2.0. They provide open yet controlled/supervised access to resources generated and shared by users, enhancing what is being called informal learning. TEL systems also facilitate direct interactions amongst users for consultation, resulting in a good approach to ubiquitous healthcare. The aforementioned reliability and trustworthiness problems can be faced by the implementation of mechanisms for the trusted recommendation of both resources and healthcare service providers. Traditionally, eLearning platforms already integrate recommendation mechanisms, although these recommendations are basically focused on providing an ordered classification of resources. For users' recommendation, the implementation of trust and reputation systems appears as the best solution. Nevertheless, both approaches base the recommendation on the subjective opinions of other users of the platform regarding the resources or the users. In this PhD work, a novel approach is presented for the recommendation of both resources and users within open environments focused on knowledge exchange, as is the case of TEL systems for ubiquitous healthcare. The proposed solution adds the objective evaluation of the resources to the traditional subjective personal opinions to estimate the reputation of the resources and of the users of the system.
This combined measure, along with the reliability of that calculation, is used to provide trusted recommendations. The integration of opinions and evaluations, subjective and objective, allows the model to defend itself against misbehaviours. Furthermore, it also allows 'colouring' cold evaluation values by providing additional quality information, such as the educational capacities of a digital resource in an eLearning system. As a result, the recommendations are always adapted to user requirements and are of the maximum technical and educational quality. To our knowledge, the combination of objective assessments and subjective opinions to provide recommendations has not been considered before in the literature. Therefore, for the evaluation of the trust and reputation model defined in this PhD thesis, a new simulation tool has been developed following the agent-oriented programming paradigm. The multi-agent approach allows easy modelling of independent and proactive behaviours for the simulation of the users of the system, forming a faithful resemblance of real users of TEL platforms. For the evaluation of the proposed work, an iterative approach has been followed, testing the performance of the trust and reputation model while providing recommendations in a varied range of scenarios. A comparison with two traditional recommendation mechanisms was performed: a) using only users' past opinions about a resource and/or other users; and b) not using any reputation assessment and providing the recommendation directly from the objective quality of the resources. The results show that the developed model improves on traditional approaches at providing recommendations in Technology Enhanced Learning (TEL) platforms, presenting a higher adaptability to different situations, whereas traditional approaches only give good results under favourable conditions. Furthermore, the promotion-period mechanism implemented successfully helps new users in the system, as well as the resources created by them, to be recommended for direct interactions. On the contrary, the OnlyOpinions mode fails completely and new users are never recommended, while traditional approaches only work partially. Finally, the agent-oriented programming (AOP) paradigm has proven its validity at modelling users' behaviours in TEL platforms. The characteristics of intelligent software agents matched the main requirements of the simulation tool. The proactivity, sociability and adaptability of the developed agents allowed reproducing real users' actions and attitudes throughout the diverse situations defined in the evaluation framework. The result was independent users accessing different resources and communicating amongst themselves to fulfil their needs, basing these interactions on the recommendations provided by the reputation engine.
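A minimal sketch of the core combination, under loud assumptions: a single blending weight, plain averages, and a trivial cold-start rule. The thesis model (with its reliability measure and promotion period) is considerably richer; names and numbers here are illustrative only.

```python
def reputation(objective_scores, opinions, alpha=0.6):
    """Blend automatic (objective) quality assessments of a resource with
    users' subjective opinions, so that malicious ratings cannot fully
    sink a good resource. Illustrative sketch, not the thesis model."""
    obj = sum(objective_scores) / len(objective_scores)
    subj = sum(opinions) / len(opinions) if opinions else obj  # cold start
    return alpha * obj + (1 - alpha) * subj

# A well-built resource (objective 0.9) under a subjective attack (all 0.1):
print(reputation([0.9], [0.1, 0.1, 0.1]))   # 0.58 rather than 0.1
# A brand-new resource with no opinions yet can still be recommended:
print(reputation([0.8], []))                # 0.8
```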
Abstract:
Martensitic transformations (MT) are defined as a change in the crystal structure to form a coherent phase, or multi-variant domain structures, out of the parent phase with the same composition, through small shuffles or co-operative atomic movements. Over the past century, MTs have been discovered in different materials, from steels to shape memory alloys, ceramic materials and smart materials. All of them show remarkable properties such as high mechanical strength, shape memory, superelasticity effects, or ferroic functionalities such as piezoelectricity, electro- and magneto-striction, etc. Various models/theories have been developed, in synergy with the development of solid state physics, to understand why MTs generate such varied and rich microstructures exhibiting very interesting properties. Among the best-accepted theories is the Phenomenological Theory of Martensitic Crystallography (PTMC), which predicts the habit plane and the orientation relationships between austenite and martensite. The re-interpretation of the PTMC theory within a continuum mechanics framework (CM-PTMC) explains the formation of multivariant domain structures, while the Landau theory with inertial dynamics unravels the physical mechanisms of precursors and other dynamic behaviours. Crystal lattice dynamics reveals the acoustic softening of the lattice strain waves that gives rise to weak first-order displacive transformations. Despite the differences between the static and dynamic theories, given their origins in different branches of physics (for example, continuum mechanics or crystal lattice dynamics), these theories must be inherently connected with each other and show certain common elements within a unified perspective of physics. However, the physical connections and differences among the theories/models have not been addressed to date, even though they are of critical importance for improving MT models and for the integrated development of models of coupled displacive-diffusive transformations. Therefore, this thesis began with two clear objectives. The first was to find the physical connections and differences among the MT models by means of detailed theoretical analysis and numerical simulations. The second objective was to expand the Landau model to be able to study MTs in polycrystals, in the case of coupled displacive-diffusive transformations, and in the presence of dislocations. Starting with a review of the background, this work presents the physical foundations of the current MT models. Their capacity to predict MTs is clarified by means of theoretical analysis and simulations of the microstructural evolution of cubic-to-tetragonal and cubic-to-trigonal MTs in 3D. This analysis reveals that the Landau model with an irreducible representation of the transformed strain is equivalent to the CM-PTMC theory and to the microelasticity model in predicting the static features of the MT, but provides a better interpretation of the dynamic behaviours. However, the applications of the Landau model in structural materials are limited by its complexity. Therefore, the first result of this thesis is the development of a nonlinear Landau model with an irreducible representation of the strains and inertial dynamics for polycrystals.
The simulation demonstrates that the proposed model is physically consistent with the CM-PTMC in its static description, and also enables the prediction of the classical 'C-shaped' phase diagram of martensitic nucleation modes activated by the combination of quenching temperatures and applied stress conditions interplaying with the Landau transformation energy. Subsequently, the Landau model of MT is integrated with a quantitative diffusional transformation model to elucidate the atomic relaxation and the short-range diffusion of elements during the MT in steel. The model for displacive-diffusive transformations includes the effects of grain boundary relaxation for heterogeneous nucleation and the spatio-temporal evolution of diffusion potentials and chemical mobilities through coupling with CALPHAD-type thermo-kinetic calculation tools and databases. The model is applied to study the microstructural evolution of polycrystalline carbon steels processed by quenching and partitioning (Q&P) in 2D. The microstructure and composition obtained in the simulation are compared with the available experimental data. The results show the important role played by the differences in diffusion mobility between the austenite and martensite phases in the carbon distribution in steels. Finally, a multi-field model is proposed by incorporating a coarse-grained dislocation model into the developed Landau model to include the morphological differences between steels and shape memory alloys with the same symmetry breaking. Dislocation nucleation, the formation of 'butterfly' martensite, and the redistribution of carbon after tempering are well represented in the 2D simulations of the microstructural evolution of representative steels. With these simulations we demonstrate that, by including dislocations, we obtain for these steels a good match with the experimental data on twin boundary morphology, the existence of retained austenite within the martensite, etc. Thus, based on an integrated model and on the codes developed during this thesis, a multiscale, multi-field modelling tool has been created. This tool couples thermodynamics and continuum mechanics at the macroscale with diffusion kinetics and phase field/Landau models at the mesoscale, and also includes the principles of crystallography and crystal lattice dynamics at the microscale.
ABSTRACT
Martensitic transformation (MT), in a narrow sense, is defined as the change of the crystal structure to form a coherent phase, or multi-variant domain structures, out of a parent phase with the same composition, by small shuffles or co-operative movements of atoms. Over the past century, MTs have been discovered in different materials, from steels to shape memory alloys, ceramics, and smart materials. They lead to remarkable properties such as high strength, shape memory/superelasticity effects or ferroic functionalities including piezoelectricity, electro- and magneto-striction, etc. Various theories/models have been developed, in synergy with the development of solid state physics, to understand why MTs can generate these rich microstructures and give rise to intriguing properties.
Among the well-established theories, the Phenomenological Theory of Martensitic Crystallography (PTMC) is able to predict the habit plane and the orientation relationship between austenite and martensite. The re-interpretation of the PTMC theory within a continuum mechanics framework (CM-PTMC) explains the formation of the multivariant domain structures, while the Landau theory with inertial dynamics unravels the physical origins of precursors and other dynamic behaviors. The crystal lattice dynamics unveils the acoustic softening of the lattice strain waves leading to the weak first-order displacive transformation, etc. Though differing in statics or dynamics due to their origins in different branches of physics (e.g. continuum mechanics or crystal lattice dynamics), these theories should be inherently connected with each other and show certain elements in common within a unified perspective of physics. However, the physical connections and distinctions among the theories/models have not been addressed yet, although they are critical to further improving the models of MTs and to developing integrated models for more complex displacive-diffusive coupled transformations. Therefore, this thesis started with two objectives. The first one was to reveal the physical connections and distinctions among the models of MT by means of detailed theoretical analyses and numerical simulations. The second objective was to expand the Landau model to be able to study MTs in polycrystals, in the case of displacive-diffusive coupled transformations, and in the presence of dislocations. Starting with a comprehensive review, the physical kernels of the current models of MTs are presented. Their ability to predict MTs is clarified by means of theoretical analyses and simulations of the microstructure evolution of cubic-to-tetragonal and cubic-to-trigonal MTs in 3D. This analysis reveals that the Landau model with irreducible representation of the transformed strain is equivalent to the CM-PTMC theory and the microelasticity model in predicting the static features of MTs, but provides a better interpretation of the dynamic behaviors. However, the applications of the Landau model in structural materials are limited due to its complexity. Thus, the first result of this thesis is the development of a nonlinear Landau model with irreducible representation of strains and inertial dynamics for polycrystals. The simulation demonstrates that the updated model is physically consistent with the CM-PTMC in statics, and also permits the prediction of a classical 'C-shaped' phase diagram of martensitic nucleation modes activated by the combination of quenching temperature and applied stress conditions interplaying with the Landau transformation energy. Next, the Landau model of MT is further integrated with a quantitative diffusional transformation model to elucidate atomic relaxation and short-range diffusion of elements during the MT in steel. The model for displacive-diffusive transformations includes the effects of grain boundary relaxation for heterogeneous nucleation and the spatio-temporal evolution of diffusion potentials and chemical mobility by means of coupling with a CALPHAD-type thermo-kinetic calculation engine and database. The model is applied to study the microstructure evolution of polycrystalline carbon steels processed by the Quenching and Partitioning (Q&P) process in 2D. The simulated mixed microstructure and composition distribution are compared with available experimental data.
The results show the important role played by the differences in diffusion mobility between austenite and martensite in the carbon partitioning of steels. Finally, a multi-field model is proposed by incorporating the coarse-grained dislocation model into the developed Landau model to account for the morphological difference between steels and shape memory alloys with the same symmetry breaking. The dislocation nucleation, the formation of the 'butterfly' martensite, and the redistribution of carbon after tempering are well represented in the 2D simulations of the microstructure evolution of the representative steels. With the simulations, we demonstrate that the dislocations account for the experimental observations of rough twin boundaries, retained austenite within martensite, etc. in steels. Thus, based on the integrated model and the in-house codes developed in this thesis, a preliminary multi-field, multiscale modeling tool has been built up. The new tool couples thermodynamics and continuum mechanics at the macroscale with diffusion kinetics and the phase field/Landau model at the mesoscale, and also includes the essentials of crystallography and crystal lattice dynamics at the microscale.
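As a hedged illustration of the type of Landau functional involved: a generic one-component 2-3-4 polynomial in the order-parameter strain $e$, whose cubic term makes the transformation weakly first order. The thesis's irreducible-representation form is multi-component and coupled to inertial dynamics, so this is only the simplest representative:

$$F(e,T) = \frac{a}{2}\,(T - T_0)\,e^{2} \;-\; \frac{b}{3}\,e^{3} \;+\; \frac{c}{4}\,e^{4}, \qquad a, b, c > 0,$$

where $T_0$ is the stability limit of the parent phase on cooling; for $b \neq 0$ the transition occurs at a temperature above $T_0$ with a discontinuous jump in $e$.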
Abstract:
An engineering modification of blade element/momentum theory is applied to describe the vertical autorotation of helicopter rotors. A full non‐linear aerodynamic model is considered for the airfoils, taking into account the dependence of lift and drag coefficients on both the angle of attack and the Reynolds number. The proposed model, which has been validated in previous work, has allowed the identification of different autorotation modes, which depend on the descent velocity and the twist of the rotor blades. These modes present different radial distributions of driven and driving blade regions, as well as different radial upwash/downwash patterns. The number of blade sections with zero tangential force, the existence of a downwash region in the rotor disk, the stability of the autorotation state, and the overall rotor autorotation efficiency, are all analyzed in terms of the flight velocity and the characteristics of the rotor. It is shown that, in vertical autorotation, larger blade twist leads to smaller values of descent velocity for a given thrust generated by the rotor in the autorotational state.
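For orientation, the blade element decomposition behind the driven and driving regions mentioned above (a standard relation consistent with, but not copied from, the model described, with the force coefficients depending on both the angle of attack $\alpha$ and the Reynolds number $Re$): with $\varphi$ the local inflow angle and $U$ the resultant sectional velocity, the in-plane (tangential) force on a blade element of chord $c$ and width $\mathrm{d}r$ is

$$\mathrm{d}F_t = \tfrac{1}{2}\,\rho\,U^{2} c \left[\,C_l(\alpha, Re)\,\sin\varphi - C_d(\alpha, Re)\,\cos\varphi\,\right] \mathrm{d}r,$$

so sections with $C_l \sin\varphi > C_d \cos\varphi$ drive the rotor, sections with the opposite sign are driven, and steady autorotation requires the tangential moment integrated over the blades to vanish.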
Abstract:
We report here that a cancer gene therapy protocol using a combination of IL-12, pro-IL-18, and IL-1β converting enzyme (ICE) cDNA expression vectors simultaneously delivered via gene gun can significantly augment antitumor effects, evidently by generating increased levels of bioactive IL-18 and consequently IFN-γ. First, we compared the levels of IFN-γ secreted by mouse splenocytes stimulated with tumor cells transfected with various test genes, including IL-12 alone; pro-IL-18 alone; pro-IL-18 and ICE; IL-12 and pro-IL-18; and IL-12, pro-IL-18, and ICE. Among these treatments, the combination of IL-12, pro-IL-18, and ICE cDNA resulted in the highest level of IFN-γ production from splenocytes in vitro, and similar results were obtained when these same treatments were delivered to the skin of a mouse by gene gun and IFN-γ levels were measured at the skin transfection site in vivo. Furthermore, the triple-gene combinatorial gene therapy protocol was the most effective among all tested groups at suppressing the growth of TS/A (murine mammary adenocarcinoma) tumors previously implanted intradermally at the skin site receiving DNA transfer by gene gun on days 6, 8, 10, and 12 after tumor implantation. Fifty percent of mice treated with the combined three-gene protocol underwent complete tumor regression. In vivo depletion experiments showed that this antitumor effect was CD8+ T cell-mediated and partially IFN-γ-dependent. These results suggest that a combinatorial gene therapy protocol using a mixture of IL-12, pro-IL-18, and ICE cDNAs can confer potent antitumor activity against established TS/A tumors via cytotoxic CD8+ T cells and IFN-γ-dependent pathways.