918 results for Multi-scale lacunarity


Relevance:

80.00%

Publisher:

Abstract:

A new Coastal Rapid Environmental Assessment (CREA) strategy has been developed and successfully applied to the Northern Adriatic Sea. The CREA strategy exploits the recent advent of operational oceanography to establish a CREA system based on an operational regional forecasting system and coastal monitoring networks of opportunity. The methodology aims to initialize a coastal high-resolution model, nested within the regional forecasting system, by blending the large-scale parent model fields with the available coastal observations to generate the requisite field estimates. The CREA modeling system consists of a high-resolution, O(800 m), Adriatic SHELF model (ASHELF) implemented in the Northern Adriatic basin and nested within the Adriatic Forecasting System (AFS) (Oddo et al. 2006). The observational system is composed of the coastal networks established in the framework of the ADRICOSM (ADRiatic sea integrated COastal areaS and river basin Management system) Pilot Project. An assimilation technique applies a correction to the initial field provided by AFS on the basis of the available observations. The blending of the two data sets has been carried out through a multi-scale optimal interpolation technique developed by Mariano and Brown (1992). Two weekly CREA exercises have been conducted: the first at the beginning of May (spring experiment), the second in mid-August (summer experiment). The weeks were chosen based on the availability of all coastal observations on the initialization day and one week later, so that model results could be validated and our predictive skill verified. The ASHELF spin-up time has also been investigated, through a dedicated experiment, in order to obtain the maximum forecast accuracy within a minimum time.
Energetic evaluations show that, for the Northern Adriatic Sea and for the forcing applied, a spin-up period of one week allows ASHELF to generate the new circulation features enabled by the increased resolution and allows its total kinetic energy to establish a new dynamical balance. CREA results, evaluated by means of standard statistics between ASHELF and coastal CTDs, show the improvement deriving from the initialization technique and a good model performance in the coastal areas of the Northern Adriatic basin, characterized by a shallow and wide continental shelf subject to substantial freshwater influence from rivers. The results demonstrate the feasibility of our CREA strategy for supporting coastal zone management and call for the further establishment of operational coastal monitoring activities to advance it.
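The blending step described above is a Kalman-type optimal interpolation. A minimal numpy sketch of the idea follows; the covariances and the single-observation example are made up for illustration and are not the Mariano and Brown (1992) multi-scale scheme itself:

```python
import numpy as np

def optimal_interpolation(xb, obs, H, B, R):
    """Blend a background (model) field xb with observations obs.

    xb : (n,) background state from the parent model
    obs: (m,) coastal observations
    H  : (m, n) observation operator mapping the state to obs locations
    B  : (n, n) background-error covariance
    R  : (m, m) observation-error covariance
    """
    # Gain weights observations vs. background by their error covariances
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (obs - H @ xb)

# Tiny example: a 3-point field with one observation at the middle point
xb = np.array([10.0, 10.0, 10.0])
H = np.array([[0.0, 1.0, 0.0]])
B = np.exp(-0.5 * np.subtract.outer(np.arange(3), np.arange(3)) ** 2)  # Gaussian correlations
R = np.array([[0.25]])
xa = optimal_interpolation(xb, np.array([12.0]), H, B, R)
```

The analysis pulls the observed point toward the observation and spreads part of the correction to neighbouring points through the background covariance.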

Relevance:

80.00%

Publisher:

Abstract:

The research work was aimed at studying, with a deterministic approach, the relationships between a rock's texture and its mechanical properties determined at the laboratory scale. The experimentation was performed on a monomineralic crystalline rock varying in texture, i.e. in grain shape. A multi-scale analysis was adopted to determine the elasto-mechanical properties of the crystals composing the rock and its strength and deformability at the macro-scale. This allowed us to understand how the structural variability of the investigated rock affects its macro-mechanical behaviour. Investigations were performed at three different scales: the nano-scale (order of nm), the micro-scale (tens of µm) and the macro-scale (cm). Techniques that are innovative for rock mechanics, i.e. Depth Sensing Indentation (DSI), were applied in order to determine the elasto-mechanical properties of the calcite grains. These techniques also allowed us to study the influence of grain boundaries on the mechanical response of calcite grains by varying the indent sizes, and to quantify the effect of the applied load on the hardness and elastic modulus of the grain (indentation size effect, ISE). The secondary effects of static Berkovich, Vickers and Knoop indentations were analyzed by SEM, and some considerations are made on the rock's brittle behaviour and the effect of microcracks.
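The reduction of a depth-sensing indentation test to hardness and reduced modulus is usually done Oliver-Pharr-style; the sketch below shows that reduction, with the geometry factor and example numbers as illustrative assumptions, not data from this work:

```python
import math

def oliver_pharr(P_max, S, A_c, beta=1.05):
    """Hardness and reduced modulus from a depth-sensing indentation test
    (Oliver-Pharr analysis; use consistent units, e.g. mN and nm).

    P_max : peak load
    S     : unloading contact stiffness dP/dh at peak load
    A_c   : projected contact area at peak load
    beta  : indenter geometry factor (~1.05 for a Berkovich tip)
    """
    H = P_max / A_c                                               # hardness
    E_r = math.sqrt(math.pi) * S / (2.0 * beta * math.sqrt(A_c))  # reduced modulus
    return H, E_r

# Illustrative numbers only:
H, E_r = oliver_pharr(P_max=10.0, S=0.5, A_c=1000.0)
```

The indentation size effect mentioned above shows up in such data as a systematic drift of H with P_max.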

Relevance:

80.00%

Publisher:

Abstract:

Thrust fault-related folds in carbonate rocks are characterized by deformation accommodated by different structures, such as joints, faults, pressure solution seams, and deformation bands. Defining the development of fracture systems related to the folding process is significant for both theoretical and practical purposes. Fracture systems are useful constraints for understanding the kinematic evolution of a fold. Furthermore, understanding the relationships between folding and fracturing provides a noteworthy contribution to reconstructing the geodynamic and structural evolution of the studied area. Moreover, as fold-related fractures influence fluid flow through rocks, fracture systems are relevant for energy production (geothermal studies, methane and CO2 storage, and hydrocarbon exploration) and for environmental and social issues (pollutant distribution, aquifer characterization). The PhD project presents the results of a study carried out in a multilayer carbonate anticline characterized by different mechanical properties. The aim of this study is to understand the factors which influence fracture formation and to define their temporal sequence during the folding process. The studied area is located in the Cingoli anticline (Northern Apennines), which consists of a pelagic multilayer with sequences of different mechanical stratigraphies. A multi-scale analysis has been carried out in several outcrops located in different structural positions. This project shows that the conceptual sketches proposed in the literature and the strain distribution models outline well the geometrical orientation of most of the fracture sets observed in the Cingoli anticline. On the other hand, the present work highlights the relevance of the mechanical stratigraphy, in particular in controlling the type of fractures formed (e.g. pressure solution seams, joints or shear fractures) and their subsequent evolution.
Through the multi-scale analysis, and on the basis of the temporal relationships between fracture sets and their orientation with respect to layering, I also propose a conceptual model for the formation of fracture systems.

Relevance:

80.00%

Publisher:

Abstract:

In this thesis, various water models are investigated in so-called multiscale computer simulations with two resolutions: an atomistic resolution and a coarser one referred to as "coarse-grained". In the atomistic resolution a water molecule is described, in accordance with its chemical structure, by three atoms; in the coarse-grained resolution, by contrast, a molecule is represented by a single bead.

The coarse-grained models presented in this work are developed with different coarse-graining methods, chiefly the "iterative Boltzmann inversion" and the "iterative Monte Carlo inversion". Both are structure-based approaches that aim to reproduce certain structural properties of the underlying atomistic system, such as the pair distribution functions. The software package "Versatile Object-oriented Toolkit for Coarse-Graining Applications" (VOTCA) was developed for the automated application of these methods.

It is investigated to what extent coarse-grained models can simultaneously reproduce several properties of the underlying atomistic model, e.g. thermodynamic properties such as pressure and compressibility, or structural properties that were not used in building the model, such as the tetrahedral packing behaviour responsible for many of water's special properties.

Using the "Adaptive Resolution Scheme", both resolutions are combined in a single simulation. This profits from the advantages of both models: the detailed representation of a spatially small region at atomistic resolution, and the computational efficiency of the coarse-grained model, which enlarges the range of accessible time and length scales.

In these simulations, the influence of the hydrogen-bond network on the hydration of fullerenes can be investigated. It turns out that the structure of the water molecules at the fullerene surface is dominated mainly by the type of interaction between the fullerene and water, and less by the hydrogen-bond network.
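The iterative Boltzmann inversion mentioned above updates a tabulated pair potential from the mismatch between the coarse-grained and target radial distribution functions. A minimal numpy sketch, in reduced units and with illustrative RDF values, is:

```python
import numpy as np

kT = 1.0  # thermal energy in reduced units (assumption of this sketch)

def ibi_update(U, g_cg, g_target, alpha=1.0):
    """One iterative Boltzmann inversion step:
        U_{i+1}(r) = U_i(r) + alpha * kT * ln(g_i(r) / g_target(r))
    Where the coarse-grained model is over-structured (g_cg > g_target)
    the pair potential becomes more repulsive, and vice versa."""
    return U + alpha * kT * np.log(g_cg / g_target)

g_target = np.array([0.1, 0.8, 1.5, 1.2, 1.0, 1.0])  # target (atomistic) RDF
g_cg     = np.array([0.2, 1.0, 1.3, 1.1, 1.0, 1.0])  # current CG RDF
U1 = ibi_update(np.zeros(6), g_cg, g_target)
```

In practice the update is damped (alpha < 1) and smoothed, and the iteration repeats until the CG pair distribution matches the atomistic target.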

Relevance:

80.00%

Publisher:

Abstract:

The quench characteristics of second generation (2G) YBCO Coated Conductor (CC) tapes are of fundamental importance for the design and safe operation of superconducting cables and magnets based on this material. Their ability to transport high current densities at high temperature, up to 77 K, and at very high fields, over 20 T, together with the increasing knowledge in their manufacturing, which is reducing their cost, is pushing the use of this innovative material in numerous applications, from high-field magnets for research to motors and generators, as well as cables. The aim of this Ph.D. thesis is the experimental analysis and numerical simulation of quench in superconducting HTS tapes and coils. A measurement facility for the characterization of superconducting tapes and coils was designed, assembled and tested. The facility consists of a cryostat, a cryocooler, a vacuum system, resistive and superconducting current leads, and signal feedthroughs. Moreover, the data acquisition system and the software for critical current and quench measurements were developed. A 2D model was developed using the finite element code COMSOL Multiphysics. The problem of modeling the high aspect ratio of the tape is tackled by multiplying the tape thickness by a constant factor and compensating the heat and electrical balance equations by introducing a material anisotropy. The model was then validated against the results of a 1D quench model based on a non-linear electric circuit coupled to a thermal model of the tape, against measurements from the literature, and against critical current and quench measurements made in the cryogenic facility. Finally, the model was extended to the study of coils and windings with the definition of homogenized tape and stack properties. The procedure allows the definition of a multi-scale hierarchical model, able to simulate the windings with different degrees of detail.
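The aspect-ratio trick described above (inflating the tape thickness and compensating with anisotropic material properties) can be sketched as follows. The specific compensation rules below are an assumption of this sketch, chosen so that the obvious per-unit-tape-area balances are preserved; they are not the thesis's exact scheme:

```python
def rescale_tape(props, f):
    """Inflate the tape thickness by a factor f to relax the FEM mesh
    aspect ratio, compensating with anisotropic properties so that the
    per-unit-area thermal mass, through-thickness thermal resistance,
    in-plane heat conduction and longitudinal electrical resistance
    are all unchanged (assumed compensation rules)."""
    return {
        "thickness": props["thickness"] * f,
        "k_perp":    props["k_perp"] * f,   # through-thickness conductivity
        "k_par":     props["k_par"] / f,    # in-plane (along-tape) conductivity
        "rho_el":    props["rho_el"] * f,   # longitudinal resistivity
        "cp_vol":    props["cp_vol"] / f,   # volumetric heat capacity
    }

# Illustrative tape properties (SI units), not data from the thesis:
tape = {"thickness": 1e-4, "k_perp": 1.0, "k_par": 100.0,
        "rho_el": 1e-8, "cp_vol": 2e6}
scaled = rescale_tape(tape, 50.0)
```

With these rules the 2D heat and current balance equations written on the inflated geometry reproduce the balances of the thin tape.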

Relevance:

80.00%

Publisher:

Abstract:

We propose an innovative, integrated, cost-effective health system to combat major non-communicable diseases (NCDs), including cardiovascular, chronic respiratory, metabolic, rheumatologic and neurologic disorders and cancers, which together are the predominant health problem of the 21st century. This proposed holistic strategy involves comprehensive patient-centered integrated care and multi-scale, multi-modal and multi-level systems approaches to tackle NCDs as a common group of diseases. Rather than studying each disease individually, it will take into account their intertwined gene-environment, socio-economic interactions and co-morbidities that lead to individual-specific complex phenotypes. It will implement a road map for predictive, preventive, personalized and participatory (P4) medicine based on a robust and extensive knowledge management infrastructure that contains individual patient information. It will be supported by strategic partnerships involving all stakeholders, including general practitioners associated with patient-centered care. This systems medicine strategy, which will take a holistic approach to disease, is designed to allow the results to be used globally, taking into account the needs and specificities of local economies and health systems.

Relevance:

80.00%

Publisher:

Abstract:

Although sustainable land management (SLM) is widely promoted to prevent and mitigate land degradation and desertification, its monitoring and assessment (M&A) has received much less attention. This paper compiles methodological approaches which to date have been little reported in the literature. It draws lessons from these experiences and identifies common elements and future pathways as a basis for a global approach. The paper starts with local level methods where the World Overview of Conservation Approaches and Technologies (WOCAT) framework catalogues SLM case studies. This tool has been included in the local level assessment of Land Degradation Assessment in Drylands (LADA) and in the EU-DESIRE project. Complementary site-based approaches can enhance an ecological process-based understanding of SLM variation. At national and sub-national levels, a joint WOCAT/LADA/DESIRE spatial assessment based on land use systems identifies the status and trends of degradation and SLM, including causes, drivers and impacts on ecosystem services. Expert consultation is combined with scientific evidence and enhanced where necessary with secondary data and indicator databases. At the global level, the Global Environment Facility (GEF) knowledge from the land (KM:Land) initiative uses indicators to demonstrate impacts of SLM investments. Key lessons learnt include the need for a multi-scale approach, making use of common indicators and a variety of information sources, including scientific data and local knowledge through participatory methods. Methodological consistencies allow cross-scale analyses, and findings are analysed and documented for use by decision-makers at various levels. Effective M&A of SLM [e.g. for United Nations Convention to Combat Desertification (UNCCD)] requires a comprehensive methodological framework agreed by the major players.

Relevance:

80.00%

Publisher:

Abstract:

Correspondence establishment is a key step in statistical shape model building. There are several automated methods for solving this problem in 3D, but they can usually only handle objects with simple topology, like that of a sphere or a disc. We propose an extension to correspondence establishment over a population based on the optimization of the minimum description length (MDL) function, allowing objects with arbitrary topology to be considered. Instead of using a fixed structure of kernel placement on a sphere for the systematic manipulation of point landmark positions, we rely on an adaptive, hierarchical organization of surface patches. This hierarchy can be built on surfaces of arbitrary topology, and the resulting patches are used as a basis for a consistent, multi-scale modification of the surfaces' parameterization, based on point distribution models. The feasibility of the approach is demonstrated on synthetic models with different topologies.
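A commonly used simplified form of the description length objective that such methods minimize is a sum of logarithms of the shape-covariance eigenvalues: better correspondence concentrates the population's variance into fewer, tighter modes. The sketch below uses this assumed simplification, not the paper's exact cost function:

```python
import numpy as np

def description_length(shapes, lam_cut=1e-6):
    """Simplified MDL cost of a point distribution model built from
    corresponded shape vectors (rows = training shapes). Smaller is
    better: the population is captured by fewer, tighter modes."""
    X = shapes - shapes.mean(axis=0)
    gram = X @ X.T / shapes.shape[0]   # small (n_shapes x n_shapes) matrix
    lam = np.linalg.eigvalsh(gram)     # its eigenvalues = PDM mode variances
    lam = lam[lam > lam_cut]           # ignore negligible modes
    return float(np.sum(1.0 + np.log(lam / lam_cut)))

rng = np.random.default_rng(0)
tight = rng.normal(scale=0.01, size=(5, 6))  # well-corresponded population
loose = rng.normal(scale=1.0,  size=(5, 6))  # poorly corresponded population
```

Reparameterizing the surfaces (here, via the hierarchical patches) is then a search for landmark positions that drive this cost down.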

Relevance:

80.00%

Publisher:

Abstract:

Materials are inherently multi-scale in nature, consisting of distinct characteristics at various length scales from atoms to the bulk material. There are no widely accepted predictive multi-scale modeling techniques that span from the atomic level to the bulk and relate the effects of structure at the nanometer scale (10⁻⁹ m) to macro-scale properties. Traditional engineering treats matter as continuous, with no internal structure. In contrast to engineers, physicists have dealt with matter in its discrete structure at small length scales to understand the fundamental behavior of materials. Multiscale modeling is of great scientific and technical importance, as it can aid in designing novel materials whose properties are tailored to a specific application, such as multi-functional materials. Polymer nanocomposite materials have the potential to provide significant increases in mechanical properties relative to current polymers used for structural applications. For a given reinforcement volume fraction, nanoscale reinforcements can increase the effective interface between the reinforcement and the matrix by orders of magnitude relative to traditional micro- or macro-scale reinforcements. To facilitate the development of polymer nanocomposite materials, constitutive relationships must be established that predict the bulk mechanical properties of the materials as a function of their molecular structure. A computational hierarchical multiscale modeling technique is developed to study the bulk-level constitutive behavior of polymeric materials as a function of their molecular chemistry. Various parameters and modeling techniques, from computational chemistry to continuum mechanics, are utilized in the current modeling method. The cause-and-effect relationships of the parameters are studied to establish an efficient modeling framework.
The proposed methodology is applied to three different polymers and validated using experimental data available in the literature.

Relevance:

80.00%

Publisher:

Abstract:

For half a century the integrated circuits (ICs) that make up the heart of electronic devices have been steadily improving by shrinking at an exponential rate. However, as the current crop of ICs gets smaller and the insulating layers involved become thinner, electrons leak through due to quantum mechanical tunneling. This is one of several issues that will bring an end to this incredible streak of exponential improvement of this type of transistor device, after which future improvements will have to come from employing fundamentally different transistor architectures rather than fine-tuning and miniaturizing the metal-oxide-semiconductor field effect transistors (MOSFETs) in use today. Several new transistor designs, some designed and built here at Michigan Tech, involve electrons tunneling their way through arrays of nanoparticles. We use a multi-scale approach to model these devices and study their behavior. To investigate the tunneling characteristics of the individual junctions, we use a first-principles approach to model conduction between sub-nanometer gold particles. To estimate the change in energy due to the movement of individual electrons, we use the finite element method to calculate electrostatic capacitances. The kinetic Monte Carlo method allows us to use our knowledge of these details to simulate the dynamics of an entire device, sometimes consisting of hundreds of individual particles, and watch as the device 'turns on' and starts conducting an electric current. Scanning tunneling microscopy (STM) and the closely related scanning tunneling spectroscopy (STS) are a family of powerful experimental techniques that allow for the probing and imaging of surfaces and molecules at atomic resolution. However, interpretation of the results often requires comparison with theoretical and computational models. We have developed a new method for calculating STM topographs and STS spectra.
This method combines an established method for approximating the geometric variation of the electronic density of states with a modern method for calculating spin-dependent tunneling currents, offering a unique balance between accuracy and accessibility.
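The kinetic Monte Carlo step at the heart of such device simulations can be sketched in a few lines: draw an exponentially distributed waiting time from the total rate, then pick the tunneling event in proportion to its rate. The two-junction "device" and its rates below are purely illustrative:

```python
import math
import random

def kmc_step(rates, t):
    """One kinetic Monte Carlo (Gillespie) step over a list of candidate
    tunneling-event rates. Returns the chosen event index and the advanced
    simulation time."""
    total = sum(rates)
    t += -math.log(1.0 - random.random()) / total  # waiting time ~ Exp(total)
    x = random.random() * total                    # rate-weighted selection
    acc = 0.0
    for i, r in enumerate(rates):
        acc += r
        if x < acc:
            return i, t
    return len(rates) - 1, t

# Toy run: junction 0 tunnels ~9x as often as junction 1
random.seed(1)
counts, t = [0, 0], 0.0
for _ in range(2000):
    i, t = kmc_step([9.0, 1.0], t)
    counts[i] += 1
```

In a real device model the rates themselves depend on the charge state, via the capacitance-derived energy changes, and are recomputed after every event.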

Relevance:

80.00%

Publisher:

Abstract:

Brain activity relies on transient, fluctuating interactions between segregated neuronal populations. Synchronization within a single neuronal cluster and between distributed clusters reflects the dynamics of these cooperative patterns. Absence epilepsy can thus be used as a model for an integrated, large-scale investigation of the emergence of pathological collective dynamics in the brain. Indeed, the spike-wave discharges (SWD) of an absence seizure are thought to reflect abnormal cortical hypersynchronization. In this paper, we address two questions: how and where do SWD arise in the human brain? To this end, we explored the spatio-temporal dynamics of interactions within and between widely distributed cortical sites using magneto-encephalographic recordings of spontaneous absence seizures. From their time-frequency analysis, we then extracted the local synchronization of cortical sources and the long-range synchronization linking distant sites. Our analyses revealed a reproducible sequence of 1) long-range desynchronization, 2) increased local synchronization and 3) increased long-range synchronization. Although local and long-range synchronization displayed different spatio-temporal profiles, their cortical projections within the initiation time window overlap and reveal a multifocal fronto-central network. These observations contradict the classical view of sudden generalized synchronous activity in absence epilepsy. Furthermore, they suggest that brain-state transitions may rely on multi-scale processes involving both local and distant interactions.
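A standard index for the long-range synchronization mentioned above is the phase-locking value, computed from analytic-signal phases. The sketch below assumes this simple index; the paper's exact time-frequency measure may differ:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (the usual Hilbert-transform construction)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(X * h)

def plv(x, y):
    """Phase-locking value |<exp(i*(phi_x - phi_y))>|: 1 for a constant
    phase lag (synchronized), near 0 for unrelated phases."""
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return float(np.abs(np.mean(np.exp(1j * dphi))))

t = np.linspace(0.0, 1.0, 1000, endpoint=False)
s = np.sin(2 * np.pi * 10 * t)
locked = plv(s, np.sin(2 * np.pi * 10 * t + 0.7))            # constant lag
noise = plv(s, np.random.default_rng(0).standard_normal(1000))  # no relation
```

Computed in sliding windows between narrow-band-filtered source signals, such an index yields the kind of long-range synchronization time courses described in the abstract.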

Relevance:

80.00%

Publisher:

Abstract:

Image-based modeling of tumor growth combines methods from cancer simulation and medical imaging. In this context, we present a novel approach to adapt a healthy brain atlas to MR images of tumor patients. In order to establish correspondence between a healthy atlas and a pathologic patient image, tumor growth modeling is employed in combination with registration algorithms. In a first step, the tumor is grown in the atlas based on a new multi-scale, multi-physics model, including growth simulation from the cellular level up to the biomechanical level and accounting for cell proliferation and tissue deformations. Large-scale deformations are handled with an Eulerian approach for finite element computations, which can operate directly on the image voxel mesh. Subsequently, dense correspondence between the modified atlas and the patient image is established using nonrigid registration. The method offers opportunities in atlas-based segmentation of tumor-bearing brain images as well as for improved patient-specific simulation and prognosis of tumor progression.

Relevance:

80.00%

Publisher:

Abstract:

This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems and is increasingly being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics which cannot be directly measured, for example, wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases. CFD modelling is apposite for minimally-invasive patient assessment. Patient-specific (incorporating data unique to the individual) and multi-scale (combining models of different length- and time-scales) modelling enables individualised risk prediction and virtual treatment planning. This represents a significant departure from traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards 'digital patient' or 'virtual physiological human' representations. When combined with population-scale numerical models, these models have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine. While the potential benefits are great, a number of academic and commercial groups are still addressing the associated methodological, regulatory, education- and service-related challenges.
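As an example of a metric that is computed rather than measured, wall shear stress in the idealized steady Poiseuille limit reduces to a one-line formula; full CFD is needed precisely because real arterial flow is pulsatile and three-dimensional. The vessel numbers below are illustrative assumptions, not patient data:

```python
import math

def poiseuille_wss(Q, mu, R):
    """Wall shear stress for steady laminar (Poiseuille) flow in a straight
    cylindrical vessel: tau_w = 4*mu*Q / (pi*R^3), in Pa for SI inputs."""
    return 4.0 * mu * Q / (math.pi * R ** 3)

# Rough coronary-like values: Q = 1 ml/s, mu = 3.5 mPa*s, R = 1.5 mm
tau = poiseuille_wss(Q=1e-6, mu=3.5e-3, R=1.5e-3)
```

The strong R³ dependence is why modest lumen narrowing changes wall shear stress so sharply, and why patient-specific geometry matters.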

Relevance:

80.00%

Publisher:

Abstract:

We present a novel surrogate model-based global optimization framework allowing a large number of function evaluations. The method, called SpLEGO, is based on a multi-scale expected improvement (EI) framework relying on both sparse and local Gaussian process (GP) models. First, a bi-objective approach relying on a global sparse GP model is used to determine potential next sampling regions. Local GP models are then constructed within each selected region. The method subsequently employs the standard expected improvement criterion to deal with the exploration-exploitation trade-off within the selected local models, leading to a decision on where to perform the next function evaluation(s). The potential of our approach is demonstrated using the so-called Sparse Pseudo-input GP as the global model. The algorithm is tested on four benchmark problems, whose number of starting points ranges from 10² to 10⁴. Our results show that SpLEGO is effective and capable of solving problems with a large number of starting points, and it even provides significant advantages when compared with state-of-the-art EI algorithms.
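The standard expected improvement criterion that SpLEGO applies within each local model has a closed form for a Gaussian posterior. A minimal stdlib-only sketch (for minimization), with illustrative inputs, is:

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Expected improvement of a GP posterior N(mu, sigma^2) over the
    current best observed value f_best (minimization):
        EI = (f_best - mu) * Phi(z) + sigma * phi(z),  z = (f_best - mu) / sigma
    """
    if sigma <= 0.0:
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal PDF
    return (f_best - mu) * Phi + sigma * phi

# A point predicted below the incumbent with high uncertainty scores high;
# a point predicted well above it with low uncertainty scores ~0.
ei_promising = expected_improvement(mu=0.5, sigma=1.0, f_best=1.0)
ei_poor = expected_improvement(mu=2.0, sigma=0.1, f_best=1.0)
```

Maximizing this quantity over candidate points is what balances exploration (large sigma) against exploitation (small mu) inside each local GP region.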