850 results for building information modelling


Relevance:

30.00%

Publisher:

Abstract:

Although Recovery is often described as the least studied and documented phase of the Emergency Management Cycle, a wide literature is available describing the characteristics and sub-phases of this process. Previous works do not provide an overall perspective, however, because recovery has not been monitored systematically and consistently with advanced technologies such as remote sensing and GIS. Given the key role of Remote Sensing in Response and Damage Assessment, this thesis aims to verify the appropriateness of such advanced monitoring techniques for detecting recovery advancements over time, with close attention to the main characteristics of the study event: the Hurricane Katrina storm surge. Based on multi-source, multi-sensor and multi-temporal data, the post-Katrina recovery was analysed using both a qualitative and a quantitative approach. The first phase was dedicated to investigating the relation between urban types, damage and recovery state, with reference to geographical and technological parameters. Damage and recovery scales were proposed to review critical observations on remarkable surge-induced effects on various typologies of structures, analysed at a per-building level. This wide-ranging investigation allowed a new understanding of the distinctive features of the recovery process. A quantitative analysis was employed to develop methodological procedures suited to recognising and monitoring the distribution, timing and characteristics of recovery activities in the study area. Promising results, gained by applying supervised classification algorithms to detect the localization and distribution of blue tarps, have proved that this methodology may help the analyst in the detection and monitoring of recovery activities in areas affected by medium damage.
The study found that Mahalanobis Distance was the classifier which provided the most accurate results in localising blue roofs, with 93.7% of blue roofs classified correctly and a producer accuracy of 70%. It was also the classifier least sensitive to spectral signature alteration. The application of dissimilarity textural classification to satellite imagery has demonstrated the suitability of this technique for detecting debris distribution and for monitoring demolition and reconstruction activities in the study area. Linking these geographically extensive techniques with expert per-building interpretation of advanced-technology ground surveys provides a multi-faceted view of the physical recovery process. Remote sensing and GIS technologies, combined with an advanced ground survey approach, provide extremely valuable capabilities for monitoring Recovery activities and may constitute a technical basis to guide aid organizations and local governments in Recovery management.
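The minimum-Mahalanobis-distance rule behind this kind of supervised classification can be sketched in a few lines; the class statistics and pixel spectra below are invented for illustration, not values from the thesis:

```python
import numpy as np

def mahalanobis_classify(pixels, class_stats):
    """Assign each pixel to the class with the smallest Mahalanobis distance.

    class_stats maps a class name to (mean_vector, covariance_matrix),
    normally estimated from training samples (hypothetical values here)."""
    labels = list(class_stats)
    d2 = np.empty((len(pixels), len(labels)))
    for j, name in enumerate(labels):
        mu, cov = class_stats[name]
        inv = np.linalg.inv(cov)
        diff = pixels - mu
        # row-wise quadratic form: diff · inv · diff
        d2[:, j] = np.einsum('ij,jk,ik->i', diff, inv, diff)
    return [labels[k] for k in d2.argmin(axis=1)]

# Toy 3-band spectra for a "blue tarp" class vs everything else
stats = {
    "blue_tarp": (np.array([60., 80., 200.]), np.eye(3) * 25.),
    "other":     (np.array([90., 90., 90.]),  np.eye(3) * 25.),
}
pix = np.array([[62., 78., 195.], [88., 92., 95.]])
print(mahalanobis_classify(pix, stats))  # ['blue_tarp', 'other']
```

Because each class carries its own covariance, the rule tolerates moderate spectral signature alteration better than a plain Euclidean minimum-distance classifier, consistent with the robustness reported above.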

Relevance:

30.00%

Publisher:

Abstract:

The present study carried out an analysis of rural landscape changes. In particular, it focuses on understanding the driving forces acting on the rural built environment, using a statistical spatial model implemented through GIS techniques. It is well known that the study of landscape changes is essential for conscious decision making in land planning. A literature review reveals a general lack of studies dealing with the modelling of the rural built environment, hence a theoretical modelling approach for this purpose is needed. Advances in technology and modernity in building construction and agriculture have gradually changed the rural built environment. In addition, the phenomenon of urbanization has determined the construction of new volumes beside abandoned or derelict rural buildings. Consequently, two main types of transformation dynamics affecting the rural built environment can be observed: the conversion of rural buildings and the increase in building numbers. The specific aim of the present study is to propose a methodology for developing a spatial model that allows the identification of the driving forces that acted on building allocation. In fact, one of the most concerning dynamics nowadays is the irrational expansion of building sprawl across the landscape. The proposed methodology is composed of several conceptual steps covering the different aspects of developing a spatial model: the selection of a response variable that best describes the phenomenon under study, the identification of possible driving forces, the sampling methodology for data collection, the most suitable algorithm in relation to the statistical theory and method used, and the calibration and evaluation of the model.
A different combination of factors in various parts of the territory generated conditions that were more or less favourable for building allocation, and the existence of buildings represents the evidence of such an optimum. Conversely, the absence of buildings expresses a combination of agents which is not suitable for building allocation. The presence or absence of buildings can therefore be adopted as an indicator of these driving conditions, since it represents the expression of the action of driving forces in the land suitability sorting process. The existence of correlation between site selection and hypothetical driving forces, evaluated by means of modelling techniques, provides evidence of which driving forces are involved in the allocation dynamic and an insight into their level of influence on the process. GIS software, by means of spatial analysis tools, makes it possible to associate the concept of presence and absence with point features, generating a point process. The presence or absence of buildings at given site locations represents the expression of the interaction of these driving factors. In the case of presences, points represent the locations of real existing buildings; conversely, absences represent locations where buildings do not exist, and are therefore generated by a stochastic mechanism. Possible driving forces are selected and the existence of a causal relationship with building allocation is assessed through a spatial model. The adoption of empirical statistical models provides a mechanism for explanatory variable analysis and for the identification of the key driving variables behind the site selection process for new building allocation. The model developed by following the methodology is applied to a case study to test the validity of the methodology. In particular, the study area chosen for testing is the New District of Imola, characterized by a prevailing agricultural production vocation and where transformation dynamics occurred intensively.
The development of the model involved the identification of predictive variables (related to the geomorphologic, socio-economic, structural and infrastructural systems of the landscape) capable of representing the driving forces responsible for landscape changes. The calibration of the model was carried out on spatial data for the periurban and rural parts of the study area within the 1975-2005 time period, by means of a generalised linear model. The resulting output of the model fit is a continuous grid surface whose cells assume probability values, ranging from 0 to 1, of building occurrence across the rural and periurban parts of the study area. Hence the response variable assesses the changes in the rural built environment that occurred in this time interval and is correlated to the selected explanatory variables by means of a generalized linear model using logistic regression. By comparing the probability map obtained from the model to the actual rural building distribution in 2005, the interpretative capability of the model can be evaluated. The proposed model can also be applied to interpreting trends in other study areas, and for different time intervals, depending on the availability of data. The use of suitable data in terms of time, information, and spatial resolution, and the costs related to data acquisition, pre-processing, and survey, are among the most critical aspects of model implementation. Future in-depth studies can focus on using the proposed model to predict short/medium-range future scenarios for the distribution of the rural built environment in the study area. In order to predict future scenarios it is necessary to assume that the driving forces do not change and that their levels of influence within the model are not far from those assessed for the calibration time interval.
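The presence/absence logistic GLM described above can be sketched as follows; the two predictors, the coefficients, and the synthetic grid-cell data are hypothetical stand-ins for the thesis's geomorphologic and infrastructural variables, and the Newton-Raphson (IRLS) loop is one standard way to fit such a GLM:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic standardised predictors per grid cell, e.g. distance to roads
# and terrain slope (illustrative names, not the thesis's variable set)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
true_beta = np.array([-0.5, -1.2, 0.8])
y = rng.random(n) < 1 / (1 + np.exp(-X @ true_beta))  # presence/absence

beta = np.zeros(3)
for _ in range(50):                         # Newton-Raphson / IRLS iterations
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)                         # GLM working weights
    grad = X.T @ (y - p)
    H = X.T @ (X * W[:, None])              # Fisher information
    beta += np.linalg.solve(H, grad)

# The fitted model yields a 0-1 probability surface, as in the thesis
prob_map = 1 / (1 + np.exp(-X @ beta))
```

Comparing `prob_map` against the observed 2005 building distribution would then play the role of the model-evaluation step described above.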

Relevance:

30.00%

Publisher:

Abstract:

The aims of this research were: to identify the characteristics, properties and provenance of the building and decorative material found in three Hungarian Roman sites (Nagyharsány, Nemesvámos-Balácapuszta and Aquincum); to provide a database of information on the different sites; and to give an overview of the main conservation strategies applied in Hungary. Geological studies, macroscopical and microscopical observations, XRD investigations, and physical and chemical analyses allowed us to define the characteristics and properties of the different kinds of collected materials. Building stones sampled from the Nagyharsány site showed two different kinds of massive limestone belonging to the areas surrounding the villa. Building stones sampled from the Nemesvámos-Balácapuszta Roman villa also proved to be compatible with limestone from local sources. Mural painting fragments show that all samples are units composed of multilayered structures. Mosaic tesserae can be classified as follows: pale yellow, blackish and pink tesserae are comparable with local limestone; the white tessera, composed of marble, was probably imported from distant regions of the Empire, as was the usual Roman practice. Mortars present different characteristics according to age, site and function: building mortars are generally lime based, white or pale yellow in colour, and present a high percentage of aggregates represented by fine sand; supporting mortars, from both mosaics and mural paintings, are reddish or pinkish in colour due to a high percentage of brick dust and tile fragments, and present a higher content of MgO. Despite the condition of the sites, the content of soluble salts is insignificant. Database: the whole study has allowed us to provide worksheets for each sample, including all characteristics and properties.
Furthermore, all sites included in the research have been described and illustrated on the basis of their floor plans, materials and construction methodologies. It can be concluded that: 1. in the Nagyharsány archaeological site it is possible to define a sequence of different construction phases on the basis of the study of building materials and mortars, and the results are comparable with the chronology of the site provided by the archaeologists; 2. the material used for construction was of local origin, while the more precious materials, used for decorative elements, were probably imported over long distances; 3. construction techniques in Hungary mainly follow the usual Roman knowledge and practice (Vitruvius), with few differences found; 4. the database will represent an archive for archaeologists, historians and conservators dealing with the Roman period in Hungary.

Relevance:

30.00%

Publisher:

Abstract:

Ontology design and population (core aspects of semantic technologies) have recently become fields of great interest due to the increasing need for domain-specific knowledge bases that can boost the use of the Semantic Web. For building such knowledge resources, the state-of-the-art tools for ontology design require a lot of human work. Producing meaningful schemas and populating them with domain-specific data is in fact a very difficult and time-consuming task, even more so if the task consists in modelling knowledge at web scale. The primary aim of this work is to investigate a novel and flexible methodology for automatically learning ontologies from textual data, lightening the human workload required for conceptualizing domain-specific knowledge and populating an extracted schema with real data, and speeding up the whole ontology production process. Here computational linguistics plays a fundamental role: from automatically identifying facts in natural language and extracting frames of relations among recognized entities, to producing linked data with which to extend existing knowledge bases or create new ones. In the state of the art, automatic ontology learning systems are mainly based on plain pipelined linguistic classifiers performing tasks such as Named Entity recognition, Entity resolution, and Taxonomy and Relation extraction [11]. These approaches present some weaknesses, especially in capturing the structures through which the meaning of complex concepts is expressed [24]. Humans, in fact, tend to organize knowledge in well-defined patterns, which include participant entities and meaningful relations linking entities with each other. In the literature, these structures have been called Semantic Frames by Fillmore [20], or more recently Knowledge Patterns [23]. Some NLP studies have recently shown the possibility of performing more accurate deep parsing with the ability to logically understand the structure of discourse [7].
In this work, some of these technologies have been investigated and employed to produce accurate ontology schemas. The long-term goal is to collect large amounts of semantically structured information from the web of crowds, through an automated process, in order to identify and investigate the cognitive patterns used by humans to organize their knowledge.
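As a toy illustration of the shallow, pattern-based relation extraction that such pipelined systems improve upon (and that frame-based approaches move beyond), a few lines suffice; the pattern, relation verbs, and sentences below are all invented:

```python
import re

# Minimal pattern-based triple extractor: subject-relation-object over
# capitalised tokens. Real systems use deep parsing, NER and frame
# detection; this only shows the shape of the extracted data.
PATTERN = re.compile(
    r'(?P<subj>[A-Z][a-z]+) (?P<rel>invented|founded|discovered) (?P<obj>[A-Z][a-z]+)'
)

def extract_triples(text):
    """Return (subject, relation, object) triples matching the pattern."""
    return [(m['subj'], m['rel'], m['obj']) for m in PATTERN.finditer(text)]

print(extract_triples("Marconi invented Radio. Fermi discovered Neptunium."))
# [('Marconi', 'invented', 'Radio'), ('Fermi', 'discovered', 'Neptunium')]
```

Triples of this form are what gets lifted into linked data; the weakness noted above is precisely that such flat patterns miss the richer participant structure that Semantic Frames capture.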

Relevance:

30.00%

Publisher:

Abstract:

The general aim of this work is to contribute to the energy performance assessment of ventilated façades by the simultaneous use of experimental data and numerical simulations. A significant amount of experimental work was done on different types of naturally ventilated façades. The measurements were taken on a test building, a tower whose external walls are rainscreen ventilated façades, with ventilation grills located at the top and at the bottom. In this work the modelling of the test building using a dynamic thermal simulation program (ESP-r) is presented and the main results are discussed. In order to investigate the best summer thermal performance of rainscreen ventilated skin façades, a study of different setups of rainscreen walls was carried out. In particular, the influence of ventilation grills, air cavity thickness, skin colour, skin material and façade orientation was investigated. It is shown that some rainscreen ventilated façade typologies are capable of lowering the cooling energy demand by a few percentage points.

Relevance:

30.00%

Publisher:

Abstract:

The diagnosis, grading and classification of tumours has benefited considerably from the development of DCE-MRI, which is now essential to the adequate clinical management of many tumour types due to its capability of detecting active angiogenesis. Several strategies have been proposed for DCE-MRI evaluation. Visual inspection of contrast agent concentration curves versus time is a very simple yet operator-dependent procedure, so more objective approaches have been developed to facilitate comparison between studies. In so-called model-free approaches, descriptive or heuristic information extracted from the time-series raw data is used for tissue classification. The main issue with these schemes is that they do not have a direct interpretation in terms of the physiological properties of the tissues. On the other hand, model-based investigations typically involve compartmental tracer kinetic modelling and pixel-by-pixel estimation of kinetic parameters via non-linear regression, applied to regions of interest appropriately selected by the physician. This approach has the advantage of providing parameters directly related to the pathophysiological properties of the tissue, such as vessel permeability, local regional blood flow, extraction fraction, and the concentration gradient between plasma and the extravascular-extracellular space. However, nonlinear modelling is computationally demanding, and the accuracy of the estimates can be affected by the signal-to-noise ratio and by the initial solutions. The principal aim of this thesis is to investigate the use of semi-quantitative and quantitative parameters for segmentation and classification of breast lesions.
The objectives can be subdivided as follows: to describe the principal techniques for evaluating the time-intensity curve in DCE-MRI, with a focus on the kinetic models proposed in the literature; to evaluate the influence of the parametrization choice for a classic bi-compartmental kinetic model; to evaluate the performance of a method for simultaneous tracer kinetic modelling and pixel classification; and to evaluate the performance of machine learning techniques trained for segmentation and classification of breast lesions.
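A minimal sketch of bi-compartmental (standard Tofts) kinetic modelling follows; the arterial input function, the parameter values, and the coarse grid search standing in for per-pixel nonlinear regression are all illustrative assumptions, not the thesis's data or fitting method:

```python
import numpy as np

def tofts_ct(t, ktrans, ve, cp):
    """Standard Tofts model: Ct(t) = Ktrans * (Cp convolved with exp(-kep*t)),
    with kep = Ktrans / ve, evaluated by discrete convolution."""
    dt = t[1] - t[0]
    kernel = np.exp(-(ktrans / ve) * t)
    return ktrans * np.convolve(cp, kernel)[:len(t)] * dt

t = np.arange(0, 5, 0.05)              # minutes
cp = 5.0 * t * np.exp(-t)              # hypothetical AIF, not a measured one
truth = tofts_ct(t, 0.25, 0.40, cp)    # assumed Ktrans=0.25/min, ve=0.40
noisy = truth + np.random.default_rng(1).normal(0, 0.002, t.size)

# Coarse grid search over (Ktrans, ve) in place of nonlinear regression,
# to keep the sketch short and free of an optimiser dependency
grid_k = np.linspace(0.05, 0.6, 56)
grid_v = np.linspace(0.1, 0.8, 36)
sse = [(k, v, np.sum((noisy - tofts_ct(t, k, v, cp))**2))
       for k in grid_k for v in grid_v]
k_hat, v_hat, _ = min(sse, key=lambda r: r[2])
```

The sensitivity to noise and to the search starting region that this brute-force search sidesteps is exactly the SNR/initial-solution issue raised above for per-pixel nonlinear fitting.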

Relevance:

30.00%

Publisher:

Abstract:

Several countries have acquired, over the past decades, large amounts of area-covering Airborne Electromagnetic (AEM) data. The contribution of airborne geophysics to both groundwater resource mapping and management has increased dramatically, proving that these systems are appropriate for large-scale, efficient groundwater surveying. We start with the processing and inversion of two AEM datasets from two different systems collected over the Spiritwood Valley Aquifer area, Manitoba, Canada: the AeroTEM III dataset (commissioned by the Geological Survey of Canada in 2010) and the "Full waveform VTEM" dataset, collected and tested over the same survey area during the fall of 2011. We demonstrate that, in the presence of multiple datasets, both AEM and ground data, careful processing, inversion, post-processing, data integration and data calibration constitute the proper approach, capable of providing reliable and consistent resistivity models. Our approach can be of interest to many end users, from Geological Surveys and Universities to Private Companies, which often own large geophysical databases to be interpreted for geological and/or hydrogeological purposes. In this study we investigate in depth the role of integrating several complementary types of geophysical data collected over the same survey area. We show that data integration can improve inversions, reduce ambiguity and deliver high-resolution results. We further use the final, most reliable output resistivity models as a solid basis for building a knowledge-driven 3D geological voxel-based model. A voxel approach allows a quantitative understanding of the hydrogeological setting of the area, and it can be further used to estimate aquifer volumes (i.e. the potential amount of groundwater resources) as well as for hydrogeological flow model prediction.
In addition, we investigated the impact of an AEM dataset on hydrogeological mapping and 3D hydrogeological modelling, comparing it to having only a ground-based TEM dataset and/or only borehole data.

Relevance:

30.00%

Publisher:

Abstract:

This thesis reports a study on the seismic response of two-dimensional squat elements and their effect on the behaviour of building structures. Part A is devoted to the study of unreinforced masonry infills, while Part B is focused on reinforced concrete sandwich walls. Part A begins with a comprehensive review of modelling techniques and code provisions for infilled frame structures. State-of-the-practice techniques are then applied to a real case, to test the ability of current modelling techniques to reproduce observed behaviours. The first developments towards a seismic-resistant masonry infill system are presented, and preliminary recommendations for its seismic design are finally provided. Part B is focused on the seismic behaviour of a specific reinforced concrete sandwich panel system. First, the results of in-plane pseudostatic cyclic tests are described. Refinements to the conventional modified compression field theory are introduced in order to better simulate the monotonic envelope of the cyclic response; the refinements deal with the constitutive models for the shotcrete in tension and for the embedded bars. The hysteretic response of the panels is then studied according to a continuum damage model, and damage state limits are identified. Design recommendations for the seismic design of the studied reinforced concrete sandwich walls are finally provided.

Relevance:

30.00%

Publisher:

Abstract:

Tonalite-trondhjemite-granodiorite (TTG) gneisses form up to two-thirds of the preserved Archean continental crust, and there is considerable debate regarding the primary magmatic processes behind the generation of these rocks. The popular theories indicate that these rocks were formed by partial melting of basaltic oceanic crust which had previously been metamorphosed to garnet-amphibolite and/or eclogite facies conditions, either at the base of thick oceanic crust or by subduction processes.

This study investigates a new aspect regarding the source rock for Archean continental crust, which is inferred to have had a bulk composition richer in magnesium (picrite) than present-day basaltic oceanic crust. This difference is supposed to originate from a higher geothermal gradient in the early Archean, which may have induced higher degrees of partial melting in the mantle and resulted in a thicker and more magnesian oceanic crust.

The methods used to investigate the role of a more MgO-rich source rock in the formation of TTG-like melts are mineral equilibria calculations with the software THERMOCALC and high-pressure experiments conducted at 10–20 kbar and 900–1100 °C, both combined in a forward modelling approach. Initially, P–T pseudosections for natural rock compositions with increasing MgO contents were calculated in the system NCFMASHTO (Na2O–CaO–FeO–MgO–Al2O3–SiO2–H2O–TiO2) to ascertain the metamorphic products of rocks with MgO contents increasing from a MORB up to a komatiite. A small number of previous experiments on komatiites showed the development of pyroxenite instead of eclogite and garnet-amphibolite during metamorphism, and established that melts of these pyroxenites are of basaltic composition, thus again building oceanic crust instead of continental crust.

The P–T pseudosections calculated represent a continuous development of the metamorphic products from amphibolites and eclogites towards pyroxenites.
On the basis of these calculations and the changes within the range of compositions, three picritic Models of Archean Oceanic Crust (MAOC) were established with different MgO contents (11, 13 and 15 wt%), ranging between basalt and komatiite. The thermodynamic modelling for MAOC 11, 13 and 15 at supersolidus conditions is imprecise, since no appropriate melt model for metabasic rocks is currently available and the melt model for metapelitic rocks resulted in unsatisfactory calculations. The partially molten region is therefore covered by high-pressure experiments. The results of the experiments show a transition from predominantly tonalitic melts in MAOC 11 to basaltic melts in MAOC 15, and a solidus moving towards higher temperatures with increasing magnesium in the bulk composition. Tonalitic melts were generated in MAOC 11 and 13 at pressures up to 12.5 kbar in the presence of garnet, clinopyroxene and plagioclase, plus/minus quartz (plus/minus orthopyroxene in the presence of quartz and at lower pressures), in the absence of amphibole; however, it could not be explicitly determined whether the tonalitic melts coexisting with an eclogitic residue and rutile at 20 kbar belong to the Archean TTG suite. Basaltic melts were generated predominantly in the presence of granulite facies residues, such as amphibole plus/minus garnet, plagioclase and orthopyroxene, lacking quartz, in all MAOC compositions at pressures up to 15 kbar.

The tonalitic melts generated in MAOC 11 and 13 indicate that thicker oceanic crust with more magnesium than a modern basalt is also a viable source for the generation of TTG-like melts, and therefore of continental crust, in the Archean. The experimental results are related to different geologic settings as a function of pressure.
The favoured setting for the generation of early TTG-like melts at 15 kbar is the base of an oceanic crust thicker than exists today, or the melting of slabs in shallow subduction zones, both without interaction of tonalitic melts with the mantle. Tonalitic melts at 20 kbar may have been generated below the plagioclase stability field by slab melting in deeper subduction zones that developed with time during the progressive cooling of the Earth, but it is unlikely that those melts reached lower pressure levels without further mantle interaction.

Relevance:

30.00%

Publisher:

Abstract:

This work illustrates a soil-tunnel-structure interaction study performed with an integrated geotechnical and structural approach, based on 3D finite element analyses and validated against experimental observations. The study aims at analysing the response of reinforced concrete framed buildings on discrete foundations in interaction with metro lines. It refers to the case of the twin tunnels of the Milan (Italy) metro line 5, recently built in coarse-grained materials using EPB machines, for which subsidence measurements collected along ground and building sections during tunnelling were available. Settlements measured under free-field conditions are first back-interpreted using Gaussian empirical predictions. The analysis of the in situ measurements is then extended to include the evolving response of a 9-storey reinforced concrete building while being undercrossed by the metro line. In the finite element study, the soil mechanical behaviour is described using an advanced constitutive model. This model, when combined with a proper simulation of the excavation process, proves to realistically reproduce the subsidence profiles under free-field conditions and to capture the interaction phenomena occurring between the twin tunnels during the excavation. Furthermore, when the numerical model is extended to include the building, schematised in a detailed manner, the results are in good agreement with the monitoring data for the different stages of the twin tunnelling. Thus, they indirectly confirm the satisfactory performance of the adopted numerical approach, which also allows a direct evaluation of the structural response as an outcome of the analysis. Further analyses are also carried out modelling the building with different levels of detail. The results highlight that, in this case, the simplified approach based on the equivalent plate schematisation is inadequate to capture the real tunnelling-induced displacement field.
The overall behaviour of the system proves to be mainly influenced by the buried portion of the building, which plays an essential role in the interaction mechanism due to its high stiffness.

Relevance:

30.00%

Publisher:

Abstract:

Holding the major share of stellar mass in galaxies and being old and passively evolving, early-type galaxies (ETGs) are the primary probes for investigating the various galaxy evolution scenarios, as well as useful means of providing insights on cosmological parameters. In this thesis work I focused specifically on ETGs and on their capability of constraining galaxy formation and evolution; in particular, the principal aims were to derive some of the ETG evolutionary parameters, such as age, metallicity and star formation history (SFH), and to study their age-redshift and mass-age relations. In order to infer the galaxy physical parameters, I used the public code STARLIGHT: this program provides a best fit to the observed spectrum from a combination of many theoretical models defined in user-made libraries. The comparison between the output and input light-weighted ages shows a good agreement starting from SNRs of ∼10, with a bias of ∼2.2% and a dispersion of ∼3%; metallicities and SFHs are also well reproduced. In the second part of the thesis I performed an analysis on real data, starting from Sloan Digital Sky Survey (SDSS) spectra. I found that galaxies get older with cosmic time and with increasing mass (for a fixed redshift bin); absolute light-weighted ages, instead, prove independent of the fitting parameters and the synthetic models used. Metallicities are very similar to each other and clearly consistent with the ones derived from the Lick indices. The predicted SFH indicates the presence of a double burst of star formation. Velocity dispersions and extinctions are also well constrained, following the expected behaviours. As a further step, I also fitted single SDSS spectra (with SNR ∼ 20), to verify that stacked spectra give the same results without introducing any bias: this is an important check if one wants to apply the method at higher z, where stacked spectra are necessary to increase the SNR.
Our upcoming aim is to adopt this approach also on galaxy spectra obtained from higher-redshift surveys, such as BOSS (z ∼ 0.5), zCOSMOS (z ∼ 1), K20 (z ∼ 1), GMASS (z ∼ 1.5) and, eventually, Euclid (z ∼ 2). Indeed, I am currently carrying out a preliminary study to establish the applicability of the method to lower-resolution, as well as higher-redshift (z ∼ 2), spectra, such as the Euclid ones.
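The light-weighted age idea underlying such spectral fits can be illustrated with a two-template toy: the observed spectrum is decomposed into template contributions, and the weights give a light-weighted mean age. The template shapes, ages, and weights below are invented, and a plain least-squares solve stands in for STARLIGHT's full spectral synthesis:

```python
import numpy as np

wave = np.linspace(3800, 7000, 400)        # wavelength grid in Angstrom
# Two hypothetical SSP templates: a blue (young) and a red (old) continuum
young = (wave / 5000.0) ** -1.5
old = (wave / 5000.0) ** 1.0
ages = np.array([0.5, 10.0])               # assumed template ages, Gyr
templates = np.vstack([young, old])

# Mock noiseless "observed" spectrum: 30% young light, 70% old light
obs = 0.3 * young + 0.7 * old

# Solve for the light-fraction weights of each template
weights, *_ = np.linalg.lstsq(templates.T, obs, rcond=None)

# Light-weighted age: age of each template weighted by its light fraction
light_age = np.sum(weights * ages) / np.sum(weights)
```

Here the recovered weights are 0.3 and 0.7, giving a light-weighted age of 0.3·0.5 + 0.7·10 = 7.15 Gyr; a real fit adds noise, many templates, dust extinction and kinematic broadening.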

Relevance:

30.00%

Publisher:

Abstract:

This thesis is focused on Smart Grid applications in medium voltage distribution networks. For the development of new applications, simulation tools able to model the dynamic behaviour of both the power system and the communication network are useful. Such a co-simulation environment allows the assessment of the feasibility of using a given network technology to support communication-based Smart Grid control schemes on an existing segment of the electrical grid, and the determination of the range of control schemes that different communications technologies can support. For this reason, a co-simulation platform is presented that has been built by linking the Electromagnetic Transients Program Simulator (EMTP v3.0) with a Telecommunication Network Simulator (OPNET-Riverbed v18.0). The simulator is used to design and analyse a coordinated use of Distributed Energy Resources (DERs) for voltage/var control (VVC) in the distribution network. This thesis focuses on a control structure based on the use of phasor measurement units (PMUs). In order to limit the required reinforcements of the communication infrastructures currently adopted by Distribution Network Operators (DNOs), the study focuses on leader-less multi-agent system (MAS) schemes that do not assign special coordinating rules to specific agents. Leader-less MAS are expected to produce more uniform communication traffic than centralized approaches that include a moderator agent; moreover, they are expected to be less affected by limitations and constraints of individual communication links. The developed co-simulator has allowed the definition of specific countermeasures against the limitations of the communication network, with particular reference to latency and loss of information, for both wired and wireless communication networks.
Moreover, the co-simulation platform has also been coupled with a mobility simulator in order to study specific countermeasures against the negative effects on the medium voltage distribution network caused by the concurrent connection of electric vehicles.

Relevance:

30.00%

Publisher:

Abstract:

Forest models are tools for explaining and predicting the dynamics of forest ecosystems. They simulate forest behaviour by integrating information on the underlying processes in trees, soil and atmosphere. Bayesian calibration is the application of probability theory to parameter estimation; it is a method, applicable to all models, that quantifies output uncertainty and identifies key parameters and variables. This study aims at testing the Bayesian calibration procedure on different types of forest models, to evaluate their performances and the uncertainties associated with them. In particular, we aimed at 1) applying a Bayesian framework to calibrate forest models and test their performances in different biomes and environmental conditions, 2) identifying and solving structure-related issues in simple models, and 3) identifying the advantages of the additional information made available when calibrating forest models with a Bayesian approach. We applied the Bayesian framework to calibrate the Prelued model on eight Italian eddy-covariance sites in Chapter 2. The ability of Prelued to reproduce the estimated Gross Primary Productivity (GPP) was tested over contrasting natural vegetation types that represented a wide range of climatic and environmental conditions. The issues related to Prelued's multiplicative structure were the main topic of Chapter 3: several different MCMC-based procedures were applied within a Bayesian framework to calibrate the model, and their performances were compared. A more complex model was applied in Chapter 4, focusing on the application of the physiology-based model HYDRALL to the forest ecosystem of Lavarone (IT), to evaluate the importance of additional information in the calibration procedure and its impact on model performances, model uncertainties, and parameter estimation.
Overall, the Bayesian technique proved to be an excellent and versatile tool for calibrating forest models of different structure and complexity, on different kinds and numbers of variables and with different numbers of parameters involved.
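The MCMC-based Bayesian calibration described above can be illustrated with a minimal random-walk Metropolis sketch. The one-parameter light-response "model", its parameter `alpha`, and all numeric values below are hypothetical stand-ins for illustration only, not the Prelued model or the data used in the thesis.

```python
import math
import random

random.seed(42)

# Toy stand-in for a forest model: GPP as a saturating function of light,
# with a single free parameter alpha (all values hypothetical).
def gpp_model(alpha, light):
    return alpha * light / (light + 200.0)

# Synthetic "observations" generated with a known true parameter plus noise.
true_alpha = 5.0
light_levels = [100.0, 300.0, 600.0, 1200.0]
obs = [gpp_model(true_alpha, L) + random.gauss(0.0, 0.1) for L in light_levels]

def log_posterior(alpha):
    if not (0.0 < alpha < 20.0):          # flat prior on (0, 20)
        return -math.inf
    sigma = 0.1                           # assumed observation error
    return sum(-0.5 * ((gpp_model(alpha, L) - y) / sigma) ** 2
               for L, y in zip(light_levels, obs))

# Random-walk Metropolis sampler: propose, then accept or reject.
def metropolis(n_steps=20000, start=1.0, step=0.3):
    chain, current, lp = [], start, log_posterior(start)
    for _ in range(n_steps):
        proposal = current + random.gauss(0.0, step)
        lp_prop = log_posterior(proposal)
        if math.log(random.random()) < lp_prop - lp:
            current, lp = proposal, lp_prop
        chain.append(current)
    return chain

chain = metropolis()
burned = chain[5000:]                     # discard burn-in
posterior_mean = sum(burned) / len(burned)
print(round(posterior_mean, 2))           # close to the true value of 5.0
```

The posterior samples quantify parameter uncertainty directly, which is the practical advantage of the Bayesian approach highlighted in the abstract: the calibration returns a distribution rather than a single best-fit value.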

Relevance:

30.00%

Publisher:

Abstract:

Stable isotope composition of atmospheric carbon monoxide: a modelling study.

This study aims at an improved understanding of the stable carbon and oxygen isotope composition of carbon monoxide (CO) in the global atmosphere by means of numerical simulations. First, a new kinetic chemistry tagging technique for the most complete parameterisation of isotope effects was introduced into the Modular Earth Submodel System (MESSy) framework. Incorporated into the ECHAM/MESSy Atmospheric Chemistry (EMAC) general circulation model, it now allows an explicit treatment of isotope effects on the global scale. The expanded model system was applied to simulate the chemical system containing up to five isotopologues of all carbon- and oxygen-bearing species, which ultimately determine the δ13C, δ18O and Δ17O isotopic signatures of atmospheric CO. As model input, a new stable-isotope-inclusive emission inventory for the relevant trace gases was compiled. The uncertainties of the emission estimates and of the resulting simulated mixing and isotope ratios were analysed. The simulated CO mixing and stable isotope ratios were compared to in-situ measurements from ground-based observatories and from the civil-aircraft-mounted CARIBIC−1 measurement platform.

The systematically underestimated 13CO/12CO ratios of earlier, simplified modelling studies can now be partly explained. The EMAC simulations do not support the inference of those studies that CO receives a reduced input from the methane oxidation source, which is strongly depleted in 13C. In particular, a high average yield of 0.94 CO per reacted methane (CH4) molecule is simulated in the troposphere, to a large extent due to the competition between deposition and convective transport affecting the intermediates of the CH4-to-CO reaction chain. None of the other factors hypothesised in previous studies to have the potential to enrich tropospheric CO in 13C, whether assumed or disregarded there, was found significant when explicitly simulated. Inaccurate surface emissions, likely underestimated over East Asia, are responsible for roughly half of the discrepancies between the simulated and observed 13CO in the northern hemisphere (NH), whereas the remote southern hemisphere (SH) compositions suggest an underestimated fractionation during the oxidation of CO by the hydroxyl radical (OH). A reanalysis of the kinetic isotope effect (KIE) in this reaction challenges the conventional assumption of a mere pressure dependence and instead suggests an additional temperature dependence of the 13C KIE, driven by changes in the partitioning of the reaction exit channels. This result is yet to be confirmed in the laboratory.

Apart from 13CO, the atmospheric distribution of the oxygen mass-independent fractionation (MIF) in CO, Δ17O, was for the first time consistently simulated on the global scale with EMAC. The applicability of Δ17O(CO) observations to unravelling changes in the tropospheric CH4-CO-OH system was scrutinised, as well as the implications of the ozone (O3) input to the CO oxygen isotope budget. Δ17O(CO) is confirmed to be the principal signal of the CO photochemical age, thus providing a measure of the OH chiefly involved in the sink of CO. The highly mass-independently fractionated O3 oxygen is estimated to comprise around 2% of the overall tropospheric CO source, which has implications for the δ18O budget of CO, but less likely for its Δ17O budget. Finally, additional sensitivity simulations with EMAC corroborate the nearly equal net effects of the present-day CH4 and CO burdens in removing tropospheric OH, as well as the large turnover and the stability of the OH abundance. The simulated CO isotopologues nonetheless hint at a likely insufficient OH regeneration in the NH high latitudes and in the upper troposphere / lower stratosphere (UTLS).
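The δ and Δ17O notation used throughout this abstract follows the standard isotope-ratio definitions. A small sketch of those definitions, assuming the commonly cited VPDB and mass-dependent-slope values from the literature (the sample numbers themselves are invented):

```python
# Reference standard ratio for carbon (VPDB, literature value).
R13_VPDB = 0.0112372       # 13C/12C

def delta_permil(r_sample, r_standard):
    """Delta value in per mil: relative deviation of a sample ratio from a standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

def cap_delta_17O(d17, d18, slope=0.528):
    """Oxygen MIF anomaly: excess of d17O over the mass-dependent fractionation line."""
    return d17 - slope * d18

# Hypothetical CO sample with a 13C/12C ratio slightly below VPDB,
# i.e. depleted in 13C, so delta comes out negative.
d13C = delta_permil(0.01109, R13_VPDB)
# Invented per-mil oxygen deltas; a nonzero result signals mass-independent
# fractionation, the Delta17O(CO) signal discussed above.
anomaly = cap_delta_17O(d17=5.0, d18=8.0)
print(round(d13C, 1), round(anomaly, 2))
```

A sample lying exactly on the mass-dependent line (δ17O = 0.528 · δ18O) yields Δ17O = 0, which is why Δ17O(CO) isolates the mass-independent ozone contribution from ordinary fractionation.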

Relevance:

30.00%

Publisher:

Abstract:

Numerical modelling was performed to study the dynamics of multilayer detachment folding and salt tectonics. In the case of multilayer detachment folding, analytically derived diagrams show several folding modes, half of which are applicable to crustal-scale folding. 3D numerical simulations are in agreement with the 2D predictions, yet fold interactions result in complex fold patterns. Pre-existing salt diapirs change the folding patterns, as they localize the initial deformation. If the diapir spacing is much smaller than the dominant folding wavelength, diapirs appear in fold synclines or limbs.

Numerical models of 3D down-building diapirism show that the sedimentation rate controls whether diapirs form and influences the overall patterns of diapirism. Numerical codes were used to retrodeform modelled salt diapirs. Reverse modelling can retrieve the initial geometries of a 2D Rayleigh-Taylor instability with non-linear rheologies. Although intermediate geometries of down-built diapirs are retrieved, forward and reverse modelling solutions deviate.

Finally, the dynamics of fold-and-thrust belts formed over a tilted viscous detachment is studied, and it is demonstrated that mechanical stratigraphy has an impact on the deformation style, switching it from thrust- to folding-dominated. The basal angle of the detachment controls the deformation sequence of the fold-and-thrust belt, and the results are consistent with critical wedge theory.
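The dominant folding wavelength invoked above can be illustrated with the classical single-layer Biot relation for a stiff viscous layer embedded in a weaker viscous matrix; the thesis treats the more general multilayer case analytically, so this is only the textbook limiting case, and the thickness and viscosity values below are hypothetical:

```python
import math

def biot_dominant_wavelength(h, eta_layer, eta_matrix):
    """Dominant folding wavelength of a single stiff viscous layer (Biot relation):
    lambda_d = 2 * pi * h * (eta_layer / (6 * eta_matrix))**(1/3)."""
    return 2.0 * math.pi * h * (eta_layer / (6.0 * eta_matrix)) ** (1.0 / 3.0)

# Hypothetical crustal-scale values: a 1 km thick competent layer with a
# viscosity contrast of 100 against the surrounding matrix.
wavelength = biot_dominant_wavelength(h=1000.0, eta_layer=1e21, eta_matrix=1e19)
print(f"{wavelength / 1000.0:.1f} km")   # on the order of 16 km
```

Because the wavelength grows only with the cube root of the viscosity contrast, diapirs spaced well below this scale end up in the synclines or limbs of the folds, as the abstract notes.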