834 results for Measurement based model identification
Abstract:
Forest models are tools for explaining and predicting the dynamics of forest ecosystems. They simulate forest behavior by integrating information on the underlying processes in trees, soil and atmosphere. Bayesian calibration is the application of probability theory to parameter estimation. It is a method, applicable to all models, that quantifies output uncertainty and identifies key parameters and variables. This study aims to test the Bayesian calibration procedure on different types of forest models, and to evaluate their performances and the uncertainties associated with them. In particular, we aimed at 1) applying a Bayesian framework to calibrate forest models and test their performances in different biomes and different environmental conditions, 2) identifying and solving structure-related issues in simple models, and 3) identifying the advantages of additional information made available when calibrating forest models with a Bayesian approach. In Chapter 2, we applied the Bayesian framework to calibrate the Prelued model on eight Italian eddy-covariance sites. The ability of Prelued to reproduce the estimated Gross Primary Productivity (GPP) was tested over contrasting natural vegetation types that represented a wide range of climatic and environmental conditions. The issues related to Prelued's multiplicative structure were the main topic of Chapter 3: several MCMC-based procedures were applied within a Bayesian framework to calibrate the model, and their performances were compared. A more complex model was applied in Chapter 4, which focused on the application of the physiology-based model HYDRALL to the forest ecosystem of Lavarone (IT), to evaluate the importance of additional information in the calibration procedure and its impact on model performances, model uncertainties, and parameter estimation.
Overall, the Bayesian technique proved to be an excellent and versatile tool for calibrating forest models of different structure and complexity, with different kinds and numbers of calibration variables and different numbers of parameters involved.
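As a minimal illustration of the Bayesian calibration idea described above, the sketch below fits a hypothetical saturating light-response model of GPP to synthetic observations with a random-walk Metropolis sampler. The model form, parameter names, and noise level are assumptions for illustration, not the Prelued or HYDRALL equations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy "forest model": GPP as a saturating function of light.
# (Illustrative only -- not the actual Prelued or HYDRALL formulation.)
def gpp_model(light, a, b):
    return a * light / (light + b)

# Synthetic "eddy-covariance" observations generated with known parameters.
light = np.linspace(50, 800, 40)
obs = gpp_model(light, a=10.0, b=200.0) + rng.normal(0, 0.3, light.size)

def log_posterior(theta):
    a, b = theta
    if a <= 0 or b <= 0:                     # flat priors on (0, inf)
        return -np.inf
    resid = obs - gpp_model(light, a, b)
    return -0.5 * np.sum((resid / 0.3) ** 2)  # Gaussian likelihood

# Random-walk Metropolis sampler.
theta = np.array([5.0, 100.0])
lp = log_posterior(theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0, [0.2, 8.0])  # proposal step sizes
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.array(samples[5000:])               # discard burn-in
a_hat, b_hat = post.mean(axis=0)              # posterior means
```

The posterior sample also quantifies output uncertainty directly: percentiles of `post` give credible intervals for each parameter, which is the "additional information" a Bayesian calibration yields beyond a point estimate.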
Abstract:
We consider stochastic individual-based models for the social behaviour of groups of animals. In these models the trajectory of each animal is given by a stochastic differential equation (SDE) with interaction; the social interaction is contained in the drift term of the SDE. We consider a global aggregation force and a short-range repulsion force, where the repulsion range and strength are rescaled with the number of animals N. We show that as N tends to infinity the stochastic fluctuations disappear and a smoothed version of the empirical process converges uniformly towards the solution of a nonlinear, nonlocal partial differential equation of advection-reaction-diffusion type. The rescaling of the repulsion in the individual-based model implies that the corresponding term in the limit equation is local, while the aggregation term remains non-local. Moreover, we discuss the effect of a predator on the system and derive an analogous convergence result. The predator acts as a repulsive force, and different laws of motion for the predator are considered.
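A minimal numerical sketch of such an interacting-particle SDE, integrated with the Euler-Maruyama scheme: the linear attraction to the centre of mass, the N-dependent repulsion rescaling, and all coefficients below are illustrative assumptions, not the thesis's exact model.

```python
import numpy as np

rng = np.random.default_rng(1)
N, sigma, dt, steps = 200, 0.1, 0.01, 500

# Hypothetical drift: global aggregation toward the centre of mass plus a
# short-range pairwise repulsion whose range shrinks with N (an
# illustrative rescaling, not the one analysed in the thesis).
def drift(x):
    com = x.mean(axis=0)
    attract = -(x - com)                        # global aggregation force
    diff = x[:, None, :] - x[None, :, :]        # pairwise displacements
    dist2 = (diff ** 2).sum(-1) + 1e-12
    eps2 = 1.0 / N                              # rescaled repulsion range^2
    rep = (diff * np.exp(-dist2 / eps2)[..., None]).sum(axis=1) / N
    return attract + 50.0 * rep

# Euler-Maruyama integration of dX_i = drift dt + sigma dW_i in 2D.
x = rng.normal(0, 2.0, (N, 2))
for _ in range(steps):
    x += drift(x) * dt + sigma * np.sqrt(dt) * rng.normal(0, 1, x.shape)

# Mean distance to the centre of mass: the group has aggregated.
spread = np.sqrt(((x - x.mean(0)) ** 2).sum(1)).mean()
```

The aggregation term acts at all distances (non-local), while the exponential repulsion kernel is effectively zero beyond a range that vanishes as N grows, mirroring why the repulsion becomes a local term in the limit equation.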
Abstract:
For the registration of pharmaceuticals, a comprehensive analysis of their genotoxic potential is required. Owing to the multitude of genotoxic mechanisms and the resulting damage, a tiered testing strategy is defined by the ICH guideline "Guidance on genotoxicity testing and data interpretation for pharmaceuticals intended for human use S2(R1)" in order to identify all genotoxic compounds. The standard test battery is of limited use in the early phase of drug development because of its low throughput and the small amount of compound available. In addition, in vitro genotoxicity tests in mammalian cells have relatively low specificity. A complete safety assessment requires in vivo carcinogenicity testing; however, these test systems are costly and time-consuming. New research approaches therefore aim to improve predictivity and to capture genotoxic potential already in the early phase of drug development. High content imaging (HCI) technology offers an approach with improved throughput compared with the standard test battery. In addition, a cell-based model has the advantage of generating data relatively quickly while requiring only small amounts of compound. HCI-based test systems therefore permit testing in the early phase of pharmaceutical drug development. The aim of this study was to develop a new, specific and sensitive HCI-based in vitro test system for genotoxins and progenotoxins using HepG2 cells. Because of their limited metabolic capacity, a combined system consisting of HepG2 cells and a metabolic activation system was established for testing progenotoxic compounds.
Based on a previous genome expression profiling study (Boehme et al., 2011) and a literature search, the following nine proteins of the DNA damage response were selected as putative markers of compound-induced genotoxicity: p-p53 (Ser15), p21, p-H2AX (Ser139), p-Chk1 (Ser345), p-ATM (Ser1981), p-ATR (Ser428), p-CDC2 (Thr14/Tyr15), GADD45A and p-Chk2 (Thr68). The expression or activation of these proteins was determined by HCI 48 h after treatment with the (pro-)genotoxic compounds (cyclophosphamide, 7,12-dimethylbenz[a]anthracene, aflatoxin B1, 2-acetylaminofluorene, methyl methanesulfonate, actinomycin D, etoposide) and the non-genotoxic compounds (D-mannitol, phenformin hydrochloride, progesterone). The best classification was achieved using the following five of the original nine putative marker proteins: p-p53 (Ser15), p21, p-H2AX (Ser139), p-Chk1 (Ser345) and p-ATM (Ser1981). In the second part of this work, the five selected proteins were tested with compounds recommended by the European Centre for the Validation of Alternative Methods (ECVAM) for assessing the performance of new or modified in vitro genotoxicity tests. The new test system achieved a sensitivity of 80% and a specificity of 86%, resulting in a predictivity of 84%. The synergistic effect of these five proteins allows genotoxic compounds that induce DNA damage through a wide variety of mechanisms to be identified with a high success rate.
In summary, a highly predictive test system with metabolic activation was established for a broad spectrum of potentially genotoxic compounds. Owing to its high throughput, short turnaround time and low compound requirements, it is well suited for compound prioritisation and selection during lead optimisation, and it additionally provides mechanistic clues to the genotoxic mode of action of the test compound.
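The reported sensitivity (80%), specificity (86%) and predictivity (84%) follow from standard confusion-matrix formulas. The counts below are hypothetical, chosen only so that the rounded rates match those reported; the actual ECVAM compound counts are not given in the abstract.

```python
# Hypothetical confusion-matrix counts (not the study's actual numbers).
tp, fn = 16, 4    # genotoxic compounds: correctly detected / missed
tn, fp = 25, 4    # non-genotoxic compounds: correctly cleared / false alarms

sensitivity = tp / (tp + fn)                       # fraction of genotoxins found
specificity = tn / (tn + fp)                       # fraction of non-genotoxins cleared
predictivity = (tp + tn) / (tp + fn + tn + fp)     # overall accuracy
```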
Abstract:
A characteristic neuropathological hallmark of Alzheimer's disease (AD), the most common form of human dementia, is the occurrence of senile plaques in the brains of patients. The neurotoxic A-beta peptide is the main component of these deposits. A shift in the expression equilibrium of the APP-competing proteases BACE-1 and ADAM10 in favour of the beta-secretase BACE-1 contributes to the pathologically increased A-beta generation. This dissertation aimed to identify molecular mechanisms that contribute to a pathologically altered balance of APP cleavage and thus to the onset and progression of AD. Furthermore, the aim was to identify compounds that can restore the physiological balance of APP processing by modulating the gene expression of one of the two proteases and that are therefore of potential therapeutic use. A screen of 704 transcription factors yielded 23 factors that influenced the ratio of ADAM10 to BACE-1 promoter activity. Two of these molecular factors were examined in detail with respect to their mechanism of action: the transcription factor "X box binding protein-1" (XBP-1), which regulates the so-called unfolded protein response (UPR), increased ADAM10 expression in cell culture experiments. The amount of this factor was significantly reduced in AD patients compared with healthy, age-matched controls. In contrast, the senescence-associated transcription factor "T box 2" (Tbx2) reduced the amount of ADAM10 in SH-SY5Y cells. The expression of this factor itself was elevated in post-mortem cortex tissue of AD patients.
In addition to the transcription factors, in a collaboration with the Helmholtz Zentrum München three microRNAs (miRNA 103, 107 and 1306) that reduce the expression of human ADAM10 were predicted bioinformatically and validated experimentally. This work thus identified endogenous factors that regulate the amount of ADAM10 and are consequently potentially involved in the disturbed homeostasis of APP processing. AD should accordingly be understood, also with regard to A-beta-mediated pathology, as a multifactorial disease in which various regulators can contribute to disturbed APP processing and thus to pathologically increased A-beta generation. A pharmacological increase of ADAM10 gene expression would lead to the release of neuroprotective APPs-alpha and, at the same time, to reduced A-beta generation. A further aim of this work was therefore to evaluate compounds with therapeutic potential with regard to increased ADAM10 expression. Of 640 FDA-approved drugs in a compound library, 23 compounds were identified that significantly increased the amount of ADAM10 while leaving the expression of BACE-1 and APP unaffected. In collaboration with the Institute of Pathology (Johannes Gutenberg University Mainz), a cell culture-based model was established to examine the ability of the candidate compounds to permeate the blood-brain barrier (BBB). Of the 23 drugs, nine were characterised as BBB-permeable in this model; these remaining drugs thus fulfil the basic requirements for an AD therapeutic. Besides APP, ADAM10 cleaves a large number of other substrates with diverse cellular functions. For example, the cell adhesion molecule neuroligin-1 (NL-1), which is processed by ADAM10, regulates the synaptic function of excitatory neurons.
For this reason, assessing potential therapy-related side effects is very important. During a research stay at the University of Tokyo, a retinoid-induced increase of ADAM10 in primary rat cortical neurons was found to enhance not only alpha-secretory APP processing but also the cleavage of NL-1. This suggests that treatment with the retinoid acitretin affects not only APP cleavage by ADAM10 but also the regulation of glutamatergic neurons through the cleavage of NL-1. These findings should be analysed further in a suitable Alzheimer animal model in order to establish a safe therapeutic approach based on increased ADAM10 gene expression.
Abstract:
Experimental measurements are used to characterize the anisotropy of flow stress in extruded magnesium alloy AZ31 sheet during uniaxial tension tests at temperatures between 350°C and 450°C, and strain rates ranging from 10⁻⁵ to 10⁻² s⁻¹. The sheet exhibits lower flow stress and higher tensile ductility when loaded with the tensile axis perpendicular to the extrusion direction than when loaded parallel to it. This anisotropy is found to be grain-size, strain-rate, and temperature dependent, but only weakly dependent on texture. A microstructure-based model (D. E. Cipoletti, A. F. Bower, P. E. Krajewski, Scr. Mater., 64 (2011) 931–934) is used to explain the origin of the anisotropic behavior. In contrast to room-temperature behavior, where anisotropy is principally a consequence of the low resistance to slip on the basal slip system, elevated-temperature anisotropy is found to be caused by the grain structure of the extruded sheet. The grains are elongated parallel to the extrusion direction, leading to a lower effective grain size perpendicular to the extrusion direction. As a result, grain boundary sliding occurs more readily when the material is loaded perpendicular to the extrusion direction.
Abstract:
Alternans of cardiac action potential duration (APD) is a well-known arrhythmogenic mechanism which results from dynamical instabilities. The propensity to alternans is classically investigated by examining APD restitution and by deriving APD restitution slopes as predictive markers. However, experiments have shown that such markers are not always accurate for the prediction of alternans. Using a mathematical ventricular cell model known to exhibit unstable dynamics of both membrane potential and Ca2+ cycling, we demonstrate that an accurate marker can be obtained by pacing at cycle lengths (CLs) varying randomly around a basic CL (BCL) and by evaluating the transfer function between the time series of CLs and APDs using an autoregressive-moving-average (ARMA) model. The first pole of this transfer function corresponds to the eigenvalue (λ_alt) of the dominant eigenmode of the cardiac system, which predicts that alternans occurs when λ_alt ≤ −1. For different BCLs, control values of λ_alt were obtained using eigenmode analysis and compared to the first pole of the transfer function estimated by ARMA model fitting in simulations of random pacing protocols. In all versions of the cell model, this pole provided an accurate estimate of λ_alt. Furthermore, during slow ramp decreases of BCL or simulated drug application, this approach predicted the onset of alternans by extrapolating the time course of the estimated λ_alt. In conclusion, stochastic pacing and ARMA model identification represent a novel approach to predicting alternans without making any assumptions about its ionic mechanisms. It should therefore be applicable experimentally to any type of myocardial cell.
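The core identification step can be sketched with a toy linearised restitution map: pace with CLs varying randomly around a BCL, then fit an input-output (ARMA-style) model by least squares; the coefficient on the previous APD is the transfer-function pole, i.e. the estimate of λ_alt. The map and its coefficients below are illustrative assumptions, not the ventricular cell model used in the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical linearised restitution map: the next APD depends on the
# previous APD (gain lam_alt, the dominant eigenvalue) and the previous CL.
lam_alt, b, c = -0.6, 0.3, 150.0
bcl, n = 400.0, 2000

cl = bcl + rng.uniform(-20, 20, n)        # stochastic pacing protocol (ms)
apd = np.empty(n)
apd[0] = 200.0
for i in range(1, n):
    apd[i] = c + lam_alt * apd[i - 1] + b * cl[i - 1] + rng.normal(0, 0.5)

# Least-squares fit of APD[i] ~ APD[i-1], CL[i-1], const.  The coefficient
# on APD[i-1] is the first pole of the CL->APD transfer function.
X = np.column_stack([apd[:-1], cl[:-1], np.ones(n - 1)])
coef, *_ = np.linalg.lstsq(X, apd[1:], rcond=None)
lam_hat = coef[0]
# Alternans is predicted when the estimated pole reaches -1.
```

Because the estimate comes purely from the paced input-output time series, the same fit can be repeated in a sliding window during a BCL ramp to extrapolate when λ_alt will cross −1, as described above.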
Abstract:
Soil erosion models and soil erosion risk maps are often used as indicators to assess potential soil erosion in order to assist policy decisions. This paper shows the scientific basis of the soil erosion risk map of Switzerland and its application in policy and practice. Linking a USLE/RUSLE-based model approach (AVErosion), founded on multiple flow algorithms and the unit contributing area concept, with an extremely precise, high-resolution digital terrain model (2 m × 2 m grid) in a GIS allows for a realistic assessment of the potential soil erosion risk at the single-plot level, uniformly and comprehensively for the agricultural area of Switzerland (862,579 ha in the valley area and the lower mountain regions). National, small-scale soil erosion prognosis has thus reached a level heretofore possible only in smaller catchment areas or on single plots. Validation was carried out using soil loss data from field mappings of soil erosion damage collected during long-term monitoring in different test areas. Of the evaluated agricultural area of Switzerland, 45% was classified as low potential erosion risk, 12% as moderate, and 43% as high. However, many of the areas classified as high potential erosion risk are located at the transition from the valley to the mountain zone, where much of the land is used as permanent grassland, which drastically lowers its current erosion risk. The present soil erosion risk map serves on the one hand to identify and prioritise high-erosion-risk areas, and on the other to promote awareness amongst farmers and authorities. It was published on the internet and will be made available to the authorities in digital form. It is intended as a tool for simplifying and standardising enforcement of the legal framework for soil erosion prevention in Switzerland. The work therefore provides a successful example of cooperation between science, policy and practice.
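As a minimal sketch of the USLE-type computation underlying such a map, per-cell soil loss is the product A = R·K·LS·C·P, which can then be binned into risk classes. All factor values and class thresholds below are illustrative, not the Swiss AVErosion parameterisation.

```python
import numpy as np

# Illustrative USLE factors (hypothetical values, not the Swiss map's).
R, K = 95.0, 0.25                          # rainfall erosivity, soil erodibility
LS = np.array([[0.4, 1.2],                 # combined slope length/steepness
               [2.8, 6.5]])                # factor, one value per grid cell
C, P = 0.15, 1.0                           # cover and support-practice factors

A = R * K * LS * C * P                     # mean annual soil loss per cell (t/ha/yr)

# Three-class risk map: 0 = low, 1 = moderate, 2 = high
# (thresholds are assumptions for illustration).
risk = np.digitize(A, bins=[2.0, 8.0])
```

In the real map the LS factor is where the high-resolution terrain model enters: multiple flow algorithms and the unit contributing area concept replace the simple per-cell slope value assumed here.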
Abstract:
The effect of shot particles on the high-temperature, low-cycle fatigue of a hybrid fiber/particulate metal-matrix composite (MMC) was studied. Two hybrid composites with the general composition A356/35% SiC particle/5% fiber, one with and one without shot, were tested. It was found that shot particles acting as stress concentrators had little effect on the fatigue performance. Fibers with a high silica content appeared more likely to debond from the matrix. Final failure of the composite was found to occur preferentially in the matrix: SiC particles fracture progressively during fatigue testing, leading to higher stress in the matrix and final failure by matrix overload. A continuum-mechanics-based model was developed to predict failure in fatigue from the tensile properties of the matrix and particles. By accounting for matrix yielding and recovery, composite creep, and the particle strength distribution, failure of the composite was predicted.
Abstract:
Riparian zones are dynamic, transitional ecosystems between aquatic and terrestrial ecosystems with well-defined vegetation and soil characteristics. Because of their high variability, developing an all-encompassing definition for riparian ecotones is challenging. However, all riparian ecotones depend on two primary factors: the watercourse and its associated floodplain. Previous approaches to riparian boundary delineation have used fixed-width buffers, but this methodology has proven inadequate because it takes only the watercourse into consideration and ignores critical geomorphology and the associated vegetation and soil characteristics. Our approach offers advantages over previously used methods by utilizing: the geospatial modeling capabilities of ArcMap GIS; a better sampling technique along the watercourse that can distinguish the 50-year floodplain, which is the optimal hydrologic descriptor of riparian ecotones; the Soil Survey Geographic (SSURGO) and National Wetland Inventory (NWI) databases to distinguish contiguous areas beyond the 50-year floodplain; and land use/cover characteristics associated with the delineated riparian zones. The model uses spatial data readily available from Federal and State agencies and geospatial clearinghouses. An accuracy assessment was performed to assess the impact of varying the 50-year flood height, changing the DEM spatial resolution (1, 3, 5 and 10 m), and positional inaccuracies in the National Hydrography Dataset (NHD) streams layer on the boundary placement of the delineated variable-width riparian ecotone areas. The result of this study is a robust and automated GIS-based model, attached to ESRI ArcMap software, that delineates and classifies variable-width riparian ecotones.
Abstract:
Heterogeneous materials are ubiquitous in nature and as synthetic materials. These materials provide unique combinations of desirable mechanical properties emerging from their heterogeneities at different length scales. Future structural and technological applications will require the development of advanced lightweight materials with superior strength and toughness. Cost-effective design of advanced high-performance synthetic materials by tailoring their microstructure is the challenge facing the materials design community, and prior knowledge of structure-property relationships for these materials is imperative for optimal design. Thus, understanding such relationships for heterogeneous materials is of primary interest. Furthermore, computational burden is becoming a critical concern in several areas of heterogeneous materials design, so computationally efficient and accurate predictive tools are highly essential. In the present study, we focus mainly on the mechanical behavior of soft cellular materials and of a tough biological material, the mussel byssus thread. Cellular materials exhibit microstructural heterogeneity through an interconnected network of a single material phase, whereas the mussel byssus thread comprises two distinct material phases. A robust numerical framework is developed to investigate the micromechanisms behind the macroscopic response of both of these materials. Using this framework, the effect of microstructural parameters on the stress state of cellular specimens during the split Hopkinson pressure bar test has been addressed. A Voronoi-tessellation-based algorithm has been developed to simulate the cellular microstructure. The micromechanisms (microinertia, microbuckling and microbending) governing the macroscopic behavior of cellular solids are investigated thoroughly with respect to various microstructural and loading parameters.
To understand the origin of the high toughness of the mussel byssus thread, a Genetic Algorithm (GA) based optimization framework has been developed. It is found that the two different material phases (collagens) of the mussel byssus thread are optimally distributed along the thread. These applications demonstrate that the presence of heterogeneity in a system demands high computational resources for simulation and modeling. Thus, a Higher Dimensional Model Representation (HDMR) based surrogate modeling concept has been proposed to reduce computational complexity. The applicability of this methodology has been demonstrated in failure envelope construction and in multiscale finite element techniques. It is observed that the surrogate-based model can capture the behavior of complex material systems with sufficient accuracy. The computational algorithms presented in this thesis will further pave the way for accurate prediction of the macroscopic deformation behavior of various classes of advanced materials from their measurable microstructural features at a reasonable computational cost.
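A minimal sketch of the first-order (cut-)HDMR idea mentioned above: the expensive model is approximated by an anchor value plus one-dimensional corrections along each input axis through the anchor point, which is exact for additive functions. The test function and anchor point below are illustrative assumptions.

```python
import numpy as np

# Hypothetical "expensive" model of three inputs (additive on purpose, so
# the first-order HDMR expansion reproduces it exactly).
def f(x):
    return x[0] ** 2 + 3.0 * np.sin(x[1]) + 0.5 * x[2]

c = np.zeros(3)          # anchor (cut) point of the expansion
f0 = f(c)                # zeroth-order term

def hdmr1(x):
    """First-order cut-HDMR surrogate: f0 plus 1D component functions."""
    total = f0
    for i in range(3):
        xi = c.copy()
        xi[i] = x[i]             # vary one input, hold the rest at the anchor
        total += f(xi) - f0      # first-order component along axis i
    return total

x_test = np.array([1.2, 0.7, -2.0])
err = abs(hdmr1(x_test) - f(x_test))
```

For genuinely coupled inputs, second-order terms f_ij would be added; the payoff is that building the surrogate needs only low-dimensional sweeps of the expensive model rather than a full grid, which is the computational saving the thesis exploits.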
Abstract:
Time-averaged discharge rates (TADR) were calculated for five lava flows at Pacaya Volcano (Guatemala), using an adapted version of a previously developed satellite-based model. Imagery acquired during periods of effusive activity between the years 2000 and 2010 was obtained from two sensors of differing temporal and spatial resolutions: the Moderate Resolution Imaging Spectroradiometer (MODIS) and the Geostationary Operational Environmental Satellites (GOES) Imager. A total of 2873 MODIS and 2642 GOES images were searched manually for volcanic "hot spots". MODIS imagery, with its superior spatial resolution, produced better results than GOES imagery, so only MODIS data were used for quantitative analyses. Spectral radiances were transformed into TADR via two methods: first, by best-fitting some of the parameters of the TADR estimation model (i.e. density, vesicularity, crystal content, temperature change) to match flow volumes previously estimated from ground surveys and aerial photographs, and second, by measuring those parameters from lava samples to make independent estimates. A relatively stable relationship was defined using the second method, which suggests the possibility of estimating lava discharge rates in near-real-time during future volcanic crises at Pacaya.
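A radiance-to-TADR conversion of the general kind adapted here divides the satellite-derived radiant power by the heat released per unit volume of lava, TADR = P_rad / (ρ (c_p ΔT + φ c_L)); this is where density, vesicularity, crystal content and temperature change enter. All parameter values below are generic basalt assumptions for illustration, not the measured Pacaya sample values.

```python
# Sketch of converting hot-spot radiant power to a time-averaged
# discharge rate.  All values are illustrative assumptions.
p_rad = 2.0e9       # radiant power of the volcanic hot spot, W
rho = 2600.0        # lava density (corrected for vesicularity), kg/m^3
cp = 1150.0         # specific heat capacity, J/(kg K)
dT = 250.0          # cooling through the active flow, K
phi = 0.45          # crystallisation fraction over that cooling interval
cL = 3.5e5          # latent heat of crystallisation, J/kg

# TADR = radiant power / volumetric heat budget, in m^3/s.
tadr = p_rad / (rho * (cp * dT + phi * cL))
```

Best-fitting these parameters against independently surveyed flow volumes (method one in the abstract) versus measuring them from samples (method two) amounts to choosing the denominator of this expression in two different ways.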
Abstract:
This document demonstrates the methodology used to create an energy- and conductance-based model for power electronic converters. The work is intended as a replacement for voltage- and current-based models, which have limited applicability to the network nodal equations. Conductance-based modeling allows direct application of load differential equations to the bus admittance matrix (Y-bus) with a unified approach. When applied directly to the Y-bus, the system becomes much easier to simulate since the state variables do not need to be transformed. The proposed transformation applies to loads, sources, and energy storage systems and is useful for DC microgrids. Transformed state models of a complete microgrid are compared to experimental results and show that the models accurately reflect the system's dynamic behavior.
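A minimal sketch of Y-bus assembly illustrates why conductance-based loads fold in so directly: a constant-conductance load simply adds to the diagonal entry of its bus. The 3-bus DC network and all values below are illustrative assumptions, not the microgrid studied in the document.

```python
import numpy as np

# Branch list for a hypothetical 3-bus DC network: (from, to, conductance in S).
branches = [(0, 1, 4.0), (1, 2, 2.5), (0, 2, 1.0)]

# Standard Y-bus stamping: each branch adds +y to both diagonal entries
# and -y to the two off-diagonal entries it connects.
n = 3
Y = np.zeros((n, n))
for i, j, y in branches:
    Y[i, i] += y
    Y[j, j] += y
    Y[i, j] -= y
    Y[j, i] -= y

# A constant-conductance load at bus 2 stamps straight onto the diagonal --
# no transformation of its state variables is needed.
g_load = 0.5
Y[2, 2] += g_load
```

Dynamic loads and converters modeled as time-varying conductances can be restamped the same way at each step, which is the unified treatment the nodal equations afford.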
Abstract:
A novel solution to the long-standing issue of chip entanglement and breakage in metal cutting is presented in this dissertation. Through this work, an attempt is made to achieve universal chip control in machining by using chip guidance and subsequent breakage by backward bending (tensile loading of the chip's rough top surface) to effectively control long continuous chips into small segments. One big limitation of the chip breaker geometries in disposable carbide inserts is that their application range is limited to a narrow band of cutting conditions. Even within a recommended operating range, chip breakers do not function as effectively as designed, owing to the inherent variations of the cutting process. Moreover, for a particular process, matching the chip breaker geometry with the right cutting conditions to achieve effective chip control is a very iterative process. The existence of a large variety of proprietary chip breaker designs further exacerbates the problem of easily implementing a robust and comprehensive chip control technique. To address this need, a new method is proposed in this work. By using a single tool top form geometry coupled with a tooling system for inducing chip breaking by backward bending, the proposed method achieves comprehensive chip control over a wide range of cutting conditions. A geometry-based model is developed to predict a variable edge inclination angle that guides the chip flow to a predetermined target location. Chip kinematics for the new tool geometry is examined via photographic evidence from experimental cutting trials. Both qualitative and quantitative methods are used to characterize the chip kinematics. Results from the chip characterization studies indicate that the chip flow and final form show a remarkable consistency across multiple levels of workpiece and tool configurations as well as cutting conditions.
A new tooling system is then designed to comprehensively break the chip by backward bending. Test results with the new tooling system prove that, by utilizing the chip guidance and backward bending mechanism, long continuous chips can be more consistently broken into smaller segments that are generally deemed acceptable or good chips. The proposed tool can be applied effectively over a wider range of cutting conditions than existing chip breakers, thus possibly taking the first step towards achieving universal chip control in machining.
Abstract:
Most accounts of child language acquisition use adult-like syntactic categories and schemas (formal grammars) as analytic tools, with little concern for whether they are psychologically real for young children. Recent research has demonstrated, however, that children do not operate initially with such abstract linguistic entities, but instead operate on the basis of concrete, item-based constructions. Children construct more abstract linguistic constructions only gradually – on the basis of linguistic experience in which frequency plays a key role – and they constrain these constructions to their appropriate ranges of use only gradually as well, again on the basis of linguistic experience in which frequency plays a key role. The best account of first language acquisition is provided by a construction-based, usage-based model in which children process the language they experience in discourse interactions with other persons, relying explicitly and exclusively on social and cognitive skills that children of this age are known to possess.
Abstract:
Following European legislative initiatives in the field of copyright limitations and exceptions, the policy flexibilities formerly available to member states have been greatly diminished. The law in this area is increasingly incapable of accommodating any expansion in the scope of freely permitted acts, even where such expansion may be an appropriate response to changes in social and technological conditions. In this article, the causes of this problem are briefly canvassed and a number of potential solutions are noted. It is suggested that one such solution – the adoption of an open, factor-based model similar to s 107 of the United States' Copyright Act – has not received the serious attention it deserves. The fair use paradigm has generally been dismissed as excessively unpredictable, contrary to international law and/or culturally alien. Drawing on recent fair use scholarship, it is argued here that these disadvantages are overstated and that the potential for the development of a European fair use model merits investigation.