879 results for Physics Based Modeling
Abstract:
The marine aragonite cycle has been included in the global biogeochemical model PISCES to study the role of aragonite in shallow-water CaCO3 dissolution. Aragonite production is parameterized as a function of mesozooplankton biomass and the aragonite saturation state of ambient waters. Observation-based estimates of marine carbonate production and dissolution are well reproduced by the model, and about 60% of the combined CaCO3 water column dissolution from aragonite and calcite is simulated above 2000 m. In contrast, a calcite-only version yields a much smaller fraction. This suggests that the aragonite cycle should be included in models for a realistic representation of CaCO3 dissolution and alkalinity. For the SRES A2 CO2 scenario, aragonite production rates are projected to decrease notably after 2050. By the end of this century, global aragonite production is reduced by 29% and total CaCO3 production by 19% relative to pre-industrial levels. Geographically, the effect of increasing atmospheric CO2, and the subsequent reduction in saturation state, is largest in the subpolar and polar areas, where modeled aragonite production is projected to decrease by 65% by 2100.
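A minimal sketch of the kind of parameterization described here, in which aragonite production scales with mesozooplankton biomass and shuts down in undersaturated waters; the functional form and the rate constant are illustrative assumptions, not the actual PISCES formulation:

```python
def aragonite_production(mesozoo_biomass, omega_arag, rate_const=0.1):
    """Hypothetical aragonite production rate as a function of
    mesozooplankton biomass and the ambient saturation state omega_arag."""
    # No production where the water is undersaturated (omega < 1)
    saturation_term = max(0.0, omega_arag - 1.0)
    return rate_const * mesozoo_biomass * saturation_term
```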
Abstract:
When observers are presented with two visual targets appearing in the same position in close temporal proximity, a marked reduction in detection performance for the second target has often been reported, the so-called attentional blink (AB) phenomenon. Several studies found a decrement in P300 amplitudes during the attentional blink period that parallels the drop in second-target detection performance. However, whether the parallel courses of second-target performance and the corresponding P300 amplitudes resulted from the same underlying mechanisms remained unclear. The aim of our study was therefore to investigate whether the mechanisms underlying the AB can be assessed by fixed-links modeling and whether this kind of assessment would reveal the same, or at least related, processes in the behavioral and electrophysiological data. On both levels of observation, three highly similar processes could be identified: an increasing, a decreasing and a U-shaped trend. Corresponding processes from the behavioral and electrophysiological data were substantially correlated, with the two U-shaped trends showing the strongest association with each other. Our results provide evidence for the assumption that the same mechanisms underlie attentional blink task performance at the electrophysiological and behavioral levels as assessed by fixed-links models.
Abstract:
The spectacular images of Comet 103P/Hartley 2 recorded by the Medium Resolution Instrument (MRI) and High Resolution Instrument (HRI) on board the Extrasolar Planet Observation and Deep Impact Extended Investigation (EPOXI) spacecraft, during the Deep Impact extended mission, revealed that its bi-lobed, very active nucleus outgasses volatiles heterogeneously. Indeed, CO2 is the primary driver of activity, dragging chunks of pure ice out of the nucleus from the sub-solar lobe; these chunks appear to be the main source of water in Hartley 2's coma, sublimating slowly as they move away from the nucleus. Water vapor, however, is released by direct sublimation of the nucleus at the waist without any significant amount of either CO2 or icy grains. The coma structure for a comet with such areas of diverse chemistry differs from the usual models, in which gases are produced homogeneously from the surface. We use the fully kinetic Direct Simulation Monte Carlo model of Tenishev et al. (Tenishev, V.M., Combi, M.R., Davidsson, B. [2008]. Astrophys. J. 685, 659-677; Tenishev, V.M., Combi, M.R., Rubin, M. [2011]. Astrophys. J. 732, 104-120), applied to Comet 103P/Hartley 2 and including sublimating icy grains, to reproduce the observations made by EPOXI and ground-based measurements. A realistic bi-lobed nucleus with a succession of active areas of different chemistry was included in the model, enabling us to study the coma of Hartley 2 in detail. The different gas production rates from each area were found by fitting spectra computed with a line-by-line non-LTE radiative transfer model to the HRI observations. The presence of long-lived icy grains, which are pushed anti-sunward by radiation pressure, explains the observed OH asymmetry, with enhancement on the night side of the coma.
Volcanic forcing for climate modeling: a new microphysics-based data set covering years 1600–present
Abstract:
As the understanding and representation of the impacts of volcanic eruptions on climate have improved in recent decades, uncertainties in the stratospheric aerosol forcing from large eruptions are now linked not only to visible optical depth estimates on a global scale but also to details of the size, latitude and altitude distributions of the stratospheric aerosols. Based on our understanding of these uncertainties, we propose a new model-based approach to generating a volcanic forcing for general circulation model (GCM) and chemistry-climate model (CCM) simulations. This new volcanic forcing, covering the period 1600 to the present, uses an aerosol microphysical model to provide a realistic, physically consistent treatment of the stratospheric sulfate aerosols. Twenty-six eruptions were modeled individually using the latest available ice-core aerosol mass estimates and historical data on the latitude and date of each eruption. The evolution of the aerosol spatial and size distributions after the sulfur dioxide discharge is thus characterized for each volcanic eruption. Large variations are seen in hemispheric partitioning and size distributions in relation to the location and date of the eruptions and the injected SO2 masses. Results for recent eruptions show reasonable agreement with observations. By providing these new estimates of the spatial distributions of shortwave and longwave radiative perturbations, this volcanic forcing may help to better constrain climate model responses to volcanic eruptions over the 1600-present period. The final data set consists of 3-D values (constant in longitude) of spectrally resolved extinction coefficients, single-scattering albedos and asymmetry factors, calculated for different wavelength bands upon request. Surface area densities for heterogeneous chemistry are also provided.
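To illustrate the construction logic of such a data set, here is a hedged sketch in which each eruption is described by a date, latitude and injected SO2 mass and contributes a sulfate burden that decays with an assumed one-year e-folding time; the entries and the decay constant are rough illustrative values, not output of the microphysical model:

```python
import math

eruptions = [  # (year, latitude_deg, SO2_mass_Tg) -- illustrative entries
    (1815, -8.25, 60.0),   # Tambora
    (1991, 15.1, 17.0),    # Pinatubo
]

def sulfate_burden(year, efold_years=1.0):
    """Crude global stratospheric sulfate burden proxy at a given year."""
    total = 0.0
    for t0, lat, mass in eruptions:
        if year >= t0:
            total += mass * math.exp(-(year - t0) / efold_years)
    return total
```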
Abstract:
Net primary production (NPP) is commonly modeled as a function of chlorophyll concentration (Chl), even though it has long been recognized that variability in intracellular chlorophyll content from light acclimation and nutrient stress confounds the relationship between Chl and phytoplankton biomass. It was suggested previously that satellite estimates of backscattering can be related to phytoplankton carbon biomass (C) under conditions of a conserved particle size distribution or a relatively stable relationship between C and total particulate organic carbon. Together, C and Chl can be used to describe physiological state (through variations in Chl:C ratios) and NPP. Here, we fully develop the carbon-based productivity model (CbPM) to include information on the subsurface light field and nitracline depths to parameterize photoacclimation and nutrient stress throughout the water column. This depth-resolved approach produces profiles of biological properties (Chl, C, NPP) that are broadly consistent with observations. The CbPM is validated using regional in situ data sets of irradiance-derived products, phytoplankton chlorophyll:carbon ratios, and measured NPP rates. CbPM-based distributions of global NPP are significantly different in both space and time from previous Chl-based estimates because of the distinction between biomass and physiological influences on global Chl fields. The new model yields annual, areally integrated water column production of ~52 Pg C a⁻¹ for the global oceans.
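A hedged sketch of the core CbPM idea: backscattering-derived carbon biomass C and Chl give a Chl:C ratio that indexes physiological state, and NPP follows as C times a growth rate. The growth-rate function and all constants below are assumptions for illustration, not the published CbPM coefficients:

```python
def growth_rate(chl_to_c, mu_max=2.0, chl_to_c_max=0.03):
    """Illustrative growth rate (d^-1) increasing with Chl:C toward mu_max."""
    return mu_max * min(chl_to_c / chl_to_c_max, 1.0)

def npp(carbon_biomass, chl):
    """NPP (mg C m^-3 d^-1) from carbon biomass C (mg C m^-3) and Chl."""
    return carbon_biomass * growth_rate(chl / carbon_biomass)
```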
Abstract:
We present a novel framework for encoding latency analysis of arbitrary multiview video coding prediction structures. This framework avoids the need to consider a specific encoder architecture by assuming unlimited processing capacity on the multiview encoder. Under this assumption, only the influence of the prediction structure and the processing times has to be considered, and the encoding latency is obtained systematically by means of a graph model. The results obtained with this model are valid for a multiview encoder with sufficient processing capacity and serve as a lower bound otherwise. Furthermore, with the objective of designing low-latency encoders with a low penalty on rate-distortion performance, the graph model allows us to identify the prediction relationships that add the most encoding latency. Experimental results for JMVM prediction structures illustrate how low-latency prediction structures with a low rate-distortion penalty can be derived in a systematic manner using the new model.
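A minimal sketch of the graph idea under the paper's unlimited-capacity assumption: frames are nodes, prediction dependencies are edges, and a frame's encoding latency is driven by the longest dependency chain leading to it. The structure and processing times below are invented for illustration:

```python
from functools import lru_cache

# Hypothetical prediction structure: deps[frame] lists reference frames,
# proc[frame] is its processing time.
deps = {"I0": [], "P1": ["I0"], "B2": ["I0", "P1"]}
proc = {"I0": 1.0, "P1": 1.0, "B2": 0.5}

@lru_cache(maxsize=None)
def completion_time(frame):
    """A frame can start once all its references are encoded, so its
    completion time is the longest dependency path plus its own time."""
    start = max((completion_time(d) for d in deps[frame]), default=0.0)
    return start + proc[frame]

print(completion_time("B2"))  # latency is set by the longest chain
```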
Abstract:
Lowering the cost of energy has always been a challenge for concentrating photovoltaics (CPV). The FK concentrator enhances the performance (efficiency, acceptance angle and manufacturing tolerances) of the conventional CPV system based on a Fresnel primary stage and a secondary lens, while keeping its simplicity and potentially low-cost manufacturing. The F-XTP approach (Fresnel lens + reflective prism), at first glance, has better cost potential but significantly higher sensitivity to manufacturing errors. This work presents a comparison of these two approaches applied to the two main Fresnel lens production technologies (PMMA and silicone-on-glass) and the effect of typical deformations that occur under real operating conditions.
Abstract:
The purpose of this work is to propose a structure for simulating power systems using behavioral models of nonlinear DC-DC converters implemented through a look-up table of gains. This structure is specially designed for converters whose output impedance depends on the load current level, e.g. quasi-resonant converters. The proposed model is generic, and its parameters can be obtained by directly measuring the transient response at different operating points. It also includes optional functionalities for modeling converters with current limiting and current sharing in parallel operation. The proposed structure also allows the inclusion of additional characteristics of the DC-DC converter, such as efficiency as a function of the input voltage and the output current, or overvoltage and undervoltage protections. In addition, the proposed model is valid for both overdamped and underdamped responses.
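A hedged sketch of the look-up-table approach described here: the output is computed from a gain interpolated over load current, so the output impedance varies with the operating point, and an optional current limit caps the load current. All table values are invented for illustration:

```python
import numpy as np

# Hypothetical calibration data: load current (A) -> measured gain
load_current_pts = np.array([0.5, 1.0, 2.0, 4.0])
gain_pts = np.array([0.98, 0.95, 0.90, 0.82])

def output_voltage(v_ref, i_load, i_limit=5.0):
    """Output voltage from a gain look-up table, with current limiting."""
    i_eff = min(i_load, i_limit)  # optional current-limit behavior
    return v_ref * np.interp(i_eff, load_current_pts, gain_pts)
```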
Abstract:
In the field of detection and monitoring of dynamic objects in quasi-static scenes, background subtraction techniques in which the background is modeled at the pixel level are extensively used, despite their significant limitations. In this work we propose a novel approach to background modeling that operates at the region level within a wavelet-based multi-resolution framework. Based on a segmentation of the background, each region is characterized independently as a mixture of K Gaussian modes, modeling the approximation and detail coefficients at the different wavelet decomposition levels. The background region characterization is updated over time, and elements of interest are detected by computing the distance between the background region models and those of each incoming image in the sequence. The inclusion of context in the modeling scheme through region-level characterization makes the model robust, able to cope not only with gradual illumination and long-term changes, but also with sudden illumination changes and the presence of strong shadows in the scene.
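A minimal sketch of the region-level scheme, assuming each region is summarized by a feature vector of wavelet approximation and detail coefficients and modeled with K Gaussian modes; K, the threshold and the distance are illustrative choices:

```python
import numpy as np

class RegionModel:
    def __init__(self, features, k=3):
        # features: (n_frames, dim) wavelet coefficients for one region
        idx = np.random.choice(len(features), k)
        self.means = features[idx]                 # K Gaussian modes
        self.var = features.var(axis=0) + 1e-6

    def is_foreground(self, feat, thresh=3.0):
        """Flag the region when `feat` is far from every background mode."""
        d = np.abs(self.means - feat) / np.sqrt(self.var)
        return d.max(axis=1).min() > thresh
```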
Abstract:
Here, a novel and efficient moving object detection strategy based on non-parametric modeling is presented. Whereas the foreground is modeled by combining color and spatial information, the background model is constructed exclusively with color information, which greatly reduces the computational and memory requirements. The estimation of the background and foreground covariance matrices allows us to obtain compact moving regions while reducing the number of false detections. Additionally, the application of a tracking strategy provides a priori knowledge about the spatial position of the moving objects, which improves the performance of the Bayesian classifier.
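A hedged sketch of the classification idea, with a color-only kernel density estimate for the background and a joint color+position estimate for the foreground; bandwidths and the decision rule are illustrative assumptions:

```python
import numpy as np

def kde_likelihood(x, samples, bandwidth):
    """Gaussian KDE with scalar bandwidth (normalization omitted, since
    only the likelihood ratio matters for the decision)."""
    d = (samples - x) / bandwidth
    return np.mean(np.exp(-0.5 * np.sum(d * d, axis=1)))

def is_foreground(color, pos, bg_colors, fg_samples):
    p_bg = kde_likelihood(color, bg_colors, bandwidth=15.0)
    p_fg = kde_likelihood(np.concatenate([color, pos]), fg_samples, bandwidth=15.0)
    return p_fg > p_bg
```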
Abstract:
This paper studies feature subset selection in classification using a multiobjective estimation of distribution algorithm. We consider six functions, namely area under the ROC curve, sensitivity, specificity, precision, F1 measure and Brier score, for the evaluation of feature subsets and as the objectives of the problem. One characteristic of these objective functions is the noise in their values, which should be appropriately handled during optimization. Our proposed algorithm consists of two major techniques specially designed for the feature subset selection problem. The first is a solution ranking method based on interval values to handle the noise in the objectives of this problem. The second is a model estimation method that learns a joint probabilistic model of objectives and variables, which is used to generate new solutions and advance through the search space. To simplify model estimation, l1-regularized regression is used to select a subset of problem variables before model learning. The proposed algorithm is compared with a well-known ranking method for interval-valued objectives and a standard multiobjective genetic algorithm. In particular, the effects of the two new techniques are experimentally investigated. The experimental results show that the proposed algorithm obtains comparable or better performance on the tested datasets.
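A minimal sketch of interval-based ranking of noisy objectives: each objective value is an interval around its observed mean, and one solution is preferred on an objective only when its interval lies entirely below the other's (minimization assumed); the interval widths stand in for whatever noise estimate the actual method uses:

```python
def strictly_better(ia, ib):
    """True if interval ia = (lo, hi) lies entirely below interval ib."""
    return ia[1] < ib[0]

def dominates(sol_a, sol_b):
    """Interval-based Pareto dominance over all objectives (minimization)."""
    better_somewhere = False
    for ia, ib in zip(sol_a, sol_b):
        if strictly_better(ib, ia):   # b is clearly better on this objective
            return False
        if strictly_better(ia, ib):
            better_somewhere = True
    return better_somewhere

# Usage: each solution is a list of (lo, hi) intervals, one per objective
print(dominates([(0.1, 0.2), (0.3, 0.4)], [(0.5, 0.6), (0.3, 0.4)]))  # True
```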
Abstract:
This paper proposes a new multi-objective estimation of distribution algorithm (EDA) based on joint modeling of objectives and variables. This EDA uses a multi-dimensional Bayesian network as its probabilistic model. In this way it can capture the dependencies between objectives and the dependencies between variables and objectives, as well as the dependencies between variables learnt in other Bayesian network-based EDAs. This model leads to a problem decomposition that helps the proposed algorithm find better trade-off solutions to the multi-objective problem. In addition to Pareto set approximation, the algorithm is also able to estimate the structure of the multi-objective problem. To apply the algorithm to many-objective problems, it includes four different ranking methods proposed in the literature for this purpose. The algorithm is applied to the set of walking fish group (WFG) problems, and its optimization performance is compared with an evolutionary algorithm and another multi-objective EDA. The experimental results show that the proposed algorithm performs significantly better on many of the problems and across different objective space dimensions, and achieves comparable results on some of them.
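A hedged sketch of the joint-modeling EDA loop, with a multivariate Gaussian standing in for the multi-dimensional Bayesian network (learning that network is beyond a short sketch): the better solutions are selected, a joint model over variables and objectives is fitted, and new variable settings are sampled from it. The selection rule and fractions are illustrative:

```python
import numpy as np

def eda_step(population, objectives, top_frac=0.5):
    """One generation: fit a joint Gaussian over [variables | objectives]
    of the selected solutions, then sample a new population."""
    joint = np.hstack([population, objectives])
    order = np.argsort(objectives.sum(axis=1))        # crude scalarization
    elite = joint[order[: int(len(joint) * top_frac)]]
    mean, cov = elite.mean(axis=0), np.cov(elite, rowvar=False)
    sample = np.random.multivariate_normal(mean, cov, size=len(population))
    return sample[:, : population.shape[1]]           # keep the variable part
```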
Abstract:
A good understanding of land surface processes is considered a key subject in environmental sciences. The spatio-temporal coverage of remote sensing data provides continuous observations with a high temporal frequency, allowing the assessment of ecosystem evolution at different temporal and spatial scales. Although the value of remote sensing time series has been firmly proved, only a few time series methods have been developed for analyzing these data in a quantitative and continuous manner. In this dissertation, a working framework to exploit remote sensing time series is proposed, based on the combination of time series analysis and the phenometric approach. The main goal is to demonstrate the use of remote sensing time series to analyze environmental variable dynamics quantitatively. The specific objectives are (1) to assess environmental variables based on remote sensing time series and (2) to develop empirical models to forecast environmental variables. These objectives have been achieved in four applications whose specific objectives are (1) assessing and mapping cotton crop phenological stages using spectral and phenometric analyses, (2) assessing and modeling fire seasonality in two different ecoregions by dynamic models, (3) forecasting forest fire risk on a pixel basis by dynamic models, and (4) assessing vegetation functioning based on temporal autocorrelation and phenometric analysis. The results of this dissertation show the usefulness of function fitting procedures to model the spectral indices AS1 and AS2. Phenometrics derived from the function fitting procedure make it possible to identify cotton crop phenological stages. Spectral analysis has demonstrated quantitatively the presence of one cycle in AS2 and two in AS1, as well as the unimodal and bimodal behaviour of fire seasonality in the Mediterranean and temperate ecoregions, respectively. Autoregressive models have been used to characterize the dynamics of fire seasonality in the two ecoregions and to forecast fire risk accurately on a pixel basis. The usefulness of temporal autocorrelation to define and characterize land surface functioning has been demonstrated. Finally, the "Optical Functional Type" concept has been proposed; in this approach, pixels are considered as temporal units and analyzed according to their temporal dynamics or functioning.
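A minimal sketch of the autoregressive forecasting step used for pixel-level fire risk, assuming an AR(p) model fitted by least squares to one pixel's time series; the order and the data handling are illustrative assumptions:

```python
import numpy as np

def fit_ar(series, p=2):
    """Least-squares AR(p) coefficients for one pixel's time series."""
    X = np.column_stack([series[i: len(series) - p + i] for i in range(p)])
    y = series[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def forecast_next(series, coeffs):
    """One-step-ahead forecast from the last p observations."""
    return float(series[-len(coeffs):] @ coeffs)
```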
Abstract:
Usability plays an important role in satisfying users' needs. There are many recommendations in the HCI literature on how to improve software usability. Our research focuses on those recommendations that affect the system architecture rather than just the interface. However, improving software usability in aspects that affect the architecture increases the analyst's workload and development complexity. This paper proposes a solution based on model-driven development. We propose representing functional usability mechanisms abstractly by means of conceptual primitives. The analyst uses these primitives to incorporate functional usability features at the early stages of the development process. Following the model-driven development paradigm, these features are then automatically transformed in subsequent steps of development, a process that is hidden from the analyst.