982 results for brightness temperature modeling


Relevance:

30.00%

Publisher:

Abstract:

Some of the most valued natural and cultural landscapes on Earth lie in river basins that are poorly gauged and have incomplete historical climate and runoff records. The Mara River Basin of East Africa is such a basin. It hosts the internationally renowned Mara-Serengeti landscape as well as a rich mixture of indigenous cultures. The Mara River is the sole source of surface water to the landscape during the dry season and periods of drought. During recent years, the flow of the Mara River has become increasingly erratic, especially in the upper reaches, and resource managers are hampered by a lack of understanding of the relative influence of different sources of flow alteration. Uncertainties about the impacts of future climate change compound the challenges. We applied the Soil and Water Assessment Tool (SWAT) to investigate the response of the headwater hydrology of the Mara River to scenarios of continued land use change and projected climate change. Under the data-scarce conditions of the basin, model performance was improved using satellite-based rainfall estimates, which may also improve the usefulness of runoff models in other parts of East Africa. The results of the analysis indicate that any further conversion of forests to agriculture and grassland in the basin headwaters is likely to reduce dry season flows and increase peak flows, leading to greater water scarcity at critical times of the year and exacerbating erosion on hillslopes. Most climate change projections for the region call for modest and seasonally variable increases in precipitation (5–10 %) accompanied by increases in temperature (2.5–3.5 °C). Simulated runoff responses to climate change scenarios were non-linear and suggest the basin is highly vulnerable under low (−3 %) and high (+25 %) extremes of projected precipitation changes, but under median projections (+7 %) there is little impact on annual water yields or mean discharge. Modest increases in precipitation are partitioned largely to increased evapotranspiration. Overall, model results support the existing efforts of Mara water resource managers to protect headwater forests and indicate that additional emphasis should be placed on improving land management practices that enhance infiltration and aquifer recharge as part of a wider program of climate change adaptation.

Relevance:

30.00%

Publisher:

Abstract:

The Eurasian inland propagation of temperature anomalies during glacial millennial-scale climate variability is poorly understood but this knowledge is crucial to understanding hemisphere-wide atmospheric teleconnection patterns and climate mechanisms. Based on biomarkers and geochemical paleothermometers, a pronounced continental temperature variability between 64,000 and 20,000 years ago, coinciding with the Greenland Dansgaard-Oeschger cycles, was determined in a well-dated sediment record from the formerly enclosed Black Sea. Cooling during Heinrich events was not stronger than during other stadials in the Black Sea. This is corroborated by modeling results showing that regular Dansgaard-Oeschger cycles penetrated deeper into the Eurasian continent than Heinrich events. The pattern of coastal ice-rafted detritus suggests a strong dependence on the climate background state, with significantly milder winters during periods of reduced Eurasian ice sheets and an intensified meridional atmospheric circulation.

Relevance:

30.00%

Publisher:

Abstract:

Bayesian nonparametric models, such as the Gaussian process and the Dirichlet process, have been extensively applied for target kinematics modeling in various applications including environmental monitoring, traffic planning, endangered species tracking, dynamic scene analysis, autonomous robot navigation, and human motion modeling. As shown by these successful applications, Bayesian nonparametric models are able to adjust their complexities adaptively from data as necessary, and are resistant to overfitting or underfitting. However, most existing works assume that the sensor measurements used to learn the Bayesian nonparametric target kinematics models are obtained a priori, or that the target kinematics can be measured by the sensor at any given time throughout the task. Little work has been done on controlling a sensor with a bounded field of view to obtain measurements of mobile targets that are most informative for reducing the uncertainty of the Bayesian nonparametric models. To present a systematic sensor planning approach to learning Bayesian nonparametric models, the Gaussian process target kinematics model is introduced first, which is capable of describing time-invariant spatial phenomena, such as ocean currents, temperature distributions and wind velocity fields. The Dirichlet process-Gaussian process target kinematics model is subsequently discussed for modeling mixtures of mobile targets, such as pedestrian motion patterns.
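The dissertation's implementation is not shown in the abstract; the sketch below is only a rough illustration of the first building block it names, a Gaussian process model of a time-invariant spatial field (here a synthetic east-west current component), using scikit-learn. The field, kernel choice, noise level, and measurement locations are all assumptions.

```python
# Minimal sketch (not from the dissertation): Gaussian process regression over a
# time-invariant 2-D field, as in the "ocean currents / temperature / wind" examples.
# The synthetic field, kernel, and sample locations are illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical "true" east-west current component u(x, y) on a 10 km x 10 km patch.
def u_true(xy):
    return np.sin(0.5 * xy[:, 0]) * np.cos(0.3 * xy[:, 1])

# Sparse, noisy measurements (e.g., from moored buoys).
X_obs = rng.uniform(0.0, 10.0, size=(25, 2))
y_obs = u_true(X_obs) + 0.05 * rng.standard_normal(25)

# GP prior: smooth RBF kernel plus measurement noise.
kernel = 1.0 * RBF(length_scale=2.0) + WhiteKernel(noise_level=0.05**2)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_obs, y_obs)

# Posterior mean and uncertainty on a prediction grid; the predictive variance is
# what an information-theoretic sensor planner would try to reduce.
gx, gy = np.meshgrid(np.linspace(0, 10, 20), np.linspace(0, 10, 20))
X_grid = np.column_stack([gx.ravel(), gy.ravel()])
mean, std = gp.predict(X_grid, return_std=True)
print("max posterior std on grid:", std.max())
```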

Novel information theoretic functions are developed for these introduced Bayesian nonparametric target kinematics models to represent the expected utility of measurements as a function of sensor control inputs and random environmental variables. A Gaussian process expected Kullback-Leibler (KL) divergence is developed as the expectation of the KL divergence between the current (prior) and posterior Gaussian process target kinematics models with respect to the future measurements. Then, this approach is extended to develop a new information value function that can be used to estimate target kinematics described by a Dirichlet process-Gaussian process mixture model. A theorem is proposed that shows the novel information theoretic functions are bounded. Based on this theorem, efficient estimators of the new information theoretic functions are designed, which are proved to be unbiased, with the variance of the resultant approximation error decreasing linearly as the number of samples increases. Computational complexities for optimizing the novel information theoretic functions under sensor dynamics constraints are studied, and are proved to be NP-hard. A cumulative lower bound is then proposed to reduce the computational complexity to polynomial time.
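The functional form of the expected KL divergence is not given in the abstract. For reference, the basic identity that any Gaussian-process KL objective reduces to when evaluated at a finite set of points is the KL divergence between two multivariate Gaussians:

```latex
% KL divergence between two k-dimensional Gaussians (a standard identity, not the
% dissertation's novel information function itself):
D_{\mathrm{KL}}\big(\mathcal{N}(\mu_0,\Sigma_0)\,\|\,\mathcal{N}(\mu_1,\Sigma_1)\big)
  = \tfrac{1}{2}\Big[\operatorname{tr}\big(\Sigma_1^{-1}\Sigma_0\big)
  + (\mu_1-\mu_0)^{\top}\Sigma_1^{-1}(\mu_1-\mu_0) - k
  + \ln\frac{\det\Sigma_1}{\det\Sigma_0}\Big]
```

Here k is the dimension of the Gaussians, i.e. the number of points at which the prior and posterior GP models are compared.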

Three sensor planning algorithms are developed according to the assumptions on the target kinematics and the sensor dynamics. For problems where the control space of the sensor is discrete, a greedy algorithm is proposed. The efficiency of the greedy algorithm is demonstrated by a numerical experiment with data of ocean currents obtained by moored buoys. A sweep line algorithm is developed for applications where the sensor control space is continuous and unconstrained. Synthetic simulations as well as physical experiments with ground robots and a surveillance camera are conducted to evaluate the performance of the sweep line algorithm. Moreover, a lexicographic algorithm is designed based on the cumulative lower bound of the novel information theoretic functions, for the scenario where the sensor dynamics are constrained. Numerical experiments with real data collected from indoor pedestrians by a commercial pan-tilt camera are performed to examine the lexicographic algorithm. Results from both the numerical simulations and the physical experiments show that the three sensor planning algorithms proposed in this dissertation based on the novel information theoretic functions are superior at learning the target kinematics with little or no prior knowledge.
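For the discrete-control case the abstract names only a greedy algorithm; a generic one-step-greedy information-gathering loop of the kind implied might look like the following sketch. The callables expected_info_gain, take_measurement, and update_model are hypothetical placeholders for the dissertation's information functions, sensor interface, and Bayesian model update.

```python
# Generic greedy sensor-planning loop (a sketch, not the dissertation's algorithm):
# at each step, pick the discrete control input with the highest estimated
# expected information gain, apply it, and update the target model.
from typing import Any, Callable, Sequence

def greedy_plan(controls: Sequence[Any],
                expected_info_gain: Callable[[Any, Any], float],  # (model, control) -> utility
                take_measurement: Callable[[Any], Any],           # control -> measurement
                update_model: Callable[[Any, Any], Any],          # (model, measurement) -> model
                model: Any,
                horizon: int):
    """Repeatedly choose the single control that maximizes the one-step utility."""
    history = []
    for _ in range(horizon):
        best = max(controls, key=lambda u: expected_info_gain(model, u))
        z = take_measurement(best)
        model = update_model(model, z)
        history.append((best, z))
    return model, history
```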

Relevance:

30.00%

Publisher:

Abstract:

In the process of engineering design of structural shapes, flat plate analysis results can be generalized to predict the behavior of complete structural shapes. Accordingly, the purpose of this project is to analyze a thin flat plate under conductive heat transfer and to simulate the temperature distribution, thermal stresses, total displacements, and buckling deformations. The current approach in such cases has been the Finite Element Method (FEM), whose basis is the construction of a conforming mesh. In contrast, this project uses the mesh-free Scan Solve Method, which eliminates the meshing limitation by using a non-conforming mesh. I implemented this modeling process by developing numerical algorithms and software tools to model thermally induced buckling. In addition, a convergence analysis was performed, and the results were compared with FEM. In conclusion, the results demonstrate that the method gives solutions similar in quality to FEM, but is computationally less time consuming.
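The project's mesh-free Scan Solve implementation is not reproduced here; as a point of reference for the conduction sub-problem mentioned above, a minimal finite-difference relaxation of the steady-state plate temperature field (with assumed edge temperatures) looks like this:

```python
# Illustrative only: steady-state temperature on a thin plate from a simple
# finite-difference Jacobi relaxation of the Laplace equation. This is NOT the
# mesh-free Scan Solve approach of the project; grid size and boundary
# temperatures are assumptions.
import numpy as np

n = 50                            # grid points per side (hypothetical discretization)
T = np.zeros((n, n))
T[0, :] = 100.0                   # hot edge (assumed boundary condition)
T[-1, :] = 0.0
T[:, 0], T[:, -1] = 0.0, 0.0      # remaining edges held cold

for _ in range(5000):             # Jacobi sweeps: each point -> average of neighbours
    T[1:-1, 1:-1] = 0.25 * (T[2:, 1:-1] + T[:-2, 1:-1] + T[1:-1, 2:] + T[1:-1, :-2])

print("centre temperature:", T[n // 2, n // 2])
```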

Relevance:

30.00%

Publisher:

Abstract:

Within Canada there are more than 2.5 million bundles of spent nuclear fuel, with approximately 2 million more to be generated in the future. Canada, and every country around the world that has taken a decision on the management of spent nuclear fuel, has decided on long-term containment and isolation of the fuel within a deep geological repository. At depth, a deep geological repository consists of a network of placement rooms where the bundles will be located within a multi-layered system that incorporates engineered and natural barriers. The barriers will be placed in a complex thermal-hydraulic-mechanical-chemical-biological (THMCB) environment. A large database of material properties for all components in the repository is required to construct representative models. Within the repository, the sealing materials will experience elevated temperatures due to the thermal gradient produced by radioactive decay heat from the waste inside the container. Furthermore, high porewater pressure due to the depth of the repository, along with possibly elevated groundwater salinity, will cause the bentonite-based materials to be under transient hydraulic conditions. It is therefore crucial to characterize the sealing materials over a wide range of thermal-hydraulic conditions. A comprehensive experimental program has been conducted to measure the properties (mainly thermal properties) of all sealing materials involved in the Mark II concept at plausible thermal-hydraulic conditions. The thermal response of Canada's concept for a deep geological repository has been modelled using the experimentally measured thermal properties. Plausible scenarios are defined and their effects on the container surface temperature as well as the surrounding geosphere are examined to assess whether the design criteria are met for the cases studied. The thermal response shows that even if all the materials are in a dried condition, the repository still performs acceptably as long as the sealing materials remain in contact.
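The thesis models the full repository geometry with measured properties; as a much simpler point of comparison, the far-field temperature rise around a single heat-emitting container is often estimated with the classical point-source conduction solution for an exponentially decaying power. The sketch below evaluates that textbook superposition with hypothetical parameter values (they are not the measured sealing-material properties).

```python
# Textbook point-source conduction estimate (Carslaw & Jaeger style), NOT the thesis
# model: temperature rise at radius r from a heat source whose power decays
# exponentially, in an infinite homogeneous medium. All parameter values are
# hypothetical placeholders, not measured repository properties.
import numpy as np

k     = 2.0                          # thermal conductivity, W/(m K)      (assumed)
rho_c = 2.2e6                        # volumetric heat capacity, J/(m3 K)  (assumed)
alpha = k / rho_c                    # thermal diffusivity, m2/s
Q0    = 1000.0                       # initial container power, W          (assumed)
lam   = np.log(2) / (30 * 3.15e7)    # decay constant for a ~30-year half-life, 1/s
r     = 1.0                          # observation radius, m

def dT(t_s, n=20000):
    """Superpose instantaneous point sources over [0, t_s] (midpoint quadrature)."""
    dt = t_s / n
    tp = (np.arange(n) + 0.5) * dt               # release times
    q = Q0 * np.exp(-lam * tp)                   # power released at time tp
    tau = t_s - tp                               # elapsed time since release
    kern = np.exp(-r**2 / (4 * alpha * tau)) / (rho_c * (4 * np.pi * alpha * tau) ** 1.5)
    return np.sum(q * kern) * dt

print("temperature rise after 50 years ~", round(float(dT(50 * 3.15e7)), 1), "K")
```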

Relevance:

30.00%

Publisher:

Abstract:

We review and compare four broad categories of spatially-explicit modelling approaches currently used to understand and project changes in the distribution and productivity of living marine resources including: 1) statistical species distribution models, 2) physiology-based, biophysical models of single life stages or the whole life cycle of species, 3) food web models, and 4) end-to-end models. Single pressures are rare and, in the future, models must be able to examine multiple factors affecting living marine resources such as interactions between: i) climate-driven changes in temperature regimes and acidification, ii) reductions in water quality due to eutrophication, iii) the introduction of alien invasive species, and/or iv) (over-)exploitation by fisheries. Statistical (correlative) approaches can be used to detect historical patterns which may not be relevant in the future. Advancing predictive capacity of changes in distribution and productivity of living marine resources requires explicit modelling of biological and physical mechanisms. New formulations are needed which (depending on the question) will need to strive for more realism in ecophysiology and behaviour of individuals, life history strategies of species, as well as trophodynamic interactions occurring at different spatial scales. Coupling existing models (e.g. physical, biological, economic) is one avenue that has proven successful. However, fundamental advancements are needed to address key issues such as the adaptive capacity of species/groups and ecosystems. The continued development of end-to-end models (e.g., physics to fish to human sectors) will be critical if we hope to assess how multiple pressures may interact to cause changes in living marine resources including the ecological and economic costs and trade-offs of different spatial management strategies. Given the strengths and weaknesses of the various types of models reviewed here, confidence in projections of changes in the distribution and productivity of living marine resources will be increased by assessing model structural uncertainty through biological ensemble modelling.

Relevance:

30.00%

Publisher:

Abstract:

Bulk gallium nitride (GaN) power semiconductor devices have gained significant interest in recent years, creating the need for technology computer-aided design (TCAD) simulation to accurately model and optimize these devices. This paper comprehensively reviews and compares different GaN physical models and model parameters in the literature, and discusses the appropriate selection of these models and parameters for TCAD simulation. 2-D drift-diffusion semi-classical simulation is carried out for 2.6 kV and 3.7 kV bulk GaN vertical PN diodes. The simulated forward current-voltage and reverse breakdown characteristics are in good agreement with the measurement data even over a wide temperature range.
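As a small illustration of why the temperature-dependent model parameters reviewed in the paper matter (not the paper's TCAD setup itself), the intrinsic carrier concentration of GaN can be estimated from commonly quoted band parameters:

```python
# Illustrative only (not the paper's drift-diffusion simulation): intrinsic carrier
# concentration of GaN versus temperature from commonly quoted band parameters.
# Nc, Nv and Eg are typical literature values taken as assumptions, and the
# temperature dependence of the bandgap is neglected here.
import numpy as np

kB = 8.617e-5                       # Boltzmann constant, eV/K
Eg = 3.4                            # GaN bandgap, eV (approximate room-temperature value)
Nc300, Nv300 = 2.3e18, 4.6e19       # effective densities of states at 300 K, cm^-3

def n_i(T):
    Nc = Nc300 * (T / 300.0) ** 1.5
    Nv = Nv300 * (T / 300.0) ** 1.5
    return np.sqrt(Nc * Nv) * np.exp(-Eg / (2.0 * kB * T))

for T in (300.0, 450.0, 600.0):
    print(f"T = {T:.0f} K, n_i ~ {n_i(T):.2e} cm^-3")
```

The extremely small n_i at room temperature (orders of magnitude below silicon's) is what underlies the low leakage and high-temperature capability that the reviewed device models need to capture.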

Relevance:

30.00%

Publisher:

Abstract:

During this thesis work a coupled thermo-mechanical finite element model (FEM) was built to simulate hot rolling in the blooming mill at Sandvik Materials Technology (SMT) in Sandviken. The blooming mill is the first in a long line of processes that continuously or ingot-cast ingots are subjected to before becoming finished products. The aim of this thesis work was twofold. The first was to create a parameterized finite element (FE) model of the blooming mill. The commercial FE software package MSC Marc/Mentat was used to create this model and the programming language Python was used to parameterize it. Second, two different pass schedules (A and B) were studied and compared using the model. The two pass series were evaluated with focus on their ability to heal centreline porosity, i.e. to close voids in the centre of the ingot. This evaluation was made by studying the hydrostatic stress (σm), the von Mises stress (σeq) and the plastic strain (εp) in the centre of the ingot. From these parameters the stress triaxiality (Tx) and the hydrostatic integration parameter (Gm) were calculated for each pass in both series using two different transportation times (30 and 150 s) from the furnace. The relation between Gm and an analytical parameter (Δ) was also studied. This parameter is the ratio between the mean height of the ingot and the contact length between the rolls and the ingot, which is useful as a rule of thumb to determine the homogeneity or penetration of strain for a specific pass. The pass series designed with fewer passes (B), many with greater reduction, was shown to achieve better void closure theoretically. It was also shown that a temperature gradient, which is the result of a longer holding time between the furnace and the blooming mill, leads to improved void closure.
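The abstract names the stress triaxiality, the hydrostatic integration parameter and the Δ parameter without stating their formulas; the definitions commonly used in the void-closure and rolling literature (and presumably intended here) are:

```latex
% Definitions as commonly used in the void-closure / rolling literature (the abstract
% names the quantities but does not state the formulas; the contact-length expression
% is the standard flat-rolling approximation):
T_x = \frac{\sigma_m}{\sigma_{eq}}, \qquad
G_m = \int_0^{\bar{\varepsilon}_p} T_x \,\mathrm{d}\bar{\varepsilon}_p, \qquad
\Delta = \frac{\bar{h}}{L_c}, \quad L_c \approx \sqrt{R\,\Delta h}
```

where σm is the hydrostatic stress, σeq the von Mises stress, ε̄p the equivalent plastic strain accumulated during the pass, h̄ the mean ingot height, R the roll radius and Δh the draft (height reduction) of the pass.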

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

30.00%

Publisher:

Abstract:

The current anode quality control strategy is inadequate for detecting defective anodes before they are set in the electrolysis cells. Previous work focused on modelling the anode manufacturing process in order to predict anode properties directly after baking, using multivariate statistical methods. The core-sampling strategy used at the partner plant means that this model can only be used to predict the properties of anodes baked at the hottest and coldest positions of the baking furnace. The present work proposes a strategy for taking into account the thermal history of anodes baked at any position and thereby predicting their properties. It is shown that by combining binary variables defining the pit and the baking position with routine data measured on the baking furnace, the temperature profiles of anodes baked at different positions can be predicted. These data were also included in the model for predicting anode properties. The predictions were validated by additional core sampling, and the model's performance is conclusive for apparent and real density, compressive strength, air reactivity and Lc, regardless of baking position.
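As a generic sketch of the modelling strategy described above (binary pit/position indicators combined with routine furnace measurements in a multivariate latent-variable model), one could set up something like the following. The file name, column names, and the specific choice of PLS regression are assumptions, not details taken from the thesis.

```python
# Sketch only: predicting baked-anode properties from routine baking-furnace data
# plus binary (one-hot) indicators of the pit and baking position, in the spirit of
# the multivariate models described in the abstract. The CSV layout, column names,
# and the use of PLS regression are hypothetical.
import pandas as pd
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("anode_baking_data.csv")                     # hypothetical dataset

# One-hot encode the categorical pit and baking-position variables.
X_cat = pd.get_dummies(df[["pit", "position"]].astype(str))
X_num = df[["fuel_flow", "draft_pressure", "flue_gas_temp", "soak_time"]]  # assumed names
X = pd.concat([X_num, X_cat], axis=1).to_numpy(dtype=float)

# Targets: anode properties measured on cores (assumed column names).
Y = df[["apparent_density", "real_density", "compressive_strength", "air_reactivity", "Lc"]].to_numpy(dtype=float)

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, Y_tr)
print("R^2 on held-out anodes:", pls.score(X_te, Y_te))
```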

Relevance:

30.00%

Publisher:

Abstract:

Modelling cryolite, which is used in aluminium production, involves several challenges, notably the presence of discontinuities in the solution and the inclusion of the density difference between the solid and liquid phases. To overcome these challenges, several novel elements were developed in this thesis. First, the phase-change problem, commonly known as the Stefan problem, was solved in two dimensions using the extended finite element method. A formulation using a specially developed stable Lagrange multiplier and an enriched interpolation was used to impose the melting temperature at the interface. The interface velocity is determined by the jump in heat flux across the interface and was computed using the Lagrange multiplier solution. Second, convective effects were included by solving the Stokes equations in the liquid phase, also with the extended finite element method. Third, the density change between the solid and liquid phases, generally neglected in the literature, was taken into account by adding a non-zero velocity boundary condition at the solid-liquid interface to satisfy mass conservation in the system. Analytical and numerical problems were solved to validate the various components of the model and the coupled system of equations. The solutions to the numerical problems were compared with solutions obtained with Comsol's mesh-displacement algorithm. These comparisons show that the extended finite element model correctly reproduces the phase-change problem with variable densities.
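The two interface conditions referred to above have standard sharp-interface forms (textbook statements, not equations taken from the thesis): the Stefan condition linking the front velocity to the jump in heat flux, and the mass-conservation condition that yields a non-zero liquid velocity at the interface when the solid and liquid densities differ.

```latex
% Textbook sharp-interface relations (signs depend on the convention for the
% interface normal n, here pointing from solid into liquid):
\rho_s L \, v_n \;=\; k_s \frac{\partial T_s}{\partial n} - k_l \frac{\partial T_l}{\partial n}
\qquad \text{(Stefan condition: front speed from the heat-flux jump)}

\mathbf{u}_l \cdot \mathbf{n} \;=\; \left(1 - \frac{\rho_s}{\rho_l}\right) v_n
\qquad \text{(liquid velocity at the interface required by mass conservation)}
```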

Relevance:

30.00%

Publisher:

Abstract:

The blast furnace is the main ironmaking production unit in the world which converts iron ore with coke and hot blast into liquid iron, hot metal, which is used for steelmaking. The furnace acts as a counter-current reactor charged with layers of raw material of very different gas permeability. The arrangement of these layers, or burden distribution, is the most important factor influencing the gas flow conditions inside the furnace, which dictate the efficiency of the heat transfer and reduction processes. For proper control the furnace operators should know the overall conditions in the furnace and be able to predict how control actions affect the state of the furnace. However, due to high temperatures and pressure, hostile atmosphere and mechanical wear it is very difficult to measure internal variables. Instead, the operators have to rely extensively on measurements obtained at the boundaries of the furnace and make their decisions on the basis of heuristic rules and results from mathematical models. It is particularly difficult to understand the distribution of the burden materials because of the complex behavior of the particulate materials during charging. The aim of this doctoral thesis is to clarify some aspects of burden distribution and to develop tools that can aid the decision-making process in the control of the burden and gas distribution in the blast furnace. A relatively simple mathematical model was created for simulation of the distribution of the burden material with a bell-less top charging system. The model developed is fast and it can therefore be used by the operators to gain understanding of the formation of layers for different charging programs. The results were verified by findings from charging experiments using a small-scale charging rig at the laboratory. A basic gas flow model was developed which utilized the results of the burden distribution model to estimate the gas permeability of the upper part of the blast furnace. This combined formulation for gas and burden distribution made it possible to implement a search for the best combination of charging parameters to achieve a target gas temperature distribution. As this mathematical task is discontinuous and non-differentiable, a genetic algorithm was applied to solve the optimization problem. It was demonstrated that the method was able to evolve optimal charging programs that fulfilled the target conditions. Even though the burden distribution model provides information about the layer structure, it neglects some effects which influence the results, such as mixed layer formation and coke collapse. A more accurate numerical method for studying particle mechanics, the Discrete Element Method (DEM), was used to study some aspects of the charging process more closely. Model charging programs were simulated using DEM and compared with the results from small-scale experiments. The mixed layer was defined and the voidage of mixed layers was estimated. The mixed layer was found to have about 12% less voidage than layers of the individual burden components. Finally, a model for predicting the extent of coke collapse when heavier pellets are charged over a layer of lighter coke particles was formulated based on slope stability theory, and was used to update the coke layer distribution after charging in the mathematical model. In designing this revision, results from DEM simulations and charging experiments for some charging programs were used. 
The findings from the coke collapse analysis can be used to design charging programs with more stable coke layers.
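The genetic algorithm itself is not described in detail in the abstract; a generic GA loop for tuning charging parameters against a target radial gas temperature profile, of the kind the abstract describes, might be structured as in the sketch below. The number of parameters, GA settings, and the simulate_gas_temperature placeholder (standing in for the coupled burden-distribution and gas-flow model) are hypothetical.

```python
# Generic genetic-algorithm loop for choosing charging parameters against a target
# gas temperature distribution. This is a sketch of the optimization idea in the
# abstract, not the thesis code: simulate_gas_temperature is a placeholder for the
# coupled burden-distribution / gas-flow model.
import numpy as np

rng = np.random.default_rng(1)
N_PARAMS = 8                  # e.g., chute angles / ring dump fractions (assumed)
POP, GENS = 60, 200

def simulate_gas_temperature(params):
    """Placeholder: charging parameters -> radial gas temperature profile."""
    raise NotImplementedError

def fitness(params, target_profile):
    profile = simulate_gas_temperature(params)
    return -np.sum((profile - target_profile) ** 2)   # smaller mismatch = higher fitness

def evolve(target_profile):
    pop = rng.uniform(0.0, 1.0, size=(POP, N_PARAMS))          # normalized parameters
    for _ in range(GENS):
        scores = np.array([fitness(p, target_profile) for p in pop])
        parents = pop[np.argsort(scores)[-POP // 2:]]           # keep the best half
        # Uniform crossover between random parent pairs, then Gaussian mutation.
        idx = rng.integers(0, len(parents), size=(POP, 2))
        mask = rng.random((POP, N_PARAMS)) < 0.5
        children = np.where(mask, parents[idx[:, 0]], parents[idx[:, 1]])
        children += 0.02 * rng.standard_normal(children.shape)
        pop = np.clip(children, 0.0, 1.0)
    scores = np.array([fitness(p, target_profile) for p in pop])
    return pop[int(np.argmax(scores))]
```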