860 results for Model development guidelines
Abstract:
This study examines how different microphysical parameterization schemes influence orographically induced precipitation and the distributions of hydrometeors and water vapour for midlatitude summer conditions in the Weather Research and Forecasting (WRF) model. A high-resolution, two-dimensional idealized simulation, in which a moist air flow interacts with a bell-shaped 2 km high mountain, is used to assess the differences between the schemes. Periodic lateral boundary conditions are chosen to recirculate atmospheric water in the domain. The 13 selected microphysical schemes are found to conserve the water in the model domain: the gain or loss of water is less than 0.81% over a simulation interval of 61 days. The differences among the microphysical schemes in terms of the distributions of water vapour, hydrometeors and accumulated precipitation are presented and discussed. The Kessler scheme, the only scheme without ice-phase processes, shows final values of cloud liquid water 14 times greater than the other schemes. The differences among the other schemes are not as extreme, but they still differ by up to 79% in water vapour, by up to a factor of 10 in hydrometeors and by up to 64% in accumulated precipitation at the end of the simulation. The microphysical schemes also differ in the surface evaporation rate: the WRF single-moment 3-class scheme has the highest surface evaporation rate, which is compensated by the highest precipitation rate. The schemes' different distributions of hydrometeors and water vapour induce differences of up to 49 W m−2 in the downwelling shortwave radiation and of up to 33 W m−2 in the downwelling longwave radiation.
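The conservation check described above reduces to comparing the domain-integrated water at the start and end of the run. A minimal sketch, with entirely invented totals rather than actual WRF output, could look like this:

```python
# Sketch of a domain water-budget conservation check: with periodic lateral
# boundaries, the domain-integrated water (vapour plus all hydrometeor
# classes plus accumulated precipitation) should stay constant, and the
# relative drift quantifies how well a scheme conserves it.
# The totals below are invented for illustration, not model output.

def relative_drift_percent(initial_total, final_total):
    """Absolute water gain/loss relative to the initial total, in percent."""
    return abs(final_total - initial_total) / initial_total * 100.0

initial_water = 1.0000e12   # hypothetical domain-integrated water mass (kg)
final_water = 0.9946e12     # hypothetical value after a 61-day run
drift = relative_drift_percent(initial_water, final_water)
# The abstract reports drifts below 0.81% for all 13 schemes tested.
conserved = drift < 0.81
```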
Abstract:
The newly developed atmosphere–ocean-chemistry-climate model SOCOL-MPIOM is presented, and the influence of the interactive chemistry module on the climate state and its variability is demonstrated by comparing pre-industrial control simulations with (CHEM) and without (NOCHEM) interactive chemistry. In general, the influence of the chemistry on the mean state and the variability is small and mainly restricted to the stratosphere and mesosphere. The largest differences are found in the atmospheric dynamics of the polar regions, with slightly stronger northern and southern winter polar vortices in CHEM. The strengthening of the vortices is related to larger stratospheric temperature gradients, which are attributed to a parametrization of the absorption of ozone and oxygen in the Lyman-alpha, Schumann–Runge, Hartley, and Huggins bands; this effect is parametrized only in the version with interactive chemistry. A second reason for the temperature differences between CHEM and NOCHEM is the diurnal variation of ozone concentrations in the upper atmosphere, which is missing in NOCHEM. Furthermore, stratospheric water vapour concentrations differ substantially between the two experiments, but their effect on the temperatures is small. In both setups, the simulated intensity and variability of the northern polar vortex are within the range of present-day observations. Sudden stratospheric warming events are well reproduced in terms of their frequency, but their distribution amongst the winter months is too uniform. Additionally, the performance of SOCOL-MPIOM under changing external forcings is assessed for the period 1600–2000 using an ensemble of simulations driven by a spectral solar forcing reconstruction. The amplitude of the reconstruction is large in comparison to other state-of-the-art reconstructions, providing an upper limit for the importance of the solar signal.
In the pre-industrial period (1600–1850), the simulated surface temperature trends are in reasonable agreement with temperature reconstructions, although the multi-decadal variability is more pronounced. This enhanced variability can be attributed to the variability in the solar forcing. The simulated temperature reductions during the Maunder Minimum are in the lowest probability range of the proxy records. During the Dalton Minimum, when volcanic forcing is also an important driver of temperature variations, the agreement is better. In the industrial period from 1850 onward, SOCOL-MPIOM overestimates the temperature increase in comparison to observational data sets. Sensitivity simulations show that this overestimation can be attributed to the increasing trend in the solar forcing reconstruction used in this study and to an additional warming induced by the simulated ozone changes.
Abstract:
Simulating surface wind over complex terrain is a challenge in regional climate modelling. This study therefore aims at identifying a set-up of the Weather Research and Forecasting (WRF) model that minimises systematic errors of surface winds in hindcast simulations. Major factors of the model configuration are tested to find a suitable set-up: the horizontal resolution, the planetary boundary layer (PBL) parameterisation scheme and the way WRF is nested in the driving data set. To this end, a number of sensitivity simulations at a spatial resolution of 2 km are carried out and compared to observations. Given the importance of wind storms, the analysis is based on case studies of 24 historical wind storms that caused great economic damage in Switzerland. Each of these events is downscaled using eight different model set-ups sharing the same driving data set. The results show that the lack of representation of unresolved topography leads to a general overestimation of wind speed in WRF. However, this bias can be substantially reduced by using a PBL scheme that explicitly considers the effects of non-resolved topography, which also improves the spatial structure of wind speed over Switzerland. The wind direction, although generally well reproduced, is not very sensitive to the PBL scheme. Further sensitivity tests cover four types of nesting methods: nesting only at the boundaries of the outermost domain, analysis nudging, spectral nudging, and the so-called re-forecast method, in which the simulation is frequently restarted. These simulations show that restricting the freedom of the model to develop large-scale disturbances slightly increases the temporal agreement with the observations, while further reducing the overestimation of wind speed, especially for maximum wind peaks. The model performance is also evaluated in the outermost domains, where the resolution is coarser.
The results demonstrate the important role of horizontal resolution: the step from 6 to 2 km significantly improves model performance. In summary, a grid size of 2 km, combined with the non-local PBL scheme modified to explicitly account for non-resolved orography and with analysis or spectral nudging, is the best-performing set-up when dynamical downscaling aims at reproducing real wind fields.
Abstract:
The Microwave Emission Model of Layered Snowpacks (MEMLS) was originally developed for microwave emission of snowpacks in the frequency range 5–100 GHz. It is based on six-flux theory to describe radiative transfer in snow, including absorption, multiple volume scattering, radiation trapping due to internal reflection and a combination of coherent and incoherent superposition of reflections between horizontal layer interfaces. Here we introduce MEMLS3&a, an extension of MEMLS that includes a backscatter model for active microwave remote sensing of snow. The reflectivity is decomposed into diffuse and specular components, and slight undulations of the snow surface are taken into account. The treatment of like- and cross-polarization is accomplished by an empirical splitting parameter q. MEMLS3&a (as well as MEMLS) is set up so that the snow input parameters can be derived by objective measurement methods, which avoids the fitting of the snow scattering efficiency required by several other models. For the validation of the model we have used a combination of active and passive measurements from the NoSREx (Nordic Snow Radar Experiment) campaign in Sodankylä, Finland. We find reasonable agreement between the measurements and simulations, subject to uncertainties in hitherto unmeasured input parameters of the backscatter model. The model is written in Matlab and the code is publicly available for download from the following website: http://www.iapmw.unibe.ch/research/projects/snowtools/memls.html.
Abstract:
In a feasibility study, the potential of proxy data for the temperature and salinity during the Last Glacial Maximum (LGM, about 19 000 to 23 000 years before present) in constraining the strength of the Atlantic meridional overturning circulation (AMOC) with a general ocean circulation model was explored. The proxy data were simulated by drawing data from four different model simulations at the ocean sediment core locations of the Multiproxy Approach for the Reconstruction of the Glacial Ocean surface (MARGO) project, and perturbing these data with realistic noise estimates. The results suggest that our method has the potential to provide estimates of the past strength of the AMOC even from sparse data, but in general, paleo-sea-surface temperature data without additional prior knowledge about the ocean state during the LGM are not adequate to constrain the model. On the one hand, additional deep-ocean data and salinity data are shown to be highly important for estimating the LGM circulation. On the other hand, increasing the amount of surface data alone does not appear to be enough for better estimates. Finally, better initial guesses to start the state estimation procedure would greatly improve the performance of the method. Indeed, with a sufficiently good first guess, the sea-surface temperature data from the MARGO project alone promise to be sufficient for reliable estimates of the strength of the AMOC.
Abstract:
Carbon (C) and nitrogen (N) process-based models are important tools for estimating and reporting greenhouse gas emissions and changes in soil C stocks. There is a need for continuous evaluation, development and adaptation of these models to improve scientific understanding, national inventories and assessment of mitigation options across the world. To date, much of the information needed to describe different processes, such as transpiration, photosynthesis, plant growth and maintenance, above- and below-ground carbon dynamics, decomposition and nitrogen mineralization, in ecosystem models remains inaccessible to the wider community, being stored within model computer source code or held internally by modelling teams. Here we describe the Global Research Alliance Modelling Platform (GRAMP), a web-based modelling platform to link researchers with appropriate datasets, models and training material. It will provide access to model source code and an interactive platform for researchers to form a consensus on existing methods and to synthesize new ideas, which will help to advance progress in this area. The platform will eventually support a variety of models, but to trial the platform and test the architecture and functionality, it was piloted with variants of the DNDC model. The intention is to form a worldwide collaborative network (a virtual laboratory) via an interactive website with access to models and best practice guidelines; appropriate datasets for testing, calibrating and evaluating models; on-line tutorials; and links to modelling and data provider research groups and their associated publications. A graphical user interface has been designed to view the model development tree and access all of the above functions.
Abstract:
This study explored the utility of the impact response surface (IRS) approach for investigating model ensemble crop yield responses under a large range of changes in climate. IRSs of spring and winter wheat (Triticum aestivum) yields were constructed from a 26-member ensemble of process-based crop simulation models for sites in Finland, Germany and Spain across a latitudinal transect. The sensitivity of modelled yield to systematic increments of changes in temperature (-2 to +9°C) and precipitation (-50 to +50%) was tested by modifying values of baseline (1981 to 2010) daily weather, with the CO2 concentration fixed at 360 ppm. The IRS approach offers an effective method of portraying model behaviour under changing climate as well as advantages for analysing, comparing and presenting results from multi-model ensemble simulations. Though individual model behaviour occasionally departed markedly from the average, ensemble median responses across sites and crop varieties indicated that yields decline with higher temperatures and reduced precipitation, and increase with higher precipitation. Across the uncertainty ranges defined for the IRSs, yields were more sensitive to temperature than to precipitation changes at the Finnish site, while sensitivities were mixed at the German and Spanish sites. Precipitation effects diminished under higher temperature changes. While the bivariate and multi-model characteristics of the analysis impose some limits on interpretation, the IRS approach nonetheless provides additional insights into sensitivities to inter-model and inter-annual variability. Taken together, these sensitivities may help to pinpoint processes, such as heat stress, vernalisation or drought effects, that require refinement in future model development.
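The construction of an impact response surface can be sketched as a systematic sweep of temperature and precipitation perturbations applied to a baseline weather series. In the minimal sketch below, the yield function is a purely hypothetical stand-in for the ensemble crop models, chosen only to make the sweep runnable:

```python
# Sketch of building an impact response surface (IRS): sweep systematic
# temperature and precipitation perturbations over a baseline series and
# record the modelled yield at each grid point. The yield model here is a
# deliberately simple, hypothetical stand-in, not any ensemble member.

def toy_yield(temps, precs):
    """Hypothetical yield response: penalise heat, reward water."""
    mean_t = sum(temps) / len(temps)
    total_p = sum(precs)
    return max(0.0, 8.0 - 0.4 * max(0.0, mean_t - 18.0)) * min(1.0, total_p / 500.0)

def build_irs(base_temps, base_precs, dt_steps, dp_steps):
    """Return {(dT, dP%): yield} over the perturbation grid."""
    surface = {}
    for dt in dt_steps:                      # e.g. -2 ... +9 degC
        for dp in dp_steps:                  # e.g. -50 ... +50 %
            temps = [t + dt for t in base_temps]
            precs = [p * (1.0 + dp / 100.0) for p in base_precs]
            surface[(dt, dp)] = toy_yield(temps, precs)
    return surface

base_t = [15.0, 18.0, 21.0]                  # toy baseline season (degC)
base_p = [150.0, 180.0, 120.0]               # toy baseline season (mm)
irs = build_irs(base_t, base_p, range(-2, 10), range(-50, 60, 10))
```

The resulting dictionary maps each perturbation pair to a yield, which is exactly the surface that the IRS plots portray.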
Abstract:
Today, owing to the technological advances of building systems, buildings require more planning, more detailed design, execution controls and training of operation and maintenance professionals in order to meet the owner's project requirements, which are premised on the concepts of sustainability, quality and performance. The presence of sophisticated control systems helps to facilitate the management of resources such as water and energy, but small faults can lead to major performance failures. Failures in building design begin with the technical team's misinterpretation of the owners' requirements. It is therefore necessary that, at the design stage, all requirements requested by the owners for the building throughout its life cycle be verified. Communication failures run through the entire building production chain, generating failures in planning, design, execution and maintenance. Commissioning is a process for meeting the owner's project requirements, documenting the phases of the building life cycle and training operation and maintenance professionals, with the aim of avoiding failures, reducing waste and rework, and improving the quality, performance and sustainability of buildings. Commissioning is most widespread in air-conditioning and lighting building systems, targeting high energy efficiency and water savings, and is still little used in Brazil. In this context, the objective of this research is to propose a conceptual commissioning model for building systems. The methodology adopted for the development of the model uses bibliographic research as its technical procedure.
To clarify in which phase commissioning is applied, it was related to other concepts used in the building life cycle, such as design coordination, execution management, facilities management, quality, performance and sustainability. For the development of the conceptual model, a flow of the life-cycle phases and the corresponding commissioning stages is presented, together with another flow showing the documents generated in each phase. The result, i.e. the conceptual model, provides guidelines for developing commissioning by describing the activities, competencies and products generated at each commissioning stage, according to the phases of the building cycle. This work contributes to disseminating commissioning and to supporting its application in buildings.
Abstract:
Crash reduction factors (CRFs) are used to estimate the number of traffic crashes expected to be prevented by investment in safety improvement projects. The method used to develop CRFs in Florida has been based on the commonly used before-and-after approach. This approach suffers from a widely recognized problem known as regression-to-the-mean (RTM). The Empirical Bayes (EB) method has been introduced as a means of addressing the RTM problem. This method requires information from both the treatment and reference sites in order to predict the expected number of crashes had the safety improvement projects at the treatment sites not been implemented. The information from the reference sites is estimated from a safety performance function (SPF), a mathematical relationship that links crashes to traffic exposure. The objective of this dissertation was to develop SPFs for the different functional classes of the Florida State Highway System. Crash data from the years 2001 through 2003, along with traffic and geometric data, were used in the SPF model development. SPFs for both rural and urban roadway categories were developed. The modeling data were based on one-mile segments with homogeneous traffic and geometric conditions within each segment; segments involving intersections were excluded. Scatter plots of the data show that the relationships between crashes and traffic exposure are nonlinear, with crashes increasing with traffic exposure at an increasing rate. Four regression models, namely Poisson (PRM), Negative Binomial (NBRM), zero-inflated Poisson (ZIP) and zero-inflated Negative Binomial (ZINB), were fitted to the one-mile segment records for individual roadway categories. The best model was selected for each category based on a combination of the likelihood ratio test, the Vuong statistical test and Akaike's Information Criterion (AIC).
The NBRM was found to be appropriate for only one category, and the ZINB model was found to be more appropriate for six other categories. The overall results show that the Negative Binomial model generally provides a better fit to the data than the Poisson model. In addition, the ZINB model was found to give the best fit when the count data exhibit excess zeros and over-dispersion, as they do for most of the roadway categories. While model validation shows that most data points fall within the 95% prediction intervals of the models developed, the Pearson goodness-of-fit measure does not show statistical significance. This is expected, as traffic volume is only one of the many factors contributing to the overall crash experience, and the SPFs are to be applied in conjunction with Accident Modification Factors (AMFs) to further account for the safety impacts of major geometric features before arriving at the final crash prediction. However, with improved traffic and crash data quality, the predictive power of the SPF models may be further improved.
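The EB step described above combines the SPF prediction for the reference population with the site's observed count. A minimal sketch of the standard Hauer-style weighting follows; the power-form SPF, its coefficients and the dispersion parameter k are illustrative assumptions, not the dissertation's fitted values:

```python
import math

# Sketch of the Empirical Bayes (EB) crash estimate: the SPF prediction
# for similar reference sites is weighted against the site's observed
# count, shrinking the observation toward the reference mean and thereby
# countering regression-to-the-mean. All numbers here are hypothetical.

def spf_prediction(aadt, length_mi, a=-6.0, b=0.65):
    """Hypothetical power-form SPF: crashes/yr = exp(a) * AADT^b * length."""
    return math.exp(a) * aadt ** b * length_mi

def eb_estimate(observed, mu, k=1.2):
    """EB combination: w*mu + (1-w)*observed, with w = 1/(1 + mu/k).

    k is the Negative Binomial over-dispersion parameter (assumed here).
    """
    w = 1.0 / (1.0 + mu / k)
    return w * mu + (1.0 - w) * observed

mu = spf_prediction(aadt=12000, length_mi=1.0)   # SPF-expected crashes/yr
eb = eb_estimate(observed=9, mu=mu)              # shrinks 9 toward mu
```

The EB estimate always lies between the SPF prediction and the raw observation, which is what makes it robust against RTM bias.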
Abstract:
The Model for Prediction Across Scales (MPAS) is a novel set of Earth system simulation components, consisting of an atmospheric model, an ocean model and a land-ice model. Its distinct features are the use of unstructured Voronoi meshes and C-grid discretisation to address the shortcomings, with respect to parallel scalability, numerical accuracy and physical consistency, of global models on regular grids and of limited area models nested in a forcing data set. This concept allows one to include the feedback of regional land use information on weather and climate at local and global scales in a consistent way, which is impossible to achieve with traditional limited area modelling approaches. Here, we present an in-depth evaluation of MPAS with regard to the technical aspects of performing model runs and scalability for three medium-size meshes on four high-performance computing (HPC) sites with different architectures and compilers. We uncover model limitations and identify new aspects of model optimisation that are introduced by the use of unstructured Voronoi meshes. We further demonstrate the performance of MPAS in terms of its capability to reproduce the dynamics of the West African monsoon (WAM) and its associated precipitation in a pilot study. Constrained by available computational resources, we compare 11-month runs for two meshes with observations and a reference simulation from the Weather Research and Forecasting (WRF) model. We show that MPAS can reproduce the atmospheric dynamics on global and local scales in this experiment, but identify a precipitation excess for the West African region. Finally, we conduct extreme scaling tests on a global 3 km mesh with more than 65 million horizontal grid cells on up to half a million cores. We discuss the modifications of the model code necessary to improve its parallel performance in general and specifically for the HPC environment.
We confirm good scaling (70 % parallel efficiency or better) of the MPAS model and provide numbers on the computational requirements for experiments with the 3 km mesh. In doing so, we show that global, convection-resolving atmospheric simulations with MPAS are within reach of current and next generations of high-end computing facilities.
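The parallel-efficiency figure quoted above is the standard strong-scaling metric: wall-clock time on a baseline core count compared with time on more cores for the same fixed problem. A minimal sketch, with invented timings rather than MPAS measurements, is:

```python
# Sketch of the strong-scaling efficiency metric: relative to a baseline
# run, efficiency = (T_base * N_base) / (T_n * N_n) for a fixed-size
# problem. The wall-clock times below are invented for illustration.

def parallel_efficiency(n_base, t_base, n_cores, t_n):
    """Strong-scaling efficiency relative to the baseline core count."""
    return (t_base * n_base) / (t_n * n_cores)

# Hypothetical wall-clock times (s) for a fixed-size mesh:
timings = {1024: 800.0, 2048: 420.0, 4096: 230.0}
base_n = 1024
effs = {n: parallel_efficiency(base_n, timings[base_n], n, t)
        for n, t in timings.items()}
# By the criterion quoted above, a run scales well if efficiency >= 0.70.
good = {n for n, e in effs.items() if e >= 0.70}
```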
Abstract:
In this study we present first results of a new model development, ECHAM5-JSBACH-wiso, in which we have incorporated the stable water isotopes H₂¹⁸O and HDO as tracers in the hydrological cycle of the coupled atmosphere–land surface model ECHAM5-JSBACH. The ECHAM5-JSBACH-wiso model was run under present-day climate conditions at two different resolutions (T31L19, T63L31). A comparison between ECHAM5-JSBACH-wiso and ECHAM5-wiso shows that the coupling has a strong impact on the simulated temperature and soil wetness. Caused by these changes in temperature and the hydrological cycle, the δ¹⁸O in precipitation also shows variations from −4 ‰ up to +4 ‰. One of the strongest anomalies occurs over northeast Asia where, due to an increase in temperature, the δ¹⁸O in precipitation increases as well. In order to analyze the sensitivity of the fractionation processes over land, we compare a set of simulations with various implementations of these processes over the land surface. The simulations allow us to distinguish between no fractionation, fractionation included in the evaporation flux (from bare soil), and fractionation included in both the evaporation and transpiration (water transport through plants) fluxes. While the isotopic composition of the soil water may change in δ¹⁸O by up to +8 ‰, the simulated δ¹⁸O in precipitation shows only slight differences on the order of ±1 ‰. The simulated isotopic composition of precipitation fits well with the available observations from the GNIP (Global Network of Isotopes in Precipitation) database.
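The δ¹⁸O values quoted above use the standard delta notation, expressing a sample's ¹⁸O/¹⁶O ratio relative to the VSMOW reference. A minimal sketch of the conversion follows; the sample ratio is invented, while the VSMOW ratio is the accepted reference value:

```python
# Sketch of the delta notation used for water isotope tracers:
# d18O (in per mil) = (R_sample / R_standard - 1) * 1000, where R is the
# 18O/16O ratio. The sample ratio below is invented for illustration.

R_VSMOW_18O = 2005.2e-6   # accepted 18O/16O ratio of the VSMOW standard

def delta18O(r_sample, r_standard=R_VSMOW_18O):
    """Convert an 18O/16O ratio to delta notation, in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A sample depleted in 18O relative to VSMOW gives a negative delta:
d = delta18O(1997.2e-6)
```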
Abstract:
Visualization and interpretation of geological observations into a cohesive geological model are essential to Earth sciences and related fields. Various emerging technologies offer approaches to multi-scale visualization of heterogeneous data, providing new opportunities that facilitate model development and interpretation processes. These include increased accessibility of 3D scanning technology, global connectivity, and Web-based interactive platforms. The geological sciences and geological engineering disciplines are adopting these technologies as volumes of data and physical samples greatly increase. However, a standardized and universally agreed-upon workflow and approach have yet to be properly developed. In this thesis, the 3D scanning workflow is presented as a foundation for a virtual geological database. This database provides augmented levels of tangibility to students and researchers who have little to no access to locations that are remote or inaccessible. A Web-GIS platform was used jointly with customized widgets developed over the course of this research to aid in visualizing hand-sized/meso-scale geological samples within a geologic and geospatial context. This context is provided as a macro-scale GIS interface, in which geophysical and geodetic images and data are visualized. Specifically, an interactive interface is developed that allows for simultaneous visualization to improve the understanding of geological trends and relationships. These tools will allow for rapid data access and global sharing, and will facilitate comprehension of geological models using multi-scale heterogeneous observations.
Abstract:
It is argued in this study that current investigations of the role of conflict in shared leadership teams, that is, teams in which all members have the opportunity to participate in the decision-making process, are insufficient, as they have focused on the downsides of these conflicts. This study demonstrates that task conflict is beneficial in that it can have positive effects on innovation in teams. It shows that, particularly in shared leadership management consultant teams, task conflict can stimulate innovation. This research therefore investigates the relationships among shared leadership, conflict and innovation. It develops and empirically tests a conceptual model of the relationships between these concepts, for which the inclusion of multiple research methods was essential. The sequential explanatory approach combined quantitative and qualitative methods, the order of which can be adapted for other domains of application. The conceptual model was first tested with a sample of 329 management consultants. This was followed by 25 in-depth, face-to-face interviews conducted with individual survey respondents. In addition, the weekly meetings of a management consultant team in action were video recorded over several months. This allowed for an in-depth explanation of the survey findings by providing an understanding of the underlying processes. The inclusion of observational methods played a validating role and explained how and why conflicts contributed to the development of team innovation, through the analysis of subtleties and fleeting disagreements in a real-life management consultant team. The results deliver an assessment of the theoretical model and demonstrate that task conflict can allow for additional innovation in management consultant teams operating under a shared leadership structure.
A practical model and guidelines are provided for management consultant teams wanting to enhance their innovatory capacities. In addition, a novel methodology that includes video observations is developed, with recommendations and steps to aid researchers aiming to employ a similar combination of methods. An original contribution to knowledge is made regarding the positive effects that task conflict can have on innovation in shared leadership teams. Collaboration and trust are identified as important mediators between shared leadership and task conflict, and as significant for the development of innovation. The effectiveness of shared leadership in reducing negative relationship conflict, and the benefits of both shared leadership and task conflict in enhancing innovation, are demonstrated.
A sustainable proposal for mitigating adverse climate effects in a coastal city of Argentina
Abstract:
Climate sustainability indicators are fundamental tools for complementing urban land-use planning policies and can benefit the quality of life of a city's inhabitants. In this work, an urban climate indicator was designed for the city of Bahía Blanca, considering meteorological variables and an analysis of social perception. It allowed the city to be delimited into four clearly differentiated regions. On this basis, a sustainable proposal to mitigate the adverse effects of climate was developed by applying the DPSIR method. The measures were aimed at improving the living conditions of the population. The results suggest that prompt implementation of the proposal, together with the active participation of social actors and decision makers, is necessary to improve the current conditions of the city. With the proposed measures, the local population will know how to act when different extreme events, episodes of climatic discomfort, etc., occur. Being a simple method, the methodology applied in this study can be replicated in other cities around the world with the aim of improving the inhabitants' quality of life.
Abstract:
Transient simulations are widely used in studying past climate, as they allow a better comparison with existing proxy data. However, multi-millennial transient simulations with coupled climate models are usually computationally very expensive, and as a result several acceleration techniques are applied when numerical simulations are used to recreate past climate. In this study, we compare the results of transient simulations of the present and the last interglacial, with and without acceleration of the orbital forcing, using the comprehensive coupled climate model CCSM3 (Community Climate System Model 3). Our study shows that in low-latitude regions, the simulation of long-term variations in interglacial surface climate is not significantly affected by the use of the acceleration technique (with an acceleration factor of 10), and hence large-scale model-data comparison of surface variables is not hampered. However, in high-latitude regions where the surface climate has a direct connection to the deep ocean, e.g. in the Southern Ocean or the Nordic Seas, acceleration-induced biases in the sea-surface temperature evolution may occur, with potential influence on the dynamics of the overlying atmosphere. The data provided here are decadal mean values from both the accelerated and the non-accelerated runs.