899 results for Simulation analysis
Abstract:
The transport of stratospheric air deep into the troposphere via convection is investigated numerically using the UK Met Office Unified Model. A convective system that formed on 27 June 2004 near southeast England, in the vicinity of an upper-level potential vorticity anomaly and a lowered tropopause, provides the basis for analysis. Transport is diagnosed using a stratospheric tracer that can either be passed through or withheld from the model's convective parameterization scheme. Three simulations are performed at increasingly finer resolutions, with horizontal grid lengths of 12, 4, and 1 km. In the 12 and 4 km simulations, tracer is transported deeply into the troposphere by the parameterized convection. In the 1 km simulation, for which the convective parameterization is disengaged, deep transport is still accomplished, but with a much smaller magnitude. However, the 1 km simulation resolves stirring along the tropopause that does not exist in the coarser simulations. In all three simulations, the concentration of the deeply transported tracer is small, three orders of magnitude less than that of the shallow transport near the tropopause, most likely because of the efficient dilution of parcels in the lower troposphere.
Abstract:
The long-term stability, high accuracy, all-weather capability, high vertical resolution, and global coverage of Global Navigation Satellite System (GNSS) radio occultation (RO) make it a promising tool for global monitoring of atmospheric temperature change. With the aim of investigating and quantifying how well a GNSS RO observing system is able to detect climate trends, we are currently performing a (climate) observing system simulation experiment over the 25-year period 2001 to 2025, which involves quasi-realistic modeling of the neutral atmosphere and the ionosphere. We carried out two climate simulations with the general circulation model MAECHAM5 (Middle Atmosphere European Centre/Hamburg Model Version 5) of the MPI-M Hamburg, covering the period 2001–2025: one control run with natural variability only, and one run also including anthropogenic forcings due to greenhouse gases, sulfate aerosols, and tropospheric ozone. On this basis, we perform quasi-realistic simulations of RO observables for a small GNSS receiver constellation (six satellites), state-of-the-art data processing for atmospheric profile retrieval, and a statistical analysis of temperature trends in both the "observed" climatology and the "true" climatology. Here we describe the setup of the experiment and results from a test bed study conducted to obtain a basic set of realistic estimates of observational errors (instrument- and retrieval-processing-related errors) and sampling errors (due to spatial-temporal undersampling). The test bed results, obtained for a typical summer season and compared to the climatic 2001–2025 trends from the MAECHAM5 simulation including anthropogenic forcing, were found encouraging for performing the full 25-year experiment.
They indicated that observational and sampling errors (each contributing about 0.2 K) are consistent with recent estimates of these errors from real RO data and should be sufficiently small for monitoring expected temperature trends in the global atmosphere over the next 10 to 20 years in most regions of the upper troposphere and lower stratosphere (UTLS). Inspection of the MAECHAM5 trends in different RO-accessible atmospheric parameters (microwave refractivity and pressure/geopotential height in addition to temperature) indicates complementary climate change sensitivity in different regions of the UTLS, so that optimized climate monitoring should combine information from all climatic key variables retrievable from GNSS RO data.
Abstract:
The characteristics of convectively generated gravity waves during an episode of deep convection near the coast of Wales are examined in both high-resolution mesoscale simulations [with the (UK) Met Office Unified Model] and in observations from a Mesosphere-Stratosphere-Troposphere (MST) wind-profiling Doppler radar. Deep convection reached the tropopause and generated vertically propagating, high-frequency waves in the lower stratosphere that produced vertical velocity perturbations O(1 m/s). Wavelet analysis is applied in order to determine the characteristic periods and wavelengths of the waves. In both the simulations and the observations, the wavelet spectra contain several distinct preferred scales indicated by multiple spectral peaks. The peaks are most pronounced in the horizontal spectra at several wavelengths less than 50 km. Although these peaks are clearest and of largest amplitude in the highest-resolution simulations (with 1 km horizontal grid length), they are also evident in coarser simulations (with 4 km horizontal grid length). Peaks also exist in the vertical and temporal spectra (between approximately 2.5 and 4.5 km, and 10 to 30 minutes, respectively), with good agreement between simulation and observation. Two-dimensional (wavenumber-frequency) spectra demonstrate that each of the selected horizontal scales contains peaks at each of the preferred temporal scales revealed by the one-dimensional spectra alone.
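As an illustration of the kind of wavelet analysis described above, the following sketch computes a Morlet wavelet power spectrum for a synthetic vertical-velocity time series and recovers its dominant period. The signal, sampling interval, and trial periods are all hypothetical, and the FFT-based transform below is a generic textbook formulation, not the paper's actual processing chain.

```python
import numpy as np

def morlet_power(signal, dt, periods, omega0=6.0):
    """FFT-based continuous Morlet wavelet transform; returns |W|^2 with
    shape (len(periods), len(signal))."""
    n = len(signal)
    freqs = np.fft.fftfreq(n, d=dt)
    sig_hat = np.fft.fft(signal)
    power = np.empty((len(periods), n))
    for i, p in enumerate(periods):
        # Scale corresponding to this Fourier period (Torrence & Compo relation)
        s = p * (omega0 + np.sqrt(2.0 + omega0 ** 2)) / (4.0 * np.pi)
        # Morlet wavelet in Fourier space (analytic: positive frequencies only)
        psi_hat = np.pi ** -0.25 * np.exp(-0.5 * (s * 2.0 * np.pi * freqs - omega0) ** 2)
        psi_hat[freqs < 0.0] = 0.0
        w = np.fft.ifft(sig_hat * psi_hat * np.sqrt(2.0 * np.pi * s / dt))
        power[i] = np.abs(w) ** 2
    return power

# Hypothetical vertical-velocity series: a 15-minute wave plus noise
dt = 60.0                                  # 1-minute sampling (s)
t = np.arange(0.0, 6 * 3600.0, dt)         # 6 hours of data
rng = np.random.default_rng(0)
w_prime = np.sin(2.0 * np.pi * t / 900.0) + 0.2 * rng.normal(size=t.size)

periods = np.array([300.0, 600.0, 900.0, 1800.0, 3600.0])  # trial periods (s)
mean_power = morlet_power(w_prime, dt, periods).mean(axis=1)
print(periods[np.argmax(mean_power)])      # the 900 s (15-minute) mode dominates
```

Peaks in `mean_power` across the trial periods play the same role as the spectral peaks discussed above: they mark the preferred temporal scales of the wave field.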
Abstract:
Bloom-forming and toxin-producing cyanobacteria remain a persistent nuisance across the world. Modelling of cyanobacteria in freshwaters is an important tool for understanding their population dynamics and predicting the location and timing of bloom events in lakes and rivers. A new deterministic mathematical model was developed that simulates the growth and movement of cyanobacterial blooms in river systems. The model focuses on the mathematical description of bloom formation, vertical migration, and lateral transport of colonies within river environments, taking into account the major factors that affect cyanobacterial bloom formation in rivers, including light, nutrients, and temperature. A technique called generalised sensitivity analysis was applied to the model to identify the critical parameter uncertainties and to investigate the interactions between the chosen parameters. The results of the analysis suggested that 8 out of 12 parameters were significant in obtaining the observed cyanobacterial behaviour in a simulation. A high degree of correlation was found between the half-saturation rate constants used in the model.
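A minimal sketch of generalised (regional) sensitivity analysis, the technique named above: Monte Carlo parameter samples are split into "behavioural" runs that reproduce a bloom and "non-behavioural" runs that do not, and the Kolmogorov-Smirnov distance between the two parameter distributions flags the sensitive parameters. The toy growth model, parameter names (`mu_max`, `k_light`), and thresholds are hypothetical, not taken from the paper.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)

def bloom_biomass(mu_max, k_light, light=200.0, days=30.0, x0=0.1):
    """Toy cyanobacteria model: biomass after `days` days of light-limited growth."""
    growth = mu_max * light / (k_light + light)    # Monod light limitation (1/day)
    return x0 * np.exp(growth * days)

n = 2000
mu_max = rng.uniform(0.05, 0.5, n)      # maximum growth rate (1/day)
k_light = rng.uniform(10.0, 500.0, n)   # half-saturation light constant

biomass = bloom_biomass(mu_max, k_light)
behavioural = biomass > 10.0            # runs that reproduce a "bloom"

# Kolmogorov-Smirnov distance between behavioural and non-behavioural
# parameter samples: a large distance marks a sensitive, identifiable parameter.
ks = {}
for name, p in [("mu_max", mu_max), ("k_light", k_light)]:
    d, _ = ks_2samp(p[behavioural], p[~behavioural])
    ks[name] = float(d)
    print(name, round(float(d), 3))
```

Ranking the 12 model parameters by this distance is how such an analysis would identify the 8 significant ones reported above.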
Laboratory simulation of terrestrial meteorite weathering using the Bensour (LL6) ordinary chondrite
Abstract:
Laboratory dissolution experiments using the LL6 ordinary chondrite Bensour demonstrate that meteoritic minerals readily react with distilled water at low temperatures, liberating ions into solution and forming reaction products. Three experiments were performed, all for 68 days and at atmospheric fO(2), but using a range of water/rock ratios and different temperatures. Experiments 1 and 2 were batch experiments undertaken at room temperature, whereas in experiment 3, condensed boiling water was dripped onto meteorite subsamples within a Soxhlet extractor. Solutions from experiment 1 were chemically analyzed at the end of the experiment, whereas aliquots were extracted from experiments 2 and 3 for analysis at regular intervals. In all three experiments, a very significant proportion of the Na, Cl, and K within the Bensour subsamples entered solution, demonstrating that chlorapatite and feldspar were especially susceptible to dissolution. Concentrations of Mg, Al, Si, Ca, and Fe in solution were strongly affected by the precipitation of reaction products, and Mg and Ca may also have been removed by sorption. Calculations predict saturation of the experimental solutions with respect to Al hydroxides, Fe oxides, and Fe (oxy)hydroxides, which would frequently have been accompanied by hydrous aluminosilicates. Some reaction products were identified and include silica, a Mg-rich silicate, Fe oxides, and Fe (oxy)hydroxides. The implications of these results are that even very short periods of subaerial exposure of ordinary chondrites will lead to dissolution of primary minerals and crystallization of weathering products that are likely to include aluminosilicates and silicates, Mg-Ca carbonates, and sulfates, in addition to the ubiquitous Fe oxides and (oxy)hydroxides.
Abstract:
Water quality models generally require a relatively large number of parameters to define their functional relationships, and since prior information on parameter values is limited, these are commonly defined by fitting the model to observed data. In this paper, the identifiability of water quality parameters and the associated uncertainty in model simulations are investigated. A modification to the water quality model 'Quality Simulation Along River Systems' is presented, in which an improved flow component is used within the existing water quality model framework. The performance of the model is evaluated in an application to the Bedford Ouse river, UK, using a Monte Carlo analysis toolbox. The essential framework of the model proved to be sound, and calibration and validation performance was generally good. However, some supposedly important water quality parameters associated with algal activity were found to be completely insensitive, and hence non-identifiable, within the model structure, while others (nitrification and sedimentation) had optimum values at or close to zero, indicating that those processes were not detectable from the data set examined. (C) 2003 Elsevier Science B.V. All rights reserved.
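The Monte Carlo identifiability analysis described above can be sketched as follows: parameters of a toy dissolved-oxygen model are sampled, runs are ranked by fit to synthetic observations, and a parameter that never influences the output (standing in for the insensitive algal parameters) is exposed by the fact that even the best-fitting runs leave it unconstrained. The Streeter-Phelps-style model, all rate names, and the 5% behavioural threshold are illustrative assumptions, not the QUASAR formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

def river_do(k_deoxy, k_reaer, k_algae, bod0=8.0, do_sat=10.0, hours=48):
    """Toy Streeter-Phelps dissolved-oxygen sag with an (inactive) algal term."""
    t = np.arange(hours) / 24.0                       # time in days
    deficit = (k_deoxy * bod0 / (k_reaer - k_deoxy)) * (
        np.exp(-k_deoxy * t) - np.exp(-k_reaer * t))
    algae_effect = k_algae * 0.0 * t                  # algal biomass ~0 in this reach
    return do_sat - deficit + algae_effect

# Synthetic "observations" from known parameters plus measurement noise
obs = river_do(0.3, 0.9, 0.05) + rng.normal(0.0, 0.05, 48)

n = 5000
k_deoxy = rng.uniform(0.10, 0.60, n)    # deoxygenation rate (1/day)
k_reaer = rng.uniform(0.65, 1.50, n)    # reaeration rate (1/day)
k_algae = rng.uniform(0.00, 0.20, n)    # algal rate that never affects the output

rmse = np.array([np.sqrt(np.mean((river_do(a, b, c) - obs) ** 2))
                 for a, b, c in zip(k_deoxy, k_reaer, k_algae)])
best = rmse < np.quantile(rmse, 0.05)   # the 5% best-fitting runs

# Ratio of best-run spread to prior spread: a ratio near 1 means the data
# do not constrain the parameter at all, i.e. it is non-identifiable.
shrink = {name: float(p[best].std() / p.std())
          for name, p in [("k_deoxy", k_deoxy), ("k_reaer", k_reaer),
                          ("k_algae", k_algae)]}
print(shrink)
```

In the paper's terms, a parameter whose `shrink` ratio stays near 1 is "completely insensitive, and hence non-identifiable, within the model structure".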
Abstract:
Reanalysis data provide an excellent test bed for impacts prediction systems, because they represent an upper limit on the skill of climate models. Indian groundnut (Arachis hypogaea L.) yields have been simulated using the General Large-Area Model (GLAM) for annual crops and the European Centre for Medium-Range Weather Forecasts (ECMWF) 40-yr reanalysis (ERA-40). The ability of ERA-40 to represent the Indian summer monsoon has been examined. The ability of GLAM, when driven with daily ERA-40 data, to model both observed yields and observed relationships between subseasonal weather and yield has been assessed. Mean yields were simulated well across much of India. Correlations between observed and modeled yields, where these are significant, are comparable to correlations between observed yields and ERA-40 rainfall. Uncertainties due to the input planting window, crop duration, and weather data have been examined. A reduction in the root-mean-square error of simulated yields was achieved by applying bias correction techniques to the precipitation. The stability of the relationship between weather and yield over time has been examined. Weather-yield correlations vary on decadal time scales, and this has direct implications for the accuracy of yield simulations. Analysis of the skewness of both detrended yields and precipitation suggests that nonclimatic factors are partly responsible for this nonstationarity. Evidence from other studies, including data on cereal and pulse yields, indicates that this result is not particular to groundnut yield. The detection and modeling of nonstationary weather-yield relationships emerges from this study as an important part of the process of understanding and predicting the impacts of climate variability and change on crop yields.
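One widely used precipitation bias correction technique is empirical quantile mapping; the sketch below is a generic illustration with synthetic gamma-distributed daily rainfall, not the specific correction applied in the study.

```python
import numpy as np

def quantile_map(model_p, obs_p, target_p):
    """Empirical quantile mapping: re-express each target value at its position
    in the model's rainfall distribution as the corresponding observed value."""
    q = np.linspace(0.0, 1.0, 101)
    model_q = np.quantile(model_p, q)       # model quantiles (calibration period)
    obs_q = np.quantile(obs_p, q)           # observed quantiles
    return np.interp(target_p, model_q, obs_q)

rng = np.random.default_rng(7)
obs = rng.gamma(shape=2.0, scale=5.0, size=3000)     # "observed" daily rain (mm)
model = rng.gamma(shape=2.0, scale=7.0, size=3000)   # wet-biased model rain (mm)
corrected = quantile_map(model, obs, model)

print(round(float(model.mean()), 1), round(float(corrected.mean()), 1),
      round(float(obs.mean()), 1))
```

Feeding corrected rather than raw reanalysis rainfall to a crop model is the kind of step that can reduce the root-mean-square error of simulated yields, as reported above.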
Abstract:
In a sequential clinical trial, accrual of data on patients often continues after the stopping criterion for the study has been met. This is termed “overrunning.” Overrunning occurs mainly when the primary response from each patient is measured after some extended observation period. The objective of this article is to compare two methods of allowing for overrunning. In particular, simulation studies are reported that assess the two procedures in terms of how well they maintain the intended type I error rate. The effect on power resulting from the incorporation of “overrunning data” using the two procedures is evaluated.
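A toy simulation can show why overrunning data need careful handling: under the null hypothesis, trials that just cross a stopping boundary at an interim look are often overturned if the overrunning responses are naively folded into a re-analysis at the same critical value. The sample sizes, boundary, and naive re-analysis rule below are hypothetical, and neither of the two procedures compared in the article is implemented here.

```python
import numpy as np

rng = np.random.default_rng(11)
n_trials = 20000        # simulated trials under the null hypothesis
n1 = 100                # patients at the interim analysis
n_over = 30             # overrunning patients after the stopping criterion is met
z_crit = 1.96           # nominal two-sided 5% critical value
stopped = overturned = 0

for _ in range(n_trials):
    x1 = rng.normal(0.0, 1.0, n1)          # standardized responses, no true effect
    z1 = x1.mean() * np.sqrt(n1)
    if abs(z1) > z_crit:                   # stopping criterion met at interim
        stopped += 1
        x_all = np.concatenate([x1, rng.normal(0.0, 1.0, n_over)])
        z2 = x_all.mean() * np.sqrt(n1 + n_over)
        if abs(z2) <= z_crit:              # naive re-analysis overturns the result
            overturned += 1

print(round(stopped / n_trials, 3), round(overturned / stopped, 2))
```

A large fraction of boundary crossings are overturned by the extra noise-only data, which is why principled methods for incorporating overrunning data, assessed in the article via their type I error rate and power, are needed.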
Abstract:
It is now possible to assay a large number of genetic markers from patients in clinical trials in order to tailor drugs with respect to efficacy. The statistical methodology for analysing such massive data sets is challenging. The most popular type of statistical analysis is to use a univariate test for each genetic marker once all the data from a clinical study have been collected. This paper presents a sequential method for conducting an omnibus test for detecting gene-drug interactions across the genome, thus allowing informed decisions at the earliest opportunity and overcoming the multiple testing problems that arise from conducting many univariate tests. We first propose an omnibus test for a fixed sample size. This test is based on combining F-statistics that test for an interaction between treatment and each individual single nucleotide polymorphism (SNP). As SNPs tend to be correlated, we use permutations to calculate a global p-value. We extend our omnibus test to the sequential case. In order to control the type I error rate, we propose a sequential method that uses permutations to obtain the stopping boundaries. The results of a simulation study show that the sequential permutation method is more powerful than alternative sequential methods that control the type I error rate, such as the inverse-normal method. The proposed method is flexible, as we do not need to assume a mode of inheritance and can also adjust for confounding factors. An application to real clinical data illustrates that the method is computationally feasible for a large number of SNPs. Copyright (c) 2007 John Wiley & Sons, Ltd.
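The fixed-sample-size omnibus test described above can be sketched as follows: an interaction F-statistic is computed for each SNP, the statistics are combined (here by simple summation, an assumption), and the global p-value comes from permuting the treatment labels so that the correlation among SNPs is preserved. The data, effect size, and the choice to permute raw treatment labels are all illustrative simplifications.

```python
import numpy as np

rng = np.random.default_rng(3)
n, n_snp = 200, 50
treat = rng.integers(0, 2, n)                  # 0/1 treatment arm
snps = rng.integers(0, 3, (n, n_snp))          # genotypes coded 0/1/2
# Hypothetical outcome: only SNP 0 interacts with treatment
y = 1.5 * treat * snps[:, 0] + rng.normal(0.0, 1.0, n)

def interaction_F(y, treat, g):
    """F-statistic for the treat:snp term: full vs. reduced linear model."""
    X0 = np.column_stack([np.ones(len(y)), treat, g])   # reduced model
    X1 = np.column_stack([X0, treat * g])               # + interaction term
    rss0 = np.sum((y - X0 @ np.linalg.lstsq(X0, y, rcond=None)[0]) ** 2)
    rss1 = np.sum((y - X1 @ np.linalg.lstsq(X1, y, rcond=None)[0]) ** 2)
    return (rss0 - rss1) / (rss1 / (len(y) - X1.shape[1]))

def omnibus(y, treat, snps):
    """Combine per-SNP interaction F-statistics by summation."""
    return sum(interaction_F(y, treat, snps[:, j]) for j in range(snps.shape[1]))

observed = omnibus(y, treat, snps)
# Global p-value by permuting treatment labels (SNP correlation preserved)
perms = [omnibus(y, rng.permutation(treat), snps) for _ in range(200)]
p_global = (1 + sum(p >= observed for p in perms)) / (1 + len(perms))
print(round(p_global, 3))
```

The sequential extension in the paper repeats this idea at interim looks, with permutation-derived stopping boundaries controlling the overall type I error rate.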
Abstract:
We focus on the comparison of three statistical models used to estimate the treatment effect in meta-analysis when individually pooled data are available. The models are two conventional models, namely a multi-level model and a model based upon an approximate likelihood, and a newly developed model, the profile likelihood model, which might be viewed as an extension of the Mantel-Haenszel approach. To exemplify these methods, we use results from a meta-analysis of 22 trials to prevent respiratory tract infections. We show that by using the multi-level approach, in the case of baseline heterogeneity, the number of clusters or components is considerably over-estimated. The approximate and profile likelihood methods showed nearly the same pattern for the treatment effect distribution. To provide further evidence, two simulation studies were carried out. The profile likelihood can be considered a clear alternative to the approximate likelihood model. In the case of strong baseline heterogeneity, the profile likelihood method shows superior behaviour when compared with the multi-level model. Copyright (C) 2006 John Wiley & Sons, Ltd.
Abstract:
This article introduces a quantitative approach to e-commerce system evaluation based on the theory of process simulation. The general concept of e-commerce system simulation is motivated by several limitations of e-commerce system development: the large initial investment of time and money, and the long period from business planning through system development, testing, and operation to eventual return. In other words, currently used system analysis and development methods cannot tell investors how good their e-commerce system could be, how much return on investment they could expect, or which areas of the initial business plan they should improve. To examine the value and potential effects of an e-commerce business plan, a quantitative evaluation approach is needed, and the authors believe that process simulation is an appropriate option. The overall objective of this article is to apply the theory of process simulation to e-commerce system evaluation, which the authors achieve through an experimental study of a business plan for online construction and demolition waste exchange. The methodologies adopted include literature review, system analysis and development, simulation modelling and analysis, and case study. The results include the concept of e-commerce system simulation, a comprehensive review of simulation methods adopted in e-commerce system evaluation, and a real case study applying simulation to e-commerce system evaluation. The authors hope that adopting the process simulation approach can effectively support business decision-making and improve the efficiency of e-commerce systems.
Abstract:
This paper summarizes the design, manufacturing, testing, and finite element analysis (FEA) of glass-fibre-reinforced polyester leaf springs for rail freight vehicles. FEA predictions of load-deflection curves under static loading are presented, together with comparisons with test results. Bending stress distribution at typical load conditions is plotted for the springs. The springs have been mounted on a real wagon and drop tests at tare and full load have been carried out on a purpose-built shaker rig. The transient response of the springs from tests and FEA is presented and discussed.
Abstract:
In designing modern office buildings, building spaces are frequently zoned by introducing internal partitioning, which may have a significant influence on the room air environment. The effect of internal partitioning was studied by means of model tests, numerical simulation, and, as a final stage, statistical analysis. In this paper, the results of the statistical analysis are summarized and presented.
Abstract:
A combined mathematical model for predicting heat penetration and microbial inactivation in a solid body heated by conduction was tested experimentally by inoculating agar cylinders with Salmonella typhimurium or Enterococcus faecium and heating them in a water bath. Regions of growth where bacteria had survived after heating were measured by image analysis and compared with model predictions. Visualisation of the regions of growth was improved by incorporating chromogenic metabolic indicators into the agar. Preliminary tests established that the model performed satisfactorily with both test organisms and with cylinders of different diameter. The model was then used in simulation studies in which the parameters D, z, inoculum size, cylinder diameter and heating temperature were systematically varied. These simulations showed that the biological variables D, z and inoculum size had a relatively small effect on the time needed to eliminate bacteria at the cylinder axis in comparison with the physical variables heating temperature and cylinder diameter, which had a much greater relative effect. (c) 2005 Elsevier B.V. All rights reserved.
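The roles of the D- and z-values varied in these simulations can be illustrated with the standard first-order thermal death model, in which the D-value at a process temperature follows from a reference D-value and z. The numbers below are hypothetical, not the paper's fitted parameters.

```python
def time_to_reduction(d_ref, z, t_ref, temp, log_reduction):
    """Minutes at `temp` needed for `log_reduction` decimal reductions, given a
    reference D-value `d_ref` (min) at temperature `t_ref` and z-value `z` (degC)."""
    d_t = d_ref * 10.0 ** ((t_ref - temp) / z)   # D-value at the process temperature
    return log_reduction * d_t

# Hypothetical values: D60 = 5 min, z = 5 degC, 6-log kill target
print(time_to_reduction(d_ref=5.0, z=5.0, t_ref=60.0, temp=65.0, log_reduction=6))
# At 65 degC the D-value falls to 0.5 min, so a 6-log kill takes 3.0 min
```

Because temperature enters this relation exponentially while D, z, and inoculum size enter only linearly or logarithmically, the model's finding that the physical variables dominate is unsurprising.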
Abstract:
We present the symbolic resonance analysis (SRA) as a viable method for addressing the problem of enhancing a weakly dominant mode in a mixture of impulse responses obtained from a nonlinear dynamical system. We demonstrate this using results from a numerical simulation with Duffing oscillators in different domains of their parameter space, and by analyzing event-related brain potentials (ERPs) from a language processing experiment in German as a representative application. In this paradigm, the averaged ERPs exhibit an N400 followed by a sentence-final negativity. Contemporary sentence processing models predict a late positivity (P600) as well. We show that the SRA is able to unveil the P600 evoked by the critical stimuli as a weakly dominant mode from the covering sentence-final negativity. (c) 2007 American Institute of Physics.