904 results for calibration of rainfall-runoff models


Abstract:

Infant formula is often produced as an agglomerated powder using a spray drying process, and pneumatic conveying is commonly used to transport this product within a manufacturing plant. The transient mechanical loads imposed by this process cause some of the agglomerates to disintegrate, which has implications for key quality characteristics of the formula, including bulk density and wettability. This thesis used both experimental and modelling approaches to investigate this breakage during conveying. One set of conveying trials aimed to establish relationships between the geometry and operating conditions of the conveying system and the resulting changes in the bulk properties of the infant formula upon conveying. A modular stainless steel pneumatic conveying rig was constructed for these trials. The mode of conveying and the air velocity had statistically significant effects on bulk density at the 95% level, while mode of conveying was the only factor that significantly influenced D[4,3] or wettability. A separate set of conveying experiments investigated the effect of infant formula composition, rather than the pneumatic conveying parameters, and also assessed the relationships between the mechanical responses of individual agglomerates of four infant formulae and their compositions. The bulk densities before conveying, and the forces and strains at failure of individual agglomerates, were related to the protein content. The force at failure and the stiffness of individual agglomerates were strongly correlated, and generally increased with increasing protein-to-fat ratio, while the strain at failure decreased. Two models of breakage were developed at different scales. The first was a detailed discrete element model of a single agglomerate, calibrated using a novel approach based on Taguchi methods which was shown to have considerable advantages over the basic parameter studies that are widely used. The data obtained using this model compared well to experimental results for quasi-static uniaxial compression of individual agglomerates, and the model also gave adequate results for dynamic loading simulations. A probabilistic model of pneumatic conveying was also developed; it was suitable for predicting breakage in large populations of agglomerates and was highly versatile, as parts of the model could easily be substituted by the researcher according to their specific requirements.
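
The Taguchi-based calibration mentioned above can be illustrated with a minimal sketch: a standard L9(3^4) orthogonal array covers four DEM contact parameters at three levels each in nine simulations instead of the 81 runs a full factorial would need. The parameter names, levels, target value and stand-in simulator below are illustrative assumptions, not the thesis's actual calibration setup.

```python
import numpy as np

# Standard L9(3^4) orthogonal array (0-indexed levels): 9 runs cover
# 4 factors at 3 levels; a full factorial would need 3^4 = 81 runs.
L9 = np.array([
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])

# Hypothetical DEM contact parameters and levels -- illustrative only.
levels = {
    "bond_stiffness": [1e4, 5e4, 1e5],     # N/m
    "bond_strength":  [0.5e6, 1e6, 2e6],   # Pa
    "friction_coeff": [0.3, 0.5, 0.7],
    "damping_ratio":  [0.05, 0.1, 0.2],
}

def simulate_agglomerate(params):
    """Stand-in for a DEM uniaxial-compression run; returns a toy
    failure force (N).  A real study would call the DEM solver here."""
    return 1e-6 * params["bond_strength"] * params["friction_coeff"]

measured_failure_force = 0.8  # N, stand-in experimental target

names = list(levels)
results = []
for run in L9:
    params = {n: levels[n][lvl] for n, lvl in zip(names, run)}
    error = abs(simulate_agglomerate(params) - measured_failure_force)
    results.append((error, params))

# Rank runs; marginal means per factor level would then identify the
# most influential parameters, as in a standard Taguchi analysis.
best_error, best_params = min(results, key=lambda r: r[0])
```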

Abstract:

We first examine the model of Hobson and Rogers for the volatility of a financial asset such as a stock or share. The main feature of this model is the specification of volatility in terms of past price returns. The volatility process and the underlying price process share the same source of randomness, so the model is said to be complete. Complete models are advantageous because they allow a unique, preference-independent price for options on the underlying price process. One of the main objectives of the model is to reproduce the 'smiles' and 'skews' seen in market implied volatilities, and it produces the desired effect. In the first main piece of work, we numerically calibrate the model of Hobson and Rogers for comparison with the existing literature. We also develop parameter estimation methods based on the calibration of a GARCH model. We examine alternative specifications of the volatility and show an improvement in model fit to market data based on these specifications. We also show how to process market data in order to take account of inter-day movements in the volatility surface. In the second piece of work, we extend the Hobson and Rogers model in a way that better reflects market structure, taking into account both first- and second-order effects. We derive and numerically solve the PDE which describes the price of options under this extended model, and show that the extension allows a better fit to the market data. Finally, we analyse the parameters of the extended model in order to understand intuitively their role in the volatility surface.
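
For readers unfamiliar with the model, a minimal discretised sketch of the Hobson-Rogers construction follows: the first-order "offset" is an exponentially weighted average of past departures of the log price from its own history (dO = -lambda*O dt + dZ), and volatility is a deterministic function of that offset. The capped form sigma(o) = eta*sqrt(1 + eps*o^2) is one specification used in the Hobson-Rogers literature; all parameter values here are illustrative.

```python
import numpy as np

def hobson_rogers_vol(log_prices, dt, lam=1.0, eta=0.2, eps=5.0, cap=1.0):
    """Discretised first-order offset and volatility for the
    Hobson-Rogers model.  The offset is an exponentially weighted
    measure of past price returns; volatility is a function of the
    offset, so the two processes share one source of randomness.
    Parameter values are illustrative, not calibrated."""
    n = len(log_prices)
    offset = np.zeros(n)
    vol = np.zeros(n)
    for t in range(1, n):
        # O satisfies dO = -lam * O dt + dZ, an EWMA of log returns.
        dz = log_prices[t] - log_prices[t - 1]
        offset[t] = offset[t - 1] * (1 - lam * dt) + dz
        # One common specification: eta * sqrt(1 + eps*o^2), capped.
        vol[t] = min(eta * np.sqrt(1 + eps * offset[t] ** 2), cap)
    return offset, vol
```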

Abstract:

While there is growing interest in measuring the size and scope of local spillovers, it is well understood that such spillovers cannot be distinguished from unobservable local attributes using solely the observed location decisions of individuals or firms. We propose an empirical strategy for recovering estimates of spillovers in the presence of unobserved local attributes for a broadly applicable class of equilibrium sorting models. Our approach relies on an instrumental variables (IV) strategy derived from the internal logic of the sorting model itself. We show practically how the strategy is implemented, provide intuition for our instruments, discuss the role of effective choice-set variation in identifying the model, and carry out a series of Monte Carlo simulations to demonstrate performance in small samples.
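
As a generic illustration of why such an instrumental-variables strategy matters (this is not the authors' specific instrument, which is derived from the sorting model itself), the following toy Monte Carlo shows OLS being biased by a regressor correlated with an unobservable, while two-stage least squares recovers the true coefficient; all data-generating values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def one_draw(n=500, beta=1.0):
    """Toy DGP: regressor x is correlated with the unobservable u
    (like an amenity correlated with unobserved local attributes);
    z is a valid instrument.  Returns (OLS, 2SLS) estimates of beta."""
    z = rng.normal(size=n)
    u = rng.normal(size=n)
    x = 0.8 * z + 0.6 * u + rng.normal(size=n)   # endogenous regressor
    y = beta * x + u                              # outcome
    ols = (x @ y) / (x @ x)
    x_hat = z * (z @ x) / (z @ z)                 # first-stage fit
    tsls = (x_hat @ y) / (x_hat @ x)              # second stage
    return ols, tsls

draws = np.array([one_draw() for _ in range(1000)])
print("OLS mean:", draws[:, 0].mean(), " 2SLS mean:", draws[:, 1].mean())
# OLS is biased upward by cov(x, u); 2SLS centres near the true beta = 1.
```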

Abstract:

INTRODUCTION: We previously reported models that characterized the synergistic interaction between remifentanil and sevoflurane in blunting responses to verbal and painful stimuli. This preliminary study evaluated the ability of these models to predict a return of responsiveness during emergence from anesthesia and a response to tibial pressure when patients required analgesics in the recovery room. We hypothesized that model predictions would be consistent with observed responses. We also hypothesized that under non-steady-state conditions, accounting for the lag time between the sevoflurane effect-site concentration (Ce) and the end-tidal (ET) concentration would improve predictions.

METHODS: Twenty patients received a sevoflurane, remifentanil, and fentanyl anesthetic. Two model predictions of responsiveness were recorded at emergence: an ET-based and a Ce-based prediction. Similarly, two predictions of a response to noxious stimuli were recorded when patients first required analgesics in the recovery room. Model predictions were compared with observations using graphical and temporal analyses.

RESULTS: While patients were anesthetized, model predictions indicated a high likelihood that patients would be unresponsive (≥99%). However, after termination of the anesthetic, the models exhibited a wide range of predictions at emergence (1%-97%). Although wide, the Ce-based predictions of responsiveness were better distributed over a percentage ranking of observations than the ET-based predictions. For the ET-based model, 45% of the patients awoke within 2 min of the 50% model-predicted probability of unresponsiveness and 65% awoke within 4 min. For the Ce-based model, 45% of the patients awoke within 1 min of the 50% model-predicted probability of unresponsiveness and 85% awoke within 3.2 min. Predictions of a response to a painful stimulus in the recovery room were similar for the Ce- and ET-based models.

DISCUSSION: The results confirmed our study hypothesis in part; accounting for the lag time between Ce and ET sevoflurane concentrations improved model predictions of responsiveness but had no effect on predicting a response to a noxious stimulus in the recovery room. These models may be useful in predicting events of clinical interest, but large-scale evaluations with numerous patients are needed to better characterize model performance.
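
The Ce/ET lag discussed above is conventionally modelled as a first-order process, dCe/dt = ke0·(Cet − Ce). A minimal sketch follows; the ke0 value and the washout profile are illustrative, not estimates from this study.

```python
import numpy as np

def effect_site(et_conc, dt, ke0=0.25):
    """First-order effect-site model: dCe/dt = ke0 * (Cet - Ce).
    `et_conc` is the sampled end-tidal sevoflurane concentration
    (vol%), `dt` the sampling interval (min).  ke0 = 0.25 /min is an
    illustrative value, not a population estimate."""
    ce = np.zeros_like(et_conc, dtype=float)
    for t in range(1, len(et_conc)):
        ce[t] = ce[t - 1] + ke0 * (et_conc[t] - ce[t - 1]) * dt
    return ce

# During washout the end-tidal level falls quickly while Ce lags
# behind, which is why a Ce-based prediction of emergence can differ
# from an ET-based one under non-steady-state conditions.
et = np.concatenate([np.full(30, 2.0), np.linspace(2.0, 0.0, 30)])
ce = effect_site(et, dt=0.5)
```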

Abstract:

A common but informal notion in social network analysis and other fields is that of a core/periphery structure. The intuitive conception entails a dense, cohesive core and a sparse, unconnected periphery. This paper seeks to formalize this intuitive notion and suggests algorithms for detecting the structure, along with statistical tests of a priori hypotheses. Different models are presented for different kinds of graphs (directed and undirected, valued and non-valued). In addition, the close relationship between the continuous models developed here and certain centrality measures is discussed.
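
A minimal sketch of the discrete core/periphery idea: fit can be scored as the correlation between the observed adjacency matrix and an ideal pattern matrix (here, one standard variant in which a tie is expected whenever either endpoint is in the core). The search over partitions (e.g. by simulated annealing) is omitted, and the toy graph is an assumption for illustration.

```python
import numpy as np

def core_periphery_fit(adj, is_core):
    """Pearson correlation between an observed adjacency matrix and
    the ideal core/periphery pattern (one standard variant: a tie is
    expected whenever either endpoint is in the core).  Off-diagonal
    entries only; a higher correlation means a better fit."""
    n = adj.shape[0]
    ideal = np.where(is_core[:, None] | is_core[None, :], 1.0, 0.0)
    mask = ~np.eye(n, dtype=bool)
    return np.corrcoef(adj[mask], ideal[mask])[0, 1]

# Toy graph: nodes 0-2 form a dense core, 3-5 a sparse periphery.
adj = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (0, 3), (1, 4), (2, 5)]:
    adj[i, j] = adj[j, i] = 1
print(core_periphery_fit(adj, np.array([1, 1, 1, 0, 0, 0], bool)))
```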

Abstract:

Computer-based mathematical models describing aircraft fire have a role to play in the design and development of safer aircraft, in the implementation of safer and more rigorous certification criteria, and in post mortem accident investigation. As the costs involved in performing large-scale fire experiments for the next-generation 'Ultra High Capacity Aircraft' (UHCA) are expected to be prohibitively high, the development and use of these modelling tools may become essential if these aircraft are to prove a safe and viable reality. By describing the present capabilities and limitations of aircraft fire models, this paper examines the future development of these models in the areas of large-scale application through parallel computing, combustion modelling, and extinguishment modelling.

Abstract:

The electronics industry is developing rapidly, and with it the problems associated with cooling microelectronic equipment. Thermal engineers now find it necessary to consider the complex area of equipment cooling at some level. This continually growing industry also faces heightened pressure from consumers for electronic product miniaturization, which in itself increases the demand for accurate thermal management predictions to assure product reliability. Computational fluid dynamics (CFD) is considered a powerful and almost essential tool for the design, development, and optimization of engineering applications, and it is now widely used within the electronics packaging design community to thermally characterize the performance of both the electronic component and the system environment. This paper discusses CFD results for a large variety of turbulence models. Comparison against experimental data illustrates the predictive accuracy of currently used models and highlights the growing demand for greater mathematical modelling accuracy with regard to thermal characterization. A newly formulated low-Reynolds-number (i.e. transitional) turbulence model is also proposed, with emphasis on hybrid techniques.

Abstract:

The export of organic carbon from the surface ocean by sinking particles is an important, yet highly uncertain, component of the global carbon cycle. Here we introduce a mechanistic assessment of global ocean carbon export using satellite observations, including determinations of net primary production and the slope of the particle size spectrum, to drive a food-web model that estimates the production of sinking zooplankton feces and algal aggregates comprising the sinking particle flux at the base of the euphotic zone. The synthesis of observations and models reveals fundamentally different and ecologically consistent regional-scale patterns in export and export efficiency not found in previous global carbon export assessments. The model reproduces regional-scale particle export field observations and predicts a climatological mean global carbon export from the euphotic zone of ~6 Pg C yr⁻¹. Global export estimates vary little (typically <10%) in response to factor-of-2 changes in model parameter values. The model is also robust to the choice of satellite data products used, and it enables interannual changes to be quantified. The present synthesis of observations and models provides a path for quantifying the ocean's biological pump.
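
A heavily simplified caricature of the export pathway described above can make the mechanism concrete: the particle-size-spectrum slope sets the split between small and large phytoplankton, grazing on small cells routes carbon to sinking faecal pellets, and large cells sink as aggregates. Every fraction and the slope mapping below are assumptions for illustration, not the published model's parameterisation.

```python
import numpy as np

def toy_export(npp, psd_slope):
    """Very simplified caricature of a food-web export model: NPP is
    split between small and large phytoplankton using the particle
    size distribution (PSD) slope (a steeper slope means more small
    cells); grazing on small cells yields sinking faecal pellets and
    large cells sink as aggregates.  All fractions are assumptions."""
    # Map the slope (typically about -5 to -3) to a small-cell
    # fraction in [0, 1].
    f_small = np.clip((-psd_slope - 3.0) / 2.0, 0.0, 1.0)
    fecal = 0.3 * f_small * npp              # grazed share exported as pellets
    aggregates = 0.5 * (1 - f_small) * npp   # large-cell share as aggregates
    return fecal + aggregates                # export at euphotic-zone base

# e.g. npp in mg C m^-2 d^-1 from a satellite product
print(toy_export(npp=800.0, psd_slope=-4.0))
```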

Abstract:

The accuracy of two satellite models of marine primary production (PP) and new production (NP) was assessed against ¹⁴C and ¹⁵N uptake measurements taken during six research cruises in the northern North Atlantic. The wavelength-resolving model (WRM) was more accurate than the Vertically Generalized Production Model (VGPM) for computation of both PP and NP. Mean monthly satellite maps of PP and NP for both models were generated from 1997 to 2010 using SeaWiFS data for the Irminger basin and North Atlantic. Intra- and inter-annual variability of the two models was compared in six hydrographic zones. Both models exhibited similar spatio-temporal patterns: PP and NP increased from April to June and decreased by August. Higher values were associated with the East Greenland Current (EGC), Iceland Basin (ICB) and the Reykjanes Ridge (RKR), and lower values occurred in the Central Irminger Current (CIC), North Irminger Current (NIC) and Southern Irminger Current (SIC). The annual PP and NP over the SeaWiFS record were 258 and 82 g C m⁻² yr⁻¹ respectively for the VGPM, and 190 and 41 g C m⁻² yr⁻¹ for the WRM. Average annual cumulative sums of the anomalies of NP for the VGPM were positively correlated with the North Atlantic Oscillation (NAO) in the EGC, CIC and SIC, and negatively correlated with the multivariate ENSO index (MEI) in the ICB. By contrast, the cumulative sum of the anomalies of NP for the WRM was significantly correlated with the NAO only in the EGC and CIC. NP from both the VGPM and the WRM exhibited significant negative correlations with the Arctic Oscillation (AO) in all hydrographic zones. The differences in estimates of PP and NP in these hydrographic zones arise principally from the parameterisation of the euphotic depth and from the SST dependence of the photo-physiological term in the VGPM, which has a greater sensitivity to variations in temperature than the WRM. In waters of 0 to 5 °C, PP from the VGPM was 43% higher than from the WRM; from 5 to 10 °C the VGPM was 29% higher, and from 10 to 15 °C the VGPM was 27% higher.
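
For reference, the VGPM computes depth-integrated production as the product of an SST-dependent optimal assimilation rate, a saturating light term, surface chlorophyll, euphotic depth and day length; the SST sensitivity noted above enters through the P_opt term. The sketch below follows that published structure (Behrenfeld & Falkowski) but leaves P_opt as a user-supplied value, since its polynomial coefficients in SST are not reproduced here.

```python
def vgpm_pp(chl, pb_opt, e0, z_eu, day_length):
    """Depth-integrated primary production (mg C m^-2 d^-1) following
    the structure of the Vertically Generalized Production Model.
      chl        -- surface chlorophyll (mg m^-3)
      pb_opt     -- optimal assimilation rate (mg C (mg chl)^-1 h^-1);
                    in the VGPM this is a polynomial in SST, which is
                    where the model's temperature sensitivity enters;
                    supply it from the published fit.
      e0         -- surface PAR (mol quanta m^-2 d^-1)
      z_eu       -- euphotic depth (m)
      day_length -- photoperiod (h)
    """
    light_term = e0 / (e0 + 4.1)   # saturating light-limitation term
    return 0.66125 * pb_opt * light_term * chl * z_eu * day_length
```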

Abstract:

Coastal zones and shelf seas are important for tourism, commercial fishing and aquaculture. As a result, the importance of good water quality within these regions to support life is recognised worldwide, and a number of international directives for monitoring them now exist. This paper describes the AlgaRisk water quality monitoring demonstration service that was developed and operated for the UK Environment Agency in response to the microbiological monitoring needs within the revised European Union Bathing Waters Directive. The AlgaRisk approach used satellite Earth observation to provide near-real-time monitoring of microbiological water quality, and a series of nested operational models (atmospheric and hydrodynamic-ecosystem) provided a forecast capability. For the period of the demonstration service (2008-2013), all monitoring and forecast datasets were processed in near-real time on a daily basis and disseminated through a dedicated web portal, with extracted data automatically emailed to agency staff. Near-real-time data processing was achieved using a series of supercomputers and an Open Grid approach. The novel web portal and Java-based viewer enabled users to visualise and interrogate current and historical data. The system description, the algorithms employed and example results focusing on a case study of a bloom of the harmful alga Karenia mikimotoi are presented. Recommendations and the potential exploitation of web services for future water quality monitoring services are discussed.

Abstract:

We synthesise and update results from the suite of biophysical larval-dispersal models developed in the Benguela Current ecosystem. Biophysical models of larval dispersal use the outputs of physical hydrodynamic models as inputs to individual-based models in which biological processes acting during larval life are included. In the Benguela, such models were first applied to simulate the dispersal of anchovy Engraulis encrasicolus and sardine Sardinops sagax ichthyoplankton, and more recently of the early life stages of chokka-squid Loligo reynaudii and Cape hakes Merluccius spp. We identify how the models have helped advance understanding of key processes for these species. We then discuss which aspects of the early life of marine species in the Benguela Current ecosystem are still not well understood and could benefit from new modelling studies.
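
The physical core of such biophysical models is a Lagrangian step: virtual larvae are advected through archived hydrodynamic-model velocity fields, usually with a random-walk term for unresolved turbulence, and biology (mortality, behaviour, settlement) is layered on top. A minimal sketch with a toy velocity field follows; a real IBM would interpolate ROMS or similar model output in space and time.

```python
import numpy as np

def toy_velocity(x, y, t):
    """Stand-in for interpolated hydrodynamic-model output (m/s);
    a real IBM would sample archived velocity fields here."""
    return -0.1 * y, 0.1 * x   # solid-body rotation

def advect(particles, t, dt, diffusivity=10.0, rng=None):
    """One Euler step of Lagrangian transport for an ensemble of
    virtual larvae: deterministic advection plus a random-walk term
    representing unresolved turbulent diffusion (m^2/s)."""
    rng = rng or np.random.default_rng()
    for p in particles:
        u, v = toy_velocity(p[0], p[1], t)
        noise = np.sqrt(2 * diffusivity * dt) * rng.normal(size=2)
        p[0] += u * dt + noise[0]
        p[1] += v * dt + noise[1]
    return particles

particles = [np.array([1000.0, 0.0]) for _ in range(100)]
for step in range(24):                       # 24 hourly steps
    advect(particles, t=step * 3600.0, dt=3600.0)
```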

Abstract:

Regime shifts have been reported in many marine ecosystems, and are often expressed as an abrupt change occurring in multiple physical and biological components of the system. In the Gulf of Alaska, a regime shift was observed in the late 1970s, indicated by an abrupt increase in sea surface temperature and major shifts in the catch of many fish species. This late-1970s regime shift was followed by another in the late 1980s which, although not as pervasive as the 1977 shift, nevertheless did not return the system to its prior state. A thorough understanding of the extent of, and mechanisms leading to, such regime shifts is challenged by the paucity of data in time and space. We investigate the ability of a suite of ocean biogeochemistry models of varying complexity to simulate regime shifts in the Gulf of Alaska by examining the presence of abrupt changes in time series of physical variables (sea surface temperature and mixed layer depth), nutrients and biological variables (chlorophyll, primary productivity and plankton biomass) using change-point analysis. Our study demonstrates that ocean biogeochemical models are capable of simulating the late-1970s shift, indicating an abrupt increase in sea surface temperature forcing followed by an abrupt decrease in nutrients and biological productivity. This predicted shift is consistent among all the models, although some exhibit an abrupt transition (i.e. a significant shift from one year to the next) whereas others simulate a smoother one. Some models further suggest that the late-1980s shift was constrained by changes in mixed layer depth. These results show that ocean biogeochemical models can successfully simulate regime shifts in the Gulf of Alaska region, thereby providing a better understanding of how changes in physical conditions are propagated from lower to upper trophic levels through bottom-up control.
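
Change-point analysis in this context locates the split that most sharply divides a time series into two regimes with different means. The sketch below scans all candidate splits and returns the one with the largest two-sample t-statistic; it illustrates the idea rather than reproducing the specific algorithm used in the study, and the synthetic series is an assumption.

```python
import numpy as np

def best_changepoint(series, min_seg=5):
    """Scan candidate split points and return the index where the
    difference in segment means, scaled by the standard error of the
    difference, is largest: a simple mean-shift change-point detector."""
    x = np.asarray(series, dtype=float)
    best_i, best_t = None, 0.0
    for i in range(min_seg, len(x) - min_seg):
        a, b = x[:i], x[i:]
        se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
        t = abs(a.mean() - b.mean()) / se
        if t > best_t:
            best_i, best_t = i, t
    return best_i, best_t

# Synthetic SST-like series with an abrupt 1977-style warm shift.
rng = np.random.default_rng(1)
sst = np.concatenate([rng.normal(6.0, 0.3, 30), rng.normal(6.8, 0.3, 30)])
print(best_changepoint(sst))   # split index detected near 30
```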

Abstract:

1. Barnacles are a good model organism for the study of open populations with space-limited recruitment. These models are applicable to other species with an open supply of new individuals and resource limitation. The inclusion of space in the models leads to reductions in recruitment with increasing density, and thus predictions of population size and stability are possible.

2. Despite the potential generality of a demographic theory for open space-limited populations, the models currently have a narrow empirical base. In this study, a model for an open population with space-limited recruitment was extended to include size-specific survival and promotions to any size class. The assumptions of this model were tested using data from a pan-European study of the barnacle Chthamalus montagui Southward. Two models were constructed: a 6-month model and a periodic annual model. Predicted equilibria and their stabilities were compared between shores.

3. Tests of the model assumptions supported the extension of the theory to include promotions to any size class. Mortality was found to be size-specific and density-independent. The studied populations were open, with recruitment proportional to free space.

4. The 6-month model showed a significant interaction between time and location for equilibrium free space. This may have been due to contrasts in the timing of structuring processes (i.e. creating and filling space) between the Mediterranean and Atlantic systems. Integration of the 6-month models into a periodic annual model removed the differences in equilibrium free space between locations.

5. Model predictions show a remarkable similarity between shores at a European scale. Populations were persistent and all solutions were stable. This reflects the apparent absence of density-dependent mortality and a high adult survivorship in C. montagui. As the populations are intrinsically stable, observed fluctuations in density are directly attributable to variations in the environmental forcing of recruitment or mortality.
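
The model class in point 2 can be sketched in a few lines: a size-structured projection in which recruitment into the smallest class is proportional to the free space left after summing the area occupied by each size class. The transition matrix, areas and recruitment rate below are illustrative, not estimates for C. montagui, and the toy matrix allows only adjacent promotions where the extended theory allows promotion to any class.

```python
import numpy as np

def project(n, A_total, areas, transition, recruits_per_area, steps=50):
    """Iterate an open, size-structured, space-limited population:
    n_{t+1} = T n_t + r * F_t * e1, where free space
    F_t = A_total - sum_i a_i n_i(t) and recruitment enters the
    smallest size class in proportion to F_t."""
    history = [n.copy()]
    for _ in range(steps):
        free = max(A_total - areas @ n, 0.0)   # free space cannot go negative
        n = transition @ n                     # size-specific survival/growth
        n[0] += recruits_per_area * free       # recruitment onto free space
        history.append(n.copy())
    return np.array(history)

# Illustrative parameters: three size classes; column j gives the
# fate (survival, promotion) of individuals in class j.
T = np.array([[0.3, 0.0, 0.0],
              [0.4, 0.6, 0.0],
              [0.0, 0.3, 0.9]])
traj = project(np.zeros(3), A_total=100.0,
               areas=np.array([0.1, 0.5, 1.0]),
               transition=T, recruits_per_area=2.0)
```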

Abstract:

Self-compacting concrete (SCC) is generally designed with a relatively higher content of fines, including cement, and a higher dosage of superplasticizer than conventional concrete. Current SCC designs lead to high compressive strengths and are already used in special applications where the high cost of materials can be tolerated. Using SCC, which eliminates the need for vibration, increases the speed of casting and thus reduces labour requirements, energy consumption, construction time, and the cost of equipment. In order to gain maximum benefit from SCC it has to be used for wider applications, and the cost of materials can be decreased by reducing the cement content and using a minimum amount of admixtures. This paper reviews statistical models obtained from a factorial design carried out to determine the influence of four key parameters on filling ability, passing ability, segregation and compressive strength; these parameters are important for the successful development of medium-strength self-compacting concrete (MS-SCC). The parameters considered in the study were the contents of cement and pulverised fuel ash (PFA), the water-to-powder ratio (W/P), and the dosage of superplasticizer (SP). The responses of the derived statistical models are slump flow, fluidity loss, rheological parameters, Orimet time, V-funnel time, L-box, J-Ring combined with the Orimet, J-Ring combined with the cone, fresh segregation, and compressive strength at 7, 28 and 90 days. The models are valid for mixes made with a W/P ratio of 0.38 to 0.72, 60 to 216 kg/m³ of cement, 183 to 317 kg/m³ of PFA, and 0 to 1% of SP, by mass of powder. The utility of such models for optimizing concrete mixes to achieve a good balance between filling ability, passing ability, segregation, compressive strength, and cost is discussed. Examples highlighting the usefulness of the models are presented using isoresponse surfaces to demonstrate single and coupled effects of mix parameters on slump flow, loss of fluidity, flow resistance, segregation, J-Ring combined with the Orimet, and compressive strength at 7 and 28 days. A cost analysis is carried out to show trade-offs between the cost of materials and the specified consistency levels and compressive strengths at 7 and 28 days, which can be used to identify economic mixes. The paper establishes the usefulness of the mathematical models as a tool to facilitate the test protocol required to optimise medium-strength SCC.
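
The factorial-design models reviewed here are, in form, polynomial regressions in coded factor levels with main effects and two-factor interactions. A minimal sketch of fitting such a model by least squares follows; the design points and response values are placeholders, not the paper's data.

```python
import numpy as np
from itertools import combinations

def design_matrix(X):
    """Columns: intercept, main effects, and two-factor interactions
    for coded factor levels (e.g. -1..+1), the usual model form for a
    factorial design."""
    cols = [np.ones(len(X))] + [X[:, j] for j in range(X.shape[1])]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
    return np.column_stack(cols)

# Coded levels of (cement, PFA, W/P, SP) for hypothetical mixes, and a
# measured response (e.g. slump flow in mm); values are placeholders.
X = np.array([[-1, -1, -1, -1], [1, -1, -1, 1], [-1, 1, 1, -1],
              [1, 1, 1, 1], [0, 0, 0, 0], [1, -1, 1, -1],
              [-1, 1, -1, 1], [0, 1, 0, -1], [1, 0, -1, 0],
              [-1, 0, 1, 1], [0, -1, 1, 0], [1, 1, -1, -1]], float)
y = np.array([520, 610, 650, 700, 600, 660,
              580, 640, 615, 655, 630, 595], float)

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
# beta holds the fitted intercept, main effects and interactions, from
# which isoresponse surfaces like those in the paper can be drawn.
```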

Abstract:

For interpreting past changes on a regional or global scale, the timings of proxy-inferred events are usually aligned with data from other locations. However, chronological uncertainties are too often ignored in proxy diagrams and multi-site comparisons, making it possible for researchers to fall into the trap of sucking separate events into one illusory event (or vice versa). Here we largely solve this "suck in and smear" syndrome for radiocarbon (¹⁴C) dated sequences. In a Bayesian framework, millions of plausible age-models are constructed to quantify the chronological uncertainties within and between proxy archives. We test the technique on replicated high-resolution ¹⁴C-dated peat cores deposited during the 'Little Ice Age' (c. AD 1400-1900), a period characterized by abrupt climate changes and severe ¹⁴C calibration problems. Owing to internal variability in the proxy data and uncertainties in the age-models, these (and possibly many more) archives are not consistent in recording decadal climate change. Through explicit statistical tests of palaeoenvironmental hypotheses we can move towards systematic interpretations of proxy data; however, the chronological uncertainties of non-annually resolved palaeoclimate records are too large to answer decadal-timescale questions.
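
The between-archive uncertainty quantification can be illustrated with a toy Monte Carlo: sample plausible calendar ages for the "same" event in two cores and compute how often the two fall within a given tolerance of each other. The Gaussian age errors below stand in for the full Bayesian age-depth posteriors over calibrated ¹⁴C dates used in the paper; all numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def p_synchronous(mu_a, sd_a, mu_b, sd_b, tol=10.0, n=100_000):
    """Monte Carlo probability that proxy events dated in two archives
    occurred within `tol` years of each other.  Gaussian age errors
    are a toy stand-in for Bayesian age-depth model posteriors over
    calibrated 14C dates."""
    age_a = rng.normal(mu_a, sd_a, n)
    age_b = rng.normal(mu_b, sd_b, n)
    return np.mean(np.abs(age_a - age_b) < tol)

# Two cores appear to record an "event" at AD 1580 vs AD 1620; with
# +/-40 yr uncertainties the records cannot distinguish one event
# from two, the "suck in and smear" trap described above.
print(p_synchronous(1580, 40, 1620, 40, tol=10))
```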