879 results for model testing


Relevance:

30.00%

Publisher:

Abstract:

This paper presents a control strategy for blood glucose (BG) level regulation in type 1 diabetic patients. To design the controller, a model-based predictive control scheme was applied to a newly developed diabetic patient model. The controller is provided with a feedforward loop to improve meal compensation, a gain-scheduling scheme to account for different BG levels, and an asymmetric cost function to reduce hypoglycemic risk. The controller was tested in a simulation environment that has been approved for testing artificial pancreas control algorithms. The simulation results show good controller performance in fasting conditions and meal disturbance rejection, as well as robustness against model-patient mismatch and errors in meal estimation.
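The asymmetric cost function mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the function names, setpoint, and weights below are illustrative assumptions, showing only how a quadratic stage cost can penalize predicted excursions below a glucose setpoint (hypoglycemia) more heavily than excursions above it.

```python
# Illustrative sketch (assumed values, not the paper's controller): an
# asymmetric MPC stage cost that weights the hypoglycemic side of the
# blood-glucose error more heavily than the hyperglycemic side.

def asymmetric_stage_cost(bg_pred, setpoint=110.0, w_hypo=10.0, w_hyper=1.0):
    """Quadratic cost with a larger weight below the setpoint."""
    error = bg_pred - setpoint
    weight = w_hypo if error < 0 else w_hyper
    return weight * error ** 2

def horizon_cost(bg_trajectory, **kw):
    """Sum of stage costs over a predicted BG trajectory."""
    return sum(asymmetric_stage_cost(bg, **kw) for bg in bg_trajectory)

# Equal-magnitude errors on either side of the setpoint:
print(asymmetric_stage_cost(90.0))   # 20 mg/dL below: heavy penalty
print(asymmetric_stage_cost(130.0))  # 20 mg/dL above: light penalty
```

A predictive controller minimizing such a horizon cost will accept mild hyperglycemia in exchange for a lower probability of dropping below the setpoint.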


I test for the presence of hidden information and hidden action in the automobile insurance market using a data set from several Colombian insurers. To identify the presence of hidden information, I find a common-knowledge variable that provides information on the policyholder's risk type, is related to both experienced risk and insurance demand, and was excluded from the pricing mechanism. This unused variable is the policyholder's record of traffic offenses. I find evidence of adverse selection in six of the nine insurance companies for which the test is performed. On the hidden-action side, I develop a dynamic model of effort in accident prevention under an insurance contract with a bonus experience-rating scheme, and I show that individual accident probability decreases with previous accidents. This result yields a testable implication for the empirical identification of hidden action. Based on it, I estimate an econometric model of the time spans between the purchase of the insurance and the first claim, between the first claim and the second, and so on. I find strong evidence of unobserved heterogeneity that confounds the testable implication. Once unobserved heterogeneity is controlled for, I find conclusive statistical grounds supporting the presence of moral hazard in the Colombian insurance market.
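The identification problem the abstract describes can be illustrated with a toy simulation. All rates and distributions below are made-up assumptions, not the paper's model: they show only how a persistent unobserved "frailty" can make successive claim spans look positively dependent across the population even when each individual's accident hazard falls after a claim (the moral-hazard signature).

```python
# Toy illustration (assumed parameters): unobserved heterogeneity vs.
# a within-individual hazard drop after a first claim.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
# Unobserved risk type (frailty), constant across an individual's spells
frailty = rng.gamma(shape=5.0, scale=0.5, size=n)

# Time to first claim, then first-to-second claim; the individual
# hazard drops 20% after a claim (more prevention effort).
t1 = rng.exponential(1.0 / frailty)
t2 = rng.exponential(1.0 / (0.8 * frailty))

# Despite the within-individual hazard drop, spans are positively
# correlated in the population: high-frailty drivers have short spans twice.
corr = np.corrcoef(t1, t2)[0, 1]
print(round(corr, 3))  # positive
```

This is why the abstract's duration model must control for unobserved heterogeneity before the negative state dependence implied by moral hazard can be detected.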


This paper analyzes the measure of systemic importance ∆CoVaR proposed by Adrian and Brunnermeier (2009, 2010) within the context of a similar class of risk measures used in the risk management literature. In addition, we develop a series of testing procedures, based on ∆CoVaR, to identify and rank the systemically important institutions. We stress the importance of statistical testing in interpreting the measure of systemic importance. An empirical application illustrates the testing procedures, using equity data for three European banks.
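A rough empirical sketch of the ∆CoVaR idea may help fix the definition: the system's value-at-risk conditional on an institution being in distress (at its own VaR), minus the system's VaR conditional on the institution being at its median. The conditioning band and quantile level below are illustrative assumptions; the paper's estimation and testing procedures (e.g. quantile regression and formal significance tests) are not reproduced here.

```python
# Crude conditional-quantile sketch of Delta-CoVaR (assumed bandwidth
# and quantile level; not the paper's estimation procedure).
import numpy as np

def delta_covar(system_ret, inst_ret, q=0.05, band=0.05):
    system_ret = np.asarray(system_ret)
    inst_ret = np.asarray(inst_ret)
    var_inst = np.quantile(inst_ret, q)     # institution's own VaR level
    med_inst = np.quantile(inst_ret, 0.5)   # institution's median state
    near = lambda level: np.abs(inst_ret - level) <= band
    covar_distress = np.quantile(system_ret[near(var_inst)], q)
    covar_median = np.quantile(system_ret[near(med_inst)], q)
    return covar_distress - covar_median
```

For an institution whose returns co-move with the system, the conditional system quantile is lower in the distress state, so ∆CoVaR is negative and its magnitude ranks systemic importance.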


We develop a model where a free genetic test reveals whether the individual tested has a low or high probability of developing a disease. A costly prevention effort allows high-risk agents to decrease the probability of developing the disease. Agents are not obliged to take the test, but must disclose its results to insurers. Insurers offer separating contracts that take the individual risk into account, so that taking the test carries a discrimination risk. We study the individual decisions to take the test and to undertake the prevention effort as a function of the effort cost and of its efficiency. We find that, if effort is observable by insurers, agents undertake the test only if the effort cost is neither too large nor too low. If the effort cost is not observable by insurers, they face a moral hazard problem which induces them to under-provide insurance. We obtain the counterintuitive result that moral hazard increases the value of the test if the effort cost is low enough. Also, agents may take the test at lower levels of prevention efficiency when effort is not observable.


This thesis develops a control system capable of optimizing the operation of Sequencing Batch Reactors (SBRs) in the field of organic matter and nitrogen removal from wastewater. The control system adjusts the duration of the reaction stages on-line, based on direct or indirect probe measurements. In a first stage of the thesis, the calibration of mathematical models that allow different control strategies to be tested easily was studied. From the analysis of historical data, different options for controlling the SBR were proposed, and the most suitable ones were tested by simulation. After the success of the control strategy had been confirmed by simulation, it was implemented in a semi-industrial plant. Finally, the structure of a supervisory system is proposed, responsible for controlling the operation of the SBR not only at the phase level but also at the cycle level.


Most failures in structural elements are due to fatigue loading. Consequently, mechanical fatigue is a key factor in the design of mechanical components. In laminated composite materials, the fatigue failure process involves different damage mechanisms that result in the degradation of the material. One of the most important damage mechanisms is delamination between plies of the laminate. In aeronautical components, composite panels are exposed to impacts, and delaminations readily appear in a laminate after an impact. Many composite components have curved shapes, ply overlaps, and plies with different orientations, which cause a delamination to propagate under a mixed mode that depends on the delamination size. That is, delaminations generally propagate under varying mixed mode. It is therefore important to develop new methods to characterize the subcritical mixed-mode fatigue growth of delaminations. The main objective of this work is the characterization of the varying-mixed-mode growth of delaminations in laminated composites under fatigue loading. To this end, a new model for mixed-mode fatigue delamination growth is proposed. In contrast to existing models, the proposed model is formulated according to the non-monotonic variation of the propagation parameters with mode mix observed in different experimental results. In addition, an analysis of the mixed-mode end load split (MMELS) test is carried out, whose most important characteristic is that the mode mix varies as the delamination grows. For this analysis, two theoretical methods from the literature are considered. However, the resulting expressions for the MMELS test are not equivalent, and the differences between the two methods can be large, up to a factor of 50.
For this reason, a more accurate alternative analysis of the MMELS test is carried out in this work in order to establish a comparison. This alternative analysis is based on the finite element method and the virtual crack closure technique (VCCT). Important aspects to consider for proper material characterization using the MMELS test emerge from this analysis. During the study, a fixture for the MMELS test was designed and built. Different essentially unidirectional carbon/epoxy laminate specimens are used for the experimental characterization of fatigue delamination growth under varying mixed mode. A fractographic analysis of some of the delamination fracture surfaces is also carried out. The experimental results are compared with the predictions of the proposed model for the fatigue propagation of interlaminar cracks.
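The kind of growth law the abstract describes can be sketched as a Paris-type relation, da/dN = C·(ΔG)^m, in which the coefficients C and m are interpolated from values measured at several fixed mode mixtures, so that their variation with mode mix need not be monotonic. All numerical values below are made-up placeholders, not material data or the thesis's fitted parameters.

```python
# Sketch of a mixed-mode fatigue delamination growth law with
# mode-mix-dependent Paris parameters (placeholder values only).
import numpy as np

# Mode mixity B = G_II / (G_I + G_II): 0 = pure mode I, 1 = pure mode II
mode_mix_pts = np.array([0.0, 0.5, 1.0])
log_C_pts = np.array([-10.0, -12.5, -11.0])  # non-monotonic in mode mix
m_pts = np.array([6.0, 9.0, 7.5])            # non-monotonic in mode mix

def growth_rate(delta_G, mode_mix):
    """Crack growth per cycle, da/dN, for an energy release rate range."""
    log_C = np.interp(mode_mix, mode_mix_pts, log_C_pts)
    m = np.interp(mode_mix, mode_mix_pts, m_pts)
    return 10.0 ** log_C * delta_G ** m
```

Because the mode mix in a test such as MMELS changes as the delamination grows, the effective C and m change along the crack path, which is what a constant-parameter Paris law cannot capture.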



We discuss and test the potential usefulness of single-column models (SCMs) for the testing of stochastic physics schemes that have been proposed for use in general circulation models (GCMs). We argue that although single column tests cannot be definitive in exposing the full behaviour of a stochastic method in the full GCM, and although there are differences between SCM testing of deterministic and stochastic methods, SCM testing remains a useful tool. It is necessary to consider an ensemble of SCM runs produced by the stochastic method. These can be usefully compared to deterministic ensembles describing initial condition uncertainty and also to combinations of these (with structural model changes) into poor man's ensembles. The proposed methodology is demonstrated using an SCM experiment recently developed by the GCSS (GEWEX Cloud System Study) community, simulating transitions between active and suppressed periods of tropical convection.


Analysis of the vertical velocity of ice crystals observed with a 1.5 μm Doppler lidar from a continuous 17-month sample of stratiform ice clouds shows that the distribution of Doppler velocity varies strongly with temperature, with mean velocities of 0.2 m/s at −40 °C increasing to 0.6 m/s at −10 °C due to particle growth and broadening of the size spectrum. We examine the likely influence of crystals smaller than 60 μm by forward modelling their effect on the area-weighted fall speed and comparing the results to the lidar observations. The comparison strongly suggests that the concentration of small crystals in most clouds is much lower than measured in situ by some cloud droplet probes. We argue that the discrepancy is likely due to shattering of large crystals on the probe inlet, and that numerous small particles should not be included in numerical weather and climate model parameterizations.
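The forward-modelling step can be sketched as follows: given an assumed crystal size distribution and a size-to-fall-speed relation, compute the cross-sectional-area-weighted mean fall speed that the lidar would observe. The power-law fall-speed relation and the example distributions below are illustrative assumptions, not the paper's retrieved values.

```python
# Sketch of an area-weighted fall speed forward model (assumed
# fall-speed law and size distributions, for illustration only).
import numpy as np

def area_weighted_fall_speed(diam_um, number_conc):
    """Mean fall speed weighted by particle cross-sectional area."""
    diam_um = np.asarray(diam_um, dtype=float)
    number_conc = np.asarray(number_conc, dtype=float)
    v = 0.01 * diam_um ** 0.5   # assumed power-law fall speed, m/s
    area = diam_um ** 2          # cross-section proportional to D^2
    w = number_conc * area
    return np.sum(w * v) / np.sum(w)

# Distribution of large crystals only, then the same with a numerous
# small-crystal (20 um) mode added:
without_small = area_weighted_fall_speed([200.0, 400.0], [1.0, 1.0])
with_small = area_weighted_fall_speed([20.0, 200.0, 400.0], [1000.0, 1.0, 1.0])
print(without_small, with_small)
```

Adding a numerous small-crystal mode pulls the area-weighted speed down appreciably; the absence of such a slowdown in the lidar observations is what argues against high concentrations of small crystals.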


The GEFSOC Project developed a system for estimating soil carbon (C) stocks and changes at the national and sub-national scale. As part of the development of the system, the Century ecosystem model was evaluated for its ability to simulate soil organic C (SOC) changes under the environmental conditions of the Indo-Gangetic Plains, India (IGP). Two long-term fertilizer trials (LTFT), with all parameters needed to run Century, were used for this purpose: a jute (Corchorus capsularis L.), rice (Oryza sativa L.) and wheat (Triticum aestivum L.) trial at Barrackpore, West Bengal, and a rice-wheat trial at Ludhiana, Punjab. The trials represent two contrasting climates of the IGP, viz. semi-arid and dry, with mean annual rainfall (MAR) of < 800 mm, and humid, with MAR > 1600 mm. Both trials involved several treatments with different organic and inorganic fertilizer inputs. In general, the model tended to overestimate treatment effects by approximately 15%. At the semi-arid site, modelled data simulated actual data reasonably well for all treatments, with the control and chemical N + farmyard manure showing the best agreement (RMSE = 7). At the humid site, Century performed less well, which could be due to a range of factors including site history. During the study, Century was calibrated to simulate crop yields for the two sites using data from across the Indian IGP; however, further adjustments may improve model performance at these and other sites in the IGP. The availability of more long-term experimental data sets (especially those involving flooded lowland rice and triple cropping systems from the IGP) for testing and validation is critical to the application of the model's predictive capabilities for this area of the Indian sub-continent. (C) 2007 Elsevier B.V. All rights reserved.


The conceptual and parameter uncertainty of the semi-distributed INCA-N (Integrated Nutrients in Catchments-Nitrogen) model was studied using the GLUE (Generalized Likelihood Uncertainty Estimation) methodology combined with quantitative experimental knowledge, a concept known as 'soft data'. Cumulative inorganic N leaching, annual plant N uptake and annual mineralization proved to be useful soft data for constraining the parameter space. The INCA-N model was able to simulate the seasonal and inter-annual variations in stream-water nitrate concentrations, although the lowest concentrations during the growing season were not reproduced. This suggests that some retention processes or losses, either in peatland/wetland areas or in the river, are not included in the INCA-N model. The results of the study suggest that soft data offer a way to reduce parameter equifinality, and that the calibration and testing of distributed hydrological and nutrient leaching models should be based both on runoff and/or nutrient concentration data and on the qualitative knowledge of experimentalists. (c) 2006 Elsevier B.V. All rights reserved.
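The combination of GLUE with soft data can be illustrated with a minimal sketch: sample parameter sets, keep as 'behavioural' only those whose simulated outputs both fit the hard calibration data and fall inside experimentalist-supplied plausible ranges. The toy model, likelihood, thresholds, and ranges below are all assumptions standing in for INCA-N and the study's actual constraints.

```python
# Minimal GLUE-with-soft-data sketch (toy model and assumed ranges,
# not the INCA-N application described above).
import numpy as np

rng = np.random.default_rng(42)

def toy_model(rate, uptake):
    """Stand-in for the catchment model: returns leaching and N uptake."""
    leaching = 50.0 * rate - 0.5 * uptake
    return leaching, uptake

obs_leaching = 20.0                 # 'hard' calibration target
soft_uptake_range = (40.0, 80.0)    # 'soft data' plausible bounds

samples = rng.uniform([0.0, 0.0], [1.0, 100.0], size=(5000, 2))
behavioural = []
for rate, uptake in samples:
    leaching, sim_uptake = toy_model(rate, uptake)
    likelihood = np.exp(-0.5 * ((leaching - obs_leaching) / 5.0) ** 2)
    # Keep only parameter sets that fit the data AND satisfy the soft bounds
    if likelihood > 0.5 and soft_uptake_range[0] <= sim_uptake <= soft_uptake_range[1]:
        behavioural.append((likelihood, rate, uptake))

print(f"{len(behavioural)} behavioural parameter sets kept")
```

The soft constraint discards parameter sets that fit the leaching target equally well but imply implausible N uptake, which is precisely the reduction in equifinality the abstract reports.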


Testing of the Integrated Nitrogen model for Catchments (INCA) in a wide range of ecosystem types across Europe has shown that the model underestimates N transformation processes to a large extent in northern catchments of Finland and Norway in winter and spring. It is found, and generally assumed, that microbial activity in soils proceeds at low rates at northern latitudes during winter, even at sub-zero temperatures. The INCA model was modified to improve the simulation of N transformation rates in northern catchments, characterised by cold climates and extensive snow accumulation and insulation in winter, by introducing an empirical function to simulate soil temperatures below the seasonal snow pack, and a degree-day model to calculate the depth of the snow pack. The proposed snow-correction factor improved the simulation of soil temperatures at Finnish and Norwegian field sites in winter, although soil temperature was still underestimated during periods with a thin snow cover. Finally, a comparison between the modified INCA version (v. 1.7) and the former version (v. 1.6) was made at the Simojoki river basin in northern Finland and at Dalelva Brook in northern Norway. The new modules did not imply any significant changes in simulated NO3- concentration levels in the streams but improved the timing of simulated higher concentrations. The inclusion of a modified temperature response function and an empirical snow-correction factor improved the flexibility and applicability of the model for climate effect studies.
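The two empirical components the abstract describes can be sketched simply: a degree-day model that accumulates and melts a snow pack from daily air temperature, and a snow-correction that damps the air-temperature signal reaching the soil as the pack deepens. The coefficients below are illustrative assumptions, not INCA v1.7 parameter values.

```python
# Sketch of a degree-day snow pack and snow-insulated soil temperature
# (assumed coefficients, for illustration only).
import math

def degree_day_snow(air_temps, ddf=2.0, snow_per_degree=1.0):
    """Snow water equivalent (mm) after each day; melt = ddf * T for T > 0."""
    swe, series = 0.0, []
    for t in air_temps:
        if t <= 0.0:
            swe += snow_per_degree * (-t)   # accumulation on cold days
        else:
            swe = max(0.0, swe - ddf * t)   # degree-day melt
        series.append(swe)
    return series

def soil_temp_under_snow(air_temp, swe, damping=0.05):
    """Air-temperature signal attenuated exponentially with pack depth."""
    return air_temp * math.exp(-damping * swe)
```

Under a deep pack the sub-snow soil temperature stays near zero even in severe air frost, which is why adding such a correction lets the model sustain the winter N transformation rates observed at the Finnish and Norwegian sites.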


We develop the linearization of a semi-implicit semi-Lagrangian model of the one-dimensional shallow-water equations using two different methods. The usual tangent linear model, formed by linearizing the discrete nonlinear model, is compared with a model formed by first linearizing the continuous nonlinear equations and then discretizing. Both models are shown to perform equally well for finite perturbations. However, the asymptotic behaviour of the two models differs as the perturbation size is reduced. This leads to difficulties in showing that the models are correctly coded using the standard tests. To overcome this difficulty we propose a new method for testing linear models, which we demonstrate both theoretically and numerically. © Crown copyright, 2003. Royal Meteorological Society
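The standard correctness test referred to in the abstract can be sketched as follows: for a nonlinear map M and its tangent linear model L about a state x, the ratio ||M(x + a·dx) − M(x)|| / ||L(a·dx)|| should approach 1 as the perturbation size a shrinks. The toy nonlinear model below is an assumption used only to illustrate the test, not the shallow-water model of the paper.

```python
# Standard tangent-linear-model correctness test on a toy model
# (the shallow-water model itself is not reproduced here).
import numpy as np

def M(x):
    """Toy nonlinear model."""
    return np.array([x[0] ** 2 + x[1], np.sin(x[1])])

def L(x, dx):
    """Hand-derived tangent linear model of M about x."""
    return np.array([2.0 * x[0] * dx[0] + dx[1], np.cos(x[1]) * dx[1]])

x = np.array([1.2, 0.7])
dx = np.array([0.3, -0.4])
ratios = []
for a in [1e-1, 1e-2, 1e-3]:
    num = np.linalg.norm(M(x + a * dx) - M(x))
    den = np.linalg.norm(L(x, a * dx))
    ratios.append(num / den)
print(ratios)  # ratios approach 1 as the perturbation shrinks
```

The difficulty the paper identifies is that when the linear model is formed by discretizing the linearized continuous equations, rather than by linearizing the discrete model, this ratio need not converge to 1 as a → 0 even though the model is correctly coded, which motivates their alternative test.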


We compare laboratory observations of equilibrated baroclinic waves in the rotating two-layer annulus, with numerical simulations from a quasi-geostrophic model. The laboratory experiments lie well outside the quasi-geostrophic regime: the Rossby number reaches unity; the depth-to-width aspect ratio is large; and the fluid contains ageostrophic inertia–gravity waves. Despite being formally inapplicable, the quasi-geostrophic model captures the laboratory flows reasonably well. The model displays several systematic biases, which are consequences of its treatment of boundary layers and neglect of interfacial surface tension and which may be explained without invoking the dynamical effects of the moderate Rossby number, large aspect ratio or inertia–gravity waves. We conclude that quasi-geostrophic theory appears to continue to apply well outside its formal bounds.


Aeolian mineral dust aerosol is an important consideration in the Earth's radiation budget as well as a source of nutrients to oceanic and land biota. The modelling of aeolian mineral dust has been improving consistently despite the relatively sparse observations available to constrain the models. This study documents the development of a new dust emissions scheme in the Met Office Unified Model™ (MetUM) based on the Dust Entrainment and Deposition (DEAD) module. Four separate case studies are used to test and constrain the model output. Initial testing was undertaken on a large dust event over North Africa in March 2006, with the model constrained using AERONET data. The second case study tested the capability of the model to represent dust events in the Middle East without being re-tuned from the March 2006 Saharan case. While the model is unable to capture some of the daytime variation in AERONET AOD, there is good agreement between the model and observed dust events. In the final two case studies, new in situ aircraft observations from the Dust Outflow and Deposition to the Ocean (DODO) campaigns in February and August 2006 were used. These recent observations provided further data on dust size distributions and vertical profiles to constrain the model. The modelled DODO cases were also compared to AERONET data to ensure the radiative properties of the dust were comparable to observations. Copyright © 2009 Royal Meteorological Society and Crown Copyright