133 results for Inverse Methodology


Relevance:

20.00%

Publisher:

Abstract:

The requirement to forecast volcanic ash concentrations intensified in response to the 2010 Eyjafjallajökull eruption, when ash safety limits for aviation were introduced in the European area. The ability to provide accurate quantitative forecasts relies to a large extent on the source term, i.e., the emissions of ash as a function of time and height. This study presents source term estimations of the ash emissions from the Eyjafjallajökull eruption derived with an inversion algorithm which constrains modeled ash emissions with satellite observations of volcanic ash. The algorithm is tested with input from two different dispersion models, run on three different meteorological input data sets. The results are robust to the choice of dispersion model and meteorological data. Modeled ash concentrations are compared quantitatively with independent measurements from three different research aircraft and one surface measurement station. These comparisons show that the models perform reasonably well in simulating the ash concentrations, and simulations using the source term obtained from the inversion are in overall better agreement with the observations (rank correlation = 0.55, Figure of Merit in Time (FMT) = 25–46%) than simulations using simplified source terms (rank correlation = 0.21, FMT = 20–35%). The vertical structures of the modeled ash clouds mostly agree with lidar observations, and the modeled ash particle size distributions agree reasonably well with observed size distributions. There are occasionally large differences between simulations, but the model mean usually outperforms any individual model. The results emphasize the benefits of using an ensemble-based forecast for improved quantification of uncertainties in future ash crises.
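The inversion algorithm itself is not spelled out in the abstract, but its core idea can be sketched: stack the dispersion model's source-receptor sensitivities into a matrix M mapping emissions (per time/height cell) to satellite-observed column loads y, then solve a regularized, non-negative least-squares problem. The following is a minimal sketch under those assumptions; the names and the use of SciPy's non-negative least squares are illustrative, not the paper's implementation.

```python
# Hedged sketch: Tikhonov-regularized, non-negative inversion of an ash source
# term. M is a source-receptor matrix from a dispersion model (assumed given),
# y the satellite column-load observations, x_apriori a first-guess emission
# profile. All names are illustrative.
import numpy as np
from scipy.optimize import nnls

def invert_source_term(M, y, x_apriori, lam=1.0):
    """Estimate emissions x >= 0 minimizing ||M x - y||^2 + lam ||x - x_apriori||^2."""
    n = M.shape[1]
    # Stack the data misfit and the Tikhonov term into one NNLS problem.
    A = np.vstack([M, np.sqrt(lam) * np.eye(n)])
    b = np.concatenate([y, np.sqrt(lam) * x_apriori])
    x, _ = nnls(A, b)
    return x  # emission rate per (time, height) cell, flattened
```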

Relevance:

20.00%

Publisher:

Abstract:

A new approach to the study of the local organization in amorphous polymer materials is presented. The method couples neutron diffraction experiments that explore the structure on the spatial scale 1–20 Å with the reverse Monte Carlo fitting procedure to predict structures that accurately represent the experimental scattering results over the whole momentum transfer range explored. Molecular mechanics and molecular dynamics techniques are also used to produce atomistic models independently of any experimental input, thereby providing a test of the viability of the reverse Monte Carlo method in generating realistic models for amorphous polymeric systems. An analysis of the obtained models in terms of single chain properties and of orientational correlations between chain segments is presented. We show the viability of the method with data from molten polyethylene. The analysis derives a model with average C–C and C–H bond lengths of 1.55 Å and 1.1 Å, respectively, an average backbone valence angle of 112°, a torsional angle distribution characterized by a trans conformer fraction of 0.67 and, finally, a weak interchain orientational correlation at around 4 Å.
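At its heart, the reverse Monte Carlo procedure is a Metropolis-style loop that perturbs atomic positions and accepts moves that improve, or do not unduly worsen, the fit to the measured structure factor. The sketch below assumes a user-supplied forward calculation model_structure_factor(positions, q); everything here is a generic illustration, not the authors' code.

```python
# Hedged sketch of one reverse Monte Carlo step. A user-supplied forward
# calculation model_structure_factor(positions, q) is assumed; names and the
# move size are illustrative, not the authors' code.
import numpy as np

def chi2(positions, q, s_exp, sigma, model_structure_factor):
    s_mod = model_structure_factor(positions, q)
    return np.sum((s_mod - s_exp) ** 2 / sigma ** 2)

def rmc_step(positions, q, s_exp, sigma, model_structure_factor,
             max_move=0.2, rng=np.random):
    """Displace one atom at random; accept via the Metropolis criterion."""
    old = chi2(positions, q, s_exp, sigma, model_structure_factor)
    trial = positions.copy()
    i = rng.randint(len(trial))
    trial[i] += rng.uniform(-max_move, max_move, size=3)  # displacement in angstroms
    new = chi2(trial, q, s_exp, sigma, model_structure_factor)
    if new < old or rng.rand() < np.exp(-(new - old) / 2.0):
        return trial, new          # move accepted
    return positions, old          # move rejected
```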

Relevance:

20.00%

Publisher:

Abstract:

A statistical methodology is proposed and tested for the analysis of extreme values of atmospheric wave activity at mid-latitudes. The adopted methods are the classical block-maximum and peak-over-threshold approaches, respectively based on the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD). Time series of the ‘Wave Activity Index’ (WAI) and the ‘Baroclinic Activity Index’ (BAI) are computed from simulations of the General Circulation Model ECHAM4.6, which is run under perpetual January conditions. Both the GEV and the GPD analyses indicate that the extremes of WAI and BAI are Weibull distributed, which corresponds to distributions with an upper bound. However, a remarkably large variability is found in the tails of such distributions; distinct simulations carried out under the same experimental setup provide appreciably different estimates of the 200-yr WAI return level. The consequences of this phenomenon for applications of the methodology to climate change studies are discussed. The atmospheric configurations characteristic of the maxima and minima of WAI and BAI are also examined.
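For concreteness, the block-maximum half of the methodology can be illustrated in a few lines with SciPy: fit a GEV to block maxima and read off the 200-yr return level. The synthetic Gumbel sample below merely stands in for a WAI series; note that SciPy's shape parameter c equals −ξ, so a bounded (Weibull-type) upper tail appears as c > 0.

```python
# Hedged sketch of the block-maximum analysis: fit a GEV to block maxima and
# compute the 200-yr return level. The synthetic Gumbel sample stands in for a
# WAI series; SciPy's shape parameter c equals -xi, so a bounded (Weibull-type)
# upper tail appears as c > 0.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
block_maxima = rng.gumbel(loc=10.0, scale=2.0, size=100)  # placeholder WAI maxima

c, loc, scale = genextreme.fit(block_maxima)
rl_200 = genextreme.ppf(1.0 - 1.0 / 200.0, c, loc=loc, scale=scale)
print(f"shape c = {c:.3f} (bounded tail if c > 0), 200-yr return level = {rl_200:.2f}")
```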

Relevance:

20.00%

Publisher:

Abstract:

This paper examines the changes in the length of commercial property leases over the last decade and presents an analysis of the consequent investment and occupational pricing implications for commercial property investments. It is argued that the pricing implications of a short lease to an investor are contingent upon the expected costs of the letting termination to the investor, the probability that the letting will be terminated and the volatility of rental values. The paper examines the key factors influencing these variables and presents a framework for incorporating their effects into pricing models. Approaches to their valuation derived from option pricing are critically assessed. It is argued that such models also tend to neglect the price effects of specific risk factors such as tenant circumstances and the terms of break clauses. Specific risk factors have a significant bearing on the probability of letting termination and on the level of the resultant financial losses. The merits of a simulation methodology are examined for rental and capital valuations of short leases and properties with break clauses. It is concluded that, in addition to the rigour of its internal logic, the success of any methodology is predicated upon the accuracy of the inputs. The lack of reliable data on patterns in, and incidence of, lease termination and the lack of reliable time series of historic property performance limit the efficacy of financial models.
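A toy version of the simulation methodology can indicate how the pieces fit together: simulate rental-value paths, let the break clause be exercised with some probability, charge a void period when the letting terminates, and discount the resulting income. All parameter values below (break probability, void length, volatility, discount rate) are hypothetical placeholders rather than calibrated market inputs.

```python
# Hedged sketch of a simulation approach to pricing a lease with a break
# clause. Break probability, void length, rent volatility and discount rate
# are hypothetical placeholders, not calibrated inputs.
import numpy as np

def simulate_lease_value(rent=100.0, years=10, break_year=5, p_break=0.3,
                         rent_vol=0.10, disc=0.07, void_years=1,
                         n_sims=10_000, seed=1):
    rng = np.random.default_rng(seed)
    values = np.empty(n_sims)
    for s in range(n_sims):
        passing_rent = rent   # rent currently payable under the lease
        market_rent = rent    # estimated rental value, evolves stochastically
        pv, t = 0.0, 1
        while t <= years:
            market_rent *= np.exp(rng.normal(-0.5 * rent_vol**2, rent_vol))
            if t == break_year and rng.random() < p_break:
                t += void_years              # break exercised: void, no income
                passing_rent = market_rent   # re-let at prevailing market rent
                continue
            pv += passing_rent / (1.0 + disc) ** t
            t += 1
        values[s] = pv
    return values.mean()

print(simulate_lease_value())             # value with the break clause
print(simulate_lease_value(p_break=0.0))  # value of the equivalent firm term
```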

Relevance:

20.00%

Publisher:

Abstract:

Optimal state estimation from given observations of a dynamical system by data assimilation is generally an ill-posed inverse problem. In order to solve the problem, a standard Tikhonov, or L2, regularization is used, based on certain statistical assumptions on the errors in the data. The regularization term constrains the estimate of the state to remain close to a prior estimate. In the presence of model error, this approach does not capture the initial state of the system accurately, as the initial state estimate is derived by minimizing the average error between the model predictions and the observations over a time window. Here we examine an alternative L1 regularization technique that has proved valuable in image processing. We show that for examples of flow with sharp fronts and shocks, the L1 regularization technique performs more accurately than standard L2 regularization.
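For a linear observation operator the contrast between the two penalties is easy to state: the L2 (Tikhonov) estimate has a closed form, while the L1 estimate must be computed iteratively. The sketch below penalizes the state directly for brevity (the data assimilation setting would penalize deviation from a prior) and uses plain ISTA for the L1 problem; H, y and the weights are illustrative.

```python
# Hedged sketch contrasting L2 (Tikhonov) and L1 regularization for a linear
# inverse problem y = H x + noise. The L1 problem is solved with plain ISTA
# (iterative soft-thresholding); H, y and lam are illustrative.
import numpy as np

def l2_estimate(H, y, lam):
    """Closed-form Tikhonov solution of min ||H x - y||^2 + lam ||x||^2."""
    n = H.shape[1]
    return np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ y)

def l1_estimate(H, y, lam, n_iter=500):
    """ISTA for min 0.5 ||H x - y||^2 + lam ||x||_1."""
    x = np.zeros(H.shape[1])
    step = 1.0 / np.linalg.norm(H, 2) ** 2   # 1/L, L = Lipschitz constant of the gradient
    for _ in range(n_iter):
        z = x - step * (H.T @ (H @ x - y))   # gradient step on the data misfit
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return x
```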

Relevance:

20.00%

Publisher:

Abstract:

Blood clotting response (BCR) resistance tests are available for a number of anticoagulant rodenticides. However, during the development of these tests many of the test parameters have been changed, making meaningful comparisons between results difficult. It was recognised that a standard methodology was urgently required for future BCR resistance tests and, accordingly, this document presents a reappraisal of published tests, and proposes a standard protocol for future use (see Appendix). The protocol can be used to provide information on the incidence and degree of resistance in a particular rodent population; to provide a simple comparison of resistance factors between active ingredients, thus giving clear information about cross-resistance for any given strain; and to provide comparisons of susceptibility or resistance between different populations. The methodology has a sound statistical basis in being based on the ED50 response, and requires many fewer animals than the resistance tests in current use. Most importantly, tests can be used to give a clear indication of the likely practical impact of the resistance on field efficacy. The present study was commissioned and funded by the Rodenticide Resistance Action Committee (RRAC) of CropLife International.
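The statistical core of the protocol, the ED50 response, can be illustrated with a standard logistic dose-response fit, from which a resistance factor follows as the ratio of ED50s between strains. The doses and response counts below are invented placeholders, and this is a generic fit rather than the protocol's prescribed analysis.

```python
# Hedged sketch of an ED50-based resistance comparison. Doses, response counts
# and strain labels are invented placeholders; the fit is a generic logistic
# dose-response, not the protocol's prescribed analysis.
import numpy as np
from scipy.optimize import curve_fit

def logistic(log_dose, log_ed50, slope):
    """Proportion of animals responding as a function of log dose."""
    return 1.0 / (1.0 + np.exp(-slope * (log_dose - log_ed50)))

def fit_ed50(doses, responded, tested):
    p0 = [np.log(np.median(doses)), 1.0]
    popt, _ = curve_fit(logistic, np.log(doses), responded / tested, p0=p0)
    return np.exp(popt[0])

doses = np.array([0.5, 1.0, 2.0, 4.0, 8.0])                            # mg/kg, placeholder
ed50_s = fit_ed50(doses, np.array([1, 3, 6, 9, 10]), np.full(5, 10))   # susceptible strain
ed50_r = fit_ed50(doses, np.array([0, 1, 2, 5, 8]), np.full(5, 10))    # resistant strain
print(f"resistance factor = {ed50_r / ed50_s:.1f}")
```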

Relevance:

20.00%

Publisher:

Abstract:

Following a malicious or accidental atmospheric release in an outdoor environment it is essential for first responders to ensure safety by identifying areas where human life may be in danger. For this to happen quickly, reliable information is needed on the source strength and location, and the type of chemical agent released. We present here an inverse modelling technique that estimates the source strength and location of such a release, together with the uncertainty in those estimates, using a limited number of concentration measurements from a network of chemical sensors, assuming a single, steady, ground-level source. The technique is evaluated using data from a set of dispersion experiments conducted in a meteorological wind tunnel, where simultaneous measurements of concentration time series were obtained in the plume from a ground-level point-source emission of a passive tracer. In particular, we analyze the response to the number of sensors deployed and their arrangement, and to sampling and model errors. We find that the inverse algorithm can generate acceptable estimates of the source characteristics with as few as four sensors, provided these are well placed and the sampling error is controlled. Configurations with at least three sensors in a profile across the plume were found to be superior to other arrangements examined. Analysis of the influence of sampling error due to the use of short averaging times showed that the uncertainty in the source estimates grew as the sampling time decreased, and that averaging times greater than about 5 min (full-scale time) lead to acceptable accuracy.
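The flavour of such an inversion can be sketched with a simple Gaussian-plume forward model for a steady ground-level source: grid-search candidate source locations and, at each, compute the closed-form best-fit strength against the sensor readings. The plume-spread parameterisation below is a crude placeholder, not the model used in the study.

```python
# Hedged sketch of source estimation from sensor data: a crude Gaussian-plume
# forward model for a steady ground-level source, a grid search over candidate
# locations, and a closed-form best-fit strength at each. The plume-spread
# parameterisation is a placeholder, not the study's model.
import numpy as np

def plume(x, y, xs, ys, u=2.0, spread=0.1):
    """Ground-level concentration per unit emission at sensor points (x, y)."""
    dx, dy = x - xs, y - ys
    conc = np.zeros_like(dx, dtype=float)
    down = dx > 0                       # sensors downwind of the source
    sig = spread * dx[down]             # crude linear growth of plume width
    conc[down] = np.exp(-0.5 * (dy[down] / sig) ** 2) / (np.pi * u * sig ** 2)
    return conc

def estimate_source(xs_grid, ys_grid, x_sens, y_sens, c_obs):
    best_cost, best = np.inf, None
    for xs in xs_grid:
        for ys in ys_grid:
            g = plume(x_sens, y_sens, xs, ys)   # unit-strength prediction
            if g @ g == 0.0:
                continue
            q = max(c_obs @ g / (g @ g), 0.0)   # least-squares strength, q >= 0
            cost = np.sum((q * g - c_obs) ** 2)
            if cost < best_cost:
                best_cost, best = cost, (xs, ys, q)
    return best  # (x_source, y_source, strength)
```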

Relevance:

20.00%

Publisher:

Abstract:

We present a new method to determine mesospheric electron densities from partially reflected medium frequency radar pulses. The technique uses an optimal estimation inverse method and retrieves both an electron density profile and a gradient electron density profile. As well as accounting for the absorption of the two magnetoionic modes formed by ionospheric birefringence of each radar pulse, the forward model of the retrieval parameterises possible Fresnel scatter of each mode by fine electronic structure, phase changes of each mode due to Faraday rotation and the dependence of the amplitudes of the backscattered modes upon pulse width. Validation results indicate that known profiles can be retrieved and that χ2 tests upon retrieval parameters satisfy validity criteria. Application to measurements shows that retrieved electron density profiles are consistent with accepted ideas about seasonal variability of electron densities and their dependence upon nitric oxide production and transport.
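Optimal estimation retrievals of this kind typically iterate the standard Gauss-Newton update of Rodgers' formulation; a compact, generic sketch follows. The forward model F, its Jacobian K, the prior x_a with covariance S_a, and the measurement covariance S_e are assumed inputs, and nothing here is specific to the partial-reflection radar forward model described above.

```python
# Hedged sketch of one Gauss-Newton step of an optimal estimation retrieval in
# the standard Rodgers formulation. F (forward model), K (Jacobian), the prior
# x_a with covariance S_a and the measurement covariance S_e are assumed
# inputs; nothing is specific to the radar forward model of the paper.
import numpy as np

def oem_step(x, y, F, K, x_a, S_a, S_e):
    """One iteration toward the maximum a posteriori state estimate."""
    Kx = K(x)                                   # Jacobian dF/dx at current state
    Sa_inv = np.linalg.inv(S_a)
    Se_inv = np.linalg.inv(S_e)
    lhs = Kx.T @ Se_inv @ Kx + Sa_inv
    rhs = Kx.T @ Se_inv @ (y - F(x) + Kx @ (x - x_a))
    x_new = x_a + np.linalg.solve(lhs, rhs)
    resid = y - F(x_new)
    chi2 = resid @ Se_inv @ resid               # chi-squared validity diagnostic
    return x_new, chi2
```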

Relevance:

20.00%

Publisher:

Abstract:

Maincrop potato yields in Scotland have increased by 30–35 t ha-1 since 1960 as a result of many changes, but has changing climate contributed anything to this? The purpose of this work was to answer this question. Daily weather data for the period 1960–2006 were analysed for five locations covering the zones of potato growing on the east coast of Scotland (between 55.213 and 57.646°N) to determine trends in temperature, rainfall and solar radiation. A physiologically based potato yield model was validated using data obtained from a long-term field trial in eastern Scotland and then employed to simulate crop development and potential yield at each of the five sites. Over the 47 years, there were significant increases in annual air and 30 cm soil temperatures (0.27 and 0.30 K decade-1, respectively), but no significant changes in annual precipitation or in the timing of the last frost in spring and the first frost of autumn. There was no evidence of any north to south gradient of warming. Simulated emergence and canopy closure became earlier at all five sites over the period, with the advance being greater in the north (3.7 and 3.6 days decade-1, respectively) than the south (0.5 and 0.8 days decade-1, respectively). Potential yield increased with time, generally reflecting the increased duration of the green canopy, at average rates of 2.8 t ha-1 decade-1 for chitted seed (sprouted prior to planting) and 2.5 t ha-1 decade-1 for unchitted seed. The measured warming could contribute potential yield increases of up to 13.2 t ha-1 for chitted potato (range 7.1–19.3 t ha-1) and 11.5 t ha-1 for unchitted potato (range 7.1–15.5 t ha-1), equivalent to 34–39% of the increased potential yield over the period, or 23–26% of the increase in actual measured yields.

Relevance:

20.00%

Publisher:

Abstract:

There is evidence that consumption of fish, especially oily fish, has substantial beneficial effects on health. In particular, an inverse relationship of oily fish intake to coronary heart disease incidence has been established. These beneficial effects are ascribed to fish oil components including long chain ω-3 polyunsaturated fatty acids. On the other hand, it should be noted that oily fish also contains hazardous substances such as dioxins, PCBs and methylmercury. Soy consumption has been associated with potential beneficial and adverse effects. The claimed benefits include reduced risk of cardiovascular disease, osteoporosis, and breast and prostate cancer, whereas potential adverse effects include impaired thyroid function, disruption of sex hormone levels, changes in reproductive function and increased breast cancer risk. These two cases of natural foods highlight the need to consider both risks and benefits in order to establish the net health impact associated with the consumption of specific food products. Within the Sixth Framework Programme of the European Commission, the BRAFO project was funded to develop a framework that allows for the quantitative comparison of human health risks and benefits in relation to foods and food compounds. This paper describes the application of the developed framework to two natural foods, farmed salmon and soy protein. We conclude that the BRAFO methodology is highly applicable to natural foods. It will help benefit-risk managers select appropriate dietary recommendations for the population.

Relevance:

20.00%

Publisher:

Abstract:

We study inverse problems in neural field theory, i.e., the construction of synaptic weight kernels yielding a prescribed neural field dynamics. We address the issues of existence, uniqueness, and stability of solutions to the inverse problem for the Amari neural field equation as a special case, and prove that these problems are generally ill-posed. In order to construct solutions to the inverse problem, we first recast the Amari equation into a linear perceptron equation in an infinite-dimensional Banach or Hilbert space. In a second step, we construct sets of biorthogonal function systems allowing the approximation of synaptic weight kernels by a generalized Hebbian learning rule. Numerically, this construction is implemented by the Moore–Penrose pseudoinverse method. We demonstrate the instability of these solutions and use the Tikhonov regularization method for stabilization and to prevent numerical overfitting. We illustrate the stable construction of kernels by means of three instructive examples.
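On a spatial grid, the construction reduces to solving a linear system W·f(U) = T for the discretized kernel W, stabilized, exactly as the text describes, by a Tikhonov-regularized pseudoinverse. The sketch below is schematic: the sigmoid rate function, the training matrices and the regularization weight are illustrative choices.

```python
# Hedged, schematic kernel construction for a discretized Amari-type field:
# solve W @ f(U) ~= T for the weight matrix W with a Tikhonov-regularized
# pseudoinverse. The sigmoid rate function, training matrix U (columns =
# prescribed states) and targets T are illustrative choices.
import numpy as np

def firing_rate(u, beta=5.0):
    return 1.0 / (1.0 + np.exp(-beta * u))

def construct_kernel(U, T, alpha=1e-3):
    Phi = firing_rate(U)                     # rates of the training states
    # SVD-based regularized pseudoinverse: damping small singular values
    # prevents the noise amplification (instability) noted in the text.
    u_svd, s, vt = np.linalg.svd(Phi, full_matrices=False)
    Phi_pinv = vt.T @ np.diag(s / (s**2 + alpha)) @ u_svd.T
    return T @ Phi_pinv                      # discretized synaptic kernel W
```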

Relevance:

20.00%

Publisher:

Abstract:

Salmonella enterica serotypes Derby, Mbandaka, Montevideo, Livingstone, and Senftenberg were among the 10 most prevalent serotypes isolated from farm animals in England and Wales in 1999. These serotypes are of potential zoonotic relevance; however, there is currently no "gold standard" fingerprinting method for them. A collection of isolates representing the former serotypes and serotype Gold Coast were analyzed using plasmid profiling, pulsed-field gel electrophoresis (PFGE), and ribotyping. The success of the molecular methods in identifying DNA polymorphisms was different for each serotype. Plasmid profiling was particularly useful for serotype Derby isolates, and it also provided a good level of discrimination for serotype Senftenberg. For most serotypes, we observed a number of nontypeable plasmid-free strains, which represents a limitation of this technique. Fingerprinting of genomic DNA by ribotyping and PFGE produced a significant variation in results, depending on the serotype of the strain. Both PstI/SphI ribotyping and XbaI-PFGE provided a similar degree of strain differentiation for serotype Derby and serotype Senftenberg, only marginally lower than that achieved by plasmid profiling. Ribotyping was less sensitive than PFGE when applied to serotype Mbandaka or serotype Montevideo. Serotype Gold Coast isolates were found to be nontypeable by XbaI-PFGE, and a significant proportion of them were found to be plasmid free. A similar situation applies to a number of serotype Livingstone isolates which were nontypeable by plasmid profiling and/or PFGE. In summary, the serotype of the isolates has a considerable influence in deciding the best typing strategy; a single method cannot be relied upon for discriminating between strains, and a combination of typing methods allows further discrimination.

Relevance:

20.00%

Publisher:

Abstract:

Inducing rules from very large datasets is one of the most challenging areas in data mining. Several approaches exist to scaling up classification rule induction to large datasets, namely data reduction and the parallelisation of classification rule induction algorithms. In the area of parallelisation of classification rule induction algorithms, most of the work has concentrated on the Top Down Induction of Decision Trees (TDIDT), also known as the ‘divide and conquer’ approach. However, powerful alternative algorithms exist that induce modular rules. Most of these alternative algorithms follow the ‘separate and conquer’ approach of inducing rules, but very little work has been done to make the ‘separate and conquer’ approach scale better on large training data. This paper examines the potential of the recently developed blackboard-based J-PMCRI methodology for parallelising modular classification rule induction algorithms that follow the ‘separate and conquer’ approach. A concrete implementation of the methodology is evaluated empirically on very large datasets.
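To make the contrast with ‘divide and conquer’ concrete, a toy separate-and-conquer inducer is sketched below: it greedily learns one high-precision rule, removes (‘separates’) the covered positive examples, and repeats on the remainder. This illustrates the strategy that J-PMCRI parallelises; it is not the J-PMCRI algorithm itself, and single-condition rules over binary features are a deliberate simplification.

```python
# Hedged toy illustration of 'separate and conquer' rule induction: learn one
# rule, remove the positives it covers, repeat. Binary features and
# single-condition rules are deliberate simplifications.
import numpy as np

def induce_rules(X, y, target=1, min_precision=0.9):
    rules, X, y = [], X.copy(), y.copy()
    while (y == target).any():
        best = None
        for feat in range(X.shape[1]):
            for val in (0, 1):
                covered = X[:, feat] == val
                if not covered.any():
                    continue
                precision = np.mean(y[covered] == target)
                if best is None or precision > best[0]:
                    best = (precision, feat, val)
        if best is None or best[0] < min_precision:
            break   # no sufficiently precise rule remains
        _, feat, val = best
        rules.append((feat, val, target))
        # 'Separate': drop the positive examples the new rule covers.
        keep = ~((X[:, feat] == val) & (y == target))
        X, y = X[keep], y[keep]
    return rules
```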

Relevance:

20.00%

Publisher:

Abstract:

This contribution introduces a new digital predistorter to compensate for the serious distortions caused by high power amplifiers (HPAs) with memory, which exhibit output saturation characteristics. The proposed design is based on direct learning using a data-driven B-spline Wiener system modeling approach. The nonlinear HPA with memory is first identified based on the B-spline neural network model using the Gauss-Newton algorithm, which incorporates the efficient De Boor algorithm with both B-spline curve and first-derivative recursions. The estimated Wiener HPA model is then used to design the Hammerstein predistorter. In particular, the inverse of the amplitude distortion of the HPA's static nonlinearity can be calculated effectively using the Newton-Raphson formula based on the inverse of the De Boor algorithm. A major advantage of this approach is that both the Wiener HPA identification and the Hammerstein predistorter inverse can be achieved very efficiently and accurately. Simulation results are presented to demonstrate the effectiveness of this novel digital predistorter design.
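The Newton-Raphson inversion of the amplitude distortion is easy to illustrate in isolation. The sketch below substitutes a Rapp-style saturation curve for the identified B-spline static nonlinearity, and uses a numerical derivative where the paper evaluates the exact derivative via the De Boor recursion; all of it is an illustrative stand-in.

```python
# Hedged sketch of the Newton-Raphson amplitude inversion. A Rapp-style
# saturation curve g stands in for the identified B-spline static
# nonlinearity, and a numerical derivative replaces the exact De Boor
# first-derivative recursion used in the paper.
import numpy as np

def g(r, g0=1.0, a_sat=1.0, p=2.0):
    """Placeholder AM/AM characteristic with output saturation at a_sat."""
    return g0 * r / (1.0 + (g0 * r / a_sat) ** (2 * p)) ** (1.0 / (2 * p))

def g_prime(r, eps=1e-6):
    return (g(r + eps) - g(r - eps)) / (2 * eps)

def invert_amplitude(target, r0=0.5, n_iter=20):
    """Find r with g(r) = target (target below saturation) by Newton-Raphson."""
    r = r0
    for _ in range(n_iter):
        r -= (g(r) - target) / g_prime(r)
    return r

# Predistortion check: driving the model with the inverted amplitude
# reproduces the desired linear output amplitude.
d = 0.6
print(g(invert_amplitude(d)), "~", d)
```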