78 results for Model s analysis
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
A reference model of Fallible Endgame Play has been implemented and exercised with the chess engine WILHELM. Past experiments have demonstrated the value of the model and the robustness of decisions based on it: the experimental results agree well with the predictions of a Markov model. Here, the reference model is exercised on the well-known endgame KBBKN.
Abstract:
A reference model of Fallible Endgame Play has been implemented and exercised with the chess engine WILHELM. Various experiments have demonstrated the value of the model and the robustness of decisions based on it. Experimental results have also been compared with the theoretical predictions of a Markov model of the endgame and found to be in close agreement.
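The kind of Markov-model prediction referred to above can be sketched with a toy absorbing chain. The state classes and transition probabilities below are invented for illustration and are not taken from the WILHELM experiments: endgame positions are grouped into transient classes, "win" and "draw" are absorbing, and the fundamental matrix gives the conversion probability and expected game length for a fallible player.

```python
import numpy as np

# Toy absorbing Markov chain of endgame play (illustrative numbers only).
# Transient states 0..2 are position classes of decreasing difficulty;
# absorbing states are "win" and "draw" (an error lets the defender escape).
Q = np.array([[0.0, 0.85, 0.0],      # transitions among transient states
              [0.0, 0.0,  0.85],
              [0.0, 0.0,  0.0]])
R = np.array([[0.05, 0.10],          # transient -> (win, draw)
              [0.05, 0.10],
              [0.90, 0.10]])
N = np.linalg.inv(np.eye(3) - Q)     # fundamental matrix: expected visits
B = N @ R                            # absorption probabilities
p_win = B[0, 0]                      # P(win | start in the hardest class)
expected_moves = N.sum(axis=1)[0]    # expected transitions to absorption
print(round(p_win, 5), round(expected_moves, 4))
```

Comparing such theoretical absorption statistics with empirical win rates is the sense in which engine experiments can "agree with a Markov model".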
Abstract:
The dependence of much of Africa on rain-fed agriculture leads to a high vulnerability to fluctuations in rainfall amount. Hence, accurate monitoring of near-real-time rainfall is particularly useful, for example in forewarning of possible crop shortfalls in drought-prone areas. Unfortunately, ground-based observations are often inadequate. Rainfall estimates from satellite-based algorithms and numerical model outputs can fill this data gap; however, rigorous assessment of such estimates is required. In this case, three satellite-based products (NOAA-RFE 2.0, GPCP-1DD and TAMSAT) and two numerical model outputs (ERA-40 and ERA-Interim) have been evaluated for Uganda in East Africa using a network of 27 rain gauges. The study focuses on the years 2001 to 2005 and considers the main rainy season (February to June). All data sets were converted to the same temporal and spatial scales. Kriging was used for the spatial interpolation of the gauge data. All three satellite products showed similar characteristics and had a high level of skill that exceeded that of both model outputs. ERA-Interim had a tendency to overestimate, whilst ERA-40 consistently underestimated, the Ugandan rainfall.
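Kriging of gauge data, as used above, can be sketched in a few lines. This is a generic ordinary-kriging implementation with an assumed exponential variogram and made-up gauge values; the study's fitted variogram and station network are not given in the abstract.

```python
import numpy as np

def variogram(h, sill=1.0, range_km=50.0):
    """Assumed exponential variogram (illustrative parameters)."""
    return sill * (1.0 - np.exp(-h / range_km))

def ordinary_kriging(pts, vals, x0):
    """Ordinary kriging of point values `vals` at locations `pts` to target x0."""
    n = len(pts)
    h = np.linalg.norm(pts[:, None] - pts[None], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(h)
    A[n, n] = 0.0                      # Lagrange-multiplier row/column
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(pts - x0, axis=1))
    w = np.linalg.solve(A, b)[:n]      # kriging weights (sum to 1)
    return w @ vals

# Three mock gauges (km coordinates) with seasonal rainfall totals (mm):
gauges = np.array([[0.0, 0.0], [40.0, 0.0], [0.0, 40.0]])
rain = np.array([120.0, 80.0, 100.0])
print(ordinary_kriging(gauges, rain, np.array([20.0, 20.0])))
```

A useful property to check is exactness: predicting at a gauge location returns the observed value, since the variogram is zero at zero lag.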
Abstract:
The Newton-Raphson method is proposed for the solution of the nonlinear equation arising from a theoretical model of an acid/base titration. It is shown that it is necessary to modify the form of the equation in order that the iteration is guaranteed to converge. A particular example is considered to illustrate the analysis and method, and a BASIC program is included that can be used to predict the pH of any weak acid/weak base titration.
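A minimal sketch of the approach, not the paper's BASIC program: the root of the standard charge-balance equation for a weak acid/weak base mixture is found by Newton-Raphson. Iterating in pH (x = -log10[H+]) rather than in [H+] is one reformulation of the kind the abstract says is needed for reliable convergence; the constants below are textbook values chosen for illustration.

```python
import math

KW = 1.0e-14  # water autoionisation constant at 25 C

def charge_balance(h, ca, ka, cb, kb):
    """f(h) = [H+] + [BH+] - [OH-] - [A-]; the root is the equilibrium [H+]."""
    ka_bh = KW / kb                      # acid constant of the conjugate acid BH+
    return h + cb * h / (h + ka_bh) - KW / h - ca * ka / (h + ka)

def titration_ph(ca, ka, cb, kb, x0=7.0, tol=1e-10, max_iter=100):
    """Newton-Raphson in x = -log10[H+], with a numerical derivative."""
    x = x0
    for _ in range(max_iter):
        h = 10.0 ** (-x)
        f = charge_balance(h, ca, ka, cb, kb)
        dx = 1e-6
        df = (charge_balance(10.0 ** (-(x + dx)), ca, ka, cb, kb) - f) / dx
        step = f / df
        x -= step
        if abs(step) < tol:
            break
    return x

# Equal concentrations of a weak acid and a weak base with Ka = Kb
# should give a neutral solution (pH 7):
print(round(titration_ph(0.05, 1.8e-5, 0.05, 1.8e-5), 2))
```

With a stronger acid (larger Ka) the predicted pH drops below 7, as expected.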
Abstract:
We have developed a model of the local field potential (LFP) based on the conservation of charge, the independence principle of ionic flows and the classical Hodgkin–Huxley (HH) type intracellular model of synaptic activity. Simulations of the HH intracellular model provided insights into the nonlinear relationship between the balance of synaptic conductances and that of the post-synaptic currents: the latter depends not only on the former, but also on the temporal lag between the excitatory and inhibitory conductances and on the strength of the afferent signal. The proposed LFP model provides a method for decomposing LFP recordings near the soma of layer IV pyramidal neurons in the barrel cortex of anaesthetised rats into two highly correlated components with opposite polarity. The temporal dynamics and the proportional balance of the two components are comparable to the excitatory and inhibitory post-synaptic currents computed from the HH model. This suggests that the two components of the LFP reflect the underlying excitatory and inhibitory post-synaptic currents of the local neural population. We further used the model to decompose a sequence of evoked LFP responses under repetitive electrical stimulation (5 Hz) of the whisker pad. We found that as the neural responses adapted, the excitatory and inhibitory components adapted proportionately, while the temporal lag between the onsets of the two components increased during frequency adaptation. Our results demonstrate that the balance between neural excitation and inhibition can be investigated using extracellular recordings. Extension of the model to incorporate multiple compartments should allow more quantitative interpretation of surface electroencephalography (EEG) recordings as components reflecting the excitatory, inhibitory and passive ionic current flows generated by local neural populations.
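The point that the current balance depends on more than the conductance balance can be illustrated with a generic conductance-based model. The alpha-function time courses, the 3 ms lag and all parameter values below are illustrative assumptions, not the paper's fitted model:

```python
import numpy as np

def alpha_conductance(t, onset, tau, g_max):
    """Alpha-function synaptic conductance time course (nS)."""
    s = np.clip(t - onset, 0.0, None)
    return g_max * (s / tau) * np.exp(1.0 - s / tau)

t = np.arange(0.0, 100.0, 0.1)              # ms
g_e = alpha_conductance(t, 10.0, 2.0, 5.0)  # excitatory conductance
g_i = alpha_conductance(t, 13.0, 5.0, 8.0)  # inhibitory, lagged by 3 ms
V, E_e, E_i = -55.0, 0.0, -75.0             # holding and reversal potentials, mV
I_e = g_e * (V - E_e) / 1000.0              # inward (negative) EPSC, nA
I_i = g_i * (V - E_i) / 1000.0              # outward (positive) IPSC, nA
# The current ratio differs from the conductance ratio because the driving
# forces (V - E) differ for excitation and inhibition:
print(abs(I_e).max() / abs(I_i).max(), g_e.max() / g_i.max())
```

Here inhibition has the larger peak conductance yet the smaller peak current, a simple instance of the nonlinear relationship described above.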
Abstract:
Aircraft systems are highly nonlinear and time varying. High-performance aircraft at high angles of incidence experience undesired coupling of the lateral and longitudinal variables, resulting in departure from normal controlled flight. The construction of a robust closed-loop control that extends the stable and decoupled flight envelope as far as possible is pursued. For the study of these systems, nonlinear analysis methods are needed. Previously, bifurcation techniques have been used mainly to analyze open-loop nonlinear aircraft models and to investigate control effects on dynamic behavior. Linear feedback control designs constructed by eigenstructure assignment methods at a fixed flight condition are investigated for a simple nonlinear aircraft model. Bifurcation analysis, in conjunction with linear control design methods, is shown to aid control law design for the nonlinear system.
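A linear feedback design of the kind investigated can be sketched for a toy two-state system using Ackermann's pole-placement formula, the single-input special case of eigenstructure assignment. The matrices below are invented for illustration and are not the aircraft model:

```python
import numpy as np

def ackermann_gain(A, B, poles):
    """Feedback gain K such that eig(A - B K) equals `poles` (single-input)."""
    n = A.shape[0]
    ctrb = np.hstack([np.linalg.matrix_power(A, i) @ B for i in range(n)])
    coeffs = np.poly(poles)              # desired characteristic polynomial
    phi = sum(c * np.linalg.matrix_power(A, n - i)
              for i, c in enumerate(coeffs))
    e_last = np.zeros((1, n))
    e_last[0, -1] = 1.0                  # last row of the inverse controllability test
    return e_last @ np.linalg.inv(ctrb) @ phi

A = np.array([[0.0, 1.0], [2.0, -1.0]])  # open-loop unstable (eigenvalues 1, -2)
B = np.array([[0.0], [1.0]])
K = ackermann_gain(A, B, [-1.0, -2.0])
print(np.sort(np.linalg.eigvals(A - B @ K)))
```

At each fixed flight condition such a gain stabilises the local linearisation; bifurcation analysis then shows how far from that condition the closed loop remains well behaved.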
Abstract:
Measurements of anthropogenic tracers such as chlorofluorocarbons and tritium must be quantitatively combined with ocean general circulation models as a component of systematic model development. The authors have developed and tested an inverse method, using a Green's function, to constrain general circulation models with transient tracer data. Using this method, chlorofluorocarbon-11 and -12 (CFC-11 and CFC-12) observations are combined with a North Atlantic configuration of the Miami Isopycnic Coordinate Ocean Model with 4/3 degrees resolution. Systematic differences can be seen between the observed CFC concentrations and the prior CFC fields simulated by the model. These differences are reduced by the inversion, which determines the optimal gas transfer across the air-sea interface, accounting for uncertainties in the tracer observations. After including the effects of unresolved variability in the CFC fields, the model is found to be inconsistent with the observations because the model/data misfit slightly exceeds the error estimates. By excluding observations in waters ventilated north of the Greenland-Scotland ridge (σ0 < 27.82 kg m^-3; shallower than about 2000 m), the fit is improved, indicating that the Nordic overflows are poorly represented in the model. Some systematic differences in the model/data residuals remain and are related, in part, to excessively deep model ventilation near Rockall and deficient ventilation in the main thermocline of the eastern subtropical gyre. Nevertheless, there do not appear to be gross errors in the basin-scale model circulation. Analysis of the CFC inventory using the constrained model suggests that the North Atlantic Ocean shallower than about 2000 m was near 20% saturated in the mid-1990s. Overall, this basin is a sink for 22% of the total atmosphere-to-ocean CFC-11 flux, twice the global average value. The average water mass formation rates over the CFC transient are 7.0 and 6.0 Sv (1 Sv = 10^6 m^3 s^-1) for subtropical mode water and subpolar mode water, respectively.
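The Green's-function inversion can be sketched generically: model responses to a set of surface-flux parameters form the columns of a matrix G, and a weighted least-squares solve yields the optimal parameters given the observation uncertainties. The numbers below are synthetic, standing in for the CFC observations and model responses:

```python
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_par = 40, 3
G = rng.normal(size=(n_obs, n_par))   # Green's functions: sensitivity of each
                                      # tracer observation to each flux parameter
m_true = np.array([1.0, 0.5, -0.3])   # "true" air-sea gas-transfer scalings
sigma = 0.1                           # observational uncertainty
d = G @ m_true + rng.normal(scale=sigma, size=n_obs)

# Weighted least squares: minimise (d - G m)^T W (d - G m), W = diag(1/sigma^2)
W = np.eye(n_obs) / sigma**2
m_hat = np.linalg.solve(G.T @ W @ G, G.T @ W @ d)
misfit = (d - G @ m_hat) / sigma      # normalised residuals; values much larger
                                      # than 1 would signal model/data inconsistency
print(m_hat)
```

Checking whether the normalised misfit exceeds 1 on average is the same consistency test the abstract applies after accounting for unresolved variability.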
Abstract:
The Stochastic Diffusion Search (SDS) was developed as a solution to the best-fit search problem; as a special case, it is therefore capable of solving the transform-invariant pattern recognition problem. SDS is efficient and, although inherently probabilistic, produces very reliable solutions over widely ranging search conditions. However, to date a systematic formal investigation of its properties has not been carried out. This thesis addresses this problem. The thesis reports results pertaining to the global convergence of SDS as well as characterising its time complexity. The main emphasis of the work, however, is on the resource allocation aspects of Stochastic Diffusion Search operation. The thesis introduces a novel model of the algorithm, generalising the Ehrenfest urn model from statistical physics. This approach makes it possible to obtain a thorough characterisation of the response of the algorithm in terms of the parameters describing the search conditions in the case of a unique best-fit pattern in the search space. The model is further generalised to account for different search conditions: two solutions in the search space, and search for a unique solution in a noisy search space. An approximate solution for the case of two alternative solutions is also proposed and compared with the predictions of the extended Ehrenfest urn model. The analysis performed enabled a quantitative characterisation of the Stochastic Diffusion Search in terms of exploration and exploitation of the search space; SDS turned out to be biased towards the latter mode of operation. This novel perspective on the Stochastic Diffusion Search led to an investigation of extensions of the standard SDS that would strike a different balance between these two modes of search space processing.
Thus, two novel algorithms were derived from the standard Stochastic Diffusion Search, ‘context-free’ and ‘context-sensitive’ SDS, and their properties were analysed with respect to resource allocation. It appeared that they shared some of the desired features of their predecessor but also possessed some properties not present in the classic SDS. The theory developed in the thesis was illustrated throughout with carefully chosen simulations of a best-fit search for a string pattern, a simple but representative domain, enabling careful control of search conditions.
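The standard SDS on the thesis's illustrative domain, best-fit search for a string pattern, can be sketched as follows. This is a minimal textbook-style implementation with invented parameters, not the thesis's code:

```python
import random

def sds_string_search(text, pattern, n_agents=100, iters=300, seed=0):
    """Standard SDS: agents hold candidate offsets of `pattern` in `text`."""
    rng = random.Random(seed)
    positions = list(range(len(text) - len(pattern) + 1))
    hyp = [rng.choice(positions) for _ in range(n_agents)]
    active = [False] * n_agents
    for _ in range(iters):
        # Test phase: each agent checks one randomly chosen micro-feature
        # (a single character) of the pattern at its hypothesised offset.
        for i in range(n_agents):
            j = rng.randrange(len(pattern))
            active[i] = text[hyp[i] + j] == pattern[j]
        # Diffusion phase: an inactive agent polls a random agent and copies
        # its hypothesis if that agent is active, else re-seeds at random.
        for i in range(n_agents):
            if not active[i]:
                k = rng.randrange(n_agents)
                hyp[i] = hyp[k] if active[k] else rng.choice(positions)
    # The largest cluster of agents marks the best-fit offset.
    return max(set(hyp), key=hyp.count)

print(sds_string_search("xxxyyhellozzqq", "hello"))
```

The diffusion phase is the resource-allocation mechanism analysed in the thesis: the fraction of agents clustered at the best-fit offset is exactly the quantity the Ehrenfest urn model tracks.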
Abstract:
Boreal winter wind storm situations over Central Europe are investigated by means of an objective cluster analysis. Surface data from the NCEP-Reanalysis and the ECHAM4/OPYC3 greenhouse-gas (GHG) climate change simulation (IS92a) are considered. To achieve an optimum separation of clusters of extreme storm conditions, 55 clusters of weather patterns are differentiated. To reduce the computational effort, a PCA is performed first, leading to a data reduction of about 98%. The clustering itself was computed on 3-day periods constructed from the first six PCs using the k-means clustering algorithm. The applied method enables an evaluation of the time evolution of the synoptic developments. The climate change signal is constructed by projecting the GCM simulation onto the EOFs obtained from the NCEP-Reanalysis. Consequently, the same clusters are obtained and their frequency distributions can be compared. For Central Europe, four primary storm clusters are identified. These clusters capture almost 72% of the historical extreme storm events while accounting for only 5% of the total relative frequency. Moreover, they show a statistically significant signature in the associated wind fields over Europe. An increased frequency of Central European storm clusters is detected under enhanced GHG conditions, associated with a strengthening of the pressure gradient over Central Europe. Consequently, more intense wind events over Central Europe are expected. The presented algorithm will be highly valuable for the analysis of large data volumes, as required for, e.g., multi-model ensemble analysis, particularly because of the enormous data reduction.
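The PCA-then-k-means pipeline can be sketched generically. The synthetic "weather regimes" below (two Gaussian clusters over a mock grid) and all sizes are illustrative assumptions, not the NCEP/ECHAM4 data:

```python
import numpy as np

def pca_scores(X, n_pc):
    """Project anomalies onto the leading principal components via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_pc].T            # scores: a large data reduction, as above

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's k-means on the PC scores."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.linalg.norm(X[:, None] - centres[None], axis=2).argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = X[labels == j].mean(axis=0)
    return labels

# Two synthetic weather regimes: 100 fields each over 50 grid points
rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(0.0, 1.0, (100, 50)),
                    rng.normal(3.0, 1.0, (100, 50))])
labels = kmeans(pca_scores(X, n_pc=6), k=2)
```

Clustering in the truncated PC space rather than on the full fields is what makes the method cheap enough for large multi-model data sets.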
Abstract:
Cholesterol is one of the key constituents for maintaining the cellular membrane and thus the integrity of the cell itself. In contrast, high levels of cholesterol in the blood are known to be a major risk factor in the development of cardiovascular disease. We formulate a deterministic nonlinear ordinary differential equation model of the sterol regulatory element binding protein 2 (SREBP-2) cholesterol genetic regulatory pathway in a hepatocyte. The mathematical model includes a description of genetic transcription by SREBP-2, which is subsequently translated to mRNA, leading to the formation of 3-hydroxy-3-methylglutaryl coenzyme A reductase (HMGCR), a main precursor of cholesterol synthesis. Cholesterol synthesis subsequently leads to the regulation of SREBP-2 via a negative feedback formulation. Parameterised with data from the literature, the model is used to understand how SREBP-2 transcription and regulation affect cellular cholesterol concentration. Model stability analysis shows that the only positive steady state of the system exhibits purely oscillatory, damped oscillatory or monotonic behaviour under certain parameter conditions. In light of our findings we postulate how cholesterol homeostasis is maintained within the cell, and the advantages of our model formulation are discussed with respect to other models of genetic regulation in the literature.
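The negative-feedback structure can be sketched as a minimal three-variable ODE system. The dimensionless unit rate constants and the Hill repression term below are chosen for illustration and are not the paper's literature-derived parameterisation:

```python
def simulate(t_end=100.0, dt=0.01, K=1.0, n=2):
    """Forward-Euler integration of a toy SREBP-2 -> mRNA -> cholesterol loop.
    Cholesterol represses SREBP-2 activity through a Hill function (the
    negative feedback); all rate constants are set to 1 for illustration."""
    steps = int(t_end / dt)
    s = m = c = 0.0                           # SREBP-2 activity, mRNA, cholesterol
    for _ in range(steps):
        ds = 1.0 / (1.0 + (c / K) ** n) - s   # production repressed by cholesterol
        dm = s - m                            # transcription to HMGCR mRNA
        dc = m - c                            # HMGCR-driven cholesterol synthesis
        s, m, c = s + dt * ds, m + dt * dm, c + dt * dc
    return s, m, c

s, m, c = simulate()
print(round(c, 3))  # settles at the unique positive steady state, c^3 + c = 1
```

With this gentle feedback (Hill coefficient 2) the trajectory is damped, matching one of the qualitative regimes identified in the stability analysis; steeper feedback shifts the system towards the oscillatory regimes.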
Abstract:
A new method for assessing forecast skill and predictability that involves the identification and tracking of extratropical cyclones has been developed and implemented to obtain detailed information about the prediction of cyclones that cannot be obtained from more conventional analysis methodologies. The cyclones were identified and tracked along the forecast trajectories, and statistics were generated to determine the rate at which the position and intensity of the forecasted storms diverge from the analyzed tracks as a function of forecast lead time. The results show a higher level of skill in predicting the position of extratropical cyclones than the intensity. They also show that there is potential to improve the skill in predicting the position by 1–1.5 days and the intensity by 2–3 days, via improvements to the forecast model. Further analysis shows that forecasted storms move at a slower speed than analyzed storms on average and that there is a larger error in the predicted amplitudes of intense storms than of weaker storms. The results also show that some storms can be predicted up to 3 days before they are identified as an 850-hPa vorticity center in the analyses. In general, the results show a higher level of skill in the Northern Hemisphere (NH) than the Southern Hemisphere (SH); however, the rapid growth of NH winter storms is not very well predicted. The impact that observations of different types have on the prediction of the extratropical cyclones has also been explored, using forecasts integrated from analyses that were constructed from reduced observing systems. Terrestrial, satellite-based, and surface-based observing systems were investigated, and the results showed that the predictive skill of the terrestrial system was superior to that of the satellite system in the NH. Further analysis showed that the satellite system was not very good at predicting the growth of the storms. In the SH the terrestrial system has significantly less skill than the satellite system, highlighting the dominance of satellite observations in this hemisphere. The surface system has very poor predictive skill in both hemispheres.
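Once forecast and analyzed tracks are matched, position-error statistics of the kind described reduce to great-circle separations at each lead time; the haversine formula is the standard distance measure. The matched tracks below are invented toy data:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in km."""
    r = 6371.0                                   # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Toy matched tracks: (lat, lon) of one storm centre at successive lead times
analyzed = [(50.0, -30.0), (52.0, -24.0), (54.0, -17.0)]
forecast = [(50.0, -30.0), (51.5, -25.0), (53.0, -19.0)]
errors = [haversine_km(a[0], a[1], f[0], f[1])
          for a, f in zip(analyzed, forecast)]
print([round(e) for e in errors])  # position error grows with lead time
```

Averaging such separations over many matched storms, per lead time, gives the divergence-rate statistics described above.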
Abstract:
Climate models provide compelling evidence that if greenhouse gas emissions continue at present rates, then key global temperature thresholds (such as the European Union limit of two degrees of warming since pre-industrial times) are very likely to be crossed in the next few decades. However, there is relatively little attention paid to whether, should a dangerous temperature level be exceeded, it is feasible for the global temperature to then return to safer levels in a usefully short time. We focus on the timescales needed to reduce atmospheric greenhouse gases and associated temperatures back below potentially dangerous thresholds, using a state-of-the-art general circulation model. This analysis is extended with a simple climate model to provide uncertainty bounds. We find that even for very large reductions in emissions, temperature reduction is likely to occur at a low rate. Policy-makers need to consider such very long recovery timescales implicit in the Earth system when formulating future emission pathways that have the potential to 'overshoot' particular atmospheric concentrations of greenhouse gases and, more importantly, related temperature levels that might be considered dangerous.
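The slow-recovery argument can be illustrated with a minimal globally averaged energy-balance model in which excess CO2 is removed on a long timescale after emissions stop. All parameters are rough illustrative values, not those of the GCM or simple climate model used in the study:

```python
import numpy as np

# One-box energy balance: C dT/dt = F(t) - lam * T, with excess CO2 decaying
# slowly after emissions cease (all numbers are rough illustrative values).
dt = 0.1                                # yr
t = np.arange(0.0, 500.0, dt)
C = 8.0                                 # effective heat capacity, W yr m^-2 K^-1
lam = 1.2                               # climate feedback, W m^-2 K^-1
tau_co2 = 150.0                         # e-folding removal time of excess CO2, yr
co2 = np.empty_like(t)
co2[0] = 560.0                          # ppm: emissions stop at doubled CO2
T = np.zeros_like(t)                    # warming relative to pre-industrial, K
for i in range(1, len(t)):
    co2[i] = 280.0 + (co2[i - 1] - 280.0) * np.exp(-dt / tau_co2)
    F = 5.35 * np.log(co2[i] / 280.0)   # CO2 radiative forcing, W m^-2
    T[i] = T[i - 1] + dt * (F - lam * T[i - 1]) / C
print(round(T.max(), 2), round(T[t.searchsorted(100.0)], 2))
```

Even with emissions cut to zero instantly, the temperature a century later remains a large fraction of its peak, which is the overshoot-and-slow-recovery behaviour the abstract warns policy-makers about.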
Abstract:
A significant desert dust deposition event occurred on Mt. Elbrus, Caucasus Mountains, Russia on 5 May 2009, where the deposited dust later appeared as a brown layer in the snow pack. An examination of the dust transportation history and analysis of the chemical and physical properties of the deposited dust were used to develop a new approach for high-resolution provenancing of dust deposition events recorded in snow pack using multiple independent techniques. A combination of SEVIRI red-green-blue composite imagery, MODIS atmospheric optical depth fields derived using the Deep Blue algorithm, air mass trajectories derived with the HYSPLIT model and analysis of meteorological data enabled identification of dust source regions with high temporal (hours) and spatial (ca. 100 km) resolution. The dust deposited on 5 May 2009 originated in the foothills of the Djebel Akhdar in eastern Libya, where dust sources were activated by the intrusion of cold air from the Mediterranean Sea and a Saharan low-pressure system, and was transported to the Caucasus along the eastern Mediterranean coast, over Syria and Turkey. Particles with a diameter below 8 μm accounted for 90% of the measured particles in the sample, with a mean of 3.58 μm, a median of 2.48 μm and a dominant mode of 0.60 μm. The chemical signature of this long-travelled dust was significantly different from that of the locally-produced dust and close to that of soils collected in a palaeolake in the source region, in terms of the concentrations of hematite and oxides of aluminium, manganese and magnesium. The potential addition of dust from a secondary source in northern Mesopotamia introduced uncertainty into the provenancing of dust from this event. Nevertheless, the approach adopted here enables other dust horizons in the snowpack to be linked to specific dust transport events recorded in remote sensing and meteorological data archives.
Abstract:
A record of dust deposition events between 2009 and 2012 on Mt. Elbrus, Caucasus Mountains, derived from a snow pit and a shallow ice core is presented for the first time for this region. A combination of isotopic analysis, SEVIRI red-green-blue composite imagery, MODIS atmospheric optical depth fields derived using the Deep Blue algorithm, air mass trajectories derived using the HYSPLIT model and analysis of meteorological data enabled identification of dust source regions with high temporal (hours) and spatial (ca. 20–100 km) resolution. Seventeen dust deposition events were detected; fourteen occurred in March–June, one in February and two in October. Four events originated in the Sahara, predominantly in north-eastern Libya and eastern Algeria. Thirteen events originated in the Middle East, in the Syrian Desert and northern Mesopotamia, from a mixture of natural and anthropogenic sources. Dust transportation from the Sahara was associated with vigorous Saharan depressions, strong surface winds in the source region and mid-tropospheric south-westerly flow with daily wind speeds of 20–30 m s−1 at the 700 hPa level; although these events were less frequent, they resulted in higher dust concentrations in snow. Dust transportation from the Middle East was associated with weaker depressions forming over the source region, high pressure centred over or extending towards the Caspian Sea and a weaker southerly or south-easterly flow towards the Caucasus Mountains with daily wind speeds of 12–18 m s−1 at the 700 hPa level. Higher concentrations of nitrates and ammonium characterise the Middle Eastern dust deposited on Mt. Elbrus in 2009, indicating a contribution from anthropogenic sources. The modal values of the particle size distributions ranged between 1.98 μm and 4.16 μm. Most samples were characterised by modal values of 2.0–2.8 μm with an average of 2.6 μm, and there was no significant difference between dust from the Sahara and the Middle East.