41 results for Real-time database and information retrieval systems
Abstract:
The major technical objectives of the RC-NSPES are to provide a framework for the concurrent operation of reactive and proactive security functions, delivering efficient and optimised intrusion detection schemes as well as enhanced, highly correlated rule sets for more effective alert management and root-cause analysis. The design and implementation of the RC-NSPES solution include a number of innovative features, both in the real-time programmable embedded hardware (FPGA) deployment and in the integrated management station. These have been devised to deliver enhanced detection of attacks and contextualised alerts against threats that can arise from both network-layer and application-layer protocols. The resulting architecture represents an efficient and effective framework for the future deployment of network security systems.
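The abstract gives no implementation detail, but the core idea of fusing concurrent reactive (signature-based) and proactive (anomaly-based) detections into correlated, contextualised alerts can be sketched. The snippet below is a minimal illustration under assumed data structures; the Alert fields, the flow-based grouping and the time window are hypothetical, not taken from the paper.

```python
# Hypothetical sketch: correlating alerts from concurrent reactive
# (signature-based) and proactive (anomaly-based) detectors into
# clustered, contextualised alerts. All names and fields are
# illustrative, not from the paper.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Alert:
    source: str      # "reactive" or "proactive"
    layer: str       # "network" or "application"
    flow_id: str     # e.g. hash of the 5-tuple identifying the flow
    rule_id: str
    timestamp: float

def correlate(alerts, window=5.0):
    """Group alerts on the same flow within a time window, so that a
    reactive and a proactive detection of the same event yield one
    contextualised, higher-confidence alert cluster."""
    by_flow = defaultdict(list)
    for a in sorted(alerts, key=lambda a: a.timestamp):
        by_flow[a.flow_id].append(a)
    correlated = []
    for group in by_flow.values():
        cluster = [group[0]]
        for a in group[1:]:
            if a.timestamp - cluster[-1].timestamp <= window:
                cluster.append(a)
            else:
                correlated.append(cluster)
                cluster = [a]
        correlated.append(cluster)
    return correlated
```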
Abstract:
The reaction between gas-phase ozone and monolayers of the unsaturated lipid 1-palmitoyl-2-oleoyl-sn-glycero-3-phosphocholine, POPC, on aqueous solutions has been studied in real time using neutron reflection and surface pressure measurements. The reaction between ozone and lung surfactant, which contains POPC, leads to decreased pulmonary function, but little is known about the changes that occur to the interfacial material as a result of oxidation. The results reveal that the initial reaction of ozone with POPC leads to a rapid increase in surface pressure followed by a slow decrease to very low values. The neutron reflection measurements, performed on an isotopologue of POPC with a selectively deuterated palmitoyl strand, reveal that the reaction leads to loss of this strand from the air-water interface, suggesting either solubilization of the product lipid or degradation of the palmitoyl strand by a reactive species. Reactions of ¹H-POPC on D2O reveal that the headgroup region of the lipids in aqueous solution is not dramatically perturbed by the reaction of POPC monolayers with ozone, supporting degradation of the palmitoyl strand rather than solubilization. The results are consistent with the reaction of ozone with the oleoyl strand of POPC at the air-water interface leading to the formation of OH radicals. The highly reactive OH radicals produced can then go on to react with the saturated palmitoyl strands, leading to the formation of oxidized lipids with shorter alkyl tails.
Abstract:
To ensure minimum loss of system security and revenue, it is essential that faults on underground cable systems be located and repaired rapidly. Currently in the UK, the impulse current method is used to prelocate faults, prior to using acoustic methods to pinpoint the fault location. The impulse current method is heavily dependent on the engineer's knowledge and experience in recognising and interpreting the transient waveforms produced by the fault. The development of a prototype real-time expert system aid for the prelocation of cable faults is described. Results from the prototype demonstrate the feasibility and benefits of the expert system as an aid for the diagnosis and location of faults on underground cable systems.
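The abstract does not spell out the expert system's rules, but the timing calculation that underlies impulse-current prelocation is standard practice and can be sketched: the transient reflects between the test end and the fault, so the spacing of successive peaks in the recorded waveform gives the round-trip travel time. Everything below (the function name, the peak-time input, the default propagation velocity) is illustrative, not from the paper.

```python
# Hypothetical sketch of the distance calculation underlying
# impulse-current fault prelocation.
def prelocate_fault(peak_times_s, velocity_m_per_s=1.6e8):
    """Estimate fault distance from successive transient peak times.

    peak_times_s     -- times of successive reflection peaks (seconds)
    velocity_m_per_s -- pulse propagation velocity in the cable
                        (roughly half the speed of light; cable-dependent)
    """
    if len(peak_times_s) < 2:
        raise ValueError("need at least two reflection peaks")
    # Average round-trip time between consecutive reflections.
    intervals = [b - a for a, b in zip(peak_times_s, peak_times_s[1:])]
    round_trip = sum(intervals) / len(intervals)
    return velocity_m_per_s * round_trip / 2.0  # metres to the fault

# Example: peaks 10 microseconds apart imply a fault roughly 800 m away.
print(prelocate_fault([12.0e-6, 22.0e-6, 32.0e-6]))
```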
Abstract:
The advantages of standard bus systems have been appreciated for many years. The ability to connect only those modules required to perform a given task has both technical and commercial advantages over a system with a fixed architecture that cannot be easily expanded or updated. Although such bus standards have proliferated in the microprocessor field, a general-purpose, low-cost standard for digital video processing has yet to gain acceptance. The paper describes the likely requirements of such a system, and discusses three currently available commercial systems. A new bus specification known as Vidibus, developed to fulfil these requirements, is presented. Results from applications already implemented using this real-time bus system are also given.
Abstract:
An algorithm for solving nonlinear discrete-time optimal control problems with model-reality differences is presented. The technique uses Dynamic Integrated System Optimization and Parameter Estimation (DISOPE), which achieves the correct optimal solution in spite of deficiencies in the mathematical model employed in the optimization procedure. A version of the algorithm with a linear-quadratic model-based problem, implemented in the C++ programming language, is developed and applied to illustrative simulation examples. An analysis of the optimality and convergence properties of the algorithm is also presented.
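A much-simplified sketch can illustrate the two-step structure: repeatedly fit a linear-quadratic model around the current trajectory of the "real" system, then re-solve the model-based problem. The full DISOPE algorithm additionally carries modifier terms in the performance index that account for model-reality differences and underpin convergence to the true optimum; the toy dynamics, cost weights and function names below are assumptions for illustration (Python rather than the paper's C++).

```python
import numpy as np

def real_system_response(u, x0=1.0):
    # Toy "reality": linear dynamics plus a quadratic term that the
    # optimisation model below omits (the model-reality difference).
    x = np.empty(len(u) + 1)
    x[0] = x0
    for k, uk in enumerate(u):
        x[k + 1] = 0.9 * x[k] + uk + 0.05 * x[k] ** 2
    return x

def estimate_parameters(x, u):
    # Fit the linear model x_{k+1} = a x_k + b u_k to the observed
    # trajectory by least squares (stand-in for the estimation step).
    A = np.column_stack([x[:-1], u])
    (a, b), *_ = np.linalg.lstsq(A, x[1:], rcond=None)
    return a, b

def solve_model_based_problem(a, b, horizon, x0=1.0, q=1.0, r=0.1):
    # Finite-horizon LQ solve on the estimated model via a backward
    # Riccati recursion, then a forward rollout of the resulting gains.
    p, gains = q, []
    for _ in range(horizon):
        k_gain = (b * p * a) / (r + b * p * b)
        gains.append(k_gain)
        p = q + a * p * (a - b * k_gain)
    gains.reverse()                       # time order t = 0 .. N-1
    x, u = x0, []
    for k_gain in gains:
        u.append(-k_gain * x)
        x = a * x + b * u[-1]
    return np.array(u)

def disope_like(horizon=20, max_iter=100, relax=0.5, tol=1e-8):
    u = np.full(horizon, -0.1)            # nonzero start so b is estimable
    for _ in range(max_iter):
        x = real_system_response(u)       # measure "reality"
        a, b = estimate_parameters(x, u)  # match model to this trajectory
        u_new = solve_model_based_problem(a, b, horizon)
        if np.linalg.norm(u_new - u) < tol:
            break
        u = u + relax * (u_new - u)       # relaxed update for stability
    return u

print(disope_like()[:5])
```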
Abstract:
A near real-time flood detection algorithm giving a synoptic overview of the extent of flooding in both urban and rural areas, and capable of working during night-time and day-time even if cloud was present, could be a useful tool for operational flood relief management. The paper describes an automatic algorithm using high resolution Synthetic Aperture Radar (SAR) satellite data that builds on existing approaches, including the use of image segmentation techniques prior to object classification to cope with the very large number of pixels in these scenes. Flood detection in urban areas is guided by the flood extent derived in adjacent rural areas. The algorithm assumes that high resolution topographic height data are available for at least the urban areas of the scene, in order that a SAR simulator may be used to estimate areas of radar shadow and layover. The algorithm proved capable of detecting flooding in rural areas using TerraSAR-X with good accuracy, and in urban areas with reasonable accuracy. The accuracy was reduced in urban areas partly because of TerraSAR-X’s restricted visibility of the ground surface due to radar shadow and layover.
Abstract:
A near real-time flood detection algorithm giving a synoptic overview of the extent of flooding in both urban and rural areas, and capable of working during night-time and day-time even if cloud was present, could be a useful tool for operational flood relief management and flood forecasting. The paper describes an automatic algorithm using high resolution Synthetic Aperture Radar (SAR) satellite data that assumes that high resolution topographic height data are available for at least the urban areas of the scene, in order that a SAR simulator may be used to estimate areas of radar shadow and layover. The algorithm proved capable of detecting flooding in rural areas using TerraSAR-X with good accuracy, and in urban areas with reasonable accuracy.
Abstract:
A near real-time flood detection algorithm giving a synoptic overview of the extent of flooding in both urban and rural areas, and capable of working during night-time and day-time even if cloud was present, could be a useful tool for operational flood relief management. The paper describes an automatic algorithm using high resolution Synthetic Aperture Radar (SAR) satellite data that builds on existing approaches, including the use of image segmentation techniques prior to object classification to cope with the very large number of pixels in these scenes. Flood detection in urban areas is guided by the flood extent derived in adjacent rural areas. The algorithm assumes that high resolution topographic height data are available for at least the urban areas of the scene, in order that a SAR simulator may be used to estimate areas of radar shadow and layover. The algorithm proved capable of detecting flooding in rural areas using TerraSAR-X with good accuracy, classifying 89% of flooded pixels correctly, with an associated false positive rate of 6%. Of the urban water pixels visible to TerraSAR-X, 75% were correctly detected, with a false positive rate of 24%. If all urban water pixels were considered, including those in shadow and layover regions, these figures fell to 57% and 18% respectively.
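The abstract outlines the processing chain: segmentation before classification, rural flood extent guiding urban detection, and a SAR simulator masking shadow and layover. The sketch below illustrates only the segment-then-classify and masking steps; the segmentation method, the backscatter threshold and the shadow_layover_mask input are assumptions for illustration, not the paper's algorithm.

```python
# Illustrative sketch (not the paper's code): segment a SAR backscatter
# image, classify dark homogeneous segments as water, and exclude
# shadow/layover areas predicted by a SAR simulator from topographic data.
import numpy as np
from scipy import ndimage

def detect_flood(backscatter_db, shadow_layover_mask, water_threshold_db=-14.0):
    """Return a boolean flood map from a SAR backscatter image (dB).

    backscatter_db      -- 2-D array of calibrated backscatter (dB)
    shadow_layover_mask -- True where a SAR simulator predicts shadow or
                           layover (ground not visible to the sensor)
    """
    # Crude segmentation: label connected regions of low backscatter.
    dark = backscatter_db < water_threshold_db
    labels, n = ndimage.label(dark)
    flood = np.zeros_like(dark)
    for region in range(1, n + 1):
        mask = labels == region
        # Classify whole segments rather than pixels to suppress speckle;
        # smooth open water appears dark and homogeneous in SAR imagery.
        if mask.sum() >= 50 and backscatter_db[mask].std() < 2.0:
            flood[mask] = True
    # Ground in shadow/layover is invisible to the sensor: leave unclassified.
    flood[shadow_layover_mask] = False
    return flood
```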
Abstract:
In this paper we describe how to cope with the delays inherent in a real-time control system for a steerable stereo head/eye platform. A purposive and reactive system requires the use of fast vision algorithms to provide the controller with the error signals to drive the platform. The time-critical implementation of these algorithms is necessary, not only to enable short-latency reaction to real-world events, but also to provide sufficiently high-frequency results with small enough delays that the controller remains stable. However, even with precise knowledge of that delay, nonlinearities in the plant make modelling of that plant impossible, thus precluding the use of a Smith Regulator. Moreover, the major delay in the system is in the feedback (image capture and vision processing) rather than the feedforward (controller) loop. Delays ranging between 40 ms and 80 ms are common for the simple 2D processes, but might extend to several hundred milliseconds for more sophisticated 3D processes. The strategy presented gives precise control over the gaze direction of the cameras despite the lack of a priori knowledge of the delays involved. The resulting controller is shown to have a similar structure to the Smith Regulator, but with essential modifications.
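For readers unfamiliar with the structure the authors modify, a textbook Smith predictor can be sketched in a few lines. This is the classical scheme the paper deliberately departs from (it assumes an exact plant model and a known delay, which the abstract says are unavailable here); the plant, gain and delay values below are invented for illustration.

```python
# Minimal textbook Smith-predictor sketch: proportional control of a
# first-order plant whose measurement arrives `delay` samples late,
# compensated by an internal (here, exact) model of the plant.
from collections import deque

def simulate(steps=200, delay=3, a=0.9, b=0.1, kp=9.0, target=1.0):
    """Control x+ = a*x + b*u with the measurement delayed by `delay`
    samples, using a Smith predictor to recover undelayed feedback."""
    x, x_model = 0.0, 0.0
    meas = deque([0.0] * delay, maxlen=delay)        # feedback delay line
    model_hist = deque([0.0] * delay, maxlen=delay)  # delayed model output
    for _ in range(steps):
        # Correct the undelayed model prediction with the discrepancy
        # between the delayed measurement and the delayed model output.
        x_pred = x_model + (meas[0] - model_hist[0])
        u = kp * (target - x_pred)
        x = a * x + b * u                 # real plant
        x_model = a * x_model + b * u     # internal model
        meas.append(x)
        model_hist.append(x_model)
    return x

# Settles at 0.9 (proportional control leaves a steady-state offset)
# and stays stable despite the delayed measurement.
print(simulate())
```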
Abstract:
User interaction within a virtual environment may take various forms: a teleconferencing application will require users to speak to each other (Geak, 1993); with computer-supported co-operative working, an engineer may wish to pass an object to another user for examination; in a battlefield simulation (McDonough, 1992), users might exchange fire. In all cases it is necessary for the actions of one user to be presented to the others sufficiently quickly to allow realistic interaction. In this paper we take a fresh look at virtual reality operating systems by tackling the underlying issues of creating real-time multi-user environments.
Abstract:
We present the first climate prediction of the coming decade made with multiple models, initialized with prior observations. This prediction accrues from an international activity to exchange decadal predictions in near real-time, in order to assess differences and similarities, provide a consensus view to prevent over-confidence in forecasts from any single model, and establish current collective capability. We stress that the forecast is experimental, since the skill of the multi-model system is as yet unknown. Nevertheless, the forecast systems used here are based on models that have undergone rigorous evaluation and individually have been evaluated for forecast skill. Moreover, it is important to publish forecasts to enable open evaluation, and to provide a focus on climate change in the coming decade. Initialized forecasts of the year 2011 agree well with observations, with a pattern correlation of 0.62 compared to 0.31 for uninitialized projections. In particular, the forecast correctly predicted La Niña in the Pacific, and warm conditions in the north Atlantic and USA. A similar pattern is predicted for 2012 but with a weaker La Niña. Indices of Atlantic multi-decadal variability and Pacific decadal variability show no signal beyond climatology after 2015, while temperature in the Niño3 region is predicted to warm slightly by about 0.5 °C over the coming decade. However, uncertainties are large for individual years and initialization has little impact beyond the first 4 years in most regions. Relative to uninitialized forecasts, initialized forecasts are significantly warmer in the north Atlantic sub-polar gyre and cooler in the north Pacific throughout the decade. They are also significantly cooler in the global average and over most land and ocean regions out to several years ahead. However, in the absence of volcanic eruptions, global temperature is predicted to continue to rise, with each year from 2013 onwards having a 50 % chance of exceeding the current observed record. Verification of these forecasts will provide an important opportunity to test the performance of models and our understanding and knowledge of the drivers of climate change.
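The skill scores quoted (a pattern correlation of 0.62 versus 0.31) are anomaly pattern correlations between forecast and observed fields. As a small illustration of how such a score is typically computed, here is an area-weighted version; the exact definition and weighting the authors used are not stated in the abstract, so this is a common convention rather than their method.

```python
# Illustrative pattern (anomaly) correlation between a forecast field and
# an observed field on a regular latitude-longitude grid, weighted by
# cos(latitude) so that polar grid cells do not dominate the score.
import numpy as np

def pattern_correlation(forecast, observed, lats_deg):
    """Centred, cos(latitude)-weighted correlation of two 2-D
    (lat, lon) anomaly fields."""
    forecast = np.asarray(forecast, float)
    observed = np.asarray(observed, float)
    w = np.cos(np.deg2rad(lats_deg))[:, None] * np.ones(forecast.shape)
    w = w / w.sum()
    f = forecast - (w * forecast).sum()   # remove area-weighted means
    o = observed - (w * observed).sum()
    cov = (w * f * o).sum()
    return cov / np.sqrt((w * f**2).sum() * (w * o**2).sum())
```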
Abstract:
Three decades of ongoing executive concern over how to achieve successful alignment between business and information technology show the complexity of such a vital process. Most of the challenges of alignment are related to knowledge and organisational change, and several researchers have introduced a number of mechanisms to address some of these challenges. However, these mechanisms pay less attention to multi-level effects, which results in a limited understanding of alignment across levels. Therefore, we reviewed these challenges from a multi-level learning perspective and found that business and IT alignment is related to the balance of exploitation and exploration strategies with the intellectual content of individual, group and organisational levels.