14 results for Wide Area Monitoring

em Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

B-ISDN is a universal network which supports diverse mixes of services, applications and traffic. ATM has been accepted worldwide as the transport technique for future use in B-ISDN. ATM, being a simple packet-oriented transfer technique, provides a flexible means of supporting a continuum of transport rates and is efficient because network resources can be statistically shared by multiple users. To fully exploit the potential statistical gain while supporting diverse service and traffic mixes, an efficient traffic control must be designed. Traffic controls, which include congestion and flow control, are a fundamental necessity for the success and viability of future B-ISDN. Congestion and flow control are difficult in the broadband environment because of the high-speed links, wide-area distances, diverse service requirements and diverse traffic characteristics. Most congestion and flow control approaches in conventional packet-switched networks are reactive in nature and are not applicable in the B-ISDN environment. In this research, traffic control procedures based mainly on preventive measures for a private ATM-based network are proposed and their performance evaluated. The traffic controls include connection admission control (CAC), traffic flow enforcement, priority control and an explicit feedback mechanism. These functions operate at call level and cell level, and are carried out distributively by the end terminals, the network access points and the internal elements of the network. During the connection set-up phase, the CAC accepts or denies a connection request and allocates bandwidth to the new connection according to one of three schemes: peak bit rate, statistical rate or average bit rate. The statistical multiplexing rate is based on a 'bufferless fluid flow model', which is simple and robust. Allocating an average bit rate to data traffic improves network bandwidth utilisation at the expense of delay.
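
As a worked illustration of the statistical-rate scheme, the sketch below (Python) makes an admission decision under a bufferless fluid-flow model of on-off sources. The Gaussian tail approximation, the overflow-probability target and all traffic figures are assumptions for illustration; the thesis's exact admission test is not reproduced here.

from statistics import NormalDist

def admit(existing, new, capacity_mbps, overflow_prob=1e-6):
    """Accept the new connection if P(aggregate rate > capacity) stays below
    overflow_prob, using a Gaussian approximation (an assumption here) to
    the bufferless fluid-flow model. Each source is (peak_mbps, activity)
    for an on-off fluid source."""
    sources = existing + [new]
    mean = sum(p * a for p, a in sources)                   # mean aggregate rate
    var = sum(p * p * a * (1 - a) for p, a in sources)      # on-off rate variance
    z = NormalDist().inv_cdf(1.0 - overflow_prob)           # Gaussian tail quantile
    return mean + z * var ** 0.5 <= capacity_mbps

# Hypothetical example: a 150 Mb/s link with bursty 10 Mb/s-peak sources at
# 30% activity. Peak-rate allocation would reject the 16th source
# (16 x 10 = 160 > 150); the statistical scheme admits it.
existing = [(10.0, 0.3)] * 15
print(admit(existing, (10.0, 0.3), capacity_mbps=150.0))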

Relevance:

80.00%

Publisher:

Abstract:

There is an increasing emphasis on the use of software to control safety-critical plants across a wide area of applications. The importance of ensuring the correct operation of such potentially hazardous systems places an emphasis on verifying the system against a suitably secure specification. However, the process of verification is often made more complex by the concurrency and real-time considerations inherent in many applications. One response to this is the use of formal methods for the specification and verification of safety-critical control systems, which provide a mathematical representation of a system that permits reasoning about its properties. This thesis investigates the use of the formal method Communicating Sequential Processes (CSP) for the verification of a safety-critical control application. CSP is a discrete-event-based process algebra with a compositional axiomatic semantics that supports verification by formal proof. The application is an industrial case study concerning the concurrent control of a real-time high-speed mechanism. The case study shows that the axiomatic verification method employed is complex: it requires the user to have a relatively comprehensive understanding of both the proof system and the application. Through a series of observations, the thesis notes that CSP has the scope to support a more procedural approach to verification in the form of testing. The thesis investigates this testing technique and proposes the method of Ideal Test Sets. By exploiting the underlying structure of the CSP semantic model, it is shown that for certain processes and specifications the obligation of verification can be reduced to testing the specification over a finite subset of the behaviours of the process.
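
The following sketch (Python) illustrates the idea of verifying a safety specification by testing it over a finite subset of a process's behaviours, in the spirit of the Ideal Test Sets method. The toy labelled transition system, its events and the specification are entirely hypothetical; real CSP processes and the method's finiteness argument are far richer than this.

# A toy labelled transition system for a guard/press interlock; states and
# events are hypothetical stand-ins for the CSP process of the case study.
LTS = {
    "idle":     [("close_guard", "guarded")],
    "guarded":  [("press_down", "pressing"), ("open_guard", "idle")],
    "pressing": [("open_guard", "idle")],
}

def traces(state, depth):
    """All event traces of the LTS up to the given length (a finite test set)."""
    if depth == 0:
        return [()]
    out = [()]
    for event, succ in LTS[state]:
        out += [(event,) + t for t in traces(succ, depth - 1)]
    return out

def spec_ok(trace):
    """Safety specification: every press_down is preceded by a close_guard
    not cancelled by an intervening open_guard."""
    guarded = False
    for e in trace:
        if e == "close_guard":
            guarded = True
        elif e == "open_guard":
            guarded = False
        elif e == "press_down" and not guarded:
            return False
    return True

# Verification reduced to testing over a finite subset of behaviours:
assert all(spec_ok(t) for t in traces("idle", depth=6))
print("specification holds on all traces up to length 6")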

Relevance:

80.00%

Publisher:

Abstract:

This research develops a low-cost remote sensing system for use in agricultural applications. The important features of the system are that it monitors the near infrared and incorporates position- and attitude-measuring equipment, allowing geo-rectified images to be produced without the use of ground control points. The equipment is designed to be hand-held and hence requires no structural modification to the aircraft. The portable remote sensing system consists of an accelerometer-based inertial measurement unit (IMU), a low-cost GPS device and a small-format false-colour composite digital camera. The total cost of producing such a system is below GBP 3000, far cheaper than equivalent existing systems. The design of the portable remote sensing device has eliminated boresight misalignment errors from the direct geo-referencing process. A new processing technique has been introduced for the data obtained from these low-cost devices, and it is found that using this technique the image can be matched (overlaid) onto Ordnance Survey MasterMap at an accuracy compatible with precision agriculture requirements. The direct geo-referencing has also been improved by introducing an algorithm capable of correcting oblique images directly. Because this algorithm alters the pixel values, it is advised that image analysis be performed before image geo-rectification. A drawback of this research is that the low-cost GPS device experienced bad checksum errors, which resulted in missing data. The Wide Area Augmentation System (WAAS) correction could not be employed because the satellites could not be locked onto whilst flying. The best GPS data were obtained from the Garmin eTrex instruments (15 m kinematic and 2 m static), which have a high-sensitivity receiver with good lock-on capability. The limitation of this GPS device is its inability to receive the P-code signal, which is needed to gain the best accuracy when undertaking differential GPS processing. Pairing the L1 carrier phase with the received C/A-code pseudorange, in order to determine the image coordinates by the differential technique, is still under investigation. To improve the position accuracy, it is recommended that a GPS base station be established near the survey area, instead of using a permanent GPS base station established by the Ordnance Survey.
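
A minimal sketch (Python) of the direct geo-referencing step: projecting an image pixel to the ground using only the GPS position and IMU attitude, with no ground control points. The rotation order, camera parameters and flat-terrain assumption are illustrative, not the thesis's actual processing technique.

import numpy as np

def rotation(roll, pitch, yaw):
    """Body-to-ground rotation matrix from roll/pitch/yaw in radians
    (Z-Y-X order assumed here for illustration)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def pixel_to_ground(px, py, focal_px, cam_xyz, roll, pitch, yaw):
    """Intersect the ray through pixel (px, py) with a flat ground plane z = 0."""
    ray_cam = np.array([px, py, -focal_px])      # camera frame, optical axis down
    ray = rotation(roll, pitch, yaw) @ ray_cam   # rotate into the ground frame
    t = -cam_xyz[2] / ray[2]                     # scale factor to reach z = 0
    return cam_xyz + t * ray

# Hypothetical flight: 1000 m altitude, slightly oblique attitude,
# principal-point-relative pixel coordinates, focal length in pixels.
print(pixel_to_ground(120.0, -80.0, 3000.0,
                      np.array([0.0, 0.0, 1000.0]), 0.02, -0.01, 0.3))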

Relevance:

80.00%

Publisher:

Abstract:

We analyze, through numerical simulations, the performance of a new modulation format, the serial dark soliton (SDS), for wide-area 100-Gb/s applications. We compare the performance of the SDS with the conventional dark soliton, amplitude-modulation phase-shift keying (also known as duobinary), non-return-to-zero and return-to-zero modulation formats when subjected to typical wide-area-network impairments. We show that the SDS has strong tolerance to chromatic dispersion and polarization-mode dispersion, while maintaining a compact spectrum suited to the strong filtering requirements of ultradense wavelength-division-multiplexing applications. The SDS can be generated using commercially available components for 40-Gb/s applications and is cost-efficient compared with other 100-Gb/s electrical-time-division-multiplexing systems.
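
For context, the sketch below (Python) generates the conventional dark-soliton building block: an intensity dip with a tanh field profile in a 10 ps slot (one bit at 100 Gb/s). The pulse width is a hypothetical figure, and this is the conventional format the SDS is compared against, not the SDS itself.

import numpy as np

def dark_soliton_field(t, t0, width):
    """Normalised optical field E(t) = tanh((t - t0)/width); the intensity
    |E|^2 dips to zero at t0, with a pi phase jump across the dip."""
    return np.tanh((t - t0) / width)

bit_slot = 10.0                               # ps per bit at 100 Gb/s
t = np.linspace(0.0, bit_slot, 501)
field = dark_soliton_field(t, t0=bit_slot / 2, width=1.5)
power = np.abs(field) ** 2                    # intensity dip at the slot centre
print(f"minimum power {power.min():.3f} at slot centre t = {bit_slot / 2} ps")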

Relevance:

40.00%

Publisher:

Abstract:

Recent work has highlighted the potential of sol-gel-derived calcium silicate glasses for the regeneration or replacement of damaged bone tissue. The work presented herein provides new insight into the processing of bioactive calcia-silica sol-gel foams and the reaction mechanisms associated with them when immersed in vitro in a simulated body fluid (SBF). Small-angle X-ray scattering and wide-angle X-ray scattering (diffraction) have been used to study the stabilization of these foams via heat treatment, with analogous in situ time-resolved data gathered for a foam immersed in SBF. During thermal processing, pore sizes in the range of 16.5-62.0 nm have been identified and are only present once foams have been heated to 400 degrees C and above. Calcium nitrate crystallites were present until foams were heated to 600 degrees C; the crystallite size varied from 75 to 145 nm, increasing with heat treatment up to 300 degrees C and then decreasing to 95 nm at 400 degrees C. The in situ time-resolved data show that the average pore diameter decreases as a function of immersion time in SBF as calcium phosphates grow on the glass surfaces. Over the same period, Bragg peaks indicative of tricalcium phosphate were evident after only 1 h of immersion, and hydroxycarbonate apatite was also seen later. The hydroxycarbonate apatite appears to have preferred orientation in the (h,k,0) direction.

Relevance:

30.00%

Publisher:

Abstract:

Automatically generating maps of a measured variable of interest can be problematic. In this work we focus on the monitoring network context, where observations are collected and reported by a network of sensors and are then transformed into interpolated maps for use in decision making. Using traditional geostatistical methods, estimating the covariance structure of data collected in an emergency situation can be difficult. Variogram determination, whether by method-of-moments estimators or by maximum likelihood, is very sensitive to extreme values. Even when a monitoring network is in a routine mode of operation, sensors can sporadically malfunction and report extreme values. If such extreme data destabilise the model, causing the covariance structure of the observed data to be incorrectly estimated, the generated maps will be of little value, and the uncertainty estimates in particular will be misleading. Marchant and Lark [2007] propose a REML estimator for the covariance, which is shown to work on small data sets with manual selection of the damping parameter in the robust likelihood. We show how this can be extended to the treatment of large data sets, together with an automated approach to all parameter estimation. The projected process kriging framework of Ingram et al. [2007] is extended to allow the use of robust likelihood functions, including the two-component Gaussian and the Huber function. We show how our algorithm is further refined to reduce the computational complexity while minimising any loss of information. To demonstrate the benefits of this method, we use data collected from radiation monitoring networks across Europe. We compare our results with those obtained from traditional kriging methodologies, including comparisons with Box-Cox transformations of the data. We discuss the issue of whether to treat or ignore extreme values, making the distinction between robust methods, which ignore outliers, and transformation methods, which treat them as part of the (transformed) process. Using a case study based on an extreme radiological event over a large area, we show how radiation data collected from monitoring networks can be analysed automatically and then used to generate reliable maps to inform decision making. We show the limitations of the methods and discuss potential extensions to remedy these.
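
A minimal sketch (Python) of the robustification idea: the Huber function caps the influence of outlying observations, shown here for a simple location estimate by iteratively reweighted averaging. The damping parameter and the IRLS formulation are textbook assumptions, not the paper's exact REML/projected-process algorithm.

import numpy as np

def huber_weight(residual, c=1.345):
    """IRLS weight from the Huber function: 1 inside +/-c, c/|r| outside."""
    r = np.abs(residual)
    return np.where(r <= c, 1.0, c / r)

def robust_mean(y, c=1.345, iters=20):
    """Iteratively reweighted robust location estimate of y."""
    mu = np.median(y)
    for _ in range(iters):
        scale = np.median(np.abs(y - mu)) / 0.6745 + 1e-12  # robust scale (MAD)
        w = huber_weight((y - mu) / scale, c)
        mu = np.sum(w * y) / np.sum(w)
    return mu

# A background radiation level with one malfunctioning sensor (hypothetical):
y = np.array([0.11, 0.12, 0.10, 0.13, 0.11, 9.99])    # uSv/h, last value spurious
print(np.mean(y), robust_mean(y))                      # plain mean vs robust mean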

Relevance:

30.00%

Publisher:

Abstract:

Saxicolous lichen vegetation on Ordovician rock at the mouth of the River Dovey, South Merionethshire, is examined in relation to aspect, slope angle, light intensity, rock porosity, rock microtopography and rock stability. A number of characteristic groups of species are recognized. The environmental factors measured are discussed in some detail. In addition, the wide tolerance of most saxicolous species is emphasized.

Relevance:

30.00%

Publisher:

Abstract:

Computer-integrated monitoring is a very large area of engineering in which on-line, real-time data acquisition with the aid of sensors solves many problems in the manufacturing industry that the old method of data logging and graphical analysis could not. The raw data collected this way is, however, useless in the absence of a proper computerized management system. In the past, transferring data between the management and shop-floor processes has been impossible unless all the computers in the system were totally compatible with each other, which limits the efficiency of such systems because they are governed by the limitations of the computers. General Motors of the U.S.A. have recently started research on a new standard called the Manufacturing Automation Protocol (MAP), which is expected to allow data transfer between different types of computers; it is still at an early stage of development and is currently very expensive. This research programme shows how a shop-floor data acquisition system and a complete management system on entirely different computers can be integrated to form a single system, achieving data transfer communications using a cheaper yet superior alternative to MAP. Standard communication character sets and hardware such as ASCII and UARTs are used in this method, but the technique is powerful enough that totally incompatible computers are shown to run different programs (in different languages) simultaneously, and yet receive data from each other and process it in their own CPUs with no human intervention.
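
A minimal sketch (Python) of the kind of machine-independent exchange the thesis describes: data framed as plain ASCII with a checksum, which any UART on any computer can carry. The STX/ETX frame layout and checksum rule are common conventions assumed here for illustration, not necessarily those the thesis adopted.

STX, ETX = "\x02", "\x03"

def encode(fields):
    """Frame a list of values as ASCII: STX payload ETX two-hex-digit checksum."""
    payload = ",".join(str(f) for f in fields)
    checksum = sum(payload.encode("ascii")) % 256
    return f"{STX}{payload}{ETX}{checksum:02X}"

def decode(frame):
    """Validate and split a frame produced by encode()."""
    if frame[0] != STX or frame[-3] != ETX:
        raise ValueError("bad framing")
    payload, checksum = frame[1:-3], int(frame[-2:], 16)
    if sum(payload.encode("ascii")) % 256 != checksum:
        raise ValueError("checksum mismatch")
    return payload.split(",")

# A shop-floor reading travels as bytes any UART on any machine can carry:
frame = encode(["SPINDLE_TEMP", 71.5, "C"])
print(frame.encode("ascii"), "->", decode(frame))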

Relevance:

30.00%

Publisher:

Abstract:

Task classification is introduced as a method for the evaluation of monitoring behaviour in different task situations. On the basis of an analysis of different monitoring tasks, a task classification system comprising four task 'dimensions' is proposed. The perceptual speed and flexibility of closure categories, which are identified with signal discrimination type, comprise the principal dimension in this taxonomy, the others being sense modality, the time course of events, and source complexity. It is also proposed that decision theory provides the most complete method for the analysis of performance in monitoring tasks. Several different aspects of decision theory in relation to monitoring behaviour are described, and a method is outlined whereby both accuracy and latency measures of performance may be analysed within the same decision theory framework. Eight experiments and an organizational study are reported. The results show that a distinction can be made between the perceptual efficiency (sensitivity) of a monitor and his criterial level of response, and that in most monitoring situations there is no decrement in efficiency over the work period, but an increase in the strictness of the response criterion. The range of tasks exhibiting either or both of these performance trends can be specified within the task classification system. In particular, it is shown that a sensitivity decrement is only obtained for 'speed' tasks with a high stimulation rate. A distinctive feature of 'speed' tasks is that target detection requires the discrimination of a change in a stimulus relative to preceding stimuli, whereas in 'closure' tasks the information required for the discrimination of targets is presented at the same point in time. In the final study, the specification of tasks yielding sensitivity decrements is shown to be consistent with a task classification analysis of the monitoring literature. It is also demonstrated that the signal type dimension has a major influence on the consistency of individual differences in performance in different tasks. The results provide an empirical validation for the 'speed' and 'closure' categories, and suggest that individual differences are not completely task-specific but depend on the demands common to different tasks. Task classification is therefore shown to enable improved generalizations to be made of the factors affecting (1) performance trends over time and (2) the consistency of performance in different tasks. A decision theory analysis of response latencies is shown to support the view that criterion shifts are obtained in some tasks, while sensitivity shifts are obtained in others. The results of a psychophysiological study also suggest that evoked potential latency measures may provide temporal correlates of criterion shifts in monitoring tasks. Among other results, the finding that the latencies of negative responses do not increase over time is taken to invalidate arousal-based theories of performance trends over a work period; an interpretation in terms of expectancy provides a more reliable explanation of criterion shifts. Although the mechanisms underlying the sensitivity decrement are not completely clear, the results rule out 'unitary' theories such as observing response and coupling theory. It is suggested that an interpretation in terms of the memory and data limitations on information processing provides the most parsimonious explanation of all the results in the literature relating to the sensitivity decrement.
Task classification therefore enables the refinement and selection of theories of monitoring behaviour in terms of their reliability in generalizing predictions to a wide range of tasks. It is thus concluded that task classification and decision theory provide a reliable basis for the assessment and analysis of monitoring behaviour in different task situations.
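
The separation of sensitivity from response criterion can be made concrete with the standard signal detection computation below (Python). The hit and false-alarm rates are hypothetical numbers chosen to show the pattern the thesis reports: a roughly stable d' with a stricter criterion later in the work period.

from statistics import NormalDist

def sdt_indices(hit_rate, fa_rate):
    """Return (d', c): sensitivity and criterion from hit/false-alarm rates."""
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

early = sdt_indices(hit_rate=0.90, fa_rate=0.10)   # start of watch (hypothetical)
late = sdt_indices(hit_rate=0.80, fa_rate=0.05)    # end of watch (hypothetical)
print(f"early: d' = {early[0]:.2f}, c = {early[1]:.2f}")
print(f"late:  d' = {late[0]:.2f}, c = {late[1]:.2f}")  # similar d', stricter c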

Relevance:

30.00%

Publisher:

Abstract:

The technique of remote sensing provides a unique view of the earth's surface, and considerable areas can be surveyed in a short time. The aim of this project was to evaluate whether remote sensing, particularly using the Airborne Thematic Mapper (ATM) with its wide spectral range, was capable of monitoring landfill sites within an urban environment with the aid of image processing and Geographical Information Systems (GIS) methods. The regions under study were in the West Midlands conurbation and consisted of a large area in what is locally known as the Black Country, containing heavy industry intermingled with residential areas, and a large single active landfill in north Birmingham. When waste is collected in large volumes it decays and gives off pollutants. These pollutants, landfill gas and leachate (a liquid effluent), are known to be injurious to vegetation and can cause stress and death. Vegetation under stress can exhibit a physiological change detectable by the remote sensing systems used. The chemical and biological reactions that create the pollutants are exothermic, so the gas and leachate, if they leave the waste, can be warmer than their surroundings. Thermal imagery from the ATM (daylight and dawn) and thermal video were obtained and used to find thermal anomalies in the area under study. The results showed that vegetation stress is not a reliable indicator of landfill gas migration, as sites within an urban environment have land cover too complex for the effects to be identified. Gas emissions from two sites were successfully detected by all the thermal imagery, with the thermal ATM performing best. Although the results were somewhat disappointing, recent technical advances in the remote sensing systems used in this project would allow geo-registration of ATM imagery taken on different occasions and the elimination of the effects of solar insolation.

Relevance:

30.00%

Publisher:

Abstract:

Aerial photography was used to determine the land use in a test area of the Nigerian savanna in 1950 and 1972. Changes in land use were determined and correlated with accessibility, appropriate low-technology methods being used to make it easy to extend the investigation to other areas without incurring great expense. A test area of 750 sq km was chosen, located in Kaduna State of Nigeria. The geography of the area is summarised, together with the local knowledge essential for accurate photo-interpretation. A land use classification was devised and tested for use with medium-scale aerial photography of the savanna. The two sets of aerial photography at 1:25 000 scale were sampled using systematic dot grids. A dot density of 8.5 dots per sq km was calculated to give an acceptable estimate of land use. Problems of interpretation included gradation between categories, sample position uncertainty and personal bias. The results showed that in 22 years the amount of cultivated land in the test area had doubled, while there had been a corresponding decrease in the amount of uncultivated land, particularly woodland. The intensity of land use had generally increased. The distribution of land use changes was analysed and correlated with accessibility. Highly significant correlations were found for 1972 which had not existed in 1950, and changes in land use could also be correlated with accessibility. It was concluded that in the 22-year test period there had been intensification of land use, movement of human activity towards the main road, and a decrease in natural vegetation, particularly close to the road. The classification of land use and the dot grid method of survey were shown to be applicable to a savanna test area.
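
A minimal sketch (Python) of the dot-grid estimate: land-use proportions and their sampling errors from dot counts. The counts are hypothetical, sized so that 8.5 dots per sq km over 750 sq km gives about 6,375 points; the binomial error formula is an approximation, since a systematic grid is not a simple random sample.

import math

# Hypothetical dot counts by land-use category for one survey year:
counts = {"cultivated": 3_200, "woodland": 1_100, "grassland": 1_700,
          "settlement": 375}
n = sum(counts.values())

for category, k in counts.items():
    p = k / n                              # estimated areal proportion
    se = math.sqrt(p * (1 - p) / n)        # binomial approximation to the error
    print(f"{category:<11} {100 * p:5.1f}% +/- {100 * 1.96 * se:.1f}% (95% CI)")
print(f"total dots: {n} over 750 sq km = {n / 750:.1f} dots per sq km")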

Relevance:

30.00%

Publisher:

Abstract:

Ambulatory EEG recording enables patients with epilepsy and related disorders to be monitored in an unrestricted environment for prolonged periods. Attacks can therefore be recorded, and the EEG changes at the time can aid diagnosis. The relevant literature is reviewed and a study made of 250 clinical investigations; a study was also made of the artefacts encountered during ambulatory recording. Three-quarters of referrals were for distinguishing between epileptic and non-epileptic attacks. Over 60% of patients showed no abnormality during attacks. In comparison with the basic EEG, the ambulatory EEG provided about ten times as much information. A preliminary follow-up study showed that the results of ambulatory monitoring agreed with the final diagnosis in 8 of the 12 patients studied. Of 10 patients referred for monitoring of absence seizures, 8 showed abnormality during the basic EEG and 10 during the ambulatory EEG. Other patients were referred for sleep recording and to clarify the seizure type. An investigation into once-daily (OD) versus twice-daily (BD) administration of sodium valproate in patients with absence seizures showed that an OD regime was equally as effective as a BD regime. Circadian variations in spike-and-wave activity in patients on and off treatment were also examined. There was significant agreement between subjects on the time of occurrence of abnormality during sleep only; this pattern was not affected by treatment, nor was there any difference in the daily pattern of occurrence of abnormality between the two regimes. Overall, the findings suggested that ambulatory monitoring is a valuable tool in the diagnosis and treatment of epilepsy which, with careful planning and patient selection, could be used in any EEG department and would benefit a wide range of patients.

Relevance:

30.00%

Publisher:

Abstract:

Measurements of area summation for luminance-modulated stimuli are typically confounded by variations in sensitivity across the retina. Recently we conducted a detailed analysis of sensitivity across the visual field (Baldwin et al., 2012) and found it to be well described by a bilinear “witch’s hat” function: sensitivity declines rapidly over the first 8 cycles or so, and more gently thereafter. Here we multiplied luminance-modulated stimuli (4 c/deg gratings and “Swiss cheeses”) by the inverse of the witch’s hat function to compensate for the inhomogeneity. This revealed summation functions that were straight lines (on double-log axes) with a slope of -1/4 extending to ≥33 cycles, demonstrating fourth-root summation of contrast over a wider area than has previously been reported for the central retina. Fourth-root summation is typically attributed to probability summation, but recent studies have rejected that interpretation in favour of a noisy energy model that performs local square-law transduction of the signal, adds noise at each location of the target, and then sums over the signal area. Modelling shows our results to be consistent with a wide-field application of such a contrast integrator. We reject a probability summation model, a quadratic model and a matched-template model of our results under the assumptions of signal detection theory. We also reject the high-threshold theory of contrast detection under the assumption of probability summation over area.
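
The fourth-root law falls out of the noisy energy model in a few lines; a sketch of the argument is given below (LaTeX), with the notation (contrast c, area A in stimulus locations) assumed here for illustration.

% Sketch: noisy energy summation predicts fourth-root area summation.
% Square-law transduction and independent noise at each of A locations:
\[
  r \;=\; \sum_{i=1}^{A}\bigl(c^{2} + \varepsilon_i\bigr),
  \qquad
  \mathbb{E}[r] \propto c^{2}A,
  \qquad
  \operatorname{sd}(r) \propto \sqrt{A}.
\]
% Holding performance fixed (constant d') at threshold:
\[
  d' \;\propto\; \frac{c^{2}A}{\sqrt{A}} \;=\; c^{2}\sqrt{A}
  \quad\Longrightarrow\quad
  c_{\mathrm{thresh}} \propto A^{-1/4},
\]
% i.e. a straight line of slope -1/4 on double-log axes, as observed.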

Relevance:

30.00%

Publisher:

Abstract:

The thermal activation of a silica-stabilized γ-alumina impacts positively on the oxidative dehydrogenation (ODH) of ethylbenzene (EB) to styrene (ST). A systematic thermal study reveals that the transition from γ-alumina into transitional phases at 1050 °C leads to an optimal enhancement of both conversion and selectivity under pseudo-steady-state conditions, where active and selective coke have been deposited. The effect is observed in the reaction temperature range of 450-475 °C at the given operating conditions, resulting in the highest ST yield, while at 425 °C the effect is lost owing to incomplete O2 conversion. The increase in conversion is ascribed to the improvement in ST selectivity, which makes more O2 available for the main ODH reaction. The fresh aluminas and the catalytically active carbon deposits on the spent catalysts were characterized by gas adsorption (N2 and Ar), acidity evaluation by NH3-TPD and pyridine adsorption monitored by FTIR, thermal and elemental analyses, solubility in CH2Cl2, and MALDI-TOF, to correlate the properties of both phases with the enhancement in ST selectivity. The increase in selectivity is interpreted in terms of the lower reactivity of the carbon deposits, which diminishes COx formation. The site requirements of the optimal catalyst for creating the more selective coke are related to a higher density of Lewis sites per unit surface area; no mixed Si-Al Brønsted sites are formed, and the acid strength of the Lewis sites formed is weaker than that of the bare alumina.