10 results for Surveillance

in Aston University Research Archive


Relevance: 20.00%

Abstract:

The broad objectives of the work were to develop standard methods for the routine biological surveillance of river water quality using the non-planktonic algae. Studies on sampling methodology indicated that natural substrata should be sampled directly wherever possible, but for routine purposes only a semi-quantitative approach was found to be feasible. Artificial substrata were considered useful for sample collection in deeper waters, and of three different types tested, polythene strips were selected for further investigation, essentially on grounds of practicality. These were tested in the deeper reaches of a wide range of river types and water qualities: 26 pool sites in 14 different rivers were studied over a period of 9 months. At each site, the assemblages developing on 3 strips following a 4-week or, less commonly, a 3-week immersion period were analysed quantitatively. Where possible, the natural substrata were also sampled semi-quantitatively at each site and at a nearby riffle. The results of this survey were very fragmentary: many strips failed to yield useful data, and the results were often difficult to interpret and of limited value for water quality surveillance purposes. In one river, the Churnet, the natural substrata at 14 riffle sites were sampled semi-quantitatively on 14 occasions at intervals of 4 weeks. In this survey, the results were more readily interpreted in relation to water quality, and no special data processing was found to be necessary or helpful. Further studies on the filamentous green alga Cladophora showed that it may have some value as a bioaccumulation indicator for metals, and as a bioassay organism for assessing the algal growth-promoting potential of natural river waters.

Relevance: 20.00%

Abstract:

Some of the factors affecting colonisation of a colonisation sampler, the Standard Aufwuchs Unit (S. Auf. U.), were investigated, namely immersion period, whether anchored on the bottom or suspended, and the influence of riffles. It was concluded that a four-week immersion period was best. S. Auf. U. anchored on the bottom collected both more taxa and more individuals than suspended ones. Fewer taxa but more individuals colonised S. Auf. U. in the potamon zone than in the rhithron zone, with a consequent reduction in the values of pollution indexes and diversity. It was concluded that a completely different scoring system was necessary for lowland rivers. Macroinvertebrates colonising S. Auf. U. in simulated streams, lowland rivers and the R. Churnet reflected water quality. A variety of pollution and diversity indexes were applied to results from lowland river sites. Instead of these, it was recommended that an abbreviated species/relative-abundance list be used to summarise biological data for lowland river surveillance. An intensive study of gastropod populations was made in simulated streams. Lymnaea peregra increased in abundance whereas Potamopyrgus jenkinsi decreased with increasing sewage effluent concentration. No clear-cut differences in reproduction were observed. The presence or absence of eight gastropod taxa was compared with concentrations of various pollutants in lowland rivers. On the basis of all field work, it appeared that ammonia, nitrite, copper and zinc were the toxicants most likely to be detrimental to gastropods, and that P. jenkinsi and Theodoxus fluviatilis were the least tolerant taxa. 96-h acute toxicity tests on P. jenkinsi with ammonia and copper were carried out in a flow-through system after a variety of static range-finding tests. P. jenkinsi was intolerant of both toxicants compared with reports for other taxa, and the results suggested that these toxicants would affect the distribution of this species in the field.
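
The indexes themselves are not listed in the abstract; as a purely illustrative sketch, the Python snippet below computes the widely used Shannon-Wiener diversity index for a hypothetical S. Auf. U. sample and prints an abbreviated species/relative-abundance list of the kind recommended above. All taxa counts are invented for illustration.

    import math

    def shannon_diversity(counts):
        """Shannon-Wiener index H' = -sum(p_i * ln p_i) over taxa."""
        total = sum(counts.values())
        return -sum((n / total) * math.log(n / total)
                    for n in counts.values() if n > 0)

    # Hypothetical colonisation sample from one S. Auf. U. (taxon: individuals).
    sample = {"Lymnaea peregra": 34, "Potamopyrgus jenkinsi": 5,
              "Asellus aquaticus": 120, "Chironomidae": 310}

    print(f"H' = {shannon_diversity(sample):.2f}")

    # Abbreviated species/relative-abundance list: taxa ranked by share.
    total = sum(sample.values())
    for taxon, n in sorted(sample.items(), key=lambda kv: -kv[1]):
        print(f"{taxon:25s} {100 * n / total:5.1f}%")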

Relevance: 20.00%

Abstract:

This thesis describes the development of an operational river basin water resources information management system. The river or drainage basin is the fundamental unit of the system, both in the modelling and prediction of hydrological processes and in the monitoring of the effect of catchment management policies. A primary concern of the study is the collection of sufficient, and sufficiently accurate, information to model hydrological processes. Remote sensing, in combination with conventional point-source measurement, can be a valuable source of information, but is often overlooked by hydrologists due to the cost of acquisition and processing. This thesis describes a number of cost-effective methods of acquiring remotely sensed imagery, from airborne video survey to real-time ingestion of meteorological satellite data. Inexpensive micro-computer systems and peripherals are used throughout to process and manipulate the data. Spatial information systems provide a means of integrating these data with topographic and thematic cartographic data, and with historical records. For the system to have any real potential, the data must be stored in a readily accessible format and be easily manipulated within the database. The design of efficient man-machine interfaces and the use of software engineering methodologies are therefore included as a major part of the design of the system. The use of low-cost technologies, from micro-computers to video cameras, enables the introduction of water resources information management systems into developing countries, where the potential benefits are greatest.
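
The abstract does not describe the actual database design; purely as a sketch of the "readily accessible, easily manipulated" storage it calls for, the Python snippet below sets up a minimal relational store for basin observations using the standard sqlite3 module. Station names, coordinates and variables are hypothetical.

    import sqlite3

    conn = sqlite3.connect("basin.db")
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS station (
        station_id INTEGER PRIMARY KEY,
        name       TEXT NOT NULL,
        easting    REAL,   -- map coordinates, for overlay on cartographic data
        northing   REAL,
        river      TEXT
    );
    CREATE TABLE IF NOT EXISTS observation (
        station_id  INTEGER REFERENCES station(station_id),
        observed_at TEXT NOT NULL,   -- ISO 8601 timestamp
        variable    TEXT NOT NULL,   -- e.g. 'stage_m', 'rainfall_mm'
        value       REAL NOT NULL
    );
    CREATE INDEX IF NOT EXISTS idx_obs
        ON observation(station_id, variable, observed_at);
    """)

    # Point-source gauge readings and remotely sensed estimates share one
    # table, so both can be queried together and joined to map layers.
    conn.execute("INSERT INTO station VALUES (1, 'Example gauge', 411300.0, 339500.0, 'Example River')")
    conn.execute("INSERT INTO observation VALUES (1, '1989-06-01T09:00', 'stage_m', 0.42)")
    conn.commit()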

Relevance: 20.00%

Abstract:

T cell activation is the final step in a complex pathway through which pathogen-derived peptide fragments can elicit an immune response. For it to occur, peptides must form stable complexes with Major Histocompatibility Complex (MHC) molecules and be presented on the cell surface. Computational predictors of MHC binding are often used within in silico vaccine design pipelines. We have previously shown that, paradoxically, most bacterial proteins known experimentally to elicit an immune response in disease models are depleted in peptides predicted to bind to human MHC alleles. The results presented here, derived using software proven through benchmarking to be the most accurate currently available, show that vaccine antigens contain fewer predicted MHC-binding peptides than control bacterial proteins from almost all subcellular locations, with the exception of cell wall and some cytoplasmic proteins. This effect is too large to be explained by the undoubted imprecision of the software or by the amino acid composition of the antigens. Instead, we propose that pathogens have evolved under the influence of the host immune system so that surface proteins are depleted in potential MHC-binding peptides, and suggest that identifying proteins likely to contain a single immunodominant epitope may be a productive strategy for vaccine design.
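
As a sketch of the analysis pattern described above, the snippet below slides a 9-mer window along a protein sequence and counts peptides that a predictor scores as binders. Note that predict_ic50 is a placeholder for whichever benchmarked MHC-binding predictor is used, not an API from the paper, and the 500 nM cut-off is the conventional binding threshold, assumed here for illustration.

    def predicted_binders(sequence, predict_ic50, threshold_nm=500.0, k=9):
        """Count k-mer peptides predicted to bind MHC (IC50 below threshold)."""
        peptides = (sequence[i:i + k] for i in range(len(sequence) - k + 1))
        return sum(1 for p in peptides if predict_ic50(p) < threshold_nm)

    def binder_fraction(sequence, predict_ic50, k=9):
        """Fraction of all k-mers predicted to bind, so proteins of
        different lengths (antigens vs. controls) compare directly."""
        n_peptides = max(len(sequence) - k + 1, 1)
        return predicted_binders(sequence, predict_ic50) / n_peptides

    # The depletion claim is then a comparison of binder-fraction
    # distributions between known antigens and control proteins, e.g.
    # with a rank-based test such as Mann-Whitney U.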

Relevance: 20.00%

Abstract:

In recent years, we have witnessed the mushrooming of pro-democracy and protest movements, not only in the Arab world but also within Europe and the Americas. Such movements have ranged from popular upheavals, as in Tunisia and Egypt, to the organization of large-scale demonstrations against unpopular policies, as in Spain, Greece and Poland. What connects these different events are not only their democratic aspirations, but also their innovative forms of communication and organization through online means, which are sometimes considered to be outside of the State's control. At the same time, however, it has become increasingly apparent that countries are attempting to increase their understanding of, and control over, their citizens' actions in the digital sphere. This involves striving to develop surveillance instruments, control mechanisms and processes engineered to dominate the digital public sphere, which necessitates the assistance and support of private actors such as Internet intermediaries. Examples include the growing use of Internet surveillance technology with which online data traffic is analysed, and the extensive monitoring of social networks. Despite increased media attention, academic debate on the ambivalence of these technologies, mechanisms and techniques remains relatively limited, as is discussion of the involvement of corporate actors. The purpose of this edited volume is to reflect on how Internet-related technologies, mechanisms and techniques may be used as a means to enable expression, but also to restrict speech, manipulate public debate and govern global populaces.

Relevance: 20.00%

Abstract:

This paper looks at the issue of privacy and anonymity through the prism of Scott's concept of legibility, i.e. the desire of the state to obtain an ever more accurate mapping of its domain and of the actors within it. We argue that privacy was absent from village life in the past, and that it has arisen as a temporary phenomenon, born of the lack of appropriate technology to make all life in the city legible. Cities have been the loci of creativity for the major part of human civilisation, and there is something specific about the illegibility of cities which facilitates creativity and innovation. This leads to a consideration of semantic web technologies, Linked Data and the Internet of Things, which, by providing the technology to catalogue and classify all objects and ideas around us, unwittingly further this ever greater legibility. There is a danger that the over-description of a domain will lead to a loss of creativity and innovation. We conclude by arguing that our prime concern must be to preserve illegibility, because the survival of some form, any form, of civilisation depends upon it.

Relevance: 20.00%

Abstract:

Inspired by the human visual cognition mechanism, this paper first presents a scene classification method based on an improved standard model feature. Compared with state-of-the-art efforts in scene classification, the newly proposed method is more robust, more selective, and of lower complexity. These advantages are demonstrated by two sets of experiments, on our own database and on standard public ones. Furthermore, the occlusion and disorder problems that arise in scene classification for video surveillance are also studied for the first time in this paper. © 2010 IEEE.
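
The "standard model feature" refers to the HMAX-style hierarchy of alternating filtering and max-pooling stages; the paper's specific improvements are not reproduced here. The sketch below (assumptions: numpy only, a toy S1 Gabor stage, C1-style global max pooling, nearest-prototype classification) is meant only to make the pipeline's shape concrete.

    import numpy as np

    def gabor_kernel(size=11, theta=0.0, lam=5.0, sigma=3.0, gamma=0.5):
        """One S1 Gabor filter at orientation theta (zero-mean)."""
        half = size // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        xr = x * np.cos(theta) + y * np.sin(theta)
        yr = -x * np.sin(theta) + y * np.cos(theta)
        g = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2)) \
            * np.cos(2 * np.pi * xr / lam)
        return g - g.mean()

    def c1_features(image, n_orient=4, stride=8):
        """S1: filter at several orientations; C1: max over position."""
        feats = []
        for i in range(n_orient):
            k = gabor_kernel(theta=i * np.pi / n_orient)
            kh, kw = k.shape
            responses = [abs(np.sum(image[r:r + kh, c:c + kw] * k))
                         for r in range(0, image.shape[0] - kh + 1, stride)
                         for c in range(0, image.shape[1] - kw + 1, stride)]
            feats.append(max(responses))  # position-invariant C1 response
        return np.array(feats)

    def classify(image, prototypes):
        """Assign the label of the nearest class prototype in C1 space."""
        f = c1_features(image)
        return min(prototypes, key=lambda lbl: np.linalg.norm(f - prototypes[lbl]))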

Relevance: 20.00%

Abstract:

Lifelong surveillance is not cost-effective after endovascular aneurysm repair (EVAR), but is required to detect aortic complications which are fatal if untreated (type 1/3 endoleak, sac expansion, device migration). Aneurysm morphology determines the probability of aortic complications and therefore the need for surveillance, but existing analyses have proven incapable of identifying patients at sufficiently low risk to justify abandoning surveillance. This study aimed to improve the prediction of aortic complications through the application of machine-learning techniques. Patients undergoing EVAR at 2 centres were studied from 2004 to 2010. Aneurysm morphology had previously been studied to derive the SGVI Score for predicting aortic complications. Bayesian Neural Networks were designed using the same data to dichotomise patients into groups at low or high risk of aortic complications. Network training was performed only on patients treated at centre 1. External validation was performed by assessing network performance, independently of network training, on patients treated at centre 2. Discrimination was assessed by Kaplan-Meier analysis comparing aortic complications in predicted low-risk versus predicted high-risk patients. 761 patients aged 75 ± 7 years underwent EVAR at the 2 centres; mean follow-up was 36 ± 20 months. Neural networks were created incorporating neck angulation/length/diameter/volume; AAA diameter/area/volume/length/tortuosity; and common iliac tortuosity/diameter. A 19-feature network predicted aortic complications with excellent discrimination on external validation (5-year freedom from aortic complications in predicted low-risk vs predicted high-risk patients: 97.9% vs 63%; p < 0.0001). A Bayesian Neural Network algorithm can identify patients in whom it may be safe to abandon surveillance after EVAR. This proposal requires prospective study.
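
As an illustration of the study design only (fit on centre 1, dichotomise centre-2 patients by predicted risk), the sketch below substitutes a plain L2-regularised logistic model for the paper's Bayesian Neural Network and uses randomly generated stand-in data; none of it reproduces the study's dataset, features or algorithm.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def fit_logistic(X, y, lr=0.1, epochs=2000, l2=1e-2):
        """Gradient-descent logistic regression; the L2 penalty plays the
        role of a crude prior (a real BNN would integrate over weights)."""
        w, b = np.zeros(X.shape[1]), 0.0
        for _ in range(epochs):
            p = sigmoid(X @ w + b)
            w -= lr * (X.T @ (p - y) / len(y) + l2 * w)
            b -= lr * np.mean(p - y)
        return w, b

    # 19 stand-in morphological features per patient (neck, sac, iliac).
    X_c1, y_c1 = rng.normal(size=(500, 19)), rng.integers(0, 2, 500)
    X_c2 = rng.normal(size=(261, 19))         # external validation cohort

    w, b = fit_logistic(X_c1, y_c1)           # training: centre 1 only
    high_risk = sigmoid(X_c2 @ w + b) > 0.5   # dichotomised centre-2 groups
    # The two groups would then be compared by Kaplan-Meier freedom from
    # aortic complications, as in the study.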