854 results for GLUCOSE MONITORING-SYSTEM
Abstract:
We describe, for the first time, the use of hydrogel-forming microneedle (MN) arrays for minimally invasive extraction and quantification of drug substances and glucose from skin in vitro and in vivo. MN prepared from aqueous blends of hydrolysed poly(methyl-vinylether-co-maleic anhydride) (11.1% w/w) and poly(ethylene glycol) 10,000 Da (5.6% w/w) and crosslinked by esterification swelled upon skin insertion by uptake of fluid. Post-removal, theophylline and caffeine were extracted from MN and determined using HPLC, with glucose quantified using a proprietary kit. In vitro studies using excised neonatal porcine skin bathed on the underside by physiologically relevant analyte concentrations showed rapid (5 min) analyte uptake. For example, mean concentrations of 0.16 μg/mL and 0.85 μg/mL, respectively, were detected for the lowest (5 μg/mL) and highest (35 μg/mL) Franz cell concentrations of theophylline after 5 min insertion. A mean concentration of 0.10 μg/mL was obtained by extraction of MN inserted for 5 min into skin bathed with 5 μg/mL caffeine, while the mean concentration obtained by extraction of MN inserted into skin bathed with 15 μg/mL caffeine was 0.33 μg/mL. The mean detected glucose concentration after 5 min insertion into skin bathed with 4 mmol/L glucose was 19.46 nmol/L. The highest theophylline concentration detected following extraction from a hydrogel-forming MN inserted for 1 h into the skin of a rat dosed orally with 10 mg/kg was 0.363 μg/mL, whilst a maximum concentration of 0.063 μg/mL was detected following extraction from a MN inserted for 1 h into the skin of a rat dosed with 5 mg/kg theophylline. In human volunteers, the highest mean concentration of caffeine detected using MN was 91.31 μg/mL over the period from 1 to 2 h post-consumption of 100 mg Proplus® tablets. The highest mean blood glucose level was 7.89 mmol/L, detected 1 h following ingestion of 75 g of glucose, while the highest mean glucose concentration extracted from MN was 4.29 nmol/L, detected after 3 h of skin insertion in human volunteers. Whilst not directly correlated, concentrations extracted from MN were clearly indicative of trends in blood in both rats and human volunteers. This work strongly illustrates the potential of hydrogel-forming MN in minimally invasive patient monitoring and diagnosis. Further studies are now ongoing to reduce clinical insertion times and develop mathematical algorithms enabling determination of blood levels directly from MN measurements.
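The closing remark about algorithms for inferring blood levels from MN measurements can be illustrated with a minimal sketch. The linear calibration below is purely hypothetical: the paired values are invented for illustration and do not come from the study, and the authors' intended algorithms are not specified in the abstract.

```python
import numpy as np

# Hypothetical paired measurements (invented for illustration only):
# glucose extracted from MN arrays vs. the corresponding blood glucose level.
mn_extracted = np.array([0.9, 1.6, 2.4, 3.2, 4.3])   # assumed MN-extracted values
blood_glucose = np.array([3.9, 5.0, 6.1, 7.0, 7.9])  # assumed paired blood values, mmol/L

# Fit a simple linear calibration: blood ≈ a * MN + b (least squares).
a, b = np.polyfit(mn_extracted, blood_glucose, deg=1)

def estimate_blood_glucose(mn_value: float) -> float:
    """Estimate a blood glucose level from a new MN-extracted measurement."""
    return a * mn_value + b

print(f"estimated blood glucose: {estimate_blood_glucose(2.0):.2f} mmol/L")
```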
Abstract:
Masonry arch bridges are one of the oldest forms of bridge construction and have been around for thousands of years. Brick and stone arch bridges have proven to be highly durable, as most of them have remained serviceable after hundreds of years. In contrast, many bridges built of modern materials have required extensive repair and strengthening after being in service for a relatively short part of their design life. This paper describes the structural monitoring of a novel flexible concrete arch known as FlexiArch™. This bridge system can be transported as a flat-pack and formed into an arch in situ by means of a flexible polymeric membrane. The system has been developed under a Knowledge Transfer Partnership between Queen’s University Belfast (QUB) and Macrete Ltd. Tievenameena Bridge in Northern Ireland was a replacement bridge for the Northern Ireland Roads Service and was monitored under different axle loadings using a range of sensors, including discrete fiber optic Bragg gratings, to measure the change in strain in the arch ring under live loading. This paper also discusses the results of a laboratory model study carried out at QUB, in which a scaled arch system was loaded with a simulated moving axle. Various techniques were used to monitor the arch under the moving axle load, with particular emphasis on the interaction of the arch ring and engineered backfill.
Abstract:
In this study, the implementation of an optical accelerometer unit based on fiber Bragg gratings, suitable for monitoring structures with frequencies up to 45 Hz, is reported. The developed optical system was used to estimate the eigenfrequencies of a steel footbridge, with a total length of 300 m, over the Sao Pedro Creek, located at the University of Aveiro Campus, in Portugal. The acceleration records measured with this solution are compared with those obtained by traditional commercial electronic devices, revealing a root-mean-square error of 2.53 × 10⁻⁵ g.
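For context, the root-mean-square error quoted above is a direct comparison metric between two synchronized acceleration records. The sketch below shows how such a figure could be computed; the array names and sample values are assumptions, not data from the study.

```python
import numpy as np

# Two synchronized acceleration records in g (illustrative values only):
# one from the FBG-based accelerometer, one from a reference electronic device.
fbg_record = np.array([0.0010, -0.0032, 0.0021, 0.0005, -0.0014])
reference_record = np.array([0.0011, -0.0030, 0.0020, 0.0004, -0.0015])

# Root-mean-square error between the two records.
rmse = np.sqrt(np.mean((fbg_record - reference_record) ** 2))
print(f"RMSE = {rmse:.2e} g")
```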
Abstract:
Wireless sensor networks (WSNs) have attracted growing interest in the last decade as an infrastructure to support a diversity of ubiquitous computing and cyber-physical systems. However, most research work has focused on protocols or on specific applications. As a result, there remains a clear lack of effective, feasible and usable system architectures that address both functional and non-functional requirements in an integrated fashion. In this paper, we outline the EMMON system architecture for large-scale, dense, real-time embedded monitoring. EMMON provides a hierarchical communication architecture together with integrated middleware and command-and-control software. It has been designed to use standard, commercially available technologies, while maintaining as much flexibility as possible to meet specific application requirements. The EMMON architecture has been validated through extensive simulation and experimental evaluation, including a 300+ node test-bed, which is, to the best of our knowledge, the largest single-site WSN test-bed in Europe to date.
Abstract:
Modern computer systems are plagued with stability and security problems: applications lose data, web servers are hacked, and systems crash under heavy load. Many of these problems, or anomalies, arise from rare program behavior caused by attacks or errors. A substantial percentage of web-based attacks are due to buffer overflows. Many methods have been devised to detect and prevent anomalous situations that arise from buffer overflows. The current state of the art in anomaly detection systems is relatively primitive and depends mainly on static code checking to take care of buffer overflow attacks. For protection, Stack Guards and Heap Guards are also widely used. This dissertation proposes an anomaly detection system based on the frequencies of system calls in the system call trace. System call traces represented as frequency sequences are profiled using sequence sets. A sequence set is identified by the starting sequence and the frequencies of specific system calls. The deviation of the current input sequence from the corresponding normal profile in the frequency pattern of system calls is computed and expressed as an anomaly score. A simple Bayesian model is used for accurate detection. Experimental results are reported which show that the frequency of system calls, represented using sequence sets, captures the normal behavior of programs under normal conditions of usage. This captured behavior allows the system to detect anomalies with a low rate of false positives. Data are presented which show that a Bayesian network on frequency variations responds effectively to induced buffer overflows. It can also help administrators to detect deviations in program flow introduced by errors.
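As a rough illustration of the frequency-based idea (not the dissertation's actual profiling or scoring method, which is only summarized above), the sketch below builds a normal system-call frequency profile, scores a new trace by its total frequency deviation, and flags it against an assumed threshold.

```python
from collections import Counter

def frequency_profile(trace):
    """Relative frequency of each system call in a trace."""
    counts = Counter(trace)
    total = sum(counts.values())
    return {call: n / total for call, n in counts.items()}

def anomaly_score(trace, normal_profile):
    """Sum of absolute deviations between the trace's call frequencies and the
    normal profile (an assumed, deliberately simple distance measure)."""
    observed = frequency_profile(trace)
    calls = set(observed) | set(normal_profile)
    return sum(abs(observed.get(c, 0.0) - normal_profile.get(c, 0.0)) for c in calls)

# Illustrative traces; a real profile would be built from many normal runs.
normal_profile = frequency_profile(["open", "read", "read", "write", "close"] * 20)
suspect_trace = ["open"] + ["read"] * 3 + ["mmap"] * 40 + ["close"]  # unusual burst

score = anomaly_score(suspect_trace, normal_profile)
print(f"anomaly score = {score:.2f}")
if score > 0.5:  # assumed threshold
    print("trace flagged as anomalous")
```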
Abstract:
The long-term stability, high accuracy, all-weather capability, high vertical resolution, and global coverage of Global Navigation Satellite System (GNSS) radio occultation (RO) suggest it as a promising tool for global monitoring of atmospheric temperature change. With the aim of investigating and quantifying how well a GNSS RO observing system is able to detect climate trends, we are currently performing a (climate) observing system simulation experiment over the 25-year period 2001 to 2025, which involves quasi-realistic modeling of the neutral atmosphere and the ionosphere. We carried out two climate simulations with the general circulation model MAECHAM5 (Middle Atmosphere European Centre/Hamburg Model Version 5) of the MPI-M Hamburg, covering the period 2001–2025: one control run with natural variability only and one run also including anthropogenic forcings due to greenhouse gases, sulfate aerosols, and tropospheric ozone. On this basis, we perform quasi-realistic simulations of RO observables for a small GNSS receiver constellation (six satellites), state-of-the-art data processing for atmospheric profile retrieval, and a statistical analysis of temperature trends in both the “observed” climatology and the “true” climatology. Here we describe the setup of the experiment and results from a test bed study conducted to obtain a basic set of realistic estimates of observational errors (instrument- and retrieval-processing-related errors) and sampling errors (due to spatio-temporal undersampling). The test bed results, obtained for a typical summer season and compared to the climatic 2001–2025 trends from the MAECHAM5 simulation including anthropogenic forcing, were found encouraging for performing the full 25-year experiment. They indicated that observational and sampling errors (both contributing about 0.2 K) are consistent with recent estimates of these errors from real RO data and that they should be sufficiently small for monitoring expected temperature trends in the global atmosphere over the next 10 to 20 years in most regions of the upper troposphere and lower stratosphere (UTLS). Inspection of the MAECHAM5 trends in different RO-accessible atmospheric parameters (microwave refractivity and pressure/geopotential height in addition to temperature) indicates complementary climate change sensitivity in different regions of the UTLS, so that optimized climate monitoring shall combine information from all climatic key variables retrievable from GNSS RO data.
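As a side note on the quoted error magnitudes: if the observational and sampling errors (each around 0.2 K) were independent, they would combine roughly in quadrature. That independence is an assumption made here for illustration, not a statement from the study.

```python
# Assumed: independent error components combine in quadrature.
observational_error = 0.2  # K, from the abstract
sampling_error = 0.2       # K, from the abstract
combined = (observational_error**2 + sampling_error**2) ** 0.5
print(f"combined error ≈ {combined:.2f} K")  # ≈ 0.28 K
```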
Abstract:
This paper describes an assessment of the nitrogen and phosphorus dynamics of the River Kennet in the south east of England. The Kennet catchment (1200 km²) is a predominantly groundwater-fed river impacted by agricultural and sewage sources of nutrient (nitrogen and phosphorus) pollution. The results from a suite of simulation models are integrated to assess the key spatial and temporal variations in the nitrogen (N) and phosphorus (P) chemistry, and the influence of changes in phosphorus inputs from a Sewage Treatment Works on the macrophyte and epiphyte growth patterns. The models used are the Export Coefficient model, the Integrated Nitrogen in Catchments model, and a new model of in-stream phosphorus and macrophyte dynamics: the 'Kennet' model. The paper concludes with a discussion of the present state of knowledge regarding water quality functioning, future research needs regarding environmental modelling, and the use of models as management tools for large, nutrient-impacted riverine systems.
Abstract:
The reduction of indigo (dispersed in water) to leuco-indigo (dissolved in water) is an important industrial process and is investigated here for the case of glucose as an environmentally benign reducing agent. In order to quantitatively follow the formation of leuco-indigo, two approaches based on (i) rotating disk voltammetry and (ii) sonovoltammetry are developed. Leuco-indigo, once formed in alkaline solution, is readily monitored at a glassy carbon electrode in the mass transport limit employing hydrodynamic voltammetry. The presence of power ultrasound further improves the leuco-indigo determination due to additional agitation and homogenization effects. While inactive at room temperature, glucose readily reduces indigo in alkaline media at 65 °C. In the presence of excess glucose, a surface dissolution kinetics limited process is proposed, following the rate law dη(leuco-indigo)/dt = k · c(OH⁻) · S(indigo), where η(leuco-indigo) is the amount of leuco-indigo formed, k = 4.1 × 10⁻⁹ m s⁻¹ (at 65 °C, assuming spherical particles of 1 μm diameter) is the heterogeneous dissolution rate constant, c(OH⁻) is the concentration of hydroxide, and S(indigo) is the reactive surface area. The activation energy for this process in aqueous 0.2 M NaOH is E_A = 64 kJ mol⁻¹, consistent with a considerable temperature effect. The redox mediator 1,8-dihydroxyanthraquinone is shown to significantly enhance the reaction rate by catalysing the electron transfer between glucose and solid indigo particles.
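Purely for illustration, the sketch below evaluates the stated rate law with assumed inputs: the rate constant k is the value quoted above, while the hydroxide concentration and reactive surface area are invented example values.

```python
# Rate law from the abstract: d(eta_leuco_indigo)/dt = k * c_OH * S_indigo
k = 4.1e-9           # heterogeneous dissolution rate constant, m/s (65 °C)
c_OH = 0.2 * 1000.0  # hydroxide concentration: 0.2 mol/L expressed in mol/m^3
S_indigo = 0.05      # assumed reactive surface area of the indigo particles, m^2

rate = k * c_OH * S_indigo  # mol of leuco-indigo formed per second
print(f"d(eta)/dt ≈ {rate:.2e} mol/s")
```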
Abstract:
Resource monitoring in distributed systems is required to understand the 'health' of the overall system and to help identify particular problems, such as dysfunctional hardware or faulty system or application software. Desirable characteristics for monitoring systems are the ability to connect to any number of different types of monitoring agents and to provide different views of the system, based on a client's particular preferences. This paper outlines and discusses the ongoing activities within the GridRM wide-area resource-monitoring project.
Abstract:
Weekly monitoring of profiles of student performances on formative and summative coursework throughout the year can be used to quickly identify those who need additional help, possibly due to acute and sudden-onset problems. Such an early-warning system can help retention, but also assist students in overcoming problems early on, thus helping them fulfil their potential in the long run. We have developed a simple approach for the automatic monitoring of student mark profiles for individual modules, which we intend to trial in the near future. Its ease of implementation means that it can be used for very large cohorts with little additional effort when marks are already collected and recorded on a spreadsheet.
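As a purely illustrative sketch (the authors' actual approach is not detailed in the abstract), the snippet below reads weekly mark profiles from a spreadsheet exported to CSV and flags students whose recent average has dropped sharply relative to their earlier marks; the column layout and threshold are assumptions.

```python
import csv

DROP_THRESHOLD = 15.0  # assumed: flag a drop of 15+ percentage points

def flag_struggling_students(csv_path):
    """Flag students whose mean mark over the last two weeks falls well below
    their mean over earlier weeks. Assumed row layout:
    student_id, week1_mark, week2_mark, ..."""
    flagged = []
    with open(csv_path, newline="") as f:
        for row in csv.reader(f):
            student, marks = row[0], [float(m) for m in row[1:] if m.strip()]
            if len(marks) < 4:
                continue  # not enough history to compare yet
            earlier = sum(marks[:-2]) / len(marks[:-2])
            recent = sum(marks[-2:]) / 2
            if earlier - recent >= DROP_THRESHOLD:
                flagged.append(student)
    return flagged

# Example usage: print(flag_struggling_students("module_marks.csv"))
```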
Abstract:
Limnologists had an early preoccupation with lake classification. It gave a necessary structure to the many chemical and biological observations that were beginning to form the basis of one of the earliest truly environmental sciences. August Thienemann was the doyen of such classifiers, and his concept, with Einar Naumann, of oligotrophic and eutrophic lakes remains central to the world-view that limnologists still have. Classification fell into disrepute, however, as it became clear that there would always be lakes that deviated from the prescriptions that the classifiers made for them. Continua became the de rigueur concept and lakes were seen as varying along many chemical, biological and geographic axes. Modern limnologists are comfortable with this concept. That all lakes are different guarantees an indefinite future for limnological research. For those who manage lakes and the landscapes in which they are set, however, it is not very useful. There may be as many as 300,000 standing water bodies in England and Wales alone, and maybe as many again in Scotland. More than 80,000 are sizable (>1 ha). Some classification scheme to cope with these numbers is needed and, as human impacts on them increase, a system of assessing and monitoring change must be built into such a scheme. Although ways of classifying and monitoring running waters are well developed in the UK, the same is not true of standing waters. Sufficient understanding of what determines the nature and functioning of lakes exists to create a system which has intellectual credibility as well as practical usefulness. This paper outlines the thinking behind a system which will be workable on a north European basis and presents some early results.