948 results for Monitoring tool


Relevance: 30.00%

Publisher:

Abstract:

Therapeutic drug monitoring (TDM) aims to optimize treatment by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation currently represents the gold-standard TDM approach but requires computation assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user-friendly. Programs vary in complexity and might not fit all healthcare settings; each software tool must therefore be evaluated against the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including by non-expert users. Computer-assisted TDM is gaining interest and should further improve, especially in terms of information-system interfacing, user-friendliness, data storage capability and report generation.
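The a posteriori Bayesian adjustment the survey scores, estimating individual pharmacokinetic parameters from measured concentrations plus population priors and then recomputing the dose, can be sketched as a MAP fit for a one-compartment model. The model, priors, grid search and all numbers below are illustrative assumptions, not taken from any of the evaluated programs:

```python
import math

def map_estimate(dose, obs, priors, sigma=0.2):
    """MAP (maximum a posteriori) Bayesian fit of V and CL for a
    one-compartment IV bolus model, C(t) = (dose / V) * exp(-(CL / V) * t).
    obs is a list of (time_h, concentration) pairs; priors holds lognormal
    population means/SDs. All values here are illustrative only."""
    best, best_lp = None, -math.inf
    grid = [x / 100 for x in range(50, 201, 2)]      # 0.5x .. 2.0x the prior mean
    for V in (priors["V_mean"] * f for f in grid):
        for CL in (priors["CL_mean"] * f for f in grid):
            # log-prior (lognormal) plus log-likelihood (lognormal residuals)
            lp = (-0.5 * (math.log(V / priors["V_mean"]) / priors["V_sd"]) ** 2
                  - 0.5 * (math.log(CL / priors["CL_mean"]) / priors["CL_sd"]) ** 2)
            for t, c in obs:
                pred = (dose / V) * math.exp(-(CL / V) * t)
                lp -= 0.5 * (math.log(c / pred) / sigma) ** 2
            if lp > best_lp:
                best_lp, best = lp, (V, CL)
    return best

# a posteriori adjustment: fit the individual, then re-dose to a target trough
priors = {"V_mean": 20.0, "V_sd": 0.3, "CL_mean": 2.0, "CL_sd": 0.3}
V, CL = map_estimate(dose=500.0, obs=[(2.0, 18.0), (8.0, 9.0)], priors=priors)
k = CL / V                                  # elimination rate constant (1/h)
target_trough, tau = 5.0, 12.0              # mg/L at the end of a 12 h interval
new_dose = target_trough * V * math.exp(k * tau)
```

Real TDM programs use validated population models and proper optimisation (or non-parametric methods, as MM-USC*PACK© does); the grid search only illustrates the shape of the calculation.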


INTRODUCTION: Although long-term video-EEG monitoring (LVEM) is routinely used to investigate paroxysmal events, short-term video-EEG monitoring (SVEM) lasting <24 h is increasingly recognized as a cost-effective tool. However, since relatively few studies have addressed the yield of SVEM across diagnostic groups, we undertook the present study to investigate this aspect. METHODS: We retrospectively analyzed 226 consecutive SVEM recordings over 6 years. All patients were referred because routine EEGs were inconclusive. Patients were classified into 3 suspected diagnostic groups: (1) epileptic seizures, (2) psychogenic nonepileptic seizures (PNESs), and (3) other or undetermined diagnoses. We assessed recording lengths, interictal epileptiform discharges, epileptic seizures, PNESs, and the definitive diagnoses obtained after SVEM. RESULTS: The mean age was 34 (±18.7) years, and the median recording length was 18.6 h. Of the 226 patients, 127 were referred for suspected epilepsy: 73 had a diagnosis of epilepsy, none had a diagnosis of PNESs, and 54 had other or undetermined diagnoses post-SVEM. Of the 24 patients with pre-SVEM suspected PNESs, 1 had epilepsy, 12 had PNESs, and 11 had other or undetermined diagnoses. Of the 75 patients with other diagnoses pre-SVEM, 17 had epilepsy, 11 had PNESs, and 47 had other or undetermined diagnoses. After SVEM, 15 patients had definite diagnoses other than epilepsy or PNESs, while in 96 patients the diagnosis remained unclear. Overall, a definitive diagnosis could be reached in 129/226 (57%) patients. CONCLUSIONS: This study demonstrates that in nearly 3/5 of patients without a definitive diagnosis after routine EEG, SVEM allowed a diagnosis to be reached. This procedure should be encouraged in this setting, given its time-effectiveness compared with LVEM.


Imaging studies have shown reduced frontal lobe resources following total sleep deprivation (TSD). The anterior cingulate cortex (ACC) in the frontal region plays a role in performance monitoring and cognitive control; both error detection and response inhibition are impaired following sleep loss. Event-related potentials (ERPs) are an electrophysiological tool used to index the brain's response to stimuli and information processing. In the Flanker task, the error-related negativity (ERN) and error positivity (Pe) ERPs are elicited after erroneous button presses. In a Go/NoGo task, NoGo-N2 and NoGo-P3 ERPs are elicited during high-conflict stimulus processing. Research investigating the impact of sleep loss on ERPs during performance monitoring is equivocal, possibly due to differences in tasks, sample sizes and degrees of sleep loss. Based on the effects of sleep loss on frontal function and prior research, it was expected that the sleep deprivation group would have lower accuracy, slower reaction time and impaired remediation on performance monitoring tasks, along with attenuated and delayed stimulus- and response-locked ERPs. In the current study, 49 young adults (24 male) were screened to be healthy good sleepers and then randomly assigned to a sleep-deprived (n = 24) or rested control (n = 25) group. Participants slept in the laboratory on a baseline night, followed by a second night of sleep or wake. Flanker and Go/NoGo tasks were administered in a battery at 10:30 am (i.e., 27 hours awake for the sleep deprivation group) to measure performance monitoring. On the Flanker task, the sleep deprivation group was significantly slower than controls (ps < .05), but the groups did not differ on accuracy. No group differences were observed in post-error slowing, but a trend was observed for less remedial accuracy in the sleep-deprived group compared to controls (p = .09), suggesting impairment in the ability to take remedial action following TSD.
Delayed P300s were observed in the sleep-deprived group on congruent and incongruent Flanker trials combined (p = .001). On the Go/NoGo task, the hit rate (i.e., Go accuracy) was significantly lower in the sleep-deprived group compared to controls (p < .001), but no differences were found on false alarm rates (i.e., NoGo accuracy). For the sleep-deprived group, the Go-P3 was significantly smaller (p = .045) and there was a trend for a smaller NoGo-N2 compared to controls (p = .08). The ERN amplitude was reduced in the TSD group compared to controls in both the Flanker and Go/NoGo tasks. Error rate was significantly correlated with the amplitude of response-locked ERNs in the control (r = -.55, p = .005) and sleep-deprived groups (r = -.46, p = .021); error rate was also correlated with Pe amplitude in controls (r = .46, p = .022), and a trend was found in the sleep-deprived participants (r = .39, p = .052). An exploratory analysis showed significantly larger Pe mean amplitudes (p = .025) in the sleep-deprived group compared to controls for participants who made more than 40 errors on the Flanker task. Altered stimulus processing, as indexed by delayed P3 latency during the Flanker task and smaller-amplitude Go-P3s during the Go/NoGo task, indicates impairment in stimulus evaluation and/or context updating during frontal lobe tasks. ERN and NoGo-N2 reductions in the sleep-deprived group confirm impairments in the monitoring system. These data add to a body of evidence showing that the frontal brain region is particularly vulnerable to sleep loss. Understanding the neural basis of these deficits in performance monitoring is particularly important for our increasingly sleep-deprived society and for safety and productivity in situations like driving and sustained operations.
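Post-error slowing, one of the remediation measures examined above, is conventionally computed as the mean reaction time on trials following an error minus the mean reaction time on trials following a correct response. A minimal sketch with made-up reaction times, not study data:

```python
def post_error_slowing(rts, correct):
    """Mean RT after error trials minus mean RT after correct trials.
    rts: reaction times in ms; correct: accuracy flags per trial.
    The toy sequence below is invented for illustration."""
    post_err, post_cor = [], []
    for prev in range(len(rts) - 1):
        # sort each trial's RT by the accuracy of the *preceding* trial
        (post_cor if correct[prev] else post_err).append(rts[prev + 1])
    mean = lambda xs: sum(xs) / len(xs)
    return mean(post_err) - mean(post_cor)

rts     = [420, 430, 510, 450, 440, 500, 460, 455]
correct = [True, False, True, True, False, True, True, True]
pes = post_error_slowing(rts, correct)   # positive value => slowing after errors
```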


Methotrexate (MTX), an anti-cancer agent frequently used in chemotherapy, generally requires therapeutic drug monitoring (TDM) of its blood level in the patient in order to maximize efficacy while limiting side effects. Despite the narrow therapeutic window between efficacy and toxicity, MTX remains, to this day, one of the most widely used anti-cancer agents in the world. Existing analytical techniques for TDM of MTX are expensive, require time and effort, and do not necessarily deliver results within the required turnaround time. To speed up MTX assays for TDM, a strategy was proposed based on a competitive assay characterized mainly by plasmonic coupling between a metallic surface and gold nanoparticles. More precisely, the quantitative assay exploits the competition between MTX and a gold nanoparticle functionalized with folic acid (FA-AuNP) for a molecular receptor, human dihydrofolate reductase (hDHFR), an enzyme associated with proliferative diseases. Free MTX mixed with the FA-AuNPs competes for the binding sites of hDHFR, either immobilized on an SPR-active surface or free in solution. The hDHFR-bound FA-AuNPs then provide a signal amplification that is inversely proportional to the MTX concentration. Surface plasmon resonance (SPR) is generally used as a spectroscopic technique for interrogating biomolecular interactions. Commercial SPR instruments are typically found in large analytical laboratories; they are also bulky, expensive and lack selectivity in complex matrices, and they have not yet demonstrated adaptability to the clinical setting.
Moreover, SPR analysis of small molecules such as drugs has not been explored intensively, owing to the challenge posed by the technique's lack of sensitivity for this class of molecules. Recent developments in materials science and surface chemistry, exploiting the integration of gold nanoparticles for SPR response amplification together with peptide surface chemistry, have demonstrated the potential to overcome the limits posed by this lack of sensitivity and by non-specific adsorption for direct analyses in biological media. These new SPR concepts are incorporated here into a compact, miniaturized SPR system to perform rapid, reliable and sensitive analyses for monitoring MTX levels in patient serum during chemotherapy. The objective of this thesis is to explore different strategies to improve drug analysis in complex media with SPR biosensors and to put into perspective the potential of SPR biosensors as a useful TDM tool in the clinical laboratory or at the patient's bedside. To reach these objectives, a competitive colorimetric assay for MTX based on localized surface plasmon resonance (LSPR) was first established with FA-labelled gold nanoparticles. This solution-phase competitive colorimetric assay was then adapted to an SPR platform. For both assays, the sensitivity, selectivity, detection limit, dynamic-range optimization and analysis of MTX in complex media were examined. In addition, the miniaturized SPR platform prototype was validated by its performance, equivalent to existing SPR systems, and by its usefulness for analysing clinical samples from patients undergoing MTX chemotherapy.
The MTX concentrations obtained with the prototype were compared with standard techniques, namely a fluorescence polarization immunoassay (FPIA) and liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS), to validate the prototype's usefulness as a clinical tool for rapid MTX quantification. Finally, deployment of the prototype in a hospital biochemistry laboratory demonstrates the enormous potential of SPR biosensors for use in the clinical setting.


Objective To determine overall, test–retest and inter-rater reliability of posture indices among persons with idiopathic scoliosis. Design A reliability study using two raters and two test sessions. Setting Tertiary care paediatric centre. Participants Seventy participants aged between 10 and 20 years with different types of idiopathic scoliosis (Cobb angle 15 to 60°) were recruited from the scoliosis clinic. Main outcome measures Based on the XY co-ordinates of natural reference points (e.g. eyes) as well as markers placed on several anatomical landmarks, 32 angular and linear posture indices taken from digital photographs in the standing position were calculated from a specially developed software program. Generalisability theory served to estimate the reliability and standard error of measurement (SEM) for the overall, test–retest and inter-rater designs. Bland and Altman's method was also used to document agreement between sessions and raters. Results In the random design, dependability coefficients demonstrated a moderate level of reliability for six posture indices (ϕ = 0.51 to 0.72) and a good level of reliability for 26 posture indices out of 32 (ϕ ≥ 0.79). Error attributable to marker placement was negligible for most indices. Limits of agreement and SEM values were larger for shoulder protraction, trunk list, Q angle, cervical lordosis and scoliosis angles. The most reproducible indices were waist angles and knee valgus and varus. Conclusions Posture can be assessed in a global fashion from photographs in persons with idiopathic scoliosis. Despite the good reliability of marker placement, other studies are needed to minimise measurement errors in order to provide a suitable tool for monitoring change in posture over time.
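The SEM values reported alongside the dependability coefficients follow the standard generalisability-theory relation SEM = SD · sqrt(1 − ϕ). A small illustration with assumed values (the SD of 4.0° is invented; the ϕ values echo the thresholds quoted above but are not tied to any particular index):

```python
import math

def sem(sd, dependability):
    """Standard error of measurement from a dependability coefficient phi:
    SEM = SD * sqrt(1 - phi). Inputs below are illustrative, not study data."""
    return sd * math.sqrt(1.0 - dependability)

# a hypothetical posture index with SD 4.0 degrees:
sem_good = sem(4.0, 0.79)   # good reliability  -> smaller measurement error
sem_mod  = sem(4.0, 0.51)   # moderate reliability -> larger measurement error
```

The comparison makes concrete why indices with lower ϕ (e.g. shoulder protraction or trunk list) carry wider limits of agreement.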


STUDY DESIGN: Concurrent validity between postural indices obtained from digital photographs (two-dimensional [2D]), surface topography imaging (three-dimensional [3D]), and radiographs. OBJECTIVE: To assess the validity of a quantitative clinical postural assessment tool of the trunk based on photographs (2D) as compared to a surface topography system (3D) as well as indices calculated from radiographs. SUMMARY OF BACKGROUND DATA: To monitor progression of scoliosis or change in posture over time in young persons with idiopathic scoliosis (IS), noninvasive and nonionizing methods are recommended. In a clinical setting, posture can be quite easily assessed by calculating key postural indices from photographs. METHODS: Quantitative postural indices of 70 subjects aged 10 to 20 years with IS (Cobb angle 15° to 60°) were measured from photographs and from 3D trunk surface images taken in the standing position. Shoulder, scapula, trunk list, pelvis, scoliosis, and waist angle indices were calculated with specially designed software. Frontal and sagittal Cobb angles and trunk list were also calculated on radiographs. The Pearson correlation coefficient (r) was used to estimate concurrent validity of the 2D clinical postural tool of the trunk with indices extracted from the 3D system and with those obtained from radiographs. RESULTS: The correlation between 2D and 3D indices was good to excellent for shoulder, pelvis, trunk list, and thoracic scoliosis (r = 0.81 to 0.97; P < 0.01) but fair to moderate for thoracic kyphosis, lumbar lordosis, and thoracolumbar or lumbar scoliosis (r = 0.30 to 0.56; P < 0.05). The correlation between 2D and radiographic spinal indices was fair to good (-0.33 to -0.80 with Cobb angles and 0.76 for trunk list; P < 0.05). CONCLUSION: This tool will facilitate clinical practice by monitoring trunk posture among persons with IS. Further, it may contribute to a reduction in the use of radiographs to monitor scoliosis progression.
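The concurrent-validity statistic used throughout is the Pearson coefficient between paired index measurements. A minimal sketch with invented 2D/3D angle pairs (not study measurements):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired samples,
    e.g. the same postural index measured on photographs (2D) and on
    surface topography (3D). The data below are made up for illustration."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# hypothetical shoulder-angle measurements (degrees) on six subjects
angle_2d = [3.1, 5.2, 2.4, 7.8, 4.0, 6.5]
angle_3d = [3.0, 5.6, 2.1, 8.1, 4.4, 6.2]
r = pearson_r(angle_2d, angle_3d)   # near 1 when the two methods agree
```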


A novel sensing technique for the in situ monitoring of the rate of pulsed laser deposition (PLD) of metal thin films has been developed. This optical fibre based sensor works on the principle of the evanescent wave penetration of waveguide modes into the uncladded portion of a multimode fibre. The utility of this optical fibre sensor is demonstrated in the case of PLD of silver thin films obtained with a Q-switched Nd:YAG laser, which is used to irradiate a silver target under the conditions required for the preparation of thin films. This paper describes the performance and characteristics of the sensor and shows how the device can be used as an effective tool for monitoring the deposition rate of silver thin films. The fibre optic sensor is very simple, inexpensive and highly sensitive compared with existing techniques for thin film deposition rate measurement.



In this thesis, applications of recurrence quantification analysis (RQA) to metal cutting on a lathe are presented, with the specific objective of detecting tool wear and chatter. The study is based on the discovery that the process dynamics in a lathe are low-dimensional chaotic, which implies that the machine dynamics are controllable using principles of chaos theory. This understanding stands to revolutionize the feature extraction methodologies used in condition monitoring systems, as conventional linear methods or models are incapable of capturing the critical and strange behaviours associated with the metal cutting process. As sensor-based approaches provide an automated and cost-effective way to monitor and control, an efficient feature extraction methodology based on nonlinear time series analysis is in demand. The task is more complex when the information has to be deduced solely from sensor signals, since traditional methods do not address the noise present in real-world processes or their non-stationarity. To get around these two issues as far as possible, this thesis adopts the recurrence quantification analysis methodology, a feature extraction technique found to be robust against noise and non-stationarity in the signals. The work consists of two sets of experiments on a lathe: set 1 and set 2. The set-1 experiments study the influence of tool wear on the RQA variables, whereas set 2 is carried out to identify the RQA variables sensitive to machine tool chatter, followed by validation in actual cutting. To obtain the bounds of the spectrum of significant RQA variable values, in set 1 a fresh tool and a worn tool are used for cutting. The first part of the set-2 experiments uses a stepped shaft in order to create chatter at a known location. The second part uses a conical section with a uniform taper along the axis, causing chatter to onset at some distance from the smaller end as the depth of cut is gradually increased while the spindle speed and feed rate are kept constant. The study concludes by revealing the unambiguous dependence of certain RQA variables (percent determinism, percent recurrence and entropy) on tool wear and chatter. The results establish this methodology as viable for detecting tool wear and chatter in metal cutting on a lathe. The key reason is that the dynamics of the system under study are nonlinear, and recurrence quantification analysis can characterize them adequately. This work establishes that the principles and practice of machining can benefit considerably from nonlinear dynamics and chaos theory.


One objective of the current investigation was to evaluate the effectiveness of Spirodela polyrhiza in removing heavy metals and other contaminants from water samples collected from wetland sites at Eloor and Kannamaly under controlled conditions. The results obtained from the current study suggest that S. polyrhiza can be used in the biomonitoring and phytoremediation of municipal, agricultural and industrial effluents because of its simplicity, sensitivity and cost-effectiveness. The study throws light on the potential of this plant as an assessment tool in two diverse wetlands in Ernakulam district. The results show the usefulness of combining physicochemical analysis with bioassays, as such an approach ensures a better understanding of the toxicity of chemical pollutants and their influence on plant health. They also show the suitability of Spirodela for surface water quality assessment, as all selected parameters were consistent across water samples collected over three monitoring periods. Similarly, the relationship between exposure period (2, 4 and 8 days) and the parameters was studied in detail. Spirodela is a consistent test material because it is genetically homogeneous: reproduction is predominantly vegetative, with new fronds formed by clonal propagation, producing a population of genetically homogeneous plants and hence small variability between treated individuals. Phytoremediation of the water samples collected from Eloor and Kannamaly using this floating plant system proved a predominant method that is economical to construct, requires little maintenance and is eco-friendly.


The long-term stability, high accuracy, all-weather capability, high vertical resolution, and global coverage of Global Navigation Satellite System (GNSS) radio occultation (RO) suggest it as a promising tool for global monitoring of atmospheric temperature change. With the aim of investigating and quantifying how well a GNSS RO observing system is able to detect climate trends, we are currently performing a (climate) observing system simulation experiment over the 25-year period 2001 to 2025, which involves quasi-realistic modeling of the neutral atmosphere and the ionosphere. We carried out two climate simulations with the general circulation model MAECHAM5 (Middle Atmosphere European Centre/Hamburg Model Version 5) of the MPI-M Hamburg, covering the period 2001–2025: one control run with natural variability only and one run also including anthropogenic forcings due to greenhouse gases, sulfate aerosols, and tropospheric ozone. On this basis, we perform quasi-realistic simulations of RO observables for a small GNSS receiver constellation (six satellites), state-of-the-art data processing for atmospheric profile retrieval, and a statistical analysis of temperature trends in both the "observed" climatology and the "true" climatology. Here we describe the setup of the experiment and results from a test bed study conducted to obtain a basic set of realistic estimates of observational errors (instrument- and retrieval-processing-related errors) and sampling errors (due to spatial-temporal undersampling). The test bed results, obtained for a typical summer season and compared to the climatic 2001–2025 trends from the MAECHAM5 simulation including anthropogenic forcing, were found encouraging for performing the full 25-year experiment.
They indicated that observational and sampling errors (each contributing about 0.2 K) are consistent with recent estimates of these errors from real RO data and that they should be sufficiently small for monitoring expected temperature trends in the global atmosphere over the next 10 to 20 years in most regions of the upper troposphere and lower stratosphere (UTLS). Inspection of the MAECHAM5 trends in different RO-accessible atmospheric parameters (microwave refractivity and pressure/geopotential height in addition to temperature) indicates complementary climate change sensitivity in different regions of the UTLS, so that optimized climate monitoring should combine information from all climatic key variables retrievable from GNSS RO data.
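The trend-detection question at the heart of the experiment, whether a climatological temperature trend stands out against combined observational and sampling errors of roughly 0.2 K each, can be sketched with ordinary least squares on a synthetic series. The +0.03 K/yr trend, the noise level and the 2-sigma criterion below are illustrative assumptions, not MAECHAM5 output:

```python
import math
import random

def trend_and_se(x, y):
    """Ordinary least-squares slope and its standard error for a yearly
    anomaly series -- the basic statistic for asking whether a trend is
    detectable above the noise."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sxx
    intercept = my - slope * mx
    resid = [b - (intercept + slope * a) for a, b in zip(x, y)]
    s2 = sum(r * r for r in resid) / (n - 2)
    return slope, math.sqrt(s2 / sxx)

random.seed(1)
years = list(range(2001, 2026))          # the 25-year experiment period
# synthetic UTLS anomaly: +0.03 K/yr trend plus ~0.28 K of noise, i.e. the
# 0.2 K observational and 0.2 K sampling errors added in quadrature
temps = [0.03 * (y - 2001) + random.gauss(0.0, 0.28) for y in years]
slope, se = trend_and_se(years, temps)
detectable = abs(slope) > 2 * se         # crude 2-sigma detection criterion
```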


The effectiveness of development assistance has come under renewed scrutiny in recent years. In an era of growing economic liberalisation, research organisations are increasingly being asked to account for the use of public funds by demonstrating achievements. However, in the natural resources (NR) research field, conventional economic assessment techniques have focused on quantifying the impact achieved rather than understanding the process that delivered it. As a result, they provide limited guidance for planners and researchers charged with selecting and implementing future research. In response, "pathways" or logic models have attracted increased interest in recent years as a remedy to this shortcoming. However, as commonly applied, these suffer from two key limitations: a limited ability to incorporate risk and to assess variance from plan. The paper reports the results of a case study that used a Bayesian belief network approach to address these limitations and outlines its potential value as a tool to assist the planning, monitoring and evaluation of development-orientated research.
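A Bayesian belief network of the kind applied in the case study supports both predictive queries (how likely is impact?) and diagnostic ones (given observed impact, what does that say about upstream nodes?) over a research-to-impact pathway. A toy three-node sketch, inference by enumerating the joint distribution, with made-up node names and probabilities rather than the paper's actual network:

```python
def bbn_posterior():
    """Tiny illustrative pathway network: Funding -> Uptake -> Impact.
    All probabilities are invented for the sketch."""
    p_funding = 0.7
    p_uptake_given = {True: 0.8, False: 0.3}    # P(uptake | funding)
    p_impact_given = {True: 0.6, False: 0.1}    # P(impact | uptake)
    # enumerate the full joint distribution over the three binary nodes
    joint = {}
    for f in (True, False):
        for u in (True, False):
            for i in (True, False):
                p = ((p_funding if f else 1 - p_funding)
                     * (p_uptake_given[f] if u else 1 - p_uptake_given[f])
                     * (p_impact_given[u] if i else 1 - p_impact_given[u]))
                joint[(f, u, i)] = p
    # predictive query: marginal probability of impact
    p_impact = sum(p for (f, u, i), p in joint.items() if i)
    # diagnostic query: was the project funded, given that impact occurred?
    p_funding_given_impact = (
        sum(p for (f, u, i), p in joint.items() if f and i) / p_impact)
    return p_impact, p_funding_given_impact

p_impact, p_funding_given_impact = bbn_posterior()
```

Enumeration is exponential in the number of nodes; real BBN tools use message passing, but the conditioning logic is the same.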


The use of bioluminescence was evaluated as a tool to study Pseudomonas syringae population dynamics in susceptible and resistant plant environments. Plasmid pGLITE, containing the luxCDABE genes from Photorhabdus luminescens, was introduced into Pseudomonas syringae pv. phaseolicola race 7 strain 1449B, a Gram-negative pathogen of bean (Phaseolus vulgaris). Bacteria recovered from plant tissue over a five-day period were enumerated by counting colony-forming units and by measurement of bioluminescence. Direct measurement of bioluminescence from leaf disc homogenates consistently reflected bacterial growth as determined by viable counting, but also detected subtle effects of the plant resistance response on bacterial viability. This bioluminescence procedure enables real-time measurement of bacterial metabolism and population dynamics in planta, obviates the need for labour-intensive and time-consuming traditional enumeration techniques and provides a sensitive assay for studying plant effects on bacterial cells.


Risk management (RM) comprises risk identification, risk analysis, response planning, monitoring and action planning tasks that are carried out throughout the life cycle of a project in order to ensure that project objectives are met. Although the methodological aspects of RM are well defined, its philosophical background is rather vague. In this paper, a learning-based approach is proposed. To implement this approach in practice, a tool has been developed to facilitate the construction of a lessons-learned database that contains risk-related information and risk assessments throughout the life cycle of a project. The tool is tested on a real construction project. The case study findings demonstrate that it can be used for storing and updating risk-related information and, finally, for carrying out a post-project appraisal. The major weaknesses of the tool are the subjectivity of the risk rating process and people's unwillingness to record the reasons for failure.
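The lessons-learned database at the centre of the tool can be pictured as structured risk records carrying the subjective probability and impact ratings the paper critiques, updated over the project life cycle and queried at post-project appraisal. The schema and field names below are an illustrative assumption, not the tool's actual design:

```python
from dataclasses import dataclass

@dataclass
class RiskRecord:
    """One hypothetical lessons-learned entry; fields are invented."""
    risk_id: str
    description: str
    phase: str            # project life-cycle stage when identified
    probability: int      # subjective rating, 1 (low) .. 5 (high)
    impact: int           # subjective rating, 1 .. 5
    response: str = ""    # planned/actual response action
    outcome: str = ""     # filled in during post-project appraisal

    @property
    def score(self) -> int:
        # simple probability-x-impact risk score
        return self.probability * self.impact

db = [RiskRecord("R1", "ground conditions worse than surveyed",
                 "construction", 3, 4, response="extra site investigation")]
high = [r for r in db if r.score >= 12]   # flag high-scoring risks for review
```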


1. Closed Ecological Systems (CES) are small man-made ecosystems that have no material exchange with the surrounding environment. Recent ecological and technological advances enable the successful establishment and maintenance of CES, making them a suitable tool for detecting and measuring subtle feedbacks and mechanisms.
2. As part of an analogue (physical) carbon cycle modelling experiment, we developed a non-intrusive methodology to control the internal environment and to monitor atmospheric CO2 concentration inside 16 replicated CES. While maintaining an air-tight seal on all CES, this approach allowed access to the CO2 measuring equipment for periodic re-calibration and repairs.
3. To ensure reliable cross-comparison of CO2 observations between individual CES units and to minimise the cost of the system, only one CO2 sampling unit was used: an ADC BioScientific OP-2 (open-path) analyser mounted on a swinging arm passing over a set of 16 measuring cells, each connected to an individual CES with air continuously circulating between them.
4. Using this setup, we were able to continuously measure several environmental variables and the CO2 concentration within each closed system, allowing us to study minute effects of changing temperature on carbon fluxes within each CES. The CES and the measuring cells showed minimal air leakage during an experimental run lasting, on average, 3 months. The CO2 analyser assembly performed reliably for over 2 years; however, an early iteration of the present design proved sensitive to positioning errors.
5. We indicate how the methodology can be further improved and suggest possible avenues where future CES-based research could be applied.