Abstract:
Artifact removal from physiological signals is an essential component of the biosignal processing pipeline. The need for powerful and robust methods for this process has become particularly acute as healthcare technology deployment transitions from the current hospital-centric setting toward a wearable and ubiquitous monitoring environment. Currently, determining the relative efficacy and performance of the multiple artifact removal techniques available on real-world data can be problematic, due to incomplete information on the uncorrupted desired signal. The majority of techniques are presently evaluated using simulated data, and therefore, the quality of the conclusions is contingent on the fidelity of the model used. Consequently, in the biomedical signal processing community, there is considerable focus on the generation and validation of appropriate signal models for use in artifact suppression. Most approaches rely on mathematical models which capture suitable approximations to the signal dynamics or underlying physiology and, therefore, introduce some uncertainty to subsequent predictions of algorithm performance. This paper describes a more empirical approach to the modeling of the desired signal, demonstrated here for functional brain monitoring tasks, which allows for the procurement of a ground truth signal highly correlated with the true desired signal that has been contaminated with artifacts. The availability of this ground truth, together with the corrupted signal, can then aid in determining the efficacy of selected artifact removal techniques. A number of commonly implemented artifact removal techniques were evaluated using the described methodology to validate the proposed novel test platform. © 2012 IEEE.
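Given such a ground truth, the efficacy of an artifact removal technique can be scored with simple fidelity metrics. A minimal sketch (not from the paper; the signal values and the choice of Pearson correlation and RMSE as metrics are illustrative assumptions):

```python
import math

def pearson(x, y):
    """Sample Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

def rmse(x, y):
    """Root-mean-square error between a processed signal and the reference."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)) / len(x))

# Toy example: ground truth, artifact-corrupted, and "cleaned" signal segments
truth     = [0.0, 1.0, 0.5, -0.5, -1.0, 0.0, 1.0, 0.5]
corrupted = [0.4, 1.6, 0.1, -0.2, -1.5, 0.3, 0.6, 1.0]
cleaned   = [0.1, 1.1, 0.4, -0.4, -1.1, 0.1, 0.9, 0.6]

# A successful artifact removal step should raise correlation with the
# ground truth and lower the RMSE relative to the corrupted input.
print(pearson(truth, corrupted), rmse(truth, corrupted))
print(pearson(truth, cleaned), rmse(truth, cleaned))
```

The same two numbers, computed per technique against the procured ground truth, give a direct ranking of candidate artifact removal methods.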
Abstract:
It has been suggested that there are significant overlaps between removals due to deregistration and removals arising because patients live outside the practice area. If this is true, current estimates of deregistration would need to be revised upwards. All outside-area removals for the calendar years 2001 and 2002 were reviewed and characterised by age, sex, Jarman score of the enumeration district of the patients' residence, and distance from the practice. The average outside-area removal rate was just over one removal per practice per year. Removal rates were highest between the ages of 18 and 44 years; there were no significant differences between the sexes. Rates of removal increased exponentially with distance, although even at marked distances from the practice there were about 10 patients remaining on the list for each one removed. Residents in deprived areas were more likely to be removed, although because areas most distal to the practice tend to be affluent, overall there was a predominance of affluent patients among those removed. In Northern Ireland rates of outside-area removal are only slightly higher than those of deregistration. It is evident that GPs are exercising some discretion as to which of the outside-area patients they retain on their list. This has the potential to cause some misunderstanding and resentment among patients, as has been reported previously.
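The exponential rise in removal rate with distance is the kind of relationship usually estimated by a log-linear least-squares fit. A hedged sketch, assuming a model rate(d) = a·exp(b·d); the distances and rates below are invented for illustration, not the study's data:

```python
import math

def fit_exponential(distances, rates):
    """Fit rate = a * exp(b * d) by ordinary least squares on log(rate)."""
    ys = [math.log(r) for r in rates]
    n = len(distances)
    mx, my = sum(distances) / n, sum(ys) / n
    # Slope of the log-linear regression gives the exponential coefficient b
    b = (sum((x - mx) * (y - my) for x, y in zip(distances, ys))
         / sum((x - mx) ** 2 for x in distances))
    a = math.exp(my - b * mx)
    return a, b

# Hypothetical removal rates (per 1000 list patients) at distances in km
distances = [1.0, 2.0, 4.0, 8.0, 16.0]
rates = [0.5, 0.7, 1.4, 5.5, 44.0]
a, b = fit_exponential(distances, rates)
print(f"rate(d) ~ {a:.2f} * exp({b:.2f} * d)")
```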
Abstract:
Effective implementation of the Water Framework Directive requires a reappraisal of conventional approaches to water quality monitoring. Quantifying the impact of domestic wastewater treatment systems (DWWTS) in Irish catchments is further complicated by high levels of natural heterogeneity. This paper presents a numerical model that couples attenuation to flow along the different hydrological pathways contributing to river discharge; this permits estimation of the contribution of DWWTS to overall nutrient fluxes under a range of geological conditions. Preliminary results suggest that the high levels of attenuation experienced before DWWTS effluent reaches bedrock play a significant role in reducing its ecological impact on aquatic receptors. Conversely, low levels of attenuation in systems discharging directly to surface water may affect water quality more significantly, particularly during prolonged dry periods in areas underlain by low-productivity aquifers (>60% of Ireland), where dilution capacity is limited.
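The idea of coupling attenuation to distinct hydrological pathways can be illustrated with a toy first-order decay model. This is a sketch only, not the paper's numerical model; the flow fractions, decay rates, and travel times are invented assumptions:

```python
import math

def river_load(source_load, pathways):
    """Nutrient load reaching the river as the sum of pathway contributions.

    Each pathway is (flow_fraction, decay_rate_per_day, travel_time_days);
    its contribution undergoes first-order attenuation exp(-k * t).
    """
    return sum(f * source_load * math.exp(-k * t) for f, k, t in pathways)

# Hypothetical split between a fast near-surface pathway (little attenuation,
# analogous to direct discharge) and a slow subsurface pathway (strong
# attenuation before effluent reaches bedrock)
pathways = [
    (0.3, 0.05, 2.0),    # near-surface: 30% of flow, short travel time
    (0.7, 0.10, 60.0),   # subsurface: 70% of flow, long residence time
]
print(river_load(1.0, pathways))
```

Under this toy parameterisation, the heavily attenuated subsurface fraction delivers far less of its load than the fast pathway, mirroring the abstract's contrast between subsurface and direct surface-water discharge.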
Abstract:
The aim of the present study was to describe the practice of central venous catheter (CVC) removal and outcomes of catheter-related bloodstream infection (CR-BSI) in adult haematology patients. Patients were identified retrospectively according to diagnosis coding of inpatient episodes and evaluated when, on examination of medical records, there had been evidence of sepsis with strong clinical suspicion that the source was the CVC. Demographic and bacteriological data, as well as therapeutic measures and clinical outcomes, were recorded. One hundred and three patient episodes were evaluated. The most frequent type of CVC was the Hickman catheter and the most frequently isolated pathogen was coagulase-negative staphylococci. Twenty-five percent of episodes were managed with catheter removal. Treatment failure, defined as recurrence of infection within 90 days or mortality attributed to sepsis within 30 days, occurred significantly more frequently in the group managed without catheter removal (52.5% versus 4%, P
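A difference between a 52.5% and a 4% failure rate can be checked with a standard two-proportion z-test. A sketch with assumed counts: the abstract reports 103 episodes with 25% managed by catheter removal, but the exact failure counts used below are hypothetical:

```python
import math

def two_proportion_z(success1, n1, success2, n2):
    """Two-proportion z statistic using the pooled standard error."""
    p1, p2 = success1 / n1, success2 / n2
    p = (success1 + success2) / (n1 + n2)          # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts consistent with the abstract's percentages:
# ~77 episodes managed without removal (~52% failed) vs ~26 with (~4% failed)
z = two_proportion_z(40, 77, 1, 26)
print(z)   # a large z indicates a difference unlikely to arise by chance
```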
Abstract:
It has previously been shown that across different arsenic (As) soil environments, a decrease in grain selenium (Se), zinc (Zn), and nickel (Ni) concentrations is associated with an increase in grain As. In this study we aim to determine whether there is a genetic basis for this observation or whether it is driven by the soil As environment. To determine the genetic and environmental effects on grain element composition, multielement analysis using ICP-MS was performed on rice grain from a range of rice cultivars grown at 4 different field sites (2 in Bangladesh and 2 in West Bengal). At all four sites a negative correlation was observed between grain As and grain Ni, while at three of the four sites a negative correlation was observed between grain As and both grain Se and grain copper (Cu). For manganese (Mn), Ni, Cu, and Se there was also a significant genetic interaction with grain As, indicating that some cultivars are more strongly affected by As than others.
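The site-level negative associations described above are pairwise correlations between element concentrations. A sketch with invented grain concentrations (illustrative only, not the study's data):

```python
import math

def pearson(x, y):
    """Sample Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

# Hypothetical grain concentrations (mg/kg) for five cultivars at one site:
# Ni falls as As rises, the pattern the abstract reports at all four sites.
grain_as = [0.10, 0.15, 0.22, 0.30, 0.41]
grain_ni = [0.52, 0.47, 0.40, 0.33, 0.21]
print(pearson(grain_as, grain_ni))   # strongly negative for this toy data
```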
Abstract:
Plant-derived carbon is the substrate that drives the rate of microbial assimilation and turnover of nutrients, in particular N and P, within the rhizosphere. To develop a better understanding of rhizosphere dynamics, a tripartite reporter gene system has been developed. We used three lux-marked Pseudomonas fluorescens strains to report on soil (1) assimilable carbon, (2) N-status, and (3) P-status. In vivo studies using soil water, spiked with C, N and P to simulate rhizosphere conditions, showed that the tripartite reporter system can provide real-time assessment of carbon and nutrient status. Good quantitative agreement in bioluminescence output between reference material and soil water samples was found for the C and P reporters. With regard to soil nitrate, the minimum bioavailable concentration was found to be greater than that analytically detectable in soil water. This is the first time that bioavailable soil C, N and P have been quantified using a tripartite reporter gene system.