963 results for Biological Monitoring
Abstract:
Biological wastewater treatment is a complex, multivariate process in which a number of physical and biological processes occur simultaneously. In this study, principal component analysis (PCA) and parallel factor analysis (PARAFAC) were used to profile and characterise Lagoon 115E, a multistage biological lagoon treatment system at Melbourne Water's Western Treatment Plant (WTP) in Melbourne, Australia, with the objective of increasing our understanding of the multivariate processes taking place in the lagoon. The data span a 7-year period during which samples were collected as often as weekly from the ponds of Lagoon 115E and subjected to analysis. The resulting database, involving 19 chemical and physical variables, was studied using PCA and PARAFAC. With these methods, alterations in the state of the wastewater due to intrinsic and extrinsic factors could be discerned. The methods were effective in illustrating and visually representing the complex purification stages and cyclic changes occurring along the lagoon system. The two methods proved complementary, with each having its own beneficial features.
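As an illustration of the kind of analysis described above, a minimal PCA profiling sketch in Python might look as follows; the file name, column layout, and number of components are assumptions for illustration, not details from the study.

    # Minimal PCA sketch for a table of weekly physico-chemical samples
    # (file name and column layout are hypothetical).
    import pandas as pd
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA

    df = pd.read_csv("lagoon_115E.csv")            # rows: samples, columns: 19 variables
    X = StandardScaler().fit_transform(df.values)  # autoscale to mean 0, unit variance
    pca = PCA(n_components=3)
    scores = pca.fit_transform(X)                  # per-sample scores on each component
    print(pca.explained_variance_ratio_)           # variance captured by each PC

Plotting the component scores against sampling date is one way the cyclic and stage-wise changes described above could be visualized.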
Abstract:
Investigations into the biochemical markers associated with executive function (EF) impairment in children with early and continuously treated phenylketonuria (ECT-PKU) remain largely focused on phenylalanine alone, despite experimental data showing that a high phenylalanine:tyrosine (phe:tyr) ratio is more strongly associated with EF deficit than phe alone. A high phe:tyr ratio is hypothesized to lead to a reduction in dopamine synthesis within the brain, which in turn results in the development of EF impairment. This paper provides a snapshot of current practice in the monitoring and/or treatment of tyrosine levels in children with PKU across 12 countries from Australasia, North America and Europe. Tyrosine monitoring in this population has increased over the last 5 years, with over 80% of clinics surveyed reporting routine monitoring of tyrosine levels in infancy alongside phe levels. Twenty-five percent of clinics surveyed reported actively treating/managing tyrosine levels (with supplemental tyrosine above that contained in PKU formulas) to ensure tyrosine levels remain within normal ranges. Anecdotally, supplemental tyrosine has been reported to ameliorate symptoms of both attention deficit hyperactivity disorder and depression in this population. EF assessment of children with ECT-PKU was likewise highly variable, with 50% of clinics surveyed reporting routine assessments of intellectual function. However, when function was assessed, the test instruments chosen tended towards global measures of IQ prior to school entry, rather than specific assessment of EF development. Further investigation of the role of tyrosine and its relationship with phe and EF development is needed to establish whether routine tyrosine monitoring and increased supplementation are warranted.
Abstract:
Ocean processes are dynamic and complex events that occur on multiple spatial and temporal scales. To obtain a synoptic view of such events, ocean scientists focus on the collection of long-term time series data sets. Generally, these time series measurements are continually provided in real or near-real time by fixed sensors, e.g., buoys and moorings. In recent years, mobile sensor platforms, e.g., Autonomous Underwater Vehicles, have increasingly been used to enable dynamic acquisition of time series data sets. However, these mobile assets are not utilized to their full capabilities, generally only performing repeated transects or user-defined patrolling loops. Here, we provide an extension to repeated patrolling of a designated area. Our algorithms provide the ability to adapt a standard mission to increase information gain in areas of greater scientific interest. By implementing a velocity control optimization along the predefined path, we are able to increase or decrease spatiotemporal sampling resolution to satisfy the sampling requirements necessary to properly resolve an oceanic phenomenon. We present a path planning algorithm that defines a sampling path optimized for repeatability. This is followed by the derivation of a velocity controller that defines how the vehicle traverses the given path. The application of these tools is motivated by an ongoing research effort to understand the oceanic region off the coast of Los Angeles, California. The computed paths and velocities were implemented on autonomous vehicles for data collection during sea trials. Results from this data collection are presented and compared for analysis of the proposed technique.
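The velocity-control idea lends itself to a compact sketch: slow the vehicle where scientific interest is high so that spatiotemporal sampling resolution rises there. The interest scores and speed bounds below are illustrative assumptions, not values from the paper.

    # Map per-segment interest in [0, 1] to commanded speed:
    # high interest -> low speed -> denser sampling along that segment.
    import numpy as np

    def segment_speeds(interest, v_min=0.5, v_max=1.5):
        interest = np.clip(np.asarray(interest, dtype=float), 0.0, 1.0)
        return v_max - interest * (v_max - v_min)

    print(segment_speeds([0.1, 0.9, 0.5, 0.2]))  # approx. [1.4, 0.6, 1.0, 1.3] m/s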
Abstract:
Autonomous Underwater Vehicles (AUVs) are revolutionizing oceanography through their versatility, autonomy and endurance. However, they are still an underutilized technology. For coastal operations, the ability to track a certain feature is of interest to ocean scientists. Adaptive and predictive path planning requires frequent communication with significant data transfer. Currently, most AUVs rely on satellite phones as their primary communication link, a protocol that is expensive and slow. To reduce communication costs and provide adequate data transfer rates, we present a hardware modification along with a software system that provides an alternative, robust, disruption-tolerant communications framework enabling cost-effective glider operation in coastal regions. The framework is specifically designed to address multi-sensor deployments. We provide a system overview and present testing and coverage data for the network. Additionally, we include an application of ocean-model-driven trajectory design, which can benefit from the use of this network and communication system. Simulation and implementation results are presented for single and multiple vehicle deployments. The presented combination of infrastructure, software development and deployment experience brings us closer to the goal of providing a reliable and cost-effective data transfer framework to enable real-time, optimal trajectory design, based on ocean model predictions, to gather in situ measurements of interesting and evolving ocean features and phenomena.
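A disruption-tolerant design can be sketched generically as a store-and-forward queue: messages are buffered whenever the link is down and flushed on reconnect. This is a sketch of the concept only, not the authors' implementation; the send_fn callback is a hypothetical interface.

    # Store-and-forward queue: buffer while the link is down, flush on reconnect.
    from collections import deque

    class DTNQueue:
        def __init__(self, send_fn):
            self.send_fn = send_fn      # hypothetical callable; returns True on delivery
            self.buffer = deque()

        def submit(self, msg):
            self.buffer.append(msg)     # always buffer first; the link may be down
            self.flush()

        def flush(self):
            while self.buffer:
                if not self.send_fn(self.buffer[0]):
                    break               # link down: keep the message, retry later
                self.buffer.popleft()   # delivered: drop it from the buffer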
Abstract:
This is the first outdoor test of small-scale dye-sensitized solar cells (DSCs) powering a stand-alone nanosensor node. A solar cell test station (SCTS) has been developed using standard DSCs to power a gas nanosensor, a radio transmitter, and the control electronics (CE) for battery charging. The station is remotely monitored through a wired (Ethernet cable) or wireless (radio transmitter) connection in order to evaluate in real time the performance of the solar cells and devices under different weather conditions. The 408 cm² active-surface module produces enough energy to power a gas nanosensor and a radio transmitter during the day and part of the night. Also, by using a programmable load, we keep the system operating at the maximum power point (MPP), quantifying the total energy generated and stored in a battery. These experiments provide useful data for future outdoor applications such as nanosensor networks.
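One common way a programmable load can hold a module at the MPP is perturb-and-observe tracking; the sketch below shows a single iteration. The read_power/set_voltage interfaces and the step size are assumptions for illustration, not details from the study.

    # One perturb-and-observe MPPT iteration: perturb the operating voltage,
    # keep the direction if power rose, reverse it if power fell.
    def mppt_step(read_power, set_voltage, v, dv=0.05):
        p_before = read_power()
        set_voltage(v + dv)             # perturb the operating point
        if read_power() < p_before:
            dv = -dv                    # power dropped: reverse the next perturbation
        return v + dv, dv               # new operating voltage and step direction

Called in a loop, the operating point oscillates around the MPP, which is consistent with the energy-accounting role the programmable load plays in this setup.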
Abstract:
Ocean processes are dynamic, complex, and occur on multiple spatial and temporal scales. To obtain a synoptic view of such processes, ocean scientists collect data over long time periods. Historically, measurements were continually provided by fixed sensors, e.g., moorings, or gathered from ships. Recently, an increase in the utilization of autonomous underwater vehicles has enabled a more dynamic data acquisition approach. However, we still do not utilize the full capabilities of these vehicles. Here we present algorithms that produce persistent monitoring missions for underwater vehicles by balancing path following accuracy and sampling resolution for a given region of interest, which addresses a pressing need among ocean scientists to efficiently and effectively collect high-value data. More specifically, this paper proposes a path planning algorithm and a speed control algorithm for underwater gliders, which together give informative trajectories for the glider to persistently monitor a patch of ocean. We optimize a cost function that blends two competing factors: maximizing the information value along the path while minimizing deviation from the planned path due to ocean currents. Speed is controlled along the planned path by adjusting the pitch angle of the underwater glider, so that higher resolution samples are collected in areas of higher information value. The resulting paths are closed circuits that can be repeatedly traversed to collect long-term ocean data in dynamic environments. The algorithms were tested during sea trials on an underwater glider operating off the coast of southern California, as well as in Monterey Bay, California. The experimental results show significant improvements in data resolution and path reliability compared to previously executed sampling paths used in the respective regions.
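The blended objective can be written compactly; the weight alpha and both scoring callbacks below are placeholders, since the paper's exact formulation is not reproduced here.

    # Blended path cost: reward information gathered along the path, penalize
    # expected current-induced deviation. Lower cost is better.
    def path_cost(path, info_value, expected_deviation, alpha=0.5):
        info = sum(info_value(p) for p in path)
        dev = sum(expected_deviation(p) for p in path)
        return -alpha * info + (1.0 - alpha) * dev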
Abstract:
Baseline monitoring of groundwater quality aims to characterize the ambient condition of the resource and identify spatial or temporal trends. Sites comprising any baseline monitoring network must be selected to provide a representative perspective of groundwater quality across the aquifer(s) of interest. Hierarchical cluster analysis (HCA) has been used as a means of assessing the representativeness of a groundwater quality monitoring network, using example datasets from New Zealand. HCA allows New Zealand's national and regional monitoring networks to be compared in terms of the number of water-quality categories identified in each network, the hydrochemistry at the centroids of these water-quality categories, the proportions of monitoring sites assigned to each water-quality category, and the range of concentrations for each analyte within each water-quality category. Through the HCA approach, the National Groundwater Monitoring Programme (117 sites) is shown to provide a highly representative perspective of groundwater quality across New Zealand, relative to the amalgamated regional monitoring networks operated by 15 different regional authorities (680 sites have sufficient data for inclusion in HCA). This methodology can be applied to evaluate the representativeness of any subset of monitoring sites taken from a larger network.
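A minimal version of the HCA workflow, assuming a site-by-analyte table of concentrations, might look as follows; the file name, Ward linkage, and six-category cut are illustrative choices, not necessarily those used in the study.

    # Hierarchical clustering of monitoring sites by hydrochemistry
    # (file name and inputs are hypothetical).
    import pandas as pd
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.stats import zscore

    df = pd.read_csv("groundwater_quality.csv")          # rows: sites, columns: analytes
    Z = linkage(df.apply(zscore).values, method="ward")  # build the cluster tree
    cats = fcluster(Z, t=6, criterion="maxclust")        # cut into six water-quality categories
    print(pd.Series(cats).value_counts())                # sites assigned to each category

Comparing the category proportions between two networks clustered this way is the kind of representativeness check the abstract describes.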
Abstract:
The monitoring sites comprising a state of the environment (SOE) network must be carefully selected to ensure that they will be representative of the broader resource. Hierarchical cluster analysis (HCA) is a data-driven technique that can potentially be employed to assess the representativeness of an SOE monitoring network. The objective of this paper is to explore the use of HCA as an approach for assessing the representativeness of the New Zealand National Groundwater Monitoring Programme (NGMP), which comprises 110 monitoring sites across the country.
Abstract:
The common brown leafhopper Orosius orientalis (Hemiptera: Cicadellidae) is a polyphagous vector of a range of economically important pathogens, including phytoplasmas and viruses, which infect a diverse range of crops. Studies of the plant penetration behaviour of O. orientalis were conducted using the electrical penetration graph (EPG) technique to assist in the characterisation of pathogen acquisition and transmission. EPG waveforms representing different probing activities were acquired from adult O. orientalis probing in planta, using two host species, tobacco Nicotiana tabacum and bean Phaseolus vulgaris, and in vitro using a simple sucrose-based artificial diet. Five waveforms (O1–O5) were evident when O. orientalis fed on bean, whereas only four waveforms (O1–O4) and three waveforms (O1–O3) were observed when the leafhopper fed on tobacco and on the artificial diet, respectively. Both the mean duration and the type of each waveform differed markedly depending on the food substrate. Waveform O4 was not observed on the artificial diet and occurred relatively rarely on tobacco plants compared with bean plants. Waveform O5 was only observed with leafhoppers probing on beans. The attributes of the waveforms and comparative analyses with previously published hemipteran data are presented and discussed, but further characterisation studies will be needed to confirm our suggestions.
Abstract:
Multiple reaction monitoring (MRM) mass spectrometry coupled with stable isotope dilution (SID) and liquid chromatography (LC) is increasingly used in biological and clinical studies for precise and reproducible quantification of peptides and proteins in complex sample matrices. Robust LC-SID-MRM-MS-based assays that can be replicated across laboratories, and ultimately in clinical laboratory settings, require standardized protocols to demonstrate that the analysis platforms are performing adequately. We developed a system suitability protocol (SSP), which employs a predigested mixture of six proteins, to facilitate performance evaluation of LC-SID-MRM-MS instrument platforms configured with nanoflow-LC systems interfaced to triple quadrupole mass spectrometers. The SSP was designed for use with low-multiplex analyses as well as high-multiplex approaches when software-driven scheduling of data acquisition is required. Performance was assessed by monitoring a range of chromatographic and mass spectrometric metrics, including peak width, chromatographic resolution, peak capacity, and the variability in peak area and analyte retention time (RT) stability. The SSP, which was evaluated in 11 laboratories on a total of 15 different instruments, enabled early diagnosis of LC and MS anomalies that indicated suboptimal LC-MRM-MS performance. The observed range in variation of each of the metrics scrutinized serves to define the criteria for optimized LC-SID-MRM-MS platforms for routine use, with pass/fail criteria for system suitability defined as a peak area coefficient of variation <0.15, a peak width coefficient of variation <0.15, a standard deviation of RT <0.15 min (9 s), and an RT drift <0.5 min (30 s). The deleterious effect of a marginally performing LC-SID-MRM-MS system on the limit of quantification (LOQ) in targeted quantitative assays illustrates the use of, and need for, an SSP to establish robust and reliable system performance. Use of an SSP helps to ensure that analyte quantification measurements can be replicated with good precision within and across multiple laboratories and should facilitate more widespread use of MRM-MS technology by the basic biomedical and clinical laboratory research communities.
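The quoted pass/fail thresholds translate directly into a small check; the input format (replicate peak areas, widths, and retention times in minutes) is assumed, and RT drift is approximated here as the retention-time range across replicates.

    # Encode the four SSP pass/fail criteria quoted in the text.
    import statistics as st

    def cv(xs):
        return st.stdev(xs) / st.mean(xs)

    def system_suitable(areas, widths, rts_min):
        return (cv(areas) < 0.15 and                # peak area CV
                cv(widths) < 0.15 and               # peak width CV
                st.stdev(rts_min) < 0.15 and        # RT standard deviation, minutes
                max(rts_min) - min(rts_min) < 0.5)  # RT drift proxy, minutes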
Abstract:
Biomonitoring has become the ‘gold standard’ in assessing chemical exposures, and plays an important role in risk assessment. The pooling of biological specimens – combining multiple individual specimens into a single sample – can be used in biomonitoring studies to monitor levels of exposure and identify exposure trends, or to identify susceptible populations in a cost-effective manner. Pooled samples provide an estimate of central tendency, and may also reveal information about variation within the population. The development of a pooling strategy requires careful consideration of the type and number of samples collected, the number of pools required, and the number of specimens to combine per pool in order to maximize the type and robustness of the data. Creative pooling strategies can be used to explore exposure-outcome associations, and extrapolation from other larger studies can be useful in identifying elevated exposures in specific individuals. The use of pooled specimens is advantageous as it saves significantly on analytical costs, may reduce the time and resources required for recruitment, and in certain circumstances, allows quantification of samples approaching the limit of detection. In addition, use of pooled samples can provide population estimates while avoiding ethical difficulties that may be associated with reporting individual results.
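The arithmetic behind pooling is easy to illustrate with synthetic numbers: assaying 60 pools of 10 specimens each estimates the population mean at a tenth of the per-specimen assay cost. All values below are simulated, purely to show the strategy, and do not come from any study cited here.

    # Synthetic pooling example: k specimens per pool, one assay per pool.
    import random
    import statistics as st

    random.seed(1)
    population = [random.lognormvariate(0, 1) for _ in range(600)]  # simulated exposures
    k, n_pools = 10, 60
    pools = [st.mean(population[i * k:(i + 1) * k]) for i in range(n_pools)]
    print(st.mean(pools), st.mean(population))  # pooled estimate matches the overall mean

Because each pool's measurement is the mean of its constituents, the mean over pools recovers the population's central tendency, while the spread across pools hints at the variation within the population.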