929 results for monitoring design


Relevance:

100.00%

Publisher:

Abstract:

Many researchers in the field of civil structural health monitoring (SHM) have developed and tested their methods on simple to moderately complex laboratory structures such as beams, plates, frames, and trusses. Fieldwork has also been conducted by many researchers and practitioners on more complex operating bridges. However, most laboratory structures do not adequately replicate the complexity of truss bridges. Informed by a brief review of the literature, this paper documents the design and proposed test plan of a structurally complex laboratory bridge model developed specifically for SHM research. Preliminary results are presented in a companion paper.

Relevance:

100.00%

Publisher:

Abstract:

Immunotherapy is defined as the treatment of disease by inducing, enhancing, or suppressing an immune response, whereas preventive vaccination is intended to prevent the development of disease in healthy subjects. Most successful prophylactic vaccines rely on the induction of high titers of neutralizing antibodies, while therapeutic vaccination is generally thought to require the induction of robust T-cell-mediated immunity. The diverse array of immunotherapeutic and preventive agents, whether in development or already in use, share the common feature of stimulating the immune system. Hence, measuring vaccination-induced immune responses provides the earliest indication of vaccine take and of its immune-modulating effects.

Relevance:

100.00%

Publisher:

Abstract:

The North American Breeding Bird Survey (BBS) is the principal source of data to inform researchers about the status of and trends for boreal forest birds. Unfortunately, little BBS coverage is available in the boreal forest, where growing concern over the status of breeding species has increased interest in expanding the BBS northward. However, high disturbance rates in the boreal forest may complicate roadside monitoring. If the roadside sampling frame does not capture variation in disturbance rates, because of either road placement or the use of roads for resource extraction, biased trend estimates might result. In this study, we examined roadside bias in the proportional representation of habitat disturbance via spatial data on forest “loss,” forest fires, and anthropogenic disturbance. For each of 455 BBS routes, the area disturbed within multiple buffers away from the road was calculated and compared against the area disturbed in degree blocks and BBS strata. We found a nonlinear relationship between bias and distance from the road, suggesting that forest loss and forest fires were underrepresented within 75 and 100 m of the road, respectively. In contrast, anthropogenic disturbance was overrepresented at distances below 500 m and underrepresented beyond that. After accounting for distance from the road, BBS routes were reasonably representative of the degree blocks they were within, with only a few strata showing biased representation. In general, anthropogenic disturbance is overrepresented in southern strata, and forest fires are underrepresented in almost all strata. Similar biases exist when comparing the entire road network, and the subset sampled by BBS routes, against the amount of disturbance within BBS strata; however, the magnitudes of the biases differ. Based on our results, we recommend that spatial stratification and rotating panel designs be used to spread limited BBS and off-road sampling effort in an unbiased fashion and that new BBS routes be established where sufficient road coverage exists.
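The buffer comparison at the heart of this analysis is easy to sketch. The Python snippet below is an illustration of the described method, not the authors' code; the file names, the projected CRS, and the data layout are assumptions. It computes a representation-bias ratio, the disturbed fraction within a roadside buffer divided by the disturbed fraction of the reference degree blocks, at several distances from the road:

```python
# Illustrative sketch of the roadside-bias measure (not the authors' code).
# File names and the CRS (EPSG:3978, Canada Atlas Lambert) are hypothetical.
import geopandas as gpd

routes = gpd.read_file("bbs_routes.shp").to_crs(epsg=3978)        # route centrelines
disturbance = gpd.read_file("forest_loss.shp").to_crs(epsg=3978)  # disturbance polygons
blocks = gpd.read_file("degree_blocks.shp").to_crs(epsg=3978)     # reference units

def disturbed_fraction(region):
    """Fraction of a region's area covered by disturbance polygons."""
    clipped = gpd.clip(disturbance, region)
    return clipped.area.sum() / region.area.sum()

# Bias ratio at several buffer distances: >1 means the roadside strip
# over-represents that disturbance type relative to the degree blocks.
reference = disturbed_fraction(blocks.geometry)
for dist in (75, 100, 250, 500, 1000):            # metres from the road
    buffered = routes.geometry.buffer(dist)
    ratio = disturbed_fraction(buffered) / reference
    print(f"{dist:>5} m buffer: bias ratio = {ratio:.2f}")
```

A ratio above 1 at a given distance would indicate roadside overrepresentation of that disturbance type, mirroring the pattern reported above for anthropogenic disturbance below 500 m.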

Relevance:

100.00%

Publisher:

Abstract:

Temperature is a first-class design concern in modern integrated circuits. The significant increase in power densities brought by recent technology generations has led to the appearance of thermal gradients and hot spots during run-time operation. Temperature negatively affects several circuit parameters, such as gate delay, cooling budgets, reliability, and power consumption. To fight these effects, dynamic thermal management (DTM) techniques adapt the behavior of the chip based on information from a monitoring system that measures the thermal state of the die surface at run time. On-chip temperature monitoring has drawn the attention of the scientific community in recent years and is the object of study of this thesis, which approaches it from different perspectives and levels, providing solutions to some of the most important issues.

The physical and circuit levels are covered by the design and characterization of two novel temperature sensors specifically tailored for DTM purposes. The first sensor is based on a mechanism that produces a pulse whose width varies with the temperature dependence of leakage currents: a circuit node is charged and then left floating, so that it discharges through the subthreshold currents of a transistor; the time the node takes to discharge is the width of the pulse. Since the pulse width depends exponentially on temperature, conversion into a digital word is carried out by a logarithmic counter that performs both the time-to-digital conversion and the linearization of the output. The resulting structure, implemented in a 0.35 µm technology, occupies a very small area (10,250 µm²) and consumes very little power (1.05-65.5 nW at 5 samples/s); these figures outperformed all previous works when first published and, at the time of writing this thesis, still surpass all implementations fabricated in the same technology node. Regarding accuracy, the sensor exhibits good linearity even without calibration, with a 3σ error of 1.97 °C, adequate for DTM applications. The sensor is fully compatible with standard CMOS processes; this, together with its tiny area and power overhead, makes it especially suitable for integration into a DTM monitoring system with a collection of monitors distributed across the chip.
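The linearization step can be made concrete with a first-order model (a reconstruction from the description above, not equations taken from the thesis): if leakage grows exponentially with temperature, the pulse width shrinks exponentially, and a counter that effectively takes its base-2 logarithm produces a code linear in temperature.

```latex
% First-order model (a reconstruction, not the thesis' derivation).
% Leakage grows roughly exponentially with temperature T:
\[
  I_{\mathrm{leak}}(T) \approx I_0\, e^{T/T_0}
  \qquad\Longrightarrow\qquad
  t_w(T) \;=\; \frac{C\,V_{DD}}{I_{\mathrm{leak}}(T)}
         \;=\; \frac{C\,V_{DD}}{I_0}\, e^{-T/T_0}.
\]
% A logarithmic counter outputs (up to offset and scaling)
\[
  D \;=\; \log_2 t_w(T)
    \;=\; \log_2\!\frac{C\,V_{DD}}{I_0} \;-\; \frac{T}{T_0 \ln 2},
\]
% i.e. a code linear in T: time-to-digital conversion and
% linearization happen in the same block.
```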
The exacerbated process variations that come with recent technology nodes jeopardize the linearity of the first sensor. To overcome this, a new temperature-measuring technique is proposed. It also relies on the thermal dependence of the leakage currents used to discharge a floating node, but the measurement is now the ratio of two different readings, in one of which a characteristic of the discharging transistor (its gate voltage) is altered. This ratio proves very robust against process variations, and its linearity comfortably meets the requirements of DTM policies, with a 3σ error of 1.17 °C considering process variations and two-point calibration. Implementing the sensing part of this technique raises several design issues, such as the generation of a process-independent voltage reference, which are analyzed in depth in the thesis. The time-to-digital conversion reuses the digitization structure of the first sensor; for its physical implementation, a completely new standard-cell library targeting minimal area and power was built from scratch. The complete sensor achieves ultra-low energy per sample (48-640 pJ) and a tiny area of 0.0016 mm², a figure that outperforms all previous works; to support this claim, a thorough comparison with more than 40 sensor proposals from the scientific literature is carried out.
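The robustness of the two-reading ratio follows from the standard subthreshold current model; the sketch below is one reading of the mechanism, not the thesis' own derivation. Because both discharge times share the same transistor, the process-dependent terms cancel:

```latex
% Subthreshold discharge current at gate voltage V_G (standard model):
\[
  I(V_G, T) \;=\; I_S\, e^{\,(V_G - V_{th})/(n V_T)},
  \qquad V_T = kT/q .
\]
% Discharging the same node (capacitance C, swing V_DD) with two
% different gate voltages gives t_i = C V_DD / I(V_{Gi}, T), hence
\[
  \frac{t_1}{t_2}
  \;=\; \frac{I(V_{G2}, T)}{I(V_{G1}, T)}
  \;=\; e^{\,(V_{G2}-V_{G1})/(n V_T)} .
\]
% I_S, V_th and C V_DD -- the quantities most affected by process
% variation -- cancel; the ratio depends on temperature essentially
% only through V_T = kT/q, which is what makes it robust.
```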
Moving up to the system level, the third contribution centers on modeling a monitoring system made up of a set of thermal sensors distributed across the chip. Previous works in the literature aim to maximize the accuracy of the system with the minimum number of monitors. In contrast, the proposed model introduces quality metrics beyond the number of sensors: power consumption, sampling frequency, interconnection costs, and the possibility of choosing among different monitor types. The model feeds a simulated-annealing algorithm that receives the thermal information of a system, its physical properties, area, power and interconnection constraints, and a collection of monitor types; the algorithm yields the selected monitor type, the number of monitors, their positions, and the optimum sampling rate. Several case studies on the Alpha 21364 processor under different constraint configurations demonstrate the validity of the algorithm. Compared with previous works in the literature, the model presented here is the most complete.
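A compact sketch of this kind of allocation loop is given below. It is illustrative only: the cost weights, the nearest-neighbour reconstruction error, and the single-monitor-move perturbation are assumptions made for the sketch, not the thesis' model.

```python
# Illustrative monitor-allocation loop by simulated annealing.
# Cost terms, weights, and data structures are assumptions for the
# sketch, not the thesis' actual model.
import math
import random
from collections import namedtuple

MonitorType = namedtuple("MonitorType", "name area power error")

def reconstruction_error(tmap, positions, sensor_error):
    """Mean error when every cell takes the reading of its nearest
    monitor (nearest-neighbour reconstruction of the thermal map)."""
    total, n = 0.0, 0
    for y, row in enumerate(tmap):
        for x, t in enumerate(row):
            mx, my = min(positions, key=lambda p: abs(p[0] - x) + abs(p[1] - y))
            total += abs(t - tmap[my][mx]) + sensor_error
            n += 1
    return total / n

def cost(positions, mtype, tmap, w_err=1.0, w_pow=0.1, w_wire=0.01):
    """Weighted sum of accuracy, power, and interconnection cost."""
    err = reconstruction_error(tmap, positions, mtype.error)
    power = len(positions) * mtype.power
    wiring = sum(abs(x) + abs(y) for x, y in positions)  # Manhattan run to a corner hub
    return w_err * err + w_pow * power + w_wire * wiring

def anneal(tmap, mtype, n_monitors, steps=20_000, t0=1.0, alpha=0.9995):
    h, w = len(tmap), len(tmap[0])
    pos = [(random.randrange(w), random.randrange(h)) for _ in range(n_monitors)]
    cur = cost(pos, mtype, tmap)
    best_pos, best = list(pos), cur
    temp = t0
    for _ in range(steps):
        cand = list(pos)
        cand[random.randrange(n_monitors)] = (random.randrange(w),
                                              random.randrange(h))  # move one monitor
        c = cost(cand, mtype, tmap)
        if c < cur or random.random() < math.exp((cur - c) / temp):
            pos, cur = cand, c          # accept downhill or lucky uphill move
            if cur < best:
                best_pos, best = list(pos), cur
        temp *= alpha                   # geometric cooling schedule
    return best_pos, best
```

Extending the loop to also choose the monitor type and sampling rate, as the thesis does, amounts to adding those choices to the perturbed state.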
Finally, the last contribution targets the network level: given an allocated set of temperature monitors, the problem is to connect them in a way that is efficient in area and power. The first proposal in this area is a new level in the interconnection hierarchy, the threshing level, placed between the monitors and the traditional peripheral buses, which applies data selectivity to reduce the amount of information sent to the central controller. The idea behind this level is that in this kind of network most data are useless: from the controller's viewpoint, only a small amount of data (normally the extreme values) is of interest. To cover the new level, a single-wire monitoring network based on a time-domain signaling scheme is proposed; it significantly reduces both the switching activity on the wire and the power consumption of the network, and monitor readings arrive at the controller already ordered from maximum to minimum. When this signaling is applied to sensors that perform time-to-digital conversion, the digitization resources can be shared in both time and space, producing important savings in area and power. Two prototypes of complete monitoring systems are presented that significantly outperform previous works in terms of area and, especially, power consumption.
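The ordering property of the time-domain scheme is easy to simulate. In the toy model below (an illustration of the concept; the actual signaling is of course circuit-level), each monitor pulses a shared wire after a delay inversely related to its reading, so extreme values arrive first and the controller can stop listening early; that is the data selectivity the threshing level exploits.

```python
# Toy model of single-wire, time-domain readout: each monitor fires
# after a delay that decreases with its reading, so the controller
# receives values already sorted from hottest to coldest.
# (An illustration of the concept, not the thesis' circuit.)

def time_domain_readout(readings, keep=2, lsb_ns=10.0):
    """readings: {monitor_id: code}. Larger codes fire earlier.
    Returns the first `keep` arrivals, i.e. the hottest monitors."""
    full_scale = max(readings.values())
    events = sorted(((full_scale - code) * lsb_ns, mid, code)
                    for mid, code in readings.items())
    # the controller "threshes": it keeps the extremes, ignores the rest
    return [(mid, code) for _, mid, code in events[:keep]]

print(time_domain_readout({"m0": 41, "m1": 77, "m2": 63, "m3": 77}))
# -> the two hottest monitors: [('m1', 77), ('m3', 77)]
```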

Relevance:

70.00%

Publisher:

Abstract:

OBJECTIVES Many paediatric antiretroviral therapy (ART) programmes in Southern Africa rely on CD4⁺ cell counts to monitor ART. We assessed the benefit of replacing CD4⁺ with viral load monitoring. DESIGN A mathematical modelling study. METHODS A simulation model of HIV progression over 5 years in children on ART, parameterized by data from seven South African cohorts. We simulated treatment programmes with 6-monthly CD4⁺ or 6- or 12-monthly viral load monitoring. We compared mortality, second-line ART use, immunological failure and time spent on failing ART. In further analyses, we varied the rate of virological failure, and assumed that the rate is higher with CD4⁺ than with viral load monitoring. RESULTS About 7% of children were predicted to die within 5 years, independent of the monitoring strategy. Compared with CD4⁺ monitoring, 12-monthly viral load monitoring reduced the 5-year risk of immunological failure from 1.6 to 1.0% and the mean time spent on failing ART from 6.6 to 3.6 months; 1% of children with CD4⁺ compared with 12% with viral load monitoring switched to second-line ART. Differences became larger when assuming higher rates of virological failure. When assuming higher virological failure rates with CD4⁺ than with viral load monitoring, up to 4.2% of children with CD4⁺ compared with 1.5% with viral load monitoring experienced immunological failure; the mean time spent on failing ART was 27.3 months with CD4⁺ monitoring and 6.0 months with viral load monitoring. CONCLUSION Viral load monitoring did not affect 5-year mortality, but reduced time on failing ART, improved immunological response and increased switching to second-line ART.
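For readers who want a feel for the mechanism, here is a toy discrete-time version of the comparison: a child may fail virologically in any month, and the failure is only caught at a scheduled monitoring visit, with a much higher per-visit detection probability for viral load than for CD4⁺. All rates below are placeholders, not the parameters estimated from the South African cohorts.

```python
# Toy monthly-cycle cohort simulation contrasting monitoring strategies
# (illustrative only: the state structure and all rates are placeholders,
# not the parameters fitted to the South African cohorts).
import random

def simulate_child(monitor="viral_load", interval=12, months=60,
                   p_fail=0.004, p_detect_vl=0.9, p_detect_cd4=0.1):
    """Return months spent on failing first-line ART before a switch."""
    failing_since = None
    months_failing = 0
    for m in range(1, months + 1):
        if failing_since is None and random.random() < p_fail:
            failing_since = m                   # virological failure begins
        if failing_since is not None:
            months_failing += 1
            if m % interval == 0:               # scheduled monitoring visit
                p = p_detect_vl if monitor == "viral_load" else p_detect_cd4
                if random.random() < p:
                    return months_failing       # detected -> switch to 2nd line
    return months_failing

mean = lambda xs: sum(xs) / len(xs)
for strat, iv in (("cd4", 6), ("viral_load", 12)):
    t = mean([simulate_child(strat, iv) for _ in range(20_000)])
    print(f"{strat:>10} every {iv} mo: mean months on failing ART = {t:.1f}")
```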

Relevance:

70.00%

Publisher:

Abstract:

CO2 capture and storage (CCS) projects are presently being developed to reduce the emission of anthropogenic CO2 into the atmosphere. CCS technologies are expected to account for 20% of the CO2 reduction by 2050. One of the main concerns of CCS is whether CO2 will remain confined within the geological formation into which it is injected, since post-injection CO2 migration on the time scale of years, decades and centuries is not well understood. Theoretically, CO2 can be retained at depth i) as a supercritical fluid (physical trapping), ii) as a fluid slowly migrating in an aquifer due to a long flow path (hydrodynamic trapping), iii) dissolved into ground waters (solubility trapping) and iv) precipitated as secondary carbonates (mineral trapping). Carbon dioxide will be injected in the near future (2012) at Hontomín (Burgos, Spain) within the framework of the Compostilla EEPR project, led by the Fundación Ciudad de la Energía (CIUDEN). In order to detect leakage during the operational stage, a pre-injection geochemical baseline is presently being developed. In this work a geochemical monitoring design is presented to provide information about the feasibility of CO2 storage at depth.

Relevance:

70.00%

Publisher:

Abstract:

The integration of scientific knowledge about possible climate change impacts on water resources has direct implications for the way water policies are implemented and evolve. This is particularly true of the various technical steps embedded in EU Water Framework Directive river basin management planning, such as risk characterisation, monitoring, the design and implementation of action programmes, and evaluation of whether the "good status" objective is achieved (in 2015). The need to incorporate climate change considerations into the implementation of EU water policy is currently being discussed with a wide range of experts and stakeholders at EU level. Research is also ongoing, striving to support policy development and to examine how scientific findings and recommendations can best be taken on board by policy-makers and water managers in the coming years. This paper provides a snapshot of policy discussions about climate change in the context of WFD river basin management planning and of specific advances in related EU-funded research projects. Perspectives for strengthening links between the scientific and policy-making communities in this area are also highlighted.

Relevance:

70.00%

Publisher:

Abstract:

Objective: To assess and explain deviations from practice recommended in National Institute for Clinical Excellence (NICE) guidelines on fetal heart monitoring. Design: Qualitative study. Setting: Large teaching hospital in the UK. Sample: Sixty-six hours of observation of 25 labours and interviews with 20 midwives of varying grades. Methods: Structured observations of labour and semistructured interviews with midwives. Interviews were undertaken using a prompt guide, audiotaped, and transcribed verbatim. Analysis was based on the constant comparative method, assisted by QSR N5 software. Main outcome measures: Deviations from recommended practice in relation to fetal monitoring and insights into why these occur. Results: All babies involved in the study were safely delivered, but 243 deviations from practice recommended in NICE guidelines on fetal monitoring were identified, the majority (80%) relating to documentation. Other deviations concerned the indications for use of electronic fetal heart monitoring and the conduct of fetal heart monitoring. There is evidence of difficulties with the availability and maintenance of equipment, and of some deficits in staff knowledge and skill. Midwives reported differing orientations towards fetal monitoring, which are likely to influence practice. The initiation, management, and interpretation of fetal heart monitoring is complex and distributed across time, space, and professional boundaries, and practices in relation to fetal heart monitoring need to be understood within an organisational and social context. Conclusion: Some deviations from best practice guidelines may be rectified through straightforward interventions, including improved systems for managing equipment and training. Other deviations need to be understood as the outcomes of complex processes that are likely to defy easy resolution. © RCOG 2006.

Relevance:

60.00%

Publisher:

Abstract:

The project has provided management and other stakeholders with the information necessary to make informed decisions about the management of four of the key exploited shark species caught in the Queensland inshore net fishery and the northern New South Wales line fishery. The project determined that spatial management is appropriate for milk sharks within Queensland, and for scalloped hammerhead, common blacktip and Australian blacktip sharks within Queensland and New South Wales. It also determined that both blacktip shark species are likely to require co-operative management arrangements between Queensland and New South Wales. For scalloped hammerheads, separate stocks in the two jurisdictions were identified from the fisheries-dependent samples; however, genetic exchange across borders is likely to be facilitated by movement of adult females and, perhaps to a lesser extent, larger males. This information will greatly assist compliance with the Commonwealth Environment Protection and Biodiversity Conservation Act (1999) for shark fisheries in north-eastern Australia by providing the necessary basis for robust assessment of the status of stocks of the study species, thereby helping to deliver their sustainable harvest. It also helps to achieve objectives of the Australian National Shark Plan. The project provides the appropriate spatial framework for future monitoring and assessment of the study species, at a time when shark fisheries are receiving close attention from all sectors and when monitoring programs aimed at better assessment of stock status are being implemented. The project has provided the crucial information for developing an appropriate monitoring design, as well as the necessary basis for making statements about stock status, and has addressed research priorities identified by the Queensland Fisheries Research Advisory Board, the Great Barrier Reef Marine Park Authority and Queensland Fisheries. Previously, management had assumed a single stock for each species on the east coast of Queensland, and management of shark fisheries in New South Wales (NSW) and Queensland had been independent of one another. The project has been able to enhance and develop links between research, management and industry. Strong positive relationships with commercial fishers were crucial in the collection of samples throughout the study area, and fisheries managers were part of the project team throughout the study period. During the project the study area was extended to include both Queensland and NSW waters, creating mutually beneficial links between the States' research and management agencies. Extension of project results included management representatives from NSW and Queensland, as well as the Northern Territory, where similar shark fisheries operate and similar species are targeted. The project provided significant human capital development opportunities, adding considerable value to the project outcomes. Use of vertebral microchemistry and life history characteristics as stock determination methods provided material for two PhD students based at James Cook University: Ron Schroeder (vertebral chemistry) and Alastair Harry (life history characteristics).
The project has developed novel research methods with great capacity for future application, including:
• Development of a simple and rapid genetic diagnostic tool (an RT-HRM-PCR assay) for differentiating among the blacktip shark species, for which no simple morphological identifier exists; and
• Development of laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) methods for analysing and interpreting the microchemical composition of shark vertebrae.
The study provides further confirmation of the effectiveness of a holistic approach to stock structure studies and justifies investment in such studies.

Relevance:

60.00%

Publisher:

Abstract:

A device for real-time laser spot monitoring and automatic beam alignment was designed; it monitors the laser spot in real time and automatically aligns the laser output direction. Based on the lens imaging principle, a CCD detector captures a two-dimensional image of the spot, and automatic beam alignment is achieved on the principle that two points determine a straight line, using piezoelectric motorized adjustment mounts. The monitoring and control program, written in the virtual-instrument development environment LabVIEW, monitors the laser spot pattern and spot-position jitter in real time and applies feedback control. Testing shows that the device achieves an adjustment precision of 0.5 µrad and a feedback-control frequency of about 1 Hz, sufficient to reduce or eliminate spot drift with jitter periods longer than 1 s.
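The feedback step lends itself to a compact sketch. The snippet below is a Python illustration of the idea (the paper's implementation is a LabVIEW program driving piezoelectric mounts): it computes the intensity-weighted centroid of a CCD frame and a proportional steering correction toward a reference spot position. The pixel-to-angle scale factor and loop gain are assumed values.

```python
# Minimal sketch of centroid-based beam-steering feedback
# (an illustration; the paper's LabVIEW/hardware interfaces differ).
import numpy as np

def spot_centroid(frame: np.ndarray) -> tuple:
    """Intensity-weighted centroid (x, y) of a CCD frame (rows, cols)."""
    total = frame.sum()
    ys, xs = np.indices(frame.shape)
    return (float((xs * frame).sum() / total),
            float((ys * frame).sum() / total))

def alignment_correction(frame, ref_xy, urad_per_px=0.5, gain=0.6):
    """Proportional correction (in microradians) for one piezo axis pair.
    A gain < 1 avoids overshoot at the ~1 Hz feedback rate."""
    cx, cy = spot_centroid(frame)
    dx, dy = ref_xy[0] - cx, ref_xy[1] - cy
    return gain * dx * urad_per_px, gain * dy * urad_per_px
```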

Relevance:

60.00%

Publisher:

Abstract:

A process evaluation of the Houston Childhood Lead Poisoning Prevention Program, 1992-1995, was conducted. The Program's goal is to reduce the prevalence of lead poisoning. The study set out to determine to what extent the Program was implemented as planned by measuring how well Program services were actually: (1) received by the intended target population; (2) delivered to children with elevated blood lead levels; (3) delivered in compliance with Centers for Disease Control and Prevention and Program guidelines and timetables; and (4) able to reduce lead poisoning prevalence among those rescreened. Utilizing a program monitoring design, the Program's pre-collected computer records were reviewed. The study sample consisted of 820 children whose blood lead levels were above 15 micrograms per deciliter, representing approximately 2.9% of the 28,406 children screened over this period. Three blood lead levels from each participant were examined: the initial elevated result, the confirmatory result, and the next rescreen result after the elevated confirmatory level. Results showed that the Program screened approximately 18% (28,406 of 161,569) of Houston's children under age 6 for lead poisoning. Based on chi-square tests, lead-poisoned participants were more likely than those not lead poisoned to be younger than 3 years, male and Hispanic; the age, gender and ethnic differences observed were statistically significant (p = .01, p = .00 and p = .00, respectively). Four of the six Program services (medical evaluations, rescreening, environmental inspections and confirmation) had satisfactory delivery completion rates of 71%-98%. Delivery timetable compliance rates for three of the six services examined (outreach contacts, home visits and environmental inspections) were below 32%. However, dangerously elevated blood lead levels fell, and lead poisoning prevalence dropped from 3.3% at initial screening to 1.2% among those rescreened after intervention. From a public health perspective, reductions in lead poisoning prevalence are very meaningful. Based on these findings, recommendations for future research are to: (1) integrate Program database files using a computer database management program; (2) target services at Hispanic male children under age 3 living in the highest-risk neighborhoods; (3) increase resources to improve tracking and documentation of service delivery and to provide more non-medical case management and environmental services; and (4) share the evaluation methodology and findings with Centers for Disease Control and Prevention administrators, as the implications may be relevant to other program managers conducting such assessments.

Relevance:

40.00%

Publisher:

Abstract:

This paper investigates a wireless sensor network deployment for monitoring water quality (e.g., salinity and the level of the underground water table) in a remote tropical area of northern Australia. Our goal is to collect real-time water quality measurements together with the amount of water being pumped out of the area, and to investigate the impacts of current irrigation practice on the environment, in particular groundwater salinisation. This is a challenging task featuring wide geographic coverage (the mean transmission range between nodes is more than 800 meters), highly variable radio propagation, high end-to-end packet delivery rate requirements, and hostile deployment environments. We have designed, implemented and deployed a sensor network system that has been collecting water quality and flow measurements, e.g., water flow rate and water flow ticks, for over one month. The preliminary results show that sensor networks are a promising solution for deploying a sustainable irrigation system, e.g., maximizing the amount of water pumped out of an area with minimum impact on water quality.
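One of the stated requirements is a high end-to-end packet delivery rate; a minimal way to estimate it from sink-side logs is sketched below. The field names and the sequence-number convention are assumptions for the sketch, not the deployment's actual log format.

```python
# Simple end-to-end packet delivery rate (PDR) estimate from per-node
# sequence numbers as received at the sink (illustrative sketch only).
from collections import defaultdict

def delivery_rates(packets):
    """packets: iterable of (node_id, seq_no) received at the sink.
    PDR per node = unique packets received / packets sent, where the
    span of observed sequence numbers approximates the number sent."""
    seen = defaultdict(set)
    for node, seq in packets:
        seen[node].add(seq)
    return {node: len(seqs) / (max(seqs) - min(seqs) + 1)
            for node, seqs in seen.items()}

# e.g. node 3 sent seqs 0..99; 94 arrived -> span-based PDR estimate ~0.95
print(delivery_rates([(3, s) for s in range(100) if s % 17 != 0]))
```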