929 results for monitoring design


Relevance: 30.00%

Abstract:

OBJECTIVE: To assess the efficiency of alternative monitoring services for people with ocular hypertension (OHT), a glaucoma risk factor.

DESIGN: Discrete event simulation model comparing five alternative care pathways: treatment at OHT diagnosis with minimal monitoring; biennial monitoring (primary and secondary care) with treatment if baseline predicted 5-year glaucoma risk is ≥6%; monitoring and treatment aligned to National Institute for Health and Care Excellence (NICE) glaucoma guidance (conservative and intensive).

SETTING: UK health services perspective.

PARTICIPANTS: Simulated cohort of 10 000 adults with OHT (mean intraocular pressure (IOP) 24.9 mm Hg, SD 2.4).

MAIN OUTCOME MEASURES: Costs, glaucoma detected, quality-adjusted life years (QALYs).

RESULTS: Treating at diagnosis was the least costly and least effective in avoiding glaucoma and progression. Intensive monitoring following NICE guidance was the most costly and effective. However, considering a wider cost-utility perspective, biennial monitoring was less costly and provided more QALYs than NICE pathways, but was unlikely to be cost-effective compared with treating at diagnosis (£86 717 per additional QALY gained). The findings were robust to risk thresholds for initiating monitoring but were sensitive to treatment threshold, National Health Service costs and treatment adherence.

CONCLUSIONS: For confirmed OHT, glaucoma monitoring more frequently than every 2 years is unlikely to be efficient. Primary treatment and minimal monitoring (assessing treatment responsiveness via IOP) could be considered; however, further data to refine glaucoma risk prediction models and to value patient preferences for treatment are needed. Consideration of innovative and affordable service redesign focused on treatment responsiveness rather than more glaucoma testing is recommended.
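To make the cost-utility comparison concrete, the sketch below computes an incremental cost-effectiveness ratio (ICER) for one pathway relative to another and checks it against a willingness-to-pay threshold; all cost, QALY and threshold values are illustrative placeholders, not results from this simulation.

```python
# Hedged sketch: incremental cost-effectiveness ratio (ICER) between two
# hypothetical care pathways. All numbers are illustrative placeholders,
# not results from the discrete event simulation described above.

def icer(cost_a, qaly_a, cost_b, qaly_b):
    """ICER of pathway B relative to pathway A (cost per additional QALY)."""
    return (cost_b - cost_a) / (qaly_b - qaly_a)

# Placeholder per-patient means for two pathways (assumptions)
treat_at_diagnosis = {"cost": 3000.0, "qalys": 12.10}
biennial_monitoring = {"cost": 4200.0, "qalys": 12.11}

ratio = icer(treat_at_diagnosis["cost"], treat_at_diagnosis["qalys"],
             biennial_monitoring["cost"], biennial_monitoring["qalys"])
print(f"ICER: £{ratio:,.0f} per additional QALY")

# A pathway is usually judged cost-effective only if the ICER falls below a
# willingness-to-pay threshold (commonly £20,000-£30,000 per QALY in the UK).
THRESHOLD = 20_000.0
print("cost-effective" if ratio <= THRESHOLD else "not cost-effective")
```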

Relevance: 30.00%

Abstract:

Monitoring of coastal and estuarine water quality has traditionally been performed by sampling with subsequent laboratory analysis. This has the disadvantages of low spatial and temporal resolution and high cost. In the last decades two alternative techniques have emerged to overcome this drawback: profiling and remote sensing. Profiling using multi-parameter sensors is now at a commercial stage. It can be used, tied to a boat, to obtain a quick "picture" of the system. The spatial resolution thus increases from single points to a line coincident with the boat track. The temporal resolution, however, remains unchanged, since the campaigns and resources involved are basically the same. The need for laboratory analysis is reduced but not eliminated, because parameters like nutrients, microbiology or metals are still difficult to obtain with sensors, and validation measurements are still needed. In recent years the improvement in satellite resolution has enabled its use for coastal and estuarine water monitoring. Although the spatial coverage and resolution of satellite images are already suitable for coastal and estuarine monitoring, temporal resolution is naturally limited by satellite passages and cloud cover. Given this panorama, the best approach to water monitoring is to integrate and combine data from all these sources. The natural tools to perform this integration are numerical models. Models benefit from the different sources of data to obtain a better calibration. After calibration they can be used to extend the methods' resolution spatially and temporally. In the Algarve (southern Portugal) a monitoring effort using this approach is being undertaken. It comprises five different locations, including coastal waters, estuaries and coastal lagoons. The objective is to establish the baseline situation in order to evaluate the impact of wastewater treatment plant design and retrofitting. The field campaigns include monthly synoptic profiling using a YSI 6600 multi-parameter system, laboratory analysis and fixed stations. The remote sensing uses ENVISAT/MERIS Level 2 Full Resolution data. These data are combined and used with the MOHID modelling system to obtain an integrated description of the systems. The results show the limitations of each method and the ability of the modelling system to integrate the results and produce a comprehensive picture of the system.
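A minimal sketch of the data-integration step: comparing model output against in situ profiling and satellite-derived values at co-located points, assuming simple matched arrays. The variable, station values and metrics are illustrative; the MOHID calibration workflow itself is not reproduced here.

```python
# Hedged sketch: skill metrics for comparing model output against in situ
# profiling and satellite observations at co-located points. Values are
# illustrative only, not data from the Algarve monitoring programme.
import numpy as np

def skill(model, obs):
    """Return bias and RMSE between model and observed values."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    diff = model - obs
    return {"bias": diff.mean(), "rmse": np.sqrt((diff ** 2).mean())}

# Illustrative co-located chlorophyll-a values (mg/m3) at a coastal lagoon
model_chl = [1.2, 0.9, 1.5, 2.0]
profile_chl = [1.0, 1.1, 1.4, 2.3]     # in situ multi-parameter probe
satellite_chl = [1.3, 0.8, 1.7, 1.9]   # satellite-derived product

print("model vs profiles :", skill(model_chl, profile_chl))
print("model vs satellite:", skill(model_chl, satellite_chl))
```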

Relevance: 30.00%

Abstract:

Thesis to obtain the Master's Degree in Electronics and Telecommunications Engineering

Relevance: 30.00%

Abstract:

Potentiometric sensors are typically unable to carry out on-site monitoring of environmental drug contaminants because of their high limits of detection (LODs). Designing a novel ligand material for the target analyte and managing the composition of the internal reference solution were the strategies employed here to produce, for the first time, a potentiometric-based direct-reading method for an environmental drug contaminant. This concept has been applied to sulfamethoxazole (SMX), one of the many antibiotics used in aquaculture practices that may occur in environmental waters. The novel ligand was produced by imprinting SMX on the surface of graphitic carbon nanostructures (CN) < 500 nm. The imprinted carbon nanostructures (ICN) were dispersed in plasticizer and entrapped in a PVC matrix that included (or not) a small amount of a lipophilic additive. The membrane composition was optimized on solid-contact electrodes, allowing near-Nernstian responses down to 5.2 μg/mL and detection of 1.6 μg/mL. The membranes offered good selectivity against most of the ionic compounds in environmental water. The best membrane cocktail was applied on the smaller end of a 1000 μL micropipette tip made of polypropylene. The tip was then filled with an inner reference solution containing SMX and chlorate (as interfering compound). The corresponding concentrations studied ranged from 1 × 10−5 to 1 × 10−10 mol/L and from 1 × 10−3 to 1 × 10−8 mol/L, respectively. The best condition allowed the detection of 5.92 μg/L (2.3 × 10−8 mol/L) SMX, with a sub-Nernstian slope of −40.3 mV/decade from 5.0 × 10−8 to 2.4 × 10−5 mol/L.
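For context, the "near-Nernstian" and "sub-Nernstian" slopes quoted above refer to the standard potentiometric response law; the expression below is the general textbook form for a monovalent anion at 25 °C, added only as a reminder of where the ideal 59.2 mV/decade benchmark comes from.

```latex
% Ideal Nernstian response of an ion-selective electrode toward a monovalent
% anion (such as the sulfamethoxazole anion) at 25 degrees Celsius:
\[
  E \;=\; E^{0} \;-\; \frac{2.303\,R\,T}{z\,F}\,\log a_{\mathrm{SMX}},
  \qquad
  \frac{2.303\,R\,T}{z\,F}\bigg|_{T = 298\,\mathrm{K},\, z = 1} \approx 59.2\ \mathrm{mV/decade}.
\]
% A measured slope of -40.3 mV/decade is therefore "sub-Nernstian": the electrode
% responds in the expected direction but with a smaller magnitude than the ideal slope.
```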

Relevance: 30.00%

Abstract:

Teaching and learning computer programming is as challenging as it is demanding. Assessing students' work and providing individualised feedback to all of them is time-consuming and error-prone for teachers, and frequently involves a time delay. Existing tools and specifications prove insufficient in complex evaluation domains where there is a greater need for practice. At the same time, Massive Open Online Courses (MOOCs) are appearing, revealing a new way of learning that is more dynamic and more accessible. However, this new paradigm raises serious questions regarding the monitoring of student progress and the timely delivery of feedback. This paper provides a conceptual design model for a computer programming learning environment. This environment uses the portal interface design model, gathering information from a network of services such as repositories and program evaluators. The design model also includes integration with learning management systems, a central piece in the MOOC realm, endowing the model with characteristics such as scalability, collaboration and interoperability. This model is not limited to the domain of computer programming and can be adapted to any complex area that requires systematic evaluation with immediate feedback.
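As a hypothetical illustration of the kind of service network the design model describes, the sketch below wires an exercise repository and a program evaluator behind one submission call and returns a score that a learning management system could record; every class, exercise and test value here is invented for illustration and is not part of the published model.

```python
# Hedged sketch of a submission flow in a programming-exercise environment:
# fetch the exercise from a repository service, run the learner's program
# through an automatic evaluator, and return a score for the LMS to record.
# All names are hypothetical; a real evaluator would sandbox the learner code.
import subprocess
import sys
from dataclasses import dataclass

@dataclass
class Exercise:
    exercise_id: str
    tests: list  # list of (stdin, expected stdout) pairs

class ExerciseRepository:
    """Stands in for a remote repository of learning objects."""
    def get(self, exercise_id: str) -> Exercise:
        return Exercise(exercise_id, tests=[("2 3", "5"), ("10 -4", "6")])

class Evaluator:
    """Stands in for an automatic program evaluator service."""
    def grade(self, source: str, exercise: Exercise) -> float:
        passed = 0
        for stdin, expected in exercise.tests:
            result = subprocess.run([sys.executable, "-c", source],
                                    input=stdin, capture_output=True,
                                    text=True, timeout=5)
            passed += result.stdout.strip() == expected
        return passed / len(exercise.tests)

def submit(exercise_id: str, source: str) -> float:
    exercise = ExerciseRepository().get(exercise_id)
    score = Evaluator().grade(source, exercise)
    # In the design model this score would be pushed back to the LMS
    # so the learner receives immediate, individualised feedback.
    return score

print(submit("sum-two-ints", "print(sum(map(int, input().split())))"))
```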

Relevance: 30.00%

Abstract:

The purpose of this work is to develop a practicable approach for Telecom firms to manage the credit risk exposure of their commercial agents' network. In particular, it approaches the problem of credit concession to clients from a corporate perspective and explores the particular scenario of agents that are part of the corporation's commercial chain and are therefore not end-users. The agents' network that served as a model for the present study is composed of companies that are, at the same time, both clients and suppliers of the Telecommunication Company. In that sense, the credit exposure analysis must take into consideration all financial flows, both inbound and outbound. The current strain on the Financial Sector in Portugal, and in other peripheral European economies, combined with the highly leveraged situation of most companies, creates an environment prone to credit default risk. Under these circumstances, managing credit risk exposure is becoming an increasingly critical function for every company's Financial Department. The approach designed in the current study combines two traditional risk monitoring tools: credit risk scoring and credit limitation policies. The objective was to design a new credit monitoring framework that is more flexible, uses both external and internal relationship history to assess risk, and takes into consideration commercial objectives inside the agents' network. Although not explored at length, the blueprint of a Credit Governance model was created for implementing the new credit monitoring framework inside the telecom firm. The Telecom Company that served as a model for the present work decided to implement the new Credit Monitoring framework after it was presented to its Executive Commission.
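To make the combination of the two monitoring tools concrete, the sketch below scores an agent from a few illustrative internal and external indicators and derives a credit limit from the score and the agent's net commercial exposure; the weights, thresholds and field names are invented for illustration and are not the framework actually adopted by the firm.

```python
# Hedged sketch: combining a simple credit-risk score (external rating plus
# internal payment history) with a credit-limit policy scaled by the agent's
# net exposure. Weights, thresholds and field names are illustrative only.
from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    external_rating: float           # 0 (worst) to 1 (best), e.g. bureau rating
    on_time_payment_rate: float      # share of invoices paid on time (internal history)
    monthly_purchases: float         # purchases from the telecom (EUR, inbound flow)
    monthly_sales_to_telecom: float  # services sold back to the telecom (EUR, outbound flow)

def risk_score(agent: Agent) -> float:
    """Blend external and internal indicators into a 0-1 score (illustrative weights)."""
    return 0.4 * agent.external_rating + 0.6 * agent.on_time_payment_rate

def credit_limit(agent: Agent) -> float:
    """Limit = score-dependent multiple of net monthly exposure (purchases minus sales)."""
    net_exposure = max(agent.monthly_purchases - agent.monthly_sales_to_telecom, 0.0)
    score = risk_score(agent)
    multiple = 2.0 if score >= 0.8 else 1.0 if score >= 0.5 else 0.5
    return multiple * net_exposure

a = Agent("Agent X", external_rating=0.7, on_time_payment_rate=0.9,
          monthly_purchases=50_000, monthly_sales_to_telecom=10_000)
print(f"score={risk_score(a):.2f}, credit limit={credit_limit(a):,.0f} EUR")
```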

Relevance: 30.00%

Abstract:

For many drugs, finding the balance between efficacy and toxicity requires monitoring their concentrations in the patient's blood. Quantifying drug levels at the bedside or at home would have advantages in terms of therapeutic outcome and convenience, but current techniques require a diagnostic laboratory setting. We have developed semisynthetic bioluminescent sensors that permit precise measurements of drug concentrations in patient samples by spotting minimal volumes on paper and recording the signal using a simple point-and-shoot camera. Our sensors have a modular design consisting of a protein-based part and a synthetic part, and can be engineered to selectively recognize a wide range of drugs, including immunosuppressants, antiepileptics, anticancer agents and antiarrhythmics. This low-cost point-of-care method could make therapies safer, increase convenience for doctors and patients, and make therapeutic drug monitoring available in regions with poor infrastructure.
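The readout step described above amounts to mapping a measured signal onto a calibration curve; the sketch below inverts a simple one-site binding model for that purpose, with the functional form and all parameter values invented for illustration (the published sensor chemistry is not modelled here).

```python
# Hedged sketch: converting a measured luminescence signal ratio into a drug
# concentration via a one-site binding calibration curve. The model and all
# parameter values are illustrative, not the published sensor chemistry.

def ratio_from_conc(c, r_free, r_bound, kd):
    """Forward calibration: signal ratio as a function of drug concentration c."""
    frac_bound = c / (c + kd)
    return r_free + (r_bound - r_free) * frac_bound

def conc_from_ratio(r, r_free, r_bound, kd):
    """Invert the calibration curve to estimate concentration from a measured ratio."""
    frac_bound = (r - r_free) / (r_bound - r_free)
    frac_bound = min(max(frac_bound, 1e-9), 1 - 1e-9)  # clamp to the valid range
    return kd * frac_bound / (1 - frac_bound)

# Illustrative calibration parameters: ratio with no drug, ratio at saturation,
# and apparent dissociation constant (in micromolar)
R_FREE, R_BOUND, KD = 0.2, 2.0, 5.0

measured_ratio = ratio_from_conc(3.0, R_FREE, R_BOUND, KD)  # simulate a 3 uM sample
estimate = conc_from_ratio(measured_ratio, R_FREE, R_BOUND, KD)
print(f"estimated concentration: {estimate:.2f} uM")
```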

Relevance: 30.00%

Abstract:

Questions: A multiple-plot design was developed for permanent vegetation plots. How reliable are the different methods used in this design, and which changes can we measure? Location: Alpine meadows (2430 m a.s.l.) in the Swiss Alps. Methods: Four inventories were obtained from 40 m² plots: four subplots (0.4 m²) with a list of species, two 10 m transects with the point method (50 points on each), one subplot (4 m²) with a list of species and visual cover estimates as a percentage, and the complete plot (40 m²) with a list of species and visual estimates in classes. This design was tested by five to seven experienced botanists in three plots. Results: Whatever the sampling size, only 45-63% of the species were seen by all the observers. However, the majority of the overlooked species had cover < 0.1%. Pairs of observers overlooked 10-20% fewer species than single observers. The point method was the best method for cover estimation, but it took much longer than visual cover estimates, and 100 points allowed for the monitoring of only a very limited number of species. The visual estimate as a percentage was more precise than classes. Working in pairs did not improve the estimates, but one botanist repeating the survey is more reliable than a succession of different observers. Conclusion: Lists of species are insufficient for monitoring. It is necessary to add cover estimates to allow for subsequent interpretations in spite of the overlooked species. The choice of method depends on the available resources: the point method is time-consuming but gives precise data for a limited number of species, while visual estimates are quick but allow for recording only large changes in cover. Constant pairs of observers improve the reliability of the records.
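A brief sketch of the point-method cover estimate used in this design: with two transects of 50 points each, cover is simply the proportion of points hitting the species, so a single survey cannot resolve cover finer than 1/100, which is why very rare species are easily missed. The values below are simulated, not data from the study.

```python
# Hedged sketch: point-intercept cover estimation. With n points, estimated
# cover = hits / n, so 100 points (two 50-point transects) cannot resolve
# cover finer than 1%. Numbers are simulated, not data from the study.
import random

def point_method_estimate(true_cover: float, n_points: int, rng: random.Random) -> float:
    """Simulate one survey: each point hits the species with probability true_cover."""
    hits = sum(rng.random() < true_cover for _ in range(n_points))
    return hits / n_points

rng = random.Random(42)
for true_cover in (0.30, 0.02, 0.001):   # common, scarce, and very rare species
    estimates = [point_method_estimate(true_cover, 2 * 50, rng) for _ in range(1000)]
    detected = sum(e > 0 for e in estimates) / len(estimates)
    mean_est = sum(estimates) / len(estimates)
    print(f"true cover {true_cover:>6.3f}: mean estimate {mean_est:.4f}, "
          f"detected in {detected:.0%} of simulated surveys")
```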

Relevance: 30.00%

Abstract:

BACKGROUND AND OBJECTIVES: The SBP values to be achieved by antihypertensive therapy in order to maximize the reduction of cardiovascular outcomes are unknown; neither is it clear whether, in patients with a previous cardiovascular event, the optimal values are lower than in low-to-moderate-risk hypertensive patients, or whether a more cautious blood pressure (BP) reduction should be pursued. Because of the uncertainty over whether 'the lower the better' or the 'J-curve' hypothesis is correct, the European Society of Hypertension and the Chinese Hypertension League have promoted a randomized trial comparing antihypertensive treatment strategies aiming at three different SBP targets in hypertensive patients with a recent stroke or transient ischaemic attack. As the optimal level of low-density lipoprotein cholesterol (LDL-C) is also unknown in these patients, LDL-C lowering has been included in the design. PROTOCOL DESIGN: The European Society of Hypertension-Chinese Hypertension League Stroke in Hypertension Optimal Treatment trial is a prospective multinational, randomized trial with a 3 × 2 factorial design comparing three different SBP targets (1, <145-135; 2, <135-125; 3, <125 mmHg) and two different LDL-C targets (target A, 2.8-1.8; target B, <1.8 mmol/l). The trial is to be conducted on 7500 patients aged at least 65 years (2500 in Europe, 5000 in China) with hypertension and a stroke or transient ischaemic attack 1-6 months before randomization. Antihypertensive and statin treatments will be initiated or modified using suitable registered agents chosen by the investigators, in order to maintain patients within the randomized SBP and LDL-C windows. All patients will be followed up every 3 months for BP and every 6 months for LDL-C. Ambulatory BP will be measured yearly. OUTCOMES: The primary outcome is time to stroke (fatal and non-fatal). Important secondary outcomes are: time to first major cardiovascular event; cognitive decline (Montreal Cognitive Assessment); and dementia. All major outcomes will be adjudicated by committees blind to randomized allocation. A Data and Safety Monitoring Board has open access to the data and can recommend trial interruption for safety. SAMPLE SIZE CALCULATION: It has been calculated that 925 patients would reach the primary outcome after a mean 4-year follow-up, and this should provide at least 80% power to detect a 25% stroke difference between SBP targets and a 20% difference between LDL-C targets.
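As a rough check on the events-driven sample-size logic, the sketch below applies the standard Schoenfeld approximation for the number of events needed to detect a given relative stroke reduction in a 1:1 two-group comparison; it deliberately simplifies the trial's three-arm factorial calculation and uses generic alpha and power settings, so it is not the trial's statistical analysis plan.

```python
# Hedged sketch: Schoenfeld's approximation for the number of primary-outcome
# events needed to detect a relative risk reduction between two equal-sized
# groups. This simplifies the 3x2 factorial design to a pairwise comparison
# and uses generic settings (two-sided alpha = 0.05, power = 80%).
import math
from statistics import NormalDist

def events_needed(risk_reduction: float, alpha: float = 0.05, power: float = 0.80) -> float:
    """Events for a 1:1 comparison to detect hazard ratio (1 - risk_reduction)."""
    hr = 1.0 - risk_reduction
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)
    z_beta = z.inv_cdf(power)
    return 4 * (z_alpha + z_beta) ** 2 / math.log(hr) ** 2

for rr in (0.25, 0.20):   # the 25% and 20% differences mentioned in the abstract
    print(f"{rr:.0%} reduction: ~{events_needed(rr):.0f} events needed")
```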

Relevance: 30.00%

Abstract:

… years 8 months) and 24 older (M = 7 years 4 months) children. A Monitoring Process Model (MPM) was developed and tested in order to ascertain at which component process of the MPM age differences would emerge. The MPM had four components: (1) assessment; (2) evaluation; (3) planning; and (4) behavioural control. The MPM was assessed directly using a referential communication task in which the children were asked to make a series of five Lego buildings (a baseline condition and one building for each MPM component). Children listened to instructions from one experimenter while a second experimenter in the room (a confederate) interjected varying levels of verbal feedback in order to assist the children and control the component of the MPM. This design allowed us to determine at which "stage" of processing children would most likely have difficulty monitoring themselves in this social-cognitive task. Developmental differences were observed for the evaluation, planning and behavioural control components, suggesting that older children were able to be more successful with the more explicit metacomponents. Interestingly, however, there was no age difference in terms of Lego task success in the baseline condition, suggesting that without the intervention of the confederate younger children monitored the task about as well as older children. This pattern of results indicates that the younger children were disrupted by the feedback rather than helped. On the other hand, the older children were able to incorporate the feedback offered by the confederate into a plan of action. Another aim of this study was to assess similar processing components to those investigated by the MPM Lego task in a more naturalistic observation. Together the use of the Lego Task (a social-cognitive task) and the naturalistic social interaction allowed for the appraisal of cross-domain continuities and discontinuities in monitoring behaviours. In this vein, analyses were undertaken in order to ascertain whether or not successful performance in the MPM Lego Task would predict cross-domain competence in the more naturalistic social interchange. Indeed, success in the two latter components of the MPM (planning and behavioural control) was related to overall competence in the naturalistic task. However, this cross-domain prediction was not evident for all levels of the naturalistic interchange, suggesting that the nature of the feedback a child receives is an important determinant of response competency. Individual difference measures reflecting the children's general cognitive capacity (Working Memory and Digit Span) and verbal ability (vocabulary) were also taken in an effort to account for more variance in the prediction of task success. However, these individual difference measures did not serve to enhance the prediction of task performance in either the Lego Task or the naturalistic task. Similarly, parental responses to questionnaires pertaining to their child's temperament and social experience also failed to increase prediction of task performance. On-line measures of the children's engagement, positive affect and anxiety also failed to predict competence ratings.

Relevance: 30.00%

Abstract:

The advent of lasers, together with advances in fiber optics technology, has revolutionized sensor technology. Advances in the telemetric applications of optical-fiber-based measurements are an added bonus. The present thesis describes a variety of fiber-based sensors using techniques such as microbending, long-period gratings and evanescent waves. Sensors to measure various physical and chemical parameters are described in this thesis.

Relevance: 30.00%

Abstract:

In recent years, photonics has emerged as an essential technology in fields as diverse as laser technology, fiber optics, communication, optical signal processing, computing, entertainment and consumer electronics. The availability of semiconductor lasers and low-loss fibers has also revolutionized the field of sensor technology, including telemetry. Fiber optic sensors are sensitive, reliable, lightweight and accurate devices that find applications in a wide range of areas such as biomedicine, aviation, surgery and pollution monitoring, apart from areas in basic sciences. The present thesis deals with the design, fabrication and characterization of a variety of cost-effective and sensitive fiber optic sensors for the trace detection of certain environmental pollutants in air and water. The sensor design is carried out using techniques such as evanescent waves, microbending and long-period gratings.

Relevance: 30.00%

Abstract:

The amateur birding community has a long and proud tradition of contributing to bird surveys and bird atlases. Coordinated activities such as Breeding Bird Atlases and the Christmas Bird Count are examples of "citizen science" projects. With the advent of technology, Web 2.0 sites such as eBird have been developed to facilitate online sharing of data and thus increase the potential for real-time monitoring. However, as recently articulated in an editorial in this journal and elsewhere, monitoring is best served when based on a priori hypotheses. Harnessing citizen scientists to collect data following a hypothetico-deductive approach carries challenges. Moreover, the use of citizen science in scientific and monitoring studies has raised issues of data accuracy and quality. These issues are compounded when data collection moves into the Web 2.0 world. An examination of the literature from social geography on the concept of "citizen sensors" and volunteered geographic information (VGI) yields thoughtful reflections on the challenges of data quality/data accuracy when applying information from citizen sensors to research and management questions. VGI has been harnessed in a number of contexts, including for environmental and ecological monitoring activities. Here, I argue that conceptualizing a monitoring project as an experiment following the scientific method can further contribute to the use of VGI. I show how principles of experimental design can be applied to monitoring projects to better control for data quality of VGI. This includes suggestions for how citizen sensors can be harnessed to address issues of experimental controls and how to design monitoring projects to increase randomization and replication of sampled data, hence increasing scientific reliability and statistical power.
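To illustrate the experimental-design point in code, the sketch below assigns volunteer observers to survey sites using stratified randomization with replication, so that each site in each habitat stratum is covered by several independent observers rather than self-selected ones; the strata, site codes and observer names are invented for illustration.

```python
# Hedged sketch: randomized, replicated assignment of citizen-science observers
# to survey sites, stratified by habitat. Each site gets several independent
# observers (replication) drawn at random (randomization), which helps control
# for observer-driven data quality issues. All names and counts are illustrative.
import random
from collections import defaultdict

def assign_observers(sites_by_stratum, observers, replicates=3, seed=1):
    """Return {(stratum, site): [observers]} with `replicates` observers per site."""
    rng = random.Random(seed)
    assignments = defaultdict(list)
    for stratum, sites in sites_by_stratum.items():
        for site in sites:
            for obs in rng.sample(observers, k=replicates):
                assignments[(stratum, site)].append(obs)
    return dict(assignments)

sites = {"wetland": ["W1", "W2"], "forest": ["F1", "F2", "F3"]}
volunteers = ["ana", "ben", "carla", "dev", "emma", "finn"]

for (stratum, site), obs in assign_observers(sites, volunteers).items():
    print(f"{stratum:>7} {site}: {', '.join(obs)}")
```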

Relevance: 30.00%

Abstract:

The Improved Stratospheric and Mesospheric Sounder (ISAMS) is designed to measure the Earth's middle atmosphere in the range of 4.6 to 16.6 microns. This paper considers all the coated optical elements in two radiometric test channels. (Analysis of the spectral response will be presented in a separate paper at this symposium; see Sheppard et al.) Comparisons between the computed spectral performance and measurements from actual coatings will be discussed; these will include substrate absorption simulations. The results of environmental testing (durability and stability) are included, together with details of coating deposition and monitoring conditions.
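For readers unfamiliar with how the "computed spectral performance" of a coating is usually obtained, the sketch below applies the characteristic-matrix (transfer-matrix) method to a single lossless layer on a substrate at normal incidence; the refractive indices, thickness and wavelengths are illustrative and are not the ISAMS coating designs.

```python
# Hedged sketch: characteristic-matrix calculation of the normal-incidence
# reflectance of one homogeneous dielectric layer on a substrate, the standard
# way multilayer coating spectral performance is computed. Indices, thickness
# and wavelengths are illustrative and lossless, not the ISAMS designs.
import numpy as np

def reflectance(n_layer, d_layer, n_substrate, wavelength, n_incident=1.0):
    """Reflectance of a single layer (index n_layer, thickness d_layer) on a substrate."""
    delta = 2 * np.pi * n_layer * d_layer / wavelength            # phase thickness
    m = np.array([[np.cos(delta), 1j * np.sin(delta) / n_layer],
                  [1j * n_layer * np.sin(delta), np.cos(delta)]])
    b, c = m @ np.array([1.0, n_substrate])                       # [B, C] vector
    y = c / b                                                      # input optical admittance
    r = (n_incident - y) / (n_incident + y)
    return abs(r) ** 2

# Quarter-wave antireflection check: n_layer = sqrt(n_substrate) gives R ~ 0
n_sub, design_wl = 4.0, 10.0e-6        # a germanium-like substrate at 10 microns
n_ar = np.sqrt(n_sub)
d_qw = design_wl / (4 * n_ar)
for wl in (8e-6, 10e-6, 12e-6):
    print(f"{wl * 1e6:4.0f} um: R = {reflectance(n_ar, d_qw, n_sub, wl):.4f}")
```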

Relevance: 30.00%

Abstract:

1. Closed Ecological Systems (CES) are small man-made ecosystems which do not have any material exchange with the surrounding environment. Recent ecological and technological advances enable the successful establishment and maintenance of CES, making them a suitable tool for detecting and measuring subtle feedbacks and mechanisms. 2. As part of an analogue (physical) C cycle modelling experiment, we developed a non-intrusive methodology to control the internal environment and to monitor atmospheric CO2 concentration inside 16 replicated CES. Whilst maintaining an air-tight seal of all CES, this approach allowed access to the CO2 measuring equipment for periodic re-calibration and repairs. 3. To ensure reliable cross-comparison of CO2 observations between individual CES units and to minimise the cost of the system, only one CO2 sampling unit was used. An ADC BioScientific OP-2 (open-path) analyser mounted on a swinging arm passed over a set of 16 measuring cells. Each cell was connected to an individual CES, with air continuously circulating between them. 4. Using this setup, we were able to continuously measure several environmental variables and the CO2 concentration within each closed system, allowing us to study minute effects of changing temperature on C fluxes within each CES. The CES and the measuring cells showed minimal air leakage during an experimental run lasting, on average, 3 months. The CO2 analyser assembly performed reliably for over 2 years; however, an early iteration of the present design proved to be sensitive to positioning errors. 5. We indicate how the methodology can be further improved and suggest possible avenues where future CES-based research could be applied.
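A minimal sketch of the single-analyser multiplexing idea described in points 3-4: one measurement head steps across the 16 cells in turn and logs a time-stamped CO2 reading per CES. The read function and dwell time are placeholders, since the actual instrument control software is not described in the abstract.

```python
# Hedged sketch: multiplexing one CO2 analyser over 16 measuring cells, each
# connected to its own closed ecological system (CES). The read_co2_ppm()
# function and the dwell time are placeholders for the real instrument I/O.
import random
import time
from datetime import datetime, timezone

N_CELLS = 16
DWELL_SECONDS = 2  # time the sensor head spends over each cell (placeholder)

def read_co2_ppm(cell: int) -> float:
    """Placeholder for the open-path analyser reading above one cell."""
    return 400.0 + 20.0 * random.random() + cell  # illustrative value only

def sweep(n_cells: int = N_CELLS) -> list:
    """One pass of the swinging arm: visit every cell and log a reading."""
    log = []
    for cell in range(n_cells):
        time.sleep(DWELL_SECONDS)  # wait for the reading to stabilise
        log.append({
            "time": datetime.now(timezone.utc).isoformat(timespec="seconds"),
            "cell": cell,
            "co2_ppm": round(read_co2_ppm(cell), 1),
        })
    return log

for record in sweep(n_cells=3):  # short demo pass over the first three cells
    print(record)
```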