864 results for Monitoring Program Design
Abstract:
We investigate the effect of education Conditional Cash Transfer (CCT) programs on teenage pregnancy. Our main concern is how the size and sign of the effect may depend on the design of the program. Using a simple model, we show that an education CCT that conditions renewal on school performance reduces teenage pregnancy; the program can increase teenage pregnancy if it does not condition on school performance. Then, using an original database, we estimate the causal impact on teenage pregnancy of two education CCTs implemented in Bogotá (Subsidio Educativo, SE, and Familias en Acción, FA); the two programs differ chiefly in whether school success is a condition for renewal. We show that SE has a negative average effect on teenage pregnancy while FA has a null average effect. We also find that SE has either a null or a negative effect for adolescents in all age and grade groups, while FA has positive, null or negative effects for adolescents in different age and grade groups. Since SE conditions renewal on school success and FA does not, we argue that the empirical results are consistent with the predictions of our model and that conditioning renewal of the subsidy on school success crucially determines the effect of the subsidy on teenage pregnancy.
Abstract:
This thesis is based on the programme to reintroduce the Eurasian otter (Lutra lutra) to the Muga and Fluvià river basins (Catalonia) during the second half of the 1990s. The aims of the thesis were to demonstrate the feasibility of the reintroduction, to demonstrate its success, to study ecological and ethological aspects of the species, taking advantage of the unique opportunity offered by a "designed" population, and to determine the long-term survival probability of the population. The reintroduction of the otter to the Muga and Fluvià basins succeeded: the geographic area effectively occupied increased to 64% of positive survey stations by winter 2001-02. The discovery of three adult individuals born in the reintroduction area is further evidence of the programme's success. The density of individuals estimated from visual censuses was low (0.04-0.11 otters/km), but close to what can be expected in the early stages of a reintroduced population, still small in number but spread over a large area. Post-release mortality was 22% one year after release, similar to or lower than that of other successful otter reintroduction programmes. Mortality was mainly due to road kills (56%). The activity pattern of the reintroduced otters became chiefly nocturnal and crepuscular, with little diurnal activity. Their home ranges were of the same order (34.2 km) as those calculated in other European studies. The mean length of river travelled by an otter in 24 hours was 4.2 km for females and 7.6 km for males. During the radio-tracking period two females bred, and their movements could be studied in detail.
The response of the new otter population to the seasonal fluctuations in water availability typical of Mediterranean regions was to concentrate in a smaller area during the summer drought, owing to the increase in dry stretches, uninhabitable for otters because of the lack of food; this caused periodic expansions and contractions of the distribution area. The long-term persistence of the reintroduced population was studied by means of a Population Viability Analysis (PVA). The result was a low risk of extinction over the next 100 years, and most of the simulated scenarios (65%) met the criterion of at least a 90% probability of survival. The population model built indicates that a key point for ensuring the viability of the reintroduced population is the reduction of accidental mortality. In the study area, road kills cause more than 50% of mortality, and this can be reduced by building wildlife crossings, fencing some dangerous road stretches and controlling speed on certain roads. The reintroduction project developed a protocol for the capture, handling and release of wild otters that may contain useful information for similar programmes. It also provided a unique opportunity to study an artificially designed population and to compare several methods for estimating the distribution and density of otter populations. Finally, the reintroduction carried out in the Muga and Fluvià basins has created a new otter population that persists over time, breeds regularly and disperses progressively, even into new river basins.
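The viability analysis mentioned above can be illustrated with a toy stochastic projection; all parameter values below are illustrative assumptions, not those of the thesis's model:

```python
import random

def survival_probability(n0=20, years=100, growth=1.4,
                         accident_mort=0.22, env_sd=0.15,
                         quasi_extinct=2, runs=500, seed=42):
    """Toy PVA-style projection (all parameters are illustrative):
    each year the population grows with environmental noise, then
    suffers accidental mortality; a run 'survives' if it never
    drops below the quasi-extinction threshold."""
    random.seed(seed)
    survived = 0
    for _ in range(runs):
        n = float(n0)
        ok = True
        for _ in range(years):
            rate = growth * (1.0 + random.gauss(0.0, env_sd))
            n = n * max(rate, 0.0) * (1.0 - accident_mort)
            if n < quasi_extinct:
                ok = False
                break
        if ok:
            survived += 1
    return survived / runs
```

Lowering the accidental-mortality parameter in such a model is what the thesis identifies as the key management lever.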
Abstract:
This study examined oral education components that could be successfully implemented with culturally and linguistically diverse deaf and hard of hearing (DHH) children and their families. A literature review was conducted of oral program strategies used with culturally diverse families and their children with special needs, and of federal guidelines related to programs serving DHH children. Recent statistics on children from racially and linguistically diverse backgrounds in programs for DHH students were discussed. Additional data sources included classroom observations and multidisciplinary interviews. The data obtained were used to design a framework for oral programs to support culturally and linguistically diverse DHH children and their families.
Abstract:
The amateur birding community has a long and proud tradition of contributing to bird surveys and bird atlases. Coordinated activities such as Breeding Bird Atlases and the Christmas Bird Count are examples of "citizen science" projects. With the advent of technology, Web 2.0 sites such as eBird have been developed to facilitate online sharing of data and thus increase the potential for real-time monitoring. However, as recently articulated in an editorial in this journal and elsewhere, monitoring is best served when based on a priori hypotheses. Harnessing citizen scientists to collect data following a hypothetico-deductive approach carries challenges. Moreover, the use of citizen science in scientific and monitoring studies has raised issues of data accuracy and quality. These issues are compounded when data collection moves into the Web 2.0 world. An examination of the literature from social geography on the concept of "citizen sensors" and volunteered geographic information (VGI) yields thoughtful reflections on the challenges of data quality/data accuracy when applying information from citizen sensors to research and management questions. VGI has been harnessed in a number of contexts, including for environmental and ecological monitoring activities. Here, I argue that conceptualizing a monitoring project as an experiment following the scientific method can further contribute to the use of VGI. I show how principles of experimental design can be applied to monitoring projects to better control for data quality of VGI. This includes suggestions for how citizen sensors can be harnessed to address issues of experimental controls and how to design monitoring projects to increase randomization and replication of sampled data, hence increasing scientific reliability and statistical power.
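The design principles argued for here, randomization and replication under experimental control, can be sketched as a minimal site-assignment routine; the function and its parameters are hypothetical illustrations, not part of any citizen-science platform:

```python
import random

def design_survey(sites, replicates=3, sample_frac=0.5, seed=0):
    """Randomly select a subset of candidate sites and assign each
    selected site a fixed number of repeat visits (replication),
    so that volunteer effort is allocated by design rather than by
    birder preference."""
    rng = random.Random(seed)
    k = max(1, int(len(sites) * sample_frac))
    chosen = rng.sample(sites, k)  # randomization over the site frame
    return {site: replicates for site in chosen}
```

A coordinator could publish such a plan so that citizen sensors survey randomly chosen sites a fixed number of times, giving the resulting VGI a design-based footing.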
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al., 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts.
G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al., 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al., 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine; this is unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run.
(5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid service.
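Step (3) of the workflow above, swapping a direct MPI launch for a G-Rex launch, might look like the following sketch; the exact GRexRun argument handling is an assumption here, not documented G-Rex behaviour:

```python
def grexify(command, grex_run="GRexRun"):
    """Rewrite a workflow-script model launch so that a direct
    'mpirun ...' call becomes a remote 'GRexRun ...' call, leaving
    all other commands untouched. The assumption that GRexRun
    accepts mpirun-style arguments unchanged is illustrative."""
    parts = command.split()
    if parts and parts[0] == "mpirun":
        parts[0] = grex_run
    return " ".join(parts)
```

Applied to a line such as `mpirun -np 40 nemo.exe`, this yields `GRexRun -np 40 nemo.exe`, which is the only modification the abstract says a workflow script needs.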
Abstract:
The Improved Stratospheric and Mesospheric Sounder (ISAMS) is designed to measure the Earth's middle atmosphere in the range of 4.6 to 16.6 microns. This paper considers all the coated optical elements in two radiometric test channels. (Analysis of the spectral response will be presented in a separate paper at this symposium; see Sheppard et al.) Comparisons between the computed spectral performance and measurements from actual coatings will be discussed, including substrate absorption simulations. The results of environmental testing (durability and stability) are included, together with details of coating deposition and monitoring conditions.
Abstract:
1. Closed Ecological Systems (CES) are small man-made ecosystems that do not exchange any material with the surrounding environment. Recent ecological and technological advances enable the successful establishment and maintenance of CES, making them a suitable tool for detecting and measuring subtle feedbacks and mechanisms. 2. As part of an analogue (physical) carbon-cycle modelling experiment, we developed a non-intrusive methodology to control the internal environment and to monitor atmospheric CO2 concentration inside 16 replicated CES. Whilst maintaining an air-tight seal on all CES, this approach allowed access to the CO2 measuring equipment for periodic re-calibration and repairs. 3. To ensure reliable cross-comparison of CO2 observations between individual CES units and to minimise the cost of the system, only one CO2 sampling unit was used. An ADC BioScientific OP-2 (open-path) analyser mounted on a swinging arm passed over a set of 16 measuring cells, each connected to an individual CES with air continuously circulating between them. 4. Using this setup, we were able to continuously measure several environmental variables and the CO2 concentration within each closed system, allowing us to study minute effects of changing temperature on carbon fluxes within each CES. The CES and the measuring cells showed minimal air leakage during an experimental run lasting, on average, 3 months. The CO2 analyser assembly performed reliably for over 2 years; however, an early iteration of the present design proved sensitive to positioning errors. 5. We indicate how the methodology can be further improved and suggest possible avenues where future CES-based research could be applied.
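The single-analyser, multi-cell arrangement described in point 3 amounts to a round-robin measurement schedule; the following is a minimal sketch with assumed dwell times, not the authors' control software:

```python
from itertools import cycle

def sampling_schedule(n_cells=16, dwell_s=30, n_sweeps=2):
    """One shared analyser visits each of n_cells measuring cells in
    turn; returns (time_offset_s, cell_index) pairs for n_sweeps full
    sweeps of the arm. The 30 s dwell time is an invented value."""
    schedule = []
    t = 0
    order = cycle(range(n_cells))
    for _ in range(n_cells * n_sweeps):
        cell = next(order)
        schedule.append((t, cell))
        t += dwell_s
    return schedule
```

Because every CES is read by the same instrument on a fixed cycle, readings from different units are directly comparable, which is the stated motivation for using a single sampling unit.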
Abstract:
The objective of this study was to investigate a novel light backscatter sensor, with a large field of view relative to curd size, for continuous on-line monitoring of coagulation and syneresis to improve curd moisture content control. A three-level, central composite design was employed to study the effects of temperature, cutting time, and CaCl2 addition on cheese making parameters. The sensor signal was recorded and analyzed. The light backscatter ratio followed a sigmoid increase during coagulation and decreased asymptotically after gel cutting. Curd yield and curd moisture content were predicted from the time to the maximum slope of the first derivative of the light backscatter ratio during coagulation and the decrease in the sensor response during syneresis. Whey fat was affected by coagulation kinetics and cutting time, suggesting curd rheological properties at cutting are dominant factors determining fat losses. The proposed technology shows potential for on-line monitoring of coagulation and syneresis.
Abstract:
Sampling strategies for monitoring the status and trends in wildlife populations are often determined before the first survey is undertaken. However, there may be little information about the distribution of the population, and so the sample design may be inefficient. Through time, as data are collected, more information about the distribution of animals in the survey region is obtained, but it can be difficult to incorporate this information in the survey design. This paper introduces a framework for monitoring motile wildlife populations within which the design of future surveys can be adapted using data from past surveys whilst ensuring consistency in design-based estimates of status and trends through time. In each survey, part of the sample is selected from the previous survey sample using simple random sampling. The rest is selected with inclusion probability proportional to predicted abundance. Abundance is predicted using a model constructed from previous survey data and covariates for the whole survey region. Unbiased design-based estimators of status and trends and their variances are derived from two-phase sampling theory. Simulations over the short and long term indicate that, in general, more precise estimates of status and trends are obtained using this mixed strategy than a strategy in which all of the sample is retained or all selected with probability proportional to predicted abundance. Furthermore, the mixed strategy is robust to poor predictions of abundance. Estimates of status are more precise than those obtained from a rotating panel design.
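The mixed selection strategy can be sketched as follows; the function names and the simple without-replacement draw are illustrative assumptions, not the paper's estimator:

```python
import random

def next_survey_sample(prev_sample, frame, predicted,
                       n_retain, n_new, seed=0):
    """Mixed strategy sketch: retain a simple random subsample of
    the previous survey's units, then top up with units drawn with
    probability proportional to model-predicted abundance."""
    rng = random.Random(seed)
    retained = rng.sample(prev_sample, n_retain)   # phase 1: SRS
    pool = [u for u in frame if u not in retained]
    weights = [predicted[u] for u in pool]
    new = []
    while len(new) < n_new:                        # phase 2: PPS-style
        u = rng.choices(pool, weights=weights)[0]
        if u not in new:
            new.append(u)
    return retained + new
```

The retained units give design consistency with the previous survey, while the abundance-weighted units concentrate effort where the model predicts animals to be.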
Abstract:
Objective To determine the prevalence and nature of prescribing and monitoring errors in general practices in England. Design Retrospective case note review of unique medication items prescribed over a 12 month period to a 2% random sample of patients. Mixed effects logistic regression was used to analyse the data. Setting Fifteen general practices across three primary care trusts in England. Data sources Examination of 6048 unique prescription items prescribed over the previous 12 months for 1777 patients. Main outcome measures Prevalence of prescribing and monitoring errors, and severity of errors, using validated definitions. Results Prescribing and/or monitoring errors were detected in 4.9% (296/6048) of all prescription items (95% confidence interval 4.4 - 5.5%). The vast majority of errors were of mild to moderate severity, with 0.2% (11/6048) of items having a severe error. After adjusting for covariates, patient-related factors associated with an increased risk of prescribing and/or monitoring errors were: age less than 15 (Odds Ratio (OR) 1.87, 1.19 to 2.94, p=0.006) or greater than 64 years (OR 1.68, 1.04 to 2.73, p=0.035), and higher numbers of unique medication items prescribed (OR 1.16, 1.12 to 1.19, p<0.001). Conclusion Prescribing and monitoring errors are common in English general practice, although severe errors are unusual. Many factors increase the risk of error. Having identified the most common and important errors, and the factors associated with these, strategies to prevent future errors should be developed based on the study findings.
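The headline prevalence figure can be checked with a simple normal-approximation confidence interval; this is a generic sketch, not the mixed-effects analysis used in the study, so it only comes close to the published 4.4-5.5% interval:

```python
import math

def prevalence_ci(errors, items, z=1.96):
    """Point prevalence and normal-approximation 95% CI for a
    proportion. With 296 errors in 6048 items this gives roughly
    4.4% to 5.4%, near the adjusted interval reported."""
    p = errors / items
    se = math.sqrt(p * (1.0 - p) / items)
    return p, p - z * se, p + z * se
```

The small discrepancy at the upper bound is expected, since the study's interval accounts for clustering of items within patients and practices.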
Abstract:
A flood warning system incorporates telemetered rainfall and flow/water-level data measured at various locations in the catchment area. Real-time, accurate data collection is required for this use, and sensor networks improve the system's capabilities. However, existing sensor nodes struggle to satisfy hydrological requirements in terms of autonomy, sensor hardware compatibility, reliability and long-range communication. We describe the design and development of a real-time measurement system for flood monitoring, and its deployment in a flash-flood-prone 650 km2 semiarid watershed in Southern Spain. A purpose-built low-power, long-range communication device, DatalogV1, provides automatic data gathering and reliable transmission. DatalogV1 incorporates self-monitoring to adapt measurement schedules, both to manage power consumption and to capture events of interest. Two tests are used to assess the success of the development. The results show an autonomous and robust monitoring system for long-term collection of water-level data at many sparse locations during flood events.
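The self-monitoring idea of adapting the measurement schedule to events can be sketched as a simple rule; the thresholds and intervals below are invented for illustration and are not DatalogV1's actual firmware logic:

```python
def next_interval(level_now, level_prev,
                  quiet_s=900, event_s=60, threshold=0.05):
    """Adaptive scheduler sketch: sample every 15 min to save power
    while the water level is stable, and switch to a 1 min schedule
    when it rises faster than the threshold (metres per interval).
    All three parameter values are illustrative assumptions."""
    rising = (level_now - level_prev) > threshold
    return event_s if rising else quiet_s
```

A rule of this shape lets a node remain autonomous for long deployments yet still capture the rapid onset of a flash flood.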
Abstract:
A universal systems design process is specified, tested in a case study and evaluated. It links English narratives to numbers using a categorical language framework, with mathematical mappings taking the place of conjunctions and numbers. The framework is a ring of English narrative words between 1 (option) and 360 (capital); beyond 360 the ring cycles again to 1. English narratives are shown to correspond to the field of fractional numbers. The process can enable the development, presentation and communication of complex narrative policy information among communities of any scale, on a software implementation known as the "ecoputer". The information is more accessible and comprehensive than that in conventional decision support, because: (1) it is expressed in narrative language; and (2) the narratives are expressed as compounds of words within the framework. Hence option generation is made more effective than in conventional decision support processes, including Multiple Criteria Decision Analysis, Life Cycle Assessment and Cost-Benefit Analysis. The case study is of a participatory workshop on UK bioenergy project objectives and criteria, at which attributes were elicited in environmental, economic and social systems. From the attributes, the framework was used to derive consequences at a range of levels of precision; these are compared with the project objectives and criteria as set out in the Case for Support. The design process is to be supported by a social information manipulation, storage and retrieval system for numeric and verbal narratives attached to the "ecoputer", which will have an integrated verbal and numeric operating system. Novel design source-code language will assist the development of narrative policy.
The utility of the program, including in the transition to sustainable development and in applications at both community micro-scale and policy macro-scale, is discussed from public, stakeholder, corporate, governmental and regulatory perspectives.
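The cycling rule stated for the framework's ring (positions run from 1 to 360, and beyond 360 the ring wraps back to 1) corresponds to simple modular arithmetic; a minimal sketch:

```python
def ring_position(n, size=360):
    """Cyclic index for a 1-based ring of the given size: position
    360 stays 360, position 361 wraps to 1, and so on. This encodes
    only the cycling rule stated in the abstract, nothing more."""
    return ((n - 1) % size) + 1
```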
Abstract:
The extended flight of the Airborne Ionospheric Observatory during the Geospace Environment Modeling (GEM) Pilot program on January 16, 1990, allowed continuous all-sky monitoring of the two-dimensional ionospheric footprint of the northward interplanetary magnetic field (IMF) cusp in several wavelengths. Especially important in determining the locus of magnetosheath electron precipitation was the 630.0-nm red line emission. The most striking morphological change in the images was the transient appearance of zonally elongated regions of enhanced 630.0-nm emission which resembled "rays" emanating from the centroid of the precipitation. The appearance of these rays was strongly correlated with the Y component of the IMF: when the magnitude of By was large compared to Bz, the rays appeared; otherwise, the distribution was relatively unstructured. Late in the flight the field of view of the imager included the field of view of flow measurements from the European incoherent scatter radar (EISCAT). The rays visible in 630.0-nm emission aligned exactly with the position of strong flow jets observed by EISCAT. We attribute this correspondence to the requirement of quasi-neutrality; namely, the soft electrons have their largest precipitating fluxes where the bulk of the ions precipitate. The ions, in regions of strong convective flow, are spread out farther along the flow path than in regions of weaker flow. The occurrence and direction of these flow bursts are controlled by the IMF in a manner consistent with newly opened flux tubes; i.e., when |By| > |Bz|, tension in the reconnected field lines produces east-west flow regions downstream of the ionospheric projection of the x line. We interpret the optical rays (flow bursts), which typically last between 5 and 15 min, as evidence of periods of enhanced dayside (or lobe) reconnection when |By| > |Bz|.
The length of the reconnection pulse is difficult to determine, however, since strong zonal flows would be expected to persist until the tension force in the field line has decayed, even if the duration of the enhanced reconnection was relatively short.