828 results for Monitoring Program Design


Relevance:

30.00%

Publisher:

Abstract:

The "15BO1997001" dataset is based on samples collected in the spring of 1997. The whole dataset comprises 66 samples (from 27 stations of the National Monitoring Sampling Grid) with data on zooplankton species composition, abundance and biomass. Samples were collected in discrete layers (0-10, 0-20, 0-50, 10-25, 25-50 and 50-100 m) and from the bottom up to the surface, at depths depending on water column stratification and thermocline depth. The collected material was analysed using the method of Dimov (1959). Samples were brought to a volume of 25-30 ml, depending on zooplankton density, and mixed intensively until all organisms were distributed randomly in the sample volume. A 5 ml subsample was then taken and poured into a rectangular counting chamber for taxonomic identification and counting. Copepods and Cladocera were identified and enumerated to species level; the other mesozooplankters were identified and enumerated at a higher taxonomic level (commonly referred to as mesozooplankton groups). Large (> 1 mm body length) and non-abundant species were counted in the whole sample. Counting and measuring of organisms were carried out in the Dimov chamber under a stereomicroscope to the lowest taxon possible. Taxonomic identification was done at the Institute of Oceanology by Asen Konsulov using the relevant taxonomic literature (Mordukhay-Boltovskoy, F.D. (Ed.), 1968, 1969, 1972). Biomass was estimated as wet weight following Petipa (1959), using standard species-specific average weights in mg/m3. Wet weight (WW) was converted to dry weight (DW) using the equation DW = 0.16*WW, as suggested by Vinogradov & Shushkina (1987).
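The two calculations described above (scaling a 5 ml subsample count up to the whole sample, and the DW = 0.16*WW conversion) can be sketched as follows; the function names and the example values are illustrative, not part of the dataset:

```python
# Sketch of the abundance and biomass calculations described in the abstract.
# The DW = 0.16 * WW conversion follows Vinogradov & Shushkina (1987);
# variable names and example numbers are our own.

DRY_TO_WET_RATIO = 0.16  # DW = 0.16 * WW

def abundance_per_m3(count_in_subsample, subsample_ml, total_sample_ml, filtered_volume_m3):
    """Scale a subsample count up to the whole sample, then to individuals/m3."""
    whole_sample_count = count_in_subsample * (total_sample_ml / subsample_ml)
    return whole_sample_count / filtered_volume_m3

def dry_weight(wet_weight_mg):
    """Convert wet weight (mg) to dry weight (mg)."""
    return DRY_TO_WET_RATIO * wet_weight_mg

# Example: 42 individuals counted in a 5 ml subsample of a 25 ml sample,
# from a haul that filtered 2.5 m3 of water.
print(abundance_per_m3(42, 5, 25, 2.5))  # 84.0 ind/m3
print(dry_weight(100.0))                 # 16.0 mg DW
```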

Relevance:

30.00%

Publisher:

Abstract:

Improving energy efficiency is an urgent issue in developing economies, and an energy efficiency standard and labeling program is an ideal mechanism to achieve this target. However, there is concern over whether consumers will choose highly energy-efficient appliances, given their higher prices resulting from higher production costs. This paper estimates how consumers responded to the introduction of the energy efficiency standard and labeling program in China. To quantify consumers' valuations, we estimated consumer surplus and product benefits based on the estimated parameters of the demand function. We found the following. First, consumers' valuation of the energy efficiency labels is not monotonically correlated with label grade: the highest efficiency label (Label 1) is not valued more highly than Labels 2 and 3, and is sometimes valued below the least energy-efficient label (Label UI). This runs counter to the design of the policy intervention. Second, several governmental policies act in mixed directions: subsidies for the highest label grades expand consumer welfare as the program was designed to do, whereas the appliance replacement policies decreased welfare.
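As a toy illustration of recovering consumer surplus from estimated demand parameters (this is not the paper's econometric model; a simple linear demand curve is assumed here), the surplus at a given price is the triangle area under the inverse demand curve:

```python
# Illustrative only: consumer surplus under a linear inverse demand
# p = a - b*q is the triangle area (a - p)^2 / (2*b) at market price p.
# The parameter values below are invented.

def consumer_surplus_linear(a, b, price):
    """Consumer surplus under inverse demand p = a - b*q at a given price."""
    if price >= a:
        return 0.0  # demand is zero at or above the choke price a
    return (a - price) ** 2 / (2 * b)

# Example: choke price a = 100, slope b = 2, market price 60.
print(consumer_surplus_linear(100, 2, 60))  # 400.0
```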

Relevance:

30.00%

Publisher:

Abstract:

Current nanometer technologies are subject to several adverse effects that seriously impact the yield and performance of integrated circuits, such as within-die parameter uncertainties, varying workload conditions, aging and temperature. Monitoring, calibration and dynamic adaptation have emerged as promising solutions to these issues, and many kinds of monitors have been presented recently. In this scenario, where systems with hundreds of monitors of different types have been proposed, light-weight monitoring networks have become essential. In this work we present a light-weight network architecture based on sharing the digitization resources of nodes that require time-to-digital conversion. Our proposal employs a single-wire interface, shared among all the nodes in the network, and quantizes the time domain to perform access multiplexing and transmit the information. It yields a 16% improvement in area and power consumption compared to traditional approaches.
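The time-domain access multiplexing idea can be sketched in a few lines (this is our own simplified illustration, not the paper's circuit): time on the shared wire is divided into slots, each node owns one slot, and a node encodes its quantized measurement as the pulse offset within its slot:

```python
# Simplified model of single-wire, time-slot multiplexed monitor readout.
# SLOT_TICKS is an assumed quantization of the time domain.

SLOT_TICKS = 16  # ticks per node slot (assumption for illustration)

def encode_frame(readings):
    """Node i pulses at tick i*SLOT_TICKS + its quantized value."""
    return sorted(i * SLOT_TICKS + (v % SLOT_TICKS) for i, v in enumerate(readings))

def decode_frame(pulse_times):
    """Recover (node, value) pairs from pulse positions on the wire."""
    return [(t // SLOT_TICKS, t % SLOT_TICKS) for t in pulse_times]

pulses = encode_frame([3, 7, 12])
print(decode_frame(pulses))  # [(0, 3), (1, 7), (2, 12)]
```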

Relevance:

30.00%

Publisher:

Abstract:

Geodetic volcano monitoring in Tenerife has mainly focused on the Las Cañadas Caldera, where a geodetic micronetwork and a levelling profile are located. A sensitivity test of this geodetic network showed that it should be extended to cover the whole island for volcano monitoring purposes. Furthermore, InSAR made it possible to detect two unexpected movements that were beyond the scope of the traditional geodetic network. These two facts prompted us to design a GPS network covering the whole of Tenerife, which was observed in August 2000. The results obtained were accurate to one centimetre; they confirm one of the deformations, although they were not conclusive enough to confirm the second one. Furthermore, new cases of possible subsidence have been detected in areas where InSAR could not be used to measure deformation due to low coherence. A first modelling attempt has been made using a very simple model, and its results seem to indicate that the observed deformation and the groundwater level variation on the island may be related. Future observations will be necessary to validate these findings, study the time evolution of the displacements, carry out interpretation work using different types of data (gravity, gases, etc.) and develop models that represent the island more closely. The results obtained are important because they might affect geodetic volcano monitoring on the island, which will only be really useful if it is capable of distinguishing between displacements that might be linked to volcanic activity and those produced by other causes. One important result of this work is that a new geodetic monitoring system based on two complementary techniques, InSAR and GPS, has been set up on Tenerife. This is the first time that the whole surface of any of the volcanic Canary Islands has been covered with a single network for this purpose.
This research has shown the need for similar studies elsewhere in the Canary Islands, at least on the islands that pose a greater risk of volcanic reactivation, such as Lanzarote and La Palma, where InSAR techniques have already been used.
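The abstract does not name its "very simple model". As an illustration of the kind of first-order calculation such studies use, here is the classic Mogi point-source model, which predicts surface uplift from a pressurized source at depth; all parameter values below are hypothetical and unrelated to the Tenerife data:

```python
# Mogi (1958) point-source model: vertical surface displacement above a
# small pressurized source in an elastic half-space. Purely illustrative;
# the source depth and volume change used here are invented.
import math

def mogi_uplift(r, depth, dV, poisson=0.25):
    """Vertical displacement (m) at radial distance r (m) from a point
    source at the given depth (m) with volume change dV (m^3)."""
    return (1 - poisson) / math.pi * dV * depth / (r**2 + depth**2) ** 1.5

# Hypothetical source: 4 km deep, 1e6 m^3 volume increase.
uz_center = mogi_uplift(0.0, 4000.0, 1e6)
uz_5km = mogi_uplift(5000.0, 4000.0, 1e6)
print(round(uz_center * 1000, 2), "mm above the source")
print(round(uz_5km * 1000, 2), "mm at 5 km distance")
```

Uplift decays with distance from the source, which is why a network covering the whole island is needed to separate localized groundwater effects from broader volcanic signals.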

Relevance:

30.00%

Publisher:

Abstract:

Program specialization optimizes programs for known values of the input. It is often the case that the set of possible input values is unknown, or this set is infinite. However, a form of specialization can still be performed in such cases by means of abstract interpretation, specialization then being with respect to abstract values (substitutions), rather than concrete ones. We study the multiple specialization of logic programs based on abstract interpretation. This involves, in principle, and based on information from global analysis, generating several versions of a program predicate for different uses of such predicate, optimizing these versions, and, finally, producing a new, "multiply specialized" program. While multiple specialization has received theoretical attention, little previous evidence exists on its practicality. In this paper we report on the incorporation of multiple specialization in a parallelizing compiler and quantify its effects. A novel approach to the design and implementation of the specialization system is proposed. The resulting implementation techniques result in identical specializations to those of the best previously proposed techniques but require little or no modification of some existing abstract interpreters. Our results show that, using the proposed techniques, the resulting "abstract multiple specialization" is indeed a relevant technique in practice. In particular, in the parallelizing compiler application, a good number of run-time tests are eliminated and invariants extracted automatically from loops, resulting generally in lower overheads and in several cases in increased speedups.
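The core idea, generating one optimized version of a routine per abstract call pattern, can be mimicked in Python (the paper itself works on logic programs; the names and the groundness-like check below are our own illustration):

```python
# Toy analogue of multiple specialization: from analysis information about
# abstract call patterns, emit one variant of a routine per pattern,
# dropping run-time tests the analysis has proven unnecessary.

def make_variants(check, body):
    """Return {abstract_mode: specialized function}."""
    def general(x):
        if not check(x):          # run-time test kept in the general version
            raise TypeError("unexpected argument")
        return body(x)
    def proven(x):                # analysis proved check(x) always holds here
        return body(x)            # ... so the test is specialized away
    return {"unknown": general, "proven": proven}

variants = make_variants(lambda x: isinstance(x, int), lambda x: x * 2)
print(variants["proven"](21))   # 42, no run-time test executed
print(variants["unknown"](21))  # 42, test executed and passed
```

Callers whose arguments the global analysis has shown to satisfy the check are rewritten to call the "proven" variant, which is where the speedups come from.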

Relevance:

30.00%

Publisher:

Abstract:

Program specialization optimizes programs for known values of the input. It is often the case that the set of possible input values is unknown, or this set is infinite. However, a form of specialization can still be performed in such cases by means of abstract interpretation, specialization then being with respect to abstract values (substitutions), rather than concrete ones. This paper reports on the application of abstract multiple specialization to automatic program parallelization in the &-Prolog compiler. Abstract executability, the main concept underlying abstract specialization, is formalized, the design of the specialization system presented, and a non-trivial example of specialization in automatic parallelization is given.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a technique for achieving a class of optimizations related to the reduction of checks within cycles. The technique uses both program transformation and abstract interpretation. After a first pass of an abstract interpreter which detects simple invariants, program transformation is used to build a hypothetical situation that simplifies some predicates that should be executed within the cycle. This transformation implements the heuristic hypothesis that once conditional tests hold they may continue doing so recursively. Specialized versions of predicates are generated to detect and exploit those cases in which the invariance may hold. Abstract interpretation is then used again to verify the truth of such hypotheses and confirm the proposed simplification. This allows optimizations that go beyond those possible with only one pass of the abstract interpreter over the original program, as is normally the case. It also allows selective program specialization using a standard abstract interpreter not specifically designed for this purpose, thus simplifying the design of this already complex module of the compiler. In the paper, a class of programs amenable to such optimization is presented, along with some examples and an evaluation of the proposed techniques in some application areas such as floundering detection and reducing run-time tests in automatic logic program parallelization. The analysis of the examples presented has been performed automatically by an implementation of the technique using existing abstract interpretation and program transformation tools.
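The check-reduction idea can be illustrated with a loop-invariant test (a Python sketch of our own, not the paper's logic-program formulation): the test is hoisted out of the cycle, and a specialized loop body without the test is selected once:

```python
# Illustration of reducing checks within cycles: the test on `factor`
# is invariant across the loop, so specialized versions are selected
# once, outside the loop, instead of testing on every iteration.

def scale_general(xs, factor):
    out = []
    for x in xs:
        if factor == 0:            # test executed on every iteration
            out.append(0)
        else:
            out.append(x * factor)
    return out

def scale_specialized(xs, factor):
    """The test is performed once; each branch is a simplified version."""
    if factor == 0:
        return [0] * len(xs)       # version with the test simplified away
    return [x * factor for x in xs]

print(scale_specialized([1, 2, 3], 2))  # [2, 4, 6]
print(scale_specialized([1, 2, 3], 0))  # [0, 0, 0]
```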

Relevance:

30.00%

Publisher:

Abstract:

We present a tutorial overview of Ciaopp, the Ciao system preprocessor. Ciao is a public-domain, next-generation logic programming system, which subsumes ISO-Prolog and is specifically designed to a) be highly extensible via libraries and b) support modular program analysis, debugging, and optimization. The latter tasks are performed in an integrated fashion by Ciaopp. Ciaopp uses modular, incremental abstract interpretation to infer properties of program predicates and literals, including types, variable instantiation properties (including modes), non-failure, determinacy, bounds on computational cost, bounds on sizes of terms in the program, etc. Using such analysis information, Ciaopp can find errors at compile-time in programs and/or perform partial verification. Ciaopp checks how programs call system libraries and also any assertions present in the program or in other modules used by the program. These assertions are also used to generate documentation automatically. Ciaopp also uses analysis information to perform program transformations and optimizations such as multiple abstract specialization, parallelization (including granularity control), and optimization of run-time tests for properties which cannot be checked completely at compile-time. We illustrate "hands-on" the use of Ciaopp in all these tasks. By design, Ciaopp is a generic tool, which can be easily tailored to perform these and other tasks for different LP and CLP dialects.

Relevance:

30.00%

Publisher:

Abstract:

Ciao Prolog incorporates a module system which allows separate compilation and sensible creation of standalone executables. We describe some of the main aspects of the Ciao modular compiler, ciaoc, which takes advantage of the characteristics of the Ciao Prolog module system to automatically perform separate and incremental compilation and to efficiently build small, standalone executables with competitive run-time performance. ciaoc can also statically detect a larger number of programming errors. We also present a generic code processing library for handling modular programs, which provides an important part of the functionality of ciaoc. This library allows the development of program analysis and transformation tools in a way that is to some extent orthogonal to the details of module system design, and has been used in the implementation of ciaoc and other Ciao system tools. We also describe the different types of executables which can be generated by the Ciao compiler, which offer different tradeoffs between executable size, startup time, and portability, depending, among other factors, on the linking regime used (static, dynamic, lazy, etc.). Finally, we provide experimental data which illustrate these tradeoffs.

Relevance:

30.00%

Publisher:

Abstract:

We provide an overall description of the Ciao multiparadigm programming system, emphasizing some of the novel aspects and motivations behind its design and implementation. An important aspect of Ciao is that, in addition to supporting logic programming (and, in particular, Prolog), it provides the programmer with a large number of useful features from different programming paradigms and styles, and that the use of each of these features (including those of Prolog) can be turned on and off at will for each program module. Thus, a given module may be using, e.g., higher order functions and constraints, while another module may be using assignment, predicates, Prolog meta-programming, and concurrency. Furthermore, the language is designed to be extensible in a simple and modular way. Another important aspect of Ciao is its programming environment, which provides a powerful preprocessor (with an associated assertion language) capable of statically finding non-trivial bugs, verifying that programs comply with specifications, and performing many types of optimizations (including automatic parallelization). Such optimizations produce code that is highly competitive with that of other dynamic languages or, with the (experimental) optimizing compiler, even with that of static languages, all while retaining the flexibility and interactive development of a dynamic language. This compilation architecture supports modularity and separate compilation throughout. The environment also includes a powerful autodocumenter and a unit testing framework, both closely integrated with the assertion system. The paper provides an informal overview of the language and program development environment. It aims at illustrating the design philosophy rather than at being exhaustive, which would be impossible in a single journal paper, pointing instead to previous Ciao literature.

Relevance:

30.00%

Publisher:

Abstract:

The World Health Organization actively stresses the importance of the health, nutrition and well-being of the mother to foster child development. This issue is critical in the rural areas of developing countries, where the health status of children is hardly monitored because the population lacks access to health care. The aim of this research is to design, implement and deploy an e-health information and communication system to support health care in 26 rural communities of Cusmapa, Nicaragua. The final solution consists of a hybrid WiMAX/WiFi architecture that provides good-quality communications through VoIP, taking advantage of low-cost WiFi mobile devices. A WiMAX base station was installed in the health center to provide a radio link with the rural health post "El Carrizo", located 7.4 km away in line of sight. This service enables personal broadband voice and data communication with the health center through WiFi-enabled devices such as laptops and cellular phones, at no communications cost. A free software PBX was installed at the "San José de Cusmapa" health care site to enable communications for physicians, nurses and a technician through mobile telephones with the IEEE 802.11b/g protocol and SIP, provided by the project. In addition, the rural health post staff (midwives, brigade members) received two mobile phones with these same features. In a complementary way, the deployed health information system is ready to analyze the distribution of the maternal-child population at risk and the distribution of diseases on a geographical baseline. The system works with four information layers: fertile women, children, people with disabilities, and diseases. Authorized staff can thus obtain reports about prenatal monitoring tasks, the status of the communities, malnutrition, and immunization control.
Data need to be updated by health care staff in order to detect the sources of problems in time and to implement measures aimed at alleviating them and permanently improving the population's health status. Ongoing research is focused on a mobile platform that collects the height and weight of children in the remote communities and automatically updates the information system. This research is funded by the Millennium Rural Communities program of the Technical University of Madrid.
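The four information layers and a report query such as the malnutrition report can be sketched as a minimal data model (an illustrative schema of our own; the field names, record values and weight threshold are assumptions, not the deployed system's):

```python
# Minimal sketch of the four-layer health information model described
# above. All records and the underweight threshold are invented.

records = {
    "fertile_women": [{"id": 1, "community": "El Carrizo", "prenatal_visits": 3}],
    "children": [
        {"id": 2, "community": "El Carrizo", "weight_kg": 9.0, "age_months": 24},
        {"id": 3, "community": "Cusmapa", "weight_kg": 12.5, "age_months": 24},
    ],
    "disabilities": [],
    "diseases": [],
}

def underweight_report(children, threshold_kg=10.0):
    """List children below an (assumed) weight threshold, by community."""
    return [(c["community"], c["id"]) for c in children if c["weight_kg"] < threshold_kg]

print(underweight_report(records["children"]))  # [('El Carrizo', 2)]
```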

Relevance:

30.00%

Publisher:

Abstract:

CO2 capture and storage (CCS) projects are presently being developed to reduce the emission of anthropogenic CO2 into the atmosphere. CCS technologies are expected to account for 20% of the CO2 reduction by 2050. One of the main concerns about CCS is whether CO2 will remain confined within the geological formation into which it is injected, since post-injection CO2 migration on the time scale of years, decades and centuries is not well understood. Theoretically, CO2 can be retained at depth i) as a supercritical fluid (physical trapping), ii) as a fluid slowly migrating in an aquifer along a long flow path (hydrodynamic trapping), iii) dissolved in ground waters (solubility trapping) and iv) precipitated as secondary carbonates. Carbon dioxide will be injected in the near future (2012) at Hontomín (Burgos, Spain) in the frame of the Compostilla EEPR project, led by the Fundación Ciudad de la Energía (CIUDEN). In order to detect leakage during the operational stage, a pre-injection geochemical baseline is presently being developed. In this work a geochemical monitoring design is presented to provide information about the feasibility of CO2 storage at depth.
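One common way a pre-injection baseline is used for leakage detection (a generic sketch, not the Hontomín protocol; all data values are invented) is to flag post-injection measurements falling outside the baseline's natural variability:

```python
# Baseline anomaly detection sketch: flag measurements that deviate more
# than k standard deviations from the pre-injection mean. Flux values
# below are invented illustrative numbers.
import statistics

baseline_co2_flux = [4.1, 3.8, 4.5, 4.0, 4.2, 3.9, 4.3]  # e.g. g m-2 day-1

mu = statistics.mean(baseline_co2_flux)
sigma = statistics.stdev(baseline_co2_flux)

def is_anomalous(value, k=3.0):
    """True if a measurement lies outside mean +/- k*sigma of the baseline."""
    return abs(value - mu) > k * sigma

print(is_anomalous(4.4))   # False: within natural background variability
print(is_anomalous(12.0))  # True: candidate leakage signal
```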

Relevance:

30.00%

Publisher:

Abstract:

The integration of scientific knowledge about possible climate change impacts on water resources has direct implications for the way water policies are implemented and evolve. This is particularly true for the various technical steps embedded in EU Water Framework Directive river basin management planning, such as risk characterisation, monitoring, the design and implementation of action programmes, and the evaluation of achievement of the "good status" objective (in 2015). The need to incorporate climate change considerations into the implementation of EU water policy is currently being discussed with a wide range of experts and stakeholders at EU level. Research is also ongoing, striving to support policy developments and examining how scientific findings and recommendations could best be taken on board by policy-makers and water managers within the forthcoming years. This paper provides a snapshot of policy discussions about climate change in the context of WFD river basin management planning and specific advances in related EU-funded research projects. Perspectives for strengthening links between the scientific and policy-making communities in this area are also highlighted.

Relevance:

30.00%

Publisher:

Abstract:

Variabilities associated with CMOS evolution affect the yield and performance of current digital designs. FPGAs, which are widely used for fast prototyping and implementation of digital circuits, also suffer from these issues. Proactive approaches are starting to appear to achieve self-awareness and dynamic adaptation of these devices. To support these techniques we propose the employment of a multi-purpose sensor network. This infrastructure, through adequate use of configuration and automation tools, is able to obtain relevant data throughout the life cycle of an FPGA. This is achieved at a very reduced cost, not only in terms of area and other limited resources, but also in terms of the design effort required to define and deploy the measuring infrastructure. Our proposal has been validated by measuring inter-die and intra-die variability in different FPGA families.
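The inter-die versus intra-die decomposition mentioned above can be sketched with ring-oscillator-style frequency samples (the data and the simple decomposition below are our own illustration, not the paper's measurements): inter-die variability is the spread of per-die means, intra-die variability the average spread of sensors within each die:

```python
# Illustrative variability decomposition from invented on-die sensor data.
import statistics

freqs_mhz = {  # die -> frequencies measured at different on-die sensors
    "die0": [201.0, 203.5, 199.8, 202.1],
    "die1": [195.2, 196.9, 194.8, 196.1],
    "die2": [207.3, 205.9, 208.4, 206.6],
}

die_means = {d: statistics.mean(v) for d, v in freqs_mhz.items()}
inter_die_sigma = statistics.stdev(die_means.values())          # across dies
intra_die_sigma = statistics.mean(
    statistics.stdev(v) for v in freqs_mhz.values()             # within dies
)

print(round(inter_die_sigma, 2), round(intra_die_sigma, 2))  # 5.65 1.19
```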

Relevance:

30.00%

Publisher:

Abstract:

Tool wear detection is a key issue in tool condition monitoring. Maximizing useful tool life is frequently related to the optimization of machining processes. This paper presents two model-based approaches for tool wear monitoring on the basis of neuro-fuzzy techniques. The use of neuro-fuzzy hybridization to design a tool wear monitoring system aims at exploiting the synergy of neural networks and fuzzy logic, combining human reasoning with learning and a connectionist structure. The turning process, a well-known machining process, is selected for this case study. A four-input (time, cutting forces, vibrations and acoustic emission signals), single-output (tool wear rate) model is designed and implemented on the basis of three neuro-fuzzy approaches (inductive, transductive and evolving neuro-fuzzy systems). The tool wear model is then used for monitoring the turning process. The comparative study demonstrates that the transductive neuro-fuzzy model provides better error-based performance indices for detecting tool wear than either the inductive or the evolving neuro-fuzzy model.
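The error-based comparison described above boils down to standard error indices such as RMSE and MAE computed for each model's predicted wear against measurements; in this sketch all prediction values are invented, so the ranking merely illustrates the kind of result the paper reports:

```python
# Error-index comparison sketch for the three model families. The wear
# measurements and per-model predictions below are invented.
import math

def rmse(y_true, y_pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

measured_wear = [0.10, 0.15, 0.22, 0.30]  # e.g. flank wear in mm
predictions = {
    "inductive":    [0.12, 0.18, 0.20, 0.35],
    "transductive": [0.10, 0.16, 0.21, 0.31],
    "evolving":     [0.14, 0.12, 0.26, 0.27],
}

for name, pred in predictions.items():
    print(name, round(rmse(measured_wear, pred), 4), round(mae(measured_wear, pred), 4))
```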