939 results for process parameter monitoring


Relevance: 30.00%

Abstract:

This is the Stillwaters monitoring programme, summary results 2003 and 2004, from the Environment Agency North West. The report focuses on the Winter Monitoring of Stillwaters Programme, which began in January 2001 with the aim of gathering long-term data on nutrient abundance over the winter months. This allows assessment of the nutrient 'carry-over' available for algal growth in the following year, as well as year-on-year productivity. Fourteen stillwaters are monitored each year, and the environmental issues associated with each stillwater are summarised in a table in the report. Bank-side water samples are taken for nutrients (N, P and S) and chlorophyll, and a YSI multi-parameter sonde measures temperature, pH, specific conductivity and dissolved oxygen (% saturation). Survey results shown in this report came from Oak Mere and Bar Mere.

Relevance: 30.00%

Abstract:

This is the Stillwaters monitoring programme, summary results 2004 and 2005, from the Environment Agency North West. The report focuses on the fifth year of winter monitoring of 14 stillwaters in Cheshire. The 14 stillwaters analysed are: Comber Mere, Oss Mere, Marbury Big Mere, Chapel Mere, Bar Mere, Oak Mere, Hatch Mere, Black Lake, Betley Mere, Tabley Mere, Melchett Mere, Tatton Mere, Rostherne Mere and Mere Mere. Nutrient availability in each stillwater is used to assess the productivity of the waterbody. Bank-side water samples were taken for nutrients (nitrogen and phosphorus) and chlorophyll, and a YSI multi-parameter sonde measures temperature, pH, specific conductivity and dissolved oxygen (% saturation).

Relevance: 30.00%

Abstract:

This is the Oak Mere continuous monitoring summary report, 1997 to 2000, from the Environment Agency North West. The report focuses on the continuous monitoring programme carried out with a multi-parameter probe in Oak Mere since summer 1997. From 1999, nutrient and chlorophyll samples were taken whenever the water quality instrument was serviced, and water level measurements have been made since 1998. The report also summarises Oak Mere water quality for each year (1997-2000). The physico-chemical parameters and nutrient levels included are: temperature, specific conductivity, dissolved oxygen, pH, depth, Secchi disc measurements, chlorophyll a, total phosphorus, ortho-phosphate, nitrate, ammonia and silicate.

Relevance: 30.00%

Abstract:

Atlantic Croaker (Micropogonias undulatus) production dynamics along the U.S. Atlantic coast are regulated by fishing and winter water temperature. Stakeholders for this resource have recommended investigating the effects of climate covariates in assessment models. This study used state-space biomass dynamic models without (model 1) and with (model 2) the minimum winter estuarine temperature (MWET) to examine MWET effects on Atlantic Croaker population dynamics during 1972–2008. In model 2, MWET was introduced into the intrinsic rate of population increase (r). For both models, a prior probability distribution (prior) was constructed for r or a scaling parameter (r0); inputs were the fishery removals and fall biomass indices developed using data from the Multispecies Bottom Trawl Survey of the Northeast Fisheries Science Center, National Marine Fisheries Service, and the Coastal Trawl Survey of the Southeast Area Monitoring and Assessment Program. Model sensitivity runs incorporated a uniform(0.01, 1.5) prior for r or r0 and bycatch data from the shrimp-trawl fishery. All model variants produced similar results and therefore supported the conclusion of low risk of overfishing for the Atlantic Croaker stock in the 2000s. However, the data statistically supported only model 1 and its configuration that included the shrimp-trawl fishery bycatch. The process errors of these models showed slightly positive and significant correlations with MWET, indicating that warmer winters would enhance Atlantic Croaker biomass production. These inconclusive, somewhat conflicting results indicate that biomass dynamic models should not yet integrate MWET, pending accumulation of longer time series of the variables controlling the production dynamics of Atlantic Croaker, preferably including estimates of winter-induced Atlantic Croaker kills.
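
As a rough, hypothetical illustration of the kind of biomass dynamic model described above (a simple deterministic Schaefer projection, not the authors' state-space formulation), the sketch below steps biomass forward under annual removals, with an optional term that scales the intrinsic rate of increase r by an MWET covariate as in model 2. All parameter names and values are invented for illustration.

```python
import numpy as np

def schaefer_projection(b0, r0, k, catches, mwet=None, beta=0.0):
    """Project biomass with a Schaefer surplus-production model.

    b0      -- starting biomass
    r0      -- baseline intrinsic rate of population increase
    k       -- carrying capacity
    catches -- sequence of annual fishery removals
    mwet    -- optional sequence of minimum winter estuarine temperature
               anomalies; None reproduces 'model 1'
    beta    -- effect of MWET on r; beta = 0 reproduces 'model 1'
    """
    biomass = [b0]
    for t, catch in enumerate(catches):
        b = biomass[-1]
        # 'Model 2': the covariate enters through the intrinsic rate r
        r = r0 * np.exp(beta * mwet[t]) if mwet is not None else r0
        b_next = b + r * b * (1.0 - b / k) - catch
        biomass.append(max(b_next, 1e-6))  # keep biomass positive
    return np.array(biomass)

# Hypothetical example: warmer winters (positive MWET anomaly) raise production.
catches = [20.0, 25.0, 30.0, 28.0]
mwet = [0.5, -1.0, 0.2, 1.1]
print(schaefer_projection(b0=100.0, r0=0.4, k=300.0, catches=catches,
                          mwet=mwet, beta=0.3))
```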

Relevance: 30.00%

Abstract:

Reef fishes are conspicuous and essential components of coral reef ecosystems and economies of southern Florida and the United States Virgin Islands (USVI). Throughout Florida and the USVI, reef fish are under threat from a variety of anthropogenic and natural stressors including overfishing, habitat loss, and environmental changes. The South Florida/Caribbean Network (SFCN), a unit of the National Park Service (NPS), is charged with monitoring reef fishes, among other natural and cultural resources, within six parks in the South Florida-Caribbean region (Biscayne National Park, BISC; Buck Island Reef National Monument, BUIS; Dry Tortugas National Park, DRTO; Everglades National Park, EVER; Salt River Bay National Historical Park and Ecological Preserve, SARI; Virgin Islands National Park, VIIS). Monitoring data are intended for park managers who are, and will continue to be, asked to make decisions balancing environmental protection, fishery sustainability, and park use by visitors. The range and complexity of the issues outlined above, and the need for NPS to invest in a strategy of monitoring, modeling, and management to ensure the sustainability of its precious assets, will require strategic investment in long-term, high-precision, multispecies reef fish data that increase inherent system knowledge and reduce uncertainty. The goal of this guide is to provide a framework for park managers and researchers to create or enhance a reef fish monitoring program within areas monitored by the SFCN. The framework is expected to be applicable to other areas as well, including the Florida Keys National Marine Sanctuary and Virgin Islands Coral Reef National Monument. The favored approach is characterized by an iterative process of data collection, dataset integration, sampling design analysis, and population and community assessment that evaluates resource risks associated with management policies. Using this model, a monitoring program can adapt its survey methods to increase the accuracy and precision of survey estimates as new information becomes available, and adapt to the evolving needs and broadening responsibilities of park management.

Relevance: 30.00%

Abstract:

Changes in the texture (elastic nature) of the flesh of barrel-salted herring during the ripening process at 4°C have been monitored. The method employs the analysis of stress-relaxation curves after compression to half of the sample thickness on an Instron Model 1112. The parameter characterising each sample's texture is the ratio T/P, the reciprocal of the gradient of a line connecting the peak force P and the point at which the force has decayed to 0.368P. Here T is the relaxation time, defined as the time required for a stress at constant strain to decrease to 1/e of its original value, where e is the base of natural logarithms (2.7183); since 1/e = 0.368, the relaxation time is the time required for the force to decay to 36.8% of its original value. P is the peak height of the curve (i.e. the force value at the maximum height). The method was adapted from the bakery industry, where it is used to test the degree of gluten development in bread dough. The T/P values obtained over the course of ripening for differently treated salted herring in barrels ranged between 1 and 12. The trends in T/P value during the ripening period for the different samples appeared to parallel the changes in texture perceived by sensory observation (subjective measurement), although the heterogeneous nature of the samples gave standard deviations about the replicate sample mean of around 5%. The method appears promising as an objective measure for monitoring this aspect of the textural quality of barrel-salted herring through ripening, if the reproducibility of test results can be improved by more careful standardisation of sample preparation and test protocol.
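
To make the T/P calculation concrete, here is a minimal sketch (with hypothetical data and function names) that extracts the peak force P and the relaxation time T, the time for the force to decay to 1/e of P, from a force-time trace and returns T/P.

```python
import numpy as np

def texture_t_over_p(time_s, force_n):
    """Compute the T/P texture parameter from a stress-relaxation curve.

    time_s  -- sample times (s), starting at the end of compression
    force_n -- measured force (N) at each time
    Returns (T, P, T/P), where P is the peak force and T is the time
    for the force to decay to 1/e (36.8%) of P.
    """
    force = np.asarray(force_n, dtype=float)
    time = np.asarray(time_s, dtype=float)
    peak_idx = int(np.argmax(force))
    p = force[peak_idx]
    target = p / np.e  # 0.368 * P
    # First time after the peak at which the force has relaxed to the target
    below = np.nonzero(force[peak_idx:] <= target)[0]
    if below.size == 0:
        raise ValueError("force never relaxed to 0.368 of the peak")
    t = time[peak_idx + below[0]] - time[peak_idx]
    return t, p, t / p

# Hypothetical trace: exponential relaxation from a 10 N peak
t_axis = np.linspace(0, 60, 601)
trace = 10.0 * np.exp(-t_axis / 15.0)
print(texture_t_over_p(t_axis, trace))  # T ≈ 15 s, P = 10 N, T/P ≈ 1.5
```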

Relevance: 30.00%

Abstract:

A wavelength conversion device was demonstrated at the bit rate of 2.488 Gb/s with 2R (reamplification and reshaping) regenerative properties. A low frequency pilot tone was removed during the conversion process and a new one added. The wavelength converter is shown to operate well at 10 Gb/s, and tone identification/replacement should also be possible at this data rate.

Relevance: 30.00%

Abstract:

The partially observable Markov decision process (POMDP) provides a popular framework for modelling spoken dialogue. This paper describes how the expectation propagation (EP) algorithm can be used to learn the parameters of the POMDP user model. Various special probability factors applicable to this task are presented, which allow the parameters to be learned even when the structure of the dialogue is complex. No annotations are required: neither the true dialogue state nor the true semantics of user utterances. Parameters optimised using the proposed techniques are shown to improve performance both in offline transcription experiments and in simulated dialogue management. ©2010 IEEE.
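
For orientation only, the sketch below shows the kind of belief update in which the POMDP user-model parameters appear; it is not the expectation propagation procedure described in the paper, and the states, acts, and probability values are hypothetical.

```python
import numpy as np

# Hypothetical dialogue states and user-model parameters.
states = ["want_hotel", "want_restaurant"]
# P(user act | state): rows = states, cols = user acts ["ask_hotel", "ask_food"]
user_model = np.array([[0.8, 0.2],
                       [0.1, 0.9]])
# P(next state | state): user goals assumed fairly persistent
transition = np.array([[0.95, 0.05],
                       [0.05, 0.95]])

def belief_update(belief, observed_act_probs):
    """One POMDP belief update: predict with the transition model,
    then weight by the user-model likelihood of the observed act.

    observed_act_probs -- confidence scores over user acts from the
                          speech-understanding component (may be noisy).
    """
    predicted = transition.T @ belief
    likelihood = user_model @ observed_act_probs
    new_belief = predicted * likelihood
    return new_belief / new_belief.sum()

b = np.array([0.5, 0.5])                    # uninformed initial belief
b = belief_update(b, np.array([0.7, 0.3]))  # noisy evidence for "ask_hotel"
print(b)  # belief shifts towards "want_hotel"
```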

Relevance: 30.00%

Abstract:

The Socio-economic Monitoring (SocMon) training workshop followed up on the capacity building workshop held in Mannar in 2015. Its aims were to validate information collected at the previous workshop, assist in filling any gaps, and develop a vision tree for future actions. Planned outputs included a detailed workplan, a workshop process report, and a final socioeconomic baseline report for Vidathaltivu village.

Relevance: 30.00%

Abstract:

Reinforcement techniques have been successfully used to maximise the expected cumulative reward of statistical dialogue systems. Typically, reinforcement learning is used to estimate the parameters of a dialogue policy which selects the system's responses based on the inferred dialogue state. However, the inference of the dialogue state itself depends on a dialogue model which describes the expected behaviour of a user when interacting with the system. Ideally the parameters of this dialogue model should also be optimised to maximise the expected cumulative reward. This article presents two novel reinforcement algorithms for learning the parameters of a dialogue model. First, the Natural Belief Critic algorithm is designed to optimise the model parameters while the policy is kept fixed. This algorithm is suitable, for example, in systems using a handcrafted policy, perhaps prescribed by other design considerations. Second, the Natural Actor and Belief Critic algorithm jointly optimises both the model and the policy parameters. The algorithms are evaluated on a statistical dialogue system modelled as a partially observable Markov decision process in a tourist information domain. The evaluation is performed with a user simulator and with real users. The experiments indicate that model parameters estimated to maximise the expected reward function provide improved performance compared to the baseline handcrafted parameters. © 2011 Elsevier Ltd. All rights reserved.
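
As a loose, simplified illustration of the underlying idea, optimising a probabilistic parameter to maximise expected reward from sampled dialogues, the sketch below uses a plain score-function (REINFORCE-style) update rather than the natural-gradient algorithms of the article; the scenario, parameter, and reward values are entirely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_dialogue(theta):
    """Hypothetical stand-in for running one dialogue: theta is the
    probability of one modelled user behaviour (confirming when asked).
    Returns (d/dtheta of log p(behaviour | theta), dialogue reward)."""
    confirmed = rng.random() < theta
    grad_logp = (1.0 / theta) if confirmed else (-1.0 / (1.0 - theta))
    reward = 10.0 if confirmed else 2.0  # confirming earns a higher reward
    return grad_logp, reward

# Plain (not natural) score-function gradient ascent on the model parameter.
theta, lr, batch = 0.5, 0.005, 200
for _ in range(50):
    grads, rewards = zip(*(sample_dialogue(theta) for _ in range(batch)))
    baseline = np.mean(rewards)
    grad = np.mean([g * (r - baseline) for g, r in zip(grads, rewards)])
    theta = float(np.clip(theta + lr * grad, 0.01, 0.99))
print(theta)  # drifts towards the behaviour with higher expected reward
```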

Relevance: 30.00%

Abstract:

This paper presents an agenda-based user simulator which has been extended to be trainable on real data, with the aim of more closely modelling the complex rational behaviour exhibited by real users. The trainable part is formed by a set of random decision points that may be encountered during the process of receiving a system act and responding with a user act. A sample-based method is presented for using real user data to estimate the parameters that control these decisions. Evaluation results are given both in terms of statistics of generated user behaviour and in terms of the quality of policies trained with different simulators. Compared to a handcrafted simulator, the trained system provides a much better fit to corpus data, and evaluations suggest that this better fit should result in improved dialogue performance. © 2010 Association for Computational Linguistics.
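
A minimal sketch of the sample-based idea (not the paper's actual estimator): for each random decision point, count which branch each real user turn is consistent with and normalise the counts into branch probabilities. The decision-point names and corpus format are hypothetical.

```python
from collections import Counter, defaultdict

# Hypothetical corpus: for each observed user turn, the random decision
# points encountered and the branch the real user's act is consistent with.
corpus = [
    {"confirm_when_asked": "yes", "add_extra_constraint": "no"},
    {"confirm_when_asked": "yes", "add_extra_constraint": "yes"},
    {"confirm_when_asked": "no"},
    {"confirm_when_asked": "yes", "add_extra_constraint": "no"},
]

def estimate_decision_probs(turns):
    """Estimate branch probabilities for each random decision point
    by counting how often each branch was taken in the corpus."""
    counts = defaultdict(Counter)
    for turn in turns:
        for decision_point, branch in turn.items():
            counts[decision_point][branch] += 1
    probs = {}
    for decision_point, branch_counts in counts.items():
        total = sum(branch_counts.values())
        probs[decision_point] = {b: c / total for b, c in branch_counts.items()}
    return probs

print(estimate_decision_probs(corpus))
# e.g. {'confirm_when_asked': {'yes': 0.75, 'no': 0.25},
#       'add_extra_constraint': {'no': 0.67, 'yes': 0.33}}
```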

Relevance: 30.00%

Abstract:

Several research studies have recently been initiated to investigate the use of construction site images for automated infrastructure inspection, progress monitoring, and similar tasks. In these studies, it is always necessary to extract material regions (concrete or steel) from the images. Existing methods make use of a material's characteristic color/texture ranges to retrieve material information, but they do not sufficiently discuss how to find appropriate ranges. As a result, users have to define them by themselves, which is difficult for those without a sufficient image processing background. This paper presents a novel method of identifying concrete material regions using machine learning techniques. Under the method, each construction site image is first divided into regions through image segmentation. The visual features of each region are then calculated and classified with a pre-trained classifier, whose output determines whether the region is composed of concrete or not. The method was implemented in C++ and tested on hundreds of construction site images, and the results were compared with manual classifications to indicate the method's validity.
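
The original system was implemented in C++; the sketch below illustrates the same pipeline (segmentation, per-region feature extraction, classification with a pre-trained classifier) in Python using scikit-image and scikit-learn, with hypothetical features, file name, and training data standing in for the paper's own.

```python
import numpy as np
from skimage import io, color, segmentation
from sklearn.svm import SVC

def region_features(rgb_pixels):
    """Simple per-region visual features: mean and std of HSV channels.
    (Stand-ins for the paper's colour/texture features.)"""
    hsv = color.rgb2hsv(rgb_pixels.reshape(-1, 1, 3)).reshape(-1, 3)
    return np.concatenate([hsv.mean(axis=0), hsv.std(axis=0)])

def classify_concrete_regions(image, classifier, n_segments=200):
    """Segment the image, compute features per region and label each
    region as concrete (1) or not (0) with a pre-trained classifier."""
    labels = segmentation.slic(image, n_segments=n_segments)
    region_labels = {}
    for region_id in np.unique(labels):
        feats = region_features(image[labels == region_id]).reshape(1, -1)
        region_labels[region_id] = int(classifier.predict(feats)[0])
    return labels, region_labels

# A toy 'pre-trained' classifier fitted on hypothetical labelled features;
# in practice it would be trained on features from annotated site images.
train_x = np.random.default_rng(0).random((40, 6))
train_y = (train_x[:, 1] < 0.3).astype(int)   # pretend low saturation = concrete
clf = SVC().fit(train_x, train_y)

image = io.imread("site_photo.jpg")           # hypothetical site image
labels, region_labels = classify_concrete_regions(image, clf)
print(sum(region_labels.values()), "regions classified as concrete")
```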

Relevance: 30.00%

Abstract:

Laser beam diagnosis is usually carried out off-line in order to minimise disruption to the process. This paper presents the results of a fractional sampling device for a high-power beam diagnosis system capable of measuring in-process beam properties such as beam diameter, intensity and beam position. The paper discusses the application of this sampling technique for monitoring beam properties during laser materials processing operations.

Relevance: 30.00%

Abstract:

A systematic study of the parameter space of graphene chemical vapor deposition (CVD) on polycrystalline Cu foils is presented, aiming at a more fundamental process rationale, in particular regarding the choice of carbon precursor and mitigation of Cu sublimation. CH4 as precursor requires H2 dilution and temperatures ≥1000 °C to keep the Cu surface reduced and yield a high-quality, complete monolayer graphene coverage. The H2 atmosphere etches as-grown graphene; hence, maintaining a balanced CH4/H2 ratio is critical. Such balance is more easily achieved at low-pressure conditions, at which, however, Cu sublimation reaches deleterious levels. In contrast, C6H6 as precursor requires no reactive diluent and consistently gives similar graphene quality at 100-150 °C lower temperatures. The lower process temperature and more robust processing conditions allow the problem of Cu sublimation to be effectively addressed. Graphene formation is not inherently self-limited to a monolayer for any of the precursors. Rather, the higher the supplied carbon chemical potential, the higher the likelihood of film inhomogeneity and of primary and secondary multilayer graphene nucleation. For the latter, domain boundaries of the inherently polycrystalline CVD graphene offer pathways for a continued carbon supply to the catalyst. Graphene formation is significantly affected by the Cu crystallography; i.e., the evolution of the microstructure and texture of the catalyst template forms an integral part of the CVD process. © 2012 American Chemical Society.