975 results for Test systems


Relevance:

30.00%

Publisher:

Abstract:

The activated sludge comprises a complex microbiological community. The structure (what types of microorganisms are present) and function (what the organisms can do and at what rates) of this community are determined by external physico-chemical features and by the influent to the sewage treatment plant. We can manipulate the external features but rarely the influent. Conventional control and operational strategies optimise activated sludge processes more as a chemical system than as a biological one. While optimising the process over a short time period, these strategies may degrade the long-term performance of the process owing to their potentially adverse impact on the microbial properties. After briefly reviewing the evidence available in the literature that plant design and operation affect both the structure and function of the microbial community in activated sludge, we propose to add sludge population optimisation as a new dimension to the control of biological wastewater treatment systems. We stress that optimising the microbial community structure and properties should be an explicit aim in the design and operation of a treatment plant. The major limitations to sludge population optimisation revolve around inadequate microbiological data, specifically community structure, function and kinetic data. However, molecular microbiological methods that strive to provide these data are being developed rapidly. The combination of these methods with conventional approaches to kinetic study is briefly discussed. The most pressing research questions pertaining to sludge population optimisation are outlined. (C) 2002 Elsevier Science Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Concussion severity grades according to the Cantu, Colorado Medical Society, and American Academy of Neurology systems were not clearly related to the presence or duration of impaired neuropsychological test performance in 21 professional rugby league athletes. The use of concussion severity guidelines and neuropsychological testing to assist return-to-play decisions requires further investigation.

Relevance:

30.00%

Publisher:

Abstract:

Pulp lifters, also known as pan lifters, are an integral part of the majority of autogenous (AG), semi-autogenous (SAG) and grate discharge ball mills. The performance of the pulp lifters in conjunction with grate design determines the ultimate flow capacity of these mills. Although the function of the pulp lifters is simply to transport the slurry passing through the discharge grate into the discharge trunnion, their performance depends on their design as well as that of the grate and on operating conditions such as mill speed and charge level. However, little or no work has been reported on the performance of grate-pulp lifter assemblies and, in particular, the influence of pulp lifter design on slurry transport. Ideally, the discharge rate through a grate-pulp lifter assembly should be equal to the discharge rate through the grate alone at a given mill hold-up. However, the results obtained have shown that conventional pulp lifter designs cause considerable restrictions to flow, resulting in reduced flow capacity. In this second of a two-part series of papers the performance of conventional pulp lifters (radial and spiral designs) is described, based on extensive test work carried out in a 1 m diameter pilot SAG mill. (C) 2003 Elsevier Science Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Chlorophyll fluorescence measurements have a wide range of applications, from basic understanding of photosynthetic functioning to plant environmental stress responses and direct assessments of plant health. The measured signal is the fluorescence intensity (expressed in relative units), and the most meaningful data are derived from the time-dependent increase in fluorescence intensity achieved upon application of continuous bright light to a previously dark-adapted sample. The fluorescence response changes over time and is termed the Kautsky curve or chlorophyll fluorescence transient. Recently, Strasser and Strasser (1995) formulated a group of fluorescence parameters, called the JIP-test, that quantify the stepwise flow of energy through Photosystem II, using input data from the fluorescence transient. The purpose of this study was to establish relationships between the biochemical reactions occurring in PS II and specific JIP-test parameters. This was approached using isolated systems that facilitated the addition of modifying agents (a PS II electron transport inhibitor, an electron acceptor and an uncoupler) whose effects on PS II activity are well documented in the literature. The alteration to PS II activity caused by each of these compounds could then be monitored through the JIP-test parameters and compared and contrasted with the literature. The known alteration in PS II activity of atrazine-resistant and -sensitive biotypes of Chenopodium album was also used to gauge the effectiveness and sensitivity of the JIP-test. The information gained from the in vitro study was successfully applied to an in situ study. This is the first in a series of four papers. It shows that the trapping parameters of the JIP-test were most affected by illumination and that the reduction in trapping had a run-on effect that inhibited electron transport. When irradiance exposure proceeded to photoinhibition, the electron transport probability parameter was greatly reduced and dissipation significantly increased. These results illustrate the advantage of monitoring a number of fluorescence parameters over the use of just one, which is often the case when the Fv/Fm ratio is used.
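
As a rough illustration of how such parameters are derived from the transient, the sketch below computes a few of the standard JIP-test quantities (Fv/Fm, Vj and the derived yields) from the O, J and P fluorescence levels. These are the commonly published JIP-test expressions, not necessarily the exact parameter set used in this paper, and the example values are hypothetical.

```python
def jip_test(f0, fj, fm):
    """Compute a few standard JIP-test parameters from an OJIP transient.

    f0 : minimal fluorescence (O step, ~50 us)
    fj : fluorescence at the J step (~2 ms)
    fm : maximal fluorescence (P step)
    """
    fv = fm - f0
    phi_p0 = fv / fm            # TR0/ABS, maximum quantum yield of primary photochemistry (Fv/Fm)
    vj = (fj - f0) / fv         # relative variable fluorescence at the J step
    psi_0 = 1.0 - vj            # ET0/TR0, probability a trapped exciton moves an electron beyond QA-
    phi_e0 = phi_p0 * psi_0     # ET0/ABS, quantum yield of electron transport
    phi_d0 = 1.0 - phi_p0       # DI0/ABS, quantum yield of energy dissipation
    return {"Fv/Fm": phi_p0, "Vj": vj, "psi0": psi_0, "phiE0": phi_e0, "phiD0": phi_d0}

# Hypothetical transient values in relative fluorescence units
print(jip_test(f0=500.0, fj=1500.0, fm=2500.0))
```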

Relevance:

30.00%

Publisher:

Abstract:

The goal of the present study is to map the nature of possible contributions of participatory online platforms to citizen actions that may contribute to the fight against cancer and its associated consequences. These platforms are usually associated with entertainment: in that sense, we intend to test their validity in other domains such as health, as well as contribute to an expanded perception of their potential by their users. The research is based on the analysis of online solidarity networks, namely the ones residing on Facebook, Orkut and the blogosphere, that citizens have been gradually resorting to. The research is also based on the development of newer and more efficient solutions that provide the individual (directly or indirectly affected by issues of oncology) with the means to overcome feelings of impotence and fatality. In this article, we aim at summarizing the processes of usage of these decentralized, freer participatory platforms by citizens and institutions, while attempting to unravel existing hype and stigma; we also provide a first survey of the importance and the role of institutions in this kind of endeavor; lastly, we present a prototype, developed in the context of the present study, that is specifically dedicated to addressing oncology through social media. This prototype is already available online at www.talkingaboutcancer.org, although it is still under development and testing. The main objective of this platform is to allow every citizen to freely build their network of contacts and information, according to their own individual and/or collective needs and desires.

Relevance:

30.00%

Publisher:

Abstract:

Wireless medical systems comprise four stages: the medical device, the data transport, the data collection and the data evaluation stages. Whereas the performance of the first stage is highly regulated, the others are not. This paper concentrates on the data transport stage and argues that it is necessary to establish standardized tests to be used by medical device manufacturers to provide comparable results concerning the communication performance of the wireless networks used to transport medical data. In addition, it suggests test parameters and procedures to be used to produce comparable communication performance results.
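
The abstract does not enumerate the suggested test parameters, but the kind of comparable figures it argues for would typically include delivery ratio and latency statistics over a test run. The sketch below is only an illustration of how such figures could be computed from timestamped send/receive logs; all names and values are hypothetical.

```python
from statistics import mean, quantiles

def transport_metrics(sent, received):
    """Summarize communication performance for a data-transport test run.

    sent     : list of (packet_id, t_sent) tuples
    received : list of (packet_id, t_received) tuples
    """
    recv = dict(received)
    latencies = [recv[pid] - t for pid, t in sent if pid in recv]
    pdr = len(latencies) / len(sent)                      # packet delivery ratio
    return {
        "packet_delivery_ratio": pdr,
        "mean_latency_s": mean(latencies) if latencies else None,
        "p95_latency_s": quantiles(latencies, n=20)[18] if len(latencies) >= 2 else None,
    }

# Hypothetical test run: packet 3 was lost in transport
sent = [(1, 0.00), (2, 0.10), (3, 0.20), (4, 0.30)]
received = [(1, 0.02), (2, 0.13), (4, 0.35)]
print(transport_metrics(sent, received))
```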

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a new methodology to reduce the probability of occurrence of states that cause load curtailment, while minimizing the costs involved in achieving that reduction. The methodology is supported by a hybrid method based on fuzzy sets and Monte Carlo simulation to capture both the randomness and the fuzziness of the component outage parameters of the transmission power system. The novelty of this research work consists in proposing two fundamental approaches: 1) a global steady approach, which builds the model of a faulted transmission power system with the aim of minimizing the unavailability corresponding to each faulted component in the transmission power system; this results in the minimal global investment cost for the faulted components in a sample of system states of the transmission network; 2) a dynamic iterative approach, which checks individually the effect of each investment on the transmission network. A case study using the IEEE 1996 Reliability Test System (RTS) with 24 buses is presented to illustrate in detail the application of the proposed methodology.
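
To give a concrete flavour of the fuzzy/Monte Carlo combination, the sketch below samples component states with outage probabilities represented as triangular fuzzy numbers (drawn through random alpha-cuts) and estimates how often the sampled states lead to load curtailment. It is only an illustration of the sampling idea under assumed numbers; the curtailment check is a placeholder and the paper's actual models and optimization approaches are not reproduced.

```python
import random

def sample_outage_prob(p_low, p_mode, p_high):
    """Draw a crisp outage probability from a triangular fuzzy number via a random alpha-cut."""
    alpha = random.random()
    lo = p_low + alpha * (p_mode - p_low)     # left end of the alpha-cut interval
    hi = p_high - alpha * (p_high - p_mode)   # right end of the alpha-cut interval
    return random.uniform(lo, hi)

def curtailment_state_frequency(components, causes_curtailment, n_samples=10000):
    """Estimate how often sampled system states lead to load curtailment.

    components         : {name: (p_low, p_mode, p_high)} fuzzy outage probabilities
    causes_curtailment : callable taking the set of failed components, returning True/False
    """
    hits = 0
    for _ in range(n_samples):
        failed = {c for c, fuzzy_p in components.items()
                  if random.random() < sample_outage_prob(*fuzzy_p)}
        if causes_curtailment(failed):
            hits += 1
    return hits / n_samples

# Hypothetical 3-component example: curtailment occurs only if both lines L1 and L2 are out
components = {"L1": (0.01, 0.02, 0.04), "L2": (0.01, 0.02, 0.04), "G1": (0.05, 0.08, 0.12)}
print(curtailment_state_frequency(components, lambda failed: {"L1", "L2"} <= failed))
```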

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a computationally efficient methodology for the optimal location and sizing of static and switched shunt capacitors in large distribution systems. The problem is formulated as the maximization of the savings produced by the reduction in energy losses and by the avoided costs due to investment deferral in the expansion of the network. The proposed method selects the nodes to be compensated, as well as the optimal capacitor ratings and their operational characteristics, i.e. fixed or switched. After an appropriate linearization, the optimization problem was formulated as a large-scale mixed-integer linear problem, suitable for solution with a widespread commercial package. Results of the proposed optimization method are compared with another recent methodology reported in the literature using two test cases: a 15-bus and a 33-bus distribution network. For both test cases, the proposed methodology delivers better solutions, indicated by higher loss savings achieved with lower amounts of capacitive compensation. The proposed method has also been applied to compensate an actual large distribution network served by AES-Venezuela in the metropolitan area of Caracas. A convergence time of about 4 seconds after 22298 iterations demonstrates the ability of the proposed methodology to efficiently handle large-scale compensation problems.
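
The sketch below shows the general shape of such a mixed-integer formulation: binary variables decide which candidate node receives which capacitor rating, and the objective trades loss savings against investment cost. It is a toy model written with the open-source PuLP package, with hypothetical nodes, ratings and costs; the paper's own formulation (linearized losses, fixed versus switched operation, investment deferral) is considerably richer.

```python
# pip install pulp
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, value

nodes = ["n3", "n7", "n12"]                     # hypothetical candidate nodes
sizes_kvar = [150, 300, 450]                    # hypothetical capacitor bank ratings
loss_saving = {("n3", 150): 900, ("n3", 300): 1500, ("n3", 450): 1800,   # $/yr, hypothetical
               ("n7", 150): 700, ("n7", 300): 1200, ("n7", 450): 1300,
               ("n12", 150): 400, ("n12", 300): 650, ("n12", 450): 700}
bank_cost = {150: 500, 300: 900, 450: 1200}     # annualized $ per bank, hypothetical
total_kvar_limit = 600                          # cap on total installed compensation

prob = LpProblem("capacitor_placement", LpMaximize)
x = LpVariable.dicts("install", (nodes, sizes_kvar), cat="Binary")

# Objective: energy-loss savings minus investment cost
prob += lpSum(loss_saving[n, s] * x[n][s] - bank_cost[s] * x[n][s]
              for n in nodes for s in sizes_kvar)
# At most one bank size per node, and a limit on total installed kvar
for n in nodes:
    prob += lpSum(x[n][s] for s in sizes_kvar) <= 1
prob += lpSum(s * x[n][s] for n in nodes for s in sizes_kvar) <= total_kvar_limit

prob.solve()
print([(n, s) for n in nodes for s in sizes_kvar if value(x[n][s]) > 0.5])
```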

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a methodology that is based on statistical failure and repair data of the transmission power system components and uses fuzzy-probabilistic modeling of the system component outage parameters. Using statistical records allows the fuzzy membership functions of the system component outage parameters to be developed. The proposed hybrid method of fuzzy sets and Monte Carlo simulation based on the fuzzy-probabilistic models allows capturing both the randomness and the fuzziness of the component outage parameters. Once the system states are obtained by Monte Carlo simulation, a network contingency analysis is performed to identify any overloading or voltage violation in the network. This is followed by a remedial action algorithm, based on optimal power flow, to reschedule generation and alleviate constraint violations and, at the same time, to avoid any load curtailment, if possible, or, otherwise, to minimize the total load curtailment for the states identified by the contingency analysis. To illustrate the application of the proposed methodology to a practical case, the paper includes a case study for the IEEE 1996 Reliability Test System (RTS) with 24 buses.
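
The remedial action step can be pictured as a small optimization that redispatches generation and only sheds load as a last resort. The sketch below solves a deliberately tiny two-bus version as a linear program (minimize curtailment subject to a power balance, generator limits and one line limit); the paper's remedial action is a full optimal power flow, and all numbers here are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

# Variables: x = [g1, g2, curtailment]; all values in MW.
# Generator g1 sits at bus 1, generator g2 and the load sit at bus 2,
# so the flow on the single tie line equals g1.
demand = 120.0
cost = np.array([0.0, 0.0, 1.0])                      # objective: minimize load curtailment only

A_eq = np.array([[1.0, 1.0, 1.0]])                    # g1 + g2 + curtailment = demand
b_eq = np.array([demand])
A_ub = np.array([[1.0, 0.0, 0.0]])                    # tie-line flow (= g1) limited
b_ub = np.array([60.0])                               # hypothetical 60 MW line limit
bounds = [(0.0, 150.0), (0.0, 40.0), (0.0, demand)]   # generator limits and curtailment range

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
g1, g2, curtailment = res.x
print(f"g1={g1:.1f} MW, g2={g2:.1f} MW, curtailment={curtailment:.1f} MW")
```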

Relevance:

30.00%

Publisher:

Abstract:

Significant work has been done in the areas of Pervcomp/Ubicomp smart environments, with advances in making proactive systems, but those advances have not made these types of systems accurately proactive. On the other hand, a great deal of work is needed to make systems more sensible/sensitive and trustworthy (both in terms of reliability and privacy). We put forward the thesis that a more integral and socially aware sort of intelligence is needed to effectively interact, decide and act on behalf of people's interests, and that, as a consequence, a way to test how effectively systems achieve this desirable behaviour is needed. We support our thesis by providing examples of how to measure effectiveness in a variety of different environments.

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Paper and thin layer chromatography methods are frequently used in classic nuclear medicine for the determination of radiochemical purity (RCP) of radiopharmaceutical preparations. An aliquot of the radiopharmaceutical to be tested is spotted at the origin of a chromatographic strip (stationary phase), which in turn is placed in a chromatographic chamber in order to separate and quantify the radiochemical species present in the radiopharmaceutical preparation. There are several methods for the RCP measurement, based on the use of equipment such as dose calibrators, well scintillation counters, radiochromatographic scanners and gamma cameras. The purpose of this study was to compare these quantification methods for the determination of RCP. Material and Methods: 99mTc-Tetrofosmin and 99mTc-HDP were the radiopharmaceuticals chosen to serve as the basis for this study. For the determination of the RCP of 99mTc-Tetrofosmin we used ITLC-SG (2.5 x 10 cm) and 2-butanone (99mTc-tetrofosmin Rf = 0.55, 99mTcO4- Rf = 1.0, other labeled impurities 99mTc-RH Rf = 0.0). For the determination of the RCP of 99mTc-HDP, Whatman 31ET and acetone were used (99mTc-HDP Rf = 0.0, 99mTcO4- Rf = 1.0, other labeled impurities Rf = 0.0). After the development of the solvent front, the strips were allowed to dry and then imaged on the gamma camera (256x256 matrix; zoom 2; LEHR parallel-hole collimator; 5-minute image) and on the radiochromatographic scanner. Then, the strips were cut at Rf 0.8 in the case of 99mTc-tetrofosmin and at Rf 0.5 in the case of 99mTc-HDP. The resulting pieces were crushed in an assay tube (to minimize the effect of counting geometry) and counted in the dose calibrator and in the well scintillation counter (for 1 minute). The RCP was calculated using the formula: % 99mTc-Complex = [(99mTc-Complex) / (Total amount of 99mTc-labeled species)] x 100. Statistical analysis was done using the test of hypotheses for the difference between means in independent samples. Results: The gamma camera based method demonstrated higher operator dependency (especially concerning the drawing of the ROIs), and the measurements obtained using the dose calibrator are very sensitive to the amount of activity spotted on the chromatographic strip, so the use of a minimum activity of 3.7 MBq is essential to minimize quantification errors. The radiochromatographic scanner and the well scintillation counter showed concordant results and demonstrated the highest level of precision. Conclusions: The methods based on radiochromatographic scanners and well scintillation counters proved to be the most accurate and least operator-dependent.
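
The RCP calculation itself is a simple ratio of counts, directly transcribing the formula quoted above. The sketch below applies it to the two pieces of a cut strip; the count values are hypothetical.

```python
def radiochemical_purity(counts_complex, counts_other_species):
    """% RCP = 100 * (99mTc-complex counts) / (total counts of all 99mTc-labeled species)."""
    total = counts_complex + counts_other_species
    return 100.0 * counts_complex / total

# Hypothetical well-counter readings (counts per minute) for the two pieces
# of a 99mTc-tetrofosmin ITLC-SG strip cut at Rf 0.8
print(radiochemical_purity(counts_complex=95230, counts_other_species=2710))  # ~97.2 %
```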

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents the Fuzzy Monte Carlo Model for Transmission Power Systems Reliability based studies (FMC-TRel) methodology, which is based on statistical failure and repair data of the transmission power system components and uses fuzzy-probabilistic modeling of the system component outage parameters. Using statistical records allows the fuzzy membership functions of the system component outage parameters to be developed. The proposed hybrid method of fuzzy sets and Monte Carlo simulation based on the fuzzy-probabilistic models allows capturing both the randomness and the fuzziness of the component outage parameters. Once the system states are obtained, a network contingency analysis is performed to identify any overloading or voltage violation in the network. This is followed by a remedial action algorithm, based on optimal power flow, to reschedule generation and alleviate constraint violations and, at the same time, to avoid any load curtailment, if possible, or, otherwise, to minimize the total load curtailment for the states identified by the contingency analysis. For the system states that cause load curtailment, an optimization approach is applied to reduce the probability of occurrence of these states while minimizing the costs to achieve that reduction. This methodology is of great importance for supporting the transmission system operator's decision making, namely in the identification of critical components and in the planning of future investments in the transmission power system. A case study based on the IEEE 1996 Reliability Test System (RTS) with 24 buses is presented to illustrate in detail the application of the proposed methodology.

Relevance:

30.00%

Publisher:

Abstract:

Fieldbus communication networks aim to interconnect sensors, actuators and controllers within process control applications. Therefore, they constitute the foundation upon which real-time distributed computer-controlled systems can be implemented. P-NET is a fieldbus communication standard that uses a virtual token-passing medium-access-control mechanism. In this paper, pre-run-time schedulability conditions for supporting real-time traffic with P-NET networks are established. Essentially, formulae to evaluate the upper bound of the end-to-end communication delay of P-NET messages are provided. Using this upper bound, a feasibility test is then provided to check the timing requirements for accessing remote process variables. This paper also shows how P-NET network segmentation can significantly reduce the end-to-end communication delays for messages with stringent timing requirements.
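
The feasibility test reduces to comparing, for each message stream, a worst-case end-to-end delay bound against the stream's timing requirement. The sketch below only shows that comparison step; the delay bound is supplied as a placeholder callable because the actual P-NET formulae are the subject of the paper and are not reproduced here. All stream names and numbers are hypothetical.

```python
def feasible(streams, delay_bound):
    """Pre-run-time feasibility test: every remote-variable access must meet its deadline.

    streams     : list of dicts with at least a 'name' and a 'deadline' (seconds)
    delay_bound : callable returning an upper bound on the end-to-end communication
                  delay for a stream (in the paper this comes from the P-NET formulae;
                  here it is supplied by the caller)
    """
    return all(delay_bound(s) <= s["deadline"] for s in streams)

# Hypothetical message streams and a placeholder delay bound (e.g. a fixed worst-case
# token-rotation time assumed for every access on the segment)
streams = [{"name": "valve_pos", "deadline": 0.050},
           {"name": "temp_sensor", "deadline": 0.200}]
print(feasible(streams, delay_bound=lambda s: 0.030))
```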

Relevance:

30.00%

Publisher:

Abstract:

It has been shown that in reality at least two general scenarios of data structuring are possible: (a) a self-similar (SS) scenario, when the measured data form an SS structure, and (b) a quasi-periodic (QP) scenario, when the repeated (strongly correlated) data form random sequences that are almost periodic with respect to each other. In the second case it becomes possible to describe their behavior and express a part of their randomness quantitatively in terms of the deterministic amplitude-frequency response belonging to the generalized Prony spectrum. This possibility allows us to re-examine the conventional concept of measurements and opens a new way to describe a wide set of different data. In particular, it concerns different complex systems for which a 'best-fit' model pretending to describe the measured data is absent, but where there is a clear need to describe these data in terms of a reduced number of quantitative parameters. The possibilities of the proposed approach and of the detection algorithm for QP processes were demonstrated on actual data: spectroscopic data recorded for pure water and acoustic data for a test hole. The suggested methodology allows revising the accepted classification of different incommensurable and self-affine spatial structures and finding an accurate interpretation of generalized Prony spectroscopy, which includes Fourier spectroscopy as a particular case.
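
For readers unfamiliar with Prony-type descriptions, the sketch below implements the classical (polynomial) Prony method, which fits a signal to a sum of damped complex exponentials via linear prediction. The generalized Prony spectrum developed in the paper goes well beyond this, and the test signal and model order below are hypothetical.

```python
import numpy as np

def prony(x, p):
    """Classical Prony fit of a real signal to p complex damped exponentials.

    Returns (z, h): discrete-time poles z_k and complex amplitudes h_k such that
    x[n] ~ sum_k h_k * z_k**n.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    # 1) Linear prediction: x[m] ~ -(a1*x[m-1] + ... + ap*x[m-p]) in the least-squares sense
    A = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
    a, *_ = np.linalg.lstsq(A, -x[p:], rcond=None)
    # 2) Poles are the roots of the characteristic polynomial z**p + a1*z**(p-1) + ... + ap
    z = np.roots(np.concatenate(([1.0], a)))
    # 3) Amplitudes from a least-squares Vandermonde fit
    V = np.vander(z, n, increasing=True).T          # V[m, k] = z_k**m
    h, *_ = np.linalg.lstsq(V, x.astype(complex), rcond=None)
    return z, h

# Hypothetical test signal: one damped cosine plus a slow decay
t = np.arange(200)
signal = 2.0 * np.exp(-0.01 * t) * np.cos(0.3 * t) + 0.5 * np.exp(-0.002 * t)
z, h = prony(signal, p=4)
print(np.abs(z), np.abs(h))
```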

Relevance:

30.00%

Publisher:

Abstract:

We consider the global scheduling problem of multimode real-time systems upon identical multiprocessor platforms. During the execution of a multimode system, the system can change from one mode to another such that the current task set is replaced with a new task set. Thereby, ensuring that deadlines are met requires not only that a schedulability test be performed on the tasks in each mode but also that (i) a protocol for transitioning from one mode to another be specified and (ii) a schedulability test for each transition be performed. In this paper, we extend the synchronous transition protocol SM-MSO in order to take into account mode-independent tasks [1], i.e., tasks whose execution pattern must not be jeopardized by the mode changes.
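
As an example of the per-mode part of such an analysis, the sketch below implements a well-known sufficient schedulability test for global EDF on identical multiprocessors (the utilization bound U_total <= m - (m - 1) * U_max for implicit-deadline sporadic tasks). This is not the transition analysis or the SM-MSO protocol studied in the paper, and the task set is hypothetical.

```python
def gfb_schedulable(tasks, m):
    """Sufficient global-EDF test for implicit-deadline sporadic tasks on m identical
    processors (Goossens/Funk/Baruah utilization bound): U_total <= m - (m - 1) * U_max.

    tasks : list of (wcet, period) pairs
    """
    utilizations = [c / t for c, t in tasks]
    return sum(utilizations) <= m - (m - 1) * max(utilizations)

# Hypothetical mode with three tasks on a 2-processor platform
mode_tasks = [(2, 10), (3, 15), (5, 20)]   # (worst-case execution time, period)
print(gfb_schedulable(mode_tasks, m=2))
```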