918 results for Measure of riskiness
Abstract:
Young Clarias gariepinus cultured in an artificial tank were severely affected by an ulcer-type disease in which 77% of the fish died within 5 weeks. Aeromonas, Pseudomonas, Flavobacterium, Micrococcus and Staphylococcus were isolated from the lesions and kidneys of affected fish, with Aeromonas observed as the dominant genus. Among the isolates, A. hydrophila isolate AGK 34 was identified as a pathogen by an experimental challenge test. To find a suitable remedial measure for the disease, four different chemotherapeutants were applied to affected fish in 6 different ways under laboratory conditions. Affected fish recovered under the different treatments, but the best result was obtained by a successive bath in 1-2% NaCl followed by oral treatment with commercial oxytetracycline at a dose of 75 mg/kg body weight of fish.
Abstract:
Silver belly (Leiognathus splendens) caught in September spoiled faster than fish caught in May, which could be due to seasonal changes. For silver belly, the Total Volatile Base (TVB) value can be used as a measure of spoilage; at the onset of spoilage the TVB value is between 30 and 40 mg N/100 g of sample. The main spoilage of silver belly appears to start between 6 and 8 hours (at 28-30 °C) after landing on board. It is therefore not necessary to ice silver belly immediately; icing within 6 hours of landing on board appears to be sufficient.
Effect of salinity on food consumption and growth of juvenile Nile tilapia (Oreochromis niloticus L.)
Abstract:
The effect of salinity (0, 10 and 20‰; water temperature 28 ± 1 °C) on food consumption and growth of juvenile Nile tilapia, Oreochromis niloticus L. (9.94 ± 0.15 g), was investigated by feeding groups of 20 fish at 2% body weight per day. Individual food consumption was measured using X-radiography. There were no significant differences in growth or white muscle protein concentrations among groups. During feed deprivation, weight loss was similar for fish held at 0‰ and 10‰ salinity, but after 7 days over 50% of the fish maintained at 20‰ salinity developed lesions covering 5-25% of the body. No significant relationships were observed between individual specific growth rates and food consumption rates within the groups. The fish in all salinity groups showed a negative correlation between specific growth rate and food conversion ratio. The coefficient of variation for wet-weight-specific food consumption and the mean share of meal (MSM) for each fish were used as measures of social hierarchy strength. A negative correlation was observed between the coefficient of variation in food consumption and the mean share of meal. The social hierarchy structure was similar at all salinities: 25% of the fish were dominant (18.29% above an equal share of meal), 30% were subordinate (16.19% below an equal share of meal), and the remaining 45% fed close to their theoretical share of meal (MSM, 5.26%).
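As an illustration of the two hierarchy statistics mentioned above, the following sketch computes a coefficient of variation and a mean share of meal from per-meal consumption records; the data array and variable names are hypothetical, and the paper's exact formulas are not given in the abstract.

```python
import numpy as np

# Hypothetical per-fish food consumption over several meals (rows = meals,
# columns = fish); real data would come from the X-radiography records.
consumption = np.array([
    [1.2, 0.4, 0.9, 0.1],
    [1.1, 0.5, 0.8, 0.2],
    [1.3, 0.3, 1.0, 0.1],
])

n_fish = consumption.shape[1]
equal_share = 1.0 / n_fish  # each fish's share under a perfectly even split

# Mean share of meal (MSM): each fish's average fraction of the total meal.
shares = consumption / consumption.sum(axis=1, keepdims=True)
msm = shares.mean(axis=0)

# Coefficient of variation (CV) of individual consumption across meals,
# used in the abstract as a measure of social hierarchy strength.
cv = consumption.std(axis=0, ddof=1) / consumption.mean(axis=0)

for i, (m, c) in enumerate(zip(msm, cv)):
    status = "dominant" if m > equal_share else "subordinate"
    print(f"fish {i}: MSM={m:.3f} (equal share={equal_share:.3f}, {status}), CV={c:.3f}")
```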
Abstract:
Computational Design has traditionally required a great deal of geometrical and parametric data. This data can only be supplied at stages later than conceptual design, typically the detail stage, and design quality is given by some absolute fitness function. On the other hand, design evaluation offers a relative measure of design quality that requires only a sparse representation. Quality, in this case, is a measure of how well a design will complete its task.
The research intends to address the question: "Is it possible to evaluate a mechanical design at the conceptual design phase and make some prediction of its quality?" Quality can be interpreted as success in the marketplace, success in performing the required task, or some other user requirement. This work aims to determine a minimum level of representation such that conceptual designs can be usefully evaluated without needing to capture detailed geometry. This representation will form the model for the conceptual designs being considered for evaluation. The method to be developed will be a case-based evaluation system that uses a database of previous designs to support design exploration, as sketched below. The method will not be able to support novel design, as case-based design implies that the model topology must be fixed.
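A minimal sketch of what such a case-based evaluation could look like, assuming previous designs are stored as sparse functional descriptors paired with recorded quality scores; the feature names, values and the nearest-neighbour retrieval rule are illustrative assumptions, not the thesis's actual method.

```python
import numpy as np

# Hypothetical case base: sparse functional descriptors of previous designs
# (no detailed geometry), paired with a recorded quality score for each.
case_features = np.array([
    [2, 1, 0, 3],   # e.g. counts of joints, actuators, supports, links
    [1, 2, 1, 2],
    [3, 0, 2, 4],
])
case_quality = np.array([0.80, 0.55, 0.70])

def evaluate_concept(features, k=2):
    """Predict quality of a new concept as the mean quality of its
    k nearest cases in descriptor space (fixed topology assumed)."""
    dists = np.linalg.norm(case_features - np.asarray(features), axis=1)
    nearest = np.argsort(dists)[:k]
    return case_quality[nearest].mean()

print(evaluate_concept([2, 1, 1, 3]))  # predicted quality for a new concept
```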
Abstract:
Background and objectives: Pentobarbital and ketamine are commonly used in animal experiments, including studies on the effects of ageing on the central nervous system. The electroencephalogram is a sensitive measure of brain activity. The present study i
Abstract:
The aim of the present study was to investigate whether different levels of circulating corticosterone (CORT) modulate the effect of nicotine on prepulse inhibition (PPI), a measure of sensorimotor gating that is disrupted in schizophrenia and other mental illnesses. Four groups of mice were investigated: sham-operated; adrenalectomized (ADX) and implanted with a cholesterol pellet; ADX and implanted with a 10 mg CORT pellet; or ADX and implanted with a 50 mg CORT pellet. Different CORT levels or doses of nicotine did not significantly affect startle responses. Baseline PPI was significantly reduced in mice implanted with the highest dose of CORT. In ADX mice implanted with cholesterol, nicotine treatment influenced PPI depending on the prepulse intensity. In ADX mice implanted with 50 mg of CORT, treatment with 10 mg/kg of nicotine caused a significant increase in PPI at all prepulse intensities. Binding studies showed that corticosterone treatment significantly affected nicotinic acetylcholine receptor (nAChR) density in the mouse brain: treatment with 50 mg CORT decreased I-125-epibatidine binding in the globus pallidus and I-125-alpha-bungarotoxin binding in the claustrum. These results suggest a possible interaction of corticosterone and nicotine at the level of the alpha4- and alpha7-type nAChRs in the regulation of PPI. In situations of high circulating levels of corticosterone, nicotine may be beneficial in restoring disrupted PPI. (C) 2004 Elsevier Ltd. All rights reserved.
Abstract:
Biological diversity of an ecosystem is considered a reliable measure of the state of health of the ecosystem. In Uganda's large lakes, Victoria and Kyoga, the past three decades have been characterized by profound changes in fish species composition following the introduction of the piscivorous Nile perch (Ogutu-Ohwayo 1990). Over 300 haplochromine cichlid species comprising a wide range of trophic groups were lost, along with a host of non-cichlid fishes which had occupied virtually all available ecological niches in the lakes (Witte 1992). A second major ecological event has been the gradual nutrient enrichment of the water bodies (eutrophication) from diffuse and point sources, while at the same time pollutants have entered the water systems in pace with industrial development and human population increases in the lake basins. Eutrophication and pollution have drastically altered the physical and chemical character of the water medium in which different fauna and flora thrive. In Lake Victoria these alterations have resulted in changes of algal species composition from a pristine community dominated by chlorophytes and diatoms (Melosira, etc.) to one composed largely of blue-green algae, or Cyanobacteria (Microcystis, Anabaena, Planktolyngbya, etc.) (Mugidde 1993, Hecky 1993).
Abstract:
The movement of the circular piston in an oscillating piston positive displacement flowmeter is important in understanding the operation of the flowmeter, and the leakage of liquid past the piston plays a key role in the performance of the meter. The clearances between the piston and the chamber are small, typically less than 60 μm. In order to measure this film thickness, a fluorescent dye was added to the water passing through the meter, which was illuminated with UV light. Visible-light images were captured with a digital camera and analysed to give a measure of the film thickness with an uncertainty of less than 7%. This method lacks precision unless careful calibration is undertaken; methods to achieve this are discussed in the paper. The grey-level values for a range of film thicknesses were calibrated in situ with six dye concentrations to select the most appropriate concentration for the range of liquid film thickness, as sketched below. Data obtained for the oscillating piston flowmeter demonstrate the value of the fluorescence technique. The method is useful, inexpensive and straightforward, and can be extended to other applications where measurement of liquid film thickness is required. © 2011 IOP Publishing Ltd.
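The in-situ calibration step described above can be pictured as fitting grey level against known film thickness and then inverting the fit; the sketch below assumes a linear dye response and hypothetical calibration data, which may differ from the paper's actual procedure.

```python
import numpy as np

# Hypothetical calibration data for one dye concentration: known film
# thicknesses (micrometres) and the mean grey level measured at each.
thickness_um = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
grey_level = np.array([32.0, 61.0, 88.0, 118.0, 149.0, 176.0])

# Fit a simple linear calibration grey = a * thickness + b; the real
# response may be nonlinear, in which case a higher-order fit is needed.
a, b = np.polyfit(thickness_um, grey_level, 1)

def grey_to_thickness(g):
    """Invert the calibration to estimate film thickness from grey level."""
    return (g - b) / a

print(grey_to_thickness(100.0))  # estimated thickness in micrometres
```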
Abstract:
Consumer goods manufacturers aiming to reduce the environmental impact of their products commonly pursue incremental change strategies, but more radical approaches may be required to address the challenges of sustainable consumption. One strategy to realize step-change reductions is to prepare a portfolio of innovations providing different levels of impact reduction in exchange for different levels of organizational resource commitment. In this research a tool is developed to support this strategy, starting from the assumption that a long-list of candidate innovations has been created through brainstorming or other eco-innovation approaches. The tool assesses the potential greenhouse gas benefit of an innovative option against the difficulty of its implementation. A simple greenhouse gas benefit assessment method based on streamlined LCA was used to analyze impact reduction potential, and a novel measure of implementation difficulty was developed. The predictions of implementation difficulty were compared against expert opinion and showed similar results, indicating that the measure can sensibly be used to predict implementation difficulty. The assessment of environmental gain versus implementation difficulty is visualized in a matrix showing the trade-offs of several options, as sketched below. The tool is deliberately simple, with scalar measures of CO2 emissions benefits and implementation difficulty, so tool users must remain aware of other potential environmental burdens besides greenhouse gases (e.g. water, waste). In addition, although the relative life cycle emissions benefit of an option may be low, its absolute impact can be high and there may be other co-benefits, which could justify higher levels of implementation difficulty. Different types of consumer products (e.g. household, personal care, foods) have been evaluated using the tool. Initial trials within Unilever demonstrate that the tool facilitates rapid evaluation of low-carbon innovations. © 2011 Elsevier Ltd. All rights reserved.
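A minimal sketch of the benefit-versus-difficulty screening described above, assuming each long-listed innovation has already been given a scalar CO2 benefit score and a scalar implementation-difficulty score; the option names, scores, scales and quadrant thresholds are all hypothetical, not Unilever's actual tool.

```python
# Hypothetical long-list of innovations: (streamlined-LCA CO2 benefit,
# implementation difficulty), both on arbitrary 0-10 scales.
options = {
    "concentrated formulation": (8.0, 6.5),
    "lighter packaging":        (3.0, 2.0),
    "cold-wash reformulation":  (9.0, 8.5),
    "recycled content":         (4.0, 3.5),
}

BENEFIT_CUT, DIFFICULTY_CUT = 5.0, 5.0  # assumed quadrant thresholds

for name, (benefit, difficulty) in options.items():
    if benefit >= BENEFIT_CUT and difficulty < DIFFICULTY_CUT:
        quadrant = "quick win"
    elif benefit >= BENEFIT_CUT:
        quadrant = "strategic bet (high benefit, high difficulty)"
    elif difficulty < DIFFICULTY_CUT:
        quadrant = "easy but marginal"
    else:
        quadrant = "deprioritize"
    print(f"{name}: benefit={benefit}, difficulty={difficulty} -> {quadrant}")
```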
Abstract:
This paper examines the possibility of using a background gas medium to enhance the current available from low-threshold carbon cathodes. The field emission current is used to initiate a plasma in the gas medium, thereby achieving a current multiplication effect. Results on the variation of anode current as a function of electric field and gas pressure are presented and compared with model calculations to verify the principles of operation. The influence of ion bombardment on the long-term performance of thin-film carbon cathodes is examined for He and Ar multiplication plasmas. A measure of the influence of current multiplication on display quality is presented by examining light output from two standard low-voltage phosphors. Also studied are the influence of doping the carbon with N to lower the threshold voltage for emission, as well as the consequent impact on anode current from the plasma.
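The current multiplication described above can be pictured, to first order, with the standard Townsend avalanche relations; the abstract does not state the authors' model, so the following is offered only as generic background.

```latex
% First-order Townsend multiplication of a field-emitted seed current I_0
% across an anode--cathode gap d in a gas at pressure p; \alpha is the
% first ionization coefficient, with A, B gas-specific constants and E
% the applied electric field.
\[
  I \;=\; I_0\, e^{\alpha d},
  \qquad
  \frac{\alpha}{p} \;=\; A \exp\!\left(-\frac{B\,p}{E}\right)
\]
```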
Abstract:
The optimization of dialogue policies using reinforcement learning (RL) is now an accepted part of the state of the art in spoken dialogue systems (SDS). Yet the commonly used training algorithms for SDS still require a large number of dialogues, and hence most systems rely on artificial data generated by a user simulator; optimization is therefore performed off-line before releasing the system to real users. Gaussian Processes (GPs) for RL have recently been applied to dialogue systems. One advantage of GPs is that they compute an explicit measure of uncertainty in the value function estimates computed during learning. In this paper, a class of novel learning strategies is described which uses this uncertainty to control exploration on-line. Comparisons between several exploration schemes show that significant improvements in learning speed can be obtained and that rapid and safe on-line optimization is possible, even on a complex task. Copyright © 2011 ISCA.
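A minimal sketch of uncertainty-driven exploration, assuming the GP posterior supplies a mean and a standard deviation for each candidate action's value; the UCB-style and Thompson-style rules below are generic stand-ins, not necessarily the specific schemes compared in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical GP posterior over Q-values for the candidate dialogue acts
# in the current belief state: a mean and a standard deviation per action.
q_mean = np.array([0.40, 0.55, 0.52])
q_std  = np.array([0.02, 0.05, 0.30])

def select_action_ucb(mean, std, beta=1.0):
    """UCB-style active exploration: prefer actions whose value estimate
    is either high or still uncertain; beta trades off the two."""
    return int(np.argmax(mean + beta * std))

def select_action_thompson(mean, std):
    """Thompson-style alternative: sample one plausible Q-value per action
    from the posterior and act greedily on the sample."""
    return int(np.argmax(rng.normal(mean, std)))

print(select_action_ucb(q_mean, q_std))       # favours the uncertain action
print(select_action_thompson(q_mean, q_std))
```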
Abstract:
An X-ray imaging technique is used to probe the stability of 3-dimensional granular packs in a slowly rotating drum. Well before the surface reaches the avalanche angle, we observe intermittent plastic events associated with collective rearrangements of the grains located in the vicinity of the free surface. The energy released by these discrete events grows as the system approaches the avalanche threshold. By testing various preparation methods, we show that the pre-avalanche dynamics is not solely controlled by the difference between the free-surface inclination and the avalanche angle. As a consequence, measurement of the pre-avalanche dynamics is unlikely to serve as a tool for predicting macroscopic avalanches.
Abstract:
When searching for characteristic subpatterns in potentially noisy graph data, it appears self-evident that having multiple observations would be better than having just one. However, it turns out that the inconsistencies introduced when different graph instances have different edge sets pose a serious challenge. In this work we address this challenge for the problem of finding maximum weighted cliques. We introduce the concept of the most persistent soft-clique: a subset of vertices that (1) is almost fully, or at least densely, connected, (2) occurs in all or almost all graph instances, and (3) has the maximum weight. We present a measure of clique-ness that essentially counts the number of edges missing to make a subset of vertices into a clique. With this measure, we show that the most persistent soft-clique problem can be cast either as (a) a max-min two-person game optimization problem, or (b) a min-min soft margin optimization problem. Both formulations lead to the same solution when using a partial Lagrangian method to solve the optimization problems. Through experiments on synthetic data and on real social network data we show that the proposed method is able to reliably find soft cliques in graph data, even when it is distorted by random noise or unreliable observations. Copyright 2012 by the author(s)/owner(s).
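A minimal sketch of the clique-ness measure described above: the penalty for a vertex subset counts the edges missing to make it a clique, here simply summed across graph instances. The instance data and the aggregation rule are illustrative assumptions; the paper's game-theoretic formulations are not reproduced.

```python
from itertools import combinations

# Hypothetical graph instances over the same vertex set, each given as a
# set of undirected edges; differing edge sets model noisy observations.
instances = [
    {(0, 1), (0, 2), (1, 2), (2, 3)},
    {(0, 1), (1, 2), (0, 2)},
    {(0, 1), (0, 2), (1, 2), (1, 3)},
]

def missing_edges(subset, edges):
    """Number of edges that would have to be added to make `subset`
    a clique in a graph with edge set `edges` (the clique-ness penalty)."""
    return sum(
        1
        for u, v in combinations(sorted(subset), 2)
        if (u, v) not in edges and (v, u) not in edges
    )

subset = {0, 1, 2}
penalty = sum(missing_edges(subset, g) for g in instances)
print(penalty)  # 0: {0, 1, 2} is a clique in every instance
```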
Abstract:
Localization of chess-board vertices is a common task in computer vision, underpinning many applications, but relatively little work focuses on designing a specific feature detector that is fast, accurate and robust. In this paper the 'Chess-board Extraction by Subtraction and Summation' (ChESS) feature detector, designed to respond exclusively to chess-board vertices, is presented. The proposed method is robust against noise, poor lighting and poor contrast, requires no prior knowledge of the extent of the chess-board pattern, is computationally very efficient, and provides a strength measure of detected features. Such a detector has significant application both in the key field of camera calibration and in structured-light 3D reconstruction. Evidence is presented showing its robustness, accuracy and efficiency in comparison to other commonly used detectors, both under simulation and in experimental 3D reconstruction of flat plate and cylindrical objects.
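The subtraction-and-summation idea can be sketched as follows: on a ring around a true chess-board vertex, diametrically opposite samples agree while samples 90° apart differ. The code below implements that intuition only; the sampling pattern and exact kernel of the published ChESS detector may differ.

```python
import numpy as np

def vertex_response(img, y, x, radius=5, n=16):
    """Simplified chess-board-vertex response at (y, x): large when
    diametrically opposite ring samples match while samples 90 degrees
    apart differ (alternating light/dark quadrants)."""
    # Half-sample angular offset keeps samples off the quadrant boundaries.
    angles = 2 * np.pi * (np.arange(n) + 0.5) / n
    ys = np.round(y + radius * np.sin(angles)).astype(int)
    xs = np.round(x + radius * np.cos(angles)).astype(int)
    s = img[ys, xs].astype(float)

    half, quarter = n // 2, n // 4
    # Summation term: each opposing pair against the pair rotated 90 degrees.
    sum_resp = sum(
        abs(s[i] + s[i + half]
            - s[(i + quarter) % n] - s[(i + quarter + half) % n])
        for i in range(quarter)
    )
    # Subtraction term: opposite samples should be equal at a true vertex.
    diff_resp = sum(abs(s[i] - s[i + half]) for i in range(half))
    return sum_resp - diff_resp

# Synthetic corner: four alternating quadrants meeting at (15.5, 15.5).
img = np.zeros((32, 32))
img[:16, :16] = img[16:, 16:] = 255.0
print(vertex_response(img, 15.5, 15.5))  # strong response at the vertex
print(vertex_response(img, 8.0, 8.0))    # ~0 inside a uniform square
```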
Abstract:
In order to minimize the number of iterations to a turbine design, reasonable choices of the key parameters must be made at the preliminary design stage. The choice of blade loading is of particular concern in the low pressure (LP) turbine of civil aero engines, where the use of high-lift blades is widespread. This paper considers how blade loading should be measured, compares the performance of various loss correlations, and explores the impact of blade lift on performance and lapse rates. To these ends, an analytical design study is presented for a repeating-stage, axial-flow LP turbine. It is demonstrated that the long-established Zweifel lift coefficient (Zweifel, 1945, "The Spacing of Turbomachine Blading, Especially with Large Angular Deflection," Brown Boveri Rev., 32(1), pp. 436-444) is flawed because it does not account for the blade camber. As a result the Zweifel coefficient is only meaningful for a fixed set of flow angles and cannot be used as an absolute measure of blade loading. A lift coefficient based on circulation is instead proposed that accounts for the blade curvature and is independent of the flow angles. Various existing profile and secondary loss correlations are examined for their suitability to preliminary design. A largely qualitative comparison demonstrates that the loss correlations based on Ainley and Mathieson (Ainley and Mathieson, 1957, "A Method of Performance Estimation for Axial-Flow Turbines," ARC Reports and Memoranda No. 2974; Dunham and Came, 1970, "Improvements to the Ainley-Mathieson Method of Turbine Performance Prediction," Trans. ASME: J. Eng. Gas Turbines Power, July, pp. 252-256; Kacker and Okapuu, 1982, "A Mean Line Performance Method for Axial Flow Turbine Efficiency," J. Eng. Power, 104, pp. 111-119) are not realistic, while the profile loss model of Coull and Hodson (2011, "Predicting the Profile Loss of High-Lift Low Pressure Turbines," J. Turbomach., 134(2), pp. 021002) and the secondary loss model of Traupel (1977, Thermische Turbomaschinen, Springer-Verlag, Berlin) are arguably the most reasonable. A quantitative comparison with multistage rig data indicates that, together, these methods over-predict lapse rates by around 30%, highlighting the need for improved loss models and a better understanding of the multistage environment. By examining the influence of blade lift across the Smith efficiency chart, the analysis demonstrates that designs with higher flow turning will tend to be less sensitive to increases in blade loading. © 2013 American Society of Mechanical Engineers.
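For reference, the two loading measures contrasted above can be written as follows; the Zweifel form is a common textbook statement, while the circulation-based coefficient is given only schematically, since the abstract does not reproduce Coull and Hodson's exact definition.

```latex
% One common form of the Zweifel lift coefficient for a turbine cascade,
% with pitch s, axial chord C_x, and inlet/exit flow angles \alpha_1,
% \alpha_2 measured from the axial direction:
\[
  Z_w = 2\,\frac{s}{C_x}\,\cos^{2}\alpha_2\,
        \left(\tan\alpha_1 + \tan\alpha_2\right)
\]
% Only the flow angles and pitch-to-chord ratio appear, so blade camber
% never enters. A circulation-based coefficient, written schematically as
\[
  C_\Gamma = \frac{\Gamma}{V_2\,S_0},
\]
% normalizes the blade circulation \Gamma by the exit velocity V_2 and a
% reference surface length S_0, so the curvature of the blade surface
% (through \Gamma) enters the loading measure directly.
```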