911 results for "Evaluation of different sources of proteins"
Abstract:
In the present investigation, three important stressors, cadmium ions (Cd2+), salinity and temperature, were selected to study their effects on protein and purine catabolism in O. mossambicus. Cadmium (Cd) is a biologically non-essential trace metal and a common constituent of industrial effluents; it is toxic to fish even at low concentrations. Cadmium ions accumulate in an unregulated manner in sensitive organs such as the gills, liver and kidney of fish. Thus, the toxic effects of cadmium are related to changes in the normal physiological and biochemical processes of the organism. The mechanics of osmoregulation (i.e. total solute and water regulation) are reasonably well understood (Evans, 1984, 1993), and most researchers agree that salinities that differ from the internal osmotic concentration of the fish must impose energetic regulatory costs for active ion transport. There is limited information on protein and purine catabolism in euryhaline fish during salinity adaptation. Within a range of non-lethal temperatures, fishes are generally able to cope with the gradual temperature changes that are common in natural systems; rapid increases or decreases in ambient temperature, however, may result in sublethal physiological and behavioral responses. The catabolic pathways of proteins and purines are important biochemical processes. The results obtained signify that O. mossambicus exposed to different levels of cadmium ions, salinity and temperature shows great variation in the catabolism of proteins and purines. The organism attempts to attain homeostasis in the presence of stressors by increasing or decreasing the activity of certain enzymes. The present study revealed that protein and purine catabolism in O. mossambicus is sensitive to environmental stressors.
Abstract:
Consumers are becoming more concerned about food quality, especially regarding how, when and where foods are produced (Haglund et al., 1999; Kahl et al., 2004; Alföldi et al., 2006). In recent years there has therefore been growing interest in methods for food quality assessment, especially in picture-developing methods as a complement to the traditional chemical analysis of single compounds (Kahl et al., 2006). Biocrystallization, one of the picture-developing methods, is based on the crystallographic phenomenon that when aqueous solutions of CuCl2 dihydrate are crystallized with the addition of organic solutions, originating e.g. from crop samples, biocrystallograms with reproducible crystal patterns are generated (Kleber & Steinike-Hartung, 1959). Its output is a crystal pattern on glass plates from which different variables (numbers) can be calculated using image analysis. However, a standardized evaluation method to quantify the morphological features of the biocrystallogram image is lacking. The main aims of this research are therefore (1) to optimize an existing statistical model in order to describe all the effects that contribute to the experiment; (2) to investigate the effect of image parameters on the texture analysis of the biocrystallogram images, i.e. region of interest (ROI), color transformation and histogram matching, on samples from project 020E170/F financed by the Federal Ministry of Food, Agriculture and Consumer Protection (BMELV), the samples being wheat and carrots from controlled field and farm trials; and (3) to relate the strongest texture parameters to the visual evaluation criteria developed by a group of researchers (University of Kassel, Germany; Louis Bolk Institute (LBI), Netherlands; and Biodynamic Research Association Denmark (BRAD), Denmark), in order to clarify how the texture parameters and the visual characteristics of an image are related. The refined statistical model was implemented as a linear mixed-effects (lme) model with repeated measurements via crossed effects, programmed in R (version 2.1.0). The validity of the F and P values was checked against the SAS program: the ANOVA gives the same F values, but the P values are larger in R because of its more conservative approach, and the refined model yields more significant P values. The optimization of the image analysis deals with the following parameters: ROI (the region of interest, i.e. the area around the geometric center), color transformation (calculation of a one-dimensional gray-level value from the three-dimensional color information of the scanned picture, which is necessary for the texture analysis) and histogram matching (normalization of the histogram of the picture to enhance contrast and to minimize errors from lighting conditions). The samples were wheat from the DOC trial with 4 field replicates for the years 2003 and 2005 and “market samples” (organic and conventional neighbours of the same variety) for 2004 and 2005; carrots obtained from the University of Kassel (2 varieties, 2 nitrogen treatments) for the years 2004, 2005 and 2006; and “market samples” of carrot for the years 2004 and 2005. The criterion for the optimization was the repeatability of the differentiation of the samples over the different harvests (years). Different ROIs were found for different samples, reflecting the different pictures.
The color transformation that differentiates most efficiently is based on the gray scale, i.e. the equal color transformation. The second dimension of the color transformation, color wavelength (hue), only appeared in some years, for carrots treated with different nitrate fertilizer levels. The best histogram matching is to the Gaussian distribution. The approach was then to find a connection between the variables from textural image analysis and the different visual criteria. The relation between the texture parameters and the visual evaluation criteria was limited to the carrot samples in particular, as these could be well differentiated by the texture analysis. It was possible to connect groups of variables of the texture analysis with groups of criteria from the visual evaluation. These selected variables were able to differentiate the samples but not to classify them according to treatment. In contrast, with the visual criteria, which describe the picture as a whole, classification was possible in 80% of the sample cases. This clearly shows the limits of the single-variable approach of the image analysis (texture analysis).
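To make the image-preprocessing steps above concrete, the following Python sketch outlines the three operations described: ROI extraction around the geometric center, reduction of the RGB scan to a one-dimensional gray level, and matching of the gray-level histogram to a Gaussian reference. It is an illustrative assumption of how such a pipeline could look, not the project's actual code; all function names, radii and distribution parameters are hypothetical.

```python
# Illustrative preprocessing sketch for a scanned biocrystallogram (assumptions only).
import numpy as np
from scipy.stats import norm

def extract_roi(image: np.ndarray, radius: int) -> np.ndarray:
    """Crop a square region of interest centred on the geometric centre."""
    cy, cx = image.shape[0] // 2, image.shape[1] // 2
    return image[cy - radius:cy + radius, cx - radius:cx + radius]

def to_gray(rgb: np.ndarray) -> np.ndarray:
    """Equal colour transformation: simple mean of the three channels."""
    return rgb.astype(float).mean(axis=2)

def match_to_gaussian(gray: np.ndarray, mean: float = 128.0, std: float = 40.0) -> np.ndarray:
    """Histogram matching: map the empirical gray-level distribution onto a
    Gaussian reference to normalise contrast across scans."""
    flat = gray.ravel()
    ranks = flat.argsort().argsort() / (flat.size - 1)      # empirical CDF in [0, 1]
    matched = norm.ppf(np.clip(ranks, 1e-6, 1 - 1e-6), loc=mean, scale=std)
    return matched.reshape(gray.shape)

# Example: preprocess a placeholder image before texture analysis.
rgb_scan = np.random.randint(0, 256, (600, 600, 3), dtype=np.uint8)
roi = extract_roi(to_gray(rgb_scan), radius=200)
normalised = match_to_gaussian(roi)
```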
Abstract:
The Scheme86 and the HP Precision Architectures represent different trends in computer processor design. The former uses wide micro-instructions, parallel hardware, and a low-latency memory interface. The latter encourages pipelined implementation and visible interlocks. To compare the merits of these approaches, algorithms frequently encountered in numerical and symbolic computation were hand-coded for each architecture. Timings were done in simulators and the results were evaluated to determine the speed of each design. Based on these measurements, conclusions were drawn as to which aspects of each architecture are suitable for a high-performance computer.
Evaluation of strategic knowledge of students through authentic tasks of writing in the science classroom
Abstract:
The difficulties in evaluating strategic knowledge have been connected to the use of decontextualized tests that, in the end, do not involve the use of this knowledge. For this reason, interest has arisen in developing authentic writing tasks that offer advantages for this kind of evaluation. Throughout this research, authentic writing tasks were developed in a natural sciences class with the purpose of evaluating the students' strategic knowledge. Different instruments were used to collect data, e.g. interviews, questionnaires and a self-report, as well as three samples of writing by the students, with the objective of analyzing the changes that occurred between them and determining the decisions that students made in order to complete the assigned tasks successfully. An authentic writing task gives great opportunities to evaluate strategic knowledge. These tasks lead students to organize their knowledge of the topic at hand, adapt it to fit the objectives and the audience, and control and adjust their decisions on the task. This last stage became the perfect opportunity to record the knowledge and regulation of cognitive processes that the students brought into play, as well as to evaluate their understanding of writing and of the demands of the different discursive genres. As a result, the students showed different degrees of strategic knowledge in the task. The students who showed better strategic knowledge trust their structural abilities, know about discursive genres and perform well in basic linguistic abilities. The students who showed weak strategic knowledge distrust their writing skills, seem extremely worried about organizing the content of their texts, fail when checking their writings, and overlook or are unaware of the basic requirements of the discursive genre they are asked to exemplify. It appears that the students' previous knowledge of and experience with writing may affect the strategic knowledge they show when writing in this subject.
Abstract:
Abstract taken from the publication. With financial support from the MIDE department of the UNED.
Abstract:
The time-of-detection method for aural avian point counts is a new method of estimating abundance, allowing for uncertain probability of detection. The method has been specifically designed to allow for variation in singing rates of birds. It involves dividing the time interval of the point count into several subintervals and recording the detection history of the subintervals when each bird sings. The method can be viewed as generating data equivalent to closed capture–recapture information. The method is different from the distance and multiple-observer methods in that it is not required that all the birds sing during the point count. As this method is new and there is some concern as to how well individual birds can be followed, we carried out a field test of the method using simulated known populations of singing birds, using a laptop computer to send signals to audio stations distributed around a point. The system mimics actual aural avian point counts, but also allows us to know the size and spatial distribution of the populations we are sampling. Fifty 8-min point counts (broken into four 2-min intervals) using eight species of birds were simulated. The singing rate of an individual bird of a species was simulated following a Markovian process (singing bouts followed by periods of silence), which we felt was more realistic than a truly random process. The main emphasis of our paper is to compare results from species singing at (high and low) homogeneous rates per interval with those singing at (high and low) heterogeneous rates. Population size was estimated accurately for the species simulated with a high homogeneous probability of singing. Populations of simulated species with lower but homogeneous singing probabilities were somewhat underestimated. Populations of species simulated with heterogeneous singing probabilities were substantially underestimated. Underestimation was caused by both the very low detection probabilities of all distant individuals and by individuals with low singing rates also having very low detection probabilities.
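As an illustration of how this design turns a point count into capture-recapture-style data, the short Python sketch below simulates a Markovian singing process (bouts of song separated by silence) for a known population, records per-subinterval detection histories over four 2-min subintervals, and applies a naive closed-population correction. The rates, the aural detection probability and the estimator are hypothetical assumptions, not the simulation system or analysis used in the study.

```python
# Sketch: Markovian singing, per-subinterval detection histories, naive abundance estimate.
import numpy as np

rng = np.random.default_rng(42)

def detection_history(n_subintervals=4, steps_per_subinterval=12,
                      p_start_bout=0.10, p_end_bout=0.30, p_hear=0.8):
    """Return a 0/1 detection history over the subintervals for one bird."""
    singing = False
    history = np.zeros(n_subintervals, dtype=int)
    for k in range(n_subintervals):
        for _ in range(steps_per_subinterval):  # 10-s steps within each 2-min subinterval
            singing = (rng.random() > p_end_bout) if singing else (rng.random() < p_start_bout)
            if singing and rng.random() < p_hear:
                history[k] = 1
    return history

true_N = 30
histories = np.array([detection_history() for _ in range(true_N)])
detected = histories[histories.sum(axis=1) > 0]            # birds heard at least once

# Naive closed-population (model M0 style) correction: estimate the per-subinterval
# detection probability from detected birds, then adjust for birds never detected.
p_hat = detected.mean()
N_hat = len(detected) / (1 - (1 - p_hat) ** histories.shape[1])
print(f"detected {len(detected)} of {true_N}; estimated N ~ {N_hat:.1f}")
```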
Abstract:
Across Europe, elevated phosphorus (P) concentrations in lowland rivers have made them particularly susceptible to eutrophication. This is compounded in southern and central UK by increasing pressures on water resources, which may be further enhanced by the potential effects of climate change. The EU Water Framework Directive requires an integrated approach to water resources management at the catchment scale and highlights the need for modelling tools that can distinguish relative contributions from multiple nutrient sources and are consistent with the information content of the available data. Two such models are introduced and evaluated within a stochastic framework using daily flow and total phosphorus concentrations recorded in a clay catchment typical of many areas of the lowland UK. Both models disaggregate empirical annual load estimates, derived from land use data, as a function of surface/near surface runoff, generated using a simple conceptual rainfall-runoff model. Estimates of the daily load from agricultural land, together with those from baseflow and point sources, feed into an in-stream routing algorithm. The first model assumes constant concentrations in runoff via surface/near surface pathways and incorporates an additional P store in the river-bed sediments, depleted above a critical discharge, to explicitly simulate resuspension. The second model, which is simpler, simulates P concentrations as a function of surface/near surface runoff, thus emphasising the influence of non-point source loads during flow peaks and mixing of baseflow and point sources during low flows. The temporal consistency of parameter estimates and thus the suitability of each approach is assessed dynamically following a new approach based on Monte-Carlo analysis. (c) 2004 Elsevier B.V. All rights reserved.
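As a rough illustration of the second, simpler model structure described above, the Python sketch below computes a daily total phosphorus concentration by mixing a constant point-source contribution and a baseflow contribution, which dominate at low flows, with a non-point contribution whose load scales with surface/near-surface runoff. The functional form and all parameter values are assumptions for illustration only, not the calibrated model of the paper.

```python
# Illustrative mixing model for daily in-stream total phosphorus (assumed parameters).
import numpy as np

def tp_concentration(q_total, q_runoff,
                     point_load=5.0,    # kg P/day from effluent discharges (hypothetical)
                     c_baseflow=0.05,   # mg P/l in baseflow (hypothetical)
                     c_runoff=0.40):    # mg P/l in surface/near-surface runoff (hypothetical)
    """Daily mean TP concentration (mg/l) given total flow and runoff in m3/s."""
    q_base = np.maximum(q_total - q_runoff, 0.0)
    litres_per_day = 86400.0 * 1000.0
    load_point = point_load * 1e6                        # kg/day -> mg/day
    load_base = c_baseflow * q_base * litres_per_day     # mg/day
    load_runoff = c_runoff * q_runoff * litres_per_day   # mg/day
    return (load_point + load_base + load_runoff) / (q_total * litres_per_day)

# Concentrations are point-source dominated in dry weather and runoff dominated in storms.
print(tp_concentration(q_total=0.5, q_runoff=0.0))   # low flow, effluent dominated
print(tp_concentration(q_total=8.0, q_runoff=6.0))   # storm event, non-point dominated
```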
Abstract:
Using the Met Office large-eddy model (LEM) we simulate a mixed-phase altocumulus cloud that was observed from Chilbolton in southern England by a 94 GHz Doppler radar, a 905 nm lidar, a dual-wavelength microwave radiometer and also by four radiosondes. It is important to test and evaluate such simulations with observations, since there are significant differences between results from different cloud-resolving models for ice clouds. Simulating the Doppler radar and lidar data within the LEM allows us to compare observed and modelled quantities directly, and allows us to explore the relationships between observed and unobserved variables. For general-circulation models, which currently tend to give poor representations of mixed-phase clouds, the case shows the importance of using: (i) separate prognostic ice and liquid water, (ii) a vertical resolution that captures the thin layers of liquid water, and (iii) an accurate representation of the subgrid vertical velocities that allow liquid water to form. It is shown that large-scale ascents and descents are significant for this case, and so the horizontally averaged LEM profiles are relaxed towards observed profiles to account for these. The LEM simulation then gives a reasonable cloud, with an ice-water path (IWP) approximately two thirds of that observed, and with liquid water at the cloud top, as observed. However, the liquid-water cells that form in the updraughts at cloud top in the LEM have liquid-water paths (LWPs) up to half those observed, and there are too few cells, giving a mean LWP five to ten times smaller than observed. In reality, ice nucleation and fallout may deplete ice-nuclei concentrations at the cloud top, allowing more liquid water to form there, but this process is not represented in the model. Decreasing the heterogeneous nucleation rate in the LEM increased the LWP, which supports this hypothesis. The LEM captures the increase in the standard deviation in Doppler velocities (and so vertical winds) with height, but values are 1.5 to 4 times smaller than observed (although values are larger in an unforced model run, this only increases the modelled LWP by a factor of approximately two). The LEM data show that, for values larger than approximately 12 cm s(-1), the standard deviation in Doppler velocities provides an almost unbiased estimate of the standard deviation in vertical winds, but provides an overestimate for smaller values. Time-smoothing the observed Doppler velocities and modelled mass-squared-weighted fallspeeds shows that observed fallspeeds are approximately two-thirds of the modelled values. Decreasing the modelled fallspeeds to those observed increases the modelled ice water content (IWC), giving an IWP 1.6 times that observed.
Abstract:
Thirty-three snowpack models of varying complexity and purpose were evaluated across a wide range of hydrometeorological and forest canopy conditions at five Northern Hemisphere locations, for up to two winter snow seasons. Modeled estimates of snow water equivalent (SWE) or depth were compared to observations at forest and open sites at each location. Precipitation phase and duration of above-freezing air temperatures are shown to be major influences on divergence and convergence of modeled estimates of the subcanopy snowpack. When models are considered collectively at all locations, comparisons with observations show that it is harder to model SWE at forested sites than open sites. There is no universal “best” model for all sites or locations, but comparison of the consistency of individual model performances relative to one another at different sites shows that there is less consistency at forest sites than open sites, and even less consistency between forest and open sites in the same year. A good performance by a model at a forest site is therefore unlikely to mean a good performance by the same model at an open site (and vice versa). Calibration of models at forest sites provides lower errors than uncalibrated models at three out of four locations. However, benefits of calibration do not translate to subsequent years, and benefits gained by models calibrated for forest snow processes are not translated to open conditions.
Abstract:
A study was carried out to determine the influence of fibrolytic enzymes derived from mesophilic or thermophilic fungal sources, added at ensiling, on time-course fermentation characteristics and in vitro rumen degradation of maize silage. The mesophilic enzyme was a commercial product derived from Trichoderma reesei (L), whereas the thermophilic enzyme was a crude extract produced from Thermoascus aurantiacus (Ta) in this laboratory. The fungus was cultured using maize cobs as a carbon source. The resulting fermentation extract was deionised to remove sugars and characterised for its protein concentration, main and side enzymic activities, optimal pH, protein molecular mass and isoelectric point. In an additional study, both enzymes were added to maize forage (333.5 g DM/kg, 70.0, 469.8, 227.1 and 307.5 g/kg DM of CP, NDF, ADF and starch, respectively) at two levels each, normalized according to xylanase activity, and ensiled in 0.5 kg capacity laboratory minisilos. Duplicate silos were opened at 2, 4, 8, 15, and 60 days after ensiling, and analysed for chemical characteristics. Silages from 60 days were bulked and in vitro gas production (GP) and organic matter degradability (OMD) profiles evaluated using the Reading Pressure Technique (RPT), in a completely randomised design. The crude enzyme extract contained mainly xylanase and endoglucanase activities, with very low levels of exoglucanase, which probably limited hydrolysis of filter paper. The extract contained three major protein bands of between 29 and 55 kDa, with mainly acidic isoelectric points. Ensiling maize with enzymes lowered (P < 0.05) the final silage pH, with this effect being observed throughout the ensiling process. All enzyme treatments reduced (P < 0.05) ADF contents. Treatments including Ta produced more gas (P < 0.05) than the controls after 24 h incubation in vitro, whereas end point gas production at 96 h was not affected. Addition of Ta increased (P < 0.01) OMD after 12 h (410 and 416 g/kg versus 373 g/kg), whereas both L and Ta increased (P < 0.05) OMD after 24 h. Addition of enzymes from mesophilic or thermophilic sources to maize forage at ensiling increased the rate of acidification of the silages and improved in vitro degradation kinetics, suggesting an improvement in the nutritive quality. (C) 2003 Elsevier B.V. All rights reserved.
Abstract:
Feed samples received by commercial analytical laboratories are often undefined or mixed varieties of forages, originate from various agronomic or geographical areas of the world, are mixtures (e.g., total mixed rations) and are often described incompletely or not at all. Six unified single-equation approaches to predict the metabolizable energy (ME) value of feeds determined in sheep fed at maintenance ME intake were evaluated utilizing 78 individual feeds representing 17 different forages, grains, protein meals and by-product feedstuffs. The predictive approaches evaluated were two each from the National Research Council [National Research Council (NRC), Nutrient Requirements of Dairy Cattle, seventh revised ed. National Academy Press, Washington, DC, USA, 2001], the University of California at Davis (UC Davis) and ADAS (Stratford, UK). Slopes and intercepts for the two ADAS approaches that utilized in vitro digestibility of organic matter and either measured gross energy (GE) or a prediction of GE from component assays, and one UC Davis approach, based upon in vitro gas production and some component assays, differed from unity and zero, respectively, while this was not the case for the two NRC and one UC Davis approach. However, within these latter three approaches, the goodness of fit (r(2)) increased from the NRC approach utilizing lignin (0.61) to the NRC approach utilizing 48 h in vitro digestion of neutral detergent fibre (NDF; 0.72) and to the UC Davis approach utilizing a 30 h in vitro digestion of NDF (0.84). The reason for the difference between the precision of the NRC procedures was the failure of assayed lignin values to accurately predict 48 h in vitro digestion of NDF. However, differences among the six predictive approaches in the number of supporting assays and their costs, as well as the fact that the NRC approach is actually three related equations requiring categorical description of feeds (making them unsuitable for mixed feeds) while the ADAS and UC Davis approaches are single equations, suggest that the procedure of choice will vary depending upon local conditions, specific objectives and the feedstuffs to be evaluated. In contrast to the evaluation of the procedures among feedstuffs, no procedure was able to consistently discriminate the ME values of individual feeds within feedstuffs determined in vivo, suggesting that an accurate and precise ME predictive approach among and within feeds may remain to be identified. (C) 2004 Elsevier B.V. All rights reserved.
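A common way to evaluate a single predictive equation of this kind is to regress observed in vivo ME on predicted ME, test whether the slope and intercept differ from unity and zero, and report the goodness of fit (r2). The Python sketch below illustrates that evaluation procedure on fabricated data; the values are simulated for illustration only and do not reproduce the paper's feeds or results.

```python
# Sketch: evaluating a predictive ME equation against in vivo values (simulated data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
me_predicted = rng.uniform(8.0, 14.0, 78)                          # MJ/kg DM, 78 feeds
me_observed = 0.5 + 0.95 * me_predicted + rng.normal(0, 0.6, 78)   # hypothetical in vivo ME

res = stats.linregress(me_predicted, me_observed)
n = len(me_predicted)
t_slope = (res.slope - 1.0) / res.stderr                  # H0: slope = 1
t_intercept = res.intercept / res.intercept_stderr        # H0: intercept = 0
p_slope = 2 * stats.t.sf(abs(t_slope), df=n - 2)
p_intercept = 2 * stats.t.sf(abs(t_intercept), df=n - 2)

print(f"r2 = {res.rvalue**2:.2f}")
print(f"slope = {res.slope:.2f} (P vs 1: {p_slope:.3f}), "
      f"intercept = {res.intercept:.2f} (P vs 0: {p_intercept:.3f})")
```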
Abstract:
As an immunogen of the coronavirus, the nucleoprotein (N) is a potential antigen for the serological monitoring of infectious bronchitis virus (IBV). In this report, recombinant N protein from the Beaudette strain of IBV was produced and purified from Escherichia coli as well as Sf9 (insect) cells, and used for the coating of enzyme-linked immunosorbent assay (ELISA) plates. The N protein produced in Sf9 cells was phosphorylated whereas N protein from E. coli was not. Our data indicated that N protein purified from E. coli was more sensitive to anti-IBV serum than the protein from Sf9 cells. The recombinant N protein did not react with the antisera to other avian pathogens, implying that it was specific in the recognition of IBV antibodies. In addition, the data from the detection of field samples and IBV strains indicated that using the recombinant protein as coating antigen could achieve an equivalent performance to an ELISA kit based on infected material extracts as a source of antigen(s). ELISAs based on recombinant proteins are safe (no live virus), clean (only virus antigens are present), specific (single proteins can be used) and rapid (to respond to new viral strains and strains that cannot necessarily be easily cultured).
Abstract:
Rolling contact fatigue (RCF) defects are one of the main issues affecting, at least initially, the rail head; progressively they can become very significant, as they can propagate inside the material with the risk of damaging the rail. In this work, two different non-destructive techniques, infrared thermography (IRT) and fibre optics microscopy (FOM), were used in the inspection of rails for the tracing of defects and deterioration signs. With IRT, two different approaches (dynamic and pulsed thermography) were used, whilst in the case of FOM, microscopic characterisation of the rail heads and classification of the deterioration and damage on the rails according to the UIC (International Union of Railways) code took place. Results from both techniques are presented and discussed.
Abstract:
A mixture of organic acids and lactulose for preventing or reducing colonization of the gut by Salmonella Typhimurium was evaluated in pigs. A total of 63 4-week-old commercial piglets were randomly distributed into three different experimental dietary groups: a plain diet without additives (PD) and the same diet supplemented with either 0.4% (w/v) formic acid and 0.4% (w/v) lactic acid (AC) or 1% (w/v) lactulose (LC). After 7 days of adaptation, two-thirds of the pigs (14 from each diet) were challenged with a 2-mL oral dose of 10(8) CFU/mL of Salmonella Typhimurium, leaving the remaining animals unchallenged (UC). After 4 and 10 days post-challenge, pigs were euthanized and the ileum and caecum content were aseptically sampled to (a) quantify lactic, formic, and short-chain fatty acids (SCFA), (b) quantify bacterial populations and Salmonella by fluorescence in situ hybridization and (c) qualitatively analyse bacterial populations through denaturing gradient gel electrophoresis (DGGE). Modifications of fermentation products and counts of some of the bacterial groups analysed in the challenged pigs receiving the treatments AC and LC were minimal. Treatments only influenced the bacterial diversity after 10 days post-challenge, with AC generating a lower number of DGGE bands than UC (P < 0.05). Neither the inclusion of a mixture of 0.4% (w/v) formic and 0.4% (w/v) lactic acids nor of 1% (w/v) lactulose in the feed influenced numbers of Salmonella in the ileum and caecum of experimentally challenged pigs. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Aims: This study was carried out to evaluate in vitro the fermentation properties and the potential prebiotic activity of Agave fructans extracted from Agave tequilana (Predilife). Methods and Results: Five different commercial prebiotics were compared using 24-h pH-controlled anaerobic batch cultures inoculated with human faecal slurries. Measurement of prebiotic efficacy was obtained by comparing bacterial changes, and the production of short-chain fatty acids (SCFA) was also determined. Effects upon major groups of the microbiota were monitored over 24-h incubations by fluorescence in situ hybridization. SCFA were measured by HPLC. Fermentation of the Agave fructans (Predilife) resulted in a large increase in numbers of bifidobacteria and lactobacilli. Conclusions: Under the in vitro conditions used, this study has shown the differential impact of Predilife on the microbial ecology of the human gut. Significance and Impact of the Study: This is the first study reporting a potential prebiotic mode of activity for the Agave fructans investigated, which significantly increased populations of bifidobacteria and lactobacilli compared to cellulose used as a control.