593 results for Using Music Teach Reading Fluency Kindergarten


Relevance:

30.00%

Publisher:

Abstract:

Lactoperoxidase (LP) was isolated from whey protein by cation exchange using Carboxymethyl resin (CM-25C) and Sulphopropyl Toyopearl resin (SP-650C). Both batch and column procedures were employed, and the adsorption capacities and extraction efficiencies were compared. The resin bed volume to whey volume ratios were 0.96:1.0 for CM-25C and ≤ 0.64:1.0 for SP-650C, indicating a higher adsorption capacity for SP-650C than for CM-25C. The effluent LP activity depended on both the enzyme activity in the whey and the amount of whey loaded on the column, within the saturation limits of the resin. The percentage recovery was high below the saturation point and fell off rapidly with over-saturation. While effective recovery was achieved with the column extraction procedures, recovery was poor in the batch procedures. The whey-resin contact time had little impact on enzyme adsorption. SDS-PAGE and HPLC analyses were also carried out to examine purity and to characterise the proteins in terms of molecular weight. Reversed-phase HPLC provided a clear distinction between the LP and lactoferrin (LF) peaks. The enzyme purity was higher in column effluents than in batch effluents, judged on the clarity of the gel bands and the resolution of the peaks in the HPLC chromatograms.

Relevance:

30.00%

Publisher:

Abstract:

We have compiled two comprehensive gene expression profiles from mature leaf and immature seed tissue of rice (Oryza sativa ssp. japonica cultivar Nipponbare) using Serial Analysis of Gene Expression (SAGE) technology. Analysis revealed a total of 50 519 SAGE tags, corresponding to 15 131 unique transcripts. Of these, the large majority (approximately 70%) occur only once in both libraries. Unexpectedly, the most abundant transcript (approximately 3% of the total) in the leaf library was derived from a type 3 metallothionein gene. The overall frequency profiles of the abundant tag species from the two tissues differ greatly and reveal seed tissue as exhibiting an atypical pattern of gene expression characterized by an overabundance of a small number of transcripts coding for storage proteins. A high proportion (approximately 80%) of the abundant tags (≥9 occurrences) matched entries in our reference rice EST database, with many fewer matches for low-abundance tags. Singleton transcripts common to both tissues were collated to generate a summary of low-abundance transcripts that are expressed constitutively in rice tissues. Finally, and most surprisingly, a significant number of tags were found to code for antisense transcripts, a finding that suggests a novel mechanism of gene regulation and may have implications for the use of antisense constructs in transgenic technology.
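
As a rough illustration of the kind of tag accounting described above, the sketch below tallies SAGE tag counts, reports the singleton fraction, and checks abundant tags (≥9 occurrences) against a reference set. It is not the authors' pipeline; the tag sequences, threshold default and reference set are placeholders.

```python
# Illustrative sketch of SAGE tag tallying; tags and reference set are toy data.
from collections import Counter

def summarise_sage_tags(tags, reference_tags, abundant_threshold=9):
    counts = Counter(tags)                      # occurrences of each unique tag
    unique = len(counts)
    singletons = sum(1 for c in counts.values() if c == 1)
    abundant = {t: c for t, c in counts.items() if c >= abundant_threshold}
    matched = sum(1 for t in abundant if t in reference_tags)
    return {
        "total_tags": sum(counts.values()),
        "unique_transcripts": unique,
        "singleton_fraction": singletons / unique,
        "abundant_tags": len(abundant),
        "abundant_matching_reference": matched / len(abundant) if abundant else 0.0,
    }

# Toy usage: one abundant tag, one mid-count tag, one singleton.
print(summarise_sage_tags(
    ["CATG" + s for s in ["AAAA"] * 10 + ["CCCC"] * 2 + ["GGGG"]],
    reference_tags={"CATGAAAA", "CATGGGGG"},
))
```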

Relevance:

30.00%

Publisher:

Abstract:

As the most commercially valuable cereal grown worldwide and the best-characterized in genetic terms, maize was predictably the first target for transformation among the important crops. Indeed, the first attempt at transformation of any plant was conducted on maize (1). These early efforts, however, were inevitably unsuccessful, since at that time there were no reliable methods to permit the introduction of DNA into a cell, the expression of that DNA, and the identification of progeny derived from such a "transgenic" cell (2). Almost 20 years later, these technologies were finally combined, and the first transgenic cereals were produced. In the last few years, methods have become increasingly efficient, and transgenic maize has now been produced from protoplasts as well as by Agrobacterium-mediated or "Biolistic" delivery to embryogenic tissue (for a general comparison of the methods used for maize, the reader is referred to a recent review; ref. 3). The present chapter will describe probably the simplest of the available procedures, namely the delivery of DNA to the recipient cells by vortexing them in the presence of silicon carbide (SiC) whiskers (this name will be used in preference to the term "fiber," since it more correctly describes the single-crystal nature of the material).

Relevance:

30.00%

Publisher:

Abstract:

We previously demonstrated that a dry, room-temperature-stable formulation of a live bacterial vaccine was highly susceptible to bile, and suggested that this would lead to a significant loss of viability for any live bacterial formulation released into the intestine using an enteric coating or capsule. We found that bile and acid tolerance are very rapidly recovered after rehydration with buffer or water, raising the possibility that rehydration in the absence of bile, prior to release into the intestine, might solve the problem of bile toxicity to dried cells. We describe here a novel formulation that combines extensively studied bile acid adsorbent resins with the dried bacteria, to temporarily adsorb bile acids and allow rehydration and recovery of bile resistance in the intestine before the bacteria are released. Tablets containing the bile acid adsorbent cholestyramine released 250-fold more live bacteria when dissolved in a bile solution than control tablets without cholestyramine or with a control resin that does not bind bile acids. We propose that a simple enteric-coated oral dosage form containing bile acid adsorbent resins will allow improved live bacterial delivery to the intestine via the oral route, a major step towards room-temperature-stable, easily administered and distributed vaccine pills and other bacterial therapeutics.

Relevance:

30.00%

Publisher:

Abstract:

We introduce a technique for assessing the diurnal development of convective storm systems based on outgoing longwave radiation fields. Using the size distribution of the storms measured from a series of images, we generate an array in the lengthscale-time domain based on the standard score statistic. This array succinctly demonstrates the size evolution of storms as well as their dissipation kinematics, and it also provides evidence of the temperature evolution of the cloud tops. We apply the approach to a test case comparing observations made by the Geostationary Earth Radiation Budget instrument with output from the Met Office Unified Model run at two resolutions. The 12 km resolution model produces peak convective activity on all lengthscales significantly earlier in the day than shown by the observations, and it shows no evidence of storms growing in size. The 4 km resolution model shows realistic timing and growth evolution, although its dissipation mechanism still differs from the observed data.
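
The following sketch illustrates one way such a standard-score array in the lengthscale-time domain might be built from per-image storm size distributions. It is a toy under stated assumptions, not the authors' code; the bin edges and random storm sizes are placeholders.

```python
# Sketch: z-score array over (time, lengthscale) from per-image storm size histograms.
import numpy as np

def standard_score_array(storm_sizes_per_image, bins):
    """storm_sizes_per_image: one 1-D array of storm lengthscales (km) per image
    in the diurnal sequence; bins: lengthscale bin edges."""
    counts = np.array([np.histogram(sizes, bins=bins)[0]
                       for sizes in storm_sizes_per_image], dtype=float)  # (time, scale)
    mean = counts.mean(axis=0)                  # mean count in each lengthscale bin
    std = counts.std(axis=0)
    std[std == 0] = 1.0                         # avoid division by zero in empty bins
    return (counts - mean) / std                # standard score per time/lengthscale cell

# Toy usage: random storm sizes for 8 three-hourly images, 50 km bins.
rng = np.random.default_rng(0)
sizes = [rng.gamma(shape=2.0, scale=100.0, size=30 + 5 * t) for t in range(8)]
print(standard_score_array(sizes, bins=np.arange(0, 600, 50)).shape)  # (8, 11)
```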

Relevance:

30.00%

Publisher:

Abstract:

A full assessment of para-virtualization is important because, without knowledge of the various overheads, users cannot judge whether using virtualization is a good idea or not. In this paper we are interested in assessing the overheads of running various benchmarks on bare metal as well as under para-virtualization. The idea is to see what the overheads of para-virtualization are, as well as to look at the overheads of turning on monitoring and logging. The knowledge gained from assessing various benchmarks on these different systems will help a range of users understand the use of virtualization systems. In this paper we assess the overheads of using Xen, VMware, KVM and Citrix (see Table 1). These virtualization systems are used extensively by cloud users. We use various Netlib benchmarks, which have been developed by the University of Tennessee at Knoxville (UTK) and Oak Ridge National Laboratory (ORNL). In order to assess these virtualization systems, we run the benchmarks on bare metal, then under para-virtualization, and finally with monitoring and logging turned on. The latter is important because users are interested in the Service Level Agreements (SLAs) offered by cloud providers, and logging is a means of assessing the services bought and used from commercial providers. We assess the virtualization systems on three different machines: the Thamesblue supercomputer, the Hactar cluster and an IBM JS20 blade server (see Table 2), all of which are servers available at the University of Reading. A functional virtualization system is multi-layered and is driven by its privileged components. Virtualization systems can host multiple guest operating systems, each of which runs in its own domain, and the system schedules virtual CPUs and memory within each virtual machine (VM) to make the best use of the available resources. The guest operating system schedules each application accordingly. Virtualization can be deployed as full virtualization or para-virtualization. Full virtualization provides a total abstraction of the underlying physical system and creates a new virtual system in which the guest operating systems can run. No modifications are needed in the guest OS or application, i.e. the guest OS or application is not aware of the virtualized environment and runs normally. Para-virtualization requires modification of the guest operating systems that run on the virtual machines, i.e. these guest operating systems are aware that they are running on a virtual machine, and it provides near-native performance. Both para-virtualization and full virtualization can be deployed across various virtualized systems. Para-virtualization is an OS-assisted virtualization, in which some modifications are made to the guest operating system to enable better performance. In this kind of virtualization the guest operating system is aware that it is running on virtualized hardware rather than on the bare hardware. In para-virtualization, the device drivers in the guest operating system coordinate with the device drivers of the host operating system, reducing the performance overheads. The use of para-virtualization [0] is intended to avoid the bottleneck associated with slow hardware interrupts that exists when full virtualization is employed.

It has been shown [0] that para-virtualization does not impose a significant performance overhead in high performance computing, and this in turn has implications for the use of cloud computing to host HPC applications. The "apparent" improvement in virtualization has led us to formulate the hypothesis that certain classes of HPC applications should be able to execute in a cloud environment with minimal performance degradation. In order to support this hypothesis, it is first necessary to define exactly what is meant by a "class" of application, and secondly it is necessary to observe application performance both within a virtual machine and when executing on bare hardware. A further potential complication is associated with the need for cloud service providers to support Service Level Agreements (SLAs), so that system utilisation can be audited.
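
As a simple sketch of the overhead comparison the paper describes, the snippet below computes the percentage slowdown of each configuration relative to bare metal. The timings and the benchmark name are invented placeholders, not measurements from the study.

```python
# Sketch: relative overhead of each configuration against bare metal (toy timings).
def overhead_pct(virtualised_seconds, bare_metal_seconds):
    return 100.0 * (virtualised_seconds - bare_metal_seconds) / bare_metal_seconds

runs = {
    "bare-metal":                    100.0,   # hypothetical Linpack wall time (s)
    "para-virtualised":              104.0,
    "para-virtualised + monitoring": 107.5,
}
baseline = runs["bare-metal"]
for config, seconds in runs.items():
    print(f"{config:32s} {overhead_pct(seconds, baseline):5.1f}% overhead")
```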

Relevance:

30.00%

Publisher:

Abstract:

Polyethylene glycol (PEG) may be added to forage-based diets rich in tannins for ruminant feeding because it binds to tannins and thus prevents the formation of potentially indigestible tannin-protein complexes. The objective of this work was to determine the in vitro biodegradation (mineralization, i.e. complete breakdown of PEG to CO2) rate of PEG. 14C-labelled PEG (14C-PEG) was added to three different tropical soils (a sandy clay loam soil, SaCL; a sandy clay soil, SaC; and a sandy loam soil, SaL) and was incubated in Bartha flasks. Free PEG and PEG bound to tannins from a tannin-rich local shrub were incubated under aerobic conditions for up to 70 days. The biodegradation assay monitored the 14CO2 evolved after degradation of the labelled PEG in the soils. After incubation, the amount of 14CO2 evolved from the applied 14C-PEG was low. Higher PEG mineralization values were found for the soils with higher organic matter contents (20.1 and 18.6 g organic matter/kg for SaCL and SaC, respectively) than for the SaL soil (11.9 g organic matter/kg) (P < 0.05). The extent of mineralization of PEG after 70 days of incubation in the soil was significantly lower (P < 0.05) when it was added bound to the browse tannin than when added in the free form (0.040 vs 0.079, respectively). (c) 2005 Elsevier B.V. All rights reserved.
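
A minimal sketch of the mineralisation measure implied by the abstract, expressed as the fraction of applied 14C recovered as 14CO2. The applied activity below is a hypothetical figure; the 0.079 and 0.040 fractions echo the free and tannin-bound results reported above.

```python
# Sketch: mineralised fraction = 14CO2 evolved / 14C applied (activities are invented).
def mineralised_fraction(co2_14c_evolved_bq, peg_14c_applied_bq):
    return co2_14c_evolved_bq / peg_14c_applied_bq

applied = 50_000.0                              # Bq of 14C-PEG added to the soil (assumed)
print(round(mineralised_fraction(0.079 * applied, applied), 3))  # free PEG        -> 0.079
print(round(mineralised_fraction(0.040 * applied, applied), 3))  # tannin-bound PEG -> 0.040
```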

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the method and findings of a contingent valuation (CV) study that aimed to elicit United Kingdom citizens' willingness to pay to support legislation to phase out the use of battery cages for egg production in the European Union (EU). The method takes account of various biases associated with the CV technique, including 'warm glow', 'part-whole' and sample response biases. Estimated mean willingness to pay to support the legislation is used to estimate the annual benefit of the legislation to UK citizens. This is compared with the estimated annual costs of the legislation over a 12-year period, which allows for readjustment by the UK egg industry. The analysis shows that the estimated benefits of the legislation outweigh the costs. The study demonstrates that CV is a potentially useful technique for assessing the likely benefits associated with proposed legislation. However, estimates from CV studies must be treated with caution: it is important that they are derived from carefully designed surveys and that the willingness-to-pay estimation method allows for various biases. (C) 2003 Elsevier Science B.V. All rights reserved.
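
The benefit-cost comparison described above can be sketched, under assumed figures, as the present value of aggregate willingness to pay versus the present value of industry adjustment costs over the 12-year period. Every number and the discount rate below are illustrative, not values from the study.

```python
# Sketch: present-value comparison of aggregate WTP and annual costs (toy figures).
def present_value(annual_amount, years, discount_rate):
    return sum(annual_amount / (1 + discount_rate) ** t for t in range(1, years + 1))

mean_wtp_per_household = 10.0             # GBP per year (assumed)
uk_households = 24_000_000                # assumed number of paying households
annual_cost_to_industry = 120_000_000.0   # GBP per year (assumed)

benefits = present_value(mean_wtp_per_household * uk_households, 12, 0.06)
costs = present_value(annual_cost_to_industry, 12, 0.06)
print(f"PV benefits: {benefits:,.0f}  PV costs: {costs:,.0f}  net: {benefits - costs:,.0f}")
```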

Relevance:

30.00%

Publisher:

Abstract:

Process-based integrated modelling of weather and crop yield over large areas is becoming an important research topic. The production of the DEMETER ensemble hindcasts of weather allows this work to be carried out in a probabilistic framework. In this study, ensembles of crop yield (groundnut, Arachis hypogaea L.) were produced for ten 2.5° × 2.5° grid cells in western India using the DEMETER ensembles and the General Large-Area Model (GLAM) for annual crops. Four key issues are addressed by this study. First, crop model calibration methods for use with weather ensemble data are assessed. Calibration using yield ensembles was more successful than calibration using reanalysis data (the European Centre for Medium-Range Weather Forecasts 40-yr reanalysis, ERA-40). Secondly, the potential for probabilistic forecasting of crop failure is examined. The hindcasts show skill in the prediction of crop failure, with more severe failures being more predictable. Thirdly, the use of yield ensemble means to predict interannual variability in crop yield is examined and their skill is assessed relative to baseline simulations using ERA-40. The accuracy of multi-model yield ensemble means is equal to or greater than the accuracy using ERA-40. Fourthly, the impact of two key uncertainties, sowing window and spatial scale, is briefly examined. The impact of uncertainty in the sowing window is greater with ERA-40 than with the multi-model yield ensemble mean. Subgrid heterogeneity affects model accuracy: where correlations are low on the grid scale, they may be significantly positive on the subgrid scale. The implications of these results for yield forecasting on seasonal time-scales are as follows. (i) There is potential for probabilistic forecasting of crop failure (defined by a threshold yield value); forecasting of yield terciles shows less potential. (ii) Any improvement in the skill of climate models has the potential to translate into improved deterministic yield prediction. (iii) Whilst model input uncertainties are important, uncertainty in the sowing window may not require specific modelling. The implications for yield forecasting on multidecadal (climate change) time-scales are as follows. (i) The skill in the ensemble mean suggests that the perturbation, within uncertainty bounds, of crop and climate parameters could potentially average out some of the errors associated with mean yield prediction. (ii) For a given technology trend, decadal fluctuations in the yield-gap parameter used by GLAM may be relatively small, implying some predictability on those time-scales.
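
A minimal sketch, not the GLAM or DEMETER machinery, of how an ensemble of simulated yields can be turned into a probabilistic crop-failure forecast: the forecast probability for each season is the fraction of ensemble members falling below a failure threshold. The member count, yields and threshold are assumptions.

```python
# Sketch: crop-failure probability as the fraction of ensemble members below a threshold.
import numpy as np

def failure_probability(yield_ensemble, threshold):
    """yield_ensemble: (members, years) array of simulated yields (e.g. kg/ha)."""
    return (np.asarray(yield_ensemble) < threshold).mean(axis=0)

# Toy usage: 9 ensemble members, 5 seasons, failure defined as yield below 500 kg/ha.
rng = np.random.default_rng(1)
ens = rng.normal(loc=[800, 450, 700, 900, 400], scale=150, size=(9, 5))
print(np.round(failure_probability(ens, threshold=500.0), 2))
```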

Relevance:

30.00%

Publisher:

Abstract:

Reanalysis data provide an excellent test bed for impacts prediction systems. because they represent an upper limit on the skill of climate models. Indian groundnut (Arachis hypogaea L.) yields have been simulated using the General Large-Area Model (GLAM) for annual crops and the European Centre for Medium-Range Weather Forecasts (ECMWF) 40-yr reanalysis (ERA-40). The ability of ERA-40 to represent the Indian summer monsoon has been examined. The ability of GLAM. when driven with daily ERA-40 data, to model both observed yields and observed relationships between subseasonal weather and yield has been assessed. Mean yields "were simulated well across much of India. Correlations between observed and modeled yields, where these are significant. are comparable to correlations between observed yields and ERA-40 rainfall. Uncertainties due to the input planting window, crop duration, and weather data have been examined. A reduction in the root-mean-square error of simulated yields was achieved by applying bias correction techniques to the precipitation. The stability of the relationship between weather and yield over time has been examined. Weather-yield correlations vary on decadal time scales. and this has direct implications for the accuracy of yield simulations. Analysis of the skewness of both detrended yields and precipitation suggest that nonclimatic factors are partly responsible for this nonstationarity. Evidence from other studies, including data on cereal and pulse yields, indicates that this result is not particular to groundnut yield. The detection and modeling of nonstationary weather-yield relationships emerges from this study as an important part of the process of understanding and predicting the impacts of climate variability and change on crop yields.
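
The sort of simple precipitation bias correction alluded to above might look like the following mean-scaling sketch, paired with a root-mean-square error helper for the yields. The rainfall and yield numbers are toy values, and the correction actually used in the study may differ.

```python
# Sketch: mean-scaling bias correction of reanalysis precipitation plus an RMSE helper.
import numpy as np

def mean_bias_correct(model_precip, obs_precip):
    """Multiplicative correction so the corrected series has the observed mean."""
    return np.asarray(model_precip) * (np.mean(obs_precip) / np.mean(model_precip))

def rmse(simulated, observed):
    return float(np.sqrt(np.mean((np.asarray(simulated) - np.asarray(observed)) ** 2)))

obs_rain = np.array([510., 620., 480., 700.])   # seasonal totals (mm), illustrative
era_rain = np.array([430., 540., 400., 610.])   # systematically dry reanalysis (toy)
print(mean_bias_correct(era_rain, obs_rain))    # rescaled precipitation
print(rmse([980, 1150, 900], [1000, 1200, 950]))  # example yield RMSE (kg/ha)
```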

Relevance:

30.00%

Publisher:

Abstract:

The impacts of climate change on crop productivity are often assessed using simulations from a numerical climate model as an input to a crop simulation model. The precision of these predictions reflects the uncertainty in both models. We examined how uncertainty in a climate model (HadAM3) and a crop model (the General Large-Area Model, GLAM, for annual crops) affects the mean and standard deviation of crop yield simulations in present-day and doubled carbon dioxide (CO2) climates, by perturbing parameters in each model. The climate sensitivity parameter (lambda, the equilibrium response of global mean surface temperature to doubled CO2) was used to define the control climate. Observed 1966-1989 mean yields of groundnut (Arachis hypogaea L.) in India were simulated well by the crop model using the control climate and climates with values of lambda near the control value. The simulations were used to measure the contribution of key crop and climate model parameters to uncertainty. The standard deviation of yield was more affected by perturbation of climate parameters than of crop model parameters in both the present-day and doubled CO2 climates. Climate uncertainty was higher in the doubled CO2 climate than in the present-day climate. Crop transpiration efficiency was key to crop model uncertainty in both present-day and doubled CO2 climates. The response of crop development to mean temperature contributed little uncertainty in the present-day simulations but was among the largest contributors under doubled CO2. The ensemble methods used here to quantify physical and biological uncertainty offer a way to improve model estimates of the impacts of climate change.
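
A toy sketch of the comparison reported above: the spread (standard deviation) of simulated yields across climate-parameter perturbations versus across crop-parameter perturbations. The yield values are invented for illustration only.

```python
# Sketch: compare yield spread under climate-parameter vs crop-parameter perturbations.
import numpy as np

yields_climate_perturbed = np.array([1020., 940., 1110., 870., 1190.])  # kg/ha (toy)
yields_crop_perturbed = np.array([1010., 1035., 995., 1050., 980.])     # kg/ha (toy)

print("std from climate-parameter perturbations:", yields_climate_perturbed.std(ddof=1))
print("std from crop-parameter perturbations:   ", yields_crop_perturbed.std(ddof=1))
```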

Relevance:

30.00%

Publisher:

Abstract:

Increasing rates of obesity and heart disease are compromising quality of life for a growing number of people. There is much research linking adult disease with growth and development both in utero and during the first year of life. The pig is an ideal model for studying the origins of developmental programming. The objective of this paper was to construct percentile growth curves for the pig for use in biomedical studies. The body weight (BW) of pigs was recorded from birth to 150 days of age, and their crown-to-rump length was measured over the neonatal period to enable the ponderal index (PI; kg/m3) to be calculated. Data were normalised and percentile curves were constructed using Cole's lambda-mu-sigma (LMS) method for BW and PI. The construction of these percentile charts for use in biomedical research will allow more detailed and precise tracking of the growth and development of individual pigs under experimental conditions.
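
Cole's LMS method named above defines each centile curve by three age-varying quantities (L, M and S); the back-transformation from a z-score to a chart value is sketched below. The L, M and S values used in the example are hypothetical, not the fitted values for the pigs.

```python
# Sketch: LMS back-transformation to a centile value, y = M*(1 + L*S*z)^(1/L).
import math
from statistics import NormalDist

def lms_centile(L, M, S, centile):
    """Chart value at the given centile from fitted LMS parameters at one age."""
    z = NormalDist().inv_cdf(centile / 100.0)   # z-score for the requested centile
    if abs(L) < 1e-9:                           # limiting case L -> 0
        return M * math.exp(S * z)
    return M * (1.0 + L * S * z) ** (1.0 / L)

# e.g. hypothetical body-weight LMS values at 28 days of age
for p in (3, 50, 97):
    print(p, round(lms_centile(L=0.6, M=8.5, S=0.18, centile=p), 2))
```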

Relevance:

30.00%

Publisher:

Abstract:

In Central Brazil, the long-term sustainability of beef cattle systems is under threat over vast tracts of farming areas, as more than half of the 50 million hectares of sown pastures are suffering from degradation. Overgrazing, practised to maintain high stocking rates, is regarded as one of the main causes. High stocking rates are deliberate and crucial decisions taken by the farmers, which appear paradoxical, even irrational, given the state of knowledge regarding the consequences of overgrazing. The phenomenon, however, appears inextricably linked with the objectives that farmers hold. In this research those objectives were elicited first and, from their ranking, two of them, 'asset value of cattle' (representing cattle ownership) and 'present value of economic returns', were chosen to develop an original bi-criteria Compromise Programming model to test various hypotheses postulated to explain the overgrazing behaviour. As part of the model, a pasture productivity index is derived to estimate the pasture recovery cost. Different scenarios based on farmers' attitudes towards overgrazing, pasture costs and capital availability were analysed. The results of the model runs show that the benefits of holding more cattle can outweigh the increased pasture recovery and maintenance costs. This result undermines the hypothesis that farmers practise overgrazing because they are unaware of, or indifferent to, overgrazing costs. An appropriate approach to the problem of pasture degradation requires information on the economics, and its interplay with farmers' objectives, for a wide range of pasture recovery and maintenance methods. Seen within the context of farmers' objectives, some level of overgrazing appears rational. Advocacy of the simple 'no overgrazing' rule is an insufficient strategy for maintaining the long-term sustainability of beef production systems in Central Brazil.
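
A hedged sketch of how a bi-criteria compromise programming evaluation such as the one described might rank stocking-rate alternatives by their weighted distance from the ideal point of the two objectives. The payoff values, weights and metric parameter are illustrative assumptions, not figures from the study.

```python
# Sketch: weighted Lp distance from the ideal point for each stocking-rate alternative.
def compromise_distance(values, ideal, anti_ideal, weights, p=2):
    """Distance of one alternative from the ideal point, with each criterion
    normalised by its ideal/anti-ideal range."""
    total = 0.0
    for v, best, worst, w in zip(values, ideal, anti_ideal, weights):
        total += (w * abs(best - v) / abs(best - worst)) ** p
    return total ** (1.0 / p)

# (asset value of cattle, present value of returns) for three stocking rates (toy payoffs)
alternatives = {"low": (300., 180.), "medium": (420., 210.), "high": (520., 150.)}
ideal, anti_ideal = (520., 210.), (300., 150.)
weights = (0.5, 0.5)
scores = {k: compromise_distance(v, ideal, anti_ideal, weights) for k, v in alternatives.items()}
print(min(scores, key=scores.get), scores)   # alternative closest to the ideal point
```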