Abstract:
Poor cold flow properties of vegetable oils are a major problem preventing the use of many abundantly available vegetable oils as base stocks for industrial lubricants. The major objective of this research is to improve the cold flow properties of vegetable oils by various techniques, such as additive addition and different chemical modification processes. The conventional procedure for determining pour point is the ASTM D97 method, which is time consuming and gives poor reproducibility of pour point temperatures between laboratories. Differential Scanning Calorimetry (DSC) is a fast, accurate and reproducible method to analyze the thermal activity during cooling/heating of oil. In this work coconut oil has been chosen as the representative vegetable oil for the analysis and improvement of cold flow properties, since it is abundantly available in the tropics and has a very high pour point of 24 °C. DSC is used for the analysis of unmodified and modified vegetable oil. The modified oils (with acceptable pour points) were then subjected to different tests for the evaluation of important lubricant properties such as viscometric, tribological (friction and wear), oxidative and corrosion properties. A commercial polymethacrylate-based pour point depressant (PPD) was added in different percentages and the pour point was determined in each case. Styrenated phenol (SP) was added in different concentrations to coconut oil and each solution was subjected to the ASTM D97 test and analysis by DSC. Refined coconut oil and other oils such as castor oil, sunflower oil and karanja oil were mixed in different proportions and an interesterification procedure was carried out.
Interesterification of coconut oil with other vegetable oils was not found to be effective in lowering the pour point of coconut oil, as the reduction attained was only 2 to 3 °C. Chemical modification by an acid-catalysed condensation reaction with a coconut oil/castor oil mixture resulted in a significant reduction of pour point (from 24 °C to -3 °C). When, instead of the triacylglycerols, their fatty acid derivatives were used for the synthesis (lauric acid, the major fatty acid of coconut oil, and oleic acid, the major fatty acid of mono- and poly-unsaturated vegetable oils like olive oil and sunflower oil), the pour point could be brought down to -42 °C. FTIR and NMR spectroscopy confirmed the ester structure of the product, which is fundamental to the biodegradability of vegetable oils. The tribological performance of the synthesised product with a suitable AW/EP additive was comparable to that of commercial SAE20W30 oil. The viscometric properties (viscosity and viscosity index) were also comparable to commercial lubricants, even without additives. The TGA experiment confirmed the better oxidative performance of the product compared to vegetable oils. The sample passed the corrosion test as per the ASTM D130 method.
Abstract:
The present study deals with the preparation and characterisation of rubber ferrite composites (RFC) containing barium ferrite (BaF) and strontium ferrite (SrF). The incorporation of the hard ferrites into natural and nitrile rubber was carried out according to a specific recipe for various loadings of magnetic fillers. For this, the ferrite materials, namely barium ferrite and strontium ferrite, having the general formula MO·6Fe2O3, were prepared by conventional ceramic techniques. After characterisation they were incorporated into the natural and nitrile rubber matrix by a mechanical method. Carbon black was also incorporated at different loadings into the rubber ferrite composites to study its effect on various properties. The cure characteristics and the mechanical, dielectric and magnetic properties of these composites were evaluated. The ac electrical conductivity of both the ceramic ferrites and the rubber ferrite composites was also calculated using a simple relation. The investigations revealed that rubber ferrite composites with the required dielectric and magnetic properties can be obtained by the incorporation of ferrite fillers into the rubber matrix, without compromising much on processability and mechanical properties.
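The abstract does not state which "simple relation" was used; a common choice derives ac conductivity from the measured relative permittivity and loss tangent, sigma_ac = 2*pi*f*eps0*eps_r*tan(delta). A minimal sketch of that calculation follows; the sample values are illustrative, not from the study:

```python
import math

EPS0 = 8.854e-12  # permittivity of free space, F/m

def sigma_ac(freq_hz, eps_r, tan_delta):
    """AC conductivity (S/m) from dielectric data: sigma = 2*pi*f*eps0*eps_r*tan(delta)."""
    return 2 * math.pi * freq_hz * EPS0 * eps_r * tan_delta

# Hypothetical values for a ferrite-filled composite at 1 MHz:
print(sigma_ac(1e6, 12.0, 0.05))  # ~3.34e-05 S/m
```

Plotting this quantity against frequency for each filler loading is the usual way such data are compared.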
Abstract:
Non-destructive testing (NDT) is the use of non-invasive techniques to determine the integrity of a material, component, or structure. Engineers and scientists use NDT in a variety of applications, including medical imaging, materials analysis, and process control. The photothermal beam deflection technique is one of the most promising NDT technologies, and tremendous R&D effort has been made to improve its efficiency and simplicity. It is a popular technique because it can probe surfaces irrespective of the size of the sample and its surroundings. Because of its non-destructive, non-contact evaluation strategy, the technique has been used to characterize several semiconductor materials, and its application extends to the analysis of a wide variety of materials. Instrumentation of an NDT technique is crucial for any material analysis. Chapter two explores the various excitation sources, source modulation techniques, and detection and signal processing schemes currently practised. The features of the experimental arrangement, including the steps for alignment, automation, data acquisition and data analysis, are explained in detail. Theoretical studies form the backbone of photothermal techniques, and the outcome of a theoretical work is the foundation of an application. The reliability of the theoretical model developed and used here is proven from studies on crystalline samples. The technique is applied to the analysis of transport properties such as thermal diffusivity, mobility, surface recombination velocity and minority carrier lifetime, to thermal imaging of solar cell absorber layer materials like CuInS2, CuInSe2 and SnS thin films, and to the analysis of In2S3 thin films, which are used as a buffer layer material in solar cells.
The various influences of film composition and of chlorine and silver incorporation in this material are brought out from the measurement of transport properties and the analysis of sub-band-gap levels. The application of the photothermal deflection technique to the characterization of solar cells is a relatively new area that requires considerable attention. Chapter six thus elucidates the theoretical aspects of applying photothermal techniques to solar cell analysis. The experimental design and method for determination of solar cell efficiency, optimum load resistance and series resistance, with results from the analysis of a CuInS2/In2S3 based solar cell, form the skeleton of this chapter.
Abstract:
The acoustic signals generated in solids by interaction with a pulsed laser beam are used to determine the ablation threshold of bulk polymer samples of teflon (polytetrafluoroethylene) and nylon under irradiation from a Q-switched Nd:YAG laser at 1.06 µm wavelength. A suitably designed piezoelectric transducer is employed for the detection of the photoacoustic (PA) signals generated in this process. It has been observed that an abrupt increase in the amplitude of the PA signal occurs at the ablation threshold. Distinct threshold values also exist corresponding to the different damage mechanisms operative at different laser energy densities, such as surface morphology changes, bond breaking and melting.
Abstract:
The present study is an investigation of relevant chemical aspects of three distinct aquatic environments: mangroves, a river and an estuary. The sampling locations include a thick mangrove forest with high tidal activity, a mangrove nursery with minimal disturbances and low tidal inundation, a highly polluted riverine system and an estuarine site as reference. Nutrients and bio-organic compounds in the water column and surface sediment were estimated in an attempt to understand the regeneration properties of these different aquatic systems. An assessment of trace metal pollution was also carried out.
Abstract:
Identification and control of nonlinear dynamical systems are challenging problems for control engineers. The topic is equally relevant in communication, weather prediction, biomedical systems and even social systems, where nonlinearity is an integral part of the system behavior. Most real-world systems are nonlinear in nature, and nonlinear system identification/modeling has wide applications. The basic approach to analyzing a nonlinear system is to build a model from known behavior manifest in the form of the system output. The problem of modeling boils down to computing a suitably parameterized model representing the process. The parameters of the model are adjusted to optimize a performance function based on the error between the given process output and the identified process/model output. While linear system identification is well established with many classical approaches, most of those methods cannot be directly applied to nonlinear system identification. The problem becomes more complex if the system is completely unknown and only the output time series is available; the blind recognition problem is the direct consequence of such a situation, and the thesis concentrates on such problems. The capability of artificial neural networks to approximate many nonlinear input-output maps makes them predominantly suitable for building a function for the identification of nonlinear systems where only the time series is available. The literature is rich with a variety of algorithms to train the neural network model, but a comprehensive study of the computation of the model parameters using the different algorithms, together with a comparison among them to choose the best technique, is still a demand of practical system designers that is not available in concise form in the literature.
The thesis is thus an attempt to develop and evaluate some of the well-known algorithms and to propose some new techniques in the context of blind recognition of nonlinear systems. It also attempts to establish the relative merits and demerits of the different approaches; comprehensiveness is achieved by utilizing the benefits of well-known evaluation techniques from statistics. The study concludes by providing the results of implementation of the currently available, modified and newly introduced techniques for nonlinear blind system modeling, followed by a comparison of their performance. It is expected that such a comprehensive study and comparison process can be of great relevance in many fields, including chemical, electrical, biological, financial and weather data analysis. Further, the results reported would be of immense help to practical system designers and analysts in selecting the most appropriate method, based on the goodness of the model, for a particular context.
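As a minimal illustration of the class of model discussed above, the sketch below fits a small feed-forward neural network to the output time series of an "unknown" nonlinear system, using only past outputs as inputs. The logistic map and plain batch gradient descent are illustrative choices of mine; none of the thesis's specific training algorithms are reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unknown nonlinear system: only its output time series is observed.
x = np.empty(500)
x[0] = 0.3
for t in range(1, 500):
    x[t] = 3.8 * x[t - 1] * (1 - x[t - 1])  # logistic map (hidden from the model)

# Build (input, target) pairs: predict x[t] from the two previous samples.
X = np.column_stack([x[1:-1], x[:-2]])
y = x[2:]
N = len(y)

# One-hidden-layer MLP trained by batch gradient descent on MSE.
H = 8
W1 = rng.normal(0, 0.5, (2, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, 1)); b2 = np.zeros(1)
lr = 0.1
for epoch in range(3000):
    h = np.tanh(X @ W1 + b1)                 # hidden activations
    pred = (h @ W2 + b2).ravel()             # model output
    err = pred - y
    gh = (err[:, None] @ W2.T) * (1 - h**2)  # backprop through tanh
    W2 -= lr * h.T @ err[:, None] / N
    b2 -= lr * err.mean()
    W1 -= lr * X.T @ gh / N
    b1 -= lr * gh.mean(axis=0)

h = np.tanh(X @ W1 + b1)
pred = (h @ W2 + b2).ravel()
mse = float(np.mean((pred - y) ** 2))  # identified model vs. true output
```

A trained model whose MSE is well below the variance of the series has captured at least part of the hidden nonlinearity, which is the criterion a blind-identification study would then refine with statistical goodness-of-fit measures.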
Abstract:
With a seacoast of 8,118 km, an exclusive economic zone (EEZ) of 2 million square km, and about 30,000 square km under aquaculture, India produces close to six million tonnes of fish, over 4 per cent of world fish production. While the marine waters up to 50 m depth have been fully exploited, those beyond remain unexplored. There is an ever increasing demand for fishery resources as food. The coastal fishery resources of the country are dwindling at a rapid pace, and it becomes highly imperative that we search for alternate fishery resources for food. The option we have is to hunt for deep sea fishery resources. Studies of proximate composition and of amino acid and fatty acid composition are essential to understand the nutraceutical value of these deep sea fishery resources. The present study aimed to carry out proximate composition analysis of deep sea fishery resources obtained during cruises onboard the FORV Sagar Sampada, to identify fishery resources with appreciable lipid content and thereby analyse the bioactive potential of marine lipids, to study the amino acid profile of these fishery resources, to determine the SFA, MUFA and PUFA contents and to calculate the n3/n6 fatty acid ratios. Though the presence of nutraceuticals was identified in the marine fishery resources, their use as potential food resources deserves further investigation. The study was therefore also carried out to calculate the hepatosomatic indices of sharks and chimaeras and to conduct biochemical characterisation of the liver oils of Apristurus indicus, Centrophorus sculpratus, Centroselachus crepidater, Neoharriotta raleighana, and Harriotta pinnata obtained during cruises onboard the FORV Sagar Sampada. The therapeutic use of shark liver oil is evident from its use for centuries as a remedy to heal wounds and fight flu (Neil et al. 2006). Japanese seamen called it 'samedava' or "cure all".
Shark liver oil is being promoted worldwide as a dietary supplement to boost the immune system, fight infections, treat cancer and lessen the side effects of conventional cancer treatment. These days more emphasis is laid on the nutritive benefits of shark liver oils, especially the omega-3 polyunsaturated fatty acids (PUFAs) (Anandan et al. 2007) and alkylglycerols (AKGs) (Pugliese et al. 1998) they contain, owing to the sharp rise in inflammatory disorders such as arthritis and asthma and neurodegenerative diseases like Alzheimer's, Parkinson's and schizophrenia. The present study therefore also evaluates the pharmacological properties (analgesic, anti-inflammatory, antipyretic and anti-ulcer effects) of four different liver oils of sharks belonging to the Indian EEZ and seeks to identify the components of the oils responsible for these activities. The analgesic and anti-inflammatory activities of liver oils from Neoharriotta raleighana (NR), Centroselachus crepidater (CC), Apristurus indicus (AI), and Centrophorus sculpratus (CS) sharks caught from the Arabian Sea and the Indian Ocean were compared. The main objectives also include determination of the cholesterol-lowering effects of the liver oils of Neoharriotta raleighana (NR) and Centrophorus sculpratus (CS) on high-fat-diet-induced dyslipidemia, and comparison of the impact of four isolipidemic diets on the levels of serum diagnostic marker enzymes, on the lipid profile of blood and liver, and on the antioxidant status of the heart in male albino rats; and further, to study the efficacy of Centrophorus sculpratus (CS) liver oil against Complete Freund's Adjuvant-induced arthritis and to compare the anti-inflammatory activity of this oil with a traditionally used anti-inflammatory substance, gingerol (an oleoresin extracted from ginger). The results of the present study indicated that both Centrophorus sculpratus liver oil and gingerol extracts proved to be effective natural remedies against CFA-induced arthritis in albino rats.
Abstract:
Mangroves are considered to play a significant role in global carbon cycling. Mangrove forests fix CO2 by photosynthesis into mangrove lumber and thus decrease the possibility of a catastrophic series of events: global warming by atmospheric CO2, melting of the polar ice caps, and inundation of the great coastal cities of the world. Leaf litter and roots are the main contributors to mangrove sediments, though algal production and allochthonous detritus can also be trapped (Kristensen et al., 2008). Mangroves, due to their high organic matter content and reducing nature, are excellent metal retainers. Environmental pollution due to metals is of major concern. This is due to the basic fact that metals are not biodegradable or perishable the way most organic pollutants are. While most organic toxicants can be destroyed by combustion and converted into compounds such as CO, CO2, SOx and NOx, metals cannot be destroyed; at most, the valence and physical form of the metals may change. The concentration of metals present naturally in air, water and soil is very low. Metals released into the environment through anthropogenic activities such as burning of fossil fuels, discharge of industrial effluents, mining and dumping of sewage lead to the development of higher-than-tolerable, toxic levels of metals in the environment, i.e., metal pollution. Of course, a number of heavy metals such as Fe, Mn, Cu, Ni, Zn, Co, Cr, Mo, and V are essential to plants and animals, and deficiency of these metals may lead to disease, but higher levels lead to metal toxicity. Almost all industrial processes and urban activities involve the release of at least trace quantities of half a dozen metals in different forms. Heavy metal pollution in the environment can remain dormant for a long time and then surface with a vengeance. Once an area becomes toxified with metals, it is almost impossible to detoxify it.
The symptoms of metal toxicity are often quite similar to the symptoms of other common diseases such as respiratory problems, digestive disorders, skin diseases, hypertension, diabetes and jaundice, making it all the more difficult to diagnose metal poisoning. For example, the Minamata disease caused by mercury pollution, in addition to affecting the nervous system, can disturb liver function and cause diabetes and hypertension. The damage caused by heavy metals does not end with the affected person: the harmful effects can be transferred to the person's progeny. Ironically, heavy metal pollution is a direct offshoot of our increasing ability to mass-produce metals and use them in all spheres of existence. Along with conventional physico-chemical methods, biosystem approaches are also being used continually to combat metal pollution.
Abstract:
The photothermal deflection technique (PTD) is a non-destructive tool for measuring the temperature distribution in and around a sample due to the various non-radiative decay processes occurring within the material. This tool was used to measure the carrier transport properties of CuInS2 and CuInSe2 thin films. Films with thickness <1 μm were prepared with different Cu/In ratios to vary the electrical properties. The surface recombination velocity was lowest for Cu-rich films (5×10^5 cm/s for CuInS2, 1×10^3 cm/s for CuInSe2), while stoichiometric films exhibited high mobility (0.6 cm^2/V s for CuInS2, 32 cm^2/V s for CuInSe2) and high minority carrier lifetime (0.35 μs for CuInS2, 12 μs for CuInSe2).
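From the mobility and lifetime figures quoted above, a minority-carrier diffusion length can be estimated via the Einstein relation D = mu*kT/q and L = sqrt(D*tau). This derived figure is a back-of-envelope estimate from the quoted numbers, not a result reported in the abstract:

```python
import math

KT_Q = 0.02585  # thermal voltage kT/q at ~300 K, in volts

def diffusion_length_um(mobility_cm2_per_Vs, lifetime_s):
    """Minority-carrier diffusion length in micrometres: L = sqrt(D*tau), D = mu*kT/q."""
    D = mobility_cm2_per_Vs * KT_Q          # diffusion coefficient, cm^2/s
    return math.sqrt(D * lifetime_s) * 1e4  # cm -> um

# Stoichiometric-film values quoted in the abstract:
L_CuInS2 = diffusion_length_um(0.6, 0.35e-6)   # ~0.7 um
L_CuInSe2 = diffusion_length_um(32.0, 12e-6)   # ~31.5 um
```

The much longer diffusion length for CuInSe2 is consistent with its use as a high-efficiency absorber material.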
Abstract:
Consumers are becoming more concerned about food quality, especially regarding how, when and where foods are produced (Haglund et al., 1999; Kahl et al., 2004; Alföldi et al., 2006). Therefore, during recent years there has been growing interest in methods for food quality assessment, especially in picture-development methods as a complement to traditional chemical analysis of single compounds (Kahl et al., 2006). Biocrystallization, one of the picture-development methods, is based on the crystallographic phenomenon that when aqueous solutions of CuCl2 dihydrate are crystallized with the addition of organic solutions originating, e.g., from crop samples, biocrystallograms with reproducible crystal patterns are generated (Kleber & Steinike-Hartung, 1959). Its output is a crystal pattern on glass plates from which different variables (numbers) can be calculated using image analysis. However, there is a lack of a standardized evaluation method to quantify the morphological features of the biocrystallogram image. Therefore, the main aims of this research are (1) to optimize an existing statistical model in order to describe all the effects that contribute to the experiment; (2) to investigate the effect of image parameters, i.e., region of interest (ROI), color transformation and histogram matching, on the texture analysis of biocrystallogram images, using samples from the project 020E170/F financed by the Federal Ministry of Food, Agriculture and Consumer Protection (BMELV); the samples are wheat and carrots from controlled field and farm trials; and (3) to relate the strongest texture parameter effects to the visual evaluation criteria developed by a group of researchers (University of Kassel, Germany; Louis Bolk Institute (LBI), Netherlands; and Biodynamic Research Association Denmark (BRAD), Denmark), in order to clarify the relation between texture parameters and the visual characteristics of an image.
The refined statistical model was implemented as a linear mixed-effects (lme) model with repeated measurements via crossed effects, programmed in R (version 2.1.0). The validity of the F and P values was checked against the SAS program: while the ANOVA gives the same F values, the P values are bigger in R because of its more conservative approach, and the refined model yields more significant P values. The optimization of the image analysis deals with the following parameters: ROI (region of interest, the area around the geometrical center), color transformation (calculation of a one-dimensional gray level value from the three-dimensional color information of the scanned picture, which is necessary for the texture analysis), and histogram matching (normalization of the histogram of the picture to enhance contrast and minimize errors from lighting conditions). The samples were wheat from the DOC trial with 4 field replicates for the years 2003 and 2005 and "market samples" (organic and conventional neighbors with the same variety) for 2004 and 2005, and carrots obtained from the University of Kassel (2 varieties, 2 nitrogen treatments) for the years 2004, 2005 and 2006, plus "market samples" of carrot for the years 2004 and 2005. The criterion for the optimization was the repeatability of the differentiation of the samples over the different harvests (years). Different ROIs were found for different samples, reflecting the different pictures. The color transformation that differentiates most efficiently relies on the gray scale, i.e., an equal color transformation; a second dimension of the color transformation appeared only in some years, as an effect of color wavelength (hue) for carrots treated with different nitrate fertilizer levels. The best histogram matching is to the Gaussian distribution. The approach was then to find a connection between the variables from textural image analysis and the different visual criteria.
The relation between the texture parameters and the visual evaluation criteria was examined for the carrot samples in particular, as these could be well differentiated by the texture analysis. It was possible to connect groups of variables of the texture analysis with groups of criteria from the visual evaluation. These selected variables were able to differentiate the samples but not to classify them according to treatment. In contrast, with the visual criteria, which describe the picture as a whole, classification was possible in 80% of the sample cases. This clearly shows the limits of the single-variable approach of the image analysis (texture analysis).
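The two image-preprocessing steps named in the abstract, an equal color transformation and histogram matching to a Gaussian, can be sketched generically as follows. These are textbook versions under my own assumptions, not the project's actual pipeline, and all parameter values are illustrative:

```python
import numpy as np
from statistics import NormalDist

def to_gray(rgb):
    # "Equal" color transformation: collapse the 3-D color information to one
    # gray level per pixel by averaging the channels with equal weights.
    return rgb.mean(axis=-1)

def match_to_gaussian(img, mean=128.0, std=30.0):
    # Histogram matching: remap pixel ranks onto a Gaussian target distribution
    # to enhance contrast and normalize differently lit scans.
    flat = img.ravel()
    ranks = np.argsort(np.argsort(flat))        # 0..n-1 rank of each pixel
    quantiles = (ranks + 0.5) / flat.size       # mid-rank quantiles in (0, 1)
    target = NormalDist(mean, std)
    matched = np.array([target.inv_cdf(q) for q in quantiles])
    return matched.reshape(img.shape)

# Illustrative use on a random stand-in for a scanned biocrystallogram:
rng = np.random.default_rng(0)
scan = rng.uniform(0, 255, size=(32, 32, 3))
gray = to_gray(scan)
norm = match_to_gaussian(gray)
```

Texture variables (e.g. gray-level co-occurrence statistics) would then be computed inside the chosen ROI of the normalized image.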
Abstract:
The Scheme86 and the HP Precision Architectures represent different trends in computer processor design. The former uses wide micro-instructions, parallel hardware, and a low latency memory interface. The latter encourages pipelined implementation and visible interlocks. To compare the merits of these approaches, algorithms frequently encountered in numerical and symbolic computation were hand-coded for each architecture. Timings were done in simulators and the results were evaluated to determine the speed of each design. Based on these measurements, conclusions were drawn as to which aspects of each architecture are suitable for a high-performance computer.
Evaluation of strategic knowledge of students through authentic writing tasks in the science classroom
Abstract:
The difficulties in evaluating strategic knowledge have been connected to the use of decontextualized tests that, in the end, do not involve the use of this knowledge. For this reason, interest has arisen in developing authentic writing tasks that offer advantages for this kind of evaluation. Throughout this research, authentic writing tasks were developed in a natural sciences class with the purpose of evaluating the students' strategic knowledge. Different instruments were used to collect data, e.g. interviews, questionnaires and a self-report, as well as three samples of the students' writing, with the objective of analyzing the changes that occurred between one and the others and determining the decisions that students made in order to complete the assigned tasks successfully. An authentic writing task gives great opportunities to evaluate strategic knowledge. These tasks lead students to marshal their knowledge about the topic at hand and to organize and adapt it to fit the objectives and the audience, and they also allow students to control and adjust their decisions on the task. This last stage became the perfect opportunity to take notes on the knowledge and regulation of cognitive processes that the students brought into play, as well as to evaluate their understanding of writing and the demands of the different discursive genres. As a result, the students showed different degrees of strategic knowledge in the task. The students who showed better strategic knowledge trust their structural abilities, know about discursive genres and perform well in basic linguistic abilities. The students who showed weak strategic knowledge distrust their writing skills, seem extremely worried about organizing the content of their texts, fail when checking their writings, and overlook or are unaware of the basic requirements of the discursive genre they are asked to exemplify.
It appears that the previous knowledge and writing experiences the students have been exposed to may affect the strategic knowledge they show when writing in this subject.
Abstract:
Abstract taken from the publication. With financial support from the MIDE department of UNED.
Abstract:
The time-of-detection method for aural avian point counts is a new method of estimating abundance, allowing for uncertain probability of detection. The method has been specifically designed to allow for variation in the singing rates of birds. It involves dividing the time interval of the point count into several subintervals and recording the detection history over the subintervals when each bird sings. The method can be viewed as generating data equivalent to closed capture–recapture information. The method differs from the distance and multiple-observer methods in that it does not require that all the birds sing during the point count. As this method is new and there is some concern as to how well individual birds can be followed, we carried out a field test of the method using simulated known populations of singing birds, using a laptop computer to send signals to audio stations distributed around a point. The system mimics actual aural avian point counts, but also allows us to know the size and spatial distribution of the populations we are sampling. Fifty 8-min point counts (broken into four 2-min intervals) using eight species of birds were simulated. The singing rate of an individual bird of a species was simulated following a Markovian process (singing bouts followed by periods of silence), which we felt was more realistic than a truly random process. The main emphasis of our paper is to compare results from species singing at (high and low) homogeneous rates per interval with those singing at (high and low) heterogeneous rates. Population size was estimated accurately for the species simulated with a high homogeneous probability of singing. Populations of simulated species with lower but homogeneous singing probabilities were somewhat underestimated. Populations of species simulated with heterogeneous singing probabilities were substantially underestimated.
Underestimation was caused both by the very low detection probabilities of all distant individuals and by the very low detection probabilities of individuals with low singing rates.
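The Markovian singing process described above (singing bouts followed by periods of silence) can be sketched as a two-state chain that emits a capture-history-like record per bird. The transition probabilities below are illustrative placeholders, not the values used in the field test:

```python
import random

def detection_history(n_intervals=4, p_start=0.3, p_stay=0.7, seed=None):
    # Two-state Markov process: the bird alternates between singing bouts and
    # silence. It can be "detected" in a subinterval only while singing, so the
    # returned 0/1 history mimics closed capture-recapture data.
    rng = random.Random(seed)
    singing = rng.random() < p_start          # initial state
    history = []
    for _ in range(n_intervals):
        history.append(1 if singing else 0)
        if singing:
            singing = rng.random() < p_stay   # remain in the singing bout
        else:
            singing = rng.random() < p_start  # start a new bout
    return history

# One simulated bird over an 8-min count split into four 2-min intervals:
print(detection_history(seed=1))
```

Simulating many such birds with species-specific (homogeneous or heterogeneous) probabilities reproduces the kind of detection-history data the estimators in the paper consume.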
Abstract:
Across Europe, elevated phosphorus (P) concentrations in lowland rivers have made them particularly susceptible to eutrophication. This is compounded in southern and central UK by increasing pressures on water resources, which may be further enhanced by the potential effects of climate change. The EU Water Framework Directive requires an integrated approach to water resources management at the catchment scale and highlights the need for modelling tools that can distinguish relative contributions from multiple nutrient sources and are consistent with the information content of the available data. Two such models are introduced and evaluated within a stochastic framework using daily flow and total phosphorus concentrations recorded in a clay catchment typical of many areas of the lowland UK. Both models disaggregate empirical annual load estimates, derived from land use data, as a function of surface/near surface runoff, generated using a simple conceptual rainfall-runoff model. Estimates of the daily load from agricultural land, together with those from baseflow and point sources, feed into an in-stream routing algorithm. The first model assumes constant concentrations in runoff via surface/near surface pathways and incorporates an additional P store in the river-bed sediments, depleted above a critical discharge, to explicitly simulate resuspension. The second model, which is simpler, simulates P concentrations as a function of surface/near surface runoff, thus emphasising the influence of non-point source loads during flow peaks and mixing of baseflow and point sources during low flows. The temporal consistency of parameter estimates and thus the suitability of each approach is assessed dynamically following a new approach based on Monte-Carlo analysis. (c) 2004 Elsevier B.V. All rights reserved.
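The simpler of the two models amounts to a flow-weighted mixing of runoff, baseflow and point-source contributions. The sketch below is a schematic of that idea under my own assumed units and made-up values, not the paper's actual formulation:

```python
def stream_tp_mg_per_l(q_runoff, q_base, c_runoff, c_base, point_load):
    # In-stream total phosphorus as a flow-weighted mix:
    #   q_* in L/s, c_* in mg/L, point_load in mg/s.
    # High surface/near-surface runoff emphasises non-point (agricultural)
    # loads; at low flow the mix is dominated by baseflow and point sources.
    q_total = q_runoff + q_base
    load = q_runoff * c_runoff + q_base * c_base + point_load
    return load / q_total

# Storm peak vs. low flow (illustrative numbers):
print(stream_tp_mg_per_l(200.0, 80.0, 0.5, 0.05, 10.0))  # runoff-dominated mix
print(stream_tp_mg_per_l(5.0, 80.0, 0.5, 0.05, 10.0))    # baseflow/point-source mix
```

In the stochastic framework of the paper, parameters like the runoff concentration would be sampled by Monte-Carlo analysis and judged by the temporal consistency of their estimates.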