Abstract:
Phlorotannins are the least studied group of tannins and are found only in brown algae. Hitherto, the roles of phlorotannins, e.g. in plant-herbivore interactions, have been studied by quantifying the total contents of the soluble phlorotannins with a variety of methods. Little attention has been given either to quantitative variation in cell-wall-bound and exuded phlorotannins or to qualitative variation in individual compounds. A quantification procedure was developed to measure the amount of cell-wall-bound phlorotannins. The quantification of soluble phlorotannins was adjusted for both large- and small-scale samples and used to estimate the amounts of exuded phlorotannins, using bladder wrack (Fucus vesiculosus) as a model species. In addition, separation of individual soluble phlorotannins to produce a phlorotannin profile from the crude phenolic extract was achieved by high-performance liquid chromatography (HPLC). Along with these methodological studies, attention was focused on the factors in the procedure that generated variation in the yield of phlorotannins. The objective was to enhance the efficiency of the sample preparation procedure. To resolve the problem of rapid oxidation of phlorotannins in HPLC analyses, ascorbic acid was added to the extractant. The widely used colourimetric method was found to produce a variation in the yield that depended on the pH and concentration of the sample. Using these developed, adjusted and modified methods, the phenotypic plasticity of phlorotannins was studied with respect to nutrient availability and herbivory. An increase in nutrients decreased the total amount of soluble phlorotannins but did not affect the cell-wall-bound phlorotannins, the exudation of phlorotannins or the phlorotannin profile obtained with HPLC. The presence of the snail Theodoxus fluviatilis on the thallus induced production of soluble phlorotannins, and grazing by the herbivorous isopod Idotea baltica increased the exudation of phlorotannins. To study whether among-population variation in phlorotannin contents arises from genetic divergence, from a plastic response of the algae, or both, algae from separate populations were reared in a common garden. Genetic variation among local populations was found in both the phlorotannin profile and the content of total phlorotannins. Phlorotannins were also genetically variable within populations. This suggests that local algal populations have diverged in their contents of phlorotannins, and that they may respond to natural selection and evolve both quantitatively and qualitatively.
Abstract:
During the last decade, high-speed motor technology has been applied increasingly in the medium and large power range. In particular, applications involving gas movement and compression appear to be the most important area in which high-speed machines are used. Manufacturing the induction motor rotor core from a single piece of steel makes it possible to achieve an extremely rigid rotor construction for the high-speed motor. In a mechanical sense, the solid rotor may be the best possible rotor construction. Unfortunately, the electromagnetic properties of a solid rotor are poorer than those of the traditional laminated rotor of an induction motor. This thesis analyses methods for improving the electromagnetic properties of a solid-rotor induction machine. The slip of the solid rotor is reduced notably if the rotor is axially slitted. The slitting patterns of the solid rotor are examined, and it is shown how the slitting parameters affect the produced torque. Methods for decreasing the harmonic eddy currents on the surface of the rotor are also examined; the motivation is to improve the efficiency of the motor so that it reaches the efficiency standard of a laminated-rotor induction motor. Finite element analysis is used to carry out these research tasks. An analytical calculation method for solid rotors, based on the multi-layer transfer-matrix method, is developed especially for axially slitted solid rotors equipped with well-conducting end rings. The calculation results are verified by finite element analysis and laboratory measurements. Prototype motors of 250–300 kW at 140 Hz were tested to verify the results. Utilization factor data are given for several other prototypes, the largest of which delivers 1000 kW at 12,000 min⁻¹.
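For context on the utilization factor mentioned at the end of the abstract: in standard electrical machine design practice it relates the apparent air-gap power to the rotor volume and speed. The form below is given as general textbook background, not as the thesis's own definition, and the symbols are assumptions.

```latex
% Utilization factor (Esson's coefficient) of a rotating machine,
% stated here as general background:
%   S   -- apparent air-gap power
%   D   -- air-gap diameter
%   l'  -- equivalent core length
%   n_s -- synchronous rotational speed
C = \frac{S}{D^{2}\, l'\, n_{\mathrm{s}}}
```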
Abstract:
This work presents new, efficient Markov chain Monte Carlo (MCMC) simulation methods for statistical analysis in various modelling applications. When using MCMC methods, the model is simulated repeatedly to explore the probability distribution describing the uncertainties in model parameters and predictions. In adaptive MCMC methods based on the Metropolis-Hastings algorithm, the proposal distribution needed by the algorithm learns from the target distribution as the simulation proceeds. Adaptive MCMC methods have been the subject of intensive research lately, as they open a way to essentially easier use of the methodology; the lack of user-friendly computer programs has been a major obstacle to wider acceptance of the methods. This work provides two new adaptive MCMC methods: DRAM and AARJ. The DRAM method has been built especially to work in high-dimensional and non-linear problems. The AARJ method is an extension of DRAM for model selection problems, where the mathematical formulation of the model is uncertain and we want to fit several different models simultaneously to the same observations. The methods were developed while keeping in mind the needs of modelling applications typical in environmental sciences, and the development work was pursued while working with several application projects. The applications presented in this work are: a wintertime oxygen concentration model for Lake Tuusulanjärvi and adaptive control of the aerator; a nutrition model for Lake Pyhäjärvi and lake management planning; validation of the algorithms of the GOMOS ozone remote sensing instrument on board the European Space Agency's Envisat satellite; and a study of the effects of aerosol model selection on the GOMOS algorithm.
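As general background to the adaptive idea described above, and not the thesis's own DRAM or AARJ code, the sketch below shows a minimal Haario-style Adaptive Metropolis sampler in Python, in which the Gaussian proposal covariance is learned from the accumulated chain. The function names, the toy Gaussian target, and the tuning constants are illustrative assumptions.

```python
import numpy as np

def adaptive_metropolis(log_post, x0, n_iter=5000, adapt_start=500, eps=1e-8):
    """Minimal Adaptive Metropolis sampler with Haario-style covariance adaptation.

    log_post : callable returning the log posterior density at a point
    x0       : starting point (1-D numpy array)
    """
    d = len(x0)
    sd = 2.4 ** 2 / d                    # standard AM scaling factor
    chain = np.empty((n_iter, d))
    x = np.asarray(x0, float)
    lp = log_post(x)
    cov = np.eye(d)                      # initial proposal covariance
    for i in range(n_iter):
        prop = np.random.multivariate_normal(x, sd * cov + sd * eps * np.eye(d))
        lp_prop = log_post(prop)
        if np.log(np.random.rand()) < lp_prop - lp:   # Metropolis acceptance
            x, lp = prop, lp_prop
        chain[i] = x
        if i >= adapt_start:             # adapt the proposal to the chain history
            cov = np.cov(chain[:i + 1].T)
    return chain

# Toy usage: sample a correlated 2-D Gaussian target
C = np.array([[1.0, 0.9], [0.9, 1.0]])
Ci = np.linalg.inv(C)
chain = adaptive_metropolis(lambda t: -0.5 * t @ Ci @ t, np.zeros(2))
print(chain.mean(axis=0), np.cov(chain.T))
```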
Abstract:
Background: Information about the composition of regulatory regions is of great value for designing experiments to functionally characterize gene expression. The multiplicity of available applications for predicting transcription factor binding sites in a particular locus contrasts with the substantial computational expertise demanded to manipulate them, which may constitute a potential barrier for the experimental community. Results: CBS (Conserved regulatory Binding Sites, http://compfly.bio.ub.es/CBS) is a public platform of evolutionarily conserved binding sites and enhancers predicted in multiple Drosophila genomes, furnished with published chromatin signatures associated with transcriptionally active regions and other experimental sources of information. Rapid access to this novel body of knowledge through a user-friendly web interface enables non-expert users to identify the binding sequences available for any particular gene, transcription factor, or genome region. Conclusions: The CBS platform is a powerful resource that provides tools for mining individual sequences and groups of co-expressed genes together with epigenomic information to conduct regulatory screenings in Drosophila.
Abstract:
Recent advances in machine learning methods increasingly enable the automatic construction of various types of computer-assisted methods that have been difficult or laborious to program by human experts. The tasks for which such tools are needed arise in many areas, here especially in the fields of bioinformatics and natural language processing. Machine learning methods may not work satisfactorily if they are not appropriately tailored to the task in question; however, their learning performance can often be improved by taking advantage of deeper insight into the application domain or the learning problem at hand. This thesis considers the development of kernel-based learning algorithms that incorporate this kind of prior knowledge of the task in question in an advantageous way. Moreover, computationally efficient algorithms for training the learning machines for specific tasks are presented. In the context of kernel-based learning methods, the incorporation of prior knowledge is often done by designing appropriate kernel functions. Another well-known way is to develop cost functions that fit the task under consideration. For disambiguation tasks in natural language, we develop kernel functions that take into account the positional information and the mutual similarities of words. It is shown that the use of this information significantly improves the disambiguation performance of the learning machine. Further, we design a new cost function that is better suited to the task of information retrieval, and to more general ranking problems, than the cost functions designed for regression and classification. We also consider other applications of the kernel-based learning algorithms, such as text categorization and pattern recognition in differential display. We develop computationally efficient algorithms for training the considered learning machines with the proposed kernel functions. We also design a fast cross-validation algorithm for regularized least-squares type learning algorithms. Further, an efficient version of the regularized least-squares algorithm that can be used together with the new cost function for preference learning and ranking tasks is proposed. In summary, we demonstrate that the incorporation of prior knowledge is possible and beneficial, and that novel advanced kernels and cost functions can be used efficiently in the algorithms.
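To illustrate why fast cross-validation is possible in the regularized least-squares setting mentioned above, the sketch below fits kernel RLS in dual form and computes exact leave-one-out predictions from the hat matrix without retraining. This is a generic textbook construction, not the thesis's own algorithms; the RBF kernel, the synthetic data, and the parameter values are assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_kernel_rls(K, y, lam=1.0):
    # Dual coefficients of regularized least-squares: (K + lam I)^(-1) y
    n = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(n), y)

def loo_predictions(K, y, lam=1.0):
    """Closed-form leave-one-out predictions for kernel RLS.

    Uses the hat matrix H = K (K + lam I)^(-1); the LOO prediction for
    sample i is (f_i - H_ii * y_i) / (1 - H_ii), so no retraining is needed.
    """
    n = K.shape[0]
    H = K @ np.linalg.inv(K + lam * np.eye(n))
    f = H @ y
    h = np.diag(H)
    return (f - h * y) / (1.0 - h)

# Toy usage on synthetic 1-D regression data
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
K = rbf_kernel(X, X, gamma=0.5)
alpha = fit_kernel_rls(K, y, lam=0.1)
print("training MSE:", np.mean((y - K @ alpha) ** 2))
print("LOO MSE:     ", np.mean((y - loo_predictions(K, y, lam=0.1)) ** 2))
```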
Abstract:
Drying is a major step in the manufacturing process in the pharmaceutical industry, and the selection of the dryer and its operating conditions is sometimes a bottleneck. In spite of the difficulties, these bottlenecks are handled with the utmost care because of good manufacturing practice (GMP) requirements and the industry's image in the global market. The purpose of this work is to study how existing knowledge can be used to select a dryer and its operating conditions for drying pharmaceutical materials, with the help of methods such as case-based reasoning and decision trees, in order to reduce the time and expenditure required for research. The work consisted of two major parts: a literature survey on the theories of spray drying, case-based reasoning and decision trees; and an experimental part comprising data acquisition and testing of the models on existing and upgraded data. Testing resulted in a combination of the two models, case-based reasoning and decision trees, which leads to more specific results than conventional methods.
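The case-based reasoning step mentioned above essentially retrieves the most similar previously solved drying case and reuses its recorded dryer choice and settings. The minimal sketch below illustrates that retrieval step only; the case attributes, numeric values, and dryer labels are purely hypothetical placeholders, not data from this work.

```python
import numpy as np

# Each stored case describes a past drying problem with numeric attributes
# (all names and values here are hypothetical) plus the solution applied.
cases = [
    {"feed_moisture": 0.70, "viscosity": 0.05, "heat_sensitivity": 0.8,
     "dryer": "spray dryer", "inlet_temp_C": 150},
    {"feed_moisture": 0.30, "viscosity": 0.60, "heat_sensitivity": 0.2,
     "dryer": "fluid bed dryer", "inlet_temp_C": 80},
]

FEATURES = ["feed_moisture", "viscosity", "heat_sensitivity"]

def similarity(query, case):
    # Simple inverse-distance similarity over the numeric attributes
    q = np.array([query[f] for f in FEATURES])
    c = np.array([case[f] for f in FEATURES])
    return 1.0 / (1.0 + np.linalg.norm(q - c))

def retrieve(query):
    # "Retrieve" step of the CBR cycle: return the most similar stored case
    return max(cases, key=lambda c: similarity(query, c))

best = retrieve({"feed_moisture": 0.65, "viscosity": 0.10, "heat_sensitivity": 0.7})
print(best["dryer"], best["inlet_temp_C"])
```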
Abstract:
Agile software development methods are currently in vogue, and many software development organizations have already implemented agile methods or are planning to implement them. The objective of this thesis is to define how agile software development methods can be implemented in a small organization. The agile methods covered in this thesis are Scrum and XP. The key practices of both methods are analysed and compared with the waterfall method. This thesis also defines an implementation strategy and the actions by which agile methods are implemented in a small organization. In practice, the organization must prepare well, and all needed metrics must be defined before the implementation starts. Three sample projects in which agile methods were implemented are introduced in this work. Experiences from these projects were encouraging, although the sample of projects was too small to yield trustworthy results.
Abstract:
In the very volatile high-technology industry, it is of utmost importance to forecast customer demand accurately. However, statistical forecasting of sales, especially in the heavily competitive electronics business, has always been a challenging task due to very high variation in demand and very short product life cycles. The purpose of this thesis is to validate whether statistical methods can be applied to forecasting sales of short life cycle electronics products and to provide a feasible framework for implementing statistical forecasting in the environment of the case company. Two different approaches have been developed: one for short- and medium-term and one for long-term forecasting horizons. Both models are based on decomposition models but differ in the interpretation of the model residuals. For long-term horizons the residuals are assumed to represent white noise, whereas for short- and medium-term horizons the residuals are modeled using statistical forecasting methods. Both approaches are implemented in Matlab. Modeling results have shown that different markets exhibit different demand patterns, and that different analytical approaches are therefore appropriate for modeling demand in these markets. Moreover, the outcomes of the modeling imply that statistical forecasting cannot be handled separately from judgmental forecasting, but should be perceived only as a basis for judgmental forecasting activities. Based on the modeling results, recommendations are developed for the further deployment of statistical methods in the case company's sales forecasting.
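A minimal sketch of the decomposition-plus-residual-modelling idea described above is given below in Python (the thesis itself used Matlab); the synthetic monthly series, the seasonal period of 12, and the AR order are illustrative assumptions rather than the case company's data or models.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose
from statsmodels.tsa.ar_model import AutoReg

# Synthetic monthly "sales" series with trend, seasonality and noise
rng = np.random.default_rng(1)
t = np.arange(60)
sales = pd.Series(
    100 + 2 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 3, 60),
    index=pd.date_range("2015-01-31", periods=60, freq="M"),
)

# Classical additive decomposition into trend + seasonal + residual
dec = seasonal_decompose(sales, model="additive", period=12)
resid = dec.resid.dropna()

# Short/medium-term horizon: model the residual with an AR process
resid_forecast = AutoReg(resid, lags=3).fit().forecast(steps=6)
print(resid_forecast)

# Long-term horizon: the residual is treated as white noise, so its
# expected contribution to the forecast is simply zero.
```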
Abstract:
OBJECTIVE: The objective of this study was to compare posttreatment seizure severity in a phase III clinical trial of eslicarbazepine acetate (ESL) as adjunctive treatment of refractory partial-onset seizures. METHODS: The Seizure Severity Questionnaire (SSQ) was administered at baseline and posttreatment. The SSQ total score (TS) and component scores (frequency and helpfulness of warning signs before seizures [BS]; severity and bothersomeness of ictal movement and altered consciousness during seizures [DS]; cognitive, emotional, and physical aspects of postictal recovery after seizures [AS]; and overall severity and bothersomeness [SB]) were calculated for the per-protocol population. Analysis of covariance, adjusted for baseline scores, estimated differences in posttreatment least-squares means between treatment arms. RESULTS: Of 547 per-protocol patients, 441 had valid SSQ TS both at baseline and posttreatment. The mean posttreatment TS for ESL 1200 mg/day was significantly lower than that for placebo (2.68 vs 3.20, p<0.001), exceeding the minimal clinically important difference (MCID: 0.48). Mean DS, AS, and SB were also significantly lower with ESL 1200 mg/day; the differences in AS and SB exceeded the MCIDs. The TS, DS, AS, and SB were lower for ESL 800 mg/day than for placebo; only SB was significant (p=0.013). For both ESL arms combined versus placebo, mean scores differed significantly for TS (p=0.006), DS (p=0.031), and SB (p=0.001). CONCLUSIONS: Therapeutic ESL doses led to clinically meaningful, dose-dependent reductions in seizure severity, as measured by SSQ scores. CLASSIFICATION OF EVIDENCE: This study presents Class I evidence that adjunctive ESL (800 and 1200 mg/day) led to clinically meaningful, dose-dependent reductions in seizure severity, as measured by the SSQ.
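As a generic illustration of the baseline-adjusted analysis of covariance described in METHODS (not the trial's actual analysis), the sketch below uses statsmodels with a made-up placeholder data frame; all variable names and values are assumptions.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Placeholder data: posttreatment and baseline SSQ total scores by treatment arm
df = pd.DataFrame({
    "post_ts": [2.7, 2.6, 2.8, 3.0, 3.1, 2.9, 3.2, 3.3, 3.1],
    "base_ts": [3.1, 3.0, 3.2, 3.1, 3.3, 3.0, 3.2, 3.4, 3.1],
    "arm": ["ESL1200", "ESL1200", "ESL1200",
            "ESL800", "ESL800", "ESL800",
            "placebo", "placebo", "placebo"],
})

# ANCOVA: posttreatment score modeled on baseline score plus treatment arm
model = smf.ols(
    "post_ts ~ base_ts + C(arm, Treatment(reference='placebo'))", data=df
).fit()
print(model.params)  # arm coefficients estimate baseline-adjusted differences vs placebo
```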
Abstract:
Most current methods for adult skeletal age-at-death estimation are based on American samples comprising individuals of European and African ancestry. Our limited understanding of population variability hampers efforts to apply these techniques to various skeletal populations around the world, especially in global forensic contexts. Further, documented skeletal samples are rare, limiting our ability to test our techniques. The objective of this paper is to test three pelvic macroscopic methods (1. Suchey-Brooks; 2. Lovejoy; 3. Buckberry and Chamberlain) on a documented modern Spanish sample. These methods were selected because they are popular among Spanish anthropologists and because they have never been tested on a Spanish sample. The study sample consists of 80 individuals (55 ♂ and 25 ♀) of known sex and age from the Valladolid collection. Results indicate that in all three methods, levels of bias and inaccuracy increase with age. The Lovejoy method performs poorly (27%) compared with Suchey-Brooks (71%) and Buckberry and Chamberlain (86%). However, the levels of correlation between phases and chronological ages are low and comparable across the three methods (< 0.395). The apparent accuracy of the Suchey-Brooks and Buckberry and Chamberlain methods is largely based on the broad width of the methods' estimated intervals. This study suggests that, before systematic application of these three methodologies to Spanish populations, further statistical modeling and research into the covariance of chronological age with morphological change are necessary. Future methods should be developed specifically for various world populations and should allow for both precision and flexibility in age estimation.
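Bias and inaccuracy as reported above are conventionally the mean signed error and the mean absolute error of the age estimates; the small Python sketch below computes both, using hypothetical ages rather than data from the Valladolid collection.

```python
import numpy as np

def bias_and_inaccuracy(estimated, actual):
    """Standard accuracy measures used in tests of age-at-death methods.

    bias       = mean(estimated - actual)   (directional error)
    inaccuracy = mean(|estimated - actual|) (absolute error)
    """
    estimated = np.asarray(estimated, float)
    actual = np.asarray(actual, float)
    err = estimated - actual
    return err.mean(), np.abs(err).mean()

# Hypothetical example: midpoint ages assigned by a phase method vs known ages
est = [28.7, 38.2, 45.6, 61.2, 45.6]
known = [25, 41, 52, 70, 48]
bias, inaccuracy = bias_and_inaccuracy(est, known)
print(f"bias = {bias:+.1f} years, inaccuracy = {inaccuracy:.1f} years")
```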