819 results for Efficient welding
Abstract:
Agriculture, particularly intensive crop production, makes a significant contribution to environmental pollution. A variety of canola (Brassica napus) has been genetically modified to enhance nitrogen use efficiency, effectively reducing the amount of fertilizer required for crop production. A partial life-cycle assessment adapted to crop production was used to assess the potential environmental impacts of growing genetically modified, nitrogen use-efficient (GMNUE) canola in North Dakota and Minnesota compared with a conventionally bred control variety. The analysis took into account the entire production system used to produce 1 tonne of canola. This comprised raw material extraction, processing and transportation, as well as all agricultural field operations. All emissions associated with the production of 1 tonne of canola were listed, aggregated and weighted in order to calculate the level of environmental impact. The findings show that there are a range of potential environmental benefits associated with growing GMNUE canola. These include reduced impacts on global warming, freshwater ecotoxicity, eutrophication and acidification. Given the large areas of canola grown in North America and, in particular, Canada, as well as the wide acceptance of genetically modified varieties in this area, there is the potential for GMNUE canola to reduce pollution from agriculture, with the largest reductions predicted to be in greenhouse gases and diffuse water pollution.
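To make the "listed, aggregated and weighted" step concrete, here is a minimal sketch of how a life-cycle inventory per functional unit (1 tonne of product) can be turned into impact-category scores. The inventory values and characterization factors below are illustrative placeholders, not the study's data.

```python
# Indicative sketch of the "list, aggregate and weight" step of a partial
# life-cycle assessment. All numbers are made-up placeholders.
# inventory: emission -> kg per tonne of canola (hypothetical)
inventory = {"CO2": 350.0, "N2O": 1.2, "CH4": 0.4, "NO3_to_water": 2.5, "NH3": 0.8}

# characterization factors (indicative, literature-style values)
gwp_kg_co2_eq = {"CO2": 1.0, "N2O": 298.0, "CH4": 25.0}
eutrophication_kg_po4_eq = {"NO3_to_water": 0.1, "NH3": 0.35}

def impact(factors):
    """Aggregate the inventory into a single impact-category score."""
    return sum(inventory.get(substance, 0.0) * f for substance, f in factors.items())

print("global warming :", round(impact(gwp_kg_co2_eq), 1), "kg CO2-eq / t")
print("eutrophication :", round(impact(eutrophication_kg_po4_eq), 2), "kg PO4-eq / t")
```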
Abstract:
We consider the case of a multicenter trial in which the center-specific sample sizes are potentially small. Under homogeneity, the conventional procedure is to pool information using a weighted estimator whose weights are the inverses of the estimated center-specific variances. Whereas this procedure is efficient under conventional asymptotics (e.g. center-specific sample sizes become large, number of centers fixed), it is commonly believed that the efficiency of this estimator also holds under meta-analytic asymptotics (e.g. center-specific sample sizes bounded and potentially small, number of centers large). In this contribution we demonstrate that this estimator fails to be efficient. In fact, it shows a persistent bias as the number of centers increases, showing that it is not meta-consistent. In addition, we show that the Cochran and Mantel-Haenszel weighted estimators are meta-consistent and, more generally, provide conditions on the weights under which the associated weighted estimator is meta-consistent.
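As a rough illustration of the contrast the abstract draws, the sketch below simulates many small centers (with a binary outcome and a common log odds ratio, both assumptions chosen purely for illustration) and compares an inverse-variance weighted estimator with the Mantel-Haenszel estimator as the number of centers grows.

```python
# Illustrative simulation: inverse-variance weighting vs. Mantel-Haenszel
# under meta-analytic asymptotics (many centers, small bounded sample sizes).
import numpy as np

rng = np.random.default_rng(0)
true_log_or = 1.0          # assumed common treatment effect (log odds ratio)
n_per_arm = 5              # small, bounded center-specific sample size
for n_centers in (20, 200, 2000):
    iv_num = iv_den = mh_num = mh_den = 0.0
    for _ in range(n_centers):
        p0 = rng.uniform(0.2, 0.5)                              # baseline risk
        p1 = 1 / (1 + np.exp(-(np.log(p0 / (1 - p0)) + true_log_or)))
        a = rng.binomial(n_per_arm, p1)                         # events, treatment arm
        c = rng.binomial(n_per_arm, p0)                         # events, control arm
        b, d = n_per_arm - a, n_per_arm - c
        # inverse estimated-variance weighting (0.5 correction for zero cells)
        aa, bb, cc, dd = (x + 0.5 for x in (a, b, c, d))
        log_or = np.log(aa * dd / (bb * cc))
        w = 1.0 / (1 / aa + 1 / bb + 1 / cc + 1 / dd)
        iv_num += w * log_or
        iv_den += w
        # Mantel-Haenszel weighting
        n_i = a + b + c + d
        mh_num += a * d / n_i
        mh_den += b * c / n_i
    print(n_centers, "centers:",
          "IV estimate", round(iv_num / iv_den, 3),
          "MH estimate", round(np.log(mh_num / mh_den), 3))
```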
Abstract:
Greenhouse cladding materials are a major component in the design of energy-efficient greenhouses. The optical properties of cladding materials determine a major part of the overall performance of a greenhouse, both in terms of its energy balance and in terms of crop behavior. Various film plastic greenhouse-cladding materials were measured under laboratory conditions using a spectroradiometer equipped with an integrating sphere. Films were measured over a range of angles of incidence, and the effect of increasing the distance between double films was also measured. PAR transmission remained nearly constant for angles of incidence up to 30 degrees but fell rapidly as the angle of incidence increased further towards 90 degrees. Increasing the distance between double films did not significantly affect PAR transmission in any of the films examined. These results are discussed in relation to the design criteria for an energy-efficient greenhouse.
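For intuition about why transmission holds up at small angles and then collapses towards grazing incidence, the sketch below evaluates the unpolarized Fresnel transmittance of a single non-absorbing film; the refractive index is an assumed, typical value, and this is a theoretical sketch rather than the paper's measurement procedure.

```python
# Unpolarized transmittance of a thin non-absorbing film in air vs. angle of
# incidence (neglecting interference, including multiple internal reflections).
import numpy as np

def film_transmittance(theta_deg, n=1.51):        # n: assumed typical film index
    ti = np.radians(theta_deg)
    tt = np.arcsin(np.sin(ti) / n)                # Snell's law
    rs = (np.cos(ti) - n * np.cos(tt)) / (np.cos(ti) + n * np.cos(tt))
    rp = (n * np.cos(ti) - np.cos(tt)) / (n * np.cos(ti) + np.cos(tt))
    R = 0.5 * (rs**2 + rp**2)                     # single-interface reflectance
    return (1 - R) / (1 + R)                      # two-interface slab

for angle in (0, 15, 30, 45, 60, 75, 89):
    print(f"{angle:2d} deg -> T = {film_transmittance(angle):.3f}")
```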
Abstract:
Details about the parameters of kinetic systems are crucial for progress in both medical and industrial research, including drug development, clinical diagnosis and biotechnology applications. Such details must be collected through a series of kinetic experiments and investigations. The correct design of the experiment is essential to collecting data suitable for analysis, modelling and deriving the correct information. We have developed a systematic and iterative Bayesian method and sets of rules for the design of enzyme kinetic experiments. Our method selects the optimum design to collect data suitable for accurate modelling and analysis and minimises the error in the parameters estimated. The rules select features of the design such as the substrate range and the number of measurements. We show here that this method can be directly applied to the study of other important kinetic systems, including drug transport, receptor binding, microbial culture and cell transport kinetics. It is possible to reduce the errors in the estimated parameters and, most importantly, to increase efficiency and cost-effectiveness by reducing the number of experiments and data points required. (C) 2003 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
Abstract:
In areas such as drug development, clinical diagnosis and biotechnology research, acquiring details about the kinetic parameters of enzymes is crucial. The correct design of an experiment is critical to collecting data suitable for analysis, modelling and deriving the correct information. As classical design methods are not tailored to the more complex kinetics now frequently studied, care is needed to estimate the parameters of such models with low variance. We demonstrate that a Bayesian approach (the use of prior knowledge) can produce major gains quantifiable in terms of the information, productivity and accuracy of each experiment. Building on Bayesian utility functions, we have used a systematic method to identify the optimum experimental designs for a number of kinetic model data sets. This has enabled the identification of trends between kinetic model types, sets of design rules and the key conclusion that such designs should be based on some prior knowledge of K_M and/or the kinetic model. We suggest an optimal and iterative method for selecting features of the design such as the substrate range, the number of measurements and the choice of intermediate points. The final design collects data suitable for accurate modelling and analysis and minimises the error in the parameters estimated. (C) 2003 Elsevier Science B.V. All rights reserved.
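A minimal sketch of the general idea, not the authors' algorithm: given a prior point estimate of K_M, pick substrate concentrations for a Michaelis-Menten experiment by a locally D-optimal criterion. The rate law, error model and candidate grid below are assumptions for illustration.

```python
# Locally D-optimal two-point design for v = Vmax*S/(Km+S), given prior K_M.
import numpy as np
from itertools import combinations

def sensitivities(S, Vmax, Km):
    """Partial derivatives of v with respect to (Vmax, Km)."""
    return np.column_stack([S / (Km + S), -Vmax * S / (Km + S) ** 2])

def d_criterion(design, Vmax, Km, replicates=3):
    """log det of the Fisher information under a constant-variance error model."""
    J = sensitivities(np.asarray(design, float), Vmax, Km)
    info = replicates * J.T @ J
    sign, logdet = np.linalg.slogdet(info)
    return logdet if sign > 0 else -np.inf

Vmax_prior, Km_prior = 1.0, 2.0                 # assumed prior point estimates
candidates = np.linspace(0.25, 20.0, 80)        # assumed feasible substrate range

best = max(combinations(candidates, 2),
           key=lambda d: d_criterion(d, Vmax_prior, Km_prior))
print("D-optimal 2-point design given the prior K_M:",
      [round(float(s), 2) for s in best])
```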
Abstract:
Presented herein is an experimental design that allows the effects of several radiative forcing factors on climate to be estimated as precisely as possible from a limited suite of atmosphere-only general circulation model (GCM) integrations. The forcings include the combined effect of observed changes in sea surface temperatures, sea ice extent, stratospheric (volcanic) aerosols, and solar output, plus the individual effects of several anthropogenic forcings. A single linear statistical model is used to estimate the forcing effects, each of which is represented by its global mean radiative forcing. The strong colinearity in time between the various anthropogenic forcings poses a technical problem that is overcome through the design of the experiment. This design uses every combination of anthropogenic forcing rather than a few highly replicated ensembles, as is more common in climate studies. Not only is this design highly efficient for a given number of integrations, but it also allows the estimation of (nonadditive) interactions between pairs of anthropogenic forcings. The simulated land surface air temperature changes since 1871 have been analyzed. Changes in natural and oceanic forcing, the latter of which itself contains some forcing from anthropogenic as well as natural influences, have the largest effect. For the global mean, increasing greenhouse gases and the indirect aerosol effect had the largest anthropogenic effects. An interaction between these two anthropogenic effects was also found in the atmosphere-only GCM. This interaction is similar in magnitude to the individual effects of changing tropospheric and stratospheric ozone concentrations or to the direct (sulfate) aerosol effect. Various diagnostics are used to evaluate the fit of the statistical model. For the global mean, this shows that the land temperature response is proportional to the global mean radiative forcing, reinforcing the use of radiative forcing as a measure of climate change. The diagnostic tests also show that the linear model is suitable for analyses of land surface air temperature at each GCM grid point. Therefore, the linear model provides precise estimates of the space-time signals for all forcing factors under consideration. For simulated 50-hPa temperatures, results show that tropospheric ozone increases have contributed to stratospheric cooling over the twentieth century almost as much as changes in well-mixed greenhouse gases.
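The design principle can be illustrated with a toy factorial layout: running every combination of a few hypothetical forcings gives an orthogonal design matrix, so main effects and pairwise interactions are separable by a single linear model even when the forcings are colinear in time. The factors, coding and synthetic responses below are assumptions, not the GCM experiment itself.

```python
# Full 2^3 factorial (standard +/-1 coding) with pairwise interaction terms.
import numpy as np
from itertools import product, combinations

factors = ["GHG", "sulfate_aerosol", "ozone"]          # hypothetical forcing factors
levels = np.array(list(product([-1, 1], repeat=len(factors))), float)

# design matrix: intercept, main effects, pairwise interactions
cols = [np.ones(len(levels))] + [levels[:, i] for i in range(3)]
cols += [levels[:, i] * levels[:, j] for i, j in combinations(range(3), 2)]
X = np.column_stack(cols)

rng = np.random.default_rng(1)
true_beta = np.array([0.0, 0.8, -0.3, 0.1, 0.25, 0.0, 0.0])   # one nonzero interaction
y = X @ true_beta + rng.normal(0, 0.05, len(levels))          # synthetic responses

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print("condition number of X:", round(float(np.linalg.cond(X)), 2))  # 1.0: fully separable
print("estimated effects    :", np.round(beta_hat, 2))
```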
Abstract:
At the end of its tether! The fusion of a six-membered ring onto the four-carbon-atom tether of substrate 1 provides an efficient approach toward the polycyclic ring systems of the natural products aphidicolin and stemodinone. The reaction represents a unique example of a preference for product formation from an endo exciplex in an intramolecular system (exo:endo 2:3=1.0:1.2).
Abstract:
The factors affecting the copper-catalyzed rearrangement of ammonium ylids derived from tetrahydropyridines and diazoesters have been examined, and the first examples of high-yielding metal-catalyzed [2,3]-sigmatropic rearrangements of a wide range of such ylids are reported. The nature of the alpha-substituent in the diazo component has a dramatic effect on the yield of the reaction, with electron-withdrawing substituents enhancing it.
Abstract:
Event-related functional magnetic resonance imaging (efMRI) has emerged as a powerful technique for detecting the brain's responses to presented stimuli. A primary goal in efMRI data analysis is to estimate the Hemodynamic Response Function (HRF) and to locate activated regions in the human brain when specific tasks are performed. This paper develops new methodologies that are important improvements not only to parametric but also to nonparametric estimation and hypothesis testing of the HRF. First, an effective and computationally fast scheme for estimating the error covariance matrix for efMRI is proposed. Second, methodologies for estimation and hypothesis testing of the HRF are developed. Simulations support the effectiveness of our proposed methods. When applied to an efMRI dataset from an emotional control study, our method reveals more meaningful findings than the popular methods offered by AFNI and FSL. (C) 2008 Elsevier B.V. All rights reserved.
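As a sketch of the nonparametric side of the problem (an assumed setup, not the paper's estimator), the code below builds a lagged design matrix from synthetic event onsets and recovers a finite-impulse-response HRF by ordinary least squares, i.e. ignoring the error-covariance estimation that the paper addresses.

```python
# FIR estimation of an HRF from a synthetic single-voxel event-related series.
import numpy as np
from scipy.stats import gamma

TR, n_scans, n_lags = 2.0, 200, 12
t = np.arange(n_lags) * TR
true_hrf = gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 12)   # canonical-style shape

rng = np.random.default_rng(2)
events = np.zeros(n_scans)
events[rng.choice(n_scans - n_lags, 25, replace=False)] = 1.0   # stimulus onsets

# lagged design matrix: column k is the event train shifted by k scans
X = np.column_stack([np.roll(events, k) * (np.arange(n_scans) >= k)
                     for k in range(n_lags)])
y = X @ true_hrf + rng.normal(0, 0.2, n_scans)          # noisy BOLD-like signal

hrf_hat, *_ = np.linalg.lstsq(X, y, rcond=None)          # OLS, i.i.d. noise assumed
print(np.round(hrf_hat, 3))
```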
Abstract:
This paper develops fuzzy methods for control of the rotary inverted pendulum, an underactuated mechanical system. Two control laws are presented, one for swing-up and one for stabilization. The pendulum is swung up from the stable, vertically downward position to the unstable upright position along a controlled trajectory. The swing-up rules are written heuristically so that each swing builds up more energy. Stabilization is achieved by mapping a stabilizing LQR control law onto two fuzzy inference engines, which reduces the computational load compared with using a single fuzzy inference engine. The robustness of the balancing control is tested by attaching a bottle of water to the tip of the pendulum.
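A hedged sketch of the LQR step that the fuzzy rules would approximate: compute a stabilizing gain K for a linearized inverted-pendulum model and apply u = -Kx. The cart-pole linearization and the physical parameters below are stand-ins for the rotary (Furuta) pendulum's model, used only for illustration.

```python
# LQR gain for a linearized inverted pendulum via the continuous-time ARE.
import numpy as np
from scipy.linalg import solve_continuous_are

M, m, l, g = 1.0, 0.1, 0.5, 9.81              # assumed cart mass, bob mass, rod length
A = np.array([[0, 1, 0,                      0],
              [0, 0, -m * g / M,             0],
              [0, 0, 0,                      1],
              [0, 0, (M + m) * g / (M * l),  0]])
B = np.array([[0.0], [1 / M], [0.0], [-1 / (M * l)]])

Q = np.diag([1.0, 1.0, 10.0, 1.0])            # penalize the pendulum angle most
R = np.array([[0.1]])

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)               # LQR gain: u = -K x
print("LQR gain K:", np.round(K, 2))

x = np.array([0.0, 0.0, 0.05, 0.0])           # small tilt from upright
print("control input u =", (-K @ x).item())
```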
Abstract:
This paper discusses the design, implementation and synthesis of an FFT module that has been specifically optimized for use in the OFDM-based Multiband UWB system, although the work is generally applicable to many other OFDM-based receiver systems. Previous work has detailed the requirements for the receiver FFT module within the Multiband UWB OFDM-based system; this paper combines those requirements with modern digital architecture principles and low-power design criteria to converge on our optimized solution. The FFT design obtained here is also applicable to the implementation of the transmitter IFFT module, so only one FFT module is needed for half-duplex operation. The results of this paper enable baseband designers of the 200 Mbit/s variant of Multiband UWB systems (and indeed other OFDM-based receivers) using System-on-Chip (SoC), FPGA and ASIC technology to create cost-effective and low-power solutions aimed at the competitive consumer electronics market.
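A software sketch of the reuse idea (not the synthesized hardware): a radix-2 decimation-in-time FFT plus the standard conjugation identity that lets the same forward-FFT core compute the transmitter IFFT, which is what makes a single module sufficient for half-duplex operation. The 128-point test length is an assumption here.

```python
# Radix-2 DIT FFT and IFFT reuse via the conjugation identity.
import numpy as np

def fft_radix2(x):
    """Recursive radix-2 DIT FFT; len(x) must be a power of two."""
    x = np.asarray(x, complex)
    n = len(x)
    if n == 1:
        return x
    even, odd = fft_radix2(x[0::2]), fft_radix2(x[1::2])
    twiddle = np.exp(-2j * np.pi * np.arange(n // 2) / n)
    return np.concatenate([even + twiddle * odd, even - twiddle * odd])

def ifft_via_fft(x):
    """IFFT computed by reusing the forward FFT: ifft(x) = conj(fft(conj(x)))/N."""
    return np.conj(fft_radix2(np.conj(x))) / len(x)

symbol = np.random.default_rng(3).normal(size=128) + 0j
assert np.allclose(fft_radix2(symbol), np.fft.fft(symbol))
assert np.allclose(ifft_via_fft(symbol), np.fft.ifft(symbol))
print("radix-2 FFT and reused-core IFFT match the numpy reference")
```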
Abstract:
An approach to the automatic generation of efficient Field Programmable Gate Array (FPGA) circuits for regular-expression-based (RegEx) pattern matching problems is presented. Using the proposed novel design strategy, highly area- and time-efficient circuits can be automatically generated for arbitrary sets of regular expressions. This makes the technique suitable for applications that must handle very large sets of patterns at high speed, such as the network security and intrusion detection application domains. We combine several existing techniques to optimise our solution for such domains and propose how the whole process of dynamically generating FPGA circuits for RegEx pattern matching can be automated efficiently.
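To illustrate the basic mapping that such generators typically exploit, namely one flip-flop per NFA state with all transitions evaluated in parallel on every input character, here is a small software model for a hand-built automaton of an assumed example pattern a(b|c)*d; it is an illustration, not the output of the proposed generator.

```python
# Bit-vector NFA simulation mirroring a one-hot hardware regex matcher.
# Hand-built automaton for the illustrative pattern a(b|c)*d.
# transition table: state -> {character: bitmask of next states}
TRANS = {
    0: {"a": 0b00010},
    1: {"b": 0b00100, "c": 0b01000, "d": 0b10000},
    2: {"b": 0b00100, "c": 0b01000, "d": 0b10000},
    3: {"b": 0b00100, "c": 0b01000, "d": 0b10000},
}
ACCEPT = 0b10000   # state 4
START = 0b00001    # state 0

def match_stream(text):
    """Feed characters one by one, keeping every active NFA state in a bit vector
    (a new match attempt starts at every position, as a scanning circuit would)."""
    hits = []
    active = 0
    for i, ch in enumerate(text):
        active |= START                     # allow a match to start here
        nxt = 0
        for state, moves in TRANS.items():
            if active & (1 << state):
                nxt |= moves.get(ch, 0)     # all transitions fire in parallel
        active = nxt
        if active & ACCEPT:
            hits.append(i)                  # pattern a(b|c)*d ends at index i
    return hits

print(match_stream("xxabbcd..ad.abd"))      # expected end positions: [6, 10, 14]
```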