51 results for Methodology
Abstract:
A calibration methodology based on an efficient and stable mathematical regularization scheme is described. This scheme is a variant of so-called Tikhonov regularization in which the parameter estimation process is formulated as a constrained minimization problem. Use of the methodology eliminates the need for a modeler to formulate a parsimonious inverse problem in which a handful of parameters are designated for estimation prior to initiating the calibration process. Instead, the level of parameter parsimony required to achieve a stable solution to the inverse problem is determined by the inversion algorithm itself. Where parameters, or combinations of parameters, cannot be uniquely estimated, they are provided with values, or assigned relationships with other parameters, that are deemed realistic by the modeler. Conversely, where the information content of a calibration dataset is sufficient to allow estimates to be made of the values of many parameters, such estimates are not precluded by preemptive parsimonization ahead of the calibration process. While Tikhonov schemes are very attractive and hence widely used, problems with numerical stability can sometimes arise because the strength with which regularization constraints are applied throughout the regularized inversion process cannot be guaranteed to exactly complement inadequacies in the information content of a given calibration dataset. A new technique overcomes this problem by allowing relative regularization weights to be estimated as parameters through the calibration process itself. The technique is applied to the simultaneous calibration of five subwatershed models, and it is demonstrated that the new scheme results in a more efficient inversion and better enforcement of regularization constraints than traditional Tikhonov regularization methodologies.
Moreover, it is argued that a joint calibration exercise of this type results in a more meaningful set of parameters than can be achieved by individual subwatershed model calibration. (c) 2005 Elsevier B.V. All rights reserved.
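The Tikhonov formulation described above can be illustrated with a minimal sketch: the observation equations are augmented with weighted "preferred value" constraints, so parameters the calibration data cannot resolve fall back toward values the modeler deems realistic. The toy Jacobian, preferred values, and fixed regularization weight below are illustrative assumptions, not the paper's subwatershed models or its adaptive weight-estimation scheme.

```python
import numpy as np

# Hypothetical toy inverse problem: recover parameters p from noisy
# observations d = J @ p_true + noise, where J is the model Jacobian.
rng = np.random.default_rng(0)
n_obs, n_par = 20, 10
J = rng.normal(size=(n_obs, n_par))
p_true = np.ones(n_par)
d = J @ p_true + 0.01 * rng.normal(size=n_obs)

# Tikhonov regularization: augment the least-squares system with
# weighted constraints p ~ p_pref, so poorly informed parameters
# are drawn toward values deemed realistic by the modeler.
p_pref = np.zeros(n_par)   # modeler's preferred values (assumed)
lam = 0.1                  # fixed regularization weight (assumed)
A_aug = np.vstack([J, lam * np.eye(n_par)])
b_aug = np.concatenate([d, lam * p_pref])

# Solve min ||J p - d||^2 + lam^2 ||p - p_pref||^2
p_est, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)
```

In the paper's scheme the relative weights analogous to `lam` are themselves estimated during inversion rather than held fixed as above.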
Abstract:
Microbial fuel cell (MFC) research is a rapidly evolving field that lacks established terminology and methods for the analysis of system performance. This makes it difficult for researchers to compare devices on an equivalent basis. The construction and analysis of MFCs requires knowledge of different scientific and engineering fields, ranging from microbiology and electrochemistry to materials and environmental engineering. Describing MFC systems therefore involves an understanding of these different scientific and engineering principles. In this paper, we provide a review of the different materials and methods used to construct MFCs, techniques used to analyze system performance, and recommendations on what information to include in MFC studies and the most useful ways to present results.
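Two performance metrics that such methodological reviews commonly standardize, power density normalized to electrode area and coulombic efficiency based on COD removal, can be sketched as follows. The function names and example figures are illustrative assumptions; the Faraday constant and the 8 g-COD-per-mol-electron equivalence (32 g O2 per 4 mol electrons) are standard.

```python
F = 96485.0  # Faraday constant, C per mol of electrons

def power_density(voltage_V, resistance_ohm, area_m2):
    """Areal power density (W/m^2) for a cell loaded with an external resistor."""
    current_A = voltage_V / resistance_ohm      # Ohm's law
    return voltage_V * current_A / area_m2

def coulombic_efficiency(avg_current_A, time_s, delta_cod_g_per_L, volume_L):
    """Fraction of electrons in the removed substrate recovered as current.

    Uses the oxygen equivalence of COD (8 g COD per mol of electrons)
    for a batch-fed reactor of liquid volume volume_L.
    """
    charge_harvested_C = avg_current_A * time_s
    mol_electrons_available = delta_cod_g_per_L * volume_L / 8.0
    return charge_harvested_C / (F * mol_electrons_available)

# Example: 0.5 V across a 1 kOhm external resistor, 100 cm^2 anode
pd = power_density(0.5, 1000.0, 0.01)  # -> 0.025 W/m^2
```

Normalizing power to anode area rather than reactor volume (or vice versa) changes reported figures substantially, which is one reason the abstract stresses reporting conventions.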
Abstract:
The integrated chemical-biological degradation combining advanced oxidation by UV/H2O2 followed by aerobic biodegradation was used to degrade C.I. Reactive Azo Red 195A, commonly used in the textile industry in Australia. An experimental design based on the response surface method was applied to evaluate the interactive effects of influencing factors (UV irradiation time, initial hydrogen peroxide dosage and recirculation ratio of the system) on decolourisation efficiency and to optimize the operating conditions of the treatment process. The effects were determined by measurement of dye concentration and soluble chemical oxygen demand (S-COD). The results showed that dye and S-COD removal were affected by all factors, both individually and interactively. Maximal colour degradation performance was predicted, and experimentally validated, with no recirculation, 30 min UV irradiation and 500 mg H2O2/L. The model predictions for colour removal, based on a three-factor/five-level Box-Wilson central composite design and response surface method analysis, were found to be very close to additional experimental results obtained under near-optimal conditions. This demonstrates the benefits of this approach in achieving good predictions while minimising the number of experiments required. (c) 2006 Elsevier B.V. All rights reserved.
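The central-composite/response-surface workflow the abstract describes (designed runs, a fitted second-order polynomial, a predicted optimum) can be sketched for two coded factors. The design, the toy response, and its optimum are illustrative assumptions, not the study's three-factor/five-level dye data.

```python
import numpy as np
from itertools import product

def quad_design_matrix(X):
    """Second-order model terms [1, x1, x2, x1*x2, x1^2, x2^2]."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

# Face-centred central composite design in coded units for two factors:
# 4 factorial points, 4 axial points, 3 centre replicates.
factorial = np.array(list(product([-1.0, 1.0], repeat=2)))
axial = np.array([[-1.0, 0.0], [1.0, 0.0], [0.0, -1.0], [0.0, 1.0]])
centre = np.zeros((3, 2))
X = np.vstack([factorial, axial, centre])

# Toy "removal efficiency" response with a known optimum at (0.5, -0.25)
y = 10.0 - (X[:, 0] - 0.5) ** 2 - 2.0 * (X[:, 1] + 0.25) ** 2

# Least-squares fit of the quadratic response surface
beta, *_ = np.linalg.lstsq(quad_design_matrix(X), y, rcond=None)

# Stationary point of the fitted surface; the interaction term is zero
# here, so each factor's optimum decouples (d y / d xi = 0).
x1_opt = -beta[1] / (2.0 * beta[4])
x2_opt = -beta[2] / (2.0 * beta[5])
```

Eleven designed runs suffice to fit all six quadratic coefficients, which is the "good predictions from few experiments" economy the abstract highlights.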
Abstract:
We developed an efficient, cost effective strategy for Fmoc-based solid phase synthesis of 'difficult' peptides and/or peptides containing Asp/Asn-Gly sequences, free of aspartimide and related products, using a peptoid methodology for the preparation of N-substituted glycines.
Abstract:
An inverse methodology to assist in the design of radio-frequency (RF) head coils for high-field MRI applications is described in this work. Free-space time-harmonic electromagnetic Green's functions and a preemphasized B1 field are used to calculate the current density on the coil cylinder. With the B1 field preemphasized and lowered in the middle of the RF transverse plane, the calculated current distribution can generate an internal magnetic field that reduces EM field/tissue interactions at high frequencies. The current distribution of a head coil operating at 4 T is calculated using the inverse methodology with preemphasized B1 fields. The FDTD method is employed to calculate the B1 field and signal intensity inside a homogeneous cylindrical phantom and a human head. A comparison with a conventional RF birdcage coil is reported, demonstrating that the inverse-method-designed coil with a preemphasized B1 field can help decrease the notorious bright region caused by EM field/tissue interactions in human head images at 4 T.
Abstract:
The ontological analysis of conceptual modelling techniques is increasingly popular. Related research has explored not only the ontological deficiencies of classical techniques such as ER or UML, but also business process modelling techniques such as ARIS, and even Web services standards such as BPEL4WS. While the selected ontologies are reasonably mature, the actual process of an ontological analysis still lacks rigor. The current procedure leaves significant room for individual interpretation and is one reason for criticism of the entire ontological analysis. This paper proposes a procedural model for ontological analysis based on the use of meta models, the involvement of more than one coder, and metrics. The model is explained with examples from various ontological analyses.