975 results for Optimization analysis
Investigation and optimization of parameters affecting the multiply charged ion yield in AP-MALDI MS
Abstract:
Liquid matrix-assisted laser desorption/ionization (MALDI) allows the generation of predominantly multiply charged ions in atmospheric pressure (AP) MALDI ion sources for mass spectrometry (MS) analysis. The charge state distribution of the generated ions and the efficiency of the ion source in generating such ions crucially depend on the desolvation regime of the MALDI plume after desorption in the AP-to-vacuum inlet. Both high temperature and a flow regime with increased residence time of the desorbed plume in the desolvation region promote the generation of multiply charged ions. Without such measures the application of an electric ion extraction field significantly increases the ion signal intensity of singly charged species, while the detection of multiply charged species is less dependent on the extraction field. In general, optimized application of high temperature facilitates the predominant formation and detection of multiply charged rather than singly charged ion species. In this study an experimental setup and optimization strategy are described for liquid AP-MALDI MS which improve the ionization efficiency of selected ion species up to 14 times. In combination with ion mobility separation, the method allows the detection of multiply charged peptide and protein ions for analyte solution concentrations as low as 2 fmol/µL (0.5 µL, i.e. 1 fmol, deposited on the target) with very low sample consumption in the low nL range.
Abstract:
Tensor clustering is an important tool that exploits intrinsically rich structures in real-world multiarray or tensor datasets. In dealing with those datasets, standard practice is to use subspace clustering based on vectorizing the multiarray data. However, vectorization of tensorial data does not exploit the complete structure information. In this paper, we propose a subspace clustering algorithm that avoids any vectorization step. Our approach is based on a novel heterogeneous Tucker decomposition model that takes cluster membership information into account. We propose a new clustering algorithm that alternates between the different modes of the proposed heterogeneous tensor model. All but the last mode have closed-form updates. Updating the last mode reduces to optimizing over the multinomial manifold, for which we investigate second-order Riemannian geometry and propose a trust-region algorithm. Numerical experiments show that our proposed algorithm competes effectively with state-of-the-art clustering algorithms that are based on tensor factorization.
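As a heavily simplified stand-in for the alternating scheme described above (not the authors' heterogeneous Tucker model or the Riemannian trust-region step), one can extract per-mode factors by truncated SVD of the mode unfoldings and cluster the rows of the last-mode factor, which index the samples:

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: bring `mode` to the front and flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def tucker_subspace_cluster(X, ranks, k, iters=10):
    """Truncated SVD of each mode unfolding gives Tucker-style factor
    matrices; the rows of the last-mode factor embed the samples, which
    are then clustered with plain k-means (farthest-point seeding)."""
    factors = [np.linalg.svd(unfold(X, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    Z = factors[-1]                      # samples live along the last mode
    cent = [Z[0]]
    for _ in range(1, k):                # farthest-point initialisation
        d = np.min([((Z - c) ** 2).sum(axis=1) for c in cent], axis=0)
        cent.append(Z[np.argmax(d)])
    cent = np.array(cent)
    for _ in range(iters):               # Lloyd iterations
        labels = ((Z[:, None, :] - cent[None, :, :]) ** 2).sum(axis=-1).argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                cent[j] = Z[labels == j].mean(axis=0)
    return labels

# Synthetic demo: 20 slices along the last mode drawn from two patterns.
rng = np.random.default_rng(1)
base1, base2 = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))
X = np.stack([b + 0.01 * rng.normal(size=(4, 4))
              for b in [base1] * 10 + [base2] * 10], axis=-1)
labels = tucker_subspace_cluster(X, ranks=(2, 2, 2), k=2)
```

This keeps the tensor un-vectorized until the per-mode factors are computed, which is the structural point the abstract makes against naive vectorization.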
Abstract:
This work evaluated the effect of pressure and temperature on the yield and characteristic flavour intensity of Brazilian cherry (Eugenia uniflora L.) extracts obtained by supercritical CO₂, using response surface analysis, which is a simple and efficient method for first inquiries. A complete central composite 2² factorial experimental design was applied using temperature (ranging from 40 to 60 °C) and pressure (from 150 to 250 bar) as independent variables. A second-order model proved to be predictive (p <= 0.05) for the extract yield as affected by pressure and temperature, with better results being achieved at the central point (200 bar and 50 °C). For the flavour intensity, a first-order model proved to be predictive (p <= 0.05), showing the influence of temperature. Greater characteristic flavour intensity in extracts was obtained at relatively high temperature (> 50 °C). Therefore, as far as Brazilian cherry is concerned, optimum conditions for achieving higher extract yield do not necessarily coincide with those for obtaining richer flavour intensity. Industrial relevance: Supercritical fluid extraction (SFE) is an emerging clean technology through which one may obtain extracts free from organic solvents. Extract yields from natural products for applications in the food, pharmaceutical and cosmetic industries have been widely disseminated in the literature. Accordingly, two lines of research have industrial relevance, namely, (i) operational optimization studies for high SFE yields and (ii) investigation of important properties the extracts are expected to present (so as to define their prospective industrial application). Specifically, this work studied the optimization of the SFE process to obtain extracts from a tropical fruit showing high intensity of its characteristic flavour, aiming at promoting its application in natural aroma enrichment of processed foods. (C) 2008 Elsevier Ltd. All rights reserved.
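The central composite design and second-order model fit can be sketched as follows; the design points are the standard coded CCD for two factors, and the yield coefficients are invented for illustration, not the paper's fitted values:

```python
import numpy as np

# Central composite design in coded units for two factors (T, P):
# 2^2 factorial points, axial points at +/-sqrt(2), and a centre point.
a = np.sqrt(2.0)
pts = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                [-a, 0], [a, 0], [0, -a], [0, a], [0, 0]], dtype=float)

def design_matrix(X):
    # Full quadratic model: intercept, linear, pure quadratic, interaction.
    t, p = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(t), t, p, t * t, p * p, t * p])

# Hypothetical second-order yield surface (illustrative coefficients):
true_beta = np.array([3.0, 0.4, 0.6, -0.5, -0.3, 0.1])
y = design_matrix(pts) @ true_beta

# Least-squares fit recovers the surface; its stationary point would be
# the candidate optimum in coded units.
beta, *_ = np.linalg.lstsq(design_matrix(pts), y, rcond=None)
```

With nine CCD runs, all six coefficients of the quadratic model are estimable, which is why this design is the standard choice for second-order response surfaces.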
Abstract:
We present a new technique for obtaining model fittings to very long baseline interferometric images of astrophysical jets. The method minimizes a performance function proportional to the sum of the squared differences between the model and observed images. The model image is constructed by summing N_s elliptical Gaussian sources characterized by six parameters: two-dimensional peak position, peak intensity, eccentricity, amplitude, and orientation angle of the major axis. We present results for the fitting of two main benchmark jets: the first constructed from three individual Gaussian sources, the second formed by five Gaussian sources. Both jets were analyzed by our cross-entropy technique in finite and infinite signal-to-noise regimes, with the background noise chosen to mimic that found in interferometric radio maps. Those images were constructed to simulate most of the conditions encountered in interferometric images of active galactic nuclei. We show that the cross-entropy technique is capable of recovering the parameters of the sources with a similar accuracy to that obtained from the very traditional Astronomical Image Processing System package task IMFIT when the image is relatively simple (e.g., few components). For more complex interferometric maps, our method displays superior performance in recovering the parameters of the jet components. Our methodology is also able to show quantitatively the number of individual components present in an image. An additional application of the cross-entropy technique to a real image of a BL Lac object is shown and discussed. Our results indicate that our cross-entropy model-fitting technique must be used in situations involving the analysis of complex emission regions having more than three sources, even though it is substantially slower than current model-fitting tasks (at least 10,000 times slower for a single processor, depending on the number of sources to be optimized).
As in the case of any model fitting performed in the image plane, caution is required in analyzing images constructed from a poorly sampled (u, v) plane.
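A minimal sketch of the cross-entropy model-fitting idea for a single circular Gaussian source on a small grid (the paper fits six-parameter elliptical Gaussians to interferometric maps; this simplified stand-in only illustrates the sample-select-refit loop):

```python
import numpy as np

def model_image(params, xg, yg):
    # Sum of circular Gaussian sources; each row of `params` is
    # (x0, y0, amplitude, sigma).
    img = np.zeros(xg.shape, dtype=float)
    for x0, y0, amp, sig in params:
        img += amp * np.exp(-((xg - x0) ** 2 + (yg - y0) ** 2) / (2.0 * sig ** 2))
    return img

def cross_entropy_fit(observed, xg, yg, iters=60, pop=200, n_elite=20, seed=0):
    # Cross-entropy method: sample parameter vectors from a Gaussian,
    # keep the best-scoring ("elite") samples, refit the Gaussian, repeat.
    rng = np.random.default_rng(seed)
    mu = np.array([16.0, 16.0, 1.0, 3.0])   # start at the image centre
    sd = np.array([8.0, 8.0, 1.0, 2.0])
    for _ in range(iters):
        samples = rng.normal(mu, sd, size=(pop, 4))
        scores = np.array([np.sum((model_image(s[None, :], xg, yg) - observed) ** 2)
                           for s in samples])
        elite = samples[np.argsort(scores)[:n_elite]]
        mu, sd = elite.mean(axis=0), elite.std(axis=0) + 1e-3
    return mu

xg, yg = np.meshgrid(np.arange(32.0), np.arange(32.0))
true_params = np.array([[12.0, 20.0, 2.0, 2.5]])
observed = model_image(true_params, xg, yg)   # noiseless (infinite S/N) case
fit = cross_entropy_fit(observed, xg, yg)
```

The squared-difference score is the performance function the abstract describes; the slowness it mentions comes from evaluating the model image for every sample in every generation.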
Abstract:
Many of the controversies around the concept of homology rest on the subjectivity inherent to primary homology propositions. Dynamic homology partially solves this problem, but there has been up to now scant application of it outside of the molecular domain. This is probably because morphological and behavioural characters are rich in properties, connections and qualities, so that there is less space for conflicting character delimitations. Here we present a new method for the direct optimization of behavioural data, a method that relies on the richness of this database to delimit the characters, and on dynamic procedures to establish character state identity. We use between-species congruence in the data matrix and topological stability to choose the best cladogram. We test the methodology using sequences of predatory behaviour in a group of spiders that evolved the highly modified predatory technique of spitting glue onto prey. The cladogram recovered is fully compatible with previous analyses in the literature, and thus the method seems consistent. Besides the advantage of enhanced objectivity in character proposition, the new procedure allows the use of complex, context-dependent behavioural characters in an evolutionary framework, an important step towards the practical integration of the evolutionary and ecological perspectives on diversity. (C) The Willi Hennig Society 2010.
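Dynamic homology establishes character-state identity by aligning sequences rather than fixing columns a priori. As a loose illustration of the alignment cost underlying such direct optimization (not the authors' actual cost regime), a Levenshtein distance between hypothetical behavioural sequences:

```python
def edit_distance(a, b):
    """Levenshtein distance via dynamic programming
    (unit indel and substitution costs)."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + (a[i - 1] != b[j - 1]))  # match/substitute
    return d[m][n]

# Hypothetical behavioural sequences (tokens invented for illustration):
seq1 = ["orient", "approach", "spit", "bite", "wrap"]
seq2 = ["orient", "approach", "bite", "wrap"]
cost = edit_distance(seq1, seq2)
```

In direct optimization this kind of alignment cost is evaluated over a tree rather than between a single pair, but the pairwise case shows how state identity can emerge from the optimization instead of being fixed beforehand.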
Abstract:
In this work, the separation of nine phenolic acids (benzoic, caffeic, chlorogenic, p-coumaric, ferulic, gallic, protocatechuic, syringic, and vanillic acid) was approached by a 3² factorial design in electrolytes consisting of sodium tetraborate buffer (STB) in the concentration range of 10-50 mmol L⁻¹ and methanol in the volume percentage of 5-20%. Derringer's desirability functions combined globally were tested as response functions. An optimal electrolyte composed of 50 mmol L⁻¹ tetraborate buffer at pH 9.2 and 7.5% (v/v) methanol allowed baseline resolution of all phenolic acids under investigation in less than 15 min. In order to promote sample clean-up, to preconcentrate the phenolic fraction and to release esterified phenolic acids from the fruit matrix, elaborate liquid-liquid extraction procedures followed by alkaline hydrolysis were performed. The proposed methodology was fully validated (linearity from 10.0 to 100 µg mL⁻¹, R² > 0.999; LOD and LOQ from 1.32 to 3.80 µg mL⁻¹ and from 4.01 to 11.5 µg mL⁻¹, respectively; intra-day precision better than 2.8% CV for migration time and 5.4% CV for peak area; inter-day precision better than 4.8% CV for migration time and 4.8-11% CV for peak area; recoveries from 81% to 115%) and applied successfully to the evaluation of the phenolic contents of abiu-roxo (Chrysophyllum caimito), wild mulberry growing in Brazil (Morus nigra L.) and tree tomato (Cyphomandra betacea). Values in the range of 1.50-47.3 µg g⁻¹ were found, with smaller amounts occurring as free phenolic acids. (C) 2009 Elsevier B.V. All rights reserved.
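Derringer desirability functions have a standard form: each response is mapped to [0, 1] and the individual desirabilities are combined by a geometric mean. A minimal sketch of a larger-is-better transform with illustrative bounds (not the paper's actual response limits):

```python
def desirability_max(y, low, target, s=1.0):
    """Larger-is-better Derringer transform: 0 at or below `low`,
    1 at or above `target`, a power curve with exponent `s` in between."""
    if y <= low:
        return 0.0
    if y >= target:
        return 1.0
    return ((y - low) / (target - low)) ** s

def overall_desirability(ds):
    """Global desirability: geometric mean of the individual values, so a
    single zero (an unacceptable response) zeroes the whole score."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Hypothetical responses (e.g. resolutions of two critical peak pairs):
D = overall_desirability([desirability_max(1.5, 1.0, 2.0),
                          desirability_max(2.0, 1.0, 2.0)])
```

The geometric mean is what "combined globally" refers to: the optimizer then searches the factorial design space for the settings that maximize D.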
Abstract:
A method for the simultaneous determination of the stilbene resveratrol, four phenolic acids (syringic, coumaric, caffeic, and gallic acids), and five flavonoids (catechin, rutin, kaempferol, myricetin, and quercetin) in wine by CE was developed and validated. The CE electrolyte composition and instrumental conditions were optimized using a 2⁷⁻³ factorial design and response surface analysis, showing sodium tetraborate, MeOH, and their interaction to be the most influential variables. The optimal electrophoretic conditions, minimizing the chromatographic resolution statistic values, consisted of 17 mmol/L sodium tetraborate with 20% methanol as electrolyte, a constant voltage of 25 kV, hydrodynamic injection at 50 mbar for 3 s, and a temperature of 25 °C. The R² values for linearity varied from 0.994 to 0.999; LOD and LOQ were 0.1 to 0.3 mg/L and 0.4 to 0.8 mg/L, respectively. The RSDs for migration time and peak area obtained from ten consecutive injections were less than 2%, and recoveries varied from 97 to 102%. The method was applied to 23 samples of inexpensive Brazilian wines, showing wide compositional variation.
Abstract:
The aim of this study was to develop a fast capillary electrophoresis method for the determination of benzoate and sorbate ions in commercial beverages. In the method development, the pH and constituents of the background electrolyte were selected using the effective mobility versus pH curves. As the high resolution obtained experimentally for sorbate and benzoate in the studies presented in the literature is not in agreement with that expected from the published ionic mobility values, a procedure to determine these values was carried out. The salicylate ion was used as the internal standard. The background electrolyte was composed of 25 mmol L⁻¹ tris(hydroxymethyl)aminomethane and 12.5 mmol L⁻¹ 2-hydroxyisobutyric acid, at pH 8.1. Separation was conducted in a fused-silica capillary (32 cm total length and 8.5 cm effective length, 50 µm I.D.), with short-end injection configuration and direct UV detection at 200 nm for benzoate and salicylate and 254 nm for sorbate ions. The run time was only 28 s. A few figures of merit of the proposed method include: good linearity (R² > 0.999), limits of detection of 0.9 and 0.3 mg L⁻¹ for benzoate and sorbate, respectively, inter-day precision better than 2.7% (n = 9) and recovery in the range 97.9-105%. Beverage samples were prepared by simple dilution with deionized water (1:11, v/v). Concentrations in the range of 197-401 mg L⁻¹ for benzoate and 28-144 mg L⁻¹ for sorbate were found in soft drinks and tea. (C) 2008 Elsevier B.V. All rights reserved.
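The effective mobility versus pH curves used above follow from the ionized fraction of a weak acid. A minimal sketch, assuming a benzoic acid pKa of about 4.2 and a placeholder ionic mobility (not the value determined in the paper):

```python
def effective_mobility(pH, pKa, mu_anion):
    """Effective mobility of a monoprotic weak acid: the mobility of the
    fully ionised form scaled by the dissociated fraction alpha
    (Henderson-Hasselbalch)."""
    alpha = 1.0 / (1.0 + 10.0 ** (pKa - pH))
    return mu_anion * alpha

# Illustrative curve for a benzoate-like acid (pKa ~ 4.2); the mobility
# of 30 (arbitrary units) is a placeholder.
curve = [effective_mobility(pH, 4.2, 30.0) for pH in (3.0, 4.2, 8.1)]
```

At pH well above the pKa (such as the pH 8.1 electrolyte), the acid is essentially fully ionized and the effective mobility plateaus, which is what makes such curves useful for picking a separation pH.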
Abstract:
This paper describes the optimization and use of a Sequential Injection Analysis (SIA) procedure for ammonium determination in waters. Response Surface Methodology (RSM) was used as a tool for the optimization of a procedure based on the modified Berthelot reaction. The SIA system was designed to (i) prepare the reaction medium by injecting an air-segmented zone containing the reagents into a mixing chamber, (ii) aspirate the mixture back to the holding coil after homogenization, (iii) drive it to a thermostated reaction coil, where the flow is stopped for a previously established time, and (iv) pump the mixture toward the detector flow cell for the spectrophotometric measurements. Using a 100 µmol L⁻¹ ammonium solution, the following factors were considered for optimization: reaction temperature (25-45 °C), reaction time (30-90 s), hypochlorite concentration (20-40 mmol L⁻¹), nitroprusside concentration (10-40 mmol L⁻¹) and salicylate concentration (0.1-0.3 mol L⁻¹). The proposed system fed the statistical program with absorbance data for fast construction of response surface plots. After optimization of the method, figures of merit were evaluated, as well as the ammonium concentration in some water samples. No evidence of statistical difference was observed between the results obtained by the proposed method and those obtained by a reference method based on the phenol reaction. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
The aim of this study was to develop a fast capillary electrophoresis method for the determination of propranolol in pharmaceutical preparations. In the method development, the pH and constituents of the background electrolyte were selected using the effective mobility versus pH curves. Benzylamine was used as the internal standard. The background electrolyte was composed of 60 mmol L⁻¹ tris(hydroxymethyl)aminomethane and 30 mmol L⁻¹ 2-hydroxyisobutyric acid, at pH 8.1. Separation was conducted in a fused-silica capillary (32 cm total length and 8.5 cm effective length, 50 µm I.D.) with a short-end injection configuration and direct UV detection at 214 nm. The run time was only 14 s. Three different strategies were studied in order to develop a fast CE method with low total analysis time for propranolol analysis: low flush time (Lflush), 35 runs/h; without flush (Wflush), 52 runs/h; and inverted (switched) polarity (Invert), 45 runs/h. Since the three strategies developed are statistically equivalent, Wflush was selected due to its higher analytical frequency in comparison with the other methods. A few figures of merit of the proposed method include: good linearity (R² > 0.9999); limit of detection of 0.5 mg L⁻¹; inter-day precision better than 1.03% (n = 9) and recovery in the range of 95.1-104.5%. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Gelatin is a cheap and abundant natural product with very good biodegradation properties and can be used to obtain acetic acid- or LiClO₄-based gel polymer electrolytes (GPEs) with high ionic conductivity and good stability. This article presents results for GPEs obtained by the plasticization of gelatin and the addition of LiBF₄, where the optimization of the system was achieved by using a 2² factorial design with two variables: glycerol and LiBF₄. From this analysis it was found that the effect of glycerol as a plasticizer on the ionic conductivity is much more important than the effect obtained by varying the lithium salt content or the effect of the interaction of both variables. All the samples were also characterized by X-ray diffraction measurements, UV-vis-NIR spectroscopy, scanning electron microscopy (SEM) and impedance spectroscopy. The ionic conductivity of all analyzed samples as a function of temperature obeys a predominantly Arrhenius relationship, and the samples are stable up to 160 °C. Good conductivity results combined with transparency and good adhesion to the electrodes have shown that gelatin-based GPEs are very promising materials to be used as solid electrolytes in electrochromic devices. (C) 2009 Elsevier Ltd. All rights reserved.
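An Arrhenius-type conductivity relation, sigma(T) = sigma0 * exp(-Ea / (R*T)), can be fitted by linear regression of ln(sigma) against 1/T. A sketch with synthetic data from an assumed activation energy (illustrative numbers, not the paper's measurements):

```python
import math

def arrhenius_activation_energy(temps_K, sigmas):
    """Fit ln(sigma) = ln(sigma0) - (Ea/R) * (1/T) by least squares;
    the slope of the line gives the activation energy Ea."""
    R = 8.314  # gas constant, J mol^-1 K^-1
    xs = [1.0 / t for t in temps_K]
    ys = [math.log(s) for s in sigmas]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return -slope * R  # Ea in J/mol

# Synthetic conductivities from an assumed Ea of 30 kJ/mol.
Ea_true, sigma0 = 30000.0, 1e-2
temps = [290.0, 300.0, 310.0, 320.0]
sigmas = [sigma0 * math.exp(-Ea_true / (8.314 * t)) for t in temps]
Ea_est = arrhenius_activation_energy(temps, sigmas)
```

Deviation of the measured points from this straight line in ln(sigma) vs 1/T is how non-Arrhenius (e.g. VTF-type) behaviour would show up.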
Abstract:
In previous studies, we identified promising anti-Trypanosoma cruzi cruzain inhibitors based on thiazolylhydrazones. To optimize this series, a number of medicinal chemistry directions were explored and new thiazolylhydrazones and thiosemicarbazones were thus synthesized. Potent cruzain inhibitors were identified, such as thiazolylhydrazones 3b and 3j, which exhibited IC₅₀ values of 200-400 nM. Furthermore, molecular docking studies showed concordance with experimentally derived structure-activity relationship (SAR) data. In the course of this work, lead compounds exhibiting in vitro activity against both the epimastigote and trypomastigote forms of T. cruzi were identified, and in vivo general toxicity analysis was subsequently performed. Novel SAR were documented, including the importance of the thiocarbonyl carbon attached to the thiazolyl ring and the direct comparison between thiosemicarbazones and thiazolylhydrazones. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
Nowadays, in the world of mass consumption, there is big demand for distribution centers of ever larger size. Managing such a center is a very complex and difficult task, given the different processes and factors in a typical warehouse, when we want to minimize labor costs. Most of the workers' working time is spent traveling between source and destination points, which causes deadheading. Even if a worker knows the structure of a warehouse well, and can therefore find the shortest path between two points, it is still not guaranteed that there won't be long traveling times between the locations of two consecutive tasks. We need optimal assignments between tasks and workers. In the scientific literature, the Generalized Assignment Problem (GAP) is a well-known problem which deals with the assignment of m workers to n tasks subject to several constraints. The primary purpose of my thesis project was to choose a heuristic (genetic algorithm, tabu search or ant colony optimization) to be implemented into SAP Extended Warehouse Management (SAP EWM) by which task assignment between tasks and resources would become more effective. After system analysis, I realized that, due to different constraints and business demands, only 1:1 assignments are allowed in SAP EWM. Because of that, I had to use a different and simpler approach instead of the heuristics introduced above, which could nevertheless produce better assignments during the test phase in several cases. In the thesis I describe in detail the most important questions and problems which emerged during the planning of my optimized assignment method.
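In the 1:1 case that SAP EWM enforces, the GAP collapses to a linear assignment problem. A toy sketch using exhaustive search over permutations, with a hypothetical travel-time matrix (not data from the thesis); a Hungarian-method solver would be the scalable choice for realistic warehouse sizes:

```python
from itertools import permutations

def best_assignment(cost):
    """Exhaustive search over 1:1 worker-to-task assignments,
    minimising total travel time. Fine for small n only."""
    n = len(cost)
    best_total, best_perm = float("inf"), None
    for perm in permutations(range(n)):
        total = sum(cost[w][t] for w, t in enumerate(perm))
        if total < best_total:
            best_total, best_perm = total, perm
    return best_total, best_perm

# Hypothetical travel-time matrix: rows are workers, columns are tasks.
travel = [[4, 1, 3],
          [2, 0, 5],
          [3, 2, 2]]
total, assignment = best_assignment(travel)
```

Here `assignment[w]` is the task given to worker `w`; minimizing the summed travel times is exactly the deadheading reduction the abstract targets.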
Abstract:
In this project, two broad facets in the design of a methodology for performance optimization of indexable carbide inserts were examined: physical destructive testing and software simulation. For the physical testing, statistical research techniques were used for the design of the methodology. A five-step method, beginning with problem definition and proceeding through system identification, statistical model formation, data collection, and statistical analysis and results, was elaborated in depth. Set-up and execution of an experiment with a compression machine, together with roadblocks to quality data collection and possible solutions to them, were examined. A 2ᵏ factorial design was illustrated and recommended for process improvement. Instances of first-order and second-order response surface analyses were encountered. In the case of curvature, a test for curvature significance with center-point analysis was recommended. Process optimization with the method of steepest ascent and central composite design, or process robustness studies via response surface analysis, were also recommended. For the simulation test, the AdvantEdge program was identified as the most widely used software for tool development. Challenges to the efficient application of this software were identified and possible solutions proposed. In conclusion, software simulation and physical testing were both recommended to meet the objective of the project.
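The main-effect estimates of a 2ᵏ full factorial design are simple contrasts of run averages. A sketch for k = 3; the coded responses follow an invented linear model, not the project's measurements:

```python
from itertools import product

def main_effects(k, response):
    """Main-effect estimates from a full 2^k factorial design.
    `response` maps each tuple of coded levels (-1/+1) to its value;
    the effect of a factor is mean(high runs) - mean(low runs)."""
    runs = list(product((-1, 1), repeat=k))
    effects = []
    for factor in range(k):
        hi = [response[r] for r in runs if r[factor] == 1]
        lo = [response[r] for r in runs if r[factor] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Hypothetical coded responses following y = 5 + 2A - B (C inactive):
resp = {r: 5 + 2 * r[0] - r[1] for r in product((-1, 1), repeat=3)}
effects = main_effects(3, resp)
```

A factor whose effect is near zero (like C here) is a candidate for dropping before moving on to steepest ascent or a central composite design.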
Abstract:
Combinatorial optimization problems are one of the most important types of problems in operational research. Heuristic and metaheuristic algorithms are widely applied to find good solutions. However, a common problem is that these algorithms do not guarantee that the solution will coincide with the optimum and, hence, many solutions to real-world OR problems are afflicted with an uncertainty about the quality of the solution. The main aim of this thesis is to investigate the usability of statistical bounds to evaluate the quality of heuristic solutions applied to large combinatorial problems. The contributions of this thesis are both methodological and empirical. From a methodological point of view, the usefulness of statistical bounds on p-median problems is thoroughly investigated. The statistical bounds have good performance in providing informative quality assessment under appropriate parameter settings. Also, they outperform the commonly used Lagrangian bounds. The statistical bounds are further shown to be comparable with the deterministic bounds in quadratic assignment problems. As to empirical research, environmental pollution has become a worldwide problem, and transportation can cause a great amount of pollution. A new method for calculating and comparing the CO₂ emissions of online and brick-and-mortar retailing is proposed; it leads to the conclusion that online retailing has significantly lower CO₂ emissions. Another problem is that the Swedish regional division is under revision, and the border effect on public service accessibility concerns both residents and politicians. The analysis shows that borders hinder the optimal location of public services, and consequently the highest achievable economic and social utility may not be attained.
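The kind of heuristic whose solution quality such statistical bounds assess can be illustrated with a toy greedy p-median heuristic on 1-D demand points (purely illustrative; not the thesis's instances or its bound computation):

```python
def greedy_p_median(points, p):
    """Greedy p-median heuristic: repeatedly open the facility that most
    reduces the total distance from each demand point to its nearest
    open facility. Greedy gives no optimality guarantee, which is why
    one wants a bound on the gap to the (unknown) optimum."""
    medians = []
    total = float("inf")
    for _ in range(p):
        best_c, best_cost = None, float("inf")
        for c in points:
            if c in medians:
                continue
            cost = sum(min(abs(x - m) for m in medians + [c]) for x in points)
            if cost < best_cost:
                best_c, best_cost = c, cost
        medians.append(best_c)
        total = best_cost
    return medians, total

# Toy instance with two natural clusters of demand points.
medians, cost = greedy_p_median([0, 1, 2, 10, 11, 12], p=2)
```

On large instances the heuristic's objective value alone says nothing about how far it sits from the optimum; the statistical bounds investigated in the thesis are one way to quantify that gap.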