968 results for Motzkin Decomposition


Relevance:

20.00%

Publisher:

Abstract:

Process systems design, operation, and synthesis problems under uncertainty can readily be formulated as two-stage stochastic mixed-integer linear and (nonconvex) nonlinear programming (MILP and MINLP) problems. With a scenario-based formulation, these problems lead to large-scale MILPs/MINLPs that are well structured. The first part of the thesis proposes a new finitely convergent cross decomposition (CD) method, in which Benders decomposition (BD) and Dantzig-Wolfe decomposition (DWD) are combined in a unified framework to improve the solution of scenario-based two-stage stochastic MILPs. The method alternates between DWD iterations and BD iterations, where DWD restricted master problems and BD primal problems yield a sequence of upper bounds, and BD relaxed master problems yield a sequence of lower bounds. A variant of CD, called multicolumn-multicut CD, which adds multiple columns per iteration of the DWD restricted master problem and multiple cuts per iteration of the BD relaxed master problem, is then developed to improve solution time. Finally, an extended cross decomposition (ECD) method for solving two-stage stochastic programs with risk constraints is proposed, in which CD at the first level and DWD at a second level are used to solve the original problem to optimality. ECD has a computational advantage over a bilevel decomposition strategy and over solving the monolithic problem with an MILP solver. The second part of the thesis develops a joint decomposition approach combining Lagrangian decomposition (LD) and generalized Benders decomposition (GBD) to efficiently solve stochastic mixed-integer nonconvex nonlinear programming problems to global optimality, without the need for explicit branch-and-bound search. In this approach, LD subproblems and GBD subproblems are solved systematically within a single framework. The relaxed master problem, obtained from a reformulation of the original problem, is solved only when necessary. A convexification of the relaxed master problem and a domain-reduction procedure are integrated into the decomposition framework to improve solution efficiency. Case studies taken from renewable-resource and fossil-fuel based applications in process systems engineering show that these novel decomposition approaches have significant benefits over classical decomposition methods and state-of-the-art MILP/MINLP global optimization solvers.
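
As a concrete illustration of the bounding logic that both BD and the proposed CD scheme rely on, here is a minimal Benders loop on a hypothetical two-scenario toy LP (all data, names, and the closed-form subproblem dual below are illustrative, not taken from the thesis):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy problem (illustrative data, not from the thesis):
#   min  x + E_s[ 2*y_s ]   s.t.  y_s >= d_s - x,  0 <= x <= 10,  y_s >= 0
d = np.array([4.0, 8.0])   # scenario demands
p = np.array([0.5, 0.5])   # scenario probabilities

def scenario_subproblem(x, d_s):
    """min 2y s.t. y >= d_s - x, y >= 0; for this toy problem the optimal
    dual of the demand constraint is available in closed form."""
    lam = 2.0 if d_s > x else 0.0
    return 2.0 * max(d_s - x, 0.0), lam

cuts_A, cuts_b = [], []          # master rows encode  th >= const - coef*x
ub, lb = np.inf, -np.inf
for it in range(20):
    # Relaxed master over (x, th): min x + th subject to accumulated cuts
    res = linprog(c=[1.0, 1.0],
                  A_ub=np.array(cuts_A) if cuts_A else None,
                  b_ub=np.array(cuts_b) if cuts_b else None,
                  bounds=[(0.0, 10.0), (0.0, None)], method="highs")
    x_hat, lb = res.x[0], res.fun                  # master value: lower bound
    vals, lams = zip(*(scenario_subproblem(x_hat, ds) for ds in d))
    ub = min(ub, x_hat + float(np.dot(p, vals)))   # feasible point: upper bound
    coef = float(np.dot(p, lams))
    const = float(np.dot(p, np.array(lams) * d))
    cuts_A.append([-coef, -1.0])                   # th >= const - coef*x
    cuts_b.append(-const)
    print(f"iter {it}: x={x_hat:.2f}  LB={lb:.2f}  UB={ub:.2f}")
    if ub - lb <= 1e-6:
        break
```

At optimality the upper bound from the scenario subproblems meets the lower bound from the relaxed master; CD accelerates this scheme by interleaving DWD restricted-master iterations that tighten the upper bound, as described above.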

Relevance:

20.00%

Publisher:

Abstract:

Subtle structural differences can be observed in the islets of Langerhans region of microscopic images of pancreas cells from rats with normal glucose tolerance and from rats in a pre-diabetic (glucose-intolerant) condition. This paper proposes a way to automatically segment the islets of Langerhans region from the histological image of a rat's pancreas cell and, on the basis of morphological features extracted from the segmented region, to classify the images as normal or pre-diabetic. The experiment is performed on a set of 134 images, of which 56 are of the normal type and the remaining 78 are pre-diabetic. The work has two stages: first, segmentation of the region of interest (ROI), i.e. the islets of Langerhans, from the pancreatic cell image; second, extraction of morphological features from the region of interest for classification. Wavelet analysis and connected component analysis are used for automatic segmentation of the images. Several classifiers, including OneRule, Naïve Bayes, MLP, J48 Tree, and SVM, are evaluated, among which the MLP performs best.
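
A minimal sketch of such a wavelet-plus-connected-components segmentation stage, assuming grayscale input, the PyWavelets and scikit-image libraries, and the hypothetical (stain-dependent) assumption that the islet appears darker than the surrounding tissue; the paper's actual pipeline and feature set may differ:

```python
import numpy as np
import pywt
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def segment_islet(gray_img, wavelet="db2", level=2):
    """ROI segmentation sketch: wavelet low-pass approximation to suppress
    fine texture, Otsu threshold, then keep the largest connected component."""
    coeffs = pywt.wavedec2(gray_img, wavelet, level=level)
    # Zero the detail coefficients, keeping only the coarse approximation
    coeffs[1:] = [tuple(np.zeros_like(d) for d in c) for c in coeffs[1:]]
    approx = pywt.waverec2(coeffs, wavelet)[:gray_img.shape[0], :gray_img.shape[1]]
    mask = approx < threshold_otsu(approx)   # hypothetical: islet darker than acini
    regions = regionprops(label(mask))
    largest = max(regions, key=lambda r: r.area)
    # Example morphological features for a downstream classifier
    feats = {"area": largest.area, "eccentricity": largest.eccentricity,
             "solidity": largest.solidity}
    return largest, feats
```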

Relevance:

20.00%

Publisher:

Abstract:

Clomazone (2-[(2-chlorophenyl)methyl]-4,4-dimethyl-3-isoxazolidinone) is a post-emergence herbicide widely used in rice fields in Rio Grande do Sul (Brazil), with high activity against Gramineae at the recommended application rate of 700 g/ha. The presence of this chemical in the water may affect the microorganisms responsible for the decomposition of organic matter and thereby disturb the trophic chain sustained by these decomposers. In the present work, the decomposition rate of organic matter (Typha latifolia) exposed to several concentrations of a clomazone formulation was evaluated: 0 (control), 25.0, 62.0, 156.0, 390.0, and 976.0 mg/L on an active-ingredient basis. Five litter bags, each containing about 3.0 g of pieces of T. latifolia leaves, were placed in aquariums with 15 L of reconstituted water. To each aquarium, 500 g of sediment from the same place as the plant collection was added as a source of decomposer microorganisms. Relative to the control, the decomposition rate at the highest and lowest doses was reduced by 50.05% and 1.28%, respectively, after 80 days.

Relevance:

20.00%

Publisher:

Abstract:

In recent years a great effort has been put into the development of new techniques for automatic object classification, driven in part by applications such as medical imaging and driverless cars. To this end, several mathematical models have been developed, from logistic regression to neural networks. A crucial aspect of these so-called classification algorithms is the use of algebraic tools to represent and approximate the input data. In this thesis, we examine two models for image classification based on a particular tensor decomposition, the Tensor-Train (TT) decomposition. Tensor approaches preserve the multidimensional structure of the data and the neighboring relations among pixels. Furthermore, unlike other tensor decompositions, the Tensor-Train does not suffer from the curse of dimensionality, making it an extremely powerful strategy for high-dimensional data. It also allows data compression when combined with truncation strategies that reduce memory requirements without spoiling classification performance. The first model we propose is based on a direct decomposition of the database by means of the TT decomposition to find basis vectors used to classify a new object. The second is a tensor dictionary learning model, based on the TT decomposition, in which the terms of the decomposition are estimated using a proximal alternating linearized minimization algorithm with a spectral stepsize.
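
For reference, the classical TT-SVD construction builds the cores by sequential truncated SVDs of the tensor unfoldings; a minimal NumPy sketch (the test tensor and truncation threshold are illustrative):

```python
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Decompose a d-way array into Tensor-Train cores via sequential SVDs
    (classical TT-SVD; eps is a relative singular-value truncation threshold)."""
    shape = tensor.shape
    d = len(shape)
    cores, rank = [], 1
    mat = tensor.reshape(rank * shape[0], -1)
    for k in range(d - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r_new = max(1, int(np.sum(S > eps * S[0])))   # truncated TT-rank
        cores.append(U[:, :r_new].reshape(rank, shape[k], r_new))
        mat = (S[:r_new, None] * Vt[:r_new]).reshape(r_new * shape[k + 1], -1)
        rank = r_new
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

# Round-trip check on a random 4-way tensor
X = np.random.rand(4, 5, 6, 3)
cores = tt_svd(X)
Y = cores[0]
for G in cores[1:]:
    Y = np.tensordot(Y, G, axes=([Y.ndim - 1], [0]))  # contract the TT bond index
print(np.allclose(Y.reshape(X.shape), X))  # True up to truncation error
```

The truncation threshold controls the TT-ranks and hence the storage cost, which is exactly the compression/accuracy trade-off mentioned above.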

Relevance:

20.00%

Publisher:

Abstract:

Noise is a constant presence in measurements. Its origin is related to the microscopic properties of matter. Since the seminal work of Brown in 1828, the study of stochastic processes has gained increasing interest with the development of new mathematical and analytical tools. In recent decades, the central role that noise plays in chemical and physiological processes has become recognized. The dual role of noise as nuisance/resource pushes towards the development of new decomposition techniques that divide a signal into its deterministic and stochastic components. In this thesis I show how methods based on Singular Spectrum Analysis (SSA) have the right properties to fulfil this requirement. During my work I applied SSA to different signals of interest in chemistry: I developed a novel iterative procedure for the denoising of powder X-ray diffractograms, and I "denoised" two-dimensional images from electrochemiluminescence (ECL) imaging experiments on micro-beads, obtaining new insight into the ECL mechanism. I also used Principal Component Analysis to investigate the relationship between brain electrophysiological signals and voice emission.
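
The basic SSA pipeline (embedding into a trajectory matrix, SVD, grouping of components, diagonal averaging) can be sketched as follows; the window length, grouping, and test signal are illustrative choices, not those used in the thesis:

```python
import numpy as np

def ssa_decompose(x, L, groups):
    """Basic Singular Spectrum Analysis: embed, SVD, group, diagonal-average.
    x: 1-D signal; L: window length; groups: lists of component indices."""
    N = len(x)
    K = N - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])   # trajectory matrix (L x K)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    series = []
    for g in groups:
        Xg = (U[:, g] * S[g]) @ Vt[g]                     # grouped component matrix
        # Diagonal (Hankel) averaging back to a 1-D series
        comp = np.array([np.mean(Xg[::-1, :].diagonal(k)) for k in range(-L + 1, K)])
        series.append(comp)
    return series

# Example: separate a slow sinusoid from additive noise (hypothetical signal)
t = np.linspace(0, 10, 500)
x = np.sin(2 * np.pi * 0.5 * t) + 0.3 * np.random.randn(500)
trend, residual = ssa_decompose(x, L=100, groups=[[0, 1], list(range(2, 100))])
```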

Relevance:

20.00%

Publisher:

Abstract:

The main contribution of this thesis is the proposal of novel strategies for the selection of the parameters arising in variational models employed for the solution of inverse problems with data corrupted by Poisson noise. In light of the importance of using a significantly reduced dose of X-rays in Computed Tomography (CT), and the consequent need for advanced reconstruction techniques to cope with the high level of noise in the data, we focus on parameter selection principles especially suited to low photon counts, i.e. low-dose CT. For completeness, since such strategies can be adopted in various scenarios where the noise in the data typically follows a Poisson distribution, we also show their performance in other applications, such as photography, astronomical imaging, and microscopy. More specifically, in the first part of the thesis we focus on low-dose CT data corrupted only by Poisson noise, extending automatic selection strategies designed for Gaussian noise and improving the few existing ones for Poisson noise. The new approaches are shown to outperform the state-of-the-art competitors, especially in the low-count regime. Moreover, we extend the best-performing strategy to the harder task of multi-parameter selection, with promising results. Finally, in the last part of the thesis, we introduce the problem of material decomposition for hyperspectral CT, whose data encode how the different materials in the target attenuate X-rays in different ways according to the specific energy. We conduct a preliminary comparative study aimed at accurate material decomposition starting from few noisy projection data.
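
To illustrate the kind of principle involved: a common rule for Poisson data selects the parameter at which the generalized Kullback-Leibler discrepancy between the counts and the model matches its approximate expected value, about half the number of data points. The sketch below applies this rule to a simple 1-D smoothing family; the signal, the Gaussian-smoothing parameterization, and the n/2 target are illustrative assumptions, not the variational models of the thesis:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def kl_discrepancy(y, mu):
    """Generalized KL divergence between Poisson counts y and model mean mu."""
    mu = np.maximum(mu, 1e-12)
    with np.errstate(divide="ignore", invalid="ignore"):
        term = np.where(y > 0, y * np.log(y / mu), 0.0)
    return float(np.sum(term - y + mu))

# Hypothetical 1-D Poisson-count signal: smooth profile plus shot noise
rng = np.random.default_rng(0)
truth = 50 * np.exp(-0.5 * ((np.arange(200) - 100) / 20) ** 2) + 5
y = rng.poisson(truth).astype(float)

# Discrepancy principle: pick the smoothing level whose KL discrepancy
# is closest to its approximate expectation under the Poisson model (~ n/2)
target = y.size / 2.0
widths = np.linspace(0.1, 10, 60)
disc = [kl_discrepancy(y, gaussian_filter(y, s)) for s in widths]
best = widths[np.argmin(np.abs(np.array(disc) - target))]
print(f"selected smoothing width: {best:.2f}")
```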

Relevance:

20.00%

Publisher:

Abstract:

In this thesis, the viability of Dynamic Mode Decomposition (DMD) as a technique to analyze and model complex dynamic real-world systems is presented. This method derives, directly from data, computationally efficient reduced-order models (ROMs) that can replace high-fidelity physics-based models which are too onerous to run or simply unavailable. Optimizations and extensions to the standard implementation of the methodology are proposed and investigated on diverse case studies related to the decoding of complex flow phenomena. The flexibility of this data-driven technique allows its application to high-fidelity fluid dynamics simulations as well as to time series of observations of real systems. The resulting ROMs are tested on two tasks: (i) reducing the storage requirements of high-fidelity simulations or observations; (ii) interpolating and extrapolating missing data. The capabilities of DMD can also be exploited to alleviate the cost of onerous studies that require many simulations, such as uncertainty quantification analysis, especially when dealing with complex high-dimensional systems. In this context, a novel approach to address parameter variability when modeling systems with a space- and time-variant response is proposed. Specifically, DMD is merged with another model-reduction technique, the Polynomial Chaos Expansion, for uncertainty quantification purposes. Useful guidelines for DMD deployment result from the study, together with a demonstration of its potential to ease diagnosis and scenario analysis when complex flow processes are involved.
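
For reference, the standard "exact DMD" algorithm that such extensions build on fits a best-fit linear operator between successive snapshot matrices; a minimal sketch on synthetic data (the traveling wave, sizes, and truncation rank are illustrative):

```python
import numpy as np

def dmd(X, Y, r):
    """Exact DMD: X = [x_0..x_{m-1}], Y = [x_1..x_m] as columns, truncated
    to r POD modes; returns DMD eigenvalues and spatial modes."""
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    Ur, Sr, Vr = U[:, :r], S[:r], Vt[:r].conj().T
    Atilde = Ur.conj().T @ Y @ Vr / Sr      # low-rank operator in POD coordinates
    eigvals, W = np.linalg.eig(Atilde)
    modes = (Y @ Vr / Sr) @ W               # exact DMD modes
    return eigvals, modes

# Hypothetical demo: a decaying traveling wave recovered from snapshot data
n, m = 128, 60
xg = np.linspace(0, 2 * np.pi, n)
snapshots = np.array([np.exp(-0.02 * k) * np.sin(xg - 0.1 * k) for k in range(m)]).T
eigvals, modes = dmd(snapshots[:, :-1], snapshots[:, 1:], r=2)
print(np.abs(eigvals))   # magnitudes below 1 reflect the temporal decay
```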

Relevance:

20.00%

Publisher:

Abstract:

The decomposition of Feynman integrals into a basis of independent master integrals is an essential ingredient of high-precision theoretical predictions, and it often represents a major bottleneck when processes with a high number of loops and legs are involved. In this thesis we present a new algorithm for the decomposition of Feynman integrals into master integrals within the formalism of intersection theory. Intersection theory is a novel approach that makes it possible to decompose Feynman integrals into master integrals via projections, based on a scalar product between Feynman integrals called the intersection number. We propose a new, purely rational algorithm for the calculation of intersection numbers of differential $n$-forms that avoids the appearance of algebraic extensions. We show how expansions around non-rational poles, which are a bottleneck of existing algorithms for intersection numbers, can be avoided by performing a series expansion around a rational polynomial irreducible over $\mathbb{Q}$, which we refer to as a $p(z)$-adic expansion. The algorithm has been implemented and tested on several diagrams, both at one and two loops.
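
Schematically, the projection underlying this decomposition can be written as follows (generic notation from the intersection-number literature, not tied to a specific diagram):

$$ I = \sum_{i=1}^{\nu} c_i\, J_i, \qquad c_i = \sum_{j=1}^{\nu} \langle I | h_j \rangle \left(\mathbf{C}^{-1}\right)_{ji}, \qquad \mathbf{C}_{ij} = \langle J_i | h_j \rangle, $$

where $\{J_i\}$ are the master integrals, $\{h_j\}$ is a dual basis of forms, and $\langle \cdot | \cdot \rangle$ denotes the intersection number.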

Relevance:

20.00%

Publisher:

Abstract:

The Iwasawa decomposition is a particular decomposition of a semisimple Lie group G whose factors are closed subgroups of G. This thesis analyzes in detail the Iwasawa decomposition of the group SL(2,R), giving some relevant algebraic and topological applications (Chapter 2). Among these are, for example, the homeomorphism of SL(2,R) with the interior of a solid torus and the existence of a unique continuous homomorphism (the trivial one) from SL(2,R) to the additive group of R. The conjugacy classes of the elements of SL(2,R) are then studied in terms of the subgroups appearing in the Iwasawa decomposition, as is the action of SL(2,R) on the upper half-plane of the complex plane, from which an alternative proof of the Iwasawa decomposition can be derived. Chapter 1 collects the introductory material needed for a self-contained reading of the thesis; in particular, it introduces the topological groups GL(n,F) and SL(n,F) (where F denotes a field), the action of a group on a set, and the matrix exponential. Finally, Chapter 3 extends the proof of the Iwasawa decomposition to the groups SL(n,R) and SL(n,C).
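
Concretely, the statement analyzed in the thesis reads, in the usual notation,

$$ SL(2,\mathbb{R}) = KAN, \qquad K = \mathrm{SO}(2), \qquad A = \left\{ \begin{pmatrix} r & 0 \\ 0 & r^{-1} \end{pmatrix} : r > 0 \right\}, \qquad N = \left\{ \begin{pmatrix} 1 & x \\ 0 & 1 \end{pmatrix} : x \in \mathbb{R} \right\}, $$

with every $g \in SL(2,\mathbb{R})$ factoring uniquely as $g = kan$.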

Relevance:

10.00%

Publisher:

Abstract:

This work describes the evaluation of metals and (metallo)proteins in vitreous humor samples and their correlations with some biological aspects at different post-mortem intervals (1-7 days), taking into account both decomposing and non-decomposing bodies. After qualitative evaluation of the samples covering 26 elements, representative metal ions (Fe, Mg, and Mo) are determined by inductively coupled plasma mass spectrometry, using a mini-vial decomposition system for sample preparation. A significant trend with post-mortem time is found for Fe in decomposing bodies, owing to a significant increase in iron concentration between samples from bodies with post-mortem intervals of 3 and 7 days. An important avenue for elucidating the role of metals is the coupling of liquid chromatography with inductively coupled plasma mass spectrometry to identify metals bound to proteins, together with mass spectrometry to identify the proteins involved in the post-mortem interval.

Relevance:

10.00%

Publisher:

Abstract:

Over the last decade, the combined use of chemometrics and molecular spectroscopic techniques has become a new alternative for direct drug determination without the need for physical separation. Among the new methodologies developed, the application of PARAFAC to the decomposition of spectrofluorimetric data deserves to be highlighted. The first objective of this article is to describe the theoretical basis of PARAFAC; to that end, a discussion of the order of the chemometric methods used in multivariate calibration and of the development of multi-way methods is presented first. The other objective is to present to the Brazilian chemical community the potential of the PARAFAC/spectrofluorimetry combination for the determination of drugs in complex biological matrices. For this purpose, two applications, aimed at determining doxorubicin and salicylate in human plasma, respectively, are presented.
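
Schematically, PARAFAC fits the trilinear model $x_{ijk} \approx \sum_r a_{ir} b_{jr} c_{kr}$ to an excitation-emission data cube, so each factor matrix can be read directly as a set of concentration, excitation, or emission profiles. A minimal sketch using the TensorLy library on synthetic EEM-like data (the data cube, rank, and all numbers are illustrative, not from the article):

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

# Hypothetical data cube: samples x excitation x emission, built from two
# spectrally overlapped "fluorophores" plus a little noise
rng = np.random.default_rng(1)
ex = np.exp(-0.5 * ((np.arange(50)[:, None] - np.array([15, 30])) / 5) ** 2)
em = np.exp(-0.5 * ((np.arange(80)[:, None] - np.array([30, 55])) / 8) ** 2)
conc = rng.uniform(0, 1, (20, 2))          # per-sample concentrations
X = np.einsum('ir,jr,kr->ijk', conc, ex, em)
X += 0.01 * rng.standard_normal((20, 50, 80))

# Trilinear PARAFAC model: X_ijk ~ sum_r a_ir b_jr c_kr
weights, (A, B, C) = parafac(tl.tensor(X), rank=2, normalize_factors=True)
# Column r of A tracks the (relative) concentration of analyte r per sample
```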

Relevance:

10.00%

Publisher:

Abstract:

We estimated litter production and leaf decomposition rate in a cerradão area, a physiognomy that is little studied and highly threatened in São Paulo State. During the study period, litter production was 5646.9 kg.ha-1.year-1, of which the 'leaf' fraction corresponded to 4081.2 kg.ha-1.year-1; the 'branch' fraction, to 1066.1 kg.ha-1.year-1; the 'reproductive structures' fraction, to 434.1 kg.ha-1.year-1; and the 'miscellaneous' fraction, to 65.5 kg.ha-1.year-1. Litter production was highly seasonal and negatively correlated with relative humidity and air temperature. Leaf production was negatively correlated with relative humidity, rainfall, and air temperature. There was no significant difference between the litter production found in this study and that of two other sites with cerradão and semideciduous forest, but these physiognomies differed significantly from the cerrado sensu stricto. The leaf decomposition rate (K) was 0.56. The half-life of the decomposing material was 1.8 years and the turnover time was 2.3 years.
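
For reference, litter-bag studies of this kind usually summarize mass loss with the single-exponential (Olson) decay model, which relates the decay constant K to the half-life and turnover time; the textbook relations are quoted here for context, not as a recomputation of the study's figures:

$$ X(t) = X_0\, e^{-Kt}, \qquad t_{1/2} = \frac{\ln 2}{K}, \qquad \tau_{\mathrm{turnover}} = \frac{1}{K}. $$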

Relevance:

10.00%

Publisher:

Abstract:

Universidade Estadual de Campinas. Faculdade de Educação Física

Relevance:

10.00%

Publisher:

Abstract:

Radioactive decay is a random process, and the estimation of all associated measurements is governed by statistical laws. Count-rate profiles are always "noisy" when short periods, such as one second per measurement, are used. The filters applied, and the corrections subsequently made in current gamma-ray spectrometric data processing, are not sufficient to remove or considerably reduce the noise originating from the spectrum. Two statistical methods that act directly on the collected data, i.e. on the spectra, have been suggested in the literature to remove and minimize this remaining noise: Noise-Adjusted Singular Value Decomposition (NASVD) and Maximum Noise Fraction (MNF). These methods produce a significant noise reduction. In this work they were implemented within the processing environment of the Oasis Montaj software and applied to the area comprising blocks I and II of the airborne geophysical survey covering the western portion of the Tapajós Mineral Province, between the states of Pará and Amazonas. The data filtered and unfiltered with the NASVD and MNF techniques were processed with the parameters and constants supplied by Lasa Engenharia e Prospecções S.A. and then compared. The comparison of profiles and maps yielded promising results, with a gain in the resolution of the products.
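
A minimal sketch of the NASVD idea follows: rescale the spectra so the Poisson noise is approximately uniform across channels, keep only the leading SVD components, and undo the scaling. The component count and array shapes are illustrative; production implementations such as the one used in Oasis Montaj include further refinements:

```python
import numpy as np

def nasvd_filter(spectra, n_components=8):
    """Noise-Adjusted SVD low-rank filtering of gamma-ray spectra.
    spectra: (n_samples, n_channels) array of counts per channel."""
    mean = spectra.mean(axis=0)
    scale = np.sqrt(np.maximum(mean, 1e-12))   # Poisson noise ~ sqrt(mean counts)
    S_adj = spectra / scale                    # whiten channel-wise noise
    U, s, Vt = np.linalg.svd(S_adj, full_matrices=False)
    low_rank = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    return low_rank * scale                    # back to count space

# Hypothetical usage on an (n_spectra x 256 channel) airborne survey block:
# filtered = nasvd_filter(raw_spectra, n_components=8)
```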

Relevance:

10.00%

Publisher:

Abstract:

The use of thermoanalytical data in sample preparation is described as a tool to draw students' attention to details that can simplify both the analysis and the analytical procedure. In this case, the thermal decomposition of eggshells was first investigated by thermogravimetry (TGA). Although classical procedures call for long exposure to high temperatures, the TGA data showed that the decomposition of the organic matter takes place immediately when the sample is heated to 800 °C under an air atmosphere. After decomposition, the calcium content was determined by flame atomic emission photometry and compared with the results obtained by classical volumetric titration with EDTA.
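
For context (a textbook figure, not a result from the article): eggshell is largely calcium carbonate, whose carbonate decomposition step appears in the TGA curve at higher temperatures with a characteristic mass loss,

$$ \mathrm{CaCO_3(s)} \longrightarrow \mathrm{CaO(s)} + \mathrm{CO_2(g)}, \qquad \frac{\Delta m}{m} = \frac{M_{\mathrm{CO_2}}}{M_{\mathrm{CaCO_3}}} = \frac{44.01}{100.09} \approx 44\%. $$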