932 results for Convex combination


Relevance: 20.00%

Publisher:

Abstract:

In this paper, we develop a novel constrained recursive least squares algorithm for adaptively combining a set of given multiple models. With data available in an online fashion, the linear combination coefficients of the submodels are adapted via the proposed algorithm. We propose to minimize the mean square error with a forgetting factor and apply a sum-to-one constraint to the combination parameters. Moreover, an l1-norm constraint on the combination parameters is also applied with the aim of achieving sparsity over the multiple models, so that only a subset of models may be selected into the final model. A weighted l2-norm is then applied as an approximation to the l1-norm term, so that at each time step a closed-form solution for the model combination parameters is available. The contribution of this paper is to derive the proposed constrained recursive least squares algorithm, which is computationally efficient by exploiting matrix theory. The effectiveness of the approach has been demonstrated using both simulated and real time series examples.
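The sketch below is not the authors' algorithm, only a minimal illustration of the core idea: a recursive least squares update with a forgetting factor for the combination weights, followed by an equality-constrained correction that enforces the sum-to-one constraint. The sparsity-inducing l1/weighted-l2 term is omitted, and all names and parameter values (forgetting factor, initialization) are assumptions.

import numpy as np

# Minimal sketch of a forgetting-factor RLS update for combination weights,
# followed by a sum-to-one correction. Not the paper's exact algorithm
# (the l1/weighted-l2 sparsity term is omitted); names are illustrative.

class CombinationRLS:
    def __init__(self, n_models, lam=0.99, delta=100.0):
        self.lam = lam                              # forgetting factor
        self.P = delta * np.eye(n_models)           # inverse correlation matrix
        self.w = np.full(n_models, 1.0 / n_models)  # start from equal weights

    def update(self, x, y):
        """x: vector of submodel predictions, y: observed target."""
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)                # gain vector
        e = y - x @ self.w                          # a priori error
        self.w = self.w + k * e                     # unconstrained RLS step
        self.P = (self.P - np.outer(k, Px)) / self.lam
        # enforce the sum-to-one constraint (equality-constrained LS correction)
        ones = np.ones_like(self.w)
        P1 = self.P @ ones
        self.w = self.w + P1 * (1.0 - ones @ self.w) / (ones @ P1)
        return self.w

A typical use would feed, at each time step, the vector of submodel predictions x together with the observed target y into update(), and combine the submodels with the returned weights.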

Relevance: 20.00%

Publisher:

Abstract:

The therapeutic efficacies of amphotericin B and voriconazole, alone and in combination, were evaluated in immunodeficient mice (BALB/c-SCID) infected with a fluconazole-resistant strain of Cryptococcus neoformans var. grubii. The animals were infected intravenously with 3 × 10⁵ cells and treated intraperitoneally with amphotericin B (1.5 mg/kg/day), voriconazole (40 mg/kg/day), or their combination. Treatment began 1 day after inoculation and continued for 7 or 15 days post-inoculation. The treatments were evaluated by survival curves and yeast quantification (CFUs) in brain and lung tissues. Treatments for 15 days significantly promoted the survival of the animals compared to the control groups. Our results indicated that amphotericin B alone ensured the longest survival of infected animals, but these animals still harbored the highest CFU counts of C. neoformans in the lungs and brain at the end of the experiment. Voriconazole was not as effective alone, but in combination with amphotericin B it prolonged survival for the second-longest period and provided the lowest colonization of the target organs by the fungus. None of the treatments completely eradicated the fungus from the lungs and brain of the mice by the end of the experiment.

Relevance: 20.00%

Publisher:

Abstract:

Many real-world problems involve the classification of data into categories or classes. Given a data set containing data whose classes are known, Machine Learning algorithms can be employed to induce a classifier able to predict the class of new data from the same domain, performing the desired discrimination. Some learning techniques are originally conceived for the solution of problems with only two classes, also known as binary classification problems. However, many problems require the discrimination of examples into more than two categories or classes. This paper presents a survey of the main strategies for generalizing binary classifiers to problems with more than two classes, known as multiclass classification problems. The focus is on strategies that decompose the original multiclass problem into multiple binary subtasks, whose outputs are combined to obtain the final prediction.
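As an illustration of the decomposition strategies such a survey covers, the sketch below implements a plain one-vs-rest (one-against-all) scheme on top of an arbitrary binary learner; the fit/decision_function interface of the base learner is an assumed convention, not notation from the paper.

import numpy as np

# Illustrative one-vs-rest decomposition: one binary classifier per class,
# final prediction by the highest decision score.

class OneVsRest:
    def __init__(self, make_binary_classifier):
        self.make_binary_classifier = make_binary_classifier
        self.classes_ = None
        self.models_ = []

    def fit(self, X, y):
        y = np.asarray(y)
        self.classes_ = np.unique(y)
        self.models_ = []
        for c in self.classes_:
            clf = self.make_binary_classifier()
            clf.fit(X, (y == c).astype(int))   # class c vs. all the others
            self.models_.append(clf)
        return self

    def predict(self, X):
        scores = np.column_stack([m.decision_function(X) for m in self.models_])
        return self.classes_[np.argmax(scores, axis=1)]

In a one-vs-one scheme, one binary classifier would instead be trained for each pair of classes and the individual outputs combined by voting.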

Relevance: 20.00%

Publisher:

Abstract:

Photodynamic therapy, used mainly for cancer treatment and microorganism inactivation, is based on the production of reactive oxygen species upon light irradiation of a sensitizer. Hematoporphyrin derivatives such as Photofrin® (PF), Photogem® (PG) and Photosan® (PS), and chlorin-e6 derivatives such as Photodithazine® (PZ), have suitable sensitizing properties. The present study provides a way to make a rapid preliminary evaluation of photosensitizer efficacy by a combination of techniques: (a) the use of bovine serum albumin and uric acid as chemical dosimeters; (b) photo-hemolysis of red blood cells as a cell-membrane interaction model; and (c) octanol/phosphate buffer partition to assess the relative lipophilicity of the compounds. The results suggest the photodynamic efficiency ranking PZ > PG ≥ PF > PS. These results agree with the cytotoxicity of the photosensitizers as well as with the chromatographic separation of the HpDs, both performed in our group, showing that the more lipophilic the dye, the more acute the damage to the RBC membrane and the oxidation of the indole group, which is immersed in the hydrophobic region of albumin.

Relevance: 20.00%

Publisher:

Abstract:

In this paper we present a novel approach to multispectral image contextual classification by combining iterative combinatorial optimization algorithms. The pixel-wise decision rule is defined using a Bayesian approach that combines two MRF models: a Gaussian Markov Random Field (GMRF) for the observations (likelihood) and a Potts model for the a priori knowledge, to regularize the solution in the presence of noisy data. Hence, the classification problem is stated within a Maximum a Posteriori (MAP) framework. In order to approximate the MAP solution we apply several combinatorial optimization methods with multiple simultaneous initializations, making the solution less sensitive to the initial conditions and reducing both computational cost and time in comparison with Simulated Annealing, which is often infeasible in many real image processing applications. The Markov Random Field model parameters are estimated by the Maximum Pseudo-Likelihood (MPL) approach, avoiding manual adjustment of the regularization parameters. Asymptotic evaluations assess the accuracy of the proposed parameter estimation procedure. To test and evaluate the proposed classification method, we adopt metrics for quantitative performance assessment (Cohen's Kappa coefficient), allowing a robust and accurate statistical analysis. The obtained results clearly show that combining sub-optimal contextual algorithms significantly improves the classification performance, indicating the effectiveness of the proposed methodology. (C) 2010 Elsevier B.V. All rights reserved.
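One sub-optimal combinatorial optimizer commonly plugged into such a MAP-MRF formulation is Iterated Conditional Modes (ICM). The sketch below shows a single ICM sweep for a Gaussian likelihood combined with a Potts prior on the 4-neighborhood; the class means, variances and the regularization parameter beta are treated as given (their estimation, e.g. by maximum pseudo-likelihood, is not shown), and the code only illustrates the general scheme, not the authors' multi-initialization method.

import numpy as np

# One ICM sweep for MAP-MRF classification: Gaussian likelihood plus a Potts
# prior on the 4-neighborhood. Parameters mu, sigma2 and beta are assumed known.

def icm_sweep(labels, image, mu, sigma2, beta):
    labels = labels.copy()
    h, w = labels.shape
    n_classes = len(mu)
    for i in range(h):
        for j in range(w):
            best_c, best_energy = None, np.inf
            for c in range(n_classes):
                # negative log-likelihood of the observed pixel under class c
                data_term = 0.5 * (image[i, j] - mu[c]) ** 2 / sigma2[c]
                # Potts prior: penalize neighbors carrying a different label
                prior_term = 0.0
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w and labels[ni, nj] != c:
                        prior_term += beta
                energy = data_term + prior_term
                if energy < best_energy:
                    best_c, best_energy = c, energy
            labels[i, j] = best_c
    return labels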

Relevance: 20.00%

Publisher:

Abstract:

A bipartite graph G = (V, W, E) is convex if there exists an ordering of the vertices of W such that, for each v ∈ V, the neighbors of v are consecutive in W. We describe both a sequential and a BSP/CGM algorithm to find a maximum independent set in a convex bipartite graph. The sequential algorithm improves over the running time of the previously known algorithm and the BSP/CGM algorithm is a parallel version of the sequential one. The complexity of the algorithms does not depend on |W|.
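The paper's improved algorithm is not reproduced here, but the following sketch shows the classical building blocks it relates to: Glover's greedy maximum matching for convex bipartite graphs, from which a maximum independent set of size |V| + |W| minus the size of a maximum matching follows by König's theorem. The interval representation (each v ∈ V given by the range of consecutive W-vertices it is adjacent to) is an assumed input format.

import heapq
from collections import defaultdict

# Glover's greedy maximum matching for a convex bipartite graph.
# Each vertex v of V is given as an interval (a, b): its neighbors are the
# consecutive W-vertices a, a+1, ..., b in the convexity ordering.
# By Koenig's theorem the maximum independent set has size |V| + |W| - matching.
# This is a textbook sketch, not the algorithm proposed in the paper.

def glover_matching(intervals, n_w):
    """intervals: list of (a, b) with 1 <= a <= b <= n_w. Returns the matching size."""
    starting = defaultdict(list)
    for a, b in intervals:
        starting[a].append(b)

    heap, matched = [], 0
    for w in range(1, n_w + 1):
        for b in starting[w]:
            heapq.heappush(heap, b)        # v becomes available at w = a
        while heap and heap[0] < w:
            heapq.heappop(heap)            # interval already ended: v stays unmatched
        if heap:
            heapq.heappop(heap)            # match w to the available v with smallest b
            matched += 1
    return matched

def max_independent_set_size(intervals, n_w):
    return len(intervals) + n_w - glover_matching(intervals, n_w)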

Relevance: 20.00%

Publisher:

Abstract:

Optimization methods that employ the classical Powell-Hestenes-Rockafellar augmented Lagrangian are useful tools for solving nonlinear programming problems. Their reputation declined over the last 10 years due to the comparative success of interior-point Newtonian algorithms, which are asymptotically faster. In this research, a combination of both approaches is evaluated. The idea is to produce a competitive method that is more robust and efficient than its 'pure' counterparts on critical problems. Moreover, an additional hybrid algorithm is defined, in which the interior-point method is replaced by the Newtonian resolution of a Karush-Kuhn-Tucker (KKT) system identified by the augmented Lagrangian algorithm. The software used in this work is freely available through the Tango Project web page: http://www.ime.usp.br/~egbirgin/tango/.
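For reference, the classical Powell-Hestenes-Rockafellar augmented Lagrangian for a problem with equality constraints h(x) = 0 and inequality constraints g(x) ≤ 0 can be written, up to an additive term that does not depend on x, as follows (generic notation, not taken from the paper):

\[
L_\rho(x,\lambda,\mu) \;=\; f(x) \;+\; \frac{\rho}{2}\left(\Big\|h(x)+\frac{\lambda}{\rho}\Big\|^2 \;+\; \Big\|\max\Big(0,\; g(x)+\frac{\mu}{\rho}\Big)\Big\|^2\right),
\]

with the usual first-order multiplier updates after each approximate minimization in x:

\[
\lambda^{+} = \lambda + \rho\, h(x^{*}), \qquad \mu^{+} = \max\big(0,\; \mu + \rho\, g(x^{*})\big).
\]

In the hybrid schemes evaluated here, an interior-point or Newtonian KKT solve takes over once the augmented Lagrangian iterations have identified a good approximation of the solution.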

Relevance: 20.00%

Publisher:

Abstract:

In this paper, we prove that if a Banach space X contains some uniformly convex subspace in a certain geometric position, then the spaces C(K, X) of all X-valued continuous functions defined on compact metric spaces K have exactly the same isomorphism classes as the C(K) spaces. This provides a vector-valued extension of classical results of Bessaga and Pelczynski (1960) [2] and Milutin (1966) [13] on the isomorphic classification of the separable C(K) spaces. As a consequence, we show that if $1 < p < q < \infty$, then for all infinite countable compact metric spaces $K_1$, $K_2$, $K_3$ and $K_4$ the following are equivalent: (a) $C(K_1, \ell_p) \oplus C(K_2, \ell_q)$ is isomorphic to $C(K_3, \ell_p) \oplus C(K_4, \ell_q)$; (b) $C(K_1)$ is isomorphic to $C(K_3)$ and $C(K_2)$ is isomorphic to $C(K_4)$. (C) 2011 Elsevier Inc. All rights reserved.

Relevance: 20.00%

Publisher:

Abstract:

The fragmentation mechanisms of singlet oxygen [O₂(¹Δg)]-derived oxidation products of tryptophan (W) were analyzed using collision-induced dissociation coupled with ¹⁸O-isotopic labeling experiments and accurate mass measurements. The five identified oxidation products, namely two isomeric alcohols (trans and cis WOH), two isomeric hydroperoxides (trans and cis WOOH), and N-formylkynurenine (FMK), were shown to share some common fragment ions and losses of small neutral molecules. Conversely, each oxidation product has its own fragmentation mechanism and intermediates, which were confirmed by ¹⁸O-labeling studies. The isomeric WOHs lost mainly H₂O + CO, while the WOOHs showed preferential elimination of C₂H₅NO₃ by two distinct mechanisms. Differences in the spatial arrangement of the two isomeric WOHs led to differences in the intensities of the fragment ions, and the same behavior was also found for trans and cis WOOH. FMK was shown to dissociate by a diverse range of mechanisms, with the loss of ammonia being the most favored route. MS/MS analyses, ¹⁸O-labeling, and H₂¹⁸O experiments demonstrated the ability of FMK to exchange its oxygen atoms with water. Moreover, this approach also revealed that the carbonyl group has a more pronounced oxygen-exchange ability than the formyl group. The understanding of the fragmentation mechanisms involved in the O₂(¹Δg)-mediated oxidation of W provides a useful step toward the structural characterization of oxidized peptides and proteins. (J Am Soc Mass Spectrom 2009, 20, 188-197) (C) 2009 Published by Elsevier Inc. on behalf of the American Society for Mass Spectrometry.

Relevance: 20.00%

Publisher:

Abstract:

Polynorbornene with high molecular weight was obtained via ring-opening metathesis polymerization using catalysts derived from complexes of the [RuCl₂(PPh₂Bz)₂L] type (1 for L = PPh₂Bz; 2 for L = piperidine) in the presence of ethyl diazoacetate in CHCl₃. With 1, the polymer precipitated within a few minutes at 50 °C with ca. 50% yield ([NBE]/[Ru] = 5000). With 2, yields above 90% are obtained after either 30 min at 25 °C or 5 min at 50 °C, and a quantitative yield is obtained after 30 min at 50 °C. The yield and PDI values are sensitive to the [NBE]/[Ru] ratio. The reaction of 1 with either isonicotinamide or nicotinamide produces six-coordinate complexes of the [RuCl₂(PPh₂Bz)₂(L)₂] type, which are almost inactive and produce only small amounts of polymer at 50 °C after 30 min. Thus, we conclude that the novel complexes show very distinct reactivities for the ROMP of NBE. This has been rationalized as a combination of synergistic effects of the phosphine-amine ancillary ligands. (C) 2009 Elsevier B.V. All rights reserved.

Relevance: 20.00%

Publisher:

Abstract:

The pulp and paper industry is a very energy-intensive sector, and both Sweden and the U.S. are major pulp and paper producers. This report examines the energy use and the CO2 emissions connected with the pulp and paper industry in the two countries from a lifecycle perspective.

New technologies make it possible to increase electricity production in an integrated pulp and paper mill through black liquor gasification and a combined cycle (BLGCC). That way the mill can produce excess electricity, which can be sold and replace electricity produced in power plants. In this process, the by-products formed during pulp making are used as fuel to produce electricity. In today's pulp and paper mills, the technology for generating energy from these by-products in a Tomlinson boiler is not as efficient as the BLGCC technology. Scenarios have been designed to investigate the results of using the BLGCC technique by means of a lifecycle analysis. Two scenarios are represented by a 1994 mill in the U.S. and a 1994 mill in Sweden, based on the average energy intensity of pulp and paper mills as operated in 1994 in the U.S. and Sweden respectively. The two other scenarios are constituted by a »reference mill« in the U.S. and Sweden using state-of-the-art technology. We investigate the impact of varying recycling rates and the total energy use and CO2 emissions from the production of printing and writing paper. To economize on wood and thereby save trees, the trees that are replaced by recycling can be used in a biomass gasification combined cycle (BIGCC) to produce electricity in a power station. This produces extra electricity with a lower CO2 intensity than electricity generated by, for example, coal-fired power plants.

The lifecycle analysis in this thesis also includes waste treatment in the paper lifecycle. Both Sweden and the U.S. recycle paper, but there is still a lot of paper waste, which forms part of each country's municipal solid waste (MSW). Much of the MSW is landfilled, but part of it is incinerated to extract electricity. The thesis has designed special scenarios for the use of MSW in the lifecycle analysis.

This report studies and compares two different countries and two different BLGCC efficiencies in four different scenarios. This gives a wide survey and points to essential parameters to reflect on when making assumptions in a lifecycle analysis. The report shows that three key parameters have to be carefully considered when making a lifecycle analysis of wood from an energy and CO2-emission perspective for pulp and paper mills in the U.S. and in Sweden: first, the energy efficiency of the pulp and paper mill; then the efficiency of the BLGCC; and last, the CO2 intensity of the electricity displaced by BIGCC- or BLGCC-generated electricity. It also shows that, with the technology available today, it is possible to produce CO2-free paper with a waste paper content of up to 30%. The thesis discusses the system boundaries and the assumptions. Further and more detailed research, including among other things the system boundaries and forestry, is recommended for more specific answers.

Relevance: 20.00%

Publisher:

Abstract:

In this study an optimization method for the design of combined solar and pellet heating systems is presented and evaluated. The paper describes the steps of the method by applying it to an example system. The objective of the optimization was to find the design parameters that give the lowest auxiliary energy use (pellet fuel + auxiliary electricity) and the lowest carbon monoxide (CO) emissions for a system with a typical load, a single-family house in Sweden. Weighting factors were applied to the auxiliary energy use and the CO emissions to give a combined target function, and different weighting factors were tested. The results show that extreme weighting factors lead to their own separate minima; however, it was possible to find factors that ensure low values for both auxiliary energy use and CO emissions.
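A combined target of this kind can be written, purely as an illustration (the symbols and the normalization are assumptions, not the formulation used in the paper), as a weighted sum of the two criteria:

\[
f_{\mathrm{target}} \;=\; w_{E}\,\frac{E_{\mathrm{aux}}}{E_{\mathrm{aux,ref}}} \;+\; w_{\mathrm{CO}}\,\frac{m_{\mathrm{CO}}}{m_{\mathrm{CO,ref}}}, \qquad w_{E} + w_{\mathrm{CO}} = 1,
\]

where the reference values bring the two criteria to comparable scales. Setting w_E = 1 or w_CO = 1 recovers the extreme weightings whose separate minima the study reports, while intermediate weights trade the two objectives off against each other.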

Relevance: 20.00%

Publisher:

Abstract:

Image processing systems demand robustness, high recognition rates, the capability to handle incomplete digital information, and considerable flexibility in capturing the shape of an object in an image. It is exactly here that convex hulls come into play. The objective of this paper is twofold. First, we summarize the state of the art in convex hull computation for researchers interested in using convex hulls in image processing, to build their intuition or to generate nontrivial models. Second, we present several applications of convex hulls in image processing tasks. In doing so, we have striven to show researchers the rich and varied set of applications they can contribute to. This paper also aims to attract prospective researchers to this area, and we hope that the resulting awareness will lead to new advances in specific image recognition applications.
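As a concrete reference point for the hull algorithms such a survey discusses, the sketch below computes the convex hull of a 2-D point set with Andrew's monotone chain algorithm in O(n log n); it is a standard textbook routine, not one taken from the paper.

# Andrew's monotone chain convex hull, O(n log n). Points are given as (x, y) tuples.

def cross(o, a, b):
    """Z-component of the cross product OA x OB (> 0 means a left turn)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:                       # build the lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):             # build the upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]      # concatenate, dropping duplicate endpoints

For example, convex_hull([(0, 0), (2, 0), (1, 1), (2, 2), (0, 2)]) returns the four corner points in counter-clockwise order and discards the interior point (1, 1).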

Relevance: 20.00%

Publisher:

Abstract:

Climate model projections show that climate change will further increase the risk of flooding in many regions of the world. There is a need for climate adaptation, but building new infrastructure or additional retention basins has its limits, especially in densely populated areas where open space is limited. Another solution is the more efficient use of the existing infrastructure. This research investigates a method for real-time flood control by means of existing gated weirs and retention basins. The method was tested for the specific study area of the Demer basin in Belgium but is generally applicable. Today, retention basins along the Demer River are controlled by means of adjustable gated weirs based on fixed logic rules. However, because of the high complexity of the system, these rules achieve only suboptimal results. By making use of precipitation forecasts and combined hydrological-hydraulic river models, the state of the river network can be predicted. To speed up the calculations, a conceptual river model was used. The conceptual model was combined with a Model Predictive Control (MPC) algorithm and a Genetic Algorithm (GA). The MPC algorithm predicts the state of the river network depending on the positions of the adjustable weirs in the basin, and the GA generates these positions in a semi-random way. Cost functions based on water levels were introduced to evaluate the efficiency of each generation in terms of flood damage minimization. In the final phase of this research, the influence of the most important MPC and GA parameters was investigated by means of a sensitivity study. The results show that the MPC-GA algorithm reduces the total flood volume during the historical event of September 1998 by 46% in comparison with the current regulation. Based on the MPC-GA results, some recommendations could be formulated to improve the logic rules.
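The following sketch illustrates the general shape of such an MPC-GA coupling: a genetic algorithm proposes candidate weir settings over the prediction horizon, and a cost function based on simulated water levels scores them. The simulation function, the cost function and all parameter values are placeholders, not the conceptual model of the Demer basin used in the study.

import random

# Schematic GA loop for choosing gate/weir positions over a prediction horizon.
# `simulate_levels` and `flood_cost` stand in for the conceptual river model
# and the water-level-based cost function; sizes and rates are illustrative.

def genetic_weir_control(simulate_levels, flood_cost, n_weirs, horizon,
                         pop_size=50, generations=100, mutation_rate=0.1):
    def random_plan():
        # one gate setting in [0, 1] per weir and per time step of the horizon
        return [[random.random() for _ in range(n_weirs)] for _ in range(horizon)]

    def mutate(plan):
        return [[g if random.random() > mutation_rate else random.random()
                 for g in step] for step in plan]

    def crossover(a, b):
        cut = random.randint(1, horizon - 1)
        return a[:cut] + b[cut:]

    population = [random_plan() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=lambda p: flood_cost(simulate_levels(p)))
        parents = scored[: pop_size // 2]          # keep the best half
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return min(population, key=lambda p: flood_cost(simulate_levels(p)))

In a receding-horizon MPC setting, only the first step of the best plan would be applied, after which the optimization is repeated with updated forecasts.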