957 results for novel algorithm
Abstract:
Master's degree in Radiotherapy
Abstract:
This paper reports a novel application of microwave-assisted extraction (MAE) of polyphenols from brewer's spent grains (BSG). A 2⁴ orthogonal composite design was used to obtain the optimal MAE conditions. The influence of the MAE operational parameters (extraction time, temperature, solvent volume and stirring speed) on the extraction yield of ferulic acid was investigated through response surface methodology. The results showed that the optimal conditions were 15 min extraction time, 100 °C extraction temperature, 20 mL of solvent, and maximum stirring speed. Under these conditions, the yield of ferulic acid was 1.31±0.04% (w/w), fivefold higher than that obtained with conventional solid–liquid extraction techniques. The new extraction method considerably reduces extraction time, energy and solvent consumption, while generating less waste. HPLC-DAD-MS analysis indicated that other hydroxycinnamic acids and several ferulic acid dehydrodimers, as well as one dehydrotrimer, were also present, confirming that BSG is a valuable source of antioxidant compounds.
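The response surface methodology mentioned above fits a second-order polynomial to the measured yields and reads the factor effects off the fitted coefficients. A minimal sketch for two coded factors, using hypothetical design points and yields (not the paper's data):

```python
import numpy as np

# Hypothetical coded factor levels (x1 = time, x2 = temperature) and
# yields, standing in for actual MAE design points.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1], [0, 0], [0, 0],
              [-1, 0], [1, 0], [0, -1], [0, 1]], dtype=float)
y = np.array([0.7, 0.9, 1.0, 1.3, 1.1, 1.1, 0.9, 1.2, 0.9, 1.2])

# Second-order response-surface model:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
x1, x2 = X[:, 0], X[:, 1]
D = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
beta, *_ = np.linalg.lstsq(D, y, rcond=None)

def predict(u, v):
    """Predicted yield at coded factor settings (u, v)."""
    return (beta[0] + beta[1] * u + beta[2] * v
            + beta[3] * u**2 + beta[4] * v**2 + beta[5] * u * v)
```

The fitted surface can then be scanned (or differentiated) for the factor combination maximizing the predicted yield.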
Abstract:
Low-density parity-check (LDPC) codes are nowadays one of the hottest topics in coding theory, notably due to their advantages in terms of bit error rate performance and low complexity. In order to exploit the potential of the Wyner-Ziv coding paradigm, practical distributed video coding (DVC) schemes should use powerful error correcting codes with near-capacity performance. In this paper, new ways to design LDPC codes for the DVC paradigm are proposed and studied. The new LDPC solutions rely on merging parity-check nodes, which corresponds to reducing the number of rows in the parity-check matrix. This makes it possible to gracefully change the compression ratio of the source (DCT coefficient bitplane) according to the correlation between the original and the side information. The proposed LDPC codes achieve good performance for a wide range of source correlations and a better rate-distortion (RD) performance than the popular turbo codes.
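Merging two parity-check nodes amounts to replacing the corresponding rows of the binary parity-check matrix by their GF(2) sum, so one fewer syndrome bit is transmitted per bitplane. A minimal sketch of that operation (a toy matrix, not the paper's code construction):

```python
import numpy as np

def merge_check_nodes(H, i, j):
    """Merge parity-check rows i and j of a binary matrix H by
    replacing them with their GF(2) sum (XOR). The row count drops
    by one, lowering the number of syndrome bits sent."""
    merged = H[i] ^ H[j]
    keep = [r for r in range(H.shape[0]) if r not in (i, j)]
    return np.vstack([H[keep], merged])

# Toy 4x8 parity-check matrix
H = np.array([[1, 1, 0, 0, 1, 0, 0, 0],
              [0, 1, 1, 0, 0, 1, 0, 0],
              [0, 0, 1, 1, 0, 0, 1, 0],
              [1, 0, 0, 1, 0, 0, 0, 1]], dtype=np.uint8)
H2 = merge_check_nodes(H, 0, 1)
```

Repeating the merge step yields a family of progressively higher-compression codes from a single mother matrix, which is what allows the rate to track the source/side-information correlation.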
Abstract:
Copyright © 2015 Société Française d'Ichtyologie.
Abstract:
Linear unmixing decomposes a hyperspectral image into a collection of reflectance spectra of the materials present in the scene, called endmember signatures, and the corresponding abundance fractions at each pixel in a spatial area of interest. This paper introduces a new unmixing method, called Dependent Component Analysis (DECA), which overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometrical properties of hyperspectral data. DECA models the abundance fractions as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. The performance of the method is illustrated using simulated and real data.
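Modelling abundances as a mixture of Dirichlet densities builds the physical constraints in by construction: every Dirichlet draw is non-negative and sums to one. A small illustration with hypothetical mixture parameters (not DECA's inference, which fits the mixing matrix by GEM):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-component Dirichlet mixture over 3 endmember abundances
weights = [0.6, 0.4]                           # mixture weights
alphas = [(5.0, 2.0, 1.0), (1.0, 1.0, 8.0)]    # Dirichlet parameters

def sample_abundances(n):
    """Draw abundance vectors from the Dirichlet mixture; each sample
    is non-negative and sums to one by construction."""
    comps = rng.choice(len(weights), size=n, p=weights)
    return np.array([rng.dirichlet(alphas[c]) for c in comps])

A = sample_abundances(1000)
```

The mixture form gives the model enough flexibility to capture several spatially distinct material distributions while never violating the acquisition constraints.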
Abstract:
Chapter in book proceedings with peer review. First Iberian Conference, IbPRIA 2003, Puerto de Andratx, Mallorca, Spain, June 4-6, 2003. Proceedings.
Abstract:
Given a set of mixed spectral (multispectral or hyperspectral) vectors, linear spectral mixture analysis, or linear unmixing, aims at estimating the number of reference substances, also called endmembers, their spectral signatures, and their abundance fractions. This paper presents a new method for unsupervised endmember extraction from hyperspectral data, termed vertex component analysis (VCA). The algorithm exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. In a series of experiments using simulated and real data, the VCA algorithm competes with state-of-the-art methods, with a computational complexity between one and two orders of magnitude lower than the best available method.
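The two facts VCA exploits suggest a simple extraction loop: since an affine map of a simplex is a simplex and endmembers sit at its vertices, projecting the data onto a direction orthogonal to the endmembers found so far and taking the extreme pixel yields a new vertex. A simplified sketch in that spirit (not the paper's exact algorithm, which also handles subspace identification and noise):

```python
import numpy as np

def vca_sketch(Y, p, seed=0):
    """Simplified vertex extraction: repeatedly project the data onto
    a random direction made orthogonal to the span of the endmembers
    found so far, and take the pixel with the largest |projection| as
    the next vertex. Y: (bands, pixels); p: number of endmembers."""
    rng = np.random.default_rng(seed)
    bands, _ = Y.shape
    E = np.zeros((bands, p))
    idx = []
    for k in range(p):
        w = rng.standard_normal(bands)
        if k > 0:  # remove the component lying in span(E[:, :k])
            Q, _ = np.linalg.qr(E[:, :k])
            w = w - Q @ (Q.T @ w)
        j = int(np.argmax(np.abs(w @ Y)))
        E[:, k] = Y[:, j]
        idx.append(j)
    return E, idx

# Demo: 3 pure pixels (simplex vertices) followed by interior mixtures
E_true = np.array([[1., 0, 0], [0, 1, 0], [0, 0, 1],
                   [1, 1, 0], [0, 1, 1]])            # (5 bands, 3 endmembers)
W = np.array([[.5, .2, .34, .1],
              [.3, .5, .33, .2],
              [.2, .3, .33, .7]])                    # abundance columns sum to 1
Y = np.hstack([E_true, E_true @ W])                  # pure pixels first
E, idx = vca_sketch(Y, 3)
```

Because a linear functional over a convex hull is maximized at a vertex, each iteration recovers a pure pixel, which is the geometric core of VCA's low computational cost.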
Abstract:
The calculation of the dose is one of the key steps in radiotherapy planning [1-5]. This calculation should be as accurate as possible, and over the years this became feasible through the implementation of new dose-calculation algorithms in the treatment planning systems used in radiotherapy. When a breast tumour is irradiated, a precise dose distribution is fundamental to ensure coverage of the planning target volume (PTV) and to prevent skin complications. Some investigations using breast cases showed that the pencil beam convolution (PBC) algorithm overestimates the dose in the PTV and in the proximal region of the ipsilateral lung, while underestimating the dose in the distal region of the ipsilateral lung, when compared with the analytical anisotropic algorithm (AAA). With this study we aim to compare the performance of the PBC and AAA algorithms in breast tumours.
Abstract:
Molecularly imprinted polymers (MIP) were used as potentiometric sensors for the selective recognition and determination of chlormequat (CMQ). They were produced by radical polymerization of 4-vinyl pyridine (4-VP) or methacrylic acid (MAA) monomers in the presence of a cross-linker, with CMQ as the template. Similar non-imprinted (NI) polymers (NIP) were produced by removing the template from the reaction media. The effect of the type and amount of MIP or NIP sensors on the potentiometric behaviour was investigated. The main analytical features were evaluated in steady and flow modes of operation. The MIP/4-VP sensor exhibited the best performance, presenting a fast near-Nernstian response for CMQ over the concentration range 6.2×10⁻⁶ – 1.0×10⁻² mol L⁻¹, with a detection limit of 4.1×10⁻⁶ mol L⁻¹. The sensor response was independent of the pH of the test solutions in the range 5 – 10. Potentiometric selectivity coefficients of the proposed sensors were evaluated against several inorganic and organic cations, and the results pointed out a good selectivity for CMQ. The sensor was applied to the potentiometric determination of CMQ in commercial phytopharmaceuticals and spiked water samples; recoveries ranged from 96 to 108.5%.
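"Near-Nernstian" means the measured calibration slope approaches the theoretical value 2.303RT/(zF) per decade of activity. A quick computation of that benchmark for a monovalent cation such as chlormequat at 25 °C:

```python
# Theoretical Nernstian slope 2.303*R*T/(z*F), the benchmark against
# which a "near-Nernstian" sensor response is judged.
R = 8.314462618   # gas constant, J mol^-1 K^-1
F = 96485.33212   # Faraday constant, C mol^-1
T = 298.15        # temperature, K (25 degC)
z = 1             # chlormequat is a monovalent cation

slope_mV = 2.303 * R * T / (z * F) * 1000  # ~59.2 mV per decade
```

A measured slope close to 59.2 mV/decade over the stated concentration range is what qualifies the MIP/4-VP sensor's response as near-Nernstian.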
Abstract:
Conference - 16th International Symposium on Wireless Personal Multimedia Communications (WPMC), June 24-27, 2013
Abstract:
A new procedure for determining eleven organochlorine pesticides in soils using microwave-assisted extraction (MAE) and headspace solid phase microextraction (HS-SPME) is described. The studied pesticides were mirex, α- and γ-chlordane, p,p'-DDT, heptachlor, heptachlor epoxide isomer A, γ-hexachlorocyclohexane, dieldrin, endrin, aldrin and hexachlorobenzene. The HS-SPME was optimized for the most important parameters, such as extraction time, sample volume and temperature. The present analytical procedure requires a reduced volume of organic solvents and avoids the need for extract clean-up steps. Under the optimized conditions, the limits of detection of the method ranged from 0.02 to 3.6 ng/g, intermediate precision ranged from 14 to 36% (as CV%), and recoveries ranged from 8 to 51%. The proposed methodology can be used for the rapid screening of soils for the presence of the selected pesticides, and was applied to landfill soil samples.
Abstract:
This paper presents a complete, quadratic programming formulation of the standard thermal unit commitment problem in power generation planning, together with a novel iterative optimisation algorithm for its solution. The algorithm, based on a mixed-integer formulation of the problem, considers piecewise linear approximations of the quadratic fuel cost function that are dynamically updated in an iterative way, converging to the optimum; this avoids resorting to quadratic programming, making the solution process much quicker. From extensive computational tests on a broad set of benchmark instances of this problem, the algorithm was found to be flexible and capable of easily incorporating different problem constraints. Indeed, it is able to tackle ramp constraints, which, although very important in practice, were rarely considered in previous publications. Most importantly, optimal solutions were obtained for several well-known benchmark instances, including instances of practical relevance, which are not yet known to have been solved to optimality. Computational experiments and their results showed that the method proposed is both simple and extremely effective.
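The dynamically updated piecewise linear approximation works because the quadratic fuel cost is convex: each tangent line underestimates it everywhere and is exact at the point where it is taken, so adding a tangent at the incumbent dispatch tightens the model exactly where it matters. A minimal sketch of that cut-refinement loop, with hypothetical cost coefficients (the actual method embeds the cuts in a mixed-integer formulation):

```python
def fuel_cost(p, a=100.0, b=10.0, c=0.05):
    """Hypothetical quadratic fuel cost a + b*p + c*p^2."""
    return a + b * p + c * p * p

def tangent_cut(p0, a=100.0, b=10.0, c=0.05):
    """Tangent (supporting) line of the convex cost at p0:
    underestimates the quadratic everywhere, exact at p0."""
    slope = b + 2 * c * p0
    intercept = fuel_cost(p0, a, b, c) - slope * p0
    return slope, intercept

# Start from two initial cuts, then refine at the incumbent dispatch
# until the piecewise linear model matches the true cost there.
cuts = [tangent_cut(p0) for p0 in (50.0, 150.0)]
p = 100.0  # incumbent dispatch level (MW)
for _ in range(5):
    approx = max(s * p + i for s, i in cuts)  # piecewise linear model
    if fuel_cost(p) - approx < 1e-6:          # model already tight at p
        break
    cuts.append(tangent_cut(p))               # tighten the model at p
```

Since every cut is a valid underestimator, the mixed-integer model stays a relaxation throughout, and the loop closes the gap only at dispatch levels the solver actually visits.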
Abstract:
Phenolic acids are ubiquitous antioxidants accounting for approximately one third of the phenolic compounds in our diet. Their importance is supported by epidemiological studies that suggest an inverse relationship between dietary intake of phenolic antioxidants and the occurrence of diseases such as cancer and neurodegenerative disorders. However, until now, most natural antioxidants have had limited therapeutic success, a fact that could be related to their limited distribution throughout the body and to the inherent difficulty of reaching the target sites. The development of phenolic antioxidants based on a hybrid concept and structurally based on natural hydroxybenzoic (gallic acid) and hydroxycinnamic (caffeic acid) scaffolds seems to be a suitable solution to surpass the mentioned drawbacks. Galloyl–cinnamic hybrids were synthesized, and their antioxidant activity as well as their partition coefficients and redox potentials were evaluated. The structure–property–activity relationship (SPAR) study revealed a correlation between the redox potentials and the antioxidant activity. The galloyl–cinnamic acid hybrid stands out as the best antioxidant, surpassing the effect of a blend of gallic acid plus caffeic acid and endorsing the hypothesis that the whole is greater than the sum of the parts. In addition, some hybrid compounds possess an appropriate lipophilicity, allowing their application as chain-breaking antioxidants in biomembranes or other types of lipidic systems. Their predicted ADME properties are also in accordance with the general requirements for drug-like compounds. Accordingly, these phenolic hybrids can be seen as potential antioxidants for tackling the oxidative status linked to neurodegenerative, inflammatory or cancer processes.
Abstract:
Objective of the study: to compare the performance of the Pencil Beam Convolution (PBC) algorithm and the Analytical Anisotropic Algorithm (AAA) in 3D conformal radiotherapy treatment planning for breast tumours.
Abstract:
Consider a single processor and a software system. The software system comprises components and interfaces, where each component has an associated interface and each component comprises a set of constrained-deadline sporadic tasks. A scheduling algorithm (called the global scheduler) determines at each instant which component is active. The active component uses another scheduling algorithm (called the local scheduler) to determine which task is selected for execution on the processor. The interface of a component makes certain information about the component visible to other components; the interfaces of all components are used for schedulability analysis. We address the problem of generating an interface for a component based on the tasks inside the component. We desire to (i) incur only a small loss in schedulability analysis due to the interface and (ii) ensure that the amount of space (counted in bits) of the interface is small; this is because such an interface hides as much detail of the component as possible. We present an algorithm for generating such an interface.