Abstract:
In the last decade, local image features have been widely used in robot visual localization. To assess image similarity, a strategy exploiting these features compares raw descriptors extracted from the current image to those in the models of places. This paper addresses the ensuing step in this process, where a combining function must be used to aggregate results and assign each place a score. Casting the problem in the multiple classifier systems framework, we compare several candidate combiners with respect to their performance in the visual localization task. A deeper insight into the potential of the sum and product combiners is provided by testing two extensions of these algebraic rules: threshold and weighted modifications. In addition, a voting method, previously used in robot visual localization, is assessed. All combiners are tested on a visual localization task, carried out on a public dataset. It is experimentally demonstrated that the sum rule extensions globally achieve the best performance. The voting method, while competitive with the algebraic rules in their standard form, is shown to be outperformed by both their modified versions.
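The algebraic combiners compared above can be sketched in a few lines. This is a hedged illustration only: the function name, score layout and threshold semantics are assumptions, not the paper's implementation.

```python
# Illustrative sketch of sum, product, weighted-sum and thresholded-sum
# combiners over per-feature place scores. Each inner list holds one
# feature's score for every place; the combiner aggregates across features.
import math

def combine_scores(feature_scores, rule="sum", weights=None, threshold=0.0):
    """feature_scores: list of per-feature score lists, one score per place."""
    n_places = len(feature_scores[0])
    if rule == "sum":
        return [sum(f[p] for f in feature_scores) for p in range(n_places)]
    if rule == "product":
        return [math.prod(f[p] for f in feature_scores) for p in range(n_places)]
    if rule == "weighted_sum":
        return [sum(w * f[p] for w, f in zip(weights, feature_scores))
                for p in range(n_places)]
    if rule == "threshold_sum":
        # only scores above the threshold contribute to the sum
        return [sum(f[p] for f in feature_scores if f[p] > threshold)
                for p in range(n_places)]
    raise ValueError(rule)

# three features scoring three candidate places
scores = [[0.2, 0.7, 0.1], [0.3, 0.6, 0.1], [0.1, 0.8, 0.1]]
best = max(range(3), key=lambda p: combine_scores(scores, "sum")[p])
```

The localized place is then the arg-max of the combined scores, whichever rule is used.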
Abstract:
Aims - To compare reading performance in children with and without visual function anomalies and to identify the influence of abnormal visual function and other variables on reading ability. Methods - A cross-sectional study was carried out in 110 children of school age (6-11 years) with Abnormal Visual Function (AVF) and 562 children with Normal Visual Function (NVF). An orthoptic assessment (visual acuity, ocular alignment, near point of convergence and accommodation, stereopsis and vergences) and autorefraction were carried out. Oral reading was analyzed (list of 34 words). Number of errors, accuracy (percentage of success) and reading speed (words per minute - wpm) were used as reading indicators. Sociodemographic information from parents (n=670) and teachers (n=34) was obtained. Results - Children with AVF had a higher number of errors (AVF=3.00 errors; NVF=1.00 errors; p<0.001), lower accuracy (AVF=91.18%; NVF=97.06%; p<0.001) and lower reading speed (AVF=24.71 wpm; NVF=27.39 wpm; p=0.007). Reading speed in the 3rd school grade was not statistically different between the two groups (AVF=31.41 wpm; NVF=32.54 wpm; p=0.113). Children with uncorrected hyperopia (p=0.003) and astigmatism (p=0.019) had worse reading performance. Children in the 2nd, 3rd or 4th grade presented a lower risk of reading impairment compared with the 1st grade. Conclusion - Children with AVF had reading impairment in the first school grade. It seems that reading abilities vary widely and that this disparity lessens in older children. The slow reading characteristics of children with AVF are similar to those of dyslexic children, which suggests the need for an eye evaluation before classifying a child as dyslexic.
Abstract:
This study focused on the development of a sensitive enzymatic biosensor for the determination of the pesticide pirimicarb, based on the immobilization of laccase on composite carbon paste electrodes. A multi-walled carbon nanotube (MWCNT) paste electrode modified by dispersion of laccase (3%, w/w) within the optimum composite matrix (60:40%, w/w, MWCNTs and paraffin binder) showed the best performance, with excellent electron transfer kinetics and catalytic effects related to the redox process of the substrate 4-aminophenol. No metal or anti-interference membrane was added. Based on the inhibition of laccase activity, pirimicarb can be determined in the range 9.90 × 10−7 to 1.15 × 10−5 mol L−1 using 4-aminophenol as substrate at the optimum pH of 5.0, with acceptable repeatability and reproducibility (relative standard deviations lower than 5%). The limit of detection obtained was 1.8 × 10−7 mol L−1 (0.04 mg kg−1 on a fresh-weight vegetable basis). The high activity and catalytic properties of the laccase-based biosensor are retained for ca. one month. The optimized electroanalytical protocol coupled to the QuEChERS methodology was applied to tomato and lettuce samples spiked at three levels; recoveries ranging from 91.0 ± 0.1% to 101.0 ± 0.3% were attained. No significant effects on the pirimicarb electroanalysis were observed in the presence of pro-vitamin A, vitamins B1 and C, and glucose in the vegetable extracts. The proposed biosensor-based pesticide residue methodology fulfills all requisites for use in the implementation of food safety programs.
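Inhibition-based quantification of this kind can be sketched numerically. The formulas below (relative inhibition from uninhibited vs. inhibited signal, inverted through a linear calibration) are a standard textbook scheme; the function names and the calibration slope/intercept values are made-up illustration values, not the paper's fitted parameters.

```python
# Hedged sketch of inhibition-based pesticide quantification:
# the drop in substrate signal gives a percent inhibition, which a
# linear calibration maps back to pesticide concentration.

def relative_inhibition(i0, i_inhibited):
    """Percent inhibition from the uninhibited (i0) and inhibited signals."""
    return 100.0 * (i0 - i_inhibited) / i0

def concentration_from_inhibition(inh_percent, slope, intercept):
    """Invert a linear calibration inh% = slope * conc + intercept."""
    return (inh_percent - intercept) / slope

inh = relative_inhibition(10.0, 6.5)  # 35% signal suppression
conc = concentration_from_inhibition(inh, slope=3.0e6, intercept=5.0)
```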
Abstract:
The original QuEChERS method was modified into a new version for pesticide determination in soils. The QuEChERS method is based on liquid–liquid partitioning with ACN, followed by a cleanup step using dispersive SPE and disposable pipette tips. Gas chromatographic separation with MS detection was carried out for pesticide quantification. The method was validated using recovery experiments for 36 multiclass pesticides. Mean recoveries at each of the four spiking levels, between 10 and 300 µg/kg of soil, ranged from 70 to 120% for 26 pesticides, with RSD values below 15%. The method achieved low limits of detection, below 7.6 µg/kg. Matrix effects were observed for 13 pesticides and were compensated by using matrix-matched calibration. The method was applied successfully, using d-SPE or DPX, to the analysis of pesticides in soils from organic farming and integrated pest management.
Abstract:
A novel enzymatic biosensor for carbamate pesticide detection was developed through the direct immobilization of Trametes versicolor laccase on a graphene-doped carbon paste electrode functionalized with Prussian blue films (LACC/PB/GPE). Graphene was prepared by sonication-assisted exfoliation of graphite and characterized by transmission electron microscopy and X-ray photoelectron spectroscopy. The Prussian blue film electrodeposited onto the graphene-doped carbon paste electrode allowed considerable reduction of the charge transfer resistance and of the capacitance of the device. The combined effects of pH, enzyme concentration and incubation time on biosensor response were optimized using a 2³ full-factorial statistical design and response surface methodology. Based on the inhibition of laccase activity and using 4-aminophenol as redox mediator at pH 5.0, LACC/PB/GPE exhibited suitable characteristics in terms of sensitivity, intra- and inter-day repeatability (1.8–3.8% RSD), reproducibility (4.1 and 6.3% RSD), selectivity (13.2% bias at the highest interference:substrate ratios tested), accuracy and stability (ca. twenty days) for the quantification of five carbamates widely applied on tomato and potato crops. The attained detection limits ranged between 5.2 × 10−9 mol L−1 (0.002 mg kg−1 w/w, for ziram) and 1.0 × 10−7 mol L−1 (0.022 mg kg−1 w/w, for carbofuran). Recovery values for the two tested spiking levels ranged from 90.2 ± 0.1% (carbofuran) to 101.1 ± 0.3% (ziram) for tomato samples and from 91.0 ± 0.1% (formetanate) to 100.8 ± 0.1% (ziram) for potato samples. The proposed methodology is appropriate for testing pesticide levels in food samples for compliance with regulations and food inspections.
Abstract:
In this study, the effect of incorporating recycled glass fibre reinforced plastic (GFRP) waste materials, obtained by means of shredding and milling processes, on the mechanical behaviour of polyester polymer mortars (PM) was assessed. For this purpose, different contents of GFRP recyclates, between 4% and 12% by weight, were incorporated into polyester PM materials as sand aggregate and filler replacements. The effect of adding a silane coupling agent to the resin binder was also evaluated. The applied waste material came from the shredding of leftovers resulting from the cutting and assembly processes of GFRP pultrusion profiles. Currently, these leftovers, as well as non-conforming products and scrap resulting from the pultrusion manufacturing process, are landfilled, with additional costs to producers and suppliers. Hence, besides the evident environmental benefits, a viable and feasible solution for these wastes would also lead to significant economic advantages. Design of experiments and data treatment were accomplished by means of a full factorial design approach and analysis of variance (ANOVA). Experimental results were promising regarding the recyclability of GFRP waste materials as partial replacement of aggregates and reinforcement for PM materials, with significant improvements in the mechanical properties of the resulting mortars relative to waste-free formulations.
Abstract:
An electrochemical sensor has been developed for the determination of the herbicide bentazone, based on a GC electrode modified by a combination of multiwalled carbon nanotubes (MWCNT) with β-cyclodextrin (β-CD) incorporated in a polyaniline film. The results indicate that the β-CD/MWCNT-modified GC electrode exhibits efficient electrocatalytic oxidation of bentazone with high sensitivity and stability. A cyclic voltammetric method to determine bentazone in phosphate buffer solution at pH 6.0 was developed, without any previous extraction, clean-up or derivatization steps, in the range of 10–80 mmol L−1, with a detection limit of 1.6 mmol L−1 in water. The results were compared with those obtained by an established HPLC technique; no statistically significant differences were found between the two methods.
Abstract:
Retinal imaging with a confocal scanning laser ophthalmoscope (cSLO) involves scanning a small laser beam over the retina and constructing an image from the reflected light. By applying the confocal principle, tomographic images can be produced by measuring a sequence of slices at different depths. However, the thickness of such slices, when compared with the retinal thickness, is too large to give useful 3D retinal images without further processing. In this work, a prototype cSLO was modified in terms of hardware and software to enable tomographic measurements with the maximum theoretically possible axial resolution. A model eye was built to test the performance of the system. A novel algorithm has been developed which fits a double Gaussian curve to the axial intensity profiles generated from a stack of image slices. The underlying assumption is that the laser light has mainly been reflected by two structures in the retina: the internal limiting membrane and the retinal pigment epithelium. From the fitted curve, topographic images and novel thickness images of the retina can be generated. Deconvolution algorithms have also been developed to improve the axial resolution of the system, using a theoretically predicted cSLO point spread function. The technique was evaluated using measurements made on a model eye, four normal eyes and seven eyes containing retinal pathology. The reproducibility, accuracy and physiological measurements obtained were compared with available published data and showed good agreement. The difference in the measurements when using a double rather than a single Gaussian model was also analysed.
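The double-Gaussian axial model described above can be sketched as follows. This is a minimal illustration of the model and of the thickness readout (peak-to-peak separation); the amplitudes, centres and widths are invented example values, and the actual fitting procedure of the paper is not reproduced.

```python
# Sketch of the double-Gaussian axial intensity model: two Gaussian
# reflections (nominally the ILM and the RPE), with retinal thickness
# taken as the separation of the two fitted peak centres.
import math

def double_gaussian(z, a1, z1, s1, a2, z2, s2):
    """Sum of two Gaussian reflections centred at depths z1 and z2."""
    g1 = a1 * math.exp(-((z - z1) ** 2) / (2 * s1 ** 2))
    g2 = a2 * math.exp(-((z - z2) ** 2) / (2 * s2 ** 2))
    return g1 + g2

def thickness(z1, z2):
    """Retinal thickness as the distance between the two peak centres."""
    return abs(z2 - z1)

# synthetic axial profile with peaks at depths 50 and 250 (arbitrary units)
profile = [double_gaussian(z, 1.0, 50.0, 8.0, 0.7, 250.0, 10.0)
           for z in range(0, 400)]
```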
Abstract:
There is no single definition of a long-memory process. Such a process is usually defined as a series with a slowly decaying correlogram or an infinite spectrum at frequency zero. It is also said that a series with this property is characterised by long-range dependence and non-periodic long cycles, that this feature describes the correlation structure of a series at long lags, or that it is conventionally expressed in terms of a power-law decline of the autocovariance function. The growing international research interest in this topic is justified by the search for a better understanding of the dynamic nature of the time series of financial asset prices. First, the lack of consistency among existing results calls for new studies and for the use of several complementary methodologies. Second, the confirmation of long-memory processes has relevant implications for (1) theoretical and econometric modelling (i.e., martingale price models and technical trading rules), (2) statistical tests of equilibrium and asset-pricing models, (3) optimal consumption/saving and portfolio decisions, and (4) the measurement of efficiency and rationality. Third, empirical scientific questions remain about identifying the general theoretical market model best suited to modelling the diffusion of the series. Fourth, regulators and risk managers need to know whether persistent, and therefore inefficient, markets exist that may produce abnormal returns. The research objective of this dissertation is twofold. On the one hand, it aims to provide additional knowledge for the long-memory debate, focusing on the behaviour of the daily return series of the main EURONEXT stock indices.
On the other hand, it seeks to contribute to the improvement of the capital asset pricing model (CAPM) by considering an alternative risk measure capable of overcoming the constraints of the efficient market hypothesis (EMH) in the presence of financial series whose processes lack independent and identically distributed (i.i.d.) increments. The empirical study indicates the possibility of using long-maturity treasury bonds (OTs) as an alternative in the computation of market returns, given that their behaviour in sovereign debt markets reflects investors' confidence in the financial condition of states and measures how investors assess the respective economies based on the performance of their assets in general. Although the price diffusion model defined by geometric Brownian motion (gBm) claims to provide a good fit to financial time series, its assumptions of normality, stationarity and independence of the residual innovations are contradicted by the empirical data analysed. Therefore, in the search for evidence of the long-memory property in the markets, rescaled-range analysis (R/S) and detrended fluctuation analysis (DFA) are employed, under the fractional Brownian motion (fBm) approach, to estimate the Hurst exponent H for the complete data series and to compute the "local" Hurst exponent H_t in moving windows. In addition, statistical hypothesis tests are carried out using the rescaled-range test (R/S), the modified rescaled-range test (M-R/S) and the fractional differencing test (GPH). In terms of a single conclusion from all the methods about the nature of dependence for the stock market in general, the empirical results are inconclusive. This means that the degree of long memory, and hence any classification, depends on each particular market.
Nevertheless, the mostly positive overall results support the presence of long memory, in the form of persistence, in the stock returns of Belgium, the Netherlands and Portugal. This suggests that these markets are more subject to greater predictability (the "Joseph effect"), but also to trends that may be unexpectedly interrupted by discontinuities (the "Noah effect"), and therefore tend to be riskier to trade. Although the evidence of fractal dynamics has weak statistical support, in line with most international studies, it refutes the random-walk hypothesis with i.i.d. increments, which is the basis of the EMH in its weak form. In view of this, contributions to the improvement of the CAPM are proposed, through a new fractal capital market line (FCML) and a new fractal security market line (FSML). The proposal suggests that the risk element (for the market and for an asset) be given by the Hurst exponent H for long lags of stock returns. The exponent H measures the degree of long memory in stock indices, both when the return series follow an uncorrelated i.i.d. process described by gBm (where H = 0.5, confirming the EMH and making the CAPM adequate) and when they follow a process with statistical dependence described by fBm (where H differs from 0.5, rejecting the EMH and making the CAPM inadequate). The advantage of the FCML and the FSML is that the long-memory measure, defined by H, is the appropriate reference for expressing risk in models applicable to data series following i.i.d. processes as well as processes with nonlinear dependence. These formulations thus contemplate the EMH as a possible particular case.
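The rescaled-range (R/S) estimation of the Hurst exponent H used in the dissertation can be sketched in pure Python. This is a textbook version of the estimator (average R/S per window size, least-squares slope of log(R/S) against log(n)), not the study's exact implementation; window sizes and the test series are illustrative.

```python
# Textbook R/S sketch: for each window size n, compute the range of
# mean-adjusted cumulative deviations divided by the standard deviation,
# then estimate H as the slope of log(R/S) versus log(n).
import math
import random

def rescaled_range(x):
    n = len(x)
    mean = sum(x) / n
    cum, cums = 0.0, []
    for v in x:
        cum += v - mean
        cums.append(cum)
    r = max(cums) - min(cums)                           # range of cumulative deviations
    s = math.sqrt(sum((v - mean) ** 2 for v in x) / n)  # standard deviation
    return r / s if s > 0 else 0.0

def hurst(series, sizes=(8, 16, 32, 64)):
    xs, ys = [], []
    for n in sizes:
        rs = [rescaled_range(series[i:i + n])
              for i in range(0, len(series) - n + 1, n)]
        xs.append(math.log(n))
        ys.append(math.log(sum(rs) / len(rs)))
    # least-squares slope of log(R/S) on log(n) is the H estimate
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    den = sum((a - mx) ** 2 for a in xs)
    return num / den

random.seed(0)
h = hurst([random.gauss(0, 1) for _ in range(1024)])  # i.i.d. noise: H near 0.5
```

For an uncorrelated i.i.d. series the estimate clusters around H = 0.5 (the gBm/EMH case), while persistent series push H above 0.5.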
Abstract:
Background/Aims: Unconjugated bilirubin (UCB) impairs crucial aspects of cell function and induces apoptosis in primary cultured neurones. While mechanisms of cytotoxicity begin to unfold, mitochondria appear as potential primary targets. Methods: We used electron paramagnetic resonance spectroscopy analysis of isolated rat mitochondria to test the hypothesis that UCB physically interacts with mitochondria to induce structural membrane perturbation, leading to increased permeability and subsequent release of apoptotic factors. Results: Our data demonstrate profound changes in mitochondrial membrane properties during incubation with UCB, including modified membrane lipid polarity and fluidity (P < 0.01), as well as disrupted protein mobility (P < 0.001). Consistent with increased permeability, cytochrome c was released from the intermembrane space (P < 0.01), perhaps uncoupling the respiratory chain and further increasing oxidative stress (P < 0.01). Both ursodeoxycholate, a mitochondrial-membrane stabilising agent, and cyclosporine A, an inhibitor of the permeability transition, almost completely abrogated UCB-induced perturbation. Conclusions: UCB directly interacts with mitochondria, influencing membrane lipid and protein properties, redox status, and cytochrome c content. Thus, apoptosis induced by UCB may be mediated, at least in part, by physical perturbation of the mitochondrial membrane. These novel findings should ultimately prove useful to our evolving understanding of UCB cytotoxicity.
Abstract:
Constrained nonlinear optimization problems can be solved using penalty or barrier functions. This strategy, based on solving unconstrained problems derived from the original one, has proven effective, particularly when combined with direct search methods. An alternative for solving such problems is the filter method. The filter method, introduced by Fletcher and Leyffer in 2002, has been widely used to solve problems of the type mentioned above. These methods use a strategy different from that of barrier or penalty functions: the latter define a new function that combines the objective function and the constraints, while filter methods treat the optimization problem as a bi-objective problem that minimizes the objective function and a function aggregating the constraints. Motivated by the work of Audet and Dennis in 2004, which used the filter method with derivative-free algorithms, the authors developed works in which other direct search methods were used, combining their potential with the filter method. More recently, a new variant of these methods was presented, in which some alternative constraint aggregations for the construction of filters were proposed. This paper presents a variant of the filter method, more robust than the previous ones, implemented with a safeguard procedure in which the values of the objective function and the constraints are interlinked rather than treated completely independently.
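The bi-objective filter idea can be illustrated with its core acceptance test: a trial point with objective value f and aggregated constraint violation h is accepted only if no stored filter entry dominates it in both measures. This is a hedged, generic sketch of the classic Fletcher–Leyffer dominance rule, not the specific variant or safeguard procedure proposed in the paper; all names are illustrative.

```python
# Minimal filter-method sketch: entries are (f, h) pairs, where f is the
# objective value and h the aggregated constraint violation.

def dominates(entry, point):
    """(f1, h1) dominates (f2, h2) if it is no worse in both measures."""
    f1, h1 = entry
    f2, h2 = point
    return f1 <= f2 and h1 <= h2

def filter_accepts(filter_entries, point):
    """A trial point is acceptable iff no filter entry dominates it."""
    return not any(dominates(e, point) for e in filter_entries)

def filter_add(filter_entries, point):
    """Insert an accepted point and drop entries it now dominates."""
    kept = [e for e in filter_entries if not dominates(point, e)]
    kept.append(point)
    return kept

flt = [(3.0, 0.5), (1.0, 2.0)]
ok = filter_accepts(flt, (2.0, 1.0))  # dominated by neither entry
```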
Abstract:
Constrained nonlinear optimization problems are usually solved using penalty or barrier methods combined with unconstrained optimization methods. Another alternative for solving constrained nonlinear optimization problems is the filter method. Filter methods, introduced by Fletcher and Leyffer in 2002, have been widely used in several areas of constrained nonlinear optimization. These methods treat the optimization problem as a bi-objective one that attempts to minimize the objective function and a continuous function that aggregates the constraint violation functions. Audet and Dennis presented the first filter method for derivative-free nonlinear programming, based on pattern search methods. Motivated by this work, we have developed a new direct search method for general constrained optimization, based on simplex methods, that combines the features of the simplex method and the filter method. This work presents a new variant of these methods, which combines the filter method with other direct search methods, and proposes some alternatives for aggregating the constraint violation functions.