35 results for new methods
Abstract:
Autonomous systems bring added value to search and rescue scenarios through the possibility of minimizing human presence in dangerous situations and the ability to reach locations that are difficult to access. This dissertation proposes new methods for the perception and navigation of unmanned aerial vehicles (UAVs), focusing mainly on path planning and obstacle detection. Regarding perception, a method was developed to generate clusters from the voxels produced by Octomap. In the navigation area, two new path-planning methods were developed, GPRM (Grid Probabilistic Roadmap) and PPRM (Particle Probabilistic Roadmap), both built on PRM as their base method. The first method, GPRM, spreads particles over a predefined grid, then builds the roadmap in the area determined by the grid and from it estimates the shortest path to the destination point. The second method, PPRM, spreads particles throughout the application scenario, generates the roadmap over the full map, and assigns a probability that is used to define the optimized trajectory. To analyse the performance of each method in comparison with PRM, they are evaluated in three distinct scenarios using the MORSE simulator.
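The GPRM and PPRM planners described above both extend the classic PRM. A minimal PRM sketch in Python, for intuition only: the obstacle geometry, sample count and connection radius below are illustrative assumptions, not the dissertation's parameters.

```python
# Minimal probabilistic roadmap (PRM): sample the free space, connect nearby
# collision-free samples, then run Dijkstra from start to goal.
import math, random, heapq

random.seed(1)
OBSTACLE = (5.0, 5.0, 2.0)  # hypothetical circular obstacle: (cx, cy, radius)

def collision_free(p, q, steps=20):
    """True if the straight segment p-q stays outside the obstacle."""
    cx, cy, r = OBSTACLE
    for i in range(steps + 1):
        t = i / steps
        x, y = p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])
        if math.hypot(x - cx, y - cy) < r:
            return False
    return True

def prm(start, goal, n_samples=150, connect_radius=2.5):
    nodes = [start, goal] + [(random.uniform(0, 10), random.uniform(0, 10))
                             for _ in range(n_samples)]
    nodes = [p for p in nodes if collision_free(p, p)]  # drop samples inside obstacle
    edges = {i: [] for i in range(len(nodes))}
    for i, p in enumerate(nodes):
        for j in range(i + 1, len(nodes)):
            q = nodes[j]
            d = math.dist(p, q)
            if d < connect_radius and collision_free(p, q):
                edges[i].append((j, d)); edges[j].append((i, d))
    # Dijkstra over the roadmap: node 0 is start, node 1 is goal.
    dist, prev, pq = {0: 0.0}, {}, [(0.0, 0)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == 1 or d > dist.get(u, float("inf")):
            continue
        for v, w in edges[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v], prev[v] = d + w, u
                heapq.heappush(pq, (d + w, v))
    if 1 not in dist:
        return None
    path, u = [], 1
    while True:
        path.append(nodes[u])
        if u == 0:
            break
        u = prev[u]
    return path[::-1]

path = prm((1.0, 1.0), (9.0, 9.0))
```

GPRM would constrain the sampling to a predefined grid region, while PPRM samples the whole map and weights roadmap nodes by a probability; both reuse the connect-and-search skeleton above.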
Abstract:
Intensive use of Distributed Generation (DG) represents a change in the paradigm of power systems operation, making small-scale energy generation and storage decision-making relevant for the whole system. This paradigm led to the concept of the smart grid, for which efficient management, in both technical and economic terms, should be assured. This paper presents a new approach to solve the economic dispatch in smart grids. The proposed methodology for resource management involves two stages. The first one uses fuzzy set theory to define the range forecast for natural resources as well as the load forecast. The second stage uses heuristic optimization to determine the economic dispatch, considering the generation forecast, storage management and demand response.
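The two-stage idea can be sketched as follows. This is an illustrative reading, not the paper's implementation: the triangular fuzzy number, the alpha-cut, the unit names and all figures are invented, and a greedy merit-order rule stands in for the paper's heuristic optimizer.

```python
# Stage 1: a triangular fuzzy forecast gives a generation range (alpha-cut).
# Stage 2: a greedy merit-order heuristic dispatches resources against the load.

def triangular_range(low, peak, high, alpha):
    """Alpha-cut of a triangular fuzzy number: the interval of values whose
    membership degree is at least alpha (alpha in [0, 1])."""
    return (low + alpha * (peak - low), high - alpha * (high - peak))

def merit_order_dispatch(load, units):
    """Greedy heuristic: fill the load with the cheapest units first.
    units: list of (name, capacity_MW, cost_per_MWh); returns a schedule dict."""
    schedule, remaining = {}, load
    for name, cap, _cost in sorted(units, key=lambda u: u[2]):
        take = min(cap, remaining)
        schedule[name] = take
        remaining -= take
        if remaining <= 0:
            break
    return schedule

# Pessimistic end of the wind range at alpha = 0.5 (all numbers hypothetical).
wind_min, _wind_max = triangular_range(10.0, 25.0, 40.0, 0.5)
units = [("wind", wind_min, 0.0), ("storage", 10.0, 30.0), ("diesel", 50.0, 90.0)]
plan = merit_order_dispatch(60.0, units)
```

Using the pessimistic bound of the fuzzy range for the zero-cost resource is one conservative choice; the paper's method may combine the bounds differently.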
Abstract:
Aims: Obesity and asthma are widely prevalent and associated disorders. Recent studies by our group revealed that Substance P (SP) is involved in the pathophysiology of the obese-asthma phenotype in mice through its selective NK1 receptor (NK1-R). Lymphangiogenesis is impaired in asthma and obesity, and SP activates contractile and inflammatory pathways in lymphatics. Our aim was to study whether NK1-R expression was involved in lymphangiogenesis in visceral (VAT) and subcutaneous (SAT) adipose tissues and in the lungs of obese, allergen-sensitized mice. Main methods: Diet-induced obese and ovalbumin (OVA)-sensitized Balb/c mice were treated with a selective NK1-R antagonist (CJ 12,255, Pfizer Inc., USA) or placebo. Lymphatic structures (LYVE-1+) and NK1-R expression were analyzed by immunohistochemistry. A semi-quantitative score methodology was used for NK1-R expression. Key findings: Obesity and allergen-sensitization together increased the number of LYVE-1+ lymphatics in VAT and decreased it in SAT and lungs. NK1-R was mainly expressed on adipocyte membranes of VAT, blood vessel areas of SAT, and in lung epithelium. Obesity and allergen-sensitization combined increased the expression of NK1-R in VAT, SAT and lungs. NK1-R antagonist treatment reversed the effects observed in lymphangiogenesis in those tissues. Significance: The obese-asthma phenotype in mice is accompanied by increased expression of NK1-R on adipose tissues and lung epithelium, reflecting that SP released during inflammation may act directly on these tissues. Blocking NK1-R affects lymphangiogenesis, implying a role of SP, with opposite physiological consequences in VAT versus SAT and lungs. Our results provide a clue to a novel SP role in the obese-asthma phenotype.
Abstract:
In real optimization problems, the analytical expression of the objective function and of its derivatives is usually unknown, or too complex. In these cases it becomes essential to use optimization methods that do not require computing the derivatives or verifying their existence: direct search methods, also called derivative-free methods, are one solution. When the problem has constraints, penalty functions are often used. Unfortunately, choosing the penalty parameters is frequently very difficult, because most strategies for choosing them are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a biobjective problem, in which a step is accepted if it reduces either the objective function or the constraint violation. This makes filter methods less parameter-dependent than penalty functions. In this work, we present a new direct search method, based on simplex methods, for general constrained optimization, which combines the features of the simplex method and of filter methods. The method does not compute or approximate any derivatives, penalty constants or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use it to drive the search. We illustrate the behaviour of our algorithm through some examples. The proposed methods were implemented in Java.
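The filter acceptance rule at the heart of such methods can be written down compactly. This is a hedged Python sketch of the rule only (the abstract's implementation is in Java, and its simplex machinery is not reproduced here): a trial point is summarized by the pair (f, h) of objective value and aggregated constraint violation, and it is accepted if no stored pair dominates it.

```python
# Filter rule: a stored pair (fi, hi) dominates (f, h) when fi <= f and hi <= h.
# A trial point is accepted into the filter iff it is non-dominated.

def dominated(point, filter_set):
    f, h = point
    return any(fi <= f and hi <= h for fi, hi in filter_set)

def try_accept(point, filter_set):
    """Accept the trial point if it is non-dominated; on acceptance, drop any
    old entries that the new point itself dominates."""
    if dominated(point, filter_set):
        return False
    f, h = point
    filter_set[:] = [(fi, hi) for fi, hi in filter_set
                     if not (f <= fi and h <= hi)]
    filter_set.append(point)
    return True

flt = [(5.0, 0.2)]
accepted_better_f = try_accept((4.0, 0.5), flt)   # lower objective: accepted
rejected_dominated = try_accept((6.0, 0.3), flt)  # dominated by (5.0, 0.2)
```

Note how the rule needs no penalty parameter: improvement in either coordinate is enough, which is exactly the reduced parameter dependence the abstract claims.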
Abstract:
New potentiometric membrane sensors with cylindrical configuration for tetracycline (TC) are described, based on the use of a newly designed molecularly imprinted polymer (MIP) material consisting of 2-vinylpyridine as a functional monomer in a plasticized PVC membrane. The sensor exhibited significantly enhanced response towards TC over the concentration range 1.59×10−5 to 1.0×10−3 mol L−1 at pH 3–5, with a lower detection limit of 1.29×10−5 mol L−1. The response was near-Nernstian, with an average slope of 63.9 mV decade−1. The effects of lipophilic salts and of various common foreign ions were tested and found to be negligible. The possibility of applying the proposed sensor to TC determination in spiked biological fluid samples was demonstrated.
Abstract:
In order to combat a variety of pests, pesticides are widely used on fruits. Several extraction procedures (liquid extraction, single-drop microextraction, microwave-assisted extraction, pressurized liquid extraction, supercritical fluid extraction, solid-phase extraction, solid-phase microextraction, matrix solid-phase dispersion, and stir bar sorptive extraction) have been reported for determining pesticide residues in fruits and fruit juices. The most significant change in recent years is the introduction of the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) methods in the analysis of these matrices. Combinations of new extraction methods with chromatographic techniques have been reported to provide better quantitative recoveries at low levels. The use of mass spectrometric detectors in combination with liquid and gas chromatography has played a vital role in solving many problems related to food safety. The main focus of this review is on the achievements made possible by progress in extraction methods and by the latest advances and novelties in mass spectrometry, and on how these have improved food control, raising food safety and quality standards.
Abstract:
As a result of the stressful conditions in aquaculture facilities there is a high risk of bacterial infections among cultured fish. Chlortetracycline (CTC) is one of the antimicrobials used to address this problem. It is a broad-spectrum antibacterial active against a wide range of Gram-positive and Gram-negative bacteria. Numerous analytical methods for screening, identifying, and quantifying CTC in animal products have been developed over the years. An alternative and advantageous method should rely on expeditious and efficient procedures providing highly specific and sensitive measurements in food samples. Ion-selective electrodes (ISEs) could meet these criteria. The only ISE reported in the literature for this purpose used traditional electro-active materials. A selectivity enhancement could, however, be achieved by improving analyte recognition with molecularly imprinted polymers (MIPs). Several MIP particles were synthesized and used as electro-active materials. ISEs based on methacrylic acid monomers showed the best analytical performance in terms of slope (62.5 and 68.6 mV/decade) and detection limit (4.1×10−5 and 5.5×10−5 mol L−1). The electrodes displayed good selectivity and are not affected by pH changes over the range 2.5 to 13. The sensors were successfully applied to the analysis of serum, urine and fish samples.
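The slope figures quoted for these ISEs (mV per decade of concentration) come from a calibration line of potential against the logarithm of concentration. A sketch of that computation, with made-up calibration points rather than data from the study:

```python
# Least-squares estimate of an ISE calibration slope: E (mV) vs log10(C).
import math

def calibration_slope(concentrations, potentials_mV):
    """Least-squares slope of potential (mV) versus log10(concentration)."""
    x = [math.log10(c) for c in concentrations]
    n = len(x)
    mx, my = sum(x) / n, sum(potentials_mV) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, potentials_mV))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Synthetic near-Nernstian data: 60 mV change per tenfold concentration step.
conc = [1e-5, 1e-4, 1e-3, 1e-2]
emf = [100.0, 160.0, 220.0, 280.0]
slope = calibration_slope(conc, emf)
```

A slope near the theoretical Nernstian value (about 59 mV/decade for a monovalent ion at 25 °C) is what qualifies a response as "near-Nernstian"; the 62.5 and 68.6 mV/decade figures above are read off exactly such fits.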
Abstract:
Optimization problems arise in science, engineering, economy, etc., and we need to find the best solution for each reality. The methods used to solve these problems depend on several factors, including the amount and type of accessible information, the algorithms available for solving them and, obviously, the intrinsic characteristics of the problem. There are many kinds of optimization problems and, consequently, many kinds of methods to solve them. When the involved functions are nonlinear and their derivatives are unknown or very difficult to calculate, suitable methods are scarcer. Functions of this kind are frequently called black-box functions. To solve such problems without constraints (unconstrained optimization), we can use direct search methods, which do not require any derivatives or approximations of them. But when the problem has constraints (nonlinear programming problems) and, additionally, the constraint functions are black-box functions, it is much more difficult to find the most appropriate method. Penalty methods can then be used. They transform the original problem into a sequence of other problems, derived from the initial one, all without constraints. This sequence of unconstrained problems can then be solved using the methods available for unconstrained optimization. In this chapter, we present a classification of some of the existing penalty methods and describe some of their assumptions and limitations. These methods allow the solving of optimization problems with continuous, discrete, and mixed constraints, without requiring continuity, differentiability, or convexity. Thus, penalty methods can be used as the first step in the resolution of constrained problems, by means of methods typically used for unconstrained problems. We also discuss a new class of penalty methods for nonlinear optimization, which adjusts the penalty parameter dynamically.
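The sequence-of-unconstrained-problems idea can be made concrete with a minimal quadratic-penalty sketch. Everything here is illustrative, not the chapter's method: the toy problem (minimize x² subject to x ≥ 1), the update schedule for the penalty parameter, and the crude derivative-free grid search standing in for any unconstrained solver.

```python
# Quadratic penalty method: solve a sequence of unconstrained problems
#   minimize  f(x) + mu * violation(x)**2
# with mu increased at each step, for f(x) = x**2 subject to x >= 1.

def violation(x):
    """How far x is from satisfying the constraint x >= 1."""
    return max(0.0, 1.0 - x)

def penalized(x, mu):
    return x * x + mu * violation(x) ** 2

def minimize_1d(func, lo=-5.0, hi=5.0, steps=20000):
    """Crude derivative-free grid search, a stand-in for any unconstrained solver."""
    best_x, best_f = lo, func(lo)
    for i in range(steps + 1):
        x = lo + (hi - lo) * i / steps
        f = func(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x

mu, x = 1.0, 0.0
for _ in range(10):                      # dynamically increase the penalty parameter
    x = minimize_1d(lambda x, m=mu: penalized(x, m))
    mu *= 10.0
# x approaches the constrained minimizer x* = 1 as mu grows
```

For this problem each subproblem's minimizer is mu/(1 + mu), so the iterates approach the constrained solution x* = 1 only in the limit; the "new class" mentioned in the abstract concerns smarter rules for driving mu than the fixed tenfold growth used here.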
Abstract:
Constrained nonlinear optimization problems are usually solved using penalty or barrier methods combined with unconstrained optimization methods. Another alternative for solving constrained nonlinear optimization problems is the filter method. Filter methods, introduced by Fletcher and Leyffer in 2002, have been widely used in several areas of constrained nonlinear optimization. These methods treat the optimization problem as a bi-objective problem, attempting to minimize the objective function and a continuous function that aggregates the constraint violation functions. Audet and Dennis presented the first filter method for derivative-free nonlinear programming, based on pattern search methods. Motivated by this work we have developed a new direct search method, based on simplex methods, for general constrained optimization, which combines the features of the simplex method and the filter method. This work presents a new variant of these methods, which combines the filter method with other direct search methods, and proposes some alternatives for aggregating the constraint violation functions.
Abstract:
Glass fibre-reinforced plastics (GFRP), nowadays commonly used in the construction, transportation and automobile sectors, have been considered inherently difficult to recycle, due both to the cross-linked nature of thermoset resins, which cannot be remolded, and to the complex composition of the composite itself, which includes glass fibres, matrix and different types of inorganic fillers. Presently, most GFRP waste is landfilled, leading to negative environmental impacts and supplementary added costs. With an increasing awareness of environmental matters and the subsequent desire to save resources, recycling would convert an expensive waste disposal into a profitable reusable material. There are several methods to recycle GFRP thermoset materials: (a) incineration, with partial energy recovery due to the heat generated during combustion of the organic part; (b) thermal and/or chemical recycling, such as solvolysis, pyrolysis and similar thermal decomposition processes, with recovery of the glass fibres; and (c) mechanical recycling or size reduction, in which the material is subjected to a milling process in order to obtain a specific grain size that makes the material suitable as reinforcement in new formulations. This last method has important advantages over the previous ones: there is no atmospheric pollution by gas emission, much simpler equipment is required compared with the ovens necessary for thermal recycling processes, and no chemical solvents, with their subsequent environmental impacts, are needed. In this study, the effect of incorporating recycled GFRP waste materials, obtained by means of milling processes, on the mechanical behavior of polyester polymer mortars was assessed. For this purpose, different contents of recycled GFRP waste materials, with distinct size gradings, were incorporated into polyester polymer mortars as sand aggregate and filler replacements. The effect of GFRP waste treatment with a silane coupling agent was also assessed.
Design of experiments and data treatment were accomplished by means of factorial design and analysis of variance (ANOVA). The use of a factorial experiment design, instead of the one-factor-at-a-time method, allows an efficient evaluation of the effects and possible interactions of the different material factors involved. Experimental results were promising regarding the recyclability of GFRP waste materials as aggregate and filler replacements for polymer mortar, with significant gains in mechanical properties relative to unmodified polymer mortars.
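The advantage of a factorial design over one-factor-at-a-time can be shown in a few lines: a 2-level full factorial estimates each factor's main effect (and interactions) from the same small set of runs. The factors, runs and strength values below are invented for illustration, not the study's data.

```python
# 2^2 full factorial design: estimate main effects from four runs.
from itertools import product

def main_effect(levels, responses, factor):
    """Average response at the factor's high level minus at its low level."""
    hi = [r for lv, r in zip(levels, responses) if lv[factor] == +1]
    lo = [r for lv, r in zip(levels, responses) if lv[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

runs = list(product([-1, +1], repeat=2))   # coded levels: (waste content, silane)
strength = [30.0, 28.0, 33.0, 36.0]        # hypothetical strength responses (MPa)

effect_waste = main_effect(runs, strength, factor=0)
effect_silane = main_effect(runs, strength, factor=1)
```

An ANOVA would then test whether effects like these are statistically significant; with only one replicate per run, as sketched here, the interaction term would normally serve as the error estimate.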
Abstract:
Volatile organic compounds are a common source of groundwater contamination that can be easily removed by air stripping in columns with random packing, using counter-current flow between the phases. This work proposes a new column design methodology, valid for any type of packing and contaminant, which avoids the need for an arbitrarily chosen diameter. It also avoids the usual graphical Eckert correlations for pressure drop; the hydraulic features are chosen in advance as a design criterion. The design procedure was translated into a convenient algorithm in the C++ language. A column was built in order to test the design and the theoretical steady-state and dynamic behaviour. The experiments were conducted using a solution of chloroform in distilled water. The results allowed a correction of the theoretical global mass-transfer coefficient previously estimated by the Onda correlations, which depend on several parameters that are not easy to control experimentally. To better describe the column behaviour in steady-state and dynamic conditions, an original mathematical model was developed. It consists of a system of two nonlinear partial differential equations (distributed parameters). When the flows are steady the system becomes linear, although it still lacks an evident analytical solution. In steady state the resulting ODE can be solved by analytical methods; in the dynamic state, discretization of the PDE by finite differences overcomes this difficulty. A numerical algorithm was used to estimate the contaminant concentrations in both phases along the column. The large number of resulting algebraic equations and the impossibility of generating a recursive procedure did not allow the construction of a generalized program, but an iterative procedure developed in an electronic worksheet allowed the simulation. The solution is stable only for similar discretization values.
If different values are used for the time and space discretization parameters, the solution easily becomes unstable. The system's dynamic behaviour was simulated for the common liquid-phase perturbations: step, impulse, rectangular pulse and sinusoidal. The final results do not show strange or unpredictable behaviour.
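The finite-difference approach, and its sensitivity to the discretization step, can be illustrated on a drastically simplified stand-in for the paper's two-equation model: a single liquid-phase balance dc/dt = -u dc/dz - K c (the gas phase is assumed clean, and every number below is invented), discretized with an explicit upwind scheme whose stability depends on the CFL ratio u·dt/dz.

```python
# Explicit upwind finite-difference simulation of a 1-D stripping balance:
#   dc/dt = -u * dc/dz - K * c
# c: liquid contaminant concentration, u: liquid velocity, K: stripping rate.

def simulate_column(c_in=1.0, u=0.01, K=0.05, L=1.0, nz=50, dt=0.5, t_end=600.0):
    dz = L / nz
    assert u * dt / dz <= 1.0, "CFL condition: explicit scheme unstable otherwise"
    c = [0.0] * (nz + 1)               # initially clean column
    t = 0.0
    while t < t_end:
        new = c[:]
        new[0] = c_in                  # liquid feed boundary at z = 0
        for i in range(1, nz + 1):
            adv = -u * (c[i] - c[i - 1]) / dz   # upwind advection term
            new[i] = c[i] + dt * (adv - K * c[i])
        c = new
        t += dt
    return c                            # outlet concentration is c[-1]

profile = simulate_column()
```

With u·dt/dz = 0.25 the scheme is stable and the outlet concentration settles near its steady value; raising dt until the CFL ratio exceeds 1 reproduces the instability the abstract warns about when time and space steps are mismatched.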
Abstract:
To meet the increasing demands of complex inter-organizational processes and the demand for continuous innovation and internationalization, new forms of organisation are evidently being adopted, fostering more intensive collaboration processes and the sharing of resources, in what can be called collaborative networks (Camarinha-Matos, 2006:03). Information and knowledge are crucial resources in collaborative networks, and their management is a fundamental process to optimize. Knowledge organisation and collaboration systems are thus important instruments for the success of collaborative networks of organisations, and have been researched over the last decade in the areas of computer science, information science, management sciences, terminology and linguistics. Nevertheless, research in this area did not give much attention to multilingual contexts of collaboration, which pose specific and challenging problems. It is then clear that access to and representation of knowledge will happen more and more in multilingual settings, which implies overcoming the difficulties inherent to the presence of multiple languages, through processes such as the localization of ontologies. Although localization, like other processes that involve multilingualism, is a rather well-developed practice, and its methodologies and tools are fruitfully employed by the language industry in the development and adaptation of multilingual content, it has not yet been sufficiently explored as an element of support for the development of knowledge representations, in particular ontologies, expressed in more than one language. Multilingual knowledge representation is thus an open research area calling for cross-contributions from knowledge engineering, terminology, ontology engineering, cognitive sciences, computational linguistics, natural language processing, and management sciences.
This workshop brought together researchers interested in multilingual knowledge representation, in a multidisciplinary environment, to debate the possibilities of cross-fertilization between knowledge engineering, terminology, ontology engineering, cognitive sciences, computational linguistics, natural language processing, and management sciences, applied to contexts where multilingualism continuously creates new and demanding challenges for current knowledge representation methods and techniques. Six papers dealing with different approaches to multilingual knowledge representation are presented, most of them describing tools, approaches and results obtained in ongoing projects. In the first case, Andrés Domínguez Burgos, Koen Kerremans and Rita Temmerman present a software module that is part of a workbench for terminological and ontological mining: Termontospider, a wiki crawler that aims to optimally traverse Wikipedia in search of domain-specific texts for extracting terminological and ontological information. The crawler is part of a tool suite for automatically developing multilingual termontological databases, i.e. ontologically underpinned multilingual terminological databases. In this paper the authors describe the basic principles behind the crawler and summarize the research setting in which the tool is currently being tested. In the second paper, Fumiko Kano presents work comparing four feature-based similarity measures derived from the cognitive sciences. The purpose of the comparative analysis presented by the author is to identify the potentially most effective model for mapping independent ontologies in a culturally influenced domain. For that, datasets based on standardized pre-defined feature dimensions and values, obtainable from the UNESCO Institute for Statistics (UIS), have been used for the comparative analysis of the similarity measures.
The purpose of the comparison is to verify the similarity measures on objectively developed datasets. According to the author, the results demonstrate that the Bayesian Model of Generalization provides the most effective cognitive model for identifying the most similar corresponding concepts for a targeted socio-cultural community. In another presentation, Thierry Declerck, Hans-Ulrich Krieger and Dagmar Gromann present ongoing work and propose an approach to the automatic extraction of information from multilingual financial Web resources, to provide candidate terms for building ontology elements or instances of ontology concepts. The authors present an approach complementary to the direct localization/translation of ontology labels: terminologies are acquired by accessing and harvesting the multilingual Web presences of structured information providers in the field of finance. This leads to the detection of candidate terms in various multilingual sources in the financial domain that can be used not only as labels of ontology classes and properties but also for the possible generation of (multilingual) domain ontologies themselves. In the next paper, Manuel Silva, António Lucas Soares and Rute Costa claim that despite the availability of tools, resources and techniques aimed at the construction of ontological artifacts, developing a shared conceptualization of a given reality still raises questions about the principles and methods that support the initial phases of conceptualization. These questions become, according to the authors, more complex when the conceptualization occurs in a multilingual setting.
To tackle these issues the authors present a collaborative platform, conceptME, where terminological and knowledge representation processes support domain experts throughout a conceptualization framework, allowing the inclusion of multilingual data as a way to promote knowledge sharing, enhance conceptualization and support a multilingual ontology specification. In another presentation, Frieda Steurs and Hendrik J. Kockaert present TermWise, a large project dealing with legal terminology and phraseology for the Belgian public services, i.e. the translation office of the ministry of justice. The project aims to develop an advanced tool that embeds expert knowledge in the algorithms extracting specialized language from textual data (legal documents); its outcome is a knowledge database of Dutch/French equivalents for legal concepts, enriched with the phraseology related to the terms under discussion. Finally, Deborah Grbac, Luca Losito, Andrea Sada and Paolo Sirito report on the preliminary results of a pilot project currently ongoing at the UCSC Central Library, where they propose to adapt, for subject librarians employed in large and multilingual academic institutions, the model used by translators working within European Union institutions. The authors use User Experience (UX) analysis to provide subject librarians with visual support, by means of "ontology tables" depicting conceptual linking and the connections of words with concepts, presented according to their semantic and linguistic meaning. The organizers hope that the selection of papers presented here will be of interest to a broad audience and will be a starting point for further discussion and cooperation.
Abstract:
Pea shoots are a new option as a ready-to-eat baby-leaf vegetable. However, data about the nutritional composition and the shelf-life stability of these leaves, especially their phytonutrient composition, are scarce. In this work, the macronutrient, micronutrient and phytonutrient profiles of minimally processed pea shoots were evaluated at the beginning and at the end of a 10-day storage period. Several physicochemical characteristics (color, pH, total soluble solids, and total titratable acidity) were also monitored. Standard AOAC methods were applied for the nutritional value evaluation, while chromatographic methods with UV-vis and mass detection were used to analyze free forms of vitamins (HPLC-DAD-ESI-MS/MS), carotenoids (HPLC-DAD-APCI-MSn) and flavonoid compounds (HPLC-DAD-ESI-MSn). Atomic absorption spectrometry (HR-CS-AAS) was employed to characterize the mineral content of the leaves. As expected, pea leaves had high water (91.5%) and low fat (0.3%) and carbohydrate (1.9%) contents, being a good source of dietary fiber (2.1%). Pea shoots showed a high content of vitamins C, E and A, potassium and phosphorus compared with other ready-to-eat green leafy vegetables. The carotenoid profile revealed a high content of β-carotene and lutein, typical of green leafy vegetables. The leaves had a mean flavonoid content of 329 mg/100 g of fresh product, mainly composed of glycosylated quercetin and kaempferol derivatives. Pea shoots kept their fresh appearance during storage, with color maintained throughout the shelf life. The nutritional composition was in general stable during storage, showing some significant (p < 0.05) variation in certain water-soluble vitamins.
Abstract:
Demand response has gained increasing importance in the context of competitive electricity markets. The use of demand resources is also advantageous in the context of smart grid operation. In addition to the need for new business models for integrating demand response, adequate methods are necessary for an accurate evaluation of consumers' performance after participation in a demand response event. The present paper compares some of the existing baseline methods for consumers' performance evaluation, contrasting the results obtained with these methods and with a method proposed by the authors. A case study demonstrates the application of the referred methods to real consumption data from a consumer connected to a distribution network.
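One widely used family of such baselines, an "average of past days" rule, can be sketched as follows. This is an illustrative example only: the "HighXofY" rule shown is a common industry baseline type, not necessarily one of the methods compared in the paper (and not the authors' proposed method), and the consumption figures are invented.

```python
# Baseline and performance evaluation for a demand response event.

def high_x_of_y_baseline(daily_loads, x=3):
    """'HighXofY'-style baseline: average of the x highest of the last y days'
    consumption for the event hour (here y = len(daily_loads))."""
    return sum(sorted(daily_loads, reverse=True)[:x]) / x

def dr_performance(baseline_kW, metered_kW):
    """Demand reduction delivered during the event: baseline minus metered load."""
    return baseline_kW - metered_kW

past_days = [52.0, 48.0, 50.0, 47.0, 55.0]   # same hour, previous 5 days (kW)
baseline = high_x_of_y_baseline(past_days)   # average of the 3 highest days
reduction = dr_performance(baseline, metered_kW=40.0)
```

The choice of baseline matters because the measured "reduction" is entirely relative to it; that sensitivity is precisely what motivates comparing baseline methods on the same real consumption data.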