161 results for extractor
Abstract:
Methods for accessing data on the Web have been the focus of active research over the past few years. In this thesis we propose a method for representing Web sites as data sources. We designed Data Extractor, a data retrieval solution that allows us to define queries to Web sites and process the resulting data sets. Data Extractor is being integrated into the MSemODB heterogeneous database management system, with which database queries can be distributed over both local and Web data sources within the MSemODB framework. Data Extractor treats Web sites as data sources, controlling query execution and data retrieval, and works as an intermediary between applications and sites. It utilizes a two-fold "custom wrapper" approach to information retrieval: wrappers for the majority of sites are easily built using a powerful and expressive scripting language, while complex cases are handled by Java-based wrappers that draw on a specially designed library of data retrieval, parsing and Web access routines. In addition to wrapper development, we thoroughly investigate issues associated with Web site selection, analysis and processing. Data Extractor is designed to act as a data retrieval server as well as an embedded data retrieval solution. We also use it to create mobile agents that are shipped over the Internet to the client's computer to perform data retrieval on behalf of the user, which allows Data Extractor to distribute and scale well. This study confirms the feasibility of building custom wrappers for Web sites: the approach provides accurate data retrieval, together with power and flexibility in handling complex cases.
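The "custom wrapper" idea above can be illustrated with a minimal sketch: field names are mapped to declarative extraction patterns, in the spirit of the thesis's scripting-language wrappers. This is a hypothetical Python illustration, not the actual Data Extractor language or its Java library; the class, page and patterns are invented.

```python
import re

# Hypothetical sketch of a declarative "custom wrapper": each field name is
# mapped to a regex pattern; the wrapper applies every pattern to a fetched
# page and returns one record. The real system uses its own scripting
# language and a Java routine library, neither of which is shown here.
class SimpleWrapper:
    def __init__(self, field_patterns):
        self.field_patterns = {name: re.compile(p, re.S)
                               for name, p in field_patterns.items()}

    def extract(self, page):
        """Return a record with one value per declared field (None if absent)."""
        record = {}
        for name, pattern in self.field_patterns.items():
            m = pattern.search(page)
            record[name] = m.group(1).strip() if m else None
        return record

page = "<h1>Widget</h1><span class='price'>$9.99</span>"
wrapper = SimpleWrapper({
    "title": r"<h1>(.*?)</h1>",
    "price": r"class='price'>(.*?)</span>",
})
print(wrapper.extract(page))  # {'title': 'Widget', 'price': '$9.99'}
```

For simple sites such a declarative mapping is enough, which matches the thesis's observation that most wrappers are easy to build; the programmatic (Java-based) path is reserved for cases this style cannot express.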
Abstract:
Opuntia ficus-indica (L.) Mill is a cactacea present in the Caatinga ecosystem whose chemical composition includes flavonoids, galacturonic acid and sugars. Different hydroglycolic (EHG001 and EHG002) and hydroethanolic, subsequently lyophilized (EHE001 and EHE002), extracts were developed. The preliminary phytochemical composition of EHE002 was investigated by thin layer chromatography (TLC), and a predominance of flavonoids was observed. Different formulations were prepared as emulsions with Sodium Polyacrylate (and) Hydrogenated Polydecene (and) Trideceth-6 (Rapithix® A60), and Polyacrylamide (and) C13-14 Isoparaffin (and) Laureth-7 (Sepigel® 305), and as a gel with Sodium Polyacrylate (Rapithix® A100). The sensorial evaluation was conducted by the check-all-that-apply method. There were no significant differences between the scores assigned to the formulations; however, we noted a preference for those formulated with 1.5% of Rapithix® A100 and 3.0% of Sepigel® 305. These and the formulation with 3% Rapithix® A60 were tested for preliminary and accelerated stability. In the accelerated stability study, samples were stored at different temperatures for 90 days, and organoleptic characteristics, pH values and rheological behavior were assessed. The emulsions formulated with 3.0% of Sepigel® 305 and 1.5% of Rapithix® A60 were stable, with pseudoplastic and thixotropic behavior. The moisturizing clinical efficacy of the emulsions with 3.0% of Sepigel® 305 containing 1 and 3% of EHG001 was assessed using the capacitance method (Corneometer®) and transepidermal water loss (TEWL) evaluation (Tewameter®). The results showed that the formulation with 3% of EHG001 increased skin moisturization compared with the vehicle and the extractor-solvent formulation after five hours. The formulations containing 1 and 3% of EHG001 increased the skin barrier effect by reducing transepidermal water loss for up to four hours after application.
Abstract:
Dengue fever, currently the most important arbovirus disease, is transmitted by the bite of the Aedes aegypti mosquito. Given the absence of a prophylactic vaccine, the disease can only be controlled by combating the vector insect. However, increasing reports of resistance and of environmental damage caused by insecticides have prompted an urgent search for new, safer alternatives. Twenty-one plant seed extracts from the Caatinga were prepared, tested and characterized, using sodium phosphate (50 mM, pH 8.0) as the extractor. All extracts showed larvicidal and ovipositional deterrence activity. Extracts of D. grandiflora, E. contortisiliquum, A. cearenses, C. ferrea and C. retusa were able to attract females for oviposition at low concentration. At the attractive concentrations, the crude extracts (CE) of E. contortisiliquum and A. cearenses killed 52% and 100% of the larvae, respectively. The extracts of A. cearenses, P. viridiflora, E. velutina, M. urundeuva and S. brasiliensis were also pupicidal, while extracts of P. viridiflora, E. velutina, E. contortisiliquum, A. cearenses, A. colubrina, D. grandiflora, B. cheilantha, S. spectabilis, C. pyramidalis, M. regnelli and G. americana displayed adulticidal activity. All extracts were toxic to the zooplankton C. dubia. The crude extracts of E. velutina and E. contortisiliquum did not affect the viability of fibroblasts. In all extracts, at least two potential insecticidal proteins were identified, such as enzyme inhibitors, lectins and chitin-binding proteins, as well as components of secondary metabolism. Considering all bioassays, the extracts from A. cearenses, P. viridiflora, E. contortisiliquum, S. brasiliensis, E. velutina and M. urundeuva were considered the most promising. The E. contortisiliquum extract was the only one that did not show pupicidal activity, indicating that its larvicidal and adulticidal mechanism of action is related solely to the ingestion of toxic compounds by the insect, so it was selected for fractionation.
As observed for the CE, the protein fractions of E. contortisiliquum also showed larvicidal activity; notably, fraction F2 showed higher larvicidal activity and lower environmental toxicity than the source CE. The reduction in the proteolytic activity of larvae fed with the crude extract and fractions of E. contortisiliquum suggested that the trypsin inhibitors (ITEc) were responsible for the larvicidal activity. However, further purification of this inhibitor resulted in loss of larvicidal activity, while its absence reduced the effectiveness of the fractions, indicating that ITEc contributes to the larvicidal activity of this extract. No larvicidal or adulticidal activity was observed for the vicilin-rich fraction, nor was there evidence of a contribution of this molecule to the larvicidal activity of the extract. The results show the potential of seed extracts from Caatinga plants as a source of molecules active against the insect A. aegypti at different stages of its development cycle, since they are composed of different active compounds, including some of protein nature, which act through different mechanisms that may result in the death of the insect.
Abstract:
The conventional control schemes applied to Shunt Active Power Filters (SAPFs) are harmonic-extractor-based strategies (HEBSs), because their effectiveness depends on how quickly and accurately the harmonic components of the nonlinear loads are identified. The SAPF can also be implemented without load harmonic extractors; in this case, the harmonic compensating term is obtained from the system's active power balance. Such systems can be considered balanced-energy-based schemes (BEBSs), and their performance depends on how fast the system reaches the equilibrium state. Here, the phase currents of the power grid are indirectly regulated by double sequence controllers (DSCs) with two degrees of freedom, where the internal model principle is employed to avoid reference-frame transformation. Additionally, the DSC presents robustness when the SAPF operates under unbalanced conditions. Furthermore, a SAPF implemented without harmonic detection schemes compensates simultaneously for the harmonic distortion and the reactive power of the load. Its compensation capability, however, is limited by the rating of the SAPF power converter. This restriction can be minimized if the level of reactive power correction is managed. In this work, an estimation scheme for determining the filter currents is introduced to manage the compensation of reactive power. Experimental results demonstrate the performance of the proposed SAPF system.
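The balanced-energy idea above amounts to removing the fundamental active component from the load current and letting the filter supply the rest. A purely illustrative numerical sketch (not the thesis's DSC controller) shows this separation over one line period, with a distorted load current and its fundamental recovered by Fourier correlation; the waveform and harmonic amplitudes are invented.

```python
import math

# Illustrative sketch: the compensating reference of an active filter can be
# formed by subtracting the fundamental component (estimated by Fourier
# correlation over one period) from the measured load current. This mimics
# the role of the power-balance term, not the actual controller in the work.
N = 256                        # samples per fundamental period
t = [k / N for k in range(N)]  # normalized time over one period

# distorted load current: fundamental plus 5th and 7th harmonics (invented)
i_load = [math.sin(2*math.pi*x)
          + 0.3*math.sin(2*math.pi*5*x)
          + 0.2*math.sin(2*math.pi*7*x) for x in t]

# Fourier correlation extracts the fundamental (cosine and sine parts)
a1 = 2/N * sum(i*math.cos(2*math.pi*x) for i, x in zip(i_load, t))
b1 = 2/N * sum(i*math.sin(2*math.pi*x) for i, x in zip(i_load, t))
i_fund = [a1*math.cos(2*math.pi*x) + b1*math.sin(2*math.pi*x) for x in t]

# harmonic compensating reference: everything except the fundamental
i_comp = [il - if_ for il, if_ in zip(i_load, i_fund)]
```

Because the sampled sinusoids are orthogonal over an integer number of periods, `i_comp` contains exactly the 5th and 7th harmonic content, which is what the filter would be asked to inject in antiphase.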
Abstract:
The use of plants for medicinal purposes is ancient, with widespread application in medicinal drugs. Although plants are promising sources for the discovery of new molecules of pharmacological interest, estimates show that only 17% of them have been studied for possible use in medicine. The biodiversity of the Brazilian flora thus represents an immense potential for economic use by the pharmaceutical industry. The plant Arrabidaea chica, popularly known as "pariri", is common in the Amazon region and is credited with several medicinal properties. The leaves of this plant are rich in anthocyanins, phenolic compounds with high antioxidant power. Antioxidant compounds play a vital role in the prevention of neurological and cardiovascular diseases, cancer and diabetes, among others. Among the anthocyanins found in Arrabidaea chica, Carajurin (6,7-dihydroxy-5,4'-dimethoxy-flavylium) stands out as the major pigment in this plant. The present work studied supercritical extraction and conventional (solid-liquid) extraction from leaves of Arrabidaea chica, evaluating the efficiency of the extractive processes, the antioxidant activity and the quantification of Carajurin in the extracts. Supercritical extractions used CO2 as solvent with the addition of a co-solvent (ethanol/water mixture) and were conducted by the dynamic method in a fixed-bed extractor. The trials followed a 2^(4-1) fractional factorial design; the dependent variables were process yield, Carajurin concentration and antioxidant activity, and the independent variables were pressure, temperature, co-solvent concentration (v/v) and concentration of water in the co-solvent mixture (v/v). Yields (mass of dry extract/mass of raw material used) obtained from supercritical extraction ranged from 15.1% to 32%, and the best result was obtained at 250 bar and 40 °C, with a co-solvent concentration of 30% and a water concentration in the co-solvent mixture of 50%.
Statistical analysis showed that the co-solvent concentration had a significant effect on the yield. Yields obtained from conventional extractions were 8.1% (water) and 5.5% (ethanol). Through HPLC (high-performance liquid chromatography) analysis, Carajurin was quantified in all extracts; concentration values (Carajurin mass/mass of dry extract) ranged between 1% and 2.21% for supercritical extraction. For conventional extraction, Carajurin was not detected in the aqueous extract, while the ethanolic extract showed a Carajurin content of 7.04%, making it more selective for Carajurin than the supercritical extraction. Evaluation of the antioxidant power of the supercritical extracts (2,2-diphenyl-1-picrylhydrazyl, DPPH, radical scavenging method) yielded EC50 values (effective concentration that neutralizes 50% of the free radicals) ranging from 38.34 to 86.13 μg/mL, while conventional extraction gave EC50 values of 167.34 (water) and 42.58 (ethanol) μg/mL. Quantification of total phenolic content (Folin-Ciocalteu analysis) of the supercritical extracts gave values ranging from 48.93 to 88.62 mg GAE/g extract (GAE = gallic acid equivalents), while solid-liquid extraction gave values of 37.63 (water) and 80.54 (ethanol) mg GAE/g extract. The good antioxidant activity cannot be attributed solely to the presence of Carajurin, but also to other antioxidant compounds present in Arrabidaea chica. By optimizing the experimental design, it was possible to identify the experiment with the best result considering the four dependent variables together: pressure of 200 bar, temperature of 40 °C, co-solvent concentration of 30% and water concentration in the co-solvent mixture of 30%.
It is concluded that, within the studied range, it is possible to achieve the optimum result using milder operating conditions, which implies lower costs and greater ease of operation.
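The fractional factorial design mentioned above halves the number of runs of a full two-level design for four factors. A minimal sketch of how such a design is generated, assuming the common defining relation I = ABCD (the actual generator used in the work is not stated, and the coded factor names are illustrative):

```python
from itertools import product

# Sketch of a 2^(4-1) fractional factorial design: three factors span a full
# 2^3 design and the fourth is confounded with their product (D = A*B*C),
# giving 8 runs instead of 16. Levels are coded -1 (low) / +1 (high); the
# physical values (e.g. 200/250 bar) would be mapped onto these codes.
factors = ["pressure", "temperature", "cosolvent_pct", "water_pct"]

runs = []
for a, b, c in product([-1, 1], repeat=3):
    d = a * b * c                     # generator D = ABC
    runs.append(dict(zip(factors, (a, b, c, d))))

print(len(runs))  # 8 runs instead of the 16 of a full 2^4 design
```

Every run satisfies A·B·C·D = +1, which is exactly the confounding pattern that lets main effects be estimated at half the experimental cost.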
Abstract:
Arrays of tidal energy converters have the potential to provide clean renewable energy for future generations. Benthic communities may, however, be affected by changes in current speeds resulting from arrays of tidal converters located in areas characterised by strong currents. Current speed, together with bottom type and depth, strongly influences benthic community distributions; however, the interaction of these factors in controlling benthic dynamics in high-energy environments is poorly understood. The Strangford Lough Narrows, the location of SeaGen, the world's first single full-scale, grid-compliant tidal energy extractor, is characterised by spatially heterogeneous high current flows. A hydrodynamic model was used to select a range of benthic community study sites with median flow velocities between 1.5 and 2.4 m/s in a depth range of 25–30 m. Twenty-five sites were sampled for macrobenthic community structure using drop-down video surveys to test the sensitivity of the distribution of benthic communities to changes in the flow field. A diverse range of species was recorded, consistent with high-current-flow environments and corresponding to very tide-swept faunal communities in the EUNIS classification. However, over the velocity range investigated, no changes in benthic communities were observed. This suggests that the high physical disturbance associated with the strong currents in the Strangford Narrows reflects the opportunistic nature of the benthic species present, with individuals being continuously and randomly affected by turbulent forces and physical damage. It is concluded that, during operation, the removal of energy by marine tidal energy arrays in the far field is unlikely to have a significant effect on benthic communities in high-flow environments. The results are of major significance to developers and regulators in the tidal energy industry when considering the environmental impacts for site licences.
Abstract:
The description of terms in traditional terminological resources is limited to certain information, such as the term (mainly nominal), its definition and its equivalent in a foreign language. This description rarely provides other information that can be very useful to users, especially when they consult the resources to deepen their knowledge of a specialized domain, to master professional writing, or to find contexts in which the term is used. Information useful in this respect includes the description of the actantial structure of terms, contexts drawn from authentic sources, and the inclusion of other parts of speech such as verbs. Verbs and deverbal nouns, or predicative terminological units (UTP), often ignored by classical terminology, are of great importance when it comes to expressing an action, a process or an event. Describing these units, however, requires a terminological description model that accounts for their particularities. A number of terminologists (Condamines 1993, Mathieu-Colas 2002, Gross and Mathieu-Colas 2001, and L'Homme 2012, 2015) have proposed description models based on different theoretical frameworks. Our research proposes a methodology for the terminological description of the UTPs of Arabic, specifically Modern Standard Arabic (MSA), according to Fillmore's theory of Frame Semantics (1976, 1977, 1982, 1985) and its application, the FrameNet project (Ruppenhofer et al. 2010). The specialized domain of interest is computing. In our research, we rely on a corpus collected from the web and draw on an existing terminological resource, the DiCoInfo (L'Homme 2008), to compile our own resource. Our objectives can be summarized as follows.
First, we lay the groundwork for an MSA version of this resource. This version has its own particularities: 1) we target very specific units, namely verbal and deverbal UTPs; 2) the methodology developed for compiling the original DiCoInfo had to be adapted to handle a Semitic language. We then create a frame-based version of this resource, in which UTPs are grouped into semantic frames, following the FrameNet model. To this resource we add the English and French UTPs, since this part of the work has a multilingual scope. The methodology consists of automatically extracting verbal and nominal terminological units (UTV and UTN), such as Ham~ala (حمل) (to download) and taHmiyl (تحميل) (downloading). To do so, we adapted an existing automatic term extractor, TermoStat (Drouin 2004). Then, using terminological validation criteria (L'Homme 2004), we validate the terminological status of a subset of the candidates. After validation, we create terminological records, using an XML editor, for each retained UTV and UTN. These records include elements such as the actantial structure of the UTPs and up to twenty annotated contexts. The last step consists of creating semantic frames from the MSA UTPs. We also associate English and French UTPs with the frames created. This association led to the creation of a terminological resource called "DiCoInfo: A Framed Version". In this resource, UTPs that share the same semantic properties and actantial structures are grouped into semantic frames. For example, the semantic frame Product_development groups UTPs such as Taw~ara (طور) (to develop), to develop and développer.
Following these steps, we obtained a total of 106 MSA UTPs compiled in the MSA version of the DiCoInfo and 57 semantic frames associated with these units in the framed version of the DiCoInfo. Our research shows that MSA can be described with the methodology we developed.
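The frame-based grouping described above can be sketched as a simple indexing step: terminological units that share a semantic frame are collected under that frame, as in the Product_development example. The entries below are illustrative stand-ins, not records from the actual DiCoInfo data.

```python
from collections import defaultdict

# Minimal sketch of the "framed" version: each UTP record carries the term,
# its language and the semantic frame it evokes; grouping by frame yields the
# multilingual frame index. Entries are invented for illustration.
utps = [
    ("Taw~ara", "ar", "Product_development"),
    ("to develop", "en", "Product_development"),
    ("développer", "fr", "Product_development"),
    ("Ham~ala", "ar", "Downloading"),
]

frames = defaultdict(list)
for term, lang, frame in utps:
    frames[frame].append((term, lang))

print(sorted(frames))  # ['Downloading', 'Product_development']
```

In the real resource each record additionally carries the actantial structure and up to twenty annotated contexts, which this sketch omits.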
Abstract:
This work proposes the development of sample preparation methods employing dispersive liquid-liquid microextraction (DLLME) for the extraction and preconcentration of Fe and Cu in wine, followed by spectrophotometric determination in the ultraviolet-visible (UV-Vis) region. In the DLLME extractions, Fe and Cu were complexed with ammonium pyrrolidine dithiocarbamate (APDC) and sodium diethyldithiocarbamate (DDTC), respectively. For DLLME, an appropriate mixture of small volumes of two solvents, one extractor and one disperser, was rapidly injected into the aqueous sample, forming a dispersion and extracting the analytes almost instantaneously. In optimizing DLLME for Fe extraction, parameters such as the type and volume of the extractor (C2Cl4, 80 µL) and disperser (acetonitrile, 1300 µL) solvents, pH (3.0), APDC concentration (1%, m/v), NaCl addition (0.02 mol L-1) and extraction time were evaluated. For Cu extraction, a full 2^5 factorial design was applied to evaluate the influence of five independent variables: volumes of the disperser (acetonitrile, 1600 µL) and extractor (CCl4, 60 µL) solvents, DDTC concentration (2%, m/v), pH (3.0) and NaCl concentration. After optimizing the conditions for Fe, the standard-addition calibration curve was linear between 0.2 and 2.5 mg L-1 for white wine (R2 = 0.9985) and for red wine (R2 = 0.9988). For Cu, the standard-addition calibration curve was linear between 0.05 and 1.0 mg L-1 for white wine (R2 = 0.9995) and for red wine (R2 = 0.9986). The limits of quantification were 0.75 and 0.37 mg L-1 for Fe and Cu, respectively. Accuracy was evaluated using recovery assays, with recoveries between 96% and 112% and relative standard deviations below 8%.
The methods were applied to 5 white wine and 5 red wine samples, yielding concentrations between 1.3 and 5.3 and between 2.5 and 4.4 mg L-1 for Fe, and between 0.4 and 1.5 and between 0.9 and 2.5 mg L-1 for Cu, respectively. The methods developed for the extraction and preconcentration of Fe and Cu in wines by DLLME and quantification by UV-Vis proved adequate in terms of linearity, accuracy and precision.
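The linearity figures above come from ordinary least-squares fits of signal against added analyte concentration, with R² as the quality criterion. A self-contained sketch of that calculation follows; the concentration/absorbance pairs are invented for illustration, not measurements from the work.

```python
# Sketch of a standard-addition calibration: least-squares line through
# (added concentration, absorbance) points, with R^2 as the linearity check.
def linear_fit(x, y):
    """Return (slope, intercept, R^2) of the ordinary least-squares line."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    mean_y = sy / n
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

conc = [0.2, 0.5, 1.0, 1.5, 2.0, 2.5]               # mg/L added (illustrative)
absorbance = [0.051, 0.124, 0.248, 0.375, 0.499, 0.622]
slope, intercept, r2 = linear_fit(conc, absorbance)
```

An R² close to 1 over the working range, as reported in the abstract (0.9985-0.9995), is what justifies reading unknown concentrations off the fitted line.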
Abstract:
Variable Data Printing (VDP) has brought new flexibility and dynamism to the printed page. Each printed instance of a specific class of document can now have different degrees of customized content within the document template. This flexibility comes at a cost. If every printed page is potentially different from all others, it must be rasterised separately, which is a time-consuming process. Technologies such as PPML (Personalized Print Markup Language) attempt to address this problem by dividing the bitmapped page into components that can be cached at the raster level, thereby speeding up the generation of page instances. A large number of documents are stored in Page Description Languages at a higher level of abstraction than the bitmapped page. Much of this content could be reused within a VDP environment provided that separable document components can be identified and extracted. These components then need to be individually rasterisable so that each high-level component can be related to its low-level (bitmap) equivalent. Unfortunately, the unstructured nature of most Page Description Languages makes it difficult to extract content easily. This paper outlines the problems encountered in extracting component-based content from existing page description formats, such as PostScript, PDF and SVG, and how the differences between the formats affect the ease with which content can be extracted. The techniques are illustrated with reference to a tool called COG Extractor, which extracts content from PDF and SVG and prepares it for reuse.
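Of the formats named above, SVG is the most tractable case: it is well-formed XML, so identifying separable components reduces to walking a tree. A hypothetical sketch using the Python standard library illustrates this (the real COG Extractor and its component model are not shown; the document and `id` values are invented):

```python
import xml.etree.ElementTree as ET

# SVG is XML, so component extraction (the easy case the paper contrasts with
# PostScript and PDF) is plain tree walking: here each <g> group is treated
# as a candidate reusable, separately rasterisable component.
svg = """<svg xmlns="http://www.w3.org/2000/svg">
  <g id="logo"><rect width="10" height="10"/></g>
  <g id="address"><text>221B Baker St</text></g>
</svg>"""

NS = "{http://www.w3.org/2000/svg}"
root = ET.fromstring(svg)

components = {g.get("id"): g for g in root.iter(NS + "g")}
print(sorted(components))  # ['address', 'logo']
```

PostScript offers no such tree: content is the side effect of executing a program, which is precisely why extraction there is hard, as the paper argues.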
Abstract:
This dissertation describes an in-depth study of the Visual Odometry (VO) problem tackled with transformer architectures. Existing VO algorithms are based on heavily hand-crafted features and do not generalize well to new environments; training them requires careful tuning of the hyper-parameters and the network architecture. We propose to tackle the VO problem with a transformer because it is a general-purpose architecture and because it was designed to transform sequences of data from one domain to another, which matches the structure of the VO problem. Our first goal is to create a synthetic dataset using the BlenderProc2 framework to mitigate the problem of dataset scarcity. The second goal is to tackle the VO problem using different versions of the transformer architecture, pre-trained on the synthetic dataset and fine-tuned on a real one, the KITTI dataset. Our approach is as follows: we use a feature extractor to obtain feature embeddings from a sequence of images, feed this sequence of embeddings to the transformer architecture, and finally use an MLP to predict the sequence of camera poses.
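The three-stage pipeline described above can be sketched at the shape level. This is a framework-free stub (a real implementation would use a deep-learning library); the embedding width of 128 and the 6-DoF pose output are assumptions for illustration.

```python
# Shape-level sketch of the pipeline: feature extractor -> transformer ->
# MLP pose head. Each stage is a stub that only tracks tensor shapes, so the
# data flow is visible without any deep-learning framework.
def feature_extractor(images):
    """(T images) -> (T, D) embeddings; D=128 is an assumed width."""
    return [[0.0] * 128 for _ in images]

def transformer(embeddings):
    """Sequence-to-sequence over (T, D); stubbed as the identity."""
    return embeddings

def pose_head(features):
    """MLP mapping each embedding to a 6-DoF camera pose."""
    return [[0.0] * 6 for _ in features]

frames = ["img_%d" % k for k in range(5)]          # a 5-frame clip
poses = pose_head(transformer(feature_extractor(frames)))
print(len(poses), len(poses[0]))  # 5 6
```

The key structural point the stub makes explicit is that the transformer preserves sequence length, so the model emits one pose per input frame.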