12 results for MAPPING MOLECULAR NETWORKS
at Instituto Politécnico do Porto, Portugal
Abstract:
Wireless Sensor Networks (WSNs) are increasingly used in application domains such as home automation, agriculture, industry and infrastructure monitoring. As applications leverage ever larger geographical deployments of sensor networks, the availability of an intuitive and user-friendly programming abstraction becomes a crucial factor in enabling faster and more efficient development and reprogramming of applications. We propose a programming pattern named sMapReduce, inspired by the Google MapReduce framework, for mapping application behaviors onto a sensor network and enabling complex data aggregation. The proposed pattern requires a user to write a network-level application as two functions, sMap and Reduce, abstracting away low-level details without sacrificing the control needed to develop complex logic. Such a two-fold division of programming logic is a natural fit for typical sensor network operation, making sensing and topological modalities accessible to the user.
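As a rough illustration of the two-function split described in this abstract, the Python sketch below mimics an sMap/Reduce style aggregation over a set of nodes; the node object, its read_temperature() call and the region-based key are hypothetical stand-ins, not the pattern's actual API.

```python
# Illustrative sketch only: the sMap/Reduce split comes from the abstract;
# the node model, signatures and aggregation logic are assumptions.

def smap(node):
    """Runs on every sensor node: sample locally and emit a (key, value) pair."""
    reading = node.read_temperature()            # hypothetical sensing call
    return ("region:%d" % node.region_id, reading)

def reduce(key, values):
    """Runs along the routing tree: aggregate the values sharing a key."""
    return {"key": key, "count": len(values), "avg": sum(values) / len(values)}

def run_network(nodes):
    """Base-station (or simulator) view: group sMap outputs, then reduce each group."""
    buckets = {}
    for node in nodes:
        key, value = smap(node)
        buckets.setdefault(key, []).append(value)
    return [reduce(key, values) for key, values in buckets.items()]
```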
Abstract:
The main purpose of this study was to examine the applicability of geostatistical modeling to obtain valuable information for assessing the environmental impact of sewage outfall discharges. The data set used was obtained in a monitoring campaign of the S. Jacinto outfall, located off the Portuguese west coast near the Aveiro region, using an AUV. Matheron's classical estimator was used to compute the experimental semivariogram, which was fitted to three theoretical models: spherical, exponential and Gaussian. The cross-validation procedure suggested the best semivariogram model, and ordinary kriging was used to obtain predictions of salinity at unsampled locations. The generated map clearly shows the plume dispersion in the studied area, indicating that the effluent does not reach the nearby beaches. Our study suggests that an AUV sampling trajectory designed to be optimal from a geostatistical prediction point of view can help to compute more precise predictions and hence to quantify dilution more accurately. Moreover, since accurate measurements of plume dilution are rare, these studies might be very helpful in the future for the validation of dispersion models.
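For readers unfamiliar with the estimator named above, the following Python sketch computes an experimental semivariogram with Matheron's classical estimator and provides a spherical model for fitting; the lag binning, tolerance and parameter names are illustrative assumptions, not the study's implementation.

```python
import numpy as np

def matheron_semivariogram(coords, values, lags, tol):
    """Matheron's classical estimator:
    gamma(h) = (1 / (2 * N(h))) * sum over pairs at distance ~h of (z_i - z_j)**2
    coords: (n, 2) sample locations, values: (n,) salinity readings."""
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    diffs2 = (values[:, None] - values[None, :]) ** 2
    upper = np.triu(np.ones(dists.shape, dtype=bool), k=1)   # count each pair once
    gamma = []
    for h in lags:
        mask = upper & (np.abs(dists - h) <= tol)             # pairs in this lag bin
        n_pairs = mask.sum()
        gamma.append(diffs2[mask].sum() / (2 * n_pairs) if n_pairs else np.nan)
    return np.array(gamma)

def spherical_model(h, nugget, sill, a):
    """Spherical semivariogram model (sill = total sill, a = range)."""
    h = np.asarray(h, dtype=float)
    inside = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, inside, sill)
```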
Abstract:
In this paper we present a Constraint Logic Programming (CLP) based model, and a hybrid solving method, for the scheduling of maintenance activities in the power transmission network. The model distinguishes itself from others not only by its completeness but also by the way it models and solves the electrical constraints; specifically, we present an efficient filtering algorithm for these constraints. Furthermore, the solving method improves on the efficiency of pure CLP methods by integrating a local search technique with CLP. To test the approach, we compare our results with those of another method on a 24-bus network with 42 tasks and 24 maintenance periods.
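A minimal sketch of the hybrid idea (a constructive feasibility stage followed by local search moves), assuming hypothetical check_constraints() and cost() callbacks supplied by the caller; it does not reproduce the paper's CLP model or its filtering algorithm for the electrical constraints.

```python
import random

def hybrid_schedule(tasks, periods, check_constraints, cost, iterations=1000):
    """Generic 'feasible start + local search' loop; a CLP solver would obtain the
    initial feasible schedule by propagation and labelling, here we simply retry."""
    while True:
        schedule = {t: random.choice(periods) for t in tasks}
        if check_constraints(schedule):
            break
    best = dict(schedule)
    for _ in range(iterations):
        task = random.choice(tasks)                 # local move: shift one task's period
        candidate = dict(best)
        candidate[task] = random.choice(periods)
        if check_constraints(candidate) and cost(candidate) < cost(best):
            best = candidate
    return best
```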
Abstract:
This paper addresses the problem of energy resource scheduling. An aggregator will manage all distributed resources connected to its distribution network, including distributed generation based on renewable energy resources, demand response, storage systems, and electric gridable vehicles. The use of gridable vehicles will have a significant impact on power systems management, especially in distribution networks, so their inclusion in the optimal scheduling problem will be very important in future network management. The proposed particle swarm optimization approach is compared with a reference methodology based on mixed-integer non-linear programming, implemented in GAMS, to evaluate its effectiveness. The paper includes a case study that considers a 32-bus distribution network with 66 distributed generators, 32 loads and 50 electric vehicles.
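The following is a minimal, generic particle swarm optimization sketch in Python, included only to illustrate the kind of metaheuristic being compared against the MINLP reference; the cost function, bounds and swarm parameters are assumptions, not the paper's formulation of the scheduling problem.

```python
import numpy as np

def pso(cost, dim, lower, upper, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimise cost(x) over a continuous decision vector (e.g. resource set-points)."""
    rng = np.random.default_rng(0)
    x = rng.uniform(lower, upper, size=(n_particles, dim))      # positions
    v = np.zeros_like(x)                                        # velocities
    pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lower, upper)                        # respect resource limits
        costs = np.array([cost(p) for p in x])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest, pbest_cost.min()
```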
Abstract:
The principal aim of this study was to investigate the possibility of transfer to Escherichia coli of β-lactam resistance genes found in bacteria isolated from ready-to-eat (RTE) Portuguese traditional food. From previous screenings, 128 β-lactam-resistant isolates (from different types of cheese and delicatessen meats), largely from the Enterobacteriaceae family, were selected, and 31.3% of them proved able to transfer resistance determinants in transconjugation assays. Multiplex PCR on donor and transconjugant isolates did not detect bla CTX, bla SHV or bla OXY, but bla TEM was present in 85% of them, and two new TEMs (TEM-179 and TEM-180) were identified in two isolates. Sequencing of these amplicons showed identity between donor and transconjugant genes, indicating in vitro plasmid DNA transfer. These results suggest that, if such gene exchange occurs under natural conditions, the consumption of RTE foods, particularly those with high levels of Enterobacteriaceae, can contribute to the spread of antibiotic resistance.
Abstract:
β-lactamases are hydrolytic enzymes that inactivate the β-lactam ring of antibiotics such as penicillins and cephalosporins. Most studies carried out to date have focused on the characterization of β-lactamases recovered from clinical isolates of Gram-positive staphylococci and Gram-negative enterobacteria, amongst others. However, only a few studies address the detection of β-lactamase carriers in healthy humans, in sick animals, or in strains isolated from environmental sources such as food, water, or soil. Considering this, we proposed a 10-week laboratory programme for the Biochemistry and Molecular Biology laboratory aimed at majors in the health, environmental, and agronomical sciences. During those weeks, students would work with basic techniques such as DNA extraction, bacterial transformation, polymerase chain reaction (PCR), gel electrophoresis, and several bioinformatics tools. The laboratory exercises would be conducted as a mini research project in which each class builds on the previous ones. This curriculum was compared in an experiment involving two groups of students from two different majors: the new curriculum, with classes linked together as a mini research project, was taught to Pharmacy students, while the old curriculum was taught to Environmental Health students. The results showed that students enrolled in the new curriculum obtained better results in the final exam than students enrolled in the former curriculum. Likewise, they were found to be more enthusiastic during the laboratory classes than those following the former curriculum.
Abstract:
Collaborative work plays an important role in today's organizations, especially in areas where decisions must be made. Any decision that involves a group of decision makers is, by itself, complex, and such decisions have become recurrent in recent years. In this work we present the VirtualECare project, an intelligent multi-agent system able to monitor, interact with and serve its customers, who are normally in need of care services. In recent years there has been a substantial increase in the number of people needing intensive care, especially among the elderly, a phenomenon related to population ageing. However, this is no longer exclusive to the elderly, as conditions such as obesity, diabetes and high blood pressure have been increasing among young adults. This is a new reality that needs to be dealt with by the health sector, particularly the public one. Given these scenarios, finding new and cost-effective ways of delivering health care is of particular importance, especially when we believe that those in need of care should not be removed from their natural "habitat". Following this line of thinking, the VirtualECare project is presented, along with similar projects that preceded it. Recently we have also witnessed a growing interest in combining the advances of the information society (computing, telecommunications and presentation) in order to create Group Decision Support Systems (GDSS). Indeed, the new economy, along with increased competition in today's complex business environments, leads companies to seek complementarities in order to increase competitiveness and reduce risks. Under these scenarios, planning takes a major role in a company's life. Effective planning, however, depends on the generation and analysis of ideas (innovative or not), and as a result the idea generation and management processes are crucial. Our objective is to apply the GDSS presented above to a new area. We believe that the use of GDSS in the healthcare arena will allow professionals to achieve better results in the analysis of a patient's Electronic Clinical Profile (ECP). This achievement is vital, given the explosion of knowledge and skills, together with the need to use limited resources and obtain better results.
Abstract:
Master's in Informatics Engineering
Abstract:
The relentless discovery of cancer biomarkers demands improved methods for their detection. In this work, we developed a protein-imprinted polymer on a three-dimensional gold nanoelectrode ensemble (GNEE) to detect epithelial ovarian cancer antigen-125 (CA 125), a protein biomarker associated with ovarian cancer. CA 125 is the standard tumor marker used to follow women during or after treatment for epithelial ovarian cancer. The template protein CA 125 was initially incorporated into the thin-film coating and, upon extraction of the protein from the accessible surfaces of the thin film, imprints for CA 125 were formed. The fabrication and analysis of the CA 125 imprinted GNEE were carried out using cyclic voltammetry (CV), differential pulse voltammetry (DPV) and electrochemical impedance spectroscopy (EIS). The very thin, protein-imprinted sites on the GNEE are used for the immunospecific capture of CA 125 molecules, and the mass bound to the electrode surface is detected as a reduction in the faradaic current from the redox marker. Under optimal conditions, the developed sensor showed good signal increments over the studied concentration range of 0.5–400 U mL⁻¹, and the lowest detection limit was found to be 0.5 U mL⁻¹. Spiked human blood serum and real serum samples of unknown concentration were analyzed, and the presence of non-specific proteins in the serum did not significantly affect the sensitivity of the assay. Molecular imprinting using synthetic polymers and nanomaterials provides an alternative approach to the trace detection of biomarker proteins.
Abstract:
In recent years, wireless sensor networks (WSNs) have been hailed as one of the most promising technologies for supporting a wide range of applications. However, outside the research community, few people know what they are and what they can offer, and even fewer have seen these networks used in real-world applications. The main obstacle to the proliferation of these networks is energy, or the lack of it. Even though renewable energy sources are always present in the network's environment, designing devices that can efficiently scavenge that energy in order to sustain the operation of these networks is still an open challenge. Energy scavenging, together with energy efficiency and energy conservation, are the means currently available to sustain the operation of these networks, and they can all be framed within the broader concept of "energetic sustainability". This thesis presents a comprehensive study of the issues related to the energetic sustainability of WSNs, with a special focus on currently applicable energy harvesting techniques and devices, and on the energy consumption of commercially available WSN hardware platforms. This work supports an understanding of the different energy concepts involved in WSNs and an evaluation of the presented energy harvesting techniques for sustaining wireless sensor nodes. The survey is supported by a novel experimental analysis of the energy consumption of the most widespread commercially available WSN hardware platforms.
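As a back-of-the-envelope illustration of why energy consumption dominates WSN design, the sketch below estimates node lifetime from a duty-cycled current profile; all current and capacity figures are placeholder values, not measurements from the thesis.

```python
def average_current_ma(active_ma, sleep_ma, duty_cycle):
    """Weighted average current draw for a node active a fraction of the time."""
    return duty_cycle * active_ma + (1 - duty_cycle) * sleep_ma

def lifetime_days(battery_mah, avg_ma):
    """Ideal lifetime: capacity divided by average draw, converted to days."""
    return battery_mah / avg_ma / 24.0

avg = average_current_ma(active_ma=20.0, sleep_ma=0.02, duty_cycle=0.01)  # ~0.22 mA
print(lifetime_days(battery_mah=2500, avg_ma=avg))   # hypothetical AA pack: ~470 days
```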
Abstract:
Molecularly imprinted polymers (MIP) were used as potentiometric sensors for the selective recognition and determination of chlormequat (CMQ). They were produced by radical polymerization of 4-vinylpyridine (4-VP) or methacrylic acid (MAA) monomers in the presence of a cross-linker, with CMQ used as the template. Similar non-imprinted (NI) polymers (NIP) were produced by removing the template from the reaction media. The effect of the kind and amount of MIP or NIP material on the potentiometric behavior of the sensors was investigated, and the main analytical features were evaluated in steady-state and flow modes of operation. The MIP/4-VP sensor exhibited the best performance, presenting a fast, near-Nernstian response to CMQ over the concentration range 6.2×10⁻⁶ to 1.0×10⁻² mol L⁻¹, with a detection limit of 4.1×10⁻⁶ mol L⁻¹. The sensor response was independent of the pH of the test solutions in the range 5 to 10. Potentiometric selectivity coefficients of the proposed sensors were evaluated against several inorganic and organic cations, and the results indicated good selectivity towards CMQ. The sensor was applied to the potentiometric determination of CMQ in commercial phytopharmaceuticals and spiked water samples, with recoveries ranging from 96 to 108.5%.
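For context, a near-Nernstian potentiometric response means a calibration slope close to the theoretical Nernst value; the relation below is the general form for a monovalent cation and is illustrative only, since the abstract does not report the measured slope.

```latex
% General Nernst-type calibration for a cation-selective electrode (illustrative):
E = E^{0} + \frac{2.303\,RT}{zF}\,\log a_{\mathrm{CMQ}}
% For the monovalent chlormequat cation (z = 1), the theoretical slope
% 2.303\,RT/F is approximately 59.2 mV per decade at 25 °C.
```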
Abstract:
Introduction. Nowadays, the concept of ontology (an explicit specification of a conceptualization [Gruber, 1993]) is a key concept in knowledge-based systems in general and in the Semantic Web in particular. However, software agents do not always agree on the same conceptualization, which justifies the existence of several ontologies, even when they address the same domain of discourse. To solve or minimize the interoperability problem between such agents, ontology mapping has proven to be a good solution. Ontology mapping is the process in which semantic relations are specified between entities of a source ontology and a target ontology at the conceptual level; these relations can in turn be used to transform instances based on the source ontology into instances based on the target ontology.
Motivation. In a dynamic environment such as the Semantic Web, agents change not only their data but also their structure and semantics (ontologies). This process, called ontology evolution, can be defined as the timely adaptation of an ontology to changes arising in its domain or in its own objectives, together with the consistent management of those changes [Stojanovic, 2004], and it can sometimes leave the mapping document inconsistent. In heterogeneous environments where interoperability between systems depends on the mapping document, the document must reflect the changes made to the ontologies. There are two options: (i) generate a new mapping document (a process demanding in terms of time and computational resources) or (ii) adapt the existing mapping document, correcting invalid semantic relations and creating new ones where necessary (a process less demanding in terms of time and computational resources, but highly dependent on information about the changes made). The main objective of this work is the analysis, specification and development of the mapping-document evolution process, so that the document reflects the changes made during ontology evolution.
Context. This work was developed in the context of the MAFRA Toolkit. The MAFRA (MApping FRAmework) Toolkit is an application developed at GECAD that allows the declarative specification of semantic relations between entities of a source ontology and a target ontology, using the following main components: Concept Bridge, which represents a semantic relation between a source concept and a target concept; Property Bridge, which represents a semantic relation between one or more source properties and one or more target properties; and Service, which is applied to the Semantic Bridges (Property and Concept Bridges) and defines how source instances are transformed into target instances. These concepts are specified in the SBO (Semantic Bridge Ontology) [Silva, 2004]. In the context of this work, a mapping document is an instantiation of the SBO, containing semantic relations between entities of the source and target ontologies.
Mapping evolution process. The mapping evolution process is the process in which the entities of the mapping document are adapted to reflect changes in the mapped ontologies, preserving as far as possible the semantics of the specified relations. If the source and/or target ontologies change, some semantic relations may become invalid, or new relations may become necessary, so the process comprises two sub-processes: (i) correction of semantic relations and (ii) processing of new ontology entities. Processing new ontology entities requires discovering and computing similarities between entities and specifying relations according to the SBO ontology/language; these phases ("similarity measure" and "semantic bridging") are implemented in the MAFRA Toolkit, and the (semi-)automatic ontology mapping process is described in [Silva, 2004]. Correcting invalid SBO entities requires good knowledge of the SBO ontology/language, of its entities and relations, and of all its constraints, i.e. of its structure and semantics. This procedure consists of (i) identifying the invalid SBO entities, (ii) determining the cause of their invalidity and (iii) correcting them as well as possible; information from the ontology evolution process was used in this phase to improve the quality of the whole process.
Conclusions. Besides the mapping evolution process that was developed, one of the most important outcomes of this work was the acquisition of deeper knowledge about ontologies, ontology evolution and ontology mapping, broadening horizons and raising awareness of the complexity of the problem at hand, which allows new challenges to be anticipated for the future.
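A minimal sketch, assuming a much-simplified bridge model, of the revalidation step described above: bridges whose entities survive (possibly under a new name) are kept, and the rest are flagged for the similarity-measure and semantic-bridging phases. The real SBO/MAFRA Toolkit model is considerably richer than this illustration.

```python
from dataclasses import dataclass

@dataclass
class ConceptBridge:
    # Simplified stand-in for an SBO Concept Bridge: one source and one target concept.
    source_concept: str
    target_concept: str

def revalidate(bridges, source_entities, target_entities, renamed):
    """Keep valid bridges, repair those whose entities were merely renamed,
    and report the rest as needing re-mapping."""
    valid, to_remap = [], []
    for b in bridges:
        src = renamed.get(b.source_concept, b.source_concept)
        tgt = renamed.get(b.target_concept, b.target_concept)
        if src in source_entities and tgt in target_entities:
            valid.append(ConceptBridge(src, tgt))
        else:
            to_remap.append(b)
    return valid, to_remap
```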