932 results for Scene Graph
Abstract:
We provide an agent with the capability to infer the relations (assertions) entailed by the rules that describe the formal semantics of an RDFS knowledge base. The proposed inferencing process formulates each semantic restriction as a rule implemented within a SPARQL query statement. The process expands the original RDF graph into a fuller graph that explicitly captures the rules' described semantics. The approach is currently being explored in order to support descriptions that follow the generic Semantic Web Rule Language. An experiment using the Fire-Brigade domain, a small-scale knowledge base, is adopted to illustrate the agent modeling method and the inferencing process.
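A minimal sketch of the rule-based graph expansion in plain Python over a toy triple store. The paper expresses each rule as a SPARQL query; here the standard RDFS subclass-entailment rule is applied until fixpoint, and the Fire-Brigade-style facts are hypothetical:

```python
# The rule emulated below corresponds to the SPARQL statement
#   CONSTRUCT { ?x rdf:type ?d }
#   WHERE     { ?c rdfs:subClassOf ?d . ?x rdf:type ?c }

def expand(triples):
    """Repeatedly apply the subclass-entailment rule until no new
    assertion can be inferred (fixpoint), returning the fuller graph."""
    graph = set(triples)
    while True:
        inferred = {
            (x, "rdf:type", d)
            for (c, p1, d) in graph if p1 == "rdfs:subClassOf"
            for (x, p2, c2) in graph if p2 == "rdf:type" and c2 == c
        }
        if inferred <= graph:
            return graph
        graph |= inferred

# Hypothetical Fire-Brigade-style knowledge base:
kb = {
    ("FireTruck", "rdfs:subClassOf", "Vehicle"),
    ("Vehicle", "rdfs:subClassOf", "Asset"),
    ("truck1", "rdf:type", "FireTruck"),
}
full = expand(kb)
print(("truck1", "rdf:type", "Asset") in full)  # transitively entailed
```

The expanded graph explicitly contains the entailed type assertions, which is the sense in which the original RDF graph grows into a "fuller" one.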
Abstract:
The automatic organisation of e-mail messages is a current challenge in machine learning. The excessive number of messages affects more and more users, especially those who use e-mail as a communication and work tool. This thesis addresses the problem of automatic e-mail organisation by proposing a solution aimed at the automatic labelling of messages. Automatic labelling relies on the e-mail folders previously created by users, treating them as labels, and on the suggestion of multiple labels for each message (top-N). Several machine learning techniques are studied, and the various fields that make up an e-mail message are analysed to determine their suitability as classification features. The focus of this work is on the textual fields (the subject and the body of the messages), studying different forms of representation, feature selection and classification algorithms. The participant fields are also evaluated through classification algorithms that represent them using the vector space model or as a graph. The various fields are combined for classification using the Majority Voting classifier-combination technique. Tests are performed with a subset of Enron e-mail messages and a private dataset made available by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC). These datasets are analysed in order to understand the characteristics of the data. The system is evaluated through classifier accuracy. The results obtained show significant improvements over related work.
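The top-N labelling step can be sketched as a simple vote combiner. The per-field rankings below are hypothetical, and this is a simplified stand-in for the thesis's Majority Voting setup, not its implementation:

```python
from collections import Counter

def top_n_vote(rankings, n=3):
    """Combine per-field classifiers by voting: each classifier casts
    one vote for every label in its own top-n ranking, and the n labels
    with the most votes are suggested for the message."""
    score = Counter()
    for ranked in rankings:
        for label in ranked[:n]:
            score[label] += 1
    return [label for label, _ in score.most_common(n)]

# Hypothetical outputs of subject-, body- and participant-based classifiers:
votes = [
    ["work", "invoices", "travel"],   # subject field
    ["work", "travel", "personal"],   # body field
    ["invoices", "work", "travel"],   # participants graph
]
print(top_n_vote(votes))
```

Suggesting several labels per message (rather than a single folder) is what lets the user pick among plausible destinations, which is the top-N idea described above.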
Abstract:
Since collaborative networked organisations are usually formed by independent and heterogeneous entities, it is natural that each member holds its own set of values, and that conflicts among partners might emerge because of some misalignment of values. In contrast, it is often stated in the literature that alignment between the value systems of members involved in collaborative processes is a prerequisite for successful co-working. As a result, the issue of core-value alignment in collaborative networks has started to attract attention. However, methods to analyse such alignment are lacking, mainly because the concept of 'alignment' in this context is still ill-defined and multifaceted. As a contribution to the area, this article introduces an approach based on causal models and graph theory for the analysis of core-value alignment in collaborative networks. The potential application of the approach is then discussed in the context of virtual organisations' breeding environments.
Abstract:
Topology optimization consists of finding the spatial distribution of a given total volume of material such that the resulting structure has some optimal property, for instance, maximum structural stiffness or maximum fundamental eigenfrequency. In this paper a Genetic Algorithm (GA) employing a tree-based representation is developed to generate initial feasible individuals that remain feasible upon crossover and mutation, and as such require no repair operator to ensure feasibility. Several application examples are studied involving the topology optimization of structures where the objective function is the maximization of stiffness or the maximization of the first or second eigenfrequency of a plate, all cases having a prescribed material volume constraint.
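The feasibility-by-construction idea can be illustrated in a deliberately simplified form. The sketch below uses a flat bit-array layout and a swap mutation, not the paper's tree-based representation; the point is only that an operator can preserve the volume constraint exactly, so no repair step is ever needed:

```python
import random

def mutate(layout):
    """Swap one filled and one empty cell: the material volume
    (number of filled cells) is preserved by construction."""
    filled = [i for i, v in enumerate(layout) if v == 1]
    empty = [i for i, v in enumerate(layout) if v == 0]
    child = layout[:]
    if filled and empty:
        a, b = random.choice(filled), random.choice(empty)
        child[a], child[b] = 0, 1
    return child

random.seed(0)
parent = [1, 1, 0, 0, 1, 0]       # 3 cells of material out of 6
child = mutate(parent)
print(sum(child) == sum(parent))  # volume constraint still holds
```

The paper's tree representation achieves the same guarantee for both crossover and mutation, which is harder to do with a plain bit array.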
Abstract:
Formaldehyde (CH2O), the simplest and most reactive aldehyde, is a colorless, reactive and readily polymerizing gas at room temperature (National Toxicology Program [NTP]). It has a pungent, suffocating odor that most human subjects recognize at concentrations below 1 ppm. Aleksandr Butlerov synthesized the chemical in 1859, but it was August Wilhelm von Hofmann who, in 1867, identified it as the product formed by passing methanol and air over a heated platinum spiral. This method is still the basis for the industrial production of formaldehyde today, in which methanol is oxidized using a metal catalyst. By the early 20th century, with the explosion of knowledge in chemistry and physics, coupled with demands for more innovative synthetic products, the scene was set for the birth of a new material: plastics. According to the Report on Carcinogens, formaldehyde ranks 25th in overall U.S. chemical production, with more than 5 million tons produced each year. Annual formaldehyde production reaches 21 million tons worldwide and has increased in China, with 7.5 million tons produced in 2007. Given its economic importance and widespread use, many people are exposed to formaldehyde environmentally and/or occupationally. Commercially, formaldehyde is manufactured as an aqueous solution called formalin, usually containing 37% dissolved formaldehyde by weight. The chemical is present in all regions of the atmosphere, arising from the oxidation of biogenic and anthropogenic hydrocarbons. Formaldehyde concentrations typically range from 2 to 45 ppbV (parts per billion by volume) in urban settings and are mainly governed by primary emissions and secondary formation.
Abstract:
Master's degree in Radiation Applied to Health Technologies. Area of specialisation: Magnetic Resonance.
Abstract:
Background: With the decrease of DNA sequencing costs, sequence-based typing methods are rapidly becoming the gold standard for epidemiological surveillance. These methods provide the reproducible and comparable results needed for a global-scale bacterial population analysis, while retaining their usefulness for local epidemiological surveys. Online databases that collect the generated allelic profiles and associated epidemiological data are available, but this wealth of data remains underused and is frequently poorly annotated, since no user-friendly tool exists to analyse and explore it. Results: PHYLOViZ is platform-independent Java software that allows the integrated analysis of sequence-based typing methods, including SNP data generated from whole-genome sequencing approaches, and associated epidemiological data. goeBURST and its Minimum Spanning Tree expansion are used for visualizing the possible evolutionary relationships between isolates. The results can be displayed as an annotated graph overlaid with the query results of any other epidemiological data available. Conclusions: PHYLOViZ is user-friendly software that allows the combined analysis of multiple data sources for microbial epidemiological and population studies. It is freely available at http://www.phyloviz.net.
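The Minimum Spanning Tree idea can be shown in a few lines. goeBURST links isolates by allelic differences, which for equal-length profiles is a Hamming distance; the profiles below are hypothetical, and this plain Prim's algorithm omits goeBURST's specific tie-break rules:

```python
def hamming(a, b):
    """Number of loci at which two allelic profiles differ."""
    return sum(x != y for x, y in zip(a, b))

def minimum_spanning_tree(profiles):
    """Prim's algorithm over pairwise Hamming distances; returns the
    tree as (distance, isolate_i, isolate_j) edges."""
    n = len(profiles)
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        d, i, j = min(
            (hamming(profiles[i], profiles[j]), i, j)
            for i in in_tree for j in range(n) if j not in in_tree
        )
        in_tree.add(j)
        edges.append((d, i, j))
    return edges

# Hypothetical 4-locus MLST profiles for four isolates:
profiles = [(1, 3, 2, 2), (1, 3, 2, 1), (1, 4, 2, 1), (7, 4, 5, 1)]
tree = minimum_spanning_tree(profiles)
print(tree)  # isolates chained 0-1-2-3 by single- and double-locus variants
```

Each edge weight counts the allelic differences between two isolates, so single-locus variants end up directly linked, mirroring the evolutionary reading of the graph described above.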
Abstract:
António Ferro was a central figure in twentieth-century Portuguese culture. His works reflect the time he lived in and the ideas he defended, and both his writings and those of his wife, Fernanda de Castro, are filled with references to clothing and to the succession of fashions that he often described.
Abstract:
The catalogue is a communication product with a strong presence in the fashion world, reflecting each collection and the brand's image. The objective criteria for evaluating catalogues have not yet been the subject of a structured, in-depth analysis, despite their importance for companies in the clothing sector. It was therefore decided to study the elements that define the aesthetic and graphic quality of clothing catalogues. Catalogues from Portuguese and international fashion brands whose quality is recognised by design professionals were analysed. As a result of this study, two evaluation tools were created: the AQC Matrix for Evaluating the Quality of Product Catalogues and the AQC Matrix for Evaluating the Quality of Image Catalogues.
Abstract:
Fluorescence confocal microscopy (FCM) is now one of the most important tools in biomedical research. In fact, it makes it possible to accurately study the dynamic processes occurring inside the cell and its nucleus by following the motion of fluorescent molecules over time. Due to the small amount of acquired radiation and the huge optical and electronic amplification, FCM images are usually corrupted by a severe type of Poisson noise. This noise may be even more damaging when very low-intensity incident radiation is used to avoid phototoxicity. In this paper, a Bayesian algorithm is proposed to remove the intensity-dependent Poisson noise corrupting FCM image sequences. The observations are organized in a 3-D tensor where each plane is one of the images of a cell nucleus acquired over time using the fluorescence loss in photobleaching (FLIP) technique. The method removes the noise by jointly considering different spatial and temporal correlations. This is accomplished by using an anisotropic 3-D filter that may be tuned separately in the space and time dimensions. Tests using synthetic and real data are described and presented to illustrate the application of the algorithm. A comparison with several state-of-the-art algorithms is also presented.
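The paper's algorithm is Bayesian and models the Poisson statistics directly. A common, much simpler baseline for Poisson-corrupted images (shown here for contrast, not as the paper's method) is the Anscombe variance-stabilizing transform, after which ordinary Gaussian smoothing, e.g. an anisotropic filter with separate space and time widths, can be applied:

```python
import math

def anscombe(x):
    """Variance-stabilizing transform for Poisson counts: the transformed
    noise is approximately Gaussian with unit variance, independent of
    intensity, so standard smoothing filters become applicable."""
    return 2.0 * math.sqrt(x + 3.0 / 8.0)

def inverse_anscombe(y):
    """Algebraic inverse, mapping the smoothed values back to intensities."""
    return (y / 2.0) ** 2 - 3.0 / 8.0

# Round-trip on a single pixel intensity:
print(round(inverse_anscombe(anscombe(10.0)), 6))  # → 10.0
```

The intensity dependence of Poisson noise is exactly what makes plain smoothing inadequate, which motivates either this stabilization step or, as in the paper, a model that handles the Poisson likelihood directly.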
Abstract:
Electricity markets are complex environments, involving a large number of different entities playing in a dynamic scene to obtain the best advantages and profits. MASCEM is a multi-agent electricity market simulator that models market players and simulates their operation in the market. Market players are entities with specific characteristics and objectives, making their decisions and interacting with other players. MASCEM is integrated with ALBidS, a system that provides several dynamic strategies for agents' behavior. This paper presents a method that aims at enhancing ALBidS's competence in endowing market players with adequate strategic bidding capabilities, allowing them to obtain the highest possible gains from the market. The method uses a reinforcement learning algorithm to learn from experience how to choose the best from a set of possible actions. These actions are defined according to the most probable points of bidding success. To accelerate the convergence process, a simulated annealing-based algorithm is included.
Abstract:
Metalearning is a subfield of machine learning with a special propensity for dynamic and complex environments, from which it is difficult to extract predictable knowledge. The field of study of this work is the electricity market, which, due to the restructuring that recently took place, became an especially complex and unpredictable environment, involving a large number of different entities playing in a dynamic scene to obtain the best advantages and profits. This paper presents the development of a metalearner applied to the decision support of electricity markets' negotiating entities. The proposed metalearner takes advantage of several learning algorithms implemented in ALBidS, an adaptive learning system that provides decision support to electricity market players. Using the outputs of each different strategy as inputs, the metalearner creates its own output, weighting each strategy according to its individual performance. The results of the proposed method are studied and analyzed using MASCEM, a multi-agent electricity market simulator that models market players and simulates their operation in the market. This simulator provides the chance to test the metalearner in scenarios based on real electricity market data.
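One simple way to realise the weighting idea described above (an illustration, not the ALBidS metalearner itself) is to weight each strategy's proposal by the inverse of its recent error, so better-performing strategies dominate the combined output:

```python
def metalearn(strategy_outputs, errors):
    """Combine strategy outputs into a single value: each strategy is
    weighted by the inverse of its recent error, and the weighted
    average becomes the metalearner's own output."""
    weights = [1.0 / (e + 1e-9) for e in errors]  # small epsilon avoids /0
    total = sum(weights)
    return sum(w * o for w, o in zip(weights, strategy_outputs)) / total

# Hypothetical price proposals from three ALBidS strategies and their
# recent absolute errors (smaller error = more trusted strategy):
prices = [42.0, 50.0, 45.0]
errors = [2.0, 8.0, 4.0]
print(round(metalearn(prices, errors), 2))  # → 44.0, pulled toward 42.0
```

Note how the result sits closest to the proposal of the lowest-error strategy, which is the intended behaviour of performance-dependent weighting.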
Abstract:
Electricity markets are complex environments, involving a large number of different entities playing in a dynamic scene to obtain the best advantages and profits. MASCEM is a multi-agent electricity market simulator that models market players and simulates their operation in the market. Market players are entities with specific characteristics and objectives, making their decisions and interacting with other players. MASCEM provides several dynamic strategies for agents' behaviour. This paper presents a method that aims to provide market players with strategic bidding capabilities, allowing them to obtain the highest possible gains from the market. The method uses an auxiliary forecasting tool, e.g. an Artificial Neural Network, to predict electricity market prices, and analyses its forecasting error patterns. By recognizing the occurrence of such patterns, the method predicts the expected error for the next forecast and uses it to adjust the actual forecast. The goal is to bring the forecast closer to the real value, reducing the forecasting error.
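The error-correction idea can be caricatured in a few lines, assuming the "pattern" is simply the mean of recent errors; the paper's recognition step is more elaborate, and the error history below is made up:

```python
def corrected_forecast(raw_forecast, past_errors, window=3):
    """Adjust a raw price forecast by the mean of the most recent
    errors (error = real value - forecast): if the forecaster has been
    systematically low, the next forecast is nudged upward."""
    recent = past_errors[-window:]
    expected_error = sum(recent) / len(recent)
    return raw_forecast + expected_error

# Hypothetical history: the network has been under-forecasting by
# roughly 2 EUR/MWh, so the next forecast is shifted up accordingly.
errors = [1.8, 2.1, 2.2]
print(round(corrected_forecast(40.0, errors), 2))  # → 42.03
```

Any persistent bias in the forecaster is absorbed by the correction term, which is the sense in which recognising error patterns brings the forecast closer to the real value.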
Abstract:
OBJECTIVES: No population-based study had been carried out to show the potential use of virological diagnosis of the rabies virus. This study aimed to estimate accuracy parameters for rabies virus isolation in McCoy cells, as an alternative method, and to compare it with the use of N2A cells, considered the reference method. METHODS: A survey was conducted on 120 bats randomly collected in the Atlantic Forest, in the State of São Paulo. The immunofluorescence reaction was used to detect rabies virus isolated from the brains of these bats, evaluated in the two cell-culture systems. Two databases were built from the results. The analysis was performed with the Computer Methods for Diagnosis Tests (CMDT) program, using the two-graph receiver operating characteristic (TG-ROC) technique to obtain sensitivity and specificity parameters, as well as other indicators such as efficacy, positive predictive value, negative predictive value and likelihood ratio. RESULTS: N2A cells showed 90% sensitivity and specificity, while McCoy cells reached 95% for the same parameters. These values were based on cut-off points optimised for each cell line. CONCLUSIONS: McCoy cells yielded higher accuracy estimates than N2A cells, representing an effective alternative method for rabies virus isolation.
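The accuracy parameters reported above all derive from a 2x2 confusion matrix at a given cut-off. A small sketch with made-up data (not the study's 120 bats):

```python
def diagnostic_accuracy(results, truth):
    """Sensitivity, specificity and predictive values from paired
    test results and gold-standard status (True = infected)."""
    tp = sum(r and t for r, t in zip(results, truth))
    tn = sum(not r and not t for r, t in zip(results, truth))
    fp = sum(r and not t for r, t in zip(results, truth))
    fn = sum(not r and t for r, t in zip(results, truth))
    return {
        "sensitivity": tp / (tp + fn),  # infected animals detected
        "specificity": tn / (tn + fp),  # healthy animals cleared
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical results for 10 animals:
truth   = [True, True, True, True, False, False, False, False, False, False]
results = [True, True, True, False, False, False, False, False, True, False]
acc = diagnostic_accuracy(results, truth)
print(acc)
```

TG-ROC goes one step further than this sketch: it plots sensitivity and specificity as functions of the cut-off and picks the cut-off where the two curves are jointly optimised.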
Abstract:
Dissertation submitted to the Escola Superior de Comunicação Social in partial fulfilment of the requirements for the master's degree in Audiovisual and Multimedia.