26 results for Maximal Outerplanar Graph
Abstract:
The advances made in channel-capacity codes, such as turbo codes and low-density parity-check (LDPC) codes, have played a major role in the emerging distributed source coding paradigm. LDPC codes can be easily adapted to new source coding strategies due to their natural representation as bipartite graphs and the use of quasi-optimal decoding algorithms, such as belief propagation. This paper tackles a relevant scenario in distributed video coding: lossy source coding when multiple side information (SI) hypotheses are available at the decoder, each one correlated with the source according to a different correlation noise channel. Thus, it is proposed to exploit multiple SI hypotheses through an efficient joint decoding technique with multiple LDPC syndrome decoders that exchange information to obtain coding efficiency improvements. At the decoder side, the multiple SI hypotheses are created with motion-compensated frame interpolation and fused together in a novel iterative LDPC-based Slepian-Wolf decoding algorithm. With the creation of multiple SI hypotheses and the proposed decoding algorithm, bitrate savings of up to 8.0% are obtained for similar decoded quality.
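As a toy illustration of how multiple SI hypotheses can be fused, the sketch below combines per-hypothesis log-likelihood ratios assuming each hypothesis sees the source through an independent binary symmetric channel; the crossover probabilities and bit values are hypothetical, and this is not the paper's iterative joint syndrome decoder.

```python
import math

def bsc_llr(side_info_bit: int, crossover_p: float) -> float:
    """LLR of the source bit given one SI bit observed through a binary
    symmetric channel with crossover probability crossover_p (LLR > 0 favours bit = 0)."""
    return (1 - 2 * side_info_bit) * math.log((1 - crossover_p) / crossover_p)

def fuse_hypotheses(si_bits, crossover_ps):
    """Sum per-hypothesis LLRs, assuming conditionally independent SI channels."""
    return sum(bsc_llr(b, p) for b, p in zip(si_bits, crossover_ps))

# Two hypothetical SI hypotheses for the same source bit, with different reliabilities.
llr = fuse_hypotheses(si_bits=[0, 1], crossover_ps=[0.05, 0.20])
print(llr, "-> decide 0" if llr > 0 else "-> decide 1")
```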
Abstract:
This paper presents an algorithm to efficiently generate the state space of systems specified using the IOPT Petri net modeling formalism. IOPT nets are a non-autonomous Petri net class, based on Place-Transition nets with an extended set of features designed to allow the rapid prototyping and synthesis of system controllers through an existing hardware-software co-design framework. To obtain coherent and deterministic operation, IOPT nets use a maximal-step execution semantics where, in a single execution step, all enabled transitions fire simultaneously. This fact increases the resulting state-space complexity and can cause an arc "explosion" effect. Real-world applications with several million states can reach a number of arcs one order of magnitude higher, leading to the need for high-performance state-space generation algorithms. The proposed algorithm applies a compilation approach, reading a PNML file containing one IOPT model and automatically generating an optimized C program to calculate the corresponding state space.
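To make the maximal-step semantics concrete, here is a minimal sketch that explores the reachable markings of a plain Place-Transition net by firing, at each step, all enabled transitions at once; the toy net, place names, and the no-conflict assumption are mine, and the sketch omits the input/output events and the PNML-to-C compilation of the actual tool.

```python
from collections import deque

# A minimal Place/Transition net: per transition, tokens consumed and produced.
# The net below is hypothetical; markings are tuples indexed by place.
PLACES = ("p0", "p1", "p2")
TRANSITIONS = {
    "t0": {"consume": {"p0": 1}, "produce": {"p1": 1}},
    "t1": {"consume": {"p1": 1}, "produce": {"p2": 1}},
}

def enabled(marking):
    """Transitions individually enabled in the given marking."""
    return [t for t, arcs in TRANSITIONS.items()
            if all(marking[PLACES.index(p)] >= n for p, n in arcs["consume"].items())]

def fire_maximal_step(marking):
    """Fire all enabled transitions at once (simplified: assumes no conflicts,
    i.e. the marking holds enough tokens for the whole step)."""
    m = list(marking)
    step = enabled(marking)
    for t in step:
        for p, n in TRANSITIONS[t]["consume"].items():
            m[PLACES.index(p)] -= n
        for p, n in TRANSITIONS[t]["produce"].items():
            m[PLACES.index(p)] += n
    return tuple(m), tuple(step)

def state_space(initial):
    """Breadth-first generation of reachable markings under maximal steps."""
    seen, arcs, queue = {initial}, [], deque([initial])
    while queue:
        m = queue.popleft()
        if not enabled(m):
            continue
        nxt, step = fire_maximal_step(m)
        arcs.append((m, step, nxt))
        if nxt not in seen:
            seen.add(nxt)
            queue.append(nxt)
    return seen, arcs

states, arcs = state_space((1, 0, 0))
print(len(states), "states,", len(arcs), "arcs")
```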
Abstract:
Introduction: Familial amyloidotic polyneuropathy (FAP) is a neurodegenerative disease that leads to sensory and motor polyneuropathies as well as functional limitations. So far, liver transplantation is the only treatment for FAP because the mutated protein causing the disease is mainly produced in the liver. With the increasing survival of transplant recipients, functional and cardiovascular problems resulting from immunosuppressant side effects are increasingly associated with sedentary lifestyles and/or retransplantation status. We sought to analyze the impact of an exercise training program on the long-term course of one FAP patient after liver transplantation. Methodology: A FAP patient (female; 49 years of age; body mass index 18.8 kg/m2) had undergone liver transplantation 133 months before assessment. She was assessed for body composition, isometric quadriceps muscle strength, functional capacity, fatigue, and levels of physical activity before and after a 6-month period of combined exercise training. Results: After the exercise training program, almost all variables were improved, namely total body skeletal muscle mass, proximal femoral bone mineral density, quadriceps strength, maximal oxygen consumption (VO2peak) on the 6-minute walk test (6MWT), total ventilation on the 6MWT, and fatigue. The improvement in distance on the 6MWT (69.2 m) was clinically significant. Before the intervention, the levels of physical activity were below international health recommendations; after the program, they met the recommendations. Conclusion: The results showed an improvement in functional capacity and a decrease in future disability risk, associated with a better lifestyle in terms of physical activity levels, in one patient.
Abstract:
We derive a set of differential inequalities for positive definite functions based on previous results derived for positive definite kernels by purely algebraic methods. Our main results show that the global behavior of a smooth positive definite function is, to a large extent, determined solely by the sequence of even-order derivatives at the origin: if a single one of these vanishes then the function is constant; if they are all non-zero and satisfy a natural growth condition, the function is real-analytic and consequently extends holomorphically to a maximal horizontal strip of the complex plane.
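A compact restatement of the stated dichotomy, with notation introduced here and the paper's growth condition abbreviated as (G):

```latex
% Sketch of the dichotomy for a smooth positive definite f : \mathbb{R} \to \mathbb{C}
% (notation introduced here; (G) abbreviates the paper's growth condition on the
% even-order derivatives at the origin).
\[
  \exists\, k \ge 1 \ \text{with}\ f^{(2k)}(0) = 0
    \;\Longrightarrow\; f \equiv f(0) \ \text{(constant)};
\]
\[
  f^{(2k)}(0) \ne 0 \ \text{for all}\ k \ \text{and (G)}
    \;\Longrightarrow\; f \ \text{is real-analytic and extends holomorphically to a
    maximal strip}\ \{\, z \in \mathbb{C} : |\operatorname{Im} z| < \rho \,\}.
\]
```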
Abstract:
We provide an agent with the capability to infer the relations (assertions) entailed by the rules that describe the formal semantics of an RDFS knowledge base. The proposed inferencing process formulates each semantic restriction as a rule implemented within a SPARQL query statement. The process expands the original RDF graph into a fuller graph that explicitly captures the semantics described by the rules. The approach is currently being explored in order to support descriptions that follow the generic Semantic Web Rule Language. An experiment using the Fire-Brigade domain, a small-scale knowledge base, is adopted to illustrate the agent modeling method and the inferencing process.
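As a rough illustration of this rule-based graph expansion, the sketch below forward-chains one RDFS entailment rule (rdfs9, propagation of class membership along rdfs:subClassOf) over a handful of triples in plain Python; the triples are hypothetical, and the approach described above expresses each rule as a SPARQL query rather than hand-written code.

```python
# Forward-chain RDFS rule rdfs9:
# (?c rdfs:subClassOf ?d), (?x rdf:type ?c)  =>  (?x rdf:type ?d)
RDF_TYPE = "rdf:type"
SUBCLASS = "rdfs:subClassOf"

triples = {
    ("ex:Engine3", RDF_TYPE, "ex:FireEngine"),
    ("ex:FireEngine", SUBCLASS, "ex:Vehicle"),
}

def expand(graph):
    """Apply rdfs9 repeatedly until no new assertions are entailed (fixed point)."""
    graph = set(graph)
    while True:
        inferred = {
            (x, RDF_TYPE, d)
            for (c, p1, d) in graph if p1 == SUBCLASS
            for (x, p2, c2) in graph if p2 == RDF_TYPE and c2 == c
        }
        new = inferred - graph
        if not new:
            return graph
        graph |= new

for triple in sorted(expand(triples)):
    print(triple)
```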
Abstract:
The automatic organization of email messages is a current challenge in the field of machine learning. The excessive number of messages affects more and more users, especially those who use email as a communication and work tool. This thesis addresses the problem of automatic email organization by proposing a solution aimed at the automatic tagging of messages. Automatic tagging is performed by using the email folders previously created by the users, treating them as labels, and by suggesting multiple labels for each message (top-N). Several machine learning techniques are studied, and the various fields that make up an email message are analyzed to determine their suitability as classification features. The focus of this work is on the textual fields (the subject and the body of the messages), studying different forms of representation, feature selection, and classification algorithms. The participant fields are also evaluated using classification algorithms that represent them with the vector space model or as a graph. The various fields are combined for classification using the Majority Voting classifier-combination technique. The tests are performed on a subset of Enron email messages and on a private dataset provided by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC). These datasets are analyzed in order to understand the characteristics of the data. The system is evaluated using the classifiers' accuracy. The results obtained show significant improvements compared with related work.
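A minimal sketch of the top-N suggestion step combined by Majority Voting, assuming several already-trained classifiers that each return a ranked list of folder labels; the classifier outputs and folder names below are hypothetical, not the thesis's models or data.

```python
from collections import Counter

def top_n_by_majority_vote(per_classifier_rankings, n=3):
    """Combine ranked label lists from several classifiers by majority voting:
    a label earns one vote per classifier that ranks it among its top n."""
    votes = Counter()
    for ranking in per_classifier_rankings:
        for label in ranking[:n]:
            votes[label] += 1
    # Suggest the n labels with the most votes (ties broken arbitrarily).
    return [label for label, _ in votes.most_common(n)]

# Hypothetical rankings from a subject-based, a body-based, and a participants-based classifier.
rankings = [
    ["projects", "meetings", "finance"],
    ["meetings", "projects", "travel"],
    ["projects", "travel", "meetings"],
]
print(top_n_by_majority_vote(rankings, n=3))  # e.g. ['projects', 'meetings', 'travel']
```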
Abstract:
Since collaborative networked organisations are usually formed by independent and heterogeneous entities, it is natural that each member holds its own set of values, and that conflicts among partners might emerge because of some misalignment of values. In contrast, it is often stated in the literature that the alignment between the value systems of members involved in collaborative processes is a prerequisite for successful co-working. As a result, the issue of core value alignment in collaborative networks has started to attract attention. However, methods to analyse such alignment are lacking, mainly because the concept of 'alignment' in this context is still ill-defined and multifaceted. As a contribution to the area, this article introduces an approach based on causal models and graph theory for the analysis of core value alignment in collaborative networks. The potential application of the approach is then discussed in the context of virtual organisations' breeding environments.
Abstract:
Topology optimization consists of finding the spatial distribution of a given total volume of material such that the resulting structure has some optimal property, for instance, maximum structural stiffness or maximum fundamental eigenfrequency. In this paper a Genetic Algorithm (GA) employing a tree-based representation is developed to generate initial feasible individuals that remain feasible upon crossover and mutation and, as such, do not require any repair operator to ensure feasibility. Several application examples are studied involving the topology optimization of structures, where the objective functions are the maximization of the stiffness and the maximization of the first and second eigenfrequencies of a plate, all cases having a prescribed material volume constraint.
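The tree-based encoding is specific to the paper; the sketch below shows only a generic GA skeleton with placeholder genotype, operators, and fitness (all hypothetical), to illustrate where a feasibility-preserving representation would plug in.

```python
import random

# Generic GA skeleton; the paper's feasibility-preserving tree encoding is
# abstracted behind these placeholder operators (all names here are hypothetical).
def random_individual():
    return [random.random() for _ in range(10)]       # stand-in for a tree genotype

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.1):
    return [g if random.random() > rate else random.random() for g in ind]

def fitness(ind):
    return -sum((g - 0.5) ** 2 for g in ind)          # stand-in objective

def evolve(pop_size=30, generations=50):
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

best = evolve()
print(round(fitness(best), 4))
```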
Abstract:
Master's degree in Radiation Applied to Health Technologies. Area of specialization: Magnetic Resonance.
Abstract:
Background: With the decrease of DNA sequencing costs, sequence-based typing methods are rapidly becoming the gold standard for epidemiological surveillance. These methods provide the reproducible and comparable results needed for global-scale bacterial population analysis, while retaining their usefulness for local epidemiological surveys. Online databases that collect the generated allelic profiles and associated epidemiological data are available, but this wealth of data remains underused and is frequently poorly annotated, since no user-friendly tool exists to analyze and explore it. Results: PHYLOViZ is platform-independent Java software that allows the integrated analysis of sequence-based typing methods, including SNP data generated from whole-genome sequence approaches, and associated epidemiological data. goeBURST and its Minimum Spanning Tree expansion are used for visualizing the possible evolutionary relationships between isolates. The results can be displayed as an annotated graph, overlaying the query results on any other epidemiological data available. Conclusions: PHYLOViZ is user-friendly software that allows the combined analysis of multiple data sources for microbial epidemiological and population studies. It is freely available at http://www.phyloviz.net.
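As a simplified illustration of the minimum spanning tree step over allelic profiles, the sketch below links hypothetical profiles by Hamming distance using Prim's algorithm; it omits goeBURST's tie-breaking rules.

```python
# Build a minimum spanning tree over hypothetical MLST allelic profiles,
# using the number of differing loci (Hamming distance) as edge weight.
profiles = {
    "ST1": (1, 1, 1, 1, 1, 1, 1),
    "ST2": (1, 1, 1, 1, 1, 1, 2),
    "ST3": (1, 1, 1, 1, 1, 3, 2),
    "ST4": (4, 1, 1, 1, 1, 1, 1),
}

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def minimum_spanning_tree(nodes):
    """Prim's algorithm; returns MST edges as (weight, u, v)."""
    names = list(nodes)
    in_tree, edges = {names[0]}, []
    while len(in_tree) < len(names):
        w, u, v = min(
            (hamming(nodes[u], nodes[v]), u, v)
            for u in in_tree for v in names if v not in in_tree
        )
        in_tree.add(v)
        edges.append((w, u, v))
    return edges

for w, u, v in minimum_spanning_tree(profiles):
    print(f"{u} -- {v} (distance {w})")
```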
Abstract:
We classify all possible implementations of an Abelian symmetry in the two-Higgs-doublet model with fermions. We identify those symmetries which are consistent with nonvanishing quark masses and a Cabibbo-Kobayashi-Maskawa (CKM) quark-mixing matrix which is not block-diagonal. Our analysis takes us from a plethora of possibilities down to 246 relevant cases, requiring only 34 distinct matrix forms. We show that applying Z_n with n >= 4 to the scalar sector leads to a continuous U(1) symmetry in the whole Lagrangian. Finally, we address the possibilities of spontaneous CP violation and of natural suppression of the flavor-changing neutral currents. We explain why our work is relevant even for non-Abelian symmetries.
Abstract:
Fluorescence confocal microscopy (FCM) is now one of the most important tools in biomedical research. In fact, it makes it possible to accurately study the dynamic processes occurring inside the cell and its nucleus by following the motion of fluorescent molecules over time. Due to the small amount of acquired radiation and the huge optical and electronic amplification, FCM images are usually corrupted by a severe type of Poisson noise. This noise may be even more damaging when very low-intensity incident radiation is used to avoid phototoxicity. In this paper, a Bayesian algorithm is proposed to remove the intensity-dependent Poisson noise corrupting FCM image sequences. The observations are organized in a 3-D tensor where each plane is one of the images acquired over time from a cell nucleus using the fluorescence loss in photobleaching (FLIP) technique. The method removes the noise by simultaneously considering different spatial and temporal correlations. This is accomplished by using an anisotropic 3-D filter that may be tuned separately in the space and time dimensions. Tests using synthetic and real data are presented to illustrate the application of the algorithm. A comparison with several state-of-the-art algorithms is also presented.
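The idea of an anisotropic 3-D filter tuned separately in space and time can be illustrated with a separable Gaussian smoother applied to a synthetic Poisson-noisy image stack; this sketch (sigma values and data are hypothetical) is not the paper's Bayesian denoising algorithm.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# A stack of noisy frames organised as a 3-D tensor: (time, rows, cols).
rng = np.random.default_rng(0)
frames = rng.poisson(lam=5.0, size=(20, 64, 64)).astype(float)

# Anisotropic smoothing: a different sigma per axis lets the filter be tuned
# separately in time (axis 0) and in space (axes 1 and 2).
sigma_time, sigma_space = 2.0, 1.0
smoothed = gaussian_filter(frames, sigma=(sigma_time, sigma_space, sigma_space))

print(frames.var(), "->", smoothed.var())  # variance drops after smoothing
```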
Abstract:
Dissertation presented to the Escola Superior de Comunicação Social in partial fulfillment of the requirements for the degree of Master in Audiovisual and Multimedia.
Abstract:
We consider the quark sector of theories containing three scalar SU(2)_L doublets in the triplet representation of A_4 (or S_4) and three generations of quarks in arbitrary A_4 (or S_4) representations. We show that, for all possible choices of quark field representations and for all possible alignments of the Higgs vacuum expectation values that can constitute global minima of the scalar potential, it is not possible to obtain simultaneously nonvanishing quark masses and a nonvanishing CP-violating phase in the Cabibbo-Kobayashi-Maskawa quark-mixing matrix. As a result, in this minimal form, models with three scalar fields in the triplet representation of A_4 or S_4 cannot be extended to the quark sector in a way consistent with experiment. DOI: 10.1103/PhysRevD.87.055010.
Abstract:
Final Master's dissertation for obtaining the degree of Master in Mechanical Engineering, in the Maintenance and Production profile.