916 results for multivehicle interaction directed-graph model
Abstract:
The association between increases in cerebral glucose metabolism and the development of acidosis is largely inferential, based on reports linking hyperglycemia with poor neurological outcome, lactate accumulation, and the severity of acidosis. We measured local cerebral metabolic rate for glucose (lCMRglc) and an index of brain pH, the acid-base index (ABI), concurrently, and characterized their interaction in a model of focal cerebral ischemia in rats in a double-label autoradiographic study using [$^{14}$C]2-deoxyglucose and [$^{14}$C]dimethyloxazolidinedione. Computer-assisted digitization and analysis permitted the simultaneous quantification of the two variables on a pixel-by-pixel basis in the same brain slices. Hemispheres ipsilateral to tamponade-induced middle cerebral artery occlusion showed areas of normal, depressed and elevated glucose metabolic rate (as defined by an interhemispheric asymmetry index) after two hours of ischemia. Regions of normal glucose metabolic rate showed normal ABI (mean pH ± SD = 6.97 ± 0.09), regions of depressed lCMRglc showed severe acidosis (6.69 ± 0.14), and regions of elevated lCMRglc showed moderate acidosis (6.88 ± 0.10), all significantly different at the 0.00125 level by analysis of variance. The moderate acidosis in regions of increased lCMRglc suggests that anaerobic glycolysis generates excess protons through the uncoupling of ATP synthesis and hydrolysis.
Abstract:
We present a novel framework for encoding latency analysis of arbitrary multiview video coding prediction structures. The framework avoids the need to consider a specific encoder architecture by assuming unlimited processing capacity in the multiview encoder. Under this assumption, only the influence of the prediction structure and the processing times has to be considered, and the encoding latency can be computed systematically by means of a graph model. The results obtained with this model are valid for a multiview encoder with sufficient processing capacity and serve as a lower bound otherwise. Furthermore, with the objective of low-latency encoder design with a low penalty on rate-distortion performance, the graph model allows us to identify the prediction relationships that add the most encoding latency. Experimental results for JMVM prediction structures illustrate how low-latency prediction structures with a low rate-distortion penalty can be derived in a systematic manner using the new model.
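As an illustration of the graph-model idea, the following is a minimal sketch (my own, not the paper's implementation): with unlimited processing capacity a frame can start encoding as soon as all of its reference frames are encoded, so each frame's completion time, and hence the encoding latency of the structure, is given by a longest path over the prediction DAG. The frame names, dependencies and processing times are invented for the example.

```python
# Sketch: encoding latency of a prediction structure as a longest path in
# the dependency DAG, assuming unlimited parallel encoding capacity.
from functools import lru_cache

# hypothetical prediction structure: frame -> list of reference frames
references = {
    "I0": [],
    "P1": ["I0"],
    "B2": ["I0", "P1"],
    "P3": ["P1"],
}
proc_time = {"I0": 1.0, "P1": 1.0, "B2": 1.2, "P3": 1.0}  # invented times

@lru_cache(maxsize=None)
def completion(frame):
    """Earliest completion time of `frame` given unlimited encoders."""
    start = max((completion(r) for r in references[frame]), default=0.0)
    return start + proc_time[frame]

# the encoding latency of the structure is set by its critical path
latency = max(completion(f) for f in references)
print(latency)  # -> 3.2, along the path I0 -> P1 -> B2
```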
Abstract:
Directed hypergraphs are an intuitive modelling formalism that has been used in problems related to propositional logic, relational databases, computational linguistics and machine learning. Directed hypergraphs have also been proposed as an alternative to directed (bipartite) graphs to facilitate the study of the interactions between components of complex systems that cannot naturally be modelled as binary relations; in this context, they are known as hyper-networks. A directed hypergraph is a generalization of a directed graph suitable for representing many-to-many relationships: while an edge in a directed graph defines a relation between two nodes, a hyperedge in a directed hypergraph defines a relation between two sets of nodes. Strong connectivity is an equivalence relation that induces a partition of the node set of a directed hypergraph into strongly-connected components. These components can be collapsed into single nodes, so the size of the original hypergraph can be reduced significantly if the strongly-connected components contain many nodes. This approach can contribute to a better understanding of how the nodes of a hypergraph are connected, in particular when the hypergraph is large. For directed graphs, there are efficient algorithms for computing the strongly-connected components of large graphs. For instance, it has been shown that the macroscopic structure of the World Wide Web can be represented as a "bow-tie" diagram in which more than 70% of the nodes are distributed into three large sets, one of which is a large strongly-connected component. This structure has also been observed in complex networks in other fields such as biology. Similar studies could not be conducted on directed hypergraphs because no algorithm existed for computing the strongly-connected components of such hypergraphs.
In this thesis, we investigate how to compute the strongly-connected components of directed hypergraphs. We present two new algorithms and show their correctness and computational complexity. One algorithm is inspired by Tarjan's algorithm for directed graphs; the second follows a simpler approach based on the fact that two strongly-connected nodes reach exactly the same set of nodes. Both algorithms are evaluated empirically to compare their performance, using a selection of random directed hypergraphs generated from well-known random graph models such as Erdős-Rényi, Newman-Watts-Strogatz and Barabási-Albert. Several optimizations for both algorithms are implemented and analysed in the thesis; in particular, collapsing the strongly-connected components of the directed graph obtained by removing certain complex hyperedges from the original hypergraph notably improves the running times on several of the hypergraphs used in the evaluation.
Besides the application examples mentioned above, directed hypergraphs have also been employed in the field of knowledge representation, in particular to compute the modules of an ontology. An ontology is a collection of axioms that provides a formal specification of a set of terms and their relationships, and a module is a subset of an ontology that completely captures the meaning of certain terms as defined in the ontology. We focus on modules computed using the notion of syntactic locality. As ontologies can be very large, the computation of modules facilitates their reuse and maintenance. Analysing all modules of an ontology, however, is in general not feasible, as the number of modules grows exponentially in the number of terms and axioms of the ontology. Nevertheless, the modules can be represented succinctly using the atomic decomposition of an ontology: the ontology is partitioned into atoms, which are maximal sets of axioms that co-occur in every module, and the atomic decomposition is defined as a directed graph in which each node corresponds to an atom and each edge represents a dependency relation between two atoms. In this thesis, we introduce the notion of an axiom dependency hypergraph, which generalizes the atomic decomposition of an ontology: a module of the ontology corresponds to a connected component of the hypergraph, and the atoms of the ontology to its strongly-connected components. We apply our algorithms for directed hypergraphs to axiom dependency hypergraphs and in this manner compute the atoms of an ontology. To demonstrate the viability of this approach, we have implemented the algorithms in HyS, an application that extracts modules and computes the atomic decomposition of ontologies. The thesis provides an experimental evaluation of HyS on a selection of large and prominent biomedical ontologies, most of which are available in the NCBO BioPortal. HyS outperforms state-of-the-art implementations in the tasks of extracting modules and computing the atomic decomposition of these ontologies.
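The simpler of the two approaches described above can be sketched as follows (my reading of the stated idea, not the thesis' code): under the usual firing semantics, a hyperedge (T, H) makes its head set H reachable once its whole tail set T has been reached, and two nodes whose forward sets (self included) coincide are mutually reachable, hence strongly connected. The example hyperedges are invented.

```python
# Sketch: strongly-connected components of a directed hypergraph by
# grouping nodes with identical forward (reachable) sets.

# hypothetical hyperedges: (tail set, head set)
hyperedges = [({"a"}, {"b"}), ({"b"}, {"a", "c"}), ({"a", "c"}, {"d"})]
nodes = {"a", "b", "c", "d"}

def forward_set(source):
    """Nodes reachable from `source` under hyperedge firing semantics."""
    reached = {source}
    changed = True
    while changed:
        changed = False
        for tail, head in hyperedges:
            if tail <= reached and not head <= reached:
                reached |= head      # the hyperedge fires
                changed = True
    return frozenset(reached)

# nodes with equal forward sets (self included) are mutually reachable,
# so each group below is one strongly-connected component
components = {}
for v in nodes:
    components.setdefault(forward_set(v), set()).add(v)
print(list(components.values()))  # e.g. [{'a', 'b'}, {'c'}, {'d'}]
```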
Abstract:
In vitro studies of drug absorption processes are undertaken to assess drug candidate or formulation suitability, to investigate mechanisms, and ultimately to develop predictive models. This study included each of these approaches, with the aim of developing novel in vitro methods for inclusion in a drug absorption model. Two model analgesic drugs, ibuprofen and paracetamol, were selected. The study focused on three main areas: the interaction of the model drugs with co-administered antacids, the elucidation of the mechanisms responsible for the increased absorption rate observed with a novel paracetamol formulation, and the development of novel ibuprofen tablet formulations containing alkalising excipients as dissolution promoters. Several novel dissolution methods were developed. A method to study the interaction of drug/excipient mixtures in powder form was successfully used to select suitable dissolution-enhancing excipients. A method to study intrinsic dissolution rate using paddle apparatus was developed and used to study dissolution mechanisms. Methods to simulate stomach and intestine environments in terms of media composition and volume and drug/antacid doses were developed; antacid addition greatly increased the dissolution of ibuprofen in the stomach model. Novel methods to measure drug permeability through rat stomach and intestine were developed using sac methodology, allowing direct comparison of the apparent permeability values obtained. Tissue stability, reproducibility and integrity were observed, with selectivity between paracellular and transcellular markers and between hydrophilic and lipophilic compounds within a homologous series of beta-blockers.
Abstract:
This paper concerns the problem of agent trust in an electronic marketplace. We maintain that agent trust involves making decisions under uncertainty and that the phenomenon should therefore be modelled probabilistically. We propose a probabilistic framework that models agent interactions as a Hidden Markov Model (HMM): the observations of the HMM are the interaction outcomes, and the hidden state is the underlying probability of a good outcome. The task of deciding whether to interact with another agent then reduces to probabilistic inference of the current state of that agent given all previous interaction outcomes. The model is extended with a probabilistic reputation system in which agents gather opinions about other agents and fuse them with their own beliefs. Because the system is fully probabilistic, it delivers the following improvements over previous work: (a) the model assumptions are faithfully translated into algorithms, and the system is optimal under those assumptions; (b) it can account for agents whose behaviour is not static in time; and (c) it can estimate the rate at which an agent's behaviour changes. The system is shown to significantly outperform previous state-of-the-art methods in several numerical experiments. Copyright © 2010, International Foundation for Autonomous Agents and Multiagent Systems (www.ifaamas.org). All rights reserved.
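As a rough illustration of the filtering step (my own sketch, not the paper's model): discretize the hidden "probability of a good outcome" onto a grid of states, let it drift through a transition kernel so that changing behaviour can be tracked, and update the belief after each observed outcome. All probabilities and outcomes below are invented.

```python
# Sketch: HMM forward filtering of an agent's hidden "good outcome" rate.
import numpy as np

states = np.linspace(0.05, 0.95, 10)        # candidate good-outcome rates
n = len(states)
stay = 0.9                                   # prob. behaviour is unchanged
trans = np.full((n, n), (1 - stay) / (n - 1))
np.fill_diagonal(trans, stay)                # simple drift kernel (assumed)

belief = np.full(n, 1.0 / n)                 # uniform prior over states
outcomes = [1, 1, 0, 1, 1, 1, 0, 1]          # 1 = good interaction, 0 = bad

for y in outcomes:
    belief = belief @ trans                  # predict: behaviour may drift
    likelihood = states if y == 1 else 1 - states
    belief *= likelihood                     # update with the new outcome
    belief /= belief.sum()                   # renormalize

# expected probability of a good next outcome drives the trust decision
print(float(states @ belief))
```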
Abstract:
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
The work presents a theoretical framework for the evaluation of e-teaching that aims at positioning the online activities designed and developed by the teacher along the Learning, Interaction and Technology dimensions. The theoretical research underlying the study reflects current thinking on promoting the quality of teaching and integrating information and communication tools into the curriculum in Higher Education (HE), bearing in mind European guidelines and policies on this subject. In this way, the study addresses one of its stated aims: to contribute to the development of a conceptual framework to support research on the evaluation of e-teaching in the context of HE. Based on the theoretical research carried out, an evaluation tool (SCAI) was designed, integrating two questionnaires developed to collect teachers' and students' perceptions of the development of e-activities. An empirical study was then structured and carried out, allowing the SCAI tool to be tested and validated on real cases. From the comparison of the theoretical framework with the analysis of the data obtained, we found that differences in teaching should be valued and seen as assets by HE institutions rather than erased in a globalizing perspective.
Abstract:
We introduced a spectral clustering algorithm based on the bipartite graph model for the manufacturing cell formation problem in [Oliveira S, Ribeiro JFF, Seok SC. A spectral clustering algorithm for manufacturing cell formation. Computers and Industrial Engineering. 2007 (submitted for publication)]. It constructs two similarity matrices, one for parts and one for machines, and executes a spectral clustering algorithm on each separately to find families of parts and cells of machines. The similarity measure in that approach used only limited information between parts and between machines. This paper reviews several well-known similarity measures that have been used for Group Technology. Computational clustering results are compared by various performance measures. © 2008 The Society of Manufacturing Engineers. Published by Elsevier Ltd. All rights reserved.
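For readers unfamiliar with the technique, the following is a minimal sketch of the general spectral recipe, not of the cited algorithm itself: a part-part similarity matrix (invented here) is turned into a graph Laplacian, and a two-way split of parts into families is read off the sign pattern of the Fiedler vector.

```python
# Sketch: two-way spectral partition of parts from a similarity matrix.
import numpy as np

# hypothetical part-part similarity (e.g. number of shared machines)
S = np.array([
    [0, 3, 2, 0, 0],
    [3, 0, 3, 0, 1],
    [2, 3, 0, 1, 0],
    [0, 0, 1, 0, 4],
    [0, 1, 0, 4, 0],
], dtype=float)

D = np.diag(S.sum(axis=1))           # degree matrix
L = D - S                            # unnormalized graph Laplacian
eigvals, eigvecs = np.linalg.eigh(L)
fiedler = eigvecs[:, 1]              # eigenvector of 2nd smallest eigenvalue

families = fiedler > 0               # two part families from the sign split
print(families)                      # e.g. parts {0,1,2} vs parts {3,4}
```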
Abstract:
This article modifies the usual form of the Dubinin-Radushkevich pore-filling model for application to liquid-phase adsorption data, where large molecules are often involved. In such cases it is necessary to include the repulsive part of the energy in the micropores, which is accomplished here by relating the pore potential to the fluid-solid interaction potential. The model also considers the nonideality of the bulk liquid phase through the UNIFAC activity coefficient model, as well as the structural heterogeneity of the carbon. For the latter, the generalized adsorption integral is used, incorporating the pore-size distribution obtained by density functional theory analysis of argon adsorption data. The model is applied to the interpretation of aqueous-phase adsorption isotherms of three different esters on three commercial activated carbons. Excellent agreement between the model and the experimental data is observed, and the fitted Lennard-Jones size parameter for the adsorbate-adsorbate interactions compares well with that estimated from known critical properties, supporting the modified approach. By contrast, the model without consideration of bulk nonideality, or when using classical models of the characteristic energy, gives much poorer fits to the data and unrealistic parameter values.
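For context, the classical pore-filling isotherm that the article sets out to modify has the standard textbook form below (quoted from the general Dubinin-Radushkevich literature, not from this article; the article's modified pore potential is not reproduced here):

```latex
% Classical Dubinin--Radushkevich isotherm, liquid-phase adsorption potential
\begin{equation}
  W = W_0 \exp\!\left[-\left(\frac{A}{\beta E_0}\right)^{2}\right],
  \qquad A = RT \ln\frac{c_s}{c},
\end{equation}
% W: filled micropore volume, W_0: total micropore volume,
% E_0: characteristic energy, beta: affinity coefficient,
% A: adsorption potential, with c_s the solubility limit in the liquid phase.
```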
Abstract:
Monocytes/macrophages are important targets for dengue virus (DENV) replication; they induce inflammatory mediators and are sources of viral dissemination in the initial phase of the disease. Apoptosis is an active, genetically regulated process of cellular destruction in which a complex enzymatic pathway is activated, and it may be triggered by many viral infections. Since the mechanisms of apoptotic induction in DENV-infected target cells are not yet defined, we investigated the virus-cell interaction using a model of primary human monocyte infection with DENV-2, with the aim of identifying apoptotic markers. Cultures analyzed by flow cytometry and confocal microscopy yielded DENV antigen-positive cells at rates that peaked on the second day post infection (p.i.) and decayed afterwards, and produced the apoptosis-related cytokines TNF-α and IL-10. Phosphatidylserine, an early marker of apoptosis, was increased at the cell surface, and the Fas death receptor was upregulated on the second day p.i., at significantly higher rates in DENV-infected cell cultures than in controls. However, no detectable changes were observed in the expression of the anti-apoptotic protein Bcl-2 in infected cultures. Our data support virus modulation of extrinsic apoptotic factors in the in vitro model of human monocyte DENV-2 infection; DENV may interfere in activation and death mechanisms by inducing apoptosis in target cells.
Abstract:
Statistical computing when input/output is driven by a Graphical User Interface is considered. A proposal is made for automatic control of the computational flow to ensure that only strictly required computations are actually carried out. The computational flow is modelled by a directed graph for implementation in any object-oriented programming language with symbolic manipulation capabilities. A complete implementation example is presented to compute and display frequency-based piecewise linear density estimators such as histograms or frequency polygons.
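A small sketch of how such a dependency-graph controller might look (an assumption of mine, in Python rather than the paper's setting; the node names and the histogram chain are invented): each node caches its value and recomputes only when a predecessor has been invalidated.

```python
# Sketch: directed-graph control of computational flow with lazy recompute.
class Node:
    def __init__(self, compute, deps=()):
        self.compute, self.deps = compute, list(deps)
        self.dependents, self.stale, self.cache = [], True, None
        for d in self.deps:
            d.dependents.append(self)

    def value(self):
        if self.stale:  # recompute only when strictly required
            self.cache = self.compute(*(d.value() for d in self.deps))
            self.stale = False
        return self.cache

    def invalidate(self):  # propagate staleness downstream only
        if not self.stale:
            self.stale = True
            for d in self.dependents:
                d.invalidate()

# hypothetical flow: raw data -> bin counts -> histogram heights
data = Node(lambda: [1, 2, 2, 3, 3, 3])
counts = Node(lambda xs: {v: xs.count(v) for v in set(xs)}, [data])
heights = Node(lambda c: {k: v / sum(c.values()) for k, v in c.items()}, [counts])

print(heights.value())          # computes the whole chain once
data.compute = lambda: [1, 1, 2]
data.invalidate()               # only now is the chain recomputed
print(heights.value())
```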
Abstract:
Tractable cases of the binary CSP fall mainly into two classes: constraint language restrictions and constraint graph restrictions. To better understand and identify the hardest binary CSPs, in this work we propose methods to increase their hardness by increasing the balance of both the constraint language and the constraint graph. The balance of a constraint is increased by maximizing the number of domain elements with the same number of occurrences, while the balance of the graph is defined using the classical definition from graph theory. We present two graph models: a first model that increases the balance of a graph by maximizing the number of vertices with the same degree, and a second that additionally increases the girth of the graph, since a high girth implies a high treewidth, an important parameter for binary CSP hardness. Our results show that the more balanced graph models and constraints yield instances that are harder, by several orders of magnitude, than typical random binary CSP instances. We also detect, at least for sparse constraint graphs, a higher treewidth in our graph models.
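To make the balancing idea concrete, here is an illustrative sketch (not the authors' generator): the disallowed tuples of a binary constraint are drawn from two round-robin streams of shuffled domain values, so that every domain element occurs the same number of times on each side. Function names and parameters are invented, and duplicate tuples are tolerated for brevity.

```python
# Sketch: generating a "balanced" binary constraint, i.e. disallowed tuples
# in which every domain value occurs equally often on each side.
import random

def balanced_constraint(domain_size, n_tuples, seed=0):
    """Disallowed tuples with near-uniform value occurrences per side."""
    rng = random.Random(seed)
    def stream():
        vals = []
        while len(vals) < n_tuples:          # each value recurs equally often
            block = list(range(domain_size))
            rng.shuffle(block)
            vals.extend(block)
        return vals[:n_tuples]
    return list(zip(stream(), stream()))

nogoods = balanced_constraint(domain_size=5, n_tuples=10)
left = [a for a, _ in nogoods]
print(sorted(left.count(v) for v in range(5)))  # -> [2, 2, 2, 2, 2]
```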
Abstract:
This work aimed to study the interaction between the model plant Arabidopsis thaliana and Xanthomonas campestris pv. campestris (Xcc), the pathogen responsible for black rot of crucifers. The response of 32 accessions of A. thaliana to the Brazilian Xcc isolate CNPH 17 was evaluated. No immunity-like response was observed. "CS1308", "CS1566" and "CS1643" grown in continuous light were among the accessions that showed the strongest resistance when inoculated with 5 x 10^6 CFU/mL. In contrast, "CS1194" and "CS1492" were among the most susceptible accessions. Similar results were obtained when plants were grown under short-day conditions. To quantify the differences in disease symptoms, total chlorophyll was extracted from contrasting accessions at different time points after inoculation. Chlorophyll levels from controls and Xcc-inoculated plants showed a similar reduction in resistant accessions, whereas Xcc-inoculated susceptible accessions showed a greater reduction compared to controls. To test the specificity of resistance, accessions CS1308, CS1566, CS1643 and CS1438 (which showed partial resistance to CNPH 17) were inoculated with a more aggressive Xcc isolate (CNPH 77) and with Ralstonia solanacearum. Among the accessions tested, "CS1566" was the most resistant to Xcc CNPH 77 and also displayed resistance to R. solanacearum. Accessions CS1308, CS1566 and CS1643 were also inoculated with a high titer of Xcc CNPH 17 (5 x 10^8 CFU/mL). No collapse of tissue was observed up to 48 h after inoculation, indicating that a hypersensitive response is not involved in the resistance displayed by these accessions.
Abstract:
This qualitative study examined the perceived thoughts, feelings and experiences of seven public health nurses employed in a southern Ontario health department regarding the initial phase of the introduction of a self-directed orientation program in their place of employment. The purpose of the study was to understand what factors facilitate public health nurses in the process of becoming self-directed learners. Data were gathered by three methods: 1) a standard open-ended interview of approximately one hour conducted by the researcher with each nurse; 2) personal notes kept by the researcher throughout the study; and 3) a review of all pertinent health department documents, such as typed minutes of meetings and memos, that referred to the introduction of the self-directed learning model. The meaning of the experience for the nurses provided insights into what does and does not facilitate public health nurses in becoming self-directed learners. Implications and recommendations for program planners, nurse administrators, facilitators of learning and researchers evolved from the findings of this study.