854 results for Probabilistic functions
Abstract:
This work studies the combination of safe and probabilistic reasoning through the hybridization of Monte Carlo integration techniques with continuous constraint programming. In continuous constraint programming there are variables ranging over continuous domains (represented as intervals) together with constraints over them (relations between variables), and the goal is to find values for those variables that satisfy all the constraints (consistent scenarios). Constraint programming “branch-and-prune” algorithms produce safe enclosures of all consistent scenarios. Specially proposed algorithms for probabilistic constraint reasoning compute the probability of sets of consistent scenarios, which implies the calculation of an integral over these sets (quadrature). In this work we propose to extend the “branch-and-prune” algorithms with Monte Carlo integration techniques to compute such probabilities. This approach can be useful in robotics for localization problems. Traditional approaches are based on probabilistic techniques that search for the most likely scenario, which may not satisfy the model constraints. We show how to apply our approach to cope with this problem and provide this functionality in real time.
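To make the quadrature step concrete, the following is a minimal hit-or-miss Monte Carlo sketch, not the paper's algorithm: given interval boxes such as those returned by a branch-and-prune enclosure and the model constraints, it estimates the probability mass of the consistent scenarios, assuming for illustration a uniform prior over a unit-volume search box. All names and numbers are hypothetical.

```python
import random

def estimate_probability(boxes, constraints, samples_per_box=10_000):
    """Hit-or-miss Monte Carlo estimate of the probability mass of the
    consistent scenarios contained in a list of interval boxes.

    boxes       -- list of boxes, each a list of (lo, hi) intervals,
                   e.g. the enclosure produced by branch-and-prune
    constraints -- list of predicates; a point is consistent when all hold
    Assumes a uniform prior over a unit-volume search box (illustrative
    only; any other density could be sampled instead).
    """
    mass = 0.0
    for box in boxes:
        volume = 1.0
        for lo, hi in box:
            volume *= (hi - lo)
        hits = 0
        for _ in range(samples_per_box):
            point = [random.uniform(lo, hi) for lo, hi in box]
            if all(c(point) for c in constraints):
                hits += 1
        mass += volume * hits / samples_per_box
    return mass

# Example: probability that a point in [0,1]^2 lies inside the unit circle,
# enclosed here by a single (trivial) box.
if __name__ == "__main__":
    boxes = [[(0.0, 1.0), (0.0, 1.0)]]
    constraints = [lambda p: p[0] ** 2 + p[1] ** 2 <= 1.0]
    print(estimate_probability(boxes, constraints))  # ~ pi/4
```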
Abstract:
Salmonella enterica serovars are Gram-negative, facultative intracellular bacterial pathogens that infect a wide variety of animals. Salmonella infections are common in humans, usually causing typhoid fever and gastrointestinal diseases. Salmonella enterica serovar Typhimurium (S. Typhimurium), a leading cause of human gastroenteritis, has been extensively used to study the molecular pathogenesis of Salmonella because of the availability of sophisticated genetic tools and of suitable animal and tissue culture models mimicking different aspects of Salmonella infections. (...)
Abstract:
Dendritic cells (DCs) are vital for immunomodulation and the initiation of adaptive immune responses, whereas sialic acids (Sias) are potential immunomodulators. These cells express high levels of the sialyltransferase ST6Gal-1, which transfers Sias to the terminal position of oligosaccharide chains. Indeed, DC maturation is associated with decreased cell surface sialylation. Although its biological significance is unknown, the soluble, extracellular form of ST6Gal-1 is increased in cancers and inflammation. However, extracellular ST6Gal-1 was recently identified as a modulator of hematopoiesis. Considering that DCs play a crucial role in the initiation of a productive anti-cancer immune response, the link between extrinsic sialylation by extracellular ST6Gal-1 and DC function needs to be investigated. We hypothesized that extrinsic α2,6 sialylation of DCs diminishes their maturation features upon lipopolysaccharide (LPS) stimulation. The main goal was to extrinsically α2,6-sialylate mouse bone marrow derived DCs (BMDCs) and to evaluate their maturation and cytokine profiles upon LPS stimulation (by flow cytometry and ELISA, respectively). Contrary to the hypothesis, we observed that the BMDC profile was not modulated, even using several approaches. Conversely, the consequence of lacking cell surface α2,6 Sias for DC maturation was assessed by analysing: 1) sialidase-treated BMDCs, 2) BMDCs from mice lacking ST6Gal-1, and 3) DCs from mouse airways, comparing wild-type with ST6Gal-1 knockout mice. These results suggest that an overall lack of α2,6 Sias is related to increased expression of major histocompatibility complex class II (MHC-II). Although these findings appear controversial, other intracellular mechanisms might be at play upon LPS-induced BMDC activation, probably reducing the effect of extracellular ST6Gal-1. On the other hand, the modification observed in the DC profile of ST6Gal-1 knockout mice might be related to their predisposition to a more severe inflammatory status. With this, the developed work opened future lines of investigation, namely exploring other factors involved in the α2,6 (de)sialylation of DCs, which might have an impact on DC-based immunotherapy.
Abstract:
The assessment of existing timber structures is often limited to information obtained from non- or semi-destructive testing, as mechanical testing is in many cases not possible due to its destructive nature. Therefore, the available data provide only an indirect measurement of the reference mechanical properties of timber elements, often obtained through empirically based correlations. Moreover, the data must result from the combination of different tests in order to provide a reliable source of information for a structural analysis. Even if general guidelines are available for each typology of testing, there is still a need for a global methodology that allows combining information from different sources and inferring upon that information in a decision process. In this scope, the present work presents the implementation of a probability-based framework for the safety assessment of existing timber elements. This methodology combines information gathered at different scales and follows a probabilistic framework allowing for the structural assessment of existing timber elements, with the possibility of inference and updating of their mechanical properties through Bayesian methods. The probability-based framework rests on four main steps: (i) scale of information; (ii) measurement data; (iii) probability assignment; and (iv) structural analysis. In this work, the proposed methodology is applied to a case study. Data were obtained through a multi-scale experimental campaign carried out on old chestnut timber beams, accounting for correlations of non- and semi-destructive tests with mechanical properties. Finally, different inference scenarios are discussed aiming at the characterization of the safety level of the elements.
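As a small illustration of the Bayesian updating step, the sketch below performs a conjugate normal-normal update of the mean of a timber property; the distributions, numbers and function names are illustrative assumptions, not the paper's framework or case-study data.

```python
import math

def update_normal_mean(prior_mean, prior_std, observations, obs_std):
    """Conjugate normal-normal update of the mean of a timber property
    (e.g. bending strength in MPa) given test-derived measurements.

    prior_mean, prior_std -- prior belief about the property mean
    observations          -- list of measured (or correlation-derived) values
    obs_std               -- assumed measurement/model standard deviation
    Returns the posterior mean and standard deviation.
    """
    n = len(observations)
    prior_prec = 1.0 / prior_std ** 2
    obs_prec = n / obs_std ** 2
    post_prec = prior_prec + obs_prec
    post_mean = (prior_prec * prior_mean
                 + obs_prec * (sum(observations) / n)) / post_prec
    return post_mean, math.sqrt(1.0 / post_prec)

# Illustrative numbers only (not the case-study data): a literature-based prior,
# updated with values inferred from non-destructive tests via a correlation.
post_mean, post_std = update_normal_mean(40.0, 8.0, [35.2, 38.9, 33.5, 41.0], 6.0)
print(post_mean, post_std)
```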
Abstract:
A novel framework for the probability-based structural assessment of existing structures, which combines model identification and reliability assessment procedures and considers, in an objective way, different sources of uncertainty, is presented in this paper. A short description of structural assessment applications reported in the literature is given first. Then, the developed model identification procedure, supported by a robust optimization algorithm, is presented. Special attention is given to both experimental and numerical errors, which are considered in the convergence criterion of this algorithm. An updated numerical model is obtained from this process. The reliability assessment procedure, which considers a probabilistic model of the structure under analysis, is then introduced, incorporating the results of the model identification procedure. The developed model is then updated, as new data are acquired, through a Bayesian inference algorithm that explicitly addresses statistical uncertainty. Finally, the developed framework is validated with a set of reinforced concrete beams that were loaded up to failure in the laboratory.
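To illustrate the reliability assessment side, here is a minimal crude Monte Carlo sketch of a failure probability and the corresponding reliability index for a simple resistance-minus-load limit state; the limit state, distributions and parameters are assumptions chosen for illustration, not the paper's probabilistic model of the tested beams.

```python
import math
import random
import statistics

def failure_probability(limit_state, sample_random_variables, n_samples=200_000):
    """Crude Monte Carlo estimate of P[g(X) <= 0] for a limit-state function g
    and a sampler of the basic random variables X (both illustrative here)."""
    failures = sum(1 for _ in range(n_samples)
                   if limit_state(sample_random_variables()) <= 0.0)
    return failures / n_samples

# Illustrative limit state g = R - S for a beam: resistance R and load effect S
# with assumed (hypothetical) parameters.
def sample_rv():
    R = random.lognormvariate(math.log(180.0), 0.10)  # resistance, kNm
    S = random.normalvariate(120.0, 18.0)             # load effect, kNm
    return (R, S)

pf = failure_probability(lambda x: x[0] - x[1], sample_rv)
beta = -statistics.NormalDist().inv_cdf(pf) if pf > 0 else float("inf")
print(pf, beta)  # failure probability and reliability index
```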
Abstract:
A modified version of the metallic-phase pseudofermion dynamical theory (PDT) of the 1D Hubbard model is introduced for the spin dynamical correlation functions of the half-filled 1D Hubbard model Mott-Hubbard phase. The Mott-Hubbard insulator phase PDT is applied to the study of the model's longitudinal and transverse spin dynamical structure factors at finite magnetic field h, focusing in particular on the singularities at excitation energies in the vicinity of the lower thresholds. The relation of our theoretical results to both condensed-matter and ultra-cold-atom systems is discussed.
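For reference, a commonly used definition of the spin dynamical structure factor discussed here is the following (conventions and normalization may differ from those adopted in the paper):

```latex
S^{ab}(k,\omega) = \frac{1}{N}\sum_{j,j'} e^{-ik(j-j')}
\int_{-\infty}^{\infty} \mathrm{d}t\, e^{i\omega t}
\left\langle \hat{S}^{a}_{j}(t)\, \hat{S}^{b}_{j'}(0) \right\rangle ,
\qquad ab = zz \ \text{(longitudinal)}, \quad ab = +- \ \text{(transverse)} .
```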
Abstract:
OBJECTIVE: To evaluate the influence of systolic dysfunction, diastolic dysfunction, or both on congestive heart failure functional class. METHODS: Thirty-six consecutive patients with a clinical diagnosis of congestive heart failure and sinus rhythm, seen between September and November 1998, answered an adapted questionnaire about tolerance to physical activity for the determination of NYHA functional class. The patients were studied with transthoracic Doppler echocardiography. Two groups were compared: group 1 (19 patients in functional classes I and II) and group 2 (17 patients in functional classes III and IV). RESULTS: The average ejection fraction was significantly higher in group 1 (44.84%±8.04% vs. 32.59%±11.48%, p=0.0007). The mean ratio of initial to final maximum diastolic filling velocity (E/A) of the left ventricle was significantly smaller in group 1 (1.07±0.72 vs. 1.98±1.49, p=0.03). The average maximum systolic pulmonary venous velocity (S) was significantly higher in group 1 (53.53±12.02 cm/s vs. 43.41±13.55 cm/s, p=0.02). The mean ratio of maximum systolic to diastolic pulmonary venous velocity was significantly higher in group 1 (1.52±0.48 vs. 1.08±0.48, p=0.01). A predominance of pseudo-normal and restrictive diastolic patterns existed in group 2 (58.83% in group 2 vs. 21.06% in group 1, p=0.03). CONCLUSION: Both the systolic dysfunction index and the patterns of diastolic dysfunction evaluated by Doppler echocardiography worsened with the evolution of congestive heart failure.
Abstract:
In recent decades, increased interest has been evident in research on multi-scale hierarchical modelling in the field of mechanics, and also in the field of wood products and timber engineering. One of the main motivations for hierarchical modelling is to understand how properties, composition and structure at lower scale levels may influence, and be used to predict, the material properties at the macroscopic and structural engineering scale. This chapter presents the applicability of statistical and probabilistic methods, such as the Maximum Likelihood method and Bayesian methods, to the representation of timber's mechanical properties and to their inference accounting for prior information obtained at different scales of importance. These methods allow the analysis of distinct timber reference properties, such as density, bending stiffness and strength, and hierarchically consider information obtained through different non-, semi- or fully destructive tests. The basis and fundamentals of the methods are described, and recommendations and limitations are discussed. The methods may be used in several contexts; however, they require expert knowledge to assess the correct statistical fit and to define the correlation structure between properties.
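As a small illustration of the Maximum Likelihood method mentioned above, the sketch below fits a lognormal distribution to a handful of timber density values, using the closed-form MLE given by the mean and standard deviation of the log-data. The data and function names are hypothetical and not taken from the chapter.

```python
import math

def fit_lognormal_mle(data):
    """Maximum likelihood estimates of a lognormal distribution's parameters
    (mu, sigma of the underlying normal) for strictly positive data such as
    timber density measurements."""
    logs = [math.log(x) for x in data]
    mu = sum(logs) / len(logs)
    sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / len(logs))
    return mu, sigma

# Illustrative density values in kg/m^3 (hypothetical, not the chapter's data).
densities = [545.0, 562.0, 530.0, 588.0, 601.0, 574.0, 559.0]
mu, sigma = fit_lognormal_mle(densities)
print(mu, sigma, math.exp(mu + 0.5 * sigma ** 2))  # parameters and implied mean
```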
Abstract:
La verificación y el análisis de programas con características probabilistas es una tarea necesaria del quehacer científico y tecnológico actual. El éxito y su posterior masificación de las implementaciones de protocolos de comunicación a nivel hardware y soluciones probabilistas a problemas distribuidos hacen más que interesante el uso de agentes estocásticos como elementos de programación. En muchos de estos casos el uso de agentes aleatorios produce soluciones mejores y más eficientes; en otros proveen soluciones donde es imposible encontrarlas por métodos tradicionales. Estos algoritmos se encuentran generalmente embebidos en múltiples mecanismos de hardware, por lo que un error en los mismos puede llegar a producir una multiplicación no deseada de sus efectos nocivos.Actualmente el mayor esfuerzo en el análisis de programas probabilísticos se lleva a cabo en el estudio y desarrollo de herramientas denominadas chequeadores de modelos probabilísticos. Las mismas, dado un modelo finito del sistema estocástico, obtienen de forma automática varias medidas de performance del mismo. Aunque esto puede ser bastante útil a la hora de verificar programas, para sistemas de uso general se hace necesario poder chequear especificaciones más completas que hacen a la corrección del algoritmo. Incluso sería interesante poder obtener automáticamente las propiedades del sistema, en forma de invariantes y contraejemplos.En este proyecto se pretende abordar el problema de análisis estático de programas probabilísticos mediante el uso de herramientas deductivas como probadores de teoremas y SMT solvers. Las mismas han mostrado su madurez y eficacia en atacar problemas de la programación tradicional. Con el fin de no perder automaticidad en los métodos, trabajaremos dentro del marco de "Interpretación Abstracta" el cual nos brinda un delineamiento para nuestro desarrollo teórico. Al mismo tiempo pondremos en práctica estos fundamentos mediante implementaciones concretas que utilicen aquellas herramientas.
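To make concrete the kind of quantity a probabilistic model checker computes from a finite stochastic model, the following minimal sketch (not tied to any specific tool) computes a reachability probability on a small discrete-time Markov chain by value iteration; the toy chain and all names are assumptions for illustration.

```python
def reachability_probability(transition, targets, start, iterations=1000):
    """Value iteration for the probability of eventually reaching a target
    state in a finite discrete-time Markov chain.

    transition -- dict state -> list of (next_state, probability)
    targets    -- set of absorbing target states
    """
    states = set(transition) | targets
    prob = {s: (1.0 if s in targets else 0.0) for s in states}
    for _ in range(iterations):
        for s in transition:
            if s in targets:
                continue
            prob[s] = sum(p * prob[t] for t, p in transition[s])
    return prob[start]

# Toy chain modelling a randomized retry protocol (illustrative only):
# from 'try' the step succeeds with probability 0.9, otherwise it retries.
chain = {"try": [("done", 0.9), ("try", 0.1)], "done": []}
print(reachability_probability(chain, {"done"}, "try"))  # -> approx. 1.0
```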
Abstract:
The Jesuit reductions in Argentina are generally recognized for a single contribution, in the Guaraní region. The truth, however, is that a substantial number of reductions, comparable in number to those mentioned above, developed in the interior of the country, mainly in the Chaco, the northwest and southern Argentina. Many of them find their continuity today in urban centres, and others only, at best, in archaeological remains. We propose the analysis of this set from the first incursions as "misiones volantes" (itinerant missions) in the 17th century up to the expulsion of the Jesuits in 1767. The modalities and processes that generated reduction centres in these regions, and their territorial interrelations, will also be addressed. In addition, we intend to analyse their original functions and morphologies, their evolution, relocations and possible later transformations, in the stages prior to the impact caused by the absence of the Society of Jesus. The results will include the interpretation of the formative processes, with their diversity of cases, at regional and local scales, the compilation of regional and urban cartography, and the determination of typological series of layout forms and urban-fabric organizations, both in the original demarcations and in their remodellings and expansions where applicable.
Abstract:
In our previous project we approximated the computation of a definite integral with integrands of large functional variation. Our approach parallelizes the computation algorithm of an adaptive quadrature method based on Newton-Cotes rules. The first results obtained were communicated at several national and international conferences; they allowed us to begin a characterization of the existing quadrature rules and a classification of some functions used as test functions. These classification and characterization tasks have not been finished, so we intend to continue them in order to report on whether or not it is advantageous to use our technique. To carry out this task, a base of test functions will be sought and the spectrum of quadrature rules to be used will be widened. In addition, we propose to restructure the computation of some routines involved in computing the minimum energy of a molecule. This program already exists in its sequential version and is modelled using the LCAO approximation. It obtains successful results in terms of precision, compared with similar international publications, but requires a significantly high computation time. Our proposal is to parallelize this algorithm on at least two levels: 1) deciding whether it is preferable to distribute the computation of one integral among several processors, or to distribute different integrals among different processors; we must remember that in parallel architectures based on networks (typically local area networks, LANs), the time spent sending messages between processors is very significant when measured in the number of computation operations a processor can complete; 2) if necessary, parallelizing the computation of double and/or triple integrals. For the development of our proposal, heuristics will be developed to verify and build models in the mentioned cases, aiming to improve the already known computation routines, and the algorithms will be tested on benchmark cases. The methodology is the usual one in Numerical Computing. Each proposal requires: a) implementing a computation algorithm, aiming at versions that improve on the existing ones; b) carrying out comparison exercises against the existing routines to confirm or discard better numerical performance; c) carrying out theoretical error studies related to the method and to the implementation. An interdisciplinary team was formed, integrating researchers from both Computer Science and Mathematics. Goals: we expect to obtain a characterization of the quadrature rules according to their effectiveness with functions of oscillatory behaviour and with exponential decay, and to develop adequate, optimized computational implementations based on parallel architectures.
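As a reference point for the sequential building block being parallelized, below is a minimal adaptive quadrature sketch based on the Simpson (Newton-Cotes) rule; the code and test integrand are illustrative assumptions, not the project's implementation. In a parallel version, the level-1 decision described above amounts to choosing whether each recursive subinterval, or each whole integral, is dispatched to a different processor.

```python
import math

def adaptive_simpson(f, a, b, tol=1e-8):
    """Recursive adaptive quadrature based on the Simpson (Newton-Cotes) rule.
    Subdivides the interval wherever the local error estimate exceeds `tol`,
    which is what makes integrands with large functional variation tractable."""
    def simpson(lo, hi):
        mid = 0.5 * (lo + hi)
        return (hi - lo) / 6.0 * (f(lo) + 4.0 * f(mid) + f(hi))

    def recurse(lo, hi, whole, tol):
        mid = 0.5 * (lo + hi)
        left, right = simpson(lo, mid), simpson(mid, hi)
        if abs(left + right - whole) <= 15.0 * tol:
            return left + right + (left + right - whole) / 15.0  # Richardson correction
        return recurse(lo, mid, left, tol / 2.0) + recurse(mid, hi, right, tol / 2.0)

    return recurse(a, b, simpson(a, b), tol)

# Example with an oscillatory integrand of the kind used as a test function.
print(adaptive_simpson(lambda x: math.sin(50.0 * x) ** 2, 0.0, math.pi))  # ~ pi/2
```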
Abstract:
Neuroimaging, functional image analysis, spatial model, cortical surface, spatially variable convolution