999 results for Pruning methods
Abstract:
"Series title: Springerbriefs in applied sciences and technology, ISSN 2191-530X"
Abstract:
Strategic funding of the UID/BIO/04469/2013 unit and of project ref. RECI/BBB-EBI/0179/2012 (project number FCOMP-01-0124-FEDER-027462), together with the post-doctoral grant of Xanel Vecino (ref. SFRH/BPD/101476/2014), funded by the Fundação para a Ciência e a Tecnologia, Portugal
Abstract:
Under the framework of constraint-based modeling, genome-scale metabolic models (GSMMs) have been used for several tasks, such as metabolic engineering and phenotype prediction. More recently, their application in health-related research has spanned drug discovery, biomarker identification and host-pathogen interactions, targeting diseases such as cancer, Alzheimer's disease, obesity or diabetes. In recent years, the development of novel techniques for genome sequencing and other high-throughput methods, together with advances in Bioinformatics, has allowed the reconstruction of GSMMs for human cells. Considering the diversity of cell types and tissues present in the human body, it is imperative to develop tissue-specific metabolic models. Methods to generate these models automatically, based on generic human metabolic models and a plethora of omics data, have been proposed. However, their results have not yet been adequately and critically evaluated and compared. This work presents a survey of the most important tissue- or cell-type-specific metabolic model reconstruction methods, which use literature, transcriptomics, proteomics and metabolomics data, together with a global template model. As a case study, we analyzed the consistency between several omics data sources and reconstructed distinct metabolic models of hepatocytes using different methods and data sources as inputs. The results show that omics data sources overlap poorly and, in some cases, are even contradictory. Additionally, the hepatocyte metabolic models generated are in many cases unable to perform metabolic functions known to be present in liver tissue. We conclude that reliable methods for a priori omics data integration are required to support the reconstruction of complex models of human cells.
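In its simplest form, the constraint-based framework mentioned in this abstract reduces to flux balance analysis: a linear program over the stoichiometric matrix. The following is a minimal sketch on a toy three-reaction network; the stoichiometry, bounds and objective are illustrative assumptions, not taken from any published GSMM.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (metabolites x reactions); illustrative only.
# R1: uptake of A, R2: A -> B, R3: secretion of B (objective proxy).
S = np.array([
    [1, -1,  0],   # metabolite A
    [0,  1, -1],   # metabolite B
])

# Flux bounds: uptake capped at 10 units; internal fluxes non-negative.
bounds = [(0, 10), (0, None), (0, None)]

# FBA: maximize flux through R3 subject to the steady-state constraint S @ v = 0.
c = np.array([0, 0, -1])  # linprog minimizes, so negate the objective
res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)

print("optimal flux distribution:", res.x)  # expected: [10, 10, 10]
print("objective flux:", -res.fun)
```

Tissue-specific reconstruction methods then prune or constrain such a model using omics evidence before solving this kind of program.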
Abstract:
This paper presents an automated optimization framework able to provide network administrators with resilient routing configurations for link-state protocols, such as OSPF or IS-IS. To deal with the formulated NP-hard optimization problems, the devised framework is underpinned by computational intelligence optimization engines, such as Multi-objective Evolutionary Algorithms (MOEAs). To demonstrate the framework's capabilities, two illustrative Traffic Engineering methods are described, enabling routing configurations that are robust to changes in the traffic demands and keep the network stable even in the presence of link failure events. The presented illustrative results clearly corroborate the usefulness of the proposed automated framework and of the devised optimization methods.
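In frameworks of this kind, a candidate solution is typically a vector of integer link weights, evaluated by routing the demands over shortest paths and scoring the resulting link loads. The sketch below is a deliberately simplified, single-objective evolutionary loop on a made-up four-router topology; the topology, demands, cost function and mutation scheme are assumptions for illustration, not the paper's actual MOEA.

```python
import random
import networkx as nx

# Toy topology (4 routers, 5 links) and demands in arbitrary units; illustrative only.
LINKS = [(0, 1), (0, 2), (1, 3), (2, 3), (1, 2)]
CAPACITY = 10.0
DEMANDS = {(0, 3): 8.0, (1, 2): 4.0}

def congestion_cost(weights):
    """Route each demand on its shortest path and score the link utilizations."""
    G = nx.Graph()
    for (u, v), w in zip(LINKS, weights):
        G.add_edge(u, v, weight=w)
    load = {link: 0.0 for link in LINKS}
    for (s, t), demand in DEMANDS.items():
        path = nx.shortest_path(G, s, t, weight="weight")
        for u, v in zip(path, path[1:]):
            link = (u, v) if (u, v) in load else (v, u)
            load[link] += demand
    # Convex (quadratic) cost in utilization: overloading one link costs
    # more than spreading the same traffic over several links.
    return sum((l / CAPACITY) ** 2 for l in load.values())

def evolve(pop_size=20, generations=50, max_weight=20):
    pop = [[random.randint(1, max_weight) for _ in LINKS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=congestion_cost)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        for parent in survivors:                  # one-point random mutation
            child = parent[:]
            child[random.randrange(len(LINKS))] = random.randint(1, max_weight)
            children.append(child)
        pop = survivors + children
    return min(pop, key=congestion_cost)

best = evolve()
print("best weights:", best, "cost:", round(congestion_cost(best), 3))
```

A real MOEA would keep a Pareto front over several objectives (e.g., normal-state congestion versus post-failure congestion) instead of this single scalar cost.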
Abstract:
"Series: Solid mechanics and its applications, vol. 226"
Abstract:
OBJECTIVE - The aim of our study was to assess the performance of a wrist monitor, the Omron Model HEM-608, compared with the indirect method for blood pressure measurement. METHODS - Our study population consisted of 100 subjects, 29 normotensive and 71 hypertensive. Participants had their blood pressure checked 8 times with alternating techniques, 4 by the indirect method and 4 with the Omron wrist monitor. The validation criteria used to test this device were based on internationally recognized protocols. RESULTS - Our data showed that the Omron HEM-608 reached a classification of B for systolic and A for diastolic blood pressure, according to one of these protocols. The mean differences between blood pressure values obtained with each of the methods were -2.3 ± 7.9 mmHg for systolic and 0.97 ± 5.5 mmHg for diastolic blood pressure. Therefore, we considered this type of device approved according to the selected criteria. CONCLUSION - Our study leads us to conclude that this wrist monitor is not only easy to use, but also produces results very similar to those obtained by the standard indirect method.
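The validation statistic reported here is the mean and standard deviation of the paired device-versus-reference differences. A minimal sketch of that computation on synthetic readings follows; the data are generated, not the study's, and the 5/8 mmHg threshold is an AAMI-style criterion stated as an assumption, since the abstract does not name the protocol.

```python
import numpy as np

# Synthetic paired readings (mmHg); illustrative only, not study data.
rng = np.random.default_rng(0)
reference = rng.normal(130, 15, size=100)             # indirect (auscultatory) method
device = reference + rng.normal(-2.3, 7.9, size=100)  # wrist monitor with bias + noise

diff = device - reference
mean_d, sd_d = diff.mean(), diff.std(ddof=1)
print(f"mean difference: {mean_d:+.1f} mmHg, SD: {sd_d:.1f} mmHg")

# Assumed AAMI-style criterion: |mean difference| <= 5 mmHg and SD <= 8 mmHg.
print("passes criterion:", abs(mean_d) <= 5 and sd_d <= 8)
```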
Abstract:
In recent decades, increased interest has been evident in research on multi-scale hierarchical modelling in the field of mechanics, and also in the field of wood products and timber engineering. One of the main motivations for hierarchical modelling is to understand how properties, composition and structure at lower scale levels may influence, and be used to predict, the material properties at the macroscopic and structural engineering scale. This chapter presents the applicability of statistical and probabilistic methods, such as the Maximum Likelihood method and Bayesian methods, to the representation of timber's mechanical properties and their inference, accounting for prior information obtained at different scales of importance. These methods allow the analysis of distinct timber reference properties, such as density, bending stiffness and strength, and can hierarchically consider information obtained through different non-destructive, semi-destructive or destructive tests. The basis and fundamentals of the methods are described, and recommendations and limitations are discussed. The methods may be used in several contexts; however, they require expert knowledge to assess the correct statistical fit and to define the correlation structure between properties.
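As a concrete instance of the Bayesian updating described above, the sketch below combines a prior for mean timber density with a handful of new measurements, using the conjugate normal-normal model with known variance. All numbers are illustrative assumptions, not values from the chapter.

```python
import numpy as np

# Assumed prior for mean timber density (kg/m^3), e.g. from literature.
mu0, tau0 = 450.0, 30.0   # prior mean and prior standard deviation
sigma = 40.0              # assumed known measurement standard deviation

# Synthetic density measurements, e.g. from non-destructive tests.
x = np.array([470.0, 455.0, 480.0, 465.0, 472.0])
n, xbar = len(x), x.mean()

# Conjugate normal-normal update: precision-weighted combination of
# prior mean and sample mean.
post_var = 1.0 / (1.0 / tau0**2 + n / sigma**2)
post_mean = post_var * (mu0 / tau0**2 + n * xbar / sigma**2)

print(f"posterior mean: {post_mean:.1f} kg/m^3, posterior SD: {post_var**0.5:.1f}")
```

The hierarchical schemes discussed in the chapter chain updates of this kind across scales, with each level's posterior serving as the prior for the next.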
Abstract:
OBJECTIVE: To assess, in myocardium specimens obtained from necropsies, the correlation between the concentration of hydroxyproline, measured with the photocolorimetric method, and the intensity of fibrosis, determined with the morphometric method. METHODS: Left ventricle myocardium samples were obtained from 45 patients who had undergone necropsy, some with a variety of cardiopathies and others without any heart disease. The concentrations of hydroxyproline were determined with the photocolorimetric method. In the histologic sections from each heart, myocardial fibrosis was quantified using a light microscope with an integrating ocular lens. RESULTS: Medians of 4.5 and 4.3 µg of hydroxyproline/mg of dry weight were found in fixed and nonfixed left ventricle myocardium fragments, respectively. A positive correlation occurred between the hydroxyproline concentrations and the intensity of fibrosis, both in the fixed (Sr=+0.25; p=0.099) and in the nonfixed (Sr=+0.32; p=0.03) specimens. CONCLUSION: The biochemical methodology proved adequate, whereas manual morphometry was shown to have limitations that may interfere with the statistical significance of correlations for estimating fibrosis intensity in the human myocardium.
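The Sr values above are Spearman rank correlations. A minimal sketch of that computation on synthetic paired data (not the study's measurements) follows.

```python
import numpy as np
from scipy.stats import spearmanr

# Synthetic paired measurements; illustrative only, not the study data.
rng = np.random.default_rng(1)
hydroxyproline = rng.normal(4.4, 1.0, size=45)  # ug/mg dry weight (illustrative)
fibrosis = 2.0 + 0.5 * hydroxyproline + rng.normal(0, 1.2, size=45)  # % area

# Spearman's rho is rank-based, so it tolerates non-normal data and outliers.
rho, p = spearmanr(hydroxyproline, fibrosis)
print(f"Spearman rho = {rho:+.2f}, p = {p:.3f}")
```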
Abstract:
OBJECTIVE: To evaluate the performance of the turbidimetric method for C-reactive protein (CRP) as a measure of low-grade inflammation in patients admitted with non-ST elevation acute coronary syndromes (ACS). METHODS: Serum samples obtained at hospital arrival from 68 patients (66±11 years, 40 men), admitted with unstable angina or non-ST elevation acute myocardial infarction, were used to measure CRP by the methods of nephelometry and turbidimetry. RESULTS: The medians of C-reactive protein by the turbidimetric and nephelometric methods were 0.5 mg/dL and 0.47 mg/dL, respectively. A strong linear association existed between the two methods, according to the regression coefficient (b=0.75; 95% CI=0.70-0.80) and correlation coefficient (r=0.96; P<0.001). The mean difference between the nephelometric and turbidimetric CRP was 0.02 ± 0.91 mg/dL, and 100% agreement between the methods in the detection of high CRP was observed. CONCLUSION: In patients with non-ST elevation ACS, CRP values obtained by turbidimetry show a strong linear association with the method of nephelometry and perfect agreement in the detection of high CRP.
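The statistics reported here are a least-squares regression slope with its confidence interval, a Pearson correlation, and the mean ± SD of the paired differences. A minimal sketch on synthetic CRP pairs (not the study's data; distribution parameters are assumptions chosen to mimic the reported medians) is shown below.

```python
import numpy as np
from scipy.stats import linregress

# Synthetic paired CRP values (mg/dL); illustrative only, not the study data.
rng = np.random.default_rng(2)
nephelometry = rng.lognormal(mean=-0.7, sigma=0.8, size=68)   # median ~0.5 mg/dL
turbidimetry = 0.75 * nephelometry + rng.normal(0, 0.1, size=68)

fit = linregress(nephelometry, turbidimetry)
ci = 1.96 * fit.stderr  # approximate 95% CI half-width for the slope
print(f"b = {fit.slope:.2f} (95% CI {fit.slope - ci:.2f}-{fit.slope + ci:.2f}), "
      f"r = {fit.rvalue:.2f}, p = {fit.pvalue:.3g}")

# Agreement between methods: mean and SD of the paired differences.
diff = nephelometry - turbidimetry
print(f"mean difference: {diff.mean():+.2f} +/- {diff.std(ddof=1):.2f} mg/dL")
```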
Abstract:
The verification and analysis of programs with probabilistic features is a necessary task in today's scientific and technological practice. The success, and subsequent widespread adoption, of hardware-level implementations of communication protocols and of probabilistic solutions to distributed problems make the use of stochastic agents as programming elements more than interesting. In many of these cases, the use of random agents produces better and more efficient solutions; in others, they provide solutions where it is impossible to find them by traditional methods. These algorithms are generally embedded in multiple hardware mechanisms, so an error in them can produce an undesired multiplication of their harmful effects. Currently, the greatest effort in the analysis of probabilistic programs is devoted to the study and development of tools called probabilistic model checkers. Given a finite model of the stochastic system, these tools automatically obtain several performance measures of it. Although this can be quite useful when verifying programs, for general-purpose systems it becomes necessary to check more complete specifications that bear on the correctness of the algorithm. It would even be interesting to obtain the system's properties automatically, in the form of invariants and counterexamples. This project intends to address the problem of static analysis of probabilistic programs through the use of deductive tools such as theorem provers and SMT solvers, which have shown their maturity and effectiveness in attacking problems of traditional programming. In order not to lose automation in the methods, we will work within the framework of "Abstract Interpretation", which provides an outline for our theoretical development. At the same time, we will put these foundations into practice through concrete implementations that use those tools.
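To make the SMT-based approach concrete: checking that a candidate loop invariant is inductive can be phrased as a satisfiability query, where an unsatisfiable result means no reachable step can break the invariant. The sketch below uses the z3-solver Python bindings on a deliberately simple, non-probabilistic loop; the program and invariant are illustrative assumptions, not artifacts of the project.

```python
from z3 import Ints, Solver, And, Not, sat

# Check that the candidate invariant x + y == n is inductive for the loop
#   while x > 0: x -= 1; y += 1
# by asking Z3 whether some state satisfying the invariant and the loop
# guard can step to a state violating the invariant.
x, y, n, x2, y2 = Ints("x y n x2 y2")

invariant = x + y == n
guard = x > 0
step = And(x2 == x - 1, y2 == y + 1)   # transition relation of the loop body
invariant_after = x2 + y2 == n

s = Solver()
s.add(invariant, guard, step, Not(invariant_after))
print("counterexample found" if s.check() == sat else "invariant is inductive")
```

For probabilistic programs, the analogous queries quantify over expectations rather than states, which is where the abstract-interpretation framework mentioned above comes in.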
Abstract:
Identification and characterization of the problem: The problem guiding this project seeks to answer questions such as: In what way do the types of activities that are designed become devices that enable students' comprehension of the topics of each course? From this question, a further one arises: When solving the activities, which cognitive strategies do students bring into play, and which of them favor knowledge-construction processes? Hypotheses: - Courses whose activities are designed under the Problem-Based Learning and Case Study methodology foster meaningful learning by students. - Activities designed under the Problem-Based Learning and Case Study methodology require more complex cognitive processes than those implemented in traditional activities. Objective: - To identify the impact of traditional learning activities, and of those designed under the Problem-Based Learning and Case Study methodology, on student learning. Materials and Methods: a) Analysis of the learning activities of the first and second year of the Law degree, in its distance-learning modality. b) Interviews with both content teachers and tutors. c) Surveys of, and interviews with, students. Expected results: We intend to confirm that learning activities designed under the Problem-Based Learning and Case Study methodology promote meaningful learning in students. Importance and relevance of the project: The relevance of this project can be identified through two major, interrelated variables: one related to the didactic device (the strategies implemented by the students) and one related to the institution (the innovative character of the teaching proposal and the possibility of extending it to other courses). This project aims to implement improvements in the design of learning activities, in order to promote in students the generation of responsible ideas and solutions and the development of their analytical and reflective capacity.