971 results for Methods: observational
Abstract:
"Series title: Springerbriefs in applied sciences and technology, ISSN 2191-530X"
Abstract:
Under the framework of constraint-based modeling, genome-scale metabolic models (GSMMs) have been used for several tasks, such as metabolic engineering and phenotype prediction. More recently, their application in health-related research has spanned drug discovery, biomarker identification and host-pathogen interactions, targeting diseases such as cancer, Alzheimer's disease, obesity and diabetes. In recent years, the development of novel techniques for genome sequencing and other high-throughput methods, together with advances in Bioinformatics, has allowed the reconstruction of GSMMs for human cells. Considering the diversity of cell types and tissues in the human body, it is imperative to develop tissue-specific metabolic models. Methods to generate these models automatically, based on generic human metabolic models and a plethora of omics data, have been proposed. However, their results have not yet been adequately and critically evaluated and compared. This work presents a survey of the most important tissue- or cell-type-specific metabolic model reconstruction methods, which use literature, transcriptomics, proteomics and metabolomics data, together with a global template model. As a case study, we analyzed the consistency between several omics data sources and reconstructed distinct metabolic models of hepatocytes using different methods and data sources as inputs. The results show that the omics data sources overlap poorly and, in some cases, are even contradictory. Additionally, the hepatocyte metabolic models generated are in many cases unable to perform metabolic functions known to be present in liver tissue. We conclude that reliable methods for a priori omics data integration are required to support the reconstruction of complex models of human cells.
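As a hedged illustration of the constraint-based framework this abstract builds on, the sketch below runs a flux balance analysis (FBA) on a toy three-reaction network with scipy; the network, bounds and reaction roles are hypothetical and not drawn from any of the hepatocyte models discussed.

```python
# Minimal FBA sketch: maximize "biomass" flux subject to the
# steady-state constraint S v = 0 and flux bounds. The toy network is
# hypothetical: uptake -> A, A -> B, B -> biomass.
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix S (metabolites x reactions).
S = np.array([
    [1, -1,  0],   # metabolite A: produced by uptake, consumed by A->B
    [0,  1, -1],   # metabolite B: produced by A->B, consumed by biomass
])

# Flux bounds: irreversible reactions, uptake capped at 10 units.
bounds = [(0, 10), (0, 1000), (0, 1000)]

# Maximize flux through the biomass reaction (column 2); linprog
# minimizes, so the objective is negated.
c = np.array([0, 0, -1])

res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
print("optimal biomass flux:", -res.fun)   # expected: 10.0
```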
Abstract:
This paper presents an automated optimization framework able to provide network administrators with resilient routing configurations for link-state protocols, such as OSPF or IS-IS. To deal with the formulated NP-hard optimization problems, the devised framework is underpinned by computational intelligence optimization engines, such as Multi-objective Evolutionary Algorithms (MOEAs). To demonstrate the framework's capabilities, two illustrative Traffic Engineering methods are described, which attain routing configurations that are robust to changes in the traffic demands and keep the network stable even in the presence of link-failure events. The illustrative results presented clearly corroborate the usefulness of the proposed automated framework along with the devised optimization methods.
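To make the evolutionary weight-setting idea concrete, here is a deliberately simplified single-objective sketch; the framework itself uses multi-objective EAs and a full OSPF routing model, whereas the topology, the congestion surrogate and all parameters below are hypothetical.

```python
# Simplified evolutionary search for OSPF link weights. The fitness
# function is a placeholder: the real framework routes a demand matrix
# over shortest paths and penalizes congested links.
import random

LINKS = ["a-b", "b-c", "a-c"]            # toy topology
WEIGHT_RANGE = (1, 20)                   # admissible link weights

def congestion(weights):
    # Hypothetical surrogate cost standing in for a routing simulation.
    return abs(weights["a-b"] + weights["b-c"] - weights["a-c"])

def random_individual():
    return {link: random.randint(*WEIGHT_RANGE) for link in LINKS}

population = [random_individual() for _ in range(30)]
for _ in range(100):
    population.sort(key=congestion)      # rank by fitness
    parents = population[:10]            # elitist selection
    children = []
    for _ in range(20):
        child = random.choice(parents).copy()
        child[random.choice(LINKS)] = random.randint(*WEIGHT_RANGE)  # mutate
        children.append(child)
    population = parents + children

best = min(population, key=congestion)
print("best weights:", best, "cost:", congestion(best))
```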
Abstract:
OBJECTIVE: To describe echocardiographic measurements and left ventricular mass in a population sample of healthy adult inhabitants of the urban region of Porto Alegre. METHODS: An analytical, observational, population-based, cross-sectional study was carried out. Through a multi-stage probability sample, 114 individuals were selected to undergo M-mode and two-dimensional echocardiography with color Doppler. The analyses were restricted to healthy participants. Echocardiographic measurements were described by mean, standard deviation, 95th percentile and 95% confidence limits. RESULTS: A total of 100 healthy participants, with several characteristics similar to those of the original population, had a complete and reliable echocardiographic examination. The measurements of the aorta, left atrium, interventricular septum, left ventricle in systole and diastole, left posterior wall and left ventricular mass, whether or not adjusted for body surface area or height, were significantly higher in males. Right ventricle size was similar between the sexes. Several echocardiographic measurements were within standard normal limits. The interventricular septum, left posterior wall and left ventricular mass, whether or not adjusted for anthropometric measurements, and the aortic dimensions had lower means and ranges than the reference limits. CONCLUSION: The means and estimates of distribution for the measurements of the interventricular septum, left posterior wall and left ventricular mass found in this survey were lower than those indicated by the international literature and accepted as normal limits.
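For readers who want the descriptive statistics named in METHODS made explicit, a minimal sketch follows; the left-ventricular-mass values are invented for illustration and are not the study's data.

```python
# Mean, sample SD, 95th percentile and 95% confidence limits for a
# hypothetical series of left ventricular mass measurements (grams).
import numpy as np
from scipy import stats

lv_mass = np.array([148.0, 162.5, 140.2, 155.8, 171.3, 150.9])

mean = lv_mass.mean()
sd = lv_mass.std(ddof=1)                 # sample standard deviation
p95 = np.percentile(lv_mass, 95)
# 95% confidence limits for the mean (t distribution, small sample).
ci = stats.t.interval(0.95, df=len(lv_mass) - 1,
                      loc=mean, scale=stats.sem(lv_mass))

print(f"mean={mean:.1f} g  sd={sd:.1f}  95th pct={p95:.1f}  95% CI={ci}")
```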
Abstract:
"Series: Solid mechanics and its applications, vol. 226"
Abstract:
OBJECTIVE - The aim of our study was to assess the performance of a wrist monitor, the Omron Model HEM-608, compared with the indirect method of blood pressure measurement. METHODS - Our study population consisted of 100 subjects, 29 normotensive and 71 hypertensive. Participants had their blood pressure checked 8 times with alternating techniques, 4 times by the indirect method and 4 times with the Omron wrist monitor. The validation criteria used to test this device were based on internationally recognized protocols. RESULTS - Our data showed that the Omron HEM-608 reached classification B for systolic and A for diastolic blood pressure, according to one of these protocols. The mean differences between blood pressure values obtained with each of the methods were -2.3 ± 7.9 mmHg for systolic and 0.97 ± 5.5 mmHg for diastolic blood pressure. We therefore considered this type of device approved according to the selected criteria. CONCLUSION - Our study leads us to conclude that this wrist monitor is not only easy to use, but also produces results very similar to those obtained by the standard indirect method.
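A minimal sketch of the mean-difference computation behind figures such as "-2.3 ± 7.9 mmHg", together with the within-5/10/15 mmHg tallies that device-grading protocols of this kind rely on; the paired readings below are hypothetical.

```python
# Bland-Altman style comparison of a device against a reference method.
import numpy as np

systolic_reference = np.array([128, 142, 119, 151, 133], dtype=float)  # mmHg
systolic_device    = np.array([125, 140, 118, 148, 130], dtype=float)  # mmHg

diff = systolic_device - systolic_reference
print(f"mean difference = {diff.mean():.1f} ± {diff.std(ddof=1):.1f} mmHg")

# Grading protocols additionally score the share of readings falling
# within 5, 10 and 15 mmHg of the reference.
for limit in (5, 10, 15):
    pct = np.mean(np.abs(diff) <= limit) * 100
    print(f"within {limit} mmHg: {pct:.0f}%")
```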
Abstract:
In recent decades, increased interest has been evident in research on multi-scale hierarchical modelling in the field of mechanics, and also in the field of wood products and timber engineering. One of the main motivations for hierarchical modelling is to understand how properties, composition and structure at lower scale levels may influence, and be used to predict, the material properties on a macroscopic and structural engineering scale. This chapter presents the applicability of statistical and probabilistic methods, such as the Maximum Likelihood method and Bayesian methods, to the representation of timber's mechanical properties and their inference, accounting for prior information obtained at different scales of importance. These methods allow the analysis of distinct reference properties of timber, such as density, bending stiffness and strength, and hierarchically consider information obtained through different non-destructive, semi-destructive or destructive tests. The basis and fundamentals of the methods are described, and recommendations and limitations are also discussed. The methods may be used in several contexts; however, they require expert knowledge to assess the correct statistical fit and to define the correlation structure between properties.
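As a hedged illustration of the Bayesian ingredient described above, the sketch below performs a conjugate normal update of a prior belief about timber density with hypothetical test data; the chapter's actual models are richer (hierarchical, multi-property), and the noise variance here is assumed known.

```python
# Conjugate normal-normal update: prior knowledge about timber density
# (e.g., from literature or a lower-scale model) combined with tests.
import numpy as np

prior_mean, prior_var = 450.0, 30.0**2   # kg/m^3, hypothetical prior
noise_var = 20.0**2                      # assumed known test variance

tests = np.array([431.0, 455.0, 442.0, 438.0])   # hypothetical data
n = len(tests)

# Posterior for the mean density under the conjugate model.
post_var = 1.0 / (1.0 / prior_var + n / noise_var)
post_mean = post_var * (prior_mean / prior_var + tests.sum() / noise_var)

print(f"posterior density: {post_mean:.1f} kg/m^3 (sd {post_var**0.5:.1f})")
```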
Abstract:
OBJECTIVE - To determine the prevalence of risk factors for coronary artery disease in the State of Rio Grande do Sul, Brazil, and to identify their relation to age bracket. METHODS - We carried out an observational, cross-sectional study of 1,066 adults older than 20 years in the Brazilian state of Rio Grande do Sul. We investigated the following risk factors: familial antecedents, systemic arterial hypertension, high levels of cholesterol and glycemia, overweight/obesity, smoking and sedentary lifestyle. A standardized questionnaire completed at the patients' dwellings by health agents was used; the data were stored in an EPI-INFO software database. The results were expressed with a 95% confidence interval. RESULTS - Females made up 51.8% of the sample. The prevalences of the risk factors were: 1) sedentary lifestyle: 71.3%; 2) familial antecedents: 57.3%; 3) overweight/obesity (body mass index >25): 54.7%; 4) smoking: 33.9%; 5) hypertension: 31.6% (considering >140/90 mmHg) and 14.4% (considering >160/95 mmHg); 6) high glycemia (>126 mg/dL): 7%; 7) high cholesterol (>240 mg/dL): 5.6%. CONCLUSION - The prevalence of the major risk factors for coronary artery disease in the Brazilian state of Rio Grande do Sul could be determined in a study that integrated public and private institutions.
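Since the abstract reports results with 95% confidence intervals, here is a minimal sketch of the normal-approximation interval for one of them; the case count is back-calculated from the reported 31.6% hypertension prevalence and is therefore approximate.

```python
# 95% confidence interval for a prevalence estimate (normal approx.).
import math

n = 1066
cases = 337                 # ~31.6% of the sample (back-calculated)
p = cases / n
se = math.sqrt(p * (1 - p) / n)
lo, hi = p - 1.96 * se, p + 1.96 * se
print(f"prevalence = {p:.1%}, 95% CI = ({lo:.1%}, {hi:.1%})")
```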
Abstract:
OBJECTIVE: To assess, in myocardium specimens obtained from necropsies, the correlation between the concentration of hydroxyproline, measured with the photocolorimetric method, and the intensity of fibrosis, determined with the morphometric method. METHODS: Left ventricle myocardium samples were obtained from 45 patients who had undergone necropsy, some with a variety of cardiopathies and others without any heart disease. The concentrations of hydroxyproline were determined with the photocolorimetric method. In the histologic sections from each heart, myocardial fibrosis was quantified using a light microscope with an integrating ocular lens. RESULTS: Medians of 4.5 and 4.3 µg of hydroxyproline/mg of dry weight were found in fixed and non-fixed left ventricle myocardium fragments, respectively. A positive correlation occurred between the hydroxyproline concentrations and the intensity of fibrosis, both in the fixed (Sr = +0.25; p = 0.099) and in the non-fixed (Sr = +0.32; p = 0.03) specimens. CONCLUSION: The biochemical methodology proved adequate, and manual morphometry was shown to have limitations that may interfere with the statistical significance of correlations for the estimate of fibrosis intensity in the human myocardium.
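A minimal sketch of the Spearman rank correlation (the "Sr" the abstract reports) between hydroxyproline concentration and fibrosis intensity; the paired values below are hypothetical.

```python
# Spearman rank correlation between two paired measurement series.
from scipy.stats import spearmanr

hydroxyproline = [3.8, 4.1, 4.5, 5.2, 4.9, 6.0]   # µg/mg dry weight
fibrosis_pct   = [2.0, 3.5, 3.1, 6.8, 5.4, 7.9]   # % of myocardium

rho, p_value = spearmanr(hydroxyproline, fibrosis_pct)
print(f"Sr = {rho:+.2f}, p = {p_value:.3f}")
```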
Abstract:
OBJECTIVE: To analyze the reasons given by patients for interrupting their pharmacological treatment of hypertension. METHODS: We carried out an observational, cross-sectional study in which a questionnaire was applied and blood pressure was measured in 401 patients in different centers of the state of Bahia. The patients selected had been diagnosed with hypertension and had not been on antihypertensive treatment for at least 60 days. Clinical and epidemiological characteristics of the groups were analyzed. RESULTS: Of the 401 patients, 58.4% were female, 55.6% of whom were white; 60.5% of the males were white. The main reasons given for not adhering to treatment were as follows (for males and females, respectively): normalization of blood pressure (41.3% and 42.3%); side effects of the medications (31.7% and 24.8%); forgetting to use the medication (25.2% and 20.1%); cost of medication (21.6% and 20.1%); fear of mixing alcohol and medication (23.4% and 3.8%); being unaware of the need to continue treatment (15% and 21.8%); use of an alternative treatment (11.4% and 17.1%); fear of intoxication (9.6% and 12.4%); fear of hypotension (9.6% and 12%); and fear of mixing the medication with other drugs (8.4% and 6.1%). CONCLUSION: Our data suggest that most factors related to abandonment of hypertension treatment stem from lack of information and that, despite advances in antihypertensive drugs, side effects still account for most abandonments of treatment.
Abstract:
OBJECTIVE: To evaluate the performance of the turbidimetric method for C-reactive protein (CRP) as a measure of low-grade inflammation in patients admitted with non-ST elevation acute coronary syndromes (ACS). METHODS: Serum samples obtained at hospital arrival from 68 patients (66±11 years, 40 men) admitted with unstable angina or non-ST elevation acute myocardial infarction were used to measure CRP by nephelometry and turbidimetry. RESULTS: The medians of C-reactive protein by the turbidimetric and nephelometric methods were 0.5 mg/dL and 0.47 mg/dL, respectively. A strong linear association existed between the two methods, according to the regression coefficient (b = 0.75; 95% CI = 0.70-0.80) and the correlation coefficient (r = 0.96; P < 0.001). The mean difference between the nephelometric and turbidimetric CRP was 0.02 ± 0.91 mg/dL, and 100% agreement between the methods in the detection of high CRP was observed. CONCLUSION: In patients with non-ST elevation ACS, CRP values obtained by turbidimetry show a strong linear association with those obtained by nephelometry and perfect agreement in the detection of high CRP.
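A minimal sketch of the two comparisons the abstract reports, ordinary least-squares regression of one assay on the other plus the mean paired difference; the CRP values (mg/dL) are hypothetical.

```python
# Method comparison: regression and mean difference between two assays.
import numpy as np
from scipy import stats

nephelometry = np.array([0.30, 0.47, 0.52, 1.10, 2.40, 0.15])
turbidimetry = np.array([0.28, 0.50, 0.49, 1.05, 2.31, 0.18])

fit = stats.linregress(nephelometry, turbidimetry)
diff = nephelometry - turbidimetry
print(f"b = {fit.slope:.2f}, r = {fit.rvalue:.2f}")
print(f"mean difference = {diff.mean():.2f} ± {diff.std(ddof=1):.2f} mg/dL")
```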
Abstract:
OBJECTIVE: To determine the prevalence of dyslipidemias in adults in the city of Campos dos Goytacazes, in the Brazilian state of Rio de Janeiro, and to identify their relation to risk factors. METHODS: A cross-sectional, population-based, observational study with cluster sampling stratified according to socioeconomic level, sex, and age, including 1,039 individuals. Risk factors, familial history, blood pressure, anthropometric measurements, glucose, triglycerides and cholesterol were determined. RESULTS: The following prevalences were observed: dyslipidemias, 24.2%; hypercholesterolemia, 4.2%; elevated LDL-C, 3.5%; low HDL-C, 18.3%; and hypertriglyceridemia, 17.1%. The following mean levels were observed: cholesterol, 187.6±33.7 mg/dL; LDL-C, 108.7±26.8 mg/dL; HDL-C, 48.5±7.7 mg/dL; and triglycerides, 150.1±109.8 mg/dL. The following variables showed a positive correlation with dyslipidemia: increased age (P<0.001), male sex (P<0.001), low familial income (P<0.001), familial history (P<0.01), overweight/obesity (P<0.001), waist circumference (P<0.001), high blood pressure (P<0.001), and diabetes mellitus (P<0.001). The following variables had no influence on dyslipidemias: ethnicity, educational level, smoking habits, and sedentary lifestyle. CONCLUSION: The frequency of lipid changes in the population studied was high, suggesting that measures for early diagnosis should be taken, together with the implementation of programs for primary and secondary prevention of atherosclerosis.
Abstract:
The verification and analysis of programs with probabilistic features is a necessary task in current scientific and technological practice. The success, and subsequent widespread adoption, of hardware-level implementations of communication protocols and of probabilistic solutions to distributed problems make the use of stochastic agents as programming elements more than appealing. In many of these cases the use of random agents produces better and more efficient solutions; in others, it provides solutions where traditional methods cannot find any. These algorithms are generally embedded in multiple hardware mechanisms, so an error in them can produce an undesired multiplication of their harmful effects.

Currently, the greatest effort in the analysis of probabilistic programs is devoted to the study and development of tools called probabilistic model checkers. Given a finite model of the stochastic system, these tools automatically compute several performance measures of it. Although this can be quite useful when verifying programs, for general-purpose systems it becomes necessary to check more complete specifications bearing on the correctness of the algorithm. It would even be interesting to obtain the properties of the system automatically, in the form of invariants and counterexamples.

This project aims to address the problem of static analysis of probabilistic programs through the use of deductive tools such as theorem provers and SMT solvers, which have shown their maturity and effectiveness in attacking problems of traditional programming. In order not to lose automation in the methods, we will work within the framework of "Abstract Interpretation", which provides a guideline for our theoretical development. At the same time, we will put these foundations into practice through concrete implementations that use those tools.
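In the spirit of the deductive approach the project proposes, the sketch below discharges a loop-invariant verification condition with an SMT solver (Z3's Python API). The program and invariant are hypothetical, and the probabilistic choice is over-approximated as nondeterminism, a common abstraction when full probabilistic reasoning is not needed.

```python
# Check that a candidate invariant is preserved by one loop iteration.
# Loop: while x > 0, with some probability x := x - 1, else x unchanged.
from z3 import Int, And, Or, Implies, Not, Solver, unsat

x, xp, n = Int("x"), Int("xp"), Int("n")

inv = lambda v: And(v >= 0, v <= n)      # candidate invariant: 0 <= v <= n
step = Or(xp == x - 1, xp == x)          # probabilistic branch, abstracted
                                         # as a nondeterministic choice

# Verification condition: invariant + guard + one step => invariant again.
vc = Implies(And(inv(x), x > 0, step), inv(xp))

s = Solver()
s.add(Not(vc))                           # search for a counterexample
if s.check() == unsat:
    print("invariant is preserved by the loop body")
else:
    print("counterexample:", s.model())
```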