993 results for sequential methods
Abstract:
This paper presents an automated optimization framework able to provide network administrators with resilient routing configurations for link-state protocols, such as OSPF or IS-IS. To deal with the formulated NP-hard optimization problems, the devised framework is underpinned by computational intelligence optimization engines, such as Multi-objective Evolutionary Algorithms (MOEAs). To demonstrate the framework's capabilities, two illustrative Traffic Engineering methods are described, allowing routing configurations to be attained that are robust to changes in traffic demands and that keep the network stable even in the presence of link failure events. The illustrative results presented clearly corroborate the usefulness of the proposed automated framework along with the devised optimization methods.
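To make the kind of optimization engine described above concrete, here is a minimal sketch of Pareto-based evolutionary search over link weights, assuming a made-up toy topology and two hypothetical objective proxies (congestion and failure resilience); it is an illustration only, not the authors' framework.

import random

LINKS = ["a-b", "a-c", "b-c", "b-d", "c-d"]      # assumed toy topology
POP, GENS, W_MAX = 30, 50, 20

def random_config():
    # one candidate: an integer OSPF-style weight per link
    return {l: random.randint(1, W_MAX) for l in LINKS}

def congestion(cfg):
    # hypothetical objective 1: crude proxy for worst-link utilization
    return max(cfg.values()) / sum(cfg.values())

def failure_penalty(cfg):
    # hypothetical objective 2: crude proxy for resilience to link failures
    return sum(abs(cfg[l] - W_MAX // 2) for l in LINKS) / len(LINKS)

def objectives(cfg):
    return (congestion(cfg), failure_penalty(cfg))

def dominates(f1, f2):
    # standard Pareto dominance for minimization
    return all(a <= b for a, b in zip(f1, f2)) and f1 != f2

pop = [random_config() for _ in range(POP)]
for _ in range(GENS):
    child = dict(random.choice(pop))
    child[random.choice(LINKS)] = random.randint(1, W_MAX)   # mutate one weight
    dominated = [c for c in pop if dominates(objectives(child), objectives(c))]
    if dominated:
        pop.remove(random.choice(dominated))                 # steady-state replacement
        pop.append(child)

front = [c for c in pop
         if not any(dominates(objectives(o), objectives(c)) for o in pop)]
print(f"{len(front)} non-dominated weight settings kept")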
Abstract:
"Series: Solid mechanics and its applications, vol. 226"
Abstract:
OBJECTIVE: To use published Hypertension Optimal Treatment (HOT) Study data to evaluate changes in cardiovascular mortality in nondiabetic hypertensive patients according to the degree of reduction in their diastolic blood pressure. METHODS: In the HOT Study, 18,790 patients from various centers were randomly allocated to groups with different diastolic blood pressure targets: ≤90 mmHg (n=6,264); ≤85 mmHg (n=6,264); ≤80 mmHg (n=6,262). Felodipine was the basic drug used. Other antihypertensive drugs were administered sequentially, aiming at the diastolic blood pressure targets. RESULTS: The group of nondiabetic hypertensive subjects with diastolic pressure ≤80 mmHg had a cardiovascular mortality rate of 4.1/1,000 patients/year, 35.5% higher than the group with diastolic pressure ≤90 mmHg (cardiovascular mortality rate, 3.1/1,000 patients/year). In contrast, diabetic patients allocated to the ≤80 mmHg diastolic pressure target group had a 66.7% reduction in cardiovascular mortality (3.7/1,000 patients/year) when compared with the ≤90 mmHg diastolic pressure group (cardiovascular mortality rate, 11.1/1,000 patients/year). CONCLUSION: The results indicate that in hypertensive diabetic patients, reduction in diastolic blood pressure to levels ≤80 mmHg decreases the risk of fatal cardiovascular events. It remains necessary to define the diastolic blood pressure level below 90 mmHg at which the maximal reduction in cardiovascular mortality is obtained for nondiabetics.
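As a quick check of the arithmetic, the 66.7% relative reduction quoted for diabetic patients follows directly from the two reported rates: $\frac{11.1 - 3.7}{11.1} \approx 0.667 = 66.7\%$.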
Abstract:
OBJECTIVE - A prospective, nonrandomized clinical study to assess splanchnic perfusion based on intramucosal pH in the postoperative period of cardiac surgery and to follow the evolution of patients during hospitalization. METHODS - We studied 10 children during the immediate postoperative period after elective cardiac surgery. Sequential intramucosal pH measurements were taken without dobutamine (T0) and with dobutamine at 5 µg/kg/min (T1) and 10 µg/kg/min (T2). In the pediatric intensive care unit, intramucosal pH measurements were made on admission and 4, 8, 12, and 24 hours thereafter. RESULTS - The patients had an increase in intramucosal pH values with dobutamine at 10 µg/kg/min [7.19±0.09 (T0), 7.16±0.13 (T1), and 7.32±0.16 (T2)] (p=0.103). During the hospitalization period, the intramucosal pH values were the following: 7.20±0.13 (upon admission), 7.27±0.16 (after 4 hours), 7.26±0.07 (after 8 hours), 7.32±0.12 (after 12 hours), and 7.38±0.08 (after 24 hours) (p=0.045). No deaths occurred, and none of the patients developed multiple organ system dysfunction. CONCLUSION - An increase in and normalization of intramucosal pH was observed after dobutamine use. Measurement of intramucosal pH is a type of monitoring that is easy to perform and free of complications in children during the postoperative period of cardiac surgery.
Abstract:
OBJECTIVE - The aim of our study was to assess the performance of a wrist monitor, the Omron Model HEM-608, compared with the indirect method for blood pressure measurement. METHODS - Our study population consisted of 100 subjects, 29 normotensive and 71 hypertensive. Participants had their blood pressure checked 8 times with alternating techniques, 4 by the indirect method and 4 with the Omron wrist monitor. The validation criteria used to test this device were based on internationally recognized protocols. RESULTS - Our data showed that the Omron HEM-608 reached classification B for systolic and A for diastolic blood pressure, according to one of these protocols. The mean differences between blood pressure values obtained with each of the methods were -2.3±7.9 mmHg for systolic and 0.97±5.5 mmHg for diastolic blood pressure. Therefore, we considered this type of device approved according to the criteria selected. CONCLUSION - Our study leads us to conclude that this wrist monitor is not only easy to use, but also produces results very similar to those obtained by the standard indirect method.
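The device-minus-reference statistics quoted above (mean difference and its standard deviation) can be computed as in this minimal sketch; the paired readings are hypothetical, not study data.

import statistics

device    = [118, 126, 135, 142, 151, 160]   # systolic by wrist device (made up)
reference = [120, 127, 138, 145, 152, 163]   # systolic by indirect method (made up)

diffs = [d - r for d, r in zip(device, reference)]
print(f"mean difference = {statistics.mean(diffs):+.1f} "
      f"± {statistics.stdev(diffs):.1f} mmHg")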
Abstract:
In recent decades, increased interest has been evident in research on multi-scale hierarchical modelling in the field of mechanics, and also in the field of wood products and timber engineering. One of the main motivations for hierarchical modelling is to understand how properties, composition, and structure at lower scale levels may influence, and be used to predict, the material properties at a macroscopic and structural engineering scale. This chapter presents the applicability of statistical and probabilistic methods, such as the Maximum Likelihood method and Bayesian methods, in the representation of timber's mechanical properties and their inference, accounting for prior information obtained at different scales of importance. These methods allow the analysis of distinct timber reference properties, such as density, bending stiffness, and strength, and hierarchically consider information obtained through different non-destructive, semi-destructive, or destructive tests. The basis and fundamentals of the methods are described, and recommendations and limitations are discussed. The methods may be used in several contexts; however, they require expert knowledge to assess the correct statistical fit and to define the correlation structure between properties.
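As an illustration of the kind of Bayesian updating discussed, here is a minimal sketch under an assumed conjugate-normal model; the prior parameters and measurements are hypothetical, not taken from the chapter.

import math

mu0, tau0 = 450.0, 30.0                # assumed prior mean density (kg/m^3) and prior sd
sigma = 25.0                           # assumed known measurement sd
data = [465.0, 472.0, 455.0, 480.0]    # hypothetical density measurements

n, xbar = len(data), sum(data) / len(data)
# Conjugate update for a normal mean with known variance:
prec = 1 / tau0**2 + n / sigma**2
mu_post = (mu0 / tau0**2 + n * xbar / sigma**2) / prec
sd_post = math.sqrt(1 / prec)
print(f"posterior density mean = {mu_post:.1f} kg/m^3, sd = {sd_post:.1f}")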
Abstract:
OBJECTIVE: To study the effect of propafenone on the contractile function of the latissimus dorsi muscle isolated from rats in an organ chamber. METHODS: We studied 20 latissimus dorsi muscles of Wistar rats, divided into 2 groups: group I (n=10), the control group, in which we studied the feasibility of muscle contractility; and group II (n=10), comprising the contralateral muscles, in which we analyzed the effect of propafenone on muscle contractility. After a muscle ring was constructed, 8 sequential 2-minute bath periods were performed, with intervals of preprogrammed electrical stimulation delivered by a pacemaker at 50 stimuli/min. In group II, propafenone, at a concentration of 9.8 µg/mL, was added to the bath in period 2 and withdrawn in period 4. RESULTS: In group I, no significant depression in muscle contraction occurred up to period 5 (p>0.05). In group II, a significant depression occurred in all periods, except between the last 2 periods (p<0.05). Comparing groups I and II only in period 1, which was a standard period for both groups, we found no significant difference (p>0.05). CONCLUSION: Propafenone had a depressant effect on the contractile function of the latissimus dorsi muscle isolated from rats and studied in an organ chamber.
Abstract:
OBJECTIVE: To assess, in myocardium specimens obtained from necropsies, the correlation between the concentration of hydroxyproline, measured with the photocolorimetric method, and the intensity of fibrosis, determined with the morphometric method. METHODS: Left ventricular myocardium samples were obtained from 45 patients who had undergone necropsy, some with a variety of cardiopathies and others without any heart disease. The concentrations of hydroxyproline were determined with the photocolorimetric method. In the histologic sections from each heart, myocardial fibrosis was quantified by using a light microscope with an integrating ocular lens. RESULTS: Medians of 4.5 and 4.3 µg of hydroxyproline/mg of dry weight were found in fixed and nonfixed left ventricular myocardium fragments, respectively. A positive correlation occurred between the hydroxyproline concentrations and the intensity of fibrosis, both in the fixed (Sr=+0.25; p=0.099) and in the nonfixed (Sr=+0.32; p=0.03) specimens. CONCLUSION: The biochemical methodology proved to be adequate, and manual morphometry was shown to have limitations that may interfere with the statistical significance of correlations for estimating fibrosis intensity in the human myocardium.
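The coefficient reported above (Sr, presumably a Spearman rank correlation) can be computed as in this minimal sketch, assuming SciPy is available and using made-up data.

from scipy.stats import spearmanr

hydroxyproline = [3.9, 4.2, 4.5, 5.1, 6.0, 4.8]   # µg/mg dry weight (made up)
fibrosis       = [2.0, 2.5, 3.1, 4.0, 5.2, 3.6]   # % fibrosis by morphometry (made up)

rho, p = spearmanr(hydroxyproline, fibrosis)
print(f"Spearman r = {rho:+.2f}, p = {p:.3f}")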
Abstract:
OBJECTIVE: To analyze the frequency and prevalence of congenital heart defects in a tertiary care center for children with heart diseases. METHODS: We carried out an epidemiological assessment of the first medical visit of 4,538 children in a pediatric hospital from January 1995 to December 1997. All patients with congenital heart defects had their diagnoses confirmed at least by echocardiography. The frequency and prevalence of the anomalies were computed according to the classification of sequential segmental analysis. Age, weight, and sex were compared between the groups of healthy individuals and those with congenital heart defects after stratification by age group. RESULTS: Of all the children assessed, 2,017 (44.4%) were diagnosed with congenital heart disease, 201 (4.4%) with acquired heart disease, 52 (1.2%) with arrhythmias, and 2,268 (50%) were healthy. Congenital heart diseases predominated in neonates and infants, corresponding to 71.5% of the cases. Weight and age were significantly lower in children with congenital heart defects. Ventricular septal defect was the most frequent acyanotic anomaly, and tetralogy of Fallot was the most frequent cyanotic anomaly. CONCLUSION: Children with congenital heart defects are mainly referred during the neonatal period and infancy, with impairment in weight gain. Ventricular septal defect is the most frequent heart defect.
Abstract:
OBJECTIVE: To evaluate the performance of the turbidimetric method of C-reactive protein (CRP) as a measure of low-grade inflammation in patients admitted with non-ST elevation acute coronary syndromes (ACS). METHODS: Serum samples obtained at hospital arrival from 68 patients (66±11 years, 40 men), admitted with unstable angina or non-ST elevation acute myocardial infarction were used to measure CRP by the methods of nephelometry and turbidimetry. RESULTS: The medians of C-reactive protein by the turbidimetric and nephelometric methods were 0.5 mg/dL and 0.47 mg/dL, respectively. A strong linear association existed between the 2 methods, according to the regression coefficient (b=0.75; 95% C.I.=0.70-0.80) and correlation coefficient (r=0.96; P<0.001). The mean difference between the nephelometric and turbidimetric CRP was 0.02 ± 0.91 mg/dL, and 100% agreement between the methods in the detection of high CRP was observed. CONCLUSION: In patients with non-ST elevation ACS, CRP values obtained by turbidimetry show a strong linear association with the method of nephelometry and perfect agreement in the detection of high CRP.
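For illustration, the slope, correlation, and mean-difference comparisons described above can be reproduced as in this minimal sketch; the paired CRP values are hypothetical, not study data.

import numpy as np

turbidimetry = np.array([0.30, 0.50, 0.85, 1.20, 2.00, 3.50])   # mg/dL (made up)
nephelometry = np.array([0.28, 0.47, 0.88, 1.15, 2.10, 3.40])   # mg/dL (made up)

slope, intercept = np.polyfit(turbidimetry, nephelometry, 1)    # least-squares fit
r = np.corrcoef(turbidimetry, nephelometry)[0, 1]               # Pearson correlation
diff = nephelometry - turbidimetry
print(f"b = {slope:.2f}, r = {r:.2f}, "
      f"mean difference = {diff.mean():+.2f} ± {diff.std(ddof=1):.2f} mg/dL")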
Abstract:
The verification and analysis of programs with probabilistic features is a necessary task in current scientific and technological practice. The success, and subsequent widespread adoption, of hardware-level implementations of communication protocols and of probabilistic solutions to distributed problems make the use of stochastic agents as programming elements more than interesting. In many of these cases, the use of random agents produces better and more efficient solutions; in others, they provide solutions where it is impossible to find them by traditional methods. These algorithms are generally embedded in multiple hardware mechanisms, so an error in them can produce an undesired multiplication of their harmful effects. Currently, the greatest effort in the analysis of probabilistic programs is devoted to the study and development of tools called probabilistic model checkers. Given a finite model of the stochastic system, these tools automatically obtain several performance measures of it. Although this can be quite useful when verifying programs, for general-purpose systems it becomes necessary to check more complete specifications that bear on the correctness of the algorithm. It would even be interesting to obtain the system's properties automatically, in the form of invariants and counterexamples. This project aims to address the problem of static analysis of probabilistic programs through the use of deductive tools such as theorem provers and SMT solvers, which have shown their maturity and effectiveness in attacking problems of traditional programming. In order not to lose automation in the methods, we will work within the framework of Abstract Interpretation, which provides guidelines for our theoretical development. At the same time, we will put these foundations into practice through concrete implementations that use those tools.
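As a small illustration of the deductive approach described (not the project's actual tooling), the following sketch uses the Z3 SMT solver's Python bindings to discharge a toy quantitative invariant of a probabilistic assignment.

from z3 import Real, Solver, unsat   # pip install z3-solver

x, p = Real("x"), Real("p")
# E[x'] after the probabilistic update "x := x + 1 with probability p, else x":
expected = p * (x + 1) + (1 - p) * x

s = Solver()
s.add(0 <= p, p <= 1)
s.add(expected < x)                  # negation of the invariant E[x'] >= x
if s.check() == unsat:
    print("invariant E[x'] >= x holds for every p in [0, 1]")
else:
    print("counterexample:", s.model())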
Abstract:
Identification and characterization of the problem: The problem guiding this project seeks to answer questions such as: In what way do the types of activities that are designed become devices that enable students' understanding of the topics of each course? From this question, the following arise: When solving the activities, which cognitive strategies do students bring into play? And which of these favor knowledge-construction processes? Hypotheses: - Courses whose activities are designed under the Problem-Based Learning and Case Study methodology foster meaningful learning by students. - Activities designed under the Problem-Based Learning and Case Study methodology require more complex cognitive processes than those implemented in traditional activities. Objective: - To identify the impact that traditional learning activities, and those designed under the Problem-Based Learning and Case Study methodology, have on student learning. Materials and Methods: a) Analysis of the learning activities of the first and second year of the Law program in its distance-learning modality. b) Interviews with both content-area teachers and tutors. c) Surveys of and interviews with students. Expected results: We aim to confirm that learning activities designed under the Problem-Based Learning and Case Study methodology promote meaningful learning in students. Importance and relevance of the project: The relevance of this project can be identified through two major interrelated variables: one related to the didactic device (strategies implemented by the students) and one related to the institution (the innovative character of the teaching proposal and the possibility of extending it to other courses). This project intends to implement improvements in the design of learning activities in order to promote in students the generation of ideas and responsible solutions and the development of their analytical and reflective capacity.