853 results for automated correlation optimized warping
Abstract:
OBJECTIVE: To make individual assessments using an automated quantification methodology in order to screen for perfusion abnormalities in cerebral SPECT examinations among a sample of subjects with OCD. METHODS: Statistical parametric mapping (SPM) was used to compare 26 brain SPECT images from patients with OCD individually against an image bank of 32 normal subjects, using a statistical threshold of p < 0.05 (corrected for multiple comparisons at the level of individual voxels or clusters). The maps were analyzed, and regions containing voxels that remained above this threshold were sought. RESULTS: Six of the 26 OCD images showed abnormalities at the cluster or voxel level according to the criteria described above (23.07%). However, seven of the 32 images from the normal group were also flagged as showing perfusion abnormalities (21.8% of that sample). CONCLUSION: The automated quantification method was not considered a useful tool for clinical practice as a complement to visual inspection.
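For readers unfamiliar with this kind of individual-versus-group quantification, the sketch below conveys the idea in simplified form: a single volume is compared voxel by voxel against a normative bank using z-scores. It is not SPM itself (which fits a general linear model and corrects for multiple comparisons at the voxel/cluster level); the data and the threshold of 3.0 are purely illustrative.

```python
# Illustrative sketch (not SPM): screening one SPECT volume against a normative
# bank via voxel-wise z-scores. Toy data and an arbitrary threshold are used
# only to convey the individual-vs-group screening idea.
import numpy as np

rng = np.random.default_rng(0)
normal_bank = rng.normal(100, 10, size=(32, 64, 64, 64))  # 32 normal volumes (synthetic)
patient = rng.normal(100, 10, size=(64, 64, 64))          # one patient volume (synthetic)

mean = normal_bank.mean(axis=0)
std = normal_bank.std(axis=0, ddof=1)
z_map = (patient - mean) / std                            # voxel-wise deviation from the norm

abnormal_voxels = np.abs(z_map) > 3.0                     # flag strongly deviating voxels
print(f"Suspicious voxels: {abnormal_voxels.sum()}")
```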
Abstract:
INTRODUCTION: Accurate preoperative staging of rectal cancer is crucial to the correct management of the disease. Despite great controversy around this issue, pelvic magnetic resonance imaging (MRI) is considered the standard imaging modality. This work aimed to evaluate the accuracy of MRI in preoperative rectal cancer staging by comparison with the anatomopathological results. METHODS: We calculated sensitivity, specificity, and positive (PPV) and negative (NPV) predictive values for each T and N stage. We evaluated the concordance between the two staging methods using Cohen's weighted kappa (Kw), and we assessed the accuracy of MRI in rectal cancer staging through ROC curves. RESULTS: 41 patients met the inclusion criteria. We obtained an overall accuracy of 43.9% for T staging and 61% for N staging. The respective sensitivity, specificity, positive and negative predictive values were 33.3%, 94.7%, 33.3% and 94.7% for T1; 62.5%, 32%, 37.0% and 57.1% for T2; 31.8%, 79%, 63.6% and 50% for T3; and 27.8%, 87%, 62.5% and 60.6% for N. Concordance between MRI staging and the anatomopathological results was poor for both T and N. The ROC curves indicated that MRI performed poorly in rectal cancer staging. CONCLUSION: MRI has moderate accuracy in rectal cancer staging, and the major difficulty lies in differentiating T2 from T3.
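As a side note, the per-stage metrics and the weighted kappa reported above can be computed as in the following minimal sketch; the staging labels are invented for illustration and are not the study's data.

```python
# Minimal sketch of per-stage accuracy analysis: sensitivity, specificity,
# PPV, NPV (one-vs-rest) plus Cohen's weighted kappa. Hypothetical labels only.
import numpy as np
from sklearn.metrics import cohen_kappa_score

mri_T = np.array([2, 3, 3, 1, 2, 3, 2, 4, 3, 2])    # MRI T stage (toy data)
path_T = np.array([2, 3, 2, 1, 3, 3, 2, 3, 3, 2])   # pathological T stage (toy data)

def stage_metrics(pred, truth, stage):
    """Sensitivity, specificity, PPV and NPV for a single stage."""
    tp = np.sum((pred == stage) & (truth == stage))
    tn = np.sum((pred != stage) & (truth != stage))
    fp = np.sum((pred == stage) & (truth != stage))
    fn = np.sum((pred != stage) & (truth == stage))
    return dict(sensitivity=tp / (tp + fn), specificity=tn / (tn + fp),
                ppv=tp / (tp + fp), npv=tn / (tn + fn))

print(stage_metrics(mri_T, path_T, stage=3))
print("weighted kappa:", cohen_kappa_score(mri_T, path_T, weights="linear"))
```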
Abstract:
Co-cultures of two or more cell types and biodegradable biomaterials of natural origin have been successfully combined to recreate tissue microenvironments. Segregated co-cultures are preferred over conventional mixed ones in order to better control the degree of homotypic and heterotypic interactions. Hydrogel-based systems in particular have gained much attention for mimicking tissue-specific microenvironments, and they can be microengineered by innovative bottom-up approaches such as microfluidics. In this study, we developed bi-compartmentalized (Janus) hydrogel microcapsules of methacrylated hyaluronic acid (MeHA)/methacrylated chitosan (MeCht) blended with marine-origin collagen by droplet-based microfluidic co-flow. Human adipose stem cells (hASCs) and human microvascular endothelial cells (hMVECs) were co-encapsulated to create study platforms relevant for vascularized bone tissue engineering. A specially designed Janus-droplet generator chip was used to fabricate the microcapsules (<250 μm units), and Janus-gradient co-cultures of hASCs:hMVECs were generated in various ratios (90:10; 75:25; 50:50; 25:75; 10:90) through an automated microfluidic flow controller (Elveflow microfluidics system). Such monodisperse 3D co-culture systems were optimized regarding cell number and culture media specific for the concomitant maintenance of both phenotypes, in order to establish effective cell-cell (homotypic and heterotypic) and cell-material interactions. Cellular parameters such as viability, matrix deposition, mineralization, and hMVEC re-organization into tube-like structures were enhanced by blending MeHA/MeCht with marine-origin collagen, and the increasing hASCs:hMVECs co-culture gradient had a significant impact on these parameters. Such Janus hybrid hydrogel microcapsules can be used as a platform to investigate biomaterial interactions with distinct combined cell populations.
Abstract:
Doctoral Thesis in Molecular and Environmental Biology - Specialization in Cell Biology and Health
Abstract:
Executive functioning (EF), which is considered to govern complex cognition, and verbal memory (VM) are constructs assumed to be related. However, the magnitude of the association between EF and VM is not known, nor is it clear how sociodemographic and psychological factors may affect this relationship, including in normal aging. In this study, we assessed different EF and VM parameters via a battery of neurocognitive/psychological tests and performed a Canonical Correlation Analysis (CCA) to explore the connection between these constructs in a sample of middle-aged and older healthy individuals without cognitive impairment (N = 563, 50+ years of age). The analysis revealed a positive and moderate association between EF and VM, independently of gender, age, education, global cognitive performance level, and mood. These results confirm that EF presents a significant association with VM performance.
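A minimal sketch of how such a canonical correlation between blocks of EF and VM scores can be computed is given below, assuming synthetic data with a shared latent factor; the study's actual test battery and covariate handling are not reproduced.

```python
# Sketch of a Canonical Correlation Analysis between executive-function (EF)
# and verbal-memory (VM) score blocks, on synthetic data only.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(1)
n = 563
latent = rng.normal(size=(n, 1))                        # shared ability driving both domains
ef_scores = latent @ rng.normal(size=(1, 4)) + rng.normal(scale=0.8, size=(n, 4))  # 4 EF measures
vm_scores = latent @ rng.normal(size=(1, 3)) + rng.normal(scale=0.8, size=(n, 3))  # 3 VM measures

cca = CCA(n_components=1)
ef_c, vm_c = cca.fit_transform(ef_scores, vm_scores)
r = np.corrcoef(ef_c[:, 0], vm_c[:, 0])[0, 1]           # first canonical correlation
print(f"First canonical correlation: {r:.2f}")
```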
Abstract:
OBJECTIVE - To evaluate the cardiac abnormalities and their evolution during the course of the acquired immunodeficiency syndrome, as well as to correlate clinical and pathological data. METHODS - Twenty-one patients admitted to the hospital with the diagnosis of acquired immunodeficiency syndrome were prospectively studied and followed until their death. Ages ranged from 19 to 42 years (17 males). An ECG and an echocardiogram were obtained every six months. After death, macro- and microscopic examinations were performed. RESULTS - The most frequent causes of referral to the hospital were diarrhea or repeated pneumonias, tuberculosis, toxoplasmosis, and Kaposi sarcoma. The most frequent findings were acute or chronic pericarditis (42%) and dilated cardiomyopathy (19%). Four patients died of cardiac problems: infective endocarditis, pericarditis with pericardial effusion, bacterial myocarditis, and infection by Toxoplasma gondii. CONCLUSION - Severe cardiac abnormalities were the cause of death in some patients. In the majority of the patients, a good correlation existed between clinical and anatomical-pathological data. Cardiac evaluation was important to detect early manifestations and treat them accordingly, even in asymptomatic patients.
Abstract:
This paper presents an automated optimization framework able to provide network administrators with resilient routing configurations for link-state protocols such as OSPF or IS-IS. To deal with the formulated NP-hard optimization problems, the devised framework is underpinned by computational intelligence optimization engines, such as Multi-objective Evolutionary Algorithms (MOEAs). To demonstrate the framework's capabilities, two illustrative Traffic Engineering methods are described, allowing routing configurations to be attained that are robust to changes in traffic demands and that keep the network stable even in the presence of link failure events. The presented illustrative results clearly corroborate the usefulness of the proposed automated framework along with the devised optimization methods.
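The core idea, evolving link weights so that shortest-path routing yields low congestion, can be sketched roughly as follows. This is a deliberately simplified single-objective illustration with a toy topology and demand matrix; the paper's framework relies on genuine multi-objective evolutionary algorithms and richer resilience objectives (demand changes, link failures).

```python
# Simplified sketch of weight-setting for link-state routing: a candidate weight
# vector is scored by routing demands over shortest paths and taking the worst
# link utilization, and a basic evolutionary loop searches for better weights.
import random
import networkx as nx

links = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d"), ("b", "d")]  # toy topology
capacity = 10.0
demands = {("a", "d"): 6.0, ("b", "d"): 4.0, ("a", "c"): 3.0}          # toy demand matrix

def max_utilization(weights):
    g = nx.DiGraph()
    for (u, v), w in zip(links, weights):
        g.add_edge(u, v, weight=w)
        g.add_edge(v, u, weight=w)
    load = {e: 0.0 for e in g.edges}
    for (src, dst), vol in demands.items():
        path = nx.shortest_path(g, src, dst, weight="weight")
        for u, v in zip(path, path[1:]):
            load[(u, v)] += vol
    return max(load.values()) / capacity

def evolve(pop_size=20, generations=50):
    pop = [[random.randint(1, 20) for _ in links] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=max_utilization)                 # keep the fittest half
        parents = pop[: pop_size // 2]
        children = [[w if random.random() > 0.2 else random.randint(1, 20) for w in p]
                    for p in parents]                 # mutate to create offspring
        pop = parents + children
    return pop[0], max_utilization(pop[0])

best, util = evolve()
print("best weights:", best, "max link utilization:", round(util, 2))
```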
Abstract:
OBJECTIVE: To evaluate the influence of systolic dysfunction, diastolic dysfunction, or both on congestive heart failure functional class. METHODS: Thirty-six consecutive patients with a clinical diagnosis of congestive heart failure and sinus rhythm, seen between September and November of 1998, answered an adapted questionnaire about tolerance to physical activity for the determination of NYHA functional class. The patients were studied with transthoracic Doppler echocardiography. Two groups were compared: group 1 (19 patients in functional classes I and II) and group 2 (17 patients in functional classes III and IV). RESULTS: The average ejection fraction was significantly higher in group 1 (44.84%±8.04% vs. 32.59%±11.48%, p=0.0007). The mean ratio of early to late (atrial) maximum diastolic filling velocities (E/A) of the left ventricle was significantly smaller in group 1 (1.07±0.72 vs. 1.98±1.49, p=0.03). The average maximum systolic pulmonary venous velocity (S) was significantly higher in group 1 (53.53cm/s ± 12.02cm/s vs. 43.41cm/s ± 13.55cm/s, p=0.02). The mean ratio of maximum systolic/diastolic pulmonary venous velocities was significantly higher in group 1 (1.52±0.48 vs. 1.08±0.48, p=0.01). Pseudo-normal and restrictive diastolic patterns predominated in group 2 (58.83% in group 2 vs. 21.06% in group 1, p=0.03). CONCLUSION: Both the systolic dysfunction index and the patterns of diastolic dysfunction evaluated by Doppler echocardiography worsened with the evolution of congestive heart failure.
Abstract:
Fluorescence in situ hybridization (FISH) is based on the use of fluorescent staining dyes; however, the signal intensity of the images obtained by microscopy is seldom quantified accurately by the researcher. The development of innovative digital image processing programs and tools has attempted to overcome this problem; nevertheless, the determination of fluorescence intensity in microscopy images still suffers from a lack of precision in the results and from the complexity of the existing software. This work presents FISHji, a set of new ImageJ methods for automated quantification of fluorescence in images obtained by epifluorescence microscopy. To validate the methods, results obtained by FISHji were compared with results obtained by flow cytometry. The mean correlation between FISHji and flow cytometry was high and significant, showing that the imaging methods are able to accurately assess the signal intensity of fluorescence images. FISHji is available for non-commercial use at http://paginas.fe.up.pt/nazevedo/.
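FISHji itself consists of ImageJ methods; purely to illustrate the underlying quantification idea (threshold, segment, measure intensity per object), here is a minimal, hypothetical sketch using scikit-image on a synthetic image.

```python
# Illustrative automated fluorescence quantification: smooth, threshold with
# Otsu's method, label connected objects, and report mean intensity per object.
import numpy as np
from skimage.filters import threshold_otsu, gaussian
from skimage.measure import label, regionprops

rng = np.random.default_rng(2)
img = rng.normal(10, 2, size=(256, 256))   # synthetic background noise
img[60:90, 60:90] += 50                    # two bright synthetic "cells"
img[150:180, 170:200] += 80
img = gaussian(img, sigma=2)               # mild smoothing

mask = img > threshold_otsu(img)           # automatic threshold
labels = label(mask)                       # connected components
for region in regionprops(labels, intensity_image=img):
    print(f"object {region.label}: mean intensity = {region.intensity_mean:.1f}")
```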
Abstract:
After the incorporation of automated external defibrillators by other airlines, and with the support of the Brazilian Society of Cardiology, Varig Airlines began its onboard defibrillation program, with the initial purpose of equipping the wide-body aircraft frequently used on international flights and the airplanes used on the Rio - São Paulo route. With all flight attendants trained, automated external defibrillation devices were incorporated into 34 airplanes of a total fleet of 80 aircraft. The devices were installed in the baggage compartments, secured with velcro straps, together with 2 pairs of electrodes, one of which was pre-connected to the device to minimize application time. Later, a portable monitor was added to the resuscitation kit for long flights. The expansion of knowledge of the fundamentals of basic life support and the correct implementation of the chain of survival and of automated external defibrillators will increase the extent of recovery of cardiorespiratory arrest victims in aircraft.
Abstract:
OBJECTIVE - The aim of our study was to assess the performance of a wrist monitor, the Omron Model HEM-608, compared with the indirect method of blood pressure measurement. METHODS - Our study population consisted of 100 subjects, 29 normotensive and 71 hypertensive. Participants had their blood pressure checked 8 times with alternating techniques: 4 measurements by the indirect method and 4 with the Omron wrist monitor. The validation criteria used to test this device were based on internationally recognized protocols. RESULTS - Our data showed that the Omron HEM-608 reached classification B for systolic and A for diastolic blood pressure according to one of these protocols. The mean differences between blood pressure values obtained with each of the methods were -2.3 ± 7.9 mmHg for systolic and 0.97 ± 5.5 mmHg for diastolic blood pressure. Therefore, we considered this type of device approved according to the selected criteria. CONCLUSION - Our study leads us to conclude that this wrist monitor is not only easy to use, but also produces results very similar to those obtained by the standard indirect method.
Abstract:
OBJECTIVE: To assess the Dixtal DX2710 automated oscillometric device for blood pressure measurement according to the protocols of the BHS and the AAMI. METHODS: Three blood pressure measurements were taken in 94 patients (53 females; ages 15 to 80 years). The measurements were taken randomly by 2 observers trained to measure blood pressure with a mercury column device connected to the automated device. The device was classified according to the protocols of the BHS and the AAMI. RESULTS: The mean of the blood pressure levels obtained by the observers was 148±38/93±25 mmHg, and that obtained with the device was 148±37/89±26 mmHg. Considering the differences between the measurements obtained by the observers and those obtained with the automated device, the following classification was adopted according to the BHS criteria: "A" for systolic pressure (69% of the differences < 5; 90% < 10; and 97% < 15 mmHg) and "B" for diastolic pressure (63% of the differences < 5; 83% < 10; and 93% < 15 mmHg). The mean and standard deviation of the differences were 0±6.27 mmHg for systolic pressure and 3.82±6.21 mmHg for diastolic pressure. CONCLUSION: The Dixtal DX2710 device was approved according to the international recommendations.
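For illustration, a BHS-style grade can be derived from paired observer/device readings as sketched below; the readings are synthetic and the grade bands are those commonly cited for the BHS protocol, not taken from this study.

```python
# Sketch of BHS-style grading: the percentages of absolute observer-device
# differences within 5, 10 and 15 mmHg decide the grade (band limits as
# commonly cited for the BHS protocol; data below are synthetic).
import numpy as np

observer = np.array([142, 130, 155, 160, 120, 148, 138, 150])  # mercury column (toy data)
device = np.array([140, 133, 150, 166, 118, 147, 141, 149])    # automated device (toy data)

diff = np.abs(observer - device)
pct = {t: 100 * np.mean(diff <= t) for t in (5, 10, 15)}

def bhs_grade(pct):
    bands = {"A": (60, 85, 95), "B": (50, 75, 90), "C": (40, 65, 85)}
    for grade, (p5, p10, p15) in bands.items():
        if pct[5] >= p5 and pct[10] >= p10 and pct[15] >= p15:
            return grade
    return "D"

print(pct, "grade:", bhs_grade(pct))
```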
Abstract:
OBJECTIVE: To evaluate the performance of the turbidimetric method of C-reactive protein (CRP) as a measure of low-grade inflammation in patients admitted with non-ST elevation acute coronary syndromes (ACS). METHODS: Serum samples obtained at hospital arrival from 68 patients (66±11 years, 40 men), admitted with unstable angina or non-ST elevation acute myocardial infarction were used to measure CRP by the methods of nephelometry and turbidimetry. RESULTS: The medians of C-reactive protein by the turbidimetric and nephelometric methods were 0.5 mg/dL and 0.47 mg/dL, respectively. A strong linear association existed between the 2 methods, according to the regression coefficient (b=0.75; 95% C.I.=0.70-0.80) and correlation coefficient (r=0.96; P<0.001). The mean difference between the nephelometric and turbidimetric CRP was 0.02 ± 0.91 mg/dL, and 100% agreement between the methods in the detection of high CRP was observed. CONCLUSION: In patients with non-ST elevation ACS, CRP values obtained by turbidimetry show a strong linear association with the method of nephelometry and perfect agreement in the detection of high CRP.
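The agreement analysis described above (regression slope, correlation coefficient, and mean difference between methods) can be reproduced in outline as follows, using synthetic CRP values rather than the study's data.

```python
# Sketch of a method-comparison analysis: linear regression, Pearson correlation
# and mean difference (Bland-Altman style bias), on synthetic CRP values.
import numpy as np
from scipy.stats import linregress, pearsonr

rng = np.random.default_rng(3)
nephelometry = rng.lognormal(mean=-0.5, sigma=0.8, size=68)           # CRP, mg/dL (toy data)
turbidimetry = 0.75 * nephelometry + rng.normal(scale=0.05, size=68)  # correlated second method

fit = linregress(nephelometry, turbidimetry)
r, p = pearsonr(nephelometry, turbidimetry)
mean_diff = np.mean(nephelometry - turbidimetry)                      # bias between methods

print(f"slope b = {fit.slope:.2f}, r = {r:.2f} (p = {p:.1e}), mean difference = {mean_diff:.2f} mg/dL")
```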
Abstract:
OBJECTIVE: To verify the association of serum markers of myocardial injury, such as troponin I, creatine kinase, and creatine kinase isoenzyme MB, and inflammatory markers, such as tumor necrosis factor alpha (TNF-alpha), C-reactive protein, and the erythrocyte sedimentation rate, measured in the perioperative period of cardiac surgery, with the possible occurrence of postpericardiotomy syndrome. METHODS: This was a cohort study of 96 patients undergoing cardiac surgery, assessed at 4 different time points: the day before surgery (D0); the 3rd postoperative day (D3); between the 7th and 10th postoperative days (D7-10); and the 30th postoperative day (D30). At each time point, we evaluated demographic variables (sex and age), surgical variables (type and duration of surgery, extracorporeal circulation), and serum levels of the markers of myocardial injury and inflammatory response. RESULTS: Of all patients, 12 (12.5%) met the clinical criteria for a diagnosis of postpericardiotomy syndrome, and their mean age was 10.3 years lower than that of the others (P=0.02). The serum markers of tissue injury and inflammatory response did not differ significantly between the 2 groups. No significant difference existed regarding either surgery duration or extracorporeal circulation. CONCLUSION: The patients who met the clinical criteria for postpericardiotomy syndrome were significantly younger than the others. Serum markers of tissue injury and inflammatory response did not differ in the clinically affected group and did not correlate with the different types and durations of surgery or with extracorporeal circulation.
Abstract:
Identification and characterization of the problem. One of the most important problems associated with software construction is its correctness. In the search for guarantees of correct software behavior, a variety of development techniques with solid mathematical and logical foundations, known as formal methods, have emerged. Because of their nature, the application of formal methods requires considerable experience and knowledge, especially of mathematics and logic, which makes their application costly in practice. As a consequence, their main application has been limited to critical systems, that is, systems whose malfunction can cause severe damage, even though the benefits these techniques provide are relevant to all kinds of software. Being able to transfer the benefits of formal methods to software development contexts broader than critical systems would have a high impact on productivity in those contexts. Hypothesis. Having automated analysis tools is an element of great importance. Examples of this are several powerful analysis tools based on formal methods whose application targets source code directly. In the vast majority of these tools, the gap between the notions developers are accustomed to and those required to apply these formal analysis tools remains too wide. Many tools use assertion languages that fall outside developers' usual knowledge and habits. Moreover, in many cases the output provided by the analysis tool requires some command of the underlying formal method. This problem can be alleviated by producing suitable tools. Another problem intrinsic to automated analysis techniques is how they behave as the size and complexity of the elements under analysis grow (scalability). This limitation is widely known and is considered critical for the applicability of formal analysis methods in practice. One way to attack this problem is to exploit information and characteristics of specific application domains. Objectives. This project aims to build formal analysis tools that contribute to the quality, in terms of functional correctness, of specifications, models, or code in the context of software development. More precisely, we seek, on the one hand, to identify specific settings in which certain automated analysis techniques, such as analysis based on SMT or SAT solving, or model checking, can be taken to levels of scalability beyond those known for these techniques in general settings. We will also attempt to implement the adaptations of the chosen techniques in tools that allow their use by developers familiar with the application context but not necessarily knowledgeable about the underlying methods or techniques. Materials and methods. The materials to be used will be bibliography relevant to the area and computing equipment. Methods: the methods of discrete mathematics, logic, and software engineering will be employed. Expected results. One of the expected results of the project is the identification of specific domains for the application of formal analysis methods.
It is also expected that the development of the project will yield analysis tools whose level of usability is adequate for their application by developers without specific training in the formal methods used. Importance of the project. The main impact of this project will be its contribution to the practical application of formal analysis techniques at different stages of software development, with the aim of increasing software quality and reliability. A crucial factor for software quality is correctness. Traditionally, formal approaches to software development concentrate on functional correctness and tackle this problem essentially by being based on well-defined notations founded on solid mathematical grounds. This makes formal methods better suited for analysis, due to their precise semantics, but they are usually more complex and require familiarity and experience with the manipulation of mathematical definitions. Consequently, their acceptance by software engineers is rather restricted, and formal methods applications have been confined to critical systems. Nevertheless, it is obvious that the advantages that formal methods provide apply to any kind of software system. It is accepted that appropriate software tool support for formal analysis is essential if one seeks to support software development based on formal methods. Indeed, some of the relatively recent successes of formal methods are accompanied by good-quality tools that automate powerful analysis mechanisms and are even integrated in widely used development environments. Still, most of these tools concentrate on code analysis, and in many cases they are still far from being simple enough to be employed by software engineers without experience in formal methods. Another important problem for the adoption of tool support for formal methods is scalability. Automated software analysis is intrinsically complex, and thus techniques do not scale well in the general case. In this project, we will attempt to identify particular modelling, design, specification, or coding activities in software development processes where automated formal analysis techniques can be applied. By focusing on very specific application domains, we expect to find characteristics that might be exploited to increase the scalability of the corresponding analyses, compared to the general case.
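To make the mention of SMT-based analysis concrete, the following small example uses the Z3 solver (z3-solver Python package) to check a trivial program property by searching for a counterexample; it is only an illustration of the technique, not part of the project's tooling.

```python
# Illustration of push-button SMT analysis: Z3 checks whether an "absolute
# value" expression can ever be negative by searching for a counterexample.
from z3 import Int, If, Solver, sat

x = Int("x")
abs_x = If(x >= 0, x, -x)          # the program fragment under analysis

s = Solver()
s.add(abs_x < 0)                   # negate the property "abs(x) >= 0"
if s.check() == sat:
    print("property violated, counterexample:", s.model())
else:
    print("property holds for all integers x")
```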