29 results for software analysis

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance:

40.00%

Publisher:

Abstract:

Background: The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data are available, biologists face the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single one. The integration of heterogeneous tools and data sources to create an integrated analysis environment is a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing data to be exchanged in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results: We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. We have then defined a number of activities and associated guidelines that prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data.
Conclusions: The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. It can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus ensuring accurate data exchange and correct interpretation of the exchanged information.
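
The connector-with-transformation-rules idea described above can be sketched in a few lines. Everything here (field names, rules, the canonical vocabulary) is a hypothetical illustration of the pattern, not the paper's actual ontology or implementation:

```python
# A software connector mediating between a gene expression data source and an
# analysis tool: each transformation rule maps a source-specific field/value
# onto a shared, canonical representation so both sides interpret it identically.
# All names below are made up for illustration.
RULES = {
    "expr_level": lambda v: float(v),             # normalize numeric type
    "gene": lambda v: v.upper(),                  # canonical gene symbol
    "platform": lambda v: {"affy": "DNA microarray",
                           "rnaseq": "RNA-Seq"}.get(v, "unknown"),
}

def connect(record: dict) -> dict:
    """Apply the transformation rules to one record exchanged via the connector."""
    return {field: RULES.get(field, lambda v: v)(value)
            for field, value in record.items()}

out = connect({"gene": "tp53", "expr_level": "7.2", "platform": "rnaseq"})
# out == {"gene": "TP53", "expr_level": 7.2, "platform": "RNA-Seq"}
```

In the methodology's terms, the rule table is where the reference ontology would enter: each rule encodes an agreed-upon meaning for a shared data element.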

Relevance:

40.00%

Publisher:

Abstract:

Photodynamic therapy (PDT) is a treatment modality that has advanced rapidly in recent years. It causes tissue and vascular damage through the interaction of a photosensitizing agent (PS), light of a proper wavelength, and molecular oxygen. Evaluation of vessel damage usually relies on histopathology, and results are often qualitative or at best semi-quantitative, based on a subjective scoring system. The aim of this study was to evaluate, using CD31 immunohistochemistry and image analysis software, the vascular damage after PDT in a well-established rodent model of chemically induced mammary tumor. Fourteen Sprague-Dawley rats received a single dose of 7,12-dimethylbenz(a)anthracene (80 mg/kg by gavage). Treatment efficacy was evaluated by comparing the vascular density of tumors after treatment with Photogem® as a PS, administered intraperitoneally and followed by interstitial fiber-optic lighting from a diode laser at 200 mW/cm and a light dose of 100 J/cm directed against the tumor (7 animals), with a control group (6 animals, no PDT). The animals were euthanized 30 hours after the lighting; the mammary tumors were removed and samples from each lesion were formalin-fixed. Immunostained blood vessels were quantified with Image Pro-Plus version 7.0. The control group had an average of 3368.6 ± 4027.1 pixels per picture and the treated group an average of 779 ± 1242.6 pixels per picture (P < 0.01), indicating that PDT caused a significant decrease in the vascular density of mammary tumors. The combination of CD31 immunohistochemistry, selection of representative areas by a trained pathologist, and quantification of staining with the Image Pro-Plus version 7.0 system was a practical and robust methodology for vessel damage evaluation, and could probably be used to assess other antiangiogenic treatments.
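
The pixel-based quantification step can be illustrated with a minimal sketch. The threshold and the tiny example "images" are made up; the study itself used Image Pro-Plus on real micrographs:

```python
# Count "stained" pixels (intensity above a threshold) per image and compare
# group means — the same counting idea the study applies to CD31 staining.
# Threshold value 128 and the images below are illustrative only.

def stained_pixels(image, threshold=128):
    """Count pixels whose intensity exceeds the staining threshold."""
    return sum(1 for row in image for px in row if px > threshold)

control = [[[200, 90], [210, 180]], [[150, 140], [30, 20]]]   # 2 tiny images
treated = [[[10, 20], [130, 5]], [[0, 0], [0, 140]]]          # 2 tiny images

control_counts = [stained_pixels(img) for img in control]     # [3, 2]
treated_counts = [stained_pixels(img) for img in treated]     # [1, 1]
control_mean = sum(control_counts) / len(control_counts)      # 2.5
treated_mean = sum(treated_counts) / len(treated_counts)      # 1.0
```

A lower mean count in the treated group is then the quantitative signature of vascular damage, as in the reported 3368.6 vs. 779 pixels.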

Relevance:

30.00%

Publisher:

Abstract:

Objective: To analyze drug prescriptions for insulin and oral antidiabetic drugs in type 1 and type 2 diabetes mellitus patients seen in the Brazilian Public Healthcare System (Unified Health System - SUS) in Ribeirao Preto, SP, Brazil. Subjects and methods: All the patients with diabetes seen in the SUS in the western district of Ribeirao Preto, SP, Brazil between March/2006 and February/2007 were included in the study. Results: A total of 3,982 patients were identified. Mean age of the patients was 60.6 years, and 61.0% were females. Sixty percent of the patients were treated with monotherapy. Doses of oral antidiabetic drugs were lower in monotherapy than in polytherapy. Ten patients received doses of glibenclamide or metformin above the recommended maximum doses, and in elderly patients there was no reduction in drug doses. Conclusion: Monotherapy with oral antidiabetic drugs was the predominant procedure, and the doses were not individualized according to age. Arq Bras Endocrinol Metab. 2012;56(2):120-7

Relevance:

30.00%

Publisher:

Abstract:

Purpose: To evaluate endothelial cell sample size and statistical error in corneal specular microscopy (CSM) examinations. Methods: One hundred twenty examinations were conducted with 4 types of corneal specular microscopes: 30 with each of the Bio-Optics, CSO, Konan, and Topcon instruments. All endothelial image data were analyzed by the respective instrument software and also by the Cells Analyzer software, with a method developed in our lab (US patent). A reliability degree (RD) of 95% and a relative error (RE) of 0.05 were used as cut-off values to analyze the images of the counted endothelial cells, called samples. The sample size mean was the number of cells evaluated on the images obtained with each device. Only examinations with RE < 0.05 were considered statistically correct and suitable for comparison with future examinations. The Cells Analyzer software was used to calculate the RE and a customized sample size for all examinations. Results: Bio-Optics: sample size, 97 ± 22 cells; RE, 6.52 ± 0.86; only 10% of the examinations had a sufficient endothelial cell quantity (RE < 0.05); customized sample size, 162 ± 34 cells. CSO: sample size, 110 ± 20 cells; RE, 5.98 ± 0.98; only 16.6% of the examinations had a sufficient endothelial cell quantity (RE < 0.05); customized sample size, 157 ± 45 cells. Konan: sample size, 80 ± 27 cells; RE, 10.6 ± 3.67; none of the examinations had a sufficient endothelial cell quantity (all RE > 0.05); customized sample size, 336 ± 131 cells. Topcon: sample size, 87 ± 17 cells; RE, 10.1 ± 2.52; none of the examinations had a sufficient endothelial cell quantity (all RE > 0.05); customized sample size, 382 ± 159 cells. Conclusions: A very high number of CSM examinations had sampling errors according to the Cells Analyzer software. The endothelial samples (examinations) need to include more cells to be reliable and reproducible. The Cells Analyzer tutorial routine will be useful for CSM examination reliability and reproducibility.
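
The Cells Analyzer method itself is patented and not described in the abstract, but the underlying sampling statistics can be sketched with the generic formula for the relative error of a sample mean. Treating RE = z·CV/√n is an assumption on our part, not the software's published algorithm, and the coefficient of variation used below is illustrative:

```python
import math

def relative_error(cv, n, z=1.96):
    """Relative error of the sample mean at ~95% confidence (generic formula)."""
    return z * cv / math.sqrt(n)

def required_n(cv, re_target=0.05, z=1.96):
    """Smallest sample size whose relative error meets the target."""
    return math.ceil((z * cv / re_target) ** 2)

# E.g. for endothelial cell measurements with a coefficient of variation of 0.25:
n = required_n(0.25)                 # 97 cells needed for RE <= 0.05
re_80 = relative_error(0.25, 80)     # ~0.055: an 80-cell sample falls short
```

This matches the study's qualitative finding: counts of 80–110 cells per examination can leave the relative error above the 0.05 cut-off, so a larger, customized sample size is required.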

Relevance:

30.00%

Publisher:

Abstract:

Objective: To observe the behavior of the plotted vectors on the RXc graph (R, resistance, and Xc, reactance, both corrected for body height/length) through bioelectrical impedance vector analysis (BIVA) and phase angle (PA) values in stable premature infants, under the hypothesis that preterm infants present vector behavior on BIVA suggestive of less total body water and soft tissue, compared with reference data for term infants. Methods: Cross-sectional study including preterm neonates of both genders, in-patients admitted to an intermediate care unit at a tertiary care hospital. Data on delivery, diet and bioelectrical impedance (800 μA, 50 kHz) were collected. The graphs and vector analysis were produced with the BIVA software. Results: A total of 108 preterm infants were studied, separated according to age (< 7 days and ≥ 7 days). Most of the premature babies were outside the normal range (above the 95% tolerance intervals) reported in the literature for term newborn infants, and there was a tendency for the points to disperse toward the upper right quadrant of the RXc plane. The PA was 4.92° (± 2.18) for newborns < 7 days and 4.34° (± 2.37) for newborns ≥ 7 days. Conclusion: Premature infants behave similarly in terms of BIVA, and most of them have less absolute body water, presenting less fat-free mass and fat mass in absolute values, compared with term newborn infants.
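
The two quantities behind BIVA can be computed directly from a raw impedance measurement: the phase angle is arctan(Xc/R), and the plotted vector is (R/height, Xc/height). The R, Xc and height values below are illustrative, chosen so the phase angle lands near the reported ~4.9°:

```python
import math

def phase_angle(r_ohm, xc_ohm):
    """Phase angle in degrees from resistance R and reactance Xc: arctan(Xc/R)."""
    return math.degrees(math.atan2(xc_ohm, r_ohm))

def biva_vector(r_ohm, xc_ohm, height_m):
    """BIVA point: R and Xc standardized by body height (ohm/m)."""
    return (r_ohm / height_m, xc_ohm / height_m)

pa = phase_angle(500.0, 43.0)            # ~4.92 degrees
vec = biva_vector(500.0, 43.0, 0.45)     # (R/H, Xc/H) plotted on the RXc plane
```

Standardizing by height is what makes vectors from infants of different sizes comparable against the published tolerance ellipses.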

Relevance:

30.00%

Publisher:

Abstract:

This finite element analysis (FEA) compared the stress distribution on different bony ridges rehabilitated with morse taper implants of different lengths, varying the dimensions of the metal-ceramic crowns to maintain the occlusal alignment. Three-dimensional FE models were designed representing a posterior left-side segment of the mandible: control group, 3 implants of 11 mm length; group 1, implants of 13 mm, 11 mm and 5 mm length; group 2, 1 implant of 11 mm and 2 implants of 5 mm length; and group 3, 3 implants of 5 mm length. The abutment heights were 3.5 mm for the 13- and 11-mm implants (regular) and 0.8 mm for the 5-mm implants (short). The evaluation was performed in the Ansys software, with oblique loads of 365 N for molars and 200 N for premolars. Stress on cortical bone was 50% higher for the short implants than for the regular implants, and stress on trabecular bone was 80% higher. There was a higher stress concentration in the bone region around the neck of the short implants. However, these implants were capable of dissipating the stress to the bone under the applied loads, although reaching near the threshold between elastic and plastic deformation of the trabecular bone. Distal implants and/or those with the largest occlusal table generated the greatest stress regions in the surrounding bone. It was concluded that patients requiring short implants associated with prostheses of increased proportions need careful evaluation and occlusal adjustment, since a possible overload of these short implants, and even of regular ones, can generate stress beyond the physiological threshold of the surrounding bone, compromising the whole system.

Relevance:

30.00%

Publisher:

Abstract:

We evaluated the diagnostic quality of first-trimester ultrasound images transmitted in real time using low-cost telecommunications. A prospective sample of fetal ultrasound images from 11 weeks to 13 weeks and six days of pregnancy was obtained from pregnant women over 18 years old. The examinations were transmitted in real time to three independent examiners, who carried out a qualitative assessment based on parameters established by the Fetal Medicine Foundation. All fetal structures could be viewed, and the quality of the images received by the examiners was considered normal. There were significant differences for crown-rump length and nuchal translucency in the transmitted images, but the loss in definition was acceptable. Thus the quality of images transmitted via the Internet using low-cost software appears suitable for screening for chromosomal abnormalities in the first trimester of pregnancy.

Relevance:

30.00%

Publisher:

Abstract:

The aim of the present study was to determine clinical parameters for the use of the Er,Cr:YSGG laser in the treatment of dentine hypersensitivity. Two antagonist areas were defined as control and experimental (irradiated) areas in 90 premolar roots. Each surface was conditioned with 24% EDTA (sub-group 1) or 35% phosphoric acid (sub-group 2) and irradiated with the following settings: group 1) Er:YAG, 60 mJ, 2 Hz, defocused; groups 2 to 9) irradiation with the Er,Cr:YSGG laser, 20 Hz, Z6 tip, 0% air and water: 2) 0.25 W; 3) 0.5 W; 4) 0.75 W; 5) 1.0 W; 6) 1.25 W; 7) 1.50 W; 8) 2 W; 9) 2 W. After irradiation, samples were immersed in methylene blue solution and embedded in epoxy resin to obtain longitudinal cuts. The images were digitized and analyzed with computer software. Although the samples irradiated with the Er:YAG laser showed less microleakage, sub-group 1 showed differences between the groups, differing statistically from groups 3, 6, and 9. The results of sub-group 2 showed that the mean values of the Er:YAG samples had a negative trend; however, no differences were detected between the groups. For scanning electron microscopy analysis, dentine squares were obtained and prepared to evaluate the surface morphology. Partial closure of dentinal tubules was observed after irradiation with the Er:YAG laser and with the Er,Cr:YSGG laser in the 0.25 and 0.50 W protocols. As the energy densities rose, open dentinal tubules, carbonization and cracks were observed. It can be concluded that none of the parameters was capable of eliminating microleakage; however, clinical studies with Er:YAG and Er,Cr:YSGG lasers should be conducted with the lowest-power protocols in order to determine the most satisfactory setting for dentine hypersensitivity.

Relevance:

30.00%

Publisher:

Abstract:

Objective: To evaluate the neuroprotection afforded by mild hypothermia, applied at different moments, in temporary focal cerebral ischemia in rats. Methods: Rats were divided into Control (C), Sham (S), Ischemic-control (IC), Pre-ischemic Hypothermia (IH1), Intra-ischemic Hypothermia (IH2), and Post-ischemic Hypothermia (IH3) groups. Morphometry was performed using the KS400 software (Carl Zeiss®) on coronal sections stained with Luxol Fast Blue, and ischemic areas and volumes were obtained. Results: Statistically, blue areas showed differences for C vs. IC, IC vs. IH1 and IC vs. IH2 (p=0.0001; p=0.01; p=0.03), and no differences for C vs. S, IC vs. IH3 and IH1 vs. IH2 (p=0.39; p=0.85; p=0.63). Red areas showed differences for C vs. IC, IC vs. IH1 and IC vs. IH2 (p=0.0001; p=0.009; p=0.03), and no differences for C vs. S, IC vs. IH3 and IH1 vs. IH2 (p=0.48; p=0.27; p=0.68). Average ischemic areas and ischemic volumes showed differences for IC vs. IH1 and IC vs. IH2 (p=0.0001 and p=0.0011), and no differences for IC vs. IH3 and IH1 vs. IH2 (p=0.57; p=0.79). Conclusion: Pre-ischemic and intra-ischemic hypothermia were shown to be similarly neuroprotective, but this was not true for post-ischemic hypothermia.
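
How section areas yield a volume estimate can be sketched with the Cavalieri principle (volume ≈ section spacing × sum of measured areas). Whether the KS400 workflow used exactly this estimator is our assumption; the spacing and areas below are illustrative, not the study's data:

```python
# Cavalieri-style volume estimate from serial coronal sections:
# each section contributes (measured ischemic area) x (section spacing).

def cavalieri_volume(areas_mm2, spacing_mm):
    """Estimated volume (mm^3) from per-section areas and uniform spacing."""
    return spacing_mm * sum(areas_mm2)

areas = [2.1, 3.4, 4.0, 3.2, 1.8]       # ischemic area per section, mm^2
vol = cavalieri_volume(areas, 0.5)      # 0.5 mm spacing -> 7.25 mm^3
```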

Relevance:

30.00%

Publisher:

Abstract:

Pulse repetition rate and the number of laser pulses are among the parameters that most affect the analysis of solid materials by laser-induced breakdown spectroscopy, and knowledge of their effects is of fundamental importance for suggesting analytical strategies when dealing with laser ablation of polymers. In this contribution, the influence of these parameters on the ablated mass and on the features of the craters was evaluated in polypropylene and high-density polyethylene plates containing a PbCrO4-based pigment. Surface characterization and crater profiling were carried out by profilometry and scanning electron microscopy. The area, volume and profile of the craters were obtained using the Taylor Map software. The laser-induced breakdown spectroscopy system consisted of a Q-switched Nd:YAG laser (1064 nm, 5 ns) and an Echelle spectrometer equipped with an ICCD detector. The evaluated operating conditions consisted of 10, 25 and 50 laser pulses at 1, 5 and 10 Hz, 250 mJ/pulse (85 J cm⁻²), 2 μs delay time and 6 μs integration time gate. Differences in the topographical features of the craters of the two polymers were observed. The decrease in repetition rate resulted in irregular craters and edge formation, especially in the polypropylene sample. The differences in topographical features and ablated masses were attributed to the influence of the degree of crystallinity, crystalline melting temperature and glass transition temperature on the ablation process of high-density polyethylene and polypropylene. It was also observed that the intensities of the chromium and lead emission signals obtained at 10 Hz were two times higher than those at 5 Hz, keeping the number of laser pulses constant. (C) 2011 Elsevier B.V. All rights reserved.
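
As a consistency check on the stated conditions, the 250 mJ pulse energy and 85 J cm⁻² fluence together imply the laser spot area, and (assuming a circular spot, which the abstract does not state) its diameter:

```python
import math

# Fluence = pulse energy / spot area, so area = energy / fluence.
energy_j = 0.250                 # 250 mJ per pulse
fluence_j_cm2 = 85.0             # stated fluence, J/cm^2

area_cm2 = energy_j / fluence_j_cm2                       # ~2.94e-3 cm^2
diameter_um = 2 * math.sqrt(area_cm2 / math.pi) * 1e4     # ~612 um, if circular
```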

Relevance:

30.00%

Publisher:

Abstract:

To evaluate the efficacy of the ProTaper Universal rotary retreatment system and the influence of sealer type on the presence of filling debris in reinstrumented canals viewed under an operative clinical microscope. Forty-five palatal root canals of first molars were filled with gutta-percha and one of the following sealers: G1, EndoFill; G2, AH Plus; G3, Sealapex. The canals were then reinstrumented with the ProTaper Universal rotary system. The roots were longitudinally sectioned and examined under an operative clinical microscope (10x), and the amount of filling debris on the canal walls was analyzed using the AutoCAD 2004 software. A single operator used a specific software tool to outline the canal area and the filling debris area in each third (cervical, middle, and apical), as well as the total canal area. Data were analyzed by the Kruskal-Wallis test and the Tukey test at P < 0.05. Sealapex showed significant differences in the average filling debris area/canal among the 3 thirds: the apical third showed more debris than both the cervical and the middle third (P < 0.0001). EndoFill left significantly more filling debris than Sealapex in the cervical third (P < 0.05). In the middle (P = 0.12) and apical thirds (P = 0.10), there were no differences among the groups. Debris was left in all canal thirds, regardless of the retreatment technique. The greatest differences between techniques and sealers were found in the cervical third. Microsc. Res. Tech. 75:1233-1236, 2012. (C) 2012 Wiley Periodicals, Inc.
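
The area-based measurement can be sketched as a simple ratio: residual debris per third is the outlined debris area divided by that third's canal area. The areas below are invented illustrations, not the study's measurements:

```python
# Percentage of residual filling debris per canal third, as derived from the
# outlined areas (debris area and canal-third area) in a CAD-style tool.

def debris_percent(debris_area_mm2, third_area_mm2):
    """Residual debris as a percentage of the canal third's area."""
    return 100.0 * debris_area_mm2 / third_area_mm2

# (debris area, third area) in mm^2 — hypothetical example values:
thirds = {"cervical": (0.42, 3.5), "middle": (0.21, 3.0), "apical": (0.33, 2.2)}
pct = {name: debris_percent(d, a) for name, (d, a) in thirds.items()}
# e.g. cervical -> 12%, middle -> 7%, apical -> 15%
```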

Relevance:

30.00%

Publisher:

Abstract:

A deep theoretical analysis of the graph cut image segmentation framework presented in this paper simultaneously translates into important contributions in several directions. The most important practical contribution of this work is a full theoretical description, and implementation, of a novel powerful segmentation algorithm, GC(max). The output of GC(max) coincides with a version of a segmentation algorithm known as Iterative Relative Fuzzy Connectedness, IRFC. However, GC(max) is considerably faster than the classic IRFC algorithm, which we prove theoretically and show experimentally. Specifically, we prove that, in the worst-case scenario, the GC(max) algorithm runs in linear time with respect to the variable M = |C| + |Z|, where |C| is the image scene size and |Z| is the size of the allowable range, Z, of the associated weight/affinity function. For most implementations, Z is identical to the set of allowable image intensity values, and its size can be treated as small with respect to |C|, meaning that O(M) = O(|C|). In such a situation, GC(max) runs in linear time with respect to the image size |C|. We show that the output of GC(max) constitutes a solution of a graph cut energy minimization problem, in which the energy is defined as the ℓ∞ norm ‖F_P‖∞ of the map F_P that associates, with every element e from the boundary of an object P, its weight w(e). This formulation brings IRFC algorithms to the realm of the graph cut energy minimizers, with energy functions ‖F_P‖_q for q ∈ [1, ∞]. Of these, the best-known minimization problem is for the energy ‖F_P‖_1, which is solved by the classic min-cut/max-flow algorithm, often referred to as the Graph Cut algorithm. We notice that a minimization problem for ‖F_P‖_q, q ∈ [1, ∞), is identical to that for ‖F_P‖_1 when the original weight function w is replaced by w^q.
Thus, any algorithm GC(sum) solving the ‖F_P‖_1 minimization problem also solves the one for ‖F_P‖_q with q ∈ [1, ∞), so just two algorithms, GC(sum) and GC(max), are enough to solve all ‖F_P‖_q-minimization problems. We also show that, for any fixed weight assignment, the solutions of the ‖F_P‖_q-minimization problems converge to a solution of the ‖F_P‖∞-minimization problem (the identity ‖F_P‖∞ = lim_{q→∞} ‖F_P‖_q alone is not enough to deduce that). An experimental comparison of the performance of the GC(max) and GC(sum) algorithms is included. It concentrates on comparing the actual (as opposed to provable worst-case) running times of the algorithms, as well as the influence of the choice of seeds on the output.
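
The two energies, and the w → w^q reduction, can be made concrete on a toy cut. This is a sketch of the energy definitions only, not of the GC(max) or GC(sum) algorithms themselves:

```python
# F_P maps each boundary edge e of an object P to its weight w(e).
# GC(sum) minimizes the l1 norm of F_P (classic min-cut energy);
# GC(max) minimizes the l-infinity norm (the IRFC-equivalent energy).

def l1_energy(boundary_weights):
    """||F_P||_1: sum of the weights of the edges crossing the cut."""
    return sum(boundary_weights)

def linf_energy(boundary_weights):
    """||F_P||_inf: maximum weight among the edges crossing the cut."""
    return max(boundary_weights)

w = [3.0, 1.0, 2.0]                        # weights of the cut's boundary edges
e1, einf = l1_energy(w), linf_energy(w)    # 6.0 and 3.0

# The ||.||_q problem over weights w is the ||.||_1 problem over w**q,
# since ||F_P||_q ** q == sum of w(e)**q:
q = 3
assert l1_energy([x ** q for x in w]) == sum(x ** q for x in w)
```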

Relevance:

30.00%

Publisher:

Abstract:

Various types of trill exercises have long been used as tools in the treatment and preparation of the voice. Although they are reported to produce vocal benefits in most subjects, their physiology has not yet been studied in depth. The aim of this study was to compare the mean and standard deviation of the closed quotient in exercises of lip and tongue trills with the sustained vowel /ε/ in opera singers. Ten professional classical (operatic) singers, reportedly in perfect laryngeal health, served as subjects for this study and underwent electroglottography. During the examination, the subjects were instructed to produce the sustained vowel /ε/ and lip and tongue trills at the same pre-established frequency and intensity. The mean values and standard deviation of the closed quotient were obtained using software developed for this purpose. The comparison of the results was intra-subject; maximum intensities were compared only with one another, as were minimum intensities. The means of the closed quotient were statistically significant only at the strong intensities, where the lip trill differed from the tongue trill and from the sustained vowel /ε/. The standard deviation of the closed quotient distinguished the sustained vowel /ε/ from the lip and tongue trills at both intensities. We conclude that the closed quotient oscillates during tongue and lip trill exercises, and that the closed quotient is higher during lip trill exercises, compared with the two other utterances, only at the strong intensities.
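
The closed quotient (CQ) computed from an electroglottographic cycle can be sketched as the fraction of the cycle during which vocal-fold contact exceeds a criterion level. The 25% criterion and the sample cycle are illustrative assumptions, not the custom software's actual parameters:

```python
# Closed quotient from one EGG cycle: the fraction of samples whose contact
# value exceeds a criterion level, here 25% of the cycle's peak-to-peak
# amplitude (an illustrative threshold).

def closed_quotient(cycle, criterion=0.25):
    """Fraction of the glottal cycle classified as 'closed'."""
    lo, hi = min(cycle), max(cycle)
    level = lo + criterion * (hi - lo)
    return sum(1 for s in cycle if s > level) / len(cycle)

cycle = [0.0, 0.1, 0.6, 0.9, 1.0, 0.8, 0.5, 0.2, 0.05, 0.0]
cq = closed_quotient(cycle)   # 0.5 -> vocal folds closed for half the cycle
```

The study's per-utterance mean and standard deviation would then be computed over the CQ values of many consecutive cycles.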

Relevance:

30.00%

Publisher:

Abstract:

Purpose: One of the most common problems in the surgical management of Graves upper eyelid retraction is the occurrence of eyelid contour abnormalities. In the present study, the postoperative contour of a large sample of eyelids of patients with Graves orbitopathy was measured. Methods: The postoperative upper eyelid contour of 62 eyes of 43 patients with Graves orbitopathy was subjectively classified by 3 experienced surgeons into 3 categories: poor, fair, and good. The shape of the eyelid contour in each category was then measured with recently developed custom-made software, by measuring multiple midpupil eyelid distances every 15 degrees along the palpebral fissure. The upper eyelid contour of 60 normal subjects was also quantified as a control group. Results: The mean ratio between the sums of the lateral and medial midpupil eyelid distances (lateral/medial ratio) was 1.10 ± 0.11 standard deviation in controls and 1.15 ± 0.13 standard deviation in patients. Postoperatively, the mean midpupil eyelid distance at 90 degrees was 4.16 ± 1.13 mm standard deviation. The distribution of lateral/medial ratios of the eyelids judged as having good contours was similar to the distribution of the controls, with a modal value centered on the interval between 1.0 and 1.10. The distribution of lateral/medial ratios of the eyelids judged as having poor contour was bimodal, with eyelids with low and high lateral/medial ratios. Low lateral/medial ratios occurred when there was a lateral overcorrection, giving the eyelid a flat or medially ptotic appearance. High lateral/medial ratios were due to a central or medial overcorrection or the maintenance of a lateral peak. Conclusions: Postoperative upper eyelid contour abnormalities can be quantified by comparing the sums of multiple midpupil eyelid distances of the lateral and medial sectors of the eyelid. Low and high lateral/medial ratios are anomalous and judged as unpleasant. (Ophthal Plast Reconstr Surg 2012;28:429-433)
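
The contour metric reduces to a single ratio once the midpupil eyelid distances (MPDs) have been sampled every 15 degrees. The MPD values below are invented illustrations, not patient data; only the computation mirrors the described method:

```python
# Lateral/medial ratio: sum of the MPDs on the lateral side of the midpupil
# line divided by the sum on the medial side. Values near ~1.10 match the
# normal contour reported for controls; low or high ratios flag abnormality.

def lateral_medial_ratio(lateral_mpd_mm, medial_mpd_mm):
    """Ratio of summed lateral to summed medial midpupil eyelid distances."""
    return sum(lateral_mpd_mm) / sum(medial_mpd_mm)

lateral = [4.1, 3.9, 3.4, 2.6]   # hypothetical MPDs at 75, 60, 45, 30 degrees
medial  = [4.0, 3.6, 3.0, 2.3]   # hypothetical MPDs at 105, 120, 135, 150 degrees
ratio = lateral_medial_ratio(lateral, medial)   # ~1.09
```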

Relevance:

30.00%

Publisher:

Abstract:

Dimensionality reduction is employed in visual data analysis as a way of obtaining reduced spaces for high-dimensional data or of mapping data directly into 2D or 3D spaces. Although techniques have evolved to improve data segregation in reduced or visual spaces, they have limited capabilities for adjusting the results according to the user's knowledge. In this paper, we propose a novel approach to handling both dimensionality reduction and visualization of high-dimensional data, taking the user's input into account. It employs Partial Least Squares (PLS), a statistical tool that retrieves latent spaces focused on the discriminability of the data. The method employs a training set to build a highly precise model that can then be applied very effectively to a much larger data set. The reduced data set can be displayed using various existing visualization techniques. The training data is important for encoding the user's knowledge into the loop. However, this work also devises a strategy for calculating PLS reduced spaces when no training data is available. The approach produces increasingly precise visual mappings as users feed back their knowledge, and is capable of working with small and unbalanced training sets.
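
The core of PLS as a supervised projection can be sketched with its first component: the weight vector is proportional to Xᵀy (the direction of maximal covariance with the labels), and the scores t = Xw give a label-driven coordinate for visualization. This is a one-component PLS1 sketch on toy data, not the paper's full pipeline:

```python
# First PLS component for centered data X (rows = samples) and labels y:
# w is proportional to X^T y, normalized; t = X w are the projected scores.

def pls_first_component(X, y):
    """First PLS weight vector w (unit norm) and scores t."""
    n, d = len(X), len(X[0])
    w = [sum(X[i][j] * y[i] for i in range(n)) for j in range(d)]  # X^T y
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    t = [sum(X[i][j] * w[j] for j in range(d)) for i in range(n)]  # scores
    return w, t

# Two classes separated along the first feature (data already centered):
X = [[-1.0, 0.2], [-0.8, -0.1], [0.9, 0.1], [0.9, -0.2]]
y = [-1.0, -1.0, 1.0, 1.0]
w, t = pls_first_component(X, y)
# The scores t separate the classes by sign, which is what makes the reduced
# space "discriminative" for visualization.
```

Feeding back user knowledge then amounts to changing the training labels y (or adding labeled samples) and recomputing the projection.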