977 results for work system method
Abstract:
Background: The post-genomic era has brought new challenges regarding the understanding of the organization and function of the human genome. Many of these challenges are centered on the meaning of differential gene regulation under distinct biological conditions and can be addressed by analyzing the Multiple Differential Expression (MDE) of genes associated with normal and abnormal biological processes. Currently, MDE analyses are limited to the usual methods of differential expression, which were initially designed for paired analysis. Results: We propose a web platform named ProbFAST for MDE analysis, which uses Bayesian inference to identify key genes that are intuitively prioritized by means of probabilities. A simulation study revealed that our method performs better than other approaches, and when applied to public expression data it demonstrated the flexibility to retrieve relevant genes biologically associated with normal and abnormal biological processes. Conclusions: ProbFAST is a freely accessible web-based application that enables MDE analysis on a global scale. It offers an efficient methodological approach for MDE analysis of sets of genes that are turned on and off, in relation to functional information, during the evolution of a tumor or during tissue differentiation. The ProbFAST server can be accessed at http://gdm.fmrp.usp.br/probfast.
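The abstract does not describe the ProbFAST model itself; as a hedged illustration of prioritizing genes by probability, the Python sketch below scores one gene by the posterior probability that its expression proportion differs across several conditions, using a simple Beta-Binomial model comparison (the model, priors, and function names are assumptions for illustration, not the published method).

# Minimal sketch (not the ProbFAST model): rank genes by the posterior
# probability that their expression proportion differs across conditions,
# using a Beta-Binomial model comparison with equal prior odds.
# Assumed inputs: counts[c] = tag/read count of one gene in condition c,
# totals[c] = total counts of that library/condition. Names are illustrative.
import numpy as np
from scipy.special import betaln

def log_marginal(x, n, a=1.0, b=1.0):
    """Log marginal likelihood of x counts out of n under a Beta(a, b) prior
    (the binomial coefficient is omitted; it cancels in the model comparison)."""
    return betaln(x + a, n - x + b) - betaln(a, b)

def posterior_mde(counts, totals, a=1.0, b=1.0):
    """Posterior probability that the gene's proportion differs across conditions."""
    counts = np.asarray(counts, dtype=float)
    totals = np.asarray(totals, dtype=float)
    # M0: one shared proportion for all conditions
    log_m0 = log_marginal(counts.sum(), totals.sum(), a, b)
    # M1: an independent proportion per condition
    log_m1 = sum(log_marginal(x, n, a, b) for x, n in zip(counts, totals))
    # Equal prior odds: P(M1 | data) = m1 / (m0 + m1)
    return 1.0 / (1.0 + np.exp(log_m0 - log_m1))

# Example: one gene observed in three conditions
print(posterior_mde(counts=[5, 40, 90], totals=[50000, 52000, 48000]))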
Abstract:
Background: The inherent complexity of statistical methods and clinical phenomena compels researchers with diverse domains of expertise to work in interdisciplinary teams, in which none of them has complete knowledge of their counterpart's field. As a result, knowledge exchange may often be characterized by miscommunication, leading to misinterpretation and ultimately resulting in errors in research and even in clinical practice. Although communication has a central role in interdisciplinary collaboration and miscommunication can have a negative impact on research processes, to the best of our knowledge no study has yet explored how data analysis specialists and clinical researchers communicate over time. Methods/Principal Findings: We conducted a qualitative analysis of encounters between clinical researchers and data analysis specialists (an epidemiologist, a clinical epidemiologist, and a data mining specialist). These encounters were recorded and systematically analyzed using a grounded theory methodology for the extraction of emerging themes, followed by data triangulation and analysis of negative cases for validation. A policy analysis was then performed using a system dynamics methodology, looking for potential interventions to improve this process. Four major emerging themes were found. Definitions using lay language were frequently employed as a way to bridge the language gap between the specialties. Thought experiments presented a series of "what if" situations that helped clarify how the method or information from the other field would behave if exposed to alternative situations, ultimately aiding in explaining their main objective. Metaphors and analogies were used to translate concepts across fields, from the unfamiliar to the familiar. Prolepsis was used to anticipate study outcomes, thus helping specialists understand the current context based on an understanding of their final goal. Conclusion/Significance: The communication between clinical researchers and data analysis specialists presents multiple challenges that can lead to errors.
Abstract:
Background: High-throughput molecular approaches for gene expression profiling, such as Serial Analysis of Gene Expression (SAGE), Massively Parallel Signature Sequencing (MPSS), or Sequencing-by-Synthesis (SBS), represent powerful techniques that provide global transcription profiles of different cell types through sequencing of short fragments of transcripts, termed sequence tags. These techniques have improved our understanding of the relationships between these expression profiles and cellular phenotypes. Despite this, more reliable datasets are still necessary. In this work, we present a web-based tool named S3T: Score System for Sequence Tags, to index sequenced tags according to their reliability. This is done through a series of evaluations based on a defined rule set. S3T allows the identification and selection of tags considered more reliable for further gene expression analysis. Results: This methodology was applied to a public SAGE dataset. In order to compare data before and after filtering, a hierarchical clustering analysis was performed on samples from the same type of tissue, in distinct biological conditions, using these two datasets. Our results provide evidence suggesting that it is possible to find more congruous clusters after using the S3T scoring system. Conclusion: These results substantiate the proposed application to generate more reliable data. This is a significant contribution to the determination of global gene expression profiles. The library analysis with S3T is freely available at http://gdm.fmrp.usp.br/s3t/. The S3T source code and datasets can also be downloaded from the same website.
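The abstract does not list the S3T rules themselves; the Python sketch below only illustrates the general pattern of a rule-based tag score with a cut-off (the rules, point values, and cut-off are placeholders, not the actual S3T rule set).

# Illustrative rule-based tag scoring (not the S3T rules): score each sequence
# tag against a list of reliability rules and keep tags whose total score passes
# a chosen cut-off before expression analysis.
from dataclasses import dataclass

@dataclass
class Tag:
    sequence: str
    count: int          # occurrences in the library
    genome_hits: int    # number of genomic matches

# Each rule is (description, predicate, points); values are assumptions for the sketch.
RULES = [
    ("observed more than once", lambda t: t.count > 1,              2),
    ("maps to a single locus",  lambda t: t.genome_hits == 1,       3),
    ("not low-complexity",      lambda t: len(set(t.sequence)) > 2, 1),
]

def score(tag: Tag) -> int:
    """Sum the points of every rule the tag satisfies."""
    return sum(points for _, ok, points in RULES if ok(tag))

def reliable(tags, cutoff=4):
    """Return only the tags considered reliable for further analysis."""
    return [t for t in tags if score(t) >= cutoff]

tags = [Tag("CATGAAAAAA", 1, 5), Tag("CATGTCCGTA", 12, 1)]
print([t.sequence for t in reliable(tags)])   # only the second tag passes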
Abstract:
Objective: The purpose of this in vitro study was to evaluate the dentine root surface roughness and the adherence of Streptococcus sanguinis (ATCC 10556) after treatment with an ultrasonic system, an Er:YAG laser, or a manual curette. Background Data: Bacterial adhesion and the formation of dental biofilm after scaling and root planing may be a challenge to the long-term stability of periodontal therapy. Materials and Methods: Forty flattened bovine roots were randomly assigned to one of the following groups: ultrasonic system (n = 10), Er:YAG laser (n = 10), manual curette (n = 10), or untreated control roots (n = 10). The mean surface roughness (Ra, μm) of the specimens before and after exposure to each treatment was determined using a surface profilometer. In addition, S. sanguinis was grown on the treated and untreated specimens, and the amount of bacteria retained on the surfaces was measured by the culture method. Results: All treatments increased the Ra; however, the roughest surface was produced by the curettes. In addition, the specimens treated with curettes showed the highest S. sanguinis adhesion. There was a significant positive correlation between roughness values and bacterial cell counts. Conclusion: S. sanguinis adhesion was highest on the curette-treated dentine root surfaces, which also presented the greatest surface roughness.
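The abstract reports a significant positive correlation but not the coefficient used; the sketch below shows one common way such a correlation would be tested (Spearman's rank correlation is an assumption, and the numbers are placeholders rather than the study's data).

# Correlation test of the kind reported (coefficient choice and data are
# illustrative assumptions, not taken from the paper).
from scipy.stats import spearmanr

roughness_ra = [0.8, 1.1, 1.9, 2.4, 3.0]                 # Ra, in micrometres
bacterial_cfu = [1.2e5, 1.5e5, 2.9e5, 3.3e5, 4.1e5]      # retained S. sanguinis (CFU)

rho, p_value = spearmanr(roughness_ra, bacterial_cfu)
print(f"rho = {rho:.2f}, p = {p_value:.3f}")  # positive rho: rougher surfaces retain more cells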
Abstract:
Aims. In this work, we describe the pipeline for the fast supervised classification of light curves observed by the CoRoT exoplanet CCDs. We present the classification results obtained for the first four measured fields, which represent one year of in-orbit operation. Methods. The basis of the adopted supervised classification methodology has been described in detail in a previous paper, as has its application to the OGLE database. Here, we present the modifications of the algorithms and of the training set made to optimize the performance when applied to the CoRoT data. Results. Classification results are presented for the observed fields IRa01, SRc01, LRc01, and LRa01 of the CoRoT mission. Statistics on the number of variables and the number of objects per class are given, and typical light curves of high-probability candidates are shown. We also report on new stellar variability types discovered in the CoRoT data. The full classification results are publicly available.
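The pipeline itself is described in the cited paper; as a generic, hedged illustration of supervised light-curve classification, the Python sketch below extracts a few simple features per light curve and trains an off-the-shelf classifier that outputs class probabilities (the features, classifier choice, and class names are assumptions, not the CoRoT pipeline).

# Generic supervised light-curve classification sketch (not the CoRoT pipeline):
# extract simple features per light curve, train on labelled examples, then
# report class probabilities for a new curve.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def features(time, flux):
    """Illustrative features: amplitude, scatter, and dominant frequency (FFT peak)."""
    power = np.abs(np.fft.rfft(flux - flux.mean()))
    freqs = np.fft.rfftfreq(len(flux), d=np.median(np.diff(time)))
    return [flux.max() - flux.min(), flux.std(), freqs[power.argmax()]]

# Synthetic labelled training curves (sinusoids of different frequencies)
rng = np.random.default_rng(0)
t = np.arange(0, 30, 0.01)
X_train = [features(t, np.sin(2 * np.pi * f * t) + 0.1 * rng.normal(size=t.size))
           for f in (0.3, 0.5, 2.0, 2.5)]
y_train = ["long_period", "long_period", "short_period", "short_period"]

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(clf.predict_proba([features(t, np.sin(2 * np.pi * 2.2 * t))]))  # class probabilities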
Abstract:
The VISTA near-infrared survey of the Magellanic System (VMC) will provide deep YJKs photometry reaching stars at the oldest turn-off point throughout the Magellanic Clouds (MCs). As part of the preparation for the survey, we aim to assess the accuracy in the star formation history (SFH) that can be expected from VMC data, in particular for the Large Magellanic Cloud (LMC). To this aim, we first simulate VMC images containing not only the LMC stellar populations but also the foreground Milky Way (MW) stars and background galaxies. The simulations cover the whole range of densities of LMC field stars. We then perform aperture photometry over these simulated images, assess the expected levels of photometric errors and incompleteness, and apply the classical technique of SFH recovery based on the reconstruction of colour-magnitude diagrams (CMD) via the minimisation of a chi-squared-like statistic. We verify that the foreground MW stars are accurately recovered by the minimisation algorithms, whereas the background galaxies can be largely eliminated from the CMD analysis due to their particular colours and morphologies. We then evaluate the expected errors in the recovered star formation rate as a function of stellar age, SFR(t), starting from models with a known age-metallicity relation (AMR). It turns out that, for a given sky area, the random errors for ages older than ~0.4 Gyr seem to be independent of the crowding. This can be explained by a counterbalancing effect between the loss of stars from a decrease in the completeness and the gain of stars from an increase in the stellar density. For a spatial resolution of ~0.1 deg², the random errors in SFR(t) will be below 20% for this wide range of ages. On the other hand, due to the lower stellar statistics for stars younger than ~0.4 Gyr, the outer LMC regions will require larger areas to achieve the same level of accuracy in the SFR(t). If we consider the AMR as unknown, the SFH-recovery algorithm is able to accurately recover the input AMR, at the price of an increase of the random errors in the SFR(t) by a factor of about 2.5. Experiments of SFH recovery performed for varying distance modulus and reddening indicate that these parameters can be determined with (relative) accuracies of Δ(m − M)₀ ≈ 0.02 mag and ΔE(B − V) ≈ 0.01 mag for each individual field over the LMC. The propagation of these errors into the SFR(t) implies systematic errors below 30%. This level of accuracy in the SFR(t) can reveal significant imprints in the dynamical evolution of this unique and nearby stellar system, as well as possible signatures of the past interaction between the MCs and the MW.
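For context, the "chi-squared-like statistic" refers to comparing observed and synthetic star counts in CMD cells; a minimal sketch of the kind of statistic commonly minimised in CMD-based SFH recovery (the exact form used by the authors may differ) is

\chi^2 \;=\; \sum_{i} \frac{\left(n_i - m_i\right)^2}{m_i},
\qquad
m_i \;=\; \sum_{j} a_j\, c_{ij}, \quad a_j \ge 0,

where n_i is the number of observed stars in CMD cell i, c_{ij} the number predicted in that cell by partial model j (a single age-metallicity bin with unit star formation rate), and the best-fitting coefficients a_j give the recovered SFR(t) and AMR.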
Abstract:
This article evaluates the social implications of the "SIGA" Health Care Information System (HIS) in a public health care organization in the city of Sao Paulo. The evaluation was performed by means of an in-depth case study with patients and staff of a public health care organization, using qualitative and quantitative data. On the one hand, the system had consequences perceived as positive, such as improved convenience and democratization of specialized treatment for patients and improvements in work organization. On the other hand, negative outcomes were reported, such as difficulties faced by employees due to little familiarity with IT and an increase in the time needed to schedule appointments. The results show the ambiguity of the implications of HIS in developing countries, emphasizing the need for a more nuanced view of the evaluation of failures and successes and the importance of social contextual factors.
Abstract:
In this work, the time-resolved thermal lens method is combined with an interferometric technique, thermal relaxation calorimetry, photoluminescence, and lifetime measurements to determine the thermophysical properties of Nd₂O₃-doped sodium zincborate glass as a function of temperature up to the glass transition region. Thermal diffusivity, thermal conductivity, fluorescence quantum efficiency, linear thermal expansion coefficient, and the thermal coefficient of electronic polarizability were determined. In conclusion, the results showed the ability of the thermal lens and interferometric methods to perform measurements very close to the phase transition region. These techniques provide absolute values for the measured physical quantities and are advantageous when low scan rates are required. (c) 2008 Optical Society of America
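For context, two standard relations of time-resolved thermal lens spectrometry (general textbook expressions, not equations quoted from this paper) connect the fitted transient to the quantities listed above:

t_c \;=\; \frac{\omega_e^{2}}{4D},
\qquad
k \;=\; \rho\, c_p\, D,

where t_c is the characteristic thermal time constant of the transient, \omega_e the excitation beam radius at the sample, D the thermal diffusivity, \rho the density, c_p the specific heat, and k the thermal conductivity.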
Abstract:
Transparent conducting oxides (TCO) are widely used in technological applications ranging from photovoltaics to thin-film transparent field-effect transistors. In this work we report a first-principles investigation, based on density-functional theory, of the atomic and electronic properties of Ga₂O₃(ZnO)₆ (GZO₆), which is a promising candidate host oxide for wide-band-gap TCO applications. We identify a low-energy configuration for the coherent distribution of the Ga and Zn atoms over the cation positions within the experimentally reported orthorhombic GZO₆ structure. Four Ga atoms are located in fourfold sites, while the remaining 12 Ga atoms in the unit cell form four shared Ga agglomerates (a motif of four atoms). The Zn atoms are distributed over the remaining cation sites with effective coordination numbers from 3.90 to 4.50. Furthermore, we identify the natural formation of twin boundaries in GZO₆, which can explain the zigzag modulations observed experimentally by high-resolution transmission electron microscopy in GZOₙ (n = 9). Due to the intrinsic twin-boundary formation, polarity inversion in the ZnO tetrahedra is present, which is facilitated by the formation of the Ga agglomerates. Our analysis shows that the formation of fourfold Ga sites and Ga agglomerates is stabilized by the electronic octet rule, while the distribution of Ga atoms and the formation of the twin boundary help alleviate excess strain. Finally, we find that the electronic properties of GZO₆ are essentially determined by the electronic properties of ZnO, i.e., there are only slight changes in the band gap and optical absorption properties.
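The abstract quotes effective coordination numbers of 3.90-4.50 for the Zn sites; the Python sketch below shows one common (non-iterative) definition of the effective coordination number, Hoppe's ECoN, as a general formula rather than code from the paper.

# Effective coordination number (ECoN) of one cation site from its bond lengths;
# a general, commonly used definition, not the authors' implementation.
import numpy as np

def effective_coordination(bond_lengths):
    """ECoN from the cation-anion bond lengths of one site (same length unit)."""
    d = np.asarray(bond_lengths, dtype=float)
    w = np.exp(1.0 - (d / d.min()) ** 6)       # weights relative to the shortest bond
    d_av = np.sum(w * d) / np.sum(w)           # weighted average bond length
    return float(np.sum(np.exp(1.0 - (d / d_av) ** 6)))

# Example: a slightly distorted tetrahedral site (illustrative distances, in Angstrom)
print(effective_coordination([1.95, 1.97, 2.00, 2.10]))   # close to, but below, 4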
Abstract:
We present results on the CO₂/carbonate system from the BIOSOPE cruise in the Eastern South Pacific Ocean, in an area not sampled previously. In particular, we present estimates of the distribution of anthropogenic carbon (C_ant^TrOCA) in the upper 1000 m of this region using the TrOCA method. The highest concentrations of C_ant^TrOCA, found around 13° S, 132° W and 32° S, 91° W, are higher than 80 μmol kg⁻¹ and 70 μmol kg⁻¹, respectively. The lowest concentrations are observed below 800 m depth (≤ 2 μmol kg⁻¹) and within the Oxygen Minimum Zone (OMZ), mainly around 140° W (< 11 μmol kg⁻¹). As a result of the penetration of anthropogenic carbon, pH has decreased by more than 0.1, on average, in the upper 200 m. This work further improves our understanding of the penetration of anthropogenic carbon in the Eastern Pacific Ocean.
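For context, the general structure of the TrOCA approach can be sketched schematically (the numerical parameterisations of TrOCA⁰ and of the coefficient a are given in the original TrOCA publications, not here):

C_{ant}^{TrOCA} \;=\; \frac{TrOCA - TrOCA^{0}}{a},
\qquad
TrOCA \;=\; O_2 + a\left(C_T - \tfrac{1}{2}A_T\right),

where O₂, C_T, and A_T are the measured dissolved oxygen, total dissolved inorganic carbon, and total alkalinity, and TrOCA⁰ is the preformed (pre-industrial) value of the tracer.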
Abstract:
In this work we present a complete characterization and magnetic study of vanadium oxide/hexadecylamine nanotubes (VOₓ/Hexa NTs) doped with Co²⁺ and Ni²⁺ ions. The morphology of the NTs has been characterized by transmission electron microscopy, while the metallic elements have been quantified by instrumental neutron activation analysis. The static and dynamic magnetic properties were studied by collecting magnetization data as a function of magnetic field and temperature and by electron paramagnetic resonance. In contrast to most reports in the literature, we do not observe magnetic dimers in the vanadium oxide nanotubes. We also observed that the incorporation of the metallic ions (Co²⁺, S = 3/2, and Ni²⁺, S = 1) notably decreases the amount of V⁴⁺ ions in the system, from 14-16% (undoped case) to 2-4% of the total vanadium atoms (a fact corroborated by XPS experiments), while preserving the tubular nanostructure. This method of decreasing the amount of V⁴⁺ in the nanotubes considerably improves their potential for technological applications such as Li-ion battery cathodes. (C) 2011 American Institute of Physics. [doi: 10.1063/1.3580252]
Abstract:
Although H⁺ and OH⁻ are the most common ions in aqueous media, they are not usually observable in capillary electrophoresis (CE) experiments because of the extensive use of buffer solutions as the background electrolyte. In the present work, we introduce CE equipment designed to allow the determination of such ions in the same fashion as any other ion. Basically, it consists of a four-compartment piece of equipment for electrolysis-separated experiments (D. P. de Jesus et al., Anal. Chem., 2005, 77, 607). In such a system, the ends of the capillary are placed in two reservoirs, which are connected to two other reservoirs through electrolyte-filled tubes. The electrodes of the high-voltage power source are positioned in these reservoirs. Thus, the electrolysis products are kept away from the inputs of the capillary. Detection was provided by two capacitively coupled contactless conductivity detectors (CD), each positioned about 11 cm from one end of the capillary. Two applications were demonstrated: titration-like procedures for nanolitre samples and mobility measurements. Strong and weak acids (pKa < 5), pure or in mixtures, could be titrated. The analytical curve is linear from 50 μM up to 10 mM of total dissociable hydrogen (r = 0.99899, n = 10) in 10-nL samples. By including D₂O in the running electrolyte, we could demonstrate how to measure the mixed proton/deuteron mobility. When H₂O/D₂O (9:1 v/v) was used as the solvent, the mobility was (289.6 ± 0.5) × 10⁻⁵ cm² V⁻¹ s⁻¹. Due to the fast interconversion of the species, this value reflects the overall behaviour of all isotopologues and isotopomers of the Zundel and Eigen structures, as well as the Stokesian mobility of the proton and deuteron. The effects of a neutral base (o-phenanthroline), a negatively charged base (chloroacetate), and an aprotic solvent (DMSO) on the H⁺ mobility were also demonstrated.
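For context, the standard textbook relation used to convert a CE migration time into a mobility (not necessarily the authors' exact data treatment) can be sketched as below; the numbers in the example are illustrative only.

# Apparent and effective electrophoretic mobility from a CE run (standard relation).
def apparent_mobility(L_total_cm, L_detector_cm, voltage_V, t_migration_s):
    """Apparent mobility in cm^2 V^-1 s^-1."""
    return (L_total_cm * L_detector_cm) / (voltage_V * t_migration_s)

def effective_mobility(mu_apparent, mu_eof):
    """Subtract the electroosmotic-flow contribution (same units)."""
    return mu_apparent - mu_eof

# Illustrative numbers (not from the paper): 60 cm capillary, detector at 49 cm,
# 25 kV, analyte migrating in 95 s, neutral EOF marker in 180 s.
mu_app = apparent_mobility(60.0, 49.0, 25_000.0, 95.0)
mu_eof = apparent_mobility(60.0, 49.0, 25_000.0, 180.0)
print(f"effective mobility = {effective_mobility(mu_app, mu_eof):.2e} cm^2/(V s)")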
Abstract:
The present work describes an investigation of the acetylation of celluloses extracted from short-life-cycle plant sources (sugarcane bagasse and sisal fiber), as well as of microcrystalline cellulose. The acetylation was carried out under homogeneous conditions using the solvent system N,N-dimethylacetamide/lithium chloride. The celluloses were characterized, and the characterizations included an evaluation of the amount of hemicellulose present in the materials obtained from the lignocellulosic sources (sugarcane and sisal). The amount of LiCl was varied and its influence on the degree of acetate substitution was analyzed. It was found that both the composition of the solvent system and the nature of the cellulose influenced the state of chain dissolution and the product characteristics. The results demonstrate the importance of developing specific studies on the dissolution process, as well as on the derivatization, of celluloses from various sources.
Abstract:
A green and highly sensitive analytical procedure was developed for the determination of free chlorine in natural waters, based on the reaction with N,N-diethyl-p-phenylenediamine (DPD). The flow system was designed with solenoid micro-pumps in order to improve mixing by pulsed flows and to minimize reagent consumption as well as waste generation. A 100-cm optical path flow cell based on a liquid-core waveguide was employed to increase sensitivity. A linear response was observed within the range 10.0 to 100.0 μg L⁻¹, with the detection limit, coefficient of variation, and sampling rate estimated as 6.8 μg L⁻¹ (99.7% confidence level), 0.9% (n = 20), and 60 determinations per hour, respectively. The consumption of the most toxic reagent (DPD) was reduced 20,000-fold and 30-fold in comparison with the batch method and with flow injection with continuous reagent addition, respectively. The results for natural and tap water samples agreed with those obtained by the reference batch spectrophotometric procedure at the 95% confidence level. (C) 2010 Elsevier B.V. All rights reserved.
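For context, a hedged sketch of the usual way a linear calibration and a 3σ detection limit are obtained (the generic procedure; the concentrations, absorbances, and blank readings below are placeholders, not the paper's data):

# Linear calibration and detection limit from blank scatter (generic procedure).
import numpy as np

conc = np.array([10.0, 25.0, 50.0, 75.0, 100.0])        # free chlorine, ug/L (illustrative)
absorbance = np.array([0.012, 0.031, 0.060, 0.092, 0.121])
blank = np.array([0.0008, 0.0011, 0.0009, 0.0012, 0.0010, 0.0009, 0.0011])

slope, intercept = np.polyfit(conc, absorbance, 1)      # least-squares calibration line
lod = 3 * blank.std(ddof=1) / slope                     # 3-sigma criterion (~99.7% level)
print(f"slope = {slope:.2e} AU per ug/L, LOD = {lod:.2f} ug/L")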
Abstract:
A fully automated methodology was developed for the determination of the thyroid hormones levothyroxine (T4) and liothyronine (T3). The proposed method exploits the formation of highly coloured charge-transfer (CT) complexes between these compounds, acting as electron donors, and π-acceptors such as chloranilic acid (CIA) and 2,3-dichloro-5,6-dicyano-p-benzoquinone (DDQ). For automation of the analytical procedure, a simple, fast and versatile single interface flow system (SIFA) was implemented, guaranteeing simplified performance optimisation, low maintenance and cost-effective operation. Moreover, the single reaction interface provided a convenient and straightforward approach for implementing Job's method of continuous variations, used to establish the stoichiometry of the formed CT complexes. Linear calibration plots were obtained for levothyroxine and liothyronine concentrations ranging from 5.0 × 10⁻⁵ to 2.5 × 10⁻⁴ mol L⁻¹ and from 1.0 × 10⁻⁵ to 1.0 × 10⁻⁴ mol L⁻¹, respectively, with good precision (R.S.D. < 4.6% and < 3.9%) and a determination frequency of 26 h⁻¹ for both drugs. The results obtained for pharmaceutical formulations were statistically comparable to the declared hormone amounts, with relative deviations lower than 2.1%. The accuracy was confirmed by recovery studies, which furnished recovery values ranging from 96.3% to 103.7% for levothyroxine and of 100.1% for liothyronine. (C) 2009 Elsevier B.V. All rights reserved.
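For context, Job's method of continuous variations, mentioned above for establishing the CT-complex stoichiometry, can be sketched as follows (Python, with synthetic numbers for a hypothetical 1:1 complex; not the paper's data):

# Job's method sketch: vary the donor mole fraction at constant total concentration
# and locate the absorbance maximum, which marks the complex stoichiometry.
import numpy as np

x = np.linspace(0.05, 0.95, 19)          # mole fraction of the hormone (donor)
C_total = 2.0e-4                          # total molar concentration, kept constant

# Synthetic absorbances for a strongly formed 1:1 complex: the complex
# concentration is limited by the minority component.
complex_conc = C_total * np.minimum(x, 1 - x)
absorbance = 1.5e3 * complex_conc         # epsilon * path * [complex] (illustrative)

x_max = x[absorbance.argmax()]
print(f"maximum at mole fraction {x_max:.2f} -> donor:acceptor = "
      f"{x_max / (1 - x_max):.1f}:1")     # ~1:1 stoichiometry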