10 results for simplicity
in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
Complex networks can be understood as graphs whose connectivity properties deviate from those of regular or near-regular graphs, which are understood as being "simple". While much of the attention so far dedicated to complex networks has been duly driven by the "complex" nature of these structures, in this work we address the identification of their simplicity. The basic idea is to search for subgraphs whose nodes exhibit similar measurements. This approach paves the way for complementing the characterization of networks, including results suggesting that protein-protein interaction networks, and to a lesser extent also the Internet, may be getting simpler over time. Copyright (C) EPLA, 2009
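The subgraph-identification idea above can be sketched in a few lines. This is a hypothetical illustration only: it uses node degree as the per-node measurement and exact equality as the similarity criterion, neither of which is specified by the abstract.

```python
def group_by_measurement(adj, measure):
    """Collect nodes of an adjacency dict that share the same measurement value."""
    groups = {}
    for node in adj:
        groups.setdefault(measure(adj, node), []).append(node)
    return groups

# Toy undirected graph: a 4-cycle a-b-c-d with a pendant node h attached to d.
adj = {
    "a": {"b", "d"},
    "b": {"a", "c"},
    "c": {"b", "d"},
    "d": {"a", "c", "h"},
    "h": {"d"},
}

degree = lambda g, v: len(g[v])
groups = group_by_measurement(adj, degree)
# nodes a, b, c form a "simple" subgraph: their measurements (degree 2) coincide
```

Richer measurements (clustering coefficient, betweenness, etc.) would slot in by swapping the `measure` function.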
Abstract:
Human respiratory syncytial virus (HRSV) is the main cause of acute lower respiratory tract infections in infants and children. Rapid diagnosis is required to permit appropriate care and treatment and to avoid unnecessary antibiotic use. Reverse transcriptase PCR (RT-PCR) and indirect immunofluorescence assay (IFA) methods have been considered important tools for virus detection due to their high sensitivity and specificity. In order to maximize simplicity of use and minimize the risk of sample cross-contamination inherent in two-step techniques, an RT-PCR method using only a single tube to detect HRSV in clinical samples was developed. Nasopharyngeal aspirates from 226 patients with acute respiratory illness, ranging in age from infancy to 5 years, were collected at the University Hospital of the University of Sao Paulo (HU-USP) and tested using IFA, one-step RT-PCR, and semi-nested RT-PCR. One hundred and two (45.1%) samples were positive by at least one of the three methods, and 75 (33.2%) were positive by all methods: 92 (40.7%) were positive by one-step RT-PCR, 84 (37.2%) by IFA, and 96 (42.5%) by the semi-nested RT-PCR technique. One-step RT-PCR was shown to be fast, sensitive, and specific for HRSV diagnosis, without the added inconvenience and risk of false positive results associated with semi-nested PCR. The combined use of these two methods enhances HRSV detection. (C) 2007 Elsevier B.V. All rights reserved.
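The detection rates reported above can be cross-checked arithmetically against the sample size (n = 226); a quick sketch:

```python
# Reported positive counts out of n = 226 nasopharyngeal aspirates.
n = 226
counts = {
    "any method": 102,
    "all methods": 75,
    "one-step RT-PCR": 92,
    "IFA": 84,
    "semi-nested RT-PCR": 96,
}

# Percentages rounded to one decimal, as in the abstract.
percent = {k: round(100 * v / n, 1) for k, v in counts.items()}
```

Each computed value matches the percentage quoted in the abstract.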
Abstract:
Since the advent of the postgenomic era, efforts have focused on the development of rapid strategies for annotating plant genes of unknown function. Given its simplicity and rapidity, virus-induced gene silencing (VIGS) has become one of the preeminent approaches for functional analyses. However, several problems remain intrinsic to the use of such a strategy in the study of both metabolic and developmental processes. The most prominent of these is the commonly observed phenomenon of "sectoring": tissue regions that are not effectively targeted by VIGS. To better discriminate these sectors, an effective marker system displaying minimal secondary effects is a prerequisite. Utilizing a VIGS system based on the tobacco rattle virus vector, we here studied the effect of silencing the endogenous phytoene desaturase gene (pds), and of the expression and subsequent silencing of the exogenous green fluorescent protein gene (gfp), on the metabolism of Arabidopsis (Arabidopsis thaliana) leaves and tomato (Solanum lycopersicum) fruits. In leaves, we observed dramatic effects on primary carbon and pigment metabolism associated with the photobleached phenotype following the silencing of the endogenous pds gene. However, relatively few pleiotropic effects on carbon metabolism were observed in tomato fruits when pds expression was inhibited. VIGS coupled to gfp constitutive expression revealed no significant metabolic alterations after triggering of silencing in Arabidopsis leaves and a mild effect in mature green tomato fruits. By contrast, a wider impact on metabolism was observed in ripe fruits. Silencing experiments with an endogenous target gene of interest clearly demonstrated the feasibility of cosilencing in this system; however, carefully constructed control experiments are a prerequisite to prevent erroneous interpretation.
Abstract:
Predictive performance evaluation is a fundamental issue in the design, development, and deployment of classification systems. As predictive performance evaluation is a multidimensional problem, single scalar summaries such as error rate, although quite convenient due to their simplicity, can seldom evaluate all the aspects that a complete and reliable evaluation must consider. Due to this, various graphical performance evaluation methods are increasingly drawing the attention of the machine learning, data mining, and pattern recognition communities. The main advantage of these types of methods resides in their ability to depict the trade-offs between evaluation aspects in a multidimensional space rather than reducing these aspects to an arbitrarily chosen (and often biased) single scalar measure. Furthermore, to appropriately select a suitable graphical method for a given task, it is crucial to identify its strengths and weaknesses. This paper surveys various graphical methods often used for predictive performance evaluation. By presenting these methods in the same framework, we hope this paper may shed some light on deciding which methods are more suitable to use in different situations.
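One of the most common graphical evaluation methods of the kind surveyed (though not named in the abstract) is the ROC curve, which plots the trade-off between true-positive and false-positive rates across all decision thresholds instead of collapsing performance to one scalar. A minimal sketch with toy scores:

```python
def roc_points(scores, labels):
    """(fpr, tpr) points obtained by sweeping the threshold over sorted scores."""
    pos = sum(labels)
    neg = len(labels) - pos
    pts = [(0.0, 0.0)]
    tp = fp = 0
    # Lowering the threshold admits one example at a time, highest score first.
    for _, y in sorted(zip(scores, labels), reverse=True):
        if y:
            tp += 1
        else:
            fp += 1
        pts.append((fp / neg, tp / pos))
    return pts

scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4]  # classifier scores (toy)
labels = [1, 1, 0, 1, 0, 0]               # ground truth
curve = roc_points(scores, labels)
```

A scalar summary such as the area under this curve throws away exactly the per-threshold trade-off information the graphical view preserves.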
Abstract:
Surface roughness is an important geomorphological variable which has been used in the Earth and planetary sciences to infer material properties, current/past processes, and the time elapsed since formation. No single definition exists; however, within the context of geomorphometry, we use surface roughness as an expression of the variability of a topographic surface at a given scale, where the scale of analysis is determined by the size of the landforms or geomorphic features of interest. Six techniques for the calculation of surface roughness were selected for an assessment of the parameter's behavior at different spatial scales and data-set resolutions. Area ratio operated independently of scale, providing consistent results across spatial resolutions. Vector dispersion produced results with increasing roughness and homogenization of terrain at coarser resolutions and larger window sizes. Standard deviation of residual topography highlighted local features and did not detect regional relief. Standard deviation of elevation correctly identified breaks of slope and was good at detecting regional relief. Standard deviation of slope (SD(slope)) also correctly identified smooth sloping areas and breaks of slope, providing the best results for geomorphological analysis. Standard deviation of profile curvature identified the breaks of slope, although not as strongly as SD(slope), and it is sensitive to noise and spurious data. In general, SD(slope) offered good performance at a variety of scales, while the simplicity of calculation is perhaps its single greatest benefit.
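One of the simpler measures discussed above, standard deviation of elevation in a moving window over a gridded DEM, can be sketched as follows; the window size and toy grids are illustrative, not taken from the study:

```python
import statistics

def sd_elevation(dem, win=3):
    """Per-cell stdev of elevation in a win x win neighborhood (edge cells skipped)."""
    h = win // 2
    rows, cols = len(dem), len(dem[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(h, rows - h):
        for c in range(h, cols - h):
            window = [dem[r + i][c + j]
                      for i in range(-h, h + 1)
                      for j in range(-h, h + 1)]
            out[r][c] = statistics.pstdev(window)
    return out

flat = [[5.0] * 5 for _ in range(5)]                      # flat terrain
ramp = [[float(c) for c in range(5)] for _ in range(5)]   # smooth uniform slope
```

Flat terrain scores zero everywhere, but a perfectly smooth ramp still registers nonzero SD of elevation; this is precisely why the abstract finds SD of slope (constant on a ramp) better at isolating true roughness from regional relief.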
Abstract:
Most studies involving statistical time series analysis rely on assumptions of linearity, whose simplicity facilitates parameter interpretation and estimation. However, the linearity assumption may be too restrictive for many practical applications. The implementation of nonlinear models in time series analysis involves the estimation of a large set of parameters, frequently leading to overfitting problems. In this article, a predictability coefficient is estimated using a combination of nonlinear autoregressive models, and the use of support vector regression in this model is explored. We illustrate the usefulness and interpretability of the results by using electroencephalographic records of an epileptic patient.
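The paper's coefficient is built from nonlinear autoregressive models combined with support vector regression; as a purely hypothetical illustration of the underlying idea, a linear AR(1) version measures predictability as one minus the fraction of signal energy left unexplained by the fit:

```python
def ar1_predictability(x):
    """1 - (least-squares AR(1) residual energy) / (signal energy), no intercept."""
    # Least-squares slope of x[t+1] on x[t].
    phi = sum(a * b for a, b in zip(x, x[1:])) / sum(a * a for a in x[:-1])
    resid = [b - phi * a for a, b in zip(x, x[1:])]
    return 1.0 - sum(r * r for r in resid) / sum(b * b for b in x[1:])

series = [0.9 ** i for i in range(30)]          # exactly AR(1)-predictable
noisy = [0.3, -1.2, 0.7, 0.1, -0.5, 1.1, -0.2]  # irregular toy sequence
```

Since least squares can never do worse than the trivial fit phi = 0, the coefficient always lands in [0, 1]; the actual paper replaces the linear fit with nonlinear AR models and support vector regression.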
Abstract:
The Grubbs' measurement model is frequently used to compare several measuring devices. It is common to assume that the random terms have a normal distribution. However, such an assumption makes the inference vulnerable to outlying observations, whereas scale mixtures of normal distributions have been an interesting alternative to produce robust estimates, keeping the elegance and simplicity of maximum likelihood theory. The aim of this paper is to develop an EM-type algorithm for the parameter estimation, and to use the local influence method to assess the robustness aspects of these parameter estimates under some usual perturbation schemes. In order to identify outliers and to criticize the model building, we use the local influence procedure in a study to compare the precision of several thermocouples. (C) 2008 Elsevier B.V. All rights reserved.
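A hedged sketch of why scale mixtures of normals yield robust EM estimates: under a Student-t model (a scale mixture of normals), the E-step assigns each observation a weight that shrinks for outlying points, and the M-step re-estimates location and scale with those weights. This is an illustrative univariate version, not the paper's Grubbs-model algorithm:

```python
def em_t_location(x, nu=4.0, iters=50):
    """EM location/scale estimate under a univariate Student-t(nu) model."""
    mu = sum(x) / len(x)
    s2 = sum((v - mu) ** 2 for v in x) / len(x)
    for _ in range(iters):
        # E-step: expected mixing weights; large residuals get small weights.
        w = [(nu + 1) / (nu + (v - mu) ** 2 / s2) for v in x]
        # M-step: weighted location and scale updates.
        mu = sum(wi * v for wi, v in zip(w, x)) / sum(w)
        s2 = sum(wi * (v - mu) ** 2 for wi, v in zip(w, x)) / len(x)
    return mu

data = [9.8, 10.1, 10.0, 9.9, 10.2, 35.0]  # one gross outlier
robust = em_t_location(data)               # stays near 10, unlike the plain mean
```

The plain sample mean of `data` is dragged above 14 by the outlier, while the EM estimate remains close to the bulk of the observations.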
Abstract:
In this work we show that the eigenvalues of the Dirichlet problem for the biharmonic operator are generically simple in the set of Z_2-symmetric regions of R^n, n >= 2, with a suitable topology. To accomplish this, we combine Baire's lemma, a generalised version of the transversality theorem due to Henry [Perturbation of the boundary in boundary value problems of PDEs, London Mathematical Society Lecture Note Series 318 (Cambridge University Press, 2005)], and the method of rapidly oscillating functions developed in [A. L. Pereira and M. C. Pereira, Mat. Contemp. 27 (2004) 225-241].
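For reference, the Dirichlet eigenvalue problem for the biharmonic operator on a region Omega of R^n reads:

```latex
\Delta^2 u = \lambda u \quad \text{in } \Omega,
\qquad
u = \frac{\partial u}{\partial \nu} = 0 \quad \text{on } \partial\Omega,
```

where nu is the outward normal. An eigenvalue lambda is called simple when its eigenspace is one-dimensional; the result above states that simplicity holds generically over the Z_2-symmetric regions considered.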
Abstract:
An exploratory investigation was conducted on the effects of the application of ozone on the removal of organic and inorganic contaminants and the reduction of settleable solids in urban lake sediments. Homogenized sediment samples were treated in a batch reactor with an external recirculation loop and ozone feed from a Venturi injector. The ozone-generating system was fed with ambient air and had a small footprint and operational simplicity. Ozone mass application (g/h) and contact time (min) were varied over wide ranges during testing. The effects of the ozone mass applied per unit time and of the contact time on contaminant removal efficiencies were analyzed, and a trade-off between the costs of ozonation and of solids treatment and disposal was proposed. The minimum ozone mass application required for total contaminant removal apparently depended on the type of organic contaminant present. An apparent influence of inorganic contaminant speciation on the removal efficiency was found and discussed.
Abstract:
In this report, we describe the microfabrication and integration of planar electrodes for contactless conductivity detection on polyester-toner (PT) electrophoresis microchips using toner masks. Planar electrodes were fabricated in three simple steps: (i) drawing and laser-printing the electrode geometry on polyester films, (ii) sputtering deposition onto the substrates, and (iii) removal of the toner layer by a lift-off process. The polyester film with anchored electrodes was integrated into PT electrophoresis microchannels by lamination at 120 degrees C in less than 1 min. The electrodes were designed in an antiparallel configuration with 750 μm width and a 750 μm gap between them. The best results were recorded with a frequency of 400 kHz and 10 V peak-to-peak using a sinusoidal wave. The analytical performance of the proposed microchip was evaluated by electrophoretic separation of potassium, sodium, and lithium in 150 μm wide x 6 μm deep microchannels. Under an electric field of 250 V/cm the analytes were successfully separated in less than 90 s with efficiencies ranging from 7000 to 13 000 plates. The detection limits (S/N = 3) found for K+, Na+, and Li+ were 3.1, 4.3, and 7.2 μmol/L, respectively. Besides its low cost and instrumental simplicity, the integrated PT chip eliminates the problem of manual alignment and gluing of the electrodes, providing more robustness and better reproducibility and therefore making it more suitable for mass production of electrophoresis microchips.
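The S/N = 3 detection limits quoted above follow the common convention LOD = k x (baseline noise) / (calibration sensitivity) with k = 3; a minimal sketch, with entirely hypothetical noise and slope values (the abstract does not report them):

```python
def detection_limit(noise_sd, slope, k=3.0):
    """LOD = k * baseline noise / calibration slope (signal per unit concentration)."""
    return k * noise_sd / slope

# Hypothetical numbers: noise and slope in arbitrary signal units per (umol/L).
lod = detection_limit(noise_sd=0.1, slope=0.1)  # -> 3.0 in concentration units
```

With the instrument's actual baseline noise and a measured calibration slope for each ion, the same formula yields concentration-unit detection limits like those reported.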