994 results for "serial method"
Abstract:
Electron wave motion in a quantum wire with periodic structure is treated by direct solution of the Schrödinger equation as a mode-matching problem. Our method is particularly useful for a wire consisting of several distinct units, where the total transfer matrix for wave propagation is just the product of those for its basic units. It is generally applicable to any linearly connected serial device, and it can be implemented on a small computer. The one-dimensional mesoscopic crystal recently considered by Ulloa, Castaño, and Kirczenow [Phys. Rev. B 41, 12 350 (1990)] is discussed with our method, and is shown to be a strictly one-dimensional problem. Electron motion in the multiple-stub T-shaped potential well considered by Sols et al. [J. Appl. Phys. 66, 3892 (1989)] is also treated. A structure combining features of both of these is investigated.
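The composition rule this abstract relies on — the total transfer matrix of a serial device is the product of the transfer matrices of its basic units — can be sketched as follows. This is a minimal illustration with hypothetical 2×2 matrix entries, not the authors' code (real mode-matching problems use complex-valued matrices per propagating mode):

```python
from functools import reduce

def matmul2(a, b):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def total_transfer_matrix(units):
    """Total transfer matrix of a linearly connected serial device:
    the product of the transfer matrices of its basic units."""
    return reduce(matmul2, units)

# Two identical units with hypothetical entries
unit = [[1.0, 0.5],
        [0.0, 1.0]]
total = total_transfer_matrix([unit, unit])
```

Because the product is associative, a device built from repeated blocks only needs each distinct block's matrix computed once — which is why the method fits on a small computer.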
Abstract:
Abstract taken from the publication.
Abstract:
The paper concerns the design and analysis of serial dilution assays to estimate the infectivity of a sample of tissue when it is assumed that the sample contains a finite number of indivisible infectious units such that a subsample will be infectious if it contains one or more of these units. The aim of the study is to estimate the number of infectious units in the original sample. The standard approach to the analysis of data from such a study is based on the assumption of independence of aliquots both at the same dilution level and at different dilution levels, so that the numbers of infectious units in the aliquots follow independent Poisson distributions. An alternative approach is based on calculation of the expected value of the total number of samples tested that are not infectious. We derive the likelihood for the data on the basis of the discrete number of infectious units, enabling calculation of the maximum likelihood estimate and likelihood-based confidence intervals. We use the exact probabilities that are obtained to compare the maximum likelihood estimate with those given by the other methods in terms of bias and standard error and to compare the coverage of the confidence intervals. We show that the methods have very similar properties and conclude that for practical use the method that is based on the Poisson assumption is to be recommended, since it can be implemented by using standard statistical software. Finally we consider the design of serial dilution assays, concluding that it is important that neither the dilution factor nor the number of samples that remain untested should be too large.
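Under the Poisson assumption described above, an aliquot holding a fraction v of the original sample is non-infectious with probability exp(-m·v), where m is the number of infectious units in the sample, and the likelihood follows directly. A minimal sketch with hypothetical dilution data and a brute-force grid search standing in for a proper optimizer (not the paper's implementation):

```python
import math

def loglik(n_units, levels):
    """Log-likelihood under the Poisson assumption: an aliquot holding a
    fraction v of the original sample is non-infectious with probability
    exp(-n_units * v).  levels: list of (fraction v, n tested, n infectious)."""
    ll = 0.0
    for v, n, y in levels:
        p_neg = math.exp(-n_units * v)
        ll += y * math.log(1.0 - p_neg) + (n - y) * (-n_units * v)
    return ll

def mle(levels, grid):
    """Maximum-likelihood estimate by grid search (illustrative only)."""
    return max(grid, key=lambda m: loglik(m, levels))

# Hypothetical assay: fourfold dilutions, 5 aliquots per level
levels = [(0.1, 5, 5), (0.025, 5, 3), (0.00625, 5, 1)]
estimate = mle(levels, range(1, 500))
```

The same likelihood can be maximized with standard statistical software, which is the practical recommendation the paper arrives at.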
Abstract:
A fecal culture (Competitive Exclusion method, CE) used to prevent cecal colonization of poultry by Salmonella enterica serovar Enteritidis (SE) was subjected to serial cultivation to avoid the presence of pathogens and, after the most suitable treatment, was stored at refrigeration temperature for up to 63 days before use. The results showed that repeated cultivation up to 14 times does not impair the protective action of the culture (CE), which continues to inhibit cecal colonization by SE. The product subjected to 12 cultivations and stored for 28 days at refrigeration temperature also remains effective.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Background: An important challenge for transcript-counting methods such as Serial Analysis of Gene Expression (SAGE), "Digital Northern" or Massively Parallel Signature Sequencing (MPSS) is to carry out statistical analyses that account for the within-class variability, i.e., variability due to the intrinsic biological differences among sampled individuals of the same class, and not only variability due to technical sampling error. Results: We introduce a Bayesian model that accounts for the within-class variability by means of a mixture distribution. We show that the previously available approaches of aggregation in pools ("pseudo-libraries") and the Beta-Binomial model are particular cases of the mixture model. We illustrate our method with a brain tumor vs. normal comparison using SAGE data from public databases. We show examples of tags regarded as differentially expressed with high significance if the within-class variability is ignored, but clearly not so significant if one accounts for it. Conclusion: Using available information about biological replicates, one can transform a list of candidate transcripts showing differential expression into a more reliable one. Our method is freely available, under the GPL/GNU copyleft, through a user-friendly web-based online tool or as R language scripts at a supplemental website.
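The within-class variability idea can be illustrated with the Beta-Binomial case the abstract names as a special case of its mixture model. The parameter values below are hypothetical and only meant to show the extra spread relative to a plain Binomial with the same mean (this is not the authors' Bayesian model):

```python
from math import lgamma, log, exp, comb

def log_beta(a, b):
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def betabinom_pmf(k, n, a, b):
    """Binomial whose success probability is Beta(a, b)-distributed across
    replicates -- one way to model within-class (biological) variability."""
    return exp(log(comb(n, k)) + log_beta(k + a, n - k + b) - log_beta(a, b))

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

n = 100                 # tags sampled (tiny, for illustration)
a, b = 2.0, 18.0        # hypothetical: mean tag proportion 0.1, overdispersed
p = a / (a + b)         # plain Binomial with the same mean

var_bb = sum((k - n * p)**2 * betabinom_pmf(k, n, a, b) for k in range(n + 1))
var_b = sum((k - n * p)**2 * binom_pmf(k, n, p) for k in range(n + 1))
# var_bb is several times var_b: ignoring within-class variability
# understates the spread and inflates apparent significance
```

This is exactly the effect described above: tags that look highly significant under a technical-sampling-only model stop being so once biological replicate variability enters the variance.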
Abstract:
BACKGROUND Quantitative light intensity analysis of the strut core by optical coherence tomography (OCT) may enable assessment of changes in the light reflectivity of the bioresorbable polymeric scaffold from polymer to provisional matrix and connective tissues, with full disappearance and integration of the scaffold into the vessel wall. The aim of this report was to describe the methodology and to apply it to serial human OCT images post procedure and at 6, 12, 24 and 36 months in the ABSORB cohort B trial. METHODS AND RESULTS In serial frequency-domain OCT pullbacks, corresponding struts at different time points were identified by 3-dimensional foldout view. The peak and median values of light intensity were measured in the strut core by dedicated software. A total of 303 corresponding struts were serially analyzed at 3 time points. In the sequential analysis, peak light intensity increased gradually in the first 24 months after implantation and reached a plateau (relative difference with respect to baseline [%Dif]: 61.4% at 12 months, 115.0% at 24 months, 110.7% at 36 months), while the median intensity kept increasing at 36 months (%Dif: 14.3% at 12 months, 75.0% at 24 months, 93.1% at 36 months). CONCLUSIONS Quantitative light intensity analysis by OCT was capable of detecting subtle changes in the bioresorbable strut appearance over time, and could be used to monitor the bioresorption and integration process of polylactide struts.
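The %Dif values quoted above are relative differences with respect to the post-procedure baseline. For clarity, with a made-up intensity value:

```python
def pct_dif(value, baseline):
    """Relative difference with respect to baseline, in percent:
    %Dif = (value - baseline) / baseline * 100."""
    return (value - baseline) / baseline * 100.0

# A peak intensity 2.15 times its post-procedure baseline (made-up number)
# lands at %Dif = 115.0, the scale on which the 24-month value is reported.
d = pct_dif(2.15, 1.0)
```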
Abstract:
A new microtiter-plate dilution method was applied during the expedition ANTARKTIS-XI/2 with RV Polarstern to determine the distribution of copiotrophic and oligotrophic bacteria in the water columns at polar fronts. Twofold serial dilutions were performed with an eight-channel Electrapette in 96-well plates by mixing 150 µl of seawater with 150 µl of copiotrophic or oligotrophic Trypticase Broth, three times per well. After incubation of about 6 months at 2 °C, turbidities were measured with an eight-channel photometer at 405 nm, and combinations of positive test results for three consecutive dilutions were chosen and compared with a Most Probable Number (MPN) table calculated for 8 replicates and twofold serial dilutions. Densities of 12 to 661 cells/ml for copiotrophs and 1 to 39 cells/ml for oligotrophs were found. Colony Forming Units on copiotrophic Trypticase Agar were between 6 and 847 cells/ml, which is in the same range as determined with the MPN method.
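The tabulated MPN value the authors compare against can be approximated with Thomas's simple formula. A sketch with hypothetical well counts (not the expedition's data):

```python
import math

def thomas_mpn(wells):
    """Thomas's simple MPN approximation: MPN per ml = P / sqrt(N * T),
    where P = total positive wells, N = ml of sample in negative wells,
    T = ml of sample in all wells.
    wells: list of (ml of sample per well, n wells, n positive)."""
    P = sum(pos for _, _, pos in wells)
    N = sum(v * (n - pos) for v, n, pos in wells)
    T = sum(v * n for v, n, _ in wells)
    return P / math.sqrt(N * T)

# Hypothetical twofold series, 8 replicates per level, 0.15 ml of seawater per well
wells = [(0.15, 8, 8), (0.075, 8, 6), (0.0375, 8, 2)]
mpn = thomas_mpn(wells)   # cells/ml, close to the tabulated MPN value
```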
Abstract:
This paper presents a theoretical analysis and an optimization method for envelope amplifiers. Highly efficient envelope amplifiers based on a switching converter in parallel or in series with a linear regulator have been analyzed and optimized. The results of the optimization process are shown, and the two architectures are compared regarding their complexity and efficiency. The proposed optimization method is based on prior knowledge of the transmitted signal type (OFDM, WCDMA, ...), and it can be applied to any signal type as long as the envelope probability distribution is known. Finally, it is shown that the analyzed architectures have an inherent efficiency limit.
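The role of the envelope probability distribution can be illustrated by computing the average efficiency of the linear branch under an assumed Rayleigh envelope (a common model for OFDM amplitude). All values here are hypothetical and the paper's own optimization is not reproduced; this only shows why the envelope PDF is the key input:

```python
import math

def rayleigh_pdf(v, sigma):
    """Rayleigh density, a common model for an OFDM envelope amplitude."""
    return (v / sigma**2) * math.exp(-v**2 / (2.0 * sigma**2))

def avg_efficiency(vdd, sigma, steps=20000):
    """Average efficiency of a linear stage driving a resistive load from a
    fixed rail vdd: eta = E[v^2] / (vdd * E[v]), with expectations taken over
    the envelope distribution (midpoint-rule integration, envelope < vdd)."""
    dv = vdd / steps
    e_v = e_v2 = 0.0
    for i in range(steps):
        v = (i + 0.5) * dv
        w = rayleigh_pdf(v, sigma) * dv
        e_v += v * w
        e_v2 += v * v * w
    return e_v2 / (vdd * e_v)

eta = avg_efficiency(vdd=1.0, sigma=0.25)   # roughly 0.4 for these values
```

Changing the signal statistics (sigma, or the PDF itself) changes the achievable average efficiency, which is the inherent limit the abstract refers to.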
Abstract:
Large-scale gene expression studies can now be routinely performed on macro-amounts of cells, but it is unclear to what extent current methods are valuable for analyzing complex tissues. In the present study, we used the method of serial analysis of gene expression (SAGE) for quantitative mRNA profiling in the mouse kidney. We first performed SAGE at the whole-kidney level by sequencing 12,000 mRNA tags. Most abundant tags corresponded to transcripts widely distributed or enriched in the predominant kidney epithelial cells (proximal tubular cells), whereas transcripts specific for minor cell types were barely evidenced. To better explore such cells, we set up a SAGE adaptation for downsized extracts, enabling a 1,000-fold reduction of the amount of starting material. The potential of this approach was evaluated by studying gene expression in microdissected kidney tubules (50,000 cells). Specific gene expression profiles were obtained, and known markers (e.g., uromodulin in the thick ascending limb of Henle's loop and aquaporin-2 in the collecting duct) were found appropriately enriched. In addition, several enriched tags had no databank match, suggesting that they correspond to unknown or poorly characterized transcripts with specific tissue distribution. It is concluded that SAGE adaptation for downsized extracts makes possible large-scale quantitative gene expression measurements in small biological samples and will help to study the tissue expression and function of genes not evidenced with other high-throughput methods.
Abstract:
We describe a genome-wide characterization of mRNA transcript levels in yeast grown on the fatty acid oleate, determined using Serial Analysis of Gene Expression (SAGE). Comparison of this SAGE library with that reported for glucose-grown cells revealed the dramatic adaptive response of yeast to a change in carbon source. A major fraction (>20%) of the 15,000 mRNA molecules in a yeast cell comprised differentially expressed transcripts, which were derived from only 2% of the total number of ∼6300 yeast genes. Most of the mRNAs that were differentially expressed code for enzymes or for other proteins participating in metabolism (e.g., metabolite transporters). In oleate-grown cells, this was exemplified by the huge increase of mRNAs encoding the peroxisomal β-oxidation enzymes required for degradation of fatty acids. The data provide evidence for the existence of redox shuttles across organellar membranes that involve peroxisomal, cytoplasmic, and mitochondrial enzymes. We also analyzed the mRNA profile of a mutant strain with deletions of the PIP2 and OAF1 genes, encoding transcription factors required for induction of genes encoding peroxisomal proteins. Induction of genes under the immediate control of these factors was abolished; other genes were up-regulated, indicating an adaptive response to the changed metabolism imposed by the genetic impairment. We describe a statistical method for analysis of data obtained by SAGE.
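The paper's own statistical method is not reproduced here; as a generic illustration of how one tag's counts in two SAGE libraries might be compared, a two-proportion z-test with hypothetical counts:

```python
import math

def sage_z(count1, total1, count2, total2):
    """Two-proportion z statistic for one tag's abundance in two libraries
    (a generic sketch, not the statistical method described in the paper)."""
    p1, p2 = count1 / total1, count2 / total2
    p = (count1 + count2) / (total1 + total2)
    se = math.sqrt(p * (1.0 - p) * (1.0 / total1 + 1.0 / total2))
    return (p1 - p2) / se

# Hypothetical tag: 40 of 15,000 tags on oleate vs 8 of 15,000 on glucose
z = sage_z(40, 15000, 8, 15000)   # |z| > 1.96 flags the tag at the 5% level
```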
Abstract:
We have developed a technique called the generation of longer cDNA fragments from serial analysis of gene expression (SAGE) tags for gene identification (GLGI), to convert SAGE tags of 10 bases into their corresponding 3′ cDNA fragments covering hundreds of bases. A primer containing the 10-base SAGE tag is used as the sense primer, and a single-base-anchored oligo(dT) primer is used as an antisense primer in PCR, together with Pfu DNA polymerase. By using this approach, a cDNA fragment extending from the SAGE tag toward the 3′ end of the corresponding sequence can be generated. Application of the GLGI technique can solve two critical issues in applying the SAGE technique: one is that a longer fragment corresponding to a SAGE tag, which has no match in databases, can be generated for further studies; the other is that the specific fragment corresponding to a SAGE tag can be identified from multiple sequences that match the same SAGE tag. The development of the GLGI method provides several potential applications. First, it provides a strategy for even wider application of the SAGE technique for quantitative analysis of global gene expression. Second, a combined application of SAGE/GLGI can be used to complete the catalogue of the expressed genes in human and in other eukaryotic species. Third, it can be used to identify the 3′ cDNA sequence from any exon within a gene. It can also be used to confirm the reality of exons predicted by bioinformatic tools in genomic sequences. Fourth, a combined application of SAGE/GLGI can be applied to define the 3′ boundary of expressed genes in the genomic sequences in human and in other eukaryotic genomes.
Abstract:
The invasive signal amplification reaction has been previously developed for quantitative detection of nucleic acids and discrimination of single-nucleotide polymorphisms. Here we describe a method that couples two invasive reactions into a serial isothermal homogeneous assay using fluorescence resonance energy transfer detection. The serial version of the assay generates more than 10⁷ reporter molecules for each molecule of target DNA in a 4-h reaction; this sensitivity, coupled with the exquisite specificity of the reaction, is sufficient for direct detection of less than 1,000 target molecules with no prior target amplification. Here we present a kinetic analysis of the parameters affecting signal and background generation in the serial invasive signal amplification reaction and describe a simple kinetic model of the assay. We demonstrate the ability of the assay to detect as few as 600 copies of the methylene tetrahydrofolate reductase gene in samples of human genomic DNA. We also demonstrate the ability of the assay to discriminate single base differences in this gene by using 20 ng of human genomic DNA.
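Serially coupling two linear (isothermal) amplification reactions yields quadratic signal growth, which is one way to rationalize the reported >10⁷ reporters per target in 4 h. A toy model, with hypothetical rate constants chosen only to match that order of magnitude (not the paper's fitted kinetic model):

```python
def reporters(targets, k1, k2, t):
    """Two serially coupled linear amplifications: the first reaction makes
    invader molecules at rate k1 per target, each invader drives reporter
    production at rate k2, so R(t) = k1 * k2 * targets * t**2 / 2."""
    return 0.5 * k1 * k2 * targets * t**2

# Hypothetical turnover of 20 cleavages/min per template in each reaction:
r = reporters(targets=1, k1=20.0, k2=20.0, t=240.0)   # 4 h in minutes
# r exceeds 1e7, the order of magnitude reported for the serial assay
```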
Abstract:
"Serial no. 105-7."
Abstract:
A new control algorithm using a parallel braking resistor (BR) and a serial fault current limiter (FCL) for power system transient stability enhancement is presented in this paper. The proposed control algorithm can prevent transient instability during the first swing by immediately removing the transient energy gained during the faulted period. It can also reduce the generator oscillation time and efficiently return the system to the post-fault equilibrium. The algorithm uses a new system-energy-function-based method to choose the optimal switching point. The parallel BR and serial FCL resistor can be switched at the calculated optimal point to obtain the best control result. This method allows optimum dissipation of the transient energy caused by the disturbance, returning the system to equilibrium in minimum time. Case studies are given to verify the efficiency and effectiveness of the new control algorithm.
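The switching idea — inserting the braking resistor while the rotor is gaining transient energy — can be sketched with a toy one-machine swing-equation simulation. All parameters are illustrative, and a naive "switch while accelerating (ω > 0)" rule stands in for the paper's energy-function-based optimal switching point:

```python
import math

def peak_swing(braking, dt=0.01, steps=4000, tail=1000):
    """Semi-implicit Euler simulation of a one-machine swing equation with an
    optional braking resistor inserted while the rotor accelerates (omega > 0).
    Returns the peak speed deviation over the final `tail` steps.
    All parameters are illustrative, not taken from the paper."""
    M, Pm, Pmax, Pbr = 0.1, 0.8, 1.5, 0.3
    delta = math.asin(Pm / Pmax)   # pre-fault equilibrium angle
    omega = 2.0                    # speed deviation gained during the fault
    peak = 0.0
    for i in range(steps):
        pbr = Pbr if (braking and omega > 0.0) else 0.0
        accel = (Pm - Pmax * math.sin(delta) - pbr) / M
        omega += accel * dt
        delta += omega * dt
        if i >= steps - tail:
            peak = max(peak, abs(omega))
    return peak

damped = peak_swing(braking=True)
undamped = peak_swing(braking=False)   # lossless model: the swing never settles
```

Dissipating power only while ω > 0 drains the transient energy on each forward swing, so the controlled machine settles while the uncontrolled one keeps oscillating at full amplitude.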