934 results for Real Root Isolation Methods


Relevance:

30.00%

Publisher:

Abstract:

One approach to microbial genotyping is to make use of sets of single-nucleotide polymorphisms (SNPs) in combination with binary markers. Here we report the modification and automation of a SNP-plus-binary-marker-based approach to the genotyping of Staphylococcus aureus and its application to 391 S. aureus isolates from southeast Queensland, Australia. The SNPs used were arcC210, tpi243, arcC162, gmk318, pta294, tpi36, tpi241, and pta383. These provide a Simpson's index of diversity (D) of 0.95 with respect to the S. aureus multilocus sequence typing database and define 61 genotypes and the major clonal complexes. The binary markers used were pvl, cna, sdrE, pT181, and pUB110. Two novel real-time PCR formats for interrogating these markers were compared. One of these makes use of light upon extension (LUX) primers and biplexed reactions, while the other is a streamlined modification of kinetic PCR using SYBR green. The latter format proved to be more robust. In addition, automated methods for DNA template preparation, reaction setup, and data analysis were developed. A single SNP-based method for ST-93 (Queensland clone) identification was also devised. The genotyping revealed the numerical importance of the South West Pacific and Queensland community-acquired methicillin-resistant S. aureus (MRSA) clones and the clonal complex 239 Aus-1/Aus-2 hospital-associated MRSA. There was a strong association between the community-acquired clones and pvl.
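The abstract reports a Simpson's index of diversity (D) of 0.95 for the eight-SNP set. As an illustration, the index can be computed from genotype counts as follows; the isolate labels below are hypothetical toy data, not the Queensland dataset.

```python
from collections import Counter

def simpsons_diversity(genotypes):
    """Simpson's index of diversity: D = 1 - sum(n_i*(n_i-1)) / (N*(N-1)),
    where n_i is the count of isolates with genotype i and N is the total."""
    counts = Counter(genotypes)
    n = sum(counts.values())
    if n < 2:
        raise ValueError("need at least two isolates")
    return 1 - sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

# Hypothetical toy data: 6 isolates across 4 SNP-defined genotypes.
isolates = ["ST-93", "ST-93", "ST-239", "ST-239", "ST-30", "ST-1"]
print(round(simpsons_diversity(isolates), 3))  # → 0.867
```

D approaches 1 as genotypes become more evenly spread, which is why a value of 0.95 over 61 genotypes indicates high discriminatory power.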

Relevance:

30.00%

Publisher:

Abstract:

Aim: To rapidly quantify hepatitis B virus (HBV) DNA by real-time PCR using an efficient TaqMan probe and virus DNA extraction methods. Methods: Three standards were prepared by cloning PCR products targeting the S, C and X regions of the HBV genome into the pGEM-T vector. A primer pair and matched TaqMan probe were selected by comparing, for the same serum DNA, the copy numbers and Ct values of HBV serum samples derived from the three standard curves. The efficiency of six HBV DNA extraction methods (guanidinium isothiocyanate, proteinase K, NaI, NaOH lysis, alkaline lysis and simple boiling) was then analyzed in samples A, B and C by real-time PCR. Meanwhile, 8 clinical HBV serum samples were quantified. Results: The copy number of the same HBV serum sample derived from the standard curves of the S, C and X regions was 5.7 × 10⁴/mL, 6.3 × 10²/mL and 1.6 × 10³/mL respectively; the corresponding Ct values were 26.6, 31.8 and 29.5. Primers and the matched probe from the S region were therefore chosen for further comparison of the six extraction methods. With NaOH lysis, the most efficient of the six methods, the copy numbers of HBV serum samples A, B and C were 3.49 × 10⁹/mL, 2.08 × 10⁶/mL and 4.40 × 10⁷/mL respectively, with Ct values of 19.9, 30 and 26.2. Simple boiling showed a slightly lower efficiency than NaOH lysis. With guanidinium isothiocyanate, proteinase K and NaI, the copy numbers of samples A, B and C were around 10⁵/mL, with Ct values of about 30. Alkaline lysis failed to quantify the copy number of the three HBV serum samples. The standard deviation (SD) and coefficient of variation (CV) were very low in all 8 clinical HBV serum samples, showing that quantification of HBV DNA in triplicate was reliable and accurate.
Conclusion: Real-time PCR based on optimized primers and a TaqMan probe from the S region, in combination with NaOH lysis, is a simple, rapid and accurate method for quantification of HBV serum DNA. © 2006 The WJG Press. All rights reserved.
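The comparison of standard curves above rests on the usual real-time PCR relationship between Ct and starting copy number. A minimal sketch of that arithmetic is given below; the dilution series and Ct values are invented for illustration, not taken from the paper.

```python
def fit_standard_curve(log10_copies, cts):
    """Least-squares fit of Ct = slope * log10(copies) + intercept.
    For an efficient reaction the slope is close to -3.32 (10-fold
    dilution per ~3.32 cycles)."""
    n = len(cts)
    mx = sum(log10_copies) / n
    my = sum(cts) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, cts))
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    slope = sxy / sxx
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to estimate starting copies."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical 10-fold dilution series of an S-region plasmid standard.
dilutions = [8, 7, 6, 5, 4, 3]                    # log10 copies/mL
cts       = [13.0, 16.3, 19.6, 23.1, 26.4, 29.8]  # measured Ct (made up)
slope, intercept = fit_standard_curve(dilutions, cts)
print(round(copies_from_ct(26.6, slope, intercept)))
```

A lower Ct at the same input (as seen for the S-region assay) maps to a higher estimated copy number, which is why that primer/probe pair was chosen.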

Relevance:

30.00%

Publisher:

Abstract:

Enterovirus 71 (EV71) is one of the main causative agents of hand, foot and mouth disease (HFMD) in young children. Infections caused by EV71 can lead to many complications, ranging from brainstem encephalitis to pulmonary oedema, and result in high mortality. Rapid detection of the virus is therefore required so that measures can be implemented to prevent widespread transmission. Based on primers and probes targeting the VP1 region, a real-time reverse-transcriptase polymerase chain reaction (RT-PCR) hybridization probe assay was developed for specific detection of EV71 in clinical specimens. Quantitative analysis showed that the assay was able to detect as few as 5 EV71 viral copies, and EV71 was detected in 46 of the 55 clinical specimens obtained from pediatric patients suffering from HFMD during the period from 2000 to 2003 in Singapore. The single-tube real-time RT-PCR assay developed here can thus be applied as a rapid and sensitive method for specific detection of EV71 directly from clinical specimens. (c) 2005 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Real-time software systems are rarely developed once and left to run. They are subject to changing requirements as the applications they support expand, and they commonly outlive the platforms they were designed to run on. A successful real-time system is duplicated and adapted to a variety of applications; it becomes a product line. Current methods for real-time software development are commonly based on low-level programming languages and involve considerable duplication of effort when a similar system is to be developed or the hardware platform changes. To provide more dependable, flexible and maintainable real-time systems at a lower cost, a platform-independent approach to real-time systems development is needed. The development process is composed of two phases: a platform-independent phase, which defines the desired system behaviour and develops a platform-independent design and implementation, and a platform-dependent phase, which maps the implementation onto the target platform. The latter phase should be highly automated. For critical systems, assessing dependability is crucial, so the partitioning into platform-dependent and platform-independent phases has to support verification of system properties through both phases.

Relevance:

30.00%

Publisher:

Abstract:

Terrain can be approximated by a triangular mesh consisting of millions of 3D points. Multiresolution triangular mesh (MTM) structures are designed to support applications that use terrain data at variable levels of detail (LOD). Typically, an MTM adopts a tree structure in which a parent node represents a lower-resolution approximation of its descendants. Given a region of interest (ROI) and a LOD, retrieving the required terrain data from the database amounts to traversing the MTM tree from the root to reach all the nodes satisfying the ROI and LOD conditions. This process, while commonly used for multiresolution terrain visualization, is inefficient, as it incurs either a large number of sequential I/O operations or the fetching of a large amount of extraneous data. Various spatial indexes have been proposed to address this problem; however, level-by-level tree traversal remains common practice because it yields topological information among the retrieved terrain data. A new MTM data structure called the direct mesh is proposed. We demonstrate that with the direct mesh the amount of data retrieved can be substantially reduced. Compared with existing MTM indexing methods, a significant performance improvement has been observed for real-life terrain data.
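The root-to-leaf traversal that the abstract describes as the baseline can be sketched as follows. The node layout (`bbox`, `level`, `children`) is an assumed simplification of an MTM tree, not the paper's direct-mesh structure.

```python
from dataclasses import dataclass, field

@dataclass
class MTMNode:
    bbox: tuple        # (xmin, ymin, xmax, ymax) covered by this node
    level: int         # 0 = root (coarsest), increasing with resolution
    children: list = field(default_factory=list)

def intersects(b1, b2):
    """Axis-aligned bounding-box overlap test."""
    return not (b1[2] < b2[0] or b2[2] < b1[0] or
                b1[3] < b2[1] or b2[3] < b1[1])

def retrieve(node, roi, lod):
    """Level-by-level traversal: collect nodes overlapping the ROI at the
    requested LOD, descending only through subtrees that touch the ROI."""
    if not intersects(node.bbox, roi):
        return []
    if node.level == lod or not node.children:
        return [node]
    out = []
    for child in node.children:
        out.extend(retrieve(child, roi, lod))
    return out

# Toy two-level tree: the root splits into a west and an east half.
root = MTMNode((0, 0, 4, 4), 0, [MTMNode((0, 0, 2, 4), 1),
                                 MTMNode((2, 0, 4, 4), 1)])
print(len(retrieve(root, (0, 0, 1, 1), 1)))  # → 1 (west half only)
```

Each tree level touched is a potential separate I/O round trip, which is exactly the cost the direct mesh is designed to avoid.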

Relevance:

30.00%

Publisher:

Abstract:

The importance of comparable real income aggregates and their components to applied economic research is highlighted by the popularity of the Penn World Tables. Any methodology designed to achieve such a task requires the combination of data from several sources. The first is purchasing power parity (PPP) data, available from the International Comparisons Project roughly every five years since the 1970s. The second is national-level data on a range of variables that explain the behaviour of the ratio of PPP to market exchange rates. The final source is the national accounts publications of different countries, which include estimates of gross domestic product and various price deflators. In this paper we present a method to construct a consistent panel of comparable real incomes by specifying the problem in state-space form. We present our completed work as well as briefly indicating our work in progress.
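For readers unfamiliar with the term, a generic linear Gaussian state-space form is shown below. The symbols are illustrative, not the paper's exact specification; the latent state could, for instance, collect the unobserved PPP-to-exchange-rate ratios between benchmark years.

```latex
% Observation equation: sparse benchmark data y_t load on the state.
y_t = Z_t \alpha_t + \varepsilon_t, \qquad \varepsilon_t \sim N(0, H_t)
% Transition equation: the state evolves between observation dates.
\alpha_{t+1} = T_t \alpha_t + \eta_t, \qquad \eta_t \sim N(0, Q_t)
```

Casting the problem this way lets standard filtering and smoothing machinery interpolate a consistent panel between the roughly five-yearly PPP benchmarks.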

Relevance:

30.00%

Publisher:

Abstract:

Real-time control programs are often used in contexts where (conceptually) they run forever. Repetitions within such programs (or their specifications) may either (i) be guaranteed to terminate, (ii) be guaranteed to never terminate (loop forever), or (iii) may possibly terminate. In dealing with real-time programs and their specifications, we need to be able to represent these possibilities, and define suitable refinement orderings. A refinement ordering based on Dijkstra's weakest precondition only copes with the first alternative. Weakest liberal preconditions allow one to constrain behaviour provided the program terminates, which copes with the third alternative to some extent. However, neither of these handles the case when a program does not terminate. To handle this case a refinement ordering based on relational semantics can be used. In this paper we explore these issues and the definition of loops for real-time programs as well as corresponding refinement laws.
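The relationship between the two predicate transformers mentioned above is the standard one due to Dijkstra: the weakest precondition demands termination, while the weakest liberal precondition does not.

```latex
% wp(S, Q): S terminates and establishes Q.
% wlp(S, Q): if S terminates, it establishes Q (nontermination allowed).
wp(S, Q) \;\equiv\; wlp(S, Q) \,\wedge\, wp(S, \mathit{true})
```

This identity makes the gap precise: wlp alone cannot distinguish a loop that never terminates from one that may, which is why the paper turns to a relational semantics for alternative (ii).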

Relevance:

30.00%

Publisher:

Abstract:

This work is an exploratory study of the Internet as a communication environment, investigating both the possible influence of its interaction/communication tools on sexual and risk behaviour and the development of compulsive behaviour in the use of these tools to seek sexual partners. The methodology adopted, in addition to bibliographic research, is exploratory: a survey and analysis of quantitative data that can be considered as belonging to the traditional empirical paradigm, since data collection was based on responses to semi-structured questionnaires administered to an information group of 428 university students from Computing and Informatics courses at a private higher-education institution in the city of São Paulo, SP, Brazil. The study complies with Resolution 196/96 of the National Health Council (CNS) and used informed consent forms (TCLE). The results indicate that sexual practices, exposure to STDs and HIV and, in particular, the tendency to develop Internet Addiction Disorder are clearly distinguishable. Participants who reported seeking real sexual partners on the Internet are direct in their objectives: when they find such a partner they consummate the sexual act in impersonal settings, such as motels, and often in a risky manner with regard to prevention and safety in contact with the other. It is also noteworthy that the compulsion is not recognized by the group and that, for this group, the search for partners through digital media is not related to negative aspects of their quality of life, which calls for further study and deeper discussion of the interaction between communication, sex and the Internet.

Relevance:

30.00%

Publisher:

Abstract:

This accessible, practice-oriented and compact text provides a hands-on introduction to the principles of market research. Using the market research process as a framework, the authors explain how to collect and describe the necessary data and present the most important and frequently used quantitative analysis techniques, such as ANOVA, regression analysis, factor analysis, and cluster analysis. An explanation is provided of the theoretical choices a market researcher has to make with regard to each technique, as well as how these are translated into actions in IBM SPSS Statistics. This includes a discussion of what the outputs mean and how they should be interpreted from a market research perspective. Each chapter concludes with a case study that illustrates the process based on real-world data. A comprehensive web appendix includes additional analysis techniques, datasets, video files and case studies. Several mobile tags in the text allow readers to quickly browse related web content using a mobile device.

Relevance:

30.00%

Publisher:

Abstract:

Background & Aims: Current models of visceral pain processing derived from metabolic brain imaging techniques fail to differentiate between exogenous (stimulus-dependent) and endogenous (non-stimulus-specific) neural activity. The aim of this study was to determine the spatiotemporal correlates of exogenous neural activity evoked by painful esophageal stimulation. Methods: In 16 healthy subjects (8 men; mean age, 30.2 ± 2.2 years), we recorded magnetoencephalographic responses to 2 runs of 50 painful esophageal electrical stimuli originating from 8 brain subregions. Subsequently, 11 subjects (6 men; mean age, 31.2 ± 1.8 years) had esophageal cortical evoked potentials recorded on a separate occasion by using similar experimental parameters. Results: Earliest cortical activity (P1) was recorded in parallel in the primary/secondary somatosensory cortex and posterior insula (∼85 ms). Significantly later activity was seen in the anterior insula (∼103 ms) and cingulate cortex (∼106 ms; P = .0001). There was no difference between the P1 latency for magnetoencephalography and cortical evoked potential (P = .16); however, neural activity recorded with cortical evoked potential was longer than with magnetoencephalography (P = .001). No sex differences were seen for psychophysical or neurophysiological measures. Conclusions: This study shows that exogenous cortical neural activity evoked by experimental esophageal pain is processed simultaneously in somatosensory and posterior insula regions. Activity in the anterior insula and cingulate - brain regions that process the affective aspects of esophageal pain - occurs significantly later than in the somatosensory regions, and no sex differences were observed with this experimental paradigm. Cortical evoked potential reflects the summation of cortical activity from these brain regions and has sufficient temporal resolution to separate exogenous and endogenous neural activity. 
© 2005 by the American Gastroenterological Association.

Relevance:

30.00%

Publisher:

Abstract:

In this paper we describe a novel, extensible visualization system currently under development at Aston University. We introduce modern programming methods, such as data-driven programming, design patterns, and the careful definition of interfaces to allow easy extension via plug-ins, to 3D landscape visualization software. We combine this with modern developments in computer graphics, such as vertex and fragment shaders, to create an extremely flexible, extensible, real-time, near-photorealistic visualization system. We show the design of the system and its main sub-components, stress the role of modern programming practices, and illustrate the benefits these bring to 3D visualization. © 2006 Springer-Verlag Berlin Heidelberg.

Relevance:

30.00%

Publisher:

Abstract:

Liposomes have been imaged using a plethora of techniques, yet few of these methods offer the ability to study these systems in their natural hydrated state without drying, staining, and fixation of the vesicles. Imaging a liposome in its hydrated state is the ideal scenario for visualizing these dynamic lipid structures, and environmental scanning electron microscopy (ESEM), with its ability to image wet systems without prior sample preparation, offers potential advantages over the above methods. In our studies, we have used ESEM not only to investigate the morphology of liposomes and niosomes but also to dynamically follow the changes in structure of lipid films and liposome suspensions as water condenses onto or evaporates from the sample. In particular, changes in liposome morphology were studied using ESEM in real time to investigate the resistance of liposomes to coalescence during dehydration, thereby providing an alternative assay of liposome formulation and stability. Based on this protocol, we have also studied niosome-based systems and cationic liposome/DNA complexes. Copyright © Informa Healthcare.

Relevance:

30.00%

Publisher:

Abstract:

The aim of this thesis is to present numerical investigations of the polarisation mode dispersion (PMD) effect. Outstanding issues in the numerical implementation of PMD are resolved, and the proposed methods are further optimized for computational efficiency and physical accuracy. Methods for the mitigation of the PMD effect are taken into account, and simulations of transmission systems with added PMD are presented. The work on PMD proceeds as follows. First, the widely used coarse-step method for simulating the PMD phenomenon, as well as a method derived from the Manakov-PMD equation, are implemented and investigated separately through the distribution of the state of polarisation on the Poincaré sphere and the evolution of the dispersion of a signal. Next, these two methods are statistically examined and compared with well-known analytical models of the probability density function (PDF) and the autocorrelation function (ACF) of the PMD phenomenon. Important optimisations are achieved at the computational level for each of the aforementioned implementations. In addition, the ACF of the coarse-step method is considered separately, based on the result that the numerically produced ACF exaggerates the correlation between different frequencies. The mitigation of the PMD phenomenon is then considered, in the form of numerically implemented low-PMD spun fibres. Finally, all the above are combined in simulations that demonstrate the impact of PMD on the quality factor (Q-factor) of different transmission systems. For this, a numerical solver based on the coupled nonlinear Schrödinger equation is created, which is also tested against the most important transmission impairments in the early chapters of this thesis.
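The statistics that the coarse-step method reproduces can be illustrated with a minimal sketch: each fibre section contributes a fixed-length PMD vector in a random direction in Stokes space, so the differential group delay (DGD) of the concatenation performs a 3D random walk. This is a simplified statistical toy, not the thesis's full Jones-matrix implementation; segment length and count are invented.

```python
import math
import random

def dgd_coarse_step(n_segments, dtau, rng):
    """Sum randomly oriented per-segment PMD vectors in Stokes space
    and return |tau|, the DGD of the concatenated fibre."""
    tx = ty = tz = 0.0
    for _ in range(n_segments):
        # Uniform random direction on the unit sphere.
        z = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * math.pi)
        s = math.sqrt(1.0 - z * z)
        tx += dtau * s * math.cos(phi)
        ty += dtau * s * math.sin(phi)
        tz += dtau * z
    return math.sqrt(tx * tx + ty * ty + tz * tz)

rng = random.Random(1)
samples = [dgd_coarse_step(100, 0.1, rng) for _ in range(2000)]
mean_dgd = sum(samples) / len(samples)
# For many segments the DGD becomes Maxwellian-distributed with mean
# sqrt(8/(3*pi)) * dtau * sqrt(N) ≈ 0.92 here; the sqrt(N) growth is
# the hallmark of PMD accumulation.
print(round(mean_dgd, 2))
```

The Maxwellian PDF that the simulated histogram approaches is exactly the analytical model against which the thesis benchmarks its implementations.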

Relevance:

30.00%

Publisher:

Abstract:

The rapid global loss of biodiversity has led to a proliferation of systematic conservation planning methods. In spite of their utility and mathematical sophistication, these methods only provide approximate solutions to real-world problems where there is uncertainty and temporal change. The consequences of errors in these solutions are seldom characterized or addressed. We propose a conceptual structure for exploring the consequences of input uncertainty and oversimplified approximations to real-world processes for any conservation planning tool or strategy. We then present a computational framework based on this structure to quantitatively model species representation and persistence outcomes across a range of uncertainties. These include factors such as land costs, landscape structure, species composition and distribution, and temporal changes in habitat. We demonstrate the utility of the framework using several reserve selection methods including simple rules of thumb and more sophisticated tools such as Marxan and Zonation. We present new results showing how outcomes can be strongly affected by variation in problem characteristics that are seldom compared across multiple studies. These characteristics include number of species prioritized, distribution of species richness and rarity, and uncertainties in the amount and quality of habitat patches. We also demonstrate how the framework allows comparisons between conservation planning strategies and their response to error under a range of conditions. Using the approach presented here will improve conservation outcomes and resource allocation by making it easier to predict and quantify the consequences of many different uncertainties and assumptions simultaneously. Our results show that without more rigorously generalizable results, it is very difficult to predict the amount of error in any conservation plan.
These results imply the need for standard practice to include evaluating the effects of multiple real-world complications on the behavior of any conservation planning method.
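One of the "simple rules of thumb" the abstract contrasts with Marxan and Zonation is greedy richness-based site selection, sketched below. The occurrence data and representation targets are hypothetical; this is the generic greedy set-cover heuristic, not either tool's algorithm.

```python
def greedy_reserve_selection(sites, targets):
    """Pick sites until every species meets its representation target.
    Rule of thumb: at each step choose the site covering the most
    still-unmet species (a classic greedy set-cover heuristic)."""
    unmet = {sp: t for sp, t in targets.items() if t > 0}
    chosen = []
    remaining = dict(sites)
    while unmet and remaining:
        best = max(remaining, key=lambda s: len(remaining[s] & set(unmet)))
        gained = remaining.pop(best) & set(unmet)
        if not gained:
            break  # no remaining site helps; targets unreachable
        chosen.append(best)
        for sp in gained:
            unmet[sp] -= 1
            if unmet[sp] <= 0:
                del unmet[sp]
    return chosen, unmet

# Hypothetical occurrence data: site -> set of species present.
sites = {"A": {"frog", "orchid"}, "B": {"frog"},
         "C": {"owl", "orchid"}, "D": {"owl"}}
targets = {"frog": 1, "orchid": 1, "owl": 1}
chosen, unmet = greedy_reserve_selection(sites, targets)
print(chosen, unmet)  # → ['A', 'C'] {}
```

A framework like the one the paper proposes would perturb the inputs here (costs, occurrences, habitat quality) and measure how far such a heuristic's outcome degrades relative to the optimization-based tools.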

Relevance:

30.00%

Publisher:

Abstract:

Inference and optimization of real-value edge variables in sparse graphs are studied using the Bethe approximation and replica method of statistical physics. Equilibrium states of general energy functions involving a large set of real edge variables that interact at the network nodes are obtained in various cases. When applied to the representative problem of network resource allocation, efficient distributed algorithms are also devised. Scaling properties with respect to the network connectivity and the resource availability are found, and links to probabilistic Bayesian approximation methods are established. Different cost measures are considered and algorithmic solutions in the various cases are devised and examined numerically. Simulation results are in full agreement with the theory. © 2007 The American Physical Society.