939 results for GNSS, three carrier ambiguity resolution, real time kinematic, decimetre positioning
Abstract:
For the past several decades, real-time embedded systems have grown tremendously in both scale and scope, thanks largely to advances in IC technology. However, the traditional approach of boosting performance by increasing CPU frequency is a thing of the past; researchers in both industry and academia are turning to multi-core architectures for continued improvement of computing performance. In our research, we seek to develop efficient scheduling algorithms and analysis methods for the design of real-time embedded systems on multi-core platforms. Real-time systems are those in which response time is as critical as the logical correctness of the computational results. In addition, a variety of stringent constraints, such as power/energy consumption, peak temperature and reliability, are imposed on these systems. Real-time scheduling therefore plays a critical role in the system-level design of such computing systems. We began by addressing timing constraints for real-time applications on multi-core platforms, and developed both partitioned and semi-partitioned scheduling algorithms for fixed-priority, periodic, hard real-time tasks. We then extended our research to temperature constraints: we derived a closed-form solution that captures the temperature dynamics of a given periodic voltage schedule on multi-core platforms, and developed three methods to check the feasibility of a periodic real-time schedule under a peak temperature constraint. We further incorporated the power/energy constraint, with thermal awareness, into our research problem: we investigated the energy estimation problem on multi-core platforms and developed a computationally efficient method to calculate the energy consumption of a given voltage schedule. In this dissertation, we present our research in detail and demonstrate the effectiveness and efficiency of our approaches with extensive experimental results.
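A minimal sketch of the kind of partitioned fixed-priority scheduling this abstract refers to, assuming rate-monotonic priorities and the Liu & Layland utilization bound as the per-core admission test (an assumption made here for illustration; the dissertation's own algorithms and schedulability analyses are more sophisticated):

```python
# Partitioned scheduling sketch: first-fit decreasing by utilization, with the
# Liu & Layland rate-monotonic bound as the per-core admission test.
def rm_bound(n):
    """Liu & Layland utilization bound for n tasks under rate-monotonic priorities."""
    return n * (2 ** (1.0 / n) - 1)

def first_fit_partition(tasks, num_cores):
    """tasks: list of (wcet, period). Returns a core index per task, or None."""
    cores = [[] for _ in range(num_cores)]
    placed = [None] * len(tasks)
    order = sorted(range(len(tasks)),
                   key=lambda i: tasks[i][0] / tasks[i][1], reverse=True)
    for i in order:
        c, t = tasks[i]
        for k, core in enumerate(cores):
            if sum(ci / ti for ci, ti in core) + c / t <= rm_bound(len(core) + 1):
                core.append((c, t))
                placed[i] = k
                break
        else:
            return None          # heuristic failed to place this task
    return placed

print(first_fit_partition([(1, 4), (2, 6), (3, 12), (1, 8)], num_cores=2))
```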
Abstract:
The use of serious games in education and their pedagogical benefit is being widely recognized. However, effective integration of serious games in education depends on addressing two big challenges: the successful incorporation of motivation and engagement that can lead to learning, and the highly specialised skills associated with customised development to meet the required pedagogical objectives. This paper presents the Westminster Serious Games Platform (wmin-SGP), an authoring tool that allows educators/domain experts without games design and development technical skills to create bespoke roleplay simulations in three-dimensional scenes, featuring fully embodied virtual humans capable of verbal and non-verbal interaction with users, fit for specific educational objectives. The paper presents the wmin-SGP system architecture and evaluates its effectiveness via the implementation of two roleplay simulations, one for Politics and one for Law. In addition, it presents the results of two types of evaluation addressing how successfully the wmin-SGP combines usability principles with the game core drives of the Octalysis gamification framework that lead to motivating game experiences. The evaluation results show that the wmin-SGP: provides an intuitive environment and tools that allow users without advanced technical skills to create bespoke roleplay simulations in real time with advanced graphical interfaces; satisfies most of the usability principles; and provides balanced simulations based on the Octalysis framework core drives. The paper concludes with a discussion of future extensions of this real-time authoring tool and directions for further development of the Octalysis framework to address learning.
Abstract:
An array of Bio-Argo floats equipped with radiometric sensors has recently been deployed in various open-ocean areas representative of the diversity of trophic and bio-optical conditions prevailing in so-called Case 1 waters. Around solar noon and almost every day, each float acquires 0-250 m vertical profiles of Photosynthetically Available Radiation and downward irradiance at three wavelengths (380, 412 and 490 nm). To date, more than 6500 profiles have been acquired for each radiometric channel. As these radiometric data are collected without operator control and regardless of meteorological conditions, specific and automatic data processing protocols have to be developed. Here, we present a data quality-control procedure aimed at verifying profile shapes and providing near real-time data distribution. This procedure is specifically designed to: 1) identify the main measurement issues (dark signal, atmospheric clouds, spikes and wave-focusing occurrences); 2) validate the final data with a hierarchy of tests to ensure their scientific utility. The procedure, adapted to each of the four radiometric channels, flags each profile in a way compliant with the data management procedure used by the Argo program. The new protocols identify the main perturbations in the light field with good performance over the whole dataset, which highlights their potential applicability at the global scale. Finally, comparison with modeled surface irradiances allows the accuracy of quality-controlled irradiance measurements to be assessed and any possible evolution over the float lifetime due to biofouling or instrumental drift to be identified.
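Purely as an illustration of the kind of automatic test such a quality-control chain applies, here is a small sketch that flags spikes in a single irradiance profile by comparing each sample with the median of its neighbours; the window size and threshold are placeholder values, not those of the Argo data-management procedure:

```python
# Illustrative spike-flagging test for a downward-irradiance profile.
import numpy as np

def flag_spikes(ed, half_window=2, rel_threshold=0.5):
    """Return a boolean mask; True marks samples deviating from the median of
    their neighbours by more than rel_threshold (relative deviation)."""
    ed = np.asarray(ed, dtype=float)
    flags = np.zeros(ed.size, dtype=bool)
    for i in range(ed.size):
        lo, hi = max(0, i - half_window), min(ed.size, i + half_window + 1)
        neighbours = np.delete(ed[lo:hi], i - lo)   # window around i, excluding i
        median = np.median(neighbours)
        flags[i] = abs(ed[i] - median) > rel_threshold * abs(median)
    return flags

depth = np.arange(0.0, 250.0, 1.0)
ed = np.exp(-0.04 * depth)            # idealised exponential decay with depth
ed[100] *= 5.0                        # injected spike
print(np.where(flag_spikes(ed))[0])   # -> [100]
```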
Abstract:
FPGAs and GPUs are often used when real-time performance in video processing is required. An accelerated processor is chosen based on task-specific priorities (power consumption, processing time and detection accuracy), and this decision is normally made once at design time. All three characteristics are important, particularly in battery-powered systems. Here we propose a method for moving the selection of processing platform from a single design-time choice to a continuous run-time one. We implement Histogram of Oriented Gradients (HOG) detectors for cars and people and Mixture of Gaussians (MoG) motion detectors running across FPGA, GPU and CPU in a heterogeneous system, and we use this to detect illegally parked vehicles in urban scenes. Power, time and accuracy information for each detector is characterised. An anomaly measure is assigned to each detected object based on its trajectory and location, compared to learned contextual movement patterns. This drives processor and implementation selection, so that scenes with high behavioural anomalies are processed with faster but more power-hungry implementations, while routine or static time periods are processed with power-optimised, less accurate, slower versions. Real-time performance is evaluated on video datasets including i-LIDS. Compared to power-optimised static selection, automatic dynamic implementation mapping is 10% more accurate but draws 12 W extra power in our testbed desktop system.
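As a rough, illustrative sketch of the run-time mapping idea (the implementation names, figures and the selection rule below are invented placeholders, not the characterisation reported in the paper):

```python
# Map an anomaly score to one of several detector implementations that were
# characterised off-line by power, latency and accuracy (placeholder values).
from dataclasses import dataclass

@dataclass
class Impl:
    name: str
    power_w: float      # measured average power draw
    latency_ms: float   # per-frame processing time
    accuracy: float     # detection accuracy on a benchmark

IMPLS = [
    Impl("CPU-MoG",  5.0, 120.0, 0.80),
    Impl("FPGA-HOG", 8.0,  40.0, 0.88),
    Impl("GPU-HOG", 95.0,  15.0, 0.93),
]

def select_impl(anomaly, high=0.7, low=0.3):
    """High behavioural anomaly -> most accurate implementation,
    quiet periods -> lowest-power one, otherwise a middle ground."""
    if anomaly >= high:
        return max(IMPLS, key=lambda i: i.accuracy)
    if anomaly <= low:
        return min(IMPLS, key=lambda i: i.power_w)
    return min(IMPLS, key=lambda i: i.latency_ms * i.power_w)

print(select_impl(0.9).name, select_impl(0.1).name)
```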
Abstract:
The global socioeconomic importance of helminth parasitic disease is underpinned by the considerable clinical impact on millions of people. While helminth polyparasitism is considered common in the Philippines, little has been done to survey its extent in endemic communities. High morphological similarity of eggs between related species complicates conventional microscopic diagnostic methods, which are known to lack sensitivity, particularly in low-intensity infections. Multiplex quantitative PCR diagnostic methods can provide rapid, simultaneous identification of multiple helminth species from a single stool sample. We describe a multiplex assay for the differentiation of Ascaris lumbricoides, Necator americanus, Ancylostoma, Taenia saginata and Taenia solium, building on our previously published findings for Schistosoma japonicum. Of 545 human faecal samples examined, 46.6% were positive for at least three different parasite species. High prevalences of S. japonicum (90.64%), A. lumbricoides (58.17%), T. saginata (42.57%) and A. duodenale (48.07%) were recorded. Neither T. solium nor N. americanus was found to be present. The utility of molecular diagnostic methods for monitoring helminth parasite prevalence provides new information on the extent of polyparasitism in the Philippine municipality of Palapag. These methods and findings have potential global implications for the monitoring of neglected tropical diseases and control measures.
Abstract:
The hypervariable regions of immunoglobulin heavy-chain (IgH) rearrangements provide a specific tumor marker in multiple myeloma (MM). Recently, real-time PCR assays have been developed to quantify the number of tumor cells after treatment. However, these strategies are hampered by the presence of somatic hypermutation (SH) in VDJH rearrangements from MM patients, which causes mismatches between primers and/or probes and the target, leading to inaccurate quantification of tumor cells. Our group has recently described a 60% incidence of incomplete DJH rearrangements in MM patients, with no or very low rates of SH. In this study, we compare the efficiency of a real-time PCR approach for the analysis of both complete and incomplete IgH rearrangements in eight MM patients using only three JH consensus probes. We were able to design an allele-specific oligonucleotide for both the complete and incomplete rearrangement in all patients. DJH rearrangements fulfilled the criteria of effectiveness for real-time PCR in all samples (i.e., no unspecific amplification, detection of fewer than 10 tumor cells within a 10^5 polyclonal background, and standard-curve correlation coefficients higher than 0.98). By contrast, only three out of eight VDJH rearrangements fulfilled these criteria. Further analyses showed that the remaining five VDJH rearrangements carried three or more somatic mutations in the probe and primer sites, leading to a dramatic decrease in the melting temperature. These results support the use of incomplete DJH rearrangements instead of complete, somatically mutated VDJH rearrangements for the investigation of minimal residual disease in multiple myeloma.
Abstract:
When transporting wood from the forest to the mills, many unforeseen events can occur that disrupt the planned trips (for example, because of weather conditions, forest fires, the arrival of new loads, etc.). When such events only become known during a trip, the truck making that trip must be diverted to an alternative route. Without information about such a route, the driver is likely to choose an alternative path that is needlessly long or, worse, one that is itself "closed" because of an unforeseen event. It is therefore essential to provide drivers with real-time information, in particular suggestions of alternative routes when a planned road turns out to be impassable. The recourse options available when the unexpected happens depend on the characteristics of the supply chain under study, such as the presence of self-loading trucks and the transport management policy. We present three articles addressing different application contexts, together with models and solution methods adapted to each context. In the first article, the truck drivers have the complete weekly plan for the current week. In this context, every effort must be made to minimize the changes made to the initial plan. Although the truck fleet is homogeneous, there is a priority ordering of the drivers: the highest-priority drivers receive the largest workloads, and minimizing changes to their plans is also a priority. Since the consequences of unforeseen events on the transport plan are essentially cancellations and/or delays of certain trips, the proposed approach first handles the cancellation or delay of a single trip and is then generalized to handle more complex events. In this approach, we try to reschedule the affected trips within the same week so that a loader is free when the truck arrives both at the forest site and at the mill; in this way, the trips of the other trucks are not modified. This approach provides dispatchers with alternative plans within a few seconds. Better solutions could be obtained if the dispatcher were allowed to make more changes to the initial plan. In the second article, we consider a context in which only one trip at a time is communicated to the drivers: the dispatcher waits until the driver finishes a trip before revealing the next one. This context is more flexible and offers more recourse options when the unexpected happens. Moreover, the weekly problem can be divided into daily problems, since demand is daily and the mills are open for limited periods during the day. We use a mathematical programming model based on a space-time network to react to disruptions. Although disruptions can have different effects on the initial transport plan, a key feature of the proposed model is that it remains valid for handling all unforeseen events, whatever their nature: the impact of these events is captured in the space-time network and in the input parameters rather than in the model itself. The model is solved for the current day each time an unforeseen event is revealed.
In the last article, the truck fleet is heterogeneous and includes trucks with on-board loaders. The route structure of these trucks differs from that of regular trucks, since they do not have to be synchronized with the loaders. We use a mathematical model whose columns can be easily and naturally interpreted as truck routes, and we solve this model using column generation. We first relax the integrality of the decision variables and consider only a subset of the feasible routes; routes with the potential to improve the current solution are then added to the model iteratively. A space-time network is used both to represent the impacts of unforeseen events and to generate these routes. The solution obtained is generally fractional, and a branch-and-price algorithm is used to find integer solutions. Several disruption scenarios were developed to test the proposed approach on case studies from the Canadian forestry industry, and numerical results are presented for the three contexts.
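To make the simplest recourse concrete, here is a small illustrative sketch (not any of the thesis models, which rely on space-time networks, column generation and branch-and-price): when a planned road segment is reported closed mid-trip, a shortest alternative path is recomputed on the road network with the closed segments removed. The nodes, travel times and the use of the networkx library are assumptions made purely for the example.

```python
# Re-routing sketch: remove closed road segments and recompute a shortest path.
import networkx as nx

road = nx.Graph()
road.add_weighted_edges_from([
    ("forest", "A", 30), ("A", "mill", 45),                   # planned route
    ("forest", "B", 40), ("B", "C", 25), ("C", "mill", 35),   # possible detour
])

def reroute(graph, origin, destination, closed_edges):
    g = graph.copy()
    g.remove_edges_from(closed_edges)        # apply the unforeseen closures
    try:
        return nx.shortest_path(g, origin, destination, weight="weight")
    except nx.NetworkXNoPath:
        return None                          # no feasible alternative route

print(reroute(road, "forest", "mill", closed_edges=[("A", "mill")]))
# -> ['forest', 'B', 'C', 'mill']
```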
Abstract:
Power system engineers face a double challenge: to operate electric power systems within narrow stability and security margins, and to maintain high reliability. There is an acute need to better understand the dynamic nature of power systems in order to be prepared for critical situations as they arise. Innovative measurement tools, such as phasor measurement units, can capture not only the slow variation of voltages and currents but also the underlying oscillations in a power system. Such dynamic data accessibility provides strong motivation and a useful tool to explore dynamic-data-driven applications in power systems. To fulfill this goal, this dissertation focuses on three areas: developing accurate dynamic load models and updating variable parameters based on measurement data; applying advanced nonlinear filtering concepts and technologies to real-time identification of power system models; and addressing computational issues by implementing the balanced truncation method. By obtaining more realistic system models, together with timely updated parameters and consideration of stochastic influences, we can form an accurate portrait of the ongoing phenomena in an electrical power system, and hence further improve state estimation, stability analysis and real-time operation.
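Since the abstract mentions balanced truncation for addressing computational issues, here is a compact, hedged sketch of the standard square-root balanced truncation algorithm for a stable linear time-invariant system. It assumes a stable, minimal realization (so both Gramians are positive definite) and uses SciPy's Lyapunov solver; the small system at the bottom is purely illustrative and is not a power-system model from the dissertation.

```python
# Square-root balanced truncation of a stable LTI system (A, B, C) to order r.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    Wc = solve_continuous_lyapunov(A, -B @ B.T)      # controllability Gramian
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)    # observability Gramian
    Zc = cholesky((Wc + Wc.T) / 2, lower=True)       # symmetrise before factoring
    Zo = cholesky((Wo + Wo.T) / 2, lower=True)
    U, s, Vt = svd(Zo.T @ Zc)                        # s: Hankel singular values
    T = Zc @ Vt.T @ np.diag(s ** -0.5)               # balancing transformation
    Tinv = np.diag(s ** -0.5) @ U.T @ Zo.T
    return (Tinv @ A @ T)[:r, :r], (Tinv @ B)[:r, :], (C @ T)[:, :r], s

# Tiny illustrative stable system (not a power-system model).
A = np.array([[-1.0, 0.2, 0.0], [0.0, -2.0, 0.3], [0.0, 0.0, -5.0]])
B = np.array([[1.0], [0.5], [0.2]])
C = np.array([[1.0, 0.0, 1.0]])
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=2)
print(hsv)   # small trailing values indicate states that are safe to truncate
```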
Abstract:
Puccinia psidii (myrtle rust) is an emerging pathogen with a wide host range in the family Myrtaceae; it continues to expand its geographic range and is considered a significant threat to Myrtaceae plants worldwide. In this study, we describe the development and validation of three novel real-time polymerase chain reaction (qPCR) assays using ribosomal DNA and β-tubulin gene sequences to detect P. psidii. All qPCR assays were able to detect P. psidii DNA extracted from urediniospores and from infected plants, including asymptomatic leaf tissues. Depending on the gene target, qPCR was able to detect as little as 0.011 pg of P. psidii DNA. The best-performing qPCR assay was shown to be highly specific, repeatable and reproducible when tested with different qPCR reagents and real-time PCR platforms in different laboratories. In addition, a duplex qPCR assay was developed to allow coamplification of the cytochrome oxidase gene from host plants for use as an internal PCR control. The best-performing qPCR assay proved to be faster and more sensitive than the previously published nested PCR assay and will be particularly useful for high-throughput testing and for detecting P. psidii at the early stages of infection, before the development of sporulating rust pustules.
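For context only, a generic sketch of how a qPCR standard curve turns Ct values into template quantities and an amplification efficiency; the dilution series and Ct values below are invented for illustration and are not data from this study:

```python
# Generic qPCR standard-curve quantification: fit Ct against log10(input DNA)
# and invert the fit for unknown samples.
import numpy as np

std_pg = np.array([100.0, 10.0, 1.0, 0.1, 0.011])   # standard dilution series (pg)
std_ct = np.array([18.2, 21.6, 25.0, 28.4, 31.6])   # measured Ct for each standard

slope, intercept = np.polyfit(np.log10(std_pg), std_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1                # ~1.0 means ~100% efficient

def quantify(ct):
    """Convert a sample Ct back to an estimated template amount (pg)."""
    return 10 ** ((ct - intercept) / slope)

print(f"efficiency ~ {efficiency:.2f}; Ct 27.0 ~ {quantify(27.0):.2f} pg")
```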
Abstract:
Biofilms are the primary cause of clinical bacterial infections and are impervious to typical amounts of antibiotics, necessitating very high doses for treatment. Therefore, it is highly desirable to develop new alternate methods of treatment that can complement or replace existing approaches using significantly lower doses of antibiotics. Current standards for studying biofilms are based on end-point studies that are invasive and destroy the biofilm during characterization. This dissertation presents the development of a novel real-time sensing and treatment technology to aid in the non-invasive characterization, monitoring and treatment of bacterial biofilms. The technology is demonstrated through the use of a high-throughput bifurcation based microfluidic reactor that enables simulation of flow conditions similar to indwelling medical devices. The integrated microsystem developed in this work incorporates the advantages of previous in vitro platforms while attempting to overcome some of their limitations. Biofilm formation is extremely sensitive to various growth parameters that cause large variability in biofilms between repeated experiments. In this work we investigate the use of microfluidic bifurcations for the reduction in biofilm growth variance. The microfluidic flow cell designed here spatially sections a single biofilm into multiple channels using microfluidic flow bifurcation. Biofilms grown in the bifurcated device were evaluated and verified for reduced biofilm growth variance using standard techniques like confocal microscopy. This uniformity in biofilm growth allows for reliable comparison and evaluation of new treatments with integrated controls on a single device. Biofilm partitioning was demonstrated using the bifurcation device by exposing three of the four channels to various treatments. We studied a novel bacterial biofilm treatment independent of traditional antibiotics using only small molecule inhibitors of bacterial quorum sensing (analogs) in combination with low electric fields. Studies using the bifurcation-based microfluidic flow cell integrated with real-time transduction methods and macro-scale end-point testing of the combination treatment showed a significant decrease in biomass compared to the untreated controls and well-known treatments such as antibiotics. To understand the possible mechanism of action of electric field-based treatments, fundamental treatment efficacy studies focusing on the effect of the energy of the applied electrical signal were performed. It was shown that the total energy and not the type of the applied electrical signal affects the effectiveness of the treatment. The linear dependence of the treatment efficacy on the applied electrical energy was also demonstrated. The integrated bifurcation-based microfluidic platform is the first microsystem that enables biofilm growth with reduced variance, as well as continuous real-time threshold-activated feedback monitoring and treatment using low electric fields. The sensors detect biofilm growth by monitoring the change in impedance across the interdigitated electrodes. Using the measured impedance change and user inputs provided through a convenient and simple graphical interface, a custom-built MATLAB control module intelligently switches the system into and out of treatment mode. 
Using this self-governing microsystem, in situ biofilm treatment based on the principles of the bioelectric effect was demonstrated by exposing two of the channels of the integrated bifurcation device to low doses of antibiotics.
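As a hedged illustration of the threshold-activated feedback idea described above (the actual platform drives this logic from a custom MATLAB control module; read_impedance and set_treatment below are hypothetical stand-ins for the instrument interface, the thresholds are placeholders, and growth is assumed here to lower the measured impedance):

```python
# Threshold-activated feedback sketch: monitor the relative impedance change
# across the interdigitated electrodes and toggle treatment around thresholds.
import time

def feedback_loop(read_impedance, set_treatment, baseline_ohm,
                  on_threshold=0.20, off_threshold=0.05, period_s=60):
    treating = False
    while True:
        drop = (baseline_ohm - read_impedance()) / baseline_ohm
        if not treating and drop >= on_threshold:
            set_treatment(True)     # e.g. enable the low-electric-field treatment
            treating = True
        elif treating and drop <= off_threshold:
            set_treatment(False)
            treating = False
        time.sleep(period_s)        # sampling period of the monitoring loop
```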
Abstract:
This study aimed to standardise an in-house real-time polymerase chain reaction (rtPCR) to allow quantification of hepatitis B virus (HBV) DNA in serum or plasma samples, and to compare this method with two commercial assays, the Cobas Amplicor HBV monitor and the Cobas AmpliPrep/Cobas TaqMan HBV test. Samples from 397 patients from the state of São Paulo were analysed by all three methods. Fifty-two samples were from patients who were human immunodeficiency virus and hepatitis C virus positive, but HBV negative. Genotypes were characterised, and the viral load was measured in each sample. The in-house rtPCR showed an excellent success rate compared with the commercial tests; inter-assay and intra-assay coefficients correlated with the commercial tests (r = 0.96 and r = 0.913, p < 0.001), and the in-house test showed no genotype-dependent differences in detection and quantification rates. The in-house assay tested in this study could be used for screening and quantifying HBV DNA in order to monitor patients during therapy.
Abstract:
Catering to society’s demand for high-performance computing, billions of transistors are now integrated on IC chips to deliver unprecedented performance. With increasing transistor density, power consumption and power density are growing exponentially. The increasing power consumption translates directly into high chip temperatures, which not only raise packaging/cooling costs but also degrade the performance, reliability and life span of computing systems. Moreover, high chip temperature also greatly increases leakage power consumption, which is becoming more and more significant with the continuous scaling of transistor size. As the semiconductor industry continues to evolve, power and thermal challenges have become the most critical challenges in the design of new generations of computing systems. In this dissertation, we address power/thermal issues from the system-level perspective. Specifically, we employ real-time scheduling methods to optimize the power/thermal efficiency of real-time computing systems, with the leakage/temperature dependency taken into consideration. In our research, we first explored the fundamental principles of how to employ dynamic voltage scaling (DVS) techniques to reduce the peak operating temperature when running a real-time application on a single-core platform. We then proposed a novel real-time scheduling method, “M-Oscillations”, to reduce the peak temperature when scheduling a hard real-time periodic task set, and developed three checking methods to guarantee the feasibility of a periodic real-time schedule under a peak temperature constraint. We further extended our research from single-core to multi-core platforms: we investigated the energy estimation problem on multi-core platforms and developed a lightweight and accurate method to calculate the energy consumption of a given voltage schedule on a multi-core platform. Finally, we conclude the dissertation with a detailed discussion of future extensions of our research.
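A minimal, hedged sketch of energy estimation for a piecewise-constant voltage schedule with a temperature-dependent leakage term; the constants and the linearised leakage model are illustrative assumptions, not the dissertation's model:

```python
# Energy of a piecewise-constant voltage/frequency schedule with leakage.
def schedule_energy(intervals, c_eff=1e-9, k0=0.1, k1=0.005, t_ref=45.0):
    """intervals: list of (duration_s, voltage_V, frequency_Hz, temp_C).
    Returns total (dynamic + leakage) energy in joules."""
    total = 0.0
    for dt, v, f, temp in intervals:
        p_dyn = c_eff * v * v * f                  # switching power ~ C_eff * V^2 * f
        p_leak = v * (k0 + k1 * (temp - t_ref))    # leakage grows with temperature
        total += (p_dyn + p_leak) * dt
    return total

schedule = [(0.002, 1.1, 2.0e9, 65.0), (0.003, 0.8, 1.0e9, 55.0)]
print(f"{schedule_energy(schedule):.5f} J")
```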
Abstract:
Advances in the diagnosis of Mycobacterium bovis infection in wildlife hosts may benefit the development of sustainable approaches to the management of bovine tuberculosis in cattle. In the present study, three laboratories from two countries participated in a validation trial to evaluate the reliability and reproducibility of a real-time PCR assay for the detection and quantification of M. bovis in environmental samples. The sample panels consisted of negative badger faeces spiked with a dilution series of M. bovis BCG Pasteur, and of field samples of faeces from badgers of unknown infection status taken from badger latrines in areas with high and low incidence of bovine TB (bTB) in cattle. Samples were tested with a previously optimised methodology. The experimental design involved rigorous testing, which highlighted a number of potential pitfalls in the analysis of environmental samples using real-time PCR. Despite minor variation between operators and laboratories, the validation study demonstrated good concordance between the three laboratories: on the spiked panels, the test showed high levels of agreement in terms of positive/negative detection, with high specificity (100%) and high sensitivity (97%) at levels of 10^5 cells g^-1 and above. Quantitative analysis of the data revealed low variability in the recovery of BCG cells between laboratories and operators. On the field samples, the test showed high reproducibility both in terms of positive/negative detection and in the number of cells detected, despite the low number of samples identified as positive by any laboratory. Use of a parallel PCR inhibition control assay revealed negligible PCR-interfering chemicals co-extracted with the DNA. This is the first example of a multi-laboratory validation of a real-time PCR assay for the detection of mycobacteria in environmental samples. Field studies are now required to determine how best to apply the assay for population-level bTB surveillance in wildlife.
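For illustration of the agreement statistics quoted above, a tiny sketch that computes sensitivity and specificity from paired expected/observed detection calls; the calls in the example are invented, not data from the trial:

```python
# Sensitivity and specificity from paired expected/observed detection calls.
def sens_spec(expected, observed):
    tp = sum(e and o for e, o in zip(expected, observed))
    tn = sum((not e) and (not o) for e, o in zip(expected, observed))
    fn = sum(e and (not o) for e, o in zip(expected, observed))
    fp = sum((not e) and o for e, o in zip(expected, observed))
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    specificity = tn / (tn + fp) if tn + fp else float("nan")
    return sensitivity, specificity

expected = [True] * 8 + [False] * 4            # spiked vs negative faeces
observed = [True] * 7 + [False] + [False] * 4  # one spiked sample missed
print(sens_spec(expected, observed))           # -> (0.875, 1.0)
```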