911 results for persistent fever
Abstract:
The notion of being sure that you have completely eradicated an invasive species is fanciful because of imperfect detection and persistent seed banks. Eradication is commonly declared either on an ad hoc basis, on notions of seed bank longevity, or by setting arbitrary thresholds of 1% or 5% confidence that the species is not present. Rather than declaring eradication at some arbitrary level of confidence, we take an economic approach in which we stop looking when the expected costs outweigh the expected benefits. We develop theory that determines the number of years of absent surveys required to minimize the net expected cost. Given that detection of a species is imperfect, the optimal stopping time is a trade-off between the cost of continued surveying and the cost of escape and damage if eradication is declared too soon. A simple rule of thumb compares well to the exact optimal solution obtained using stochastic dynamic programming. Application of the approach to the eradication programme for Helenium amarum reveals that the actual stopping time was a precautionary one, given the ranges for each parameter.
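The stopping-time trade-off described above can be pictured with a minimal sketch (not the authors' stochastic dynamic programme): for each number of consecutive absence-years, sum the survey costs paid so far and the expected escape-and-damage cost of declaring eradication at that point, then choose the minimum. All parameter names and values below (prior probability of persistence, per-survey detection probability, survey and escape costs) are illustrative assumptions.

```python
# Minimal sketch, assuming a Bayesian update of the persistence probability and
# fixed annual survey and escape/damage costs. Not the authors' model or code.

def prob_still_present(p0, detect_prob, n_absent):
    """Posterior probability the species persists after n surveys with no detection."""
    undetected = p0 * (1 - detect_prob) ** n_absent
    return undetected / (undetected + (1 - p0))

def expected_cost(p0, detect_prob, survey_cost, escape_cost, n_absent):
    """Survey costs paid so far plus expected damage if eradication is declared now."""
    return n_absent * survey_cost + prob_still_present(p0, detect_prob, n_absent) * escape_cost

def optimal_stopping_time(p0, detect_prob, survey_cost, escape_cost, max_years=50):
    """Number of absence-years that minimises the net expected cost."""
    return min(range(max_years + 1),
               key=lambda n: expected_cost(p0, detect_prob, survey_cost, escape_cost, n))

# Example: likely still present (p0 = 0.8), 60% annual detection probability,
# surveys cost 1 unit per year, escape and damage would cost 100 units.
print(optimal_stopping_time(0.8, 0.6, 1.0, 100.0))
```

Under this naive model, surveying one more year is worthwhile only while the expected damage it avoids exceeds the annual survey cost, which is the same trade-off the abstract describes.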
Abstract:
The cattle tick, Rhipicephalus (Boophilus) microplus, and the diseases it transmits pose a persistent threat to tropical beef production. Genetic selection of host resistance has become the method of choice for non-chemical control of cattle tick. Previous studies have suggested that larval stages are most susceptible to host resistance mechanisms. To gain insights into the molecular basis of host resistance that occurs during R. microplus attachment, we assessed the abundance of proteins (by isobaric tag for relative and absolute quantitation (iTRAQ) and Western blot analyses) and mRNAs (by quantitative reverse transcription PCR (qRT-PCR)) in skin adjacent to tick bite sites from high tick-resistant (HR) and low tick-resistant (LR) Belmont Red cattle following challenge with cattle tick. We showed substantially higher expression of the basal epidermal keratins KRT5 and KRT14, the lipid-processing protein lipocalin 9 (LCN9), the epidermal barrier-catalysing enzyme transglutaminase 1 (TGM1), and the transcriptional regulator B lymphocyte-induced maturation protein 1 (Blimp1) in HR skin. Our data reveal the essential role of the epidermal permeability barrier in conferring greater resistance of cattle to tick infestation, and suggest that the physical structure of the epidermal layers of the skin may represent the first line of defence against ectoparasite invasion. Crown Copyright. © Australian Society for Parasitology Inc.
Abstract:
In the subtropics of Australia, the ryegrass component of irrigated perennial ryegrass (Lolium perenne) - white clover (Trifolium repens) pastures declines by approximately 40% in the summer following establishment, being replaced by summer-active C4 grasses. Tall fescue (Festuca arundinacea) is more persistent than perennial ryegrass and might resist this invasion, although tall fescue does not compete vigorously as a seedling. This series of experiments investigated the influence of ryegrass and tall fescue genotype, sowing time and sowing mixture as a means of improving tall fescue establishment and the productivity and persistence of tall fescue, ryegrass and white clover-based mixtures in a subtropical environment. Tall fescue frequency at the end of the establishment year decreased as the number of companion species sown in the mixture increased. Neither sowing mixture combinations nor sowing rates influenced overall pasture yield (of around 14 t/ha) in the establishment year but had a significant effect on botanical composition and component yields. Perennial ryegrass was less competitive than short-rotation ryegrass, increasing first-year yields of tall fescue by 40% in one experiment and by 10% in another but total yield was unaffected. The higher establishment-year yield (3.5 t/ha) allowed Dovey tall fescue to compete more successfully with the remaining pasture components than Vulcan (1.4 t/ha). Sowing 2 ryegrass cultivars in the mixture reduced tall fescue yields by 30% compared with a single ryegrass (1.6 t/ha), although tall fescue alone achieved higher yields (7.1 t/ha). Component sowing rate had little influence on composition or yield. Oversowing the ryegrass component into a 6-week-old sward of tall fescue and white clover improved tall fescue, white clover and overall yields in the establishment year by 83, 17 and 11%, respectively, but reduced ryegrass yields by 40%. The inclusion of red (T. pratense) and Persian (T. resupinatum) clovers and chicory (Cichorium intybus) increased first-year yields by 25% but suppressed perennial grass and clover components. Yields were generally maintained at around 12 t/ha/yr in the second and third years, with tall fescue becoming dominant in all 3 experiments. The lower tall fescue seeding rate used in the first experiment resulted in tall fescue dominance in the second year following establishment, whereas in Experiments 2 and 3 dominance occurred by the end of the first year. Invasion by the C4 grasses was relatively minor (<10%) even in the third year. As ryegrass plants died, tall fescue and, to a lesser extent, white clover increased as a proportion of the total sward. Treatment effects continued into the second, but rarely the third, year and mostly affected the yield of one of the components rather than total cumulative yield. Once tall fescue became dominant, it was difficult to re-introduce other pasture components, even following removal of foliage and moderate renovation. Severe renovation (reducing the tall fescue population by at least 30%) seems a possible option for redressing this situation.
Abstract:
The genus Asparagus includes at least six invasive species in Australia. Asparagus aethiopicus and A. africanus are invasive in subtropical Australia, and a third species, A. virgatus, is naturalized and demonstrates localized spread in south east Queensland. To better understand how the attributes of these species contribute to their invasiveness, we compared fruit and seed traits, germination, seedling emergence, seed survival, and time-to-maturity. We further investigated the dispersal ecology of A. africanus, examining the diet of a local frugivore, the figbird (Sphecotheres viridis), and the effect of gut passage on seedling emergence. Overall, A. aethiopicus was superior in germination and emergence, with the highest mean germination (98.8%) and emergence (94.5%) under optimal conditions and higher emergence (mean of 73.3%) across all treatments. In contrast, A. africanus had the lowest germination under optimal conditions (71.7%) and low mean seedling emergence (49.5%), but had fruits with the highest relative yield (ratio of dry pulp to fruit fresh weight) that were favored by a local frugivore. Figbirds consumed large numbers of A. africanus fruits (~30% of all non-Ficus fruits), and seedling germination was not significantly affected by gut passage compared to unprocessed fruits. Asparagus virgatus germinated poorly under cool, light conditions (1.4%) despite a high mean germination under optimal conditions (95.0%), and had low mean performance across emergence treatments (36.3%). The species also had fruits with a low pulp return for frugivores. For all species, seed survival declined rapidly in the first 12 mo and fell to < 3.2% viability at 36 mo. On the basis of the traits considered, A. virgatus is unlikely to have the invasive potential of its congeners. Uniformly short seed survival times suggest that weed managers do not have to contend with a substantial persistent soil-stored seed bank, but frugivore-mediated dispersal beyond existing infestations will present a considerable management challenge.
Abstract:
Seed persistence of Gymnocoronis spilanthoides (D.Don) DC.; Asteraceae (Senegal tea), a serious weed of freshwater habitats, was examined in relation to burial status and different soil moisture regimes over a 3-year period. Seeds were found to be highly persistent, especially when buried. At the end of the experiment, 42.0%, 27.3% and 61.4% of buried seeds were viable following maintenance at field capacity, waterlogged and fluctuating (cycles of 1 week at field capacity followed by 3 weeks’ drying down) soil moisture conditions, respectively. Comparable viability values for surface-situated seeds were ~3% over all soil moisture regimes. Predicted times to 1% viability are 16.2 years for buried seed and 3.8 years for surface-situated seed. Persistence was attributed primarily to the absence of light, a near-obligate requirement for germination in this species, although secondary dormancy was induced in some seeds. Previous work has demonstrated low fecundity in field populations of G. spilanthoides, which suggests that soil seed banks may not be particularly large. However, high levels of seed persistence, combined with ostensibly effective dispersal mechanisms, indicate that this weed may prove a difficult target for regional or state-wide eradication.
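For illustration only, the kind of extrapolation behind a "time to 1% viability" figure can be sketched by assuming a simple exponential decline in viability. The abstract does not state which model the authors actually fitted, so the functional form, the parameter names and the example numbers below are assumptions.

```python
import math

# Illustrative sketch: extrapolate time to 1% viability assuming v(t) = exp(-k*t).
# The decay constant k is estimated from a single end-of-trial observation.

def decay_rate(viability_fraction, years):
    """Rate constant k from one observation of remaining viability after `years`."""
    return -math.log(viability_fraction) / years

def time_to_viability(target_fraction, k):
    """Years until viability falls to the target fraction under exponential decay."""
    return -math.log(target_fraction) / k

# Hypothetical input: buried seed at field capacity, 42% viable after 3 years.
k_buried = decay_rate(0.42, 3.0)
print(round(time_to_viability(0.01, k_buried), 1))  # predicted years to 1% viability
```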
Abstract:
Health promotion aspires to work in empowering, participatory ways, with the goal of supporting people to increase control over their health. However, buried in this goal is an ethical tension: while increasing people’s autonomy, health promotion also imposes a particular, health promotion-sanctioned version of what is good. This tension positions practitioners precariously, where the ethos of empowerment risks increasing health promotion’s paternalistic control over people, rather than people’s control over their own health. Herein we argue that this ethical tension is amplified in Indigenous Australia, where colonial processes of control over Indigenous lands, lives and cultures are indistinguishable from contemporary health promotion ‘interventions’. Moreover, the potential stigmatisation produced in any paternalistic acts ‘done for their own good’ cannot be assumed to have evaporated within the self-proclaimed ‘empowering’ narratives of health promotion. This issue’s guest editor’s call for health promotion to engage ‘with politics and with philosophical ideas about the state and the citizen’ is particularly relevant in an Indigenous Australian context. Indigenous Australians continue to experience health promotion as a moral project of control through intervention, which contradicts health promotion’s central goal of empowerment. Therefore, Indigenous health promotion is an invaluable site for discussion and analysis of health promotion’s broader ethical tensions. Given the persistent and alarming Indigenous health inequalities, this paper calls for systematic ethical reflection in order to redress health promotion’s general failure to reduce health inequalities experienced by Indigenous Australians.
Abstract:
Temperate species and tropical crop silage are the basis for forage production for the dairy industry in the Australian subtropics. Irrigation is the key resource needed for production, with little survival of temperate species under rain-grown conditions except for lucerne. Annual ryegrass (Lolium multiflorum), fertilised with either inorganic nitrogen or grown with clovers, is the main cool season forage for the dairy industry. It is sown into fully prepared seedbeds, oversown into tropical grasses, especially kikuyu (Pennisetum clandestinum) or sown after mulching. There has been a continual improvement in the performance of annual and hybrid ryegrass cultivars over the last 25 years. In small plot, cutting experiments, yields of annual ryegrass typically range from 15 to 21 t DM/ha, with equivalent on-farm yields of 7 to 14 t DM/ha of utilised material. Rust (Puccinia coronata) remains the major concern although resistance is more stable than in oats. There have also been major improvements in the performance of perennial ryegrass (L. perenne) cultivars although their persistence under grazing is insufficient to make them a reliable forage source for the subtropics. On the other hand, tall fescue (Festuca arundinacea) and prairie grass (Bromus willdenowii) cultivars perform well under cutting and grazing, although farmer resistance to the use of tall fescue is strong. White clover (Trifolium repens) is a reliable and persistent performer although disease usually reduces its performance in the third year after sowing. Persian (Shaftal) annual clover (T. resupinatum) gives good winter production but the performance of berseem clover (T. alexandrinum) is less reliable and the sub clovers (T. subterraneum) are generally not suited to clay soils of neutral to alkaline pH. Lucerne (Medicago sativa), either as a pure stand or in mixtures, is a high producing legume under both irrigation and natural rainfall. Understanding the importance of leaf and crown diseases, and the development of resistant cultivars, have been the reasons for its reliability. Insects on temperate species are not as serious a problem in the subtropics as in New Zealand (NZ). Fungal and viral diseases, on the other hand, cause many problems and forage performance would benefit from more research into resistance.
Abstract:
Distinct endogenous network events, generated independently of sensory input, are a general feature of various structures of the immature central nervous system. In the immature hippocampus, these types of events are seen as "giant depolarizing potentials" (GDPs) in intracellular recordings in vitro. GABA, the major inhibitory neurotransmitter of the adult brain, has a depolarizing action in immature neurons, and GDPs have been proposed to be driven by GABAergic transmission. Moreover, GDPs have been thought to reflect an early pattern that disappears during development in parallel with the maturation of hyperpolarizing GABAergic inhibition. However, the adult hippocampus in vivo also generates endogenous network events known as sharp (positive) waves (SPWs), which reflect synchronous discharges of CA3 pyramidal neurons and are thought to be involved in cognitive functions. In this thesis, mechanisms of GDP generation were studied with intra- and extracellular recordings in the neonatal rat hippocampus in vitro and in vivo. Immature CA3 pyramidal neurons were found to generate intrinsic bursts of spikes and to act as cellular pacemakers for GDP activity, whereas depolarizing GABAergic signalling was found to have a temporally non-patterned facilitatory role in the generation of the network events. Furthermore, the data indicate that the intrinsic bursts of neonatal CA3 pyramidal neurons and, consequently, GDPs are driven by a persistent Na+ current and terminated by a slow Ca2+-dependent K+ current. Gramicidin-perforated patch recordings showed that the depolarizing driving force for GABAA receptor-mediated actions is provided by Cl- uptake via the Na-K-Cl cotransporter, NKCC1, in the immature CA3 pyramids. A specific blocker of NKCC1, bumetanide, inhibited SPWs and GDPs in the neonatal rat hippocampus in vivo and in vitro, respectively. Finally, pharmacological blockade of the GABA transporter-1 prolonged the decay of the large GDP-associated GABA transients but not of single postsynaptic GABAA receptor-mediated currents. As a whole, the data in this thesis indicate that the mechanism of GDP generation, based on the interconnected network of bursting CA3 pyramidal neurons, is similar to that involved in adult SPW activity. Hence, GDPs do not reflect a network pattern that disappears during development but are the in vitro counterpart of neonatal SPWs.
Abstract:
1. Weed eradication efforts often must be sustained for long periods owing to the existence of persistent seed banks, among other factors. Decision makers need to consider both the amount of investment required and the period over which investment must be maintained when determining whether to commit to (or continue) an eradication programme. However, a basis for estimating eradication programme duration based on simple data has been lacking. Here, we present a stochastic dynamic model that can provide such estimates. 2. The model is based upon the rates of progression of infestations from the active to the monitoring state (i.e. no plants detected for at least 12 months), rates of reversion of infestations from monitoring to the active state and the frequency distribution of time since last detection for all infestations. Isoquants that illustrate the combinations of progression and reversion parameters corresponding to eradication within different time frames are generated. 3. The model is applied to ongoing eradication programmes targeting branched broomrape Orobanche ramosa and chromolaena Chromolaena odorata. The minimum periods in which eradication could potentially be achieved were 22 and 23 years, respectively. On the basis of programme performance until 2008, however, eradication is predicted to take considerably longer for both species (on average, 62 and 248 years, respectively). Performance of the branched broomrape programme could be best improved through reducing rates of reversion to the active state; for chromolaena, boosting rates of progression to the monitoring state is more important. 4. Synthesis and applications. Our model for estimating weed eradication programme duration, which captures critical transitions between a limited number of states, is readily applicable to any weed. A particular strength of the method lies in its minimal data requirements. These comprise estimates of maximum seed persistence and infested area, plus consistent annual records of the detection (or otherwise) of the weed in each infestation. This work provides a framework for identifying where improvements in management are needed and a basis for testing the effectiveness of alternative tactics. If adopted, our approach should help improve decision making with regard to eradication as a management strategy.
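A minimal Monte Carlo sketch of the state-transition idea summarised above is given below. It is not the authors' stochastic dynamic model; the parameter names and values (annual progression and reversion probabilities, and a seed-persistence threshold for declaring an infestation eradicated) are illustrative assumptions.

```python
import random

# Sketch only: each infestation is either "active" or in the "monitoring" state
# (no plants detected for >= 12 months). Each year an active infestation moves to
# monitoring with probability p_progress; a monitoring infestation reverts with
# probability p_revert. An infestation is treated as eradicated once it has stayed
# in monitoring for longer than the assumed maximum seed persistence.

def programme_duration(n_infestations, p_progress, p_revert, seed_persistence, rng):
    """Years until every infestation has stayed in monitoring beyond seed persistence."""
    years_clean = [0] * n_infestations      # consecutive monitoring years per infestation
    active = [True] * n_infestations
    year = 0
    while any(active[i] or years_clean[i] < seed_persistence for i in range(n_infestations)):
        year += 1
        for i in range(n_infestations):
            if not active[i] and years_clean[i] >= seed_persistence:
                continue                                   # already declared eradicated
            if active[i]:
                if rng.random() < p_progress:              # progression to monitoring
                    active[i], years_clean[i] = False, 0
            elif rng.random() < p_revert:                  # reversion to the active state
                active[i], years_clean[i] = True, 0
            else:
                years_clean[i] += 1
    return year

rng = random.Random(1)
runs = [programme_duration(20, 0.3, 0.1, 10, rng) for _ in range(1000)]
print(sum(runs) / len(runs))   # mean simulated programme duration in years
```

Varying p_progress and p_revert in such a simulation traces out the same kind of duration contours that the paper's isoquants summarise analytically.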
Abstract:
Kaposi's sarcoma herpesvirus (KSHV) is an oncogenic human virus and the causative agent of three human malignancies: Kaposi's sarcoma (KS), Multicentric Castleman's Disease (MCD), and primary effusion lymphoma (PEL). In tumors, KSHV establishes latent infection, during which it produces no infectious particles. Latently infected cells can, upon provision of appropriate cellular signals, enter the lytic replication cycle and produce progeny virus. PEL, commonly described in patients with AIDS, represents a diffuse large-cell non-Hodgkin's lymphoma with a median survival time of less than six months after diagnosis. As mutations in the tumor suppressor gene TP53 occur rarely in PEL, the aim of this thesis was to investigate whether non-genotoxic activation of the p53 pathway can eradicate malignant PEL cells. This thesis demonstrates that Nutlin-3, a small-molecule inhibitor of the p53-MDM2 interaction, efficiently restored p53 function in PEL cells, leading to cell cycle arrest and massive apoptosis. Furthermore, we found that KSHV infection activated DNA damage signaling, rendering the cells more sensitive to p53-dependent cell death. We also showed in vivo the therapeutic potential of p53 restoration, which led to regression of subcutaneous and intraperitoneal PEL tumor xenografts without adversely affecting normal cells. Importantly, we demonstrated that in a small subset of intraperitoneal PEL tumors, spontaneous induction of viral reactivation dramatically impaired Nutlin-3-induced p53-mediated apoptosis. Accordingly, we found that elevated KSHV lytic transcripts correlated with PEL tumor burden in animals and that inhibition of viral reactivation in vitro restored the cytotoxic activity of a small-molecule inhibitor of the p53-MDM2 interaction. Latency provides a unique opportunity for KSHV to escape host immune surveillance and to establish persistent infections. However, to maintain viral reservoirs and spread to other hosts, KSHV must be reactivated from latency and enter the lytic growth phase. We showed that phosphorylation of the nucleolar phosphoprotein nucleophosmin (NPM) by viral cyclin-CDK6 is critical for the establishment and maintenance of KSHV latency. In short, this study provides evidence that the switch between the latent phase and lytic replication is a critical step that determines the outcome of viral infection and the pathogenesis of KSHV-induced malignancies. Our data may thus contribute to the development of novel targeted therapies for intervention and treatment of KSHV-associated cancers.
Abstract:
Prostate cancer is the most common noncutaneous malignancy and the second leading cause of cancer mortality in men. In 2004, 5237 new cases were diagnosed and altogether 25 664 men suffered from prostate cancer in Finland (Suomen Syöpärekisteri, the Finnish Cancer Registry). Although extensively investigated, we still have a very rudimentary understanding of the molecular mechanisms leading to the frequent transformation of the prostate epithelium. Prostate cancer is characterized by several unique features, including the multifocal origin of tumors and extreme resistance to chemotherapy, and new treatment options are therefore urgently needed. The integrity of genomic DNA is constantly challenged by genotoxic insults. Cellular responses to DNA damage involve elegant checkpoint cascades enforcing cell cycle arrest, thus facilitating damage repair, apoptosis or cellular senescence. Cellular DNA damage triggers the activation of the tumor suppressor protein p53 and Wee1 kinase, which act as executors of the cellular checkpoint responses. These are essential for genomic integrity, and are activated in early stages of tumorigenesis in order to function as barriers against tumor formation. Our work establishes that primary human prostatic epithelial cells and prostatic epithelium have unexpectedly indulgent checkpoint surveillance. This is evidenced by the absence of inhibitory Tyr15 phosphorylation on Cdk2, lack of p53 response, radioresistant DNA synthesis, lack of G1/S and G2/M phase arrest, and presence of persistent gammaH2AX damage foci. We ascribe the absence of inhibitory Tyr15 phosphorylation to low levels of Wee1A, a tyrosine kinase and negative regulator of cell cycle progression. Ectopic Wee1A kinase restored Cdk2-Tyr15 phosphorylation and efficiently rescued the ionizing radiation-induced checkpoints in the human prostatic epithelial cells. As variability in the DNA damage responses has been shown to underlie susceptibility to cancer, our results imply that a suboptimal checkpoint arrest may greatly increase the accumulation of genetic lesions in the prostate epithelia. We also show that small molecules can restore p53 function in prostatic epithelial cells and may serve as a paradigm for the development of future therapeutic agents for the treatment of prostate cancer. We hypothesize that the prostate has evolved to activate the damage surveillance pathways, and the molecules involved in these pathways, only in response to certain stresses in extreme circumstances. In doing so, this organ inadvertently made itself vulnerable to genotoxic stress, which may have implications in malignant transformation. Recognition of the limited activity of p53 and Wee1 in the prostate could drive mechanism-based discovery of preventative and therapeutic agents.
Abstract:
The StreamIt programming model has been proposed to exploit parallelism in streaming applications on general-purpose multicore architectures. StreamIt graphs describe task, data and pipeline parallelism which can be exploited on accelerators such as Graphics Processing Units (GPUs) or the CellBE which support abundant parallelism in hardware. In this paper, we describe a novel method to orchestrate the execution of a StreamIt program on a multicore platform equipped with an accelerator. The proposed approach identifies, using profiling, the relative benefits of executing a task on the superscalar CPU cores and the accelerator. We formulate the problem of partitioning the work between the CPU cores and the GPU, taking into account the latencies for data transfers and the required buffer layout transformations associated with the partitioning, as an integrated Integer Linear Program (ILP) which can then be solved by an ILP solver. We also propose an efficient heuristic algorithm for the work partitioning between the CPU and the GPU, which provides solutions that are within 9.05% of the optimal solution on average across the benchmark suite. The partitioned tasks are then software-pipelined to execute on the multiple CPU cores and the Streaming Multiprocessors (SMs) of the GPU. The software pipelining algorithm orchestrates the execution between the CPU cores and the GPU by emitting the code for the CPU and the GPU, and the code for the required data transfers. Our experiments on a platform with 8 CPU cores and a GeForce 8800 GTS 512 GPU show a geometric mean speedup of 6.94X, with a maximum of 51.96X, over single-threaded CPU execution across the StreamIt benchmarks. This is an 18.9% improvement over a partitioning strategy that maps onto the CPU only those filters that cannot be executed on the GPU (the filters with state that is persistent across firings).
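The partitioning step can be pictured with a small sketch. This is not the paper's ILP formulation or its heuristic, and it ignores the data-transfer and buffer-layout costs that the paper models; it only shows the core idea of pinning filters with persistent state to the CPU and assigning the rest to whichever device has the smaller accumulated profiled load. The filter names and profiled times below are made up for illustration.

```python
# Illustrative greedy partitioning sketch (not the paper's algorithm).
from dataclasses import dataclass

@dataclass
class Filter:
    name: str
    cpu_time: float     # profiled execution time on a CPU core
    gpu_time: float     # profiled execution time on the GPU
    stateful: bool      # state persistent across firings => must run on the CPU

def partition(filters):
    """Assign each filter to "CPU" or "GPU", balancing accumulated profiled load."""
    cpu_load, gpu_load = 0.0, 0.0
    assignment = {}
    for f in sorted(filters, key=lambda f: max(f.cpu_time, f.gpu_time), reverse=True):
        if f.stateful or cpu_load + f.cpu_time <= gpu_load + f.gpu_time:
            assignment[f.name] = "CPU"
            cpu_load += f.cpu_time
        else:
            assignment[f.name] = "GPU"
            gpu_load += f.gpu_time
    return assignment, max(cpu_load, gpu_load)   # makespan under this naive model

# Hypothetical profiled filters of a stream graph.
filters = [Filter("source", 1.0, 0.2, False),
           Filter("fir", 8.0, 0.5, False),
           Filter("accumulator", 2.0, 0.1, True),
           Filter("sink", 1.0, 0.3, False)]
print(partition(filters))
```

The actual formulation additionally software-pipelines the assigned tasks and emits the data-transfer code between the two devices, which this sketch leaves out.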
Abstract:
Developmental dyslexia is a specific reading disability, which is characterised by unexpected difficulty in reading, spelling and writing despite adequate intelligence, education and social environment. It is the most common childhood learning disorder affecting 5-10 % of the population and thus constitutes the largest portion of all learning disorders. It is a persistent developmental failure although it can be improved by compensation. According to the most common theory, the deficit is in phonological processing, which is needed in reading when the words have to be divided into phonemes, or distinct sound elements. This occurs in the lowest level of the hierarchy of the language system and disturbs processes in higher levels, such as understanding the meaning of words. Dyslexia is a complex genetic disorder and previous studies have found nine locations in the genome that associate with it. Altogether four susceptibility genes have been found and this study describes the discovery of the first two of them, DYX1C1 and ROBO1. The first clues were obtained from two Finnish dyslexic families that have chromosomal translocations which disrupt these genes. Genetic analyses supported their role in dyslexia: DYX1C1 associates with dyslexia in the Finnish population and ROBO1 was linked to dyslexia in a large Finnish pedigree. In addition a genome-wide scan in Finnish dyslexic families was performed. This supported the previously detected dyslexia locus on chromosome 2 and revealed a new locus on chromosome 7. Dyslexia is a neurological disorder and the neurobiological function of the susceptibility genes DYX1C1 and ROBO1 are consistent with this. ROBO1 is an axon guidance receptor gene, which is involved in axon guidance across the midline in Drosophila and axonal pathfinding between the two hemispheres via the corpus callosum, as well as neuronal migration in the brain of mice. The translocation and decreased ROBO1 expression in dyslexic individuals indicate that two functional copies of ROBO1 gene are required in reading. DYX1C1 was a new gene without a previously known function. Inhibition of Dyx1c1 expression showed that it is needed in normal brain development in rats. Without Dyx1c1 protein, the neurons in the developing brain will not migrate to their final position in the cortex. These two dyslexia susceptibility genes DYX1C1 and ROBO1 revealed two distinct neurodevelopmental mechanisms of dyslexia, axonal pathfinding and neuronal migration. This study describes the discovery of the genes and our research to clarify their role in developmental dyslexia.
Abstract:
Infection by Epstein-Barr virus (EBV) occurs in approximately 95% of the world's population. EBV was the first human virus implicated in oncogenesis. Characteristic of EBV primary infection are detectable IgM and IgG antibodies against viral capsid antigen (VCA). During convalescence the VCA IgM disappears while the VCA IgG persists for life. Reactivations of EBV occur both among immunocompromised and immunocompetent individuals. In serological diagnosis, measurement of the avidity of VCA IgG separates primary from secondary infections. However, in the serodiagnosis of mononucleosis it is quite common to encounter, paradoxically, VCA IgM together with high-avidity VCA IgG, indicating past immunity. We determined the etiology of this phenomenon and found that, among patients with cytomegalovirus (CMV) primary infection, a large proportion (23%) showed antibody profiles of EBV reactivation. In contrast, EBV primary infection did not appear to induce immunoreactivation of CMV. EBV-associated post-transplant lymphoproliferative disease (PTLD) is a life-threatening complication of allogeneic stem cell or solid organ transplantation. PTLD may present with a diverse spectrum of clinical symptoms and signs. Due to the rapidity of PTLD progression, especially after stem cell transplantation, the diagnosis must be obtained quickly. With timely detection, the evolution of the fatal disease may be halted by reduction of immunosuppression. A promising new PTLD treatment (also in Finland) is based on anti-CD20 monoclonal antibodies. Diagnosis of PTLD has been demanding because of immunosuppression, blood transfusions and the latent nature of the virus. In 1999 we set up, to our knowledge the first in Finland for any microbial pathogen, a real-time quantitative PCR (qPCR) assay for the detection of EBV DNA in blood serum/plasma. In addition, we set up an in situ hybridisation assay for EBV RNA in tissue sections. In collaboration with a group of haematologists at Helsinki University Central Hospital, we retrospectively determined the incidence of PTLD among 257 allogeneic stem cell transplantations (SCT) performed during 1994-1999. Post-mortem analysis revealed 18 cases of PTLD. From a subset of PTLD cases (12/18) and a series of corresponding controls (36), consecutive serum samples were studied by the new EBV qPCR. All the PTLD patients were positive for EBV DNA, with progressively rising copy numbers. In most PTLD patients EBV DNA became detectable within 70 days of SCT. Of note, the appearance of EBV DNA preceded the PTLD symptoms (fever, lymphadenopathy, atypical lymphocytes). Among the SCT controls, EBV DNA occurred only sporadically, and the EBV-DNA levels remained relatively low. We concluded that EBV qPCR is a highly sensitive (100%) and specific (96%) new diagnostic approach. We also looked for and found risk factors for the development of PTLD. Together with a liver transplantation group at the Transplantation and Liver Surgery Clinic, we wanted to clarify how often and how severely EBV infections occur after liver transplantation. We studied by EBV qPCR 1284 plasma samples obtained from 105 adult liver transplant recipients. EBV DNA was detected in 14 patients (13%) during the first 12 months. The peak viral loads of 13 asymptomatic patients were relatively low (<6600/ml), and EBV DNA subsided quickly from the circulation. Fatal PTLD was diagnosed in one patient.
Finally, we wanted to determine the number and clinical significance of EBV infections of various types occurring among a large, retrospective, nonselected cohort of allogeneic SCT recipients. We analysed by EBV qPCR 5479 serum samples from 406 SCT recipients obtained during 1988-1999. EBV DNA was seen in 57 (14%) patients, of whom 22 (5%) showed progressively rising and ultimately high levels of EBV DNA (median 54 million/ml). Among the SCT survivors, EBV DNA was transiently detectable in 19 (5%) asymptomatic patients. Thus, low-level EBV-DNA positivity in serum occurs relatively often after SCT and may subside without specific treatment. However, high molecular copy numbers (>50 000) are diagnostic for life-threatening EBV infection. We furthermore developed a mathematical algorithm for predicting the development of life-threatening EBV infection.
Abstract:
Typhoid fever is becoming an ever-increasing threat in the developing countries. We have improved considerably upon the existing PCR-based diagnosis method by designing primers against a region that is unique to Salmonella enterica subsp. enterica serovar Typhi and Salmonella enterica subsp. enterica serovar Paratyphi A, corresponding to the STY0312 gene in S. Typhi and its homolog SPA2476 in S. Paratyphi A. An additional set of primers amplifies another region in S. Typhi CT18 and S. Typhi Ty2 corresponding to the region between genes STY0313 to STY0316 but which is absent in S. Paratyphi A. The possibility of a false-negative result arising due to mutation in hypervariable genes has been reduced by targeting a gene unique to typhoidal Salmonella serovars as a diagnostic marker. The amplified region has been tested for genomic stability by amplifying the region from clinical isolates from patients in various geographical locations in India, thereby showing that this region is potentially stable. This set of primers can also differentiate between S. Typhi CT18, S. Typhi Ty2, and S. Paratyphi A, which have stable deletions in this specific locus. The PCR assay designed in this study has a sensitivity of 95%, compared to the Widal test, which has a sensitivity of only 63%. In certain cases, the PCR assay was more sensitive than the blood culture test, as PCR-based detection could also detect dead bacteria.