974 results for sustained virologic response
Abstract:
Although the treatment of most cancers has improved steadily, only a few metastatic solid tumors can be cured. Despite initial responses, resistant clones often emerge and the disease becomes refractory to available treatment modalities. Furthermore, resistance factors are shared between different treatment regimens; loss of response therefore typically occurs rapidly, and there is a tendency for cross-resistance between agents. New agents with novel mechanisms of action and without cross-resistance to currently available approaches are therefore needed. Modified oncolytic adenoviruses, featuring cancer-selective cell lysis and spread, constitute an interesting drug platform towards the goals of tumor specificity and the implementation of potent multimodal treatment regimens. In this work, we demonstrate the applicability of capsid-modified, transcriptionally targeted oncolytic adenoviruses in targeting gastric, pancreatic and breast cancer. A variety of capsid-modified adenoviruses were first tested for transductional specificity in gastric and pancreatic cancer cells and patient tissues, and then in mice. Oncolytic viruses featuring the same capsid modifications were then tested to confirm that successful transductional targeting translates into enhanced oncolytic potency. Capsid-modified oncolytic viruses also prolonged the survival of orthotopic tumor-bearing models of gastric and pancreatic cancer. Taken together, oncolytic adenoviral gene therapy could be a potent treatment for gastric and pancreatic cancer, and its specificity, potency and safety can be modulated by means of capsid modification. We also characterized a new intraperitoneal virus delivery method that improves the persistence of gene delivery to intraperitoneal gastric and pancreatic tumors. With a silica implant, steady and sustained virus release in the vicinity of the tumor improved the survival of orthotopic tumor-bearing mice. Furthermore, silica gel-based virus delivery lowered the toxicity-mediating proinflammatory cytokine response and the production of total and anti-adenovirus neutralizing antibodies (NAbs). In addition, silica shielded the virus against pre-existing NAbs, resulting in a more favourable biodistribution in preimmunized mice. The silica implant might therefore be of interest for treating intraperitoneally disseminated disease. Cancer stem cells are thought to be resistant to conventional cancer drugs and might play an important role in cancer relapse and the formation of metastases. Therefore, we examined whether transcriptionally modified oncolytic adenoviruses are able to kill these cells. Complete eradication of CD44+CD24-/low putative breast cancer stem cells was seen in vitro, and significant antitumor activity was detected in mice bearing CD44+CD24-/low-derived tumors. Thus, genetically engineered oncolytic adenoviruses have potential for destroying cancer-initiating cells, which may have relevance for the elimination of cancer stem cells in humans.
Abstract:
This section outlines the most important issues addressed in the management of the response in the two infected states, New South Wales and Queensland. The management of the response differed between the states for logistic, geographic and organisational reasons. Issues included the use of control centres and information centres, the problems associated with a lack of trained staff to undertake all the roles, legislative issues, controls on horse movements, the availability of resources for adequate surveillance, the challenges of communication between disparate groups, and tracing the movements of both humans and horses.
Abstract:
The equine influenza (EI) outbreak presented many challenges that required high-level coordination and decision making, as well as the development of new approaches for satisfactory and consistent resolution. This paper outlines the elements of the national coordination arrangements, preparatory arrangements in place prior to the outbreak that facilitated national coordination, and some of the issues faced and resolved in the response.
Abstract:
A small fraction of the energy absorbed in the light reactions of photosynthesis is re-emitted as chlorophyll-a fluorescence. Chlorophyll-a fluorescence and photochemistry compete for excitation energy in photosystem II (PSII); changes in photochemical capacity can therefore be detected through analysis of chlorophyll fluorescence. Chlorophyll fluorescence techniques have been widely used to follow the diurnal (fast) and seasonal (slow) acclimation of the energy partitioning between photochemical and non-photochemical processes in PSII. Energy partitioning in PSII estimated through chlorophyll fluorescence can be used as a proxy for plant physiological status and measured at different spatial and temporal scales. However, a number of technical and theoretical constraints still limit the use of chlorophyll fluorescence data for studying the acclimation of PSII. The aim of this Thesis was to study the diurnal and seasonal acclimation of PSII under field conditions through the development and testing of new chlorophyll fluorescence-based tools that overcome these limitations. A new model capable of following the fast acclimation of PSII to rapid fluctuations in light intensity was developed and used to study the rapid acclimation of the electron transport rate under fluctuating light. Additionally, new chlorophyll fluorescence parameters were developed for estimating the seasonal acclimation of the sustained rate constants of thermal energy dissipation and photochemistry. These parameters were used to quantitatively evaluate the effect of light and temperature on the seasonal acclimation of PSII. The results indicated that the light environment affected not only the degree but also the kinetics of the acclimation response to temperature, which was attributed to differences in the structural organization of PSII during seasonal acclimation. Furthermore, zeaxanthin-facilitated thermal dissipation appeared to be the main mechanism modulating the fraction of absorbed energy dissipated thermally during winter in field-grown Scots pine. Finally, the integration between diurnal and seasonal acclimation mechanisms was studied using a recently developed instrument, the MONI-PAM (Walz GmbH, Germany), which is capable of continuously monitoring the energy partitioning in PSII.
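For background, energy partitioning in PSII is typically derived from standard pulse-amplitude-modulated (PAM) fluorescence parameters; the definitions below are the conventional ones (general background only, not the new parameters developed in this Thesis):

```latex
% Standard PAM chlorophyll fluorescence parameters (conventional definitions,
% not the Thesis's new parameters).
% F_0, F_m : minimum and maximum fluorescence of the dark-adapted state
% F', F_m' : steady-state and maximum fluorescence in the light-adapted state
\[
  \frac{F_v}{F_m} = \frac{F_m - F_0}{F_m} \quad \text{(maximum quantum yield of PSII photochemistry)}
\]
\[
  \Phi_{\mathrm{PSII}} = \frac{F_m' - F'}{F_m'} \quad \text{(effective PSII quantum yield in the light)}
\]
\[
  \mathrm{NPQ} = \frac{F_m - F_m'}{F_m'} \quad \text{(non-photochemical quenching)}
\]
```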
Abstract:
Fumigation of stored grain with phosphine (PH3) is used widely to control the lesser grain borer, Rhyzopertha dominica. However, the development of high-level resistance to phosphine in this species threatens control. Effective resistance management relies on knowledge of the expression of resistance in relation to dosage at all life stages. Therefore, we determined the mode of inheritance of phosphine resistance and the strength of the resistance phenotype at each developmental stage. We achieved this by comparing mortality and developmental delay between a strongly resistant strain (R-strain), a susceptible strain (S-strain) and their F1 progenies. Resistance was a maternally inherited, semi-dominant trait in the egg stage but was inherited as an autosomal, incompletely recessive trait in larvae and pupae. The rank order of developmental tolerance in both the sensitive and resistant strains was eggs > pupae > larvae. Comparison of published values for the response of adult R. dominica with our results from immature stages reveals that the adult stage of the S-strain is more sensitive to phosphine than are larvae. This situation is reversed in the R-strain, as the adult stage is much more resistant to phosphine than even the most tolerant immature stage. Phosphine resistance factors at LC50 were 400× for eggs, 87× for larvae and 181× for pupae with respect to reference susceptible strain (S-strain) adults, indicating that the tolerance conferred by a particular immature stage neither strongly nor reliably interacts with the genetic resistance element. Developmental delay relative to unfumigated control insects was observed in 93% of resistant pupae, 86% of resistant larvae and 41% of resistant eggs. Increased delay in development and the toxicity response to phosphine exposure were both incompletely recessive. We show that resistance to phosphine has pleiotropic effects and that the expression of these effects varies with genotype and throughout the life history of the insect.
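As a minimal illustration of how such resistance factors are expressed (the LC50 values below are hypothetical placeholders chosen only to reproduce the reported ratios, not data from the study), the resistance factor of each immature stage is its LC50 divided by the LC50 of susceptible-strain adults:

```python
# Minimal sketch: fold-resistance of each R-strain life stage relative to the
# LC50 of susceptible (S-strain) adults. All LC50 values are hypothetical
# placeholders, not data from the study.

def resistance_factor(lc50_resistant_stage: float, lc50_susceptible_adult: float) -> float:
    """Fold-resistance relative to the susceptible adult reference."""
    return lc50_resistant_stage / lc50_susceptible_adult

lc50_s_adult = 0.01  # hypothetical LC50 (mg/L) for S-strain adults
hypothetical_lc50_r = {"eggs": 4.0, "larvae": 0.87, "pupae": 1.81}  # chosen to give ~400x, 87x, 181x

for stage, lc50 in hypothetical_lc50_r.items():
    print(f"{stage}: {resistance_factor(lc50, lc50_s_adult):.0f}x")
```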
Abstract:
Rhizoctonia solani is a soil-inhabiting basidiomycetous fungus able to induce a wide range of symptoms in many plant species. This genetically complex species is divided into 13 anastomosis groups (AG), of which AG-3 is specialized to infect potato. However, a few other AGs are also able to infect or live in close contact with potato. On potato, R. solani infection causes two main types of disease: stem canker, observed as dark brown lesions on developing stems and stolons, and black scurf, which develops on new tubers close to the time of harvest. These disease symptoms are collectively called the ‘Rhizoctonia disease complex’. Between growing seasons, R. solani survives in soil and plant debris as sclerotia, or as the sclerotia called black scurf on potato tubers, which, when used as seed, offer the main route for dispersal of the fungus to new areas. The dominance of AG-3 on potato seems to be attributable to its high specialization to potato and its ability to infect and form sclerotia efficiently at low temperatures. In this study, a large nationwide survey of R. solani isolates was made in potato crops in Finland. Almost all characterized isolates belonged to AG-3. Additionally, three other AGs (AG-2-1, AG-4 and AG-5) were found associated with symptoms on potato plants, but they were weaker pathogens on potato than AG-3 and less prone to form black scurf. According to phylogenetic analysis of the internal transcribed spacer (ITS) sequences of the ribosomal RNA genes, the Finnish AG-3 isolates are closely related to each other, even though a wide variation in physiological features was observed between them. Detailed analysis of the ITS regions revealed single nucleotide polymorphisms at 14 nucleotide positions of ITS-1 and ITS-2. Additionally, compensatory base changes in ITS-2 were detected, which suggests that potato-infecting R. solani AG-3 could be considered a separate species rather than an AG of R. solani. For the first time, molecular defence responses were studied and detected during the early phases of the interaction between R. solani AG-3 and potato. Extensive systemic signalling for defence, exploiting several known defence pathways, was activated as soon as R. solani came into close contact with the base of a sprout. The defence response was strong enough to protect vulnerable sprout tips from new attacks by the pathogen. These results at least partly explain why potato emergence is eventually successful even under heavy infection pressure from R. solani.
Abstract:
Propagation of subtropical eucalypts is often limited by low production of rooted cuttings in winter. This study tested whether changing the temperature of Corymbia citriodora and Eucalyptus dunnii stock plants from 28/23°C (day/night) to 18/13°C, 23/18°C or 33/28°C affected the production of cuttings by stock plants, the concentrations of Ca and other nutrients in cuttings, and the subsequent percentages of cuttings that formed roots. Optimal temperatures for shoot production were 33/28°C and 28/23°C, with lower temperatures reducing the number of harvested cuttings. Stock plant temperature regulated production of rooted cuttings, firstly by controlling shoot production and, secondly, by affecting the ensuing rooting percentage. Shoot production was the primary factor regulating rooted cutting production by C. citriodora, but both shoot production and root production were key determinants of rooted cutting production in E. dunnii. Effects of lower stock plant temperatures on rooting were not the result of reduced Ca concentration, but consistent relationships were found between adventitious root formation and B concentration. Average rooting percentages were low (1-15% for C. citriodora and 2-22% for E. dunnii) but rooted cutting production per stock plant (e.g. 25 for C. citriodora and 52 for E. dunnii over 14 weeks at 33/28°C) was sufficient to establish clonal field tests for plantation forestry.
Abstract:
Tribolium castaneum (Herbst) and Rhyzopertha dominica (F.) are common cosmopolitan pests of stored grain and grain products. We evaluated the relative attraction of T. castaneum and R. dominica to wheat, sorghum and cotton seeds in the field, near grain storage facilities and well away from storages in southern and central Queensland, using multiple trapping techniques. The results show that T. castaneum is more strongly attracted to linted cotton seed relative to wheat, whereas R. dominica did not respond to cotton seed at all and was attracted only to wheat. Significantly more adults of T. castaneum (10-15 times) were attracted to traps placed on the ground, near grain storage, than to equivalent traps that were suspended (1.5 m above the ground) nearby. These results suggest that Tribolium beetles detect and respond to resources towards the end of their dispersal flight, after which they localize resources while walking. By contrast, R. dominica was captured only in suspended traps, which suggests they fly directly onto resources as they localize them. The ability of both species to colonize and reproduce in isolated resource patches within the relatively short time of 1 month is illustrated by the returns from the traps deployed in the field (at least 1 km from the nearest stored grain), even though they caught only a few beetles. The results presented here provide novel insights into the resource location behaviours of both T. castaneum and R. dominica. In particular, the relationship of T. castaneum with non-cereal resources that are not conventionally associated with this species suggests an emphasis on these other resources in investigating the resource location behaviour of these beetles. This new perspective on the ecology of T. castaneum highlights the potential role of non-cereal resources (such as the lint on cotton seed) in the spread of grain pest infestations.
Abstract:
More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (BFDC National Database). The yield responses are accompanied by various pre-plant soil test data to quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield ((Y0/Ymax) × 100) and N soil test value. In this paper we report the critical soil test values for 90% relative yield (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia’s main grain-production regions in different eras, starting from the 1960s in Queensland through to Victoria during the 2000s. Improved management practices adopted during the period were reflected in increasing potential yields with research era: average Ymax increased from 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia (SA) in the 1980s, 4.3 t/ha in New South Wales (NSW) in the 1990s, and 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1–1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) from pre-planting soil samples were used and provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha. Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of the wheat trials with available grain protein data (~45% of all experiments) established that grain yield, and not grain N content, was the major driver of crop N demand and CV90. Subsets of data were used to explore the impact of crop management practices such as crop rotation or fallow length on both pre-planting profile mineral-N and CV90. These analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of yield response to applied N fertiliser, they had no significant impact on CV90. A level of risk is involved in the use of pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance is based almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing is indicative of minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential as well as reducing the opportunity for effective in-season applications.
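The abstract does not specify the response function fitted by the BFDC Interrogator; the sketch below assumes a Mitscherlich-type curve and uses invented data purely to illustrate how a CV90 can be read off a fitted calibration between relative yield and a pre-plant nitrate-N soil test:

```python
# Sketch of deriving a critical soil test value (CV90) from a yield-response calibration.
# Assumptions: a Mitscherlich-type curve RY = a * (1 - exp(-b * x)) is used purely for
# illustration, and the soil-test / relative-yield data are invented; neither comes from
# the BFDC National Database.
import numpy as np
from scipy.optimize import curve_fit, brentq

# Invented example data: pre-plant nitrate-N (kg/ha, 0-0.6 m) and relative yield (%)
soil_nitrate_n = np.array([10, 20, 35, 50, 70, 90, 120, 150], dtype=float)
relative_yield = np.array([26, 45, 65, 78, 88, 93, 97, 99], dtype=float)

def mitscherlich(x, a, b):
    """Asymptotic relative-yield response to the soil test value."""
    return a * (1.0 - np.exp(-b * x))

# Fit the calibration curve to the (invented) trial data
(a_hat, b_hat), _ = curve_fit(mitscherlich, soil_nitrate_n, relative_yield, p0=(100.0, 0.02))

# CV90 is the soil test value at which the fitted relative yield reaches 90%
cv90 = brentq(lambda x: mitscherlich(x, a_hat, b_hat) - 90.0, 1.0, 300.0)
print(f"Estimated CV90 ~ {cv90:.0f} kg nitrate-N/ha")
```

A CR90, defined in the abstract as the 70% confidence interval around CV90, would then be obtained from the uncertainty of the fitted calibration rather than from a single point estimate.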
Abstract:
Soil testing is the most widely used tool to predict the need for fertiliser phosphorus (P) application to crops. This study examined factors affecting critical soil P concentrations and confidence intervals for wheat and barley grown in Australian soils by interrogating validated data from 1777 wheat and 150 barley field treatment series now held in the BFDC National Database. To narrow the confidence intervals associated with estimated critical P concentrations, filters for yield, crop stress, or low pH were applied. Once treatment series with low yield (<1 t/ha), severe crop stress, or pHCaCl2 <4.3 were screened out, critical concentrations were relatively insensitive to wheat yield (>1 t/ha). There was a clear increase in critical P concentration from early trials, when full tillage was common, compared with those conducted in 1995–2011, a period of rapid shift towards adoption of minimum tillage. For wheat, critical Colwell-P concentrations associated with 90 or 95% of maximum yield varied among Australian Soil Classification (ASC) Orders and Sub-orders: Calcarosol, Chromosol, Kandosol, Sodosol, Tenosol and Vertosol. Soil type, based on ASC Orders and Sub-orders, produced critical Colwell-P concentrations at 90% of maximum relative yield ranging from 15 mg/kg (Grey Vertosol) to 47 mg/kg (Supracalcic Calcarosol), with other soils having values in the range 19–27 mg/kg. Distinctive differences in critical P concentrations were evident among Sub-orders of Calcarosols, Chromosols, Sodosols, Tenosols, and Vertosols, possibly due to differences in soil properties related to P sorption. However, insufficient data were available to develop a relationship between P buffering index (PBI) and critical P concentration. In general, there was no evidence that critical concentrations for barley would differ from those for wheat on the same soils. Significant knowledge gaps that need to be filled to improve the relevance and reliability of soil P testing for winter cereals were: a lack of data for oats; the paucity of treatment series reflecting current cropping practices, especially minimum tillage; and inadequate metadata on soil texture, pH, growing season rainfall, gravel content, and PBI. The critical concentrations determined illustrate the importance of recent experimental data and of soil type, but also provide examples of interrogation pathways into the BFDC National Database to extract locally relevant critical P concentrations for guiding P fertiliser decision-making in wheat and barley.
Abstract:
An increase in the oxidation of succinate by hepatic mitochondria was observed in rats exposed to hypoxia (O2-N2; 1:9, v/v) or hypobaria (0.5 atm), which appears to be due to modification of the activity of the rate-limiting succinate dehydrogenase [succinate:(acceptor) oxidoreductase, EC 1.3.99.1].
Abstract:
A recent report to the Australian Government identified concerns about Australia's capacity to respond to a medium to large outbreak of foot-and-mouth disease (FMD). To assess the resources required, the AusSpread disease simulation model was used to develop a plausible outbreak scenario that included 62 infected premises in five different states at the time of detection, 28 days after the disease entered the first property in Victoria. Movements of infected animals and/or contaminated product and equipment led to smaller outbreaks in NSW, Queensland, South Australia and Tasmania. With unlimited staff resources, the outbreak was eradicated in 63 days with 54 infected premises and a 98% chance of eradication within 3 months; this unconstrained response was estimated to involve 2724 personnel. Because unlimited personnel were considered unrealistic, the course of the outbreak was modelled using three levels of staffing, and the probability of achieving eradication within 3 or 6 months of introduction was determined. Under the baseline staffing level, there was only a 16% probability that the outbreak would be eradicated within 3 months, and a 60% probability of eradication within 6 months. Deployment of an additional 60 personnel in the first 3 weeks of the response increased the likelihood of eradication within 3 months to 68%, and to 100% within 6 months. Deployment of further personnel incrementally increased the likelihood of timely eradication and decreased the duration and size of the outbreak. Targeted use of vaccination in high-risk areas, coupled with the baseline personnel resources, increased the probability of eradication within 3 months to 74% and to 100% within 6 months; this required 25 vaccination teams commencing 12 days into the control program, increasing to 50 vaccination teams 3 weeks later. Deploying an equal number of additional personnel to surveillance and infected-premises operations was equally effective in reducing the outbreak size and duration.
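AusSpread itself is a detailed spatial simulation whose internals are not described here; the toy Monte Carlo sketch below, with entirely invented spread and culling parameters, only illustrates the general style of analysis, i.e. estimating the probability of eradication within a fixed horizon under different staffing levels by repeated stochastic simulation:

```python
# Toy Monte Carlo sketch, NOT the AusSpread model: estimate the probability of
# eradicating an outbreak within a time horizon under different staffing levels.
# Every parameter below (spread rate, culling capacity per 20 personnel, horizon)
# is invented purely for illustration.
import numpy as np

def simulate_outbreak(personnel: int, horizon_days: int = 90, seed: int | None = None) -> bool:
    """Return True if no infected premises remain within the horizon (toy model)."""
    rng = np.random.default_rng(seed)
    infected = 62                     # infected premises at detection (from the scenario above)
    spread_rate = 0.10                # invented: daily chance each infected premises infects another
    daily_capacity = personnel // 20  # invented: premises resolved per day per 20 response personnel
    for _ in range(horizon_days):
        new_cases = rng.binomial(infected, spread_rate)
        resolved = min(infected + new_cases, daily_capacity)
        infected = infected + new_cases - resolved
        if infected == 0:
            return True
    return False

def eradication_probability(personnel: int, runs: int = 2000) -> float:
    """Monte Carlo estimate of P(eradication within the horizon) for a staffing level."""
    return sum(simulate_outbreak(personnel, seed=i) for i in range(runs)) / runs

for staff in (100, 160, 220):
    print(f"{staff:>3} personnel: P(eradicated within 90 days) ~ {eradication_probability(staff):.2f}")
```

In this toy model, adding personnel raises the daily resolution capacity above the expected number of new infections, which is what flips the outbreak from growing to shrinking and produces the step-change in eradication probability analogous to that reported for the modelled scenarios.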