251 results for Saturated throughput
Abstract:
The role that heparanase plays during metastasis and angiogenesis in tumors makes it an attractive target for cancer therapeutics. Despite this enzyme's significance, most of the assays developed to measure its activity are complex. Moreover, they usually rely on labeling variable preparations of the natural substrate heparan sulfate, making comparisons across studies precarious. To overcome these problems, we have developed a convenient assay based on the cleavage of the synthetic heparin oligosaccharide fondaparinux. The assay measures the appearance of the disaccharide product of heparanase-catalyzed fondaparinux cleavage colorimetrically using the tetrazolium salt WST-1. Because this assay has a homogeneous substrate with a single point of cleavage, the kinetics of the enzyme can be reliably characterized, giving a Km of 46 μM and a kcat of 3.5 s⁻¹ with fondaparinux as substrate. The inhibition of heparanase by the published inhibitor PI-88 was also studied, and a Ki of 7.9 nM was determined. The simplicity and robustness of this method should not only greatly assist routine assay of heparanase activity but also allow it to be adapted for high-throughput screening of compound libraries, with the data generated being directly comparable across studies.
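As a hedged illustration of how the reported constants enter the standard Michaelis-Menten rate law, the sketch below evaluates the predicted rate at a few substrate concentrations. The enzyme concentration is a hypothetical value, and treating PI-88 as a simple competitive inhibitor is an assumption for illustration, not a claim from the paper.

```python
# Michaelis-Menten rate law with competitive inhibition, using the
# constants reported in the abstract. A sketch only: the enzyme and
# substrate concentrations and the inhibition model are assumptions.

Km = 46e-6     # M, Michaelis constant for fondaparinux (from abstract)
kcat = 3.5     # 1/s, turnover number (from abstract)
Ki = 7.9e-9    # M, inhibition constant for PI-88 (from abstract)

def rate(S, E, I=0.0):
    """v = kcat*E*S / (Km*(1 + I/Ki) + S), competitive inhibition."""
    return kcat * E * S / (Km * (1.0 + I / Ki) + S)

E = 1e-9  # M, assumed enzyme concentration
for S in (10e-6, 46e-6, 460e-6):
    print(f"S = {S*1e6:6.1f} uM  v = {rate(S, E):.3e} M/s  "
          f"v(+10 nM PI-88) = {rate(S, E, 10e-9):.3e} M/s")
```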
Abstract:
Biotribology, the study of lubrication, wear and friction within the body, has become a topic of high importance as we continue to encounter debilitating diseases and trauma that destroy the function of the joints. A highly successful surgical procedure to replace the joint with an artificial equivalent alleviates dysfunction and pain. However, the wear of the bearing surfaces in prosthetic joints is a significant clinical problem, and more patients are surviving longer than the life expectancy of the joint replacement. Revision surgery is associated with increased morbidity and mortality and has a far less successful outcome than primary joint replacement. As such, it is essential to ensure that everything possible is done to limit the rate of revision surgery. Past experience indicates that the survival rate of the implant is influenced by many parameters; of primary importance are the material properties of the implant, the composition of the synovial fluid and the method of lubrication. In prosthetic joints, effective boundary lubrication is known to take place, so the interaction of the boundary lubricant and the bearing material is of utmost importance. The identity of the vital active ingredient within synovial fluid (SF), to which we owe the near-frictionless performance of our articulating joints, has been sought by researchers for many years. Once it is identified, tribological tests can determine which materials, and more importantly which surfaces, this fraction of SF functions best with. Surface-Active Phospholipids (SAPL) have been implicated as the body's natural load-bearing lubricant. Studies in this thesis are the first to fully characterise the adsorbed SAPL detected on the surface of retrieved prostheses and the first to verify the presence of SAPL on knee prostheses. Rinsings from the bearing surfaces of both hip and knee prostheses removed during revision operations were analysed using High Performance Liquid Chromatography (HPLC) to determine the presence and profile of SAPL. Several common prosthetic materials, along with a novel biomaterial, were investigated to determine their tribological interaction with various SAPLs. A pin-on-flat tribometer was used to make comparative friction measurements between the various tribo-pairs. A novel material, pyrolytic carbon (PyC), was screened as a potential candidate load-bearing prosthetic material. Friction measurements were also performed on explanted prostheses. SAPL was detected on all retrieved implant bearing surfaces. Eight different species of phosphatidylcholine were identified, and the relative concentrations of each species indicated that the unsaturated species are dominant. Initial tribological tests employed a saturated phosphatidylcholine (SPC); subsequent tests added the newly identified major constituents of SAPL, unsaturated phosphatidylcholines (USPC), as the test lubricant. All tests showed a dramatic reduction in friction when synthetic SAPL was used as the lubricant under boundary lubrication conditions. Some tribo-pairs showed more affinity for SAPL than others, and PyC outperformed the other prosthetic materials. Friction measurements with explanted prostheses verified the presence and performance of SAPL. SAPL, in particular phosphatidylcholine, plays an essential role in the lubrication of prosthetic joints.
Of particular interest was the ability of SAPLs to reduce friction and ultimately wear of the bearing materials. Identification and knowledge of the lubricating constituents of SF are invaluable not only for the future development of artificial joints but also for developing effective treatments for the several disease processes in which lubrication may play a role. The tribological interaction between the various tribo-pairs and SAPL is extremely favourable in the context of reducing friction at the bearing interface. Given its impressive tribological performance, PyC is highly recommended as a future candidate material for use in load-bearing prosthetic joints.
Abstract:
Art continues to bemuse and confuse many people today. Yet its critical literature is saturated with daunting analyses of contemporary art's exhaustion, its predictability or its absorption into global commercial culture. In this book, the author seeks to clarify this apprehensive perception of art. He argues that it is a consequence not only of confounding artworks but also of the paradoxical impetus of a culture of modernity. By positively reassessing the perplexing or apprehensive features of cultural modernity, as well as of aesthetic inquiry, this book redefines the ambitions of art in the wake of this legacy. In the process, it challenges many familiar approaches to art inquiry in order to offer a new understanding of the aesthetic, social and cultural aspirations of art in our time.
Abstract:
Background For more than a decade emergency medicine organizations have produced guidelines, training and leadership for disaster management. To date, however, there have been few guidelines for emergency physicians needing to provide a rapid response to a surge in demand. The aim of this study is to identify strategies that may guide surge management in the emergency department. Method A working group of individuals experienced in disaster medicine from the Australasian College for Emergency Medicine Disaster Medicine Subcommittee (the Australasian Surge Strategy Working Group) was established to undertake this work. The Working Group used a modified Delphi technique to examine response actions in surge situations. It identified underlying assumptions from epidemiological and empirical understanding, identified remedial strategies from the literature and from personal experience, and collated these within the domains of space, staff, supplies and system operation. Findings The recommendations detail 22 potential actions available to an emergency physician working in the context of surge. The Working Group also provides detailed guidance on surge recognition, triage, patient flow through the emergency department, and clinical goals and practices. Discussion These strategies provide guidance to emergency physicians confronting the challenges of a surge in demand. The paper also identifies areas that merit future research, including the measurement of surge capacity, constraints to strategy implementation, validation of surge strategies, and measurement of strategy impacts on throughput, cost and quality of care.
Abstract:
Background Takeaway consumption has been increasing and may contribute to socioeconomic inequalities in overweight/obesity and chronic disease. This study examined socioeconomic differences in takeaway consumption patterns and their contributions to inequalities in dietary intake. Method Cross-sectional dietary intake data were obtained from adults aged 25 to 64 years in the Australian National Nutrition Survey (n = 7319, 61% response rate). Twenty-four-hour dietary recalls ascertained intakes of takeaway food, nutrients, and fruit and vegetables. Education was used as the socioeconomic indicator. Data were analysed using logistic regression and general linear models. Results Thirty-two percent (n = 2327) consumed takeaway foods in the 24-hour period. Lower-educated participants were less likely than their higher-educated counterparts to have consumed takeaway foods overall (OR 0.64; 95% CI 0.52, 0.80). Of those consuming takeaway foods, the lowest-educated group was more likely to have consumed "less healthy" takeaway choices (OR 2.55; 95% CI 1.73, 3.77) and less likely to have consumed "healthy" choices (OR 0.52; 95% CI 0.36, 0.75). Takeaway foods made a greater contribution to energy, total fat, saturated fat and fibre intakes among lower- than higher-educated groups. A lower likelihood of fruit and vegetable consumption was observed among "less healthy" takeaway consumers, whereas a greater likelihood was found among "healthy" takeaway consumers. Conclusions The total amount and the types of takeaway foods consumed may contribute to socioeconomic inequalities in intakes of energy and total and saturated fats. However, takeaway consumption is unlikely to be a factor contributing to the lower fruit and vegetable intakes among socioeconomically disadvantaged groups.
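For readers unfamiliar with how odds ratios such as those above are reported, the sketch below shows the standard arithmetic for recovering an OR and its 95% CI from a logistic-regression coefficient. The coefficient and standard error are hypothetical values chosen only to resemble the first result quoted; they are not the study's data.

```python
import math

# How an odds ratio and 95% CI are derived from a logistic-regression
# coefficient. beta and se are hypothetical, picked so the output
# resembles the OR 0.64 (0.52, 0.80) quoted in the abstract.

beta, se = math.log(0.64), 0.11   # assumed coefficient and standard error

or_point = math.exp(beta)                 # point estimate
ci_low = math.exp(beta - 1.96 * se)       # lower 95% bound
ci_high = math.exp(beta + 1.96 * se)      # upper 95% bound
print(f"OR {or_point:.2f}; 95% CI {ci_low:.2f}, {ci_high:.2f}")
```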
Abstract:
Texture-based techniques for the visualisation of unsteady vector fields have been applied to the outputs of a finite volume model of variably saturated groundwater flow through porous media. This model was developed by staff in the School of Mathematical Sciences, QUT, for the study of saltwater intrusion into coastal aquifers. This presentation discusses the implementation and effectiveness of the Image Based Flow Visualisation (IBFV) algorithm in the context of visualising the groundwater simulation outputs.
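For context, a minimal NumPy sketch of the core IBFV loop (after van Wijk's Image Based Flow Visualisation): the image is repeatedly warped along the vector field and blended with injected noise, producing streaks aligned with the flow. The synthetic rotational field, grid size and blending weight stand in for the actual groundwater model outputs, and nearest-neighbour sampling replaces the bilinear texture mapping used in practice.

```python
import numpy as np

# Core IBFV loop: backward-advect the image along the field, then
# blend in fresh noise. Synthetic circulating field for illustration.
n, alpha, steps = 128, 0.1, 64
y, x = np.mgrid[0:n, 0:n].astype(float)
u, v = -(y - n / 2), (x - n / 2)      # synthetic rotational field
mag = np.hypot(u, v).max()
u, v = u / mag, v / mag               # limit displacement to <= 1 px/step

img = np.random.rand(n, n)
for _ in range(steps):
    # sample the image upstream of each pixel (nearest neighbour)
    xs = np.clip(np.rint(x - u).astype(int), 0, n - 1)
    ys = np.clip(np.rint(y - v).astype(int), 0, n - 1)
    img = (1 - alpha) * img[ys, xs] + alpha * np.random.rand(n, n)

# 'img' now shows streak patterns aligned with the flow; plot or save it.
```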
Abstract:
A major focus of research in nanotechnology is the development of novel, high-throughput techniques for the fabrication of arbitrarily shaped surface nanostructures at sub-100 nm to atomic scales. A related pursuit is the development of simple and efficient means for the parallel manipulation and redistribution of adsorbed atoms, molecules and nanoparticles on surfaces – adparticle manipulation. These techniques will be used for the manufacture of nanoscale surface-supported functional devices in nanotechnologies such as quantum computing, molecular electronics and lab-on-a-chip, as well as for modifying surfaces to obtain novel optical, electronic, chemical or mechanical properties. A favourable approach to the formation of surface nanostructures is self-assembly. In self-assembly, nanostructures are grown by aggregation of individual adparticles that diffuse by thermally activated processes on the surface. The passive nature of this process means it is generally not suited to the formation of arbitrarily shaped structures. The self-assembly of nanostructures at arbitrary positions has been demonstrated, though this has typically required a pre-patterning treatment of the surface using sophisticated techniques such as electron beam lithography. A parallel adparticle manipulation technique, on the other hand, would be suited to directing the self-assembly process to occur at arbitrary positions, without the need for pre-patterning the surface. There is at present a lack of techniques for the parallel manipulation and redistribution of adparticles to arbitrary positions on the surface. This is an issue that needs to be addressed, since such techniques can play an important role in nanotechnology. In this thesis, we propose such a technique – thermal tweezers. In thermal tweezers, adparticles are redistributed by localised heating of the surface. This locally enhances the surface diffusion of adparticles so that they rapidly diffuse away from the heated regions. Using this technique, the redistribution of adparticles to form a desired pattern is achieved by heating the surface at specific regions. In this project, we have focussed on the holographic implementation of this approach, in which the surface is heated by holographic patterns of interfering pulsed laser beams. This implementation is suitable for the formation of arbitrarily shaped structures; the only condition is that the shape can be produced by holographic means. In the simplest case, the laser pulses are linearly polarised and intersect to form an interference pattern that is a modulation of intensity along a single direction. Strong optical absorption at the intensity maxima of the interference pattern results in an approximately sinusoidal variation of the surface temperature along one direction. The main aim of this research project is to investigate the feasibility of the holographic implementation of thermal tweezers as an adparticle manipulation technique. Firstly, we investigate theoretically the surface diffusion of adparticles in the presence of a sinusoidal modulation of the surface temperature. Very strong redistribution of adparticles is predicted when there is strong interaction between the adparticle and the surface and the amplitude of the temperature modulation is ~100 K. We propose a thin metallic film deposited on a glass substrate, heated by interfering laser beams at optical wavelengths, as a means of generating a very large amplitude of surface temperature modulation.
Indeed, we predict theoretically, by numerical solution of the thermal conduction equation, that the amplitude of the temperature modulation on the metallic film can be much greater than 100 K when heated by nanosecond pulses with an energy of ~1 mJ. The formation of surface nanostructures of less than 100 nm in width is predicted at optical wavelengths in this implementation of thermal tweezers. Furthermore, we propose a simple extension to this technique in which a spatial phase shift of the temperature modulation effectively doubles or triples the resolution. Increased resolution is also predicted by reducing the wavelength of the laser pulses. In addition, we present two distinctly different, computationally efficient numerical approaches for the theoretical investigation of the surface diffusion of interacting adparticles – the Monte Carlo Interaction Method (MCIM) and the random potential well method (RPWM). Using each of these approaches, we have investigated thermal tweezers for the redistribution of both strongly and weakly interacting adparticles. We predict that strong interactions between adparticles can increase the effectiveness of thermal tweezers, demonstrating practically complete adparticle redistribution into the low-temperature regions of the surface. This is promising from the point of view of thermal tweezers applied to the directed self-assembly of nanostructures. Finally, we present a new and more efficient numerical approach to the theoretical investigation of thermal tweezers for non-interacting adparticles. In this approach, the local diffusion coefficient is determined from the solution of the Fokker-Planck equation. The diffusion equation is then solved numerically using the finite volume method (FVM) to directly obtain the probability density of adparticle position. We compare the predictions of this approach to those of the Ermak algorithm solution of the Langevin equation, and relatively good agreement is shown at intermediate and high friction. In the low-friction regime, we predict and investigate the phenomenon of 'optimal' friction and attribute its occurrence to very long jumps of adparticles as they diffuse from the hot regions of the surface. Future research directions, both theoretical and experimental, are also discussed.
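As a rough illustration of the redistribution mechanism (not the thesis's MCIM, RPWM or FVM codes), the sketch below hops non-interacting adparticles on a 1-D lattice with Arrhenius rates under a sinusoidal temperature modulation: sites in the hot regions hop far more often, so particles accumulate in the cold regions. The barrier height, temperatures and lattice size are assumed values.

```python
import numpy as np

# Minimal Monte Carlo sketch of thermally activated adparticle hopping
# on a 1-D lattice with a sinusoidal surface-temperature modulation.
# All parameter values are illustrative assumptions.

rng = np.random.default_rng(0)
L, N, steps = 200, 5000, 2000
kB = 8.617e-5                       # eV/K
Ea = 0.8                            # eV, assumed diffusion barrier
T0, dT = 400.0, 100.0               # K, mean temperature and modulation
T = T0 + dT * np.sin(2 * np.pi * np.arange(L) / L)
p_hop = np.exp(-Ea / (kB * T))      # Arrhenius hop rate per site
p_hop /= p_hop.max()                # scale: hottest site hops ~every step

pos = rng.integers(0, L, size=N)    # random initial adparticle positions
for _ in range(steps):
    hops = rng.random(N) < p_hop[pos]        # thermally activated attempts
    step = rng.choice((-1, 1), size=N)       # unbiased hop direction
    pos = np.where(hops, (pos + step) % L, pos)

hist = np.bincount(pos, minlength=L)
print("occupancy at hottest site :", hist[L // 4])       # T maximum
print("occupancy at coldest site :", hist[3 * L // 4])   # T minimum
```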
Abstract:
A wide range of screening strategies have been employed to isolate antibodies and other proteins with specific attributes, including binding affinity, specificity, stability and improved expression. However, there remains no high-throughput system to screen for target-binding proteins in a mammalian, intracellular environment. Such a system would allow binding reagents to be isolated against intracellular clinical targets such as cell-signalling proteins associated with tumour formation (p53, ras, cyclin E), proteins associated with neurodegenerative disorders (huntingtin, beta-amyloid precursor protein), and various proteins crucial to viral replication (e.g. HIV-1 proteins such as Tat, Rev and Vif-1), which are difficult to screen by phage, ribosome or cell-surface display. This study used the β-lactamase protein complementation assay (PCA) as the display and selection component of a system for screening a protein library in the cytoplasm of HEK 293T cells. The colicin E7 (ColE7) and Immunity protein 7 (Imm7) Escherichia coli proteins were used as model interaction partners for developing the system. These proteins drove effective β-lactamase complementation, resulting in a signal-to-noise ratio (9:1 to 13:1) comparable to that of other β-lactamase PCAs described in the literature. The model Imm7-ColE7 interaction was then used to validate protocols for library screening. Single positive cells that harboured the Imm7 and ColE7 binding partners were identified and isolated using flow cytometric cell sorting in combination with the fluorescent β-lactamase substrate CCF2/AM. A single-cell PCR was then used to amplify the Imm7 coding sequence directly from each sorted cell. With the screening system validated, it was used to screen a protein library based on the Imm7 scaffold against a proof-of-principle target. The wild-type Imm7 sequence, as well as mutants with wild-type residues in the ColE7-binding loop, were enriched from the library after a single round of selection, which is consistent with other eukaryotic screening systems such as yeast and mammalian cell-surface display. In summary, this thesis describes a new technology for screening protein libraries in a mammalian, intracellular environment. This system has the potential to complement existing screening technologies by allowing access to intracellular proteins and expanding the range of targets available to the pharmaceutical industry.
Abstract:
Defibrillator is a 16'41" musical work for solo performer, laptop computer and electric guitar. The electric guitar is processed in real time by a digital signal processing network in software, with gestural control provided by a foot-operated pedal board.

The work is informed by a range of ideas from the genres of electroacoustic music, western art music, popular music and cinematic sound. It seeks to fluidly cross and hybridise musical practices from these diverse sonic traditions and to develop a compositional language that draws upon multiple genres while resisting location within any single genre. Musical structures and sonic markers that define genre are ruptured at strategic levels of the musical structure to allow a cross-flow of concepts between genres. The process of rupture is facilitated by the practical implementation of theories of music and sound reception into the compositional process.

The piece exhibits the by-products of a composer born into a media-saturated environment, drawing on a range of musical and sonic traditions and actively seeking to explore the liminal space between them. The project stems from the author's research interests in locating points of connection between traditions of experimentation in diverse musical and sonic traditions arising from the broad uptake of media technologies in the early 20th century.
Abstract:
The radiation chemistry and grafting of a fluoropolymer, poly(tetrafluoroethylene-co-perfluoropropyl vinyl ether) (PFA), were investigated with the aim of developing a highly stable grafted support for use in solid phase organic chemistry (SPOC). A radiation-induced grafting method was used whereby the PFA was exposed to ionizing radiation to form free radicals capable of initiating graft copolymerization of styrene. To fully investigate this process, both the radiation chemistry of PFA and the grafting of styrene to PFA were examined. Radiation alone was found to have a detrimental effect on PFA when irradiated at 303 K, evident from the loss in mechanical properties due to chain scission reactions. This meant that when radiation was used for the grafting reactions, the total radiation dose needed to be kept as low as possible. The radicals produced when PFA was exposed to radiation were examined using electron spin resonance spectroscopy. Both main-chain (–CF2–C•F–CF2–) and end-chain (–CF2–C•F2) radicals were identified. The stability of the majority of the main-chain radicals when the polymer was heated above the glass transition temperature suggested that they were present mainly in the crystalline regions of the polymer, while the end-chain radicals were predominantly located in the amorphous regions. The radical yield at 77 K was lower than that at 303 K, suggesting that cage recombination at low temperatures inhibited free radicals from stabilizing. High-speed MAS 19F NMR was used to identify the non-volatile products after irradiation of PFA over a wide temperature range. The major products observed over the irradiation temperature range of 303 to 633 K included new saturated chain ends, short fluoromethyl side chains in both the amorphous and crystalline regions, and long branch points. The proportion of the radiolytic products shifted from mainly chain scission products at low irradiation temperatures to extensive branching at higher irradiation temperatures. Calculations of G values revealed that net crosslinking only occurred when PFA was irradiated in the melt. Minor products after irradiation at elevated temperatures included internal and terminal double bonds and CF3 groups adjacent to double bonds. The volatile products after irradiation at 303 K included tetrafluoromethane (CF4) and oxygen-containing species from loss of the perfluoropropyl ether side chains of PFA, as identified by mass spectrometry and FTIR spectroscopy. The chemical changes induced by radiation exposure were accompanied by changes in the thermal properties of the polymer. Changes in the crystallinity and thermal stability of PFA after irradiation were examined using DSC and TGA techniques. The equilibrium melting temperature of untreated PFA was 599 K, as determined by extrapolation of the melting temperatures of imperfectly formed crystals. After low-temperature irradiation, radiation-induced crystallization was prevalent due to scission of strained tie molecules, loss of perfluoropropyl ether side chains, and lowering of the molecular weight, which promoted chain alignment and hence higher crystallinity. After irradiation at high temperatures, the presence of short and long branches hindered crystallization, lowering the overall crystallinity. The thermal stability of the PFA decreased with increasing radiation dose and temperature due to the introduction of defect groups.
Styrene was graft copolymerized to PFA using γ-radiation as the initiation source, with the aim of preparing a graft copolymer suitable as a support for SPOC. Various grafting conditions were studied, such as the total dose, dose rate, solvent effects and the addition of nitroxides to create "living" graft chains. The effect of dose rate was examined when grafting styrene vapour to PFA using the simultaneous grafting method. The initial rate of grafting was found to be independent of the dose rate, which implied that the reaction was diffusion controlled. When the styrene was dissolved in various solvents for the grafting reaction, the graft yield was strongly dependent on the type and concentration of the solvent used. The greatest graft yield was observed when the solvent swelled the grafted layers and the substrate. Microprobe Raman spectroscopy was used to map the penetration of the graft into the substrate. The grafted layer was found to contain both poly(styrene) (PS) and PFA and became thicker with increasing radiation dose and graft yield, which showed that grafting began at the surface and progressively penetrated the substrate as the grafted layer swelled. The molecular weight of the grafted PS was estimated by measuring the molecular weight of the non-covalently bonded homopolymer formed in the grafted layers using SEC. The molecular weight of the occluded homopolymer was an order of magnitude greater than that of the free homopolymer formed in the surrounding solution, suggesting that the high viscosity in the grafted regions led to long PS grafts. When nitroxide-mediated free radical polymerization was used, grafting occurred within the substrate and not on the surface, due to diffusion of styrene into the substrate at the high temperatures needed for the reaction to proceed. Loading tests were used to measure the capacity of the PS graft to be functionalized with aminomethyl groups and then further derivatized. These tests showed that samples grafted in a solution of styrene and methanol had a superior loading capacity to samples grafted using other solvents, owing to the shallow penetration and hence better accessibility of the graft when methanol was used as the solvent.
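For orientation, the radiation-chemical yield (G value) used above is conventionally defined as the number of chemical events per 100 eV of energy absorbed. The sketch below shows the standard unit conversion from a measured product concentration and absorbed dose; the input numbers are hypothetical, not results from this thesis.

```python
# G value = events per 100 eV of absorbed energy. Hypothetical inputs:
# a product formed at 10 mmol per kg of polymer after a 100 kGy dose
# (1 Gy = 1 J/kg absorbed).

AVOGADRO = 6.022e23      # events per mole
EV_PER_J = 6.242e18      # eV per joule

product_mol_per_kg = 1e-2    # assumed measured product concentration
dose_gy = 100e3              # assumed absorbed dose

events_per_kg = product_mol_per_kg * AVOGADRO
ev_per_kg = dose_gy * EV_PER_J
G = events_per_kg / (ev_per_kg / 100.0)
print(f"G value = {G:.2f} events per 100 eV")   # ~0.96 for these inputs
```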
Abstract:
Campylobacter jejuni, followed by Campylobacter coli, contributes substantially to the economic and public health burden attributed to food-borne infections in Australia. Genotypic characterisation of isolates has provided new insights into the epidemiology and pathogenesis of C. jejuni and C. coli. However, currently available methods are not conducive to the large-scale epidemiological investigations that are necessary to elucidate the global epidemiology of these common food-borne pathogens. This research aims to develop high-resolution C. jejuni and C. coli genotyping schemes that are convenient for high-throughput applications. Real-time PCR and High Resolution Melt (HRM) analysis are fundamental to the genotyping schemes developed in this study and enable rapid, cost-effective interrogation of a range of different polymorphic sites within the Campylobacter genome. While the sources and routes of transmission of campylobacters are unclear, handling and consumption of poultry meat are frequently associated with human campylobacteriosis in Australia. Therefore, chicken-derived C. jejuni and C. coli isolates were used to develop and verify the methods described in this study. The first aim of this study describes the application of MLST-SNP (Multi Locus Sequence Typing Single Nucleotide Polymorphism) + binary typing to 87 chicken C. jejuni isolates using real-time PCR analysis. These typing schemes were developed previously by our research group using isolates from campylobacteriosis patients. The present study showed that SNP and binary typing, alone or in combination, are effective at detecting epidemiological linkage between chicken-derived Campylobacter isolates and enable data comparisons with other MLST-based investigations. SNP + binary types obtained from chicken isolates in this study were compared with a previously SNP + binary and MLST typed set of human isolates. Common genotypes between the two collections of isolates were identified, and ST-524 represented a clone that could be worth monitoring in the chicken meat industry. In contrast, ST-48, mainly associated with bovine hosts, was abundant in the human isolates but absent in the chicken isolates, indicating the role of non-poultry sources in causing human Campylobacter infections. This demonstrates the potential application of SNP + binary typing for epidemiological investigations and source tracing. While MLST SNPs and binary genes comprise the more stable backbone of the Campylobacter genome and are indicative of long-term epidemiological linkage of the isolates, Aim 2 of this study describes the development of a High Resolution Melt (HRM) curve analysis method to interrogate the hypervariable Campylobacter flagellin-encoding gene (flaA). The flaA gene product appears to be an important pathogenicity determinant of campylobacters and is therefore a popular target for genotyping, especially for short-term epidemiological studies such as outbreak investigations. HRM curve analysis-based flaA interrogation is a single-step, closed-tube method that provides portable data that can be easily shared and accessed. Critical to the development of flaA HRM was the use of flaA-specific primers that did not amplify the flaB gene. HRM curve analysis of flaA successfully discriminated the 47 sequence variants identified within the 87 C. jejuni and 15 C. coli isolates and correlated with the epidemiological background of the isolates.
In the combinatorial format, the resolving power of flaA was additive to that of SNP + binary typing and CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) HRM, and fits the PHRANA (Progressive Hierarchical Resolving Assays using Nucleic Acids) approach to genotyping. The use of statistical methods to analyse the HRM data enhanced the sophistication of the method. flaA HRM is therefore a rapid and cost-effective alternative to gel- or sequence-based flaA typing schemes. Aim 3 of this study describes the development of a novel bioinformatics-driven method to interrogate Campylobacter MLST gene fragments using HRM, called 'SNP Nucleated Minim MLST' or 'Minim typing'. The method involves HRM interrogation of MLST fragments that encompass highly informative "Nucleating SNPs" to ensure high resolution. Selection of fragments potentially suited to HRM analysis was conducted in silico using (i) "Minimum SNPs" and (ii) the new 'HRMtype' software packages. Species-specific sets of six "Nucleating SNPs" and six HRM fragments were identified for both C. jejuni and C. coli to ensure high typeability and resolution relevant to the MLST database. 'Minim typing' was tested empirically by typing 15 C. jejuni and five C. coli isolates. The assignment of clonal complexes (CCs) to each isolate by 'Minim typing' and by SNP + binary typing was used to compare the two MLST interrogation schemes, and the CCs linked with each C. jejuni isolate were consistent for both methods. Thus, 'Minim typing' is an efficient and cost-effective method to interrogate MLST genes. However, it is not expected to be independent of, or to match the resolution of, sequence-based MLST gene interrogation. 'Minim typing' in combination with flaA HRM is envisaged to comprise a highly resolving combinatorial typing scheme developed around the HRM platform, amenable to automation and multiplexing. The genotyping techniques described in this thesis involve the combinatorial interrogation of differentially evolving genetic markers on the unified real-time PCR and HRM platform. They provide high resolution and are simple, cost-effective and ideally suited to rapid, high-throughput genotyping of these common food-borne pathogens.
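As a hedged sketch of the numerical idea behind HRM curve comparison (not the study's actual instrument pipeline or statistics), the code below normalises synthetic fluorescence melt curves and classifies isolates by their maximum pointwise difference from a reference genotype's curve. The sigmoid curve model, melt temperatures, noise level and cut-off are all invented for illustration.

```python
import numpy as np

# Sketch of HRM difference-curve analysis with synthetic melt curves:
# fluorescence drops sigmoidally around the melt temperature tm.
temps = np.linspace(75, 95, 201)                  # deg C

def melt_curve(tm, noise=0.002, rng=np.random.default_rng(1)):
    """Synthetic melt curve with midpoint tm plus measurement noise."""
    f = 1.0 / (1.0 + np.exp((temps - tm) / 0.5))
    return f + rng.normal(0, noise, temps.size)

def normalise(f):
    return (f - f.min()) / (f.max() - f.min())

reference = normalise(melt_curve(85.0))           # reference genotype
isolates = {"A": 85.0, "B": 85.05, "C": 85.8}     # C is a distinct variant
for name, tm in isolates.items():
    diff = normalise(melt_curve(tm)) - reference  # difference curve
    verdict = "same genotype" if np.abs(diff).max() < 0.05 else "different genotype"
    print(f"isolate {name}: max |diff| = {np.abs(diff).max():.3f} -> {verdict}")
```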
Abstract:
Greyback canegrubs cost the Australian sugarcane industry around $13 million per annum in damage and control. A novel and cost-effective biocontrol bacterium could play an important role in the integrated pest management program currently in place to reduce damage and the costs associated with control. During the course of this project, terminal restriction fragment length polymorphism (TRFLP) analysis, 16S rDNA cloning, suppressive subtractive hybridisation (SSH) and entomopathogen-specific PCR screening were used to investigate the little-studied canegrub-associated microflora in an attempt to discover novel pathogens from putatively diseased specimens. The microflora associated with these soil-dwelling insects was found to be both highly diverse and divergent between individual specimens. Dominant members detected in live specimens were predominantly from taxa of known insect symbionts, while dominant sequences amplified from dead grubs were homologous to putatively saprophytic bacteria and bacteria able to grow during refrigeration. A number of entomopathogenic bacteria were identified, such as Photorhabdus luminescens and Pseudomonas fluorescens. Dead canegrubs need to be analysed prior to decomposition if these bacteria are to be isolated. Novel strategies to enrich putative pathogen-associated sequences (SSH and PCR screening) were shown to be promising approaches for pathogen discovery and the investigation of canegrub-associated microflora. However, due to inter- and intra-grub community diversity, dead grub decomposition and PCR-specific methodological limitations (PCR bias, primer specificity, BLAST database restrictions, 16S gene copy number and heterogeneity), recommendations have been made to improve the efficiency of such techniques. Improved specimen collection procedures and utilisation of emerging high-throughput sequencing technologies may be required to examine these complex communities in more detail. This is the first study to perform a whole-grub analysis and comparison of greyback canegrub-associated microbial communities. This work also describes the development of a novel V3-PCR-based SSH technique, the first SSH technique to use V3-PCR products as a starting material and to specifically compare bacterial species present in a complex community.
Abstract:
Standardization is critical to scientists and regulators to ensure the quality and interoperability of research processes, as well as the safety and efficacy of the attendant research products. This is perhaps most evident in the case of “omics science,” which is enabled by a host of diverse high-throughput technologies such as genomics, proteomics, and metabolomics. But standards are of interest to (and shaped by) others far beyond the immediate realm of individual scientists, laboratories, scientific consortia, or governments that develop, apply, and regulate them. Indeed, scientific standards have consequences for the social, ethical, and legal environment in which innovative technologies are regulated, and thereby command the attention of policy makers and citizens. This article argues that standardization of omics science is both technical and social. A critical synthesis of the social science literature indicates that: (1) standardization requires a degree of flexibility to be practical at the level of scientific practice in disparate sites; (2) the manner in which standards are created, and by whom, will impact their perceived legitimacy and therefore their potential to be used; and (3) the process of standardization itself is important to establishing the legitimacy of an area of scientific research.
Abstract:
A significant amount (ca. 15-25 GL/a) of PRW (Purified Recycled Water) from urban areas is foreseen to augment the depleted groundwater resources of the Lockyer Valley (approx. 80 km west of Brisbane). The research project uses field investigations, laboratory trials and modelling techniques to address the key challenges: (i) how to determine the benefits to individual users from the augmentation of a natural common pool resource; (ii) how to minimise the impacts of applying different-quality water on the Lockyer soils, creeks and aquifer materials; (iii) how to minimise the mobilisation of salts in the unsaturated and saturated zones as a result of increased deep drainage; and (iv) how to determine the potential for direct aquifer recharge using injection wells.
Abstract:
Composite web services comprise several component web services. When a composite web service is executed centrally, a single web service engine is responsible for coordinating the execution of the components, which may create a bottleneck and degrade the overall throughput of the composite service when there is a large number of service requests. This problem can potentially be handled by decentralizing the execution of the composite web service, but this raises the issue of how to partition a composite service into groups of component services such that each group can be orchestrated by its own execution engine while ensuring acceptable overall throughput of the composite service. Here we present a novel penalty-based genetic algorithm to solve the composite web service partitioning problem. Empirical results show that our new algorithm outperforms existing heuristic-based solutions.
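A minimal sketch of a penalty-based GA for this kind of partitioning problem, under assumed problem data: each gene assigns a component service to an execution engine, fitness sums inter-engine message traffic, and capacity violations incur a penalty rather than outright rejection. The message matrix, capacity limit, penalty weight, encoding and operators are illustrative and may differ from the paper's actual algorithm.

```python
import random

# Penalty-based genetic algorithm sketch for partitioning the component
# services of a composite web service across execution engines.
random.seed(0)
N_SERVICES, N_ENGINES, POP, GENS = 8, 3, 40, 200
CAPACITY, PENALTY = 4, 100.0   # assumed per-engine limit and penalty weight

# msg[i][j]: message volume between component services i and j (symmetric)
msg = [[0] * N_SERVICES for _ in range(N_SERVICES)]
for i in range(N_SERVICES):
    for j in range(i + 1, N_SERVICES):
        msg[i][j] = msg[j][i] = random.randint(0, 9)

def cost(ind):
    # inter-engine traffic, plus a penalty for overloading any engine
    traffic = sum(msg[i][j] for i in range(N_SERVICES)
                  for j in range(i + 1, N_SERVICES) if ind[i] != ind[j])
    overload = sum(max(0, ind.count(e) - CAPACITY) for e in range(N_ENGINES))
    return traffic + PENALTY * overload

def crossover(a, b):
    cut = random.randrange(1, N_SERVICES)       # one-point crossover
    return a[:cut] + b[cut:]

def mutate(ind):
    ind[random.randrange(N_SERVICES)] = random.randrange(N_ENGINES)

pop = [[random.randrange(N_ENGINES) for _ in range(N_SERVICES)]
       for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=cost)
    next_gen = pop[:4]                          # elitism: keep best four
    while len(next_gen) < POP:
        a, b = random.sample(pop[:20], 2)       # truncation selection
        child = crossover(a, b)
        if random.random() < 0.3:
            mutate(child)
        next_gen.append(child)
    pop = next_gen

best = min(pop, key=cost)
print("best assignment:", best, "cost:", cost(best))
```

Keeping infeasible (over-capacity) individuals in the population with a penalty, rather than discarding them, lets the search cross infeasible regions on the way to good feasible partitions, which is the usual motivation for penalty-based GAs.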