116 results for Objets fragmentés
Abstract:
In this thesis an investigation into theoretical models for the formation and interaction of nanoparticles is presented. The work presented includes a literature review of current models followed by a series of five chapters of original research. This thesis has been submitted in partial fulfilment of the requirements for the degree of doctor of philosophy by publication and therefore each of the five chapters consists of a peer-reviewed journal article. The thesis is then concluded with a discussion of what has been achieved during the PhD candidature, the potential applications for this research and ways in which the research could be extended in the future. In this thesis we explore stochastic models pertaining to the interaction and evolution mechanisms of nanoparticles. In particular, we explore in depth the stochastic evaporation of molecules due to thermal activation and its ultimate effect on nanoparticle sizes and concentrations. Secondly, we analyse the thermal vibrations of nanoparticles suspended in a fluid and subject to standing oscillating drag forces (as would occur in a standing sound wave), and finally the motion of nanoparticles on lattice surfaces in the presence of high heat gradients. We have described in this thesis a number of new models for the description of multicompartment networks joined by multiple, stochastically evaporating links. The primary motivation for this work is the description of thermal fragmentation, in which multiple molecules holding parts of a carbonaceous nanoparticle may evaporate. Ultimately, these models predict the rate at which the network or aggregate fragments into smaller networks/aggregates and with what aggregate size distribution. The models are highly analytic and describe the fragmentation of a link holding multiple bonds using Markov processes chosen to best describe different physical situations; these processes have been analysed using a number of mathematical methods. The fragmentation of the network/aggregate is then predicted using combinatorial arguments. Whilst there is some scepticism in the scientific community pertaining to the proposed mechanism of thermal fragmentation, we have presented compelling evidence in this thesis supporting the currently proposed mechanism and shown that our models can accurately match experimental results. This was achieved using a realistic simulation of the fragmentation of the fractal carbonaceous aggregate structure using our models. Furthermore, in this thesis a method of manipulation using acoustic standing waves is investigated. In our investigation we analysed the effect of frequency and particle size on the ability of a particle to be manipulated by means of a standing acoustic wave. In our results, we report the existence of a critical frequency for a particular particle size. This frequency is inversely proportional to the Stokes time of the particle in the fluid. We also find that for large frequencies the subtle Brownian motion of even larger particles plays a significant role in the efficacy of the manipulation. This is due to the decreasing size of the boundary layer between acoustic nodes. Our model utilises a multiple time scale approach to calculate the long-term effects of the standing acoustic field on the particles that are interacting with the sound. These effects are then combined with the effects of Brownian motion in order to obtain a complete mathematical description of the particle dynamics in such acoustic fields.
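The link-fragmentation models summarised above are not reproduced here, but their central ingredient, a link held together by several bonds, each of which evaporates at a thermally activated rate, can be sketched as a simple pure-death Markov process. The bond count and evaporation rate below are illustrative assumptions, not values from the thesis.

```python
import random

def link_breaking_time(n_bonds=10, k_evap=1.0, rng=random):
    """Illustrative sketch only (not the thesis model): time for a link of
    n_bonds independent bonds to lose all of them, each bond evaporating at
    rate k_evap, simulated as a pure-death Markov chain (Gillespie steps)."""
    t, n = 0.0, n_bonds
    while n > 0:
        total_rate = n * k_evap           # any of the n remaining bonds may evaporate
        t += rng.expovariate(total_rate)  # exponential waiting time to the next event
        n -= 1                            # one bond evaporates
    return t

# Monte Carlo estimate of the mean time for a single link to fragment;
# the analytic value is sum_{k=1..n_bonds} 1/(k * k_evap)
times = [link_breaking_time() for _ in range(10_000)]
print(sum(times) / len(times))
```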
Finally, in this thesis, we develop a numerical routine for the description of "thermal tweezers". Currently, the technique of thermal tweezers is predominantly theoretical; however, there have been a handful of successful experiments demonstrating the effect in practice. Thermal tweezers is the name given to the way in which particles can be easily manipulated on a lattice surface by careful selection of a heat distribution over the surface. Typically, the theoretical simulations of the effect can be rather time consuming, with supercomputer facilities processing data over days or even weeks. Our alternative numerical method for the simulation of particle distributions pertaining to the thermal tweezers effect uses the Fokker-Planck equation to derive a quick numerical method for the calculation of the effective diffusion constant arising from the lattice and the temperature. We then use this diffusion constant and solve the diffusion equation numerically using the finite volume method. This saves the algorithm from calculating many individual particle trajectories, since it describes the flow of the probability distribution of particles in a continuous manner. The alternative method that is outlined in this thesis can produce a larger quantity of accurate results on a household PC in a matter of hours, which is a considerable improvement on what was previously achievable.
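The thesis's routine itself is not reproduced here. As a minimal sketch of its final step, solving the diffusion equation with a position-dependent effective diffusion constant by the finite volume method, the following Python fragment uses an assumed, illustrative D(x) profile, grid and time step.

```python
import numpy as np

def diffuse_fv(p, D, dx, dt, steps):
    """Minimal 1D finite-volume update of dp/dt = d/dx( D(x) dp/dx )
    with zero-flux boundaries. p and D are cell-centred arrays."""
    p = p.copy()
    for _ in range(steps):
        D_face = 0.5 * (D[1:] + D[:-1])              # diffusivity at interior cell faces
        flux = -D_face * (p[1:] - p[:-1]) / dx       # Fickian flux across interior faces
        flux = np.concatenate(([0.0], flux, [0.0]))  # zero-flux walls at both ends
        p -= dt / dx * (flux[1:] - flux[:-1])        # finite-volume balance per cell
    return p

# Illustrative example (assumed values): a spatially varying effective D(x)
x = np.linspace(0.0, 1.0, 200)
D = 1e-3 * (1.0 + 0.8 * np.cos(2 * np.pi * x))       # assumed effective diffusion profile
p0 = np.ones_like(x) / x.size                        # uniform initial particle distribution
p = diffuse_fv(p0, D, dx=x[1] - x[0], dt=1e-3, steps=5000)
```

The explicit update above is only stable while dt * max(D) / dx**2 stays below one half; the illustrative values were chosen to satisfy that.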
Abstract:
Campylobacter jejuni followed by Campylobacter coli contribute substantially to the economic and public health burden attributed to food-borne infections in Australia. Genotypic characterisation of isolates has provided new insights into the epidemiology and pathogenesis of C. jejuni and C. coli. However, currently available methods are not conducive to the large scale epidemiological investigations that are necessary to elucidate the global epidemiology of these common food-borne pathogens. This research aims to develop high resolution C. jejuni and C. coli genotyping schemes that are convenient for high throughput applications. Real-time PCR and High Resolution Melt (HRM) analysis are fundamental to the genotyping schemes developed in this study and enable rapid, cost-effective interrogation of a range of different polymorphic sites within the Campylobacter genome. While the sources and routes of transmission of campylobacters are unclear, handling and consumption of poultry meat is frequently associated with human campylobacteriosis in Australia. Therefore, chicken-derived C. jejuni and C. coli isolates were used to develop and verify the methods described in this study. The first aim of this study describes the application of MLST-SNP (Multi Locus Sequence Typing Single Nucleotide Polymorphisms) + binary typing to 87 chicken C. jejuni isolates using real-time PCR analysis. These typing schemes were developed previously by our research group using isolates from campylobacteriosis patients. The present study showed that SNP and binary typing, alone or in combination, are effective at detecting epidemiological linkage between chicken-derived Campylobacter isolates and enable data comparisons with other MLST-based investigations. SNP + binary types obtained from chicken isolates in this study were compared with a previously SNP + binary and MLST typed set of human isolates. Common genotypes between the two collections of isolates were identified, and ST-524 represented a clone that could be worth monitoring in the chicken meat industry. In contrast, ST-48, mainly associated with bovine hosts, was abundant in the human isolates. This genotype was, however, absent in the chicken isolates, indicating the role of non-poultry sources in causing human Campylobacter infections. This demonstrates the potential application of SNP + binary typing for epidemiological investigations and source tracing. While MLST SNPs and binary genes comprise the more stable backbone of the Campylobacter genome and are indicative of long term epidemiological linkage of the isolates, the development of a High Resolution Melt (HRM) based curve analysis method to interrogate the hypervariable Campylobacter flagellin encoding gene (flaA) is described in Aim 2 of this study. The flaA gene product appears to be an important pathogenicity determinant of campylobacters and is therefore a popular target for genotyping, especially for short term epidemiological studies such as outbreak investigations. HRM curve analysis based flaA interrogation is a single-step closed-tube method that provides portable data that can be easily shared and accessed. Critical to the development of flaA HRM was the use of flaA specific primers that did not amplify the flaB gene. HRM curve analysis flaA interrogation was successful at discriminating the 47 sequence variants identified within the 87 C. jejuni and 15 C. coli isolates and correlated with the epidemiological background of the isolates.
In the combinatorial format, the resolving power of flaA was additive to that of SNP + binary typing and CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) HRM, and fits the PHRANA (Progressive Hierarchical Resolving Assays using Nucleic Acids) approach for genotyping. The use of statistical methods to analyse the HRM data enhanced the sophistication of the method. Therefore, flaA HRM is a rapid and cost-effective alternative to gel- or sequence-based flaA typing schemes. Aim 3 of this study describes the development of a novel bioinformatics-driven method to interrogate Campylobacter MLST gene fragments using HRM, called ‘SNP Nucleated Minim MLST’ or ‘Minim typing’. The method involves HRM interrogation of MLST fragments that encompass highly informative “Nucleating SNPs” to ensure high resolution. Selection of fragments potentially suited to HRM analysis was conducted in silico using i) “Minimum SNPs” and ii) the new ‘HRMtype’ software packages. Species-specific sets of six “Nucleating SNPs” and six HRM fragments were identified for both C. jejuni and C. coli to ensure high typeability and resolution relevant to the MLST database. ‘Minim typing’ was tested empirically by typing 15 C. jejuni and five C. coli isolates. The association of clonal complexes (CCs) with each isolate by ‘Minim typing’ and by SNP + binary typing was used to compare the two MLST interrogation schemes. The CCs linked with each C. jejuni isolate were consistent for both methods. Thus, ‘Minim typing’ is an efficient and cost-effective method to interrogate MLST genes. However, it is not expected to be independent of, or to meet the resolution of, sequence-based MLST gene interrogation. ‘Minim typing’ in combination with flaA HRM is envisaged to comprise a highly resolving combinatorial typing scheme developed around the HRM platform that is amenable to automation and multiplexing. The genotyping techniques described in this thesis involve the combinatorial interrogation of differentially evolving genetic markers on the unified real-time PCR and HRM platform. They provide high resolution and are simple, cost-effective and ideally suited to rapid, high throughput genotyping of these common food-borne pathogens.
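The study's own statistical analysis is not reproduced here. One standard way to quantify the resolving power of a typing scheme, and to show how combining markers increases it, is Simpson's index of diversity; the sketch below uses invented genotype assignments purely for illustration.

```python
from collections import Counter

def simpsons_diversity(types):
    """Simpson's index of diversity D = 1 - sum n_j(n_j - 1) / (N(N - 1)):
    the probability that two isolates drawn at random have different types."""
    n = len(types)
    counts = Counter(types)
    return 1.0 - sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

# Invented example: per-isolate types from two schemes
snp_binary = ["A", "A", "B", "B", "C", "C"]
flaA_hrm   = ["1", "2", "1", "2", "1", "1"]

print(simpsons_diversity(snp_binary))                 # resolution of one scheme alone
composite = [f"{a}-{b}" for a, b in zip(snp_binary, flaA_hrm)]
print(simpsons_diversity(composite))                  # higher resolution of the combination
```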
Abstract:
Habitat models are widely used in ecology; however, there are relatively few studies of rare species, primarily because of a paucity of survey records and the lack of robust means of assessing the accuracy of modelled spatial predictions. We investigated the potential of compiled ecological data in developing habitat models for Macadamia integrifolia, a vulnerable mid-stratum tree endemic to lowland subtropical rainforests of southeast Queensland, Australia. We compared the performance of two binomial models—Classification and Regression Trees (CART) and Generalised Additive Models (GAM)—with Maximum Entropy (MAXENT) models developed from (i) presence records and available absence data and (ii) presence records and background data. The GAM model was the best performer across the range of evaluation measures employed; however, all models were assessed as potentially useful for informing in situ conservation of M. integrifolia. A significant loss in the amount of M. integrifolia habitat has occurred (p < 0.05), with only 37% of former (pre-clearing) habitat remaining in 2003. Remnant patches are significantly smaller, have larger edge-to-area ratios and are more isolated from each other compared to pre-clearing configurations (p < 0.05). Whilst the network of suitable habitat patches is still largely intact, there are numerous smaller patches that are more isolated in the contemporary landscape compared with their connectedness before clearing. These results suggest that in situ conservation of M. integrifolia may be best achieved through a landscape approach that considers both the relative contribution of small remnant habitat fragments to the species as a whole and their role in facilitating connectivity among the entire network of habitat patches.
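Neither the survey data nor the full suite of evaluation measures is reproduced here. As an illustrative sketch of how presence/absence models of this kind are compared, the following fragment fits a CART model and a logistic regression (a crude stand-in for a GAM) to invented covariates and scores them with the area under the ROC curve, one commonly used evaluation measure.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Invented presence/absence data: two environmental covariates (e.g. rainfall, elevation)
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "CART (decision tree)": DecisionTreeClassifier(max_depth=4, random_state=0),
    "logistic regression (rough GAM stand-in)": LogisticRegression(),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])  # evaluate on held-out sites
    print(f"{name}: AUC = {auc:.2f}")
```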
Abstract:
One approach to reducing the yield losses caused by banana viral diseases is the use of genetic engineering and pathogen-derived resistance strategies to generate resistant cultivars. The development of transgenic virus resistance requires an efficient banana transformation method, particularly for commercially important 'Cavendish' type cultivars such as 'Grand Nain'. Prior to this study, only two examples of the stable transformation of banana had been reported, both of which demonstrated the principle of transformation but did not characterise transgenic plants in terms of the efficiency at which individual transgenic lines were generated, relative activities of promoters in stably transformed plants, and the stability of transgene expression. The aim of this study was to develop more efficient transformation methods for banana, assess the activity of some commonly used and also novel promoters in stably transformed plants, and transform banana with genes that could potentially confer resistance to banana bunchy top nanovirus (BBTV) and banana bract mosaic potyvirus (BBrMV). A regeneration system using immature male flowers as the explant was established. The frequency of somatic embryogenesis in male flower explants was influenced by the season in which the inflorescences were harvested. Further, the media requirements of various banana cultivars in respect to the 2,4-D concentration in the initiation media also differed. Following the optimisation of these and other parameters, embryogenic cell suspensions of several banana (Musa spp.) cultivars including 'Grand Nain' (AAA), 'Williams' (AAA), 'SH-3362' (AA), 'Goldfinger' (AAAB) and 'Bluggoe' (ABB) were successfully generated. Highly efficient transformation methods were developed for both 'Bluggoe' and 'Grand Nain'; this is the first report of microprojectile bombardment transformation of the commercially important 'Grand Nain' cultivar. Following bombardment of embryogenic suspension cells, regeneration was monitored from single transformed cells to whole plants using a reporter gene encoding the green fluorescent protein (gfp). Selection with kanamycin enabled the regeneration of a greater number of plants than with geneticin, while still preventing the regeneration of non-transformed plants. Southern hybridisation confirmed the neomycin phosphotransferase gene (npt II) was stably integrated into the banana genome and that multiple transgenic lines were derived from single bombardments. The activity, stability and tissue specificity of the cauliflower mosaic virus 35S (CaMV 35S) and maize polyubiquitin-1 (Ubi-1) promoters were examined. In stably transformed banana, the Ubi-1 promoter provided approximately six-fold higher β-glucuronidase (GUS) activity than the CaMV 35S promoter, and both promoters remained active in glasshouse grown plants for the six months they were observed. The intergenic regions of BBTV DNA-1 to -6 were isolated and fused to either the uidA (GUS) or gfp reporter genes to assess their promoter activities. BBTV promoter activity was detected in banana embryogenic cells using the gfp reporter gene. Promoters derived from BBTV DNA-4 and -5 generated the highest levels of transient activity, which were greater than that generated by the maize Ubi-1 promoter. In transgenic banana plants, the activity of the BBTV DNA-6 promoter (BT6.1) was restricted to the phloem of leaves and roots, stomata and root meristems.
The activity of the BT6.1 promoter was enhanced by the inclusion of intron-containing fragments derived from the maize Ubi-1, rice Act-1, and sugarcane rbcS 5' untranslated regions in GUS reporter gene constructs. In transient assays in banana, the rice Act-1 and maize Ubi-1 introns provided the most significant enhancement, increasing expression levels 300-fold and 100-fold, respectively. The sugarcane rbcS intron increased expression about 10-fold. In stably transformed banana plants, the maize Ubi-1 intron enhanced BT6.1 promoter activity to levels similar to that of the CaMV 35S promoter, but did not appear to alter the tissue specificity of the promoter. Both 'Grand Nain' and 'Bluggoe' were transformed with constructs that could potentially confer resistance to BBTV and BBrMV, including constructs containing BBTV DNA-1 major and internal genes, BBTV DNA-5 gene, and the BBrMV coat protein-coding region all under the control of the Ubi-1 promoter, while the BT6 promoter was used to drive the npt II selectable marker gene. At least 30 transgenic lines containing each construct were identified and replicates of each line are currently being generated by micropropagation in preparation for virus challenge.
Abstract:
This paper addresses the problem of constructing consolidated business process models out of collections of process models that share common fragments. The paper considers the construction of unions of multiple models (called merged models) as well as intersections (called digests). Merged models are intended for analysts who wish to create a model that subsumes a collection of process models - typically representing variants of the same underlying process - with the aim of replacing the variants with the merged model. Digests, on the other hand, are intended for analysts who wish to identify the most recurring fragments across a collection of process models, so that they can focus their efforts on optimizing these fragments. The paper presents an algorithm for computing merged models and an algorithm for extracting digests from a merged model. The merging and digest extraction algorithms have been implemented and tested against collections of process models taken from multiple application domains. The tests show that the merging algorithm produces compact models and scales up to process models containing hundreds of nodes. Furthermore, a case study conducted in a large insurance company has demonstrated the usefulness of the merging and digest extraction operators in a practical setting.
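The merging and digest extraction algorithms themselves are not reproduced here. A much reduced illustration of the two operators, treating each process variant simply as a set of labelled edges, is sketched below; the real algorithms additionally handle connectors, node matching and configuration annotations, and the variant data are invented.

```python
from collections import Counter

# Toy representation: a process model as a set of directed, labelled edges
variant_a = {("Receive claim", "Check policy"), ("Check policy", "Assess damage"),
             ("Assess damage", "Pay claim")}
variant_b = {("Receive claim", "Check policy"), ("Check policy", "Reject claim")}

def merge(models):
    """Naive 'merged model': the union of all edges across the variants."""
    return set().union(*models)

def digest(models, min_support=2):
    """Naive 'digest': the edges occurring in at least min_support variants."""
    counts = Counter(edge for m in models for edge in m)
    return {edge for edge, c in counts.items() if c >= min_support}

print(merge([variant_a, variant_b]))   # subsumes both variants
print(digest([variant_a, variant_b]))  # the most recurring fragment
```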
Abstract:
To date, biodegradable networks and particularly their kinetic chain lengths have been characterized by analysis of their degradation products in solution. We characterize the network itself by NMR analysis in the solvent-swollen state under magic angle spinning conditions. The networks were prepared by photoinitiated cross-linking of poly(dl-lactide)−dimethacrylate macromers (5 kg/mol) in the presence of an unreactive diluent. Using diffusion filtering and 2D correlation spectroscopy techniques, all network components are identified. By quantification of network-bound photoinitiator fragments, an average kinetic chain length of 9 ± 2 methacrylate units is determined. The PDLLA macromer solution was also used with a dye to prepare computer-designed structures by stereolithography. For these network structures, the average kinetic chain length is 24 ± 4 methacrylate units. In all cases the calculated molecular weights of the polymethacrylate chains after degradation are at most 8.8 kg/mol, which is far below the threshold for renal clearance. Upon incubation in phosphate buffered saline at 37 °C, the networks show a mass loss profile in time similar to that of linear high-molecular-weight PDLLA (HMW PDLLA). The mechanical properties are preserved longer for the PDLLA networks than for HMW PDLLA. The initial tensile strength of 47 ± 2 MPa does not decrease significantly for the first 15 weeks, while HMW PDLLA lost 85 ± 5% of its strength within 5 weeks. The physical properties, kinetic chain length, and degradation profile of these photo-cross-linked PDLLA networks make them highly suitable materials for orthopedic applications and use in (bone) tissue engineering.
Abstract:
This paper presents evidence of an apparent connection between ball lightning and a green fireball. On the evening of the 16th May 2006 at least three fireballs were seen by many people in the skies of Queensland, Australia. One of the fireballs was seen passing over the Great Divide about 120 km west of Brisbane, and soon after, a luminous green ball about 30 cm in diameter was seen rolling down the slope of the Great Divide. A detailed description given by a witness indicates that the phenomenon was probably a highly luminous form of ball lightning. An hypothesis presented in this paper is that the passage of the Queensland fireball meteor created an electrically conductive path between the ionosphere and ground, providing energy for the ball lightning phenomenon. A strong similarity is noted between the Queensland fireball and the Pasamonte fireball seen in New Mexico in 1933. Both meteors exhibit a twist in the tail that could be explained by hydrodynamic forces. The possibility that the multiple fireball sightings across South East Queensland were produced by fragments of comet 73P/Schwassmann-Wachmann 3 is discussed.
Abstract:
Business process model repositories capture precious knowledge about an organization or a business domain. In many cases, these repositories contain hundreds or even thousands of models and they represent several man-years of effort. Over time, process model repositories tend to accumulate duplicate fragments, as new process models are created by copying and merging fragments from other models. This calls for methods to detect duplicate fragments in process models that can be refactored as separate subprocesses in order to increase readability and maintainability. This paper presents an indexing structure to support the fast detection of clones in large process model repositories. Experiments show that the algorithm scales to repositories with hundreds of models. The experimental results also show that a significant number of non-trivial clones can be found in process model repositories taken from industrial practice.
Abstract:
As organizations reach higher levels of Business Process Management maturity, they tend to accumulate large collections of process models. These repositories may contain thousands of activities and be managed by different stakeholders with varying skills and responsibilities. However, while being of great value, these repositories induce high management costs. Thus, it becomes essential to keep track of the various model versions as they may mutually overlap, supersede one another and evolve over time. We propose an innovative versioning model and associated storage structure, specifically designed to maximize sharing across process model versions, and to automatically handle change propagation. The focal point of this technique is to version single process model fragments, rather than entire process models. Indeed empirical evidence shows that real-life process model repositories have numerous duplicate fragments. Experiments on two industrial datasets confirm the usefulness of our technique.
Abstract:
This abstract explores the possibility of a grass roots approach to engaging people in community change initiatives by designing simple interactive exploratory prototypes for use by communities over time that support shared action. The prototype is gradually evolved in response to community use, fragments of data gathered through the prototype, and participant feedback with the goal of building participation in community change initiatives. A case study of a system to support ridesharing is discussed. The approach is compared and contrasted to a traditional IT systems procurement approach.
Abstract:
Process modeling is a central element in any approach to Business Process Management (BPM). However, what hinders both practitioners and academics is the lack of support for assessing the quality of process models – let alone realizing high quality process models. Existing frameworks are highly conceptual or too general. At the same time, various techniques, tools, and research results are available that cover fragments of the issue at hand. This chapter presents the SIQ framework that on the one hand integrates concepts and guidelines from existing ones and on the other links these concepts to current research in the BPM domain. Three different types of quality are distinguished and for each of these levels concrete metrics, available tools, and guidelines will be provided. While the basis of the SIQ framework is thought to be rather robust, its external pointers can be updated with newer insights as they emerge.
Abstract:
A series of porphyrins substituted in one or two meso-positions by diphenylphosphine oxide groups has been prepared by the palladium catalysed reaction of diphenylphosphine or its oxide with the corresponding bromoporphyrins. Compounds {MDPP-[P(O)Ph2]n} (M = H2, Ni, Zn; H2DPP = 5,15-diphenylporphyrin; n = 1, 2) were isolated in yields of 60-95%. The reaction is believed to proceed via the conventional oxidative addition, phosphination and reductive elimination steps, as the stoichiometric reaction of η1-palladio(II) porphyrin [PdBr(H2DPP)(dppe)] (H2DPP = 5,15-diphenylporphyrin; dppe = 1,2-bis(diphenylphosphino)ethane) with diphenylphosphine oxide also results in the desired mono-porphyrinylphosphine oxide [H2DPP-P(O)Ph2]. Attempts to isolate the tertiary phosphines failed due to their extreme air-sensitivity. Variable temperature 1H NMR studies of [H2DPP-P(O)Ph2] revealed an intrinsic lack of symmetry, while fluorescence spectroscopy showed that the phosphine oxide group does not behave as a "heavy atom" quencher. The electron withdrawing effect of the phosphine oxide group was confirmed by voltammetry. The ligands were characterised by multinuclear NMR and UV-visible spectroscopy as well as mass spectrometry. Single crystal X-ray crystallography showed that the bis(phosphine oxide) nickel(II) complex {NiDPP-[P(O)Ph2]2} is monomeric in the solid state, with a ruffled porphyrin core and the two P=O fragments on the same side of the average plane of the molecule. On the other hand, the corresponding zinc(II) complex formed infinite chains through coordination of one Ph2PO substituent to the neighbouring zinc porphyrin through an almost linear P=O---Zn unit, leaving the other Ph2PO group facing into a parallel channel filled with disordered water molecules. These new phosphine oxides are attractive ligands for supramolecular porphyrin chemistry.
Abstract:
As organizations reach higher levels of business process management maturity, they often find themselves maintaining repositories of hundreds or even thousands of process models, representing valuable knowledge about their operations. Over time, process model repositories tend to accumulate duplicate fragments (also called clones) as new process models are created or extended by copying and merging fragments from other models. This calls for methods to detect clones in process models, so that these clones can be refactored as separate subprocesses in order to improve maintainability. This paper presents an indexing structure to support the fast detection of clones in large process model repositories. The proposed index is based on a novel combination of a method for process model decomposition (specifically the Refined Process Structure Tree), with established graph canonization and string matching techniques. Experiments show that the algorithm scales to repositories with hundreds of models. The experimental results also show that a significant number of non-trivial clones can be found in process model repositories taken from industrial practice.
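The index itself, built on the Refined Process Structure Tree and graph canonization, is not reproduced here. The sketch below reduces the idea to its core: each fragment is mapped to a canonical key and inserted into an inverted index, so that duplicate fragments (clones) land in the same bucket. The sorted-edge canonical form and the fragment data are illustrative simplifications only.

```python
from collections import defaultdict

def canonical_key(fragment_edges):
    """Toy canonical form of a fragment: its sorted edge list. (The paper uses
    RPST decomposition with proper graph canonization; sorting labelled edges
    is only adequate for this sketch.)"""
    return tuple(sorted(fragment_edges))

def build_clone_index(fragments_by_model):
    """Inverted index: canonical key -> list of (model id, fragment id)."""
    index = defaultdict(list)
    for model_id, fragments in fragments_by_model.items():
        for frag_id, edges in fragments.items():
            index[canonical_key(edges)].append((model_id, frag_id))
    return index

fragments = {
    "claims_v1": {"f1": [("A", "B"), ("B", "C")], "f2": [("C", "D")]},
    "claims_v2": {"f7": [("B", "C"), ("A", "B")]},          # a clone of f1
}
index = build_clone_index(fragments)
clones = {k: v for k, v in index.items() if len(v) > 1}     # buckets with more than one entry
print(clones)
```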
Abstract:
As organizations reach higher levels of Business Process Management maturity, they tend to accumulate large collections of process models. These repositories may contain thousands of activities and be managed by different stakeholders with varying skills and responsibilities. However, while being of great value, these repositories induce high management costs. Thus, it becomes essential to keep track of the various model versions as they may mutually overlap, supersede one another and evolve over time. We propose an innovative versioning model, and associated storage structure, specifically designed to maximize sharing across process models and process model versions, reduce conflicts in concurrent edits and automatically handle controlled change propagation. The focal point of this technique is to version single process model fragments, rather than entire process models. Indeed empirical evidence shows that real-life process model repositories have numerous duplicate fragments. Experiments on two industrial datasets confirm the usefulness of our technique.
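The versioning model and storage structure are not reproduced here. The sketch below illustrates only the core idea of fragment-level sharing using a content-addressed store, in which identical fragments are stored once and referenced by every model version that uses them; the class name and fragment format are invented for illustration.

```python
import hashlib
import json

class FragmentStore:
    """Toy content-addressed store: identical fragments are stored once and
    shared by every process model version that references them."""
    def __init__(self):
        self.fragments = {}   # content hash -> fragment content
        self.versions = {}    # (model, version) -> list of fragment hashes

    def _hash(self, fragment):
        return hashlib.sha1(json.dumps(fragment, sort_keys=True).encode()).hexdigest()

    def commit(self, model, version, fragments):
        hashes = []
        for frag in fragments:
            h = self._hash(frag)
            self.fragments.setdefault(h, frag)   # store the fragment only if it is new
            hashes.append(h)
        self.versions[(model, version)] = hashes

store = FragmentStore()
store.commit("claims", "v1", [{"edges": [["A", "B"]]}, {"edges": [["B", "C"]]}])
store.commit("claims", "v2", [{"edges": [["A", "B"]]}, {"edges": [["B", "D"]]}])
print(len(store.fragments))   # 3 unique fragments shared across the two versions
```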
Abstract:
This research underlines the extensive application of nanostructured metal oxides in environmental systems such as hazardous waste remediation and water purification. This study tries to forge a new understanding of the complexity of adsorption and photocatalysis in the process of water treatment. Sodium niobate doped with different amounts of tantalum was prepared via a hydrothermal reaction and was observed to be able to adsorb highly hazardous bivalent radioactive isotopes such as Sr2+ and Ra2+ ions. This study facilitates the preparation of Nb-based adsorbents for efficiently removing toxic radioactive ions from contaminated water and also identifies the importance of understanding the influence of heterovalent substitution in microporous frameworks. Clay adsorbents were prepared via a two-step method to remove anionic and non-ionic herbicides from water. Firstly, layered beidellite clay was treated with acid in a hydrothermal process; secondly, common silane coupling agents, 3-chloropropyl trimethoxysilane or triethoxysilane, were grafted onto the acid treated samples to prepare the adsorption materials. In order to isolate the effect of the clay surface, we compared the adsorption properties of the clay adsorbents with those of γ-Al2O3 nanofibres grafted with the same functional groups. Thin alumina (γ-Al2O3) nanofibres were modified by the grafting of two organosilane agents, 3-chloropropyltriethoxysilane and octyl triethoxysilane, onto the surface, for the adsorptive removal of alachlor and imazaquin herbicides from water. The formation of organic groups during the functionalisation process established super hydrophobic sites along the surfaces, and those non-polar regions of the surfaces were able to make close contact with the organic pollutants. A new structure of anatase crystals linked to clay fragments was synthesised by the reaction of TiOSO4 with laponite clay for the degradation of pesticides. Based on the Ti/clay ratio, these new catalysts showed a high degradation rate when compared with P25. Moreover, immobilised TiO2 on laponite clay fragments could be readily separated out from a slurry system after the photocatalytic reaction. Using a series of partial phase transition methods, an effective catalyst with fibril morphology was prepared for the degradation of different types of phenols and trace amounts of herbicides in water. Both H-titanate and TiO2-(B) fibres coated with anatase nanocrystals were studied. When compared with a laponite clay photocatalyst, it was found that anatase-dotted TiO2-(B) fibres prepared by a 45 h hydrothermal treatment followed by calcination were not only superior in photocatalytic performance but could also be readily separated from a slurry system after photocatalytic reactions. This study has laid the foundation for the development of the ability to fabricate highly efficient nanostructured solids for the removal of radioactive ions and organic pollutants from contaminated water. These results now seem set to contribute to the development of advanced water purification devices in the future. These modified nanostructured materials with unusual properties have broadened their application range beyond their traditional use as adsorbents, to also encompass the storage of nuclear waste after it has been concentrated from contaminated water.