80 results for Population set-based methods
Abstract:
Immunomagnetic separation (IMS) represents a simple but effective method of selectively capturing and concentrating Mycobacterium bovis, the causative agent of bovine tuberculosis (bTB), from tissue samples. It is a physical cell separation technique that does not impact cell viability, unlike traditional chemical decontamination prior to culture. IMS is performed with paramagnetic beads coated with M. bovis-specific antibody and peptide binders. Once captured by IMS, M. bovis cells can be detected by either PCR or cultural detection methods. Increased detection rates of M. bovis, particularly from non-visibly lesioned lymph node tissues from bTB reactor animals, have recently been reported when IMS-based methods were employed.
Abstract:
BACKGROUND: Burkholderia pseudomallei is an important cause of acute fulminant pneumonia and septicaemia in tropical regions of northern Australia and southeast Asia. Subacute and chronic forms of the disease also occur. There have been three recent reports of adults with cystic fibrosis (CF) who presumably acquired B pseudomallei infection during extended vacations or residence in either Thailand or northern Australia.
METHODS: The clinical course, molecular characteristics, serology and response to treatment are described in four adult CF patients infected with B pseudomallei. Polymerase chain reaction (PCR)-based methods were used to confirm B pseudomallei and exclude B cepacia complex. Genotyping was performed using randomly amplified polymorphic DNA (RAPD) PCR and pulsed field gel electrophoresis (PFGE).
RESULTS: Four patients are described with a mean duration of infection of 32 months. All but one patient lived in tropical Queensland. Two patients (with the longest duration of infection) deteriorated clinically and one subsequently died of respiratory failure. Both responded to intravenous treatment specifically targeting B pseudomallei. Another patient suffered two severe episodes of acute bronchopneumonia following acquisition of B pseudomallei. Eradication of the organism was not possible in any of the cases. PFGE of a sample isolate from each patient revealed the strains to be unique and RAPD analysis showed retention of the same strain within an individual over time.
CONCLUSIONS: These findings support a potential pathogenic role for B pseudomallei in CF lung disease, producing both chronic infection and possibly acute bronchopneumonia. Identical isolates are retained over time and are unique, consistent with likely environmental acquisition and not person to person spread. B pseudomallei is emerging as a significant pathogen for patients with CF residing and holidaying in the tropics.
Abstract:
The new Food Information Regulation (1169/2011) dictates that, in a refined vegetable oil blend, the type of oil must be clearly identified on the package, in contrast with current practice where it is labelled under the generic and often misleading term “vegetable oil”. With increased consumer awareness of food authenticity, as shown by the recent food scandal involving horsemeat in beef products, identifying the species of origin in food products becomes increasingly relevant. Palm oil is used extensively in food manufacturing and, as global demand increases, producing countries suffer the aftermath of intensive agriculture. Even though it represents only a small portion of global production, sustainable palm oil is in great demand from consumers and industry. It is therefore of interest to detect the presence of palm oil in food products, as consumers have the right to know whether it is present, mainly from an ethical point of view. Apart from palm oil and its derivatives, rapeseed oil and sunflower oil are also included. With DNA-based methods, the gold standard for food authenticity testing and species identification, deemed unsuitable for this analytical problem, the focus inevitably shifts to chromatographic and spectroscopic methods. Both chromatographic (such as GC-FID and LC-MS) and spectroscopic methods (FT-IR, Raman, NIR) are relevant. Previous attempts have not shown promising results due to the oils’ natural variation in composition and complex chemical signals, but the suggested two-step analytical procedure is a promising approach with very good initial results.
Abstract:
Grinding solid reagents under solvent-free or low-solvent conditions (mechanochemistry) is emerging as a general synthetic technique and an alternative to conventional solvent-intensive methods. However, it is essential to find ways to scale up this type of synthesis if its promise of cleaner manufacturing is to be realised. Here, we demonstrate the use of twin screw and single screw extruders for the continuous synthesis of various metal complexes, including Ni(salen) and Ni(NCS)₂(PPh₃)₂, as well as the commercially important metal organic frameworks (MOFs) Cu₃(BTC)₂ (HKUST-1), Zn(2-methylimidazolate)₂ (ZIF-8, MAF-4) and Al(fumarate)(OH). Notably, Al(fumarate)(OH) has not previously been synthesised mechanochemically. Quantitative conversions occur to give products at kg h⁻¹ rates which, after activation, exhibit surface areas and pore volumes equivalent to those of materials produced by conventional solvent-based methods. Some reactions can be performed under completely solvent-free conditions, whereas others require the addition of small amounts of solvent (typically 3-4 mol equivalents). Continuous neat melt phase synthesis is also successfully demonstrated by both twin screw and single screw extrusion for ZIF-8; the latter technique provided ZIF-8 at 4 kg h⁻¹. The space time yields (STYs) for these methods, up to 144 × 10³ kg per m³ per day, are orders of magnitude greater than STYs for other methods of making MOFs. Extrusion methods clearly enable scaling of mechanochemical and melt phase synthesis under solvent-free or low-solvent conditions, and may also be applied in synthesis more generally.
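As a worked illustration of how a space-time yield relates mass throughput to reactor volume (the extruder free volume used below is a hypothetical figure, not taken from the abstract):

```python
# Minimal sketch: space-time yield (STY) = mass produced per reactor volume per day.
# The extruder free volume below is a hypothetical assumption for illustration only.

throughput_kg_per_h = 4.0    # ZIF-8 rate reported for single screw extrusion
free_volume_m3 = 6.7e-4      # hypothetical extruder free volume (~0.67 L)

daily_output_kg = throughput_kg_per_h * 24
sty_kg_per_m3_per_day = daily_output_kg / free_volume_m3

print(f"Daily output: {daily_output_kg:.0f} kg")
print(f"STY: {sty_kg_per_m3_per_day:.2e} kg m^-3 day^-1")  # ~1.4e5, i.e. ~144 x 10^3
```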
Abstract:
Modern cancer research on prognostic and predictive biomarkers demands the integration of established and emerging high-throughput technologies. However, these data are meaningless unless carefully integrated with patient clinical outcome and epidemiological information. Integrated datasets hold the key to discovering new biomarkers and therapeutic targets in cancer. We have developed a novel approach and set of methods (PICan) for integrating and interrogating phenomic, genomic and clinical data sets to facilitate cancer biomarker discovery and patient stratification. Applied to a known paradigm, the biological and clinical relevance of TP53, PICan was able to recapitulate the known biomarker status and prognostic significance at the DNA, RNA and protein levels.
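The kind of integration described might look like the following minimal sketch; the column names, values and use of a chi-square association test are illustrative assumptions, not details from the abstract:

```python
# Minimal sketch of joining a genomic marker table to clinical outcomes and testing
# a biomarker/outcome association. Column names and values are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency

genomic = pd.DataFrame({
    "patient_id": [1, 2, 3, 4, 5, 6],
    "tp53_mutated": [True, False, True, True, False, False],
})
clinical = pd.DataFrame({
    "patient_id": [1, 2, 3, 4, 5, 6],
    "five_year_survival": [False, True, False, True, True, True],
})

merged = genomic.merge(clinical, on="patient_id", how="inner")

# Cross-tabulate biomarker status against outcome and test the association.
table = pd.crosstab(merged["tp53_mutated"], merged["five_year_survival"])
chi2, p_value, dof, expected = chi2_contingency(table)
print(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3g}")
```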
Abstract:
Many powders and particulate solids are cohesive in nature, and their strength often depends on the consolidation stress. As a result, the stress history in the material leading up to a handling scenario needs to be considered when evaluating its handleability. This paper outlines the development of a DEM contact model accounting for plasticity and adhesion force, which is shown to be suitable for modelling the stress history dependent cohesive strength. The model was used to simulate the confined consolidation and the subsequent unconfined loading of iron ore fines with particle sizes up to 1.18 mm. The predicted flow function was found to be comparable to the experimental results.
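For context, the flow function referred to here relates the unconfined yield strength of the consolidated solid to the major principal consolidation stress; the standard definition (not specific to this paper) is:

```latex
% Flow function: unconfined yield strength \sigma_c as a function of the major
% principal consolidation stress \sigma_1, with flowability index ff_c.
\[
  \sigma_c = F(\sigma_1),
  \qquad
  \mathrm{ff}_c = \frac{\sigma_1}{\sigma_c}
\]
% Larger ff_c means easier flow (Jenike classification: ff_c < 2 very cohesive,
% 2-4 cohesive, 4-10 easy flowing, > 10 free flowing).
```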
Abstract:
An environment has been created for the optimisation of aerofoil profiles with the inclusion of small surface features. For Tollmien-Schlichting (TS) wave dominated flows, the paper examines the consequences of adding a depression on the aerodynamic optimisation of an NLF aerofoil, and describes the geometry definition fidelity and optimisation algorithm employed in the development process. The variables that define the depression were fixed for this optimisation investigation; however, a preliminary study is presented demonstrating the sensitivity of the flow to the depression characteristics. Solutions to the optimisation problem are then presented using both gradient-based and genetic algorithm techniques. For accurate representation of the inclusion of small surface perturbations, it is concluded that a global optimisation method is required for this type of aerofoil optimisation task, owing to the nature of the response surface generated. When dealing with surface features, changes in transition onset are likely to be non-linear, so a robust optimisation algorithm is critical; this suggests that, for this framework, gradient-based methods alone are not suited.
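The contrast drawn here between gradient-based and global search can be illustrated on a deliberately multimodal test function (a toy example, not the paper's aerofoil objective):

```python
# Toy illustration: on a multimodal objective, a local gradient-based optimiser can
# stall in a local minimum while a population-based global optimiser finds a better one.
# The objective below is a standard test function, not the paper's aerofoil model.
import numpy as np
from scipy.optimize import minimize, differential_evolution

def rastrigin(x):
    x = np.asarray(x)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

bounds = [(-5.12, 5.12)] * 2
x0 = np.array([3.0, -3.5])                      # arbitrary starting point

local = minimize(rastrigin, x0, method="BFGS")  # gradient-based (numerical gradients)
globl = differential_evolution(rastrigin, bounds, seed=0)

print("gradient-based:", local.x, local.fun)    # typically a nearby local minimum
print("global search: ", globl.x, globl.fun)    # typically near the global minimum at 0
```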
Abstract:
Bulk handling of powders and granular solids is common in many industries and often gives rise to handling difficulties, especially when the material exhibits complex cohesive behaviour. For example, high storage stresses in a silo can lead to high cohesive strength of the stored solid, which may in turn cause blockages such as ratholing or arching near the outlet during discharge. This paper presents a Discrete Element Method study of the discharge of a granular solid with varying levels of cohesion from a flat-bottomed silo. The DEM simulations were conducted using the commercial EDEM code with a recently developed DEM contact model for cohesive solids implemented through an API. The contact model is based on an elasto-plastic contact with adhesion and uses hysteretic non-linear loading and unloading paths to model the elastic-plastic contact deformation. The adhesion parameter is a function of the maximum contact overlap. The model has been shown to be able to predict the stress history dependent behaviour depicted by a flow function of the material. The effects of cohesion on the discharge rate and flow pattern in the silo are investigated, and the predicted discharge rates are compared for the varying levels of cohesion to evaluate the effect of adhesion. The ability of the contact model to qualitatively predict the phenomena present during silo discharge is demonstrated, with the salient feature of mixed flow from a flat-bottomed hopper identified in the simulation.
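A much-simplified sketch of a hysteretic elasto-plastic contact law with overlap-dependent adhesion of the kind described above (linear stiffness branches and illustrative parameter values, not the EDEM API implementation):

```python
# Simplified sketch of an elasto-plastic contact with adhesion:
# loading along a softer branch k1, unloading/reloading along a stiffer branch k2,
# and a tensile (adhesive) force limit that grows with the maximum overlap reached.
# Stiffness values and the adhesion law are illustrative assumptions.

def contact_force(overlap, max_overlap, k1=1.0e4, k2=5.0e4, k_adh=2.0e3):
    """Normal contact force (N) for a given overlap (m); positive = repulsive."""
    max_overlap = max(max_overlap, overlap)                      # track loading history
    f_load = k1 * overlap                                        # plastic (loading) branch
    f_unload = k2 * (overlap - max_overlap) + k1 * max_overlap   # stiffer elastic branch
    f_min = -k_adh * max_overlap                                 # history-dependent adhesion limit
    return max(min(f_load, f_unload), f_min), max_overlap

# Load to a large overlap, then unload: the force follows the stiffer branch and
# becomes tensile (negative) before the contact finally separates.
f, d_max = contact_force(1.0e-4, 0.0)
for d in (8.0e-5, 6.0e-5, 4.0e-5, 2.0e-5):
    f, d_max = contact_force(d, d_max)
    print(f"overlap = {d:.1e} m, force = {f:+.3f} N")
```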
Abstract:
Many researchers have investigated the flow and segregation behaviour in model-scale experimental silos under normal gravity conditions. However, it is known that the stresses experienced by the bulk solid in industrial silos are high compared to those in model silos. It is therefore important to understand the effect of stress level on flow and segregation behaviour and to establish the scaling laws governing this behaviour. The objective of this paper is to understand the effect of gravity on the flow and segregation behaviour of bulk solids in a silo centrifuge model. The materials used were two mixtures composed of polyamide and glass beads. The discharge of two bi-disperse bulk solids in a silo centrifuge model was recorded under accelerations ranging from 1g to 15g. The velocity distribution during discharge was evaluated using Particle Image Velocimetry (PIV) techniques, and the concentration distributions of large and small particles were obtained by image processing techniques. The flow and segregation behaviour at high gravities were then quantified and compared with the empirical equations available in the literature.
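PIV estimates local displacement between successive frames from the cross-correlation peak of small interrogation windows; a minimal sketch of that idea on synthetic data (not the study's images or software) is:

```python
# Minimal sketch of the PIV idea: the displacement between two interrogation windows
# is estimated from the peak of their cross-correlation. Synthetic data for illustration.
import numpy as np
from scipy.signal import correlate2d

rng = np.random.default_rng(0)
frame_a = rng.random((32, 32))                             # interrogation window at time t
frame_b = np.roll(frame_a, shift=(3, -2), axis=(0, 1))     # same pattern shifted by (3, -2)

corr = correlate2d(frame_b - frame_b.mean(), frame_a - frame_a.mean(), mode="full")
peak = np.unravel_index(np.argmax(corr), corr.shape)
dy = peak[0] - (frame_a.shape[0] - 1)
dx = peak[1] - (frame_a.shape[1] - 1)

print(f"estimated displacement: dy = {dy}, dx = {dx}")     # expect (3, -2)
```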
Abstract:
This paper presents an analytical solution for the solid stresses in a silo with an internal tube. The research was conducted to support the design of a group of full scale silos with large inner concrete tubes. The silos were blasted and formed out of solid rock underground for storing iron ore pellets. Each of these silos is 40 m in diameter and has a 10 m diameter concrete tube with five levels of openings constructed at the centre of each rock silo. A large scale model was constructed to investigate the stress regime for the stored pellets and to evaluate the solids flow pattern and the loading on the concrete tube. This paper focuses on the development of an analytical solution for stresses in the iron ore pellets in the silo and the effect of the central tube on the stress regimes. The solution is verified using finite element analysis before being applied to analyse stresses in the solid in the full scale silo and the effect of the size of the tube.
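A Janssen-type slice equilibrium is the classical starting point for this kind of analysis; adapted to an annular cross-section with a central tube, and assuming the same wall friction coefficient and lateral pressure ratio on both the rock wall and the tube, it takes the form below (a standard generalisation stated for context, not necessarily the paper's exact solution):

```latex
% Janssen-type vertical equilibrium of a horizontal slice of the stored solid in an
% annular silo of outer radius R_o (rock wall) and inner radius R_i (tube wall):
%   A = \pi (R_o^2 - R_i^2)   cross-sectional area
%   U = 2\pi (R_o + R_i)      total wall perimeter carrying friction
\[
  \frac{\mathrm{d}\sigma_v}{\mathrm{d}z} = \gamma - \frac{\mu K U}{A}\,\sigma_v
  \quad\Longrightarrow\quad
  \sigma_v(z) = \frac{\gamma A}{\mu K U}\left(1 - \mathrm{e}^{-\mu K U z / A}\right)
\]
% \gamma: bulk unit weight, \mu: wall friction coefficient, K: lateral pressure ratio,
% z: depth below the solid surface.
```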
Abstract:
In this study, the behaviour of iron ore fines with varying levels of adhesion was investigated using a confined compression test and a uniaxial test. The uniaxial test was conducted using the semi-automated uniaxial EPT tester, in which the cohesive strength of a bulk solid is evaluated from an unconfined compression test following a period of consolidation to a pre-defined vertical stress. The iron ore fines were also tested by measuring both the vertical and circumferential strains on the cylindrical container walls under vertical loading in a separate confined compression tester, the K0 tester, to determine the lateral pressure ratio. Discrete Element Method simulations of both experiments were carried out and the predictions were compared with the experimental observations. A recently developed DEM contact model for cohesive solids, an Elasto-Plastic Adhesive model, was used. This particle contact model uses hysteretic non-linear loading and unloading paths and an adhesion parameter which is a function of the maximum contact overlap. The model parameters for the simulations are phenomenologically based to reproduce the key bulk characteristics exhibited by the solid. The simulation results show good agreement in capturing the stress history dependent behaviour depicted by the flow function of the cohesive iron ore fines, while also providing a reasonably good match for the lateral pressure ratio observed during the confined compression K0 tests. This demonstrates the potential for the DEM model to be used in the simulation of bulk handling applications.
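For reference, in a K0 tester of this kind the horizontal stress is typically inferred from the measured circumferential (hoop) strain of the thin cylindrical wall; a standard thin-wall approximation (stated as context, not the paper's exact data reduction) is:

```latex
% Lateral pressure ratio from a K0 (confined compression) test: the horizontal stress
% is inferred from the hoop strain of the thin cylindrical wall.
\[
  \sigma_h \approx \frac{E_w\, t\, \varepsilon_\theta}{R},
  \qquad
  K_0 = \frac{\sigma_h}{\sigma_v}
\]
% E_w: wall Young's modulus, t: wall thickness, R: cylinder radius,
% \varepsilon_\theta: measured circumferential strain, \sigma_v: applied vertical stress.
```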
Abstract:
With over 50 billion downloads and more than 1.3 million apps in Google’s official market, Android has continued to gain popularity amongst smartphone users worldwide. At the same time there has been a rise in malware targeting the platform, with more recent strains employing highly sophisticated detection avoidance techniques. As traditional signature-based methods become less potent in detecting unknown malware, alternatives are needed for timely zero-day discovery. This paper therefore proposes an approach that utilizes ensemble learning for Android malware detection. It combines the advantages of static analysis with the efficiency and performance of ensemble machine learning to improve Android malware detection accuracy. The machine learning models are built using a large repository of malware samples and benign apps from a leading antivirus vendor. The experimental results and analysis presented show that the proposed method, which uses a large feature space to leverage the power of ensemble learning, is capable of 97.3% to 99% detection accuracy with very low false positive rates.
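A minimal sketch of the general approach, static features fed to an ensemble classifier, is shown below; the feature encoding, synthetic data and choice of a random forest are illustrative assumptions rather than the paper's pipeline:

```python
# Minimal sketch: static features (e.g. requested permissions, API call flags) encoded
# as a binary vector per app, classified with an ensemble learner. The data here are
# synthetic, so the reported accuracy is near chance; real features are required for
# meaningful results. This is not the paper's actual pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_apps, n_features = 1000, 200          # apps x static features (permissions, API calls)
X = rng.integers(0, 2, size=(n_apps, n_features))
y = rng.integers(0, 2, size=n_apps)     # 1 = malware, 0 = benign (synthetic labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)  # ensemble of decision trees
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```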
Abstract:
Increasing consumer demand for seafood, combined with concern over the health of our oceans, has led to many initiatives aimed at tackling destructive fishing practices and promoting the sustainability of fisheries. An important global threat to sustainable fisheries is Illegal, Unreported and Unregulated (IUU) fishing, and there is now an increased emphasis on the use of trade measures to prevent IUU-sourced fish and fish products from entering the international market. Initiatives encompass new legislation in the European Union requiring the inclusion of species names on catch labels throughout the distribution chain. Such certification measures do not, however, guarantee accuracy of species designation. Using two DNA-based methods to compare species descriptions with molecular ID, we examined 386 samples of white fish, or products labelled as primarily containing white fish, from major UK supermarket chains. Species-specific real-time PCR probes were used for cod (Gadus morhua) and haddock (Melanogrammus aeglefinus) to provide a highly sensitive and species-specific test for the major species of white fish sold in the UK. Additionally, fish-specific primers were used to sequence the forensically validated barcoding gene, mitochondrial cytochrome oxidase I (COI). Overall levels of congruence between product label and genetic species identification were high, with 94.34% of samples correctly labelled, though a significant proportion, in terms of potential volume, was mislabelled. Substitution was usually for a cheaper alternative and, in one case, extended to a tropical species. To our knowledge, this is the first published study encompassing a large-scale assessment of UK retailers and, if representative, it indicates a potentially significant incidence of incorrect product designation.
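The headline congruence figure is simply the fraction of samples whose declared label matches the genetic identification; a trivial sketch of that comparison on hypothetical records (not the study's data) is:

```python
# Trivial sketch: per-sample comparison of the declared label against the genetic ID.
# The records below are hypothetical; the study reported 94.34% congruence over 386 samples.
samples = [
    {"label": "Gadus morhua", "genetic_id": "Gadus morhua"},
    {"label": "Gadus morhua", "genetic_id": "Melanogrammus aeglefinus"},
    {"label": "Melanogrammus aeglefinus", "genetic_id": "Melanogrammus aeglefinus"},
]
matches = sum(s["label"] == s["genetic_id"] for s in samples)
print(f"correctly labelled: {100 * matches / len(samples):.2f}%")
```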
Abstract:
The problem of detecting spatially-coherent groups of data that exhibit anomalous behavior has started to attract attention due to applications across areas such as epidemic analysis and weather forecasting. Earlier efforts from the data mining community have largely focused on finding outliers, individual data objects that display deviant behavior. Such point-based methods are not easy to extend to find groups of data that exhibit anomalous behavior. Scan statistics are methods from the statistics community that have considered the problem of identifying regions where data objects exhibit a behavior that is atypical of the general dataset. The spatial scan statistic and methods that build upon it mostly adopt the framework of defining a characteristic shape for regions (e.g., circular or elliptical), repeatedly sampling regions of that shape, and applying a statistical test for anomaly detection. In the past decade, there have been efforts from the statistics community to enhance the efficiency of scan statistics as well as to enable discovery of arbitrarily shaped anomalous regions. On the other hand, the data mining community has started to look at determining anomalous regions whose behavior diverges from their neighborhood. In this chapter, we survey the space of techniques for detecting anomalous regions in spatial data from across the data mining and statistics communities, while outlining connections to well-studied problems in clustering and image segmentation. We analyze the techniques systematically by categorizing them appropriately to provide a structured bird's-eye view of the work on anomalous region detection; we hope that this will encourage better cross-pollination of ideas across communities to help advance the frontier in anomaly detection.
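A minimal sketch of the circular-scan idea outlined above, scoring candidate circles with a Poisson likelihood ratio (a simplified illustration on synthetic points; significance testing by Monte Carlo replication, as in a full Kulldorff scan, is omitted):

```python
# Simplified circular spatial scan: score each candidate circle (centred on a data point)
# with a Poisson likelihood ratio comparing observed vs expected counts inside.
# Synthetic data and a brute-force search; not a full scan-statistic implementation.
import numpy as np

rng = np.random.default_rng(0)
xy = rng.uniform(0, 10, size=(200, 2))          # point locations
cases = rng.poisson(1.0, size=200)              # baseline counts per location
hot = np.linalg.norm(xy - [7.0, 7.0], axis=1) < 1.0
cases[hot] += rng.poisson(3.0, size=hot.sum())  # injected anomalous cluster near (7, 7)

C, N = cases.sum(), len(xy)

def poisson_llr(c_in, n_in):
    """Likelihood-ratio score for c_in cases inside a circle covering n_in of N points."""
    expected = C * n_in / N
    if c_in <= expected or c_in == C:
        return 0.0
    return (c_in * np.log(c_in / expected)
            + (C - c_in) * np.log((C - c_in) / (C - expected)))

best_score, best_circle = 0.0, None
for centre in xy:                               # candidate centres: the data points
    dists = np.linalg.norm(xy - centre, axis=1)
    for radius in (0.5, 1.0, 1.5, 2.0):         # candidate radii
        inside = dists <= radius
        score = poisson_llr(cases[inside].sum(), inside.sum())
        if score > best_score:
            best_score, best_circle = score, (centre, radius)

print("highest-scoring circle:", best_circle, "LLR =", round(best_score, 2))
```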
Abstract:
To value something, you first have to know what it is. Bartkowski et al. (2015) reveal a critical weakness: that biodiversity has rarely, if ever, been defined in economic valuations of putative biodiversity. Here we argue that a precise definition is available and could help focus valuation studies, but that in using this scientific definition (a three-dimensional measure of total difference), valuation by stated-preference methods becomes, at best, very difficult. We reclassify the valuation studies reviewed by Bartkowski et al. (2015) to better reflect the biological definition of biodiversity and its potential indirect use value as the support for provisioning and regulating services. Our analysis shows that almost all of the studies reviewed by Bartkowski et al. (2015) were not about biodiversity, but rather about the 'vague notion' of naturalness, or sometimes a specific biological component of diversity. Alternative economic methods should be found to value biodiversity as it is defined in natural science. We suggest options based on a production function analogy or on cost-based methods. The first of these, in particular, provides a strong link between economic theory and ecological research and is empirically practical. Since applied science emphasizes a scientific definition of biodiversity in the design and justification of conservation plans, the need for economic valuation of this quantitative meaning of biodiversity is considerable and as yet unfulfilled.