77 results for "Topology-based methods"


Relevance: 80.00%

An environment has been created for the optimisation of aerofoil profiles with the inclusion of small surface features. For Tollmien–Schlichting (TS) wave dominated flows, the paper examines the consequences of adding a depression during the aerodynamic optimisation of an NLF aerofoil, and describes the geometry definition fidelity and optimisation algorithms employed in the development process. The variables that define the depression were fixed for this optimisation investigation; however, a preliminary study is presented demonstrating the sensitivity of the flow to the depression characteristics. Solutions to the optimisation problem are then presented using both gradient-based and genetic algorithm techniques. Because of the nature of the response surface generated when small surface perturbations are represented accurately, it is concluded that a global optimisation method is required for this type of aerofoil optimisation task. Changes in transition onset caused by surface features are likely to be non-linear, so a robust optimisation algorithm is essential, suggesting that gradient-based methods alone are not suited to this framework.

Bulk handling of powders and granular solids is common in many industries and often gives rise to handling difficulties, especially when the material exhibits complex cohesive behaviour. For example, high storage stresses in a silo can lead to high cohesive strength of the stored solid, which may in turn cause blockages such as ratholing or arching near the outlet during discharge. This paper presents a Discrete Element Method (DEM) study of the discharge of a granular solid with varying levels of cohesion from a flat-bottomed silo. The DEM simulations were conducted using the commercial EDEM code with a recently developed contact model for cohesive solids implemented through an API. The contact model is based on an elasto-plastic contact with adhesion and uses hysteretic non-linear loading and unloading paths to model the elasto-plastic contact deformation. The adhesion parameter is a function of the maximum contact overlap. The model has been shown to be able to predict the stress-history-dependent behaviour depicted by the flow function of the material. The effects of cohesion on the discharge rate and flow pattern in the silo are investigated. The predicted discharge rates are compared for the varying levels of cohesion and the effect of adhesion is evaluated. The ability of the contact model to qualitatively predict the phenomena present in silo discharge has been demonstrated, with the salient feature of mixed flow from a flat-bottomed silo identified in the simulation.
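A minimal, linearised sketch of a hysteretic elasto-plastic contact law with adhesion, in the spirit of the model described above: the published model uses non-linear loading/unloading paths, and the stiffnesses and adhesion coefficient here are illustrative assumptions.

```python
def contact_force(delta, delta_max, k1=1000.0, k2=2000.0, kc=500.0):
    """Normal contact force (N) at overlap delta (m), given the largest
    overlap delta_max reached so far.

    k1: virgin-loading (plastic) stiffness; k2 > k1: unload/reload stiffness;
    kc: adhesion stiffness. The tensile (pull-off) limit -kc*delta acts on the
    plastic overlap left behind by delta_max, so adhesion is history dependent.
    """
    delta0 = (1.0 - k1 / k2) * delta_max     # residual (plastic) overlap
    f = k2 * (delta - delta0)                # unloading / reloading branch
    f = min(f, k1 * delta)                   # capped by the virgin loading line
    f = max(f, -kc * delta)                  # capped by the adhesive limit
    return f

peak = contact_force(0.010, 0.010)     # on the loading branch: k1 * delta
pull = contact_force(0.0045, 0.010)    # tensile (adhesive) during unloading
```

Loading to a larger maximum overlap leaves more plastic deformation and a stronger pull-off force, which is the mechanism behind the stress-history-dependent flow function mentioned above.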

Many researchers have investigated flow and segregation behaviour in model-scale experimental silos under normal gravity. However, the stresses experienced by the bulk solid in industrial silos are high compared with those in model silos. It is therefore important to understand the effect of stress level on flow and segregation behaviour and to establish the scaling laws governing this behaviour. The objective of this paper is to understand the effect of gravity on the flow and segregation behaviour of bulk solids in a silo centrifuge model. The materials used were two mixtures composed of polyamide and glass beads. The discharge of two bi-disperse bulk solids in a silo centrifuge model was recorded under accelerations ranging from 1g to 15g. The velocity distribution during discharge was evaluated using Particle Image Velocimetry (PIV) techniques, and the concentration distributions of large and small particles were obtained by image processing techniques. The flow and segregation behaviour at high gravities were then quantified and compared with the empirical equations available in the literature.
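For intuition about gravity scaling, the classical Beverloo correlation (a standard empirical discharge-rate equation, not taken from this paper) predicts how the mass discharge rate from a flat-bottomed silo changes with the acceleration level in a centrifuge. All parameter values are illustrative.

```python
import math

def beverloo_rate(rho_b, g, D, d, C=0.58, k=1.5):
    """Beverloo mass discharge rate W = C * rho_b * sqrt(g) * (D - k*d)**2.5.

    rho_b: bulk density (kg/m^3), g: acceleration level (m/s^2),
    D: orifice diameter (m), d: particle diameter (m). C and k are empirical.
    """
    return C * rho_b * math.sqrt(g) * (D - k * d) ** 2.5

g1 = 9.81
w_1g = beverloo_rate(rho_b=1500.0, g=g1, D=0.05, d=0.003)
w_15g = beverloo_rate(rho_b=1500.0, g=15.0 * g1, D=0.05, d=0.003)
ratio = w_15g / w_1g   # sqrt(15): discharge rate grows with the root of g
```

Under this correlation a 15g centrifuge test discharges roughly sqrt(15) ≈ 3.9 times faster than the same model at 1g, which is the kind of empirical scaling the paper's measurements are compared against.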

This paper presents an analytical solution for the solid stresses in a silo with an internal tube. The research was conducted to support the design of a group of full-scale silos with large inner concrete tubes. The silos were blasted and formed out of solid rock underground for storing iron ore pellets. Each of these silos is 40 m in diameter and has a 10 m diameter concrete tube, with five levels of openings, constructed at its centre. A large-scale model was constructed to investigate the stress regime in the stored pellets and to evaluate the solids flow pattern and the loading on the concrete tube. This paper focuses on the development of an analytical solution for the stresses in the iron ore pellets in the silo and on the effect of the central tube on the stress regimes. The solution is verified using finite element analysis before being applied to analyse the stresses in the solid in the full-scale silo and the effect of the size of the tube.
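As a simple reference point, the classical Janssen equation, extended to an annular cross-section so that both the outer wall and the central tube carry wall friction, gives the kind of first-order vertical stress estimate that a full analytical solution of this type refines. The bulk unit weight, friction coefficient, and lateral pressure ratio below are illustrative assumptions, not the paper's values.

```python
import math

def janssen_annulus(z, R_outer, R_tube, gamma=22000.0, mu=0.5, K=0.4):
    """Mean vertical stress (Pa) at depth z (m) in an annular silo.

    gamma: bulk unit weight (N/m^3); mu: wall friction coefficient;
    K: lateral pressure ratio. Friction acts on both the outer wall and
    the tube wall, so the combined perimeter enters the depth scale.
    """
    A = math.pi * (R_outer ** 2 - R_tube ** 2)   # annular cross-section area
    U = 2.0 * math.pi * (R_outer + R_tube)       # combined friction perimeter
    z0 = A / (mu * K * U)                        # Janssen depth scale
    return gamma * z0 * (1.0 - math.exp(-z / z0))

# 40 m diameter silo with a 10 m diameter central tube, as described above:
sigma_v = janssen_annulus(z=30.0, R_outer=20.0, R_tube=5.0)
```

The central tube adds friction perimeter while removing cross-sectional area, so it shortens the Janssen depth scale and lowers the asymptotic vertical stress relative to an open silo of the same outer diameter.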

In this study, the behaviour of iron ore fines with varying levels of adhesion was investigated using a confined compression test and a uniaxial test. The uniaxial test was conducted using the semi-automated uniaxial EPT tester, in which the cohesive strength of a bulk solid is evaluated from an unconfined compression test following a period of consolidation to a pre-defined vertical stress. The iron ore fines were also tested in a separate confined compression tester, the K0 tester, by measuring both the vertical and circumferential strains on the cylindrical container walls under vertical loading, in order to determine the lateral pressure ratio. Discrete Element Method (DEM) simulations of both experiments were carried out and the predictions were compared with the experimental observations. A recently developed DEM contact model for cohesive solids, an Elasto-Plastic Adhesive model, was used. This particle contact model uses hysteretic non-linear loading and unloading paths and an adhesion parameter that is a function of the maximum contact overlap. The model parameters for the simulations are phenomenologically based to reproduce the key bulk characteristics exhibited by the solid. The simulation results show good agreement in capturing the stress-history-dependent behaviour depicted by the flow function of the cohesive iron ore fines, while also providing a reasonably good match for the lateral pressure ratio observed during the confined compression K0 tests. This demonstrates the potential for the DEM model to be used in the simulation of bulk handling applications.

With over 50 billion downloads and more than 1.3 million apps in Google's official market, Android has continued to gain popularity amongst smartphone users worldwide. At the same time there has been a rise in malware targeting the platform, with more recent strains employing highly sophisticated detection-avoidance techniques. As traditional signature-based methods become less potent in detecting unknown malware, alternatives are needed for timely zero-day discovery. This paper therefore proposes an approach that utilizes ensemble learning for Android malware detection. It combines the advantages of static analysis with the efficiency and performance of ensemble machine learning to improve Android malware detection accuracy. The machine learning models are built using a large repository of malware samples and benign apps from a leading antivirus vendor. The experimental results and analysis presented show that the proposed method, which uses a large feature space to leverage the power of ensemble learning, is capable of 97.3% to 99% detection accuracy with very low false positive rates.
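A minimal sketch of the general approach: static features encoded as a binary vector per app, fed to an ensemble classifier. The data here are synthetic; the paper's models were trained on a vendor's repository, and the feature count and labelling rule below are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_apps, n_features = 2000, 100
# Each app: a binary vector of static features (e.g. permissions, API calls).
X = rng.integers(0, 2, size=(n_apps, n_features))
# Synthetic ground truth: a hidden rule over ten of the features plus noise.
w = np.zeros(n_features)
w[:10] = 1.0
y = (X @ w + rng.normal(0.0, 0.5, n_apps) > 5.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)   # held-out detection accuracy
```

A random forest is itself a bagged ensemble of trees; the same pattern extends to the heterogeneous ensembles (e.g. voting or stacking over different base learners) that the paper's approach relies on.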

Increasing consumer demand for seafood, combined with concern over the health of our oceans, has led to many initiatives aimed at tackling destructive fishing practices and promoting the sustainability of fisheries. An important global threat to sustainable fisheries is Illegal, Unreported and Unregulated (IUU) fishing, and there is now an increased emphasis on the use of trade measures to prevent IUU-sourced fish and fish products from entering the international market. Initiatives include new legislation in the European Union requiring the inclusion of species names on catch labels throughout the distribution chain. Such certification measures do not, however, guarantee accuracy of species designation. Using two DNA-based methods to compare species descriptions with molecular identification, we examined 386 samples of white fish, or products labelled as primarily containing white fish, from major UK supermarket chains. Species-specific real-time PCR probes were used for cod (Gadus morhua) and haddock (Melanogrammus aeglefinus) to provide a highly sensitive and species-specific test for the major species of white fish sold in the UK. Additionally, fish-specific primers were used to sequence the forensically validated barcoding gene, mitochondrial cytochrome oxidase I (COI). Overall levels of congruence between product label and genetic species identification were high, with 94.34% of samples correctly labelled, though a proportion that is significant in terms of potential volume was mislabelled. Substitution was usually for a cheaper alternative and, in one case, extended to a tropical species. To our knowledge, this is the first published study encompassing a large-scale assessment of UK retailers and, if representative, it indicates a potentially significant incidence of incorrect product designation.

The problem of detecting spatially-coherent groups of data that exhibit anomalous behavior has started to attract attention due to applications across areas such as epidemic analysis and weather forecasting. Earlier efforts from the data mining community have largely focused on finding outliers: individual data objects that display deviant behavior. Such point-based methods are not easy to extend to find groups of data that exhibit anomalous behavior. Scan statistics are methods from the statistics community that have considered the problem of identifying regions where data objects exhibit behavior that is atypical of the general dataset. The spatial scan statistic and the methods that build upon it mostly adopt the framework of defining a shape for candidate regions (e.g., circular or elliptical), repeatedly sampling regions of that shape, and applying a statistical test for anomaly detection. In the past decade, there have been efforts from the statistics community to enhance the efficiency of scan statistics as well as to enable the discovery of arbitrarily shaped anomalous regions. On the other hand, the data mining community has started to look at determining anomalous regions whose behavior diverges from their neighborhood. In this chapter, we survey the space of techniques for detecting anomalous regions in spatial data from across the data mining and statistics communities, while outlining connections to well-studied problems in clustering and image segmentation. We analyze the techniques systematically, categorizing them appropriately to provide a structured bird's-eye view of the work on anomalous region detection; we hope that this will encourage better cross-pollination of ideas across communities and help advance the frontier in anomaly detection.
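The circular-region framework described above can be sketched compactly: score each candidate circle with a Kulldorff-style Poisson log-likelihood ratio comparing the rate inside against the rate outside. The synthetic data, grid of centers, and single fixed radius below are illustrative simplifications.

```python
import numpy as np

rng = np.random.default_rng(1)
pts = rng.uniform(0.0, 10.0, size=(500, 2))     # point locations, 10x10 area
counts = rng.poisson(1.0, size=500)             # baseline event counts
hot = np.linalg.norm(pts - np.array([7.0, 7.0]), axis=1) < 1.0
counts[hot] += rng.poisson(3.0, size=int(hot.sum()))  # implanted hot cluster

def scan_llr(center, radius):
    """Kulldorff-style Poisson log-likelihood ratio for one circular region."""
    inside = np.linalg.norm(pts - center, axis=1) < radius
    c_in, c_out = counts[inside].sum(), counts[~inside].sum()
    n_in, n_out = inside.sum(), (~inside).sum()
    if n_in == 0 or n_out == 0 or c_in == 0:
        return 0.0
    lam_in, lam_out = c_in / n_in, c_out / n_out
    if lam_in <= lam_out:                       # score only elevated regions
        return 0.0
    total = c_in + c_out
    return float(c_in * np.log(lam_in) + c_out * np.log(lam_out)
                 - total * np.log(total / (n_in + n_out)))

# Scan a grid of candidate centers at one fixed radius; a full scan statistic
# would also vary the radius and assess significance by Monte Carlo
# replication under the null.
centers = [(x, y) for x in range(1, 10) for y in range(1, 10)]
best = max(centers, key=lambda c: scan_llr(np.array(c, dtype=float), 1.0))
```

The highest-scoring circle recovers the implanted cluster around (7, 7), illustrating why the repeated-sampling-plus-test framework finds coherent regions that point-based outlier scores miss.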

To value something, you first have to know what it is. Bartkowski et al. (2015) reveal a critical weakness: biodiversity has rarely, if ever, been defined in economic valuations of putative biodiversity. Here we argue that a precise definition is available and could help focus valuation studies, but that in using this scientific definition (a three-dimensional measure of total difference), valuation by stated-preference methods becomes, at best, very difficult. We reclassify the valuation studies reviewed by Bartkowski et al. (2015) to better reflect the biological definition of biodiversity and its potential indirect use value as the support for provisioning and regulating services. Our analysis shows that almost all of the studies reviewed by Bartkowski et al. (2015) were not about biodiversity, but rather about the 'vague notion' of naturalness, or sometimes about a specific biological component of diversity. Alternative economic methods should be found to value biodiversity as it is defined in natural science. We suggest options based on a production-function analogy or on cost-based methods. The first of these in particular provides a strong link between economic theory and ecological research and is empirically practical. Since applied science emphasizes a scientific definition of biodiversity in the design and justification of conservation plans, the need for economic valuation of this quantitative meaning of biodiversity is considerable and as yet unfulfilled.

Damage detection in bridges using vibration-based methods is an area of growing research interest. Improved assessment methodologies combined with state-of-the-art sensor technology are rapidly making these approaches applicable to real-world structures. Applying these techniques to the detection and monitoring of scour around bridge foundations has remained challenging; however, this area has gained traction in recent years. Several authors have investigated a range of methods, but significant work is still required to achieve a rounded and widely applicable methodology to detect and monitor scour. This paper presents a novel Vehicle-Bridge-Soil Dynamic Interaction (VBSDI) model which can be used to simulate the effect of scour on an integral bridge. The model outputs dynamic signals which can be analysed to determine modal parameters, and the variation of these parameters with respect to scour can be examined. The key novelty of this model is that it is the first numerical model for simulating scour that combines a realistic vehicle loading model with a robust foundation soil response model. This paper describes the development of the model and explains the mathematical theory underlying it. Finally, a case study application of the model using typical bridge, soil, and vehicle properties is provided.
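The kind of post-processing such output signals permit can be sketched with a damped single-degree-of-freedom surrogate: a drop in foundation stiffness (scour) lowers the natural frequency picked from the FFT peak of the response. The stiffness, mass, and damping values below are illustrative assumptions, not the paper's bridge properties.

```python
import numpy as np

def natural_frequency(k, m=1.0e5, zeta=0.02, fs=200.0, T=20.0):
    """Estimate the natural frequency (Hz) from a simulated free-decay signal.

    k: stiffness (N/m), m: mass (kg), zeta: damping ratio,
    fs: sampling rate (Hz), T: record length (s).
    """
    wn = np.sqrt(k / m)                      # undamped natural freq (rad/s)
    wd = wn * np.sqrt(1.0 - zeta ** 2)       # damped natural freq (rad/s)
    t = np.arange(0.0, T, 1.0 / fs)
    x = np.exp(-zeta * wn * t) * np.cos(wd * t)   # free-vibration decay
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
    return float(freqs[np.argmax(spec)])

f_intact = natural_frequency(k=4.0e7)    # stiff (unscoured) foundation
f_scoured = natural_frequency(k=2.5e7)   # reduced stiffness after scour
```

Tracking this frequency over time is the basic idea behind using modal parameters as a scour indicator; the paper's model generates the forced-vibration signals from which such parameters would be extracted.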

This case study deals with the role of time series analysis in sociology, and its relationship with the wider literature and methodology of comparative case study research. Time series analysis is now well represented in top-ranked sociology journals, often in the form of 'pooled time series' research designs. These studies typically pool multiple countries into a pooled time series cross-section panel, in order to provide a larger sample for more robust and comprehensive analysis. This approach is well suited to exploring trans-national phenomena and to elaborating useful macro-level theories specific to social structures, national policies, and long-term historical processes. It is less suited, however, to understanding how these global social processes work in different countries. As such, the complexities of individual countries, which often display very different or contradictory dynamics from those suggested in pooled studies, are subsumed. Meanwhile, a robust literature on comparative case-based methods exists in the social sciences, where researchers focus on differences between cases and the complex ways in which they co-evolve or diverge over time. A good example is the inequality literature: although panel studies suggest a general trend of rising inequality driven by the weakening power of labour, the marketisation of welfare, and the rising power of capital, some countries have still managed to remain resilient. This case study takes a closer look at what can be learned by applying the insights of case-based comparative research to the method of time series analysis. Taking international income inequality as its point of departure, it argues that we have much to learn about the viability of different combinations of policy options by examining how they work in different countries over time. By taking representative cases from different welfare systems (liberal, social democratic, corporatist, or antipodean), we can better sharpen our theories of how policies can be engineered to offset rising inequality. This involves a fundamental realignment of the strategy of time series analysis, grounding it instead in a qualitative appreciation of the historical context of cases, as a basis for comparing effects between different countries.

In the past decade, several major food safety crises originated from problems with feed. Consequently, there is an urgent need for early detection of fraudulent adulteration and contamination in the feed chain. Strategies are presented for two specific cases, viz. adulteration of (i) soybean meal with melamine and other types of adulterants/contaminants and (ii) vegetable oils with mineral oil, transformer oil or other oils. These strategies comprise screening at the feed mill or port of entry with non-destructive spectroscopic methods (NIRS and Raman), followed by post-screening and confirmation in the laboratory with MS-based methods. The spectroscopic techniques are suitable for on-site and on-line applications. Currently they are suited to detecting fraudulent adulteration at relatively high levels, but not to detecting low-level contamination. The potential use of the strategies for non-targeted analysis is demonstrated.
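One common way to perform non-targeted screening on spectra is a one-class, SIMCA-style PCA residual check: fit a subspace to authentic samples only, then flag any spectrum whose reconstruction error exceeds a threshold. This is a generic sketch on synthetic data, not the paper's chemometric workflow, and every number below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(2)
wavelengths = 200
basis = rng.normal(size=(3, wavelengths))      # three latent "pure" spectra
authentic = rng.normal(size=(100, 3)) @ basis \
    + 0.05 * rng.normal(size=(100, wavelengths))

# Fit a 3-component PCA subspace to authentic samples only.
mean = authentic.mean(axis=0)
_, _, Vt = np.linalg.svd(authentic - mean, full_matrices=False)
P = Vt[:3]                                     # retained loadings (3 x bands)

def q_residual(spectrum):
    """Squared reconstruction error outside the authentic-sample subspace."""
    r = spectrum - mean
    return float(((r - P.T @ (P @ r)) ** 2).sum())

# Flag anything above the 99th percentile of authentic residuals.
threshold = np.quantile([q_residual(s) for s in authentic], 0.99)
suspect = authentic[0] + 0.8 * rng.normal(size=wavelengths)  # foreign signal
flagged = q_residual(suspect) > threshold
```

Because the model is trained only on authentic material, this catches any deviation, not just a pre-defined adulterant, which is what makes the screening non-targeted.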

Background: Esophageal adenocarcinoma (EA) is one of the fastest-rising cancers in western countries. Barrett's Esophagus (BE) is the premalignant precursor of EA. However, only a subset of BE patients develop EA, which complicates clinical management in the absence of valid predictors. Genetic risk factors for BE and EA are incompletely understood. This study aimed to identify novel genetic risk factors for BE and EA.

Methods: Within an international consortium of groups involved in the genetics of BE/EA, we performed the first meta-analysis of all genome-wide association studies (GWAS) available, involving 6,167 BE patients, 4,112 EA patients, and 17,159 representative controls, all of European ancestry, genotyped on Illumina high-density SNP arrays, collected from four separate studies within North America, Europe, and Australia. Meta-analysis was conducted using the fixed-effects inverse variance-weighting approach. We used the standard genome-wide significance threshold of 5×10^-8 for this study. We also conducted an association analysis following re-weighting of loci using an approach that investigates annotation enrichment among the genome-wide significant loci. The entire GWAS data set was also analyzed using bioinformatics approaches, including functional annotation databases as well as gene-based and pathway-based methods, in order to identify pathophysiologically relevant cellular pathways.

Findings: We identified eight new risk loci associated with BE and EA, within or near the CFTR (rs17451754, P=4.8×10^-10), MSRA (rs17749155, P=5.2×10^-10), BLK (rs10108511, P=2.1×10^-9), KHDRBS2 (rs62423175, P=3.0×10^-9), TPPP/CEP72 (rs9918259, P=3.2×10^-9), TMOD1 (rs7852462, P=1.5×10^-8), SATB2 (rs139606545, P=2.0×10^-8), and HTR3C/ABCC5 (rs9823696, P=1.6×10^-8) genes. A further novel risk locus at LPA (rs12207195, posterior probability=0.925) was identified after re-weighting using significantly enriched annotations. This study thereby doubled the number of known risk loci. The strongest disease pathways identified (P<10^-6) belong to muscle cell differentiation and to mesenchyme development/differentiation, which fit with current pathophysiological BE/EA concepts. To our knowledge, this study identified for the first time an EA-specific association (rs9823696, P=1.6×10^-8) near HTR3C/ABCC5 which is independent of BE development (P=0.45).

Interpretation: The identified disease loci and pathways reveal new insights into the etiology of BE and EA. Furthermore, the EA-specific association at HTR3C/ABCC5 may constitute a novel genetic marker for predicting the transition from BE to EA. Mutations in CFTR, one of the new risk loci identified in this study, cause cystic fibrosis (CF), the most common recessive disorder in Europeans. Gastroesophageal reflux (GER) belongs to the phenotypic CF spectrum and represents the main risk factor for BE/EA. Thus, the CFTR locus may trigger a common GER-mediated pathophysiology.
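The fixed-effects inverse variance-weighting pooling named in the Methods can be sketched in a few lines. The per-study effect sizes and standard errors below are invented for illustration, not taken from the study.

```python
import math

def inverse_variance_meta(betas, ses):
    """Fixed-effects pooled estimate: weight each study by 1/SE^2.

    betas: per-study effect estimates (e.g. log odds ratios for one SNP);
    ses: their standard errors. Returns (pooled beta, pooled SE, Z score).
    """
    weights = [1.0 / se ** 2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return beta, se, beta / se

# Four illustrative study strata (invented numbers standing in for the North
# American, European, and Australian contributions):
beta, se, z = inverse_variance_meta(
    betas=[0.12, 0.10, 0.15, 0.08], ses=[0.04, 0.05, 0.06, 0.07])
```

Precise studies dominate the pooled estimate, and the pooled standard error is always smaller than the best single study's, which is what lets a consortium meta-analysis reach the 5×10^-8 genome-wide threshold that no contributing study reaches alone.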

Estimates of HIV prevalence are important for policy in order to establish the health status of a country's population and to evaluate the effectiveness of population-based interventions and campaigns. However, participation rates in testing for surveillance conducted as part of household surveys, on which many of these estimates are based, can be low. HIV-positive individuals may be less likely to participate because they fear disclosure, in which case estimates obtained using conventional approaches to deal with missing data, such as imputation-based methods, will be biased. We develop a Heckman-type simultaneous equation approach which accounts for non-ignorable selection but, unlike previous implementations, allows for spatial dependence and does not impose a homogeneous selection process on all respondents. In addition, our framework addresses the issue of separation, where, for instance, some factors are severely unbalanced and highly predictive of the response, which would ordinarily prevent model convergence. Estimation is carried out within a penalized likelihood framework where smoothing is achieved using a parametrization of the smoothing criterion which makes estimation more stable and efficient. We provide software for straightforward implementation of the proposed approach, and apply our methodology to estimating national and sub-national HIV prevalence in Swaziland, Zimbabwe and Zambia.
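For orientation, the classic non-spatial Heckman two-step correction that the proposed model generalises can be sketched on simulated data: a probit for participation, then an outcome regression augmented with the inverse Mills ratio. All coefficients and the data-generating process below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 5000
z = rng.normal(size=n)          # selection-only covariate (the instrument)
x = rng.normal(size=n)          # outcome covariate
# Correlated errors link participation to the (latent) outcome: the
# non-ignorable selection that biases naive estimates.
u = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=n)
selected = 0.5 + 1.0 * z + 0.8 * x + u[:, 0] > 0
y = 1.0 + 0.5 * x + u[:, 1]     # outcome, observed only when selected

# Step 1: probit of participation on (1, z, x) by maximum likelihood.
W = np.column_stack([np.ones(n), z, x])

def neg_loglik(g):
    p = np.clip(norm.cdf(W @ g), 1e-10, 1 - 1e-10)
    return -(np.log(p[selected]).sum() + np.log(1.0 - p[~selected]).sum())

gamma = minimize(neg_loglik, x0=np.zeros(3), method="BFGS").x

# Step 2: OLS of y on (1, x, inverse Mills ratio) in the selected sample.
wb = (W @ gamma)[selected]
mills = norm.pdf(wb) / norm.cdf(wb)
X2 = np.column_stack([np.ones(mills.size), x[selected], mills])
coef, *_ = np.linalg.lstsq(X2, y[selected], rcond=None)
beta_x = coef[1]   # recovers ~0.5 despite the selective non-participation
```

The paper's simultaneous-equation approach extends this logic with spatial dependence and heterogeneous selection, but the identifying idea is the same: an exclusion-restriction variable (here z) that shifts participation without entering the outcome equation.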

BACKGROUND AND OBJECTIVE: The main difficulty of PCR-based clonality studies for B-cell lymphoproliferative disorders (B-LPD) is discrimination between monoclonal and polyclonal PCR products, especially when there is a high background of polyclonal B cells in the tumor sample. In practice, PCR-based methods for clonality assessment require additional analysis of the PCR products in order to discern between monoclonal and polyclonal samples. Heteroduplex analysis represents an attractive approach, since it is easy to perform and avoids the use of radioactive substrates or expensive equipment. DESIGN AND METHODS: We studied the sensitivity and specificity of heteroduplex PCR analysis for the detection of monoclonality in samples from 90 B-cell non-Hodgkin's lymphoma (B-NHL) patients and in 28 individuals without neoplastic B-cell disorders (negative controls). Furthermore, in 42 B-NHL samples and the same 28 negative controls, we compared heteroduplex analysis with the classical PCR technique. We also compared ethidium bromide (EtBr) versus silver nitrate (AgNO3) staining, as well as agarose versus polyacrylamide gel electrophoresis (PAGE). RESULTS: Using two pairs of consensus primers targeting the VH (FR3 and FR2) and JH regions, 91% of B-NHL samples displayed monoclonal products after heteroduplex PCR analysis using PAGE and AgNO3 staining. Moreover, no polyclonal sample showed a monoclonal PCR product. By contrast, false positive results were obtained when using agarose (5/28) and PAGE without heteroduplex analysis: 2/28 and 8/28 with EtBr and AgNO3 staining, respectively. In addition, false negative results appeared only with EtBr staining: 13/42 in agarose, 4/42 in PAGE without heteroduplex analysis, and 7/42 in PAGE after heteroduplex analysis. INTERPRETATION AND CONCLUSIONS: We conclude that AgNO3-stained PAGE after heteroduplex analysis is the most suitable strategy for detecting monoclonal rearrangements in B-NHL samples, because it does not produce false positive results and the risk of false negative results is very low.
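A small worked example of the sensitivity and specificity behind these conclusions, using the figures reported above for AgNO3-stained PAGE after heteroduplex analysis: 0 false positives among the 28 negative controls, and 91% of the 90 B-NHL samples detected, which we take here as 82 true positives (that exact count is an assumption inferred from the rounded percentage).

```python
def sensitivity(true_pos, false_neg):
    """TP / (TP + FN): fraction of monoclonal (B-NHL) samples detected."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """TN / (TN + FP): fraction of polyclonal controls correctly negative."""
    return true_neg / (true_neg + false_pos)

# 91% of 90 B-NHL samples detected -> assume 82 true positives, 8 missed;
# no monoclonal product in any of the 28 controls -> 0 false positives.
sens = sensitivity(true_pos=82, false_neg=8)    # ~0.91
spec = specificity(true_neg=28, false_pos=0)    # 1.0
```

The asymmetry is the point of the conclusion: perfect specificity (no false positives) with only a modest sensitivity penalty is what makes this the preferred protocol.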