9 results for Experimental Problems
in CentAUR: Central Archive, University of Reading - UK
Abstract:
1. Suction sampling is a popular method for the collection of quantitative data on grassland invertebrate populations, although there have been no detailed studies into the effectiveness of the method. 2. We investigate the effect of effort (duration and number of suction samples) and sward height on the efficiency of suction sampling of grassland beetle, true bug, planthopper and spider populations. We also compare suction sampling with an absolute sampling method based on the destructive removal of turfs. 3. Sampling for a duration of 16 seconds was sufficient to collect 90% of all individuals and species of grassland beetles, with less time required for the true bugs, spiders and planthoppers. The number of samples required to collect 90% of the species was more variable, although in general 55 sub-samples were sufficient for all groups except the true bugs. Increasing sward height had a negative effect on the capture efficiency of suction sampling. 4. The assemblage structure of beetles, planthoppers and spiders was independent of the sampling method (suction or absolute) used. 5. Synthesis and applications. In contrast to other sampling methods used in grassland habitats (e.g. sweep netting or pitfall trapping), suction sampling is an effective quantitative tool for the measurement of invertebrate diversity and assemblage structure, provided that sward height is included as a covariate. Effective sampling of beetles, true bugs, planthoppers and spiders together requires a minimum effort of 110 sub-samples, each of 16 seconds' duration. Such sampling intensities can be adjusted depending on the taxa sampled, and we provide information to minimize sampling problems associated with this versatile technique. Suction sampling should remain an important component in the toolbox of techniques used during both experimental and management sampling regimes within agroecosystems, grasslands and other low-lying vegetation types.
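The effort thresholds reported above (enough sub-samples to capture 90% of the species) are the kind of quantity a sample-based species accumulation curve yields. A minimal sketch, using simulated presence/absence data rather than the study's:

```python
# Hedged sketch: how many sub-samples are needed to capture 90% of the
# species pool, via a sample-based accumulation curve (simulated data).
import numpy as np

rng = np.random.default_rng(0)
n_species, n_subsamples = 60, 150
detect_p = rng.beta(0.5, 5.0, n_species)                    # rare species detected rarely
samples = rng.random((n_subsamples, n_species)) < detect_p  # presence/absence matrix

order = rng.permutation(n_subsamples)          # randomise sub-sample order
seen = np.cumsum(samples[order], axis=0) > 0   # species seen after k sub-samples
accumulation = seen.sum(axis=1)
total = accumulation[-1]
k90 = int(np.argmax(accumulation >= 0.9 * total)) + 1
print(f"{k90} sub-samples capture 90% of the {total} species observed")
```

In practice one would average the curve over many random orderings of the sub-samples before reading off the 90% effort level.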
Abstract:
The assumption that negligible work is involved in the formation of new surfaces in the machining of ductile metals is re-examined in the light of both current Finite Element Method (FEM) simulations of cutting and modern ductile fracture mechanics. The work associated with separation criteria in FEM models is shown to be in the kJ/m² range rather than the few J/m² of the surface energy (surface tension) employed by Shaw in his pioneering study of 1954, after which consideration of surface work was omitted from analyses of metal cutting. The much greater values of specific surface work are not surprising in terms of ductile fracture mechanics, where kJ/m² values of fracture toughness are typical of the ductile metals involved in machining studies. This paper shows that when even the simple Ernst–Merchant analysis is generalised to include significant surface work, many of the experimental observations for which traditional 'plasticity and friction only' analyses seem to have no quantitative explanation are given meaning. In particular, the primary shear plane angle φ becomes material-dependent. The experimental increase of φ up to a saturated level, as the uncut chip thickness is increased, is predicted. The positive intercepts found in plots of cutting force vs. depth of cut, and in plots of force resolved along the primary shear plane vs. area of the shear plane, are shown to be measures of the specific surface work. It is demonstrated that neglect of these intercepts in cutting analyses is the reason why anomalously high values of shear yield stress are derived at the very small uncut chip thicknesses at which the so-called size effect becomes evident. The material toughness/strength ratio, combined with the depth of cut to form a non-dimensional parameter, is shown to control ductile cutting mechanics. The toughness/strength ratio of a given material will change with rate, temperature and thermomechanical treatment, and the influence of such changes, together with changes in depth of cut, on the character of machining is discussed. Strength or hardness alone is insufficient to describe machining. The failure of the Ernst–Merchant theory seems to have less to do with problems of uniqueness and the validity of minimum work, and more to do with the problem not being properly posed. The new analysis compares favourably and consistently with the wide body of experimental results available in the literature. Why considerable progress in the understanding of metal cutting has been achieved without reference to significant surface work is also discussed.
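As a hedged illustration of the kind of generalisation the abstract describes (a sketch, not the paper's actual derivation), the classical Ernst–Merchant force balance can be extended with a specific surface-work term R:

```latex
% Sketch only: Merchant's plasticity/friction term plus a surface-work term.
% F_c: cutting force, k: shear yield stress, w: width of cut,
% t_0: uncut chip thickness (depth of cut), \varphi: shear plane angle,
% \beta: friction angle, \alpha: rake angle, R: specific surface work.
F_c \;=\; \frac{k\,w\,t_0\,\cos(\beta-\alpha)}{\sin\varphi\,\cos(\varphi+\beta-\alpha)} \;+\; R\,w
```

On this reading, a plot of F_c against t_0 has a positive intercept Rw (the intercept the abstract identifies as a measure of the specific surface work), the group Z = R/(k t_0) is the non-dimensional parameter combining the toughness/strength ratio with the depth of cut, and minimising F_c over φ yields a material-dependent shear plane angle rather than the classical material-independent result.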
Abstract:
Chain is a commonly used component in offshore moorings, where its ruggedness and corrosion resistance make it an attractive choice. Another attractive property is that a straight chain is inherently torque-balanced. However, if a chain is loaded in a twisted condition, or twisted while under load, it exhibits highly non-linear torsional behaviour. The consequences of this behaviour can cause handling difficulties or may compromise the integrity of the mooring system, and care must be taken to avoid problems for both the chain and any components to which it is connected. Even with knowledge of the potential problems, there will always be occasions where, despite the utmost care, twist is unavoidable. It is therefore important for the engineer to be able to determine its effects. A frictionless theory was developed in Part 1 of this paper that may be used to predict the resultant torques and the movement or 'lift' in the links as non-dimensional functions of the angle of twist. The present part describes a series of experiments undertaken on both studless and stud-link chain to allow comparison of this theoretical model with experimental data. Results are presented for the torsional response and link lift in 'constant twist' and 'constant load' tests on chains of three different link sizes.
Abstract:
One of the aims of a broad ethnographic study into how the apportionment of risk influences the pricing levels of contractors was to ascertain the significant risks affecting contractors in Ghana, and their impact on prices. To do this, the difference between a contractor's expected and realized return on a project is taken as the key dependent variable, examined using documentary analyses and semi-structured interviews. Most work in this area has focused on identifying and prioritising risks using relative importance indices generated from the analysis of questionnaire survey responses. However, this approach arguably captures perceptions rather than direct measures of project risk. Here, instead, project risk is investigated by examining two measures of the same quantity: one 'before' and one 'after' construction of a project has taken place. Risk events are identified by ascertaining the independent variables causing deviations between expected and actual rates of return. Risk impact is then measured by ascertaining additions to, or reductions of, expected costs due to the occurrence of risk events. So far, data from eight substantially complete building projects indicate that consultants' inefficiency, payment delays, subcontractor-related problems and changes in macroeconomic factors are significant risks affecting contractors in Ghana.
Abstract:
A solution has been found to the long-standing problem of experimental modelling of the interfacial instability in aluminium reduction cells. The idea is to replace the electrolyte overlying the molten aluminium with a mesh of thin rods supplying current directly down into the liquid metal layer. This eliminates electrolysis altogether, and with it all the associated problems: high temperature, chemical aggressiveness of the media, products of electrolysis, the need for electrolyte renewal, high power demands, and so on. The result is a room-temperature, versatile laboratory model which simulates the Sele-type, rolling-pad interfacial instability. Our new, safe laboratory model enables detailed experimental investigations to test the existing theoretical models for the first time.
Abstract:
The combination of the synthetic minority oversampling technique (SMOTE) and the radial basis function (RBF) classifier is proposed to deal with two-class imbalanced classification. In order to enhance the significance of the small and specific region belonging to the positive class in the decision region, SMOTE is applied to generate synthetic instances for the positive class to balance the training data set. Based on the over-sampled training data, the RBF classifier is constructed by applying the orthogonal forward selection procedure, in which the classifier structure and the parameters of the RBF kernels are determined using a particle swarm optimization algorithm based on the criterion of minimizing the leave-one-out misclassification rate. Experimental results on both simulated and real imbalanced data sets are presented to demonstrate the effectiveness of our proposed algorithm.
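A minimal sketch of this pipeline, assuming scikit-learn and the imbalanced-learn package are available; k-means centre placement and a fixed kernel width stand in for the paper's orthogonal forward selection and PSO tuning, so this illustrates the SMOTE-plus-RBF idea rather than the authors' exact algorithm:

```python
# Hedged sketch: SMOTE oversampling followed by an RBF-network classifier.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from imblearn.over_sampling import SMOTE   # imbalanced-learn package

def rbf_design(X, centres, width):
    """Gaussian RBF design matrix with a bias column."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.hstack([np.exp(-d2 / (2.0 * width ** 2)),
                      np.ones((X.shape[0], 1))])

# Imbalanced two-class toy data (about 10% positives).
X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# 1) Balance the training set by synthesising minority-class instances.
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)

# 2) RBF classifier: k-means centres + least-squares output weights
#    (simple stand-ins for the paper's forward selection and PSO).
centres = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X_bal).cluster_centers_
Phi = rbf_design(X_bal, centres, width=1.0)
w, *_ = np.linalg.lstsq(Phi, 2.0 * y_bal - 1.0, rcond=None)  # targets in {-1, +1}

# 3) Classify by the sign of the network output.
y_hat = (rbf_design(X_te, centres, width=1.0) @ w > 0).astype(int)
print("test accuracy:", (y_hat == y_te).mean())
```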
Abstract:
This contribution proposes a powerful technique for two-class imbalanced classification problems by combining the synthetic minority over-sampling technique (SMOTE) with the particle swarm optimisation (PSO) aided radial basis function (RBF) classifier. In order to enhance the significance of the small and specific region belonging to the positive class in the decision region, SMOTE is applied to generate synthetic instances for the positive class to balance the training data set. Based on the over-sampled training data, the RBF classifier is constructed by applying the orthogonal forward selection procedure, in which the classifier's structure and the parameters of the RBF kernels are determined using a PSO algorithm based on the criterion of minimising the leave-one-out misclassification rate. Experimental results obtained on a simulated imbalanced data set and three real imbalanced data sets are presented to demonstrate the effectiveness of our proposed algorithm.
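The PSO-on-leave-one-out step described here could look roughly like the sketch below, which tunes a single RBF kernel width by swarm search scored on the leave-one-out misclassification rate; an off-the-shelf RBF-kernel classifier stands in for the paper's RBF network, and the joint structure selection is not reproduced:

```python
# Hedged sketch: PSO over log10(gamma) of an RBF-kernel classifier,
# scored by leave-one-out misclassification rate.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.svm import SVC   # convenient stand-in for an RBF network

X, y = make_classification(n_samples=80, weights=[0.8, 0.2], random_state=1)

def loo_error(log_gamma):
    """Leave-one-out misclassification rate for one candidate width."""
    clf = SVC(kernel="rbf", gamma=10.0 ** log_gamma)
    return 1.0 - cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()

rng = np.random.default_rng(0)
n_particles, n_iter = 10, 20
pos = rng.uniform(-3, 1, n_particles)      # particles search log10(gamma)
vel = np.zeros(n_particles)
pbest = pos.copy()
pbest_err = np.array([loo_error(p) for p in pos])
gbest = pbest[pbest_err.argmin()]

for _ in range(n_iter):
    r1, r2 = rng.random(n_particles), rng.random(n_particles)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    err = np.array([loo_error(p) for p in pos])
    better = err < pbest_err
    pbest[better], pbest_err[better] = pos[better], err[better]
    gbest = pbest[pbest_err.argmin()]

print(f"best gamma = {10.0 ** gbest:.4g}, LOO error = {pbest_err.min():.3f}")
```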
Abstract:
The sensitivity to horizontal resolution of the climate, anthropogenic climate change, and seasonal predictive skill of the ECMWF model has been studied as part of Project Athena, an international collaboration formed to test the hypothesis that substantial progress in simulating and predicting climate can be achieved if mesoscale and subsynoptic atmospheric phenomena are more realistically represented in climate models. In this study the experiments carried out with the ECMWF model (atmosphere only) are described in detail. The focus here is on the tropics and the Northern Hemisphere extratropics during boreal winter. The resolutions considered in Project Athena for the ECMWF model are T159 (126 km), T511 (39 km), T1279 (16 km), and T2047 (10 km). It was found that increasing horizontal resolution improves tropical precipitation, the tropical atmospheric circulation, the frequency of occurrence of Euro-Atlantic blocking, and the representation of extratropical cyclones in large parts of the Northern Hemisphere extratropics. All of these improvements come from the increase in resolution from T159 to T511, with relatively small changes for further increases to T1279 and T2047, although it should be noted that the results at the highest resolution are from a previously untested model version. Problems in simulating the Madden–Julian oscillation remain unchanged for all resolutions tested. There is some evidence that increasing horizontal resolution to T1279 leads to moderate increases in seasonal forecast skill during boreal winter in the tropics and Northern Hemisphere extratropics. Sensitivity experiments are discussed which help to foster a better understanding of some of the resolution dependence found for the ECMWF model in Project Athena.
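The grid spacings quoted for each truncation follow from a common rule of thumb, dx ≈ πa/N for triangular truncation TN (half the smallest resolved wavelength at the equator); conventions differ between model versions, so this is an approximation rather than ECMWF's own definition:

```python
# Hedged sketch: approximate grid spacing from spectral truncation T<N>.
import math

EARTH_RADIUS_KM = 6371.0

def grid_spacing_km(truncation: int) -> float:
    """Half the smallest resolved wavelength at the equator, pi*a/N."""
    return math.pi * EARTH_RADIUS_KM / truncation

for n in (159, 511, 1279, 2047):
    print(f"T{n}: ~{grid_spacing_km(n):.0f} km")
# Prints ~126, ~39, ~16 and ~10 km, matching the figures quoted above.
```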
Abstract:
Global change drivers are known to interact in their effects on biodiversity, but much research to date ignores this complexity. As a consequence, there are problems in attributing biodiversity change to different drivers and, therefore, in our ability to manage habitats and landscapes appropriately. Few studies explicitly acknowledge and account for interactive (i.e., nonadditive) effects of land use and climate change on biodiversity. One reason is that the mechanisms by which drivers interact are poorly understood. We evaluate such mechanisms, including interactions between demographic parameters, evolutionary trade-offs and synergies, and threshold effects of population size and patch occupancy on population persistence. Other reasons for the lack of appropriate research are limited data availability and analytical issues in addressing interaction effects. We highlight the influence that attribution errors can have on biodiversity projections and discuss experimental designs and analytical tools suited to this challenge. Finally, we summarize the risks and opportunities presented by the existence of interaction effects. Risks include ineffective conservation management; opportunities arise where the negative impacts of climate change on biodiversity can be reduced through appropriate land management as an adaptation measure. We hope that increasing the understanding of key mechanisms underlying interaction effects, and discussing appropriate experimental and analytical designs for attribution, will help researchers, policy makers, and conservation practitioners to better minimize the risks and exploit the opportunities provided by land use and climate change interactions.
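In a regression setting, the interactive (nonadditive) effects the abstract argues are usually ignored correspond to an interaction term in the model. A minimal sketch with simulated data and illustrative variable names (not the study's):

```python
# Hedged sketch: testing for a land-use x climate interaction effect on a
# biodiversity response with a linear model; data and names are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "landuse": rng.choice(["intensive", "extensive"], n),
    "warming": rng.uniform(0.0, 3.0, n),    # degrees of warming
})
# Simulate a response in which warming hurts more under intensive land use.
slope = np.where(df["landuse"] == "intensive", -4.0, -1.0)
df["richness"] = 30 + slope * df["warming"] + rng.normal(0, 2, n)

# 'landuse * warming' expands to both main effects plus their interaction;
# a significant interaction coefficient is the nonadditive effect.
additive = smf.ols("richness ~ landuse + warming", data=df).fit()
interactive = smf.ols("richness ~ landuse * warming", data=df).fit()
print(interactive.summary().tables[1])
print("AIC, additive vs interactive:", additive.aic, interactive.aic)
```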