853 results for fracture thresholds
Abstract:
A review is given of the mechanics of cutting, ranging from the slicing of thin floppy offcuts (where there is negligible elasticity and no permanent deformation of the offcut) to the machining of ductile metals (where there is severe permanent distortion of the offcut/chip). Materials scientists employ the former conditions to determine the fracture toughness of ‘soft’ solids such as biological materials and foodstuffs. In contrast, traditional analyses of metalcutting are based on plasticity and friction only, and do not incorporate toughness. The machining theories are inadequate in a number of ways, but a recent paper has shown that when the ductile work of fracture is included, many, if not all, of the shortcomings are removed. Support for the new analysis is given by examination of FEM simulations of metalcutting, which reveal that a ‘separation criterion’ has to be employed at the tool tip. Closer consideration shows that the separation criteria are versions of the void-initiation-growth-and-coalescence models employed in ductile fracture mechanics. The new analysis shows that cutting forces for ductile materials depend upon the fracture toughness as well as plasticity and friction, and reveals a simple way of determining both toughness and flow stress from cutting experiments. Examples are given for a wide range of materials including metals, polymers and wood, and comparison is made with the same properties independently determined using conventional testpieces. Because cutting can be steady state, a new way is presented for simultaneously measuring toughness and flow stress at controlled speeds and strain rates.
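The toughness-and-flow-stress extraction mentioned above amounts to a linear fit of cutting force against uncut chip thickness. The sketch below assumes a simplified force balance F/w = m·t + G, in which the intercept plays the role of the fracture toughness G and the slope carries the flow-stress and friction terms; the symbols, factor groupings, and data are illustrative assumptions, not the paper's full analysis.

```python
import numpy as np

def fit_cutting_data(t, F_per_width):
    """Fit F/w = slope * t + intercept to cutting data.

    Under the simplified model assumed here, the intercept is read as the
    fracture toughness G (J/m^2) and the slope is proportional to the flow
    stress (through a shear-strain/friction factor not resolved here).
    """
    slope, intercept = np.polyfit(t, F_per_width, 1)
    return slope, intercept

# synthetic data: uncut chip thickness (m) vs cutting force per width (N/m)
t = np.array([50e-6, 100e-6, 150e-6, 200e-6])
F_w = 2.0e9 * t + 500.0          # made-up slope and toughness
slope, G = fit_cutting_data(t, F_w)
```

With real cutting data, the quality of the linear fit is itself a check on whether the steady-state cutting assumption holds over the chosen range of chip thicknesses.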
Abstract:
The complete fracture behaviour of a ductile double edge notched tension (DENT) specimen is analysed with an approximate model, which is then used to discuss the essential work of fracture (EWF) concept. The model results are compared with experimental results for the aluminium alloy 6082-O. The restrictions on the ligament size for valid application of the EWF method are discussed with the aid of the model. The model is also used to suggest an improved method of obtaining the cohesive stress-displacement relationship for the fracture process zone (FPZ).
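The ligament-size restriction discussed above is what makes the standard EWF extrapolation valid: the specific work of fracture is taken as w_f = w_e + β·w_p·ℓ, linear in the ligament length ℓ, and the intercept w_e is the essential work. A minimal sketch of that extrapolation on synthetic data (the numbers are illustrative, not from the paper):

```python
import numpy as np

# EWF extrapolation: w_f = w_e + beta*w_p*L, linear in ligament length L.
# The intercept of the linear fit is the essential work of fracture w_e.
L = np.array([5e-3, 10e-3, 15e-3, 20e-3])   # ligament lengths (m)
w_f = 20e3 + 1.5e6 * L                       # synthetic specific work (J/m^2)
slope, w_e = np.polyfit(L, w_f, 1)
```

In practice the fit is only meaningful over the ligament range in which the whole ligament yields before crack growth, which is exactly the validity window the model above is used to delimit.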
Abstract:
The received wisdom about thin sheet fracture is that (i) the crack propagates under mixed mode I & III, giving rise to a slant through-thickness fracture profile, and (ii) the fracture toughness remains constant at low thickness and eventually decreases with increasing thickness. In the present study, fracture tests performed on thin DENT plates of various thicknesses made of stainless steel, mild steel, 6082-O and NS4 aluminium alloys, brass, bronze, lead, and zinc systematically exhibit (i) mode I “bath-tub”, i.e. “cup & cup”, fracture profiles with limited shear lips and significant localized necking (more than 50% thickness reduction), and (ii) a fracture toughness that increases linearly with increasing thickness (in the range 0.5–5 mm). The different contributions to the work expended during fracture of these materials are separated on dimensional grounds. The paper emphasises the two parts of the work spent in the fracture process zone: the necking work and the “fracture” work. Experiments show that, as expected, the work of necking per unit area increases linearly with thickness. For a typical thickness of 1 mm, the fracture and necking contributions have the same order of magnitude in most of the metals investigated. A model is developed to evaluate the work of necking independently; it successfully predicts the experimental values and enables the fracture energy to be derived from tests performed at a single specimen thickness. In a second modelling step, the work of fracture is computed using an enhanced void growth model valid in the quasi-plane-stress regime. The fracture energy varies linearly with the yield stress and void spacing and is a strong function of the hardening exponent and initial void volume fraction. Coupling the two models allows the relative contributions of necking and fracture to be quantified with respect to (i) the two length scales involved, i.e. the void spacing and the plate thickness, and (ii) the flow properties of the material. Either term can dominate, depending on the material properties, which explains the different behaviours reported in the literature concerning thin plate fracture toughness and its dependence on thickness.
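The separation of necking and fracture contributions described above can be written as a linear thickness dependence, Γ(t) = Γ_fracture + w_neck·t. A small sketch; the numbers are invented, chosen only to reproduce the abstract's observation that the two terms are comparable near t = 1 mm:

```python
def toughness_partition(gamma_fracture, necking_coeff, thickness):
    """Split the total work per unit area into its two parts, assuming the
    linear thickness dependence reported above:
        Gamma(t) = gamma_fracture + necking_coeff * thickness
    Returns (total work/area, fraction of it due to necking)."""
    necking = necking_coeff * thickness
    total = gamma_fracture + necking
    return total, necking / total

# illustrative values: 100 kJ/m^2 fracture term, 100 MJ/m^3 necking coefficient
total, necking_share = toughness_partition(100e3, 100e6, 1e-3)
```

With these invented coefficients the necking share is 50% at 1 mm and grows toward unity in thicker plates, which is the qualitative trend the two-model coupling quantifies.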
Abstract:
Investigation of the fracture mode of hard and soft wheat endosperm was aimed at gaining a better understanding of the fragmentation process. Fracture mechanical characterization was based on the three-point bending test, which enables stable crack propagation in small rectangular pieces of wheat endosperm. The crack length can be measured in situ using an optical microscope with illumination from the side or the back of the specimen. Two new techniques were developed and used to estimate the fracture toughness of wheat endosperm: a geometric approach and a compliance method. The geometric approach gave average fracture toughness values of 53.10 and 27.0 J m⁻² for hard and soft endosperm, respectively; the compliance method gave 49.9 and 29.7 J m⁻², respectively. Compressive properties of the endosperm along three mutually perpendicular axes revealed that both hard and soft endosperms are isotropic composites. Scanning electron microscopy (SEM) observation of the fracture surfaces and the energy-time curves of loading-unloading cycles revealed plastic flow during crack propagation for both hard and soft endosperms, and confirmed that the fracture mode is strongly related to the level of adhesion between the starch granules and the protein matrix.
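The compliance method referred to above is conventionally the Irwin-Kies relation G = (P_c²/2b)·dC/da, with dC/da obtained by differentiating a fit to the measured compliance-versus-crack-length record. A minimal sketch, with an invented linear compliance record standing in for measured data:

```python
import numpy as np

def toughness_compliance(a, C, b, P_c):
    """Irwin-Kies compliance method: G = P_c**2 / (2*b) * dC/da,
    with dC/da taken from a quadratic fit of measured compliance C(a).
    a: crack lengths (m), C: compliance (m/N), b: width (m), P_c: load (N)."""
    coeffs = np.polyfit(a, C, 2)
    dCda = np.polyval(np.polyder(coeffs), a)
    return P_c**2 / (2.0 * b) * dCda

# invented record: compliance grows linearly with crack length here,
# so G comes out uniform along the crack (illustrative numbers only)
a = np.array([2e-3, 3e-3, 4e-3, 5e-3])
C = 1e-3 * a
G = toughness_compliance(a, C, b=5e-3, P_c=10.0)
```

With real data the fit order and the choice of critical load P_c are the main judgment calls; the geometric approach in the abstract provides an independent cross-check.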
Abstract:
A series of three-point bend tests using single edge notched testpieces of pure polycrystalline ice was performed at three different temperatures (−20°C, −30°C and −40°C). The displacement rate was varied from 1 mm/min to 100 mm/min, producing crack tip strain rates from about 10⁻³ to 10⁻¹ s⁻¹. The results show that (a) the fracture toughness of pure polycrystalline ice given by the critical stress intensity factor (K_IC) is much lower than that measured from the J-integral under identical conditions; (b) as determined from K_IC, the fracture toughness decreases with increasing strain rate, with a good power-law relationship between them; (c) as measured from the J-integral, a different tendency appeared: when the crack tip strain rate exceeds a critical value of 6 × 10⁻³ s⁻¹ the fracture toughness is almost constant, but below this value the fracture toughness increases with decreasing crack tip strain rate. Re-examination of the mechanisms of rate-dependent fracture toughness of pure polycrystalline ice shows that the effect of strain rate is related not only to the blunting of crack tips by plasticity, creep and stress relaxation, but also to the nucleation and growth of microcracks in the specimen.
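For reference, K_IC in a single edge notched bend (SENB) test like the one above is obtained from the critical load through a standard geometry function; the sketch below uses the ASTM E399-type polynomial for three-point bending (the specimen numbers are placeholders, not the ice data):

```python
import math

def k1_senb(P, S, B, W, a):
    """Mode I stress intensity for a single edge notched bend specimen in
    three-point bending, using the ASTM E399-type geometry function f(a/W).
    P: load, S: span, B: thickness, W: depth, a: crack length."""
    x = a / W
    f = (3.0 * math.sqrt(x)
         * (1.99 - x * (1 - x) * (2.15 - 3.93 * x + 2.7 * x * x))
         / (2.0 * (1 + 2 * x) * (1 - x) ** 1.5))
    return P * S / (B * W ** 1.5) * f

# normalised check: with P*S/(B*W**1.5) = 1 and a/W = 0.5, f alone is returned
k = k1_senb(1.0, 1.0, 1.0, 1.0, 0.5)
```

The abstract's point is precisely that this small-scale-yielding K_IC understates the toughness of ice when blunting, creep and microcracking are active, which is why the J-integral values differ.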
Abstract:
Records of Atlantic basin tropical cyclones (TCs) since the late nineteenth century indicate a very large upward trend in storm frequency. This increase in documented TCs has been previously interpreted as resulting from anthropogenic climate change. However, improvements in observing and recording practices provide an alternative interpretation for these changes: recent studies suggest that the number of potentially missed TCs is sufficient to explain a large part of the recorded increase in TC counts. This study explores the influence of another factor—TC duration—on observed changes in TC frequency, using a widely used Atlantic hurricane database (HURDAT). It is found that the occurrence of short-lived storms (duration of 2 days or less) in the database has increased dramatically, from less than one per year in the late nineteenth–early twentieth century to about five per year since about 2000, while medium- to long-lived storms have increased little, if at all. Thus, the previously documented increase in total TC frequency since the late nineteenth century in the database is primarily due to an increase in very short-lived TCs. The authors also undertake a sampling study based upon the distribution of ship observations, which provides quantitative estimates of the frequency of missed TCs, focusing just on the moderate to long-lived systems with durations exceeding 2 days in the raw HURDAT. Upon adding the estimated numbers of missed TCs, the time series of moderate to long-lived Atlantic TCs show substantial multidecadal variability, but neither time series exhibits a significant trend since the late nineteenth century, with a nominal decrease in the adjusted time series. Thus, to understand the source of the century-scale increase in Atlantic TC counts in HURDAT, one must explain the relatively monotonic increase in very short-duration storms since the late nineteenth century. 
While it is possible that the recorded increase in short-duration TCs represents a real climate signal, the authors consider that it is more plausible that the increase arises primarily from improvements in the quantity and quality of observations, along with enhanced interpretation techniques. These have allowed National Hurricane Center forecasters to better monitor and detect initial TC formation, and thus incorporate increasing numbers of very short-lived systems into the TC database.
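The 2-day duration cutoff used above is straightforward to apply to any track database once storm lifetimes are known. A toy sketch with hypothetical storm records (real HURDAT parsing, with its 6-hourly fixes, is not shown):

```python
from datetime import datetime, timedelta

# Hypothetical (name, first fix, last fix) records; the 2-day cutoff
# mirrors the short-lived vs medium/long-lived split in the abstract.
storms = [
    ("A", datetime(1900, 8, 1, 0), datetime(1900, 8, 2, 6)),
    ("B", datetime(1900, 9, 1, 0), datetime(1900, 9, 7, 0)),
    ("C", datetime(2000, 8, 1, 0), datetime(2000, 8, 1, 18)),
]
cutoff = timedelta(days=2)
short = [name for name, start, end in storms if end - start <= cutoff]
long_lived = [name for name, start, end in storms if end - start > cutoff]
```

Counting the two classes separately per year is what isolates the monotonic rise in short-lived systems from the roughly trendless medium-to-long-lived series.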
Abstract:
The redistribution of a finite amount of martian surface dust during global dust storms and in the intervening periods has been modelled in a dust lifting version of the UK Mars General Circulation Model. When using a constant, uniform threshold in the model’s wind stress lifting parameterisation and assuming an unlimited supply of surface dust, multiannual simulations displayed some variability in dust lifting activity from year to year, arising from internal variability manifested in surface wind stress, but dust storms were limited in size and formed within a relatively short seasonal window. Lifting thresholds were then allowed to vary at each model gridpoint, dependent on the rates of emission or deposition of dust. This enhanced interannual variability in dust storm magnitude and timing, such that model storms covered most of the observed ranges in size and initiation date within a single multiannual simulation. Peak storm magnitude in a given year was primarily determined by the availability of surface dust at a number of key sites in the southern hemisphere. The observed global dust storm (GDS) frequency of roughly one in every 3 years was approximately reproduced, but the model failed to generate these GDSs spontaneously in the southern hemisphere, where they have typically been observed to initiate. After several years of simulation, the surface threshold field—a proxy for net change in surface dust density—showed good qualitative agreement with the observed pattern of martian surface dust cover. The model produced a net northward cross-equatorial dust mass flux, which necessitated the addition of an artificial threshold decrease rate in order to allow the continued generation of dust storms over the course of a multiannual simulation. 
At standard model resolution, for the southward mass flux due to cross-equatorial flushing storms to offset the northward flux due to GDSs on a timescale of ∼3 years, the former would need to increase by a factor of 3–4. Results at higher model resolution and uncertainties in dust vertical profiles mean that quasi-periodic redistribution of dust on such a timescale nevertheless appears to be a plausible explanation for the observed GDS frequency.
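The gridpoint-varying threshold can be caricatured as a simple update rule in which emission depletes surface dust (raising the lifting threshold) and deposition replenishes it (lowering the threshold), with the artificial decrease rate mentioned above acting as a constant relaxation term. The sign conventions and coefficients here are assumptions for illustration, not the UK Mars GCM parameterisation:

```python
def update_threshold(tau, emission, deposition, k_e, k_d, relax):
    """One illustrative per-gridpoint update of the wind stress lifting
    threshold tau: emission makes lifting harder, deposition easier, and
    `relax` is a constant artificial decrease applied every step."""
    return tau + k_e * emission - k_d * deposition - relax
```

Because the threshold field then tracks the net surface budget, it doubles as the proxy for surface dust density that the abstract compares with observed dust cover.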
Abstract:
An updated empirical approach is proposed for specifying coexistence requirements for genetically modified (GM) maize (Zea mays L.) production to ensure compliance with the 0.9% labeling threshold for food and feed in the European Union. The model improves on a previously published empirical model (Gustafson et al., 2006) by adding recent data sources to supplement the original database and by including the following additional cases: (i) more than one GM maize source field adjacent to the conventional or organic field, (ii) the possibility of so-called “stacked” varieties with more than one GM trait, and (iii) lower pollen shed in the non-GM receptor field. These additional factors allow somewhat wider combinations of isolation distance and border rows than required in the original version of the empirical model. For instance, in the very conservative case of a 1-ha square non-GM maize field surrounded on all four sides by homozygous GM maize with 12 m isolation (the effective isolation distance for a single GM field), non-GM border rows of 12 m are required to be 95% confident of gene flow less than 0.9% in the non-GM field (with adventitious presence of 0.3%). Stacked traits of higher GM mass fraction and receptor fields of lower pollen shed would require a greater number of border rows to comply with the 0.9% threshold, and an updated extension to the model is provided to quantify these effects.
Abstract:
Detection of a tactile stimulus on one finger is impaired when a concurrent stimulus (masker) is presented on another finger of the same or the opposite hand. This phenomenon is known to be finger-specific at the within-hand level; whether this specificity is also maintained at the between-hand level was not known. In four experiments, we addressed this issue by combining a Bayesian adaptive staircase procedure (QUEST) with a two-interval forced choice (2IFC) design to establish thresholds for detecting 200 ms, 100 Hz sinusoidal vibrations applied to the index or little fingertip of either hand (targets). We systematically varied the masker finger (index, middle, ring, or little finger of either hand) while controlling the spatial location of the target and masker stimuli. Detection thresholds varied consistently as a function of the masker finger when the latter was on the same hand (Experiments 1 and 2), but not when it was on the other hand (Experiments 3 and 4). Within the hand, detection thresholds increased for masker fingers closest to the target finger (e.g., middle > ring when the target was the index). Between the hands, detection thresholds were elevated whenever a masker was present on any finger, compared with the target presented in isolation. The within-hand effect of masker finger is consistent with the segregation of different fingers at the early stages of somatosensory processing, from the periphery to the primary somatosensory cortex (SI). We propose that detection is finger-specific and reflects the organisation of somatosensory receptive fields in SI within, but not between, the hands.
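As a rough illustration of adaptive threshold tracking in a detection task, the sketch below runs a fixed-step 1-up/1-down staircase against a simulated observer. This is a deliberately simplified stand-in for the Bayesian QUEST procedure actually used, and all parameters are invented:

```python
def staircase_2ifc(true_threshold, start, step, n_trials):
    """Minimal 1-up/1-down staircase for a detection task: the stimulus
    level steps down after each detection and up after each miss.
    The simulated observer detects whenever level > true_threshold."""
    level = start
    for _ in range(n_trials):
        detected = level > true_threshold
        level += -step if detected else step
    return level
```

A 1-up/1-down rule merely oscillates around the threshold; QUEST instead maintains a posterior distribution over threshold and places each trial at its current best estimate, which is why it converges in far fewer trials.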
Abstract:
The application of the Water Framework Directive (WFD) in the European Union (EU) targets threshold levels for the concentration of various nutrients, nitrogen and phosphorus being the most important. In the EU, agri-environmental measures constitute a significant component of Pillar 2 (Rural Development Policies) in both financial and regulatory terms. Environmental measures are also linked to Pillar 1 payments through cross-compliance and the greening proposals. This paper, drawing on work carried out in the REFRESH FP7 project, aims to show how an INtegrated CAtchment (INCA) model of plant/soil system dynamics and in-stream biogeochemical and hydrological dynamics can be used to assess the cost-effectiveness of agri-environmental measures against the nutrient concentration targets set by the WFD, especially in the presence of important habitats. We present the procedures (methodological steps, challenges and problems) for assessing the cost-effectiveness of agri-environmental measures under the baseline situation and under climate and land use change scenarios. Furthermore, we present results of an application of this methodology to the Louros watershed in Greece and discuss likely uses and future extensions of the modelling approach. Finally, we attempt to show the importance of this methodology for designing and incorporating alternative environmental practices into Pillar 1 and 2 measures.