979 results for "Cutting temperature modeling"


Relevance: 30.00%

Abstract:

The assumption that negligible work is involved in the formation of new surfaces in the machining of ductile metals is re-examined in the light of both current Finite Element Method (FEM) simulations of cutting and modern ductile fracture mechanics. The work associated with separation criteria in FEM models is shown to be in the kJ/m² range rather than the few J/m² of the surface energy (surface tension) employed by Shaw in his pioneering study of 1954, after which consideration of surface work was omitted from analyses of metal cutting. The much greater values of specific surface work are not surprising in terms of ductile fracture mechanics, where kJ/m² values of fracture toughness are typical of the ductile metals involved in machining studies. This paper shows that when even the simple Ernst–Merchant analysis is generalised to include significant surface work, many of the experimental observations for which traditional ‘plasticity and friction only’ analyses seem to have no quantitative explanation are given meaning. In particular, the primary shear plane angle φ becomes material-dependent. The experimental increase of φ up to a saturated level as the uncut chip thickness is increased is predicted. The positive intercepts found in plots of cutting force vs. depth of cut, and in plots of force resolved along the primary shear plane vs. area of the shear plane, are shown to be measures of the specific surface work. It is demonstrated that neglect of these intercepts in cutting analyses is the reason why anomalously high values of shear yield stress are derived at the very small uncut chip thicknesses at which the so-called size effect becomes evident. The material toughness/strength ratio, combined with the depth of cut to form a non-dimensional parameter, is shown to control ductile cutting mechanics. The toughness/strength ratio of a given material will change with rate, temperature, and thermomechanical treatment, and the influence of such changes, together with changes in depth of cut, on the character of machining is discussed. Strength or hardness alone is insufficient to describe machining. The failure of the Ernst–Merchant theory seems to have less to do with problems of uniqueness and the validity of minimum work, and more to do with the problem not being properly posed. The new analysis compares favourably and consistently with the wide body of experimental results available in the literature. Why considerable progress in the understanding of metal cutting has been achieved without reference to significant surface work is also discussed.
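A minimal numerical sketch of the two quantities at the heart of this abstract: the classical Ernst–Merchant shear-angle solution (the baseline the paper generalises) and the non-dimensional toughness/strength/depth parameter. All material values are illustrative, not taken from the paper:

```python
import math

def merchant_shear_angle(friction_angle_deg, rake_angle_deg):
    """Classical Ernst-Merchant solution (no surface work):
    phi = 45 deg - (beta - alpha) / 2, from minimising cutting work."""
    return 45.0 - (friction_angle_deg - rake_angle_deg) / 2.0

def toughness_strength_parameter(R_kJ_per_m2, k_MPa, h_mm):
    """Non-dimensional parameter Z = R / (k * h) that the abstract says
    controls ductile cutting mechanics (R: specific surface work /
    fracture toughness, k: shear yield stress, h: uncut chip thickness)."""
    R = R_kJ_per_m2 * 1e3   # J/m^2
    k = k_MPa * 1e6         # Pa
    h = h_mm * 1e-3         # m
    return R / (k * h)

phi = merchant_shear_angle(30.0, 10.0)                # 35.0 degrees
Z = toughness_strength_parameter(20.0, 400.0, 0.1)    # mild-steel-like values
```

Note how Z grows as the uncut chip thickness h shrinks, which is consistent with the abstract's point that surface work dominates at the very small depths of cut where the size effect appears.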

Relevance: 30.00%

Abstract:

Optical density measurements were used to estimate the effect of heat treatments on the single-cell lag times of Listeria innocua fitted to a shifted gamma distribution. The single-cell lag time was subdivided into a repair time (the shift of the distribution, assumed to be uniform for all cells) and an adjustment time (varying randomly from cell to cell). After heat treatments in which all of the cells recovered (sublethal), the repair time and the mean and variance of the single-cell adjustment time increased with the severity of the treatment. When the heat treatments resulted in a loss of viability (lethal), the repair time of the survivors increased with the decimal reduction of the cell numbers independently of the temperature, while the mean and variance of the single-cell adjustment times remained the same irrespective of the heat treatment. Based on these observations and on modeling of the effect of the time and temperature of the heat treatment, we propose that the severity of a heat treatment can be characterized by the repair time of the cells whether the heat treatment is lethal or not, an extension of the F-value concept to sublethal heat treatments. In addition, the repair time could be interpreted as the extent or degree of injury in a multiple-hit lethality model. Another implication of these results is that the distribution of the time for cells to reach unacceptable numbers in food is not affected by the time-temperature combination resulting in a given decimal reduction.
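The shifted gamma model described above can be sketched in a few lines: every cell shares a fixed repair time (the distribution shift), and the cell-to-cell adjustment time is gamma-distributed. The repair time, shape, and scale values below are hypothetical placeholders, not the paper's fitted parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_single_cell_lags(n, repair_time, shape, scale):
    """Single-cell lag = fixed repair time (shift, common to all cells)
    + gamma-distributed adjustment time (cell-to-cell variation)."""
    adjustment = rng.gamma(shape, scale, size=n)
    return repair_time + adjustment

# Hypothetical values: 2 h repair, gamma(shape=4, scale=0.5) adjustment,
# so the mean lag should be near 2.0 + 4 * 0.5 = 4.0 h.
lags = sample_single_cell_lags(10_000, repair_time=2.0, shape=4.0, scale=0.5)
```

In this parameterisation a more severe sublethal treatment would raise `repair_time` as well as the mean and variance of the gamma part, matching the abstract's observations.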

Relevance: 30.00%

Abstract:

A new primary model based on a thermodynamically consistent first-order kinetic approach was constructed to describe non-log-linear inactivation kinetics of pressure-treated bacteria. The model assumes a first-order process in which the specific inactivation rate changes inversely with the square root of time. The model gave reasonable fits to experimental data over six to seven orders of magnitude. It was also tested on 138 published data sets and provided good fits in about 70% of cases, in which the shape of the curve followed the typical convex-upward form. In the remaining published examples, curves contained additional shoulder regions or extended tail regions. Curves with shoulders could be accommodated by including an additional time-delay parameter, and curves with tails could be accommodated by omitting points in the tail beyond the point at which survival levels remained more or less constant. The model parameters varied regularly with pressure, which may reflect a genuine mechanistic basis for the model. This property also allowed the calculation of (a) parameters analogous to the decimal reduction time D and to the z value (the temperature increase needed to change the D value by a factor of 10) in thermal processing, and hence the processing conditions needed to attain a desired level of inactivation; and (b) the apparent thermodynamic volumes of activation associated with the lethal events. The hypothesis that inactivation rates change as a function of the square root of time would be consistent with a diffusion-limited process.
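A sketch of the primary model as described: a first-order process whose specific rate falls as 1/sqrt(t) integrates to log-counts that decline linearly in sqrt(time), and an optional delay parameter reproduces the shoulder. The parameter names and values here are illustrative; in practice they would come from fitting survival data:

```python
import numpy as np

def log10_survivors(t, log10_N0, b, t_delay=0.0):
    """Primary model sketch: specific inactivation rate ~ 1/sqrt(t),
    which integrates to log10 N(t) = log10 N0 - b * sqrt(t).
    t_delay is the optional shoulder (time-delay) parameter."""
    t_eff = np.maximum(np.asarray(t, dtype=float) - t_delay, 0.0)
    return log10_N0 - b * np.sqrt(t_eff)

# Starting from 10^8 CFU/mL with b = 2.0 (hypothetical), 4 time units of
# treatment give log10 N = 8 - 2 * sqrt(4) = 4, i.e. a 4-decimal reduction.
curve = log10_survivors(np.linspace(0.0, 9.0, 10), log10_N0=8.0, b=2.0)
```

The convex-upward shape the abstract mentions falls out directly: equal increments of sqrt(t), not of t, remove equal numbers of decades.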

Relevance: 30.00%

Abstract:

Quantitative control of aroma generation during the Maillard reaction is of great scientific and industrial interest. Although many studies have been conducted in simplified model systems, the results are difficult to apply to complex food systems, where the presence of other components can have a significant impact. In this work, an aqueous extract of defatted beef liver was chosen as a simplified food matrix for studying the kinetics of the Maillard reaction. Aliquots of the extract were heated under different time and temperature conditions and analyzed for sugars, amino acids, and methylbutanals, which are important Maillard-derived aroma compounds formed in cooked meat. Multiresponse kinetic modeling, based on a simplified mechanistic pathway, gave a good fit with the experimental data, but only when additional steps were introduced to take into account the interactions of glucose and glucose-derived intermediates with protein and other amino compounds. This emphasizes the significant role of the food matrix in controlling the Maillard reaction.
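The multiresponse idea can be illustrated with a toy reaction scheme integrated by forward Euler; the pathway and rate constants below are hypothetical stand-ins for the paper's fitted mechanism, included only to show how a matrix-interaction step enters the balance equations:

```python
def simulate(k1, k2, k3, G0, A0, P0, t_end=100.0, dt=0.01):
    """Toy multiresponse scheme (illustrative, not the paper's model):
      glucose + amino acid -> intermediate          (rate k1 * G * A)
      intermediate         -> methylbutanal         (rate k2 * I)
      intermediate + protein -> bound products      (rate k3 * I * P,
                                                     the matrix step)
    Integrated with forward Euler; all rate constants are hypothetical."""
    G, A, I, M, P = G0, A0, 0.0, 0.0, P0
    for _ in range(int(t_end / dt)):
        r1 = k1 * G * A
        r2 = k2 * I
        r3 = k3 * I * P
        G += -r1 * dt
        A += -r1 * dt
        I += (r1 - r2 - r3) * dt
        M += r2 * dt
        P += -r3 * dt
    return G, A, I, M, P

# With the matrix step switched off (k3 = 0), glucose carbon is conserved
# between the remaining pools: G0 - G = I + M.
G, A, I, M, P = simulate(0.01, 0.05, 0.0, 1.0, 1.0, 0.5, t_end=50.0)
```

Fitting all measured responses (sugars, amino acids, and methylbutanals) simultaneously against such a scheme is what distinguishes multiresponse modeling from fitting each curve in isolation.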

Relevance: 30.00%

Abstract:

Strong vertical gradients at the top of the atmospheric boundary layer affect the propagation of electromagnetic waves and can produce radar ducts. A three-dimensional, time-dependent, nonhydrostatic numerical model was used to simulate the propagation environment in the atmosphere over the Persian Gulf when aircraft observations of ducting had been made. A division of the observations into high- and low-wind cases was used as a framework for the simulations. Three sets of simulations were conducted with initial conditions of varying degrees of idealization and were compared with the observations taken in the Ship Antisubmarine Warfare Readiness/Effectiveness Measuring (SHAREM-115) program. The best results occurred with the initialization based on a sounding taken over the coast, modified by the inclusion of data on low-level atmospheric conditions over the Gulf waters. The development of moist, cool, stable marine internal boundary layers (MIBLs) in air flowing from land over the waters of the Gulf was simulated. The MIBLs were capped by temperature inversions and associated lapses of humidity and refractivity. The low-wind MIBL was shallower, and the gradients at its top were sharper, than in the high-wind case, in agreement with the observations. Because it is also forced by land–sea contrasts, a sea-breeze circulation frequently occurs in association with the MIBL. The size, location, and internal structure of the sea-breeze circulation were realistically simulated. The gradients of temperature and humidity that bound the MIBL cause perturbations in the refractivity distribution that, in turn, lead to trapping layers and ducts. The existence, location, and surface character of the ducts were well captured. Horizontal variations in duct characteristics due to the sea-breeze circulation were also evident. The simulations successfully distinguished between high- and low-wind occasions, a notable feature of the SHAREM-115 observations. The modeled magnitudes of duct depth and strength, although leaving scope for improvement, were most encouraging.
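The link from temperature and humidity gradients to ducting can be sketched with the standard radio refractivity expression N = (77.6/T)(p + 4810 e/T) and the modified refractivity M = N + 0.157 z; a trapping layer exists wherever M decreases with height. The profile values below are illustrative, not SHAREM-115 data:

```python
import numpy as np

def modified_refractivity(p_hPa, T_K, e_hPa, z_m):
    """Radio refractivity plus the Earth-curvature term:
    N = (77.6 / T) * (p + 4810 * e / T),  M = N + 0.157 * z."""
    N = 77.6 / T_K * (p_hPa + 4810.0 * e_hPa / T_K)
    return N + 0.157 * z_m

def has_duct(M, z):
    """A trapping layer (duct) exists wherever dM/dz < 0."""
    return bool(np.any(np.diff(M) / np.diff(z) < 0.0))

# Illustrative MIBL-like profile: a sharp humidity lapse between 50 and
# 100 m produces a layer where M decreases with height.
z = np.array([0.0, 50.0, 100.0, 150.0])
p = np.array([1000.0, 994.0, 988.0, 982.0])
e_moist = np.array([25.0, 25.0, 5.0, 5.0])
M_moist = modified_refractivity(p, 300.0, e_moist, z)
```

A dry, well-mixed profile run through the same functions shows M increasing monotonically, i.e. no duct, which is the contrast the simulations exploit.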

Relevance: 30.00%

Abstract:

We review the scientific literature since the 1960s to examine the evolution of modeling tools and observations that have advanced understanding of global stratospheric temperature changes. Observations show overall cooling of the stratosphere during the period for which they are available (since the late 1950s and late 1970s from radiosondes and satellites, respectively), interrupted by episodes of warming associated with volcanic eruptions and superimposed on variations associated with the solar cycle. There has been little global mean temperature change since about 1995. The temporal and vertical structure of these variations is reasonably well explained by models that include changes in greenhouse gases, ozone, volcanic aerosols, and solar output, although there are significant uncertainties in the temperature observations and regarding the nature and influence of past changes in stratospheric water vapor. As a companion to a recent WIREs review of tropospheric temperature trends, this article identifies areas of commonality and contrast between the tropospheric and stratospheric trend literature. For example, the increased attention over time to radiosonde and satellite data quality has contributed to better characterization of uncertainty in observed trends both in the troposphere and in the lower stratosphere, and has highlighted the relative deficiency of attention to observations in the middle and upper stratosphere. In contrast to the relatively unchanging expectations of surface and tropospheric warming primarily induced by greenhouse gas increases, stratospheric temperature change expectations have arisen from experiments with a wider variety of model types, showing more complex trend patterns associated with a greater diversity of forcing agents.

Relevance: 30.00%

Abstract:

Buffalo curd gave a higher yield than cows’ curd under similar processing conditions. Curd moisture decreased with increasing gelation temperature in both types of milk. A curd cutting time of 45 minutes was found optimal for Mozzarella cheese making from both milk samples. The centrifugation method is simpler, quicker, and more reproducible than the Buchner funnel method. Buffalo milk contains higher amounts of αs1-, β-, and κ-casein than cows’ milk.

Relevance: 30.00%

Abstract:

To understand the resilience of aquatic ecosystems to environmental change, it is important to determine how multiple, related environmental factors, such as near-surface air temperature and river flow, will change during the next century. This study develops a novel methodology that combines statistical downscaling and fish species distribution modeling to enhance understanding of how global climate changes (modeled by coarse-resolution global climate models) may affect local riverine fish diversity. The novelty of this work is the downscaling framework developed to provide suitable future projections of fish habitat descriptors, focusing particularly on the hydrology, which has rarely been considered in previous studies. The proposed modeling framework was developed and tested in a major European system, the Adour-Garonne river basin (SW France, 116,000 km²), which covers distinct hydrological and thermal regions from the Pyrenees to the Atlantic coast. The simulations suggest that, by 2100, the mean annual stream flow will decrease by approximately 15% and temperature will increase by approximately 1.2 °C, on average. As a consequence, the majority of cool- and warm-water fish species are projected to expand their geographical range within the basin, while the few cold-water species will experience a reduction in their distribution. The limitations and potential benefits of the proposed modeling approach are discussed. Copyright © 2012 Elsevier B.V. All rights reserved.
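The species-distribution step can be illustrated with a logistic model linking occurrence probability to the two downscaled habitat descriptors (stream temperature and flow). The coefficients below are hypothetical; only the +1.2 °C warming and roughly 15% flow reduction come from the abstract:

```python
import math

def occurrence_probability(temp_C, flow_m3s, b0, b_temp, b_flow):
    """Species distribution model sketch: logistic regression of
    presence probability on stream temperature and flow.
    All coefficients (b0, b_temp, b_flow) are hypothetical."""
    eta = b0 + b_temp * temp_C + b_flow * flow_m3s
    return 1.0 / (1.0 + math.exp(-eta))

# Illustrative warm-water species (positive temperature coefficient)
# under the abstract's projected changes: +1.2 degC, about -15% flow.
p_now    = occurrence_probability(12.0, 30.0, -2.0, 0.2, 0.01)
p_future = occurrence_probability(13.2, 25.5, -2.0, 0.2, 0.01)
```

With a positive temperature coefficient the projected warming outweighs the flow reduction, so the modeled probability rises, mirroring the projected range expansion of warm-water species; a cold-water species (negative coefficient) would show the opposite.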

Relevance: 30.00%

Abstract:

The formulation and implementation of LEAF-2, the Land Ecosystem–Atmosphere Feedback model, which comprises the representation of land–surface processes in the Regional Atmospheric Modeling System (RAMS), is described. LEAF-2 is a prognostic model for the temperature and water content of soil, snow cover, vegetation, and canopy air, and includes turbulent and radiative exchanges between these components and with the atmosphere. Subdivision of a RAMS surface grid cell into multiple areas of distinct land-use types is allowed, with each subgrid area, or patch, containing its own LEAF-2 model, and each patch interacts with the overlying atmospheric column with a weight proportional to its fractional area in the grid cell. A description is also given of TOPMODEL, a land hydrology model that represents surface and subsurface downslope lateral transport of groundwater. Details of the incorporation of a modified form of TOPMODEL into LEAF-2 are presented. Sensitivity tests of the coupled system are presented that demonstrate the potential importance of the patch representation and of lateral water transport in idealized model simulations. Independent studies that have applied LEAF-2 and verified its performance against observational data are cited. Linkage of RAMS and TOPMODEL through LEAF-2 creates a modeling system that can be used to explore the coupled atmosphere–biophysical–hydrologic response to altered climate forcing at local watershed and regional basin scales.
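The patch aggregation idea in LEAF-2 can be sketched in a few lines: each subgrid patch runs its own surface model, and the grid cell hands the overlying atmospheric column the area-weighted mean flux. Patch types and flux values below are illustrative:

```python
import numpy as np

def grid_cell_flux(patch_fractions, patch_fluxes):
    """Patch-aggregation sketch: each land-use patch contributes to the
    grid-cell flux with a weight equal to its fractional area."""
    f = np.asarray(patch_fractions, dtype=float)
    assert abs(f.sum() - 1.0) < 1e-9, "patch fractions must sum to 1"
    return float(np.dot(f, patch_fluxes))

# Hypothetical cell: 60% forest, 30% cropland, 10% lake, with
# illustrative sensible heat fluxes in W m^-2 for each patch.
H = grid_cell_flux([0.6, 0.3, 0.1], [250.0, 180.0, 20.0])
```

The point of the weighting is that sharply contrasting patches (e.g. lake vs. forest) retain their own surface states while the atmosphere sees one consistent grid-cell flux.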

Relevance: 30.00%

Abstract:

Northern Hemisphere tropical cyclone (TC) activity is investigated in multiyear global climate simulations with the ECMWF Integrated Forecast System (IFS) at 10-km resolution, forced by the observed records of sea surface temperature and sea ice. The results are compared to analogous simulations with the 16-, 39-, and 125-km versions of the model as well as observations. In the North Atlantic, mean TC frequency in the 10-km model is comparable to the observed frequency, whereas it is too low in the other versions. While spatial distributions of the genesis and track densities improve systematically with increasing resolution, the 10-km model displays a qualitatively more realistic simulation of the track density in the western subtropical North Atlantic. In the North Pacific, the TC count tends to be too high in the west and too low in the east for all resolutions. These model errors appear to be associated with errors in the large-scale environmental conditions that are fairly similar in this region for all model versions. The largest benefits of the 10-km simulation are the dramatically more accurate representation of the TC intensity distribution and the structure of the most intense storms. The model can generate a supertyphoon with a maximum surface wind speed of 68.4 m s⁻¹. The life cycle of an intense TC comprises intensity fluctuations that occur in apparent connection with variations of the eyewall/rainband structure. These findings suggest that a hydrostatic model with cumulus parameterization and of high enough resolution could be used efficiently to simulate the TC intensity response (and the associated structural changes) to future climate change.

Relevance: 30.00%

Abstract:

Results from aircraft and surface observations provided evidence for the existence of mesoscale circulations over the Boreal Ecosystem-Atmosphere Study (BOREAS) domain. Using an integrated approach that included the use of analytical modeling, numerical modeling, and data analysis, we have found that there are substantial contributions to the total budgets of heat over the BOREAS domain generated by mesoscale circulations. This effect is largest when the synoptic flow is relatively weak, yet it is present under less favorable conditions, as shown by the case study presented here. While further analysis is warranted to document this effect, the existence of mesoscale flow is not surprising, since it is related to the presence of landscape patches, including lakes, which are of a size on the order of the local Rossby radius and which have spatial differences in maximum sensible heat flux of about 300 W m−2. We have also analyzed the vertical temperature profile simulated in our case study as well as high-resolution soundings and we have found vertical profiles of temperature change above the boundary layer height, which we attribute in part to mesoscale contributions. Our conclusion is that in regions with organized landscapes, such as BOREAS, even with relatively strong synoptic winds, dynamical scaling criteria should be used to assess whether mesoscale effects should be parameterized or explicitly resolved in numerical models of the atmosphere.
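The scaling criterion suggested in the conclusion can be sketched by comparing the landscape patch size with the local Rossby radius, λ = N h / f; patches of that order or larger can force mesoscale circulations that a model should resolve rather than parameterize. The stability, depth, and latitude values below are illustrative, not BOREAS measurements:

```python
import math

OMEGA = 7.292e-5  # Earth's rotation rate, s^-1

def rossby_radius(N_bv, h_m, lat_deg):
    """Local Rossby radius lambda = N * h / f, with f = 2 * Omega * sin(lat).
    N_bv: Brunt-Vaisala frequency (s^-1), h_m: boundary layer depth (m)."""
    f = 2.0 * OMEGA * math.sin(math.radians(lat_deg))
    return N_bv * h_m / f

def needs_explicit_mesoscale(patch_km, N_bv, h_m, lat_deg):
    """Crude screening test in the spirit of the abstract's conclusion:
    resolve explicitly when the patch scale reaches the Rossby radius."""
    return patch_km * 1e3 >= rossby_radius(N_bv, h_m, lat_deg)

# Illustrative mid-boreal values: N = 0.01 s^-1, 1.5-km boundary layer,
# 56 degrees N gives a Rossby radius of roughly 120 km.
lam = rossby_radius(0.01, 1500.0, 56.0)
```

By this yardstick a 150-km mosaic of lakes and forest patches would call for explicit resolution of the mesoscale flow, while a 50-km patch could be handled by parameterization.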

Relevance: 30.00%

Abstract:

The connection between the El Niño Southern Oscillation (ENSO) and the Northern polar stratosphere has been established from observations and atmospheric modeling. Here, a systematic intercomparison of the sensitivity of the modeled stratosphere to ENSO in Chemistry Climate Models (CCMs) is reported. This work uses results from a number of the CCMs included in the 2006 ozone assessment. In the lower stratosphere, the mean of all model simulations shows a warming of the polar vortex during strong ENSO events in February–March, consistent with but smaller than the estimate from satellite observations and ERA-40 reanalysis. The anomalous warming is associated with an anomalous dynamical increase of column ozone north of 70°N that is accompanied by a coherent column ozone decrease in the Tropics, in agreement with that deduced from the NIWA column ozone database, implying an increased residual circulation in the mean of all model simulations during ENSO. The spread in the model responses is partly due to the large internal stratospheric variability, and is shown to depend crucially on the representation of the tropospheric ENSO teleconnection in the models.

Relevance: 30.00%

Abstract:

This paper describes the techniques used to obtain sea surface temperature (SST) retrievals from the Geostationary Operational Environmental Satellite 12 (GOES-12) at the National Oceanic and Atmospheric Administration’s Office of Satellite Data Processing and Distribution. Previous SST retrieval techniques relying on channels at 11 and 12 μm are not applicable because GOES-12 lacks the latter channel. Cloud detection is performed using a Bayesian method exploiting fast forward modeling of prior clear-sky radiances from numerical weather predictions. The basic retrieval algorithm used at nighttime is based on a linear combination of brightness temperatures at 3.9 and 11 μm. In comparison with traditional split-window SSTs (using 11- and 12-μm channels), simulations show that this combination has maximum scatter when observing drier, colder scenes, with a comparable overall performance. For daytime retrieval, the same algorithm is applied after estimating and removing the contribution to brightness temperature in the 3.9-μm channel from solar irradiance. The correction is based on radiative transfer simulations and comprises a parameterization for atmospheric scattering and a calculation of ocean surface reflected radiance. Potential use of the 13-μm channel for SST is shown in a simulation study: in conjunction with the 3.9-μm channel, it can reduce the retrieval error by 30%. Some validation results are shown, while a companion paper by Maturi et al. presents a detailed analysis of the validation results for the operational algorithms described in this article.
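The nighttime retrieval described above, a linear combination of the 3.9- and 11-μm brightness temperatures, can be sketched as follows; the coefficients are hypothetical placeholders for the operationally derived regression values:

```python
def nighttime_sst(bt_39, bt_11, a0, a1, a2):
    """Nighttime SST retrieval sketch: a linear combination of the 3.9-um
    and 11-um brightness temperatures (K). The coefficients a0, a1, a2
    are hypothetical stand-ins for the operationally fitted values."""
    return a0 + a1 * bt_39 + a2 * bt_11

# Illustrative call with made-up coefficients and brightness temperatures.
sst = nighttime_sst(285.0, 283.0, a0=1.0, a1=0.5, a2=0.5)
```

In the daytime variant described in the abstract, the solar contribution would first be subtracted from `bt_39` before applying the same combination.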

Relevance: 30.00%

Abstract:

Seventeen simulations of the Last Glacial Maximum (LGM) climate have been performed using atmospheric general circulation models (AGCMs) in the framework of the Paleoclimate Modeling Intercomparison Project (PMIP). These simulations use the boundary conditions for CO2, insolation, and ice sheets; sea surface temperatures (SSTs) are either (a) prescribed using the CLIMAP data set (eight models) or (b) computed by coupling the AGCM with a slab ocean (nine models). The present-day (PD) tropical climate is correctly depicted by all the models except the coarser-resolution ones, and the simulated geographical distribution of annual mean temperature is in good agreement with climatology. Tropical cooling at the LGM is less than at middle and high latitudes, but greatly exceeds the PD temperature variability. The LGM simulations with prescribed SSTs underestimate the observed temperature changes except over equatorial Africa, where the models produce a temperature decrease consistent with the data. Our results confirm previous analyses showing that CLIMAP (1981) SSTs produce only a weak terrestrial cooling. When SSTs are computed, the models depict a cooling over the Pacific and Indian oceans, in contrast with CLIMAP, and most models produce cooler temperatures over land. Moreover, four of the nine simulations produce a cooling in good agreement with terrestrial data. Over the ocean, two of these model results are consistent with new SST reconstructions, whereas two models simulate a homogeneous cooling. Finally, the LGM aridity inferred for most of the tropics from the data is globally reproduced by the models, with a strong underestimation for models using computed SSTs.