88 results for combined stage sintering model
Abstract:
Acrylamide is formed from reducing sugars and asparagine during the preparation of French fries. The commercial preparation of French fries is a multistage process involving the preparation of frozen, par-fried potato strips for distribution to catering outlets, where they are finish-fried. The initial blanching, treatment in glucose solution, and par-frying steps are crucial because they determine the levels of precursors present at the beginning of the finish-frying process. To minimize the quantities of acrylamide in cooked fries, it is important to understand the impact of each stage on the formation of acrylamide. Acrylamide, amino acids, sugars, moisture, fat, and color were monitored at time intervals during the frying of potato strips that had been dipped in various concentrations of glucose and fructose during a typical pretreatment. A mathematical model based on the fundamental chemical reaction pathways of the finish-frying was developed, incorporating moisture and temperature gradients in the fries. This showed the contribution of both glucose and fructose to the generation of acrylamide and accurately predicted the acrylamide content of the final fries.
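The abstract above describes a reaction-pathway model of acrylamide formation during finish-frying. Below is a minimal, isothermal sketch of that kind of kinetics, assuming a single second-order formation step from reducing sugar and asparagine plus first-order acrylamide loss; the rate constants and initial concentrations are illustrative only, and the paper's full model additionally resolves moisture and temperature gradients within the fry.

```python
# Illustrative sketch of acrylamide reaction-pathway kinetics.
# Rate constants and initial concentrations are assumptions, not fitted values.
import numpy as np
from scipy.integrate import solve_ivp

def rates(t, y, k_form, k_loss):
    sugar, asn, acr = y                      # reducing sugar, asparagine, acrylamide
    r_form = k_form * sugar * asn            # second-order formation (Maillard route)
    r_loss = k_loss * acr                    # first-order acrylamide degradation
    return [-r_form, -r_form, r_form - r_loss]

# hypothetical initial concentrations (mmol/kg) and rate constants
sol = solve_ivp(rates, (0.0, 300.0), [10.0, 5.0, 0.0],
                args=(1e-3, 5e-3), dense_output=True)
t = np.linspace(0.0, 300.0, 7)
print(sol.sol(t)[2])                         # acrylamide level over frying time
```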
Abstract:
Let θ denote the level of quality inherent in a food product that is delivered to some terminal market. In this paper, I characterize allocations over θ and provide an economic rationale for regulating safety and quality standards in the food system. Zusman and Bockstael investigate the theoretical foundations for imposing standards and stress the importance of providing a tractable conceptual foundation. Despite a wealth of contributions that are mainly empirical (for reviews of these works see, respectively, Caswell and Antle), there have been relatively few attempts to model formally the linkages between farm and food markets when food quality and consumer safety are at issue. Here, I attempt to provide such a framework, building on key contributions in the theoretical literature and linking them in a simple model of quality determination in a vertically related marketing channel. The food-marketing model is due to Gardner. Spence provides a foundation for Pareto-improving intervention in a deterministic model of quality provision, and Leland, building on the classic paper by Akerlof, investigates licensing and minimum standards when the information structure is incomplete. Linking these ideas in a satisfactory model of the food markets is the main objective of the paper.
Abstract:
Iatrogenic errors and patient safety in clinical processes are an increasing concern. The quality of process information in hardcopy or electronic form can heavily influence clinical behaviour and decision-making errors. Little work has been undertaken to assess the safety impact of the clinical process planning documents that guide clinical actions and decisions. This paper investigates the clinical process documents used in elective surgery and their impact on latent and active clinical errors. Eight clinicians from a large health trust underwent extensive semi-structured interviews to understand their use of clinical documents and the documents' perceived impact on errors and patient safety. Samples of the key types of document used were analysed. Theories of latent organisational and active errors from the literature were combined with the EDA semiotics model of behaviour and decision making to propose the EDA Error Model. This model enabled us to identify perceptual, evaluation, knowledge and action error types, along with approaches to reducing their causes. The EDA Error Model was then used to analyse sample documents and identify error sources and controls. Types of knowledge artefact structures used in the documents were identified and assessed in terms of safety impact. This approach was combined with analysis of the questionnaire findings using existing error knowledge from the literature. The results identified a number of document and knowledge artefact issues that give rise to latent and active errors, as well as issues concerning medical culture and teamwork, together with recommendations for further work.
Abstract:
Airborne lidar provides accurate height information for objects on the Earth's surface and has been recognized as a reliable and accurate surveying tool in many applications. In particular, lidar data offer vital and significant features for urban land-cover classification, which is an important task in urban land-use studies. In this article, we present an effective approach in which lidar data, fused with co-registered images (i.e. aerial colour images containing red, green and blue (RGB) bands and near-infrared (NIR) images) and other derived features, are used for accurate urban land-cover classification. The proposed approach begins with an initial classification performed using the Dempster–Shafer theory of evidence with a specifically designed basic probability assignment function. It outputs two results, i.e. the initial classification and pseudo-training samples, which are selected automatically according to the combined probability masses. Second, a support vector machine (SVM)-based probability estimator is adopted to compute the class conditional probability (CCP) for each pixel from the pseudo-training samples. Finally, a Markov random field (MRF) model is established to incorporate spatial contextual information into the classification. In this stage, the initial classification result and the CCP are exploited. An efficient belief propagation (EBP) algorithm is developed to search for the global minimum-energy solution of the maximum a posteriori (MAP)-MRF framework, in which three techniques are developed to speed up the standard belief propagation (BP) algorithm. Lidar and co-registered data acquired by a Toposys Falcon II are used in performance tests. The experimental results prove that fusing height data and optical images is particularly well suited to urban land-cover classification. No training samples are needed in the proposed approach, and the computational cost is relatively low. An average classification accuracy of 93.63% is achieved.
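The initial classification step above rests on the Dempster–Shafer theory of evidence. A minimal sketch of Dempster's rule of combination follows; the three-class frame of discernment and the two mass functions are made-up illustrations, not the paper's specifically designed basic probability assignment.

```python
# Dempster's rule of combination over a small frame of discernment.
# The mass functions below are hypothetical examples for illustration.
from itertools import product

def dempster(m1, m2):
    combined, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb                # mass assigned to the empty set
    norm = 1.0 - conflict                      # Dempster normalisation factor
    return {s: v / norm for s, v in combined.items()}

# hypothetical evidence: height favours 'building', NIR favours vegetation
height_mass = {frozenset({'building'}): 0.6,
               frozenset({'building', 'tree'}): 0.3,
               frozenset({'building', 'tree', 'ground'}): 0.1}
nir_mass    = {frozenset({'tree'}): 0.5,
               frozenset({'building', 'tree'}): 0.2,
               frozenset({'building', 'tree', 'ground'}): 0.3}
print(dempster(height_mass, nir_mass))
```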
Abstract:
In this paper we report on a study conducted using the Middle Atmospheric Nitrogen TRend Assessment (MANTRA) balloon measurements of stratospheric constituents and temperature and the Canadian Middle Atmosphere Model (CMAM). Three different kinds of data are used to assess the inter-consistency of the combined dataset: single profiles of long-lived species from MANTRA 1998, sparse climatologies from the ozonesonde measurements during the four MANTRA campaigns and from HALOE satellite measurements, and the CMAM climatology. In doing so, we evaluate the ability of the model to reproduce the measured fields, thereby testing our ability to describe mid-latitude summertime stratospheric processes. The MANTRA campaigns were conducted at Vanscoy, Saskatchewan, Canada (52° N, 107° W) in late August and early September of 1998, 2000, 2002 and 2004. During late summer at mid-latitudes, the stratosphere is close to photochemical control, providing an ideal scenario for the study reported here. From this analysis we find that: (1) reducing the value of the vertical diffusion coefficient in CMAM to a more physically reasonable value results in the model better reproducing the measured profiles of long-lived species; (2) the existence of compact correlations among the constituents, as expected from independent measurements in the literature and from models, confirms the self-consistency of the MANTRA measurements; and (3) the 1998 measurements show structures in the chemical species profiles that can be associated with transport, adding to the growing evidence that the summertime stratosphere can be much more disturbed than anticipated. The mechanisms responsible for such disturbances need to be understood in order to assess the representativeness of the measurements and to isolate long-term trends.
Abstract:
This study examines the effect of combining equatorial planetary wave drag and gravity wave drag in a one-dimensional zonal mean model of the quasi-biennial oscillation (QBO). Several different combinations of planetary wave and gravity wave drag schemes are considered, with the aim of assessing which aspects of the different schemes affect the nature of the modeled QBO. Results show that it is possible to generate a realistic-looking QBO with various combinations of drag from the two types of waves, but there are some constraints on the wave input spectra and amplitudes. For example, if the phase speeds of the gravity waves in the input spectrum are large relative to those of the equatorial planetary waves, critical-level absorption of the equatorial planetary waves may occur. The resulting mean-wind oscillation, in that case, is driven almost exclusively by the gravity wave drag, with only a small contribution from the planetary waves at low levels. With an appropriate choice of wave input parameters, it is possible to obtain a QBO with a realistic period to which both types of waves contribute. This is the regime in which the terrestrial QBO appears to reside. There may also be constraints on the initial strength of the wind shear, and these are similar to the constraints that apply when gravity wave drag is used without any planetary wave drag. In recent years, it has been observed that, in order to simulate the QBO accurately, general circulation models require parameterized gravity wave drag in addition to the drag from resolved planetary-scale waves, and that even if the planetary wave amplitudes are incorrect, the gravity wave drag can be adjusted to compensate. This study provides a basis for understanding why such compensation is possible.
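For readers unfamiliar with one-dimensional QBO models of this kind, below is a heavily simplified, Holton–Lindzen/Plumb-style sketch in which two waves of opposite phase speed deposit momentum where they are damped, alongside vertical diffusion. All parameter values are assumptions chosen for illustration, not the paper's configuration, and whether a realistic oscillation emerges depends on those choices.

```python
# Toy 1-D wave-driven mean-flow model (Holton-Lindzen/Plumb-type sketch).
# Two waves with phase speeds +/-c are attenuated toward their critical
# levels; the divergence of the wave momentum flux accelerates the mean wind.
import numpy as np

nz, dz, dt = 60, 500.0, 3600.0            # levels, grid spacing [m], step [s]
K, k, alpha = 0.3, 2e-6, 1.5e-6           # diffusivity, wavenumber, damping rate
c, A = 25.0, 4e-4                         # phase speed [m/s], flux amplitude

u = 0.1 * np.random.default_rng(0).standard_normal(nz)   # weak initial shear
for _ in range(24 * 3000):                # ~3000 days of hourly steps
    F = np.zeros(nz)
    for sign in (+1.0, -1.0):
        cp = sign * c
        # attenuation grows as u approaches the phase speed (critical level)
        damp = alpha / (k * np.maximum((cp - u) ** 2, 1.0))
        F += sign * A * np.exp(-np.cumsum(damp) * dz)
    dFdz = np.gradient(F, dz)
    d2u = np.zeros(nz)
    d2u[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dz ** 2
    u += dt * (-dFdz + K * d2u)           # flux convergence + diffusion
    u[0] = 0.0                            # fixed lower boundary
print(np.round(u[:10], 2))                # lower-level mean winds
```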
Abstract:
This work proposes a unified neurofuzzy modelling scheme. To begin with, the initial fuzzy base construction method is based on fuzzy clustering using a Gaussian mixture model (GMM) combined with the analysis of variance (ANOVA) decomposition in order to obtain more compact univariate and bivariate membership functions over the subspaces of the input features. The means and covariances of the Gaussian membership functions are found by the expectation-maximisation (EM) algorithm, which has the merit of revealing the underlying density distribution of the system inputs. The resultant set of membership functions forms the basis of the generalised fuzzy model (GFM) inference engine. The model structure and parameters of this neurofuzzy model are identified via supervised subspace orthogonal least squares (OLS) learning. Finally, instead of providing a deterministic class label as the model output by convention, a logistic regression model is applied to represent the classifier's output, in which a sigmoid-type logistic transfer function scales the outputs of the neurofuzzy model to class probabilities. Experimental validation results are presented to demonstrate the effectiveness of the proposed neurofuzzy modelling scheme.
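Two of the ingredients described above lend themselves to a compact illustration: Gaussian membership functions obtained by EM, and a logistic transfer function mapping the model output to a class probability. The sketch below uses scikit-learn's GaussianMixture as the EM step; the rule weights are hypothetical stand-ins for the supervised OLS learning, and the ANOVA decomposition is not reproduced.

```python
# Gaussian membership functions via EM, plus logistic output scaling.
# Synthetic data and rule weights are assumptions for illustration only.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (100, 1)), rng.normal(2, 0.8, (100, 1))])

gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
membership = gmm.predict_proba(X)         # fuzzy membership of each sample

# hypothetical rule weights; in the paper these come from supervised OLS learning
w = np.array([-1.5, 1.5])
score = membership @ w                    # neurofuzzy model output
p_class1 = 1.0 / (1.0 + np.exp(-score))  # logistic transfer to class probability
print(p_class1[:5], p_class1[-5:])
```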
Abstract:
Oculopharyngeal muscular dystrophy (OPMD) is an adult-onset disorder characterized by ptosis, dysphagia and proximal limb weakness. Autosomal-dominant OPMD is caused by a short (GCG)8–13 expansion within the first exon of the poly(A)-binding protein nuclear 1 gene (PABPN1), leading to an expanded polyalanine tract in the mutated protein. Expanded PABPN1 forms insoluble aggregates in the nuclei of skeletal muscle fibres. In order to gain insight into the different physiological processes affected in OPMD muscles, we used a transgenic mouse model of OPMD (A17.1) and performed transcriptomic studies combined with a detailed phenotypic characterization of this model at three time points. The transcriptomic analysis revealed massive gene deregulation in the A17.1 mice, among which we identified a significant deregulation of pathways associated with muscle atrophy. Using a mathematical model for progression, we identified that one-third of the progressive genes were also associated with muscle atrophy. Functional and histological analysis of the skeletal muscle of this mouse model confirmed a severe and progressive muscular atrophy associated with a reduction in muscle strength. Moreover, muscle atrophy in the A17.1 mice was restricted to fast glycolytic fibres, which contain a large number of intranuclear inclusions (INIs). The soleus muscle and, in particular, oxidative fibres were spared, even though they contained INIs, albeit to a lesser degree. These results demonstrate a fibre-type specificity of muscle atrophy in this OPMD model. This study improves our understanding of the biological pathways modified in OPMD, helping to identify potential biomarkers and new therapeutic targets.
Abstract:
The occurrence of mid-latitude windstorms is related to strong socio-economic effects. For detailed and reliable regional impact studies, large datasets of high-resolution wind fields are required. In this study, a statistical downscaling approach in combination with dynamical downscaling is introduced to derive storm-related gust speeds on a high-resolution grid over Europe. Multiple linear regression models are trained using reanalysis data and wind gusts from regional climate model simulations for a sample of 100 top-ranking windstorm events. The method is computationally inexpensive and reproduces individual windstorm footprints adequately. Compared to observations, the results for Germany are at least as good as those from pure dynamical downscaling. This new tool can easily be applied to large ensembles of general circulation model simulations and can thus contribute to a better understanding of the regional impact of windstorms based on decadal and climate change projections.
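A minimal sketch of the regression step described above follows, using synthetic data: a multiple linear regression from a few hypothetical coarse-scale predictors to local gust speed, of the kind that would be trained per grid point from reanalysis predictors and regional-model gusts.

```python
# Multiple linear regression for statistical downscaling of gust speeds.
# Predictor names and the synthetic data are assumptions for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_events = 100                                   # e.g. top-ranking storm events
# hypothetical coarse predictors: 850-hPa wind speed, MSLP gradient, 10-m wind
X = rng.normal(size=(n_events, 3))
true_coef = np.array([3.0, 1.2, 4.5])
y = X @ true_coef + 25.0 + rng.normal(0, 1.0, n_events)   # local gust speed [m/s]

model = LinearRegression().fit(X, y)             # one model per grid point in practice
print(model.coef_, model.intercept_)
print(model.predict(X[:3]))                      # downscaled gusts for new events
```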
Abstract:
This study examines, in a unified fashion, the budgets of ocean gravitational potential energy (GPE) and available gravitational potential energy (AGPE) in the control simulation of the coupled atmosphere–ocean general circulation model HadCM3. Only AGPE can be converted into kinetic energy by adiabatic processes. Diapycnal mixing supplies GPE, but not AGPE, whereas the reverse is true of the combined effect of surface buoyancy forcing and convection. Mixing and buoyancy forcing thus play complementary roles in sustaining the large-scale circulation. However, the largest globally integrated source of GPE is resolved advection (+0.57 TW) and the largest sink is parameterized eddy transports (−0.82 TW). The effect of these adiabatic processes on AGPE is identical to their effect on GPE, except for perturbations to both budgets due to numerical leakage exacerbated by non-linearities in the equation of state.
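For reference, one common convention for the two energy reservoirs discussed above is sketched below; this is a standard Lorenz-type definition, not necessarily the exact formulation used in the HadCM3 analysis:

```latex
\mathrm{GPE} = \int_V \rho \, g \, z \;\mathrm{d}V, \qquad
\mathrm{AGPE} = \mathrm{GPE} - \mathrm{GPE}_{\mathrm{ref}},
```

where GPE_ref is the potential energy of the reference state obtained by adiabatically re-sorting the ocean into a statically stable, minimum-energy stratification; only the excess over this reference state is available for conversion to kinetic energy.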
Abstract:
The Hadley Centre Global Environmental Model (HadGEM) includes two aerosol schemes: the Coupled Large-scale Aerosol Simulator for Studies in Climate (CLASSIC) and the new Global Model of Aerosol Processes (GLOMAP-mode). GLOMAP-mode is a modal aerosol microphysics scheme that simulates not only aerosol mass but also aerosol number, represents internally mixed particles, and includes aerosol microphysical processes such as nucleation. In this study, both schemes provide hindcast simulations of natural and anthropogenic aerosol species for the period 2000–2006. HadGEM simulations of aerosol optical depth using GLOMAP-mode compare better than CLASSIC against a data-assimilated aerosol reanalysis and ground-based aerosol observations. Because of differences in wet deposition rates, the GLOMAP-mode sulphate aerosol residence time is two days longer than that of CLASSIC sulphate aerosol, whereas the black carbon residence time is much shorter. As a result, CLASSIC underestimates aerosol optical depths in continental regions of the Northern Hemisphere and likely overestimates absorption in remote regions. Aerosol direct and first indirect radiative forcings are computed from simulations of aerosols with emissions for the years 1850 and 2000. In 1850, GLOMAP-mode predicts lower aerosol optical depths and higher cloud droplet number concentrations than CLASSIC. Consequently, simulated clouds are much less susceptible to natural and anthropogenic aerosol changes when the microphysical scheme is used. In particular, the response of cloud condensation nuclei to an increase in dimethyl sulphide emissions becomes a factor of four smaller. The combined effect of different 1850 baselines, residence times, and abilities to affect cloud droplet number leads to substantial differences in the aerosol forcings simulated by the two schemes. GLOMAP-mode finds a present-day direct aerosol forcing of −0.49 W m−2 on a global average, 72% stronger than the corresponding forcing from CLASSIC. This difference is compensated by changes in first indirect aerosol forcing: the forcing of −1.17 W m−2 obtained with GLOMAP-mode is 20% weaker than with CLASSIC. Results suggest that mass-based schemes such as CLASSIC lack the necessary sophistication to provide realistic input to aerosol-cloud interaction schemes. Furthermore, the importance of the 1850 baseline highlights how model skill in predicting present-day aerosol does not guarantee reliable forcing estimates. These findings suggest that the more complex representation of aerosol processes in microphysical schemes improves the fidelity of simulated aerosol forcings.
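Taking the quoted percentages at face value (and assuming "stronger" and "weaker" are multiplicative), the implied CLASSIC forcings can be backed out:

```latex
F^{\mathrm{CLASSIC}}_{\mathrm{dir}} \approx \frac{-0.49}{1.72} \approx -0.28\ \mathrm{W\,m^{-2}},
\qquad
F^{\mathrm{CLASSIC}}_{\mathrm{ind}} \approx \frac{-1.17}{0.80} \approx -1.46\ \mathrm{W\,m^{-2}}.
```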
Abstract:
Wheat dextrin soluble fibre may have metabolic and health benefits, potentially acting via mechanisms governed by the selective modulation of the human gut microbiota. Our aim was to examine the impact of wheat dextrin on the composition and metabolic activity of the gut microbiota. We used a validated in vitro three-stage continuous culture human colonic model (gut model) system comprising vessels that simulate anatomical regions of the human colon. To mimic human ingestion, 7 g of wheat dextrin (NUTRIOSE® FB06) was administered to three gut models, twice daily at 10.00 and 15.00, for a total of 18 days. Samples were collected and analysed for microbial composition and organic acid concentrations by 16S rRNA-based fluorescence in situ hybridisation and gas chromatography approaches, respectively. Wheat dextrin mediated a significant increase in total bacteria in the vessels simulating the transverse and distal colon, and a significant increase in the key butyrate-producing bacteria of Clostridium cluster XIVa and the Roseburia genus in all vessels of the gut model. Production of the principal short-chain fatty acids acetate, propionate and butyrate, which have been purported to have protective, trophic and metabolic host benefits, was increased. Specifically, wheat dextrin fermentation had a significant butyrogenic effect in all vessels of the gut model and significantly increased production of acetate (vessels 2 and 3) and propionate (vessel 3), simulating the transverse and distal regions of the human colon, respectively. In conclusion, wheat dextrin NUTRIOSE® FB06 is selectively fermented in vitro by Clostridium cluster XIVa and the Roseburia genus and beneficially alters the metabolic profile of the human gut microbiota.
Abstract:
A novel analytical model for mixed-phase, unblocked and unseeded orographic precipitation with embedded convection is developed and evaluated. The model takes an idealised background flow and terrain geometry and calculates the area-averaged precipitation rate and other microphysical quantities. The results provide insight into key physical processes, including cloud condensation, vapour deposition, evaporation and sublimation, as well as precipitation formation and sedimentation (fallout). To account for embedded convection in nominally stratiform clouds, diagnostics for purely convective and purely stratiform clouds are calculated independently and combined using weighting functions based on relevant dynamical and microphysical time scales. An in-depth description of the model is presented, along with a quantitative assessment of its performance against idealised, convection-permitting numerical simulations with a sophisticated microphysics parameterisation. The model is found to reproduce the simulation diagnostics accurately over most of the parameter space considered.
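A plausible form for the convective/stratiform blending described above is a time-scale-dependent weighted average; the specific expression below is an assumption for illustration, since the paper defines its own weighting functions from the relevant dynamical and microphysical time scales:

```latex
\overline{P} = w\,\overline{P}_{\mathrm{conv}} + (1 - w)\,\overline{P}_{\mathrm{strat}},
\qquad 0 \le w \le 1,
```

where $w$ would grow as the convective overturning time scale becomes short relative to the stratiform advective and microphysical time scales.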
Abstract:
Certain milk factors can promote the growth of a host-friendly gastrointestinal microflora. This may explain why breast-fed infants experience fewer intestinal infections than their formula-fed counterparts. The effect of formula supplementation with two such factors was investigated in this study. Infant faecal specimens were used to ferment formulas supplemented with glycomacropeptide and α-lactalbumin in a two-stage compound continuous culture model. Bacteriology was determined by fluorescence in situ hybridisation. Vessels that contained breast milk, as well as those with α-lactalbumin and glycomacropeptide, had stable counts of bifidobacteria, while lactobacilli increased significantly only in vessels with breast milk. Bacteroides, clostridia and Escherichia coli decreased significantly in all runs. Acetate was the principal acid found, along with high amounts of propionate and lactate. Supplementation of infant formulas with appropriate milk proteins may be useful in simulating the beneficial bacteriological effects of breast milk.