90 results for Milk processing
Abstract:
There is now considerable evidence to suggest that non-demented people with Parkinson's disease (PD) experience difficulties using the morphosyntactic aspects of language. It remains unclear, however, at precisely which point in the processing of morphosyntax these difficulties emerge. The major objective of the present study was to examine the impact of PD on the processes involved in accessing morphosyntactic information in the lexicon. Nineteen people with PD and 19 matched control subjects participated in the study, which employed on-line word recognition tasks to examine morphosyntactic priming for local grammatical dependencies that occur both within (e.g. is going) and across (e.g. she gives) phrasal boundaries (Experiments 1 and 2, respectively). The control group evidenced robust morphosyntactic priming effects that were consistent with the involvement of both pre- (Experiment 1) and post-lexical (Experiment 2) processing routines. Whilst the participants with PD also recorded priming for dependencies within phrasal boundaries (Experiment 1), their priming effects were observed over an abnormally brief time course. Further, in contrast to the controls, the PD group failed to record morphosyntactic priming for constructions that crossed phrasal boundaries (Experiment 2). The results demonstrate that attentionally mediated mechanisms operating at both the pre- and post-lexical stages of processing are able to contribute to morphosyntactic priming effects. In addition, the findings support the notion that, whilst people with PD are able to access morphosyntactic information in a normal manner, the time frame in which this information remains available for processing is altered. Deficits may also be experienced at the post-lexical integrational stage of processing.
Abstract:
The coefficient of variation (CV; the standard deviation of response times divided by the mean response time) is a measure of response time variability that corrects for differences in mean response time (RT) (Segalowitz & Segalowitz, 1993). A positive correlation between decreasing mean RTs and CVs (rCV-RT) has been proposed as an indicator of L2 automaticity and, more generally, as an index of processing efficiency. The current study evaluates this claim by examining lexical decision performance by individuals from three levels of English proficiency (Intermediate ESL, Advanced ESL and L1 controls) on stimuli from four levels of item familiarity, as defined by frequency of occurrence. A three-phase model of skill development defined by changing rCV-RT values was tested. Results showed that RTs and CVs systematically decreased as a function of increasing proficiency and frequency levels, with the rCV-RT serving as a stable indicator of individual differences in lexical decision performance. The rCV-RT and automaticity/restructuring account is discussed in light of the findings. The CV is also evaluated as a more general quantitative index of processing efficiency in the L2.
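As a rough illustration of the measure described above, the following Python sketch computes each participant's mean RT, the CV (standard deviation divided by mean), and the rCV-RT correlation across participants; the data and variable names are hypothetical, not the study's materials.

```python
# Minimal sketch of the CV and rCV-RT computations; the RT data below
# are invented for illustration.
import numpy as np

def coefficient_of_variation(rts: np.ndarray) -> float:
    """CV = standard deviation of RTs divided by mean RT."""
    return rts.std(ddof=1) / rts.mean()

# Hypothetical lexical decision RTs (ms) for three participants.
participants = [
    np.array([612.0, 580.0, 645.0, 598.0, 630.0]),
    np.array([540.0, 515.0, 560.0, 532.0, 548.0]),
    np.array([470.0, 462.0, 481.0, 455.0, 476.0]),
]

mean_rts = np.array([p.mean() for p in participants])
cvs = np.array([coefficient_of_variation(p) for p in participants])

# rCV-RT: correlation between mean RTs and CVs across participants.
# A positive value is the proposed signature of growing automaticity
# (faster responding accompanied by proportionally lower variability).
r_cv_rt = np.corrcoef(mean_rts, cvs)[0, 1]
print(f"rCV-RT = {r_cv_rt:.2f}")
```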
Abstract:
This Toolkit was developed for the Australian dairy processing industry on behalf of Dairy Australia. At the conclusion of the project, industry participants gained exclusive access to a comprehensive Eco-Efficiency Manual, which outlined many of the opportunities available to the industry. Summary fact sheets were also prepared as publicly available resources, and these are available for download below.
Abstract:
This manual has been developed to help the Australian dairy processing industry increase its competitiveness through increased awareness and uptake of eco-efficiency. The manual seeks to consolidate and build on existing knowledge accumulated through projects and initiatives that the industry has previously undertaken to improve its use of raw materials and resources and reduce the generation of wastes. Where a comprehensive report or publication already exists, the manual refers the reader to it for further information. Eco-efficiency is about improving environmental performance to become more efficient and profitable; it is about producing more with less. It involves applying strategies that not only ensure efficient use of resources and reduce waste, but also reduce costs. This chapter outlines the environmental challenges faced by Australian dairy processors. The manual explores opportunities for reducing environmental impacts in relation to water, energy, product yield, solid and liquid waste reduction and chemical use.
Abstract:
The large fat globules that can be present in UHT milk due to inadequate homogenisation cause a cream layer to form that limits the shelf life of UHT milk. Four different particle size measurement techniques were used to measure the size of fat globules in poorly homogenised UHT milk processed in a UHT pilot plant. The thickness of the cream layer that formed during storage was negatively correlated with homogenisation pressure. It was positively correlated with the mass mean diameter and the percentage volume of particles between 1.5 and 2 μm in diameter, as determined by laser light scattering using the Malvern Mastersizer. Also, the thickness of the cream layer was positively correlated with the volume mode diameter and the percentage volume of particles between 1.5 and 2 μm in diameter, as determined by electrical impedance using the Coulter Counter. The cream layer thickness did not correlate significantly with the Coulter Counter measurements of volume mean diameter, or with volume percentages of particles between 2 and 5 μm or 5 and 10 μm in diameter. Spectroturbidimetry (Emulsion Quality Analyser) and light microscopy analyses were found to be unsuitable for assessing the size of the fat particles. This study suggests that the fat globule size distribution as determined by the electrical impedance method (Coulter Counter) is the most useful for determining the efficiency of homogenisation and therefore for predicting the stability of the fat emulsion in UHT milk during storage.
Abstract:
The cost of spatial join processing can be very high because of the large sizes of spatial objects and the computation-intensive spatial operations. While parallel processing seems a natural solution to this problem, it is not clear how spatial data should be partitioned for this purpose. Various spatial data partitioning methods are examined in this paper. A framework for parallel spatial join processing is proposed that combines the data-partitioning techniques used by most parallel join algorithms in relational databases with the filter-and-refine strategy for spatial operation processing. Object duplication caused by multi-assignment in spatial data partitioning can result in extra CPU cost as well as extra communication cost. We find that the key to overcoming this problem is to preserve spatial locality in task decomposition. We show in this paper that near-optimal speedup can be achieved for parallel spatial join processing using our new algorithms.
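The following Python sketch illustrates, under simplifying assumptions, the kind of pipeline the abstract describes: objects are multi-assigned to grid cells (which is what introduces duplicates), each cell is joined independently as a parallelisable task, and a cheap bounding-box filter precedes the exact refine step. The grid size, rectangle-only geometry, and all names are illustrative; this is not the paper's actual algorithm.

```python
# Sketch of grid partitioning with a filter-and-refine spatial join.
# All parameters and geometry choices are illustrative assumptions.
from collections import defaultdict
from itertools import product

Box = tuple[float, float, float, float]  # (xmin, ymin, xmax, ymax)

def cells_for(box: Box, cell: float) -> list[tuple[int, int]]:
    """Multi-assignment: a box goes to every grid cell it overlaps,
    which duplicates objects across partitions."""
    xmin, ymin, xmax, ymax = box
    xs = range(int(xmin // cell), int(xmax // cell) + 1)
    ys = range(int(ymin // cell), int(ymax // cell) + 1)
    return list(product(xs, ys))

def boxes_intersect(a: Box, b: Box) -> bool:
    """Filter step: cheap bounding-box overlap test."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def parallel_spatial_join(r: list[Box], s: list[Box], cell: float = 10.0) -> set:
    parts_r, parts_s = defaultdict(list), defaultdict(list)
    for i, box in enumerate(r):
        for c in cells_for(box, cell):
            parts_r[c].append(i)
    for j, box in enumerate(s):
        for c in cells_for(box, cell):
            parts_s[c].append(j)

    # Each cell is an independent task; a real system would ship these
    # to separate processors rather than loop over them serially.
    results = set()  # the set removes duplicate pairs found in several cells
    for c in parts_r:
        for i in parts_r[c]:
            for j in parts_s.get(c, []):
                if boxes_intersect(r[i], s[j]):  # filter
                    # Refine: an exact geometry test would go here; with
                    # plain rectangles the filter is already exact.
                    results.add((i, j))
    return results

print(parallel_spatial_join([(0, 0, 12, 5)], [(11, 4, 20, 9), (30, 30, 40, 40)]))
```

In this framing, preserving spatial locality amounts to choosing cells so that each task touches a compact region, which keeps the duplication (and hence the extra CPU and communication cost) low.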
Abstract:
Efficiency of presentation of a peptide epitope by an MHC class I molecule depends on two parameters: its binding to the MHC molecule and its generation by intracellular Ag processing. In contrast to the former parameter, the mechanisms underlying peptide selection in Ag processing are poorly understood. Peptide translocation by the TAP transporter is required for presentation of most epitopes and may modulate peptide supply to MHC class I molecules. To study the role of human TAP in peptide presentation by individual HLA class I molecules, we generated artificial neural networks capable of predicting the affinity of TAP for random-sequence 9-mer peptides. Using neural network-based predictions of TAP affinity, we found that peptides eluted from three different HLA class I molecules had higher TAP affinities than control peptides with equal binding affinities for the same HLA class I molecules, suggesting that human TAP may contribute to epitope selection. In simulated TAP binding experiments with 408 HLA class I binding peptides, HLA class I molecules differed significantly with respect to the TAP affinities of their ligands. As a result, some class I molecules, especially HLA-B27, may be particularly efficient in presenting cytosolic peptides present at low concentrations, while most class I molecules may predominantly present abundant cytosolic peptides.
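For concreteness, the sketch below shows one plausible shape for such a predictor: a 9-mer is one-hot encoded (9 positions × 20 amino acids) and passed through a small feed-forward network that outputs a TAP affinity score. The layer sizes are assumptions and the weights are untrained placeholders; the paper's actual architecture and training procedure are not reproduced here.

```python
# Illustrative sketch of a neural-network TAP affinity predictor.
# Layer sizes and the (untrained) weights are placeholder assumptions.
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def encode(peptide: str) -> np.ndarray:
    """One-hot encode a 9-mer into a flat 180-dimensional vector."""
    assert len(peptide) == 9
    x = np.zeros((9, 20))
    for pos, aa in enumerate(peptide):
        x[pos, AA_INDEX[aa]] = 1.0
    return x.ravel()

rng = np.random.default_rng(0)
W1, b1 = rng.normal(scale=0.1, size=(180, 10)), np.zeros(10)  # hidden layer
W2, b2 = rng.normal(scale=0.1, size=(10, 1)), np.zeros(1)     # output layer

def predict_affinity(peptide: str) -> float:
    """Forward pass; real weights would be fitted to measured TAP
    binding data for known peptides."""
    h = np.tanh(encode(peptide) @ W1 + b1)
    return (h @ W2 + b2).item()

print(predict_affinity("SRYWAIRTR"))  # example 9-mer; output is meaningless until trained
```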
Abstract:
Inhibitors of proteolytic enzymes (proteases) are emerging as prospective treatments for diseases such as AIDS and viral infections, cancers, inflammatory disorders, and Alzheimer's disease. Generic approaches to the design of protease inhibitors are limited by the unpredictability of interactions between, and structural changes to, inhibitor and protease during binding. A computer analysis of superimposed crystal structures for 266 small molecule inhibitors bound to 48 proteases (16 aspartic, 17 serine, 8 cysteine, and 7 metallo) provides the first conclusive proof that inhibitors, including substrate analogues, commonly bind in an extended beta-strand conformation at the active sites of all these proteases. Representative superimposed structures are shown for (a) multiple inhibitors bound to a protease of each class, (b) single inhibitors each bound to multiple proteases, and (c) conformationally constrained inhibitors bound to proteases. Thus inhibitor/substrate conformation, rather than sequence/composition alone, influences protease recognition, and this has profound implications for inhibitor design. This conclusion is supported by NMR, CD, and binding studies for HIV-1 protease inhibitors/substrates which, when preorganized in an extended conformation, have significantly higher protease affinity. Recognition is dependent upon conformational equilibria since helical and turn peptide conformations are not processed by proteases. Conformational selection explains the resistance of folded/structured regions of proteins to proteolytic degradation, the susceptibility of denatured proteins to processing, and the higher affinity of conformationally constrained 'extended' inhibitors/substrates for proteases. Other approaches to extended inhibitor conformations should similarly lead to high-affinity binding to a protease.
Abstract:
Recent research has begun to provide support for the assumptions that memories are stored as a composite and are accessed in parallel (Tehan & Humphreys, 1998). New predictions derived from these assumptions, and from the Chappell and Humphreys (1994) implementation of these assumptions, were tested. In three experiments, subjects studied relatively short lists of words. Some of the lists contained two similar targets (thief and theft) or two dissimilar targets (thief and steal) associated with the same cue (ROBBERY). As predicted, target similarity affected performance in cued recall but not in free association. Contrary to predictions, two spaced presentations of a target did not improve performance in free association. Two additional experiments confirmed and extended this finding. Several alternative explanations for the target similarity effect, which incorporate assumptions about separate representations and sequential search, are rejected. The importance of the finding that, in at least one implicit memory paradigm, repetition does not improve performance is also discussed.