166 results for fermentation technique
Abstract:
Pull pipelining, a pipeline technique in which data is pulled by successor stages from predecessor stages, is proposed. Control circuits using synchronous, semi-synchronous and asynchronous approaches are given. Simulation examples for a DLX generic RISC datapath show that the proposal avoids the overhead of common pipeline control circuits. Applications to linear systolic arrays are foreseen for cases in which computation finishes at early stages of the array. This would allow run-time, data-driven digital frequency modulation of synchronous pipelined designs, with applications to implementing algorithms that exhibit average-case processing time using a synchronous approach.
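As a software analogy of the pull-based dataflow described above (a hypothetical sketch, not the paper's control circuits), Python generators exhibit exactly this demand-driven behaviour: no stage computes until its successor pulls a value.

```python
# Hypothetical software analogy of pull pipelining: each stage is a
# generator that pulls data from its predecessor only when its own
# successor demands a value. Stage functions are illustrative only.

def source(values):
    # Predecessor stage: yields raw operands on demand.
    for v in values:
        yield v

def stage_double(prev):
    # Intermediate stage: pulls from its predecessor, then processes.
    for v in prev:
        yield v * 2

def stage_inc(prev):
    # Final processing stage.
    for v in prev:
        yield v + 1

def run_pipeline(values):
    # The consumer drives the whole pipeline by pulling results;
    # computation is triggered from the successor end, not the source.
    pipe = stage_inc(stage_double(source(values)))
    return list(pipe)
```

For example, `run_pipeline([1, 2, 3])` yields `[3, 5, 7]`, with each element travelling through the stages only when the final `list()` call demands it.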
Abstract:
Very large scale scheduling and planning tasks cannot be effectively addressed by fully automated schedule optimisation systems, since many key factors which govern 'fitness' in such cases are unformalisable. This raises the question of an interactive (or collaborative) approach, where fitness is assigned by the expert user. Though well-researched in the domains of interactively evolved art and music, this method is as yet rarely used in logistics. This paper concerns a difficulty shared by all interactive evolutionary systems (IESs), but especially those used for logistics or design problems. The difficulty is that objective evaluation of IESs is severely hampered by the need for expert humans in the loop. This makes it effectively impossible to, for example, determine with statistical confidence any ranking among a decent number of configurations for the parameters and strategy choices. We make headway into this difficulty with an Automated Tester (AT) for such systems. The AT replaces the human in experiments, and has parameters controlling its decision-making accuracy (modelling human error) and a built-in notion of a target solution which may typically be at odds with the solution which is optimal in terms of formalisable fitness. Using the AT, plausible evaluations of alternative designs for the IES can be done, allowing for (and examining the effects of) different levels of user error. We describe such an AT for evaluating an IES for very large scale planning.
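The Automated Tester idea can be sketched in a few lines. The following is a minimal, hypothetical model (the function name, numeric-vector representation of solutions, and error model are assumptions, not the paper's implementation): the AT prefers the candidate closer to its built-in target, but with a configurable probability it picks the other one, modelling human error.

```python
import random

def automated_tester_choice(cand_a, cand_b, target, error_rate, rng):
    # Hypothetical Automated Tester (AT) sketch: prefer the candidate
    # closer (squared Euclidean distance) to the AT's built-in target
    # solution, but with probability `error_rate` choose wrongly,
    # modelling the decision-making accuracy of a human expert.
    def dist(x):
        return sum((xi - ti) ** 2 for xi, ti in zip(x, target))

    preferred = cand_a if dist(cand_a) <= dist(cand_b) else cand_b
    other = cand_b if preferred is cand_a else cand_a
    return other if rng.random() < error_rate else preferred
```

Running an interactive evolutionary system against this stand-in, swept over `error_rate` values, is the kind of experiment the abstract describes: it lets alternative IES configurations be ranked repeatably without an expert in the loop.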
Abstract:
A simple and practical technique is described for assessing the risks, that is, the potential for error and consequent loss, in software system development, as identified during a requirements engineering phase. The technique uses a goal-based requirements analysis as a framework to identify and rate a set of key issues in order to arrive at estimates of the feasibility and adequacy of the requirements. The technique is illustrated by showing how it was applied to a real systems development project, and how problems in that project could have been identified earlier, thereby avoiding costly additional work and unhappy users.
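In the spirit of rating key issues to reach feasibility and adequacy estimates, a toy scoring scheme might look like the following. This is purely illustrative: the issue names, the 0-4 rating scale, and the flagging threshold are assumptions, not the technique from the paper.

```python
# Hypothetical issue-rating sketch: each requirements issue gets a
# rating (0 = severe problem, 4 = fully satisfactory); the mean gives
# a rough overall estimate and low-rated issues are flagged as risks.

def assess(issue_ratings):
    # issue_ratings: {issue name: rating in 0..4}
    if not issue_ratings:
        raise ValueError("no issues rated")
    avg = sum(issue_ratings.values()) / len(issue_ratings)
    return {
        "score": avg,
        "flagged": sorted(k for k, v in issue_ratings.items() if v <= 1),
    }
```

For example, rating "stakeholder agreement" at 1 while other issues score well would flag it early, before it turns into costly rework.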
Abstract:
This paper presents a clocking pipeline technique referred to as a single-pulse pipeline (PP-Pipeline) and applies it to the problem of mapping pipelined circuits to a Field Programmable Gate Array (FPGA). A PP-pipeline replicates the operation of asynchronous micropipelined control mechanisms using synchronous-orientated logic resources commonly found in FPGA devices. Consequently, circuits with an asynchronous-like pipeline operation can be efficiently synthesized using a synchronous design methodology. The technique can be extended to include data-completion circuitry to take advantage of variable data-completion processing time in synchronous pipelined designs. It is also shown that the PP-pipeline reduces the clock tree power consumption of pipelined circuits. These potential applications are demonstrated by post-synthesis simulation of FPGA circuits. (C) 2004 Elsevier B.V. All rights reserved.
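The benefit of data-completion signalling over a fixed clock can be illustrated with a back-of-envelope timing model (a hypothetical sketch, not the paper's circuits): a synchronous pipeline must clock at the worst-case stage latency, whereas a completion-driven stage hands each item off as soon as it finishes.

```python
# Hypothetical timing comparison for one pipeline stage processing a
# stream of items with data-dependent latencies.

def synchronous_time(latencies):
    # Fixed clock: one item per cycle, clock period = worst-case latency.
    return len(latencies) * max(latencies)

def completion_driven_time(latencies):
    # Completion signalling: each item takes only its own processing
    # time; items are handed off back-to-back (average-case behaviour).
    return sum(latencies)
```

With latencies `[1, 1, 1, 10]`, the synchronous stage needs 4 x 10 = 40 time units while the completion-driven stage needs 13, showing why average-case operation pays off when worst-case data is rare.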
Abstract:
Using the classical Parzen window (PW) estimate as the desired response, the kernel density estimation is formulated as a regression problem and the orthogonal forward regression technique is adopted to construct sparse kernel density (SKD) estimates. The proposed algorithm incrementally minimises a leave-one-out test score to select a sparse kernel model, and a local regularisation method is incorporated into the density construction process to further enforce sparsity. The kernel weights of the selected sparse model are finally updated using the multiplicative nonnegative quadratic programming algorithm, which ensures the nonnegative and unity constraints for the kernel weights and has the desired ability to reduce the model size further. Except for the kernel width, the proposed method has no other parameters that need tuning, and the user is not required to specify any additional criterion to terminate the density construction procedure. Several examples demonstrate the ability of this simple regression-based approach to effectively construct a SKD estimate with comparable accuracy to that of the full-sample optimised PW density estimate. (c) 2007 Elsevier B.V. All rights reserved.
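The two density forms involved are easy to write down. Below is a minimal pure-Python sketch of the full-sample Parzen window estimate (the regression target) and of a sparse kernel density with nonnegative weights summing to one (the constrained form the sparse model takes); the selection and regularisation machinery of the paper is not reproduced here.

```python
import math

def gaussian_kernel(x, c, h):
    # 1-D Gaussian kernel centred at c with width h.
    return math.exp(-((x - c) ** 2) / (2 * h * h)) / (h * math.sqrt(2 * math.pi))

def parzen_window(x, samples, h):
    # Classical Parzen window estimate: equal weight 1/N on every sample.
    return sum(gaussian_kernel(x, c, h) for c in samples) / len(samples)

def sparse_density(x, centres, weights, h):
    # Sparse kernel density: only a few centres survive, with
    # nonnegative weights summing to one (the constraints enforced by
    # the multiplicative nonnegative quadratic programming step).
    return sum(w * gaussian_kernel(x, c, h) for c, w in zip(centres, weights))
```

The regression view treats `parzen_window` evaluated at the data points as the desired response and fits `sparse_density` to it with as few centres as possible.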
Abstract:
Tremor is a clinical feature characterized by oscillations of a part of the body. The detection and study of tremor is an important step in investigations seeking to explain underlying control strategies of the central nervous system under natural (or physiological) and pathological conditions. It is well established that tremorous activity is composed of deterministic and stochastic components. For this reason, the use of digital signal processing techniques (DSP) which take into account the nonlinearity and nonstationarity of such signals may bring new information into the signal analysis which is often obscured by traditional linear techniques (e.g. Fourier analysis). In this context, this paper introduces the application of the empirical mode decomposition (EMD) and Hilbert spectrum (HS), which are relatively new DSP techniques for the analysis of nonlinear and nonstationary time-series, for the study of tremor. Our results, obtained from the analysis of experimental signals collected from 31 patients with different neurological conditions, showed that the EMD could automatically decompose acquired signals into basic components, called intrinsic mode functions (IMFs), representing tremorous and voluntary activity. The identification of a physical meaning for IMFs in the context of tremor analysis suggests an alternative and new way of detecting tremorous activity. These results may be relevant for those applications requiring automatic detection of tremor. Furthermore, the energy of IMFs was visualized as a function of time and frequency by means of the HS. This analysis showed that the variation of energy of tremorous and voluntary activity could be distinguished and characterized on the HS. Such results may be relevant for those applications aiming to identify neurological disorders. 
In general, both the HS and EMD proved very useful for objective analysis of any kind of tremor and can therefore potentially be used for functional assessment.
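The building block behind the Hilbert spectrum is the analytic signal, whose modulus gives the instantaneous amplitude (envelope) of each IMF. A small pure-Python sketch, using a direct DFT rather than an FFT library (fine for illustration, not for production-length recordings), and omitting the EMD sifting itself:

```python
import cmath
import math

def analytic_signal(x):
    # Discrete analytic signal via a direct DFT (O(N^2); a sketch only):
    # keep DC, double positive frequencies, zero negative ones, then
    # inverse-transform.
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    H = [0j] * n
    H[0] = X[0]
    for k in range(1, (n + 1) // 2):
        H[k] = 2 * X[k]
    if n % 2 == 0:
        H[n // 2] = X[n // 2]
    return [sum(H[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def instantaneous_amplitude(x):
    # Envelope of the signal: modulus of the analytic signal. Applied to
    # each IMF, this is the energy the Hilbert spectrum localises in
    # time and frequency.
    return [abs(z) for z in analytic_signal(x)]
```

For a pure cosine (a crude stand-in for a steady tremor oscillation), the recovered envelope is constant at 1, as expected.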
Abstract:
We describe a high-level design method to synthesize multi-phase regular arrays. The method is based on deriving component designs using classical regular (or systolic) array synthesis techniques and composing these separately evolved component designs into a unified global design. Similarity transformations are applied to component designs in the composition stage in order to align data flow between the phases of the computations. Three transformations are considered: rotation, reflection and translation. The technique is aimed at the design of hardware components for high-throughput embedded systems applications, and we demonstrate this by deriving a multi-phase regular array for the 2-D DCT algorithm, which is widely used in many video communications applications.
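The multi-phase character of the 2-D DCT comes from its separability: a row-DCT phase followed by a column-DCT phase. A minimal pure-Python sketch of that two-phase composition (unnormalised DCT-II; this illustrates the algorithm's structure, not the paper's array derivation):

```python
import math

def dct1d(v):
    # Unnormalised 1-D DCT-II: X[k] = sum_n v[n] * cos(pi*(n + 1/2)*k / N)
    N = len(v)
    return [sum(v[n] * math.cos(math.pi * (n + 0.5) * k / N) for n in range(N))
            for k in range(N)]

def dct2d(block):
    # Phase 1: 1-D DCT along each row.
    rows = [dct1d(r) for r in block]
    # Phase 2: 1-D DCT along each column of the phase-1 result.
    cols = list(zip(*rows))
    out_cols = [dct1d(list(c)) for c in cols]
    # Transpose back to row-major order.
    return [list(r) for r in zip(*out_cols)]
```

A constant 4x4 block transforms to a single DC coefficient with all other outputs zero, which is the expected energy-compaction behaviour exploited in video coding.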
Abstract:
Epidemiological studies and healthy eating guidelines suggest a positive correlation between ingestion of whole grain cereal and food rich in fibre with protection from chronic diseases. The prebiotic potential of whole grains may contribute to this protection; however, little is known about the microbiota-modulatory capability of oat grain or the impact processing has on this ability. In this study the fermentation profile of whole grain oat flakes, processed to produce two different sized flakes (small and large), by human faecal microbiota was investigated in vitro. Simulated digestion and subsequent fermentation by gut bacteria was investigated using pH-controlled faecal batch cultures inoculated with human faecal slurry. The different sized oat flakes, Oat 23's (0.53-0.63 mm) and Oat 25's/26's (0.85-1.0 mm), were compared to oligofructose, a confirmed prebiotic, and cellulose, a poorly fermented carbohydrate. Bacterial enumeration was carried out using the culture-independent technique fluorescent in situ hybridisation, and short chain fatty acid (SCFA) production was monitored by gas chromatography. Significant changes in total bacterial populations were observed after 24 h incubation for all substrates except Oat 23's and cellulose. Oat 23's fermentation resulted in a significant increase in the Bacteroides-Prevotella group. Oligofructose and Oat 25's/26's produced significant increases in Bifidobacterium in the latter stages of fermentation, while numbers declined for Oat 23's between 5 h and 24 h. This is possibly due to the smaller surface area of the larger flakes inhibiting the simulated digestion, which may have resulted in increased levels of resistant starch (Bifidobacterium are known to ferment this dietary fibre). Fermentation of Oat 25's/26's resulted in a propionate-rich SCFA profile and a significant increase in butyrate, both of which have been linked to benefiting host health. The smaller sized oats did not produce a significant increase in butyrate concentration.
This study shows for the first time the impact of oat grain on the microbial ecology of the human gut and its potential to beneficially modulate the gut microbiota through increasing the Bifidobacterium population.
Abstract:
A sparse kernel density estimator is derived based on the zero-norm constraint, in which the zero-norm of the kernel weights is incorporated to enhance model sparsity. The classical Parzen window estimate is adopted as the desired response for density estimation, and an approximate function of the zero-norm is used for achieving mathematical tractability and algorithmic efficiency. Under the mild condition of a positive definite design matrix, the kernel weights of the proposed density estimator based on the zero-norm approximation can be obtained using the multiplicative nonnegative quadratic programming algorithm. Using the D-optimality based selection algorithm as the preprocessing to select a small significant subset design matrix, the proposed zero-norm based approach offers an effective means for constructing very sparse kernel density estimates with excellent generalisation performance.
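The multiplicative nonnegative quadratic programming step can be sketched in a few lines. The following is a simplified version, not the paper's exact update rule: it minimises 0.5 w'Bw - c'w over the probability simplex by a multiplicative update followed by renormalisation, assuming B is nonnegative with Bw > 0 and c > 0 (as with Gaussian kernel Gram matrices).

```python
# Simplified multiplicative nonnegative quadratic programming (MNQP)
# sketch. Assumes B nonnegative with (Bw)_i > 0 and c positive, so the
# multiplicative update keeps every weight nonnegative; renormalisation
# enforces the unity (sum-to-one) constraint. Weights driven to zero
# stay at zero, which is how the update can shrink the model further.

def mnqp(B, c, w, iters=100):
    n = len(w)
    for _ in range(iters):
        Bw = [sum(B[i][j] * w[j] for j in range(n)) for i in range(n)]
        w = [w[i] * c[i] / Bw[i] for i in range(n)]
        s = sum(w)
        w = [wi / s for wi in w]
    return w
```

On the toy problem B = diag(2, 1), c = (1, 1), the update converges to w = (1/3, 2/3), which is indeed the simplex-constrained minimiser of 0.5 w'Bw - c'w.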
Abstract:
The fermentability of rice bran (RB), alone or in combination with one of two probiotics, by canine faecal microbiota was evaluated in stirred, pH-controlled, anaerobic batch cultures. RB enhanced the levels of bacteria detected by probes Bif164 (bifidobacteria) and Lab158 (lactic acid bacteria); however, addition of the probiotics did not have a significant effect on the predominant microbial counts compared with RB alone. RB sustained levels of Bifidobacterium longum 05 throughout the fermentation; in contrast, Lactobacillus acidophilus 14 150B levels decreased significantly after 5-h fermentation. RB fermentation induced changes in the short-chain fatty acid (SCFA) profile. However, RB combined with probiotics did not alter the SCFA levels compared with RB alone. Denaturing gradient gel electrophoresis analysis of samples obtained at 24 h showed a treatment effect with RB, which was not observed in the RB plus probiotic systems. Overall, the negative controls displayed lower species richness than the treatment systems and their banding profiles were distinct. This study illustrates the ability of a common ingredient found in pet food to modulate the canine faecal microbiota and highlights that RB may be an economical alternative to prebiotics for use in dog food.