399 results for Processing methods


Relevance: 70.00%

Abstract:

This thesis documents an energy audit and long-term study of energy and water reduction in a ghee factory. Global production of ghee exceeds 4 million tonnes annually. The factory in this study refines dairy products by non-traditional centrifugal separation and produces 99.9% pure, canned, crystallised anhydrous milk fat (ghee). Ghee is traditionally made by batch processing methods, which are less efficient than centrifugal separation. An in-depth, systematic investigation was conducted of each item of major equipment: the ammonia refrigeration plant, steam boiler, canning equipment, pumps, heat exchangers and compressed air system were all fine-tuned. Continuous monitoring of electrical usage showed that not every initiative worked; others had payback periods of less than a year. In 1994-95 energy consumption was 6,582 GJ; in 2003-04 it was 5,552 GJ, down 16% for a similar output. A significant reduction in water usage was achieved by reducing the airflow in the refrigeration evaporative condensers to match the refrigeration load. Water usage fell 68%, from 18 ML in 1994-95 to 5.78 ML in 2003-04. The methods reported in this thesis could be applied to other ghee manufacturers and to other industries with similar equipment.
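
As a quick arithmetic check of the reported reductions (a minimal sketch; the figures come from the abstract above, and the helper function is ours):

```python
def percent_reduction(before: float, after: float) -> float:
    """Percentage reduction from a baseline value."""
    return (before - after) / before * 100

# Energy: 6,582 GJ (1994-95) -> 5,552 GJ (2003-04)
print(f"Energy: {percent_reduction(6582, 5552):.1f}%")  # ~15.6%, i.e. ~16%
# Water: 18 ML (1994-95) -> 5.78 ML (2003-04)
print(f"Water:  {percent_reduction(18, 5.78):.1f}%")    # ~67.9%, i.e. ~68%
```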

Relevance: 70.00%

Abstract:

In South and Southeast Asia, postharvest losses cause material waste of up to 66% in fruits and vegetables, 30% in oilseeds and pulses, and 49% in roots and tubers. The efficiency of postharvest equipment directly affects industrial-scale food production. To enhance current processing methods and devices, it is essential to analyze the responses of food materials under loading operations. Food materials undergo different types of mechanical loading during postharvest and processing stages. It is therefore important to determine the properties of these materials under different types of loads, such as tensile, compression and indentation. This study presents a comprehensive analysis of the available literature on the tensile properties of different food samples. The aim of this review was to categorize the available methods of tensile testing for agricultural crops and food materials, and to identify an appropriate sample size and tensile test method. The results were then applied to perform tensile tests on pumpkin flesh and peel samples, in particular on arc-sided samples at a constant loading rate of 20 mm min⁻¹. The results showed the maximum tensile stress of the pumpkin flesh and peel samples to be 0.535 and 1.45 MPa, respectively. The elastic modulus of the flesh and peel samples was 6.82 and 25.2 MPa, respectively, while the failure modulus values were 14.51 and 30.88 MPa, respectively. The results of the tensile tests were also used to develop a finite element model of mechanical peeling of tough-skinned vegetables. However, further investigation is needed into the effects of deformation rate, moisture content and tissue texture on the tensile responses of food materials.
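
For illustration, the quantities reported above are typically derived from force-displacement data as follows (a minimal sketch under assumed sample dimensions; none of the numbers below are from the study):

```python
import numpy as np

# Hypothetical force-displacement data from a tensile test (N, mm)
force = np.array([0.0, 2.1, 4.0, 5.9, 7.8, 9.5])
displacement = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])

area_mm2 = 25.0   # assumed cross-sectional area of the arc-sided sample
gauge_mm = 40.0   # assumed gauge length

stress = force / area_mm2          # MPa (N/mm^2)
strain = displacement / gauge_mm   # dimensionless

# Elastic modulus: slope of the initial (linear) region of the stress-strain curve
E = np.polyfit(strain[:4], stress[:4], 1)[0]
print(f"Max tensile stress: {stress.max():.3f} MPa, elastic modulus: {E:.1f} MPa")
```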

Relevance: 60.00%

Abstract:

As part of vital infrastructure and transportation networks, bridge structures must function safely at all times. However, due to heavier and faster-moving vehicular loads and changes in function, such as Busway accommodation, many bridges now operate at overloads beyond their design capacity. Additionally, the huge renovation and replacement costs are difficult for infrastructure owners to undertake. Structural health monitoring (SHM) aims to assess the condition of designated bridges and foresee probable failures. The SHM systems proposed recently incorporate vibration-based damage detection (VBDD) techniques, statistical methods and signal processing techniques, and have been regarded as efficient and economical ways to solve the problem. Recent developments in damage detection and condition assessment techniques based on VBDD and statistical methods are reviewed. The VBDD methods discussed here are based on changes, before and after damage, in natural frequencies, curvature/strain modes, modal strain energy (MSE), dynamic flexibility and artificial neural networks (ANN), together with other signal processing methods such as wavelet techniques and empirical mode decomposition (EMD) / Hilbert spectrum methods.
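
As an illustration of the simplest VBDD idea, tracking shifts in natural frequencies, a minimal sketch follows (a toy example of ours, not code from the review): it compares the dominant spectral peak of a baseline acceleration record with that of a later one.

```python
import numpy as np

def natural_frequency(signal: np.ndarray, fs: float) -> float:
    """Estimate the dominant natural frequency of an acceleration record via FFT."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

fs = 200.0                     # sampling rate (Hz), assumed
t = np.arange(0, 20, 1 / fs)
baseline = np.sin(2 * np.pi * 4.00 * t) + 0.1 * np.random.randn(t.size)
current  = np.sin(2 * np.pi * 3.85 * t) + 0.1 * np.random.randn(t.size)  # stiffness loss lowers the mode

shift = natural_frequency(baseline, fs) - natural_frequency(current, fs)
print(f"Frequency drop: {shift:.2f} Hz")  # a sustained drop can flag possible damage
```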

Relevance: 60.00%

Abstract:

The high morbidity and mortality associated with atherosclerotic coronary vascular disease (CVD) and its complications are being lessened by increased knowledge of risk factors, effective preventative measures and proven therapeutic interventions. However, significant CVD morbidity remains, and sudden cardiac death continues to be a presenting feature for some patients subsequently diagnosed with CVD. Coronary vascular disease is also the leading cause of anaesthesia-related complications. Stress electrocardiography/exercise testing is predictive of 10-year risk of CVD events, and the cardiovascular variables used to score this test are monitored peri-operatively. Similar physiological time-series datasets are being subjected to data mining methods for the prediction of medical diagnoses and outcomes. This study aims to find predictors of CVD using anaesthesia time-series data and patient risk factor data.

Several pre-processing and predictive data mining methods are applied to these data. Physiological time-series data related to anaesthetic procedures are subjected to pre-processing methods for removal of outliers and calculation of moving averages, as well as data summarisation and data abstraction methods. Feature selection methods of both wrapper and filter types are applied to derived physiological time-series variable sets alone and to the same variables combined with risk factor variables. The ability of these methods to identify subsets of highly correlated but non-redundant variables is assessed. The major dataset is derived from the entire anaesthesia population, and subsets of this population are considered to be at increased anaesthesia risk based on their need for more intensive monitoring (invasive haemodynamic monitoring and additional ECG leads). Because of the unbalanced class distribution in the data, majority-class under-sampling and the Kappa statistic, together with misclassification rate and area under the ROC curve (AUC), are used for evaluation of models generated using different prediction algorithms.

The performance of models derived from feature-reduced datasets reveals the filter method, Cfs subset evaluation, to be most consistently effective, although Consistency-derived subsets tended towards slightly increased accuracy but markedly increased complexity. The use of misclassification rate (MR) for model performance evaluation is influenced by class distribution; this could be addressed by consideration of the AUC or Kappa statistic, as well as by evaluation of subsets with an under-sampled majority class. The noise and outlier removal pre-processing methods produced models with MR ranging from 10.69 to 12.62, with the lowest value being for data from which both outliers and noise were removed (MR 10.69). For the raw time-series dataset, MR is 12.34. Feature selection reduces MR to between 9.8 and 10.16, with the time-segmented summary data (dataset F) at MR 9.8 and the raw time-series summary data (dataset A) at 9.92. However, for all datasets based on time-series data alone, the complexity is high. For most pre-processing methods, Cfs could identify a subset of correlated and non-redundant variables from the time-series-only datasets, but models derived from these subsets are of one leaf only. MR values are consistent with class distribution in the subset folds evaluated in the n-fold cross-validation method.
For models based on Cfs-selected time-series-derived and risk factor (RF) variables, MR ranges from 8.83 to 10.36, with dataset RF_A (raw time-series data and RF) at 8.85 and dataset RF_F (time-segmented time-series variables and RF) at 9.09. The models based on counts of outliers and counts of data points outside the normal range (dataset RF_E), and on variables derived from time series transformed using Symbolic Aggregate Approximation (SAX) with associated time-series pattern cluster membership (dataset RF_G), perform the least well, with MR of 10.25 and 10.36 respectively. For coronary vascular disease prediction, nearest neighbour (NNge) and the support vector machine based method, SMO, have the highest MR, at 10.1 and 10.28, while logistic regression (LR) and the decision tree (DT) method, J48, have MR of 8.85 and 9.0 respectively. DT rules are the most comprehensible and clinically relevant.

The increase in predictive accuracy achieved by adding risk factor variables to time-series-based models is significant. The addition of time-series-derived variables to models based on risk factor variables alone is associated with a trend towards improved performance. Data mining of feature-reduced anaesthesia time-series variables together with risk factor variables can produce compact and moderately accurate models able to predict coronary vascular disease. Decision tree analysis of time-series data combined with risk factor variables yields rules which are more accurate than models based on time-series data alone. The limited additional value provided by electrocardiographic variables, compared with the use of risk factors alone, mirrors recent suggestions that exercise electrocardiography (exECG) under standardised conditions has limited additional diagnostic value over risk factor analysis and symptom pattern. The pre-processing used in this study had limited effect when time-series variables and risk factor variables were used as model input. In the absence of risk factor input, the use of time-series variables after outlier removal, and of time-series variables based on physiological values falling outside the accepted normal range, is associated with some improvement in model performance.
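
A minimal sketch of the style of evaluation pipeline described above: majority-class under-sampling followed by assessment with the Kappa statistic, AUC and misclassification rate. This uses scikit-learn on synthetic data purely for illustration; the study itself used algorithms associated with the Weka toolkit (J48, SMO, NNge).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import cohen_kappa_score, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic, unbalanced stand-in for the anaesthesia dataset
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)

# Majority-class under-sampling: keep all positives, match the number of negatives
rng = np.random.default_rng(0)
pos = np.flatnonzero(y == 1)
neg = rng.choice(np.flatnonzero(y == 0), size=pos.size, replace=False)
idx = np.concatenate([pos, neg])
X_bal, y_bal = X[idx], y[idx]

X_tr, X_te, y_tr, y_te = train_test_split(X_bal, y_bal, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

pred = model.predict(X_te)
prob = model.predict_proba(X_te)[:, 1]
print(f"Kappa: {cohen_kappa_score(y_te, pred):.2f}")
print(f"AUC:   {roc_auc_score(y_te, prob):.2f}")
print(f"MR:    {(pred != y_te).mean() * 100:.1f}%")  # misclassification rate
```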

Relevance: 60.00%

Abstract:

Structural health is a vital aspect of infrastructure sustainability. As part of a vital infrastructure and transportation network, bridge structures must function safely at all times. However, due to heavier and faster-moving vehicular loads and changes in function, such as Busway accommodation, many bridges now operate at overloads beyond their design capacity. Additionally, the huge renovation and replacement costs are a difficult burden for infrastructure owners. The structural health monitoring (SHM) systems proposed recently incorporate vibration-based damage detection techniques, statistical methods and signal processing techniques, and have been regarded as efficient and economical ways to assess bridge condition and foresee probable costly failures. In this chapter, recent developments in damage detection and condition assessment techniques based on vibration-based damage detection and statistical methods are reviewed. The vibration-based damage detection methods discussed are based on changes, before and after damage, in natural frequencies, curvature or strain modes, modal strain energy, dynamic flexibility and artificial neural networks, together with other signal processing methods such as wavelet techniques, empirical mode decomposition and Hilbert spectrum methods.
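
To complement the frequency-shift sketch above, the following toy example illustrates the wavelet side of these techniques, using PyWavelets to decompose an acceleration record into detail bands in which transient, potentially damage-related features are easier to localise (assumed parameters; not the chapter's method):

```python
import numpy as np
import pywt

fs = 200.0
t = np.arange(0, 10, 1 / fs)
# Simulated bridge acceleration: a 4 Hz mode plus a brief transient "event" at t = 5 s
signal = np.sin(2 * np.pi * 4 * t) + 0.05 * np.random.randn(t.size)
signal[t > 5] += 0.5 * np.exp(-(t[t > 5] - 5) * 10) * np.sin(2 * np.pi * 30 * t[t > 5])

# Multi-level discrete wavelet decomposition; the detail bands tend to
# localise transients that a broadband FFT smears across all frequencies.
coeffs = pywt.wavedec(signal, "db4", level=4)
for band, detail in enumerate(coeffs[1:], start=1):
    print(f"detail band {band}: max |coeff| = {np.abs(detail).max():.3f}")
```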

Relevance: 60.00%

Abstract:

A well-engineered scaffold for regenerative medicine, which is suitable to be translated from the bench to the bedside, combines inspired design, technical innovation and precise craftsmanship. Electrospinning and additive manufacturing are separate approaches to manufacturing scaffolds for a variety of tissue engineering applications. A need to accurately control the spatial distribution of pores within scaffolds has recently resulted in combining the two processing methods, to overcome shortfalls in each technology. This review describes where electrospinning and additive manufacturing are used together to generate new porous structures for biological applications.

Relevance: 60.00%

Abstract:

Monitoring fetal wellbeing is a compelling problem in modern obstetrics. Clinicians have become increasingly aware of the link between fetal activity (movement), wellbeing and later developmental outcome. We have recently developed an ambulatory accelerometer-based fetal activity monitor (AFAM) to record 24-hour fetal movement. Using this system, we aim to develop signal processing methods to automatically detect and quantitatively characterize fetal movements. The first step in this direction is to test the performance of the accelerometer in detecting fetal movement against real-time ultrasound imaging (taken as the gold standard). This paper reports the first results of this performance analysis.
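
One plausible detection scheme, band-pass filtering the accelerometer signal and thresholding its short-time energy, is sketched below. This is purely illustrative; the AFAM's actual algorithm, sampling rate and frequency band are not specified in the abstract.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0  # assumed accelerometer sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)
# Simulated recording: low-level noise with a burst of "fetal movement" at 20-22 s
acc = 0.01 * np.random.randn(t.size)
acc[(t > 20) & (t < 22)] += 0.1 * np.sin(2 * np.pi * 8 * t[(t > 20) & (t < 22)])

# Band-pass to the band where movements are expected (1-20 Hz, assumed)
b, a = butter(4, [1, 20], btype="band", fs=fs)
filtered = filtfilt(b, a, acc)

# Short-time energy in 1-second windows; flag windows well above the average
win = int(fs)
energy = np.array([np.sum(filtered[i:i + win] ** 2)
                   for i in range(0, filtered.size - win, win)])
threshold = energy.mean() + 3 * energy.std()
print("movement detected in windows:", np.flatnonzero(energy > threshold))
```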

Relevance: 60.00%

Abstract:

Cells respond to various biochemical and physical cues during wound-healing and tumour progression. In vitro assays used to study these processes are typically conducted in one particular geometry, and it is unclear how the assay geometry affects the capacity of cell populations to spread, or whether the relevant mechanisms, such as cell motility and cell proliferation, are somehow sensitive to the geometry of the assay. In this work we use a circular barrier assay to characterise the spreading of cell populations in two different geometries. Assay 1 describes a tumour-like geometry where a cell population spreads outwards into an open space. Assay 2 describes a wound-like geometry where a cell population spreads inwards to close a void. We use a combination of discrete and continuum mathematical models and automated image processing methods to obtain independent estimates of the effective cell diffusivity, D, and the effective cell proliferation rate, λ. Using our parameterised mathematical model we confirm that our estimates of D and λ accurately predict the time-evolution of the location of the leading edge and the cell density profiles for both assay 1 and assay 2. Our work suggests that the effective cell diffusivity is up to 50% lower for assay 2 compared to assay 1, whereas the effective cell proliferation rate is up to 30% lower for assay 2 compared to assay 1.
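
The continuum model commonly used for such spreading assays is the Fisher-KPP equation, dc/dt = D∇²c + λc(1 - c/K). A minimal 1D finite-difference sketch follows (our illustration; the values of D, λ and the geometry are placeholders, not the paper's estimates):

```python
import numpy as np

# Fisher-KPP: dc/dt = D * d2c/dx2 + lam * c * (1 - c/K)
D, lam, K = 500.0, 0.05, 1.0     # um^2/h, 1/h, carrying capacity (assumed values)
dx, dt = 10.0, 0.01              # um, h (dt chosen so D*dt/dx^2 < 0.5 for stability)
x = np.arange(0, 2000, dx)
c = np.where(x < 500, K, 0.0)    # initial population inside the barrier region

for _ in range(int(24 / dt)):    # simulate 24 h of spreading
    lap = (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx**2
    lap[0] = lap[-1] = 0.0       # crude zero-flux boundaries
    c = c + dt * (D * lap + lam * c * (1 - c / K))

edge = x[np.argmax(c < 0.5 * K)]  # leading-edge position at half-maximal density
print(f"Leading edge after 24 h: {edge:.0f} um")
```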

Relevance: 60.00%

Abstract:

The one-step preparation of highly anisotropic polymer semiconductor thin films directly from solution is demonstrated. The conjugated polymer poly(3-hexylthiophene) (P3HT), as well as P3HT:fullerene bulk-heterojunction blends, can be spin-coated from a mixture of the crystallizable solvent 1,3,5-trichlorobenzene (TCB) and a second carrier solvent such as chlorobenzene. Solidification is initiated by growth of macroscopic TCB spherulites, followed by epitaxial crystallization of P3HT on the TCB crystals. Subsequent sublimation of the TCB leaves behind a replica of the original TCB spherulites. Thus, highly ordered thin films are obtained, featuring square-centimeter-sized domains each composed of one spherulite-like structure. A combination of optical microscopy and polarized photoluminescence spectroscopy reveals radial alignment of the polymer backbone in the case of P3HT, whereas P3HT:fullerene blends display a tangential orientation with respect to the center of the spherulite-like structures. Moreover, grazing-incidence wide-angle X-ray scattering reveals an increased relative degree of crystallinity and a predominantly flat-on conformation of P3HT crystallites in the blend. Other processing methods such as dip-coating are also feasible and offer uniaxial orientation of the macromolecules. Finally, the applicability of this method to a variety of other semi-crystalline conjugated polymer systems is established, including other poly(3-alkylthiophene)s, two polyfluorenes, the low band-gap polymer PCPDTBT, a diketopyrrolopyrrole (DPP) small molecule, and a number of polymer:fullerene and polymer:polymer blends.

Relevance: 60.00%

Abstract:

In this paper we propose and study low-complexity algorithms for on-line estimation of hidden Markov model (HMM) parameters. The estimates approach the true model parameters as the measurement noise approaches zero, but otherwise give improved estimates, albeit with bias. On a finite data set in the high-noise case, the bias may not be significantly more severe than for a higher-complexity asymptotically optimal scheme. Our algorithms require O(N³) calculations per time instant, where N is the number of states. Previous algorithms based on earlier hidden Markov model signal processing methods, including the expectation-maximisation (EM) algorithm, require O(N⁴) calculations per time instant.
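
For context, the core recursion underlying most HMM signal processing is the forward filter, which costs O(N²) per time instant for N states; on-line parameter estimation layers further recursions on top of it, which is where the O(N³) and O(N⁴) totals above arise. A minimal sketch of the forward filter (standard textbook material, not the paper's algorithm):

```python
import numpy as np

def hmm_forward_step(alpha, A, b_t):
    """One normalised forward-filter step: O(N^2) for N states.

    alpha: current state posterior (N,), A: transition matrix (N, N),
    b_t: observation likelihoods for the new measurement (N,).
    """
    alpha = (alpha @ A) * b_t
    return alpha / alpha.sum()

# Toy 2-state example with Gaussian measurement noise
A = np.array([[0.95, 0.05], [0.10, 0.90]])
means, sigma = np.array([0.0, 1.0]), 0.5
alpha = np.array([0.5, 0.5])
for y in [0.1, 0.9, 1.1, 0.2]:
    b = np.exp(-((y - means) ** 2) / (2 * sigma**2))  # likelihood of y in each state
    alpha = hmm_forward_step(alpha, A, b)
print("state posterior:", alpha)
```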

Relevance: 60.00%

Abstract:

Over the last few years, investigations of human epigenetic profiles have identified the key elements of change to be histone modifications, stable and heritable DNA methylation, and chromatin remodeling. These factors determine gene expression levels and characterise conditions leading to disease. To extract information embedded in long DNA sequences, data mining and pattern recognition tools are widely used, but efforts to date have been limited with respect to analyzing epigenetic changes and their role as catalysts in disease onset. Useful insight, however, can be gained by investigating the associated dinucleotide distributions. The focus of this paper is to explore the frequencies of specific dinucleotides across defined regions within the human genome, and to identify new patterns linking epigenetic mechanisms and DNA content. Signal processing methods, including Fourier and wavelet transformations, are employed and the principal results are reported.
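
A minimal sketch of the first step of such an analysis: counting dinucleotide frequencies in a sequence and examining the Fourier spectrum of a CpG indicator signal for periodic structure (illustrative only; the regions and transform settings used in the paper are not reproduced here):

```python
import numpy as np
from collections import Counter

seq = "ATCGCGTACGCGATCGCGTTAACGCG"  # toy stand-in for a genomic region

# Dinucleotide frequencies over overlapping pairs
pairs = [seq[i:i + 2] for i in range(len(seq) - 1)]
freqs = Counter(pairs)
total = len(pairs)
print({dn: round(n / total, 3) for dn, n in freqs.most_common(5)})

# Indicator signal for the CG dinucleotide, then its magnitude spectrum
indicator = np.array([1.0 if p == "CG" else 0.0 for p in pairs])
spectrum = np.abs(np.fft.rfft(indicator - indicator.mean()))
print("dominant period (bp):", len(indicator) / (np.argmax(spectrum[1:]) + 1))
```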

Relevance: 40.00%

Abstract:

Several protocols for the isolation of mycobacteria from water exist, but there is no established standard method. This study compared methods of processing potable water samples for the isolation of Mycobacterium avium and Mycobacterium intracellulare, using spiked sterilized water and tap water decontaminated with 0.005% cetylpyridinium chloride (CPC). Samples were concentrated by centrifugation or filtration and inoculated onto Middlebrook 7H10 and 7H11 plates and Lowenstein-Jensen slants, and into mycobacterial growth indicator tubes with or without polymyxin, azlocillin, nalidixic acid, trimethoprim and amphotericin B. The solid media were incubated at 32°C, at 35°C, and at 35°C with CO2, and read weekly. The results suggest that filtration is a more sensitive concentration method than centrifugation for the isolation of mycobacteria from water. The addition of sodium thiosulfate may not be necessary and may reduce the yield. Middlebrook 7H10 and 7H11 were equally sensitive culture media. CPC decontamination, while effective for reducing growth of contaminants, also significantly reduces mycobacterial numbers. There was no difference between the different incubation temperatures at 3 weeks.

Relevance: 30.00%

Abstract:

This article explores two matrix methods to induce the "shades of meaning" (SoM) of a word. A matrix representation of a word is computed from a corpus of traces based on the given word. Non-negative Matrix Factorisation (NMF) and Singular Value Decomposition (SVD) each compute a set of vectors, each vector corresponding to a potential shade of meaning. The two methods were evaluated based on loss of conditional entropy with respect to two sets of manually tagged data. One set reflects concepts generally appearing in text, and the second set comprises words used for investigations into word sense disambiguation. Results show that NMF consistently outperforms SVD for inducing both the SoM of general concepts and word senses. The problem of inducing the shades of meaning of a word is subtler than word sense induction, and is hence relevant to thematic analysis of opinion, where nuances of opinion can arise.
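
A minimal sketch of the matrix set-up: build a trace-by-context-word count matrix for a target word and factor it with scikit-learn's NMF, reading each component as a candidate shade of meaning (our toy illustration; the paper's corpus-trace construction and conditional-entropy evaluation are not reproduced):

```python
import numpy as np
from sklearn.decomposition import NMF

# Toy corpus traces for the target word "bank"
traces = [
    "bank river water fishing shore",
    "bank money loan interest account",
    "bank account deposit money savings",
    "bank shore river mud water",
]
vocab = sorted({w for t in traces for w in t.split() if w != "bank"})
M = np.array([[t.split().count(w) for w in vocab] for t in traces], dtype=float)

# Each NMF component is a non-negative mix of context words,
# interpretable as one candidate shade of meaning of the target word.
model = NMF(n_components=2, init="nndsvda", random_state=0)
model.fit(M)
for k, comp in enumerate(model.components_):
    top = [vocab[i] for i in np.argsort(comp)[::-1][:3]]
    print(f"shade {k}: {top}")
```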