51 results for Image analysis method
Abstract:
Robotic and manual methods have been used to identify the proteins significantly regulated when Schizosaccharomyces pombe is exposed to oxidative stress. Differently treated S. pombe cells were lysed, labelled with CyDye™ and analysed by two-dimensional difference gel electrophoresis. Gel images were analysed off-line using the DeCyder™ image analysis software (GE Healthcare, Amersham, UK) to select significantly regulated proteins. Proteins displaying differential expression were excised robotically for manual digestion and identified by matrix-assisted laser desorption/ionisation mass spectrometry (MALDI-MS). Additionally, the same set of differentially expressed proteins was cut and digested automatically using a prototype robotic platform. Automated MALDI-MS, peak label assignment and database searching were used to identify as many proteins as possible. The results achieved by the robotic system were compared with the manual methods. The identification of all significantly altered proteins provides an annotated peroxide-stress-related proteome that can be used as a base resource against which other stress-induced proteomic changes can be compared.
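As a rough illustration of the kind of abundance-ratio screening used to select regulated proteins (the actual selection was performed in the DeCyder™ software, whose statistical criteria are not described here), a minimal fold-change filter might look like this; the function name, threshold and spot volumes are all hypothetical:

```python
import numpy as np

def select_regulated(control, treated, fold_threshold=1.5):
    """Flag proteins whose mean abundance changes by more than
    fold_threshold between conditions (hypothetical criterion;
    DeCyder applies its own statistical tests)."""
    control = np.asarray(control, dtype=float)
    treated = np.asarray(treated, dtype=float)
    log2_fc = np.log2(treated.mean(axis=1) / control.mean(axis=1))
    return np.abs(log2_fc) >= np.log2(fold_threshold), log2_fc

# Toy spot volumes: 3 proteins x 3 replicate gels per condition
control = [[100, 110, 90], [50, 55, 45], [200, 190, 210]]
treated = [[300, 310, 290], [52, 50, 48], [80, 85, 75]]
mask, log2_fc = select_regulated(control, treated)
print(mask)  # proteins 1 and 3 pass the fold-change screen
```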
Abstract:
Purpose – The purpose of this research is to determine whether new intelligent classrooms affect the behaviour of children in their new learning environments. Design/methodology/approach – A multi-method approach was used to carry out the research. Behavioural mapping was used to observe and monitor the classroom environment and analyse its usage. Two new classrooms designed by INTEGER (Intelligent and Green) in two different UK schools provided the case studies for determining whether intelligent buildings (learning environments) can enhance learning experiences. Findings – Several factors were observed in the learning environments: mobility, flexibility, use of technology and interactions. Relationships among them indicate that the new environments have a positive impact on pupils' behaviour. Practical implications – The study provides useful feedback for the Classrooms of the Future initiative, which can serve as a basis for the Schools of the Future initiative. Originality/value – The behavioural analysis method described in this study enables an evaluation of the “Schools of the Future” concept from the children's perspective. Using a real-life laboratory contributes to the education field by rethinking the classroom environment and the way of teaching.
Abstract:
Crumpets are made by heating fermented batter on a hot plate at around 230°C. The characteristic structure, dominated by vertical pores, develops rapidly: structure has developed throughout around 75% of the product height within 30 s, which is far faster than might be expected from transient heat conduction through the batter. Cooking is complete within around 3 min. Image analysis based on results from X-ray tomography shows that the voidage fraction is approximately constant and that there is continual coalescence between the larger pores throughout the product, although there is also a steady population of small bubbles trapped within the solidified batter. We report here experimental studies which shed light on some of the mechanisms responsible for this structure, together with models of key phenomena. Three aspects are discussed: the role of gas (carbon dioxide and nitrogen) nuclei in initiating structure development; convective heat transfer inside the developing pores; and the kinetics of setting the batter into an elastic solid structure. It is shown conclusively that the small bubbles of carbon dioxide resulting from the fermentation stage play a crucial role as nuclei for pore development: without these nuclei the result is not a porous structure but a solid, elastic, inedible, gelatinized product. These nuclei are also responsible for the tiny bubbles which are set in the final product. The nuclei form the source of the dominant pore structure, which is driven largely by the initially explosive release of water vapour from the batter together with the desorption of dissolved carbon dioxide. It is argued that the rapid evaporation, transport and condensation of steam within the growing pores provide an important mechanism for rapid heat transfer, as in a heat pipe, and models for this process are developed and tested.
The setting of the continuous batter phase is essential for final product quality: studies using differential scanning calorimetry, together with measurements of the kinetics of change in the visco-elastic properties of the batter, suggest that this process is driven by the kinetics of gelatinization. Unlike in many thermally driven food processes, the rates of heating are such that gelatinization kinetics cannot be neglected. The implications of these results for modelling and for the development of novel structures are discussed.
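The claim that structure develops far faster than transient conduction would allow can be checked with a back-of-envelope diffusion timescale. The thermal diffusivity and crumpet height below are assumed typical values, not measurements from the study:

```python
# Order-of-magnitude check: time for heat to conduct 75% of the way
# through the batter, using the diffusion timescale t ~ L^2 / alpha.
alpha = 1.4e-7          # m^2/s, assumed thermal diffusivity of a wet batter
height = 0.012          # m, assumed crumpet height (~12 mm)
depth = 0.75 * height   # structure is observed through ~75% of the height
t_conduction = depth ** 2 / alpha
print(f"conduction timescale ~{t_conduction:.0f} s, versus ~30 s observed")
```

Even with generous assumptions, the conduction estimate is an order of magnitude longer than the observed 30 s, which is consistent with the heat-pipe mechanism proposed in the abstract.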
Abstract:
A combined mathematical model for predicting heat penetration and microbial inactivation in a solid body heated by conduction was tested experimentally by inoculating agar cylinders with Salmonella typhimurium or Enterococcus faecium and heating them in a water bath. Regions of growth where bacteria had survived after heating were measured by image analysis and compared with model predictions. Visualisation of the regions of growth was improved by incorporating chromogenic metabolic indicators into the agar. Preliminary tests established that the model performed satisfactorily with both test organisms and with cylinders of different diameter. The model was then used in simulation studies in which the parameters D, z, inoculum size, cylinder diameter and heating temperature were systematically varied. These simulations showed that the biological variables D, z and inoculum size had a relatively small effect on the time needed to eliminate bacteria at the cylinder axis, whereas the physical variables heating temperature and cylinder diameter had a much greater effect. © 2005 Elsevier B.V. All rights reserved.
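The microbial side of such combined models is conventionally described by first-order (D-z) thermal inactivation kinetics; a minimal sketch, with illustrative parameter values rather than the paper's fitted ones:

```python
import math

def d_value(T, D_ref, T_ref, z):
    """D-value (minutes for a tenfold kill) at temperature T, from the
    standard model D(T) = D_ref * 10**((T_ref - T) / z)."""
    return D_ref * 10 ** ((T_ref - T) / z)

def log10_survivors(N0, T, t, D_ref, T_ref, z):
    """log10 survivor count after t minutes at constant temperature T."""
    return math.log10(N0) - t / d_value(T, D_ref, T_ref, z)

# Illustrative: D = 1 min at 60 degC, z = 5 degC, inoculum 1e6 CFU,
# held for 2 min at 65 degC -> 20 decimal reductions
print(log10_survivors(1e6, T=65.0, t=2.0, D_ref=1.0, T_ref=60.0, z=5.0))
```

In the real model the temperature at the cylinder axis varies with time, so the lethality term `t / D(T)` would be integrated over the conduction-heating temperature history rather than evaluated at a constant T.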
Abstract:
Bubbles impart a unique texture, chew, and mouth-feel to foods. However, little is known about the relationship between the structure of such products and consumer response in terms of mouth-feel and eating experience. The objective of this article is to investigate the sensory properties of 4 types of bubble-containing chocolates produced using different gases: carbon dioxide, nitrogen, nitrous oxide, and argon. The structure of these chocolates was characterized in terms of (1) gas hold-up, determined by density measurements, and (2) bubble size distribution, measured by image analysis of X-ray microtomograph sections. Bubble size distributions were obtained by measuring bubble volumes after reconstructing 3D images from the tomographic sections. A sensory study was undertaken by a non-expert panel of 20 panelists and their responses were analyzed using qualitative descriptive analysis (QDA). The results show that chocolates made with the 4 gases could be divided into 2 groups on the basis of bubble volume and gas hold-up: the samples produced using carbon dioxide and nitrous oxide had a distinctly higher gas hold-up and contained larger bubbles than those produced using argon and nitrogen. The sensory study also demonstrated that chocolates made with the latter gases were perceived to be harder, less aerated, slower to melt in the mouth, and lower in overall flavor intensity. These products were further found to be creamier than the chocolates made using carbon dioxide and nitrous oxide; the latter sample also showed a higher intensity of cocoa flavor.
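Gas hold-up from density measurements reduces to a one-line void-fraction calculation; the densities below are illustrative, not the paper's data:

```python
def gas_holdup(rho_aerated, rho_unaerated):
    """Void fraction implied by the density drop on aeration:
    phi = 1 - rho_aerated / rho_unaerated."""
    return 1.0 - rho_aerated / rho_unaerated

# Illustrative densities in kg/m^3 (unaerated chocolate ~1300)
print(gas_holdup(1000.0, 1300.0))  # ~0.23, i.e. ~23% gas by volume
```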
Abstract:
This paper describes the crowd image analysis challenge that forms part of the PETS 2009 workshop. The aim of this challenge is to use new or existing systems for i) crowd count and density estimation, ii) tracking of individual(s) within a crowd, and iii) detection of separate flows and specific crowd events, in a real-world environment. The dataset scenarios were filmed from multiple cameras and involve multiple actors.
Abstract:
This paper presents the results of the crowd image analysis challenge, as part of the PETS 2009 workshop. The evaluation is carried out using a selection of the metrics available in the Video Analysis and Content Extraction (VACE) program and the CLassification of Events, Activities, and Relationships (CLEAR) consortium. The evaluation highlights the strengths of the authors’ systems in areas such as precision, accuracy and robustness.
Abstract:
This paper presents the results of the crowd image analysis challenge of the Winter PETS 2009 workshop. The evaluation is carried out using a selection of the metrics developed in the Video Analysis and Content Extraction (VACE) program and the CLassification of Events, Activities, and Relationships (CLEAR) consortium [13]. The evaluation highlights the detection and tracking performance of the authors’ systems in areas such as precision, accuracy and robustness. The performance is also compared to the PETS 2009 submitted results.
Abstract:
This paper presents the results of the crowd image analysis challenge of the PETS 2010 workshop. The evaluation was carried out using a selection of the metrics developed in the Video Analysis and Content Extraction (VACE) program and the CLassification of Events, Activities, and Relationships (CLEAR) consortium. The PETS 2010 evaluation was performed using new ground truth created from each independent two-dimensional view. In addition, the performance of the submissions to PETS 2009 and Winter-PETS 2009 was evaluated and included in the results. The evaluation highlights the detection and tracking performance of the authors’ systems in areas such as precision, accuracy and robustness.
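Of the CLEAR metrics used in these evaluations, the headline tracking score is Multiple Object Tracking Accuracy (MOTA), which folds misses, false positives and identity switches into a single number:

```python
def mota(misses, false_positives, id_switches, num_gt_objects):
    """CLEAR Multiple Object Tracking Accuracy:
    MOTA = 1 - (FN + FP + IDSW) / total ground-truth objects,
    with all counts summed over every frame of the sequence."""
    return 1.0 - (misses + false_positives + id_switches) / num_gt_objects

# Toy sequence totals (illustrative counts, not results from the workshop)
print(mota(misses=120, false_positives=45, id_switches=5, num_gt_objects=1000))  # 0.83
```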
Abstract:
Many weeds occur in patches but farmers frequently spray whole fields to control the weeds in these patches. Given a geo-referenced weed map, technology exists to confine spraying to these patches. Adoption of patch spraying by arable farmers has, however, been negligible, partly due to the difficulty of constructing weed maps. Building on previous DEFRA and HGCA projects, this proposal aims to develop and evaluate a machine vision system to automate the weed mapping process. The project thereby addresses the principal technical stumbling block to widespread adoption of site-specific weed management (SSWM). The accuracy of weed identification by machine vision based on a single field survey may be inadequate to create herbicide application maps. We therefore propose to test the hypothesis that sufficiently accurate weed maps can be constructed by integrating information from geo-referenced images captured automatically at different times of the year during normal field activities. Accuracy of identification will also be increased by utilising a priori knowledge of weeds present in fields. To prove this concept, images will be captured from arable fields on two farms and processed offline to identify and map the weeds, focussing especially on black-grass, wild oats, barren brome, couch grass and cleavers. As advocated by Lutman et al. (2002), the approach uncouples the weed mapping and treatment processes and builds on the observation that patches of these weeds are quite stable in arable fields. There are three main aspects to the project. 1) Machine vision hardware. The hardware components of the system are one or more cameras connected to a single-board computer (Concurrent Solutions LLC) and interfaced with an accurate Global Positioning System (GPS) supplied by Patchwork Technology. The camera(s) will take separate measurements for each of the three primary colours of visible light (red, green and blue) in each pixel.
The basic proof of concept can be achieved in principle using a single-camera system, but in practice systems with more than one camera may need to be installed so that larger fractions of each field can be photographed. Hardware will be reviewed regularly during the project in response to feedback from other work packages and updated as required. 2) Image capture and weed identification software. The machine vision system will be attached to toolbars of farm machinery so that images can be collected during different field operations. Images will be captured at different ground speeds, in different directions and at different crop growth stages, as well as against different crop backgrounds. Having captured geo-referenced images in the field, image analysis software to identify weed species will be developed by Murray State and Reading Universities, with advice from The Arable Group. A wide range of pattern recognition techniques, in particular Bayesian networks, will be used to advance the state of the art in machine vision-based weed identification and mapping. Weed identification algorithms used by others are inadequate for this project, as we intend to correlate images collected at different growth stages. Plants grown for this purpose by Herbiseed will be used in the first instance. In addition, our image capture and analysis system will include plant characteristics such as leaf shape, size, vein structure, colour and textural pattern, some of which are not detectable by other machine vision systems or are omitted by their algorithms. From the list of features observable with our machine vision system, we will determine those that can be used to distinguish the weed species of interest. 3) Weed mapping. Geo-referenced maps of weeds in arable fields (Reading University and Syngenta) will be produced with advice from The Arable Group and Patchwork Technology.
Natural infestations will be mapped in the fields, but we will also introduce specimen plants in pots to facilitate more rigorous system evaluation and testing. Manual weed maps of the same fields will be generated by Reading University, Syngenta and Peter Lutman so that the accuracy of automated mapping can be assessed. The principal hypothesis and concept to be tested is that, by combining maps from several surveys, a weed map with acceptable accuracy for end-users can be produced. If the concept is proved and can be commercialised, systems could be retrofitted at low cost onto existing farm machinery. The outputs of the weed mapping software would then link with the precision farming options already built into many commercial sprayers, allowing their use for targeted, site-specific herbicide applications. Immediate economic benefits would, therefore, arise directly from reducing herbicide costs. SSWM will also reduce the overall pesticide load on the crop and so may reduce pesticide residues in food and drinking water, and reduce adverse impacts of pesticides on non-target species and beneficials. Farmers may even choose to leave unsprayed some non-injurious, environmentally-beneficial, low-density weed infestations. These benefits fit very well with the anticipated legislation emerging in the new EU Thematic Strategy for Pesticides, which will encourage more targeted use of pesticides and greater uptake of Integrated Crop (Pest) Management approaches, and also with the requirements of the Water Framework Directive to reduce levels of pesticides in water bodies. The greater precision of weed management offered by SSWM is therefore a key element in preparing arable farming systems for the future, where policy makers and consumers want to minimise pesticide use and the carbon footprint of farming while maintaining food production and security.
The mapping technology could also be used on organic farms to identify areas of fields needing mechanical weed control, thereby reducing both carbon footprints and damage to crops by, for example, spring tines. Objectives: (i) to develop a prototype machine vision system for automated image capture during agricultural field operations; (ii) to prove the concept that images captured by the machine vision system over a series of field operations can be processed to identify and geo-reference specific weeds in the field; (iii) to generate weed maps from the geo-referenced weed plants/patches identified in objective (ii).
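As a minimal illustration of the first step such image analysis software must perform, vegetation can be separated from soil background with the widely used excess-green index (ExG = 2g - r - b on chromaticity-normalised channels). The project itself proposes much richer features (leaf shape, vein structure, texture) and Bayesian networks, none of which are shown here, and the threshold below is illustrative:

```python
import numpy as np

def excess_green_mask(rgb, threshold=0.1):
    """Vegetation/soil mask from the excess-green index
    ExG = 2g - r - b, computed on chromaticity-normalised RGB.
    The threshold is an illustrative value, not a tuned one."""
    rgb = np.asarray(rgb, dtype=float)
    total = rgb.sum(axis=-1, keepdims=True)
    total[total == 0] = 1.0                  # avoid division by zero on black pixels
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    return (2 * g - r - b) > threshold

# Toy 2x2 image: left column green leaf pixels, right column brown soil
img = [[[40, 120, 30], [110, 80, 60]],
       [[35, 130, 25], [120, 90, 70]]]
print(excess_green_mask(img))  # leaf pixels True, soil pixels False
```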
Abstract:
A new state estimator algorithm based on a neurofuzzy network and the Kalman filter algorithm is presented. The major contribution of the paper is the recognition of a bias problem in the parameter estimation of the state-space model and the introduction of a simple, effective prefiltering method that achieves unbiased parameter estimates in the state-space model, which is then applied for state estimation using the Kalman filtering algorithm. Fundamental to this method is a simple prefiltering procedure using a nonlinear principal component analysis method based on the neurofuzzy basis set. This prefiltering can be performed without prior knowledge of the system structure. Numerical examples demonstrate the effectiveness of the new approach.
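The state-estimation stage itself is a standard linear Kalman filter; a minimal sketch is below. The paper's actual contribution, the neurofuzzy prefiltering that de-biases the model parameters, is not reproduced here, and the toy model below is purely illustrative:

```python
import numpy as np

def kalman_filter(measurements, A, C, Q, R, x0, P0):
    """Standard linear Kalman filter for the state-space model
    x[k+1] = A x[k] + w (cov Q),  y[k] = C x[k] + v (cov R)."""
    x = np.asarray(x0, dtype=float)
    P = np.asarray(P0, dtype=float)
    estimates = []
    for y in measurements:
        # Predict
        x = A @ x
        P = A @ P @ A.T + Q
        # Update with the new measurement
        S = C @ P @ C.T + R
        K = P @ C.T @ np.linalg.inv(S)
        x = x + K @ (np.atleast_1d(y) - C @ x)
        P = (np.eye(len(x)) - K @ C) @ P
        estimates.append(x.copy())
    return np.array(estimates)

# Toy usage: estimate a constant scalar state from repeated readings
A = np.array([[1.0]]); C = np.array([[1.0]])
Q = np.array([[1e-4]]); R = np.array([[0.1]])
est = kalman_filter([1.0] * 50, A, C, Q, R, x0=[0.0], P0=[[1.0]])
print(est[-1, 0])  # converges towards 1.0
```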
Abstract:
Reliable techniques for screening large numbers of plants for root traits are still being developed, but include aeroponic, hydroponic and agar plate systems. Coupled with digital cameras and image analysis software, these systems permit the rapid measurement of root numbers, length and diameter in moderate (typically <1000) numbers of plants. Usually such systems are employed with relatively small seedlings, and information is recorded in 2D. Recent developments in X-ray microtomography have facilitated 3D non-invasive measurement of small root systems grown in solid media, allowing angular distributions to be obtained in addition to numbers and length. However, because of the time taken to scan samples, only a small number can be screened (typically <10 per day, not including analysis time for the large spatial datasets generated) and, depending on sample size, limited resolution may mean that fine roots remain unresolved. Although agar plates allow differences between lines and genotypes to be discerned in young seedlings, the rank order may not be the same when the same materials are grown in solid media. For example, root length of dwarfing wheat (Triticum aestivum L.) lines grown on agar plates was increased by ~40% relative to wild-type and semi-dwarfing lines, but in a sandy loam soil under well-watered conditions it was decreased by 24-33%. Such differences in ranking suggest that significant soil environment-genotype interactions are occurring. Developments in instruments and software mean that a combination of high-throughput simple screens and more in-depth examination of root-soil interactions is becoming viable.
Abstract:
Working memory (WM) is not a unitary construct. There are distinct processes involved in encoding information, maintaining it on-line, and using it to guide responses. The anatomical configurations of these processes are more accurately analyzed as functionally connected networks than as collections of individual regions. In the current study we analyzed event-related functional magnetic resonance imaging (fMRI) data from a Sternberg Item Recognition Paradigm WM task using a multivariate analysis method that allowed the linking of functional networks to temporally separated WM epochs. The length of the delay epochs was varied to optimize isolation of the hemodynamic response (HDR) for each task epoch. All extracted functional networks displayed statistically significant sensitivity to delay length. Novel information extracted from these networks that was not apparent in the univariate analysis of these data included involvement of the hippocampus in encoding/probe, and decreases in BOLD signal in the superior temporal gyrus (STG), along with default-mode regions, during encoding/delay. The bilateral hippocampal activity during encoding/delay fits with theoretical models of WM in which memoranda held across the short term are activated long-term memory representations. The BOLD signal decreases in the STG were unexpected, and may reflect repetition suppression effects invoked by internal repetition of letter stimuli. Thus, analysis methods focusing on how network dynamics relate to experimental conditions allowed extraction of novel information not apparent in univariate analyses, and are particularly recommended for WM experiments in which task epochs cannot be randomized.
Abstract:
We present a new sparse shape modeling framework on the Laplace-Beltrami (LB) eigenfunctions. Traditionally, the LB-eigenfunctions are used as a basis for intrinsically representing surface shapes by forming a Fourier series expansion. To reduce high-frequency noise, only the first few terms are used in the expansion and the higher-frequency terms are simply thrown away. However, some lower-frequency terms may not necessarily contribute significantly to reconstructing the surfaces. Motivated by this idea, we propose to retain only the significant eigenfunctions by imposing an l1-penalty. The new sparse framework can further avoid the additional surface-based smoothing often used in the field. The proposed approach is applied to investigating the influence of age (38-79 years) and gender on amygdala and hippocampus shapes in the normal population. In addition, we show how the emotional response is related to the anatomy of the subcortical structures.
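The l1-penalised selection of basis coefficients can be sketched with a generic iterative soft-thresholding (ISTA) solver. A 1-D cosine basis stands in for the LB-eigenfunctions, which live on a surface mesh and are not computed here; everything below is an illustrative analogue, not the paper's implementation:

```python
import numpy as np

def lasso_ista(B, y, lam, n_iter=500):
    """Solve min_c 0.5*||B c - y||^2 + lam*||c||_1 by iterative
    soft-thresholding; insignificant coefficients go exactly to zero."""
    L = np.linalg.norm(B, 2) ** 2            # Lipschitz constant of the smooth part
    c = np.zeros(B.shape[1])
    for _ in range(n_iter):
        grad = B.T @ (B @ c - y)
        z = c - grad / L                     # gradient step
        c = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return c

# Toy expansion: a signal built from cosine terms 1 and 7 only
t = np.linspace(0.0, 1.0, 200)
B = np.column_stack([np.cos(k * np.pi * t) for k in range(10)])
y = 2.0 * B[:, 1] + 0.5 * B[:, 7]
c = lasso_ista(B, y, lam=1.0)
print(np.nonzero(np.abs(c) > 0.05)[0])  # the sparse fit keeps terms 1 and 7
```

Unlike truncating the expansion at a fixed frequency, the penalty keeps whichever terms carry signal, regardless of their position in the frequency ordering, which is the point the abstract makes.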