34 results for Object-oriented image analysis

in Aston University Research Archive


Relevance: 100.00%

Abstract:

In analysing manufacturing systems, for either design or operational reasons, failure to account for the potentially significant dynamics could produce invalid results. There are many analysis techniques that can be used; however, simulation is unique in its ability to assess detailed, dynamic behaviour. The use of simulation to analyse manufacturing systems would therefore seem appropriate if not essential. Many simulation software products are available, but their ease of use and scope of application vary greatly. This is illustrated at one extreme by simulators, which offer rapid but limited application, and at the other by simulation languages, which are extremely flexible but tedious to code. Given that a typical manufacturing engineer does not possess in-depth programming and simulation skills, the use of simulators over simulation languages would seem a more appropriate choice. Whilst simulators offer ease of use, their limited functionality may preclude their use in many applications. The construction of current simulators makes it difficult to amend or extend the functionality of the system to meet new challenges. Some simulators could even become obsolete as users demand modelling functionality that reflects the latest manufacturing system design and operation concepts. This thesis examines the deficiencies in current simulation tools and considers whether they can be overcome by the application of object-oriented principles. Object-oriented techniques have gained in popularity in recent years and are seen as having the potential to overcome many of the problems traditionally associated with software construction. There are a number of key concepts that are exploited in the work described in this thesis: the use of object-oriented techniques to act as a framework for abstracting engineering concepts into a simulation tool, and the ability to reuse and extend object-oriented software. It is argued that current object-oriented simulation tools are deficient and that, in designing such tools, object-oriented techniques should be used not just for the creation of individual simulation objects but for the creation of the complete software. This results in the ability to construct an easy-to-use simulator that is not limited by its initial functionality. The thesis presents the design of an object-oriented, data-driven simulator which can be freely extended. Discussion and work are focused on discrete parts manufacture. The system developed retains the ease of use typical of data-driven simulators whilst removing any limitation on its potential range of applications. Reference is given to additions made to the simulator by other developers not involved in the original software development. Particular emphasis is put on the requirements of the manufacturing engineer and the need for the engineer to carry out dynamic evaluations.
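
The thesis itself is not reproduced here, but the core idea of an object-oriented, data-driven simulator can be sketched. The Python fragment below uses hypothetical names and is not the original implementation; it shows a plant model built purely from data, with new behaviour added by subclassing a common simulation-object base class rather than by modifying the core.

```python
import heapq
import itertools

class SimulationObject:
    """Base class: any plant element (machine, buffer, conveyor, ...) extends this."""
    def __init__(self, name):
        self.name = name

    def handle(self, sim, event):
        raise NotImplementedError   # each subclass supplies its own dynamic behaviour


class Machine(SimulationObject):
    """A simple machine configured entirely from data (its cycle time)."""
    def __init__(self, name, cycle_time):
        super().__init__(name)
        self.cycle_time = cycle_time

    def handle(self, sim, event):
        if event == "start":
            sim.schedule(self.cycle_time, self, "finish")
        elif event == "finish":
            print(f"{sim.now:6.1f}  {self.name} finished a part")
            sim.schedule(0.0, self, "start")    # begin the next part immediately


class Simulator:
    """Data-driven core: plant objects are built from a plain description, then run."""
    def __init__(self, layout):
        self.now = 0.0
        self._queue = []
        self._tie = itertools.count()
        self.objects = [Machine(d["name"], d["cycle_time"]) for d in layout]

    def schedule(self, delay, obj, event):
        heapq.heappush(self._queue, (self.now + delay, next(self._tie), obj, event))

    def run(self, until):
        for obj in self.objects:
            self.schedule(0.0, obj, "start")
        while self._queue:
            time, _, obj, event = heapq.heappop(self._queue)
            if time > until:
                break
            self.now = time
            obj.handle(self, event)


# The model itself is pure data; new behaviour comes from new SimulationObject subclasses.
Simulator([{"name": "Lathe", "cycle_time": 5.0},
           {"name": "Mill",  "cycle_time": 8.0}]).run(until=30.0)
```

Extending such a simulator then amounts to adding further SimulationObject subclasses, which is the kind of open-ended extensibility the thesis argues conventional data-driven simulators lack.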

Relevance: 100.00%

Abstract:

The density of axons in the optic nerve, olfactory tract and corpus callosum was quantified in non-demented elderly subjects and in Alzheimer’s disease (AD) using an image analysis system. In each fibre tract, there was significant reduction in the density of axons in AD compared with non-demented subjects, the greatest reductions being observed in the olfactory tract and corpus callosum. Axonal loss in the optic nerve and olfactory tract was mainly of axons with smaller myelinated cross-sectional areas. In the corpus callosum, a reduction in the number of ‘thin’ and ‘thick’ fibres was observed in AD, but there was a proportionally greater loss of the ‘thick’ fibres. The data suggest significant degeneration of white matter fibre tracts in AD involving the smaller axons in the two sensory nerves and both large and small axons in the corpus callosum. Loss of axons in AD could reflect an associated white matter disorder and/or be secondary to neuronal degeneration.

Relevance: 100.00%

Abstract:

In this paper we describe a novel, extensible visualization system currently under development at Aston University. We introduce modern programming methods, such as the use of data driven programming, design patterns, and the careful definition of interfaces to allow easy extension using plug-ins, to 3D landscape visualization software. We combine this with modern developments in computer graphics, such as vertex and fragment shaders, to create an extremely flexible, extensible real-time near photorealistic visualization system. In this paper we show the design of the system and the main sub-components. We stress the role of modern programming practices and illustrate the benefits these bring to 3D visualization. © 2006 Springer-Verlag Berlin Heidelberg.
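
As a rough illustration of the plug-in style of extension described above (the class and function names below are invented and are not the Aston system's API), a small abstract interface plus a registry is enough to let new landscape layers be added without touching the rendering core.

```python
from abc import ABC, abstractmethod

class LayerPlugin(ABC):
    """Interface every visualization plug-in implements (names are hypothetical)."""
    @abstractmethod
    def load(self, scene_data: dict) -> None: ...
    @abstractmethod
    def render(self, camera: str) -> None: ...

class Registry:
    """Plug-ins register themselves; the rendering core never has to change."""
    plugins: list = []

    @classmethod
    def register(cls, plugin_cls):
        cls.plugins.append(plugin_cls)
        return plugin_cls

@Registry.register
class TerrainLayer(LayerPlugin):
    def load(self, scene_data):
        # data-driven: which height map to use comes from the scene description
        self.heights = scene_data.get("heightmap", [])
    def render(self, camera):
        print(f"drawing terrain ({len(self.heights)} samples) from {camera}")

def render_frame(scene_data, camera="default-camera"):
    for plugin_cls in Registry.plugins:     # vegetation, water, buildings... plug in here
        layer = plugin_cls()
        layer.load(scene_data)
        layer.render(camera)

render_frame({"heightmap": [0.0, 1.2, 0.8]})
```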

Relevance: 100.00%

Abstract:

Object-oriented programming is seen as a difficult skill to master. There is considerable debate about the most appropriate way to introduce novice programmers to object-oriented concepts. Is it possible to uncover what the critical aspects or features are that enhance the learning of object-oriented programming? Practitioners have differing understandings of the nature of an object-oriented program. Uncovering these different ways of understanding leads to a greater understanding of the critical aspects and their relationship to the structure of the program produced. A phenomenographic study was conducted to uncover practitioner understandings of the nature of an object-oriented program. The study identified five levels of understanding and three dimensions of variation within these levels. These levels and dimensions of variation provide a framework for fostering conceptual change with respect to the nature of an object-oriented program.

Relevance: 100.00%

Abstract:

This thesis explores translating well-written sequential programs in a subset of the Eiffel programming language - without syntactic or semantic extensions - into parallelised programs for execution on a distributed architecture. The main focus is on constructing two object-oriented models: a theoretical, self-contained model of concurrency which enables a simplified second model for implementing the compiling process. A set of principles is also presented that, if followed, maximises the potential level of parallelism.

Model of Concurrency. The concurrency model is designed to be a straightforward target onto which sequential programs can be mapped, thus making them parallel. It aids the compilation process by providing a high level of abstraction, including a useful model of parallel behaviour which enables easy incorporation of message interchange, locking, and synchronization of objects. Further, the model is sufficiently complete that a compiler can be, and has been, built for it.

Model of Compilation. The compilation model's structure is based upon an object-oriented view of grammar descriptions and capitalises on both a recursive-descent style of processing and abstract syntax trees to perform the parsing. A composite-object view with an attribute-grammar style of processing is used to extract sufficient semantic information for the parallelisation (i.e. code-generation) phase.

Programming Principles. The principles presented are based upon information hiding, sharing and containment of objects, and the dividing up of methods on the basis of a command/query division. When followed, the level of potential parallelism within the presented concurrency model is maximised. Further, these principles arise naturally from good programming practice.

Summary. In summary, this thesis shows that it is possible to compile well-written programs, written in a subset of Eiffel, into parallel programs without any syntactic additions or semantic alterations to Eiffel: no parallel primitives are added, and the parallel program is modelled to execute with semantics equivalent to the sequential version. If the programming principles are followed, a parallelised program achieves the maximum level of potential parallelisation within the concurrency model.
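
One of the programming principles, the command/query division, is easy to illustrate outside Eiffel. The Python sketch below is not taken from the thesis; it simply shows why the split matters for parallelism: queries are side-effect free and can safely overlap, while commands must be serialised on the object.

```python
from concurrent.futures import ThreadPoolExecutor

class Account:
    """Command/query division: commands change state, queries only read it."""
    def __init__(self, balance):
        self._balance = balance

    # command: mutates state and returns nothing
    def deposit(self, amount):
        self._balance += amount

    # queries: side-effect free, so several may safely run at the same time
    def balance(self):
        return self._balance

    def can_cover(self, amount):
        return self._balance >= amount


acc = Account(100)
acc.deposit(50)                           # commands are serialised on the object

with ThreadPoolExecutor() as pool:        # side-effect-free queries may overlap
    futures = [pool.submit(acc.can_cover, x) for x in (20, 120, 500)]
    print([f.result() for f in futures])  # [True, True, False]
```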

Relevance: 100.00%

Abstract:

Jackson System Development (JSD) is an operational software development method which addresses most of the software lifecycle either directly or by providing a framework into which more specialised techniques can fit. The method has two major phases: first an abstract specification is derived that is in principle executable; second the specification is implemented using a variety of transformations. The object oriented paradigm is based on data abstraction and encapsulation coupled to an inheritance architecture that is able to support software reuse. Its claims of improved programmer productivity and easier program maintenance make it an important technology to be considered for building complex software systems. The mapping of JSD specifications into procedural languages typified by Cobol, Ada, etc., involves techniques such as inversion and state vector separation to produce executable systems of acceptable performance. However, at present, no strategy exists to map JSD specifications into object oriented languages. The aim of this research is to investigate the relationship between JSD and the object oriented paradigm, and to identify and implement transformations capable of mapping JSD specifications into an object oriented language typified by Smalltalk-80. The direction which the transformational strategy follows is one whereby the concurrency of a specification is removed. Two approaches implementing inversion - an architectural transformation resulting in a simulated coroutine mechanism being generated - are described in detail. The first approach directly realises inversion by manipulating Smalltalk-80 system contexts. This is possible in Smalltalk-80 because contexts are first-class objects and are accessible to the user like any other system object. However, problems associated with this approach are expounded. The second approach realises coroutine-like behaviour in a structure called a 'followmap'. A followmap is the result of a transformation on a JSD process in which a collection of followsets is generated. Each followset represents all possible state transitions a process can undergo from the current state of the process. Followsets, together with exploitation of the class/instance mechanism for implementing state vector separation, form the basis for mapping JSD specifications into Smalltalk-80. A tool, which is also built in Smalltalk-80, supports these derived transformations and enables a user to generate Smalltalk-80 prototypes of JSD specifications.
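
The followmap is defined only informally above, so the sketch below is one plausible reading rather than the thesis's actual structure: a mapping from each process state to its followset of permissible transitions, giving the one-event-at-a-time, resumable behaviour that inversion produces. Python is used purely for illustration; the original work targets Smalltalk-80.

```python
# Hypothetical reading of a "followmap": for every state of a JSD process,
# record the followset of transitions it may take next.
followmap = {
    "awaiting-order": {("order-placed", "picking")},
    "picking":        {("item-picked", "picking"), ("pick-complete", "dispatch")},
    "dispatch":       {("despatched", "awaiting-order")},
}

def step(state, event):
    """Advance the process by one transition; reject events the followset forbids."""
    for allowed_event, next_state in followmap[state]:
        if allowed_event == event:
            return next_state
    raise ValueError(f"{event!r} is not a valid event in state {state!r}")

state = "awaiting-order"
for event in ("order-placed", "item-picked", "pick-complete", "despatched"):
    state = step(state, event)
    print(event, "->", state)
```

Each call to step consumes exactly one event and hands control back, mirroring the coroutine-like behaviour described above; in the Smalltalk-80 mapping, the class/instance mechanism then provides the state vector separation.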

Relevance: 100.00%

Abstract:

The traditional waterfall software life cycle model has several weaknesses. One problem is that a working version of a system is unavailable until a late stage in the development; any omissions and mistakes in the specification undetected until that stage can be costly to maintain. The operational approach which emphasises the construction of executable specifications can help to remedy this problem. An operational specification may be exercised to generate the behaviours of the specified system, thereby serving as a prototype to facilitate early validation of the system's functional requirements. Recent ideas have centred on using an existing operational method such as JSD in the specification phase of object-oriented development. An explicit transformation phase following specification is necessary in this approach because differences in abstractions between the two domains need to be bridged. This research explores an alternative approach of developing an operational specification method specifically for object-oriented development. By incorporating object-oriented concepts in operational specifications, the specifications have the advantage of directly facilitating implementation in an object-oriented language without requiring further significant transformations. In addition, object-oriented concepts can help the developer manage the complexity of the problem domain specification, whilst providing the user with a specification that closely reflects the real world and so the specification and its execution can be readily understood and validated. A graphical notation has been developed for the specification method which can capture the dynamic properties of an object-oriented system. A tool has also been implemented comprising an editor to facilitate the input of specifications, and an interpreter which can execute the specifications and graphically animate the behaviours of the specified systems.

Relevance: 100.00%

Abstract:

The aim of this project was to carry out a fundamental study to assess the potential of colour image analysis for use in investigations of fire-damaged concrete. This involved: (a) quantification (rather than purely visual assessment) of colour change as an indicator of the thermal history of concrete; (b) quantification of the nature and intensity of crack development as an indication of the thermal history of concrete, supporting, and in addition to, the colour change observations; (c) further understanding of changes in the physical and chemical properties of aggregate and mortar matrix after heating; and (d) an indication of the relationship between cracking and non-destructive methods of testing, e.g. UPV or Schmidt hammer. Results showed that colour image analysis could be used to quantify the colour changes found when concrete is heated. Development of red colour coincided with a significant reduction in compressive strength. Such measurements may be used to determine the thermal history of concrete by providing information regarding the temperature distribution that existed at the height of a fire. The actual colours observed depended on the types of cement and aggregate that were used to make the concrete. With some aggregates it may be more appropriate to analyse only the mortar matrix. Petrographic techniques may also be used to determine the nature and density of cracks developing at elevated temperatures, and values of crack density correlate well with measurements of residual compressive strength. Small differences in crack density were observed with different cements and aggregates, although good correlations were always found with the residual compressive strength. Taken together, these two techniques can provide further useful information for the evaluation of fire-damaged concrete. This is especially so since petrographic analysis can also provide information on the quality of the original concrete, such as cement content and water/cement ratio. Concretes made with blended cements tended to show small differences in physical and chemical properties compared to those made with unblended cements. There is some evidence to suggest that a coarsening of pore structure in blended cements may lead to the onset of cracking at lower temperatures. The use of DTA/TGA was of little value in assessing the thermal history of concrete made with blended cements. Corner spalling and sloughing off, as observed in columns, were effectively reproduced in tests on small-scale specimens and the crack distributions measured. Relationships between compressive strength/cracking and non-destructive methods of testing are discussed, and an outline procedure for site investigations of fire-damaged concrete is described.
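
The colour measurements themselves are not specified in the abstract, but the kind of quantification involved can be sketched. In the fragment below, with illustrative thresholds rather than the project's calibrated procedure, the extent of red discoloration in a mortar image is expressed as the fraction of pixels whose hue falls in a nominal red band.

```python
# Hedged sketch only: quantify red discoloration of heated concrete from an RGB
# image by counting pixels in a nominal red hue band. Thresholds are made up.
import colorsys
import numpy as np

def red_fraction(rgb_image, red_hue_max=0.06, min_saturation=0.15):
    """rgb_image: H x W x 3 array of floats in [0, 1]; returns fraction of red pixels."""
    reds = 0
    pixels = rgb_image.reshape(-1, 3)
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        if s >= min_saturation and (h <= red_hue_max or h >= 1.0 - red_hue_max):
            reds += 1
    return reds / len(pixels)

# A warm-toned test patch: most pixels should be counted as 'red'.
patch = np.full((4, 4, 3), [0.7, 0.3, 0.25])
print(f"red pixel fraction: {red_fraction(patch):.2f}")
```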

Relevance: 100.00%

Abstract:

This study considers the application of image analysis in petrography and investigates the possibilities for advancing existing techniques by introducing feature extraction and analysis capabilities of a higher level than those currently employed. The aim is to construct relevant, useful descriptions of crystal form and inter-crystal relations in polycrystalline igneous rock sections. Such descriptions cannot be derived until the 'ownership' of boundaries between adjacent crystals has been established: this is the fundamental problem of crystal boundary assignment. An analysis of this problem establishes key image features which reveal boundary ownership; a set of explicit analysis rules is presented. A petrographic image analysis scheme based on these principles is outlined and the implementation of key components of the scheme considered. An algorithm for the extraction and symbolic representation of image structural information is developed. A new multiscale analysis algorithm which produces a hierarchical description of the linear and near-linear structure on a contour is presented in detail. Novel techniques for symmetry analysis are developed. The analyses considered contribute both to the solution of the boundary assignment problem and to the construction of geologically useful descriptions of crystal form. The analysis scheme which is developed employs grouping principles such as collinearity, parallelism, symmetry and continuity, so providing a link between this study and more general work in perceptual grouping and intermediate-level computer vision. Consequently, the techniques developed in this study may be expected to find wider application beyond the petrographic domain.
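
A small example of the grouping principles mentioned, here collinearity between two contour segments, is sketched below; the test and its tolerances are illustrative only and are not the analysis rules developed in the study.

```python
# Illustrative collinearity test between two straight contour segments, the kind
# of grouping cue used when deciding which crystal a boundary fragment belongs to.
import math

def angle(seg):
    (x1, y1), (x2, y2) = seg
    return math.atan2(y2 - y1, x2 - x1)

def collinear(seg_a, seg_b, max_angle_deg=5.0, max_offset=2.0):
    """True if seg_b lies roughly on the infinite line through seg_a."""
    da = abs(angle(seg_a) - angle(seg_b)) % math.pi   # orientation, direction-free
    da = min(da, math.pi - da)
    if math.degrees(da) > max_angle_deg:
        return False
    # perpendicular distance of seg_b's midpoint from seg_a's supporting line
    (x1, y1), (x2, y2) = seg_a
    mx = (seg_b[0][0] + seg_b[1][0]) / 2
    my = (seg_b[0][1] + seg_b[1][1]) / 2
    length = math.hypot(x2 - x1, y2 - y1)
    offset = abs((x2 - x1) * (y1 - my) - (x1 - mx) * (y2 - y1)) / length
    return offset <= max_offset

print(collinear([(0, 0), (10, 0)], [(14, 0.5), (25, 1.0)]))   # True: nearly collinear
print(collinear([(0, 0), (10, 0)], [(5, 8), (15, 9)]))        # False: too far off the line
```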

Relevance: 100.00%

Abstract:

Manufacturing planning and control systems are fundamental to the successful operations of a manufacturing organisation. In order to improve their business performance, significant investment is made by companies into planning and control systems; however, not all companies realise the benefits sought. Many companies continue to suffer from high levels of inventory, shortages, obsolete parts, poor resource utilisation and poor delivery performance. This thesis argues that the fit between the planning and control system and the manufacturing organisation is a crucial element of success. The design of appropriate control systems is, therefore, important. The different approaches to the design of manufacturing planning and control systems are investigated. It is concluded that there is no provision within these design methodologies to properly assess the impact of a proposed design on the manufacturing facility. Consequently, an understanding of how a new (or modified) planning and control system will perform in the context of the complete manufacturing system is unlikely to be gained until after the system has been implemented and is running. There are many modelling techniques available; however, discrete-event simulation is unique in its ability to model the complex dynamics inherent in manufacturing systems, of which the planning and control system is an integral component. The existing application of simulation to manufacturing control system issues is limited: although operational issues are addressed, application to the more fundamental design of control systems is rarely, if at all, considered. The lack of a suitable simulation-based modelling tool does not help matters. The requirements of a simulation tool capable of modelling a host of different planning and control systems are presented. It is argued that only through the application of object-oriented principles can these extensive requirements be achieved. This thesis reports on the development of an extensible class library called WBS/Control, which is based on object-oriented principles and discrete-event simulation. The functionality, both current and future, offered by WBS/Control means that different planning and control systems can be modelled: not only the more standard implementations but also hybrid systems and new designs. The flexibility implicit in the development of WBS/Control supports its application to design and operational issues. WBS/Control wholly integrates with an existing manufacturing simulator to provide a more complete modelling environment.
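
The abstract describes WBS/Control only at the level of its design goals, so the fragment below is a generic illustration (hypothetical names, not the WBS/Control classes) of how an object-oriented control library lets a planning or dispatching policy be swapped or hybridised without changing the simulation core.

```python
# Sketch in the spirit of an extensible control-system library: the dispatching
# policy is an object that can be replaced or subclassed independently of the core.
class DispatchRule:
    def choose(self, queue):
        raise NotImplementedError

class FIFO(DispatchRule):
    def choose(self, queue):
        return min(queue, key=lambda job: job["arrived"])

class ShortestProcessingTime(DispatchRule):
    def choose(self, queue):
        return min(queue, key=lambda job: job["work"])

def next_job(queue, rule: DispatchRule):
    """The simulation core only knows the DispatchRule interface."""
    return rule.choose(queue)

jobs = [{"id": "A", "arrived": 1, "work": 9},
        {"id": "B", "arrived": 2, "work": 3}]
print(next_job(jobs, FIFO())["id"])                      # A
print(next_job(jobs, ShortestProcessingTime())["id"])    # B
```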

Relevance: 100.00%

Abstract:

Purpose - To develop a non-invasive method for quantification of blood and pigment distributions across the posterior pole of the fundus from multispectral images using a computer-generated reflectance model of the fundus. Methods - A computer model was developed to simulate light interaction with the fundus at different wavelengths. The distribution of macular pigment (MP) and retinal haemoglobins in the fundus was obtained by comparing the model predictions with multispectral image data at each pixel. Fundus images were acquired from 16 healthy subjects from various ethnic backgrounds and parametric maps showing the distribution of MP and of retinal haemoglobins throughout the posterior pole were computed. Results - The relative distributions of MP and retinal haemoglobins in the subjects were successfully derived from multispectral images acquired at wavelengths of 507, 525, 552, 585, 596, and 611 nm, provided certain conditions were met and eye movement between exposures was minimal. Recovery of other fundus pigments was not feasible, and further development of the imaging technique and refinement of the software are necessary to understand the full potential of multispectral retinal image analysis. Conclusion - The distributions of MP and retinal haemoglobins obtained in this preliminary investigation are in good agreement with published data on normal subjects. The ongoing development of the imaging system should allow for absolute parameter values to be computed. A further study will investigate subjects with known pathologies to determine the effectiveness of the method as a screening and diagnostic tool.
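
The essence of the method, comparing model-predicted with measured reflectances at each pixel, can be sketched with a toy forward model. The model below is invented for illustration and bears no relation to the physical fundus model used in the study, but the per-pixel parameter search has the same shape.

```python
# Illustrative only: a made-up forward model and a grid search that recovers one
# pixel's macular pigment (mp) and haemoglobin (hb) values from its spectrum.
import numpy as np

WAVELENGTHS = np.array([507, 525, 552, 585, 596, 611])   # nm, as in the study

def forward_model(mp, hb):
    """Hypothetical reflectance vs wavelength for given pigment concentrations."""
    w = (WAVELENGTHS - 500.0) / 120.0
    return np.exp(-mp * (1.2 - w) - hb * (0.4 + 0.8 * w))

def recover(measured, grid=np.linspace(0.0, 1.0, 51)):
    """Return (mp, hb) minimising the sum-of-squares mismatch with the model."""
    best, best_err = (0.0, 0.0), np.inf
    for mp in grid:
        for hb in grid:
            err = np.sum((forward_model(mp, hb) - measured) ** 2)
            if err < best_err:
                best, best_err = (mp, hb), err
    return best

truth = forward_model(0.30, 0.62)   # simulate one pixel's measured spectrum
print(recover(truth))               # ~ (0.30, 0.62)
```

Repeating the fit at every pixel yields parametric maps of the kind described above.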

Relevance: 100.00%

Abstract:

Objective. To determine, using an image analysis system, whether there is loss of axons in the olfactory tract (OT) in Alzheimer’s disease (AD). Design. A retrospective neuropathological study. Patients. Nine control patients and eight clinically and pathologically verified AD cases. Measurements and Results. There was a reduction in axon density in AD compared with control subjects in the central and peripheral regions of the tract. Axonal loss was mainly of axons with smaller (<2.99 µm²) myelinated cross-sectional areas. Conclusions. The data suggest significant degeneration of axons within the OT involving the smaller-sized axons. Loss of axons in the OT is likely to be secondary to pathological changes originating within the parahippocampal gyrus rather than to a pathogen spreading into the brain via the olfactory pathways.

Relevance: 100.00%

Abstract:

Areolae of the crustose lichen Rhizocarpon geographicum (L.) DC. are present on the peripheral prothallus (marginal areolae) and also aggregate to form confluent masses in the centre of the thallus (central areolae). To determine the relationships between these areolae, and whether growth of the peripheral prothallus is dependent on the marginal areolae, the density, morphology, and size frequency distributions of marginal areolae were measured in 23 thalli of R. geographicum in north Wales, UK, using image analysis (ImageJ). The size and morphology of central areolae were also studied across the thallus. Marginal areolae were small, punctate, and occurred in clusters scattered over the peripheral prothallus, while central areolae were larger and had a lobed structure. The size-class frequency distributions of the marginal and central areolae were fitted by power-law and log-normal models respectively. In 16 out of 23 thalli, central areolae close to the outer edge were larger and had a more complex lobed morphology than those towards the thallus centre. Neither mean width nor radial growth rate (RaGR) of the peripheral prothallus was correlated with density, diameter, or area fraction of marginal areolae. The data suggest central areolae may develop from marginal areolae as follows: (1) marginal areolae develop in clusters at the periphery and fuse to form central areolae, (2) central areolae grow exponentially, and (3) crowding of central areolae results in constriction and fragmentation. In addition, growth of the peripheral prothallus may be unrelated to the marginal areolae. © 2013 Springer Science+Business Media Dordrecht.
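
To make the two candidate models concrete, the sketch below fits a log-normal and a continuous power-law to a set of areas by maximum likelihood, the kind of comparison underlying the size-class frequency analysis. The data are synthetic, not the measured areola sizes.

```python
# Minimal sketch (not the paper's analysis): compare power-law and log-normal
# descriptions of areola sizes via maximum-likelihood fits.
import numpy as np

rng = np.random.default_rng(1)
areas = rng.lognormal(mean=2.0, sigma=0.6, size=400)   # stand-in size data

# Log-normal: MLE parameters are simply the mean/std of log(area).
mu, sigma = np.log(areas).mean(), np.log(areas).std()

# Power-law p(x) ~ x^-alpha for x >= xmin: standard MLE (Hill) estimator.
xmin = areas.min()
alpha = 1.0 + len(areas) / np.sum(np.log(areas / xmin))

def loglik_lognormal(x):
    return np.sum(-np.log(x * sigma * np.sqrt(2 * np.pi))
                  - (np.log(x) - mu) ** 2 / (2 * sigma ** 2))

def loglik_powerlaw(x):
    return np.sum(np.log((alpha - 1) / xmin) - alpha * np.log(x / xmin))

print(f"log-normal fit: mu={mu:.2f} sigma={sigma:.2f} logL={loglik_lognormal(areas):.1f}")
print(f"power-law fit:  alpha={alpha:.2f}           logL={loglik_powerlaw(areas):.1f}")
```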

Relevance: 100.00%

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT

Relevance: 100.00%

Abstract:

Aims: To establish the sensitivity and reliability of objective image analysis in direct comparison with subjective grading of bulbar hyperaemia. Methods: Images of the same eyes were captured with a range of bulbar hyperaemia caused by vasodilation. The progression was recorded and 45 images extracted. The images were objectively analysed on 14 occasions using previously validated edge-detection and colour-extraction techniques. They were also graded by 14 eye-care practitioners (ECPs) and 14 non-clinicians (NCLs) using the Efron scale. Six ECPs repeated the grading on three separate occasions. Results: Subjective grading was only able to differentiate images with differences in grade of 0.70-1.03 Efron units (sensitivity of 0.30-0.53), compared to 0.02-0.09 Efron units with the objective techniques (sensitivity of 0.94-0.99). Significant differences were found between ECPs, and individual repeats were also inconsistent (p<0.001). Objective analysis was 16 times more reliable than subjective analysis. The NCLs used wider ranges of the scale but were more variable than ECPs, implying that training may have an effect on grading. Conclusions: Objective analysis may offer a new gold standard in anterior ocular examination, and should be developed further as a clinical research tool to allow more highly powered analysis, and to enhance the clinical monitoring of anterior eye disease.
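
The validated colour-extraction technique is not reproduced here, but a minimal example of an objective colour measure, the mean relative redness of an RGB image, gives the flavour of what such image analysis computes; the formula and test values below are illustrative only.

```python
# Hedged sketch only (not the published technique): mean relative redness of an image.
import numpy as np

def relative_redness(rgb):
    """Mean of R / (R + G + B) over an H x W x 3 image with values in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return float(np.mean(r / np.clip(r + g + b, 1e-6, None)))

pale = np.full((32, 32, 3), [0.85, 0.75, 0.75])         # quiet-looking conjunctiva
hyperaemic = np.full((32, 32, 3), [0.85, 0.45, 0.45])   # redder appearance
print(f"{relative_redness(pale):.3f} -> {relative_redness(hyperaemic):.3f}")
```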