6 results for Number representation format

in Aston University Research Archive


Relevance: 90.00%

Abstract:

Existing parsers for textual model representation formats such as XMI and HUTN are unforgiving and fail upon even the smallest inconsistency between the structure and naming of metamodel elements and the contents of serialised models. In this paper, we demonstrate how a fuzzy parsing approach can transparently and automatically resolve a number of these inconsistencies, and how it can eventually turn XML into a human-readable and editable textual model representation format for particular classes of models.
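As a rough illustration of the kind of fuzzy name matching such a parser might perform (the element names, cutoff and `difflib`-based matching below are illustrative assumptions, not the paper's actual algorithm):

```python
import difflib

# Hypothetical metamodel element names -- illustrative only.
METAMODEL_ELEMENTS = ["Package", "ClassDeclaration", "Attribute", "Reference"]

def fuzzy_resolve(tag, candidates=METAMODEL_ELEMENTS, cutoff=0.6):
    """Map a serialised tag onto the closest metamodel element name,
    tolerating small naming inconsistencies; return None if nothing is close."""
    # Compare case-insensitively, then return the original spelling.
    lowered = {c.lower(): c for c in candidates}
    hits = difflib.get_close_matches(tag.lower(), lowered, n=1, cutoff=cutoff)
    return lowered[hits[0]] if hits else None

print(fuzzy_resolve("classdeclaration"))  # ClassDeclaration
print(fuzzy_resolve("Atribute"))          # Attribute
print(fuzzy_resolve("zzz"))               # None
```

A stricter parser would reject `classdeclaration` outright; the fuzzy approach instead maps it to the intended metamodel element and carries on.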

Relevance: 30.00%

Abstract:

Using methods of statistical physics, we study the average number and kernel size of general sparse random matrices over GF(q), with a given connectivity profile, in the thermodynamical limit of large matrices. We introduce a mapping of GF(q) matrices onto spin systems using the representation of the cyclic group of order q as the q-th complex roots of unity. This representation facilitates the derivation of the average kernel size of random matrices using the replica approach, under the replica symmetric ansatz, resulting in saddle point equations for general connectivity distributions. Numerical solutions are then obtained for particular cases by population dynamics. Similar techniques also allow us to obtain an expression for the exact and average number of random matrices for any general connectivity profile. We present numerical results for particular distributions.
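The mapping mentioned above is presumably of the standard form (the notation here is assumed, not taken from the paper): each element of GF(q), for q prime, is represented by a complex spin via the q-th roots of unity, which turns linear constraints over GF(q) into sums amenable to the replica analysis:

```latex
\sigma(a) = \omega^{a}, \qquad \omega = e^{2\pi i / q}, \qquad a \in \{0, 1, \dots, q-1\},
```

so that the indicator that a row constraint of the matrix $K$ is satisfied becomes

```latex
\mathbb{1}\!\left[\textstyle\sum_j K_{ij} x_j \equiv 0 \pmod q\right]
  = \frac{1}{q} \sum_{s=0}^{q-1} \omega^{\, s \sum_j K_{ij} x_j}.
```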

Relevance: 30.00%

Abstract:

A multi-chromosome GA (Multi-GA) was developed, based upon concepts from the natural world, allowing improved flexibility in a number of areas including representation, genetic operators, their parameter rates and real world multi-dimensional applications. A series of experiments were conducted, comparing the performance of the Multi-GA to a traditional GA on a number of recognised and increasingly complex test optimisation surfaces, with promising results. Further experiments demonstrated the Multi-GA's flexibility through the use of non-binary chromosome representations and its applicability to dynamic parameterisation. A number of alternative and new methods of dynamic parameterisation were investigated, in addition to a new non-binary 'Quotient crossover' mechanism. Finally, the Multi-GA was applied to two real world problems, demonstrating its ability to handle mixed type chromosomes within an individual, the limited use of a chromosome level fitness function, the introduction of new genetic operators for structural self-adaptation and its viability as a serious real world analysis tool. The first problem involved optimum placement of computers within a building, allowing the Multi-GA to use multiple chromosomes with different type representations and different operators in a single individual. The second problem, commonly associated with Geographical Information Systems (GIS), required a spatial analysis location of the optimum number and distribution of retail sites over two different population grids. In applying the Multi-GA, two new genetic operators (addition and deletion) were developed and explored, resulting in the definition of a mechanism for self-modification of genetic material within the Multi-GA structure and a study of this behaviour.
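A minimal sketch of what a multi-chromosome individual with per-chromosome representations and operators might look like (the chromosome names, operators and rates below are illustrative assumptions, not the thesis's actual Multi-GA design):

```python
import random

random.seed(0)

# Each chromosome carries its own representation and its own mutation
# operator -- e.g. a binary placement chromosome and a real-valued one.

def mutate_binary(genes, rate=0.1):
    # Bit-flip mutation for a binary chromosome.
    return [1 - g if random.random() < rate else g for g in genes]

def mutate_real(genes, rate=0.1, sigma=0.5):
    # Gaussian perturbation for a real-valued chromosome.
    return [g + random.gauss(0, sigma) if random.random() < rate else g
            for g in genes]

individual = {
    "placement": {"genes": [0, 1, 1, 0], "mutate": mutate_binary},
    "coords":    {"genes": [2.5, -1.0],  "mutate": mutate_real},
}

def mutate_individual(ind):
    # Apply each chromosome's own operator; the per-chromosome rates
    # could themselves be evolved (dynamic parameterisation).
    return {name: {"genes": c["mutate"](c["genes"]), "mutate": c["mutate"]}
            for name, c in ind.items()}

mutated = mutate_individual(individual)
```

The point of the structure is that crossover and mutation can be chosen per chromosome, so mixed-type individuals (as in the computer-placement problem) need no unified encoding.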

Relevance: 30.00%

Abstract:

This thesis describes a novel connectionist machine utilizing induction by a Hilbert hypercube representation. This representation offers a number of distinct advantages, which are described. We construct a theoretical and practical learning machine which lies in an area of overlap between three disciplines (neural nets, machine learning and knowledge acquisition), hence it is referred to as a "coalesced" machine. To this unifying aspect is added the various advantages of its orthogonal lattice structure as against less structured nets. We discuss the case for such a fundamental and low-level empirical learning tool, and the assumptions behind the machine are clearly outlined. Our theory of an orthogonal lattice structure, the Hilbert hypercube of an n-dimensional space, using a complemented distributive lattice as a basis for supervised learning, is derived from first principles. The resulting "subhypercube theory" was implemented in a development machine, which was then used to test the theoretical predictions under strict scientific guidelines. The scope, advantages and limitations of this machine were tested in a series of experiments. Novel and seminal properties of the machine include: the "metrical", deterministic and global nature of its search; complete convergence, invariably producing minimum polynomial solutions for both disjuncts and conjuncts even with moderate levels of noise present; a learning engine which is mathematically analysable in depth, based upon the "complexity range" of the function concerned; a strong bias towards the simplest possible globally (rather than locally) derived "balanced" explanation of the data; the ability to cope with variables in the network; and new ways of reducing the exponential explosion. Performance issues were addressed, and comparative studies with other learning machines indicate that our novel approach has definite value and should be further researched.
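For intuition only: the vertices of an n-dimensional hypercube, read as bitstrings, form exactly the kind of complemented distributive lattice with a natural (Hamming) metric that the abstract alludes to. This sketch shows those lattice operations, not the thesis's learning machinery:

```python
# Vertices of the n-cube as n-bit integers; bitwise AND/OR give the
# lattice meet/join, and the bitwise complement within n bits gives
# the lattice complement. Hamming distance is the "metrical" structure.

N = 3
MASK = (1 << N) - 1

def meet(x, y):       # greatest lower bound = bitwise AND
    return x & y

def join(x, y):       # least upper bound = bitwise OR
    return x | y

def complement(x):    # lattice complement within the n-cube
    return MASK & ~x

def hamming(x, y):    # distance between two hypercube vertices
    return bin(x ^ y).count("1")
```

Distributivity (meet over join) holds identically for these operations, which is what makes the Boolean n-cube a complemented distributive lattice.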

Relevance: 30.00%

Abstract:

Web document cluster analysis plays an important role in information retrieval by organizing large amounts of documents into a small number of meaningful clusters. Traditional web document clustering is based on the Vector Space Model (VSM), which takes into account only two-level (document and term) knowledge granularity but ignores the bridging paragraph granularity. However, this two-level granularity may lead to unsatisfactory clustering results with "false correlation". In order to deal with the problem, a Hierarchical Representation Model with Multi-granularity (HRMM), which consists of a five-layer representation of data and a two-phase clustering process, is proposed based on granular computing and article structure theory. To deal with the zero-valued similarity problem resulting from the sparse term-paragraph matrix, an ontology-based strategy and a tolerance-rough-set-based strategy are introduced into HRMM. By using granular computing, structural knowledge hidden in documents can be more efficiently and effectively captured in HRMM, and thus web document clusters with higher quality can be generated. Extensive experiments show that HRMM, HRMM with the tolerance-rough-set strategy, and HRMM with ontology all significantly outperform VSM and a representative non-VSM-based algorithm, WFP, in terms of F-Score.
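For reference, the baseline VSM that HRMM is contrasted with reduces each document to a term vector and compares documents by cosine similarity; the paragraph granularity HRMM adds is exactly what this two-level view ignores. A minimal sketch (not the paper's implementation):

```python
import math
from collections import Counter

# Document/term granularity only: each document becomes a bag of terms,
# and similarity is the cosine of the angle between term-count vectors.

def term_vector(text):
    return Counter(text.lower().split())

def cosine(u, v):
    dot = sum(u[t] * v[t] for t in u.keys() & v.keys())
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

d1 = term_vector("granular computing for web document clustering")
d2 = term_vector("clustering web documents with granular computing")
sim = cosine(d1, d2)  # shared terms drive the score; structure is invisible
```

Because only shared terms contribute, two documents can score highly even when the shared terms appear in unrelated paragraphs, which is the "false correlation" the abstract refers to.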

Relevance: 30.00%

Abstract:

Editorial: The 2015 BCLA annual conference was another fantastic affair. It was the first time the conference was held in the beautiful city of Liverpool. The venue was great and the programme was excellent. The venue overlooked the River Mersey and many of the hotels were local boutique hotels. I stayed in one which was formerly the offices of the White Star Line, where the RMS Titanic was originally registered. The hotel decor was consistent with its historic significance. The BCLA gala dinner was held in the hugely impressive Anglican Cathedral with entertainment from a Beatles tribute band. That will certainly be a hard act to follow at the next conference in 2017. Brian Tompkins took the reins as the new BCLA president. Professor Fiona Stapleton was the recipient of the BCLA Gold Medal Award. The winner of the poster competition was Dorota Szczesna-Iskander with a poster entitled 'Dry Contact lens poor wettability and visual performance'. Second place was Renee Reeder with her poster entitled 'Abnormal Rosacea as a differential diagnosis in corneal scarring'. And third place was Maria Jesus Gonzalez-Garcia with her poster entitled 'Dry Effect of the Environmental Conditions on Tear Inflammatory Mediators Concentration in Contact Lens Wearers'. The photographic competition winner was Professor Wolfgang Sickenberger from Jena in Germany. The Editorial Panel of CLAE met at the BCLA conference for their first biannual meeting. The journal metrics were discussed. In terms of the number of submissions of new papers, CLAE seems to have plateaued after seeing rapid growth in submissions over the last few years. That increase could be attributed to the fact that CLAE was awarded an impact factor for the first time in 2012. This year it seems that impact factors across nearly all ophthalmic-related journals have dropped.
This could in part be due to the fact that last year was a Research Excellence Framework (REF) year for UK universities, in which they are judged on the quality of their research output. The next REF is in 2020, so we may see changes nearing that time. Looking at article downloads, there seems to be a continued rise in figures. Currently CLAE attracts around 85,000 downloads per year (an increase of around 10,000 per year over the last few years) and the prediction for 2015 is 120,000! With this in mind, and with other contributing factors too, the BCLA has decided to move to online delivery of CLAE to its members starting from issue 5 of 2015. Some members do like to flick through the pages of a hard copy of the journal, so members will still have the option of receiving a hard copy through the post, but the default journal delivery method will now be online. The BCLA office will send various alerts and content details to members' email addresses. To access CLAE online you will need to log in via the BCLA web page: currently you click on 'Resources' and then, under 'Free and Discounted Publications', you will see CLAE. This actually takes you to CLAE's own webpage (www.contactlensjournal.com), but you need to log in via the BCLA web page. The BCLA plans to change these weblinks so that from the BCLA web page you can link to the journal website much more easily, with the choice of going directly to the general website for CLAE or straight to the current issue. In 2016 you will see an even easier way of accessing CLAE online, as the BCLA will launch a CLAE application for mobile devices where the journal can be downloaded as a 'flick-book'. This is a great way of bringing CLAE into the modern era, where people access their information in newer ways.
For many, the BCLA conference was part of a very busy conference week, as it was preceded by the International Association of Contact Lens Educators' (IACLE) Third World Congress, held in Manchester on the 4 days before the BCLA conference. The first and second IACLE World Congresses were held in Waterloo, Canada in 1994 and 2000 respectively, hosted by Professor Des Fonn. Professor Fonn was the recipient of the first ever IACLE lifetime achievement award. The Third IACLE World Congress saw more than 100 contact lens educators and industry representatives from around 30 countries gather in the UK for the four-day event, hosted by The University of Manchester. Delegates gained hands-on experience of innovations in teaching, such as learning delivery systems, the use of iPads in the classroom and for creating ePub content, and augmented and virtual reality technologies. IACLE members around the world also took part via a live online broadcast. The Third IACLE World Congress was made possible by the generous support of sponsors Alcon, CooperVision and Johnson & Johnson Vision Care. For more information, see the IACLE web page (www.iacle.org).