25 results for Hilbert schemes of points Poincaré polynomial Betti numbers Goettsche formula
in Aston University Research Archive
Abstract:
This thesis describes a novel connectionist machine utilizing induction by a Hilbert hypercube representation. This representation offers a number of distinct advantages, which are described. We construct a theoretical and practical learning machine which lies in an area of overlap between three disciplines (neural nets, machine learning and knowledge acquisition), hence it is referred to as a "coalesced" machine. To this unifying aspect are added the various advantages of its orthogonal lattice structure over less structured nets. We discuss the case for such a fundamental, low-level empirical learning tool, and the assumptions behind the machine are clearly outlined. Our theory of an orthogonal lattice structure, the Hilbert hypercube of an n-dimensional space using a complemented distributive lattice as a basis for supervised learning, is derived from first principles. The resulting "subhypercube theory" was implemented in a development machine, which was then used to test the theoretical predictions, again under strict scientific guidelines. The scope, advantages and limitations of this machine were tested in a series of experiments. Novel and seminal properties of the machine include: the "metrical", deterministic and global nature of its search; complete convergence, invariably producing minimum polynomial solutions for both disjuncts and conjuncts even with moderate levels of noise present; a learning engine which is mathematically analysable in depth, based upon the "complexity range" of the function concerned; a strong bias towards the simplest possible globally (rather than locally) derived "balanced" explanation of the data; the ability to cope with variables in the network; and new ways of reducing the exponential explosion. Performance issues were addressed, and comparative studies with other learning machines indicate that our novel approach has definite value and should be researched further.
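The abstract leaves the learning algorithm itself unstated. As a hedged illustration of what a "subhypercube" solution might look like, the following sketch treats subhypercubes as faces of the Boolean n-cube (conjunctive terms with "don't care" positions) and greedily covers the positive examples; all names are illustrative, and this is not the thesis's actual machine:

```python
# Hypothetical sketch: training examples are vertices of the Boolean
# n-cube, and a "subhypercube" is a pattern over {'0', '1', '*'}
# ('*' = free dimension, i.e. a face of the cube / a conjunctive term).
# Greedily widen a cube around each positive vertex while it still
# excludes every negative, then keep a cover of the positives.

def covers(cube, vertex):
    """True if the vertex (a bit string) lies inside the subhypercube."""
    return all(c == '*' or c == v for c, v in zip(cube, vertex))

def grow_cube(seed, negatives):
    """Free one dimension at a time while no negative vertex is covered."""
    cube = list(seed)
    for i in range(len(cube)):
        saved, cube[i] = cube[i], '*'
        if any(covers(cube, neg) for neg in negatives):
            cube[i] = saved  # widening this dimension would admit a negative
    return ''.join(cube)

def subhypercube_cover(positives, negatives):
    """A set of widened cubes whose union contains every positive vertex."""
    cover = []
    for p in positives:
        if not any(covers(c, p) for c in cover):
            cover.append(grow_cube(p, negatives))
    return cover

# XOR-like data on the 3-cube: f = x1 XOR x2, with x3 irrelevant.
pos = ['010', '011', '100', '101']
neg = ['000', '001', '110', '111']
print(subhypercube_cover(pos, neg))  # ['01*', '10*']
```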
Abstract:
N-tuple recognition systems (RAMnets) are normally modeled using a small number of input lines to each RAM, because the address space grows exponentially with the number of inputs. It is impossible to implement an arbitrarily large address space as physical memory. But given modest amounts of training data, correspondingly modest numbers of bits will be set in that memory. Hash arrays can therefore be used instead of a direct implementation of the required address space. This paper describes some exploratory experiments using the hash array technique to investigate the performance of RAMnets with very large numbers of input lines. An argument is presented which concludes that performance should peak at a relatively small n-tuple size, but the experiments carried out so far contradict this. Further experiments are needed to confirm this unexpected result.
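As a hedged sketch of the technique described (not the paper's code), the exponentially large address space of each RAM can be replaced by a hash-backed set holding only the addresses actually seen in training, so the tuple size is no longer limited by physical memory; the class and parameter names here are illustrative:

```python
import random

class HashRAMnet:
    """n-tuple classifier whose RAMs are hash-backed sets of addresses."""

    def __init__(self, n_bits, tuple_size, n_tuples, classes, seed=0):
        rng = random.Random(seed)
        # Each "RAM" samples a fixed random selection of input lines.
        self.taps = [rng.sample(range(n_bits), tuple_size)
                     for _ in range(n_tuples)]
        # A set of (ram_index, address) pairs per class replaces the
        # 2**tuple_size-entry memories; only trained addresses are stored.
        self.memory = {c: set() for c in classes}

    def _addresses(self, bits):
        for i, taps in enumerate(self.taps):
            yield i, tuple(bits[t] for t in taps)

    def train(self, bits, label):
        self.memory[label].update(self._addresses(bits))

    def classify(self, bits):
        # Score = number of RAMs whose addressed location was set during
        # training; ties resolve arbitrarily via max().
        scores = {c: sum(a in mem for a in self._addresses(bits))
                  for c, mem in self.memory.items()}
        return max(scores, key=scores.get)
```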
Abstract:
We contend that powerful group studies can be conducted using magnetoencephalography (MEG), which can provide useful insights into the approximate distribution of the neural activity detected with MEG, without requiring magnetic resonance imaging (MRI) for each participant. Instead, a participant's MRI is approximated with one chosen as a best match, on the basis of the scalp surface, from a database of available MRIs. Because large inter-individual variability in sulcal and gyral patterns is an inherent source of blurring in studies using grouped functional activity, the additional error introduced by this approximation procedure has little effect on the group results: the substituted anatomy is a sufficiently close approximation to each participant's own to yield a good indication of the true distribution of the grouped neural activity. T1-weighted MRIs of 28 adults were acquired in a variety of MR systems. An artificial functional image was prepared for each person in which eight 5 × 5 × 5 mm regions of brain activation were simulated. Spatial normalisation was applied to each image using transformations calculated using SPM99 with (1) the participant's actual MRI, and (2) the best-matched MRI substituted from those of the other 27 participants. The distribution of distances between the locations of points using real and substituted MRIs had a modal value of 6 mm, with 90% of cases falling below 12.5 mm. The effects of this approach on real grouped SAM source imaging of MEG data in a verbal fluency task are also shown. The distribution of MEG activity in the estimated average response is very similar to that produced when using the real MRIs. © 2003 Wiley-Liss, Inc.
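A minimal sketch of the comparison logic, under the simplifying assumption that 4 × 4 affine matrices stand in for the SPM99 normalisation transforms: warp the same simulated activation points with the transform derived from the participant's own MRI and with that from the best-matched substitute, then summarise the displacements (the study reports a modal value of 6 mm):

```python
import numpy as np

def warp(points, affine):
    """Apply a 4x4 affine to an (N, 3) array of mm coordinates."""
    homog = np.c_[points, np.ones(len(points))]
    return (homog @ affine.T)[:, :3]

def displacement_stats(points, own_affine, substitute_affine):
    """Distances between the two warped versions of the same points."""
    d = np.linalg.norm(warp(points, own_affine)
                       - warp(points, substitute_affine), axis=1)
    return np.median(d), np.percentile(d, 90)

# Eight simulated activation sites, as in the study; the identity versus
# a slightly perturbed affine stands in for own versus substituted MRI.
rng = np.random.default_rng(0)
sites = rng.uniform(-60, 60, size=(8, 3))
own = np.eye(4)
subst = np.eye(4) + rng.normal(0, 0.02, (4, 4))
print(displacement_stats(sites, own, subst))
```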
Abstract:
The treatment of effluents produced during the manufacture of metallurgical coke is normally carried out using the activated sludge process. The efficiency of activated sludges in purifying coke oven effluent depends largely on the maintenance of species of micro-organisms which destroy thiocyanate. The composition, production, toxicity and treatment of coke oven effluent at Corby steelworks are described. A review is presented which follows the progress made towards identifying and monitoring the species of bacteria which destroy thiocyanate in biological treatment plants purifying coke oven effluents. In the present study a search for bacteria capable of destroying thiocyanate led to the isolation of a species of bacteria, identified as Pseudomonas putida, which destroyed thiocyanate in the presence of succinate; this species had not previously been reported to use thiocyanate. Washed cell suspensions of P. putida destroyed phenol and thiocyanate simultaneously, and thiocyanate destruction was not suppressed by pyridine, aniline or catechol at the highest concentrations normally encountered in coke oven effluent. The isolate has been included, as N.C.I.B. 11198, in the National Collection of Industrial Bacteria, Torry Research Station, Aberdeen. Three other isolates, identified as Achromobacter sp., Thiobacillus thioparus and T. denitrificans, were also confirmed to destroy thiocyanate. A technique has been developed for monitoring populations of different species of bacteria in activated sludges. Application of this technique to laboratory-scale and full-scale treatment plants at Corby showed that thiobacilli were usually not detected; thiobacilli were eliminated during the commissioning period of the full-scale plant. However, experiments using a laboratory-scale plant indicated that during a period of three weeks an increase in the numbers of thiobacilli might have contributed to an improvement in plant performance. Factors which might have facilitated the development of thiobacilli are discussed. Large numbers of fluorescent pseudomonads capable of using thiocyanate were sometimes detected in the laboratory-scale plant. The possibility is considered that catechol or other organic compounds in the feed-liquor might have stimulated fluorescent pseudomonads. Experiments using the laboratory-scale plant confirmed that deteriorations in the efficiency of thiocyanate destruction were sometimes caused by bulking sludges, due to the excessive growth of fungal flocs. Increased dilution of the coke oven effluent was a successful remedy to this difficulty. The optimum operating conditions recommended by the manufacturer of the full-scale activated sludge plant at Corby are assessed, and the role of bacterial monitoring in a programme of regular monitoring tests is discussed in relation to the operation of activated sludge plants treating coke oven effluents.
Using interior point algorithms for the solution of linear programs with special structural features
Abstract:
Linear Programming (LP) is a powerful decision-making tool extensively used in various economic and engineering activities. In the early stages the success of LP was mainly due to the efficiency of the simplex method. After the appearance of Karmarkar's paper, the focus of most research shifted to the field of interior point methods. The present work is concerned with investigating and efficiently implementing the latest techniques in this field, taking sparsity into account. The performance of these implementations on different classes of LP problems is reported here. The preconditioned conjugate gradient method is one of the most powerful tools for the solution of the least-squares problem present in every iteration of all interior point methods. The effect of using different preconditioners on a range of problems with various condition numbers is presented. Decomposition algorithms have been one of the main fields of research in linear programming over the last few years. After reviewing the latest decomposition techniques, three promising methods were chosen and implemented. Sparsity is again a consideration, and suggestions have been included to allow improvements when solving problems with these methods. Finally, experimental results on randomly generated data are reported and compared with an interior point method. The efficient implementation of the decomposition methods considered in this study requires the solution of quadratic subproblems. A review of recent work on algorithms for convex quadratic programming was performed. The most promising algorithms are discussed and implemented, taking sparsity into account. The performance of these algorithms on randomly generated separable and non-separable problems is also reported.
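For illustration, a minimal preconditioned conjugate gradient routine applied to the normal equations M y = b with M = A D Aᵀ, the system solved at each interior point iteration. The Jacobi (diagonal) preconditioner is just one simple choice among those whose effect the thesis compares, and the matrix names are illustrative:

```python
import numpy as np

def pcg(apply_M, b, precond, tol=1e-8, max_iter=200):
    """Preconditioned conjugate gradient for a symmetric positive
    definite operator apply_M, with preconditioner precond(r) ~ M^-1 r."""
    x = np.zeros_like(b)
    r = b - apply_M(x)
    z = precond(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Mp = apply_M(p)
        alpha = rz / (p @ Mp)
        x += alpha * p
        r -= alpha * Mp
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = precond(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Normal-equations operator for a small LP with constraint matrix A and
# positive diagonal scaling D (e.g. D = diag(x/s) in a primal-dual method).
A = np.array([[1.0, 1.0, 0.0], [0.0, 1.0, 1.0]])
D = np.diag([1.0, 2.0, 0.5])
M = A @ D @ A.T
apply_M = lambda v: M @ v
jacobi = lambda r: r / np.diag(M)  # diagonal (Jacobi) preconditioner
y = pcg(apply_M, np.array([1.0, 2.0]), jacobi)
```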
Abstract:
Oxysterols (OS), the polyoxygenated sterols, represent a class of potent regulatory molecules for important biological actions. Cytotoxicity of OS is one of the most important aspects in studies of OS bioactivities. However, studies, the structure-activity relationship (SAR) study in particular, have been hampered by the limited availability of structurally diverse OS in numbers and amounts. The aim of this project was to develop robust synthetic methods for the preparation of polyhydroxy sterols, to evaluate their cytotoxicity, and to establish structure-activity relationships. First, after syntheses and tests of a number of 7-HC analogues against cancer cell lines, we found that hydrophobicity of the side chain is essential for 7-HC's cytotoxicity, and that a limited number of hydroxyl groups and a desired configuration on the A, B rings are required for potent cytotoxicity of an OS. Then polyoxygenation of the cholesterol A, B rings was explored. A preparative method for the synthesis of four diastereomerically pure cholest-4-en-3,6-diols was developed. Epoxidation of these cholest-4-en-3,6-diols showed that an allyl group exerts an auxiliary role in producing products with the desired configuration in syntheses of the eight diastereomerically pure 4,5-epoxycholestane-3,6-diols. Reduction of the eight 4,5-epoxycholestane-3,6-diols produced all eight isomers of the cytotoxic 5α-cholestane-3β,5,6β-triol (CT) for the first time. Epoxide ring opening with protic or Lewis acids on the eight 4,5-epoxycholestane-3,6-diols was carefully studied. The results demonstrated that the combination of an acid and a solvent affected the outcome of a reaction dramatically. Acyl group participation and migration played an important role with a number of substrates under certain conditions. All eight 4,5-trans cholestane-3,4,5,6-tetrols were synthesised through manipulation of acyl participation. Furthermore, these reaction conditions were tested when a number of cholestane-3,4,5,6,7-pentols and other C3-C7 oxygenated sterols were synthesised for the first time. Introduction of an oxygenated functional group through cholest-2-ene derivatives was studied. The elimination of 3-(4-toluenesulfonate) esters showed that interaction between the existing hydroxyl or acyl groups and the reaction centre often resulted in different products. Allylic oxidation, epoxidation and epoxide ring-opening reactions were investigated with these cholest-2-enes.
Abstract:
The specific objective of the research was to evaluate proprietary audit systems. Proprietary audit systems comprise question sets containing approximately 500 questions dealing with selected aspects of health and safety management. Each question is allotted a number of points, and an organisation seeks to judge its health and safety performance by the overall score achieved in the audit. Initially it was considered that the evaluation method might involve comparing the proprietary audit scores with other methods of measuring safety performance. However, what appeared to be missing in the first instance was information that organisations could use to compare and contrast question set content against their own needs. A technique was developed using the computer database FileMaker Pro. This enables questions in an audit to be sorted into categories using a process of searching for key words. Questions that are not categorised by word searching can be identified and sorted manually. The process can be completed in 2-3 hours, which is considerably faster than manual categorisation of questions, which typically takes about 10 days. The technique was used to compare and contrast three proprietary audits: ISRS, CHASE and QSA. Differences and similarities between these audits were successfully identified. It was concluded that, in general, proprietary audits need to focus to a greater extent on identifying strengths and weaknesses in occupational health and safety management systems. To do this requires the inclusion of more probing questions which consider whether risk control measures are likely to be successful.
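A sketch of the keyword-sorting idea in ordinary code rather than FileMaker Pro: assign each audit question to the first category whose key words it mentions, and queue the remainder for manual sorting. The categories and key words below are invented for illustration:

```python
# Illustrative category dictionary; the real question sets (ISRS, CHASE,
# QSA) would use categories chosen by the analyst.
CATEGORIES = {
    "training": ["training", "induction", "competence"],
    "risk assessment": ["risk", "hazard", "assessment"],
    "emergency": ["emergency", "evacuation", "first aid"],
}

def categorise(questions):
    """Sort questions into categories by key word; the rest go to manual."""
    sorted_qs, manual = {c: [] for c in CATEGORIES}, []
    for q in questions:
        text = q.lower()
        for cat, words in CATEGORIES.items():
            if any(w in text for w in words):
                sorted_qs[cat].append(q)
                break
        else:
            manual.append(q)  # not matched: left for manual categorisation
    return sorted_qs, manual
```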
Abstract:
Previous research has shown that adults with dyslexia (AwD) are disproportionately impacted by close spacing of stimuli and increased numbers of distractors in a visual search task compared to controls [1]. Using an orientation discrimination task, the present study extended these findings to show that even in conditions where target search was not required: (i) AwD had detrimental effects of both crowding and increased numbers of distractors; (ii) AwD had more pronounced difficulty with distractor exclusion in the left visual field and (iii) measures of crowding and distractor exclusion correlated significantly with literacy measures. Furthermore, such difficulties were not accounted for by the presence of covarying symptoms of ADHD in the participant groups. These findings provide further evidence to suggest that the ability to exclude distracting stimuli likely contributes to the reported visual attention difficulties in AwD and to the aetiology of literacy difficulties. The pattern of results is consistent with weaker and asymmetric attention in AwD.
Abstract:
This paper estimates the implicit model, especially the roles of size asymmetries and firm numbers, used by the European Commission to identify mergers with coordinated effects. This subset of cases offers an opportunity to shed empirical light on the conditions where a Competition Authority believes tacit collusion is most likely to arise. We find that, for the Commission, tacit collusion is a rare phenomenon, largely confined to markets of two, more or less symmetric, players. This is consistent with recent experimental literature, but contrasts with the facts on ‘hard-core’ collusion in which firm numbers and asymmetries are often much larger.
Abstract:
During the second half of the nineteenth century, a German business community of about one hundred merchants and commercial clerks developed in Glasgow. Their trade networks extended not only to Germany but also to other world markets. The main arguments and findings of the microhistorical analysis include: numbers were significantly higher than previously assumed; endogenous recruitment based on ethnic and family ties was prevalent; migrants benefited from their migration-induced social capital (training, languages, intercultural competence) to fill a skills-gap in Britain; labour market competition at the junior career level was less pronounced than contemporaneous assessments suggested; naturalisation was taken out for purely pragmatic reasons; there was a sense of community at intra-ethnic level, but also with the local business elite. The case study is embedded into the larger context of Anglo-German economic relations and globalisation. A purely local perspective does not suffice to do justice to the wider significance of expatriate business communities in an age of economic globalisation.
Abstract:
Information and Communications Technology (ICT) is widely regarded as a key integration enabler in contemporary supply chain configurations. Furthermore, recent years have seen the vertical disintegration of supply chains as increasing numbers of manufacturers and retailers outsource significant parts of their supply chain functionality. In this environment, Third Party Logistics (3PL) providers - the majority of which are small companies - play a pivotal role. This raises important questions about the usage of ICT in this sector. However, there is a paucity of research in the field of small 3PLs with little empirical investigation into the usage of ICT by such firms. This paper presents the results of a survey on ICT systems usage in a sample of small Italian 3PLs. The results provide a technological profile of the surveyed companies, as well as an analysis of the role of ICT in customising services and of the factors influencing technology adoption.
Abstract:
Overlaying maps using a desktop GIS is often the first step of a multivariate spatial analysis. The potential of this operation has increased considerably as data sources, and Web services to manipulate them, become widely available via the Internet. Standards from the OGC enable such geospatial mashups to be seamless and user driven, involving discovery of thematic data. The user is naturally inclined to look for spatial clusters and correlation of outcomes. Using classical cluster detection scan methods to identify multivariate associations can be problematic in this context, because of a lack of control over, or knowledge about, background populations. For public health and epidemiological mapping this limiting factor can be critical, but often the focus is on spatial identification of risk factors associated with health or clinical status. In this article we point out that this association itself can ensure some control over underlying populations, and we develop an exploratory scan statistic framework for multivariate associations. Inference using statistical map methodologies can be used to test the clustered associations. The approach is illustrated with a hypothetical data example and an epidemiological study on community MRSA. Scenarios of potential use for online mashups are introduced, but full implementation is left for further research. [Figure caption: spatial entropy index HSu for the ScankOO analysis of the hypothetical dataset, using a vicinity fixed by the number of points without distinction between their labels; the size of the labels is proportional to the inverse of the index.]
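As a generic illustration of the exploratory scan idea (not the article's exact statistic), one can score each point's k-nearest-neighbour vicinity by the entropy of the label mix and judge the strongest cluster against label permutations; all function names and parameters are illustrative:

```python
import numpy as np

def knn_entropy(points, labels, k=10):
    """Shannon entropy of the label mix in each point's k-vicinity."""
    points = np.asarray(points, float)
    labels = np.asarray(labels)
    dists = np.linalg.norm(points[:, None] - points[None, :], axis=2)
    scores = []
    for i in range(len(points)):
        nbrs = labels[np.argsort(dists[i])[:k]]
        _, counts = np.unique(nbrs, return_counts=True)
        p = counts / counts.sum()
        scores.append(-(p * np.log(p)).sum())  # low = one label dominates
    return np.array(scores)

def permutation_pvalue(points, labels, n_perm=999, seed=0):
    """P-value for the most clustered vicinity under label permutation."""
    rng = np.random.default_rng(seed)
    observed = knn_entropy(points, labels).min()
    null = [knn_entropy(points, rng.permutation(labels)).min()
            for _ in range(n_perm)]
    return (1 + sum(s <= observed for s in null)) / (n_perm + 1)
```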
Abstract:
The opening up of the Chinese economy and the associated transfer of technology from abroad have been taking place at an accelerating pace. Technology is crucial to China's industrial development. It is a productive resource and has a vital role in the process of economic and social development. This article provides an overview of technology transfer into China, focusing on recent developments, and examines the macroenvironmental and microenvironmental influences which foreign enterprises must consider when making investments or technology transfer decisions. Cases of companies engaged in international technology transfer are used to illustrate the discussion on the microenvironment. To be successful, foreign investors and suppliers of technology must respond to China's industrial priorities and pursue projects that are compatible with the country's broad policy goals as well as the corporate objectives of Chinese partners. The article concludes by listing a number of points to which attention should be paid before a decision is made to transfer technology to China.
Abstract:
1. The techniques associated with regression, whether linear or non-linear, are some of the most useful statistical procedures that can be applied in clinical studies in optometry.
2. In some cases, there may be no scientific model of the relationship between X and Y that can be specified in advance, and the objective may be to provide a ‘curve of best fit’ for predictive purposes. In such cases, the fitting of a general polynomial type curve may be the best approach.
3. An investigator may have a specific model in mind that relates Y to X, and the data may provide a test of this hypothesis. Some of these curves can be reduced to a linear regression by transformation, e.g., the exponential and negative exponential decay curves.
4. In some circumstances, e.g., the asymptotic curve or logistic growth law, a more complex process of curve fitting involving non-linear estimation will be required.
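Hedged worked examples of the three fitting routes above, on synthetic data: a general polynomial (point 2), an exponential decay linearised by a log transform (point 3), and a logistic curve fitted by non-linear least squares (point 4):

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
x = np.linspace(0.1, 10, 50)

# (2) General polynomial 'curve of best fit' when no model is specified.
y_poly = 2 + 0.5 * x - 0.03 * x**2 + rng.normal(0, 0.1, x.size)
poly_coeffs = np.polyfit(x, y_poly, deg=2)  # highest degree first

# (3) Exponential decay reduced to linear regression by a log transform.
y_decay = 3.0 * np.exp(-0.4 * x) * np.exp(rng.normal(0, 0.05, x.size))
b, log_a = np.polyfit(x, np.log(y_decay), deg=1)
a = np.exp(log_a)  # model: y = a * exp(b * x), with fitted slope b < 0

# (4) Logistic growth law: no linearising transform exists, so use
# non-linear least squares (here scipy's curve_fit) with start values p0.
def logistic(x, L, k, x0):
    return L / (1 + np.exp(-k * (x - x0)))

y_logit = logistic(x, 10, 1.2, 5) + rng.normal(0, 0.2, x.size)
params, _ = curve_fit(logistic, x, y_logit, p0=(8, 1, 4))
```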
Abstract:
Purpose – This paper aims to respond to John Rossiter's call for a "Marketing measurement revolution" in the current issue of EJM, as well as providing broader comment on Rossiter's C-OAR-SE framework, and measurement practice in marketing in general.
Design/methodology/approach – The paper is purely theoretical, based on interpretation of measurement theory.
Findings – The authors find that much of Rossiter's diagnosis of the problems facing measurement practice in marketing and social science is highly relevant. However, the authors find themselves opposed to the revolution advocated by Rossiter.
Research limitations/implications – The paper presents a comment based on interpretation of measurement theory and observation of practices in marketing and social science. As such, the interpretation is itself open to disagreement.
Practical implications – There are implications for those outside academia who wish to use measures derived from academic work, as well as to derive their own measures of key marketing and other social variables.
Originality/value – This paper is one of the few to explicitly respond to the C-OAR-SE framework proposed by Rossiter, and presents a number of points critical to good measurement theory and practice which appear to remain underdeveloped in marketing and social science.