935 results for Algebra of differential operators


Relevance:

100.00%

Publisher:

Abstract:

The overall aim of this study was to examine experimentally the effects of noise upon short-term memory tasks in the hope of shedding further light upon the apparently inconsistent results of previous research in the area. Seven experiments are presented. The first chapter of the thesis comprised a comprehensive review of the literature on noise and human performance, while in the second chapter some theoretical questions concerning the effects of noise were considered in more detail, followed by a closer examination of the effects of noise upon memory. Chapter 3 described an experiment which examined the effects of noise on attention allocation in short-term memory as a function of list length. The results provided only weak evidence of increased selectivity in noise. In further chapters, noise effects were investigated in conjunction with various parameters of short-term memory tasks, e.g. the retention interval and presentation rate. The results suggested that noise effects were significantly affected by the length of the retention interval but not by the rate of presentation. Later chapters examined the possibility of differential noise effects on the mode of recall (recall v. recognition) and the type of presentation (sequential v. simultaneous), as well as an investigation of the effect of varying the point of introduction of the noise and the importance of individual differences in noise research. The results of this study were consistent with the hypothesis that noise at presentation facilitates phonemic coding. However, noise during recall appeared to affect the retrieval strategy adopted by the subject.

Relevance:

100.00%

Publisher:

Abstract:

A study of information available on the settlement characteristics of backfill in restored opencast coal mining sites and other similar earthworks projects has been undertaken. In addition, the methods of opencast mining, compaction controls, monitoring and test methods have been reviewed. To consider and develop the methods of predicting the settlement of fill, three sites in the West Midlands have been examined; at each, the backfill had been placed in a controlled manner. In addition, use has been made of a finite element computer program to compare a simple two-dimensional linear elastic analysis with field observations of surface settlements in the vicinity of buried highwalls. On controlled backfill sites, settlement predictions have been accurately made, based on a linear relationship between settlement (expressed as a percentage of fill height) and the logarithm of time. This 'creep' settlement was found to be effectively complete within 18 months of restoration. A decrease of this percentage settlement was observed with increasing fill thickness; this is believed to be related to the speed with which the backfill is placed. A rising water table within the backfill is indicated to cause additional gradual settlement. A prediction method, based on settlement monitoring, has been developed and used to determine the pattern of settlement across highwalls and buried highwalls. The zone of appreciable differential settlement was found to be mainly limited to the highwall area, and its magnitude was dictated by the highwall inclination. With a backfill cover of about 15 metres over a buried highwall the magnitude of differential settlement was negligible. Use has been made of the proposed settlement prediction method and monitoring to control the re-development of restored opencast sites. The specifications, tests and monitoring techniques developed in recent years have been used to aid this. Such techniques have been valuable in restoring land previously derelict due to past underground mining.
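
A minimal sketch of the prediction approach described above: fitting settlement (as a percentage of fill height) against the logarithm of time and extrapolating. The monitoring values and time points below are hypothetical illustrations, not data from the study.

    # Fit creep settlement (% of fill height) against log10(time) and extrapolate.
    # Monitoring data are hypothetical.
    import numpy as np

    months = np.array([1, 2, 3, 6, 9, 12])                           # time since restoration
    settlement_pct = np.array([0.10, 0.17, 0.21, 0.28, 0.32, 0.35])  # % of fill height

    slope, intercept = np.polyfit(np.log10(months), settlement_pct, 1)

    def predicted_settlement(t_months):
        """Creep settlement (% of fill height) predicted at time t_months."""
        return intercept + slope * np.log10(t_months)

    print(round(predicted_settlement(18), 3))    # near-final settlement at ~18 months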

Relevance:

100.00%

Publisher:

Abstract:

This thesis includes analysis of disordered spin ensembles corresponding to Exact Cover, a multi-access channel problem, and composite models combining sparse and dense interactions. The satisfiability problem in Exact Cover is addressed using a statistical analysis of a simple branch and bound algorithm. The algorithm can be formulated in the large system limit as a branching process, for which critical properties can be analysed. Far from the critical point a set of differential equations may be used to model the process, and these are solved by numerical integration and exact bounding methods. The multi-access channel problem is formulated as an equilibrium statistical physics problem for the case of bit transmission on a channel with power control and synchronisation. A sparse code division multiple access method is considered and the optimal detection properties are examined in the typical case by use of the replica method, and compared to the detection performance achieved by iterative decoding methods. These codes are found to exhibit phenomena closely resembling those of the well-understood dense codes. The composite model is introduced as an abstraction of canonical sparse and dense disordered spin models. The model includes couplings due to both dense and sparse topologies simultaneously. The new type of code is shown to outperform sparse and dense codes in some regimes, both in optimal performance and in the performance achieved by iterative detection methods in finite systems.
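
For illustration, a minimal backtracking/branch-and-bound sketch of the Exact Cover decision problem whose search tree underlies the branching-process analysis; the pruning rule and example instance are hypothetical and are not the thesis's statistical treatment.

    # Backtracking search for an exact cover: pick an uncovered element, branch on the
    # subsets containing it, and prune branches that would cover an element twice.
    def exact_cover(universe, subsets, chosen=None):
        """Return subset indices covering every element exactly once, or None."""
        if chosen is None:
            chosen = []
        if not universe:
            return chosen                                   # everything covered exactly once
        # Branch on the uncovered element contained in the fewest subsets (a simple bound).
        element = min(universe, key=lambda e: sum(e in s for s in subsets))
        for i, s in enumerate(subsets):
            if element in s and s <= universe:              # s must not re-cover anything
                result = exact_cover(universe - s, subsets, chosen + [i])
                if result is not None:
                    return result
        return None                                         # this branch is pruned

    U = {1, 2, 3, 4, 5, 6, 7}
    S = [{1, 4, 7}, {1, 4}, {4, 5, 7}, {3, 5, 6}, {2, 3, 6, 7}, {2, 7}]
    print(exact_cover(U, S))    # a valid cover, e.g. [1, 5, 3] -> {1,4}, {2,7}, {3,5,6}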

Relevance:

100.00%

Publisher:

Abstract:

When composing stock portfolios, managers frequently choose among hundreds of stocks. The stocks' risk properties are analyzed with statistical tools, and managers try to combine these to meet the investors' risk profiles. A recently developed tool for performing such optimization is called full-scale optimization (FSO). This methodology is very flexible with respect to investor preferences, but because of computational limitations it has until now been infeasible when many stocks are considered. We apply the artificial intelligence technique of differential evolution to solve FSO-type stock selection problems with 97 assets. Differential evolution finds the optimal solutions by self-learning from randomly drawn candidate solutions. We show that this search technique makes the large-scale problem computationally feasible and that the solutions retrieved are stable. The study also gives further merit to the FSO technique, as it shows that the solutions suit investor risk profiles better than portfolios retrieved from traditional methods.
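
A minimal sketch of full-scale optimization via differential evolution, assuming a hypothetical power-utility objective and simulated return scenarios; the DE parameters (population size, F, CR) and the utility function are illustrative, not those used in the study.

    # Differential evolution (rand/1/bin) over long-only portfolio weights,
    # maximizing expected utility over historical return scenarios.
    import numpy as np

    rng = np.random.default_rng(0)

    def utility(weights, scenarios, gamma=0.5):
        """Expected power utility of portfolio returns over the scenarios (hypothetical choice)."""
        port = scenarios @ weights
        return np.mean(((1.0 + port) ** gamma) / gamma)

    def normalize(w):
        """Clip to long-only weights and rescale to full investment (sum to one)."""
        w = np.clip(w, 0.0, None)
        s = w.sum()
        return w / s if s > 0 else np.full_like(w, 1.0 / len(w))

    def differential_evolution(scenarios, n_assets, pop=60, gens=500, F=0.7, CR=0.9):
        P = np.array([normalize(rng.random(n_assets)) for _ in range(pop)])
        fit = np.array([utility(w, scenarios) for w in P])
        for _ in range(gens):
            for i in range(pop):
                a, b, c = P[rng.choice(pop, 3, replace=False)]
                mutant = normalize(a + F * (b - c))            # rand/1 mutation
                cross = rng.random(n_assets) < CR              # binomial crossover
                trial = normalize(np.where(cross, mutant, P[i]))
                f = utility(trial, scenarios)
                if f > fit[i]:                                 # greedy selection
                    P[i], fit[i] = trial, f
        best = np.argmax(fit)
        return P[best], fit[best]

    # Example with simulated return scenarios for 97 assets.
    scenarios = rng.normal(0.005, 0.05, size=(250, 97))
    weights, score = differential_evolution(scenarios, n_assets=97)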

Relevance:

100.00%

Publisher:

Abstract:

We studied the effects of the composition of impregnating solution and heat treatment conditions on the activity of catalytic systems for the low-temperature oxidation of CO obtained by the impregnation of Busofit carbon-fiber cloth with aqueous solutions of palladium, copper, and iron salts. The formation of an active phase in the synthesized catalysts at different stages of their preparation was examined with the use of differential thermal and thermogravimetric analyses, X-ray diffraction analysis, X-ray photoelectron spectroscopy, and elemental spectral analysis. The catalytic system prepared by the impregnation of electrochemically treated Busofit with the solutions of PdCl₂, FeCl₃, CuBr₂, and Cu(NO₃)₂ and activated under optimum conditions ensured 100% CO conversion under a respiratory regime at both low (0.03%) and high (0.5%) carbon monoxide contents of air. It was found that the activation of a catalytic system at elevated temperatures (170-180°C) leads to the conversion of Pd(II) into Pd(I), which was predominantly localized in a near-surface layer. The promoting action of copper nitrate consists in the formation of a crystalline phase of the rhombic atacamite Cu₂Cl(OH)₃. The catalyst surface is finally formed under the conditions of a catalytic reaction, when a joint Pd(I)-Cu(I) active site is formed. © 2014 Pleiades Publishing, Ltd.

Relevance:

100.00%

Publisher:

Abstract:

Background—The molecular mechanisms underlying similarities and differences between physiological and pathological left ventricular hypertrophy (LVH) are of intense interest. Most previous work involved targeted analysis of individual signaling pathways or screening of transcriptomic profiles. We developed a network biology approach using genomic and proteomic data to study the molecular patterns that distinguish pathological and physiological LVH. Methods and Results—A network-based analysis using graph theory methods was undertaken on 127 genome-wide expression arrays of in vivo murine LVH. This revealed phenotype-specific pathological and physiological gene coexpression networks. Despite >1650 common genes in the 2 networks, network structure is significantly different. This is largely because of rewiring of genes that are differentially coexpressed in the 2 networks; this novel concept of differential wiring was further validated experimentally. Functional analysis of the rewired network revealed several distinct cellular pathways and gene sets. Deeper exploration was undertaken by targeted proteomic analysis of mitochondrial, myofilament, and extracellular subproteomes in pathological LVH. A notable finding was that mRNA–protein correlation was greater at the cellular pathway level than for individual loci. Conclusions—This first combined gene network and proteomic analysis of LVH reveals novel insights into the integrated pathomechanisms that distinguish pathological versus physiological phenotypes. In particular, we identify differential gene wiring as a major distinguishing feature of these phenotypes. This approach provides a platform for the investigation of potentially novel pathways in LVH and offers a freely accessible protocol (http://sites.google.com/site/cardionetworks) for similar analyses in other cardiovascular diseases.
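
As an illustration of the differential-wiring idea (not the study's protocol), the sketch below builds a thresholded gene coexpression network for each phenotype and counts, per gene, the edges present in one network but absent in the other; the data, correlation threshold, and scoring are hypothetical simplifications.

    # Compare coexpression networks from two phenotypes and rank genes by rewiring.
    import numpy as np

    rng = np.random.default_rng(1)
    n_genes, n_arrays = 200, 60
    expr_pathological = rng.normal(size=(n_genes, n_arrays))    # genes x arrays
    expr_physiological = rng.normal(size=(n_genes, n_arrays))

    def coexpression_network(expr, threshold=0.5):
        """Adjacency matrix: 1 where |Pearson correlation| exceeds the threshold."""
        corr = np.corrcoef(expr)
        adj = (np.abs(corr) > threshold).astype(int)
        np.fill_diagonal(adj, 0)
        return adj

    A_path = coexpression_network(expr_pathological)
    A_phys = coexpression_network(expr_physiological)

    # Differential wiring score: number of coexpression partners gained or lost per gene.
    rewiring = np.sum(A_path != A_phys, axis=1)
    most_rewired = np.argsort(rewiring)[::-1][:10]    # candidate differentially wired genes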

Relevance:

100.00%

Publisher:

Abstract:

The objective of this study was to investigate the effects of circularity, comorbidity, prevalence and presentation variation on the accuracy of differential diagnoses made in optometric primary care using a modified form of naïve Bayesian sequential analysis. No such investigation has ever been reported before. Data were collected for 1422 cases seen over one year. Positive test outcomes were recorded for case history (ethnicity, age, symptoms and ocular and medical history) and clinical signs in relation to each diagnosis. For this reason only positive likelihood ratios were used for this modified form of Bayesian analysis that was carried out with Laplacian correction and Chi-square filtration. Accuracy was expressed as the percentage of cases for which the diagnoses made by the clinician appeared at the top of a list generated by Bayesian analysis. Preliminary analyses were carried out on 10 diagnoses and 15 test outcomes. Accuracy of 100% was achieved in the absence of presentation variation but dropped by 6% when variation existed. Circularity artificially elevated accuracy by 0.5%. Surprisingly, removal of Chi-square filtering increased accuracy by 0.4%. Decision tree analysis showed that accuracy was influenced primarily by prevalence, followed by presentation variation and comorbidity. Analysis of 35 diagnoses and 105 test outcomes followed. This explored the use of positive likelihood ratios, derived from the case history, to recommend signs to look for. Accuracy of 72% was achieved when all clinical signs were entered. The drop in accuracy, compared to the preliminary analysis, was attributed to the fact that some diagnoses lacked strong diagnostic signs; the accuracy increased by 1% when only recommended signs were entered. Chi-square filtering improved recommended test selection. Decision tree analysis showed that accuracy was again influenced primarily by prevalence, followed by comorbidity and presentation variation. Future work will explore the use of likelihood ratios based on positive and negative test findings prior to considering naïve Bayesian analysis as a form of artificial intelligence in optometric practice.
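
A minimal sketch, under hypothetical diagnoses, signs and counts, of ranking candidate diagnoses by prevalence multiplied by the product of positive likelihood ratios, with Laplace (add-one) correction; it omits the Chi-square filtration and sequential aspects of the study's method.

    # Rank diagnoses by prevalence times the product of positive likelihood ratios.
    from math import prod

    # counts[d][s] = number of cases of diagnosis d in which sign s was positive (hypothetical)
    counts = {
        "dry eye":        {"gritty sensation": 40, "corneal staining": 25},
        "blepharitis":    {"gritty sensation": 15, "corneal staining": 5},
        "allergic conj.": {"gritty sensation": 10, "corneal staining": 2},
    }
    cases_per_dx = {"dry eye": 60, "blepharitis": 30, "allergic conj.": 20}
    total_cases = sum(cases_per_dx.values())

    def positive_lr(dx, sign):
        """P(sign+ | dx) / P(sign+ | not dx), each estimated with Laplace correction."""
        p_pos_dx = (counts[dx][sign] + 1) / (cases_per_dx[dx] + 2)
        other_pos = sum(counts[d][sign] for d in counts if d != dx)
        other_n = total_cases - cases_per_dx[dx]
        p_pos_not_dx = (other_pos + 1) / (other_n + 2)
        return p_pos_dx / p_pos_not_dx

    def rank_diagnoses(observed_signs):
        """Score each diagnosis by prevalence times the product of its positive LRs."""
        scores = {}
        for dx in counts:
            prevalence = cases_per_dx[dx] / total_cases
            scores[dx] = prevalence * prod(positive_lr(dx, s) for s in observed_signs)
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    print(rank_diagnoses(["gritty sensation", "corneal staining"]))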

Relevance:

100.00%

Publisher:

Abstract:

It is proved that there exists a bijection between the primitive ideals of the algebra of regular functions on quantum m × n-matrices and the symplectic leaves of the associated Poisson structure.

Relevance:

100.00%

Publisher:

Abstract:

Partially supported by grant RFFI 98-01-01020.

Relevance:

100.00%

Publisher:

Abstract:

The first author was partially supported by MURST of Italy; the second author was partially supported by RFFI grant 99-01-00233.

Relevance:

100.00%

Publisher:

Abstract:

The basic concepts of the entity-relationship model (entities, relationships, and the structural constraints on relationships: index cardinality, participation degree, and constraints of the form (min, max)) are considered and formalized in terms of relation theory. For binary relations, two operators (min and max) are introduced; the structural constraints are defined in terms of these operators; and the main theorem on the compatibility of the values of these operators on a source relation and its inverse is given.
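
As a loose illustration only (not the paper's relation-theoretic formalization), the sketch below computes the minimum and maximum participation of entities in a binary relation and checks a declared (min, max) structural constraint; the entity and relation names are hypothetical.

    # Compute (min, max) participation degrees of a binary relation and check a constraint.
    from collections import Counter

    A = {"e1", "e2", "e3"}
    R = {("e1", "d1"), ("e1", "d2"), ("e2", "d1")}      # binary relation between A and B

    def min_max_participation(entities, relation):
        """(min, max) number of relationship occurrences per entity, counting absentees as 0."""
        degree = Counter(a for a, _ in relation)
        degrees = [degree.get(a, 0) for a in entities]
        return min(degrees), max(degrees)

    def satisfies(entities, relation, constraint):
        lo, hi = constraint
        observed_min, observed_max = min_max_participation(entities, relation)
        return observed_min >= lo and observed_max <= hi

    print(min_max_participation(A, R))      # (0, 2): e3 participates in none, e1 in two
    print(satisfies(A, R, (0, 2)))          # True
    print(satisfies(A, R, (1, 2)))          # False: e3 violates the lower bound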

Relevance:

100.00%

Publisher:

Abstract:

In this paper, a novel approach to character recognition is presented with the help of genetic operators, which are inspired by biological genetics and allow highly accurate results to be achieved. A genetic algorithm approach is described in which biological haploid chromosomes are implemented as single-row bit patterns of 315 values that are operated upon by various genetic operators. A set of characters is taken as the initial population, from which new generations of characters are generated with the help of selection, crossover and mutation. Variations of the population of characters are evolved, and the fittest solution is found by subjecting each population to a newly developed fitness function. The methodology reduces the dissimilarity coefficient, computed by the fitness function, between the character to be recognized and the members of the population; when the error derived from this dissimilarity reaches a threshold limit, the character is recognized. As each new population is generated from the previous one, traits are passed on from generation to generation. We present a methodology with the help of which highly efficient character recognition can be achieved.
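
A minimal sketch of the scheme described above: 315-bit haploid chromosomes evolved by selection, single-point crossover and bit-flip mutation, with fitness given by Hamming dissimilarity to the character to be recognized; the population size, rates, and threshold are hypothetical.

    # Evolve bit-string chromosomes toward a target character bitmap.
    import random

    random.seed(0)
    BITS = 315                                   # one haploid chromosome per character

    def random_chromosome():
        return [random.randint(0, 1) for _ in range(BITS)]

    def dissimilarity(candidate, target):
        """Hamming distance: the dissimilarity coefficient to be minimized."""
        return sum(c != t for c, t in zip(candidate, target))

    def evolve(target, pop_size=50, mutation_rate=0.003, threshold=5, generations=2000):
        population = [random_chromosome() for _ in range(pop_size)]
        for gen in range(generations):
            population.sort(key=lambda c: dissimilarity(c, target))
            best = population[0]
            if dissimilarity(best, target) <= threshold:
                return best, gen                          # character recognized
            parents = population[: pop_size // 2]         # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                p1, p2 = random.sample(parents, 2)
                cut = random.randrange(1, BITS)           # single-point crossover
                child = p1[:cut] + p2[cut:]
                child = [b ^ (random.random() < mutation_rate) for b in child]  # bit-flip mutation
                children.append(child)
            population = parents + children
        return population[0], generations                 # best found if threshold not reached

    target_character = random_chromosome()                # stands in for a stored glyph bitmap
    match, generation = evolve(target_character)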

Relevance:

100.00%

Publisher:

Abstract:

Mathematics Subject Classification: 26A33, 47B06, 47G30, 60G50, 60G52, 60G60.

Relevance:

100.00%

Publisher:

Abstract:

Mathematics Subject Classification: 26A33, 93C83, 93C85, 68T40

Relevance:

100.00%

Publisher:

Abstract:

AMS Subj. Classification: 03C05, 08B20