736 results for Computer Science, theory and methods
Abstract:
We report on a study of nonequilibrium ordering in the reaction-diffusion lattice gas. This is a kinetic model that relaxes towards steady states under the simultaneous competition of a thermally activated creation-annihilation (reaction) process at temperature T and a diffusion process driven by a heat bath at temperature T′ ≠ T. The phase diagram is investigated as one varies T and T′, the system dimension d, the relative a priori probabilities of the two processes, and their dynamical rates. We compare mean-field theory, new Monte Carlo data, and known exact results for some limiting cases. In particular, for d=2 with Metropolis rates we find no numerical evidence of Landau critical behavior, but rather Onsager critical points and a variety of first-order phase transitions.
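To make the competing dynamics concrete, here is a minimal Monte Carlo sketch in Python (spin/Ising representation on a periodic square lattice). The function names, the use of Metropolis rates for both processes, and the parameter p for the a priori probability of a reaction move are illustrative assumptions, not the exact rates or geometry used in the study above.

```python
import numpy as np

def local_field(s, i, j):
    """Sum of the four nearest-neighbour spins (periodic boundaries)."""
    L = s.shape[0]
    return (s[(i + 1) % L, j] + s[(i - 1) % L, j]
            + s[i, (j + 1) % L] + s[i, (j - 1) % L])

def mc_sweep(s, T, T_prime, p, J=1.0, rng=None):
    """One sweep of a lattice gas with competing dynamics (spin language):
    with a priori probability p, attempt a Metropolis spin flip ('reaction',
    i.e. creation-annihilation) at temperature T; otherwise attempt a
    nearest-neighbour exchange ('diffusion') at temperature T_prime."""
    rng = rng or np.random.default_rng()
    L = s.shape[0]
    for _ in range(L * L):
        i, j = int(rng.integers(L)), int(rng.integers(L))
        if rng.random() < p:
            # reaction: flip s[i, j] with Metropolis probability
            dE = 2.0 * J * s[i, j] * local_field(s, i, j)
            if rng.random() < min(1.0, np.exp(-dE / T)):
                s[i, j] = -s[i, j]
        else:
            # diffusion: try to exchange s[i, j] with a random neighbour
            di, dj = ((1, 0), (-1, 0), (0, 1), (0, -1))[rng.integers(4)]
            ni, nj = (i + di) % L, (j + dj) % L
            if s[i, j] != s[ni, nj]:
                # energy change of the swap (the shared bond cancels out)
                dE = 2.0 * J * s[i, j] * ((local_field(s, i, j) - s[ni, nj])
                                          - (local_field(s, ni, nj) - s[i, j]))
                if rng.random() < min(1.0, np.exp(-dE / T_prime)):
                    s[i, j], s[ni, nj] = s[ni, nj], s[i, j]
    return s

# Example: 32x32 lattice, reaction at T = 2.0, diffusion at a hotter bath
spins = np.random.default_rng(0).choice([-1, 1], size=(32, 32))
for _ in range(100):
    mc_sweep(spins, T=2.0, T_prime=10.0, p=0.5)
```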
Abstract:
The recent rapid development of biotechnological approaches has enabled the production of large whole-genome-level biological data sets. In order to handle these data sets, reliable and efficient automated tools and methods for data processing and result interpretation are required. Bioinformatics, as the field of studying and processing biological data, tries to answer this need by combining methods and approaches across computer science, statistics, mathematics and engineering to study and process biological data. The need is also increasing for tools that can be used by biological researchers themselves, who may not have a strong statistical or computational background; this requires creating tools and pipelines with intuitive user interfaces, robust analysis workflows and a strong emphasis on result reporting and visualization. Within this thesis, several data analysis tools and methods have been developed for analyzing high-throughput biological data sets. These approaches, covering several aspects of high-throughput data analysis, are specifically aimed at gene expression and genotyping data, although in principle they are suitable for analyzing other data types as well. Coherent handling of the data across the various data analysis steps is highly important in order to ensure robust and reliable results. Thus, robust data analysis workflows are also described, putting the developed tools and methods into a wider context. The choice of the correct analysis method may also depend on the properties of the specific data set, and therefore guidelines for choosing an optimal method are given. The data analysis tools, methods and workflows developed within this thesis have been applied to several research studies, of which two representative examples are included in the thesis. The first study focuses on spermatogenesis in murine testis and the second one examines cell lineage specification in mouse embryonic stem cells.
Abstract:
In image synthesis, reproducing the complex effects of light on translucent materials such as wax, marble or skin contributes greatly to the realism of an image. Unfortunately, this additional realism is costly in computation time. Models based on diffusion theory aim to reduce this cost by simulating the physical behaviour of subsurface light transport while imposing constraints on the variation of the incident and outgoing light. An important component of these models is their use in hierarchically evaluating the numerical integral of the illumination over the surface of an object. This thesis first reviews the current literature on the realistic simulation of translucency, before investigating in more depth the application and extensions of diffusion models in image synthesis. We then propose and evaluate a new hierarchical numerical integration technique that uses a new frequency analysis of the outgoing and incident light to efficiently adapt the sampling rate during integration. We apply this theory to several state-of-the-art diffusion models, offering a potential improvement in their efficiency and accuracy.
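To illustrate the hierarchical integration being built upon, below is a minimal Python sketch in the spirit of classical hierarchical dipole-diffusion integration (à la Jensen and Buhler's point-hierarchy approach): irradiance samples on the surface are clustered, and a cluster is used as a single aggregate when it is small relative to its distance from the evaluation point. The dipole coefficients, the error threshold eps, and all names are illustrative assumptions; the thesis's frequency-analysis-driven sampling is not reproduced here.

```python
import numpy as np

def dipole_Rd(r, sigma_s_prime=2.6, sigma_a=0.0041, eta=1.3):
    """Classical dipole diffusion reflectance Rd(r) (Jensen et al. 2001).
    Default coefficients are illustrative, marble-like values (mm^-1)."""
    sigma_t_prime = sigma_s_prime + sigma_a
    alpha_prime = sigma_s_prime / sigma_t_prime
    sigma_tr = np.sqrt(3.0 * sigma_a * sigma_t_prime)
    Fdr = -1.440 / eta**2 + 0.710 / eta + 0.668 + 0.0636 * eta
    A = (1.0 + Fdr) / (1.0 - Fdr)
    zr = 1.0 / sigma_t_prime                 # real source depth
    zv = zr * (1.0 + 4.0 * A / 3.0)          # virtual source height
    dr = np.sqrt(r * r + zr * zr)
    dv = np.sqrt(r * r + zv * zv)
    return (alpha_prime / (4.0 * np.pi)
            * (zr * (sigma_tr * dr + 1.0) * np.exp(-sigma_tr * dr) / dr**3
               + zv * (sigma_tr * dv + 1.0) * np.exp(-sigma_tr * dv) / dv**3))

class Cluster:
    """Binary hierarchy over surface irradiance samples."""
    def __init__(self, points, flux):        # flux[i] = irradiance_i * area_i
        self.points, self.flux = points, flux
        self.centroid = points.mean(axis=0)
        self.total_flux = flux.sum()
        self.radius = np.linalg.norm(points - self.centroid, axis=1).max()
        self.children = []
        if len(points) > 8:                  # split along the longest axis
            axis = int(np.argmax(np.ptp(points, axis=0)))
            order = np.argsort(points[:, axis])
            half = len(points) // 2
            self.children = [Cluster(points[order[:half]], flux[order[:half]]),
                             Cluster(points[order[half:]], flux[order[half:]])]

def exitance(x, node, eps=0.1):
    """Hierarchically accumulate sum_i flux_i * Rd(|x - x_i|)."""
    d = np.linalg.norm(x - node.centroid)
    if node.children and node.radius >= eps * d:      # too close: refine
        return sum(exitance(x, c, eps) for c in node.children)
    if not node.children:                             # leaf: sum samples exactly
        r = np.linalg.norm(node.points - x, axis=1)
        return float(np.sum(node.flux * dipole_Rd(np.maximum(r, 1e-4))))
    return node.total_flux * dipole_Rd(max(d, 1e-4))  # far cluster: one aggregate
```

Lowering eps refines more clusters and trades speed for accuracy, which is exactly the knob a frequency-driven scheme such as the one proposed in the thesis would adapt locally.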
Abstract:
It is believed that every fuzzy generalization should be formulated in such a way that it contains the ordinary set-theoretic notion as a special case. Therefore the definition of fuzzy topology in the line of C. L. Chang [9], with an arbitrary complete and distributive lattice as the membership set, is taken. Almost all the results proved and presented in this thesis can, in a sense, be called generalizations of corresponding results in ordinary set theory and set topology. However, the tools and the methods have to be, in many cases, new. Here an attempt is made to solve the problem of complementation in the lattice of fuzzy topologies on a set. It is proved that, in general, the lattice of fuzzy topologies is not complemented. Complements of some fuzzy topologies are determined. It is observed that (L,X) is not uniquely complemented. However, a complete analysis of the problem of complementation in the lattice of fuzzy topologies is yet to be carried out.
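For reference, a brief LaTeX restatement of the definitions the abstract relies on: Chang's notion of a fuzzy topology with membership values in a complete distributive lattice L, and what it means for one fuzzy topology to complement another in the lattice of all fuzzy topologies on X (which the abstract denotes (L,X)). The notation δ and the constant fuzzy sets 0 and 1 are assumed for illustration and may differ from the thesis's.

```latex
% Chang-style fuzzy topology with membership values in a complete,
% distributive lattice L: a family \delta \subseteq L^X of L-fuzzy sets
% on X is a fuzzy topology if
\[
  \underline{0},\,\underline{1}\in\delta, \qquad
  f,g\in\delta \;\Rightarrow\; f\wedge g\in\delta, \qquad
  \{f_i\}_{i\in I}\subseteq\delta \;\Rightarrow\; \bigvee_{i\in I} f_i\in\delta .
\]
% In the lattice of all fuzzy topologies on X, ordered by inclusion,
% \delta' is a complement of \delta when
\[
  \delta\wedge\delta' \;=\; \{\underline{0},\underline{1}\}
  \quad(\text{the indiscrete fuzzy topology}),
  \qquad
  \delta\vee\delta' \;=\; L^{X}
  \quad(\text{the discrete fuzzy topology}).
\]
```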
Abstract:
This is the website for the Nano Research group based in the University of Southampton's ECS department; it details current research topics and the people connected with them. It shows some of the current research undertaken at the centre and gives an outline of what can be done in postgraduate courses.
Abstract:
Set readings
1. Sismondo S. (2009). The Kuhnian revolution. In An introduction to science and technology studies, p. 12-22.
2. Ben-David J, Sullivan T. (1975). Sociology of science. Annual Review of Sociology, p. 203-21.
3. Clarke A, Star SL. (2008). The social worlds framework: a theory/methods package. In Hackett EJ et al., The handbook of science and technology studies. Cambridge, MA: MIT Press, p. 113-137.
Bonus paper (read if you have time):
4. Mitroff I. (1974). Norms and Counternorms in a Select Group of Apollo Moon Scientists. American Sociological Review 39:79-95.
• Aim to ensure that you understand the core arguments of each paper.
• Look up/note any new terminology (and questions you want to ask).
• Think about your critical appraisal of the paper (what are the merits/demerits of the argument, evidence, etc.).
In the seminar we will spend about 5 minutes talking about each paper, and then, building on the two lectures, discuss how these ideas might be used to think about the Web and Web Science. At the end there will be some time for questions and a chance to note your key learning points.
Abstract:
This text is a guide to teaching primary science as set out in the professional standards for teacher accreditation (QTS) in England and the United Kingdom. Each chapter includes case studies of classroom situations to help students make the link between theory and practical classroom teaching. Each chapter also includes summaries of key research to guide students towards a deeper understanding of the theoretical foundations of teaching, ideas for practical classroom activities, and a glossary of the main scientific terms.
Abstract:
Monograph entitled 'Educación, valores y democracia'. Abstract based on that of the publication.
Abstract:
The calculation of accurate and reliable vibrational potential functions and normal co-ordinates is discussed for those simple polyatomic molecules for which it may be possible. Such calculations should be corrected for the effects of anharmonicity and of resonance interactions between the vibrational states, and should be fitted to all the available information on all isotopic species: particularly the vibrational frequencies, Coriolis zeta constants and centrifugal distortion constants. The difficulties of making these corrections, and of making use of the observed data, are reviewed. A programme for the Ferranti Mercury Computer is described by means of which harmonic vibration frequencies and normal co-ordinate vectors, zeta factors and centrifugal distortion constants can be calculated from a given force field and from given G-matrix elements, etc. The programme has been used on secular equations of up to 5 × 5, for which a single calculation and output of results takes approximately 1 min; it can readily be extended to larger determinants. The best methods of using such a programme, and the possibility of reversing the direction of calculation, are discussed. The methods are applied to calculating the best possible vibrational potential function for the methane molecule, making use of all the observed data.
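The harmonic part of such a calculation amounts to solving the Wilson GF secular equation |GF − λE| = 0. A minimal Python sketch of that eigenvalue step is given below; it assumes F and G are supplied in mutually consistent units so that the eigenvalues of GF are squared angular frequencies, and it does not address the anharmonicity or resonance corrections discussed in the abstract.

```python
import numpy as np

def harmonic_analysis(F, G):
    """Solve the Wilson GF secular equation |GF - lambda*E| = 0.

    F : force-constant matrix in internal coordinates
    G : inverse kinetic-energy (Wilson G) matrix
    Both are assumed to be in consistent units so that the eigenvalues
    lambda are squared angular frequencies (rad^2 s^-2). Returns the
    harmonic wavenumbers in cm^-1 and the normal-coordinate vectors
    (columns of L)."""
    lam, L = np.linalg.eig(G @ F)               # GF is generally non-symmetric
    order = np.argsort(lam.real)[::-1]          # sort eigenvalues, descending
    lam, L = lam.real[order], L[:, order].real
    omega = np.sqrt(np.clip(lam, 0.0, None))    # angular frequencies, rad/s
    c_cm = 2.99792458e10                        # speed of light in cm/s
    wavenumbers = omega / (2.0 * np.pi * c_cm)  # convert to cm^-1
    return wavenumbers, L
```

For a 5 × 5 problem of the size mentioned above, harmonic_analysis(F, G) returns five wavenumbers and the corresponding L matrix of normal-coordinate vectors; zeta factors and centrifugal distortion constants would be computed from L in further steps not sketched here.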