929 results for Lattice theory - Computer programs
Abstract:
We present an algorithm for the computation of reducible invariant tori of discrete dynamical systems that is suitable for tori of dimension larger than 1. It is based on a quadratically convergent scheme that simultaneously approximates the Fourier series of the torus, its Floquet transformation, and its Floquet matrix. The Floquet matrix describes the linearization of the dynamics around the torus and, hence, its linear stability. The algorithm exhibits a high degree of parallelism, and the computational effort grows linearly with the number of Fourier modes needed to represent the solution. For these reasons it is a very good option for computing quasi-periodic solutions with several basic frequencies. The paper includes some examples (flows) to show the efficiency of the method on a parallel computer. In these flows we compute invariant tori of dimension up to 5 by taking suitable sections.
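As a minimal illustration of the invariance condition such a scheme solves, the sketch below computes an invariant circle (a one-dimensional torus) of a toy planar map by discretizing K(theta) on a Fourier grid and solving F(K(theta)) = K(theta + omega) with a generic root finder. The map, grid size, and use of fsolve are assumptions for the example; this is not the paper's parallel Newton-Floquet algorithm.

```python
import numpy as np
from scipy.optimize import fsolve

N = 64
theta = 2 * np.pi * np.arange(N) / N
k = np.fft.fftfreq(N, d=1.0 / N)          # integer wave numbers 0, 1, ..., -1

def shift(u, omega):
    """Evaluate u(theta + omega) from grid values via the FFT."""
    return np.real(np.fft.ifft(np.fft.fft(u) * np.exp(1j * k * omega)))

def F(x, y, eps=0.05):
    """Toy planar map with an attracting invariant circle (illustrative only)."""
    ang = np.arctan2(y, x)
    r = np.hypot(x, y)
    r_new = 1.0 + 0.5 * (r - 1.0) + eps * np.cos(ang)   # radius contracts
    return r_new * np.cos(ang + 0.61), r_new * np.sin(ang + 0.61)

def residual(z):
    """Invariance F(K(theta)) = K(theta + omega), plus one phase condition."""
    x, y, omega = z[:N], z[N:2 * N], z[2 * N]
    fx, fy = F(x, y)
    phase = np.imag(np.fft.fft(x))[1]     # pins the free phase of the parametrization
    return np.concatenate([fx - shift(x, omega), fy - shift(y, omega), [phase]])

z0 = np.concatenate([np.cos(theta), np.sin(theta), [0.61]])
sol = fsolve(residual, z0)
print("rotation number:", sol[-1])        # ~0.61 for this toy map
```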
Abstract:
The purpose of this research was to conduct a repeated cross-sectional study of class teachers who were studying in their fourth year, or who graduated, at the Faculty of Education, University of Turku, between the years 2000 and 2004. Specifically, seven research questions were addressed to target the main purpose of the study: How do class teacher education master's degree senior students and graduates rate the importance, effectiveness, and quality of the training they have received at the Faculty of Education? Are there significant differences between overall ratings of importance, effectiveness, and quality of training by year of graduation, sex, and age (for graduates), and by sex and age (for senior students)? Is there a significant relationship between respondents' overall ratings of importance and effectiveness and their overall ratings of the quality of the training and preparation they have received? Are there significant differences between graduates and senior students regarding the importance, effectiveness, and quality of teacher education programs? And what do teachers [graduates] believe about how increasing work experience has changed their opinions of their preservice training? Moreover, the following concepts related to instructional activities were studied: critical thinking skills, communication skills, attention to ethics, curriculum and instruction (planning), role of teacher and teaching knowledge, assessment skills, attention to continuous professional development, subject matter knowledge, knowledge of the learning environment, and use of educational technology. The researcher also examined the influence of moderator variables, e.g. year of graduation, sex, and age, on the dependent and independent variables. This study used two questionnaires (a structured Likert-scale questionnaire and an open-ended questionnaire). The population in study 1 was all senior students and 2000-2004 graduates of the class teacher education master's degree program of the Teacher Education departments of the Faculty of Education at the University of Turku. Of the 1020 students and graduates, the researcher was able to find current addresses for 675 of the subjects, and of the 675 graduates contacted, 439 (66.2 percent) responded to the survey. The population in study 2 was all class teachers who graduated from the University of Turku and now work in basic schools (59 schools) in South-West Finland; 257 teachers answered the open-ended web-based questions. SPSS was used to produce standard deviations, analysis of variance, Pearson product-moment correlations (r), t-tests, ANOVA with Bonferroni post-hoc tests, and polynomial contrast tests to analyze linear trends. An alpha level of .05 was used to determine statistical significance. The results of the study showed the following. A majority of the respondents (graduates and students) rated the overall importance, effectiveness, and quality of the teacher education programs as important, effective, and good. Generally speaking, there were only a few significant differences between the cohorts and groups related to the background variables (gender, age). The different cohorts rated the quality of the programs very similarly, but some differences between the cohorts were found in the importance and effectiveness ratings. Graduates of 2001 and 2002 rated the importance of the program significantly higher than 2000 graduates did. The effectiveness of the programs was rated significantly higher by 2001 and 2003 graduates than by the other groups.
In spite of these individual differences between cohorts, there were no linear trends across the year cohorts in any measure. In respondents' ratings of the effectiveness of teacher education programs there was a significant difference between males and females; females rated it higher than males. There were no significant differences between males' and females' ratings of the importance and quality of the programs. In the ratings there was only one difference between age groups: older graduates (35 years or older) rated the importance of the teacher training significantly higher than 25-35-year-old graduates. In graduates' ratings there were positive but relatively low correlations between all variables related to the importance, effectiveness, and quality of the teacher education programs. Generally speaking, students' ratings of the importance, effectiveness, and quality of the teacher education program were very positive. There was only one significant difference related to the background variables: females rated the effectiveness of the program higher. The comparison of students' and graduates' perceptions of the importance, effectiveness, and quality of teacher education programs showed no significant differences between graduates and students in the overall ratings. However, there were differences in some individual variables. Students gave higher ratings to the importance of "Continuous Professional Development", the effectiveness of "Critical Thinking Skills" and "Using Educational Technology", and the quality of "Advice received from the advisor". Graduates gave higher ratings to the importance of "Knowledge of Learning Environment" and the effectiveness of "Continuous Professional Development". According to the qualitative data of study 2, some graduates reported that their perceptions of the importance, effectiveness, and quality of the training they received during their studies had not changed. They pointed out that the teacher education programs had provided them with basic theoretical/formal knowledge and some training in practical routines. However, a majority of the teachers seemed to hold somewhat critical opinions of the teacher education. These teachers were not satisfied with the teacher education programs, arguing that the programs failed to meet their practical demands in the different everyday situations of the classroom, e.g. in coping with students' learning difficulties, multiprofessional communication with parents and other professional groups (psychologists and social workers), and classroom management problems. Participants also called for more practice-oriented knowledge of subject matter, evaluation methods, and teachers' rights and responsibilities. Therefore, they (54.1% of participants) suggested that teacher education departments should provide more practice-based courses and programs, as well as closer collaboration between regular schools and teacher education departments, in order to fill the gap between theory and practice.
Abstract:
The extensional theory of arrays is one of the most important ones for applications of SAT Modulo Theories (SMT) to hardware and software verification. Here we present a new T-solver for arrays in the context of the DPLL(T) approach to SMT. The main characteristics of our solver are: (i) no translation of writes into reads is needed, (ii) there is no axiom instantiation, and (iii) the T-solver interacts with the Boolean engine by asking it to split on equality literals between indices. As far as we know, this is the first accurate description of an array solver integrated in a state-of-the-art SMT solver and, unlike most state-of-the-art solvers, it is not based on a lazy instantiation of the array axioms. Moreover, it is very competitive in practice, especially on problems that require heavy reasoning on array literals.
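The case split in characteristic (iii) stems from the read-over-write axiom: select(store(a, i, v), j) equals v when i = j, and select(a, j) otherwise. The sketch below is a minimal, hypothetical rendering of that split; it is not the paper's T-solver, only the equality literal such a solver would hand to the Boolean engine.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Store:
    array: object   # either another Store or a base-array name (str)
    index: str
    value: str

def select(a, j):
    """Simplify select(a, j). When a is a store at index i, return the i = j
    equality literal to split on, together with the result of each branch."""
    if isinstance(a, Store):
        return (f"{a.index} = {j}",     # literal handed to the Boolean engine
                a.value,                # then-branch: indices equal, read the write
                select(a.array, j))     # else-branch: read through to the base array
    return f"select({a}, {j})"

print(select(Store("a", "i", "v"), "j"))
# ('i = j', 'v', 'select(a, j)')
```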
Abstract:
The symbolic regression problem consists of learning an unknown function from a sample set of experimentally obtained data. Evolutionary methods have proven their efficiency in solving instances of this problem. This project proposes a new evolutionary strategy, via genetic algorithms, based on a new data structure called a Straight Line Program (SLP), which in this case represents symbolic expressions. Starting from a universal SLP, which depends on a series of parameters whose specialization yields concrete SLPs of the search space, the strategy tries to find the optimal parameters so that the universal SLP represents the function that best approximates the set of sample points. Conceptually, this project amounts to genetic training of the universal SLP, using the sample points as the training set, in order to solve the symbolic regression problem.
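As an illustration of the data structure, the sketch below encodes an SLP as a list of instructions that may reuse the results of earlier lines, evaluates it on one input, and scores it against sample points. The instruction format and the protected division are assumptions for the example, not the project's universal SLP parametrization.

```python
import operator

# Instruction forms: ("var",) pushes the input; ("const", c) pushes a constant;
# ("op", sym, i, j) applies sym to the results of earlier lines i and j.
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul,
       "/": lambda a, b: a / b if b != 0 else 1.0}   # protected division

def eval_slp(slp, x):
    """Evaluate a straight line program on input x; the last line is the output."""
    mem = []
    for instr in slp:
        if instr[0] == "var":
            mem.append(x)
        elif instr[0] == "const":
            mem.append(instr[1])
        else:
            _, sym, i, j = instr
            mem.append(OPS[sym](mem[i], mem[j]))
    return mem[-1]

# x*x + 2: line 2 reuses line 0 twice, the reuse that distinguishes SLPs from trees
slp = [("var",), ("const", 2.0), ("op", "*", 0, 0), ("op", "+", 2, 1)]
print(eval_slp(slp, 3.0))   # 11.0

def mse(slp, samples):
    """Fitness against the sample points, as in symbolic regression."""
    return sum((eval_slp(slp, x) - y) ** 2 for x, y in samples) / len(samples)
```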
Abstract:
New economic and enterprise needs have increased the interest in, and utility of, grouping methods based on the theory of uncertainty. A fuzzy grouping (clustering) process is a key phase of knowledge acquisition and complexity reduction regarding different groups of objects. Here, we consider some elements of the theory of affinities and uncertain pretopology that form a significant support tool for a fuzzy clustering process. A Galois lattice is introduced in order to provide a clearer view of the results. We performed a homogeneous grouping process of the economic regions of the Russian Federation and Ukraine. The results give a broad panorama of the regional economic situation of the two countries, as well as key guidelines for decision-making. The mathematical method is very sensitive to any changes the regional economy may undergo. We thus provide an alternative method for the grouping process under uncertainty.
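To make the Galois lattice construction concrete, the sketch below alpha-cuts a small, invented fuzzy relation between regions and attributes and enumerates the closed (extent, intent) pairs, i.e. the formal concepts whose ordering forms the lattice. The data and the alpha threshold are illustrative only, not the paper's regional dataset.

```python
from itertools import combinations

# Invented fuzzy relation mu(object, attribute); objects stand in for regions.
fuzzy = {("r1", "industry"): 0.9, ("r1", "energy"): 0.4,
         ("r2", "industry"): 0.8, ("r2", "energy"): 0.7,
         ("r3", "industry"): 0.2, ("r3", "energy"): 0.9}
objs = {"r1", "r2", "r3"}
attrs = {"industry", "energy"}

def concepts(alpha):
    """Formal concepts of the alpha-cut relation: the nodes of the Galois lattice."""
    rel = {(o, a) for (o, a), mu in fuzzy.items() if mu >= alpha}
    up = lambda O: {a for a in attrs if all((o, a) in rel for o in O)}
    down = lambda A: {o for o in objs if all((o, a) in rel for a in A)}
    found = set()
    for r in range(len(objs) + 1):
        for O in combinations(sorted(objs), r):
            A = up(set(O))
            found.add((frozenset(down(A)), frozenset(A)))   # close to (O'', O')
    return found

for extent, intent in sorted(concepts(0.6), key=lambda c: (len(c[0]), sorted(c[0]))):
    print(sorted(extent), "<->", sorted(intent))
```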
Abstract:
In this paper, we present view-dependent, information-theoretic quality measures for pixel sampling and scene discretization in flatland. The measures are based on a definition of the mutual information of a line and have a purely geometrical basis. Several algorithms exploiting them are presented and compare well with an existing one based on depth differences.
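The paper's measures specialize mutual information to lines in flatland; the sketch below shows only the generic discrete quantity I(X;Y) they build on, computed from a toy joint distribution. The data and its interpretation are assumptions for illustration.

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) = sum_xy p(x,y) log2( p(x,y) / (p(x) p(y)) ) for a discrete joint."""
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)      # marginal of X (rows)
    py = joint.sum(axis=0, keepdims=True)      # marginal of Y (columns)
    mask = joint > 0
    return float((joint[mask] * np.log2(joint[mask] / (px @ py)[mask])).sum())

# Toy joint distribution, e.g. between discretized viewpoints and scene patches.
joint = np.array([[0.25, 0.05],
                  [0.05, 0.65]])
print(mutual_information(joint))   # ~0.43 bits
```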
Abstract:
The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then incrementally extends the program in steps of adding code and proving, after each addition, that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be efficiently detected. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that were not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem, with the aid of the tool, into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula.
Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early and practically oriented programming courses. Our hypothesis is that verification could be introduced early in the CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first and second year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
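For a rough flavour of the invariant-first workflow, the sketch below develops a small sorting routine by stating its invariants first and re-checking them after each step, with runtime assertions standing in for the verification conditions Socos discharges via PVS and Yices. The code is an illustration of the style, not the thesis's verified sorting algorithm.

```python
def sorted_prefix(a, k):
    """a[0..k) is in non-decreasing order."""
    return all(a[i] <= a[i + 1] for i in range(k - 1))

def partitioned(a, k):
    """Every element of a[0..k) is <= every element of a[k..n)."""
    return all(a[i] <= a[j] for i in range(k) for j in range(k, len(a)))

def selection_sort(a):
    n = len(a)
    k = 0
    # Invariant: sorted prefix + partition; together they imply sortedness at k = n.
    assert sorted_prefix(a, k) and partitioned(a, k)
    while k < n:
        m = min(range(k, n), key=lambda i: a[i])   # smallest remaining element
        a[k], a[m] = a[m], a[k]
        k += 1
        assert sorted_prefix(a, k) and partitioned(a, k)   # invariant maintained
    return a

print(selection_sort([3, 1, 4, 1, 5, 9, 2, 6]))
```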
Abstract:
The feasibility of using augmented block designs and spatial analysis methods for early-stage selection in eucalyptus breeding programs was tested. A total of 113 half-sib progenies of Eucalyptus urophylla and eight clones were evaluated in an 11 x 11 triple lattice experiment at two locations: Posto da Mata (Bahia, Brazil) and São Mateus (Minas Gerais, Brazil). Four checks were randomly allocated within each block. Plots consisted of 15 m long rows containing 6 plants spaced 3 m apart. The girth at breast height (cm/plant) was evaluated at 19 and 26 months of age. Variance analyses were performed according to the following methods: lattice design, randomized complete block design, augmented block design, the Papadakis method, the moving means method, and check plots. Comparisons among the different methods were based on the magnitude of the experimental errors and the precision of the estimates of genetic and phenotypic parameters. The general results indicated that the augmented block design is useful for evaluating progenies and clones in early selection in eucalyptus breeding programs using moderate and low selection intensities. However, this design is not suitable for estimating genetic and phenotypic parameters due to its low precision. The check plots, nearest neighbour, Papadakis (1937), and moving means methods were efficient in removing the heterogeneity within blocks, and their efficiency was compared with that of the lattice analysis for the estimation of genetic and phenotypic parameters.
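For intuition about the Papadakis adjustment mentioned above, the sketch below applies a deliberately simplified, one-dimensional, grand-mean version of it to invented plot data: each plot is adjusted by a covariate equal to the mean residual of its immediate neighbours. In practice the residuals are taken from treatment means and the field layout is two-dimensional; everything here is an assumption for illustration.

```python
import numpy as np

# One row of plots; in the trial this would be girth (cm/plant) per plot.
y = np.array([12.1, 13.4, 11.8, 14.0, 13.2, 12.7])
resid = y - y.mean()          # simplified: residuals from the grand mean

cov = np.empty_like(y)
for i in range(len(y)):
    nb = [resid[j] for j in (i - 1, i + 1) if 0 <= j < len(y)]
    cov[i] = np.mean(nb)      # mean residual of the immediate neighbours

beta = np.polyfit(cov, y, 1)[0]   # slope of phenotype on neighbour covariate
y_adj = y - beta * cov            # spatially adjusted phenotypes
print(np.round(y_adj, 2))
```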
Abstract:
Simulation programs are important tools for analyzing different energy alternatives, including the use of renewable energy. The objective of this study was to comparatively analyze the different computer tools available for the modeling of solar water heaters. Among the main simulation software packages for solar thermal systems are RETScreen International, EnergyPlus, TRNSYS, SolDesigner, SolarPro, and T*SOL. Among the tools mentioned, only EnergyPlus and RETScreen International are free, and they yield interesting results when applied together. The first has a detailed module for the energy analysis of solar water heaters, while the second provides a detailed economic feasibility study and an assessment of greenhouse gas emissions. The RETScreen International and EnergyPlus programs are aimed at a diverse audience, including designers, researchers, and energy planners.
Abstract:
The objective of this research was to identify the skills and competences required of Chief Information Officers in their professional life and whether these skills can be developed by means of postgraduate education programs. Although the changing role of the CIO has been studied for years by academia, the ways in which the necessary skills are developed have not received significant attention. In order to obtain an understanding of the topic and its main issues, a qualitative method was employed: questionnaires and interviews were conducted with CIOs and other C-level executives, together with an analysis of the curricula of postgraduate educational programs in the field of business designed for executives. Business skills and knowledge, along with developed communication and leadership skills, are among the most discussed and most required of CIOs. According to the collected data and its further analysis, although the most important competences of an IT executive are technological, the importance of business-related skills is emphasized by the majority of respondents and supported by the existing theory. Postgraduate educational programs have curricula that can develop the required competences, although not equally.
Abstract:
The JÄKÄLA algorithm (Jatkuvan Äänitehojakautuman algoritmi Käytävien Äänikenttien LAskentaan, a continuous sound-power distribution algorithm for calculating the sound fields of corridors) and its NUMO and APPRO calculation equations are based on the symmetry of the image sources of a real sound source located in a corridor. NUMO is the calculation equation of the numerical solution of the algorithm, and APPRO that of the approximate solution. In deriving the algorithm it was assumed that the absorption material was evenly distributed over the sound-reflecting surfaces of the corridor. The transformation of the image-source plane of a rectangular corridor into a continuous sound-power distribution involves three stages. First, the rectangular image-source plane is transformed into a square one. Next, the equal-valued image sources of the square image-source plane are moved onto a coordinate axis as a discrete sequence of image sources. Finally, the image-source sequence is transformed into a continuous sound-power distribution, so that the sound pressure level at a receiving point in the corridor can be calculated by integrating over the continuous sound-power distribution. To establish the validity of the JÄKÄLA algorithm, the tested commercial AKURI program was used. The AKURI program also gave a good idea of how the values calculated with the NUMO and APPRO equations might differ from values measured in real corridors. The NUMO and APPRO equations of the JÄKÄLA algorithm were also tested by comparing their results with sound pressure level measurements in three corridors of different types. This study shows that it is possible, on the basis of acoustic image theory, to derive a calculation algorithm that can be applied to the rapid on-site estimation of the sound fields of long corridors. Both the theoretical calculations and practical sound pressure level measurements in real corridors showed that the predictive accuracy of the equations of the JÄKÄLA algorithm was excellent for ideal corridors and good for those real corridors that had no sound-reflecting structures. The NUMO and APPRO equations appear to work well in corridors whose cross-section is nearly square and in which the largest absorption coefficient of the surfaces is at most ten times the smallest. The main shortcoming of the NUMO and APPRO equations is that they take into account neither the differing absorption coefficients of the surfaces nor sounds reflected from objects. The NUMO and APPRO equations deviated most from the measured values in corridors where the absorption coefficient of two opposite surfaces was very large and that of the other pair very small, and in corridors with massive sound-reflecting pillars and beams. Nevertheless, in the corridors studied, the NUMO and APPRO equations of the JÄKÄLA algorithm gave clearly more accurate values than Kuttruff's approximate equation and the basic equation of statistical room acoustics. The computational accuracy of the JÄKÄLA algorithm has been tested in only four real corridors. To develop the algorithm further, the opposite surfaces of a corridor and their absorption coefficients should be treated in pairs in the calculation. To confirm the validity of the algorithm, more measurements are needed in corridors whose distributions of absorption material differ from each other.
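For a feel of the image-source reasoning the algorithm starts from, the sketch below sums the energy of the mirror images of a point source across the two side walls of a corridor, with a uniform absorption loss per reflection (the derivation's starting assumption). The geometry, source power, and truncation order are invented, and this is not the JÄKÄLA transformation to a continuous sound-power distribution.

```python
import numpy as np

W = 1e-3       # source sound power in watts (hypothetical)
alpha = 0.1    # uniform absorption coefficient, as the derivation assumes
L = 2.0        # corridor width in metres
src_x, rcv_x, dist = 0.7, 1.2, 10.0   # across-corridor positions, along-corridor distance

p2 = 0.0       # accumulated squared pressure via summed image intensities
for m in range(-25, 26):
    # images across walls at x = 0 and x = L: 2mL + s (2|m| hits), 2mL - s (|2m-1| hits)
    for x_img, hits in ((2 * m * L + src_x, abs(2 * m)),
                        (2 * m * L - src_x, abs(2 * m - 1))):
        r = np.hypot(x_img - rcv_x, dist)
        p2 += (1 - alpha) ** hits * W / (4 * np.pi * r ** 2) * 400.0   # rho*c ~ 400

Lp = 10 * np.log10(p2 / 4e-10)   # re p_ref^2 = (20 uPa)^2
print(f"SPL ~ {Lp:.1f} dB")
```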
Abstract:
The computer game industry has grown steadily for years, and in revenues it is comparable to the music and film industries. The game industry has been moving to digital distribution. Computer gaming and the concept of the business model are discussed among industry practitioners and the scientific community. The significance of the business model concept has increased in the scientific literature recently, although there is still much ongoing discussion of the concept. In this thesis, the role of the business model in the computer game industry is studied. Computer game developers, designers, project managers, and organization leaders in 11 computer game companies were interviewed. The data was analyzed to identify the important elements of the computer game business model, how the business model concept is perceived, and how the growth of the organization affects the business model. Human capital was identified as crucial to the business. As games are partly a product of creative thinking, innovation and the creative process are also highly valued, as are technical skills in performing various activities. Marketing and customer relationships are also considered key elements of the computer game business model. Financing and partners are important especially for startups, when the organization is dependent on external funding and third-party assets. The results of this study provide organizations with an improved understanding of how the organization is built and which business model elements carry the most weight.
Abstract:
A cognitively based instructional program for narrative writing was developed. The effects of using cognitively based schematic planning organizers at the pre-writing stage were evaluated using subjects from the Primary, Junior, and Intermediate divisions. The results indicate that the use of organizers based on problem solving significantly improved the organization and the overall quality of narrative writing for students in grades 3, 6, and 7. The magnitude of the improvement of the treatment group over the control group's performance in Organization ranged from 10.7% to 22.9%. The statistical and observational data have many implications for further research into the cognitive basis of writing and reading; for the improvement and evaluation of school writing programs; for the design of school curricula; and for the in-service education of teachers of writing.
Abstract:
The Lennard-Jones and Devonshire (LJD) single-particle theory for liquids is extended and applied to the anharmonic solid in a high-temperature limit. The exact free energy for the crystal is expressed as a convergent series of terms involving larger and larger sets of contiguous particles called cell clusters. The motions of all the particles within a cell cluster are correlated with each other and lead to non-trivial integrals of orders 3, 6, 9, ..., 3N. For the first time, the six-dimensional integral has been calculated to high accuracy, using a Lennard-Jones (6-12) pair interaction between nearest neighbours only, for the f.c.c. lattice. The thermodynamic properties predicted by this model agree well with experimental results for solid xenon.
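A minimal numerical rendering of the LJD single-particle picture (not the higher-order cell-cluster corrections that are the paper's contribution): one atom moves in the spherically smeared field of its twelve fcc nearest neighbours held at their sites, and its cell partition function is a radial integral. Reduced units and the neighbour distance are assumptions for illustration.

```python
import numpy as np
from scipy.integrate import quad

def lj(r, eps=1.0, sigma=1.0):
    """Lennard-Jones (6-12) pair potential in reduced units."""
    s6 = (sigma / r) ** 6
    return 4.0 * eps * (s6 * s6 - s6)

def w(r, a=1.1, z=12):
    """Spherically smeared potential of z neighbours at lattice distance a."""
    if r == 0.0:
        return z * lj(a)
    integrand = lambda c: lj(np.sqrt(a * a + r * r - 2.0 * a * r * c))
    return z * 0.5 * quad(integrand, -1.0, 1.0)[0]   # average over directions

def cell_q(T, r_max=0.45):
    """Single-particle cell partition function q(T) (k_B = 1)."""
    integrand = lambda r: np.exp(-(w(r) - w(0.0)) / T) * r * r
    return 4.0 * np.pi * quad(integrand, 0.0, r_max)[0]

print(cell_q(T=1.0))   # free-volume integral entering the LJD free energy
```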
Abstract:
We have calculated the thermodynamic properties of monatomic fcc crystals from the high-temperature limit of the Helmholtz free energy. This equation of state included the static and vibrational energy components. The latter contribution was calculated to order λ⁴ of perturbation theory, for a range of crystal volumes, using a nearest-neighbour central force model. We have calculated the lattice constant, the coefficient of volume expansion, the specific heat at constant volume and at constant pressure, the adiabatic and the isothermal bulk modulus, and the Grüneisen parameter for two of the rare gas solids, Xe and Kr, and for the fcc metals Cu, Ag, Au, Al, and Pb. The Lennard-Jones and the Morse potential were each used to represent the atomic interactions for the rare gas solids, and only the Morse potential was used for the fcc metals. The thermodynamic properties obtained from the λ⁴ equation of state with the Lennard-Jones potential seem to be in reasonable agreement with experiment for temperatures up to about three-quarters of the melting temperature. However, for higher temperatures, the results are less than satisfactory. For Xe and Kr, the thermodynamic properties calculated from the λ² equation of state with the Morse potential are qualitatively similar to the λ² results obtained with the Lennard-Jones potential; however, the properties obtained from the λ⁴ equation of state are in good agreement with experiment, since the contribution from the λ⁴ terms seems to be small. The lattice contribution to the thermal properties of the fcc metals was calculated from the λ⁴ equation of state, and these results produced a slight improvement over the properties calculated from the λ² equation of state. In order to compare the calculated specific heat and bulk modulus results with experiment, the electronic contribution to the thermal properties was taken into account by using the free electron model. We found that the results varied significantly with the value chosen for the number of free electrons per atom.
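To show how the quoted properties follow once F(V, T) is in hand, the sketch below differentiates a placeholder free energy numerically: pressure P = -∂F/∂V, isothermal bulk modulus B_T = V ∂²F/∂V², and specific heat C_V = -T ∂²F/∂T². The toy F is not the thesis's λ⁴ equation of state; only the derivative machinery is the point. Note that C_V comes out at the classical value of 3k per atom, as expected in the high-temperature limit.

```python
import numpy as np

def F(V, T):
    """Placeholder Helmholtz free energy (NOT the thesis's model)."""
    E_static = (V - 1.0) ** 2 - 1.0          # invented static lattice energy
    theta = V ** -0.5                        # invented volume-dependent frequency
    return E_static + 3.0 * T * np.log(theta / T)   # classical vibrational term

h = 1e-4
def P(V, T):    return -(F(V + h, T) - F(V - h, T)) / (2 * h)                 # -dF/dV
def B_T(V, T):  return V * (F(V + h, T) - 2 * F(V, T) + F(V - h, T)) / h**2  # V d2F/dV2
def C_V(V, T):  return -T * (F(V, T + h) - 2 * F(V, T) + F(V, T - h)) / h**2 # -T d2F/dT2

V0, T0 = 1.0, 0.8
print(P(V0, T0), B_T(V0, T0), C_V(V0, T0))   # C_V -> 3 (Dulong-Petit, k_B = 1)
```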