929 results for Lattice theory - Computer programs


Relevance: 30.00%

Publisher:

Abstract:

Computer game technology provides us with the tools to create web-based educational materials for autonomous and collaborative learning. At Worcester, we have researched the use of this technology in various educational contexts. This paper reports one such study: the use of the commercial game engine "Unreal Tournament 2004" (UT2004) to produce materials suitable for the education of architects. We map the concepts and principles of architectural design onto the affordances (development tools) provided by UT2004, leading to a systematic procedure for the realization of buildings and urban environments within this game engine. A theory for the production of web-based learning materials that supports both autonomous and collaborative learning is developed. A heuristic evaluation of our materials, as used with second-year students, is presented. Associated web pages provide online materials for delegates.

Relevance: 30.00%

Publisher:

Abstract:

At a recent conference on games in education, we made a radical decision to transform our standard presentation of PowerPoint slides and computer game demonstrations into a unified whole, inserting the PowerPoint presentation into the computer game itself. This opened up various questions relating to learning and teaching theories, which were debated by the conference delegates. In this paper, we reflect on these discussions, present our initial experiment, and relate it to various theories of learning and teaching. In particular, we consider the applicability of "concept maps" to inform the construction of educational materials, especially their topological, geometrical and pedagogical significance. We supplement this "spatial" dimension with a theory of the dynamic, temporal dimension, grounded in a context of learning processes such as Kolb's learning cycle. Finally, we address the multi-player aspects of computer games and relate them to theories of social and collaborative learning. The paper thus explores several theoretical bases in order to support the development of a new learning and teaching approach based on virtual reality.

Relevance: 30.00%

Publisher:

Abstract:

Since at least the 1980s, a growing number of companies have set up an ethics or compliance program within their organization. However, in the field of business management there is a paucity of research concerning these management systems. This observation warranted the present investigation of one company's compliance program. Compliance programs are set up so that individuals working within an organization observe the laws and regulations that pertain to their work. This study used a constructivist grounded theory methodology to examine the process by which a specific compliance program, that of Siemens Canada Limited, was implemented throughout the organization. In keeping with this methodology, instead of proceeding according to a particular theoretical framework, the study established a number of theoretical constructs used strictly as reference points. The study's research question was: what are the characteristics of the process by which Siemens' compliance program integrated itself into the existing organizational structure and gained employee acceptance? Data consisted of documents produced by the company and interviews with twenty-four managers working for Siemens Canada Limited. The researcher used the QSR NVivo software to code transcripts and to help analyze the interviews and documents. Triangulation was achieved by using a number of analysis techniques and by constantly comparing findings with extant theory. A descriptive model of the implementation process, grounded in the experience of participants and in the contents of the documents, emerged from the data. The process was called "Remolding," remolding being the core category that emerged. This main process consisted of two sub-processes, identified as "embedding" and "appraising." The investigation provided a detailed account of the appraising process. It identified that employees appraised the compliance program according to three facets: the impact of the program on the employee's daily activities, the relationship employees have with the local compliance organization, and the relationship employees have with the corporate ethics identity. The study suggests that a company considering implementing a compliance program should attend to all three facets. In particular, any company interested in designing and implementing a compliance program should pay special attention to its corporate ethics identity, because employees' acceptance of the program is influenced by their comparison of the company's ethics identity with their local ethics identity. The findings imply that personnel responsible for the development and organizational support of a compliance program should understand the appraisal process by which employees build their relationship with the program. The originality of this study is that it points out emphatically that companies must pay special attention to developing a corporate ethics identity that is coherent, well documented and well explained.

Relevance: 30.00%

Publisher:

Abstract:

We explore the relationships between the construction of a work of art and the crafting of a computer program in Java, and suggest that the structure of paintings and drawings may be used to teach the fundamental concepts of computer programming. This movement "from Art to Science", using art to drive computing, complements the common use of computing to inform art. We report on initial experiences of using this approach with undergraduate and postgraduate students. An embryonic theory of the correspondence between art and computing is presented, and a methodology is proposed to develop the project further.
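The abstract stops short of code; the sketch below is a hypothetical Java illustration of one such correspondence — a painting as an ordered composition of shapes, mirroring object composition and iteration. All class and method names are our own illustrative assumptions, not taken from the paper.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: a painting's structure (shapes composed in layers)
// mirrors core programming concepts: types, composition, and iteration.
interface Shape {
    String describe();
}

record Circle(double radius) implements Shape {
    public String describe() { return "circle of radius " + radius; }
}

record Rectangle(double w, double h) implements Shape {
    public String describe() { return "rectangle " + w + " x " + h; }
}

// A painting is an ordered composition of shapes, much as a program is an
// ordered composition of statements and objects.
class Painting {
    private final List<Shape> layers = new ArrayList<>();

    void add(Shape s) { layers.add(s); }     // composition

    void render() {                          // iteration over the structure
        for (Shape s : layers) {
            System.out.println("paint " + s.describe());
        }
    }
}

public class ArtToCode {
    public static void main(String[] args) {
        Painting p = new Painting();
        p.add(new Rectangle(4, 3));  // background layer first
        p.add(new Circle(1));        // foreground layer on top
        p.render();
    }
}
```

In this reading, layering order plays the role of statement sequencing: reordering the `add` calls changes the picture just as reordering statements changes a program's behavior.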

Relevance: 30.00%

Publisher:

Abstract:

Coupled map lattices (CMLs) can describe many relaxation and optimization algorithms currently used in image processing. We recently introduced the "plastic-CML" as a paradigm for extracting (segmenting) objects in an image. Here, the image is treated as a set of forces applied to a metal sheet, which is allowed to undergo plastic deformation parallel to the applied forces. In this paper we present an analysis of our "plastic-CML" in one and two dimensions, deriving the nature and stability of its stationary solutions. We also detail how to use the CML in image processing and how to set the system parameters, and present examples of it at work. We conclude that the plastic-CML is able to segment images with large amounts of noise and a large dynamic range of pixel values, and is suitable for a very large scale integration (VLSI) implementation.
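The abstract gives no equations or code; as a rough orientation, the sketch below implements one step of a generic one-dimensional coupled map lattice with diffusive nearest-neighbour coupling. It is not the authors' plastic-CML rule (whose local map and plasticity dynamics are not given here); the local map and coupling constant are illustrative assumptions.

```java
// Generic 1-D coupled map lattice step with diffusive coupling:
//   x_i(t+1) = (1 - eps) * f(x_i) + (eps / 2) * (f(x_{i-1}) + f(x_{i+1}))
// The bistable local map f drives each site toward one of two states,
// loosely analogous to object/background segmentation.
public class CmlStep {
    static double f(double x) {
        return Math.tanh(2.0 * x);   // bistable: fixed points near +-0.96
    }

    static double[] step(double[] x, double eps) {
        int n = x.length;
        double[] next = new double[n];
        for (int i = 0; i < n; i++) {
            double left = f(x[(i - 1 + n) % n]);   // periodic boundary
            double right = f(x[(i + 1) % n]);
            next[i] = (1 - eps) * f(x[i]) + eps * 0.5 * (left + right);
        }
        return next;
    }

    public static void main(String[] args) {
        double[] row = {-0.1, 0.5, 1.0, 0.5, -0.1};  // e.g. one image row
        for (int t = 0; t < 100; t++) row = step(row, 0.3);
        for (double v : row) System.out.printf("%.4f%n", v);
    }
}
```

In an image-processing setting, each lattice site would hold a pixel value, and iterating the map relaxes the lattice toward a stationary, segmented configuration.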

Relevance: 30.00%

Publisher:

Abstract:

Many-core systems are emerging from the need for more computational power and power efficiency. However, many issues still surround many-core systems: they need specialized software before they can be fully utilized, and the hardware itself may differ from conventional computational systems. To gain efficiency from a many-core system, programs need to be parallelized. In many-core systems the cores are smaller and less powerful than the cores used in traditional computing, so running a conventional program is not an efficient option. Also, in Network-on-Chip-based processors, the network may become congested and the cores may work at different speeds. In this thesis, a dynamic load balancing method is proposed and tested on the Intel 48-core Single-Chip Cloud Computer by parallelizing a fault simulator. The maximum speedup is difficult to obtain due to severe bottlenecks in the system. In order to exploit all the available parallelism of the Single-Chip Cloud Computer, a runtime approach capable of dynamically balancing the load during the fault simulation process is used. The proposed dynamic fault simulation approach on the Single-Chip Cloud Computer shows up to a 45X speedup compared to a serial fault simulation approach.

Many-core systems can draw enormous amounts of power, and if this power is not controlled properly the system might get damaged. One way to manage power is to set a power budget for the system. But if this power is drawn by just a few of the many cores, those few cores get extremely hot and might get damaged. Due to the increase in power density, multiple thermal sensors are deployed on the chip to provide real-time temperature feedback for thermal management techniques. Thermal sensor accuracy is extremely prone to intra-die process variation and aging phenomena. These factors cause sensor values to drift from their nominal values, which makes efficient calibration necessary before the values are used. In addition, cores in modern many-core systems support dynamic voltage and frequency scaling, and thermal sensors located on cores are sensitive to the core's current voltage level, meaning that dedicated calibration is needed for each voltage level. This thesis therefore also proposes a general-purpose, software-based auto-calibration approach for thermal sensors across a range of voltage levels.
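The thesis abstract describes the load balancing approach only at a high level; the following is a minimal, hypothetical Java sketch of dynamic load balancing via a shared work queue, where idle workers pull the next chunk of (here, simulated) fault-simulation work. It is not the SCC implementation, and all names are illustrative.

```java
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;

// Minimal dynamic load balancing: workers pull fault IDs from a shared
// queue as they become free, so faster cores naturally take more work.
// Illustrates the general idea only, not the Single-Chip Cloud Computer code.
public class DynamicBalancer {
    public static void main(String[] args) throws InterruptedException {
        int faults = 1000, workers = 8;
        BlockingQueue<Integer> queue = new LinkedBlockingQueue<>();
        for (int f = 0; f < faults; f++) queue.add(f);

        AtomicInteger done = new AtomicInteger();
        ExecutorService pool = Executors.newFixedThreadPool(workers);
        for (int w = 0; w < workers; w++) {
            pool.submit(() -> {
                Integer fault;
                while ((fault = queue.poll()) != null) {
                    simulateFault(fault);       // stand-in for real work
                    done.incrementAndGet();
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        System.out.println("simulated " + done.get() + " faults");
    }

    static void simulateFault(int id) {
        // placeholder: uneven per-fault cost is what motivates dynamic
        // rather than static work assignment
        long spin = (id % 7 + 1) * 10_000L;
        for (long i = 0; i < spin; i++) { /* busy work */ }
    }
}
```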

Relevance: 30.00%

Publisher:

Abstract:

Coronal jets represent important manifestations of ubiquitous solar transients, which may be a source of significant mass and energy input to the upper solar atmosphere and the solar wind. While the energy involved in a jet-like event is smaller than that of "nominal" solar flares and coronal mass ejections (CMEs), jets share many common properties with these phenomena, in particular their explosive, magnetically driven dynamics. Studies of jets could therefore provide critical insight for understanding the larger, more complex drivers of solar activity. At the other end of the size spectrum, the study of jets could also supply important clues on the physics of transients at or close to the limit of current spatial resolution, such as spicules. Furthermore, jet phenomena may hint at basic processes for heating the corona and accelerating the solar wind; consequently, their study gives us the opportunity to attack a broad range of solar-heliospheric problems.

Relevance: 30.00%

Publisher:

Abstract:

This analysis paper presents previously unknown properties of some special cases of the Wright function, whose consideration is necessitated by our work on probability theory and the theory of stochastic processes. Specifically, we establish new asymptotic properties of the particular Wright function

$$ {}_1\Psi_1(\rho, k; \rho, 0; x) \;=\; \sum_{n=0}^{\infty} \frac{\Gamma(k + \rho n)}{\Gamma(\rho n)} \, \frac{x^n}{n!} \qquad (|x| < \infty) $$

when the parameter $\rho \in (-1, 0) \cup (0, \infty)$ and the argument $x$ is real. In the probability theory applications, which are focused on studies of the Poisson-Tweedie mixtures, the parameter $k$ is a non-negative integer. Several representations involving well-known special functions are given for certain particular values of $\rho$. The asymptotics of ${}_1\Psi_1(\rho, k; \rho, 0; x)$ are obtained under numerous assumptions on the behavior of the arguments $k$ and $x$ when the parameter $\rho$ is both positive and negative. We also provide some integral representations and structural properties involving the 'reduced' Wright function ${}_0\Psi_1(--; \rho, 0; x)$ with $\rho \in (-1, 0) \cup (0, \infty)$, which might be useful for the derivation of new properties of members of the power-variance family of distributions. Some of these imply a reflection principle that connects the functions ${}_0\Psi_1(--; \pm\rho, 0; \,\cdot\,)$ and certain Bessel functions. Several asymptotic relationships for both particular cases of this function are also given. A few of these follow, under additional constraints, from probability theory results which, although previously available, were unknown to analysts.
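For orientation, if the 'reduced' function follows the same generalized-Wright convention as the series above (an inference from the notation, not a statement taken from the paper), it would read

$$ {}_0\Psi_1(--;\, \rho, 0;\, x) \;=\; \sum_{n=0}^{\infty} \frac{1}{\Gamma(\rho n)} \, \frac{x^n}{n!}, $$

where the $n = 0$ term vanishes because $1/\Gamma(0) = 0$.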

Relevance: 30.00%

Publisher:

Abstract:

The protein folding problem has been one of the most challenging subjects in biological physics due to its complexity. Energy landscape theory, based on statistical mechanics, provides a thermodynamic interpretation of the protein folding process. We have been working to answer fundamental questions about protein-protein and protein-water interactions, which are very important for describing the energy landscape surface of proteins correctly. First, we present a new method for computing protein-protein interaction potentials of solvated proteins directly from SAXS data. An ensemble of proteins was modeled by Metropolis Monte Carlo and Molecular Dynamics simulations, and the global X-ray scattering of the whole model ensemble was computed at each snapshot of the simulation. The interaction potential model was optimized and iterated by a Levenberg-Marquardt algorithm. Secondly, we report that terahertz spectroscopy directly probes hydration dynamics around proteins and determines the size of the dynamical hydration shell. We also present the sequence and pH dependence of the hydration shell and the effect of hydrophobicity. In addition, kinetic terahertz absorption (KITA) spectroscopy is introduced to study the refolding kinetics of ubiquitin and its mutants. KITA results are compared to small-angle X-ray scattering, tryptophan fluorescence, and circular dichroism results. We propose that KITA monitors the rearrangement of hydrogen bonding during secondary structure formation. Finally, we present the development of the automated single molecule operating system (ASMOS) for a high-throughput single molecule detector, which levitates a single protein molecule in a 10 µm diameter droplet by laser guidance. I have also performed supporting calculations and simulations with my own program code.
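The abstract names Metropolis Monte Carlo as the ensemble-sampling method; below is a minimal, generic Metropolis acceptance loop in Java on a toy one-dimensional energy landscape, not the dissertation's protein model. The energy function and parameters are illustrative assumptions.

```java
import java.util.Random;

// Generic Metropolis Monte Carlo: propose a random move, always accept
// downhill moves, accept uphill moves with probability exp(-dE / kT).
// Toy 1-D double-well energy; not the protein model from the dissertation.
public class Metropolis {
    static double energy(double x) {
        return (x * x - 1) * (x * x - 1);   // minima at x = -1 and x = +1
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        double kT = 0.5, stepSize = 0.2;
        double x = 0.0, e = energy(x);
        int accepted = 0, steps = 100_000;

        for (int i = 0; i < steps; i++) {
            double xNew = x + stepSize * (2 * rng.nextDouble() - 1);
            double eNew = energy(xNew);
            // Metropolis acceptance criterion
            if (eNew <= e || rng.nextDouble() < Math.exp(-(eNew - e) / kT)) {
                x = xNew; e = eNew; accepted++;
            }
        }
        System.out.printf("acceptance rate: %.2f%%%n", 100.0 * accepted / steps);
        System.out.printf("final state: x = %.3f, E = %.3f%n", x, e);
    }
}
```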

Relevance: 30.00%

Publisher:

Abstract:

COSTA, Umberto Souza; MOREIRA, Anamaria Martins; MUSICANTE, Martin A.; SOUZA NETO, Plácido A. JCML: A specification language for the runtime verification of Java Card programs. Science of Computer Programming. [S.l.: s.n.], 2010.

Relevance: 30.00%

Publisher:

Abstract:

COSTA, Umberto Souza da; MOREIRA, Anamaria Martins; MUSICANTE, Martin A. Specification and Runtime Verification of Java Card Programs. Electronic Notes in Theoretical Computer Science. [S.l.: s.n.], 2009.

Relevance: 30.00%

Publisher:

Abstract:

We present ideas for creating a next-generation Intrusion Detection System (IDS) based on the latest immunological theories. The central challenge in computer security is determining the difference between normal and potentially harmful activity. For half a century, developers have protected their systems by coding rules that identify and block specific events. However, the nature of current and future threats, in conjunction with ever larger IT systems, urgently requires the development of automated and adaptive defensive tools. A promising solution is emerging in the form of Artificial Immune Systems (AIS): the Human Immune System (HIS) can detect and defend against harmful and previously unseen invaders, so could we not build a similar Intrusion Detection System for our computers? Presumably, such systems would have the same beneficial properties as the HIS, such as error tolerance, adaptation and self-monitoring. Current AIS have been successful on test systems, but their algorithms rely on self-nonself discrimination, as stipulated in classical immunology. However, immunologists are increasingly finding fault with traditional self-nonself thinking, and a new "Danger Theory" (DT) is emerging. This new theory suggests that the immune system reacts to threats based on the correlation of various (danger) signals, and it provides a method of "grounding" the immune response, i.e. linking it directly to the attacker. Little is currently understood of the precise nature and correlation of these signals, and the theory is a topic of hot debate. The aim of this research is to investigate this correlation and to translate the DT into the realm of computer security, thereby creating AIS that are no longer limited by self-nonself discrimination. It should be noted that we do not intend to defend this controversial theory per se, although as a deliverable this project will add to the body of knowledge in this area. Rather, we are interested in its merits for scaling up AIS applications by overcoming self-nonself discrimination problems.
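The precise signals and their correlation are exactly what the abstract leaves open; as a purely illustrative sketch (not the project's algorithm), the snippet below correlates several hypothetical danger signals with a weighted score and raises an alert above a threshold. The signal names, weights, and threshold are all invented for the example.

```java
import java.util.Map;

// Illustrative Danger Theory-style correlation: an alert fires only when
// several weighted danger signals co-occur, rather than on any single
// self/nonself mismatch. Signals, weights, and threshold are invented.
public class DangerCorrelator {
    private static final Map<String, Double> WEIGHTS = Map.of(
        "cpu_spike", 0.2,        // sudden CPU usage jump
        "unusual_syscalls", 0.4, // syscall pattern far from baseline
        "outbound_burst", 0.3,   // unexpected outbound traffic
        "file_churn", 0.1        // rapid file modifications
    );
    private static final double THRESHOLD = 0.6;

    static boolean isDangerous(Map<String, Double> observed) {
        double score = 0.0;
        for (var e : WEIGHTS.entrySet()) {
            // each observed signal intensity is assumed to lie in [0, 1]
            score += e.getValue() * observed.getOrDefault(e.getKey(), 0.0);
        }
        return score >= THRESHOLD;
    }

    public static void main(String[] args) {
        Map<String, Double> event = Map.of(
            "cpu_spike", 0.9, "unusual_syscalls", 0.8, "outbound_burst", 0.7);
        System.out.println(isDangerous(event) ? "ALERT: correlated danger"
                                              : "no alert");
    }
}
```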

Relevance: 30.00%

Publisher:

Abstract:

We determine numerically the single-particle and the two-particle spectrum of the three-state quantum Potts model on a lattice by using the density matrix renormalization group method, and extract information on the asymptotic (small momentum) S-matrix of the quasiparticles. The low energy part of the finite size spectrum can be understood in terms of a simple effective model introduced in a previous work, and is consistent with an asymptotic S-matrix of an exchange form below a momentum scale p*. This scale appears to vanish faster than the Compton scale, mc, as one approaches the critical point, suggesting that a dangerously irrelevant operator may be responsible for the behaviour observed on the lattice.

Relevance: 30.00%

Publisher:

Abstract:

In this dissertation I draw a connection between quantum adiabatic optimization, spectral graph theory, heat diffusion, and sub-stochastic processes through the operators that govern these processes and their associated spectra. In particular, we study Hamiltonians which have recently become known as "stoquastic" or, equivalently, the generators of sub-stochastic processes. The operators corresponding to these Hamiltonians are of interest in all of the settings mentioned above. I predominantly explore the connection between the spectral gap of an operator, that is, the difference between its two lowest energies, and certain equilibrium behavior. In the context of adiabatic optimization, this corresponds to the likelihood of solving the optimization problem of interest. I provide an instance of an optimization problem that is easy to solve classically but leaves open the possibility of being difficult adiabatically. Aside from this concrete example, the work in this dissertation is predominantly mathematical, and we focus on bounding the spectral gap. Our primary tool for doing this is spectral graph theory, which provides the most natural approach to this task by simply considering Dirichlet eigenvalues of subgraphs of host graphs. I derive tight bounds for the gap of one-dimensional, hypercube, and general convex subgraphs. The techniques used also adapt methods recently used by Andrews and Clutterbuck to prove the long-standing "Fundamental Gap Conjecture".
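To fix notation (these are the standard definitions, added for orientation rather than taken from the dissertation): for an operator $H$ with eigenvalues $\lambda_0(H) \le \lambda_1(H) \le \dots$, the spectral gap is

$$ \Delta(H) \;=\; \lambda_1(H) - \lambda_0(H), $$

and for a subgraph $S$ of a host graph, the Dirichlet eigenvalues are those of the operator restricted to $S$ with eigenvectors required to vanish on the boundary of $S$; the spectral graph theory approach described above bounds $\Delta(H)$ by studying these restricted eigenvalues.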