848 results for computer science and engineering


Relevance: 100.00%

Abstract:

Proofs by induction are central to many computer science areas such as data structures, theory of computation, programming languages, program efficiency-time complexity, and program correctness. Proofs by induction can also improve students’ understanding of and performance with computer science concepts such as programming languages, algorithm design, and recursion, as well as serve as a medium for teaching them. Even though students are exposed to proofs by induction in many courses of their curricula, they still have difficulties understanding and performing them. This impacts the whole course of their studies, since proofs by induction are omnipresent in computer science. Specifically, students do not gain conceptual understanding of induction early in the curriculum and as a result, they have difficulties applying it to more advanced areas later on in their studies. The goal of my dissertation is twofold: 1. identifying sources of computer science students’ difficulties with proofs by induction, and 2. developing a new approach to teaching proofs by induction by way of an interactive and multimodal electronic book (e-book). For the first goal, I undertook a study to identify possible sources of computer science students’ difficulties with proofs by induction. Its results suggest that there is a close correlation between students’ understanding of inductive definitions and their understanding and performance of proofs by induction. For designing and developing my e-book, I took into consideration the results of my study, as well as the drawbacks of the current methodologies of teaching proofs by induction for computer science. I designed my e-book to be used as a standalone and complete educational environment. I also conducted a study on the effectiveness of my e-book in the classroom. 
The results of my study suggest that, unlike the current methodologies of teaching proofs by induction for computer science, my e-book helped students overcome many of their difficulties and gain conceptual understanding of proofs by induction.
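For readers outside the area, a canonical proof by induction (an illustration, not taken from the e-book itself) runs as follows:

```latex
\textbf{Claim.} For all $n \ge 1$, $\sum_{i=1}^{n} i = \frac{n(n+1)}{2}$.

\textbf{Base case.} For $n = 1$: $\sum_{i=1}^{1} i = 1 = \frac{1 \cdot 2}{2}$.

\textbf{Inductive step.} Assume the claim holds for $n = k$. Then
\[
  \sum_{i=1}^{k+1} i = \frac{k(k+1)}{2} + (k+1) = \frac{(k+1)(k+2)}{2},
\]
so the claim holds for $n = k+1$. By induction, it holds for all $n \ge 1$. \qed
```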

Abstract:

Petri Nets are a formal, graphical, and executable modeling technique for the specification and analysis of concurrent and distributed systems and have been widely applied in computer science and many other engineering disciplines. Low-level Petri nets are simple and useful for modeling control flows but not powerful enough to define data and system functionality. High-level Petri nets (HLPNs) have been developed to support data and functionality definitions, such as using complex structured data as tokens and algebraic expressions as transition formulas. Compared to low-level Petri nets, HLPNs result in compact system models that are easier to understand, and they are therefore more useful in modeling complex systems. There are two issues in using HLPNs - modeling and analysis. Modeling concerns abstracting and representing the systems under consideration using HLPNs, and analysis deals with effective ways to study the behaviors and properties of the resulting HLPN models. In this dissertation, several modeling and analysis techniques for HLPNs are studied and integrated into a framework that is supported by a tool. For modeling, this framework integrates two formal languages: a type of HLPN called the Predicate Transition Net (PrT Net) is used to model a system's behavior, and a first-order linear-time temporal logic (FOLTL) is used to specify the system's properties. The main contribution of this dissertation with regard to modeling is a software tool that supports the formal modeling capabilities of this framework. For analysis, this framework combines three complementary techniques: simulation, explicit-state model checking, and bounded model checking (BMC). Simulation is a straightforward and speedy method, but it only covers some execution paths in an HLPN model. Explicit-state model checking covers all the execution paths but suffers from the state explosion problem. BMC is a tradeoff, as it provides a certain level of coverage while being more efficient than explicit-state model checking. The main contribution of this dissertation with regard to analysis is adapting BMC to analyze HLPN models and integrating the three complementary analysis techniques in a software tool to support the formal analysis capabilities of this framework. The SAMTools suite developed for this framework integrates three tools: PIPE+ for HLPN behavioral modeling and simulation, SAMAT for hierarchical structural modeling and property specification, and PIPE+Verifier for behavioral verification.
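As a minimal sketch of the kind of net the framework builds on, a low-level Petri net (places holding tokens, transitions that consume and produce them) can be written in a few lines of Python. This toy deliberately omits everything that makes PrT nets "high level" (structured tokens, transition formulas) and is not part of the SAMTools suite:

```python
from collections import Counter

class PetriNet:
    """Toy low-level Petri net: places hold token counts and transitions
    consume/produce tokens. Illustrative only; tools such as PIPE+
    implement the far richer high-level nets described above."""

    def __init__(self, marking):
        self.marking = Counter(marking)   # place name -> token count
        self.transitions = {}             # name -> (input arcs, output arcs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (Counter(inputs), Counter(outputs))

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking[p] >= n for p, n in inputs.items())

    def fire(self, name):
        inputs, outputs = self.transitions[name]
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        self.marking -= inputs            # consume input tokens
        self.marking += outputs           # produce output tokens

# A producer/consumer control flow: 'produce' moves a token from
# 'idle' to 'buffer'; 'consume' drains the buffer again.
net = PetriNet({"idle": 1})
net.add_transition("produce", {"idle": 1}, {"buffer": 1})
net.add_transition("consume", {"buffer": 1}, {"idle": 1})
net.fire("produce")
print(net.marking["buffer"])  # 1
```

Simulation, in the sense used above, amounts to repeatedly choosing an enabled transition and firing it; model checking instead explores all such choices systematically.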

Abstract:

An interdisciplinary field trip to a remote marine lab brought together graduate students from fine arts and natural resource science departments to think creatively about climate change and science communication. We followed a learning-cycle framework that allowed the students to explore marine ecosystems and participate in scientific lectures, group discussions, and an artist-led project making abstract collages representing climate change processes. Students subsequently worked in small groups to develop environmental communication material for public visitors. We assessed the learning activity and the communication products using pre- and post-field-trip participant surveys, focus group discussions, and critiques of the products by art and communication experts. Significant changes in knowledge about climate change occurred in program participants. Incorporating artists and the arts into this activity helped engage multiple senses, emphasized social interaction, and supported participants in thinking creatively. The production of art helped to encourage peer learning and to normalize the different views among participants in communicating about climate change impacts. Based on external reviews, students created effective communication products. Disciplinary differences in cultures, language, and standards challenged participating faculty, yet unanticipated outcomes such as potentially transformative learning and improved teacher evaluations resulted.

Abstract:

C3S2E '16 Proceedings of the Ninth International C* Conference on Computer Science & Software Engineering

Abstract:

The primary goals of this study are to: embed sustainable concepts of energy consumption into part of the existing Computer Science curriculum for English schools; investigate how to motivate 7-to-11-year-old kids to learn these concepts; promote responsible ICT (Information and Communications Technology) use by these kids in their daily life; and raise their awareness of today's ecological challenges. The sustainability-related ICT lessons developed aim to provoke computational thinking and creativity in order to foster understanding of the environmental impact of ICT and of the positive environmental impact of small changes in user energy-consumption behaviour. Including sustainability in the Computer Science curriculum is important because ICT is both a solution to and one of the causes of current ecological problems. This research follows an Agile software development methodology. To achieve the aforementioned goals, sustainability requirements, curriculum requirements, and technical requirements are first analysed. Second, the web-based user interface is designed. In parallel, a set of three online lessons (video, slideshow, and game) is created for the website GreenICTKids.com, taking into account several green design patterns. Finally, the evaluation phase collects adults' and kids' feedback on the following: the user interface; the contents; the user interaction; and the impact on the kids' sustainability awareness and on their behaviour with technologies. The research outcomes are as follows: 92% of the adults learnt more about energy consumption; 80% of the kids are motivated to learn about energy consumption and found the website easy to use; 100% of the kids understood the contents and liked the website's visual appearance; and 100% of the kids will try to apply in their daily life what they learnt through the online lessons.

Abstract:

The main objectives of this thesis are to: validate an improved principal components analysis (IPCA) algorithm on images; design and simulate a digital model for image compression, face recognition, and image detection using the principal components analysis (PCA) algorithm and the IPCA algorithm; design and simulate an optical model for face recognition and object detection using the joint transform correlator (JTC); establish detection and recognition thresholds for each model; compare the performance of the PCA algorithm with that of the IPCA algorithm in compression, recognition, and detection; and compare the performance of the digital model with that of the optical model in recognition and detection. MATLAB software was used to simulate the models. PCA is a technique for identifying patterns in data and representing the data so as to highlight similarities and differences. Identifying patterns in high-dimensional data (more than three dimensions) is difficult because the data cannot be represented graphically; PCA is therefore a powerful method for analyzing such data. IPCA is another statistical tool for identifying patterns in data; it uses information theory to improve on PCA. The joint transform correlator (JTC) is an optical correlator used to synthesize a frequency-plane filter for coherent optical systems. The IPCA algorithm, in general, behaves better than the PCA algorithm in most of the applications. It is better than the PCA algorithm in image compression because it achieves higher compression, more accurate reconstruction, and faster processing with acceptable errors; in addition, it is better than the PCA algorithm in real-time image detection because it achieves the smallest error rate as well as remarkable speed. On the other hand, the PCA algorithm performs better than the IPCA algorithm in face recognition because it offers an acceptable error rate, easy calculation, and reasonable speed. Finally, in detection and recognition, the digital model performs better than the optical model.
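For readers unfamiliar with the baseline technique, a minimal PCA compression-and-reconstruction sketch can be written with an SVD. This is standard PCA only; the thesis's IPCA variant and its information-theoretic weighting are not reproduced here:

```python
import numpy as np

def pca_compress(X, k):
    """Project the rows of X onto the top-k principal components and
    reconstruct. Standard PCA via SVD; illustrative, not the thesis's
    MATLAB implementation."""
    mean = X.mean(axis=0)
    Xc = X - mean
    # Rows of Vt are the principal directions, ordered by variance.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:k]                      # k x d projection basis
    codes = Xc @ W.T                # compressed representation (n x k)
    return codes @ W + mean         # reconstruction (n x d)

rng = np.random.default_rng(0)
# 100 samples in 10-D that mostly vary along 2 latent directions.
X = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 10))
X += 0.01 * rng.normal(size=(100, 10))
err = np.linalg.norm(X - pca_compress(X, 2)) / np.linalg.norm(X)
print(round(err, 3))  # small relative error: 2 components capture the data
```

For image compression, each row of X would be a flattened image (or image block), and storing the k-dimensional codes plus the basis W in place of the raw pixels yields the compression.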

Abstract:

Colloid self-assembly under external control is a new route to the fabrication of advanced materials with novel microstructures and appealing functionalities. The kinetic processes of colloidal self-assembly have also attracted great interest because they resemble many atomic-level kinetic processes in materials. In the past decades, rapid technological progress has been made in producing shape-anisotropic, patchy, and core-shell structured particles, as well as particles carrying electric/magnetic charges or dipoles, greatly enriching the attainable self-assembled structures. Multi-phase carrier liquids offer a further route to controlling colloidal self-assembly. Heterogeneity is thus an essential characteristic of colloidal systems, yet so far no model has been able to efficiently incorporate all of these possible heterogeneities. This thesis is mainly devoted to the development of such a model and to a computational study of complex colloidal systems through a diffuse-interface field approach (DIFA), recently developed by Wang et al. This meso-scale model can describe arbitrary particle shapes and arbitrary charge/dipole distributions on the surface or in the body of particles. Within the framework of DIFA, a Gibbs-Duhem-type formula is introduced to treat Laplace pressure in multi-liquid-phase colloidal systems, consistent with the Young-Laplace equation. The model can thus quantitatively capture important capillarity-related phenomena. Extensive computer simulations are performed to study the fundamental behavior of heterogeneous colloidal systems. The role of Laplace pressure is revealed in determining the mechanical equilibrium of shape-anisotropic particles at fluid interfaces. In particular, it is found that the Laplace pressure plays a critical role in maintaining the stability of capillary bridges between close particles, which sheds light on a novel route to firming compact but fragile colloidal microstructures in situ via capillary bridges. Simulation results also show that the competition between like-charge repulsion, dipole-dipole interaction, and Brownian motion dictates the degree of aggregation of heterogeneously charged particles. The assembly and alignment of particles with magnetic dipoles under an external field are studied. Finally, extended studies on the role of dipole-dipole interaction are performed for ferromagnetic and ferroelectric domain phenomena. The results reveal that the internal field generated by the dipoles competes with the external field to determine the dipole-domain evolution in ferroic materials.
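For reference, the Young-Laplace equation that the Gibbs-Duhem-type treatment of Laplace pressure reproduces relates the pressure jump across a curved fluid interface to the interfacial tension $\gamma$ and the principal radii of curvature $R_1$, $R_2$:

```latex
\[
  \Delta p \;=\; \gamma \left( \frac{1}{R_1} + \frac{1}{R_2} \right),
\]
```

For a spherical droplet of radius $R$ this reduces to $\Delta p = 2\gamma/R$, which is the pressure excess sustaining the capillary bridges discussed above.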

Abstract:

We report the observation at the Relativistic Heavy Ion Collider of suppression of back-to-back correlations in the direct photon+jet channel in Au+Au relative to p+p collisions. Two-particle correlations of direct-photon triggers with associated hadrons are obtained by statistical subtraction of the decay photon-hadron (γ-h) background. The initial momentum of the away-side parton is tightly constrained because the parton-photon pair exactly balances in momentum at leading order in perturbative quantum chromodynamics, making such correlations a powerful probe of in-medium parton energy loss. The away-side nuclear suppression factor, I_AA, in central Au+Au collisions is 0.32 ± 0.12(stat) ± 0.09(syst) for hadrons of 3 < p_T^h < 5 GeV/c in coincidence with photons of 5 < p_T^γ < 15 GeV/c. The suppression is comparable to that observed for high-p_T single hadrons and dihadrons. The direct-photon associated yields in p+p collisions scale approximately with the momentum balance, z_T ≡ p_T^h/p_T^γ, as expected for a measurement of the away-side parton fragmentation function. We compare to Au+Au collisions, for which the momentum-balance dependence of the nuclear modification should be sensitive to the path-length dependence of parton energy loss.
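As conventionally defined in such photon-hadron analyses (the definition is standard but not spelled out in the abstract), the suppression factor compares per-trigger associated yields Y in the two collision systems at the same momentum balance:

```latex
\[
  I_{AA}(z_T) \;=\; \frac{Y_{\mathrm{Au+Au}}(z_T)}{Y_{p+p}(z_T)},
  \qquad
  z_T \equiv \frac{p_T^{h}}{p_T^{\gamma}},
\]
```

so that $I_{AA} = 1$ would indicate no medium modification, while the measured $I_{AA} \approx 0.32$ signals strong away-side suppression.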

Abstract:

The PHENIX experiment presents results from the RHIC 2006 run with polarized p+p collisions at √s = 62.4 GeV for inclusive π⁰ production at midrapidity. Unpolarized cross-section results are measured for transverse momenta p_T = 0.5 to 7 GeV/c. Next-to-leading-order perturbative quantum chromodynamics calculations are compared with the data; while the calculations are consistent with the measurements, next-to-leading-logarithmic corrections improve the agreement. Double-helicity asymmetries A_LL are presented for p_T = 1 to 4 GeV/c and probe the higher range of Bjorken x of the gluon (x_g) with better statistical precision than our previous measurements at √s = 200 GeV. These measurements are sensitive to the gluon polarization in the proton for 0.06 < x_g < 0.4.

Abstract:

We simplify the known formula for the asymptotic estimate of the number of deterministic and accessible automata with n states over a k-letter alphabet. The proof relies on the theory of Lagrange inversion applied in the context of generalized binomial series.

Abstract:

This paper describes three-dimensional microfluidic paper-based analytical devices (3-D μPADs) that can be programmed (post-fabrication) by the user to generate multiple patterns of flow through them. These devices are programmed by pressing single-use 'on' buttons, using a stylus or a ballpoint pen. Pressing a button closes a small gap between two vertically aligned microfluidic channels and allows fluids to wick from one channel to the other. These devices are simple to fabricate and are made entirely out of paper and double-sided adhesive tape. Programmable devices expand the capabilities of μPADs and provide a simple method for controlling the movement of fluids in paper-based channels. They are the conceptual equivalent of the field-programmable gate arrays (FPGAs) widely used in electronics.
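The button-programmed routing can be pictured, purely as a conceptual illustration not taken from the paper, as reachability in a graph whose nodes are channel segments and whose edges are open connections; pressing a button adds an edge between the two aligned channels (all names below are hypothetical):

```python
# Conceptual sketch: fluid wicks to every channel segment reachable
# from the inlet through open connections. A pressed button closes a
# gap, i.e. adds an edge between two vertically aligned channels.

def reachable(edges, inlet):
    """Return the set of channel segments wetted from `inlet`."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    wet, stack = set(), [inlet]
    while stack:
        node = stack.pop()
        if node not in wet:
            wet.add(node)
            stack.extend(adj.get(node, ()))
    return wet

# Two stacked channel layers, joined only once a button is pressed.
channels = [("inlet", "top1"), ("top1", "top2"), ("bottom1", "bottom2")]
print("bottom1" in reachable(channels, "inlet"))  # False: lower layer dry
channels.append(("top2", "bottom1"))              # press the 'on' button
print("bottom1" in reachable(channels, "inlet"))  # True: gap closed, flow wicks down
```

The FPGA analogy is visible here: the fixed channel layout plays the role of the gate fabric, and the set of pressed buttons plays the role of the programmed interconnect.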

Abstract:

Composition and orientation effects on the final recrystallization texture of three coarse-grained Nb-containing AISI 430 ferritic stainless steels (FSSs) were investigated. Hot bands of steels containing distinct amounts of niobium, carbon, and nitrogen were annealed at 1250 °C for 2 h to promote grain growth. In particular, the amount of Nb in solid solution varies from one grade to another. For comparison, the texture evolution of a hot-band sheet annealed at 1030 °C for 1 min (finer grain structure) was also investigated. Subsequently, the four sheets were cold rolled to 80% reduction and then annealed at 800 °C for 15 min. Texture was determined using X-ray diffraction and electron backscatter diffraction (EBSD). Noticeable differences in the final recrystallization texture and microstructure were observed among the four investigated grades. The results suggest that distinct nucleation mechanisms take place within these large grains, leading to the development of different final recrystallization textures. © 2011 Elsevier B.V. All rights reserved.

Abstract:

Oxide-dispersion-strengthened reduced-activation ferritic-martensitic steels are promising candidates for applications in future fusion power plants. Samples of a reduced-activation ferritic-martensitic 9 wt.% Cr oxide-dispersion-strengthened Eurofer steel were cold rolled to 80% reduction in thickness and annealed in vacuum for 1 h at temperatures from 200 to 1350 °C to evaluate their thermal stability. Vickers microhardness testing and electron backscatter diffraction (EBSD) were used to characterize the microstructure. The microstructural changes were also followed by magnetic measurements, in particular the corresponding variation of the coercive field (H_c) as a function of the annealing treatment. The results show that the magnetic measurements were sensitive enough to detect the changes, in particular the martensitic transformation, in samples annealed above 850 °C (austenitic regime). © 2010 Elsevier B.V. All rights reserved.

Abstract:

A combination of an extension of the topological instability "λ criterion" and a thermodynamic criterion was applied to the Al-La system, indicating the best range of compositions for glass formation. Alloy compositions in this range were prepared by melt spinning and by casting in an arc-melting furnace with a wedge-section copper mold. The glass-forming ability (GFA) of these samples was evaluated by X-ray diffraction, differential scanning calorimetry, and scanning electron microscopy. The results indicated that the γ* parameter of compositions with high GFA is higher, corresponding to a range in which the λ parameter is greater than 0.1, i.e., compositions far from the Al solid solution. A new alloy was identified with the best GFA reported so far for this system, showing a maximum thickness of 286 μm in a wedge-section copper mold. Crown Copyright © 2009 Published by Elsevier B.V. All rights reserved.

Abstract:

Inverse analysis is currently an important subject of study in several fields of science and engineering, as the identification of physical and geometric parameters from experimental measurements is required in many applications. In this work, a boundary element formulation to identify boundary and interface values as well as material properties is proposed. In particular, the proposed formulation is dedicated to identifying material parameters when a cohesive crack model is assumed for 2D problems. A computer code is developed and implemented using the BEM multi-region technique and regularisation methods to perform the inverse analysis. Several examples demonstrate the efficiency of the proposed model. © 2010 Elsevier Ltd. All rights reserved.