893 results for Computer Science, Software Engineering
Abstract:
Soft skills and teamwork practices were identified as the main deficiencies of recent graduates in computer courses. This issue led to a qualitative study aimed at investigating the challenges faced by professors of those courses in conducting, monitoring and assessing collaborative software development projects. Different challenges were reported by teachers, including difficulties in assessing students at both the collective and individual levels. In this context, a quantitative study was conducted with the aim of mapping students' soft skills to a set of indicators that can be extracted from software repositories using data mining techniques. These indicators are aimed at measuring soft skills such as teamwork, leadership, problem solving and pace of communication. Then, a peer assessment approach was applied in a collaborative software development course of the software engineering major at the Federal University of Rio Grande do Norte (UFRN). This research presents a correlation study between the students' soft skill scores and indicators based on mining software repositories. This study contributes by: (i) presenting professors' perceptions of the difficulties and opportunities for improving management and monitoring practices in collaborative software development projects; (ii) investigating relationships between soft skills and activities performed by students using software repositories; (iii) encouraging the development of soft skills and the use of software repositories among software engineering students; (iv) contributing to the state of the art of three important areas of software engineering, namely software engineering education, educational data mining and human aspects of software engineering.
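The core of such a correlation study can be sketched in a few lines of Python. The snippet below is a minimal illustration only, not the thesis' actual pipeline: the CSV files and column names (teamwork, leadership, commits, review_comments) are hypothetical placeholders, and Spearman rank correlation is used as one reasonable choice for ordinal peer-assessment scores.

```python
# Minimal sketch (not the thesis' actual pipeline): correlating peer-assessed
# soft-skill scores with indicators mined from a software repository.
# The CSV files and column names are hypothetical placeholders.
import pandas as pd
from scipy.stats import spearmanr

# One row per student: peer-assessment scores and repository indicators.
scores = pd.read_csv("peer_assessment_scores.csv")      # columns: student, teamwork, leadership
indicators = pd.read_csv("repository_indicators.csv")   # columns: student, commits, review_comments
data = scores.merge(indicators, on="student")

# Spearman rank correlation between each soft skill and each indicator.
for skill in ["teamwork", "leadership"]:
    for indicator in ["commits", "review_comments"]:
        rho, p = spearmanr(data[skill], data[indicator])
        print(f"{skill} vs {indicator}: rho={rho:.2f}, p={p:.3f}")
```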
Abstract:
In this Bachelor Thesis I want to provide readers with tools and scripts for the control of a 7-DOF manipulator, backed up by some theory from robotics and computer science in order to better contextualize the work done. In practice, we will see the most common software and development environments used to cope with our task: these include ROS, along with visual simulation in VREP and RVIZ, and an almost "stand-alone" ROS extension called MoveIt!, a very complete programming interface for trajectory planning and obstacle avoidance. As we will better appreciate and understand in the introductory chapter, the capability of detecting collision objects through a camera sensor and re-planning to the desired end-effector pose is not enough. In fact, this work is part of a more complex system in which recognition of particular objects is needed. Through a ROS package and customized scripts, a detailed procedure is provided on how to distinguish a particular object, retrieve its reference frame with respect to a known one, and then allow navigation to that target. Together with technical details, the aim is also to report working scripts and a specific appendix (A) that readers can refer to when putting things together.
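For orientation, planning to a desired end-effector pose with MoveIt!'s Python interface typically looks like the minimal sketch below; the planning group name "manipulator", the obstacle dimensions, and the target coordinates are assumptions for illustration, not values taken from the thesis.

```python
# Minimal MoveIt! sketch: plan and execute a motion to a desired end-effector
# pose while a collision object is present in the planning scene.
# The group name "manipulator" and all numeric values are assumptions.
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import Pose, PoseStamped

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("plan_to_target")

scene = moveit_commander.PlanningSceneInterface()
group = moveit_commander.MoveGroupCommander("manipulator")
rospy.sleep(1.0)  # give the planning scene time to come up

# Add a box-shaped collision object so the planner avoids it.
box = PoseStamped()
box.header.frame_id = group.get_planning_frame()
box.pose.position.x, box.pose.position.y, box.pose.position.z = 0.5, 0.0, 0.25
scene.add_box("obstacle", box, size=(0.1, 0.1, 0.5))

# Desired end-effector pose (orientation left at identity).
target = Pose()
target.position.x, target.position.y, target.position.z = 0.4, 0.2, 0.4
target.orientation.w = 1.0
group.set_pose_target(target)

group.go(wait=True)        # plan and execute
group.stop()
group.clear_pose_targets()
moveit_commander.roscpp_shutdown()
```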
Abstract:
Acknowledgements The authors thank the children, their parents and school staff, who participated in this research, and who so willingly gave us their time, help and support. They also thank Steven Knox and Alan Clelland for their work on programming the mobile phone application. Additional thanks to DynaVox Inc. for supplying the Vmax communication devices to run our system on and Sensory Software Ltd for supplying us with their AAC software. This research was supported by the Research Councils UK Digital Economy Programme and EPSRC (Grant numbers EP/F067151/1, EP/F066880/1, EP/E011764/1, EP/H022376/1, and EP/H022570/1).
Abstract:
Acknowledgements This work has been partially supported by the European project Marrying Ontologies and Software Technologies (EU ICT2008-216691), the European project Knowledge Driven Data Exploitation (EU FP7/IAPP2011-286348), and the UK EPSRC project WhatIf (EP/J014354/1). The authors thank Prof. Ian Horrocks and Dr. Giorgos Stoilos for their helpful discussions on role subsumptions. The authors thank Rafael S. Gonçalves et al. for providing their hotspots ontologies. The authors also thank BoC-group for providing their ADOxx Metamodelling ontologies.
Abstract:
Acknowledgments The authors acknowledge the support from the Engineering and Physical Sciences Research Council, grant number EP/M002322/1. The authors would also like to thank the Numerical Analysis Group at the Rutherford Appleton Laboratory for their FORTRAN HSL packages (HSL, a collection of Fortran codes for large-scale scientific computation; see http://www.hsl.rl.ac.uk/).
Abstract:
Cumulon is a system aimed at simplifying the development and deployment of statistical analysis of big data in public clouds. Cumulon allows users to program in their familiar language of matrices and linear algebra, without worrying about how to map data and computation to specific hardware and cloud software platforms. Given user-specified requirements in terms of time, monetary cost, and risk tolerance, Cumulon automatically makes intelligent decisions on implementation alternatives, execution parameters, as well as hardware provisioning and configuration settings -- such as what type of machines and how many of them to acquire. Cumulon also supports clouds with auction-based markets: it effectively utilizes computing resources whose availability varies according to market conditions, and suggests the best bidding strategies for them. Cumulon explores two alternative approaches toward supporting such markets, with different trade-offs between system and optimization complexity. An experimental study is conducted to show the efficiency of Cumulon's execution engine, as well as the optimizer's effectiveness in finding the optimal plan in the vast plan space.
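The flavor of decision Cumulon automates can be conveyed with a deliberately simplified sketch: among candidate machine types and counts, choose the cheapest configuration whose estimated running time meets the deadline. The numbers, the perfect-scaling assumption, and the hour-granularity billing below are illustrative assumptions, not Cumulon's actual cost model or optimizer.

```python
# Deliberately simplified illustration of the provisioning decision Cumulon
# automates: pick the cheapest (machine type, count) whose estimated running
# time meets a deadline, billing whole instance-hours. All numbers are made up.
import math

candidates = [
    # (machine_type, hourly_price_usd, relative_speed)
    ("small", 0.10, 1.0),
    ("large", 0.40, 3.5),
]
work_hours_on_one_small = 40.0   # hypothetical workload estimate
deadline_hours = 6.0

best = None
for machine, price, speed in candidates:
    for count in range(1, 33):
        hours = work_hours_on_one_small / (speed * count)  # naive perfect scaling
        cost = math.ceil(hours) * price * count            # hour-granularity billing
        if hours <= deadline_hours and (best is None or cost < best[0]):
            best = (cost, machine, count, hours)

cost, machine, count, hours = best
print(f"cheapest feasible plan: {count} x {machine}, {hours:.1f} h, ${cost:.2f}")
```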
Abstract:
Petri Nets are a formal, graphical and executable modeling technique for the specification and analysis of concurrent and distributed systems and have been widely applied in computer science and many other engineering disciplines. Low-level Petri nets are simple and useful for modeling control flows, but not powerful enough to define data and system functionality. High-level Petri nets (HLPNs) have been developed to support data and functionality definitions, such as using complex structured data as tokens and algebraic expressions as transition formulas. Compared to low-level Petri nets, HLPNs result in compact system models that are easier to understand; therefore, HLPNs are more useful in modeling complex systems. There are two issues in using HLPNs - modeling and analysis. Modeling concerns abstracting and representing the systems under consideration using HLPNs, and analysis deals with effective ways to study the behaviors and properties of the resulting HLPN models. In this dissertation, several modeling and analysis techniques for HLPNs are studied and integrated into a framework that is supported by a tool. For modeling, this framework integrates two formal languages: a type of HLPN called Predicate Transition Net (PrT Net) is used to model a system's behavior, and a first-order linear-time temporal logic (FOLTL) is used to specify the system's properties. The main contribution of this dissertation with regard to modeling is the development of a software tool to support the formal modeling capabilities in this framework. For analysis, this framework combines three complementary techniques: simulation, explicit-state model checking and bounded model checking (BMC). Simulation is a straightforward and speedy method, but it only covers some execution paths in an HLPN model. Explicit-state model checking covers all the execution paths but suffers from the state explosion problem. BMC is a trade-off, as it provides a certain level of coverage while being more efficient than explicit-state model checking. The main contribution of this dissertation with regard to analysis is adapting BMC to analyze HLPN models and integrating the three complementary analysis techniques in a software tool to support the formal analysis capabilities in this framework. The SAMTools developed for this framework integrate three tools: PIPE+ for HLPN behavioral modeling and simulation, SAMAT for hierarchical structural modeling and property specification, and PIPE+Verifier for behavioral verification.
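To make the terminology concrete, the sketch below simulates a tiny low-level Petri net (places holding plain token counts, transitions firing when every input place is marked). The PrT nets studied in the dissertation additionally carry structured tokens and algebraic transition formulas, which this illustration omits; the places and transitions are invented for the example.

```python
# Minimal sketch of a low-level Petri net simulator (tokens are plain counts);
# high-level nets such as PrT nets add structured tokens and transition
# formulas, which this illustration deliberately omits.
marking = {"ready": 1, "running": 0, "done": 0}

# transition name -> (input places, output places)
transitions = {
    "start":  (["ready"],   ["running"]),
    "finish": (["running"], ["done"]),
}

def enabled(name):
    """A transition is enabled if every input place holds at least one token."""
    inputs, _ = transitions[name]
    return all(marking[p] >= 1 for p in inputs)

def fire(name):
    """Consume one token from each input place, produce one in each output place."""
    inputs, outputs = transitions[name]
    assert enabled(name), f"{name} is not enabled"
    for p in inputs:
        marking[p] -= 1
    for p in outputs:
        marking[p] += 1

fire("start")
fire("finish")
print(marking)  # {'ready': 0, 'running': 0, 'done': 1}
```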
Abstract:
Computer games have now been around for over three decades, and the term serious games has been attributed to the use of computer games that are thought to have educational value. Game-based learning (GBL) has been applied in a number of different fields such as medicine, languages and software engineering. Furthermore, serious games can be very effective as an instructional tool and can assist learning by providing an alternative way of presenting instructions and content on a supplementary level, and can promote student motivation and interest in subject matter, resulting in enhanced learning effectiveness. REVLAW (Real and Virtual Reality Law) is a research project proposed by the departments of Law and Computer Science of Westminster University as a new framework in which law students can explore a real case scenario using Virtual Reality (VR) technology, discover important pieces of evidence from the given scenario, and make up their minds as to whether the crime case is a murder or not. REVLAW treats immersion in VR as the perception of being physically present in a non-physical world. The paper presents the prototype framework and the mechanics used to make students focus on the crime case and make the best use of this immersive learning approach.
Abstract:
This paper presents and validates a methodology for integrating reusable software components in diverse game engines. While conforming to the RAGE component-based architecture described elsewhere, the paper explains how the interactions and data exchange processes between a reusable software component and a game engine should be implemented for procuring seamless integration. To this end, a RAGE-compliant C# software component providing a difficulty adaptation routine was integrated with an exemplary strategic tile-based game “TileZero”. Implementations in MonoGame, Unity and Xamarin, respectively, have demonstrated successful portability of the adaptation component. Also, portability across various delivery platforms (Windows desktop, iOS, Android, Windows Phone) was established. Thereby this study has established the validity of the RAGE architecture and its underlying interaction processes for the cross-platform and cross-game engine reuse of software components. The RAGE architecture thereby accommodates the large scale development and application of reusable software components for serious gaming.
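The integration style described above can be outlined with a thin bridge interface that isolates the reusable component from engine specifics; the sketch below uses Python rather than the project's C#, and the class and method names are invented for illustration rather than taken from the RAGE architecture.

```python
# Illustration of decoupling a reusable component from engine specifics:
# the component talks to the engine only through a small bridge interface,
# so each engine (MonoGame, Unity, ...) supplies its own bridge. Names are
# invented for this sketch and are not the RAGE project's actual API.
from abc import ABC, abstractmethod

class EngineBridge(ABC):
    """Engine-specific services the reusable component is allowed to use."""
    @abstractmethod
    def log(self, message: str) -> None: ...
    @abstractmethod
    def load_setting(self, key: str, default: str) -> str: ...

class DifficultyAdapter:
    """Reusable component: adjusts difficulty from recent player results."""
    def __init__(self, bridge: EngineBridge):
        self.bridge = bridge
        self.level = float(bridge.load_setting("difficulty", "0.5"))

    def report_result(self, player_won: bool) -> None:
        self.level += 0.05 if player_won else -0.05
        self.level = min(1.0, max(0.0, self.level))
        self.bridge.log(f"difficulty adjusted to {self.level:.2f}")

class ConsoleBridge(EngineBridge):
    """Stand-in for a concrete engine binding."""
    def log(self, message): print(message)
    def load_setting(self, key, default): return default

adapter = DifficultyAdapter(ConsoleBridge())
adapter.report_result(player_won=True)
```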
Abstract:
Software assets are a key output of the RAGE project and they can be used by applied game developers to enhance the pedagogical and educational value of their games. These software assets cover a broad spectrum of functionalities – from player analytics including emotion detection to intelligent adaptation and social gamification. In order to facilitate integration and interoperability, all of these assets adhere to a common model, which describes their properties through a set of metadata. In this paper the RAGE asset model and asset metadata model are presented, capturing the detail of assets and their potential usage within three distinct dimensions – technological, gaming and pedagogical. The paper highlights key issues and challenges in constructing the RAGE asset and asset metadata model and details the process and design of a flexible metadata editor that facilitates both adaptation and improvement of the asset metadata model.
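For illustration only, asset metadata spanning the three dimensions could be represented along the lines of the sketch below; the field names and example values are assumptions, not the actual RAGE metadata schema.

```python
# Illustrative-only sketch of asset metadata spanning technological, gaming
# and pedagogical dimensions; fields and values are placeholders, not the
# actual RAGE asset metadata model.
from dataclasses import dataclass, field
from typing import List

@dataclass
class AssetMetadata:
    name: str
    version: str
    # Technological dimension
    language: str = "C#"
    dependencies: List[str] = field(default_factory=list)
    # Gaming dimension
    game_genres: List[str] = field(default_factory=list)
    # Pedagogical dimension
    learning_goals: List[str] = field(default_factory=list)

asset = AssetMetadata(
    name="DifficultyAdaptation",
    version="1.0.0",
    dependencies=["asset-manager"],
    game_genres=["strategy"],
    learning_goals=["maintain player engagement"],
)
print(asset)
```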
Abstract:
Stealthy attackers move patiently through computer networks - taking days, weeks or months to accomplish their objectives in order to avoid detection. As networks scale up in size and speed, monitoring for such attack attempts is increasingly a challenge. This paper presents an efficient monitoring technique for stealthy attacks. It investigates the feasibility of the proposed method under a number of different test cases and examines how the design of the network affects detection. A methodological way for tracing anonymous stealthy activities to their approximate sources is also presented. Bayesian fusion along with traffic sampling is employed as a data reduction method. The proposed method has the ability to monitor stealthy activities using 10-20% sampling rates without degrading the quality of detection.
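The idea of Bayesian fusion over sampled traffic can be conveyed with a minimal sketch: sequentially update the belief that a host is a stealthy-attack source from sampled flow observations. The prior, likelihoods, and observation sequence below are assumed values for illustration, not the paper's detector.

```python
# Minimal sketch of Bayesian fusion over sampled traffic (an illustration of
# the general idea, not the paper's detector): update the belief that a host
# is a stealthy attacker from a stream of sampled flow observations.
# The prior and likelihoods are assumed values.
p_attacker = 0.01                  # prior belief for a given host
p_suspicious_if_attacker = 0.30    # P(suspicious flow | attacker)
p_suspicious_if_benign = 0.02      # P(suspicious flow | benign)

# Whether each sampled flow looked suspicious (sampling happens upstream).
sampled_flows = [False, True, False, True, True]

for suspicious in sampled_flows:
    # Likelihood of this observation under each hypothesis.
    l_attacker = p_suspicious_if_attacker if suspicious else 1 - p_suspicious_if_attacker
    l_benign = p_suspicious_if_benign if suspicious else 1 - p_suspicious_if_benign
    # Bayes rule: posterior is proportional to likelihood times prior.
    numerator = l_attacker * p_attacker
    p_attacker = numerator / (numerator + l_benign * (1 - p_attacker))

print(f"posterior P(attacker) = {p_attacker:.3f}")
```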
Abstract:
This seminar consists of two very different research reports by PhD students in WAIS. Hypertext Engineering, Fettling or Tinkering (Mark Anderson): Contributors to a public hypertext such as Wikipedia do not necessarily record their maintenance activities, but some specific hypertext features - such as transclusion - could indicate deliberate editing with a mind to the hypertext's long-term use. The MediaWiki software used to create Wikipedia supports transclusion, a deliberately hypertextual form of content creation which aids long-term consistency. This talk discusses the evidence of the use of hypertext transclusion in Wikipedia, and its implications for the coherence and stability of Wikipedia. Designing a Public Intervention - Towards a Sociotechnical Approach to Web Governance (Faranak Hardcastle): In this talk I introduce a critical and speculative design for a socio-technical intervention - called TATE (Transparency and Accountability Tracking Extension) - that aims to enhance transparency and accountability in Online Behavioural Tracking and Advertising mechanisms and practices.
Abstract:
Abstract not available