155 results for Software testing. Problem-oriented programming. Teaching methodology
Abstract:
In the current market, extensive software development is taking place and the software industry is thriving. Major software giants have cited source code theft as a major threat to revenues. By inserting an identity-establishing watermark in the source code, a company can prove its ownership of the source code. In this paper, we propose a watermarking scheme for C/C++ source code that exploits the language's restrictions. If a function calls another function, the latter needs to be defined in the code before the former, unless one uses function pre-declarations. We embed the watermark in the code by imposing an ordering on the mutually independent functions through bogus dependencies. Removal of the dependencies by an attacker to erase the watermark requires extensive manual intervention, thereby making the attack infeasible. The scheme is also secure against subtractive and additive attacks. Using our watermarking scheme, an n-bit watermark can be embedded in a program having n independent functions. The scheme is implemented on several sample codes and the performance changes are analyzed.
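For illustration, the language restriction exploited here can be seen in a minimal C sketch (a hypothetical example, not the paper's actual embedding tool). The guarded call from g to f is a bogus dependency: it never runs for the inputs used, but because f has no forward declaration it forces f to be defined before g, fixing one ordering between two otherwise independent functions and thereby encoding one watermark bit.

    /* Hypothetical sketch of a bogus dependency forcing a definition order. */
    #include <stdio.h>
    #include <limits.h>

    static int f(int x) {          /* independent function #1 */
        return 2 * x;
    }

    static int g(int x) {          /* independent function #2 */
        if (x == INT_MIN)          /* guard that stays false for the inputs below */
            return f(x);           /* bogus call: without a prototype, f must precede g */
        return x + 3;
    }

    int main(void) {
        printf("%d %d\n", f(5), g(7));   /* prints "10 10" */
        return 0;
    }

Encoding the opposite bit would instead add a guarded call from f to g, forcing the reverse order; erasing the watermark therefore means finding and removing every such dependency by hand.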
Abstract:
This article addresses the problem of estimating the Quality of Service (QoS) of a composite service given the QoS of the services participating in the composition. Previous solutions to this problem impose restrictions on the topology of the orchestration models, for example limiting their applicability to well-structured orchestration models. This article lifts these restrictions by proposing a method for aggregate QoS computation that deals with more general types of unstructured orchestration models. The applicability and scalability of the proposed method are validated using a collection of models from industrial practice.
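As background (these are standard aggregation rules from the QoS literature, not formulas taken from this article, which targets unstructured models), aggregation over well-structured fragments is commonly written as

    T_seq = \sum_{i=1}^{n} T_i,   A_seq = \prod_{i=1}^{n} A_i          (sequence of n services)
    T_and = \max_{1 \le i \le n} T_i,   A_and = \prod_{i=1}^{n} A_i    (parallel AND-block)

where T_i is the expected response time and A_i the availability of the i-th component service. Unstructured orchestration models cannot be decomposed into such blocks, which is the gap the proposed method addresses.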
Abstract:
Mobile devices are rapidly developing into the primary technology for users to work, socialize, and play in a variety of settings and contexts. Their pervasiveness has provided researchers with the means to investigate innovative solutions to ever more complex user demands. Tools for Mobile Multimedia Programming and Development investigates the use of mobile platforms for research projects, focusing on the development, testing, and evaluation of prototypes rather than final products, which enables researchers to better understand the needs of users through image processing, object recognition, sensor integration, and user interactions. This book benefits researchers and professionals in multiple disciplines who utilize such techniques in the creation of prototypes for mobile devices and applications. This book is part of the Advances in Wireless Technologies and Telecommunication series collection.
Abstract:
Integer ambiguity resolution is an indispensable procedure for all high-precision GNSS applications. The correctness of the estimated integer ambiguities is the key to achieving highly reliable positioning, but the solution cannot be validated with classical hypothesis testing methods. The integer aperture estimation theory unifies all existing ambiguity validation tests and provides a new perspective from which to review existing methods, enabling a better understanding of the ambiguity validation problem. This contribution analyses two simple but efficient ambiguity validation tests, the ratio test and the difference test, from three aspects: acceptance region, probability basis and numerical results. The major contributions of this paper can be summarized as follows: (1) the ratio test acceptance region is an overlap of ellipsoids, while the difference test acceptance region is an overlap of half-spaces; (2) the probability basis of these two popular tests is analyzed for the first time: the difference test is an approximation to the optimal integer aperture estimator, while the ratio test follows an exponential relationship in probability; (3) the limitations of the two tests are identified for the first time: both tests may under-evaluate the failure risk if the model is not strong enough or the float ambiguities fall in a particular region; and (4) extensive numerical results are used to compare the performance of the two tests. The simulation results show that the ratio test outperforms the difference test for some models, while the difference test performs better for others. In particular, in the medium-baseline kinematic model the difference test outperforms the ratio test; this superiority is independent of the number of frequencies, the observation noise and the satellite geometry, but depends on the success rate and the failure rate tolerance. A smaller failure rate tolerance leads to a larger performance discrepancy.
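For reference, the two tests are commonly written as follows (standard formulations from the validation literature, not reproduced from this paper). Let \hat{a} be the float ambiguity vector, \check{a}_1 and \check{a}_2 the best and second-best integer candidates, Q_{\hat{a}} the ambiguity variance matrix, and \|x\|^2_{Q_{\hat{a}}} = x^{\mathsf{T}} Q_{\hat{a}}^{-1} x; \mu and \delta are fixed or model-driven thresholds.

    Ratio test:       accept \check{a}_1  if  \|\hat{a} - \check{a}_2\|^2_{Q_{\hat{a}}} / \|\hat{a} - \check{a}_1\|^2_{Q_{\hat{a}}} \ge \mu
    Difference test:  accept \check{a}_1  if  \|\hat{a} - \check{a}_2\|^2_{Q_{\hat{a}}} - \|\hat{a} - \check{a}_1\|^2_{Q_{\hat{a}}} \ge \delta

The quotient form yields the ellipsoidal acceptance regions and the difference form the half-space regions described in point (1) above.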
Abstract:
The purpose of the book is to use Delphi as a vehicle to introduce some fundamental algorithms and to illustrate several mathematical and problem-solving techniques. This book is therefore intended to be more of a reference for problem solving, with the solutions expressed in Delphi. It introduces a somewhat eclectic collection of material, much of which will not be found in a typical book on Pascal or Delphi. Many of the topics were used by the author over a period of about ten years, from 1993 to 2003, in various subjects at Bond University, Australia. Much of the work was connected with a data structures subject (a second programming course) conducted variously in MODULA-2, Oberon and Delphi; however, there is considerable other, more recent material, e.g., a chapter on Sudoku.
Abstract:
The common reasons why those in organizations adopt large configurable packaged software products are compelling. Problems with the existing software situation, the supposed predictability and perceived business benefits of packaged software, and various social influences can lead to packages being preferred to custom approaches. Yet, for every reason, there is a potential associated problem that must be understood before an informed adoption decision can be made...
Abstract:
We do not commonly associate software engineering with philosophical debate. Indeed, software engineers ought to be concerned with building software systems and not settling philosophical questions. I attempt to show that software engineers do, in fact, take philosophical sides when designing software applications. In particular, I look at how the problem of vagueness arises in software engineering and argue that when software engineers solve it, they commit to philosophical views that they are seldom aware of. In the second part of the paper, I suggest a way of dealing with vague predicates without having to confront the problem of vagueness itself. The purpose of my paper is to highlight the currently prevalent disconnect between philosophy and software engineering. I claim that a better knowledge of the philosophical debate is important as it can have ramifications for crucial software design decisions. Better awareness of philosophical issues not only produces better software engineers, it also produces better engineered products.
Abstract:
Critical to the research of urban morphologists is the availability of historical records that document the urban transformation of the study area. However, thus far little work has been done towards an empirical approach to the validation of archival data in this field. Outlined in this paper, therefore, is a new methodology for validating the accuracy of archival records and mapping data, accrued through the process of urban morphological research, so as to establish a reliable platform from which analysis can proceed. The paper particularly addresses the problems of inaccuracies in existing curated historical information, as well as errors in archival research by student assistants, which together give rise to unacceptable levels of uncertainty in the documentation. The paper discusses the problems relating to the reliability of historical information, demonstrates the importance of data verification in urban morphological research, and proposes a rigorous method for objective testing of collected archival data through the use of qualitative data analysis software.
Abstract:
Computational neuroscience aims to elucidate the mechanisms of neural information processing and population dynamics through a methodology of incorporating biological data into complex mathematical models. Existing simulation environments model at a particular level of detail; none allows a multi-level approach to neural modelling. Moreover, most are not engineered to produce compute-efficient solutions, an important issue because obtaining sufficient processing power is a major impediment in the field. This project aims to apply modern software engineering techniques to create a flexible, high-performance neural modelling environment that will allow rigorous exploration of model parameter effects and modelling at multiple levels of abstraction.
Abstract:
An increasing range of technology services are now offered on a self-service basis. However, problems with self-service technologies (SSTs) occur at times due to technical errors, staff errors, or consumers' own mistakes. Considering the role of consumers as co-producers in the SST context, we aim to study consumers' behaviours, strategies, and decision making in solving their problems with SSTs, and to identify the factors contributing to their persistence in solving the problem. This study contributes to information systems research, as it is the first study that aims to identify such a process and the factors affecting consumers' persistence in solving their problems with SSTs. A focus group with user-support staff has been conducted, yielding initial results that informed the next phases of the study. Next, using the Critical Incident Technique, data will be gathered through focus groups with users, a diary method, and a think-aloud method.
Abstract:
The worldwide installed base of enterprise resource planning (ERP) systems has increased rapidly over the past 10 years, now comprising tens of thousands of installations in large- and medium-sized organizations and millions of licensed users. Like traditional information systems (IS), ERP systems must be maintained and upgraded. It is therefore not surprising that ERP maintenance activities have become the largest budget provision in the IS departments of many ERP-using organizations. Yet, there has been limited study of ERP maintenance activities. Are they simply instances of traditional software maintenance activities to which traditional software maintenance research findings can be generalized? Or are they fundamentally different, such that new research, specific to ERP maintenance, is required to help alleviate the ERP maintenance burden? This paper reports a case study of a large organization that implemented ERP (an SAP system) more than three years ago. From the case study and the data collected, we observe the following distinctions of ERP maintenance: (1) the ERP-using organization, in addition to addressing internally originated change requests, also implements maintenance introduced by the vendor; (2) requests for user support concerning ERP system behavior, function and training constitute a main part of ERP maintenance activity; and (3) as in the in-house software environment, enhancement is the major maintenance activity in the ERP environment, encompassing almost 64% of the total change-request effort. In light of these and other findings, we ultimately: (1) propose a clear and precise definition of ERP maintenance; (2) conclude that ERP maintenance cannot be sufficiently described by existing software maintenance taxonomies; and (3) propose a benefits-oriented taxonomy that better represents ERP maintenance activities. The three salient dimensions (for characterizing requests) incorporated in the proposed ERP maintenance taxonomy are: (1) who is the maintenance source? (2) why is it important to service the request? and (3) what is the impact, if any, of implementing the request on the installed module(s)?
Abstract:
This project develops the guidelines required to ensure stable and accurate operation of Power-Hardware-in-the-Loop implementations. The proposals of this research have been theoretically analyzed and practically examined using a Real-Time Digital Simulator. In this research, the interaction between a software-simulated power network and the physical power system has been studied. The conditions for different operating regimes have been derived and the corresponding analyses have been presented.