982 results for SOFTWARE REUSE
Abstract:
Free and Open Source Software (FOSS) has attracted increasing interest in the software industry, but assessing its quality remains a challenge. FOSS development is frequently carried out by globally distributed teams, and all stages of development are publicly visible. Several product- and process-level quality factors can be measured using this public data. This thesis presents a theoretical background on software quality and metrics and their application in a FOSS environment. The information available from FOSS projects in three information spaces is presented, and a quality model suitable for use in a FOSS context is constructed. The model includes both process and product quality metrics, and takes into account the tools and working methods commonly used in FOSS projects. A subset of the constructed quality model is applied to three FOSS projects, highlighting both theoretical and practical concerns in implementing automatic metric collection and analysis. The experiment shows that useful quality information can be extracted from the vast amount of data available; in particular, the projects vary in their growth rate, complexity, modularity and team structure.
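As a hedged illustration of the kind of automatic metric collection described above, the sketch below (not the thesis's actual tooling) counts net lines of code added per month from a local git checkout using git log --numstat; the repository path and the monthly grouping are assumptions for the example.

    # Illustrative sketch: monthly code growth from a git repository.
    # Assumes a local clone; the repository path is a placeholder, not from the thesis.
    import subprocess
    from collections import defaultdict

    def monthly_growth(repo_path):
        # --numstat prints "added<TAB>removed<TAB>file" per changed file,
        # preceded by the commit date (YYYY-MM) produced by --pretty.
        out = subprocess.run(
            ["git", "-C", repo_path, "log", "--numstat",
             "--pretty=%ad", "--date=format:%Y-%m"],
            capture_output=True, text=True, check=True
        ).stdout
        growth = defaultdict(int)
        month = None
        for line in out.splitlines():
            parts = line.split("\t")
            if len(parts) == 1 and line.strip():
                month = line.strip()                     # a date line from --pretty
            elif len(parts) == 3 and month:
                added, removed = parts[0], parts[1]
                if added.isdigit() and removed.isdigit():  # skip binary files ("-")
                    growth[month] += int(added) - int(removed)
        return dict(growth)

    # Example: print(monthly_growth("/path/to/foss/project"))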
Abstract:
Free and open source software development is an alternative to traditional software engineering as an approach to the development of complex software systems. It is a way of developing software based on geographically distributed teams of volunteers, without an apparent central plan or traditional coordination mechanisms. The purpose of this thesis is to summarize the current knowledge about free and open source software development and to explore the ways in which further understanding of it could be gained. The results of research in the field, as well as the research methods used, are introduced and discussed. The adaptation of software process metrics to the context of free and open source software development is also illustrated, and the possibilities of using them as tools to validate other research are discussed.
Abstract:
Certain software products employing digital techniques for encryption of data are subject to export controls in the EU Member States pursuant to Community law and relevant laws in the Member States. These controls are agreed globally in the framework of the so-called Wassenaar Arrangement. Wassenaar is an informal non-proliferation regime aimed at promoting international stability and responsibility in transfers of strategic (dual-use) products and technology. This thesis covers the provisions of Wassenaar, Community export control laws and the export control laws of Finland, Sweden, Germany, France and the United Kingdom. The thesis consists of five chapters. The first chapter discusses the rationale of export control laws and the impact they have on global trade. The rationale is originally defence-related: in general, to prevent potential adversaries of the participating States from having the same tools, and in particular, in the case of cryptographic software, to enable signals intelligence efforts. As the use of cryptography in a civilian context has mushroomed, export restrictions increasingly have negative effects on civilian trade. Information security solutions may also be too weak because of export restrictions on cryptography. The second chapter covers the OECD's Cryptography Policy, which had a significant effect on its member nations' national cryptography policies and legislation. The OECD is a significant organization because it acts as a meeting forum for the most important industrialized nations. The third chapter covers the Wassenaar Arrangement. The Arrangement is covered from the viewpoint of international law and politics. The Wassenaar control list provisions affecting cryptographic software transfers are also covered in detail. Control lists in the EU and in the Member States are usually copied directly from the Wassenaar control lists. Controls agreed in its framework set only a minimum level for the participating States; Wassenaar countries can, however, adopt stricter controls. The fourth chapter covers Community export control law. Export controls are viewed in Community law as falling within the domain of the Common Commercial Policy pursuant to Article 133 of the EC Treaty. Therefore the Community has exclusive competence in export matters, save where a national measure is authorized by the Community or falls under the foreign or security policy derogations established in Community law. The Member States still have a considerable amount of power in the domain of the Common Foreign and Security Policy. They are able to maintain national export controls because export control laws are not fully harmonized. This can also have detrimental effects on the functioning of the internal market and common export policies. In 1995 the EU adopted Dual-Use Regulation 3381/94/EC, which sets common rules for exports in the Member States. The provisions of this regulation receive detailed coverage in this chapter. The fifth chapter covers national legislation and export authorization practices in five Member States: Finland, Sweden, Germany, France and the United Kingdom. The export control laws of these Member States are covered where the national laws differ from the uniform approach of the Community's acquis communautaire. Keywords: export control, encryption, software, dual-use, license, foreign trade, e-commerce, Internet
Abstract:
We treat urban stormwater as a problem because it causes flooding, transports pollutants and degrades the ecosystem health of waterways (Goonetilleke et al., 2014). Municipal authorities devote a significant portion of their budgets to capturing and removing stormwater from urban areas as rapidly as possible. Unfortunately, it is a largely unappreciated fact that urban stormwater is the last available uncommitted water resource for our cities, as the demand for potable water escalates due to growing urbanisation, industrialisation and higher living standards.
Abstract:
This demonstration highlights the applications of our research work, i.e. the second-generation multi-agent system SAGE (Scalable Fault Tolerant Agent Grooming Environment), the integration of software agents and Grid computing, and an autonomous agent architecture within the agent platform. It is a conference planner application that uses the collaborative effort of geographically widely deployed services implemented in different technologies, i.e. software agents, Grid computing and Web services, to perform useful tasks as required. Copyright 2005 ACM.
Abstract:
The NUVIEW software package allows skeletal models of any double-helical nucleic acid molecule to be displayed on a graphics monitor, and various rotations, translations and scaling transformations to be applied interactively through the keyboard. The skeletal model is generated by connecting any pair of representative points, one from each of the bases in the basepair. In addition to the above-mentioned manipulations, the base residues can be identified using a locator, and the distance between any pair of residues can be obtained. A sequence-based color-coded display allows easy identification of sequence repeats, such as runs of adenines. The real-time interactive manipulation of such skeletal models for large DNA/RNA double helices can be used to trace the path of the nucleic acid chain in three dimensions and hence get a better idea of its topology, the location of linear or curved regions, distances between far-off regions in the sequence, etc. A physical picture of these features will assist in understanding the relationship between base sequence, structure and biological function in nucleic acids.
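As a hedged sketch of the interactive manipulations described above (rotation, translation and scaling of the representative points of a skeletal model), the example below uses numpy; it is an illustration only, not the NUVIEW implementation, and the coordinates are placeholders.

    # Illustrative sketch: rotate, translate and scale the representative
    # points of a skeletal model (placeholder coordinates, not NUVIEW code).
    import numpy as np

    def rotate_z(points, angle_deg):
        # Rotate an (N, 3) array of points about the z axis.
        a = np.radians(angle_deg)
        rz = np.array([[np.cos(a), -np.sin(a), 0.0],
                       [np.sin(a),  np.cos(a), 0.0],
                       [0.0,        0.0,       1.0]])
        return points @ rz.T

    def transform(points, angle_deg=0.0, translation=(0.0, 0.0, 0.0), scale=1.0):
        # Scale, then rotate, then translate - the order a viewer might apply
        # in response to successive keyboard commands.
        return rotate_z(points * scale, angle_deg) + np.asarray(translation)

    # Example: two representative points of a basepair.
    pts = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 3.4]])
    print(transform(pts, angle_deg=36.0, translation=(0.0, 0.0, 3.4)))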
Abstract:
The software packages NUPARM and NUCGEN are described, which can be used to understand sequence-directed structural variations in nucleic acids through the analysis and generation of non-uniform structures. A set of local inter-basepair parameters (viz. tilt, roll, twist, shift, slide and rise) has been defined; these parameters use the geometry and coordinates of only two successive basepairs and can be used to generate polymeric structures with varying geometries for each of the 16 possible dinucleotide steps. Intra-basepair parameters (propeller, buckle, opening and the C6...C8 distance) can also be varied, if required, while the sugar-phosphate backbone atoms are fixed in some standard conformation in each of the nucleotides. NUPARM can be used to analyse both DNA and RNA structures, with single- as well as double-stranded helices. The NUCGEN software generates double-helical models with the backbone fixed in the B-form DNA conformation, but with appropriate modifications in the input data it can also generate A-form DNA and RNA duplex structures.
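As a hedged, highly simplified illustration of generating a non-uniform structure from local step parameters, the sketch below stacks basepair origins using only twist and rise (ignoring tilt, roll, shift and slide); it is not the NUCGEN algorithm, and the parameter values are placeholders.

    # Simplified sketch: stack basepair origins from per-step twist (degrees)
    # and rise (angstroms). Real generation also applies tilt, roll, shift and slide.
    import numpy as np

    def stack_steps(steps):
        # steps: list of (twist_deg, rise) values, one per dinucleotide step.
        origins = [np.zeros(3)]
        frame = np.eye(3)                 # orientation of the current basepair
        for twist_deg, rise in steps:
            a = np.radians(twist_deg)
            rz = np.array([[np.cos(a), -np.sin(a), 0.0],
                           [np.sin(a),  np.cos(a), 0.0],
                           [0.0,        0.0,       1.0]])
            frame = frame @ rz            # rotate the local frame by the twist
            origins.append(origins[-1] + frame @ np.array([0.0, 0.0, rise]))
        return np.array(origins)

    # Example: three uniform B-DNA-like steps (36 deg, 3.4 A) and one kinked step.
    print(stack_steps([(36.0, 3.4)] * 3 + [(45.0, 3.0)]))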
Abstract:
The microbiological quality of treated wastewater is an important parameter for its reuse. Data on Fecal Coliform (FC) and Fecal Streptococcus (FS) at different stages of treatment in the Sewage Treatment Plants (STPs) of the Delhi watershed are not available; therefore, in the present study, microbial profiling of STPs was carried out to assess the effluent quality for present and future reuse options. The study further evaluates the water quality profiles at different stages of treatment for 16 STPs in Delhi city. These STPs are based on the conventional Activated Sludge Process (ASP), extended aeration, physical, chemical and biological treatment (BIOFORE), Trickling Filter and Oxidation Pond processes. The primary effluent quality produced by most of the STPs was suitable for Soil Aquifer Treatment (SAT). Extended Hydraulic Retention Time (HRT), a result of low inflow to the STPs, was responsible for high turbidity, COD and BOD removal. Conventional ASP-based STPs achieved 1.66 log FC and 1.06 log FS removal. STPs with an extended aeration treatment process produced better quality effluent, with a maximum 4 log order reduction in FC and FS levels. The "Kondli" and "Nilothi" STPs, employing ASP, produced better quality secondary effluent compared to other STPs based on a similar treatment process. Oxidation Pond based STPs showed better FC and FS removals, whereas good physicochemical quality was achieved during the first half of the treatment. Based upon the physical, chemical and microbiological removal efficiencies, the actual integrated efficiency (IEa) of each STP was determined to evaluate its suitability for reuse for irrigation purposes. Except for "Mehrauli" and "Oxidation Pond", effluents from all other STPs require tertiary treatment for further reuse. Possible reuse options for Delhi city, depending upon the geographical location, the proximity of the facilities to potential users based on the beneficial uses, sub-soil types, etc., have been investigated; these include artificial groundwater recharge, aquaculture, horticulture and industrial uses such as floor washing, boiler feed and cooling towers. (C) 2009 Elsevier B.V. All rights reserved.
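For reference, a "log removal" figure such as the 1.66 log FC removal quoted above is simply the base-10 logarithm of the influent-to-effluent ratio; the short sketch below shows the calculation with placeholder counts, not data from this study.

    # Illustrative calculation of log removal; the counts are placeholders,
    # not measurements from the Delhi STPs.
    import math

    def log_removal(influent_count, effluent_count):
        # Counts in, e.g., MPN/100 mL; 1.66 log units is roughly a 97.8% reduction.
        return math.log10(influent_count / effluent_count)

    def percent_reduction(log_units):
        return 100.0 * (1.0 - 10.0 ** (-log_units))

    print(log_removal(1.0e7, 2.2e5))      # about 1.66 log units
    print(percent_reduction(1.66))        # about 97.8 %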
Abstract:
Flood extent mapping is a basic tool for flood damage assessment, and it can be done with digital classification techniques using satellite imagery, including data recorded by radar and optical sensors. However, converting the data into the information we need is not a straightforward task. One of the great challenges in interpreting the data is separating the permanent water bodies from the flooded regions, including both the fully inundated areas and the wet areas where trees and houses are partly covered with water. This paper adopts a decision fusion technique to combine the mapping results from radar data with the NDVI derived from optical data. An improved capability to distinguish permanent or semi-permanent water bodies from flood-inundated areas has been achieved. The software tools MultiSpec and Matlab were used.
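As a hedged sketch of the kind of decision fusion described above, the example below computes NDVI from red and near-infrared bands, forms an optical water decision, and combines it with a radar-derived water mask using a simple rule; the threshold value and the fusion rule are illustrative assumptions, not the paper's classifier.

    # Illustrative decision-level fusion of a radar water mask and an
    # NDVI-based optical water mask. Thresholds and rule are placeholder assumptions.
    import numpy as np

    def ndvi(nir, red):
        # NDVI = (NIR - Red) / (NIR + Red); the denominator is guarded against zero.
        return (nir - red) / np.maximum(nir + red, 1e-6)

    def fuse_decisions(radar_water, nir, red, ndvi_water_threshold=0.0):
        # Each sensor yields its own water decision; the rule combines them.
        optical_water = ndvi(nir, red) < ndvi_water_threshold
        fully_inundated = radar_water & optical_water      # both sensors agree on open water
        partially_flooded = radar_water & ~optical_water   # radar sees water under vegetation or roofs
        return fully_inundated, partially_flooded

    # Tiny synthetic example (2 x 2 pixels).
    radar = np.array([[True, True], [False, True]])
    nir = np.array([[0.1, 0.3], [0.6, 0.5]])
    red = np.array([[0.2, 0.2], [0.2, 0.2]])
    print(fuse_decisions(radar, nir, red))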
Abstract:
Free software is viewed as a revolutionary and subversive practice, and in particular has dealt a strong blow to the traditional conception of intellectual property law (although in its current form it could be considered a 'hack' of IP rights). However, other (capitalist) areas of law have been swift to embrace free software, or at least to incorporate it into their own tenets. One such area is competition (antitrust) law, which has itself long been in theoretical conflict with intellectual property, due to the restriction on competition inherent in the grant of 'monopoly' rights by copyrights, patents and trademarks. This contribution will examine how competition law has approached free software by examining instances in which courts have had to deal with such initiatives, for instance in the Oracle/Sun Microsystems merger, and the implications that these decisions have for free software initiatives. The presence or absence of corporate involvement in initiatives will be an important factor in this investigation, with it being posited that true instances of 'commons-based peer production' can still subvert the capitalist system, including perplexing its laws beyond intellectual property.
Abstract:
The StreamIt programming model has been proposed to exploit parallelism in streaming applications on general-purpose multi-core architectures. This model allows programmers to specify the structure of a program as a set of filters that act upon data and a set of communication channels between them. StreamIt graphs describe task, data and pipeline parallelism, which can be exploited on modern Graphics Processing Units (GPUs), as they support abundant parallelism in hardware. In this paper, we describe the challenges in mapping StreamIt to GPUs and propose an efficient technique to software pipeline the execution of stream programs on GPUs. We formulate this problem - both the scheduling and the assignment of filters to processors - as an efficient Integer Linear Program (ILP), which is then solved using ILP solvers. We also describe a novel buffer layout technique for GPUs which facilitates exploiting the high memory bandwidth available on GPUs. The proposed scheduling utilizes both the scalar units in the GPU, to exploit data parallelism, and the multiprocessors, to exploit task and pipeline parallelism. Further, it takes into consideration the synchronization and bandwidth limitations of GPUs, and yields speedups between 1.87X and 36.83X over a single-threaded CPU.
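As a hedged, much-simplified illustration of formulating the assignment part of such a problem as an ILP, the sketch below balances filter workloads across processors by minimizing the maximum per-processor load, using the PuLP library; the filter names and workload numbers are placeholders, and the paper's actual formulation additionally models software pipelining, synchronization and bandwidth constraints.

    # Simplified ILP sketch: assign stream filters to processors so that the
    # maximum per-processor load is minimized. Workloads are placeholders; the
    # paper's ILP also captures pipelining, synchronization and bandwidth.
    from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum

    filters = {"source": 4, "fir": 10, "decimate": 6, "sink": 2}   # filter -> work estimate
    processors = ["sm0", "sm1"]

    prob = LpProblem("filter_assignment", LpMinimize)
    assign = LpVariable.dicts("assign", [(f, p) for f in filters for p in processors], cat=LpBinary)
    makespan = LpVariable("makespan", lowBound=0)

    prob += makespan                                      # objective: minimize the bottleneck load
    for f in filters:                                     # each filter goes to exactly one processor
        prob += lpSum(assign[(f, p)] for p in processors) == 1
    for p in processors:                                  # each processor's load is bounded by the makespan
        prob += lpSum(filters[f] * assign[(f, p)] for f in filters) <= makespan

    prob.solve()
    for f in filters:
        for p in processors:
            if assign[(f, p)].value() == 1:
                print(f, "->", p)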
Abstract:
Automatic identification of software faults has enormous practical significance. It requires characterizing program execution behavior and applying appropriate data mining techniques to the chosen representation. In this paper, we use the sequence of system calls to characterize program execution. The data mining tasks addressed are learning to map system call streams to fault labels and automatic identification of fault causes. Spectrum kernels and SVMs are used for the former, while latent semantic analysis is used for the latter. The techniques are demonstrated on the intrusion dataset containing system call traces. The results show that the kernel techniques are as accurate as the best available results but are faster by orders of magnitude. We also show that latent semantic indexing is capable of revealing fault-specific features.
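As a hedged illustration of the spectrum-kernel idea mentioned above, the sketch below maps a system-call sequence to counts of its contiguous length-k subsequences (k-mers) and computes the kernel between two traces as the dot product of those counts; the example traces are made up, not taken from the intrusion dataset.

    # Illustrative k-spectrum kernel over system-call traces.
    # The traces below are made-up examples, not from the intrusion dataset.
    from collections import Counter

    def spectrum(trace, k):
        # Count every contiguous length-k subsequence of the call trace.
        return Counter(tuple(trace[i:i + k]) for i in range(len(trace) - k + 1))

    def spectrum_kernel(trace_a, trace_b, k=3):
        # Kernel value = dot product of the two k-mer count vectors.
        sa, sb = spectrum(trace_a, k), spectrum(trace_b, k)
        return sum(count * sb[kmer] for kmer, count in sa.items())

    a = ["open", "read", "write", "read", "write", "close"]
    b = ["open", "read", "write", "close"]
    print(spectrum_kernel(a, b, k=3))   # shared 3-mers drive the similarity score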
Abstract:
Several researchers are of the opinion that there are many benefits in using the object-oriented paradigm in information systems development. If the object-oriented paradigm is used, the development of information systems may, for example, be faster and more efficient. On the other hand, there are also several problems with the paradigm: for example, it is often considered complex, it is often difficult to make use of the reuse concept, and it is still immature in some areas. Although there are several interesting features in the object-oriented paradigm, there is still little comprehensive knowledge of the benefits and problems associated with it. The objective of the following study was to investigate and gain more understanding of the benefits and problems of the object-oriented paradigm. A review of previous studies was made, and twelve benefits and twelve problems were established. These benefits and problems were then analysed, studied and discussed. Furthermore, a survey and some case studies were carried out in order to learn which benefits and problems of the object-oriented paradigm Finnish software companies had experienced. One hundred and four companies answered the survey, which was sent to all Finnish software companies with five or more employees. The case studies were made with six large Finnish software companies. The major finding was that Finnish software companies were exceptionally positive towards object-oriented information systems development and had experienced very few of the proposed problems. Finally, two models for further research were developed: the first model presents connections between the benefits and the second between the problems.
Abstract:
Four algorithms, all variants of Simultaneous Perturbation Stochastic Approximation (SPSA), are proposed. The original one-measurement SPSA uses an estimate of the gradient of the objective function L that contains an additional bias term not seen in two-measurement SPSA; as a result, the asymptotic covariance matrix of the iterate convergence process has a bias term. We propose a one-measurement algorithm that eliminates this bias and has asymptotic convergence properties that make for easier comparison with two-measurement SPSA. Under certain conditions, the algorithm outperforms both forms of SPSA, with the only overhead being the storage of a single measurement. We also propose a similar algorithm that uses perturbations obtained from normalized Hadamard matrices. The convergence with probability 1 of both algorithms is established. We extend measurement reuse to design two second-order SPSA algorithms and sketch their convergence analysis. Finally, we present simulation results on an illustrative minimization problem.
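As a hedged illustration of the difference between the standard one- and two-measurement SPSA gradient estimates discussed above, the sketch below implements the textbook forms on a toy quadratic; it does not include the bias-corrected variants or the Hadamard-matrix perturbations proposed in the abstract, and the gain sequences are illustrative choices.

    # Textbook SPSA gradient estimates on a toy quadratic; this contrasts the
    # one- and two-measurement forms, not the bias-corrected variants proposed above.
    import numpy as np

    rng = np.random.default_rng(0)
    L = lambda theta: float(np.sum(theta ** 2))          # illustrative objective

    def spsa_step(theta, a_k, c_k, one_measurement=True):
        delta = rng.choice([-1.0, 1.0], size=theta.shape)        # Bernoulli +/-1 perturbation
        if one_measurement:
            grad = L(theta + c_k * delta) / (c_k * delta)         # single function evaluation
        else:
            grad = (L(theta + c_k * delta) - L(theta - c_k * delta)) / (2.0 * c_k * delta)
        return theta - a_k * grad

    theta = np.array([2.0, -3.0])
    for k in range(1, 2001):
        theta = spsa_step(theta, a_k=0.1 / k, c_k=1.0 / k ** 0.25, one_measurement=True)
    print(theta)   # should drift toward the minimizer at the origin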