59 results for Software industry
at Indian Institute of Science - Bangalore - India
Abstract:
This paper presents an overview of the issues in precisely defining, specifying and evaluating the dependability of software, particularly in the context of computer-controlled process systems. Dependability is intended to be a generic term embodying various quality factors and is useful for both software and hardware. While developments in quality assurance and reliability theories have proceeded mostly in independent directions for hardware and software systems, we present here the case for a unified framework of dependability, viewed as a facet of the operational effectiveness of modern technological systems, and develop a hierarchical systems model that helps clarify this view. In the second half of the paper, we survey the models and methods available for measuring and improving software reliability. The nature of software “bugs”, the failure history of the software system in the various phases of its lifecycle, reliability growth in the development phase, estimation of the number of errors remaining in the operational phase, and the complexity of the debugging process are all considered in varying degrees of detail. We also discuss the notion of software fault tolerance, methods of achieving it, and the status of other measures of software dependability such as maintainability, availability and safety.
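To make the reliability-growth and residual-error ideas surveyed above concrete, here is a minimal Python sketch of one classical model in this family, the Goel-Okumoto NHPP, whose mean value function m(t) = a(1 - e^(-bt)) gives the expected number of faults exposed by time t. The model choice and the failure data are illustrative assumptions, not taken from the paper.

    import numpy as np
    from scipy.optimize import curve_fit

    def mean_failures(t, a, b):
        # a: total expected faults; b: per-fault detection rate
        return a * (1.0 - np.exp(-b * t))

    # hypothetical test times (weeks) and cumulative failures observed
    t = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
    n = np.array([9, 16, 21, 25, 27, 29, 30, 31], dtype=float)

    (a, b), _ = curve_fit(mean_failures, t, n, p0=(40.0, 0.3))
    print(f"estimated total faults a = {a:.1f}, detection rate b = {b:.3f}")
    print(f"estimated faults remaining after week {t[-1]:.0f}: {a - n[-1]:.1f}")

The fitted parameter a estimates the eventual fault count, so a minus the faults already found gives the residual-error estimate discussed above.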
Abstract:
The literature contains many examples of digital procedures for the analytical treatment of electroencephalograms, but there is as yet no standard by which those techniques may be judged or compared. This paper proposes one method of generating an EEG, based on a computer program implementing Zetterberg's simulation model. It is assumed that the statistical properties of an EEG may be represented by stationary processes having rational transfer functions, realized by a system of software filters and random number generators. The model represents neither the neurological mechanism responsible for generating the EEG, nor any particular type of EEG record; transient phenomena such as spikes, sharp waves and alpha bursts are also excluded. The basis of the program is a valid ‘partial’ statistical description of the EEG; that description is then used to produce a digital representation of a signal which, if plotted sequentially, might or might not by chance resemble an EEG; that is unimportant. What is important is that the statistical properties of the series remain those of a real EEG; it is in this sense that the output is a simulation of the EEG. There is considerable flexibility in the form of the output, i.e. its alpha, beta and delta content, which may be selected by the user, the same selected parameters always producing the same statistical output. The filtered outputs from the random number sequences may be scaled to provide realistic power distributions in the accepted EEG frequency bands and then summed to create a digital output signal, the ‘stationary EEG’. It is suggested that the simulator might act as a test input to digital analytical techniques for the EEG, enabling at least a substantial part of those techniques to be compared and assessed in an objective manner. The equations necessary to implement the model are given. The program has been run on a DEC1090 computer but is suitable for any microcomputer having more than 32 kBytes of memory; the execution time required to generate a 25 s simulated EEG is in the region of 15 s.
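As a rough illustration of the filtered-noise scheme described above, the following Python sketch drives independent random number streams through band-pass filters and sums the scaled outputs. Butterworth filters stand in for Zetterberg's rational transfer functions, and the sampling rate, band edges and gains are assumptions made for illustration only.

    import numpy as np
    from scipy.signal import butter, lfilter

    fs = 128.0                      # sampling rate in Hz (assumed)
    dur = 25.0                      # 25 s record, as in the paper
    rng = np.random.default_rng(7)  # fixed seed: same parameters, same output

    bands = {                       # (low Hz, high Hz, relative gain) - assumed
        "delta": (0.5, 4.0, 1.5),
        "alpha": (8.0, 13.0, 1.0),
        "beta": (13.0, 30.0, 0.4),
    }

    n = int(fs * dur)
    eeg = np.zeros(n)
    for low, high, gain in bands.values():
        # one software filter per band, fed by its own random number stream
        b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
        eeg += gain * lfilter(b, a, rng.standard_normal(n))

Because the seed and filter coefficients fully determine the process, the same parameter selection always reproduces the same statistical output, matching the behaviour the abstract describes.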
Abstract:
There are several areas in the plywood industry where Operations Research techniques have greatly assisted in better decision-making. These have resulted in improved profits, reduction of wood losses and better utilization of resources. Recognizing this, some of the plywood manufacturing firms in the developed countries have established separate Operations Research departments or divisions. In the face of limited raw-material resources, rising costs and a competitive environment, the benefits attributable to the use of these techniques are becoming more and more significant.
Abstract:
A new shock wave generator has been designed, fabricated and tested in the Shock Waves Laboratory, IISc, Bangalore, for preservative impregnation studies on wood slats used for manufacturing pencils. A series of experiments has been carried out in the laboratory to achieve satisfactory preservative impregnation into VATTA wood slats. The experiments have shown that it is indeed possible to impregnate preservatives into VATTA wood slats using shock waves, and that the depth of penetration and the retention of preservatives by the wood slats are as good as those achieved by conventional methods. This method is expected to result in a substantial reduction in treatment process time compared to the conventional methods currently used by the pencil manufacturing industry.
Abstract:
The NUVIEW software package allows skeletal models of any double-helical nucleic acid molecule to be displayed on a graphics monitor and manipulated interactively through the keyboard with various rotation, translation and scaling transformations. The skeletal model is generated by connecting any pair of representative points, one from each of the bases in the basepair. In addition to the above-mentioned manipulations, base residues can be identified using a locator, and the distance between any pair of residues can be obtained. A sequence-based color-coded display allows easy identification of sequence repeats, such as runs of adenines. The real-time interactive manipulation of such skeletal models for large DNA/RNA double helices can be used to trace the path of the nucleic acid chain in three dimensions and hence get a better idea of its topology, the location of linear or curved regions, distances between far-off regions in the sequence, etc. A physical picture of these features will assist in understanding the relationship between base sequence, structure and biological function in nucleic acids.
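The interactive manipulations NUVIEW offers reduce to standard 3-D coordinate transformations of the representative points. The Python/NumPy sketch below is hypothetical, not NUVIEW's actual code; it only shows the core operation of rotating, scaling and translating a strand of representative points.

    import numpy as np

    def rotate_z(points, theta):
        # rotate an Nx3 array of coordinates about the z axis by theta radians
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s, 0.0],
                      [s,  c, 0.0],
                      [0.0, 0.0, 1.0]])
        return points @ R.T

    def view_transform(points, theta=0.0, scale=1.0, shift=(0.0, 0.0, 0.0)):
        # compose the three interactive operations: rotate, scale, translate
        return scale * rotate_z(points, theta) + np.asarray(shift)

    # hypothetical representative points, one per base on one strand;
    # the skeletal model joins the paired points of every basepair
    strand = np.random.rand(10, 3) * 20.0
    moved = view_transform(strand, theta=np.pi / 6, scale=1.2, shift=(0, 0, 5))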
Abstract:
The software packages NUPARM and NUCGEN are described; they can be used to understand sequence-directed structural variations in nucleic acids through the analysis and generation of non-uniform structures. A set of local inter-basepair parameters (viz. tilt, roll, twist, shift, slide and rise) has been defined; these use the geometry and coordinates of two successive basepairs only and can be used to generate polymeric structures with varying geometries for each of the 16 possible dinucleotide steps. Intra-basepair parameters (propeller, buckle, opening and the C6...C8 distance) can also be varied, if required, while the sugar-phosphate backbone atoms are fixed in some standard conformation in each of the nucleotides. NUPARM can be used to analyse both DNA and RNA structures, with single- as well as double-stranded helices. The NUCGEN software generates double-helical models with the backbone fixed in B-form DNA, but with appropriate modifications in the input data it can also generate A-form DNA and RNA duplex structures.
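The generation scheme can be illustrated by composing one rigid transform per dinucleotide step from the six local parameters. The Python sketch below is a simplified rendering of this idea under one assumed rotation convention; NUCGEN's actual parameter definitions differ in detail, and the step values shown are illustrative B-DNA-like numbers.

    import numpy as np

    def step_transform(tilt, roll, twist, shift, slide, rise):
        # 4x4 transform carrying one basepair frame to the next; rotations in
        # degrees about x (tilt), y (roll), z (twist); translations in Angstrom
        # along x (shift), y (slide), z (rise)
        tx, ty, tz = np.radians([tilt, roll, twist])
        Rx = np.array([[1, 0, 0],
                       [0, np.cos(tx), -np.sin(tx)],
                       [0, np.sin(tx), np.cos(tx)]])
        Ry = np.array([[np.cos(ty), 0, np.sin(ty)],
                       [0, 1, 0],
                       [-np.sin(ty), 0, np.cos(ty)]])
        Rz = np.array([[np.cos(tz), -np.sin(tz), 0],
                       [np.sin(tz), np.cos(tz), 0],
                       [0, 0, 1]])
        T = np.eye(4)
        T[:3, :3] = Rz @ Ry @ Rx
        T[:3, 3] = (shift, slide, rise)
        return T

    # illustrative uniform steps: ~36 degree twist and 3.4 A rise, as in B-DNA;
    # varying the six values per step yields the non-uniform structures
    frames = [np.eye(4)]
    for _ in range(10):
        frames.append(frames[-1] @ step_transform(0.0, 2.0, 36.0, 0.0, 0.0, 3.4))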
Abstract:
Product success is substantially influenced by how well the knowledge needs of designers are satisfied, and many tools and methods have been proposed to support these needs. However, adoption of these methods in industry is minimal. This may be due to an inadequate understanding of the knowledge needs of designers in industry. This research attempts to develop a better understanding of these needs by undertaking descriptive studies in an industrial setting. We propose a taxonomy of knowledge and evaluate it by analyzing the questions asked by the designers involved in the study during their interactions. Using the taxonomy, we converted the questions asked into a generic form. The generic questions provide an understanding of what knowledge must be captured during design and what its structure should be.
Abstract:
The StreamIt programming model has been proposed to exploit parallelism in streaming applications on general-purpose multi-core architectures. This model allows programmers to specify the structure of a program as a set of filters that act upon data and a set of communication channels between them. StreamIt graphs describe task, data and pipeline parallelism, which can be exploited on modern Graphics Processing Units (GPUs), as they support abundant parallelism in hardware. In this paper, we describe the challenges in mapping StreamIt to GPUs and propose an efficient technique to software-pipeline the execution of stream programs on GPUs. We formulate this problem, covering both the scheduling and the assignment of filters to processors, as an efficient Integer Linear Program (ILP), which is then solved using ILP solvers. We also describe a novel buffer layout technique for GPUs which facilitates exploiting the high memory bandwidth available on GPUs. The proposed scheduling utilizes both the scalar units in the GPU, to exploit data parallelism, and the multiprocessors, to exploit task and pipeline parallelism. Further, it takes into consideration the synchronization and bandwidth limitations of GPUs, and yields speedups between 1.87X and 36.83X over a single-threaded CPU.
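The flavor of the assignment side of such a formulation can be conveyed with a toy ILP. The Python sketch below, using the PuLP modeler, maps filters with assumed work estimates onto processors while minimizing the bottleneck load; it deliberately omits the paper's software-pipelining, synchronization and buffer constraints.

    import pulp

    work = {"src": 4, "fir": 9, "dec": 6, "sink": 2}   # hypothetical filter costs
    procs = range(2)

    prob = pulp.LpProblem("filter_assignment", pulp.LpMinimize)
    x = pulp.LpVariable.dicts("x", (work, procs), cat="Binary")
    makespan = pulp.LpVariable("makespan", lowBound=0)

    prob += makespan                                   # minimize bottleneck load
    for f in work:                                     # each filter on one processor
        prob += pulp.lpSum(x[f][p] for p in procs) == 1
    for p in procs:                                    # each processor's load bounds makespan
        prob += pulp.lpSum(work[f] * x[f][p] for f in work) <= makespan

    prob.solve()
    for f in work:
        print(f, "->", [p for p in procs if x[f][p].value() > 0.5][0])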
Abstract:
Automatic identification of software faults has enormous practical significance. It requires characterizing program execution behavior and using appropriate data mining techniques on the chosen representation. In this paper, we use the sequence of system calls to characterize program execution. The data mining tasks addressed are learning to map system call streams to fault labels and automatic identification of fault causes. Spectrum kernels and SVMs are used for the former, while latent semantic analysis is used for the latter. The techniques are demonstrated on an intrusion dataset containing system call traces. The results show that the kernel techniques are as accurate as the best available results but are faster by orders of magnitude. We also show that latent semantic indexing is capable of revealing fault-specific features.
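A rough Python sketch of such a pipeline appears below: contiguous k-grams of system calls form the spectrum-kernel feature space for a linear SVM, with a truncated SVD standing in for the latent semantic analysis step. The traces, labels and parameter choices are toy assumptions, not the paper's experimental setup.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.svm import LinearSVC
    from sklearn.decomposition import TruncatedSVD

    traces = ["open read read write close",
              "open mmap execve close",
              "open read write write close"]
    labels = ["normal", "fault_A", "normal"]           # hypothetical fault labels

    # spectrum features: counts of contiguous k-grams (here k=3) of system calls
    vec = CountVectorizer(ngram_range=(3, 3), token_pattern=r"\S+")
    X = vec.fit_transform(traces)

    clf = LinearSVC().fit(X, labels)                   # linear SVM over the spectra
    lsa = TruncatedSVD(n_components=2).fit(X)          # LSA over the k-gram space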
Abstract:
Sensing and photocatalysis of textile industry effluents such as dyes using mesoporous anatase titania nanowires are discussed here. Spectroscopic investigations show that the titania nanowires preferentially sense cationic dyes (e.g. Methylene Blue, Rhodamine B) over anionic ones (e.g. Orange G, Remazol Brilliant Blue R). The adsorbed dye concentration on titania nanowires increased with increasing nanowire dimensions and dye solution pH. Electrochemical sensing directly corroborated the spectroscopic findings. The electrochemical detection sensitivity for Methylene Blue more than doubled when the average nanowire length was tripled. Photodegradation of Methylene Blue using titania nanowires is also more efficient than with commercial P25-TiO2 nanopowders: keeping the illumination protocol and observation times constant, the Methylene Blue concentration in solution decreased by only 50% in the case of P25-TiO2 nanoparticles, compared to a 100% decrease for titania nanowires. Photodegradation was also found to be a function of exposure time and dye solution pH. The excellent sensing ability and photocatalytic activity of the titania nanowires are attributed to the increased effective reaction area of the controlled nanostructured morphology.
Abstract:
The hazards associated with major accident hazard (MAH) industries are fire, explosion and toxic gas releases. Of these, toxic gas release is the worst, as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of the hazards related to chemical industries. Fault tree analysis (FTA) is an established technique for hazard identification. This technique has the advantage of being both qualitative and quantitative, if the probabilities and frequencies of the basic events are known. This paper outlines the estimation of the probability of release of chlorine from the storage and filling facility of a chlor-alkali industry using FTA. An attempt has also been made to arrive at the probability of chlorine release using expert elicitation and a proven fuzzy logic technique for Indian conditions. Sensitivity analysis has been done to evaluate the percentage contribution of each basic event that could lead to chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation.
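The quantitative step of FTA can be shown in miniature: basic-event probabilities are propagated upward through AND/OR gates to the top event. The Python sketch below uses a hypothetical two-branch tree with made-up probabilities, not the paper's chlorine release data.

    def gate_or(ps):
        # independent events: P(A or B or ...) = 1 - prod(1 - p_i)
        out = 1.0
        for p in ps:
            out *= (1.0 - p)
        return 1.0 - out

    def gate_and(ps):
        # independent events: P(A and B and ...) = prod(p_i)
        out = 1.0
        for p in ps:
            out *= p
        return out

    # hypothetical basic-event probabilities per demand
    valve_leak, gasket_fail = 2e-3, 5e-4
    overfill, alarm_fail = 1e-3, 1e-2

    release = gate_or([
        gate_or([valve_leak, gasket_fail]),    # leak path
        gate_and([overfill, alarm_fail]),      # overfill with failed alarm
    ])
    print(f"P(release) ~ {release:.2e}")

Sensitivity analysis of the kind the paper describes then amounts to perturbing each basic-event probability in turn and recording the change in the top-event probability.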
Abstract:
Trans-national corporations (TNCs) expanding their production bases to developing countries having better conditions of manufacturing and domestic markets provide increasing opportunities for local small and medium enterprises (SMEs) to have subcontracting relationships with these TNCs. Even though some theoretical and a few empirical studies throw light on the nature of assistance provided by TNCs to local SMEs through subcontracting relationships, none of the studies so far has quantitatively analysed the role of this assistance in the innovative performance of SMEs leading to better economic performance. This paper probes the extent and diversity of assistance received by SMEs from a TNC through subcontracting, and its influence on the technological innovations and economic performance of SMEs in the Indian automobile industry. Indian SMEs were able to receive mainly product-related and purchase-process assistance, thereby implying that subcontracting is largely confined to purchase-supply relationships. However, the assistance received through subcontracting is beneficial, as it promoted technological innovations in SMEs: the higher the degree of assistance, the higher the level of innovations carried out by these SMEs, which in turn facilitated their economic performance. Thus, this paper substantiates in the Indian context that a subcontracting relationship with a TNC can be an important source of technological innovations and enhanced economic performance for SMEs.