47 results for Privacy By Design, Data Protection Officer, Privacy Officer, trattamento, dati personali, PETs
at Indian Institute of Science - Bangalore - India
Abstract:
As the gap between processor and memory continues to grow, memory performance becomes a key bottleneck for many applications. Compilers therefore increasingly seek to modify an application's data layout to improve cache locality and cache reuse. Whole Program Structure Layout (WPSL) transformations can significantly increase the spatial locality of data and reduce the runtime of programs that use link-based data structures by increasing cache line utilization. However, in production compilers, WPSL transformations do not realize their entire performance potential due to a number of factors. Structure layout decisions made on the basis of whole-program aggregated affinity/hotness of structure fields can be suboptimal for local code regions. WPSL is also restricted in applicability in production compilers for type-unsafe languages like C/C++ due to the extensive legality checks and field-sensitive pointer analysis required over the entire application. In order to overcome the issues associated with WPSL, we propose a Region Based Structure Layout (RBSL) optimization framework that uses selective data copying. We describe our RBSL framework, implemented in the production compiler for C/C++ on HP-UX IA-64. We show that, acting in complement to the existing and mature WPSL transformation framework in our compiler, RBSL improves application performance in pointer-intensive SPEC benchmarks by 3% to 28% over WPSL.
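The selective data copying at the heart of RBSL can be sketched abstractly: before a hot code region, the hot fields of link-based nodes are gathered into a contiguous buffer, the region operates on that dense buffer, and results are written back. The Python toy below is purely illustrative (it cannot show cache effects, and all names are hypothetical); it is not the paper's compiler transformation.

```python
# Illustrative sketch of region-based selective copying, not the HP-UX
# compiler implementation. Hot fields of linked-list nodes are copied
# into a contiguous buffer for the duration of a hot region.

class Node:
    def __init__(self, hot, cold):
        self.hot = hot      # field heavily used in the hot region
        self.cold = cold    # field rarely touched there
        self.next = None

def build_list(values):
    head = prev = None
    for v in values:
        n = Node(hot=v, cold=str(v))
        if prev is None:
            head = n
        else:
            prev.next = n
        prev = n
    return head

def hot_region_with_copying(head):
    # 1) copy-in: gather hot fields into a dense, cache-friendly buffer
    nodes, buf = [], []
    n = head
    while n is not None:
        nodes.append(n)
        buf.append(n.hot)
        n = n.next
    # 2) hot region: traverse only the contiguous buffer
    buf = [x * 2 for x in buf]
    # 3) copy-out: write results back to the original nodes
    for node, x in zip(nodes, buf):
        node.hot = x
    return head

head = hot_region_with_copying(build_list([1, 2, 3]))
print(head.hot, head.next.hot)  # 2 4
```

In a real compiler this transformation is only legal when the region does not access the cold fields through aliased pointers, which is why RBSL confines the required legality checks to the region rather than the whole program.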
Abstract:
Many novel computer architectures, like array processors and multiprocessors, which achieve high performance through the use of concurrency, exploit variations of the von Neumann model of computation. The effective utilization of such machines makes special demands on programmers and their programming languages, such as the structuring of data into vectors or the partitioning of programs into concurrent processes. In comparison, the data flow model of computation demands only that the principle of structured programming be followed. A data flow program, often represented as a data flow graph, is a program that expresses a computation by indicating the data dependencies among operators. A data flow computer is a machine designed to take advantage of concurrency in data flow graphs by executing data-independent operations in parallel. In this paper, we discuss the design of a high-level language (DFL: Data Flow Language) suitable for data flow computers. Some sample procedures in DFL are presented. Implementation aspects are not discussed in detail since no new problems are encountered there. The language DFL embodies the concepts of functional programming, but in appearance closely resembles Pascal. The language is a better vehicle than the data flow graph for expressing a parallel algorithm. The compiler has been implemented on a DEC 1090 system in Pascal.
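The execution model the abstract describes, where an operator fires as soon as all its operands are available, can be sketched as follows. This is a generic toy interpreter in Python (not DFL, whose syntax resembles Pascal); "parallel" execution is modelled by firing each set of ready operators in the same step.

```python
# Minimal data flow graph evaluator: an operator fires when all of its
# inputs are available, so data-independent operators fire in the same
# step. Assumes an acyclic graph.

def run_dataflow(graph, inputs):
    """graph: node -> (op, [predecessor nodes]); inputs: leaf values."""
    values = dict(inputs)
    pending = {n for n in graph if n not in values}
    while pending:
        # every operator whose operands are ready fires in this "step"
        ready = {n for n in pending if all(p in values for p in graph[n][1])}
        for n in ready:
            op, preds = graph[n]
            values[n] = op(*(values[p] for p in preds))
        pending -= ready
    return values

# (a+b) * (a-b): the add and subtract are data-independent and fire together
g = {
    "sum":  (lambda x, y: x + y, ["a", "b"]),
    "diff": (lambda x, y: x - y, ["a", "b"]),
    "prod": (lambda x, y: x * y, ["sum", "diff"]),
}
out = run_dataflow(g, {"a": 5, "b": 3})
print(out["prod"])  # 16
```

The point of the graph representation is exactly what the loop exposes: "sum" and "diff" have no dependency between them, so a data flow machine may execute them concurrently without any programmer-specified partitioning.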
Abstract:
Telecommunication, broadcasting and other instrumented towers carry power and/or signal cables from their ground end to their upper regions. During a direct hit to the tower, significant induction can occur on these mounted cables. In order to provide adequate protection to the equipment connected to them, protection schemes have evolved in the literature. Development of more effective protection schemes requires quantitative knowledge of various parameters; however, such quantitative knowledge is difficult to find at present. Among these, the present work aims to investigate two important aspects: (i) what would be the nature of the induced currents, and (ii) what would be the current sharing if, as per practice, the sheath of the cable is connected to the down conductor/tower. These aspects will be useful in the design of protection schemes and in analyzing the field structure around instrumented towers.
Abstract:
Rapid data acquisition, natural fluorescence rejection and experimental ease are the advantages of ultra-fast Raman loss scattering (URLS), which make it a unique and valuable molecular structure-determining technique. URLS is an analogue of stimulated Raman scattering (SRS), but far more sensitive than SRS. It involves the interaction of two laser sources, viz. a picosecond (ps) pulse and white light, with the sample, leading to the generation of a loss signal on the higher-energy (blue) side with respect to the wavelength of the ps pulse, unlike the gain signal observed on the red side in SRS. These loss signals are at least 1.5 times more intense than the SRS signals. Also, the very prerequisite of the experimental protocol that signal detection be on the higher-energy side by design eliminates interference from fluorescence, which always appears on the red side. Unlike coherent anti-Stokes Raman scattering, URLS signals are not precluded by non-resonant background under resonance conditions, and, being a self-phase-matched process, it is experimentally easier.
Abstract:
Programming for parallel architectures that do not have a shared address space is extremely difficult due to the need for explicit communication between the memories of different compute devices. A heterogeneous system with CPUs and multiple GPUs, or a distributed-memory cluster, are examples of such systems. Past works that try to automate data movement for distributed-memory architectures can lead to excessive redundant communication. In this paper, we propose an automatic data movement scheme that minimizes the volume of communication between compute devices in heterogeneous and distributed-memory systems. We show that by partitioning data dependences in a particular non-trivial way, one can generate data movement code that results in the minimum communication volume for a vast majority of cases. The techniques are applicable to any sequence of affine loop nests and work on top of any choice of loop transformations, parallelization, and computation placement; the generated data movement code minimizes the communication volume for a particular configuration of these. We use a combination of powerful static analyses relying on the polyhedral compiler framework and lightweight runtime routines they generate to build a source-to-source transformation tool that automatically generates communication code. We demonstrate that the tool is scalable and leads to substantial gains in efficiency. On a heterogeneous system, the communication volume is reduced by a factor of 11X to 83X over state-of-the-art, translating into a mean execution time speedup of 1.53X. On a distributed-memory cluster, our scheme reduces the communication volume by a factor of 1.4X to 63.5X over state-of-the-art, resulting in a mean speedup of 1.55X. In addition, our scheme yields a mean speedup of 2.19X over hand-optimized UPC codes.
Abstract:
Sugganahalli, a rural vernacular community in a warm-humid region of South India, is in transition towards adopting modern construction practices. Vernacular local building elements like rubble walls and mud roofs are giving way to burnt brick walls and reinforced cement concrete (RCC)/tin roofs. Over 60% of the Indian population is rural, and the implications of such transitions for thermal comfort and energy in buildings are crucial to understand. Vernacular architecture evolves by adopting local resources in response to the local climate through passive solar design. This paper investigates the effectiveness of passive solar elements on indoor thermal comfort by adopting modern climate-responsive design strategies. Dynamic simulation models validated by measured data have also been adopted to determine the impact of the transition from vernacular to modern material configurations. Age-old traditional design considerations were found to concur with modern understanding of bio-climatic response and climate-responsiveness. Modern transitions were found to increase average indoor temperatures by more than 7 degrees C. Such transformations tend to shift the indoor conditions to a psychrometric zone that is likely to require active air-conditioning. Also, the surveyed thermal sensation votes were found to lie outside the extended thermal comfort boundary for hot developing countries provided by Givoni in the bio-climatic chart.
Abstract:
A link-level reliable multicast requires a channel access protocol to resolve the collision of feedback messages sent by multicast data receivers. Several deterministic media access control protocols have been proposed to attain high reliability, but with large delay. Besides, there are protocols which can only give probabilistic guarantees about reliability, but have the least delay. In this paper, we propose a virtual token-based channel access and feedback protocol (VTCAF) for link-level reliable multicasting. The VTCAF protocol introduces a virtual (implicit) token passing mechanism based on carrier sensing to avoid collisions between feedback messages. Delay performance is improved in the VTCAF protocol by reducing the number of feedback messages. Besides, the VTCAF protocol is parametric in nature and can easily trade off reliability against delay as per the requirements of the underlying application. Such a cross-layer design approach would be useful for a variety of multicast applications which require reliable communication with different levels of reliability and delay performance. We have analyzed our protocol to evaluate various performance parameters at different packet loss rates and compared its performance with those of others. Our protocol has also been simulated using the Castalia network simulator to evaluate the same performance parameters. Simulation and analytical results together show that the VTCAF protocol is able to considerably reduce average access delay while ensuring very high reliability at the same time.
Abstract:
Vernacular dwellings are well-suited climate-responsive designs that adopt local materials and skills to support comfortable indoor environments in response to local climatic conditions. These naturally-ventilated passive dwellings have enabled civilizations to sustain themselves even in extreme climatic conditions. The design and the physiological resilience of the inhabitants have coevolved to be attuned to local climatic and environmental conditions. Such adaptations have perplexed modern theories of human thermal comfort that evolved in the era of electricity and air-conditioned buildings. Vernacular local building elements like rubble walls and mud roofs are giving way to burnt brick walls and reinforced cement concrete/tin roofs. Over 60% of the Indian population is rural, and the implications of such transitions for thermal comfort and energy in buildings are crucial to understand. Types of energy use associated with a building's life cycle include its embodied energy, operational and maintenance energy, and demolition and disposal energy. Embodied Energy (EE) represents the total energy consumption for construction of a building, i.e., the embodied energy of building materials, material transportation energy and building construction energy. The embodied energy of building materials forms the major contribution to embodied energy in buildings. Operational energy (OE) in buildings, mainly contributed by space conditioning and lighting requirements, depends on the climatic conditions of the region and the comfort requirements of the building occupants. Less energy-intensive natural materials are used for traditional buildings, so the EE of traditional buildings is low. Transition in the use of materials has a significant impact on the embodied energy of vernacular dwellings. Use of manufactured, energy-intensive materials like brick, cement, steel and glass contributes to high embodied energy in these dwellings. This paper studies the increase in EE of the dwelling attributed to the change in wall materials.
Climatic location significantly influences operational energy in dwellings. Buildings located in regions experiencing extreme climatic conditions require more operational energy to satisfy heating and cooling energy demands throughout the year. Traditional buildings adopt passive techniques or non-mechanical methods for space conditioning to overcome the vagaries of extreme climatic variations and hence require less operational energy. This study assesses operational energy in a traditional dwelling with regard to change in wall material and climatic location. OE in the dwellings has been assessed for hot-dry, warm-humid and moderate climatic zones. The choice of thermal comfort model is yet another factor which greatly influences operational energy assessment in buildings. The paper adopts two popular thermal-comfort models, viz., ASHRAE comfort standards and the TSI of Sharma and Ali, to investigate thermal comfort aspects and the impact of these comfort models on OE assessment in traditional dwellings. A naturally ventilated vernacular dwelling in Sugganahalli, a village close to Bangalore (India), set in a warm-humid climate, is considered for the present investigations into the impact of transition in building materials, change in climatic location and choice of thermal comfort model on energy in buildings. The study includes rigorous real-time monitoring of the thermal performance of the dwelling. Dynamic simulation models validated by measured data have also been adopted to determine the impact of the transition from vernacular to modern material configurations. Results of the study and an appraisal of appropriate thermal comfort standards for computing operational energy are presented and discussed in this paper. (c) 2014 K.I. Praseeda. Published by Elsevier Ltd.
Abstract:
We propose to develop a 3-D optical-flow-feature-based human action recognition system. Optical flow based features are employed here since, by design, they can capture the apparent movement of objects. Moreover, they can represent information hierarchically, from local pixel level to global object level. In this work, 3-D optical flow based features are extracted by combining the 2-D optical flow based features with the depth flow features obtained from a depth camera. In order to develop an action recognition system, we employ a Meta-Cognitive Neuro-Fuzzy Inference System (McFIS). The aim of McFIS is to find the decision boundary separating different classes based on their respective optical flow based features. McFIS consists of a neuro-fuzzy inference system (cognitive component) and a self-regulatory learning mechanism (meta-cognitive component). During supervised learning, the self-regulatory learning mechanism monitors the knowledge of the current sample with respect to the existing knowledge in the network and controls learning by deciding on sample deletion, sample learning or sample reserve strategies. The performance of the proposed action recognition system was evaluated on a proprietary data set consisting of eight subjects. Performance evaluation against a standard support vector machine classifier and an extreme learning machine indicates the improved performance of McFIS in recognizing actions based on 3-D optical flow based features.
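The feature construction step, combining a 2-D flow field with a per-pixel depth change into a 3-D flow descriptor, can be sketched as below. The stacking and the coarse grid pooling are illustrative assumptions for the sketch, not the paper's exact pipeline, and all function names are hypothetical.

```python
# Hedged sketch: given 2-D optical flow (u, v) from an RGB camera and a
# per-pixel depth change (the "depth flow" w) from a depth camera, stack
# them into a 3-D flow field and pool it into a fixed-length descriptor.

import numpy as np

def flow_3d_descriptor(u, v, depth_t0, depth_t1, grid=2):
    w = depth_t1 - depth_t0               # depth flow: per-pixel depth change
    flow = np.stack([u, v, w], axis=-1)   # H x W x 3 field of 3-D motion
    h, w_ = u.shape
    feats = []
    # coarse spatial pooling: mean 3-D flow vector in each grid cell
    for i in range(grid):
        for j in range(grid):
            cell = flow[i*h//grid:(i+1)*h//grid, j*w_//grid:(j+1)*w_//grid]
            feats.append(cell.reshape(-1, 3).mean(axis=0))
    return np.concatenate(feats)          # length 3 * grid * grid

# toy frames: uniform rightward motion, uniform approach in depth
u = np.ones((4, 4)); v = np.zeros((4, 4))
d0 = np.zeros((4, 4)); d1 = np.full((4, 4), 0.5)
desc = flow_3d_descriptor(u, v, d0, d1)
print(desc.shape)  # (12,)
```

Descriptors of this kind would then be fed to the classifier (McFIS in the paper) as the per-frame representation of the action.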
Abstract:
Clustering techniques which can handle incomplete data have become increasingly important due to varied applications in marketing research, medical diagnosis and survey data analysis. Existing techniques cope with missing values either by using data modification/imputation or by partial distance computation, both often unreliable depending on the number of features available. In this paper, we propose a novel approach for clustering data with missing values, which performs the task by Symmetric Non-Negative Matrix Factorization (SNMF) of a complete pair-wise similarity matrix computed from the given incomplete data. To accomplish this, we define a novel similarity measure based on the Average Overlap similarity metric which can effectively handle missing values without modification of data. Further, the similarity measure is more reliable than partial distances and inherently possesses the properties required to perform SNMF. The experimental evaluation on real-world datasets demonstrates that the proposed approach is efficient, scalable and shows significantly better performance compared to the existing techniques.
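The pipeline can be sketched in two stages: build a complete similarity matrix from only the features each pair has in common, then factorize it symmetrically. The similarity below is a generic stand-in (not the paper's Average Overlap measure), and the SNMF step uses the standard multiplicative update for minimizing ||S - HH^T||; both are assumptions for illustration.

```python
# Sketch under assumptions: pairwise similarity from commonly observed
# features (stand-in for Average Overlap), then symmetric NMF of S via
# the multiplicative update H <- H * (S H) / (H H^T H).

import numpy as np

def pairwise_similarity(X):
    """Complete n x n similarity from incomplete data (NaN = missing)."""
    n = X.shape[0]
    S = np.eye(n)
    for i in range(n):
        for j in range(i + 1, n):
            mask = ~np.isnan(X[i]) & ~np.isnan(X[j])  # features both observed
            if mask.any():
                d = np.abs(X[i, mask] - X[j, mask]).mean()
                S[i, j] = S[j, i] = 1.0 / (1.0 + d)
    return S

def snmf(S, k, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    H = rng.random((S.shape[0], k))          # nonnegative factor, n x k
    for _ in range(iters):
        H *= (S @ H) / (H @ H.T @ H + 1e-9)  # multiplicative update
    return H

X = np.array([[0.0, 0.1, np.nan],
              [0.1, np.nan, 0.0],
              [5.0, 5.2, 5.1],
              [np.nan, 5.0, 5.3]])
H = snmf(pairwise_similarity(X), k=2)
labels = H.argmax(axis=1)  # cluster = dominant factor per row
print(labels)
```

Note that no value of X is ever imputed: missing entries only shrink the feature set over which each pairwise similarity is computed, which is the property the abstract contrasts with modification/imputation approaches.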
Abstract:
Beyond product design, if the notion of product `lifecycle design' enforces the consideration of requirements from all the lifecycle phases of products, design for sustainability enforces the consideration of lifecycle design in the context of the lifecycles of other products, processes, institutions and their design. Consequently, sustainability requirements that need to be met by design are very diverse. In this article, we portray the nature of the design process for addressing sustainability requirements. This is done taking the example of designing an urban household organic waste management system that requires less water and reclaims nutrients.
Abstract:
An artificial neural network (ANN) is presented to predict the 28-day compressive strength of normal- and high-strength self-compacting concrete (SCC) and high-performance concrete (HPC) with high-volume fly ash. The ANN is trained with the data available in the literature on normal-volume fly ash, because data on SCC with high-volume fly ash are not available in sufficient quantity. Further, while predicting the strength of HPC, the same data meant for SCC have been used for training in order to economise on computational effort. The compressive strengths of SCC and HPC, as well as the slump flow of SCC, estimated by the proposed neural network are validated by experimental results.
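The modelling approach, a feed-forward network mapping mix proportions to strength, can be sketched as below. This is not the paper's trained network: the architecture, the four input features, and the synthetic target function are placeholders purely to make the example runnable.

```python
# Minimal one-hidden-layer ANN trained by full-batch gradient descent on
# synthetic stand-in data (NOT real SCC/HPC measurements). Inputs mimic
# normalized mix proportions; the target is a made-up normalized strength.

import numpy as np

rng = np.random.default_rng(1)
X = rng.random((200, 4))  # e.g. cement, fly ash, w/b ratio, SP dose (assumed)
y = (0.8 * X[:, 0] + 0.3 * X[:, 1]
     - 0.4 * X[:, 2] + 0.1 * X[:, 3]).reshape(-1, 1)  # synthetic target

W1 = rng.normal(0.0, 0.5, (4, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros((1, 1))

def forward(X):
    h = np.tanh(X @ W1 + b1)   # hidden layer
    return h, h @ W2 + b2      # linear output: predicted strength

_, p0 = forward(X)
loss0 = float(((p0 - y) ** 2).mean())

lr = 0.05
for _ in range(2000):
    h, p = forward(X)
    g = 2.0 * (p - y) / len(X)        # dLoss/dOutput
    gh = (g @ W2.T) * (1.0 - h ** 2)  # backprop through tanh
    W2 -= lr * (h.T @ g);  b2 -= lr * g.sum(axis=0)
    W1 -= lr * (X.T @ gh); b1 -= lr * gh.sum(axis=0)

_, p = forward(X)
loss = float(((p - y) ** 2).mean())
print(loss < loss0)  # True: training reduces the fit error
```

In the paper's setting, the network would instead be trained on literature data for normal-volume fly ash mixes, with measured 28-day strengths as targets.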
Abstract:
Lateral or transaxial truncation of cone-beam data can occur either due to the field-of-view limitation of the scanning apparatus or region-of-interest tomography. In this paper, we suggest two new methods to handle lateral truncation in helical scan CT. It is seen that reconstruction with laterally truncated projection data, assuming it to be complete, gives severe artifacts which even penetrate into the field of view. A row-by-row data completion approach using linear prediction is introduced for helical scan truncated data. An extension of this technique, known as the windowed linear prediction approach, is also introduced. The efficacy of the two techniques is shown using simulations with standard phantoms. A quantitative image quality measure of the resulting reconstructed images is used to evaluate the performance of the proposed methods against an extension of a standard existing technique.
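The row-by-row completion idea can be sketched on a single detector row: fit linear-prediction (AR) coefficients to the measured part of the row, then extrapolate the truncated samples from the most recent known ones. The AR order and the least-squares fit are assumptions of this sketch; the paper's windowed variant additionally applies a window to the extrapolated segment.

```python
# Illustrative linear-prediction completion of one laterally truncated
# projection row. A smooth sinusoid stands in for the row profile.

import numpy as np

def lp_extend(row, n_missing, order=4):
    # least-squares fit of row[t] ~ a . (row[t-1], ..., row[t-order])
    A = np.array([row[t - order:t][::-1] for t in range(order, len(row))])
    rhs = row[order:]
    a, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    out = list(row)
    for _ in range(n_missing):
        past = np.array(out[-order:][::-1])  # most recent sample first
        out.append(float(a @ past))          # extrapolate one sample
    return np.array(out)

t = np.linspace(0, np.pi, 64)
full = np.sin(t)          # stand-in for a complete projection row
trunc = full[:48]         # laterally truncated measurement
est = lp_extend(trunc, 16)
print(np.max(np.abs(est[48:] - full[48:])) < 1e-3)  # True for this profile
```

Completing each row this way before reconstruction is what suppresses the truncation artifacts that would otherwise penetrate into the field of view.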
Abstract:
Knoevenagel condensation of 2-acylcyclohexanones or 2-ethoxycarbonylcyclohexanone with either cyanoacetamide or malononitrile, followed by silver salt alkylation, gave the 5,6,7,8-tetrahydroisoquinolines (3a–i). Chromic acid oxidation of the 5,6,7,8-tetrahydroisoquinolines (3a–i) to the corresponding tetralones (4a–i), followed by sodium borohydride reduction and p-toluenesulphonic acid-catalysed dehydration of the resulting alcohols (5a–i), gave the 5,6-dihydroisoquinolines (6a–i). Reaction of the 5,6-dihydroisoquinolines (6a–g) with potassium amide in liquid ammonia gave a mixture of the 1,2-dihydroisoquinolines (7a–g) and the isoquinolines (8a–g). The C-1 unsubstituted 1,2-dihydroisoquinoline (7c) was found to be very unstable. In the case of the 5,6-dihydroisoquinolines (6h and 6i), reaction with potassium amide in liquid ammonia resulted in a mixture of 1-aminoisoquinoline (9) and the isoquinolines (8h and 8i). All the above compounds have been characterised by spectral data. A probable pathway for the formation of the 1,2-dihydroisoquinolines (7a–g) and the isoquinolines (8a–i) is suggested.
Abstract:
A transformation is suggested which can transform a non-Gaussian monthly hydrological time series into a Gaussian one. The suggested approach is verified with data from ten Indian rainfall time series. Incidentally, it is observed that once the deterministic trends are removed, the transformation leads to an uncorrelated process for monthly rainfall. The procedure for normalization is general enough that it should also be applicable to river discharges. This is verified to a limited extent by considering data from two Indian river discharges.
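One common building block of such normalizations is periodic standardization: each observation is transformed with the mean and standard deviation of its own calendar month. The sketch below illustrates only this step (the paper's full transformation is not specified here; trend removal and any power transform to correct skewness are omitted), and the rainfall-like series is synthetic.

```python
# Per-calendar-month standardization of a monthly series: removes the
# seasonal cycle in mean and variance, a typical first step when
# Gaussianizing monthly hydrological data. Synthetic data for illustration.

import numpy as np

def monthly_standardize(x):
    """x: 1-D series of consecutive monthly values (index 0 = January)."""
    x = np.asarray(x, dtype=float)
    z = np.empty_like(x)
    for m in range(12):
        vals = x[m::12]                            # all values for month m
        z[m::12] = (vals - vals.mean()) / vals.std()
    return z

rng = np.random.default_rng(0)
years = 30
seasonal_mean = 50 + 40 * np.sin(2 * np.pi * np.arange(12) / 12)
x = np.tile(seasonal_mean, years) + rng.normal(0, 5, 12 * years)
z = monthly_standardize(x)
print(abs(z.mean()) < 1e-9, abs(z.std() - 1.0) < 1e-9)  # True True
```

After this step, remaining non-Gaussianity (e.g. skewness within each month, common in rainfall) would still need a further transform, which is where approaches like the one in the abstract come in.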