10 results for Elements, High Throughput Data, electrophysiology, data processing, real-time analysis

at Cochin University of Science


Relevance:

100.00%

Publisher:

Abstract:

The present research problem is to study existing encryption methods and to develop a new technique that is superior in performance to existing techniques and, at the same time, can be readily incorporated into the communication channels of fault-tolerant hard real-time systems alongside existing error-checking/error-correcting codes, so that attempts at eavesdropping can be defeated. Many encryption methods are available today, each with its own merits and demerits. Similarly, many cryptanalysis techniques used by adversaries are also available.

Relevance:

100.00%

Publisher:

Abstract:

Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals. This could significantly improve software quality and is still a challenging field.
This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making early detection of software bugs that are otherwise hard to detect more effective, using static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application programs. An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is also proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimum allocation of data to banked memory, resulting in the minimum number of bank-switching instructions in embedded system software.
A relation matrix and a state transition diagram, formed for the active-memory-bank state transitions corresponding to each bank-selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, and applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces the state space created, contributing to improved model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards correct use of difficult microcontroller features when developing embedded systems.
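The redundant bank-switch detection described above can be illustrated with a toy state-tracking pass. This is only a sketch of the general idea, not the dissertation's relation-matrix implementation; the `BANKSEL`-style mnemonic and the straight-line (branch-free) code assumption are simplifications introduced here.

```python
# Toy sketch (not the dissertation's implementation): detect redundant
# bank-select instructions by tracking the active-bank state along a
# straight-line instruction sequence. Mnemonics are illustrative.

def find_redundant_bank_selects(instructions):
    """Return indices of bank-select instructions that re-select the
    currently active bank and are therefore redundant."""
    active_bank = None          # bank state unknown at entry
    redundant = []
    for i, instr in enumerate(instructions):
        op, *args = instr.split()
        if op == "BANKSEL":     # hypothetical bank-select mnemonic
            bank = int(args[0])
            if bank == active_bank:
                redundant.append(i)   # state transition is a no-op
            active_bank = bank
    return redundant

code = [
    "BANKSEL 0",
    "MOVWF 0x20",
    "BANKSEL 0",   # redundant: bank 0 is already active
    "ADDWF 0x21",
    "BANKSEL 1",
    "MOVWF 0xA0",
]
print(find_redundant_bank_selects(code))  # -> [2]
```

A real pass would additionally merge bank states across control-flow joins, which is where the dissertation's state transition diagram comes in.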

Relevance:

100.00%

Publisher:

Abstract:

This thesis is entitled "Plasma Polymerised Organic Thin Films: A Study on the Structural, Electrical, and Nonlinear Optical Properties for Possible Applications". Polymers and polymer-based materials find enormous application in the realm of electronics and optoelectronics. They are employed as both active and passive components in making various devices. Intense research activity has been going on in this area for the last three decades or so, and many useful contributions have been made quite accidentally. Conducting polymers are one such discovery, and ever since the discovery of conducting polyacetylene, a new branch of science has emerged in the form of synthetic metals. Conducting polymers are useful materials for many applications such as polymer displays, high-density data storage, polymer FETs, polymer LEDs, photovoltaic devices and electrochemical cells. With the emergence of molecular electronics and its potential for useful applications, organic thin films are receiving unusual attention from scientists and engineers alike. This is evident from the vast literature pertaining to this field appearing in various journals. Recently, computer-aided design of organic molecules has added further impetus to the ongoing research activities in this area. Polymers, especially conducting polymers, can be prepared both in bulk and in thin-film form. However, many applications require that they be grown in thin-film form, either free-standing or on appropriate substrates. Their bulk counterparts can be prepared by various polymerisation techniques, such as chemical routes and electrochemical means. A survey of the literature reveals that polymers like polyaniline, polypyrrole and polythiophene have been investigated with a view to studying their structural, electrical and optical properties.
Among the various techniques employed for the preparation of polymer thin films, the method of plasma polymerisation deserves special attention in this context. Plasma polymerisation is an inexpensive method and often requires very little infrastructure. The method can employ ac, rf, dc, microwave and pulsed sources, which produce pinhole-free homogeneous films on appropriate substrates under controlled conditions. In a conventional plasma polymerisation set-up, the monomer is fed into an evacuated chamber and an ac/rf/dc/microwave/pulsed discharge is created, which dissociates the monomer species, leading to the formation of polymer thin films. However, it has been found that the structure, and hence the properties, exhibited by plasma-polymerised thin films are quite different from those of their counterparts produced by other thin-film preparation techniques such as electrochemical deposition or spin coating. The properties of these thin films can be tuned only if the interrelationship between the structure and the other properties is understood from a fundamental point of view. So, very often, a thorough evaluation of the various properties is a prerequisite for tailoring the properties of the thin films for applications. It has been found that conjugation is a necessary condition for enhancing the conductivity of polymer thin films. The rf technique of plasma polymerisation is an excellent tool for inducing conjugation, and this modifies the electrical properties too. Both oxidative and reductive doping can be employed to modify the electrical properties of the polymer thin films for various applications. This is where polymer-based organic thin films score over inorganic thin films: large-area devices can be fabricated with organic semiconductors, which is difficult to achieve with inorganic materials. For such applications, a variety of polymers have been synthesized, such as polyaniline, polythiophene and polypyrrole.
Newer polymers are added to this family every now and then. There are many virgin areas into which plasma polymers are yet to make a foray, namely low-k dielectrics or potential nonlinear optical materials such as optical limiters. There are also many materials that have not yet been prepared by plasma polymerisation; among those not yet dealt with are phenyl hydrazine and tea tree oil. The advantage of employing organic extracts like tea tree oil monomers as precursors for making plasma polymers is that there can be value addition to their already existing uses, and the possibility exists of converting them into electronic-grade materials, especially semiconductors and optically active materials for photonic applications. One of the major motivations of this study is to synthesize plasma polymer thin films based on aniline, phenyl hydrazine, pyrrole, tea tree oil and eucalyptus oil, employing both rf and ac plasma polymerisation techniques. This will be carried out with the objective of growing thin films on various substrates such as glass, quartz and indium tin oxide (ITO) coated glass. Various properties, namely structural, electrical, dielectric permittivity and nonlinear optical properties, are to be evaluated to establish the relationship between the structure and the other properties. Special emphasis will be laid on evaluating optical parameters like the refractive index (n), the extinction coefficient (k), the real and imaginary components of the dielectric constant and the optical transition energies of the polymer thin films from spectroscopic ellipsometric studies. Apart from evaluating these physical constants, it is also possible to predict from ellipsometric investigations whether a material exhibits nonlinear optical properties.
Further study using the open-aperture z-scan technique, in order to evaluate the nonlinear optical properties of a few selected samples that are potential nonlinear optical materials, is another objective of the present work, as is offering an appropriate explanation for the nonlinear optical properties displayed by these films. Doping of plasma polymers is found to modify both the electrical conductivity and the optical properties. Iodine, in particular, is found to modify the properties of polymer thin films. However, in-situ iodine doping is tricky, and the film often loses its stability because of the escape of iodine. An appropriate in-situ technique will therefore be developed to dope iodine into the plasma-polymerised thin films. Doping polymer thin films with iodine results in improved and modified optical and electrical properties, but tools like FTIR and UV-Vis-NIR spectroscopy are required to elucidate the structural and optical modifications imparted to the polymer films. This will be attempted here to establish the role of iodine in the modification of the properties exhibited by the films.
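The open-aperture z-scan measurement mentioned above is commonly analysed with the standard leading-order transmittance model for a two-photon absorber, T(z) ≈ 1 − q0 / (2√2 (1 + z²/z0²)). The sketch below uses entirely synthetic data and an assumed beam parameter z0 to show how the nonlinear parameter q0 could be recovered from such a trace; it is a generic textbook fit, not the thesis's analysis.

```python
import numpy as np

# Leading-order open-aperture z-scan model for a two-photon absorber:
#     T(z) ~ 1 - q0 / (2*sqrt(2) * (1 + (z/z0)^2))
# Synthetic data; z0 is assumed known here for brevity.

def transmittance(z, q0, z0):
    x = z / z0
    return 1.0 - q0 / (2.0 * np.sqrt(2.0) * (1.0 + x**2))

rng = np.random.default_rng(1)
z = np.linspace(-20e-3, 20e-3, 81)            # sample positions (m)
data = transmittance(z, 0.4, 4e-3) + rng.normal(0, 0.002, z.size)

# Crude grid-search least-squares fit for q0
q_grid = np.linspace(0.0, 1.0, 1001)
resid = [((transmittance(z, q, 4e-3) - data) ** 2).sum() for q in q_grid]
q_hat = q_grid[int(np.argmin(resid))]
print(round(q_hat, 2))   # recovers ~0.4
```

In practice both q0 and z0 would be fitted simultaneously, and q0 then converted to the two-photon absorption coefficient using the known pulse intensity and sample thickness.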

Relevance:

100.00%

Publisher:

Abstract:

Holographic technology is at the dawn of rapid evolution in various new areas including holographic data storage, holographic optical elements, artificial intelligence, optical interconnects, optical correlators, commerce, medical practice, holographic weapon sights, night vision goggles and games. One of the major obstacles to the success of holographic technology is the lack of a suitable recording medium. Compared with other holographic materials such as dichromated gelatin and silver halide emulsions, photopolymers have the great advantage of recording and reading holograms in real time, and their spectral sensitivity can easily be shifted to match the recording laser simply by changing the sensitizing dye. These materials also possess characteristics such as good light sensitivity, real-time image development, large dynamic range, good optical properties, format flexibility and low cost. This thesis describes the attempts made to fabricate highly economical photopolymer films for various holographic applications. In the present work, poly(vinyl alcohol) (PVA) and poly(vinyl chloride) (PVC) are selected as the host polymer matrices, and methylene blue (MB) is used as the photosensitizing dye. The films were fabricated using the gravity settling method. No chemical treatment or pre/post exposure was applied to the films. As the outcome of the work, photopolymer films with more than 70% efficiency, a permanent recording material requiring no fixing process, and a reusable recording material were fabricated.

Relevance:

100.00%

Publisher:

Abstract:

An increase in sea surface temperature with global warming has an impact on coastal upwelling. The past two decades (1988 to 2007) of satellite-observed sea surface temperatures and spaceborne scatterometer-measured winds have provided an insight into the dynamics of coastal upwelling in the southeastern Arabian Sea under the global warming scenario. These high-resolution data products have shown inconsistent variability, with a rapid rise in sea surface temperature between 1992 and 1998 and again from 2004 to 2007. The upwelling indices derived from both sea surface temperature and wind show that the intensity of upwelling increased during the period 1998 to 2004 compared with the previous decade. These indices have been modulated by extreme climatic events like the El Niño and Indian Ocean Dipole events of 1991-92 and 1997-98; a considerable drop in the intensity of upwelling was observed concurrent with these events. Apart from the impact of global warming on upwelling, the present study also provides an insight into the spatial variability of upwelling along the coast. A noticeable fact is that the intensity of offshore Ekman transport off 8°N during the winter monsoon is as high as that during the usual upwelling season in the summer monsoon. A drop in the meridional wind speed during the years 2005, 2006 and 2007 resulted in an extreme decrease in upwelling, even though the zonal wind and the total wind magnitude were a notch higher than in previous years. This decrease in upwelling strength has resulted in reduced productivity too.
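The offshore Ekman transport index mentioned above is conventionally computed from the alongshore wind stress via the bulk formula M = τ / (ρ_sea · f). The snippet below is a generic textbook sketch; the drag coefficient and densities are typical values, not necessarily those used in this study.

```python
import math

# Bulk-formula sketch of offshore Ekman transport for an upwelling index.
# Coefficients are typical textbook values (illustrative only).

RHO_AIR = 1.22      # air density, kg m^-3
RHO_SEA = 1025.0    # seawater density, kg m^-3
C_D = 1.3e-3        # drag coefficient (dimensionless)
OMEGA = 7.2921e-5   # Earth's rotation rate, rad s^-1

def ekman_transport(alongshore_wind, wind_speed, lat_deg):
    """Offshore Ekman volume transport (m^2 s^-1 per metre of coastline).

    alongshore_wind : alongshore wind component (m/s)
    wind_speed      : total wind speed (m/s)
    lat_deg         : latitude in degrees
    """
    tau = RHO_AIR * C_D * wind_speed * alongshore_wind  # wind stress, N m^-2
    f = 2.0 * OMEGA * math.sin(math.radians(lat_deg))   # Coriolis parameter
    return tau / (RHO_SEA * f)

# Example: a 6 m/s alongshore wind at 8 N (roughly 2.7 m^2/s per m of coast)
print(round(ekman_transport(6.0, 6.0, 8.0), 3))
```

The small Coriolis parameter at 8°N is why even a modest drop in the meridional (alongshore) wind produces a large change in the computed transport.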

Relevance:

100.00%

Publisher:

Abstract:

Preparation of an appropriate optical-fiber preform is vital for the fabrication of graded-index polymer optical fibers (GIPOF), which are considered a good choice for providing inexpensive high-bandwidth data links for local area networks and telecommunication applications. Recent development of the interfacial gel polymerization technique has caused a dramatic reduction in the total attenuation in GIPOF, and this is one of the potential methods to prepare fiber preforms for the fabrication of dye-doped polymer-fiber amplifiers. In this paper, the preparation of a dye-doped graded-index poly(methyl methacrylate) (PMMA) rod by the interfacial gel polymerization method using a PMMA tube is reported. An organic compound of high refractive index, viz. diphenyl phthalate (DPP), was used to obtain a graded-index distribution, and Rhodamine B (Rh B) was used to dope the PMMA rod. The refractive-index profile of the rod was measured using an interferometric technique and the index exponent was estimated. The single-pass gain of the rod was measured at a pump wavelength of 532 nm. The extent of doping of Rh B in the preform was studied by axially exciting a thin slice of the rod with white light and measuring the spatial variation of the fluorescence intensity across the sample.
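The graded-index distribution and "index exponent" mentioned above are usually described by the standard power-law profile n(r)² = n1²(1 − 2Δ(r/a)^g), where g is the index exponent. The sketch below evaluates that textbook profile; all parameter values are illustrative, not the measured parameters of the reported preform.

```python
import math

# Standard power-law (index-exponent) refractive-index profile for a
# graded-index preform/fiber:
#     n(r)^2 = n1^2 * (1 - 2*Delta*(r/a)^g),   0 <= r <= a
# Parameter values below are illustrative only.

def graded_index(r, n1=1.505, n2=1.492, a=5.0e-3, g=2.0):
    """Refractive index at radius r (m), core radius a, index exponent g."""
    delta = (n1**2 - n2**2) / (2.0 * n1**2)   # relative index difference
    if r >= a:
        return n2                              # cladding (here: tube) index
    return n1 * math.sqrt(1.0 - 2.0 * delta * (r / a) ** g)

print(graded_index(0.0))      # axial index n1
print(graded_index(5.0e-3))   # edge index n2
```

Fitting g to an interferometrically measured profile is what "estimating the index exponent" amounts to; g ≈ 2 gives the parabolic profile that minimises modal dispersion.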

Relevance:

100.00%

Publisher:

Abstract:

This thesis is entitled "Photonic Applications of Biomaterials with Special Reference to Biopolymers and Microbes". The thesis presents a detailed investigation of direct applications of biopolymers in selected areas of photonics, and of how the growth kinetics of an aerial bacterial colony on solid agar media was studied using the laser-induced fluorescence technique. This chapter is an overview of the spectrum of biomaterials and their application to photonics. The chapter discusses a wide range of biomaterials-based photonics applications, such as efficient harvesting of solar energy, low-threshold lasing, high-density data storage, optical switching, filtering and templates for nanostructures. The most extensively investigated photonics application in biology is the laser-induced fluorescence technique. The importance of fluorescence studies in different biological and related fields is also mentioned in this chapter.

Relevance:

100.00%

Publisher:

Abstract:

In recent years, the protection of information in digital form has become more important. Image and video encryption has applications in various fields including Internet communications, multimedia systems, medical imaging, telemedicine and military communications. During storage as well as in transmission, multimedia information is exposed to unauthorized entities unless adequate security measures are built around the information system. There are many kinds of security threats during the transmission of vital classified information through insecure communication channels. Various encryption schemes are available today to deal with information security issues. Data encryption is widely used to protect sensitive data against the security threat in the form of an "attack on confidentiality". Secure transmission of information through insecure communication channels also requires encryption at the sending side and decryption at the receiving side. Encryption of large text messages and images takes time before they can be transmitted, causing considerable delay in the successive transmission of information in real time. In order to minimize this latency, efficient encryption algorithms are needed. An encryption procedure with adequate security and high throughput is sought in multimedia encryption applications. Traditional symmetric-key block ciphers like the Data Encryption Standard (DES), the Advanced Encryption Standard (AES) and the Escrowed Encryption Standard (EES) are not efficient when the data size is large. With the availability of fast computing tools and communication networks at relatively low cost today, these encryption standards appear not to be as fast as one would like. High-throughput encryption and decryption are becoming increasingly important in the area of high-speed networking, and fast encryption algorithms are needed for high-speed secure communication of multimedia data.
It has been shown that public-key algorithms are not a substitute for symmetric-key algorithms. Public-key algorithms are slow, whereas symmetric-key algorithms generally run much faster; public-key systems are also vulnerable to chosen-plaintext attack. In this research work, a fast symmetric-key encryption scheme, entitled "Matrix Array Symmetric Key (MASK) encryption" and based on matrix and array manipulations, has been conceived and developed. Fast conversion has been achieved with the use of matrix table look-up substitution, array-based transposition and circular shift operations performed in the algorithm. MASK encryption is a new concept in symmetric-key cryptography. It employs a matrix and array manipulation technique using secret information and data values. It is a block cipher operating on plaintext message (or image) blocks of 128 bits, using a secret key of 128 bits, and producing ciphertext message (or cipher image) blocks of the same size. This cipher has two advantages over traditional ciphers. First, the encryption and decryption procedures are much simpler and, consequently, much faster. Second, the key avalanche effect produced in the ciphertext output is better than that of AES.
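As a purely illustrative sketch of the three operation classes the abstract names (table look-up substitution, array-based transposition and circular shifts), the toy 128-bit block cipher below applies one keyed pass of each. It is NOT the MASK algorithm, whose tables, key schedule and round structure are not given here, and it is not secure; every concrete choice is invented.

```python
# Toy illustration (NOT MASK, and not secure): substitution, transposition
# and circular shift applied to a 16-byte (128-bit) block with a 16-byte key.

def _sbox(key):
    # Key-dependent byte substitution table (a simple keyed permutation).
    perm = list(range(256))
    j = 0
    for i in range(256):
        j = (j + perm[i] + key[i % 16]) % 256
        perm[i], perm[j] = perm[j], perm[i]
    return perm

def encrypt_block(block, key):
    assert len(block) == 16 and len(key) == 16
    sbox = _sbox(key)
    out = [sbox[b] for b in block]                    # table look-up substitution
    out = [out[(5 * i + 3) % 16] for i in range(16)]  # array transposition
    s = key[0] % 8
    out = [((b << s) | (b >> (8 - s))) & 0xFF for b in out]  # circular shift
    return bytes(out)

def decrypt_block(cipher, key):
    sbox = _sbox(key)
    inv_sbox = [0] * 256
    for i, v in enumerate(sbox):
        inv_sbox[v] = i
    s = key[0] % 8
    out = [((b >> s) | (b << (8 - s))) & 0xFF for b in cipher]  # un-rotate
    undo = [0] * 16
    for i in range(16):                               # invert the transposition
        undo[(5 * i + 3) % 16] = out[i]
    return bytes(inv_sbox[b] for b in undo)           # invert the substitution

key = bytes(range(1, 17))
msg = b"16-byte message!"
ct = encrypt_block(msg, key)
assert decrypt_block(ct, key) == msg   # round-trip check
```

These operations are cheap byte-level table indexing and shifts, which is the general reason look-up/transpose/rotate designs can run faster than ciphers built from heavier arithmetic.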

Relevance:

100.00%

Publisher:

Abstract:

The towed array electronics is a multi-channel, simultaneous, real-time, high-speed data acquisition system. Since its assembly is highly manpower intensive, the costs of arrays are prohibitive, and therefore any attempt to reduce the manufacturing, assembly, testing and maintenance costs is a welcome proposition. The network-based towed array is an innovative concept, and its implementation has remarkably simplified fabrication, assembly and testing and revolutionised the towed array scenario. The focus of this paper is to give a good insight into the reliability aspects of the network-based towed array. A case study comparing the conventional array and the network-based towed array is also presented.

Relevance:

100.00%

Publisher:

Abstract:

The study of variable stars is an important topic of modern astrophysics. With the advent of powerful telescopes and high-resolution CCDs, variable star data are accumulating on the order of petabytes. This huge amount of data requires many automated methods as well as human experts. This thesis is devoted to data analysis of variable stars' astronomical time series data and hence belongs to the interdisciplinary field of astrostatistics. For an observer on Earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various mechanisms. In some cases the variation is due to internal thermonuclear processes; such stars are generally known as intrinsic variables. In other cases it is due to external processes, like eclipses or rotation; these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospherical variables. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira and other types. The eruptive or cataclysmic variables are novae, supernovae, etc., which occur rarely and are not periodic phenomena; most other variations are periodic in nature. Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as a light curve. If the time series data are folded on a period, the plot of apparent magnitude against phase is known as a phased light curve. The unique shape of the phased light curve is a characteristic of each type of variable star.
One way to identify the type of a variable star and classify it is for an expert to inspect the phased light curve visually. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into different stages: observation, data reduction, data analysis, modeling and classification. The modeling of variable stars helps to determine short-term and long-term behaviour, to construct theoretical models (for example, the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties like mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters like period, amplitude and phase, as well as some derived parameters. Of these, the period is the most important, since wrong periods can lead to sparse light curves and misleading information. Time series analysis is a method of applying mathematical and statistical tests to data in order to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system and predict its future behavior. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of big gaps. This is due to daily varying daylight and weather conditions for ground-based observations, while observations from space may suffer from the impact of cosmic-ray particles. Many large-scale astronomical surveys, such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS, provide variable star time series data, even though their primary intention is not variable star observation.
The Center for Astrostatistics, Pennsylvania State University, was established to help the astronomical community with statistical tools for harvesting and analysing archival data. Most of these surveys release their data to the public for further analysis. There exist many period-search algorithms for astronomical time series analysis, which can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which do not assume any statistical model such as a Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, like the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and Significant Spectrum (SigSpec) by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of them can fully recover the true periods. The wrong detection of a period can be due to several reasons, such as power leakage to other frequencies, which arises from the finite total interval, the finite sampling interval and the finite amount of data. Another problem is aliasing, which is due to the influence of regular sampling. Spurious periods also appear due to long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence, obtaining the exact period of a variable star from its time series data is still a difficult problem in the case of huge databases subjected to automation. As Matthew Templeton, AAVSO, states, "Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial". Derekas et al. (2007) and Deb et al. (2010) state, "The processing of huge amounts of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification".
It will be beneficial for the variable star astronomical community if basic parameters such as period, amplitude and phase are obtained more accurately when huge time series databases are subjected to automation. In the present thesis work, the theories of four popular period-search methods are studied, the strengths and weaknesses of these methods are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and their entry into the "General Catalogue of Variable Stars" or other databases like the "Variable Star Index", the characteristics of the variability have to be quantified in terms of variable star parameters.
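Of the period-search methods discussed, Phase Dispersion Minimisation lends itself to a compact sketch: fold the light curve on each trial period, bin the phases, and compare the pooled within-bin variance with the total variance. The minimal implementation below uses synthetic, unevenly sampled data; it follows the spirit of Stellingwerf (1978) but is not the thesis's implementation.

```python
import numpy as np

# Minimal Phase Dispersion Minimisation (PDM) sketch: the ratio theta of
# pooled within-bin variance to total variance is smallest near the true
# period. Synthetic, unevenly sampled data for illustration.

def pdm_statistic(t, mag, period, nbins=10):
    phase = (t / period) % 1.0
    total_var = mag.var(ddof=1)
    s2, dof = 0.0, 0
    for k in range(nbins):
        in_bin = mag[(phase >= k / nbins) & (phase < (k + 1) / nbins)]
        if len(in_bin) > 1:
            s2 += (len(in_bin) - 1) * in_bin.var(ddof=1)
            dof += len(in_bin) - 1
    return (s2 / dof) / total_var   # theta: small near the true period

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 100, 400))          # unevenly spaced epochs
true_period = 3.7
mag = 12.0 + 0.5 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.05, 400)

trial = np.arange(1.0, 6.0, 0.001)
theta = np.array([pdm_statistic(t, mag, p) for p in trial])
print(round(trial[theta.argmin()], 2))   # close to 3.7
```

Because PDM makes no assumption about the light-curve shape, it handles the non-sinusoidal curves of eclipsing binaries and RR Lyrae stars, at the cost of sensitivity to the choice of bin width.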