7 results for Low Autocorrelation Binary Sequence Problem
at Cochin University of Science
Abstract:
This study concerns the fabrication and characterization of spray-pyrolysed cadmium sulphide homojunction solar cells. As an alternative to conventional energy sources, PV technology has to be improved. Studying the factors that affect the performance of existing solar cells will result in enhanced cell efficiency. At the same time, it is equally important to pursue R&D on new photovoltaic devices and processes that are less expensive for large-scale production. CdS is an important binary compound semiconductor that is very useful in the field of photovoltaics, and large-area CdS thin films are easy to prepare. To fabricate thin-film homojunction cadmium sulphide cells, SnO2 thin films were prepared and characterized as the lower electrode, p-CdS as the active layer and n-CdS as the window layer. The cadmium used in the fabrication of homojunction solar cells is highly toxic; continued exposure to even low levels of cadmium mainly damages the kidneys, lungs and bones. The real advantage of the spray pyrolysis process is that no toxic gases are emitted during deposition, and only very low concentrations of the chemicals are needed. Although the materials are toxic, the risk involved is therefore very low. For large-scale usage it may become necessary for companies to buy back cells at the end of their life to retrieve chemicals such as cadmium; this would reduce both environmental problems and material wastage.
Abstract:
Modern computer systems are plagued with stability and security problems: applications lose data, web servers are hacked, and systems crash under heavy load. Many of these problems, or anomalies, arise from rare program behaviour caused by attacks or errors. A substantial percentage of web-based attacks are due to buffer overflows. Many methods have been devised to detect and prevent the anomalous situations that arise from buffer overflows, but current anomaly detection systems are relatively primitive and depend mainly on static code checking to handle buffer overflow attacks; stack guards and heap guards are also widely used for protection. This dissertation proposes an anomaly detection system based on the frequencies of system calls in the system call trace. System call traces represented as frequency sequences are profiled using sequence sets, where a sequence set is identified by its starting sequence and the frequencies of specific system calls. The deviation of the current input sequence from the corresponding normal profile of system call frequencies is computed and expressed as an anomaly score, and a simple Bayesian model is used for accurate detection. Experimental results are reported which show that the frequency of system calls, represented using sequence sets, captures the normal behaviour of programs under normal conditions of usage. This captured behaviour allows the system to detect anomalies with a low rate of false positives. Data are presented which show that a Bayesian network over frequency variations responds effectively to induced buffer overflows; it can also help administrators detect deviations in program flow introduced by errors.
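As a rough illustration of the frequency-based scoring idea only (not the dissertation's actual model), the sketch below builds a mean-frequency profile from normal system-call traces and scores a new trace by its total deviation from that profile; the sequence-set structure and the Bayesian step are omitted, and all function names are hypothetical.

```python
from collections import Counter

def frequency_profile(traces):
    """Mean frequency of each system call across normal training traces."""
    counts = [Counter(t) for t in traces]
    calls = set().union(*counts)
    n = len(traces)
    return {c: sum(k[c] for k in counts) / n for c in calls}

def anomaly_score(trace, profile):
    """Sum of absolute deviations of observed call frequencies
    from the normal profile; larger means more anomalous."""
    observed = Counter(trace)
    calls = set(observed) | set(profile)
    return sum(abs(observed[c] - profile.get(c, 0.0)) for c in calls)

normal_traces = [["open", "read", "read", "close"],
                 ["open", "read", "close"]]
profile = frequency_profile(normal_traces)

# A trace resembling training data scores low; a trace with unseen
# calls (e.g. an injected execve) and distorted frequencies scores high.
print(anomaly_score(["open", "read", "read", "close"], profile))
print(anomaly_score(["open"] * 20 + ["execve"] * 5, profile))
```

In practice, a threshold on the score (or, as in the dissertation, a probabilistic model over the deviations) separates normal from anomalous behaviour.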
Abstract:
Latex protein allergy is a serious problem faced by users of natural rubber latex products. It is severe in health care workers, who constantly use latex products such as examination gloves and surgical gloves. Of the total proteins, only a small fraction is extractable, and only these proteins cause allergic reactions in sensitized people. Enzymic deproteinisation of latex, and leaching and chlorination of latex products, are the common methods used to reduce the severity of the problem. Enzymic deproteinisation is a cumbersome process involving high cost and process loss, and the physical properties of the resulting films are poor. Leaching is a lengthy process, and in leached latex products extractable proteins reappear on further storage. Chlorination causes yellowing of latex products and a reduction in tensile properties. In this context a simpler process of removing extractable proteins from the latex itself was investigated. This thesis reports the application of polypropylene glycol (PPG) to displace extractable proteins from natural latex. PPG is added to 60 % centrifuged natural latex to the extent of 0.2 % m/m; the latex is subsequently diluted to 30 % dry rubber content and again concentrated to obtain a low-protein latex. Dilution of the concentrated latex and subsequent concentration lead to a total reduction of non-rubber solids in the concentrate, especially proteins, and a reduction in the ionic concentration in the aqueous phase of the latex. It has been reported that proteins in natural rubber and latex affect its behaviour in the vulcanisation process, and that the ionic concentration in the aqueous phase of latex influences the stability, viscosity and flow behaviour of natural latex. Hence, a detailed technological evaluation was carried out on this low-protein latex.
In this study, low-protein latex was compared with single-centrifuged latex (the raw material for almost every latex product) and double-centrifuged latex (because dilution and a second concentration of latex are accompanied by some protein removal and a reduction in the ionic concentration of the aqueous phase). Studies were conducted on sulphur cure in conventional and EV systems, under conditions of both post-cure and prevulcanisation of the latex, and on radiation cure in the latex stage. The extractable protein content of vulcanised low-protein latex films is observed to be very low. This low-protein latex cures somewhat more slowly than single-centrifuged latex, but faster than double-centrifuged latex. The modulus of low-protein latex films was slightly low; in general, the physical properties of vulcanised low-protein latex films are only slightly lower than those of single-centrifuged latex. Ageing properties of the low-protein latex films were satisfactory. The viscosity and flow behaviour of low-protein latex are much better than those of double-centrifuged latex and almost comparable to single-centrifuged latex. Since the physical properties and flow behaviour of the low-protein latex were satisfactory, it was used for the preparation of examination gloves, and the gloves were evaluated; their properties conform to the Indian Standard Specifications. It is thus observed that PPG treatment of natural latex is a simple process for preparing low-protein latex. The extractable protein content of these films is very low, their physical properties are comparable to ordinary centrifuged latex and better than conventionally deproteinised latex films, and the latex can be used for the production of examination gloves.
Abstract:
The primary objective of this work is to develop an efficient accelerator system for the low-temperature vulcanization of rubbers. Although xanthates are known to act as accelerators for low-temperature vulcanization, a systematic study of the mechanism of vulcanization, the mechanical properties of the vulcanizates at varying vulcanization temperatures, cure characteristics, etc. has not been reported. Further, xanthate-based curing systems are not commonly used because of their tendency toward premature vulcanization during processing. The proposed study is to develop a novel accelerator system for the low-temperature vulcanization of rubbers that offers sufficient processing safety. It is also proposed to develop a method for the prevulcanisation of natural rubber latex at room temperature. As already mentioned, manufacturing rubber products at low temperature will improve their quality and appearance, and energy consumption can be reduced. In addition, low-temperature vulcanization will be extremely useful in the repair of defective products, since subjecting finished products to high temperatures during repair adversely affects product quality. Further, room-temperature curing accelerator systems will find extensive applications in the surface coating industries.
Abstract:
Considerable research effort has been devoted to predicting the exon regions of genes. The binary indicator (BI) method, the electron-ion interaction pseudopotential (EIIP) method and filter methods are some examples, all of which exploit the period-three behaviour of the exon region. Although the method suggested in this paper is similar to the above-mentioned methods, it introduces a set of numeric sequences for mapping the nucleotides, selected by applying a genetic algorithm, and is found to be more promising.
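A minimal sketch of the period-three idea these methods share: map each nucleotide to a number and look for a spectral peak at frequency N/3. The standard EIIP values are used here for illustration; the paper's GA-selected mapping is not reproduced, and `period3_strength` is a hypothetical helper.

```python
import numpy as np

# Standard EIIP (electron-ion interaction pseudopotential) values
EIIP = {"A": 0.1260, "C": 0.1340, "G": 0.0806, "T": 0.1335}

def period3_strength(seq):
    """Power of the DFT of the mapped sequence at frequency N/3.
    Exon regions tend to show a peak here (period-three behaviour)."""
    x = np.array([EIIP[b] for b in seq], dtype=float)
    x -= x.mean()                  # remove the DC component
    N = len(x)
    X = np.fft.fft(x)
    return abs(X[N // 3]) ** 2

# A sequence with a strong 3-base repeat scores far higher than a
# uniform one of the same length.
print(period3_strength("ATG" * 30))
print(period3_strength("A" * 90))
```

Sliding this measure along a genome and thresholding it is the common way such methods flag candidate exon regions.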
Abstract:
Increasing amounts of plastic waste in the environment have become a problem of gigantic proportions. The case of linear low-density polyethylene (LLDPE) is especially significant, as it is widely used for packaging and other applications. This synthetic polymer is normally not biodegradable until it is degraded into low-molecular-mass fragments that can be assimilated by microorganisms. Blends of non-biodegradable polymers and biodegradable commercial polymers such as poly(vinyl alcohol) (PVA) can facilitate a reduction in the volume of plastic waste when they undergo partial degradation; the remaining fragments then stand a greater chance of biodegrading in a much shorter span of time. In this investigation, LLDPE was blended with different proportions of PVA (5–30%) in a torque rheometer, and mechanical, thermal and biodegradation studies were carried out on the blends. The biodegradability of the LLDPE/PVA blends was studied in two environments, (1) a culture medium containing Vibrio sp. and (2) a soil environment, both over a period of 15 weeks. Blends exposed to the culture medium degraded more than those exposed to the soil environment. Changes in various properties of the LLDPE/PVA blends before and after degradation were monitored using Fourier transform infrared spectroscopy, differential scanning calorimetry (DSC) for crystallinity, and scanning electron microscopy (SEM) for surface morphology, among other techniques. Percentage crystallinity decreased as the PVA content increased, and biodegradation resulted in an increase of crystallinity in the LLDPE/PVA blends. The results prove that partial biodegradation of the blends has occurred, holding promise for an eventual biodegradable product.
Abstract:
The study of variable stars is an important topic of modern astrophysics. Since the advent of powerful telescopes and high-resolution CCDs, variable star data have been accumulating on the order of petabytes. This huge amount of data calls for many automated methods as well as human experts. This thesis is devoted to data analysis of variable stars' astronomical time series data and hence belongs to the interdisciplinary topic of Astrostatistics. For an observer on Earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various mechanisms. In some cases the variation is due to internal thermonuclear processes; such stars are generally known as intrinsic variables. In other cases it is due to external processes, such as eclipse or rotation; these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospherical stars. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables are novae, supernovae, etc., which occur rarely and are not periodic phenomena; most of the other variations are periodic in nature. Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as its light curve. If the time series data are folded on a period, the plot of apparent magnitude against phase is known as the phased light curve. The unique shape of the phased light curve is a characteristic of each type of variable star.
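The phase folding described above can be sketched in a few lines; the data here are simulated, and `fold` is a hypothetical helper, not code from the thesis.

```python
import numpy as np

def fold(times, period):
    """Fold observation times on a trial period: phase in [0, 1)."""
    return np.mod(times, period) / period

# Simulated unevenly sampled observations of a sinusoidal variable
# with a true period of 2.5 days.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 50.0, 200))
mag = 12.0 + 0.3 * np.sin(2 * np.pi * t / 2.5)

phase = fold(t, 2.5)
# Sorted by phase, the magnitudes trace one smooth cycle: this
# (phase, mag) scatter is the phased light curve.
order = np.argsort(phase)
print(phase[order][:3], mag[order][:3])
```

Folding on a wrong trial period scatters the points instead of collapsing them onto a single cycle, which is what period search methods exploit.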
One way to identify the type of a variable star and classify it is for an expert to visually inspect the phased light curve. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into different stages: observation, data reduction, data analysis, modeling and classification. Modeling of variable stars helps to determine short-term and long-term behaviour, to construct theoretical models (e.g. the Wilson-Devinney model for eclipsing binaries), and to derive stellar properties such as mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters such as period, amplitude and phase, along with some other derived parameters. Of these, the period is the most important, since wrong periods lead to sparse light curves and misleading information. Time series analysis is the application of mathematical and statistical tests to data in order to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system and predict its future behaviour. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of big gaps. For ground-based observations this is due to the daily cycle of daylight and to weather conditions, while observations from space may suffer from the impact of cosmic ray particles. Many large-scale astronomical surveys, such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS, provide variable star time series data, even though their primary intention is not variable star observation.
The Center for Astrostatistics at Pennsylvania State University was established to help the astronomical community with statistical tools for harvesting and analysing archival data, and most of these surveys release their data to the public for further analysis. There exist many period search algorithms for astronomical time series analysis, which can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which do not assume any statistical model such as a Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, such as the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and Significant Spectrum (SigSpec) by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of them fully recovers the true periods. Wrong detection of the period can arise for several reasons, such as power leakage to other frequencies caused by the finite total interval, finite sampling interval and finite amount of data. Another problem is aliasing, due to the influence of regular sampling; spurious periods also appear due to long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data is still a difficult problem for huge databases subjected to automation. As Matthew Templeton (AAVSO) states, "Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial". Derekas et al. (2007) and Deb et al. (2010) state, "The processing of huge amount of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification".
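As a rough sketch of the non-parametric approach, the following implements a simplified version of Stellingwerf's PDM statistic: fold the data on a trial period, bin the phases, and compare the pooled within-bin variance to the overall variance. Minima of the statistic over trial periods mark candidate true periods. Equal-width bins are assumed, which simplifies the published estimator.

```python
import numpy as np

def pdm_theta(t, m, period, nbins=10):
    """Simplified PDM statistic: pooled within-bin variance of the
    phased magnitudes divided by the overall variance. Near the true
    period the phased curve is smooth, so theta drops well below 1."""
    phase = np.mod(t, period) / period
    overall_var = np.var(m, ddof=1)
    bins = np.floor(phase * nbins).astype(int)
    num = den = 0.0
    for b in range(nbins):
        mb = m[bins == b]
        if len(mb) > 1:
            num += (len(mb) - 1) * np.var(mb, ddof=1)
            den += len(mb) - 1
    return (num / den) / overall_var

# Simulated unevenly sampled variable with true period 3.1 days.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 40.0, 300))
m = 10.0 + 0.5 * np.sin(2 * np.pi * t / 3.1) + rng.normal(0, 0.02, 300)

trials = np.linspace(2.0, 4.0, 401)
best = trials[np.argmin([pdm_theta(t, m, p) for p in trials])]
print(best)  # close to the true period, 3.1
```

Because PDM only asks that the phased curve be smooth, it handles non-sinusoidal shapes (e.g. eclipsing binaries) that trouble pure Fourier methods, at the cost of scanning a dense grid of trial periods.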
It will be beneficial for the variable star astronomical community if basic parameters such as period, amplitude and phase can be obtained more accurately when huge time series databases are subjected to automation. In the present thesis work, the theories of four popular period search methods are studied, the strengths and weaknesses of these methods are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and their entry into the "General Catalogue of Variable Stars" or other databases such as the "Variable Star Index", the characteristics of the variability have to be quantified in terms of variable star parameters.