883 results for crittografia, mixnet, EasyCrypt, game-based proofs, sequence of games, computation-aided proofs


Relevance: 100.00%

Abstract:

Modern computer systems are plagued with stability and security problems: applications lose data, web servers are hacked, and systems crash under heavy load. Many of these problems or anomalies arise from rare program behavior caused by attacks or errors. A substantial percentage of web-based attacks are due to buffer overflows. Many methods have been devised to detect and prevent the anomalous situations that arise from buffer overflows. The current state of the art in anomaly detection systems is relatively primitive and depends mainly on static code checking to deal with buffer overflow attacks. For protection, Stack Guards and Heap Guards are also widely used. This dissertation proposes an anomaly detection system based on the frequencies of system calls in the system call trace. System call traces represented as frequency sequences are profiled using sequence sets. A sequence set is identified by the starting sequence and the frequencies of specific system calls. The deviation of the current input sequence from the corresponding normal profile in the frequency pattern of system calls is computed and expressed as an anomaly score. A simple Bayesian model is used for accurate detection. Experimental results are reported which show that the frequency of system calls, represented using sequence sets, captures the normal behavior of programs under normal conditions of usage. This captured behavior allows the system to detect anomalies with a low rate of false positives. Data are presented which show that the Bayesian network over frequency variations responds effectively to induced buffer overflows. It can also help administrators detect deviations in program flow introduced by errors.
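
A minimal sketch of the frequency-deviation idea described above, assuming a very simple profile format (mean per-call frequencies over normal traces) and omitting the Bayesian model; the function names, profile layout and toy system call names are illustrative assumptions, not the dissertation's implementation.

```python
from collections import Counter

def build_profile(traces):
    """Average per-call frequency over a set of normal-run traces (hypothetical profile format)."""
    totals = Counter()
    for trace in traces:
        counts = Counter(trace)
        for call, n in counts.items():
            totals[call] += n / len(trace)
    return {call: s / len(traces) for call, s in totals.items()}

def anomaly_score(trace, profile, eps=1e-6):
    """Deviation of the observed system call frequencies from the normal profile."""
    observed = Counter(trace)
    score = 0.0
    for call in set(profile) | set(observed):
        expected = profile.get(call, eps)
        seen = observed.get(call, 0) / len(trace)
        score += abs(seen - expected)
    return score

# Toy usage: 'open', 'read', 'write' stand in for real system call names.
normal = [["open", "read", "read", "write", "close"]] * 5
profile = build_profile(normal)
print(anomaly_score(["open", "read", "write", "write", "write", "write"], profile))
```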

Relevance: 100.00%

Abstract:

The importance of industrialisation in achieving rapid economic growth has been recognised in India's development strategy ever since the inception of economic planning in the country. As the secondary sector in the generation of national income, industry contributes significantly to the process of economic development. Extensive debates have taken place on the nature of the industrialisation strategy to be pursued in the economy since Independence. This is reflected in the industrial policy which evolved through the various five year plans and policy resolutions. Stupendous efforts have been made by the government since the commencement of planning, and particularly since the 1960s, to industrialise the Indian economy and develop the infrastructural base for sustained industrial development. It is difficult to assess the performance of the industrial sector over the past three decades with respect to the broad objectives of industrialisation. However, there are certain areas in which the achievements have been clearly significant.

Relevance: 100.00%

Abstract:

Antimicrobial peptides (AMPs) are humoral innate immune components of fishes that provide protection against pathogenic infections. Histone-derived antimicrobial peptides are reported to actively participate in the immune defenses of fishes. The present study deals with the identification of putative antimicrobial sequences from the histone H2A of the sicklefin chimaera, Neoharriotta pinnata. A 52-residue peptide termed Harriottin-1, a 40-residue Harriottin-2, and a 21-mer Harriottin-3 were identified to possess an antimicrobial sequence motif. The physicochemical properties and molecular structure of the Harriottins are in agreement with the characteristic features of antimicrobial peptides, indicating their potential role in the innate immunity of the sicklefin chimaera. The histone H2A sequence of the sicklefin chimaera was found to differ from previously reported histone H2A sequences. Phylogenetic analysis based on the histone H2A and cytochrome oxidase subunit-1 (CO1) genes revealed N. pinnata to occupy an intermediate position between invertebrates and vertebrates.

Relevance: 100.00%

Abstract:

The study of variable stars is an important topic of modern astrophysics. With the advent of powerful telescopes and high-resolution CCDs, variable star data are accumulating on the order of petabytes. This huge amount of data calls for automated methods as well as human experts. This thesis is devoted to data analysis of variable star astronomical time series data and hence belongs to the interdisciplinary field of Astrostatistics. For an observer on Earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various mechanisms. In some cases the variation is due to internal thermonuclear processes; such stars are generally known as intrinsic variables. In other cases it is due to external processes, such as an eclipse or rotation; these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospherical stars. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables are novae, supernovae, etc., which occur rarely and are not periodic phenomena. Most of the other variations are periodic in nature. Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data, which contain time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as the light curve. If the time series data are folded on a period, the plot of apparent magnitude against phase is known as the phased light curve. The unique shape of the phased light curve is characteristic of each type of variable star. One way to identify the type of a variable star and to classify it is for an expert to visually inspect the phased light curve. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into different stages such as observation, data reduction, data analysis, modelling and classification. Modelling of variable stars helps to determine their short-term and long-term behaviour, to construct theoretical models (for example, the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties like mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters like period, amplitude and phase, as well as some other derived parameters. Of these, the period is the most important parameter, since a wrong period can lead to sparse light curves and misleading information. Time series analysis is a method of applying mathematical and statistical tests to data in order to quantify the variation, understand the nature of the time-varying phenomenon, gain physical understanding of the system and predict its future behaviour. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and possibly big gaps. This is due to the daily varying daylight and weather conditions for ground-based observations, while observations from space may suffer from the impact of cosmic-ray particles.
Many large-scale astronomical surveys, such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS, provide variable star time series data, even though their primary intention is not variable star observation. The Center for Astrostatistics, Pennsylvania State University, was established to help the astronomical community with statistical tools for harvesting and analysing archival data. Most of these surveys release their data to the public for further analysis. There exist many period search algorithms for astronomical time series analysis, which can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which do not assume any statistical model such as a Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, such as the Generalised Lomb-Scargle periodogram (GLSP) of Zechmeister (2009) and the Significant Spectrum (SigSpec) method of Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of the methods stated above can fully recover the true periods. Wrong period detection can arise from several causes, such as power leakage to other frequencies due to the finite total interval, finite sampling interval and finite amount of data. Another problem is aliasing, which is due to the influence of regular sampling. Spurious periods also appear due to long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data is still a difficult problem for huge databases subjected to automation. As Matthew Templeton, AAVSO, states, "Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial". Derekas et al. (2007) and Deb et al. (2010) state that "The processing of huge amount of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification". It will be beneficial for the variable star astronomical community if basic parameters such as period, amplitude and phase are obtained more accurately when huge time series databases are subjected to automation. In the present thesis work, the theories of four popular period search methods are studied, the strengths and weaknesses of these methods are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and entering them in the "General Catalogue of Variable Stars" or other databases like the "Variable Star Index", the characteristics of the variability have to be quantified in terms of variable star parameters.
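
As a concrete illustration of one of the non-parametric approaches mentioned above, the following is a minimal sketch of Phase Dispersion Minimisation over a grid of trial periods, assuming equal-width phase bins and equally weighted magnitudes; the function names and the toy data are illustrative only and are not taken from the thesis.

```python
import numpy as np

def pdm_theta(time, mag, period, n_bins=10):
    """PDM statistic (Stellingwerf 1978): pooled within-bin variance of the
    folded light curve divided by the overall variance; small values mean a good period."""
    phase = (time / period) % 1.0
    overall_var = np.var(mag, ddof=1)
    pooled, dof = 0.0, 0
    for b in range(n_bins):
        in_bin = mag[(phase >= b / n_bins) & (phase < (b + 1) / n_bins)]
        if in_bin.size > 1:
            pooled += (in_bin.size - 1) * np.var(in_bin, ddof=1)
            dof += in_bin.size - 1
    return (pooled / dof) / overall_var if dof > 0 else np.inf

def pdm_best_period(time, mag, trial_periods):
    """Return the trial period that minimises the PDM statistic."""
    thetas = np.array([pdm_theta(time, mag, p) for p in trial_periods])
    return trial_periods[np.argmin(thetas)], thetas

# Toy usage: unevenly sampled noisy sinusoid with a true period of 0.75 d.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 30, 400))
m = 12.0 + 0.3 * np.sin(2 * np.pi * t / 0.75) + rng.normal(0, 0.02, t.size)
best, _ = pdm_best_period(t, m, np.linspace(0.5, 1.0, 2001))
print(best)
```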

Relevance: 100.00%

Abstract:

This thesis investigates a method for human-robot interaction (HRI) that seeks to uphold the productivity of industrial robots, for example by minimizing operation time, while ensuring human safety, for example through collision avoidance. To solve such problems, an online motion planning approach for robotic manipulators with HRI has been proposed. The approach is based on model predictive control (MPC) with embedded mixed-integer programming. The planning strategies for the robotic manipulators considered in the thesis are performed directly in the workspace for easy obstacle representation. The non-convex optimization problem is approximated by a mixed-integer program (MIP). It is further reformulated so that the number of binary variables and the number of feasible integer solutions are drastically decreased. Safety-relevant regions, which are potentially occupied by the human operators, can be generated online by a proposed method based on hidden Markov models. In contrast to previous approaches, which derive predictions in the form of single points based on probability density functions, such as the most likely or expected human positions, the proposed method computes safety-relevant subsets of the workspace as a region which is possibly occupied by the human at future instants of time. The method is further enhanced by combining it with reachability analysis to increase the prediction accuracy. These safety-relevant regions can subsequently serve as safety constraints when the motion is planned by optimization. In this way one arrives at motion plans that are safe, i.e. plans that avoid collision with a probability not less than a predefined threshold. The developed methods have been successfully applied to a demonstrator in which an industrial robot works in the same space as a human operator. The task of the industrial robot is to drive its end-effector through a nominal sequence of gripping-motion-releasing operations while no collision with a human arm occurs.
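
The core idea of the safety-relevant regions can be illustrated with a small sketch: propagate an occupancy distribution over discretised workspace cells with an HMM transition matrix and keep the smallest set of cells that accumulates the required probability mass. This is only a schematic reconstruction under an assumed discretisation; the transition matrix, cell layout and function names are hypothetical and not taken from the thesis.

```python
import numpy as np

def predict_occupancy(p0, transition, steps):
    """Propagate the occupancy distribution over workspace cells `steps` time
    steps ahead using a row-stochastic HMM transition matrix."""
    p = np.asarray(p0, dtype=float)
    for _ in range(steps):
        p = p @ transition
    return p

def safety_region(p, confidence=0.95):
    """Smallest set of cells whose cumulative predicted probability reaches
    `confidence`; treating these cells as forbidden keeps the collision
    probability below 1 - confidence."""
    order = np.argsort(p)[::-1]
    cum = np.cumsum(p[order])
    k = int(np.searchsorted(cum, confidence)) + 1
    return set(order[:k].tolist())

# Toy usage: 5 workspace cells in a row, human currently in cell 2.
A = np.array([
    [0.8, 0.2, 0.0, 0.0, 0.0],
    [0.1, 0.8, 0.1, 0.0, 0.0],
    [0.0, 0.1, 0.8, 0.1, 0.0],
    [0.0, 0.0, 0.1, 0.8, 0.1],
    [0.0, 0.0, 0.0, 0.2, 0.8],
])
p0 = np.array([0.0, 0.0, 1.0, 0.0, 0.0])
print(safety_region(predict_occupancy(p0, A, steps=3)))
```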

Relevance: 100.00%

Abstract:

Abstract 1: Social networks such as Twitter are often used for disseminating and collecting information during natural disasters. The potential for their use in disaster management has been acknowledged. However, a more nuanced understanding of the communications that take place on social networks is required to integrate this information more effectively into the processes within disaster management. The type and value of the information shared should be assessed, determining the benefits and issues, with credibility and reliability as known concerns. Mapping the tweets in relation to the modelled stages of a disaster can be a useful evaluation for determining the benefits and drawbacks of using data from social networks, such as Twitter, in disaster management. A thematic analysis of tweets' content, language and tone during the UK Storms and Floods 2013/14 was conducted. Manual scripting was used to determine the official sequence of events and to classify the stages of the disaster into the phases of the Disaster Management Lifecycle, producing a timeline. Twenty-five topics discussed on Twitter emerged, and three key types of tweets, based on language and tone, were identified. The timeline represents the events of the disaster, according to the Met Office reports, classified into B. Faulkner's Disaster Management Lifecycle framework. Context is provided when the analysed tweets are observed against the timeline. This illustrates a potential basis and benefit for mapping tweets into the Disaster Management Lifecycle phases. Comparing the number of tweets submitted in each month with the timeline suggests that users tweet more as an event heightens and persists. Furthermore, users generally express greater emotion and urgency in their tweets. This paper concludes that thematic analysis of content on social networks, such as Twitter, can be useful in gaining additional perspectives for disaster management. It demonstrates that mapping tweets into the phases of a Disaster Management Lifecycle model can have benefits in the recovery phase, not just in the response phase, to potentially improve future policies and activities. Abstract 2: The current execution of privacy policies, as a mode of communicating information to users, is unsatisfactory. Social networking sites (SNS) exemplify this issue, attracting growing concerns regarding their use of personal data and its effect on user privacy. This demonstrates the need for more informative policies. However, SNS lack the incentives required to improve policies, which is exacerbated by the difficulties of creating a policy that is both concise and compliant. Standardization addresses many of these issues, providing benefits for users and SNS, although it is only possible if policies share attributes which can be standardized. This investigation used thematic analysis and cross-document structure theory to assess the similarity of attributes between the privacy policies (as available in August 2014) of the six most frequently visited SNS globally. Using the Jaccard similarity coefficient, two types of attribute were measured: the clauses used by SNS and the coverage of forty recommendations made by the UK Information Commissioner's Office. Analysis showed that whilst similarity in the clauses used was low, similarity in the recommendations covered was high, indicating that SNS use different clauses but convey similar information.
The analysis also showed that the low similarity in the clauses was largely due to differences in semantics, elaboration and functionality between SNS. Therefore, this paper proposes that the policies of SNS already share attributes, indicating the feasibility of standardization, and five recommendations are made to begin facilitating this, based on the findings of the investigation.
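
For reference, the Jaccard similarity coefficient used above is simply the size of the intersection of two attribute sets divided by the size of their union. The sketch below shows the computation on made-up clause labels; the labels are illustrative and not drawn from the analysed policies.

```python
def jaccard(a, b):
    """Jaccard similarity coefficient between two attribute sets: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a | b) else 1.0

# Toy usage: hypothetical clause labels from two privacy policies.
policy_a = {"data collection", "third-party sharing", "cookies", "retention"}
policy_b = {"data collection", "cookies", "advertising"}
print(round(jaccard(policy_a, policy_b), 2))  # 2 shared clauses out of 5 distinct -> 0.4
```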

Relevance: 100.00%

Abstract:

In this final week we will look at the tensions between ludology and narratology in games design, in effect how the agency of games has been reconciled with the dramatic requirements (and lack of agency) of narrative. I will argue that there are two broad approaches: the mainstream method of concentrating on the Fabula, and a method pioneered by many indie games of fusing narrative and play. We will look in more detail at what this might mean in terms of thematic cohesion, diegetic choices, and mechanics and metaphor. Finally, we look at Spec Ops: The Line as a rare example of a AAA title that takes this fusion approach, examining how the game uses many of the techniques we have explored.

Relevance: 100.00%

Abstract:

We offer a new explanation of partial risk sharing based on coalition formation and the segmentation of society in a risky environment, without assuming limited commitment or imperfect information. Heterogeneous individuals in a society freely choose with whom they will share risk. A partition belonging to the core of the membership game is obtained. Perfect risk sharing does not necessarily arise. Focusing on the mutual insurance rule and assuming that individuals differ only with respect to risk, we show that the core partition is homophily-based. The distribution of risk affects the number and size of these coalitions. Individuals may pay a lower risk premium in riskier societies. Higher heterogeneity in risk leads to a lower degree of risk sharing. We discuss how the endogenous partition of society into risk-sharing coalitions may shed light on empirical evidence on partial risk sharing. The case of heterogeneous risk aversion leads to similar results.

Relevance: 100.00%

Abstract:

In this thesis we propose two network schemes with admission control for elastic TCP traffic using simple mechanisms. Both schemes are able to provide different throughputs and isolation between flows, where a "flow" is defined as a sequence of related packets within a TCP connection. Regarding the architecture, both use packet classes with different drop priorities and an implicit, edge-to-edge, measurement-based admission control. In the first scheme the measurements are per flow, while in the second they are per aggregate. The first scheme achieves good performance using a special modification of the TCP sources, while the second achieves good performance with standard TCP sources. Both schemes have been successfully evaluated through simulation on different network topologies and traffic loads.
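
As a rough illustration of a measurement-based admission decision in general (and only of that: the thesis's schemes are implicit and rely on packet drop priorities rather than an explicit per-flow test), a minimal sketch might compare the measured aggregate load against a target utilisation of the edge-to-edge path. The function name, rates and threshold below are purely illustrative assumptions.

```python
def admit_flow(measured_rate_mbps, requested_rate_mbps, link_capacity_mbps,
               utilisation_target=0.9):
    """Admit a new elastic flow only if the measured aggregate load plus the new
    flow's nominal rate stays below the target utilisation of the bottleneck."""
    return measured_rate_mbps + requested_rate_mbps <= utilisation_target * link_capacity_mbps

# Toy usage: 100 Mb/s bottleneck, 82 Mb/s measured aggregate, new flow asking for 5 Mb/s.
print(admit_flow(82.0, 5.0, 100.0))  # True: 87 <= 90
```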

Relevance: 100.00%

Abstract:

This thesis proposes a solution to the problem of estimating the motion of an Unmanned Underwater Vehicle (UUV). Our approach is based on the integration of the incremental measurements which are provided by a vision system. When the vehicle is close to the underwater terrain, it constructs a visual map (a so-called "mosaic") of the area where the mission takes place while, at the same time, it localizes itself on this map, following the Concurrent Mapping and Localization strategy. The proposed methodology to achieve this goal is based on a feature-based mosaicking algorithm. A down-looking camera is attached to the underwater vehicle. As the vehicle moves, a sequence of images of the sea floor is acquired by the camera. For every image of the sequence, a set of characteristic features is detected by means of a corner detector. Then, their correspondences are found in the next image of the sequence. Solving the correspondence problem in an accurate and reliable way is a difficult task in computer vision. We consider different alternatives to solve this problem by introducing a detailed analysis of the textural characteristics of the image. This is done in two phases: first comparing different texture operators individually, and next selecting those that best characterize the point/matching pair and using them together to obtain a more robust characterization. Various alternatives are also studied to merge the information provided by the individual texture operators. Finally, the best approach in terms of robustness and efficiency is proposed. After the correspondences have been solved, for every pair of consecutive images we obtain a list of image features in the first image and their matchings in the next frame. Our aim is now to recover the apparent motion of the camera from these features. Although an accurate texture analysis is devoted to the matching procedure, some false matches (known as outliers) could still appear among the right correspondences. For this reason, a robust estimation technique is used to estimate the planar transformation (homography) which explains the dominant motion of the image. Next, this homography is used to warp the processed image to the common mosaic frame, constructing a composite image formed by every frame of the sequence. With the aim of estimating the position of the vehicle as the mosaic is being constructed, the 3D motion of the vehicle can be computed from the measurements obtained by a sonar altimeter and the incremental motion computed from the homography. Unfortunately, as the mosaic increases in size, image local alignment errors increase the inaccuracies associated with the position of the vehicle. Occasionally, the trajectory described by the vehicle may cross over itself. In this situation new information is available, and the system can readjust the position estimates. Our proposal consists not only in localizing the vehicle, but also in readjusting the trajectory described by the vehicle when crossover information is obtained. This is achieved by implementing an Augmented State Kalman Filter (ASKF). Kalman filtering appears as an adequate framework to deal with position estimates and their associated covariances. Finally, some experimental results are shown. A laboratory setup has been used to analyze and evaluate the accuracy of the mosaicking system. This setup enables a quantitative measurement of the accumulated errors of the mosaics created in the lab.
Then, the results obtained from real sea trials using the URIS underwater vehicle are shown.
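
A standard feature-matching and robust homography-estimation pipeline of the kind described above can be sketched with OpenCV. Note the assumptions: ORB features and Hamming-distance matching stand in for the texture-based characterisation developed in the thesis, and the function below is a generic illustration, not the thesis's algorithm.

```python
import numpy as np
import cv2

def register_pair(prev_img, next_img, max_features=500):
    """Detect corner-like features in two consecutive frames, match them, and
    estimate the dominant planar motion (homography) robustly with RANSAC so
    that outlier matches are rejected."""
    orb = cv2.ORB_create(max_features)
    kp1, des1 = orb.detectAndCompute(prev_img, None)
    kp2, des2 = orb.detectAndCompute(next_img, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=3.0)
    return H, inlier_mask

# The estimated homography (previous frame -> next frame) can then be inverted and
# passed to cv2.warpPerspective to place the new frame in the common mosaic frame.
```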

Relevance: 100.00%

Abstract:

This thesis aims to explore computationally reliable and efficient approaches to contractive MPC for discrete-time systems. Two types of contractive MPC have been studied: MPC with a mandatory contractive constraint and MPC with a contractive sequence of controllable sets. Techniques based on convex optimization and interval analysis are applied to deal with linear and nonlinear contractive MPC, respectively. Classical interval analysis is extended geometrically to zonotopes in order to design a terminal control invariant set for dual-mode MPC. It is also extended to modal intervals to take modality into account when computing robust controllable sets with a clear semantic interpretation. Convex optimization and interval analysis tools have been combined to improve the efficiency of contractive MPC for several classes of constrained uncertain nonlinear discrete-time systems. Finally, the two types of contractive MPC addressed have been applied to control a Micro Robot World Cup Soccer Tournament (MiroSot) setup and a Continuous Stirred Tank Reactor (CSTR), respectively.
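
To give a flavour of the interval-analysis machinery mentioned above, here is a minimal sketch of closed-interval arithmetic used to enclose the successor states of an uncertain scalar system; the class, the toy dynamics and the bounds are illustrative assumptions, not material from the thesis (which works with zonotopes and modal intervals).

```python
class Interval:
    """Minimal closed-interval arithmetic, the basic tool behind interval-based
    reachability and controllable-set computations."""

    def __init__(self, lo, hi):
        self.lo, self.hi = min(lo, hi), max(lo, hi)

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# Toy usage: enclose x_{k+1} = a*x_k + b*u_k for uncertain a in [0.9, 1.1],
# state x_k in [-1, 1], input u_k in [-0.5, 0.5] and b = 0.5.
a, x, u, b = Interval(0.9, 1.1), Interval(-1.0, 1.0), Interval(-0.5, 0.5), Interval(0.5, 0.5)
print(a * x + b * u)  # enclosure of all possible successor states: [-1.35, 1.35]
```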

Relevance: 100.00%

Abstract:

The proposal presented in this thesis is to provide designers of knowledge-based supervisory systems for dynamic systems with a framework that facilitates their tasks, avoiding interface problems between tools, data flow and management. The approach is intended to be useful to both control and process engineers by assisting them in their tasks. The use of AI technologies to diagnose and perform control loops and, of course, to assist process supervisory tasks such as fault detection and diagnosis, is within the scope of this work. Special effort has been put into the integration of tools for assisting the design of expert supervisory systems. With this aim, the experience of Computer Aided Control Systems Design (CACSD) frameworks has been analysed and used to design a Computer Aided Supervisory Systems (CASSD) framework. In this sense, some basic facilities are required to be available in this proposed framework: ·

Relevance: 100.00%

Abstract:

A driver controls a car by turning the steering wheel or by pressing on the accelerator or the brake. These actions are modelled by Gaussian processes, leading to a stochastic model for the motion of the car. The stochastic model is the basis of a new filter for tracking and predicting the motion of the car, using measurements obtained by fitting a rigid 3D model to a monocular sequence of video images. Experiments show that the filter easily outperforms traditional filters.
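
A heavily simplified sketch of the idea: if the driver's accelerator/brake action is treated as zero-mean Gaussian acceleration noise (a crude stand-in for the Gaussian-process input model), position measurements from the vision system can be fused with a linear Kalman filter. The 1-D state layout, noise levels and function names are assumptions for illustration and do not reproduce the paper's filter.

```python
import numpy as np

def kalman_track(measurements, dt=0.04, accel_std=2.0, meas_std=0.5):
    """1-D position/velocity tracker where the driver's pedal action is modelled
    as zero-mean Gaussian acceleration noise driving a constant-velocity model."""
    F = np.array([[1.0, dt], [0.0, 1.0]])        # constant-velocity dynamics
    G = np.array([[0.5 * dt**2], [dt]])          # how acceleration enters the state
    Q = accel_std**2 * (G @ G.T)                 # process noise from the driver input
    H = np.array([[1.0, 0.0]])                   # only position is measured
    R = np.array([[meas_std**2]])
    x, P = np.array([[measurements[0]], [0.0]]), np.eye(2)
    estimates = []
    for z in measurements:
        x, P = F @ x, F @ P @ F.T + Q                        # predict
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)         # Kalman gain
        x = x + K @ (np.array([[z]]) - H @ x)                # update with image measurement
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0, 0])
    return estimates

# Toy usage: noisy positions of a gently accelerating car.
true_pos = [0.5 * 0.3 * (k * 0.04) ** 2 for k in range(50)]
noisy = [p + np.random.normal(0, 0.5) for p in true_pos]
print(kalman_track(noisy)[-1])
```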

Relevance: 100.00%

Abstract:

The Turing Test, originally configured for a human to distinguish between an unseen man and an unseen woman through a text-based conversational measure of gender, is the ultimate test for thinking. So conceived Alan Turing when he replaced the woman with a machine. His assertion was that once a machine deceived a human judge into believing that it was the human, that machine should be attributed with intelligence. But is the Turing Test nothing more than a mindless game? We present results from recent Loebner Prizes, a platform for the Turing Test, and find that machines in the contest appeared conversationally worse rather than better from 2004 to 2006, showing a downward trend in the highest scores awarded to them by human judges. Thus the machines are not thinking in the same way as a human intelligent entity would.