950 results for Symbolic Execution
Abstract:
Key Performance Indicators (KPIs) and their predictions are widely used by enterprises for informed decision making. Nevertheless, a very important factor, which is generally overlooked, is that top-level strategic KPIs are actually driven by operational-level business processes. These two domains are, however, mostly segregated and analysed in silos with different Business Intelligence solutions. In this paper, we propose an approach for advanced Business Simulations which converges the two domains by utilising process execution and business data, together with concepts from Business Dynamics (BD) and Business Ontologies, to promote better system understanding and detailed KPI predictions. Our approach incorporates the automated creation of Causal Loop Diagrams, thus empowering the analyst to critically examine the complex dependencies hidden in the massive amounts of available enterprise data. We further evaluated our proposed approach in the context of a retail use case that involved verification of the automatically generated causal models by a domain expert.
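The abstract does not specify how the Causal Loop Diagrams are derived from enterprise data; as a rough, hedged illustration only, a CLD can be represented as a signed directed graph whose link polarities are proposed from the sign of correlation between KPI time series (all variable names and the threshold below are hypothetical, not taken from the paper):

```python
from itertools import permutations

def pearson(x, y):
    """Plain Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def causal_loop_edges(series, threshold=0.7):
    """Propose signed CLD edges between strongly correlated KPI series."""
    edges = {}
    for a, b in permutations(series, 2):
        r = pearson(series[a], series[b])
        if abs(r) >= threshold:
            edges[(a, b)] = '+' if r > 0 else '-'
    return edges

# Hypothetical retail KPI series: footfall drives revenue, stock-outs depress it.
kpis = {
    'footfall':  [100, 120, 140, 130, 160],
    'revenue':   [50, 61, 72, 66, 80],
    'stockouts': [9, 7, 5, 6, 3],
}
print(causal_loop_edges(kpis))
```

A real system would of course need causal direction and domain-ontology input, not bare correlation; this sketch only shows the data structure an analyst would then critically examine.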
Abstract:
This chapter presents an exploratory study involving a group of athletic shoe enthusiasts and their feelings towards customized footwear. These "sneakerheads" demonstrate their infatuation with sneakers via activities ranging from creating catalogs of custom shoes to buying and selling rare athletic footwear online. The key characteristic these individuals share is that, for them, athletic shoes are a fundamental fashion accessory steeped in symbolism and meaning. A series of in-depth interviews utilizing the Zaltman Metaphor Elicitation Technique (ZMET) provide a better understanding of how issues such as art, self-expression, exclusivity, peer recognition, and counterfeit goods interact with the mass customization of symbolic products by category experts.
Abstract:
This paper reports on the use of non-symbolic fragmentation of data for securing communications. Non-symbolic fragmentation, or NSF, relies on breaking up data into non-symbolic fragments, which are (usually irregularly-sized) chunks whose boundaries do not necessarily coincide with the boundaries of the symbols making up the data. For example, ASCII data is broken up into fragments which may include 8-bit fragments but also include many other sized fragments. Fragments are then transmitted separately using a form of path diversity. The secrecy of the transmission relies on the secrecy of one or more of a number of things: the ordering of the fragments, the sizes of the fragments, and the use of path diversity. Once NSF is in place, it can help secure many forms of communication, and is useful for exchanging sensitive information and for commercial transactions. A sample implementation is described with an evaluation of the technology.
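The paper's sample implementation is not reproduced here, but the core idea can be sketched minimally: split a byte stream at the bit level into irregular chunks whose boundaries need not align with 8-bit symbol boundaries, so that no fragment alone contains whole symbols (the fragment sizes below are an arbitrary illustration, not the paper's scheme):

```python
def to_bits(data: bytes) -> str:
    """Render bytes as a bit string so fragments can cut across byte boundaries."""
    return ''.join(f'{b:08b}' for b in data)

def fragment(data: bytes, sizes):
    """Split data into non-symbolic fragments: bit-level chunks whose
    boundaries do not coincide with the 8-bit symbol boundaries."""
    bits = to_bits(data)
    frags, pos = [], 0
    for s in sizes:
        frags.append(bits[pos:pos + s])
        pos += s
    frags.append(bits[pos:])  # remainder of the bit stream
    return [f for f in frags if f]

def reassemble(frags) -> bytes:
    """Recover the original bytes; requires knowing fragment order."""
    bits = ''.join(frags)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

msg = b'SECRET'
# Irregular, secret fragment sizes, assumed shared between the endpoints.
pieces = fragment(msg, [3, 11, 6, 9])
assert reassemble(pieces) == msg
```

In the scheme described above, the fragments would additionally travel over diverse paths, so an eavesdropper on one path sees only meaningless sub-symbol chunks.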
Abstract:
The analysis of system calls is one method employed by anomaly detection systems to recognise malicious code execution. Similarities can be drawn between this process and the behaviour of certain cells belonging to the human immune system, and can be applied to construct an artificial immune system. A recently developed hypothesis in immunology, the Danger Theory, states that our immune system responds to the presence of intruders through sensing molecules belonging to those invaders, plus signals generated by the host indicating danger and damage. We propose the incorporation of this concept into a responsive intrusion detection system, where behavioural information of the system and running processes is combined with information regarding individual system calls.
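The proposed combination of per-syscall information with host-level danger signals is only described conceptually in the abstract; a toy, hedged sketch of one possible combination (the n-gram profile, signal names, weights, and threshold are all hypothetical) might look like:

```python
def syscall_anomaly(trace, normal_ngrams, n=3):
    """Fraction of n-grams in a syscall trace never seen during training."""
    grams = [tuple(trace[i:i + n]) for i in range(len(trace) - n + 1)]
    if not grams:
        return 0.0
    unseen = sum(1 for g in grams if g not in normal_ngrams)
    return unseen / len(grams)

def danger_alert(trace, normal_ngrams, danger_signals, weight=0.5):
    """Danger Theory sketch: raise an alert only when syscall-level anomaly
    coincides with host danger signals (e.g. crashes, resource spikes)."""
    anomaly = syscall_anomaly(trace, normal_ngrams)
    danger = sum(danger_signals.values()) / len(danger_signals)
    return weight * anomaly + (1 - weight) * danger > 0.5

normal = {('open', 'read', 'close')}
trace = ['open', 'exec', 'socket', 'send']
signals = {'segfaults': 1.0, 'cpu_spike': 0.8}  # hypothetical, scaled to [0, 1]
print(danger_alert(trace, normal, signals))
```

The point of the Danger Theory framing is visible even in this toy: an unusual syscall sequence alone, or a resource spike alone, need not trigger a response; their coincidence does.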
Abstract:
When we take a step back from the imposing figure of physical violence, it becomes possible to examine other structurally violent forces that constantly shape our cultural and political landscapes. One of the driving interests in the “turn to Paul” in recent continental philosophy stems from wrestling with questions about the real nature of contemporary violence. Paul is positioned as a thinker whose messianic experience began to cut through the violent masquerade of the existing order. The crucifixion and resurrection of the Messiah (a slave and a God co-existing in one body) exposed the empty grounding upon which power resided. The Christ-event signifies a moment of violent interruption in the existing order which Paul enjoins the Gentiles to participate in through a dedication of love for the neighbour. This divine violence aims to reveal and subvert the “powers,” epitomised in the Roman Empire, in order to fulfil the labour of the Messianic now-time which had arrived. The impetus behind this research comes from a typically enigmatic and provocative section of text by the Slovene philosopher, cultural critic, and Christian atheist Slavoj Žižek. He claims that 'the notion of love should be given here all its Paulinian weight: the domain of pure violence… is the domain of love' (2008a, 173). In this move he links Paul’s idea of love to that of Walter Benjamin’s divine violence; the sublime and the cataclysmic come together in this seemingly perverse notion. At stake here is the way in which uncovering violent forces in the “zero-level” of our narrative worldviews aids the diagnosis of contemporary political and ethical issues. It is not enough to imagine Paul’s encounter with the Christ-event as non-violent. This Jewish apocalyptic movement was engaged in a violent struggle within an existing order that God’s wrath will soon dismantle. 
Paul’s weak violence, inspired by his fidelity to the Christ-event, places all responsibility over creation in the role of the individual within the collective body. The centrepiece of this re-imagined construction of the Pauline narrative comes in Romans 13: the violent dedication to love understood in the radical nature of the now-time. This research examines the role that narratives play in the creation and diagnosis of these violent forces. In order to construct a new genealogy of violence in Christianity it is crucial to understand the role of the slave of Christ (the revolutionary messianic subject). This turn in the Symbolic is examined through creating a literary structure in which we can approach a radical Nietzschean shift in Pauline thought. The claim here, a claim which is also central to Paul’s letters, is that when the symbolic violence which manipulates our worldviews is undone by a divine violence, if even for a moment, new possibilities are created in the opening for a transvaluation of values. Through this we uncover the nature of original sin: the consequences of the interconnected reality of our actions. The role of literature is vital in the construction of this narrative; starting with Cormac McCarthy’s No Country for Old Men, and continuing through works such as Melville’s Bartleby the Scrivener, this thesis draws upon the power of literature in the shaping of our narrative worlds. Typical of the continental philosophy at the heart of this work, a diverse range of illustrations and inspirations from fiction is pulled into its narrative to reflect the symbolic universe that this work was forged through. What this work attempts to do is give this theory a greater grounding in Paul’s letters by demonstrating this radical kenotic power at the heart of the Christ-event.
Romans 13 reveals, in a way that has not yet been picked up by Critchley, Žižek, and others, that Paul opposed the biopolitical power of the Roman Empire through the weak violence of love that is the labour of the slaves of Christ on the “now-time” that had arrived.
Abstract:
This thesis examines the manufacture, use, exchange (including gift exchange), collecting and commodification of German medals and badges from the early 18th century until the present day, with particular attention being given to the symbols that were deployed by the National Socialist German Workers’ Party (NSDAP) between 1919 and 1945. It does so by focusing in particular on the construction of value through insignia, and how such badges and their symbolic and monetary value changed over time. In order to achieve this, the thesis adopts a chronological structure, which encompasses the creation of Prussia in 1701, the Napoleonic wars and the increased democratisation of military awards such as the Iron Cross during the Great War. The collapse of the Kaiserreich in 1918 was the major factor that led to the creation of the NSDAP under the eventual stranglehold of Hitler, a fundamentally racist and anti-Semitic movement that continued the German tradition of awarding and wearing badges. The traditional symbols of Imperial Germany, such as the eagle, were then infused with the swastika, an emblem that was meant to signify anti-Semitism, thus creating a hybrid identity. This combination was then replicated en masse, and eventually eclipsed all the symbols that had possessed symbolic significance in Germany’s past. After Hitler was appointed Chancellor in 1933, millions of medals and badges were produced in an effort to create a racially based “People’s Community”, but the steel and iron that were required for munitions eventually led to substitute materials being utilised and developed in order to manufacture millions of politically oriented badges. The Second World War unleashed Nazi terror across Europe, and the conscripts and volunteers who took part in this fight for living-space were rewarded with medals that were modelled on those that had been instituted during Imperial times.
The colonial conquest and occupation of the East by the Wehrmacht, the Order Police and the Waffen-SS surpassed the brutality of former wars and finally culminated in the Holocaust; some of these horrific crimes, and their perpetrators, were perversely rewarded with medals and badges. Despite Nazism being thoroughly discredited, many of the Allied soldiers who occupied Germany took part in the age-old practice of obtaining trophies of war, which reconfigured the meaning of Nazi badges as souvenirs and began the process of their increased commodification on an emerging secondary collectors’ market. In order to analyse the dynamics of this market, a “basket” of badges is examined that enables a discussion of the role that aesthetics, scarcity and authenticity have in determining the price of the artefacts. In summary, this thesis demonstrates how the symbolic, socio-economic and exchange value of German military and political medals and badges has changed substantially over time, provides a stimulus for scholars to conduct research in this under-developed area, and encourages collectors to investigate the artefacts that they collect in a more historically contextualised manner.
Abstract:
Within the last few years, disabled people have become the target of government austerity measures through drastic cuts to welfare, justified through the portrayal of benefit claimants as inactive, problem citizens who are wilfully unemployed. For all that is wrong with these cuts, they are only one of many aspects of exclusion that disabled people face. Attitudes towards disability are deteriorating (Scope, 2011), and disabled people are devalued and negatively positioned in a myriad of ways, meaning that an understanding of the perceptions and positioning of disability, and of the power of disabling practices, is critical. This thesis will examine how Bourdieu’s theoretical repertoire may be applied to the area of Disability Studies in order to discern how society produces oppressive and exclusionary systems of classification which structure the social position and perceptions of disability. The composite nature of disability, and the multiple forms of exclusion and inequality associated with it, benefits from a multipronged approach which acknowledges the personal, embodied and psychological aspects of disability alongside socio-political and cultural conceptualisations. Bourdieu’s approach brings together the micro and macro aspects of social life through their meso-level interplay, and provides a thorough analysis of the many aspects of disability.
Abstract:
The big data era has dramatically transformed our lives; however, security incidents such as data breaches can put sensitive data (e.g. photos, identities, genomes) at risk. To protect users' data privacy, there is a growing interest in building secure cloud computing systems, which keep sensitive data inputs hidden, even from computation providers. Conceptually, secure cloud computing systems leverage cryptographic techniques (e.g. secure multiparty computation) and trusted hardware (e.g. secure processors) to instantiate a “secure” abstract machine consisting of a CPU and encrypted memory, so that an adversary cannot learn information through either the computation within the CPU or the data in the memory. Unfortunately, evidence has shown that side channels (e.g. memory accesses, timing, and termination) in such a “secure” abstract machine may potentially leak highly sensitive information, including cryptographic keys that form the root of trust for the secure systems. This thesis broadly expands the investigation of a research direction called trace oblivious computation, where programming language techniques are employed to prevent side channel information leakage. We demonstrate the feasibility of trace oblivious computation by formalizing and building several systems: GhostRider, a hardware-software co-design that provides a hardware-based trace oblivious computing solution; SCVM, an automatic RAM-model secure computation system; and ObliVM, a programming framework that helps programmers develop applications. All of these systems enjoy formal security guarantees while outperforming prior systems by one to several orders of magnitude.
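The thesis's systems (GhostRider, SCVM, ObliVM) are not detailed in the abstract; as a minimal illustration of the underlying idea, a memory-trace-oblivious lookup touches every element of an array regardless of the secret index, so an observer of the access pattern learns nothing (this is a generic textbook sketch, not the thesis's actual construction):

```python
def leaky_lookup(table, secret_index):
    # The access pattern reveals secret_index to a memory-trace observer.
    return table[secret_index]

def oblivious_lookup(table, secret_index):
    """Trace-oblivious lookup: scan every slot so the sequence of memory
    accesses is identical for all values of secret_index."""
    result = 0
    for i, value in enumerate(table):
        # Branchless select: mask is 1 only at the secret index.
        # (A real implementation would use a constant-time comparison.)
        mask = 1 if i == secret_index else 0
        result = result * (1 - mask) + value * mask
    return result

table = [7, 13, 42, 99]
assert all(oblivious_lookup(table, i) == table[i] for i in range(len(table)))
```

The price of obliviousness here is a linear scan per access; the systems named above are precisely about reducing such overheads (e.g. via ORAM-style techniques) while keeping the formal guarantee.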
Abstract:
Reconfigurable HW can be used to build a hardware multitasking system where tasks are assigned to the reconfigurable HW at run-time according to the requirements of the running applications. Normally the execution in this kind of system is controlled by an embedded processor. In these systems, tasks are frequently represented as subtask graphs, where a subtask is the basic scheduling unit that can be assigned to a reconfigurable HW unit. In order to control the execution of these tasks, the processor must manage complex data structures, such as graphs or linked lists, at run-time, which may generate significant execution-time penalties. In addition, HW/SW communications are frequently a system bottleneck. Hence, it is very interesting to find a way to reduce the run-time SW computations and the HW/SW communications. To this end, we have developed a HW execution manager that controls the execution of subtask graphs over a set of reconfigurable units. This manager receives as input a subtask graph coupled to a subtask schedule and guarantees its proper execution. In addition, it includes support to reduce the execution-time overhead due to reconfigurations. With this HW support, the execution of task graphs can be managed efficiently, generating only very small run-time penalties.
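The manager's behaviour — take a subtask graph plus an offline schedule and dispatch each subtask only once its predecessors have finished — can be sketched in software as follows (the graph, schedule, and function names are illustrative assumptions, not the paper's HW design):

```python
from collections import deque

def execute_graph(deps, schedule):
    """Run subtasks in schedule order, dispatching each one only after all
    of its predecessors in the dependency graph have finished.

    deps: {subtask: set of predecessor subtasks}
    schedule: subtask launch order chosen offline by the scheduler
    """
    finished, order = set(), []
    pending = deque(schedule)
    while pending:
        task = pending.popleft()
        if deps[task] <= finished:        # all predecessors done
            order.append(task)            # dispatch to a reconfigurable unit
            finished.add(task)
        else:
            pending.append(task)          # dependencies not met: retry later
    return order

# Hypothetical subtask graph: A -> B, A -> C, (B, C) -> D
deps = {'A': set(), 'B': {'A'}, 'C': {'A'}, 'D': {'B', 'C'}}
print(execute_graph(deps, ['B', 'A', 'C', 'D']))
```

Note how a schedule that lists B before A still executes correctly: the dependency check defers B until A completes. Moving exactly this kind of bookkeeping off the embedded processor and into HW is what removes the run-time penalty the abstract describes.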
Abstract:
Reconfigurable hardware can be used to build multitasking systems that dynamically adapt themselves to the requirements of the running applications. This is especially useful in embedded systems, since the available resources are very limited and the reconfigurable hardware can be reused for different applications. In these systems computations are frequently represented as task graphs that are executed taking into account their internal dependencies and the task schedule. The management of the task graph execution is critical for the system performance. In this regard, we have developed two different versions, a software module and a hardware architecture, of a generic task-graph execution manager for reconfigurable multitasking systems. The second version reduces the run-time management overheads by almost two orders of magnitude. Hence it is especially suitable for systems with exigent timing constraints. Both versions include specific support to optimize the reconfiguration process.
Abstract:
Mathematical skills that we acquire during formal education mostly entail exact numerical processing. Besides this specifically human faculty, an additional system exists to represent and manipulate quantities in an approximate manner. We share this innate approximate number system (ANS) with other nonhuman animals and are able to use it to process large numerosities long before we can master the formal algorithms taught in school. Dehaene's (1992) Triple Code Model (TCM) states that, even after the onset of formal education, approximate processing is carried out in this analogue magnitude code regardless of whether the original problem was presented nonsymbolically or symbolically. Despite the wide acceptance of the model, most research uses only nonsymbolic tasks to assess ANS acuity. Owing to this silent assumption that genuine approximation can only be tested with nonsymbolic presentations, important implications in research domains of high practical relevance remain unclear, and existing potential is not fully exploited. For instance, it has been found that nonsymbolic approximation can predict math achievement one year later (Gilmore, McCarthy, & Spelke, 2010), that it is robust against the detrimental influence of learners' socioeconomic status (SES), and that it is suited to foster performance in exact arithmetic in the short term (Hyde, Khanum, & Spelke, 2014). We provided evidence that symbolic approximation might be equally suited, and in some cases even better suited, to generate predictions and foster more formal math skills independently of SES. In two longitudinal studies, we realized exact and approximate arithmetic tasks in both a nonsymbolic and a symbolic format. With first graders, we demonstrated that performance in symbolic approximation at the beginning of term was the only measure consistently not varying according to children's SES, and among both approximate tasks it was the better predictor of math achievement at the end of first grade.
In part, the strong connection seems to come about through mediation by ordinal skills. In two further experiments, we tested the suitability of both approximation formats to induce an arithmetic principle in elementary school children. We found that symbolic approximation was as effective as direct instruction in making children exploit the additive law of commutativity in a subsequent formal task. Nonsymbolic approximation, on the other hand, had no beneficial effect. The positive influence of the symbolic approximate induction was strongest in children just starting school and decreased with age. However, even third graders still profited from the induction. The results show that symbolic problems, too, can be processed as genuine approximation, and that beyond this they have their own specific value with regard to didactic-educational concerns. Our findings furthermore demonstrate that the two often confounded factors 'format' and 'demanded accuracy' cannot be disentangled easily in first graders' numerical understanding, and that children's SES also influences the interrelations between the different abilities tested here.
Abstract:
We consider a system described by the linear heat equation with adiabatic boundary conditions which is perturbed periodically. This perturbation is nonlinear and is characterized by a one-parameter family of quadratic maps. The system, depending on the parameters, presents very complex behaviour. We introduce a symbolic framework to analyze the system and summarize its most important features.
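The abstract does not spell out its symbolic framework; as a standard illustration of symbolic coding for a one-parameter quadratic family, one can record, at each iterate of the logistic map, which side of the critical point the orbit visits (the parameter values and orbit below are generic examples, not taken from the paper):

```python
def itinerary(x0, r, steps=20):
    """Symbolic itinerary of the logistic map x -> r*x*(1-x):
    'L' if the orbit is left of the critical point 1/2, 'R' otherwise."""
    x, symbols = x0, []
    for _ in range(steps):
        symbols.append('L' if x < 0.5 else 'R')
        x = r * x * (1 - x)
    return ''.join(symbols)

# At r = 4 the map is chaotic; nearby initial conditions soon
# produce different symbolic itineraries.
print(itinerary(0.3, 4.0))
print(itinerary(0.3001, 4.0))
```

The finite symbol strings produced this way are exactly the kind of object a symbolic framework can compare, count, and order to summarize the system's complex behaviour.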
Abstract:
We consider piecewise defined differential dynamical systems which can be analysed through symbolic dynamics and transition matrices. We have a continuous regime, where the time flow is characterized by an ordinary differential equation (ODE) which has explicit solutions, and the singular regime, where the time flow is characterized by an appropriate transformation. The symbolic codification is given through the association of a symbol for each distinct regular system and singular system. The transition matrices are then determined as linear approximations to the symbolic dynamics. We analyse the dependence on initial conditions, parameter variation and the occurrence of global strange attractors.
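The transition matrices described above, as linear approximations to the symbolic dynamics, can be sketched as 0/1 matrices recording which symbols are observed to follow which (the construction and the example sequence are assumptions for illustration, not the paper's specific systems):

```python
def transition_matrix(sequence, alphabet):
    """0/1 transition matrix of a symbolic sequence: entry (i, j) is 1
    iff symbol alphabet[j] is observed immediately after alphabet[i]."""
    index = {s: k for k, s in enumerate(alphabet)}
    n = len(alphabet)
    matrix = [[0] * n for _ in range(n)]
    for a, b in zip(sequence, sequence[1:]):
        matrix[index[a]][index[b]] = 1
    return matrix

# Hypothetical coding: regular regimes 'A', 'B' and a singular regime 'S',
# with one symbol per distinct regime as in the codification above.
seq = 'ABSABAB'
print(transition_matrix(seq, 'ABS'))
```

Spectral properties of such a matrix (e.g. its largest eigenvalue) then bound the growth rate of admissible symbol sequences, which is one way transition matrices serve as linear approximations to the dynamics.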