880 results for LISP (Computer program language)
Abstract:
Nowadays, Workflow Management Systems (WfMSs) and, more generally, Process Management Systems (PMSs), which are process-aware Information Systems (PAISs), are widely used to support many human organizational activities, ranging from well-understood, relatively stable and structured processes (supply chain management, postal delivery tracking, etc.) to processes that are more complicated, less structured and may exhibit a high degree of variation (health-care, emergency management, etc.). Every aspect of a business process involves a certain amount of knowledge, which may be complex depending on the domain of interest. The adequate representation of this knowledge is determined by the modeling language used. Some processes behave in a way that is well understood, predictable and repeatable: the tasks are clearly delineated and the control flow is straightforward. Recent discussions, however, illustrate the increasing demand for solutions for knowledge-intensive processes, where these characteristics are less applicable. The actors involved in the conduct of a knowledge-intensive process have to deal with a high degree of uncertainty. Tasks may be hard to perform and the order in which they need to be performed may be highly variable. Modeling knowledge-intensive processes can be complex, as it may be hard to capture at design-time what knowledge is available at run-time. In realistic environments, for example, actors lack important knowledge at execution time, or this knowledge can become obsolete as the process progresses. Even if each actor (at some point) has perfect knowledge of the world, it may not be certain of its beliefs at later points in time, since tasks by other actors may change the world without those changes being perceived. Typically, a knowledge-intensive process cannot be adequately modeled by classical, state-of-the-art process/workflow modeling approaches. In some respects there is a lack of maturity when it comes to capturing the semantic aspects involved and reasoning about them. The main focus of the 1st International Workshop on Knowledge-intensive Business Processes (KiBP 2012) was investigating how techniques from different fields, such as Artificial Intelligence (AI), Knowledge Representation (KR), Business Process Management (BPM), Service Oriented Computing (SOC), etc., can be combined with the aim of improving the modeling and the enactment phases of a knowledge-intensive process. The workshop was held as part of the program of the 2012 Knowledge Representation & Reasoning International Conference (KR 2012) in Rome, Italy, in June 2012. It was hosted by the Dipartimento di Ingegneria Informatica, Automatica e Gestionale Antonio Ruberti of Sapienza Università di Roma, with financial support of the University, through grant 2010-C26A107CN9 TESTMED, and of the EU Commission, through the projects FP7-25888 Greener Buildings and FP7-257899 Smart Vortex. This volume contains the 5 papers accepted and presented at the workshop. Each paper was reviewed by three members of the internationally renowned Program Committee. In addition, a further paper was invited for inclusion in the workshop proceedings and for presentation at the workshop. Two keynote talks completed the scientific program: one by Marlon Dumas (Institute of Computer Science, University of Tartu, Estonia) on "Integrated Data and Process Management: Finally?" and the other by Yves Lesperance (Department of Computer Science and Engineering, York University, Canada) on "A Logic-Based Approach to Business Processes Customization". We would like to thank all the Program Committee members for their valuable work in selecting the papers, Andrea Marrella for his valuable work as publication and publicity chair of the workshop, and Carola Aiello and the consulting agency Consulta Umbria for the organization of this successful event.
Abstract:
A breaker restrike is an abnormal arcing phenomenon leading to possible breaker failure. Eventually, this failure leads to interruption of the transmission and distribution of the electricity supply system until the breaker is replaced. Before 2008, there was little evidence in the literature of monitoring techniques based on restrike measurement and interpretation produced during switching of capacitor banks and shunt reactor banks in power systems. In 2008 a non-intrusive radiometric restrike measurement method and a restrike hardware detection algorithm were developed by M.S. Ramli and B. Kasztenny. However, the limitations of the radiometric measurement method are a band-limited frequency response as well as limitations in amplitude determination. Current restrike detection methods and algorithms require the use of wide-bandwidth current transformers and high voltage dividers. A restrike switch model using the Alternative Transients Program (ATP) and Wavelet Transforms, which supports diagnostics, is proposed. Restrike phenomena thereby become the basis of a new diagnostic process using measurements, ATP and Wavelet Transforms for online interrupter monitoring. This research project investigates the restrike switch model parameter 'A' (the dielectric voltage gradient) related to normal and slowed cases of the contact opening velocity and the escalation voltages, which can be used as a diagnostic tool for a vacuum circuit-breaker (CB) at service voltages between 11 kV and 63 kV. During current interruption of an inductive load at current quenching or chopping, a transient voltage is developed across the contact gap. The dielectric strength of the gap should rise to a point where it can withstand this transient voltage. If it does not, the gap will flash over, resulting in a restrike. A straight line is fitted through the voltage points at flashover of the contact gap; this is the point at which the gap voltage has reached a value that exceeds the dielectric strength of the gap. This research shows that a change in opening contact velocity of the vacuum CB produces a corresponding change in the slope of the gap escalation voltage envelope. To investigate the diagnostic process, an ATP restrike switch model was modified with contact opening velocity computation for restrike waveform signature analyses, along with experimental investigations. This also enhanced a mathematical CB model with the empirical dielectric model for SF6 (sulphur hexafluoride) CBs at service voltages above 63 kV and a generalised dielectric curve model for 12 kV CBs. A CB restrike can be predicted if there are similar restrike waveform signatures for measured and simulated waveforms. The restrike switch model applications are used for: computer simulations as virtual experiments, including predicting breaker restrikes; estimating the remaining interrupter life of SF6 puffer CBs; checking system stresses; assessing point-on-wave (POW) operations; and developing a restrike detection algorithm using Wavelet Transforms. A simulated high-frequency nozzle current magnitude was applied to an equation (derived from the literature) which can calculate the life extension of the interrupter of an SF6 high voltage CB. The restrike waveform signatures for a medium and high voltage CB identify its possible failure mechanisms, such as delayed opening, degraded dielectric strength and improper contact travel. The simulated and measured restrike waveform signatures are analysed using Matlab software for automatic detection.
Experimental investigation of a 12 kV vacuum CB diagnostic was carried out for the parameter determination, and a passive antenna calibration was also successfully developed, with applications for field implementation. The degradation features were also evaluated with a predictive interpretation technique from the experiments, and the subsequent simulation indicates that the drop in voltage is related to the slow opening velocity of the mechanism, giving a measure of the degree of contact degradation. A predictive interpretation technique is a computer modelling approach for assessing switching device performance which allows one to vary a single parameter at a time; this is often difficult to do experimentally because of the variable contact opening velocity. The significance of this thesis outcome is that it is a non-intrusive method, developed using measurements, ATP and Wavelet Transforms, to predict and interpret a breaker restrike risk. The measurements on high voltage circuit-breakers can identify degradation that can interrupt the distribution and transmission of an electricity supply system. It is hoped that the techniques for the monitoring of restrike phenomena developed by this research will form part of a diagnostic process that will be valuable for detecting breaker stresses relating to the interrupter lifetime. Suggestions for future research, including a field implementation proposal to validate the restrike switch model for ATP system studies and the hot dielectric strength curve model for SF6 CBs, are given in Appendix A.
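As a hedged sketch of the idea behind the dielectric-gradient parameter 'A' (the linear cold-gap withstand form and the symbols below are assumptions based on common circuit-breaker dielectric models, not equations taken from this thesis), the recovering withstand voltage of the opening gap can be approximated as a straight line in time, and a restrike is predicted whenever the transient recovery voltage exceeds it:

```latex
% Assumed linear dielectric-recovery model (illustrative only):
%   U_d(t)    recovering withstand voltage of the contact gap
%   A         dielectric voltage gradient (V/s), roughly proportional
%             to the contact opening velocity
%   t_open    instant of contact separation
%   u_TRV(t)  transient recovery voltage across the gap
U_d(t) \approx A\,(t - t_{\mathrm{open}}), \qquad
\text{restrike if } \lvert u_{\mathrm{TRV}}(t) \rvert > U_d(t)
```

Under this reading, a slowed opening velocity lowers A and hence the slope of the escalation-voltage envelope, which is consistent with the diagnostic signature described in the abstract above.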
Abstract:
The University of Newcastle (UoN) offers various access and support programs for a range of students through the English Language and Foundation Studies Centre and a University orientation for students. At UoN, students are required to engage in a learning experience, meet program outcomes and demonstrate the core attributes of the University at each graduation point. For a University with a strong focus on access, is there a missing facet to the access programs, where students are required to study within a teaching delivery style which may be vastly different from their previous educational experience? This paper describes a pedagogical orientation program delivered in 2005 at the UoN School of Architecture and Built Environment to assist in the transition of students from different cultural and pedagogical backgrounds into “Problem Based Learning” as delivered by this School. Furthermore, the paper analyses how this program has enabled students from diverse backgrounds to understand and successfully embrace the new learning opportunities.
Abstract:
Software development and Web site development techniques have evolved significantly over the past 20 years. The relatively young Web application development area has borrowed heavily from traditional software development methodologies, primarily due to the similarities in the areas of data persistence and User Interface (UI) design. Recent developments in this area propose a new Web Modeling Language (WebML) to facilitate the nuances specific to Web development. WebML is one of a number of implementations designed to enable modeling of Web site interaction flows while being extensible to accommodate new features in Web site development into the future. Our research aims to extend WebML with a focus on stigmergy, a biological term originally used to describe coordination between insects. We see design features in existing Web sites that mimic stigmergic mechanisms as part of the UI. We believe that we can synthesize and embed stigmergy in Web 2.0 sites. This paper focuses on the sub-topics of site UI design and the stigmergic mechanism designs required to achieve this.
Abstract:
This paper outlines the Fulbright Scholar-in-Residence (SIR) program that I undertook at the University of Colorado in Denver, USA, from August to December 2010. It explains how the SIR program proved a most enriching experience for me, professionally and personally. One reason for this paper is to encourage other LIS professionals and educators to apply for Fulbrights and other types of scholarships and exchange programs; another is to reinforce the message that research and further study really can open doors and enrich our professional careers.
Abstract:
This tutorial is designed to help new users become familiar with using the PicoBlaze microcontroller with the Spartan-3E board. The tutorial gives a brief introduction to the PicoBlaze microcontroller, and then steps through the following:
- Writing a small PicoBlaze assembly language (.psm) file, and stepping through the process of assembling the .psm file using KCPSM3;
- Writing a top level VHDL module to connect the PicoBlaze microcontroller (KCPSM3 component) and the program ROM, and to connect the required input and output ports;
- Connecting the top level module inputs and outputs to the switches, buttons and LEDs on the Spartan-3E board;
- Downloading the program to the Spartan-3E board using the Project Navigator software.
Abstract:
It is acknowledged around the world that many university students struggle with learning to program (McCracken et al., 2001; McGettrick et al., 2005). In this paper, we describe how we have developed a research programme to systematically study and incrementally improve our teaching. We have adopted a research programme with three elements: (1) a theory that provides an organising framework for defining the type of phenomena and data of interest, (2) data on how the class as a whole performs on formative assessment tasks that are framed from within the organising framework, and (3) data from one-on-one think aloud sessions, to establish why students struggle with some of those in-class formative assessment tasks. We teach introductory computer programming, but this three-element structure of our research is applicable to many areas of engineering education research.
Abstract:
The official need for content teachers to teach the language features of their fields has never been greater in Australia than now. In 2012, the recently formed national curriculum board announced that all teachers are responsible for the English language development of students whose first language or dialect is not Standard Australian English (SAE). This formal endorsement is an important juncture regarding the way expertise might be developed, perceived and exchanged between content and language teachers through collaboration, in order for the goals of English language learners in content areas to be realised. To that end, we conducted an action research project to explore and extend the reading strategies pedagogy of one English language teacher who teaches English language learners in a parallel junior high school Geography program. Such pedagogy will be valuable for all teachers as they seek to contribute to English language development goals as outlined in national curricula.
Abstract:
Organizations from every industry sector seek to enhance their business performance and competitiveness through the deployment of contemporary information systems (IS), such as Enterprise Systems (ERP). Investments in ERP are complex and costly, attracting scrutiny and pressure to justify their cost. Thus, IS researchers highlight the need for systematic evaluation of information system success, or impact, which has resulted in the introduction of varied models for evaluating information systems. One of these systematic measurement approaches is the IS-Impact Model, introduced by a team of researchers at Queensland University of Technology (QUT) (Gable, Sedera, & Chan, 2008). The IS-Impact Model is conceptualized as a formative, multidimensional index that consists of four dimensions. Gable et al. (2008) define IS-Impact as "a measure at a point in time, of the stream of net benefits from the IS, to date and anticipated, as perceived by all key-user-groups" (p.381). The IT Evaluation Research Program (ITE-Program) at QUT has grown the IS-Impact Research Track with the central goal of conducting further studies to enhance and extend the IS-Impact Model. The overall goal of the IS-Impact research track at QUT is "to develop the most widely employed model for benchmarking information systems in organizations for the joint benefit of both research and practice" (Gable, 2009). In order to achieve that, the IS-Impact research track advocates programmatic research guided by the principles of tenacity, holism, and generalizability, through extension research strategies. This study was conducted within the IS-Impact Research Track to further generalize the IS-Impact Model by extending it to the Saudi Arabian context. According to Hofstede (2012), the national culture of Saudi Arabia is significantly different from the Australian national culture, making the Saudi Arabian culture an interesting context for testing the external validity of the IS-Impact Model. The study re-visits the IS-Impact Model from the ground up. Rather than assume the existing instrument is valid in the new context, or simply assess its validity through quantitative data collection, the study takes a qualitative, inductive approach to re-assessing the necessity and completeness of existing dimensions and measures. This is done in two phases: an Exploratory Phase and a Confirmatory Phase. The Exploratory Phase addresses the first research question of the study: "Is the IS-Impact Model complete and able to capture the impact of information systems in Saudi Arabian organizations?" The content analysis used to analyze the Identification Survey data indicated that 2 of the 37 measures of the IS-Impact Model are not applicable to the Saudi Arabian context. Moreover, no new measures or dimensions were identified, evidencing the completeness and content validity of the IS-Impact Model. In addition, the Identification Survey data suggested several concepts related to IS-Impact, the most prominent of which was "Computer Network Quality" (CNQ). The literature supported the existence of a theoretical link between IS-Impact and CNQ (CNQ is viewed as an antecedent of IS-Impact). With the primary goal of validating the IS-Impact Model within its extended nomological network, CNQ was introduced into the research model. The Confirmatory Phase addresses the second research question of the study: "Is the Extended IS-Impact Model Valid as a Hierarchical Multidimensional Formative Measurement Model?"
The objective of the Confirmatory Phase was to test the validity of the IS-Impact Model and the CNQ Model. To achieve that, IS-Impact, CNQ, and IS-Satisfaction were operationalized in a survey instrument, and the research model was then assessed employing the Partial Least Squares (PLS) approach. The CNQ model was validated as a formative model. Similarly, the IS-Impact Model was validated as a hierarchical multidimensional formative construct. However, the analysis indicated that one of the IS-Impact Model indicators was insignificant and could be removed from the model. Thus, the resulting Extended IS-Impact Model consists of 4 dimensions and 34 measures. Finally, the structural model was also assessed against two aspects: explanatory and predictive power. The analysis revealed that the path coefficient between CNQ and IS-Impact is significant (t-value = 4.826) and relatively strong (β = 0.426), with CNQ explaining 18% of the variance in IS-Impact. These results supported the hypothesis that CNQ is an antecedent of IS-Impact. The study demonstrates that the quality of the computer network affects the quality of the Enterprise System (ERP) and, consequently, the impacts of the system. Therefore, practitioners should pay attention to computer network quality. Similarly, the path coefficient between IS-Impact and IS-Satisfaction was significant (t-value = 17.79) and strong (β = 0.744), with IS-Impact alone explaining 55% of the variance in Satisfaction, consistent with the results of the original IS-Impact study (Gable et al., 2008). The research contributions include: (a) supporting the completeness and validity of the IS-Impact Model as a hierarchical multidimensional formative measurement model in the Saudi Arabian context; (b) operationalizing Computer Network Quality as conceptualized in ITU-T Recommendation E.800 (ITU-T, 1993); (c) validating CNQ as a formative measurement model and as an antecedent of IS-Impact; and (d) conceptualizing and validating IS-Satisfaction as a reflective measurement model and as an immediate consequence of IS-Impact. The CNQ model provides a framework for perceptually measuring Computer Network Quality from multiple perspectives. The CNQ model features an easy-to-understand, easy-to-use, and economical survey instrument.
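As a small worked check (assuming the simple chain from CNQ to IS-Impact to IS-Satisfaction described above, in which each endogenous construct has a single standardised predictor; this structure is an illustrative assumption rather than a statement from the thesis), the reported explained variances follow from squaring the standardised path coefficients:

```latex
% Single-predictor case: R^2 equals the squared standardised path coefficient
R^2_{\text{IS-Impact}} \approx \beta_{\text{CNQ} \rightarrow \text{IS-Impact}}^{2}
  = 0.426^{2} \approx 0.18, \qquad
R^2_{\text{Satisfaction}} \approx \beta_{\text{IS-Impact} \rightarrow \text{Satisfaction}}^{2}
  = 0.744^{2} \approx 0.55
```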
Abstract:
Many software applications extend their functionality by dynamically loading executable components into their allocated address space. Such components, exemplified by browser plugins and other software add-ons, not only enable reusability but also promote programming simplicity, as they reside in the same address space as their host application, supporting easy sharing of complex data structures and pointers. However, such components are also often of unknown provenance and quality and may be riddled with accidental bugs or, in some cases, deliberately malicious code. Statistics show that such component failures account for a high percentage of software crashes and vulnerabilities. Enabling isolation of such fine-grained components is therefore necessary to increase the stability, security and resilience of computer programs. This thesis addresses this issue by showing how host applications can create isolation domains for individual components, while preserving the benefits of a single address space, via a new architecture for software isolation called LibVM. Towards this end, we define a specification which outlines the functional requirements for LibVM, identify the conditions under which these functional requirements can be met, define an abstract Application Programming Interface (API) that encompasses the general problem of isolating shared libraries, thus separating policy from mechanism, and prove its practicality with two concrete implementations based on hardware virtualization and system call interposition, respectively. The results demonstrate that hardware isolation minimises the difficulties encountered with software-based approaches, while also reducing the size of the trusted computing base, thus increasing confidence in the solution’s correctness. This thesis concludes that not only is it feasible to create such isolation domains for individual components, but that this should also be a fundamental, operating-system-supported abstraction, which would lead to more stable and secure applications.
Abstract:
Parents are encouraged to read with their children from an early age because shared book reading helps children to develop their language and early literacy skills. A pragmatic Randomised Controlled Trial (RCT) research design was adopted to investigate the influence of two forms of a shared reading intervention (Dialogic Reading, and Dialogic Reading with the addition of Print Referencing) on children’s language and literacy skills. Dialogic reading is a validated shared reading intervention that has been shown to improve children’s oral language skills prior to formal schooling (Whitehurst & Lonigan, 1998). Print referencing is another form of shared reading intervention that has the potential to affect children’s print knowledge as they begin school (Justice & Ezell, 2002). However, training parents to use print referencing strategies at home has not been researched extensively, although research findings indicate its effectiveness when used by teachers in the early years of school. Eighty parents of Preparatory year children from three Catholic schools in low income areas in the outer suburbs of a metropolitan city were trained to deliver specific shared reading strategies in an eight-week home intervention. Parents read eight books to their children across the period of the intervention and were asked to read each book at least three times a week. There were 42 boys and 38 girls ranging in age from 4.92 years to 6.25 years (M = 5.53, SD = 0.33) in the sample. The families were randomly assigned to three groups: Dialogic Reading (DR); Dialogic Reading with the addition of Print Referencing (DR + PR); and a Control group. Six measures were used to assess children’s language skills at pre-test, post-test, and follow-up (three months after the intervention). These measures assessed oral language (receptive and expressive vocabulary), phonological awareness skills (rhyme, word completion), alphabet knowledge, and concepts about print. Results of the intervention showed that there were significant differences from pre-test to post-test between the two intervention groups and the control group on three measures: expressive vocabulary, rhyme, and concepts about print. The shared reading strategies delivered by parents in the dialogic reading group, and in the dialogic reading with the addition of print referencing group, showed promising results for developing children’s oral language skills in terms of expressive vocabulary and rhyme, as well as understanding of concepts about print. At follow-up, when the children entered Year 1, the two intervention groups (DR and DR + PR) had significantly maintained their knowledge of concepts about print when compared with the control group. Overall, the findings from this intervention study did not show that dialogic reading with the addition of print referencing had stronger effects on children’s early literacy skills than dialogic reading alone. The research also explored whether pre-existing family factors impacted on the outcomes of the intervention from pre-test to post-test. The relationships between maternal education and home reading practices prior to the intervention and child outcomes at post-test were considered. However, there were no significant effects of maternal education and home literacy activities on child outcomes at post-test. Additionally, there were no significant effects for the level of compliance of parents with the intervention program, in terms of regular weekly reading to children during the intervention period, on child outcomes at post-test.
These non-significant findings are attributed to the lack of variability in the recruited sample. Parents participating in the intervention had high levels of education, although they were recruited from schools in low socio-economic areas; parents were already highly engaged in home literacy activities at recruitment; and the parents were highly compliant in reading regularly to their child during the intervention. Findings of the current study did show that training in shared reading strategies enhanced children’s early language and literacy skills. Both dialogic reading and dialogic reading with the addition of print referencing improved children’s expressive vocabulary, rhyme, and concepts about print post-intervention. Further research is needed to identify how, and if, print referencing strategies used by parents at home can be effective over and above the use of dialogic reading strategies. In this research, the limitations of sample size and the nature of an intervention that asked parents to use print referencing strategies at home may have restricted the opportunities for this study to find further effects on children’s emergent literacy skills, or to demonstrate the effectiveness of combining dialogic reading with print referencing strategies. However, these results did indicate that there is value in teaching parents to implement shared reading strategies at home in order to improve early literacy skills as children begin formal schooling.
Abstract:
The Japanese language is recognised as being more difficult than European languages, needing three times more tuition time to reach comparable levels of proficiency. Encouraging Japanese as a Foreign Language (JFL) students to become aware of, and effectively use, learner strategies is one way to assist them to become more controlled, effective learners, leading to enhanced language learning. This thesis investigates the development and implementation of a JFL curriculum delivered in a university course for students learning JFL. The curriculum was developed specifically to assist beginner university students with the development of learner strategies appropriate for a JFL reading context. The theoretical underpinning of the study was informed by Educational Criticism (Eisner, 1998), which aims to describe, interpret and evaluate the processes of interaction between the teacher, the learner and the curriculum, and the students' learning processes in a tertiary JFL classroom. The study investigated the effect on student learning processes of a JFL reading program that incorporated explicit learner strategy instruction, and identified factors that enhanced or impeded the development of learner strategy knowledge. The participants in the study were 29 students enrolled in the course, 10 of whom volunteered to undertake additional tasks, and the two teachers who implemented the curriculum. Data collection involved a number of different strategies to observe the students' participation in the classroom and learning experiences. Learning processes were investigated through TOL protocols, classroom observations, course evaluations, interviews, and learner strategy use measurement instruments (SILL, SILK and SORS) to document student uptake of learner strategies. The design of the study and its applied focus recognised my expertise as a JFL teacher, curriculum writer and researcher, an approach that aligns with the purpose of a Professional Doctorate. Four general thematics, or principles, were identified in this study:
- Explicit learner strategy instruction provides the context for students to develop awareness of learner strategies and take control of their learning;
- Collaborative learning and interaction with teachers offers students the opportunity for shared knowledge construction;
- Reflection offers teachers and students the opportunity to reflect on their own learning style and strategy knowledge, and raises awareness of other available strategies; and
- Diverse cultural and linguistic backgrounds have an impact on curriculum implementation and student uptake of learner strategies.
The study's methodological contribution is that it is one of the first in Australia to use Educational Criticism (Eisner, 1998) as a research methodology. The findings contribute to theoretical knowledge in the fields of Applied Linguistics, Second Language Teaching and Learning, Second Language Acquisition and JFL Teaching and Learning by offering new knowledge on the importance of learner strategies in the beginner JFL classroom, the potential of explicit strategy instruction, the value of reflection for both teachers and students, and the important role of the teacher in the process of curriculum implementation.
The general principles identified and the findings of this in-depth study of a JFL classroom can be drawn upon to inform other teaching practice situations. They invite practitioners, not just from Japanese but from other language areas and other disciplines, to examine and improve their own practices, and they suggest further research questions to pursue this line of enquiry.
Abstract:
The Remote Sensing Core Curriculum (RSCC) was initiated in 1993 to meet the demands for a college-level set of resources to enhance the quality of education across national and international campuses. The American Society of Photogrammetry and Remote Sensing adopted the RSCC in 1996 to sustain support of this educational initiative for its membership and collegiate community. A series of volumes, containing lectures, exercises, and data, is being created by expert contributors to address the different technical fields of remote sensing. The RSCC program is designed to operate on the Internet, taking full advantage of World Wide Web (WWW) technology for distance learning. The issues of curriculum development related to the educational setting, with demands on faculty, students, and facilities, are considered in order to understand the new paradigms for WWW-influenced computer-aided learning. The WWW is shown to be especially appropriate for facilitating remote sensing education, with its requirements for addressing image data sets and multimedia learning tools. The RSCC is located at http://www.umbc.edu/rscc.
Abstract:
In the current market, extensive software development is taking place and the software industry is thriving. Major software giants have cited source code theft as a major threat to revenues. By inserting an identity-establishing watermark in the source code, a company can prove its ownership of the source code. In this paper, we propose a watermarking scheme for C/C++ source code that exploits the language restrictions: if a function calls another function, the latter needs to be defined in the code before the former, unless one uses function pre-declarations. We embed the watermark in the code by imposing an ordering on the mutually independent functions through the introduction of bogus dependencies. Removal of the dependencies by an attacker to erase the watermark requires extensive manual intervention, thereby making the attack infeasible. The scheme is also secure against subtractive and additive attacks. Using our watermarking scheme, an n-bit watermark can be embedded in a program having n independent functions. The scheme is implemented on several sample codes and the performance changes are analyzed.
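As a hedged illustration only (the function names, the bogus-dependency construction and the bit encoding below are hypothetical, not the construction from the paper), the following C sketch shows the general idea: a never-executed call from one otherwise independent function to another forces a definition ordering in a file without forward declarations, and the relative order of such pairs can carry watermark bits.

```c
#include <stdio.h>

/* Runtime flag that the compiler cannot fold away; it is never set, so the
 * bogus call below is never executed, yet it still creates a compile-time
 * dependency of task_b on task_a. */
static volatile int bogus_guard = 0;

/* task_a and task_b are logically independent: neither needs the other. */
static int task_a(int x)
{
    return x + 1;                 /* real, independent work */
}

static int task_b(int x)
{
    if (bogus_guard)              /* dead branch: bogus dependency only  */
        return task_a(x);         /* forces task_a to be defined first   */
    return x * 2;                 /* real, independent work */
}

int main(void)
{
    /* With no forward declaration of task_a, this file compiles only if
     * task_a precedes task_b.  Reading "task_a before task_b" as bit 1,
     * and the swapped pair (with the bogus call reversed) as bit 0, lets
     * the definition order of such pairs carry watermark bits. */
    printf("%d %d\n", task_a(1), task_b(3));
    return 0;
}
```

Erasing the mark in this sketch would require an attacker to identify and remove every bogus call and then reorder the definitions, which mirrors the manual-intervention argument made in the abstract.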