876 results for Computer software - Development
Abstract:
Recent research on novice programmers has suggested that they pass through neo-Piagetian stages — sensorimotor, preoperational, and concrete operational — before eventually reaching programming competence at the formal operational stage. This paper presents empirical results in support of this neo-Piagetian perspective. The major novel contributions of this paper are empirical results for some exam questions aimed at testing novices for the concrete operational abilities to reason with quantities that are conserved, processes that are reversible, and properties that hold under transitive inference. While the questions we used had been proposed earlier by Lister, he did not present any data for how students performed on these questions. Our empirical results demonstrate that many students struggle to answer these problems, despite their apparent simplicity. We then compare student performance on these questions with their performance on six "explain in plain English" questions.
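As an illustration of the kind of reasoning the abstract describes, the following is a question in the spirit of a concrete operational "reversibility" task (it is not one of Lister's actual exam items). A novice is shown a variable swap and asked what code restores the original values — the insight being that the swap is its own inverse:

```python
# Illustrative only: a reversibility task in the spirit of those described
# above, not an actual question from the study.

def swap(a, b):
    # Three-assignment swap using a temporary variable.
    temp = a
    a = b
    b = temp
    return a, b

def undo_swap(a, b):
    # The concrete operational insight: swapping is its own inverse,
    # so applying the same steps again restores the original state.
    return swap(a, b)

a, b = 3, 7
a, b = swap(a, b)       # now a is 7, b is 3
a, b = undo_swap(a, b)  # original values restored
```

A student reasoning at the concrete operational stage can answer by recognizing the reversible structure, without tracing each assignment line by line.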
Abstract:
Purpose – To investigate and identify the patterns of interaction between searchers and the search engine during web searching. Design/methodology/approach – The authors examined 2,465,145 interactions from 534,507 users of Dogpile.com submitted on May 6, 2005, and compared query reformulation patterns. They investigated the type of query modifications and query modification transitions within sessions. Findings – The paper identifies three strong query reformulation transition patterns: between specialization and generalization; between video and audio; and between content change and system assistance. In addition, the findings show that web and image content were the most popular media collections. Originality/value – This research sheds light on the more complex aspects of web searching involving query modifications.
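The specialization/generalization distinction mentioned in the findings can be approximated by comparing the term sets of consecutive queries in a session. The sketch below is an illustrative heuristic, not the authors' actual coding scheme:

```python
# A minimal sketch (not the study's methodology) of classifying a query
# reformulation by term containment between consecutive queries.

def classify_reformulation(prev_query: str, next_query: str) -> str:
    prev_terms = set(prev_query.lower().split())
    next_terms = set(next_query.lower().split())
    if prev_terms < next_terms:
        return "specialization"   # terms added: the query narrows
    if next_terms < prev_terms:
        return "generalization"   # terms removed: the query broadens
    return "other"                # content change or unrelated edit

print(classify_reformulation("jaguar", "jaguar car"))        # specialization
print(classify_reformulation("jaguar car price", "jaguar car"))  # generalization
```

Real log analysis must also handle sessions, spelling variants, and collection switches (e.g. web to images), which this containment test ignores.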
Abstract:
This thesis addresses one of the fundamental issues that remains unresolved in patent law today. It is a question that strikes at the heart of what a patent is and what it is supposed to protect. That question is whether an invention must produce a physical effect or cause a physical transformation of matter to be patentable, or whether it is sufficient that an invention involves a specific practical application of an idea or principle to achieve a useful result. In short, the question is whether patent law contains a physicality requirement. Resolving this issue will determine whether only traditional mechanical, industrial and manufacturing processes are patent eligible, or whether patent eligibility extends to include purely intangible, or non-physical, products and processes. To this end, this thesis seeks to identify where the dividing line lies between patentable subject matter and the recognised categories of excluded matter, namely, fundamental principles of nature, physical phenomena, and abstract ideas. It involves determining which technological advances are worth the inconvenience monopoly protection causes the public at large, and which should remain free for all to use without restriction. This is an issue that has important ramifications for innovation in the ‘knowledge economy’ of the Information Age. Determining whether patent law contains a physicality requirement is integral to deciding whether much of the valuable innovation we are likely to witness, in what are likely to be the emerging areas of technology in the near future, will receive the same encouragement as industrial and manufacturing advances of previous times.
Abstract:
Purpose: The purpose of this paper is to clarify how end-users' tacit knowledge can be captured and integrated in an overall business process management (BPM) approach. Current approaches to support stakeholders' collaboration in the modelling of business processes envision an egalitarian environment where stakeholders interact in the same context, using the same languages and sharing the same perspectives on the business process. Therefore, such stakeholders have to collaborate in the context of process modelling using a language that some of them do not master, and have to integrate their various perspectives. Design/methodology/approach: The paper applies the SECI knowledge management process to analyse the problems of traditional top-down BPM approaches and BPM collaborative modelling tools. In addition, the SECI model is also applied to Wikipedia, a successful Web 2.0-based knowledge management environment, to identify how tacit knowledge is captured in a bottom-up approach. Findings: The paper identifies a set of requirements for a hybrid BPM approach, both top-down and bottom-up, and describes a new BPM method based on a stepwise discovery of knowledge. Originality/value: This new approach, Processpedia, enhances collaborative modelling among stakeholders without enforcing egalitarianism. In Processpedia, tacit knowledge is captured and standardised into the organisation's business processes by fostering an ecological participation of all the stakeholders and capitalising on stakeholders' distinctive characteristics.
Abstract:
Complexity is a major concern that people aim to overcome through modeling. One way of reducing complexity is separation of concerns, e.g. separating the business process from the applications. One kind of concern is the cross-cutting concern, i.e. a concern that is scattered and tangled through one or several models. In business process management, examples of such concerns are security and privacy policies. To deal with these cross-cutting concerns, the aspect-oriented approach was introduced in the software development area and recently also in the business process management area. The work presented in this paper elaborates on aspect-oriented process modelling. It extends earlier work by defining a mechanism for capturing multiple concerns and specifying a precedence order according to which they should be handled in a process. A formal syntax of the notation is presented, precisely capturing the extended concepts and mechanisms. Finally, the relevance of the approach is demonstrated through a case study.
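The precedence mechanism described above can be pictured as multiple concerns advising the same process activity and being applied in an explicit order. The sketch below is a hypothetical illustration; the names and structure are not the paper's formal syntax:

```python
# Hypothetical sketch: cross-cutting concerns (e.g. security, privacy)
# advise a process activity and are woven in precedence order.
# Illustrative only; not the notation defined in the paper.

def weave(activity: str, concerns: list) -> list:
    """Return the execution order for one activity: each concern's
    before-advice in ascending precedence order, then the activity."""
    ordered = sorted(concerns, key=lambda c: c["precedence"])
    steps = [f"{c['name']}:before({activity})" for c in ordered]
    steps.append(activity)
    return steps

concerns = [
    {"name": "privacy",  "precedence": 2},
    {"name": "security", "precedence": 1},
]
print(weave("ApproveLoan", concerns))
# the security check is woven first, then privacy, then the activity
```

The point of an explicit precedence order is exactly this determinism: when several concerns cut across the same activity, the model, not chance, decides which applies first.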
Abstract:
STUDY DESIGN: Controlled laboratory study. OBJECTIVES: To investigate the reliability and concurrent validity of photographic measurements of hallux valgus angle compared to radiographs as the criterion standard. BACKGROUND: Clinical assessment of hallux valgus involves measuring alignment between the first toe and metatarsal on weight-bearing radiographs or visually grading the severity of deformity with categorical scales. Digital photographs offer a noninvasive method of measuring deformity on an exact scale; however, the validity of this technique has not previously been established. METHODS: Thirty-eight subjects (30 female, 8 male) were examined (76 feet, 54 with hallux valgus). Computer software was used to measure hallux valgus angle from digital records of bilateral weight-bearing dorsoplantar foot radiographs and photographs. One examiner measured 76 feet on 2 occasions 2 weeks apart, and a second examiner measured 40 feet on a single occasion. Reliability was investigated by intraclass correlation coefficients and validity by 95% limits of agreement. The Pearson correlation coefficient was also calculated. RESULTS: Intrarater and interrater reliability were very high (intraclass correlation coefficients greater than 0.96) and 95% limits of agreement between photographic and radiographic measurements were acceptable. Measurements from photographs and radiographs were also highly correlated (Pearson r = 0.96). CONCLUSIONS: Digital photographic measurements of hallux valgus angle are reliable and have acceptable validity compared to weight-bearing radiographs. This method provides a convenient and precise tool in the assessment of hallux valgus, while avoiding the cost and radiation exposure associated with radiographs.
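The measurement itself reduces to the angle between two digitized axes: the long axis of the first metatarsal and the long axis of the hallux, each defined by two landmark points. The following is an illustrative calculation, not the software used in the study:

```python
# Illustrative only: computing an angle between two landmark-defined axes,
# as one might from points digitized on a photograph or radiograph.
# This is not the study's measurement software.

import math

def axis_angle_deg(p1, p2, q1, q2):
    """Angle in degrees between line p1->p2 and line q1->q2."""
    v = (p2[0] - p1[0], p2[1] - p1[1])
    w = (q2[0] - q1[0], q2[1] - q1[1])
    dot = v[0] * w[0] + v[1] * w[1]
    norm = math.hypot(*v) * math.hypot(*w)
    return math.degrees(math.acos(dot / norm))

# Hypothetical landmarks: metatarsal axis vertical, hallux axis
# deviated roughly 30 degrees laterally from it.
metatarsal = ((0.0, 0.0), (0.0, 10.0))
hallux = ((0.0, 10.0), (5.0, 18.66))
print(round(axis_angle_deg(*metatarsal, *hallux), 1))
```

Because the quantity is an angle, uniform photographic scaling and perspective-free positioning matter more than absolute calibration, which is part of why photographs can substitute for radiographs here.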
Abstract:
Fundamental tooling is required in order to apply USDL in practical settings. This chapter discusses three fundamental types of tools for USDL. First, USDL editors have been developed for expert and casual users, respectively. Second, several USDL repositories have been built to allow editors to access and store USDL descriptions. Third, our generic USDL marketplace allows providers to describe their services once and potentially trade them anywhere. In addition, the marketplace addresses the idiosyncrasies of service trading as opposed to the simpler case of product trading. The chapter also presents several deployment scenarios of such tools to foster individual value chains and support new business models across organizational boundaries. We close the chapter with an application of USDL in the context of service engineering.
Abstract:
As the service-oriented architecture paradigm has become ever more popular, different standardization efforts have been proposed by various consortia to enable interaction among heterogeneous environments through this paradigm. This chapter will overview the most prevalent of these SOA approaches. It will first show how technical services can be described, how they can interact with each other and be discovered by users. Next, the chapter will present different standards to facilitate service composition and to design service-oriented environments in light of a universal understanding of service orientation. The chapter will conclude with a summary and a discussion of the limitations of the reviewed standards with regard to their ability to describe service properties. This paves the way for the next chapters, where the USDL standard will be presented, which aims to lift such limitations.
Abstract:
Enabling web-based service networks and ecosystems requires a way of describing services by a "commercial envelope", as discussed in Chapter 1. A uniform conception of services across all walks of life (including technical services) is required, capturing business, operational, and technical aspects. Therefore, our proposed Unified Service Description Language (USDL) particularly draws from and generalizes the best-of-breed approaches presented in Part I. The following chapter presents the design rationale of USDL, where the different aspects are put in a framework of description requirements. The subsequent chapters of this part then provide details on specific aspects such as pricing or legal issues.
Abstract:
This article sets out the results of an empirical research study into the uses to which the Australian patent system is being put in the early 21st century. The focus of the study is business method patents, which are of interest because they are a controversial class of patents thought to differ significantly from the mechanical, chemical and industrial inventions that have traditionally been the mainstay of the patent system. The purpose of the study is to understand what sort of business method patent applications have been lodged in Australia in the first decade of this century and how the patent office is responding to those applications.
Abstract:
Premature convergence to local optimal solutions is one of the main difficulties when using evolutionary algorithms in real-world optimization problems. To prevent premature convergence and the degeneration phenomenon, this paper proposes a new optimization computation approach, the human-simulated immune evolutionary algorithm (HSIEA). Considering that the premature convergence problem is due to the lack of diversity in the population, the HSIEA employs the clonal selection principle of artificial immune system theory to preserve the diversity of solutions during the search process. Mathematical descriptions and procedures of the HSIEA are given, and four new evolutionary operators are formulated: clone, variation, recombination, and selection. Two benchmark optimization functions are investigated to demonstrate the effectiveness of the proposed HSIEA.
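A clonal-selection style evolutionary loop with the four named operators can be sketched as follows. This is an illustrative reconstruction under generic assumptions (Gaussian mutation, averaging recombination, truncation selection), not the HSIEA procedure from the paper:

```python
# A minimal clonal-selection style loop with the four operators named
# above: clone, variation, recombination, selection. Illustrative only;
# the actual HSIEA operators are defined in the paper.

import random

def clonal_sketch(fitness, dim=2, pop_size=20, clones=5,
                  generations=100, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pool = []
        for ind in pop:
            # Clone: copy each individual several times.
            for _ in range(clones):
                # Variation: perturb each clone with small Gaussian noise,
                # preserving diversity around every parent.
                pool.append([x + rng.gauss(0, 0.1) for x in ind])
        # Recombination: blend random pairs drawn from the clone pool.
        for _ in range(pop_size):
            a, b = rng.sample(pool, 2)
            pool.append([(x + y) / 2 for x, y in zip(a, b)])
        # Selection: keep the best pop_size individuals (minimization);
        # including the parents makes the best fitness monotone.
        pop = sorted(pool + pop, key=fitness)[:pop_size]
    return min(pop, key=fitness)

# Benchmark: the sphere function, minimized at the origin.
best = clonal_sketch(lambda x: sum(v * v for v in x))
print(best)
```

The diversity-preserving role of cloning is visible here: every parent, not just the fittest, seeds several perturbed copies, so the population does not collapse onto a single basin as quickly as under purely elitist reproduction.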
Abstract:
Modern applications comprise multiple components, such as browser plug-ins, often of unknown provenance and quality. Statistics show that failure of such components accounts for a high percentage of software faults. Enabling isolation of such fine-grained components is therefore necessary to increase the robustness and resilience of security-critical and safety-critical computer systems. In this paper, we evaluate whether such fine-grained components can be sandboxed through the use of the hardware virtualization support available in modern Intel and AMD processors. We compare the performance and functionality of such an approach to two previous software-based approaches. The results demonstrate that hardware isolation minimizes the difficulties encountered with software-based approaches, while also reducing the size of the trusted computing base, thus increasing confidence in the solution's correctness. We also show that our relatively simple implementation has equivalent run-time performance, with overheads of less than 34%, does not require custom tool chains, and provides enhanced functionality over software-only approaches, confirming that hardware virtualization technology is a viable mechanism for fine-grained component isolation.