122 results for implementations


Relevance:

10.00%

Publisher:

Abstract:

Software development and Web site development techniques have evolved significantly over the past 20 years. The relatively young Web application development area has borrowed heavily from traditional software development methodologies, primarily due to the similarities in the areas of data persistence and User Interface (UI) design. Recent developments in this area propose a Web Modeling Language (WebML) to accommodate the nuances specific to Web development. WebML is one of a number of implementations designed to enable modeling of Web site interaction flows while being extensible to accommodate new features in Web site development in the future. Our research aims to extend WebML with a focus on stigmergy, a biological term originally used to describe coordination between insects. We see design features in existing Web sites that mimic stigmergic mechanisms as part of the UI. We believe that we can synthesize and embed stigmergy in Web 2.0 sites. This paper focuses on the sub-topics of site UI design and the stigmergic mechanism designs required to achieve this.

Relevance:

10.00%

Publisher:

Abstract:

Fast calculation of quantities such as in-cylinder volume and indicated power is important in internal combustion engine research. Multiple channels of data, including crank angle and pressure, were collected for this purpose using a fully instrumented diesel engine research facility. Existing methods use software to post-process the data, first calculating volume from crank angle, then calculating the indicated work and indicated power from the area enclosed by the pressure-volume indicator diagram. Instead, this work investigates the feasibility of real-time calculation of volume and power via hardware implementation on Field Programmable Gate Arrays (FPGAs). Alternative hardware implementations were investigated using lookup tables, Taylor series methods, or the CORDIC (COordinate Rotation DIgital Computer) algorithm to compute the trigonometric operations in the crank-angle-to-volume calculation; the CORDIC algorithm was found to use the least resources. Simulation of the hardware-based implementation showed that the error in the volume and indicated power is less than 0.1%.
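The CORDIC algorithm named above computes trigonometric functions with only shift-and-add operations, which is why it maps well to FPGA resources. The following Python sketch is illustrative only, not the paper's hardware implementation; it shows the rotation-mode iteration that produces the sine and cosine of a crank angle:

```python
import math

def cordic_sincos(theta, n_iters=24):
    """Approximate (sin, cos) of theta (radians, |theta| <= pi/2) using
    CORDIC rotation mode: only additions, subtractions and halvings
    (bit shifts in hardware) plus a small table of arctangent constants."""
    # Precomputed arctan(2^-i) table and the CORDIC gain K
    angles = [math.atan(2.0 ** -i) for i in range(n_iters)]
    K = 1.0
    for i in range(n_iters):
        K *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = K, 0.0, theta  # start from a pre-scaled unit vector
    for i in range(n_iters):
        d = 1.0 if z >= 0 else -1.0  # rotate towards the residual angle
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return y, x  # (sin(theta), cos(theta))
```

In a hardware version the arctangent table and the gain K would be precomputed constants, and each multiplication by 2^-i becomes a bit shift.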

Relevance:

10.00%

Publisher:

Abstract:

This paper is based on an Australian Learning & Teaching Council (ALTC) funded evaluation in 13 universities across Australia and New Zealand of the use of Engineers Without Borders (EWB) projects in first-year engineering courses. All of the partner institutions have implemented this innovation differently and comparison of these implementations affords us the opportunity to assemble "a body of carefully gathered data that provides evidence of which approaches work for which students in which learning environments". This study used a mixed-methods data collection approach and a realist analysis. Data was collected by program logic analysis with course co-ordinators, observation of classes, focus groups with students, exit survey of students and interviews with staff as well as scrutiny of relevant course and curriculum documents. Course designers and co-ordinators gave us a range of reasons for using the projects, most of which alluded to their presumed capacity to deliver experience in and learning of higher order thinking skills in areas such as sustainability, ethics, teamwork and communication. For some students, however, the nature of the projects decreased their interest in issues such as ethical development, sustainability and how to work in teams. We also found that the projects provoked different responses from students depending on the nature of the courses in which they were embedded (general introduction, design, communication, or problem-solving courses) and their mode of delivery (lecture, workshop or online).

Relevance:

10.00%

Publisher:

Abstract:

A Delay Tolerant Network (DTN) is one where nodes can be highly mobile, with long message delay times forming dynamic and fragmented networks. Traditional centralised network security is difficult to implement in such a network; distributed security solutions are therefore more desirable in DTN implementations. Establishing effective trust in distributed systems with no centralised Public Key Infrastructure (PKI), such as in the Pretty Good Privacy (PGP) scheme, usually requires human intervention. Our aim is to build and compare different decentralised trust systems for implementation in autonomous DTN systems. In this paper, we utilise a key distribution model based on the Web of Trust principle, and employ a simple "leverage of common friends" trust system to establish initial trust in autonomous DTNs. We compare this system with two other methods of autonomously establishing initial trust by introducing a malicious node and measuring the distribution of malicious and fake keys. Our results show that the new trust system not only reduces the distribution of malicious and fake keys by 40% by the end of the simulation, but also improves key distribution between nodes. This paper contributes a comparison of three decentralised trust systems that can be employed in autonomous DTN systems.
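The "leverage of common friends" idea can be illustrated with a minimal sketch. The function name, the threshold, and the exact acceptance rule below are assumptions for illustration, not the paper's protocol: a node accepts a stranger's public key only if enough of the key's signers are already among its own trusted contacts.

```python
def accept_key(my_trusted, their_signers, threshold=2):
    """Hypothetical 'leverage of common friends' rule: accept a stranger's
    public key only if at least `threshold` of the nodes that signed it
    are already in our own trusted set (the 'common friends')."""
    common = my_trusted & their_signers
    return len(common) >= threshold

# Usage: node A trusts B, C and D; a newly met node's key is signed by
# C, D and the unknown node X, so two common friends vouch for it.
trusted = {"B", "C", "D"}
signers = {"C", "D", "X"}
accepted = accept_key(trusted, signers)
```

A higher threshold slows key distribution but raises the bar for a malicious node trying to have fake keys accepted.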

Relevance:

10.00%

Publisher:

Abstract:

Recent decades have witnessed a global acceleration of legislative and private sector initiatives to deal with cross-border insolvency. Legislative institutions include the various national implementations of the Model Law on Cross-Border Insolvency (Model Law) published by the United Nations Commission on International Trade Law (UNCITRAL). Private mechanisms include cross-border protocols developed and utilised by insolvency professionals and their advisers (often with the imprimatur of the judiciary), on both general and ad hoc bases. The Asia Pacific region has not escaped the effect of those developments, and the economic turmoil of the past few years has provided an early test for some of the emerging initiatives in that region. This two-part article explores the operation of those institutions through the medium of three recent cases.

Relevance:

10.00%

Publisher:

Abstract:

The R statistical environment and language has demonstrated particular strengths for interactive development of statistical algorithms, as well as data modelling and visualisation. Its current implementation has an interpreter at its core, which may carry a performance penalty compared with directly executing user algorithms in the native machine code of the host CPU. In contrast, the C++ language has no built-in visualisation capabilities, handling of linear algebra, or even basic statistical algorithms; however, user programs are converted to high-performance machine code ahead of execution. A new method avoids possible speed penalties in R by using the Rcpp extension package in conjunction with the Armadillo C++ matrix library. In addition to the inherent performance advantages of compiled code, Armadillo provides an easy-to-use, template-based meta-programming framework that allows several linear algebra operations to be automatically pooled into one, which in turn can lead to further speedups. With the aid of Rcpp and Armadillo, conversion of linear-algebra-centered algorithms from R to C++ becomes straightforward. The converted algorithms retain their overall structure and readability, all while maintaining a bidirectional link with the host R environment. Empirical timing comparisons of R and C++ implementations of a Kalman filtering algorithm indicate a speedup of several orders of magnitude.
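As a point of reference for the kind of algorithm being ported, the sketch below shows the core Kalman filter recursion in its simplest, scalar form. It is a generic illustration in Python, not the paper's R or Rcpp/Armadillo code, and the default noise variances are arbitrary:

```python
def kalman_1d(zs, q=1e-4, r=0.01, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a random-walk state observed with noise:
    x_k = x_{k-1} + w_k (process variance q), z_k = x_k + v_k (variance r).
    Returns the filtered state estimates for the measurement sequence zs."""
    x, p = x0, p0
    estimates = []
    for z in zs:
        p = p + q                 # predict: variance grows by process noise
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update with the innovation
        p = (1.0 - k) * p         # posterior variance
        estimates.append(x)
    return estimates
```

The matrix form of this recursion, with its chains of multiplications and inversions, is exactly where Armadillo's pooling of linear algebra expressions pays off.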

Relevance:

10.00%

Publisher:

Abstract:

The numerical solution of stochastic differential equations (SDEs) has recently focused on the development of numerical methods with good stability and order properties. These numerical implementations have used fixed stepsizes, but there are many situations in which a fixed stepsize is not appropriate. In the numerical solution of ordinary differential equations, much work has been carried out on developing robust implementation techniques using variable stepsize. In the deterministic case it has been necessary to consider the "best" choice for an initial stepsize, as well as to develop effective strategies for stepsize control; the same must, of course, be done in the stochastic case. In this paper, proportional integral (PI) control is applied to a variable stepsize implementation of an embedded pair of stochastic Runge-Kutta methods used to obtain numerical solutions of nonstiff SDEs. For stiff SDEs, the embedded pair of the balanced Milstein and balanced implicit methods is implemented in variable stepsize mode using a predictive controller for the stepsize change. These stepsize controllers are also extended, from a digital filter theory point of view, via PI with derivative (PID) control. The implementations show the improvement in efficiency that can be attained when using these control theory approaches, compared with the regular stepsize change strategy.
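A PI stepsize controller of the kind described can be sketched as follows. The exponents and safety factor are common illustrative choices in the style of Gustafsson's controllers, not the values used in the paper:

```python
def pi_step_factor(err, err_prev, tol, k_i=0.3, k_p=0.4, fmin=0.2, fmax=5.0):
    """PI stepsize controller: scale the stepsize using both the current
    error estimate (integral term, exponent k_i) and the error trend from
    the previous step (proportional term, exponent k_p), clipped to
    [fmin, fmax] to avoid violent stepsize changes."""
    factor = (tol / err) ** k_i * (err_prev / err) ** k_p
    return max(fmin, min(fmax, 0.9 * factor))  # 0.9 is a safety factor
```

After each step the new stepsize is `h * pi_step_factor(err, err_prev, tol)`, and the step is accepted when `err <= tol`; the proportional term damps the oscillatory accept/reject behaviour that a plain error-per-step controller can exhibit.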

Relevance:

10.00%

Publisher:

Abstract:

This paper gives a review of recent progress in the design of numerical methods for computing the trajectories (sample paths) of solutions to stochastic differential equations. We give a brief survey of the area focusing on a number of application areas where approximations to strong solutions are important, with a particular focus on computational biology applications, and give the necessary analytical tools for understanding some of the important concepts associated with stochastic processes. We present the stochastic Taylor series expansion as the fundamental mechanism for constructing effective numerical methods, give general results that relate local and global order of convergence and mention the Magnus expansion as a mechanism for designing methods that preserve the underlying structure of the problem. We also present various classes of explicit and implicit methods for strong solutions, based on the underlying structure of the problem. Finally, we discuss implementation issues relating to maintaining the Brownian path, efficient simulation of stochastic integrals and variable-step-size implementations based on various types of control.

Relevance:

10.00%

Publisher:

Abstract:

The dynamic capabilities view (DCV) focuses on renewal of firms’ strategic knowledge resources so as to sustain competitive advantage within turbulent markets. Within the context of the DCV, the focus of knowledge management (KM) is to develop the KM capability (KMC) by deploying knowledge governance mechanisms that are conducive to facilitating knowledge processes, so as to produce superior business performance over time. The essence of KM performance evaluation is to assess how well the KMC is configured with knowledge governance mechanisms and processes that enable a firm to achieve superior performance by matching its knowledge base with market needs. However, little research has been undertaken to evaluate KM performance from the DCV perspective. This study employed a survey design and hypothesis-testing approaches to develop a capability-based KM evaluation framework (CKMEF) that upholds the basic assertions of the DCV. Under the governance of the framework, a KM index (KMI) and a KM maturity model (KMMM) were derived not only to indicate the extent to which a firm’s KM implementations fulfill its strategic objectives and to identify the evolutionary phase of its KMC, but also to benchmark the KMC against the research population. The research design ensured that the evaluation framework and instruments have statistical significance and good generalizability for application in the research population, namely construction firms operating in the dynamic Hong Kong construction market. The study demonstrated the feasibility of quantitatively evaluating the development of the KMC and revealing the performance heterogeneity associated with that development.

Relevance:

10.00%

Publisher:

Abstract:

Stochastic differential equations (SDEs) arise from physical systems where the parameters describing the system can only be estimated or are subject to noise. Much work has been done recently on developing higher-order Runge-Kutta methods for solving SDEs numerically. Fixed stepsize implementations of numerical methods have limitations when, for example, the SDE being solved is stiff, as this forces the stepsize to be very small. This paper presents a completely general variable stepsize implementation of an embedded Runge-Kutta pair for solving SDEs numerically; in this implementation there is no restriction on the value used for the stepsize, and it is demonstrated that the integration remains on the correct Brownian path.
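Keeping the integration on the correct Brownian path when a step is rejected and refined is usually done with a Brownian bridge: the increment already drawn for the whole step is kept, and the value at the midpoint is sampled conditionally on the two endpoints. The sketch below illustrates that standard conditional sampling step, not necessarily the paper's exact implementation:

```python
import random

def brownian_bridge_midpoint(w_left, w_right, h, rng):
    """Sample the Brownian path at the midpoint of a step of length h,
    conditional on its endpoint values w_left = W(t) and w_right = W(t+h):
    the conditional law is N((w_left + w_right)/2, h/4)."""
    mean = 0.5 * (w_left + w_right)
    return rng.gauss(mean, (h / 4.0) ** 0.5)

# Usage: refine a rejected step of length h = 1.0 whose Brownian path
# endpoint values are W(t) = 0.0 and W(t+h) = 2.0.
rng = random.Random(1)
w_mid = brownian_bridge_midpoint(0.0, 2.0, 1.0, rng)
```

Because the midpoint is drawn conditionally rather than afresh, halving a step never changes the realised Brownian path, so accepted and rejected steps stay consistent with one another.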

Relevance:

10.00%

Publisher:

Abstract:

Purpose: Tacit knowledge is perceived as the most strategically important resource of the construction organisation, and the only renewable and sustainable base for its activities and competitiveness. Knowledge management (KM) activities that deal with tacit knowledge are essential in helping an organisation to achieve its long-term organisational objectives. The purpose of this paper is to provide empirical evidence for the stronger strategic role of tacit KM in comparison to explicit KM.

Design/methodology/approach: A questionnaire survey was administered in 2005 to a sample of construction contractors operating in Hong Kong to elicit opinions on the internal business environment, the intensity of KM activities as executed by the targeted organisations, and the contribution of these activities to business performance (BP). A total of 149 usable responses were received from 99 organisations, representing about 38 per cent of the sampling frame. The statistical analyses helped to map the reported KM activities into two groups that deal, respectively, with tacit and explicit knowledge. The sensitivity to variations of organisational policies and the strength of association with BP of the two groups of KM activities were also compared empirically. A total of 15 interviews with managerial and professional staff of leading contractors were undertaken to provide insightful narratives of KM implementations.

Findings: The effective implementation of organisational policies, such as encouraging innovation and strengthening strategic guidance for KM, would facilitate the human interactions of tacit KM. Higher intensity of activities in managing tacit knowledge would ultimately help the organisations to achieve economic gain in the long run.

Originality/value: The stronger strategic role of tacit KM is empirically investigated and established within the context of construction organisations.

Relevance:

10.00%

Publisher:

Abstract:

The configuration of comprehensive Enterprise Systems to meet the specific requirements of an organisation still consumes significant resources, and the consequences of failed implementation projects are severe, possibly even threatening the organisation’s existence. This paper proposes a method that aims to increase the efficiency of Enterprise Systems implementations. First, we argue that process modelling languages featuring different degrees of abstraction exist for different user groups and purposes, which makes it necessary to integrate them; we describe how to do this using the meta models of the involved languages. Second, we motivate that an integrated process model based on the integrated meta model needs to be configurable, and elaborate on the mechanisms by which this model configuration can be achieved. We introduce a business example using SAP modelling techniques to illustrate the proposed method.
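The configuration mechanism can be illustrated with a deliberately simplified sketch: a hypothetical integrated model represented as a list of activities, some marked configurable, from which a configuration derives an organisation-specific variant. Real configurable process models offer richer settings than the ON/OFF switch shown here, and all names below are invented for illustration:

```python
def configure(model, settings):
    """Sketch of model configuration: the integrated process model is a
    list of (activity, configurable?) pairs; a configuration switches each
    configurable activity ON or OFF to derive the organisation-specific
    process variant. Non-configurable activities are always kept."""
    variant = []
    for activity, configurable in model:
        if not configurable or settings.get(activity, "ON") == "ON":
            variant.append(activity)
    return variant

# Usage: a reference model with one mandatory and two configurable steps;
# an organisation that never exports switches "Export Declaration" off.
model = [("Create Order", False), ("Check Credit", True), ("Export Declaration", True)]
variant = configure(model, {"Export Declaration": "OFF"})
```

The point of the mechanism is that one integrated reference model, maintained once, can yield many organisation-specific models by configuration rather than by manual remodelling.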

Relevance:

10.00%

Publisher:

Abstract:

Many software applications extend their functionality by dynamically loading executable components into their allocated address space. Such components, exemplified by browser plugins and other software add-ons, not only enable reusability, but also promote programming simplicity, as they reside in the same address space as their host application, supporting easy sharing of complex data structures and pointers. However, such components are also often of unknown provenance and quality and may be riddled with accidental bugs or, in some cases, deliberately malicious code. Statistics show that such component failures account for a high percentage of software crashes and vulnerabilities. Enabling isolation of such fine-grained components is therefore necessary to increase the stability, security and resilience of computer programs. This thesis addresses this issue by showing how host applications can create isolation domains for individual components, while preserving the benefits of a single address space, via a new architecture for software isolation called LibVM. Towards this end, we define a specification which outlines the functional requirements for LibVM, identify the conditions under which these functional requirements can be met, define an abstract Application Programming Interface (API) that encompasses the general problem of isolating shared libraries, thus separating policy from mechanism, and prove its practicality with two concrete implementations based on hardware virtualization and system call interpositioning, respectively. The results demonstrate that hardware isolation minimises the difficulties encountered with software based approaches, while also reducing the size of the trusted computing base, thus increasing confidence in the solution’s correctness. 
This thesis concludes that, not only is it feasible to create such isolation domains for individual components, but that it should also be a fundamental operating system supported abstraction, which would lead to more stable and secure applications.
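The shape of such an API, separating policy from mechanism, can be sketched as follows. All names and the policy check below are hypothetical illustrations, not the actual LibVM interface; the real isolation mechanisms (hardware virtualisation or system call interposition) would sit beneath the `call` boundary:

```python
class IsolationDomain:
    """Sketch of a LibVM-style host API: the host loads a component's entry
    points into a domain, and every cross-domain call is mediated by a
    policy check at the boundary. The policy (what is allowed) is kept
    separate from the mechanism (how isolation is enforced underneath)."""
    def __init__(self, allowed_calls):
        self.allowed = set(allowed_calls)   # the policy
        self.exports = {}                   # component entry points

    def load(self, name, func):
        self.exports[name] = func

    def call(self, name, *args):
        if name not in self.allowed:        # policy check at the boundary
            raise PermissionError(f"call to {name!r} denied by policy")
        return self.exports[name](*args)

# Usage: a plugin exports two functions, but policy permits only "parse".
dom = IsolationDomain(allowed_calls=["parse"])
dom.load("parse", lambda s: s.split(","))
dom.load("exec_shell", lambda cmd: None)
fields = dom.call("parse", "a,b")
```

Because the host only ever reaches the component through `call`, swapping the underlying mechanism does not change the host-facing API, which is the point of separating policy from mechanism.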

Relevance:

10.00%

Publisher:

Abstract:

Universities are increasingly challenged by the emerging global higher education market, facilitated by advances in Information and Communication Technologies (ICT). This requires them to reconsider their mission and direction in order to function effectively and efficiently, and to be responsive to changes in their environment. In the face of increasing demands and competitive pressures, universities, like other companies, seek to continuously innovate and improve their performance. Universities are considering co-operating or sharing, both internally and externally, in a wide range of areas to achieve cost effectiveness and improvements in performance. Shared services are an effective model for re-organizing to reduce costs, increase quality and create new capabilities, and they are not limited to the Higher Education (HE) sector: organizations across different sectors are adopting shared services, in particular for support functions such as Finance, Accounting, Human Resources and Information Technology. While shared services have been around for more than three decades, commencing in the 1970s in the banking sector and later spreading to other sectors, the domain is under-researched, with little consensus on even the most fundamental issues, as basic as defining what shared services are. Moreover, the interest in shared services within Higher Education is a global phenomenon. This study on shared services is situated within the Higher Education sector of Malaysia, and originated as an outcome of a national project (2005-2007) conducted by the Ministry of Higher Education (MOHE) entitled "Knowledge, Information Communication Technology Strategic Plan (KICTSP) for Malaysian Public Higher Education", in which progress towards more collaboration via shared services was a key recommendation.
The study’s primary objective was to understand the nature and potential of ICT shared services, in particular in the Malaysian HE sector, by laying a foundation in terms of definition, typologies and a research agenda, and by deriving theoretically based conceptualisations of the potential benefits of shared services, the success factors, and the issues of pursuing shared services. The study embarked on this objective with a literature review and a pilot case study as a means to further define the context of the study, given the current under-researched status of ICT shared services and of shared services in Higher Education. This context definition phase illustrated a range of unaddressed issues, including a lack of common understanding of what shared services are, how they are formed, what objectives they fulfil, and who is involved. The study thus embarked on a further investigation of a more foundational nature with an exploratory phase that aimed to address these gaps, in which a detailed archival analysis of the shared services literature within the IS context was conducted to better understand shared services from an IS perspective. The IS literature on shared services was analysed in depth to report on the current status of shared services research in the IS domain; in particular, definitions, objectives, stakeholders, the notion of sharing, theories used, and research methods applied were analysed, which provided a firmer basis for this study’s design. The study also conducted a detailed content analysis of 36 cases (globally) of shared services implementations in the HE sector to better understand how shared services are structured within the HE sector and what is being shared. The results of the context definition phase and the exploratory phase formed a firm basis for the multiple case study phase, which was designed to address the primary goals of this study (as presented above).
Three case sites within the Malaysian HE sector were included in this analysis, resulting in empirically supported theoretical conceptualisations of shared services success factors, issues and benefits. A range of contributions is made through this study. First, the detailed archival analysis of shared services in Information Systems (IS) demonstrated the dearth of research on shared services within Information Systems. While the existing literature was synthesised to contribute towards an improved understanding of shared services in the IS domain, the areas that remain under-developed and require further exploration are identified and presented as a proposed research agenda for the field. This study also provides theoretical considerations and methodological guidelines to support the research agenda and to enable better empirical research in this domain. A number of literature-based a priori frameworks (e.g. on the forms of sharing and shared services stakeholders) are derived in this phase, contributing to practice and research with early conceptualisations of critical aspects of shared services. Furthermore, the comprehensive archival analysis design presented and executed here exemplifies a systematic, pre-defined and tool-supported method to extract, analyse and report literature, and is documented as a set of guidelines that can be applied to other similar literature analyses, with particular attention to supporting novice researchers. Second, the content analysis of 36 shared services initiatives in the Higher Education sector identified eight different types of structural arrangements for shared services, as observed in practice, and the salient dimensions along which those types can be usefully differentiated. Each of the eight structural arrangement types is defined and demonstrated through case examples, with further descriptive details and insights into what is shared and how the sharing occurs.
This typology, grounded in secondary empirical evidence, can serve as a useful analytical tool for researchers investigating the shared services phenomenon further, and for practitioners considering the introduction or further development of shared services. Finally, the multiple case studies conducted in the Malaysian Higher Education sector provided a further empirical basis to instantiate the conceptual frameworks and typology derived from the prior phases and to develop an empirically supported (i) framework of issues and challenges, (ii) preliminary theory of shared services success, and (iii) benefits framework for shared services in the Higher Education sector.