910 results for implementations


Relevance: 10.00%

Abstract:

In the medical and healthcare arena, patients' data is not just their own personal history but also a valuable large dataset for finding solutions for diseases. While electronic medical records are becoming popular and are used in healthcare workplaces such as hospitals, as well as by insurance companies and major stakeholders such as physicians and their patients, access to such information must be handled in a way that preserves privacy and security. Finding the best way to keep the data secure has therefore become an important issue in the area of database security. Sensitive medical data should be encrypted in databases, and there are many encryption/decryption techniques and algorithms for preserving privacy and security. Their performance is an important factor when medical data is managed in databases. Another important factor is that stakeholders need to find more cost-effective ways to reduce the total cost of ownership. As an alternative, DAS (Data as a Service) is a popular outsourcing model that addresses cost-effectiveness, but it requires that the encryption/decryption modules be handled by trustworthy stakeholders. This research project focuses on query response times in a DAS model (AES-DAS) and compares the outsourcing model with an in-house model that uses Microsoft's built-in encryption scheme in SQL Server. The project includes building a prototype of medical database schemas and is carried out in two simulation stages. The first stage uses six databases to measure the performance of plain text, Microsoft's built-in encryption and AES-DAS. In particular, AES-DAS incorporates implementations of symmetric-key encryption, namely AES (Advanced Encryption Standard), and a bucket indexing processor using a Bloom filter. The results are categorised into character-type queries, numeric-type queries, range queries, range queries using the bucket index and aggregate queries. The second stage is a scalability test from 5K to 2560K records. The main result of these simulations is that, as an outsourcing model, AES-DAS using the bucket index is around 3.32 times faster than plain AES-DAS with 70 partitions and 10K-record databases. Retrieving numeric data takes less time than retrieving character data in AES-DAS, and the aggregate query response time in AES-DAS is not as consistent as that of the Microsoft built-in encryption scheme. The scalability test shows that once the DBMS reaches a certain threshold, query response times degrade rapidly. However, further investigation of these simulations is needed to obtain other outcomes and to construct a secure EMR (Electronic Medical Record) more efficiently.
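
As a rough illustration of the bucket-indexing idea described above (a sketch, not the thesis prototype), the following C++ example builds a Bloom filter over bucket identifiers so that an untrusted server can pre-filter encrypted rows before any decryption is attempted. The bucket boundaries, hash count and filter size are assumptions chosen for the example.

```cpp
// Minimal sketch: a Bloom-filter-backed bucket index over encrypted rows.
// Bucket boundaries, hash count and filter size are illustrative assumptions.
#include <bitset>
#include <functional>
#include <iostream>
#include <string>
#include <utility>
#include <vector>

class BloomFilter {
public:
    explicit BloomFilter(int hashes = 4) : k_(hashes) {}
    void add(const std::string& key) {
        for (int i = 0; i < k_; ++i) bits_.set(slot(key, i));
    }
    bool mightContain(const std::string& key) const {
        for (int i = 0; i < k_; ++i)
            if (!bits_.test(slot(key, i))) return false;
        return true;  // possible false positive, never a false negative
    }
private:
    static constexpr std::size_t M = 1 << 16;   // filter size in bits
    std::size_t slot(const std::string& key, int seed) const {
        return std::hash<std::string>{}(key + '#' + std::to_string(seed)) % M;
    }
    int k_;
    std::bitset<M> bits_;
};

int main() {
    // 70 equal-width buckets over a numeric attribute, echoing the 70-partition runs.
    const double lo = 0.0, hi = 700.0;
    const int buckets = 70;
    auto bucketOf = [&](double v) {
        return static_cast<int>((v - lo) / (hi - lo) * buckets);
    };

    BloomFilter index;
    std::vector<std::pair<int, double>> rows = {{1, 36.5}, {2, 120.0}, {3, 612.3}};
    for (const auto& row : rows)
        index.add("bucket:" + std::to_string(bucketOf(row.second)));  // only bucket ids leave the client

    // Server-side pre-filter for "value = 120.0": fetch and decrypt only if the bucket may exist.
    double probe = 120.0;
    if (index.mightContain("bucket:" + std::to_string(bucketOf(probe))))
        std::cout << "candidate bucket " << bucketOf(probe) << " -> decrypt matching rows\n";
}
```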

Relevance: 10.00%

Abstract:

Enterprise Systems (ES) have emerged as possibly the most important and challenging development in the corporate use of information technology in the last decade. Organizations have invested heavily in these large, integrated application software suites expecting improvements in business processes, management of expenditure, customer service and, more generally, competitiveness, as well as improved access to better information/knowledge (i.e., business intelligence and analytics). Forrester survey data consistently shows that investment in ES and enterprise applications in general remains the top IT spending priority, with the ES market estimated at $38 billion and predicted to grow at a steady rate of 6.9%, reaching $50 billion by 2012 (Wang & Hamerman, 2008). Yet organizations have failed to realize all the anticipated benefits. One of the key reasons is the inability of employees to properly utilize the capabilities of the enterprise systems to complete their work and extract information critical to decision making. In response, universities (tertiary institutes) have developed academic programs aimed at addressing the skill gaps. In parallel with the proliferation of ES, there has been growing recognition of the importance of teaching Enterprise Systems at tertiary education institutes. Many academic papers have discussed the important role of Enterprise Systems curricula at tertiary education institutes (Ask, 2008; Hawking, 2004; Stewart, 2001), covering the teaching philosophies, teaching approaches and challenges in Enterprise Systems education. Following the global trends, tertiary institutes in the Pacific-Asian region commenced introducing Enterprise Systems curricula in the late 1990s with a range of subjects (a subject represents a single unit, rather than a collection of units, which we refer to as a course) in faculties, schools and departments of Information Technology, Business and, in some cases, Engineering. Many tertiary institutes commenced their initial subject offerings around four salient concepts of Enterprise Systems: (1) Enterprise Systems implementations, (2) introductions to core modules of Enterprise Systems, (3) application customization using a programming language (e.g. ABAP) and (4) systems administration. While universities have come a long way in developing curricula in the enterprise systems area, many obstacles remain: the high cost of technology, qualified faculty to teach, lack of teaching materials, etc.

Relevance: 10.00%

Abstract:

Optimising the container transfer schedule at multimodal terminals is known to be NP-hard, which implies that finding the best solution becomes computationally infeasible as problem size increases. Genetic Algorithm (GA) techniques are used to reduce container handling/transfer times and ships' time at the port by speeding up handling operations. The GA is chosen because relatively good results have been reported even for the simplest GA implementations, which obtain near-optimal solutions in reasonable time. Also discussed is the application of the model to assess the consequences of increased scheduled throughput time, as well as different strategies such as alternative plant layouts, storage policies and numbers of yard machines. A real data set is used for the solution, and a subsequent sensitivity analysis is applied to the alternative plant layouts, storage policies and numbers of yard machines.
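
A minimal sketch of the kind of GA described above, assuming permutation-encoded transfer sequences, tournament selection, order crossover and swap mutation; the job count, transfer-time matrix and GA parameters are illustrative choices, not taken from the paper.

```cpp
// Toy GA for sequencing container transfers: permutation chromosomes,
// size-2 tournament selection, order crossover (OX), swap mutation.
#include <algorithm>
#include <iostream>
#include <numeric>
#include <random>
#include <vector>

using Tour = std::vector<int>;
std::mt19937 rng(42);

double cost(const Tour& t, const std::vector<std::vector<double>>& d) {
    double c = 0.0;                       // total crane travel/handling time proxy
    for (std::size_t i = 1; i < t.size(); ++i) c += d[t[i - 1]][t[i]];
    return c;
}

Tour orderCrossover(const Tour& a, const Tour& b) {
    std::uniform_int_distribution<int> pick(0, (int)a.size() - 1);
    int lo = pick(rng), hi = pick(rng);
    if (lo > hi) std::swap(lo, hi);
    Tour child(a.size(), -1);
    std::vector<bool> used(a.size(), false);
    for (int i = lo; i <= hi; ++i) { child[i] = a[i]; used[a[i]] = true; }
    int j = 0;
    for (int gene : b)                    // fill the remaining slots in b's order
        if (!used[gene]) { while (child[j] != -1) ++j; child[j] = gene; }
    return child;
}

int main() {
    const int jobs = 20, popSize = 60, generations = 300;
    std::uniform_real_distribution<double> dist(1.0, 10.0);
    std::vector<std::vector<double>> d(jobs, std::vector<double>(jobs));
    for (auto& row : d) for (auto& v : row) v = dist(rng);   // toy transfer times

    Tour base(jobs); std::iota(base.begin(), base.end(), 0);
    std::vector<Tour> pop(popSize, base);
    for (auto& t : pop) std::shuffle(t.begin(), t.end(), rng);

    std::uniform_int_distribution<int> pickInd(0, popSize - 1), pickGene(0, jobs - 1);
    for (int g = 0; g < generations; ++g) {
        std::vector<Tour> next;
        while ((int)next.size() < popSize) {
            auto tournament = [&] {                          // keep the fitter of two random parents
                const Tour& x = pop[pickInd(rng)];
                const Tour& y = pop[pickInd(rng)];
                return cost(x, d) < cost(y, d) ? x : y;
            };
            Tour child = orderCrossover(tournament(), tournament());
            std::swap(child[pickGene(rng)], child[pickGene(rng)]);   // swap mutation
            next.push_back(std::move(child));
        }
        pop = std::move(next);
    }
    auto best = *std::min_element(pop.begin(), pop.end(),
        [&](const Tour& x, const Tour& y) { return cost(x, d) < cost(y, d); });
    std::cout << "best transfer-time proxy: " << cost(best, d) << '\n';
}
```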

Relevance: 10.00%

Abstract:

Thermal-infrared imagery is relatively robust to many of the failure conditions of visual and laser-based SLAM systems, such as fog, dust and smoke. The ability to use thermal-infrared video for localization is therefore highly appealing for many applications. However, operating in thermal-infrared is beyond the capacity of existing SLAM implementations. This paper presents the first known monocular SLAM system designed and tested for hand-held use in the thermal-infrared modality. The implementation includes a flexible feature detection layer able to achieve robust feature tracking in high-noise, low-texture thermal images. A novel approach for structure initialization is also presented. The system is robust to irregular motion and capable of handling the unique mechanical shutter interruptions common to thermal-infrared cameras. The evaluation demonstrates promising performance of the algorithm in several environments.

Relevance: 10.00%

Abstract:

Workflow patterns have been recognized as the theoretical basis for modeling recurring problems in workflow systems. One class of workflow patterns, known as the resource patterns, characterises the behaviour of resources in workflow systems. Despite the fact that many resource patterns have been discovered, they are still excluded from many workflow system implementations. One reason could be the obscurity of the behaviour of, and interaction between, resources and a workflow management system. We therefore provide a modelling and visualization approach for the resource patterns, enabling a resource behaviour modeller to intuitively see the specific resource patterns involved in the lifecycle of a work item. We believe this research can be extended to benefit not only workflow modelling, but also other applications, such as model validation, human resource behaviour modelling and workflow model visualization.

Relevance: 10.00%

Abstract:

This paper presents a combined structure for using real, complex, and binary valued vectors for semantic representation. The theory, implementation, and application of this structure are all significant. For the theory underlying quantum interaction, it is important to develop a core set of mathematical operators that describe systems of information, just as core mathematical operators in quantum mechanics are used to describe the behavior of physical systems. The system described in this paper enables us to compare more traditional quantum mechanical models (which use complex state vectors) with more generalized quantum models that use real and binary vectors. The implementation of such a system presents fundamental computational challenges: for large and sometimes sparse datasets, the demands on time and space differ for real, complex, and binary vectors. To accommodate these demands, the Semantic Vectors package has been carefully adapted and can now switch between different number types relatively seamlessly. This paper describes the key abstract operations in our semantic vector models and their implementations for real, complex, and binary vectors. We also discuss some of the key questions that arise in the field of quantum interaction and informatics, explaining how the wide availability of modelling options for different number fields will help to investigate some of these questions.
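
The following sketch (not the Semantic Vectors package API) illustrates why a common set of abstract operations across number types is useful: the same superposition and similarity operations are specialised for real-valued and binary vectors, with the binary superposition approximated here by a bitwise OR.

```cpp
// Hedged illustration: one abstract interface (superpose, similarity),
// two number types. Dimensionality and the OR-based binary superposition
// are simplifying assumptions for the example.
#include <bitset>
#include <cmath>
#include <iostream>
#include <vector>

constexpr std::size_t D = 1024;               // illustrative dimensionality

// Real-valued vectors: superposition = elementwise sum, similarity = cosine.
using RealVec = std::vector<double>;
RealVec superpose(const RealVec& a, const RealVec& b) {
    RealVec r(D);
    for (std::size_t i = 0; i < D; ++i) r[i] = a[i] + b[i];
    return r;
}
double similarity(const RealVec& a, const RealVec& b) {
    double dot = 0, na = 0, nb = 0;
    for (std::size_t i = 0; i < D; ++i) { dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i]; }
    return dot / (std::sqrt(na) * std::sqrt(nb));
}

// Binary vectors: superposition approximated by OR, similarity from Hamming distance.
using BinVec = std::bitset<D>;
BinVec superpose(const BinVec& a, const BinVec& b) { return a | b; }
double similarity(const BinVec& a, const BinVec& b) {
    return 1.0 - static_cast<double>((a ^ b).count()) / D;   // 1 = identical, 0 = complementary
}

int main() {
    RealVec u(D, 0.5), v(D, 0.25);
    BinVec p, q; p.set(3); q.set(3); q.set(7);
    std::cout << "real cosine:    " << similarity(superpose(u, v), u) << '\n'
              << "binary overlap: " << similarity(superpose(p, q), p) << '\n';
}
```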

Relevance: 10.00%

Abstract:

Building prefabrication is known as Industrialised Building Systems (IBS) in Malaysia. This construction method possesses unique characteristics that are central to sustainable construction. For example, offsite construction enables efficient management of construction wastage by identifying the major causes of waste arising during both the design and construction stages; these causes may then be eliminated through process improvement in IBS component manufacturing. However, current decisions on using IBS are typically financially driven, which hinders wider adoption. In addition, current misconceptions about IBS and the failure of rating schemes to evaluate the sustainability of IBS affect its implementation. A new approach is required to provide stakeholders with a better understanding of the sustainability potential of IBS. Such an approach should also help project the outcomes of each level of decision-making in response to social, economic and environmental challenges. This paper presents interim findings of research aimed at developing a framework for sustainable IBS development and suggests a more holistic approach to achieving sustainability. A framework for embedding sustainability factors is considered across three main phases of IBS construction: (1) pre-construction, (2) construction and (3) post-construction. SWOT analysis was used to evaluate the strengths, weaknesses, opportunities and threats involved in IBS implementations, and action plans are formulated from the analysis of sustainability objectives. This approach shows where and how sustainability should be integrated to improve IBS construction. A mix of quantitative and qualitative methods was used in this research to explore the potential of IBS for integrating sustainability; the tools used in the study are questionnaires and semi-structured interviews. Outcomes from these tools led to the identification of viable approaches involving 18 critical factors for improving sustainability in IBS construction. Finally, guidelines for decision-making are being developed to provide a useful source of information and support, to the mutual benefit of stakeholders, for integrating sustainability issues and concepts into IBS applications.

Relevance: 10.00%

Abstract:

Software development and Web site development techniques have evolved significantly over the past 20 years. The relatively young Web application development area has borrowed heavily from traditional software development methodologies, primarily due to similarities in the areas of data persistence and user interface (UI) design. Recent developments in this area propose a new Web Modeling Language (WebML) to address the nuances specific to Web development. WebML is one of a number of implementations designed to enable modeling of Web site interaction flows while remaining extensible to accommodate new features in Web site development into the future. Our research aims to extend WebML with a focus on stigmergy, a biological term originally used to describe coordination between insects. We see design features in existing Web sites that mimic stigmergic mechanisms as part of the UI, and we believe that we can synthesize and embed stigmergy in Web 2.0 sites. This paper focuses on the sub-topics of site UI design and the stigmergic mechanism designs required to achieve this.

Relevance: 10.00%

Abstract:

Fast calculation of quantities such as in-cylinder volume and indicated power is important in internal combustion engine research. Multiple channels of data, including crank angle and pressure, were collected for this purpose using a fully instrumented diesel engine research facility. Existing methods use software to post-process the data, first calculating volume from crank angle, then calculating the indicated work and indicated power from the area enclosed by the pressure-volume indicator diagram. Instead, this work investigates the feasibility of real-time calculation of volume and power via hardware implementation on Field Programmable Gate Arrays (FPGAs). Alternative hardware implementations were investigated using lookup tables, Taylor series methods or the CORDIC (COordinate Rotation DIgital Computer) algorithm to compute the trigonometric operations in the crank-angle-to-volume calculation, and the CORDIC algorithm was found to use the least resources. Simulation of the hardware-based implementation showed that the error in the volume and indicated power is less than 0.1%.
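
As an illustration of the CORDIC-based approach (a sketch, not the FPGA design itself), the C++ below computes the sine and cosine of the crank angle with a 16-iteration CORDIC rotation and feeds them into the standard slider-crank volume relation. The engine geometry values are assumed, and a real FPGA implementation would use fixed-point arithmetic and a small arctangent ROM rather than floating point.

```cpp
// CORDIC rotation-mode sin/cos feeding the standard slider-crank volume formula.
// Geometry values are illustrative, not the instrumented engine's.
#include <cmath>
#include <iostream>

const double kPi = std::acos(-1.0);

// 16-iteration CORDIC for angles in [-pi/2, pi/2].
void cordicSinCos(double theta, double& s, double& c) {
    const int N = 16;
    double x = 0.6072529350088812, y = 0.0, z = theta;   // x0 = 1/K compensates the CORDIC gain
    double powTwo = 1.0;
    for (int i = 0; i < N; ++i, powTwo *= 0.5) {
        double d = (z >= 0.0) ? 1.0 : -1.0;
        double xNew = x - d * y * powTwo;
        y += d * x * powTwo;
        x = xNew;
        z -= d * std::atan(powTwo);      // on an FPGA these arctans come from a small ROM
    }
    c = x;
    s = y;
}

// Range-reduce an arbitrary crank angle to [-pi/2, pi/2] before calling CORDIC.
void sinCos(double theta, double& s, double& c) {
    theta = std::remainder(theta, 2.0 * kPi);
    if (theta > kPi / 2.0)       { cordicSinCos(theta - kPi, s, c); s = -s; c = -c; }
    else if (theta < -kPi / 2.0) { cordicSinCos(theta + kPi, s, c); s = -s; c = -c; }
    else                         { cordicSinCos(theta, s, c); }
}

int main() {
    // Illustrative geometry: bore, crank radius, conrod length, clearance volume (SI units).
    const double bore = 0.084, a = 0.0445, l = 0.147, Vc = 4.0e-5;
    const double area = kPi * bore * bore / 4.0;

    for (double deg : {0.0, 90.0, 180.0, 360.0}) {
        double s, c;
        sinCos(deg * kPi / 180.0, s, c);
        // Slider-crank piston displacement from TDC, then clearance + swept volume.
        double disp = a * (1.0 - c) + l - std::sqrt(l * l - a * a * s * s);
        std::cout << deg << " deg -> V = " << Vc + area * disp << " m^3\n";
    }
}
```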

Relevance: 10.00%

Abstract:

This paper is based on an Australian Learning & Teaching Council (ALTC) funded evaluation, in 13 universities across Australia and New Zealand, of the use of Engineers Without Borders (EWB) projects in first-year engineering courses. All of the partner institutions have implemented this innovation differently, and comparison of these implementations affords us the opportunity to assemble "a body of carefully gathered data that provides evidence of which approaches work for which students in which learning environments". This study used a mixed-methods data collection approach and a realist analysis. Data was collected through program logic analysis with course co-ordinators, observation of classes, focus groups with students, an exit survey of students and interviews with staff, as well as scrutiny of relevant course and curriculum documents. Course designers and co-ordinators gave us a range of reasons for using the projects, most of which alluded to their presumed capacity to deliver experience in, and learning of, higher-order thinking skills in areas such as sustainability, ethics, teamwork and communication. For some students, however, the nature of the projects decreased their interest in issues such as ethical development, sustainability and how to work in teams. We also found that the projects provoked different responses from students depending on the nature of the courses in which they were embedded (general introduction, design, communication, or problem-solving courses) and their mode of delivery (lecture, workshop or online).

Relevance: 10.00%

Abstract:

A Delay Tolerant Network (DTN) is one in which nodes can be highly mobile and message delay times can be long, forming dynamic and fragmented networks. Traditional centralised network security is difficult to implement in such a network; therefore, distributed security solutions are more desirable in DTN implementations. Establishing effective trust in distributed systems with no centralised Public Key Infrastructure (PKI), such as the Pretty Good Privacy (PGP) scheme, usually requires human intervention. Our aim is to build and compare different decentralised trust systems for implementation in autonomous DTN systems. In this paper, we utilise a key distribution model based on the Web of Trust principle and employ a simple 'leverage of common friends' trust system to establish initial trust in autonomous DTNs. We compare this system with two other methods of autonomously establishing initial trust by introducing a malicious node and measuring the distribution of malicious and fake keys. Our results show that the new trust system not only reduces the distribution of fake and malicious keys by 40% by the end of the simulation, but also improves key distribution between nodes. This paper contributes a comparison of three decentralised trust systems that can be employed in autonomous DTN systems.
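
A hedged sketch of a 'leverage of common friends' rule of the kind described above (the exact scheme in the paper may differ): a node accepts an unknown key only when enough of its already-trusted contacts vouch for the same key owner.

```cpp
// Illustrative common-friends trust rule; threshold and data layout are assumptions.
#include <iostream>
#include <map>
#include <set>
#include <string>

struct Node {
    std::string id;
    std::set<std::string> trusted;                          // ids whose keys we already hold
    std::map<std::string, std::set<std::string>> vouches;   // key owner -> ids vouching for it

    void receiveVouch(const std::string& owner, const std::string& fromId) {
        vouches[owner].insert(fromId);
    }
    bool acceptKey(const std::string& owner, std::size_t threshold) const {
        auto it = vouches.find(owner);
        if (it == vouches.end()) return false;
        std::size_t commonFriends = 0;
        for (const auto& friendId : it->second)
            if (trusted.count(friendId)) ++commonFriends;   // only already-trusted vouchers count
        return commonFriends >= threshold;
    }
};

int main() {
    Node a{"A", {"B", "C", "D"}, {}};
    a.receiveVouch("X", "B");          // B and C both vouch for X's key
    a.receiveVouch("X", "C");
    a.receiveVouch("M", "E");          // only the unknown node E vouches for M (possible fake key)
    std::cout << std::boolalpha
              << "accept X: " << a.acceptKey("X", 2) << '\n'   // true
              << "accept M: " << a.acceptKey("M", 2) << '\n';  // false
}
```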

Relevance: 10.00%

Abstract:

Recent decades have witnessed a global acceleration of legislative and private sector initiatives to deal with cross-border insolvency. Legislative institutions include the various national implementations of the Model Law on Cross-Border Insolvency (Model Law) published by the United Nations Commission on International Trade Law (UNCITRAL). Private mechanisms include cross-border protocols developed and utilised by insolvency professionals and their advisers (often with the imprimatur of the judiciary), on both general and ad hoc bases. The Asia Pacific region has not escaped the effect of those developments, and the economic turmoil of the past few years has provided an early test for some of the emerging initiatives in that region. This two-part article explores the operation of those institutions through the medium of three recent cases.

Relevance: 10.00%

Abstract:

The R statistical environment and language has demonstrated particular strengths for interactive development of statistical algorithms, as well as data modelling and visualisation. Its current implementation has an interpreter at its core, which may result in a performance penalty compared with directly executing user algorithms in the native machine code of the host CPU. In contrast, the C++ language has no built-in visualisation capabilities, linear algebra handling or even basic statistical algorithms; however, user programs are compiled to high-performance machine code ahead of execution. A new method avoids possible speed penalties in R by using the Rcpp extension package in conjunction with the Armadillo C++ matrix library. In addition to the inherent performance advantages of compiled code, Armadillo provides an easy-to-use template-based meta-programming framework that allows several linear algebra operations to be automatically pooled into one, which in turn can lead to further speedups. With the aid of Rcpp and Armadillo, conversion of linear-algebra-centered algorithms from R to C++ becomes straightforward. The algorithms retain their overall structure and readability, all while maintaining a bidirectional link with the host R environment. Empirical timing comparisons of R and C++ implementations of a Kalman filtering algorithm indicate a speedup of several orders of magnitude.
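
For illustration, a generic textbook Kalman filter written directly against Armadillo (not the benchmark code from the paper) shows how closely the C++ stays to the mathematical notation; with RcppArmadillo the same body can be exposed to R largely unchanged.

```cpp
// Minimal Armadillo sketch: one-dimensional constant-velocity Kalman filter.
// Model matrices and measurements are illustrative values.
#include <armadillo>
#include <iostream>

int main() {
    using namespace arma;

    double dt = 0.1;
    mat F = {{1.0, dt}, {0.0, 1.0}};   // state transition (position, velocity)
    mat H = {{1.0, 0.0}};              // we observe position only
    mat Q = 0.01 * eye(2, 2);          // process noise covariance
    mat R = {{0.25}};                  // measurement noise covariance

    vec x = {0.0, 1.0};                // initial state estimate
    mat P = eye(2, 2);                 // initial covariance

    vec measurements = {0.11, 0.23, 0.28, 0.42, 0.51};
    for (double z : measurements) {
        // Predict
        x = F * x;
        P = F * P * F.t() + Q;
        // Update
        mat S = H * P * H.t() + R;
        mat K = P * H.t() * inv(S);    // Kalman gain (S is 1x1 here)
        x = x + K * (vec{z} - H * x);
        P = (eye(2, 2) - K * H) * P;
    }
    x.print("filtered state:");
}
```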

Relevance: 10.00%

Abstract:

Recent work on the numerical solution of stochastic differential equations (SDEs) has focused on the development of numerical methods with good stability and order properties. These numerical implementations have used fixed stepsizes, but there are many situations in which a fixed stepsize is not appropriate. In the numerical solution of ordinary differential equations, much work has been carried out on developing robust implementation techniques using variable stepsize. In the deterministic case it has been necessary to consider the "best" choice of initial stepsize, as well as to develop effective strategies for stepsize control; the same, of course, must be done in the stochastic case. In this paper, proportional integral (PI) control is applied to a variable stepsize implementation of an embedded pair of stochastic Runge-Kutta methods used to obtain numerical solutions of nonstiff SDEs. For stiff SDEs, the embedded pair of the balanced Milstein and balanced implicit method is implemented in variable stepsize mode using a predictive controller for the stepsize change. The extension of these stepsize controllers, viewed from a digital filter theory point of view, to PI with derivative (PID) control is also implemented. The implementations show the improvement in efficiency that can be attained when using these control theory approaches compared with the regular stepsize change strategy.
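
A small sketch of the PI stepsize rule in isolation, with illustrative gains and safety limits; the embedded stochastic Runge-Kutta pair, the Brownian path handling on rejected steps and the PID extension discussed above are deliberately left out.

```cpp
// PI step-size controller sketch: gains, tolerance and the error stream are
// illustrative assumptions, not values from the paper.
#include <algorithm>
#include <cmath>
#include <iostream>

struct PIController {
    double tol, kI, kP;
    double prevErr = 1.0;                       // err_{n-1}, initialised neutrally

    // Factor by which to scale the step size after a step whose local error
    // estimate (e.g. |higher-order - lower-order solution|) is err.
    double factor(double err) {
        err = std::max(err, 1e-14);             // guard against a zero estimate
        double f = std::pow(tol / err, kI) * std::pow(prevErr / err, kP);
        prevErr = err;
        return std::clamp(f, 0.2, 5.0);         // limit step-size jumps
    }
};

int main() {
    PIController ctrl{1e-4, 0.3, 0.4};          // gains are illustrative choices
    double h = 1e-2;

    // Pretend stream of local error estimates from an embedded pair
    // (a real solver would redo each rejected step with the reduced h).
    for (double err : {2e-4, 9e-5, 3e-5, 1.2e-4, 8e-5}) {
        bool accept = err <= ctrl.tol;
        h *= ctrl.factor(err);
        std::cout << (accept ? "accept" : "reject") << ", next h = " << h << '\n';
    }
}
```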