859 results for Time-sharing computer systems.


Relevance: 100.00%

Abstract:

Granulation is one of the fundamental operations in particulate processing; it has a very ancient history and widespread use. A great deal of fundamental particle science has been done in the last two decades to help understand the underlying phenomena. Yet until recently, the development of granulation systems was mostly based on popular practice. The use of process systems approaches to the integrated understanding of these operations is providing improved insight into the complex nature of the processes. Improved mathematical representations, new solution techniques and the application of the models to industrial processes are yielding better designs, improved optimisation and tighter control of these systems. The parallel development of advanced instrumentation and the use of inferential approaches provide real-time access to the system parameters necessary for improvements in operation. The use of advanced models to help develop real-time plant diagnostic systems provides further evidence of the utility of process systems approaches to granulation processes. This paper highlights some of those aspects of granulation. (c) 2005 Elsevier Ltd. All rights reserved.
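The abstract mentions improved mathematical representations without reproducing them. For orientation only (this equation is not quoted from the paper), the population balance with a Smoluchowski-type coalescence kernel is the usual starting point for granulation modelling:

\frac{\partial n(v,t)}{\partial t} = \frac{1}{2}\int_0^v \beta(v-u,u)\, n(v-u,t)\, n(u,t)\, du \; - \; n(v,t)\int_0^\infty \beta(v,u)\, n(u,t)\, du

where n(v,t) is the number density of granules of volume v and \beta is the coalescence kernel describing the rate at which granules of volumes v and u aggregate.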

Relevance: 100.00%

Abstract:

This research is concerned with the development of distributed real-time systems, in which software is used for the control of concurrent physical processes. These distributed control systems are required to periodically coordinate the operation of several autonomous physical processes, with the property of an atomic action. The implementation of this coordination must be fault-tolerant if the integrity of the system is to be maintained in the presence of processor or communication failures. Commit protocols have been widely used to provide this type of atomicity and ensure consistency in distributed computer systems. The objective of this research is the development of a class of robust commit protocols, applicable to the coordination of distributed real-time control systems. Extended forms of the standard two-phase commit protocol that provide fault-tolerant and real-time behaviour were developed. Petri nets are used for the design of the distributed controllers, and to embed the commit protocol models within these controller designs. This composition of controller and protocol model allows the complete system to be analysed in a unified manner. A common problem for Petri-net-based techniques is state space explosion; a modular approach to both design and analysis helps to cope with this problem. Although extensions to Petri nets that allow module construction exist, the modularisation is generally restricted to the specification, and analysis must still be performed on the (flat) detailed net. The Petri net designs for the type of distributed systems considered in this research are both large and complex. The top-down, bottom-up and hybrid synthesis techniques used to model large systems in Petri nets are considered, and a hybrid approach to Petri net design for a restricted class of communicating processes is developed. Designs produced using this hybrid approach are modular and allow re-use of verified modules. In order to use this form of modular analysis, it is necessary to project an equivalent but reduced behaviour onto the modules used. These projections conceal events local to modules that are not essential for the purpose of analysis. To generate the external behaviour, each firing sequence of the subnet is replaced by an atomic transition internal to the module, and the firing of these transitions transforms the input and output markings of the module. Local events are thus concealed through the projection of the external behaviour of modules. This hybrid design approach preserves properties of interest, such as boundedness and liveness, while the systematic concealment of local events keeps the state space manageable. The approach presented in this research is particularly suited to distributed systems, as the underlying communication model is used as the basis for the interconnection of modules in the design procedure. The hybrid approach is applied to the Petri-net-based design and analysis of distributed controllers for two industrial applications that incorporate the robust, real-time commit protocols developed. Temporal Petri nets, which combine Petri nets and temporal logic, are used to capture and verify causal and temporal aspects of the designs in a unified manner.
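As a point of reference for the protocol the thesis extends, the following is a minimal sketch of the standard two-phase commit, with a vote-phase timeout standing in for the fault-tolerant, real-time behaviour the thesis adds. Class and method names are illustrative, not taken from the thesis.

# Minimal two-phase commit coordinator (illustrative sketch, not the
# thesis's extended protocol). A timeout on the vote phase stands in
# for the real-time, fault-tolerant behaviour the thesis adds.
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FutureTimeout

class Participant:
    def __init__(self, name, will_commit=True):
        self.name = name
        self.will_commit = will_commit

    def prepare(self):
        # Phase 1: the participant votes COMMIT (True) or ABORT (False).
        return self.will_commit

    def commit(self):
        print(f"{self.name}: committed")

    def abort(self):
        print(f"{self.name}: aborted")

def two_phase_commit(participants, timeout=1.0):
    # Phase 1 (voting): collect votes, treating a timeout as an ABORT vote,
    # so a crashed participant cannot block the coordinator indefinitely.
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(p.prepare) for p in participants]
        try:
            votes = [f.result(timeout=timeout) for f in futures]
        except FutureTimeout:
            votes = [False]
    decision = all(votes)
    # Phase 2 (decision): broadcast the global commit/abort decision.
    for p in participants:
        p.commit() if decision else p.abort()
    return decision

if __name__ == "__main__":
    ok = two_phase_commit([Participant("ctrl-A"), Participant("ctrl-B")])
    print("global decision:", "COMMIT" if ok else "ABORT")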

Relevance: 100.00%

Abstract:

This thesis is a study of performance management of Complex Event Processing (CEP) systems. CEP systems have characteristics distinct from other well-studied computer systems, such as batch and online transaction processing systems and database-centric applications, and these characteristics introduce new challenges and opportunities for their performance management. The methodologies used to benchmark CEP systems in many performance studies focus on scaling the injected load but do not consider the impact of the functional capabilities of CEP systems. This thesis proposes evaluating the performance of CEP engines' functional behaviours on events and develops a benchmark platform for CEP systems, CEPBen. The CEPBen benchmark platform is developed to explore the fundamental functional performance of event processing systems: filtering, transformation and event pattern detection. It is also designed to provide a flexible environment for exploring new metrics and influential factors for CEP systems and for evaluating their performance. Studies on factors and new metrics are carried out using the CEPBen platform on Esper. Different measurement points for response time in the performance management of CEP systems are discussed, and the response time of a targeted event is proposed as a quality-of-service metric, to be used in combination with the traditional response time of CEP systems. Maximum query load is proposed as a capacity indicator reflecting the complexity of queries, and the number of live objects in memory as a performance indicator for memory management. Query depth is studied as a factor that influences CEP system performance.
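Esper queries are normally written in its EPL language; the sketch below instead illustrates, in plain Python, the three functional behaviours CEPBen exercises. The event fields, thresholds and pattern are assumptions for illustration, not CEPBen's actual workload.

# Illustrative sketch of the three CEP behaviours CEPBen benchmarks:
# filtering, transformation, and event pattern detection.
from dataclasses import dataclass
from typing import Iterator

@dataclass
class Event:
    sensor: str
    value: float

def filter_events(stream: Iterator[Event], threshold: float) -> Iterator[Event]:
    # Filtering: pass through only events above a threshold.
    return (e for e in stream if e.value > threshold)

def transform(stream: Iterator[Event]) -> Iterator[Event]:
    # Transformation: derive a new event from each input event (C -> F).
    return (Event(e.sensor, e.value * 9 / 5 + 32) for e in stream)

def detect_pattern(stream: Iterator[Event], n: int = 3) -> bool:
    # Pattern detection: fire when n consecutive events rise monotonically.
    run, prev = 1, None
    for e in stream:
        run = run + 1 if prev is not None and e.value > prev else 1
        if run >= n:
            return True
        prev = e.value
    return False

events = [Event("t1", v) for v in (20.0, 21.5, 23.0, 22.0)]
print(detect_pattern(iter(events)))  # True: 20.0 < 21.5 < 23.0
print([e.value for e in transform(filter_events(iter(events), 21.0))])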

Relevance: 100.00%

Abstract:

Physiological signals, which are controlled by the autonomic nervous system (ANS), could be used to detect the affective state of computer users and therefore find applications in medicine and engineering. The Pupil Diameter (PD) seems to provide a strong indication of the affective state, as found by previous research, but it has not yet been investigated fully. In this study, new approaches based on monitoring and processing the PD signal for off-line and on-line affective assessment ("relaxation" vs. "stress") are proposed. Wavelet denoising and Kalman filtering methods are first used to remove abrupt changes in the raw PD signal. Three features (PDmean, PDmax and PDWalsh) are then extracted from the preprocessed PD signal for affective state classification. In order to select more relevant and reliable physiological data for further analysis, two types of data selection methods are applied, based on the paired t-test and on subject self-evaluation, respectively. In addition, five different classifiers are implemented on the selected data, achieving average accuracies of up to 86.43% and 87.20% on the two data subsets, respectively. Finally, the receiver operating characteristic (ROC) curve is used to investigate the discriminating potential of each individual feature by evaluating the area under the ROC curve, which reaches values above 0.90. For the on-line affective assessment, a hard threshold is first applied to remove eye blinks from the PD signal, and a moving average window is then used to obtain a representative value PDr for every one-second interval of PD. The on-line affective assessment algorithm has three main steps: preparation, feature-based decision voting and affective determination. The final results show accuracies of 72.30% and 73.55% for the data subsets chosen with the two data selection methods (paired t-test and subject self-evaluation, respectively). In order to further analyse the efficiency of affective recognition through the PD signal, the Galvanic Skin Response (GSR) was also monitored and processed. The highest affective assessment classification rate obtained from GSR processing is only 63.57% (based on the off-line processing algorithm). The overall results confirm that the PD signal should be considered one of the most powerful physiological signals for future automated real-time affective recognition systems, especially for detecting "relaxation" vs. "stress" states.
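A minimal sketch of the on-line preprocessing step described above: hard-threshold blink removal followed by a one-second moving average yielding PDr. The 60 Hz sampling rate and 2.0 mm blink threshold are assumed values for illustration, not those of the study.

# Sketch of the on-line PD preprocessing: a hard threshold removes blink
# artifacts, then a moving-average window yields one value PDr per second.
import numpy as np

def preprocess_pd(pd_signal, fs=60, blink_threshold=2.0):
    pd = np.asarray(pd_signal, dtype=float)
    # Hard threshold: during blinks the measured diameter collapses,
    # so samples below the threshold are replaced by interpolation.
    valid = pd >= blink_threshold
    pd = np.interp(np.arange(pd.size), np.flatnonzero(valid), pd[valid])
    # Moving average over non-overlapping one-second windows -> PDr.
    n_sec = pd.size // fs
    return pd[: n_sec * fs].reshape(n_sec, fs).mean(axis=1)

rng = np.random.default_rng(0)
signal = 4.0 + 0.1 * rng.standard_normal(180)  # 3 s of synthetic PD data
signal[30:40] = 0.5                            # simulated blink
print(preprocess_pd(signal))                   # three PDr values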

Relevance: 100.00%

Abstract:

The research described here is supported by an award made by the RCUK Digital Economy programme to the dot.rural Digital Economy Research Hub; award reference: EP/G066051/1.

Relevance: 100.00%

Abstract:

Interacting with a computer system in the operating room (OR) can be a frustrating experience for a surgeon, who currently has to verbally delegate every computer interaction task to an assistant. This indirect mode of interaction is time-consuming, error-prone and can lead to poor usability of OR computer systems. This thesis describes the design and evaluation of a joystick-like device that allows direct surgeon control of the computer in the OR. The device was tested extensively, in comparison with a mouse and with delegated dictation, by seven surgeons, eleven residents and five graduate students. The device contains no electronic parts, is easy to use, is unobtrusive, has no physical connection to the computer and makes use of an existing tool in the OR. We performed a user study to determine its effectiveness in allowing a user to perform all the tasks they would be expected to perform on an OR computer system during a computer-assisted surgery. Dictation was found to be superior to the joystick in quantitative measures, but the joystick was preferred over dictation in user satisfaction responses. The mouse outperformed both the joystick and dictation, but it is not a readily accepted modality in the OR.

Relevance: 100.00%

Abstract:

Monitoring multiple myeloma patients for relapse requires sensitive methods to measure minimal residual disease and to establish a more precise prognosis. The present study aimed to standardize a real-time quantitative polymerase chain reaction (PCR) test for the IgH gene with a JH consensus self-quenched fluorescence reverse primer and a VDJH or DJH allele-specific sense primer (self-quenched PCR). This method was compared with an allele-specific real-time quantitative PCR test for the IgH gene using a TaqMan probe and a JH consensus primer (TaqMan PCR). We studied nine multiple myeloma patients from the Spanish group treated with the MM2000 therapeutic protocol. Self-quenched PCR demonstrated sensitivity of ≥10^-4 or 16 genomes in most cases; efficiency was 1.71 to 2.14, and intra-assay and interassay reproducibilities were 1.18% and 0.75%, respectively. Sensitivity, efficiency, and residual disease detection were similar with both PCR methods. TaqMan PCR failed in one case because of a mutation in the JH primer binding site, whereas self-quenched PCR worked well in this case. In conclusion, self-quenched PCR is a sensitive and reproducible method for quantifying residual disease in multiple myeloma patients; it yields results similar to TaqMan PCR and may be more effective than the latter when somatic mutations are present in the JH intronic primer binding site.
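For readers unfamiliar with the efficiency figures quoted above (1.71 to 2.14): in real-time quantitative PCR, amplification efficiency E is conventionally derived from the slope of the threshold-cycle (Ct) standard curve,

E = 10^{-1/\text{slope}},

so a slope of -3.32 gives E \approx 2.0, i.e. perfect doubling per cycle. This is the standard qPCR convention, not a formula taken from the paper itself.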

Relevance: 100.00%

Abstract:

Authentication plays an important role in how we interact with computers, mobile devices, the web, etc. The idea of authentication is to uniquely identify a user before granting access to system privileges. For example, in recent years more corporate information and applications have become accessible via the Internet and intranets. Many employees work from remote locations and need access to secure corporate files. During this time, it is possible for malicious or unauthorized users to gain access to the system. For this reason, it is logical to have some mechanism in place to detect whether the logged-in user is the same user in control of the user's session; therefore, highly secure authentication methods must be used. We posit that each of us is unique in our use of computer systems. It is this uniqueness that is leveraged to "continuously authenticate users" while they use web software. To monitor user behavior, n-gram models are used to capture user interactions with web-based software. This statistical language model essentially captures sequences and sub-sequences of user actions, their orderings, and the temporal relationships that make them unique, providing a model of how each user typically behaves. Users are then continuously monitored during software operations. Large deviations from "normal behavior" can indicate malicious or unintended behavior. This approach is implemented in a system called Intruder Detector (ID) that models user actions as embodied in the web logs generated in response to a user's actions. User identification through web logs is cost-effective and non-intrusive. We perform experiments on a large fielded system with web logs of approximately 4000 users. For these experiments, we use two classification techniques: binary and multi-class classification. We evaluate model-specific differences in user behavior based on coarse-grain (i.e., role) and fine-grain (i.e., individual) analysis. A specific set of metrics is used to provide valuable insight into how each model performs. Intruder Detector achieves accurate results when identifying legitimate users and user types, and is also able to detect outliers in role-based user behavior with optimal performance. In addition to web applications, this continuous monitoring technique can be used with other user-based systems such as mobile devices and the analysis of network traffic.
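A minimal sketch of the n-gram idea described above: build a per-user model of action sequences from web-log history, then score new sessions by how far they deviate from it. The action names and the deviation measure are illustrative assumptions, not Intruder Detector's actual implementation.

# Illustrative n-gram sketch of continuous authentication: model a user's
# action bigrams from web-log history, then flag deviating sessions.
from collections import Counter

def train_ngrams(actions, n=2):
    # Estimate n-gram probabilities from the user's action history.
    grams = [tuple(actions[i:i + n]) for i in range(len(actions) - n + 1)]
    counts = Counter(grams)
    total = sum(counts.values())
    return {g: c / total for g, c in counts.items()}

def deviation(model, actions, n=2):
    # Fraction of observed n-grams never seen in the user's history;
    # large values suggest the session is not the modelled user.
    grams = [tuple(actions[i:i + n]) for i in range(len(actions) - n + 1)]
    unseen = sum(1 for g in grams if g not in model)
    return unseen / max(len(grams), 1)

history = ["login", "search", "view", "search", "view", "logout"]
model = train_ngrams(history)
print(deviation(model, ["login", "search", "view", "logout"]))    # low: 0.0
print(deviation(model, ["login", "export", "delete", "export"]))  # high: 1.0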

Relevance: 100.00%

Abstract:

The growing demand for large-scale virtualization environments, such as those used in cloud computing, has led to a need for efficient management of computing resources. RAM is one of the most heavily demanded resources in these environments and is usually the main factor limiting the number of virtual machines that can run on a physical host. Recently, hypervisors have introduced mechanisms for transparent memory sharing between virtual machines in order to reduce the total demand for system memory. These mechanisms "merge" similar pages detected in multiple virtual machines into the same physical memory, using a copy-on-write mechanism in a manner that is transparent to the guest systems. The objective of this study is to present an overview of these mechanisms and to evaluate their performance and effectiveness. Results for two popular hypervisors (VMware and KVM) using different guest operating systems (Linux and Windows) and different workloads (synthetic and real) are presented herein. The results show significant performance differences between the hypervisors, depending on the guest system workloads and execution time.
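A toy sketch of the page-merging idea described above: identical page contents across virtual machines are backed by a single physical frame. Real hypervisor mechanisms (e.g. KVM's KSM, VMware's transparent page sharing) scan and hash 4 KiB pages and break sharing on write via copy-on-write; this sketch shows only the merging step.

# Toy model of content-based page sharing: pages with identical contents
# across VMs are mapped to one physical frame.
class SharedMemory:
    def __init__(self):
        self.frames = {}    # page content -> frame id (one physical copy)
        self.next_frame = 0

    def map_page(self, content):
        # Merge identical pages: reuse the frame if the content is known.
        if content not in self.frames:
            self.frames[content] = self.next_frame
            self.next_frame += 1
        return self.frames[content]

mem = SharedMemory()
vm1 = [mem.map_page("zeros"), mem.map_page("libc-code")]
vm2 = [mem.map_page("zeros"), mem.map_page("app-data")]
print(vm1, vm2)        # "zeros" shares one frame across both VMs
print(mem.next_frame)  # 3 physical frames backing 4 guest pages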


Relevance: 100.00%

Abstract:

The construction industry is categorised as an information-intensive industry and described as one of the most important industries in any developed country, facing a period of rapid and unparalleled change (Industry Science Resources 1999; Love P.E.D., Tucker S.N. et al. 1996). Project communications are becoming increasingly complex, with a growing need and fundamental drive to collaborate electronically at project level and beyond (Olesen K. and Myers M.D. 1999; Thorpe T. and Mead S. 2001; CITE 2003). Yet the industry is also identified as having a considerable lack of knowledge and awareness about innovative information and communication technology (ICT) and web-based communication processes, systems and solutions which may prove beneficial in the procurement, delivery and life cycle of projects (NSW Government 1998; Kajewski S. and Weippert A. 2000).

The Internet has arguably revolutionised the way in which information is stored, exchanged and viewed, opening new avenues for business which only a decade ago were deemed almost inconceivable (DCITA 1998; IIB 2002). In an attempt to put these 'new avenues of business' into perspective, this report provides an overall 'snapshot' of current public and private construction industry sector opportunities and practices in the implementation and application of web-based ICT tools, systems and processes (e-Uptake). Research found that, even with a reserved uptake, the construction industry and its participating organisations are making concerted efforts (fortunately with positive results) to take up innovative forms of doing business via the internet, including e-Tendering, which makes it possible to manage the entire tender-letting process electronically and online (Anumba C.J. and Ruikar K. 2002; ITCBP 2003). Furthermore, Government (often a key client within the construction industry), with its increased tendency to transact its business electronically, undoubtedly has an effect on how various private industry consultants, contractors, suppliers, etc. do business (Murray M. 2003), by offering a wide range of current and anticipated e-facilities and services, including e-Tendering (Ecommerce 2002).

Overall, doing business electronically is found to have a profound impact on the way today's construction businesses operate, streamlining existing processes, with the growth in innovative tools such as e-Tendering offering the construction industry new responsibilities and opportunities for all parties involved (ITCBP 2003). It is therefore important that these opportunities be accessible to as many construction industry businesses as possible (The Construction Confederation 2001). Historically, there is a considerable exchange of information between the various parties during a tendering process, where accuracy and efficiency of documentation are critical. Traditionally this process is either paper-based (involving large volumes of supporting tender documentation) or runs on a number of stand-alone, non-compatible computer systems, usually costly to both client and contractor. A standard electronic exchange format that allows all parties involved in an electronic tender process to access one system via the Internet therefore saves both time and money, eliminates transcription errors and increases the speed of bid analysis (The Construction Confederation 2001).
Supporting this research project's aims and objectives, the researchers set out to determine the construction industry's current state of play in relation to e-Tendering opportunities. The report also provides brief introductions to several Australian and international e-Tender systems identified during this investigation. e-Tendering, in its simplest form, is the electronic publishing, communicating, accessing, receiving and submitting of all tender-related information and documentation via the internet, thereby replacing the traditional paper-based tender process and achieving a more efficient and effective business process for all parties involved (NT Government 2000; NSW Department of Commerce 2003; NSW Government 2003). Although most of the e-Tender websites investigated at the time maintain that their tendering processes and capabilities are 'electronic', research shows these systems vary from reasonably advanced to more 'basic' electronic tender notification and archiving services for various industry sectors. Research also indicates an e-Tender system should have a number of basic features and capabilities, including:

• All tender documentation is distributed via a secure web-based tender system, avoiding the need for collating paperwork and couriers.
• The client/purchaser is able to upload a notice and/or invitation to tender onto the system.
• Notification is sent out electronically (usually via email) for suppliers to download the information and return their responses electronically (online).
• During the tender period, updates and queries are exchanged through the same e-Tender system.
• The client/purchaser can access the tenders only after the deadline has passed.
• All tender-related information is held in a central database, which should be easily searchable and fully audited, with all activities recorded.
• Tender documents cannot be read or submitted by unauthorised parties.
• Users of the e-Tender system are properly identified and registered via controlled access. In simple terms, security has to be as good as, if not better than, a manual tender process. Data is encrypted and users are authenticated by means such as digital signatures, electronic certificates or smartcards.
• All parties are assured that no undetected alterations can be made to any tender.
• The tenderer can amend the bid right up to the deadline, whilst the client/purchaser cannot obtain access until the submission deadline has passed.
• The e-Tender system may also include features such as a database of service providers with spreadsheet-based pricing schedules, which can make it easier for a potential tenderer to electronically prepare and analyse a tender.

Research indicates the efficiency of an e-Tender process is well supported internationally, with a significant number of similar e-Tender benefits identified during this investigation. Both construction industry and Government participants generally agree that the implementation of an automated e-Tendering process or system enhances the overall quality, timeliness and cost-effectiveness of a tender process, and provides a more streamlined method of receiving, managing and submitting tender documents than the traditional paper-based process.
On the other hand, whilst there are undoubtedly many more barriers challenging the successful implementation and adoption of an e-Tendering system or process, the researchers identified a range of challenges and perceptions that seem to hinder the uptake of this innovative approach to tendering electronically. A central concern is security when industry organisations have to use the Internet for electronic information transfer. As a result, when it comes to e-Tendering, industry participants insist that these innovative tendering systems be developed to ensure the utmost security and integrity.

Finally, if Australian organisations continue to explore the competitive dynamics of the construction industry without realising the current and future trends and benefits of adopting innovative processes such as e-Tendering, they will limit their opportunities to expand into overseas markets and allow international firms to continue successfully entering local markets. As such, the researchers believe increased knowledge, awareness and successful implementation of innovative systems and processes raise great expectations regarding their contribution towards stimulating the globalisation of electronic procurement activities and improving overall business and project performance throughout the construction industry sectors and the overall marketplace (NSW Government 2002; Harty C. 2003; Murray M. 2003; Pietroforte R. 2003). Achieving the successful integration of an innovative e-Tender solution with an existing or traditional process can be complex and, if not done correctly, could lead to failure (Bourn J. 2002).

Relevance: 100.00%

Abstract:

There are many studies that reveal the nature of design thinking and the nature of conceptual design as distinct from detailed or embodiment design. The results can assist our understanding of how the process of design can be supported and how new technologies can be introduced into the workplace. Existing studies provide limited information about the nature of collaborative design as it takes place on the ground and in the actual working context. How to provide appropriate and effective support for sharing collaborative design information across companies, countries and heterogeneous computer systems is a key issue. As data are passed between designers and the computer systems they employ, many exchanges are made. These exchanges may be used to establish measures of the benefits that new support systems can bring. Collaboration support tools represent a fast-growing section of the commercial software marketplace, and a reasonable range of products is available. Many of them are readily applicable to design, supporting distributed meetings through video and audio communications and the sharing of information, including collaborative sketching. Tools that specifically support 3D models and other design-specific features are less common, and many of those are at the prototype stage of development. A key question is to find viable ways of combining design information visualisation support with the collaboration support technologies available today. When collaborating, different views will need to be accessible at different times to all the collaborators: the architects may want to explain some ideas on their model, the structural engineers on theirs, and so on. However, there are issues of ownership when the structural engineer wants to manipulate the architect's model and vice versa. The mode of working, synchronous or asynchronous, also has a bearing, as in a synchronous session there is control of what is happening.

Relevance: 100.00%

Abstract:

Public transportation is an environment with great potential for applying location-based services through mobile devices. The BusTracker study is looking at how real-time passenger information systems can provide a core platform to improve commuters' experiences. These systems rely on mobile computing and GPS technology to provide accurate information on transport vehicle locations. BusTracker builds on this mobile computing platform and geospatial information. The pilot study is running on the open-source BugLabs computing platform, using a GPS module for accurate location information.
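The abstract does not show how BusTracker reads its GPS module; most consumer modules emit NMEA sentences, and a minimal parser for the standard $GPGGA position fix might look as follows. The sample sentence is the canonical NMEA example, not BusTracker data.

# Minimal NMEA $GPGGA parser, the sentence type consumer GPS modules
# emit for position fixes. Illustrative only; not BusTracker's code.
def parse_gga(sentence):
    fields = sentence.split(",")
    # Latitude is ddmm.mmmm, longitude is dddmm.mmmm; convert to degrees.
    lat = float(fields[2][:2]) + float(fields[2][2:]) / 60.0
    if fields[3] == "S":
        lat = -lat
    lon = float(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[5] == "W":
        lon = -lon
    return lat, lon

sample = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
print(parse_gga(sample))  # (48.1173, 11.5166...)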

Relevance: 100.00%

Abstract:

The paper provides an assessment of the performance of commercial Real Time Kinematic (RTK) systems over longer-than-recommended inter-station distances. The experiments were set up to test and analyse solutions from the i-MAX, MAX and VRS systems operated with three triangle-shaped network cells, having average inter-station distances of 69 km, 118 km and 166 km. The performance characteristics appraised included initialisation success rate, initialisation time, RTK position accuracy and availability, ambiguity resolution risk and RTK integrity risk, in order to provide a wider perspective on the performance of the tested systems.

The results showed that the performance of all network RTK solutions assessed was affected to a similar degree by the increase in inter-station distances. The MAX solution achieved the highest initialisation success rate, 96.6% on average, albeit with a longer initialisation time. The two VRS approaches achieved a lower initialisation success rate of 80% over the large triangle. In terms of RTK positioning accuracy after successful initialisation, the results indicated good agreement between the actual error growth in both horizontal and vertical components and the accuracy specified by the manufacturers as RMS and parts-per-million (ppm) values.

Additionally, the VRS approaches performed better than MAX and i-MAX when tested on the standard triangle network with a mean inter-station distance of 69 km. However, as the inter-station distance increases, the network RTK software may fail to generate the VRS correction and fall back to operating in the nearest single-base RTK (or RAW) mode. The position uncertainty occasionally exceeded 2 metres, showing that the RTK rover software was using an incorrectly fixed ambiguity solution to estimate the rover position rather than automatically dropping back to an ambiguity-float solution. The results identified that the risk of incorrectly resolving ambiguities reached 18%, 20%, 13% and 25% for i-MAX, MAX, Leica VRS and Trimble VRS respectively when operating over the large triangle network. Additionally, the Coordinate Quality indicator values given by the Leica GX1230 GG rover receiver tended to be over-optimistic and did not function well in identifying incorrectly fixed integer ambiguity solutions.

In summary, this independent assessment has identified problems and failures that can occur in all of the systems tested, especially when they are pushed beyond their recommended limits. While such failures are expected, they offer useful insights into where users should be wary and how manufacturers might improve their products. The results also demonstrate that integrity monitoring of RTK solutions is necessary for precision applications, and thus deserves serious attention from researchers and system providers.
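The RMS-plus-ppm accuracy specification referred to above combines a fixed term with a distance-proportional term. With illustrative figures (not the tested systems' actual specifications), a horizontal specification of 8 mm + 1 ppm gives, at the 69 km mean inter-station distance:

\sigma_H = a + b \cdot d = 8\,\text{mm} + 1\,\text{ppm} \times 69\,\text{km} = 8\,\text{mm} + 69\,\text{mm} = 77\,\text{mm},

since 1 ppm of 69 km is 69 \times 10^{3}\,\text{m} \times 10^{-6} = 69\,\text{mm}.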

Relevance: 100.00%

Abstract:

The uniformization method (also known as randomization) is a numerically stable algorithm for computing transient distributions of a continuous time Markov chain. When the solution is needed after a long run or when the convergence is slow, the uniformization method involves a large number of matrix-vector products. Despite this, the method remains very popular due to its ease of implementation and its reliability in many practical circumstances. Because calculating the matrix-vector product is the most time-consuming part of the method, overall efficiency in solving large-scale problems can be significantly enhanced if the matrix-vector product is made more economical. In this paper, we incorporate a new relaxation strategy into the uniformization method to compute the matrix-vector products only approximately. We analyze the error introduced by these inexact matrix-vector products and discuss strategies for refining the accuracy of the relaxation while reducing the execution cost. Numerical experiments drawn from computer systems and biological systems are given to show that significant computational savings are achieved in practical applications.
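A minimal sketch of plain uniformization as described above, without the paper's relaxed (inexact) matrix-vector products; the two-state generator is an illustrative example, not one of the paper's test problems.

# Plain uniformization for the transient distribution of a CTMC.
# Given generator Q, choose Lambda >= max_i |q_ii|, form the DTMC
# P = I + Q/Lambda, and weight powers of P by Poisson(Lambda*t) masses.
import numpy as np

def uniformization(Q, pi0, t, tol=1e-10):
    Lam = max(-Q.diagonal())           # uniformization rate Lambda
    P = np.eye(Q.shape[0]) + Q / Lam   # uniformized DTMC
    weight = np.exp(-Lam * t)          # Poisson(Lam*t) mass at k = 0
    cum = weight
    v = pi0.copy()
    result = weight * v
    k = 0
    while cum < 1.0 - tol:             # stop once the Poisson tail is < tol
        k += 1
        v = v @ P                      # the dominant cost: vector-matrix product
        weight *= Lam * t / k
        cum += weight
        result += weight * v
    return result

Q = np.array([[-2.0, 2.0], [1.0, -1.0]])
pi0 = np.array([1.0, 0.0])
print(uniformization(Q, pi0, t=1.0))   # transient distribution at t = 1

The vector-matrix product inside the loop is exactly the operation the paper proposes to compute only approximately, which is where the reported computational savings come from.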