672 results for task model

in Queensland University of Technology - ePrints Archive


Relevance:

70.00%

Publisher:

Abstract:

Multi-agent systems involve a high degree of concurrency at both the inter-agent and intra-agent levels. The Scalable, fault-tolerant Agent Grooming Environment (SAGE), a second-generation, FIPA-compliant MAS, requires a built-in mechanism to achieve both kinds of concurrency. This paper describes an attempt to provide a reliable, efficient and lightweight solution for intra-agent concurrency within the internal agent architecture of SAGE. It addresses the issues that arise when the Java threading model is used to provide this level of concurrency and proposes an alternative approach based on an event-driven, concurrent and user-scalable multi-tasking model for the agent's internals. The findings show that the proposed approach provides an efficient and lightweight concurrent task model for SAGE and considerably outperforms a Java-based multithreaded tasking model in terms of throughput and efficiency, as illustrated by a practical implementation and evaluation of both models. © 2004 IEEE.
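To make the contrast concrete, here is a minimal sketch in plain Java of the event-driven alternative: many lightweight tasks are multiplexed onto a single dispatcher thread through an event queue, instead of each task owning a Java thread. The class and method names (AgentTask, TaskScheduler, post) are illustrative assumptions, not the SAGE API.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

/**
 * Minimal sketch of an event-driven task model: many lightweight tasks are
 * multiplexed onto a single scheduler thread instead of one Java thread each.
 * Names (AgentTask, TaskScheduler) are illustrative, not the SAGE API.
 */
interface AgentTask {
    void handle(Object event); // invoked when an event is dispatched to this task
}

final class TaskScheduler implements Runnable {
    // Pending (task, event) dispatches; the queue replaces per-task threads.
    private final BlockingQueue<Runnable> queue = new LinkedBlockingQueue<>();

    /** Post an event to a task; O(1), no thread creation. */
    void post(AgentTask task, Object event) {
        queue.add(() -> task.handle(event));
    }

    @Override
    public void run() {
        try {
            while (!Thread.currentThread().isInterrupted()) {
                queue.take().run(); // run tasks to completion, one at a time
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // clean shutdown
        }
    }

    public static void main(String[] args) {
        TaskScheduler scheduler = new TaskScheduler();
        new Thread(scheduler, "agent-scheduler").start();
        scheduler.post(event -> System.out.println("got: " + event), "ping");
    }
}
```

A thread-per-task design pays for stack allocation and context switching on every task; the queue-based design reduces a task switch to a queue operation, which is one plausible source of the throughput advantage the abstract reports.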

Relevance:

60.00%

Publisher:

Abstract:

Due to the popularity of modern Collaborative Virtual Environments, there has been a corresponding increase in their size and complexity. Developers therefore need visualisations that expose usage patterns in logged data in order to understand the structures and dynamics of these complex environments. This chapter presents a new framework for visualising virtual environment usage data. Major components, such as an event model, a designer task model and a data acquisition infrastructure, are described. Interface and implementation factors are also developed, along with example visualisation techniques that make use of the new task and event models. A case study illustrates a typical scenario for the framework and its benefits to the environment development team.
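As a rough illustration of what a logged usage event in such a framework might look like, the sketch below defines a hypothetical event record; the field names are assumptions for illustration, not the chapter's actual event model.

```java
import java.time.Instant;

/**
 * Hypothetical logged usage event for a collaborative virtual environment.
 * Field names are illustrative assumptions, not the chapter's schema.
 */
record UsageEvent(
        Instant timestamp,   // when the event occurred
        String userId,       // which participant generated it
        String eventType,    // e.g. "move", "chat", "object-interaction"
        String region        // where in the environment it happened
) {
    /** Simple predicate a visualisation might use to filter task-relevant events. */
    boolean matchesTask(String taskEventType) {
        return eventType.equals(taskEventType);
    }
}
```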

Relevance:

60.00%

Publisher:

Abstract:

The elastic task model, a significant development in the scheduling of real-time control tasks, provides a mechanism for flexible workload management in uncertain environments. It specifies how to adjust control periods to satisfy workload constraints. However, it is not directly linked to quality-of-control (QoC) management, the ultimate goal of a control system, and consequently does not indicate how to make the best use of system resources to maximise QoC improvement. To fill this gap, a new feedback scheduling framework, referred to as QoC elastic scheduling, is developed in this paper for real-time process control systems. It addresses QoC directly by embedding both QoC management and workload adaptation into a constrained optimization problem. The resulting period-adjustment solution is in closed form, expressed in terms of QoC measurements, enabling closed-loop feedback of the QoC to the task scheduler. Whenever the QoC elastic scheduler is activated, it improves the QoC as much as possible while still meeting the system constraints. Examples demonstrate the effectiveness of QoC elastic scheduling.
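For background, the generic elastic task model that this work extends treats task utilisations like springs that can be compressed under overload. A standard form of the compression rule is sketched below; this is the generic model (in the style of Buttazzo's elastic scheduling), not this paper's QoC-specific closed form.

```latex
% Classic elastic compression (generic form, not this paper's QoC variant).
% Task \tau_i has computation time C_i, period T_i, utilisation U_i = C_i/T_i,
% nominal utilisation U_{i0}, and elastic coefficient E_i (E_i = 0 means fixed).
% If the nominal total load U_0 = \sum_i U_{i0} exceeds the desired bound U_d,
% each compressible task is scaled down in proportion to its elasticity:
\[
  U_i \;=\; U_{i0} \;-\; \left(U_0 - U_d\right)\frac{E_i}{\sum_j E_j},
  \qquad T_i \;=\; \frac{C_i}{U_i}.
\]
```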

Relevance:

40.00%

Publisher:

Abstract:

The driving task requires sustained attention over prolonged periods and may be performed in highly predictable or repetitive environments. Such conditions can create drowsiness or hypovigilance and impair the ability to react to critical events. Identifying vigilance decrement in monotonous conditions has been a major subject of research, but no research to date has attempted to predict it. This pilot study aims to show that vigilance decrements due to monotonous tasks can be predicted through mathematical modelling. A short vigilance task sensitive to brief lapses of vigilance, the Sustained Attention to Response Task, is used to assess participants' performance; it models the driver's ability to cope with unpredicted events by performing the expected action. A Hidden Markov Model (HMM) is proposed to predict participants' hypovigilance. The driver's vigilance evolution is modelled as a hidden state and is correlated with an observable variable: the participant's reaction times. The experiment shows that the monotony of the task can lead to a substantial vigilance decline in less than five minutes, and that this impairment can be predicted four minutes in advance with 86% accuracy using HMMs. Mathematical models such as HMMs can thus efficiently predict hypovigilance through surrogate measures. The presented model could lead to an in-vehicle device that detects driver hypovigilance in advance and warns the driver accordingly, offering the potential to enhance road safety and prevent road crashes.
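A minimal sketch of the kind of HMM filtering involved: two hidden vigilance states with discretised reaction-time observations, updated with the standard forward recursion. All probabilities below are invented placeholders, not the study's fitted parameters.

```java
/**
 * Forward filtering for a two-state HMM (ALERT = 0, HYPOVIGILANT = 1) whose
 * observations are discretised reaction times (FAST = 0, SLOW = 1).
 * All parameter values are invented placeholders, not the study's estimates.
 */
public final class VigilanceHmm {
    // transition[i][j] = P(state j at t+1 | state i at t)
    private static final double[][] TRANSITION = {{0.95, 0.05}, {0.10, 0.90}};
    // emission[i][o] = P(observation o | state i)
    private static final double[][] EMISSION = {{0.80, 0.20}, {0.30, 0.70}};
    private static final double[] PRIOR = {0.90, 0.10};

    /** Returns P(hypovigilant | observations so far) after each observation. */
    public static double[] filter(int[] observations) {
        double[] belief = PRIOR.clone();
        double[] out = new double[observations.length];
        for (int t = 0; t < observations.length; t++) {
            double[] next = new double[2];
            for (int j = 0; j < 2; j++) {
                double predicted = 0.0;
                for (int i = 0; i < 2; i++) {
                    predicted += belief[i] * TRANSITION[i][j]; // predict step
                }
                next[j] = predicted * EMISSION[j][observations[t]]; // update step
            }
            double norm = next[0] + next[1];
            belief[0] = next[0] / norm;
            belief[1] = next[1] / norm;
            out[t] = belief[1];
        }
        return out;
    }

    public static void main(String[] args) {
        int[] rts = {0, 0, 1, 1, 1}; // fast, fast, then a run of slow responses
        for (double p : filter(rts)) System.out.printf("P(hypovigilant) = %.3f%n", p);
    }
}
```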

Relevance:

40.00%

Publisher:

Abstract:

Over roughly the last decade, people involved in game development have noted the need for more formal models and tools to support the design phase of games. It has been argued that the lack of such formal tools hinders knowledge transfer among designers. Formal visual languages, on the other hand, can help designers express, abstract and communicate game design concepts more effectively. Moreover, formal tools can assist in the prototyping phase, allowing designers to reason about and simulate game mechanics at an abstract level. In this paper we present an initial investigation into whether workflow patterns, which have already proven effective for modelling business processes, are a suitable way to model task succession in games. Our preliminary results suggest that workflow patterns show promise in this regard, although some limitations, especially regarding time constraints, currently restrict their potential.
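A rough illustration of the idea: the basic control-flow workflow patterns (sequence, parallel split, synchronisation) can be encoded as a small task graph that a designer could query or simulate. The pattern and task names below are assumptions for illustration, not the paper's notation.

```java
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

/**
 * Toy encoding of game-task succession using basic workflow patterns.
 * Pattern and task names are illustrative, not taken from the paper.
 */
public final class TaskWorkflow {
    enum Pattern { SEQUENCE, PARALLEL_SPLIT, SYNCHRONIZATION }

    // successors: task -> (pattern governing its outgoing flow, next tasks)
    record Step(Pattern pattern, List<String> next) {}

    private static final Map<String, Step> FLOW = Map.of(
            "find-key", new Step(Pattern.SEQUENCE, List.of("open-door")),
            // parallel split: both sub-quests become available at once
            "open-door", new Step(Pattern.PARALLEL_SPLIT, List.of("loot-room", "rescue-npc")),
            // both branches feed the boss fight, which synchronises them
            "loot-room", new Step(Pattern.SEQUENCE, List.of("boss-fight")),
            "rescue-npc", new Step(Pattern.SEQUENCE, List.of("boss-fight")));

    /** A task is enabled once every one of its predecessors has completed. */
    static boolean enabled(String task, Set<String> completed) {
        return FLOW.entrySet().stream()
                .filter(e -> e.getValue().next().contains(task))
                .allMatch(e -> completed.contains(e.getKey()));
    }

    public static void main(String[] args) {
        Set<String> done = new HashSet<>(Set.of("find-key", "open-door", "loot-room"));
        System.out.println(enabled("boss-fight", done)); // false: rescue-npc pending
        done.add("rescue-npc");
        System.out.println(enabled("boss-fight", done)); // true: branches synchronised
    }
}
```

Note that nothing in this toy graph expresses timing, which matches the limitation the abstract raises about time constraints.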

Relevance:

40.00%

Publisher:

Abstract:

Human factors such as distraction, fatigue, and alcohol and drug use are generally ignored in car-following (CF) models. Ignoring these factors overestimates driver capability and prevents most CF models from realistically explaining human driving behaviour. This paper proposes a novel car-following modelling framework that introduces the difficulty of the driving task, measured as the dynamic interaction between driving task demand and driver capability. Task difficulty is formulated on the basis of the well-known Task Capability Interface (TCI) model, which explains the motivations behind a driver's decision making. The proposed method is applied to enhance two popular CF models, Gipps' model and the IDM, yielding models named TDGipps and TDIDM respectively. The behavioural soundness of TDGipps and TDIDM is discussed and their stability is analysed. The enhanced models are calibrated with vehicle trajectory data and validated against both regular CF behaviour and CF behaviour influenced by human factors (here, distraction caused by a hand-held mobile phone conversation). Both models perform better than their predecessors, especially in the presence of human factors.
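For reference, the sketch below shows the textbook Intelligent Driver Model (IDM) acceleration that TDIDM builds on, together with a purely illustrative task-difficulty scaling of the desired time headway; the scaling is an assumption for illustration, not the paper's TDIDM formulation.

```java
/**
 * Standard IDM acceleration with an illustrative task-difficulty hook.
 * The IDM terms are the textbook formulation; the taskDifficulty scaling of
 * the desired headway is an invented placeholder, NOT the paper's TDIDM.
 */
public final class Idm {
    static final double A_MAX = 1.4;   // max acceleration [m/s^2]
    static final double B = 2.0;       // comfortable deceleration [m/s^2]
    static final double V0 = 33.3;     // desired speed [m/s]
    static final double S0 = 2.0;      // jam gap [m]
    static final double T = 1.5;       // desired time headway [s]
    static final double DELTA = 4.0;   // acceleration exponent

    /**
     * @param v              follower speed [m/s]
     * @param gap            bumper-to-bumper gap to the leader [m]
     * @param dv             approach rate v_follower - v_leader [m/s]
     * @param taskDifficulty in [0, 1); higher difficulty -> longer headway (assumption)
     */
    static double acceleration(double v, double gap, double dv, double taskDifficulty) {
        double headway = T / (1.0 - taskDifficulty); // illustrative demand/capability effect
        double sStar = S0 + Math.max(0.0, v * headway + v * dv / (2.0 * Math.sqrt(A_MAX * B)));
        return A_MAX * (1.0 - Math.pow(v / V0, DELTA) - Math.pow(sStar / gap, 2.0));
    }

    public static void main(String[] args) {
        // Same traffic situation, undistracted vs distracted (higher task difficulty).
        System.out.printf("a (easy task) = %.3f m/s^2%n", acceleration(25, 30, 2, 0.0));
        System.out.printf("a (hard task) = %.3f m/s^2%n", acceleration(25, 30, 2, 0.3));
    }
}
```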

Relevance:

40.00%

Publisher:

Abstract:

Health challenges present arguably the most significant barrier to sustainable global development. The introduction of ICT in healthcare, especially the application of mobile communications, has created the potential to transform healthcare delivery by making it more accessible, affordable and effective across the developing world. However, research assessing mHealth from the perspective of developing countries, particularly with community health workers (CHWs) as primary users, remains limited. The aim of this study is to analyse the contribution of mHealth to enhancing health workers' performance and its alignment with existing workflows, in order to guide its utilisation. The proposed research examines the task-technology alignment of mHealth for CHWs, drawing on task-technology fit as its theoretical foundation.

Relevance:

30.00%

Publisher:

Abstract:

With the advent of Service-Oriented Architecture, Web services have gained tremendous popularity. Given the large number of available Web services, finding one that matches a user's requirements is a challenge, which warrants an effective and reliable Web service discovery process. A considerable body of research has emerged to improve the accuracy of Web service discovery. Discovery typically suggests many individual services that each partially satisfy the user's interest; considering the semantic relationships of the words used to describe services, along with their input and output parameters, can lead to more accurate discovery, and appropriately linking individually matched services can then fully satisfy the user's requirements. This research integrates a semantic model and a data mining technique to enhance the accuracy of Web service discovery through a novel three-phase methodology.

The first phase performs match-making to find semantically similar Web services for a user query. To perform semantic analysis on the content of Web Service Description Language documents, a support-based latent semantic kernel is constructed using an innovative binning-and-merging concept over a large collection of text documents covering diverse domains of knowledge. A generic latent semantic kernel built from a large number of terms helps uncover hidden meanings of query terms that could not otherwise be found. When a single Web service cannot fully satisfy the user's requirement, a composition of multiple inter-related Web services is presented instead. The second phase checks the feasibility of linking multiple Web services; in this link analysis phase, Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimal, minimum-cost traversal. The third phase, system integration, combines the results of the preceding two phases using an original fusion algorithm in a fusion engine; its recommendation engine then makes the final recommendations, including individual and composite Web services, to the user.

To evaluate the proposed method, extensive experimentation has been performed. The support-based semantic kernel method of Web service discovery is compared with a standard keyword-based information-retrieval method and a clustering-based machine-learning method, and outperforms both. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in the first phase for linking, and that the fusion engine boosts discovery accuracy by systematically combining the inputs of the semantic analysis (phase one) and the link analysis (phase two). Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
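A minimal sketch of the all-pairs shortest-path step in the link-analysis phase, using Floyd-Warshall over a small service graph; the edge costs are invented, and the thesis's actual cost model is not reproduced here.

```java
import java.util.Arrays;

/**
 * Floyd-Warshall all-pairs shortest paths over a service-composition graph.
 * cost[i][j] is the (invented) cost of feeding service i's output into
 * service j; INF means the parameters are incompatible.
 */
public final class CompositionPaths {
    static final double INF = Double.POSITIVE_INFINITY;

    static double[][] allPairsShortestPaths(double[][] cost) {
        int n = cost.length;
        double[][] d = new double[n][];
        for (int i = 0; i < n; i++) d[i] = cost[i].clone();
        for (int k = 0; k < n; k++)            // allow k as an intermediate service
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++)
                    if (d[i][k] + d[k][j] < d[i][j])
                        d[i][j] = d[i][k] + d[k][j];
        return d;
    }

    public static void main(String[] args) {
        double[][] cost = {
                {0, 1, INF},   // service 0 links directly to service 1 only
                {INF, 0, 2},   // service 1 links to service 2
                {INF, INF, 0},
        };
        // Cheapest composition 0 -> 2 goes through service 1 at total cost 3.
        System.out.println(Arrays.deepToString(allPairsShortestPaths(cost)));
    }
}
```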

Relevance:

30.00%

Publisher:

Abstract:

The selection criteria for contractor pre-qualification are characterised by the co-existence of both quantitative and qualitative data. The qualitative data are non-linear, uncertain and imprecise. An ideal decision support system for contractor pre-qualification should be able to handle both kinds of data and to map the complicated non-linear relationships among the selection criteria, so that rational and consistent decisions can be made. In this paper, an artificial neural network model was developed to assist public clients in identifying suitable contractors for tendering, and the pre-qualification criteria (variables) were identified for the model. One hundred and twelve real pre-qualification cases were collected from civil engineering projects in Hong Kong, and eighty-eight hypothetical cases were generated according to the "if-then" rules used by professionals in the pre-qualification process; each case consisted of input ratings for candidate contractors' attributes and the corresponding pre-qualification decision. The network was trained using a purpose-built program incorporating a conjugate gradient descent algorithm to improve learning performance, and cross-validation based on re-sampling of training pairs was applied to estimate generalisation error. The results fully comply with the current practice of public developers in Hong Kong. The case studies show that the artificial neural network model is suitable for mapping the complicated non-linear relationship between contractors' attributes and their pre-qualification (or disqualification) decisions, and it can be concluded that the model is an ideal alternative for performing the contractor pre-qualification task.
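A minimal sketch of the kind of network involved: a single-hidden-layer feed-forward pass mapping contractor attribute ratings to a pre-qualification score. Layer sizes and weights are invented placeholders, and the training step (the paper used conjugate gradient descent) is omitted.

```java
/**
 * Forward pass of a tiny feed-forward network: attribute ratings in,
 * pre-qualification score out. All weights are invented placeholders;
 * training (the paper used conjugate gradient descent) is omitted.
 */
public final class PrequalNet {
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    // 3 input attributes -> 2 hidden units -> 1 output score
    static final double[][] W1 = {{0.8, -0.2, 0.5}, {-0.4, 0.9, 0.3}};
    static final double[] B1 = {0.1, -0.1};
    static final double[] W2 = {1.2, -0.7};
    static final double B2 = 0.05;

    /** @param ratings normalised attribute ratings in [0, 1], e.g. finance, experience, safety */
    static double score(double[] ratings) {
        double[] hidden = new double[W1.length];
        for (int j = 0; j < W1.length; j++) {
            double sum = B1[j];
            for (int i = 0; i < ratings.length; i++) sum += W1[j][i] * ratings[i];
            hidden[j] = sigmoid(sum);
        }
        double out = B2;
        for (int j = 0; j < hidden.length; j++) out += W2[j] * hidden[j];
        return sigmoid(out); // a score above 0.5 might be read as "pre-qualify"
    }

    public static void main(String[] args) {
        System.out.printf("score = %.3f%n", score(new double[]{0.9, 0.7, 0.8}));
    }
}
```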

Relevance:

30.00%

Publisher:

Abstract:

Vigilance declines when people are exposed to highly predictable and uneventful tasks. Monotonous tasks provide little cognitive and motor stimulation and contribute to human error. This paper aims to model and detect vigilance decline in real time from a participant's reaction times during a monotonous task. A lab-based experiment adapting the Sustained Attention to Response Task (SART) was conducted to quantify the effect of monotony on overall performance, and the relevant parameters were used to build a model that detects hypovigilance throughout the experiment. The accuracy of different mathematical models in detecting lapses in vigilance in real time, minute by minute, is compared. We show that monotonous tasks can lead to an average performance decline of 45%. Furthermore, vigilance modelling can detect vigilance decline from reaction times with 72% accuracy and a 29% false alarm rate. Bayesian models are identified as better suited to detecting lapses in vigilance than Neural Networks and Generalised Linear Mixed Models. This modelling could serve as a framework for detecting the vigilance decline of any human performing a monotonous task.
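A rough sketch of one simple Bayesian detector in this spirit: a two-class Gaussian model over per-minute mean reaction time, classifying each minute as alert or hypovigilant. The class parameters and prior are invented placeholders; this is a generic stand-in, not the paper's fitted model.

```java
/**
 * Two-class Gaussian Bayes detector over per-minute mean reaction time (ms).
 * Class means/variances and the prior are invented placeholders; the paper's
 * fitted Bayesian model is not reproduced here.
 */
public final class MinuteByMinuteDetector {
    static double gaussian(double x, double mean, double sd) {
        double z = (x - mean) / sd;
        return Math.exp(-0.5 * z * z) / (sd * Math.sqrt(2 * Math.PI));
    }

    /** Returns true if the minute is classified as hypovigilant. */
    static boolean hypovigilant(double meanRtMs) {
        double priorHypo = 0.3;                              // placeholder prior
        double likAlert = gaussian(meanRtMs, 320.0, 40.0);   // alert RT model
        double likHypo = gaussian(meanRtMs, 420.0, 60.0);    // hypovigilant RT model
        double posterior = likHypo * priorHypo
                / (likHypo * priorHypo + likAlert * (1 - priorHypo));
        return posterior > 0.5;
    }

    public static void main(String[] args) {
        double[] minuteMeans = {315, 330, 360, 410, 455};    // RTs drifting slower
        for (int m = 0; m < minuteMeans.length; m++)
            System.out.printf("minute %d: hypovigilant = %b%n",
                    m + 1, hypovigilant(minuteMeans[m]));
    }
}
```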

Relevance:

30.00%

Publisher:

Abstract:

The driving task requires sustained attention over prolonged periods and can be performed in highly predictable or repetitive environments. Such conditions can create hypovigilance and impair performance in response to critical events. Identifying such impairment in monotonous conditions has been a major subject of research, but no research to date has attempted to predict it in real time. This pilot study aims to show that performance decrements due to monotonous tasks can be predicted through mathematical modelling that takes sensation-seeking levels into account. A short vigilance task sensitive to brief lapses of vigilance, the Sustained Attention to Response Task, is used to assess participants' performance; the prediction framework developed on this task could be extended to a monotonous driving task. A Hidden Markov Model (HMM) is proposed to predict participants' lapses in alertness. The driver's vigilance evolution is modelled as a hidden state and is correlated with a surrogate measure: the participant's reaction times. The experiment shows that the monotony of the task can lead to a substantial decline in performance in less than five minutes, and that this impairment can be predicted four minutes in advance with 86% accuracy using HMMs. Mathematical models such as HMMs can thus efficiently predict hypovigilance through surrogate measures. The presented model could lead to an in-vehicle device that detects driver hypovigilance in advance and warns the driver accordingly, offering the potential to enhance road safety and prevent road crashes.

Relevance:

30.00%

Publisher:

Abstract:

Effective use of information and communication technologies (ICT) is necessary for delivering efficiency and improved project delivery in the construction industry. Convincing clients or contracting organisations to embrace ICT is a difficult task, as there are few templates of an ICT business model for the industry to use, and ICT application in the construction industry is relatively low compared to the automotive and aerospace industries. The National Museum of Australia project provides a unique opportunity for investigating and reporting on this deficiency in publicly available knowledge. This paper concentrates on the business model's content and objectives, and briefly describes the framework used to evaluate ICT effectiveness.

Relevance:

30.00%

Publisher:

Abstract:

We present a hierarchical model for assessing an object-oriented program's security. Security is quantified using structural properties of the program code that identify the ways in which 'classified' data values may be transferred between objects. The model begins with a set of low-level security metrics based on traditional design characteristics of object-oriented classes, such as data encapsulation, cohesion and coupling. These metrics are then used to characterise higher-level properties concerning the overall readability and writability of classified data throughout the program. In turn, these properties are mapped to well-known security design principles such as 'assign the least privilege' and 'reduce the size of the attack surface'. Finally, the entire program's security is summarised as a single security index value. These metrics allow different versions of the same program, or different programs intended to perform the same task, to be compared for their relative security at a number of abstraction levels. The model is validated via an experiment involving five open-source Java programs, using a static analysis tool we have developed to automatically extract the security metrics from compiled Java bytecode.
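As a hedged illustration of what a low-level metric of this kind might compute (a crude stand-in, not the paper's actual metric suite), the sketch below uses reflection to measure the fraction of a class's declared fields that are non-private, a simple proxy for how exposed its data is.

```java
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;

/**
 * Illustrative low-level "exposure" metric: the fraction of a class's declared
 * fields that are not private. A crude stand-in for the kind of
 * encapsulation-based metric described, not the paper's actual definition.
 */
public final class FieldExposureMetric {
    static double nonPrivateFieldRatio(Class<?> cls) {
        Field[] fields = cls.getDeclaredFields();
        if (fields.length == 0) return 0.0;            // nothing to expose
        long exposed = 0;
        for (Field f : fields)
            if (!Modifier.isPrivate(f.getModifiers())) exposed++;
        return (double) exposed / fields.length;
    }

    public static void main(String[] args) {
        // Lower is better: a fully encapsulated class scores 0.0.
        System.out.printf("java.awt.Point exposure = %.2f%n",
                nonPrivateFieldRatio(java.awt.Point.class)); // x and y are public
    }
}
```

The paper's own tool works on compiled bytecode rather than loaded classes, but the shape of the computation (count structural exposures, normalise, aggregate upward) is the same idea.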

Relevance:

30.00%

Publisher:

Abstract:

In this paper we extend the concept of speaker annotation within a single recording, or speaker diarization, to a collection-wide approach we call speaker attribution: the task of clustering the expectedly homogeneous inter-session clusters obtained using diarization according to common cross-recording identities. The result of attribution is a collection of spoken audio across multiple recordings, attributed to speaker identities. An attribution system is proposed that uses mean-only MAP adaptation of a combined-gender UBM to model clusters from a perfect diarization system, as well as a JFA-based system with session variability compensation. The normalised cross-likelihood ratio is calculated for each pair of clusters to construct an attribution matrix, and the complete-linkage algorithm is employed to cluster the inter-session clusters. A matched cluster purity and coverage of 87.1% was obtained on the NIST 2008 SRE corpus.
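A minimal sketch of the complete-linkage step: agglomerative merging where the distance between two clusters is their maximum pairwise distance, stopping at a threshold. The distance matrix below is invented; in the attribution system the distances would be derived from normalised cross-likelihood ratios.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Complete-linkage agglomerative clustering over a precomputed distance
 * matrix. In the attribution system the distances would come from normalised
 * cross-likelihood ratios; the matrix in main() is invented.
 */
public final class CompleteLinkage {
    /** Merge clusters until no pair is closer than the threshold. */
    static List<List<Integer>> cluster(double[][] dist, double threshold) {
        List<List<Integer>> clusters = new ArrayList<>();
        for (int i = 0; i < dist.length; i++) {
            clusters.add(new ArrayList<>(List.of(i))); // start with singletons
        }
        while (true) {
            int bestA = -1, bestB = -1;
            double best = threshold;
            for (int a = 0; a < clusters.size(); a++)
                for (int b = a + 1; b < clusters.size(); b++) {
                    double d = linkage(dist, clusters.get(a), clusters.get(b));
                    if (d < best) { best = d; bestA = a; bestB = b; }
                }
            if (bestA < 0) return clusters;          // nothing close enough to merge
            clusters.get(bestA).addAll(clusters.remove(bestB));
        }
    }

    /** Complete linkage: cluster distance is the MAXIMUM pairwise distance. */
    static double linkage(double[][] dist, List<Integer> a, List<Integer> b) {
        double max = 0.0;
        for (int i : a) for (int j : b) max = Math.max(max, dist[i][j]);
        return max;
    }

    public static void main(String[] args) {
        double[][] dist = {            // clusters 0 and 1 share a speaker; 2 does not
                {0.0, 0.2, 0.9},
                {0.2, 0.0, 0.8},
                {0.9, 0.8, 0.0},
        };
        System.out.println(cluster(dist, 0.5)); // [[0, 1], [2]]
    }
}
```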