950 results for task model


Relevance: 30.00%

Publisher:

Abstract:

Effective use of information and communication technologies (ICT) is necessary for delivering efficiency and improved project delivery in the construction industry. Convincing clients or contracting organisations to embrace ICT is a difficult task, however, as there are few templates of an ICT business model for the industry to use, and ICT uptake in construction is relatively low compared to the automotive and aerospace industries. The National Museum of Australia project provides a unique opportunity for investigating and reporting on this gap in publicly available knowledge. This paper concentrates on the content and objectives of the business model, and briefly indicates the framework that was used to evaluate ICT effectiveness.

Relevance: 30.00%

Abstract:

We present a hierarchical model for assessing an object-oriented program's security. Security is quantified using structural properties of the program code to identify the ways in which `classified' data values may be transferred between objects. The model begins with a set of low-level security metrics based on traditional design characteristics of object-oriented classes, such as data encapsulation, cohesion and coupling. These metrics are then used to characterise higher-level properties concerning the overall readability and writability of classified data throughout the program. In turn, these metrics are then mapped to well-known security design principles such as `assigning the least privilege' and `reducing the size of the attack surface'. Finally, the entire program's security is summarised as a single security index value. These metrics allow different versions of the same program, or different programs intended to perform the same task, to be compared for their relative security at a number of different abstraction levels. The model is validated via an experiment involving five open source Java programs, using a static analysis tool we have developed to automatically extract the security metrics from compiled Java bytecode.
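The bottom-up aggregation can be sketched as follows; the metric names, the exposure ratio and the averaging scheme are illustrative assumptions, not the paper's actual metric suite:

```python
from dataclasses import dataclass

@dataclass
class ClassMetrics:
    """Hypothetical low-level metrics for one class (names are
    illustrative stand-ins for the paper's design-level metrics)."""
    name: str
    classified_attrs: int    # attributes holding 'classified' data
    public_classified: int   # of those, how many are publicly accessible
    classified_methods: int  # methods that read or write classified data

def attack_surface_ratio(m: ClassMetrics) -> float:
    # Fraction of classified attributes exposed outside the class;
    # lower is better ('reducing the size of the attack surface').
    if m.classified_attrs == 0:
        return 0.0
    return m.public_classified / m.classified_attrs

def security_index(classes: list[ClassMetrics]) -> float:
    # Summarise the whole program as a single index in [0, 1]; here a
    # simple average of per-class exposure, whereas the paper maps
    # metrics through several intermediate abstraction levels.
    if not classes:
        return 1.0
    exposure = sum(attack_surface_ratio(c) for c in classes) / len(classes)
    return 1.0 - exposure
```

Two versions of the same program can then be compared by their index values, mirroring the relative-security comparisons described above.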

Relevance: 30.00%

Abstract:

In this paper we extend the concept of speaker annotation within a single recording, or speaker diarization, to a collection-wide approach we call speaker attribution. Speaker attribution is thus the task of clustering the expectedly homogeneous inter-session clusters obtained using diarization according to common cross-recording identities. The result of attribution is a collection of spoken audio across multiple recordings attributed to speaker identities. In this paper, an attribution system is proposed using mean-only MAP adaptation of a combined-gender UBM to model clusters from a perfect diarization system, as well as a JFA-based system with session variability compensation. The normalized cross-likelihood ratio is calculated for each pair of clusters to construct an attribution matrix, and the complete-linkage algorithm is employed to cluster the inter-session clusters. A matched cluster purity and coverage of 87.1% was obtained on the NIST 2008 SRE corpus.
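The clustering stage can be sketched with SciPy; the NCLR values below are invented, and the similarity-to-distance conversion and cut threshold are assumptions for illustration:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Illustrative pairwise NCLR scores between four diarized clusters
# (higher = more likely the same speaker); the values are made up.
nclr = np.array([
    [0.0, 2.1, 0.2, 0.1],
    [2.1, 0.0, 0.3, 0.2],
    [0.2, 0.3, 0.0, 1.8],
    [0.1, 0.2, 1.8, 0.0],
])

# Convert similarity to a distance and run complete linkage, as in the
# attribution stage described above.
dist = nclr.max() - nclr
np.fill_diagonal(dist, 0.0)
Z = linkage(squareform(dist, checks=False), method="complete")

# Cut the dendrogram at a distance threshold (chosen for this toy
# example) to obtain cross-recording speaker identities.
labels = fcluster(Z, t=1.0, criterion="distance")
print(labels)  # clusters 0,1 share one identity; 2,3 share another
```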

Relevance: 30.00%

Abstract:

Project management in the construction industry involves coordination of many tasks and individuals, affected by complexity and uncertainty, which increases the need for efficient cooperation. Procurement is crucial since it sets the basis for cooperation between clients and contractors. This is true whether the project is local, regional or global in scope. Traditionally, procurement procedures are competitive, resulting in conflicts, adversarial relationships and less desirable project results. The purpose of this paper is to propose and empirically test an alternative procurement model based on cooperative procurement procedures that facilitates cooperation between clients and contractors in construction projects. The model is based on four multi-item constructs: incentive-based compensation, limited bidding options, partner selection and cooperation. Based on a sample of 87 client organisations, the model was empirically tested and exhibited strong support, including content, nomological, convergent and discriminant validity, as well as reliability. Our findings indicate that partner selection based on task-related attributes mediates the relationship between two important pre-selection processes (incentive-based compensation and limited bid invitation) and the preferred outcome, cooperation. The paper's contribution lies in identifying valid and reliable measurement constructs and confirming a unique sequential order for achieving cooperation. Moreover, the findings are applicable to many types of construction projects because of the similarities in the construction industry worldwide.
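The reported mediation can be illustrated with a product-of-coefficients sketch on synthetic data; the construct names, effect sizes and the plain OLS approach below are assumptions for illustration, not the paper's actual structural-equation procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 87  # matches the study's sample size; the data below are synthetic

# Hypothetical illustration of the mediation described above:
# incentive-based compensation (x) influences cooperation (y) through
# partner selection (m).  Effect sizes are invented for the sketch.
x = rng.normal(size=n)
m = 0.7 * x + rng.normal(scale=0.5, size=n)   # x -> m path
y = 0.6 * m + rng.normal(scale=0.5, size=n)   # m -> y path

def ols_coefs(cols, target):
    # Least-squares coefficients, intercept in position 0.
    X = np.column_stack([np.ones(len(target))] + list(cols))
    return np.linalg.lstsq(X, target, rcond=None)[0]

a = ols_coefs([x], m)[1]      # effect of x on the mediator
b = ols_coefs([m, x], y)[1]   # effect of m on y, controlling for x
indirect = a * b              # product-of-coefficients mediated effect
total = ols_coefs([x], y)[1]  # total effect of x on y
print(indirect, total)
```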

Relevance: 30.00%

Abstract:

Objective: The current study evaluated part of the Multifactorial Model of Driving Safety to elucidate the relative importance of cognitive function and a limited range of standard measures of visual function in relation to the Capacity to Drive Safely. Capacity to Drive Safely was operationalized using three validated screening measures for older drivers: an adaptation of the well-validated Useful Field of View (UFOV) and two newer measures, namely a Hazard Perception Test (HPT) and a Hazard Change Detection Task (HCDT).

Method: Community-dwelling drivers (n = 297) aged 65–96 were assessed using a battery of measures of cognitive and visual function.

Results: Factor analysis of these predictor variables yielded factors including Executive/Speed, Vision (measured by visual acuity and contrast sensitivity), Spatial, Visual Closure, and Working Memory. Cognitive and Vision factors explained 83–95% of age-related variance in the Capacity to Drive Safely. Spatial and Working Memory were associated with UFOV, HPT and HCDT; Executive/Speed was associated with UFOV and HCDT; and Vision was associated with HPT.

Conclusion: The Capacity to Drive Safely declines with chronological age, and this decline is associated with age-related declines in several higher-order cognitive abilities involving manipulation and storage of visuospatial information under speeded conditions. There are also age-independent effects of cognitive function and vision that determine driving safety.
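The dimension-reduction step can be sketched on synthetic test scores; the latent structure, loadings and the eigen-decomposition shortcut below are assumptions for illustration, not the study's actual factor-analysis procedure:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 297  # matches the study's sample size; the data below are synthetic

# Two invented latent abilities (think 'Executive/Speed' and
# 'Working Memory') generating six observed test scores.
latent = rng.normal(size=(n, 2))
loadings_true = np.array([
    [0.8, 0.0], [0.7, 0.1], [0.9, 0.0],   # speed-type tasks
    [0.0, 0.8], [0.1, 0.7], [0.0, 0.9],   # memory-type tasks
])
scores = latent @ loadings_true.T + rng.normal(scale=0.4, size=(n, 6))

# Extract factors from the correlation matrix by eigen-decomposition
# (a principal-components-style reduction, used here only to show how
# correlated measures collapse onto a few factors).
corr = np.corrcoef(scores, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
print("variance explained by first two factors:",
      eigvals[order][:2].sum() / eigvals.sum())
```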

Relevance: 30.00%

Abstract:

Purpose: The performance objectives manufacturers seek can be achieved by adopting appropriate manufacturing practices. This paper presents a conceptual model proposing relationships between advanced quality practices, perceived manufacturing difficulties and manufacturing performance.

Design/methodology/approach: A survey-based approach was adopted to test the hypotheses proposed in this study. The research instruments included in the survey were selected on the basis of a literature review, pilot case studies and the author's relevant industrial experience. A sample of 1,000 manufacturers across Australia was randomly selected. Quality managers were asked to complete the questionnaire, as dealing with quality and reliability issues is a quality manager's major responsibility.

Findings: The evidence indicates that product quality and reliability is the main competitive factor for manufacturers, with design and manufacturing capability and on-time delivery coming second. Price is considered the least important factor by Australian manufacturers. The results show that, collectively, the advanced quality practices proposed in this study neutralise the difficulties manufacturers face and contribute to most of their performance objectives. Companies that place more emphasis on advanced quality practices encounter fewer manufacturing problems and perform better on most manufacturing performance indices. The results validate the proposed conceptual model and lend credence to the hypothesised relationships between quality practices, manufacturing difficulties and manufacturing performance.

Practical implications: The model presented in this paper provides a simple yet highly effective approach to achieving significant improvements in product quality and manufacturing performance. This study introduces a relationship-based, 'proactive' quality management approach that managers and engineers can adopt in a wide range of manufacturing organisations.

Originality/value: Traditionally, product quality has been checked through various types of testing, inspection and screening out of bad products after they have been manufactured. In today's manufacturing environment, where product life cycles are very short, the focus must shift to preventing defective products from being made in the first place rather than screening out the bad ones afterwards. This study introduces, for the first time, the idea of relationship-based advanced quality practices (AQP) and suggests that AQPs will enable manufacturers to develop reliable products and minimise manufacturing anomalies. The paper explores some of the attributes of AQP capable of reducing manufacturing difficulties and improving manufacturing performance. The proposed conceptual model contributes to the existing knowledge base of quality practices and subsequently provides impetus and guidance towards increasing manufacturing performance.

Relevance: 30.00%

Abstract:

Here we present a sequential Monte Carlo (SMC) algorithm that can be used for any one-at-a-time Bayesian sequential design problem in the presence of model uncertainty where discrete data are encountered. Our focus is on adaptive design for model discrimination but the methodology is applicable if one has a different design objective such as parameter estimation or prediction. An SMC algorithm is run in parallel for each model and the algorithm relies on a convenient estimator of the evidence of each model which is essentially a function of importance sampling weights. Other methods for this task such as quadrature, often used in design, suffer from the curse of dimensionality. Approximating posterior model probabilities in this way allows us to use model discrimination utility functions derived from information theory that were previously difficult to compute except for conjugate models. A major benefit of the algorithm is that it requires very little problem specific tuning. We demonstrate the methodology on three applications, including discriminating between models for decline in motor neuron numbers in patients suffering from neurological diseases such as Motor Neuron disease.
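A minimal sketch of the evidence side of the algorithm, for a single-parameter toy problem with invented data and models (sequential importance sampling only; a full SMC implementation would add resampling and move steps):

```python
import numpy as np

rng = np.random.default_rng(2)

# Two candidate models for binary responses: M1 fixes p = 0.5, while M2
# places a uniform prior on p.  For M2 the evidence is accumulated one
# observation at a time from importance-sampling weights, as in the
# estimator described above.  Data and particle count are invented.
data = rng.binomial(1, 0.8, size=50)
particles = rng.uniform(size=5000)   # prior draws of p for M2
log_w = np.zeros_like(particles)
log_evidence_m2 = 0.0
for yi in data:
    lik = particles if yi == 1 else 1.0 - particles
    # Incremental evidence: likelihood averaged under normalized weights.
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    log_evidence_m2 += np.log(np.sum(w * lik))
    log_w += np.log(lik)

log_evidence_m1 = len(data) * np.log(0.5)  # fixed p: closed-form evidence
print(log_evidence_m1, log_evidence_m2)
```

Normalising the evidence estimates across models yields the posterior model probabilities that feed the information-theoretic discrimination utilities mentioned above.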

Relevance: 30.00%

Abstract:

Incorporating knowledge-based urban development (KBUD) strategies in the urban planning and development process is a challenging and complex task due to the fragmented and incoherent nature of the existing KBUD models. This paper scrutinizes and compares these KBUD models with the aim of identifying key common features that help in developing a new comprehensive and integrated KBUD model. The features and characteristics of the existing KBUD models are determined through a thorough literature review, and the analysis reveals that while these models are invaluable and useful in some cases, the lack of a comprehensive perspective and the absence of full integration of all necessary development domains render them incomplete as a generic model. The proposed KBUD model considers all central elements of urban development and provides an effective platform for planners and developers to achieve more holistic development outcomes. The proposed model, when developed further, has high potential to support researchers, practitioners and particularly city and state administrations aiming at knowledge-based development.

Relevance: 30.00%

Abstract:

Approximate clone detection is the process of identifying similar process fragments in business process model collections. The tool presented in this paper can efficiently cluster approximate clones in large process model repositories. Once a repository is clustered, users can filter and browse the clusters using different filtering parameters. Our tool can also visualize clusters in the 2D space, allowing a better understanding of clusters and their member fragments. This demonstration will be useful for researchers and practitioners working on large process model repositories, where process standardization is a critical task for increasing the consistency and reducing the complexity of the repository.

Relevance: 30.00%

Abstract:

In this article, we report on the findings of an exploratory study into the experience of undergraduate students as they learn new mathematical models. Qualitative and quantitative data based around the students' approaches to learning new mathematical models were collected. The data revealed that students actively adopt three approaches to understanding a new mathematical model: gathering information for the task of understanding the model, practising with and using the model, and finding interrelationships between elements of the model. We found that the students appreciate mathematical models that have a real-world application and that this can be used to engage students in higher-level learning approaches.

Relevance: 30.00%

Abstract:

The rapid increase in the deployment of CCTV systems has led to a greater demand for algorithms that are able to process incoming video feeds. These algorithms are designed to extract information of interest for human operators. During the past several years, there has been a large effort to detect abnormal activities through computer vision techniques. Typically, the problem is formulated as a novelty detection task where the system is trained on normal data and is required to detect events which do not fit the learned `normal' model. Many researchers have tried various sets of features to train different learning models to detect abnormal behaviour in video footage. In this work we propose using a Semi-2D Hidden Markov Model (HMM) to model the normal activities of people. The outliers of the model with insufficient likelihood are identified as abnormal activities. Our Semi-2D HMM is designed to model both the temporal and spatial causalities of the crowd behaviour by assuming the current state of the Hidden Markov Model depends not only on the previous state in the temporal direction, but also on the previous states of the adjacent spatial locations. Two different HMMs are trained to model both the vertical and horizontal spatial causal information. Location features, flow features and optical flow textures are used as the features for the model. The proposed approach is evaluated using the publicly available UCSD datasets and we demonstrate improved performance compared to other state of the art methods.
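The novelty-detection logic can be illustrated with an ordinary 1-D discrete HMM; the parameters below are invented stand-ins for a model trained on normal activity, and the spatial (Semi-2D) conditioning described above is omitted:

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM
    (standard forward algorithm with per-step scaling)."""
    alpha = pi * B[:, obs[0]]
    s = alpha.sum()
    logp = np.log(s)
    alpha /= s
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()
        logp += np.log(s)
        alpha /= s
    return logp

# Toy 'normal' model: two hidden states, three observation symbols.
pi = np.array([0.6, 0.4])                       # initial distribution
A = np.array([[0.9, 0.1], [0.2, 0.8]])          # state transitions
B = np.array([[0.7, 0.25, 0.05],                # emission probabilities
              [0.1, 0.6, 0.3]])

normal_seq = [0, 0, 1, 0, 1, 1, 0]
odd_seq = [2, 2, 2, 2, 2, 2, 2]  # symbol rare under the trained model

# Sequences whose likelihood falls below a threshold are flagged as
# abnormal, mirroring the outlier test described above.
print(forward_loglik(normal_seq, pi, A, B),
      forward_loglik(odd_seq, pi, A, B))
```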

Relevance: 30.00%

Abstract:

For the timber industry, the ability to simulate the drying of wood is invaluable for manufacturing high-quality wood products. Mathematically, however, modelling the drying of a wet porous material such as wood is a difficult task due to its heterogeneous and anisotropic nature, and the complex geometry of the underlying pore structure. The well-developed macroscopic modelling approach involves writing down classical conservation equations at a length scale where physical quantities (e.g., porosity) can be interpreted as averaged values over a small volume (typically containing hundreds or thousands of pores). This averaging procedure produces balance equations that resemble those of a continuum, with the exception that effective coefficients appear in their definitions. Exponential integrators are numerical schemes for initial value problems involving a system of ordinary differential equations. These methods differ from popular Newton-Krylov implicit methods (i.e., those based on the backward differentiation formulae (BDF)) in that they do not require the solution of a system of nonlinear equations at each time step but rather require computation of matrix-vector products involving the exponential of the Jacobian matrix. Although originally appearing in the 1960s, exponential integrators have recently experienced a resurgence of interest due to a greater undertaking of research in Krylov subspace methods for matrix function approximation. One of the simplest examples of an exponential integrator is the exponential Euler method (EEM), which requires, at each time step, approximation of φ(A)b, where φ(z) = (e^z − 1)/z, A ∈ R^{n×n} and b ∈ R^n. For drying in porous media, the most comprehensive macroscopic formulation is TransPore [Perre and Turner, Chem. Eng. J., 86: 117-131, 2002], which features three coupled, nonlinear partial differential equations.

The focus of the first part of this thesis is the use of the exponential Euler method (EEM) for performing the time integration of the macroscopic set of equations featured in TransPore. In particular, a new variable-stepsize algorithm for EEM is presented within a Krylov subspace framework, which allows control of the error during the integration process. The performance of the new algorithm highlights the great potential of exponential integrators not only for drying applications but across all disciplines of transport phenomena. For example, when applied to well-known benchmark problems involving single-phase liquid flow in heterogeneous soils, the proposed algorithm requires half the number of function evaluations required by an equivalent (sophisticated) Newton-Krylov BDF implementation. Furthermore, for all drying configurations tested, the new algorithm always produces, in less computational time, a solution of higher accuracy than the existing backward Euler module featured in TransPore. Some new results relating to Krylov subspace approximation of φ(A)b are also developed in this thesis. Most notably, an alternative derivation of the approximation error estimate of Hochbruck, Lubich and Selhofer [SIAM J. Sci. Comput., 19(5): 1552-1574, 1998] is provided, which reveals why it performs well in the error control procedure. Two main drawbacks of the macroscopic approach outlined above are that the effective coefficients must be supplied to the model, and that it fails for some drying configurations where typical dual-scale mechanisms occur. In the second part of this thesis, a new dual-scale approach for simulating wood drying is proposed that couples the porous medium (macroscale) with the underlying pore structure (microscale).

The proposed model is applied to the convective drying of softwood at low temperatures and is valid in the so-called hygroscopic range, where hygroscopically held liquid water is present in the solid phase and water exists only as vapour in the pores. Coupling between scales is achieved by imposing the macroscopic gradient on the microscopic field using suitably defined periodic boundary conditions, which allows the macroscopic flux to be defined as the average of the microscopic flux over the unit cell. This formulation provides a first step towards moving from the macroscopic formulation featured in TransPore to a comprehensive dual-scale formulation capable of addressing any drying configuration. Simulation results reported for a sample of spruce highlight the potential and flexibility of the new dual-scale approach. In particular, for a given unit cell configuration it is not necessary to supply the effective coefficients prior to each simulation.
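The exponential Euler step can be sketched densely for a toy linear system; a real implementation would approximate φ(hJ)b in a Krylov subspace, as the thesis describes, rather than forming the matrix function explicitly:

```python
import numpy as np
from scipy.linalg import expm, solve

def phi(M):
    # phi(M) = (e^M - I) M^{-1}; dense evaluation is fine at this toy
    # size, whereas large sparse systems call for Krylov approximation
    # of phi(A)b directly.
    n = M.shape[0]
    return solve(M.T, (expm(M) - np.eye(n)).T).T

def eem_step(y, h, F, J):
    # One exponential Euler step: y_{n+1} = y_n + h * phi(h J) F(y_n),
    # where J is the Jacobian of F at y_n.
    return y + h * phi(h * J) @ F(y)

# Toy linear problem y' = A y, for which EEM reproduces the exact flow:
# y + h*phi(hA)*A*y = y + (e^{hA} - I)*y = e^{hA} y.
A = np.array([[-1.0, 1.0], [0.0, -2.0]])
y0 = np.array([1.0, 1.0])
h = 0.1
y1 = eem_step(y0, h, lambda y: A @ y, A)
print(np.allclose(y1, expm(h * A) @ y0))  # True: exact on linear problems
```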

Relevance: 30.00%

Abstract:

Enterprise Systems (ES) can be understood as the de facto standard for holistic operational and managerial support within an organization. Most commonly, ES are offered as commercial off-the-shelf packages requiring customization in the user organization. This process is a complex and resource-intensive task, which often prevents small and midsize enterprises (SMEs) from undertaking configuration projects. In the SME market especially, independent software vendors provide pre-configured ES for a small customer base. The problem of ES configuration is thereby shifted from the customer to the vendor, but remains critical. We argue that the as-yet unexplored link between process configuration and business document configuration must be examined more closely, as both types of configuration are closely tied to one another.

Relevance: 30.00%

Abstract:

Process mining encompasses the research area which is concerned with knowledge discovery from event logs. One common process mining task focuses on conformance checking, comparing discovered or designed process models with actual real-life behavior as captured in event logs in order to assess the “goodness” of the process model. This paper introduces a novel conformance checking method to measure how well a process model performs in terms of precision and generalization with respect to the actual executions of a process as recorded in an event log. Our approach differs from related work in the sense that we apply the concept of so-called weighted artificial negative events towards conformance checking, leading to more robust results, especially when dealing with less complete event logs that only contain a subset of all possible process execution behavior. In addition, our technique offers a novel way to estimate a process model’s ability to generalize. Existing literature has focused mainly on the fitness (recall) and precision (appropriateness) of process models, whereas generalization has been much more difficult to estimate. The described algorithms are implemented in a number of ProM plugins, and a Petri net conformance checking tool was developed to inspect process model conformance in a visual manner.
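The two directions of comparison can be caricatured with plain trace sets; this toy ignores frequencies, partial replay and the weighted artificial negative events that make the method above robust:

```python
# Toy trace-set view of conformance checking.  Each trace is a tuple of
# activity labels; the sets below are invented for illustration.
model_traces = {("a", "b", "c"), ("a", "c", "b"), ("a", "d")}
log_traces = {("a", "b", "c"), ("a", "e")}

shared = log_traces & model_traces

# Fitness (recall): how much observed behaviour the model can replay.
fitness = len(shared) / len(log_traces)

# Precision (appropriateness): how much modelled behaviour was observed.
precision = len(shared) / len(model_traces)

print(fitness, precision)  # 0.5 and one third for these sets
```

Generalization, the third quality discussed above, asks how well the model would accommodate unseen but plausible behaviour, which is exactly what a raw set intersection like this cannot capture.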

Relevance: 30.00%

Abstract:

Social Media (SM) is increasingly being integrated with business information in decision making. Unique characteristics of social media (e.g. wide accessibility, permanence, global audience, recentness, and ease of use) raise new issues for information quality (IQ), quite different from traditional considerations of IQ in information systems (IS) evaluation. This paper presents a preliminary conceptual model of information quality in social media (IQnSM) derived through directed content analysis and employing characteristics of analytic theory in the study protocol. Grounded in the notion of 'fitness for use', IQnSM is highly use- and user-centric and is defined as "the degree to which information is suitable for doing a specified task by a specific user, in a certain context". IQnSM is operationalised as hierarchical, formed by three dimensions (18 measures): intrinsic quality, contextual quality and representational quality. A research plan for empirically validating the model is proposed.