909 results for Observation-driven Models


Relevance: 20.00%

Abstract:

Velocity jump processes are discrete random walk models that have many applications including the study of biological and ecological collective motion. In particular, velocity jump models are often used to represent a type of persistent motion, known as a “run and tumble”, which is exhibited by some isolated bacteria cells. All previous velocity jump processes are non-interacting, which means that crowding effects and agent-to-agent interactions are neglected. By neglecting these agent-to-agent interactions, traditional velocity jump models are only applicable to very dilute systems. Our work is motivated by the fact that many applications in cell biology, such as wound healing, cancer invasion and development, often involve tissues that are densely packed with cells where cell-to-cell contact and crowding effects can be important. To describe these kinds of high cell density problems using a velocity jump process we introduce three different classes of crowding interactions into a one-dimensional model. Simulation data and averaging arguments lead to a suite of continuum descriptions of the interacting velocity jump processes. We show that the resulting systems of hyperbolic partial differential equations predict the mean behavior of the stochastic simulations very well.
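
The three crowding interaction classes are not spelled out in the abstract, so the following is only a minimal sketch, assuming a 1D periodic lattice, a fixed turning probability, and a simple exclusion rule (a move is aborted if the target site is occupied) as a stand-in for one such interaction; all names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Minimal 1D velocity jump (run-and-tumble) process on a periodic lattice.
# Each agent moves one site per step in its current direction (+1 or -1)
# and reverses direction ("tumbles") with probability p_turn per step.
# Crowding is represented by a simple exclusion rule: a move is aborted
# if the target site is already occupied.
L, N = 200, 40            # lattice sites, number of agents
p_turn, steps = 0.1, 1000

positions = rng.choice(L, size=N, replace=False)
velocities = rng.choice([-1, 1], size=N)
occupied = np.zeros(L, dtype=bool)
occupied[positions] = True

for _ in range(steps):
    for i in rng.permutation(N):             # random sequential update
        if rng.random() < p_turn:
            velocities[i] = -velocities[i]   # tumble: reverse direction
        target = (positions[i] + velocities[i]) % L
        if not occupied[target]:             # exclusion: abort if blocked
            occupied[positions[i]] = False
            occupied[target] = True
            positions[i] = target

# Averaging the occupancy over many realisations yields the density
# profiles that a continuum (hyperbolic PDE) description aims to match.
print("mean agent position:", positions.mean())
```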

Relevance: 20.00%

Abstract:

Introduction: An observer, looking sideways from a moving vehicle while wearing a neutral density filter over one eye, can have a distorted perception of speed, known as the Enright phenomenon. The purpose of this study was to determine how the Enright phenomenon influences driving behaviour. Methods: A geometric model of the Enright phenomenon was developed. Ten young, visually normal participants (mean age = 25.4 years) were tested on a straight section of a closed driving circuit and instructed to look out of the right side of the vehicle and drive at either 40 km/h or 60 km/h under the following binocular viewing conditions: with a 0.9 ND filter over the left eye (leading eye); with a 0.9 ND filter over the right eye (trailing eye); with 0.9 ND filters over both eyes; and with no filters over either eye. The order of filter conditions was randomised and the speed driven was recorded for each condition. Results: Speed judgments did not differ significantly between the two baseline conditions (no filters and both eyes filtered) for either speed tested. For the baseline conditions, when subjects were asked to drive at 60 km/h they matched this speed well (61 ± 10.2 km/h), but they drove significantly faster than requested (51.6 ± 9.4 km/h) when asked to drive at 40 km/h. Subjects significantly exceeded baseline speeds by 8.7 ± 5.0 km/h when the trailing eye was filtered and travelled slower than baseline speeds by 3.7 ± 4.6 km/h when the leading eye was filtered. Conclusions: This is the first quantitative study demonstrating how the Enright effect can influence perceptions of driving speed, and it shows that monocular filtering of an eye can significantly affect driving speeds, albeit to a lesser extent than predicted by geometric models of the phenomenon.

Relevance: 20.00%

Abstract:

We propose to use Tensor Space Modeling (TSM) to represent and analyze users' web log data, which consists of multiple interests and spans multiple dimensions. Further, we propose to use the decomposition factors of the tensors for clustering users based on similarity of search behaviour. Preliminary results show that the proposed method outperforms traditional Vector Space Model (VSM)-based clustering.
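
As a rough illustration of the idea, the sketch below builds a hypothetical user x query-term x session tensor, extracts low-rank user factors from its mode-1 unfolding (an SVD-based stand-in for the tensor decomposition), and clusters users on those factors. The tensor modes, rank and cluster count are assumptions, not details taken from the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Hypothetical web-log tensor: users x query terms x sessions, where
# entry (u, t, s) counts how often user u issued term t in session s.
n_users, n_terms, n_sessions = 50, 30, 10
X = rng.poisson(0.3, size=(n_users, n_terms, n_sessions)).astype(float)

# Mode-1 unfolding: one row per user across all (term, session) pairs.
X1 = X.reshape(n_users, n_terms * n_sessions)

# Leading left singular vectors act as low-rank user factors
# (a simple stand-in for the decomposition factors of the tensor).
U, s, _ = np.linalg.svd(X1, full_matrices=False)
rank = 5
user_factors = U[:, :rank] * s[:rank]

# Cluster users by similarity of search behaviour in factor space.
labels = KMeans(n_clusters=3, n_init=10).fit_predict(user_factors)
print("cluster sizes:", np.bincount(labels))
```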

Relevance: 20.00%

Abstract:

Previous research has put forward a number of properties of business process models that have an impact on their understandability. Two such properties are compactness and (block-)structuredness. What has not been sufficiently appreciated at this point is that these desirable properties may be at odds with one another. This paper presents the results of a two-pronged study aimed at exploring the trade-off between compactness and structuredness of process models. The first prong of the study is a comparative analysis of the complexity of a set of unstructured process models from industrial practice and of their corresponding structured versions. The second prong is an experiment wherein a cohort of students was exposed to semantically equivalent unstructured and structured process models. The key finding is that structuredness is not an absolute desideratum vis-à-vis process model understandability. Instead, subtle trade-offs between structuredness and other model properties are at play.

Relevance: 20.00%

Abstract:

Embedded real-time programs rely on external interrupts to respond to events in their physical environment in a timely fashion. Formal program verification theories, such as the refinement calculus, are intended for development of sequential, block-structured code and do not allow for asynchronous control constructs such as interrupt service routines. In this article we extend the refinement calculus to support formal development of interrupt-dependent programs. To do this we: use a timed semantics, to support reasoning about the occurrence of interrupts within bounded time intervals; introduce a restricted form of concurrency, to model composition of interrupt service routines with the main program they may preempt; introduce a semantics for shared variables, to model contention for variables accessed by both interrupt service routines and the main program; and use real-time scheduling theory to discharge timing requirements on interruptible program code.

Relevance: 20.00%

Abstract:

Purpose – The purpose of this paper is to jointly assess the impact of regulatory reform for corporate fundraising in Australia (CLERP Act 1999) and the relaxation of ASX admission rules in 1999 on the accuracy of management earnings forecasts in initial public offer (IPO) prospectuses. The relaxation of ASX listing rules permitted a new category of new economy firms (commitments test entities (CTEs)) to list without a prior history of profitability, while the CLERP Act (introduced in 2000) was accompanied by tighter disclosure obligations and stronger enforcement action by the corporate regulator (ASIC). Design/methodology/approach – All IPO earnings forecasts in prospectuses lodged between 1998 and 2003 are examined to assess the pre- and post-CLERP Act impact. Based on active ASIC enforcement action in the post-reform period, IPO firms are hypothesised to provide more accurate forecasts, particularly CTE firms, which are less likely to have a reasonable basis for forecasting. Research models are developed to empirically test the impact of the reforms on CTE and non-CTE IPO firms. Findings – The new regulatory environment has had a positive impact on management forecasting behaviour. In the post-CLERP Act period, the accuracy of prospectus forecasts and their revisions improved significantly and, as expected, the results are primarily driven by CTE firms. However, the majority of prospectus forecasts continue to be materially inaccurate. Originality/value – The results highlight the need to control for both the changing nature of listed firms and the level of enforcement action when examining responses to regulatory changes to corporate fundraising activities.

Relevance: 20.00%

Abstract:

Evaluating the safety of different traffic facilities is a complex and crucial task. Microscopic simulation models have been widely used for traffic management but have been largely neglected in traffic safety studies. Using micro-simulation to study safety is more ethical and accessible than traditional safety studies, which only assess historical crash data. However, current microscopic models are unable to mimic unsafe driver behavior, as they are based on presumptions of safe driver behavior. This highlights the need for a critical examination of current microscopic models to determine which components and parameters affect the reproduction of safety indicators, and whether these indicators are valid measures of traffic safety. Selected safety indicators were therefore tested for straight motorway segments in Brisbane, Australia. This test examined the capability of a micro-simulation model and provides a better understanding of how such models, in particular car-following models, can be enriched to produce more accurate safety indicators.
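
The abstract does not name the safety indicators that were tested, so the example below uses time-to-collision (TTC), a widely used surrogate safety measure, purely to illustrate how such an indicator is computed from simulated car-following states; the sample values are made up.

```python
import numpy as np

def time_to_collision(gap_m, v_follower_ms, v_leader_ms):
    """Time-to-collision: time until the follower reaches the leader if
    both keep their current speeds; defined only while closing in."""
    closing_speed = v_follower_ms - v_leader_ms
    if closing_speed <= 0:
        return np.inf            # not closing in: no collision course
    return gap_m / closing_speed

# Illustrative car-following states: (gap in m, follower speed, leader speed in m/s).
states = [(25.0, 28.0, 25.0), (12.0, 30.0, 22.0), (40.0, 20.0, 24.0)]
ttcs = [time_to_collision(*s) for s in states]
print([f"{t:.1f} s" if np.isfinite(t) else "no conflict" for t in ttcs])

# In a simulation-based safety study, the share of events falling below a
# critical TTC threshold (e.g. 1.5 s) would be compared with field data to
# check whether the model reproduces the indicator realistically.
```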

Relevance: 20.00%

Abstract:

Non-invasive vibration analysis has been used extensively to monitor the progression of dental implant healing and stabilization. It is now being considered as a method to monitor femoral implants in transfemoral amputees. This paper evaluates two modal analysis excitation methods and investigates their capabilities in detecting changes at the interface between the implant and the bone that occur during osseointegration. Excitation of bone-implant physical models with the electromagnetic shaker provided higher coherence values and a greater number of modes over the same frequency range when compared to the impact hammer. Differences were detected in the natural frequencies and fundamental mode shape of the model when the fit of the implant was altered in the bone. The ability to detect changes in the model dynamic properties demonstrates the potential of modal analysis in this application and warrants further investigation.
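
As a hedged illustration of the underlying idea (not the experimental setup of the paper), the bone-implant construct can be idealised as a two-degree-of-freedom mass-spring system in which a softer bone-implant interface stands in for a looser fit; natural frequencies and mode shapes then follow from a generalised eigenproblem. All stiffness and mass values below are invented for illustration.

```python
import numpy as np
from scipy.linalg import eigh

def modal_properties(k_interface, k_bone=5e6, m_bone=0.5, m_implant=0.2):
    """Natural frequencies (Hz) and mode shapes of a 2-DOF idealisation:
    grounded bone mass coupled to the implant mass via the interface."""
    K = np.array([[k_bone + k_interface, -k_interface],
                  [-k_interface,          k_interface]])
    M = np.diag([m_bone, m_implant])
    eigvals, modes = eigh(K, M)      # generalised eigenproblem K v = w^2 M v
    freqs_hz = np.sqrt(eigvals) / (2 * np.pi)
    return freqs_hz, modes

# A looser implant fit is represented by a softer interface stiffness; the
# resulting downward shift in natural frequencies is the kind of change
# vibration monitoring aims to detect as osseointegration progresses.
for fit, k_if in [("loose", 5e5), ("secure", 5e6)]:
    freqs, _ = modal_properties(k_if)
    print(f"{fit} fit: natural frequencies {np.round(freqs, 1)} Hz")
```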

Relevance: 20.00%

Abstract:

With the increasing number of XML documents in varied domains, it has become essential to identify ways of finding interesting information in these documents. Data mining techniques have been used to derive this interesting information. Mining of XML documents is strongly influenced by the document model used, owing to the semi-structured nature of these documents. Hence, in this chapter we present an overview of the various models of XML documents, how these models have been used for mining, and some of the issues and challenges in these models. In addition, the chapter provides some insights into future models of XML documents for effectively capturing the two important features of XML documents for mining, namely structure and content.
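
As a simple illustration of the structure-and-content point (not a model from the chapter), the sketch below uses the standard library to represent one XML document as tag-path features (structure) plus the text attached to those paths (content); the sample document is invented.

```python
import xml.etree.ElementTree as ET
from collections import Counter

# Invented XML fragment used only to illustrate the two feature types.
doc = """<order id="17">
  <customer><name>Ada</name><city>Perth</city></customer>
  <items><item sku="A1"/><item sku="B4"/></items>
</order>"""

root = ET.fromstring(doc)

def walk(node, prefix=""):
    """Yield (root-to-node tag path, text content) pairs."""
    path = f"{prefix}/{node.tag}"
    yield path, (node.text or "").strip()
    for child in node:
        yield from walk(child, path)

# Structure: how often each tag path occurs; content: the text at each path.
structure = Counter(path for path, _ in walk(root))
content = {path: text for path, text in walk(root) if text}

print("structural features:", dict(structure))
print("content features:", content)
```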

Relevance: 20.00%

Abstract:

Existing recommendation systems often recommend products to users by capturing item-to-item and user-to-user similarity measures. These types of recommendation systems become inefficient in people-to-people networks, where people-to-people recommendation requires a two-way (reciprocal) relationship. Moreover, existing recommendation methods use traditional two-dimensional models to find interrelationships between alike users and items. Two-dimensional models are not sufficient for people-to-people networks, as the latent correlations between people and their attributes are not utilized. In this paper, we propose a novel tensor decomposition-based method for people-to-people recommendation based on users' profiles and their interactions. People-to-people network data is multi-dimensional; when modeled using vector-based methods it tends to suffer information loss, because such methods capture either the interactions or the attributes of the users but not both. This paper utilizes tensor models, which have the ability to correlate and find latent relationships between similar users based on both kinds of information, user interactions and user attributes, in order to generate recommendations. Empirical analysis is conducted on a real-life online dating dataset. As the results demonstrate, tensor modeling and decomposition enabled the identification of latent correlations between people based on their attributes and interactions in the network, and quality recommendations were derived using the 'alike' users concept.
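
The paper's specific tensor construction and decomposition are not reproduced here; the sketch below is a simplified stand-in that embeds users from an invented sender x receiver x interaction-type tensor together with an attribute matrix, then applies the 'alike users' idea by scoring candidates according to whom the most similar users interacted with. All dimensions and data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
n_users, n_types = 60, 3                      # e.g. view / message / reply

# Hypothetical interaction tensor T[sender, receiver, type] and a one-hot
# user attribute matrix A (age band, location, ...).
T = rng.poisson(0.05, size=(n_users, n_users, n_types)).astype(float)
A = rng.integers(0, 2, size=(n_users, 8)).astype(float)

# Embed each user from the mode-1 unfolding of T augmented with A, so the
# embedding reflects both interactions and profile attributes.
X = np.hstack([T.reshape(n_users, -1), A])
U, s, _ = np.linalg.svd(X, full_matrices=False)
emb = U[:, :10] * s[:10]

def recommend(user, n_alike=5, k=5):
    """Rank candidates by how often the users most similar to `user`
    (in embedding space) interacted with them."""
    norms = np.linalg.norm(emb, axis=1)
    sims = emb @ emb[user] / (norms * norms[user] + 1e-12)
    sims[user] = -np.inf                      # exclude the user themselves
    alike = np.argsort(sims)[::-1][:n_alike]
    scores = T[alike].sum(axis=(0, 2))        # interactions of alike users
    scores[user] = -np.inf
    return np.argsort(scores)[::-1][:k]

print("suggested matches for user 0:", recommend(0))
```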

Relevance: 20.00%

Abstract:

Continuum, partial differential equation models are often used to describe the collective motion of cell populations, with various types of motility represented by the choice of diffusion coefficient, and cell proliferation captured by the source terms. Previously, the choice of diffusion coefficient has been largely arbitrary, with the decision to choose a particular linear or nonlinear form generally based on calibration arguments rather than making any physical connection with the underlying individual-level properties of the cell motility mechanism. In this work we provide a new link between individual-level models, which account for important cell properties such as varying cell shape and volume exclusion, and population-level partial differential equation models. We work in an exclusion process framework, considering aligned, elongated cells that may occupy more than one lattice site, in order to represent populations of agents with different sizes. Three different idealizations of the individual-level mechanism are proposed, and these are connected to three different partial differential equations, each with a different diffusion coefficient: one linear, one nonlinear and degenerate, and one nonlinear and nondegenerate. We test the ability of these three models to predict the population-level response of a cell spreading problem for both proliferative and nonproliferative cases. We also explore the potential of our models to predict long-time travelling wave invasion rates and extend our results to two-dimensional spreading and invasion. Our results show that each model can accurately predict density data for nonproliferative systems, but that only one does so for proliferative systems. Hence great care must be taken when predicting density data for populations with varying cell shape.
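
The individual-level mechanisms in the paper allow elongated agents spanning several lattice sites; the sketch below is deliberately simpler, a single-site 1D exclusion process, intended only to show the kind of stochastic simulation whose averaged density a diffusion-type PDE is meant to reproduce. Parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simplified 1D exclusion process: each agent occupies one site and, per
# step, attempts a move to a random neighbour; the move is aborted if the
# target site is occupied or off the lattice.
L, steps = 200, 400
occupied = np.zeros(L, dtype=bool)
occupied[L // 2 - 10: L // 2 + 10] = True     # initial block of 20 agents

for _ in range(steps):
    for i in rng.permutation(np.flatnonzero(occupied)):
        target = i + rng.choice([-1, 1])
        if 0 <= target < L and not occupied[target]:
            occupied[i], occupied[target] = False, True

# Averaging the occupancy over many realisations gives the density profile
# compared against the continuum (PDE) prediction; in this single-site,
# nonproliferative case the matching PDE has a linear diffusion term.
print("agents remaining:", int(occupied.sum()))
```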

Relevance: 20.00%

Abstract:

The quality of conceptual business process models is highly relevant for the design of corresponding information systems. In particular, a precise measurement of model characteristics can be beneficial from a business perspective, helping to save costs thanks to early error detection. This is just as true from a software engineering point of view, where models facilitate stakeholder communication and software system design. Research has investigated several proposed measures for business process models, largely from a correlational perspective. This is helpful for understanding, for example, size and complexity as general driving forces of error probability. Yet design decisions usually have to build on thresholds that can reliably indicate that a certain counter-action has to be taken. This cannot be achieved by providing measures alone; it requires a systematic identification of effective and meaningful thresholds. In this paper, we derive thresholds for a set of structural measures for predicting errors in conceptual process models. To this end, we use a collection of 2,000 business process models from practice as a means of determining thresholds, applying an adaptation of the ROC curves method. Furthermore, an extensive validation of the derived thresholds was conducted using 429 EPC models from an Australian financial institution. Finally, significant thresholds were adapted to refine existing modeling guidelines in a quantitative way.
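
The paper's adaptation of the ROC curves method is not detailed in the abstract; the sketch below shows one standard ROC-based way to pick a threshold, maximising Youden's J over candidate cut-offs of a structural measure (model size is used as a stand-in) for predicting whether a model contains an error. The data is synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-in data: a structural measure (e.g. model size) for
# error-free models and for models containing at least one error.
measure = np.concatenate([rng.normal(25, 8, 400),    # error-free
                          rng.normal(45, 12, 100)])  # erroneous
has_error = np.concatenate([np.zeros(400, bool), np.ones(100, bool)])

# Sweep candidate thresholds; keep the one maximising Youden's J
# (sensitivity + specificity - 1), a common ROC-based criterion.
best_t, best_j = None, -1.0
for t in np.unique(measure):
    flagged = measure >= t
    tpr = (flagged & has_error).sum() / has_error.sum()
    fpr = (flagged & ~has_error).sum() / (~has_error).sum()
    if tpr - fpr > best_j:
        best_t, best_j = t, tpr - fpr

print(f"suggested threshold: {best_t:.1f} (Youden's J = {best_j:.2f})")
```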

Relevance: 20.00%

Abstract:

This thesis presents a design investigation into how traditional, technology-orientated markets can use design led innovation (DLI) strategies to achieve better market penetration of disruptive products. In a review of the Australian livestock industry, considering historical information and present-day trends, a lack of socio-cultural consideration was identified in the design and implementation of products and systems previously taken to market, and the adoption of these novel products has accordingly been documented as extremely slow. Classical diffusion models have typically been used to implement these products. However, this thesis proposes that, through the strategic intent of design led innovation, heavily technology-orientated markets (such as the Australian livestock industry) can achieve better final adoption rates. By considering a range of external factors (business models, technology and user needs), rather than focusing design efforts solely on the technology, it is argued that a DLI approach will make disruptive innovations easier to adopt in the Australian livestock industry. This thesis therefore explored two research questions: 1. What are the social inhibitors to the adoption of a new technology in the Australian livestock industry? 2. Can design be used to gain a significant feedback response to the proposed innovation? To answer these questions, the thesis used a design led innovation approach to investigate the livestock industry, centring on how design can be used early in the development of disruptive products being taken to market. A three-stage data collection programme was used, combining methods of design thinking, co-design and participatory design. The first study found four key themes among the social barriers to technology adoption: social attitudes to innovation, market monitoring, attitudes to 3D imaging, and online processes. These themes were built upon through a design thinking/co-design approach to create three ‘future scenarios’ to be tested in participant workshops. The analysis of the collected data identified four key socio-cultural barriers inhibiting the adoption of a disruptive innovation in the Australian livestock industry: a lack of education, the culture of innovation, a lack of engagement, and communication barriers. The thesis recommends five key areas to be focused upon in the subsequent design of a new product for the Australian livestock industry. These recommendations are made to business and design managers looking to introduce disruptive innovations in this industry. Moreover, the thesis presents three design implications relating to stakeholder attitudes, practical constraints and technological restrictions of innovations within the industry.

Relevance: 20.00%

Abstract:

Many modern business environments employ software to automate the delivery of workflows, whereas workflow design and generation remain a laborious technical task for domain specialists. Several different approaches have been proposed for deriving workflow models. Some approaches rely on process data mining, whereas others derive workflow models from operational structures, domain-specific knowledge, or workflow model compositions from knowledge bases. Many approaches draw on principles from automatic planning, but they are conceptual in nature and lack mathematical justification. In this paper we present a mathematical framework for deducing tasks in workflow models from plans in mechanistic or strongly controlled work environments, with a focus on automatic plan generation. In addition, we prove an associative composition operator that permits crisp hierarchical task compositions for workflow models through a set of mathematical deduction rules. The result is a logical framework that can be used to prove tasks in workflow hierarchies from operational information about work processes and machine configurations in controlled or mechanistic work environments.
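
The framework's deduction rules are not reproduced here; the sketch below only illustrates, under invented names, what an associative hierarchical composition of workflow tasks can look like: composing tasks flattens nested sequences, so the grouping of compositions does not matter.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class Task:
    """A workflow task: either a primitive step or a sequence of sub-tasks."""
    name: str
    subtasks: Tuple["Task", ...] = ()

def compose(left: Task, right: Task) -> Task:
    """Sequential composition that flattens nested sequences, which makes
    it associative: compose(a, compose(b, c)) == compose(compose(a, b), c)."""
    def parts(t: Task) -> Tuple[Task, ...]:
        return t.subtasks if t.name == "seq" else (t,)
    return Task("seq", parts(left) + parts(right))

a, b, c = Task("load"), Task("assemble"), Task("inspect")
assert compose(a, compose(b, c)) == compose(compose(a, b), c)
print(compose(compose(a, b), c))
```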