848 results for Model Based Testing


Relevance: 100.00%

Abstract:

Purpose: The aim was to construct and advise on the use of a cost-per-wear model based on contact lens replacement frequency, to form an equitable basis for cost comparison.

Methods: The annual cost of professional fees, contact lenses and solutions when wearing daily, two-weekly and monthly replacement contact lenses is determined in the context of the Australian market for spherical, toric and multifocal prescription types. This annual cost is divided by the number of times lenses are worn per year, resulting in a ‘cost-per-wear’. The model is presented graphically as the cost-per-wear versus the number of times lenses are worn each week for daily replacement and reusable (two-weekly and monthly replacement) lenses.

Results: The cost-per-wear for two-weekly and monthly replacement spherical lenses is almost identical but decreases with increasing frequency of wear. The cost-per-wear of daily replacement spherical lenses is lower than for reusable spherical lenses when worn from one to four days per week, but higher when worn six or seven days per week. The point at which the cost-per-wear is virtually the same for all three spherical lens replacement frequencies (approximately AUD$3.00) is five days of lens wear per week. A similar but upwardly displaced (higher cost) pattern is observed for toric lenses, with the cross-over point occurring between three and four days of wear per week (AUD$4.80). Multifocal lenses have the highest price, with cross-over points for daily versus two-weekly replacement lenses at between four and five days of wear per week (AUD$5.00) and for daily versus monthly replacement lenses at three days per week (AUD$5.50).

Conclusions: This cost-per-wear model can be used to assist practitioners and patients in making an informed decision in relation to the cost of contact lens wear, as one of many considerations that must be taken into account when deciding on the most suitable lens replacement modality.
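The arithmetic behind the model is straightforward to reproduce. The sketch below computes a cost-per-wear comparison for a daily and a monthly replacement modality; the fee, lens and solution prices are illustrative placeholders, not the Australian-market figures used in the study.

```python
def cost_per_wear(annual_fees, lens_cost_per_unit, lenses_per_year,
                  solution_cost_per_year, wears_per_week):
    """Annual cost of lens wear divided by the number of wears per year."""
    wears_per_year = wears_per_week * 52
    annual_cost = (annual_fees
                   + lens_cost_per_unit * lenses_per_year
                   + solution_cost_per_year)
    return annual_cost / wears_per_year

# Hypothetical prices (AUD): daily vs monthly replacement, spherical lenses.
for wears in range(1, 8):
    daily = cost_per_wear(annual_fees=120, lens_cost_per_unit=1.0,
                          lenses_per_year=wears * 52 * 2,  # two lenses per wear
                          solution_cost_per_year=0, wears_per_week=wears)
    monthly = cost_per_wear(annual_fees=120, lens_cost_per_unit=25.0,
                            lenses_per_year=24,  # 12 pairs used regardless of wear
                            solution_cost_per_year=100, wears_per_week=wears)
    print(f"{wears} day(s)/week: daily ${daily:.2f}, monthly ${monthly:.2f}")
```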

Relevance: 100.00%

Abstract:

As more and more information becomes available on the Web, finding quality and reliable information is becoming harder. To help solve this problem, Web search models need to incorporate users’ cognitive styles. This paper reports preliminary results from a user study exploring the relationships between Web users’ searching behavior and their cognitive style. The data were collected using a questionnaire, Web search logs and a think-aloud strategy. The preliminary findings reveal that a number of cognitive factors, such as information searching processes, results evaluation and cognitive style, influence users’ Web searching behavior. Among these factors, the cognitive style of the user was observed to have the greatest impact. Based on the key findings, a conceptual model of Web searching and cognitive styles is presented.

Relevance: 100.00%

Abstract:

In information retrieval (IR) research, more and more focus has been placed on optimizing a query language model by detecting and estimating the dependencies between the query and the observed terms occurring in the selected relevance feedback documents. In this paper, we propose a novel Aspect Language Modeling framework featuring term association acquisition, document segmentation, query decomposition, and an Aspect Model (AM) for parameter optimization. Through the proposed framework, we advance the theory and practice of applying high-order and context-sensitive term relationships to IR. We first decompose a query into subsets of query terms. Then we segment the relevance feedback documents into chunks using multiple sliding windows. Finally, we discover the higher-order term associations, that is, the terms in these chunks with a high degree of association to the subsets of the query. In this process, we adopt an approach that combines the AM with Association Rule (AR) mining. In our approach, the AM not only considers the subsets of a query as “hidden” states and estimates their prior distributions, but also evaluates the dependencies between the subsets of a query and the observed terms extracted from the chunks of feedback documents. The AR mining provides a reasonable initial estimation of the high-order term associations by discovering association rules from the document chunks. Experimental results on various TREC collections verify the effectiveness of our approach, which significantly outperforms a baseline language model and two state-of-the-art query language models, namely the Relevance Model and the Information Flow model.
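To make the segmentation and association-mining steps concrete, the sketch below chunks feedback documents with multiple sliding windows and counts how often each small subset of query terms co-occurs with other chunk terms. The window sizes, subset sizes and counting rule are illustrative assumptions, not the paper's actual AM/AR estimation.

```python
from itertools import combinations
from collections import Counter

def sliding_chunks(tokens, window, step):
    """Segment a token list into overlapping chunks."""
    return [tokens[i:i + window]
            for i in range(0, max(len(tokens) - window + 1, 1), step)]

def subset_term_associations(query_terms, feedback_docs, windows=(8, 16)):
    """Count co-occurrence of query-term subsets with other terms in chunks."""
    assoc = Counter()
    subsets = [frozenset(s) for r in (1, 2)
               for s in combinations(query_terms, r)]
    for doc in feedback_docs:
        for w in windows:
            for chunk in sliding_chunks(doc, window=w, step=w // 2):
                chunk_set = set(chunk)
                for s in subsets:
                    if s <= chunk_set:
                        for term in chunk_set - s:
                            assoc[(s, term)] += 1
    return assoc

docs = [["language", "model", "query", "term", "dependency", "estimation"],
        ["aspect", "model", "query", "decomposition", "term", "association"]]
print(subset_term_associations(["query", "term"], docs).most_common(5))
```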

Relevance: 100.00%

Abstract:

We present a novel approach for developing summary statistics for use in approximate Bayesian computation (ABC) algorithms using indirect inference. We embed this approach within a sequential Monte Carlo algorithm that is completely adaptive. This methodological development was motivated by an application involving data on macroparasite population evolution modelled with a trivariate Markov process. The main objective of the analysis is to compare inferences on the Markov process when considering two different indirect models. The two indirect models are based on a Beta-Binomial model and a three-component mixture of Binomials, with the former providing a better fit to the observed data.
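A minimal sketch of the underlying idea, under simplifying assumptions: the summary statistic is a parameter estimate of a simple auxiliary model fitted to the data (indirect inference), and plain rejection ABC is used instead of the paper's adaptive sequential Monte Carlo. The toy Poisson process below stands in for the trivariate Markov model of macroparasite evolution.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=200):
    """Toy data-generating process standing in for the Markov process."""
    return rng.poisson(theta, size=n)

def auxiliary_summary(data):
    """Indirect-inference summary: moment-based parameter estimates of a
    simple auxiliary model (here, just the sample mean and variance)."""
    return np.array([data.mean(), data.var()])

def abc_rejection(observed, prior_draws, eps):
    """Keep prior draws whose simulated auxiliary estimates are close to
    the observed ones (plain rejection ABC)."""
    s_obs = auxiliary_summary(observed)
    accepted = []
    for theta in prior_draws:
        s_sim = auxiliary_summary(simulate(theta))
        if np.linalg.norm(s_sim - s_obs) < eps:
            accepted.append(theta)
    return np.array(accepted)

observed = simulate(3.0)
prior = rng.uniform(0.5, 10.0, size=5000)
posterior = abc_rejection(observed, prior, eps=0.5)
print(posterior.mean(), posterior.size)
```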

Relevance: 100.00%

Abstract:

The Fluid–Structure Interaction (FSI) problem is significant in science and engineering and poses challenges for computational mechanics. The coupled Finite Element and Smoothed Particle Hydrodynamics (FE-SPH) model is a robust technique for the simulation of FSI problems. However, two important steps in the coupled FE-SPH model, neighbor searching and contact searching, are extremely time-consuming. The Point-In-Box (PIB) searching algorithm was developed by Swegle to improve searching efficiency; however, its efficiency can be significantly affected by the distribution of points (nodes in FEM and particles in SPH). In this paper, in order to improve searching efficiency, a novel Striped-PIB (S-PIB) searching algorithm is proposed to overcome the point-distribution shortcoming of the PIB algorithm, and the two time-consuming steps of neighbor searching and contact searching are integrated into a single searching step. The accuracy and efficiency of the newly developed searching algorithm are studied through efficiency tests and FSI problems. It is found that the newly developed model can significantly improve computational efficiency, and it is believed to be a powerful tool for FSI analysis.
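The sketch below illustrates the general point-in-box idea that S-PIB builds on: point indices are sorted once along each axis, and a box query is answered by binary searches plus a set intersection. It is a generic illustration, not the authors' Striped-PIB algorithm or the integrated neighbor/contact search.

```python
import bisect
import numpy as np

def build_axis_index(points):
    """Sort point indices along each coordinate axis (done once per step)."""
    return [sorted(range(len(points)), key=lambda i: points[i][d])
            for d in range(points.shape[1])]

def points_in_box(points, axis_index, lo, hi):
    """Return indices of points inside the axis-aligned box [lo, hi]."""
    candidates = None
    for d, order in enumerate(axis_index):
        coords = [points[i][d] for i in order]        # sorted along axis d
        start = bisect.bisect_left(coords, lo[d])
        stop = bisect.bisect_right(coords, hi[d])
        in_range = set(order[start:stop])
        candidates = in_range if candidates is None else candidates & in_range
    return candidates

pts = np.random.default_rng(1).random((1000, 3))
idx = build_axis_index(pts)
neighbours = points_in_box(pts, idx, lo=(0.4, 0.4, 0.4), hi=(0.6, 0.6, 0.6))
print(len(neighbours))
```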

Relevance: 100.00%

Abstract:

Fault identification in industrial machines is a topic of major importance from an engineering point of view. In fact, the possibility of identifying not only the type, but also the severity and the position of a fault occurring along a shaft-line allows quick maintenance and shortens the downtime. This is particularly important in the power generation industry, where the units are often several tens of meters long and where the rotors are enclosed by heavy, pressure-sealed casings. In this paper, an industrial experimental case is presented concerning the identification of the unbalance on a large steam turbine of about 1.3 GW belonging to a nuclear power plant. The case history is analyzed by considering the vibrations measured by the condition monitoring system of the unit. A model-based method in the frequency domain, developed by the authors, is introduced in detail and is then used to identify the position of the fault and its severity along the shaft-line. The complete model of the unit (the rotor, modeled by means of finite elements; the bearings, modeled by linearized damping and stiffness coefficients; and the foundation, modeled by means of pedestals) is analyzed and discussed before being used for the fault identification. The actual fault was assessed by inspection during a scheduled maintenance, and excellent correspondence was found with the fault identified by means of the authors' proposed method. Finally, a complete discussion is presented on the effectiveness of the method, even in the presence of a machine model that is not finely tuned and when only a few measuring planes are available for the machine vibration.
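The core of a frequency-domain, model-based identification can be sketched as a least-squares problem: for each candidate fault location, fit the complex fault force that best reproduces the measured vibrations through the model's response, and keep the location with the smallest residual. The response vectors and measurements below are fabricated for illustration; they are not derived from the turbine model described in the paper.

```python
import numpy as np

def identify_unbalance(H, measured, candidate_nodes):
    """Model-based unbalance identification by least squares.

    H[node]  : complex vector mapping a unit unbalance at `node` to the
               response at the measuring planes (from the rotor/bearings/
               foundation model, assumed precomputed).
    measured : complex vibration vector at the measuring planes.
    Returns the node, complex force and residual of the best fit.
    """
    best = None
    for node in candidate_nodes:
        h = H[node]                                   # response to a unit fault
        f, *_ = np.linalg.lstsq(h.reshape(-1, 1), measured, rcond=None)
        residual = np.linalg.norm(measured - h * f[0])
        if best is None or residual < best[2]:
            best = (node, f[0], residual)
    return best

# Toy example with a fabricated 3-plane model and 4 candidate fault nodes.
rng = np.random.default_rng(2)
H = {n: rng.normal(size=3) + 1j * rng.normal(size=3) for n in range(4)}
measured = H[2] * (0.8 * np.exp(1j * 0.6))            # synthetic fault at node 2
node, force, res = identify_unbalance(H, measured, candidate_nodes=range(4))
print(node, abs(force), res)
```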

Relevance: 100.00%

Abstract:

The aim of this study is to investigate the stress relaxation behavior of single chondrocytes using the Porohyperelastic (PHE) model and inverse Finite Element Analysis (FEA). First, using the Atomic Force Microscopy (AFM) technique, we found that chondrocytes exhibit stress relaxation behavior. We explored the mechanism of this behavior and concluded that the intracellular fluid exuding from the cells during deformation plays the most important role in the stress relaxation. Next, we applied the inverse FEA technique to determine the material parameters needed for the PHE model to simulate this stress relaxation behavior, as this model has proven capable of capturing the non-linear behavior and the fluid-solid interaction during the stress relaxation of single chondrocytes. It is observed that the PHE model can precisely capture the stress relaxation behavior of single chondrocytes and would be a suitable model for cell biomechanics.
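The inverse-FEA idea reduces to adjusting model parameters until a forward simulation reproduces the measured AFM relaxation curve. In the sketch below the forward model is a simple exponential relaxation used only as a stand-in for the porohyperelastic finite-element simulation, and all parameter values and names are placeholders.

```python
import numpy as np
from scipy.optimize import least_squares

def forward_model(params, t):
    """Placeholder forward model standing in for the PHE finite-element run:
    equilibrium force f_inf, relaxation amplitude d_f, time constant tau."""
    f_inf, d_f, tau = params
    return f_inf + d_f * np.exp(-t / tau)

def inverse_fit(t, measured_force, initial_guess):
    """Inverse-FEA style fit: tune parameters to match the measured curve."""
    residual = lambda p: forward_model(p, t) - measured_force
    return least_squares(residual, initial_guess, bounds=(0, np.inf))

# Synthetic 'measurement' for demonstration only.
t = np.linspace(0, 60, 200)
measured = forward_model([0.5, 1.2, 8.0], t) \
           + np.random.default_rng(3).normal(0, 0.01, t.size)
fit = inverse_fit(t, measured, initial_guess=[1.0, 1.0, 5.0])
print(fit.x)
```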

Relevance: 100.00%

Abstract:

This paper proposes a physically motivated reappraisal of manoeuvring models for ships and presents a new model developed from first principles by application of low aspect-ratio aerodynamic theory and Lagrangian mechanics. The coefficients of the model are shown to be related to physical processes, and validation is presented using the results from a planar motion mechanism dataset.

Relevance: 100.00%

Abstract:

One of the problems to be solved in attaining the full potential of hematopoietic stem cell (HSC) applications is the limited availability of the cells. Growing HSCs in a bioreactor offers an alternative solution to this problem. It also offers the advantage of eliminating the labour-intensive process, as well as the possible contamination, involved in the periodic nutrient replenishment of traditional T-flask stem cell cultivation. In spite of this, the optimization of HSC cultivation in a bioreactor has barely been explored. This manuscript discusses the development of a mathematical model to describe the dynamics of nutrient distribution and cell concentration in an ex vivo HSC cultivation in a microchannel perfusion bioreactor. The model was further used to optimize the cultivation by proposing three alternative feeding strategies in order to prevent the occurrence of nutrient limitation in the bioreactor. The evaluation of these strategies (a periodic step-change increase in the inlet oxygen concentration, a periodic step-change increase in the media inflow, and feedback control of the media inflow) shows that they can successfully improve the cell yield of the bioreactor. In general, the developed model is useful for the design and optimization of bioreactor operation.
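A minimal sketch of the kind of model involved, under strong simplifications: a single nutrient with Monod growth kinetics in a perfused channel, and a periodic step-change increase in the inlet nutrient concentration as one feeding strategy. All parameter values are illustrative placeholders, not the paper's.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (not taken from the paper).
MU_MAX, K_S, Y_XS = 0.04, 0.2, 1.0e8    # 1/h, mM, cells per mmol
D, S_IN = 0.1, 1.0                       # dilution rate 1/h, base inlet conc. mM

def inlet_concentration(t):
    """Feeding strategy: periodic step-change increase in inlet nutrient."""
    return S_IN * (2.0 if (t // 24) % 2 else 1.0)

def bioreactor(t, y):
    x, s = y                                   # cell and nutrient concentration
    mu = MU_MAX * s / (K_S + s)                # Monod growth kinetics
    dx = mu * x                                # cells retained in the channel
    ds = D * (inlet_concentration(t) - s) - mu * x / Y_XS
    return [dx, ds]

sol = solve_ivp(bioreactor, (0, 240), [1e5, S_IN], max_step=0.5)
print(f"final cell density: {sol.y[0, -1]:.3e}")
```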

Relevance: 100.00%

Abstract:

Solid–interstitial fluid interaction, which depends on tissue permeability, is significant to the strain-rate-dependent mechanical behavior of humeral head (shoulder) cartilage. Due to anatomical and biomechanical similarities to the human shoulder, the kangaroo presents a suitable animal model. Therefore, indentation experiments were conducted on kangaroo shoulder cartilage tissues from low (10^-4/s) to moderately high (10^-2/s) strain-rates. A porohyperelastic model was developed based on the experimental characterization, and a permeability function that takes into account the effect of strain-rate on permeability (strain-rate-dependent permeability) was introduced into the model to investigate the effect of rate-dependent fluid flow on tissue response. The prediction of the model with strain-rate-dependent permeability was compared with those of the models using constant permeability and strain-dependent permeability. Compared to the model with constant permeability, the models with strain-dependent and strain-rate-dependent permeability were able to better capture the experimental variation at all strain-rates (p<0.05). Significant differences were not identified between the models with strain-dependent and strain-rate-dependent permeability at a strain-rate of 5×10^-3/s (p=0.179). However, at a strain-rate of 10^-2/s, the model with strain-rate-dependent permeability was significantly better at capturing the experimental results (p<0.005). The findings thus revealed the significance of rate-dependent fluid flow in tissue behavior at large strain-rates, which provides insights into the mechanical deformation mechanisms of cartilage tissues.
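The sketch below contrasts a constant, a strain-dependent and a strain-rate-dependent permeability function of the general kind described above. The exponential strain dependence follows a common cartilage-poromechanics convention, but the strain-rate term and all numerical values are hypothetical illustrations, not the fitted forms from this study.

```python
import numpy as np

# Illustrative constants only; not the study's fitted values.
K0 = 1e-15      # zero-strain permeability, m^4/(N s)
M = 4.0         # strain-dependence exponent
ALPHA = 50.0    # assumed strain-rate sensitivity, s

def permeability_constant(strain, strain_rate):
    return K0

def permeability_strain_dependent(strain, strain_rate):
    """Permeability decreasing exponentially with compressive strain."""
    return K0 * np.exp(M * strain)        # strain is negative in compression

def permeability_strain_rate_dependent(strain, strain_rate):
    """Hypothetical variant: faster loading further reduces the effective
    permeability seen by the interstitial fluid."""
    return K0 * np.exp(M * strain) / (1.0 + ALPHA * strain_rate)

for rate in (1e-4, 5e-3, 1e-2):
    print(rate, permeability_strain_rate_dependent(-0.1, rate))
```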

Relevance: 100.00%

Abstract:

The aim of this dissertation is to provide conceptual tools for the social scientist for clarifying, evaluating and comparing explanations of social phenomena based on formal mathematical models. The focus is on relatively simple theoretical models and simulations, not statistical models. These studies apply a theory of explanation according to which explanation is about tracing objective relations of dependence, knowledge of which enables answers to contrastive why- and how-questions. This theory is developed further by delineating criteria for evaluating competing explanations and by applying the theory to social scientific modelling practices and to the key concepts of equilibrium and mechanism. The dissertation comprises an introductory essay and six published original research articles. The main theses about model-based explanations in the social sciences argued for in the articles are the following. 1) The concept of explanatory power, often used to argue for the superiority of one explanation over another, encompasses five dimensions which are partially independent and involve some systematic trade-offs. 2) Not all equilibrium explanations causally explain the obtaining of the end equilibrium state from the multiple possible initial states. Instead, they often constitutively explain the macro property of the system with the micro properties of the parts (together with their organization). 3) There is an important ambivalence in the concept of mechanism used in many model-based explanations, and this difference corresponds to a difference between two alternative research heuristics. 4) Whether unrealistic assumptions in a model (such as a rational choice model) are detrimental to an explanation provided by the model depends on whether the representation of the explanatory dependency in the model is itself dependent on the particular unrealistic assumptions. Thus, evaluating whether a literally false assumption in a model is problematic requires specifying exactly what is supposed to be explained and by what. 5) The question of whether an explanatory relationship depends on particular false assumptions can be explored with the process of derivational robustness analysis, and the importance of robustness analysis accounts for some of the puzzling features of the tradition of model-building in economics. 6) The fact that economists have been relatively reluctant to use true agent-based simulations to formulate explanations can partially be explained by the specific ideal of scientific understanding implicit in the practice of orthodox economics.

Relevance: 100.00%

Abstract:

This paper describes the development of a model, based on Bayesian networks, to estimate the likelihood that sheep flocks are infested with lice at shearing and to assist farm managers or advisers in assessing whether or not to apply a lousicide treatment. The risk of lice comes from three main sources: (i) lice may have been present at the previous shearing and not eradicated; (ii) lice may have been introduced with purchased sheep; and (iii) lice may have entered with strays. A Bayesian network is used to assess the probability of each of these events independently and to combine them into an overall assessment. Rubbing is a common indicator of lice, but it has other causes too. If rubbing has been observed, an additional Bayesian network is used to assess the probability that lice are the cause. The presence or absence of rubbing, and its possible cause, are combined with these networks to improve the overall risk assessment.
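A minimal sketch of how the three independent sources of risk and the rubbing evidence might be combined: the sub-network outputs are merged with a noisy-OR style combination and the rubbing observation is folded in with a Bayes update. The probabilities and conditional values are illustrative placeholders; the paper's Bayesian networks are considerably richer than this.

```python
def combined_infestation_risk(p_residual, p_purchased, p_strays):
    """Probability that lice entered from at least one of three independent
    sources (noisy-OR combination of the sub-network outputs)."""
    p_clear = (1 - p_residual) * (1 - p_purchased) * (1 - p_strays)
    return 1 - p_clear

def update_with_rubbing(prior, p_rub_given_lice=0.8, p_rub_given_clear=0.2):
    """Bayes update of the infestation probability when rubbing is observed."""
    num = p_rub_given_lice * prior
    return num / (num + p_rub_given_clear * (1 - prior))

prior = combined_infestation_risk(0.05, 0.10, 0.03)
print(round(prior, 3), round(update_with_rubbing(prior), 3))
```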

Relevance: 100.00%

Abstract:

Various suggestions exist for building a functional, fault-tolerant, large-scale quantum computer. Topological quantum computation is a more exotic suggestion, which makes use of the properties of quasiparticles that are manifest only in certain two-dimensional systems. These so-called anyons exhibit topological degrees of freedom which, in principle, can be used to execute quantum computation with intrinsic fault-tolerance. This feature is the main incentive to study topological quantum computation. The objective of this thesis is to provide an accessible introduction to the theory. The thesis considers the theory of anyons arising in two-dimensional quantum mechanical systems, which are described by gauge theories based on so-called quantum double symmetries. The quasiparticles are shown to exhibit interactions and carry quantum numbers which are both of a topological nature. In particular, it is found that the addition of the quantum numbers is not unique, but that the fusion of the quasiparticles is described by a non-trivial fusion algebra. It is discussed how this property can be used to encode quantum information in a manner which is intrinsically protected from decoherence, and how one could, in principle, perform quantum computation by braiding the quasiparticles. As an example of the presented general discussion, the particle spectrum and the fusion algebra of an anyon model based on the gauge group S_3 are explicitly derived. The fusion algebra is found to branch into multiple proper subalgebras, and the simplest of them is chosen as a model for an illustrative demonstration. The different steps of a topological quantum computation are outlined and the computational power of the model is assessed. It turns out that the chosen model is not universal for quantum computation. However, because the objective was a demonstration of the theory with explicit calculations, none of the other, more complicated fusion subalgebras were considered. Studying their applicability to quantum computation could be a topic of further research.
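To show how a non-trivial fusion algebra can be tabulated and iterated, the sketch below uses the well-known two-particle Fibonacci anyon model rather than the larger S_3 quantum double spectrum derived in the thesis; the data structure and the fusion loop are the illustrative point.

```python
from collections import Counter

# Fusion rules of the Fibonacci anyon model ("1" = vacuum, "t" = tau), used
# here purely to illustrate a non-trivial fusion algebra.
FUSION = {
    ("1", "1"): ["1"],
    ("1", "t"): ["t"],
    ("t", "1"): ["t"],
    ("t", "t"): ["1", "t"],   # non-trivial rule: two possible fusion outcomes
}

def fuse(charges_a, charges_b):
    """Fuse two superpositions of topological charges (multiset semantics)."""
    out = Counter()
    for a, na in charges_a.items():
        for b, nb in charges_b.items():
            for c in FUSION[(a, b)]:
                out[c] += na * nb
    return out

# Fusing successive tau-anyons: the multiplicity of each channel grows like
# the Fibonacci numbers, which is the source of the computational space.
state = Counter({"t": 1})
for _ in range(5):
    state = fuse(state, Counter({"t": 1}))
print(dict(state))
```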