993 results for model refinement


Relevance:

30.00%

Publisher:

Abstract:

Pasminco Century Mine has developed a geophysical logging system to provide new data for ore mining/grade control and the generation of Short Term Models for mine planning. Previous work indicated the applicability of petrophysical logging for lithology prediction; however, the automation of the method was not considered reliable enough for the development of a mining model. A test survey was undertaken using two diamond-drilled control holes and eight percussion holes. All holes were logged with natural gamma, magnetic susceptibility and density. Calibration of the LogTrans auto-interpretation software using only natural gamma and magnetic susceptibility indicated that both lithology and stratigraphy could be predicted. Development of a capability to enforce stratigraphic order within LogTrans increased the reliability and accuracy of interpretations. After the completion of a feasibility program, Century Mine has invested in a dedicated logging vehicle to log blast holes as well as for use in in-fill drilling programs. Future refinement of the system may lead to the development of GPS-controlled excavators for mining ore.
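
The core of such an auto-interpretation step is a classifier that assigns a lithology to each logged interval from its petrophysical readings. As a minimal illustration (a nearest-centroid sketch with invented values and class names; the abstract does not describe LogTrans's actual statistics):

    # Hypothetical sketch: classify a depth interval by the nearest class
    # centroid in (natural gamma, magnetic susceptibility) space, using
    # labelled intervals from control holes as training data.
    import numpy as np

    train = np.array([[45.0, 1.2e-4],    # invented centroids:
                      [120.0, 3.0e-5],   # [gamma (API), mag. sus. (SI)]
                      [60.0, 2.5e-4]])
    labels = np.array(["shale", "ore", "siltstone"])  # hypothetical classes

    def classify(gamma, magsus):
        """Return the lithology whose centroid is closest to the reading."""
        scale = train.max(axis=0)  # scale channels so neither dominates
        d = np.linalg.norm((train - [gamma, magsus]) / scale, axis=1)
        return labels[np.argmin(d)]

    print(classify(110.0, 2.8e-5))  # -> "ore"

Enforcing stratigraphic order, as the abstract describes, would then constrain these per-interval decisions so that predicted units can only appear in their known sequence down-hole.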

Relevance:

30.00%

Publisher:

Abstract:

Program compilation can be formally defined as a sequence of equivalence-preserving transformations, or refinements, from high-level language programs to assembler code. Recent models also incorporate timing properties, but the resulting formalisms are intimidatingly complex. Here we take advantage of a new, simple model of real-time refinement, based on predicate transformer semantics, to present a straightforward compilation formalism that incorporates real-time constraints.
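
The underlying refinement notion can be stated in weakest-precondition form; a standard formulation (the abstract does not reproduce the authors' exact definitions) is, in LaTeX:

    % S is refined by T when T establishes every postcondition S does:
    S \sqsubseteq T \;\iff\; \forall Q :\; wp(S, Q) \Rightarrow wp(T, Q)
    % e.g. an assignment's weakest precondition substitutes e for x:
    wp(x := e,\, Q) = Q[e/x]

A compilation formalism in this style shows that each translation step, from source construct to assembler fragment, is such a refinement, with the timing constraints carried through the predicates.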

Relevance:

30.00%

Publisher:

Abstract:

The refinement calculus is a well-established theory for deriving program code from specifications. Recent research has extended the theory to handle timing requirements, as well as functional ones, and we have developed an interactive programming tool based on these extensions. Through a number of case studies completed using the tool, this paper explains how the tool helps the programmer by supporting the many forms of variables needed in the theory. These include simple state variables as in the untimed calculus, trace variables that model the evolution of properties over time, auxiliary variables that exist only to support formal reasoning, subroutine parameters, and variables shared between parallel processes.
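
Of the variable kinds listed, the contrast between state and trace variables carries most of the timed theory; schematically (illustrative notation only, not the tool's concrete syntax):

    % a state variable holds one current value:
    x : T
    % a trace variable is a total function of time, recording evolution:
    tr : \mathit{Time} \rightarrow T
    % so a timed requirement can constrain the whole evolution, e.g.:
    \forall t : [s, f] \bullet tr(t) \le \mathit{bound}

Auxiliary variables, parameters and shared variables then layer reasoning support and concurrency on top of this basic picture.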

Relevance:

30.00%

Publisher:

Abstract:

This paper is concerned with methods for the refinement of specifications written using a combination of Object-Z and CSP. Such a combination has proved to be a suitable vehicle for specifying complex systems which involve both state and behaviour, and several proposals exist for integrating these two languages. The basis of the integration in this paper is a semantics of Object-Z classes identical to that of CSP processes. This allows classes specified in Object-Z to be combined using CSP operators. It has been shown that this semantic model allows state-based refinement relations to be used on the Object-Z components of an integrated Object-Z/CSP specification. However, the current refinement methodology does not allow the structure of a specification to be changed during refinement, whereas a full methodology would, for example, allow concurrency to be introduced during the development life-cycle. In this paper, we tackle these concerns and discuss refinements of specifications written using Object-Z and CSP in which the structure of the specification changes as part of the refinement. In particular, we develop a set of structural simulation rules which allow single components to be refined to more complex specifications involving CSP operators. The soundness of these rules is verified against the common semantic model, and they are illustrated via a number of examples.
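
The common semantic model here is CSP's failures-divergences model, in which process refinement has the standard definition (stated for orientation; the paper's structural rules build on it):

    P \sqsubseteq_{FD} Q \;\iff\;
        failures(Q) \subseteq failures(P) \,\land\,
        divergences(Q) \subseteq divergences(P)

Since Object-Z classes carry the same semantics, a structural simulation rule can, for instance, replace a single class by a parallel composition of classes, provided the composition's failures and divergences are contained in those of the original.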

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To test discriminant analysis as a method of turning the information of a routine customer satisfaction survey (CSS) into a more accurate decision-making tool. METHODS: A 7-question, 10-multiple-choice, self-administered questionnaire was used to study a sample of patients seen in two outpatient care units in Valparaíso, Chile, one of primary care (n=100) and the other of secondary care (n=249). Two cut-off points were considered for the dependent variable (final satisfaction score): satisfied versus unsatisfied, and very satisfied versus all others. Results were compared with empirical measures (the proportion of satisfied individuals, the proportion of unsatisfied individuals and the median value). RESULTS: The response rate was very high, over 97.0% in both units. A new variable, medical attention, was revealed as explaining satisfaction at the primary care unit. The proportion of the total variability explained by the model was very high (over 99.4%) in both units when comparing satisfied with unsatisfied customers. In the analysis of very satisfied versus all other customers, a significant relationship was identified only in the case of the primary care unit, and it explained a small proportion of the variability (41.9%). CONCLUSIONS: Discriminant analysis identified relationships not revealed by the previous analysis. It provided information about the proportion of the variability explained by the model. It identified as non-significant some relationships suggested by the empirical analysis (e.g. very satisfied versus all others in the secondary care unit). It measured the contribution of each independent variable to the explanation of the variation of the dependent one.
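
As a concrete illustration of the method (with invented data; the actual survey items are those described above), a linear discriminant analysis separating satisfied from unsatisfied respondents can be sketched as:

    # Hypothetical sketch: LDA on 7 multiple-choice items, 10-point scale.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    X = rng.integers(1, 11, size=(100, 7))   # 100 respondents, 7 items
    y = (X.mean(axis=1) > 5.5).astype(int)   # 1 = satisfied, 0 = unsatisfied

    lda = LinearDiscriminantAnalysis().fit(X, y)
    print(lda.score(X, y))   # proportion of respondents correctly classified
    print(lda.coef_)         # each item's weight in the discriminant function

The coefficient vector is what gives the method its diagnostic value here: it quantifies each item's contribution to the satisfied/unsatisfied separation, as the conclusions note.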

Relevance:

30.00%

Publisher:

Abstract:

Variational steepest descent approximation schemes for the modified Patlak-Keller-Segel equation with a logarithmic interaction kernel in any dimension are considered. We prove the convergence of the implicit Euler scheme associated to this equation, suitably interpolated in time and defined in terms of the Euclidean Wasserstein distance, for sub-critical masses. As a consequence, we recover the recent result on the global-in-time existence of weak solutions to the modified Patlak-Keller-Segel equation with the logarithmic interaction kernel in any dimension in the sub-critical case. Moreover, we show how this method performs numerically in one dimension. In this particular case, the numerical scheme corresponds to a standard implicit Euler method for the pseudo-inverse of the cumulative distribution function. We demonstrate its capability to reproduce the blow-up of solutions for super-critical masses easily, without the need for mesh refinement.
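
The variational steepest descent step referred to is of Jordan-Kinderlehrer-Otto type; writing F for the free-energy functional of the model, each time step of size tau solves (in LaTeX):

    \rho^{k+1} \in \operatorname*{argmin}_{\rho}
        \left\{ \frac{1}{2\tau} W_2^2(\rho, \rho^k) + \mathcal{F}[\rho] \right\}

In one dimension, the W_2 distance between two densities equals the L^2 distance between the pseudo-inverses of their cumulative distribution functions, which is why the scheme reduces to a standard implicit Euler method in that variable.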

Relevance:

30.00%

Publisher:

Abstract:

In April 2011, the OECD released an important discussion draft that is intended to clarify the meaning of the term "beneficial ownership" under articles 10, 11 and 12 of the OECD Model (2010). This article discusses these proposals and demonstrates that some refinement is necessary.

Relevance:

30.00%

Publisher:

Abstract:

QUESTIONS UNDER STUDY: The starting point of the interdisciplinary project "Assessing the impact of diagnosis related groups (DRGs) on patient care and professional practice" (IDoC) was the lack of a systematic ethical assessment of the introduction of cost containment measures in healthcare. Our aim was to contribute to the methodological and empirical basis of such an assessment. METHODS: Five sub-groups conducted separate but related research within the fields of biomedical ethics, law, nursing sciences and health services, applying a number of complementary methodological approaches. The individual research projects were framed within an overall ethical matrix. Workshops and bilateral meetings were held to identify and elaborate joint research themes. RESULTS: Four common, ethically relevant themes emerged from the results of the studies across sub-groups: (1) the quality and safety of patient care; (2) the state of professional practice of physicians and nurses; (3) changes in incentive structures; (4) vulnerable groups and access to healthcare services. Furthermore, much-needed data for future comparative research have been collected, and some early insights into the potential impact of DRGs are outlined. CONCLUSIONS: Based on the joint results, we developed preliminary recommendations related to conceptual analysis, methodological refinement, monitoring and implementation.

Relevance:

30.00%

Publisher:

Abstract:

Homology modeling is the most commonly used technique to build a three-dimensional model for a protein sequence. It relies heavily on the quality of the sequence alignment between the protein to be modeled and related proteins of known three-dimensional structure. Alignment quality can be assessed according to the physico-chemical properties of the three-dimensional models it produces. In this work, we introduce fifteen predictors designed to evaluate the properties of the models obtained from various alignments. Each consists of an energy value obtained from one of several force fields (CHARMM, ProsaII or ANOLEA), computed on residues selected around misaligned regions. These predictors were evaluated on ten challenging test cases. For each target, all possible ungapped alignments are generated and their corresponding models are computed and evaluated. The best predictor, retrieving the structural alignment for 9 out of 10 test cases, is based on the ANOLEA atomistic mean-force potential and takes into account residues around misaligned secondary-structure elements. The performance of the other predictors is significantly lower. This work shows that substantial improvement in local alignments can be obtained by careful assessment of the local structure of the resulting models.
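
The evaluation protocol lends itself to a compact sketch (hypothetical code; score_model stands in for an energy evaluation such as ANOLEA restricted to residues around misaligned regions):

    # Enumerate every ungapped alignment of target against template,
    # build/score a model for each offset, and keep the best-scoring one.
    def best_ungapped_alignment(target, template, score_model):
        best = None
        for offset in range(-len(target) + 1, len(template)):
            # Pairs of aligned positions for this offset.
            pairs = [(i, i + offset) for i in range(len(target))
                     if 0 <= i + offset < len(template)]
            score = score_model(target, template, pairs)  # lower = better
            if best is None or score < best[0]:
                best = (score, offset, pairs)
        return best

With ten targets and every offset evaluated per target, the reported result is that the ANOLEA-based predictor ranks the structurally correct alignment first in nine of the ten cases.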

Relevance:

30.00%

Publisher:

Abstract:

Objective: The processes of change implied in weight management remain unclear. The present study aimed to identify these processes by validating a questionnaire designed to assess processes of change (the P-Weight) in line with the transtheoretical model. The relationship of processes of change with stages of change and other external variables is also examined. Methods: Participants were 723 people from community and clinical settings in Barcelona. Their mean age was 32.07 (SD = 14.55) years; most of them were women (75.0%), and their mean BMI was 26.47 (SD = 8.52) kg/m2. They all completed the P-Weight and the stages of change questionnaire (S-Weight), both applied to weight management, as well as two subscales about concern with dieting from the Eating Disorders Inventory-2 and the Eating Attitudes Test-40. Results: A 34-item version of the P-Weight was obtained by means of a refinement process. A principal components analysis applied to half of the sample identified four processes of change. A confirmatory factor analysis was then carried out with the other half of the sample, revealing that the model of four freely correlated first-order factors showed the best fit (GFI = 0.988, AGFI = 0.986, NFI = 0.986, and SRMR = 0.0559). Corrected item-total correlations (0.322-0.865) and Cronbach's alpha coefficients (0.781-0.960) were adequate. The relationship between the P-Weight and the S-Weight, and with the concern-with-dieting measures from the other questionnaires, supported the validity of the scale. Conclusion: The study identified processes of change involved in weight management and reports the adequate psychometric properties of the P-Weight. It also reveals the relationship between processes and stages of change and other external variables.
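
For reference, Cronbach's alpha, the internal-consistency measure reported for the subscales, is computed as follows (illustrative code with simulated responses, not the P-Weight data):

    import numpy as np

    def cronbach_alpha(items):
        """items: (n_respondents, n_items) array of item scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars / total_var)

    rng = np.random.default_rng(1)
    base = rng.normal(size=(200, 1))                       # shared factor
    scores = base + rng.normal(scale=0.8, size=(200, 8))   # 8 correlated items
    print(round(cronbach_alpha(scores), 3))                # high alpha expected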

Relevance:

30.00%

Publisher:

Abstract:

There are many ways to generate geometrical models for numerical simulation, and most of them start with a segmentation step to extract the boundaries of the regions of interest. This paper presents an algorithm to generate a patient-specific three-dimensional geometric model, based on a tetrahedral mesh, without an initial extraction of contours from the volumetric data. Using the information directly available in the data, such as gray levels, we build a metric to drive a mesh adaptation process. The metric is used to specify the size and orientation of the tetrahedral elements everywhere in the mesh. Our method, which produces anisotropic meshes, gives good results with synthetic and real MRI data. The quality of the resulting models has been evaluated qualitatively and quantitatively by comparing them with an analytical solution and with a segmentation made by an expert. Results show that our method gives, in 90% of the cases, meshes as good as or better than those of a similar isotropic method, based on the accuracy of the volume reconstruction for a given mesh size. Moreover, a comparison of the Hausdorff distances between the adapted meshes of both methods and ground-truth volumes shows that our method decreases reconstruction errors faster.
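
A minimal way to picture the gray-level-driven adaptation (an isotropic toy version; the paper's metric additionally encodes element orientation, i.e. anisotropy):

    # Derive a target element size from gray-level variation, so the mesh
    # is refined where the image changes rapidly (e.g. tissue boundaries).
    import numpy as np

    def size_map(image, h_min=1.0, h_max=8.0):
        gy, gx = np.gradient(image.astype(float))
        g = np.hypot(gx, gy)
        g = g / (g.max() + 1e-12)             # normalised gradient magnitude
        return h_max - (h_max - h_min) * g    # strong gradient -> small size

    img = np.zeros((64, 64))
    img[20:40, 20:40] = 100.0                 # synthetic bright region
    print(size_map(img).min(), size_map(img).max())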

Relevance:

30.00%

Publisher:

Abstract:

A new algorithm is described for refining the pose of a rigid-object model so that it conforms more accurately to the image structure. Elemental 3D forces are considered to act on the model. These are derived from directional derivatives of the image local to the projected model features. The convergence properties of the algorithm are investigated and compared to a previous technique. Its use on a video sequence of a cluttered outdoor traffic scene is also illustrated and assessed.
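
The force construction can be sketched as follows (invented details; the actual algorithm maps such image measurements to 3D forces acting on the model):

    # Each projected model point feels a 2D "force" along the image
    # gradient at its location, pulling the pose towards image structure.
    # Assumes all points lie inside the image bounds.
    import numpy as np

    def image_forces(image, points):
        """points: (n, 2) array of projected model points (row, col)."""
        gy, gx = np.gradient(image.astype(float))
        forces = []
        for r, c in np.round(points).astype(int):
            forces.append((gy[r, c], gx[r, c]))  # directional derivatives
        return np.array(forces)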

Relevance:

30.00%

Publisher:

Abstract:

Different optimization methods can be employed to optimize a numerical estimate of the match between an instantiated object model and an image. In order to take advantage of gradient-based optimization methods, perspective inversion must be used in this context. We show that convergence can be very fast by extrapolating to the maximum goodness-of-fit with Newton's method. This approach is related to methods which either maximize a similar goodness-of-fit measure without the use of gradient information, or else minimize distances between projected model lines and image features. Newton's method combines the accuracy of the former approach with the speed of convergence of the latter.
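
The Newton extrapolation at the heart of the method is the standard second-order update (a generic sketch over pose parameters p, not the paper's particular goodness-of-fit measure):

    # p_{k+1} = p_k - H(p_k)^{-1} g(p_k): g = gradient, H = Hessian of G.
    import numpy as np

    def newton_step(p, grad, hess):
        return p - np.linalg.solve(hess(p), grad(p))

    # Example: maximising G(p) = -(p - 3)^2 lands exactly at p = 3 in one
    # step, because G is quadratic; this is why convergence can be so fast.
    grad = lambda p: np.array([-2.0 * (p[0] - 3.0)])
    hess = lambda p: np.array([[-2.0]])
    print(newton_step(np.array([0.0]), grad, hess))   # -> [3.]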

Relevance:

30.00%

Publisher:

Abstract:

Simulations of the global atmosphere for weather and climate forecasting require fast and accurate solutions, and so operational models use high-order finite differences on regular structured grids. This precludes the use of local refinement; techniques allowing local refinement are either expensive (e.g. high-order finite element techniques) or have reduced accuracy at changes in resolution (e.g. unstructured finite volumes with linear differencing). We present solutions of the shallow-water equations for westerly flow over a mid-latitude mountain from a finite-volume model written using OpenFOAM. A second/third-order accurate differencing scheme is applied on arbitrarily unstructured meshes made up of various shapes and refinement patterns. The results are as accurate as those of equivalent-resolution spectral methods. Using lower-order differencing reduces accuracy at refinement patterns, which allows errors from the refinement over the mountain to accumulate and reduces global accuracy over a 15-day simulation. We have therefore introduced a scheme which fits a 2D cubic polynomial approximately on a stencil around each cell. Using this scheme, refinement over the mountain improves the accuracy after a 15-day simulation. This is a more severe test of local mesh refinement for global simulations than has previously been presented, but a realistic one if these techniques are to be used operationally. These efficient, high-order schemes may make it possible for local mesh refinement to be used by weather and climate forecast models.
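
The fitted scheme can be illustrated by its central least-squares step (an illustrative stencil; the operational details of the OpenFOAM implementation are not given in the abstract):

    # Fit a 2D cubic polynomial to cell-centre values on a stencil by
    # least squares; evaluating the fit reconstructs e.g. a face value.
    import numpy as np

    def fit_cubic_2d(xy, values):
        """xy: (n, 2) stencil cell centres; values: (n,) cell values."""
        x, y = xy[:, 0], xy[:, 1]
        # The 10 monomials of a 2D cubic.
        A = np.column_stack([np.ones_like(x), x, y, x*x, x*y, y*y,
                             x**3, x*x*y, x*y*y, y**3])
        coeffs, *_ = np.linalg.lstsq(A, values, rcond=None)
        return coeffs

    xy = np.array([[i, j] for i in range(-2, 3) for j in range(-2, 3)], float)
    vals = xy[:, 0]**3 + 2 * xy[:, 1]     # a cubic field is recovered exactly
    print(fit_cubic_2d(xy, vals).round(6))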