365 results for atk-ohjelmat - LSP - Library software package
Abstract:
Concerns have been raised about the reproducibility of brachial artery reactivity (BAR), because subjective decisions regarding the location of interfaces may influence the measurement of very small changes in lumen diameter. We studied 120 consecutive patients with BAR to assess whether an automated technique could be applied, and whether experience influenced reproducibility between two observers, one experienced and one inexperienced. Digital cineloops were measured both automatically, using software that detects the leading edge of the endothelium and tracks it in sequential frames, and manually, by averaging a set of three point-to-point measurements. There was a high correlation between the automated and manual techniques for both observers, although less variability was present with expert readers. The overall limits of agreement for interobserver concordance were 0.13 +/-0.65 mm for the manual and 0.03 +/-0.74 mm for the automated measurement. For intraobserver concordance, the limits of agreement were -0.07 +/-0.38 mm for observer 1 and -0.16 +/-0.55 mm for observer 2. We concluded that BAR measurements were highly concordant between observers, although more concordant using the automated method, and that experience does affect concordance. Care must be taken to ensure that the same segments are measured between observers and serially.
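The limits of agreement quoted above are Bland-Altman-style statistics (mean inter-observer difference plus or minus a multiple of its standard deviation). A minimal sketch of how such limits could be computed, using made-up paired diameter readings rather than the study's data:

```python
import numpy as np

def limits_of_agreement(a, b):
    """Bland-Altman limits of agreement between two sets of paired
    measurements (e.g. two observers' lumen diameters in mm)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    diff = a - b
    bias = diff.mean()        # mean inter-observer difference
    sd = diff.std(ddof=1)     # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative (made-up) paired diameter readings in mm:
obs1 = [4.10, 4.25, 3.98, 4.30, 4.15]
obs2 = [4.05, 4.20, 4.02, 4.21, 4.10]
bias, lower, upper = limits_of_agreement(obs1, obs2)
```

The 1.96 multiplier gives 95% limits under a normality assumption; the paper's exact convention for reporting the interval is not stated in the abstract.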
Abstract:
Incremental parsing has long been recognized as a technique of great utility in the construction of language-based editors, and correspondingly, the area currently enjoys a mature theory. Unfortunately, many practical considerations have been largely overlooked in previously published algorithms. Many user requirements for an editing system necessarily impact on the design of its incremental parser, but most approaches focus only on one: response time. This paper details an incremental parser based on LR parsing techniques and designed for use in a modeless syntax recognition editor. The nature of this editor places significant demands on the structure and quality of the document representation it uses, and hence, on the parser. The strategy presented here is novel in that both the parser and the representation it constructs are tolerant of the inevitable and frequent syntax errors that arise during editing. This is achieved by a method that differs from conventional error repair techniques, and that is more appropriate for use in an interactive context. Furthermore, the parser aims to minimize disturbance to this representation, not only to ensure other system components can operate incrementally, but also to avoid unfortunate consequences for certain user-oriented services. The algorithm is augmented with a limited form of predictive tree-building, and a technique is presented for the determination of valid symbols for menu-based insertion. Copyright (C) 2001 John Wiley & Sons, Ltd.
Abstract:
This paper presents a method of formally specifying, refining and verifying concurrent systems which uses the object-oriented state-based specification language Object-Z together with the process algebra CSP. Object-Z provides a convenient way of modelling complex data structures needed to define the component processes of such systems, and CSP enables the concise specification of process interactions. The basis of the integration is a semantics of Object-Z classes identical to that of CSP processes. This allows classes specified in Object-Z to be used directly within the CSP part of the specification. In addition to specification, we also discuss refinement and verification in this model. The common semantic basis enables a unified method of refinement to be used, based upon CSP refinement. To enable state-based techniques to be used for the Object-Z components of a specification, we develop state-based refinement relations which are sound and complete with respect to CSP refinement. In addition, a verification method for static and dynamic properties is presented. The method allows us to verify properties of the CSP system specification in terms of its component Object-Z classes by using the laws of the CSP operators together with the logic for Object-Z.
Abstract:
In population pharmacokinetic studies, the precision of parameter estimates is dependent on the population design. Methods based on the Fisher information matrix have been developed and extended to population studies to evaluate and optimize designs. In this paper we propose simple programming tools to evaluate population pharmacokinetic designs. This involved the development of an expression for the Fisher information matrix for nonlinear mixed-effects models, including estimation of the variance of the residual error. We implemented this expression as a generic function for two software applications: S-PLUS and MATLAB. The evaluation of population designs based on two pharmacokinetic examples from the literature is shown to illustrate the efficiency and the simplicity of this theoretical approach. Although no design optimization method is provided, these functions can be used to select and compare population designs among a large set of candidate designs, avoiding extensive simulation.
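The paper's full mixed-effects expression is not given in the abstract, but the general recipe behind such tools — differentiate the model with respect to its parameters and build the information matrix from the sensitivity (Jacobian) matrix — can be sketched. A minimal fixed-effects illustration (not the paper's mixed-effects formula), assuming homoscedastic additive residual error with variance sigma2 and a hypothetical one-compartment absorption model:

```python
import numpy as np

def fisher_information(model, theta, times, sigma2, eps=1e-6):
    """FIM for y(t) = f(t, theta) + e, e ~ N(0, sigma2).
    Sensitivities df/dtheta via central finite differences;
    FIM = J^T J / sigma2 for additive homoscedastic error."""
    theta = np.asarray(theta, dtype=float)
    J = np.empty((len(times), len(theta)))
    for j in range(len(theta)):
        up, dn = theta.copy(), theta.copy()
        up[j] += eps
        dn[j] -= eps
        J[:, j] = (model(times, up) - model(times, dn)) / (2 * eps)
    return J.T @ J / sigma2

# Illustrative one-compartment model with first-order absorption
def conc(t, theta):
    ka, ke = theta  # absorption / elimination rate constants (assumed)
    return (ka / (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

times = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # sampling design
fim = fisher_information(conc, [1.5, 0.2], times, sigma2=0.01)
# Expected parameter SEs for this design, from the inverse FIM diagonal:
se = np.sqrt(np.diag(np.linalg.inv(fim)))
```

Comparing `se` across candidate sampling-time sets is the design-evaluation idea the abstract describes, without running any simulations.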
Abstract:
This study evaluated the effectiveness of a teacher-implemented intervention package designed to replace prelinguistic behaviors with functional communication. Four young children with autism participated in a multiple-probe design across three communicative functions. Initially, three existing communication functions were selected for each child. Next, the existing prelinguistic behaviors that the children used to achieve these functions were identified. Replacement forms that were considered more recognizable and symbolic were defined to achieve these same functions. After a baseline phase, teachers received inservice training, consultation, and feedback on how to encourage, acknowledge, and respond to the replacement forms. During intervention, the replacement forms increased and prelinguistic behaviors decreased in most cases. The results suggested that the teacher-implemented intervention was effective in replacing prelinguistic behaviors with alternative forms of functional communication.
Abstract:
The present study estimated the population pharmacokinetics of lamotrigine in patients receiving oral lamotrigine therapy with drug concentration monitoring, and determined intersubject and intrasubject variability. A total of 129 patients were analyzed from two clinical sites. Of these, 124 patients provided sparse data (198 concentration-time points); nine patients (four from a previous group plus five from the current group) provided rich data (431 points). The population analysis was conducted using P-PHARM (TM) (SIMED Scientific Software, Cedex, France), a nonlinear mixed-effect modeling program. A single exponential elimination model (first-order absorption) with heteroscedastic weighting was used. Apparent clearance (CL/F) and volume of distribution (V/F) were the pharmacokinetic parameters estimated. Covariate analysis was performed to determine which factors explained any of the variability associated with lamotrigine clearance. Population estimates of CL/F and V/F for lamotrigine generated in the final model were 2.14 +/- 0.81 L/h and 78.1 +/- 5.1 L/kg. Intersubject and intrasubject variability for clearance was 38% and 38%, respectively. The covariates of concomitant valproate and phenytoin therapy accounted for 42% of the intersubject variability of clearance. Age, gender, clinic site, and other concomitant antiepileptic drugs did not influence clearance. This study of the population pharmacokinetics of lamotrigine in patients using the drug clinically provides useful data and should lead to better dosage individualization for lamotrigine.
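The structural model named above — first-order absorption with single-exponential elimination — can be sketched as a one-compartment concentration-time function. The absorption rate constant ka is not reported in the abstract, so the value below is an assumption, and V/F is treated as a plain volume in litres for illustration:

```python
import math

def lamotrigine_conc(t, dose, cl_f, v_f, ka):
    """One-compartment model: first-order absorption, single-exponential
    (first-order) elimination.  cl_f = CL/F (L/h), v_f = V/F (treated as
    L here for illustration), ka = absorption rate constant (assumed)."""
    ke = cl_f / v_f  # elimination rate constant (1/h)
    return (dose * ka / (v_f * (ka - ke))) * (
        math.exp(-ke * t) - math.exp(-ka * t)
    )

# Illustrative: 200 mg dose, population-mean CL/F = 2.14 L/h from the
# abstract, assumed V/F = 78.1 L and ka = 1.5 /h (hypothetical)
c6 = lamotrigine_conc(t=6.0, dose=200.0, cl_f=2.14, v_f=78.1, ka=1.5)
```

In a population analysis each patient gets individual CL/F and V/F values drawn around these typical values, which is where the 38% intersubject variability enters.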
Abstract:
The QU-GENE Computing Cluster (QCC) is a hardware and software solution to the automation and speedup of large QU-GENE (QUantitative GENEtics) simulation experiments that are designed to examine the properties of genetic models, particularly those that involve factorial combinations of treatment levels. QCC automates the management of the distribution of components of the simulation experiments among the networked single-processor computers to achieve the speedup.
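The core pattern — farming out the factorial combinations of treatment levels to parallel workers — can be sketched with the standard library. Threads stand in here for QCC's networked single-processor computers, and the simulation body is a placeholder, not QU-GENE itself:

```python
import itertools
from concurrent.futures import ThreadPoolExecutor

def run_experiment(treatment):
    """Placeholder for one QU-GENE simulation run (illustrative only):
    a real run would invoke the simulator with these treatment levels."""
    heritability, pop_size = treatment
    return heritability * pop_size  # stand-in for a simulation result

# Factorial combination of treatment levels, distributed to workers,
# mimicking QCC's distribution of runs across cluster nodes.
treatments = list(itertools.product([0.2, 0.5, 0.8], [100, 500]))
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_experiment, treatments))
```

Because the runs are independent, the speedup is close to linear in the number of workers until the per-run cost no longer dominates dispatch overhead.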
Abstract:
Injection of particulate hepatitis B virus surface antigen (HBsAg) in mice leads to the induction of a HBsAg-specific class-I-restricted cytotoxic T lymphocyte (CTL) response. It is proposed that any protein internal to HBsAg will also be able to elicit a specific CTL response. In this study, several carboxy-terminal truncations of hepatitis C virus (HCV) core protein were fused to varying lengths of amino-terminal truncated large hepatitis delta antigen (L-HDAg). These constructs were analysed for their ability to be expressed and the particles secreted in the presence of HBsAg after transfection into HuH-7 cells. The secretion efficiency of the various HCV core-HDAg chimeric proteins was generally poor. Constructs containing full length HDAg appeared to be more stable than truncated versions and the length of the inserted protein was restricted to around 40 amino acids. Thus, the use of L-HDAg as a chimera to package foreign proteins is limited. Consequently, a polyepitope (polytope) containing a B-cell epitope from human papillomavirus (HPV 16) and multiple T-cell epitopes from the HCV polyprotein was used to create the construct, L-HDAg-polyB. This chimeric protein was shown to be reliant on the co-expression of HBsAg for secretion into the cell culture fluid and was secreted more efficiently than the previous HCV core-HDAg constructs. These L-HDAg-polyB virus-like particles (VLPs) had a buoyant density of approximately 1.2 g/cm³ in caesium chloride and approximately 1.15 g/cm³ in sucrose. The VLPs were also immunoprecipitated using an anti-HBs but not an anti-HD antibody. Thus, these recombinant VLPs have similar biophysical properties to L-HDAg VLPs.
Abstract:
A model has been developed which enables the viscosities of coal ash slags to be predicted as a function of composition and temperature under reducing conditions. The model describes both completely liquid and heterogeneous, i.e. partly crystallised, slags in the Al2O3-CaO-'FeO'-SiO2 system in equilibrium with metallic iron. The Urbain formalism has been modified to describe the viscosities of the liquid slag phase over the complete range of compositions and a wide range of temperatures. The computer package F*A*C*T was used to predict the proportions of solids and the compositions of the remaining liquid phases. The Roscoe equation has been used to describe the effect of the presence of suspended solids (slurry effect) on the viscosity of partly crystallised slag systems. The model provides a good description of the experimental data of fully liquid, and liquid + solids mixtures, over the complete range of compositions and a wide range of temperatures. This model can now be used for viscosity predictions in industrial slag systems. Examples of the application of the new model to coal ash fluxing and blending are given in the paper. (C) 2001 Elsevier Science Ltd. All rights reserved.
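The slurry correction the abstract mentions can be sketched directly. The classical Einstein-Roscoe form raises the liquid viscosity by a power of the remaining liquid fraction; the exponent -2.5 below is the textbook Roscoe value, and the paper's fitted coefficients may differ:

```python
def slurry_viscosity(eta_liquid, phi):
    """Einstein-Roscoe form: effective viscosity of a liquid slag
    carrying a volume fraction phi of suspended (crystallised) solids.
    eta_eff = eta_liquid * (1 - phi)**-2.5  (classical exponent; the
    paper's exact formulation is not given in the abstract)."""
    if not 0.0 <= phi < 1.0:
        raise ValueError("solid volume fraction must be in [0, 1)")
    return eta_liquid * (1.0 - phi) ** -2.5

# 10% crystallised solids raises the viscosity by roughly 30%:
eta = slurry_viscosity(2.0, 0.10)  # Pa.s, illustrative liquid viscosity
```

In the full model, a thermodynamic package (F*A*C*T in the paper) supplies phi and the residual liquid composition at each temperature, and the modified Urbain formalism supplies eta_liquid.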
Abstract:
Pasminco Century Mine has developed a geophysical logging system to provide new data for ore mining/grade control and the generation of Short Term Models for mine planning. Previous work indicated the applicability of petrophysical logging for lithology prediction; however, the automation of the method was not considered reliable enough for the development of a mining model. A test survey was undertaken using two diamond drilled control holes and eight percussion holes. All holes were logged with natural gamma, magnetic susceptibility and density. Calibration of the LogTrans auto-interpretation software using only natural gamma and magnetic susceptibility indicated that both lithology and stratigraphy could be predicted. Development of a capability to enforce stratigraphic order within LogTrans increased the reliability and accuracy of interpretations. After the completion of a feasibility program, Century Mine has invested in a dedicated logging vehicle to log blast holes as well as for use in in-fill drilling programs. Future refinement of the system may lead to the development of GPS controlled excavators for mining ore.
Abstract:
We constructed a BAC library of the model legume Lotus japonicus with 6- to 7-fold genome coverage. We used vector PCLD04541, which allows direct plant transformation by BACs. The average insert size is 94 kb. Clones were stable in Escherichia coli and Agrobacterium tumefaciens.
Abstract:
A data warehouse is a data repository which collects and maintains a large amount of data from multiple distributed, autonomous and possibly heterogeneous data sources. Often the data is stored in the form of materialized views in order to provide fast access to the integrated data. One of the most important decisions in designing a data warehouse is the selection of views for materialization. The objective is to select an appropriate set of views that minimizes the total query response time with the constraint that the total maintenance time for these materialized views is within a given bound. This view selection problem is fundamentally different from the view selection problem under a disk space constraint. In this paper the view selection problem under the maintenance time constraint is investigated. Two efficient heuristic algorithms for the problem are proposed. The key to devising the proposed algorithms is to define good heuristic functions and to reduce the problem to some well-solved optimization problems. As a result, an approximate solution of the known optimization problem will give a feasible solution of the original problem. (C) 2001 Elsevier Science B.V. All rights reserved.
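The abstract does not spell out the two algorithms, but the problem shape — maximize query-time benefit subject to a maintenance-time budget — is knapsack-like, and a generic greedy heuristic over benefit-per-maintenance-cost ratios illustrates it. This is a sketch of the problem, not the paper's specific algorithms:

```python
def select_views(views, maintenance_budget):
    """Greedy heuristic for view selection under a maintenance-time
    constraint: repeatedly take the view with the best ratio of
    query-time saving to maintenance cost that still fits the budget.
    views: list of (name, query_time_saving, maintenance_cost)."""
    chosen, used = [], 0.0
    by_ratio = sorted(views, key=lambda v: v[1] / v[2], reverse=True)
    for name, saving, cost in by_ratio:
        if used + cost <= maintenance_budget:
            chosen.append(name)
            used += cost
    return chosen, used

# Hypothetical candidate views (name, saving, maintenance cost):
views = [("sales_by_region", 40.0, 10.0),
         ("sales_by_month", 25.0, 5.0),
         ("inventory_rollup", 15.0, 12.0)]
chosen, used = select_views(views, maintenance_budget=16.0)
```

A ratio-greedy pass like this ignores interactions between views (shared subexpressions, views maintainable from other views), which is precisely what makes the maintenance-time variant harder than the disk-space one and motivates better heuristic functions.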