Abstract:
Both structural and dynamical properties of 7Li at 470 and 843 K are studied by molecular dynamics simulation, and the results are compared with the available experimental data. Two effective interatomic potentials are used: a potential derived from the Ashcroft pseudopotential [Phys. Lett. 23, 48 (1966)] and a recently proposed potential deduced from the neutral pseudoatom method [J. Phys.: Condens. Matter 5, 4283 (1993)]. Although the shapes of the two potential functions are very different, the majority of the properties calculated from them are very similar. The differences between the results obtained with the two interaction models are carefully discussed.
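Structural comparisons of this kind typically rest on the pair correlation function g(r) computed from simulated configurations. The sketch below is not from the paper; function names, box size, and particle count are illustrative, and the uniform random "configuration" merely stands in for MD output:

```python
import numpy as np

def pair_correlation(positions, box_length, n_bins=100, r_max=None):
    """Radial distribution function g(r) for one periodic cubic box
    of identical atoms (illustrative sketch, not the paper's code)."""
    n = len(positions)
    if r_max is None:
        r_max = box_length / 2
    edges = np.linspace(0.0, r_max, n_bins + 1)
    counts = np.zeros(n_bins)
    for i in range(n - 1):
        d = positions[i + 1:] - positions[i]
        d -= box_length * np.round(d / box_length)   # minimum-image convention
        r = np.linalg.norm(d, axis=1)
        counts += np.histogram(r, bins=edges)[0]
    rho = n / box_length**3
    shell = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
    # each pair counted once, hence the factor 2/n for per-atom normalisation
    g = 2 * counts / (n * rho * shell)
    centers = 0.5 * (edges[1:] + edges[:-1])
    return centers, g

# Ideal-gas check: uniform random points should give g(r) close to 1
rng = np.random.default_rng(0)
r, g = pair_correlation(rng.uniform(0, 10, size=(200, 3)), box_length=10.0)
```

For a real liquid-metal trajectory, the peaks of g(r) would reveal the shell structure that the two potentials are compared on.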
Abstract:
We propose a general scenario for analyzing technological change in socio-economic environments. We illustrate the ideas with a model that incorporates the main trends yet is simple enough to yield analytical results while remaining sufficiently complex to display rich dynamic behavior. Our study shows that there exists a macroscopic observable that is maximized in a regime where the system is critical, in the sense that the distribution of events follows power laws. Computer simulations show that, in addition, the system always self-organizes to achieve the optimal performance in the stationary state.
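A power-law claim of this kind is usually checked by estimating the event-size exponent directly from the data. A minimal sketch, assuming a continuous power law p(x) ∝ x^(−α) above a cutoff x_min; the estimator choice and the synthetic data are illustrative, not the authors' model:

```python
import numpy as np

def powerlaw_mle(samples, x_min):
    """Continuous maximum-likelihood estimate of the exponent alpha
    for p(x) ~ x**(-alpha), x >= x_min."""
    x = np.asarray([s for s in samples if s >= x_min], dtype=float)
    return 1.0 + len(x) / np.sum(np.log(x / x_min))

# Synthetic check: inverse-transform sampling from alpha = 2.5, x_min = 1
rng = np.random.default_rng(1)
u = rng.uniform(size=50_000)
x = (1.0 - u) ** (-1.0 / 1.5)        # CDF inverse for alpha = 2.5
alpha_hat = powerlaw_mle(x, x_min=1.0)
```

In practice one would also test the fit (e.g. against a log-normal alternative) before calling the event distribution a power law.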
Abstract:
Self- and cross-velocity correlation functions and related transport coefficients of molten salts are studied by molecular-dynamics simulation. Six representative systems are considered: the alkali halides NaCl and KCl, the noble-metal halides CuCl and CuBr, and the divalent metal-ion halides SrCl2 and ZnCl2. Computer simulation results are compared with experimental self-diffusion coefficients and electrical conductivities. Special attention is paid to dynamic cross correlations and their dependence on the Coulomb interactions as well as on the size and mass differences between anions and cations.
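Transport coefficients such as the self-diffusion coefficient are commonly obtained from velocity correlation functions via a Green-Kubo integral, D = (1/3)∫⟨v(0)·v(t)⟩dt. A minimal sketch; the array shapes, time step, and the constant-velocity sanity check are illustrative, not from the paper:

```python
import numpy as np

def self_diffusion(velocities, dt):
    """Green-Kubo self-diffusion coefficient from a trajectory of
    shape (n_steps, n_atoms, 3): D = (1/3) * integral of the VACF."""
    n_steps = velocities.shape[0]
    n_lags = n_steps // 2
    vacf = np.empty(n_lags)
    for lag in range(n_lags):
        dots = np.sum(velocities[:n_steps - lag] * velocities[lag:], axis=2)
        vacf[lag] = dots.mean()        # average over atoms and time origins
    # trapezoidal integration, written out to avoid version-specific helpers
    integral = dt * (0.5 * vacf[0] + vacf[1:-1].sum() + 0.5 * vacf[-1])
    return integral / 3.0, vacf

# Sanity check: constant velocities give a flat VACF equal to <v.v>
v = np.tile(np.array([1.0, 0.0, 0.0]), (100, 10, 1))
D, vacf = self_diffusion(v, dt=0.1)
```

Cross (distinct-particle) correlations, which the abstract emphasizes, would be computed analogously but between velocities of different ions.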
Abstract:
In this work, we present the cultural evolution that has allowed humans to overcome many problems derived from the limitations of the human body. These limitations have been addressed by a "cyborgization" process that began in early anthropogenesis. Originally, it was envisioned to deal with diseases, accidents, or body malfunctions. Nowadays, augmentations improve common human capabilities; one of the most notable is the increase of brain efficiency through connections with a computer. A basic social question also addressed is which people will, and should, have access to these augmentations. Advanced humanoid robots (with human external appearance, artificial intelligence, and even emotions) already exist, and consequently a number of questions arise. For instance, will robots be considered living organisms? Could they be considered persons? Will we confer human status on robots? These questions are discussed. Our conclusions are that advanced humanoid robots display some actions that may be considered life-like, yet different from the life associated with living organisms; also, to some extent they could be considered person-like, but not human.
Abstract:
Underbody plows can be very useful tools in winter maintenance, especially when compacted snow or hard ice must be removed from the roadway. With significant down-force and an appropriate cutting-edge angle, compacted snow and ice can be removed very effectively by such plows, with much greater efficiency than any other tool under those circumstances. However, the successful operation of an underbody plow requires considerable skill. If too little down pressure is applied to the plow, it will not cut the ice or compacted snow. If too much force is applied, either the cutting edge may gouge the road surface, often causing significant damage to both the road surface and the plow, or the plow may ride up on the cutting edge so that it is no longer controllable by the operator; in such situations the truck can easily spin. Further, excessive down force results in rapid wear of the cutting edge. Given this need for a high level of operator skill, the operation of an underbody plow is a candidate for automation. To automate the operation of an underbody plow successfully, a control system must be developed that follows a set of rules representing appropriate operation of such a plow. These rules have been developed based upon earlier work in which operational underbody plows were instrumented to determine the loading upon them (both vertical and horizontal) and the angle at which the blade was operating. These rules have been successfully coded into two different computer programs, both using the MATLAB® software. In the first program, various load and angle inputs are analyzed to determine when, whether, and how they violate the rules of operation. This program is essentially deterministic in nature. In the second program, the Simulink® package in the MATLAB® software system was used to implement these rules using fuzzy logic. Fuzzy logic essentially replaces a fixed, constant rule with one that varies so as to improve operational control. The fuzzy logic in this simulation was developed simply by using appropriate routines in the computer software, rather than directly. The results of the computer testing and simulation indicate that a fully automated, computer-controlled underbody plow is indeed possible. The question of whether the next steps toward full automation should be taken (and by whom) has also been considered, and the possibility of some sort of joint venture between a Department of Transportation and a vendor has been suggested.
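A fuzzy rule of the kind described replaces a sharp threshold with overlapping membership functions whose outputs are blended. The toy controller below maps a measured vertical load to a down-force correction; the thresholds, units, and rule base are hypothetical (not taken from the report), and it uses plain Python rather than MATLAB®/Simulink®:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on the interval [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def downforce_adjust(vertical_load):
    """Toy fuzzy rule base: LOW load -> increase down-force (+1),
    OK -> hold (0), HIGH -> decrease (-1). Defuzzified as a
    membership-weighted average of the rule outputs."""
    mu_low  = tri(vertical_load, -1.0, 0.0, 5.0)    # kN, hypothetical scale
    mu_ok   = tri(vertical_load,  3.0, 6.0, 9.0)
    mu_high = tri(vertical_load,  7.0, 12.0, 13.0)
    actions = {+1.0: mu_low, 0.0: mu_ok, -1.0: mu_high}
    total = sum(actions.values())
    if total == 0.0:
        return 0.0          # no rule fires: hold the current force
    return sum(a * mu for a, mu in actions.items()) / total
```

Because adjacent membership functions overlap, the correction varies smoothly with load instead of jumping at a fixed cutoff, which is the operational-control benefit the abstract describes.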
Abstract:
Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measurement of blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance; over the last decades, computer programs have been developed to assist clinicians in this task. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities.¦Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them.¦Results: 12 software tools were identified, tested, and ranked, representing a comprehensive review of the available software's characteristics. The number of drugs handled varies widely, and 8 programs offer the user the ability to add their own drug models. 10 computer programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 are also able to suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender, and weight. Among those applying Bayesian analysis, one uses the non-parametric approach. The top 2 software tools emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g., in terms of storage or report generation) or less user-friendly.¦Conclusion: Whereas 2 integrated programs are at the top of the ranked list, such complex tools would possibly not fit all institutions, and each software tool must be evaluated with respect to the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capacity of data storage, and report generation.
Abstract:
Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities.¦Methods: The literature and Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them.¦Results: 12 software tools were identified, tested, and ranked, representing a comprehensive review of the available software's characteristics. The number of drugs handled varies from 2 to more than 180, and integration of different population types is available in some programs. Furthermore, 8 programs offer the ability to add new drug models based on population PK data. 10 computer tools incorporate Bayesian computation to predict the dosage regimen (individual parameters are calculated based on population PK models). All of them are able to compute Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 are also able to suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top 2 programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly.¦Conclusions: Whereas 2 software packages are ranked at the top of the list, such complex tools would possibly not fit all institutions, and each program must be evaluated with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Although interest in TDM tools is growing and effort has been put into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capability of data storage, and automated report generation.
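The a posteriori (MAP Bayesian) adjustment described in these abstracts combines a population prior with the individual's measured concentrations. A deliberately simplified sketch, assuming a steady-state one-compartment model Css = rate/CL, a log-normal prior on clearance, and additive residual error; all names, values, and the grid-search approach are illustrative, not taken from any of the benchmarked programs:

```python
import math

def map_clearance(concentrations, infusion_rates, cl_pop, omega, sigma):
    """MAP Bayesian estimate of an individual clearance CL for a toy
    steady-state one-compartment model (Css = rate / CL), found by a
    grid search over log-CL. Prior: log-normal around the population
    value cl_pop with spread omega; residual error sd sigma."""
    best_cl, best_obj = cl_pop, float("inf")
    for i in range(2001):
        cl = cl_pop * math.exp(-2.0 + 4.0 * i / 2000.0)     # CL grid
        obj = (math.log(cl / cl_pop) / omega) ** 2          # prior penalty
        for conc, rate in zip(concentrations, infusion_rates):
            obj += ((conc - rate / cl) / sigma) ** 2        # data misfit
        if obj < best_obj:
            best_cl, best_obj = cl, obj
    return best_cl

# An observation exactly consistent with the population clearance
cl_hat = map_clearance([2.0], [10.0], cl_pop=5.0, omega=0.3, sigma=0.5)
```

The characteristic TDM behavior is shrinkage: with sparse or noisy data the individual estimate is pulled toward the population value rather than matching the raw measurement exactly.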
Abstract:
A new type of high-avidity binding molecule, termed "peptabody", was created by harnessing the effect of multivalent interaction. A short peptide ligand was fused via a semi-rigid hinge region with the coiled-coil assembly domain of the cartilage oligomeric matrix protein, resulting in a pentameric multivalent binding molecule. In the first peptabody (Pab-S) described here, a peptide (S) specific for the mouse B-cell lymphoma BCL1 surface Ig idiotype was selected from a phage display library. A fusion gene was constructed encoding peptide S, followed by the 24 aa hinge region from camel IgG and a modified 55 aa cartilage oligomeric matrix protein pentamerization domain. The Pab-S fusion protein was expressed in Escherichia coli in a soluble form at high levels and purified in a single step by metal-affinity chromatography. Pab-S specifically bound the BCL1 surface idiotype with an avidity of about 1 nM, which corresponds to a 2 × 10^5-fold increase compared with the affinity of the synthetic peptide S itself. Biochemical characterization showed that Pab-S is a stable homopentamer of about 85 kDa, with interchain disulfide bonds. Pab-S can be dissociated under denaturing and reducing conditions and reassociated as a pentamer with full binding activity. This intrinsic feature provides an easy way to combine Pab molecules with two different peptide specificities, thus producing heteropentamers with bispecific and/or chelating properties.
Abstract:
Accurate prediction of transcription factor binding sites is needed to unravel the function and regulation of genes discovered in genome sequencing projects. To evaluate current computer prediction tools, we have begun a systematic study of the sequence-specific DNA binding of a transcription factor belonging to the CTF/NFI family. Using a systematic collection of rationally designed oligonucleotides combined with an in vitro DNA binding assay, we found that the sequence specificity of this protein cannot be represented by a simple consensus sequence or weight matrix. In particular, CTF/NFI uses a flexible DNA binding mode that allows for variations of the binding site length. From the experimental data, we derived a novel prediction method using a generalised profile as a binding site predictor. Experimental evaluation of the generalised profile indicated that it accurately predicts the binding affinity of the transcription factor to natural or synthetic DNA sequences. Furthermore, the in vitro measured binding affinities of a subset of oligonucleotides were found to correlate with their transcriptional activities in transfected cells. The combined computational-experimental approach exemplified in this work thus resulted in an accurate prediction method for CTF/NFI binding sites potentially functioning as regulatory regions in vivo.
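The weight-matrix predictor that the abstract argues is insufficient for CTF/NFI can be sketched as follows. The matrix values are toy numbers, and, unlike the generalised profile the authors derive, this minimal version allows no gaps or variable-length sites:

```python
import math

def score_sequence(seq, pwm):
    """Log-likelihood-ratio score of `seq` against a position weight
    matrix (one dict of base -> probability per position; uniform
    0.25 background). Higher scores mean a closer match."""
    assert len(seq) == len(pwm)
    background = 0.25
    return sum(math.log2(col[base] / background)
               for base, col in zip(seq, pwm))

# Toy 3-position matrix strongly preferring the site "TGG"
pwm = [
    {"A": 0.1, "C": 0.1, "G": 0.1, "T": 0.7},
    {"A": 0.1, "C": 0.1, "G": 0.7, "T": 0.1},
    {"A": 0.1, "C": 0.1, "G": 0.7, "T": 0.1},
]
best = score_sequence("TGG", pwm)
worse = score_sequence("AAA", pwm)
```

A generalised profile extends this fixed-length scoring with position-specific gap penalties, which is what lets it capture the variable site lengths reported for CTF/NFI.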
Abstract:
This report documents an extensive field program carried out to identify the relationships between soil engineering properties, as measured by various in situ devices, and the results of machine compaction monitoring using prototype compaction monitoring technology developed by Caterpillar Inc. Primary research tasks for this study include the following: (1) experimental testing and statistical analyses to evaluate machine power in terms of the engineering properties of the compacted soil (e.g., density, strength, stiffness) and (2) recommendations for using the compaction monitoring technology in practice. The compaction monitoring technology includes sensors that monitor the power consumption used to move the compaction machine, an on-board computer and display screen, and a GPS system to map the spatial location of the machine. In situ soil density, strength, and stiffness data characterized the soil at various stages of compaction. For each test strip or test area, in situ soil properties were compared directly to machine power values to establish statistical relationships. Statistical models were developed to predict soil density, strength, and stiffness from the machine power values. Field data for multiple test strips were evaluated. The R² correlation coefficient was generally used to assess the quality of the regressions. Strong correlations were observed between averaged machine power and field measurement data. The relationships are based on the compaction model derived from laboratory data. Correlation coefficients (R²) were consistently higher for thicker lifts than for thin lifts, indicating that the depth influencing machine power response exceeds the representative lift thickness encountered under field conditions. The Caterpillar Inc. compaction monitoring technology also identified localized areas of an earthwork project with weak or poorly compacted soil. The soil properties at these locations were verified using in situ test devices.
This report also documents the steps required to implement the compaction monitoring technology evaluated.
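The statistical relationships described above reduce, in the simplest case, to a least-squares line between averaged machine power and an in situ soil property, judged by R². A pure-Python sketch; the data values are illustrative, not from the report:

```python
def linear_fit_r2(x, y):
    """Least-squares line y = a + b*x and the R^2 of the fit,
    e.g. machine power (x) against a measured soil property (y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                      # slope
    a = my - b * mx                    # intercept
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

# Toy data: property rises roughly linearly with machine power
a, b, r2 = linear_fit_r2([1.0, 2.0, 3.0, 4.0], [1.9, 4.1, 5.9, 8.1])
```

The report's lift-thickness finding corresponds to comparing such R² values across data sets grouped by lift thickness.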
Abstract:
The overall system is designed to permit automatic collection of delamination field data for bridge decks. In addition to measuring and recording the data in the field, the system provides for transferring the recorded data to a personal computer for processing and plotting. This permits rapid turnaround from data collection to a finished plot of the results in a fraction of the time previously required for manual analysis of the analog data captured on a strip chart recorder. In normal operation the Delamtect provides an analog voltage for each of two channels which is proportional to the extent of any delamination. These voltages are recorded on a strip chart for later visual analysis. An event marker voltage, produced by a momentary push button on the handle, is also provided by the Delamtect and recorded on a third channel of the analog recorder.