957 results for Computer Prediction Program


Relevance: 30.00%

Abstract:

Background and aim of the study: Genomic gains and losses play a crucial role in the development and progression of DLBCL and are closely related to gene expression profiles (GEP), including the germinal center B-cell-like (GCB) and activated B-cell-like (ABC) cell-of-origin (COO) molecular signatures. To identify new oncogenes or tumor suppressor genes (TSG) involved in DLBCL pathogenesis and to determine their prognostic value, an integrated analysis of high-resolution gene expression and copy number profiling was performed. Patients and methods: Two hundred and eight adult patients with de novo CD20+ DLBCL enrolled in the prospective multicentric randomized LNH-03 GELA trials (LNH03-1B, -2B, -3B, -39B, -5B, -6B, -7B), with available frozen tumour samples, centralized review and adequate DNA/RNA quality, were selected. 116 patients were treated with Rituximab (R)-CHOP/R-miniCHOP and 92 patients were treated with the high-dose (R)-ACVBP regimen dedicated to patients younger than 60 years (y) in the frontline setting. Tumour samples were analysed simultaneously by high-resolution comparative genomic hybridization (CGH, Agilent 144K) and gene expression arrays (Affymetrix U133+2). Minimal common regions (MCR), defined as segments that affect the same chromosomal region in different cases, were delineated. Gene expression and MCR data sets were merged using the Gene Expression and Dosage Integrator algorithm (GEDI, Lenz et al., PNAS 2008) to identify new potential driver genes. Results: CGH identified a total of 1363 recurrent MCRs (penetrance > 5%) in the DLBCL data set, ranging in size from 386 bp, affecting a single gene, to more than 24 Mb. Of these MCRs, 756 (55%) showed a significant association with gene expression: 396 (59%) gains, 354 (52%) single-copy deletions, and 6 (67%) homozygous deletions. By this integrated approach, in addition to previously reported genes (CDKN2A/2B, PTEN, DLEU2, TNFAIP3, B2M, CD58, TNFRSF14, FOXP1, REL...), several genes targeted by copy number abnormalities with a dosage effect and potential pathophysiological impact were identified, including genes with TSG activity involved in the cell cycle (HACE1, CDKN2C), immune response (CD68, CD177, CD70, TNFSF9, IRAK2), DNA integrity (XRCC2, BRCA1, NCOR1, NF1, FHIT) or oncogenic functions (CD79B, PTPRT, MALT1, AUTS2, MCL1, PTTG1...), with distinct distributions according to COO signature. The CDKN2A/2B tumor suppressor locus (9p21) was deleted homozygously in 27% of cases and hemizygously in 9% of cases. Biallelic loss was observed in 49% of ABC DLBCL and in 10% of GCB DLBCL. This deletion was strongly correlated with age and associated with a limited number of additional genetic abnormalities, including trisomies 3 and 18 and short gains/losses of regions on chromosomes 1, 2 and 19 (FDR < 0.01), allowing identification of genes that may act synergistically with CDKN2A/2B inactivation. With a median follow-up of 42.9 months, only CDKN2A/2B biallelic deletion correlated strongly (FDR p-value < 0.01) with poor outcome in the entire cohort (4y PFS = 44% [32-61] vs. 74% [66-82] for patients in germline configuration; 4y OS = 53% [39-72] vs. 83% [76-90]). In a Cox proportional hazards model of PFS, CDKN2A/2B deletion remained predictive (HR = 1.9 [1.1-3.2], p = 0.02) when combined with IPI (HR = 2.4 [1.4-4.1], p = 0.001) and GCB status (HR = 1.3 [0.8-2.3], p = 0.31). The difference remained predictive in the subgroups of patients treated with R-CHOP (4y PFS = 43% [29-63] vs. 66% [55-78], p = 0.02) or R-ACVBP (4y PFS = 49% [28-84] vs. 83% [74-92], p = 0.003), and in the GCB (4y PFS = 50% [27-93] vs. 81% [73-90], p = 0.02) and ABC/unclassified (5y PFS = 42% [28-61] vs. 67% [55-82], p = 0.009) molecular subtypes (Figure 1). Conclusion: We report for the first time an integrated genetic analysis of a large cohort of DLBCL patients included in a prospective multicentric clinical trial program, enabling identification of new potential driver genes with pathogenic impact. However, CDKN2A/2B deletion constitutes the strongest, and the only, prognostic factor of chemoresistance to R-CHOP, regardless of the COO signature, and it is not overcome by more intensified immunochemotherapy. Patients displaying this frequent genomic abnormality warrant new and dedicated therapeutic approaches.
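As an illustration of the kind of multivariate survival analysis reported above, the sketch below fits a Cox proportional hazards model combining a biallelic-deletion flag with IPI and COO status. It is a minimal sketch on simulated data: the column names, covariate frequencies and simulation scheme are hypothetical stand-ins, not the study data; only the cohort size and the assumed hazard ratios are taken from the abstract.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 208                                    # cohort size from the abstract
cdkn2ab_del = rng.binomial(1, 0.27, n)     # biallelic 9p21 deletion (hypothetical rate)
ipi_high = rng.binomial(1, 0.5, n)         # dichotomized IPI (hypothetical)
gcb = rng.binomial(1, 0.5, n)              # GCB vs ABC/unclassified (hypothetical)

# Exponential PFS times whose hazard scales with the abstract's HRs.
hazard = 0.01 * 1.9**cdkn2ab_del * 2.4**ipi_high * 1.3**(1 - gcb)
time = rng.exponential(1 / hazard)
event = (time < 48).astype(int)            # administrative censoring at 4 y
time = np.minimum(time, 48)

df = pd.DataFrame({"pfs_months": time, "progressed": event,
                   "cdkn2ab_del": cdkn2ab_del, "ipi_high": ipi_high,
                   "gcb": gcb})
cph = CoxPHFitter()
cph.fit(df, duration_col="pfs_months", event_col="progressed")
cph.print_summary()                        # hazard ratios with 95% CIs and p-values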

Relevance: 30.00%

Abstract:

PURPOSE: To explore whether triaxial accelerometric measurements can be used to accurately assess speed and incline of running in free-living conditions. METHODS: Body accelerations during running were recorded at the lower back and at the heel by a portable data logger in 20 human subjects (10 men, 10 women). After parameterizing the body accelerations, two neural networks were designed to recognize each running pattern and calculate speed and incline. Each subject ran 18 times on outdoor roads at various speeds and inclines; 12 runs were used to calibrate the neural networks, whereas the other 6 runs were used to validate the model. RESULTS: A small difference between the estimated and actual values was observed: the root mean square error (RMSE) was 0.12 m x s(-1) for speed and 0.014 radian (rad), or 1.4% slope, for incline. Multiple regression analysis allowed accurate prediction of speed (RMSE = 0.14 m x s(-1)) but not of incline (RMSE = 0.026 rad, or 2.6% slope). CONCLUSION: Triaxial accelerometric measurements allow an accurate estimation of running speed and, with more uncertainty, of the incline of the terrain. This will permit validation of the energetic results generated on the treadmill as applied to more physiological, unconstrained running conditions.
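A minimal sketch of the modeling step, on synthetic stand-in data: the study parameterized body accelerations and trained two task-specific neural networks, whereas this sketch uses a single multi-output network and a simple calibration/validation split just to show the shape of the pipeline; features, dimensions and noise levels are invented.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n_runs, n_features = 18 * 20, 8              # 18 runs x 20 subjects (synthetic)
X = rng.normal(size=(n_runs, n_features))    # stand-in accelerometry features
true_w = rng.normal(size=(n_features, 2))
y = X @ true_w + 0.05 * rng.normal(size=(n_runs, 2))  # columns: [speed, incline]

# The protocol used 12 of each subject's 18 runs for calibration and 6 for
# validation; a simple 2/3 vs 1/3 split approximates that here.
split = 2 * n_runs // 3
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
net.fit(X[:split], y[:split])
pred = net.predict(X[split:])
rmse = np.sqrt(np.mean((pred - y[split:]) ** 2, axis=0))
print(f"RMSE speed: {rmse[0]:.3f}, RMSE incline: {rmse[1]:.4f}")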

Relevance: 30.00%

Abstract:

Protein-ligand docking has made important progress during the last decade and has become a powerful tool for drug development, opening the way to virtual high-throughput screening and in silico structure-based ligand design. Despite this flattering picture, recent publications have shown that the docking problem is far from solved, and that further developments are still needed to achieve high prediction success rates and accuracy. Introducing an accurate description of the solvation effect upon binding is thought to be essential to achieving this goal. In particular, EADock uses the Generalized Born Molecular Volume 2 (GBMV2) solvent model, which has been shown to accurately reproduce the desolvation energies obtained by solving the Poisson equation. Here, the implementation of the Fast Analytical Continuum Treatment of Solvation (FACTS) as an implicit solvation model in small-molecule docking calculations has been assessed using the EADock docking program. Our results strongly support the use of FACTS for docking. The success rates of EADock/FACTS and EADock/GBMV2 are similar, i.e. around 75% for local docking and 65% for blind docking. However, these results come at a much lower computational cost: FACTS is 10 times faster than GBMV2 in calculating the total electrostatic energy, and speeds up EADock by a factor of 4. This study also supports the EADock development strategy of relying on the CHARMM package for energy calculations, which enables straightforward implementation and testing of the latest developments in the field of molecular modeling.
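For readers unfamiliar with how such success rates are tallied, the sketch below computes them from the RMSD of each complex's best-ranked pose, assuming the common criterion that a run succeeds when that pose lies within 2 Å of the crystallographic ligand; the criterion and all RMSD values here are illustrative assumptions, not the paper's data.

import numpy as np

def success_rate(best_pose_rmsd, threshold=2.0):
    """Fraction of complexes whose top-ranked pose is within threshold (in Å)."""
    rmsd = np.asarray(best_pose_rmsd)
    return (rmsd <= threshold).mean()

local_rmsd = [0.8, 1.5, 3.2, 1.1, 0.6, 2.4, 1.9, 0.9]   # hypothetical values
blind_rmsd = [1.2, 4.5, 1.8, 6.0, 1.0, 2.1, 0.7, 5.3]
print(f"local docking: {success_rate(local_rmsd):.0%}")
print(f"blind docking: {success_rate(blind_rmsd):.0%}")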

Relevance: 30.00%

Abstract:

We have devised a program that computes the power of the F-test, and hence allows determination of appropriate sample and subsample sizes, in the context of the one-way hierarchical analysis of variance with fixed effects. The power at a fixed alternative is an increasing function of both the sample size and the subsample size. The program makes it easy to obtain the power of the F-test for a range of sample and subsample sizes, and therefore to choose the appropriate sizes for a desired power. The program can be used for the 'ordinary' case of the one-way analysis of variance, as well as for the hierarchical analysis of variance with two stages of sampling. Examples of the practical use of the program are given.
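For the 'ordinary' one-way fixed-effects case, the computation such a program automates can be sketched as follows: the power is one minus the noncentral-F CDF evaluated at the critical F value, with noncentrality lambda = n * sum_i (mu_i - mu_bar)^2 / sigma^2. The group means and variance below are illustrative inputs; the two-stage hierarchical case differs in the denominator degrees of freedom and the form of the noncentrality.

import numpy as np
from scipy.stats import f, ncf

def anova_power(group_means, sigma2, n_per_group, alpha=0.05):
    """Power of the F-test for k fixed groups with n observations each."""
    k = len(group_means)
    mu = np.asarray(group_means, dtype=float)
    lam = n_per_group * np.sum((mu - mu.mean()) ** 2) / sigma2  # noncentrality
    dfn, dfd = k - 1, k * (n_per_group - 1)
    f_crit = f.ppf(1 - alpha, dfn, dfd)
    return 1 - ncf.cdf(f_crit, dfn, dfd, lam)

# Scan sample sizes until the desired power is reached.
for n in (5, 10, 15, 20):
    print(n, round(anova_power([10.0, 12.0, 14.0], sigma2=9.0, n_per_group=n), 3))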

Relevance: 30.00%

Abstract:

Substantial collective flow is observed in collisions between lead nuclei at the Large Hadron Collider (LHC), as evidenced by the azimuthal correlations in the transverse momentum distributions of the produced particles. Our calculations indicate that the global v1 flow, which at RHIC peaked at negative rapidities (termed the third flow component, or antiflow), will at the LHC turn toward forward rapidities (the same side and direction as the projectile residue). This can potentially provide a sensitive barometer for estimating the pressure and transport properties of the quark-gluon plasma. Our calculations also take into account initial-state center-of-mass rapidity fluctuations and demonstrate that these are crucial for v1 simulations. To better study the transverse momentum dependence of the flow, we suggest a new "symmetrized" v1S(pt) function, and we also propose a new method to disentangle the global v1 flow from the contribution generated by random fluctuations in the initial state. This will enhance the possibilities of studying the collective global v1 flow both in the STAR Beam Energy Scan program and at the LHC.
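For reference, the standard event-plane definition of directed flow, together with one plausible way to build an odd-in-rapidity "symmetrized" component, is sketched below in LaTeX. The first line is the standard definition; the abstract does not give the exact form of v1S(pt), so the second line is an illustrative reconstruction, not necessarily the authors' definition.

\[
  v_1(p_t, y) = \bigl\langle \cos(\varphi - \Psi_1) \bigr\rangle ,
\]
\[
  v_1^{S}(p_t) = \tfrac{1}{2}\Bigl[ \langle v_1(p_t, y) \rangle_{y>0}
               - \langle v_1(p_t, y) \rangle_{y<0} \Bigr].
\]

Averaging with a sign flip in rapidity cancels contributions that are even in y, such as those generated by random initial-state fluctuations, while preserving the global (odd) directed-flow component, which is the stated goal of the disentangling method.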

Relevance: 30.00%

Abstract:

Underbody plows can be very useful tools in winter maintenance, especially when compacted snow or hard ice must be removed from the roadway. By applying significant down-force and using an appropriate cutting-edge angle, such plows remove compacted snow and ice very effectively, with much greater efficiency than any other tool under those circumstances. However, the successful operation of an underbody plow requires considerable skill. If too little down pressure is applied, the plow will not cut the ice or compacted snow. If too much force is applied, either the cutting edge may gouge the road surface, often causing significant damage to both the road surface and the plow, or the plow may ride up on the cutting edge so that it is no longer controllable by the operator; in such situations the truck can easily spin. Further, excessive down-force results in rapid wear of the cutting edge. Given this need for a high level of operator skill, the operation of an underbody plow is a candidate for automation. To successfully automate the operation of an underbody plow, a control system must be developed that follows a set of rules representing appropriate operation of such a plow. These rules were developed from earlier work in which operational underbody plows were instrumented to determine the loading upon them (both vertical and horizontal) and the angle at which the blade was operating. The rules have been successfully coded into two different computer programs, both using MATLAB® software. In the first program, various load and angle inputs are analyzed to determine when, whether, and how they violate the rules of operation; this program is essentially deterministic in nature. In the second program, the Simulink® package was used to implement the rules using fuzzy logic, which essentially replaces a fixed, constant rule with one that varies so as to improve operational control. The fuzzy logic in this simulation was implemented using the appropriate routines in the computer software rather than being developed from scratch. The results of the computer testing and simulation indicate that a fully automated, computer-controlled underbody plow is indeed possible. The question of whether the next steps toward full automation should be taken (and by whom) has also been considered, and some form of joint venture between a Department of Transportation and a vendor has been suggested.
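A minimal sketch of the deterministic rule layer described above, with hypothetical thresholds, signal names and units (the actual rules were derived from the instrumented-plow load and blade-angle measurements and are not reproduced here):

def check_plow_state(down_force_kn, blade_angle_deg,
                     min_force_kn=8.0, max_force_kn=40.0,
                     min_angle_deg=5.0, max_angle_deg=25.0):
    """Return the list of rule violations for one sensor sample."""
    violations = []
    if down_force_kn < min_force_kn:
        violations.append("insufficient down-force: blade will not cut")
    if down_force_kn > max_force_kn:
        violations.append("excessive down-force: gouging / ride-up risk")
    if not (min_angle_deg <= blade_angle_deg <= max_angle_deg):
        violations.append("cutting-edge angle outside working range")
    return violations

print(check_plow_state(down_force_kn=45.0, blade_angle_deg=30.0))

The fuzzy-logic variant replaces these hard thresholds with graded membership functions, so the controller's response strengthens smoothly as a limit is approached rather than switching abruptly.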

Relevance: 30.00%

Abstract:

Evaluating other individuals with respect to personality characteristics plays a crucial role in human relations, and it is the focus of research in fields as diverse as psychology and interactive computer systems. In psychology, face perception has been recognized as a key component of this evaluation system, and multiple studies suggest that observers use facial information to infer personality characteristics. Interactive computer systems are trying to exploit these findings to make interaction more natural and to improve system performance. Here, we experimentally test whether automatic prediction of facial trait judgments (e.g. dominance) can be made using the full appearance information of the face, and whether a reduced representation of its structure is sufficient. We evaluate two separate approaches: a holistic representation model using the facial appearance information, and a structural model constructed from the relations among facial salient points. State-of-the-art machine learning methods are applied to (a) derive a facial trait judgment model from training data and (b) predict a facial trait value for any face. Furthermore, we address whether specific structural relations among facial points predict the perception of facial traits. Experimental results over a set of labeled data (9 different trait evaluations) and classification rules (4 rules) suggest that (a) prediction of the perception of facial traits is learnable by both holistic and structural approaches; (b) the most reliable predictions of facial trait judgments are obtained by certain types of holistic descriptions of face appearance; and (c) for some traits, such as attractiveness and extroversion, there are relationships between specific structural features and social perception.
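To make the holistic/structural contrast concrete, the sketch below evaluates both representations with the same classifier on synthetic stand-in data: a flattened appearance vector for the holistic model, and pairwise inter-landmark distances for the structural one. The dimensions, data and choice of an SVM are illustrative assumptions, not the study's exact setup.

import numpy as np
from itertools import combinations
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_faces, n_landmarks = 200, 20
landmarks = rng.normal(size=(n_faces, n_landmarks, 2))   # (x, y) salient points
appearance = rng.normal(size=(n_faces, 32 * 32))         # stand-in pixel vector
labels = rng.integers(0, 2, n_faces)                     # e.g. dominant vs not

# Structural descriptor: all pairwise inter-landmark distances.
pairs = list(combinations(range(n_landmarks), 2))
structural = np.array([[np.linalg.norm(face[i] - face[j]) for i, j in pairs]
                       for face in landmarks])

for name, X in [("holistic", appearance), ("structural", structural)]:
    acc = cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean()
    print(f"{name:10s} mean CV accuracy: {acc:.2f}")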

Relevance: 30.00%

Abstract:

Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measured blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents the gold standard in the TDM approach but requires computing assistance. The aim of this benchmarking study was to assess and compare computer tools designed to support TDM clinical activities. Methods: The literature and the Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through each of them. Results: 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software characteristics. The number of drugs handled varies from 2 to more than 180, and some programs integrate different population types. 8 programs offer the ability to add new drug models based on population PK data. 10 computer tools incorporate Bayesian computation to predict the dosage regimen (individual parameters are calculated from population PK models). All of them are able to compute a Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 can also suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly. Conclusions: Although two software packages rank at the top of the list, such complex tools may not fit all institutions, and each program must be weighed against the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including by non-experienced users. Although interest in TDM tools is growing and much effort has been put into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capability and automated report generation.
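A minimal sketch of the Bayesian a posteriori step such tools implement, assuming a one-compartment IV-bolus model, log-normal population priors on clearance and volume, and a single measured level; all numbers are invented for illustration. The a priori mode simply drops the likelihood term, and dosage adaptation follows by solving the fitted model for the dose that reaches the target concentration.

import numpy as np
from scipy.optimize import minimize

pop_cl, pop_v = 5.0, 50.0              # population means (L/h, L) - hypothetical
omega_cl, omega_v = 0.3, 0.2           # between-subject SD on the log scale
sigma = 0.5                            # residual SD (mg/L)
dose, t_obs, c_obs = 500.0, 6.0, 4.2   # IV bolus; one observed level (mg/L)

def neg_log_posterior(theta):
    log_cl, log_v = theta
    cl, v = np.exp(log_cl), np.exp(log_v)
    c_pred = dose / v * np.exp(-cl / v * t_obs)          # 1-compartment IV bolus
    loglik = -0.5 * ((c_obs - c_pred) / sigma) ** 2      # Gaussian residual
    logprior = (-0.5 * ((log_cl - np.log(pop_cl)) / omega_cl) ** 2
                - 0.5 * ((log_v - np.log(pop_v)) / omega_v) ** 2)
    return -(loglik + logprior)

fit = minimize(neg_log_posterior, x0=[np.log(pop_cl), np.log(pop_v)])
cl_hat, v_hat = np.exp(fit.x)
print(f"individual CL = {cl_hat:.2f} L/h, V = {v_hat:.1f} L")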

Relevance: 30.00%

Abstract:

The Office of Special Investigations at the Iowa Department of Transportation (DOT) collects FWD data on a regular basis to evaluate pavement structural conditions. The primary objective of this study was to develop a fully automated software system, with a user manual, for rapid processing of the FWD data. The software system automatically reads the raw data collected by the JILS-20 FWD machine that the Iowa DOT owns, then processes and analyzes the collected data with the rapid prediction algorithms developed during the phase I study. The system smoothly integrates the FWD data analysis algorithms with the computer program used to collect the pavement deflection data. It can be used to assess pavement condition, estimate remaining pavement life, and ultimately help the Iowa DOT pavement management team assess pavement rehabilitation strategies. This report describes the developed software in detail and can also be used as a user manual for conducting simulation studies and detailed analyses.

Relevance: 30.00%

Abstract:

Provides instructions for using the computer program developed under the research project "The Economics of Reducing the County Road System: Three Case Studies in Iowa". The program runs on an IBM personal computer with 300K of memory; a fixed disk with at least 3 megabytes of storage is required, and the computer must be equipped with DOS version 3.0. The programs are written in Fortran. The user's manual describes all data requirements, including network preparation, trip information, and costs for maintenance, reconstruction, etc. Program operation instructions are presented, along with sample solution output and a listing of the computer programs.

Relevance: 30.00%

Abstract:

BACKGROUND AND PURPOSE: To determine whether the infarct core or the penumbra is the more significant predictor of outcome in acute ischemic stroke, and whether the results are affected by the statistical method used. METHODS: Clinical and imaging data were collected in 165 patients with acute ischemic stroke. We reviewed the noncontrast head computed tomography (CT) to determine the Alberta Stroke Program Early CT Score and to assess for a hyperdense middle cerebral artery. We reviewed the CT angiogram for the site of occlusion and the collateral flow score. From perfusion-CT, we calculated the volumes of the infarct core and the ischemic penumbra. Recanalization status was assessed on early follow-up imaging. Clinical data included age, several time points, National Institutes of Health Stroke Scale at admission, treatment type, and modified Rankin score at 90 days. Two multivariate regression analyses were conducted to determine which variables best predicted outcome. The first analysis did not include recanalization status among the potential predictor variables; the second included recanalization status and its interaction with the perfusion-CT variables. RESULTS: Among the 165 study patients, 76 had a good outcome (modified Rankin score ≤2) and 89 had a poor outcome (modified Rankin score >2). In the first analysis, the most important predictors were age (P<0.001) and National Institutes of Health Stroke Scale at admission (P=0.001); the imaging variables were not important predictors of outcome (P>0.05). In the second analysis, when recanalization status and its interaction with the perfusion-CT variables were included, recanalization status and perfusion-CT penumbra volume became the significant predictors (P<0.001). CONCLUSIONS: Imaging prediction of tissue fate, and more specifically imaging of the ischemic penumbra, matters only if recanalization can also be predicted.
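The second analysis can be sketched as an outcome regression that includes recanalization and its interaction with penumbra volume. The variable names and simulated data below are hypothetical stand-ins; only the cohort size comes from the abstract, and a logistic model of the dichotomized outcome stands in for whichever multivariate regression the authors used.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 165
df = pd.DataFrame({
    "age": rng.normal(70, 12, n),
    "nihss": rng.integers(2, 25, n),
    "penumbra_ml": rng.uniform(0, 150, n),
    "recanalized": rng.binomial(1, 0.5, n),
})
# Simulated truth: penumbra volume helps only when recanalization occurs.
lin = (-0.04 * (df["age"] - 70) - 0.1 * df["nihss"]
       + 0.02 * df["recanalized"] * df["penumbra_ml"])
df["good_outcome"] = rng.binomial(1, 1 / (1 + np.exp(-lin)))

model = smf.logit("good_outcome ~ age + nihss + recanalized * penumbra_ml",
                  data=df).fit(disp=False)
print(model.summary())   # main effects plus the recanalized:penumbra_ml term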

Relevance: 30.00%

Abstract:

The Iowa DOT has correlated its roadmeters to the CHLOE Profilometer since 1968. The same test method for the Present Serviceability Index (PSI) deduction from the pavement condition (crack and patch) survey has also been used since 1968. Owing to these test procedures, PSI measurements on the Interstate and Primary Highway Systems have had good continuity through the years. A computer program called PSITREND has been developed to plot PSI versus year tested for every rural pavement section in the State of Iowa. PSITREND provides pavement performance trends which are very useful for predicting rehabilitation needs and for evaluating new designs or rehabilitation techniques. The PSITREND database should be maintained in future years to build on the nineteen years of historical PSI test data already collected.
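The PSITREND idea, plotting PSI against test year and extrapolating the trend toward a rehabilitation trigger, can be sketched as follows; the section data and the threshold of 2.5 are invented for illustration.

import numpy as np
import matplotlib.pyplot as plt

years = np.arange(1968, 1987)
psi = (4.2 - 0.06 * (years - 1968)
       + np.random.default_rng(4).normal(0, 0.1, years.size))  # one section

slope, intercept = np.polyfit(years, psi, 1)   # linear performance trend
rehab_threshold = 2.5                          # hypothetical trigger PSI
year_at_threshold = (rehab_threshold - intercept) / slope

plt.plot(years, psi, "o", label="measured PSI")
plt.plot(years, slope * years + intercept, label="linear trend")
plt.axhline(rehab_threshold, linestyle="--", label="rehab threshold")
plt.xlabel("year tested"); plt.ylabel("PSI")
plt.title(f"projected to reach PSI {rehab_threshold} in {year_at_threshold:.0f}")
plt.legend(); plt.show()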

Relevance: 30.00%

Abstract:

This report describes the work accomplished to date on research project HR-173, "A Computer Based Information System for County Equipment Cost Records", and presents the initial design for this system. The specific topics discussed are the findings from the analysis of information needs, the system specifications developed from these findings, and the proposed system design based upon those specifications. The initial system design includes tentative input designs for capturing input data, output designs showing the output formats and the items to be output for use in decision making, a file design showing the organization of the information to be kept on each piece of equipment in the computer data file, and a general system design explaining how the entire system will operate. The Steering Committee appointed by the Iowa Highway Research Board is asked to study this report, make appropriate suggestions, and approve the proposed design subject to any suggestions made. This approval will permit the designer to proceed promptly with the computer program implementation phase of the design.

Relevance: 30.00%

Abstract:

This appendix is divided into three sections. The first section contains abstracts of each of the eight computer programs in the system, instructions for keypunching the three input documents, and computer operating instructions pertaining to each program. The second section contains system flowcharts for the entire system as well as program flowcharts for each program. The last section contains PL/I program listings of each program.

Relevance: 30.00%

Abstract:

Background: Computer-assisted cognitive remediation (CACR) has been shown to be effective in improving cognitive deficits in adults with psychosis. However, few studies have explored the outcome of CACR in adolescents with psychosis or at high risk. Aims: To investigate the effectiveness of a CACR program in adolescents with psychosis or at high risk. Method: Intention-to-treat analyses included 32 adolescents who participated in a blinded 8-week randomized controlled trial of CACR treatment compared to computer games (CG). Cognitive abilities, symptoms and psychosocial functioning were assessed at baseline and post-treatment. Results: Improvement in visuospatial abilities was significantly greater in the CACR group than in the CG group. Other cognitive functions, psychotic symptoms and psychosocial functioning improved significantly, but at similar rates, in the two groups. Conclusion: CACR can be successfully administered in this population; it proved effective over and above CG for the most intensively trained cognitive ability.