976 results for project delay estimation
Abstract:
Comparison of donor-acceptor electronic couplings calculated within two-state and three-state models suggests that the two-state treatment can provide unreliable estimates of Vda because it neglects multistate effects. We show that in most cases accurate values of the electronic coupling in a π stack, where donor and acceptor are separated by a bridging unit, can be obtained as Ṽ_da = (E2 − E1)μ12/Rda + (2E3 − E1 − E2)·2μ13μ23/Rda², where E1, E2, and E3 are the adiabatic energies of the ground, charge-transfer, and bridge states, respectively, μij is the transition dipole moment between states i and j, and Rda is the distance between the planes of the donor and acceptor. In this expression, based on the generalized Mulliken-Hush approach, the first term corresponds to the coupling derived within a two-state model, whereas the second term is the superexchange correction accounting for the bridge effect. The formula is extended to bridges consisting of several subunits. The influence of the donor-acceptor energy mismatch on the excess charge distribution, adiabatic dipole and transition moments, and electronic couplings is examined. A diagnostic is developed to determine whether the two-state approach can be applied. Based on numerical results, we show that the superexchange correction considerably improves estimates of the donor-acceptor coupling derived within the two-state approach. In most cases where the two-state scheme fails, the formula gives reliable results that agree well (within 5%) with the data of the three-state generalized Mulliken-Hush model.
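The corrected coupling above is a closed-form expression, so it can be evaluated directly once the three adiabatic energies, the three transition dipole moments, and the donor-acceptor separation are known. The sketch below is a minimal illustration, not code from the paper; the function name and input values are hypothetical, and units are assumed to be eV for energies, e·Å for dipole moments, and Å for the separation (treating the elementary charge as 1), so the coupling comes out in eV.

```python
# Minimal sketch of the corrected GMH coupling quoted above.
# All input values are illustrative, not taken from the paper.

def gmh_coupling(E1, E2, E3, mu12, mu13, mu23, R_da):
    """Two-state GMH coupling plus the superexchange (bridge) correction."""
    two_state = (E2 - E1) * mu12 / R_da                             # first term
    superexchange = (2 * E3 - E1 - E2) * 2 * mu13 * mu23 / R_da**2  # bridge correction
    return two_state + superexchange

# Hypothetical numbers: energies in eV, dipoles in e*Angstrom, distance in Angstrom.
V = gmh_coupling(E1=0.00, E2=0.25, E3=1.10,
                 mu12=0.40, mu13=0.60, mu23=0.55, R_da=6.8)
print(f"corrected coupling ~ {V:.4f} eV")
```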
Abstract:
Report produced by the Iowa Department of Agriculture and Land Stewardship.
Abstract:
With the widespread availability of high-throughput sequencing technologies, sequencing projects have become pervasive in the molecular life sciences. The huge bulk of data generated daily must be analyzed further by biologists with skills in bioinformatics and by "embedded bioinformaticians," i.e., bioinformaticians integrated in wet lab research groups. Thus, students interested in molecular life sciences must be trained in the main steps of genomics: sequencing, assembly, annotation and analysis. To reach that goal, a practical course has been set up for master students at the University of Lausanne: the "Sequence a genome" class. At the beginning of the academic year, a few bacterial species whose genome is unknown are provided to the students, who sequence and assemble the genome(s) and perform manual annotation. Here, we report the progress of the first class from September 2010 to June 2011 and the results obtained by seven master students who specifically assembled and annotated the genome of Estrella lausannensis, an obligate intracellular bacterium related to Chlamydia. The draft genome of Estrella is composed of 29 scaffolds encompassing 2,819,825 bp that encode 2233 putative proteins. Estrella also possesses a 9136 bp plasmid that encodes 14 genes, among which we found an integrase and a toxin/antitoxin module. Like all other members of the Chlamydiales order, Estrella possesses a highly conserved type III secretion system, considered a key virulence factor. The annotation of the Estrella genome also allowed the characterization of the metabolic abilities of this strictly intracellular bacterium. Altogether, the students provided the scientific community with the Estrella genome sequence and a preliminary understanding of the biology of this recently discovered bacterial genus, while learning to use cutting-edge sequencing technologies and to perform bioinformatics analyses.
Abstract:
The clinical demand for a device to monitor Blood Pressure (BP) in ambulatory scenarios with minimal use of inflation cuffs is increasing. Based on the so-called Pulse Wave Velocity (PWV) principle, this paper introduces and evaluates a novel concept of BP monitor that can be fully integrated within a chest sensor. After a preliminary calibration, the sensor provides non-occlusive beat-by-beat estimations of Mean Arterial Pressure (MAP) by measuring the Pulse Transit Time (PTT) of arterial pressure pulses travelling from the ascending aorta towards the subcutaneous vasculature of the chest. In a cohort of 15 healthy male subjects, a total of 462 simultaneous readings consisting of reference MAP and chest PTT were acquired. Each subject was recorded on three different days: D, D+3 and D+14. Overall, the implemented protocol induced MAP values ranging from 80 ± 6 mmHg at baseline to 107 ± 9 mmHg during isometric handgrip maneuvers. Agreement between reference and chest-sensor MAP values was tested using the intraclass correlation coefficient (ICC = 0.78) and Bland-Altman analysis (mean error = 0.7 mmHg, standard deviation = 5.1 mmHg). The cumulative percentage of MAP values provided by the chest sensor falling within ±5 mmHg of the reference MAP readings was 70%; within ±10 mmHg, 91%; and within ±15 mmHg, 98%. These results indicate that the chest sensor complies with the British Hypertension Society (BHS) requirements for Grade A BP monitors when applied to MAP readings. Grade A performance was maintained even two weeks after the initial subject-dependent calibration. In conclusion, this paper introduces a sensor and a calibration strategy to perform MAP measurements at the chest. The encouraging performance of the presented technique paves the way towards an ambulatory-compliant, continuous and non-occlusive BP monitoring system.
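The agreement figures above (Bland-Altman mean error and standard deviation, plus the cumulative percentages of absolute errors within 5, 10 and 15 mmHg used for BHS grading) are straightforward to compute from paired readings. The following sketch assumes two arrays of reference and sensor MAP values in mmHg; the example data are made up for illustration and are not from the study.

```python
# Minimal sketch of the agreement statistics described above, on hypothetical data.
import numpy as np

def agreement_stats(map_ref, map_sensor):
    map_ref = np.asarray(map_ref, dtype=float)
    map_sensor = np.asarray(map_sensor, dtype=float)
    err = map_sensor - map_ref                     # per-beat estimation error (mmHg)
    return {
        "mean_error_mmHg": err.mean(),             # Bland-Altman bias
        "sd_error_mmHg": err.std(ddof=1),          # Bland-Altman spread
        # cumulative percentages of absolute errors within 5/10/15 mmHg
        "pct_within_5": 100 * np.mean(np.abs(err) <= 5),
        "pct_within_10": 100 * np.mean(np.abs(err) <= 10),
        "pct_within_15": 100 * np.mean(np.abs(err) <= 15),
    }

# Made-up readings, for illustration only:
print(agreement_stats([92, 101, 85, 110], [93, 99, 88, 108]))
```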
Abstract:
The aim of ORAMED work package 4 was the optimization of medical practices in nuclear medicine during the preparation of radiopharmaceuticals and their administration to the patient. During the project, a wide campaign of measurements was performed in the nuclear medicine departments of the collaborating hospitals. Such data were intrinsically characterized by a large variability that depended on the procedure, the techniques employed and the operator's habits. That variability could easily hide important parameters, for example, the effectiveness of the adopted shielding (for syringe and vial) or the effect of the distance from the source. This information is necessary for a meaningful optimization of radiation protection. To this end, a sensitivity analysis was carried out through Monte Carlo simulations employing voxel models representing the operator's hand during the considered practices. This analysis allowed us to understand to what extent the range of personal dose equivalent evaluated during the measurements can be considered intrinsically related to the procedures. Furthermore, the Monte Carlo simulations made it possible to study the appropriateness of the shielding usually employed in these practices.
Abstract:
Part I of this series of articles focused on the construction of graphical probabilistic inference procedures, at various levels of detail, for assessing the evidential value of gunshot residue (GSR) particle evidence. The proposed models - in the form of Bayesian networks - address the issues of background presence of GSR particles, analytical performance (i.e., the efficiency of evidence searching and analysis procedures) and contamination. The use and practical implementation of Bayesian networks for case pre-assessment is also discussed. This paper, Part II, concentrates on Bayesian parameter estimation. This topic complements Part I in that it offers means for producing estimates usable for the numerical specification of the proposed probabilistic graphical models. Bayesian estimation procedures are the primary focus because they allow the scientist to combine his or her prior knowledge about the problem of interest with newly acquired experimental data. The present paper also considers further topics such as the sensitivity of the likelihood ratio to uncertainty in parameters and the study of likelihood ratio values obtained for members of particular populations (e.g., individuals with or without exposure to GSR).
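As a concrete illustration of the kind of Bayesian parameter estimation discussed here, the sketch below updates a Beta prior on a proportion (for instance, the probability of finding background GSR particles on a person with no firearm exposure) with hypothetical count data. The prior, the counts and the variable names are purely illustrative and are not taken from the paper.

```python
# Minimal sketch of a conjugate Beta-Binomial update, with illustrative values.
from scipy import stats

prior = stats.beta(a=1, b=1)        # flat prior on the unknown proportion
k, n = 3, 120                       # hypothetical survey: 3 of 120 individuals positive

# Posterior after observing k "successes" in n trials (Beta-Binomial conjugacy).
posterior = stats.beta(a=1 + k, b=1 + (n - k))

print("posterior mean:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```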
Abstract:
OBJECTIVES: This study was designed to assess the prevalence of major cardiovascular risk factors in familial premature coronary artery disease (P-CAD), affecting two or more siblings within one sibship. BACKGROUND: Premature CAD has a genetic component. It remains to be established whether familial P-CAD is due to genes acting independently of major cardiovascular risk factors. METHODS: We recruited 213 P-CAD survivors from 103 sibships diagnosed at age ≤50 (men) or ≤55 (women) years. Hypertension, hypercholesterolemia, obesity, and smoking were documented at the time of the event in 163 patients (145 men and 18 women). Each patient was compared with two individuals of the same age and gender diagnosed with sporadic (nonfamilial) P-CAD, and three individuals randomly sampled from the general population. RESULTS: Compared with the general population, patients with sporadic P-CAD had a higher prevalence of hypertension (29% vs. 14%, p < 0.001), hypercholesterolemia (54% vs. 33%, p < 0.001), obesity (20% vs. 13%, p < 0.01), and smoking (76% vs. 39%, p < 0.001). These risk factors were equally or even more prevalent in patients with familial P-CAD (43% [p < 0.05 vs. sporadic P-CAD], 58% [p = 0.07], 21% and 72%, respectively). Overall, only 7 (4%) of 163 patients with familial P-CAD and 22 (7%) of 326 patients with sporadic P-CAD had none of these conditions, compared with 167 (34%) of 489 patients in the general population. CONCLUSIONS: Classic, remediable risk factors are highly prevalent in patients with familial P-CAD. Accordingly, a major contribution of genes acting in the absence of these risk factors is unlikely.
Abstract:
Biochemical systems are commonly modelled by systems of ordinary differential equations (ODEs). A particular class of such models called S-systems have recently gained popularity in biochemical system modelling. The parameters of an S-system are usually estimated from time-course profiles. However, finding these estimates is a difficult computational problem. Moreover, although several methods have been recently proposed to solve this problem for ideal profiles, relatively little progress has been reported for noisy profiles. We describe a special feature of a Newton-flow optimisation problem associated with S-system parameter estimation. This enables us to significantly reduce the search space, and also lends itself to parameter estimation for noisy data. We illustrate the applicability of our method by applying it to noisy time-course data synthetically produced from previously published 4- and 30-dimensional S-systems. In addition, we propose an extension of our method that allows the detection of network topologies for small S-systems. We introduce a new method for estimating S-system parameters from time-course profiles. We show that the performance of this method compares favorably with competing methods for ideal profiles, and that it also allows the determination of parameters for noisy profiles.
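To make the model class concrete: an S-system describes each variable by a power-law production term minus a power-law degradation term, dX_i/dt = α_i ∏_j X_j^g_ij − β_i ∏_j X_j^h_ij, and the quantities to be estimated are the rate constants α, β and the kinetic orders g, h. The sketch below simulates a small two-variable S-system with illustrative (not published) parameter values; simulated time-course profiles of this kind are the data such estimation methods work from.

```python
# Minimal sketch of a two-variable S-system and its time-course simulation.
# Parameter values are illustrative only, not the published 4- or 30-dimensional systems.
import numpy as np
from scipy.integrate import solve_ivp

alpha = np.array([2.0, 1.5])        # production rate constants
beta  = np.array([1.0, 1.2])        # degradation rate constants
g = np.array([[0.0, -0.5],          # kinetic orders of the production terms
              [0.8,  0.0]])
h = np.array([[0.6,  0.0],          # kinetic orders of the degradation terms
              [0.0,  0.7]])

def s_system(t, x):
    production  = alpha * np.prod(x ** g, axis=1)   # alpha_i * prod_j x_j**g_ij
    degradation = beta  * np.prod(x ** h, axis=1)   # beta_i  * prod_j x_j**h_ij
    return production - degradation

sol = solve_ivp(s_system, (0.0, 10.0), y0=[1.0, 0.5])
print(sol.y[:, -1])   # end point of the time-course profile
```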
Abstract:
Final report produced by the DOT on the development of a manual crack quantification and automatic crack measurement system.
Abstract:
Optimal Usage of De-Icing Chemicals when Scraping Ice, Final Report of Project HR 391
Abstract:
To date, state-of-the-art seismic material parameter estimates from multi-component sea-bed seismic data are based on the assumption that the sea-bed consists of a fully elastic half-space. In reality, however, the shallow sea-bed generally consists of soft, unconsolidated sediments that are characterized by strong to very strong seismic attenuation. To explore the potential implications, we apply a state-of-the-art elastic decomposition algorithm to synthetic data for a range of canonical sea-bed models consisting of a viscoelastic half-space of varying attenuation. We find that in the presence of strong seismic attenuation, as quantified by Q-values of 10 or less, significant errors arise in the conventional elastic estimation of seismic properties. Tests on synthetic data indicate that these errors can be largely avoided by accounting for the inherent attenuation of the seafloor when estimating the seismic parameters. This can be achieved by replacing the real-valued expressions for the elastic moduli in the governing equations of the parameter estimation with their complex-valued viscoelastic equivalents. The practical application of our parameter estimation procedure yields realistic estimates of the elastic seismic material properties of the shallow sea-bed, while the corresponding Q-estimates appear to be biased towards values that are too low, particularly for S-waves. Given that the estimation of inelastic material parameters is notoriously difficult, particularly in the immediate vicinity of the sea-bed, this is expected to be of interest and importance for civil and ocean engineering purposes.
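The substitution mentioned above has a simple computational form: for a given quality factor Q, a real elastic modulus is replaced by a complex-valued modulus whose imaginary part encodes the attenuation. The sketch below uses the common approximation M_visco = M(1 + i/Q); the density, S-wave velocity and Q values are illustrative, not those of the synthetic sea-bed models in the study.

```python
# Minimal sketch of swapping a real elastic modulus for its complex viscoelastic
# equivalent, under the approximation M_visco = M * (1 + i/Q). Values are illustrative.
import numpy as np

def viscoelastic_modulus(M_elastic, Q):
    """Complex modulus; a smaller Q means stronger attenuation."""
    return M_elastic * (1.0 + 1j / Q)

rho = 1800.0                        # sediment density, kg/m^3 (hypothetical)
mu_elastic = rho * 300.0**2         # shear modulus for an S-wave velocity of 300 m/s
mu_visco = viscoelastic_modulus(mu_elastic, Q=10)

# Complex (attenuative) S-wave velocity implied by the viscoelastic modulus.
vs_complex = np.sqrt(mu_visco / rho)
print(mu_visco, vs_complex)
```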
Abstract:
Background: We present the results of EGASP, a community experiment to assess the state-of-the-art in genome annotation within the ENCODE regions, which span 1% of the human genome sequence. The experiment had two major goals: the assessment of the accuracy of computational methods to predict protein coding genes; and the overall assessment of the completeness of the current human genome annotations as represented in the ENCODE regions. For the computational prediction assessment, eighteen groups contributed gene predictions. We evaluated these submissions against each other based on a ‘reference set’ of annotations generated as part of the GENCODE project. These annotations were not available to the prediction groups prior to the submission deadline, so that their predictions were blind and an external advisory committee could perform a fair assessment. Results: The best methods had at least one gene transcript correctly predicted for close to 70% of the annotated genes. Nevertheless, the multiple transcript accuracy, taking into account alternative splicing, reached only approximately 40% to 50% accuracy. At the coding nucleotide level, the best programs reached an accuracy of 90% in both sensitivity and specificity. Programs relying on mRNA and protein sequences were the most accurate in reproducing the manually curated annotations. Experimental validation shows that only a very small percentage (3.2%) of the selected 221 computationally predicted exons outside of the existing annotation could be verified. Conclusions: This is the first such experiment in human DNA, and we have followed the standards established in a similar experiment, GASP1, in Drosophila melanogaster. We believe the results presented here contribute to the value of ongoing large-scale annotation projects and should guide further experimental methods when being scaled up to the entire human genome sequence.
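Nucleotide-level sensitivity and specificity, as quoted above, can be computed from two boolean masks over the evaluated region: one marking annotated coding positions and one marking predicted coding positions. The sketch below uses tiny made-up masks and follows the convention commonly used in gene-prediction assessments, where specificity is TP/(TP+FP) (elsewhere called precision); it is an illustration of the metric, not the EGASP evaluation code.

```python
# Minimal sketch of nucleotide-level sensitivity and specificity on hypothetical masks.
import numpy as np

annotated = np.array([0, 1, 1, 1, 0, 0, 1, 0], dtype=bool)   # reference coding bases
predicted = np.array([0, 1, 1, 0, 0, 1, 1, 0], dtype=bool)   # one program's prediction

tp = np.sum(predicted & annotated)      # coding bases correctly predicted
fp = np.sum(predicted & ~annotated)     # bases wrongly predicted as coding
fn = np.sum(~predicted & annotated)     # coding bases that were missed

sensitivity = tp / (tp + fn)            # fraction of annotated coding bases recovered
specificity = tp / (tp + fp)            # gene-prediction convention: TP / (TP + FP)
print(f"Sn = {sensitivity:.2f}, Sp = {specificity:.2f}")
```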
Abstract:
A number of experimental methods have been reported for estimating the number of genes in a genome, or the closely related coding density of a genome, defined as the fraction of base pairs in codons. Recently, DNA sequence data representative of the genome as a whole have become available for several organisms, making the problem of estimating coding density amenable to sequence analytic methods. Estimates of coding density for a single genome vary widely, so that methods with characterized error bounds have become increasingly desirable. We present a method to estimate the protein coding density in a corpus of DNA sequence data, in which a ‘coding statistic’ is calculated for a large number of windows of the sequence under study, and the distribution of the statistic is decomposed into two normal distributions, assumed to be the distributions of the coding statistic in the coding and noncoding fractions of the sequence windows. The accuracy of the method is evaluated using known data and application is made to the yeast chromosome III sequence and to C.elegans cosmid sequences. It can also be applied to fragmentary data, for example a collection of short sequences determined in the course of STS mapping.
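The decomposition step described above amounts to fitting a two-component normal mixture to the window-wise coding statistic and reading the coding density off the mixture weight of the higher-scoring component. The sketch below, using scikit-learn's GaussianMixture on synthetic window statistics, illustrates the idea; the distributions and sample sizes are assumptions for the example, not values from the paper.

```python
# Minimal sketch: estimate coding density as the weight of the "coding" component
# in a two-component normal mixture fitted to synthetic window statistics.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical window statistics: 30% coding-like, 70% noncoding-like windows.
stat = np.concatenate([rng.normal(1.0, 0.3, 300),     # coding windows
                       rng.normal(0.0, 0.3, 700)])    # noncoding windows

gm = GaussianMixture(n_components=2, random_state=0).fit(stat.reshape(-1, 1))
coding_component = np.argmax(gm.means_.ravel())        # component with the larger mean
coding_density_estimate = gm.weights_[coding_component]
print(f"estimated coding density ~ {coding_density_estimate:.2f}")
```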
Abstract:
The research presented in this report provides the basis for the development of a new procedure to be used by the Iowa DOT and the cities and counties in the state to deal with detours. Even though the project initially focused on investigating new tools to determine condition and compensation, the focus shifted to traffic and the gas tax method to set the basis for the new procedure. It was concluded that the condition-based approach, even though accurate and consistent condition evaluations can be achieved, is not feasible or cost-effective because of current data collection practices (a two-year cycle) and the logistics of the procedure (before-and-after determination). The gas tax method provides a simple, easy-to-implement, and consistent approach to compensation for the use of detours. It removes the subjectivity from the current procedures and provides a more realistic (traffic-based) approach to determining compensation.