998 results for solution accuracy
Abstract:
The Briançonnais area is interpreted as a large-scale exotic terrane that separated from Europe during the opening of the Valais ocean. Its displacement history during the Alpine evolution allows older concepts of multiple oceans separating narrow strips of continental crust to be replaced.
Abstract:
The discrepancies between the designed and measured camber of precast pretensioned concrete beams (PPCBs) observed by the Iowa DOT have created challenges in the field during bridge construction, causing construction delays and additional costs. This study was undertaken to systematically identify the potential sources of discrepancies between the designed and measured camber from release to time of erection, and to improve the accuracy of camber estimations in order to minimize the associated problems in the field. To accomplish the project objectives, engineering properties, including creep and shrinkage, of three normal concrete and four high-performance concrete mix designs were characterized. In parallel, another task focused on identifying the instantaneous camber and the variables affecting it, and on evaluating their impact using more than 100 PPCBs. Using a combination of finite element analyses and the time-step method, the long-term camber was estimated for 66 PPCBs, with due consideration given to creep and shrinkage of concrete, changes in support location and prestress force, and thermal effects. Utilizing the outcomes of the project, suitable long-term camber multipliers were developed that account for the time-dependent behavior, including thermal effects. It is shown that by using the recommended practice for camber measurements together with the proposed multipliers, the accuracy of camber prediction is greatly improved. Consequently, it is expected that future bridge projects in Iowa can minimize construction challenges resulting from large discrepancies between the designed and actual camber of PPCBs during construction.
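As an illustration of how long-term camber multipliers of the kind developed in this study are typically applied, the minimal sketch below scales the instantaneous camber and self-weight deflection components; the multiplier values, function name, and thermal-adjustment term are hypothetical placeholders, not the report's calibrated coefficients.

```python
# Illustrative sketch only: long-term camber estimated by scaling the
# instantaneous components with time-dependent multipliers, in the spirit
# of the multiplier approach described in the abstract. All numbers below
# are placeholders, not the study's calibrated values.

def long_term_camber(camber_prestress_in, deflection_selfweight_in,
                     mult_prestress=1.8, mult_selfweight=1.85,
                     thermal_adjustment_in=0.0):
    """Estimate camber at erection from instantaneous components.

    camber_prestress_in      -- upward camber from prestress at release (in.)
    deflection_selfweight_in -- downward deflection from self-weight (in.)
    mult_*                   -- hypothetical long-term multipliers
    thermal_adjustment_in    -- optional correction for thermal effects (in.)
    """
    return (mult_prestress * camber_prestress_in
            - mult_selfweight * deflection_selfweight_in
            + thermal_adjustment_in)

# Example: 3.2 in. prestress camber, 1.4 in. self-weight deflection
print(f"{long_term_camber(3.2, 1.4):.2f} in.")
```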
Abstract:
The use of quantum dots (QDs) in the area of fingermark detection is currently receiving a lot of attention in the forensic literature. Most of the research efforts have been devoted to cadmium telluride (CdTe) quantum dots, often applied as powders to the surfaces of interest. Both the use of cadmium and the nano size of these particles raise important issues in terms of health and safety. This paper proposes to replace CdTe QDs with zinc sulphide QDs doped with copper (ZnS:Cu) to address these issues. Copper-doped zinc sulphide QDs were successfully synthesized, characterized in terms of size and optical properties, and optimized for the detection of impressions left in blood, where CdTe QDs had proved to be efficient. The effectiveness of detection was assessed in comparison with CdTe QDs and Acid Yellow 7 (AY7, an effective blood reagent), using two series of depletive blood fingermarks from four donors prepared on four non-porous substrates, i.e. glass, transparent polypropylene, black polyethylene and aluminium foil. The marks were cut in half and processed separately with both reagents, leading to two comparison series (ZnS:Cu vs. CdTe, and ZnS:Cu vs. AY7). ZnS:Cu proved to be better than AY7 and at least as efficient as CdTe on most substrates. Consequently, copper-doped ZnS QDs constitute a valid substitute for cadmium-based QDs to detect blood marks on non-porous substrates and offer a safer alternative for routine use.
Abstract:
An image analysis method is presented which allows for the reconstruction of the three-dimensional path of filamentous objects from two of their projections. Starting with stereo pairs, this method is used to trace the trajectory of DNA molecules embedded in vitreous ice and leads to a faithful representation of their three-dimensional shape in solution. This computer-aided reconstruction is superior to the subjective three-dimensional impression generated by observation of stereo pairs of micrographs because it enables one to look at the reconstructed molecules from any chosen direction and distance and allows quantitative analysis such as determination of distances, curvature, persistence length, and writhe of DNA molecules in solution.
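As an illustration of the geometry underlying such a stereo reconstruction (a minimal sketch, not the paper's actual algorithm), the code below back-projects points traced along a filament in two projections taken at symmetric tilts about a common axis; the tilt convention and function name are assumptions.

```python
import numpy as np

def reconstruct_3d(p_plus, p_minus, tilt_deg):
    """Recover 3D coordinates of corresponding points traced along a filament
    in two projections taken at tilt angles +theta and -theta about the y-axis.

    p_plus, p_minus -- (N, 2) arrays of (x, y) image coordinates of the same
                       N points along the filament in each projection
    tilt_deg        -- half tilt angle theta in degrees (assumed geometry)
    """
    theta = np.radians(tilt_deg)
    x1, y1 = p_plus[:, 0], p_plus[:, 1]
    x2, y2 = p_minus[:, 0], p_minus[:, 1]
    x = (x1 + x2) / (2.0 * np.cos(theta))   # in-plane coordinate
    z = (x1 - x2) / (2.0 * np.sin(theta))   # depth recovered from the parallax
    y = (y1 + y2) / 2.0                     # unchanged by tilt about the y-axis
    return np.column_stack([x, y, z])
```

With the 3D trace in hand, quantities such as contour length, curvature or writhe can then be computed from the reconstructed coordinates.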
Abstract:
The pharmaceutical industry has been facing several challenges in recent years, and the optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques contribute actively to this optimization, especially when complemented by computational approaches aimed at rationalizing the enormous amount of information they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both rely heavily on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to its affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking software aiming at this goal is presented, EADock. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and contrary to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed and led to the successful design of new peptidic ligands for the α5β1 integrin and for the human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.
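A minimal sketch of the pose-accuracy criterion cited in this benchmark, under which a binding mode counts as correct if its heavy-atom RMSD to the crystal structure is below 2 Å; the function names and the assumption of identical atom ordering (no symmetry correction) are illustrative and not part of EADock itself.

```python
import numpy as np

def ligand_rmsd(coords_pred, coords_xtal):
    """Heavy-atom RMSD (in Angstrom) between a predicted ligand pose and the
    crystallographic pose, assuming identical atom ordering in both arrays."""
    diff = coords_pred - coords_xtal          # (N, 3) coordinate differences
    return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

def is_correct_pose(coords_pred, coords_xtal, threshold=2.0):
    """Apply the usual docking success criterion: RMSD below 2 Angstrom."""
    return ligand_rmsd(coords_pred, coords_xtal) < threshold
```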
Abstract:
OBJECTIVE: To compare the predictive accuracy of the original and recalibrated Framingham risk functions against current morbidity from coronary heart disease (CHD) and mortality data from the Swiss population. METHODS: Data from the CoLaus study, a cross-sectional, population-based study conducted between 2003 and 2006 on 5,773 participants aged 35-74 without CHD, were used to recalibrate the Framingham risk function. The predicted numbers of events from each risk function were compared with those derived from local MONICA incidence rates and official mortality data from Switzerland. RESULTS: With the original risk function, 57.3%, 21.2%, 16.4% and 5.1% of men and 94.9%, 3.8%, 1.2% and 0.1% of women were at very low (<6%), low (6-10%), intermediate (10-20%) and high (>20%) risk, respectively. With the recalibrated risk function, the corresponding values were 84.7%, 10.3%, 4.3% and 0.6% in men and 99.5%, 0.4%, 0.0% and 0.1% in women, respectively. The number of CHD events over 10 years predicted by the original Framingham risk function was 2-3 fold higher than that predicted by mortality+case fatality or by MONICA incidence rates (men: 191 vs. 92 and 51 events, respectively). The recalibrated risk function provided more reasonable, albeit slightly overestimated, estimates (92 events, 5th-95th percentile: 26-223 events); sensitivity analyses showed that the magnitude of the overestimation was between 0.4 and 2.2 in men and between 0.7 and 3.3 in women. CONCLUSION: The recalibrated Framingham risk function provides a reasonable alternative to assess CHD risk in men, but not in women.
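For context, a minimal sketch of the generic Cox-style risk equation behind Framingham-type functions and of what recalibration changes: the regression coefficients are kept, while the mean risk-factor values and baseline survival are replaced with local (here, hypothetically, Swiss) values. All names and arguments below are illustrative, not the study's actual coefficients.

```python
import math

def framingham_style_risk(x, beta, x_mean, s0_10yr):
    """Generic Cox-style 10-year risk, as used in Framingham-type functions:
    risk = 1 - S0 ** exp(sum(beta_i * (x_i - xbar_i))).

    x        -- individual's risk-factor values (e.g. log-transformed)
    beta     -- regression coefficients (kept from the original function)
    x_mean   -- population mean risk-factor values
    s0_10yr  -- baseline 10-year survival probability

    Recalibration keeps beta but substitutes x_mean and s0_10yr with values
    estimated from the local population.
    """
    lin_pred = sum(b * (xi - xm) for b, xi, xm in zip(beta, x, x_mean))
    return 1.0 - s0_10yr ** math.exp(lin_pred)
```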
Abstract:
Validation is the main bottleneck preventing the adoption of many medical image processing algorithms in clinical practice. In the classical approach, a posteriori analysis is performed based on some objective metrics. In this work, a different approach based on Petri Nets (PN) is proposed. The basic idea consists in predicting the accuracy that will result from a given processing pipeline based on the characterization of the sources of inaccuracy of the system. Here we propose a proof of concept in the scenario of a diffusion imaging analysis pipeline. A PN is built after the detection of the possible sources of inaccuracy. By integrating the first qualitative insights based on the PN with quantitative measures, it is possible to optimize the PN itself and to predict the inaccuracy of the system in a different setting. Results show that the proposed model provides a good prediction performance and suggests the optimal processing approach.
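As a toy illustration of the Petri net formalism invoked here (not the paper's actual model of inaccuracy sources), the sketch below moves tokens through places that could stand for stages of a diffusion-imaging pipeline; all place and transition names are hypothetical.

```python
# Toy Petri net: places hold tokens, and a transition fires when all of its
# input places hold a token, moving tokens downstream. In the paper's spirit,
# places could represent pipeline states and transitions the processing steps
# whose characterized inaccuracies propagate to the final accuracy estimate.

places = {"raw_data": 1, "denoised": 0, "registered": 0, "tensor_fit": 0}

# Each transition: (input places, output places). Names are hypothetical.
transitions = [
    (["raw_data"], ["denoised"]),
    (["denoised"], ["registered"]),
    (["registered"], ["tensor_fit"]),
]

def fire_enabled(places, transitions):
    """Fire every enabled transition once, in list order (toy semantics)."""
    for inputs, outputs in transitions:
        if all(places[p] > 0 for p in inputs):
            for p in inputs:
                places[p] -= 1
            for p in outputs:
                places[p] += 1
    return places

for _ in range(3):
    fire_enabled(places, transitions)
print(places)  # the token ends up in 'tensor_fit'
```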
Abstract:
Introduction: Captopril, an angiotensin-converting enzyme inhibitor, is widely used in the treatment of arterial hypertension and heart failure in adults and children. Because of its instability in aqueous media (oxidation to captopril disulfide), it is currently marketed only as tablets. The frequent prescription of small and variable doses in paediatrics justified the development of an oral liquid form, for practical and economic reasons as well as for safety of administration. However, all published oral liquid formulations show short stability, hardly exceeding one month. Objective: To develop an oral liquid form of captopril with a stability of at least 6 months. Method: Based on literature data, eight different liquid formulations of captopril 1 mg/ml were prepared (a concentration allowing an adequate volume to be withdrawn for paediatric administration). A stability-indicating HPLC method was developed using forced degradation tests with heat (100°C), acidic and basic media, an oxidizing agent, and light. The formulation most stable over the first month was selected. A 2-year stability study was carried out on three batches (three samples per batch) in their final packaging at room temperature (RT), in the refrigerator, and at 40° ± 2°C. Microbiological testing was performed at the beginning and at the end of the study according to the Ph. Eur. method (3rd Ed.). Results: The selected formula is an aqueous solution of captopril 1 mg/ml with EDTA 1 mg/ml added as a stabilizer, packaged in 60 ml amber glass VERAL bottles. After degradation under the conditions defined above, the captopril peak is clearly separated on the chromatograms from those of the degradation products. After 2 years, the measured concentration is 104.6% (±0.32%) in the refrigerator, 103.6% (±0.86%) at RT and 96.5% (±0.02%) at 40°C. No microbial growth was observed over the duration of the study. Discussion and conclusion: The captopril 1 mg/ml solution developed is simple to prepare. Starting from the pure active ingredient and in the presence of EDTA as a complexing agent, any trace metals that may be present do not induce oxidation of captopril, so other stabilizers are not needed. The HPLC method developed is a stability-indicating method. The results of the study showed that this solution has a shelf life of 2 years in the refrigerator and at RT. However, since the preparation contains no antimicrobial agent, storage in the refrigerator (2-8°C) is recommended. The proposed formula offers a real advantage in paediatrics, in terms of both safety of administration and cost.
Abstract:
Both neural and behavioral responses to stimuli are influenced by the state of the brain immediately preceding their presentation, notably by pre-stimulus oscillatory activity. Using frequency analysis of the high-density electroencephalogram coupled with source estimations, the present study investigated the role of pre-stimulus oscillatory activity in auditory spatial temporal order judgments (TOJ). Oscillations within the beta range (i.e. 18-23 Hz) were significantly stronger before accurate than inaccurate TOJ trials. Distributed source estimations identified bilateral posterior sylvian regions as the principal contributors to pre-stimulus beta oscillations. Activity within the left posterior sylvian region was significantly stronger before accurate than inaccurate TOJ trials. We discuss our results in terms of a modulation of sensory gating mechanisms mediated by beta activity.
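A minimal sketch of how pre-stimulus power in the 18-23 Hz beta band could be computed per trial with a standard spectral estimate and then compared between accurate and inaccurate TOJ trials; the function name, window choices, and the averaging step are assumptions, not the study's actual analysis (which relied on high-density EEG and source estimations).

```python
import numpy as np
from scipy.signal import welch

def prestimulus_beta_power(epoch, fs, band=(18.0, 23.0)):
    """Mean power in a frequency band for one pre-stimulus epoch.

    epoch -- 1-D array of EEG samples from a single channel covering the
             pre-stimulus window (e.g. the interval before stimulus onset)
    fs    -- sampling rate in Hz
    band  -- (low, high) band limits in Hz; 18-23 Hz as in the abstract
    """
    freqs, psd = welch(epoch, fs=fs, nperseg=min(len(epoch), 256))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(psd[mask].mean())

# Hypothetical comparison across trials sorted by behavioral outcome:
# accurate_mean = np.mean([prestimulus_beta_power(e, fs) for e in accurate_epochs])
# inaccurate_mean = np.mean([prestimulus_beta_power(e, fs) for e in inaccurate_epochs])
```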