994 results for volume algorithm
Abstract:
OBJECTIVE: To assess the accuracy of a semiautomated 3D volume reconstruction method for organ volume measurement by postmortem MRI. METHODS: This prospective study was approved by the institutional review board and the infants' parents gave their consent. Postmortem MRI was performed in 16 infants (1 month to 1 year of age) at 1.5 T within 48 h of their sudden death. Virtual organ volumes were estimated using the Myrian software. Real volumes were recorded at autopsy by water displacement. The agreement between virtual and real volumes was quantified using Bland and Altman's method. RESULTS: There was good agreement between virtual and real volumes for the brain (mean difference: -0.03% (-13.6 to +7.1)), liver (+8.3% (-9.6 to +26.2)) and lungs (+5.5% (-26.6 to +37.6)). For the kidneys, spleen and thymus, the MRI/autopsy volume ratio was close to 1 (kidney: 0.87±0.1; spleen: 0.99±0.17; thymus: 0.94±0.25), but the agreement was poorer. For the heart, the MRI/real volume ratio was 1.29±0.76, possibly due to the presence of residual blood within the heart. The virtual volumes of the adrenal glands were significantly underestimated (p=0.04), possibly due to their very small size during the first year of life. Interobserver and intraobserver variation was 10% or less for all organs except the thymus (15.9% and 12.6%, respectively) and the adrenal glands (69% and 25.9%). CONCLUSIONS: Virtual volumetry may provide significant information concerning the macroscopic features of the main organs and help pathologists in sampling organs that are more likely to yield histological findings.
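The Bland-Altman agreement analysis used in this abstract can be sketched in a few lines of Python. This is an illustrative reconstruction only: the organ volumes below are invented, not the study's data, and the function computes the classic bias and 95% limits of agreement on raw differences (the paper reports percentage differences).

```python
# Minimal Bland-Altman sketch: bias and 95% limits of agreement between
# two measurement methods. Data are illustrative, not from the study.
import math

def bland_altman(virtual, real):
    """Return (bias, lower limit, upper limit) for paired measurements."""
    diffs = [v - r for v, r in zip(virtual, real)]
    n = len(diffs)
    bias = sum(diffs) / n                      # mean difference between methods
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical organ volumes (cm^3) for one organ across subjects.
mri = [102.0, 98.5, 110.2, 95.0, 101.3]
autopsy = [100.0, 99.0, 108.0, 97.5, 100.0]
bias, lo, hi = bland_altman(mri, autopsy)
print(f"bias={bias:.2f} cm^3, 95% limits of agreement=[{lo:.2f}, {hi:.2f}]")
```

A narrow interval around a bias near zero would correspond to the "good agreement" reported for brain, liver and lungs.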
Abstract:
In this paper, a hybrid simulation-based algorithm is proposed for the Stochastic Flow Shop Problem. The main idea of the methodology is to transform the stochastic problem into a deterministic problem and then apply simulation to the latter. In order to achieve this goal, we rely on Monte Carlo simulation and an adapted version of a deterministic heuristic. This approach aims to provide flexibility and simplicity, because it is not constrained by prior distributional assumptions and relies on well-tested heuristics.
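The transform-then-simulate idea above can be sketched as follows: collapse the stochastic processing times to their expectations, order the jobs with a simple deterministic rule, then estimate the sequence's expected makespan by Monte Carlo sampling. The job data, the normal-distribution assumption, and the descending-total-time ordering rule are illustrative choices, not the paper's exact heuristic.

```python
# Simheuristic sketch: deterministic ordering on expected times, then
# Monte Carlo evaluation of the chosen sequence. Toy data throughout.
import random

def makespan(sequence, times):
    """Permutation flow-shop makespan; times[j][m] = time of job j on machine m."""
    n_machines = len(times[0])
    finish = [0.0] * n_machines        # completion time of last scheduled job per machine
    for j in sequence:
        for m in range(n_machines):
            prev = finish[m - 1] if m > 0 else 0.0
            finish[m] = max(finish[m], prev) + times[j][m]
    return finish[-1]

def expected_makespan(sequence, dists, n_samples=200, seed=0):
    """Monte Carlo estimate under (assumed) normal processing times."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        sample = [[max(rng.gauss(mu, sigma), 0.0) for (mu, sigma) in job]
                  for job in dists]
        total += makespan(sequence, sample)
    return total / n_samples

# dists[j][m] = (mean, std) of job j's time on machine m (invented instance).
dists = [[(5, 1.0), (3, 0.5)], [(2, 0.3), (7, 1.0)], [(4, 0.8), (4, 0.8)]]
means = [[mu for (mu, _) in job] for job in dists]
# Deterministic step: order jobs by descending total expected time.
order = sorted(range(len(dists)), key=lambda j: -sum(means[j]))
print(order, makespan(order, means), round(expected_makespan(order, dists), 2))
```

Because the makespan is a maximum over random paths, its Monte Carlo expectation typically exceeds the makespan of the expected times, which is exactly why the simulation step adds value over the deterministic shortcut.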
Abstract:
3 Summary 3.1 English The pharmaceutical industry has been facing several challenges in recent years, and optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques participate actively in this optimization, especially when complemented by computational approaches aiming at rationalizing the enormous amount of information they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both rely heavily on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to its affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking program aiming at this goal is presented: EADock. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with sophisticated diversity management. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins.
The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and conversely to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase, and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, and led to the successful design of new peptidic ligands for the α5β1 integrin and for the human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.
3.2 French The recent difficulties of the pharmaceutical industry seem resolvable only through optimization of its drug development process. This increasingly involves so-called high-throughput techniques, which are particularly effective when coupled with computational tools able to manage the mass of data they produce. In silico approaches such as virtual screening or the rational design of new molecules are now in routine use. Both rest on the ability to predict the details of the molecular interaction between a drug-like molecule and a target protein of therapeutic interest. Benchmarks of the software packages addressing this prediction are flattering, but several problems remain. The recent literature tends to question their reliability, asserting an emerging need for more accurate approaches to the binding mode. This accuracy is essential for computing the binding free energy, which is directly related to the affinity of the candidate compound for the target protein and indirectly related to its biological activity. Accurate prediction is of particular importance for the discovery and optimization of new active molecules. This thesis presents a new program, EADock, built for such accuracy. This hybrid evolutionary algorithm uses two selection pressures, combined with sophisticated diversity management. EADock relies on CHARMM for energy calculations and the handling of atomic coordinates. It was validated on 37 crystallized protein-ligand complexes, including 11 different proteins. The search space was extended to a sphere of 15 Å radius around the center of mass of the crystallized ligand, and, unlike the usual benchmarks, the algorithm started from optimized solutions with RMSDs of up to 10 Å from the crystal structure. This validation demonstrated the efficiency of our search heuristic, as binding modes with an RMSD below 2 Å from the crystal structure were ranked first for 68% of the complexes.
When the five best solutions are taken into account, the success rate climbs to 78%, and to 92% when the whole last generation is considered. Most prediction errors are attributable to the presence of crystal contacts. Since then, EADock has been used to understand the molecular mechanisms involved in the regulation of the Na,K-ATPase and in the activation of the peroxisome proliferator-activated receptor α (PPARα). It has also described the interaction of commonly encountered pollutants with PPARγ, as well as the influence of the metabolization of Imatinib (an anticancer drug) on its binding to the Bcr-Abl kinase. An approach based on predicting the interactions of molecular fragments with a target protein is also proposed. It led to the discovery of new peptide ligands of PPARα and of the α5β1 integrin. In both cases, the activity of these new peptides is comparable to that of well-established ligands, such as Wy14,643 for the former and Cilengitide (an anticancer drug) for the latter.
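The validation criterion in this abstract (a pose is "correct" when its RMSD to the crystal ligand is below 2 Å) is easy to make concrete. The sketch below uses toy coordinates rather than real PDB data, and computes plain coordinate RMSD in the fixed binding-site frame, as is usual in docking benchmarks; no superposition step is included.

```python
# Illustrative RMSD check against the 2 Å docking success threshold.
# Coordinates are invented; a real workflow would parse ligand atoms
# from structure files.
import math

def rmsd(coords_a, coords_b):
    """Root mean square deviation (Å) between two equal-length coordinate lists."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

crystal = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (1.5, 1.5, 0.0)]
pose =    [(0.2, 0.1, 0.0), (1.6, 0.2, 0.1), (1.4, 1.7, 0.2)]
value = rmsd(crystal, pose)
print(round(value, 3), "correct" if value < 2.0 else "incorrect")
```

Ranking the docking clusters and counting how many complexes have such a sub-2 Å pose ranked first is what yields success rates like the 68% reported above.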
Abstract:
Iowa features an extensive surface transportation system, with more than 110,000 miles of roadway, most of which is under the jurisdiction of local agencies. Given that Iowa is a lower-population state, most of this mileage is located in rural areas that exhibit low traffic volumes of less than 400 vehicles per day. However, these low-volume rural roads also account for about half of all recorded traffic crashes in Iowa, including a high percentage of fatal and major injury crashes. This study was undertaken to examine these crashes, identify major contributing causes, and develop low-cost strategies for reducing their incidence. Iowa's extensive crash and roadway system databases were utilized to obtain the needed data. Using descriptive statistics, a test of proportions, and crash modeling, various classes of rural secondary roads were compared to similar state-controlled roads in Iowa in crash frequency, severity, density, and rate for numerous selected factors that could contribute to crashes. The results allowed conclusions to be drawn about common contributing factors for crashes on low-volume rural roads, both paved and unpaved. Because of their higher crash statistics, particular attention was paid to unpaved rural roads with traffic volumes greater than 100 vehicles per day. Recommendations for addressing these crashes with low-cost mitigation are also included. Because of the isolated nature of traffic crashes on low-volume roads, a systemic or mass action approach to safety mitigation was recommended for an identified subset of the entire system. In addition, future development of a reliable crash prediction model is described.
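The "test of proportions" this abstract mentions is typically a two-proportion z-test. The sketch below is a generic illustration with invented crash counts (severe crashes out of all crashes on two road classes); it is not the study's data or its exact statistical procedure.

```python
# Two-proportion z-test sketch: is the share of severe crashes higher on
# one road class than another? Counts below are hypothetical.
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for H0: p1 == p2, using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# e.g. 120 severe of 800 crashes (unpaved) vs. 90 severe of 900 (paved)
z = two_proportion_z(120, 800, 90, 900)
print(round(z, 2), "significant at 5%" if abs(z) > 1.96 else "not significant")
```

A |z| above about 1.96 corresponds to a two-sided p-value below 0.05, the usual threshold for flagging a road class for further attention.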
Abstract:
Part 6 of the Manual on Uniform Traffic Control Devices (MUTCD) describes several types of channelizing devices that can be used to warn road users and guide them through work zones; these devices include cones, tubular markers, vertical panels, drums, barricades, and temporary raised islands. On higher speed/volume roadways, drums and/or vertical panels have been popular choices in many states, due to their formidable appearance and the enhanced visibility they provide when compared to standard cones. However, due to their larger size, drums also require more effort and storage space to transport, deploy, and retrieve. Recent editions of the MUTCD have introduced new channelizing devices; of specific interest for this study is a taller (>36 inches) but thinner cone. While this new device does not offer a target value comparable to that of drums, it is significantly larger than standard cones and offers improved stability as well. In addition, these devices are more easily deployed and stored than drums, and they cost less. Further, for applications that previously used both drums and tall cones, using tall cones alone allows delivery and setup by a single vehicle. An investigation of the effectiveness of the new channelizing devices provides a reference for states to use in selecting appropriate traffic control for high-speed, high-volume applications, especially for short-term or limited-duration exposures. This study includes a synthesis of common practices by state DOTs, as well as daytime and nighttime field observations of driver reactions using video detection equipment. The results of this study are promising for the day and night performance of the new tall cones, which compare favorably to drums when used for channelizing in tapers. The evaluation showed no statistical difference in merge distance and location, shy distance, or operating speed in either daytime or nighttime conditions.
The study should provide a valuable resource for state DOTs to utilize in selecting the most effective channelizing device for use on high speed/high volume roadways where timely merging by drivers is critical to safety and mobility.
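The "no statistical difference" findings above rest on comparing field measurements between the two device types. A common way to do this is Welch's t-test; the sketch below uses invented merge-distance samples purely for illustration, and the study's actual measurements and test choice are not reproduced here.

```python
# Welch's t statistic sketch: mean merge distance under drums vs. tall
# cones. All data are invented for illustration.
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

drums = [310, 295, 330, 305, 320, 298]   # merge distances (ft), hypothetical
cones = [305, 300, 325, 310, 315, 302]
t = welch_t(drums, cones)
print(round(t, 3))
```

A |t| well below the critical value (about 2 for reasonable sample sizes) is consistent with the study's conclusion that driver behavior did not differ measurably between the devices.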
Abstract:
Résumé Earlier developments at the Institute of Geophysics in Lausanne produced seismic acquisition techniques and the interpretation of 2D and 3D seismic data to study the geology of the region, in particular the different sedimentary sequences of Lake Geneva. To allow quantitative interpretation of the seismic data by determining physical parameters of the sediments, the AVO (Amplitude Versus Offset) method was applied. Two lacustrine seismic surveys, 2D and 3D, were acquired to test the AVO method on the river deltas of the Grand Lac. The acquisition geometry was redesigned to record data at large offsets. The seismic streamers, deployed end to end, reached incidence angles of about 40°. GPS receivers specially developed for this purpose and placed along the streamer allowed, after post-processing of the data, the streamer position to be determined with an accuracy of ±0.5 m. Calibration of our hydrophones in an anechoic chamber provided their amplitude response as a function of frequency; a maximum variation of 10 dB was found between the streamer sensors and the reference signal. Amplitude-preserving seismic processing was applied to the lake data. A surface-consistent algorithm corrected the amplitude variations of the air-gun shots. The intercept and gradient sections obtained on the Aubonne and Dranse deltas were used to produce cross-plots. This representation classifies amplitude anomalies according to sediment type and potential gas content. One attribute that can be extracted from 3D data is the reflectivity amplitude of a seismic interface, which adds a quantitative component to the geological interpretation of an interface.
The water bottom on the Aubonne delta shows amplitude anomalies that characterize the channels. Inversion of the Zoeppritz equation with the Levenberg-Marquardt algorithm was programmed to extract the physical parameters of the sediments on this delta. A statistical study of the inversion results allows the variation of amplitude with offset to be simulated. We obtained a model whose first layer is water and whose second layer has Vp = 1461 m/s, ρ = 1182 kg/m³ and Vs = 383 m/s. Abstract A system to record very high resolution (VHR) seismic data on lakes in 2D and 3D was developed at the Institute of Geophysics, University of Lausanne. Several seismic surveys carried out on Lake Geneva helped us to better understand the geology of the area and to identify sedimentary sequences. However, more sophisticated analysis of the data such as the AVO (Amplitude Versus Offset) method provides means of deciphering the detailed structure of the complex Quaternary sedimentary fill of the Lake Geneva trough. To study the physical parameters we applied the AVO method at some selected places of sediments. These areas are the Aubonne and Dranse River deltas, where the configurations of the strata are relatively smooth and the discontinuities between them easy to pick. A specific layout was developed to acquire large incidence angles. 2D and 3D seismic data were acquired with streamers, deployed end to end, providing incidence angles up to 40°. One or more GPS antennas attached to the streamer enabled us to calculate individual hydrophone positions with an accuracy of 50 cm after post-processing of the navigation data. To ensure that our system provides correct amplitude information, our streamer sensors were calibrated in an anechoic chamber using a loudspeaker as a source. Amplitude variations between individual hydrophones were of the order of 10 dB.
An amplitude correction for each hydrophone was computed and applied before processing. Amplitude-preserving processing was then carried out. Intercept vs. gradient cross-plots enabled us to determine that both geological discontinuities (lacustrine sediments/moraine and moraine/molasse) have well-defined trends. A 3D volume collected on the Aubonne river delta was processed in order to obtain AVO attributes. Amplitude maps were produced for quantitative interpretation and revealed high reflectivity in channels. Inversion of the Zoeppritz equation at the water bottom using the Levenberg-Marquardt algorithm was carried out to estimate Vp, Vs and ρ of the sediments immediately under the lake bottom. Real-data inversion gave, under the water layer, a mud layer with Vp = 1461 m/s, ρ = 1182 kg/m³ and Vs = 383 m/s.
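The inversion step above can be illustrated with a greatly simplified stand-in: instead of the full Zoeppritz equations, the sketch fits the two-term Shuey approximation R(θ) ≈ A + B·sin²θ, recovering the intercept A and gradient B used in the cross-plots, with a small hand-rolled Levenberg-Marquardt loop (fixed damping, numeric Jacobian). The angles and reflectivities are synthetic; the thesis inverts for Vp, Vs and density directly.

```python
# Toy Levenberg-Marquardt fit of the Shuey AVO approximation
# R(theta) = A + B*sin(theta)^2 to synthetic amplitude-vs-angle picks.
import math

def shuey(params, theta):
    a, b = params
    return a + b * math.sin(theta) ** 2

def levenberg_marquardt(model, params, thetas, obs, iters=50, lam=1e-3):
    """Damped Gauss-Newton for a 2-parameter model; lam is a fixed damping."""
    p = list(params)
    n = len(obs)
    for _ in range(iters):
        r = [obs[i] - model(p, thetas[i]) for i in range(n)]   # residuals
        J = []                                                 # numeric Jacobian
        for i in range(n):
            row = []
            for k in range(len(p)):
                dp = list(p)
                dp[k] += 1e-6
                row.append((model(dp, thetas[i]) - model(p, thetas[i])) / 1e-6)
            J.append(row)
        # Solve (J^T J + lam*I) d = J^T r for the 2x2 case.
        a11 = sum(J[i][0] ** 2 for i in range(n)) + lam
        a22 = sum(J[i][1] ** 2 for i in range(n)) + lam
        a12 = sum(J[i][0] * J[i][1] for i in range(n))
        g1 = sum(J[i][0] * r[i] for i in range(n))
        g2 = sum(J[i][1] * r[i] for i in range(n))
        det = a11 * a22 - a12 * a12
        p = [p[0] + (a22 * g1 - a12 * g2) / det,
             p[1] + (a11 * g2 - a12 * g1) / det]
    return p

angles = [math.radians(d) for d in range(0, 41, 5)]   # 0-40 degrees, as acquired
truth = (0.12, -0.25)                                  # synthetic intercept, gradient
data = [shuey(truth, t) for t in angles]
est = levenberg_marquardt(shuey, [0.0, 0.0], angles, data)
print([round(x, 3) for x in est])
```

In a real workflow the recovered (A, B) pairs per location populate the intercept-gradient cross-plot; a production code would also adapt the damping term between iterations rather than keeping it fixed.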
Abstract:
The role of endothelin (ET) receptors was tested in volume-stimulated atrial natriuretic factor (ANF) secretion in conscious rats. Mean ANF responses to slow infusions (3 x 3.3 ml/8 min) were dose-dependently reduced (P < 0.05) by bosentan (nonselective ET-receptor antagonist) from 64.1 +/- 18.1 (SE) pg/ml (control) to 52.6 +/- 16.1 (0.033 mg bosentan/rat), 16.1 +/- 7.6 (0.33 mg/rat), and 11.6 +/- 6.5 pg/ml (3.3 mg/rat). The ET-A-receptor antagonist BQ-123 (1 mg/rat) had no effect relative to DMSO controls, whereas the putative ET-B antagonist IRL-1038 (0.1 mg/rat) abolished the response. In a second protocol, BQ-123 (>/=0.5 mg/rat) nonsignificantly reduced the peak ANF response (106.1 +/- 23.0 pg/ml) to 74.0 +/- 20.5 pg/ml for slow infusions (3.5 ml/8.5 min) but reduced the peak response (425.3 +/- 58.1 pg/ml) for fast infusions (6.6 ml/1 min) by 49.9% (P < 0.001) and for 340 pmoles ET-1 (328.8 +/- 69.5 pg/ml) by 83.5% (P < 0.0001). BQ-123 abolished the ET-1-induced increase in arterial pressure (21.8 +/- 5.2 mmHg at 1 min). Changes in central venous pressure were similar for DMSO and BQ-123 (slow: 0.91 and 1.14 mmHg; fast: 4.50 and 4.13 mmHg). The results suggest that 1) ET-B receptors mainly mediate the ANF secretion response to slow volume expansions of <1.6%/min; and 2) ET-A receptors mainly mediate the ANF response to acute volume overloads.
Abstract:
BACKGROUND: Liver remnant volumes after major hepatic resection and graft volumes for liver transplantation correlate with surgical outcome. The relative contributions of the hepatic segments to total liver volume (TLV) are not well established. METHODS: TLV and hepatic segment volumes were measured with computed tomography (CT) in 102 patients without liver disease who underwent CT for conditions unrelated to the liver or biliary tree. RESULTS: TLV ranged from 911 to 2729 cm(3). On average, the right liver (segments V, VI, VII, and VIII) contributed approximately two thirds of TLV (997+/-279 cm(3)), and the left liver (segments II, III and IV) contributed approximately one third of TLV (493+/-127 cm(3)). Bisegment II+III (left lateral section) contributed about half the volume of the left liver (242+/-79 cm(3)), or 16% of TLV. Liver volumes varied significantly between patients: the right liver ranged from 49% to 82% of TLV; the left liver, from 17% to 49% of TLV; and bisegment II+III (left lateral section), from 5% to 27% of TLV. Bisegment II+III contributed less than 20% of TLV in more than 75% of patients, and the left liver contributed 25% or less of TLV in more than 10% of patients. DISCUSSION: There is clinically significant interpatient variation in hepatic volumes. Therefore, in the absence of appreciable hypertrophy, we recommend routine measurement of the future liver remnant before extended right hepatectomy (right trisectionectomy) and in selected patients before right hepatectomy if a small left liver is anticipated.
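The remnant-volume arithmetic implied by this abstract is simple to make explicit. The sketch below uses an invented patient's per-segment CT volumes and an illustrative 20% threshold for flagging a small future remnant; actual clinical cutoffs depend on liver quality and are not taken from the study.

```python
# Per-segment share of total liver volume, with a hypothetical 20% flag
# for a small future remnant (bisegment II+III). Volumes are invented.
segments = {  # cm^3, hypothetical patient
    "II": 130, "III": 110, "IV": 250,
    "V": 260, "VI": 240, "VII": 250, "VIII": 260,
}
tlv = sum(segments.values())                              # total liver volume
left_lateral = segments["II"] + segments["III"]           # bisegment II+III
right_liver = sum(segments[s] for s in ("V", "VI", "VII", "VIII"))
share = 100 * left_lateral / tlv
print(f"TLV={tlv} cm^3, right liver={100 * right_liver / tlv:.1f}%, "
      f"II+III={share:.1f}% of TLV")
if share < 20:
    print("future remnant II+III below 20% of TLV: volumetry warranted")
```

With these toy numbers the left lateral section is 16% of TLV, matching the average reported above and falling in the range (<20% in over 75% of patients) that motivates the authors' recommendation of routine remnant volumetry.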