898 results for motion cueing algorithm (MCA)
Abstract:
Background: Research in epistasis or gene-gene interaction detection for human complex traits has grown over the last few years. It has been marked by promising methodological developments, improved translation efforts of statistical epistasis to biological epistasis, and attempts to integrate different omics information sources into the epistasis screening to enhance power. The quest for gene-gene interactions poses severe multiple-testing problems. In this context, the maxT algorithm is one technique to control the false-positive rate. However, the memory needed by this algorithm rises linearly with the number of hypothesis tests. Gene-gene interaction studies will require memory proportional to the squared number of SNPs. A genome-wide epistasis search would therefore require terabytes of memory. Hence, cache problems are likely to occur, increasing the computation time. In this work we present a new version of maxT, requiring an amount of memory independent of the number of genetic effects to be investigated. This algorithm was implemented in C++ in our epistasis screening software MBMDR-3.0.3. We evaluate the new implementation in terms of memory efficiency and speed using simulated data. The software is illustrated on real-life data for Crohn’s disease. Results: In the case of a binary (affected/unaffected) trait, the parallel workflow of MBMDR-3.0.3 analyzes all gene-gene interactions in a dataset of 100,000 SNPs typed on 1,000 individuals within 4 days and 9 hours, using 999 permutations of the trait to assess statistical significance, on a cluster composed of 10 blades, each containing four Quad-Core AMD Opteron(tm) 2352 processors at 2.1 GHz. In the case of a continuous trait, a similar run takes 9 days. Our program found 14 SNP-SNP interactions with a multiple-testing corrected p-value of less than 0.05 on real-life Crohn’s disease (CD) data. Conclusions: Our software is the first implementation of the MB-MDR methodology able to solve large-scale SNP-SNP interaction problems within a few days, without using much memory, while adequately controlling the type I error rates. A new implementation to reach genome-wide epistasis screening is under construction. In the context of Crohn’s disease, MBMDR-3.0.3 could identify epistasis involving regions that are well known in the field and could be explained from a biological point of view. This demonstrates the power of our software to find relevant phenotype-genotype higher-order associations.
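To make the memory argument concrete, below is a minimal sketch of a single-step permutation-maximum scheme in this spirit; it is not the exact step-down maxT procedure implemented in MBMDR-3.0.3, and test_statistic is a hypothetical placeholder for the MB-MDR pair test. Only one running maximum per permuted trait and a fixed-size list of top observed statistics are kept, so memory no longer scales with the number of SNP pairs.

import heapq
import itertools
import numpy as np

def maxT_top_hits(genotypes, trait, test_statistic, n_perm=999, top_m=1000,
                  rng=np.random.default_rng(0)):
    """Simplified single-step maxT keeping only (a) the top_m observed pair
    statistics and (b) one maximum statistic per permuted trait, so memory is
    O(top_m + n_perm) rather than growing with the number of SNP pairs tested."""
    n_snps = genotypes.shape[1]

    def pair_stats(y):
        # Stream over all SNP pairs without materializing them in memory.
        for i, j in itertools.combinations(range(n_snps), 2):
            yield (i, j), test_statistic(genotypes[:, i], genotypes[:, j], y)

    # Streaming pass over all pairs: retain only the top_m observed statistics.
    top = heapq.nlargest(top_m, pair_stats(trait), key=lambda t: t[1])

    # One number per permutation: the maximum statistic over all pairs.
    perm_max = np.array([max(s for _, s in pair_stats(rng.permutation(trait)))
                         for _ in range(n_perm)])

    # maxT-adjusted p-value for each retained pair (plus-one correction).
    return {pair: (1 + np.sum(perm_max >= s)) / (n_perm + 1) for pair, s in top}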
Abstract:
The relationship between source separation and blind deconvolution is well known: if a filtered version of an unknown i.i.d. signal is observed, temporal independence between samples can be used to retrieve the original signal, in the same way that spatial independence is used for source separation. In this paper we propose the use of a Genetic Algorithm (GA) to blindly invert linear channels. The use of a GA is justified in the case of a small number of samples, where gradient-like methods fail because of poor estimation of the statistics.
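The general idea can be illustrated with a GA that evolves the taps of an inverse FIR filter and scores each candidate by the absolute excess kurtosis of its output, a standard non-Gaussianity criterion for blind equalization of i.i.d. sources. This is a sketch of the principle, not the authors' implementation, and all parameter names are illustrative.

import numpy as np

def kurtosis(y):
    # Excess kurtosis of the equalizer output (zero for a Gaussian signal).
    y = (y - y.mean()) / (y.std() + 1e-12)
    return np.mean(y**4) - 3.0

def ga_blind_deconvolution(x, filt_len=8, pop_size=60, n_gen=200,
                           mut_sigma=0.1, rng=np.random.default_rng(0)):
    """Evolve inverse-filter taps w so that y = w * x is as non-Gaussian as possible."""
    pop = rng.normal(size=(pop_size, filt_len))

    def fitness(w):
        return abs(kurtosis(np.convolve(x, w, mode="valid")))

    for _ in range(n_gen):
        scores = np.array([fitness(w) for w in pop])
        # Tournament selection: keep the better of two randomly drawn individuals.
        idx = rng.integers(pop_size, size=(pop_size, 2))
        parents = pop[np.where(scores[idx[:, 0]] > scores[idx[:, 1]],
                               idx[:, 0], idx[:, 1])]
        # Uniform crossover followed by Gaussian mutation.
        mask = rng.random((pop_size, filt_len)) < 0.5
        children = np.where(mask, parents, parents[rng.permutation(pop_size)])
        children += mut_sigma * rng.normal(size=children.shape)
        # Elitism: carry over the best individual of the previous generation.
        children[0] = pop[scores.argmax()]
        pop = children

    best = pop[np.array([fitness(w) for w in pop]).argmax()]
    return best / np.linalg.norm(best)   # inverse filter, up to scale/delay ambiguity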
Abstract:
Includes: The opinions of the honourable members who supported the motion
Abstract:
This report documents work undertaken in the demonstration of a low-cost Automatic Weight and Classification System (AWACS). An AWACS procurement specification and details of the results of the project are also included. The intent of the project is to support and encourage the transfer of research knowledge to state and local agencies and manufacturers through field demonstrations. Presently available Weigh-in-Motion and Classification Systems are typically too expensive to permit the wide deployment necessary to obtain representative vehicle data. Piezoelectric technology has been used in the United Kingdom and Europe and is believed to be the basic element of a low-cost AWACS. Low-cost systems have been installed at two sites, one in Portland Cement Concrete (PCC) pavement in Iowa and the other in Asphaltic Cement Concrete (ACC) pavement in Minnesota, to provide experience with both types of pavement. The systems provide axle weights, gross vehicle weight, axle spacing, vehicle classification, vehicle speed, vehicle count, and time of arrival. In addition, system self-calibration and a method to predict tire contact pressure are included in the system design. The study has shown that in the PCC pavement the AWACS is capable of meeting the needs of state and federal highway agencies, producing accuracies comparable to many current commercial WIM devices. This is being achieved at a procurement cost substantially lower than that of currently available equipment. In the ACC pavement the accuracies were lower than those observed in the PCC pavement, which is concluded to result from low pavement rigidity at this site. Further work is needed to assess AWACS performance at a range of sites in ACC pavements.
Abstract:
The velocity of a liquid slug falling in a capillary tube is lower than predicted for Poiseuille flow due to the presence of menisci, whose shapes are determined by the complex interplay of capillary, viscous, and gravitational forces. Because of the menisci, a capillary pressure proportional to the surface curvature acts on the slug and the streamlines are bent close to the interface, resulting in enhanced viscous dissipation at the wedges. To determine the origin of the drag-force increase relative to Poiseuille flow, we compute the force resultant acting on the slug by integrating the Navier-Stokes equations over the liquid volume. Invoking relationships from differential geometry, we demonstrate that the additional drag is due to viscous forces only and that no capillary drag of hydrodynamic origin exists (i.e., due to hydrodynamic deformation of the interface). Requiring that the force resultant be zero, we derive scaling laws for the steady velocity in the limit of small capillary numbers by estimating the leading-order viscous dissipation in the different regions of the slug (i.e., the unperturbed Poiseuille-like bulk, the static menisci close to the tube axis, and the dynamic regions close to the contact lines). Considering both partial and complete wetting, we find that the relationship between dimensionless velocity and weight is, in general, nonlinear. Whereas the relationship obtained for complete-wetting conditions agrees with the experimental data of Bico and Quéré [J. Bico and D. Quéré, J. Colloid Interface Sci. 243, 262 (2001)], the scaling law under partial-wetting conditions is validated by numerical simulations performed with the Volume of Fluid method. The simulated steady velocities agree with the behavior predicted by the theoretical scaling laws both in the presence and in the absence of static contact angle hysteresis. The numerical simulations suggest that wedge-flow dissipation alone cannot account for the entire additional drag and that the non-Poiseuille dissipation in the static menisci (not considered in previous studies) has to be taken into account for large contact angles.
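Schematically, the force-balance argument described above can be written as follows (a sketch of the structure only, with no specific exponents assumed): the weight of the slug is balanced by the viscous contributions of the three regions, and the expansion is carried out at small capillary number,

\rho g \,\Omega \;=\; F^{\mathrm{visc}}_{\mathrm{bulk}} \;+\; F^{\mathrm{visc}}_{\mathrm{menisci}} \;+\; F^{\mathrm{visc}}_{\mathrm{wedge}},
\qquad
\mathrm{Ca} \;=\; \frac{\mu U}{\sigma} \;\ll\; 1,

where \Omega is the slug volume, \mu the dynamic viscosity, \sigma the surface tension, and U the steady velocity; estimating each viscous term at leading order in Ca yields the (generally nonlinear) velocity-weight relationships discussed in the abstract.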
Abstract:
Image registration has been proposed as an automatic method for recovering cardiac displacement fields from Tagged Magnetic Resonance Imaging (tMRI) sequences. Initially performed as a set of pairwise registrations, these techniques have evolved towards the use of 3D+t deformation models, requiring metrics of joint image alignment (JA). However, only linear combinations of cost functions defined with respect to the first frame have been used. In this paper, we have applied k-Nearest Neighbor Graph (kNNG) estimators of the α-entropy (Hα) to measure the joint similarity between frames and to combine the information provided by different cardiac views in a unified metric. Experiments performed on six subjects showed a significantly higher accuracy (p < 0.05) with respect to a standard pairwise alignment (PA) approach in terms of mean positional error and variance with respect to manually placed landmarks. The developed method was used to study strains in patients with myocardial infarction, showing consistency between strain, infarction location, and coronary occlusion. This paper also presents an interesting clinical application of graph-based metric estimators, showing their value for solving practical problems found in medical imaging.
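For context, kNN-graph estimators approximate the Rényi α-entropy of a sample from the powered edge lengths of its k-nearest-neighbor graph; in registration the additive constant of the estimator can be dropped, because only the variation of the estimate with the candidate transformation matters. The sketch below illustrates this general idea and is not the paper's exact metric; transform and params are hypothetical placeholders for the 3D+t deformation model.

import numpy as np
from scipy.spatial import cKDTree

def knn_graph_renyi_entropy(Z, alpha=0.5, k=4):
    """kNN-graph estimate of the Renyi alpha-entropy of the sample Z (n points, d dims),
    up to an additive constant that does not depend on the alignment being evaluated."""
    n, d = Z.shape
    gamma = d * (1.0 - alpha)                      # edge-length exponent
    tree = cKDTree(Z)
    # Distances to the k nearest neighbors (first column is the point itself).
    dist, _ = tree.query(Z, k=k + 1)
    L = np.sum(dist[:, 1:] ** gamma)               # total powered kNN-graph length
    return np.log(L / n**alpha) / (1.0 - alpha)    # H_alpha plus a constant

def joint_alignment_cost(frames, transform, params):
    """Joint-alignment style cost: entropy of jointly sampled intensities, to be
    minimized over the deformation parameters (hypothetical helper names)."""
    warped = [transform(f, p) for f, p in zip(frames, params)]
    Z = np.stack([w.ravel() for w in warped], axis=1)   # one feature vector per voxel
    return knn_graph_renyi_entropy(Z)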
Abstract:
Two portable Radio Frequency IDentification (RFID) systems (made by Texas Instruments and HiTAG) were developed and tested for bridge scour monitoring by the Department of Civil and Environmental Engineering at the University of Iowa (UI). Both systems consist of three similar components: 1) a passive cylindrical transponder (a term derived from transmitter/responder) 2.2 cm in length; 2) a low-frequency reader (~134.2 kHz); and 3) an antenna (with a rectangular or hexagonal loop). The Texas Instruments system can only read one smart particle at a time, while the HiTAG system was successfully modified at UI by adding an anti-collision feature. The HiTAG system was equipped with four antennas and could simultaneously detect thousands of smart particles located in close proximity. A computer code was written in C++ at UI for the HiTAG system to allow simultaneous, multiple readouts of smart particles under different flow conditions. The code was written for the Windows XP operating system and has a user-friendly windows interface that provides detailed information about each detected smart particle, including its identification number, location (orientation in x, y, z), and the instant the particle was detected. These systems were examined within the context of this innovative research in order to identify the RFID system best suited for performing autonomous bridge scour monitoring. A comprehensive laboratory study that included 142 experimental runs and limited field testing was performed to test the code and determine the performance of each system in terms of transponder orientation, transponder housing material, maximum antenna-transponder detection distance, minimum inter-particle distance, and antenna sweep angle. The two RFID systems' capabilities to predict scour depth were also examined using pier models. The findings can be summarized as follows: 1) The first system (Texas Instruments) read one smart particle at a time, and its effective read range was about 3 ft (~1 m). The second system (HiTAG) had similar detection ranges but permitted the addition of an anti-collision system to facilitate the simultaneous identification of multiple smart particles (transponders placed into marbles). Therefore, the HiTAG system with the anti-collision feature (or a system with similar features) would be preferable to a single-readout system for bridge scour monitoring, as the former can provide repetitive readings at multiple locations, which can help in predicting the scour-hole bathymetry along with the maximum scour depth. 2) The HiTAG system provided reliable measures of the scour depth (z-direction) and the locations of the smart particles on the x-y plane within a distance of about 3 ft (~1 m) from the four antennas. A Multiplexer HTM4-I allowed the simultaneous use of four antennas with the HiTAG system. The four hexagonal loop antennas permitted the complete identification of the smart particles in an x, y, z orthogonal system as a function of time. The HiTAG system can also be used to measure the rate of sediment movement (in kg/s or tonnes/hr). 3) The maximum detection distance of the antenna did not change significantly for buried particles compared to particles tested in the air. Thus, low-frequency RFID systems (~134.2 kHz) are appropriate for monitoring bridge scour because their waves can penetrate water and sand bodies without significant loss of signal strength.
4) The pier model experiments in a flume with the first RFID system showed that the system was able to successfully predict the maximum scour depth when it was used with a single particle in the vicinity of the pier model where the scour hole was expected. The pier model experiments with the second RFID system, performed in a sandbox, showed that the system was able to successfully predict the maximum scour depth when two scour balls were used in the vicinity of the pier model where the scour hole developed. 5) The preliminary field experiments with the second RFID system at the Raccoon River, IA, near the Railroad Bridge (located upstream of the 360th Street Bridge, near Booneville), showed that the RFID technology is transferable to the field. A practical method for facilitating the placement of the smart particles within the river bed still needs to be developed. This method needs to be straightforward for the Department of Transportation (DOT) and county road working crews so it can be easily implemented at different locations. 6) Since the inception of this project, further research has shown significant progress in RFID technology. This includes the availability of waterproof RFID systems with passive or active transponders offering detection ranges of up to 60 ft (~20 m) within the water-sediment column. These systems have anti-collision capability and can accommodate up to 8 powerful antennas, which can significantly increase the detection range. Such systems need to be further considered and modified for performing automatic bridge scour monitoring. The knowledge gained from the two systems, including the software, needs to be adapted to the new systems.
Abstract:
Context: Ovarian tumor (OT) typing is a competency expected from pathologists, with significant clinical implications. OT, however, come in numerous different types, some rather rare, with the consequence that some departments have few opportunities for practice. Aim: Our aim was to design a tool for pathologists to train in the typing of less common OT. Method and Results: Representative slides of 20 less common OT were scanned (Nano Zoomer Digital Hamamatsu®) and the diagnostic algorithm proposed by Young and Scully was applied to each case (Young RH and Scully RE, Seminars in Diagnostic Pathology 2001, 18: 161-235), including: recognition of morphological pattern(s); shortlisting of differential diagnoses; proposition of relevant immunohistochemical markers. The next steps of this project will be: evaluation of the tool in several post-graduate training centers in Europe and Québec; improvement of its design based on the evaluation results; diffusion to a larger public. Discussion: In clinical medicine, solving many cases is recognized as being of utmost importance for a novice to become an expert. This project relies on virtual slide technology to provide pathologists with a learning tool aimed at increasing their skills in OT typing. After due evaluation, this model might be extended to other uncommon tumors.
Abstract:
In a previous work we showed that sinusoidal whole-body rotations producing continuous vestibular stimulation affected the timing of motor responses as assessed with a paced finger tapping (PFT) task (Binetti et al. (2010). Neuropsychologia, 48(6), 1842-1852). Here, in two new psychophysical experiments, one purely perceptual and one with both sensory and motor components, we explored the relationship between body motion/vestibular stimulation and the perceived timing of acoustic events. In experiment 1, participants were required to discriminate sequences of acoustic tones endowed with different degrees of acceleration or deceleration. In this experiment we found that a tone sequence presented during acceleratory whole-body rotations required a progressive increase in rate in order to be considered temporally regular, consistent with the idea of an increase in "clock" frequency and of an overestimation of time. In experiment 2, participants produced self-paced taps, which entailed acoustic feedback. We found that tapping frequency in this task was affected by periodic motion by means of anticipatory and congruent (in-phase) fluctuations, irrespective of the self-generated sensory feedback. On the other hand, synchronizing taps to an external rhythm determined a completely opposite modulation (delayed/counter-phase). Overall, this study shows that body displacements "remap" our metric of time, affecting not only motor output but also sensory input.
Abstract:
PURPOSE: A new magnetic resonance imaging approach for the detection of myocardial late enhancement during free breathing was developed. METHODS AND RESULTS: For suppression of respiratory motion artifacts, a prospective navigator technology including real-time motion correction and a local navigator restore was implemented. Subject-specific inversion times were defined from images with incrementally increased inversion times acquired during a single dynamic scout scan performed with navigator gating and real-time motion correction during free breathing. Subsequently, MR imaging of myocardial late enhancement was performed with navigator-gated and real-time motion corrected adjacent short-axis and long-axis (two-, three- and four-chamber) views. This alternative approach was investigated in 7 patients with a history of myocardial infarction 12 min after i.v. administration of 0.2 mmol/kg body weight gadolinium-DTPA. CONCLUSION: With the presented navigator-gated and real-time motion corrected sequence for MR imaging of myocardial late enhancement, data can be acquired entirely during free breathing. The time constraints of a breath-hold technique are eliminated, and an optimized patient-specific inversion time is ensured.
Abstract:
PURPOSE: To examine the impact of spatial resolution and respiratory motion on the ability to accurately measure atherosclerotic plaque burden and to visually identify atherosclerotic plaque composition. MATERIALS AND METHODS: Numerical simulations of the Bloch equations and vessel wall phantom studies were performed for different spatial resolutions by incrementally increasing the field of view. In addition, respiratory motion was simulated based on a measured physiologic breathing pattern. RESULTS: While a spatial resolution of ≥ 6 pixels across the wall does not result in significant errors, a resolution of ≤ 4 pixels across the wall leads to an overestimation of > 20%. Using a double-inversion T2-weighted turbo spin echo sequence, a resolution of 1 pixel across equally thick tissue layers (fibrous cap, lipid, smooth muscle) and a respiratory motion correction precision (gating window) of three times the thickness of the tissue layer allow for characterization of the different coronary wall components. CONCLUSIONS: We found that measurements in low-resolution black-blood images tend to overestimate vessel wall area and underestimate lumen area.
Abstract:
In this paper, a hybrid simulation-based algorithm is proposed for the Stochastic Flow Shop Problem. The main idea of the methodology is to transform the stochastic problem into a deterministic problem and then apply simulation to the latter. In order to achieve this goal, we rely on Monte Carlo Simulation and an adapted version of a deterministic heuristic. This approach aims to provide flexibility and simplicity, since it is not constrained by any prior assumptions and relies on well-tested heuristics.
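The transformation described above can be illustrated as follows (a minimal sketch, not the authors' implementation): replace the random processing times by their expected values, apply a classical deterministic permutation-flow-shop heuristic such as NEH to that surrogate instance, and then estimate the expected makespan of the resulting sequence by Monte Carlo simulation. Function and parameter names are illustrative.

import numpy as np

def makespan(perm, p):
    """Permutation flow-shop makespan for processing-time matrix p[job, machine]."""
    n_machines = p.shape[1]
    c = np.zeros(n_machines)
    for j in perm:
        for m in range(n_machines):
            c[m] = max(c[m], c[m - 1] if m else 0.0) + p[j, m]
    return c[-1]

def neh(p):
    """NEH heuristic on a deterministic instance: insert jobs (longest total
    processing time first) at the position minimizing the partial makespan."""
    order = np.argsort(-p.sum(axis=1))
    seq = []
    for j in order:
        best = min((makespan(seq[:k] + [j] + seq[k:], p), k) for k in range(len(seq) + 1))
        seq.insert(best[1], j)
    return seq

def simheuristic_flow_shop(sample_times, n_scenarios=1000, rng=np.random.default_rng(0)):
    """sample_times(rng) -> one random p[job, machine] matrix (the stochastic instance).
    1) Deterministic surrogate = expected processing times; 2) NEH on the surrogate;
    3) Monte Carlo estimate of the expected makespan of the chosen sequence."""
    scenarios = [sample_times(rng) for _ in range(n_scenarios)]
    p_expected = np.mean(scenarios, axis=0)
    seq = neh(p_expected)
    return seq, float(np.mean([makespan(seq, p) for p in scenarios]))

In the cited approach the simulation feedback can also be fed back to compare alternative sequences; the sketch stops at evaluating the single NEH sequence.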
Abstract:
3 Summary. 3.1 English. The pharmaceutical industry has been facing several challenges over the last few years, and the optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques participate actively in this optimization, especially when complemented by computational approaches aimed at rationalizing the enormous amount of information that they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both heavily rely on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from being solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to its affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking program aiming at this goal, EADock, is presented. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with sophisticated diversity management. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and contrary to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed and led to the successful design of new peptidic ligands for the α5β1 integrin and for human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.
3.2 French (translated). The recent difficulties of the pharmaceutical industry seem resolvable only through the optimization of its drug development process. This optimization increasingly involves so-called "high-throughput" techniques, which are particularly effective when coupled with the computational tools needed to manage the mass of data they produce. In silico approaches such as virtual screening or the rational design of new molecules are now used routinely. Both rely on the ability to predict the details of the molecular interaction between a drug-like molecule and a target protein of therapeutic interest. Benchmarks of the software tackling this prediction are flattering, but several problems remain. The recent literature tends to question their reliability, asserting an emerging need for more accurate descriptions of the binding mode. This accuracy is essential for computing the binding free energy, which is directly linked to the affinity of the potential drug for the target protein and indirectly linked to its biological activity. An accurate prediction is of particular importance for the discovery and optimization of new active molecules. This thesis presents a new program, EADock, designed with such accuracy in mind. This hybrid evolutionary algorithm uses two selection pressures, combined with sophisticated diversity management. EADock relies on CHARMM for energy calculations and the handling of atomic coordinates. Its validation was carried out on 37 crystallized protein-ligand complexes, including 11 different proteins. The search space was extended to a sphere of 15 Å radius around the center of mass of the crystallized ligand, and, contrary to the usual benchmarks, the algorithm started from optimized solutions with an RMSD of up to 10 Å from the crystal structure. This validation highlighted the efficiency of our search heuristic, as binding modes with an RMSD lower than 2 Å from the crystal structure were ranked first for 68% of the complexes. When the five best solutions are taken into account, the success rate climbs to 78%, and to 92% when the whole last generation is considered. Most prediction errors are attributable to the presence of crystal contacts. Since then, EADock has been used to understand the molecular mechanisms involved in the regulation of the Na,K-ATPase and in the activation of the peroxisome proliferator-activated receptor α (PPARα). It has also made it possible to describe the interaction of commonly encountered pollutants with PPARγ, as well as the influence of the metabolism of Imatinib (an anticancer drug) on its binding to the Bcr-Abl kinase. An approach based on predicting the interactions of molecular fragments with a target protein is also proposed. It enabled the discovery of new peptide ligands of PPARα and of the α5β1 integrin. In both cases, the activity of these new peptides is comparable to that of well-established ligands, such as Wy14,643 for the former and Cilengitide (an anticancer drug) for the latter.
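One possible reading of that architecture, sketched generically: offspring are locally minimized (the hybrid step), a cheap fitness drives selection, an accurate fitness produces the final ranking, and RMSD-based clustering preserves diversity. The energy callables below are placeholders standing in for the CHARMM-based scoring actually used by EADock, and all names are illustrative.

import numpy as np

def evolve_poses(seed_pose, cheap_energy, accurate_energy, local_minimize,
                 pop_size=50, n_gen=100, cluster_radius=2.0,
                 rng=np.random.default_rng(0)):
    """Generic hybrid-EA skeleton with two selection pressures and RMSD-based
    diversity management; all callables are placeholder scoring/minimization hooks."""

    def rmsd(a, b):
        return np.sqrt(np.mean((a - b) ** 2))

    pop = [seed_pose + rng.normal(scale=1.0, size=seed_pose.shape) for _ in range(pop_size)]
    for _ in range(n_gen):
        # Hybrid step: local minimization of each candidate pose.
        pop = [local_minimize(p) for p in pop]
        # First selection pressure: rank by the cheap, fast-to-evaluate fitness.
        pop.sort(key=cheap_energy)
        survivors = pop[: pop_size // 2]
        # Diversity management: keep at most one representative per RMSD cluster.
        niched = []
        for p in survivors:
            if all(rmsd(p, q) > cluster_radius for q in niched):
                niched.append(p)
        # Refill the population by mutating the cluster representatives.
        pop = niched + [niched[rng.integers(len(niched))]
                        + rng.normal(scale=0.5, size=seed_pose.shape)
                        for _ in range(pop_size - len(niched))]
    # Second selection pressure: final ranking with the accurate (expensive) fitness.
    return sorted(pop, key=accurate_energy)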
Abstract:
What is Iowa in Motion? The Iowa Department of Transportation is continuing the journey to develop Iowa’s future transportation system. This ongoing planning process, known as Iowa in Motion, was developed in response to the Intermodal Surface Transportation Efficiency Act (ISTEA) and Iowa’s changing transportation needs. The completion of Parts I, II and III of Iowa in Motion has led to the development of this State Transportation Plan. Part IV includes activities, both current and future, to support the plan. This State Transportation Plan represents the thoughts and concerns of thousands of Iowans. Individuals, metropolitan planning organizations (MPOs), regional planning affiliations (RPAs), associations and organizations have become involved and have made recommendations on the direction transportation investments should take. This plan represents their extensive input into the Iowa in Motion process and the consensus building that took place as we moved toward adoption of this State Transportation Plan. The adopted plan serves as a guide for the development of transportation policies, goals, objectives, initiatives and investment decisions through the year 2020.