48 results for Application method
Abstract:
A method is proposed for estimating the absolute binding free energy of interaction between proteins and ligands. Conformational sampling of the protein-ligand complex is performed by molecular dynamics (MD) in vacuo, and the solvent effect is calculated a posteriori by solving the Poisson or the Poisson-Boltzmann equation for selected frames of the trajectory. The binding free energy is written as a linear combination of the surface buried upon complexation, SASbur, the electrostatic interaction energy between the ligand and the protein, Eelec, and the difference between the solvation free energies of the complex and of the isolated ligand and protein, ΔGsolv. The method uses the buried surface to account for the non-polar contribution to the binding free energy because it is less sensitive to the details of the structure than the van der Waals interaction energy. The parameters of the method were developed on a training set of 16 HIV-1 protease-inhibitor complexes of known 3D structure, giving a correlation coefficient of 0.91 and an unsigned mean error of 0.8 kcal/mol. When applied to a set of 25 HIV-1 protease-inhibitor complexes of unknown 3D structure, the method provides a satisfactory correlation between the calculated binding free energy and the experimental pIC50 without reparametrization.
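As a reading aid, the linear model described above can be written out explicitly; the coefficient symbols and the constant term below are illustrative assumptions, since the abstract only states that the three terms enter linearly with parameters fitted on the training set:

\[ \Delta G_{\mathrm{bind}} \approx \alpha\,\mathrm{SAS}_{\mathrm{bur}} + \beta\,E_{\mathrm{elec}} + \gamma\,\Delta G_{\mathrm{solv}} + \delta \]

where \(\alpha\), \(\beta\), \(\gamma\) and \(\delta\) would be the empirical parameters adjusted against the 16 complexes of known structure.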
Abstract:
A score system integrating the evolution of efficacy and tolerability over time was applied to a subpopulation of the STRATHE trial, a parallel-group, double-blind trial with random allocation to either a fixed-dose combination strategy (perindopril/indapamide 2 mg/0.625 mg, with the possibility to increase the dose to 3 mg/0.935 mg and 4 mg/1.250 mg if needed, n = 118), a sequential monotherapy approach (atenolol 50 mg, followed by losartan 50 mg and amlodipine 5 mg if needed, n = 108), or a stepped-care strategy (valsartan 40 mg, followed by valsartan 80 mg and valsartan 80 mg + hydrochlorothiazide 12.5 mg if needed, n = 103). The aim was to lower blood pressure below 140/90 mmHg within a 9-month period; the treatment could be adjusted after 3 and 6 months. Only patients in whom the study protocol was strictly applied were included in this analysis. At completion of the trial, the total score averaged 13.1 +/- 70.5 (mean +/- SD) with the fixed-dose combination strategy, compared with -7.2 +/- 81.0 with the sequential monotherapy approach and -17.5 +/- 76.4 with the stepped-care strategy. In conclusion, the use of a score system allows the comparison of antihypertensive therapeutic strategies, taking efficacy and tolerability into account at the same time. In the STRATHE trial, the best results were observed with the fixed-dose combination containing low doses of an angiotensin-converting enzyme inhibitor (perindopril) and a diuretic (indapamide).
Abstract:
The study investigates the possibility of incorporating fracture intensity and block geometry as spatially continuous parameters in GIS-based systems. For this purpose, a deterministic method has been implemented to estimate block size (Bloc3D) and joint frequency (COLTOP). In addition to measuring the block size, the Bloc3D method provides a 3D representation of the shape of individual blocks. These two methods were applied using field measurements (joint set orientation and spacing) performed over a large area in the Swiss Alps, characterized by complex geology, a number of different rock masses and varying degrees of metamorphism. The spatial variability of the parameters was evaluated with regard to lithology and major faults. A model incorporating these measurements and observations into a GIS to assess the risk associated with rock falls is proposed. The analysis concludes with a discussion of the feasibility of such an application in regularly and irregularly jointed rock masses, with persistent and impersistent discontinuities.
Abstract:
Single amino acid substitution is the type of protein alteration most often associated with human diseases. Current studies seek primarily to distinguish neutral mutations from harmful ones, and very few methods offer an explanation of the final prediction in terms of the probable structural or functional effect on the protein. In this study, we describe the use of three novel parameters to identify experimentally verified critical residues of the TP53 protein (p53). The first two parameters make use of a surface clustering method to calculate the protein surface area of highly conserved regions or of regions with a high non-local atomic interaction energy (ANOLEA) score; these parameters help identify important functional regions on the surface of a protein. The last parameter involves a new method for pseudo-binding free-energy estimation that specifically probes the importance of residue side-chains for the stability of the protein fold. A decision tree was designed to optimally combine these three parameters, and the result was compared with the functional data stored in the International Agency for Research on Cancer (IARC) TP53 mutation database. The final prediction achieved an accuracy of 70%, a Matthews correlation coefficient of 0.45 and a high specificity of 91.8%. Mutations in the 85 correctly identified important residues represented 81.7% of the total mutations recorded in the database. In addition, the method was able to correctly assign a probable functional or structural role to the residues. Such information could be critical for the interpretation and prediction of the effect of missense mutations, as it not only provides a fundamental explanation of the observed effect but also helps design the most appropriate laboratory experiment to verify the prediction results.
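For reference, the Matthews correlation coefficient quoted above is the standard measure computed from the confusion matrix of the binary prediction (important vs. non-important residues); this is the usual definition, not a formula taken from the abstract:

\[ \mathrm{MCC} = \frac{TP \cdot TN - FP \cdot FN}{\sqrt{(TP+FP)(TP+FN)(TN+FP)(TN+FN)}} \]

where TP, TN, FP and FN are the numbers of true positives, true negatives, false positives and false negatives, respectively.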
Abstract:
The construction from 2004 onwards of a second metro line (M2) through downtown Lausanne provided the opportunity to develop a methodology for microgravity surveys in a disturbed urban environment. Terrain corrections take on a particular importance in such a setting, because numerous non-geological objects of anthropogenic origin, such as empty basements, perturb the gravity measurements. The civil engineering studies for the metro project supplied a large amount of cadastral information, notably the outlines of buildings, the planned position of the M2 tube, the depths of basements in the vicinity of the tube, and the geology encountered along the M2 corridor (derived from the lithological logs of geotechnical boreholes). The plan geometry of the basements was deduced from the building outlines in a GIS (Geographic Information System), while a door-to-door survey was needed to measure or estimate basement heights. An existing DEM (Digital Elevation Model) of the city of Lausanne, on a 1 m grid, could then be updated with the voids represented by these basements. The gravity survey cycles were processed in Access databases, allowing greater control of the data, faster processing, and easier retroactive terrain corrections, in particular when the topography was updated during the construction work. The Caroline district (between the Bessières bridge and the Place de l'Ours) was chosen as the study area because, during this thesis, it spanned both the pre- and post-excavation phases of the M2 tunnel. This allowed two gravity surveys to be carried out (before excavation in summer 2005 and after excavation in summer 2007). These repeat surveys made it possible to test our modelling of the tunnel: by comparing the difference between the two surveys with the gravity response of the tube model discretized into rectangular prisms, we were able to validate our modelling method. The modelling approach we developed allows the shape of the object under consideration to be built in detail, with the possibility of crossing geological interfaces and the topographic surface several times. This type of modelling can be applied to any linear anthropogenic structure.
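Although the abstract does not spell it out, the gravity response of a model built from rectangular prisms is normally evaluated with the classical closed-form expression for the vertical attraction of a right rectangular prism (Nagy, 1966); the notation below is the standard textbook form rather than the author's own, and the overall sign depends on the orientation chosen for the z-axis:

\[ g_z = G\rho \,\Big[\, x \ln(y + r) + y \ln(x + r) - z \arctan\tfrac{xy}{zr} \,\Big]_{x_1}^{x_2}\Big|_{y_1}^{y_2}\Big|_{z_1}^{z_2}, \qquad r = \sqrt{x^2 + y^2 + z^2}, \]

where (x, y, z) are coordinates of the prism corners relative to the observation point, G is the gravitational constant and ρ the density contrast (negative for an air-filled tube or basement). Summing this expression over all prisms of the discretized tube gives the modelled signal that is compared with the difference between the repeat surveys.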
Abstract:
BACKGROUND: The annotation of protein post-translational modifications (PTMs) is an important task of UniProtKB curators and, with continuing improvements in experimental methodology, an ever greater number of articles are being published on this topic. To help curators cope with this growing body of information, we have developed a system which extracts information from the scientific literature for the most frequently annotated PTMs in UniProtKB. RESULTS: The procedure uses a pattern-matching and rule-based approach to extract sentences with information on the type and site of modification. A ranked list of protein candidates for the modification is also provided. For PTM extraction, precision varies from 57% to 94%, and recall from 75% to 95%, according to the type of modification. The procedure was used to track new publications on PTMs and to recover potential supporting evidence for phosphorylation sites annotated on the basis of large-scale proteomics experiments. CONCLUSIONS: The information retrieval and extraction method developed in this study forms the basis of a simple tool for the manual curation of protein post-translational modifications in UniProtKB/Swiss-Prot. Our work demonstrates that even simple text-mining tools can be effectively adapted for database curation tasks, provided that a thorough understanding of the working process and requirements is first obtained. The system can be accessed at http://eagl.unige.ch/PTM/.
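For readers less familiar with these metrics, the precision and recall figures quoted above have their usual information-retrieval definitions (a standard reminder, not text from the article):

\[ \mathrm{precision} = \frac{TP}{TP + FP}, \qquad \mathrm{recall} = \frac{TP}{TP + FN}, \]

where TP is the number of correctly extracted modification mentions, FP the number of spurious extractions and FN the number of annotated mentions the system missed.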
Abstract:
ABSTRACT: BACKGROUND: Decision curve analysis has been introduced as a method to evaluate prediction models in terms of their clinical consequences when used for a binary classification of subjects into a group who should and a group who should not be treated. The key concept for this type of evaluation is the "net benefit", a concept borrowed from utility theory. METHODS: We recall the foundations of decision curve analysis and discuss some new aspects. First, we stress the formal distinction between the net benefit for the treated and for the untreated and define the concept of the "overall net benefit". Next, we revisit the important distinction between the concept of accuracy, as typically assessed using the Youden index and receiver operating characteristic (ROC) analysis, and the concept of utility of a prediction model, as assessed using decision curve analysis. Finally, we provide an explicit implementation of decision curve analysis for the context of case-control studies. RESULTS: We show that the overall net benefit, which combines the net benefit for the treated and the untreated, is a natural alternative to the benefit achieved by a model, being invariant with respect to the coding of the outcome and conveying a more comprehensive picture of the situation. Further, within the framework of decision curve analysis, we illustrate the important difference between the accuracy and the utility of a model, demonstrating how poor an accurate model may be in terms of its net benefit. Finally, we show that the application of decision curve analysis to case-control studies, where an accurate estimate of the true prevalence of a disease cannot be obtained from the data, requires only a few modifications to the original calculation procedure. CONCLUSIONS: We present several interrelated extensions to decision curve analysis that will both facilitate its interpretation and broaden its potential area of application.
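As background, in standard decision curve analysis notation (not reproduced from the abstract), the net benefit for the treated at a threshold probability p_t is usually defined as

\[ \mathrm{NB}_{\mathrm{treated}}(p_t) = \frac{TP}{N} - \frac{FP}{N}\cdot\frac{p_t}{1-p_t}, \]

where TP and FP count the subjects classified as positive at that threshold among the N subjects under study; the net benefit for the untreated is obtained symmetrically from TN and FN with the odds factor inverted. The "overall net benefit" discussed in the article combines these two quantities.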
Abstract:
Targeting of photoreceptors and retinal pigment epithelial (RPE) cells remains challenging in ocular gene therapy. Viral gene transfer, the only method to have reached clinical evaluation, still raises safety concerns when administered via subretinal injections. We have developed a novel transfection method in the adult rat, called suprachoroidal electrotransfer (ET), combining the administration of nonviral plasmid DNA into the suprachoroidal space with the application of an electrical field. Optimization of the injection, the electrical parameters and the geometry of the external electrodes using a reporter plasmid resulted in a large area of transfected tissues. Not only choroidal cells but also RPE cells, and potentially photoreceptors, were efficiently transduced for at least a month when a cytomegalovirus (CMV) promoter was used. No ocular complications were recorded by angiographic, electroretinographic, or histological analyses, demonstrating that under the selected conditions the procedure is devoid of side effects on the retina or on vascular integrity. Moreover, a significant inhibition of laser-induced choroidal neovascularization (CNV) was achieved 15 days after transfection of a plasmid encoding soluble vascular endothelial growth factor receptor-1 (sFlt-1). This is the first nonviral gene transfer technique that efficiently targets the RPE without inducing retinal detachment. This novel, minimally invasive nonviral gene therapy method may open new prospects for human retinal therapies.
Abstract:
A system for recording very high resolution (VHR) 2D and 3D seismic data on lakes was developed at the Institute of Geophysics of the University of Lausanne. Several seismic surveys carried out on Lake Geneva have helped to constrain the geology of the area and to identify its sedimentary sequences. To move towards a quantitative interpretation of the seismic data, i.e. the determination of physical parameters of the sediments, the AVO (Amplitude Versus Offset) method was applied to the complex Quaternary sedimentary fill of the Lake Geneva trough. Two lacustrine seismic campaigns, 2D and 3D, were acquired to test the AVO method on the river deltas of the Grand Lac, namely the Aubonne and Dranse deltas, where the strata are relatively smooth and the discontinuities between them easy to pick. The acquisition geometry was redesigned to record data at large offsets: the streamers, deployed end to end, provided incidence angles of up to about 40°. GPS receivers specially developed for this purpose and attached along the streamer allowed the position of each hydrophone to be determined with an accuracy of about ±0.5 m after post-processing of the navigation data. To ensure that the system delivers correct amplitude information, the streamer hydrophones were calibrated in an anechoic chamber using a loudspeaker as a source; a maximum variation of 10 dB was found between the sensors and the reference signal, and an amplitude correction for each hydrophone was computed and applied before processing. An amplitude-preserving processing sequence was then applied to the lake data, and a surface-consistent algorithm was used to correct for the shot-to-shot amplitude variations of the air gun. Intercept and gradient sections obtained on the Aubonne and Dranse deltas were used to produce cross-plots, a representation that allows amplitude anomalies to be classified according to sediment type and potential gas content; these cross-plots show that the geological discontinuities (lacustrine sediments/moraine and moraine/molasse) have well-defined trends. A 3D volume collected on the Aubonne river delta was processed to obtain AVO attributes, one of which is the reflectivity amplitude of a seismic interface; this adds a quantitative component to the geological interpretation of an interface, and amplitude maps of the water bottom on the Aubonne delta reveal high-reflectivity anomalies that characterize the channels. Inversion of the Zoeppritz equation at the water bottom using the Levenberg-Marquardt algorithm was implemented to estimate VP, VS and ρ of the sediments immediately below the lake floor. A statistical study of the inversion results allows the variation of amplitude with offset to be simulated; below the water layer, the inversion yields a mud layer with VP = 1461 m/s, ρ = 1182 kg/m³ and VS = 383 m/s.
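For context, intercept-gradient cross-plots of the kind described above are normally based on the two-term Shuey approximation of the Zoeppritz equations (a standard AVO relation, not a formula quoted in the abstract):

\[ R(\theta) \approx A + B \sin^2\theta, \]

where R(θ) is the P-wave reflection coefficient at incidence angle θ, A is the intercept (the normal-incidence reflectivity) and B the gradient; the approximation is commonly considered adequate up to incidence angles of roughly 30-40°, consistent with the maximum angles reached in these surveys.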
Abstract:
This Ph.D. thesis is the outcome of a European research project funded by the fourth framework programme of the European Commission (DG XII, Standards, Measurement and Testing). The project, named SMT-CT98-2277, was funded for the Swiss part by the Federal Office of Education and Science (OFES, Bern, Switzerland). Its aim was to develop a harmonised, collaboratively tested method for the impurity profiling of illicit amphetamine by capillary gas chromatography. The work was divided into seven main tasks, covering the synthesis of amphetamine, the identification of impurities, the optimization of sample preparation and of the chromatographic system, the variability of the results, the investigation of numerical methods for the classification and comparison of profiles, and finally the application of the methodology to real illicit samples. The resulting method was not only shown to produce interchangeable data between different laboratories but was also found to be superior in many respects to previously published methods.
Abstract:
The application of statistics to science is not a neutral act. Statistical tools have shaped, and were in turn shaped by, their objects. In the social sciences, statistical methods fundamentally changed research practice, making statistical inference its centerpiece. At the same time, textbook writers in the social sciences transformed rival statistical systems into an apparently monolithic method that could be used mechanically. The idol of a universal method for scientific inference has been worshipped since the "inference revolution" of the 1950s. Because no such method has ever been found, surrogates have been created, most notably the quest for significant p values. This form of surrogate science fosters delusions and borderline cheating and has done much harm, creating, for one, a flood of irreproducible results. Proponents of the "Bayesian revolution" should be wary of chasing yet another chimera: an apparently universal inference procedure. A better path would be to promote both an understanding of the various devices in the "statistical toolbox" and informed judgment in selecting among them.
Abstract:
The objective of this work was to combine the advantages of the dried blood spot (DBS) sampling process with the highly sensitive and selective negative-ion chemical ionization tandem mass spectrometry (NICI-MS-MS) to analyze for recent antidepressants including fluoxetine, norfluoxetine, reboxetine, and paroxetine from micro whole blood samples (i.e., 10 µL). Before analysis, DBS samples were punched out, and antidepressants were simultaneously extracted and derivatized in a single step by use of pentafluoropropionic acid anhydride and 0.02% triethylamine in butyl chloride for 30 min at 60 °C under ultrasonication. Derivatives were then separated on a gas chromatograph coupled with a triple-quadrupole mass spectrometer operating in negative selected reaction monitoring mode for a total run time of 5 min. To establish the validity of the method, trueness, precision, and selectivity were determined on the basis of the guidelines of the "Société Française des Sciences et des Techniques Pharmaceutiques" (SFSTP). The assay was found to be linear in the concentration ranges 1 to 500 ng/mL for fluoxetine and norfluoxetine and 20 to 500 ng/mL for reboxetine and paroxetine. Despite the small sampling volume, the limit of detection was estimated at 20 pg/mL for all the analytes. The stability of DBS was also evaluated at -20 °C, 4 °C, 25 °C, and 40 °C for up to 30 days. Furthermore, the method was successfully applied to a pharmacokinetic investigation performed on a healthy volunteer after oral administration of a single 40-mg dose of fluoxetine. Thus, this validated DBS method combines an extractive-derivative single step with a fast and sensitive GC-NICI-MS-MS technique. Using microliter blood samples, this procedure offers a patient-friendly tool in many biomedical fields such as checking treatment adherence, therapeutic drug monitoring, toxicological analyses, or pharmacokinetic studies.
Abstract:
To date, no effective chiral capillary electrophoresis-mass spectrometry (CE-MS) method has been reported for the simultaneous enantioseparation of the antidepressant drug venlafaxine (VX) and its structurally similar major metabolite, O-desmethylvenlafaxine (O-DVX). This is mainly due to the difficulty of identifying an MS-compatible chiral selector that provides both high enantioselectivity and sensitive MS detection. In this work, poly-sodium N-undecenoyl-L,L-leucylalaninate (poly-L,L-SULA) was employed as the chiral selector after screening several dipeptide polymeric chiral surfactants. Baseline separation of both O-DVX and VX enantiomers was achieved in 15 min after optimizing the buffer pH, poly-L,L-SULA concentration, nebulizer pressure and separation voltage. Calibration curves in spiked plasma (recoveries higher than 80%) were linear over the concentration range 150-5000 ng/mL for both VX and O-DVX. The limit of detection (LOD) was found to be as low as 30 ng/mL and 21 ng/mL for O-DVX and VX, respectively. The method was successfully applied to measure plasma concentrations in human volunteers receiving VX or O-DVX orally, co-administered with and without indinavir therapy. The results suggest that micellar electrokinetic chromatography electrospray ionization-tandem mass spectrometry (MEKC-ESI-MS/MS) is an effective, low-cost alternative technique for pharmacokinetic and pharmacodynamic studies of both O-DVX and VX enantiomers, with the potential to identify drug-drug interactions involving VX and O-DVX enantiomers during indinavir therapy.
Abstract:
Genome-wide association studies (GWASs) have identified many genetic variants underlying complex traits. Many detected genetic loci harbor variants that associate with multiple, even distinct, traits. Most current analysis approaches focus on single traits, even though the final results from multiple traits are evaluated together. Such approaches miss the opportunity to systematically integrate the phenome-wide data available for genetic association analysis. In this study, we propose a general approach that can integrate association evidence from summary statistics of multiple traits, whether correlated, independent, continuous, or binary, and whether they come from the same or from different studies. We allow for trait heterogeneity effects, and population structure and cryptic relatedness can also be controlled for. Our simulations suggest that the proposed method has improved statistical power over single-trait analysis in most of the cases we studied. We applied our method to the Continental Origins and Genetic Epidemiology Network (COGENT) African-ancestry samples for three blood pressure traits and identified four loci (CHIC2, HOXA-EVX1, IGFBP1/IGFBP3, and CDH17; p < 5.0 × 10⁻⁸) associated with hypertension-related traits that were missed by single-trait analysis in the original report. Six additional loci with suggestive association evidence (p < 5.0 × 10⁻⁷) were also observed, including CACNA1D and WNT3. Our study strongly suggests that analyzing multiple phenotypes can improve statistical power and that such analysis can be executed with the summary statistics from GWASs. Our method also provides a way to study cross-phenotype (CP) associations using summary statistics from GWASs of multiple phenotypes.
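To make the idea of combining summary statistics concrete, the sketch below implements a classical O'Brien-type weighted sum of per-trait Z-scores that accounts for correlation between traits. It is only an illustration of the general principle under stated assumptions (the published method additionally handles trait heterogeneity and sample overlap in a more refined way), and the function name and example numbers are invented for the demonstration:

    import numpy as np
    from scipy import stats

    def combine_z_scores(z, R):
        """Combine per-trait association Z-scores for one variant into a
        single 1-df chi-square test, allowing for correlation among traits.

        z : sequence of Z-scores, one per trait, for the same variant
        R : trait-by-trait correlation matrix of the Z-scores (typically
            estimated from genome-wide, presumably null, SNPs)
        """
        z = np.asarray(z, dtype=float)
        R = np.asarray(R, dtype=float)
        ones = np.ones_like(z)
        R_inv = np.linalg.inv(R)
        # Weighted sum of Z-scores, normalized by its variance under the null
        chi2_stat = float(ones @ R_inv @ z) ** 2 / float(ones @ R_inv @ ones)
        p_value = stats.chi2.sf(chi2_stat, df=1)
        return chi2_stat, p_value

    # Hypothetical example: three correlated blood-pressure-related traits
    z = [2.1, 2.8, 1.9]
    R = [[1.0, 0.4, 0.3],
         [0.4, 1.0, 0.5],
         [0.3, 0.5, 1.0]]
    print(combine_z_scores(z, R))

In practice the trait correlation matrix R is usually estimated from the Z-scores of a large set of presumably null variants, which also absorbs correlation induced by overlapping samples across studies.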
Abstract:
Intravoxel incoherent motion (IVIM) MRI is a method to extract microvascular blood flow information from diffusion-weighted images acquired at multiple b-values. We hypothesized that IVIM can identify the muscles selectively involved in a specific task by measuring changes in activity-induced local muscular perfusion after exercise. We tested this hypothesis using a widely used clinical maneuver, the lift-off test, which is known to specifically assess the functional integrity of the subscapularis muscle. Twelve shoulders from six healthy male volunteers were imaged at 3 T, at rest and after a lift-off test held against resistance for 30 s, 1 min and 2 min respectively, in three independent sessions. IVIM parameters, consisting of the perfusion fraction (f), the diffusion coefficient (D), the pseudo-diffusion coefficient (D*) and the blood flow-related product fD*, were estimated within the outlined muscles of the rotator cuff and the deltoid bundles. The mean values at rest and after the lift-off tests were compared in each muscle using a one-way ANOVA. A statistically significant increase in fD*, as well as in D, was measured in the subscapularis after a lift-off test of any duration. The fD* increase was the most marked (30 s, +103%; 1 min, +130%; 2 min, +156%) and was gradual with the duration of the test (in 10⁻³ mm²/s: rest, 1.41 ± 0.50; 30 s, 2.86 ± 1.17; 1 min, 3.23 ± 1.22; 2 min, 3.60 ± 1.21). A significant increase in fD* and D was also visible in the posterior bundle of the deltoid, whereas no significant change was consistently visible in the other investigated muscles of the rotator cuff or the other bundles of the deltoid. In conclusion, IVIM fD* allows the demonstration of a task-related microvascular perfusion increase after a specific task and suggests a direct relationship between microvascular perfusion and the duration of the effort. It is a promising method for investigating skeletal muscle physiology and clinical perfusion-related muscular disorders non-invasively.
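As background, the IVIM parameters mentioned above are conventionally obtained by fitting the bi-exponential Le Bihan model to the diffusion-weighted signal (a standard relation, not a formula given in the abstract):

\[ \frac{S(b)}{S_0} = f\,e^{-b D^{*}} + (1 - f)\,e^{-b D}, \]

where S(b) is the signal at diffusion weighting b, S_0 the signal without diffusion weighting, f the perfusion fraction, D* the pseudo-diffusion coefficient of the microvascular compartment and D the tissue diffusion coefficient; the product fD* is the blood-flow-related quantity reported in the study.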