71 results for Current Density Mapping Method

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Carotid artery stenosis is associated with the occurrence of acute and chronic ischemic lesions that increase with age in the elderly population. Diffusion imaging with apparent diffusion coefficient (ADC) mapping may be an appropriate method to investigate patients with chronic hypoperfusion secondary to carotid stenosis. This non-invasive technique makes it possible to investigate brain integrity and structure, in particular hypoperfusion induced by carotid stenosis. The aim of this study was to evaluate the impact of carotid stenosis on the parenchyma using ADC mapping. METHODS: Fifty-nine patients with symptomatic (33) and asymptomatic (26) carotid stenosis were recruited from our multidisciplinary consultation. Both groups demonstrated a similar degree of stenosis. All patients underwent MRI of the brain, including diffusion-weighted imaging with ADC mapping. Regions of interest were defined in the anterior and posterior paraventricular regions, both ipsilateral and contralateral to the stenosis (anterior circulation). The same analysis was performed for the thalamic and occipital regions (posterior circulation). RESULTS: ADC values of the affected vascular territory were significantly higher on the side of the stenosis in the periventricular anterior (P<0.001) and posterior (P<0.01) areas. There was no difference between ipsilateral and contralateral ADC values in the thalamic and occipital regions. CONCLUSIONS: We have shown that carotid stenosis is associated with significantly higher ADC values in the anterior circulation, probably reflecting an impact of chronic hypoperfusion on the brain parenchyma in both symptomatic and asymptomatic patients. This is consistent with previous data in the literature.
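For illustration, the mono-exponential signal model behind ADC mapping can be sketched as follows. This is a generic two-b-value computation, not the study's actual processing pipeline; the function and variable names are hypothetical.

```python
import numpy as np

def adc_map(s0, sb, b=1000.0, eps=1e-12):
    """Apparent diffusion coefficient from two diffusion weightings.

    s0 : signal (scalar or array) at b = 0 s/mm^2
    sb : signal at b-value `b` (in s/mm^2)
    Uses the mono-exponential model
        S_b = S_0 * exp(-b * ADC)  =>  ADC = ln(S_0 / S_b) / b
    and returns ADC in mm^2/s.
    """
    s0 = np.asarray(s0, dtype=float)
    sb = np.asarray(sb, dtype=float)
    # clamp to avoid log(0) in background voxels
    return np.log(np.maximum(s0, eps) / np.maximum(sb, eps)) / b

def roi_mean(adc, mask):
    """Mean ADC inside a boolean region-of-interest mask."""
    return float(adc[mask].mean())
```

Ipsilateral vs. contralateral comparisons as in the study would then reduce to `roi_mean` over the corresponding masks.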


RATIONALE, AIMS AND OBJECTIVES: There is little evidence regarding the benefit of stress ulcer prophylaxis (SUP) outside a critical care setting, and overprescription of SUP is not devoid of risks. This prospective study aimed to evaluate the use of proton pump inhibitors (PPIs) for SUP in a general surgery department. METHOD: Data were collected prospectively by pharmacists during an 8-week period on patients hospitalized in a general surgery department (58 beds). Patients with a PPI prescription for the treatment of ulcers, gastro-oesophageal reflux disease, oesophagitis or epigastric pain were excluded. Patients admitted twice during the study period were included only once. The American Society of Health-System Pharmacists guidelines on SUP were used to assess the appropriateness of de novo PPI prescriptions. RESULTS: Among the 255 patients in the study, 138 (54%) received prophylaxis with a PPI, of which 86 (62%) were de novo PPI prescriptions. A total of 129 patients (94%) received esomeprazole (in accordance with the hospital drug policy). The most frequent dosage was 40 mg once daily. Use of PPIs for SUP was evaluated in 67 patients. A total of 53 patients (79%) had no risk factors for SUP; twelve and two patients had one or two risk factors, respectively. At discharge, PPI prophylaxis was continued in 33% of patients with a de novo PPI prescription. CONCLUSIONS: This study highlights the overuse of PPIs in non-intensive care unit patients and the inappropriate continuation of PPI prescriptions at discharge. Treatment recommendations for SUP are needed to restrict PPI use to justified indications.


Context: In the milder form of primary hyperparathyroidism (PHPT), cancellous bone, represented by areal bone mineral density at the lumbar spine on dual-energy x-ray absorptiometry (DXA), is preserved. This finding is in contrast to high-resolution peripheral quantitative computed tomography (HRpQCT) results showing abnormal trabecular microstructure and to epidemiological evidence of increased overall fracture risk in PHPT. Because DXA does not directly measure trabecular bone and HRpQCT is not widely available, we used the trabecular bone score (TBS), a novel gray-level textural analysis applied to spine DXA images, to indirectly estimate trabecular microarchitecture. Objective: The purpose of this study was to assess TBS from spine DXA images in relation to HRpQCT indices and bone stiffness in the radius and tibia in PHPT. Design and Setting: This was a cross-sectional study conducted in a referral center. Patients: Participants were 22 postmenopausal women with PHPT. Main Outcome Measures: Outcomes measured were areal bone mineral density by DXA, TBS indices derived from DXA images, standard HRpQCT measures, and bone stiffness assessed by finite element analysis at the distal radius and tibia. Results: TBS in PHPT was low at 1.24, representing abnormal trabecular microstructure (normal ≥1.35). TBS was correlated with whole-bone stiffness and all HRpQCT indices except trabecular thickness and trabecular stiffness at the radius. At the tibia, correlations were observed between TBS and volumetric densities, cortical thickness, trabecular bone volume, and whole-bone stiffness. TBS correlated with all indices of trabecular microarchitecture except trabecular thickness after adjustment for body weight. Conclusion: TBS, a measurement technology readily available by DXA, shows promise in the clinical assessment of trabecular microstructure in PHPT.


β-blockers and β-agonists are primarily used to treat cardiovascular diseases. Inter-individual variability in response to both drug classes is well recognized, yet the identity and relative contribution of the genetic players involved are poorly understood. This work is the first genome-wide association study (GWAS) addressing both the baseline values of cardiovascular-related traits and their susceptibility to a selective β1-blocker, atenolol (ate), and a β-agonist, isoproterenol (iso). The phenotypic dataset consisted of 27 highly heritable traits, each measured across 22 inbred mouse strains and four pharmacological conditions. The genotypic panel comprised 79,922 informative SNPs from the mouse HapMap resource. Associations were mapped by Efficient Mixed Model Association (EMMA), a method that corrects for the population structure and genetic relatedness of the various strains. A total of 205 separate genome-wide scans were analyzed. The most significant hits include three candidate loci related to cardiac and body weight, three loci for electrocardiographic (ECG) values, two loci for the susceptibility of atrial weight index to iso, four loci for the susceptibility of systolic blood pressure (SBP) to perturbations of the β-adrenergic system, and one locus for the responsiveness of QTc (P < 10^-8). An additional 60 loci were suggestive for one or another of the 27 traits, while 46 others were suggestive for one or the other drug effect (P < 10^-6). Most hits tagged unexpected regions, yet at least two loci for the susceptibility of SBP to β-adrenergic drugs pointed at members of the hypothalamic-pituitary-thyroid axis. Loci for cardiac-related traits were preferentially enriched in genes expressed in the heart, and 23% of the testable loci were replicated with datasets of the Mouse Phenome Database (MPD). Altogether, these data and validation tests indicate that the mapped loci are relevant to the traits and responses studied.
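EMMA's key idea is that strain relatedness enters the association model as a kinship-derived covariance of the phenotype. As a hedged sketch of that idea, the snippet below runs a single-SNP generalized least squares test under a fixed, assumed heritability; the real EMMA estimates the variance components by (restricted) maximum likelihood, so the `h2` parameter and function name here are simplifying assumptions, not the published algorithm.

```python
import numpy as np
from scipy import stats

def gls_assoc(y, snp, K, h2=0.5):
    """One-SNP mixed-model association test (EMMA-style sketch).

    y   : phenotype vector, shape (n,)
    snp : genotype vector, shape (n,), e.g. 0/1 for inbred strains
    K   : (n, n) kinship matrix capturing strain relatedness
    h2  : assumed heritability fixing the covariance
          V = h2*K + (1-h2)*I (EMMA estimates this by REML)
    Returns (beta, p_value) for the SNP effect under
    y ~ N(X @ beta, V).
    """
    n = len(y)
    V = h2 * np.asarray(K) + (1.0 - h2) * np.eye(n)
    Vi = np.linalg.inv(V)
    X = np.column_stack([np.ones(n), snp])        # intercept + SNP
    XtVi = X.T @ Vi
    beta = np.linalg.solve(XtVi @ X, XtVi @ y)    # GLS estimate
    resid = y - X @ beta
    sigma2 = float(resid @ Vi @ resid) / (n - 2)
    se = np.sqrt(sigma2 * np.linalg.inv(XtVi @ X)[1, 1])
    t = beta[1] / se
    p = 2.0 * stats.t.sf(abs(t), df=n - 2)
    return float(beta[1]), float(p)
```

A genome-wide scan would simply loop this test over all SNPs and compare each p-value against the thresholds quoted above.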


Electrical Impedance Tomography (EIT) is an imaging method that enables a volume conductivity map of a subject to be produced from multiple impedance measurements. It has the potential to become a portable non-invasive imaging technique of particular use in imaging brain function. Accurate numerical forward models may be used to improve image reconstruction but, until now, have employed an assumption of isotropic tissue conductivity. This may be expected to introduce inaccuracy, as body tissues, especially those such as white matter and the skull in head imaging, are highly anisotropic. The purpose of this study was, for the first time, to develop a method for incorporating anisotropy into a forward numerical model for EIT of the head and to assess the resulting improvement in image quality in the case of linear reconstruction for one example of the human head. A realistic Finite Element Model (FEM) of an adult human head, with segments for the scalp, skull, CSF, and brain, was produced from a structural MRI. Anisotropy of the brain was estimated from a diffusion tensor MRI of the same subject, and anisotropy of the skull was approximated from the structural information. A method for incorporating anisotropy into the forward model and using it in image reconstruction was developed. The improvement in reconstructed image quality was assessed in computer simulation by generating forward data and then performing linear reconstruction using a sensitivity matrix approach. The mean boundary data difference between the anisotropic and isotropic forward models for a reference conductivity was 50%. Use of the correct anisotropic FEM in image reconstruction, as opposed to an isotropic one, corrected an error of 24 mm in imaging a 10% conductivity decrease located in the hippocampus, improved localisation by 4-17 mm for conductivity changes deep in the brain and those due to epilepsy, and, overall, led to a substantial improvement in image quality. This suggests that incorporating anisotropy into the numerical models used for image reconstruction is likely to improve EIT image quality.
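As a hedged illustration of the sensitivity-matrix step described above, a one-step linearised reconstruction might look as follows. The Jacobian would come from whichever forward model (isotropic or anisotropic FEM) is used; the zeroth-order Tikhonov regularisation, function name and parameter are assumptions for the sketch, not the study's exact scheme.

```python
import numpy as np

def linear_eit_reconstruction(J, dv, lam=1e-3):
    """One-step linear EIT reconstruction (sketch).

    J   : (m, k) sensitivity (Jacobian) matrix mapping conductivity
          perturbations in k mesh elements to m boundary measurements
    dv  : (m,) measured boundary-voltage difference data
    lam : Tikhonov regularisation parameter
    Solves dv ~ J @ dsigma for the conductivity change dsigma via
    zeroth-order Tikhonov regularisation:
        dsigma = (J^T J + lam * I)^(-1) J^T dv
    """
    k = J.shape[1]
    return np.linalg.solve(J.T @ J + lam * np.eye(k), J.T @ dv)
```

The paper's comparison amounts to feeding `dv` simulated with an anisotropic forward model into reconstructions whose `J` was built with either the correct anisotropic FEM or an isotropic one.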


BACKGROUND: Used in conjunction with biological surveillance, behavioural surveillance provides data allowing for a more precise definition of HIV/STI prevention strategies. In 2008, a mapping of behavioural surveillance in EU/EFTA countries was performed on behalf of the European Centre for Disease Prevention and Control. METHOD: Nine questionnaires were sent to all 31 EU Member States and EEA/EFTA countries requesting data on the overall behavioural and second-generation surveillance system and on surveillance in the general population, youth, men who have sex with men (MSM), injecting drug users (IDU), sex workers (SW), migrants, people living with HIV/AIDS (PLWHA), and patients of sexually transmitted infection (STI) clinics. Requested data included information on system organisation (e.g. sustainability, funding, institutionalisation), topics covered in surveys, and main indicators. RESULTS: Twenty-eight of the 31 countries contacted supplied data. Sixteen countries reported an established behavioural surveillance system, and 13 a second-generation surveillance system (a combination of biological surveillance of HIV/AIDS and STI with behavioural surveillance). There were wide differences as regards the year of survey initiation, the number of populations surveyed, the data collection methods used, and the organisation of surveillance and its coordination with biological surveillance. The populations most regularly surveyed are the general population, youth, MSM and IDU. SW, patients of STI clinics and PLWHA are surveyed less regularly and in only a small number of countries, and few countries have undertaken behavioural surveys among migrant or ethnic minority populations. In many cases, the identification of populations with risk behaviour and the selection of populations to be included in a behavioural surveillance system have not been formally conducted, or are incomplete. The topics most frequently covered are similar across countries, although many different indicators are used. In most countries, the sustainability of surveillance systems is not assured. CONCLUSION: Although many European countries have established behavioural surveillance systems, there is little harmonisation as regards the methods and indicators adopted. The main challenge now is to build and maintain organised and functional behavioural and second-generation surveillance systems across Europe, to increase collaboration, to promote robust, sustainable and cost-effective data collection methods, and to harmonise indicators.


Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, overabundant or potentially expanding populations may also need management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop predictive models of large-population dynamics, as well as software tools to estimate the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has expanded markedly in Switzerland since its reintroduction at the beginning of the 20th century, served as the example species.

This task was achieved in three steps. First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix to which density dependence, environmental stochasticity and regulation culling were added. This model was implemented in a management-support software package, named SIM-Ibex, allowing census data maintenance, automated parameter estimation, and the tuning and simulation of culling strategies.

However, population dynamics are driven not only by demographic factors but also by dispersal and the colonisation of new areas; habitat suitability and obstacles to dispersal therefore also had to be modelled. A software package named Biomapper was thus developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute marginality and specialisation factors of the ecological niche from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, and validation and further processing of results; one module also allows mapping of dispersal barriers and corridors. The application domain of ENFA was explored by means of a virtual species distribution. A comparison with a commonly used habitat suitability method, the Generalised Linear Model (GLM), showed that ENFA was particularly well suited to cryptic or expanding species.

Demographic and landscape information were finally merged into a global model. To cope with the realism of landscape modelling and the technical constraints of large populations, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, population density. The latter varies according to local reproduction and survival as well as dispersal, under the influence of density dependence and stochasticity. A software package named HexaSpace was developed to fulfil two functions: (1) calibrating the automaton on the basis of population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper); and (2) running simulations. It makes it possible to study the spread of an invading species across a complex landscape composed of areas of varying suitability and dispersal obstacles. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland).

SIM-Ibex is currently used by wildlife managers and government inspectors to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Likewise, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to plant propagation or the dispersal of flying animals. As these software packages were designed to build a complex, realistic model from raw data, and as they feature an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, these approaches can also be applied to theoretical questions in the fields of population and landscape ecology.
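The density-dependent, stochastic Leslie-matrix projection underlying a model like SIM-Ibex can be sketched as follows. Damping the fecundities by a factor (1 - N/K) is one simple form of density dependence chosen for illustration, and the function and parameter names are hypothetical; the actual SIM-Ibex formulation may differ.

```python
import numpy as np

def project(n0, L, K, years, sigma=0.0, rng=None):
    """Density-dependent, stochastic Leslie-matrix projection (sketch).

    n0    : initial abundance per age class
    L     : Leslie matrix (fecundities in the first row, survival
            rates on the sub-diagonal)
    K     : carrying capacity; density dependence damps fecundities
            by a factor (1 - N/K), one simple choice among many
    sigma : lognormal environmental noise applied to the whole matrix
    Returns the trajectory of total population size, length years+1.
    """
    rng = rng or np.random.default_rng()
    n = np.asarray(n0, dtype=float)
    traj = [n.sum()]
    for _ in range(years):
        M = L.astype(float)                       # fresh copy each year
        M[0, :] *= max(0.0, 1.0 - n.sum() / K)    # density dependence
        if sigma > 0.0:
            M = M * rng.lognormal(0.0, sigma, M.shape)  # stochasticity
        n = M @ n                                  # one-year projection
        traj.append(n.sum())
    return np.array(traj)
```

Regulation culling could be added as a further step inside the loop (e.g. subtracting a harvest vector after the projection), which is the kind of scenario such a tool is built to simulate.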


PURPOSE: All methods presented to date to map both conductivity and permittivity rely on multiple acquisitions to compute quantitatively the magnitude of the radiofrequency transmit field, B1+. In this work, we propose a method to compute both conductivity and permittivity based solely on relative receive coil sensitivities (B1-) that can be obtained in a single measurement, without the need to explicitly perform transmit/receive phase separation or to make assumptions regarding those phases. THEORY AND METHODS: To demonstrate the validity and the noise sensitivity of our method, we used electromagnetic finite-difference simulations of a 16-channel transceiver array. To experimentally validate our methodology at 7 Tesla, multi-compartment phantom data were acquired using a standard 32-channel receive coil system and two-dimensional (2D) and 3D gradient echo acquisitions. The reconstructed electric properties were correlated with those measured using dielectric probes. RESULTS: The method was demonstrated both in simulations and in phantom data, with correlations to both the modeled and bench measurements being close to identity. The noise properties were modeled and understood. CONCLUSION: The proposed methodology makes it possible to quantitatively determine the electrical properties of a sample using any MR contrast, the only constraints being the need for 4 or more receive coils and high SNR. Magn Reson Med, 2014. © 2014 Wiley Periodicals, Inc.
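For context, the textbook Helmholtz-based formulation that electrical properties tomography builds on can be sketched from a complex B1 map as below. Note that this is background only: the paper's contribution is to work from relative receive sensitivities instead, and the function names here are hypothetical.

```python
import numpy as np

MU0 = 4e-7 * np.pi          # vacuum permeability (H/m)
EPS0 = 8.8541878128e-12     # vacuum permittivity (F/m)

def helmholtz_ept(b1, dx, larmor_hz):
    """Conductivity and permittivity from a complex B1 map (sketch).

    Standard Helmholtz-based EPT assuming locally homogeneous
    properties:
        lap(B1)/B1 = -mu0*eps*w^2 + i*mu0*sigma*w
    b1        : complex 3D B1 field on a uniform grid of spacing dx (m)
    larmor_hz : Larmor frequency in Hz (about 298e6 at 7 T)
    Returns (sigma in S/m, relative permittivity) maps.
    """
    w = 2.0 * np.pi * larmor_hz
    # discrete Laplacian via two applications of central differences
    lap = sum(np.gradient(np.gradient(b1, dx, axis=a), dx, axis=a)
              for a in range(3))
    ratio = lap / b1
    sigma = ratio.imag / (MU0 * w)
    eps_r = -ratio.real / (MU0 * EPS0 * w ** 2)
    return sigma, eps_r
```

The Laplacian makes this formulation noise-sensitive, which is one reason the noise analysis reported in the abstract matters.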


PURPOSE: According to estimates, around 230 people die as a result of radon exposure in Switzerland. This public health concern makes reliable indoor radon prediction and mapping methods necessary in order to improve risk communication to the public. The aim of this study was to develop an automated method to classify lithological units according to their radon characteristics, and to develop mapping and predictive tools in order to improve local radon prediction. METHOD: About 240,000 indoor radon concentration (IRC) measurements in about 150,000 buildings were available for our analysis. The automated classification of lithological units was based on k-medoids clustering via pairwise Kolmogorov distances between the IRC distributions of lithological units. For IRC mapping and prediction we used random forests and Bayesian additive regression trees (BART). RESULTS: The automated classification groups lithological units well in terms of their IRC characteristics. In particular, IRC differences in metamorphic rocks such as gneiss are well revealed by this method. The maps produced by random forests soundly represent the regional differences in IRCs in Switzerland and improve the spatial detail compared with existing approaches. We could explain 33% of the variation in IRC data with random forests. Additionally, variable importance evaluated with random forests shows that building characteristics are less important predictors of IRCs than spatial and geological influences. BART could explain 29% of IRC variability and produced maps that indicate the prediction uncertainty. CONCLUSION: Ensemble regression trees are a powerful tool to model and understand the multidimensional influences on IRCs. Automatic clustering of lithological units complements this method by facilitating the interpretation of the radon properties of rock types. This study provides an important element for radon risk communication. Future approaches should consider further variables, such as soil gas radon measurements, as well as more detailed geological information.
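The clustering step described above can be sketched as k-medoids over a matrix of pairwise Kolmogorov-Smirnov distances between IRC samples. This plain PAM-style implementation and the toy data are illustrative assumptions, not the study's code.

```python
import numpy as np
from scipy import stats

def ks_distance_matrix(samples):
    """Pairwise Kolmogorov-Smirnov distances between measurement samples."""
    n = len(samples)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = stats.ks_2samp(samples[i], samples[j]).statistic
    return D

def k_medoids(D, k, n_iter=50, seed=0):
    """Plain PAM-style k-medoids on a precomputed distance matrix (sketch)."""
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    medoids = rng.choice(n, size=k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)   # assign to nearest medoid
        new_medoids = medoids.copy()
        for m in range(k):
            members = np.flatnonzero(labels == m)
            if members.size:
                # new medoid minimises total within-cluster distance
                within = D[np.ix_(members, members)].sum(axis=1)
                new_medoids[m] = members[np.argmin(within)]
        if np.array_equal(np.sort(new_medoids), np.sort(medoids)):
            break
        medoids = new_medoids
    labels = np.argmin(D[:, medoids], axis=1)
    return medoids, labels
```

Unlike k-means, k-medoids needs only the distance matrix, which is what makes a distribution-level metric such as the Kolmogorov distance usable here.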


Extracellular vesicles represent a rich source of novel biomarkers for the diagnosis and prognosis of disease. However, there is currently limited information elucidating the most efficient methods for obtaining high yields of pure exosomes, a subset of extracellular vesicles, from cell culture supernatant and complex biological fluids such as plasma. To this end, we comprehensively characterize a variety of exosome isolation protocols for their efficiency, yield and purity of isolated exosomes. Repeated ultracentrifugation steps can reduce the quality of exosome preparations, leading to lower exosome yield. We show that concentration of cell culture conditioned media using ultrafiltration devices results in increased vesicle isolation when compared with traditional ultracentrifugation protocols. However, our data using conditioned media from the non-small-cell lung cancer (NSCLC) SK-MES-1 cell line demonstrate that the choice of concentrating device can greatly impact the yield of isolated exosomes. We find that centrifuge-based concentrating methods are more appropriate than pressure-driven concentrating devices and allow the rapid isolation of exosomes from both NSCLC cell culture conditioned media and complex biological fluids. In fact, to date, no protocol detailing exosome isolation utilizing current commercial methods from both cells and patient samples has been described. Utilizing tunable resistive pulse sensing and protein analysis, we provide a comparative analysis of four exosome isolation techniques, indicating their efficacy and preparation purity. Our results demonstrate that current precipitation protocols for the isolation of exosomes from cell culture conditioned media and plasma provide the least pure preparations of exosomes, whereas size exclusion isolation is comparable to density gradient purification of exosomes. We have identified current shortcomings in common extracellular vesicle isolation methods and provide a potential standardized method that is effective, reproducible and can be utilized for various starting materials. We believe this method will have extensive application in the growing field of extracellular vesicle research.


This study analyzed high-density event-related potentials (ERPs) within an electrical neuroimaging framework to provide insights regarding the interaction between multisensory processes and stimulus probabilities. Specifically, we identified the spatiotemporal brain mechanisms by which the proportion of temporally congruent and task-irrelevant auditory information influences stimulus processing during a visual duration discrimination task. The spatial position (top/bottom) of the visual stimulus was indicative of how frequently the visual and auditory stimuli would be congruent in their duration (i.e., context of congruence). Stronger influences of irrelevant sound were observed when contexts associated with a high proportion of auditory-visual congruence repeated and also when contexts associated with a low proportion of congruence switched. Context of congruence and context transition resulted in weaker brain responses at 228 to 257 ms poststimulus to conditions giving rise to larger behavioral cross-modal interactions. Importantly, a control oddball task revealed that both congruent and incongruent audiovisual stimuli triggered equivalent non-linear multisensory interactions when congruence was not a relevant dimension. Collectively, these results are well explained by statistical learning, which links a particular context (here: a spatial location) with a certain level of top-down attentional control that further modulates cross-modal interactions based on whether a particular context repeated or changed. The current findings shed new light on the importance of context-based control over multisensory processing, whose influences multiplex across finer and broader time scales.