68 results for Minkowski metrics


Relevance: 10.00%

Abstract:

Functional connectivity in the human brain can be represented as a network using electroencephalography (EEG) signals. These networks, whose nodes can number from tens to hundreds, are characterized by neurobiologically meaningful graph-theory metrics. This study investigates the degree to which various graph metrics depend upon network size. To this end, EEGs from 32 normal subjects were recorded and functional networks of three different sizes were extracted. A state-space-based method was used to calculate cross-correlation matrices between different brain regions. These correlation matrices were used to construct binary adjacency connectomes, which were assessed with respect to a number of graph metrics such as clustering coefficient, modularity, efficiency, economic efficiency, and assortativity. We showed that the estimates of these metrics differ significantly depending on network size. Larger networks had higher efficiency, higher assortativity and lower modularity than smaller networks of the same density. These findings indicate that network size should be considered in any comparison of networks across studies.
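
A rough illustration of the size dependence described above (not the authors' EEG pipeline): the sketch below, assuming networkx, computes the same metrics on random graphs of three sizes at one fixed density; the node counts and the density are arbitrary placeholders.

    import networkx as nx

    def summarize(n_nodes, density, seed=0):
        """Graph metrics for an Erdos-Renyi random network of a given size and density."""
        G = nx.gnp_random_graph(n_nodes, density, seed=seed)
        communities = nx.algorithms.community.greedy_modularity_communities(G)
        return {
            "clustering": nx.average_clustering(G),
            "modularity": nx.algorithms.community.modularity(G, communities),
            "efficiency": nx.global_efficiency(G),
            "assortativity": nx.degree_assortativity_coefficient(G),
        }

    # same density, three network sizes
    for n in (32, 64, 128):
        print(n, summarize(n, density=0.2))

Even on purely random graphs, several of these metrics shift with the number of nodes, which is the comparability problem raised above for real connectomes.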

Relevance: 10.00%

Abstract:

Multisensory interactions are a fundamental feature of brain organization. Principles governing multisensory processing have been established by varying stimulus location, timing and efficacy independently. Determining whether and how such principles operate when stimuli vary dynamically in their perceived distance (as when looming/receding) provides an assay of the synergy among the above principles and also a means of linking multisensory interactions between rudimentary stimuli with the higher-order signals used for communication and motor planning. Human participants indicated movement of looming or receding versus static stimuli that were visual, auditory, or multisensory combinations while 160-channel EEG was recorded. Multivariate EEG analyses and distributed source estimations were performed. Nonlinear interactions between looming signals were observed at early poststimulus latencies (∼75 ms) in analyses of voltage waveforms, global field power, and source estimations. These looming-specific interactions correlated positively with reaction-time facilitation, providing a direct link between neural and performance metrics of multisensory integration. Statistical analyses of source estimations identified looming-specific interactions within the right claustrum/insula, extending inferiorly into the amygdala, and within the bilateral cuneus, extending into the inferior and lateral occipital cortices. Multisensory effects common to all conditions, regardless of perceived distance and congruity, followed (∼115 ms) and manifested as a faster transition between temporally stable brain networks (vs. summed responses to unisensory conditions). We demonstrate an early-latency, synergistic interplay between existing principles of multisensory interactions. Such findings change the manner in which multisensory interactions should be modeled at neural and behavioral/perceptual levels. We also provide neurophysiologic support for the notion that looming signals receive preferential treatment during perception.
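
A hedged toy sketch (not the authors' 160-channel pipeline) of the core contrast in such analyses, comparing the multisensory response with the sum of the unisensory responses; global field power is taken as the spatial standard deviation across channels, and the amplitudes, latency and scalp topography below are made up.

    import numpy as np

    rng = np.random.default_rng(3)
    n_ch, n_t = 160, 300                         # channels x time samples
    t = np.arange(n_t)
    topo = rng.normal(0.0, 1.0, size=(n_ch, 1))  # fixed, arbitrary scalp topography

    def evoked(amplitude, latency):
        """Toy evoked response: a Gaussian bump projected onto the topography, plus noise."""
        bump = amplitude * np.exp(-((t - latency) ** 2) / (2 * 15.0 ** 2))
        return topo * bump + rng.normal(0.0, 0.2, size=(n_ch, n_t))

    A = evoked(1.0, 100)             # auditory alone
    V = evoked(1.0, 100)             # visual alone
    AV = evoked(2.5, 100)            # multisensory, super-additive by construction

    gfp = lambda x: x.std(axis=0)    # global field power at each time point
    interaction = gfp(AV) - gfp(A + V)   # "pair" (AV) vs "sum" (A+V) model
    peak = int(np.argmax(interaction))
    print(f"largest nonlinear interaction at sample {peak}: {interaction[peak]:.2f}")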

Relevance: 10.00%

Abstract:

BACKGROUND: Modern theories define chronic pain as a multidimensional experience: the result of a complex interplay between physiological and psychological factors with a significant impact on patients' physical, emotional and social functioning. The development of reliable assessment tools capable of capturing the multidimensional impact of chronic pain has challenged the medical community for decades. A number of validated tools are currently used in clinical practice; however, they all rely on self-reporting and are therefore inherently subjective. In this study we show that a comprehensive analysis of physical activity (PA) under real-life conditions may capture behavioral aspects that reflect physical and emotional functioning. METHODOLOGY: PA was monitored during five consecutive days in 60 chronic pain patients and 15 pain-free healthy subjects. To analyze the various aspects of pain-related activity behaviors we defined the concept of PA 'barcoding'. The main idea is to combine different features of PA (type, intensity, duration) to define various PA states. The temporal sequence of these states is visualized as a 'barcode', indicating that significant information about daily activity is contained in the number and variety of PA states and in the temporal structure of the sequence. This information was quantified using complementary measures such as structural complexity metrics (information and sample entropy, Lempel-Ziv complexity), time spent in PA states, and two composite scores that integrate all measures. The reliability of these measures for characterizing chronic pain conditions was assessed by comparing groups of subjects with clinically different pain intensity. CONCLUSION: The defined measures of PA showed good discriminative features. The results suggest that significant information about pain-related functional limitations is captured by the structural complexity of PA barcodes, which decreases as the intensity of pain increases. We conclude that a comprehensive analysis of daily-life PA can provide an objective appraisal of the intensity of pain.
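
One of the structural complexity measures named above, Lempel-Ziv complexity, can be computed on a symbolic activity "barcode" as in the sketch below: a generic LZ76 phrase count applied to made-up state sequences, not the authors' exact scoring.

    import numpy as np

    def lz76_complexity(symbols):
        """Number of new phrases in a Lempel-Ziv (1976) parsing of a symbol sequence."""
        s = ''.join(str(x) for x in symbols)
        phrases, i, n = 0, 0, len(s)
        while i < n:
            length = 1
            # grow the current phrase while it has already appeared in the preceding text
            while i + length <= n and s[i:i + length] in s[:i + length - 1]:
                length += 1
            phrases += 1
            i += length
        return phrases

    rng = np.random.default_rng(4)
    # hypothetical one-symbol-per-epoch barcodes: 0 = rest, 1 = light, 2 = moderate, 3 = vigorous
    structured = np.repeat([0, 1, 2, 1, 0, 3], 50)           # long, regular activity bouts
    fragmented = rng.integers(0, 4, size=structured.size)    # erratic switching between states
    print("complexity (structured):", lz76_complexity(structured))
    print("complexity (fragmented):", lz76_complexity(fragmented))

The regular sequence yields far fewer phrases than the erratic one, illustrating how the metric separates more and less varied activity patterns.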

Relevance: 10.00%

Abstract:

Static incubation tests, in which microcapsules and beads are contacted with polymer and protein solutions, have been developed for the characterization of permselective materials used in bioartificial organs and drug delivery. A combination of polymer ingress, detected by size-exclusion chromatography, and protein ingress/egress, assessed by gel electrophoresis, provides information on the diffusion kinetics, molar mass cutoff (MMCO) and permeability. This represents an improvement over existing permeability measurements that are based on the diffusion of a single type of solute. Specifically, the permeability of capsules based on alginate, cellulose sulfate and polymethylene-co-guanidine was characterized as a function of membrane thickness. Solid alginate beads were also evaluated. The MMCO of these capsules was estimated to be between 80 and 90 kDa using polymers, and between 116 and 150 kDa using proteins. Apparently, the globular shape of the proteins (radius of gyration, Rg, of 4.2-4.6 nm) facilitates their passage through the membrane compared with the polysaccharide coil conformation (Rg of 6.5-8.3 nm). Increasing the capsule membrane thickness reduced these values. The MMCO of the beads, which do not have a membrane limiting their permselective properties, was higher: between 110 and 200 kDa with dextrans, and between 150 and 220 kDa with proteins. Therefore, although the permeability estimated with biologically relevant molecules is generally higher owing to their lower radius of gyration, the MMCO values obtained with synthetic and natural water-soluble polymers correlate well and can be used as in vitro metrics for the immune-protection ability of microcapsules and microbeads. This article shows, to the authors' knowledge, the first reported concordance between permeability measures based on model natural and biological macromolecules.

Relevance: 10.00%

Abstract:

From toddler to late teenager, the macroscopic pattern of axonal projections in the human brain remains largely unchanged while undergoing dramatic functional modifications that lead to network refinement. These functional modifications are mediated by increasing myelination and changes in axonal diameter and synaptic density, as well as changes in neurochemical mediators. Here we explore the contribution of white matter maturation to the development of connectivity between ages 2 and 18 y using high-b-value diffusion MRI tractography and connectivity analysis. We measured changes in connection efficacy as the inverse of the average diffusivity along a fiber tract. We observed significant refinement in specific metrics of network topology, including a significant increase in node strength and efficiency along with a decrease in clustering. Major structural modules and hubs were in place by 2 y of age, and they continued to strengthen their profile during subsequent development. Recording resting-state functional MRI from a subset of subjects, we confirmed a positive correlation between structural and functional connectivity, and further observed that this relationship strengthened with age. Continuously increasing integration and decreasing segregation of structural connectivity with age suggest that network refinement mediated by white matter maturation promotes increased global efficiency. In addition, the strengthening of the correlation between structural and functional connectivity with age suggests that white matter connectivity, in combination with other factors such as differential modulation of axonal diameter and myelin thickness that are partially captured by the inverse average diffusivity, plays an increasingly important role in creating brain-wide coherence and synchrony.
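
The connectivity measures named here can be reproduced in outline with networkx, as in the sketch below: a small, made-up matrix of "connection efficacy" values (1 / mean diffusivity along a tract) is turned into a weighted graph, from which node strength, weighted clustering and a weighted global efficiency are computed. The matrix entries are placeholders, and the efficiency definition used (mean inverse shortest-path length with distance = 1/weight) is one common convention, not necessarily the exact one used in the study.

    import itertools
    import numpy as np
    import networkx as nx

    # toy symmetric "connection efficacy" matrix (hypothetical values)
    W = np.array([[0.0, 2.0, 0.5, 0.0],
                  [2.0, 0.0, 1.0, 0.8],
                  [0.5, 1.0, 0.0, 1.5],
                  [0.0, 0.8, 1.5, 0.0]])
    G = nx.from_numpy_array(W)                               # edge attribute 'weight' = efficacy

    strength = dict(G.degree(weight="weight"))               # node strength = sum of incident weights
    clustering = nx.average_clustering(G, weight="weight")   # Onnela-style weighted clustering

    # weighted global efficiency: mean of 1/d over node pairs, with d accumulated over 1/weight
    for u, v, d in G.edges(data=True):
        d["distance"] = 1.0 / d["weight"]
    spl = dict(nx.all_pairs_dijkstra_path_length(G, weight="distance"))
    n = G.number_of_nodes()
    efficiency = sum(1.0 / spl[u][v] for u, v in itertools.permutations(G, 2)) / (n * (n - 1))
    print(strength, round(clustering, 3), round(efficiency, 3))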

Relevance: 10.00%

Abstract:

Recent technological advances in remote sensing have enabled investigation of the morphodynamics and hydrodynamics of large rivers. However, measuring topography and flow in these very large rivers is time consuming and thus often constrains the spatial resolution and reach-length scales that can be monitored. Similar constraints exist for computational fluid dynamics (CFD) studies of large rivers, requiring maximization of mesh- or grid-cell dimensions and implying a reduction in the representation of bedform-roughness elements that are of the order of a model grid cell or less, even if they are represented in the available topographic data. These "subgrid" elements must be parameterized, and this paper applies and considers the impact of roughness-length treatments that include the effect of bed roughness due to "unmeasured" topography. CFD predictions were found to be sensitive to the roughness-length specification. Model optimization was based on acoustic Doppler current profiler measurements and estimates of the water surface slope for a variety of roughness lengths. This proved difficult, as the metrics used to assess optimal model performance diverged owing to the effects of large bedforms that are not well parameterized in roughness-length treatments. However, the general spatial flow patterns are effectively predicted by the model. Changes in roughness length were shown to have a major impact upon flow routing at the channel scale. The results also indicate an absence of secondary flow circulation cells in the reach studied, and suggest that simpler two-dimensional models may have great utility in the investigation of flow within large rivers. Citation: Sandbach, S. D., et al. (2012), Application of a roughness-length representation to parameterize energy loss in 3-D numerical simulations of large rivers, Water Resour. Res., 48, W12501, doi:10.1029/2011WR011284.
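
To see why predictions are sensitive to the roughness length, the standard log-law wall treatment (one common way such a parameter enters a CFD wall function; the paper's exact formulation may differ) can be evaluated for a few values of z0 with hypothetical flow numbers.

    import numpy as np

    kappa = 0.41                      # von Karman constant
    u_star = 0.05                     # shear velocity (m/s), hypothetical
    z = np.linspace(0.5, 10.0, 50)    # heights above the bed (m)

    for z0 in (0.001, 0.01, 0.1):     # roughness lengths (m), hypothetical range
        u = (u_star / kappa) * np.log(z / z0)   # log-law velocity profile
        print(f"z0 = {z0:>5} m  ->  mean velocity over the profile ~ {u.mean():.2f} m/s")

An order-of-magnitude change in z0 shifts the whole profile by roughly (u*/kappa) ln(10), which then propagates into the modelled flow routing.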

Relevance: 10.00%

Abstract:

Network analysis naturally relies on graph theory and, more particularly, on the use of node and edge metrics to identify the salient properties in graphs. When building visual maps of networks, these metrics are turned into useful visual cues or are used interactively to filter out parts of a graph while querying it, for instance. Over the years, analysts from different application domains have designed metrics to serve specific needs. Network science is an inherently cross-disciplinary field, which leads to the publication of metrics with similar goals; different names and descriptions of their analytics often mask the similarity between two metrics that originated in different fields. Here, we study a set of graph metrics and compare their relative values and behaviors in an effort to survey their potential contributions to the spatial analysis of networks.
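
A minimal version of such a comparison, assuming networkx and scipy and using a small benchmark graph rather than the networks studied here, is to compute several node metrics and their pairwise rank correlations.

    import numpy as np
    import networkx as nx
    from scipy.stats import spearmanr

    G = nx.karate_club_graph()   # small benchmark network, stand-in for the graphs under study
    metrics = {
        "degree": dict(G.degree()),
        "betweenness": nx.betweenness_centrality(G),
        "closeness": nx.closeness_centrality(G),
        "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
        "clustering": nx.clustering(G),
    }
    nodes = sorted(G.nodes())
    M = np.array([[metrics[name][n] for n in nodes] for name in metrics])
    rho, _ = spearmanr(M, axis=1)          # pairwise rank correlations between the metrics
    names = list(metrics)
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            print(f"{names[i]:>12} vs {names[j]:<12} rho = {rho[i, j]:+.2f}")

High rank correlations flag metrics that behave near-identically on a given network, which is the kind of redundancy the survey above sets out to map.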

Relevance: 10.00%

Abstract:

Aim: We take a comparative phylogeographical approach to assess whether three species involved in a specialized oil-rewarding pollination system (i.e. Lysimachia vulgaris and two oil-collecting bees within the genus Macropis) show congruent phylogeographical trajectories during post-glacial colonization processes. Our working hypothesis is that within specialized mutualistic interactions, where each species relies on the co-occurrence of the other for survival and/or reproduction, partners are expected to show congruent evolutionary trajectories, because they are likely to have followed parallel migration routes and to have shared glacial refugia. Location: Western Palaearctic. Methods: Our analysis relies on the extensive sampling of 104 Western Palaearctic populations (totalling 434, 159 and 74 specimens of Lysimachia vulgaris, Macropis europaea and Macropis fulvipes, respectively), genotyped with amplified fragment length polymorphism. Based on this, we evaluated the regional genetic diversity (Shannon diversity and allele rarity index) and genetic structure (assessed using structure, population networks, isolation-by-distance and spatial autocorrelation metrics) of each species. Finally, we compared the general phylogeographical patterns obtained. Results: Contrary to our expectations, the analyses revealed phylogeographical signals suggesting that the investigated organisms demonstrate independent post-glacial trajectories as well as distinct contemporaneous demographic parameters, despite their mutualistic interaction. Main conclusions: The mutualistic partners investigated here are likely to be experiencing distinct and independent evolutionary dynamics because of their contrasting life-history traits (e.g. dispersal abilities), as well as distinct hubs and migration routes. Such conditions would prevent and/or erase any signature of co-structuring of lineages in space and time. As a result, the lack of phylogeographical congruence driven by differences in life-history traits might have arisen irrespective of the three species having shared similar Pleistocene glacial refugia.
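
Of the diversity measures listed, the Shannon index is the simplest to illustrate; the sketch below applies one common formulation (H = -sum p ln p over fragment frequencies) to made-up counts for two hypothetical populations, and is not the exact AFLP-based estimator used in the study.

    import numpy as np

    def shannon_diversity(counts):
        """Shannon index H = -sum(p * ln p) over nonzero frequencies."""
        counts = np.asarray(counts, dtype=float)
        p = counts[counts > 0] / counts.sum()
        return float(-(p * np.log(p)).sum())

    pop_a = [12, 7, 3, 1]   # hypothetical fragment counts, population A
    pop_b = [20, 1, 1, 1]   # hypothetical fragment counts, population B
    print(round(shannon_diversity(pop_a), 3), round(shannon_diversity(pop_b), 3))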

Relevance: 10.00%

Abstract:

BACKGROUND: Data for trends in glycaemia and diabetes prevalence are needed to understand the effects of diet and lifestyle within populations, assess the performance of interventions, and plan health services. No consistent and comparable global analysis of trends has been done. We estimated trends and their uncertainties in mean fasting plasma glucose (FPG) and diabetes prevalence for adults aged 25 years and older in 199 countries and territories. METHODS: We obtained data from health examination surveys and epidemiological studies (370 country-years and 2·7 million participants). We converted systematically between different glycaemic metrics. For each sex, we used a Bayesian hierarchical model to estimate mean FPG and its uncertainty by age, country, and year, accounting for whether a study was nationally, subnationally, or community representative. FINDINGS: In 2008, global age-standardised mean FPG was 5·50 mmol/L (95% uncertainty interval 5·37-5·63) for men and 5·42 mmol/L (5·29-5·54) for women, having risen by 0·07 mmol/L and 0·09 mmol/L per decade, respectively. Age-standardised adult diabetes prevalence was 9·8% (8·6-11·2) in men and 9·2% (8·0-10·5) in women in 2008, up from 8·3% (6·5-10·4) and 7·5% (5·8-9·6) in 1980. The number of people with diabetes increased from 153 (127-182) million in 1980, to 347 (314-382) million in 2008. We recorded almost no change in mean FPG in east and southeast Asia and central and eastern Europe. Oceania had the largest rise, and the highest mean FPG (6·09 mmol/L, 5·73-6·49 for men; 6·08 mmol/L, 5·72-6·46 for women) and diabetes prevalence (15·5%, 11·6-20·1 for men; and 15·9%, 12·1-20·5 for women) in 2008. Mean FPG and diabetes prevalence in 2008 were also high in south Asia, Latin America and the Caribbean, and central Asia, north Africa, and the Middle East. Mean FPG in 2008 was lowest in sub-Saharan Africa, east and southeast Asia, and high-income Asia-Pacific. In high-income subregions, western Europe had the smallest rise, 0·07 mmol/L per decade for men and 0·03 mmol/L per decade for women; North America had the largest rise, 0·18 mmol/L per decade for men and 0·14 mmol/L per decade for women. INTERPRETATION: Glycaemia and diabetes are rising globally, driven both by population growth and ageing and by increasing age-specific prevalences. Effective preventive interventions are needed, and health systems should prepare to detect and manage diabetes and its sequelae. FUNDING: Bill & Melinda Gates Foundation and WHO.
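
Age-standardisation, used throughout these estimates, amounts to a weighted average of age-specific values over a standard population; a toy sketch with entirely hypothetical FPG values and weights:

    import numpy as np

    # hypothetical age-specific mean FPG (mmol/L) and hypothetical standard-population weights
    mean_fpg = np.array([5.1, 5.3, 5.6, 5.9, 6.1])       # ages 25-34, 35-44, 45-54, 55-64, 65+
    weights = np.array([0.26, 0.24, 0.21, 0.16, 0.13])   # must sum to 1

    age_standardised = float((mean_fpg * weights).sum())
    print(f"age-standardised mean FPG = {age_standardised:.2f} mmol/L")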

Relevance: 10.00%

Abstract:

Many studies have provided evidence that prey adjust their behaviour to adaptively balance the fitness effects of reproduction and predation risk. Nocturnal terrestrial animals have to deal with a range of environmental conditions during the reproductive season at the breeding sites, including a variable amount of natural ambient light. High levels of illumination are expected to minimize those behaviours that might increase an animal's detectability by predators. Therefore, under variable habitat brightness conditions and in different ecosystems, the above-mentioned behaviours are expected to depend on the variation in predation risk. Although effects of the moon on amphibian biology have been recognized, the direction of this influence remains controversial, with evidence of both increased and depressed activity under a full moon. We tested the effects of (i) different light conditions and (ii) habitat (open land vs. dense forest) on the reproductive phenology of four nocturnal amphibian species (Hyla intermedia, Rana dalmatina, Rana italica, Salamandrina perspicillata). Our results showed that the effects of the lunar cycle on the study species are associated with the change in luminosity, with no evidence of an endogenous rhythm controlled by biological clocks. The habitat type conditioned the amphibian reproductive strategy in relation to moon phases. Open-habitat breeders (e.g., in ponds with no canopy cover) strongly avoided conditions with high brightness, whereas forest-habitat breeders were apparently unaffected by the different moon phases. Indeed, for all the studied species no effect of the moon phase itself on the considered metrics was found; rather, the species seem to be conditioned mainly by moonlight, irrespective of the moon phase. The two anurans spawning in open habitat apparently adjust their oviposition timing by balancing the risk of being detected by predators against the fitness benefits of reproduction.

Relevance: 10.00%

Abstract:

Phylogenomic databases provide orthology predictions for species with fully sequenced genomes. Although the goal seems well-defined, the content of these databases differs greatly. Seven ortholog databases (Ensembl Compara, eggNOG, HOGENOM, InParanoid, OMA, OrthoDB, Panther) were compared on the basis of reference trees. For three well-conserved protein families, we observed a generally high specificity of orthology assignments for these databases. We show that differences in the completeness of predicted gene relationships and in the phylogenetic information are, for the great majority, not due to the methods used, but to differences in the underlying database concepts. According to our metrics, none of the databases provides a fully correct and comprehensive protein classification. Our results provide a framework for meaningful and systematic comparisons of phylogenomic databases. In the future, a sustainable set of 'Gold standard' phylogenetic trees could provide a robust method for phylogenomic databases to assess their current quality status, measure changes following new database releases and diagnose improvements subsequent to an upgrade of the analysis procedure.
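
The kind of agreement measured here can be sketched as set comparisons between predicted and reference ortholog pairs; in the toy example below the protein identifiers and pairs are hypothetical, and "specificity" is computed as the proportion of predicted pairs supported by the reference (i.e. precision), which may differ from the exact metric used in the study.

    # hypothetical predicted vs reference ortholog pairs (frozensets so A-B equals B-A)
    reference = {frozenset(p) for p in [("HBA_HUMAN", "HBA_MOUSE"),
                                        ("HBB_HUMAN", "HBB_MOUSE"),
                                        ("MYG_HUMAN", "MYG_MOUSE")]}
    predicted = {frozenset(p) for p in [("HBA_HUMAN", "HBA_MOUSE"),
                                        ("HBB_HUMAN", "HBB_MOUSE"),
                                        ("HBB_HUMAN", "HBA_MOUSE")]}

    true_positives = len(predicted & reference)
    precision = true_positives / len(predicted)   # proportion of predictions supported by the reference
    recall = true_positives / len(reference)      # completeness of the predicted relationships
    print(f"precision = {precision:.2f}, recall = {recall:.2f}")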

Relevance: 10.00%

Abstract:

Computed tomography (CT) is an imaging technique in which interest has kept growing since its introduction in the early 1970s. In the clinical environment, this imaging system has become a gold-standard modality because of its high sensitivity in producing accurate diagnostic images. However, even if a direct benefit to patient healthcare is attributed to CT, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionizing radiation on the population. To ensure a benefit-risk balance that works in favour of the patient, image quality and dose must be balanced in order to avoid unnecessary patient exposure.

If this balance is important for adults, it should be an absolute priority for children undergoing CT examinations, especially for patients suffering from diseases requiring several follow-up examinations over their lifetime. Indeed, children and young adults are more sensitive to ionizing radiation and have a longer life expectancy than adults; for this population, the risk of developing a radiation-induced cancer, whose latency period can exceed 20 years, is significantly higher. Assuming that each examination is justified, it becomes a priority to optimize CT acquisition protocols in order to minimize the dose delivered to the patient. CT technology is advancing rapidly, and since 2009 new statistical iterative image-reconstruction techniques have been introduced to decrease patient exposure and improve image quality.

The goal of the present work was to determine the potential of statistical iterative reconstructions to reduce, as much as possible, the dose delivered during CT examinations of children and young adults while maintaining image quality sufficient for diagnosis, and on this basis to propose optimized protocols.

The optimization step requires evaluating both the delivered dose and the image quality needed for diagnosis. While the dose is estimated using CT indices (CTDIvol and DLP), the particularity of this work was to use two radically different approaches to evaluate image quality. The first, "physical" approach computes physical metrics (SD, MTF, NPS, etc.) measured under well-defined conditions, most often on phantoms. Although this approach has limitations because it does not take the radiologist's perception into account, it allows certain image properties to be characterized simply and quickly. The second, "clinical" approach is based on the evaluation of anatomical structures (diagnostic criteria) present in patient images. Radiologists involved in the assessment step scored the quality of these structures for diagnostic purposes using a simple rating scale. This approach is demanding to implement and time-consuming, but it has the advantage of being close to the radiologists' practice and can be considered a reference method.

Among the main results, this work showed that the statistical iterative algorithms studied in clinical conditions (ASIR, VEO) have a strong potential to reduce CT dose (by up to 90%). However, by their very mechanism they modify the appearance of the image, producing a change in texture that may affect diagnostic quality. By comparing the results of the "clinical" and "physical" approaches, it was shown that this change in texture corresponds to a modification of the noise power spectrum, whose analysis makes it possible to anticipate or avoid a loss of diagnostic quality. This work also demonstrated that integrating these new reconstruction techniques into clinical practice cannot be done simply on the basis of protocols designed for conventional reconstructions. The conclusions of this work and the image-quality tools developed can guide future studies in the field of image quality, such as texture analysis or model observers dedicated to CT.
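
Of the physical metrics mentioned (SD, MTF, NPS), the noise power spectrum is the one that captures the texture change described; the sketch below computes a simply normalised 2-D NPS from noise-only ROIs. The white-noise patches and pixel size are made up, and the detrending is plain mean subtraction rather than a full measurement protocol.

    import numpy as np

    def nps_2d(rois, pixel_size_mm):
        """2-D noise power spectrum averaged over noise-only ROIs (mean-subtraction detrending)."""
        rois = np.asarray(rois, dtype=float)
        n_rois, n_y, n_x = rois.shape
        nps = np.zeros((n_y, n_x))
        for roi in rois:
            noise = roi - roi.mean()
            nps += np.abs(np.fft.fft2(noise)) ** 2
        nps *= pixel_size_mm ** 2 / (n_rois * n_y * n_x)
        return np.fft.fftshift(nps)

    # toy check: white-noise patches give a flat NPS whose integral recovers the pixel variance
    rng = np.random.default_rng(0)
    rois = rng.normal(0.0, 10.0, size=(16, 64, 64))   # 16 ROIs of 64x64 "HU" noise
    pixel = 0.5                                       # mm, hypothetical
    nps = nps_2d(rois, pixel)
    freq_step = 1.0 / (64 * pixel)                    # spatial-frequency bin width (1/mm)
    print("variance recovered from the NPS integral:", round(float(nps.sum()) * freq_step ** 2, 1))

For real CT noise the spectrum is not flat; statistical iterative reconstructions typically shift it towards lower spatial frequencies, which is the texture change referred to above.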

Relevance: 10.00%

Abstract:

We consider electroencephalograms (EEGs) of healthy individuals and compare the properties of the brain functional networks found through two methods: unpartialized and partialized cross-correlations. The networks obtained with partial correlations are fundamentally different from those constructed with unpartial correlations in terms of graph metrics. In particular, they have completely different connection efficiency, clustering coefficient, assortativity, degree variability, and synchronization properties. Unpartial correlations are simple to compute and can easily be applied to large-scale systems, yet they cannot prevent the prediction of indirect edges. In contrast, partial correlations, which are often expensive to compute, reduce the prediction of such edges. We suggest combining these alternative methods in order to obtain complementary information on brain functional networks.
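
A compact way to see the difference between the two approaches, with a simulated common driver rather than real EEG, is to compare the full correlation matrix with the partial correlations obtained from the inverse covariance (precision) matrix.

    import numpy as np

    def partial_correlation(data):
        """Partial correlations from the inverse covariance (precision) matrix.
        data: array of shape (n_samples, n_channels)."""
        precision = np.linalg.inv(np.cov(data, rowvar=False))
        d = np.sqrt(np.diag(precision))
        pcorr = -precision / np.outer(d, d)
        np.fill_diagonal(pcorr, 1.0)
        return pcorr

    rng = np.random.default_rng(1)
    # toy "EEG": channels 1 and 2 are both driven by channel 0, with no direct coupling between them
    z = rng.normal(size=5000)
    x1 = z + 0.5 * rng.normal(size=5000)
    x2 = z + 0.5 * rng.normal(size=5000)
    data = np.column_stack([z, x1, x2])

    full = np.corrcoef(data, rowvar=False)
    pcorr = partial_correlation(data)
    print("full correlation 1-2:   ", round(full[1, 2], 2))    # high, but indirect (shared driver)
    print("partial correlation 1-2:", round(pcorr[1, 2], 2))   # near zero once the driver is removed

The full correlation between the two driven channels is high even though they are not directly coupled, while the partial correlation is close to zero once the driver is conditioned out, which is the indirect-edge issue noted above.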

Relevance: 10.00%

Abstract:

BACKGROUND: The goals of our study are to determine the most appropriate model for alcohol consumption as an exposure for burden of disease, to analyze the effect of the chosen alcohol consumption distribution on the estimation of alcohol Population-Attributable Fractions (PAFs), and to characterize the chosen alcohol consumption distribution by exploring whether there is a global relationship within the distribution. METHODS: To identify the best model, the Log-Normal, Gamma, and Weibull prevalence distributions were examined using data from 41 surveys from Gender, Alcohol and Culture: An International Study (GENACIS) and from the European Comparative Alcohol Study. To assess the effect of these distributions on the estimated alcohol PAFs, we calculated the alcohol PAF for diabetes, breast cancer, and pancreatitis using the three above-named distributions and using the more traditional approach based on categories. The relationship between the mean and the standard deviation of the Gamma distribution was estimated using data from 851 datasets for 66 countries from GENACIS and from the World Health Organization's STEPwise approach to Surveillance. RESULTS: The Log-Normal distribution provided a poor fit for the survey data; the Gamma and Weibull distributions provided better fits. Additionally, our analyses showed no marked differences between alcohol PAF estimates based on the Gamma or Weibull distributions and PAFs based on categorical alcohol consumption estimates. The standard deviation of the alcohol distribution was highly dependent on the mean, with a one-unit increase in mean consumption associated with an increase in the standard deviation of 1.258 (95% CI: 1.223 to 1.293) (R2 = 0.9207) for women and 1.171 (95% CI: 1.144 to 1.197) (R2 = 0.9474) for men. CONCLUSIONS: Although the Gamma and Weibull distributions provided similar results, the Gamma distribution is recommended for modelling alcohol consumption from population surveys because of its fit, flexibility, and the ease with which it can be modified. The results showed that a large part of the variance of the standard deviation of the Gamma alcohol consumption distribution was explained by the mean alcohol consumption, allowing alcohol consumption to be modelled through a Gamma distribution using only average consumption.
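
A minimal sketch of the modelling step, assuming scipy and entirely hypothetical consumption data, prevalence and relative-risk function: fit a Gamma distribution to drinkers' intake and plug it into the continuous population-attributable-fraction formula PAF = (E[RR] - 1) / E[RR].

    import numpy as np
    from scipy import stats
    from scipy.integrate import quad

    rng = np.random.default_rng(2)
    drinkers = rng.gamma(shape=1.5, scale=12.0, size=2000)   # hypothetical intake (g/day) among drinkers

    # fit a Gamma distribution to drinkers' consumption, location fixed at zero
    shape, loc, scale = stats.gamma.fit(drinkers, floc=0)
    dist = stats.gamma(shape, loc=loc, scale=scale)

    rr = lambda x: np.exp(0.01 * x)   # made-up log-linear relative-risk function, illustration only
    p_drinkers = 0.6                  # hypothetical prevalence of current drinkers

    mean_rr_drinkers, _ = quad(lambda x: rr(x) * dist.pdf(x), 0.0, np.inf)
    mean_rr = (1.0 - p_drinkers) * 1.0 + p_drinkers * mean_rr_drinkers
    paf = (mean_rr - 1.0) / mean_rr
    print(f"fitted shape = {shape:.2f}, scale = {scale:.2f}, PAF = {paf:.3f}")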

Relevance: 10.00%

Abstract:

The importance of neurocognitive deficits in patients with schizophrenia led the "E. Minkowski" section of the Département universitaire de psychiatrie adulte in Lausanne to develop a cognitive remediation programme for young patients with schizophrenia. Preliminary results from this programme indicate a very clear improvement in cognitive performance in most patients. The cerebral mechanisms responsible for this progress have not, however, been elucidated. New findings concerning mechanisms of synaptic plasticity are put forward to explain the cerebral changes associated with the trained cognitive functions.