921 results for Temporal ways of knowing


Relevance:

100.00%

Publisher:

Abstract:

Settling particles were collected using sediment traps deployed along three transects in the Lacaze-Duthiers and Cap de Creus canyons and the adjacent southern open slope from October 2005 to October 2006. The settling material was analyzed to obtain total mass fluxes and main constituent contents (organic matter, opal, calcium carbonate, and siliciclastics). Cascades of dense shelf water from the continental shelf edge to the lower continental slope occurred from January to March 2006. They were traced through strong negative near-bottom temperature anomalies and increased current speeds, and generated two intense pulses of mass fluxes in January and March 2006. This oceanographic phenomenon appeared as the major physical forcing of settling particles at almost all stations, and caused both high seasonal variability in mass fluxes and important qualitative changes in the settling material. Fluxes during the dense shelf water cascading (DSWC) event ranged from 90.1 g m⁻² d⁻¹ at the middle Cap de Creus canyon (1000 m) to 3.2 g m⁻² d⁻¹ at the canyon mouth (1900 m). Fractions of the organic matter, opal, and calcium carbonate components increased seaward, thus diminishing the siliciclastic fraction. Temporal variability of the major components was larger at the canyon mouth and open slope sites, due to the mixed impact of dense shelf water cascading processes and pelagic biological production. Results indicate that the cascading event remobilized and homogenized large amounts of material down-canyon and southward along the continental slope; these observations contribute to a better understanding of off-shelf particle transport and of the internal dynamics of DSWC events.

Using a scaling assumption, we propose a phenomenological model aimed to describe the joint probability distribution of two magnitudes A and T characterizing the spatial and temporal scales of a set of avalanches. The model also describes the correlation function of a sequence of such avalanches. As an example we study the joint distribution of amplitudes and durations of the acoustic emission signals observed in martensitic transformations [Vives et al., preceding paper, Phys. Rev. B 52, 12 644 (1995)].
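A minimal, generic scaling ansatz of the kind described above can be written as follows; the exponents α and γ and the scaling function Φ are placeholders, and the exact functional form of the paper's model may differ:

```latex
% Illustrative scaling ansatz for the joint distribution of avalanche
% amplitude A and duration T (placeholder exponents, not the paper's values)
p(A,T) \propto A^{-\alpha}\,\Phi\!\left(\frac{T}{A^{\gamma}}\right),
\qquad
p(A) = \int_0^{\infty} p(A,T)\,\mathrm{d}T \propto A^{\gamma-\alpha},
\qquad
\langle T \mid A \rangle \propto A^{\gamma}.
```

A single crossover scale T ~ A^γ thus links the temporal and spatial magnitudes of an avalanche, which is the essence of the joint-scaling assumption.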

Abstract Background: Medical errors have recently been recognized as a relevant concern in public health, and increasing research efforts have been made to find ways of improving patient safety. In palliative care, however, studies on errors are scant. Objective: Our aim was to gather pilot data on the experiences and attitudes of palliative care professionals regarding this topic. Methods: We developed a questionnaire consisting of questions on the relevance, estimated frequency, kinds, and severity of errors, their causes and consequences, and the way palliative care professionals handle them. The questionnaire was sent to all specialist palliative care institutions in the region of Bavaria, Germany (n=168; 12.5 million inhabitants), achieving a response rate of 42% (n=70). Results: Errors in palliative care were regarded as a highly relevant problem (median 8 on a 10-point numeric rating scale). Most respondents experienced a moderate frequency of errors (1-10 per 100 patients). Errors in communication were estimated to be more common than those in symptom control. The causes most often mentioned were deficits in communication or organization. Moral and psychological problems for the person committing the error were seen as more frequent than consequences for the patient. Ninety percent of respondents declared that they disclose errors to the harmed patient. For 78% of the professionals, the issue was not a part of their professional training. Conclusion: Professionals acknowledge errors, in particular errors in communication, to be a common and relevant problem in palliative care, one that has, however, been neglected in training and research.

Assuming selective vulnerability of short association U-fibers in early Alzheimer's disease (AD), we quantified demyelination of the surface white matter (dSWM) with magnetization transfer ratio (MTR) in 15 patients (Clinical Dementia Rating Scale [CDR] 0.5-1; Functional Assessment Staging [FAST]: 3-4) compared with 15 controls. MTRs were computed for 39 areas in each hemisphere. We found a bilateral MTR decrease in the temporal, cingulate, parietal, and prefrontal areas. With linear discriminant analysis, we successfully classified all the participants with 3 variates including the cuneus, parahippocampal, and superior temporal regions of the left hemisphere. The pattern of dSWM changed with the age of AD onset. In early onset patients, we found bilateral posterior demyelination spreading to the temporal areas in the left hemisphere. The late onset patients showed a distributed bilateral pattern with the temporal and cingulate areas strongly affected. A correlation with Mini Mental State Examination (MMSE), Lexis, and memory tests revealed the dSWM impact on cognition. A specific landscape of dSWM in early AD shows the potential of MTR imaging as an in vivo biomarker superior to currently used techniques.

Measurement of blood pressure by the physician remains an essential step in the evaluation of cardiovascular risk. Ambulatory measurement and self-measurement of blood pressure are ways of counteracting the "white coat" effect, i.e., the rise in blood pressure many patients experience in the presence of doctors. They make it possible to define the cardiovascular risk of hypertension and to identify the patients with the greatest chance of benefiting from antihypertensive therapy. However, it must be realised that subjects who are normotensive during their everyday activities but hypertensive in the doctor's surgery may become hypertensive with time, irrespective of the means used to measure blood pressure. These patients should be followed up regularly even if the decision to treat has been postponed.

PURPOSE OF REVIEW: We present an overview of recent concepts in the mechanisms underlying cognitive decline associated with brain aging and neurodegeneration, from the perspective of MRI. RECENT FINDINGS: Recent findings challenge the established link between neuroimaging biomarkers of neurodegeneration and age-related or disease-related cognitive decline. Amyloid burden, white matter hyperintensities, and local patterns of brain atrophy seem to have differential impacts on cognition, particularly on episodic and working memory - the most vulnerable domains in 'normal aging' and Alzheimer's disease. Studies suggesting that imaging biomarkers of neurodegeneration are independent of amyloid-β give rise to new hypotheses regarding the pathological cascade in Alzheimer's disease. Findings in patients with autosomal-dominant Alzheimer's disease confirm the notion of differential temporal trajectories of amyloid deposition and brain atrophy, adding another layer of complexity to the basic mechanisms of cognitive aging and neurodegeneration. Finally, the concept of cognitive reserve in 'supernormal aging' is questioned by evidence for the preservation of neurochemical, structural, and functional brain integrity in old age, rather than recruitment of 'reserves' for maintaining cognitive abilities. SUMMARY: Recent advances in clinical neuroscience, brain imaging, and genetics challenge the pathophysiological hypotheses of neurodegeneration and cognitive aging that dominated the field in the last decade, and call for reconsidering the choice of the therapeutic window for early intervention.

Signaling through the Notch1 receptor is essential for the control of numerous developmental processes during embryonic life as well as in adult tissue homeostasis and disease. Since the outcome of Notch1 signaling is highly context-dependent, and its precise physiological and pathological role in many organs is unclear, it is of great interest to localize and identify the cells that receive active Notch1 signals in vivo. Here, we report the generation and characterization of a BAC-transgenic mouse line, N1-Gal4VP16, that when crossed to a Gal4-responsive reporter mouse line allowed the identification of cells undergoing active Notch1 signaling in vivo. Analysis of embryonic and adult N1-Gal4VP16 mice demonstrated that the activation pattern of the transgene coincides with previously observed activation patterns of the endogenous Notch1 receptor. Thus, this novel reporter mouse line provides a unique tool to specifically investigate the spatial and temporal aspects of Notch1 signaling in vivo. genesis 50:700-710, 2012. © 2012 Wiley Periodicals, Inc.

One of the largest strawberry-producing municipalities of Rio Grande do Sul (RS) is Turuçu, in the south of the state. The strawberry production system adopted by farmers is similar to that used in other regions of Brazil and of the world. The main difference is related to soil management, which can change the soil chemical properties during the strawberry cycle. This study had the objective of assessing the spatial and temporal distribution of soil fertility parameters using principal component analysis (PCA). Soil sampling was based on topography, dividing the field into three thirds: upper, middle, and lower. From each of these thirds, five soil samples were randomly collected in the 0-0.20 m layer to form a composite sample for each third. Four samples were taken during the strawberry cycle and the following properties were determined: soil organic matter (OM), soil total nitrogen (N), available phosphorus (P) and potassium (K), exchangeable calcium (Ca) and magnesium (Mg), soil pH (pH), cation exchange capacity (CEC) at pH 7.0, base saturation (V%), and aluminum saturation (m%). No spatial variation was observed for any of the studied soil fertility parameters in the strawberry fields, and temporal variation was only detected for available K. Phosphorus and K contents were always high or very high from the beginning of the strawberry cycle, while pH values ranged from very low to very high. Principal component analysis allowed the clustering of all strawberry fields based on variables related to soil acidity and organic matter content.
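As an illustration of the kind of analysis described, here is a minimal PCA on standardized soil-fertility variables using only NumPy; the data matrix is randomly generated and purely hypothetical, not the study's measurements:

```python
import numpy as np

# Hypothetical fertility data: rows = composite samples (field third x date),
# columns = soil properties (e.g. OM, N, P, K, Ca, Mg, pH, CEC, V%, m%).
rng = np.random.default_rng(42)
X = rng.normal(size=(12, 10))

# Standardize each variable (zero mean, unit variance), as is usual before
# PCA on properties measured in different units.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA via singular value decomposition of the standardized matrix.
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = U * s                      # sample coordinates on the principal components
explained = s**2 / np.sum(s**2)     # fraction of variance per component

print(explained[:2])                # share of variance captured by PC1 and PC2
```

Clustering the samples in the plane of the first two score columns is the usual way such fertility data are grouped by acidity- and organic-matter-related variables.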

To date, only a couple of functional MR spectroscopy (fMRS) studies have been conducted in rats. Due to the low temporal resolution of ¹H MRS techniques, prolonged stimulation paradigms are necessary for investigating the metabolic outcome in the rat brain during a functional challenge. However, sustained activation of cortical areas is usually difficult to obtain due to neural adaptation. Anesthesia, habituation, the high variability of basal-state metabolite concentrations, the low concentrations of metabolites of interest such as lactate (Lac), glucose (Glc), or γ-aminobutyric acid (GABA), and the small expected changes in metabolite concentrations all need to be addressed. In the present study, the rat barrel cortex was reliably and reproducibly activated through sustained trigeminal nerve (TGN) stimulation. In addition, TGN stimulation induced significant positive changes in lactate (+1.01 μmol/g, p<0.008) and glutamate (+0.92 μmol/g, p<0.02) and significant negative changes in aspartate (-0.63 μmol/g, p<0.004), measured with functional ¹H MRS at 9.4 T, in agreement with changes previously observed in human fMRS studies. Finally, for the first time, the dynamics of lactate, glucose, aspartate, and glutamate concentrations during sustained somatosensory activation in rats were assessed using fMRS. These results demonstrate the feasibility of fMRS measurements during prolonged barrel cortex activation in rats.

Generally, in tropical and subtropical agroecosystems, the efficiency of nitrogen (N) fertilization is low, inducing temporal variability of crop yield, economic losses, and environmental impacts. Variable-rate N fertilization (VRF), based on optical spectrometry crop sensors, could increase the N use efficiency (NUE). The objective of this study was to evaluate corn grain yield and N fertilization efficiency under VRF determined by an optical sensor, in comparison to traditional single-application N fertilization (TSF). For this purpose, three experiments with no-tillage corn were carried out in the 2008/09 and 2010/11 growing seasons on a Hapludox in southern Brazil, in a completely randomized design, at three different sites that were analyzed separately. The following crop properties were evaluated: aboveground dry matter production and N uptake at corn flowering, grain yield, and the vegetation index determined by an N-Sensor® ALS optical sensor. Across the sites, N fertilization had a positive effect on corn N uptake, resulting in increased corn dry matter and grain yield. However, N fertilization induced smaller increases of corn grain yield at site 2, where there was a severe drought during the growing period. The VRF defined by the optical crop sensor increased the apparent N recovery (NRE) and the agronomic efficiency of N (NAE) compared to the traditional fertilizer strategy. Averaged over sites 1 and 3, which were not affected by drought, VRF promoted increases of 28.0 and 41.3 % in NAE and NRE, respectively. Despite these results, no increase in corn grain yield was observed with the use of VRF compared to TSF.

In order to select soil management practices that increase nitrogen-use efficiency (NUE) in agro-ecosystems, the different indices of agronomic fertilizer efficiency must be evaluated under varied weather conditions. This study assessed the NUE indices in no-till corn in southern Paraguay. Nitrogen fertilizer rates from 0 to 180 kg ha-1 were applied in a single application at corn sowing, and the crop response was investigated in two growing seasons (2010 and 2011). The experimental design was a randomized block with three replications. Based on the data of grain yield, dry matter, and N uptake, the following fertilizer indices were assessed: agronomic N-use efficiency (ANE), apparent N recovery efficiency (NRE), N physiological efficiency (NPE), partial factor productivity (PFP), and partial nutrient balance (PNB). The weather conditions varied greatly during the experimental period; the rainfall distribution was favorable for crop growth in the first season and unfavorable in the second. The PFP and ANE indices, as expected, decreased with increasing N fertilizer rates. A general analysis of the N fertilizer indices in the first season showed that the maximum rate (180 kg ha-1) produced the highest corn yield and also optimized the NPE, NRE, and ANE efficiencies. In the second season, under water stress, the most efficient N fertilizer rate (60 kg ha-1) was three times lower than in the first season, indicating a strong influence of weather conditions on NUE. Considering that weather instability is typical for southern Paraguay, full N fertilization anticipated at corn sowing is not recommended, due to the temporal variability of the optimum N fertilizer rate needed to achieve high ANE.
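The indices named above have standard agronomic definitions; the sketch below assumes the study used these conventional formulas, and the input numbers are invented for illustration only:

```python
# Standard definitions of the N-use efficiency indices named in the abstract
# (ANE, NRE, NPE, PFP, PNB). The exact formulas used in the study may differ
# slightly; the example values are hypothetical, not the paper's data.

def nue_indices(yield_fert, yield_ctrl, uptake_fert, uptake_ctrl, n_rate):
    """All inputs in kg/ha; n_rate must be > 0 (fertilized treatment)."""
    ane = (yield_fert - yield_ctrl) / n_rate          # extra grain per kg N applied
    nre = (uptake_fert - uptake_ctrl) / n_rate        # fraction of applied N recovered
    npe = (yield_fert - yield_ctrl) / (uptake_fert - uptake_ctrl)  # grain per kg N taken up
    pfp = yield_fert / n_rate                         # partial factor productivity
    pnb = uptake_fert / n_rate                        # partial nutrient balance
    return {"ANE": ane, "NRE": nre, "NPE": npe, "PFP": pfp, "PNB": pnb}

# Illustrative case: 60 kg N/ha raising yield from 5,000 to 6,200 kg/ha
# and N uptake from 90 to 132 kg/ha.
idx = nue_indices(6200.0, 5000.0, 132.0, 90.0, 60.0)
print(idx)
```

Because PFP and ANE carry the N rate in the denominator, they mechanically decline as rates rise, which matches the trend reported above.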

Soil erosion is one of the chief causes of agricultural land degradation. Practices of conservation agriculture, such as no-tillage and cover crops, are the key strategies of soil erosion control. In a long-term experiment on a Typic Paleudalf, we evaluated the temporal changes of soil loss and water runoff rates promoted by the transition from conventional to no-tillage systems in the treatments: bare soil (BS); grassland (GL); winter fallow (WF); intercrop maize and velvet bean (M+VB); intercrop maize and jack bean (M+JB); forage radish as winter cover crop (FR); and winter cover crop consortium ryegrass - common vetch (RG+CV). Intensive soil tillage induced higher soil losses and water runoff rates; these effects persisted for up to three years after the adoption of no-tillage. The planting of cover crops resulted in a faster decrease of soil and water loss rates in the first years after conversion from conventional to no-tillage than to winter fallow. The association of no-tillage with cover crops promoted progressive soil stabilization; after three years, soil losses were similar and water runoff was lower than from grassland soil. In the treatments of cropping systems with cover crops, soil losses were reduced by 99.7 and 66.7 %, compared to bare soil and winter fallow, while the water losses were reduced by 96.8 and 71.8 % in relation to the same treatments, respectively.

In the subtropical regions of southern Brazil, rainfall distribution is uneven, which results in temporal variability of soil water storage. For grapes, water is generally available in excess and water deficiency occurs only occasionally. Furthermore, on the Southern Plateau of Santa Catarina, there are differences in soil properties, which results in high spatial variability. These two factors affect the composition of wine grapes. Spatio-temporal analyses are therefore useful in the selection of cultural practices as well as of adequate soils for vineyards. In this way, well-suited areas can produce grapes with a more appropriate composition for the production of quality wines. The aim of this study was to evaluate the spatio-temporal variability of water storage in a Cambisol during the growth cycle of a Cabernet Sauvignon vineyard and its relation to selected soil properties. The experimental area consisted of a commercial 8-year-old vineyard in São Joaquim, Santa Catarina, Brazil. A sampling grid with five rows and seven points per row, spaced 12 m apart, was outlined on an area of 3,456 m². Soil samples were collected with an auger at these points, 0.30 m away from the grapevines, in the 0.00-0.30 m layer, to determine gravimetric soil moisture. Measurements were taken once a week from December 2008 to April 2009, and every two weeks from December 2009 to March 2010. In December 2008, undisturbed soil samples were collected to determine bulk density, macro- and microporosity, and disturbed samples were used to quantify particle size distribution and organic carbon content. Results were subjected to descriptive analysis and semivariogram analysis, calculating the mean relative difference and the Pearson correlation. The average water storage in a Cambisol under grapevine on ridges had variable spatial dependence, i.e., the lower the average water storage, the higher the range of spatial dependence. 
Water storage had a stable spatial pattern during the trial period, indicating that the points with lower water storage or points with higher water storage during a certain period maintain these conditions throughout the experimental period. The relative difference is a simple method to identify positions that represent the average soil water storage more adequately at any time for a given area.
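The mean-relative-difference method mentioned in the conclusion can be sketched as follows; the storage matrix is randomly generated (the 35 points match the 5 x 7 sampling grid, while the number of dates is arbitrary), so the numbers are purely illustrative:

```python
import numpy as np

# Temporal-stability analysis via the mean relative difference (MRD): for each
# sampling point, average its relative deviation from the field mean over all
# sampling dates. Data below are hypothetical, not the study's measurements.
rng = np.random.default_rng(7)
storage = rng.uniform(60.0, 120.0, size=(35, 20))   # 35 points x 20 dates, mm

field_mean = storage.mean(axis=0)                   # spatial mean on each date
rel_diff = (storage - field_mean) / field_mean      # relative deviation per point/date
mrd = rel_diff.mean(axis=1) * 100.0                 # mean relative difference, %

# The point whose MRD is closest to zero best represents the field-average
# water storage at any time, which is what the abstract's conclusion exploits.
representative = int(np.argmin(np.abs(mrd)))
print(representative, round(float(mrd[representative]), 2))
```

A stable spatial pattern, as reported above, corresponds to each point keeping a small temporal standard deviation around its own MRD.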

Knowledge of the soil water retention curve (SWRC) is essential for understanding and modeling hydraulic processes in the soil. However, direct determination of the SWRC is time consuming and costly. In addition, it requires a large number of samples, due to the high spatial and temporal variability of soil hydraulic properties. An alternative is the use of models, called pedotransfer functions (PTFs), which estimate the SWRC from easy-to-measure properties. The aim of this paper was to test the accuracy of 16 point or parametric PTFs reported in the literature on different soils from the south and southeast of the State of Pará, Brazil. The PTFs tested were proposed by Pidgeon (1972), Lal (1979), Aina & Periaswamy (1985), Arruda et al. (1987), Dijkerman (1988), Vereecken et al. (1989), Batjes (1996), van den Berg et al. (1997), Tomasella et al. (2000), Hodnett & Tomasella (2002), Oliveira et al. (2002), and Barros (2010). We used a database that includes soil texture (sand, silt, and clay), bulk density, soil organic carbon, soil pH, cation exchange capacity, and the SWRC. Most of the PTFs tested did not show good performance in estimating the SWRC. The parametric PTFs, however, performed better than the point PTFs in assessing the SWRC in the tested region. Among the parametric PTFs, those proposed by Tomasella et al. (2000) achieved the best accuracy in estimating the empirical parameters of the van Genuchten (1980) model, especially when tested in the top soil layer.
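The best-performing parametric PTFs estimate the empirical parameters of the van Genuchten (1980) retention model, which can be evaluated as follows; the parameter values in the example are illustrative, not fitted values from the study:

```python
# van Genuchten (1980) soil water retention model with the common Mualem
# restriction m = 1 - 1/n. Parameter values below are illustrative only.

def van_genuchten(h, theta_r, theta_s, alpha, n):
    """Volumetric water content at suction h (h >= 0; alpha in 1/cm if h in cm)."""
    m = 1.0 - 1.0 / n                      # Mualem restriction
    se = (1.0 + (alpha * h) ** n) ** (-m)  # effective saturation, 0..1
    return theta_r + (theta_s - theta_r) * se

# Example: theta_r = 0.10, theta_s = 0.50, alpha = 0.05 1/cm, n = 1.4
theta_fc = van_genuchten(100.0, 0.10, 0.50, 0.05, 1.4)     # near field capacity
theta_pwp = van_genuchten(15000.0, 0.10, 0.50, 0.05, 1.4)  # near wilting point
print(round(theta_fc, 3), round(theta_pwp, 3))
```

A parametric PTF predicts theta_r, theta_s, alpha, and n from texture, bulk density, and organic carbon, after which the whole curve follows from this single function; point PTFs instead predict water contents at a few fixed suctions.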

In recent years, numerous studies have highlighted the toxic effects of organic micropollutants on the species of our lakes and rivers. However, most of these studies focused on the toxicity of individual substances, whereas organisms are exposed every day to thousands of substances in mixture, and the effects of these cocktails are not negligible. This doctoral thesis therefore examined models for predicting the environmental risk of these cocktails for the aquatic environment. The main objective was to assess the ecological risk of the mixtures of chemical substances measured in Lake Geneva, but also to take a critical look at the methodologies used, in order to propose adaptations for a better estimation of the risk. In the first part of this work, the risk of mixtures of pesticides and pharmaceuticals for the Rhône and for Lake Geneva was established using approaches envisioned, in particular, in European legislation. These are screening approaches, i.e., approaches allowing a general assessment of the risk of mixtures. Such an approach makes it possible to identify the most problematic substances, i.e., those contributing most to the toxicity of the mixture; in our case, these were essentially four pesticides. The study also shows that all substances, even in trace amounts, contribute to the effect of the mixture. This observation has implications for environmental management, since it means that all sources of pollutants must be reduced, not only the most problematic ones. However, the proposed approach also has an important conceptual bias, which makes its use questionable beyond screening and would require an adaptation of the safety factors employed.
In a second part, the study focused on the use of mixture models in environmental risk calculation. Indeed, mixture models were developed and validated species by species, not for an assessment of the ecosystem as a whole. Their use should therefore proceed species by species, which is rarely done owing to the lack of available ecotoxicological data. The aim was therefore to compare, using randomly generated values, the risk calculated by a rigorous species-by-species method with the risk calculated classically, where the models are applied to the whole community without accounting for inter-species variation. The results are similar in the majority of cases, which validates the traditionally used approach. However, this work identified certain cases where the classical application can lead to an under- or overestimation of the risk. Finally, the last part of this thesis examined the influence that cocktails of micropollutants may have had on communities in situ. To do so, a two-step approach was adopted. First, the toxicity of fourteen herbicides detected in Lake Geneva was determined. Over the period studied, from 2004 to 2009, this herbicide toxicity decreased from 4% of species affected to less than 1%. The question then was whether this decrease in toxicity had an impact on the development of certain species within the algal community. To this end, statistical analysis made it possible to isolate other factors that could influence the flora, such as water temperature or the presence of phosphates, and thus to determine which species turned out to have been influenced, positively or negatively, by the decrease in toxicity in the lake over time.
Interestingly, some of these species had already shown similar behavior in mesocosm studies. In conclusion, this work shows that robust models exist for predicting the risk of micropollutant mixtures to aquatic species, and that they can be used to explain the role of these substances in the functioning of ecosystems. These models nevertheless have limits and underlying assumptions that must be considered when they are applied. - For several years, the risks that organic micropollutants pose to the aquatic environment have been a major concern for scientists and for society. Indeed, numerous studies have highlighted the toxic effects these chemical substances can have on the species of our lakes and rivers when exposed to acute or chronic concentrations. However, most of these studies focused on the toxicity of individual substances, i.e., substances considered separately. The same currently holds for the European regulatory procedures concerning the environmental risk assessment of a substance. Yet organisms are exposed every day to thousands of substances in mixture, and the effects of these "cocktails" are not negligible. The ecological risk assessment of these mixtures of substances must therefore be addressed in the most appropriate and reliable way possible. In the first part of this thesis, we examined the methods currently envisioned for integration into European legislation for assessing the risk of mixtures to the aquatic environment.
These methods are based on the concentration addition model, using either the concentrations of substances estimated to have no effect in the environment (PNEC) or the effect concentrations (EC50) on certain species of a trophic level, with safety factors taken into account. We applied these methods to two specific cases, Lake Geneva and the Rhône in Switzerland, and discussed the results of these applications. These first assessment steps showed that the mixture risk for these case studies quickly exceeds a critical threshold. This exceedance is generally due to two or three main substances. The proposed procedures therefore make it possible to identify the most problematic substances, for which management measures, such as reducing their input into the aquatic environment, should be considered. However, we also found that the risk level associated with these mixtures of substances is not negligible, even without taking these main substances into account. Indeed, the accumulation of substances, even in trace amounts, reaches a critical threshold, which is more difficult to handle in terms of risk management. In addition, we highlighted a lack of reliability in these procedures, which can lead to contradictory results in terms of risk. This is linked to the incompatibility of the safety factors used in the different methods. In the second part of the thesis, we studied the reliability of more advanced methods for predicting the effect of mixtures on the communities living in the aquatic system. These methods rely on the concentration addition (CA) or response addition (RA) models applied to species sensitivity distribution (SSD) curves.
Indeed, mixture models were developed and validated to be applied species by species, not to several species aggregated simultaneously into SSD curves. We therefore proposed a more rigorous procedure for assessing the risk of a mixture: first apply the CA or RA models to each species separately and then, in a second step, combine the results to build an SSD curve for the mixture. Unfortunately, this method is not applicable in most cases, because it requires too many data that are generally unavailable. Consequently, we compared, using randomly generated values, the risk calculated by this more rigorous method with the risk calculated traditionally, in order to characterize the robustness of the approach that applies the mixture models directly to the SSD curves. Our results showed that applying CA directly to the SSDs can lead to an underestimation of the mixture concentration affecting 5% or 50% of the species, in particular when the substances show a large standard deviation in their species sensitivity distribution. Applying the RA model can lead to over- or underestimations, depending mainly on the slope of the dose-response curves of the individual species composing the SSDs. The underestimation with RA becomes potentially important when the ratio between the EC50 and the EC10 of the species' dose-response curves is smaller than 100. However, for most substances, based on real cases, the ecotoxicity data are such that the mixture risk calculated by applying the models directly to the SSDs remains consistent and would, if anything, slightly overestimate the risk. These results thus validate the traditionally used approach.
Nevertheless, this source of error must be kept in mind when assessing the risk of a mixture with this traditional method, in particular when the SSDs show data distributions outside the limits determined in this study. Finally, in the last part of this thesis, we confronted mixture effect predictions with biological changes observed in the environment. In this study, we used data from the long-term monitoring of a large European lake, Lake Geneva, which offered the opportunity to evaluate to what extent the predicted toxicity of herbicide mixtures explained the changes in the composition of the phytoplankton community, alongside other classical limnological parameters such as nutrients. To reach this goal, we determined the mixture toxicity, over several years, of 14 herbicides regularly detected in the lake, using the CA and RA models with species sensitivity distribution curves. A decreasing temporal gradient of toxicity was observed from 2004 to 2009. A redundancy analysis and a partial redundancy analysis showed that this gradient explains a significant part of the variation in the composition of the phytoplankton community, even after removing the effect of all the other covariates. Moreover, some of the species shown to have been influenced, positively or negatively, by the decrease in toxicity in the lake over time displayed similar behavior in mesocosm studies. It can be concluded that herbicide mixture toxicity is one of the key parameters explaining the phytoplankton changes in Lake Geneva. In conclusion, various methods exist to predict the risk of micropollutant mixtures to aquatic species, and this risk can play a role in the functioning of ecosystems.
However, these models of course have limits and underlying assumptions that must be considered when applying them, before using their results for environmental risk management. - For several years now, scientists as well as society have been concerned about the risk that organic micropollutants may pose to aquatic environments. Indeed, several studies have shown the toxic effects these substances can induce on organisms living in our lakes and rivers, especially when exposed to acute or chronic concentrations. However, most studies have focused on the toxicity of single compounds, i.e. compounds considered individually. The same applies to the current European regulations on the environmental risk assessment of these substances. Yet aquatic organisms are typically exposed simultaneously, every day, to thousands of organic compounds, and the toxic effects resulting from these "cocktails" cannot be neglected. The ecological risk assessment of such mixtures therefore has to be addressed by scientists in the most reliable and appropriate way. In the first part of this thesis, the procedures currently envisioned for aquatic mixture risk assessment in European legislation are described. These methodologies are based on the mixture model of concentration addition and on the use of predicted no effect concentrations (PNEC) or effect concentrations (EC50) with assessment factors. These principal approaches were applied to two case studies, Lake Geneva and the River Rhône in Switzerland, including a discussion of the outcomes of such applications. These first-tier assessments showed that the mixture risks for the studied cases rapidly exceeded the critical value, an exceedance generally due to two or three main substances.
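The first-tier approach described above amounts to a sum of toxic units: under concentration addition, the mixture risk quotient is the sum of each measured concentration divided by its PNEC, and the critical value is 1. A minimal sketch in Python, using hypothetical concentrations and PNEC values (not data from the thesis):

```python
def mixture_risk_quotient(concentrations_ug_l, pnecs_ug_l):
    """Concentration-addition first-tier risk: sum of toxic units C_i / PNEC_i."""
    return sum(c / p for c, p in zip(concentrations_ug_l, pnecs_ug_l))

# Hypothetical monitoring values for three substances (micrograms per litre).
concentrations = [0.05, 0.02, 0.10]
pnecs = [0.10, 0.50, 0.06]

rq = mixture_risk_quotient(concentrations, pnecs)
print(rq > 1.0)  # a quotient above 1 flags a potential mixture risk
```

Note how a single substance close to its PNEC (the third one here) can dominate the sum, which matches the observation that the exceedance is often driven by two or three main substances.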
The proposed procedures therefore allow the identification of the most problematic substances, for which management measures, such as a reduction of their input to the aquatic environment, should be envisioned. However, it was also shown that the risk levels associated with mixtures of the remaining compounds are not negligible: it is the sum of the substances that is problematic, which is more challenging in terms of risk management. Moreover, a lack of reliability in the procedures was highlighted, which can lead to contradictory results in terms of risk; this is linked to inconsistencies in the assessment factors applied in the different methods. In the second part of the thesis, the reliability of more advanced procedures to predict mixture effects on communities in aquatic systems was investigated. These established methodologies combine the model of concentration addition (CA) or response addition (RA) with species sensitivity distribution (SSD) curves. Mixture effect predictions were shown to be consistent only when the mixture models are applied to a single species, and not to several species simultaneously aggregated into SSDs. Hence, a more stringent procedure for mixture risk assessment is proposed: first apply the CA or RA models to each species separately and, in a second step, combine the results to build an SSD for the mixture. Unfortunately, this methodology is not applicable in most cases, because it requires large data sets that are usually not available. Therefore, the differences between the two methodologies were studied with artificially generated data sets, in order to characterize the robustness of the traditional approach of applying the mixture models directly to species sensitivity distributions.
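The two reference models named above have simple closed forms for a single species: under concentration addition the mixture ECx satisfies 1/ECx_mix = Σ p_i/ECx_i for mixture fractions p_i, while under response addition (independent action) the combined effect is E_mix = 1 − Π(1 − E_i). A minimal single-species sketch with hypothetical component values:

```python
def ca_mixture_ecx(fractions, ecx_values):
    """Concentration addition: ECx of a mixture from the component ECx values,
    where fractions are the relative proportions of each substance (sum to 1)."""
    return 1.0 / sum(p / e for p, e in zip(fractions, ecx_values))

def ra_mixture_effect(effect_fractions):
    """Response addition (independent action): combined effect fraction
    from the individual effect fractions at the tested concentrations."""
    unaffected = 1.0
    for e in effect_fractions:
        unaffected *= 1.0 - e
    return 1.0 - unaffected

# Two hypothetical substances in equal proportion with the same EC50:
# CA returns that shared EC50.
print(ca_mixture_ecx([0.5, 0.5], [10.0, 10.0]))  # 10.0
# Two substances each causing 50% effect: RA predicts 75% combined effect.
print(ra_mixture_effect([0.5, 0.5]))  # 0.75
```

The stricter procedure mentioned above would evaluate these formulas once per species and only then fit an SSD to the resulting per-species mixture ECx values, instead of applying them to the SSD itself.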
The results showed that the use of CA directly on SSDs might lead to underestimations of the mixture concentration affecting 5% or 50% of species, especially when substances present a large standard deviation in their species sensitivity distribution. The application of RA can lead to over- or underestimations, depending mainly on the slope of the dose-response curves of the individual species. The potential underestimation with RA becomes important when the ratio between the EC50 and the EC10 of the dose-response curves of the species composing the SSD is smaller than 100. However, considering common real cases of ecotoxicity data, the mixture risk calculated by applying the mixture models directly to SSDs remains consistent and would rather slightly overestimate the risk. These results can be used as a theoretical validation of the currently applied methodology. Nevertheless, when assessing the risk of mixtures, one has to keep in mind this source of error in the classical methodology, especially when SSDs present a data distribution outside the range determined in this study. Finally, in the last part of this thesis, mixture effect predictions were confronted with biological changes observed in the environment. In this study, the long-term monitoring of a great European lake, Lake Geneva, provided the opportunity to assess to what extent the predicted toxicity of herbicide mixtures explains changes in the composition of the phytoplankton community, next to other classical limnological parameters such as nutrients. To reach this goal, the mixture toxicity gradient of 14 herbicides regularly detected in the lake was calculated, using the concentration addition and response addition models. A decreasing temporal gradient of toxicity was observed from 2004 to 2009.
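The EC50/EC10 ratio criterion above maps directly onto the slope of a two-parameter log-logistic dose-response curve, E(c) = 1 / (1 + (EC50/c)^b): the ECx is EC50·(x/(1−x))^(1/b), so EC50/EC10 = 9^(1/b), and the ratio falls below 100 once the slope b exceeds about 0.48. A small sketch (the log-logistic form is a standard modelling choice, not necessarily the exact parameterization used in the thesis):

```python
def ecx_loglogistic(ec50, slope, x):
    """Concentration producing effect fraction x under a two-parameter
    log-logistic dose-response curve E(c) = 1 / (1 + (EC50 / c) ** slope)."""
    return ec50 * (x / (1.0 - x)) ** (1.0 / slope)

def ec50_ec10_ratio(slope):
    """EC50/EC10 ratio implied by the slope; equals 9 ** (1 / slope)."""
    return ecx_loglogistic(1.0, slope, 0.5) / ecx_loglogistic(1.0, slope, 0.1)

print(round(ec50_ec10_ratio(1.0)))   # 9: a steep curve, ratio well below 100
print(ec50_ec10_ratio(0.4) > 100.0)  # a shallow curve: ratio exceeds 100
```

In other words, the RA underestimation flagged in the thesis concerns species with steep dose-response curves (large b, small EC50/EC10 ratio).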
Redundancy analysis and partial redundancy analysis showed that this gradient explains a significant portion of the variation in phytoplankton community composition, even after removing the effect of all other co-variables. Moreover, some species shown to have been influenced, positively or negatively, by the decrease of toxicity in the lake over time displayed similar behaviors in mesocosm studies. It can be concluded that herbicide mixture toxicity is one of the key parameters explaining phytoplankton changes in Lake Geneva. To conclude, different methods exist to predict the risk of mixtures in ecosystems, but their reliability varies depending on the underlying hypotheses. One should therefore carefully consider these hypotheses, as well as the limits of the approaches, before using the results for environmental risk management.
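Redundancy analysis is, at its core, an ordination of the fitted values from a multivariate linear regression of the (centered) community matrix on the explanatory variables; the fraction of total community variance captured by those fitted values is the "explained portion" reported above. A minimal numpy sketch with synthetic data (in practice dedicated ecological software such as the R vegan package is used, and partial RDA would additionally regress out the co-variables first):

```python
import numpy as np

def rda_explained_fraction(Y, X):
    """Fraction of total variance of the community matrix Y (sites x species)
    explained by the explanatory matrix X (sites x variables), as in RDA."""
    Yc = Y - Y.mean(axis=0)
    Xc = X - X.mean(axis=0)
    B, *_ = np.linalg.lstsq(Xc, Yc, rcond=None)  # multivariate regression
    Y_fit = Xc @ B                               # projection onto span of Xc
    return float((Y_fit ** 2).sum() / (Yc ** 2).sum())

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))  # e.g. a toxicity gradient plus one nutrient
Y = X @ rng.normal(size=(2, 5)) + 0.1 * rng.normal(size=(30, 5))
print(0.0 <= rda_explained_fraction(Y, X) <= 1.0)  # True by construction
```

Because the fitted values are a least-squares projection, the returned fraction is always between 0 and 1, reaching 1 only when the community matrix is exactly linear in the explanatory variables.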