18 results for Many-to-many-assignment problem

Relevance: 100.00%

Abstract:

The hydrological and biogeochemical processes that operate in catchments influence the ecological quality of freshwater systems through the delivery of fine sediment, nutrients and organic matter. Most models that seek to characterise the delivery of diffuse pollutants from land to water are reductionist. The multitude of processes that are parameterised in such models to ensure generic applicability makes them complex and difficult to test on available data. Here, we outline an alternative, data-driven inverse approach. We apply SCIMAP, a parsimonious risk-based model with an explicit treatment of hydrological connectivity, and take a Bayesian approach to the inverse problem of determining the risk that must be assigned to different land uses in a catchment in order to explain the spatial patterns of measured in-stream nutrient concentrations. We apply the model to identify the key sources of nitrogen (N) and phosphorus (P) diffuse pollution risk in eleven UK catchments covering a range of landscapes. The model results show that: 1) some land uses generate a consistently high or low risk of diffuse nutrient pollution; 2) the risks associated with different land uses vary both between catchments and between nutrients; and 3) the dominant sources of P and N risk in a catchment are often a function of the spatial configuration of land uses. Taken on a case-by-case basis, this type of inverse approach may be used to help prioritise the focus of interventions to reduce diffuse pollution risk for freshwater ecosystems. (C) 2012 Elsevier B.V. All rights reserved.
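To picture the inverse step, the following is a minimal Python sketch of a Bayesian inversion of per-land-use risk weights against observed in-stream concentrations. It is not SCIMAP: the connectivity matrix A, the Gaussian likelihood and the random-walk Metropolis sampler are illustrative assumptions, and the data are synthetic.

import numpy as np

# Minimal sketch of a Bayesian inversion of land-use risk weights; not SCIMAP.
# A[i, j] stands in for the hydrologically connected contribution of land use j
# to monitoring site i; y holds the observed in-stream concentrations.
rng = np.random.default_rng(0)
n_sites, n_uses = 30, 4
A = rng.random((n_sites, n_uses))                 # illustrative connectivity weights
true_r = np.array([0.9, 0.1, 0.5, 0.2])           # "true" risks used to fake observations
y = A @ true_r + rng.normal(0.0, 0.05, n_sites)   # synthetic observations

def log_posterior(r, sigma=0.05):
    # Flat prior on [0, 1] per land use, Gaussian misfit to the observations.
    if np.any(r < 0.0) or np.any(r > 1.0):
        return -np.inf
    resid = y - A @ r
    return -0.5 * np.dot(resid, resid) / sigma**2

r = np.full(n_uses, 0.5)
lp = log_posterior(r)
samples = []
for _ in range(20000):                            # random-walk Metropolis
    prop = r + rng.normal(0.0, 0.02, n_uses)
    lp_prop = log_posterior(prop)
    if np.log(rng.random()) < lp_prop - lp:
        r, lp = prop, lp_prop
    samples.append(r.copy())

posterior = np.array(samples[5000:])              # discard burn-in
print("posterior mean risk per land use:", posterior.mean(axis=0).round(2))

Run across several catchments, the spread of the posterior means would play the role of results 1) and 2) above: land uses whose inferred risk is consistently high or low versus those whose risk varies between catchments and nutrients.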

Relevance: 100.00%

Abstract:

A major challenge in studying social behaviour stems from the need to disentangle the behaviour of each individual from the resulting collective. One way to overcome this problem is to construct a model of the behaviour of an individual and observe whether combining many such individuals leads to the predicted outcome. This can be achieved by using robots. In this review we discuss the strengths and weaknesses of such an approach for studies of social behaviour. We find that robots, whether studied in groups of simulated or physical robots or used to infiltrate and manipulate groups of living organisms, have important advantages over conventional individual-based models and have contributed greatly to the study of social behaviour. In particular, robots have increased our understanding of self-organization and of the evolution of cooperative behaviour and communication. However, the resulting findings have not had the desired impact on the biological community. We suggest reasons why this may be the case, and how the benefits of using robots can be maximized in future research on social behaviour.

Relevance: 100.00%

Abstract:

In Switzerland, the land management regime is characterized by a liberal attitude towards the institution of property rights, which is guaranteed by the Constitution. Under the present Swiss constitutional arrangement, authorities (municipalities) are required to take landowners' interests into account when implementing their spatial planning policy. In other words, the institution of property rights cannot easily be restricted in order to implement zoning plans and planning projects. This situation causes many problems. One of them is the gap between the way land is actually used by landowners and the way it should be used according to zoning plans. In fact, zoning plans only describe how landowners should use their property. There is no adequate provision for handling cases where the use is not in accordance with zoning plans. In particular, landowners may not be expropriated for a non-conforming use of the land. This situation often leads to the opening of new building areas in greenfields and to urban sprawl, which contradicts the goals set out in the Federal Law on Spatial Planning. In order to identify legal strategies of intervention to solve the problem, our paper is structured into three main parts. Firstly, we give a short description of the Swiss land management regime. Then, we focus on an innovative land management approach designed to implement zoning plans in accordance with property rights. Finally, we present a case study that shows the usefulness of the presented land management approach in practice. We develop three main results. Firstly, the land management approach provides a mechanism for involving landowners in planning projects. The principle of coordination between spatial planning goals and landowners' interests is the cornerstone of the whole process. Secondly, land use is improved in terms of both space and time. Finally, the institution of property rights is not challenged, since there is no expropriation and the market remains free.

Relevance: 100.00%

Abstract:

Combinatorial optimization involves finding an optimal solution in a finite set of options; many everyday problems are of this kind. However, the number of options grows exponentially with the size of the problem, so that an exhaustive search for the best solution is practically infeasible beyond a certain problem size. When efficient algorithms are not available, a practical approach to obtaining an approximate solution to the problem at hand is to start with an educated guess and gradually refine it until we have a good-enough solution. Roughly speaking, this is how local search heuristics work. These stochastic algorithms navigate the problem search space by iteratively turning the current solution into new candidate solutions, guiding the search towards better solutions. The search performance therefore depends on structural aspects of the search space, which in turn depend on the move operator being used to modify solutions. A common way to characterize the search space of a problem is through the study of its fitness landscape, a mathematical object comprising the space of all possible solutions, their value with respect to the optimization objective, and a neighborhood relationship defined by the move operator. The landscape metaphor is used to explain the search dynamics as a sort of potential function. The concept is indeed similar to that of potential energy surfaces in physical chemistry. Borrowing ideas from that field, we propose to extend to combinatorial landscapes the notion of the inherent network formed by energy minima in energy landscapes. In our case, the energy minima are the local optima of the combinatorial problem, and we explore several definitions for the network edges. At first, we perform an exhaustive sampling of the local optima's basins of attraction and define weighted transitions between basins by accounting for all the possible ways of crossing the frontier between basins via one random move. Then, we reduce the computational burden by only counting the chances of escaping a given basin via random kick moves that start at the local optimum. Finally, we approximate network edges from the search trajectories of simple search heuristics, mining the frequency and inter-arrival time with which the heuristics visit local optima. Through these methodologies, we build a weighted directed graph that provides a synthetic view of the whole landscape and that we can characterize using the tools of complex network science. We argue that this network characterization can advance our understanding of the structural and dynamical properties of hard combinatorial landscapes. We apply our approach to prototypical problems such as the Quadratic Assignment Problem, the NK model of rugged landscapes, and the Permutation Flow-shop Scheduling Problem. We show that some network metrics can differentiate problem classes, correlate with problem non-linearity, and predict problem hardness as measured by the performance of trajectory-based local search heuristics.
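As an illustration of the exhaustive variant of this construction, the short Python sketch below enumerates the local optima of a tiny random binary landscape, assigns every solution to its basin by best-improvement hill climbing, and counts one-flip transitions between basins to weight the network edges. The random landscape is a toy stand-in, not the NK, QAP or flow-shop instances studied in the thesis.

import itertools, random
from collections import defaultdict

# Tiny random binary landscape over 8-bit strings (toy stand-in for NK/QAP/PFSP).
N = 8
random.seed(1)
fitness = {s: random.random() for s in itertools.product((0, 1), repeat=N)}

def neighbours(s):
    # One-flip (Hamming distance 1) move operator.
    return [s[:i] + (1 - s[i],) + s[i+1:] for i in range(N)]

def hill_climb(s):
    # Best-improvement hill climbing; returns the local optimum labelling s's basin.
    while True:
        best = max(neighbours(s), key=fitness.__getitem__)
        if fitness[best] <= fitness[s]:
            return s
        s = best

# Exhaustive basin sampling: map every solution to its local optimum.
basin_of = {s: hill_climb(s) for s in fitness}

# Weighted directed edges: count one-flip moves that cross from one basin to another.
edges = defaultdict(int)
for s, b in basin_of.items():
    for t in neighbours(s):
        if basin_of[t] != b:
            edges[(b, basin_of[t])] += 1

optima = set(basin_of.values())
print(f"{len(optima)} local optima, {len(edges)} directed basin-to-basin edges")

On realistic instances the same idea has to rely on sampling rather than enumeration, which is exactly where the escape-move and trajectory-mining definitions of the edges described above come in.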

Relevance: 100.00%

Abstract:

The dynamical analysis of large biological regulatory networks requires the development of scalable methods for mathematical modeling. Following the approach initially introduced by Thomas, we formalize the interactions between the components of a network in terms of discrete variables, functions, and parameters. Model simulations result in directed graphs, called state transition graphs. We are particularly interested in reachability properties and asymptotic behaviors, which correspond to terminal strongly connected components (or "attractors") in the state transition graph. A well-known problem is the exponential increase of the size of state transition graphs with the number of network components, in particular when using the biologically realistic asynchronous updating assumption. To address this problem, we have developed several complementary methods enabling the analysis of the behavior of large and complex logical models: (i) the definition of transition priority classes to simplify the dynamics; (ii) a model reduction method preserving essential dynamical properties; and (iii) a novel algorithm to compact state transition graphs and directly generate compressed representations, emphasizing relevant transient and asymptotic dynamical properties. The power of an approach combining these different methods is demonstrated by applying them to a recent multilevel logical model of the network controlling CD4+ T helper cell response to antigen presentation and to a dozen cytokines. This model accounts for the differentiation of canonical Th1 and Th2 lymphocytes, as well as of inflammatory Th17 and regulatory T cells, along with many hybrid subtypes. All these methods have been implemented in the software GINsim, which enables the definition, analysis, and simulation of logical regulatory graphs.
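A minimal sketch of the object being analysed, assuming the networkx library and a made-up three-component Boolean model (this is not the CD4+ T cell model, nor GINsim's own algorithms): the asynchronous state transition graph is built explicitly and its attractors are read off as terminal strongly connected components.

import itertools
import networkx as nx

# Illustrative three-component Boolean model with asynchronous updating.
rules = {
    0: lambda s: int(not s[2]),              # x0 <- NOT x2
    1: lambda s: s[0],                       # x1 <- x0
    2: lambda s: int(s[1] and not s[0]),     # x2 <- x1 AND NOT x0
}

G = nx.DiGraph()
for state in itertools.product((0, 1), repeat=3):
    G.add_node(state)
    for i, f in rules.items():               # asynchronous updating: one component at a time
        target = f(state)
        if target != state[i]:
            succ = state[:i] + (target,) + state[i+1:]
            G.add_edge(state, succ)

# Terminal strongly connected components of the state transition graph are the attractors.
condensed = nx.condensation(G)
for scc_id in condensed.nodes:
    if condensed.out_degree(scc_id) == 0:
        members = condensed.nodes[scc_id]["members"]
        kind = "stable state" if len(members) == 1 else "cyclic attractor"
        print(kind, sorted(members))

The priority-class, reduction and compression methods listed in the abstract exist precisely because this explicit construction stops being feasible once the number of components grows.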

Relevance: 100.00%

Abstract:

As a result of recent welfare state transformations, most notably the reorientation of welfare states towards activation, the internal fragmentation of social security systems has emerged as a key policy problem in many western European countries. The types of response that have been adopted, however, vary substantially across countries, ranging from the encouragement of inter-agency collaboration to the outright merger of agencies. The purpose of this exploratory article is twofold. First, by proposing the concept of coordination initiatives, it aims to develop a better conceptualization of the cross-national diversity in responses to the fragmentation problem. Second, starting from existing theories of welfare state development and policy change, it presents initial hypotheses accounting for the variation observed in coordination initiatives.

Relevance: 100.00%

Abstract:

Purpose: Although young males encounter sex-related concerns, they are mostly absent from specialized services. Our objective is to assess whether boys use the internet to find answers to these types of problems and questions. Methods: In the context of a qualitative study assessing young males' barriers to accessing sexual and reproductive health facilities, we conducted two focus groups gathering 12 boys aged 17-20. Discussions were triggered through the presentation of four vignettes corresponding to questions posted by 17-20 year old boys and girls on an information website for adolescents (www.ciao.ch), concerning various sexual dysfunction situations. To avoid having to talk about their own experience, participants were asked what they would do in those cases. Results: In general, the internet was mentioned extensively, both as a means of searching for information through search engines and as a place to ask professionals for advice. Within the hierarchy of consultation possibilities, the internet ranked first as a way to deal with these types of problems, presenting many advantages: (1) it allows intimacy to be maintained; (2) it is anonymous (use of a pseudonym); (3) it avoids having to confront someone face-to-face with personal problems, which can be embarrassing and challenging for one's pride; (4) it is free; and (5) it is accessible at all times. In other words, participants value the internet as a positive tool for avoiding many of the barriers that prevent offline consultations from taking place. Most participants consider the internet at least as a first step in trying to solve a problem, for instance by better defining the seriousness of a problem and judging whether it is worth consulting a doctor. However, despite the positive qualities of the internet, they emphasized the importance of having specialists answer questions, of trustworthiness, and of being followed up by the same person. Participants suggested that one strategy to break down the barriers that keep boys from consulting in face-to-face settings is to offer a consultation on the internet as a first step, which could then guide the person to an in-person consultation if necessary. Conclusions: The internet as a means of obtaining information or consulting received high marks overall. Although the internet cannot replace an in-person consultation, the screen and the keyboard have the advantage of not involving a face-to-face encounter and raise the possibility of discussing sexual problems anonymously and in private. Internet tools, together with other new technologies, should continue to develop in a secure manner as a space providing prevention messages and as an easily accessible entry point to sexual and reproductive health services for young men, which can then guide youths to appropriate resource persons. Sources of support: This study was supported by the Maurice Chalumeau Foundation, Switzerland.

Relevance: 100.00%

Abstract:

Carbon isotope ratio (CIR) analysis has been routinely and successfully used in sports drug testing for many years to uncover the misuse of endogenous steroids. One limitation of the method is the availability of steroid preparations exhibiting CIRs equal to those of endogenous steroids. To overcome this problem, hydrogen isotope ratios (HIR) of endogenous urinary steroids were investigated as a potential complement; results obtained from a reference population of 67 individuals are presented herein. An established sample preparation method was modified and improved to enable separate measurements of each analyte of interest where possible. From the fraction of glucuronidated steroids, pregnanediol, 16-androstenol, 11-ketoetiocholanolone, androsterone (A), etiocholanolone (E), dehydroepiandrosterone (D), 5α- and 5β-androstanediol, testosterone and epitestosterone were included. In addition, sulfate conjugates of A, E, D, epiandrosterone and 17α- and 17β-androstenediol were considered and analyzed after acidic solvolysis. The obtained results enabled the calculation of the first reference-population-based thresholds for the HIR of urinary steroids, which can readily be applied to routine doping control samples. Proof of concept was accomplished by investigating urine specimens collected after a single oral application of testosterone undecanoate. The HIR of most testosterone metabolites were found to be significantly influenced by the exogenous steroid, beyond the established threshold values. Additionally, one regular doping control sample with an extraordinary testosterone/epitestosterone ratio of 100, but without a suspicious CIR, was subjected to the complementary methodology of HIR analysis. The HIR data eventually provided evidence for the exogenous origin of the urinary testosterone metabolites. Although further investigations of HIR are advisable to corroborate the presented reference-population-based thresholds, the developed method proved to be a new tool supporting modern sports drug testing procedures.
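The abstract does not spell out how the reference-population-based thresholds are derived. Purely as an illustration of the idea, the Python sketch below assumes the common convention of mean plus or minus three standard deviations over 67 reference values; the statistic actually used in the study may differ, and the numbers are synthetic.

import numpy as np

# Illustrative only: synthetic delta-2H values standing in for the reference
# population of 67 individuals; the real study's threshold statistic may differ.
rng = np.random.default_rng(0)
reference_d2H = rng.normal(-250.0, 15.0, 67)      # per mil, hypothetical values

mean, sd = reference_d2H.mean(), reference_d2H.std(ddof=1)
lower, upper = mean - 3.0 * sd, mean + 3.0 * sd   # assumed mean +/- 3 SD convention
print(f"reference interval: [{lower:.1f}, {upper:.1f}] per mil")

sample_value = -180.0                             # hypothetical doping-control sample
flagged = not (lower <= sample_value <= upper)
print("suspicious" if flagged else "within reference range")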

Relevance: 100.00%

Abstract:

Our consumption of groundwater, in particular as drinking water and for irrigation, has increased considerably over the years, and groundwater is becoming an increasingly scarce and endangered resource. Nowadays, we face many problems ranging from water prospection to sustainable management and remediation of polluted aquifers. Independently of the hydrogeological problem considered, the main challenge remains dealing with the incomplete knowledge of the underground properties. Stochastic approaches have been developed to represent this uncertainty by considering multiple geological scenarios and generating a large number of geostatistical realizations. The main limitation of these approaches is the computational cost associated with performing complex flow simulations for each realization.

In the first part of the thesis, this issue is explored in the context of uncertainty propagation, where an ensemble of geostatistical realizations is identified as representative of the subsurface uncertainty. To propagate this lack of knowledge to the quantity of interest (e.g., the concentration of pollutant in extracted water), it is necessary to evaluate the flow response of each realization. Due to computational constraints, state-of-the-art methods make use of approximate flow simulations to identify a subset of realizations that represents the variability of the ensemble. The complex and computationally heavy flow model is then run only for this subset, and inference is made on the basis of these responses. Our objective is to increase the performance of this approach by using all of the available information and not solely the subset of exact responses. Error models are proposed to correct the approximate responses following a machine learning approach: for the subset identified by a classical approach (here the distance kernel method), both the approximate and the exact responses are known, and this information is used to construct an error model, correct the remaining approximate responses, and predict the expected responses of the exact model. The proposed methodology makes use of all the available information without perceptible additional computational cost and leads to a more accurate and more robust uncertainty propagation.

The strategy explored in the first chapter consists in learning, from a subset of realizations, the relationship between the approximate and exact flow models. In the second part of the thesis, this strategy is formalized in a rigorous mathematical framework by defining a regression model between functional responses. As this problem is ill-posed, it is necessary to reduce its dimensionality. The novelty of the work comes from the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also allows a diagnostic of the quality of the error model in the functional space. The proposed methodology is applied to a pollution problem involving a non-aqueous phase liquid, and the results show that the error model strongly reduces the computational cost while correctly estimating the uncertainty. Moreover, for each approximate response, the error model provides a prediction of the exact response, opening the door to many applications.

The concept of a functional error model is therefore relevant not only for uncertainty propagation but also for Bayesian inference problems. Markov chain Monte Carlo (MCMC) methods are the algorithms most commonly used to generate geostatistical realizations in accordance with the observations. However, this approach suffers from low acceptance rates in high-dimensional problems, resulting in a large number of wasted flow simulations. Two-stage MCMC was introduced to avoid unnecessary simulations of the complex flow model through a preliminary evaluation of the proposal. In the third part of the thesis, the approximate flow model coupled with an error model serves as this preliminary evaluation for two-stage MCMC. We demonstrate an increase in acceptance rate by a factor of 1.5 to 3 compared with a classical one-stage MCMC implementation.

An open question remains: how to choose the size of the training set and how to identify the realizations that optimize the construction of the error model. This requires an iterative strategy, such that, as new flow simulations are performed, the error model is improved by incorporating the new information. This is developed in the fourth part of the thesis, in which the methodology is applied to a problem of saline intrusion in a coastal aquifer.
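To illustrate the core idea of the functional error model, the Python sketch below learns, on a small training subset, a linear map between proxy and exact response curves in a reduced principal-component space and then corrects the remaining proxy curves. It is a simplification: the curves are synthetic, plain PCA stands in for functional PCA, and no flow simulator is involved.

import numpy as np

# Illustrative functional error model: correct cheap proxy curves using a few
# exact curves. Synthetic data; PCA used as a stand-in for functional PCA.
rng = np.random.default_rng(0)
n_real, n_train, n_t = 200, 20, 100               # realizations, training subset, time steps
t = np.linspace(0.0, 1.0, n_t)

# Synthetic stand-ins for exact and proxy responses (proxy = biased, noisy exact).
exact = np.array([np.exp(-((t - rng.uniform(0.3, 0.7)) / rng.uniform(0.05, 0.15))**2)
                  for _ in range(n_real)])
proxy = 0.8 * exact + 0.05 * rng.standard_normal(exact.shape) + 0.1

train = rng.choice(n_real, n_train, replace=False)

def pca(X, k):
    # Mean curve and first k principal components via SVD.
    mean = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]

k = 5
mp, Vp = pca(proxy[train], k)                     # proxy-space basis
me, Ve = pca(exact[train], k)                     # exact-space basis

# Least-squares regression from proxy scores to exact scores.
Sp = (proxy[train] - mp) @ Vp.T
Se = (exact[train] - me) @ Ve.T
B, *_ = np.linalg.lstsq(Sp, Se, rcond=None)

# Correct all proxy curves and check the error reduction on held-out realizations.
corrected = ((proxy - mp) @ Vp.T @ B) @ Ve + me
test = np.setdiff1d(np.arange(n_real), train)
print("proxy RMSE    :", np.sqrt(np.mean((proxy[test] - exact[test])**2)).round(3))
print("corrected RMSE:", np.sqrt(np.mean((corrected[test] - exact[test])**2)).round(3))

The same kind of corrected response is what would provide the cheap preliminary evaluation in the two-stage MCMC of the third part.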

Relevance: 100.00%

Abstract:

Deliberate fires appear to be borderless and timeless events that create a serious security problem. There have been many attempts to develop approaches to tackle this problem, but unfortunately acting effectively against deliberate fires has proven a complex challenge. This article reviews the current situation relating to deliberate fires: what do we know, how serious is the situation, how is it being dealt with, and what challenges are faced when developing a systematic and global methodology to tackle the issues? The repetitive nature of some types of deliberate fires will also be discussed. Finally, drawing on the reality of repetition within deliberate fires and encouraged by successes obtained with other repetitive crimes (such as property crimes or drug trafficking), we will argue that using the intelligence process cycle as a framework for the follow-up and systematic analysis of fire events is a relevant approach. This is the first article in a series of three. It introduces the context and discusses the background issues in order to provide better underpinning knowledge to managers and policy makers planning to tackle this issue. The second part will present a methodology developed to detect and identify repetitive fire events from a set of data, and the third part will discuss the analysis of these data to produce intelligence.

Relevance: 100.00%

Abstract:

Sexually transmitted infections are a major problem for medicine and for public health services worldwide. More than 30 sexually transmissible pathogenic micro-organisms are known, including bacteria, viruses, fungi, protozoa and ectoparasites. According to estimates from the World Health Organisation, more than 333 million bacterial sexually transmitted infections occur worldwide per year. Sexually transmitted infections, by their nature, affect individuals within partnerships and larger sexual networks, and in turn populations. This report focuses on three bacterial sexually transmitted infections in Switzerland: Chlamydia trachomatis, Neisseria gonorrhoeae and Treponema pallidum (syphilis). The prevalence of these infections has been increasing alarmingly for a decade. All three infections can be asymptomatic, and their diagnosis and treatment can therefore occur too late or, worse, not at all, even though treatments are available. This is an important problem, as untreated sexually transmitted infections may cause complications such as ascending infections, infertility, ectopic pregnancies and serious long-term neurological sequelae. The consequences of these infections should not be underestimated. They constitute a significant public health burden as well as a serious financial burden. Increases in chlamydia, syphilis and gonorrhea infections have also been observed in many European countries. Countries where rising numbers of sexually transmitted infections have been observed have reacted in different ways. Some have developed clinical guidelines or implemented screening programs, while others are still in an observational phase. The aim of this thesis is to assess whether Switzerland is doing enough regarding the prevention of chlamydia, syphilis and gonorrhea infections. After first describing the infections, the surveillance systems for sexually transmitted infections are assessed, then the epidemiological trends of these three infections are described, and finally the prevention measures implemented in Switzerland to respond to the increasing number of infections are described. The reaction of the United Kingdom to the same problem is reported for comparison. [Author, p. 7]

Relevance: 100.00%

Abstract:

OBJECTIVES: Tissue engineering methods can be applied to regenerate diseased or congenitally missing urinary tract tissues. Urinary tract tissue cell cultures must be established in vitro, and adequate matrices, acting as cell carriers, must be developed. Although degradable and nondegradable polymer matrices offer adequate mechanical stability, they are not optimal for cell adherence and growth. To overcome this problem, extracellular matrix proteins, which permit cell adhesion and regulate cell proliferation and differentiation, can be adsorbed onto the surface-modified polymer. METHODS: In this study, nondegradable polymer films of poly(ethylene terephthalate) were used as an experimental model. Films were modified by graft polymerization of acrylic acid to subsequently allow immobilization of collagen types I and III. The subsequent adhesion and proliferation of human urothelial cells, and the induction of their stratification, were analyzed. RESULTS: Collagen adsorption on 0.2 µg/cm2 poly(acrylic acid)-grafted polymer films rendered the matrix suitable for human urothelial cell adhesion and proliferation. Furthermore, stratification of urothelial cells was demonstrated on these surface-modified matrices. CONCLUSIONS: These results show that surface-modified polymer matrices can act as cell carriers for cultured human urothelial cells. Such a cell-matrix construct could be applied in reparative surgery of the urinary tract.

Relevance: 100.00%

Abstract:

QUESTIONS UNDER STUDY: Our aim was to identify the barriers young men face in consulting a health professional when they encounter sexual dysfunctions, and where they turn, if they do, for answers. METHODS: We conducted an exploratory qualitative study including 12 young men aged 16-20 years, seen in two focus groups. Discussions were triggered through vignettes about sexual dysfunction. RESULTS: Young men preferred not to talk about sexual dysfunction problems with anyone and to solve them alone, as these are considered an intimate and embarrassing subject that can negatively affect their masculinity. Confidentiality appeared to be the most important criterion for disclosing an intimate subject to a health professional. Participants raised the problem of males' access to services and their lack of reasons to consult. Two criteria for addressing a problem were whether it was long-lasting or considered physical. The internet was unanimously considered an initial means of solving a problem, which could guide them to a face-to-face consultation if necessary. CONCLUSIONS: The results suggest that internet-based tools should be developed to become an easy access door to sexual health services for young men. Wherever they consult and for whatever problem, sexual health must be on the agenda.

Relevance: 100.00%

Abstract:

Chloride channels represent a group of targets for major clinical indications. However, molecular screening for chloride channel modulators has proven difficult and time-consuming, as current approaches essentially rely on the use of fluorescent dyes or invasive patch-clamp techniques, which do not lend themselves to the screening of large sets of compounds. To address this problem, we have developed a non-invasive optical method, based on digital holographic microscopy (DHM), that allows ion channel activity to be monitored without using any electrode or fluorescent dye. To illustrate this approach, GABA(A)-mediated chloride currents were monitored with DHM. In practice, we show that DHM can non-invasively provide a quantitative determination of the transmembrane chloride fluxes mediated by the activation of chloride channels associated with GABA(A) receptors. Indeed, through an original algorithm, chloride currents elicited by the application of appropriate agonists of the GABA(A) receptor can be derived from the quantitative phase signal recorded with DHM. Finally, DHM allows chloride currents to be determined and pharmacologically characterized non-invasively and simultaneously over a large sample of cells.