946 results for Towards Seamless Integration of Geoscience Models and Data


Relevance:

100.00%

Publisher:

Abstract:

This study analyzed the health and overall landcover of citrus crops in Florida. The analysis was completed using Landsat satellite imagery available free of charge from the University of Maryland Global Landcover Change Facility. The project hypothesized that combining citrus production (economic) data with citrus area per county derived from spectral signatures would yield correlations between observable spectral reflectance throughout the year and the fiscal impact of citrus on local economies. A positive correlation between these two data types would make it possible to predict the economic impact of citrus by using spectral data analysis to estimate final crop harvests.
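The hypothesized area-to-economics correlation can be sketched numerically; the county figures below are invented for illustration, not the study's data:

```python
import numpy as np

# Hypothetical per-county data: citrus area (km^2) estimated by counting
# pixels classified as citrus in a Landsat scene, and reported production
# value. All numbers are illustrative assumptions.
citrus_area_km2 = np.array([120.0, 85.5, 210.3, 40.2, 155.8])   # from spectral classification
production_musd = np.array([95.0, 60.2, 170.4, 33.1, 118.9])    # county production, million USD

# Pearson correlation between spectrally derived area and economic output
r = np.corrcoef(citrus_area_km2, production_musd)[0, 1]

# Simple linear fit: predict production from classified citrus area
slope, intercept = np.polyfit(citrus_area_km2, production_musd, 1)
predicted = slope * citrus_area_km2 + intercept

print(f"Pearson r = {r:.3f}")
```

A strong positive `r` would support using the classified area as an economic proxy, as the abstract proposes.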

Due to rapid advances in computing and sensing technologies, enormous amounts of data are generated every day in various applications. The integration of data mining and data visualization is widely used to analyze these massive, complex data sets and discover hidden patterns. For both data mining and visualization to be effective, it is important to embed visualization techniques in the mining process and to present the discovered patterns in a more comprehensible visual form. In this dissertation, four related problems are studied to explore the integration of data mining and data visualization: dimensionality reduction for visualizing high-dimensional datasets, visualization-based clustering evaluation, interactive document mining, and exploration of multiple clusterings. In particular, we 1) propose an efficient feature selection method (reliefF + mRMR) for preprocessing high-dimensional datasets; 2) present DClusterE, which integrates cluster validation with user interaction and provides rich visualization tools for examining document clustering results from multiple perspectives; 3) design two interactive document summarization systems that involve user effort and generate customized summaries from 2D sentence layouts; and 4) propose a new framework that organizes different input clusterings into a hierarchical tree structure and allows interactive exploration of multiple clustering solutions.
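The dissertation's reliefF + mRMR pipeline is not detailed here; as a rough illustration of the mRMR half only, a greedy selection over discretized mutual information (with the Iris data as a stand-in for a high-dimensional corpus) might look like:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.metrics import mutual_info_score

# Stand-in data; discretize each feature into quartile bins so mutual
# information can be estimated from contingency tables.
X, y = load_iris(return_X_y=True)
Xd = np.column_stack([
    np.digitize(col, np.quantile(col, [0.25, 0.5, 0.75])) for col in X.T
])

n_features, k = Xd.shape[1], 2
relevance = [mutual_info_score(Xd[:, j], y) for j in range(n_features)]

# Greedy mRMR: pick the feature maximizing relevance to the label
# minus mean redundancy with the already-selected features.
selected = [int(np.argmax(relevance))]
while len(selected) < k:
    candidates = [j for j in range(n_features) if j not in selected]
    scores = [
        relevance[j]
        - np.mean([mutual_info_score(Xd[:, j], Xd[:, s]) for s in selected])
        for j in candidates
    ]
    selected.append(candidates[int(np.argmax(scores))])
```

In practice, a reliefF pass would first score features by how well they separate neighbouring instances, with mRMR then pruning redundant ones as above.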

Hydrophobicity, as measured by Log P, is an important molecular property related to toxicity and carcinogenicity. With increasing public health concern over the effects of Disinfection By-Products (DBPs), there is considerable benefit in developing Quantitative Structure-Activity Relationship (QSAR) models capable of accurately predicting Log P. In this research, Log P values of 173 DBP compounds in six functional classes were used to develop QSAR models by Multiple Linear Regression (MLR) on three molecular descriptors: Energy of the Lowest Unoccupied Molecular Orbital (ELUMO), Number of Chlorine atoms (NCl), and Number of Carbon atoms (NC). The QSAR models were validated according to the Organization for Economic Co-operation and Development (OECD) principles, and the model Applicability Domain (AD) and mechanistic interpretation were explored. Considering the very complex nature of DBPs, the established QSAR models performed well with respect to goodness-of-fit, robustness, and predictability: the predicted Log P values were significant, with correlation coefficients R2 from 81% to 98%. The Leverage Approach with Williams Plots was applied to detect and remove outliers, increasing R2 by approximately 2% to 13% across the DBP classes. The models were statistically validated for predictive power by the Leave-One-Out (LOO) and Leave-Many-Out (LMO) cross-validation methods. Finally, Monte Carlo simulation was used to assess the variations and inherent uncertainties in the QSAR models and to determine the most influential parameters for Log P prediction.
The developed QSAR models in this dissertation will have a broad applicability domain because the research data set covered six out of eight common DBP classes, including halogenated alkane, halogenated alkene, halogenated aromatic, halogenated aldehyde, halogenated ketone, and halogenated carboxylic acid, which have been brought to the attention of regulatory agencies in recent years. Furthermore, the QSAR models are suitable to be used for prediction of similar DBP compounds within the same applicability domain. The selection and integration of various methodologies developed in this research may also benefit future research in similar fields.
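As a sketch of the MLR and LOO steps described above, with invented descriptor values rather than the study's 173-compound dataset:

```python
import numpy as np

# Hypothetical descriptor values for a few DBP-like compounds:
# columns are ELUMO (eV), NCl, NC; y is measured Log P. Illustrative only.
X = np.array([
    [0.45, 1, 1],
    [0.12, 2, 1],
    [-0.20, 3, 1],
    [0.30, 1, 2],
    [-0.05, 2, 2],
    [0.10, 3, 3],
])
y = np.array([0.9, 1.5, 2.1, 1.3, 1.9, 2.6])

A = np.column_stack([np.ones(len(X)), X])       # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)    # MLR fit

# Leave-One-Out (LOO) cross-validation of the same model form
press = 0.0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    c, *_ = np.linalg.lstsq(A[mask], y[mask], rcond=None)
    press += (y[i] - A[i] @ c) ** 2

q2 = 1 - press / np.sum((y - y.mean()) ** 2)
```

The LMO variant simply leaves out several compounds per fold; the Monte Carlo step would perturb descriptors within their uncertainty and refit.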

Article Accepted Date: 29 May 2014 Acknowledgements The authors gratefully acknowledge the support of the Cognitive Science Society for the organisation of the Workshop on Production of Referring Expressions: Bridging the Gap between Cognitive and Computational Approaches to Reference, from which this special issue originated. Funding Emiel Krahmer and Albert Gatt thank The Netherlands Organisation for Scientific Research (NWO) for VICI grant Bridging the Gap between Computational Linguistics and Psycholinguistics: The Case of Referring Expressions (grant number 277-70-007).

Human use of the oceans is increasingly in conflict with conservation of endangered species. Methods for managing the spatial and temporal placement of industries such as military, fishing, transportation and offshore energy have historically been post hoc; that is, the time and place of human activity is often already determined before environmental impacts are assessed. In this dissertation, I build robust species distribution models in two case study areas, the US Atlantic (Best et al. 2012) and British Columbia (Best et al. 2015), predicting presence and abundance, respectively, from scientific surveys. These models are then applied in novel decision frameworks that preemptively suggest optimal placement of human activities in space and time to minimize ecological impacts: siting offshore wind energy development, and routing ships to minimize the risk of striking whales. Both decision frameworks present the tradeoff between conservation risk and industry profit through synchronized variable and map views, as online spatial decision support systems.

For siting offshore wind energy development (OWED) in the US Atlantic (chapter 4), bird density maps are combined across species, weighted by OWED sensitivity to collision and displacement, and 10 km² sites are compared against OWED profitability based on average annual wind speed at 90 m hub height and distance to the transmission grid. A spatial decision support system enables toggling between the map and tradeoff plot views by site. A selected site can be inspected for sensitivity to cetaceans throughout the year, so as to identify the months that minimize episodic impacts of pre-operational activities such as seismic airgun surveying and pile driving.

Routing ships to avoid whale strikes (chapter 5) can similarly be viewed as a tradeoff, but is a different problem spatially. A cumulative cost surface is generated from density surface maps and the conservation status of cetaceans, and then applied as a resistance surface to calculate least-cost routes between start and end locations, i.e. ports and entrance locations to study areas. Varying a multiplier on the cost surface enables calculation of multiple routes with different costs to conservation of cetaceans versus costs to the transportation industry, measured as distance. As in the siting chapter, a spatial decision support system enables toggling between the map and tradeoff plot views of proposed routes. The user can also input arbitrary start and end locations to calculate the tradeoff on the fly.
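The least-cost routing step can be sketched as Dijkstra's algorithm over a resistance grid; the cost surface and multiplier values below are illustrative assumptions, not the dissertation's data:

```python
import heapq
import numpy as np

def least_cost_route(cost, start, end):
    """Dijkstra over a 2D resistance grid with 4-connected moves;
    returns the accumulated cost of the cheapest path."""
    rows, cols = cost.shape
    dist = np.full(cost.shape, np.inf)
    dist[start] = cost[start]
    pq = [(cost[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == end:
            return d
        if d > dist[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return np.inf

# Hypothetical inputs: a per-cell distance cost plus a conservation cost
# (e.g. whale density x conservation status), scaled by a multiplier.
rng = np.random.default_rng(0)
whale_cost = rng.random((20, 20))
routes = {}
for mult in (0.0, 1.0, 5.0):
    total_cost = 1.0 + mult * whale_cost   # 1.0 ~ per-cell distance
    routes[mult] = least_cost_route(total_cost, (0, 0), (19, 19))
```

Sweeping the multiplier, as here, traces out the route family behind the tradeoff plot: higher multipliers buy lower conservation cost at the price of longer routes.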

Essential inputs to these decision frameworks are the species distributions. The two preceding chapters comprise species distribution models from the two case study areas, the US Atlantic (chapter 2) and British Columbia (chapter 3), predicting presence and density, respectively. Although density is preferred for estimating potential biological removal, per Marine Mammal Protection Act requirements in the US, the necessary parameters, especially distance and angle of observation, are less readily available across publicly mined datasets.

In the case of predicting cetacean presence in the US Atlantic (chapter 2), I extracted datasets from the online OBIS-SEAMAP geo-database and integrated scientific surveys conducted by ship (n=36) and aircraft (n=16), weighting a Generalized Additive Model by minutes surveyed within space-time grid cells to harmonize effort between the two survey platforms. For each of 16 cetacean species guilds, I predicted the probability of occurrence from static environmental variables (water depth, distance to shore, distance to continental shelf break) and time-varying conditions (monthly sea-surface temperature). To generate maps of presence vs. absence, Receiver Operating Characteristic (ROC) curves were used to define the optimal threshold that minimizes false positive and false negative error rates. I integrated model outputs, including tables (species in guilds, input surveys) and plots (fit of environmental variables, ROC curve), into an online spatial decision support system, allowing easy navigation of models by taxon, region, season, and data provider.
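The threshold selection step can be sketched with scikit-learn's `roc_curve`; the probabilities below are simulated stand-ins, not model output from the dissertation:

```python
import numpy as np
from sklearn.metrics import roc_curve

# Simulated model output: predicted occurrence probabilities and
# observed presence/absence for one species guild.
rng = np.random.default_rng(42)
y_true = rng.integers(0, 2, size=200)
y_prob = np.clip(0.3 * y_true + rng.normal(0.35, 0.2, size=200), 0, 1)

fpr, tpr, thresholds = roc_curve(y_true, y_prob)

# Choose the threshold that jointly minimizes false positive and false
# negative rates, i.e. maximizes Youden's J = TPR - FPR.
best = np.argmax(tpr - fpr)
threshold = thresholds[best]
presence_map = (y_prob >= threshold).astype(int)
```

Applying the chosen threshold cell-by-cell converts a probability surface into the presence/absence maps the chapter describes.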

For predicting cetacean density within the inner waters of British Columbia (chapter 3), I calculated density from systematic, line-transect marine mammal surveys over multiple years and seasons (summer 2004, 2005, 2008, and spring/autumn 2007) conducted by Raincoast Conservation Foundation. Abundance estimates were calculated using two different methods: Conventional Distance Sampling (CDS) and Density Surface Modelling (DSM). CDS generates a single density estimate for each stratum, whereas DSM explicitly models spatial variation and offers potential for greater precision by incorporating environmental predictors. Although DSM yields a more relevant product for the purposes of marine spatial planning, CDS has proven to be useful in cases where there are fewer observations available for seasonal and inter-annual comparison, particularly for the scarcely observed elephant seal. Abundance estimates are provided on a stratum-specific basis. Steller sea lions and harbour seals are further differentiated by ‘hauled out’ and ‘in water’. This analysis updates previous estimates (Williams & Thomas 2007) by including additional years of effort, providing greater spatial precision with the DSM method over CDS, novel reporting for spring and autumn seasons (rather than summer alone), and providing new abundance estimates for Steller sea lion and northern elephant seal. In addition to providing a baseline of marine mammal abundance and distribution, against which future changes can be compared, this information offers the opportunity to assess the risks posed to marine mammals by existing and emerging threats, such as fisheries bycatch, ship strikes, and increased oil spill and ocean noise issues associated with increases of container ship and oil tanker traffic in British Columbia’s continental shelf waters.

Starting with marine animal observations at specific coordinates and times, I combine these data with environmental data, often satellite-derived, to produce seascape predictions generalizable in space and time. These habitat-based models enable prediction of encounter rates and, in the case of density surface models, abundance, which can then be applied to management scenarios. Specific human activities, OWED and shipping, are then compared within a tradeoff decision support framework with interchangeable map and tradeoff plot views. These products make complex processes transparent, allowing conservation interests, industry and stakeholders to explore scenarios towards optimal marine spatial management, fundamental to the tenets of marine spatial planning, ecosystem-based management and dynamic ocean management.

In June 2015, the legal framework of the Asian Infrastructure Investment Bank was signed by its 57 founding members. Proposed and initiated by China, this multilateral development bank is considered an Asian counterpart intended to break the monopoly of the World Bank and the International Monetary Fund. In October 2015, China's central bank announced a benchmark interest rate cut to combat the economic slowdown. The easing policy coincided with the European Central Bank's announcement of doubts over the US Fed's commitment to raising interest rates. Global stock markets responded positively to China's move, with the exception of the Wall Street indexes (Bland, 2015; Elliott, 2015). In the meantime, China's 'One Belt, One Road' (or New Silk Road Economic Belt) became a topic of discourse in relation to its growing global economy, as China pledged $40 billion to trade and infrastructure projects (Bermingham, 2015). The foreign policy aims to reinforce the economic belt from western China through Central Asia towards Europe, as well as to construct maritime trading routes from coastal China through the South China Sea (Summers, 2015). In 2012, The Economist launched a new China section to cover the complexity of the 'meteoric rise' of China. John Micklethwait, then the chief editor of the magazine, said that China's emergence as a global power justified giving it a section of its own (Roush, 2012). In July 2015, Hu Shuli, the former chief editor of Caijing, announced the launch of a think tank and financial data service division called Caixin Insight Group, which encompasses the new Caixin China Purchasing Managers Index (PMI). In cooperation with Markit Group, a principal global provider of PMI data, the index soon became a widely cited economic indicator.
One anecdote from November's Caixin shows how much has changed: in a high-profile dialogue between Hu Shuli and Kevin Rudd, Hu insisted on asking questions in English; interestingly, the former Prime Minister of Australia insisted on replying in Chinese. These recent developments point to one thing: the economic ascent of China and its increasing influence on the power play between economics and politics in world markets. China has begun to take a more active role in rule making and enforcement under neoliberal frameworks. However, due to the country's size and the scale of its economy in comparison to other countries, China's version of globalisation has unique characteristics. The 'capitalist-socialist' paradox is vital to China's market-oriented transformation. In order to comprehend how such unique features are articulated and understood, several questions are worth investigating in the realm of media and communication studies, such as how China's neoliberal restructuring is portrayed and perceived by different types of interested parties, and how these portrayals are de-contextualised and re-contextualised in global or Anglo-American narratives. Therefore, drawing together the themes of globalisation, financial media and China's economic integration, this thesis explores how financial media construct narratives of China's economic globalisation, deploying comparative and multi-disciplinary approaches. Two outstanding elite financial magazines, Britain's The Economist, which has a global readership and influence, and Caijing, China's leading financial magazine, are chosen as case studies to exemplify differing media discourses, representing, respectively, Anglo-American and Chinese socio-economic and political backgrounds, as well as their own journalistic cultures. This thesis tries to answer the questions of how and why China's neoliberal restructuring is constructed from a globally-oriented perspective.
The construction primarily involves people who are influential in business and policymaking. Hence, the analysis falls within the paradigm of elite-elite communication, an important but relatively underdeveloped perspective in studying China and its globalisation. The comparison of the characteristics of narrative construction results from textual analysis of articles published over a ten-year period (mid-1998 to mid-2008). The corpus of samples comes from the two media outlets' coverage of three selected events: China becoming a member of the World Trade Organization, its outward direct investment, and the listing of stocks of Chinese companies on overseas exchanges; these events are mutually exclusive in sample collection and collectively exhaustive in their inclusion of articles on China's economic globalisation. The findings help show that, despite language, socio-economic and political differences, elite financial media with globally-oriented readerships share similar methods of and approaches to agenda setting, the evaluation of news prominence, the selection of frames, and the advocacy of deeply rooted neoliberal ideas. The comparison of their distinctive features reflects the different phases of building a sense of identity in their readers as global elites, as well as the different economic interests aligned with the corresponding readerships. However, textual analysis is only relevant for exploring how the narratives are constructed and the elements they include; textual analysis alone prevents us from seeing the obstacles and constraints of the journalistic practices of construction.
Therefore, this thesis provides a brief discussion of interviews with practitioners from the two media, in order to understand how similar or different narratives are manifested and perceived, how the concept of neoliberalism deviates from and is justified in the Chinese context, and how and for what purpose deviations arise between Western and Chinese contexts. The thesis also contributes to defining financial media in the domain of elite communication. The relevant and closely interlocking concepts of globalisation, elitism and neoliberalism are discussed and used as a theoretical bedrock in the analysis of texts and contexts. It is important to address the agenda-setting and ideological role of elite financial media because of their narrative formula of infusing business facts with opinions, which is important in constructing the global elite identity as well as in influencing neoliberal policy-making. On the other hand, 'journalistic professionalism' has been redefined, in that the elite identity is shared by the content producer, the reader and the actors in the news stories emerging from the much-compressed news cycle. The professionalism of elite financial media thus requires a dual definition: being professional in the understanding of business facts and statistics, and being professional in making sense of stories by deploying economic logic.

An investigation into karst hazard in southern Ontario has been undertaken with the intention of leading to the development of predictive karst models for this region. These are not currently feasible because of a lack of sufficient karst data, though this is not entirely due to a lack of karst features. Geophysical data were collected at Lake on the Mountain, Ontario as part of this karst investigation, in order to test the long-standing hypothesis that Lake on the Mountain was formed by a sinkhole collapse. Sub-bottom acoustic profiling data were collected to image the lake-bottom sediments and bedrock. Vertical bedrock features interpreted as solutionally enlarged fractures were taken as evidence for karst processes on the lake bottom. Additionally, the bedrock topography shows a narrower and more elongated basin than was previously identified, lying parallel to a mapped fault system in the area. This suggests that Lake on the Mountain formed over a fault zone, which also supports the sinkhole hypothesis, as a fault zone would provide groundwater pathways for karst dissolution. Previous sediment cores suggest that Lake on the Mountain formed at some point during the Wisconsinan glaciation, with glacial meltwater and glacial loading as potential contributing factors to sinkhole development. A probabilistic karst model for the state of Kentucky, USA, has been generated using the Weights of Evidence method. This model is presented as an example of the predictive capabilities of this kind of data-driven modelling technique and of how such models could be applied to karst in Ontario. The model classified 70% of the validation dataset correctly while minimizing false positive identifications; this is moderately successful and could stand to be improved.
Finally, suggestions for improving the current karst model of southern Ontario are offered, with the goals of increasing investigation into karst in Ontario and streamlining the reporting system for sinkholes, caves, and other karst features so as to improve the current Ontario karst database.
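For one binary evidence layer, the Weights of Evidence calculation behind such a model reduces to log-ratios of conditional probabilities; a minimal sketch on a simulated grid (not the Kentucky data):

```python
import numpy as np

def weights_of_evidence(evidence, deposits):
    """W+ and W- for one binary evidence layer versus known occurrences.

    evidence, deposits: boolean arrays over the same grid cells
    (evidence: e.g. 'cell lies on carbonate bedrock';
     deposits: e.g. 'a sinkhole is mapped in this cell').
    """
    b_d = np.mean(evidence[deposits])          # P(B | D)
    b_nd = np.mean(evidence[~deposits])        # P(B | ~D)
    w_plus = np.log(b_d / b_nd)
    w_minus = np.log((1 - b_d) / (1 - b_nd))
    return w_plus, w_minus

# Simulated grid in which sinkholes occur preferentially where the
# evidence layer is present (rates are illustrative assumptions).
rng = np.random.default_rng(1)
evidence = rng.random(10_000) < 0.3
deposits = rng.random(10_000) < np.where(evidence, 0.10, 0.02)

w_plus, w_minus = weights_of_evidence(evidence, deposits)
contrast = w_plus - w_minus
```

Summing `W+`/`W-` over several conditionally independent layers, then adding the prior log-odds, yields the posterior karst-probability map that such models produce.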

The automated transfer of flight logbook information from aircraft into aircraft maintenance systems reduces ground and maintenance time and is thus desirable from an economic point of view. Until recently, flight logbooks were not managed electronically on board, or at least the data transfer from aircraft to ground maintenance system was executed manually. The latest aircraft types, such as the Airbus A380 and the Boeing 787, do support an electronic logbook and thus make automated transfer possible. A generic flight logbook transfer system must deal with different data formats on the input side, due to different aircraft makes and models, as well as with different, distributed aircraft maintenance systems for the various airlines operating the aircraft. This article contributes the concept and top-level distributed system architecture of such a generic system for automated flight log data transfer. It was developed within a joint industry and applied research project, and the architecture has already been successfully evaluated in a prototypical implementation.
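One common way to realise the input-side genericity is an adapter layer that normalises make/model-specific records into a canonical entry before routing to an airline's maintenance system; a minimal sketch with hypothetical field names (the article's actual schema is not given):

```python
from dataclasses import dataclass

# Canonical logbook entry; field names here are illustrative assumptions.
@dataclass
class LogEntry:
    aircraft_id: str
    flight_leg: str
    fault_code: str

def from_a380_record(record: dict) -> LogEntry:
    # hypothetical A380-style field names
    return LogEntry(record["tailNumber"], record["leg"], record["faultId"])

def from_b787_record(record: dict) -> LogEntry:
    # hypothetical 787-style field names
    return LogEntry(record["ac_reg"], record["sector"], record["fault"])

# One parser per aircraft type; new types plug in without touching callers.
PARSERS = {"A380": from_a380_record, "B787": from_b787_record}

def ingest(aircraft_type: str, record: dict) -> LogEntry:
    return PARSERS[aircraft_type](record)

entry = ingest("B787", {"ac_reg": "D-ABCD", "sector": "FRA-JFK", "fault": "21-52-00"})
```

The distributed part of the architecture would then sit behind `ingest`, dispatching canonical entries to each operator's maintenance system.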

Mechanistic models used for prediction should be parsimonious, as models which are over-parameterised may have poor predictive performance. Determining whether a model is parsimonious requires comparisons with alternative model formulations with differing levels of complexity. However, creating alternative formulations for large mechanistic models is often problematic, and usually time-consuming; consequently, few are ever investigated. In this paper, we present an approach that rapidly generates reduced model formulations by replacing a model's variables with constants. These reduced alternatives can be compared to the original model, using data-based model selection criteria, to assist in identifying potentially unnecessary model complexity and thereby inform reformulation of the model. To illustrate the approach, we present its application to a published radiocaesium plant-uptake model, which predicts uptake on the basis of soil characteristics (e.g. pH, organic matter content, clay content). A total of 1024 reduced model formulations were generated and ranked according to five model selection criteria: Residual Sum of Squares (RSS), AICc, BIC, MDL and ICOMP. The lowest scores for RSS and AICc occurred for the same reduced model, in which pH-dependent model components were replaced. The lowest scores for BIC, MDL and ICOMP occurred for a further reduced model, in which components related to the distinction between adsorption on clay and organic surfaces were replaced. Both reduced models had a lower RSS for the parameterisation dataset than the original model. As a test of predictive performance, the original model and the two reduced models were used to predict an independent dataset. The reduced models have lower prediction sums of squares than the original model, suggesting that the latter may be overfitted.
The approach presented has the potential to inform model development by rapidly creating a class of alternative model formulations, which can be compared.
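The enumeration-and-ranking idea can be sketched on a toy model with three replaceable components (2^3 = 8 formulations, rather than the paper's 1024); the data and model are invented for illustration:

```python
import itertools
import numpy as np

# Toy "mechanistic" model with three variable components; each can be
# replaced by a constant (absorbed into the intercept), giving 2^3
# alternative formulations ranked by AICc.
rng = np.random.default_rng(0)
n = 60
x = rng.random((n, 3))
y = 2.0 * x[:, 0] + 0.5 * x[:, 1] + rng.normal(0, 0.1, n)  # x[:, 2] is irrelevant

def fit_rss(active):
    """Least-squares fit using only the 'active' variables; inactive
    components are frozen as constants."""
    cols = [np.ones(n)] + [x[:, j] for j in range(3) if active[j]]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return float(resid @ resid), A.shape[1]

def aicc(rss, k):
    aic = n * np.log(rss / n) + 2 * k
    return aic + 2 * k * (k + 1) / (n - k - 1)   # small-sample correction

ranked = sorted(
    (aicc(*fit_rss(active)), active)
    for active in itertools.product([False, True], repeat=3)
)
best_aicc, best_active = ranked[0]
```

Swapping `aicc` for BIC, MDL or ICOMP reproduces the paper's multi-criterion ranking; formulations whose scores beat the full model flag potentially unnecessary complexity.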

In this thesis we discuss the ways in which computational logic (CL) and data science (DS) can jointly contribute to the management of knowledge within the scope of modern and future artificial intelligence (AI), and how technically sound software technologies can be realised along the way. An agent-oriented mindset permeates the whole discussion, stressing the pivotal role of autonomous agents in exploiting both means to reach higher degrees of intelligence. Accordingly, the goals of this thesis are manifold. First, we elicit the analogies and differences between CL and DS, looking for possible synergies and complementarities along four major knowledge-related dimensions, namely representation, acquisition (a.k.a. learning), inference (a.k.a. reasoning), and explanation. In this regard, we propose a conceptual framework through which bridges between these disciplines can be described and designed. We then survey the current state of the art of AI technologies with respect to their capability to support bridging CL and DS in practice. After identifying gaps and opportunities, we propose the notion of the logic ecosystem as a new conceptual, architectural, and technological solution supporting the incremental integration of symbolic and sub-symbolic AI. Finally, we discuss how our notion of the logic ecosystem can be reified into actual software technology and extended in many DS-related directions.

The purpose of this research study is to discuss privacy and data protection-related regulatory and compliance challenges posed by digital transformation in healthcare in the wake of the COVID-19 pandemic. The public health crisis accelerated the development of patient-centred remote/hybrid healthcare delivery models that make increased use of telehealth services and related digital solutions. The large-scale uptake of IoT-enabled medical devices and wellness applications, and the offering of healthcare services via healthcare platforms (online doctor marketplaces), have catalysed these developments. However, the use of new enabling technologies (IoT, AI) and the platformisation of healthcare pose complex challenges to the protection of patients' privacy and personal data. This happens at a time when the EU is drawing up a new regulatory landscape for the use of data and digital technologies. Against this background, the study presents an interdisciplinary (normative and technology-oriented) critical assessment of how the new regulatory framework may affect privacy and data protection requirements regarding the deployment and use of Internet of Health Things (hardware) devices and interconnected software (AI systems). The study also assesses key privacy and data protection challenges that affect healthcare platforms (online doctor marketplaces) in their offering of video API-enabled teleconsultation services and their (anticipated) integration into the European Health Data Space. The overall conclusion of the study is that regulatory deficiencies may create integrity risks for the protection of privacy and personal data in telehealth, owing to uncertainties about the proper interplay, legal effects and effectiveness of (existing and proposed) EU legislation.
The proliferation of normative measures may increase compliance costs, hinder innovation and, ultimately, deprive European patients of state-of-the-art digital health technologies, which is, paradoxically, the opposite of what the EU plans to achieve.

Maternal and infant mortality has been a focus of analysis throughout the history of public health in Brazil, and various strategies to tackle the issue have been proposed to date. The Ministry of Health has been working on this, and the Rede Cegonha strategy is the most recent policy in this context. Given the principle of comprehensive health care and the structuring of the Unified Health System into care networks, it is necessary to ensure the integration of health care practices, among which are sanitary surveillance actions (SSA). Considering that the integration of health care practices and SSA can contribute to reducing maternal and infant mortality rates, this article results from qualitative research that analyzed the integration of these actions in four cities in the State of São Paulo, Brazil: Campinas, Indaiatuba, Jaguariúna and Santa Bárbara D'Oeste. The research was conducted through interviews with SSA and maternal health managers, and the data were evaluated using thematic analysis. The results converge with other studies in identifying the isolation of health care practices from SSA. The insertion of SSA into collectively-managed areas appears to be a potential strategy for health planning and the implementation of actions in the context under scrutiny.

The exact composition of a specific class of compact stars, historically referred to as "neutron stars", is still largely unknown. Possibilities ranging from hadronic to quark degrees of freedom, including self-bound versions of the latter, have been proposed. In this work we specifically address the suitability of strange star models (including pairing interactions) in the light of new measurements available for four compact stars. The analysis shows that these data might be explained by such an exotic equation of state, in fact selecting a small window in parameter space, but new precise measurements and further theoretical developments are still needed to settle the subject.

We investigate the internal dynamics of two cellular automaton models with heterogeneous strength fields and differing nearest-neighbour laws. One model is a crack-like automaton, transferring all stress from a rupture zone to the surroundings. The other is a partial stress drop automaton, transferring only a fraction of the stress within a rupture zone to the surroundings. To study the evolution of stress, the mean spectral density f(k_r) of a stress deficit field is examined prior to, and immediately following, ruptures in both models. Both models display a power-law relationship between f(k_r) and spatial wavenumber k_r of the form f(k_r) ~ k_r^(-beta). In the crack model, the evolution of stress deficit is consistent with a cyclic approach to, and retreat from, a critical state in which large events occur. The approach to criticality is driven by tectonic loading. Short-range stress transfer in the model does not affect the approach to criticality of broad regions in the model. The evolution of stress deficit in the partial stress drop model is consistent with small fluctuations about a mean state of high stress, behaviour indicative of a self-organised critical system. Despite statistics similar to those of natural earthquakes, these simplified models lack a physical basis. Physically motivated models of earthquakes also display dynamical complexity similar to that of a critical-point system. Studies of dynamical complexity in physical models of earthquakes may lead to advancement towards a physical theory for earthquakes.
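Estimating the spectral exponent beta from a field amounts to an FFT power spectrum plus a log-log fit; the synthetic 1D field below merely has a prescribed slope and is not output of the automaton models:

```python
import numpy as np

# Sketch of recovering beta in f(k) ~ k^(-beta) from a 1D profile:
# build a random field with a known spectral slope, then measure it back.
rng = np.random.default_rng(0)
n = 4096
k = np.fft.rfftfreq(n, d=1.0)[1:]            # positive wavenumbers
true_beta = 2.0

amplitude = k ** (-true_beta / 2)            # so |F(k)|^2 = k^(-beta)
phase = rng.uniform(0, 2 * np.pi, size=k.size)
phase[-1] = 0.0                              # keep Nyquist term real
spectrum = np.concatenate(([0], amplitude * np.exp(1j * phase)))
field = np.fft.irfft(spectrum, n=n)

# Measure the power spectrum and fit a line in log-log space
f_k = np.abs(np.fft.rfft(field))[1:] ** 2
slope, _ = np.polyfit(np.log(k), np.log(f_k), 1)
beta_hat = -slope
```

For the automaton stress deficit fields, the same fit applied before and after ruptures would track the spectral evolution the abstract describes.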

The movement of chemicals through the soil to groundwater, or their discharge to surface waters, represents a degradation of these resources. In many cases, serious human and stock health implications are associated with this form of pollution. The chemicals of interest include nutrients, pesticides, salts, and industrial wastes. Recent studies have shown that current models and methods do not adequately describe the leaching of nutrients through soil, often underestimating the risk of groundwater contamination by surface-applied chemicals and overestimating the concentration of resident solutes. This inaccuracy results primarily from ignoring soil structure and nonequilibrium between soil constituents, water, and solutes. A multiple sample percolation system (MSPS), consisting of 25 individual collection wells, was constructed to study the effects of localized soil heterogeneities on the transport of nutrients (NO3-, Cl-, PO43-) in the vadose zone of a predominantly clay agricultural soil. Very significant variations in drainage patterns across a small spatial scale were observed (one-way ANOVA, p < 0.001), indicating considerable heterogeneity in water flow patterns and nutrient leaching. Using data collected from the multiple sample percolation experiments, this paper compares the performance of two mathematical models for predicting solute transport: the advective-dispersion model with a reaction term (ADR), and a two-region preferential flow model (TRM) suitable for modelling nonequilibrium transport. These results have implications for modelling solute transport and predicting nutrient loading on a larger scale. (C) 2001 Elsevier Science Ltd. All rights reserved.
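The ADR model compared here can be sketched as an explicit finite-difference scheme in one dimension; the parameter values are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

# Explicit finite-difference sketch of the advective-dispersion model
# with a first-order reaction term (ADR):
#   dC/dt = D * d2C/dx2 - v * dC/dx - k * C
# Units and values are illustrative only.
D, v, k = 1e-2, 0.1, 1e-3           # dispersion, pore velocity, reaction rate
dx, dt = 0.05, 0.01                 # grid spacing, time step
n_cells, n_steps = 200, 2000

C = np.zeros(n_cells)
C[0] = 1.0                          # constant-concentration inlet boundary

for _ in range(n_steps):
    d2C = (np.roll(C, -1) - 2 * C + np.roll(C, 1)) / dx**2
    dC = (C - np.roll(C, 1)) / dx   # upwind difference for advection
    C = C + dt * (D * d2C - v * dC - k * C)
    C[0] = 1.0                      # re-impose inlet boundary
    C[-1] = C[-2]                   # zero-gradient outlet
```

The TRM alternative would split each cell into mobile and immobile water regions with a first-order exchange term, which is what lets it capture the nonequilibrium, preferential-flow behaviour the MSPS data revealed.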