813 results for feature based modelling
Abstract:
We present a comparative analysis of projected impacts of climate change on river runoff from two types of distributed hydrological model: a global hydrological model (GHM) and catchment-scale hydrological models (CHMs). Analyses are conducted for six catchments that are global in coverage and feature strong contrasts in spatial scale as well as climatic and development conditions. These include the Liard (Canada), Mekong (SE Asia), Okavango (SW Africa), Rio Grande (Brazil), Xiangxi (China) and Harper's Brook (UK). A single GHM (Mac-PDM.09) is applied to all catchments, whilst a different CHM is applied to each catchment. The CHMs typically simulate water resources impacts based on a more explicit representation of catchment water resources than that available from the GHM, and the CHMs include river routing. Simulations of average annual runoff, mean monthly runoff and high (Q5) and low (Q95) monthly runoff under baseline (1961-1990) and climate change scenarios are presented. We compare the simulated runoff response of each hydrological model to (1) prescribed increases in global mean temperature from the HadCM3 climate model and (2) a prescribed increase in global mean temperature of 2°C for seven GCMs, to explore the response to climate model and structural uncertainty. We find that differences in projected changes of mean annual runoff between the two types of hydrological model can be substantial for a given GCM, and they are generally larger for indicators of high and low flow. However, they are relatively small in comparison to the range of projections across the seven GCMs. Hence, for the six catchments and seven GCMs we considered, climate model structural uncertainty is greater than the uncertainty associated with the type of hydrological model applied. Moreover, shifts in the seasonal cycle of runoff with climate change are presented similarly by both hydrological models, although for some catchments the monthly timing of high and low flows differs. This implies that for studies that seek to quantify and assess the role of climate model uncertainty on catchment-scale runoff, it may be just as feasible to apply a GHM as a CHM, especially when climate modelling uncertainty across the range of available GCMs is as large as it currently is. Whilst the GHM is able to represent the broad climate change signal that is represented by the CHMs, we find that for some catchments there are differences between GHMs and CHMs in mean annual runoff due to differences in potential evaporation estimation methods, in the representation of the seasonality of runoff, and in the magnitude of changes in extreme monthly runoff, all of which have implications for future water management issues.
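As a worked illustration only (not taken from the study), the Q5 and Q95 indicators used above are simply high and low percentiles of the monthly runoff series; the short Python sketch below, with hypothetical runoff numbers and a hypothetical `flow_indicators` helper, shows how such indicators can be computed for a 1961-1990 style baseline.

```python
import numpy as np

def flow_indicators(monthly_runoff):
    """Mean annual runoff plus the Q5/Q95 monthly flow indicators.

    Q5 is the monthly runoff exceeded 5% of the time (high flow);
    Q95 is the monthly runoff exceeded 95% of the time (low flow).
    """
    runoff = np.asarray(monthly_runoff, dtype=float)
    q5 = np.percentile(runoff, 95)    # exceeded only 5% of the time
    q95 = np.percentile(runoff, 5)    # exceeded 95% of the time
    mean_annual = runoff.mean() * 12  # mean monthly -> mean annual total
    return mean_annual, q5, q95

# Hypothetical 30-year baseline of monthly runoff (mm per month)
rng = np.random.default_rng(0)
baseline = rng.gamma(shape=2.0, scale=25.0, size=30 * 12)
print(flow_indicators(baseline))
```

Changes in these indicators between baseline and scenario runs can then be compared across GCMs and between the GHM and each CHM.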
Abstract:
In this article a simple and effective controller design is introduced for Hammerstein systems that are identified based on observational input/output data. The nonlinear static function in the Hammerstein system is modelled using a B-spline neural network. The controller is constructed by combining the inverse of the B-spline approximation of the nonlinear static function with a linear pole assignment controller. The contribution of this article is an inverse De Boor algorithm that computes this inverse efficiently. Mathematical analysis is provided to prove the convergence of the proposed algorithm. Numerical examples are utilised to demonstrate the efficacy of the proposed approach.
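For orientation, the forward De Boor recursion that evaluates a B-spline is standard; the sketch below implements it and then inverts an (assumed monotonically increasing) static nonlinearity by plain bisection. The bisection is only a generic stand-in for the paper's more efficient inverse De Boor algorithm, and the knots and coefficients are hypothetical.

```python
import numpy as np

def de_boor(x, t, c, p):
    """Evaluate a degree-p B-spline with knot vector t and coefficients c at x."""
    k = int(np.searchsorted(t, x, side="right")) - 1   # knot span containing x
    k = min(max(k, p), len(t) - p - 2)                 # clamp to valid spans
    d = [float(c[j + k - p]) for j in range(p + 1)]
    for r in range(1, p + 1):
        for j in range(p, r - 1, -1):
            alpha = (x - t[j + k - p]) / (t[j + 1 + k - r] - t[j + k - p])
            d[j] = (1.0 - alpha) * d[j - 1] + alpha * d[j]
    return d[p]

def invert_static(y, t, c, p, lo, hi, tol=1e-10):
    """Invert a monotonically increasing B-spline nonlinearity by bisection
    (a generic stand-in, not the paper's inverse De Boor algorithm)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if de_boor(mid, t, c, p) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical cubic B-spline on [0, 4] with increasing coefficients
t = np.array([0, 0, 0, 0, 1, 2, 3, 4, 4, 4, 4], dtype=float)
c = np.array([0, 1, 2, 3, 4, 5, 6], dtype=float)
y = de_boor(2.5, t, c, 3)
print(abs(invert_static(y, t, c, 3, 0.0, 4.0) - 2.5) < 1e-8)  # True
```

In the controller, this inversion step maps the desired output of the static nonlinearity back to the intermediate signal that the linear pole assignment controller regulates.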
Abstract:
This paper presents a new approach to modelling flash floods in dryland catchments by integrating remote sensing and digital elevation model (DEM) data in a geographical information system (GIS). The spectral reflectance of channels affected by recent flash floods exhibits a marked increase, due to the deposition of fine sediments in these channels as the flood recedes. This allows the parts of a catchment that have been affected by a recent flood event to be discriminated from unaffected parts, using a time series of Landsat images. Using images of the Wadi Hudain catchment in southern Egypt, the hillslope areas contributing flow were inferred for different flood events. The SRTM3 DEM was used to derive flow direction, flow length, active channel cross-sectional areas and slope. The Manning Equation was used to estimate the channel flow velocities, and hence the time-area zones of the catchment. A channel reach that was active during a 1985 runoff event, and that does not receive any tributary flow, was used to estimate a transmission loss rate of 7.5 mm h−1, given the maximum peak discharge estimate. Runoff patterns resulting from different flood events are quite variable; however, the southern part of the catchment appears to have experienced more floods during the period of study (1984–2000), perhaps because the bedrock hillslopes in this area are more effective at runoff production than other parts of the catchment, which are underlain by unconsolidated Quaternary sands and gravels. Due to high transmission loss, runoff generated within the upper reaches is rarely delivered to the alluvial fan and Shalateen city situated at the catchment outlet. The synthetic GIS-based time-area zones, on their own, cannot be relied upon to model the hydrographs accurately; physical parameters, such as rainfall intensity, distribution, and transmission loss, must also be considered.
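For reference, the Manning Equation (standard open-channel hydraulics, not text from the paper) gives the mean velocity as v = (1/n) R^(2/3) S^(1/2), where R is the hydraulic radius and S the slope. The sketch below, with hypothetical channel values and a hypothetical `manning_velocity` helper, shows the velocity and reach travel-time step from which GIS time-area zones are built.

```python
def manning_velocity(n, area, wetted_perimeter, slope):
    """Mean flow velocity (m/s) from the Manning Equation.

    n: Manning roughness coefficient (s m^-1/3)
    area: channel cross-sectional area (m^2)
    wetted_perimeter: wetted perimeter (m)
    slope: channel bed slope (m/m)
    """
    r = area / wetted_perimeter               # hydraulic radius (m)
    return (r ** (2.0 / 3.0)) * (slope ** 0.5) / n

# Hypothetical dryland reach: 20 m^2 section, 25 m wetted perimeter
v = manning_velocity(n=0.035, area=20.0, wetted_perimeter=25.0, slope=0.005)
travel_time_s = 1500.0 / v                    # time to traverse a 1.5 km reach
print(round(v, 2), "m/s;", round(travel_time_s / 60), "min")
```

Summing such travel times downstream along the flow paths groups catchment cells into zones of equal travel time to the outlet.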
Abstract:
We present some additions to a fuzzy variable radius niche technique called Dynamic Niche Clustering (DNC) (Gan and Warwick, 1999; 2000; 2001) that enable the identification and creation of niches of arbitrary shape through a mechanism called Niche Linkage. We show that by using this mechanism it is possible to attain better feature extraction from the underlying population.
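The Niche Linkage mechanism itself is specified in the cited papers; purely as a hypothetical sketch of the underlying idea, niches whose hyperspheres overlap can be chained together (here with a small union-find pass) so that a linked set of simple niches jointly traces a peak of arbitrary shape. All names and values below are illustrative, not from DNC.

```python
import numpy as np

def link_niches(centres, radii):
    """Chain niches whose hyperspheres overlap; each returned group of
    linked niches approximates one arbitrarily shaped niche."""
    n = len(centres)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(centres[i] - centres[j]) <= radii[i] + radii[j]:
                parent[find(i)] = find(j)  # overlapping -> same linked niche

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

centres = np.array([[0.0, 0.0], [0.9, 0.1], [5.0, 5.0]])
radii = np.array([0.5, 0.5, 0.5])
print(link_niches(centres, radii))   # -> [[0, 1], [2]]
```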
Abstract:
Acquiring a mechanistic understanding of the role of biotic feedbacks in the links between atmospheric CO2 concentrations and temperature is essential for trustworthy climate predictions. Currently, computer-based simulations are the only available tool to estimate the global impact of the biotic feedbacks on future atmospheric CO2 and temperatures. Here we propose an alternative and complementary approach by using materially closed and energetically open analogue/physical models of the carbon cycle. We argue that there is potential in using a materially closed approach to improve our understanding of the magnitude and sign of many biotic feedbacks, and that recent technological advances make this feasible. We also suggest how such systems could be designed, and discuss the advantages and limitations of establishing physical models of the global carbon cycle.
Abstract:
The high complexity of the cloud parameterizations now held in models puts more pressure on observational studies to provide useful means to evaluate them. One approach to the problem put forth in the modelling community is to evaluate under what atmospheric conditions the parameterizations fail to simulate the cloud properties and under what conditions they do a good job. It is the ambition of this paper to characterize the variability of the statistical properties of tropical ice clouds in different tropical "regimes" recently identified in the literature, to aid the development of better process-oriented parameterizations in models. For this purpose, the statistical properties of non-precipitating tropical ice clouds over Darwin, Australia, are characterized using ground-based radar-lidar observations from the Atmospheric Radiation Measurement (ARM) Program. The ice cloud properties analysed are the frequency of ice cloud occurrence, the morphological properties (cloud top height and thickness), and the microphysical and radiative properties (ice water content, visible extinction, effective radius, and total concentration). The variability of these tropical ice cloud properties is then studied as a function of the large-scale cloud regimes derived from the International Satellite Cloud Climatology Project (ISCCP), the amplitude and phase of the Madden-Julian Oscillation (MJO), and the large-scale atmospheric regime as derived from a long-term record of radiosonde observations over Darwin. The vertical variability of ice cloud occurrence and microphysical properties is largest in all regimes (typically 1.5 orders of magnitude for ice water content and extinction, a factor of 3 in effective radius, and three orders of magnitude in concentration). 98% of the ice clouds in our dataset are characterized by either a small cloud fraction (smaller than 0.3) or a very large cloud fraction (larger than 0.9). In the ice part of the troposphere, three distinct layers characterized by different statistically dominant microphysical processes are identified. The variability of the ice cloud properties as a function of the large-scale atmospheric regime, cloud regime, and MJO phase is large, producing mean differences of up to a factor of 8 in the frequency of ice cloud occurrence between large-scale atmospheric regimes, and mean differences of typically a factor of 2 in all microphysical properties. Finally, the diurnal cycle of the frequency of occurrence of ice clouds is also very different between regimes and MJO phases, with diurnal amplitudes of the vertically-integrated frequency of ice cloud occurrence ranging from as low as 0.2 (weak diurnal amplitude) to values in excess of 2.0 (very large diurnal amplitude). Modellers should now use these results to check whether their model cloud parameterizations are capable of translating a given atmospheric forcing into the correct statistical ice cloud properties.
Abstract:
CloudSat is a satellite experiment designed to measure the vertical structure of clouds from space. CloudSat is expected to launch in 2004, and once launched, it will orbit in formation as part of a constellation of satellites (the A-Train) that includes NASA's Aqua and Aura satellites, a NASA-CNES lidar satellite (CALIPSO), and a CNES satellite carrying a polarimeter (PARASOL). A unique feature that CloudSat brings to this constellation is the ability to fly a precise orbit, enabling the fields of view of the CloudSat radar to be overlapped with the CALIPSO lidar footprint and the other measurements of the constellation. The precision and near simultaneity of this overlap create a unique multisatellite observing system for studying the atmospheric processes essential to the hydrological cycle. The vertical profiles of cloud properties provided by CloudSat on the global scale fill a critical gap in the investigation of feedback mechanisms linking clouds to climate. Measuring these profiles requires a combination of active and passive instruments, and this will be achieved by combining the radar data of CloudSat with data from other active and passive sensors of the constellation. This paper describes the underpinning science and gives a general overview of the mission, the expected products and their anticipated applications, and the potential capability of the A-Train for cloud observations. Notably, the CloudSat mission is expected to stimulate new areas of research on clouds. The mission also provides an important opportunity to demonstrate active sensor technology for future scientific and tactical applications. The CloudSat mission is a partnership between NASA's JPL, the Canadian Space Agency, Colorado State University, the U.S. Air Force, and the U.S. Department of Energy.
Abstract:
Background. Meta-analyses show that cognitive behaviour therapy for psychosis (CBT-P) improves distressing positive symptoms. However, it is a complex intervention involving a range of techniques. No previous study has assessed the delivery of the different elements of treatment and their effect on outcome. Our aim was to assess the differential effect of the type of treatment delivered on the effectiveness of CBT-P, using novel statistical methodology. Method. The Psychological Prevention of Relapse in Psychosis (PRP) trial was a multi-centre randomized controlled trial (RCT) that compared CBT-P with treatment as usual (TAU). Therapy was manualized, and detailed evaluations of therapy delivery and client engagement were made. Follow-up assessments were made at 12 and 24 months. In a planned analysis, we applied principal stratification (involving structural equation modelling with finite mixtures) to estimate intention-to-treat (ITT) effects for subgroups of participants, defined by qualitative and quantitative differences in receipt of therapy, while maintaining the constraints of randomization. Results. Consistent delivery of full therapy, including specific cognitive and behavioural techniques, was associated with clinically and statistically significant increases in months in remission, and decreases in psychotic and affective symptoms. Delivery of partial therapy involving engagement and assessment was not effective. Conclusions. Our analyses suggest that CBT-P offers significant benefit across multiple outcomes for patients able to engage in the full range of therapy procedures. The novel statistical methods illustrated in this report have general application to the evaluation of heterogeneity in the effects of treatment.
Abstract:
Models play a vital role in supporting a range of activities in numerous domains. We rely on models to support the design, visualisation, analysis and representation of parts of the world around us, and as such significant research effort has been invested into numerous areas of modelling, including support for model semantics, dynamic states and behaviour, and temporal data storage and visualisation. Whilst these efforts have increased our capabilities and allowed us to create increasingly powerful software-based models, the process of developing models, supporting tools and/or data structures remains difficult, expensive and error-prone. In this paper we define from the literature the key factors in assessing a model’s quality and usefulness: semantic richness, support for dynamic states and object behaviour, and temporal data storage and visualisation. We also identify a number of shortcomings in both existing modelling standards and model development processes, and propose a unified generic process to guide users through the development of semantically rich, dynamic and temporal models.
Abstract:
This article extends the traditions of style-based criticism through an encounter with the insights that can be gained from engaging with filmmakers at work. By bringing into relationship two things normally thought of as separate, production history and disinterested critical analysis, the discussion aims to extend the subjects which criticism can appreciate, as well as providing some insights into the creative process. Drawing on close analysis, on observations made during fieldwork and on access to earlier cuts of the film, this article looks at a range of interrelated decision-making, anchored by the reading of a particular sequence. The article examines changes the film underwent in the different stages of production, and some of the inventions deployed to ensure key themes and ideas remained in play as other elements changed. It draws conclusions which reveal perspectives on the filmmaking process, on collaboration, and on the creative response to material realities. The article reveals elements of the complexity of the process of the construction of image and soundtrack, and extends the range of filmmakers’ choices which are part of a critical dialogue. It is related to ‘Sleeping with half open eyes: dreams and realities in The Cry of the Owl’, Movie: A Journal of Film Criticism, 1 (2010), which provides a broader interpretative context for the enquiry.
Abstract:
A novel approach is presented for combining spatial and temporal detail from newly available TRMM-based data sets to derive hourly rainfall intensities at 1-km spatial resolution for hydrological modelling applications. Time series of rainfall intensities derived from 3-hourly 0.25° TRMM 3B42 data are merged with a 1-km gridded rainfall climatology based on TRMM 2B31 data to account for the sub-grid spatial distribution of rainfall intensities within coarse-scale 0.25° grid cells. The method is implemented for two dryland catchments in Tunisia and Senegal, and validated against gauge data. The outcomes of the validation show that the spatially disaggregated and intensity-corrected TRMM time series approximate ground-based measurements more closely than uncorrected data. The method introduced here enables the generation of rainfall intensity time series with realistic temporal and spatial detail for dynamic modelling of runoff and infiltration processes that are especially important to water resource management in arid regions.
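As a hedged sketch of the core merging idea (hypothetical arrays, grid sizes and function names, not the authors' code), each coarse-cell intensity can be redistributed over the fine cells it contains in proportion to the 1-km climatology, so that fine-scale spatial pattern is imposed while the coarse-cell mean is conserved:

```python
import numpy as np

def disaggregate(coarse_intensity, fine_climatology, block):
    """Impose a fine-scale climatological pattern on coarse rainfall.

    coarse_intensity: (ny, nx) rainfall on the coarse grid
    fine_climatology: (ny*block, nx*block) fine-grid mean-rainfall climatology
    block: number of fine cells per coarse cell along each axis
    The mean over every coarse cell is conserved.
    """
    ny, nx = coarse_intensity.shape
    fine = np.empty_like(fine_climatology, dtype=float)
    for i in range(ny):
        for j in range(nx):
            patch = fine_climatology[i*block:(i+1)*block, j*block:(j+1)*block]
            weights = patch / patch.mean()      # weights average to 1
            fine[i*block:(i+1)*block, j*block:(j+1)*block] = (
                coarse_intensity[i, j] * weights)
    return fine

coarse = np.array([[3.0, 0.0], [1.2, 6.0]])     # mm per time step
clim = np.abs(np.random.default_rng(1).normal(1.0, 0.3, (8, 8)))
fine = disaggregate(coarse, clim, block=4)
assert np.allclose(fine[:4, :4].mean(), coarse[0, 0])  # mass preserved
```

In the paper's setting the coarse grid would be the 0.25° TRMM 3B42 field and the fine grid the TRMM 2B31-based 1-km climatology.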
Abstract:
Software representations of scenes, i.e. the modelling of objects in space, are used in many application domains. Current modelling and scene description standards focus on visualisation dimensions, and are intrinsically limited by their dependence upon semantic interpretation and contextual application by humans. In this paper we propose the need for an open, extensible and semantically rich modelling language which facilitates a machine-readable semantic structure. We critically review existing standards and techniques, and highlight the need for a semantically focussed scene description language. Based on this defined need we propose a preliminary solution, based on hypergraph theory, and reflect on application domains.
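To make the hypergraph suggestion concrete, here is a minimal sketch (hypothetical class and relation names, not part of the proposed language) of a scene model in which a single hyperedge can relate any number of objects under one semantic label, something an ordinary pairwise graph cannot express directly:

```python
from dataclasses import dataclass, field

@dataclass
class HypergraphScene:
    """Scene as a hypergraph: nodes are objects, hyperedges are n-ary
    semantic relations (one edge may span many nodes)."""
    nodes: dict = field(default_factory=dict)       # id -> attributes
    hyperedges: list = field(default_factory=list)  # (label, {node ids})

    def add_object(self, obj_id, **attrs):
        self.nodes[obj_id] = attrs

    def relate(self, label, *obj_ids):
        self.hyperedges.append((label, set(obj_ids)))

    def relations_of(self, obj_id):
        return [label for label, members in self.hyperedges
                if obj_id in members]

scene = HypergraphScene()
scene.add_object("table", shape="box")
scene.add_object("cup", shape="cylinder")
scene.add_object("lamp", shape="cone")
scene.relate("resting_on", "cup", "table")       # binary relation
scene.relate("in_room", "table", "cup", "lamp")  # n-ary relation
print(scene.relations_of("cup"))  # ['resting_on', 'in_room']
```

Because relations are explicit labelled structures rather than implicit visual conventions, such a representation is machine-readable without human contextual interpretation.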
Abstract:
DNA G-quadruplexes are among the targets being actively explored for anti-cancer therapy through inhibition by small molecules. This computational study was conducted to predict the binding strengths and orientations of a set of novel dimethyl-amino-ethyl-acridine (DACA) analogues that were designed and synthesized in our laboratory, but did not diffract in synchrotron light. The crystal structure of the DNA G-quadruplex (TGGGGT)4 (PDB: 1O0K) was used as the target for their binding properties in our studies. We used both the force field (FF) and QM/MM derived atomic charge schemes simultaneously for comparing the predictions of drug binding modes and their energetics. This study evaluates the comparative performance of fixed point charge based Glide XP docking and the quantum polarized ligand docking schemes. These results will provide insights into the effects of including or ignoring the drug-receptor interfacial polarization events in molecular docking simulations, which in turn will aid the rational selection of computational methods at different levels of theory in future drug design programs. Plenty of molecular modelling tools and methods currently exist for modelling drug-receptor, protein-protein, or DNA-protein interactions at different levels of complexity. Yet the capacity of such tools to describe various physico-chemical properties more accurately is the next step ahead in current research. In particular, the usage of the most accurate quantum mechanics (QM) methods is severely restricted by their tedious nature. Though the usage of massively parallel supercomputing environments has resulted in a tremendous improvement in molecular mechanics (MM) calculations like molecular dynamics, QM methods are still capable of dealing with only a couple of tens to hundreds of atoms. One efficient strategy that utilizes the powers of both MM and QM is the QM/MM hybrid approach. Lately, attempts have been directed towards the goal of deploying several different QM methods for the betterment of force field based simulations, albeit with practical restrictions in place. One such method utilizes the inclusion of charge polarization events at the drug-receptor interface, which are not explicitly present in the MM FF.
Abstract:
High rates of nutrient loading from agricultural and urban development have resulted in surface water eutrophication and groundwater contamination in regions of Ontario. In Lake Simcoe (Ontario, Canada), anthropogenic nutrient contributions have contributed to increased algal growth, low hypolimnetic oxygen concentrations, and impaired fish reproduction. An ambitious programme has been initiated to reduce phosphorus loads to the lake, aiming to achieve at least a 40% reduction in phosphorus loads by 2045. Achievement of this target necessitates effective remediation strategies, which will rely upon an improved understanding of the controls on nutrient export from tributaries of Lake Simcoe, as well as an improved understanding of the importance of phosphorus cycling within the lake. In this paper, we describe a new model structure for the integrated dynamic and process-based model INCA-P, which allows fully-distributed applications suited to branched river networks. We demonstrate application of this model to the Black River, a tributary of Lake Simcoe, and use INCA-P to simulate the fluxes of P entering the lake system, apportion phosphorus among different sources in the catchment, and explore future scenarios of land-use change and nutrient management to identify high-priority sites for implementation of watershed best management practices.
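As a toy illustration only (not the INCA-P model, and with hypothetical reach names and loads), fully-distributed application to a branched network amounts to accumulating each reach's locally generated load down the tributary tree toward the outlet:

```python
def accumulate_loads(network, local_load, reach):
    """Total phosphorus load leaving `reach`: its local (diffuse + point)
    load plus the accumulated loads of all upstream tributaries."""
    upstream = network.get(reach, [])
    return local_load[reach] + sum(
        accumulate_loads(network, local_load, up) for up in upstream)

# Hypothetical branched network: outlet <- main stem <- two tributaries
network = {"outlet": ["main"], "main": ["trib_a", "trib_b"]}
local_load = {"outlet": 0.4, "main": 1.1,
              "trib_a": 0.8, "trib_b": 0.3}       # kg P per day
print(accumulate_loads(network, local_load, "outlet"))  # 2.6
```

Source apportionment then follows from comparing each reach's local contribution with the total delivered at the outlet.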
Abstract:
We make a qualitative and quantitative comparison of numerical simulations of the ash cloud generated by the eruption of Eyjafjallajökull in April 2010 with ground-based lidar measurements at Exeter and Cardington in southern England. The numerical simulations are performed using the Met Office’s dispersion model, NAME (Numerical Atmospheric-dispersion Modelling Environment). The results show that NAME captures many of the features of the observed ash cloud. The comparison enables us to estimate the fraction of material which survives the near-source fallout processes and enters the distal plume. A number of simulations are performed which show that both the structure of the ash cloud over southern England and the concentration of ash within it are particularly sensitive to the height of the eruption column (and the consequent estimated mass emission rate), to the shape of the vertical source profile, and to the level of prescribed ‘turbulent diffusion’ (representing the mixing by the unresolved eddies) in the free troposphere, with less sensitivity to the timing of the start of the eruption and the sedimentation of particulates in the distal plume.