921 results for Non-autonomous Schrödinger-Poisson systems


Relevance:

30.00%

Publisher:

Abstract:

Coloured effluents from textile industries are a problem in many rivers and waterways. Prediction of the adsorption capacities of dyes by adsorbents is important in design considerations. The sorption of three basic dyes, namely Basic Blue 3, Basic Yellow 21 and Basic Red 22, onto peat is reported. Equilibrium sorption isotherms have been measured for the three single-component systems. Equilibrium was achieved after twenty-one days. The experimental isotherm data were analysed using the Langmuir, Freundlich, Redlich-Peterson, Temkin and Toth isotherm equations. A detailed error analysis has been undertaken to investigate the effect of using different error criteria for the determination of the single-component isotherm parameters, and hence to obtain the isotherm and isotherm parameters which best describe the adsorption process. The linear transform model provided the highest R² regression coefficient with the Redlich-Peterson model. The Redlich-Peterson model also yielded the best fit to the experimental data for all three dyes using the non-linear error functions. An extended Langmuir model has been used to predict the isotherm data for the binary systems from the single-component data. The correlation between theoretical and experimental data had only limited success owing to competitive and interactive effects between the dyes and the dye-surface interactions.
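
As an illustration of the kind of non-linear error analysis described in this abstract, the sketch below fits the Redlich-Peterson isotherm q_e = K_R·C_e / (1 + a_R·C_e^g) to a single-component data set by minimising a sum-of-squared-errors criterion. The data points, starting values and bounds are hypothetical and are not taken from the study.

```python
# Minimal sketch (hypothetical data): fitting the Redlich-Peterson isotherm
# q_e = K_R*C_e / (1 + a_R*C_e**g) with a non-linear error function.
import numpy as np
from scipy.optimize import minimize

# Hypothetical equilibrium data: liquid-phase concentration C_e (mg/L)
# and solid-phase uptake q_e (mg/g) for one dye.
C_e = np.array([5.0, 10.0, 25.0, 50.0, 100.0, 200.0])
q_e = np.array([12.0, 21.0, 40.0, 58.0, 72.0, 80.0])

def redlich_peterson(C, K_R, a_R, g):
    """Redlich-Peterson isotherm: q = K_R*C / (1 + a_R*C**g)."""
    return K_R * C / (1.0 + a_R * C**g)

def sse(params):
    """Sum-of-squared-errors objective (one of several possible error criteria)."""
    K_R, a_R, g = params
    return np.sum((q_e - redlich_peterson(C_e, K_R, a_R, g)) ** 2)

# Non-linear minimisation from an assumed starting point; the bounds keep the
# exponent g in its physically meaningful range 0 < g <= 1.
result = minimize(sse, x0=[3.0, 0.05, 0.9],
                  bounds=[(1e-6, None), (1e-6, None), (1e-6, 1.0)])
K_R, a_R, g = result.x
print(f"K_R={K_R:.3f}, a_R={a_R:.4f}, g={g:.3f}, SSE={result.fun:.2f}")
```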

Relevance:

30.00%

Publisher:

Abstract:

This paper, chosen as a best paper from the 2005 SAMOS Workshop on Computer Systems, describes for the first time the major Abhainn project for automated system-level design of embedded signal processing systems. In particular, it describes four key novelties: novel algorithm modelling techniques for DSP systems, automated implementation realisation, algorithm transformation for system optimisation, and automated inter-processor communication. The approach is applied to two complex systems: a radar system and a sonar system. In both cases, technology is exhibited which allows non-experts to automatically create low-overhead, high-performance embedded signal processing systems.

Relevance:

30.00%

Publisher:

Abstract:

To help design an environment in which professionals without legal training can make effective use of public sector legal information on planning and the environment - for Add-Wijzer, a European e-government project - we evaluated their perceptions of usefulness and usability. In concurrent think-aloud usability tests, lawyers and non-lawyers carried out information retrieval tasks on a range of online legal databases. We found that non-lawyers reported twice as many difficulties as those with legal training (p = 0.001), that the number of difficulties and the choice of database affected successful completion, and that the non-lawyers had surprisingly few problems understanding legal terminology. Instead, they had more problems understanding the syntactical structure of legal documents and collections. The results support the constraint attunement hypothesis (CAH) of the effects of expertise on information retrieval, with implications for the design of systems to support the effective understanding and use of information.

Relevance:

30.00%

Publisher:

Abstract:

Non-invasive, real-time in vivo molecular imaging in small animal models has become the essential bridge between in vitro data and their translation into clinical applications. Tremendous development and technological progress in areas such as tumour modelling, monitoring of tumour growth and detection of metastasis have facilitated translational drug development and added to our knowledge of carcinogenesis. The modalities that are commonly used include Magnetic Resonance Imaging (MRI), Computed Tomography (CT), Positron Emission Tomography (PET), bioluminescence imaging, fluorescence imaging and multi-modality imaging systems. The ability to obtain multiple images longitudinally provides reliable information whilst reducing animal numbers. As yet there is no one modality that is ideal for all experimental studies. This review outlines the instrumentation available together with corresponding applications reported in the literature, with particular emphasis on cancer research. Advantages and limitations of current imaging technology are discussed and the issues concerning small animal care during imaging are highlighted.

Relevance:

30.00%

Publisher:

Abstract:

The paper is primarily concerned with the modelling of aircraft manufacturing cost. The aim is to establish an integrated life cycle balanced design process through a systems engineering approach to interdisciplinary analysis and control. The cost modelling is achieved using the genetic causal approach, which enforces product family categorisation and the subsequent generation of causal relationships between deterministic cost components and their design source. This utilises causal parametric cost drivers and the definition of the physical architecture from the Work Breakdown Structure (WBS) to identify product families. The paper presents applications to the overall aircraft design with a particular focus on the fuselage as a subsystem of the aircraft, including fuselage panels and localised detail, as well as engine nacelles. The higher-level application to aircraft requirements and functional analysis is investigated and verified relative to life cycle design issues for the relationship between acquisition cost and Direct Operational Cost (DOC), for a range of both metal and composite subsystems. Maintenance is considered in some detail as an important contributor to DOC and life cycle cost. The lower-level application to the aircraft physical architecture is investigated and verified for the WBS of an engine nacelle, including a sequential build-stage investigation of the materials, fabrication and assembly costs. The studies are then extended by investigating the acquisition cost of aircraft fuselages, including the recurring unit cost and the non-recurring design cost of the airframe subsystem. The systems costing methodology is facilitated by the genetic causal cost modelling technique, as the latter is highly generic, interdisciplinary, flexible, multilevel and recursive in nature, and can be applied at the various analysis levels required of systems engineering. Therefore, the main contribution of the paper is a methodology for applying systems engineering costing, supported by the genetic causal cost modelling approach, whether at a requirements, functional or physical level.
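
The structural idea of driving deterministic cost components from physical design parameters within a WBS can be sketched as follows. This is not the paper's model: the WBS elements, drivers, rates and values below are invented purely to show the shape of a causal, parametric cost roll-up.

```python
# Minimal sketch (invented numbers): a parametric cost roll-up in which each
# WBS element of a hypothetical engine-nacelle product family has a cost
# driven by a physical design parameter (a causal cost driver).
from dataclasses import dataclass

@dataclass
class WbsElement:
    name: str
    driver_value: float   # causal design driver, e.g. panel area in m^2 (assumed)
    rate: float           # recurring cost per unit driver (assumed)
    fixed: float = 0.0    # non-recurring set-up cost (assumed)

    def cost(self) -> float:
        """Deterministic cost component = fixed + rate * driver."""
        return self.fixed + self.rate * self.driver_value

# Hypothetical WBS for one nacelle build stage: materials, fabrication, assembly.
nacelle_wbs = [
    WbsElement("materials",   driver_value=42.0, rate=310.0),
    WbsElement("fabrication", driver_value=42.0, rate=520.0, fixed=15_000.0),
    WbsElement("assembly",    driver_value=18.0, rate=640.0, fixed=8_000.0),
]

acquisition_cost = sum(e.cost() for e in nacelle_wbs)
print(f"Estimated nacelle acquisition cost: {acquisition_cost:,.0f}")
```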

Relevance:

30.00%

Publisher:

Abstract:

The identification of non-linear systems using only observed finite datasets has become a mature research area over the last two decades. A class of linear-in-the-parameter models with universal approximation capabilities has been intensively studied and widely used because of the availability of many linear learning algorithms and their inherent convergence conditions. This article presents a systematic overview of basic research on model selection approaches for linear-in-the-parameter models. One of the fundamental problems in non-linear system identification is to find the minimal model with the best generalisation performance from observational data only. The important concepts used in various non-linear system identification algorithms to achieve good model generalisation are first reviewed, including Bayesian parameter regularisation and model selection criteria based on cross-validation and experimental design. A significant advance in machine learning has been the development of the support vector machine as a means of identifying kernel models based on the structural risk minimisation principle. Developments in convex optimisation-based model construction algorithms, including support vector regression algorithms, are outlined. Input selection algorithms and on-line system identification algorithms are also included in this review. Finally, some industrial applications of non-linear models are discussed.
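
To make the "minimal model with the best generalisation" idea concrete, the sketch below performs forward selection of terms for a linear-in-the-parameter model, scoring each candidate term by k-fold cross-validation. The simulated system, the monomial dictionary and the stopping rule are illustrative assumptions, not a method taken from the article.

```python
# Minimal sketch (illustrative assumptions): forward selection of regressors
# for a linear-in-the-parameter model, scored by k-fold cross-validation.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 200)
y = np.sin(3.0 * x) + 0.1 * rng.standard_normal(200)   # unknown system + noise

# Candidate dictionary: monomials up to degree 7 (linear in the parameters).
candidates = {f"x^{d}": x**d for d in range(8)}

def cv_mse(columns, k=5):
    """k-fold CV mean-squared error of a least-squares fit on the chosen terms."""
    X = np.column_stack([candidates[c] for c in columns])
    idx = np.arange(len(y)) % k
    errs = []
    for fold in range(k):
        tr, te = idx != fold, idx == fold
        theta, *_ = np.linalg.lstsq(X[tr], y[tr], rcond=None)
        errs.append(np.mean((y[te] - X[te] @ theta) ** 2))
    return float(np.mean(errs))

selected, best = [], np.inf
while True:
    remaining = [c for c in candidates if c not in selected]
    if not remaining:
        break
    scores = {c: cv_mse(selected + [c]) for c in remaining}
    term, score = min(scores.items(), key=lambda kv: kv[1])
    if score >= best:          # stop when adding a term no longer helps
        break
    selected, best = selected + [term], score

print("Selected terms:", selected, "CV MSE:", round(best, 4))
```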

Relevance:

30.00%

Publisher:

Abstract:

Research on the selective reduction of NOx with hydrocarbons under lean-burn conditions using non-zeolitic oxides and platinum group metal (PGM) catalysts has been critically reviewed. Alumina and silver-promoted alumina catalysts have been described in detail with particular emphasis on an analysis of the various reaction mechanisms that have been put forward in the literature. The influence of the nature of the reducing agent, and the preparation and structure of the catalysts have also been discussed and rationalised for several other oxide systems. It is concluded for non-zeolitic oxides that species that are strongly adsorbed on the surface, such as nitrates/nitrites and acetates, could be key intermediates in the formation of various reduced and oxidised species of nitrogen, the further reaction of which leads eventually to the formation of molecular nitrogen. For the platinum group metal catalysts, the different mechanisms that have been proposed in the literature have been critically assessed. It is concluded that although there is indirect, mainly spectroscopic, evidence for various reaction intermediates on the catalyst surface, it is difficult to confirm that any of these are involved in a critical mechanistic step because of a lack of a direct quantitative correlation between infrared and kinetic measurements. A simple mechanism which involves the dissociation of NO on a reduced metal surface to give N(ads) and O(ads), with subsequent desorption of N2 and N2O and removal of O(ads) by the reductant can explain many of the results with the platinum group metal catalysts, although an additional contribution from organo-nitro-type species may contribute to the overall NOx reduction activity with these catalysts.
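
One possible way to write the "simple mechanism" invoked in the final sentence is as schematic elementary surface steps. The sketch below is only an illustration of that verbal description: the N2O-forming step and the stoichiometry of the reductant step are assumptions, since the abstract does not specify them ( * denotes a vacant metal site, X* an adsorbed species, HC the hydrocarbon reductant).

```latex
% Schematic elementary steps for the simple dissociative mechanism (illustrative).
\begin{align*}
  \mathrm{NO} + 2\,{*} &\longrightarrow \mathrm{N}^{*} + \mathrm{O}^{*}\\
  2\,\mathrm{N}^{*} &\longrightarrow \mathrm{N_2} + 2\,{*}\\
  \mathrm{N}^{*} + \mathrm{NO}^{*} &\longrightarrow \mathrm{N_2O} + 2\,{*}\\
  \mathrm{O}^{*} + \mathrm{HC} &\longrightarrow \mathrm{CO_2},\ \mathrm{H_2O} + {*}
\end{align*}
```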

Relevance:

30.00%

Publisher:

Abstract:

A problem with use of the geostatistical Kriging error for optimal sampling design is that the design does not adapt locally to the character of spatial variation. This is because a stationary variogram or covariance function is a parameter of the geostatistical model. The objective of this paper was to investigate the utility of non-stationary geostatistics for optimal sampling design. First, a contour data set of Wiltshire was split into 25 equal sub-regions and a local variogram was predicted for each. These variograms were fitted with models and the coefficients used in Kriging to select optimal sample spacings for each sub-region. Large differences existed between the designs for the whole region (based on the global variogram) and for the sub-regions (based on the local variograms). Second, a segmentation approach was used to divide a digital terrain model into separate segments. Segment-based variograms were predicted and fitted with models. Optimal sample spacings were then determined for the whole region and for the sub-regions. It was demonstrated that the global design was inadequate, grossly over-sampling some segments while under-sampling others.
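
The local step that precedes choosing a kriging-based sample spacing for a sub-region is the estimation and fitting of that sub-region's variogram. The sketch below estimates an empirical semivariogram from scattered samples and fits an exponential model; the coordinates, values and model choice are illustrative assumptions, not the paper's Wiltshire or terrain data.

```python
# Minimal sketch (hypothetical data): empirical semivariogram for one sub-region
# and an exponential model fit, the precursor to local optimal sample spacing.
import numpy as np
from scipy.optimize import curve_fit
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
coords = rng.uniform(0.0, 1000.0, size=(150, 2))                  # metres, assumed
values = 50.0 + 0.01 * coords[:, 0] + rng.normal(0.0, 2.0, 150)   # "elevation", assumed

# Empirical semivariogram: half the mean squared difference per lag bin.
h = pdist(coords)                                    # pairwise separation distances
g = 0.5 * pdist(values[:, None], "sqeuclidean")      # per-pair semivariance
bins = np.linspace(0.0, 500.0, 11)
lag = 0.5 * (bins[:-1] + bins[1:])
gamma = np.array([g[(h >= lo) & (h < hi)].mean()
                  for lo, hi in zip(bins[:-1], bins[1:])])

def exponential(h, nugget, sill, rng_):
    """Exponential variogram model: gamma(h) = nugget + sill*(1 - exp(-h/range))."""
    return nugget + sill * (1.0 - np.exp(-h / rng_))

params, _ = curve_fit(exponential, lag, gamma, p0=[1.0, 5.0, 200.0], maxfev=10_000)
print("nugget, sill, range:", np.round(params, 2))
```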

Relevance:

30.00%

Publisher:

Abstract:

Modern society depends on complex agro-ecological and trading systems to provide food for urban residents, yet there are few tools available to assess whether these systems are vulnerable to future disturbances. We propose a preliminary framework to assess the vulnerability of food systems to future shocks based on landscape ecology's 'Panarchy Framework'. According to Panarchy, ecosystem vulnerability is determined by three generic characteristics: (1) the wealth available in the system, (2) how connected the system is, and (3) how much diversity exists in the system. In this framework, wealthy, non-diverse, tightly connected systems are highly vulnerable. The wealth of food systems can be measured using the approach pioneered by development economists to assess how poverty affects food security. Diversity can be measured using the tools investors use to measure the diversity of investment portfolios to assess financial risk. The connectivity of a system can be evaluated with the tools chemists use to assess the pathways chemicals use to flow through the environment. This approach can lead to better tools for creating policy designed to reduce vulnerability, and can help urban or regional planners identify where food systems are vulnerable to shocks and disturbances that may occur in the future.
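
To illustrate the portfolio-style diversity measure mentioned above, the sketch below computes two standard indices over the supply shares of a hypothetical city's food sources. The shares are invented and the choice of indices (Shannon and Herfindahl) is an assumption about how the framework's "diversity" characteristic might be quantified.

```python
# Minimal sketch (invented shares): portfolio-style diversity of a food supply.
import numpy as np

# Share of the city's food supply coming from each source region (sums to 1).
shares = np.array([0.45, 0.25, 0.15, 0.10, 0.05])

shannon = -np.sum(shares * np.log(shares))   # higher = more diverse
herfindahl = np.sum(shares ** 2)             # higher = more concentrated

print(f"Shannon diversity: {shannon:.3f}")
print(f"Herfindahl concentration: {herfindahl:.3f}")
```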

Relevance:

30.00%

Publisher:

Abstract:

We discuss how common problems arising with multi/many-core distributed architectures can be effectively handled through the co-design of parallel/distributed programming abstractions and of autonomic management of non-functional concerns. In particular, we demonstrate how restricted patterns (or skeletons) may be efficiently managed by rule-based autonomic managers. We discuss the basic principles underlying pattern+manager co-design, current implementations inspired by this approach, and some results achieved with proof-of-concept prototypes.
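
The pattern+manager co-design idea can be sketched with a task-farm skeleton whose degree of parallelism is adjusted by a trivial rule-based manager, keeping the functional code untouched. This is not the project's implementation; the workload, rule and threshold are invented for illustration.

```python
# Minimal sketch (invented workload): a task-farm skeleton plus a rule-based
# manager that adjusts the parallelism degree, a non-functional concern.
import time
from concurrent.futures import ThreadPoolExecutor

def worker(item):
    time.sleep(0.01)          # stand-in for the real per-item computation
    return item * item

def farm(items, nworkers):
    """Functional behaviour of the farm: apply worker to every item in parallel."""
    with ThreadPoolExecutor(max_workers=nworkers) as pool:
        return list(pool.map(worker, items))

# Rule-based manager: if a batch takes longer than the assumed service-time
# contract, add workers; the functional skeleton above is never modified.
nworkers, target_batch_time = 2, 0.20
for batch in range(5):
    start = time.perf_counter()
    farm(range(32), nworkers)
    elapsed = time.perf_counter() - start
    if elapsed > target_batch_time and nworkers < 16:
        nworkers += 2         # simple "increase parallelism degree" rule
    print(f"batch {batch}: {elapsed:.2f}s with {nworkers} workers")
```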