228 results for Optimisation problems
Abstract:
To bridge the gaps between traditional mesoscale modelling and microscale modelling, the National Center for Atmospheric Research, in collaboration with other agencies and research groups, has developed an integrated urban modelling system coupled to the Weather Research and Forecasting (WRF) model as a community tool to address urban environmental issues. The core of this WRF/urban modelling system consists of the following: (1) three methods with different degrees of freedom to parameterize urban surface processes, ranging from a simple bulk parameterization to a sophisticated multi-layer urban canopy model with an indoor–outdoor exchange sub-model that directly interacts with the atmospheric boundary layer, (2) coupling to fine-scale computational fluid dynamics Reynolds-averaged Navier–Stokes and large-eddy simulation models for transport and dispersion (T&D) applications, (3) procedures to incorporate high-resolution urban land use, building morphology, and anthropogenic heating data using the National Urban Database and Access Portal Tool (NUDAPT), and (4) an urbanized high-resolution land data assimilation system. This paper provides an overview of this modelling system; addresses the daunting challenges of initializing the coupled WRF/urban model and of specifying the potentially vast number of parameters required to execute the WRF/urban model; explores the model sensitivity to these urban parameters; and evaluates the ability of WRF/urban to capture urban heat islands, complex boundary-layer structures aloft, and urban plume T&D for several major metropolitan regions. Recent applications of this modelling system illustrate its promising utility, as a regional climate-modelling tool, to investigate impacts of future urbanization on regional meteorological conditions and on air quality under future climate change scenarios. Copyright © 2010 Royal Meteorological Society
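For orientation, in recent WRF releases the choice among the three urban options above is exposed through the `sf_urban_physics` switch in `namelist.input` (values as commonly documented: 0 = no explicit urban model beyond urban land-use categories in the land-surface scheme, 1 = single-layer urban canopy model, 2 = multi-layer BEP model, 3 = BEP with the indoor–outdoor building energy model; verify against the specific WRF version in use):

```
&physics
 sf_urban_physics = 1, 1, 1,
/
```

One value is given per nested domain.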
Abstract:
As the calibration and evaluation of flood inundation models are a prerequisite for their successful application, there is a clear need to ensure that the performance measures that quantify how well models match the available observations are fit for purpose. This paper evaluates the binary pattern performance measures that are frequently used to compare flood inundation models with observations of flood extent. This evaluation considers whether these measures are able to calibrate and evaluate model predictions in a credible and consistent way, i.e. identifying the underlying model behaviour for a number of different purposes such as comparing models of floods of different magnitudes or on different catchments. Through theoretical examples, it is shown that the binary pattern measures are not consistent for floods of different sizes, such that for the same vertical error in water level, a model of a flood of large magnitude appears to perform better than a model of a smaller magnitude flood. Further, the commonly used Critical Success Index (usually referred to as F(2)) is biased in favour of overprediction of the flood extent, and is also biased towards correctly predicting areas of the domain with smaller topographic gradients. Consequently, it is recommended that future studies consider carefully the implications of reporting conclusions using these performance measures. Additionally, future research should consider whether a more robust and consistent analysis could be achieved by using elevation comparison methods instead.
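The overprediction bias can be seen with a small numerical sketch (the cell counts below are illustrative, not taken from the paper): two models that each misclassify the same number of cells receive different scores depending on the direction of the error.

```python
# Critical Success Index F = A / (A + B + C), with A = hits (correctly
# predicted wet cells), B = false alarms, C = misses.
def csi(hits, false_alarms, misses):
    return hits / (hits + false_alarms + misses)

# 100 observed flood cells; each model misclassifies exactly 20 cells.
f_over = csi(hits=100, false_alarms=20, misses=0)   # overprediction: 100/120
f_under = csi(hits=80, false_alarms=0, misses=20)   # underprediction: 80/100
# The overpredicting model scores higher despite the same number of errors.
```

This asymmetry arises because false alarms enlarge the hit count's numerator opportunity (all observed cells can still be hit) while misses directly remove hits.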
Abstract:
Energy storage is a potential alternative to conventional network reinforcement of the low voltage (LV) distribution network to ensure the grid’s infrastructure remains within its operating constraints. This paper presents a study on the control of such storage devices, owned by distribution network operators. A deterministic model predictive control (MPC) controller and a stochastic receding horizon controller (SRHC) are presented, where the objective is to achieve the greatest peak reduction in demand, for a given storage device specification, taking into account the high level of uncertainty in the prediction of LV demand. The algorithms presented in this paper are compared to a standard set-point controller and benchmarked against a control algorithm with a perfect forecast. A specific case study, using storage on the LV network, is presented, and the results of each algorithm are compared. A comprehensive analysis is then carried out simulating a large number of LV networks of varying numbers of households. The results show that the performance of each algorithm is dependent on the number of aggregated households. However, on a typical aggregation, the novel SRHC algorithm presented in this paper is shown to outperform each of the comparable storage control techniques.
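The basic receding-horizon idea can be sketched with a deliberately simplified greedy controller: at each step it picks, by bisection, the lowest demand threshold whose shaving energy fits in the store, then charges or discharges towards that threshold. This is only an illustrative sketch with a perfect forecast and hypothetical device parameters; the paper's MPC and SRHC controllers solve a full optimization under demand uncertainty.

```python
# Greedy receding-horizon peak shaving (illustrative sketch only).
def shave(demand, horizon, e0, e_max, p_max):
    e, net = e0, []
    for t in range(len(demand)):
        fc = demand[t:t + horizon]          # forecast over the horizon
        # Bisect for the lowest threshold T whose shaving energy fits in store
        lo, hi = 0.0, max(fc)
        for _ in range(50):
            mid = (lo + hi) / 2
            if sum(max(0.0, d - mid) for d in fc) > e:
                lo = mid
            else:
                hi = mid
        T = hi
        discharge = min(max(0.0, demand[t] - T), p_max, e)
        charge = min(max(0.0, T - demand[t]), p_max, e_max - e)
        e += charge - discharge
        net.append(demand[t] - discharge + charge)
    return net

# Peaky demand profile: the controller pre-charges, then shaves the peak
grid = shave([1, 1, 1, 5, 5, 1, 1, 1], horizon=4, e0=4.0, e_max=8.0, p_max=3.0)
```

With imperfect forecasts the chosen threshold can be wrong, which is exactly the uncertainty the stochastic SRHC formulation is designed to handle.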
Abstract:
Background: Although it is well-established that children with language impairment (LI) and children with autism spectrum disorders (ASD) both show elevated levels of emotional and behavioural problems, the level and types of difficulties across the two groups have not previously been directly compared. Aims: To compare levels of emotional and behavioural problems in children with LI and children with ASD recruited from the same mainstream schools. Methods & Procedures: We measured teacher-reported emotional and behavioural problems using the Strengths and Difficulties Questionnaire (SDQ) in a sample of 5- to 13-year-old children with LI (N=62) and children with ASD (N=42) attending mainstream school but with identified special educational needs. Outcomes & Results: Both groups showed similarly elevated levels of emotional, conduct and hyperactivity problems. The only differences between the LI and ASD groups were on subscales assessing peer problems (which were higher in the ASD group) and prosocial behaviours (which were higher in the LI group). Overall, there were few associations between emotional and behavioural problems and child characteristics, reflecting the pervasive nature of these difficulties in children with LI and children with ASD, although levels of problems were higher in children with ASD with lower language ability. However, in the ASD group only, a measure of family socioeconomic status was associated with language ability and attenuated the association between language ability and emotional and behavioural problems. Conclusions & Implications: Children with LI and children with ASD in mainstream school show similarly elevated levels of emotional and behavioural problems, which require monitoring and may benefit from intervention.
Further work is required to identify the child, family and situational factors that place children with LI and children with ASD at risk of emotional and behavioural problems, and whether these differ between the two groups. This work can then guide the application of evidence-based interventions to these children.
Abstract:
Information technology has become heavily embedded in business operations. As business needs change over time, IT applications are expected to continue providing the required support. Whether the existing IT applications are still fit for the business purpose for which they were intended, or whether new IT applications should be introduced, is a strategic decision for business, IT and business-aligned IT. In this paper, we present a method which aims to analyse business functions and IT roles, and to evaluate business-aligned IT from both social and technical perspectives. The method introduces a set of techniques that systematically supports the evaluation of the existing IT applications in relation to their technical capabilities for maximising business value. Furthermore, we discuss the evaluation process and results, which are illustrated and validated through a real-life case study of a UK borough council, followed by a discussion of implications for researchers and practitioners.
Abstract:
The psychiatric and psychosocial evaluation of the heart transplant candidate can identify particular predictors of postoperative problems. These factors, as identified during the comprehensive evaluation phase, provide an assessment of the candidate in the context of the proposed transplantation protocol. Previous issues with compliance, substance abuse, and psychosis are clear indicators of postoperative problems. The prolonged waiting-list time provides an additional period to evaluate and support patients with a terminal disease who need a heart transplant and are undergoing prolonged hospitalization. Following transplantation, the patient is faced with the additional challenges of a new self-image, multiple concerns, anxiety, and depression. Ultimately, the success of the heart transplantation remains dependent upon the recipient's ability to cope psychologically and comply with the medication regimen. The limited resource of donor hearts and the high emotional and financial cost of heart transplantation lead to an exhaustive effort to select those patients who will benefit from the improved physical health the heart transplant confers.
Abstract:
Variational data assimilation is commonly used in environmental forecasting to estimate the current state of the system from a model forecast and observational data. The assimilation problem can be written simply in the form of a nonlinear least squares optimization problem. However, the practical solution of the problem in large systems requires many careful choices to be made in the implementation. In this article we present the theory of variational data assimilation and then discuss in detail how it is implemented in practice. Current solutions and open questions are discussed.
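The least-squares structure can be made concrete with a toy sketch (illustrative numbers, identity covariances, a linear observation operator; operational systems are vastly larger and use conjugate-gradient or quasi-Newton minimizers):

```python
# Toy 3D-Var: minimize J(x) = 0.5*(x-xb)'B^-1(x-xb) + 0.5*(y-Hx)'R^-1(y-Hx)
# with B = R = I and H observing only the first component.
xb = [1.0, 2.0]        # background state (prior model forecast)
y = [3.0]              # single observation of the first component

def grad(x):
    # dJ/dx = B^-1 (x - xb) - H' R^-1 (y - H x)
    innovation = y[0] - x[0]
    return [(x[0] - xb[0]) - innovation, x[1] - xb[1]]

x = list(xb)           # start the minimization from the background
for _ in range(500):   # steepest descent for simplicity
    x = [xi - 0.1 * gi for xi, gi in zip(x, grad(x))]
# Analysis: the observed component is pulled halfway towards y (equal
# background and observation weights); the unobserved component stays at xb.
```

Even this toy case shows the role of the covariances: scaling B or R changes how far the analysis moves from the background towards the observation.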
Abstract:
An apoptotic process is studied using models described by a system of differential equations derived from reaction-kinetics information. The mathematical model is re-formulated in a state-space robust control theory framework in which parametric and dynamic uncertainty can be modelled to account for variations naturally occurring in biological processes. We propose to handle the nonlinearities using neural networks.
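The kind of mass-action ODE model meant here can be sketched with a single toy activation reaction (the species names, rate constant and concentrations are hypothetical, purely for illustration):

```python
# Explicit Euler integration of a toy mass-action activation reaction
# A + B -> 2B with rate k (e.g. inactive procaspase A activated by
# already-active caspase B).
k, dt, steps = 1.0, 0.01, 1000
A, B = 1.0, 0.01
for _ in range(steps):
    r = k * A * B              # mass-action reaction rate
    A, B = A - r * dt, B + r * dt
# A + B is conserved; B grows sigmoidally as the substrate A is consumed.
```

Parametric uncertainty in the robust-control reformulation corresponds to treating rate constants such as `k` as uncertain intervals rather than fixed values.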
Abstract:
In this review I summarise some of the most significant advances of the last decade in the analysis and solution of boundary value problems for integrable partial differential equations in two independent variables. These equations arise widely in mathematical physics, and in order to model realistic applications, it is essential to consider bounded domains and inhomogeneous boundary conditions. I focus specifically on a general and widely applicable approach, usually referred to as the Unified Transform or Fokas Transform, that provides a substantial generalisation of the classical Inverse Scattering Transform. This approach preserves the conceptual efficiency and aesthetic appeal of the more classical transform approaches, but presents a distinctive and important difference. While the Inverse Scattering Transform follows the "separation of variables" philosophy, albeit in a nonlinear setting, the Unified Transform is based on the idea of synthesis, rather than separation, of variables. I will outline the main ideas in the case of linear evolution equations, and then illustrate their generalisation to certain nonlinear cases of particular significance.
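For the simplest linear example, the heat equation on the half-line, the starting point of the Unified Transform is the global relation coupling the transforms of the initial and boundary values. The following is a sketch in one standard set of conventions (notations vary between references):

```latex
% Heat equation q_t = q_{xx} on 0 < x < \infty, with half-line transform
% \hat{q}(k,t) = \int_0^\infty e^{-ikx} q(x,t)\,dx.
% The divergence form (e^{-ikx+k^2 t} q)_t = \big(e^{-ikx+k^2 t}(q_x + ikq)\big)_x
% integrates over x and t to the global relation, valid for \mathrm{Im}\,k \le 0:
e^{k^2 t}\,\hat{q}(k,t) = \hat{q}_0(k) - \tilde{g}_1(k,t) - ik\,\tilde{g}_0(k,t),
\qquad
\tilde{g}_j(k,t) = \int_0^t e^{k^2 s}\,\partial_x^j q(0,s)\,ds .
```

The solution is then synthesised as a contour integral in the complex $k$-plane, with the global relation used to eliminate the unknown boundary values.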
Abstract:
We extend extreme learning machine (ELM) classifiers to complex Reproducing Kernel Hilbert Spaces (RKHS) where the input/output variables as well as the optimization variables are complex-valued. A new family of classifiers, called complex-valued ELM (CELM) suitable for complex-valued multiple-input–multiple-output processing is introduced. In the proposed method, the associated Lagrangian is computed using induced RKHS kernels, adopting a Wirtinger calculus approach formulated as a constrained optimization problem similarly to the conventional ELM classifier formulation. When training the CELM, the Karush–Kuhn–Tucker (KKT) theorem is used to solve the dual optimization problem, which consists of simultaneously satisfying the smallest-training-error and smallest-output-weight-norm criteria. The proposed formulation also addresses aspects of quaternary classification within a Clifford algebra context. For 2D complex-valued inputs, user-defined complex-coupled hyper-planes divide the classifier input space into four partitions. For 3D complex-valued inputs, the formulation generates three pairs of complex-coupled hyper-planes through orthogonal projections. The six hyper-planes then divide the 3D space into eight partitions. It is shown that the CELM problem formulation is equivalent to solving six real-valued ELM tasks, which are induced by projecting the chosen complex kernel across the different user-defined coordinate planes. A classification example of powdered samples on the basis of their terahertz spectral signatures is used to demonstrate the advantages of the CELM classifiers compared to their SVM counterparts. The proposed classifiers retain the advantages of their ELM counterparts, in that they can perform multiclass classification with lower computational complexity than SVM classifiers. Furthermore, because of their ability to perform classification tasks fast, the proposed formulations are of interest to real-time applications.
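Since the CELM formulation reduces to solving several real-valued ELM tasks, the real-valued building block is worth sketching. The following is a minimal plain-Python ELM (random fixed sigmoid hidden layer, regularized least-squares output weights); the toy XOR data and all parameter values are illustrative, and none of the paper's complex-valued machinery, Wirtinger calculus or induced RKHS kernels is reproduced here.

```python
import math
import random

def train_elm(X, T, n_hidden, ridge=1e-3, seed=0):
    rng = random.Random(seed)
    d = len(X[0])
    # Random fixed input weights and biases -- never trained in an ELM
    W = [[rng.uniform(-1, 1) for _ in range(d)] for _ in range(n_hidden)]
    b = [rng.uniform(-1, 1) for _ in range(n_hidden)]

    def hidden(x):
        return [1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + bi)))
                for w, bi in zip(W, b)]

    H = [hidden(x) for x in X]
    n, m = n_hidden, len(H)
    # Normal equations (H'H + ridge*I) beta = H'T: the ridge term enforces
    # the small-output-weight-norm criterion alongside small training error
    A = [[sum(H[r][i] * H[r][j] for r in range(m)) + (ridge if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    y = [sum(H[r][i] * T[r] for r in range(m)) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        y[c], y[p] = y[p], y[c]
        for r in range(c + 1, n):
            f = A[r][c] / A[c][c]
            for k in range(c, n):
                A[r][k] -= f * A[c][k]
            y[r] -= f * y[c]
    beta = [0.0] * n
    for i in range(n - 1, -1, -1):
        beta[i] = (y[i] - sum(A[i][k] * beta[k] for k in range(i + 1, n))) / A[i][i]

    def predict(x):
        return sum(hi * bi for hi, bi in zip(hidden(x), beta))

    return predict
```

Only the linear output layer is solved for, which is why ELM training reduces to a single least-squares problem rather than the iterative optimization an SVM requires.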