27 results for Design Studio Model
in CentAUR: Central Archive, University of Reading - UK
Abstract:
New construction algorithms for radial basis function (RBF) network modelling are introduced, based on the A-optimality and D-optimality experimental design criteria respectively. We utilize new cost functions, based on experimental design criteria, for model selection that simultaneously optimize model approximation and either parameter variance (A-optimality) or model robustness (D-optimality). The proposed approaches build on the forward orthogonal least-squares (OLS) algorithm: the new A-optimality- and D-optimality-based cost functions are constructed on the basis of an orthogonalization process, which gains computational advantages and hence maintains the inherent efficiency of the conventional forward OLS approach. The proposed approach enhances the very popular forward-OLS-based RBF model construction method, since the resultant RBF models are constructed such that system dynamics approximation capability, model adequacy and robustness are optimized simultaneously. The numerical examples provided show significant improvement under the D-optimality design criterion, demonstrating that there is significant room for improvement in modelling via the popular RBF neural network.
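The abstract does not spell out the combined cost, so the following is only a minimal sketch of the general idea: forward OLS selection where each candidate regressor is orthogonalised against the already-selected terms, and the selection cost combines the (negated) error reduction with a D-optimality-style term -log(w'w) on the orthogonalised column (under orthogonalisation the determinant criterion decomposes additively). The weighting `beta` and all data are hypothetical.

```python
import numpy as np

def forward_ols_doptimality(P, y, n_terms, beta=1e-2):
    """Greedy forward OLS term selection with a D-optimality-style bonus.

    Candidates are orthogonalised against the already-selected columns
    (modified Gram-Schmidt); the cost of adding a candidate is its negated
    error reduction minus beta*log(w.w), so selection favours terms that
    both reduce the residual and keep the design well conditioned.
    """
    n, m = P.shape
    selected, W = [], []
    r = np.asarray(y, float).copy()
    for _ in range(n_terms):
        best_j, best_cost, best_w, best_g = None, np.inf, None, None
        for j in range(m):
            if j in selected:
                continue
            w = P[:, j].astype(float).copy()
            for wk in W:                      # orthogonalise against selected terms
                w -= (wk @ P[:, j]) / (wk @ wk) * wk
            energy = w @ w
            if energy < 1e-12:                # numerically dependent candidate
                continue
            g = (w @ r) / energy              # orthogonal LS coefficient
            err_red = g * g * energy          # residual error reduction
            cost = -err_red - beta * np.log(energy)
            if cost < best_cost:
                best_j, best_cost, best_w, best_g = j, cost, w, g
        selected.append(best_j)
        W.append(best_w)
        r = r - best_g * best_w
    return selected, r

# Illustrative data: the target depends only on columns 1 and 4
rng = np.random.default_rng(0)
P = rng.standard_normal((100, 8))
y = 2.0 * P[:, 1] - 3.0 * P[:, 4]
selected, residual = forward_ols_doptimality(P, y, n_terms=2, beta=1e-3)
```

On this noise-free example the two true regressors are picked and the residual collapses, illustrating the mechanism rather than reproducing the paper's exact criterion.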
Abstract:
A common problem in many data-based modelling algorithms, such as associative memory networks, is the curse of dimensionality. In this paper, a new two-stage neurofuzzy system design and construction algorithm (NeuDeC) for nonlinear dynamical processes is introduced to tackle this problem effectively. A new, simple preprocessing method is first derived and applied to reduce the rule base, followed by a fine model detection process on the reduced rule set using forward orthogonal least squares model structure detection. In both stages, new criteria based on A-optimality experimental design are used. In the preprocessing stage, a lower bound of the A-optimality design criterion is derived and applied as a subset selection metric; in the later stage, the A-optimality design criterion is incorporated into a new composite cost function that minimises model prediction error and penalises model parameter variance. The utilisation of NeuDeC leads to unbiased model parameters with low parameter variance, with the additional benefit of a parsimonious model structure. Numerical examples are included to demonstrate the effectiveness of this new modelling approach for high-dimensional inputs.
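For reference, the A-optimality criterion itself is simple to state: the trace of (W'W)^-1, which is proportional to the total variance of the least-squares parameter estimates for design matrix W. The sketch below shows that criterion and a hypothetical composite cost in the spirit of NeuDeC's second stage; the paper's derived lower bound and exact cost function are not reproduced, and the weighting `alpha` is illustrative.

```python
import numpy as np

def a_optimality(W):
    """A-optimality cost of a design matrix W: trace((W^T W)^-1), which is
    proportional to the total variance of the least-squares parameter
    estimates."""
    return np.trace(np.linalg.inv(W.T @ W))

def composite_cost(W, residual, alpha=0.1):
    """Hypothetical composite cost in the spirit of NeuDeC's second stage:
    squared prediction error plus an A-optimality penalty on parameter
    variance (the weighting alpha is illustrative)."""
    residual = np.asarray(residual, float)
    return residual @ residual + alpha * a_optimality(W)
```

Scaling the design up (better-excited regressors) shrinks the A-optimality cost: for the identity design it is 3.0, while doubling every column reduces it to 0.75.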
Abstract:
This paper describes a study of the use of immersive virtual reality technologies in the design of a new hospital. It uses Schön's concept of reflective practice and video-based methods to analyse the ways design teams approach and employ a full-scale 3D immersive environment, a CAVE, in collaborative design work. The analysis describes four themes relating to reflective practice occurring in the setting: orienting to the CAVE technology itself; orienting to the representation of the specific design within the CAVE; activities accounting for, or exploring alternatives within, the design for the use and users of the space; and more strategic interactions around how best to represent the design and model to the client within the CAVE setting. The analysis also reveals some unique aspects of design work in this environment. Perhaps most significantly, rather than enhancing or adding to an understanding of the design gained through paper-based or non-immersive digital representations, the CAVE often acts to challenge or surprise the participants as they experience the immersive, full-scale version of their own design.
Abstract:
This paper presents a controller design scheme for a priori unknown non-linear dynamical processes that are identified via an operating point neurofuzzy system from process data. Based on a neurofuzzy design and model construction algorithm (NeuDec) for a non-linear dynamical process, a neurofuzzy state-space model of controllable form is initially constructed. The control scheme based on closed-loop pole assignment is then utilized to ensure the time invariance and linearization of the state equations so that the system stability can be guaranteed under some mild assumptions, even in the presence of modelling error. The proposed approach requires a known state vector for the application of pole assignment state feedback. For this purpose, a generalized Kalman filtering algorithm with coloured noise is developed on the basis of the neurofuzzy state-space model to obtain an optimal state vector estimation. The derived controller is applied in typical output tracking problems by minimizing the tracking error. Simulation examples are included to demonstrate the operation and effectiveness of the new approach.
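The pole-assignment state feedback step on a controllable-form state-space model can be sketched with Ackermann's formula, a standard single-input method; the paper's neurofuzzy model construction and coloured-noise Kalman filter are not reproduced here, and the second-order plant below is purely illustrative.

```python
import numpy as np

def ackermann(A, B, poles):
    """Single-input pole-assignment gain K such that eig(A - B K) equals the
    desired poles (Ackermann's formula)."""
    n = A.shape[0]
    # Controllability matrix [B, AB, ..., A^(n-1) B]
    C = np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])
    coeffs = np.poly(poles)  # desired characteristic polynomial, highest power first
    phi = sum(c * np.linalg.matrix_power(A, n - i) for i, c in enumerate(coeffs))
    e_n = np.zeros((1, n))
    e_n[0, -1] = 1.0
    return e_n @ np.linalg.inv(C) @ phi

# Illustrative controllable second-order plant (not from the paper)
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
K = ackermann(A, B, [-4.0, -5.0])
closed_loop = np.sort(np.linalg.eigvals(A - B @ K).real)
```

The open-loop poles of this plant are -1 and -2; the feedback gain moves them to the requested -4 and -5, which is the mechanism the paper relies on (applied to its linearised neurofuzzy state equations) to guarantee stability.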
Abstract:
Glucokinase Regulatory Protein (GCKR) plays a central role in regulating both hepatic triglyceride and glucose metabolism. Fatty acids are key metabolic regulators, which interact with genetic factors and influence glucose metabolism and other metabolic traits. Omega-3 polyunsaturated fatty acids (n-3 PUFA) have been of considerable interest due to their potential to reduce metabolic syndrome (MetS) risk. Objective: To examine whether genetic variability at the GCKR gene locus is associated with the degree of insulin resistance, plasma concentrations of C-reactive protein (CRP) and n-3 PUFA in MetS subjects. Design: Homeostasis model assessment of insulin resistance (HOMA-IR), HOMA-B, plasma concentrations of C-peptide, CRP, fatty acid composition and the GCKR rs1260326-P446L polymorphism were determined in a cross-sectional analysis of 379 subjects with MetS participating in the LIPGENE dietary cohort. Results: Among subjects with n-3 PUFA levels below the population median, carriers of the common C/C genotype had higher plasma concentrations of fasting insulin (P = 0.019), C-peptide (P = 0.004), HOMA-IR (P = 0.008) and CRP (P = 0.032) than subjects carrying the minor T-allele (Leu446). In contrast, homozygous C/C carriers with n-3 PUFA levels above the median showed lower plasma concentrations of fasting insulin, C-peptide, HOMA-IR and CRP than individuals with the T-allele. Conclusions: We have demonstrated a significant interaction between the GCKR rs1260326-P446L polymorphism and plasma n-3 PUFA levels modulating insulin resistance and inflammatory markers in MetS subjects. Further studies are needed to confirm this gene-diet interaction in the general population and to establish whether targeted dietary recommendations can prevent MetS in genetically susceptible individuals.
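The HOMA indices used in this study follow the standard Matthews formulas (fasting glucose in mmol/L, fasting insulin in µU/mL); a minimal sketch, with example values that are illustrative rather than from the cohort:

```python
def homa_ir(fasting_glucose_mmol_l, fasting_insulin_uU_ml):
    """HOMA-IR (insulin resistance): glucose [mmol/L] * insulin [uU/mL] / 22.5."""
    return fasting_glucose_mmol_l * fasting_insulin_uU_ml / 22.5

def homa_b(fasting_glucose_mmol_l, fasting_insulin_uU_ml):
    """HOMA-B (beta-cell function): 20 * insulin / (glucose - 3.5)."""
    return 20.0 * fasting_insulin_uU_ml / (fasting_glucose_mmol_l - 3.5)

# e.g. fasting glucose 5.0 mmol/L, fasting insulin 9.0 uU/mL
ir = homa_ir(5.0, 9.0)   # -> 2.0
b = homa_b(5.0, 9.0)     # -> 120.0
```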
Abstract:
The persuasive design of e-commerce websites has been shown to support people in making online purchases, so it is important to understand how persuasive applications are used and assimilated into e-commerce website designs. This paper demonstrates how the persuasive features of the Persuasive Systems Design (PSD) model can be used as a bridge supporting the extraction and evaluation of persuasive features in e-commerce websites, thus explaining in practical terms how feature implementation can enhance website persuasiveness. To support a deeper understanding of persuasive e-commerce website design, this research uses the PSD model to identify the distinct persuasive features currently assimilated in ten successful e-commerce websites. The results revealed extensive use of persuasive features, particularly those related to dialogue support, credibility support and primary task support, while also highlighting weaknesses in the implementation of social support features. In conclusion, we suggest possible ways of enhancing persuasive feature implementation via appropriate contextual examples and explanations.
Abstract:
The formulation of a new process-based crop model, the general large-area model (GLAM) for annual crops, is presented. The model has been designed to operate on spatial scales commensurate with those of global and regional climate models. It aims to simulate the impact of climate on crop yield. Procedures for model parameter determination and optimisation are described, and demonstrated for the prediction of groundnut (i.e. peanut; Arachis hypogaea L.) yields across India for the period 1966-1989. Optimal parameters (e.g. extinction coefficient, transpiration efficiency, rate of change of harvest index) were stable over space and time, provided the estimate of the yield technology trend was based on the full 24-year period. The model has two location-specific parameters: the planting date and the yield gap parameter. The latter varies spatially and is determined by calibration; its optimal value varies slightly when different input data are used. The model was tested using a historical data set on a 2.5° × 2.5° grid to simulate yields. Three sites are examined in detail: grid cells from Gujarat in the west, Andhra Pradesh towards the south, and Uttar Pradesh in the north. Agreement between observed and modelled yield was variable, with correlation coefficients of 0.74, 0.42 and 0, respectively. Skill was highest where the climate signal was greatest, and correlations were comparable to or greater than correlations with seasonal mean rainfall. Yields from all 35 cells were aggregated to simulate all-India yield; the correlation coefficient between observed and simulated yields was 0.76, and the root mean square error was 8.4% of the mean yield. The model can easily be extended to any annual crop for investigating the impacts of climate variability (or change) on crop yield over large areas. (C) 2004 Elsevier B.V. All rights reserved.
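The two skill measures quoted (correlation coefficient between observed and simulated yields, and RMSE as a percentage of the mean observed yield) are straightforward to compute; the yield values below are made up for illustration, not taken from the paper:

```python
import numpy as np

def yield_skill(observed, simulated):
    """Correlation coefficient and RMSE as a percentage of the observed
    mean: the two skill measures quoted in the abstract."""
    obs = np.asarray(observed, float)
    sim = np.asarray(simulated, float)
    r = np.corrcoef(obs, sim)[0, 1]
    rmse_pct = 100.0 * np.sqrt(np.mean((obs - sim) ** 2)) / obs.mean()
    return r, rmse_pct

# Illustrative (made-up) yields in t/ha
r, rmse_pct = yield_skill([1.1, 0.9, 1.4, 1.2], [1.0, 0.95, 1.3, 1.25])
```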
Abstract:
Presented herein is an experimental design that allows the effects of several radiative forcing factors on climate to be estimated as precisely as possible from a limited suite of atmosphere-only general circulation model (GCM) integrations. The forcings include the combined effect of observed changes in sea surface temperatures, sea ice extent, stratospheric (volcanic) aerosols and solar output, plus the individual effects of several anthropogenic forcings. A single linear statistical model is used to estimate the forcing effects, each of which is represented by its global mean radiative forcing. The strong collinearity in time between the various anthropogenic forcings poses a technical problem that is overcome through the design of the experiment: every combination of anthropogenic forcings is used, rather than the few highly replicated ensembles more common in climate studies. Not only is this design highly efficient for a given number of integrations, but it also allows the estimation of (non-additive) interactions between pairs of anthropogenic forcings. The simulated land surface air temperature changes since 1871 have been analysed. The changes in natural and oceanic forcing, which itself contains some forcing from anthropogenic and natural influences, have the most influence. For the global mean, increasing greenhouse gases and the indirect aerosol effect had the largest anthropogenic effects. An interaction between these two anthropogenic effects was also found in the atmosphere-only GCM; this interaction is similar in magnitude to the individual effects of changing tropospheric and stratospheric ozone concentrations or to the direct (sulfate) aerosol effect. Various diagnostics are used to evaluate the fit of the statistical model. For the global mean, these show that the land temperature response is proportional to the global mean radiative forcing, reinforcing the use of radiative forcing as a measure of climate change. The diagnostic tests also show that the linear model is suitable for analyses of land surface air temperature at each GCM grid point, so the linear model provides precise estimates of the space-time signals for all forcing factors under consideration. For simulated 50-hPa temperatures, results show that tropospheric ozone increases have contributed to stratospheric cooling over the twentieth century almost as much as changes in well-mixed greenhouse gases.
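The design described, every on/off combination of the anthropogenic forcings with main effects and pairwise interactions estimated by a single linear model, can be sketched as a full factorial experiment. The factors and the synthetic response below are illustrative, not the paper's forcings or GCM output:

```python
import itertools
import numpy as np

def factorial_design(k):
    """All 2^k on/off combinations of k forcing factors (full factorial)."""
    return np.array(list(itertools.product([0.0, 1.0], repeat=k)))

def estimate_effects(X, y):
    """Least-squares estimates of an intercept, the main effect of each
    factor, and all pairwise interactions, from the designed runs."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i, j in itertools.combinations(range(k), 2)]
    D = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(D, y, rcond=None)
    return beta

# Three hypothetical forcings: the response has two main effects plus one
# non-additive interaction, which the design can separate cleanly.
X = factorial_design(3)
y = 1.0 + 2.0 * X[:, 0] + 3.0 * X[:, 1] - 1.0 * X[:, 0] * X[:, 1]
beta = estimate_effects(X, y)
```

Because the full factorial varies every factor in every combination, the main effects and interactions are identifiable from a single linear fit, which is the efficiency argument the abstract makes against a few highly replicated ensembles.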
Abstract:
The emergent requirements for effective e-learning call for a paradigm shift in instructional design. Constructivist theory and semiotics offer a sound underpinning for such revolutionary change through the concept of Learning Objects. E-learning guidelines adopted by industry have successfully led to the development of training materials; this paper identifies the inadequacies of those methods for higher education. Based on best practice in industry and our empirical research, we present an instructional design model with practical templates for constructivist learning.
Abstract:
Procurement is one of the major business operations in the public service sector. Advances in information and communication technology (ICT) push this operation to increase its efficiency and to foster collaboration between an organisation and its suppliers, leading to a shift from traditional procurement transactions to an e-procurement paradigm. This change affects business processes, information management and decision making. E-procurement involves various stakeholders who engage in activities based on different social and cultural practices, so the design of an e-procurement system may involve the analysis of complex situations. This paper describes an approach that uses the problem articulation method to support such analysis, applied to a case study from the UAE.
Abstract:
There is growing concern about reducing greenhouse gas emissions all over the world. In its post-Copenhagen report on climate change, the U.K. has set targets of a 34% reduction in emissions by 2020 and an 80% reduction by 2050, relative to 1990 levels. In practice, Life Cycle Cost (LCC) and Life Cycle Assessment (LCA) tools have been introduced to the construction industry to help achieve this. However, there is a clear disconnection between costs and environmental impacts over the life cycle of a built asset when these two tools are used. Moreover, changes in Information and Communication Technologies (ICTs) have changed the way information is represented; in particular, information is fed more easily and distributed more quickly to different stakeholders through tools such as Building Information Modelling (BIM), with little consideration given to incorporating LCC and LCA and maximising their use within the BIM environment. The aim of this paper is to propose the development of a model-based LCC and LCA tool that supports sustainable building design decisions for clients, architects and quantity surveyors, so that an optimal investment decision can be made by studying the trade-off between costs and environmental impacts. An application framework is also proposed as future work, showing how the proposed model can be incorporated into the BIM environment in practice.
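The cost side of the proposed trade-off rests on a standard LCC building block: discounting a stream of running costs back to present value alongside the capital cost. A minimal sketch (the paper's tool and its BIM integration are not reproduced; the figures are illustrative):

```python
def life_cycle_cost(capital_cost, annual_costs, discount_rate):
    """Present value of a cost stream: capital cost now plus each year's
    running cost discounted back to year zero (a core LCC calculation)."""
    pv = capital_cost
    for year, cost in enumerate(annual_costs, start=1):
        pv += cost / (1.0 + discount_rate) ** year
    return pv

# e.g. 100 up front plus two years of running costs of 10, undiscounted
total = life_cycle_cost(100.0, [10.0, 10.0], 0.0)  # -> 120.0
```

A model-based tool of the kind proposed would pair such a cost figure with an LCA impact figure for each design option, so the client can weigh the two explicitly.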