16 results for Set design
Abstract:
The formulation of a new process-based crop model, the general large-area model (GLAM) for annual crops, is presented. The model has been designed to operate on spatial scales commensurate with those of global and regional climate models. It aims to simulate the impact of climate on crop yield. Procedures for model parameter determination and optimisation are described, and demonstrated for the prediction of groundnut (i.e. peanut; Arachis hypogaea L.) yields across India for the period 1966-1989. Optimal parameters (e.g. extinction coefficient, transpiration efficiency, rate of change of harvest index) were stable over space and time, provided the estimate of the yield technology trend was based on the full 24-year period. The model has two location-specific parameters: the planting date and the yield gap parameter. The latter varies spatially and is determined by calibration. The optimal value varies slightly when different input data are used. The model was tested using a historical data set on a 2.5° × 2.5° grid to simulate yields. Three sites are examined in detail: grid cells from Gujarat in the west, Andhra Pradesh towards the south, and Uttar Pradesh in the north. Agreement between observed and modelled yield was variable, with correlation coefficients of 0.74, 0.42 and 0, respectively. Skill was highest where the climate signal was greatest, and correlations were comparable to or greater than correlations with seasonal mean rainfall. Yields from all 35 cells were aggregated to simulate all-India yield. The correlation coefficient between observed and simulated yields was 0.76, and the root mean square error was 8.4% of the mean yield. The model can be easily extended to any annual crop for the investigation of the impacts of climate variability (or change) on crop yield over large areas. (C) 2004 Elsevier B.V. All rights reserved.
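As an illustration of the skill measures quoted above (not taken from the paper; the yield series here are purely hypothetical), the correlation coefficient and the root mean square error expressed as a percentage of the mean observed yield can be computed as follows:

```python
# Minimal sketch with made-up yield series; computes the two skill scores
# reported in the abstract: Pearson correlation and RMSE as % of mean yield.
import numpy as np

observed = np.array([0.81, 0.74, 0.92, 0.88, 0.79, 0.95])   # t/ha, hypothetical
simulated = np.array([0.78, 0.70, 0.97, 0.84, 0.83, 0.90])  # t/ha, hypothetical

r = np.corrcoef(observed, simulated)[0, 1]                   # Pearson correlation
rmse = np.sqrt(np.mean((observed - simulated) ** 2))         # root mean square error
rmse_pct = 100.0 * rmse / observed.mean()                    # RMSE as % of mean yield

print(f"r = {r:.2f}, RMSE = {rmse_pct:.1f}% of mean yield")
```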
Abstract:
Plant communities of set-aside agricultural land in a European project were managed in order to enhance plant succession towards weed-resistant, mid-successional grassland. Here, we ask if the management of a plant community affects the earthworm community. Field experiments were established in four countries: the Netherlands, Sweden, the UK, and the Czech Republic. High-diversity (15 plant species) and low-diversity (four plant species) seed mixtures were sown as management practices, with natural colonization as the control treatment in a randomized block design. The response of the earthworms to the management was studied three summers after establishment of the sites. Samples were also taken from plots with continued agricultural practices included in the experimental design and from a site with a late successional plant community representing the target plant community. The numbers and biomass of individuals were higher in the set-aside plots than in the agricultural treatment in two countries out of four. The number of individuals at one site (the Netherlands) was higher in the naturally colonized plots than in the sowing treatments; otherwise there were no differences between the treatments. Species diversity was lower in the agricultural plots in one country. The species composition had changed from the initial community of the agricultural field, but was still different from a late successional target community. The worm biomass was positively related to legume biomass in Sweden and to grass biomass in the UK. (C) 2005 Elsevier SAS. All rights reserved.
Abstract:
Purpose: Acquiring details of kinetic parameters of enzymes is crucial to biochemical understanding, drug development, and clinical diagnosis in ocular diseases. The correct design of an experiment is critical to collecting data suitable for analysis, modelling and deriving the correct information. As classical design methods are not targeted to the more complex kinetics being frequently studied, attention is needed to estimate parameters of such models with low variance. Methods: We have developed Bayesian utility functions to minimise kinetic parameter variance involving differentiation of model expressions and matrix inversion. These have been applied to the simple kinetics of the enzymes in the glyoxalase pathway (of importance in posttranslational modification of proteins in cataract), and the complex kinetics of lens aldehyde dehydrogenase (also of relevance to cataract). Results: Our successful application of Bayesian statistics has allowed us to identify a set of rules for designing optimum kinetic experiments iteratively. Most importantly, the distribution of points in the range is critical; it is not simply a matter of even or multiple increases. At least 60% of the points must be below the KM (or KMs, if there is more than one dissociation constant) and 40% above. This choice halves the variance found using a simple even spread across the range. With both the glyoxalase system and lens aldehyde dehydrogenase we have significantly improved the variance of kinetic parameter estimation while reducing the number and costs of experiments. Conclusions: We have developed an optimal and iterative method for selecting features of design such as substrate range, number of measurements and choice of intermediate points. Our novel approach minimises parameter error and costs, and maximises experimental efficiency. It is applicable to many areas of ocular drug design, including receptor-ligand binding and immunoglobulin binding, and should be an important tool in ocular drug discovery.
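A minimal sketch of the point-placement rule described above (the prior values, design size and range are hypothetical, and this is not the authors' code): roughly 60% of substrate concentrations are placed below KM and 40% above, and each candidate design is scored by the A-optimality criterion, the trace of the inverse Fisher information, for a Michaelis-Menten model v = Vmax*S/(KM + S).

```python
import numpy as np

Vmax, KM = 1.0, 2.0                                     # hypothetical prior estimates

def a_score(S):
    """trace((J'J)^-1) for Michaelis-Menten sensitivities at design points S."""
    J = np.column_stack([S / (KM + S),                  # dv/dVmax
                         -Vmax * S / (KM + S) ** 2])    # dv/dKM
    return np.trace(np.linalg.inv(J.T @ J))

even = np.linspace(0.2 * KM, 5.0 * KM, 10)              # even spread across the range
rule = np.concatenate([np.linspace(0.2 * KM, KM, 6),    # 60% of points below KM
                       np.linspace(1.5 * KM, 5.0 * KM, 4)])  # 40% above KM

print("even spread :", a_score(even))
print("60/40 rule  :", a_score(rule))  # lower trace implies lower parameter variance
```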
Abstract:
Taipei City has put significant effort into the implementation of green design and green building schemes in pursuit of a sustainable eco-city. Although some environmental indicators have not shown significant progress in environmental improvement, implementing the two schemes has produced considerable results; the two schemes are therefore on the right path towards promoting a sustainable eco-city. However, it has to be admitted that the two schemes constitute a rather “technocratic” set of solutions and an eco-centric approach. It is suggested that not only the public sector but also the private sector needs to put more effort into implementing the schemes, and that the government needs to encourage the private sector to adopt the schemes in practice.
Abstract:
Semiotics is the study of signs. Application of semiotics in information systems design is based on the notion that information systems are organizations within which agents deploy signs in the form of actions according to a set of norms. An analysis of the relationships among the agents, their actions and the norms would give a better specification of the system. Distributed multimedia systems (DMMS) can be viewed as systems consisting of many dynamic, self-controlled normative agents engaging in complex interaction and processing of multimedia information. This paper reports work on applying the semiotic approach to the design and modeling of DMMS, with emphasis on using semantic analysis under the semiotic framework. A semantic model of DMMS describing various components and their ontological dependencies is presented, which then serves as a design model and is implemented in a semantic database. Benefits of using the semantic database are discussed with reference to various design scenarios.
Abstract:
There is growing concern about reducing greenhouse gas emissions all over the world. In the recent Post Copenhagen Report on Climate Change, the UK set targets of a 34% reduction in emissions by 2020 and an 80% reduction by 2050, compared with 1990 levels. In practice, Life Cycle Cost (LCC) and Life Cycle Assessment (LCA) tools have been introduced to the construction industry in order to achieve this. However, there is a clear disconnection between costs and environmental impacts over the life cycle of a built asset when using these two tools. In addition, changes in Information and Communication Technologies (ICTs) have changed the way information is represented; in particular, information is being fed more easily and distributed more quickly to different stakeholders through tools such as Building Information Modelling (BIM), with little consideration given to incorporating LCC and LCA and maximising their usage within the BIM environment. The aim of this paper is to propose the development of a model-based LCC and LCA tool in order to provide sustainable building design decisions for clients, architects and quantity surveyors, so that an optimal investment decision can be made by studying the trade-off between costs and environmental impacts. Finally, an application framework is proposed as future work, showing how the proposed model can be incorporated into the BIM environment in practice.
Abstract:
A common problem in many data-based modelling algorithms such as associative memory networks is the curse of dimensionality. In this paper, a new two-stage neurofuzzy system design and construction algorithm (NeuDeC) for nonlinear dynamical processes is introduced to effectively tackle this problem. A new simple preprocessing method is initially derived and applied to reduce the rule base, followed by a fine model detection process based on the reduced rule set using forward orthogonal least squares model structure detection. In both stages, new A-optimality experimental design-based criteria were used. In the preprocessing stage, a lower bound of the A-optimality design criterion is derived and applied as a subset selection metric, while in the later stage the A-optimality design criterion is incorporated into a new composite cost function that minimises model prediction error and penalises model parameter variance. The utilisation of NeuDeC leads to unbiased model parameters with low parameter variance and the additional benefit of a parsimonious model structure. Numerical examples are included to demonstrate the effectiveness of this new modelling approach for high-dimensional inputs.
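The composite cost described above can be sketched as follows (hypothetical data and weighting, not the NeuDeC implementation): a forward selection loop scores each candidate regressor by prediction error plus an A-optimality penalty on parameter variance, trace((X'X)^-1).

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))                 # candidate regressors (hypothetical rule base)
y = X[:, 0] - 2.0 * X[:, 3] + 0.1 * rng.normal(size=200)
lam = 0.05                                     # weight on the parameter-variance penalty

def composite_cost(cols):
    Xs = X[:, cols]
    theta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    mse = np.mean((y - Xs @ theta) ** 2)       # model prediction error term
    a_opt = np.trace(np.linalg.inv(Xs.T @ Xs)) # A-optimality (parameter variance) term
    return mse + lam * a_opt

selected, remaining = [], list(range(X.shape[1]))
for _ in range(4):                             # build a small, parsimonious subset
    best = min(remaining, key=lambda j: composite_cost(selected + [j]))
    selected.append(best)
    remaining.remove(best)

print("selected regressors:", selected)
```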
Abstract:
Limnologists had an early preoccupation with lake classification. It gave a necessary structure to the many chemical and biological observations that were beginning to form the basis of one of the earliest truly environmental sciences. August Thienemann was the doyen of such classifiers, and his concept, with Einar Naumann, of oligotrophic and eutrophic lakes remains central to the world-view that limnologists still have. Classification fell into disrepute, however, as it became clear that there would always be lakes that deviated from the prescriptions that the classifiers made for them. Continua became the de rigueur concept and lakes were seen as varying along many chemical, biological and geographic axes. Modern limnologists are comfortable with this concept. That all lakes are different guarantees an indefinite future for limnological research. For those who manage lakes and the landscapes in which they are set, however, it is not very useful. There may be as many as 300,000 standing water bodies in England and Wales alone and maybe as many again in Scotland. More than 80,000 are sizable (> 1 ha). Some classification scheme to cope with these numbers is needed and, as human impacts on them increase, a system of assessing and monitoring change must be built into such a scheme. Although ways of classifying and monitoring running waters are well developed in the UK, the same is not true of standing waters. Sufficient understanding of what determines the nature and functioning of lakes exists to create a system which has intellectual credibility as well as practical usefulness. This paper outlines the thinking behind a system which will be workable on a north European basis and presents some early results.
Abstract:
Norms are a set of rules that govern the behaviour of human agents and how they behave in response to given conditions. This paper investigates the overlapping of information fields (sets of shared norms) in the Context State Transition Model, and how these overlapping fields may affect the choices and actions of human agents. The paper also includes a discussion of the implementation of new conflict resolution strategies based on the situation specification. The reasoning about conflicting norms in multiple information fields is discussed in detail.
Cross-layer design for MIMO systems over spatially correlated and keyhole Nakagami-m fading channels
Abstract:
Cross-layer design is a generic designation for a set of efficient adaptive transmission schemes, across multiple layers of the protocol stack, that are aimed at enhancing the spectral efficiency and increasing the transmission reliability of wireless communication systems. In this paper, one such cross-layer design scheme that combines physical layer adaptive modulation and coding (AMC) with link layer truncated automatic repeat request (T-ARQ) is proposed for multiple-input multiple-output (MIMO) systems employing orthogonal space-time block coding (OSTBC). The performance of the proposed cross-layer design is evaluated in terms of achievable average spectral efficiency (ASE), average packet loss rate (PLR) and outage probability, for which analytical expressions are derived, considering transmission over two types of MIMO fading channels, namely, spatially correlated Nakagami-m fading channels and keyhole Nakagami-m fading channels. Furthermore, the effects of the maximum number of ARQ retransmissions, numbers of transmit and receive antennas, Nakagami fading parameter and spatial correlation parameters, are studied and discussed based on numerical results and comparisons. Copyright © 2009 John Wiley & Sons, Ltd.
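A rough sketch of how an average spectral efficiency figure of this kind can be obtained (the mode rates and switching thresholds below are hypothetical, and this is not the paper's analysis): under AMC, the ASE is the sum of each mode's rate weighted by the probability that the received SNR falls in that mode's region, and for Nakagami-m fading the received SNR is gamma distributed with shape m and scale snr_bar/m.

```python
from scipy.stats import gamma

m, snr_bar = 2.0, 10.0                        # Nakagami parameter, mean SNR (linear scale)
rates = [0.5, 1.0, 2.0, 3.0, 4.5]             # bits/s/Hz per AMC mode (hypothetical)
thresholds = [1.0, 2.5, 6.0, 12.0, 25.0]      # mode switching SNR thresholds (hypothetical)

snr = gamma(a=m, scale=snr_bar / m)           # SNR distribution under Nakagami-m fading
edges = thresholds + [float("inf")]
ase = sum(r * (snr.cdf(hi) - snr.cdf(lo))     # rate weighted by mode probability
          for r, lo, hi in zip(rates, edges[:-1], edges[1:]))

# SNR below the first threshold corresponds to outage (no transmission, rate 0)
print(f"average spectral efficiency ≈ {ase:.2f} bits/s/Hz")
```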
Abstract:
The modern built environment has become more complex in terms of building types, environmental systems and use profiles. This complexity causes difficulties in optimising building energy design. In this circumstance, introducing a set of prototype reference buildings, or so-called benchmark buildings, able to represent all or most of the UK building stock may be useful for examining the impact of national energy policies on building energy consumption. This study proposes a set of reference office buildings for England and Wales based on the information collected from the Non-Domestic Building Stock (NDBS) project and an intensive review of the existing building benchmarks. The proposed building benchmark comprises 10 prototypical reference buildings, which, in relation to built form and size, represent 95% of office buildings in England and Wales. This building benchmark provides a platform for those involved in building energy simulations to evaluate energy-efficiency measures and for policy-makers to assess the influence of different building energy policies.
Abstract:
A universal systems design process is specified, tested in a case study and evaluated. It links English narratives to numbers using a categorical language framework with mathematical mappings taking the place of conjunctions and numbers. The framework is a ring of English narrative words between 1 (option) and 360 (capital); beyond 360 the ring cycles again to 1. English narratives are shown to correspond to the field of fractional numbers. The process can enable the development, presentation and communication of complex narrative policy information among communities of any scale, on a software implementation known as the "ecoputer". The information is more accessible and comprehensive than that in conventional decision support, because: (1) it is expressed in narrative language; and (2) the narratives are expressed as compounds of words within the framework. Hence option generation is made more effective than in conventional decision support processes including Multiple Criteria Decision Analysis, Life Cycle Assessment and Cost-Benefit Analysis. The case study is of a participatory workshop on UK bioenergy project objectives and criteria, at which attributes were elicited in environmental, economic and social systems. From the attributes, the framework was used to derive consequences at a range of levels of precision; these are compared with the project objectives and criteria as set out in the Case for Support. The design process is to be supported by a social information manipulation, storage and retrieval system for numeric and verbal narratives attached to the "ecoputer". The "ecoputer" will have an integrated verbal and numeric operating system. A novel design source-code language will assist the development of narrative policy. The utility of the program, including in the transition to sustainable development and in applications at both community micro-scale and policy macro-scale, is discussed from public, stakeholder, corporate, governmental and regulatory perspectives.
Abstract:
Building Information Modeling (BIM) is the process of structuring, capturing, creating, and managing a digital representation of physical and/or functional characteristics of a built space [1]. Current BIM has limited ability to represent dynamic semantics and social information, often failing to consider building activity, behavior and context, thus limiting integration with intelligent built-environment management systems. Research such as the development of Semantic Exchange Modules, and/or the linking of IFC with semantic web structures, demonstrates the need for building models to better support complex semantic functionality. To implement model semantics effectively, however, it is critical that model designers consider semantic information constructs. This paper discusses semantic models in relation to determining the most suitable information structure. We demonstrate how semantic rigidity can lead to significant long-term problems that can contribute to model failure. A sufficiently detailed feasibility study is advised to maximize the value from the semantic model. In addition, we propose a set of questions, to be used during a model’s feasibility study, and guidelines to help assess the most suitable method for managing semantics in a built environment.
Abstract:
Purpose – The purpose of this paper is to introduce the debate forum on internationalization motives in this special issue of Multinational Business Review. Design/methodology/approach – The authors reflect on the background and evolution of the internationalization motives over the past few decades, and then provide suggestions for how to use the motives in future analyses. The authors also reflect on the contributions to the debate made by the accompanying articles of the forum. Findings – There continue to be new developments in the way in which firms organize themselves as multinational enterprises (MNEs), and this implies that the “classic” motives originally introduced by Dunning in 1993 need to be revisited. Dunning’s motives and arguments were deductive and atheoretical, and they were intended to be used as a toolkit in conjunction with other theories and frameworks. They are not an alternative to a classification of possible MNE strategies. Originality/value – This paper and the ones that accompany it provide a deeper and more nuanced understanding of internationalization motives for future research to build on.