880 results for Building information modeling
Abstract:
The spatial and temporal dynamics in the stream water NO3-N concentrations in a major European river system, the Garonne (62,700 km²), are described and related to variations in climate, land management, and effluent point sources using multivariate statistics. Building on this, the Hydrologiska Byrans Vattenbalansavdelning (HBV) rainfall-runoff model and the Integrated Catchment Model of Nitrogen (INCA-N) are applied to simulate the observed flow and N dynamics. This is done to help us to understand which factors and processes control the flow and N dynamics in different climate zones and to assess the relative inputs from diffuse and point sources across the catchment. This is the first application of the linked HBV and INCA-N models to a major European river system commensurate with the largest basins to be managed under the Water Framework Directive. The simulations suggest that in the lowlands, seasonal patterns in the stream water NO3-N concentrations emerge and are dominated by diffuse agricultural inputs, with an estimated 75% of the river load in the lowlands derived from arable farming. The results confirm earlier European catchment studies. Namely, current semi-distributed catchment-scale dynamic models, which integrate variations in land cover, climate, and a simple representation of the terrestrial and in-stream N cycle, are able to simulate seasonal NO3-N patterns at large spatial (> 300 km²) and temporal (≥ monthly) scales using available national datasets.
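As an illustration of the source apportionment described above, a minimal sketch follows. It is not the HBV or INCA-N code; all flows, concentrations and the constant point-source term are invented, and the calculation simply converts monthly flow and NO3-N concentration into a load before subtracting the assumed effluent contribution.

```python
# Hypothetical illustration of diffuse vs. point-source apportionment of a river NO3-N load.
SECONDS_PER_MONTH = 30.4 * 24 * 3600

def monthly_load_kg(flow_m3s, conc_mg_per_l):
    """NO3-N load for one month; mg/L equals g/m3, so flow*conc is g/s."""
    return flow_m3s * conc_mg_per_l * SECONDS_PER_MONTH / 1000.0  # g -> kg

flows = [850, 900, 700, 520, 400, 300, 220, 200, 260, 400, 600, 800]   # m3/s (invented)
concs = [2.1, 2.3, 2.0, 1.6, 1.2, 0.9, 0.7, 0.7, 0.9, 1.4, 1.8, 2.0]   # mg NO3-N/L (invented)
POINT_SOURCE_KG_PER_MONTH = 1.5e5  # assumed constant effluent input (invented)

total = sum(monthly_load_kg(q, c) for q, c in zip(flows, concs))
diffuse = total - 12 * POINT_SOURCE_KG_PER_MONTH
print(f"annual NO3-N load: {total:.3e} kg; estimated diffuse share: {diffuse / total:.0%}")
```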
Abstract:
This paper describes the user modeling component of EPIAIM, a consultation system for data analysis in epidemiology. The component is aimed at representing knowledge of concepts in the domain, so that their explanations can be adapted to user needs. The first part of the paper describes two studies aimed at analysing user requirements. The first one is a questionnaire study which examines the respondents' familiarity with concepts. The second one is an analysis of concept descriptions in textbooks and from expert epidemiologists, which examines how discourse strategies are tailored to the level of experience of the expected audience. The second part of the paper describes how the results of these studies have been used to design the user modeling component of EPIAIM. This module works in two steps. In the first step, a few trigger questions allow the activation of a stereotype that includes a "body" and an "inference component". The body is the representation of the body of knowledge that a class of users is expected to know, along with the probability that the knowledge is known. In the inference component, the learning process of concepts is represented as a belief network. Hence, in the second step the belief network is used to refine the initial default information in the stereotype's body. This is done by asking a few questions on those concepts where it is uncertain whether or not they are known to the user, and propagating this new evidence to revise the whole situation. The system has been implemented on a workstation under UNIX. An example of its operation is presented, and advantages and limitations of the approach are discussed.
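A hypothetical sketch of the two-step stereotype-plus-refinement idea summarised above (not the EPIAIM implementation): the stereotype "body" holds, per concept, the prior probability that a user of that class knows it, and a small dependency table stands in for the belief network, propagating an answer about one concept to its dependents. The concepts, priors and conditional probabilities are invented.

```python
# Invented stereotype for, e.g., "epidemiologist with little statistics background".
stereotype_body = {
    "incidence_rate": 0.95,
    "odds_ratio": 0.80,
    "logistic_regression": 0.40,
}

# (prerequisite, dependent) -> (P(knows dependent | knows prereq), P(knows dependent | not))
dependency = {("odds_ratio", "logistic_regression"): (0.55, 0.05)}

def refine(beliefs, concept, known):
    """Ask about `concept`, record the answer, and propagate to dependent concepts."""
    beliefs = dict(beliefs)
    beliefs[concept] = 1.0 if known else 0.0
    for (prereq, dep), (p_if_known, p_if_unknown) in dependency.items():
        if prereq == concept:
            beliefs[dep] = p_if_known if known else p_if_unknown
    return beliefs

print(refine(stereotype_body, "odds_ratio", known=False))
```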
Abstract:
It is known that germin, which is a marker of the onset of growth in germinating wheat, is an oxalate oxidase, and also that germins possess sequence similarity with legumin and vicilin seed storage proteins. These two pieces of information have been combined in order to generate a 3D model of germin based on the structure of vicilin and to examine the model with regard to a potential oxalate oxidase active site. A cluster of three histidine residues has been located within the conserved beta-barrel structure. While there is a relatively low level of overall sequence similarity between the model and the vicilin structures, the conservation of amino acids important in maintaining the scaffold of the beta-barrel lends confidence to the juxtaposition of the histidine residues. The cluster is similar structurally to those found in copper amine oxidase and other proteins, leading to the suggestion that it defines a metal-binding location within the oxalate oxidase active site. It is also proposed that the structural elements involved in intermolecular interactions in vicilins may play a role in oligomer formation in germin/oxalate oxidase.
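The geometric check behind "a cluster of three histidine residues has been located" can be illustrated with a short sketch. The coordinates, residue numbers and distance cutoff below are invented; the point is only the kind of test applied to a structural model when looking for a putative metal-binding site.

```python
# Hypothetical search for a histidine cluster in a set of model coordinates.
from itertools import combinations
from math import dist

# (residue name, residue number, CA coordinates in angstroms) - invented values
residues = [
    ("HIS",  88, (12.1, 4.3, 7.8)),
    ("GLY",  89, (14.0, 6.1, 9.2)),
    ("HIS",  90, (11.5, 7.9, 6.4)),
    ("HIS", 137, (13.2, 6.7, 5.1)),
    ("GLU", 140, (20.4, 3.3, 2.2)),
]

CUTOFF = 8.0  # angstroms between CA atoms; an assumed threshold for a "cluster"

histidines = [r for r in residues if r[0] == "HIS"]
for a, b, c in combinations(histidines, 3):
    if all(dist(x[2], y[2]) <= CUTOFF for x, y in combinations((a, b, c), 2)):
        print("candidate His cluster:", a[1], b[1], c[1])
```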
Abstract:
This paper investigates dendritic peptides capable of assembling into nanostructured gels, and explores the effect on self-assembly of mixing different molecular building blocks. Thermal measurements, small-angle X-ray scattering (SAXS) and circular dichroism (CD) spectroscopy are used to probe these materials on macroscopic, nanoscopic and molecular length scales. The results from these investigations demonstrate that in this case, systems with different "size" and "chirality" factors can self-organise, whilst systems with different "shape" factors cannot. The "size" and "chirality" factors are directly connected with the molecular information programmed into the dendritic peptides, whilst the shape factor depends on the group linking these peptides together; this is consistent with molecular recognition hydrogen bond pathways between the peptidic building blocks controlling the ability of these systems to self-recognise. These results demonstrate that mixtures of relatively complex peptides, with only subtle differences on the molecular scale, can self-organise into nanoscale structures, an important step in the spontaneous assembly of ordered systems from complex mixtures.
Abstract:
This paper reviews four approaches used to create rational tools to aid the planning and the management of the building design process and then proposes a fifth approach. The new approach that has been developed is based on the mechanical aspects of technology rather than subjective design issues. The knowledge base contains, for each construction technology, a generic model of the detailed design process. Each activity in the process is specified by its input and output information needs. By connecting the input demands of one technology with the output supply from another technology a map or network of design activity is formed. Thus, it is possible to structure a specific model from the generic knowledge base within a KBE system.
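A minimal sketch of the input-output matching described above (invented technologies and information items, not the cited knowledge base): each technology's design process declares what information it needs and what it produces, and the design-activity network is formed by connecting matching demands and supplies.

```python
# Hypothetical generic knowledge base: information needs and outputs per technology.
technologies = {
    "precast_cladding":  {"needs": {"frame_grid", "wind_loads"},     "produces": {"panel_sizes", "fixing_loads"}},
    "steel_frame":       {"needs": {"column_loads"},                 "produces": {"frame_grid"}},
    "foundation_design": {"needs": {"fixing_loads", "frame_grid"},   "produces": {"column_loads"}},
}

def design_network(techs):
    """Yield (producer, information item, consumer) edges of the design-activity map."""
    for consumer, spec in techs.items():
        for item in spec["needs"]:
            for producer, other in techs.items():
                if producer != consumer and item in other["produces"]:
                    yield producer, item, consumer

for edge in design_network(technologies):
    print("%s --%s--> %s" % edge)
```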
Abstract:
Information technology in construction (ITC) has been gaining wide acceptance and is being implemented in the construction research domains as a tool to assist decision makers. Most of the research into visualization technologies (VT) has been on the wide range of 3D and simulation applications suitable for construction processes. Despite its development with interoperability and standardization of products, VT usage has remained very low when it comes to communicating and addressing the needs of building end-users (BEU). This paper argues that building end users are a source of experience and expertise that can be brought into the briefing stage for the evaluation of design proposals. It also suggests that the end user is a source of new ideas promoting innovation. In this research a positivistic methodology that includes the comparison of 3D models and the traditional 2D methods is proposed. It will help to identify "how much", if anything, a non-spatial specialist can gain in terms of "understanding" of a particular design proposal presented, using both methods.
Abstract:
Design management research usually deals with the processes within the professional design team and yet, in the UK, the volume of the total project information produced by the specialist trade contractors equals or exceeds that produced by the design team. There is a need to understand the scale of this production task and to plan and manage it accordingly. The model of the process on which the plan is to be based, while generic, must be sufficiently robust to cover the majority of instances. An approach using design elements, in sufficient depth to possibly develop tools for a predictive model of the process, is described. The starting point is that each construction element and its components have a generic sequence of design activities. Specific requirements tailor the element's application to the building. Then there are the constraints produced due to the interaction with other elements. Therefore, the selection of a component within the element may impose a set of constraints that will affect the choice of other design elements. Thus, a design decision can be seen as an interrelated element-constraint-element (ECE) sub-net. To illustrate this approach, an example of the process within precast concrete cladding has been used.
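The element-constraint-element (ECE) sub-net described above can be sketched as a small data structure: choosing a component within one element imposes constraints that restrict the options of other elements. The elements, components and constraints below are invented for illustration and are not the authors' tool.

```python
# Hypothetical ECE sub-net: component choices constrain other design elements.
options = {
    "cladding": {"precast_panel", "curtain_wall"},
    "frame":    {"steel", "insitu_concrete"},
}

# (element, chosen component) -> allowed options it imposes on other elements
constraints = {
    ("cladding", "precast_panel"): {"frame": {"insitu_concrete", "steel"}},
    ("cladding", "curtain_wall"):  {"frame": {"steel"}},
}

def choose(options, element, component):
    """Fix a component and propagate its constraints to the remaining elements."""
    restricted = {e: set(opts) for e, opts in options.items()}
    restricted[element] = {component}
    for target, allowed in constraints.get((element, component), {}).items():
        restricted[target] &= allowed
    return restricted

print(choose(options, "cladding", "curtain_wall"))   # frame reduced to {'steel'}
```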
Abstract:
Modern buildings are designed to enhance the match between environment, spaces and the people carrying out work, so that the well-being and the performance of the occupants are in harmony. Building services are systems that facilitate a healthy working environment within which workers' productivity can be optimised. However, the maintenance of these services is fraught with problems that may contribute up to 50% of the total life cycle cost of the building. Maintenance support is one area which is not usually designed into the system, as this is not common practice in the services industry. The other areas of shortfall for future designs are client requirements, commissioning, facilities management data and post-occupancy evaluation feedback, which need to be adequately planned to capture and document this information for use in future designs. At the University of Reading an integrated approach has been developed to assemble the multitude of aspects inherent in this field. It records required and measured achievements for the benefit of both building owners and practitioners. This integrated approach can be represented in a Through Life Business Model (TLBM) format using the concept of Integrated Logistic Support (ILS). The prototype TLBM developed utilises the tailored tools and techniques of ILS for building services. This TLBM approach will facilitate the successful development of a databank that would be invaluable in capturing essential data (e.g. reliability of components) for enhancing future building services designs, life cycle costing and decision making by practitioners, in particular facilities managers.
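A minimal sketch of the kind of record such a databank might hold for a building services component (fields, values and the cost calculation are invented, not the TLBM prototype):

```python
# Hypothetical databank record for a building services component.
from dataclasses import dataclass

@dataclass
class ServiceComponentRecord:
    component: str             # e.g. an air-handling-unit supply fan
    mtbf_hours: float          # measured mean time between failures
    repair_cost: float         # average cost per corrective repair
    planned_interval_h: float  # preventive maintenance interval

    def annual_repair_cost(self, operating_hours: float = 8760.0) -> float:
        """Expected corrective repair cost per year of continuous operation."""
        return operating_hours / self.mtbf_hours * self.repair_cost

fan = ServiceComponentRecord("AHU supply fan", mtbf_hours=20_000,
                             repair_cost=450.0, planned_interval_h=4_000)
print(f"{fan.component}: ~{fan.annual_repair_cost():.0f} per year in repairs")
```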
Abstract:
According to the Chinese State Council's "Building Energy Efficiency Management Ordinance", a large-scale investigation of energy efficiency (EE) in buildings in contemporary China has been carried out in 22 provincial capitals and major cities in China. The aim of this project is to provide reliable information for drawing up the "Decision on reinforcing building energy efficiency" by the Ministry of Construction of China. The surveyed organizations include government departments, research institutions, property developers, design institutions, construction companies, construction consultancy services companies, facility management departments, financial institutions and those which relate to the business of building energy efficiency. In addition, representatives of the media and residents were also involved. A detailed analysis of the results of the investigation concerning aspects of the current situation and trends in building energy consumption, energy efficiency strategy and the implementation of energy efficiency measures has been conducted. The investigation supplies essential information to formulate the market entrance policy for new buildings and the refurbishment policy for existing buildings to encourage the development of energy efficient technology.
Abstract:
Multi-agent systems have been adopted to build intelligent environments in recent years. It was claimed that energy efficiency and occupants' comfort were the most important factors for evaluating the performance of a modern work environment, and multi-agent systems presented a viable solution to handling the complexity of a dynamic building environment. While previous research has made significant advances in some aspects, the proposed systems or models were often not applicable in a "shared environment". This paper introduces an ongoing project on multi-agent systems for building control, which aims to achieve both energy efficiency and occupants' comfort in a shared environment.
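A hypothetical sketch of the energy-versus-comfort trade-off such agents negotiate in a shared zone (the preferences, weights and cost model are invented): a zone agent picks the setpoint that balances occupants' stated preferences against the energy cost of conditioning.

```python
# Invented occupant preferences and a toy zone-agent objective.
preferences = {"occupant_A": 21.0, "occupant_B": 23.5, "occupant_C": 22.0}  # degC
OUTDOOR = 5.0          # degC, heating season (invented)
ENERGY_WEIGHT = 0.05   # assumed weight of energy cost vs. discomfort

def zone_cost(setpoint):
    """Combined discomfort (squared deviation from preferences) plus energy penalty."""
    discomfort = sum((setpoint - t) ** 2 for t in preferences.values())
    energy = ENERGY_WEIGHT * max(setpoint - OUTDOOR, 0.0) ** 2
    return discomfort + energy

candidates = [18.0 + 0.5 * i for i in range(13)]          # 18.0 .. 24.0 degC
best = min(candidates, key=zone_cost)
print(f"negotiated setpoint: {best} degC (cost {zone_cost(best):.2f})")
```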
Abstract:
Space applications demand reliable systems. Autonomic computing defines such reliable systems as self-managing systems. The work reported in this paper combines agent-based and swarm robotic approaches leading to swarm-array computing, a novel technique to achieve autonomy for distributed parallel computing systems. Two swarm-array computing approaches based on swarms of computational resources and swarms of tasks are explored. An FPGA is considered as the computing system. The feasibility of the two proposed approaches that bind the computing system and the tasks together is evaluated by simulation on the SeSAm multi-agent simulator.
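A toy sketch of the "swarm of tasks" idea above (not the SeSAm simulation itself): tasks sitting on a grid of processing cores migrate away from a core that reports itself unhealthy, so the computation recovers without central intervention. The grid size, task names and the failure event are invented.

```python
# Hypothetical grid of cores with tasks that migrate away from a failed core.
cores = {(x, y): {"healthy": True, "task": None} for x in range(3) for y in range(3)}
cores[(0, 0)]["task"] = "task_A"
cores[(2, 2)]["task"] = "task_B"

def migrate(cores, failed):
    """Mark `failed` unhealthy and move its task to the nearest healthy idle core."""
    cores[failed]["healthy"] = False
    task = cores[failed]["task"]
    cores[failed]["task"] = None
    if task is None:
        return
    idle = [c for c, s in cores.items() if s["healthy"] and s["task"] is None]
    target = min(idle, key=lambda c: abs(c[0] - failed[0]) + abs(c[1] - failed[1]))
    cores[target]["task"] = task
    print(f"{task} migrated from {failed} to {target}")

migrate(cores, (0, 0))   # task_A hops to an adjacent healthy core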
Abstract:
Semiotics is the study of signs. Application of semiotics in information systems design is based on the notion that information systems are organizations within which agents deploy signs in the form of actions according to a set of norms. An analysis of the relationships among the agents, their actions and the norms would give a better specification of the system. Distributed multimedia systems (DMMS) can be viewed as systems consisting of many dynamic, self-controlled normative agents engaging in complex interaction and processing of multimedia information. This paper reports the work of applying the semiotic approach to the design and modeling of DMMS, with emphasis on using semantic analysis under the semiotic framework. A semantic model of DMMS describing various components and their ontological dependencies is presented, which then serves as a design model and is implemented in a semantic database. Benefits of using the semantic database are discussed with reference to various design scenarios.
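A hypothetical sketch of the ontological dependency mentioned above (the entities and schema are invented, not the authors' semantic model): each entity depends on an antecedent and is only available while its whole chain of antecedents exists, which is the kind of consistency a semantic database would enforce.

```python
# Invented entities with ontological (existence) dependencies.
entities = {
    "media_server":    {"depends_on": None,           "exists": True},
    "video_stream":    {"depends_on": "media_server", "exists": True},
    "viewing_session": {"depends_on": "video_stream", "exists": True},
}

def available(name):
    """An entity is available only if it and all of its antecedents exist."""
    node = entities.get(name)
    if node is None or not node["exists"]:
        return False
    parent = node["depends_on"]
    return True if parent is None else available(parent)

entities["media_server"]["exists"] = False   # the antecedent disappears...
print(available("viewing_session"))          # ...so its dependents are unavailable -> False
```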