134 results for Building information modelling (BIM)


Relevance:

30.00%

Publisher:

Abstract:

The principles of organization theory are applied to the organization of construction projects. This is done by proposing a framework for modelling the whole process of building procurement, beginning with a description of the environments within which construction projects take place. A series of hypotheses about the organizational structure of construction projects is then developed. Four case studies are undertaken, and the extent to which their organizational structure matches the model is compared with the level of success achieved by each project. To this end, a systematic method for evaluating the success of building project organizations is devised, since any conclusions about the adequacy of a particular organization must be related to the degree of success it achieves. In order to test the hypotheses, a mapping technique is developed. It is a development of Linear Responsibility Analysis and is called "3R analysis" as it deals with roles, responsibilities and relationships. The analysis of the case studies shows that they tended to suffer from inappropriate organizational structures. One of the prevailing problems of public sector organization is that organizational structures are inadequately defined and too cumbersome to respond to environmental demands on the project. The projects tended to be organized as rigid hierarchies, particularly at decision points, when what was required was a more flexible, dynamic and responsive organization. The study concludes with a series of recommendations, including suggestions for increasing the responsiveness of construction project organizations and reducing the lead-in times for the inception periods.

Relevance:

30.00%

Publisher:

Abstract:

Recent severe flooding in the UK has highlighted the need for better information on flood risk, increasing the pressure on engineers to enhance the capabilities of computer models for flood prediction. This paper evaluates the benefits to be gained from the use of remotely sensed data to support flood modelling. The available remotely sensed data can be used either to produce high-resolution digital terrain models (DTMs) from light detection and ranging (lidar) data, or to generate accurate inundation maps of past flood events from airborne synthetic aperture radar (SAR) data and aerial photography. The paper reports on the modelling of real flood events that occurred at two UK sites on the rivers Severn and Ouse, for which a combination of remotely sensed data and recorded hydrographs was available. It is concluded, first, that lidar-generated DTMs support the construction of considerably better models and enhance the visualisation of model results and, second, that flood outlines obtained from airborne SAR or aerial images help develop an appreciation of the hydraulic behaviour of important model components and facilitate model validation. The need for further research is highlighted by a number of limitations, namely: the difficulty of obtaining an adequate representation of hydraulically important features such as embankment crests and walls; uncertainties in the validation data; and the difficulty of extracting flood outlines from airborne SAR images in urban areas.

Relevance:

30.00%

Publisher:

Abstract:

For the very large nonlinear dynamical systems that arise in a wide range of physical, biological and environmental problems, the data needed to initialize a numerical forecasting model are seldom available. To generate accurate estimates of the expected states of the system, both current and future, the technique of ‘data assimilation’ is used to combine the numerical model predictions with observations of the system measured over time. Assimilation of data is an inverse problem that for very large-scale systems is generally ill-posed. In four-dimensional variational assimilation schemes, the dynamical model equations provide constraints that act to spread information into data sparse regions, enabling the state of the system to be reconstructed accurately. The mechanism for this is not well understood. Singular value decomposition techniques are applied here to the observability matrix of the system in order to analyse the critical features in this process. Simplified models are used to demonstrate how information is propagated from observed regions into unobserved areas. The impact of the size of the observational noise and the temporal position of the observations is examined. The best signal-to-noise ratio needed to extract the most information from the observations is estimated using Tikhonov regularization theory. Copyright © 2005 John Wiley & Sons, Ltd.
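The role played by the observability matrix can be illustrated with a toy example. The following sketch (all matrices and dimensions are invented for illustration; the paper's systems are far larger) stacks the observation operator propagated through a simple linear model and inspects the singular values to see which state directions the observations actually constrain.

```python
import numpy as np

# Illustrative two-variable linear dynamics (numbers assumed, not from
# the paper): the second state component is never observed directly,
# but the model coupling makes it observable over time.
M = np.array([[0.9, 0.1],
              [0.0, 0.9]])    # propagator: x_{k+1} = M @ x_k
H = np.array([[1.0, 0.0]])    # observation operator: only x[0] is seen

# Observability matrix over n assimilation steps: rows H, HM, HM^2, ...
n_steps = 4
blocks, Mk = [], np.eye(2)
for _ in range(n_steps):
    blocks.append(H @ Mk)
    Mk = M @ Mk
O = np.vstack(blocks)

# Small singular values flag state directions that the observations
# barely constrain; Tikhonov-style truncation would discard components
# whose singular values fall below the observational noise level.
U, s, Vt = np.linalg.svd(O)
print("singular values:", s)
```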

Relevance:

30.00%

Publisher:

Abstract:

Europe's widely distributed climate modelling expertise, now organized in the European Network for Earth System Modelling (ENES), is both a strength and a challenge. Recognizing this, the European Union's Program for Integrated Earth System Modelling (PRISM) infrastructure project aims to design a flexible and user-friendly environment in which to assemble, run and post-process Earth system models. PRISM started in December 2001 with a duration of three years. This paper presents its major stages, including: (1) the definition and promotion of scientific and technical standards to increase component modularity; (2) the development of an end-to-end software environment (graphical user interface, coupling and I/O system, diagnostics, visualization) to launch, monitor and analyse complex Earth system models built around state-of-the-art community component models (atmosphere, ocean, atmospheric chemistry, ocean bio-chemistry, sea-ice, land-surface); and (3) testing and quality standards to ensure high computational performance on a variety of platforms. PRISM is emerging as a core strategic software infrastructure for building the European research area in Earth system sciences. Copyright (c) 2005 John Wiley & Sons, Ltd.

Relevance:

30.00%

Publisher:

Abstract:

Context: Learning can be regarded as knowledge construction in which prior knowledge and experience serve as the basis on which learners expand their knowledge base. Such a process of knowledge construction has to take place continuously in order to enhance the learners' competence in a competitive working environment. As information consumers, individual users demand personalised information provision that meets their own specific purposes, goals and expectations. Objectives: The current methods in requirements engineering are capable of modelling the common user's behaviour in the domain of knowledge construction. A user's requirements can be represented as a case in a defined structure over which reasoning enables requirements analysis. Such analysis needs to be enhanced so that personalised information provision can be tackled and modelled, but suitable modelling methods are lacking. This paper presents a new ontological method for capturing individual users' requirements and transforming them into personalised information provision specifications, so that the right information can be provided to the right user for the right purpose. Method: An experiment based on qualitative methods was conducted, in which a medium-sized group of users participated in validating the method and its techniques, i.e. articulates, maps, configures, and learning content. The results were used as feedback for improvement. Result: The research has produced an ontology model with a set of techniques which support the functions of profiling users' requirements, reasoning over requirements patterns, generating workflow from norms, and formulating information provision specifications. Conclusion: Current requirements engineering approaches provide the methodical capability for developing solutions. Our research outcome, i.e. the ontology model with its techniques, can further enhance these approaches for modelling individual users' needs and discovering their requirements.

Relevance:

30.00%

Publisher:

Abstract:

Two ongoing projects at ESSC that involve the development of new techniques for extracting information from airborne LiDAR data and combining it with environmental models are discussed. The first project, in conjunction with Bristol University, aims to improve 2-D river flood flow models by using remote sensing to provide distributed data for model calibration and validation. Airborne LiDAR can provide such models with dense and accurate floodplain topography together with vegetation heights for parameterising model friction. The vegetation height data can be used to specify a friction factor at each node of a model's finite element mesh. A LiDAR range image segmenter has been developed which converts a LiDAR image into separate raster maps of surface topography and vegetation height for use in the model. Satellite and airborne SAR data have been used to measure flood extent remotely in order to validate the modelled flood extent. Methods have also been developed for improving the models by decomposing the model's finite element mesh to reflect floodplain features, such as hedges and trees, that have different frictional properties from their surroundings. Originally developed for rural floodplains, the segmenter is currently being extended to provide DEMs and friction parameter maps for urban floods by fusing the LiDAR data with digital map data.

The second project is concerned with the extraction of tidal channel networks from LiDAR. These networks are important features of the inter-tidal zone and play a key role in tidal propagation and in the evolution of salt-marshes and tidal flats. The study of their morphology is currently an active area of research, and a number of theories related to networks have been developed which require validation using dense and extensive observations of network forms and cross-sections. The conventional method of measuring networks is cumbersome and subjective, involving manual digitisation of aerial photographs in conjunction with field measurement of channel depths and widths for selected parts of the network. A semi-automatic technique has been developed to extract networks from LiDAR data of the inter-tidal zone. A multi-level knowledge-based approach has been implemented, whereby low-level algorithms first extract channel fragments based mainly on image properties, and a high-level processing stage then improves the network using domain knowledge. At the low level, multi-scale edge detection is used to detect channel edges, and adjacent anti-parallel edges are then associated to form channels. The higher-level processing includes a channel repair mechanism.
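As a concrete illustration of the friction parameterisation step described above, the following minimal sketch (all values, thresholds and array shapes are invented) derives a Manning's n map from a vegetation-height raster and samples it at the nodes of a finite element mesh.

```python
import numpy as np

# Minimal sketch (all values illustrative, not from the project): turn a
# LiDAR-derived vegetation-height raster into a Manning's n friction map,
# then sample that map at the nodes of a finite element mesh.
rng = np.random.default_rng(0)
veg_height = rng.uniform(0.0, 3.0, size=(100, 100))  # heights in metres
cell = 2.0                                           # raster cell size (m)

def manning_n(h):
    """Assumed lookup: taller vegetation -> rougher floodplain."""
    return np.where(h < 0.2, 0.03,           # short grass / bare earth
           np.where(h < 1.5, 0.06, 0.10))    # shrubs, then trees/hedges

friction = manning_n(veg_height)

# Sample the friction raster at (x, y) mesh-node coordinates in metres.
nodes = np.array([[10.3, 55.1], [120.0, 40.7], [63.2, 88.8]])
rows = (nodes[:, 1] // cell).astype(int)
cols = (nodes[:, 0] // cell).astype(int)
print(friction[rows, cols])   # one friction factor per mesh node
```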

Relevance:

30.00%

Publisher:

Abstract:

Goal modelling is a well-known, rigorous method for analysing problem rationale and developing requirements, but under the pressures typical of time-constrained projects its benefits become inaccessible: creating the graph takes effort and time, and reading the results can be difficult owing to the effects of crosscutting concerns. Here we introduce an adaptation of KAOS to meet the needs of rapid turnaround and clarity. The main aim is to help stakeholders gain insight into the larger issues that might be overlooked if they start implementation prematurely. The method emphasises the use of obstacles, accepts under-refined goals, and has new techniques for managing crosscutting concerns and for strategic decision making. It is expected to be of value to agile as well as traditional processes.
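To make the flavour of such a method concrete, here is a small, purely illustrative sketch (all names invented; KAOS itself is a graphical modelling method, not a Python library) of a goal graph that records obstacles explicitly and tolerates under-refined goals.

```python
from dataclasses import dataclass, field

# Illustrative only: a lightweight goal graph in the spirit of the
# adapted KAOS method, where goals may remain under-refined (children
# may simply be absent) and obstacles are first-class annotations.
@dataclass
class Goal:
    name: str
    children: list = field(default_factory=list)   # refinements, may be empty
    obstacles: list = field(default_factory=list)  # conditions defeating the goal

root = Goal("Customer orders are fulfilled within 24 hours",
            children=[Goal("Stock levels are visible to packers")],
            obstacles=["Warehouse system is offline"])

def walk(goal, depth=0):
    """Print the goal graph, flagging obstacles at each level."""
    print("  " * depth + goal.name)
    for obstacle in goal.obstacles:
        print("  " * depth + "  obstacle: " + obstacle)
    for child in goal.children:
        walk(child, depth + 1)

walk(root)
```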

Relevance:

30.00%

Publisher:

Abstract:

The rate and scale of human-driven changes can exert profound impacts on ecosystems, the species that make them up and the services they provide that sustain humanity. Given the speed at which these changes are occurring, one of society's major challenges is to coexist within ecosystems and to manage ecosystem services in a sustainable way. The effect of possible scenarios of global change on ecosystem services can be explored using ecosystem models. Such models should adequately represent ecosystem processes above and below the soil surface (aboveground and belowground) and the interactions between them. We explore possibilities for including such interactions in ecosystem models at scales ranging from global to local. At the regional to global scale we suggest expanding the plant functional type concept (aggregating plants into groups according to their physiological attributes) to include functional types of aboveground-belowground interactions. At the scale of discrete plant communities, process-based and organism-oriented models could be combined into "hybrid approaches" that include an organism-oriented, mechanistic representation of a limited number of trophic interactions within an otherwise process-oriented approach. Under global change, the density and activity of the organisms driving these processes may change non-linearly, so explicit knowledge of the organisms and their responses should ideally be included. At the individual plant scale a common organism-based conceptual model of aboveground-belowground interactions has emerged. This conceptual model facilitates the formulation of research questions to guide experiments aiming to identify patterns that are common within, but differ between, ecosystem types and biomes. Such experiments inform modelling approaches at larger scales. Future ecosystem models should better incorporate this evolving knowledge of common patterns of aboveground-belowground interactions. Improved ecosystem models are necessary tools for reducing the uncertainty in the information that assists us in the sustainable management of our environment in a changing world. (C) 2004 Elsevier GmbH. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates willingness to accept (WTA) genetically modified (GM) foods based on experimental auctions carried out in the USA, UK and France. It explores perceptions of risk and benefits, moral concerns and their antecedents, attitudes to the environment and technology, and trust in various sources, using structural equation modelling (SEM). Trust in information provided by industry proved to be the most important determinant of risk/benefit perceptions and WTA, followed by general attitudes to the environment and technology. Education and age also enhance perceived benefits and lower perceived risks of GM. Perceptions of risk/benefit and moral concerns all have significant effects on consumers' WTA, but perceived benefits are the most important. The research suggests that trust-building by industry would be the most effective way of enhancing GM acceptance.

Relevance:

30.00%

Publisher:

Abstract:

Health care providers, purchasers and policy makers need to make informed decisions regarding the provision of cost-effective care. When a new health care intervention is to be compared with the current standard, an economic evaluation alongside an evaluation of health benefits provides useful information for the decision making process. We consider the information on cost-effectiveness which arises from an individual clinical trial comparing the two interventions. Recent methods for conducting a cost-effectiveness analysis for a clinical trial have focused on the net benefit parameter. The net benefit parameter, a function of costs and health benefits, is positive if the new intervention is cost-effective compared with the standard. In this paper we describe frequentist and Bayesian approaches to cost-effectiveness analysis which have been suggested in the literature and apply them to data from a clinical trial comparing laparoscopic surgery with open mesh surgery for the repair of inguinal hernias. We extend the Bayesian model to allow the total cost to be divided into a number of different components. The advantages and disadvantages of the different approaches are discussed. In January 2001, NICE issued guidance on the type of surgery to be used for inguinal hernia repair. We discuss our example in the light of this information. Copyright © 2003 John Wiley & Sons, Ltd.
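The net benefit parameter itself is simple to compute. The sketch below uses simulated per-patient costs and effects (not the trial's data) and a willingness-to-pay value chosen purely for illustration; it computes the incremental net benefit INB = λ·ΔE − ΔC and a normal-approximation confidence interval, in the frequentist spirit described in the paper.

```python
import numpy as np

# Minimal sketch (simulated data, not the trial's): the incremental net
# benefit is INB = lam * dE - dC, where lam is the decision-maker's
# willingness to pay per unit of health benefit, dE the mean difference
# in effectiveness and dC the mean difference in cost. INB > 0 means
# the new intervention is cost-effective relative to the standard.
rng = np.random.default_rng(0)
lam = 20_000.0                                   # illustrative WTP value

# Per-patient (cost, effect) for each arm; all numbers invented.
cost_new, eff_new = rng.normal(1500, 300, 200), rng.normal(0.80, 0.1, 200)
cost_std, eff_std = rng.normal(1200, 250, 200), rng.normal(0.75, 0.1, 200)

nb_new = lam * eff_new - cost_new                # per-patient net benefit
nb_std = lam * eff_std - cost_std
inb = nb_new.mean() - nb_std.mean()

# Normal-approximation 95% CI for the incremental net benefit.
se = np.sqrt(nb_new.var(ddof=1)/len(nb_new) + nb_std.var(ddof=1)/len(nb_std))
print(f"INB = {inb:.0f}, 95% CI ({inb - 1.96*se:.0f}, {inb + 1.96*se:.0f})")
```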

Relevance:

30.00%

Publisher:

Abstract:

This research is associated with a goal of the horticultural sector of the Colombian southwest: to obtain climatic information, specifically to predict the monthly average temperature at sites where it has not been measured. The data are monthly average temperatures recorded at meteorological stations in Valle del Cauca, Colombia, South America. Two components are identified in the data: (1) a temporal component, determined by the characteristics of the time series, the distribution of monthly average temperature through the months, and the temporal phenomena that increase (El Niño) and decrease (La Niña) temperature values; and (2) a site component, determined by the clear differentiation of two populations, the valley and the mountains, which is associated with the pattern of monthly average temperature and with altitude. Finally, owing to the closeness between meteorological stations, spatial correlation can be found between data from nearby sites. First, a random coefficient model without spatial covariance structure in the errors is obtained for each month and geographical location (mountains and valley). Models for wet periods in the mountains show normally distributed errors; models for the valley and for dry periods in the mountains do not. In the mountain wet-period models, omni-directional weighted variograms of the residuals show spatial continuity. Both the random coefficient model without spatial covariance structure in the errors and the model with such structure capture the influence of the El Niño and La Niña phenomena, which indicates that the inclusion of the random part in the model is appropriate. The altitude variable contributes significantly to the mountain models. In general, cross-validation indicates that the random coefficient models with spherical and with Gaussian spatial covariance are the best models for wet periods in the mountains, and the worst is the model used by the Colombian Institute for Meteorology, Hydrology and Environmental Studies (IDEAM) to predict temperature.
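The spherical and Gaussian spatial covariance structures mentioned above have standard closed forms. The sketch below (sill, range and station coordinates are invented) evaluates both on the pairwise distances between hypothetical stations, producing the kind of error covariance matrix such a random coefficient model would use.

```python
import numpy as np

# Sketch (illustrative parameters): the spherical and Gaussian
# covariance functions for spatially correlated model errors,
# evaluated on distances between hypothetical station locations.
def spherical_cov(h, sill=1.0, rng_=50.0):
    h = np.asarray(h, dtype=float)
    c = sill * (1 - 1.5 * h / rng_ + 0.5 * (h / rng_) ** 3)
    return np.where(h < rng_, c, 0.0)   # zero beyond the range

def gaussian_cov(h, sill=1.0, rng_=50.0):
    return sill * np.exp(-(np.asarray(h, dtype=float) / rng_) ** 2)

# Pairwise distances between station coordinates (km, made up).
xy = np.array([[0, 0], [10, 5], [40, 30], [80, 2]])
d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)

print(np.round(spherical_cov(d), 3))   # error covariance, spherical model
print(np.round(gaussian_cov(d), 3))    # error covariance, Gaussian model
```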

Relevance:

30.00%

Publisher:

Abstract:

The rate at which a given site in a gene sequence alignment evolves over time may vary. This phenomenon, known as heterotachy, can bias or distort phylogenetic trees inferred from models of sequence evolution that assume rates of evolution are constant. Here, we describe a phylogenetic mixture model designed to accommodate heterotachy. The method sums the likelihood of the data at each site over more than one set of branch lengths on the same tree topology. A branch-length set that is best for one site may differ from the branch-length set that is best for another, thereby allowing different sites to have different rates of change throughout the tree. Because rate variation may not be present in all branches, we use a reversible-jump Markov chain Monte Carlo algorithm to identify those branches in which reliable amounts of heterotachy occur. We implement the method in combination with our 'pattern-heterogeneity' mixture model, applying it to simulated data and five published datasets. We find that complex evolutionary signals of heterotachy are routinely present over and above variation in the rate or pattern of evolution across sites, and that the reversible-jump method requires far fewer parameters than conventional mixture models to describe them, while identifying the regions of the tree in which heterotachy is most pronounced. The reversible-jump procedure also removes the need for a posteriori tests of 'significance', such as the Akaike or Bayesian information criterion tests, or Bayes factors. Heterotachy has important consequences for the correct reconstruction of phylogenies as well as for tests of hypotheses that rely on accurate branch-length information, including molecular clocks, analyses of tempo and mode of evolution, comparative studies and ancestral state reconstruction. The model is available from the authors' website, and can be used for the analysis of both nucleotide and morphological data.
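The central likelihood calculation can be shown in miniature. The toy sketch below (two taxa under Jukes-Cantor; branch-length sets, weights and site data are all invented, and the real method works on full trees with MCMC) sums the per-site likelihood over two branch-length sets, which is the mixture idea the paper describes.

```python
import numpy as np

# Toy illustration of the mixture idea: the likelihood at each
# alignment site is a weighted sum over several branch-length sets,
# so different sites can effectively evolve at different rates on
# the same topology.
def jc_mismatch_prob(d):
    """P(two taxa differ at a site) under Jukes-Cantor at distance d."""
    return 0.75 * (1.0 - np.exp(-4.0 * d / 3.0))

branch_sets = np.array([0.05, 0.50])   # a "slow" and a "fast" length set
weights = np.array([0.7, 0.3])         # mixture weights (assumed)

sites_differ = np.array([0, 0, 1, 0, 1, 1])   # 1 = the two taxa differ

p_diff = jc_mismatch_prob(branch_sets)          # per-set mismatch probs
site_lik = np.where(sites_differ[:, None] == 1, p_diff, 1 - p_diff)
mixture_lik = site_lik @ weights                # sum over branch-length sets
print("log-likelihood:", np.log(mixture_lik).sum())
```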

Relevance:

30.00%

Publisher:

Abstract:

1. Jerdon's courser Rhinoptilus bitorquatus is a nocturnally active cursorial bird that is only known to occur in a small area of scrub jungle in Andhra Pradesh, India, and is listed as critically endangered by the IUCN. Information on its habitat requirements is needed urgently to underpin conservation measures. We quantified the habitat features that correlated with the use of different areas of scrub jungle by Jerdon's coursers, and developed a model to map potentially suitable habitat over large areas from satellite imagery and facilitate the design of surveys of Jerdon's courser distribution. 2. We used 11 arrays of 5-m long tracking strips consisting of smoothed fine soil to detect the footprints of Jerdon's coursers, and measured tracking rates (tracking events per strip night). We counted the numbers of bushes and trees, and described other attributes of vegetation and substrate, in a 10-m square plot centred on each strip. We obtained reflectance data from Landsat 7 satellite imagery for the pixel within which each strip lay. 3. We used logistic regression models to describe the relationship between the tracking rate of Jerdon's coursers and characteristics of the habitat around the strips, using ground-based survey data and satellite imagery. 4. Jerdon's coursers were most likely to occur where the density of large (>2 m tall) bushes was in the range 300-700 ha⁻¹ and the density of smaller bushes was below 1000 ha⁻¹. This habitat was detectable using satellite imagery. 5. Synthesis and applications. The occurrence of Jerdon's courser is strongly correlated with the density of bushes and trees, which is in turn affected by grazing of domestic livestock, woodcutting and the mechanical clearance of bushes to create pasture, orchards and farmland. It is likely that there is an optimal level of grazing and woodcutting that would maintain or create suitable conditions for the species. Knowledge of the species' distribution is incomplete, and there is considerable pressure from human use of apparently suitable habitats; hence, distribution mapping is a high conservation priority. A two-step procedure is proposed, involving ground surveys of bush density to calibrate satellite image-based mapping of potential habitat. These maps could then be used to select priority areas for Jerdon's courser surveys. The use of tracking strips to study habitat selection and distribution has potential in studies of other scarce and secretive species.
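A minimal version of the logistic modelling step can be sketched as follows, using simulated data rather than the field measurements; the quadratic term in bush density lets detection probability peak at intermediate densities, mirroring the 300-700 ha⁻¹ optimum reported.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Sketch on simulated data (not the field data): a logistic model for
# the probability that a tracking strip records Jerdon's courser, with
# a quadratic term in large-bush density so suitability can peak at
# intermediate densities. All coefficients below are invented.
rng = np.random.default_rng(1)
density = rng.uniform(0, 15, 300)                 # large bushes, hundreds/ha
logit = -6 + 2.5 * density - 0.25 * density**2    # peaks near 500 bushes/ha
detected = rng.random(300) < 1 / (1 + np.exp(-logit))

X = np.column_stack([density, density**2])
model = LogisticRegression(max_iter=1000).fit(X, detected)

grid = np.linspace(0, 15, 7)                      # 0 to 1500 bushes per ha
prob = model.predict_proba(np.column_stack([grid, grid**2]))[:, 1]
print(np.round(np.column_stack([grid * 100, prob]), 3))
```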

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates dendritic peptides capable of assembling into nanostructured gels, and explores the effect on self-assembly of mixing different molecular building blocks. Thermal measurements, small-angle X-ray scattering (SAXS) and circular dichroism (CD) spectroscopy are used to probe these materials on macroscopic, nanoscopic and molecular length scales. The results demonstrate that, in this case, systems with different "size" and "chirality" factors can self-organise, whilst systems with different "shape" factors cannot. The "size" and "chirality" factors are directly connected with the molecular information programmed into the dendritic peptides, whilst the "shape" factor depends on the group linking these peptides together; this is consistent with molecular-recognition hydrogen-bond pathways between the peptidic building blocks controlling the ability of these systems to self-recognise. These results demonstrate that mixtures of relatively complex peptides with only subtle differences on the molecular scale can self-organise into nanoscale structures, an important step towards the spontaneous assembly of ordered systems from complex mixtures.

Relevance:

30.00%

Publisher:

Abstract:

Information technology in construction (ITC) has been gaining wide acceptance and is being implemented in construction research domains as a tool to assist decision makers. Most of the research into visualization technologies (VT) has concentrated on the wide range of 3D and simulation applications suitable for construction processes. Despite developments in interoperability and the standardization of products, VT usage has remained very low when it comes to communicating with, and addressing the needs of, building end-users (BEU). This paper argues that building end-users are a source of experience and expertise that can be brought into the briefing stage for the evaluation of design proposals, and suggests that the end-user is also a source of new ideas promoting innovation. A positivist methodology is proposed that compares 3D models with traditional 2D methods. It will help to identify how much, if anything, a non-spatial specialist gains in "understanding" of a particular design proposal when it is presented using each method.