842 results for Decision support, Construction Management


Relevance: 100.00%

Abstract:

Construction firms that employ collaborative procurement approaches develop operating routines through joint learning so as to improve infrastructure project performance. This paper reports a study based on a survey of 320 construction practitioners who were involved in collaborative infrastructure delivery in Australia. The study developed valid and reliable scales for measuring collaborative learning capability (CLC) and used them to evaluate the CLC of contractor and consultant firms within the sample. The evaluation suggests that while these firms explore knowledge from both internal and external sources, transform both explicit and tacit knowledge, and apply and internalise new knowledge, they can still improve the extent to which these routines are applied to optimise project performance.
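
Scale reliability of the kind claimed above is conventionally checked with Cronbach's alpha. A minimal sketch, with invented item scores (the paper's actual items and data are not given here):

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a multi-item scale.
    `items`: one list of scores per item, all over the same respondents."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]          # per-respondent totals
    item_var = sum(statistics.pvariance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var / statistics.pvariance(totals))

# Hypothetical responses: 3 scale items x 5 respondents on a 5-point scale.
alpha = cronbach_alpha([[4, 5, 3, 4, 2],
                        [4, 4, 3, 5, 2],
                        [5, 5, 2, 4, 3]])
```

Values above roughly 0.7 are usually read as acceptable internal consistency for a new scale.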

Relevance: 100.00%

Abstract:

Modern database systems incorporate a query optimizer to identify the most efficient "query execution plan" for executing the declarative SQL queries submitted by users. A dynamic-programming-based approach is used to exhaustively enumerate the combinatorially large search space of plan alternatives and, using a cost model, to identify the optimal choice. While dynamic programming (DP) works very well for moderately complex queries with up to around a dozen base relations, it usually fails to scale beyond this stage due to its inherent exponential space and time complexity. Therefore, DP becomes practically infeasible for complex queries with a large number of base relations, such as those found in current decision-support and enterprise management applications. To address this problem, a variety of approaches have been proposed in the literature. Some completely jettison the DP approach and resort to alternative techniques such as randomized algorithms, whereas others retain DP but use heuristics to prune the search space to computationally manageable levels. In the latter class, a well-known strategy is "iterative dynamic programming" (IDP), wherein DP is employed bottom-up until it hits its feasibility limit and is then iteratively restarted with a significantly reduced subset of the execution plans currently under consideration. The experimental evaluation of IDP indicated that, with an appropriate choice of algorithmic parameters, it was possible to almost always obtain "good" plans (within a factor of two of the optimal), in most of the remaining cases "acceptable" plans (within an order of magnitude of the optimal), and only rarely a "bad" plan. While IDP is certainly an innovative and powerful approach, we have found that there are a variety of common query frameworks wherein it can fail to consistently produce good plans, let alone the optimal choice. This is especially so when star or clique components are present, increasing the complexity of the join graphs. Worse, this shortcoming is exacerbated as the number of relations participating in the query is scaled upwards.
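
The IDP idea described above can be sketched compactly: run exhaustive DP over at most k relations at a time, freeze the cheapest k-way subplan into a single composite "relation", and restart. The cost model, cardinalities, and block size below are illustrative assumptions, not those of the cited evaluation:

```python
from itertools import combinations

# Toy cost model (assumption): per-relation cardinalities; a join's output
# cardinality is crudely damped by a fixed selectivity of 1/1000.
CARD = {"A": 100, "B": 1000, "C": 50, "D": 500, "E": 10, "F": 200}

def card(plan):
    if isinstance(plan, tuple):
        return max(1, card(plan[0]) * card(plan[1]) // 1000)
    return CARD[plan]

def cost(plan):
    """Sum of intermediate-result sizes over the whole join tree."""
    if isinstance(plan, tuple):
        return card(plan) + cost(plan[0]) + cost(plan[1])
    return 0

def dp_best_plan(items, cost):
    """Exhaustive DP over all bushy join orders of `items`.
    Exponential in len(items) -- usable only for small inputs."""
    best = {frozenset([it]): it for it in items}
    for size in range(2, len(items) + 1):
        for subset in combinations(items, size):
            s = frozenset(subset)
            cands = [(best[frozenset(left)], best[s - frozenset(left)])
                     for lsize in range(1, size)
                     for left in combinations(subset, lsize)]
            best[s] = min(cands, key=cost)
    return best[frozenset(items)]

def idp(relations, cost, k=3):
    """IDP sketch: DP on k-subsets, greedily freeze the cheapest subplan."""
    items = list(relations)
    while len(items) > k:
        plan, used = min(((dp_best_plan(c, cost), c)
                          for c in combinations(items, k)),
                         key=lambda pc: cost(pc[0]))
        # replace the k participating items with the frozen composite plan
        items = [it for it in items if it not in used] + [plan]
    return dp_best_plan(items, cost)

plan = idp(CARD, cost, k=3)   # nested-tuple join tree over all six relations
```

The greedy freezing step is exactly where IDP can lose optimality: a subplan that is locally cheapest may be a poor building block for the full query, which is the failure mode the abstract reports for star and clique join graphs.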

Relevance: 100.00%

Abstract:

Different seismic hazard components pertaining to Bangalore city, namely soil overburden thickness, effective shear-wave velocity, factor of safety against liquefaction potential, peak ground acceleration at the seismic bedrock, site response in terms of amplification factor, and predominant frequency, have been individually evaluated. The overburden thickness distribution, predominantly in the range of 5-10 m in the city, has been estimated through a sub-surface model built from geotechnical bore-log data. The effective shear-wave velocity distribution, established through Multi-channel Analysis of Surface Waves (MASW) surveys and subsequent data interpretation through dispersion analysis, exhibits site class D (180-360 m/s), site class C (360-760 m/s), and site class B (760-1500 m/s) in compliance with the National Earthquake Hazard Reduction Program (NEHRP) nomenclature. The peak ground acceleration has been estimated through a deterministic approach, based on a maximum credible earthquake of Mw = 5.1 assumed to nucleate from the closest active seismic source (the Mandya-Channapatna-Bangalore Lineament). The 1-D site response factor, computed at each borehole through geotechnical analysis across the study region, ranges from an amplification of about one to as high as four. Correspondingly, the predominant frequency estimated from the Fourier spectrum is found to lie predominantly in the range of 3.5-5.0 Hz. The soil liquefaction hazard has been assessed in terms of the factor of safety against liquefaction potential, using standard penetration test data and the underlying soil properties, which indicates that 90% of the study region is non-liquefiable. The spatial distributions of the different hazard entities are placed on a GIS platform and subsequently integrated through the analytic hierarchy process. The resulting deterministic hazard map shows high hazard coverage in the western areas. The microzonation thus achieved is envisaged both as a first-cut assessment of site-specific hazard, laying out a framework for higher-order seismic microzonation, and as a useful decision support tool in overall land-use planning and hazard management. (C) 2010 Elsevier Ltd. All rights reserved.
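
Analytic-hierarchy-process integration of hazard layers, as described above, amounts to a weighted overlay whose layer weights come from a pairwise-comparison matrix. The comparison judgements and toy rasters below are invented for illustration; they are not the study's values:

```python
import math

# Hypothetical pairwise-comparison matrix (Saaty 1-9 scale) for three
# hazard layers, in order: PGA, site amplification, liquefaction
# susceptibility. These judgements are illustrative assumptions.
PAIRWISE = [[1.0, 3.0, 5.0],
            [1 / 3, 1.0, 3.0],
            [1 / 5, 1 / 3, 1.0]]

def ahp_weights(pairwise):
    """Priority weights via the geometric-mean (row product) method."""
    n = len(pairwise)
    geo = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(geo)
    return [g / total for g in geo]

def hazard_index(layers, weights):
    """Cell-wise weighted overlay of normalised (0-1) hazard layers."""
    rows, cols = len(layers[0]), len(layers[0][0])
    return [[sum(w * layer[i][j] for w, layer in zip(weights, layers))
             for j in range(cols)] for i in range(rows)]

w = ahp_weights(PAIRWISE)
pga = [[0.8, 0.6], [0.4, 0.2]]   # toy normalised 2x2 rasters
amp = [[0.5, 0.9], [0.3, 0.1]]
liq = [[0.1, 0.2], [0.0, 0.0]]
H = hazard_index([pga, amp, liq], w)
```

With the judgements above, PGA dominates the composite index; changing the matrix rebalances the map without touching the layers themselves, which is the main appeal of the AHP overlay.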

Relevance: 100.00%

Abstract:

This article discusses the scope of research on the application of information technology in construction (ITC). A model of the information and material activities which together constitute the construction process is presented, using the IDEF0 activity modelling methodology. Information technology is defined to include all kinds of technology used for the storage, transfer and manipulation of information, thus also including devices such as copying machines, faxes and mobile phones. Using the model, the domain of ITC research is defined as the use of information technology to facilitate and re-engineer the information process component of construction. Developments during the last decades in IT use in construction are discussed against the background of a simplified model of generic information processing tasks. The scope of ITC is compared with the scopes of research in related areas such as design methodology, construction management and facilities management. Health care is proposed as an interesting alternative to the often-used car manufacturing industry as an IT application domain for comparison. Some of the key areas of ITC research in recent years (expert systems, company IT strategies, and product modelling) are briefly discussed. The article finishes with a short discussion of the problems of applying standard scientific methodology in ITC research, in particular in product model research.

Relevance: 100.00%

Abstract:

The current mainstream scientific-publication process has so far been only marginally affected by the possibilities offered by the Internet, despite some pioneering attempts with free electronic-only journals and electronic preprint archives. Additional electronic versions of traditional paper journals for which one needs a subscription are not a solution. A clear trend, for young researchers in particular, is to go around subscription barriers (both for paper and electronic material) and rely almost exclusively on what they can find free on the Internet, which often includes working versions posted on the home pages of the authors. A survey of how scientists retrieve publications was conducted in February 2000, aimed at measuring to what extent the opportunities offered by the Internet are already changing scientific information exchange and how researchers feel about this. This paper presents the results based on 236 replies to an extensive Web-based questionnaire, which was announced to around 3,000 researchers in the domains of construction information technology and construction management. The questions dealt with how researchers find, access, and read different sources; how many and what publications they read; how often and to which conferences they travel; how much they publish; and criteria for where they eventually decide to publish. Some of the questions contrasted traditional and electronic publishing, with one final section dedicated to opinions about electronic publishing. According to the survey, researchers already download half of the material that they read digitally from the Web. The most popular method for retrieving an interesting publication is downloading it for free from the author's or publisher's Web site. Researchers are not particularly willing to pay for electronic scientific publications.
There is much support for a scenario of electronic journals available freely in their entirety on the Web, where the costs could be covered by, for instance, professional societies or the publishing university.

Relevance: 100.00%

Abstract:

The factors affecting non-industrial, private forest (NIPF) landowners' strategic decisions in management planning are studied. A genetic algorithm is used to induce a set of rules predicting the potential cut of the landowners' preferred timber management strategies. The rules are based on variables describing the characteristics of the landowners and their forest holdings. The predictive ability of the genetic algorithm is compared to linear regression analysis using identical data sets. The data are cross-validated seven times, applying both genetic algorithm and regression analyses, in order to examine the data-sensitivity and robustness of the generated models. The optimal rule set derived from the genetic algorithm analyses included the following variables: mean initial volume, landowner's positive price expectations for the next eight years, landowner being classified as a farmer, and preference for the recreational use of the forest property. When tested with previously unseen test data, the optimal rule set resulted in a relative root mean square error of 0.40. In the regression analyses, the optimal regression equation consisted of the following variables: mean initial volume, proportion of forestry income, intention to cut extensively in the future, and positive price expectations for the next two years. The R² of the optimal regression equation was 0.34 and the relative root mean square error obtained from the test data was 0.38. In both models, mean initial volume and positive stumpage price expectations were significant predictors of the potential cut of the preferred timber management strategy. When tested with the complete data set of 201 observations, both the optimal rule set and the optimal regression model achieved the same level of accuracy.
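
The relative root mean square errors quoted above (0.40 and 0.38) can in principle be reproduced as the RMSE of predictions divided by the mean observed value; the abstract does not spell out its normalisation, so the convention below is an assumption, and the data are invented:

```python
import math

def relative_rmse(observed, predicted):
    """RMSE of the predictions divided by the mean of the observations
    (one common convention for a 'relative' RMSE -- an assumption here)."""
    n = len(observed)
    rmse = math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)
    return rmse / (sum(observed) / n)

# Toy example: observed vs predicted potential cut (m^3/ha, invented values).
rel = relative_rmse([120.0, 80.0, 100.0], [110.0, 90.0, 95.0])
```

Because the measure is scale-free, it allows the rule set and the regression equation to be compared on one footing even though they output different functional forms.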

Relevance: 100.00%

Abstract:

The paper examines the needs, premises and criteria for effective public participation in tactical forest planning. A method for participatory forest planning utilizing the techniques of preference analysis, professional expertise and heuristic optimization is introduced. The techniques do not cover the whole process of participatory planning, but are applied as a tool constituting the numerical core for decision support. The complexity of multi-resource management is addressed by hierarchical decision analysis, which assesses public values, preferences and decision criteria with respect to the planning situation. An optimal management plan is sought using heuristic optimization. The plan can be further improved through mutual negotiations, if necessary. The use of the approach is demonstrated with an illustrative example; its merits and challenges for participatory forest planning and decision making are discussed, and a model for applying it in a general forest planning context is depicted. By using the approach, valuable information can be obtained about public preferences and about the effects of taking them into consideration on the choice of the combination of stand-wise treatment proposals for a forest area. Participatory forest planning calculations carried out by the approach presented in the paper can be utilized in conflict management and in developing compromises between competing interests.

Relevance: 100.00%

Abstract:

Growing concern over the status of global and regional bioenergy resources has necessitated the analysis and monitoring of land cover and land use parameters on spatial and temporal scales. Knowledge of land cover and land use is very important in understanding natural resource utilization, conversion and management. Land cover, land use intensity and land use diversity are land quality indicators for sustainable land management. Optimal management of resources aids in maintaining the ecosystem balance and thereby ensures the sustainable development of a region. Sustainable development of a region thus requires a synoptic ecosystem approach to the management of natural resources, one that relates the dynamics of natural variability and the effects of human intervention to key indicators of biodiversity and productivity. Spatial and temporal tools such as remote sensing (RS), geographic information systems (GIS) and the global positioning system (GPS) provide spatial and attribute data at regular intervals, with decision-support functionalities such as visualisation, querying and analysis that aid in the sustainable management of natural resources. Remote sensing data and GIS technologies play an important role in spatially evaluating bioresource availability and demand. This paper explores various land cover and land use techniques that could be used for bioresource monitoring, considering the spatial data of Kolar district, Karnataka state, India. Slope-based and distance-based vegetation indices are computed for qualitative and quantitative assessment of land cover using remote spectral measurements. Different-scale mapping of the land use pattern in Kolar district is done using supervised classification approaches. Slope-based vegetation indices show the area under vegetation ranging from 47.65% to 49.05%, while distance-based vegetation indices show a range of 40.40% to 47.41%. Land use analyses using the maximum likelihood classifier indicate that 46.69% is agricultural land, 42.33% is wasteland (barren land), 4.62% is built up, 3.07% is plantation, 2.77% is natural forest and 0.53% is water bodies. The comparative analysis of the various classifiers indicates that the Gaussian maximum likelihood classifier has the least errors. The computation of taluk-wise bioresource status shows that Chikballapur taluk has better availability of resources compared to other taluks in the district.
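
Slope-based and distance-based vegetation indices differ in how they separate vegetation from soil in red/NIR reflectance space. A minimal sketch of one index from each family (NDVI is slope-based; the Perpendicular Vegetation Index is distance-based, with illustrative soil-line coefficients, not values from this study):

```python
import math

def ndvi(nir, red):
    """NDVI, a slope-based index: normalised NIR-red contrast, in [-1, 1]."""
    return (nir - red) / (nir + red)

def pvi(nir, red, a=1.2, b=0.04):
    """Perpendicular Vegetation Index, a distance-based index: the
    perpendicular distance of a pixel from the soil line NIR = a*red + b.
    The soil-line coefficients a and b are illustrative assumptions."""
    return (nir - a * red - b) / math.sqrt(a * a + 1.0)

veg_ndvi = ndvi(0.6, 0.2)    # vegetated pixel: high NIR, low red
soil_pvi = pvi(0.28, 0.2)    # pixel lying exactly on the assumed soil line
```

A pixel on the soil line scores exactly zero under PVI regardless of brightness, which is why distance-based indices can report lower vegetated area over bright soils, consistent with the ranges quoted above.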

Relevance: 100.00%

Abstract:

It is a well-known fact that most developing countries have intermittent water supply, and the quantity of water supplied from the source is not distributed equitably among consumers. Aged pipelines, pump failures, and improper management of water resources are some of the main reasons for this. This study presents the application of a nonlinear control technique to overcome this problem in different zones in the city of Bangalore. The water is pumped to the city over a large distance of approximately 100 km and a very high elevation of approximately 400 m. The city has a large undulating terrain among different zones, which leads to unequal distribution of water. The Bangalore inflow water-distribution system (WDS) has been modeled, and a dynamic inversion (DI) nonlinear controller with proportional integral derivative (PID) features (DI-PID) is used for valve throttling to achieve the target flows to different zones of the city. This novel approach to equitable water distribution using DI-PID controllers, which can be used as a decision support system, is discussed in this paper.
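
The PID half of the DI-PID scheme is standard; a minimal discrete form is sketched below, with the dynamic-inversion term omitted because it requires a plant model. The gains and the toy valve/flow plant are invented for illustration, not taken from the study:

```python
class PID:
    """Discrete PID controller (the feedback part of a DI-PID loop)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = None

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Toy plant (assumption): each valve command adds directly to the zone flow.
target, flow = 10.0, 0.0                       # target and initial flow
ctrl = PID(kp=0.5, ki=0.02, kd=0.0, dt=1.0)
for _ in range(500):
    flow += ctrl.update(target, flow)          # drive flow toward the target
```

In the DI-PID arrangement described above, the inversion term would cancel the known nonlinear terrain/head dynamics, leaving a loop like this one to regulate the residual error in each zone's flow.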

Relevance: 100.00%

Abstract:

Harmful Algal Research and Response: A Human Dimensions Strategy (HARR-HD) justifies and guides a coordinated national commitment to human dimensions research critical to prevent and respond to impacts of harmful algal blooms (HABs). Beyond HABs, it serves as a framework for developing human dimensions research as a cross-cutting priority of ecosystem science supporting coastal and ocean management, including hazard research and mitigation planning. Measuring and promoting community resilience to hazards require human dimensions research outcomes such as effective risk communication strategies; assessment of community vulnerability; identification of susceptible populations; comprehensive assessment of environmental, sociocultural, and economic impacts; development of effective decision support tools; and improved coordination among agencies and stakeholders. HARR-HD charts a course for human dimensions research to achieve these and other priorities through coordinated implementation by the Joint Subcommittee on Ocean Science and Technology (JSOST) Interagency Working Group on HABs, Hypoxia and Human Health (IWG-4H); national HAB funding programs; national research and response programs; and state research and monitoring programs. (PDF contains 72 pages)

Relevance: 100.00%

Abstract:

The implementation of various types of marine protected areas is one of several management tools available for conserving representative examples of the biological diversity within marine ecosystems in general and National Marine Sanctuaries in particular. However, deciding where and how many sites to establish within a given area is frequently hampered by incomplete knowledge of the distribution of organisms and of the potential tradeoffs that would allow planners to address frequently competing interests in an objective manner. Fortunately, this is beginning to change. Recent studies on the continental shelf of the northeastern United States suggest that substrate and water mass characteristics are highly correlated with the composition of benthic communities and may therefore serve as proxies for the distribution of biological diversity. A detailed geo-referenced interpretative map of major sediment types within Stellwagen Bank National Marine Sanctuary (SBNMS) has recently been developed, and computer-aided decision support tools have reached new levels of sophistication. We demonstrate the use of simulated annealing, a type of mathematical optimization, to identify suites of potential conservation sites within SBNMS that equally represent (1) all major sediment types and (2) derived habitat types based on both sediment and depth, in the smallest amount of space. The Sanctuary was divided into 3610 0.5-min² sampling units. Simulations incorporated constraints on the physical dispersion of sampling units to varying degrees, such that solutions included between one and four site clusters. Target representation goals were set at 5, 10, 15, 20, and 25 percent of each sediment type, and 10 and 20 percent of each habitat type. Simulations consisted of 100 runs, from which we identified the best solution (i.e., smallest total area) and four near-optimal alternates. We also plotted the total number of instances in which each sampling unit occurred in the solution sets of the 100 runs as a means of gauging the variety of spatial configurations available under each scenario. Results suggested that the total combined area needed to represent each of the sediment types in equal proportions was equal to the percent representation level sought. Slightly larger areas were required to represent all habitat types at the same representation levels. Total boundary length increased in direct proportion to the number of sites at all levels of representation for simulations involving sediment and habitat classes, but increased more rapidly with the number of sites at higher representation levels. There were a large number of alternate spatial configurations at all representation levels, although generally fewer among one- and two-site versus three- and four-site solutions. These differences were less pronounced among simulations targeting habitat representation, suggesting that a similar degree of flexibility is inherent in the spatial arrangement of potential protected area systems containing one versus several sites for similar levels of habitat representation. We attribute these results to the distribution of sediment and depth zones within the Sanctuary, and to the fact that even levels of representation were sought in each scenario. (PDF contains 33 pages.)
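
Reserve selection by simulated annealing of the kind used above can be sketched with a penalty objective: minimise the number of selected sampling units subject to meeting a representation target for each sediment type. The penalty weight, cooling schedule, and toy sediment map below are all illustrative assumptions, not the study's configuration (and the dispersion constraints are omitted):

```python
import math
import random

def anneal(units, target, steps=20000, t0=1.0, alpha=0.9995, seed=1):
    """Pick a near-minimal set of sampling units such that at least
    `target` fraction of each sediment type is represented.
    `units`: one sediment-type label per sampling unit."""
    rng = random.Random(seed)
    types = set(units)
    need = {t: math.ceil(target * units.count(t)) for t in types}

    def energy(sel):
        have = {t: 0 for t in types}
        for u, s in zip(units, sel):
            if s:
                have[u] += 1
        shortfall = sum(max(0, need[t] - have[t]) for t in types)
        return sum(sel) + 100 * shortfall   # area term + infeasibility penalty

    sel = [rng.random() < target for _ in units]
    e = energy(sel)
    best, best_e = sel[:], e
    temp = t0
    for _ in range(steps):
        i = rng.randrange(len(units))
        sel[i] = not sel[i]                 # propose: flip one unit in/out
        e2 = energy(sel)
        if e2 <= e or rng.random() < math.exp((e - e2) / temp):
            e = e2
            if e < best_e:
                best, best_e = sel[:], e
        else:
            sel[i] = not sel[i]             # reject: undo the flip
        temp *= alpha                       # geometric cooling
    return best

units = ["mud"] * 40 + ["sand"] * 40 + ["gravel"] * 20   # toy sediment map
picked = anneal(units, target=0.20)
```

The many near-optimal alternates reported above fall out of this formulation naturally: rerunning with different seeds yields different feasible selections of essentially the same total area.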

Relevance: 100.00%

Abstract:

Since the early years of the 21st century, and in particular since 2007, the U.S. has been awakening rapidly to the fact that climate change is underway and that, even if stringent efforts are undertaken to mitigate greenhouse gas emissions, adaptation to the unavoidable impacts from the existing commitment to climate change is still needed and must begin now. This report provides an historical overview of the public, political, and scientific concern with adaptation in the United States. It begins by briefly distinguishing ongoing, historical adaptation to environmental circumstances from deliberate adaptation to human-induced climate change. It then describes the shift from the early concerns with climate change and adaptation to the more recent awakening to the need for a comprehensive approach to managing the risks from climate change. Ranging from the treatment of the topic in the news media to the drafting of bills in Congress, to state and local government activities with considerable engagement of NGOs, scientists and consultants, it is apparent that adaptation has finally, and explosively, emerged on the political agenda as a legitimate and needed subject for debate. At the same time, the current policy rush is not underpinned by widespread public engagement and mobilization, nor does it rest on a solid research foundation. Funding for vulnerability and adaptation research, the establishment of adequate decision support institutions, and the building of the necessary capacity in science, the consulting world, and government agencies lag far behind the need. (PDF contains 42 pages)

Relevance: 100.00%

Abstract:

Co-management is a system or process in which responsibility and authority for the management of common resources are shared between the state, local users of the resources, and other stakeholders, and in which they have the legal authority to administer the resource jointly. Co-management has received increasing attention in recent years as a potential strategy for managing fisheries. This paper presents and discusses results of a survey undertaken in the Kenyan part of Lake Victoria to assess the conditions (behaviour, attitudes and characteristics of resource users, as well as community institutions) that can support co-management. It analyses the results of this survey with respect to a series of parameters, identified by Pinkerton (1989), as necessary preconditions for the successful inclusion of community involvement in resource management. The survey was implemented through a two-stage stratified random sampling technique based on district and beach-size strata. A total of 405 fishers, drawn from 25 fish landing beaches, were interviewed using a structured questionnaire. The paper concludes that while Kenya's Lake Victoria fishery appears to meet a number of these preconditions, it fails to meet others. Preconditions in this latter category include the definition of boundaries in fishing grounds, community members' rights to the resource, and the delegation and legislation of local responsibility and authority. Additional work is required to further elaborate and understand these shortcomings.
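
The two-stage stratified design described above can be sketched as proportional allocation across district-by-beach-size strata followed by simple random sampling within each stratum. The sampling frame, stratum labels, and fisher IDs below are invented for illustration:

```python
import random

def stratified_sample(frame, n_total, seed=7):
    """Two-stage sketch: strata are (district, beach-size) pairs; sample
    size is allocated proportionally to stratum size, then fishers are
    drawn at random within each stratum."""
    rng = random.Random(seed)
    pop = sum(len(v) for v in frame.values())
    return {stratum: rng.sample(fishers,
                                min(len(fishers),
                                    round(n_total * len(fishers) / pop)))
            for stratum, fishers in frame.items()}

# Hypothetical frame: two districts, each with large and small beaches.
frame = {
    ("Kisumu", "large"): [f"K-L-{i}" for i in range(60)],
    ("Kisumu", "small"): [f"K-S-{i}" for i in range(20)],
    ("Busia", "large"): [f"B-L-{i}" for i in range(15)],
    ("Busia", "small"): [f"B-S-{i}" for i in range(5)],
}
sample = stratified_sample(frame, n_total=20)
```

Proportional allocation keeps the sample's district and beach-size mix aligned with the frame, which is what makes stratum-level attitude comparisons defensible.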

Relevance: 100.00%

Abstract:

How to regulate phytoplankton growth in water supply reservoirs has occupied managers and strategists for some fifty years now, and mathematical models have always featured in their design and operational constraints. In recent years, rather more sophisticated simulation models have become available and these, ideally, purport to provide the manager with improved forecasting of plankton blooms and the likely species, and the sort of decision support that might permit management choices to be selected with increased confidence. This account describes the adaptation and application of one such model, PROTECH (Phytoplankton RespOnses To Environmental CHange), to the problems of plankton growth in reservoirs. This article supposes no background knowledge of the main algal types; neither does it attempt to catalogue the problems that their abundance may cause in lakes and reservoirs.

Relevance: 100.00%

Abstract:

Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, small and medium enterprises are expected to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud still lacks practical, real-world frameworks. This paper aims at filling this gap by presenting a real tool, already implemented and tested, which can be used as a cloud computing adoption decision tool. The tool uses a diagnosis based on specific questions to gather the required information and subsequently provides the user with valuable information for deploying the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge of cloud computing, and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge of this subject, and the tool presented aims to redress this mismatch insofar as possible. Copyright: © 2015 Bildosola et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.