36 results for Lot sizing and scheduling


Relevance:

100.00%

Publisher:

Abstract:

The recently established concept of building deconstruction, whereby a building is carefully dismantled into reusable parts, has significantly promoted the reuse and recycling of building components and materials. Current research and practice in building deconstruction focus mainly on issues before and during deconstruction, such as hazardous material detection, deconstruction design and deconstruction technology. Issues arising after the deconstruction project are rarely considered. Waste reuse and recycling are enabled by deconstruction yet seldom achieved in practice; in particular, demand for salvaged building components and materials rarely emerges in time to match the actual waste production of a deconstruction project. To deal with this situation, waste production needs to be conducted in a demand-oriented way: it must be carefully planned and scheduled prior to physical deconstruction, as an essential part of deconstruction project planning and scheduling. Furthermore, the relationship between waste production and the structural characteristics of the building is an important consideration in any deconstruction plan. A waste production simulation will therefore facilitate waste reuse and recycling in a deconstruction project, serving as a crucial component of deconstruction planning and design. This research describes the concept of waste production simulation and investigates managerial and technical aspects of waste production simulation for building deconstruction projects.

Relevance:

100.00%

Publisher:

Abstract:

The building profession is becoming increasingly demanding with respect to building environmental performance. The intention is to build best practice into our buildings. In part, this is a response to the Australian government and other independent organisations that have developed policies, rating tools and performance-ranking measures, all with the aim of achieving environmentally sustainable buildings.

With rating systems endorsing innovative environmental design solutions, it could be asked: Are our buildings really operating as rated? Do we know whether our designs are in compliance with what was calculated or simulated? Is there a feedback loop informing the design process on successes or failures in our designs or mechanical services?

While ratings continue to focus on ‘by design’ or ‘as built’ rewards, few tools acknowledge perhaps the more crucial bottom line: ‘as performing’. With the exception of an AGBR (Australian Green Building Rating) scheme based on actual annual energy consumption, there appears to be no ‘as performing’ assessment. Furthermore, practically every building is a prototype (a one-off) and requires commissioning, programming and scheduling of its services. It would certainly appear that, as stakeholders (the procurers, owners, facilities managers and users) of the newly built environment, what we really want to know is actual on-site confirmation of performance. It is the objective of the Mobile Architecture and Built Environment Laboratory (MABEL) to provide such a service.

Relevance:

100.00%

Publisher:

Abstract:

Short-term load forecasting (STLF) is of great importance for the control and scheduling of electrical power systems. The uncertainty of power systems increases due to the random nature of climate and the penetration of renewable energies such as wind and solar power. Traditional methods for generating point forecasts of load demand cannot properly handle uncertainties in datasets. To quantify the potential uncertainties associated with forecasts, this paper implements a neural network (NN)-based method for the construction of prediction intervals (PIs). A newly proposed method, called lower upper bound estimation (LUBE), is applied to develop PIs using NN models. The primary multi-objective problem is first transformed into a constrained single-objective problem. This new formulation is closer to the original problem and has fewer parameters than the cost function. Particle swarm optimization (PSO) integrated with a mutation operator is used to solve the problem. Two case studies based on historical load datasets from Singapore and New South Wales (Australia) are used to validate the PSO-based LUBE method. The results demonstrate that the proposed method can construct high-quality PIs for load forecasting applications.
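The LUBE idea described above can be sketched in a few lines: a network with two outputs (the lower and upper bounds), a constrained single-objective cost that penalizes insufficient coverage, and a basic PSO loop with mutation. This is a minimal illustrative sketch on toy data, not the paper's implementation; the network size, penalty weight and PSO constants are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy load data: one input feature (e.g. normalized hour of day), noisy demand.
X = np.linspace(0, 1, 200).reshape(-1, 1)
y = np.sin(2 * np.pi * X[:, 0]) + rng.normal(0, 0.2, 200)

H = 8  # hidden units (assumed size)

def predict(params, X):
    """One-hidden-layer NN with two outputs: the two PI bounds."""
    W1 = params[:H].reshape(1, H)
    b1 = params[H:2 * H]
    W2 = params[2 * H:4 * H].reshape(H, 2)
    b2 = params[4 * H:4 * H + 2]
    out = np.tanh(X @ W1 + b1) @ W2 + b2
    # Order the two outputs so lower <= upper at every point.
    return out.min(axis=1), out.max(axis=1)

def cost(params, X, y, target_coverage=0.9):
    """Constrained single-objective cost: mean PI width, heavily
    penalized when empirical coverage falls below the target."""
    lower, upper = predict(params, X)
    picp = ((y >= lower) & (y <= upper)).mean()  # PI coverage probability
    width = (upper - lower).mean()
    return width + 100.0 * max(0.0, target_coverage - picp)  # assumed weight

def pso(n_params, f, n_particles=30, iters=200, mutation=0.1):
    """Basic PSO with a simple mutation operator on particle positions."""
    pos = rng.normal(0, 1, (n_particles, n_params))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([f(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, 1))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        # Mutation: randomly perturb a fraction of the particles.
        mask = rng.random(n_particles) < mutation
        pos[mask] += rng.normal(0, 0.5, (mask.sum(), n_params))
        vals = np.array([f(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

n_params = 4 * H + 2  # matches the parameter layout in predict()
best = pso(n_params, lambda p: cost(p, X, y))
lower, upper = predict(best, X)
picp = ((y >= lower) & (y <= upper)).mean()
```

Because the width term and the coverage penalty are folded into one scalar cost, a standard derivative-free optimizer such as PSO can train the bound network directly, which is the motivation the abstract gives for the constrained single-objective reformulation.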

Relevance:

100.00%

Publisher:

Abstract:

The parameters of the orthographic plan drawing largely compel the observer to view the plan image from a single vantage point and in a single instant. The plan drawing places the viewer at a fixed distance, looking from above, at an abstract flattening of a curved surface area of the earth. Landscape architects, who often deal with expansive scales, appreciate how the view from above enables the survey and understanding of large (or small) tracts of the landscape. The view enabled by the plan, in this case of a landscape design, allows inscribing and printing onto 2D flat surfaces (computer screens and paper). This process facilitates the sizing and scaling of elements. Plans allow measurement without the distortion of scale introduced by cartographic and drawing techniques that skew space and perspective. The strength of the plan drawing thus lies in enabling construction; its merit as an imaginative, exploratory medium, however, is less convincing.

Relevance:

40.00%

Publisher:

Abstract:

Studies have shown that most of the computers in a non-dedicated cluster are often idle or lightly loaded. These underutilized computers can be employed to execute parallel applications. The aim of this study is to learn how the concurrent execution of a computation-bound parallel application and sequential applications influences their execution performance and cluster utilization. The study demonstrates that the computation-bound parallel application benefits from load balancing while the sequential applications suffer only an insignificant slowdown. Overall, the utilization of the non-dedicated cluster is improved.

Relevance:

40.00%

Publisher:

Abstract:

In this paper, we demonstrate how existing programming environments, tools and middleware can be used to study the execution performance of parallel and sequential applications on a non-dedicated cluster. The set of parallel and sequential benchmark applications selected for and used in the experiments is characterized, and the experiment requirements are presented.

Relevance:

40.00%

Publisher:

Abstract:

Restoration of native vegetation is required in many regions of the world, but determining priority locations for revegetation is a complex problem. We consider the problem of determining spatial and temporal priorities for revegetation to maximize habitat for 62 bird species within a heavily cleared agricultural region, 11 000 km2 in area. We show how a reserve-selection framework can be applied to a complex, large-scale restoration-planning problem to account for multi-species objectives and connectivity requirements at a spatial extent and resolution relevant to management. Our approach explicitly accounts for time lags in planting and development of habitat resources, which is intended to avoid future population bottlenecks caused by delayed provision of critical resources, such as tree hollows. We coupled species-specific models of expected habitat quality and fragmentation effects with the dynamics of habitat suitability following replanting to produce species-specific maps for future times. Spatial priorities for restoration were determined by ranking locations (150-m grid cells) by their expected contribution to species habitat through time using the conservation planning tool “Zonation”. We evaluated solutions by calculating expected trajectories of habitat availability for each species. We produced a spatially explicit revegetation schedule for the region that resulted in a balanced increase in habitat for all species. Priority areas for revegetation generally were clustered around existing vegetation, although not always. Areas on richer soils and with high rainfall were more highly ranked, reflecting their potential to support high-quality habitats that have been disproportionately cleared for agriculture. Accounting for delayed development of habitat resources altered the rank-order of locations in the derived revegetation plan and led to improved expected outcomes for fragmentation-sensitive species.
This work demonstrates the potential for systematic restoration planning at large scales that accounts for multiple objectives, which is urgently needed by land and natural resource managers.
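The cell-ranking step described above can be illustrated with a toy version of the Zonation-style iterative rule: repeatedly remove the grid cell whose loss least harms any species' remaining habitat, so that the removal order becomes a priority ranking. This is a heavily simplified sketch on random data; the actual Zonation tool uses richer cell-value rules, connectivity penalties and the species-specific habitat models the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells, n_species = 50, 5
# Rows = grid cells, columns = expected habitat quality per species (toy data).
quality = rng.random((n_cells, n_species))

def zonation_rank(quality):
    """Return a dict cell -> removal order; cells removed last
    (highest order) are the highest restoration priorities."""
    q = quality
    remaining = list(range(len(q)))
    totals = q.sum(axis=0)          # remaining habitat per species
    rank, order = {}, 0
    while remaining:
        # A cell's value: its largest share of any species' remaining
        # habitat (a simplified core-area-style rule).
        shares = np.array([(q[c] / totals).max() for c in remaining])
        worst = remaining[int(shares.argmin())]
        rank[worst] = order         # removed early => low priority
        totals = totals - q[worst]
        remaining.remove(worst)
        order += 1
    return rank

rank = zonation_rank(quality)
priority_cells = sorted(rank, key=rank.get, reverse=True)[:5]  # top 5 cells
```

Ranking by the worst-off species' share, rather than by a simple habitat sum, is what lets this family of methods produce the "balanced increase in habitat for all species" the abstract reports.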


Relevance:

40.00%

Publisher:

Abstract:

Bounded uncertainty is a major challenge to real-life scheduling, as it increases risk and cost depending on the objective function. Bounded uncertainty provides limited information about its nature: only the upper and lower bounds, with no information in between, in contrast to probability distributions and fuzzy membership functions. Bratley's algorithm is commonly used for scheduling with earliest-start and due-date constraints. The proposed research uses interval computation to minimize the impact of bounded uncertainty in processing times on Bratley's algorithm, reducing the uncertainty of the objective-function estimate. The proposed concept is to perform the calculations on the interval values and approximate only the end result, instead of approximating each interval first and then performing numerical calculations. This methodology gives a more certain estimate of the objective function.
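The "compute on intervals, approximate only at the end" idea can be shown with a minimal interval type propagated through a fixed job sequence. This is an illustrative sketch, not the paper's method: the objective here is total completion time for jobs run back-to-back, and the job data are invented.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    """A closed interval [lo, hi] representing a bounded-uncertain value."""
    lo: float
    hi: float

    def __add__(self, other):
        # Interval addition: bounds add independently.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def midpoint(self):
        return (self.lo + self.hi) / 2

def total_completion_time(processing_times):
    """Sum of completion times for jobs run back-to-back in the given
    order, with interval-valued processing times."""
    t = Interval(0.0, 0.0)      # current completion time
    total = Interval(0.0, 0.0)  # interval on the objective
    for p in processing_times:
        t = t + p
        total = total + t
    return total

# Invented processing-time bounds for three jobs.
jobs = [Interval(2, 3), Interval(1, 2), Interval(4, 5)]
obj = total_completion_time(jobs)  # brackets every realizable objective value
```

Every realizable objective value lies inside `obj`; approximating only at the end (e.g. via `obj.midpoint()`) avoids the compounding error of replacing each processing-time interval with a point estimate before scheduling.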