14 results for nonparametric demand model

in Digital Commons at Florida International University


Relevance:

80.00%

Publisher:

Abstract:

In response to environmental sustainability concerns in urban areas, Miami-Dade County implemented a series of water conservation programs, including rebate/exchange incentives to encourage the use of high-efficiency aerators (AR), showerheads (SH), toilets (HET), and clothes washers (HEW). This study first used panel data analysis of water consumption to evaluate the performance and actual water savings of the individual programs. An integrated water demand model was also developed to incorporate each property's physical characteristics into its water consumption profile. A life cycle assessment (with emphasis on the end-use stage of the water system) of water-intensive appliances was conducted to determine the environmental impacts of each practice. Approximately 6 to 10% of water was saved in the first and second years of implementation of high-efficiency appliances, with continuing savings in the third and fourth years. Water savings (gallons per household per day) for the water-efficient appliances were 28 (11.1%) for SH, 34.7 (13.3%) for HET, and 39.7 (14.5%) for HEW. Furthermore, the estimated contributions of high-efficiency appliances to reducing water demand in the integrated water demand model were between 5 and 19% (highest in the AR program). The results indicated that adopting more than one type of water-efficient appliance could significantly reduce residential water demand. For sustainable water management strategies, the appropriate water conservation rate was projected to be 1 to 2 million gallons per day (MGD) through 2030. With 2 MGD of water savings, the estimated per capita water use could be reduced from approximately 140 to 122 gallons per capita per day (GPCD). Additional efforts are needed to reduce water demand to the US EPA's "WaterSense" conservation level of 70 GPCD by 2030.
Life cycle assessment results showed that the environmental impacts (water and energy demands and greenhouse gas emissions) of the end-use and demand phases are the most significant within the water system, particularly due to water heating (73% for the clothes washer and 93% for the showerhead). Estimates of optimal appliance lifespans (8 to 21 years) imply that earlier replacement with high-efficiency models is encouraged in order to minimize the environmental impacts of current practice.
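The arithmetic relating the household-level and system-level savings reported above can be sketched as follows; the function names and the example service population are hypothetical, for illustration only.

```python
def implied_baseline_gphd(saved_gphd, saved_fraction):
    """Back out the per-household baseline use implied by a reported
    saving in gallons/household/day and its percentage share."""
    return saved_gphd / saved_fraction

def gpcd_after_savings(gpcd_before, savings_mgd, population):
    """Per-capita use (GPCD) after shaving `savings_mgd` million
    gallons/day off total demand for a given service population."""
    return gpcd_before - savings_mgd * 1e6 / population

# The showerhead figures (28 gal/household/day, 11.1%) imply a baseline
# of roughly 252 gal/household/day.
print(round(implied_baseline_gphd(28, 0.111)))
```

Note that the reported drop from 140 to 122 GPCD follows from 2 MGD of savings only for a particular service population; the helper above makes that dependence explicit.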

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study is to produce a model to be used by state regulating agencies to assess demand for subacute care. In accomplishing this goal, the study refines the definition of subacute care, demonstrates a method for bed need assessment, and measures the effectiveness of this new level of care. This was the largest study of subacute care to date. Research focused on 19 subacute units in 16 states, each of which provides high-intensity rehabilitative and/or restorative care carried out in a high-tech unit. Each of the facilities was based in a nursing home but utilized separate staff, equipment, and services. Because these facilities are under local control, it was possible to study regional differences in subacute care demand. Using these data, a model for predicting demand for subacute care services was created, building on earlier models submitted by John Whitman for the American Hospital Association and Robin E. MacStravic. The Broderick model uses the "bootstrapping" method and takes advantage of high technology: computers and software, databases in business and government, publicly available databases from providers or commercial vendors, professional organizations, and other information sources. Using newly available sources of information, this new model addresses the problems and needs of health care planners as they approach the challenges of the 21st century.
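Since the Broderick model relies on "bootstrapping," a minimal sketch of a bootstrap interval for bed-need demand might look like the following; the census figures and parameter choices are invented for illustration.

```python
import random

def bootstrap_mean_ci(daily_census, n_boot=2000, alpha=0.05, seed=1):
    """Bootstrap a (1 - alpha) confidence interval for the mean daily
    census; the upper bound gives a conservative bed-need estimate."""
    rng = random.Random(seed)
    n = len(daily_census)
    means = sorted(
        sum(rng.choice(daily_census) for _ in range(n)) / n
        for _ in range(n_boot)
    )
    lo = means[int(n_boot * alpha / 2)]
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi
```

A regulator could run this per region on observed daily census counts and size bed capacity to the upper bound rather than the point estimate.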

Relevance:

30.00%

Publisher:

Abstract:

Access to healthcare is a major problem: patients are often deprived of timely admission to care. Poor access has resulted in significant but avoidable healthcare costs, poor quality of healthcare, and deterioration in general public health. Advanced Access is a simple and direct approach to appointment scheduling in which the majority of a clinic's appointment slots are kept open in order to provide access for immediate or same-day healthcare needs, thereby alleviating the problem of poor access to healthcare. This research formulates a non-linear discrete stochastic mathematical model of the Advanced Access appointment scheduling policy. The model objective is to maximize the expected profit of the clinic subject to constraints on the minimum access to healthcare provided. Patient behavior is characterized with probabilities for no-shows, balking, and related patient choices. Structural properties of the model are analyzed to determine whether Advanced Access patient scheduling is feasible. To solve the complex combinatorial optimization problem, a heuristic that combines a greedy construction algorithm with a neighborhood improvement search was developed. The model and the heuristic were used to evaluate the Advanced Access appointment policy against existing policies. Trade-offs between profit and access to healthcare were established, and analysis of the input parameters was performed. The trade-off curve was observed to be concave, implying that there exists an access level at which the clinic can be operated at optimal profit. The results also show that, in many scenarios, clinics can improve access without any decrease in profit by switching from an existing scheduling policy to an Advanced Access policy.
Further, the success of an Advanced Access policy in providing improved access and/or profit depends on the expected value of demand, the variation in demand, and the ratio of demand for same-day versus advance appointments. The contributions of the dissertation are a model of Advanced Access patient scheduling, a heuristic to solve the model, and the use of the model to understand the scheduling policy trade-offs that healthcare clinic managers must make.
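The combined heuristic described above (greedy construction followed by neighborhood improvement) can be sketched on a toy version of the problem; the profit function, revenues, and parameters here are invented and are not the dissertation's model.

```python
def schedule_open_slots(total_slots, same_day_demand, min_open, profit):
    """Greedy construction: open just enough slots to meet the access
    floor; then a +/-1 neighborhood search climbs to a local optimum."""
    x = min_open
    improved = True
    while improved:
        improved = False
        for nb in (x - 1, x + 1):
            if min_open <= nb <= total_slots and profit(nb) > profit(x):
                x, improved = nb, True
    return x

# Toy profit: advance bookings pay 100, same-day visits pay 120,
# both discounted by a 0.9 show probability (hypothetical numbers).
def profit(open_slots, total=20, same_day=8, p_show=0.9):
    return p_show * (100 * (total - open_slots) +
                     120 * min(open_slots, same_day))

print(schedule_open_slots(20, 8, 2, profit))  # climbs from 2 up to 8
```

With these numbers, opening more same-day slots pays off until same-day demand is saturated, so the search climbs from the access floor to that saturation point.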

Relevance:

30.00%

Publisher:

Abstract:

An emergency is a deviation from a planned course of events that endangers people, property, or the environment. It can be described as an unexpected event that causes economic damage, destruction, and human suffering. When a disaster happens, emergency managers are expected to have a response plan for the most likely disaster scenarios. Unlike earthquakes and terrorist attacks, a hurricane response plan can be activated ahead of time, since a hurricane is predicted at least five days before it makes landfall. This research looked into the logistics aspects of the problem in an attempt to develop a hurricane relief distribution network model. We addressed the problem of how to efficiently and effectively deliver basic relief goods to victims of a hurricane disaster: specifically, where to preposition State Staging Areas (SSAs), which Points of Distribution (PODs) to activate, and how to allocate commodities to each POD. Previous research has addressed several of these issues, but not with the incorporation of the random behavior of the hurricane's intensity and path. This research presents a stochastic meta-model that deals with the location of SSAs and the allocation of commodities. The novelty of the model is that it treats the strength and path of the hurricane as stochastic processes and models them as discrete Markov chains. The demand is also treated as a stochastic parameter because it depends on the stochastic behavior of the hurricane. However, for the meta-model, the demand is an input that is determined using Hazards United States (HAZUS), software developed by the Federal Emergency Management Agency (FEMA) that estimates losses due to hurricanes and floods. A solution heuristic was developed based on simulated annealing. Since the meta-model is a multi-objective problem, the heuristic is a multi-objective simulated annealing (MOSA) algorithm, in which the initial solution and the cooling rate were determined via a design of experiments.
The experiments showed that the initial temperature (T0) is irrelevant, but the temperature reduction (δ) must be very gradual. Assessment of the meta-model indicates that the Markov chains performed as well as or better than forecasts made by the National Hurricane Center (NHC). Tests of the MOSA showed that it provides solutions in an efficient manner, and an illustrative example shows that the meta-model is practical.
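A discrete Markov chain over hurricane intensity, of the kind the meta-model uses, can be sketched as follows; the transition probabilities are made up for illustration and are not NHC or dissertation data.

```python
import random

# Hypothetical transition probabilities over Saffir-Simpson-like
# categories 1-3: each row lists (next_state, probability) pairs.
P = {1: [(1, 0.7), (2, 0.3)],
     2: [(1, 0.2), (2, 0.5), (3, 0.3)],
     3: [(2, 0.4), (3, 0.6)]}

def simulate_intensity(start, steps, rng):
    """Sample one realization of the intensity chain; if rounding ever
    leaves the draw uncovered, the state simply self-loops."""
    state, path = start, [start]
    for _ in range(steps):
        u, acc = rng.random(), 0.0
        for nxt, p in P[state]:
            acc += p
            if u <= acc:
                state = nxt
                break
        path.append(state)
    return path
```

Running many such realizations gives the scenario set over which the SSA locations and commodity allocations are evaluated.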

Relevance:

30.00%

Publisher:

Abstract:

This dissertation delivers a framework to diagnose the bull-whip effect (BWE) in supply chains and then identify methods to minimize it. Such a framework is needed because, in spite of the significant amount of literature discussing the bull-whip effect, many companies continue to experience the wide variations in demand that are indicative of it. While the theory and knowledge of the bull-whip effect are well established, there is still no engineering framework and method to systematically identify the problem, diagnose its causes, and identify remedies. The present work seeks to fill this gap by providing a holistic, systems perspective on bull-whip identification and diagnosis. The framework employs the SCOR reference model to examine the supply chain processes with a baseline measure of demand amplification. Research into the supply chain's structural and behavioral features is then conducted by means of system dynamics modeling. The diagnostic framework, called the Demand Amplification Protocol (DAMP), relies not only on improvements to existing methods but also introduces original developments to accomplish successful diagnosis. DAMP contributes a comprehensive methodology that captures the dynamic complexities of supply chain processes. The method also contributes a BWE measurement approach that is suitable for actual supply chains because of its low data requirements, and introduces a BWE scorecard for relating established causes to a central BWE metric. In addition, the dissertation makes a methodological contribution to the analysis of system dynamics models with a technique for statistical screening called SS-Opt, which determines the inputs with the greatest impact on the bull-whip effect by means of perturbation analysis and subsequent multivariate optimization.
The dissertation describes the implementation of the DAMP framework in an actual case study that exposes the approach, analysis, results, and conclusions. The case study suggests that a balanced solution between costs and demand amplification can better serve both the firms' and the supply chain's interests. Insights point to supplier network redesign, postponement in manufacturing operations, and collaborative forecasting agreements with the main distributors.
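A standard low-data measure of demand amplification, the variance ratio of orders to demand, can be sketched as follows; this is the textbook BWE metric, not necessarily DAMP's exact formulation, and the series are invented.

```python
from statistics import mean, pvariance

def bullwhip_ratio(orders, demand):
    """Variance amplification of upstream orders relative to downstream
    demand, via squared coefficients of variation; > 1 indicates BWE."""
    cv2 = lambda xs: pvariance(xs) / mean(xs) ** 2
    return cv2(orders) / cv2(demand)
```

Only two time series (orders placed and demand observed) are needed, which is what makes such a measure practical for actual supply chains.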

Relevance:

30.00%

Publisher:

Abstract:

Annual Average Daily Traffic (AADT) is a critical input to many transportation analyses. By definition, AADT is the average 24-hour volume at a highway location over a full year. Traditionally, AADT is estimated using a mix of permanent and temporary traffic counts. Because field collection of traffic counts is expensive, it is usually done only for the major roads, leaving most local roads without any AADT information. However, AADTs are needed for local roads for many applications. For example, AADTs are used by state Departments of Transportation (DOTs) to calculate the crash rates of all local roads in order to identify the top five percent of hazardous locations for annual reporting to the U.S. DOT. This dissertation develops a new method for estimating AADTs for local roads using travel demand modeling. A major component of the new method is a parcel-level trip generation model that estimates the trips generated by each parcel. The model uses tax parcel data together with the trip generation rates and equations provided by the ITE Trip Generation Report. The generated trips are then distributed to existing traffic count sites using a parcel-level gravity-based trip distribution model. The all-or-nothing assignment method is then used to assign the trips onto the roadway network to estimate the final AADTs. The entire process was implemented in the Cube demand modeling system with extensive spatial data processing in ArcGIS. To evaluate the performance of the new method, data from several study areas in Broward County, Florida were used. The estimated AADTs were compared with those from two existing methods, using actual traffic counts as ground truth. The results show that the new method performs better than both existing methods. One limitation of the new method is that it relies on Cube, which limits the number of zones to 32,000; accordingly, a study area exceeding this limit must be partitioned into smaller areas.
Because AADT estimates for roads near the boundary areas were found to be less accurate, further research could examine the best way to partition a study area to minimize this impact.
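The parcel-level trip distribution step can be sketched with a simple gravity model; the exponential impedance form and the decay parameter are illustrative assumptions, not necessarily those used in the dissertation.

```python
import math

def gravity_distribute(productions, attractions, cost, beta=0.1):
    """Split each parcel's generated trips among count sites with
    T_ij proportional to A_j * exp(-beta * c_ij), normalized per parcel
    so every parcel's trips are fully distributed."""
    trips = {}
    for i, p in productions.items():
        w = {j: a * math.exp(-beta * cost[i, j])
             for j, a in attractions.items()}
        total = sum(w.values())
        for j, wj in w.items():
            trips[i, j] = p * wj / total
    return trips
```

The distributed trips would then be loaded onto the network with all-or-nothing assignment (every origin-destination pair routed along its single shortest path).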

Relevance:

30.00%

Publisher:

Abstract:

The need for efficient, sustainable, and planned utilization of resources is ever more critical. In the U.S. alone, buildings consume 34.8 quadrillion (10^15) BTU of energy annually at a cost of $1.4 trillion. Of this energy, 58% is used for heating and air conditioning. Several building energy analysis tools have been developed to assess energy demands and lifecycle energy costs in buildings. Such analyses are also essential for an efficient HVAC design that avoids the pitfalls of an under- or over-designed system. DOE-2 is among the most widely known full-building energy analysis models. It also constitutes the simulation engine of other prominent software such as eQUEST, EnergyPro, and PowerDOE. Therefore, it is essential that DOE-2 energy simulations be characterized by high accuracy. Infiltration is an uncontrolled process through which outside air leaks into a building. Studies have estimated infiltration to account for up to 50% of a building's energy demand. This, considered alongside the annual cost of building energy consumption, reveals the costs of air infiltration and stresses the need for prominent building energy simulation engines to account accurately for its impact. In this research, the relative accuracy of current air infiltration calculation methods is evaluated against an intricate multiphysics hygrothermal CFD building envelope analysis. The full-scale CFD analysis is based on a meticulous representation of cracking in building envelopes and on real-life conditions. The research found that even the most advanced current infiltration methods, including those in DOE-2, exhibit up to 96.13% relative error versus the CFD analysis. An Enhanced Model for Combined Heat and Air Infiltration Simulation was therefore developed. The model resulted in a 91.6% improvement in relative accuracy over current models: it reduces the error versus the CFD analysis to less than 4.5% while requiring less than 1% of the time required for such a complex hygrothermal analysis.
The algorithm used in our model was demonstrated to be easy to integrate into DOE-2 and other engines as a standalone method for evaluating infiltration heat loads. This will vastly increase the accuracy of such simulation engines while maintaining the speed and ease of use that make them so widely used in building design.
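For context, the textbook sensible infiltration load relation that such calculation methods build on (not the dissertation's enhanced model) can be written as:

```python
def infiltration_sensible_load(cfm, delta_t_f):
    """Sensible heat load from infiltration in I-P units:
    Q [BTU/h] ~= 1.08 * airflow [CFM] * indoor-outdoor dT [deg F],
    where 1.08 folds in standard air density and specific heat."""
    return 1.08 * cfm * delta_t_f
```

The accuracy question studied above is essentially about the airflow term: how well an engine estimates the infiltration rate that feeds a relation like this.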

Relevance:

30.00%

Publisher:

Abstract:

Software engineering researchers are challenged to provide increasingly powerful levels of abstraction to address the rising complexity inherent in software solutions. One new development paradigm that places models at the forefront of the development process is Model-Driven Software Development (MDSD). MDSD considers models first-class artifacts, extending the capability for engineers to use concepts from the problem domain of discourse to specify apropos solutions. A key component of MDSD is domain-specific modeling languages (DSMLs), languages with focused expressiveness that target a specific taxonomy of problems. The de facto approach is to first transform DSML models into an intermediate artifact in a high-level language (HLL) such as Java or C++, then execute the resulting code. Our research group has developed a class of DSMLs, referred to as interpreted DSMLs (i-DSMLs), in which models are directly interpreted by a specialized execution engine with semantics based on model changes at runtime. This execution engine uses a layered architecture and is referred to as a domain-specific virtual machine (DSVM). As the domain-specific model being executed descends the layers of the DSVM, the semantic gap between the user-defined model and the services provided by the underlying infrastructure is closed. The focus of this research is the synthesis engine, the layer in the DSVM that transforms i-DSML models into executable scripts for the next lower layer to process. The appeal of an i-DSML is constrained because it possesses unique semantics contained within the DSVM. Existing DSVMs for i-DSMLs exhibit tight coupling between the implicit model of execution and the semantics of the domain, making it difficult to develop DSVMs for new i-DSMLs without a significant investment of resources. At the onset of this research, only one i-DSML had been created using the aforementioned approach, for the user-centric communication domain.
This i-DSML is the Communication Modeling Language (CML), and its DSVM is the Communication Virtual Machine (CVM). A major problem with the CVM's synthesis engine is that the domain-specific knowledge (DSK) and the model of execution (MoE) are tightly interwoven; consequently, subsequent DSVMs would need to be developed from inception with no reuse of expertise. This dissertation investigates how to decouple the DSK from the MoE and subsequently produce a generic model of execution (GMoE) from the remaining application logic. This GMoE can be reused to instantiate synthesis engines for DSVMs in other domains. The generalized approach to developing the model synthesis component of i-DSML interpreters utilizes a reusable framework loosely coupled to the DSK through swappable framework extensions. The approach involves first creating an i-DSML and its DSVM for a second domain (demand-side smart grid, or microgrid, energy management) and designing the synthesis engine so that the DSK and MoE are easily decoupled. To validate the utility of the approach, synthesis engines are instantiated using the GMoE and the DSKs of the two aforementioned domains, and an empirical study is performed to support our claim of reduced development effort.
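The decoupling of DSK from a generic MoE can be illustrated with a strategy-style sketch; the class and method names here are hypothetical, chosen only to show the swappable-extension idea, not the CVM's actual interfaces.

```python
class DomainKnowledge:
    """Swappable DSK extension point consulted by the generic engine."""
    def translate(self, element):
        raise NotImplementedError

class CommunicationDSK(DomainKnowledge):
    """Hypothetical DSK for the user-centric communication domain."""
    def translate(self, element):
        return f"connect {element}"

class MicrogridDSK(DomainKnowledge):
    """Hypothetical DSK for demand-side microgrid energy management."""
    def translate(self, element):
        return f"dispatch {element}"

class SynthesisEngine:
    """Generic model of execution (GMoE): walks the model and delegates
    every domain-specific decision to the injected DSK."""
    def __init__(self, dsk):
        self.dsk = dsk
    def synthesize(self, model):
        return [self.dsk.translate(e) for e in model]
```

Because `SynthesisEngine` never hard-codes domain semantics, instantiating it for a new domain means writing only a new `DomainKnowledge` subclass, which is the reuse claim the dissertation evaluates.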

Relevance:

30.00%

Publisher:

Abstract:

English has been taught as a core and compulsory subject in China for decades. Recently, the demand for English in China has increased dramatically; China now has the world's largest English-learning population. The traditional English-teaching method cannot continue to be the only approach because it focuses merely on reading, grammar, and translation, which cannot meet English learners' and users' needs (i.e., communicative competence and skills in speaking and writing). This study was conducted to investigate whether the Picture-Word Inductive Model (PWIM), a pedagogical method using pictures and inductive thinking, would benefit English learners in China in terms of potentially higher output in speaking and writing. Through the lens of Cognitive Load Theory (CLT), specifically its redundancy effect, I investigated whether processing words and a picture concurrently would present a cognitive overload for English learners in China. I conducted a mixed methods research study. A quasi-experiment (pretest, seven-week intervention, and posttest) was conducted with 234 students in four groups in Lianyungang, China (58 fourth graders and 57 seventh graders as an experimental group taught with PWIM, and 59 fourth graders and 60 seventh graders as a control group taught with the traditional method). No significant difference in the effects of PWIM on vocabulary acquisition was found across grade levels. Observations, questionnaires with open-ended questions, and interviews were deployed to answer the three remaining research questions. A few students felt cognitively overloaded when they encountered too many writing samples, too many new words at one time, repeated words, mismatches between words and pictures, and so on. Many students listed and exemplified numerous strengths of PWIM, while a few mentioned weaknesses. The students expressed the idea that PWIM had a positive effect on their English learning.
As integrated inferences, qualitative findings were used to explain the quantitative result that there were no significant differences in the effects of the PWIM between the experimental and control groups at either grade level, from four contextual aspects: time constraints on PWIM implementation, teachers' resistance, uncertainty about how to use PWIM, and implementation of PWIM in classes of over 55 students.

Relevance:

30.00%

Publisher:

Abstract:

The Mara River Basin (MRB) is endowed with pristine biodiversity, socio-cultural heritage, and natural resources. The purpose of my study is to develop and apply an integrated water resource allocation framework for the MRB based on hydrological processes, water demand, and economic factors. The basin was partitioned into twelve sub-basins, and the rainfall-runoff process was modeled using the Soil and Water Assessment Tool (SWAT), with satisfactory Nash-Sutcliffe efficiencies of 0.68 for calibration and 0.43 for validation at the Mara Mines station. The impact and uncertainty of climate change on the hydrology of the MRB were assessed using SWAT and three scenarios of statistically downscaled outputs from twenty General Circulation Models. Results predicted the wet season getting wetter and the dry season getting drier, with a general increasing trend in annual rainfall through 2050. Three blocks of water demand (environmental, normal, and flood) were estimated from consumptive water use by humans, wildlife, livestock, tourism, irrigation, and industry. Water demand projections suggest that human consumption is expected to surpass irrigation as the highest water demand sector by 2030. Monthly water volumes were estimated at three blocks of current minimum reliability: reserve (>95%), normal (80-95%), and flood (40%, for more than 5 months in a year). The assessment of water price and marginal productivity showed that current water use hardly responds to a change in the price or productivity of water. Finally, a water allocation model was developed and applied to investigate the optimal monthly allocation among sectors and sub-basins by maximizing the use value and hydrological reliability of water. Model results demonstrated that the status of the reserve and normal volumes can be improved to 'low' or 'moderate' by updating the existing reliability to meet prevailing demand. Flow volumes and rates for four scenarios of reliability were presented.
Results showed that the water allocation framework can be used as a comprehensive tool in the management of the MRB and could possibly be extended to similar watersheds.
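A toy priority-based allocation over the reserve/normal/flood blocks might look like the following; the actual model optimizes use value and hydrological reliability jointly across sectors and sub-basins, so this greedy rule is only an illustration of the block structure.

```python
def allocate_blocks(available, demands):
    """Fill demand blocks in reliability order (reserve first, then
    normal, then the flood block) until the monthly volume runs out."""
    remaining, alloc = available, {}
    for block in ("reserve", "normal", "flood"):
        alloc[block] = min(demands[block], remaining)
        remaining -= alloc[block]
    return alloc
```

Under this rule, shortfalls always land on the lowest-reliability block first, which mirrors the idea of protecting the reserve (environmental) volume before serving normal and flood demands.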