29 results for "Consumption Predicting Model"
Abstract:
To promote regional or mutual improvement, numerous interjurisdictional efforts to share tax bases have been attempted. Most of these efforts were never consummated. Motivations to share revenues include narrowing fiscal disparities, enhancing regional cooperation and economic development, rationalizing land use, and minimizing revenue losses caused by competition to attract and keep businesses. Various researchers have developed theories to aid understanding of why interjurisdictional cooperation efforts succeed or fail. Walter Rosenbaum and Gladys Kammerer studied two contemporaneous Florida local-government consolidation attempts. Boyd Messinger subsequently tested their Theory of Successful Consolidation on nine consolidation attempts. Paul Peterson's dual theories on Modern Federalism posit that all levels of government attempt to further economic development and that politicians act in ways that either further their futures or cement job security. Actions related to the latter theory often interfere with the former. Samuel Nunn and Mark Rosentraub sought to learn how interjurisdictional cooperation evolves. Through multiple case studies they developed a model framing interjurisdictional cooperation in four dimensions.

This dissertation investigates the ability of the above theories to help predict the success or failure of regional tax-base revenue-sharing attempts. A research plan used five sequenced steps to gather data, analyze them, and determine whether hypotheses concerning the application of these theories were valid. The primary analytical tools were multiple case studies, cross-case analysis, and pattern matching. Data were gathered from historical records, questionnaires, and interviews.

The results of this research indicate that the Rosenbaum-Kammerer theory can predict success or failure in implementing tax-base revenue sharing if it is amended as suggested by Messinger and further modified by a recommendation in this dissertation. Peterson's Functional and Legislative theories, considered together, were able to predict revenue-sharing proposal outcomes. Many of the indicators of interjurisdictional cooperation forwarded in the Nunn-Rosentraub model appeared in the cases studied, but the model was not a reliable forecasting instrument.
Abstract:
The need for efficient, sustainable, and planned utilization of resources is ever more critical. In the U.S. alone, buildings consume 34.8 quadrillion (10^15) BTU of energy annually at a cost of $1.4 trillion. Of this energy, 58% is used for heating and air conditioning.

Several building energy analysis tools have been developed to assess energy demands and lifecycle energy costs in buildings. Such analyses are also essential for an efficient HVAC design that avoids the pitfalls of an under- or over-designed system. DOE-2 is among the most widely known full-building energy analysis models. It also constitutes the simulation engine of other prominent software such as eQUEST, EnergyPro, and PowerDOE. It is therefore essential that DOE-2 energy simulations be highly accurate.

Infiltration is an uncontrolled process through which outside air leaks into a building. Studies have estimated infiltration to account for up to 50% of a building's energy demand. Considered alongside the annual cost of building energy consumption, this reveals the cost of air infiltration and underscores the need for prominent building energy simulation engines to account accurately for its impact.

In this research, the relative accuracy of current air infiltration calculation methods is evaluated against an intricate multiphysics hygrothermal CFD analysis of the building envelope. The full-scale CFD analysis is based on a meticulous representation of cracking in building envelopes and on real-life conditions. The research found that even the most advanced current infiltration methods, including those in DOE-2, show up to 96.13% relative error versus the CFD analysis.

An Enhanced Model for Combined Heat and Air Infiltration Simulation was developed. The model yields a 91.6% improvement in relative accuracy over current models, reducing the error versus the CFD analysis to less than 4.5% while requiring less than 1% of the time needed for such a complex hygrothermal analysis. The algorithm used in the model was demonstrated to be easy to integrate into DOE-2 and other engines as a standalone method for evaluating infiltration heat loads. This can vastly increase the accuracy of such simulation engines while maintaining the speed and ease of use that make them so widely used in building design.
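The enhanced model itself is not reproduced in the abstract, but the basic quantity any infiltration method must estimate, the sensible heat load carried by leaking air, follows the standard formula Q = ρ · c_p · V̇ · ΔT. A minimal sketch with illustrative values:

```python
# Sensible heat load from air infiltration: Q = rho * c_p * V_dot * dT.
# Values are illustrative; the dissertation's enhanced model is not public.

RHO_AIR = 1.2    # kg/m^3, air density at roughly 20 C
CP_AIR = 1005.0  # J/(kg*K), specific heat capacity of air

def infiltration_load_w(flow_m3_s: float, delta_t_k: float) -> float:
    """Sensible infiltration heat load in watts."""
    return RHO_AIR * CP_AIR * flow_m3_s * delta_t_k

# Example: 0.05 m^3/s of leakage across a 20 K indoor-outdoor difference.
load = infiltration_load_w(0.05, 20.0)
print(round(load, 1))  # 1206.0 (watts)
```

An engine such as DOE-2 would evaluate this load hour by hour, with the flow rate itself varying with wind and stack pressures; the point of the enhanced model is to estimate that flow far more accurately than current crack-flow correlations.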
Abstract:
Colleges base their admission decisions on a number of factors to determine which applicants have the potential to succeed. This study utilized data for students who graduated from Florida International University between 2006 and 2012. Two models were developed (one using SAT and the other using ACT as the principal explanatory variable) to predict college success, measured as the student's college grade point average at graduation. Other predictors included high school performance, socioeconomic status, major, gender, and ethnicity. The model using ACT had a higher R^2, but the model using SAT had a lower mean square error. African Americans had a significantly lower college grade point average than graduates of other ethnicities. Females had a significantly higher college grade point average than males.
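The comparison described above, two OLS models judged by R^2 and mean squared error, can be sketched as follows. The data here are synthetic stand-ins, not the FIU records used in the study:

```python
# Hypothetical sketch: compare two OLS fits of college GPA, one on SAT
# and one on ACT, by R^2 and MSE. All data below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 200
sat = rng.normal(1100, 150, n)
act = rng.normal(24, 4, n)
gpa = 1.0 + 0.0015 * sat + 0.03 * act + rng.normal(0, 0.3, n)

def fit_ols(x, y):
    """Fit y = b0 + b1*x; return (R^2, MSE) of the fit."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var(), np.mean(resid ** 2)

r2_sat, mse_sat = fit_ols(sat, gpa)
r2_act, mse_act = fit_ols(act, gpa)
print(f"SAT model: R^2={r2_sat:.3f}, MSE={mse_sat:.3f}")
print(f"ACT model: R^2={r2_act:.3f}, MSE={mse_act:.3f}")
```

In the study the remaining predictors (high school performance, socioeconomic status, major, gender, ethnicity) would enter as additional columns of the design matrix.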
Abstract:
An awareness of mercury (Hg) contamination in aquatic environments around the world has increased over the past decade, mostly due to its ability to concentrate in the biota. Because the presence and distribution of Hg in aquatic systems depend on many factors (e.g., pe, pH, salinity, temperature, organic and inorganic ligands, and sorbents), it is crucial to understand its fate and transport in the presence of complexing constituents and natural sorbents under varying conditions. An improved understanding of the subject will support the selection of monitoring, remediation, and restoration technologies. The coupling of equilibrium chemical reactions with transport processes in the model PHREEQC offers an advantage in simulating and predicting the fate and transport of aqueous chemical species of interest, so a great variety of reactive transport problems can be addressed in aquatic systems with boundary conditions of specific interest. Nevertheless, PHREEQC lacks a comprehensive thermodynamic database for Hg. Therefore, in order to use PHREEQC to address the fate and transport of Hg in aquatic environments, it is necessary to expand its thermodynamic database, confirm it, and then evaluate it in applications where potential exists for its calibration and continued validation. The objectives of this study were twofold: 1) to develop, expand, and confirm the Hg database of the hydrogeochemical model PHREEQC to enhance its capability to simulate the fate of Hg species in the presence of complexing constituents and natural sorbents under different conditions of pH, redox, salinity, and temperature; and 2) to apply and evaluate the new database in flow and transport scenarios at two field test beds where Hg is present and of much concern: Oak Ridge Reservation, Oak Ridge, TN, and Everglades National Park, FL. Overall, this research enhanced the capability of the PHREEQC model to simulate the coupling of Hg reactions with transport conditions.
It also demonstrated its usefulness when applied to field situations.
Abstract:
The performance of building envelopes and roofing systems depends significantly on accurate knowledge of wind loads and of the response of envelope components under realistic wind conditions. Wind tunnel testing is a well-established practice for determining wind loads on structures. For small structures, much larger model scales are needed than for large structures in order to maintain modeling accuracy and minimize Reynolds number effects. In these circumstances, the ability to obtain a large enough turbulence integral scale is usually compromised by the limited dimensions of the wind tunnel, meaning that it is not possible to simulate the low-frequency end of the turbulence spectrum. Such flows are called flows with partial turbulence simulation.

In this dissertation, the test procedure and scaling requirements for tests with partial turbulence simulation are discussed. A theoretical method is proposed for including the effects of low-frequency turbulence in the post-test analysis. In this theory the turbulence spectrum is divided into two distinct statistical processes: one at high frequencies, which can be simulated in the wind tunnel, and one at low frequencies, which can be treated in a quasi-steady manner. The joint probability of load resulting from the two processes is derived, from which full-scale equivalent peak pressure coefficients can be obtained. The efficacy of the method is demonstrated by comparing predictions derived from tests on large-scale models of the Silsoe Cube and Texas Tech University buildings in the Wall of Wind facility at Florida International University with the available full-scale data.

For multi-layer building envelopes such as rain-screen walls, roof pavers, and vented energy-efficient walls, not only peak wind loads but also their spatial gradients are important. Wind-permeable roof claddings such as roof pavers are not well addressed in many existing building codes and standards. Large-scale experiments were carried out to investigate the wind loading on concrete pavers, including wind blow-off tests and pressure measurements. Simplified guidelines were developed for the design of loose-laid roof pavers against wind uplift. The guidelines are formatted so that use can be made of existing information in codes and standards such as ASCE 7-10 on pressure coefficients for components and cladding.
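The quasi-steady treatment of the low-frequency process can be sketched as follows; this is one common formulation of partial turbulence simulation, not necessarily the dissertation's exact derivation. If low-frequency gusts $U_{\mathrm{LF}}(t)$ modulate the dynamic pressure while the high-frequency pressure coefficient $C_{p,\mathrm{HF}}$ is measured in the tunnel, and the two processes are taken as statistically independent, then

```latex
C_p(t) = C_{p,\mathrm{HF}}(t)\left(\frac{U_{\mathrm{LF}}(t)}{\bar{U}}\right)^{2},
\qquad
F_{C_p}(c) = \int_{0}^{\infty} F_{\mathrm{HF}}\!\left(c\,\frac{\bar{U}^{2}}{u^{2}}\right)\, f_{U_{\mathrm{LF}}}(u)\,\mathrm{d}u ,
```

where $F_{\mathrm{HF}}$ is the distribution of the measured coefficient and $f_{U_{\mathrm{LF}}}$ the density of the low-frequency velocity. Full-scale equivalent peak pressure coefficients then follow from the combined distribution $F_{C_p}$.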
Abstract:
This study examined the association of theoretically guided and empirically identified psychosocial variables with the co-occurrence of risky sexual behavior and alcohol consumption among university students. The study utilized event analysis to determine whether risky sex occurred during the same event in which alcohol was consumed. Relevant conceptualizations included alcohol disinhibition, self-efficacy, and social network theories. Predictor variables included negative condom attitudes, general risk taking, drinking motives, mistrust, social group membership, and gender. Factor analysis was employed to identify dimensions of drinking motives. The measured risky sex behaviors were (a) sex without a condom, (b) sex with people not known very well, (c) sex with injecting drug users (IDUs), (d) sex with people without knowing whether they had an STD, and (e) sex while using drugs. A purposive sample of 222 male and female students was recruited from a major urban university. Chi-square analysis was used to determine whether participants were more likely to engage in risky sex behavior in different alcohol use contexts: only when drinking, only when not drinking, and whether drinking or not. The chi-square findings did not support the hypothesis that university students who combine alcohol with sex engage in riskier sex. These results added to the literature by extending similar findings to a university student sample. For each of the observed risky sex behaviors, discriminant analysis was used to determine whether the predictor variables could differentiate the drinking contexts, or whether the behavior occurred. Results from the discriminant analyses indicated that sex with people not known very well was the only behavior for which there were significant discriminant functions. Gender and enhancement drinking motives were important constructs in the classification model. Limitations of the study and implications for future research, social work practice, and policy are discussed.
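For readers unfamiliar with the technique, a two-group Fisher discriminant of the kind used to separate behavior contexts can be sketched in a few lines; the predictors and group structure below are synthetic stand-ins, not the study's data:

```python
# Minimal two-group Fisher discriminant analysis on synthetic data.
# Columns stand in for predictors such as an enhancement drinking
# motive score and gender; groups stand in for behavior present/absent.
import numpy as np

rng = np.random.default_rng(1)
g0 = rng.normal([0.0, 0.3], 1.0, size=(100, 2))  # behavior absent
g1 = rng.normal([1.2, 0.7], 1.0, size=(100, 2))  # behavior present

mu0, mu1 = g0.mean(axis=0), g1.mean(axis=0)
# Pooled within-group scatter matrix.
s_w = np.cov(g0.T) * (len(g0) - 1) + np.cov(g1.T) * (len(g1) - 1)
w = np.linalg.solve(s_w, mu1 - mu0)       # discriminant weights
threshold = w @ (mu0 + mu1) / 2           # midpoint decision rule

x = np.vstack([g0, g1])
y = np.array([0] * len(g0) + [1] * len(g1))
pred = (x @ w > threshold).astype(int)
accuracy = (pred == y).mean()
print(f"classification accuracy: {accuracy:.2f}")
```

The relative magnitudes of the entries of `w` play the role of the "important constructs" the study reports: predictors with larger standardized weights contribute more to separating the groups.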
Abstract:
Traffic incidents are a major source of traffic congestion on freeways. Freeway traffic diversion using pre-planned alternate routes has been used as a strategy to reduce traffic delays due to major incidents. However, it is not always beneficial to divert traffic when an incident occurs: route diversion may adversely impact traffic on the alternate routes and may not result in an overall benefit. This dissertation research applies Artificial Neural Network (ANN) and Support Vector Regression (SVR) techniques to predict the percent delay reduction from route diversion, to help determine whether traffic should be diverted under given conditions. The DYNASMART-P mesoscopic traffic simulation model was applied to generate the simulated data used to develop the ANN and SVR models. A sample network that comes with the DYNASMART-P package was used as the base simulation network. Combinations of different levels of incident duration, capacity loss, percent of drivers diverted, VMS (variable message sign) messaging duration, and network congestion were simulated to represent different incident scenarios. The resulting percent delay reduction, average speed, and queue length from each scenario were extracted from the simulation output. The ANN and SVR models were then calibrated for percent delay reduction as a function of all of the simulated input and output variables. The results show that both calibrated models, when applied to the same location used to generate the calibration data, were able to predict delay reduction with relatively high accuracy in terms of mean square error (MSE) and regression correlation, and that the performance of the ANN model was superior to that of the SVR model. When the models were applied to a new location, only the ANN model could produce comparatively good delay-reduction predictions under high network congestion levels.
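The modeling pipeline described above can be sketched with off-the-shelf regressors; the scenario features and delay-reduction target below are synthetic, not DYNASMART-P output:

```python
# Hypothetical sketch of the ANN-vs-SVR comparison: fit both regressors
# to synthetic "incident scenario" features and a percent delay-reduction
# target, then compare held-out mean squared error. Data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 400
# Columns stand in for incident duration, capacity loss,
# percent of drivers diverted, and network congestion level.
X = rng.uniform(0, 1, size=(n, 4))
y = 40 * X[:, 0] * X[:, 1] + 20 * X[:, 2] - 15 * X[:, 3] \
    + rng.normal(0, 2, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                   random_state=0).fit(X_tr, y_tr)
svr = SVR(C=10.0).fit(X_tr, y_tr)

mse_ann = mean_squared_error(y_te, ann.predict(X_te))
mse_svr = mean_squared_error(y_te, svr.predict(X_te))
print(f"ANN MSE: {mse_ann:.2f}, SVR MSE: {mse_svr:.2f}")
```

The dissertation's finding that the ANN transferred better to a new location corresponds to repeating the `mse_*` comparison on data generated from a different network rather than a held-out split of the same one.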
Abstract:
The purpose of this study is to produce a model to be used by state regulating agencies to assess demand for subacute care. In accomplishing this goal, the study refines the definition of subacute care, demonstrates a method for bed need assessment, and measures the effectiveness of this new level of care. This was the largest study of subacute care to date. Research focused on 19 subacute units in 16 states, each of which provides high-intensity rehabilitative and/or restorative care carried out in a high-tech unit. Each of the facilities was based in a nursing home, but utilized separate staff, equipment, and services. Because these facilities are under local control, it was possible to study regional differences in subacute care demand. Using this data, a model for predicting demand for subacute care services was created, building on earlier models submitted by John Whitman for the American Hospital Association and Robin E. MacStravic. The Broderick model uses the "bootstrapping" method and takes advantage of high technology: computers and software, databases in business and government, publicly available databases from providers or commercial vendors, professional organizations, and other information sources. Using newly available sources of information, this new model addresses the problems and needs of health care planners as they approach the challenges of the 21st century.
Abstract:
Research into the dynamicity of job performance criteria has found evidence suggesting the presence of rank-order changes to job performance scores across time as well as intraindividual trajectories in job performance scores across time. These findings have influenced a large body of research into (a) the dynamicity of validities of individual differences predictors of job performance and (b) the relationship between individual differences predictors of job performance and intraindividual trajectories of job performance. In the present dissertation, I addressed these issues within the context of the Five Factor Model of personality. The Five Factor Model is arranged hierarchically, with five broad higher-order factors subsuming a number of more narrowly tailored personality facets. Research has debated the relative merits of broad versus narrow traits for predicting job performance, but this body of research has addressed the issue from a static perspective -- by examining the relative magnitude of validities of global factors versus their facets. While research along these lines has been enlightening, theoretical perspectives suggest that the validities of global factors versus their facets may differ in their stability across time. Thus, research is needed not only to compare the relative magnitude of validities of global factors versus their facets at a single point in time, but also to compare the relative stability of those validities across time. Research into broad versus narrow traits as predictors of intraindividual performance trajectories is also necessary to advance cumulative knowledge of such trajectories. In the present dissertation, I addressed these issues using a four-year longitudinal design. The results indicated that the validities of global conscientiousness were stable across time, while the validities of conscientiousness facets were more likely to fluctuate.
However, the validities of emotional stability and extraversion facets were no more likely to fluctuate across time than those of the factors. Finally, while some personality factors and facets predicted performance intercepts (i.e., performance at the first measurement occasion), my results failed to indicate a significant effect of any personality variable on performance growth. Implications for research and practice are discussed.
Abstract:
The purpose of this dissertation is to examine three distributional issues in macroeconomics. First, I explore the effects of fiscal federalism on economic growth across regions in China. Using the comprehensive official data set covering 31 regions of China from 1952 to 1999, I investigate a number of indicators used in the literature to measure federalism and find robust support for only one such measure: the ratio of local total revenue to local tax revenue. Using a difference-in-differences approach and exploiting the two-year gap in the implementation of a tax reform across different regions of China, I also identify a positive relationship between fiscal federalism and regional economic growth. The second paper hypothesizes that an inequitable distribution of income negatively affects the rule of law in resource-rich economies and provides robust evidence in support of this hypothesis. Investigating a data set that covers 193 countries and using econometric methodologies such as the fixed-effects estimator and the generalized method of moments estimator, I find that resource abundance improves the quality of institutions as long as income and wealth disparity remains below a certain threshold. When inequality moves beyond this threshold, the positive effects of resource abundance on institutions diminish quickly and eventually turn negative. This paper thus provides robust evidence on the endogeneity of institutions and the role income and wealth inequality plays in the determination of long-run growth rates. The third paper sets up a dynamic general equilibrium model with heterogeneous agents to investigate the causal channels running from a concern for international status to long-run economic growth. The simulation results show that the initial distribution of income and wealth plays an important role in whether agents gain or lose from globalization.
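The difference-in-differences identification used in the first paper can be sketched on simulated data; the regions, timing, and effect size below are illustrative, not the Chinese regional panel:

```python
# Minimal difference-in-differences sketch: regions treated by a reform
# at different times, with the treatment effect read off the interaction
# coefficient of an OLS regression. All data are simulated.
import numpy as np

rng = np.random.default_rng(2)
n = 1000
treated = rng.integers(0, 2, n)   # region received the reform early
post = rng.integers(0, 2, n)      # observation falls after the reform
effect = 1.5                      # true treatment effect on growth
growth = 2.0 + 0.5 * treated + 0.8 * post \
         + effect * treated * post + rng.normal(0, 1, n)

# Design matrix: intercept, treated, post, treated x post.
X = np.column_stack([np.ones(n), treated, post, treated * post])
beta, *_ = np.linalg.lstsq(X, growth, rcond=None)
print(f"estimated DiD effect: {beta[3]:.2f}")  # should land near 1.5
```

The two-year implementation gap in the dissertation plays the role of the `post` indicator here: late-adopting regions serve as controls for early adopters during the gap.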
Abstract:
Vehicle fuel consumption and emissions are two important effectiveness measures of sustainable transportation development. Pavement plays an essential role in the goals of fuel economy improvement and greenhouse gas (GHG) emission reduction. The main objective of this dissertation is to experimentally investigate the effect of pavement-vehicle interaction (PVI) on vehicle fuel consumption under highway driving conditions, in order to provide a better understanding of the role of pavement in green transportation initiatives. Four study phases were carried out. The first phase was a preliminary field investigation to detect fuel consumption differences between paired flexible-rigid pavement sections with repeat measurements. The second phase continued the field investigation with a more detailed and comprehensive experimental design and independently investigated the effect of pavement type on vehicle fuel consumption. The third phase calibrated the HDM-IV fuel consumption model with data collected in the second field phase, in order to understand mechanistically how pavement deflection affects vehicle fuel consumption. The last phase applied the calibrated HDM-IV model to Florida's interstate network and estimated total annual fuel consumption and CO2 emissions under different scenarios; the potential annual fuel savings and emission reductions were derived from these estimates. Statistical results from the two field studies both show fuel savings on rigid pavement compared to flexible pavement under the test conditions specified. The savings derived from the first phase are 2.50% for the passenger car at 112 km/h and 4.04% for the 18-wheel tractor-trailer at 93 km/h. The savings from the second phase are 2.25% and 2.22% for the passenger car at 93 km/h and 112 km/h, and 3.57% and 3.15% for the 6-wheel medium-duty truck at 89 km/h and 105 km/h. All savings are statistically significant at the 95% confidence level.

From the calibrated HDM-IV model, one unit of pavement deflection (1 mm) on flexible pavement causes excess fuel consumption of 0.234-0.311 L/100 km for the passenger car and 1.123-1.277 L/100 km for the truck; the effect is more evident at lower highway speeds than at higher ones. From the network-level estimation, approximately 40 million gallons of fuel (gasoline and diesel combined) and 0.39 million tons of CO2 emissions could be saved annually if all of Florida's flexible interstate pavement were converted to rigid pavement at the same roughness levels. Each mile of flexible-to-rigid conversion can yield a reduction of 29 thousand gallons of fuel and 258 tons of CO2 emissions per year.
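The per-deflection figures quoted above lend themselves to a back-of-the-envelope calculation; the trip length below is an illustrative assumption, not a number from the study:

```python
# Excess fuel per vehicle-trip implied by the abstract's calibrated
# ranges for 1 mm of flexible-pavement deflection. The 150 km trip
# length is an illustrative assumption.
CAR_EXCESS_L_PER_100KM = (0.234, 0.311)    # per 1 mm deflection
TRUCK_EXCESS_L_PER_100KM = (1.123, 1.277)  # per 1 mm deflection

def excess_litres(rate_range, trip_km, deflection_mm=1.0):
    """Low/high excess fuel (litres) for a trip on deflecting pavement."""
    lo, hi = rate_range
    return (lo * trip_km * deflection_mm / 100,
            hi * trip_km * deflection_mm / 100)

# A 150 km interstate trip on pavement deflecting 1 mm:
car = excess_litres(CAR_EXCESS_L_PER_100KM, 150)
truck = excess_litres(TRUCK_EXCESS_L_PER_100KM, 150)
print(f"car:   {car[0]:.3f}-{car[1]:.4f} L extra")
print(f"truck: {truck[0]:.4f}-{truck[1]:.4f} L extra")
```

Scaled over vehicle-miles travelled on a network, small per-trip quantities like these accumulate into the tens of millions of gallons the abstract reports.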
Abstract:
Adaptability and invisibility are hallmarks of modern terrorism, and keeping pace with its dynamic nature presents a serious challenge for societies throughout the world. Innovations in computer science have incorporated applied mathematics to develop a wide array of predictive models supporting the variety of approaches to counterterrorism. Predictive models are usually designed to forecast the location of attacks. Although this may protect individual structures or locations, it does not reduce the threat; it merely changes the target. While predictive models dedicated to events or social relationships receive much attention where the mathematical and social science communities intersect, models dedicated to terrorist locations such as safe-houses (rather than their targets or training sites) are rare and possibly nonexistent. At the time of this research, there were no publicly available models designed to predict locations where violent extremists are likely to reside. This research uses France as a case study to present a complex-systems model that incorporates multiple quantitative, qualitative, and geospatial variables that differ in scale, weight, and type. Though many of these variables are recognized by specialists in security studies, controversy remains with respect to their relative importance, degree of interaction, and interdependence. Additionally, some of the variables proposed in this research are not generally recognized as drivers, yet they warrant examination based on their potential role within a complex system. This research tested multiple regression models and determined that geographically-weighted regression produced the most accurate results because it accommodates non-stationary coefficient behavior, demonstrating that geographic variables are critical to understanding and predicting the phenomenon of terrorism.
This dissertation presents a flexible prototypical model that can be refined and applied to other regions to inform stakeholders such as policy-makers and law enforcement in their efforts to improve national security and enhance quality-of-life.
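A geographically-weighted regression of the kind the dissertation found most accurate can be sketched with a Gaussian distance kernel and locally weighted least squares; the coordinates, covariate, and outcome below are synthetic, not the France data:

```python
# Compact geographically-weighted regression (GWR) sketch: each query
# location gets its own OLS fit, with observations down-weighted by
# distance through a Gaussian kernel. All data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n = 300
coords = rng.uniform(0, 10, size=(n, 2))  # spatial coordinates
x = rng.normal(0, 1, n)                   # one covariate
# The true coefficient drifts with longitude: non-stationary behavior.
beta_true = 1.0 + 0.3 * coords[:, 0]
y = 2.0 + beta_true * x + rng.normal(0, 0.5, n)

def gwr_slope(at, bandwidth=2.0):
    """Local slope at a query point via kernel-weighted least squares."""
    d2 = ((coords - at) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))  # Gaussian kernel weights
    X = np.column_stack([np.ones(n), x])
    XtW = X.T * w                           # equivalent to X.T @ diag(w)
    beta = np.linalg.solve(XtW @ X, XtW @ y)
    return beta[1]

west = gwr_slope(np.array([1.0, 5.0]))  # local slope, smaller in the west
east = gwr_slope(np.array([9.0, 5.0]))  # local slope, larger in the east
print(f"local slope west: {west:.2f}, east: {east:.2f}")
```

A global OLS fit would average these two slopes into a single coefficient; the spatial variation the local fits recover is exactly the "non-stationary coefficient behavior" the dissertation cites as the reason GWR outperformed the alternatives.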