17 results for Two variable oregonator model
Abstract:
Groundwater systems of different densities are often mathematically modeled to understand and predict environmental behavior such as seawater intrusion or submarine groundwater discharge. Additional data collection may be justified if it will cost-effectively aid in reducing the uncertainty of a model's prediction. Collecting salinity as well as temperature data could aid in reducing predictive uncertainty in a variable-density model. However, before numerical models can be created, rigorous testing of the modeling code needs to be completed. This research documents the benchmark testing of a new modeling code, SEAWAT Version 4. The benchmark problems include various combinations of density-dependent flow resulting from variations in concentration and temperature. The verified code, SEAWAT, was then applied to two different hydrological analyses to explore the capacity of a variable-density model to guide data collection.

The first analysis tested a linear method to guide data collection by quantifying the contribution of different data types and locations toward reducing predictive uncertainty in a nonlinear variable-density flow and transport model. The relative contributions of temperature and concentration measurements, at different locations within a simulated carbonate platform, for predicting movement of the saltwater interface were assessed. Results from the method showed that concentration data had greater worth than temperature data in reducing predictive uncertainty in this case. Results also indicated that a linear method could be used to quantify data worth in a nonlinear model.

The second hydrological analysis utilized a model to identify the transient response of the salinity, temperature, age, and amount of submarine groundwater discharge to changes in tidal ocean stage, seasonal temperature variations, and different types of geology. The model was compared to multiple kinds of data to (1) calibrate and verify the model, and (2) explore the potential for the model to be used to guide the collection of data using techniques such as electromagnetic resistivity, thermal imagery, and seepage meters. Results indicated that the model can give insight into submarine groundwater discharge and can be used to guide data collection.
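The abstract does not spell out the linear data-worth method; a minimal first-order second-moment (FOSM) sketch of the general idea, written in Python with every matrix and value hypothetical, might look like this:

    import numpy as np

    # Hypothetical FOSM data-worth sketch: the worth of a candidate observation
    # is the reduction it produces in the variance of a model prediction.
    # J rows = observations, columns = model parameters.
    def predictive_variance(J, obs_var, C_prior, g):
        """Posterior variance of the prediction g'p given observations J."""
        R_inv = np.diag(1.0 / obs_var)          # observation weights
        C_post = np.linalg.inv(J.T @ R_inv @ J + np.linalg.inv(C_prior))
        return g @ C_post @ g

    rng = np.random.default_rng(0)
    J_base = rng.normal(size=(8, 4))   # sensitivities of existing data
    g = rng.normal(size=4)             # sensitivity of the prediction
    C_prior = np.eye(4)
    var_base = predictive_variance(J_base, np.full(8, 0.1), C_prior, g)

    # Candidate new observation, e.g. a concentration or temperature reading:
    j_new = rng.normal(size=(1, 4))
    var_new = predictive_variance(np.vstack([J_base, j_new]),
                                  np.full(9, 0.1), C_prior, g)
    print(f"uncertainty reduction: {100 * (1 - var_new / var_base):.1f}%")

Repeating the calculation for each candidate data type and location ranks their worth, which is the kind of ranking the first analysis reports.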
Abstract:
The standard highway assignment model in the Florida Standard Urban Transportation Modeling Structure (FSUTMS) is based on the equilibrium traffic assignment method. This method involves running several iterations of all-or-nothing capacity-restraint assignment with an adjustment of travel time to reflect delays encountered in the associated iteration. The iterative link time adjustment process is accomplished through the Bureau of Public Roads (BPR) volume-delay equation. Since FSUTMS' traffic assignment procedure outputs daily volumes and the input capacities are given in hourly volumes, it is necessary to convert the hourly capacities to their daily equivalents when computing the volume-to-capacity ratios used in the BPR function. The conversion is accomplished by dividing the hourly capacity by a factor called the peak-to-daily ratio, referred to as CONFAC in FSUTMS. The ratio is computed as the highest hourly volume of a day divided by the corresponding total daily volume.

While several studies have indicated that CONFAC is a decreasing function of the level of congestion, a constant value is used for each facility type in the current version of FSUTMS. This ignores the different congestion level associated with each roadway and is believed to be one of the culprits of traffic assignment errors. Traffic count data from across the state of Florida were used to calibrate CONFACs as a function of a congestion measure using the weighted least squares method. The calibrated functions were then implemented in FSUTMS through a procedure that takes advantage of the iterative nature of FSUTMS' equilibrium assignment method.

The assignment results based on constant and variable CONFACs were then compared against the ground counts for three selected networks. It was found that the accuracy of the two assignments was not significantly different and that the hypothesized improvement in assignment results from the variable CONFAC model was not empirically evident. It was recognized that many other factors beyond the scope and control of this study could contribute to this finding. It was recommended that further studies focus on the use of the variable CONFAC model with recalibrated parameters for the BPR function and/or with other forms of volume-delay functions.
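The BPR relation has the standard form t = t0[1 + alpha(v/c)^beta]; the Python sketch below uses the common default coefficients alpha = 0.15 and beta = 4, which are assumptions for illustration rather than values from the study:

    # Sketch of the BPR volume-delay function with CONFAC capacity conversion.
    def bpr_travel_time(free_flow_time, daily_volume, hourly_capacity,
                        confac, alpha=0.15, beta=4.0):
        daily_capacity = hourly_capacity / confac   # CONFAC = peak-hour/daily ratio
        v_over_c = daily_volume / daily_capacity
        return free_flow_time * (1.0 + alpha * v_over_c ** beta)

    # Example: 10 min free-flow time, 2,000 veh/h capacity, CONFAC of 0.09.
    print(bpr_travel_time(10.0, 20000, 2000, 0.09))   # about 11 minutes

Making confac a function of the congestion measure instead of a constant is, in essence, the variable-CONFAC model the study tested.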
Abstract:
This study examined the predictive merits of selected cognitive and noncognitive variables on the national Registry exam pass rate using 2008 graduates (n = 175) from community college radiography programs in Florida. The independent variables included two GPAs, final grades in five radiography courses, self-efficacy, and social support. The dependent variable was the first-attempt result on the national Registry exam. The design was a retrospective predictive study that relied on academic data collected from participants using the self-report method and on perceptions of students' success on the national Registry exam collected through a questionnaire developed and piloted in the study. All independent variables except self-efficacy and social support correlated with success on the national Registry exam (p < .01) using the Pearson Product-Moment Correlation analysis. The strongest predictor of national Registry exam success was the end-of-program GPA, r = .550, p < .001. The GPAs and scores for self-efficacy and social support were entered into a logistic regression analysis to produce a prediction model. The end-of-program GPA (p = .015) emerged as a significant variable. This model predicted 44% of the students who failed the national Registry exam and 97.3% of those who passed, explaining 45.8% of the variance. A second model included the final grades for the radiography courses, self-efficacy, and social support. Three courses significantly predicted national Registry exam success: Radiographic Exposures, p < .001; Radiologic Physics, p = .014; and Radiation Safety & Protection, p = .044, explaining 56.8% of the variance. This model predicted 64% of the students who failed the national Registry exam and 96% of those who passed. The findings support the use of in-program data as accurate predictors of success on the national Registry exam.
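As a hedged illustration of the kind of logistic-regression prediction model described, with variable names and data entirely hypothetical:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical sketch: predict first-attempt Registry pass/fail from
    # end-of-program GPA, self-efficacy, and social support scores.
    rng = np.random.default_rng(1)
    n = 175
    gpa = rng.uniform(2.0, 4.0, n)
    self_efficacy = rng.normal(0.0, 1.0, n)
    social_support = rng.normal(0.0, 1.0, n)
    # Simulated outcome: pass probability driven mainly by GPA, echoing the
    # study's finding that end-of-program GPA was the strongest predictor.
    passed = (rng.random(n) < 1 / (1 + np.exp(-3.0 * (gpa - 3.0)))).astype(int)

    X = np.column_stack([gpa, self_efficacy, social_support])
    model = LogisticRegression().fit(X, passed)
    print(model.coef_, model.score(X, passed))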
Abstract:
Personality has long been linked to performance. Evolutions in this relationship have brought forward new questions regarding the true nature of how personality impacts performance. Both direct and indirect relationships have been shown to be significant. This study further investigated potential indirect relationships by including a mediating variable, mental model formation, in the personality-performance relationship. Undergraduate students were assessed over a 6-week period in a Time 1 to Time 2 experiment. Conceptualizations of personality included measures of the Big 5 model and self-efficacy, with performance measured by content quiz and overall course scores. Findings showed that the Big 5 personality traits extraversion and agreeableness positively and significantly impacted commonality with the instructor's mental model. However, commonality with the instructor's mental model did not impact performance. In comparison, commonality with an expert mental model positively and significantly impacted performance for both the content quiz and overall course score. Furthermore, similarity with an expert mental model positively and significantly impacted overall course performance. Hypothesized full mediation of mental model formation for the personality-performance relationship was not supported due to a lack of the direct effect relationships required for mediation. However, a revised conceptualization of the results emerged. Findings from the current study point to the novel and unique role mental models play in the personality-performance relationship. While personality traits do impact mental model formation, accuracy in the mental models formed is critical to performance.
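A minimal sketch of the implied mediation test (personality -> mental-model commonality -> performance), using ordinary least squares on fully hypothetical data:

    import numpy as np
    import statsmodels.api as sm

    # Hypothetical mediation sketch: X = extraversion, M = mental-model
    # commonality, Y = course performance (classic three-regression approach).
    rng = np.random.default_rng(2)
    n = 200
    X = rng.normal(size=n)
    M = 0.4 * X + rng.normal(size=n)   # personality shapes the mental model
    Y = 0.5 * M + rng.normal(size=n)   # the mental model drives performance

    path_c = sm.OLS(Y, sm.add_constant(X)).fit()    # total effect of X on Y
    path_a = sm.OLS(M, sm.add_constant(X)).fit()    # X -> M
    path_b = sm.OLS(Y, sm.add_constant(np.column_stack([X, M]))).fit()  # X, M -> Y
    print(path_c.params, path_a.params, path_b.params)

Mediation requires a significant total effect (path c); the absence of such direct effects is exactly why the hypothesized full mediation was not supported.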
Abstract:
We estimated trophic position and carbon source for three consumers (Florida gar, Lepisosteus platyrhincus; eastern mosquitofish, Gambusia holbrooki; and riverine grass shrimp, Palaemonetes paludosus) from 20 sites representing gradients of productivity and hydrological disturbance in the southern Florida Everglades, U.S.A. We characterized gross primary productivity at each site using light/dark bottle incubation and stem density of emergent vascular plants. We also documented nutrient availability as total phosphorus (TP) in floc and periphyton, and the density of small fishes. Hydrological disturbance was characterized as the time since a site last dried and the average number of days per year the sites were inundated over the previous 10 years. Food-web attributes were estimated in both the wet and dry seasons by analysis of δ15N (trophic position) and δ13C (food-web carbon source) from 702 samples of aquatic consumers. An index of carbon source was derived from a two-member mixing model with Seminole ramshorn snails (Planorbella duryi) as a basal grazing consumer and scuds (amphipods, Hyalella azteca) as a basal detritivore. Snails yielded carbon isotopic values similar to green algae and diatoms, while carbon values of scuds were similar to bulk periphyton and floc; carbon isotopic values of cyanobacteria were enriched in 13C compared to all consumers examined. A carbon source similar to scuds dominated at all but one study site, and though the relative contribution of scud-like and snail-like carbon sources was variable, there was no evidence that these contributions were a function of abiotic factors or season. Gar consistently displayed the highest estimated trophic position of the consumers studied, with mosquitofish feeding at a slightly lower level and grass shrimp feeding at the lowest level. Trophic position was not correlated with any nutrient or productivity parameter, but did increase for grass shrimp and mosquitofish as the time following droughts increased. Trophic position of Florida gar was positively correlated with emergent plant stem density.
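The two-member mixing model and the δ15N trophic-position calculation take a standard form; the Python sketch below is a generic version with hypothetical isotope values, assuming the commonly used 3.4 per mil enrichment per trophic level:

    # Sketch of a two-member carbon-source mixing model and d15N trophic
    # position. End-member and enrichment values are hypothetical.
    def carbon_source_fraction(d13c_consumer, d13c_snail, d13c_scud):
        """Fraction of carbon derived from the snail-like (grazing) pathway."""
        return (d13c_consumer - d13c_scud) / (d13c_snail - d13c_scud)

    def trophic_position(d15n_consumer, d15n_base, base_tp=2.0, enrichment=3.4):
        """Trophic position relative to a basal consumer such as a snail or scud."""
        return base_tp + (d15n_consumer - d15n_base) / enrichment

    print(carbon_source_fraction(-28.0, -30.0, -26.0))   # 0.5: equal contribution
    print(trophic_position(10.2, 4.0))                   # about 3.8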
Abstract:
The rate of fatal crashes in Florida has remained significantly higher than the national average for the last several years. The 2003 statistics from the National Highway Traffic Safety Administration (NHTSA), the latest available, show a fatality rate in Florida of 1.71 per 100 million vehicle-miles traveled, compared to the national average of 1.48 per 100 million vehicle-miles traveled. The objective of this research is to better understand the driver, environmental, and roadway factors that affect the probability of injury severity in Florida.

In this research, the ordered logit model was used to develop six injury severity models: single-vehicle and two-vehicle crashes on urban freeways and urban principal arterials, and two-vehicle crashes at urban signalized and unsignalized intersections. The data used in this research included all crashes that occurred on the state highway system from 2001 to 2003 in the Southeast Florida region, which includes Miami-Dade, Broward, and Palm Beach Counties.

The results of the analysis indicate that the age group and gender of the driver at fault were significant factors of injury severity risk across all models. The greatest risk of severe injury was observed for the age groups 55 to 65 and 66 and older. A positive association between injury severity and the race of the driver at fault was also found. An at-fault driver of Hispanic origin was associated with a higher risk of severe injury for both freeway models and for the two-vehicle crash model on arterial roads. A higher risk of more severe injury was also found when an African-American driver was at fault in two-vehicle crashes on freeways. In addition, the arterial class was found to be positively associated with a higher risk of severe crashes. Six-lane divided arterials exhibited the highest injury severity risk of all arterial classes, while the lowest severe injury risk was found for one-way roads. Alcohol involvement by the driver at fault was also found to be a significant risk factor for severe injury in the single-vehicle crash model on freeways.
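As a hedged sketch of fitting an ordered logit injury-severity model on hypothetical data (statsmodels' OrderedModel is one common implementation):

    import numpy as np
    import pandas as pd
    from statsmodels.miscmodels.ordinal_model import OrderedModel

    # Hypothetical ordered logit sketch. severity: 0 = no injury through
    # 4 = fatal; predictors are illustrative, not the study's full set.
    rng = np.random.default_rng(3)
    n = 500
    df = pd.DataFrame({
        "driver_age": rng.integers(16, 90, n),
        "alcohol": rng.integers(0, 2, n),
    })
    latent = 0.02 * df["driver_age"] + 0.8 * df["alcohol"] + rng.logistic(size=n)
    df["severity"] = pd.cut(latent, bins=[-np.inf, 0, 1, 2, 3, np.inf], labels=False)

    model = OrderedModel(df["severity"], df[["driver_age", "alcohol"]],
                         distr="logit").fit(method="bfgs", disp=False)
    print(model.summary())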
Abstract:
Geochemical and geophysical approaches have been used to investigate the freshwater and saltwater dynamics in the coastal Biscayne Aquifer and Biscayne Bay. Stable isotopes of oxygen and hydrogen, and concentrations of Sr²⁺ and Ca²⁺, were combined in two geochemical mixing models to provide estimates of the various freshwater inputs (precipitation, canal water, and groundwater) to Biscayne Bay and the coastal canal system in South Florida. Shallow geophysical electromagnetic and direct current resistivity surveys were used to image the geometry and stratification of the saltwater mixing zone in the near-coastal (less than 1 km inland) Biscayne Aquifer. The combined stable isotope and trace metal models suggest a canal input-precipitation-groundwater ratio of 38%–52%–10% in the wet season and 37%–58%–5% in the dry season, with an error of 25%, where most (20%) of the error was attributed to the isotope regression model and the remaining 5% to the Sr²⁺/Ca²⁺ mixing model. These models suggest rainfall is the dominant source of freshwater to Biscayne Bay. For a bay-wide water budget that includes saltwater and freshwater mixing, fresh groundwater accounts for less than 2% of the total input. A similar Sr²⁺/Ca²⁺ tracer model indicates precipitation is the dominant source in 9 out of 10 canals that discharge into Biscayne Bay. The two-component mixing model converged for 100% of the freshwater canal samples in this study, with 63% of the water contributed to the canals coming from precipitation and 37% from groundwater inputs (±4%). There was a seasonal shift from 63% precipitation input in the dry season to 55% precipitation input in the wet season. The three end-member mixing model converged for only 60% of the saline canal samples, possibly due to non-conservative behavior of Sr²⁺ and Ca²⁺ in saline groundwater discharging into the canal system. Electromagnetic and direct current resistivity surveys were successful at locating and estimating the geometry and depth of the freshwater/saltwater interface in the Biscayne Aquifer at two near-coastal sites. A saltwater interface that deepened as the survey moved inland was detected, with a maximum interpreted depth to the interface of 15 meters approximately 0.33 km inland from the shoreline.
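A three end-member mixing model of this kind reduces to a small linear system; the sketch below is generic Python with hypothetical end-member tracer values:

    import numpy as np

    # Sketch of a three end-member mixing model: solve for the fractions of
    # precipitation, canal water, and groundwater that reproduce a sample's
    # d18O and Sr/Ca signatures. All tracer values here are hypothetical.
    end_members = np.array([
        [1.0,  1.0, 1.0],   # mass balance: fractions sum to 1
        [-3.0, 1.5, 2.5],   # d18O of each end member (per mil)
        [0.8,  2.0, 6.0],   # Sr/Ca of each end member (mmol/mol)
    ])
    sample = np.array([1.0, 0.2, 2.1])   # observed: 1, d18O, Sr/Ca

    fractions = np.linalg.solve(end_members, sample)
    print(dict(zip(["precip", "canal", "ground"], fractions.round(3))))

Non-convergence of the kind reported for the saline samples corresponds to solutions with fractions outside [0, 1], which is how non-conservative tracer behavior shows up in practice.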
Abstract:
This dissertation consists of three essays on different aspects of water management. The first essay focuses on the sustainability of freshwater use by introducing the notion that altruistic parents bequeath economic assets to their offspring. A two-period overlapping-generations model is constructed to determine an optimal ratio of consumption and pollution for the old and young generations in each period. Optimal levels of water consumption and pollution change according to parameters such as the degree of altruism, the natural recharge rate, and population growth. The second essay concerns water sharing between countries in the case of trans-boundary river basins. The paper recognizes that side payments fail to forge water-sharing agreements among the international community and that downstream countries have weak bargaining power. An interconnected game approach is developed by linking the water allocation issue with non-water issues such as trade or border security, creating symmetry between countries in bargaining power. An interconnected game forces two countries to at least partially cooperate under some circumstances. The third essay introduces the concept of virtual water (VW) into a traditional international trade model in order to estimate water savings for a water-scarce country. A two-country, two-product, two-factor trade model is developed, which includes not only consumer and producer surplus but also the environmental externality of water use. The model shows that VW trade saves water and increases global and local welfare. This study should help policymakers design appropriate subsidy or tax policies to promote water savings, especially in water-scarce countries.
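The essay's functional forms are not given in the abstract; a stylized Python sketch of one generation's problem in a two-period OLG model with a bequest motive, with logarithmic utility and all parameter values assumed for illustration (pollution dynamics omitted), could read:

    import numpy as np
    from scipy.optimize import minimize

    # Stylized OLG sketch: one generation splits its resources between
    # consumption when young, consumption when old, and a bequest, with the
    # weight gamma standing in for the degree of altruism toward offspring.
    beta, gamma = 0.95, 0.5   # discount factor, altruism weight (assumed)
    endowment = 10.0          # resources available to the generation

    def neg_utility(x):
        c_young, c_old, bequest = x
        return -(np.log(c_young) + beta * np.log(c_old) + gamma * np.log(bequest))

    budget = {"type": "eq", "fun": lambda x: endowment - x.sum()}
    res = minimize(neg_utility, x0=[3.0, 3.0, 4.0], constraints=budget,
                   bounds=[(1e-6, None)] * 3)
    print(res.x)   # optimal young/old consumption and bequest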
Abstract:
The purpose of this project was to evaluate the use of remote sensing 1) to detect and map Everglades wetland plant communities at different scales; and 2) to compare map products delineated and resampled at various scales, with the intent to quantify and describe the quantitative and qualitative differences between such products. We evaluated data provided by DigitalGlobe's WorldView-2 (WV2) sensor, with a spatial resolution of 2 m, and data from Landsat's Thematic Mapper and Enhanced Thematic Mapper (TM and ETM+) sensors, with a spatial resolution of 30 m. We were also interested in the comparability and scalability of products derived from these data sources. The adequacy of each data set for mapping wetland plant communities was evaluated using two metrics: 1) model-based accuracy estimates of the classification procedures; and 2) design-based post-classification accuracy estimates of the derived maps.
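Design-based post-classification accuracy is conventionally summarized with a confusion (error) matrix; a minimal Python sketch with hypothetical class names and counts:

    import numpy as np

    # Sketch of design-based accuracy assessment from a confusion matrix.
    # Rows = map classes, columns = reference classes (hypothetical counts).
    confusion = np.array([
        [45,  3,  2],   # e.g. sawgrass
        [ 4, 38,  5],   # e.g. cattail
        [ 1,  6, 40],   # e.g. open water / slough
    ])
    n = confusion.sum()
    overall = np.trace(confusion) / n                        # overall accuracy
    producers = np.diag(confusion) / confusion.sum(axis=0)   # omission view
    users = np.diag(confusion) / confusion.sum(axis=1)       # commission view
    pe = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / n**2
    kappa = (overall - pe) / (1 - pe)                        # chance-corrected
    print(overall, producers.round(2), users.round(2), round(kappa, 3))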
Abstract:
The purpose of this descriptive study was to evaluate the banking and insurance technology curriculum at ten junior colleges in Taiwan. The study focused on curriculum, curriculum materials, instruction, support services, student achievement, and job performance. Data were collected from a diverse sample of faculty, students, alumni, and employers.

Questionnaires on the evaluation of curriculum at technical junior colleges were developed for use in this specific case. Data were collected from the sample described above and analyzed using ANOVA, t-tests, and cross-tabulations. Findings are presented which indicate that there is room for improvement in terms of meeting individual students' needs.

Using Stufflebeam's CIPP model for curriculum evaluation, it was determined that the curriculum was adequate in terms of the knowledge and skills imparted to students. However, students were dissatisfied with the rigidity of the curriculum and the lack of opportunity to satisfy their individual needs. Employers were satisfied with both the academic preparation of students and their on-the-job performance.

In sum, the curriculum of the two-year banking and insurance technology programs of junior colleges in Taiwan was shown to have adequately prepared a workforce to enter business. It is now time to look toward the future and adapt the curriculum and instruction to the needs of an ever-evolving high-tech society.
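As a small, purely illustrative sketch of the ANOVA comparison mentioned (all data hypothetical), scipy provides the one-way test directly:

    import numpy as np
    from scipy import stats

    # Hypothetical sketch: compare curriculum-satisfaction ratings across
    # three respondent groups with a one-way ANOVA.
    rng = np.random.default_rng(4)
    faculty = rng.normal(4.0, 0.5, 30)
    students = rng.normal(3.4, 0.6, 120)
    alumni = rng.normal(3.7, 0.6, 60)

    f_stat, p_value = stats.f_oneway(faculty, students, alumni)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")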
Abstract:
The main objective of this work is to develop a quasi-three-dimensional numerical model to simulate stony debris flows, considering a continuum fluid phase composed of water and fine sediments, and a non-continuum phase including large particles such as pebbles and boulders. Large particles are treated in a Lagrangian frame of reference using the Discrete Element Method, while the fluid phase is based on the Eulerian approach, using the Finite Element Method to solve the depth-averaged Navier-Stokes equations in two horizontal dimensions. The particles' equations of motion are solved in three dimensions. The model simulates particle-particle collisions and wall-particle collisions, taking into account that the particles are immersed in a fluid. Bingham and Cross rheological models are used for the continuum phase. Both formulations provide very stable results, even in the range of very low shear rates, and the Bingham formulation is better able to simulate the stopping stage of the fluid when the applied shear stresses are low. Results of numerical simulations have been compared with data from laboratory experiments on a flume-fan prototype. Results show that the model is capable of simulating the motion of large particles moving in the fluid flow, handling dense particulate flows, and avoiding overlap among particles. An application to simulate the debris flow events that occurred in Northern Venezuela in 1999 shows that the model can replicate the main boulder accumulation areas that were surveyed by the USGS. The uniqueness of this research is the integration of mud flow and stony debris movement in a single modeling tool that can be used for planning and management of debris flow prone areas.
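The Bingham and Cross laws have standard closed forms; below is a Python sketch of the effective viscosity under each, using a Papanastasiou-style regularization of the Bingham model to stay finite at low shear rates (all parameter values hypothetical):

    import numpy as np

    # Effective viscosity as a function of shear rate for the two rheologies.
    def bingham_viscosity(shear_rate, tau_y=50.0, mu=0.5, m=100.0):
        """Regularized Bingham: the exponential keeps it finite at low shear."""
        return mu + tau_y * (1.0 - np.exp(-m * shear_rate)) / shear_rate

    def cross_viscosity(shear_rate, mu0=500.0, mu_inf=0.5, k=1.0, n=1.0):
        """Cross model: smooth transition between zero- and infinite-shear limits."""
        return mu_inf + (mu0 - mu_inf) / (1.0 + (k * shear_rate) ** n)

    rates = np.logspace(-3, 2, 6)
    print(bingham_viscosity(rates))
    print(cross_viscosity(rates))

The yield stress tau_y is what lets the Bingham formulation reproduce the stopping stage: below it the material behaves as an extremely viscous, effectively rigid body.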
Abstract:
This research sought to understand the role that differentially assessed lands (lands in the United States given tax breaks in return for a guarantee that they remain in agriculture) play in influencing urban growth. Our method was to calibrate the SLEUTH urban growth model under two different conditions. The first used an excluded layer that ignored such lands, effectively rendering them available for development. The second treated those lands as totally excluded from development. Our hypothesis was that excluding those lands would yield better metrics of fit with past data. Our results validate our hypothesis, since two different metrics that evaluate goodness of fit both yielded higher values when differentially assessed lands are treated as excluded. This suggests that, at least in our study area, differential assessment, which protects farm and ranch lands for tenuous periods of time, has indeed allowed farmland to resist urban development. Including differentially assessed lands also yielded very different calibrated coefficients of growth as the model tried to account for the same growth patterns over two very different excluded areas. Excluded layer design can greatly affect model behavior. Since differentially assessed lands are quite common throughout the United States and are often ignored in urban growth modeling, the findings of this research can assist other urban growth modelers in designing excluded layers that result in more accurate model calibration and thus forecasting.
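The goodness-of-fit comparison between the two excluded-layer designs can be illustrated with a simple raster overlap metric; the hypothetical Python sketch below uses intersection-over-union, whereas SLEUTH's own calibration metrics are more involved:

    import numpy as np

    # Hypothetical sketch: score a simulated urban raster against observed
    # growth under two designs of the excluded layer.
    def fit_metric(simulated, observed):
        sim, obs = simulated.astype(bool), observed.astype(bool)
        return (sim & obs).sum() / (sim | obs).sum()   # intersection over union

    rng = np.random.default_rng(5)
    observed = rng.random((100, 100)) < 0.2
    sim_ignoring_da = observed ^ (rng.random((100, 100)) < 0.15)   # noisier run
    sim_excluding_da = observed ^ (rng.random((100, 100)) < 0.05)  # closer run
    print(fit_metric(sim_ignoring_da, observed),
          fit_metric(sim_excluding_da, observed))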
Abstract:
Ensemble stream modeling and data cleaning are sensor information processing systems with different training and testing methods by which their goals are cross-validated. This research examines a mechanism which seeks to extract novel patterns by generating ensembles from data. The main goal of label-less stream processing is to process the sensed events so as to eliminate uncorrelated noise, and to choose the most likely model without overfitting, thus obtaining higher model confidence. Higher-quality streams can be realized by combining many short streams into an ensemble that has the desired quality. The framework for the investigation is an existing data mining tool. First, to accommodate feature extraction for events such as bush or natural forest fires, we take the burnt area (BA*), a sensed ground truth obtained from logs, as our target variable. Even though this is an obvious model choice, the results are disappointing, for two reasons: first, the histogram of fire activity is highly skewed; second, the measured sensor parameters are highly correlated. Since using nondescriptive features does not yield good results, we resort to temporal features. By doing so we carefully eliminate the averaging effects; the resulting histogram is more satisfactory, and conceptual knowledge is learned from the sensor streams. Second is the process of feature induction by cross-validating attributes with single or multi-target variables to minimize training error. We use the F-measure, which combines precision and recall, to determine the false-alarm rate of fire events. The multi-target data-cleaning trees use the information purity of the target leaf nodes to learn higher-order features. A sensitive variance measure such as the F-test is applied at each node's split to select the best attribute. The ensemble stream modeling approach proved to improve when complicated features were used with a simpler tree classifier. The ensemble framework for data cleaning and the enhancements to quantify quality of fitness (30% spatial, 10% temporal, and 90% mobility reduction) of sensor data led to the formation of quality streams for sensor-enabled applications, which further motivates the novelty of stream quality labeling and its importance in handling the vast amounts of real-time mobile streams generated today.
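The F-measure combines precision and recall as their (weighted) harmonic mean; a minimal Python sketch with hypothetical alarm counts:

    # F-measure for fire-event detection from hypothetical alarm counts.
    def f_measure(true_pos, false_pos, false_neg, beta=1.0):
        precision = true_pos / (true_pos + false_pos)
        recall = true_pos / (true_pos + false_neg)
        b2 = beta ** 2
        return (1 + b2) * precision * recall / (b2 * precision + recall)

    # Example: 80 correctly flagged fire events, 10 false alarms, 20 missed.
    print(f_measure(80, 10, 20))   # about 0.84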