9 results for Attribute reduction process
in Digital Commons at Florida International University
Abstract:
Until recently, it was believed that biological assimilation and gaseous nitrogen (N) loss through denitrification were the two major fates of nitrate entering or produced within most coastal ecosystems. Denitrification is often viewed as an important ecosystem service that removes reactive N from the ecosystem. However, there is a competing nitrate reduction process, dissimilatory nitrate reduction to ammonium (DNRA), that conserves N within the ecosystem. The recent application of nitrogen stable isotopes as tracers has generated growing evidence that DNRA is a major nitrogen pathway that cannot be ignored. Measurements comparing the importance of denitrification vs. DNRA at 55 coastal sites found that DNRA accounted for more than 30% of the nitrate reduction at 26 sites, and it was the dominant pathway at more than one-third of the sites. Understanding what controls the relative importance of denitrification and DNRA, and how the balance changes with increased nitrogen loading, is of critical importance for predicting eutrophication trajectories. Recent improvements in methods for assessing rates of DNRA have helped refine our understanding of the rates and controls of this process, but accurate measurement in vegetated sediments still remains a challenge.
Abstract:
The search-experience-credence framework from the economics of information, the human-environment relations models from environmental psychology, and the consumer evaluation process from services marketing provide a conceptual basis for testing the model of "Pre-purchase Information Utilization in Service Physical Environments." The model addresses the effects of informational signs, as a dimension of the service physical environment, on consumers' perceptions (perceived veracity and perceived performance risk), emotions (pleasure), and behavior (willingness to buy). The informational signs provide attribute quality information (search and experience) through non-personal sources of information (simulated word-of-mouth and non-personal advocate sources). This dissertation examines: (1) the hypothesized relationships addressed in the model of "Pre-purchase Information Utilization in Service Physical Environments" among informational signs, perceived veracity, perceived performance risk, pleasure, and willingness to buy, and (2) the effects of attribute quality information and sources of information on consumers' perceived veracity and perceived performance risk. This research is the first in-depth study of the role and effects of information in service physical environments. Using a 2 x 2 between-subjects experimental procedure, undergraduate students were exposed to informational signs in a simulated service physical environment.
The service physical environments were simulated through color photographic slides. The results of the study suggest that: (1) the relationship between informational signs and willingness to buy is mediated by perceived veracity, perceived performance risk, and pleasure, (2) experience attribute information yields higher perceived veracity and lower perceived performance risk than search attribute information, and (3) information provided through simulated word-of-mouth yields higher perceived veracity and lower perceived performance risk than information provided through non-personal advocate sources.
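The mediation result in item (1) can be illustrated with a minimal sketch. The data, effect sizes, and variable names below are simulated and hypothetical, not the dissertation's data; the sketch only shows the classic check that the direct effect of the signal shrinks once the mediator is controlled for:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated data: signs (0/1) -> perceived veracity (mediator) -> willingness to buy.
signs = rng.integers(0, 2, n).astype(float)         # informational sign present?
veracity = 1.5 * signs + rng.normal(0, 1, n)        # mediator
willingness = 2.0 * veracity + rng.normal(0, 1, n)  # outcome

def ols(y, *xs):
    """Least-squares fit of y on an intercept plus the given regressors."""
    X = np.column_stack([np.ones_like(y), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, slope_1, slope_2, ...]

total = ols(willingness, signs)[1]             # total effect of signs
direct = ols(willingness, signs, veracity)[1]  # effect after controlling for mediator
print(f"total={total:.2f}, direct={direct:.2f}")
```

With full mediation built into the simulation, the direct effect collapses toward zero while the total effect stays large.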
Abstract:
This study explored the relationship between social fund projects and poverty reduction in selected communities in Jamaica. The Caribbean nation's social fund projects aim to reduce "public" poverty by rehabilitating and expanding social and economic infrastructure, improving social services, and strengthening organizations at the community level. Research questions addressed the characteristics of poverty-focused social fund projects; the nexus between poverty reduction and three key concepts suggested by the literature (community, or citizen, participation; social capital; and empowerment); and the impact of the projects on poverty. In this qualitative study, data were collected and triangulated by means of in-depth, semi-structured interviews, supplemented by key informant data, non-participant observation, and document reviews. Thirty-four respondents were interviewed individually at eight rural and urban sites over a period of four consecutive months, and 10 key informants provided supplementary data. Open, axial, and selective coding was used for data reduction and analysis as part of the grounded theory method, which included constant comparative analysis. The codes generated a set of themes and a substantive-formal theory. Findings were cross-checked with interview respondents and key informants and validated by means of an audit trail. The results reveal that the approach to poverty reduction in social fund-supported communities is a process of development-focused collaboration among various stakeholders. The process encompasses four stages: (1) identifying problems and priorities, (2) motivating and mobilizing, (3) working together, and (4) creating an enabling environment. The underlying stakeholder involvement theory posits that collaboration increases the productivity of resources and creates the conditions for community-driven development.
In addition, the study found that social fund projects are largely community-based, collaborative, and highly participatory in their implementation, as well as prescription-driven, results-oriented, and leadership-dependent. Further, social capital formation across communities was found to be limited, and in general the projects have been enabling rather than empowering. The projects have not reduced poverty per se; however, they have been instrumental in improving conditions that are concomitants of poverty.
Abstract:
The primary purpose of this research was to examine the effect of the Truancy Intervention Program (TIP) on attendance patterns of elementary school students. Longitudinal archival data were used from the Miami-Dade County Public Schools (M-DCPS) data system, ISIS. Data included the students' school information from fifth through ninth grade for attendance, academic grades, referral information, and referral consequences. The sample for this study was drawn from students at TIP-participating M-DCPS elementary schools in Miami-Dade County. Data collected spanned five years for each participant, from the fifth grade to the ninth grade. To examine the effect of TIP on attendance, participation in middle school TIP was compared with non-TIP participation. In addition to immediate effects on attendance, the durability of the effects of TIP was studied through an analysis of attendance at the ninth grade level. A secondary purpose was to examine the relation of TIP participation to Grade Point Average (GPA). The data were analyzed using 2 (group) x 3 (grade) Repeated Measures Analysis of Variance (ANOVA) on yearly attendance (number of absences) and grade point average for each year. The interaction between group and grade was significant. Post hoc tests indicated that absences were not significantly different between the two programs in seventh, eighth, or ninth grade. Students enrolled in a middle school with TIP showed a significantly higher number of absences in ninth grade than in seventh or eighth grade, whereas there were no differences by grade level for students enrolled in non-TIP middle schools. GPA analysis indicated that students enrolled in a non-TIP middle school had a significantly higher GPA across seventh, eighth, and ninth grades when compared to students enrolled at a TIP middle school.
An examination of attendance disciplinary referrals and consequences further revealed that the referral rates for students enrolled at a TIP middle school were higher at the seventh, eighth, and ninth grade levels than for students enrolled at a non-TIP middle school. This pattern was not readily apparent at non-TIP middle schools. Limitations of the research were noted, and further research regarding program implementation (process evaluation) was suggested.
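The significant group x grade interaction reported above can be illustrated by decomposing a 2 x 3 table of cell means into main effects and an interaction term. The absence counts below are hypothetical, chosen only to mimic the reported pattern (TIP absences rising in ninth grade, non-TIP roughly flat):

```python
import numpy as np

# Hypothetical cell means of yearly absences: rows = group (TIP, non-TIP),
# columns = grade (7th, 8th, 9th). Numbers are illustrative, not from the study.
means = np.array([[8.0, 9.0, 14.0],   # TIP: absences rise sharply in 9th grade
                  [9.0, 9.5, 9.8]])   # non-TIP: roughly flat across grades

grand = means.mean()
row_eff = means.mean(axis=1, keepdims=True) - grand  # group main effect
col_eff = means.mean(axis=0, keepdims=True) - grand  # grade main effect
interaction = means - grand - row_eff - col_eff      # group x grade interaction

print(np.round(interaction, 2))
```

By construction the interaction cells sum to zero within each row and column; clearly nonzero cells correspond to the crossing pattern the ANOVA flagged as significant.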
Abstract:
Infrastructure management agencies are facing multiple challenges, including aging infrastructure, reduction in the capacity of existing infrastructure, and the availability of only limited funds. Therefore, decision makers are required to think innovatively and develop inventive ways of using available funds. Maintenance investment decisions are generally made based on physical condition only. It is important to understand that spending money on public infrastructure is synonymous with spending money on people themselves. This also requires consideration of decision parameters in addition to physical condition, such as strategic importance, socioeconomic contribution, and infrastructure utilization. Consideration of multiple decision parameters for infrastructure maintenance investments can be beneficial when funding is limited. Given this motivation, this dissertation presents a prototype decision support framework to evaluate trade-offs among competing infrastructures that are candidates for maintenance, repair, and rehabilitation investments. The performances of the decision parameters, measured through various factors, are combined to determine the integrated state of an infrastructure using Multi-Attribute Utility Theory (MAUT). The integrated state and the cost and benefit estimates of probable maintenance actions are utilized alongside expert opinion to develop transition probability and reward matrices for each probable maintenance action for a particular candidate infrastructure. These matrices are then used as input to a Markov Decision Process (MDP), in a finite-stage dynamic programming model, to perform project (candidate)-level analysis and determine optimized maintenance strategies based on reward maximization. The outcomes of the project (candidate)-level analysis are then utilized to perform network-level analysis, taking a portfolio management approach to determine a suitable portfolio under budgetary constraints.
The major decision support outcomes of the prototype framework include performance trend curves, decision logic maps, and a network-level maintenance investment plan for the upcoming years. The framework has been implemented on a network of bridges with the assistance of the Pima County DOT, Arizona. It is expected that the concept of this prototype framework can help infrastructure management agencies better manage their available funds for maintenance.
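The MAUT aggregation and finite-stage MDP steps described above can be sketched in a few lines. All weights, single-attribute utilities, transition probabilities, and rewards below are illustrative placeholders, not values from the framework:

```python
import numpy as np

# MAUT step: combine decision parameters into an integrated state utility.
# Weights/utilities are hypothetical (condition, strategic importance,
# socioeconomic contribution, utilization), additive-utility form.
weights = np.array([0.4, 0.3, 0.2, 0.1])
utilities = np.array([0.6, 0.8, 0.5, 0.9])   # single-attribute utilities in [0, 1]
integrated_state = float(weights @ utilities)

# MDP step: 3 condition states, 2 actions (do-nothing, rehabilitate).
# P[a][s, s'] are transition probabilities, R[a][s] immediate rewards (hypothetical).
P = [np.array([[0.7, 0.3, 0.0],
               [0.0, 0.6, 0.4],
               [0.0, 0.0, 1.0]]),   # do-nothing: condition decays
     np.array([[1.0, 0.0, 0.0],
               [0.8, 0.2, 0.0],
               [0.5, 0.5, 0.0]])]   # rehabilitate: condition improves
R = [np.array([10.0, 5.0, 0.0]),    # do-nothing: no cost, little reward when poor
     np.array([6.0, 4.0, 2.0])]     # rehabilitate: reward net of action cost

horizon = 5
V = np.zeros(3)                      # terminal values
policy = []
for _ in range(horizon):             # backward induction, reward maximization
    Q = np.array([R[a] + P[a] @ V for a in range(2)])
    policy.append(Q.argmax(axis=0))  # best action per state at this stage
    V = Q.max(axis=0)

print(integrated_state, policy[-1])  # first-stage optimal action for each state
```

The project-level output (an optimal action per state per stage) is exactly what a network-level portfolio step could then rank under a budget constraint.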
Abstract:
Ensemble stream modeling and data-cleaning are sensor information processing systems with different training and testing methods by which their goals are cross-validated. This research examines a mechanism that seeks to extract novel patterns by generating ensembles from data. The main goal of label-less stream processing is to process the sensed events so as to eliminate uncorrelated noise and choose the most likely model without overfitting, thus obtaining higher model confidence. Higher-quality streams can be realized by combining many short streams into an ensemble that has the desired quality. The framework for the investigation is an existing data mining tool. First, to accommodate feature extraction for events such as a bush or natural forest fire, we take the burnt area (BA*), the sensed ground truth obtained from logs, as our target variable. Even though this is an obvious model choice, the results are disappointing, for two reasons: first, the histogram of fire activity is highly skewed; second, the measured sensor parameters are highly correlated. Since non-descriptive features do not yield good results, we resort to temporal features. By doing so we carefully eliminate the averaging effects; the resulting histogram is more satisfactory, and conceptual knowledge is learned from the sensor streams. Second is the process of feature induction by cross-validating attributes with single- or multi-target variables to minimize training error. We use the F-measure score, which combines precision and recall, to determine the false alarm rate of fire events. The multi-target data-cleaning trees use the information purity of the target leaf nodes to learn higher-order features. A sensitive variance measure, such as an F-test, is performed at each node's split to select the best attribute. The ensemble stream model approach proved to improve when using complicated features with a simpler tree classifier.
The ensemble framework for data-cleaning, and the enhancements to quantify the quality of fitness of sensors (30% spatial, 10% temporal, and 90% mobility reduction), led to the formation of streams for sensor-enabled applications, which further motivates the novelty of stream quality labeling and its importance in handling the vast amounts of real-time mobile streams generated today.
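The F-measure used above to gauge fire-event detection quality is computed directly from confusion counts. The counts below are hypothetical, chosen only to illustrate the formula:

```python
# F-measure from a confusion count of detected fire events (illustrative counts).
tp, fp, fn = 40, 10, 20   # hypothetical true positives, false alarms, misses

precision = tp / (tp + fp)   # fraction of raised alarms that were real fires
recall = tp / (tp + fn)      # fraction of real fires that were detected
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two

print(f"precision={precision:.2f} recall={recall:.2f} F1={f1:.2f}")
```

Because F1 is a harmonic mean, it is dragged down by whichever of precision or recall is worse, which is why it is a useful single score for skewed event distributions like fire activity.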
Abstract:
Property taxes serve as a vital revenue source for local governments. The revenues derived from the property tax function as the primary funding source for a variety of critical local public service systems. Property tax appeal systems serve as quasi-administrative-judicial mechanisms intended to assure the public that property tax assessments are correct, fair, and equitable. Despite these important functions, there is a paucity of empirical research related to property tax appeal systems. This study contributes to property tax literature by identifying who participates in the property tax appeal process and examining their motivations for participation. In addition, the study sought to determine whether patterns of use and success in appeal systems affected the distribution of the tax burden. Data were collected by means of a survey distributed to single-family property owners from two Florida counties. In addition, state and county documents were analyzed to determine appeal patterns and examine the impact on assessment uniformity, over a three-year period. The survey data provided contextual evidence that single-family property owners are not as troubled by property taxes as they are by the conduct of local government officials. The analyses of the decision to appeal indicated that more expensive properties and properties excluded from initial uniformity analyses were more likely to be appealed, while properties with homestead exemptions were less likely to be appealed. The value change analyses indicated that appeals are clustered in certain geographical areas; however, these areas do not always experience a greater percentage of the value changes. Interestingly, professional representation did not increase the probability of obtaining a reduction in value. Other relationships between the variables were discovered, but often with weak predictive ability. Findings from the assessment uniformity analyses were also interesting. 
The results indicated that the appeals mechanisms in both counties improved assessment uniformity. On average, appealed properties exhibited greater horizontal and vertical inequities, as compared to non-appealed properties, prior to the appeals process. After the appeal process was completed, the indicators of horizontal and vertical equity were largely improved. However, there were some indications of regressivity in the final year of the study.
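The abstract does not name its equity indicators; the conventional IAAO-style choices are the coefficient of dispersion (COD, horizontal equity) and the price-related differential (PRD, vertical equity), sketched here on hypothetical assessment data:

```python
import numpy as np

# Hypothetical assessed values and sale prices for five single-family parcels.
# These are illustrative numbers, not data from the study's two counties.
assessed = np.array([ 95_000, 180_000, 260_000, 410_000, 620_000], dtype=float)
sale     = np.array([100_000, 200_000, 250_000, 500_000, 800_000], dtype=float)

ratios = assessed / sale                  # assessment ratios
median = np.median(ratios)
cod = 100 * np.mean(np.abs(ratios - median)) / median     # coefficient of dispersion
prd = np.mean(ratios) / (assessed.sum() / sale.sum())     # price-related differential

print(f"COD={cod:.1f} PRD={prd:.3f}")
```

A lower COD indicates better horizontal uniformity, while a PRD above 1 suggests regressivity (higher-priced properties assessed at relatively lower ratios), the same pattern the study observed in its final year.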