7 results for System behaviors
in Digital Commons at Florida International University
Abstract:
Modern IT infrastructures are built from large-scale computing systems and administered by IT service providers. Manually maintaining such large computing systems is costly and inefficient, so service providers seek automatic or semi-automatic methods of detecting and resolving system issues to improve their service quality and efficiency. This dissertation investigates several data-driven approaches for assisting service providers in achieving this goal. The problems studied fall into three aspects of the service workflow: 1) preprocessing raw textual system logs into structured events; 2) refining monitoring configurations to eliminate false positives and false negatives; 3) improving the efficiency of system diagnosis on detected alerts. Solving these problems usually requires a large amount of domain knowledge about the particular computing systems. The approaches investigated in this dissertation are built on event mining algorithms, which can automatically derive part of that knowledge from historical system logs, events, and tickets. In particular, two textual clustering algorithms are developed for converting raw textual logs into system events. For refining the monitoring configuration, a rule-based alert prediction algorithm is proposed for eliminating false alerts (false positives) without losing any real alert, and a textual classification method is applied to identify missing alerts (false negatives) from manual incident tickets. For system diagnosis, the dissertation presents an efficient algorithm for discovering temporal dependencies between system events, with their corresponding time lags; these dependencies help administrators identify redundant monitoring situations and dependencies among system components. To improve the efficiency of incident ticket resolution, several KNN-based algorithms that recommend relevant historical tickets, with their resolutions, for incoming tickets are investigated. Finally, the dissertation offers a novel algorithm for searching similar textual event segments over large system logs, which helps administrators locate similar system behaviors in the logs. Extensive empirical evaluation on system logs, events, and tickets from real IT infrastructures demonstrates the effectiveness and efficiency of the proposed approaches.
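The KNN-based recommendation idea above can be illustrated with a minimal Python sketch; the sample tickets, the TF-IDF representation, and k=2 are illustrative assumptions, not the dissertation's actual implementation.

# Sketch of a KNN-style ticket-resolution recommender: represent
# historical tickets as TF-IDF vectors and surface the resolutions of
# the k most similar tickets for an incoming one (sample data assumed).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

historical = [
    ("disk /var 95% full on host-a", "Rotated logs and extended the volume."),
    ("high CPU from java process", "Restarted the leaking JVM service."),
    ("disk /tmp full on host-b", "Cleaned temp files; added a cleanup cron job."),
]
texts, resolutions = zip(*historical)
vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(texts)

def recommend(incoming, k=2):
    """Return (ticket, resolution, similarity) for the k nearest tickets."""
    scores = cosine_similarity(vectorizer.transform([incoming]), matrix).ravel()
    return [(texts[i], resolutions[i], scores[i]) for i in scores.argsort()[::-1][:k]]

for text, res, score in recommend("disk full alert on host-c"):
    print(f"{score:.2f}  {text}  ->  {res}")

Each incoming ticket is matched against the historical corpus by cosine similarity, and the resolutions attached to its nearest neighbors are offered as candidate fixes.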
Abstract:
Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency produces a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. The dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built with Petri nets from user requirements and formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified against those models using model checking. Our formal specification and verification of Mondex contribute to the worldwide effort to develop a verified software repository. Our method for automatically mining Petri net models from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable because it exploits the nature of atomicity violations, considering only a pair of threads and accesses to a single shared variable at a time. However, predictive tools must consider the tradeoff between precision and coverage. Building on McPatom, the dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: 1) a post-prediction analysis method that increases coverage while ensuring precision; 2) a follow-up replaying method that further increases coverage. Both methods are implemented in a completely automatic tool.
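The intuition behind restricting attention to a pair of threads and a single shared variable can be sketched as a scan for the classic unserializable three-access interleavings. This is a simplified illustration of that general pattern, not McPatom itself, and the trace format is an assumption.

# Simplified illustration (not McPatom): flag the unserializable
# three-access interleavings on one shared variable between one local
# thread and one remote thread. Trace entries are (thread, op, var).
UNSERIALIZABLE = {("R", "W", "R"), ("W", "W", "R"),
                  ("R", "W", "W"), ("W", "R", "W")}

def atomicity_violations(trace):
    violations = []
    for i, (t, op1, var) in enumerate(trace):
        for j in range(i + 1, len(trace)):            # this thread's next
            t2, op2, var2 = trace[j]                  # access to the same var
            if t2 == t and var2 == var:
                for k in range(i + 1, j):             # interleaved remote access?
                    tr, opr, varr = trace[k]
                    if tr != t and varr == var and (op1, opr, op2) in UNSERIALIZABLE:
                        violations.append((trace[i], trace[k], trace[j]))
                break
    return violations

# Hypothetical trace: thread 1's read/write of x is split by thread 2's write.
print(atomicity_violations([(1, "R", "x"), (2, "W", "x"), (1, "W", "x")]))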
Abstract:
Many systems and applications continuously produce events. These events record the status of a system and trace its behaviors; by examining them, system administrators can check for potential problems. If the temporal dynamics of the systems are further investigated, underlying patterns can be discovered, and the uncovered knowledge can be leveraged to predict future system behaviors or to mitigate potential risks. Moreover, administrators can use the temporal patterns to set up event management rules that make the system more intelligent. With the popularity of data mining techniques in recent years, these events have gradually become more and more useful. Despite recent advances in data mining, its application to system event mining is still at a rudimentary stage. Most existing work focuses on episode mining or frequent pattern discovery. These methods cannot provide a brief yet comprehensible summary that reveals the valuable information from a high-level perspective, and they provide little actionable knowledge to help system administrators better manage their systems. To make better use of the recorded events, more practical techniques are required. From the perspective of data mining, three correlated directions are considered helpful for system management: (1) providing concise yet comprehensive summaries of the running status of the systems; (2) making the systems more intelligent and autonomous; (3) effectively detecting abnormal system behaviors. Owing to the richness of the event logs, all these directions can be pursued in a data-driven manner; in this way, the robustness of the systems can be enhanced and the goal of autonomous management approached. This dissertation focuses on these directions, leveraging temporal mining techniques to facilitate system management. More specifically, three concrete topics are discussed: event summarization, resource demand prediction, and streaming anomaly detection. Besides the theoretical contributions, experimental evaluations are presented to demonstrate the effectiveness and efficiency of the corresponding solutions.
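Of the three topics, streaming anomaly detection is the easiest to sketch: a rolling z-score detector captures the flavor, though the window size, threshold, and metric stream below are illustrative assumptions rather than the dissertation's method.

# Rolling z-score sketch of streaming anomaly detection: flag a metric
# value that deviates from a sliding-window mean by more than `threshold`
# standard deviations (window size and threshold are assumptions).
from collections import deque
from statistics import mean, stdev

def detect_stream(values, window=30, threshold=3.0):
    recent = deque(maxlen=window)
    for i, v in enumerate(values):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(v - mu) > threshold * sigma:
                yield i, v  # anomalous point
        recent.append(v)

# Hypothetical metric stream: steady load with one spike.
stream = [10.0] * 40 + [11.0, 9.5] * 5 + [60.0] + [10.0] * 10
print(list(detect_stream(stream)))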
Abstract:
The long-term goal of the work described is to contribute to the emerging literature of prevention science in general and to school-based psychoeducational interventions in particular. The psychoeducational intervention reported in this study used a main-effects prevention intervention model and focused on promoting optimal cognitive and affective functioning. The goal of the intervention was to increase potential protective factors such as critical cognitive and communicative competencies (e.g., critical problem solving and decision making) and affective competencies (e.g., personal control and responsibility) in middle adolescents identified by the school system as at risk for problem behaviors. The intervention draws on an ongoing program of theory and research (Berman, Berman, Cass Lorente, Ferrer Wreder, Arrufat, & Kurtines, 1996; Ferrer Wreder, 1996; Kurtines, Berman, Ittel, & Williamson, 1995) and extends it to include Freire's (1970) concept of transformative pedagogy in developing school-based psychoeducational programs that target troubled adolescents. The results of the quantitative and qualitative analyses indicated trends that were generally encouraging with respect to the effects of the intervention on increasing critical cognitive and affective competencies.
Abstract:
The primary purpose of this research is to study the linkage between perceived job design characteristics and information system environment characteristics before and after the replacement of a legacy information system with a new type of information system, an Enterprise Resource Planning (ERP) system. A public state university implementing an academic version of an ERP system was selected for the study. Three survey instruments were used to examine perceptions of the information system, the job characteristics, and the organizational culture before and after the system implementation. The research participants came from two large departments, yielding a sample of 130 workers. Research questions were analyzed using multivariate procedures including factor analysis, path analysis, stepwise regression, and matched-pair analysis. Results indicated that the ERP system introduced new elements into the working environment that changed how workers perceive the job design characteristics and organizational culture dimensions. The alignment of perceived system characteristics with an individual's perceived job design characteristics is supported by each of the system characteristics correlating significantly in the proposed direction; stronger support for this relationship is visible in the causal flow of effects in the path diagram and in the stepwise regression. The alignment of perceived job design characteristics with dimensions of organizational culture is not as strong as the literature suggests: although there are significant correlations between the job and culture variables, only one relationship appears in the causal flow. This research demonstrates that ERP system characteristics do contribute to the perception of change in an organization and do support organizational culture behaviors and job characteristics.
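Among the multivariate procedures, the stepwise regression can be illustrated with a minimal forward-selection sketch; the predictor names, the data, and the adjusted-R² stopping criterion are assumptions for illustration, not the study's variables or procedure.

# Forward stepwise-regression sketch: greedily add the predictor that
# most improves adjusted R-squared; stop when no candidate helps.
# Predictor names and data are invented for illustration.
import numpy as np

def adj_r2(X, y):
    X1 = np.column_stack([np.ones(len(y)), X])        # add intercept
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    n, p = X1.shape
    return 1 - (resid @ resid / (n - p)) / (((y - y.mean()) ** 2).sum() / (n - 1))

def forward_stepwise(X, y, names):
    chosen, best = [], -np.inf
    while True:
        scores = {c: adj_r2(X[:, chosen + [c]], y)
                  for c in range(X.shape[1]) if c not in chosen}
        if not scores:
            break
        c, score = max(scores.items(), key=lambda kv: kv[1])
        if score <= best:                              # no improvement: stop
            break
        chosen.append(c)
        best = score
    return [names[c] for c in chosen], best

rng = np.random.default_rng(0)
X = rng.normal(size=(130, 4))                          # e.g., 130 respondents
y = 2 * X[:, 0] - X[:, 2] + rng.normal(size=130)
print(forward_stepwise(X, y, ["autonomy", "feedback", "variety", "identity"]))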
Abstract:
This study compares the effects of cooperative delivery (CD) and individual delivery (ID) of integrated learning system (ILS) instruction in mathematics on the achievement, attitudes, and behaviors of adult (16-21 yrs.) high school students (grades 9-13). The study was conducted in an urban adult high school in Miami-Dade County Public Schools using a pre-test/post-test design. Achievement was measured using the Test of Adult Basic Education (TABE) by CTB/McGraw-Hill and Compass Learning. An attitudinal survey measured attitudes toward mathematics, the computer-related lessons, and group activities. Behavior was assessed through computer lab observations. Two-way analyses of variance (ANOVA) were conducted on achievement (TABE and Compass) by group and time (pre and post). A one-way ANOVA was conducted on overall attitude by group across the five components (i.e., mathematics content, delivery/computers, cooperative activities, partners, and self-efficacy), and a one-way ANOVA was conducted on on-task behavior by group. The results revealed that CD and ID students working on mathematics activities delivered by the ILS performed similarly on the TABE achievement tests. The CD-ILS students had significantly better overall mathematics attitudes than the ID-ILS students, and the ID-ILS group was on task significantly more than the CD-ILS group. The study concludes that the regularity and period of time over which the ILS is used may prove to be important variables, although there were insufficient data to fully investigate the impact of models of use. Additionally, a minimum amount of time on the system appears necessary before gains in numeracy become apparent, and increasing exposure to the system may have beneficial effects on learning.
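A minimal sketch of the two-way ANOVA on achievement by group and time, using the statsmodels formula interface; the DataFrame columns and scores are hypothetical, not the study's data.

# Two-way ANOVA sketch on achievement by delivery group (CD vs. ID) and
# time (pre vs. post); column names and values are invented.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.DataFrame({
    "score": [41, 44, 52, 55, 40, 42, 50, 51],
    "group": ["CD", "CD", "CD", "CD", "ID", "ID", "ID", "ID"],
    "time":  ["pre", "pre", "post", "post", "pre", "pre", "post", "post"],
})
model = smf.ols("score ~ C(group) * C(time)", data=df).fit()
print(anova_lm(model, typ=2))  # main effects and the group x time interaction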
Abstract:
The purpose of this study was to better understand the study behaviors and habits of university undergraduate students. It was designed to determine whether undergraduate students could be grouped based on their self-reported study behaviors and, if any grouping system could be determined, whether group membership was related to students’ academic achievement. A total of 152 undergraduate students voluntarily participated in the study by completing the Study Behavior Inventory instrument. All participants were enrolled in the fall semester of 2010 at Florida International University. The Q factor analysis technique, using principal components extraction and a varimax rotation, was used to examine the participants in relation to each other and to detect a pattern of intercorrelations among participants based on their self-reported study behaviors. The Q factor analysis yielded a two-factor structure representing two distinct student types regarding study behaviors. The first type (i.e., Factor 1) describes proactive learners who organize both their study materials and their study time well; Type 1 students are labeled “Proactive Learners with Well-Organized Study Behaviors”. The second type (i.e., Factor 2) represents students who are poorly organized and very likely to procrastinate; Type 2 students are labeled “Disorganized Procrastinators”. Hierarchical linear regression was employed to examine the relationship between student type and academic achievement as measured by current grade point averages (GPAs). The results showed significant differences in GPAs between Type 1 and Type 2 students at the .05 significance level. Furthermore, student type was found to be a significant predictor of academic achievement over and above students’ attribute variables, including sex, age, major, and enrollment status. The study has several implications for educational researchers, practitioners, and policy makers in terms of improving college students' learning behaviors and outcomes.
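The Q technique can be sketched by factoring the transposed data matrix so that participants, rather than items, load on the factors. The sketch below substitutes a varimax-rotated factor analysis for the study's principal components extraction, and the response data are simulated; item count and scale are assumptions.

# Q-technique sketch: factor the *transposed* data matrix so participants,
# not items, load on the factors. Varimax-rotated factor analysis stands
# in for principal components extraction; responses are simulated.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
responses = rng.integers(1, 5, size=(152, 46)).astype(float)  # hypothetical items

q_matrix = StandardScaler().fit_transform(responses).T        # items x participants
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(q_matrix)
loadings = fa.components_.T                                   # participants x 2 factors
student_type = np.abs(loadings).argmax(axis=1) + 1            # assign Type 1 or Type 2
print(np.bincount(student_type)[1:])                          # students per type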