7 results for Concurrent exception handling

in Digital Commons at Florida International University


Relevance: 30.00%

Abstract:

In recent years, the Internet has grown exponentially and become more complex. This increased complexity potentially introduces more network-level instability, yet for any end-to-end Internet connection, maintaining throughput and reliability at a certain level is very important because it directly affects the connection's normal operation. A challenging research task, therefore, is to improve a connection's performance by optimizing its throughput and reliability. This dissertation proposed an efficient and reliable transport-layer protocol, concurrent TCP (cTCP), an extension of the current TCP protocol, to optimize end-to-end connection throughput and enhance end-to-end fault tolerance. The proposed cTCP protocol aggregates the bandwidth of multiple paths by supporting concurrent data transfer (CDT) on a single connection, where concurrent data transfer is defined as the concurrent transfer of data from local hosts to foreign hosts via two or more end-to-end paths. An RTT-based CDT mechanism, which uses a path's RTT (round-trip time) to optimize CDT performance, was developed for the proposed cTCP protocol. This mechanism primarily includes an RTT-based load distribution and path management scheme used to optimize a connection's throughput and reliability, together with an RTT-based congestion control and retransmission policy. Experimental results show that, under different network conditions, the RTT-based CDT mechanism achieves good CDT performance. Finally, a CWND-based CDT mechanism, which uses a path's CWND (congestion window) to optimize CDT performance, was introduced. This mechanism primarily includes: a CWND-based load allocation scheme, which assigns data to paths based on their CWND to achieve aggregate bandwidth; a CWND-based path management scheme used to optimize a connection's fault tolerance; and a congestion control and retransmission management policy that handles each path separately, similar to regular TCP. The corresponding experimental results show that this mechanism achieves near-optimal CDT performance under different network conditions.
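
As an illustration of the CWND-based load allocation idea described in this abstract, the following Python sketch splits a block of pending data across paths in proportion to each path's congestion window. It is a minimal sketch under assumed names: the Path class, its fields, and the example values are hypothetical, not the dissertation's cTCP implementation.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Path:
    # Hypothetical per-path sender state for a multi-path (CDT) connection.
    name: str
    cwnd: int  # congestion window in bytes

def allocate_by_cwnd(paths: List[Path], pending_bytes: int) -> Dict[str, int]:
    # Split pending data across paths in proportion to their CWND: a larger
    # window indicates more usable capacity, so that path gets a larger share.
    total_cwnd = sum(p.cwnd for p in paths)
    if total_cwnd == 0:
        return {p.name: 0 for p in paths}
    shares = {p.name: pending_bytes * p.cwnd // total_cwnd for p in paths}
    # Give the remainder from integer division to the widest path.
    widest = max(paths, key=lambda p: p.cwnd)
    shares[widest.name] += pending_bytes - sum(shares.values())
    return shares

# Example: one path with roughly twice the window of the other.
print(allocate_by_cwnd([Path("path_a", 64_000), Path("path_b", 32_000)], 90_000))

An RTT-based variant of the same idea would weight each path's share by the inverse of its measured round-trip time rather than by its congestion window.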

Relevance: 20.00%

Abstract:

Considerable funds have been allocated in the area of juvenile justice in attempts to reduce and prevent the problem of juvenile delinquency. Much of this funding has been funneled to various community-level intervention programs. This dissertation reports the results of a study that examined the effects of one such program, the Juvenile Intervention Facility (JIF) in Broward County, Florida, on reducing the number of cases handled judicially by the Juvenile Court in that county. Juvenile justice policy, which precipitated the creation of the JIF program, assumed that more structured and integrative efforts at the point of entry into the juvenile justice system would lead to greater diversion from the courts to much-needed intervention services. By virtue of this process, the number of juveniles handled judicially by the courts was expected to decrease and future delinquent behavior to be prevented. Archival data from four fiscal years were examined: two years pre-JIF, two years post-JIF, a third-year follow-up, and a concurrent outcome measure corresponding to the first year of JIF operations. Data included all juvenile cases referred during the fiscal years defined for Broward and St. Lucie Counties, the state of Florida, and the United States. The study tested four hypotheses: (a) the JIF would reduce the number of cases handled judicially in Broward County Juvenile Court, (b) the decrease in judicially handled cases would be greater for females than for males, (c) there would be greater decreases in judicially handled cases for whites than for non-whites, and (d) there would be greater decreases in judicial handling for younger than for older offenders. Bivariate analyses, consisting of chi-square tests, were conducted to test the hypotheses. Results indicate that the impact of the JIF was in the opposite direction of what was expected, in that more juvenile offenders were handled judicially through the juvenile court. This points to the possibility that the JIF has failed to provide the intended consequences of the policy. In the discussion, these "unintended" consequences are addressed in the context of juvenile justice policy creation and the competing constituencies involved in such policy development.

Relevance: 20.00%

Abstract:

This series of 5 single-subject studies used the operant conditioning paradigm to investigate, within the two-way influence process, how (a) contingent infant attention can reinforce maternal verbal behaviors during a period of mother-infant interaction and under subsequent experimental manipulation. Differential reinforcement was used to determine whether an infant attending to the mother (denoted by head-turns toward the image of the mother plus eye contact) increases (reinforces) the mother's verbal response (to a cue from the infant) upon which the infant behavior is contingent. There was also (b) an evaluation, during the contrived parent-infant interaction, of concurrent operant learning of infant vocal behavior via contingent verbal responding (reinforcement) implemented by the mother. Further, it was noted (c) whether or not the mother reported being aware that her responses were influenced by the infant's behavior. Findings showed that maternal verbal behaviors were operantly conditioned, reinforced by contingent infant attention, and that infant vocalizations were operantly conditioned, reinforced by contingent maternal verbal behaviors. No parent reported (1) being aware of the increase in her verbal responses reinforced during operant conditioning of parental behavior, or of a decrease in those responses during the DRA reversal phase, or (2) noticing a contingency between the infant's and the mother's responses. By binomial one-tailed tests, the verbal-behavior patterns of the 5 mothers were conditioned by infant reinforcement (p < 0.02) and, concurrently, the vocal-response patterns of the 5 infants were conditioned by maternal reinforcement (p < 0.02). A program of systematic empirical research on the determinants of concurrent conditioning within mother-child interaction may provide a way to evaluate the differential effectiveness of interventions aimed at improving parent-child interactions. The work conducted in the present study is one step in this direction.

Relevance: 20.00%

Abstract:

This research focuses on developing a capacity planning methodology for the emerging concurrent engineer-to-order (ETO) operations, with the primary focus placed on capacity planning at the sales stage. This study examines the characteristics of capacity planning in a concurrent ETO operation environment, models the problem analytically, and proposes a practical capacity planning methodology for concurrent ETO operations in industry. A computer program that mimics a concurrent ETO operation environment was written to validate the proposed methodology and to test a set of rules that affect the performance of a concurrent ETO operation. This study takes a systems engineering approach to the problem and employs systems engineering concepts and tools for modeling and analyzing the problem, as well as for developing a practical solution. It depicts a concurrent ETO environment in which capacity is planned. The capacity planning problem is modeled as a mixed integer program and then solved for smaller-sized applications to evaluate its validity and solution complexity. The objective is to select the best set of available jobs so as to maximize profit while retaining sufficient capacity to meet each due-date expectation. The nature of capacity planning for concurrent ETO operations differs from that of other operation modes, and the search for an effective solution to this problem has been an emerging research field. This study characterizes the capacity planning problem and proposes a solution approach: a mathematical model that relates work requirements to capacity over the planning horizon. The methodology is proposed for solving industry-scale problems. Along with the capacity planning methodology, a set of heuristic rules was evaluated for improving concurrent ETO planning.
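
As a hedged illustration of the job-selection model described above, the following Python sketch states a small mixed integer program with the open-source PuLP library: binary variables choose which jobs to accept so as to maximize profit, subject to per-period capacity limits that stand in for due-date expectations. The job names, profits, workloads, and capacities are invented placeholders, not the dissertation's actual formulation.

from pulp import LpProblem, LpMaximize, LpVariable, LpBinary, lpSum

# Hypothetical sales-stage data: candidate jobs, their profit, and the
# engineering hours each would consume in every planning period.
profit = {"jobA": 120, "jobB": 90, "jobC": 150}
load = {
    "jobA": {1: 40, 2: 30},
    "jobB": {1: 20, 2: 50},
    "jobC": {1: 60, 2: 20},
}
capacity = {1: 80, 2: 70}  # available hours per period

prob = LpProblem("eto_capacity_planning", LpMaximize)
accept = {j: LpVariable(f"accept_{j}", cat=LpBinary) for j in profit}

# Objective: maximize the total profit of the accepted jobs.
prob += lpSum(profit[j] * accept[j] for j in profit)

# Capacity constraints: accepted jobs must fit within each period's hours.
for t in capacity:
    prob += lpSum(load[j][t] * accept[j] for j in profit) <= capacity[t]

prob.solve()
print({j: int(accept[j].value()) for j in profit})

With these placeholder numbers the solver accepts jobB and jobC and rejects jobA, since adding jobA would exceed the period-1 capacity.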

Relevance: 20.00%

Abstract:

Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built using Petri nets from user requirements and is formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified based on the partial order models using model checking. Our formal specification and verification of Mondex have contributed to the worldwide effort to develop a verified software repository. Our method for automatically mining Petri net models from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable because it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at one time. However, predictive tools need to consider the trade-offs between precision and coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: (1) a post-prediction analysis method that increases coverage while ensuring precision, and (2) a follow-up replaying method that further increases coverage. Both methods are implemented in a completely automatic tool.
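
To make the targeted bug class concrete, the following Python sketch shows a generic check-then-act atomicity violation on a shared variable (an illustrative example, not one of the bugs McPatom reports): the two accesses in withdraw_unsafe are intended to execute as a unit, but nothing stops the other thread from updating balance between them.

import threading

balance = 100                 # shared variable accessed by both threads
lock = threading.Lock()

def withdraw_unsafe(amount):
    # Check-then-act without holding a lock across both steps: another thread
    # can run between the check and the update, so two withdrawals that each
    # look safe in isolation can jointly overdraw the balance.
    global balance
    if balance >= amount:           # access 1: read of the shared variable
        balance = balance - amount  # access 2: read-modify-write

def withdraw_safe(amount):
    # Holding the lock across both accesses makes the pair atomic.
    global balance
    with lock:
        if balance >= amount:
            balance -= amount

threads = [threading.Thread(target=withdraw_unsafe, args=(80,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("final balance:", balance)    # can be -60 under an unlucky interleaving

A dynamic predictor in this style treats the check and the update as a block that should be atomic and searches thread interleavings consistent with an observed trace for a remote write to the shared variable that falls between them.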

Relevance: 20.00%

Abstract:

Fisheries-independent data on relatively unstudied nekton communities were used to explore the efficacy of new tools for investigating shallow coastal coral reef habitats. These data, obtained through concurrent diver visual and acoustic surveys, provided descriptions of spatial community distribution patterns across seasonal temporal scales in a previously undocumented region. Fish density estimates from both the diver and acoustic methodologies showed general agreement in their ability to detect distributional patterns across reef tracts, though the magnitudes of the density estimates differed. Fish communities in southeastern Florida showed significant trends in spatial distribution and seasonal abundance, with higher estimates of biomass obtained in the dry season. Further, community composition shifted across reef tracts and seasons as a function of the movements of several key reef species.
