6 results for Static average-case analysis

in DRUM (Digital Repository at the University of Maryland)


Relevance:

100.00%

Publisher:

Abstract:

In the standard Vehicle Routing Problem (VRP), we route a fleet of vehicles to deliver the demands of all customers such that the total distance traveled by the fleet is minimized. In this dissertation, we study variants of the VRP that minimize the completion time, i.e., we minimize the distance of the longest route. We call it the min-max objective function. In applications such as disaster relief efforts and military operations, the objective is often to finish the delivery or the task as soon as possible, not to plan routes with the minimum total distance. Even in commercial package delivery nowadays, companies are investing in new technologies to speed up delivery instead of focusing merely on the min-sum objective. In this dissertation, we compare the min-max and the standard (min-sum) objective functions in a worst-case analysis to show that the optimal solution with respect to one objective function can be very poor with respect to the other. The results motivate the design of algorithms specifically for the min-max objective. We study variants of min-max VRPs including one problem from the literature (the min-max Multi-Depot VRP) and two new problems (the min-max Split Delivery Multi-Depot VRP with Minimum Service Requirement and the min-max Close-Enough VRP). We develop heuristics to solve these three problems. We compare the results produced by our heuristics to the best-known solutions in the literature and find that our algorithms are effective. In the case where benchmark instances are not available, we generate instances whose near-optimal solutions can be estimated based on geometry. We formulate the Vehicle Routing Problem with Drones and carry out a theoretical analysis to show the maximum benefit from using drones in addition to trucks to reduce delivery time. The speed-up ratio depends on the number of drones loaded onto one truck and the speed of the drone relative to the speed of the truck.
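To make the min-max versus min-sum contrast concrete, below is a minimal brute-force sketch in Python on a hypothetical one-depot, two-vehicle, three-customer instance; the distance matrix and instance are illustrative assumptions, not data from the dissertation.

from itertools import permutations

# Hypothetical instance: node 0 is the depot, nodes 1-3 are customers.
# Symmetric distance matrix; the values are illustrative only.
DIST = [
    [0, 2, 2, 9],
    [2, 0, 3, 8],
    [2, 3, 0, 8],
    [9, 8, 8, 0],
]
CUSTOMERS = [1, 2, 3]

def route_length(route):
    """Length of depot -> customers in the given order -> depot."""
    stops = [0] + list(route) + [0]
    return sum(DIST[a][b] for a, b in zip(stops, stops[1:]))

def best_split(objective):
    """Brute force: assign each customer to one of two vehicles, try
    every visiting order, and score the route pair with `objective`."""
    best = None
    for mask in range(2 ** len(CUSTOMERS)):
        groups = ([], [])
        for i, c in enumerate(CUSTOMERS):
            groups[(mask >> i) & 1].append(c)
        lengths = [min(route_length(p) for p in permutations(g)) if g else 0
                   for g in groups]
        score = objective(lengths)
        if best is None or score < best[0]:
            best = (score, sorted(lengths))
    return best

print("min-sum optimum:", best_split(sum))  # (20, [0, 20]): one long route
print("min-max optimum:", best_split(max))  # (18, [7, 18]): balanced routes

On this instance, the min-sum optimum sends one vehicle on a single route of length 20, so the delivery completes at time 20; the min-max optimum splits the customers into routes of lengths 7 and 18 and completes at 18. Each objective's optimal solution is therefore suboptimal for the other, which is the worst-case phenomenon analyzed above.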

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this study was to examine how four high schools used an Early Warning Indicator Report (EWIR) to improve ninth-grade promotion rates. Ninth-grade on-time promotion is an early predictor of a student's likelihood to graduate (Bornsheuer, Polonyi, Andrews, Fore, & Onwuegbuzie, 2011; Leckrone & Griffith, 2006; Roderick, Kelley-Kemple, Johnson, & Beechum, 2014; Zvoch, 2006). The research took place in a large urban school district in the Mid-Atlantic. Sixteen participants from four high schools and the district's central office voluntarily participated in face-to-face interviews. The researcher used a qualitative case study method to examine the implementation of the EWIR system in Wyatt School District. The interview data were transcribed and analyzed, along with district documents, to identify categories in this cross-case analysis. The analysis revealed both similarities and differences in the ways that the four schools used the EWIR. Three primary themes emerged from the data: (1) targeted school structures for EWIR implementation, (2) use of the EWIR to identify necessary supports for students, and (3) central office support for school staff. The findings revealed the various ways that the target schools implemented the EWIR in their buildings and the level of central office support that aided them in using the EWIR to improve ninth-grade promotion rates. Based on the findings of this study, the researcher provided a number of key recommendations: (1) districts should provide professional development to ensure that schools have the support they need to implement the EWIR successfully; (2) there should be increased accountability from the central office for schools using the EWIR to identify impactful interventions for ninth graders; and (3) the district needs to assign dedicated central office staff to support the implementation of the EWIR in high schools across the district. As schools continue to face the challenge of improving ninth-grade promotion rates, effective use of an Early Warning Indicator Report is recommended to provide school and district staff with the data needed to impact overall student performance.

Relevance:

100.00%

Publisher:

Abstract:

Problem: This dissertation presents a literature-based framework for communication in science (with the elements partners, purposes, message, and channel), which it then applies and amends through an empirical study of how geoscientists use two social computing technologies (SCTs): blogging and Twitter (both general use and tweeting from conferences). How are these technologies used, and what value do scientists derive from them?

Method: The empirical part used a two-pronged qualitative study, drawing on (1) purposive samples of ~400 blog posts and ~1000 tweets and (2) a purposive sample of 8 geoscientist interviews. Blog posts, tweets, and interviews were coded using the framework, adding new codes as needed. The results were aggregated into 8 geoscientist case studies, and general patterns were derived through cross-case analysis.

Results: A detailed picture of how geoscientists use blogs and Twitter emerged, including a number of new functions not served by traditional channels. Some highlights: geoscientists use SCTs for communication among themselves as well as with the public. Blogs serve persuasion and personal knowledge management; Twitter often amplifies the signal of traditional communications such as journal articles. Blogs include tutorials for peers, reviews of basic science concepts, and book reviews. Twitter includes links to readings, requests for assistance, and discussions of politics and religion. Twitter at conferences provides live coverage of sessions.

Conclusions: Both blogs and Twitter are routine parts of scientists' communication toolbox: blogs for in-depth, well-prepared essays, Twitter for faster and broader interactions. Both have important roles in supporting community building, mentoring, and learning and teaching. The Framework of Communication in Science was a useful tool in studying these two SCTs in this domain. The results should encourage science administrators to facilitate SCT use by scientists in their organizations, and information providers to make SCT documents searchable as an important source of information.

Relevance:

100.00%

Publisher:

Abstract:

The high rate of teacher attrition in urban schools is well documented. While the rate in Carter County may not seem high, it equates to hundreds of teachers who must be replaced annually. Since school year (SY) 2007-08, Carter County has lost over 7,100 teachers, approximately half (50.1%) of whom resigned, often moving to neighboring, higher-paying jurisdictions, as suggested by exit survey data (SY2016-2020 Strategic Plan). While the role of the principal is recognized as a critical element in teacher retention, few studies explore the specific practices principals implement to retain teachers and how they use their time to accomplish this task. This study examines the range of practices principals use to retain teachers. Through interviews, observations, document analysis, and reflective notes, the study identifies the practices four elementary school principals of high- and relatively low-attrition schools use to support teacher retention, taking a qualitative cross-case analysis approach. The researcher examined the following leadership practices of the principal and their impact on teacher retention: (a) providing leadership, (b) supporting new teachers, (c) training and mentoring teaching staff, (d) creating opportunities for collaboration, (e) creating a positive school climate, and (f) promoting teacher autonomy. The following research questions guided the development and implementation of this study: 1. How do principals prioritize addressing teacher attrition or retention relative to all of their other responsibilities? How do they allocate their time to this challenge? 2. What do principals in schools with low attrition rates do to promote retention that principals in high-attrition schools do not? What specific practices or interventions are principals in these two types of schools using to retain teachers? Is there evidence to support their use of these practices? The findings that emerged from the data revealed that the practices principals use to influence and support teachers do not differ among the four schools.

Relevance:

100.00%

Publisher:

Abstract:

Universities rely on Information Technology (IT) projects to support and enhance their core strategic objectives of teaching, research, and administration. The researcher's literature review found that the level of IT funding and resources in universities is not adequate to meet IT demands: universities receive more IT project requests than they can execute, so they must fund IT projects selectively. The objectives of IT projects in universities vary; a project that benefits teaching functions may not benefit administrative functions. As such, selecting an IT project is challenging for universities. To aid IT decision making, many universities in the United States of America (USA) have formed IT Governance (ITG) processes. ITG is an IT decision-making and accountability framework whose purpose is to align an organization's IT efforts with its strategic objectives, realize the value of IT investments, meet expected performance criteria, and manage risks and resources (Weill & Ross, 2004). ITG in universities is relatively new, and it is not well known how ITG processes are aiding nonprofit universities in selecting the right IT projects and managing the performance of those projects. This research adds to the body of knowledge regarding IT project selection under a governance structure, the maturity of IT projects, and IT project performance in nonprofit universities. A case study methodology was chosen for this exploratory research. Convenience sampling was used to choose cases from two large research universities with decentralized colleges and two small, centralized universities. Data were collected on nine IT projects from these four universities using interviews and university documents. The multi-case analysis was complemented by Qualitative Comparative Analysis (QCA) to systematically analyze how the IT conditions lead to an outcome (sketched below). This research found that IT projects were selected in a more informed manner in the centralized universities. ITG was more authoritative in the small centralized universities: ITG committees included the key decision makers, decision-making roles and responsibilities were better defined, and the frequency of ITG communication was higher. In the centralized universities, business units and colleges brought IT requests to the ITG committees, which in turn prioritized the requests and allocated funds and resources to the projects. ITG committee members in the centralized universities had a higher awareness of university-wide IT needs, and the IT projects tended to align with the strategic objectives. In contrast, the decentralized colleges and business units in the large universities were influential and often bypassed the ITG processes. The decentralized units often chose "pet" IT projects and executed them within a silo, without bringing them to the attention of the ITG committees. While these projects met departmental objectives, they did not always align with the university's strategic objectives.
IT project management maturity was found to be higher in projects executed by the centralized universities, where a full-time project manager with greater project management expertise was assigned to manage the project. An IT project executed under the guidance of a Project Management Office (PMO) exhibited higher project management maturity, as the PMO set the standards and controls for the project. IT projects managed in the decentralized colleges by a part-time project manager with less project management expertise exhibited lower project management maturity; these projects were often run by business or technical leads who lacked project management expertise. This research found that the higher the IT project management maturity, the better the project performance: projects with higher maturity had shorter delays, fewer missed requirements, and fewer IT system errors. Overall, the quality of IT decisions in a university can be improved by centralizing IT decision-making processes, and IT project management maturity can be improved by following project management methodologies. Stakeholder management and communication were found to be critical to the success of IT projects in the university. It is hoped that the findings from this research will help university leaders make strategic IT decisions and help universities' IT project managers make IT project decisions.
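As a rough illustration of the QCA step mentioned above, the following Python sketch builds a crisp-set truth table that maps binary condition configurations to an observed outcome. The condition names, projects, and values are hypothetical assumptions for illustration, not the dissertation's data.

from collections import defaultdict

# Conditions per IT project (crisp, 0/1): centralized governance (C),
# full-time project manager (F), PMO oversight (P).
# Outcome: high project performance (1) or not (0).
cases = {
    "proj1": ((1, 1, 1), 1),
    "proj2": ((1, 1, 0), 1),
    "proj3": ((1, 0, 0), 1),
    "proj4": ((0, 1, 0), 0),
    "proj5": ((0, 0, 0), 0),
    "proj6": ((0, 0, 0), 0),
}

# Group cases by condition configuration to form the truth table.
table = defaultdict(list)
for name, (conds, outcome) in cases.items():
    table[conds].append(outcome)

# Contradictory rows (mixed outcomes for one configuration) would have
# to be resolved before logical minimization.
print("C F P | outcomes")
for conds, outcomes in sorted(table.items(), reverse=True):
    print(" ".join(map(str, conds)), "|", outcomes)

In a full QCA, the consistent rows would then be logically minimized (for example, with the Quine-McCluskey procedure) to yield the simplest combinations of conditions associated with high project performance.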

Relevance:

50.00%

Publisher:

Abstract:

With the increasing complexity of today's software, the software development process is becoming highly time- and resource-consuming. The increasing number of software configurations, input parameters, usage scenarios, supported platforms, external dependencies, and versions plays an important role in expanding the costs of maintaining and repairing unforeseen software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, the same is not true of software testing and verification. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging that leverage advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Commonalities between these test case sequence covers are then extracted, processed, analyzed, and presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared among a number of test cases that fail for the same reason resemble the faulty execution path; hence, the search space for the faulty execution path can be narrowed by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers. Optimization techniques are devised to generate shorter and more logical sequence covers, and to select, from the set of all possible common subsequences, those with a high likelihood of containing the root cause. A hybrid static/dynamic analysis approach is designed to trace the common subsequences back from the failure to the root cause. A debugging tool is created to let developers use the approach and to integrate it with an existing Integrated Development Environment. The tool is also integrated with the environment's program editors so that developers can benefit from both the tool's suggestions and their source code counterparts. Finally, a comparison between the developed approach and state-of-the-art techniques shows that developers need to inspect only a small number of lines to find the root cause of a fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both algorithm running time and output subsequence length.
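As a minimal sketch of the core idea, assuming statement-level traces and a greedy pairwise fold (the dissertation's actual algorithm and its optimizations are not reproduced here), a candidate faulty path shared by all failing test cases could be extracted as follows:

def lcs(a, b):
    """Classic dynamic-programming longest common subsequence."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = (dp[i][j] + 1 if a[i] == b[j]
                                else max(dp[i][j + 1], dp[i + 1][j]))
    # Walk back through the table to reconstruct one LCS.
    out, i, j = [], m, n
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1])
            i, j = i - 1, j - 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return out[::-1]

def common_suspect(traces):
    """Fold pairwise LCS over all failing traces; a greedy
    approximation, since the exact multi-sequence LCS is NP-hard."""
    suspect = traces[0]
    for t in traces[1:]:
        suspect = lcs(suspect, t)
    return suspect

# Hypothetical statement-level traces from three failing test cases.
failing = [
    ["open", "read", "parse", "validate", "close"],
    ["open", "seek", "read", "parse", "close"],
    ["open", "read", "retry", "parse", "close"],
]
print(common_suspect(failing))  # ['open', 'read', 'parse', 'close']

The shared subsequence narrows the search space and serves as the starting point for the hybrid static/dynamic trace-back to the root cause described above.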