881 results for large course design


Relevance:

30.00%

Publisher:

Abstract:

As faculty needs evolve and become increasingly digital, libraries are feeling the pressure to provide relevant new services. At the same time, faculty members are struggling to create and maintain their professional reputations online. We at bepress are happy to announce the new SelectedWorks, the fully hosted, library-curated faculty profile platform that positions the library to better support faculty as well as the institution at large. Beverly Lysobey, Digital Commons and Resource Management Librarian at Sacred Heart University, says: “Both faculty and administration have been impressed with the services we provide through SelectedWorks; we’re able to show how much our faculty really publishes, and it’s great for professors to get that recognition. We’ve had several faculty members approach us for help making sure their record was complete when they were up for tenure, and we’ve even found articles that authors themselves no longer had access to.” With consistent, organized, institution-branded profiles, SelectedWorks increases campus-wide exposure and supports the research mission of the university. As the only profile platform integrated with the fully hosted Digital Commons suite of publishing and repository services, it also ensures that the institution retains management of its content. Powerful integration with the Digital Commons platform lets the home institution more fully capture the range of scholarship produced on campus, and hosted services facilitate resource consolidation and reduce strain on IT. The new SelectedWorks features a modern, streamlined design that provides compelling display options for the full range of faculty work. It beautifully showcases streaming media, images, data, teaching materials, books – any type of content that researchers now produce as part of their scholarship.
Detailed analytics tools let authors and librarians measure global readership and track impact for a variety of campus stakeholders: authors can see the universities, agencies, and businesses that are reading their work, and can easily export reports to use in tenure and promotion dossiers. Janelle Wertzbeger, Assistant Dean and Director of Scholarly Communications at Gettysburg College’s Musselman Library, says, “The new author dashboard maps and enhanced readership are SO GOOD. Every professor up for promotion & tenure should use them!” And of course, SelectedWorks is fully backed by the continual efforts of the bepress development team to provide maximum discoverability to search engines, increasing impact for faculty and institutions alike: Reverend Edward R. Udovic, Vice President for Teaching and Learning Resources at DePaul University, says, “In the last several months downloads of my scholarship from my [SelectedWorks] site have far surpassed the total distribution of all my work in the previous twenty five years.”

Relevance:

30.00%

Publisher:

Abstract:

In today's fast-paced and interconnected digital world, the data generated by an increasing number of applications is being modeled as dynamic graphs. The graph structure encodes relationships among data items, while the structural changes to the graphs as well as the continuous stream of information produced by the entities in these graphs make them dynamic in nature. Examples include social networks where users post status updates, images, videos, etc.; phone call networks where nodes may send text messages or place phone calls; road traffic networks where the traffic behavior of the road segments changes constantly, and so on. There is tremendous value in storing, managing, and analyzing such dynamic graphs and deriving meaningful insights in real-time. However, a majority of the work in graph analytics assumes a static setting, and there is a lack of systematic study of the various dynamic scenarios, the complexity they impose on the analysis tasks, and the challenges in building efficient systems that can support such tasks at a large scale. In this dissertation, I design a unified streaming graph data management framework, and develop prototype systems to support increasingly complex tasks on dynamic graphs. In the first part, I focus on the management and querying of distributed graph data. I develop a hybrid replication policy that monitors the read-write frequencies of the nodes to decide dynamically what data to replicate, and whether to do eager or lazy replication in order to minimize network communication and support low-latency querying. In the second part, I study parallel execution of continuous neighborhood-driven aggregates, where each node aggregates the information generated in its neighborhoods.
I build my system around the notion of an aggregation overlay graph, a pre-compiled data structure that enables sharing of partial aggregates across different queries, and also allows partial pre-computation of the aggregates to minimize the query latencies and increase throughput. Finally, I extend the framework to support continuous detection and analysis of activity-based subgraphs, where subgraphs could be specified using both graph structure as well as activity conditions on the nodes. The query specification tasks in my system are expressed using a set of active structural primitives, which allows the query evaluator to use a set of novel optimization techniques, thereby achieving high throughput. Overall, in this dissertation, I define and investigate a set of novel tasks on dynamic graphs, design scalable optimization techniques, build prototype systems, and show the effectiveness of the proposed techniques through extensive evaluation using large-scale real and synthetic datasets.
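The neighborhood-driven aggregates described above can be illustrated with a minimal sketch. This is not the dissertation's actual system: the class and method names are invented, and the overlay-based sharing of partial aggregates across queries is omitted. Each node simply keeps an incrementally updated count and sum over the values posted in its 1-hop neighborhood.

```python
from collections import defaultdict

class NeighborhoodAggregator:
    """Toy continuous neighborhood aggregate: every node keeps a running
    count and sum of the values posted in its 1-hop neighborhood
    (including itself)."""

    def __init__(self):
        self.adj = defaultdict(set)               # node -> set of neighbors
        self.agg = defaultdict(lambda: [0, 0.0])  # node -> [count, sum]

    def add_edge(self, u, v):
        self.adj[u].add(v)
        self.adj[v].add(u)

    def post(self, node, value):
        # A node emits a value; push it into the aggregate of every node
        # whose neighborhood contains the emitter.
        for target in self.adj[node] | {node}:
            self.agg[target][0] += 1
            self.agg[target][1] += value

    def average(self, node):
        count, total = self.agg[node]
        return total / count if count else 0.0
```

For example, with edges a–b and b–c, posting 4.0 at a and 2.0 at c leaves b with a neighborhood average of 3.0; the incremental update on each event is what makes the continuous query cheap.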

Relevance:

30.00%

Publisher:

Abstract:

In the current Cambodian higher education sector, there is little regulation of standards in curriculum design of undergraduate degrees in English language teacher education. The researcher, in the course of his professional work in the Curriculum and Policy Office at the Department of Higher Education, has seen evidence that most universities tend to copy their curriculum from one source, the curriculum of the Institute of Foreign Languages, the Royal University of Phnom Penh. Their programs fail to impose any entry standards, accepting students who pass the high school exam without any entrance examination. It is possible for a student to enter university with satisfactory scores in all subjects but English. Therefore, not many graduates are able to fulfil the professional requirements of the roles they are supposed to take. Neau (2010) claims that many Cambodian EFL teachers do not reach a high performance standard due to their low English language proficiency and poor background in teacher education. The main purpose of this study is to establish key guidelines for developing curricula for English language teacher education for all the universities across the country. It examines the content of the Bachelor's degree of Education in Teaching English as a Foreign Language (B Ed in TEFL) and Bachelor's degree of Arts in Teaching English to Speakers of Other Languages (BA in TESOL) curricula adopted in Cambodian universities on the basis of criteria proposed in current curriculum research. It also investigates the perspectives of Cambodian EFL teachers on the areas of knowledge and skill they need in order to perform their English teaching duties in Cambodia today.
The areas of knowledge and skill offered in the current curricula at Cambodian higher education institutions (HEIs), the framework of the knowledge base for EFL teacher education and general higher education, and the areas of knowledge and skill Cambodian EFL teachers perceive to be important, are compared so as to identify any gaps in the current English language teacher education curricula in the Cambodian HEIs. The existence of gaps shows what domains of knowledge and skill need to be included in the English language teacher education curricula at Cambodian HEIs. These domains are those identified by previous curriculum researchers in both general and English language teacher education at tertiary level. Therefore, the present study provides useful insights into the importance of including appropriate content in English language teacher education curricula. Mixed methods are employed in this study. The course syllabi and the descriptions within the curricula in five Cambodian HEIs are analysed qualitatively based on the framework of knowledge and skills for EFL teachers, which is formed by looking at the knowledge base for second language teachers suggested by the methodologists and curriculum specialists whose work is elaborated on in the review of literature. A quantitative method is applied to analyse the perspectives of 120 Cambodian EFL teachers on areas of knowledge and skills they should possess. The fieldwork was conducted between June and August, 2014. The analysis reveals that the following areas are included in the curricula at the five universities: communication skills, general knowledge, knowledge of teaching theories, teaching skills, pedagogical reasoning and decision making skills, subject matter knowledge, contextual knowledge, cognitive abilities, and knowledge of social issues. Additionally, research skills are included in three curricula while society and community involvement is in only one.
Further, information and communication technology, which is outlined in the Education Strategies Plan (2006-2010), forms part of four curricula while leadership skills form part of two. This study demonstrates ultimately that most domains that are directly and indirectly related to language teaching competence are not sufficiently represented in the current curricula. On the basis of its findings, the study concludes with a set of guidelines that should inform the design and development of TESOL and TEFL curricula in Cambodia.

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents approximation algorithms for some NP-Hard combinatorial optimization problems on graphs and networks; in particular, we study problems related to Network Design. Under the widely-believed complexity-theoretic assumption that P is not equal to NP, there are no efficient (i.e., polynomial-time) algorithms that solve these problems exactly. Hence, if one desires efficient algorithms for such problems, it is necessary to consider approximate solutions: An approximation algorithm for an NP-Hard problem is a polynomial-time algorithm which, for any instance of the problem, finds a solution whose value is guaranteed to be within a multiplicative factor of the value of an optimal solution to that instance. We attempt to design algorithms for which this factor, referred to as the approximation ratio of the algorithm, is as small as possible. The field of Network Design comprises a large class of problems that deal with constructing networks of low cost and/or high capacity, routing data through existing networks, and many related issues. In this thesis, we focus chiefly on designing fault-tolerant networks. Two vertices u,v in a network are said to be k-edge-connected if deleting any set of k − 1 edges leaves u and v connected; similarly, they are k-vertex-connected if deleting any set of k − 1 other vertices or edges leaves u and v connected. We focus on building networks that are highly connected, meaning that even if a small number of edges and nodes fail, the remaining nodes will still be able to communicate. A brief description of some of our results is given below. We study the problem of building 2-vertex-connected networks that are large and have low cost. Given an n-node graph with costs on its edges and any integer k, we give an O(log n log k) approximation for the problem of finding a minimum-cost 2-vertex-connected subgraph containing at least k nodes.
We also give an algorithm of similar approximation ratio for maximizing the number of nodes in a 2-vertex-connected subgraph subject to a budget constraint on the total cost of its edges. Our algorithms are based on a pruning process that, given a 2-vertex-connected graph, finds a 2-vertex-connected subgraph of any desired size and of density comparable to the input graph, where the density of a graph is the ratio of its cost to the number of vertices it contains. This pruning algorithm is simple and efficient, and is likely to find additional applications. Recent breakthroughs on vertex-connectivity have made use of algorithms for element-connectivity problems. We develop an algorithm that, given a graph with some vertices marked as terminals, significantly simplifies the graph while preserving the pairwise element-connectivity of all terminals; in fact, the resulting graph is bipartite. We believe that our simplification/reduction algorithm will be a useful tool in many settings. We illustrate its applicability by giving algorithms to find many trees that each span a given terminal set, while being disjoint on edges and non-terminal vertices; such problems have applications in VLSI design and other areas. We also use this reduction algorithm to analyze simple algorithms for single-sink network design problems with high vertex-connectivity requirements; we give an O(k log n)-approximation for the problem of k-connecting a given set of terminals to a common sink. We study similar problems in which different types of links, of varying capacities and costs, can be used to connect nodes; assuming there are economies of scale, we give algorithms to construct low-cost networks with sufficient capacity or bandwidth to simultaneously support flow from each terminal to the common sink along many vertex-disjoint paths. We further investigate capacitated network design, where edges may have arbitrary costs and capacities. 
Given a connectivity requirement R_uv for each pair of vertices u,v, the goal is to find a low-cost network which, for each uv, can support a flow of R_uv units of traffic between u and v. We study several special cases of this problem, giving both algorithmic and hardness results. In addition to Network Design, we consider certain Traveling Salesperson-like problems, where the goal is to find short walks that visit many distinct vertices. We give a (2 + epsilon)-approximation for Orienteering in undirected graphs, achieving the best known approximation ratio, and the first approximation algorithm for Orienteering in directed graphs. We also give improved algorithms for Orienteering with time windows, in which vertices must be visited between specified release times and deadlines, and other related problems. These problems are motivated by applications in the fields of vehicle routing, delivery and transportation of goods, and robot path planning.
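As a small illustration of the connectivity notion underlying the fault-tolerance results above: a graph is 2-vertex-connected exactly when it is connected, has at least three vertices, and has no articulation point (a vertex whose removal disconnects it). A standard articulation-point DFS can test this; the sketch below is not from the thesis and assumes a simple undirected graph on nodes 0..n-1.

```python
def is_two_vertex_connected(n, edges):
    """True iff the simple undirected graph on nodes 0..n-1 is
    2-vertex-connected: connected, n >= 3, no articulation point."""
    if n < 3:
        return False
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    disc = [0] * n              # DFS discovery times (0 = unvisited)
    low = [0] * n               # lowest discovery time reachable via back edges
    timer = [1]
    has_articulation = [False]

    def dfs(u, parent):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        children = 0
        for v in adj[u]:
            if v == parent:
                continue
            if disc[v]:                      # back edge
                low[u] = min(low[u], disc[v])
            else:                            # tree edge
                children += 1
                dfs(v, u)
                low[u] = min(low[u], low[v])
                if parent != -1 and low[v] >= disc[u]:
                    has_articulation[0] = True
        if parent == -1 and children > 1:    # root rule
            has_articulation[0] = True

    dfs(0, -1)
    return all(disc) and not has_articulation[0]
```

For example, a 4-cycle passes the test, while a path fails (its middle vertex is an articulation point).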

Relevance:

30.00%

Publisher:

Abstract:

The structured representation of cases by attribute graphs in a Case-Based Reasoning (CBR) system for course timetabling has been the subject of previous research by the authors. In that system, the case base is organised as a decision tree and the retrieval process chooses those cases which are sub attribute graph isomorphic to the new case. The drawback of that approach is that it is not suitable for solving large problems. This paper presents a multiple-retrieval approach that partitions a large problem into small solvable sub-problems by recursively inputting the unsolved part of the graph into the decision tree for retrieval. The adaptation combines the retrieved partial solutions of all the partitioned sub-problems and employs a graph heuristic method to construct the whole solution for the new case. We present a methodology which is not dependent upon problem-specific information and which, as such, represents an approach which underpins the goal of building more general timetabling systems. We also explore the question of whether this multiple-retrieval CBR could be an effective initialisation method for local search methods such as Hill Climbing, Tabu Search and Simulated Annealing. Significant results are obtained from a wide range of experiments. An evaluation of the CBR system is presented and the impact of the approach on timetabling research is discussed. We see that the approach does indeed represent an effective initialisation method for these local search methods.
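The multiple-retrieval loop described above can be sketched as a control skeleton. This is an illustrative reconstruction, not the authors' code: `retrieve` stands in for the decision-tree retrieval over attribute graphs, and `adapt` for the graph-heuristic step that combines the partial solutions.

```python
def multiple_retrieval(events, retrieve, adapt):
    """Skeleton of the multiple-retrieval idea: repeatedly feed the
    still-unsolved part of the problem back into the case base, collect
    the retrieved partial solutions, then combine them."""
    unsolved = set(events)
    partials = []
    while unsolved:
        covered, partial = retrieve(unsolved)
        if not covered:          # the case base cannot cover any more
            break
        partials.append(partial)
        unsolved -= covered      # recurse on the remainder
    return adapt(partials), unsolved

# Toy stand-ins: "retrieve" handles at most two events per call and
# assigns each a nominal slot; "adapt" merges the partial timetables.
def toy_retrieve(events):
    chunk = sorted(events)[:2]
    return set(chunk), {e: "slot-" + e for e in chunk}

def toy_adapt(partials):
    merged = {}
    for partial in partials:
        merged.update(partial)
    return merged
```

With three events and the toy retrieval above, the loop covers everything in two retrievals and leaves nothing unsolved.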

Relevance:

30.00%

Publisher:

Abstract:

An earlier Case-based Reasoning (CBR) approach developed by the authors for educational course timetabling problems employed structured cases to represent the complex relationships between courses. Previous solved cases represented by attribute graphs were organized hierarchically into a decision tree. The retrieval searches for graph isomorphism among these attribute graphs. In this paper, the approach is further developed to solve a wider range of problems. We also attempt to retrieve those graphs that have common similar structures but also have some differences. Costs assigned to these differences feed into the similarity measure. A large number of experiments are performed consisting of different randomly produced timetabling problems and the results presented here strongly indicate that a CBR approach could provide a significant step forward in the development of automated systems to solve difficult timetabling problems. They show that with relatively little effort, we can retrieve these structurally similar cases to provide high quality timetables for new timetabling problems.

Relevance:

30.00%

Publisher:

Abstract:

Peer-to-peer information sharing has fundamentally changed the customer decision-making process. Recent developments in information technologies have enabled digital sharing platforms to influence various granular aspects of the information sharing process. Despite the growing importance of digital information sharing, little research has examined the optimal design choices for a platform seeking to maximize returns from information sharing. My dissertation seeks to fill this gap. Specifically, I study novel interventions that can be implemented by the platform at different stages of the information sharing process. In collaboration with a leading for-profit platform and a non-profit platform, I conduct three large-scale field experiments to causally identify the impact of these interventions on customers’ sharing behaviors as well as the sharing outcomes. The first essay examines whether and how a firm can enhance social contagion by simply varying the message shared by customers with their friends. Using a large randomized field experiment, I find that i) adding only information about the sender’s purchase status increases the likelihood of recipients’ purchase; ii) adding only information about referral reward increases recipients’ follow-up referrals; and iii) adding information about both the sender’s purchase as well as the referral rewards increases neither the likelihood of purchase nor follow-up referrals. I then discuss the underlying mechanisms. The second essay studies whether and how a firm can design unconditional incentives to engage customers who have already revealed a willingness to share. I conduct a field experiment to examine the impact of incentive design on sender’s purchase as well as further referral behavior. I find evidence that incentive structure has a significant, but interestingly opposing, impact on both outcomes. The results also provide insights about senders’ motives in sharing.
The third essay examines whether and how a non-profit platform can use mobile messaging to leverage recipients’ social ties to encourage blood donation. I design a large field experiment to causally identify the impact of different types of information and incentives on donor’s self-donation and group donation behavior. My results show that non-profits can stimulate a group effect and increase blood donation, but only with a group reward. Such a group reward works by motivating a different donor population. In summary, the findings from the three studies will offer valuable insights for platforms and social enterprises on how to engineer digital platforms to create social contagion. The rich data from randomized experiments and complementary sources (archive and survey) also allows me to test the underlying mechanism at work. In this way, my dissertation provides both managerial implications and theoretical contributions to the phenomenon of peer-to-peer information sharing.


Relevance:

30.00%

Publisher:

Abstract:

The second generation of large scale interferometric gravitational wave (GW) detectors will be limited by quantum noise over a wide frequency range in their detection band. Further sensitivity improvements for future upgrades or new detectors beyond the second generation motivate the development of measurement schemes to mitigate the impact of quantum noise in these instruments. Two strands of development are being pursued to reach this goal, focusing both on modifications of the well-established Michelson detector configuration and development of different detector topologies. In this paper, we present the design of the world's first Sagnac speed meter (SSM) interferometer, which is currently being constructed at the University of Glasgow. With this proof-of-principle experiment we aim to demonstrate the theoretically predicted lower quantum noise in a Sagnac interferometer compared to an equivalent Michelson interferometer, to qualify SSM for further research towards an implementation in a future-generation large-scale GW detector, such as the planned Einstein Telescope observatory.

Relevance:

30.00%

Publisher:

Abstract:

Internship report submitted for the degree of Master in Education and Multimedia Communication

Relevance:

30.00%

Publisher:

Abstract:

The prediction of convective heat transfer in enclosures under high ventilative flow rates is primarily of interest for building design and simulation purposes. Current models are based on experiments performed forty years ago with flat plates under natural convection conditions.

Relevance:

30.00%

Publisher:

Abstract:

Chapter 1: Under the average common value function, we select almost uniquely the mechanism that gives the seller the largest portion of the true value in the worst situation among all the direct mechanisms that are feasible, ex-post implementable and individually rational. Chapter 2: Strategy-proof, budget-balanced, anonymous, envy-free linear mechanisms assign p identical objects to n agents. The efficiency loss is the largest ratio of surplus loss to efficient surplus, over all profiles of non-negative valuations. The smallest efficiency loss is uniquely achieved by the following simple allocation rule: assign one object to each of the p−1 agents with the highest valuations, a large probability to the agent with the pth highest valuation, and the remaining probability to the agent with the (p+1)th highest valuation. When “envy freeness” is replaced by the weaker condition “voluntary participation”, the optimal mechanism differs only when p is much less than n. Chapter 3: One group is to be selected among a set of agents. Agents have preferences over the size of the group if they are selected, and preferences over size as well as the “stand-outside” option are single-peaked. We take a mechanism design approach and search for group selection mechanisms that are efficient, strategy-proof and individually rational. Two classes of such mechanisms are presented. The proposing mechanism allows agents to either maintain or shrink the group size following a fixed priority, and is characterized by group strategy-proofness. The voting mechanism enlarges the group size in each voting round, and achieves at least half of the maximum group size compatible with individual rationality.
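The Chapter 2 allocation rule can be sketched directly. In the chapter the "large probability" given to the pth-highest agent is pinned down by the mechanism; here it is left as a free parameter q, so this is purely illustrative (function name and signature are invented):

```python
def optimal_allocation(valuations, p, q):
    """Expected allocation of p identical objects under the rule described
    above: each of the p-1 highest-valuation agents receives an object for
    sure, the p-th highest receives one with (large) probability q, and the
    (p+1)-th highest with the remaining probability 1 - q."""
    order = sorted(range(len(valuations)), key=lambda i: -valuations[i])
    probs = [0.0] * len(valuations)
    for rank, agent in enumerate(order):
        if rank < p - 1:
            probs[agent] = 1.0       # top p-1 agents: object for sure
        elif rank == p - 1:
            probs[agent] = q         # p-th highest: large probability
        elif rank == p:
            probs[agent] = 1.0 - q   # (p+1)-th highest: the remainder
    return probs
```

The expected number of objects assigned is (p−1) + q + (1−q) = p, so the rule is feasible in expectation regardless of q.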

Relevance:

30.00%

Publisher:

Abstract:

The aim of this thesis is to review and augment the theory and methods of optimal experimental design. In Chapter 1 the scene is set by considering the possible aims of an experimenter prior to an experiment, the statistical methods one might use to achieve those aims and how experimental design might aid this procedure. It is indicated that, given a criterion for design, a priori optimal design will only be possible in certain instances and, otherwise, some form of sequential procedure would seem to be indicated. In Chapter 2 an exact experimental design problem is formulated mathematically and is compared with its continuous analogue. Motivation is provided for the solution of this continuous problem, and the remainder of the chapter concerns this problem. A necessary and sufficient condition for optimality of a design measure is given. Problems which might arise in testing this condition are discussed, in particular with respect to possible non-differentiability of the criterion function at the design being tested. Several examples are given of optimal designs which may be found analytically and which illustrate the points discussed earlier in the chapter. In Chapter 3 numerical methods of solution of the continuous optimal design problem are reviewed. A new algorithm is presented with illustrations of how it should be used in practice. It is shown that, for reasonably large sample size, continuously optimal designs may be approximated well by an exact design. In situations where this is not satisfactory algorithms for improvement of this design are reviewed. Chapter 4 consists of a discussion of sequentially designed experiments, with regard to both the philosophies underlying, and the application of the methods of, statistical inference. In Chapter 5 we constructively criticise previous suggestions for fully sequential design procedures. Alternative suggestions are made along with conjectures as to how these might improve performance.
Chapter 6 presents a simulation study, the aim of which is to investigate the conjectures of Chapter 5. The results of this study provide empirical support for these conjectures. In Chapter 7 examples are analysed. These suggest aids to sequential experimentation by means of reduction of the dimension of the design space and the possibility of experimenting semi-sequentially. Further examples are considered which stress the importance of the use of prior information in situations of this type. Finally we consider the design of experiments when semi-sequential experimentation is mandatory because of the necessity of taking batches of observations at the same time. In Chapter 8 we look at some of the assumptions which have been made and indicate what may go wrong where these assumptions no longer hold.
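One well-known instance of a necessary and sufficient optimality condition of the kind discussed above is, for the D-criterion, the Kiefer–Wolfowitz equivalence theorem: a design measure is D-optimal iff its standardised variance d(x, xi) = f(x)' M(xi)^{-1} f(x) never exceeds the number of model parameters m over the design region. The sketch below checks this numerically for simple linear regression on [-1, 1], whose D-optimal design puts weight 1/2 on each endpoint; it is an illustration under those assumptions, not an excerpt from the thesis.

```python
# Equivalence-theorem check for the D-criterion, for the model
# f(x) = (1, x) on the design region [-1, 1] (m = 2 parameters).

def info_matrix(points, weights):
    """M(xi) = sum_i w_i f(x_i) f(x_i)^T for f(x) = (1, x)."""
    m00 = sum(weights)
    m01 = sum(w * x for x, w in zip(points, weights))
    m11 = sum(w * x * x for x, w in zip(points, weights))
    return [[m00, m01], [m01, m11]]

def variance_function(x, M):
    """Standardised variance d(x, xi) = f(x)^T M^{-1} f(x), 2x2 inverse by hand."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    inv = [[ M[1][1] / det, -M[0][1] / det],
           [-M[1][0] / det,  M[0][0] / det]]
    f = (1.0, x)
    return sum(f[i] * inv[i][j] * f[j] for i in range(2) for j in range(2))

# Candidate design: weight 1/2 at each of -1 and +1.
M = info_matrix([-1.0, 1.0], [0.5, 0.5])
grid = [i / 100.0 - 1.0 for i in range(201)]            # [-1, 1]
max_d = max(variance_function(x, M) for x in grid)      # attains m = 2
```

Here d(x) = 1 + x^2, which peaks at exactly m = 2 at the support points, confirming D-optimality of the two-point design for this model.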

Relevance:

30.00%

Publisher:

Abstract:

The focus of this research is to explore the applications of the finite difference formulation based on the latency insertion method (LIM) to the analysis of circuit interconnects. Special attention is devoted to addressing the issues that arise in very large networks such as on-chip signal and power distribution networks. We demonstrate that the LIM has the power and flexibility to handle various types of analysis required at different stages of circuit design. The LIM is particularly suitable for simulations of very large scale linear networks and can significantly outperform conventional circuit solvers (such as SPICE).
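The flavour of the finite-difference leapfrog updates behind the LIM can be conveyed with a lossless LC ladder: branch currents and node voltages are advanced in alternation, much as in FDTD. This is a simplified sketch under stated assumptions (no resistive or conductance terms, node 0 clamped to the source), not the full LIM formulation from the research above.

```python
def lim_step(V, I, dt, L, C, vsrc):
    """One leapfrog step in the spirit of the latency insertion method:
    branch currents are advanced from node-voltage differences, then node
    voltages from branch-current differences.  Lossless LC ladder with
    node 0 held at the source voltage and an open circuit at the far end."""
    n = len(V)
    for k in range(n - 1):                 # branch k joins nodes k and k+1
        I[k] += (dt / L) * (V[k] - V[k + 1])
    for k in range(1, n):                  # node update; node 0 is the source
        out = I[k] if k < n - 1 else 0.0   # open circuit at the far end
        V[k] += (dt / C) * (I[k - 1] - out)
    V[0] = vsrc
    return V, I

# Drive a 5-node line with a unit step; with dt = sqrt(L*C) the discrete
# wavefront advances exactly one cell per step and doubles at the open end.
V = [1.0] + [0.0] * 4
I = [0.0] * 4
for _ in range(8):
    V, I = lim_step(V, I, dt=1.0, L=1.0, C=1.0, vsrc=1.0)
```

Because every branch carries a latency element, each update touches only a node's immediate neighbours, which is what lets LIM-style solvers scale to very large networks without assembling or factoring a global system matrix.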

Relevance:

30.00%

Publisher:

Abstract:

In 1620, over the course of 66 days, 102 passengers called the Mayflower their home before arriving and settling in Plymouth, New England. In the years following the Louisiana Purchase of 1803, nearly 7 million people traversed extreme wilderness in covered wagons to found and settle the American West. This year, 2015, the first spaceport has opened in anticipation of suborbital space flights in 2017 and manned settlement flights to Mars by 2026. This thesis explores the questions: In this next phase of human exploration and settlement, what does it mean to dwell beyond Earth? What are the current architectural limitations regarding structure and material sustainability? And how can architecture elevate the traditionally sterile environments of survival shelters to that of permanent dwellings?