816 results for Academic path


Relevance: 20.00%

Abstract:

Tedd, L. A. (2005). E-books in academic libraries: an international overview. New Review of Academic Librarianship, 11(1), 57-79.

Relevance: 20.00%

Abstract:

Bonthron, Karen; Urquhart, Christine; Thomas, Rhian; Armstrong, Chris; Ellis, David; Everitt, Jean; Fenton, Roger; Lonsdale, Ray; McDermott, Elizabeth; Morris, Helen; Phillips, Rebecca; Spink, Sian; and Yeoman, Alison. (2003, June). Trends in use of electronic journals in higher education in the UK - views of academic staff and students. D-Lib Magazine, 9(6). Retrieved September 8, 2006 from http://www.dlib.org/dlib/june03/urquhart/06urquhart.html. Sponsorship: JISC.

Relevance: 20.00%

Abstract:

K. Rasmani and Q. Shen. Data-driven fuzzy rule generation and its application for student academic performance evaluation. Applied Intelligence, 25(3):305-319, 2006.

Relevance: 20.00%

Abstract:

Jackson, Richard (2007). 'Constructing Enemies: "Islamic Terrorism" in Political and Academic Discourse'. Government and Opposition, 42(3), pp. 394-426.

Relevance: 20.00%

Abstract:

Accurate measurement of network bandwidth is crucial for flexible Internet applications and protocols which actively manage and dynamically adapt to changing utilization of network resources. These applications must do so to perform tasks such as distributing and delivering high-bandwidth media, scheduling service requests and performing admission control. Extensive work has focused on two approaches to measuring bandwidth: measuring it hop-by-hop, and measuring it end-to-end along a path. Unfortunately, best-practice techniques for the former are inefficient and techniques for the latter are only able to observe bottlenecks visible at end-to-end scope. In this paper, we develop and simulate end-to-end probing methods which can measure bottleneck bandwidth along arbitrary, targeted subpaths of a path in the network, including subpaths shared by a set of flows. As another important contribution, we describe a number of practical applications which we foresee as standing to benefit from solutions to this problem, especially in emerging, flexible network architectures such as overlay networks, ad-hoc networks, peer-to-peer architectures and massively accessed content servers.
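The end-to-end technique this abstract builds on can be illustrated with the classic packet-pair idea: two back-to-back probes are spread apart by the narrowest link, so the receiver can recover the bottleneck capacity from their spacing. The sketch below is a minimal, hypothetical simulation of that baseline; it does not reproduce the paper's subpath probing methods, and the link capacities are illustrative.

```python
# Minimal sketch of packet-pair bottleneck bandwidth estimation.
# Link capacities and probe sizes are hypothetical.

def pair_dispersion(size_bits: float, capacities_bps: list) -> float:
    """Spacing of two back-to-back packets after traversing each link.

    A link of capacity c spaces the pair by at least size/c, so the
    narrowest link along the path dominates the final dispersion.
    """
    dispersion = 0.0
    for c in capacities_bps:
        dispersion = max(dispersion, size_bits / c)
    return dispersion

def bottleneck_estimate(size_bits: float, dispersion_s: float) -> float:
    """Receiver-side estimate: bandwidth = packet size / observed spacing."""
    return size_bits / dispersion_s

# A 1500-byte (12,000-bit) probe pair over a 100 Mb/s -> 10 Mb/s -> 50 Mb/s path:
path = [100e6, 10e6, 50e6]
d = pair_dispersion(12000, path)
estimate = bottleneck_estimate(12000, d)  # recovers the 10 Mb/s bottleneck
```

Note that, as the abstract observes, this end-to-end estimate only sees the bottleneck of the whole path; measuring a targeted subpath requires the more elaborate probing schemes the paper develops.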

Relevance: 20.00%

Abstract:

We study the problem of preprocessing a large graph so that point-to-point shortest-path queries can be answered very fast. Computing shortest paths is a well studied problem, but exact algorithms do not scale to huge graphs encountered on the web, social networks, and other applications. In this paper we focus on approximate methods for distance estimation, in particular using landmark-based distance indexing. This approach involves selecting a subset of nodes as landmarks and computing (offline) the distances from each node in the graph to those landmarks. At runtime, when the distance between a pair of nodes is needed, we can estimate it quickly by combining the precomputed distances of the two nodes to the landmarks. We prove that selecting the optimal set of landmarks is an NP-hard problem, and thus heuristic solutions need to be employed. Given a budget of memory for the index, which translates directly into a budget of landmarks, different landmark selection strategies can yield dramatically different results in terms of accuracy. A number of simple methods that scale well to large graphs are therefore developed and experimentally compared. The simplest methods choose central nodes of the graph, while the more elaborate ones select central nodes that are also far away from one another. The efficiency of the suggested techniques is tested experimentally using five different real world graphs with millions of edges; for a given accuracy, they require as much as 250 times less space than the current approach in the literature which considers selecting landmarks at random. Finally, we study applications of our method in two problems arising naturally in large-scale networks, namely, social search and community detection.
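The landmark scheme described above can be sketched in a few lines: an offline phase runs BFS from each landmark, and the online phase combines precomputed distances into a triangle-inequality upper bound. The graph and the landmark choice below are illustrative only; the paper's actual selection strategies (central nodes, mutually distant central nodes) are not reproduced.

```python
# Sketch of landmark-based distance estimation: offline BFS from each
# landmark, then d(u,v) is estimated as min over landmarks of d(u,l)+d(l,v).
from collections import deque

def bfs_distances(adj, source):
    """Unweighted shortest-path distances from `source` to all nodes."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def build_index(adj, landmarks):
    """Offline phase: distances from every node to each landmark."""
    return {l: bfs_distances(adj, l) for l in landmarks}

def estimate_distance(index, u, v):
    """Online phase: an upper bound on the true distance, exact whenever
    some landmark lies on a shortest u-v path."""
    return min(d[u] + d[v] for d in index.values())

# A 6-cycle 0-1-2-3-4-5-0 with landmarks 0 and 3 (hypothetical choice):
adj = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
index = build_index(adj, landmarks=[0, 3])
est = estimate_distance(index, 1, 5)  # true distance is 2, via node 0
```

The estimate is never below the true distance thanks to the triangle inequality, which is why landmark placement (not just count) drives accuracy, as the abstract's experiments show.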

Relevance: 20.00%

Abstract:

The history of higher learning in Cork can be traced from its late eighteenth-century origins to its present standing within the extended confines of the Neo-Gothic architecture of University College, Cork. This institution, founded in 1845, was the successor to, and the ultimate achievement of, its forerunner, the Royal Cork Institution. The opening in 1849 of the college, then known as Queen's College, Cork, brought about a change in the role of the Royal Cork Institution as a centre of education. Its ambition of being the 'Munster College' was subsumed by the Queen's College, even though it continued to function as a centre of learning into the latter half of the nineteenth century. At that time its co-habitant, the School of Design, received a new wing under the benevolent patronage of William Crawford, and the Royal Cork Institution ceased to exist as the centre for cultural, technical and scientific learning it had set out to be. The building it occupied is today known as the Crawford Municipal Art Gallery.

Relevance: 20.00%

Abstract:

Four librarians from Irish university libraries completed the U.K. Future Leaders Programme (FLP) in 2010. In this article they recount their experience and assess the effect of the programme on their professional practice and its value for their institutions. The programme is explored in the context of the Irish higher education environment, which is facing significant challenges due to the demise of the Celtic Tiger economy. A brief review of the literature relating to structured programmes that prepare librarians for senior positions is presented. The structure and content of the FLP, and the learning methodologies, theories, tools and techniques used throughout, are discussed. The article suggests that the programme has real value for both individuals and institutions and that it can play a significant role in succession planning and in the leadership development of librarians.

Relevance: 20.00%

Abstract:

Ireland experienced two critical junctures when its economic survival was threatened: 1958/9 and 1986/7. Common to both crises was the supplanting of long-established practices, which had become an integral part of the political culture of the state, by new ideas that ensured eventual economic recovery. In their adoption and implementation these ideas also fundamentally changed the institutions of state: how politics was done, how it was organised and regulated. The end result was the transformation of the Irish state. The main hypothesis of this thesis is that at those critical junctures the political and administrative elites who enabled economic recovery were not just making pragmatic decisions; their actions were influenced by ideas. Systematic content analysis of the published works of the main ideational actors, together with primary interviews with those actors still alive, reveals how their ideas were formed, what influenced them, and how they set about implementing their ideas. As the hypothesis assumes institutional change over time, historical institutionalism serves as the theoretical framework. Central to this theory is the idea that choices made when a policy is being initiated or an institution formed will have a continuing influence long into the future. Institutions of state become 'path dependent' and impervious to change; the forces of inertia take over. That path dependency is broken at critical junctures, moments at which ideas play a major role because they offer a set of ready-made solutions. Historical institutionalism serves as a robust framework for showing that, in the transformation of Ireland, the role of ideas in punctuating institutional path dependency at critical junctures was central.

Relevance: 20.00%

Abstract:

With the proliferation of mobile wireless communication and embedded systems, energy efficiency becomes a major design constraint. The dissipated energy is often expressed as the product of power dissipation and the input-output delay. Most electronic design automation techniques focus on optimising only one of these parameters, either power or delay. Industry-standard design flows integrate systematic methods of optimising either area or timing, while for power consumption optimisation one often employs heuristics that are specific to a particular design. In this work we answer three questions in our quest to provide a systematic approach to joint power and delay optimisation. The first question of our research is: How can we build a design flow which incorporates academic and industry-standard design flows for power optimisation? To address this question, we use a reference design flow provided by Synopsys and integrate into this flow academic tools and methodologies. The proposed design flow is used as a platform for analysing some novel algorithms and methodologies for optimisation in the context of digital circuits. The second question we answer is: Is it possible to apply a systematic approach to power optimisation in the context of combinational digital circuits? The starting point is the selection of a suitable data structure which can easily incorporate information about delay, power and area, and which then allows optimisation algorithms to be applied. In particular we address the implications of systematic power optimisation methodologies and the potential degradation of other (often conflicting) parameters such as area or the delay of the implementation. Finally, the third question which this thesis attempts to answer is: Is there a systematic approach to multi-objective optimisation of delay and power? A delay-driven power and power-driven delay optimisation is proposed in order to obtain balanced delay and power values.
This implies that each power optimisation step is constrained not only by the decrease in power but also by the increase in delay. Similarly, each delay optimisation step is governed not only by the decrease in delay but also by the increase in power. The goal is to obtain a multi-objective optimisation of digital circuits where the two conflicting objectives are power and delay. The logic synthesis and optimisation methodology is based on AND-Inverter Graphs (AIGs), which represent the functionality of the circuit. The switching activities and arrival times of circuit nodes are annotated onto an AND-Inverter Graph under both a zero-delay and a non-zero-delay model. We then introduce several reordering rules which are applied to the AIG nodes to minimise the switching power or the longest path delay of the circuit at the pre-technology-mapping level. The academic Electronic Design Automation (EDA) tool ABC is used for the manipulation of AND-Inverter Graphs. We have implemented various combinatorial optimisation algorithms often used in Electronic Design Automation, such as Simulated Annealing and Uniform Cost Search. Simulated Annealing (SA) is a probabilistic metaheuristic for the global optimisation problem of locating a good approximation to the global optimum of a given function in a large search space. We used SA to decide probabilistically between moving from one optimised solution to another, such that the dynamic power is optimised under given delay constraints and the delay is optimised under given power constraints. A good approximation to the global optimum of the energy-constrained problem is obtained. Uniform Cost Search (UCS) is an algorithm for traversing or searching a weighted tree or graph. We have used Uniform Cost Search to find, within the AIG network, a specific order of AIG nodes for the application of the reordering rules.
After the reordering rules have been applied, the AIG network is mapped to an AIG netlist using specific library cells. Our approach combines network restructuring, AIG node reordering, dynamic power and longest path delay estimation and optimisation, and finally technology mapping to an AIG netlist. A set of MCNC benchmark circuits and large combinational circuits of up to 100,000 gates have been used to validate our methodology. Comparisons for power and delay optimisation are made with the best synthesis scripts used in ABC. Reductions of 23% in power and 15% in delay with minimal overhead are achieved, compared to the best known ABC results. Our approach has also been applied to a number of processors with combinational and sequential components, and significant savings are achieved.
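The delay-constrained power optimisation loop described above can be sketched with simulated annealing over a toy model, where applying "reordering rule" i saves power p[i] but adds delay d[i], and only moves that keep the circuit within a delay budget are retained as best. The cost model, penalty weight, and parameters below are purely illustrative; the AIG manipulation performed in ABC is not reproduced.

```python
# Hypothetical sketch: simulated annealing minimising power subject to a
# delay constraint, with a penalty term for constraint violations.
import math
import random

def anneal_power(p_gain, d_cost, base_power, base_delay, delay_limit,
                 steps=2000, seed=1):
    rng = random.Random(seed)
    n = len(p_gain)

    def power(s):
        return base_power - sum(g * b for g, b in zip(p_gain, s))

    def delay(s):
        return base_delay + sum(c * b for c, b in zip(d_cost, s))

    x = [0] * n          # feasible start: no rules applied
    best = list(x)
    temp = 1.0
    for _ in range(steps):
        y = list(x)
        y[rng.randrange(n)] ^= 1          # flip one rule on/off
        # Energy: power, plus a penalty for exceeding the delay budget.
        d_e = (power(y) - power(x)
               + 10.0 * max(0.0, delay(y) - delay_limit))
        if d_e <= 0 or rng.random() < math.exp(-d_e / temp):
            x = y                          # accept improving/occasional worse moves
        if delay(x) <= delay_limit and power(x) < power(best):
            best = list(x)                 # track best feasible solution
        temp = max(1e-3, temp * 0.995)     # geometric cooling
    return best, power(best), delay(best)

# Four hypothetical rules; only subsets adding <= 4 units of delay are feasible.
best, p, d = anneal_power(p_gain=[5, 3, 8, 2], d_cost=[2, 1, 6, 1],
                          base_power=100, base_delay=10, delay_limit=14)
```

Swapping the roles of power and delay in the energy function gives the dual, power-constrained delay optimisation, which is the symmetry the thesis exploits for balanced results.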

Relevance: 20.00%

Abstract:

Wind energy is predominantly a nonsynchronous generation source. Large-scale integration of wind generation with existing electricity systems, therefore, presents challenges in maintaining system frequency stability and local voltage stability. Transmission system operators have implemented system operational constraints (SOCs) in order to maintain stability with high wind generation, but imposition of these constraints results in higher operating costs. A mixed integer programming tool was used to simulate generator dispatch in order to assess the impact of various SOCs on generation costs. Interleaved day-ahead scheduling and real-time dispatch models were developed to allow accurate representation of forced outages and wind forecast errors, and were applied to the proposed Irish power system of 2020 with a wind penetration of 32%. Savings of at least 7.8% in generation costs and reductions in wind curtailment of 50% were identified when the most influential SOCs were relaxed. The results also illustrate the need to relax local SOCs together with the system-wide nonsynchronous penetration limit SOC, as savings from increasing the nonsynchronous limit beyond 70% were restricted without relaxation of local SOCs. The methodology and results allow for quantification of the costs of SOCs, allowing the optimal upgrade path for generation and transmission infrastructure to be determined.
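The system nonsynchronous penetration (SNSP) limit discussed above can be illustrated as a single-period dispatch rule: wind output is capped at a fraction of demand, and the excess is curtailed. The figures below are illustrative only; the study's full mixed integer scheduling model, forced outages, and forecast errors are not reproduced.

```python
# Toy sketch of a nonsynchronous penetration limit as a dispatch constraint.
# All demand/wind figures are hypothetical.

def dispatch_period(demand_mw, wind_available_mw, snsp_limit):
    """Single-period dispatch under a nonsynchronous penetration limit."""
    wind_mw = min(wind_available_mw, snsp_limit * demand_mw)
    curtailed_mw = wind_available_mw - wind_mw
    conventional_mw = demand_mw - wind_mw
    return wind_mw, curtailed_mw, conventional_mw

# Relaxing the limit from 50% to 70% cuts curtailment in this toy period:
tight = dispatch_period(5000, 4000, 0.50)    # more wind spilled
relaxed = dispatch_period(5000, 4000, 0.70)  # less wind spilled
```

Even this toy shows the abstract's qualitative point: a binding SNSP limit forces curtailment and shifts output to conventional plant, so relaxing it reduces both curtailment and generation cost.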

Relevance: 20.00%

Abstract:

A report from the inaugural CONUL (Consortium of National & University Libraries) conference, held in the Radisson Blu Hotel, Athlone, June 3rd and 4th, 2015.

Relevance: 20.00%

Abstract:

Gemstone Team HOPE (Hospital Optimal Productivity Enterprise)

Relevance: 20.00%

Abstract:

© 2013 American Psychological Association. This meta-analysis synthesizes research on the effectiveness of intelligent tutoring systems (ITS) for college students. Thirty-five reports were found containing 39 studies assessing the effectiveness of 22 types of ITS in higher education settings. Most frequently studied were AutoTutor, Assessment and Learning in Knowledge Spaces, eXtended Tutor-Expert System, and Web Interface for Statistics Education. Major findings include: (a) overall, ITS had a moderate positive effect on college students' academic learning (g = .32 to g = .37); (b) ITS were less effective than human tutoring, but they outperformed all other instruction methods and learning activities, including traditional classroom instruction, reading printed text or computerized materials, computer-assisted instruction, laboratory or homework assignments, and no-treatment controls; (c) the effectiveness of ITS did not differ significantly by type of ITS, subject domain, or the manner or degree of their involvement in instruction and learning; and (d) effectiveness in earlier studies appeared to be significantly greater than in more recent studies. In addition, there is some evidence suggesting the importance of teachers and pedagogy in ITS-assisted learning.
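Effect sizes like the g = .32 to .37 reported above are standardised mean differences with a small-sample correction (Hedges' g). The sketch below shows the standard computation; the group statistics used are hypothetical, not data from the synthesized studies.

```python
# Sketch of Hedges' g: Cohen's d with the small-sample correction factor J.
# Group means, SDs, and sizes below are hypothetical.
import math

def hedges_g(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Bias-corrected standardised mean difference between two groups."""
    df = n_t + n_c - 2
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / df)
    d = (mean_t - mean_c) / pooled_sd
    j = 1 - 3 / (4 * df - 1)  # corrects the upward bias of d in small samples
    return j * d

# An ITS group scoring half a pooled standard deviation above control:
g = hedges_g(mean_t=10.0, sd_t=2.0, n_t=50, mean_c=9.0, sd_c=2.0, n_c=50)
```

A g in the .32-.37 range therefore means ITS students scored roughly a third of a pooled standard deviation above comparison students, conventionally a moderate effect.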