881 results for Minimization of open stack problem
Abstract:
This article concerns how higher education institutions across the United Kingdom are implementing systems and workflows in order to meet open access requirements for the next Research Excellence Framework. The way that institutions are preparing is not uniform, although there are key areas which require attention: cost management, advocacy, systems and metadata, structural workflows, and internal policy. Examples of preparative work in these areas are taken from institutions that have participated in the Open Access Good Practice initiative supported by Jisc.
Abstract:
As an emerging innovation paradigm gaining momentum in recent years, the open innovation paradigm calls for greater theoretical depth and more empirical research. This dissertation proposes that open innovation in the context of open source software sponsorship may be viewed as a knowledge strategy of the firm. Hence, this dissertation examines the performance determinants of open innovation through the lens of knowledge-based perspectives. Using event study and regression methodologies, this dissertation finds that open source software sponsorship events can indeed boost the stock market performance of US public firms. In addition, both the knowledge capabilities of the firms and the knowledge profiles of the open source projects they sponsor matter for performance. In terms of firm knowledge capabilities, internet service firms perform better than other firms owing to their advantageous complementary capabilities. Also, strong knowledge exploitation capabilities of the firm are positively associated with performance. In terms of the knowledge profile of sponsored projects, platform projects perform better than component projects. Also, community-originated projects outperform firm-originated projects. Finally, based on these findings, this dissertation discusses the important theoretical implications for the strategic tradeoff between knowledge protection and sharing.
Abstract:
This chapter discusses the consequences of open-access (OA) publishing and dissemination for libraries in higher education institutions (HEIs). Key questions (which are addressed in this chapter) include: 1. How might OA help information provision? 2. What changes to library services will arise from OA developments (particularly if OA becomes widespread)? 3. How do these changes fit in with wider changes affecting the future role of libraries? 4. How can libraries and librarians help to address key practical issues associated with the implementation of OA (particularly transition issues)? This chapter will look at OA from the perspective of HE libraries and will make four key points: 1. Open access has the potential to bring benefits to the research community in particular and society in general by improving information provision. 2. If there is widespread open access to research content, there will be less need for library-based activity at the institution level, and more need for information management activity at the supra-institutional or national level. 3. Institutional libraries will, however, continue to have an important role to play in areas such as managing purchased or licensed content, curating institutional digital assets, and providing support in the use of content for teaching and research. 4. Libraries are well-placed to work with stakeholders within their institutions and beyond to help resolve current challenges associated with the implementation of OA policies and practices.
Abstract:
FEA simulation of thermal metal cutting is central to interactive design and manufacturing. It is therefore relevant to assess the applicability of open FEA software to simulate 2D heat transfer in metal sheet laser cuts. Using open source code (e.g. FreeFem++, FEniCS, MOOSE) makes additional scenarios possible (e.g. parallel execution, CUDA) at lower cost. However, a precise assessment is required of the scenarios in which open software can be a sound alternative to a commercial one. This article contributes in this regard by presenting a comparison of the aforementioned free FEM software for the simulation of heat transfer in thin (i.e. 2D) sheets subject to a gliding laser point source. We use the commercial ABAQUS software as the reference against which the open software is compared. A convective linear thin sheet heat transfer model, with and without material removal, is used. This article does not attempt a full design of computer experiments. Our partial assessment shows that the thin sheet approximation turns out to be adequate in terms of the relative error for linear alumina sheets. For mesh resolutions finer than 10e−5 m, the temperatures predicted by the open and reference software differ by at most 1% of the temperature prediction. Ongoing work includes adaptive re-meshing, nonlinearities, sheet stress analysis and Mach (also called ‘relativistic’) effects.
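The abstract does not state the governing equations; a typical convective thin-sheet model with a gliding point source, written here only as a hedged sketch (all symbols below are assumptions, not taken from the article), is

\rho c_p d \, \frac{\partial T}{\partial t} = k d \left( \frac{\partial^2 T}{\partial x^2} + \frac{\partial^2 T}{\partial y^2} \right) - 2h \,(T - T_\infty) + q(x - vt,\; y),

where d is the sheet thickness, h the convection coefficient on each face, T_\infty the ambient temperature, v the gliding speed of the laser, and q a concentrated (e.g. Gaussian) heat input. Material removal, when included, is commonly handled by removing or deactivating elements that exceed a cutting temperature; the article's exact treatment is not specified in the abstract.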
Abstract:
This is contribution no. 16-114-J from the Kansas Agricultural Experiment Station. The Kansas State University Open/Alternative Textbook Initiative provides grants to faculty members to replace textbooks with open/alternative educational resources (OAERs) that are available at no cost to students. Open educational resources are available for anyone to access, while alternative educational resources are not open. The objective of this study was to determine the perceptions of OAERs and the initiative among students enrolled in, and faculty members teaching, courses using OAERs. A survey was sent out to 2,074 students in 13 courses using the OAERs. A total of 524 (25.3%) students completed the survey, and a faculty member from each of the 13 courses using OAERs was interviewed. Students rated the OAERs as good quality, preferred using them instead of buying textbooks for their courses, and agreed that they would like OAERs used in other courses. Faculty felt that student learning was somewhat better and that it was somewhat easier to teach using OAERs than with the traditional textbooks they had used previously. Nearly all faculty members preferred teaching with OAERs and planned to continue to do so after the funding period. These results, combined with the tremendous savings to students, support the continued funding of the initiative and similar approaches at other institutions.
Abstract:
In the minimization of tool switches problem, we seek a sequence in which to process a set of jobs so that the number of required tool switches is minimized. In this work, different variations of a heuristic based on partially ordered job sequences are implemented and evaluated. All variations adopt a depth-first strategy over the enumeration tree. The computational test results indicate that good results can be obtained by a variation that keeps the best three branches at each node of the enumeration tree and, when backtracking, randomly chooses the next node to branch from among all active nodes.
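As a rough illustration of the branching strategy described above (keep the best three extensions of each node, descend depth-first, and pick the next active node at random when backtracking), here is a minimal Python sketch. The switch-count evaluation is a simplified greedy stand-in, not the KTNS policy normally used for this problem, and all names, the ranking criterion, and the example instance are assumptions.

import random

def switch_count(seq, tools, capacity):
    """Greedy estimate of the tool switches needed to process `seq` in order.
    Simplified stand-in for the KTNS policy; assumes capacity >= max job tool set."""
    magazine, switches = set(), 0
    for pos, job in enumerate(seq):
        missing = tools[job] - magazine
        while len(magazine) + len(missing) > capacity:
            # evict the loaded tool (not needed by the current job) whose
            # next use lies furthest in the future, or that is never used again
            future = [t for j in seq[pos + 1:] for t in tools[j]]
            victim = max(magazine - tools[job],
                         key=lambda t: future.index(t) if t in future else len(future))
            magazine.remove(victim)
        magazine |= missing
        switches += len(missing)
    return switches

def best_three_dfs(jobs, tools, capacity, seed=0):
    """Depth-first enumeration that keeps only the best three extensions of each
    node and, when backtracking, picks the next node to expand at random."""
    rng = random.Random(seed)
    best_seq, best_cost = None, float("inf")
    active = [([j], set(jobs) - {j}) for j in jobs]
    current = active.pop()
    while True:
        seq, remaining = current
        if not remaining:                                   # leaf: complete sequence
            cost = switch_count(seq, tools, capacity)
            if cost < best_cost:
                best_seq, best_cost = seq, cost
            if not active:
                return best_seq, best_cost
            current = active.pop(rng.randrange(len(active)))  # backtrack to a random active node
            continue
        # rank the extensions of this node and keep only the best three
        ranked = sorted(remaining, key=lambda j: len(tools[j] - tools[seq[-1]]))[:3]
        current = (seq + [ranked[0]], remaining - {ranked[0]})  # descend depth-first
        active += [(seq + [j], remaining - {j}) for j in ranked[1:]]

# Hypothetical instance: 4 jobs, tools numbered 1..5, magazine capacity 3.
tools = {0: {1, 2}, 1: {2, 3}, 2: {1, 4}, 3: {3, 5}}
print(best_three_dfs(list(tools), tools, capacity=3))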
Abstract:
In the tool switch minimization problem, one seeks a sequence in which to process a set of jobs so that the number of required tool switches is as small as possible. In this work, an algorithm to solve this problem based on a partial ordering of the jobs is proposed. An optimal sequence is obtained by expanding the enumerated partial sequences. Computational tests are presented.
Abstract:
The tool switch minimization problem (MTSP) seeks a sequence in which to process a set of jobs so as to minimize the number of required tool switches. This work presents a new heuristic for the MTSP capable of producing good upper bounds for an enumerative algorithm. The heuristic has two phases: a constructive phase based on a graph whose vertices correspond to tools, with an arc k = (i, j) connecting vertices i and j if and only if tools i and j are required to execute some job k; and a refinement phase based on the Iterated Local Search metaheuristic. Computational results show that the proposed heuristic performs well on the tested problems, contributing to a significant reduction in the number of nodes generated by an enumerative algorithm.
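A small Python sketch of the constructive phase's tool graph described above (vertices are tools; each job induces edges between the tools it requires). The data layout and function name are assumptions, and the Iterated Local Search refinement phase is not sketched.

from collections import defaultdict
from itertools import combinations

def tool_graph(job_tools):
    """Build the tool graph: one vertex per tool, and an edge (i, j) for every
    job k that needs both tools i and j (the edge keeps the set of such jobs)."""
    edges = defaultdict(set)
    for job, required in job_tools.items():
        for i, j in combinations(sorted(required), 2):
            edges[(i, j)].add(job)
    return dict(edges)

# Hypothetical instance: job 0 needs tools {1, 2}, job 1 needs {2, 3}, job 2 needs {1, 2, 3}.
print(tool_graph({0: {1, 2}, 1: {2, 3}, 2: {1, 2, 3}}))
# {(1, 2): {0, 2}, (2, 3): {1, 2}, (1, 3): {2}}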
Abstract:
We consider a class of two-dimensional problems in classical linear elasticity for which material overlapping occurs in the absence of singularities. Of course, material overlapping is not physically realistic, and one possible way to prevent it uses a constrained minimization theory. In this theory, a minimization problem consists of minimizing the total potential energy of a linear elastic body subject to the constraint that the deformation field must be locally invertible. Here, we use an interior and an exterior penalty formulation of the minimization problem together with both a standard finite element method and classical nonlinear programming techniques to compute the minimizers. We compare both formulations by solving a plane problem numerically in the context of the constrained minimization theory. The problem has a closed-form solution, which is used to validate the numerical results. This solution is regular everywhere, including the boundary. In particular, we show numerical results which indicate that, for a fixed finite element mesh, the sequences of numerical solutions obtained with both the interior and the exterior penalty formulations converge to the same limit function as the penalization is enforced. This limit function yields an approximate deformation field to the plane problem that is locally invertible at all points in the domain. As the mesh is refined, this field converges to the exact solution of the plane problem.
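The abstract does not reproduce the penalized functionals; one common way to write them, sketched here with assumed notation (E the total potential energy, F = I + \nabla u the deformation gradient, \varepsilon > 0 a small threshold enforcing local invertibility, \delta > 0 the penalty parameter), is

exterior penalty:  E_\delta[u] = E[u] + \frac{1}{\delta} \int_\Omega \left[ \min\left( \det F(u) - \varepsilon,\; 0 \right) \right]^2 dA,

interior (barrier) penalty:  E_\delta[u] = E[u] - \delta \int_\Omega \ln\left( \det F(u) - \varepsilon \right) dA,

so that, as \delta \to 0, minimizers of either penalized functional approach a minimizer of E subject to \det F(u) \ge \varepsilon, consistent with the convergence behaviour reported above.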
Abstract:
The selection of resource systems plays an important part in the integration of Distributed/Agile/Virtual Enterprises (D/A/VEs). However, as this paper points out, resource system selection remains a difficult problem to solve in a D/A/VE. Broadly, the selection problem has been formulated from different perspectives, giving rise to different kinds of models and algorithms to solve it. To support the development of an intelligent and flexible web prototype (broker tool) that integrates all the selection model activities and tools and that can adapt to each D/A/VE project or instance (the major goal of our final project), this paper presents a formulation of a class of resource selection problems and the limitations of the algorithms proposed to solve it. We formulate a particular case of the problem as an integer program, which is solved using simplex and branch-and-bound algorithms, and identify their performance limitations (in terms of processing time) based on simulation results. These limitations depend on the number of processing tasks and on the number of pre-selected resources per processing task, defining the domain of applicability of the algorithms for the problem studied. The limitations detected point to the need for other kinds of algorithms (approximate solution algorithms) outside the domain of applicability found for the simulated algorithms. For a broker tool, however, knowledge of algorithm limitations is very important in order to develop and select, based on the problem features, the most suitable algorithm that guarantees good performance.
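The abstract does not reproduce the integer program; a generic resource-selection formulation of the kind described (one pre-selected resource chosen for each processing task), written here with assumed notation, is

\min \sum_{i=1}^{m} \sum_{j \in R_i} c_{ij} x_{ij} \quad \text{subject to} \quad \sum_{j \in R_i} x_{ij} = 1 \;\; (i = 1, \dots, m), \qquad x_{ij} \in \{0, 1\},

where R_i is the set of pre-selected resources for processing task i and c_{ij} the cost of assigning resource j to task i; project-specific linking constraints (capacity, precedence, compatibility) would be added as needed. The growth of the search space with the number of tasks m and with the sizes |R_i| is what drives the processing-time limitations of the simplex and branch-and-bound approach observed in the simulations.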
Abstract:
The emerging technologies have recently challenged the libraries to reconsider their role as a mere mediator between the collections, researchers, and wider audiences (Sula, 2013), and libraries, especially the nationwide institutions like national libraries, haven’t always managed to face the challenge (Nygren et al., 2014). In the Digitization Project of Kindred Languages, the National Library of Finland has become a node that connects the partners to interplay and work for shared goals and objectives. In this paper, I will be drawing a picture of the crowdsourcing methods that have been established during the project to support both linguistic research and lingual diversity. The National Library of Finland has been executing the Digitization Project of Kindred Languages since 2012. The project seeks to digitize and publish approximately 1,200 monograph titles and more than 100 newspaper titles in various, and in some cases endangered, Uralic languages. Once the digitization has been completed in 2015, the Fenno-Ugrica online collection will consist of 110,000 monograph pages and around 90,000 newspaper pages to which all users will have open access regardless of their place of residence. The majority of the digitized literature was originally published in the 1920s and 1930s in the Soviet Union, which was the genesis and consolidation period of the literary languages. This was the era when many Uralic languages were converted into media of popular education, enlightenment, and dissemination of information pertinent to the developing political agenda of the Soviet state. The ‘deluge’ of popular literature in the 1920s to 1930s suddenly challenged the lexical orthographic norms of the limited ecclesiastical publications from the 1880s onward. Newspapers were now written in orthographies and in word forms that the locals would understand. Textbooks were written to address the separate needs of both adults and children. New concepts were introduced in the language. This was the beginning of a renaissance and period of enlightenment (Rueter, 2013). The linguistically oriented population can also find writings to their delight, especially lexical items specific to a given publication, and orthographically documented specifics of phonetics. The project is financially supported by the Kone Foundation in Helsinki and is part of the Foundation’s Language Programme. One of the key objectives of the Kone Foundation Language Programme is to support a culture of openness and interaction in linguistic research, but also to promote citizen science as a tool for the participation of the language community in research. In addition to sharing this aspiration, our objective within the Language Programme is to make sure that old and new corpora in Uralic languages are made available for the open and interactive use of the academic community as well as the language societies. Wordlists are available in 17 languages, but without tokenization, lemmatization, and so on. This approach was verified with the scholars, and we consider the wordlists as raw data for linguists. Our data is used for creating the morphological analyzers and online dictionaries at the Helsinki and Tromsø Universities, for instance. In order to reach the targets, we will produce not only the digitized materials but also development tools for supporting linguistic research and citizen science. The Digitization Project of Kindred Languages is thus linked with the research of language technology.
The mission is to improve the usage and usability of digitized content. During the project, we have advanced methods that will refine the raw data for further use, especially in linguistic research. How does the library meet these objectives, which appear to be beyond its traditional playground? The written materials from this period are a gold mine, so how could we retrieve these hidden treasures of languages out of the stack that contains more than 200,000 pages of literature in various Uralic languages? The problem is that the machine-encoded text (OCR) often contains too many mistakes to be used as such in research. The mistakes in OCRed texts must be corrected. To enhance the OCRed texts, the National Library of Finland developed an open-source OCR editor that enables the editing of machine-encoded text for the benefit of linguistic research. It was necessary to implement this tool, since these rare and peripheral prints often include characters that have since fallen out of use, which are sadly neglected by modern OCR software developers but belong to the historical context of the kindred languages and are thus an essential part of the linguistic heritage (van Hemel, 2014). Our crowdsourcing application is essentially an editor for the ALTO XML format. It consists of a back-end for managing users, permissions, and files, communicating through a REST API with a front-end interface, that is, the actual editor for correcting the OCRed text. The enhanced XML files can be retrieved from the Fenno-Ugrica collection for further purposes. Could the crowd do this work to support academic research? The challenge in crowdsourcing lies in its nature. In traditional crowdsourcing, the targets have often been split into several microtasks that do not require any special skills from the anonymous people, a faceless crowd. This way of crowdsourcing may produce quantitative results, but from the research point of view there is a danger that the needs of linguists are not necessarily met. Another notable downside is the lack of a shared goal or social affinity. There is no reward in the traditional methods of crowdsourcing (de Boer et al., 2012). There has also been criticism that digital humanities makes the humanities too data-driven and oriented towards quantitative methods, losing the values of critical qualitative methods (Fish, 2012). On top of that, the downsides of traditional crowdsourcing become more evident when you leave the Anglophone world. Our potential crowd is geographically scattered across Russia. This crowd is linguistically heterogeneous, speaking 17 different languages. In many cases the languages are close to extinction or in need of revitalization, and the native speakers do not always have Internet access, so an open call for crowdsourcing would not have produced satisfactory results for linguists. Thus, one has to carefully identify the potential niches that can complete the needed tasks. When using the help of a crowd in a project that aims to support both linguistic research and the survival of endangered languages, the approach has to be a different one. In nichesourcing, the tasks are distributed amongst a small crowd of citizen scientists (communities). Although communities provide smaller pools from which to draw resources, their specific richness in skill is suited to the complex tasks and high-quality product expectations found in nichesourcing. Communities have a purpose and identity, and their regular interaction engenders social trust and reputation.
These communities can correspond to research needs more precisely (de Boer et al., 2012). Instead of repetitive and rather trivial tasks, we try to utilize the knowledge and skills of citizen scientists to provide qualitative results. In nichesourcing, we hand out assignments that precisely fill the gaps in linguistic research. A typical task would be editing and collecting words in those fields of vocabulary where the researchers require more information. For instance, there is a lack of Hill Mari words and terminology in anatomy. We have digitized books in medicine, and we could try to track the words related to human organs by assigning the citizen scientists to edit and collect words with the OCR editor. From the nichesourcing perspective, it is essential that altruism play a central role when the language communities are involved. In nichesourcing, our goal is to reach a certain level of interplay where the language communities benefit from the results. For instance, the corrected words in Ingrian will be added to an online dictionary, which is made freely available to the public, so society can benefit, too. This objective of interplay can be understood as an aspiration to support the endangered languages and the maintenance of lingual diversity, but also as serving ‘two masters’: research and society.
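Since the crowdsourcing application described above is an editor for the ALTO XML format, the following minimal, namespace-agnostic Python sketch shows how the recognized words can be read out of an ALTO file; the function and file names are hypothetical, and the project's actual editor back-end and REST API are not reproduced here.

import xml.etree.ElementTree as ET

def alto_words(path):
    # Extract the recognized words from an ALTO XML file.
    # ALTO stores each recognized token as a String element whose
    # CONTENT attribute holds the OCRed word; matching on the local
    # tag name keeps the sketch independent of the ALTO namespace version.
    words = []
    for _, elem in ET.iterparse(path):
        if elem.tag.rsplit('}', 1)[-1] == "String":
            content = elem.get("CONTENT")
            if content:
                words.append(content)
    return words

# Hypothetical usage: print(alto_words("page_0001.xml"))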
Abstract:
Minimization of a differentiable function subject to box constraints is proposed as a strategy to solve the generalized nonlinear complementarity problem (GNCP) defined on a polyhedral cone. With this approach it is not necessary to calculate projections, which complicate and sometimes even prevent the implementation of algorithms for solving these kinds of problems. Theoretical results that relate stationary points of the minimized function to solutions of the GNCP are presented. Perturbations of the GNCP are also considered, and results related to the resolution of GNCPs are obtained under very general assumptions on the data. These theoretical results show that local methods for box-constrained optimization applied to the associated problem are efficient tools for solving the GNCP. Numerical experiments that encourage the use of this approach are presented.
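For readers unfamiliar with the problem, one standard statement of the GNCP on a polyhedral cone is sketched below; conventions vary, and the exact formulation used in the article is not given in the abstract. Given maps F, G : \mathbb{R}^n \to \mathbb{R}^m, a polyhedral cone C \subset \mathbb{R}^m, and its dual cone C^*, find x \in \mathbb{R}^n such that

F(x) \in C, \qquad G(x) \in C^*, \qquad \langle F(x),\, G(x) \rangle = 0.

The strategy described above replaces this with the minimization of a differentiable merit function over a box, so that standard box-constrained solvers can be applied without computing projections onto C.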
Abstract:
The stability analysis of open cavity flows is a problem of great interest in the aeronautical industry. This type of flow can appear, for example, in landing gear or auxiliary power unit configurations. Open cavity flows are very sensitive to any change in the configuration, either physical (incoming boundary layer, Reynolds or Mach numbers) or geometrical (length-to-depth and length-to-width ratios). In this work, we have focused on the effect of geometry and of the Reynolds number on the stability properties of a three-dimensional spanwise-periodic cavity flow in the incompressible limit. To that end, BiGlobal analysis is used to investigate the instabilities in this configuration. The basic flow is obtained by numerical integration of the Navier-Stokes equations with laminar boundary layers imposed upstream. The 3D perturbation, assumed to be periodic in the spanwise direction, is obtained as the solution of a global eigenvalue problem. A parametric study has been performed, analyzing the stability of the flow under variation of the Reynolds number, the L/D ratio of the cavity, and the spanwise wavenumber β. For consistency, multidomain high-order numerical schemes have been used in all the computations, both for the basic flow and for the eigenvalue problems. The results allow the neutral curves to be defined in the range L/D = 1 to L/D = 3. A scaling relating the frequency of the eigenmodes to the length-to-depth ratio is provided, based on the analysis results.
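The BiGlobal analysis mentioned above assumes small perturbations superposed on the two-dimensional basic flow and harmonic in the homogeneous spanwise direction; the standard modal ansatz (generic notation, not taken from the article) reads

\mathbf{q}'(x, y, z, t) = \hat{\mathbf{q}}(x, y)\, e^{\,i \beta z - i \omega t} + \text{c.c.},

where \beta is the real spanwise wavenumber and \omega the complex eigenvalue, with \mathrm{Im}(\omega) > 0 indicating instability. Substituting this ansatz into the linearized Navier-Stokes equations yields the two-dimensional generalized eigenvalue problem for \hat{\mathbf{q}} and \omega that is solved in the parametric study.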
Abstract:
This contribution presents results for an incompressible two-dimensional flow over an open cavity of fixed aspect ratio (length/depth) L/D = 2 and the coupling between the three-dimensional low-frequency oscillation mode confined in the cavity and the wave-like disturbances evolving on the downstream wall of the cavity in the form of Tollmien-Schlichting waves. BiGlobal instability analysis is conducted to search for the global disturbances superimposed upon a two-dimensional steady basic flow. The base solution is computed by integration of the laminar Navier-Stokes equations in primitive-variable formulation, while the eigenvalue problem (EVP) derived from the discretization of the linearized equations of motion in the BiGlobal framework is solved using an iterative procedure. The formulation of the BiGlobal EVP for the unbounded flow in the open cavity problem introduces additional difficulties regarding the flow-through boundaries. Local analysis has been utilized to determine the proper boundary conditions at the upper limit of the downstream region.