891 results for free and open source software
Abstract:
DSpace is an open source software platform that enables organizations to:
- Capture and describe digital material using a submission workflow module, or a variety of programmatic ingest options
- Distribute an organization's digital assets over the web through a search and retrieval system
- Preserve digital assets over the long term
This system documentation includes a functional overview of the system, which is a good introduction to its capabilities and should be readable by non-technical personnel. Everyone should read this section first because it introduces terminology used throughout the rest of the documentation. For people actually running a DSpace service, there is an installation guide, together with sections on configuration and the directory structure. Note that as of DSpace 1.2, the administration user interface guide is provided as on-line help available from within the DSpace system. Finally, for those interested in the details of how DSpace works, and those potentially interested in modifying the code for their own purposes, there is a detailed architecture and design section.
Abstract:
BACKGROUND: The ability to write clearly and effectively is of central importance to the scientific enterprise. Encouraged by the success of simulation environments in other biomedical sciences, we developed WriteSim TCExam, an open-source, Web-based, textual simulation environment for teaching effective writing techniques to novice researchers. We shortlisted and modified an existing open-source application, TCExam, to serve as a textual simulation environment. After testing usability internally within our team, we conducted formal field usability studies with novice researchers. These were followed by formal surveys of researchers fitting the roles of administrators and users (novice researchers). RESULTS: The development process was guided by feedback from usability tests within our research team. Online surveys and formal studies, involving members of the Research on Research group and selected novice researchers, show that the application is user-friendly. Additionally, it has been used to train 25 novice researchers in scientific writing to date and has generated encouraging results. CONCLUSION: WriteSim TCExam is the first Web-based, open-source textual simulation environment designed to complement traditional scientific writing instruction. While initial reviews by students and educators have been positive, a formal study is needed to measure its benefits in comparison with standard instructional methods.
Abstract:
The fundamental phenotypes of growth rate, size and morphology are the result of complex interactions between genotype and environment. We developed a high-throughput software application, WormSizer, which computes the size and shape of nematodes from brightfield images. Existing methods for estimating volume either coarsely model the nematode as a cylinder or assume that worm shape or opacity is invariant. Our estimate is more robust to changes in morphology or optical density because it assumes only radial symmetry. This open source software is written as a plugin for the well-known image-processing framework Fiji/ImageJ and may therefore be extended easily. We evaluated the technical performance of this framework, and we used it to analyze the growth and shape of several canonical Caenorhabditis elegans mutants in a developmental time series. We confirm quantitatively that a Dumpy (Dpy) mutant is short and fat and that a Long (Lon) mutant is long and thin. We show that daf-2 insulin-like receptor mutants are larger than wild-type upon hatching but grow slowly, and that WormSizer can distinguish dauer larvae from normal larvae. We also show that a Small (Sma) mutant is actually smaller than wild-type at all stages of larval development. WormSizer works with Uncoordinated (Unc) and Roller (Rol) mutants as well, indicating that it can be used with mutants despite behavioral phenotypes. We used our complete data set to perform a power analysis, giving users a sense of how many images are needed to detect different effect sizes. Our analysis confirms and extends the existing phenotypic characterization of well-characterized mutants, demonstrating the utility and robustness of WormSizer.
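The volume estimate described above rests on a single geometric assumption, radial symmetry about the worm's midline. The Python sketch below illustrates that idea: it sums frustum volumes along a sampled midline. It is a minimal illustration of the principle, not WormSizer's actual code; the function name and inputs are hypothetical.

```python
import numpy as np

def worm_volume(midline_xy, radii):
    """Estimate body volume from a sampled midline and per-sample radii.

    midline_xy : (N, 2) array of midline coordinates
    radii      : (N,) array of half-widths measured perpendicular to the midline

    Assumes radial symmetry about the midline, so each segment between
    consecutive samples is treated as a truncated cone (frustum).
    """
    midline_xy = np.asarray(midline_xy, dtype=float)
    radii = np.asarray(radii, dtype=float)
    # Arc length of each midline segment.
    seg_len = np.linalg.norm(np.diff(midline_xy, axis=0), axis=1)
    r0, r1 = radii[:-1], radii[1:]
    # Frustum volume: (pi * h / 3) * (r0^2 + r0*r1 + r1^2), summed over segments.
    return np.sum(np.pi * seg_len / 3.0 * (r0 ** 2 + r0 * r1 + r1 ** 2))

# Sanity check: a straight "worm" 1000 um long with constant 25 um radius
# should match a cylinder, pi * 25**2 * 1000 ~ 1.96e6 um^3.
print(worm_volume([[x, 0.0] for x in range(0, 1001, 10)], np.full(101, 25.0)))
```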
Abstract:
In this paper, we study a problem of scheduling and batching on two machines in flow-shop and open-shop environments. Each machine processes operations in batches, and the processing time of a batch is the sum of the processing times of the operations in that batch. A setup time, which depends only on the machine, is required before a batch is processed on a machine, and all jobs in a batch remain at the machine until the entire batch is processed. The aim is to make batching and sequencing decisions, which specify a partition of the jobs into batches on each machine and a processing order of the batches on each machine, respectively, so that the makespan is minimized. The flow-shop problem is shown to be strongly NP-hard. We demonstrate that there is an optimal solution with the same batches on the two machines; we refer to these as consistent batches. A heuristic is developed that selects the best schedule among several with one, two, or three consistent batches, and is shown to have a worst-case performance ratio of 4/3. For the open shop, we show that the problem is NP-hard in the ordinary sense. By proving the existence of an optimal solution with one, two, or three consistent batches, a close relationship is established with the problem of scheduling two or three identical parallel machines to minimize the makespan. This allows a pseudo-polynomial algorithm to be derived and various heuristic methods to be suggested.
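To make the batching model concrete, here is a small Python sketch that evaluates the makespan of consistent batches on the two flow-shop machines and searches over all splits of a fixed job sequence into at most three consecutive batches. It assumes non-anticipatory setups (a setup on machine 2 starts only once the batch has left machine 1) and a given job order; it illustrates the cost structure rather than reproducing the paper's 4/3-approximation heuristic.

```python
from itertools import combinations

def flowshop_makespan(batches, p1, p2, s1, s2):
    """Makespan when both machines use the same (consistent) batches.
    Each batch incurs a machine-dependent setup (s1 on machine 1, s2 on
    machine 2) and leaves machine 1 only when fully processed there."""
    c1 = c2 = 0.0
    for batch in batches:
        c1 += s1 + sum(p1[j] for j in batch)               # batch completes on machine 1
        c2 = max(c2, c1) + s2 + sum(p2[j] for j in batch)  # then runs on machine 2
    return c2

def best_consistent_batching(jobs, p1, p2, s1, s2, max_batches=3):
    """Try every split of the job sequence into at most max_batches
    consecutive consistent batches and keep the smallest makespan."""
    n = len(jobs)
    best_value, best_batches = float("inf"), None
    for k in range(1, max_batches + 1):
        for cuts in combinations(range(1, n), k - 1):
            bounds = (0,) + cuts + (n,)
            batches = [jobs[a:b] for a, b in zip(bounds, bounds[1:])]
            value = flowshop_makespan(batches, p1, p2, s1, s2)
            if value < best_value:
                best_value, best_batches = value, batches
    return best_value, best_batches

# Example: four jobs with per-machine processing times and setup 2 on each machine.
p1, p2 = [3, 1, 4, 2], [2, 5, 1, 3]
print(best_consistent_batching([0, 1, 2, 3], p1, p2, 2, 2))
```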
Abstract:
Software product development is recognised as difficult owing to the intangible nature of the product, the challenges of requirements elicitation, the difficulty of measuring progress effectively, and so forth. In this paper, we describe some of the challenges of software product development and how they are being met by lean management principles and techniques. Specifically, we examine lean principles and techniques that were devised by Toyota and other manufacturers over the last 50 years. Applying lean principles to software development projects has been advocated for over ten years, and it will be shown that the extensive lean literature is a valuable source of ideas for software development. A case study with a software development organisation, Timberline Inc., will demonstrate that lean principles and techniques can be successfully applied to software product development.
Towards an understanding of the causes and effects of software requirements change: two case studies
Abstract:
Changes to software requirements not only pose a risk to the successful delivery of software applications but also provide opportunity for improved usability and value. Increased understanding of the causes and consequences of change can support requirements management and also make progress towards the goal of change anticipation. This paper presents the results of two case studies that address objectives arising from that ultimate goal. The first case study evaluated the potential of a change source taxonomy containing the elements ‘market’, ‘organisation’, ‘vision’, ‘specification’, and ‘solution’ to provide a meaningful basis for change classification and measurement. The second case study investigated whether the requirements attributes of novelty, complexity, and dependency correlated with requirements volatility. While insufficiency of data in the first case study precluded an investigation of changes arising due to the change source of ‘market’, for the remainder of the change sources, results indicate a significant difference in cost, value to the customer and management considerations. Findings show that higher cost and value changes arose more often from ‘organisation’ and ‘vision’ sources; these changes also generally involved the co-operation of more stakeholder groups and were considered to be less controllable than changes arising from the ‘specification’ or ‘solution’ sources. Results from the second case study indicate that only ‘requirements dependency’ is consistently correlated with volatility and that changes coming from each change source affect different groups of requirements. We conclude that the taxonomy can provide a meaningful means of change classification, but that a single requirement attribute is insufficient for change prediction. A theoretical causal account of requirements change is drawn from the implications of the combined results of the two case studies.
Abstract:
The British and Irish Legal Information Institute (BAILII) entered the online legal information landscape in 2001 with charitable status as a provider of UK and European judgments, and has over the past decade or so moved from a system quickly put together with whatever materials could be found to a system that provides a core resource to professionals in law. In this article we provide an overview for the law teacher of the system's first years; we then look at whether usage in law schools has matched that of the professions, how the JISC-funded Open Law project enabled development for law students, and where we might go in the future as part of the Legal Information Institute collective which operates under the ‘Free Access to Law’ banner.
As members of the Open Law team who sought funding, carried out the research and implemented the project, it seems to us that the project was generally successful. Our indications were that prior to Open Law the use of BAILII by students was low – it was not readily found or discussed by lecturers, was difficult to use, and generally less user friendly than it could have been. The changes implemented by Open Law appear to have changed that position considerably. However, our findings also indicate that there is much work to do to re-energise digital legal information as a legal education research field.
Abstract:
A highly sensitive, broad-specificity monoclonal antibody was produced and characterised for microcystin detection through the development of a rapid surface plasmon resonance (SPR) optical biosensor-based immunoassay. The antibody displayed the following cross-reactivity: MC-LR 100%; MC-RR 108%; MC-YR 68%; MC-LA 69%; MC-LW 71%; MC-LF 68%; and nodularin 94%. Microcystin-LR was covalently attached to a CM5 chip and, with the monoclonal antibody, was employed in a competitive 4 min injection assay to detect total microcystins in water samples below the WHO recommended limit (1 µg/L). A 'total microcystin' level was determined by measuring free and intracellular concentrations in cyanobacterial culture samples, as this toxin is an endotoxin. Glass bead beating was used to lyse the cells as a rapid extraction procedure. The method was validated according to European Commission Decision 96/23/EC criteria. It was proven to measure intracellular microcystin levels, the main source of the toxin, which often go undetected by other analytical procedures, and is advantageous in that it can be used for the monitoring of blooms to provide an early warning of toxicity. It was shown to be repeatable and reproducible, with recoveries from spiked samples ranging from 74% to 123%, and with coefficients of variation below 10% for intra-assay analysis and 15% for inter-assay analysis. The detection capability of the assay was calculated as 0.5 ng/mL for extracellular toxins and 0.05 ng/mL for intracellular microcystins. A comparison of the SPR method with LC-MS/MS was carried out by testing six Microcystis aeruginosa cultures, yielding a correlation R² value of 0.9989.
Abstract:
This paper presents a framework for a telecommunications interface which allows data from sensors embedded in Smart Grid applications to be reliably archived in an appropriate time-series database. The challenge in doing so is two-fold: first, the variety of formats in which sensor data are represented; second, the problems of telecommunications reliability. A prototype of the authors' framework is detailed which showcases its main features in a case study featuring Phasor Measurement Units (PMUs) as the application. Useful analysis of PMU data is achieved whenever data from multiple locations can be compared on a common time axis. The prototype developed highlights the framework's reliability, extensibility and adoptability; features which are largely deferred by industry standards for data representation to proprietary database solutions. The open source framework presented provides link reliability for any type of Smart Grid sensor and is interoperable with both existing proprietary database systems and open database systems. These features allow researchers and developers to focus on the core of their real-time or historical analysis applications, rather than having to spend time interfacing with complex protocols.
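Since the paper notes that PMU analysis depends on comparing measurements from multiple locations on a common time axis, the following Python sketch shows one simple way to do that: resampling several timestamped streams onto a shared axis by nearest sample. It is an illustrative alignment step under assumed data structures, not the authors' framework or its interface.

```python
import bisect

def align_on_common_axis(streams, t0, t1, step):
    """Resample timestamped sensor streams onto one shared time axis.

    streams : dict mapping location -> list of (timestamp, value),
              sorted by timestamp
    Returns (axis, {location: values nearest to each axis tick}).
    """
    axis = [t0 + i * step for i in range(int((t1 - t0) / step) + 1)]
    aligned = {}
    for loc, samples in streams.items():
        times = [t for t, _ in samples]
        row = []
        for t in axis:
            i = bisect.bisect_left(times, t)
            # Choose whichever neighbouring sample is closer in time.
            if i == 0:
                j = 0
            elif i == len(times) or t - times[i - 1] <= times[i] - t:
                j = i - 1
            else:
                j = i
            row.append(samples[j][1])
        aligned[loc] = row
    return axis, aligned

# Example: two PMU streams sampled at slightly different instants.
streams = {"siteA": [(0.00, 49.98), (0.02, 50.01), (0.04, 50.00)],
           "siteB": [(0.01, 49.99), (0.03, 50.02)]}
print(align_on_common_axis(streams, 0.0, 0.04, 0.02))
```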
Abstract:
This work aims to study the characterization and modelling of radio frequency architectures for software defined radio and cognitive radio applications. The constant arrival on the market of new standards and technologies for wireless communications has raised some limitations to the implementation of wideband radio transceivers. Moreover, the use of reconfigurable and adaptable systems based on the software defined radio and cognitive radio concepts will ensure the evolution to the next generation of wireless communications. The underlying idea of this thesis is to solve some open problems and to propose relevant advances, taking advantage of the capabilities provided by digital signal processors to improve the overall performance of the proposed systems. Initially, several strategies for the design and implementation of radio transceivers are addressed, always focusing on their specific applicability to software defined radio and cognitive radio systems. Current instrumentation solutions capable of characterizing a device that operates simultaneously in the analogue and digital domains are also discussed, as well as the next steps in this field of characterization and modelling. Furthermore, we present new behavioural model formats constructed specifically for the nonlinear description and characterization of bandpass sampling receivers, as well as for nonlinear systems that use multi-carrier signals. A new architecture supported by the statistical evaluation of radio signals is presented, which makes it possible to increase the dynamic range of the receiver in multi-carrier situations. Likewise, a technique for maximizing the reception bandwidth based on the use of the complex-format bandpass sampling receiver is presented. Finally, it should be noted that all the proposed architectures are accompanied by a theoretical introduction and simulations whenever possible, and are then validated experimentally with laboratory prototypes.
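The abstract above centres on bandpass (subsampling) receivers for software defined radio. As background for that architecture, the Python sketch below enumerates the classical valid uniform sampling-rate ranges for a real bandpass signal, the textbook condition 2*f_high/n <= fs <= 2*f_low/(n-1). This is standard sampling theory offered for illustration, not a result or algorithm from the thesis itself.

```python
import math

def bandpass_sampling_ranges(f_low, f_high):
    """Valid uniform sampling-rate ranges for a real bandpass signal
    occupying [f_low, f_high] Hz, from the classical condition
        2*f_high / n <= fs <= 2*f_low / (n - 1)
    for integer n with 1 <= n <= floor(f_high / bandwidth).
    Returns (fs_min, fs_max) tuples, lowest (most undersampled) first."""
    bandwidth = f_high - f_low
    n_max = math.floor(f_high / bandwidth)
    ranges = []
    for n in range(n_max, 1, -1):
        fs_min = 2.0 * f_high / n
        fs_max = 2.0 * f_low / (n - 1)
        if fs_min <= fs_max:
            ranges.append((fs_min, fs_max))
    ranges.append((2.0 * f_high, float("inf")))  # n = 1: ordinary Nyquist sampling
    return ranges

# Example: a 5 MHz-wide band at 20-25 MHz can be sampled as slowly as 10 MS/s.
for lo_hz, hi_hz in bandpass_sampling_ranges(20e6, 25e6):
    print(f"{lo_hz / 1e6:.2f} - {hi_hz / 1e6:.2f} MS/s")
```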
Abstract:
The IEEE 802.15.4/ZigBee protocols are gaining increasing interest in both the research and industrial communities as candidate technologies for Wireless Sensor Network (WSN) applications. In this paper, we present an open-source implementation of the IEEE 802.15.4/ZigBee protocol stack under the TinyOS operating system for the MICAz motes. This work has been driven by the need for an open-source implementation of the IEEE 802.15.4/ZigBee protocols, filling a gap between some recently released complex C implementations and black-box implementations from different manufacturers. In addition, we share our experience of the challenging problems that we faced during the implementation of the protocol stack on the MICAz motes. We strongly believe that this open-source implementation will stimulate research on the IEEE 802.15.4/ZigBee protocols by allowing their demonstration and validation through experimentation.
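As a concrete illustration of the timing parameters such a stack has to implement, the Python sketch below computes the beacon interval and superframe duration from the standard's BO and SO parameters. The constants (aBaseSuperframeDuration = 960 symbols, 16 µs per symbol on the 2.4 GHz PHY) come from the IEEE 802.15.4 specification; the sketch is background material, not code from the implementation described in the paper.

```python
# IEEE 802.15.4 beacon-enabled mode timing (2.4 GHz O-QPSK PHY: 16 us/symbol).
A_BASE_SUPERFRAME_DURATION = 960  # symbols (aBaseSuperframeDuration)
SYMBOL_DURATION_US = 16           # microseconds per symbol at 2.4 GHz

def superframe_timing(beacon_order, superframe_order):
    """Beacon Interval (BI) and Superframe Duration (SD), in milliseconds,
    from the standard's definitions:
        BI = aBaseSuperframeDuration * 2**BO symbols
        SD = aBaseSuperframeDuration * 2**SO symbols
    with 0 <= SO <= BO <= 14."""
    assert 0 <= superframe_order <= beacon_order <= 14
    to_ms = SYMBOL_DURATION_US / 1000.0
    bi = A_BASE_SUPERFRAME_DURATION * 2 ** beacon_order * to_ms
    sd = A_BASE_SUPERFRAME_DURATION * 2 ** superframe_order * to_ms
    return bi, sd

# Example: BO=6, SO=4 gives BI = 983.04 ms, SD = 245.76 ms (25% duty cycle).
print(superframe_timing(6, 4))
```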
Abstract:
Over the last few years, 3D scanners have seen growing use in a wide range of areas. From Medicine to Archaeology, and across several types of industry, applications of these systems can be found. This growing use is due, among other factors, to the increase in computational resources, to the simplicity and diversity of the existing techniques, and to the advantages of 3D scanners compared with other systems. These advantages are evident in fields such as Forensic Medicine, where photography, traditionally used to document objects and evidence, reduces the acquired information to two dimensions. Despite the advantages associated with 3D scanners, one negative factor is their high price. The aim of this work was to develop an inexpensive and effective structured-light 3D scanner, together with a set of algorithms to control the scanner, to reconstruct the surfaces of the analysed structures, and to validate the results obtained. The implemented 3D scanner consists of an off-the-shelf camera and video projector, and of a turntable developed in this work. The purpose of the turntable is to automate the scanner so as to reduce user interaction. The algorithms were developed using open-source software packages and free tools. The 3D scanner was used to acquire 3D information from a skull, and the surface reconstruction algorithm produced virtual surfaces of the skull. Using the validation algorithm, the surfaces obtained were compared with a surface of the same skull obtained by computed tomography (CT). The validation algorithm produced a map of distances between corresponding regions on the two surfaces, which made it possible to quantify the quality of the obtained surfaces. Based on the work carried out and the results obtained, it can be stated that a functional basis for 3D surface scanning of structures was created, ready for future development, showing that it is possible to obtain alternatives to commercial methods with limited financial resources.
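The validation step described above, comparing the scanned surfaces with a CT-derived surface through a map of distances, can be illustrated with a short Python sketch using SciPy: for each scanned vertex, find the distance to the nearest point on the reference surface. This is a minimal point-to-nearest-point version under assumed array inputs; the thesis's algorithm compares corresponding regions and may use mesh-aware distances.

```python
import numpy as np
from scipy.spatial import cKDTree

def surface_distance_map(scanned_points, reference_points):
    """Per-vertex distances from a scanned surface to the closest point on
    a reference surface (e.g. one derived from computed tomography),
    plus summary statistics for quantifying surface quality."""
    tree = cKDTree(np.asarray(reference_points, dtype=float))
    distances, _ = tree.query(np.asarray(scanned_points, dtype=float))
    return distances, {
        "mean": float(distances.mean()),
        "rms": float(np.sqrt((distances ** 2).mean())),
        "max": float(distances.max()),
    }

# Example with two small synthetic point clouds (units arbitrary).
scan = np.random.rand(100, 3)
reference = scan + np.random.normal(scale=0.01, size=scan.shape)
dist_map, stats = surface_distance_map(scan, reference)
print(stats)
```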
Abstract:
Geographic Information Systems play an increasingly important role in decision-making for spatial planning, as the legislation itself makes clear. There is a need to standardize information so that it can be cross-referenced efficiently and made more easily available to the population and to the various entities. Public institutions, such as the Instituto de Conservação e Florestas (ICNF I.P.), recognize this need, and a series of tasks was therefore proposed for this internship in order to address it. This institute plays a particularly important role in the conservation of natural heritage and in the planning of protected areas so that they can be enjoyed from an increasingly ecological perspective that promotes sustainable development. The preparation of a territorial management instrument, as the Carta de Desporto de Natureza (CDN, Nature Sports Charter) can be classified, carries great responsibility and demands strong commitment, not only in involving the population and the entities that will benefit from it, but also in complying with the existing legislation both for the activities and for the territory itself, with a view to its protection, preservation and the promotion of appropriate uses. This work sought to meet all these requirements, always with the goal of making information about this protected area accessible by drawing up standardization rules and preparing an example CDN for a territory as valuable as the Reserva Natural do Estuário do Tejo. The intention is also to encourage the practice of nature sports that are healthy and in balance with the environment. The challenge did not stop there: Free and Open Source Software was also used, so that the process of using and making GIS information available becomes ever more accessible and direct.