974 results for Open source.


Relevance:

70.00%

Publisher:

Abstract:

As more and more open-source software components become available on the internet, we need automatic ways to label and compare them. For example, a developer who searches for reusable software must be able to quickly gain an understanding of retrieved components. This understanding cannot be gained at the level of source code, due to the semantic gap between source code and the domain model. In this paper we present a lexical approach that uses the log-likelihood ratios of word frequencies to automatically provide labels for software components. We present a prototype implementation of our labeling/comparison algorithm and provide examples of its application. In particular, we apply the approach to detect trends in the evolution of a software system.
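
As a concrete illustration of the kind of word-frequency scoring the abstract describes, here is a minimal sketch assuming the log-likelihood ratio in question is Dunning's G2 statistic, a common choice for comparing word frequencies between corpora; the paper's exact formulation may differ, and the function names and top-10 cutoff are illustrative.

import math
from collections import Counter

def log_likelihood(k1, n1, k2, n2):
    # Dunning-style G2 for a word seen k1 times in n1 tokens of the
    # component corpus vs. k2 times in n2 tokens of a reference corpus.
    def ll(k, n, p):
        if p <= 0.0 or p >= 1.0:
            return 0.0  # degenerate cells contribute nothing
        return k * math.log(p) + (n - k) * math.log(1.0 - p)
    p1, p2 = k1 / n1, k2 / n2
    p = (k1 + k2) / (n1 + n2)  # pooled probability under the null hypothesis
    return 2.0 * (ll(k1, n1, p1) + ll(k2, n2, p2)
                  - ll(k1, n1, p) - ll(k2, n2, p))

def label_component(component_tokens, reference_tokens, top=10):
    # Words that are unusually frequent in the component, ranked by G2,
    # serve as its labels.
    comp, ref = Counter(component_tokens), Counter(reference_tokens)
    n1, n2 = sum(comp.values()), sum(ref.values())
    scores = {w: log_likelihood(k, n1, ref.get(w, 0), n2)
              for w, k in comp.items()
              if k / n1 > ref.get(w, 0) / n2}  # over-represented words only
    return sorted(scores, key=scores.get, reverse=True)[:top]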

Relevance:

70.00%

Publisher:

Abstract:

The worldwide "hyper-connection" of any object around us is the challenge that the Internet of Things paradigm promises to address. If the Internet has colonised the daily life of more than 2,000 million people around the globe, the Internet of Things faces the challenge of connecting more than 100,000 million "things" by 2020. The technologies underlying the Internet of Things are the cornerstone that promises to solve interrelated global problems such as exponential population growth, energy management in cities, and environmental sustainability in the medium and long term. On the one hand, this project has the goal of acquiring knowledge about the prototyping technologies available on the market for the Internet of Things. On the other hand, the project focuses on the development of a system for device management within a Wireless Sensor and Actuator Network that offers services accessible from the Internet. To accomplish these objectives, the project will begin with a detailed analysis of various "open source" hardware platforms that encourage creative application development and can automatically extract information from the environment around them for transmission to external systems. In addition, web platforms aligned with the Internet of Things philosophy that enable mass storage of the data transferred by different hardware platforms will be studied. The project will culminate in the proposal and specification of a service-oriented software architecture for embedded systems that allows communication between devices on the network and the transmission of data to external systems, and that abstracts the complexities of the hardware away from application developers.
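
To make the device-to-web data path concrete, the following is a minimal sketch of the kind of service the project targets: a sensor node samples a reading and pushes it to a web storage platform over HTTP. The endpoint URL, node identifier, and sensor stub are hypothetical placeholders, since the abstract does not name the platforms chosen.

import json
import time
import urllib.request

ENDPOINT = "http://example.org/api/measurements"  # hypothetical IoT web platform

def read_temperature():
    # Stand-in for a driver call on whichever open source hardware
    # platform the project selects.
    return 21.5

def push_measurement():
    payload = json.dumps({
        "sensor": "node-01/temperature",  # illustrative device identifier
        "value": read_temperature(),
        "timestamp": time.time(),
    }).encode("utf-8")
    req = urllib.request.Request(ENDPOINT, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status  # 200/201 on success for typical REST APIs

if __name__ == "__main__":
    print(push_measurement())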

Relevance:

70.00%

Publisher:

Abstract:

Dannie Jost gave an introductory presentation on the emergence of the open hardware phenomenon, including synthetic biology and other technological environments, at the HEPTech Workshop on Open Hardware, held on June 13 at GSI, Darmstadt (Germany). The workshop was organized by CERN and GSI. The event addressed the open source hardware (OSHW) phenomenon and its implications for academia and industry, with special attention to knowledge and technology transfer issues. Consideration was given to the various aspects of open source hardware development and how these are dealt with in academia and industry. Presentations from legal experts, academics, practitioners and business representatives provided input for the discussions and exchange of ideas.

Relevance:

70.00%

Publisher:

Abstract:

Dannie Jost gave a presentation outlining some of the challenges that open source hardware poses to the patent system at the "Open Knowledge Festival" (September 19, 2012; Helsinki, Finland), in the topic stream on open design, hardware, manufacturing and making. The topic stream generated considerable discussion; it serves to educate an audience that is usually very averse to patents and copyright, and helps the researcher understand the ensuing conflicts surrounding emerging technologies, in particular digital technologies and the digitally enabled maker movement.

Relevance:

70.00%

Publisher:

Abstract:

Starting from the Schumpeterian producer-driven understanding of innovation, and moving through user-generated solutions to collaborative forms of co-creation, scholars have investigated in various ways the drivers and the nature of the interactions underpinning success. The innovation literature has come a long way: open innovation has attracted researchers to investigate problems such as the compatibility of external resources, networks of innovation, and open source collaboration. Openness itself has gained various shades of meaning in the different strands of literature. In this paper the author provides an overview and a draft evaluation of the different models of open innovation, illustrated with empirical findings from various fields drawn from the literature. She points to the relevance of transaction costs in shaping viable (open) innovation strategies of firms, and to the importance of defining the locus of innovation for further analysis of different firm-level and interaction-level formations.

Relevance:

70.00%

Publisher:

Abstract:

Decision support systems have been used in companies for years to gain insights from internal data and thus make successful decisions. Lately, thanks to the increasing availability of open data, these systems are also integrating open data to enrich the decision-making process with external data. On the other hand, within an open-data scenario, decision support systems can also be useful for deciding which data should be opened, considering not only technical or legal constraints but also other requirements, such as the "reuse potential" of the data. In this talk, we focus on both issues: (i) open data for decision making, and (ii) decision making for opening data. We will first briefly comment on some research problems regarding the use of open data for decision making. Then, we will give an outline of a novel decision-making approach (based on how open data is actually being used in open-source projects hosted on GitHub) for supporting open data publication. Bio of the speaker: Jose-Norberto Mazón holds a PhD from the University of Alicante (Spain). He is head of the "Cátedra Telefónica" on Big Data and coordinator of the Computing degree at the University of Alicante. He is also a member of the WaKe research group at the University of Alicante. His research work focuses on open data management, data integration and business intelligence within "big data" scenarios, and their application to the tourism domain (smart tourism destinations). He has published his research in international journals such as Decision Support Systems, Information Sciences, Data & Knowledge Engineering and ACM Transactions on the Web. Finally, he is involved in the open data project at the University of Alicante, including its open data portal at http://datos.ua.es
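
The abstract does not spell out how usage in open-source projects is measured, but a crude version of such a "reuse potential" signal can be sketched against the public GitHub repository search API: count the repositories whose metadata mentions a given dataset or portal. The query form and the use of the raw hit count are assumptions for illustration only, not the talk's actual method.

import json
import urllib.parse
import urllib.request

def reuse_signal(dataset_identifier):
    # Count GitHub repositories whose name/description mentions the
    # dataset; a rough, illustrative proxy for its reuse potential.
    # Unauthenticated requests are heavily rate-limited by GitHub.
    query = urllib.parse.quote(f'"{dataset_identifier}"')
    url = f"https://api.github.com/search/repositories?q={query}"
    req = urllib.request.Request(
        url, headers={"Accept": "application/vnd.github+json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["total_count"]

if __name__ == "__main__":
    print(reuse_signal("datos.ua.es"))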

Relevance:

70.00%

Publisher:

Abstract:

FEA simulation of thermal metal cutting is central to interactive design and manufacturing. It is therefore relevant to assess the applicability of open FEA software to simulate 2D heat transfer in metal sheet laser cuts. Open source codes (e.g. FreeFem++, FEniCS, MOOSE) make additional scenarios possible (e.g. parallel or CUDA execution) at lower cost. However, a precise assessment is required of the scenarios in which open software can be a sound alternative to a commercial package. This article contributes in this regard by presenting a comparison of the aforementioned open FEM software for the simulation of heat transfer in thin (i.e. 2D) sheets subject to a gliding laser point source. We use the commercial ABAQUS software as the reference against which the open software is compared. A convective linear thin sheet heat transfer model, with and without material removal, is used. This article does not attempt a full design of computer experiments. Our partial assessment shows that the thin sheet approximation turns out to be adequate in terms of the relative error for linear alumina sheets. For mesh resolutions finer than 10e−5 m, the open and reference software temperatures differ by at most 1% of the temperature prediction. Ongoing work includes adaptive re-meshing, nonlinearities, sheet stress analysis and Mach (also called ‘relativistic’) effects.
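
None of the compared packages is needed to see the model class in miniature. Below is a hedged sketch, in plain NumPy, of the kind of problem the article benchmarks: explicit finite differences for 2D transient conduction in a thin sheet with a gliding Gaussian laser source and linear convective losses. All material and process parameters are illustrative rather than the article's values, and the article's own solvers are finite element, not finite difference.

import numpy as np

nx = ny = 101
side = 0.02                   # sheet side length [m]
dx = side / (nx - 1)
alpha = 8e-6                  # thermal diffusivity [m^2/s]
k = 30.0                      # conductivity [W/m/K]; alpha/k = 1/(rho*cp)
thick = 1e-3                  # sheet thickness [m]
h = 20.0                      # convection coefficient [W/m^2/K]
P, r0, v = 50.0, 5e-4, 0.01   # laser power [W], spot radius [m], speed [m/s]
T_inf = 300.0                 # ambient temperature [K]
dt = 0.2 * dx**2 / alpha      # below the explicit 2D stability limit dx^2/(4*alpha)

x = np.linspace(0, side, nx)
X, Y = np.meshgrid(x, x, indexing="ij")
T = np.full((nx, ny), T_inf)

def step(T, t):
    # five-point Laplacian; boundary rows are overwritten below
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx**2
    # Gaussian laser spot gliding along the x axis at speed v
    q = (P / (np.pi * r0**2)) * np.exp(-((X - v * t)**2
                                         + (Y - side / 2)**2) / r0**2)
    rhs = alpha * lap + (alpha / k) * (q / thick - h * (T - T_inf) / thick)
    Tn = T + dt * rhs
    Tn[0, :] = Tn[-1, :] = Tn[:, 0] = Tn[:, -1] = T_inf  # fixed edges
    return Tn

t = 0.0
for _ in range(500):
    T = step(T, t)
    t += dt
print("peak temperature [K]:", T.max())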

Relevance:

60.00%

Publisher:

Abstract:

Current IEEE 802.11 wireless networks are vulnerable to session hijacking attacks, as the existing standards fail to address the lack of authentication of management frames and network card addresses, and rely on loosely coupled state machines. Even the new WLAN security standard, IEEE 802.11i, does not address these issues. In our previous work, we proposed two new techniques for improving detection of session hijacking attacks that are passive, computationally inexpensive, reliable, and have minimal impact on network performance. These techniques utilise unspoofable characteristics from the MAC protocol and the physical layer to enhance confidence in the intrusion detection process. This paper extends our earlier work and explores the usability, robustness and accuracy of these intrusion detection techniques by applying them to eight distinct test scenarios. A correlation engine has also been introduced to keep the false positives and false negatives at a manageable level. We also explore the process of selecting optimum thresholds for both detection techniques. For the purposes of our experiments, the Snort-Wireless open source wireless intrusion detection system was extended to implement these new techniques and the correlation engine. The absence of any false negatives and the low number of false positives in all eight test scenarios successfully demonstrated the effectiveness of the correlation engine and the accuracy of the detection techniques.
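
The paper's own detectors and thresholds are not given in the abstract, so the following is only a structural sketch of a correlation engine of the kind described: two passive detectors, one MAC-layer (sequence-number jumps) and one physical-layer (received signal strength jumps), each flag anomalies, and an alarm is raised only when both fire within a short window, suppressing the false positives either would raise alone. Feature names, thresholds, and the window length are illustrative assumptions.

from collections import deque

WINDOW = 2.0          # seconds within which detections must correlate
SEQ_THRESHOLD = 64    # suspicious jump in 802.11 sequence numbers
RSSI_THRESHOLD = 12.0 # suspicious jump in received signal strength [dB]

class CorrelationEngine:
    def __init__(self):
        self.mac_hits = deque()   # timestamps of MAC-layer detections
        self.phy_hits = deque()   # timestamps of physical-layer detections

    def _expire(self, hits, now):
        while hits and now - hits[0] > WINDOW:
            hits.popleft()

    def observe(self, now, seq_jump, rssi_jump):
        if seq_jump > SEQ_THRESHOLD:
            self.mac_hits.append(now)
        if abs(rssi_jump) > RSSI_THRESHOLD:
            self.phy_hits.append(now)
        self._expire(self.mac_hits, now)
        self._expire(self.phy_hits, now)
        # alarm only when both detectors agree within the window
        return bool(self.mac_hits and self.phy_hits)

engine = CorrelationEngine()
print(engine.observe(10.0, seq_jump=80, rssi_jump=15.0))  # True: both fire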

Relevance:

60.00%

Publisher:

Abstract:

Access All was a performance produced following a three-month mentorship in web-based performance that I was commissioned to conduct for the performance company Igneous. This live, triple-site performance event for three performers in three remote venues was specifically designed for presentation at Access Grid Nodes: conference rooms located around the globe, equipped with high-end, open source computer teleconferencing technology that allowed multiple nodes to cross-connect with each other. Whilst each room was set up somewhat differently, they all deployed the same basic infrastructure of multiple projectors, cameras, and sound, as well as a reconfigurable floorspace. At that time these relatively formal setups imposed a clear series of limitations in terms of software capabilities and basic infrastructure, and so there was much interest in understanding how far their capabilities might be pushed.

Numerous performance experiments were undertaken between three Access Grid nodes at QUT Brisbane, VISLAB Sydney and the Manchester Supercomputing Centre, England, culminating in the public performance staged simultaneously between the sites, with local audiences at each venue and others online. Access All was devised in collaboration with the interdisciplinary performance company Bonemap, Kelli Dipple (Interarts curator, Tate Modern, London) and Mike Stubbs, British curator and Director of FACT (Liverpool).

This period of research and development was instigated and shaped by a public lecture I had earlier delivered in Sydney for the ‘Global Access Grid Network, Super Computing Global Conference’, entitled 'Performance Practice across Electronic Networks'. The findings of this work went on to inform numerous networked and performative works produced from 2002 onwards.

Relevance:

60.00%

Publisher:

Abstract:

The Australian Research Collaboration Service (ARCS) has been supporting a wide range of collaboration services and tools which allow researchers, groups and research communities to share ideas and collaborate across organisational boundaries.

This talk will give an introduction to a number of exciting technologies which are now available. The focus will be on two main areas: video collaboration tools, allowing researchers to talk face-to-face and share data in real time, and web collaboration tools, allowing researchers to share information and ideas with other like-minded researchers irrespective of distance or organisational structure. A number of examples will also be shown of how these technologies have been used within various research communities.

A brief introduction will be given to a number of services which ARCS is now operating and/or supporting, such as:

* EVO – a video conferencing application which is particularly suited to desktop or low bandwidth applications.
* AccessGrid – an open source video conferencing and collaboration toolkit which is great for room-to-room meetings.
* Sakai – an online collaboration and learning environment supporting teaching and learning, ad hoc group collaboration, portfolios and research collaboration.
* Plone – a ready-to-run content management system for managing web content that is ideal for project groups, communities, web sites, extranets and intranets.
* Wikis – a way to easily create, edit, and link pages together to create collaborative websites.

Relevance:

60.00%

Publisher:

Abstract:

Registration fees for this workshop are being met by ARCS. There is no cost to attend; however, space is limited.

The Australian Research Collaboration Service (ARCS) has been supporting a wide range of collaboration services and tools which allow researchers, groups and research communities to share ideas and collaborate across organisational boundaries.

This workshop will give an introduction to a number of web based and real-time collaboration tools and services which researchers may find useful for day-to-day collaboration with members of a research team located within an institution or across institutions. Attendees will be shown how a number of these tools work, with strong emphasis placed on how these tools can help facilitate communication and collaboration. Attendees will have the opportunity to try out a number of examples themselves, and to interact with the workshop staff to discuss how their own use cases could benefit from the tools and services which can be provided.

Outline: a hands-on introduction will be given to a number of services which ARCS is now operating and/or supporting, such as:

* EVO – a video conferencing environment which is particularly suited to desktop or low bandwidth applications.
* AccessGrid – an open source video conferencing and collaboration toolkit which is great for room-to-room meetings.
* Sakai – an online collaboration and learning environment supporting teaching and learning, ad hoc group collaboration, portfolios and research collaboration.
* Plone and Drupal – ready-to-run content management systems for managing web content, ideal for project groups, communities, web sites, extranets and intranets.
* Wikis – a way to easily create, edit, and link pages together to create collaborative websites.

Relevance:

60.00%

Publisher:

Abstract:

Alvin Toffler’s image of the prosumer (1970, 1980, 1990) continues to significantly influence our understanding of the user-led, collaborative processes of content creation which are today labelled “social media” or “Web 2.0”. A closer look at Toffler’s own description of his prosumer model reveals, however, that it remains firmly grounded in the mass media age: the prosumer is clearly not the self-motivated creative originator and developer of new content which can today be observed in projects ranging from open source software through Wikipedia to Second Life, but simply a particularly well-informed, and therefore both particularly critical and particularly active, consumer. The highly specialised, high-end consumers which exist in areas such as hi-fi or car culture are far more representative of the ideal prosumer than the participants in non-commercial (or as yet non-commercial) collaborative projects. To expect Toffler’s 1970s model of the prosumer to describe these 21st-century phenomena was, of course, always unrealistic. To describe the creative and collaborative participation which today characterises user-led projects such as Wikipedia, terms such as ‘production’ and ‘consumption’ are no longer particularly useful – even in laboured constructions such as ‘commons-based peer-production’ (Benkler 2006) or ‘p2p production’ (Bauwens 2005). In the user communities participating in such forms of content creation, roles as consumers and users have long begun to be inextricably interwoven with those of producer and creator: users are always already also able to be producers of the shared information collection, regardless of whether they are aware of that fact – they have taken on a new, hybrid role which may be best described as that of a produser (Bruns 2008). Projects which build on such produsage can be found in areas from open source software development through citizen journalism to Wikipedia, and beyond this also in multi-user online computer games, filesharing, and even in communities collaborating on the design of material goods. While addressing a range of different challenges, they nonetheless build on a small number of universal key principles. This paper documents these principles and indicates the possible implications of this transition from production and prosumption to produsage.

Relevance:

60.00%

Publisher:

Abstract:

Alvin Toffler’s image of the prosumer continues to significantly shape our understanding of many of the user-driven, collaborative processes of content creation which are today described as “social media” or “Web 2.0”. A closer look at Toffler’s own description of his prosumer model reveals, however, that it remains firmly anchored in the age of mass media dominance: the prosumer is precisely not that self-motivated, active, creative creator and further developer of new content found today in projects from open source software through Wikipedia to Second Life, but merely a particularly well-informed, and therefore in his consumption behaviour both particularly critical and particularly active, consumer. Highly specialised, high-end consumers, for instance in the hi-fi or automotive field, represent the ideal of the prosumer far better than is the case for participants in user-driven collaborative projects which are often precisely not (or at least not yet) commercially captured. To expect this of Toffler’s model, developed in the 1970s, is in any case certainly asking too much. The problem thus lies not with Toffler himself, but rather with the conceptions prevailing in the industrial age of a process divided quite clearly into production, distribution and consumption. This tripartite division was entirely necessary for the creation of material as well as immaterial goods – it holds even for the conventional mass media, in which content production was, for commercial reasons, concentrated in a few institutions, just as is the case for the production of consumer goods. In the dawning information age, dominated by decentralised media networks and widely available and affordable means of production, however, the situation is different. What happens when distribution occurs automatically, and when almost every consumer can also be a producer, in place of a small band of commercially supported producers assisted at best by perhaps a handful of near-professional prosumers? What happens when the number of consumers active as producers – described by Eric von Hippel as ‘lead users’ – expands massively; when, as Wikipedia’s slogan puts it, ‘anyone can edit’, so that potentially every user can actively take part in content creation? To describe the creative and collaborative participation which today characterises user-driven projects such as Wikipedia, terms like ‘production’ and ‘consumption’ are only of limited use – even in constructions such as ‘user-driven production’ or ‘p2p production’. In the user communities participating in such forms of content creation, roles as consumers and users have long since become irretrievably intermixed with those as producers: users are always inescapably also producers of the shared information collection, whether or not they are aware of this fact: they have taken on a new, hybrid role which may best be described as that of a ‘produser’. Projects building on such produsage can be found in areas from open source software through citizen journalism to Wikipedia, and beyond this increasingly also in computer games, filesharing, and even in the design of material goods.
Although they differ in their orientation, they all build on a small number of universal core principles. This talk describes these principles and points out the possible implications of this transition from production (and prosumption) to produsage.

Relevance:

60.00%

Publisher:

Abstract:

Discrete event-driven simulations of digital communication networks have been widely used. However, it is difficult to use a network simulator to simulate a hybrid system in which some objects are not discrete event-driven but continuous time-driven. A networked control system (NCS) is such an application, in which physical process dynamics are continuous by nature. We have designed and implemented a hybrid simulation environment which effectively integrates models of continuous-time plant processes and discrete-event communication networks by extending the open source network simulator NS-2. To do this, a synchronisation mechanism was developed to connect a continuous plant simulation with a discrete network simulation. Furthermore, for evaluating co-design approaches in an NCS environment, a piggybacking method was adopted to allow the control period to be adjusted during simulations. The effectiveness of the technique is demonstrated through case studies which simulate a networked control scenario in which the communication and control system properties are defined explicitly.
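
The abstract leaves the synchronisation mechanism unspecified; the sketch below shows the general scheme in miniature, under the assumption that it resembles a standard co-simulation loop: the continuous plant model is integrated up to the timestamp of each pending network event before that event is processed, so every packet handler sees an up-to-date plant state. The plant ODE, event list, and step size are illustrative, and NS-2's actual extension hooks are not shown.

import heapq

class Plant:
    def __init__(self, x0=1.0):
        self.x, self.t = x0, 0.0

    def advance(self, t_new, dt=1e-3):
        # integrate dx/dt = -x with explicit Euler up to the event time
        while self.t + dt <= t_new:
            self.x += dt * (-self.x)
            self.t += dt

events = []  # (timestamp, name) priority queue of network events
for ts, name in [(0.05, "sensor_pkt"), (0.12, "actuator_pkt")]:
    heapq.heappush(events, (ts, name))

plant = Plant()
while events:
    ts, name = heapq.heappop(events)
    plant.advance(ts)  # synchronise the continuous model first
    print(f"t={ts:.2f}s {name}: plant state x={plant.x:.4f}")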

Relevance:

60.00%

Publisher:

Abstract:

Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at the centimetre accuracy level in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is single-base RTK. In Australia there are several NRTK services operating in different states, and over 1000 single-base RTK systems support precise positioning applications for surveying, mining, agriculture, and civil construction in regional areas. Additionally, future generation GNSS constellations with multiple frequencies, including modernised GPS, Galileo, GLONASS, and Compass, have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of the various isolated network and single-base RTK systems and multiple GNSS constellations for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services, including:

• multiple GNSS constellations and multiple frequencies;
• large scale, wide area NRTK services with a network of networks;
• complex computation algorithms and processes;
• a greater part of the positioning process shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous user requests (reverse RTK).

From these four challenges follow two major requirements for NRTK data processing: expandable computing power and scalable data sharing/transfer capability. This research explores new approaches to addressing these future NRTK challenges and requirements using a Grid Computing facility, in particular for large data processing burdens and complex computation algorithms. A Grid Computing based NRTK framework is proposed in this research, which is a layered framework consisting of: 1) a client layer in the form of a Grid portal; 2) a service layer; 3) an execution layer. The user’s request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework is performed in a five-node Grid environment at QUT and on Grid Australia. The Networked Transport of RTCM via Internet Protocol (Ntrip) open source software is adopted to download real-time RTCM data from multiple reference stations through the Internet, followed by job scheduling and simplified RTK computing. The system performance has been analysed, and the results preliminarily demonstrate the concepts and functionality of the new NRTK framework based on Grid Computing, whilst some aspects of the system’s performance are yet to be improved in future work.
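
As context for the data-ingest step, the following is a minimal sketch of an Ntrip (version 1) client of the kind used to pull a real-time RTCM stream from a caster: an HTTP-style GET to the caster's mountpoint, an "ICY 200 OK" status line, then raw RTCM bytes on the same connection. The caster host, mountpoint, and credentials are illustrative placeholders; the thesis itself adopts an existing open source Ntrip package rather than this sketch.

import base64
import socket

HOST, PORT = "ntrip.example.org", 2101  # 2101 is the conventional Ntrip port
MOUNT, USER, PASSWORD = "MOUNT1", "user", "pass"  # placeholders

def fetch_rtcm(max_bytes=4096):
    # Build the Ntrip v1 request: plain HTTP GET plus basic authentication.
    credentials = base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()
    request = (f"GET /{MOUNT} HTTP/1.0\r\n"
               f"User-Agent: NTRIP pyclient/0.1\r\n"
               f"Authorization: Basic {credentials}\r\n\r\n")
    with socket.create_connection((HOST, PORT), timeout=10) as sock:
        sock.sendall(request.encode())
        reply = sock.recv(1024)
        if not reply.startswith(b"ICY 200 OK"):
            raise RuntimeError(f"caster refused request: {reply[:64]!r}")
        # Raw RTCM frames follow the status line on the same stream.
        data = reply.split(b"\r\n", 1)[1] if b"\r\n" in reply else b""
        while len(data) < max_bytes:
            chunk = sock.recv(1024)
            if not chunk:
                break
            data += chunk
        return data

if __name__ == "__main__":
    rtcm = fetch_rtcm()
    print(f"received {len(rtcm)} bytes of RTCM data")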