150 results for Open-source code


Relevance: 90.00%

Abstract:

On 19 June 2015, representatives from over 40 Australian research institutions gathered in Canberra to launch their Open Data Collections. The one-day event, hosted by the Australian National Data Service (ANDS), showcased to government and a range of national stakeholders the rich variety of data collections generated through the Major Open Data Collections (MODC) project. Colin Eustace attended the showcase for QUT Library and presented a poster reflecting the work that he and Jodie Vaughan generated through the project. QUT’s Blueprint 4, the University’s five-year institutional strategic plan, outlines the key priorities of developing a commitment to working in partnership with industry, and of combining disciplinary strengths with interdisciplinary application. The Division of Technology, Information and Learning Support (TILS) has undertaken a number of ANDS-funded projects since 2009 with the aim of developing improved research data management services within the University to support these strategic aims. By leveraging existing tools and systems developed during these projects, the MODC project delivered support to multi-disciplinary collaborative research activities through partnership building between QUT researchers and Queensland government agencies, in order to add to, and promote the discovery and reuse of, a collection of spatially referenced datasets. The MODC project built upon existing Research Data Finder infrastructure (which uses the open-source VIVO software developed by Cornell University) to develop a separate collection, Spatial Data Finder (https://researchdatafinder.qut.edu.au/spatial), as the interface to display the spatial data collection. During the course of the project, 62 dataset descriptions were added to Spatial Data Finder, seven were added to Research Data Finder, and two were added to Software Finder, another separate collection. The project team met with 116 individual researchers and attended 13 school and faculty meetings to promote the MODC project and raise awareness of the Library’s services and resources for research data management.

Relevance: 80.00%

Abstract:

Current IEEE 802.11 wireless networks are vulnerable to session hijacking attacks, as the existing standards fail to address the lack of authentication of management frames and network card addresses, and rely on loosely coupled state machines. Even the new WLAN security standard, IEEE 802.11i, does not address these issues. In our previous work, we proposed two new techniques for improving detection of session hijacking attacks that are passive, computationally inexpensive, reliable, and have minimal impact on network performance. These techniques utilise unspoofable characteristics from the MAC protocol and the physical layer to enhance confidence in the intrusion detection process. This paper extends our earlier work and explores the usability, robustness and accuracy of these intrusion detection techniques by applying them to eight distinct test scenarios. A correlation engine has also been introduced to keep false positives and false negatives at a manageable level. We also explore the process of selecting optimum thresholds for both detection techniques. For the purposes of our experiments, the open-source wireless intrusion detection system Snort-Wireless was extended to implement these new techniques and the correlation engine. The absence of false negatives and the low number of false positives across all eight test scenarios demonstrated the effectiveness of the correlation engine and the accuracy of the detection techniques.
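The role of such a correlation engine can be sketched in a few lines: each detection technique emits an anomaly score for a monitored station, and an alarm is raised only when the MAC-layer and physical-layer detectors corroborate each other within a short time window, suppressing the false positives either detector would produce alone. The following Python sketch illustrates this idea only; the detector labels, score scale, thresholds and window size are assumptions for illustration, not the implementation used with Snort-Wireless.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    source: str    # hypothetical detector label, e.g. "mac" or "phy"
    station: str   # MAC address of the monitored station
    time: float    # detection timestamp in seconds
    score: float   # anomaly score in [0, 1]

class CorrelationEngine:
    """Raise a hijack alarm only when both detectors flag the same
    station within a time window, keeping false positives low."""

    def __init__(self, threshold: float = 0.8, window: float = 2.0):
        self.threshold = threshold  # per-detector anomaly threshold
        self.window = window        # max seconds between corroborating alerts
        self.pending = {}           # station -> last above-threshold alert

    def feed(self, alert: Alert):
        if alert.score < self.threshold:
            return None  # below threshold: ignore
        prev = self.pending.get(alert.station)
        if prev and prev.source != alert.source and \
                alert.time - prev.time <= self.window:
            del self.pending[alert.station]
            return f"session hijacking suspected for {alert.station}"
        self.pending[alert.station] = alert  # hold until corroborated
        return None
```

Feeding the engine one MAC-layer alert and one physical-layer alert for the same station within the window triggers the alarm; an isolated alert is simply held and later superseded, trading a small detection delay for fewer false positives.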

Relevance: 80.00%

Abstract:

Access All was a performance produced following a three-month mentorship in web-based performance that I was commissioned to conduct for the performance company Igneous. This live, triple-site performance event for three performers in three remote venues was specifically designed for presentation at Access Grid Nodes: conference rooms located around the globe, equipped with high-end, open-source computer teleconferencing technology that allowed multiple nodes to cross-connect with each other. Whilst each room was set up somewhat differently, they all deployed the same basic infrastructure of multiple projectors, cameras and sound, as well as a reconfigurable floor space. At that time these relatively formal setups imposed a clear series of limitations in terms of software capabilities and basic infrastructure, and so there was much interest in understanding how far their capabilities might be pushed.

Numerous performance experiments were undertaken between three Access Grid nodes at QUT Brisbane, VISLAB Sydney and the Manchester Supercomputing Centre, England, culminating in the public performance staged simultaneously between the sites, with local audiences at each venue and others online. Access All was devised in collaboration with the interdisciplinary performance company Bonemap, Kelli Dipple (Interarts curator, Tate Modern, London) and Mike Stubbs, British curator and Director of FACT (Liverpool).

This period of research and development was instigated and shaped by a public lecture I had earlier delivered in Sydney for the Global Access Grid Network’s Super Computing Global Conference, entitled 'Performance Practice across Electronic Networks'. The findings of this work went on to inform numerous networked and performative works produced from 2002 onwards.

Relevance: 80.00%

Abstract:

The Australian Research Collaboration Service (ARCS) has been supporting a wide range of collaboration services and tools which allow researchers, groups and research communities to share ideas and collaborate across organisational boundaries.

This talk will give an introduction to a number of exciting technologies which are now available. The focus will be on two main areas: video collaboration tools, allowing researchers to talk face-to-face and share data in real time, and web collaboration tools, allowing researchers to share information and ideas with other like-minded researchers irrespective of distance or organisational structure. A number of examples will also be shown of how these technologies have been used within various research communities.

A brief introduction will be given to a number of services which ARCS is now operating and/or supporting, such as:

* EVO – a video conferencing application particularly suited to desktop or low-bandwidth applications.
* AccessGrid – an open-source video conferencing and collaboration toolkit well suited to room-to-room meetings.
* Sakai – an online collaboration and learning environment supporting teaching and learning, ad hoc group collaboration, portfolios and research collaboration.
* Plone – a ready-to-run content management system for managing web content, ideal for project groups, communities, web sites, extranets and intranets.
* Wikis – an easy way to create, edit and link pages together to build collaborative websites.

Relevance: 80.00%

Abstract:

Registration fees for this workshop are being met by ARCS. There is no cost to attend; however, space is limited.

The Australian Research Collaboration Service (ARCS) has been supporting a wide range of collaboration services and tools which allow researchers, groups and research communities to share ideas and collaborate across organisational boundaries.

This workshop will give an introduction to a number of web-based and real-time collaboration tools and services which researchers may find useful for day-to-day collaboration with members of a research team located within an institution or across institutions. Attendees will be shown how a number of these tools work, with strong emphasis placed on how they can help facilitate communication and collaboration. Attendees will have the opportunity to try out a number of examples themselves, and to discuss with the workshop staff how their own use cases could benefit from the tools and services on offer.

Outline: a hands-on introduction will be given to a number of services which ARCS is now operating and/or supporting, such as:

* EVO – a video conferencing environment particularly suited to desktop or low-bandwidth applications.
* AccessGrid – an open-source video conferencing and collaboration toolkit well suited to room-to-room meetings.
* Sakai – an online collaboration and learning environment supporting teaching and learning, ad hoc group collaboration, portfolios and research collaboration.
* Plone and Drupal – ready-to-run content management systems for managing web content, ideal for project groups, communities, web sites, extranets and intranets.
* Wikis – an easy way to create, edit and link pages together to build collaborative websites.

Relevance: 80.00%

Abstract:

Alvin Toffler’s image of the prosumer (1970, 1980, 1990) continues to significantly influence our understanding of the user-led, collaborative processes of content creation which are today labelled “social media” or “Web 2.0”. A closer look at Toffler’s own description of his prosumer model reveals, however, that it remains firmly grounded in the mass-media age: the prosumer is clearly not the self-motivated creative originator and developer of new content which can today be observed in projects ranging from open source software through Wikipedia to Second Life, but simply a particularly well-informed, and therefore both particularly critical and particularly active, consumer. The highly specialised, high-end consumers which exist in areas such as hi-fi or car culture are far more representative of the ideal prosumer than are the participants in non-commercial (or as yet non-commercial) collaborative projects. To expect Toffler’s 1970s model of the prosumer to describe these 21st-century phenomena was, of course, always unrealistic. To describe the creative and collaborative participation which today characterises user-led projects such as Wikipedia, terms such as ‘production’ and ‘consumption’ are no longer particularly useful – even in laboured constructions such as ‘commons-based peer-production’ (Benkler 2006) or ‘p2p production’ (Bauwens 2005). In the user communities participating in such forms of content creation, roles as consumers and users have long begun to be inextricably interwoven with those as producer and creator: users are always already also able to be producers of the shared information collection, regardless of whether they are aware of that fact – they have taken on a new, hybrid role which may be best described as that of a produser (Bruns 2008). Projects which build on such produsage can be found in areas from open source software development through citizen journalism to Wikipedia, and beyond this also in multi-user online computer games, filesharing, and even in communities collaborating on the design of material goods. While addressing a range of different challenges, they nonetheless build on a small number of universal key principles. This paper documents these principles and indicates the possible implications of this transition from production and prosumption to produsage.

Relevance: 80.00%

Abstract:

Alvin Toffler’s image of the prosumer continues to significantly shape our understanding of many of the user-driven, collaborative processes of content creation which are today described as “social media” or “Web 2.0”. A closer look at Toffler’s own description of his prosumer model reveals, however, that it remains firmly anchored in the age of mass-media dominance: the prosumer is precisely not the self-motivated, creative originator and developer of new content as found today in projects ranging from open source software through Wikipedia to Second Life, but merely a particularly well-informed, and therefore in its consumption behaviour both particularly critical and particularly active, consumer. Highly specialised, high-end consumers, for instance in hi-fi or automotive culture, represent the ideal of the prosumer far better than do the participants in user-driven collaborative projects which are often precisely not (or at least not yet) commercial in nature. To expect this much of a model Toffler developed in the 1970s is, in any case, surely asking too much. The problem therefore lies not with Toffler himself, but rather with the conceptions prevailing in the industrial age of a process divided fairly neatly into production, distribution, and consumption. This tripartite division was certainly necessary for the creation of material as well as immaterial goods – it holds even for the conventional mass media, in which content production was, for commercial reasons, concentrated in a few institutions, just as is the case for the production of consumer goods. In the emerging information age, dominated by decentralised media networks and widely available and affordable means of production, the situation is different, however. What happens when distribution occurs automatically, and when almost every consumer can also be a producer, in place of a small band of commercially supported producers assisted at best by a handful of near-professional prosumers? What happens when the number of consumers active as producers – those Eric von Hippel describes as ‘lead users’ – expands massively; when, as Wikipedia’s slogan puts it, ‘anyone can edit’, so that potentially every user can take an active part in content creation? To describe the creative and collaborative participation which today characterises user-driven projects such as Wikipedia, terms such as ‘production’ and ‘consumption’ are only of limited use – even in constructions such as ‘user-driven production’ or ‘p2p production’. In the user communities participating in such forms of content creation, roles as consumers and users have long since become irreversibly merged with those as producers: users are always unavoidably also producers of the shared information collection, regardless of whether they are aware of that fact; they have taken on a new, hybrid role which may best be described as that of a ‘produser’. Projects building on such produsage can be found in areas from open source software through citizen journalism to Wikipedia, and beyond that increasingly also in computer games, filesharing, and even the design of material goods. Though differing in their orientation, they nonetheless build on a small number of universal key principles. This talk describes these key principles and points to the possible implications of this transition from production (and prosumption) to produsage.

Relevance: 80.00%

Abstract:

Discrete event-driven simulations of digital communication networks have been widely used. However, it is difficult to use a network simulator to simulate a hybrid system in which some objects are not discrete event-driven but continuous time-driven. A networked control system (NCS) is such an application, in which physical process dynamics are continuous by nature. We have designed and implemented a hybrid simulation environment which effectively integrates models of continuous-time plant processes and discrete-event communication networks by extending the open-source network simulator NS-2. To do this, a synchronisation mechanism was developed to connect a continuous plant simulation with a discrete network simulation. Furthermore, for evaluating co-design approaches in an NCS environment, a piggybacking method was adopted to allow the control period to be adjusted during simulations. The effectiveness of the technique is demonstrated through case studies which simulate a networked control scenario in which the communication and control system properties are defined explicitly.
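The synchronisation problem described here can be illustrated with a simple time-stepped loop: the continuous plant is numerically integrated up to the timestamp of the next discrete network event, and the control input applied to the plant only changes when the simulated network delivers it. The Python sketch below is a minimal illustration of that idea under an assumed first-order plant and a hand-written event list; it is not the NS-2 extension described in the paper.

```python
import heapq

def cosimulate(plant_step, events, t_end, dt=0.001):
    """Hybrid loop: integrate a continuous plant between the
    discrete events produced by a network simulation.

    plant_step(x, u, dt) -> next state (one integration step)
    events: list of (time, control_input) pairs from the network
    """
    heapq.heapify(events)
    t, x, u = 0.0, 0.0, 0.0  # time, plant state, control input
    trace = []
    while t < t_end:
        # Stop integrating at the next network event (or at t_end).
        t_next = min(events[0][0] if events else t_end, t_end)
        while t < t_next:
            x = plant_step(x, u, dt)  # continuous-time part
            t += dt
            trace.append((t, x, u))
        if events and events[0][0] <= t:
            _, u = heapq.heappop(events)  # network delivers a new input
    return trace

# Assumed first-order plant dx/dt = -x + u with two delayed control updates.
plant = lambda x, u, dt: x + dt * (-x + u)
trace = cosimulate(plant, [(0.2, 1.0), (0.6, 0.5)], t_end=1.0)
```

In the paper's setting the event times would come from NS-2's packet delivery schedule rather than a fixed list, and the piggybacking method would additionally allow the controller's sampling period to change mid-run.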

Relevance: 80.00%

Abstract:

Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at the centimetre accuracy level in the context of Global Navigation Satellite Systems (GNSS). While a Network RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is single-base RTK. In Australia there are several NRTK services operating in different states and over 1000 single-base RTK systems supporting precise positioning applications for surveying, mining, agriculture, and civil construction in regional areas. Additionally, future-generation GNSS constellations with multiple frequencies, including modernised GPS, Galileo, GLONASS, and Compass, have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of the various isolated operating networks, single-base RTK systems, and multiple GNSS constellations for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services, including:

* multiple GNSS constellations and multiple frequencies;
* large-scale, wide-area NRTK services with a network of networks;
* complex computation algorithms and processes;
* a greater part of the positioning process shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous user requests (reverse RTK).

These four challenges lead to two major requirements for NRTK data processing: expandable computing power and scalable data sharing/transfer capability. This research explores new approaches to addressing these future NRTK challenges and requirements using Grid Computing, in particular for large data-processing burdens and complex computation algorithms. A Grid Computing based NRTK framework is proposed, consisting of three layers: 1) a client layer in the form of a Grid portal; 2) a service layer; and 3) an execution layer. A user's request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework was performed in a five-node Grid environment at QUT and on Grid Australia. Open-source Networked Transport of RTCM via Internet Protocol (Ntrip) software was adopted to download real-time RTCM data from multiple reference stations over the Internet, followed by job scheduling and simplified RTK computation. The system performance has been analysed, and the results preliminarily demonstrate the concept and functionality of the new Grid Computing based NRTK framework, although some aspects of system performance remain to be improved in future work.
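The Ntrip step in this pipeline is a simple HTTP-like exchange: the client connects to a caster (port 2101 by convention), requests a mountpoint with basic authentication, and, after an "ICY 200 OK" response, reads a continuous stream of binary RTCM messages. The Python sketch below shows a minimal Ntrip (revision 1) client for illustration; the host, mountpoint and credentials are placeholders, and this is not the software used in the study.

```python
import base64
import socket

def ntrip_stream(host, port, mountpoint, user, password):
    """Minimal Ntrip rev. 1 client: request one mountpoint from a
    caster and yield raw RTCM chunks (placeholder credentials)."""
    auth = base64.b64encode(f"{user}:{password}".encode()).decode()
    request = (
        f"GET /{mountpoint} HTTP/1.0\r\n"
        f"User-Agent: NTRIP MinimalClient/0.1\r\n"
        f"Authorization: Basic {auth}\r\n"
        "\r\n"
    )
    sock = socket.create_connection((host, port), timeout=10)
    sock.sendall(request.encode())
    reply = sock.recv(1024)  # caster header; may already trail RTCM bytes
    if not reply.startswith(b"ICY 200 OK"):
        raise ConnectionError(f"caster refused request: {reply[:64]!r}")
    while True:
        chunk = sock.recv(4096)  # raw RTCM for the RTK computation stage
        if not chunk:
            break
        yield chunk

# for rtcm in ntrip_stream("caster.example.org", 2101, "MOUNT1", "u", "p"):
#     schedule_job(rtcm)  # hypothetical hand-off to the Grid service layer
```

In the proposed framework, a client of this kind would sit at the service layer, with the downloaded RTCM streams handed to Grid nodes for the RTK computation itself.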

Relevance: 80.00%

Abstract:

It was probably just an attempt by a long-established institution of the mass media to ingratiate itself once more with “the people formerly known as the audience” (Rosen, 2006, n.pag.). Nonetheless, the recognition of the collective “You” of YouTube and other collaborative online platforms as Person of the Year by the American magazine Time (Grossman, 2007) provides a further indication of the growing importance of such projects for the communal production and distribution of content. Creative websites such as Flickr and YouTube, collaborative knowledge collections from Wikipedia through Digg to Google Earth, user-driven discussions on Slashdot, OhmyNews, and the blogosphere at large, but also the software development communities of the open source realm – they all serve as examples of this now established trend towards the development of new models of production, business, community, and self-governance, substantially supported by increasingly complex Web 2.0 tools. Behind these examples a more general tendency becomes visible, one which has already been described from an economic perspective by Yochai Benkler as “commons-based peer production” (2006) and by Eric von Hippel as “democratizing innovation” (2005). Henry Jenkins further speaks of a “convergence culture” (2006) in which such user-driven projects operate, and connections can also be drawn between these now more active users and Alvin Toffler’s professional consumers, the “prosumers” (1971).

Relevance: 80.00%

Abstract:

Workflow Management Systems (WfMSs) enable the development and maintenance of workflow specifications at design time and their execution and monitoring at runtime. The open-source WfMS YAWL supports the YAWL language – a formally defined language, based on Petri nets, which offers comprehensive support for control-flow and resource patterns. In addition, the YAWL system provides extensive support for process flexibility, in particular for process configuration, exception handling, dynamic workflow, and declarative workflow. Due to its formal foundation, sophisticated verification support can also be achieved. This paper presents the YAWL system and its main applications.
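The Petri net foundation mentioned above is what makes such verification tractable: a net is a set of places holding tokens and transitions that may fire when all their input places are marked. The Python sketch below shows bare Petri net firing semantics as background to the formalism only; it is not YAWL's engine, and the YAWL language extends this core with constructs such as cancellation regions and multiple-instance tasks.

```python
def enabled(marking, transition):
    """A transition is enabled when every input place holds a token."""
    return all(marking.get(p, 0) >= 1 for p in transition["inputs"])

def fire(marking, transition):
    """Consume a token from each input place, produce one on each output."""
    m = dict(marking)
    for p in transition["inputs"]:
        m[p] -= 1
    for p in transition["outputs"]:
        m[p] = m.get(p, 0) + 1
    return m

# A trivial sequential workflow: start --approve--> end
approve = {"inputs": ["start"], "outputs": ["end"]}
marking = {"start": 1}
if enabled(marking, approve):
    marking = fire(marking, approve)  # {'start': 0, 'end': 1}
```

Verification questions such as "can the workflow always reach its end place?" become reachability questions over markings like these, which is what the formal foundation buys.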

Relevance: 80.00%

Abstract:

To date, most theories of business models have theorized value capture by assuming that appropriability regimes are exogenous and that the firm faces a unique, ideal-typical appropriability regime. This has led theoretical contributions to focus on governance structures that minimize transaction costs, to downplay the interdependencies between value capture and value creation, and to ignore revenue generation strategies. We propose a reconceptualization of business models' value capture mechanisms that rests on assumptions of endogeneity and multiplicity of appropriability regimes. This new approach to business model construction highlights the interdependencies and trade-offs between value creation and value capture offered by different types and combinations of appropriability regimes. The theory is illustrated by the analysis of three cases of open source software business models.

Relevance: 80.00%

Abstract:

Public transportation is an environment with great potential for applying location-based services through mobile devices. The BusTracker study is examining how real-time passenger information systems can provide a core platform to improve commuters’ experiences. These systems rely on mobile computing and GPS technology to provide accurate information on transport vehicle locations, and BusTracker builds on this mobile computing platform and on geospatial information. The pilot study runs on the open-source BugLabs computing platform, using a GPS module for accurate location information.
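GPS modules of the kind used here typically report fixes as NMEA 0183 sentences, with the $GPGGA sentence carrying the position data a vehicle tracker needs. The Python sketch below shows the standard conversion from a GPGGA sentence to decimal-degree coordinates; the sample sentence is illustrative only, and this is not BusTracker code.

```python
def parse_gpgga(sentence):
    """Convert an NMEA $GPGGA sentence to (lat, lon) in decimal
    degrees. NMEA encodes angles as ddmm.mmmm / dddmm.mmmm."""
    fields = sentence.split(",")
    if not fields[0].endswith("GGA") or fields[6] == "0":
        return None  # not a GGA sentence, or no satellite fix
    lat = int(fields[2][:2]) + float(fields[2][2:]) / 60.0
    if fields[3] == "S":
        lat = -lat
    lon = int(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[5] == "W":
        lon = -lon
    return lat, lon

# Illustrative sentence only (roughly Brisbane), not real BusTracker data:
sample = "$GPGGA,123519,2724.700,S,15301.400,E,1,08,0.9,15.0,M,,M,,*47"
print(parse_gpgga(sample))  # approx (-27.4117, 153.0233)
```

A tracker built on this would attach a vehicle identifier and timestamp to each fix and push it to the real-time passenger information service.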

Relevance: 80.00%

Abstract:

The SoundCipher software library provides an easy way to create music in the Processing development environment. With the SoundCipher library added to Processing, you can write software programs that make music to go along with your graphics, and you can add sounds to enhance your Processing animations or games. SoundCipher provides an easy interface for playing 'notes' on the JavaSound synthesizer, for playing back audio files, and for communicating via MIDI. It provides accurate scheduling and allows events to be organised in musical time, using beats and tempo. It uses a 'score' metaphor that allows the construction of simple or complex musical arrangements. SoundCipher is designed to facilitate the basics of algorithmic music and interactive sound design as well as providing a platform for sophisticated computational music. It integrates with the Minim library when more sophisticated audio and synthesis functionality is required, and with the oscP5 library for communicating via Open Sound Control.
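The 'musical time' scheduling described here comes down to converting beat positions into wall-clock times at a given tempo, so that a score can be expressed in beats rather than seconds. The Python sketch below illustrates just that conversion with a toy score of (beat, MIDI pitch) pairs; it mimics the idea only and is not SoundCipher's Java API.

```python
def beats_to_seconds(beat, tempo_bpm):
    """At tempo_bpm beats per minute, one beat lasts 60/tempo_bpm seconds."""
    return beat * 60.0 / tempo_bpm

# Toy 'score': events placed in musical time as (beat, MIDI pitch) pairs.
score = [(0.0, 60), (1.0, 64), (2.0, 67), (3.0, 72)]
tempo = 120.0  # halving the tempo doubles every event's clock time

for beat, pitch in score:
    t = beats_to_seconds(beat, tempo)
    print(f"schedule MIDI note {pitch} at {t:.2f}s")
```

Keeping the score in beats is what lets a library change tempo without rewriting the arrangement; the conversion to seconds happens only at scheduling time.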

Relevance: 80.00%

Abstract:

An issue on generative music in Contemporary Music Review allows space to explore many of these controversies, and to survey the rich algorithmic scene in contemporary practice as well as the diverse origins and manifestations of such a culture. A roster of interesting exponents from both academic and arts-practice backgrounds is involved, matching the broad spectrum of current work. Contributed articles range from generative algorithms in live systems (from live coding to interactive music systems to computer games), through algorithmic modelling of longer-term form and evolutionary algorithms, to interfaces between modalities and mediums in algorithmic choreography. A retrospective on the intensive experimentation in algorithmic music and sound synthesis at the Institute of Sonology in the 1960s and 70s creates a complementary strand, as does an open paper on the issues raised by open-source, as opposed to proprietary, software and operating systems, with their consequences for the creation and archiving of algorithmic work.