974 results for Open source.


Relevance:

60.00%

Abstract:

This article evaluates two policy initiatives by the United States Government to address access to essential medicines -- Priority Review vouchers and “Patents for Humanity”. Such proposals are aimed at speeding up the regulatory review of inventions with humanitarian uses and applications by the United States Food and Drug Administration and the United States Patent and Trademark Office. It is argued that such measures fall short of international standards and norms established by the World Intellectual Property Organization Development Agenda 2007; the World Trade Organization’s Doha Declaration on the TRIPS Agreement and Public Health 2001 and the WTO General Council Decision of August 30, 2003; and the World Health Organization’s declarations on intellectual property and public health. This article concludes that there is a need for broader patent law reform in the United States to address matters of patent law and public health. Moreover, there is a need to experiment with other, more promising alternative models of research and development -- such as medical innovation prizes, a Health Impact Fund, the Medicines Patent Pool, and Open Source Drug Discovery.

Relevance:

60.00%

Abstract:

This article considers the challenges posed to intellectual property law by the emerging field of bioinformatics. It examines the intellectual property strategies of established biotechnology companies, such as Celera Genomics, and information technology firms entering into the marketplace, such as IBM. First, this paper argues that copyright law is not irrelevant to biotechnology, as some commentators would suggest. It claims that the use of copyright law and contract law is fundamental to the protection of biomedical and genomic databases. Second, this article questions whether biotechnology companies are exclusively interested in patenting genes and genetic sequences. Recent evidence suggests that biotechnology companies and IT firms are patenting bioinformatics software and Internet business methods, as well as underlying instrumentation such as microarrays and genechips. Finally, this paper evaluates what impact the privatisation of bioinformatics will have on public research and scientific communication. It raises important questions about integration, interoperability, and the risks of monopoly. It then considers whether open source software such as the Ensembl Project and peer-to-peer technology like DSAS will be able to counter this trend of privatisation.

Relevance:

60.00%

Abstract:

The 3D Water Chemistry Atlas is an intuitive, open source, Web-based system that enables the three-dimensional (3D) sub-surface visualization of ground water monitoring data, overlaid on the local geological model (formation and aquifer strata). This paper first describes the results of evaluating existing virtual globe technologies, which led to the decision to use the Cesium open source WebGL Virtual Globe and Map Engine as the underlying platform. Next, it describes the backend database and the search, filtering, browse and analysis tools that were developed to enable users to interactively explore the groundwater monitoring data and interpret it spatially and temporally relative to the local geological formations and aquifers via the Cesium interface. The result is an integrated 3D visualization system that enables environmental managers and regulators to assess groundwater conditions, identify inconsistencies in the data, manage impacts and risks, and make more informed decisions about coal seam gas extraction, waste water extraction, and water reuse.
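To make the hand-off to the globe concrete: Cesium consumes CZML, a JSON packet format for styled entities. The sketch below is a minimal, hypothetical Python example of generating a CZML file of sub-surface monitoring points; it is not the Atlas's actual backend or schema, and the bore identifiers, coordinates and conductivity threshold are invented.

```python
import json

# Hypothetical monitoring-bore readings: (id, lon, lat, depth_m, EC_uS_per_cm).
# Values are illustrative only, not data from the 3D Water Chemistry Atlas.
bores = [
    ("bore-001", 150.512, -26.741, -120.0, 850.0),
    ("bore-002", 150.498, -26.755, -310.0, 2400.0),
]

# A CZML document is a JSON array whose first packet describes the document itself.
packets = [{"id": "document", "name": "groundwater-sample", "version": "1.0"}]

for bore_id, lon, lat, depth, ec in bores:
    packets.append({
        "id": bore_id,
        "name": f"{bore_id} (EC {ec} uS/cm)",
        # Cesium positions are [longitude, latitude, height]; a negative height
        # places the point below the ellipsoid to suggest sub-surface depth.
        "position": {"cartographicDegrees": [lon, lat, depth]},
        "point": {
            "pixelSize": 12,
            # Crude colour ramp: saltier samples in red, fresher samples in blue.
            "color": {"rgba": [255, 0, 0, 255] if ec > 1500 else [0, 0, 255, 255]},
        },
        "description": f"Depth {depth} m, EC {ec} uS/cm",
    })

with open("groundwater.czml", "w") as f:
    json.dump(packets, f, indent=2)
```

A file produced this way can be loaded into a Cesium viewer with CzmlDataSource.load("groundwater.czml"), which is broadly the kind of hand-off the Atlas's backend performs at much larger scale.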

Relevance:

60.00%

Abstract:

This paper demonstrates the integration and use of the Process Query Language (PQL), a special-purpose language for querying large collections of process models based on their behavior, within the Apromore open-source process model repository. The resulting environment provides a unique user experience when carrying out process model querying tasks. The tool is useful for researchers and practitioners working with large process model collections, and specifically for those with an interest in model retrieval tasks as part of process compliance, process redesign and process standardization initiatives.
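Neither PQL's syntax nor Apromore's query interface is reproduced in this abstract, so the toy Python sketch below only illustrates what behavior-based querying means in principle: each model is (simplistically) represented by the set of traces it can produce, and a query asks for models satisfying simple behavioral predicates. The model names, tasks and predicates are invented; this is not PQL or the Apromore API.

```python
# Toy illustration of behavioural querying over process models, each model
# represented (simplistically) by the set of traces it can produce.
# This is NOT PQL or the Apromore API; names and predicates are illustrative.

models = {
    "claim-handling-v1": [
        ("register", "check", "approve", "pay"),
        ("register", "check", "reject"),
    ],
    "claim-handling-v2": [
        ("register", "approve", "pay"),
    ],
}

def can_occur(traces, task):
    """Task appears in at least one trace of the model."""
    return any(task in t for t in traces)

def always_cooccur(traces, a, b):
    """In every trace, task a occurs if and only if task b occurs."""
    return all((a in t) == (b in t) for t in traces)

# "Find models where 'check' can occur and 'approve' always co-occurs with 'pay'."
hits = [
    name for name, traces in models.items()
    if can_occur(traces, "check") and always_cooccur(traces, "approve", "pay")
]
print(hits)  # ['claim-handling-v1']
```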

Relevance:

60.00%

Abstract:

This portrait of the global debate over patent law and access to essential medicines focuses on public health concerns about HIV/AIDS, malaria, tuberculosis, the SARS virus, influenza, and diseases of poverty. The essays explore the diplomatic negotiations and disputes in key international fora, such as the World Trade Organization, the World Health Organization and the World Intellectual Property Organization. Drawing upon international trade law, innovation policy, intellectual property law, health law, human rights and philosophy, the authors seek to canvass policy solutions which encourage and reward worthwhile pharmaceutical innovation while ensuring affordable access to advanced medicines. A number of creative policy options are critically assessed, including the development of a Health Impact Fund, prizes for medical innovation, the use of patent pools, open-source drug development and forms of 'creative capitalism'.

Relevance:

60.00%

Abstract:

In his 1987 book, The Media Lab: Inventing the Future at MIT, Stewart Brand provides an insight into the visions of the future of the media in the 1970s and 1980s. He notes that Nicholas Negroponte made a compelling case for the foundation of a media laboratory at MIT with diagrams detailing the convergence of three sectors of the media—the broadcast and motion picture industry; the print and publishing industry; and the computer industry. Brand commented: ‘If Negroponte was right and communications technologies really are converging, you would look for signs that technological homogenisation was dissolving old boundaries out of existence, and you would expect an explosion of new media where those boundaries used to be’. Two decades later, technology developers, media analysts and lawyers have become excited about the latest phase of media convergence. In 2006, the faddish Time Magazine heralded the arrival of various Web 2.0 social networking services: ‘You can learn more about how Americans live just by looking at the backgrounds of YouTube videos—those rumpled bedrooms and toy‐strewn basement rec rooms—than you could from 1,000 hours of network television. And we didn’t just watch, we also worked. Like crazy. We made Facebook profiles and Second Life avatars and reviewed books at Amazon and recorded podcasts. We blogged about our candidates losing and wrote songs about getting dumped. We camcordered bombing runs and built open-source software. America loves its solitary geniuses—its Einsteins, its Edisons, its Jobses—but those lonely dreamers may have to learn to play with others. Car companies are running open design contests. Reuters is carrying blog postings alongside its regular news feed. Microsoft is working overtime to fend off user‐created Linux. We’re looking at an explosion of productivity and innovation, and it’s just getting started, as millions of minds that would otherwise have drowned in obscurity get backhauled into the global intellectual economy.’ The magazine announced that Time’s Person of the Year was ‘You’, the everyman and everywoman consumer ‘for seizing the reins of the global media, for founding and framing the new digital democracy, for working for nothing and beating the pros at their own game’. This review essay considers three recent books that have explored the legal dimensions of new media. In contrast to the unbridled exuberance of Time Magazine, this series of legal works displays an anxious trepidation about the legal ramifications associated with the rise of social networking services. In his tour de force, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet, Daniel Solove considers the implications of social networking services, such as Facebook and YouTube, for the legal protection of reputation under privacy law and defamation law. Andrew Kenyon’s edited collection, TV Futures: Digital Television Policy in Australia, explores the intersection between media law and copyright law in the regulation of digital television and Internet videos. In The Future of the Internet and How to Stop It, Jonathan Zittrain explores the impact of ‘generative’ technologies and ‘tethered applications’—considering everything from the Apple Mac and the iPhone to the One Laptop per Child programme.

Relevance:

60.00%

Abstract:

This chapter considers the legal ramifications of Wikipedia and other online media, such as the Encyclopedia of Life. Nathaniel Tkacz (2007) has observed: 'Wikipedia is an ideal entry-point from which to approach the shifting character of knowledge in contemporary society.' He adds: 'Scholarship on Wikipedia from computer science, history, philosophy, pedagogy and media studies has moved beyond speculation regarding its considerable potential, to the task of interpreting - and potentially intervening in - the significance of Wikipedia's impact' (Tkacz 2007). After an introduction, Part II considers the evolution and development of Wikipedia, and the legal troubles that have attended it. It also considers the establishment of rival online encyclopedias - such as Citizendium, set up by Larry Sanger, the co-founder of Wikipedia, and Knol, the mysterious new project of Google. Part III explores the use of mass, collaborative authorship in the field of science. In particular, it looks at the development of the Encyclopedia of Life, which seeks to document the world's biodiversity. This chapter expresses concern that Wiki-based software had to develop in a largely hostile legal environment. It contends that copyright law and related fields of intellectual property need to be reformed to better accommodate users of copyright material (Rimmer 2007). This chapter makes a number of recommendations. First, there is a need to acknowledge and recognize forms of mass, collaborative production and consumption - not just individual authorship. Second, the view of a copyright 'work' and other subject matter as a complete and closed piece of cultural production should also be reconceptualised. Third, the defense of fair use should be expanded to accommodate a wide range of amateur, peer-to-peer production activities - not only in the United States, but in other jurisdictions as well. Fourth, the safe harbor protections accorded to Internet intermediaries, such as Wikipedia, should be strengthened. Fifth, there should be a defense in respect of the use of 'orphan works' - especially in cases of large-scale digitization. Sixth, the innovations of open source licensing should be expressly incorporated and entrenched within the formal framework of copyright laws. Finally, courts should craft judicial remedies to take into account concerns about political censorship and freedom of speech.

Relevance:

60.00%

Abstract:

Digital technology offers enormous benefits (economic, quality of design and efficiency in use) if adopted to implement integrated ways of representing the physical world in a digital form. When applied across the full extent of the built and natural world, it is referred to as the Digital Built Environment (DBE) and encompasses a wide range of approaches and technology initiatives, all aimed at the same end goal: the development of a virtual world that sufficiently mirrors the real world to form the basis for the smart cities of the present and future, enable efficient infrastructure design and programmed maintenance, and create a new foundation for economic growth and social well-being through evidence-based analysis. The creation of a National Data Policy for the DBE will facilitate the development of additional high-technology industries in Australia; provide Governments, industries and citizens with greater knowledge of the environments they occupy and plan; and offer citizen-driven innovations for the future. Australia has slipped behind other nations in the adoption and execution of Building Information Modelling (BIM), and the principal concern is that the gap is widening. Data-driven innovation added $67 billion to the Australian economy in 2013. Strong open data policy equates to $16 billion in new value. Australian Government initiatives such as the Digital Earth-inspired "National Map" offer a platform and pathway to embrace the concept of a "BIM Globe", while also leveraging unprecedented growth in open source / open data collaboration. Australia must address the challenges by learning from international experience—most notably in the UK and NZ—mandating the use of BIM across Government, extending the Foundation Spatial Data Framework to include the Built Environment as a theme, and engaging collaboration through a "BIM Globe" metaphor. This proposed DBE strategy will modernise Australian urban planning and the construction industry. It will change the way we develop our cities by fundamentally altering the dynamics and behaviours of the supply chains and unlocking new and more efficient ways of collaborating at all stages of the project life-cycle. There are currently two major modelling approaches that contribute to the challenge of delivering the DBE. Though these collectively encompass many (often competing) approaches or proprietary software systems, all can be categorised as either: a spatial modelling approach, where the focus is generally on representing the elements that make up the world within their geographic context; or a construction modelling approach, where the focus is on models that support the life cycle management of the built environment. These two approaches have tended to evolve independently, addressing two broad industry sectors: the one concerned with understanding and managing global and regional aspects of the world that we inhabit, including disciplines concerned with climate, earth sciences, land ownership, urban and regional planning and infrastructure management; the other concerned with the planning, design, construction and operation of built facilities, including architectural and engineering design, product manufacturing, construction, facility management and related disciplines (a process/technology commonly known as Building Information Modelling, BIM).
The spatial industries have a strong voice in the development of public policy in Australia, while the construction sector, which in 2014 accounted for around 8.5% of Australia’s GDP, has no single voice and, because of its diversity, is struggling to adapt to and take advantage of the opportunity presented by these digital technologies. The experience in the UK over the past few years has demonstrated that government leadership is very effective in stimulating industry adoption of digital technologies: on the one hand mandating the use of BIM on public procurement projects, while at the same time providing comparatively modest funding to address the common issues that confront the industry in adopting that way of working across the supply chain. The reported result has been savings of £840m in construction costs in 2013/14, according to UK Cabinet Office figures. There is worldwide recognition of the value of bringing these two modelling technologies together. Australia has the expertise to exercise leadership in this work, but it requires a commitment by government to recognise the importance of BIM as a companion methodology to the spatial technologies, so that these two disciplinary domains can cooperate in the development of data policies and information exchange standards to smooth out common workflows. buildingSMART Australasia, SIBA and their academic partners have initiated this dialogue in Australia and wish to work collaboratively, with government support and leadership, to explore the opportunities open to us as we develop an Australasian Digital Built Environment. As part of that programme, we must develop and implement a strategy to accelerate the adoption of BIM processes across the Australian construction sector while, at the same time, developing an integrated approach in concert with the spatial sector that will position Australia at the forefront of international best practice in this area. Australia and New Zealand cannot afford to be on the back foot as we face the challenges of rapid urbanisation and change in the global environment. Although we can identify some exemplary initiatives in this area, particularly in New Zealand in response to the need for more resilient urban development in the face of earthquake threats, there is still much that needs to be done. We are well situated in the Asian region to take a lead in this challenge, but we are at imminent risk of losing the initiative if we do not take action now. Strategic collaboration between Governments, Industry and Academia will create new jobs and wealth, with the potential, for example, to save around 20% on the delivery costs of new built assets, based on recent UK estimates.
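One concrete, low-level face of the spatial/BIM integration discussed above is georeferencing: placing a building model, authored in a local engineering coordinate system, onto map coordinates so the two modelling worlds can line up. The Python sketch below is a simplified, hypothetical example (a flat-earth offset-and-rotation transform to a projected CRS); real workflows rely on survey control points and standardised map-conversion metadata, and all values here are invented.

```python
import math

# Hypothetical survey setup: the BIM model's local origin corresponds to this
# easting/northing in a projected CRS (e.g. a UTM/MGA zone), and the model's
# local X axis is rotated from grid east by 'rotation_deg'. Illustrative values.
origin_easting, origin_northing = 502_315.40, 6_964_820.75
rotation_deg = 12.5

def local_to_map(x, y):
    """Transform local BIM (x, y) metres into projected map coordinates."""
    theta = math.radians(rotation_deg)
    easting = origin_easting + x * math.cos(theta) - y * math.sin(theta)
    northing = origin_northing + x * math.sin(theta) + y * math.cos(theta)
    return easting, northing

# A column positioned at (10.0, 25.0) m in the model lands here on the map grid:
print(local_to_map(10.0, 25.0))
```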

Relevance:

60.00%

Abstract:

People get into healthcare because they want to help society. And when a new hospital is briefed, everyone tries to do their best, but the process is mired in the impossibility of the task. Stakeholders rarely understand the architectural process, nobody can predict the future, and the only thing for certain is that everything will change as the project unfolds, revealing errors in initial assumptions and calculations, shifts in needs, new technologies and so on. Yet there’s always pressure to keep to the programme and to press on regardless. This chaos eventually leads to suboptimal results: hospitals the world over are riddled with inefficiencies, idiosyncrasies, incredible wastage and features that lead to poor clinical outcomes. This talk will sketch out the basics of Scrum, the most popular open-source Lean/Agile methodology. It will discuss what healthcare designers can learn from the geeks in Silicon Valley to reduce risk, meet deadlines and deliver the highest possible value for the budget despite the uncertainty.

Relevance:

60.00%

Abstract:

Pacific Journalism Review has consistently honoured, to a high standard, its 1994 founding goal: to be a credible peer-reviewed journal in the Asia-Pacific region, probing developments in journalism and media, and supporting journalism education. Global, it considers new media and social movements; ‘regional’, it promotes vernacular media, human freedoms and sustainable development. To trace how the journal developed, this article researches its archive, noting authors, subject matter and themes. The article concludes that one answer is the journal’s collegiate approach; hundreds of academics, journalists and others have been invited to contribute. A second answer has been the dedication of its one principal editor, Professor David Robie, who has always somehow provided resources—at Port Moresby, Suva, and now Auckland—while maintaining a consistent editorial stance. Eclectic, not partisan, it has nevertheless been vigilant over rights, such as in monitoring the Fiji coups d’état. Watching through a media lens, it follows a ‘Pacific way’, handling hard information through understanding and consensus. It has 237 subscriptions and is indexed in seven databases. Open source, it receives more than 1,000 site visits weekly. With ‘clientele’ mostly in Australia, New Zealand and ‘Oceania’, it extends much further afield. From 1994 to 2014, 701 articles and reviews were published; the journal now publishes more than 24 scholarly articles each year.

Relevance:

60.00%

Abstract:

This thesis studies document signatures, which are small representations of documents and other objects that can be stored compactly and compared for similarity. This research finds that document signatures can be used effectively and efficiently both to search and to understand relationships between documents in large collections, and that the approach is scalable enough to search a billion documents in a fraction of a second. Deliverables arising from the research include an investigation of the representational capacity of document signatures, the publication of an open-source signature search platform and an approach for scaling signature retrieval to operate efficiently on collections containing hundreds of millions of documents.
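The abstract does not spell out how the signatures are built or compared, so the following is only a generic, hypothetical sketch of the idea: random-hyperplane bit signatures over toy document vectors, compared by Hamming distance. The dimensionality, bit width and documents are invented, and this is not the thesis's own signature scheme or search platform.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy document vectors (e.g. term weights); real systems use far higher
# dimensionality. Values are illustrative only.
docs = {
    "doc_a": rng.normal(size=64),
    "doc_b": rng.normal(size=64),
}
docs["doc_c"] = docs["doc_a"] + 0.1 * rng.normal(size=64)  # near-duplicate of doc_a

n_bits = 256
planes = rng.normal(size=(n_bits, 64))   # random hyperplanes shared by all documents

def signature(vec):
    """Random-hyperplane signature: one bit per hyperplane, packed into bytes."""
    return np.packbits(planes @ vec > 0)

def hamming(sig1, sig2):
    """Number of differing bits between two packed signatures."""
    return int(np.unpackbits(sig1 ^ sig2).sum())

sigs = {name: signature(vec) for name, vec in docs.items()}
query = sigs["doc_a"]
ranking = sorted(docs, key=lambda name: hamming(query, sigs[name]))
print(ranking)  # doc_a first, then its near-duplicate doc_c, then doc_b
```

Because each signature is just a few hundred bits and comparison is a bitwise XOR plus a popcount, this style of representation is what makes billion-document collections tractable to scan quickly.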

Relevance:

60.00%

Abstract:

In the education of the physical sciences, the role of the laboratory cannot be overemphasised. It is the laboratory exercises that enable the student to assimilate the theoretical basis, verify it through bench-top experiments, and internalize the subject discipline to acquire mastery of it. However, the resources essential to put together such an environment are substantial. As a result, students go through a curriculum that is wanting in this respect. This paper presents a low-cost alternative to impart such an experience to the student, aimed at the subject of switched-mode power conversion. The resources are based on an open source circuit simulator (Sequel) developed at IIT Mumbai, and inexpensive construction kits developed at IISc Bangalore. Sequel is a circuit simulation program for the Linux operating system, distributed free of charge. The construction kits developed at IISc Bangalore are fully documented so that anyone can assemble the circuits with minimal equipment such as a soldering iron, multimeter and power supply. This paper puts together a simple forward DC-DC converter as a vehicle to introduce programming in Sequel for evaluating the converter's transient performance and small-signal dynamic model. Bench tests on the assembled construction kit may then be carried out by the student to study its operation, transient performance, closed-loop stability margins and so on.
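Sequel netlists are not reproduced in the abstract, so as a rough stand-in the Python sketch below integrates an idealised averaged model of a forward (buck-derived) converter's output filter, the kind of open-loop transient start-up response the paper has students examine in simulation and on the bench. The component values, turns ratio and duty ratio are illustrative assumptions, not the kit's actual design.

```python
import numpy as np

# Illustrative component values for an averaged forward (buck-derived) converter.
Vin = 48.0      # input voltage, V
n   = 0.5       # transformer turns ratio Ns/Np
D   = 0.45      # fixed duty ratio (open loop)
L   = 100e-6    # output filter inductance, H
C   = 220e-6    # output filter capacitance, F
R   = 5.0       # load resistance, ohm

dt, t_end = 1e-6, 20e-3
steps = int(t_end / dt)

iL = np.zeros(steps)   # inductor current, A
vC = np.zeros(steps)   # output (capacitor) voltage, V

# Forward-Euler integration of the averaged state equations:
#   L * diL/dt = D*n*Vin - vC
#   C * dvC/dt = iL - vC/R
for k in range(steps - 1):
    iL[k + 1] = iL[k] + dt * (D * n * Vin - vC[k]) / L
    vC[k + 1] = vC[k] + dt * (iL[k] - vC[k] / R) / C

print(f"output after {t_end * 1e3:.0f} ms: {vC[-1]:.2f} V "
      f"(ideal average D*n*Vin = {D * n * Vin:.2f} V)")
```

Plotting iL and vC over time shows the start-up ringing of the L-C filter settling towards the ideal average output, which is the qualitative behaviour a circuit-level Sequel simulation and the bench kit would exhibit.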

Relevance:

60.00%

Abstract:

We investigate methods for data-based selection of working covariance models in the analysis of correlated data with generalized estimating equations. We study two selection criteria: Gaussian pseudolikelihood and a geodesic distance based on discrepancy between model-sensitive and model-robust regression parameter covariance estimators. The Gaussian pseudolikelihood is found in simulation to be reasonably sensitive for several response distributions and noncanonical mean-variance relations for longitudinal data. Application is also made to a clinical dataset. Assessment of adequacy of both correlation and variance models for longitudinal data should be routine in applications, and we describe open-source software supporting this practice.
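The abstract names the Gaussian pseudolikelihood as one of the two selection criteria but naturally shows no code. The sketch below is a from-scratch toy illustration, not the authors' software: it simulates exchangeable longitudinal data, fits the mean by ordinary least squares as a stand-in for the GEE fit, plugs in moment estimates of the association parameters, and compares independence, exchangeable and AR(1) working structures by the Gaussian pseudolikelihood computed from cluster residuals.

```python
import numpy as np

rng = np.random.default_rng(1)
n_clusters, n_times = 200, 4

# Simulate longitudinal data with an exchangeable (compound-symmetric) covariance,
# so the exchangeable working model should win. All values are illustrative.
x = rng.normal(size=(n_clusters, n_times))
cluster_effect = rng.normal(scale=1.0, size=(n_clusters, 1))
y = 1.0 + 2.0 * x + cluster_effect + rng.normal(scale=1.0, size=(n_clusters, n_times))

# Mean model fit by OLS (a stand-in for the GEE mean estimates).
X = np.column_stack([np.ones(x.size), x.ravel()])
beta, *_ = np.linalg.lstsq(X, y.ravel(), rcond=None)
resid = (y.ravel() - X @ beta).reshape(n_clusters, n_times)
sigma2 = resid.var()

def corr_matrix(structure, alpha, t=n_times):
    i, j = np.indices((t, t))
    if structure == "independence":
        return np.eye(t)
    if structure == "exchangeable":
        return np.where(i == j, 1.0, alpha)
    if structure == "ar1":
        return alpha ** np.abs(i - j)

def gaussian_pseudolikelihood(R):
    """Sum over clusters of -0.5 * (log det V + r' V^{-1} r) with V = sigma2 * R."""
    V = sigma2 * R
    _, logdet = np.linalg.slogdet(V)
    Vinv = np.linalg.inv(V)
    quad = np.einsum("it,ts,is->", resid, Vinv, resid)
    return -0.5 * (n_clusters * logdet + quad)

# Moment estimates of the association parameter for each candidate structure.
pairwise = np.corrcoef(resid.T)
alpha_exch = pairwise[np.triu_indices(n_times, k=1)].mean()
alpha_ar1 = np.mean([pairwise[t, t + 1] for t in range(n_times - 1)])

candidates = {
    "independence": corr_matrix("independence", 0.0),
    "exchangeable": corr_matrix("exchangeable", alpha_exch),
    "ar1": corr_matrix("ar1", alpha_ar1),
}
scores = {name: gaussian_pseudolikelihood(R) for name, R in candidates.items()}
print(max(scores, key=scores.get), scores)
```

With this data-generating design the exchangeable structure should score highest, which mirrors the role the criterion plays in selecting a working covariance model.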

Relevance:

60.00%

Abstract:

This talk gives an overview of the project "Uncanny Nature", which incorporates a style of animation called Hybrid Stop Motion that combines physical object armatures with virtual copies. The development of the production pipeline (using a mix of Blender, Dragonframe, Photoscan and Arduino) is discussed, as well as the way that Blender was used throughout the production to visualise, model, animate and composite the elements together.
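The project's actual Blender setup is not described in this abstract; the snippet below is only a small, hypothetical sketch of one Hybrid-Stop-Motion-flavoured idea in Blender's Python API (bpy): keyframing a virtual copy "on twos" with constant interpolation so it reads with a stepped, stop-motion feel. The object name, frame range and turntable pose are assumptions.

```python
# Run inside Blender's scripting workspace. Assumes a scene object named
# "ScannedPuppet" (e.g. a photogrammetry copy of the physical armature) exists.
import math
import bpy

obj = bpy.data.objects["ScannedPuppet"]   # hypothetical object name
start, end, step = 1, 48, 2               # keyframe "on twos", like stop motion

for frame in range(start, end + 1, step):
    bpy.context.scene.frame_set(frame)
    # Pose the virtual copy for this frame; a simple turntable rotation stands
    # in for poses matched to the physical armature's captured positions.
    obj.rotation_euler[2] = math.radians(frame * 3.0)
    obj.keyframe_insert(data_path="rotation_euler", frame=frame)

# Constant interpolation holds each pose until the next key, giving the
# stepped, stop-motion feel instead of smooth in-betweens.
for fcurve in obj.animation_data.action.fcurves:
    for key in fcurve.keyframe_points:
        key.interpolation = 'CONSTANT'
```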

Relevance:

60.00%

Abstract:

Place identification refers to the process of analyzing sensor data in order to detect places, i.e., spatial areas that are linked with activities and associated with meanings. Place information can be used, e.g., to provide awareness cues in applications that support social interactions, to provide personalized and location-sensitive information to the user, and to support mobile user studies by providing cues about the situations the study participant has encountered. Regularities in human movement patterns make it possible to detect personally meaningful places by analyzing location traces of a user. This thesis focuses on providing system-level support for place identification, as well as on algorithmic issues related to the place identification process. The move from location to place requires interactions between location-sensing technologies (e.g., GPS or GSM positioning), algorithms that identify places from location data, and applications and services that utilize place information. These interactions can be facilitated using a mobile platform, i.e., an application or framework that runs on a mobile phone. For the purposes of this thesis, mobile platforms automate data capture and processing and provide means for disseminating data to applications and other system components. The first contribution of the thesis is BeTelGeuse, a freely available, open-source mobile platform that supports multiple runtime environments. The actual place identification process can be understood as a data analysis task where the goal is to analyze (location) measurements and to identify areas that are meaningful to the user. The second contribution of the thesis is the Dirichlet Process Clustering (DPCluster) algorithm, a novel place identification algorithm. The performance of the DPCluster algorithm is evaluated using twelve different datasets that have been collected by different users, at different locations and over different periods of time. As part of the evaluation, we compare the DPCluster algorithm against other state-of-the-art place identification algorithms. The results indicate that the DPCluster algorithm provides improved generalization performance under spatial and temporal variations in location measurements.
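The DPCluster algorithm itself is not specified in this abstract; as a loose illustration of the Dirichlet-process idea behind it, namely letting the data decide how many "places" a location trace contains, the sketch below clusters synthetic (latitude, longitude) fixes with scikit-learn's truncated Dirichlet process mixture. The coordinates and parameters are invented, and this is not the thesis's algorithm or the BeTelGeuse platform.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(2)

# Synthetic location trace: noisy GPS fixes around three frequently visited
# spots ("home", "work", "gym") plus some scattered in-transit fixes.
places = np.array([[60.170, 24.940], [60.205, 24.655], [60.187, 24.820]])
fixes = np.vstack(
    [p + rng.normal(scale=0.0008, size=(150, 2)) for p in places]
    + [rng.uniform([60.15, 24.60], [60.22, 24.97], size=(30, 2))]
)

# Truncated Dirichlet process mixture: allow up to 10 components and let the
# DP prior switch off the ones it does not need.
dp = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    weight_concentration_prior=0.1,
    covariance_type="full",
    random_state=0,
).fit(fixes)

labels = dp.predict(fixes)
counts = np.bincount(labels, minlength=dp.n_components)

# Components that claim a meaningful share of the fixes are candidate "places".
for k, (w, c) in enumerate(zip(dp.weights_, counts)):
    if c > 0:
        print(f"component {k}: {c} fixes, weight {w:.2f}, "
              f"centre lat/lon ~ {dp.means_[k].round(4)}")
```

In this toy setup the mixture typically concentrates its weight on a handful of components centred on the three visited spots, leaving the in-transit noise spread thinly, which is the flavour of behaviour a DP-based place identification method exploits.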