941 results for Open Robot Project
Abstract:
Knowledge Management (KM) is a vital factor in successfully undertaking projects. The temporary nature of projects necessitates employing useful KM practices to tackle issues such as knowledge leakiness and rework. The Project Management Office (PMO) is a unit within organizations that facilitates and oversees organizational projects. Project Management Maturity Models (PMMMs) chart the development of PMOs from immature to mature levels. The existing PMMMs have focused on Project Management (PM) practices; however, the management of project knowledge at the various levels of maturity is yet to be addressed. This research project was undertaken to investigate that gap by addressing KM practices within the existing PMMMs. Due to the exploratory and inductive nature of this research, qualitative methods were chosen as the research methodology. In total, three cases were selected from different industries (research, mining and government organizations) to provide broad categories for the research, and the research questions were examined using the developed framework. This paper presents the partial findings of the investigation of the research organisation, which has the lowest level of maturity. The results show that knowledge creation and capturing are the most important processes, while knowledge transferring and reusing are not as important as the other two processes. In addition, it was revealed that provision of "knowledge about the client" and "project management knowledge" are the most important types of knowledge required at this level of maturity. In conclusion, the outcomes of this paper should provide valuable guidance, from a KM point of view, to PMOs at the lowest level of maturity.
Abstract:
This chapter analyses the copyright law framework needed to ensure open access to outputs of the Australian academic and research sector such as journal articles and theses. It overviews the new knowledge landscape, the principles of copyright law, the concept of open access to knowledge, the recently developed open content models of copyright licensing and the challenges faced in providing greater access to knowledge and research outputs.
Abstract:
The mining industry is highly suitable for the application of robotics and automation technology, since the work is arduous, dangerous and often repetitive. This paper discusses a robust sensing system developed to find and track the position of the hoist ropes of a dragline. Draglines are large 'walking cranes' used in open-pit coal mining to remove the material covering the coal seam. The rope sensing system uses two time-of-flight laser scanners, and the rope-finding algorithm uses a novel data association and tracking strategy based on pairing rope data.
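As a rough illustration of what a gated, pairing-based data association step can look like, the Python sketch below greedily matches rope candidates from two scanners in a common frame. The data format, the gating threshold and the choice to pair across scanners are assumptions made for illustration; the paper's actual finding and tracking algorithm is not reproduced here.

import numpy as np

# Purely hypothetical illustration of gated, pairing-based data association
# between rope candidates reported by two laser scanners; not the paper's
# algorithm. Each detection is an (x, y) position in a common machine frame.
GATE = 0.5  # metres; illustrative gating threshold

def pair_detections(scanner_a, scanner_b, gate=GATE):
    """Greedily pair detections from the two scanners that lie within `gate`."""
    a = np.asarray(scanner_a, dtype=float)
    b = np.asarray(scanner_b, dtype=float)
    pairs, used_b = [], set()
    for pa in a:
        dists = np.linalg.norm(b - pa, axis=1)   # distance to every scanner-B detection
        j = int(np.argmin(dists))
        if dists[j] < gate and j not in used_b:
            used_b.add(j)
            # Fuse the pair by averaging; a tracker would then filter this over time.
            pairs.append((pa + b[j]) / 2.0)
    return pairs

Whether the pairing is done across scanners, across successive scans, or between the two hoist ropes is not specified in the abstract, so the cross-scanner choice above is only one plausible reading.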
Abstract:
Draglines are extremely large machines that are widely used in open-cut coal mines for overburden stripping. Since 1994 we have been working toward the development of a computer control system capable of automatically driving a dragline for a large portion of its operating cycle. This has necessitated the development and experimental evaluation of sensor systems, machine models, closed-loop controllers, and an operator interface. This paper describes our steps toward this goal through scale-model and full-scale field experimentation.
Abstract:
The inspection of marine vessels is currently performed manually. Inspectors use tools (e.g. cameras and devices for non-destructive testing) to detect damaged areas, cracks, and corrosion in large cargo holds, tanks, and other parts of a ship. Due to the size and complex geometry of most ships, ship inspection is time-consuming and expensive. The EU-funded project INCASS develops concepts for a marine inspection robotic assistant system to improve and automate ship inspections. In this paper, we introduce our magnetic wall-climbing robot: the Marine Inspection Robotic Assistant (MIRA). This semi-autonomous, lightweight system is able to climb a vessel's steel frame to deliver online visual inspection data. In addition, we describe the design of the robot and its subsystems, as well as its hardware and software components.
Abstract:
On 19 June 2015, representatives from over 40 Australian research institutions gathered in Canberra to launch their Open Data Collections. The one-day event, hosted by the Australian National Data Service (ANDS), showcased to government and a range of national stakeholders the rich variety of data collections that have been generated through the Major Open Data Collections (MODC) project. Colin Eustace attended the showcase for QUT Library and presented a poster that reflected the work that he and Jodie Vaughan generated through the project. QUT's Blueprint 4, the University's five-year institutional strategic plan, outlines the key priorities of developing a commitment to working in partnership with industry, as well as combining disciplinary strengths with interdisciplinary application. The Division of Technology, Information and Learning Support (TILS) has undertaken a number of ANDS-funded projects since 2009 with the aim of developing improved research data management services within the University to support these strategic aims. By leveraging existing tools and systems developed during these projects, the MODC project delivered support to multi-disciplinary collaborative research activities through partnership building between QUT researchers and Queensland government agencies, in order to add to and promote the discovery and reuse of a collection of spatially referenced datasets. The MODC project built upon existing Research Data Finder infrastructure (which uses VIVO open source software, developed by Cornell University) to develop a separate collection, Spatial Data Finder (https://researchdatafinder.qut.edu.au/spatial), as the interface to display the spatial data collection. During the course of the project, 62 dataset descriptions were added to Spatial Data Finder, seven to Research Data Finder and two to Software Finder, another separate collection. The project team met with 116 individual researchers and attended 13 school and faculty meetings to promote the MODC project and raise awareness of the Library's services and resources for research data management.
Abstract:
Particle swarm optimization (PSO), a relatively new population-based algorithm, has recently been applied to multi-robot systems. Although the algorithm has been used to solve many optimization problems as well as multi-robot problems, it has some drawbacks when applied to multi-robot search for a target in a space containing large static obstacles. One of these defects is premature convergence: a property of basic PSO is that, although particles are initially spread across the search space, as time increases they tend to converge on a small area. This shortcoming is also evident in a multi-robot search system, particularly when large static obstacles prevent the robots from finding the target easily; as time increases, the robots converge on a small area that may not contain the target and become entrapped in that area. Another shortcoming is that basic PSO cannot guarantee global convergence: particles initially explore different areas, but in some cases they are not good at exploiting promising areas, which increases the search time. This study proposes a PSO-based method for a multi-robot system to find a target in a search space containing large static obstacles. The method not only overcomes the premature convergence problem but also establishes an efficient balance between exploration and exploitation and guarantees global convergence, reducing the search time by combining PSO with a local search method such as A*. To validate the effectiveness and usefulness of the algorithms, a simulation environment was developed for conducting simulation-based experiments in different scenarios and for reporting experimental results. These results demonstrate that the proposed method overcomes the premature convergence problem and guarantees global convergence.
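To make the baseline concrete, the following Python sketch shows a standard PSO velocity-and-position update for a team of robots, with a naive obstacle check. The parameter values and the in_obstacle and fitness helpers are assumptions made for illustration; the paper's contribution lies in augmenting this basic update (for example with an A*-style local search) to avoid premature convergence, and that augmentation is not reproduced here.

import numpy as np

# Illustrative basic PSO step for n robots searching a 2-D area.
# W, C1, C2 and the obstacle handling are assumed values for this sketch,
# not parameters taken from the paper.
W, C1, C2 = 0.7, 1.5, 1.5

def pso_step(pos, vel, pbest, gbest, fitness, in_obstacle):
    """One PSO update. pos, vel, pbest are (n, 2) arrays; gbest is (2,).
    fitness maps an (n, 2) array of positions to an (n,) array of costs
    (lower is better); in_obstacle(p) tests a single (2,) position."""
    r1, r2 = np.random.rand(*pos.shape), np.random.rand(*pos.shape)
    vel = W * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
    new_pos = pos + vel
    # Naive obstacle handling: a robot that would enter an obstacle stays put.
    # (A hybrid scheme would instead plan a detour, e.g. with A*.)
    blocked = np.array([in_obstacle(p) for p in new_pos])
    new_pos[blocked] = pos[blocked]
    # Update personal and global bests where the sensed cost improved.
    improved = fitness(new_pos) < fitness(pbest)
    pbest = np.where(improved[:, None], new_pos, pbest)
    gbest = pbest[np.argmin(fitness(pbest))]
    return new_pos, vel, pbest, gbest

Premature convergence shows up in this loop when pbest and gbest all cluster behind a large obstacle: every subsequent velocity then points into the same region, which is the behaviour the proposed hybrid method is designed to break.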
Abstract:
This essay provides a critical assessment of the Fair Use Project based at the Stanford Center for Internet and Society. In evaluating the efficacy of the Fair Use Project, it is worthwhile considering the litigation that the group has been involved in and evaluating its performance. Part 1 outlines the history of the Stanford Center for Internet and Society, and the aims and objectives of the Fair Use Project. Part 2 considers the litigation in Shloss v. Sweeney over a biography concerning Lucia Joyce, the daughter of the avant-garde literary great, James Joyce. Part 3 examines the dispute over the Harry Potter Lexicon. Part 4 looks at the controversy over the Shepard Fairey poster of President Barack Obama, and the resulting debate with the Associated Press. Part 5 considers the intervention of the Fair Use Project as an amicus curiae in the 'Column case'. Part 6 explores the participation of the Fair Use Project as an amicus curiae in the litigation over 60 Years Later, an unauthorised literary sequel to J.D. Salinger's The Catcher in the Rye. Part 7 investigates the role of the Fair Use Project in disputes over copyright law and musical works. Part 8 investigates its role as an advocate in disputes over copyright law, fair use, documentary films, and internet videos. The conclusion makes three main arguments. First, it contends that Australia should establish a Fair Use Project to support creative artists in litigation over copyright exceptions. Second, it maintains that Australia should adopt a flexible, open-ended defence of fair use, and draw upon the rich jurisprudence in the United States on the fair use doctrine. Finally, this paper argues that support should be given at an international level to the proposal for a Treaty on Access to Knowledge.
Abstract:
This thesis addresses audience engagement challenges during professional mainstream ballet and contemporary dance company performances by examining spectator-dancer relationships. Focusing on the open rehearsal as an audience engagement tool, this project presents a new line of enquiry in dance reception studies. The findings signify that open rehearsal attendance can contribute to more meaningful and enjoyable performance experiences for audience members by opening up the possibility of different relationships with dancers.
Abstract:
Networked digital technologies and Open Access (OA) are transforming the processes and institutions of research, knowledge creation and dissemination globally: enabling new forms of collaboration, allowing researchers to be seen and heard in new ways and reshaping relationships between stakeholders across the global academic publishing system. This article draws on Joseph Nye’s concept of ‘Soft Power’ to explore the role that OA is playing in helping to reshape academic publishing in China. It focusses on two important areas of OA development: OA journals and national-level repositories. OA is being supported at the highest levels, and there is potential for it to play an important role in increasing the status and impact of Chinese scholarship. Investments in OA also have the potential to help China to re-position itself within international copyright discourses: moving beyond criticism for failure to enforce the rights of foreign copyright owners and progressing an agenda that places greater emphasis on equality of access to the resources needed to foster innovation. However, the potential for OA to help China to build and project its soft power is being limited by the legacies of the print era, as well as the challenges of efficiently governing the national research and innovation systems.
Abstract:
Open biomass burning from wildfires and the prescribed burning of forests and farmland is a frequent occurrence in South-East Queensland (SEQ), Australia. This work reports on data collected from 10-30 September 2011, covering the days before (10-14 September), during (15-20 September) and after (21-30 September) a period of biomass burning in SEQ. The aim of this project was to comprehensively quantify the impact of the biomass burning on air quality in Brisbane, the capital city of Queensland. A multi-parameter field measurement campaign was conducted and ambient air quality data from 13 monitoring stations across SEQ were analysed. During the burning period, the average concentrations of all measured pollutants except total xylenes increased (by 20% to 430%) compared to the non-burning period (both before and after the burning). The average concentrations of O3, NO2, SO2, benzene, formaldehyde, PM10, PM2.5 and visibility-reducing particles reached their highest levels for the year, up to 10 times higher than annual average levels, while PM10, PM2.5 and SO2 concentrations exceeded the WHO 24-hour guidelines and the O3 concentration exceeded the WHO maximum 8-hour average threshold during the burning period. Overall spatial variations showed that all measured pollutants, with the exception of O3, were closer to spatial homogeneity during the burning period than during the non-burning period. In addition, elevated concentrations of three biomass burning organic tracers (levoglucosan, mannosan and galactosan), together with the amount of non-refractory organic particles (PM1) and the average value of f60 (attributed to levoglucosan), reinforce the conclusion that the elevated pollutant concentrations were due to emissions from open biomass burning events, 70% of which were prescribed burns. This study, the first and most comprehensive of its kind in Australia, provides quantitative evidence of the significant impact of open biomass burning events, especially prescribed burning, on urban air quality. The results provide a solid platform for more detailed health and modelling investigations in the future.
Abstract:
The nucleotide sequence of cosmid B1790, carrying the Rif-Str regions of the Mycobacterium leprae chromosome, has been determined. Twelve open reading frames were identified in the 36,716 bp sequence, representing 40% of the coding capacity. Five ribosomal proteins, two elongation factors and the β and β′ subunits of RNA polymerase have been characterized, and two novel genes were found. One of these encodes a member of the so-called ABC family of ATP-binding proteins, while the other appears to encode an enzyme involved in repairing genomic lesions caused by free radicals. This finding may well be significant, as M. leprae, an intracellular pathogen, lives within macrophages.
Abstract:
It is being realized that the traditional closed-door, market-driven approach to drug discovery may not be the best-suited model for the diseases of the developing world, such as tuberculosis and malaria, because most patients suffering from these diseases have poor paying capacity. To ensure that new drugs are created for patients suffering from these diseases, it is necessary to formulate an alternative paradigm for the drug discovery process. The current model, constrained by limitations on collaboration and on the confidential sharing of resources, hampers opportunities to bring in expertise from diverse fields, and these limitations hinder the possibilities of lowering the cost of drug discovery. The Open Source Drug Discovery project, initiated by the Council of Scientific and Industrial Research, India, has adopted an open source model to power wide participation across geographical borders. Open Source Drug Discovery emphasizes integrative science through collaboration, open sharing, multi-faceted approaches and accruing benefits from advances on different fronts of new drug discovery. Because the open source model is based on community participation, it has the potential to sustain continuous development by generating a storehouse of alternatives for the continued pursuit of new drug discovery. Since the inventions are community generated, the new chemical entities developed by Open Source Drug Discovery will be taken up for clinical trials in a non-exclusive manner, with participation from multiple companies and majority funding from Open Source Drug Discovery. This will ensure the availability of drugs through a lower-cost, community-driven drug discovery process for diseases afflicting people with poor paying capacity. Hopefully, what Linux and the World Wide Web have done for information technology, Open Source Drug Discovery will do for drug discovery.
Abstract:
During the 18th annual SAIL meeting in 2008 at the Smithsonian Tropical Research Institute in Panama, Vielka Chang-Yau, librarian, mentioned the need to digitize and make available through the Aquatic Commons some of the early documents related to the U.S. biological survey of Panama from 1910 to 1912. With the assistance of SAIL, a regional marine librarians' group, a digital project was developed, and this select bibliography represents the sources used for the project. It will assist researchers and librarians in finding online open access documents written during the construction of the Panama Canal, specifically between 1910 and 1912. As the project progressed, other items covering the region and its biological diversity were discovered and included. The project team expects that the coverage will continue to expand over time.
Abstract:
Executive Summary: A number of studies have shown that mobile, bottom-contact fishing gear (such as otter trawls) can alter seafloor habitats and associated biota. Considerably less is known about the recovery of these resources following such disturbances, though this information is critical for successful management. In part, this paucity of information can be attributed to the lack of access to adequate control sites – areas of the seafloor that are closed to fishing activity. Recent closures along the coast of central California provide an excellent opportunity to track the recovery of historically trawled areas and to compare recovery rates to adjacent areas that continue to be trawled. In June 2006 we initiated a multi-year study of the recovery of seafloor microhabitats and associated benthic fauna inside and outside two new Essential Fish Habitat (EFH) closures within the Cordell Bank and Gulf of the Farallones National Marine Sanctuaries. Study sites inside the EFH closure at Cordell Bank were located in historically active areas of fishing effort, which had not been trawled since 2003. Sites outside the EFH closure in the Gulf of Farallones were located in an area that continues to be actively trawled. All sites were located in unconsolidated sands at equivalent water depths. Video and still photographic data collected via a remotely operated vehicle (ROV) were used to quantify the abundance, richness, and diversity of microhabitats and epifaunal macro-invertebrates at recovering and actively trawled sites, while bottom grabs and conductivity/temperature/depth (CTD) casts were used to quantify infaunal diversity and to characterize local environmental conditions. Analysis of still photos found differences in common seafloor microhabitats between the recovering and actively trawled areas, while analysis of videographic data indicated that biogenic mound and biogenic depression microhabitats were significantly less abundant at trawled sites. Each of these features provides structure with which demersal fishes, across a wide range of size classes, have been observed to associate. Epifaunal macro-invertebrates were sparsely distributed and occurred in low numbers in both treatments. However, their total abundance was significantly different between treatments, which was attributable to lower densities at trawled sites. In addition, the dominant taxa were different between the two sites. Patchily-distributed buried brittle stars dominated the recovering site, and sea whips (Halipteris cf. willemoesi) were most numerous at the trawled site though they occurred in only five of ten transects. Numerical classification (cluster analysis) of the infaunal samples also revealed a clear difference between benthic assemblages in the recovering vs. trawled areas due to differences in the relative abundances of component species. There were no major differences in infaunal species richness, H′ diversity, or J′ evenness between recovering vs. trawled site groups. However, total infaunal abundance showed a significant difference attributable to much lower densities at trawled sites. This pattern was driven largely by the small oweniid polychaete Myriochele gracilis, which was the most abundant species in the overall study region though significantly less abundant at trawled sites. Other taxa that were significantly less abundant at trawled sites included the polychaete M. olgae and the polychaete family Terebellidae. 
In contrast, the thyasirid bivalve Axinopsida serricata and the polychaetes Spiophanes spp. (mostly S. duplex), Prionospio spp., and Scoloplos armiger all had significantly or near-significantly higher abundances at trawled sites. As a result of such contrasting species patterns, there was also a significant difference in the overall dominance structure of infaunal assemblages between the two treatments. It is suggested that the observed biological patterns were the result of trawling impacts and varying levels of recovery due to the difference in trawling status between the two areas. The EFH closure was established in June 2006, within a month of when sampling was conducted for the present study; however, the stations within this closure area are at sites that have actually experienced little trawling since 2003, based on National Marine Fisheries Service trawl records. Thus, the three-year period would be sufficient time for some post-trawling changes to have occurred. Other results from this study (e.g., similarly moderate numbers of infaunal species in both areas that are lower than values recorded elsewhere in comparable habitats along the California continental shelf) also indicate that recovery within the closure area is not yet complete. Additional sampling is needed to evaluate subsequent recovery trends and the persistence of effects. Furthermore, to date, the study has been limited to unconsolidated substrates. Ultimately, the goal of this project is to characterize the recovery trajectories of a wide spectrum of seafloor habitats and communities and to link that recovery to the dynamics of exploited marine fishes.
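For reference, the diversity and evenness measures reported above are conventionally the Shannon-Wiener index and Pielou's evenness; their standard forms (the summary does not restate them, and the natural logarithm is assumed here) are

H' = -\sum_{i=1}^{S} p_i \ln p_i, \qquad J' = \frac{H'}{\ln S},

where S is the number of species in a sample and p_i is the proportion of individuals belonging to species i.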