841 results for Robotic benchmarks
Abstract:
The Advanced Very High Resolution Radiometer (AVHRR) carried on board the National Oceanic and Atmospheric Administration (NOAA) and the Meteorological Operational Satellite (MetOp) polar orbiting satellites is the only instrument offering more than 25 years of satellite data for analysing aerosols on a daily basis. The present study assessed a modified AVHRR aerosol optical depth (τa) retrieval over land for Europe. The algorithm could also be applied to other parts of the world with surface characteristics similar to Europe's; only the aerosol properties would have to be adapted to the new region. The initial approach used a relationship between Sun photometer measurements from the Aerosol Robotic Network (AERONET) and the satellite data to post-process the retrieved τa. Here, a quasi-stand-alone procedure, which is more suitable for the pre-AERONET era, is presented. In addition, the estimation of surface reflectance, the aerosol model, and other processing steps have been adapted. The method's cross-platform applicability was tested by validating τa from NOAA-17 and NOAA-18 AVHRR at 15 AERONET sites in Central Europe (40.5° N–50° N, 0° E–17° E) from August 2005 to December 2007. Furthermore, the accuracy of the AVHRR retrieval was compared with products from two newer instruments, the Medium Resolution Imaging Spectrometer (MERIS) on board the Environmental Satellite (ENVISAT) and the Moderate Resolution Imaging Spectroradiometer (MODIS) on board Aqua/Terra. In terms of the linear correlation coefficient R, the AVHRR results were similar to those of MERIS, with an even lower root mean square error (RMSE). Not surprisingly, MODIS, with its broad spectral coverage, gave the highest R and lowest RMSE. Regarding monthly averaged τa, the results were ambiguous. Focusing on small-scale structures, R was reduced for all sensors, whereas the RMSE increased substantially only for MERIS. For larger areas like Central Europe, the error statistics were similar to those of the individual match-ups, which is mainly explained by sampling issues. With the successful validation of AVHRR, we can now concentrate on our large data archive dating back to 1985. This is a unique opportunity for both climate and air pollution studies over land surfaces.
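The match-up statistics used throughout this validation, the linear correlation coefficient R and the RMSE, are straightforward to reproduce. A minimal sketch, assuming two collocated arrays of AVHRR and AERONET τa values (the variable names are illustrative, not from the study):

```python
import numpy as np

def matchup_stats(tau_avhrr, tau_aeronet):
    """Linear correlation coefficient R and root mean square error (RMSE)
    between collocated satellite and Sun-photometer aerosol optical depths."""
    tau_avhrr = np.asarray(tau_avhrr, dtype=float)
    tau_aeronet = np.asarray(tau_aeronet, dtype=float)
    r = np.corrcoef(tau_avhrr, tau_aeronet)[0, 1]
    rmse = np.sqrt(np.mean((tau_avhrr - tau_aeronet) ** 2))
    return r, rmse
```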
Abstract:
In 2011, researchers at Bucknell University and Illinois Wesleyan University compared the search efficacy of Serials Solutions Summon, EBSCO Discovery Service, Google Scholar, and conventional library databases. Using a mixed-methods approach, qualitative and quantitative data were gathered on students’ usage of these tools. Regardless of the search system, students exhibited a marked inability to effectively evaluate sources and a heavy reliance on default search settings. On the quantitative benchmarks measured by this study, the EBSCO Discovery Service tool outperformed the other search systems in almost every category. This article describes these results and makes recommendations for libraries considering these tools.
Abstract:
We have measured high-precision infrared parallaxes with the Canada-France-Hawaii Telescope for a large sample of candidate young (≈10-100 Myr) and intermediate-age (≈100-600 Myr) ultracool dwarfs, with spectral types ranging from M8 to T2.5. These objects are compelling benchmarks for substellar evolution and ultracool atmospheres at lower surface gravities (i.e., masses) than most of the field population. We find that the absolute magnitudes of our young sample can be systematically offset from ordinary (older) field dwarfs, with the young late-M objects being brighter and the young/dusty mid-L (L3-L6.5) objects being fainter, especially at J band. Thus, we conclude that the "underluminosity" of the young planetary-mass companions HR 8799b and 2MASS J1207-39b compared to field dwarfs is also manifested in young free-floating brown dwarfs, though the effect is not as extreme. At the same time, some young objects over the full spectral type range of our sample are similar to field objects, so a simple correspondence between youth and magnitude offset relative to the field population appears to be lacking. Comparing the kinematics of our sample to nearby stellar associations and moving groups, we identify several new moving group members, including the first free-floating L dwarf in the AB Dor moving group, 2MASS J0355+11. Altogether, the effects of surface gravity (age) and dust content on the magnitudes and colors of substellar objects appear to be degenerate.
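The absolute-magnitude comparison rests on the standard conversion from a trigonometric parallax to a distance. A minimal sketch (the example values are illustrative, not measurements from the paper):

```python
import math

def absolute_magnitude(apparent_mag, parallax_mas):
    """Absolute magnitude from an apparent magnitude and a trigonometric
    parallax in milliarcseconds: d [pc] = 1000 / parallax [mas]."""
    distance_pc = 1000.0 / parallax_mas
    return apparent_mag - 5.0 * math.log10(distance_pc / 10.0)

# A dwarf at J = 14.0 mag with a 100 mas parallax sits at 10 pc, so M_J = 14.0.
print(absolute_magnitude(14.0, 100.0))
```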
Abstract:
For the past 10 years, medical imaging techniques have been increasingly applied to forensic investigations. To obtain histological and toxicological information, tissue and liquid samples are required. In this article, we describe the development of a low-cost, secure, and reliable telematic add-on for remotely planning biopsies on the Virtobot robotic system. Data sets are encrypted and submitted over the Internet. A plugin for the OsiriX medical image viewer allows for remote planning of the needle trajectories that are used for needle placement. The application of teleradiological methods to image-guided biopsy in the forensic setting has the potential to reduce costs and, in conjunction with a mobile computed tomography scanner, allows for tissue sampling in mass casualty situations involving nuclear, biological, or chemical agents in a manner that minimizes the risk to the staff involved.
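The abstract states only that data sets are encrypted before submission over the Internet. As a hedged illustration of that step (not the Virtobot add-on's actual implementation; the file names and workflow are assumptions), symmetric encryption of an image volume with the Python cryptography package might look like this:

```python
from cryptography.fernet import Fernet

# Hypothetical workflow: encrypt a CT volume before uploading it for
# remote biopsy planning; the key travels over a separate secure channel.
key = Fernet.generate_key()
cipher = Fernet(key)

with open("ct_volume.dcm", "rb") as f:      # file name is illustrative
    token = cipher.encrypt(f.read())

with open("ct_volume.dcm.enc", "wb") as f:  # ciphertext ready for upload
    f.write(token)
```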
Abstract:
Localization information is of fundamental importance for carrying out various tasks in mobile robotics. The degree of precision required depends on the nature of the task. GPS provides global position estimates but is restricted to outdoor environments and has an inherent imprecision of a few meters. In indoor spaces, other sensors such as lasers and cameras are commonly used for position estimation, but these require landmarks (or maps) in the environment and a fair amount of computation to process complex algorithms. These sensors also have a limited field of view. Wireless networks (WN) are now widely available in indoor environments and can enable efficient global localization with relatively low computing resources. However, the inherent instability of the wireless signal prevents it from being used for very accurate position estimation. Growth in the number of access points (APs) increases the overlap of signal coverage areas, and this can be exploited to improve localization precision. In this paper we evaluate the impact of the number of access points on mobile node localization using artificial neural networks (ANNs). We use three to eight APs as the signal source and show how the ANNs learn and generalize from the data, as sketched below. In addition, we evaluate the robustness of the ANNs and a heuristic intended to reduce the localization error. To validate our approach, several ANN topologies were evaluated in experimental tests conducted with a mobile node in an indoor space.
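A minimal sketch of the idea (the network topology, AP count, and readings below are invented for illustration, not the paper's experimental data): an MLP is trained to map received signal strengths from several APs to a 2-D position.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Each row: RSSI (dBm) observed from three access points;
# each target: the (x, y) position in meters where it was recorded.
X = np.array([[-40, -62, -71], [-55, -48, -80], [-70, -66, -52],
              [-45, -58, -75], [-60, -50, -70], [-68, -60, -55]])
y = np.array([[1.0, 2.0], [4.0, 2.5], [6.5, 5.0],
              [1.5, 2.2], [4.5, 3.0], [6.0, 4.5]])

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
net.fit(X, y)
print(net.predict([[-50, -60, -70]]))       # estimated (x, y)
```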
Abstract:
Conclusion: A robot built specifically for stereotactic cochlear implantation provides equal or better accuracy, together with better integration into a clinical environment, than existing approaches based on industrial robots. Objectives: To evaluate the technical accuracy of a robotic system developed specifically for lateral skull base surgery in an experimental setup reflecting the intended clinical application. The invasiveness of cochlear electrode implantation procedures may be reduced by replacing the traditional mastoidectomy with a small tunnel slightly larger in diameter than the electrode itself. Methods: The end-to-end accuracy of the robot system and the associated image-guided procedure was evaluated on 15 temporal bones of whole-head cadaver specimens. The main steps of the procedure were as follows: reference screw placement, cone beam CT scan, computer-aided planning, pair-point matching of the surgical plan, robotic drilling of the direct access tunnel, and post-operative cone beam CT scan with accuracy assessment. Results: The mean accuracy at the target point (round window) was 0.56 ± 0.41 mm, with an angular misalignment of 0.88 ± 0.41°. The procedure, from registration through completion of drilling, took 25 ± 11 min. The robot was fully operational in a clinical environment.
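The reported end-to-end accuracy combines a positional error at the target with an angular misalignment of the drilled axis. A generic sketch of how such an assessment can be computed from planned versus post-operative coordinates (standard geometry, not the authors' code):

```python
import numpy as np

def trajectory_accuracy(planned_target, achieved_target,
                        planned_axis, achieved_axis):
    """Positional error (mm) at the target point and angular misalignment
    (degrees) between planned and achieved drill trajectories."""
    positional = np.linalg.norm(np.subtract(achieved_target, planned_target))
    a = np.asarray(planned_axis, dtype=float)
    b = np.asarray(achieved_axis, dtype=float)
    a /= np.linalg.norm(a)
    b /= np.linalg.norm(b)
    angular = np.degrees(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0)))
    return positional, angular
```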
Abstract:
The Bucknell Humanoid Robot Arm project was developed in order to provide a lightweight robotic arm for the IHMC / Bucknell University bipedal robot that will provide a means of manipulation and facilitate operations in urban environments. The resulting fabricated arm described in this thesis weighs only 13 pounds, is capable of holding 11 pounds fully outstretched and lifting objects such as tools, and can open doors. It can also be easily integrated with the IHMC / Bucknell University biped. This thesis provides an introduction to robots, discusses the goals of the Bucknell Humanoid Robot Arm project, provides background on some existing robots, and shows how the Bucknell Humanoid Robot Arm fits in with the studies that have been completed. Based on these studies, design trees and operational scenarios were developed, which led to measurable specifications and later the design requirements. A significant contribution of this thesis to the robotics discipline is the design of the actuator itself. The arm uses individual, lightweight, compactly designed actuators to achieve the desired capabilities and performance requirements. Many iterations were completed to reach the final design of each actuator. After the actuators were completed, the design of the intermediate links and brackets was finalized. Completion of the design led to the development of a complex control system implemented in a combination of C and Java.
Abstract:
The abundance of alpha-fetoprotein (AFP), a natural protein produced by the fetal yolk sac during pregnancy, correlates with a lower incidence of estrogen receptor-positive (ER+) breast cancer. The pharmacophore region of AFP has been narrowed down to a four-amino-acid (AA) region in the third domain of the 591-AA peptide. Our computational study focuses on a 4-mer segment consisting of the amino acids threonine-proline-valine-asparagine (TPVN). We ran replica exchange molecular dynamics (REMD) simulations and used 120 configurational snapshots from the total trajectory as starting configurations for quantum chemical calculations. We optimized structures using semiempirical (PM3, PM6, PM6-D2, PM6-H2, PM6-DH+, PM6-DH2) and density functional methods (TPSS, PBE0, M06-2X). By comparing the accuracy of these methods against RI-MP2 benchmarks, we devised a protocol for calculating the lowest-energy conformers of these peptides accurately and efficiently. This protocol screens out high-energy conformers using lower levels of theory, as sketched below, and outlines a general method for predicting small peptide structures.
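The screening protocol can be pictured as a funnel: cheap methods discard high-energy conformers, and only the survivors reach the expensive benchmark level. A schematic sketch (the energy functions and the energy window are placeholders, not the paper's exact thresholds):

```python
def screen_conformers(conformers, energy_fns, window=5.0):
    """Funnel screening: at each level of theory, keep only conformers
    within `window` (e.g. kcal/mol) of the current minimum, then pass
    the survivors to the next, more expensive method."""
    pool = list(conformers)
    for energy in energy_fns:               # e.g. [pm6_dh2, tpss, m06_2x]
        scored = [(energy(c), c) for c in pool]
        e_min = min(e for e, _ in scored)
        pool = [c for e, c in scored if e - e_min <= window]
    return pool
```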
Abstract:
Open radical prostatectomy represents one therapeutic option for treating patients with clinically localized prostate cancer. Patient selection and surgical management have undergone important changes in recent years, resulting in lower morbidity and probably in better tumor control due to better standardisation of the surgical technique. Long-term functional outcomes regarding continence and potency are of increasing importance and mainly influence quality of life in these patients. Open radical retropubic prostatectomy remains the gold standard in patients with localized prostate cancer, owing to its low morbidity and excellent oncological and functional results. The value of laparoscopic and robotic radical prostatectomy is still discussed controversially: owing to the relatively high morbidity during the so-called learning curve and the lack of long-term oncological and functional results, these techniques currently appear to show less favourable results.
Abstract:
As the performance gap between microprocessors and memory continues to increase, main memory accesses incur long latencies that limit system performance. Previous studies show that main memory access streams contain significant locality and that SDRAM devices provide parallelism through multiple banks and channels. This locality and parallelism have not been exploited thoroughly by conventional memory controllers. In this thesis, SDRAM address mapping techniques and memory access reordering mechanisms are studied and applied to memory controller design with the goal of reducing observed main memory access latency. The proposed bit-reversal address mapping attempts to distribute main memory accesses evenly in the SDRAM address space to enable bank parallelism; as memory accesses to distinct banks are interleaved, the access latencies are partially hidden and therefore reduced. By taking cache conflict misses into consideration, bit-reversal address mapping is able to direct potential row conflicts to different banks, further improving performance (see the sketch below). The proposed burst scheduling is a novel access reordering mechanism that creates bursts by clustering accesses directed to the same rows of the same banks. Subject to a threshold, reads are allowed to preempt writes, and qualified writes are piggybacked at the end of the bursts. A sophisticated access scheduler selects accesses based on priorities and interleaves accesses to maximize SDRAM data bus utilization. Consequently, burst scheduling reduces the row conflict rate, increasing and exploiting the available row locality. Using revised SimpleScalar and M5 simulators, both techniques are evaluated and compared with existing academic and industrial solutions. With SPEC CPU2000 benchmarks, bit-reversal reduces execution time by 14% on average over traditional page interleaving address mapping, and burst scheduling achieves a 15% reduction in execution time over conventional in-order bank scheduling. Working constructively together, bit-reversal and burst scheduling achieve a 19% speedup across the simulated benchmarks.
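The intuition behind bit-reversal mapping is that cache-conflicting addresses differ mainly in their high-order (tag) bits, which a conventional |row|bank|column| layout places in the row field, producing row conflicts in one bank. Reversing the bits above the column field moves that variation into the bank-select bits. A toy sketch of the idea (the field widths are illustrative, and this is not the thesis's exact mapping):

```python
def bit_reverse(value, width):
    """Reverse the low `width` bits of `value`."""
    result = 0
    for _ in range(width):
        result = (result << 1) | (value & 1)
        value >>= 1
    return result

COL_BITS, BANK_BITS, ROW_BITS = 10, 2, 13   # illustrative field widths

def bank_of(addr, reversal=True):
    """Bank index under |row|bank|column| mapping, optionally with the
    row+bank field bit-reversed so high-order address bits pick the bank."""
    upper = (addr >> COL_BITS) & ((1 << (BANK_BITS + ROW_BITS)) - 1)
    if reversal:
        upper = bit_reverse(upper, BANK_BITS + ROW_BITS)
    return upper & ((1 << BANK_BITS) - 1)

# Two addresses differing only in a high (cache-tag) bit: conventionally
# they hit the same bank in different rows (a row conflict), but under
# bit reversal they are spread across banks.
a, b = 0x000000, 0x800000
print(bank_of(a, False), bank_of(b, False))  # 0 0 -> same bank
print(bank_of(a), bank_of(b))                # 0 2 -> different banks
```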
Abstract:
In the Dominican Republic, economic growth over the past twenty years has not yielded sufficient improvement in access to drinking water services, especially in rural areas, where 1.5 million people do not have access to an improved water source (WHO, 2006). Worldwide, strategic development planning in the rural water sector has focused on participatory processes and the use of demand filters to ensure that service levels match community commitment to post-project operation and maintenance. However, studies have concluded that an alarmingly high percentage of drinking water systems (20-50%) do not provide service at the design levels and/or fail altogether (up to 90%): BNWP (2009), Annis (2006), and Reents (2003). The World Bank, USAID, NGOs, and private consultants have invested significant resources in an effort to determine what components make up an “enabling environment” for sustainable community management of rural water systems (RWS). Research has identified an array of critical factors, internal and external to the community, which affect the long-term sustainability of water services, and different frameworks have been proposed to better understand the linkages between individual factors and sustainability of service. This research proposes a Sustainability Analysis Tool to evaluate the sustainability of RWS, adapted from previous relevant work in the field to reflect the realities of the Dominican Republic. It can be used as a diagnostic tool by government entities and development organizations to characterize the needs of specific communities and identify weaknesses in existing training regimes or support mechanisms. The framework uses eight indicators in three categories (Organization/Management, Financial Administration, and Technical Service). Nineteen independent variables are measured, resulting in a rating of sustainability likely (SL), sustainability possible (SP), or sustainability unlikely (SU) for each of the eight indicators. Thresholds are based upon benchmarks from the DR and around the world, primary data collected during the research, and the author’s 32 months of field experience. A final sustainability score is calculated using weighting factors for each indicator, derived from Lockwood (2003), as sketched below. The framework was tested using a statistically representative, geographically stratified random sample of 61 water systems built in the DR through initiatives of the National Institute of Potable Water (INAPA) and the Peace Corps. The results indicated that 23% of the sampled systems are likely to be sustainable in the long term, 59% are possibly sustainable, and 18% are unlikely to be able to overcome any significant challenge. Communities scored as unlikely to be sustainable performed poorly in participation, financial durability, and governance, while the highest scores were for system function and repair service. The Sustainability Analysis Tool results are corroborated by INAPA and Peace Corps reports, evaluations, and database information, as well as field observations and primary data collected during the surveys. Future research will analyze the nature and magnitude of relationships between key factors and the sustainability score defined by the tool; factors include gender participation, legal status of water committees, plumber/operator remuneration, demand responsiveness, post-construction support methodologies, and project design criteria.
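To illustrate how indicator-level ratings can be rolled up into a single weighted score, consider the following sketch; the indicator names, weights, and numeric mapping are hypothetical stand-ins, not the tool's actual values:

```python
# Hypothetical mapping of per-indicator ratings to points.
POINTS = {"SL": 2, "SP": 1, "SU": 0}

# Hypothetical indicators and weights (the real tool uses eight indicators
# in three categories, with weights derived from Lockwood, 2003).
WEIGHTS = {"participation": 2.0, "financial_durability": 1.5,
           "governance": 1.5, "system_function": 1.0}

def sustainability_score(ratings, weights=WEIGHTS):
    """Weighted average of indicator ratings, normalized to [0, 2]."""
    total = sum(POINTS[ratings[name]] * w for name, w in weights.items())
    return total / sum(weights.values())

print(sustainability_score({"participation": "SP",
                            "financial_durability": "SL",
                            "governance": "SU",
                            "system_function": "SL"}))
```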
Abstract:
This report shares my efforts in developing a solid unit of instruction that has a clear focus on student outcomes. I have been a teacher for 20 years and have been writing and revising curricula for much of that time. However, most of it was developed without the benefit of current research on how students learn and did not focus on what and how students are learning. My journey as a teacher has involved a lot of trial and error. My traditional method of teaching is to look at the benchmarks (now content expectations) to see what needs to be covered. My unit consists of having students read the appropriate sections in the textbook, complete worksheets, watch a video, and take some notes. I try to include at least one hands-on activity, one or more quizzes, and the traditional end-of-unit test consisting mostly of multiple-choice questions I find in the textbook. I try to be engaging, make the lessons fun, and hope that at the end of the unit my students get whatever concepts I’ve presented so that we can move on to the next topic. I want to increase students’ understanding of science concepts and their ability to connect that understanding to the real world. However, sometimes I feel that my lessons are missing something. For a long time I have wanted to develop a unit of instruction that I know is an effective tool for the teaching and learning of science. In this report, I describe my efforts to reform my curricula using the “Understanding by Design” process. I want to see if this style of curriculum design will help me be a more effective teacher and if it will lead to an increase in student learning. My hypothesis is that this new (for me) approach to teaching will lead to increased understanding of science concepts among students because it is based on purposefully thinking about learning targets grounded in “big ideas” in science. For my reformed curricula I incorporate lessons from several outstanding programs I’ve been involved with, including EpiCenter (Purdue University), Incorporated Research Institutions for Seismology (IRIS), the Master of Science program in Applied Science Education at Michigan Technological University, and the Michigan Association for Computer Users in Learning (MACUL). In this report, I present the methodology for developing a new unit of instruction based on the Understanding by Design process. I present several lessons and learning plans I’ve developed for the unit, following the 5E Learning Cycle, as appendices at the end of this report. I also include the results of pilot testing of one of the lessons. Although the lesson I pilot-tested was not as successful in increasing student learning outcomes as I had anticipated, the development process I followed was helpful in that it required me to focus on important concepts. Conducting the pilot test was also helpful because it led me to identify ways in which I could improve the lesson in the future.
Abstract:
Computer-aided microscopic surgery of the lateral skull base is a rare intervention in daily practice. It is often a delicate and difficult minimally invasive intervention, since orientation within the petrous bone and at the petrous apex is often challenging. In cases of aural atresia or tumors, the normal anatomical landmarks are often absent, making orientation more difficult. Navigation support, together with imaging techniques such as CT, MR, and angiography, enables the surgeon in such cases to perform the operation more accurately and, in some cases, more quickly. However, there are no internationally standardised indications for navigated surgery of the lateral skull base. Miniaturised robotic systems are still in the initial validation phase.
Abstract:
Behavioral reflection is crucial for supporting, for example, functional upgrades, on-the-fly debugging, or the monitoring of critical applications. However, the use of reflective features can lead to severe problems due to infinite metacall recursion, even in simple cases. This is especially a problem when reflecting on core language features, since there is a high chance that such features are used to implement the reflective behavior itself. In this paper we analyze the problem of infinite meta-object call recursion and solve it by providing a first-class representation of meta-level execution: at any point in the execution of a system it can be determined whether we are operating on the meta-level or the base level, so that infinite recursion can be prevented. We present how meta-level execution can be represented by a meta-context and how reflection thereby becomes context-aware. Our solution makes it possible to freely apply behavioral reflection even to system classes: the meta-context brings stability to behavioral reflection. We validate the concept with a robust implementation, and we present benchmarks.
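In Python terms, the meta-context can be approximated with a per-thread flag recording whether execution is currently on the meta-level; this is a loose analogue for illustration, not the paper's Smalltalk implementation:

```python
import threading

_context = threading.local()                  # per-thread meta-level marker

def metalevel_active():
    return getattr(_context, "meta", False)

def with_meta_action(func, meta_action):
    """Attach `meta_action` to `func`, suppressing re-entry: if we are
    already executing on the meta-level, the action is skipped, which
    breaks infinite meta-object call recursion."""
    def wrapper(*args, **kwargs):
        if not metalevel_active():
            _context.meta = True              # enter the meta-context
            try:
                meta_action(func, args, kwargs)   # meta-level execution
            finally:
                _context.meta = False         # leave the meta-context
        return func(*args, **kwargs)          # base-level execution
    return wrapper

# Even if `meta_action` itself invokes functions wrapped the same way
# (e.g. when reflecting on core features), recursion stops at the flag.
```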
Abstract:
Dynamic, unanticipated adaptation of running systems is of interest in a variety of situations, ranging from functional upgrades to on-the-fly debugging or monitoring of critical applications. In this paper we study a particular form of computational reflection, called unanticipated partial behavioral reflection, which is particularly well suited for unanticipated adaptation of real-world systems. Our proposal combines the dynamicity of unanticipated reflection, i.e., reflection that does not require any advance preparation of the code, with the selectivity and efficiency of partial behavioral reflection. First, we propose unanticipated partial behavioral reflection, which enables the developer to precisely select the required reifications, to flexibly engineer the metalevel, and to introduce the meta behavior dynamically. Second, we present a system supporting unanticipated partial behavioral reflection in Squeak Smalltalk, called Geppetto, and illustrate its use with a concrete example of a web application. Benchmarks validate the applicability of our proposal as an extension to the standard reflective abilities of Smalltalk.
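A rough Python analogue of installing a link on one selected operation at run time, with no prior preparation of the code (a drastic simplification for illustration; Geppetto itself operates on Squeak Smalltalk):

```python
def install_link(cls, method_name, before):
    """Dynamically attach meta behavior to a single selected operation of
    a single class: a simplified analogue of unanticipated partial
    behavioral reflection (only the chosen operation is reified)."""
    original = getattr(cls, method_name)
    def hooked(self, *args, **kwargs):
        before(self, args, kwargs)            # the selected reification
        return original(self, *args, **kwargs)
    setattr(cls, method_name, hooked)

class Account:
    def __init__(self):
        self.balance = 0
    def withdraw(self, amount):
        self.balance -= amount

# Start tracing Account.withdraw while the program is already running.
install_link(Account, "withdraw", lambda obj, a, k: print("withdraw", a))
Account().withdraw(5)                         # prints: withdraw (5,)
```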