963 results for precision metrology
Abstract:
A method of accurately controlling the position of a mobile robot using an external Large Volume Metrology (LVM) instrument is presented in this paper. By utilizing an LVM instrument such as a laser tracker for mobile robot navigation, many of the most difficult navigation problems can be simplified or avoided. Using the real-time position information from the laser tracker, a very simple navigation algorithm, and a low-cost robot, 5 mm repeatability was achieved over a volume of 30 m radius. A surface digitization scan of a wind turbine blade section was also demonstrated, illustrating possible applications of the method for manufacturing processes. © Springer-Verlag Berlin Heidelberg 2010.
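The scheme described is essentially a feedback loop closed on the external tracker measurement rather than on odometry. Below is a minimal Python sketch of such a loop under stated assumptions: the simulated robot, the perfect tracker observation, the gains and the kinematic model are all illustrative inventions, not details from the paper.

    import math

    class SimulatedRobot:
        """Stand-in for the low-cost robot plus laser tracker: the tracker
        is modelled as a perfect external observer of the true pose."""
        def __init__(self):
            self.x, self.y, self.heading = 0.0, 0.0, 0.0

        def tracker_position(self):
            return self.x, self.y, self.heading  # external measurement

        def step(self, v, w, dt=0.05):  # crude unicycle kinematics
            self.heading += w * dt
            self.x += v * math.cos(self.heading) * dt
            self.y += v * math.sin(self.heading) * dt

    def drive_to(robot, target, tol=0.005):
        """Proportional navigation that closes the loop on the tracker
        reading instead of odometry; gains are illustrative only."""
        K_LIN, K_ANG = 0.8, 2.0
        while True:
            x, y, heading = robot.tracker_position()
            dx, dy = target[0] - x, target[1] - y
            dist = math.hypot(dx, dy)
            if dist < tol:  # within 5 mm of the goal
                return
            bearing = math.atan2(dy, dx)
            err = math.atan2(math.sin(bearing - heading),
                             math.cos(bearing - heading))
            robot.step(min(K_LIN * dist, 0.5), K_ANG * err)

    robot = SimulatedRobot()
    drive_to(robot, (2.0, 1.0))
    print(round(robot.x, 3), round(robot.y, 3))  # approx. 2.0 1.0

The design point is that drift-prone internal odometry never enters the loop: every correction is computed from the absolute, externally measured pose, which is what makes a very simple controller sufficient.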
Abstract:
Aerospace manufacturers typically use monolithic steel fixtures to control the form of assemblies. This tooling is very expensive, has long lead times and has little ability to accommodate product variation and design changes. Since the tool setting and recertification process is manual and time-consuming, monolithic structures are required in order to maintain the tooling tolerances for multiple years without recertification. This paper introduces the Metrology Enhanced Tooling for Aerospace (META) Framework, which interfaces multiple metrology technologies with the tooling, components, workers and automation. This will allow rapid or even real-time fixture recertification with improved product verification, leading to a reduced risk of product non-conformance and increased fixture utilization while facilitating flexible fixtures.
Abstract:
Measurement and verification of products and processes during early design are attracting increasing interest from high value manufacturing industries. Measurement planning is deemed an effective means to facilitate the integration of the metrology activity into a wider range of production processes. However, the literature reveals that there have been very few research efforts in this field, especially regarding large volume metrology. This paper presents a novel approach to instrument selection, the first stage of the measurement planning process, by mapping measurability characteristics between specific measurement assignments and instruments.
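As a toy illustration of such a mapping, the Python sketch below filters a small instrument catalogue against the characteristics demanded by a measurement assignment. The attribute names, instrument entries and the 4:1 tolerance-to-uncertainty ratio are assumptions made for the example, not data from the paper.

    # Hypothetical catalogue; values are illustrative only.
    instruments = [
        {"name": "laser tracker",  "range_m": 40.0, "uncertainty_um": 15.0,
         "portable": True},
        {"name": "photogrammetry", "range_m": 10.0, "uncertainty_um": 50.0,
         "portable": True},
        {"name": "CMM",            "range_m": 2.0,  "uncertainty_um": 2.0,
         "portable": False},
    ]

    def select_instruments(assignment, catalogue):
        """Keep instruments whose characteristics satisfy the assignment
        (assuming a 4:1 tolerance-to-uncertainty ratio) and rank them by
        measurement uncertainty, lower first."""
        feasible = [i for i in catalogue
                    if i["range_m"] >= assignment["size_m"]
                    and i["uncertainty_um"] <= assignment["tolerance_um"] / 4
                    and (i["portable"] or not assignment["in_situ"])]
        return sorted(feasible, key=lambda i: i["uncertainty_um"])

    task = {"size_m": 8.0, "tolerance_um": 200.0, "in_situ": True}
    for inst in select_instruments(task, instruments):
        print(inst["name"])  # -> laser tracker, photogrammetry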
Abstract:
Aircraft manufacturing industries are looking for solutions to increase their productivity. One such solution is to apply metrology systems during the production and assembly processes. The Metrology Process Model (MPM) (Maropoulos et al., 2007) has been introduced, which links metrology applications with assembly planning, manufacturing processes and product design. Measurability analysis is part of the MPM, and the aim of this analysis is to check the feasibility of measuring the designed large-scale components. Measurability analysis has been integrated in order to provide an efficient matching system. The metrology database is structured by developing the Metrology Classification Model. Furthermore, the feature-based selection model is also explained. By combining the two classification models, a novel approach and selection process for an integrated measurability analysis system (MAS) are introduced; such an integrated MAS can provide much more meaningful matching results for the operators. © Springer-Verlag Berlin Heidelberg 2010.
Abstract:
Dimensional and form inspections are key to the manufacturing and assembly of products. Product verification can involve a number of different measuring instruments operated using their dedicated software. Typically, each of these instruments with its associated software is more suitable for the verification of a pre-specified quality characteristic of the product than the others. The number of different systems and software applications needed to perform a complete measurement of products and assemblies within a manufacturing organisation is therefore expected to be large, and it becomes even larger as advances in measurement technologies are made. The idea of a universal software application for any instrument still appears to be only a theoretical possibility, so a need for information integration is apparent. In this paper, a design of an information system to consistently manage (store, search, retrieve, secure) measurement results from various instruments and software applications is introduced. The first of the two main ideas underlying the proposed system is to abstract the structures and formats of measurement files from the data, so that complexity and incompatibility between different approaches to measurement data modelling are avoided. Secondly, the information within a file is enriched with meta-information to facilitate its consistent storage and retrieval. To demonstrate the designed information system, a web application is implemented. © Springer-Verlag Berlin Heidelberg 2010.
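A minimal Python sketch of these two ideas treats each measurement file as an opaque payload and wraps it in searchable meta-information; the field names and helper function are hypothetical, not the schema used in the paper.

    import datetime
    import hashlib
    import json

    def wrap_measurement_file(raw_bytes, instrument, operator, feature):
        """Store a measurement file without parsing its native format,
        enriched with meta-information for consistent indexing, search
        and retrieval."""
        return {
            "meta": {
                "instrument": instrument,
                "operator": operator,
                "feature": feature,
                "stored_at": datetime.datetime.now(
                    datetime.timezone.utc).isoformat(),
                "sha256": hashlib.sha256(raw_bytes).hexdigest(),
            },
            "payload": raw_bytes.hex(),  # format-agnostic payload
        }

    record = wrap_measurement_file(b"<proprietary binary>",
                                   instrument="laser tracker A",
                                   operator="inspector 7",
                                   feature="wing-root datum hole")
    print(json.dumps(record["meta"], indent=2))

Because the payload is never interpreted, adding a new instrument or file format requires no change to the storage layer; only the meta-information contract must be honoured.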
Abstract:
Metrology processes used in the manufacture of large products include tool setting, product verification and flexible metrology-enabled automation. The range of applications and instruments available makes the selection of the appropriate instrument for a given task highly complex. Since metrology is a key manufacturing process, it should be considered in the early stages of design. This paper provides an overview of the important selection criteria for typical measurement processes and presents some novel selection strategies. Metrics which can be used to assess measurability are also discussed. A prototype instrument selection and measurability analysis application is presented, with discussion of how this can be used as the basis for development of a more sophisticated measurement planning tool. © Springer-Verlag Berlin Heidelberg 2010.
Abstract:
Precision medicine is an emerging approach to disease treatment and prevention that considers variability in patient genes, environment, and lifestyle. However, little has been written about how such research impacts emergency care. Recent advances in analytical techniques have made it possible to characterize patients in a more comprehensive and sophisticated fashion at the molecular level, promising highly individualized diagnosis and treatment. Among these techniques are various systematic molecular phenotyping analyses (e.g., genomics, transcriptomics, proteomics, and metabolomics). Although a number of emergency physicians use such techniques in their research, widespread discussion of these approaches has been lacking in the emergency care literature and many emergency physicians may be unfamiliar with them. In this article, we briefly review the underpinnings of such studies, note how they already impact acute care, discuss areas in which they might soon be applied, and identify challenges in translation to the emergency department (ED). While such techniques hold much promise, it is unclear whether the obstacles to translating their findings to the ED will be overcome in the near future. Such obstacles include validation, cost, turnaround time, user interface, decision support, standardization, and adoption by end-users.
Abstract:
Historically, memory has been evaluated by examining how much is remembered; however, a more recent conception of memory focuses on the accuracy of memories. Under this accuracy-oriented conception, unlike with the quantity-oriented approach, memory does not always deteriorate over time. A possible explanation for this seemingly surprising finding lies in the metacognitive processes of monitoring and control. Use of these processes allows people to withhold responses of which they are unsure, or to adjust the precision of responses to a level that is broad enough to be correct. The ability to accurately report memories has implications for investigators who interview witnesses to crimes, and for those who evaluate witness testimony. This research examined the amount of information provided, the accuracy, and the precision of responses given during immediate and delayed interviews about a videotaped mock crime. The interview format was manipulated such that either a single free narrative response was elicited, or a series of either yes/no or cued questions was asked. Instructions provided by the interviewer indicated to the participants that they should stress either being informative or being accurate. The interviews were then transcribed and scored. Results indicate that accuracy rates remained stable and high after a one-week delay. Compared to those interviewed immediately, participants interviewed after a delay provided less information and responses that were less precise. Participants in the free narrative condition were the most accurate. Participants in the cued questions condition provided the most precise responses. Participants in the yes/no questions condition were the most likely to say “I don’t know”. The results indicate that people are able to monitor their memories and modify their reports to maintain high accuracy. When control over precision was not possible, such as in the yes/no condition, people said “I don’t know” to maintain accuracy. However, when withholding responses and adjusting precision were both possible, people utilized both methods. It seems that concerns that memories reported after a long retention interval might be inaccurate are unfounded.
Abstract:
In his last two State of the Union addresses, President Barack Obama has focused on the need to deliver innovative solutions to improve human health, through the Precision Medicine Initiative in 2015 and the recently announced Cancer Moonshot in 2016. Precision cancer care has delivered clear patient benefit, but even for high-impact medicines such as imatinib mesylate (Glivec) in chronic myeloid leukaemia, the excitement at the success of this practice-changing clinical intervention has been somewhat tempered by the escalating price of this 'poster child' for precision cancer medicine (PCM). Recent studies on the costs of cancer drugs have revealed significant price differentials, which are a major causative factor behind disparities in access to new generations of immunological and molecularly targeted agents. In this perspective, we will discuss the benefits of PCM to modern cancer control, but also emphasise how increasing costs are rendering the current approaches to integrating the paradigm of PCM unsustainable. Despite the ever-increasing pressure on cancer and health care budgets, innovation will and must continue. Value-based frameworks offer one of the most rational approaches for policymakers committed to improving cancer outcomes through a public health approach.
Abstract:
Temporal replicate counts are often aggregated to improve model fit by reducing zero-inflation and count variability; in the case of migration counts collected hourly throughout a migration season, aggregation also allows one to ignore nonindependence. However, aggregation can represent a loss of potentially useful information on the hourly or seasonal distribution of counts, which might impact our ability to estimate reliable trends. We simulated 20-year hourly raptor migration count datasets with a known rate of change to test the effect of aggregating hourly counts to daily or annual totals on our ability to recover the known trend. We simulated data for three types of species, to test whether results varied with species abundance or migration strategy: a commonly detected species, e.g., Northern Harrier, Circus cyaneus; a rarely detected species, e.g., Peregrine Falcon, Falco peregrinus; and a species typically counted in large aggregations with overdispersed counts, e.g., Broad-winged Hawk, Buteo platypterus. We compared the accuracy and precision of estimated trends across species and count types (hourly/daily/annual) using hierarchical models that assumed a Poisson, negative binomial (NB) or zero-inflated negative binomial (ZINB) count distribution. We found little benefit of modeling zero-inflation or of modeling the hourly distribution of migration counts. For the rare species, trends analyzed using daily totals and an NB or ZINB data distribution resulted in a higher probability of detecting an accurate and precise trend. In contrast, trends of the common and overdispersed species benefited from aggregation to annual totals; for the overdispersed species in particular, trends estimated using annual totals were more precise and resulted in lower probabilities of estimating a trend (1) in the wrong direction, or (2) with credible intervals that excluded the true trend, as compared with hourly and daily counts.
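The simulate-and-recover logic can be illustrated compactly. The Python sketch below assumes a log-linear trend and an ordinary Poisson GLM fitted to annual totals; the hierarchical NB and ZINB models of the study are more elaborate, so this is only a minimal analogue.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    years = np.arange(20)
    true_trend = -0.03                      # known 3% annual decline
    lam = np.exp(4.0 + true_trend * years)  # expected annual totals

    # Simulate annual totals, then refit to try to recover the trend.
    counts = rng.poisson(lam)
    X = sm.add_constant(years)
    fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
    print(fit.params[1])                    # estimate close to true_trend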
Abstract:
The phase difference principle is widely applied nowadays in sonar systems used for sea floor bathymetry. The apparent angle of a target point is obtained from the phase difference measured between two closely spaced receiving arrays. Here we study the influence of the phase difference estimation errors caused by the physical structure of the backscattered signals. It is shown that, under certain conditions, beyond the commonly considered effects of additive external noise and baseline decorrelation, the processing may be affected by the shifting footprint effect: the two interferometer receivers get simultaneous echo contributions coming from slightly shifted parts of the seabed, which degrades the signal coherence and, hence, the phase difference measurement. This geometrical effect is described analytically and checked with numerical simulations, for both square- and sine-shaped signal envelopes. Its relative influence depends on the geometrical configuration and receiver spacing, and it may be prevalent in practical cases associated with bathymetric sonars. Measurements close to nadir, which are known to be especially difficult with interferometry systems, are addressed in particular.
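For orientation, the textbook relation underlying such interferometric sonars (a standard result, not a formula quoted from this paper) links the measured phase difference to the apparent arrival angle:

\[
\Delta\phi = \frac{2\pi d}{\lambda}\,\sin\theta
\qquad\Longrightarrow\qquad
\hat{\theta} = \arcsin\!\left(\frac{\lambda\,\Delta\phi}{2\pi d}\right),
\]

where \(d\) is the receiver spacing, \(\lambda\) the acoustic wavelength and \(\theta\) the apparent angle of the target point. Any loss of coherence between the two received signals, such as that caused by the shifting footprint effect, directly degrades the estimate of \(\Delta\phi\) and hence of the bathymetry.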
Abstract:
Manipulation of single cells and particles is important to biology and nanotechnology. Our electrokinetic (EK) tweezers manipulate objects in simple microfluidic devices using gentle fluid and electric forces under vision-based feedback control. In this dissertation, I detail a user-friendly implementation of EK tweezers that allows users to select, position, and assemble cells and nanoparticles. This EK system was used to measure attachment forces between living breast cancer cells, trap single quantum dots with 45 nm accuracy, build nanophotonic circuits, and scan optical properties of nanowires. With a novel multi-layer microfluidic device, EK was also used to guide single microspheres along complex 3D trajectories. The schemes, software, and methods developed here can be used in many settings to precisely manipulate most visible objects, assemble objects into useful structures, and improve the function of lab-on-a-chip microfluidic systems.
Abstract:
We describe a new geometry for electrostatic actuators to be used in sensitive laser interferometers, suited for prototype and table-top experiments related to gravitational wave detection with mirrors of 100 g or less. The arrangement consists of two plates at the sides of the mirror (test mass), and therefore does not reduce its clear aperture as a conventional electrostatic drive (ESD) would. Using the sample case of the AEI-10 m prototype interferometer, we investigate the actuation range and the influence of the relative misalignment of the ESD plates with respect to the test mass. We find that in the case of the AEI-10 m prototype interferometer, this new kind of ESD could provide a range of 0.28 μm when operated at a voltage of 1 kV. In addition, the geometry presented is shown to provide a reduction factor of about 100 in the magnitude of the actuator motion coupling to the test mass displacement. We show that, in the specific case of the AEI-10 m interferometer, it is therefore possible to mount the ESD actuators directly on the optical table without spoiling the seismic isolation performance of the triple-stage suspension of the main test masses.
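The actuation principle itself is the usual electrostatic one; as a textbook relation (not a formula quoted from this paper), the force on the test mass follows from the gradient of the drive capacitance,

\[
F = \tfrac{1}{2}\,\frac{\partial C}{\partial x}\,V^{2},
\]

so the achievable range scales quadratically with the applied voltage, which is why the quoted 0.28 μm range is tied to the 1 kV operating voltage.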
Abstract:
We present a method to verify the metrological usefulness of noisy Dicke states of a particle ensemble with only a few collective measurements, without the need for a direct measurement of the sensitivity. Our method determines the usefulness of the state for the usual protocol for estimating the angle of rotation with Dicke states, which is based on the measurement of the second moment of a total spin component. It can also be used to detect entangled states that are useful for quantum metrology. We apply our method to recent experimental results.
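As background (a standard expression in this literature, not reproduced from the paper), the sensitivity of such a protocol is conventionally quantified by error propagation on the measured second moment,

\[
(\Delta\theta)^{2} = \frac{\bigl(\Delta(J_z^{2})\bigr)^{2}}{\bigl|\partial_{\theta}\langle J_z^{2}\rangle\bigr|^{2}},
\qquad
\bigl(\Delta(J_z^{2})\bigr)^{2} = \langle J_z^{4}\rangle - \langle J_z^{2}\rangle^{2},
\]

so the usefulness of a noisy Dicke state can be bounded from a few low-order moments of the collective spin, which is what makes verification with only a few collective measurements possible.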