765 results for MEASURING DEVICES


Relevance:

60.00%

Publisher:

Abstract:

A novel Neuropredictive Teleoperation (NPT) scheme is presented. The design results from two key ideas: the exploitation of the measured or estimated neural input to the human arm, or its electromyograph (EMG), as the system input, and the employment of a predictor of the arm movement, based on this neural signal and an arm model, to compensate for time delays in the system. Although a multitude of such models, as well as measuring devices for the neural signals and the EMG, have been proposed, current telemanipulator research has considered only highly simplified arm models. In the present design, the bilateral constraint that the master and slave are simultaneously compliant to each other's state (equal positions and forces) is abandoned, thus obtaining a simple-to-analyze succession of only locally controlled modules, and robustness to time delays of up to 500 ms. The proposed designs were inspired by well-established physiological evidence that the brain, rather than controlling the movement on-line, programs the arm with an action plan of a complete movement, which is then executed largely in open loop, regulated only by local reflex loops. As a model of the human arm, the well-established Stark model is employed, whose mathematical representation is modified to make it suitable for an engineering application. The proposed scheme is, however, valid for any arm model. BIBO-stability and passivity results for a variety of local control laws are reported. Simulation results and comparisons with traditional designs also highlight the advantages of the proposed design.
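A minimal sketch of the predictor idea, assuming a toy second-order arm model in discrete time (all names and parameters below are illustrative, not the paper's Stark-model design):

```python
import numpy as np

# Toy second-order arm model x'' = (u - b*x' - k*x)/m, Euler-discretized.
m, b, k, dt = 1.0, 2.0, 10.0, 0.001
delay_steps = 500                    # 500 ms at 1 kHz, the cited delay limit

def arm_step(state, u):
    x, v = state
    a = (u - b * v - k * x) / m
    return np.array([x + dt * v, v + dt * a])

def predict(state, u_window):
    """Forward-simulate the arm model over the delay window using the
    measured (or estimated) neural input -- the neuropredictive step that
    bridges the time delay."""
    s = state.copy()
    for u in u_window:
        s = arm_step(s, u)
    return s

# "Action plan" neural drive: a pre-programmed open-loop pulse.
T = 3000
u = np.where(np.arange(T) < 1000, 1.0, 0.0)

true_state = np.array([0.0, 0.0])
history, errors = [], []
for t in range(T):
    true_state = arm_step(true_state, u[t])
    history.append(true_state)
    if t >= delay_steps:
        delayed = history[t - delay_steps]
        est_now = predict(delayed, u[t - delay_steps + 1 : t + 1])
        errors.append(abs(est_now[0] - true_state[0]))

print(f"max prediction error: {max(errors):.2e}")   # ~0 with an exact model
```

With an exact model the predictor recovers the current arm state perfectly; model mismatch and noise are what the paper's stability and passivity analysis must then absorb.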

Relevance:

60.00%

Publisher:

Abstract:

In this paper, we discuss inferential aspects of the Grubbs model when the unknown quantity x (latent response) follows a skew-normal distribution, extending earlier results given in Arellano-Valle et al. (J Multivar Anal 96:265-281, 2005b). Maximum likelihood parameter estimates are computed via the EM algorithm. Wald and likelihood ratio type statistics are used for hypothesis testing, and we explain the apparent failure of the Wald statistic in detecting skewness via the profile likelihood function. The results and methods developed in this paper are illustrated with a numerical example.
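As a simplified illustration of the likelihood-ratio test for skewness, here applied to a directly observed sample rather than the latent response of the Grubbs model (which requires the EM machinery of the paper):

```python
import numpy as np
from scipy import stats

# Illustration only: LR test of H0 "no skewness" (normal) against a
# skew-normal alternative on an observed sample.  The paper runs the
# analogous test on the *latent* response of the Grubbs model, and also
# explains why the Wald statistic misbehaves near zero skewness.
x = stats.skewnorm.rvs(a=3.0, loc=0.0, scale=1.0, size=200,
                       random_state=np.random.default_rng(0))

a1, loc1, scale1 = stats.skewnorm.fit(x)          # H1: skew-normal
loc0, scale0 = stats.norm.fit(x)                  # H0: normal (shape = 0)

ll1 = stats.skewnorm.logpdf(x, a1, loc1, scale1).sum()
ll0 = stats.norm.logpdf(x, loc0, scale0).sum()

lr = 2 * (ll1 - ll0)                  # ~ chi-square(1) under H0
print(f"LR = {lr:.2f}, p = {stats.chi2.sf(lr, df=1):.4f}")
```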

Relevance:

60.00%

Publisher:

Abstract:

The Grubbs measurement model is frequently used to compare several measuring devices. It is common to assume that the random terms have a normal distribution. However, such an assumption makes the inference vulnerable to outlying observations, whereas scale mixtures of normal distributions have been an interesting alternative for producing robust estimates while retaining the elegance and simplicity of maximum likelihood theory. The aim of this paper is to develop an EM-type algorithm for parameter estimation, and to use the local influence method to assess the robustness of these parameter estimates under some usual perturbation schemes. In order to identify outliers and to criticize the model building, we use the local influence procedure in a study to compare the precision of several thermocouples.
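For orientation, a minimal sketch of the classical normal-theory Grubbs model fitted by direct maximum likelihood on synthetic data (the paper's robust scale-mixture version replaces the normal terms and uses an EM-type algorithm instead):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

# Classical (normal) Grubbs model for p devices measuring a latent x:
#   y_i = alpha + 1*x_i + e_i,  x_i ~ N(mu, phi),  e_i ~ N(0, diag(d)),
# so marginally y_i ~ N_p(alpha + mu*1, phi*11' + diag(d)).  We fix
# alpha[0] = 0 for identifiability and fit by direct maximum likelihood.
rng = np.random.default_rng(1)
p, n = 3, 200
true_alpha = np.array([0.0, 0.5, -0.3])           # additive device biases
x = rng.normal(10.0, 2.0, size=n)
y = true_alpha + x[:, None] + rng.normal(0.0, 0.4, size=(n, p))

def neg_loglik(theta):
    alpha = np.concatenate([[0.0], theta[:p - 1]])
    mu, log_phi, log_d = theta[p - 1], theta[p], theta[p + 1:]
    cov = np.exp(log_phi) * np.ones((p, p)) + np.diag(np.exp(log_d))
    return -multivariate_normal.logpdf(y, mean=alpha + mu, cov=cov).sum()

theta0 = np.zeros(2 * p + 1)      # [alpha_2..p, mu, log phi, log d_1..p]
theta0[p - 1] = y.mean()
fit = minimize(neg_loglik, theta0, method="Nelder-Mead",
               options={"maxiter": 20000, "fatol": 1e-9})
print("estimated biases:", np.round(np.r_[0.0, fit.x[:p - 1]], 3))
```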

Relevance:

60.00%

Publisher:

Abstract:

This thesis focuses on using photovoltaic electricity to power air conditioners in a tropical climate. The study takes place in Surabaya, Indonesia, at two different locations: a classroom located on the UBAYA campus and a home office 10 km away. Indonesia has an average solar irradiation of about 4.8 kWh/m²/day (PWC Indonesia, 2013), which provides near-ideal conditions for these tests. At the home office, tests were conducted on different photovoltaic systems. A series of measuring devices recorded the performance of the 800 W PV system and the consumption of the 1.35 kW (cooling capacity) air conditioner. For an off-grid system, many of the components need to be oversized. The inverter has to be oversized to meet the start-up load of the air conditioner, which can be 3 to 8 times the operating power (Rozenblat, 2013). The high energy consumption of the air conditioner would require a large battery storage to provide one day of autonomy, and the PV system's output must at least match the consumption of the air conditioner. A grid-connected system provides a much better solution, with the 800 W PV system providing 80% of the 3.5 kWh load of the air conditioner, the other 20% coming from the grid during periods of low irradiation. In this system the start-up load is provided by the grid, so the inverter does not need to be oversized. With the grid-connected system, the PV panels' production does not need to match the consumption of the air conditioner, although a smaller PV array means a smaller percentage of the load is covered by PV. Using the results from the home-office tests and measurements made in the classroom, two different PV systems (8 kW and 12 kW) were simulated to power both the current air conditioners (COP 2.78) and new air conditioners (COP 4.0). The payback period of the systems can vary greatly depending on whether a feed-in tariff is awarded. If the feed-in tariff is awarded, the best system is the 12 kW system, with a payback period of 4.3 years and a levelized cost of energy of -3,334 IDR/kWh. If the feed-in tariff is not granted, the 8 kW system is the best choice, with a lower payback period and lower levelized cost of energy than the 12 kW system under the same conditions.
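As a quick check of the figures quoted above, a back-of-the-envelope sketch (the performance ratio is an illustrative assumption, not a number from the thesis):

```python
# Back-of-the-envelope check of the quoted sizing figures.
irradiation = 4.8          # kWh/m2/day, Indonesian average (PWC, 2013)
pv_size_kw = 0.8           # 800 W array at the home office
performance_ratio = 0.75   # assumed total system losses (placeholder)

daily_pv_kwh = pv_size_kw * irradiation * performance_ratio
ac_load_kwh = 3.5          # daily air-conditioner consumption
print(f"PV yield ~{daily_pv_kwh:.2f} kWh/day "
      f"-> ~{min(daily_pv_kwh / ac_load_kwh, 1.0):.0%} of the AC load")

# Off-grid inverter sizing: start-up surge of 3-8x the operating power.
ac_power_kw = 1.35         # nameplate capacity quoted above
surge_factor = 5           # mid-range of the 3-8x span (illustrative)
print(f"off-grid inverter must ride through ~{ac_power_kw * surge_factor:.1f} kW")
```

With these assumptions the yield comes out near 2.9 kWh/day, about 80% of the 3.5 kWh load, consistent with the grid-connected result reported above.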

Relevance:

60.00%

Publisher:

Abstract:

Physical activity is one of the main components of a healthy lifestyle and is responsible for many health benefits. Despite being considered important for both disease prevention and health promotion, there is a high prevalence of sedentary behavior in the elderly population. Questionnaires are practical and feasible instruments for assessing levels of physical activity; however, they may have limitations in older age ranges. Accelerometers, movement sensors that make physical activity data more objective, have emerged as reliable measuring devices. Aim: Determine the validity of the International Physical Activity Questionnaire (IPAQ) adapted for the elderly against accelerometry in elderly women. Methods: 57 elderly women with a mean age of 66.05 ± 5.98 years, who took part in hypertension control and physical activity incentive programs, were assessed with both objective and subjective measures of physical activity. The accelerometer was worn for 7 consecutive days, 24 hours per day, before the IPAQ was applied. Data were analyzed using measures of central tendency and dispersion to characterize the sample according to the variables collected. To check the validity of the data we used the Spearman correlation test, considering a significance level of p < 0.05. Results: With respect to the categories of physical activity obtained by IPAQ, 46.4% performed moderate physical activity, followed by high (30.3%) and low (23.2%) levels. There was a negative correlation only between self-reported time spent sitting and time spent on light activities as measured by accelerometry (r = -0.408; p = 0.003); mean activity level (counts/min) correlated positively with physical activity levels evaluated by IPAQ (r = 0.297; p = 0.036). Conclusion: The IPAQ used in elderly women shows moderate to low validity according to accelerometry measures. Assessment of sedentary activities exhibited acceptable levels compared to accelerometry; however, moderate (r = 0.096; p > 0.05) and vigorous (r = 0.098; p > 0.05) activities were not correlated, demonstrating the inability of the IPAQ to evaluate these types of activity in elderly women.
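A minimal sketch of the validity computation, assuming paired questionnaire and accelerometer measures (the data below are synthetic stand-ins):

```python
import numpy as np
from scipy.stats import spearmanr

# Synthetic stand-in for the paired measures: self-reported sitting time
# (IPAQ) vs accelerometer-derived light-activity time for n = 57 women.
rng = np.random.default_rng(42)
n = 57
light_activity_min = rng.normal(300, 60, n)             # accelerometer
sitting_min = 600 - 0.4 * light_activity_min + rng.normal(0, 80, n)  # IPAQ

rho, p = spearmanr(sitting_min, light_activity_min)
print(f"Spearman rho = {rho:.3f}, p = {p:.3f}")   # study: r = -0.408, p = 0.003
# Validity is judged from the sign and strength of rho at alpha = 0.05.
```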

Relevance:

60.00%

Publisher:

Abstract:

Constructive arrangement applied to an anthropometric chair. Utility model patent for an anthropometric chair consisting of a structure (1) that supports a plurality of measuring instruments. The rear part of the chair is provided with two measuring instruments composed of two rods, one for measuring the trunk-cephalic height (2) and the other for measuring from the seat to the renal region (4), each rod having a numerical scale, one internal (3) and one external (5). The seat (6) of the chair (1) comprises a panel with an internal channel and two lateral sliders, sliding to the right and to the left, which carry numerical scales. The seat is provided with metric references, one across the width of the seat (8) and another along the depth of the seat (9); scale (8) is divided into two scales whose zero point is exactly the middle of the seat. The seat (6) also has an attached frontal rod that slides in the antero-posterior direction (10) and contains an internal scale embedded in the part, such that the sum of the seat-depth measurement and the measurement obtained by the horizontal sliding rod (10) gives the sacro-popliteal measurement. To take the popliteal height measurement, there is another rod (11), integrated into the front surface of the chair, which slides vertically; this rod (11) has an internal numerical scale and an external one (12). The base of the chair (13) has a foot-operated lateral actuation device (16), connected to the seat (6) by means of a hydraulic jack, allowing the seat to be raised; the other lever (17), when turned, lowers the seat (6).

Relevance:

60.00%

Publisher:

Abstract:

This paper presents the new active absorption wave basin, named Hydrodynamic Calibrator (HC), constructed at the University of São Paulo (USP) in the laboratory facilities of the Numerical Offshore Tank (TPN). The square (14 m × 14 m) tank is able to generate and absorb waves from 0.5 Hz to 2.0 Hz by means of 148 active hinged-flap wave makers. An independent mechanical system drives each flap by means of a 1 HP servo-motor and a ball-screw based transmission system. A customized ultrasonic wave probe installed in each flap is responsible for measuring the wave elevation at the flap. A complex automation architecture was implemented, with three Programmable Logic Controllers (PLCs), and low-level software is responsible for all the interlocks and maintenance functions of the tank. Furthermore, all the control algorithms for generation and absorption are implemented in higher-level software (MATLAB/Simulink block diagrams). These algorithms calculate the motions of the wave makers both to generate and to absorb the required wave field, taking into account the layout of the flaps and the limits of wave generation. The experimental transfer function that relates the flap amplitude to the wave elevation amplitude is used for the calculation of the motion of each flap. This paper describes the main features of the tank, followed by a detailed presentation of the whole automation system, including the measuring devices, signal conditioning, PLC and network architecture, real-time and synchronizing software, and the motor control loop. Finally, a validation of the whole automation system is presented, by means of experimental analysis of the transfer function of the generated waves and the calculation of all the delays introduced by the automation system.
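A sketch of how an experimental flap-to-wave transfer function can be inverted to command each flap (the gain values and stroke limit below are placeholders, not the HC's measured curve):

```python
import numpy as np

# Sketch: flap command amplitude from the experimentally identified
# transfer function H(f) = wave elevation amplitude / flap amplitude.
freqs = np.array([0.5, 1.0, 1.5, 2.0])     # Hz, the tank's working band
H = np.array([0.6, 1.1, 1.4, 1.2])         # illustrative |H(f)| values

def flap_amplitude(target_wave_amp, f, max_stroke=0.3):
    """Invert the transfer function and clip to the generation limits."""
    gain = np.interp(f, freqs, H)
    return min(target_wave_amp / gain, max_stroke)

# Command for a 10 cm wave at 1.2 Hz:
print(f"{flap_amplitude(0.10, 1.2):.3f} m flap amplitude")
```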

Relevance:

60.00%

Publisher:

Abstract:

This article reports on recent electrical and optical techniques for investigating cellular signaling reactions in artificial and native membranes immobilized on solid supports. The first part describes the formation of planar artificial lipid bilayers on gold electrodes, which exhibit giga-ohm electrical resistance, and the insertion and characterization of ionotropic receptors therein. These membranes are suited to recording a few or even single ion channels by impedance spectroscopy. Such tethered membranes on planar arrays of microelectrodes offer mechanically robust, long-lasting measuring devices to probe the influence of different chemistries on biologically important ionotropic receptors, and will therefore have a future impact on probing the function of channel proteins in basic science and in biosensor applications. In the second part, we present complementary approaches to forming inside-out native membrane sheets that are immobilized on micrometer-sized beads or across submicrometer-sized holes machined in a planar support. Because the native membrane sheets are plasma membranes detached from live cells, these approaches offer a unique possibility to investigate cellular signaling processes, such as those mediated by ionotropic or G protein-coupled receptors, with the original composition of lipids and proteins.

Relevance:

60.00%

Publisher:

Abstract:

Instruments for on-farm determination of colostrum quality, such as refractometers and densimeters, are increasingly used on dairy farms. The colour of colostrum is also supposed to reflect its quality: a paler, mature-milk-like colour is associated with a lower colostrum value in terms of its general composition compared with a more yellowish and darker colour. The objective of this study was to investigate the relationships between colour measurement of colostrum using the CIELAB colour space (CIE L* = lightness from black to white, a* = red-green axis, b* = yellow-blue axis, chroma value G = perceived colourfulness) and its composition. Dairy cow colostrum samples (n = 117) obtained at 4.7 ± 1.5 h after parturition were analysed for immunoglobulin G (IgG) by ELISA and for fat, protein and lactose by infrared spectroscopy. For colour measurements, a calibrated spectrophotometer was used. At a cut-off value of 50 mg IgG/ml, colour measurement had a sensitivity of 50.0%, a specificity of 49.5%, and a negative predictive value of 87.9%. Colostral IgG concentration was not correlated with the chroma value G, but it was correlated with relative lightness L*. While milk fat content showed a relationship with the parameters L*, a*, b* and G from the colour measurement, milk protein content was not correlated with a*, but was correlated with L*, b* and G. Lactose concentration in colostrum showed a relationship only with b* and G. In conclusion, parameters of the colour measurement showed clear relationships with colostral IgG, fat, protein and lactose concentrations in dairy cows. Implementation of colour measuring devices in automatic milking systems and milking parlours might be a potential instrument to assess colostrum quality as well as to detect abnormal milk.
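For reference, how the quoted diagnostic figures arise from a 2×2 table at the 50 mg IgG/ml cut-off (the counts below are synthetic stand-ins for the study's data):

```python
import numpy as np

# Synthetic stand-in for the study's 117 samples.
rng = np.random.default_rng(7)
igg = rng.lognormal(mean=4.3, sigma=0.5, size=117)       # IgG, mg/ml
lightness = 80 - 0.05 * igg + rng.normal(0, 4, size=117) # colour proxy L*

poor = igg < 50                                  # condition: low IgG
positive = lightness > np.median(lightness)      # colour-based "test"

tp = np.sum(positive & poor);  fp = np.sum(positive & ~poor)
fn = np.sum(~positive & poor); tn = np.sum(~positive & ~poor)

print(f"sensitivity = {tp / (tp + fn):.1%}")     # study: 50.0%
print(f"specificity = {tn / (tn + fp):.1%}")     # study: 49.5%
print(f"NPV         = {tn / (tn + fn):.1%}")     # study: 87.9%
```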

Relevance:

60.00%

Publisher:

Abstract:

The methodology of critical rationalism allows a theory to be adjusted when problems arise. In metrology, new measurement requirements have emerged from globalized markets. This situation is causing the traditional conception to be displaced by the theory of uncertainty. This work examined that change, evaluating both theories and determining their approximations to the truth and the reasons for changing them. A logical analysis of the specialized literature was carried out to determine the theoretical preference. It was concluded that the theory of uncertainty is more reliable, because it better defines the measurand and the measurement conditions. Traceability certifications document the periodic calibrations of measuring instruments, guaranteeing truth as a regulating criterion so that the instruments maintain their validity and reliability. Furthermore, critical rationalism made it possible to evaluate this change in metrology as an evolution in the world of objective knowledge.

Relevance:

60.00%

Publisher:

Abstract:

The paper describes a procedure for accurately and speedily calibrating tanks used for the chemical processing of nuclear materials. The procedure features the use of (1) precalibrated vessels certified to deliver known volumes of liquid, (2) calibrated linear measuring devices, and (3) a digital computer for manipulating data and producing printed calibration information. Calibration records of the standards are traceable to primary standards. Logic is incorporated in the computer program to accomplish curve fitting and perform the tests to accept or to reject the calibration, based on statistical, empirical, and report requirements. This logic is believed to be unique.
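A minimal sketch of the curve-fitting and accept/reject step described above, with synthetic height-volume data (the polynomial order and tolerance are illustrative assumptions, not the paper's criteria):

```python
import numpy as np

# Fit tank volume as a function of liquid height from known delivered
# increments, then apply a simple acceptance test to the fit.
# Synthetic data; a real run would use the certified delivery volumes.
height_mm = np.array([50, 120, 195, 260, 330, 405, 470, 540], float)
volume_l  = np.array([40.2, 96.5, 157.0, 209.4, 265.8, 326.3,
                      378.6, 435.1])

coeffs = np.polyfit(height_mm, volume_l, deg=2)   # low-order polynomial
residuals = volume_l - np.polyval(coeffs, height_mm)
rms = np.sqrt(np.mean(residuals**2))

# Accept/reject rule in the spirit of the paper's statistical tests:
tolerance_l = 0.5
print("ACCEPT" if rms < tolerance_l else "REJECT", f"(rms = {rms:.3f} L)")
```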

Relevance:

60.00%

Publisher:

Abstract:

The development of TDR for measurement of soil water content and electrical conductivity has resulted in a large shift in measurement methods for a breadth of soil and hydrological characterization efforts. TDR has also opened new possibilities for soil and plant research. Five examples show how TDR has enhanced our ability to conduct soil- and plant-water research. (i) Oxygen is necessary for healthy root growth and plant development, but quantitative evaluation of the factors controlling oxygen supply in soil depends on knowledge of the soil water content from TDR. With water content information we have successfully modeled some impacts of tillage methods on oxygen supply to roots and their growth response. (ii) For field assessment of the soil mechanical properties influencing crop growth, water content capability was added to two portable soil strength measuring devices: (a) a TDT (Time Domain Transmittivity)-equipped soil cone penetrometer was used to evaluate seasonal soil strength-water content relationships; in conventional tillage systems the relationships are dynamic and approach the more stable no-tillage relationships only relatively late in each growing season; (b) a small TDR transmission line was added to a modified sheargraph, allowing shear strength and water content to be measured simultaneously on the same sample. In addition, the conventional graphing procedure for data acquisition was converted to datalogging using strain gauges; the data acquisition rate was improved by more than a factor of three, with improved data quality. (iii) How do drought-tolerant plants maintain leaf water content? Non-destructive measurement of TDR water content using a flat serpentine triple-wire transmission line replaces more lengthy procedures of measuring relative water content. Two challenges remain: drought-stressed leaves alter salt content, changing electrical conductivity, and drought-induced changes in leaf morphology affect TDR measurements. (iv) Remote radar signals are reflected from within the first 2 cm of soil. Appropriate calibration of radar imaging for soil water content can be achieved by a parallel pair of blades separated by 8 cm, reaching 1.7 cm into the soil and forming a 20 cm TDR transmission line. The correlation between apparent relative permittivity from TDR and the synthetic aperture radar (SAR) backscatter coefficient was 0.57 for an airborne flyover. These five examples highlight the diversity of TDR applications in soil and plant research.
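The TDR work above depends on converting apparent permittivity to water content. The abstract does not give its calibration; a widely used published choice, assumed here for illustration, is the Topp et al. (1980) polynomial:

```python
import numpy as np

def topp_water_content(eps_r):
    """Volumetric soil water content (m3/m3) from TDR apparent relative
    permittivity, via the widely used Topp et al. (1980) polynomial.
    The studies above may well use their own site-specific calibrations."""
    return (-5.3e-2 + 2.92e-2 * eps_r
            - 5.5e-4 * eps_r**2 + 4.3e-6 * eps_r**3)

eps = np.array([5.0, 12.0, 20.0, 30.0])      # typical dry -> wet range
print(np.round(topp_water_content(eps), 3))
```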

Relevance:

60.00%

Publisher:

Abstract:

This thesis presents the results from an investigation into the merits of analysing Magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG is the study both of methods for the measurement of minute magnetic flux variations at the scalp, resulting from neuro-electric activity in the neocortex, and of the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action - directly measuring neuronal activity via the resulting magnetic field fluctuations - MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, its progress hindered by a variety of factors.
Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands - the so-called alpha, delta, beta, etc., that are commonly referred to in both academic and lay publications. Other efforts have centred upon generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which result in the observed time series are linear, despite a variety of reasons which suggest that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions.
A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract brain cognitive state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way: as a linear sum of a large number of frequency generators. One of the main objectives of this thesis is to demonstrate that much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in MEG recordings.
Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratios that are obtained. As the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are necessarily extremely sensitive. The unfortunate side effect is that even commonplace phenomena such as the Earth's geomagnetic field can easily swamp signals of interest. This problem is commonly addressed by averaging over a large number of recordings, but this has notable drawbacks: in particular, it is difficult to synchronise high-frequency activity which might be of interest, and often these signals are cancelled out by the averaging process. Other problems that have been encountered are the high cost and low portability of state-of-the-art multichannel machines, so that the use of MEG has hitherto been restricted to large institutions able to afford their procurement and maintenance.
In this project, we seek to address these issues by working almost exclusively with single-channel, unaveraged MEG data. We demonstrate the applicability to MEG analysis of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks. It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in research areas ranging from financial time series modelling to the analysis of sunspot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
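A minimal sketch of the dynamical-systems starting point implied above, reconstructing a state space from a single unaveraged channel via delay-coordinate (Takens) embedding; the signal and parameters are synthetic stand-ins:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Takens delay-coordinate embedding of a scalar time series:
    each row is [x(t), x(t + tau), ..., x(t + (dim-1)*tau)]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Synthetic stand-in for a single-channel, unaveraged recording:
# a nonlinear oscillation buried in dynamic noise.
rng = np.random.default_rng(3)
t = np.arange(5000) / 1000.0
signal = np.sin(2 * np.pi * 10 * t) ** 3 + 0.3 * rng.standard_normal(len(t))

Y = delay_embed(signal, dim=5, tau=25)
print(Y.shape)   # (4900, 5): points in the reconstructed state space
```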

Relevance:

60.00%

Publisher:

Abstract:

Particulate solids are complex redundant systems which consist of discrete particles. The interactions between the particles are complex and have been the subject of many theoretical and experimental investigations. Investigations of particulate materials have been restricted by the lack of quantitative information on the mechanisms occurring within an assembly. Laboratory experimentation is limited, as information on the internal behaviour can only be inferred from measurements on the assembly boundary or obtained through intrusive measuring devices. In addition, comparisons between test data are uncertain due to the difficulty of reproducing exact replicas of physical systems. Nevertheless, theoretical and technological advances require more detailed material information. Numerical simulation, however, affords access to information on every particle, and hence to the micro-mechanical behaviour within an assembly, and can replicate desired systems. To use a computer program to simulate material behaviour accurately, it is necessary to incorporate realistic interaction laws. This research programme used the finite difference simulation program 'BALL', developed by Cundall (1971), which employed linear spring force-displacement laws; it was thus necessary to incorporate more realistic interaction laws. The programme was therefore primarily concerned with the implementation of the normal force-displacement law of Hertz (1882) and the tangential force-displacement laws of Mindlin and Deresiewicz (1953). Within this thesis the contact mechanics theories employed in the program are developed, and the adaptations necessary to incorporate these laws are detailed. Verification of the new contact force-displacement laws was achieved by simulating a quasi-static oblique contact and a single-particle oblique impact. Applications of the program to the simulation of large assemblies of particles are given, and the problems in undertaking quasi-static shear tests, along with the results from two successful shear tests, are described.
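The Hertz normal law named above has a closed form; a minimal sketch with illustrative material parameters follows (the full Mindlin-Deresiewicz (1953) tangential law is load-history dependent, so only the initial no-slip stiffness is shown):

```python
import numpy as np

def hertz_normal_force(delta, R1, R2, E1, E2, nu1, nu2):
    """Hertz (1882) nonlinear normal contact law between two spheres:
    F_n = (4/3) * E_eff * sqrt(R_eff) * delta^(3/2), with delta the overlap."""
    E_eff = 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)
    R_eff = 1.0 / (1.0 / R1 + 1.0 / R2)
    return (4.0 / 3.0) * E_eff * np.sqrt(R_eff) * delta**1.5

def mindlin_tangential_stiffness(delta, R1, R2, G1, G2, nu1, nu2):
    """Initial (no-slip) tangential stiffness from Mindlin's theory,
    k_t = 8 * G_eff * a, with contact radius a = sqrt(R_eff * delta)."""
    G_eff = 1.0 / ((2 - nu1) / G1 + (2 - nu2) / G2)
    a = np.sqrt(delta / (1.0 / R1 + 1.0 / R2))
    return 8.0 * G_eff * a

# Two 1 mm glass beads with a 1 micron overlap (illustrative values):
F = hertz_normal_force(1e-6, 5e-4, 5e-4, 70e9, 70e9, 0.22, 0.22)
print(f"normal force = {F:.3f} N")
```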

Relevance:

60.00%

Publisher:

Abstract:

Highlights of Data Expedition:
• Students explored daily observations of local climate data spanning the past 35 years.
• Topological Data Analysis (TDA) provides cutting-edge tools for studying the geometry of data in arbitrarily high dimensions.
• Using TDA tools, students discovered intrinsic dynamical features of the data and learned how to quantify periodic phenomena in a time series.
• Since nature invariably produces noisy data which rarely has exact periodicity, students also considered the theoretical basis of almost-periodicity and even invented and tested new mathematical definitions of almost-periodic functions.

Summary:
The dataset we used for this data expedition comes from the Global Historical Climatology Network. "GHCN (Global Historical Climatology Network)-Daily is an integrated database of daily climate summaries from land surface stations across the globe." Source: https://www.ncdc.noaa.gov/oa/climate/ghcn-daily/ We focused on the daily maximum and minimum temperatures from January 1, 1980 to April 1, 2015 collected at RDU International Airport. Through a guided series of exercises designed to be performed in Matlab, students explore these time series, initially by direct visualization and basic statistical techniques. Then students are guided through a special sliding-window construction which transforms a time series into a high-dimensional geometric curve. These high-dimensional curves can be visualized by projecting down to lower dimensions, as in Figure 1; however, our focus here was to use persistent homology to directly study the high-dimensional embedding. The shape of these curves carries meaningful information, but how one describes the "shape" of data depends on the scale at which the data is considered, and choosing the appropriate scale is rarely obvious. Persistent homology overcomes this obstacle by allowing us to quantitatively study geometric features of the data across multiple scales. Through this data expedition, students are introduced to numerically computing persistent homology using the Rips collapse algorithm and to interpreting the results. In the specific context of sliding-window constructions, 1-dimensional persistent homology can reveal the nature of periodic structure in the original data. I created a special technique to study how these high-dimensional sliding-window curves form loops in order to quantify the periodicity. Students are guided through this construction and learn how to visualize and interpret this information. Climate data is extremely complex (as anyone who has suffered from a bad weather prediction can attest), and numerous variables play a role in determining our daily weather and temperatures. This complexity, coupled with imperfections of measuring devices, results in very noisy data, which causes the annual seasonal periodicity to be far from exact. To this end, I have students explore existing theoretical notions of almost-periodicity and test them on the data. They find that some existing definitions are inadequate in this context, so I challenged them to invent new mathematics by proposing and testing their own definitions. These students rose to the challenge and suggested a number of creative definitions. While autocorrelation and spectral methods based on Fourier analysis are often used to explore periodicity, the construction here provides an alternative paradigm for quantifying periodic structure in almost-periodic signals using tools from topological data analysis.
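A minimal sketch of the sliding-window plus persistent homology pipeline in Python (the expedition's exercises were in Matlab; the ripser package is an assumed tool choice here, and the data below are synthetic):

```python
import numpy as np
from ripser import ripser     # pip install ripser (assumed tool choice)

def sliding_window(x, dim, tau):
    """Sliding-window embedding: each point of the output curve is a
    window [x(t), x(t+tau), ..., x(t+(dim-1)*tau)] of the time series."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Synthetic "daily temperatures": annual cycle plus noise (GHCN stand-in).
rng = np.random.default_rng(0)
days = np.arange(3 * 365)
temps = 15 + 10 * np.sin(2 * np.pi * days / 365) + rng.standard_normal(len(days))

X = sliding_window(temps, dim=20, tau=18)    # window spans roughly a year
dgms = ripser(X, maxdim=1)["dgms"]

# A strongly periodic signal traces one dominant loop, i.e. a single
# long-lived point in the 1-dimensional persistence diagram.
h1 = dgms[1]
print(f"max H1 persistence: {(h1[:, 1] - h1[:, 0]).max():.2f}")
```

The single long-lived 1-dimensional feature is precisely the loop structure the expedition used to quantify (almost-)periodicity.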