68 results for strain gauge measurements
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Vertical jump height is an indicator of lower-body strength and power. The technological tools available to measure the vertical jump are black boxes, not open to third-party verification or adaptation. We propose the creation of a measurement system called Chronojump-Boscosystem, consisting of open hardware and free software. Methods: A microcontroller was created and validated using a square wave generator and an oscilloscope. Two types of contact platforms were developed using different materials. These platforms were validated by measuring, with a strain gauge, the minimum pressure required for activation at different points, and by comparing the on/off times of our platforms with those of the Ergojump-Boscosystem platform in a sample of 8 subjects performing submaximal jumps with one foot on each platform. Agile methodologies were used to develop and validate the software. Results: All the tools fall under free software / open hardware guidelines and are, in that sense, free. The microcontroller's margin of error is 0.1%. The validity of the fiberglass platform is 0.95 (ICC). The management software contains nearly 113,000 lines of code and is available in 7 languages.
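Contact platforms like the ones described estimate jump height from the flight time they register. The abstract does not spell out the formula, so the sketch below simply assumes the standard ballistic relation h = g * t_f^2 / 8 commonly used with contact mats; it is illustrative, not code from Chronojump itself.

```python
# Estimating vertical jump height from the flight time registered by a
# contact platform. A generic sketch of the standard ballistic formula
# h = g * t_f^2 / 8; not code taken from Chronojump itself.

G = 9.81  # gravitational acceleration, m/s^2

def jump_height(flight_time_s: float) -> float:
    """Jump height in metres from flight time in seconds.

    Assumes takeoff and landing occur in the same body configuration,
    so the centre of mass rises for exactly half of the flight time.
    """
    return G * flight_time_s ** 2 / 8.0

# Example: a 0.50 s flight time corresponds to roughly 0.31 m.
print(f"{jump_height(0.50):.3f} m")
```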
Abstract:
We analyze the constraints on the mass and mixing of a superstring-inspired E6 Z' neutral gauge boson that follow from the recent precise Z mass measurements and show that they depend very sensitively on the assumed value of the W mass and also, to a lesser extent, on the top-quark mass.
Abstract:
TCP flows from applications such as the web or FTP are well supported by a Guaranteed Minimum Throughput Service (GMTS), which provides a minimum network throughput to the flow and, if possible, extra throughput. We propose a scheme for a GMTS using Admission Control (AC) that is able to provide different minimum throughputs to different users and that is suitable for "standard" TCP flows. Moreover, we consider a multidomain scenario where the scheme is used in one of the domains, and we propose some mechanisms for the interconnection with neighboring domains. The whole scheme uses a small set of packet classes in a core-stateless network, where each class has a different discarding priority in the queues assigned to it. The AC method involves only edge nodes and uses a special probing packet flow (marked as the highest discarding priority class) that is sent continuously from ingress to egress through a path. The available throughput in the path is obtained at the egress using measurements of flow aggregates, and is then sent back to the ingress. At the ingress, each flow is detected implicitly and then admission-controlled. If accepted, it receives the GMTS and its packets are marked as the lowest discarding priority classes; otherwise, it receives a best-effort service. The scheme is evaluated through simulation in a simple "bottleneck" topology using different traffic loads consisting of "standard" TCP flows that carry files of varying sizes.
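The admission decision the abstract outlines can be summarized in a few lines. The sketch below is a hypothetical simplification: the function names, the feedback variable, and the two-way classification are assumptions for illustration, not the paper's implementation.

```python
# Illustrative sketch of the edge-node admission decision: a flow is
# admitted only if the throughput measured on the path (fed back from the
# egress) covers its guaranteed minimum. Names are hypothetical.

def admit(flow_min_throughput: float, available_throughput: float) -> bool:
    """Admission control at the ingress node.

    `available_throughput` is assumed to come from egress measurements of
    the probing-flow aggregate, sent back to the ingress.
    """
    return available_throughput >= flow_min_throughput

def classify_packet(admitted: bool) -> str:
    # Admitted flows get the lowest discarding priority; best-effort
    # traffic and the probing flow are dropped first under congestion.
    return "low-drop-priority" if admitted else "best-effort"
```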
Abstract:
This paper presents the implementation details of a coded structured light system for rapid shape acquisition of unknown surfaces. Such techniques are based on projecting patterns onto a measuring surface and grabbing images of every projection with a camera. By analyzing the pattern deformations that appear in the images, 3D information about the surface can be calculated. The implemented technique projects a unique pattern, so it can be used to measure moving surfaces. The pattern is structured as a grid in which the colors of the slits are selected using a De Bruijn sequence. Moreover, since both axes of the pattern are coded, the cross points of the grid have two codewords (which permits reconstructing them very precisely), while pixels belonging to horizontal and vertical slits also have a codeword. Different sets of colors are used for horizontal and vertical slits, so the resulting pattern is invariant to rotation. Therefore, the alignment constraint between camera and projector assumed by many authors is not necessary.
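The slit colors come from a De Bruijn sequence, which guarantees that every window of n consecutive symbols is unique and therefore decodable from a single image. Below is the standard recursive (Lyndon-word) generator for a k-ary De Bruijn sequence of order n; the color lists are illustrative assumptions, not the paper's palette.

```python
# Standard generation of a k-ary De Bruijn sequence of order n. With k
# colours and window n, any n consecutive slits form a unique colour
# codeword, which is what lets a single projected pattern be decoded.

def de_bruijn(k: int, n: int) -> list:
    """Return one De Bruijn sequence B(k, n) as a list of symbols 0..k-1."""
    a = [0] * (k * n)
    seq = []

    def db(t: int, p: int) -> None:
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return seq

# e.g. 4 colours for vertical slits (a different set would be used for the
# horizontal ones, so the decoded axis also reveals pattern orientation).
vertical_colors = ["red", "green", "blue", "yellow"]  # hypothetical palette
slit_colors = [vertical_colors[s] for s in de_bruijn(len(vertical_colors), 3)]
```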
Abstract:
Viruses rapidly evolve, and HIV in particular is known to be one of the fastest evolving human viruses. It is now commonly accepted that viral evolution is the cause of the intriguing dynamics exhibited during HIV infections and the ultimate success of the virus in its struggle with the immune system. To study viral evolution, we use a simple mathematical model of the within-host dynamics of HIV which incorporates random mutations. In this model, we assume a continuous distribution of viral strains in a one-dimensional phenotype space where random mutations are modelled by diffusion. Numerical simulations show that random mutations combined with competition result in evolution towards higher Darwinian fitness: a stable traveling wave of evolution, moving towards higher levels of fitness, is formed in the phenotype space.
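The abstract does not give the model equations, but a minimal reaction-diffusion sketch conveys the mechanism: mutation acts as diffusion in phenotype space, replication grows with fitness, and global competition caps the population. The specific reaction terms below are assumptions for illustration only, not the paper's model.

```python
# Minimal finite-difference sketch of the kind of model described: a viral
# density u(x, t) over a 1-D phenotype space x, with mutation as diffusion
# and fitness-proportional replication. Terms are illustrative assumptions.
import numpy as np

nx, dx, dt = 200, 0.1, 0.001
D = 0.01                                 # mutation (diffusion) coefficient
x = np.arange(nx) * dx                   # phenotype; fitness grows with x
u = np.exp(-((x - 2.0) ** 2) / 0.1)      # initial strain distribution

for _ in range(20000):
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx ** 2  # periodic BC
    growth = x * u                       # fitness-proportional replication
    competition = u * u.sum() * dx       # global competition (assumed)
    u = u + dt * (D * lap + growth - competition)

# The bulk of u drifts toward higher x: a traveling wave of evolution
# toward higher Darwinian fitness.
```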
Abstract:
We study the damage-enhanced creep rupture of disordered materials by means of a fiber bundle model. Broken fibers undergo a slow stress relaxation modeled by a Maxwell element whose stress exponent m can vary over a broad range. Under global load sharing we show that, due to the strength disorder of fibers, the lifetime t_f of the bundle has sample-to-sample fluctuations characterized by a log-normal distribution independent of the type of disorder. We determine the Monkman-Grant relation of the model and establish a relation between the rupture life t_f and the characteristic time t_m of the intermediate creep regime of the bundle, where the minimum strain rate is reached, making reliable estimates of t_f from short-term measurements possible. Approaching macroscopic failure, the deformation rate has a finite-time power-law singularity whose exponent is a decreasing function of m. On the microlevel, the distribution of waiting times is found to have power-law behavior, with m-dependent exponents that differ below and above the critical load of the bundle. Approaching the critical load from above, the cutoff value of the distributions has a power-law divergence whose exponent coincides with the stress exponent of the Maxwell elements.
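As a rough illustration of global load sharing with strength disorder, the sketch below uses a generic damage-accumulation rule: intact fibers share the external load equally and break when accumulated damage exceeds a random threshold. It deliberately omits the Maxwell relaxation of broken fibers that is central to the paper, so it only illustrates the sample-to-sample lifetime fluctuations.

```python
# Generic global-load-sharing creep sketch (assumed damage law, not the
# paper's model): N fibers share the load equally; each accumulates damage
# at a rate growing with local stress and breaks past a random threshold.
import numpy as np

rng = np.random.default_rng(0)
N, sigma0, gamma, dt = 10_000, 0.5, 2.0, 1e-3
thresholds = rng.weibull(2.0, size=N)   # strength disorder (assumed form)
damage = np.zeros(N)
alive = np.ones(N, dtype=bool)

t = 0.0
while alive.any():
    local_stress = sigma0 * N / alive.sum()  # intact fibers carry the load
    damage[alive] += dt * local_stress ** gamma
    alive &= damage < thresholds
    t += dt

print(f"sample lifetime t_f = {t:.3f}")  # fluctuates from sample to sample
```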
Abstract:
This paper presents a new registration algorithm, called Temporal Diffeomorphic Free Form Deformation (TDFFD), and its application to motion and strain quantification from a sequence of 3D ultrasound (US) images. The originality of our approach resides in enforcing time consistency by representing the 4D velocity field as the sum of continuous spatiotemporal B-spline kernels. The spatiotemporal displacement field is then recovered through forward Eulerian integration of the non-stationary velocity field. The strain tensor is computed locally using the spatial derivatives of the reconstructed displacement field. The energy functional considered in this paper weighs two terms: the image similarity and a regularization term. The image similarity metric is the sum of squared differences between the intensities of each frame and a reference one. Any frame in the sequence can be chosen as reference. The regularization term is based on the incompressibility of myocardial tissue. TDFFD was compared to pairwise 3D FFD and 3D+t FFD, both on displacement and velocity fields, on a set of synthetic 3D US images with different noise levels. TDFFD showed increased robustness to noise compared to these two state-of-the-art algorithms. TDFFD also proved to be more resistant to a reduced temporal resolution when decimating this synthetic sequence. Finally, this synthetic dataset was used to determine optimal settings of the TDFFD algorithm. Subsequently, TDFFD was applied to a database of cardiac 3D US images of the left ventricle acquired from 9 healthy volunteers and 13 patients treated by Cardiac Resynchronization Therapy (CRT). On healthy cases, uniform strain patterns were observed over all myocardial segments, as physiologically expected. On all CRT patients, the improvement in synchrony of regional longitudinal strain correlated with CRT clinical outcome as quantified by the reduction of end-systolic left ventricular volume at follow-up (6 and 12 months), showing the potential of the proposed algorithm for the assessment of CRT.
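The recovery of displacements by forward Eulerian integration of a non-stationary velocity field reduces to a simple update rule. In the sketch below the velocity function is a hypothetical stand-in for the B-spline-parameterized field TDFFD optimizes; only the integration step reflects the abstract.

```python
# Forward Eulerian integration of a non-stationary velocity field v(x, t)
# to get the trajectory (and hence displacement) of a material point.
import numpy as np

def velocity(x: np.ndarray, t: float) -> np.ndarray:
    # Placeholder for the B-spline-parameterised velocity field (assumed).
    return np.array([0.1 * np.sin(t), 0.05 * x[1], 0.0])

def integrate_trajectory(x0, t0, t1, n_steps=100):
    """Forward Euler: x_{k+1} = x_k + v(x_k, t_k) * dt."""
    x, t = np.asarray(x0, dtype=float), t0
    dt = (t1 - t0) / n_steps
    for _ in range(n_steps):
        x = x + velocity(x, t) * dt
        t += dt
    return x  # displacement is x - x0

x_end = integrate_trajectory([1.0, 1.0, 0.0], 0.0, 1.0)
```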
Abstract:
Introduction and objectives. It has been reported that, in hypertrophic cardiomyopathy (HCM), regional fiber disarray produces segments in which deformation is absent or severely reduced, and that these segments are non-uniformly distributed across the left ventricle (LV). This contrasts with other types of hypertrophy, such as the athlete's heart or hypertensive left ventricular hypertrophy (HT-LVH), in which cardiac deformation may be abnormal but never so reduced that an absence of deformation is observed. We therefore propose using the distribution of strain values to study deformation in HCM. Methods. Using tagged magnetic resonance imaging, we reconstructed the systolic deformation of the LV in 12 control subjects, 10 athletes, 12 patients with HCM, and 10 patients with HT-LVH. Deformation was quantified with a non-rigid registration algorithm, determining peak systolic radial and circumferential strain values in 16 LV segments. Results. Patients with HCM showed mean strain values significantly lower than those of the other groups. However, whereas the deformation observed in healthy individuals and in patients with HT-LVH was concentrated around the mean value, in HCM segments with normal contraction coexisted with segments showing absent or significantly reduced deformation, producing greater heterogeneity of strain values. Some segments without deformation were observed even in the absence of fibrosis or hypertrophy. Conclusions. The strain distribution characterizes the specific patterns of myocardial deformation in patients with different etiologies of LVH. Patients with HCM showed a significantly lower mean strain value and greater strain heterogeneity (compared with controls, athletes, and patients with HT-LVH), and had regions without deformation.
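The distribution-based analysis comes down to summarizing, per patient, the location and spread of strain over the 16 LV segments. A minimal sketch with made-up numbers and an assumed cutoff for "no deformation":

```python
# Per-patient summary of the strain distribution over 16 LV segments:
# mean, heterogeneity (sample standard deviation), and a count of
# non-deforming segments. Values and the 2% cutoff are illustrative.
import numpy as np

def strain_summary(segment_strain: np.ndarray) -> dict:
    """segment_strain: peak systolic strain for the 16 LV segments (%)."""
    return {
        "mean": float(np.mean(segment_strain)),
        "heterogeneity": float(np.std(segment_strain, ddof=1)),
        "non_deforming": int(np.sum(np.abs(segment_strain) < 2.0)),
    }

hcm_like = np.array([0.5, 1.0, 18, 20, 2, 1, 19, 17,
                     0.8, 21, 1.5, 16, 18, 1.2, 20, 19])
print(strain_summary(hcm_like))  # lower mean, large heterogeneity
```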
Abstract:
With the two aims of monitoring social change and improving social measurement, the European Social Survey is now closing its third round. This paper shows how the accumulated experience of the first two rounds has been used to validate the questionnaire, better adapt the sampling design to the country's characteristics, and conduct fieldwork efficiently in Spain. For example, the dynamic character of today's population makes it necessary to estimate design effects at each round from the data of the previous round. The paper also demonstrates how, starting from a response rate of 52% in the first round, a 66% response rate was achieved in the third round thanks to extensive quality control conducted by the polling agency and the ESS national team, based on a detailed analysis of non-response cases and of the incidents recorded in the contact forms.
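Design effects of the kind mentioned here are conventionally computed with Kish's approximation deff = 1 + (b_bar - 1) * rho, where b_bar is the mean cluster size and rho the intraclass correlation, both estimable from the previous round's data. A minimal sketch (the example figures are invented, not ESS values):

```python
# Kish design effect for a clustered sample: deff = 1 + (b_bar - 1) * rho.
def design_effect(mean_cluster_size: float, rho: float) -> float:
    return 1.0 + (mean_cluster_size - 1.0) * rho

# e.g. clusters of about 5 interviews and rho = 0.05 from prior-round data:
print(design_effect(5.0, 0.05))  # 1.2, so effective n is n / 1.2
```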
Abstract:
We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to spectral mapping, a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
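Weighted log-ratio analysis can be summarized as: take logarithms of the table, double-center using the row and column masses as weights, and decompose by SVD. The sketch below follows that usual formulation; it is an illustration, and details such as coordinate scalings should be checked against the paper.

```python
# Compact sketch of weighted log-ratio analysis: log-transform, weighted
# double-centering with row/column masses, then SVD for the coordinates.
import numpy as np

def weighted_lra(N: np.ndarray):
    """N: two-way table of strictly positive values."""
    P = N / N.sum()
    r = P.sum(axis=1)                       # row masses (weights)
    c = P.sum(axis=0)                       # column masses (weights)
    L = np.log(P)
    # weighted double-centering: remove weighted row and column means
    Y = L - (L @ c)[:, None] - (r @ L)[None, :] + r @ L @ c
    S = np.sqrt(r)[:, None] * Y * np.sqrt(c)[None, :]
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    rows = U / np.sqrt(r)[:, None] * sv     # principal row coordinates
    cols = Vt.T / np.sqrt(c)[:, None] * sv  # principal column coordinates
    return rows, cols, sv

rows, cols, sv = weighted_lra(np.array([[10., 20., 5.], [3., 8., 12.]]))
```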
Abstract:
This paper tests the internal consistency of time trade-off utilities. We find significant violations of consistency in the direction predicted by loss aversion. The violations disappear for higher gauge durations. We show that loss aversion can also explain that for short gauge durations time trade-off utilities exceed standard gamble utilities. Our results suggest that time trade-off measurements that use relatively short gauge durations, like the widely used EuroQol algorithm (Dolan 1997), are affected by loss aversion and lead to utilities that are too high.
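A time trade-off utility is elicited by finding the number of healthy years x a respondent considers equivalent to the gauge duration t spent in the health state, giving U = x/t. The toy numbers below merely illustrate the paper's point that short gauge durations can yield inflated utilities:

```python
# Standard time trade-off computation: indifference between t years (the
# gauge duration) in the health state and x years in full health gives
# U = x / t. The figures below are illustrative only.
def tto_utility(equivalent_healthy_years: float, gauge_duration: float) -> float:
    return equivalent_healthy_years / gauge_duration

# Same health state elicited with two gauge durations; per the paper's
# argument, loss aversion inflates the short-gauge value.
print(tto_utility(8.0, 10.0))   # 0.80 with a 10-year gauge
print(tto_utility(28.0, 38.0))  # ~0.74 with a longer gauge
```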
Abstract:
Redshifts for 100 galaxies in 10 clusters of galaxies are presented, based on data obtained between March 1984 and March 1985 from Calar Alto, La Palma, and ESO, and on data from Mauna Kea. Data for individual galaxies are given, and the accuracy of the velocities from the four instruments is discussed. Comparison with published data shows the present velocities to be shifted by +4.0 km/s on average, with a standard deviation in the difference of 89.7 km/s, consistent with the rms of the redshift measurements, which ranges from 50 to 100 km/s.
Abstract:
In this work a new admittance spectroscopy technique is proposed to determine the conduction band offset in single quantum well (SQW) structures. The proposed technique is based on the study of the capacitance derivative versus the frequency logarithm. This method is found to be less sensitive to parasitic effects, such as leakage current and series resistance, than the classical conductance analysis. Using this technique, we have determined the conduction band offset in In0.52Al0.48As/InxGa1−xAs/In0.52Al0.48As SQW structures. Two different well compositions, x=0.53, which corresponds to the lattice-matched case, and x=0.60, which corresponds to a strained case, and two well widths (5 and 25 nm) have been considered. The average results are ΔEc=0.49±0.04 eV for x=0.53 and ΔEc=0.51±0.04 eV for x=0.6, which are in good agreement with previously reported data.
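The quantity the technique analyzes, the derivative of capacitance with respect to the logarithm of frequency, is straightforward to evaluate on a measured sweep. The sketch below uses placeholder data; only the numerical dC/d(ln f) step reflects the abstract.

```python
# Numerical dC/d(ln f) over an admittance-sweep dataset. The frequency
# grid and the toy C(f) curve stand in for a real measurement.
import numpy as np

def dC_dlnf(freq_hz: np.ndarray, capacitance_f: np.ndarray) -> np.ndarray:
    """Numerical derivative of capacitance w.r.t. ln(frequency)."""
    return np.gradient(capacitance_f, np.log(freq_hz))

freqs = np.logspace(3, 7, 200)                      # 1 kHz .. 10 MHz sweep
C = 1e-10 + 2e-11 / (1 + (freqs / 1e5) ** 2)        # toy C(f) with a step
peak = freqs[np.argmax(np.abs(dC_dlnf(freqs, C)))]  # characteristic frequency
```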