884 results for test data generation


Relevance: 30.00%

Abstract:

Platelet-derived microparticles that are produced during platelet activation bind to traumatized endothelium. Such endothelial injury occurs during percutaneous transluminal coronary angioplasty. Approximately 20% of these patients subsequently develop restenosis, although this rate is improved by treatment with the anti-platelet glycoprotein IIb/IIIa receptor drug abciximab. As platelet activation occurs during angioplasty, platelet-derived microparticles are likely to be produced and may hence contribute to restenosis. The study population consisted of 113 angioplasty patients, of whom 38 received abciximab. Paired peripheral arterial blood samples were obtained following heparinization and subsequent to all vessel manipulation. Platelet-derived microparticles were identified using an anti-CD61 (glycoprotein IIIa) fluorescence-conjugated antibody and flow cytometry. Baseline clinical characteristics between patient groups were similar. The level of platelet-derived microparticles increased significantly following angioplasty in the group without abciximab (paired t-test, P = 0.019). However, there was no significant change in the level of platelet-derived microparticles following angioplasty in patients who received abciximab, despite their requiring more complex angioplasty procedures. In this study, we have demonstrated that the level of platelet-derived microparticles increased during percutaneous transluminal coronary angioplasty, with no such increase under abciximab treatment. The increased platelet-derived microparticles may adhere to traumatized endothelium, contributing to re-occlusion of the arteries, but this remains to be determined.
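As a hedged illustration of the statistical comparison described in this abstract, the following Python sketch applies a paired t-test to hypothetical pre- and post-angioplasty microparticle levels; the values and sample size are invented placeholders, not the study's data.

```python
# Minimal sketch of a paired t-test comparing platelet-derived
# microparticle (PMP) levels before and after angioplasty.
# All numbers are hypothetical placeholders, not study data.
import numpy as np
from scipy import stats

pmp_before = np.array([120.0, 95.0, 143.0, 110.0, 102.0, 130.0])  # pre-procedure
pmp_after = np.array([138.0, 118.0, 151.0, 125.0, 111.0, 149.0])  # post-procedure

# Paired t-test: does the mean within-patient change differ from zero?
t_stat, p_value = stats.ttest_rel(pmp_after, pmp_before)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```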

Relevance: 30.00%

Abstract:

The ambiguity acceptance test is an important quality control procedure in high-precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, their threshold determination is still not well understood. Currently, the threshold is determined with either the empirical approach or the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. Hence, the key to the threshold determination problem is how to determine the threshold efficiently and in a reasonable way. In this study, a new threshold determination method, named the threshold function method, is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach through a modeling procedure and an approximation procedure. The modeling procedure uses a rational function model to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modeling error and approximation error are analysed with simulation data to avoid nuisance biases and the impact of an unrealistic stochastic model. The results indicate that the proposed method can greatly simplify the FF-approach without introducing significant modeling error. The threshold function method makes fixed failure rate threshold determination feasible for real-time applications.
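The abstract's approximation step can be sketched in code. The integer bootstrapping success rate has a known closed form (a product over the conditional standard deviations of the decorrelated ambiguities), while the rational function mapping the success rate to a threshold is shown here only with hypothetical placeholder coefficients, since the paper's fitted values are not given in the abstract.

```python
# Sketch of the threshold function method's two ingredients:
# (1) the closed-form IB success rate standing in for the ILS success
# rate, and (2) a rational function mapping success rate to threshold.
import numpy as np
from scipy.stats import norm

def ib_success_rate(cond_std):
    """Integer bootstrapping (IB) success rate from the conditional
    standard deviations of the decorrelated ambiguities."""
    cond_std = np.asarray(cond_std, dtype=float)
    return float(np.prod(2.0 * norm.cdf(1.0 / (2.0 * cond_std)) - 1.0))

def threshold_function(p_s, a=(0.25, -0.20), b=(1.00, -0.95)):
    """Hypothetical rational function model mapping success rate p_s to
    an FF-difference-test threshold: (a0 + a1*p) / (b0 + b1*p).
    Coefficients are placeholders, not the paper's fitted values."""
    return (a[0] + a[1] * p_s) / (b[0] + b[1] * p_s)

# Hypothetical conditional standard deviations (cycles) after decorrelation.
p_ib = ib_success_rate([0.05, 0.08, 0.12])
print(f"IB success rate ~ {p_ib:.4f} -> threshold ~ {threshold_function(p_ib):.3f}")
```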

Relevance: 30.00%

Abstract:

Ambiguity validation, an important procedure in integer ambiguity resolution, tests the correctness of the fixed integer ambiguities of phase measurements before they are used in positioning computation. Most existing investigations on ambiguity validation focus on the test statistic. How to determine the threshold more reasonably is less well understood, although it is one of the most important topics in ambiguity validation. Currently, there are two threshold determination methods in the ambiguity validation procedure: the empirical approach and the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis. The fixed failure rate approach has a rigorous basis in probability theory, but it employs a more complicated procedure. This paper focuses on how to determine the threshold easily and reasonably. Both the FF-ratio test and the FF-difference test are investigated in this research, and extensive simulation results show that the FF-difference test can achieve comparable or even better performance than the well-known FF-ratio test. Another benefit of adopting the FF-difference test is that its threshold can be expressed as a function of the integer least-squares (ILS) success rate with a specified failure rate tolerance. Thus, a new threshold determination method, named the threshold function for the FF-difference test, is proposed. The threshold function method preserves the fixed failure rate characteristic and is also easy to apply. The performance of the threshold function is validated with simulated data. The validation results show that with the threshold function method, the impact of the modelling error on the failure rate is less than 0.08%. Overall, the threshold function for the FF-difference test is a very promising threshold validation method, and it makes the FF-approach applicable to real-time GNSS positioning applications.
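To make the acceptance rule concrete, here is a minimal sketch of an FF-difference test: the fixed solution is accepted when the gap between the second-best and best integer candidates' squared norms meets a threshold. The function name and numeric values are hypothetical; the threshold would come from a threshold function such as the one sketched in the previous entry.

```python
# Sketch of the FF-difference test acceptance rule: accept the integer
# candidate when the gap between the second-smallest and smallest
# squared-norm residuals reaches the threshold. Values are hypothetical.
def ff_difference_test(r_best: float, r_second: float, threshold: float) -> bool:
    """Accept the fixed ambiguities when (R2 - R1) >= threshold."""
    return (r_second - r_best) >= threshold

# Hypothetical squared norms of the two best integer candidates.
accepted = ff_difference_test(r_best=2.1, r_second=7.8, threshold=3.5)
print("fixed solution accepted" if accepted else "fall back to float solution")
```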

Relevance: 30.00%

Abstract:

Nowadays, the integration of small-scale electricity generators, known as Distributed Generation (DG), into distribution networks has become increasingly popular. This trend, together with the falling price of DG units, gives DG a strong opportunity to participate in the voltage regulation process, in parallel with other regulating devices already available in distribution systems. The voltage control issue is a very challenging problem for distribution engineers, since existing control coordination schemes need to be reconsidered to take DG operation into account. In this paper, a control coordination approach is proposed that is able to utilize the DG as a voltage regulator while minimizing the interaction of a DG unit with other DG units or other active devices, such as an On-load Tap Changing transformer (OLTC). The proposed technique has been developed based on protection principles (magnitude grading and time grading) for the response coordination of DG and other regulating devices, and uses Advanced Line Drop Compensators (ALDCs) for implementation. A distribution feeder with a tap-changing transformer and DG units was extracted from a practical system to test the proposed control technique. The results show that the proposed method provides an effective solution for the coordination of DG with other DG units or voltage regulating devices, and that the integration of protection principles considerably reduces control interaction while achieving the desired voltage correction.
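A minimal sketch of the magnitude/time grading idea referred to above, assuming a simplified per-device model: each regulator acts only when the voltage error exceeds its own deadband (magnitude grading) for longer than its own delay (time grading). This illustrates the protection-principles concept only, not the paper's ALDC implementation; all parameter values are hypothetical.

```python
# Illustrative magnitude/time grading: a device acts only when the
# voltage error exceeds its deadband for longer than its delay.
from dataclasses import dataclass

@dataclass
class Regulator:
    name: str
    deadband_pu: float   # tolerated voltage error (magnitude grading)
    delay_s: float       # persistence required before acting (time grading)
    timer_s: float = 0.0

    def update(self, v_error_pu: float, dt_s: float) -> bool:
        """Accumulate violation time; return True when the device should act."""
        if abs(v_error_pu) > self.deadband_pu:
            self.timer_s += dt_s
        else:
            self.timer_s = 0.0
        return self.timer_s >= self.delay_s

# The DG has a tight band and short delay (fast local correction); the
# OLTC has a wide band and long delay, so it ignores deviations the DG
# can correct, which limits control interaction.
devices = [Regulator("DG", deadband_pu=0.01, delay_s=5.0),
           Regulator("OLTC", deadband_pu=0.03, delay_s=30.0)]
for step in range(1, 8):                  # sustained 0.02 pu error
    for dev in devices:
        if dev.update(v_error_pu=0.02, dt_s=5.0):
            print(f"t={5 * step:>2}s: {dev.name} acts")
```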

Relevance: 30.00%

Abstract:

This paper examines the impact of the chosen bottle-point method when conducting ion exchange equilibrium experiments. As an illustration, potassium ion exchange with a strong acid cation resin was investigated, owing to its relevance to the treatment of various industrial effluents and groundwater. The “constant mass” bottle-point method was shown to be problematic in that the equilibrium isotherm profiles differed depending upon the resin mass used. Indeed, application of common equilibrium isotherm models revealed that the optimal fit could be with either the Freundlich or the Temkin equation, depending upon the conditions employed. It could be inferred that the resin surface was heterogeneous in character, but precise conclusions regarding the variation in the heat of sorption were not possible. Estimation of the maximum potassium loading was also inconsistent when employing the “constant mass” method. The “constant concentration” bottle-point method showed that the Freundlich model was a good representation of the exchange process, and the isotherms recorded were relatively consistent compared to those from the “constant mass” approach. Unification of all the equilibrium isotherm data acquired was achieved by use of the Langmuir–Vageler expression. The maximum loading of potassium ions was predicted to be at least 116.5 g/kg resin.
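For readers unfamiliar with the isotherm fitting involved, the following sketch fits the Freundlich isotherm q = Kf·C^(1/n) to invented equilibrium data points; the data are placeholders, not the paper's measurements.

```python
# Sketch of fitting the Freundlich isotherm q = Kf * C^(1/n) to
# equilibrium data. Data points are hypothetical placeholders.
import numpy as np
from scipy.optimize import curve_fit

def freundlich(c, kf, n):
    return kf * np.power(c, 1.0 / n)

c_eq = np.array([5.0, 10.0, 25.0, 50.0, 100.0])   # equilibrium concentration (mg/L)
q_eq = np.array([22.0, 30.0, 44.0, 58.0, 78.0])   # resin loading (g/kg)

(kf, n), _ = curve_fit(freundlich, c_eq, q_eq, p0=(1.0, 2.0))
print(f"Kf = {kf:.2f}, n = {n:.2f}")
```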

Relevance: 30.00%

Abstract:

Background: Historically, the paper hand-held record (PHR) has been used for sharing information between hospital clinicians, general practitioners and pregnant women in a maternity shared-care environment. Recently, in alignment with a National e-health agenda, an electronic health record (EHR) was introduced at an Australian tertiary maternity service to replace the PHR for the collection and transfer of data. The aim of this study was to examine and compare the completeness of clinical data collected in a PHR and an EHR. Methods: We undertook a comparative cohort design study to determine differences in completeness between data collected from maternity records in two phases. Phase 1 data were collected from the PHR and Phase 2 data from the EHR. Records were compared for completeness of the best practice variables collected. The primary outcome was the presence of best practice variables and the secondary outcomes were the differences in individual variables between the records. Results: Ninety-four percent of paper medical charts were available in Phase 1 and 100% of records from an obstetric database in Phase 2. No PHR or EHR had a complete dataset of best practice variables. The variables with significant improvement in completeness of data documented in the EHR, compared with the PHR, were urine culture, glucose tolerance test, nuchal screening, morphology scans, folic acid advice, tobacco smoking, illicit drug assessment and domestic violence assessment (p = 0.001). Additionally, the documentation of immunisations (pertussis, hepatitis B, varicella, fluvax) was markedly improved in the EHR (p = 0.001). The variables of blood pressure, proteinuria, blood group, antibody, rubella and syphilis status showed no significant differences in completeness of recording. Conclusion: This is the first paper to report on the comparison of clinical data collected on a PHR and an EHR in a maternity shared-care setting. The use of an EHR demonstrated significant improvements in the collection of best practice variables. Additionally, the data in the EHR were more available to relevant clinical staff with the appropriate log-in and more easily retrieved than from the PHR. This study contributes to an under-researched area: determining the quality of data collected in patient records.
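As a hedged illustration of how completeness might be compared between the two record types, the sketch below runs a chi-square test on a hypothetical 2x2 documented/missing table for one variable; the counts are invented, and the study's actual statistical procedure is not specified in the abstract.

```python
# Sketch: compare documentation completeness of one best-practice
# variable between PHR and EHR cohorts with a chi-square test.
# Counts are hypothetical placeholders, not the study's data.
import numpy as np
from scipy.stats import chi2_contingency

#                 documented, missing
table = np.array([[150, 50],    # PHR cohort
                  [190, 10]])   # EHR cohort
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```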

Relevance: 30.00%

Abstract:

Next-generation sequencing techniques have evolved rapidly over the last decade, providing researchers with low-cost, high-throughput alternatives to traditional Sanger sequencing methods. These sequencing techniques have progressed quickly from the first generation to the fourth generation, with very broad applications such as unravelling the complexity of the genome in terms of genetic variations, and they have had a high impact on the biological field. In this review, we discuss the transition of sequencing from the second generation to the third and fourth generations, and describe some of their novel biological applications. With advances in technology, the earlier challenges of minimising instrument size, providing throughput flexibility, easing data analysis and shortening run times are being addressed. However, prospective analysis is still needed to test whether knowledge of any given newly identified variant has an effect on clinical outcome.

Relevance: 30.00%

Abstract:

This work is a MATLAB/Simulink model of a controller for a three-phase, four-wire, grid-interactive inverter. The model provides the capacity to simulate the performance of power electronic hardware, as well as code generation for an embedded controller. The implemented hardware topology is a three-leg bridge with a neutral connection to the centre tap of the DC bus. An LQR-based current controller and a MAF-based phase detector are implemented. The model is configured for code generation for a Texas Instruments TMS320F28335 Digital Signal Processor (DSP).
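The Simulink model itself is not reproduced here, but the LQR design step behind an "LQR-based current controller" can be sketched as follows, assuming a deliberately simplified single-state current-loop model (an L-filter with series resistance); all parameter values are hypothetical.

```python
# Sketch of LQR gain design for a hypothetical single-state inverter
# current loop: di/dt = (-R/L) i + (1/L) v. Solve the continuous-time
# algebraic Riccati equation and form K = R_w^{-1} B^T P.
import numpy as np
from scipy.linalg import solve_continuous_are

L_f, R_f = 2e-3, 0.1                 # hypothetical filter inductance/resistance
A = np.array([[-R_f / L_f]])         # current-loop state matrix
B = np.array([[1.0 / L_f]])          # input (inverter voltage) matrix
Q = np.array([[100.0]])              # state weight
R_w = np.array([[0.01]])             # control-effort weight

P = solve_continuous_are(A, B, Q, R_w)
K = np.linalg.solve(R_w, B.T @ P)    # optimal state-feedback gain u = -K x
print("LQR gain K =", K)
```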

Relevance: 30.00%

Abstract:

Background: The capacity to diagnose, quantify and evaluate movement beyond the general confines of a clinical environment, under effectiveness conditions, may alleviate rampant strain on limited, expensive and highly specialized medical resources. The iPhone 4® mounts a three-dimensional accelerometer subsystem with highly robust software applications. The present study aimed to evaluate the reliability and concurrent criterion-related validity of accelerations measured with an iPhone 4® in an Extended Timed Get Up and Go test. The Extended Timed Get Up and Go is a clinical test in which the patient gets up from a chair, walks ten metres, turns and comes back to the chair. Methods: A repeated-measures, cross-sectional, analytical study. Test-retest reliability of the kinematic measurements of the iPhone 4® was assessed against a standard validated laboratory device. We calculated the Coefficient of Multiple Correlation between the two sensors' acceleration signals for each subject, in each sub-stage, in each of the three Extended Timed Get Up and Go test trials. To investigate statistical agreement between the two sensors, we used the Bland-Altman method. Results: With respect to the analysis of the correlation data in the present work, the Coefficients of Multiple Correlation of the five subjects in their triplicated trials were as follows: in the Sit to Stand sub-phase, they ranged from r = 0.991 to 0.842; in Gait Go, r = 0.967 to 0.852; in Turn, 0.979 to 0.798; in Gait Come, 0.964 to 0.887; and in Turn to Stand to Sit, 0.992 to 0.877. All the correlations between the sensors were significant (p < 0.001). The Bland-Altman plots obtained showed a solid tendency to stay close to zero, especially on the x- and y-axes, during the five phases of the Extended Timed Get Up and Go test. Conclusions: The inertial sensor mounted in the iPhone 4® is sufficiently reliable and accurate to evaluate and identify the kinematic patterns in an Extended Timed Get Up and Go test. While the analysis and interpretation of 3D kinematic data continue to be dauntingly complex, the iPhone 4® makes the task of acquiring the data relatively inexpensive and easy.
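A minimal sketch of the Bland-Altman agreement analysis named in the Methods, using invented paired measurements: the difference between the two devices is plotted against their mean, with the bias and 95% limits of agreement.

```python
# Bland-Altman agreement sketch for two devices measuring the same
# quantity. The arrays are hypothetical paired acceleration samples.
import numpy as np
import matplotlib.pyplot as plt

iphone = np.array([1.02, 0.98, 1.10, 0.95, 1.05, 1.00])     # iPhone readings
reference = np.array([1.00, 1.01, 1.08, 0.97, 1.03, 1.02])  # lab device readings

mean = (iphone + reference) / 2.0
diff = iphone - reference
bias, sd = diff.mean(), diff.std(ddof=1)

plt.scatter(mean, diff)
plt.axhline(bias, linestyle="--", label="bias")
for k in (-1.96, 1.96):              # 95% limits of agreement
    plt.axhline(bias + k * sd, linestyle=":")
plt.xlabel("Mean of methods"); plt.ylabel("Difference"); plt.legend()
plt.show()
```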

Relevance: 30.00%

Abstract:

Background and purpose: There are no published studies on the parameterisation and reliability of the single-leg stance (SLS) test with inertial sensors in stroke patients. Purpose: to analyse the reliability (intra-observer/inter-observer) and sensitivity of inertial sensors used for the SLS test in stroke patients. A secondary objective was to compare the records of the two inertial sensors (trunk and lumbar) to detect any significant differences in the kinematic data obtained in the SLS test. Methods: Design: cross-sectional study. While performing the SLS test, two inertial sensors were placed at the lumbar (L5-S1) and trunk (T7–T8) regions. Setting: Laboratory of Biomechanics (Health Science Faculty, University of Málaga). Participants: four chronic stroke survivors (over 65 years old). Measurements: displacement and velocity in rotation (X-axis), flexion/extension (Y-axis) and inclination (Z-axis), together with the resultant displacement and velocity, RV = √(Vx² + Vy² + Vz²). Along with the SLS kinematic variables, descriptive analyses, differences between sensor locations, and intra-observer and inter-observer reliability were also calculated. Results: Differences between the sensors were significant only for left inclination velocity (p = 0.036) and extension displacement in the non-affected leg with eyes open (p = 0.038). Intra-observer reliability of the trunk sensor ranged from 0.889 to 0.921 for displacement and 0.849 to 0.892 for velocity. Intra-observer reliability of the lumbar sensor was between 0.896 and 0.949 for displacement and 0.873 and 0.894 for velocity. Inter-observer reliability of the trunk sensor was between 0.878 and 0.917 for displacement and 0.847 and 0.884 for velocity. Inter-observer reliability of the lumbar sensor ranged from 0.870 to 0.940 for displacement and 0.863 to 0.884 for velocity. Conclusion: There were no significant differences between the kinematic records made during the SLS test by the two inertial sensors placed in the lumbar and thoracic regions. In addition, inertial sensors have the potential to be reliable, valid and sensitive instruments for kinematic measurements during SLS testing, but further research is needed.
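The resultant-velocity formula given above is straightforward to compute sample-by-sample; a short sketch with hypothetical three-axis data:

```python
# Resultant velocity RV = sqrt(Vx^2 + Vy^2 + Vz^2) per sample.
# The sensor values below are hypothetical placeholders.
import numpy as np

v = np.array([[0.12, -0.03, 0.05],
              [0.10,  0.01, 0.04],
              [0.08, -0.02, 0.06]])   # rows: samples; columns: Vx, Vy, Vz

rv = np.linalg.norm(v, axis=1)        # resultant velocity per sample
print(rv)
```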

Relevance: 30.00%

Abstract:

Background: The diagnosis of frailty is based on physical impairments, and clinicians have indicated that early detection is one of the most effective methods for reducing the severity of physical frailty. An alternative to the classical diagnosis could be the instrumentation of classical functional tests, such as the Romberg test or the Timed Get Up and Go test. The aims of this study were (I) to measure and describe the magnitude of accelerometry values in the Romberg test in two groups of frail and non-frail elderly people through instrumentation with the iPhone 4®, (II) to analyse the performances and differences between the study groups, and (III) to analyse the performances and differences within the study groups to characterise accelerometer responses to increasingly difficult challenges to balance. Methods: This is a cross-sectional study of 18 subjects over 70 years old: 9 frail subjects and 9 non-frail subjects. The non-parametric Mann–Whitney U test was used for between-group comparisons of the mean values derived from the different tasks. The Wilcoxon signed-rank test was used to analyse differences between variants of the test within each study group. Results: The greatest differences between groups were found in the accelerometer values with eyes closed and feet parallel: maximum peak acceleration in the lateral axis (p < 0.01), minimum peak acceleration in the lateral axis (p < 0.01) and minimum peak acceleration of the resultant vector (p < 0.01). With eyes open and feet parallel, the greatest differences found between the groups were in the maximum peak acceleration in the lateral axis (p < 0.01), minimum peak acceleration in the lateral axis (p < 0.01) and minimum peak acceleration of the resultant vector (p < 0.001). With eyes closed and feet in tandem, the greatest differences found between the groups were in the minimum peak acceleration in the lateral axis (p < 0.01). Conclusions: The accelerometer fitted in the iPhone 4® makes it possible to study and analyse the kinematics of the Romberg test in frail and non-frail elderly people. In addition, the results indicate that the accelerometry values were significantly different between the frail and non-frail groups, and that values from the accelerometer increased as the test was made more complicated.
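As an illustration of the between-group comparison described in the Methods, the sketch below applies a Mann-Whitney U test to hypothetical peak lateral accelerations for nine frail and nine non-frail subjects; the values are invented placeholders.

```python
# Mann-Whitney U test sketch for frail vs non-frail peak lateral
# accelerations. All values are hypothetical placeholders.
import numpy as np
from scipy.stats import mannwhitneyu

frail = np.array([0.42, 0.51, 0.38, 0.47, 0.55, 0.44, 0.49, 0.41, 0.53])
non_frail = np.array([0.22, 0.31, 0.27, 0.25, 0.29, 0.24, 0.33, 0.26, 0.28])

u_stat, p_value = mannwhitneyu(frail, non_frail, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.4f}")
```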

Relevance: 30.00%

Abstract:

We review studies of Nelson's (1976) Modified Card Sorting Test (MCST) that have examined the performance of subjects with frontal lobe dysfunction. Six studies investigated the performance of normal controls and patients with frontal lobe dysfunction, whereas four studies compared the performance of frontal and nonfrontal patients. One further study compared the performance of amnesic patients both on the MCST and on the original Wisconsin Card Sorting Test (WCST). Evidence regarding the MCST's differential sensitivity to frontal lobe dysfunction is weak, as is the evidence regarding the equivalence of the MCST and WCST. It is likely that the MCST is an altogether different test from the standard version. In the absence of proper normative data for the MCST, we provide a table of scores derived from the control groups of various studies. Given the paucity of evidence, further research is required before the MCST can be recommended for use as a marker of frontal lobe dysfunction.

Relevance: 30.00%

Abstract:

Heritability of brain anatomical connectivity has been studied with diffusion-weighted imaging (DWI) mainly by modeling each voxel's diffusion pattern as a tensor (e.g., to compute fractional anisotropy), but this method cannot accurately represent the many crossing connections present in the brain. We hypothesized that different brain networks (i.e., their component fibers) might have different heritability and we investigated brain connectivity using High Angular Resolution Diffusion Imaging (HARDI) in a cohort of twins comprising 328 subjects that included 70 pairs of monozygotic and 91 pairs of dizygotic twins. Water diffusion was modeled in each voxel with a Fiber Orientation Distribution (FOD) function to study heritability for multiple fiber orientations in each voxel. Precision was estimated in a test-retest experiment on a sub-cohort of 39 subjects. This was taken into account when computing heritability of FOD peaks using an ACE model on the monozygotic and dizygotic twins. Our results confirmed the overall heritability of the major white matter tracts but also identified differences in heritability between connectivity networks. Inter-hemispheric connections tended to be more heritable than intra-hemispheric and cortico-spinal connections. The highly heritable tracts were found to connect particular cortical regions, such as medial frontal cortices, postcentral, paracentral gyri, and the right hippocampus.
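The full ACE variance-decomposition model used in the study is not reproduced here; as a simplified stand-in, Falconer's formula estimates additive heritability from monozygotic and dizygotic twin-pair correlations, h² = 2(r_MZ − r_DZ). The correlation values below are hypothetical.

```python
# Falconer's formula: a rough additive-genetic heritability estimate
# from MZ and DZ twin-pair correlations (a simplification of the ACE
# model). Correlation values are hypothetical placeholders.
def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """h^2 = 2 * (r_MZ - r_DZ)."""
    return 2.0 * (r_mz - r_dz)

h2 = falconer_heritability(r_mz=0.80, r_dz=0.45)
print(f"h^2 = {h2:.2f}")  # 0.70
```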

Relevance: 30.00%

Abstract:

This paper describes part of an engineering study that was undertaken to demonstrate that a multi-megawatt photovoltaic (PV) generation system could be connected to a rural 11 kV feeder without creating power quality issues for other consumers. The paper concentrates solely on the voltage regulation aspect of the study, as this was its most innovative part. The study was carried out using the time-domain software package PSCAD/EMTDC. The software model included real-time data input of actual measured load and scaled PV generation data, along with real-time substation voltage regulator and PV inverter reactive power control. The model outputs plot real-time voltage, current and power variations across the daily load and PV generation profiles. Other aspects of the study not described in the paper include the analysis of harmonics, voltage flicker, power factor, voltage unbalance and system losses.
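The PSCAD/EMTDC model is not reproduced here; as a hedged illustration of one element it mentions, PV inverter reactive power control, the sketch below evaluates a generic piecewise-linear volt-var droop curve. The breakpoints are hypothetical and are not taken from the paper.

```python
# Generic volt-var droop sketch: absorb reactive power when voltage is
# high, inject when low, with a deadband around nominal. Breakpoints
# are hypothetical placeholders.
import numpy as np

def volt_var(v_pu: float, q_max_pu: float = 0.44) -> float:
    """Piecewise-linear volt-var curve returning reactive power (pu).
    Positive = injection (raises voltage), negative = absorption."""
    v_pts = [0.95, 0.98, 1.02, 1.05]            # voltage breakpoints (pu)
    q_pts = [q_max_pu, 0.0, 0.0, -q_max_pu]     # reactive power setpoints (pu)
    return float(np.interp(v_pu, v_pts, q_pts))

for v in (0.94, 1.00, 1.03, 1.06):
    print(f"V = {v:.2f} pu -> Q = {volt_var(v):+.3f} pu")
```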

Relevance: 30.00%

Abstract:

Montserrat now provides one of the most complete datasets for understanding the character and tempo of hazardous events at volcanic islands. Much of the erupted material ends up offshore, and this offshore record may be easier to date due to intervening hemipelagic sediments between event beds. The offshore dataset includes the first scientific drilling of volcanic island landslides, during IODP Expedition 340, together with an unusually comprehensive set of shallow sediment cores and 2-D and 3-D seismic surveys. Most recently, in 2013, Remotely Operated Vehicle (ROV) dives mapped and sampled the surface of the main landslide deposits. This contribution aims to provide an overview of key insights from ongoing work on IODP Expedition 340 sites offshore Montserrat. Key objectives are to understand the composition (and hence source) and emplacement mechanism (and hence tsunami generation) of major landslides, together with their frequency and timing relative to volcanic eruption cycles. The most recent major collapse event is Deposit 1, which involved ~1.8 km³ of material and produced a blocky deposit at ~12–14 ka. Deposit 1 appears to have involved not only the volcanic edifice, but also a substantial component of a fringing bioclastic shelf, together with material locally incorporated from the underlying seafloor. This information allows us to test how first-order landslide morphology (e.g. blocky or elongate lobes) is related to first-order landslide composition. Preliminary analysis suggests that Deposit 1 occurred shortly before a second major landslide on the SW of the island (Deposit 5). It may have initiated English's Crater, but was not associated with a major change in magma composition. An associated turbidite stack suggests it was emplaced in multiple stages, separated by at least a few hours, thereby reducing the tsunami magnitude. The ROV dives show that mega-blocks are composed in detail of smaller-scale breccias, which can travel significant distances without complete disintegration. Landslide Deposit 2 was emplaced at ~130 ka and is more voluminous (~8.4 km³). It had a much more profound influence on the magmatic system, as it was linked to a major explosive mafic eruption and the formation of a new volcanic centre (South Soufriere Hills) on the island. Site U1395 confirms a hypothesis, based on the site-survey seismic data, that Deposit 2 includes a substantial component of pre-existing seafloor sediment. Surprisingly, however, this pre-existing seafloor sediment in the lower part of Deposit 2 at Site U1395 is completely undeformed and flat-lying, suggesting that Site U1395 penetrated a flat-lying block. Ongoing work to date material from the upper parts of Sites U1396, U1395 and U1394 will also be summarised. This work is establishing a chronostratigraphy of major events over the last 1 Ma, with particularly detailed constraints for the last ~250 ka. This is helping us to understand whether major landslides are related to cycles of volcanic eruptions.