988 results for Design studies


Relevance: 30.00%

Abstract:

OBJECTIVE: Laypersons are poor at emergency pulse checks (sensitivity 84%, specificity 36%). Guidelines indicate that pulse checks should not be performed. The impedance cardiogram (dZ/dt) is used to assess stroke volume. Can a novel defibrillator-based impedance cardiogram system be used to distinguish between circulatory arrest and other collapse states?

DESIGN: Animal study.

SETTING: University research laboratory.

SUBJECTS: Twenty anesthetized, mechanically ventilated pigs, weight 50-55 kg.

INTERVENTIONS: Stroke volume was altered by right ventricular pacing (160, 210, 260, and 305 beats/min). Cardiac arrest states were then induced: ventricular fibrillation (by rapid ventricular pacing) and, after successful defibrillation, pulseless electrical activity and asystole (by high-dose intravenous pentobarbitone).

MEASUREMENTS AND MAIN RESULTS: The impedance cardiogram was recorded through electrocardiogram/defibrillator pads in standard cardiac arrest positions. Simultaneously recorded electro- and impedance cardiogram (dZ/dt) along with arterial blood pressure tracings were digitized during each pacing and cardiac arrest protocol. Five-second epochs were analyzed for sinus rhythm (20 before ventricular fibrillation, 20 after successful defibrillation), ventricular fibrillation (40), pulseless electrical activity (20), and asystole (20), in two sets of ten pigs (ten training, ten validation). Standard impedance cardiogram variables were noncontributory in cardiac arrest, so the fast Fourier transform of dZ/dt was assessed. During ventricular pacing, the peak amplitude of the fast Fourier transform of dZ/dt (between 1.5 and 4.5 Hz) correlated with stroke volume (r² = 0.3, p < .001). In cardiac arrest, a peak amplitude of the fast Fourier transform of dZ/dt of ≤4 dB·Ω·rms indicated no output with high sensitivity (94% training set, 86% validation set) and specificity (98% training set, 90% validation set).

CONCLUSIONS: As a powerful clinical marker of circulatory collapse, the fast Fourier transformation of dZ/dt (impedance cardiogram) has the potential to improve emergency care by laypersons using automated defibrillators.
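The decision rule described above (peak FFT amplitude of dZ/dt in the 1.5–4.5 Hz band, thresholded at ≤4 dB·Ω·rms) can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation: the function names, the sampling rate, and the naive DFT are assumptions; only the frequency band and the threshold come from the abstract.

```python
import math

def band_peak_db(signal, fs, f_lo=1.5, f_hi=4.5):
    """Peak spectral magnitude (in dB) of `signal` within [f_lo, f_hi] Hz.

    Uses a naive DFT, which is adequate for short 5-second epochs."""
    n = len(signal)
    peak = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            re = sum(x * math.cos(-2 * math.pi * k * i / n)
                     for i, x in enumerate(signal))
            im = sum(x * math.sin(-2 * math.pi * k * i / n)
                     for i, x in enumerate(signal))
            mag = 2 * math.hypot(re, im) / n   # single-sided amplitude
            peak = max(peak, mag)
    return 20 * math.log10(peak) if peak > 0 else float("-inf")

def no_cardiac_output(signal, fs, threshold_db=4.0):
    """The abstract's rule: band peak at or below the threshold => no output."""
    return band_peak_db(signal, fs) <= threshold_db
```

A 3 Hz pulsatile component of appreciable amplitude lands well above the threshold, while a flat (asystolic) trace falls below it.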

Relevance: 30.00%

Abstract:

A methodology which allows a non-specialist to rapidly design silicon wavelet transform cores has been developed. This methodology is based on a generic architecture utilizing time-interleaved coefficients for the wavelet transform filters. The architecture is scalable and has been parameterized in terms of wavelet family, wavelet type, data word length and coefficient word length. The control circuit is designed in such a way that the cores can also be cascaded without any interface glue logic for any desired level of decomposition. This parameterization allows the use of any orthonormal wavelet family, thereby extending the design space for improved transformation from algorithm to silicon. Case studies for stand-alone and cascaded silicon cores, for single- and multi-stage analysis respectively, are reported. The typical design time to produce silicon layout of a wavelet-based system has been reduced by an order of magnitude. The cores are comparable in area and performance to hand-crafted designs. The designs have been captured in VHDL, so they are portable across a range of foundries and are also applicable to FPGA and PLD implementations.
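To make the filter-bank structure concrete, here is a minimal software model of one analysis stage and its cascade. Everything here (function names, the periodic extension, the Haar example coefficients) is an illustrative assumption unrelated to the VHDL cores themselves, but the chaining mirrors the glue-logic-free cascading described above.

```python
import math

def dwt_level(x, lo, hi):
    """One analysis stage: filter with a low-/high-pass pair, downsample by 2.
    Periodic extension keeps each output at len(x)//2 samples."""
    n = len(x)
    def filt(h):
        return [sum(h[k] * x[(2 * i + k) % n] for k in range(len(h)))
                for i in range(n // 2)]
    return filt(lo), filt(hi)

def dwt_cascade(x, lo, hi, levels):
    """Feed the approximation output of one stage straight into the next,
    as the cascaded cores do, for any desired level of decomposition."""
    details, approx = [], x
    for _ in range(levels):
        approx, d = dwt_level(approx, lo, hi)
        details.append(d)
    return approx, details

# Haar (db1): the shortest member of an orthonormal family with which such
# an architecture could be parameterized.
_s = 1 / math.sqrt(2)
HAAR_LO, HAAR_HI = [_s, _s], [_s, -_s]
```

On a constant signal the detail outputs vanish and the approximation carries all the energy, which is a quick sanity check for any coefficient set plugged into the same structure.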

Relevance: 30.00%

Abstract:

In 2004, the integrated European project GEHA (Genetics of Healthy Ageing) was initiated with the aim of identifying genes involved in healthy ageing and longevity. The first step in the project was the recruitment of more than 2500 pairs of siblings aged 90 years or more, together with one younger control person, from 15 areas in 11 European countries through a coordinated and standardised effort. A biological sample, preferably a blood sample, was collected from each participant, and basic physical and cognitive measures were obtained together with information about health, lifestyle, and family composition. From 2004 to 2008 a total of 2535 families comprising 5319 nonagenarian siblings were identified and included in the project. In addition, 2548 younger control persons aged 50-75 years were recruited. A total of 2249 complete trios with blood samples from at least two old siblings and the younger control were formed and are available for genetic analyses (e.g. linkage studies and genome-wide association studies). Mortality follow-up improves the possibility of identifying families with the most extreme longevity phenotypes. With a mean follow-up time of 3.7 years, the number of families with all participating siblings aged 95 years or more has increased fivefold, to 750 families, compared with the time the interviews were conducted. Thus, the GEHA project represents a unique source in the search for genes related to healthy ageing and longevity.

Relevance: 30.00%

Abstract:

Architects use cycle-by-cycle simulation to evaluate design choices and understand tradeoffs and interactions among design parameters. Efficiently exploring exponential-size design spaces with many interacting parameters remains an open problem: the sheer number of experiments renders detailed simulation intractable. We attack this problem via an automated approach that builds accurate, confident predictive design-space models. We simulate sampled points, using the results to teach our models the function describing relationships among design parameters. The models produce highly accurate performance estimates for other points in the space, can be queried to predict performance impacts of architectural changes, and are very fast compared to simulation, enabling efficient discovery of tradeoffs among parameters in different regions. We validate our approach via sensitivity studies on memory hierarchy and CPU design spaces: our models generally predict IPC with only 1-2% error and reduce required simulation by two orders of magnitude. We also show the efficacy of our technique for exploring chip multiprocessor (CMP) design spaces: when trained on a 1% sample drawn from a CMP design space with 250K points and up to 55x performance swings among different system configurations, our models predict performance with only 4-5% error on average. Our approach combines with techniques to reduce time per simulation, achieving net time savings of three to four orders of magnitude. Copyright © 2006 ACM.
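The sample-then-predict loop can be illustrated with a deliberately simple stand-in predictor. The paper's models are learned regressors; the inverse-distance-weighted k-NN below, the toy "simulator", and all names in it are assumptions chosen only to keep the sketch self-contained.

```python
import math

def knn_predict(samples, query, k=3):
    """Predict performance at an unsimulated design point from a small set
    of simulated (config, IPC) samples, via inverse-distance-weighted k-NN."""
    nearest = sorted((math.dist(cfg, query), ipc) for cfg, ipc in samples)[:k]
    if nearest[0][0] == 0:              # the query itself was simulated
        return nearest[0][1]
    weights = [(1 / d, ipc) for d, ipc in nearest]
    return sum(w * ipc for w, ipc in weights) / sum(w for w, _ in weights)

# Pretend cycle-accurate simulator: IPC as a smooth function of two design
# parameters (cache-size index, issue width) -- purely illustrative.
def fake_simulate(cfg):
    cache, width = cfg
    return 0.1 * cache + 0.2 * width

# "Simulate sampled points": a tiny toy grid stands in for the 1% sample
# drawn from a large design space.
SAMPLES = [((c, w), fake_simulate((c, w))) for c in range(5) for w in range(5)]
```

Querying the model at an unsimulated point returns an interpolated IPC estimate in a fraction of the cost of another simulation run, which is the source of the time savings claimed above.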

Relevance: 30.00%

Abstract:

BACKGROUND: Overuse of unnecessary medications in frail older adults with limited life expectancy remains an understudied challenge. OBJECTIVE: To identify intervention studies that reduced use of unnecessary medications in frail older adults. A secondary goal was to identify and review studies focusing on patients approaching end of life. We examined criteria for identifying unnecessary medications, intervention processes for medication reduction, and intervention effectiveness. METHODS: A systematic review of English articles using MEDLINE, EMBASE, and International Pharmaceutical Abstracts from January 1966 to September 2012. Additional studies were identified by searching bibliographies. Search terms included prescription drugs, drug utilization, hospice or palliative care, and appropriate or inappropriate. A manual review of 971 identified abstracts for the inclusion criteria (study included an intervention to reduce chronic medication use; at least 5 participants; population included patients aged at least 65 years, hospice enrollment, or indication of frailty or risk of functional decline-including assisted living or nursing home residence, inpatient hospitalization) yielded 60 articles for full review by 3 investigators. After exclusion of review articles, interventions targeting acute medications, or studies exclusively in the intensive care unit, 36 articles were retained (including 13 identified by bibliography review). Articles were extracted for study design, study setting, intervention description, criteria for identifying unnecessary medication use, and intervention outcomes. RESULTS: The studies included 15 randomized controlled trials, 4 non-randomized trials, 6 pre-post studies, and 11 case series. Control groups were used in over half of the studies (n = 20). 
Study populations varied and included residents of nursing homes and assisted living facilities (n = 16), hospitalized patients (n = 14), hospice/palliative care patients (n = 3), home care patients (n = 2), and frail or disabled community-dwelling patients (n = 1). The majority of studies (n = 21) used implicit criteria to identify unnecessary medications (including drugs without indication, unnecessary duplication, and lack of effectiveness); only one study incorporated patient preference into prescribing criteria. Most interventions (n = 25) were led by or involved pharmacists, 4 used academic detailing, 2 used audit and feedback reports targeting prescribers, and 5 involved physician-led medication reviews. Overall intervention effect sizes could not be determined due to heterogeneity of study designs, samples, and measures. CONCLUSIONS: Very little rigorous research has been conducted on reducing unnecessary medications in frail older adults or patients approaching end of life.

Relevance: 30.00%

Abstract:

Efficiently exploring exponential-size architectural design spaces with many interacting parameters remains an open problem: the sheer number of experiments required renders detailed simulation intractable. We attack this via an automated approach that builds accurate predictive models. We simulate sampled points, using results to teach our models the function describing relationships among design parameters. The models can be queried and are very fast, enabling efficient design tradeoff discovery. We validate our approach via two uniprocessor sensitivity studies, predicting IPC with only 1–2% error. In an experimental study using the approach, training on 1% of a 250-K-point CMP design space allows our models to predict performance with only 4–5% error. Our predictive modeling combines well with techniques that reduce the time taken by each simulation experiment, achieving net time savings of three to four orders of magnitude.

Relevance: 30.00%

Abstract:

PURPOSE. Scanning laser tomography with the Heidelberg retina tomograph (HRT; Heidelberg Engineering, Heidelberg, Germany) has been proposed as a useful diagnostic test for glaucoma. This study was conducted to evaluate the quality of reporting of published studies using the HRT for diagnosing glaucoma. METHODS. A validated Medline and hand search of English-language articles reporting on measures of diagnostic accuracy of the HRT for glaucoma was performed. Two reviewers selected and appraised the papers independently. The Standards for Reporting of Diagnostic Accuracy (STARD) checklist was used to evaluate the quality of each publication. RESULTS. A total of 29 articles were included. Interobserver rating agreement was observed in 83% of items (κ = 0.76). The number of STARD items properly reported ranged from 5 to 18. Less than a third of studies (7/29) explicitly reported more than half of the STARD items. Descriptions of key aspects of the methodology were frequently missing. For example, the design of the study (prospective or retrospective) was reported in 6 of 29 studies, and details of participant sampling (e.g., consecutive or random selection) were described in 5 of 29 publications. The commonest description of diagnostic accuracy was sensitivity and specificity (25/29) followed by area under the ROC curve (13/29), with 9 of 29 publications reporting both. CONCLUSIONS. The quality of reporting of diagnostic accuracy tests for glaucoma with HRT is suboptimal. The STARD initiative may be a useful tool for appraising the strengths and weaknesses of diagnostic accuracy studies. Copyright © Association for Research in Vision and Ophthalmology.
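The inter-observer statistics quoted above (83% raw agreement, κ = 0.76) combine simple percent agreement with Cohen's chance-corrected kappa. A minimal sketch of both, applied to hypothetical per-item ratings rather than the study's data:

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Fraction of items on which two reviewers gave the same rating."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's kappa: observed agreement corrected for the agreement
    expected by chance, given each rater's marginal rating frequencies."""
    n = len(r1)
    po = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[cat] * c2[cat] for cat in set(c1) | set(c2)) / n ** 2
    return (po - pe) / (1 - pe)
```

Kappa is always lower than raw agreement whenever chance agreement is nonzero, which is why studies report both figures.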

Relevance: 30.00%

Abstract:

Objective: To evaluate the quality of reporting of diagnostic accuracy studies using optical coherence tomography (OCT) in glaucoma. Design: Descriptive series of published studies. Participants: Published studies reporting a measure of the diagnostic accuracy of OCT for glaucoma. Methods: Review of English-language papers reporting measures of diagnostic accuracy of OCT for glaucoma. Papers were identified from a Medline literature search performed in June 2006. Articles were appraised using the 25 items provided by the Standards for Reporting of Diagnostic Accuracy (STARD) initiative. Each item was recorded as fully, partially, or not reported. Main Outcome Measures: Degree of compliance with the STARD guidelines. Results: Thirty papers were appraised. Eight papers (26.7%) fully reported more than half of the STARD items. The lowest number of fully reported items in a study was 5 and the highest was 17. Descriptions of key aspects of methodology frequently were missing. For example, details of participant sampling (e.g., consecutive or random selection) were described in only 8 (26.7%) of 30 publications. Measures of statistical uncertainty were reported in 18 (60%) of 30 publications. No single STARD item was fully reported by all the papers. Conclusions: The standard of reporting of diagnostic accuracy studies in glaucoma using OCT was suboptimal. It is hoped that adoption of the STARD guidelines will lead to an improvement in reporting of diagnostic accuracy studies, enabling clearer evidence to be produced for the usefulness of OCT for the diagnosis of glaucoma. © 2007 American Academy of Ophthalmology.

Relevance: 30.00%

Abstract:

Aim: To assess the sample sizes used in studies on diagnostic accuracy in ophthalmology. Design and sources: A survey of literature published in 2005. Methods: The frequency with which sample-size calculations were reported, and the sample sizes used, were extracted from the published literature. A manual search of five leading clinical journals in ophthalmology with the highest impact (Investigative Ophthalmology and Visual Science, Ophthalmology, Archives of Ophthalmology, American Journal of Ophthalmology and British Journal of Ophthalmology) was conducted by two independent investigators. Results: A total of 1698 articles were identified, of which 40 studies were on diagnostic accuracy. One study reported that sample size was calculated before initiating the study. Another study reported consideration of sample size without calculation. The mean (SD) sample size of all diagnostic studies was 172.6 (218.9). The median prevalence of the target condition was 50.5%. Conclusion: Only a few studies consider sample size in their methods. Inadequate sample sizes in diagnostic accuracy studies may result in misleading estimates of test accuracy. An improvement over the current standards on the design and reporting of diagnostic studies is warranted.
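One widely cited way to perform the missing calculation is Buderer's formula, which sizes a diagnostic accuracy study so that sensitivity and specificity are each estimated to a chosen precision, inflating for the prevalence of the target condition. The sketch below assumes that approach; the function name and the example parameter values are illustrative, not taken from the survey.

```python
import math

def diag_sample_size(sens, spec, prev, precision=0.05, z=1.96):
    """Buderer-style sample size for a diagnostic accuracy study.

    precision: desired half-width of the 95% CI around sensitivity and
    specificity; prev: expected prevalence of the target condition.
    Returns the larger of the two requirements, rounded up."""
    n_sens = z ** 2 * sens * (1 - sens) / precision ** 2 / prev
    n_spec = z ** 2 * spec * (1 - spec) / precision ** 2 / (1 - prev)
    return math.ceil(max(n_sens, n_spec))
```

Note how a rarer target condition drives the requirement up: the sensitivity estimate only draws on diseased participants, so low prevalence inflates the total sample needed.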

Relevance: 30.00%

Abstract:

Elevated intraocular pressure (IOP) is a major risk factor for the deterioration of open-angle glaucoma (OAG); medical IOP reduction is the standard treatment, yet no randomized placebo-controlled study of medical IOP reduction has been undertaken previously. The United Kingdom Glaucoma Treatment Study (UKGTS) tests the hypothesis that treatment with a topical prostaglandin analog, compared with placebo, reduces the frequency of visual field (VF) deterioration events in OAG patients by 50% over a 2-year period.

Relevance: 30.00%

Abstract:

This paper discusses the application of the Taguchi experimental design approach to optimizing the key process parameters for micro-welding of thin AISI 316L foil using a 100 W CW fibre laser. An L16 Taguchi experiment was conducted to systematically understand how power, scanning velocity, focus position, gas flow rate, and type of shielding gas affect the bead dimensions. The welds produced in the L16 Taguchi experiment were mainly of austenite cellular-dendrite structure with an average grain size of 5 µm. An exact penetration weld with the largest penetration-to-fusion-width ratio was obtained. Among the process parameters, the interaction between power and scanning velocity had the strongest effect on the penetration-to-fusion-width ratio, and power was found to be the predominant factor, driving interactions with the other factors that appreciably affect the bead dimensions.
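The main-effects bookkeeping behind statements like "power was the predominant factor" can be sketched generically: average the response at each level of each factor, then rank factors by the range (delta) of their level means. The code below is an illustrative sketch using a tiny L4-style example rather than the paper's L16 array, and all names in it are assumptions.

```python
from collections import defaultdict

def main_effects(array, responses):
    """Mean response at each (factor, level): the Taguchi main-effects table.
    array[i][j] is the level of factor j used in experimental run i."""
    buckets = defaultdict(list)
    for run, y in zip(array, responses):
        for factor, level in enumerate(run):
            buckets[(factor, level)].append(y)
    return {key: sum(v) / len(v) for key, v in buckets.items()}

def factor_ranking(effects, n_factors):
    """Rank factors by the range (delta) of their level means: the larger
    the range, the stronger that factor's influence on the response."""
    delta = {}
    for f in range(n_factors):
        means = [m for (factor, _), m in effects.items() if factor == f]
        delta[f] = max(means) - min(means)
    return sorted(delta, key=delta.get, reverse=True)
```

Because orthogonal arrays balance the levels of every other factor within each level of the factor under study, these per-level means isolate each factor's main effect without running the full factorial.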

Relevance: 30.00%

Abstract:

The agile model of software development has been mainstream for several years, and is now in a phase where its principles and practices are maturing. The purpose of this paper is to describe the results of an industry survey aimed at understanding how maturation is progressing. The survey was taken across 40 software development companies in Northern Ireland at the beginning of 2012. The paper describes the design of the survey and examines maturity by comparing the results obtained in 2012 with those from a study of agile adoption in the same region in 2010. Both surveys aimed to achieve comprehensive coverage of a single area rather than rely on a voluntary sample. The main outcome from the work is a collection of ‘insights’ into the nature and practice of agile development, the main two of which are reported in this paper.

Relevance: 30.00%

Abstract:

Nonconsumptive or trait-mediated effects of predators on their prey often outweigh density-mediated interactions where predators consume prey. For instance, predator presence can alter prey behaviour, physiology, morphology and/or development. Despite a burgeoning literature, our ability to identify general patterns in prey behavioural responses may be influenced by the inconsistent methodologies of predator cue experiments used to assess trait-mediated effects. We therefore conducted a meta-analysis to highlight variables (e.g. water type, predator husbandry, exposure time) that may influence invertebrate prey's behavioural responses to fish predator cues. This revealed that changes in prey activity and refuge use were remarkably consistent overall, despite wide differences in experimental methodologies. Our meta-analysis shows that invertebrates altered their behaviour to predator cues of both fish that were fed the focal invertebrate and those that were fed other prey types, which suggests that invertebrates were not responding to specific diet information in the fish cues. Invertebrates also altered their behaviour regardless of predator cue addition regimes and fish satiation levels. Cue intensity and exposure time did not have significant effects on invertebrate behaviour. We also highlight that potentially confounding factors, such as parasitism, were rarely recorded in sufficient detail to assess the magnitude of their effects. By examining the likelihood of detecting trait-mediated effects under large variations in experimental design, our study demonstrates that trait-mediated effects are likely to have pervasive and powerful influences in nature.
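Meta-analyses of this kind typically express each experiment's behavioural response as a standardized effect size before pooling across studies. As a hedged illustration (the review's actual effect-size metric is not stated in this abstract), here is Hedges' g with its small-sample bias correction:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference between treatment (e.g. predator-cue)
    and control groups, with Hedges' small-sample bias correction."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                          / (n1 + n2 - 2))
    d = (m1 - m2) / pooled_sd          # Cohen's d
    correction = 1 - 3 / (4 * (n1 + n2) - 9)
    return correction * d
```

Dividing by the pooled standard deviation puts activity, refuge use, and other behavioural measures on a common scale, which is what lets effects be compared across experiments with very different methodologies.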

Relevance: 30.00%

Abstract:

The design of hot-rolled steel portal frames can be sensitive to serviceability deflection limits. In such cases, in order to reduce frame deflections, practitioners increase the size of the eaves haunch and/or the sizes of the steel sections used for the column and rafter members of the frame. This paper investigates the effect of such deflection limits using a real-coded niching genetic algorithm (RC-NGA) that optimizes frame weight, taking into account both ultimate and serviceability limit states. The results show that the proposed GA is efficient and reliable. Two different sets of serviceability deflection limits are then considered: the deflection limits recommended by the Steel Construction Institute (SCI), which are based on control of differential deflections, and other deflection limits based on suggestions by industry. Parametric studies are carried out on frames with spans ranging from 15 m to 50 m and column heights from 5 m to 10 m. It is demonstrated that for a 50 m span frame, use of the SCI-recommended deflection limits can lead to frame weights around twice as heavy as those of designs without these limits.
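To show the shape of such an optimizer, here is a bare-bones real-coded GA (tournament selection, blend crossover, Gaussian mutation, elitism). It deliberately omits the niching mechanism and all structural design checks of the RC-NGA above; everything in it is an illustrative assumption, demonstrated on a toy minimization rather than on frame weight with deflection constraints.

```python
import random

def rc_ga(fitness, bounds, pop_size=40, gens=100, seed=1):
    """Minimize `fitness` over a box defined by `bounds` (list of (lo, hi))."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]

    def clip(x):
        return [min(max(v, lo), hi) for v, (lo, hi) in zip(x, bounds)]

    for _ in range(gens):
        ranked = sorted(pop, key=fitness)
        children = ranked[:2]                       # elitism: keep the best two
        while len(children) < pop_size:
            # tournament selection of two parents
            a = min(rng.sample(ranked, 3), key=fitness)
            b = min(rng.sample(ranked, 3), key=fitness)
            alpha = rng.random()                    # blend crossover
            child = [alpha * x + (1 - alpha) * y for x, y in zip(a, b)]
            child = [v + rng.gauss(0, 0.05 * (hi - lo))  # Gaussian mutation
                     if rng.random() < 0.2 else v
                     for v, (lo, hi) in zip(child, bounds)]
            children.append(clip(child))
        pop = children
    return min(pop, key=fitness)
```

In a frame-design setting, `fitness` would return the frame weight plus large penalty terms for any violated ultimate or serviceability (deflection) limit, so infeasible designs are driven out of the population.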

Relevance: 30.00%

Abstract:

PURPOSE. To investigate the methods used in contemporary ophthalmic literature to designate visual acuity (VA). METHODS. Papers in all 2005 editions of five ophthalmic journals were considered. Papers were included if (1) VA, vision, or visual function was mentioned in the abstract and (2) the study involved age-related macular degeneration, cataract, or refractive surgery. If a paper was selected on the basis of its abstract, the full text of the paper was examined for information on the method of refractive correction during VA testing, type of chart used to measure VA, specifics concerning chart features, testing protocols, data analysis, and means of expressing VA in results. RESULTS. One hundred twenty-eight papers were included. The most common type of charts used were described as logMAR-based. Although most (89.8%) of the studies reported on the method of refractive correction during VA testing, only 58.6% gave the chart design, and less than 12% gave any information whatsoever on chart features or measurement procedures used. CONCLUSIONS. The methods used and the approach to analysis were rarely described in sufficient detail to allow others to replicate the study being reported. Sufficient detail should be given on VA measurement to enable others to duplicate the research. The authors suggest that charts adhering to Bailey-Lovie design principles always be used to measure vision in prospective studies and that their use be encouraged in clinical settings. The distinction between the terms logMAR, an acuity notation, and Bailey-Lovie or ETDRS as chart types should be adhered to more strictly. Copyright © Association for Research in Vision and Ophthalmology.
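The logMAR notation the authors distinguish from chart type is simply the base-10 logarithm of the minimum angle of resolution (MAR), which for a Snellen fraction is the fraction inverted. A one-line conversion helper (illustrative, not from the paper):

```python
import math

def logmar_from_snellen(test_distance, letter_distance):
    """logMAR = log10(MAR), where MAR is the inverse Snellen fraction:
    6/6 (or 20/20) gives logMAR = 0.0, and 6/60 gives logMAR = 1.0."""
    return math.log10(letter_distance / test_distance)
```

This is why "logMAR" names an acuity notation rather than a chart: any chart's result can be expressed on this scale, even though Bailey-Lovie/ETDRS charts are designed so that lines step in equal 0.1 logMAR increments.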