78 results for Design studies


Relevance:

30.00%

Publisher:

Abstract:

Non-linear large-displacement elasto-plastic finite element analyses are used to propose design recommendations for the eaves bracket of a cold-formed steel portal frame. Owing to the thinness of the sheet steel used for the brackets, such a structural design problem is not trivial as the brackets need to be designed against failure through buckling; without availability of the finite element method, expensive laboratory testing would therefore be required. In this paper, the finite element method is firstly used to predict the plastic moment capacity of the eaves bracket. Parametric studies are then used to propose design recommendations for the eaves bracket against two potential buckling modes of failure:

Relevance:

30.00%

Publisher:

Abstract:

The requirement for the use of Virtual Engineering, encompassing the construction of Virtual Prototypes using Multidisciplinary Design Optimisation, for the development of future aerospace platforms and systems is discussed. Some of the activities at the Virtual Engineering Centre, a University of Liverpool initiative, are described, and a number of case studies involving a range of applications of Virtual Engineering are illustrated.

Relevance:

30.00%

Publisher:

Abstract:

People are now becoming more environmentally aware, and as a consequence industries such as the aviation industry are striving to design more environmentally friendly products. To achieve this, current design methodologies must be modified to ensure these issues are considered from product conception through to disposal. This paper discusses the environmental problems facing the aviation industry and sets out the rationale for moving from the traditional Systems Engineering approach to the recent design paradigm known as Value Driven Design. Preliminary studies have been undertaken to aid in the understanding of this methodology and the existing surplus value objective function. The main results from the work demonstrate that surplus value works well, bringing disparate issues such as manufacture and green taxes together to aid decision making. Further, studies on surplus value to date have used simple sensitivity analysis, but deeper consideration shows non-linear interactions between some of the variables, and further work will be needed to fully account for complex issues such as environmental impact and taxes.

Relevance:

30.00%

Publisher:

Abstract:

2'-Beta-D-arabinouridine (AraU), the uridine analogue of the anticancer agent AraC, was synthesized and evaluated for antiviral activity and cytotoxicity. In addition, a series of AraU monophosphate prodrugs in the form of triester phosphoramidates (ProTides) were also synthesized and tested against a range of viruses, leukaemia and solid tumour cell lines. Unfortunately, neither the parent compound (AraU) nor any of its ProTides showed antiviral activity, nor potent inhibitory activity against any of the cancer cell lines. Therefore, the metabolism of AraU phosphoramidates to release AraU monophosphate was investigated. The results showed carboxypeptidase Y, hog liver esterase and crude CEM tumour cell extracts to hydrolyse the ester motif of the phosphoramidates with subsequent loss of the aryl group, while molecular modelling studies suggested that the AraU L-alanine aminoacyl phosphate derivative might not be a good substrate for the phosphoramidase enzyme Hint-1. These findings are in agreement with the observed disappearance of intact prodrug and concomitant appearance of the corresponding phosphoramidate intermediate derivative in CEM cell extracts without measurable formation of AraU monophosphate. These findings may explain the poor antiviral/cytostatic potential of the prodrugs.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To investigate whether breast-feeding, or exclusive breast-feeding, is associated with a reduced risk of type 1 diabetes in children, by performing a pooled analysis with adjustment for recognized confounders.
RESEARCH DESIGN AND METHODS: Relevant studies were identified from literature searches using MEDLINE, Web of Science, and EMBASE. Authors of relevant studies were asked to provide individual participant data or conduct prespecified analyses. Meta-analysis techniques were used to combine odds ratios (ORs) and investigate heterogeneity between studies.
RESULTS: Data were available from 43 studies including 9,874 patients with type 1 diabetes. Overall, there was a reduction in the risk of diabetes after exclusive breast-feeding for >2 weeks (20 studies; OR = 0.75, 95% CI 0.64-0.88), the association after exclusive breast-feeding for >3 months was weaker (30 studies; OR = 0.87, 95% CI 0.75-1.00), and no association was observed after (nonexclusive) breast-feeding for >2 weeks (28 studies; OR = 0.93, 95% CI 0.81-1.07) or >3 months (29 studies; OR = 0.88, 95% CI 0.78-1.00). These associations were all subject to marked heterogeneity (I² = 58%, 76%, 54%, and 68%, respectively). In studies with lower risk of bias, the reduced risk after exclusive breast-feeding for >2 weeks remained (12 studies; OR = 0.86, 95% CI 0.75-0.99), and heterogeneity was reduced (I² = 0%). Adjustments for potential confounders altered these estimates very little.
CONCLUSIONS: The pooled analysis suggests weak protective associations between exclusive breast-feeding and type 1 diabetes risk. However, these findings are difficult to interpret because of the marked variation in effect and possible biases (particularly recall bias) inherent in the included studies.
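The pooled ORs quoted above come from standard meta-analytic weighting of study-level effects. As a minimal sketch, using hypothetical study data rather than this review's individual-participant dataset, fixed-effect inverse-variance pooling of odds ratios looks like this:

```python
import math

def pooled_or_fixed(studies):
    """Fixed-effect inverse-variance pooling of odds ratios.

    Each entry is (OR, lower 95% CI, upper 95% CI). The standard error of
    log(OR) is recovered from the CI width: SE = (ln(hi) - ln(lo)) / (2 * 1.96).
    """
    num = den = 0.0
    for or_, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2                    # weight = inverse variance
        num += w * math.log(or_)
        den += w
    log_or = num / den
    se_pooled = math.sqrt(1.0 / den)
    ci = (math.exp(log_or - 1.96 * se_pooled),
          math.exp(log_or + 1.96 * se_pooled))
    return math.exp(log_or), ci

# Hypothetical study results (OR, 95% CI) -- not the data from this review
studies = [(0.75, 0.60, 0.94), (0.90, 0.70, 1.16), (0.80, 0.62, 1.03)]
or_pooled, (ci_lo, ci_hi) = pooled_or_fixed(studies)
```

A fixed-effect model is the simplest case; with the marked heterogeneity reported above (I² up to 76%), a random-effects model would normally be preferred.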

Relevance:

30.00%

Publisher:

Abstract:

The use of dataflow digital signal processing system modelling and synthesis techniques has been a fruitful research theme for many years and has yielded many powerful rapid system synthesis and optimisation capabilities. However, recent years have seen the spectrum of languages and techniques splinter in an application-specific manner, resulting in an ad-hoc design process which is increasingly dependent on the particular application under development. This poses a major problem for automated toolflows attempting to provide rapid system synthesis for a wide range of applications. By analysing a number of dataflow FPGA implementation case studies, this paper shows that despite this, common traits may be found in current techniques, which fall largely into three classes. Further, it exposes limitations pertaining to their ability to adapt algorithm models to implementations for different operating environments and target platforms.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: Laypersons are poor at emergency pulse checks (sensitivity 84%, specificity 36%). Guidelines indicate that pulse checks should not be performed. The impedance cardiogram (dZ/dt) is used to assess stroke volume. Can a novel defibrillator-based impedance cardiogram system be used to distinguish between circulatory arrest and other collapse states?

DESIGN: Animal study.

SETTING: University research laboratory.

SUBJECTS: Twenty anesthetized, mechanically ventilated pigs, weight 50-55 kg.

INTERVENTIONS: Stroke volume was altered by right ventricular pacing (160, 210, 260, and 305 beats/min). Cardiac arrest states were then induced: ventricular fibrillation (by rapid ventricular pacing) and, after successful defibrillation, pulseless electrical activity and asystole (by high-dose intravenous pentobarbitone).

MEASUREMENTS AND MAIN RESULTS: The impedance cardiogram was recorded through electrocardiogram/defibrillator pads in standard cardiac arrest positions. Simultaneously recorded electro- and impedance cardiogram (dZ/dt) along with arterial blood pressure tracings were digitized during each pacing and cardiac arrest protocol. Five-second epochs were analyzed for sinus rhythm (20 before ventricular fibrillation, 20 after successful defibrillation), ventricular fibrillation (40), pulseless electrical activity (20), and asystole (20), in two sets of ten pigs (ten training, ten validation). Standard impedance cardiogram variables were noncontributory in cardiac arrest, so the fast Fourier transform of dZ/dt was assessed. During ventricular pacing, the peak amplitude of the fast Fourier transform of dZ/dt (between 1.5 and 4.5 Hz) correlated with stroke volume (r² = .3, p < .001). In cardiac arrest, a peak amplitude of the fast Fourier transform of dZ/dt of ≤ 4 dB·ohm·rms indicated no output with high sensitivity (94% training set, 86% validation set) and specificity (98% training set, 90% validation set).

CONCLUSIONS: As a powerful clinical marker of circulatory collapse, the fast Fourier transformation of dZ/dt (impedance cardiogram) has the potential to improve emergency care by laypersons using automated defibrillators.
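The core measurement (the peak FFT amplitude of dZ/dt in the 1.5-4.5 Hz band over a 5-second epoch) can be sketched as follows. The signals, sampling rate and magnitudes here are synthetic and illustrative, not calibrated to the paper's dB·ohm·rms threshold:

```python
import numpy as np

def fft_peak_amplitude(dzdt, fs, band=(1.5, 4.5)):
    """Peak FFT magnitude of a dZ/dt epoch within a frequency band (Hz)."""
    n = len(dzdt)
    # Hann window to reduce spectral leakage, normalised magnitude spectrum
    spectrum = np.abs(np.fft.rfft(dzdt * np.hanning(n))) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].max()

# Synthetic 5-second epochs sampled at 100 Hz
fs = 100.0
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(0)
pulsatile = np.sin(2 * np.pi * 2.0 * t)        # circulation at ~120 beats/min
flatline = 0.01 * rng.standard_normal(t.size)  # arrest: low-level noise only

peak_pulse = fft_peak_amplitude(pulsatile, fs)
peak_flat = fft_peak_amplitude(flatline, fs)
```

A pulsatile epoch produces a spectral peak far above the arrest epoch's noise floor, which is the separation the defibrillator-based classifier exploits.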

Relevance:

30.00%

Publisher:

Abstract:

A methodology which allows a non-specialist to rapidly design silicon wavelet transform cores has been developed. This methodology is based on a generic architecture utilizing time-interleaved coefficients for the wavelet transform filters. The architecture is scalable and has been parameterized in terms of wavelet family, wavelet type, data word length and coefficient word length. The control circuit is designed in such a way that the cores can also be cascaded without any interface glue logic for any desired level of decomposition. This parameterization allows the use of any orthonormal wavelet family, thereby extending the design space for improved transformation from algorithm to silicon. Case studies for stand-alone and cascaded silicon cores, for single- and multi-stage analysis respectively, are reported. The typical design time to produce silicon layout of a wavelet-based system has been reduced by an order of magnitude. The cores are comparable in area and performance to hand-crafted designs. The designs have been captured in VHDL so they are portable across a range of foundries and are also applicable to FPGA and PLD implementations.
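The cores themselves are captured in VHDL; purely to illustrate the cascading idea (each analysis stage feeding its approximation output to the next, just as the cores cascade without glue logic), here is a hypothetical software sketch using an unnormalised Haar wavelet:

```python
def haar_step(x):
    """One analysis stage of an unnormalised Haar wavelet transform:
    pairwise averages (approximation) and half-differences (detail)."""
    approx = [(a + b) / 2 for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) / 2 for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

def haar_dwt(x, levels):
    """Multi-stage analysis by cascading stages: each level consumes the
    previous level's approximation, mirroring the cascaded cores."""
    details = []
    approx = list(x)
    for _ in range(levels):
        approx, d = haar_step(approx)
        details.append(d)
    return approx, details

approx, details = haar_dwt([4, 6, 10, 12, 8, 6, 5, 5], levels=3)
```

After three levels the eight input samples reduce to one approximation coefficient plus detail bands of lengths 4, 2 and 1, the same structure a three-stage cascade of the silicon cores would compute.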

Relevance:

30.00%

Publisher:

Abstract:

In 2004, the integrated European project GEHA (Genetics of Healthy Ageing) was initiated with the aim of identifying genes involved in healthy ageing and longevity. The first step in the project was the recruitment of more than 2500 pairs of siblings aged 90 years or more, together with one younger control person, from 15 areas in 11 European countries through a coordinated and standardised effort. A biological sample, preferably a blood sample, was collected from each participant, and basic physical and cognitive measures were obtained together with information about health, lifestyle, and family composition. From 2004 to 2008 a total of 2535 families comprising 5319 nonagenarian siblings were identified and included in the project. In addition, 2548 younger control persons aged 50-75 years were recruited. A total of 2249 complete trios with blood samples from at least two old siblings and the younger control were formed and are available for genetic analyses (e.g. linkage studies and genome-wide association studies). Mortality follow-up improves the possibility of identifying families with the most extreme longevity phenotypes. With a mean follow-up time of 3.7 years, the number of families with all participating siblings aged 95 years or more has increased fivefold, to 750 families, compared with when the interviews were conducted. Thus, the GEHA project represents a unique resource in the search for genes related to healthy ageing and longevity.

Relevance:

30.00%

Publisher:

Abstract:

Architects use cycle-by-cycle simulation to evaluate design choices and understand tradeoffs and interactions among design parameters. Efficiently exploring exponential-size design spaces with many interacting parameters remains an open problem: the sheer number of experiments renders detailed simulation intractable. We attack this problem via an automated approach that builds accurate, confident predictive design-space models. We simulate sampled points, using the results to teach our models the function describing relationships among design parameters. The models produce highly accurate performance estimates for other points in the space, can be queried to predict performance impacts of architectural changes, and are very fast compared to simulation, enabling efficient discovery of tradeoffs among parameters in different regions. We validate our approach via sensitivity studies on memory hierarchy and CPU design spaces: our models generally predict IPC with only 1-2% error and reduce required simulation by two orders of magnitude. We also show the efficacy of our technique for exploring chip multiprocessor (CMP) design spaces: when trained on a 1% sample drawn from a CMP design space with 250K points and up to 55x performance swings among different system configurations, our models predict performance with only 4-5% error on average. Our approach combines with techniques to reduce time per simulation, achieving net time savings of three to four orders of magnitude. Copyright © 2006 ACM.
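The workflow described (simulate a sample of design points, fit a model, then predict the rest of the space) can be sketched with a synthetic design space. This is a minimal analogue under assumed names: the paper trains neural-network models on large real spaces, whereas this sketch uses a quadratic least-squares surrogate on an invented two-parameter "IPC" function:

```python
import numpy as np

# Synthetic two-parameter design space: (cache size, issue width) -> IPC.
# simulate() stands in for the cycle-accurate simulator; the surrogate model
# never sees its formula, only sampled input/output pairs.
def simulate(p):
    cache, width = p[..., 0], p[..., 1]
    return 0.5 + 0.3 * np.log2(cache) + 0.2 * width - 0.05 * width ** 2

grid = np.array([(c, w) for c in (1, 2, 4, 8, 16) for w in (1, 2, 3, 4)],
                dtype=float)

# "Simulate" only a small sample of the 20-point space, then fit on it
train_idx = [0, 3, 5, 6, 9, 10, 15, 19]
X, y = grid[train_idx], simulate(grid[train_idx])

def features(p):
    c, w = p[..., 0], p[..., 1]
    return np.stack([np.ones_like(c), np.log2(c), w, w ** 2], axis=-1)

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# Query the surrogate across the full space without further simulation
pred = features(grid) @ coef
rel_err = np.abs(pred - simulate(grid)) / simulate(grid)
```

Because the quadratic basis happens to match the synthetic function exactly, the surrogate is near-exact here; on real spaces with neural networks the paper reports 1-2% IPC error from a comparably small sample.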

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Overuse of unnecessary medications in frail older adults with limited life expectancy remains an understudied challenge. OBJECTIVE: To identify intervention studies that reduced use of unnecessary medications in frail older adults. A secondary goal was to identify and review studies focusing on patients approaching end of life. We examined criteria for identifying unnecessary medications, intervention processes for medication reduction, and intervention effectiveness. METHODS: A systematic review of English articles using MEDLINE, EMBASE, and International Pharmaceutical Abstracts from January 1966 to September 2012. Additional studies were identified by searching bibliographies. Search terms included prescription drugs, drug utilization, hospice or palliative care, and appropriate or inappropriate. A manual review of 971 identified abstracts for the inclusion criteria (study included an intervention to reduce chronic medication use; at least 5 participants; population included patients aged at least 65 years, hospice enrollment, or indication of frailty or risk of functional decline-including assisted living or nursing home residence, inpatient hospitalization) yielded 60 articles for full review by 3 investigators. After exclusion of review articles, interventions targeting acute medications, or studies exclusively in the intensive care unit, 36 articles were retained (including 13 identified by bibliography review). Articles were extracted for study design, study setting, intervention description, criteria for identifying unnecessary medication use, and intervention outcomes. RESULTS: The studies included 15 randomized controlled trials, 4 non-randomized trials, 6 pre-post studies, and 11 case series. Control groups were used in over half of the studies (n = 20). 
Study populations varied and included residents of nursing homes and assisted living facilities (n = 16), hospitalized patients (n = 14), hospice/palliative care patients (n = 3), home care patients (n = 2), and frail or disabled community-dwelling patients (n = 1). The majority of studies (n = 21) used implicit criteria to identify unnecessary medications (including drugs without indication, unnecessary duplication, and lack of effectiveness); only one study incorporated patient preference into prescribing criteria. Most (25) interventions were led by or involved pharmacists, 4 used academic detailing, 2 used audit and feedback reports targeting prescribers, and 5 involved physician-led medication reviews. Overall intervention effect sizes could not be determined due to heterogeneity of study designs, samples, and measures. CONCLUSIONS: Very little rigorous research has been conducted on reducing unnecessary medications in frail older adults or patients approaching end of life.

Relevance:

30.00%

Publisher:

Abstract:

Efficiently exploring exponential-size architectural design spaces with many interacting parameters remains an open problem: the sheer number of experiments required renders detailed simulation intractable. We attack this via an automated approach that builds accurate predictive models. We simulate sampled points, using the results to teach our models the function describing relationships among design parameters. The models can be queried and are very fast, enabling efficient design tradeoff discovery. We validate our approach via two uniprocessor sensitivity studies, predicting IPC with only 1-2% error. In an experimental study using the approach, training on 1% of a 250-K-point CMP design space allows our models to predict performance with only 4-5% error. Our predictive modeling combines well with techniques that reduce the time taken by each simulation experiment, achieving net time savings of three to four orders of magnitude.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE. Scanning laser tomography with the Heidelberg retina tomograph (HRT; Heidelberg Engineering, Heidelberg, Germany) has been proposed as a useful diagnostic test for glaucoma. This study was conducted to evaluate the quality of reporting of published studies using the HRT for diagnosing glaucoma. METHODS. A validated Medline and hand search of English-language articles reporting on measures of diagnostic accuracy of the HRT for glaucoma was performed. Two reviewers selected and appraised the papers independently. The Standards for Reporting of Diagnostic Accuracy (STARD) checklist was used to evaluate the quality of each publication. RESULTS. A total of 29 articles were included. Interobserver rating agreement was observed in 83% of items (κ = 0.76). The number of STARD items properly reported ranged from 5 to 18. Less than a third of studies (7/29) explicitly reported more than half of the STARD items. Descriptions of key aspects of the methodology were frequently missing. For example, the design of the study (prospective or retrospective) was reported in 6 of 29 studies, and details of participant sampling (e.g., consecutive or random selection) were described in 5 of 29 publications. The commonest description of diagnostic accuracy was sensitivity and specificity (25/29) followed by area under the ROC curve (13/29), with 9 of 29 publications reporting both. CONCLUSIONS. The quality of reporting of diagnostic accuracy tests for glaucoma with HRT is suboptimal. The STARD initiative may be a useful tool for appraising the strengths and weaknesses of diagnostic accuracy studies. Copyright © Association for Research in Vision and Ophthalmology.

Relevance:

30.00%

Publisher:

Abstract:

Objective: To evaluate the quality of reporting of diagnostic accuracy studies using optical coherence tomography (OCT) in glaucoma. Design: Descriptive series of published studies. Participants: Published studies reporting a measure of the diagnostic accuracy of OCT for glaucoma. Methods: Review of English language papers reporting measures of diagnostic accuracy of OCT for glaucoma. Papers were identified from a Medline literature search performed in June 2006. Articles were appraised using the 25 items provided by the Standards for Reporting of Diagnostic Accuracy (STARD) initiative. Each item was recorded as full, partially, or not reported. Main Outcome Measures: Degree of compliance with the STARD guidelines. Results: Thirty papers were appraised. Eight papers (26.7%) fully reported more than half of the STARD items. The lowest number of fully reported items in a study was 5 and the highest was 17. Descriptions of key aspects of methodology frequently were missing. For example, details of participant sampling (e.g., consecutive or random selection) were described in only 8 (26.7%) of 30 publications. Measures of statistical uncertainty were reported in 18 (60%) of 30 publications. No single STARD item was fully reported by all the papers. Conclusions: The standard of reporting of diagnostic accuracy studies in glaucoma using OCT was suboptimal. It is hoped that adoption of the STARD guidelines will lead to an improvement in reporting of diagnostic accuracy studies, enabling clearer evidence to be produced for the usefulness of OCT for the diagnosis of glaucoma. © 2007 American Academy of Ophthalmology.

Relevance:

30.00%

Publisher:

Abstract:

Aim: To assess the sample sizes used in studies on diagnostic accuracy in ophthalmology. Design and sources: A survey of literature published in 2005. Methods: The frequency with which sample-size calculations were reported, and the sample sizes used, were extracted from the published literature. A manual search of the five leading clinical journals in ophthalmology with the highest impact (Investigative Ophthalmology and Visual Science, Ophthalmology, Archives of Ophthalmology, American Journal of Ophthalmology and British Journal of Ophthalmology) was conducted by two independent investigators. Results: A total of 1698 articles were identified, of which 40 studies were on diagnostic accuracy. One study reported that sample size was calculated before initiating the study. Another study reported consideration of sample size without calculation. The mean (SD) sample size of all diagnostic studies was 172.6 (218.9). The median prevalence of the target condition was 50.5%. Conclusion: Only a few studies consider sample size in their methods. Inadequate sample sizes in diagnostic accuracy studies may result in misleading estimates of test accuracy. An improvement over the current standards on the design and reporting of diagnostic studies is warranted.
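A sample-size calculation of the kind the surveyed studies mostly omitted can be sketched with the standard normal-approximation formula for estimating sensitivity to a given confidence-interval half-width. This is an illustrative textbook formula with hypothetical inputs, not one prescribed by the article:

```python
import math

def diagnostic_sample_size(expected_sens, precision, prevalence, z=1.96):
    """Total subjects needed to estimate sensitivity to +/- `precision`.

    n_pos = z^2 * p(1-p) / d^2 gives the required number of diseased
    subjects; dividing by prevalence inflates this to the total number
    of subjects that must be recruited.
    """
    n_pos = (z ** 2) * expected_sens * (1 - expected_sens) / precision ** 2
    return math.ceil(n_pos / prevalence)

# e.g. expected sensitivity 0.85 estimated to +/-0.05 at a target-condition
# prevalence of 50% (close to the median prevalence reported above)
n = diagnostic_sample_size(0.85, 0.05, 0.50)
```

With these hypothetical inputs the calculation calls for roughly 400 subjects, which is well above the mean sample size of 172.6 observed in the survey and illustrates why underpowered designs are a concern.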