139 results for 0.044-0.04 µm


Relevance:

90.00%

Publisher:

Abstract:

Stress analysis within carotid plaques based on in vivo MR imaging has been shown to be useful for the identification of vulnerable atheroma. This study investigated whether magnetic resonance imaging (MRI)-based biomechanical stress analysis of carotid plaques can differentiate acute symptomatic from asymptomatic patients. 54 asymptomatic and 45 acute symptomatic patients underwent in vivo multi-contrast MRI of the carotid arteries. Plaque geometry used for finite element analysis was derived from in vivo MR images at the sites of maximum and minimum plaque burden. In total, 198 slices were used for the computational simulations. A pre-shrink technique was used to refine the simulation. Maximum principal stress at the vulnerable plaque sites (i.e. critical stress) was extracted for the selected slices and compared between the two groups. Critical stress at the site of maximum plaque burden was significantly higher in acute symptomatic patients than in asymptomatic patients [median: 198.0 kPa (interquartile range (IQR) 119.8-359.0) vs. 138.4 kPa (IQR 83.8-242.6), p=0.04]. No significant difference was found at the minimum plaque burden site between the two groups [196.7 kPa (133.3-282.7) vs. 182.4 kPa (117.2-310.6), p=0.82]. Stress analysis at the site of maximum plaque burden can be used effectively to differentiate acute symptomatic carotid plaques from asymptomatic plaques, and may potentially inform the development of biomechanical risk stratification criteria based on plaque burden in future studies.
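The between-group comparison above is reported as a median with an interquartile range (IQR). As a minimal illustration of how such a summary is computed (the stress values below are hypothetical, not the study data):

```python
from statistics import median, quantiles

def median_iqr(values):
    """Return (median, 25th percentile, 75th percentile)."""
    q1, _, q3 = quantiles(values, n=4, method="inclusive")
    return median(values), q1, q3

# Hypothetical critical-stress samples in kPa (illustration only).
stresses_kpa = [119.8, 150.0, 198.0, 250.0, 359.0]
med, q1, q3 = median_iqr(stresses_kpa)
print(med, q1, q3)  # 198.0 150.0 250.0
```

Because stress distributions across plaques are typically skewed, a rank-based test (e.g. Mann-Whitney U) is the usual companion to such summaries; the abstract does not state which test was used.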


Background: Despite being the stiffest airway of the bronchial tree, the trachea undergoes significant deformation due to intrathoracic pressure during breathing. The mechanical properties of the trachea affect the flow in the airway and may contribute to the biological function of the lung. Method: A Fung-type strain energy density function was used to investigate the nonlinear mechanical behavior of tracheal cartilage. A bending test on pig tracheal cartilage was performed and a mathematical model for analyzing the deformation of tracheal cartilage was developed. The constants in the strain energy density function were determined by fitting the experimental data. Result: The experimental data show that tracheal cartilage is a nonlinear material displaying higher strength in compression than in tension. When the compression forces varied from -0.02 to -0.03 N and from -0.03 to -0.04 N, the deformation ratios were 11.03±2.18% and 7.27±1.59%, respectively. Both were much smaller than the deformation ratio (20.01±4.49%) under tension forces of 0.02 to 0.01 N. The Fung-type strain energy density function captures this nonlinear behavior very well, whereas the linear stress-strain relation cannot: it underestimates the stability of the trachea by exaggerating the displacement in compression. This study may improve our understanding of the nonlinear behavior of tracheal cartilage and may be useful for future studies of tracheal collapse under physiological and pathological conditions.
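A Fung-type strain energy density function gives an exponential stress-strain law that stiffens with strain, unlike a linear law. A one-dimensional sketch of the contrast (the constants c and a are illustrative, not the values fitted to the pig cartilage data):

```python
import math

def fung_stress(strain, c=1.0, a=5.0):
    """1-D Fung model: S = dW/dE for W = (c/2) * (exp(a * E**2) - 1)."""
    return c * a * strain * math.exp(a * strain**2)

def linear_stress(strain, k=5.0):
    """Linear law with the same small-strain stiffness c*a."""
    return k * strain

# The exponential law predicts a stiffer response at finite strain,
# i.e. less displacement for a given load than the linear law.
print(fung_stress(0.1) > linear_stress(0.1))  # True
```

This is the sense in which a linear fit "exaggerates the displacement": it cannot reproduce the progressive stiffening the exponential term provides.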


Arterial compliance has been shown to correlate well with overall cardiovascular outcome and may also be a potential risk factor for the development of atheromatous disease. This study assessed the utility of 2-D phase contrast magnetic resonance (MR) imaging with intra-sequence blood pressure measurement to determine carotid compliance and distensibility. 20 patients underwent 2-D phase contrast MR imaging as well as ultrasound-based wall-tracking measurements. Values for carotid compliance and distensibility were derived from the two modalities and compared. Linear regression analysis was used to determine the extent of correlation between MR- and ultrasound-derived parameters. For those variables that could be directly compared, an agreement analysis was undertaken. MR measures of compliance showed a good correlation with measures based on ultrasound wall tracking (r=0.61, 95% CI 0.34 to 0.81, p=0.0003). Vessels that had previously undergone carotid endarterectomy were significantly less compliant than either diseased or normal contralateral vessels (p=0.04). Agreement studies showed a relatively poor intra-class correlation coefficient (ICC) between diameter-based measures of compliance from either MR or ultrasound (ICC=0.14). MRI-based assessment of local carotid compliance appears to be both robust and technically feasible in most subjects. Measures of compliance correlate well with ultrasound-based values, and correlate best when cross-sectional area change is used rather than derived diameter changes. If validated by further, larger studies, 2-D phase contrast imaging with intra-sequence blood pressure monitoring and off-line radial artery tonometry may provide a useful tool in the further assessment of patients with carotid atheroma.
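Area-based compliance and distensibility have simple standard definitions; a sketch with hypothetical numbers (not patient data):

```python
def compliance(area_sys_mm2, area_dia_mm2, pulse_pressure_mmhg):
    """Compliance = change in cross-sectional area per unit pressure."""
    return (area_sys_mm2 - area_dia_mm2) / pulse_pressure_mmhg

def distensibility(area_sys_mm2, area_dia_mm2, pulse_pressure_mmhg):
    """Distensibility = fractional area change per unit pressure."""
    return compliance(area_sys_mm2, area_dia_mm2, pulse_pressure_mmhg) / area_dia_mm2

# Hypothetical systolic/diastolic lumen areas (mm^2) and pulse pressure (mmHg).
print(compliance(32.0, 28.0, 50.0))  # 0.08 mm^2/mmHg
```

Using the measured area change directly avoids the error introduced by squaring a derived diameter, which is consistent with the abstract's finding that area-based measures agreed best.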


Purpose Little is known about the prevalence of refractive error, binocular vision, and other visual conditions in Australian Indigenous children. This is important given the association of these visual conditions with reduced reading performance in the wider population, which may also contribute to the suboptimal reading performance reported in this population. The aim of this study was to develop a visual profile of Queensland Indigenous children. Methods Vision testing was performed on 595 primary schoolchildren in Queensland, Australia. Vision parameters measured included visual acuity, refractive error, color vision, nearpoint of convergence, horizontal heterophoria, fusional vergence range, accommodative facility, AC/A ratio, visual motor integration, and rapid automatized naming. Near heterophoria, nearpoint of convergence, and near fusional vergence range were used to classify convergence insufficiency (CI). Results Although refractive error (Indigenous, 10%; non-Indigenous, 16%; p = 0.04) and strabismus (Indigenous, 0%; non-Indigenous, 3%; p = 0.03) were significantly less common in Indigenous children, CI was twice as prevalent (Indigenous, 10%; non-Indigenous, 5%; p = 0.04). Reduced visual information processing skills were more common in Indigenous children (reduced visual motor integration [Indigenous, 28%; non-Indigenous, 16%; p < 0.01] and slower rapid automatized naming [Indigenous, 67%; non-Indigenous, 59%; p = 0.04]). The prevalence of visual impairment (reduced visual acuity) and color vision deficiency was similar between groups. Conclusions Indigenous children have less refractive error and strabismus than their non-Indigenous peers. However, CI and reduced visual information processing skills were more common in this group. Given that vision screenings primarily target visual acuity assessment and strabismus detection, this is an important finding as many Indigenous children with CI and reduced visual information processing may be missed. 
Emphasis should be placed on identifying children with CI and reduced visual information processing, given the potential effect of these conditions on school performance.
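Prevalence differences such as CI at 10% vs. 5% are commonly assessed with a two-proportion z-test. A sketch with hypothetical counts chosen only to echo the reported percentages (the study's actual tabulations and test are not given in the abstract):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # = 2 * (1 - Phi(|z|))
    return z, p_value

z, p = two_proportion_z(30, 300, 15, 295)  # hypothetical: 10% vs ~5%
print(round(z, 2), p < 0.05)  # z ≈ 2.27, significant at 0.05
```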


Purpose Transient changes in corneal topography associated with soft and conventional or reverse geometry rigid contact lens wear have been well documented; however, only a few studies have examined the influence of scleral contact lens wear upon the cornea. Therefore, in this study, we examined the influence of modern miniscleral contact lenses, which land entirely on the sclera and overlying tissues, upon anterior corneal curvature and optics. Methods Anterior corneal topography and elevation data were acquired using Scheimpflug imaging (Pentacam HR, Oculus) immediately prior to and following 8 hours of miniscleral contact lens wear in 15 young healthy adults (mean age 22 ± 3 years, 8 East Asian, 7 Caucasian) with normal corneae. Corneal diurnal variations were accounted for using data collected on a dedicated measurement day without contact lens wear. Corneal clearance was quantified using an optical coherence tomographer (RS-3000, Nidek) following lens insertion and after 8 hours of lens wear. Results Although corneal clearance was maintained throughout the 8 hour lens wear period, significant corneal flattening (up to 0.08 ± 0.04 mm) was observed, primarily in the superior mid-peripheral cornea, which resulted in a slight increase in against-the-rule corneal astigmatism (mean +0.02/-0.15 x 94 for an 8 mm diameter). Higher order aberration terms of horizontal coma, vertical coma and spherical aberration all underwent significant changes for an 8 mm corneal diameter (p ≤ 0.01), which typically resulted in a decrease in RMS error values (mean change in total higher order RMS -0.035 ± 0.046 µm for an 8 mm diameter). There was no association between the magnitude of change in central or mid-peripheral corneal clearance during lens wear and the observed changes in corneal curvature (p > 0.05). 
However, Asian participants displayed a significantly greater reduction in corneal clearance (p = 0.04) and greater superior-nasal corneal flattening compared to Caucasians (p = 0.048). Conclusions Miniscleral contact lenses that vault the cornea induce significant changes in anterior corneal surface topography and higher order aberrations following 8 hours of lens wear. The region of greatest corneal flattening was observed in the superior-nasal mid-periphery, more so in Asian participants. Practitioners should be aware that corneal measurements obtained following miniscleral lens removal may mask underlying corneal steepening.


Common diseases such as endometriosis (ED), Alzheimer's disease (AD) and multiple sclerosis (MS) account for a significant proportion of the health care burden in many countries. Genome-wide association studies (GWASs) for these diseases have identified a number of individual genetic variants contributing to disease risk. However, the effect size for most variants is small, and collectively the known variants explain only a small proportion of the estimated heritability. We used a linear mixed model to fit all single nucleotide polymorphisms (SNPs) simultaneously, and estimated genetic variances on the liability scale using SNPs from GWASs in unrelated individuals for these three diseases. For each of the three diseases, case and control samples were not all genotyped in the same laboratory. We demonstrate that a careful analysis can obtain robust estimates, but also that insufficient quality control (QC) of SNPs can lead to spurious results and that overly stringent QC is likely to remove real genetic signals. Our estimates show that common SNPs on commercially available genotyping chips capture significant variation contributing to liability for all three diseases. The estimated proportion of total variation tagged by all SNPs was 0.26 (SE 0.04) for ED, 0.24 (SE 0.03) for AD and 0.30 (SE 0.03) for MS. Further, we partitioned the genetic variance explained into five minor allele frequency (MAF) categories, and also by chromosome and by gene annotation. We provide strong evidence that a substantial proportion of variation in liability is explained by common SNPs, and thereby give insights into the genetic architecture of the diseases.
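Estimates "on the liability scale" come from transforming the observed-scale (case-control) estimate using the population prevalence. A sketch of that standard transformation (the prevalence K and case proportion P below are hypothetical; the abstract does not give the values used):

```python
import math

def liability_h2(h2_obs, K, P):
    """Transform an observed-scale h2 to the liability scale.

    K: population prevalence; P: proportion of cases in the sample.
    """
    # Find the liability threshold t with P(X > t) = K by bisection
    # (scipy.stats.norm.ppf would normally be used instead).
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if 0.5 * math.erfc(mid / math.sqrt(2)) > K:
            lo = mid
        else:
            hi = mid
    t = (lo + hi) / 2
    z = math.exp(-t * t / 2) / math.sqrt(2 * math.pi)  # normal density at t
    return h2_obs * (K * (1 - K)) ** 2 / (z ** 2 * P * (1 - P))

print(round(liability_h2(0.2, K=0.1, P=0.5), 2))  # ~0.21
```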


Evidence that complex traits are highly polygenic has been presented by population-based genome-wide association studies (GWASs) through the identification of many significant variants, as well as by family-based de novo sequencing studies indicating that several traits have a large mutational target size. Here, using a third study design, we show results consistent with extreme polygenicity for body mass index (BMI) and height. On a sample of 20,240 siblings (from 9,570 nuclear families), we used a within-family method to obtain narrow-sense heritability estimates of 0.42 (SE = 0.17, p = 0.01) and 0.69 (SE = 0.14, p = 6 x 10^-7) for BMI and height, respectively, after adjusting for covariates. The genomic inflation factors from locus-specific linkage analysis were 1.69 (SE = 0.21, p = 0.04) for BMI and 2.18 (SE = 0.21, p = 2 x 10^-10) for height. This inflation is free of confounding and congruent with polygenicity, consistent with observations of ever-increasing genomic inflation factors from GWASs with large sample sizes, implying that those signals are due to true genetic signals across the genome rather than population stratification. We also demonstrate that the distribution of the observed test statistics is consistent with both rare and common variants underlying a polygenic architecture and that previous reports of linkage signals in complex traits are probably a consequence of polygenic architecture rather than the segregation of variants with large effects. The convergent empirical evidence from GWASs, de novo studies, and within-family segregation implies that family-based sequencing studies for complex traits require very large sample sizes because the effects of causal variants are small on average.
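The genomic inflation factor quoted above is, in its usual form, the median of the observed 1-df chi-square statistics divided by its expectation under the null (about 0.4549). A sketch with hypothetical statistics:

```python
from statistics import median

# Expected median of a 1-df chi-square statistic under the null.
NULL_MEDIAN_CHI2_1DF = 0.4549

def genomic_inflation(chi2_stats):
    """Lambda: median observed chi-square over its null expectation."""
    return median(chi2_stats) / NULL_MEDIAN_CHI2_1DF

# Hypothetical genome-wide test statistics; under polygenicity most
# loci carry small true signals, shifting the whole distribution up.
print(round(genomic_inflation([0.2, 0.5, 0.9, 1.1, 1.5]), 2))  # 1.98
```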


Most psychiatric disorders are moderately to highly heritable. The degree to which genetic variation is unique to individual disorders or shared across disorders is unclear. To examine shared genetic etiology, we use genome-wide genotype data from the Psychiatric Genomics Consortium (PGC) for cases and controls in schizophrenia, bipolar disorder, major depressive disorder, autism spectrum disorders (ASD) and attention-deficit/hyperactivity disorder (ADHD). We apply univariate and bivariate methods for the estimation of genetic variation within and covariation between disorders. SNPs explained 17-29% of the variance in liability. The genetic correlation calculated using common SNPs was high between schizophrenia and bipolar disorder (0.68 ± 0.04 s.e.), moderate between schizophrenia and major depressive disorder (0.43 ± 0.06 s.e.), bipolar disorder and major depressive disorder (0.47 ± 0.06 s.e.), and ADHD and major depressive disorder (0.32 ± 0.07 s.e.), low between schizophrenia and ASD (0.16 ± 0.06 s.e.) and non-significant for other pairs of disorders as well as between psychiatric disorders and the negative control of Crohn's disease. This empirical evidence of shared genetic etiology for psychiatric disorders can inform nosology and encourages the investigation of common pathophysiologies for related disorders.
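A SNP-based genetic correlation is the genetic covariance between two disorders scaled by the geometric mean of their genetic variances. A sketch (the variance components below are hypothetical, chosen only to reproduce the reported schizophrenia-bipolar value of 0.68):

```python
import math

def genetic_correlation(cov_g, var_g1, var_g2):
    """r_g = genetic covariance / sqrt(product of genetic variances)."""
    return cov_g / math.sqrt(var_g1 * var_g2)

# Hypothetical bivariate variance components on the liability scale.
print(round(genetic_correlation(0.17, 0.25, 0.25), 2))  # 0.68
```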


Objectives Impaired muscle function is common in knee osteoarthritis (OA). Numerous biochemical molecules have been implicated in the development of OA; however, these have only been identified in the joint and serum. This study compared the expression of interleukin-15 (IL-15) and Forkhead box protein O1 (FoxO1) in the muscle of patients with knee OA and asymptomatic individuals, and examined whether IL-15 was also present in the joint and serum. Method Muscle and blood samples were collected from 19 patients with diagnosed knee OA and 10 age-matched asymptomatic individuals. Synovial fluid and muscle biopsies were collected from the OA group during knee replacement surgery. IL-15 and FoxO1 were measured in the skeletal muscle. IL-15 abundance was also analysed in the serum of both groups and in synovial fluid from the OA group. Knee extensor strength was measured and correlated with IL-15 and FoxO1 in the muscle. Results FoxO1 protein expression was higher (p=0.04), whereas IL-15 expression was lower (p=0.02), in the muscle of the OA group. Strength was also lower in the OA group and was inversely correlated with FoxO1 expression. No correlation was found between IL-15 levels in the joint, muscle and serum. Conclusion Skeletal muscle, particularly the quadriceps, is affected in people with knee OA, where elevated FoxO1 protein expression was associated with reduced muscle strength. While IL-15 protein expression in the muscle was lower in the knee OA group, no correlation was found between the expression of IL-15 protein in the muscle, joint and serum, which suggests that inflammation is regulated differently within these tissues.


Purpose The aim of this study was to determine alterations to the corneal subbasal nerve plexus (SNP) over four years using in vivo corneal confocal microscopy (IVCM) in participants with type 1 diabetes and to identify significant risk factors associated with these alterations. Methods A cohort of 108 individuals with type 1 diabetes and no evidence of peripheral neuropathy at enrollment underwent laser-scanning IVCM, ocular screening, and health and metabolic assessment at baseline and at four subsequent annual visits. At each annual visit, eight central corneal images of the SNP were selected and analyzed to quantify corneal nerve fiber density (CNFD), branch density (CNBD) and fiber length (CNFL). Linear mixed models were fitted to examine the relationship between risk factors and corneal nerve parameters. Results A total of 96 participants completed the final visit and 91 participants completed all visits. No significant relationships were found between corneal nerve parameters and time, sex, duration of diabetes, smoking, alcohol consumption, blood pressure or BMI. However, CNFD was negatively associated with HbA1c (β=-0.76, P<0.01) and age (β=-0.13, P<0.01) and positively related to high-density lipoprotein (HDL) (β=2.01, P=0.03). Higher HbA1c (β=-1.58, P=0.04) and age (β=-0.23, P<0.01) also negatively affected CNBD. CNFL was affected only by higher age (β=-0.06, P<0.01). Conclusions Glycemic control, HDL and age have significant effects on SNP structure. These findings highlight the importance of diabetic management to prevent corneal nerve damage, as well as the capability of IVCM for monitoring subclinical alterations in the corneal SNP in diabetes.
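The study fit linear mixed models; as a much-simplified fixed-effects analogue, the sign of the reported HbA1c coefficient can be illustrated with an ordinary least-squares slope on hypothetical (HbA1c, CNFD) pairs:

```python
def ols_slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

hba1c = [6.5, 7.0, 8.0, 9.0, 10.0]      # hypothetical HbA1c (%)
cnfd = [30.0, 29.5, 28.8, 28.0, 27.2]   # hypothetical CNFD (fibers/mm^2)
print(ols_slope(hba1c, cnfd) < 0)  # True: higher HbA1c, lower CNFD
```

A mixed model additionally includes per-participant random effects to handle the repeated annual measurements; the sketch omits that.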


Background Malnutrition is common in patients with advanced epithelial ovarian cancer (EOC) and is associated with impaired quality of life (QoL), longer hospital stay and a higher risk of treatment-related adverse events. This phase III multi-centre randomised clinical trial tested early enteral feeding versus standard care on postoperative QoL. Methods From 2009 to 2013, 109 moderately to severely malnourished patients requiring surgery for suspected advanced EOC were enrolled at five sites across Queensland and randomised to intervention (n = 53) or control (n = 56) groups. The intervention involved intraoperative nasojejunal tube placement and enteral feeding until adequate oral intake could be maintained. Despite being randomised to the intervention, 20 patients did not receive feeds (13 did not receive the feeding tube; 7 had it removed early). Control involved postoperative diet as tolerated. QoL was measured at baseline, 6 weeks postoperatively and 30 days after the third cycle of chemotherapy. The primary outcome measure was the difference in QoL between the intervention and control groups. Secondary endpoints included treatment-related adverse event occurrence, length of stay, postoperative service use, and nutritional status. Results Baseline characteristics were comparable between treatment groups. No significant difference in QoL was found between the groups at any time point. There was a trend towards better nutritional status in patients who received the intervention, but the differences did not reach statistical significance except in the intention-to-treat analysis at 7 days postoperatively (11.8 intervention vs. 13.8 control, p = 0.04). Conclusion Early enteral feeding did not significantly improve patients' QoL compared to standard of care but may improve nutritional status.


Background To reduce nursing shortages, accelerated nursing programs are available for domestic and international students. However, the withdrawal and failure rates of these programs may differ from those of traditional programs. The main aim of our study was to improve the retention and experience of accelerated nursing students. Methods The academic background, age, and withdrawal and failure rates of the accelerated and traditional students were determined. Data from 2009 and 2010 were collected prior to the intervention. In an attempt to reduce the withdrawal of accelerated students, we set up an intervention, which was available to all students. The assessment of the intervention was a pre-post-test design with non-equivalent groups (the traditional and the accelerated students). The elements of the intervention were a) a formative website activity covering basic concepts in anatomy, physiology and pharmacology, b) a workshop addressing study skills and online resources, and c) resource lectures in anatomy/physiology and microbiology. The formative website and workshop were evaluated using questionnaires. Results The accelerated nursing students were five years older than the traditional students (p < 0.0001). The withdrawal rates from a pharmacology course were higher for accelerated nursing students than for traditional students who had undertaken first year courses in anatomy and physiology (p = 0.04 in 2010). The withdrawing students were predominantly domestic students with non-university qualifications or equivalent experience. The failure rates were also higher for this group compared to the traditional students (p = 0.05 in 2009 and 0.03 in 2010). In contrast, the withdrawal rates for the international and domestic graduate accelerated students were very low. 
After the intervention, the withdrawal and failure rates in pharmacology for domestic accelerated students with non-university qualifications were not significantly different from those of traditional students. Conclusions The accelerated international and domestic graduate nursing students have low withdrawal rates and high success rates in a pharmacology course. However, domestic students with non-university qualifications have higher withdrawal and failure rates than other nursing students and may be underprepared for university study in pharmacology in nursing programs. The introduction of the intervention was associated with reduced withdrawal and failure rates for these students in the pharmacology course.


Background Studies investigating the relationship between malnutrition and post-discharge mortality following acute hip fracture yield conflicting results. This study aimed to determine whether malnutrition independently predicted 12-month post-fracture mortality after adjusting for clinically relevant covariates. Methods An ethics approved, prospective, consecutive audit was undertaken for all surgically treated hip fracture inpatients admitted to a dedicated orthogeriatric unit (November 2010–October 2011). The 12-month mortality data were obtained by a dual search of the mortality registry and Queensland Health database. Malnutrition was evaluated using the Subjective Global Assessment. Demographic (age, gender, admission residence) and clinical covariates included fracture type, time to surgery, anaesthesia type, type of surgery, post-surgery time to mobilize and post-operative complications (delirium, pulmonary and deep vein thrombosis, cardiac complications, infections). The Charlson Comorbidity Index was retrospectively applied. All diagnoses were confirmed by the treating orthogeriatrician. Results A total of 322 of 346 patients were available for audit. Increased age (P = 0.004), admission from residential care (P < 0.001), Charlson Comorbidity Index (P = 0.007), malnutrition (P < 0.001), time to mobilize >48 h (P < 0.001), delirium (P = 0.003), pulmonary embolism (P = 0.029) and cardiovascular complication (P = 0.04) were associated with 12-month mortality. Logistic regression analysis demonstrated that malnutrition (odds ratio (OR) 2.4 (95% confidence interval (CI) 1.3–4.7, P = 0.007)), in addition to admission from residential care (OR 2.6 (95% CI 1.3–5.3, P = 0.005)) and pulmonary embolism (OR 11.0 (95% CI 1.5–78.7, P = 0.017)), independently predicted 12-month mortality. Conclusions Findings substantiate malnutrition as an independent predictor of 12-month mortality in a representative sample of hip fracture inpatients. 
Effective strategies to identify and treat malnutrition in hip fracture patients should be prioritized.
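The logistic regression results above are reported as odds ratios with 95% confidence intervals. A sketch of how an odds ratio and a Wald-style CI are derived from a 2x2 table (the counts are hypothetical, chosen only to land near the reported OR of 2.4):

```python
import math

def odds_ratio_ci(a, b, c, d):
    """OR and Wald 95% CI from a 2x2 table:
    a, b = events / non-events in the exposed group;
    c, d = events / non-events in the unexposed group."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# Hypothetical counts: deaths/survivors among malnourished vs. well-nourished.
or_, lo, hi = odds_ratio_ci(20, 80, 17, 163)
print(round(or_, 1))  # 2.4
```

A multivariable logistic regression adjusts each OR for the other covariates; this unadjusted sketch shows only the arithmetic behind the reported quantity.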


In this paper we examine the effect of technology on economic growth in Zimbabwe over the period 1975–2014 while accounting for structural breaks. We use the extended Cobb–Douglas type Solow (Q J Econ 70(1):65–94, 1956) framework and the ARDL bounds procedure to examine cointegration and short-run and long-run effects. Using unit root tests, we note that structural changes in Zimbabwe generally date from 1982 onwards. We find that mobile technology has a positive short-run (0.09%) and long-run (0.08%) impact on output per capita. The post-1982 structural changes show a positive impact in the short run (0.06) and the long run (0.09), whereas the trend coefficient is negative in the short run (−0.03) and the long run (−0.04). The Granger non-causality test shows a unidirectional causality from capital stock (investment) per capita to output per capita and a bi-directional causality between mobile cellular technology and output per capita. Plausible reasons for the estimated magnitudes and the direction of causality are discussed for policy deliberation.
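The Solow framework rests on a Cobb–Douglas production function, Y = A·K^α·L^(1−α), which is log-linear and exhibits constant returns to scale. A minimal sketch (α is illustrative, not an estimated value from the paper):

```python
def cobb_douglas(A, K, L, alpha=0.3):
    """Cobb-Douglas output: Y = A * K**alpha * L**(1 - alpha)."""
    return A * K**alpha * L**(1 - alpha)

# Constant returns to scale: doubling both inputs doubles output.
Y1 = cobb_douglas(1.0, 100.0, 50.0)
Y2 = cobb_douglas(1.0, 200.0, 100.0)
print(round(Y2 / Y1, 6))  # 2.0
```

Taking logs of this function is what yields the per-capita regression whose coefficients (e.g. the 0.09% short-run mobile-technology effect) the ARDL procedure estimates.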


BACKGROUND This study compared the effects of three silver dressing combinations on small to medium size acute partial thickness burns in children, focusing on re-epithelialization time, pain and distress during dressing changes. METHOD Children (0-15 years) with clean, ≤ 10% total body surface area (TBSA) partial thickness burns who met the inclusion criteria were included in the study. Children received either (1) Acticoat™; (2) Acticoat™ with Mepitel™; or (3) Mepilex Ag™ dressings. Measures of burn re-epithelialization, pain, and distress were recorded at dressing changes every 3-5 days until full re-epithelialization occurred. RESULTS One hundred and three children were recruited, with 96 included in the analysis. No infections were detected for the course of the study. When adjusted for burn depth, Acticoat™ significantly increased the expected days to full re-epithelialization by 40% (IRR = 1.40; 95% CI: 1.14-1.73, p < 0.01) and Acticoat™ with Mepitel™ significantly increased the expected days to full re-epithelialization by 33% (IRR = 1.33; 95% CI: 1.08-1.63, p ≤ 0.01) when compared to Mepilex Ag™. Expected FLACC scores in the Mepilex Ag™ group were 32% lower at dressing removal (p = 0.01) and 37% lower at new dressing application (p = 0.04); and scores in the Acticoat™ with Mepitel™ group were 23% lower at dressing removal (p = 0.04) and 40% lower at new dressing application (p < 0.01), in comparison to the Acticoat™ group. Expected Visual Analog Scale-Pain (VAS-P) scores were 25% lower in the Mepilex Ag™ group at dressing removal (p = 0.04) and 34% lower in the Acticoat™ with Mepitel™ group (p = 0.02) at new dressing application in comparison to the Acticoat™ group. There was no significant difference between the Mepilex Ag™ and the Acticoat™ with Mepitel™ groups at all time points and with any pain measure.
CONCLUSION Mepilex Ag™ is an effective silver dressing, in terms of accelerated wound re-epithelialization time (compared to Acticoat™ and Acticoat™ with Mepitel™) and decreased pain during dressing changes (compared to Acticoat™), for clean, < 10% TBSA partial thickness burns in children.
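The effect sizes above are incidence rate ratios (IRRs), which map directly to the percentage changes quoted (e.g. IRR = 1.40 is a 40% longer expected time to re-epithelialization). The mapping:

```python
def irr_to_percent_increase(irr):
    """Convert an incidence rate ratio to a percentage change."""
    return (irr - 1.0) * 100.0

print(round(irr_to_percent_increase(1.40), 1))  # 40.0 (% longer)
print(round(irr_to_percent_increase(1.33), 1))  # 33.0 (% longer)
```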