996 results for 234


Relevance:

10.00%

Publisher:

Abstract:

Objectives: To determine, by means of static fracture testing, the effect of tooth preparation design and cement elastic modulus on the structural integrity of the cemented, machined ceramic crown-tooth complex.
Methods: Extracted human maxillary premolar teeth were prepared for all-ceramic crowns using two preparation designs: a standard preparation in accordance with established protocols, and a novel design with a flat occlusal surface. All-ceramic feldspathic (Vita MK II) crowns were milled for all preparations using a CAD/CAM system (CEREC-3). The machined all-ceramic crowns were resin-bonded to the tooth structure using one of three cements with different elastic moduli: Super-Bond C&B, Rely X Unicem and Panavia F 2.0. The specimens were subjected to a compressive force through a 4 mm diameter steel ball at a crosshead speed of 1 mm/min using a universal testing machine (Lloyd Instruments model LRX). The load at the fracture point was recorded for each specimen in newtons (N). These values were compared with those of a control group of unprepared/unrestored teeth.
Results: The control group showed significantly higher fracture strength than the cemented samples, regardless of occlusal design and resin cement type. There was no significant difference in mean fracture load between the two occlusal preparation designs when Super-Bond C&B was used. For the Rely X Unicem and Panavia F 2.0 cements, the proposed preparation design with a flat occlusal morphology provided increased fracture strength.
Significance: The proposed novel flat design showed less dependence on resin cement selection with respect to the fracture strength of the restored tooth. The choice of resin cement, in terms of its modulus of elasticity, is more important for the anatomic design than for the flat design. © 2013 Academy of Dental Materials.
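
As an illustration of the kind of group comparison reported above, the following minimal Python sketch contrasts fracture loads (in N) across a few of the study's groups with a one-way ANOVA and a pairwise t-test; all numeric values are placeholders, since the abstract reports only the qualitative outcome.

    # Sketch: comparing fracture loads (N) across groups. The arrays below are
    # placeholder values, not the study's measurements.
    from scipy import stats

    loads_n = {
        "control_unprepared":    [1410, 1385, 1502, 1450, 1391],
        "flat_relyx_unicem":     [980, 1015, 940, 1002, 965],
        "anatomic_relyx_unicem": [720, 755, 698, 770, 741],
    }

    # Overall difference between the three groups.
    f_stat, p_val = stats.f_oneway(*loads_n.values())
    print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_val:.4f}")

    # Pairwise comparison of the two preparation designs cemented with Rely X Unicem.
    t_stat, p_pair = stats.ttest_ind(loads_n["flat_relyx_unicem"],
                                     loads_n["anatomic_relyx_unicem"])
    print(f"flat vs anatomic (Rely X Unicem): t = {t_stat:.2f}, p = {p_pair:.4f}")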

Relevance:

10.00%

Publisher:

Abstract:

Langer's axillary arch is a recognized muscular anomaly characterized by an accessory muscular band crossing the axilla that rarely causes symptoms. We describe a patient who presented with an upper limb deep vein thrombosis caused by this aberrant muscle, which we believe to be the first reported case. Axillary surgery with division of the aberrant muscle relieved the upper limb venous obstruction in this patient. (J Vasc Surg 2012;55:234-6.)

Relevance:

10.00%

Publisher:

Abstract:

Background: Although disabled women are significantly more likely to experience domestic abuse during pregnancy than non-disabled women, very little is known about how maternity care access and utilisation are affected by the co-existence of disability and domestic abuse. This systematic review of the literature explored how domestic abuse impacts upon disabled women's access to maternity services.

Methods: Eleven articles were identified through a search of six electronic databases and data were analysed to identify: the factors that facilitate or compromise access to care; the consequences of inadequate care for pregnant women’s health and wellbeing; and the effectiveness of existing strategies for improvement.

Results: Findings indicate that a mental health diagnosis, poor relationships with health professionals and environmental barriers can compromise women's utilisation of maternity services. Domestic abuse can both compromise and catalyse access to services, and social support is a positive factor when accessing care. Delayed and inadequate care has adverse effects on women's physical and psychological health; however, further research is required to fully explore the nature and extent of these consequences. Only one study identified strategies currently being used to improve access to services for disabled women experiencing abuse.

Conclusions: Based upon the barriers and facilitators identified within the review, we suggest that future strategies for improvement should focus on: understanding women’s reasons for accessing care; fostering positive relationships; being women-centred; promoting environmental accessibility; and improving the strength of the evidence base.

Relevance:

10.00%

Publisher:

Abstract:

Low-power processors and accelerators that were originally designed for the embedded systems market are emerging as building blocks for servers. Power capping has been actively explored as a technique to reduce the energy footprint of high-performance processors, but its opportunities and limitations on the new low-power processor and accelerator ecosystem are less well understood. This paper presents an efficient power capping and management infrastructure for heterogeneous SoCs based on hybrid ARM/FPGA designs. The infrastructure coordinates dynamic voltage and frequency scaling with task allocation on a customised Linux system for the Xilinx Zynq SoC. We present a compiler-assisted power model that guides voltage and frequency scaling, in conjunction with workload allocation between the ARM cores and the FPGA, under given power caps. The model achieves less than 5% estimation bias relative to mean power consumption. In an FFT case study, the proposed power capping schemes achieve on average 97.5% of the performance of the optimal execution and match the optimal execution in 87.5% of cases, while always meeting power constraints.
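
The selection step described above, choosing a DVFS operating point and an ARM/FPGA work split that maximises performance without exceeding a power cap, can be sketched as a small constrained search; the operating points and the linear power/performance models below are toy placeholders standing in for the paper's compiler-assisted model.

    # Sketch: pick the (frequency, FPGA usage, work split) configuration with the
    # highest predicted throughput whose predicted power stays under the cap.
    # All coefficients and operating points are illustrative assumptions.
    from itertools import product

    ARM_FREQS_MHZ = [333, 667, 1000]      # assumed CPU operating points
    FPGA_ON = [False, True]               # whether the FPGA kernel is used
    SPLITS = [0.0, 0.25, 0.5, 0.75, 1.0]  # fraction of work mapped to the FPGA

    def predicted_power_w(freq_mhz, fpga_on, split):
        """Toy linear power model (placeholder coefficients)."""
        cpu = 0.3 + 0.9e-3 * freq_mhz * (1.0 - split)
        fpga = (0.4 + 1.1 * split) if fpga_on else 0.0
        return 0.5 + cpu + fpga           # 0.5 W static baseline

    def predicted_throughput(freq_mhz, fpga_on, split):
        """Toy performance model: FPGA work assumed ~3x faster per unit."""
        return freq_mhz / 1000.0 * (1.0 - split) + (3.0 * split if fpga_on else 0.0)

    def best_config(power_cap_w):
        feasible = [
            (predicted_throughput(f, on, s), f, on, s)
            for f, on, s in product(ARM_FREQS_MHZ, FPGA_ON, SPLITS)
            if predicted_power_w(f, on, s) <= power_cap_w
            and (on or s == 0.0)          # no work can be mapped to an unused FPGA
        ]
        return max(feasible) if feasible else None

    print(best_config(power_cap_w=2.5))   # (throughput, freq_MHz, fpga_on, split)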

Relevance:

10.00%

Publisher:

Abstract:

Identifying groundwater contributions to baseflow forms an essential part of surface water body characterisation. The Gortinlieve catchment (5 km²) comprises a headwater stream network of the Carrigans River, itself a tributary of the River Foyle, NW Ireland. The bedrock comprises poorly productive metasediments that are characterised by fracture porosity. We present the findings of a multi-disciplinary study that integrates new hydrochemical and mineralogical investigations with existing hydraulic, geophysical and structural data to identify the scales of groundwater flow and the nature of groundwater/bedrock interaction (chemical denudation). At the catchment scale, the development of deep weathering profiles is controlled by NE-SW regional-scale fracture zones associated with mountain building during the Grampian orogeny. In-situ chemical denudation of mineral phases is controlled by micro- to meso-scale fractures related to Alpine compression during Palaeocene to Oligocene times. The alteration of primary muscovite, chlorite (clinochlore) and albite along the surfaces of these small-scale fractures has resulted in the precipitation of illite, montmorillonite and illite/montmorillonite clay admixtures. The interconnected but discontinuous nature of these small-scale structures highlights the role of larger-scale faults and fissures in the supply and transportation of weathering solutions to/from the sites of mineral weathering. The dissolution of primary mineral phases releases the major ions Mg, Ca and HCO3, which subsequently form the chemical makeup of the groundwaters. Borehole groundwater and stream baseflow hydrochemical data are used to constrain the depths of the groundwater flow pathways influencing the chemistry of surface waters throughout the stream profile. The results show that it is predominantly the lower part of the catchment, which receives inputs from catchment/regional-scale groundwater flow, that contributes to the maintenance of annual baseflow levels. This study identifies the importance of deep groundwater in maintaining annual baseflow levels in poorly productive bedrock systems.

Relevance:

10.00%

Publisher:

Abstract:

Context Medical students can have difficulty in distinguishing left from right. Many infamous medical errors have occurred when a procedure has been performed on the wrong side, such as in the removal of the wrong kidney. Clinicians encounter many distractions during their work. There is limited information on how these affect performance. 
Objectives Using a neuropsychological paradigm, we aim to elucidate the impacts of different types of distraction on left–right (LR) discrimination ability. 
Methods Medical students were recruited to a study with four arms: (i) control arm (no distraction); (ii) auditory distraction arm (continuous ambient ward noise); (iii) cognitive distraction arm (interruptions with clinical cognitive tasks), and (iv) auditory and cognitive distraction arm. Participants’ LR discrimination ability was measured using the validated Bergen Left–Right Discrimination Test (BLRDT). Multivariate analysis of variance was used to analyse the impacts of the different forms of distraction on participants’ performance on the BLRDT. Additional analyses looked at effects of demographics on performance and correlated participants’ self-perceived LR discrimination ability and their actual performance. 
Results A total of 234 students were recruited. Cognitive distraction had a greater negative impact on BLRDT performance than auditory distraction. Combined auditory and cognitive distraction had a negative impact on performance, but only in the most difficult LR task was this negative impact found to be significantly greater than that of cognitive distraction alone. There was a significant medium-sized correlation between perceived LR discrimination ability and actual overall BLRDT performance. 
Conclusions Distraction has a significant impact on performance, and multifaceted approaches are required to reduce LR errors. Educationally, greater emphasis on linking theory with clinical application is required to support patient safety and human factors training in medical school curricula. Distraction has the potential to impair an individual's ability to make accurate LR decisions, and students should be trained from undergraduate level to be mindful of this.
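
A minimal sketch of the analysis described above (a multivariate analysis of variance of BLRDT scores across the four arms, plus a perceived-versus-actual correlation) is given below using statsmodels and scipy; the data file, column names and the choice of a Spearman correlation are assumptions for illustration, not details taken from the study.

    # Sketch: one-way MANOVA of two BLRDT subscores across the four distraction
    # arms, followed by a correlation between self-rated and measured LR ability.
    # The file and column names are hypothetical.
    import pandas as pd
    from scipy.stats import spearmanr
    from statsmodels.multivariate.manova import MANOVA

    df = pd.read_csv("blrdt_scores.csv")  # one row per participant (hypothetical)

    # 'arm' = control / auditory / cognitive / combined; the two outcome columns
    # stand in for BLRDT subtask scores.
    fit = MANOVA.from_formula("score_simple + score_complex ~ C(arm)", data=df)
    print(fit.mv_test())                  # Wilks' lambda etc. for the effect of arm

    # Correlation between perceived and actual LR discrimination ability.
    rho, p = spearmanr(df["self_rating"], df["score_simple"] + df["score_complex"])
    print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")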

Relevance:

10.00%

Publisher:

Abstract:

Next-generation sequencing (NGS) is beginning to show its full potential for diagnostic and therapeutic applications. In particular, it is demonstrating its capacity to contribute to a molecular taxonomy of cancer, to be used as a standard approach for diagnostic mutation detection, and to open new treatment options that are not exclusively organ-specific. If this is the case, how much validation is necessary, and what should the validation strategy be, when bringing NGS into diagnostic/clinical practice? This validation strategy should address key issues such as: what is the overall extent of the validation? Should essential indicators of test performance such as sensitivity or specificity be calculated for every target or sample type? Should bioinformatic interpretation approaches be validated with the same rigour? What is a competitive clinical turnaround time for an NGS-based test, and when does it become a cost-effective testing proposition? While we address these and other related topics in this commentary, we also suggest that a single set of international guidelines for the validation and use of NGS technology in routine diagnostics may allow us all to make much more effective use of resources.
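
For concreteness, the per-target performance indicators mentioned above can be summarised as in the short sketch below, which computes the sensitivity and specificity of an NGS assay's variant calls against a reference truth set; the confusion-matrix counts are invented for illustration.

    # Sketch: sensitivity and specificity from a confusion matrix of variant calls
    # versus a reference truth set. The counts are hypothetical.
    def sensitivity_specificity(tp, fp, tn, fn):
        sensitivity = tp / (tp + fn)  # fraction of true variants that were detected
        specificity = tn / (tn + fp)  # fraction of non-variant positions called correctly
        return sensitivity, specificity

    sens, spec = sensitivity_specificity(tp=148, fp=3, tn=9812, fn=6)
    print(f"sensitivity = {sens:.3f}, specificity = {spec:.4f}")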

Relevance:

10.00%

Publisher:

Abstract:

Recent research in Europe, Africa, and Southeast Asia suggests that we can no longer assume a direct and exclusive link between anatomically modern humans and behavioral modernity (the 'human revolution'), or assume that the presence of either one implies the presence of the other: discussions of the emergence of cultural complexity have to proceed with greater scrutiny of the evidence on a site-by-site basis to establish secure associations between the archaeology present there and the hominins who created it. This paper presents one such case study: Niah Cave in Sarawak on the island of Borneo, famous for the discovery in 1958 in the West Mouth of the Great Cave of a modern human skull, the 'Deep Skull,' controversially associated with radiocarbon dates of ca. 40,000 years before the present. A new chronostratigraphy has been developed through a re-investigation of the lithostratigraphy left by the earlier excavations, AMS dating using three different comparative pre-treatments including ABOX of charcoal, and U-series dating using the Diffusion-Absorption model applied to fragments of bone from the Deep Skull itself. Stratigraphic reasons for earlier uncertainties about the antiquity of the skull are examined, and it is shown not to be an 'intrusive' artifact. It was probably excavated from fluvial-pond-desiccation deposits that accumulated episodically in a shallow basin immediately behind the cave entrance lip, in a climate that ranged from times of comparative aridity with complete desiccation to episodes of greater surface wetness, changes attributed to regional climatic fluctuations. Vegetation outside the cave varied significantly over time, including wet lowland forest, montane forest, savannah, and grassland. The new dates and the lithostratigraphy relate the Deep Skull to evidence of episodes of human activity that range in date from ca. 46,000 to ca. 34,000 years ago. Initial investigations of sediment scorching, pollen, palynomorphs, phytoliths, plant macrofossils, and starch grains recovered from existing exposures, and of vertebrates from the current and the earlier excavations, suggest that human foraging during these times was marked by habitat-tailored hunting technologies, the collection and processing of toxic plants for consumption, and, perhaps, the use of fire at some forest edges. The Niah evidence demonstrates the sophisticated nature of the subsistence behavior developed by modern humans to exploit the tropical environments that they encountered in Southeast Asia, including rainforest. (c) 2006 Elsevier Ltd. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

Velvetgrass (Holcus lanatus L.), also known as Yorkshire fog grass, has evolved tolerance to high levels of arsenate, and this adaptation involves reduced accumulation of arsenate through suppression of the high-affinity phosphate-arsenate uptake system. To determine the role of P nutrition in arsenate tolerance, the inhibition kinetics of arsenate influx by phosphate were determined. The inhibition constant (Ki) for phosphate inhibition of arsenate influx, i.e. the concentration of inhibitor required to reduce maximum influx (Vmax) by 50%, was 0.02 mol m-3 in both tolerant and nontolerant clones. This compares with Km values (the concentration at which influx is 50% of maximum) for arsenate influx of 0.6 mol m-3 for tolerant and 0.025 mol m-3 for nontolerant clones; phosphate was therefore much more effective at inhibiting arsenate influx in tolerant genotypes. Because the high-affinity phosphate uptake system is inducible under low plant phosphate status, increasing plant phosphate status should increase tolerance by decreasing arsenate influx. Root extension in arsenate solutions of tolerant and nontolerant tillers grown under differing phosphate nutritional regimes showed that increased plant P status did indeed increase the tolerance to arsenate of both tolerant and nontolerant clones. That plant P status increased tolerance again argues that P nutrition has a critical role in arsenate tolerance. To determine whether short-term flux and solution culture studies were relevant to As and P accumulation in soils, soil and plant material from a range of As-contaminated sites were analyzed. As predicted from the short-term competition studies, P was accumulated preferentially to As in arsenate-tolerant clones growing on mine spoil soils, even when acid-extractable arsenate in the soils was much greater than acid-extractable phosphate. Though phosphate was much more efficient at competing with arsenate for uptake, plants growing on arsenate-contaminated land still accumulated considerable amounts of As. Plants from the differing habitats showed large variation in plant phosphate status, pasture plants having much higher P levels than plants growing on the most contaminated mine spoil soils. The selectivity of the phosphate-arsenate uptake system for phosphate compared with arsenate, coupled with the suppression of this uptake system, enabled tolerant clones of velvetgrass to grow on soils that were highly contaminated with arsenate and deficient in phosphate.
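
A short sketch of how these constants translate into relative uptake, assuming classical Michaelis-Menten competitive inhibition, v = Vmax[As] / (Km(1 + [P]/Ki) + [As]): only the Km and Ki values come from the abstract; Vmax and the example concentrations are illustrative assumptions.

    # Sketch: phosphate as a competitive inhibitor of arsenate influx, using the
    # reported Km and Ki (mol m-3). Vmax and the concentrations are assumed.
    def arsenate_influx(arsenate, phosphate, km, ki, vmax=1.0):
        """v = Vmax*[As] / (Km*(1 + [P]/Ki) + [As])"""
        return vmax * arsenate / (km * (1.0 + phosphate / ki) + arsenate)

    KI = 0.02                                         # phosphate Ki, both clone types
    clones = {"tolerant": 0.6, "nontolerant": 0.025}  # arsenate Km per clone type

    arsenate_conc, phosphate_conc = 0.05, 0.05        # example concentrations (assumed)
    for name, km in clones.items():
        v_free = arsenate_influx(arsenate_conc, 0.0, km, KI)
        v_inhib = arsenate_influx(arsenate_conc, phosphate_conc, km, KI)
        print(f"{name}: phosphate reduces arsenate influx to "
              f"{100 * v_inhib / v_free:.0f}% of the uninhibited rate")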

Relevance:

10.00%

Publisher:

Abstract:

Integrins (ITGs) are key elements in cancer biology, regulating tumor growth, angiogenesis and lymphangiogenesis through interactions of the tumor cells with the microenvironment. Starting from the hypothesis that ITGs could have different effects in stage II and III colon cancer, we tested whether a comprehensive panel of germline single-nucleotide polymorphisms (SNPs) in ITG genes could predict stage-specific time to tumor recurrence (TTR). A total of 234 patients treated with 5-fluorouracil-based chemotherapy at the University of Southern California were included in this study. Whole-blood samples were analyzed for germline SNPs in ITG genes using PCR-restriction fragment length polymorphism or direct DNA sequencing. In the multivariable analysis, stage II colon cancer patients with at least one G allele for ITGB3 rs4642 had a higher risk of recurrence (hazard ratio (HR)=4.027, 95% confidence interval (95% CI) 1.556-10.421, P=0.004). This association was also significant in the combined stage II-III cohort (HR=1.975, 95% CI 1.194-3.269, P=0.008). The predominant role of ITGB3 rs4642 in stage II disease was confirmed using recursive partitioning, which showed that ITGB3 rs4642 was the most important factor in stage II disease. In contrast, in stage III disease the combined analysis of ITGB1 rs2298141 and ITGA4 rs7562325 identified three distinct prognostic subgroups (P=0.009). The interaction between stage and the combination of ITGB1 rs2298141 and ITGA4 rs7562325 with respect to TTR was significant (P=0.025). This study identifies germline polymorphisms in ITG genes as independent stage-specific prognostic markers for stage II and III colon cancer. These data may help to select subgroups of patients who may benefit from ITG-targeted treatments.
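
The stage-specific association reported above is the kind of estimate produced by a multivariable Cox proportional-hazards model of time to recurrence against genotype; the sketch below, using the lifelines library, shows one plausible formulation. The input file, column names and the dominant coding of ITGB3 rs4642 (any G allele vs. AA) are assumptions for illustration, not the authors' actual pipeline.

    # Sketch: Cox model of time to tumor recurrence (TTR) against a dominant
    # coding of ITGB3 rs4642 in stage II patients. File/column names hypothetical.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("colon_cohort.csv")          # hypothetical per-patient table
    stage2 = df[df["stage"] == 2].copy()          # stage II subgroup

    # Dominant model: 1 if the patient carries at least one G allele, else 0.
    stage2["rs4642_anyG"] = stage2["rs4642"].isin(["AG", "GG"]).astype(int)

    cph = CoxPHFitter()
    cph.fit(
        stage2[["ttr_years", "recurrence", "rs4642_anyG", "age"]],
        duration_col="ttr_years",                 # time to recurrence (censored at last follow-up)
        event_col="recurrence",                   # 1 = recurrence observed, 0 = censored
    )
    cph.print_summary()  # exp(coef) of rs4642_anyG is the HR (abstract: HR = 4.03)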

Relevance:

10.00%

Publisher:

Abstract:

Tumor recurrence after curative resection remains a major problem in patients with locally advanced colorectal cancer treated with adjuvant chemotherapy. Genetic single-nucleotide polymorphisms (SNPs) may serve as useful molecular markers to predict clinical outcomes in these patients and identify targets for future drug development. Recent in vitro and in vivo studies have demonstrated that the plastin genes PLS3 and LCP1 are overexpressed in colon cancer cells and play an important role in tumor cell invasion, adhesion, and migration. Hence, we hypothesized that functional genetic variations of plastin may have direct effects on the progression and prognosis of locally advanced colorectal cancer. We tested whether functional tagging polymorphisms of PLS3 and LCP1 predict time to tumor recurrence (TTR) in 732 patients (training set, 234; validation set, 498) with stage II/III colorectal cancer. The PLS3 rs11342 and LCP1 rs4941543 polymorphisms were associated with a significantly increased risk of recurrence in the training set. PLS3 rs6643869 showed a consistent association with TTR in the training and validation sets when stratified by gender and tumor location. Female patients with the PLS3 rs6643869 AA genotype had a shorter median TTR than those with any G allele in the training set (1.7 vs. 9.4 years; HR, 2.84; 95% confidence interval (CI), 1.32-6.1; P = 0.005) and the validation set (3.3 vs. 13.7 years; HR, 2.07; 95% CI, 1.09-3.91; P = 0.021). Our findings suggest that several SNPs of the PLS3 and LCP1 genes could serve as gender- and/or stage-specific molecular predictors of tumor recurrence in patients with stage II/III colorectal cancer, as well as potential therapeutic targets.
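
To complement the regression results above, the gender-stratified comparison of median TTR by PLS3 rs6643869 genotype could be summarised with Kaplan-Meier estimates and a log-rank test, as in the sketch below using lifelines; the data file and column names are hypothetical.

    # Sketch: Kaplan-Meier TTR by PLS3 rs6643869 genotype (AA vs. any G allele)
    # in female patients, plus a log-rank test. File/column names hypothetical.
    import pandas as pd
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    df = pd.read_csv("crc_cohort.csv")            # hypothetical per-patient table
    women = df[df["sex"] == "F"]
    aa = women[women["rs6643869"] == "AA"]
    any_g = women[women["rs6643869"].isin(["AG", "GG"])]

    for label, grp in [("AA", aa), ("any G", any_g)]:
        kmf = KaplanMeierFitter()
        kmf.fit(grp["ttr_years"], event_observed=grp["recurrence"], label=label)
        print(label, "median TTR (years):", kmf.median_survival_time_)

    res = logrank_test(aa["ttr_years"], any_g["ttr_years"],
                       event_observed_A=aa["recurrence"],
                       event_observed_B=any_g["recurrence"])
    print("log-rank p =", res.p_value)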