991 results for Rydelius, Andreas, 1671-1738.


Abstract:

The formation rate of university spin-out firms has increased markedly over the past decade. While this is seen as an important channel for the commercialisation of academic research, concerns have centred on high failure rates and no-to-low growth among those that survive, compared with other new technology-based firms. Universities have responded by investing in incubators to help spin-outs overcome their liability of newness. Yet how effective are incubators in supporting these firms? Here we examine this question in terms of the structural networks that spin-out firms form, the incubator's role in shaping those networks, and their effect on the spin-out process.

Abstract:

Cytokine secretion and degranulation represent key components of CD8(+) T-cell cytotoxicity. While transcriptional blockade of IFN-γ and inhibition of degranulation by TGF-β are well established, we wondered whether TGF-β could also induce immune-regulatory miRNAs in human CD8(+) T cells. We used miRNA microarrays and high-throughput sequencing in combination with qRT-PCR and found that TGF-β promotes expression of the miR-23a cluster in human CD8(+) T cells. Likewise, TGF-β up-regulated expression of the cluster in CD8(+) T cells from wild-type mice, but not in cells from mice with tissue-specific expression of a dominant-negative TGF-β type II receptor. Reporter gene assays including site mutations confirmed that miR-23a specifically targets the 3'UTR of CD107a/LAMP1 mRNA, whereas the other miRNAs expressed in this cluster, namely miR-27a and miR-24, target the 3'UTR of IFN-γ mRNA. Upon modulation of the miR-23a cluster by the respective miRNA antagomirs and mimics, we observed significant changes in IFN-γ expression, but only slight effects on CD107a/LAMP1 expression. Still, overexpression of the cluster attenuated the cytotoxic activity of antigen-specific CD8(+) T cells. These functional data thus reveal that the miR-23a cluster not only is induced by TGF-β, but also exerts a suppressive effect on CD8(+) T-cell effector functions, even in the absence of TGF-β signaling.

Abstract:

Detection of pretreatment disseminated tumor cells (pre-DTCs), reflecting their homing to the bone marrow (BM), might improve the current model for predicting recurrence or survival in men with nonmetastatic prostate cancer (PCa) despite primary treatment; pre-DTCs may thereby serve as an early prognostic biomarker. Detection of post-treatment DTCs (post-DTCs) may supply the clinician with additional predictive information about the possible course of PCa. The aim was to assess the prognostic impact of DTCs in BM aspirates sampled before initiation of primary therapy (pre-DTC) and at least 2 years afterwards (post-DTC), relative to established prognostic factors and survival in patients with PCa. Available BM samples from 129 long-term follow-up patients with T1-3N0M0 PCa were assessed, in addition to 100 BM samples from patients in whom a pretreatment BM sample had been taken. Patients received either combined therapy [n = 81 (63%)], i.e. radiotherapy (RT) with hormone treatment (HT) of varying duration, or monotherapy with RT or HT alone [n = 48 (37%)], adapted to the criteria of the SPCG-7 trial. Mononuclear cells were deposited on slides according to the cytospin methodology, and DTCs were identified by immunocytochemistry using the pancytokeratin antibodies AE1/AE3. The median age at diagnosis was 64.5 years (range 49.5-73.4 years). The median long-term follow-up from first BM sampling to last observation was 11 years. Among categorized clinically relevant factors in PCa, only pre-DTC status remained a statistically independent parameter for survival in the multivariate analysis. Pre-DTCs homing to the BM are significantly associated with clinically relevant outcome, independent of the patient's treatment at diagnosis of nonmetastatic PCa.

Abstract:

Paediatric cardiac catheterizations may result in the administration of substantial amounts of iodinated contrast media and ionizing radiation. The aim of this work was to investigate the effect of iodinated contrast media in combination with in vitro and in vivo X-ray radiation on lymphocyte DNA. Six concentrations of iodine (15, 17.5, 30, 35, 45, and 52.5 mg of iodine per mL blood) represented the volumes of iodinated contrast media used in the clinical setting. Blood obtained from healthy volunteers was mixed with iodinated contrast media and exposed to radiation doses commonly used in paediatric cardiac catheterizations (0 mGy, 70 mGy, 140 mGy, 250 mGy and 450 mGy). Control samples contained no iodine. For in vivo experimentation, pre- and post-procedure blood samples were collected from children undergoing cardiac catheterization who received iodine concentrations of up to 51 mg of iodine per mL blood and radiation doses of up to 400 mGy. Fluorescence microscopy was performed to assess γH2AX-foci induction, which corresponds to the number of DNA double-strand breaks. The presence of iodine in vitro resulted in significant increases in DNA double-strand breaks, beyond those induced by radiation alone, at concentrations of ≥17.5 mg of iodine per mL blood. In vivo, children undergoing cardiac catheterization who received an average concentration of 19 mg of iodine per mL blood showed a 19% increase in DNA double-strand breaks. A larger investigation is required to provide further information on the potential benefit of lowering the amount of iodinated contrast media administered during X-ray investigations.

Abstract:

Climate model projections suggest widespread drying in the Mediterranean Basin and wetting in Fennoscandia in the coming decades, largely as a consequence of greenhouse gas forcing of climate. To place these and other "Old World" climate projections into historical perspective based on more complete estimates of natural hydroclimatic variability, we have developed the "Old World Drought Atlas" (OWDA), a set of year-to-year maps of tree-ring-reconstructed summer wetness and dryness over Europe and the Mediterranean Basin during the Common Era. The OWDA matches historical accounts of severe drought and wetness with a spatial completeness not previously available. In addition, megadroughts reconstructed over north-central Europe in the 11th and mid-15th centuries reinforce other evidence from North America and Asia that droughts were more severe, extensive, and prolonged over Northern Hemisphere land areas before the 20th century, with an inadequate understanding of their causes. The OWDA provides new data to determine the causes of Old World drought and wetness and to attribute past climate variability to forced and/or internal variability.

Abstract:

Current variation-aware design methodologies, tuned for worst-case scenarios, are becoming increasingly pessimistic from the perspective of power and performance. A good example of such pessimism is setting the refresh rate of DRAMs according to worst-case access statistics, resulting in very frequent refresh cycles that are responsible for the majority of the standby power consumption of these memories. However, such a high refresh rate may not be required, either because the actual occurrence of such a worst case is extremely improbable, or because many applications are inherently error-resilient and can tolerate a certain number of potential failures. In this paper, we exploit and quantify the possibilities that exist in dynamic memory design by shifting to the so-called approximate computing paradigm in order to save power and enhance yield at no cost. The statistical characteristics of the retention time in dynamic memories were revealed by studying a fabricated 2-kb CMOS-compatible embedded DRAM (eDRAM) array based on gain cells. Measurements show that up to 73% of the retention power can be saved by altering the refresh period and setting it such that a small number of failures is allowed. We show that these savings can be further increased by utilizing known circuit techniques, such as body biasing, which can help not only to extend but also to favorably shape the retention time distribution. Our approach is one of the first attempts to assess the data integrity and energy trade-offs of eDRAMs for use in error-resilient applications, and it can prove helpful in the anticipated shift to approximate computing.
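
As a rough illustration of the refresh-versus-failure trade-off described above, the longest admissible refresh period can be read off the order statistics of a per-cell retention-time distribution. The log-normal distribution, its parameters, the 2 kb array size, and the failure budget in the sketch below are illustrative assumptions, not measurements from the fabricated array:

```python
import numpy as np

def max_refresh_period(retention_samples_s, allowed_failures):
    """Longest refresh period (s) such that at most `allowed_failures` cells
    have a retention time shorter than the refresh period."""
    sorted_ret = np.sort(retention_samples_s)
    # Allowing k failing cells means the refresh period may grow up to the
    # (k+1)-th shortest retention time in the array.
    return sorted_ret[allowed_failures]

# Illustrative log-normal retention-time spread for a 2 kb gain-cell array
# (distribution and parameters are assumptions, not silicon measurements).
rng = np.random.default_rng(0)
retention = rng.lognormal(mean=np.log(40e-6), sigma=0.6, size=2048)

t_strict = max_refresh_period(retention, allowed_failures=0)
t_relaxed = max_refresh_period(retention, allowed_failures=8)

# Retention (refresh) power scales roughly with refresh frequency, i.e. 1/T.
saving = 1.0 - t_strict / t_relaxed
print(f"worst-case refresh period:          {t_strict * 1e6:6.1f} us")
print(f"refresh period with 8 failures ok:  {t_relaxed * 1e6:6.1f} us")
print(f"approximate retention-power saving: {saving * 100:5.1f} %")
```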

Abstract:

Static timing analysis provides the basis for setting the clock period of a microprocessor core, based on its worst-case critical path. However, depending on the design, this critical path is not always excited, and therefore dynamic timing margins exist that can in principle be exploited for better speed or lower power consumption (through voltage scaling). This paper introduces predictive instruction-based dynamic clock adjustment as a technique to trim dynamic timing margins in pipelined microprocessors. To this end, we exploit the different timing requirements of individual instructions during the dynamically varying program execution flow, without the need for complex circuit-level measures to detect and correct timing violations. We provide a design flow to extract the dynamic timing information for the design using post-layout dynamic timing analysis, and we integrate the results into a custom cycle-accurate simulator. This simulator allows annotation of individual instructions with their impact on timing (in each pipeline stage) and rapidly derives the overall code execution time for complex benchmarks. The design methodology is illustrated at the microarchitecture level, demonstrating the performance and power gains possible for a 6-stage OpenRISC in-order general-purpose processor core in a 28 nm CMOS technology. We show that employing instruction-dependent dynamic clock adjustment leads on average to a 38% increase in operating speed or a 24% reduction in power consumption, compared with traditional synchronous clocking, which must at all times respect the worst-case timing identified through static timing analysis.
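
A toy version of the evaluation performed by such a cycle-accurate simulator might look as follows: each instruction class is annotated with the minimum clock period it needs, and the total execution time of a trace is compared against worst-case synchronous clocking. The instruction classes and period values are invented for illustration and are not extracted from the OpenRISC design:

```python
# Sketch of instruction-dependent dynamic clock adjustment vs. worst-case clocking.
MIN_PERIOD_NS = {
    "alu":    0.90,   # short logic paths only
    "load":   1.25,   # memory access dominates
    "mul":    1.40,   # excites the worst-case critical path
    "branch": 1.00,
}
WORST_CASE_PERIOD_NS = max(MIN_PERIOD_NS.values())

def execution_time_ns(trace, dynamic=True):
    """Total execution time of an instruction trace.

    Simplification: one instruction completes per cycle; a real pipelined model
    would take, in every cycle, the maximum period required by the instructions
    currently occupying the pipeline stages.
    """
    if dynamic:
        return sum(MIN_PERIOD_NS[op] for op in trace)
    return len(trace) * WORST_CASE_PERIOD_NS

trace = ["alu", "load", "alu", "branch", "mul", "alu", "load", "alu"]
t_dyn = execution_time_ns(trace)
t_wc = execution_time_ns(trace, dynamic=False)
print(f"worst-case clocking: {t_wc:.2f} ns")
print(f"dynamic clocking:    {t_dyn:.2f} ns  ({(t_wc / t_dyn - 1) * 100:.0f}% faster)")
```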

Abstract:

In this paper, we introduce a statistical data-correction framework that aims at improving DSP system performance in the presence of unreliable memories. The proposed signal processing framework implements best-effort error mitigation for signals that are corrupted by defects in unreliable storage arrays, using a statistical correction function extracted from the signal statistics, a data-corruption model, and an application-specific cost function. An application example from communication systems demonstrates the efficacy of the proposed approach.
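
One possible realization of such a correction function is a minimum-mean-square-error estimate of the stored value, combining a prior on the signal with a bit-flip corruption model. The sketch below assumes an 8-bit unsigned representation, i.i.d. bit flips, and a Gaussian-shaped prior; these are stand-ins for the paper's application-specific statistics, corruption model, and cost function:

```python
import numpy as np

BITS = 8
LEVELS = 2 ** BITS
P_FLIP = 0.02                      # assumed i.i.d. bit-flip probability

# Assumed signal prior: quantized Gaussian centred mid-range.
levels = np.arange(LEVELS)
prior = np.exp(-0.5 * ((levels - 128) / 20.0) ** 2)
prior /= prior.sum()

# Corruption model: P(read = y | stored = x) for independent bit flips.
hamming = np.array([[bin(x ^ y).count("1") for y in levels] for x in levels])
likelihood = (P_FLIP ** hamming) * ((1 - P_FLIP) ** (BITS - hamming))

# Statistical correction function: MMSE estimate of the stored value given the read word.
posterior = likelihood * prior[:, None]            # rows: stored x, cols: read y
posterior /= posterior.sum(axis=0, keepdims=True)
correction = posterior.T @ levels                  # E[x | y] for each read word y

# A read far outside the prior's support is mapped back toward plausible stored
# values, while a plausible read is left essentially unchanged.
print(f"read   3 -> corrected {correction[3]:.1f}")
print(f"read 131 -> corrected {correction[131]:.1f}")
```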

Abstract:

The worsening of process variations and the consequent increased spreads in circuit performance and power consumption hinder the satisfaction of the targeted budgets and lead to yield loss. Corner-based design and the adoption of design guardbands may limit the yield loss. However, in many cases such methods cannot capture the actual effects, which may be far less severe than predicted, leading to increasingly pessimistic designs. The situation is even more severe in memories, which consist of substantially different individual building blocks; this further complicates accurate analysis of the impact of variations at the architecture level, leaving many potential issues uncovered and opportunities unexploited. In this paper, we develop a framework for capturing non-trivial statistical interactions among all the components of a memory/cache. The developed tool is able to find the optimum memory/cache configuration under various constraints, allowing designers to make the right choices early in the design cycle and consequently improve performance, energy, and especially yield. Our results indicate that considering the architectural interactions between the memory components allows the pessimistic access times predicted by existing techniques to be relaxed.
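
The architecture-level effect can be illustrated with a small Monte Carlo sketch: when a cache access traverses several blocks whose delays vary independently, the access-time quantile of the combined path is tighter than the sum of per-block worst-case corners. The block names, delay distributions, and yield target below are assumptions made only for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000           # Monte Carlo samples (one per fabricated instance)
YIELD_TARGET = 0.999  # fraction of instances that must meet the access time

# Assumed per-block delay variation along a cache access path: (mean, sigma) in ns.
blocks = {
    "row_decoder":   (0.30, 0.03),
    "bitline_sense": (0.45, 0.06),
    "output_mux":    (0.20, 0.02),
}
samples = {name: rng.normal(mu, sigma, N) for name, (mu, sigma) in blocks.items()}

# Corner-based estimate: add each block's own 99.9th-percentile delay.
corner_based = sum(np.quantile(s, YIELD_TARGET) for s in samples.values())

# Statistical estimate: 99.9th percentile of the combined path delay, which
# captures that the blocks rarely all sit at their worst corner together.
path_delay = sum(samples.values())
statistical = np.quantile(path_delay, YIELD_TARGET)

print(f"corner-based access time: {corner_based:.3f} ns")
print(f"statistical access time:  {statistical:.3f} ns")
print(f"pessimism removed:        {(corner_based - statistical) * 1e3:.0f} ps")
```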

Abstract:

The area and power consumption of low-density parity-check (LDPC) decoders are typically dominated by embedded memories. To alleviate these high memory costs, this paper exploits the fact that all internal memories of an LDPC decoder are frequently updated with new data. These unique memory access statistics are taken advantage of by replacing all static standard-cell-based memories (SCMs) of a prior-art LDPC decoder implementation with dynamic SCMs (D-SCMs), which are designed to retain data just long enough to guarantee reliable operation. The use of D-SCMs leads to a 44% reduction in the silicon area of the LDPC decoder compared to the use of static SCMs. The low-power LDPC decoder architecture with refresh-free D-SCMs was implemented in a 90 nm CMOS process, and silicon measurements show full functionality and an information-bit throughput of up to 600 Mbps (as required by the IEEE 802.11n standard).
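
The refresh-free argument reduces to a simple inequality: a dynamic cell needs no refresh if every stored value is overwritten before its retention time expires. A minimal sketch of that feasibility check, with invented access intervals and retention figures rather than values from the actual decoder, is shown below:

```python
# Refresh-free D-SCM feasibility check: data must be rewritten before it decays.
# All numbers are illustrative assumptions, not figures from the decoder.

RETENTION_TIME_NS = 2_000   # assumed worst-case data retention of a dynamic cell
CLOCK_PERIOD_NS = 5         # assumed decoder clock period (200 MHz)

def refresh_free_ok(max_cycles_between_writes,
                    retention_ns=RETENTION_TIME_NS,
                    t_clk_ns=CLOCK_PERIOD_NS):
    """True if every memory word is overwritten before its content can decay."""
    return max_cycles_between_writes * t_clk_ns < retention_ns

# Example access statistics per memory: longest interval between updates, in cycles.
memories = {"check-node messages": 96, "variable-node messages": 96, "channel LLRs": 180}
for name, interval in memories.items():
    status = "refresh-free OK" if refresh_free_ok(interval) else "needs refresh or static cell"
    print(f"{name:24s}: {interval * CLOCK_PERIOD_NS:5d} ns between writes -> {status}")
```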

Abstract:

In this paper, we investigate the impact of faulty memory bit-cells on the performance of LDPC and Turbo channel decoders, based on realistic memory failure models. Our study investigates the inherent error resilience of such codes to potential memory faults affecting the decoding process. We develop two mitigation mechanisms that reduce the impact of memory faults rather than correct every single error. We show that protecting only a few bit-cells is sufficient to deal with high defect rates. In addition, we show how the use of repair iterations specifically helps to mitigate the impact of faults that occur inside the decoder itself.
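
The benefit of protecting only a few bit-cells follows from the unequal significance of the bits of a stored soft value: a fault in the sign or a high-order magnitude bit of an LLR distorts it far more than a fault in a low-order bit. The sketch below injects random bit faults into a fixed-point LLR word with and without guarding the two most significant cells; the word format, fault model, and fault rate are assumptions for illustration:

```python
import random

WORD_BITS = 6            # assumed fixed-point LLR width (two's complement)
PROTECTED_BITS = {5, 4}  # assumed protected cells: sign bit and top magnitude bit
P_FAULT = 0.05           # assumed per-bit-cell fault probability
TRIALS = 10_000

def inject_faults(word, protected, rng):
    """Flip each unprotected bit with probability P_FAULT (a crude stand-in
    for stuck-at faults in the decoder's message memory)."""
    for bit in range(WORD_BITS):
        if bit not in protected and rng.random() < P_FAULT:
            word ^= 1 << bit
    return word

def to_llr(word):
    """Interpret the stored word as a two's-complement LLR in [-32, 31]."""
    return word - (1 << WORD_BITS) if word >= (1 << (WORD_BITS - 1)) else word

def mean_abs_error(protected, stored=0b001010):   # stored LLR value = +10
    rng = random.Random(0)
    ref = to_llr(stored)
    return sum(abs(to_llr(inject_faults(stored, protected, rng)) - ref)
               for _ in range(TRIALS)) / TRIALS

print(f"mean |LLR error| without protection:      {mean_abs_error(set()):.2f}")
print(f"mean |LLR error| protecting sign and MSB: {mean_abs_error(PROTECTED_BITS):.2f}")
```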

Abstract:

Inherently error-resilient applications in areas such as signal processing, machine learning, and data analytics provide opportunities for relaxing reliability requirements and thereby reducing the overhead incurred by conventional error-correction schemes. In this paper, we exploit the tolerable imprecision of such applications by designing an energy-efficient fault-mitigation scheme for unreliable data memories to meet a target yield. The proposed approach uses a bit-shuffling mechanism to isolate faults into bit locations of lower significance. This skews the bit-error distribution towards the low-order bits, substantially limiting the output error magnitude. By controlling the granularity of the shuffling, the proposed technique enables trading off quality for power, area, and timing overhead. Compared to error-correction codes, this can reduce the overhead by as much as 83% in read power, 77% in read access time, and 89% in area when applied to various data mining applications in a 28nm process technology.
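
A minimal sketch of the bit-shuffling idea: if the faulty bit-cell position within a word is known from a fault map, the word can be rotated before storage so that the defective cell only ever holds a least significant bit, bounding the error magnitude. The rotation-based shuffle, the 8-bit word width, and the stuck-at-0 fault below are illustrative assumptions rather than the paper's exact mechanism:

```python
WORD_BITS = 8

def rotl(word, k, bits=WORD_BITS):
    """Rotate a word left by k bits."""
    k %= bits
    return ((word << k) | (word >> (bits - k))) & ((1 << bits) - 1)

def shuffle_for_fault(word, faulty_bit):
    """Rotate the word so its LSB lands in the faulty bit-cell position.

    Rotating left by the faulty position places data bit 0 into that cell,
    so the defective cell only ever stores the least significant bit."""
    return rotl(word, faulty_bit), faulty_bit

def unshuffle(stored, shift):
    return rotl(stored, WORD_BITS - shift)

# Example: bit-cell 7 (MSB position) of a word is stuck at 0 (assumed fault map).
value = 0b1011_0110                  # 182
stuck_mask = ~(1 << 7) & 0xFF        # models the stuck-at-0 cell on readback

plain_readback = value & stuck_mask                      # no shuffling: MSB lost
shuffled, shift = shuffle_for_fault(value, faulty_bit=7)
shuffled_readback = unshuffle(shuffled & stuck_mask, shift)

print(f"error without shuffling: {abs(plain_readback - value)}")    # 128
print(f"error with shuffling:    {abs(shuffled_readback - value)}") # at most 1
```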

Abstract:

We consider the problem of linking web search queries to entities from a knowledge base such as Wikipedia. Such linking enables converting a user's web search session into a footprint in the knowledge base that can be used to enrich the user profile. Traditional methods for entity linking have been directed towards finding entity mentions in text documents such as news reports, each of which is possibly linked to multiple entities, enabling the use of measures like entity-set coherence. Since web search queries are very short text fragments, criteria that rely on the existence of a multitude of mentions do not work well on them. We propose a three-phase method for linking web search queries to Wikipedia entities. The first phase performs IR-style scoring of entities against the search query to narrow the candidates down to a subset of entities; in the second phase, this subset is expanded to a larger set using hyperlink information. Lastly, we use a graph-traversal approach to identify the top entities to which the query should be linked. Through an empirical evaluation on real-world web search queries, we show that our method significantly improves linking accuracy over state-of-the-art methods.
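
The three phases can be sketched as a small pipeline. Everything below, including the toy knowledge base, token-overlap scoring, one-hop hyperlink expansion, and degree-based re-ranking, is a simplified hypothetical stand-in for the actual scoring functions and graph traversal used in the method:

```python
from collections import Counter

# Toy knowledge base: entity -> (description tokens, outgoing hyperlinks).
# These entries are invented for illustration.
KB = {
    "Python_(programming_language)": ({"python", "programming", "language"},
                                      ["Guido_van_Rossum", "CPython"]),
    "Python_(genus)":                ({"python", "snake", "genus"}, ["Snake"]),
    "Guido_van_Rossum":              ({"guido", "van", "rossum", "python"},
                                      ["Python_(programming_language)"]),
    "CPython":                       ({"cpython", "python", "interpreter"},
                                      ["Python_(programming_language)"]),
    "Snake":                         ({"snake", "reptile"}, ["Python_(genus)"]),
}

def phase1_ir_scoring(query, k=2):
    """IR-style scoring: rank entities by token overlap with the query."""
    q = set(query.lower().split())
    scored = sorted(KB, key=lambda e: len(q & KB[e][0]), reverse=True)
    return scored[:k]

def phase2_hyperlink_expansion(candidates):
    """Expand the candidate set with entities reachable via one hyperlink."""
    expanded = set(candidates)
    for e in candidates:
        expanded.update(KB[e][1])
    return expanded

def phase3_graph_rerank(expanded, k=2):
    """Re-rank by how often an entity is linked to from within the expanded set
    (a crude proxy for the graph-traversal step)."""
    in_links = Counter(t for e in expanded for t in KB[e][1] if t in expanded)
    return [e for e, _ in in_links.most_common(k)]

query = "python language tutorial"
print(phase3_graph_rerank(phase2_hyperlink_expansion(phase1_ir_scoring(query))))
```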

Abstract:

Purpose: The recently completed Chinese "Million Cataract Surgeries Program" (MCSP) is among the largest such campaigns ever, providing 1.05 million operations. We report MCSP outcomes for the first time, in Jiangxi, the province with the greatest program output. Methods: Ten county hospitals participating in the MCSP were selected in Jiangxi (range of gross domestic product per capita US$743-2998). Each hospital sought to enroll 75 consecutive MCSP patients aged ≥50 years. Data recorded included type of cataract procedure, bilateral uncorrected visual acuity (UCVA) and best-corrected visual acuity (BCVA), and refractive error preoperatively and ≥50 days postoperatively. Results: Among 715 patients (mean age 72.3±9.1 years, 55.5% female), preoperative UCVA was <3/60 (legally blind) bilaterally in 13.3% and unilaterally in the operated eye in 50.9%. No subjects had UCVA >6/18 preoperatively. Small-incision cataract surgery was performed in 92.3% of patients. Among the 662 patients (92.6%) who completed follow-up ≥40 days after surgery, BCVA was ≥6/18 in 80.1%, UCVA was ≥6/18 in 57.1%, and UCVA was <3/60 in 2.1%. Older age (p<0.001), female sex (p=0.04), worse refractive error (p=0.02), and presence of intraoperative (p=0.002) and postoperative (p<0.001) surgical complications were independently associated with worse postoperative UCVA. Based on these results, the MCSP cured an estimated 124,950 cases (13.3% × [100% - 2.1%] × 1.05 million) of bilateral blindness and 502,500 cases (50.9% × [100% - 2.1%] × 1.05 million) of unilateral blindness. Conclusions: Owing to the relatively good outcomes and the large number of surgeries performed on blind persons, the sight-restoring impact of the MCSP was probably substantial. © Informa Healthcare USA, Inc.