945 results for weighted shift


Relevance: 20.00%

Abstract:

The unprecedented and relentless growth in the electronics industry is feeding the demand for integrated circuits (ICs) with increasing functionality and performance at minimum cost and power consumption. As predicted by Moore's law, ICs are being aggressively scaled to meet this demand. While the continuous scaling of process technology is reducing gate delays, the performance of ICs is being increasingly dominated by interconnect delays. In an effort to improve submicrometer interconnect performance, to increase packing density, and to reduce chip area and power consumption, the semiconductor industry is focusing on three-dimensional (3D) integration. However, volume production and commercial exploitation of 3D integration are not feasible yet due to significant technical hurdles.

At the present time, interposer-based 2.5D integration is emerging as a precursor to stacked 3D integration. All the dies and the interposer in a 2.5D IC must be adequately tested for product qualification. However, since the structure of 2.5D ICs is different from that of traditional 2D ICs, new challenges have emerged: (1) pre-bond interposer testing, (2) lack of test access, (3) limited ability for at-speed testing, (4) high-density I/O ports and interconnects, (5) reduced number of test pins, and (6) high power consumption. This research targets the above challenges, and effective solutions have been developed to test both the dies and the interposer.

The dissertation first introduces the basic concepts of 3D ICs and 2.5D ICs. Prior work on testing of 2.5D ICs is studied. An efficient method is presented to locate defects in a passive interposer before stacking. The proposed test architecture uses e-fuses that can be programmed to connect or disconnect functional paths inside the interposer. The concept of a die footprint is utilized for interconnect testing, and the overall assembly and test flow is described. Moreover, the concept of weighted critical area is defined and utilized to reduce test time. In order to fully determine the location of each e-fuse and the order of functional interconnects in a test path, we also present a test-path design algorithm. The proposed algorithm can generate all test paths for interconnect testing.

In order to test for opens, shorts, and interconnect delay defects in the interposer, a test architecture is proposed that is fully compatible with the IEEE 1149.1 standard and relies on an enhancement of the standard test access port (TAP) controller. To reduce test cost, a test-path design and scheduling technique is also presented that minimizes a composite cost function based on test time and the design-for-test (DfT) overhead in terms of additional through silicon vias (TSVs) and micro-bumps needed for test access. The locations of the dies on the interposer are taken into consideration in order to determine the order of dies in a test path.

To address the scenario of high density of I/O ports and interconnects, an efficient built-in self-test (BIST) technique is presented that targets the dies and the interposer interconnects. The proposed BIST architecture can be enabled by the standard TAP controller in the IEEE 1149.1 standard. The area overhead introduced by this BIST architecture is negligible; it includes two simple BIST controllers, a linear-feedback-shift-register (LFSR), a multiple-input-signature-register (MISR), and some extensions to the boundary-scan cells in the dies on the interposer. With these extensions, all boundary-scan cells can be used for self-configuration and self-diagnosis during interconnect testing. To reduce the overall test cost, a test scheduling and optimization technique under power constraints is described.
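The LFSR and MISR named above are the standard pattern-generation and response-compaction blocks of logic BIST, and their shift-register behavior is easy to sketch in software. The code below is a generic Fibonacci-style illustration with hypothetical tap positions and word width, not the dissertation's hardware design.

```python
def lfsr_step(state, taps, width):
    """One shift of a Fibonacci LFSR: feedback bit is the XOR of the tap bits."""
    fb = 0
    for t in taps:
        fb ^= (state >> t) & 1
    return ((state << 1) | fb) & ((1 << width) - 1)

def lfsr_sequence(seed, taps, width, n):
    """Generate n successive LFSR states, i.e., pseudo-random test patterns."""
    out, s = [], seed
    for _ in range(n):
        s = lfsr_step(s, taps, width)
        out.append(s)
    return out

def misr_compact(responses, taps, width):
    """Fold a stream of response words into a single signature: each cycle
    shifts the register and XORs in the next parallel response word."""
    sig = 0
    for r in responses:
        sig = lfsr_step(sig, taps, width) ^ (r & ((1 << width) - 1))
    return sig
```

With taps (3, 2) the 4-bit register realizes a maximal-length polynomial and cycles through all 15 nonzero states, so a short tap list suffices to exercise every interconnect pattern before the MISR signature is compared against a golden value.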

In order to accomplish testing with a small number of test pins, the dissertation presents two efficient ExTest scheduling strategies that implement interconnect testing between tiles inside a system-on-chip (SoC) die on the interposer while satisfying the practical constraint that the number of required test pins cannot exceed the number of pins available at the chip level. The tiles in the SoC are divided into groups based on the manner in which they are interconnected. To minimize test time, two optimization solutions are introduced: the first minimizes the number of input test pins, and the second minimizes the number of output test pins. In addition, two subgroup configuration methods are proposed to generate subgroups inside each test group.

Finally, the dissertation presents a programmable method for shift-clock stagger assignment to reduce power supply noise during SoC die testing in 2.5D ICs. An SoC die in a 2.5D IC is typically composed of several blocks, and two neighboring blocks that share the same power rails should not toggle at the same time during shift. The proposed programmable method therefore does not assign the same stagger value to neighboring blocks. The positions of all blocks are first analyzed, and the shared boundary length between blocks is calculated. Based on the positional relationships between the blocks, a mathematical model is presented to derive optimal results for small-to-medium-sized problems. For larger designs, a heuristic algorithm is proposed and evaluated.
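The constraint that neighboring blocks must not share a stagger value is structurally a graph coloring problem, so a greedy sketch conveys the idea. This is an illustration under that framing, with hypothetical block names; the dissertation's exact model and heuristic are not reproduced here.

```python
def assign_staggers(blocks, adjacency, num_staggers):
    """Greedy stagger assignment: blocks sharing power rails (edges in
    `adjacency`) never receive the same shift-clock stagger value.
    `blocks` fixes the visiting order, e.g. by shared-boundary length."""
    stagger = {}
    for b in blocks:
        used = {stagger[n] for n in adjacency.get(b, ()) if n in stagger}
        for s in range(num_staggers):
            if s not in used:
                stagger[b] = s
                break
        else:
            raise ValueError("not enough stagger values for block " + b)
    return stagger
```

Visiting blocks in a deliberate order (for instance, longest shared boundary first) mirrors the dissertation's use of positional analysis before assignment; the greedy pass then guarantees no two rail-sharing neighbors toggle in the same shift phase.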

In summary, the dissertation targets important design and optimization problems related to testing of interposer-based 2.5D ICs. The proposed research has led to theoretical insights, experimental results, and a set of test and design-for-test methods to make testing effective and feasible from a cost perspective.

Relevance: 20.00%

Abstract:

Purpose: This study has two goals. The first is to investigate the feasibility of using classic textural feature extraction for radiotherapy response assessment in a unique cohort of early-stage breast cancer patients who received single-dose preoperative radiotherapy. The second is to investigate the clinical feasibility of using classic texture features as biomarkers, supplementary to the regional apparent diffusion coefficient, for radiotherapy response assessment in gynecological cancer.

Methods and Materials: For the breast cancer study, 15 patients with early-stage breast cancer were enrolled in this retrospective study. Each patient received a single-fraction radiation treatment, and DWI and DCE-MRI scans were conducted before and after the radiotherapy. DWI scans were acquired using a spin-echo EPI sequence with diffusion weighting factors of b = 0 and b = 500 s/mm2, and the apparent diffusion coefficient (ADC) maps were calculated. DCE-MRI scans were acquired using a T1-weighted 3D SPGR sequence with a temporal resolution of about 1 minute. The contrast agent (CA) was injected intravenously at a dose of 0.1 mmol/kg body weight at 2 ml/s. Two parameters, the volume transfer constant (Ktrans) and the rate constant (kep), were analyzed using the two-compartment Tofts pharmacokinetic model. For the pharmacokinetic parametric maps and ADC maps, 33 textural features were generated from the clinical target volume (CTV) in a 3D fashion using the classic gray level co-occurrence matrix (GLCOM) and gray level run length matrix (GLRLM). The Wilcoxon signed-rank test was used to determine the significance of each texture feature's change after the radiotherapy. The significance level was set to 0.05 with Bonferroni correction.
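As a minimal illustration of the mono-exponential DWI model underlying the ADC maps described above, the sketch below solves S_b = S_0 · exp(−b · ADC) for ADC from a b = 0 image and one diffusion-weighted image. The function names and the toy image values are hypothetical; this is not the authors' implementation.

```python
import math

def adc_value(s0, sb, b):
    """Mono-exponential DWI model S_b = S_0 * exp(-b * ADC);
    solve for ADC given signals at b = 0 and one b-value (b in s/mm^2)."""
    return math.log(s0 / sb) / b

def adc_map(img_b0, img_b, b):
    """Voxel-wise ADC map (mm^2/s) from a b = 0 image and one DWI image,
    both given as nested lists of signal intensities."""
    return [[adc_value(s0, sb, b) for s0, sb in zip(row0, rowb)]
            for row0, rowb in zip(img_b0, img_b)]
```

For example, a voxel whose signal halves between b = 0 and b = 500 s/mm2 has ADC = ln(2)/500 ≈ 1.4e-3 mm2/s, a typical soft-tissue magnitude.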

For the gynecological cancer study, 12 female patients with gynecologic cancer treated with fractionated external beam radiotherapy (EBRT) combined with high-dose-rate (HDR) intracavitary brachytherapy were studied. Each patient first received EBRT followed by five fractions of HDR treatment. Before EBRT and before each fraction of brachytherapy, diffusion-weighted MRI (DWI-MRI) and CT scans were acquired. DWI scans were acquired in the sagittal plane using a spin-echo echo-planar imaging sequence with weighting factors of b = 500 s/mm2 and b = 1000 s/mm2; one set of b = 0 s/mm2 images was also acquired. ADC maps were calculated using a linear least-squares fitting method. Distributed diffusion coefficient (DDC) maps and the stretching parameter α were also calculated. For the ADC and DDC maps, 33 classic texture features were generated from the high-risk clinical target volume (HR-CTV) using the classic gray level run length matrix (GLRLM) and gray level co-occurrence matrix (GLCOM). The Wilcoxon signed-rank test was applied to determine the significance of each feature's numerical change after radiotherapy. The significance level was set to 0.05 with multiple-comparison correction where applicable.
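To make the co-occurrence machinery behind both studies concrete, here is a minimal, self-contained sketch of a normalized GLCM for a single pixel offset, together with the classic contrast feature. It is a toy 2D illustration (the studies compute 33 features in 3D), and the function names are hypothetical.

```python
from collections import Counter

def glcm(image, dx=1, dy=0, levels=4):
    """Gray level co-occurrence matrix for one pixel offset (dx, dy),
    normalized so that all entries sum to 1."""
    counts = Counter()
    rows, cols = len(image), len(image[0])
    for y in range(rows):
        for x in range(cols):
            ny, nx = y + dy, x + dx
            if 0 <= ny < rows and 0 <= nx < cols:
                counts[(image[y][x], image[ny][nx])] += 1
    total = sum(counts.values())
    return [[counts[(i, j)] / total for j in range(levels)]
            for i in range(levels)]

def glcm_contrast(p):
    """Contrast texture feature: sum over (i, j) of (i - j)^2 * p(i, j)."""
    n = len(p)
    return sum((i - j) ** 2 * p[i][j] for i in range(n) for j in range(n))
```

On a smoothly varying image most co-occurring pairs have equal gray levels, so contrast stays low; radiation-induced heterogeneity changes would shift such features, which is what the signed-rank test probes.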

Results: For the breast cancer study, regarding the ADC maps calculated from DWI-MRI, 24 out of 33 CTV features changed significantly after the radiotherapy. For the DCE-MRI pharmacokinetic parameters, all 33 CTV features of both Ktrans and kep changed significantly.

For the gynecological cancer study, regarding ADC maps, 28 out of 33 HR-CTV texture features showed significant changes after the EBRT treatment, and the same 28 features showed significant changes after the HDR treatments. These same features also showed significant changes over the whole radiotherapy treatment process.

Conclusion: Initial results indicate that certain classic texture features are sensitive to radiation-induced changes, and texture features with significant numerical changes can be used to monitor radiotherapy effects. This suggests that certain texture features might serve as biomarkers, supplementary to ADC and DDC, for assessment of radiotherapy response in breast and gynecological cancer.

Relevance: 20.00%

Abstract:

Extensive investigation has been conducted on network data, especially weighted networks in the form of symmetric matrices with discrete count entries. Motivated by statistical inference on multi-view weighted network structure, this paper proposes a Poisson-Gamma latent factor model that not only separates view-shared and view-specific spaces but also achieves reduced dimensionality. A multiplicative gamma process shrinkage prior is used to avoid overparameterization, and efficient Gibbs sampling is obtained through fully conjugate conditional posteriors. By accommodating both view-shared and view-specific parameters, the model adapts flexibly to the degree of similarity across view-specific spaces. Accuracy and efficiency are tested in simulation experiments, and an application to real soccer network data illustrates the model.
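The Gamma-Poisson conjugacy that makes the full conditionals tractable can be shown in miniature: with a Gamma(a, b) prior (shape/rate) on a Poisson rate and observed counts y_1..y_n, the posterior is Gamma(a + Σy_i, b + n). The sketch below samples that posterior, standing in for one full-conditional step of a larger Gibbs sampler; it is not the paper's multi-view model, and the function names are hypothetical.

```python
import random

def gamma_poisson_posterior(a, b, counts):
    """Conjugate update: Gamma(a, b) prior (shape, rate) on a Poisson
    rate with observed counts gives Gamma(a + sum(counts), b + n)."""
    return a + sum(counts), b + len(counts)

def gibbs_rate_samples(a, b, counts, n_iter, seed=0):
    """Draw posterior samples of the Poisson rate; a one-parameter
    stand-in for a single full-conditional step of a Gibbs sampler."""
    rng = random.Random(seed)
    shape, rate = gamma_poisson_posterior(a, b, counts)
    # random.gammavariate takes (shape, scale), so scale = 1 / rate.
    return [rng.gammavariate(shape, 1.0 / rate) for _ in range(n_iter)]
```

With counts [2, 3, 4] and a Gamma(1, 1) prior, the posterior is Gamma(10, 4), so sampled rates concentrate around the posterior mean 2.5.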

Relevance: 20.00%

Abstract:

Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.

Relevance: 20.00%

Abstract:

Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.

Relevance: 20.00%

Abstract:

This article examines the transformation in the narratives of the international governance of security over the last two decades. It suggests that there has been a major shift from governing interventions designed to address the causes of security problems to the regulation of the effects of these problems. In rearticulating the goals of international actors, the means and mechanisms of security governance have also changed, no longer focused on the universal application of Western knowledge and resources but rather on the unique local and organic processes at work in societies that bear the brunt of these problems. This transformation takes the conceptualisation of security governance out of the traditional terminological lexicon of security expertise and universal solutions and instead articulates the problematic of security and the policing of global risks in terms of local management processes, suggesting that decentralised coping strategies and self-policing are more effective and sustainable solutions.

Relevance: 20.00%

Abstract:

Innovation is a fundamental part of social work. In recent years there has been a shift in the innovation paradigm, making it easier to accept this relationship. National and supranational policies aimed at promoting innovation appear to be specifically guided by this idea. To affirm this hypothesis, it is necessary to review the perception that social workers have of their duties. It is also useful to examine particular cases that show how such social innovation arises.

Relevance: 20.00%

Abstract:

Ecosystem reconfigurations arising from climate-driven changes in species distributions are expected to have profound ecological, social, and economic implications. Here we reveal a rapid climate-driven regime shift of Australian temperate reef communities, which lost their defining kelp forests and became dominated by persistent seaweed turfs. After decades of ocean warming, extreme marine heat waves forced a 100-kilometer range contraction of extensive kelp forests and saw temperate species replaced by seaweeds, invertebrates, corals, and fishes characteristic of subtropical and tropical waters. This community-wide tropicalization fundamentally altered key ecological processes, suppressing the recovery of kelp forests.


Relevance: 20.00%

Abstract:

Despite evidence from a number of Earth systems that abrupt temporal changes known as regime shifts are important, their nature, scale and mechanisms remain poorly documented and understood. Applying principal component analysis, change-point analysis and a sequential t-test analysis of regime shifts to 72 time series, we confirm that the 1980s regime shift represented a major change in the Earth's biophysical systems from the upper atmosphere to the depths of the ocean and from the Arctic to the Antarctic, and occurred at slightly different times around the world. Using historical climate model simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5) and statistical modelling of historical temperatures, we then demonstrate that this event was triggered by rapid global warming from anthropogenic plus natural forcing, the latter associated with the recovery from the El Chichón volcanic eruption. The shift in temperature that occurred at this time is hypothesized as the main forcing for a cascade of abrupt environmental changes. Within the context of the last century or more, the 1980s event was unique in terms of its global scope and scale; our observed consequences imply that if unavoidable natural events such as major volcanic eruptions interact with anthropogenic warming unforeseen multiplier effects may occur.
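A sequential t-test analysis of regime shifts, one of the three methods named above, at its core scans candidate split points of a time series and flags the split with the strongest mean difference between segments. The sketch below is a bare-bones version of that idea (a single change point, Welch t statistic), not the published STARS algorithm; function names and the minimum segment length are assumptions.

```python
import math

def _mean(xs):
    return sum(xs) / len(xs)

def t_stat(a, b):
    """Welch t statistic between two segments of a time series."""
    va = sum((x - _mean(a)) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - _mean(b)) ** 2 for x in b) / (len(b) - 1)
    return (_mean(b) - _mean(a)) / math.sqrt(va / len(a) + vb / len(b))

def best_change_point(series, min_seg=3):
    """Scan candidate split indices and return the one maximizing |t|,
    i.e., the most likely single regime-shift location."""
    best_k, best_t = None, 0.0
    for k in range(min_seg, len(series) - min_seg + 1):
        t = abs(t_stat(series[:k], series[k:]))
        if t > best_t:
            best_k, best_t = k, t
    return best_k, best_t
```

Applied to each of the 72 series independently, such a scan would localize the step change; the paper's analysis adds significance testing and multi-series synthesis on top of this basic idea.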


Relevance: 20.00%

Abstract:

This work presents a computational code, called MOMENTS, developed for use in process control to determine a characteristic transfer function for industrial units when radiotracer techniques are applied to study a unit's performance. The methodology is based on measuring the residence time distribution (RTD) function and calculating the first and second temporal moments of the tracer data obtained by two NaI scintillation detectors positioned to register the complete tracer movement inside the unit. A nonlinear regression technique is used to fit various mathematical models, and a statistical test selects the best result for the transfer function. Using the MOMENTS code, twelve different models can be fitted to a curve to calculate technical parameters for the unit.
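The first and second temporal moments mentioned above are the mean residence time and the variance of the normalized RTD curve E(t). A minimal sketch, assuming uniformly spaced samples (so the time step cancels out of the ratios) and a hypothetical function name:

```python
def rtd_moments(times, conc):
    """First and second temporal moments of a tracer curve C(t):
    mean residence time and variance of the normalized RTD E(t).
    Assumes uniformly spaced samples, so dt cancels in the ratios."""
    area = sum(conc)
    e = [c / area for c in conc]                      # normalized E(t)
    t_mean = sum(t * ei for t, ei in zip(times, e))   # first moment
    var = sum((t - t_mean) ** 2 * ei                  # second central moment
              for t, ei in zip(times, e))
    return t_mean, var
```

For a symmetric tracer pulse peaking at t = 2, the mean residence time is 2 and the variance reflects the pulse spread; in practice these two numbers are what the model-fitting step matches against each candidate transfer function.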

Relevance: 20.00%

Abstract:

The aim of this thesis is to estimate the performance of the ALICE detector in reconstructing the Lambda_c baryon in Pb-Pb collisions using an innovative approach to particle identification. The main idea of the new approach is to replace the usual particle selection, based on cuts applied to the detector signals, with a selection that uses probabilities derived from Bayes' theorem (hence the name "Bayesian weighting"). To establish which method is the most efficient, a comparison with other standard approaches used in ALICE is presented. To this end, a "fast" Monte Carlo simulation was implemented, configured with the particle abundances expected in the new LHC energy regime and with the observed detector performance. A realistic estimate of Lambda_c production was then derived by combining known results from previous experiments, and this was used to estimate the statistical significance expected at LHC Run 2 and Run 3. The physics of ALICE is described, including the Standard Model, quantum chromodynamics, and the quark-gluon plasma. Some recent experimental results (RHIC and LHC) are then analyzed, the operation of ALICE and its components is described, and finally the results obtained are analyzed. These show that the method is more efficient than the usual approaches in ALICE and that, consequently, a "full" simulation should be performed to quantify the performance of the new method even better and to verify the results in a fully realistic scenario.
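The Bayesian weighting idea above reduces to a posterior over species: each detector signal contributes a likelihood per species hypothesis, weighted by the expected abundances as priors. A minimal sketch with hypothetical names and invented toy numbers, not the ALICE implementation:

```python
def bayesian_pid(priors, likelihoods):
    """Posterior species probabilities via Bayes' theorem:
    P(species | signal) ∝ P(signal | species) * P(species),
    where the priors encode expected particle abundances."""
    unnorm = {sp: priors[sp] * likelihoods[sp] for sp in priors}
    z = sum(unnorm.values())
    return {sp: w / z for sp, w in unnorm.items()}

# Toy example: abundant pions, but a signal that looks kaon-like.
priors = {"pion": 0.80, "kaon": 0.15, "proton": 0.05}
likelihoods = {"pion": 0.1, "kaon": 0.6, "proton": 0.3}
posterior = bayesian_pid(priors, likelihoods)
```

Rather than cutting on the raw signal, each candidate carries these posterior weights into the analysis, which is what distinguishes the weighted approach from hard selection cuts.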

Relevance: 20.00%

Abstract:

This brief work explains the importance of studying a celestial body by observing its spectrum, i.e., a plot of the emitted flux as a function of frequency or wavelength, in which spectral lines appear. These lines are formed by the interaction between matter and radiation, through the absorption or emission of photons following electronic transitions, and also vibrational and rotational transitions in molecules. Analysis of the spectral lines yields a great deal of information about the object: the composition and abundance of its chemical species, from the type and intensity of the lines present; the temperature and pressure of the object, from the line widths; information on its relative motion and distance from the observer, from the measured line shifts; and finally the density and magnetic fields of the interstellar medium. For many astronomical objects that are too distant, studying the spectrum is the only way to draw conclusions about their nature. For this reason, the first chapter derives the radiative transfer equation, focusing on the processes that govern the absorption and emission of energy. The second chapter treats the particular case of stellar atmospheres, deriving, through a series of approximations applied to the radiative transfer equation, which part of a star we observe and where the spectral lines form. The mechanisms leading to the formation of spectral lines are then examined, analyzing both radiative transitions, via the Einstein coefficients, and collisional transitions, and distinguishing between permitted and forbidden transitions through the selection rules.
Finally, the information that can be extracted from spectral lines is examined, with attention to line shifts and line modifications, and with a more detailed description of the 21 cm line of the hydrogen atom, which is fundamental in astrophysics.
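The line-shift measurement mentioned above maps directly onto the non-relativistic Doppler formula v = c · Δλ/λ_rest. A minimal sketch, using the HI 21 cm line (rest wavelength ≈ 21.106 cm) as the example; the function name and the sample observed wavelength are hypothetical:

```python
C_KM_S = 299792.458  # speed of light in km/s

def line_shift_velocity(lambda_obs, lambda_rest):
    """Non-relativistic Doppler velocity from a spectral line shift:
    v = c * (lambda_obs - lambda_rest) / lambda_rest.
    Positive values mean the source is receding (redshift)."""
    return C_KM_S * (lambda_obs - lambda_rest) / lambda_rest
```

An HI line observed at 21.127 cm instead of 21.106 cm thus implies a recession velocity of roughly 300 km/s, the kind of measurement used to map gas motions in galaxies.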

Relevance: 20.00%

Abstract:

This paper compares two linear programming (LP) models for shift scheduling in services where homogeneously skilled employees are available at limited times. Although both models are based on set covering approaches, one explicitly matches employees to shifts, while the other imposes this matching implicitly. Each model is used in three forms: one with complete meal break placement flexibility, another with very limited flexibility, and a third without meal breaks. Each form provides initial schedules to a completion/improvement heuristic, that is, a construction/improvement heuristic operating on a starting schedule. On 80 test problems varying widely in scheduling flexibility, employee staffing requirements, and employee availability characteristics, all six LP-based procedures generated lower-cost schedules than a comparison from-scratch construction/improvement heuristic. This heuristic, which perpetually maintains an explicit matching of employees to shifts, consists of three phases that add, drop, and modify shifts. In terms of schedule cost, schedule generation time, and model size, the procedures based on the implicit model performed better, as a group, than those based on the explicit model. The LP model with complete break placement flexibility that implicitly matches employees to shifts generated schedules costing 6.7% less than those developed by the from-scratch heuristic.
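The set covering core of these models can be illustrated without an LP solver: pick the cheapest combination of candidate shifts whose working periods jointly cover all periods with demand. The brute-force sketch below handles unit demand only and uses each shift at most once; it is a toy stand-in for the paper's LP formulations, and the shift names, costs, and hours are invented.

```python
from itertools import combinations

def min_cost_cover(shifts, costs, periods):
    """Brute-force set covering: find the cheapest subset of shifts
    whose covered periods include every period in `periods`.
    shifts: dict name -> set of covered hour slots."""
    names = list(shifts)
    best, best_cost = None, float("inf")
    for r in range(1, len(names) + 1):
        for combo in combinations(names, r):
            covered = set().union(*(shifts[n] for n in combo))
            if periods <= covered:                 # all demand covered?
                cost = sum(costs[n] for n in combo)
                if cost < best_cost:
                    best, best_cost = set(combo), cost
    return best, best_cost

# Hypothetical shifts covering hour slots 8..15.
shifts = {"early": {8, 9, 10, 11}, "mid": {10, 11, 12, 13},
          "late": {12, 13, 14, 15}, "split": {8, 9, 14, 15}}
costs = {"early": 4, "mid": 4, "late": 4, "split": 5}
```

Real LP formulations add per-period demand counts, break placement variables, and employee availability constraints; exhaustive search is only viable at this toy scale, which is why the paper relies on LP relaxations plus a completion/improvement heuristic.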