15 results for Nonlinear diffusion

in Helda - Digital Repository of the University of Helsinki


Relevance:

20.00%

Publisher:

Abstract:

The transition from aluminum to copper metallization and the decreasing feature size of integrated circuit devices created a need for a new diffusion barrier process. Copper metallization introduced an entirely new process flow with new materials such as low-k insulators and etch stoppers, which made diffusion barrier integration demanding. Atomic Layer Deposition (ALD) was seen as one of the most promising techniques for depositing copper diffusion barriers for future devices. ALD was used to deposit titanium nitride, tungsten nitride, and tungsten nitride carbide diffusion barriers. Titanium nitride was deposited with a conventional process, and also with a new in situ reduction process in which titanium metal was used as the reducing agent. Tungsten nitride was deposited with a well-known process from tungsten hexafluoride and ammonia, but tungsten nitride carbide, as a new material, required a new process chemistry. In addition to material properties, process integration for copper metallization was studied by performing compatibility experiments on different surface materials. Based on these studies, the titanium nitride and tungsten nitride processes were found to be incompatible with copper metal. However, the tungsten nitride carbide film was compatible with copper and exhibited the most promising properties for integration into the copper metallization scheme. The process scale-up to 300 mm wafers comprised extensive film uniformity studies, which improved understanding of the non-uniformity sources of ALD growth and the process-specific requirements for ALD reactor design. Based on these studies, it was discovered that the TiN process from titanium tetrachloride and ammonia required a perpendicular-flow reactor design for successful scale-up. The copper metallization scheme also includes copper oxide reduction prior to the barrier deposition and copper seed deposition prior to the copper metal deposition. A simple copper oxide reduction process was developed, in which the substrate was exposed to a gaseous reducing agent under vacuum at elevated temperature. Because the reduction was observed to be efficient enough to reduce a thick copper oxide film, the process was also considered an alternative method for producing the copper seed film via copper oxide reduction.

Relevance:

20.00%

Publisher:

Abstract:

The paradigm of computational vision hypothesizes that any visual function -- such as the recognition of your grandparent -- can be replicated by computational processing of the visual input. What are the computations that the brain performs? What should or could they be? Working on the latter question, this dissertation takes the statistical approach, in which suitable computations are learned from natural visual data itself. In particular, we empirically study the computational processing that emerges from the statistical properties of the visual world and from the constraints and objectives specified for the learning process. This thesis consists of an introduction and seven peer-reviewed publications; the purpose of the introduction is to present the area of study to a reader who is not familiar with computational vision research. In the introduction we briefly review the primary challenges of visual processing, as well as some current views on visual processing in the early visual systems of animals. We then describe the methodology used in our research and discuss the presented results. This discussion includes some additional remarks, speculations, and conclusions that were not featured in the original publications. The publications of this thesis present the following results. First, we empirically demonstrate that luminance and contrast are strongly dependent in natural images, contradicting previous theories suggesting that luminance and contrast are processed separately in natural systems because of their independence in the visual data. Second, we show that simple-cell-like receptive fields of the primary visual cortex can be learned in the nonlinear contrast domain by maximization of independence. Further, we provide the first reports of the emergence of conjunctive (corner-detecting) and subtractive (opponent-orientation) processing due to nonlinear projection pursuit with simple objective functions related to sparseness and response energy optimization. Then, we show that attempting to extract independent components of the nonlinear histogram statistics of a biologically plausible representation leads to projection directions that appear to differentiate between visual contexts. Such processing might be applicable to priming, i.e. the selection and tuning of later visual processing. We continue by showing that a different kind of thresholded low-frequency priming can be learned and used to make object detection faster with little loss in accuracy. Finally, we show that in a computational object detection setting, nonlinearly gain-controlled visual features of medium complexity can be acquired sequentially as images are encountered and discarded. We present two online algorithms to perform this feature selection, and propose the idea that for artificial systems, some processing mechanisms could be selected from the environment without optimizing the mechanisms themselves. In summary, this thesis explores learning visual processing on several levels. The learning can be understood as an interplay of input data, model structures, learning objectives, and estimation algorithms. The presented work adds to the growing body of evidence that statistical methods can be used to acquire intuitively meaningful visual processing mechanisms. The work also presents some predictions and ideas regarding biological visual processing.
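
For illustration of the statistical approach described above (not code from the thesis, which works in a nonlinear contrast domain), the following sketch learns ICA basis functions from image patches with scikit-learn; the random patch data, patch size, and component count are placeholder assumptions, and Gabor-like filters emerge only when real whitened natural-image patches are supplied.

    # Illustrative sketch only: learning ICA filters ("receptive fields")
    # from image patches, in the spirit of the statistical approach above.
    # The random patches below are placeholders; replace them with whitened
    # natural-image patches to obtain meaningful, simple-cell-like filters.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    patch_size = 16                       # assumed patch size (pixels)
    n_patches = 20000                     # assumed number of training patches
    patches = rng.standard_normal((n_patches, patch_size * patch_size))

    # Remove the mean of each patch; FastICA handles whitening internally.
    patches -= patches.mean(axis=1, keepdims=True)

    ica = FastICA(n_components=64, whiten="unit-variance",
                  max_iter=500, random_state=0)
    ica.fit(patches)

    # Each row of components_ is one learned filter; reshape to view it.
    filters = ica.components_.reshape(-1, patch_size, patch_size)
    print(filters.shape)                  # (64, 16, 16)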

Relevance:

20.00%

Publisher:

Abstract:

Carotid atherosclerotic disease is a major cause of stroke, but it may remain clinically asymptomatic. The factors that turn an asymptomatic plaque into a symptomatic one are not fully understood, nor are the subtle effects that a high-grade carotid stenosis may have on the brain. The purpose of this study was to evaluate brain microcirculation, diffusion, and cognitive performance in patients with a high-grade carotid artery stenosis, clinically either symptomatic or asymptomatic, undergoing carotid endarterectomy (CEA). We wanted to find out whether the stenoses are associated with diffusion or perfusion abnormalities of the brain or with variation in the cognitive functioning of the patients, to what extent the potential findings are affected by CEA, and how the clinically symptomatic and asymptomatic subjects compare with each other and with strictly healthy controls. Coagulation and fibrinolytic parameters were compared with the rate of microembolic signals (MES) in transcranial Doppler (TCD) and with the macroscopic appearance of the stenosing plaques at surgery. Patients (n=92) underwent CEA within the study. Blood samples pertaining to coagulation and fibrinolysis were collected before CEA, and the subjects underwent repeated TCD monitoring for MES. A subpopulation (n=46) underwent MR imaging and repeated neuropsychological examination (preoperatively, as well as 4 and 100 days after CEA). In MRI, the average apparent diffusion coefficients were higher in the ipsilateral white matter (WM), and although the interhemispheric difference was abolished by CEA, the levels remained higher than in controls. Symptomatic stenoses were associated with more sluggish perfusion, especially in WM, and with lower pulsatility of flow in TCD. All patients had poorer cognitive performance than healthy controls. Cognitive functions improved, as expected from a learning effect, despite transient postoperative worsening in a few subjects. Improvement was greater in patients with the deepest hypoperfusion, primarily in executive functions. Symptomatic stenoses were associated with higher hematocrit and tissue plasminogen activator antigen levels, with a higher rate of MES and ulcerated plaques, and with better postoperative improvement of vasoreactivity and pulsatility. In light of these findings, carotid stenosis is associated with differences in brain diffusion, perfusion, and cognition. The effect on diffusion in the ipsilateral WM, partially reversible by CEA, may be associated with WM degeneration. Asymptomatic and symptomatic subpopulations differ from each other in terms of hemodynamic adaptation and in their vascular physiological response to removal of the stenosis. Although CEA may be associated with a transient cognitive decline, a true improvement of cognitive performance by CEA is possible in patients with the most pronounced perfusion deficits. Mediators of fibrinolysis and unfavourable hemorheology may contribute to the development of symptomatic disease in patients with a high-grade stenosis.
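
As background for the diffusion measure mentioned above (a standard MRI relation, not a formula given in the abstract), the apparent diffusion coefficient is conventionally estimated from diffusion-weighted MRI with a mono-exponential model:

    % Conventional mono-exponential estimate of the apparent diffusion
    % coefficient (ADC) from a b = 0 reference image S_0 and a
    % diffusion-weighted image S_b; standard MRI background, not a
    % formula stated in the abstract.
    S_b = S_0 \, e^{-b \,\mathrm{ADC}}
    \quad\Longrightarrow\quad
    \mathrm{ADC} = \frac{1}{b}\,\ln\!\frac{S_0}{S_b}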

Relevance:

20.00%

Publisher:

Abstract:

This thesis studies quantile residuals and uses different methodologies to develop test statistics that are applicable in evaluating linear and nonlinear time series models based on continuous distributions. Models based on mixtures of distributions are of special interest because it turns out that for those models traditional residuals, often referred to as Pearson's residuals, are not appropriate. As such models have become more and more popular in practice, especially with financial time series data, there is a need for reliable diagnostic tools that can be used to evaluate them. The aim of the thesis is to show how such diagnostic tools can be obtained and used in model evaluation. The quantile residuals considered here are defined in such a way that, when the model is correctly specified and its parameters are consistently estimated, they are approximately independent with a standard normal distribution. All the tests derived in the thesis are pure significance type tests and are theoretically sound in that they properly take the uncertainty caused by parameter estimation into account. In Chapter 2, a general framework based on the likelihood function and smooth functions of univariate quantile residuals is derived that can be used to obtain misspecification tests for various purposes. Three easy-to-use tests aimed at detecting non-normality, autocorrelation, and conditional heteroscedasticity in quantile residuals are formulated. It also turns out that these tests can be interpreted as Lagrange Multiplier or score tests, so that they are asymptotically optimal against local alternatives. Chapter 3 extends the concept of quantile residuals to multivariate models. The framework of Chapter 2 is generalized, and tests aimed at detecting non-normality, serial correlation, and conditional heteroscedasticity in multivariate quantile residuals are derived based on it. Score test interpretations are obtained for the serial correlation and conditional heteroscedasticity tests, and, in a rather restricted special case, for the normality test. In Chapter 4 the tests are constructed using the empirical distribution function of quantile residuals. The so-called Khmaladze martingale transformation is applied in order to eliminate the uncertainty caused by parameter estimation. Various test statistics are considered, so that critical bounds for histogram-type plots as well as Quantile-Quantile and Probability-Probability type plots of quantile residuals are obtained. Chapters 2, 3, and 4 contain simulations and empirical examples which illustrate the finite sample size and power properties of the derived tests and also how the tests and related graphical tools based on residuals are applied in practice.
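
To make the construction concrete (a standard definition consistent with the property stated in the abstract; the notation here is chosen for illustration), the quantile residual of an observation is the normal quantile transform of its estimated conditional distribution function:

    % Quantile residual of observation y_t: the estimated conditional CDF
    % evaluated at y_t, mapped through the inverse standard normal CDF.
    % Under a correctly specified model with consistently estimated
    % parameters \hat\theta, the r_t are approximately i.i.d. N(0,1).
    r_t \;=\; \Phi^{-1}\!\bigl( F_{t-1}(y_t;\hat\theta) \bigr),
    \qquad
    F_{t-1}(y;\theta) \;=\; P_\theta\bigl(y_t \le y \mid \mathcal{F}_{t-1}\bigr)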

Relevance:

20.00%

Publisher:

Abstract:

The main method of modifying the properties of semiconductors is to introduce small amounts of impurities into the material. This is used to control the magnetic and optical properties of materials and to realize p- and n-type semiconductors out of intrinsic material in order to manufacture fundamental components such as diodes. As diffusion can be described as random mixing of material due to the thermal movement of atoms, it is essential to know the diffusion behavior of the impurities in order to manufacture working components. In the modified radiotracer technique, diffusion is studied using radioactive isotopes of elements as tracers. The technique is called modified because the atoms are deployed inside the material by ion beam implantation. With ion implantation, a distinct distribution of impurities can be deployed beneath the sample surface with good control over the amount of implanted atoms. As the electromagnetic radiation and other nuclear decay products emitted by radioactive materials can be easily detected, only very small amounts of impurities are needed. This makes it possible to study diffusion in pure materials without essentially modifying the initial properties by doping. In this thesis the modified radiotracer technique is used to study the diffusion of beryllium in GaN, ZnO, SiGe, and glassy carbon. GaN, ZnO, and SiGe are of great interest to the semiconductor industry, and beryllium, as a small and possibly rapid dopant, has not previously been studied using this technique. Glassy carbon has been added to demonstrate the feasibility of the technique. In addition, the diffusion of the magnetic impurities Mn and Co has been studied in GaAs and ZnO, respectively, with spintronic applications in mind.
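
As background for how such implanted tracer profiles are typically analysed (textbook diffusion relations, not specific results of the thesis), an initially thin implanted layer broadens into a Gaussian whose width yields the diffusion coefficient, and the temperature dependence of the diffusion coefficient is usually described by an Arrhenius law:

    % Standard thin-film analysis (textbook relations, not results from
    % the thesis): Gaussian broadening of an implanted dose Q after
    % annealing time t, and Arrhenius temperature dependence of D.
    c(x,t) \;=\; \frac{Q}{\sqrt{\pi D t}}\,
      \exp\!\left(-\frac{x^{2}}{4 D t}\right),
    \qquad
    D(T) \;=\; D_0 \exp\!\left(-\frac{E_A}{k_B T}\right)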

Relevance:

20.00%

Publisher:

Abstract:

This paper examines how volatility in financial markets can best be modeled. The examination investigates how well volatility models, both linear and nonlinear, absorb skewness and kurtosis. The examination covers the Nordic stock markets of Finland, Sweden, Norway, and Denmark. Different linear and nonlinear models are applied, and the results indicate that a linear model can almost always be used for modeling the series under investigation, even though nonlinear models perform slightly better in some cases. These results indicate that the markets under study are exposed to asymmetric patterns only to a certain degree. Negative shocks generally have a more prominent effect on the markets, but these effects are not particularly strong. However, in terms of absorbing skewness and kurtosis, nonlinear models outperform linear ones.
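
The abstract does not name the specific models; as one common example of the kind of linear/asymmetric pair it alludes to, a GARCH(1,1) conditional variance and its GJR extension, in which negative shocks are allowed to raise volatility more, can be written as:

    % One typical linear/asymmetric pair of volatility models (assumed for
    % illustration; the abstract does not name the models actually used):
    % GARCH(1,1) and GJR-GARCH(1,1) conditional variances.
    \sigma_t^2 \;=\; \omega + \alpha\,\varepsilon_{t-1}^2 + \beta\,\sigma_{t-1}^2
    \qquad\text{vs.}\qquad
    \sigma_t^2 \;=\; \omega
      + \bigl(\alpha + \gamma\,\mathbf{1}\{\varepsilon_{t-1} < 0\}\bigr)\varepsilon_{t-1}^2
      + \beta\,\sigma_{t-1}^2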

Relevance:

20.00%

Publisher:

Abstract:

Researchers within the fields of economic geography and organizational management have extensively studied learning and the prerequisites and impediments for knowledge transfer. This paper combines two discourses from these fields, the communities-of-practice and learning-region approaches, merging them through the so-called ecology-of-knowledge approach, which is used to examine the knowledge transfer from the House of Fabergé to the Finnish jewellery industry. We examine the pre-revolution St Petersburg jewellery cluster and post-revolution Helsinki, and the transfer of knowledge between these two locations through the components of communities of people, institutions, and industry. The paper shows that the industrial dynamics of the modern-day Finnish goldsmith industry were inherently shaped both by the transfer and the non-transfer of knowledge. It also contends that the "knowledge economy" is neither anchored in nor exclusive to the high-technology sector of the late 20th century.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this dissertation is to model economic variables with a mixture autoregressive (MAR) model. The MAR model is a generalization of the linear autoregressive (AR) model. The MAR model consists of K linear autoregressive components. At any given point in time, one of these autoregressive components is randomly selected to generate a new observation for the time series. The mixture probability can be constant over time or a direct function of some observable variable. Many economic time series have properties which cannot be described by linear and stationary time series models. A nonlinear autoregressive model such as the MAR model can be a plausible alternative for these time series. In this dissertation the MAR model is used to model stock market bubbles and the relationship between inflation and the interest rate. In the case of the inflation rate we arrive at a MAR model in which the inflation process is less mean-reverting during high inflation than during normal inflation. The interest rate moves one-for-one with expected inflation. We use data from the Livingston survey as a proxy for inflation expectations. We find that survey inflation expectations are not perfectly rational. According to our results, information stickiness plays an important role in the formation of expectations. We also find that survey participants have a tendency to underestimate inflation. A MAR model is also used to model stock market bubbles and crashes. This model has two regimes: the bubble regime and the error correction regime. In the error correction regime the price depends on a fundamental factor, the price-dividend ratio, and in the bubble regime the price is independent of fundamentals. In this model a stock market crash is usually caused by a regime switch from the bubble regime to the error correction regime. According to our empirical results, bubbles are related to low inflation. Our model also implies that bubbles influence the investment return distribution in both the short and the long run.
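
To make the model structure described above explicit (a standard formulation of a K-component Gaussian MAR model; the notation is chosen here for illustration and is not taken from the abstract):

    % A standard K-component Gaussian mixture autoregressive (MAR) model:
    % the conditional density of y_t is a mixture of K AR(p) components,
    % with mixing weights alpha_{k,t} that may be constant over time or
    % depend on observable variables (notation assumed for illustration).
    f(y_t \mid \mathcal{F}_{t-1})
     \;=\; \sum_{k=1}^{K} \alpha_{k,t}\,
     \frac{1}{\sigma_k}\,
     \phi\!\left(\frac{y_t - \varphi_{k,0}
       - \sum_{i=1}^{p}\varphi_{k,i}\, y_{t-i}}{\sigma_k}\right),
    \qquad
    \sum_{k=1}^{K} \alpha_{k,t} \;=\; 1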