6 results for BY-LAYER APPROACH
in Helda - Digital Repository of University of Helsinki
Abstract:
The paper examines the needs, premises and criteria for effective public participation in tactical forest planning. A method for participatory forest planning utilizing the techniques of preference analysis, professional expertise and heuristic optimization is introduced. The techniques do not cover the whole process of participatory planning, but are applied as a tool constituting the numerical core for decision support. The complexity of multi-resource management is addressed through hierarchical decision analysis, which assesses public values, preferences and decision criteria with respect to the planning situation. An optimal management plan is sought using heuristic optimization. The plan can be further improved through mutual negotiations, if necessary. The use of the approach is demonstrated with an illustrative example; its merits and challenges for participatory forest planning and decision making are discussed, and a model for applying it in a general forest planning context is depicted. By using the approach, valuable information can be obtained about public preferences and about the effects that taking them into consideration has on the choice of the combination of standwise treatment proposals for a forest area. Participatory forest planning calculations carried out with the approach presented in the paper can be utilized in conflict management and in developing compromises between competing interests.
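The abstract does not specify which heuristic or utility model is used; the following is a minimal, hypothetical sketch of the general idea, assuming an additive multi-criteria utility with stakeholder-derived weights and a simple hill-climbing search over standwise treatment choices (all names and numbers are illustrative):

```python
import random

# Hypothetical data: for each stand, each candidate treatment scores against
# the decision criteria (here: timber revenue, scenic/recreational value).
# The additive utility form and all values are assumptions, not the paper's model.
stands = {
    "stand_1": {"clearcut": (9.0, 1.0), "thinning": (5.0, 6.0), "no_cut": (0.0, 9.0)},
    "stand_2": {"clearcut": (7.0, 2.0), "thinning": (4.0, 7.0), "no_cut": (0.0, 8.0)},
}
weights = (0.6, 0.4)  # stakeholder preference weights from the decision analysis

def utility(plan):
    """Weighted additive utility of a plan (one treatment chosen per stand)."""
    return sum(w * s for stand, t in plan.items()
               for w, s in zip(weights, stands[stand][t]))

def hill_climb(iterations=1000):
    # Start from a random combination of standwise treatments.
    plan = {s: random.choice(list(ts)) for s, ts in stands.items()}
    best = utility(plan)
    for _ in range(iterations):
        s = random.choice(list(stands))      # pick a stand at random
        t = random.choice(list(stands[s]))   # try an alternative treatment
        old = plan[s]
        plan[s] = t
        u = utility(plan)
        if u >= best:
            best = u                         # keep improving moves
        else:
            plan[s] = old                    # revert worsening moves
    return plan, best

print(hill_climb())
```

In practice the weights and criteria would come from the hierarchical decision analysis described above, and the search could equally be simulated annealing or tabu search; hill climbing is used here only for brevity.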
Abstract:
Atomic layer deposition (ALD) is a method to deposit thin films from gaseous precursors onto the substrate layer by layer, so that the film thickness can be tailored with atomic layer accuracy. Film tailoring is further emphasized with selective-area ALD, which enables the film growth to be controlled also on the substrate surface. Selective-area ALD allows a reduction in the number of process steps in preparing thin film devices. This can be of great technological importance as ALD films come into wider use in different applications. Selective-area ALD can be achieved by passivation or activation of a surface. In this work, ALD growth was prevented by octadecyltrimethoxysilane, octadecyltrichlorosilane and 1-dodecanethiol SAMs, and by PMMA (poly(methyl methacrylate)) and PVP (poly(vinyl pyrrolidone)) polymer films. SAMs were prepared from the vapor phase and by microcontact printing, and polymer films were spin coated. Microcontact printing created patterned SAMs in a single step. The SAMs prepared from the vapor phase and the polymer mask layers were patterned by UV lithography or a lift-off process, so that after preparation of a continuous mask layer selected areas of it were removed. On these areas the ALD film was deposited selectively. SAMs and polymer films prevented the growth in several ALD processes, such as iridium, ruthenium, platinum, TiO2 and polyimide, so that the ALD films grew only on areas without a SAM or polymer mask layer. PMMA and PVP films also protected the surface against Al2O3 and ZrO2 growth. Activation of the surface for ALD of ruthenium was achieved by preparing a RuOx layer by microcontact printing. At low temperatures the RuCp2-O2 process nucleated only on this oxidative activation layer, but not on bare silicon.
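As a back-of-the-envelope illustration of the thickness control mentioned above (the growth-per-cycle value is an assumed, typical order of magnitude, not a figure from the thesis): since each ALD cycle adds a roughly fixed increment, the film thickness follows

d = GPC × N, e.g. 0.05 nm/cycle × 400 cycles = 20 nm,

where GPC is the growth per cycle and N the number of cycles, so thickness is tuned simply by choosing the cycle count.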
Abstract:
This study highlights the formation of an artifact designed to mediate exploratory collaboration. The data for this study was collected during a Finnish adaptation of the thinking together approach. The aim of the approach is to teach pupils how to engage in an educationally beneficial form of joint discussion, namely exploratory talk. At the heart of the approach lies a set of conversational ground rules aimed at promoting the use of exploratory talk. The theoretical framework of the study is based on a sociocultural perspective on learning. A central argument in the framework is that physical and psychological tools play a crucial role in human action and learning. With the help of tools, humans can escape the direct stimuli of the outside world and learn to control themselves. During the implementation of the approach, the classroom community negotiates a set of six rules, which this study conceptualizes as an artifact that mediates exploratory collaboration. Prior research on the thinking together approach has not extensively examined the formation of the rules, which gives ample reason to conduct this study. The specific research questions asked were: What kind of negotiation trajectories did the ground rules form during the intervention? What meanings were negotiated for the ground rules during the intervention? The methodological framework of the study is based on discourse analysis, which has been specified by adapting the social construction of intertextuality to analyze the meanings negotiated for the created rules. The study has two units of analysis: the thematic episode and the negotiation trajectory. A thematic episode is a stretch of talk-in-interaction in which the participants talk about a certain ground rule or a theme relating to it. A negotiation trajectory is a chronological representation of the negotiation process of a certain ground rule during the intervention and is constructed of thematic episodes. Thematic episodes were analyzed with the adapted intertextuality analysis. A contrastive analysis was done on the trajectories. Lastly, the meanings negotiated for the created rules were compared to the guidelines provided by the approach. The main result of the study is the observation that the meanings of the created rules were more aligned with the ground rules of cumulative talk than with those of exploratory talk. Although meanings relating to exploratory talk were also negotiated, they were clearly not the dominant form. In addition, the study observed that the trajectories of the rules were non-identical. Despite connecting dimensions (symmetry, composition, continuity and explicitness), none of the trajectories shared exactly the same features as the others.
Abstract:
The transfer from aluminum to copper metallization and the decreasing feature size of integrated circuit devices generated a need for a new diffusion barrier process. Copper metallization comprised an entirely new process flow with new materials such as low-k insulators and etch stoppers, which made the diffusion barrier integration demanding. The atomic layer deposition (ALD) technique was seen as one of the most promising techniques for depositing copper diffusion barriers for future devices. The ALD technique was utilized to deposit titanium nitride, tungsten nitride, and tungsten nitride carbide diffusion barriers. Titanium nitride was deposited with a conventional process, and also with a new in situ reduction process where titanium metal was used as a reducing agent. Tungsten nitride was deposited with a well-known process from tungsten hexafluoride and ammonia, but tungsten nitride carbide, as a new material, required a new process chemistry. In addition to material properties, the process integration for copper metallization was studied by carrying out compatibility experiments on different surface materials. Based on these studies, the titanium nitride and tungsten nitride processes were found to be incompatible with copper metal. However, the tungsten nitride carbide film was compatible with copper and exhibited the most promising properties for integration into the copper metallization scheme. The process scale-up to 300 mm wafers comprised extensive film uniformity studies, which improved understanding of the non-uniformity sources of ALD growth and of the process-specific requirements for ALD reactor design. Based on these studies, it was discovered that the TiN process from titanium tetrachloride and ammonia required a perpendicular-flow reactor design for successful scale-up. The copper metallization scheme also includes the process steps of copper oxide reduction prior to the barrier deposition and copper seed deposition prior to the copper metal deposition. An easy and simple copper oxide reduction process was developed, in which the substrate was exposed to a gaseous reducing agent under vacuum and at elevated temperature. Because the reduction was observed to be efficient enough to reduce a thick copper oxide film, the process was also considered as an alternative method to make the copper seed film via copper oxide reduction.
Abstract:
What can the statistical structure of natural images teach us about the human brain? Even though the visual cortex is one of the most studied parts of the brain, surprisingly little is known about how exactly images are processed to leave us with a coherent percept of the world around us, so we can recognize a friend or drive on a crowded street without any effort. By constructing probabilistic models of natural images, the goal of this thesis is to understand the structure of the stimulus that is the raison d'être for the visual system. Following the hypothesis that the optimal processing has to be matched to the structure of that stimulus, we attempt to derive computational principles, features that the visual system should compute, and properties that cells in the visual system should have. Starting from machine learning techniques such as principal component analysis and independent component analysis, we construct a variety of statistical models to discover structure in natural images that can be linked to receptive field properties of neurons in primary visual cortex, such as simple and complex cells. We show that by representing images with phase invariant, complex cell-like units, a better statistical description of the visual environment is obtained than with linear simple cell units, and that complex cell pooling can be learned by estimating both layers of a two-layer model of natural images. We investigate how a simplified model of the processing in the retina, where adaptation and contrast normalization take place, is connected to the natural stimulus statistics. Analyzing the effect that retinal gain control has on later cortical processing, we propose a novel method to perform gain control in a data-driven way. Finally we show how models like those presented here can be extended to capture whole visual scenes rather than just small image patches. By using a Markov random field approach we can model images of arbitrary size, while still being able to estimate the model parameters from the data.
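As a concrete illustration of the phase-invariant, complex cell-like units mentioned above, here is a minimal sketch of the classical "energy model" (pooling the squared outputs of a quadrature pair of linear, simple cell-like filters); this is the textbook construction, not the thesis's learned two-layer model, and the Gabor parameters are assumptions:

```python
import numpy as np

def gabor(size, freq, theta, phase, sigma):
    """A Gabor filter: an oriented sinusoid under a Gaussian envelope."""
    y, x = np.mgrid[-(size // 2):size // 2, -(size // 2):size // 2]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * freq * xr + phase)

size = 32
# Quadrature pair: same orientation and frequency, phases 90 degrees apart.
even = gabor(size, freq=0.1, theta=0.0, phase=0.0, sigma=6.0)
odd = gabor(size, freq=0.1, theta=0.0, phase=np.pi / 2, sigma=6.0)

def complex_cell(patch):
    """Energy response: sum of squared outputs of the quadrature pair."""
    return (np.dot(patch.ravel(), even.ravel())**2
            + np.dot(patch.ravel(), odd.ravel())**2)

# Gratings of matching orientation/frequency but shifted phase: the linear
# (simple cell-like) response swings with phase, the energy response does not.
y, x = np.mgrid[0:size, 0:size]
for ph in (0.0, np.pi / 3, np.pi / 2):
    grating = np.cos(2 * np.pi * 0.1 * x + ph)
    print(f"phase={ph:.2f}  energy={complex_cell(grating):.1f}  "
          f"linear={np.dot(grating.ravel(), even.ravel()):.1f}")
```

The near-constant energy output across phases is the invariance property the abstract refers to; in the thesis this pooling is not hand-wired but learned as the second layer of a two-layer statistical model of natural images.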