28 results for Idealized configuration


Relevance: 10.00%

Publisher:

Abstract:

Boron neutron capture therapy (BNCT) is a form of chemically targeted radiotherapy that utilises the high neutron capture cross-section of the boron-10 isotope to achieve a preferential dose increase in the tumour. BNCT dosimetry poses a special challenge because the radiation dose absorbed by the irradiated tissues consists of several different dose components. Dosimetry is important because the effect of the radiation on the tissue is correlated with the radiation dose. Consistent and reliable radiation dose delivery and dosimetry are thus basic requirements for radiotherapy. The international recommendations for radiotherapy dosimetry are not directly applicable to BNCT. The existing dosimetry guidance for BNCT provides recommendations but also calls for the investigation of complementary methods for comparison and improved accuracy. In this thesis the quality assurance and stability measurements of the neutron beam monitors used in dose delivery are presented. The beam monitors were found not to be affected by the presence of a phantom in the beam, and the effect of the reactor core power distribution was less than 1%. The weekly stability tests with activation detectors have been generally reproducible within the recommended tolerance value of 2%. An established toolkit for the determination of the dose components of epithermal neutron beams is presented and applied in an international dosimetric intercomparison. The quantities measured by the participating groups (neutron flux, fast neutron dose and photon dose) were generally in agreement within the stated uncertainties. However, the uncertainties were large, ranging from 3% to 30% (1 standard deviation), emphasising the importance of dosimetric intercomparisons if clinical data are to be compared between different centres. Measurements with the Exradin type 2M ionisation chamber have been repeated in the epithermal neutron beam in the same measurement configuration over the course of 10 years. The presented results exclude the severe thermal neutron sensitivity changes that have been reported for this type of chamber. Microdosimetry and polymer gel dosimetry are studied as complementary methods for epithermal neutron beam dosimetry. For microdosimetry, the comparison with ionisation chamber results and computer simulation showed that the photon dose measured with microdosimetry was lower than with the two other methods, although the disagreement was within the uncertainties. For the neutron dose, the simulation and microdosimetry results agreed within 10%, while the ionisation chamber technique gave 10-30% lower neutron dose rates than the two other methods. The response of the BANG-3 gel was found to be linear for both photon and epithermal neutron beam irradiation. The dose distribution normalised to the dose maximum measured with the MAGIC polymer gel agreed well with the simulated result near the dose maximum, while the spatial difference between the measured and simulated 30% isodose lines was more than 1 cm. In both the BANG-3 and MAGIC gel studies, the interpretation of the results was complicated by the presence of high-LET radiation.
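
For orientation, the total weighted dose in BNCT is commonly expressed as a sum of the separate dose components, each scaled by its own weighting factor; the expression below is a standard textbook formulation rather than the specific protocol used in this thesis:

\[ D_w = w_B D_B + w_N D_N + w_{\mathrm{fast}} D_{\mathrm{fast}} + w_\gamma D_\gamma \]

Here \(D_B\) is the boron dose from the \(^{10}\mathrm{B}(n,\alpha)^{7}\mathrm{Li}\) reaction, \(D_N\) the thermal-neutron dose from \(^{14}\mathrm{N}(n,p)^{14}\mathrm{C}\), \(D_{\mathrm{fast}}\) the fast-neutron dose and \(D_\gamma\) the photon dose; the weighting factors \(w\) account for the different biological effectiveness of each component.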

Relevance: 10.00%

Publisher:

Abstract:

Thin film applications have become increasingly important in our search for multifunctional and economically viable technological solutions of the future. Thin film coatings can be used for a multitude of purposes, ranging from a basic enhancement of aesthetic attributes to the addition of a complex surface functionality. Anything from electronic or optical properties to an increased catalytic or biological activity can be added or enhanced by the deposition of a thin film, with a thickness of only a few atomic layers at best, on an already existing surface. Thin films offer both a means of saving materials and the possibility of improving properties without a critical enlargement of devices. Nanocluster deposition is a promising new method for the growth of structured thin films. Nanoclusters are small aggregates of atoms or molecules, ranging in size from only a few nanometers up to several hundred nanometers in diameter. Due to their large surface-to-volume ratio, and the confinement of atoms and electrons in all three dimensions, nanoclusters exhibit a wide variety of exotic properties that differ notably from those of both single atoms and bulk materials. Nanoclusters are a completely new type of building block for thin film deposition. As preformed entities, clusters provide a new means of tailoring the properties of thin films before their growth, simply by changing the size or composition of the clusters that are to be deposited. In contrast to contemporary methods of thin film growth, which mainly rely on the deposition of single atoms, cluster deposition also allows for a more precise assembly of thin films, as the configuration of single atoms with respect to each other is already predetermined in clusters. Nanocluster deposition offers a possibility for coating virtually any material with a nanostructured thin film, and thereby enhancing already existing physical or chemical properties, or adding some exciting new feature. A clearer understanding of cluster-surface interactions, and of the growth of thin films by cluster deposition, must, however, be achieved if clusters are to be successfully used in thin film technologies. Using a combination of experimental techniques and molecular dynamics simulations, both the deposition of nanoclusters and the growth and modification of cluster-assembled thin films are studied in this thesis. Emphasis is placed on understanding the interaction between metal clusters and surfaces, and thereby the behaviour of these clusters during deposition and thin film growth. The behaviour of single metal clusters as they impact on clean metal surfaces is analysed in detail, and it is shown that there exists a cluster-size- and deposition-energy-dependent limit below which epitaxial alignment occurs. If larger clusters are deposited at low energies, or cluster-surface interactions are weaker, non-epitaxial deposition will take place, resulting in the formation of nanocrystalline structures. The effect of cluster size and deposition energy on the morphology of cluster-assembled thin films is also determined, and it is shown that nanocrystalline cluster-assembled films will be porous. Modification of these thin films, with the purpose of enhancing their mechanical properties and durability without destroying their nanostructure, is presented. Irradiation with heavy ions is introduced as a feasible method for increasing the density, and thereby the mechanical stability, of cluster-assembled thin films without critically destroying their nanocrystalline properties. The results of this thesis demonstrate that nanocluster deposition is a suitable technique for the growth of nanostructured thin films. The interactions between nanoclusters and their supporting surfaces must, however, be carefully considered if a controlled growth of cluster-assembled thin films with precisely tailored properties is to be achieved.

Relevance: 10.00%

Publisher:

Abstract:

The ever-increasing demand for faster computers in various areas, ranging from entertainment electronics to computational science, is pushing the semiconductor industry towards its limits on decreasing the sizes of electronic devices based on conventional materials. According to the famous law by Gordon E. Moore, a co-founder of the world's largest semiconductor company Intel, transistor sizes should decrease to the atomic level during the next few decades to maintain the present rate of increase in computational power. As leakage currents become a problem for traditional silicon-based devices already at sizes in the nanometer scale, an approach other than further miniaturization is needed to meet the needs of future electronics. A relatively recently proposed possibility for further progress in electronics is to replace silicon with carbon, another element from the same group in the periodic table. Carbon is an especially interesting material for nanometer-sized devices because it naturally forms different nanostructures. Furthermore, some of these structures have unique properties. The most widely suggested allotrope of carbon to be used for electronics is a tubular molecule having an atomic structure resembling that of graphite. These carbon nanotubes are popular both among scientists and in industry because of a long list of exciting properties. For example, carbon nanotubes are electronically unique and have an uncommonly high strength-to-mass ratio, which has resulted in a multitude of proposed applications in several fields. In fact, due to some remaining difficulties regarding large-scale production of nanotube-based electronic devices, fields other than electronics have been faster to develop profitable nanotube applications. In this thesis, the possibility of using low-energy ion irradiation to ease the route towards nanotube applications is studied through atomistic simulations on different levels of theory. Specifically, molecular dynamics simulations with analytical interaction models are used to follow the irradiation process of nanotubes in order to introduce different impurity atoms into these structures and thereby gain control over their electronic character. Ion irradiation is shown to be a very efficient method for replacing carbon atoms with boron or nitrogen impurities in single-walled nanotubes. Furthermore, potassium irradiation of multi-walled and fullerene-filled nanotubes is demonstrated to result in small potassium clusters in the hollow parts of these structures. Molecular dynamics simulations are further used to give an example of using irradiation to improve contacts between a nanotube and a silicon substrate. Methods based on density-functional theory are used to gain insight into the defect structures inevitably created during the irradiation. Finally, a new simulation code utilizing the kinetic Monte Carlo method is introduced to follow the time evolution of irradiation-induced defects in carbon nanotubes on macroscopic time scales. Overall, the molecular dynamics simulations presented in this thesis show that ion irradiation is a promising method for tailoring nanotube properties in a controlled manner. The calculations made with density-functional-theory-based methods indicate that it is energetically favourable for even relatively large defects to transform so as to keep the atomic configuration as close to that of the pristine nanotube as possible. The kinetic Monte Carlo studies reveal that elevated temperatures during processing enhance the self-healing of nanotubes significantly, ensuring low defect concentrations after the treatment with energetic ions. Thereby, nanotubes can retain their desired properties even after the irradiation. Throughout the thesis, atomistic simulations combining different levels of theory are demonstrated to be an important tool for determining the optimal conditions for irradiation experiments, because the atomic-scale processes at short time scales are extremely difficult to study by any other means.
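
As a rough illustration of the kinetic Monte Carlo method referred to above (a generic sketch, not the simulation code developed in the thesis), the snippet below implements the standard residence-time algorithm: one event is chosen with probability proportional to its rate, and the simulation clock advances by an exponentially distributed waiting time. The example rates are hypothetical.

```python
import math
import random

def kmc_step(rates, rng=random):
    """One step of the residence-time kinetic Monte Carlo algorithm.

    rates -- list of non-negative event rates (e.g. hypothetical defect
             migration, recombination or annealing rates, in 1/s).
    Returns (chosen_event_index, time_increment_in_seconds).
    """
    total = sum(rates)
    if total <= 0.0:
        raise ValueError("no active events")
    # Pick an event with probability proportional to its rate.
    threshold = rng.random() * total
    cumulative = 0.0
    chosen = len(rates) - 1  # fallback guards against floating-point round-off
    for i, rate in enumerate(rates):
        cumulative += rate
        if cumulative >= threshold:
            chosen = i
            break
    # Advance the simulation clock by an exponentially distributed waiting time.
    dt = -math.log(1.0 - rng.random()) / total
    return chosen, dt

# Toy usage: three hypothetical defect processes with very different rates.
rates = [1.0e6, 2.5e5, 4.0e3]
t = 0.0
for _ in range(5):
    event, dt = kmc_step(rates)
    t += dt
    print(f"event {event} occurred, clock now at t = {t:.3e} s")
```

Because the clock advances by the (typically short) expected waiting time of the fastest processes, this scheme can bridge from atomic-scale event rates to macroscopic simulated times, which is the role attributed to the kinetic Monte Carlo code in the abstract.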

Relevance: 10.00%

Publisher:

Abstract:

This study analyses personal relationships, linking research to sociological theory on the questions of the social bond and the self as social. From the viewpoint of disruptive life events and experiences, such as loss, divorce and illness, it aims at understanding how selves are bound to their significant others as those specific people ‘close or otherwise important’ to them. Who forms the configurations of significant others? How do different bonds respond to disruptions, and how do relational processes unfold? How is the embeddedness of selves manifested in the processes of bonding, on the one hand, and in the relational formation of the self, on the other? The bonds are analysed from an anti-categorical viewpoint based on personal citations of significance, as opposed to given relationship categories such as ‘family’ or ‘friendship’ – the two kinds of relationships that in fact are most frequently significant. The study draws on analysis of the personal narratives of 37 Finnish women and men (in all 80 interviews) and their entire configurations of those specific people whom they cite as ‘close or otherwise important’. The analysis stresses subjective experiences, while also investigating the actualized relational processes and configurations of all personal relationships, with certain relationship histories embedded in micro-level structures. The research is based on four empirical sub-studies of personal relationships and a summary discussing the questions of the self and the social bond. The discussion draws on G. H. Mead, C. Cooley, N. Elias, T. Scheff, G. Simmel and the contributors to ‘relational sociology’. The sub-studies analyse bonds to others from the viewpoints of biographical disruption and the re-configuration of significant others, estranged family bonds, peer support, and the formation of the most intimate relationships into exclusive and inclusive configurations. All analyses examine the dialectics of the social and the personal, asking how different structuring mechanisms and personal experiences and negotiations together contribute to the unfolding of the bonds. The summary elaborates personal relationships as social bonds embedded in wider webs of interdependent people and social settings that are laden with cultural expectations. Regarding the question of the relational self, the study proposes both bonding and individuality as significant. They are seen as interdependent phases of the relationality of the self. Bonding anchors the self to its significant relationships, in which individuality is manifested, for example, in contrasting and differentiating dynamics, but also in active attempts to connect with others. Individuality is not a fixed quality of the self, but a fluid and interdependent phase of the relational self. More specifically, it appears in three forms in the flux of relational processes: as a sense of unique self (via the cultivation of subjective experiences), as agency, and as (a search for) relative autonomy. The study includes an epilogue addressing the ambivalence between the social expectation of individuality in society and the bonded reality of selves.

Relevance: 10.00%

Publisher:

Abstract:

The research focuses on the client plan in the field of health care and social work with families with children. The purpose of the plan is to create objectives for helping the client and to assist in coordinating the ever-increasing multi-professional work. In general, the plan is understood in terms of assignments and as a contract specifying what to do in client cases. Taking this into consideration, the plan is outsourced into a written document. Instead of understanding the plan as a tool that stabilizes the objectives of action, documents it and facilitates evaluation, the client plan is conceptualized in this study as a practice. This kind of practice mediates client work while being itself also a process of action that focuses on an object whose gradual emergence and definition is the central question in multi-professional collaboration with a client. The plan is examined empirically in a non-stabilized state, which leads to a research methodology based on the dynamics between stabilization and emerging, non-stabilized entities – the co-creation and formulation of practice and context. The theoretical approach of the research is the micro-analytic approach of activity theory (Engeström R. 1999b). Building on this, the research develops a method of qualitative analysis which follows an emerging object with multiple voices. The research data are composed of videotaped client meetings with three families, interviews with the clients and the workers, as well as client documents that are used to follow up on the client processes for at least one year. The research questions are as follows: 1) How is the client plan constructed between the client and different professional agents? 2) How are meanings constructed in a client-centred plan? 3) What are the elements of client-employee relationships that support the co-configuration necessitated by the changes in the client's everyday life? The study shows that the setting of objectives was limited by the palette of institutional services, which meant that the clients' interpretations and acts of giving meaning to the kinds of help required were left out of the plan. Conceptually, the distinctions between client-centred and client-specific ways of working, as well as an action-based working method, are addressed. Central to this action-based approach is construing the everyday life of the client, recognizing different meanings and analyzing them together with the client, as well as focusing attention on developing the prerequisites for the social agency of the clients. The research portrays the elements for creating an action-based client plan. Keywords: client plan, user perspective, multi-voiced meaning, multi-professional social work with children and families, agency

Relevance: 10.00%

Publisher:

Abstract:

"Does the community really count? – identity process and social capital as elements in surviving in insecurity and uncertainty" is a combination of five articles. The aim of this study is to answer the question: how, or in which ways, is it possible to find the role of identity process and social capital in surviving in insecurity and uncertainty? In the introduction the concepts of community and social capital are examined. I then study the articles and try to find out what kinds of elements of identity process and social capital can be found in them for surviving societal change. The study consists of the introduction and the articles. The articles are: 1. "Is Becoming a Researcher Some Kind of Role-playing" – Roles of the Researcher in the Process of Forming the Identity; 2. What Composes Collective Identity in the Polytechnic Community?; 3. Opportunities to Succeed or Fear of Failure? – Entrepreneurship from the Youngsters' Point of View; 4. Learning Risk-taking Competences; 5. "Bricolage", or Just Putting Things Together? The starting point for the study is the feeling of insecurity that surrounds a person living in present-day society: you cannot be sure with whom you are going to co-operate tomorrow. In the "Good Old Days" the harmonious communities "protected" their members and worked strongly towards common aims. Nowadays, partly because of urbanisation, we are so busy that we only have time to take care of ourselves, or rather: just of myself. As Bauman (2001) puts it, people turn to communities in which they feel at home. They still long for communality. For Mead (1962) the group and communality play a big role: a person needs others to become the whole "Self". By acting with others a person can gain much more than by working alone (Field 2003). But, as Day (2006) puts it, the reality of community as discovered by empirical research is a great deal messier than the abstract and idealized versions used by theorists. Keywords: uncertainty, insecurity, communality, identity process, social capital, significant groups, survival.

Relevance: 10.00%

Publisher:

Abstract:

A key trait of Free and Open Source Software (FOSS) development is its distributed nature. Nevertheless, two project-level operations, the fork and the merge of program code, are among the least well understood events in the lifespan of a FOSS project. Some projects have explicitly adopted these operations as the primary means of concurrent development. In this study, we examine the effect of highly distributed software development, as found in the Linux kernel project, on the collection and modelling of software development data. We find that distributed development calls for sophisticated temporal modelling techniques where several versions of the source code tree can exist at once. Attention must be turned towards the methods of quality assurance and peer review that projects employ to manage these parallel source trees. Our analysis indicates that two new metrics, fork rate and merge rate, could be useful for determining the role of distributed version control systems in FOSS projects. The study presents a preliminary data set consisting of version control and mailing list data.
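
The abstract names fork rate and merge rate as metrics but does not define them; purely as a hypothetical illustration, one plausible operationalisation is to count branch-creation and merge events per unit of time in the version control history, as sketched below. The record schema ('date', 'parents', 'new_branch') and the per-day normalisation are invented for this example.

```python
from datetime import datetime

def events_per_day(timestamps):
    """Number of events per day over the period the timestamps span."""
    if not timestamps:
        return 0.0
    span_days = (max(timestamps) - min(timestamps)).total_seconds() / 86400.0
    return len(timestamps) / max(span_days, 1.0)  # avoid division by zero

def fork_and_merge_rates(commits):
    """Estimate fork and merge rates from a list of commit records.

    Each record is a dict with 'date' (datetime), 'parents' (int) and
    'new_branch' (bool) -- an invented schema used here for illustration.
    A commit with more than one parent is counted as a merge; a commit
    flagged as starting a new branch is counted as a fork.
    """
    fork_dates = [c["date"] for c in commits if c.get("new_branch")]
    merge_dates = [c["date"] for c in commits if c["parents"] > 1]
    return events_per_day(fork_dates), events_per_day(merge_dates)

# Toy data standing in for parsed version control history.
history = [
    {"date": datetime(2009, 1, 1), "parents": 1, "new_branch": True},
    {"date": datetime(2009, 1, 5), "parents": 1, "new_branch": False},
    {"date": datetime(2009, 1, 9), "parents": 2, "new_branch": False},
    {"date": datetime(2009, 1, 20), "parents": 2, "new_branch": False},
]
fork_rate, merge_rate = fork_and_merge_rates(history)
print(f"fork rate: {fork_rate:.3f}/day, merge rate: {merge_rate:.3f}/day")
```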


Relevance: 10.00%

Publisher:

Abstract:

Nucleation is the first step in a phase transition where small nuclei of the new phase start appearing in the metastable old phase, such as the appearance of small liquid clusters in a supersaturated vapor. Nucleation is important in various industrial and natural processes, including atmospheric new particle formation: between 20% and 80% of the atmospheric particle concentration is due to nucleation. These atmospheric aerosol particles have a significant effect on both climate and human health. Different simulation methods are often applied when studying things that are difficult or even impossible to measure, or when trying to distinguish between the merits of various theoretical approaches. Such simulation methods include, among others, molecular dynamics and Monte Carlo simulations. In this work molecular dynamics simulations of the homogeneous nucleation of Lennard-Jones argon have been performed. Homogeneous means that the nucleation does not occur on a pre-existing surface. The simulations include runs where the starting configuration is a supersaturated vapor and the nucleation event is observed during the simulation (direct simulations), as well as simulations of a cluster in equilibrium with a surrounding vapor (indirect simulations). The latter type is a necessity when the conditions prevent the occurrence of a nucleation event within a reasonable timeframe in the direct simulations. The effect of various temperature control schemes on the nucleation rate (the rate of appearance of clusters that are equally likely either to grow to macroscopic sizes or to evaporate) was studied and found to be relatively small. The method used to extract the nucleation rate was also found to be of minor importance. The cluster sizes from the direct and indirect simulations were used in conjunction with the nucleation theorem to calculate formation free energies for the clusters in the indirect simulations. The results agreed with density functional theory, but were higher than values from Monte Carlo simulations. The formation energies were also used to calculate the surface tension of the clusters. The sizes of the clusters in the direct and indirect simulations were compared, showing that the direct-simulation clusters have more atoms between the liquid-like core of the cluster and the surrounding vapor. Finally, the performance of various nucleation theories in predicting the simulated nucleation rates was investigated, and the results, among other things, highlighted once again the inadequacy of the classical nucleation theory that is commonly employed in nucleation studies.
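
For context, the classical nucleation theory mentioned above predicts the nucleation rate from the formation free energy of the critical cluster, and the nucleation theorem relates the critical cluster size to the measured dependence of the rate on supersaturation. Schematically, in their standard textbook forms (not necessarily the exact expressions applied in this work):

\[ J = J_0 \exp\!\left(-\frac{\Delta G^{*}}{k_B T}\right), \qquad \left(\frac{\partial \ln J}{\partial \ln S}\right)_{T} \approx n^{*} + 1 \]

where \(J\) is the nucleation rate, \(\Delta G^{*}\) the formation free energy of the critical cluster, \(S\) the supersaturation and \(n^{*}\) the number of molecules in the critical cluster.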

Relevance: 10.00%

Publisher:

Abstract:

The productivity of a process is related to how effectively input resources are transformed into value for customers. For the needs of manufacturers of physical products there are widely used productivity concepts and measurement instruments. However, in service processes the underlying assumptions of these concepts and models do not hold. For example, manufacturing-based productivity models assume that an altered configuration of input resources in the production process does not lead to quality changes in the outputs (the constant-quality assumption). However, in a service context changes in the production resources and production systems do affect the perceived quality of services. Therefore, using manufacturing-oriented productivity models in service contexts is likely to give managers wrong directions for action. Research into the productivity of services is still scarce because of the lack of viable models. The purpose of the present article is to analyse the requirements for the development of a productivity concept for service operations. Based on the analysis, a service productivity model is developed. According to this model, service productivity is a function of 1) how effectively input resources into the service (production) process are transformed into outputs in the form of services (internal or cost efficiency), 2) how well the quality of the service process and its outcome is perceived (external or revenue efficiency), and 3) how effectively the capacity of the service process is utilised (capacity efficiency). In addition, directions for developing measurement models for service productivity are discussed.
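
Schematically, and only as a compact restatement of the model described above (the article does not give a closed-form expression), service productivity can be written as

\[ P_{\mathrm{service}} = f(\text{internal (cost) efficiency},\ \text{external (revenue) efficiency},\ \text{capacity efficiency}) \]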

Relevance: 10.00%

Publisher:

Abstract:

Views on industrial service have conceptually progressed from the output of the provider's production process to the result of an interaction process in which the customer is also involved. Although there are attempts to be customer-oriented, especially when the focus is on solutions, an industrial company's offering combining goods and services is inherently seller-oriented. There is, however, a need to go beyond the current literature and company practices. We propose that what is needed is a genuinely customer-based parallel concept to the offering, one that takes the customer's view, and we put forward a new concept labelled the customer needing. A needing is based on the customer's mental model of their business and strategies, which will affect priorities, decisions, and actions. A needing can be modelled as a configuration of three dimensions containing six functions that create realised value for the customer. These dimensions and functions can be used to describe needings, which represent starting points for sellers' creation of successful offerings. When offerings match needings over time, the seller should have the potential to form and sustain successful buyer relationships.

Relevance: 10.00%

Publisher:

Abstract:

There is an urgent interest in marketing to move away from neo-classical value definitions suggesting that value creation is a process of exchanging goods for money. In the present paper, value creation is conceptualized as an integration of two distinct, yet closely coupled processes. First, actors co-create what this paper calls an underlying basis of value. This is done by interactively re-configuring resources. By relating and combining resources, activity sets, and risks across actor boundaries in novel ways, actors create joint productivity gains – a concept very similar to density (Normann, 2001). Second, actors engage in a process of signification and evaluation. Signification implies co-constructing the meaning and worth of the joint productivity gains co-created through interactive resource re-configuration, as well as sharing those gains through a pricing mechanism as value to the involved actors. The conceptual framework highlights an all-important dynamic associated with 'value creation' and 'value' – a dynamic the paper claims has eluded past marketing research. The paper argues that the framework presented here is appropriate for the interactive service perspective, where value and value creation are not objectively given, but depend on the power of the involved actors' socially constructed frames to mobilize resources across actor boundaries in ways that 'enhance system well-being' (Vargo et al., 2008). The paper contributes to research on Service Logic, Service-Dominant Logic, and Service Science.

Relevance: 10.00%

Publisher:

Abstract:

Transposed to media like film, drama, opera, music, and the visual arts, “narrative” is no longer characterized by either temporality or an act of telling, both required by earlier narratological theories. Transposed to other disciplines, “narrative” is often a substitute for “assumption”, “hypothesis”, a disguised ideological stance, a cognitive scheme, and even life itself. The potential for broadening the concept lay dormant in narratology, both in the double use of “narrative” for the medium-free fabula and for the medium-bound sjuzet, and in changing interpretations of “event”. Some advantages of the broad use of “narrative” are an evocation of commonalities among media and disciplines, an invitation to re-think the term within the originating discipline, a constructivist challenge to positivistic and foundational views, an emphasis on a plurality of competing “truths”, and an empowerment of minority voices. Conversely, disadvantages of the broad use are an illusion of sameness whenever the term is used and the obliteration of specificity. In a Wittgensteinian spirit, the essay agrees that concepts of narrative are mutually related by “family resemblance”, but wishes to probe the resemblances further. It thus postulates two necessary features: double temporality and a transmitting (or mediating) agency, and an additional cluster of variable optional characteristics. When the necessary features are not dominant, the configuration may have “narrative elements” but is not “a narrative”.