883 results for applications in logistics
Abstract:
The main method of modifying the properties of semiconductors is to introduce small amounts of impurities into the material. This is used to control the magnetic and optical properties of materials and to realize p- and n-type semiconductors out of intrinsic material in order to manufacture fundamental components such as diodes. As diffusion can be described as random mixing of material due to the thermal movement of atoms, it is essential to know the diffusion behavior of the impurities in order to manufacture working components. In the modified radiotracer technique, diffusion is studied using radioactive isotopes of elements as tracers. The technique is called modified because the tracer atoms are introduced into the material by ion beam implantation. With ion implantation, a well-defined distribution of impurities can be placed beneath the sample surface with good control over the amount of implanted atoms. As the electromagnetic radiation and other nuclear decay products emitted by radioactive materials can be easily detected, only very small amounts of impurities are needed. This makes it possible to study diffusion in pure materials without essentially modifying the initial properties by doping. In this thesis, a modified radiotracer technique is used to study the diffusion of beryllium in GaN, ZnO, SiGe and glassy carbon. GaN, ZnO and SiGe are of great interest to the semiconductor industry, and beryllium, as a small and possibly fast-diffusing dopant, has not been studied previously using this technique. Glassy carbon has been added to demonstrate the feasibility of the technique. In addition, the diffusion of the magnetic impurities Mn and Co has been studied in GaAs and ZnO, respectively, with spintronic applications in mind.
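For readers unfamiliar with how implanted tracer profiles of this kind are analyzed, the following is a minimal sketch of the standard diffusion relations usually assumed in such studies (general background, not results quoted from the thesis): Fick's second law, the Arrhenius temperature dependence of the diffusion coefficient, and the broadening of an implanted Gaussian profile of initial width sigma_0 after annealing for a time t.

```latex
% Standard relations assumed when analyzing implanted radiotracer profiles
% (background sketch; D_0, E_A and sigma_0 are generic symbols, not values from the thesis).
\begin{align}
  \frac{\partial c}{\partial t} &= D\,\frac{\partial^{2} c}{\partial x^{2}}
    && \text{(Fick's second law in one dimension)} \\
  D(T) &= D_{0}\,\exp\!\left(-\frac{E_{A}}{k_{B}T}\right)
    && \text{(Arrhenius temperature dependence of the diffusion coefficient)} \\
  \sigma^{2}(t) &= \sigma_{0}^{2} + 2\,D\,t
    && \text{(broadening of a Gaussian implantation profile after annealing time } t\text{)}
\end{align}
```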
Abstract:
Bandwidth allocation for multimedia applications in the case of network congestion and failure poses technical challenges due to the bursty and delay-sensitive nature of the applications. The growth of multimedia services on the Internet and the development of agent technology have led us to investigate new techniques for resolving bandwidth issues in multimedia communications. Agent technology is emerging as a flexible and promising solution for network resource management and QoS (Quality of Service) control in a distributed environment. In this paper, we propose an adaptive bandwidth allocation scheme for multimedia applications by deploying static and mobile agents. It is a run-time allocation scheme that functions at the network nodes. The technique adaptively finds an alternate patch-up route for every congested/failed link and reallocates the bandwidth for the affected multimedia applications. The designed method has been tested (analytically and by simulation) with various network sizes and conditions. The results are presented to assess the performance and effectiveness of the approach. This work also demonstrates some of the benefits of agent-based schemes in providing flexibility, adaptability, software reusability, and maintainability. (C) 2004 Elsevier Inc. All rights reserved.
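As a rough illustration of the run-time patch-up idea described above, here is a minimal sketch in plain Python, assuming a simple graph model with per-link residual capacities; the function names and data layout (find_patchup_route, reallocate, a residual dict and a flow list) are hypothetical and are not taken from the paper, which implements the logic with static and mobile agents.

```python
# Hypothetical sketch of the run-time patch-up idea: when a link is congested or fails,
# find an alternate route with enough residual bandwidth and move the affected flows onto it.
# This is a plain-Python illustration, not the paper's static/mobile agent implementation.
from collections import deque

def find_patchup_route(residual, src, dst, demand):
    """Breadth-first search for any src->dst path whose links all have residual capacity >= demand."""
    parent = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            path, cur = [], dst
            while cur is not None:
                path.append(cur)
                cur = parent[cur]
            return list(reversed(path))
        for nbr, cap in residual.get(node, {}).items():
            if nbr not in parent and cap >= demand:
                parent[nbr] = node
                queue.append(nbr)
    return None  # no feasible patch-up route exists

def reallocate(residual, flows, failed_link):
    """Re-route every flow whose path uses the failed/congested link, if a patch-up route exists."""
    u_f, v_f = failed_link
    residual.get(u_f, {}).pop(v_f, None)  # drop the failed link from the residual graph
    for flow in flows:
        if failed_link in list(zip(flow["path"], flow["path"][1:])):
            alt = find_patchup_route(residual, flow["path"][0], flow["path"][-1], flow["bw"])
            if alt is not None:
                for u, v in zip(alt, alt[1:]):
                    residual[u][v] -= flow["bw"]  # reserve bandwidth on the new route
                flow["path"] = alt
```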
Abstract:
Let X(t) be a right-continuous, temporally homogeneous Markov process, $T_t$ the corresponding semigroup and $A$ the weak infinitesimal generator. Let $g(t)$ be absolutely continuous and $\tau$ a stopping time satisfying $E_x\left(\int_0^\tau |g(t)|\,dt\right) < \infty$ and $E_x\left(\int_0^\tau |g'(t)|\,dt\right) < \infty$. Then for $f \in \mathcal{D}(A)$ with $f(X(t))$ right continuous, the identity
$$E_x\!\left[g(\tau)f(X(\tau))\right] - g(0)f(x) = E_x\!\left(\int_0^\tau g'(s)f(X(s))\,ds\right) + E_x\!\left(\int_0^\tau g(s)\,Af(X(s))\,ds\right)$$
is a simple generalization of Dynkin's identity (the case $g(t) \equiv 1$). With further restrictions on $f$ and $\tau$, the following identity is obtained as a corollary:
$$E_x\!\left[f(X(\tau))\right] = f(x) + \sum_{k=1}^{n-1} \frac{(-1)^{k+1}}{k!}\,E_x\!\left[\tau^{k} A^{k} f(X(\tau))\right] + \frac{(-1)^{n+1}}{(n-1)!}\,E_x\!\left(\int_0^\tau u^{\,n-1} A^{n} f(X(u))\,du\right).$$
These identities are applied to processes with stationary independent increments to obtain a number of new and known results relating the moments of stopping times to the moments of the stopped processes.
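Two special cases may help fix ideas (a sketch derived from the stated identities, assuming the required integrability and domain conditions hold): taking g identically 1 recovers Dynkin's identity, and n = 2 gives the first nontrivial corollary.

```latex
% Special cases of the identities above (a sketch; the integrability and domain
% conditions of the abstract are assumed to hold).
\begin{align}
  g \equiv 1 :\quad & E_x\!\left[f(X(\tau))\right]
      = f(x) + E_x\!\left(\int_0^{\tau} A f(X(s))\,ds\right)
      && \text{(Dynkin's identity)} \\
  n = 2 :\quad & E_x\!\left[f(X(\tau))\right]
      = f(x) + E_x\!\left[\tau\,A f(X(\tau))\right]
      - E_x\!\left(\int_0^{\tau} u\,A^{2} f(X(u))\,du\right)
\end{align}
```

For a process with stationary independent increments with finite mean drift $\mu$ and $f(x) = x$ (so that $Af = \mu$ and $A^{2}f = 0$), either form reduces to the Wald-type relation $E_x\!\left[X(\tau)\right] = x + \mu\,E_x[\tau]$, an example of how the identities relate moments of stopping times to moments of the stopped process.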
Abstract:
As the virtual world grows more complex, finding a standard way of storing data becomes increasingly important. Ideally, each data item would be brought into the computer system only once. References to data items need to be cryptographically verifiable, so the data can maintain its identity while being passed around. This way there will be only one copy of the user's family photo album, while the user can use multiple tools to show or manipulate the album. Copies of a user's data could be stored on some of his family members' computers, on some of his own computers, and also at some online services that he uses. When all actors operate over one replicated copy of the data, the system automatically avoids a single point of failure. Thus the data will not disappear when one computer breaks or one service provider goes out of business. One shared copy also makes it possible to delete a piece of data from all systems at once, at the user's request. In our research we tried to find a model that would make data manageable for users and make it possible to have the same data stored at various locations. We studied three systems, Persona, Freenet, and GNUnet, that suggest different models for protecting user data. The main application areas of the systems studied include securing online social networks, providing an anonymous web, and preventing censorship in file sharing. Each of the systems studied stores user data on machines belonging to third parties. The systems differ in the measures they take to protect their users from data loss, forged information, censorship, and being monitored. All of the systems use cryptography to secure the names used for the content and to protect the data from outsiders. Based on the gained knowledge, we built a prototype platform called Peerscape, which stores user data in a synchronized, protected database. Data items themselves are protected with cryptography against forgery, but not encrypted, as the focus has been on disseminating the data directly among family and friends instead of letting third parties store the information. We turned the synchronizing database into a peer-to-peer web by revealing its contents through an integrated HTTP server. The REST-like HTTP API supports the development of applications in JavaScript. To evaluate the platform's suitability for application development we wrote some simple applications, including a public chat room, a BitTorrent site, and a flower-growing game. During our early tests we came to the conclusion that using the platform for simple applications works well. As web standards develop further, writing applications for the platform should become easier. Any system this complex will have its problems, and we are not expecting our platform to replace the existing web, but we are fairly impressed with the results and consider our work important from the perspective of managing user data.
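As a generic illustration of the "cryptographically verifiable reference" idea mentioned above (this is not the Peerscape API, whose endpoints are not reproduced here), the sketch below names a data item by the hash of its content, so any copy fetched from any replica can be checked against its reference.

```python
# Minimal illustration of cryptographically verifiable references (content addressing).
# Generic sketch only; not the Peerscape API.
import hashlib

def make_reference(data: bytes) -> str:
    """Name a data item by the SHA-256 hash of its content."""
    return "sha256:" + hashlib.sha256(data).hexdigest()

def verify(reference: str, data: bytes) -> bool:
    """Check that a copy obtained from any replica still matches its reference."""
    return make_reference(data) == reference

photo = b"...family photo bytes..."
ref = make_reference(photo)
assert verify(ref, photo)             # identical bytes anywhere keep the same identity
assert not verify(ref, photo + b"x")  # any tampering breaks verification
```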
Abstract:
We derive the heat kernel for arbitrary tensor fields on S-3 and (Euclidean) AdS(3) using a group theoretic approach. We use these results to also obtain the heat kernel on certain quotients of these spaces. In particular, we give a simple, explicit expression for the one loop determinant for a field of arbitrary spin s in thermal AdS(3). We apply this to the calculation of the one loop partition function of N = 1 supergravity on AdS(3). We find that the answer factorizes into left- and right-moving super Virasoro characters built on the SL(2, C) invariant vacuum, as argued by Maloney and Witten on general grounds.
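For orientation only, and as background from the broader literature rather than a result of this paper: the corresponding bosonic (pure gravity) one-loop partition function on thermal AdS(3) with modular parameter q is commonly written as the product below, i.e. the square of the holomorphic vacuum Virasoro character with its q^(-c/24) prefactor stripped; the paper's N = 1 answer factorizes analogously into super-Virasoro characters.

```latex
% Background expression from the pure-gravity case (not the N=1 result of the paper).
\[
  Z^{\text{1-loop}}_{\text{gravity}}(\tau,\bar\tau)
  \;=\; \prod_{n=2}^{\infty} \frac{1}{\left|1-q^{\,n}\right|^{2}},
  \qquad q = e^{2\pi i \tau}.
\]
```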
Abstract:
An examination of the data available at 22 meteorological stations in Karnataka State shows that wind velocities in the State as a whole are neither spectacularly high nor negligibly low. The highest winds (annual mean of around 13 km/hr) are experienced in parts of the northern maidan region of the State (Gulbarga, Raichur and Bidar districts) and in Bangalore. The winds are strongly seasonal: typically, the five monsoon months May-September account for about 80% of the annual wind energy flux. Although the data available are inadequate to make precise estimates, they indicate that the total wind energy potential of the State is about an order of magnitude higher than the current electrical energy consumption. The possible exploitation of wind energy for applications in rural areas therefore requires serious consideration, but it is argued that to be successful it is essential to formulate an integrated and carefully planned programme. The output of current windpumps needs to be increased; a doubling should be feasible through the design of suitable load-matching devices. The first cost has to be reduced by careful design, by the use of local materials and skills and by employing a labour-intensive technology. A consideration of the agricultural factors in the northern maidan region of the State shows that there is likely to be a strong need for mechanical assistance in supplemental and life-saving irrigation for the dry crops characteristic of the area. A technological target for a windmill that could find applications in this area would be one with a rotor diameter of about 10 m that can lift about 10,000 litres of water per hour in winds with an hourly average speed of 10 km/hr (2.8 m/s) and costs less than about Rs 10,000. Although no such windmills exist today, the authors believe that achieving this target is feasible. An examination of various possible scenarios for the use of windmills in this area suggests that with a windpump costing about Rs 12,000, a three-hectare farm growing two dry crops a year can expect an annual return of about 150% on an initial investment of about Rs 15,000. It is concluded that it should be highly worthwhile to undertake a coordinated programme for wind energy development that includes more detailed wind surveys in the northern maidan area (as well as some others, such as the Western Ghats), the development of suitable windmill designs and a study of their applications to agriculture as well as to other fields.
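A rough feasibility check of the stated technological target, under assumptions not given in the abstract (air density of about 1.2 kg/m^3 and an assumed pumping head of 5 m), compares the power in the wind passing through a 10 m rotor at 2.8 m/s with the hydraulic power needed to lift 10,000 litres per hour:

```latex
% Rough check under stated assumptions (air density 1.2 kg/m^3; an assumed pumping
% head of 5 m, which is NOT given in the abstract).
\begin{align}
  P_{\text{wind}} &= \tfrac{1}{2}\,\rho_{\text{air}}\,A\,v^{3}
     = \tfrac{1}{2}\,(1.2\,\mathrm{kg/m^{3}})\,\bigl(\pi\,(5\,\mathrm{m})^{2}\bigr)\,(2.8\,\mathrm{m/s})^{3}
     \approx 1.0\,\mathrm{kW} \\
  P_{\text{hydraulic}} &= \rho_{\text{w}}\,g\,Q\,h
     = (1000\,\mathrm{kg/m^{3}})\,(9.81\,\mathrm{m/s^{2}})
       \left(\frac{10\,\mathrm{m^{3}}}{3600\,\mathrm{s}}\right)(5\,\mathrm{m})
     \approx 0.14\,\mathrm{kW}
\end{align}
```

At the assumed head this corresponds to an overall (rotor plus pump) efficiency of roughly 13-14%, which does not seem out of reach for a well-matched windpump; for a deeper well the required efficiency scales up proportionally.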
Abstract:
Research on corporate responsibility has traditionally focused on the responsibilities of companies within their corporate boundaries only. Yet this view is challenged today as more and more companies face a situation in which the environmental and social performance of their suppliers, distributors, industry or other associated partners impacts their sales performance and brand equity. Simultaneously, policy-makers have taken up the discussion on corporate responsibility from the perspective of globalisation, in particular of global supply chains. The category of selecting and evaluating suppliers has also entered the field of environmental reporting. Companies thus need to tackle their responsibility in collaboration with different partners. The aim of the thesis is to further the understanding of collaboration and corporate environmental responsibility beyond corporate boundaries. Drawing on the fields of supply chain management and industrial ecology, the thesis sets out to investigate inter-firm collaboration on three different levels: between the company and its stakeholders, in the supply chain, and in the demand network of a company. The thesis comprises four papers: Paper A discusses the use of different research approaches in logistics and supply chain management. Paper B introduces the study on collaboration and corporate environmental responsibility from a focal company perspective, looking at the collaboration of companies with their stakeholders and the salience of these stakeholders. Paper C widens this perspective to an analysis at the supply chain level. The focus here is not only beyond corporate boundaries, but also beyond direct supplier and customer interfaces in the supply chain. Paper D then extends the analysis to the demand network level, taking into account the input-output, competitive and regulatory environments in which a company operates. The results of the study broaden the view of corporate responsibility. By applying this broader view, different types of inter-firm collaboration can be highlighted. The results also show how environmental demand is extended in the supply chain regardless of the industry background of the company.
Abstract:
We provide a survey of some of our recent results ([9], [13], [4], [6], [7]) on the analytical performance modeling of IEEE 802.11 wireless local area networks (WLANs). We first present extensions of the decoupling approach of Bianchi ([1]) to the saturation analysis of IEEE 802.11e networks with multiple traffic classes. We have found that even when analysing WLANs with unsaturated nodes, the following state-dependent service model works well: when a certain set of nodes is nonempty, their channel attempt behaviour is obtained from the corresponding fixed-point analysis of the saturated system. We present our experiences in using this approximation to model multimedia traffic over an IEEE 802.11e network using the enhanced DCF channel access (EDCA) mechanism. We have found that we can model TCP-controlled file transfers, VoIP packet telephony, and streaming video in the IEEE 802.11e setting by this simple approximation.
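As an illustration of the kind of decoupling/fixed-point analysis referred to above, the sketch below solves Bianchi's original single-class saturation fixed point by damped iteration; this is the classic single-class case, not the multi-class IEEE 802.11e analysis of the paper, and the default parameters (W = 32, m = 5) are illustrative.

```python
# Sketch of Bianchi's single-class saturation fixed point, solved by damped iteration.
# Illustrative only; the surveyed papers extend this decoupling idea to multiple 802.11e classes.
def bianchi_fixed_point(n, W=32, m=5, iters=500, damping=0.5):
    """Return (tau, p): per-slot attempt probability and conditional collision probability.

    n: number of saturated nodes, W: minimum backoff window, m: number of backoff stages.
    """
    p = 0.1  # initial guess (avoid p = 0.5, where the expression below is 0/0)
    for _ in range(iters):
        tau = 2 * (1 - 2 * p) / ((1 - 2 * p) * (W + 1) + p * W * (1 - (2 * p) ** m))
        p_new = 1 - (1 - tau) ** (n - 1)
        p = damping * p + (1 - damping) * p_new  # damping helps the iteration converge
    return tau, p

tau, p = bianchi_fixed_point(n=10)
print(f"attempt probability tau = {tau:.4f}, collision probability p = {p:.4f}")
```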
Abstract:
In this paper a theory for two-person zero-sum multicriterion differential games is presented. Various solution concepts based upon the notions of Pareto optimality (efficiency), security and equilibrium are defined. These are shown to have interesting applications in the formulation and analysis of two-target or combat differential games. The methods for obtaining outcome regions in the state space, feedback strategies for the players, and the mode of play are discussed in the framework of bicriterion zero-sum differential games. The treatment is conceptual rather than rigorous.
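To fix the notation behind these solution concepts, here is a brief sketch of the standard definitions, assuming a vector payoff J(u, v) = (J_1(u, v), ..., J_m(u, v)) that player 1 (strategy u) minimizes and player 2 (strategy v) maximizes; the symbols are illustrative and not the paper's.

```latex
% Sketch of the solution notions for a vector payoff J(u,v) = (J_1(u,v), ..., J_m(u,v)),
% minimized by player 1 and maximized by player 2 (illustrative notation only).
\begin{itemize}
  \item \emph{Efficiency (Pareto optimality).} $u^{*}$ is efficient against a fixed $v$ if there is no $u$
        with $J_i(u,v) \le J_i(u^{*},v)$ for all $i$ and $J_j(u,v) < J_j(u^{*},v)$ for some $j$.
  \item \emph{Security.} $u^{*}$ secures the level $\bar J$ for player 1 if $J_i(u^{*},v) \le \bar J_i$
        for all $i$ and every admissible $v$.
  \item \emph{Equilibrium.} $(u^{*},v^{*})$ is an equilibrium if $u^{*}$ is efficient against $v^{*}$
        (for the minimizer) and $v^{*}$ is efficient against $u^{*}$ (for the maximizer).
\end{itemize}
```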
Abstract:
A compact, high-brightness 13.56 MHz inductively coupled plasma ion source without any axial or radial multicusp magnetic fields is designed for the production of a focused ion beam. An argon ion current of density more than 30 mA/cm² at 4 kV potential is extracted from this ion source and is characterized by measuring the ion energy spread and brightness. The ion energy spread is measured by a variable-focusing retarding field energy analyzer that minimizes the errors due to the divergence of the ion beam inside the analyzer. The brightness of the ion beam is determined from the emittance measured by a fully automated, locally developed electrostatic sweep scanner. By optimizing various ion source parameters such as RF power, gas pressure and the Faraday shield, ion beams with an energy spread of less than 5 eV and a brightness of 7100 A m⁻² sr⁻¹ eV⁻¹ have been produced. Here, we briefly report the details of the ion source and the measurement and optimization of the energy spread and brightness of the ion beam. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
The design and analysis of a coplanar capacitive-fed microstrip antenna suspended above the ground plane is presented. It is demonstrated that the proposed approach can be used for designing antennas with an impedance bandwidth of about 50% and good gain, to operate in various microwave bands. The model of the antenna incorporates the capacitive feed strip, which is fed by a coaxial probe, using an equivalent circuit approach, and its predictions match simulation and experimental results. The capacitive feed strip used here is basically a rectangular microstrip capacitor formed from a truncated microstrip transmission line, with all its open ends represented by terminal or edge capacitances. An error analysis was carried out to establish the validity of the model for different design parameters. The antenna configuration can be used where unidirectional radiation patterns are required over a wide bandwidth.
Abstract:
In recent times, there has been an ever-growing need for polymer-based multifunctional materials for electronic packaging applications. In this direction, epoxy-Al2O3 nanocomposites at low filler loadings can provide an excellent material option, especially from the point of view of their dielectric properties. This paper reports the dielectric characteristics of such a system, the results of which are observed to be interesting, unique, and advantageous compared to traditionally used microcomposite systems. The nanocomposites are found to display lower values of permittivity and tan delta over a wide frequency range compared to unfilled epoxy. This surprising observation has been attributed to the interaction between the epoxy chains and the nanoparticles, and in this paper the phenomenon is analyzed using a dual-layer interface model reported for polymer nanocomposites. As for the other dielectric properties of the nanocomposites, the nano-filler loading seems to have a significant effect. The dc resistivity and ac dielectric strength of the nanocomposites were observed to be lower than those of the unfilled epoxy system at the investigated filler loadings, whereas the electrical discharge resistance showed a significant enhancement. Further analysis of the results shows that the morphology of the interface region and its characteristics determine the observed dielectric behaviors.
Abstract:
In this thesis the current status and some open problems of noncommutative quantum field theory are reviewed. The introduction aims to put these theories in their proper context as a part of the larger program to model the properties of quantized space-time. Throughout the thesis, special focus is put on the role of noncommutative time and how its nonlocal nature presents us with problems. Applications in scalar field theories as well as in gauge field theories are presented. The infinite nonlocality of space-time introduced by the noncommutative coordinate operators leads to interesting structure and new physics. High energy and low energy scales are mixed, causality and unitarity are threatened and in gauge theory the tools for model building are drastically reduced. As a case study in noncommutative gauge theory, the Dirac quantization condition of magnetic monopoles is examined with the conclusion that, at least in perturbation theory, it cannot be fulfilled in noncommutative space.
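For reference, and with conventions assumed here rather than taken from the thesis, the block below records the coordinate noncommutativity and the Moyal star product that realizes it on ordinary functions, together with the ordinary (commutative-space) Dirac quantization condition whose noncommutative fate is examined.

```latex
% Standard background conventions (assumed, not quoted from the thesis).
\begin{align}
  [\hat{x}^{\mu}, \hat{x}^{\nu}] &= i\,\theta^{\mu\nu}
    && \text{(noncommutative coordinates, constant antisymmetric } \theta^{\mu\nu}\text{)} \\
  (f \star g)(x) &= f(x)\,
      \exp\!\left(\frac{i}{2}\,\theta^{\mu\nu}\,
      \overleftarrow{\partial}_{\mu}\overrightarrow{\partial}_{\nu}\right) g(x)
    && \text{(Moyal product realizing the algebra on ordinary functions)} \\
  e\,g &= \frac{n}{2}, \quad n \in \mathbb{Z}
    && \text{(commutative-space Dirac quantization condition, } \hbar = c = 1\text{)}
\end{align}
```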