Abstract:
The capacity region of the two-user Gaussian Broadcast Channel (GBC) is well known, the optimal input being Gaussian. In this paper we explore the capacity region of the GBC when the users' symbols are drawn from finite complex alphabets (such as M-QAM and M-PSK). When both users employ the same alphabet, we show that rotating one of the alphabets enlarges the capacity region, and we arrive at an optimal angle of rotation by simulation. The effect of rotation on the capacity region at different SNRs is also studied through simulations. Using the Fading Broadcast Channel (FBC) setup of [Li and Goldsmith, 2001], we study the ergodic capacity region with inputs from finite complex alphabets. When the optimum power allocation procedure derived in [Li and Goldsmith, 2001] for Gaussian inputs is used to allocate power to symbols from finite complex alphabets, relative rotation between the alphabets does not improve the capacity region. Simulation results for a modified heuristic power allocation procedure for the finite-constellation case show that the Constellation Constrained capacity region does enlarge with rotation.
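As a hedged illustration of the kind of computation involved (the QPSK choice, powers, noise level, and sample count below are assumptions of the sketch, not the paper's settings), the constellation-constrained mutual information of the superposed, rotated alphabet can be estimated by Monte Carlo as I(X;Y) = h(Y) - h(N):

```python
import numpy as np

def cc_mutual_information(theta, p1=1.0, p2=1.0, noise_var=0.5, n=100_000):
    """I(X; Y) in bits/channel use for X uniform over the superposed alphabet."""
    rng = np.random.default_rng(0)
    qpsk = np.exp(1j * (np.pi / 4 + np.pi / 2 * np.arange(4)))   # unit-power QPSK
    # Sum constellation seen by a receiver under superposition coding,
    # with user 2's alphabet rotated by theta.
    alphabet = (np.sqrt(p1) * qpsk[:, None]
                + np.sqrt(p2) * np.exp(1j * theta) * qpsk[None, :]).ravel()
    x = alphabet[rng.integers(0, alphabet.size, n)]
    noise = np.sqrt(noise_var / 2) * (rng.standard_normal(n)
                                      + 1j * rng.standard_normal(n))
    y = x + noise
    # h(Y) ~= -E[log2 f(Y)], f being a uniform Gaussian mixture over the alphabet.
    f = np.mean(np.exp(-np.abs(y[:, None] - alphabet[None, :])**2 / noise_var),
                axis=1) / (np.pi * noise_var)
    return -np.mean(np.log2(f)) - np.log2(np.pi * np.e * noise_var)

for deg in (0, 15, 30, 45):   # crude sweep for a good rotation angle
    print(f"{deg:2d} deg: {cc_mutual_information(np.deg2rad(deg)):.3f} bits")
```

At theta = 0 the two equal-power QPSK alphabets superpose onto overlapping points, which is precisely why a relative rotation can enlarge the achievable rates.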
Abstract:
The new paradigm of connectedness and empowerment brought by the interactivity of Web 2.0 has been challenging the traditional centralized performance of mainstream media. The corporation has been able to survive these strong winds by transforming itself into a global multimedia business network embedded in the network society. By establishing networks, e.g. networks of production and distribution, the global multimedia business network has been able to identify potential solutions by opening the doors to innovation in a decentralized and flexible manner. Under this emerging context of re-organization, traditional practices like sourcing need to be re-explained, and that is precisely what this thesis attempts to tackle. Drawing on ICT and network society theory, the study seeks to explain, within the Finnish context, the particular case of Helsingin Sanomat (HS) and its relations with the youth news agency Youth Voice Editorial Board (NÄT). In that sense, the study can be regarded as an explanatory embedded single case study, where HS is the principal unit of analysis and NÄT its embedded unit of analysis. The thesis reached its explanations through interrelated steps. First, it determined the role of ICT in HS's sourcing practices. It then mapped an overview of HS's sourcing relations, providing the context in which NÄT is located. Finally, it established conceptualized institutional relational data between HS and NÄT for subsequent measurement through social network analysis. The data set was collected via qualitative interviews with online and offline editors of HS as well as with NÄT's personnel. The study concluded that ICT's interactivity and User Generated Content (UGC) are not sourcing tools as such but mechanisms used by HS for getting ideas that could turn into potential news stories. When it comes to visual communication, however, some exceptions were found: the lack of official sources amidst the demand for immediacy leads HS to rely on ICT's interaction and UGC. ICT's input into the sourcing practice becomes more noticeable when the interaction and UGC are well organized and coordinated into proper and innovative networks of alternative content collaboration. Currently, HS performs this sourcing practice via two projects that differ precisely in how they are coordinated. The first project, Omakaupunki, is coordinated internally by the Sanoma Group-owned media houses HS, Vartti and Metro. The second project is coordinated externally. The external alternative sourcing network, as it was labelled, consists of three actors, namely HS, NÄT (the professionals in charge) and the youth. This network is a balanced and complete triad in which the actors are connected by relations of feedback, recognition, creativity and filtering. However, as innovation is approached very reluctantly, this content collaboration remains a laboratory of experiments: a 'COLLABORATORY'.
Abstract:
The world of mapping has changed. Earlier, only professional experts were responsible for map production, but today ordinary people without any training or experience can become map-makers. The number of online mapping sites and of volunteer mappers has increased significantly. The development of technology, such as satellite navigation systems, Web 2.0, broadband Internet connections, and smartphones, has played a key role in enabling the rise of volunteered geographic information (VGI). As opening governmental data to the public is a current topic in many countries, the opening of high-quality geographical data has a central role in this study. The aims of this study are to investigate the quality of spatial data produced by volunteers by comparing it with map data produced by public authorities, to follow what occurs when spatial data are opened to users, and to become acquainted with the user profile of these volunteer mappers. A central part of this study is the OpenStreetMap project (OSM), whose aim is to create a map of the entire world through volunteer effort. Anyone can become an OpenStreetMap contributor, and the data created by the volunteers are free for anyone to use, without copyright restrictions or licence charges. In this study OpenStreetMap is investigated from two viewpoints. In the first part, the aim was to investigate the quality of volunteered geographic information. A pilot project was implemented by following what occurs when high-resolution aerial imagery is released freely to OpenStreetMap contributors. The quality of VGI was investigated by comparing OSM datasets with the map data of the National Land Survey of Finland (NLS): the positional accuracy and completeness of the road datasets were inspected, as were the differences in attribute data between the datasets. The OSM community was also analysed, and the development of the OpenStreetMap map data was followed through visual analysis. The aim of the second part was to analyse the user profile of OpenStreetMap contributors and to investigate how contributors act when collecting data and editing OpenStreetMap, what motivates them to map, and how they perceive the quality of volunteered geographic information. This part was implemented by conducting a web survey of OpenStreetMap contributors. The results show that the quality of OpenStreetMap data, compared with the data of the National Land Survey of Finland, can be described as good. OpenStreetMap differs from the NLS map especially in its degree of uncertainty, for example because the completeness and uniformity of the map are not known. The results also reveal that opening spatial data notably increased the amount of data in the study area, and that both positional accuracy and completeness improved significantly. The study confirms earlier findings that a small minority of contributors have created the majority of the data in OpenStreetMap. The survey of OpenStreetMap users revealed that data are most often collected on foot or by bicycle using a GPS device, or by editing the map against aerial imagery.
According to the responses, users take part in the OpenStreetMap project because they want to make maps better and to produce maps containing up-to-date information that cannot be found on any other map. Almost all users make use of the maps themselves, the most popular methods being downloading the map into a navigator or a mobile device. Users regard the quality of OpenStreetMap as good, especially because of the map's up-to-dateness and accuracy.
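The abstract does not spell out how positional accuracy was measured; a common choice for road datasets is the buffer-overlap method, sketched below with toy geometries (shapely and all coordinates are assumptions of this sketch, and a real comparison would load both datasets, e.g. with geopandas, in projected metre coordinates):

```python
# Buffer-overlap comparison of a tested road line against a reference line.
from shapely.geometry import LineString

def within_buffer_share(tested: LineString, reference: LineString, tol_m: float) -> float:
    """Fraction of the tested line's length lying within tol_m of the reference."""
    return tested.intersection(reference.buffer(tol_m)).length / tested.length

osm_road = LineString([(0, 1.5), (100, 2.0), (200, -8.0)])   # toy OSM road
nls_road = LineString([(0, 0.0), (100, 0.0), (200, 0.0)])    # toy reference road
print(f"share within 5 m buffer: {within_buffer_share(osm_road, nls_road, 5.0):.2%}")
```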
Abstract:
Even research models of helicopter dynamics often lead to a large number of equations of motion with periodic coefficients, and Floquet theory is a widely used mathematical tool for their dynamic analysis. At present, three approaches are used to generate the equations of motion: (1) general-purpose symbolic processors such as REDUCE and MACSYMA, (2) a special-purpose symbolic processor, DEHIM (Dynamic Equations for Helicopter Interpretive Models), and (3) completely numerical approaches. In this paper, comparative aspects of the first two, purely algebraic, approaches are studied by applying REDUCE and DEHIM to the same set of problems, ranging from a linear model with one degree of freedom to a mildly non-linear multi-bladed rotor model with several degrees of freedom. Computational issues in applying Floquet theory are also studied, namely (1) the equilibrium solution for the periodic forced response, together with the transition matrix for perturbations about that response, and (2) a small number of eigenvalues and eigenvectors of the unsymmetric transition matrix. The study showed the following: (1) compared to REDUCE, DEHIM is far more portable and economical, but it is also less user-friendly, particularly during the learning phases; (2) the problems of finding the periodic response and the eigenvalues are well conditioned.
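As a self-contained illustration of the Floquet machinery referred to above (the damped Mathieu-type system is a stand-in chosen for brevity, not one of the paper's rotor models), the transition (monodromy) matrix over one period and its eigenvalues, the Floquet multipliers, can be computed numerically:

```python
import numpy as np
from scipy.integrate import solve_ivp

T = 2 * np.pi                       # period of the coefficients

def a_matrix(t):
    # x'' + c x' + (delta + eps*cos t) x = 0, written in first-order form.
    c, delta, eps = 0.1, 1.0, 0.4
    return np.array([[0.0, 1.0],
                     [-(delta + eps * np.cos(t)), -c]])

def rhs(t, phi_flat):
    phi = phi_flat.reshape(2, 2)    # fundamental matrix, integrated column-wise
    return (a_matrix(t) @ phi).ravel()

sol = solve_ivp(rhs, (0.0, T), np.eye(2).ravel(), rtol=1e-10, atol=1e-12)
monodromy = sol.y[:, -1].reshape(2, 2)          # unsymmetric transition matrix
multipliers = np.linalg.eigvals(monodromy)      # Floquet multipliers
print("Floquet multipliers:", multipliers)
print("stable:", np.all(np.abs(multipliers) < 1.0))
```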
Abstract:
Payment systems all over the world have grown into a complicated web of solutions, and this is even more challenging in the case of mobile-based payment systems. Mobile-based payment systems are many, and consist of different technologies providing different services. The diffusion of these various technologies in a market is uncertain. Diffusion theorists, for example Rogers and Davis, suggest how innovations come to be accepted in markets; in the case of electronic payment systems, the tale of Mondex vs. Octopus offers interesting insights into diffusion. Our paper attempts to understand the success potential of various mobile payment technologies. We illustrate what we describe as technology breadth in mobile payment systems using data from payment systems all over the world (n=62). Our data show an unexpected superiority of SMS technology over other technologies such as NFC and WAP. We also used a Delphi-based survey of experts (n=5) to address the possibility that SMS will gain superiority in market diffusion. The economic conditions of a country (particularly a developing country), the services availed of, and the characteristics of users (for example, the number of unbanked users in heavily populated countries) may put SMS at the forefront. This may be especially true for micropayments made using mobile phones.
Abstract:
A mathematical model of the gas carburising (diffusion) process has been developed using the finite volume method, and a computer simulation has been carried out for an industrial gas carburising process. The model's predictions are in good agreement with industrial experimental data and with data collected from the literature. A study of various mass transfer and diffusion coefficients has been carried out in order to suggest which correlations should be used for the gas carburising process. The model has been given a graphical user interface in a Windows environment, making it extremely user-friendly. A sensitivity analysis of various parameters, such as the initial carbon concentration in the specimen, the carbon potential of the atmosphere, and the temperature of the process, has been carried out using the model.
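A minimal sketch of the kind of finite-volume scheme such a model rests on (not the paper's model): 1-D Fick diffusion of carbon into a slab with a surface mass-transfer condition J = beta*(Cp - Cs). The coefficients D, beta, and Cp, the slab depth, and the grid are illustrative assumptions only:

```python
import numpy as np

D, beta, cp = 1.3e-11, 1.0e-7, 1.1        # m^2/s, m/s, wt%C (assumed values)
c0, depth, n = 0.2, 2e-3, 100             # initial wt%C, slab depth (m), cells
dx = depth / n
dt = 0.4 * dx**2 / D                      # within the explicit stability limit
c = np.full(n, c0)                        # cell-averaged carbon concentration

def step(c):
    flux = np.zeros(n + 1)                            # fluxes on the cell faces
    flux[0] = beta * (cp - c[0])                      # gas/surface mass transfer
    flux[1:-1] = -D * (c[1:] - c[:-1]) / dx           # interior Fick fluxes
    # flux[-1] stays 0: impermeable (symmetry) condition at the core
    return c + dt / dx * (flux[:-1] - flux[1:])       # finite-volume balance

for _ in range(int(3600 / dt)):                       # one hour of carburising
    c = step(c)
print("surface carbon after 1 h: %.3f wt%%C" % c[0])
```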
Abstract:
The formation of silicon carbide in the Acheson process was studied using a mass transfer model developed in this study. The century-old Acheson process is still used for the mass production of silicon carbide. The process uses a heat-resistance furnace with sand and petroleum coke as the major raw materials, and is highly energy intensive. Since no mass transfer model was previously available for this process, one has been developed here to study the mass transfer aspects of the process along with heat transfer. The reaction kinetics of silicon carbide formation have been taken from the literature, and it is shown that the kinetics have an appreciable influence on process efficiency. The effect on the process of various parameters, such as the total gas pressure and the presence of silicon carbide in the initial charge, has been studied. A graphical user interface has also been developed for the Acheson process model to make the computer code user-friendly.
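By way of a hedged illustration of how literature-derived kinetics enter such a model, a first-order Arrhenius rate law can be integrated in closed form; the pre-exponential factor and activation energy below are placeholders, not values used in the paper:

```python
import numpy as np

A, E, R = 1.0e2, 3.0e5, 8.314   # 1/s and J/mol (both assumed), gas constant J/(mol K)

def sic_conversion(T_kelvin, t_seconds):
    """Fraction converted under first-order kinetics dx/dt = k (1 - x)."""
    k = A * np.exp(-E / (R * T_kelvin))     # Arrhenius rate constant
    return 1.0 - np.exp(-k * t_seconds)     # closed-form solution with x(0) = 0

for T in (2300.0, 2500.0, 2700.0):          # rough range of Acheson core temperatures
    print(f"{T:.0f} K: {sic_conversion(T, 3600.0):.3f} converted after 1 h")
```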
Abstract:
Remanufacturing activities in India are still at a nascent stage. However, the substantial growth of the Indian economy, coupled with serious issues of population and environmental burden, demands a radical shift in market strategies and legislation. The scattered and inefficient product recovery methods prevalent in India are unable to cope with the increasing environmental and economic burden on society, and remanufacturing seems a promising strategy for addressing them. Our study investigated, from a user's context, the opportunity to establish remanufacturing as a formal activity, answering the fundamental questions of whether remanufactured products would be accepted by Indian consumers and how they would fit into the Indian market. A study of the Indian mobile phone market ecosystem showed how mobile phones currently move through the value chain, and the importance of the grey and used-phone markets in this movement. A prescriptive model is proposed which utilizes the usage patterns of different consumer groups to create a self-sustaining demand-supply system, potentially complementing frameworks such as the Automotive Remanufacturing Decision-Making Framework (RDMF).
Abstract:
Seismic hazard analysis and microzonation of cities enable characterization of the potential seismic areas that must be taken into account when designing new structures or retrofitting existing ones. This study of seismic hazard and preparation of geotechnical microzonation maps has been carried out using a Geographical Information System (GIS). GIS provides an effective means of integrating different layers of information, giving a useful input for city planning and, in particular, for earthquake-resistant design of structures in an area. Seismic hazard analysis is the study of expected earthquake ground motions at any point on the earth, while microzonation is the subdivision of a region into a number of zones based on local-scale earthquake effects. Seismic microzonation is the process of estimating the response of soil layers under earthquake excitation, and thus the variation of ground motion characteristics on the ground surface. For seismic microzonation, geotechnical site characterization needs to be assessed at the local (micro) scale, and this is then used to assess site response and liquefaction susceptibility. A seismotectonic atlas of the area within a 350 km radius of Bangalore has been prepared, containing all seismogenic sources and historic earthquake events (a catalogue of about 1,400 events since 1906). We have attempted site characterization of Bangalore by collating conventional geotechnical borehole data (about 900 boreholes with depth information) integrated in GIS; a 3-D subsurface model of Bangalore prepared using GIS is shown in Figure 1. Further, shear wave velocity surveys based on geophysical methods have been carried out at about 60 locations in the city, covering a 220 sq. km area. Site response and local site effects have been evaluated using 1-dimensional ground response analysis. The spatial variability of soil overburden depths, ground-surface Peak Ground Acceleration (PGA), spectral acceleration at different frequencies, and liquefaction susceptibility have been mapped over the 220 sq. km area using GIS; ArcInfo software has been used for this purpose. These maps can be used for city planning and for risk and vulnerability studies. Figure 2 shows a map of peak ground acceleration at rock level for Bangalore city. Microtremor experiments were carried out jointly with NGRI scientists at about 55 locations in the city, and the predominant frequencies of the overburden soil columns were evaluated.
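The study's surface mapping was done in ArcInfo; as a hedged stand-in, the same kind of step (interpolating point measurements into a continuous PGA surface) can be sketched with synthetic data:

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(1)
sites = rng.uniform(0.0, 14.8, size=(60, 2))   # ~220 sq km -> ~14.8 km square (synthetic)
pga = 0.08 + 0.05 * rng.random(60)             # rock-level PGA in g (synthetic values)

# Interpolate the scattered points onto a regular grid, as a mapping layer would.
gx, gy = np.meshgrid(np.linspace(0, 14.8, 150), np.linspace(0, 14.8, 150))
surface = griddata(sites, pga, (gx, gy), method="cubic")
print("interpolated PGA range: %.3f-%.3f g" % (np.nanmin(surface), np.nanmax(surface)))
```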
Abstract:
The literature on pricing implicitly assumes an "infinite data" model, in which sources can sustain any data rate indefinitely. We assume a more realistic "finite data" model, in which sources occasionally run out of data, leading to variable user data rates. Further, we assume that users have contracts with the service provider specifying the rates at which they can inject traffic into the network. Our objective is to study how prices can be set so that a single link is shared efficiently and fairly among users in a dynamically changing scenario where a subset of users occasionally has little data to send. User preferences are modelled by concave increasing utility functions. We introduce two additional elements: a convex increasing disutility function and a convex increasing multiplicative congestion-penalty function. The disutility function takes the shortfall (contracted rate minus present rate) as its argument and essentially encourages users to send traffic at their contracted rates, while the congestion-penalty function discourages heavy users from sending excess data when the link is congested. We obtain simple necessary and sufficient conditions on prices for fair and efficient link sharing; moreover, we show that a single price for all users achieves this. We illustrate the ideas using a simple experiment.
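The abstract fixes only the qualitative shapes of the functions involved; one plausible formalisation of the per-user problem, with all notation assumed here rather than taken from the paper, is:

```latex
% User i, with contracted rate c_i, chooses its rate r_i to solve
%   max  U_i(r_i) - D_i(c_i - r_i) - \lambda g(y) r_i   over 0 <= r_i <= c_i,
% where y = \sum_j r_j is the total link load, \lambda is the per-unit price,
% U_i is concave increasing (utility), D_i is convex increasing (disutility of
% the shortfall), and g is convex increasing (multiplicative congestion penalty).
\max_{0 \le r_i \le c_i}\; U_i(r_i) \;-\; D_i(c_i - r_i) \;-\; \lambda\, g\!\Big(\sum_{j} r_j\Big)\, r_i
```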
Abstract:
Today 80% of the content on the Web is in English, a language spoken by only 8% of the world's population and 5% of the Indian population. There is a wealth of useful content in the various languages of the world other than English that could be made available on the Internet but, to date, for various reasons, mostly is not. India itself has 18 officially recognized languages and scores of dialects. Although the medium of instruction for most higher education and research in India is English, a substantial amount of literature, including novels, textbooks, and scholarly information, is being generated in the other languages of the country, and many e-governance initiatives are in the respective state languages. In the past, support for different languages by operating systems and software packages was not very encouraging. However, with the advent of Unicode, operating systems and software packages now support almost all the major scripted languages of the world. In the work reported in this paper, we explain the configuration changes needed for the Eprints.org software to store multilingual content and to create a multilingual user interface.
Abstract:
Constellation Constrained (CC) capacity regions of the two-user Gaussian Multiple Access Channel (GMAC) have recently been reported, wherein an appropriate angle of rotation between the constellations of the two users is shown to enlarge the CC capacity region; we refer to such a scheme as the Constellation Rotation (CR) scheme. In this paper, we propose a novel scheme called the Constellation Power Allocation (CPA) scheme, wherein the instantaneous transmit powers of the two users are varied while maintaining their average power constraints. We show that the CPA scheme offers CC sum capacities equal (at low SNR values) or close (at high SNR values) to those offered by the CR scheme, with reduced decoding complexity for QAM constellations. We study the robustness of the CPA scheme to random phase offsets in the channel and to unequal average power constraints for the two users. With random phase offsets in the channel, we show that the CC sum capacity offered by the CPA scheme exceeds that of the CR scheme at high SNR values. With unequal average power constraints, we show that the CPA scheme provides maximum gain when the power levels are close, and that the advantage diminishes as the power difference increases.
On Precoding for Constant K-User MIMO Gaussian Interference Channel With Finite Constellation Inputs
Abstract:
This paper considers linear precoding for the constant channel-coefficient K-user MIMO Gaussian interference channel (MIMO GIC), where each transmitter i (Tx-i) needs to send d_i independent complex symbols per channel use, taking values from fixed finite constellations with uniform distribution, to receiver i (Rx-i), for i = 1, 2, ..., K. We define the maximum rate achieved by Tx-i using any linear precoder, as the signal-to-noise ratio (SNR) tends to infinity with the interference channel coefficients set to zero, to be the constellation constrained saturation capacity (CCSC) for Tx-i. We derive a high-SNR approximation for the rate achieved by Tx-i when interference is treated as noise; this rate is given by the mutual information between Tx-i and Rx-i, denoted I(X_i; Y_i). A set of necessary and sufficient conditions on the precoders under which I(X_i; Y_i) tends to the CCSC for Tx-i is derived. Interestingly, the precoders designed for interference alignment (IA) satisfy these necessary and sufficient conditions. Furthermore, we propose gradient-ascent-based algorithms to optimize the sum rate achieved by precoding with finite constellation inputs while treating interference as noise. A simulation study using the proposed algorithms for a three-user MIMO GIC, with two antennas at each node, d_i = 1 for all i, and BPSK and QPSK inputs, shows more than 0.1 b/s/Hz gain in the ergodic sum rate over that yielded by precoders obtained from some known IA algorithms at moderate SNRs.
Abstract:
Household-level water treatment and safe storage (HWTS) systems are simple, local, user-friendly, and low-cost options for improving drinking water quality at the point of use. However, despite conclusive evidence of the health and economic benefits of HWTS, and promotion efforts in over 50 countries over the past 20 years, implementation has been slow, reaching only 5-10 million regular users. This study attempts to understand the barriers and drivers affecting HWTS implementation. Using the case study of a biosand filter program in southern India, system dynamics modelling is shown to be a useful tool for mapping the inter-relationships of the critical factors and for understanding the dissemination dynamics. It is found that expanding quickly while achieving financial sustainability appears difficult under the current program structure.
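As a toy illustration of the system-dynamics argument (the stocks, flows, and every coefficient below are invented for the sketch, not taken from the paper's model), a subsidised filter program can grow its user base only while its funds last:

```python
def simulate(months=120):
    users, funds = 100.0, 50_000.0           # stocks: regular users, program funds
    potential = 20_000.0                     # households reachable by the program
    price, cost, subsidy = 25.0, 30.0, 15.0  # per filter; sale price is below cost
    for _ in range(months):
        adoption = 0.08 * users * (1.0 - users / potential)  # word-of-mouth (logistic)
        if funds <= 0.0:
            adoption = 0.0                   # expansion stalls once funds run out
        users += adoption
        funds += adoption * (price - cost - subsidy)  # each filter drains the funds
    return users, funds

users, funds = simulate()
print(f"after 10 years: {users:,.0f} users, remaining funds {funds:,.0f}")
```

Under these assumed numbers, growth stops long before the potential market is reached, mirroring the reported tension between rapid expansion and financial sustainability.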