951 results for digital asset management
Abstract:
This deliverable (D1.4) is an intermediate document, expressly included to inform the first project review about RAGE's methodology of software asset creation and management. The final version of the methodology description (D1.1) will be delivered in Month 29. The document explains how the RAGE project defines, develops, distributes and maintains the series of applied gaming software assets that it aims to make available. It describes the high-level methodology and infrastructure needed to support the work both during the project and after it has ended.
Abstract:
This keynote presentation will report some of our research work and experience on the development and applications of relevant methods, models, systems and simulation techniques in support of different types and various levels of decision making for business, management and engineering. In particular, the following topics will be covered:
- Modelling, multi-agent-based simulation and analysis of the allocation management of carbon dioxide emission permits in China (Nanfeng Liu & Shuliang Li)
- Agent-based simulation of the dynamic evolution of enterprise carbon assets (Yin Zeng & Shuliang Li)
- A framework & system for extracting and representing project knowledge contexts using topic models and dynamic knowledge maps: a big data perspective (Jin Xu, Zheng Li, Shuliang Li & Yanyan Zhang)
- Open innovation: intelligent model, social media & complex adaptive system simulation (Shuliang Li & Jim Zheng Li)
- A framework, model and software prototype for modelling and simulation of deshopping behaviour and how companies respond (Shawkat Rahman & Shuliang Li)
- Integrating multiple agents, simulation, knowledge bases and fuzzy logic for international marketing decision making (Shuliang Li & Jim Zheng Li)
- A Web-based hybrid intelligent system for combined conventional, digital, mobile, social media and mobile marketing strategy formulation (Shuliang Li & Jim Zheng Li)
- A hybrid intelligent model for Web & social media dynamics, and evolutionary and adaptive branding (Shuliang Li)
- A hybrid paradigm for modelling, simulation and analysis of brand virality in social media (Shuliang Li & Jim Zheng Li)
- Network configuration management: attack paradigms and architectures for computer network survivability (Tero Karvinen & Shuliang Li)
Abstract:
Major developments in the technological environment can become commonplace very quickly. They are now impacting upon a broad range of information-based service sectors, as high-growth Internet-based firms, such as Google, Amazon, Facebook and Airbnb, and financial technology (Fintech) start-ups expand their product portfolios into new markets. Real estate is one of the information-based service sectors currently being impacted by this new type of competitor and the broad range of disruptive digital technologies that have emerged. Due to the vast troves of data that these Internet firms have at their disposal and their asset-light (cloud-based) structures, they are able to offer highly targeted products at much lower costs than conventional brick-and-mortar companies.
Abstract:
The present thesis explores how interaction is initiated in multi-party meetings in Adobe Connect 7.0, with a particular focus on how co-presence and mutual availability are established through the preambles of 18 meetings held in Spanish without a moderator. Taking Conversation Analysis (CA) as its methodological point of departure, this thesis comprises four studies, each analyzing a particular phenomenon within the interaction of the preambles in a multimodal environment that allows simultaneous interaction through video, voice and text chat. The first study (Artículo I) shows how participants jointly solve the issue of availability in a technological environment where being online is not necessarily understood as being available for communication. The second study (Artículo II) focuses on the beginning of the audiovisual interaction, in particular on how participants check that the audiovisual mode is working properly. The third study (Artículo III) explores silences within the interaction of the preamble. It shows that the length of gaps and lapses becomes a significant aspect of the preambles and how they are connected to the issue of availability. Finally, the fourth study introduces the notion of modal alignment, an interactional phenomenon that systematically appears at the beginnings of the encounters and seems to be used and understood as a strategy for establishing mutual availability and negotiating the participation framework. As a whole, this research shows how participants, in order to establish mutual co-presence and availability, adapt to a particular technology in terms of participation management, deploying strategies and conveying successive actions which, as is the case with the activation of their respective webcams, seem to be understood as predictable within the intricate process of establishing mutual availability before the meeting starts.
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
This study seeks to identify the difficulties encountered in implementing digital inclusion at the secondary level (Ensino Médio) of the Escola Estadual Deputado João Evaristo Curvo, in the city of Jauru, Mato Grosso. The objectives of the research are: to identify the main digital learning environments of secondary students; to describe how educators, students and the institution appropriate information technology; and to evaluate the effectiveness of information technology in implementing their digital and social inclusion. The methodology is qualitative and descriptive, carried out through observations and questionnaires. Through the analysis of the results, the study seeks to reflect on the reality experienced by teachers in secondary classrooms and to validate a more viable way of implementing digital inclusion. As this reflection unfolds, the research should help to stimulate students' interest in learning and in technological innovation, so that they acquire intellectual capital and find conditions for self-realization and sociocultural growth, since school interaction demands an appropriate attitude from all those involved, as well as a variety of educational approaches to be worked into pedagogical practice. Keywords: Implementation. Digital Inclusion. Secondary Education.
Abstract:
From the 1990s onward, with the advance of technology, a notable reorganization of organizational processes and activities can be identified. There is consensus that technology has played the fundamental role of connecting people, thereby expanding the supply of services and providing access to a wealth of information and data. In the banking sector, a trend toward virtualizing service and transactions can be identified, given their low cost and the speed and convenience they offer customers. In this scenario, internet banking became, as of 2011, the largest banking service channel in Brazil. This study was developed to identify and analyze the main attributes that influence customers of the branches of a federal banking institution located in Vale do Alto Paraopeba/MG to use banking services via internet banking. Specifically, it aimed to: identify and analyze the variables with the greatest influence on perceived quality among the institution's internet banking users; assess users' perception of the benefits of using the internet for their transactions; and identify possible critical factors involving the services offered via internet banking at the institution under study. Methodologically, a descriptive study with a quantitative approach was developed, based on a case study of a federal banking institution. Data were collected through a structured questionnaire administered to account holders who use internet banking for their transactions. A total of 250 valid questionnaires were obtained and analyzed through exploratory factor analysis (EFA).
Among the results, three constructs were identified as having the greatest influence on customers' predisposition to use internet banking services: accessibility, convenience and security. Together, these constructs account for 61.593% of the explained variance, an index considered "good" for studies in the applied social sciences. The study also identified critical points related mainly to bureaucratic dysfunction, identifiable in much of the processes between the company and its customers. Notably, this dysfunction is also present in other segments of Brazilian public services, where slowness and procedural rework can be identified. Keywords: Internet banking. Services Marketing. Service Quality. Banks. Accessibility.
Abstract:
In today's big data world, data is being produced in massive volumes, at great velocity and from a variety of different sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (Internet of Things), social networks, communication networks and many others. Interactive querying and large-scale analytics are being increasingly used to derive value out of this big data. A large portion of this data is being stored and processed in the Cloud due to the several advantages provided by the Cloud, such as scalability, elasticity, availability, low cost of ownership and the overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully.
I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has traditionally been used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing data size (progressive samples) for exploratory querying. This provides the data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! enables the provision of early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, etc. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly accessing the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
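The neighborhood-centric idea above can be illustrated with a minimal sketch: extracting the k-hop neighborhood (ego network) of a vertex and the induced subgraph on it. This is a toy illustration of the general technique, not NSCALE's actual API; the function names and the adjacency-list representation are invented for the example.

```python
from collections import deque

def k_hop_neighborhood(adj, center, k):
    """Return the set of vertices within k hops of `center`, via BFS."""
    seen = {center}
    frontier = deque([(center, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == k:
            continue  # do not expand past k hops
        for nbr in adj.get(node, ()):
            if nbr not in seen:
                seen.add(nbr)
                frontier.append((nbr, depth + 1))
    return seen

def ego_subgraph(adj, center, k=1):
    """Induced subgraph on the k-hop neighborhood (the 'ego network')."""
    nodes = k_hop_neighborhood(adj, center, k)
    return {v: [n for n in adj.get(v, ()) if n in nodes] for v in nodes}

# Toy path graph 0-1-2-3 as undirected adjacency lists:
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(ego_subgraph(adj, 1, k=1))  # vertices {0, 1, 2} and the edges among them
```

A neighborhood-centric task (motif counting, social-circle detection, etc.) would then run over each such extracted subgraph rather than over individual vertices.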
Abstract:
Many applications, including communications, test and measurement, and radar, require the generation of signals with a high degree of spectral purity. One method for producing tunable, low-noise source signals is to combine the outputs of multiple direct digital synthesizers (DDSs) arranged in a parallel configuration. In such an approach, if all noise is uncorrelated across channels, the noise will decrease relative to the combined signal power, resulting in a reduction of sideband noise and an increase in SNR. However, in any real array, the broadband noise and spurious components will be correlated to some degree, limiting the gains achieved by parallelization. This thesis examines the potential performance benefits that may arise from using an array of DDSs, with a focus on several types of common DDS errors, including phase noise, phase truncation spurs, quantization noise spurs, and quantizer nonlinearity spurs. Measurements to determine the level of correlation among DDS channels were made on a custom 14-channel DDS testbed. The investigation of the phase noise of a DDS array indicates that the contribution to the phase noise from the DACs can be decreased to a desired level by using a large enough number of channels. In such a system, the phase noise qualities of the source clock and the system cost and complexity will be the main limitations on the phase noise of the DDS array. The study of phase truncation spurs suggests that, at least in our system, the phase truncation spurs are uncorrelated, contrary to the theoretical prediction. We believe this decorrelation is due to the existence of an unidentified mechanism in our DDS array that is unaccounted for in our current operational DDS model. This mechanism, likely due to some timing element in the FPGA, causes some randomness in the relative phases of the truncation spurs from channel to channel each time the DDS array is powered up. 
This randomness decorrelates the phase truncation spurs, opening the potential for SFDR gain from using a DDS array. The analysis of the correlation of quantization noise spurs in an array of DDSs shows that the total quantization noise power of each DDS channel is uncorrelated for nearly all values of DAC output bits. This suggests that a near-N gain in SQNR is possible for an N-channel array of DDSs. This gain will be most apparent for low-bit DACs, in which quantization noise is notably higher than the thermal noise contribution. Lastly, the measurements of the correlation of quantizer nonlinearity spurs demonstrate that the second and third harmonics are highly correlated across channels for all frequencies tested. This means that there is no benefit to using an array of DDSs to address in-band quantizer nonlinearities. As a result, alternate methods of harmonic spur management must be employed.
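The uncorrelated-noise argument above can be checked with a quick simulation: summing N channels that share a common tone but carry independent Gaussian noise improves SNR by roughly 10·log10(N) dB, because signal power grows as N² while uncorrelated noise power grows only as N. The sample count, tone frequency, and noise level below are invented for illustration and are unrelated to the 14-channel testbed.

```python
import math
import random

random.seed(0)
N_SAMPLES = 40_000
# Tone at 0.01 cycles/sample: exactly 400 full periods, so mean power is 0.5.
TONE = [math.sin(2 * math.pi * 0.01 * n) for n in range(N_SAMPLES)]

def snr_db(n_channels, noise_sigma=0.1):
    """SNR (dB) of the coherent sum of n_channels simulated DDS outputs,
    each carrying the same tone plus independent Gaussian noise."""
    combined = [0.0] * N_SAMPLES
    for _ in range(n_channels):
        for i in range(N_SAMPLES):
            combined[i] += TONE[i] + random.gauss(0.0, noise_sigma)
    signal_power = (n_channels ** 2) * 0.5  # amplitudes add coherently
    residual_power = sum((combined[i] - n_channels * TONE[i]) ** 2
                         for i in range(N_SAMPLES)) / N_SAMPLES
    return 10 * math.log10(signal_power / residual_power)

# Signal power grows as N^2, uncorrelated noise power only as N,
# so the SNR gain of an N-channel array approaches 10*log10(N) dB.
gain = snr_db(8) - snr_db(1)
print(f"simulated 8-channel SNR gain: {gain:.2f} dB "
      f"(ideal: {10 * math.log10(8):.2f} dB)")
```

Correlated noise or spurs would add coherently like the tone itself, which is exactly why the correlated harmonics measured above see no benefit from parallelization.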
Abstract:
Peer-to-peer information sharing has fundamentally changed the customer decision-making process. Recent developments in information technologies have enabled digital sharing platforms to influence various granular aspects of the information sharing process. Despite the growing importance of digital information sharing, little research has examined the optimal design choices for a platform seeking to maximize returns from information sharing. My dissertation seeks to fill this gap. Specifically, I study novel interventions that can be implemented by the platform at different stages of information sharing. In collaboration with a leading for-profit platform and a non-profit platform, I conduct three large-scale field experiments to causally identify the impact of these interventions on customers' sharing behaviors as well as the sharing outcomes. The first essay examines whether and how a firm can enhance social contagion simply by varying the message shared by customers with their friends. Using a large randomized field experiment, I find that i) adding only information about the sender's purchase status increases the likelihood of recipients' purchase; ii) adding only information about the referral reward increases recipients' follow-up referrals; and iii) adding information about both the sender's purchase and the referral reward increases neither the likelihood of purchase nor follow-up referrals. I then discuss the underlying mechanisms. The second essay studies whether and how a firm can design unconditional incentives to engage customers who have already revealed a willingness to share. I conduct a field experiment to examine the impact of incentive design on senders' purchases as well as further referral behavior. I find evidence that incentive structure has a significant, but interestingly opposing, impact on both outcomes. The results also provide insights about senders' motives in sharing.
The third essay examines whether and how a non-profit platform can use mobile messaging to leverage recipients' social ties to encourage blood donation. I design a large field experiment to causally identify the impact of different types of information and incentives on donors' self-donation and group donation behavior. My results show that non-profits can stimulate a group effect and increase blood donation, but only with a group reward. Such a group reward works by motivating a different donor population. In summary, the findings from the three studies offer valuable insights for platforms and social enterprises on how to engineer digital platforms to create social contagion. The rich data from randomized experiments and complementary sources (archival and survey) also allow me to test the underlying mechanisms at work. In this way, my dissertation provides both managerial implications and theoretical contributions to the phenomenon of peer-to-peer information sharing.
Abstract:
Prior research has been divided regarding how firms respond to bankruptcy risk, largely revolving around two competing forces. On the one hand, asset substitution encourages firms to increase the riskiness of assets to extract value from creditors. On the other, firms want to minimize bankruptcy risk, either by reducing cash flow risk or by increasing the size of the firm. I test these two theories using a natural experiment of chemicals used in production processes being newly identified as carcinogenic to explore how firms may respond to potential negative cash flow resulting from litigation risk. I use plant-level chemical data to study firm exposure to risk. I examine how responses may vary within the industry between firms with differing levels of chemical exposure, how firm financial distress affects firm response, and whether public and private firms respond differently. In general, my research provides support for the asset substitution theory. My first paper studies how investment response varies based on the level of carcinogenic exposure. I find that firms with moderate levels of exposure make efforts to mitigate their cash flow risk and reduce their exposure. At the same time, firms with high levels of exposure increase their exposure and the riskiness of future cash flows. These findings are consistent with asset substitution theory. My second paper analyzes the interaction of financial distress and risk exposure. I find that firms in a stronger financial position are more likely to limit their exposure by reducing the number of exposed facilities. On the other hand, not only do firms in a weaker financial position not decrease their exposure, I find that, in some instances, they increase their exposure to carcinogens. This work again supports the theory of asset substitution. Finally, in my third paper, I explore whether public firms respond differently to a potential negative cash flow shock than do private firms.
I test whether existing public firms are more likely to attempt to minimize their cash flow risk and thus reduce their carcinogen exposure than are private firms. I do not find evidence that public firms respond differently to this shock than do private firms.
Abstract:
Most economic transactions nowadays depend on the effective exchange of information, in which digital resources play a huge role. New actors are coming into existence all the time, so organizations are facing difficulties in keeping their current customers and attracting new customer segments and markets. Companies are trying to find the key to their success, and creating superior customer value seems to be one solution. Digital technologies can be used to deliver value to customers in ways that extend customers' normal conscious experiences in the context of time and space. By creating customer value, companies can gain the increased loyalty of existing customers and better ways to serve new customers effectively. Based on these assumptions, the objective of this study was to design a framework to enable organizations to create customer value in digital business. The research was carried out as a literature review and an empirical study, which consisted of a web-based survey and semi-structured interviews. The data from the empirical study were analyzed as mixed research with qualitative and quantitative methods. These methods were used because the objective of the study was to gain a deeper understanding of an existing phenomenon; accordingly, the study combined statistical procedures with a description of value creation as a phenomenon. The framework was designed first based on the literature and then updated based on the findings from the empirical study. As a result, relationship, understanding the customer, focusing on the core product or service, product or service quality, incremental innovations, service range, corporate identity, and networks were chosen as the top elements of customer value creation. Measures for these elements were identified. With the measures, companies can manage the elements in value creation when dealing with present and future customers and also manage the operations of the company.
In conclusion, creating customer value requires understanding the customer and a lot of information sharing, which can be eased by digital resources. Understanding the customer helps to produce products and services that fulfill customers’ needs and desires. This could result in increased sales and make it easier to establish efficient processes.
Abstract:
Symbiotic relationships between insects and beneficial microbes are very common in nature, especially within the Hemiptera. The brown marmorated stink bug, Halyomorpha halys Stål, harbors a symbiont, Pantoea carbekii, within specialized crypts in the fourth region of the midgut. In this dissertation, I explored this insect-microbe relationship. I determined that the brown marmorated stink bug is heavily reliant on its symbiont, and that experimental removal of the symbiont from the egg mass surface prior to nymphal acquisition led to lower survival, longer development, lower fecundity, and aberrant nymphal behavior. Additionally, I determined that even when the symbiont is acquired and housed in the midgut crypts, it is susceptible to stressors. Stink bugs reared at a higher temperature showed lower survival, longer development, and a cessation of egg mass production, and when bugs were screened for their symbiont, fewer had successfully retained it while under heat stress. Finally, with the knowledge that the stink bug suffers decreases in fitness when its symbiont is missing or stressed, I wanted to determine whether targeting the symbiont was a possible management technique for the stink bug. I tested the efficacy of a number of different insecticidal and antimicrobial products to determine whether prevention of symbiont acquisition from the egg mass was possible, and the results indicated that transmission of the symbiont from the egg mass to the newly hatched nymph was negatively impacted when certain products were applied (namely surfactants or products containing surfactants). Additionally, direct effects on hatch rate and survival were reported for certain products, namely the insect growth regulator azadirachtin, which suggests that nymphs can pick up residues from the egg mass surface while probing for the symbiont. I conclude that P. carbekii plays a critically important role in the survival of its host, the brown marmorated stink bug, and that its presence on the egg mass surface before nymphal hatch makes it targetable as a potential management technique.
Abstract:
Knowledge is one of the most important assets for surviving in the modern business environment. The effective management of that asset mandates continuous adaptation by organizations, and requires employees to strive to improve the company's work processes. Organizations attempt to coordinate their unique knowledge with traditional means as well as in new and distinct ways, and to transform it into innovative resources better than those of their competitors. As a result, how to manage the knowledge asset has become a critical issue for modern organizations, and knowledge management is considered the most feasible solution. Knowledge management is a multidimensional process that identifies, acquires, develops, distributes, utilizes, and stores knowledge. However, many related studies focus only on fragmented or limited knowledge-management perspectives. In order to make knowledge management more effective, it is important to identify the qualitative and quantitative issues that are the foundation of the challenge of effective knowledge management in organizations. The main purpose of this study was to integrate the fragmented knowledge management perspectives into a holistic framework, which includes knowledge infrastructure capability (technology, structure, and culture) and knowledge process capability (acquisition, conversion, application, and protection), based on Gold's (2001) study. Additionally, because the effect of incentives, which are widely acknowledged as a prime motivator in facilitating the knowledge management process, was missing from the original framework, this study included incentives in the knowledge management framework. This study also examined the relationship with organizational performance from the standpoint of the Balanced Scorecard, which includes the customer-related, internal business process, learning & growth, and perceptual financial aspects of organizational performance in the Korean business context.
Moreover, this study examined the relationship with objective financial performance by calculating Tobin's q ratio. Lastly, this study compared group differences between larger and smaller organizations, and between manufacturing and nonmanufacturing firms, in the study of knowledge management. Since this study was conducted in Korea, the original instrument was translated into Korean through the back-translation technique. A confirmatory factor analysis (CFA) was used to examine the validity and reliability of the instrument. To identify the relationship between knowledge management capabilities and organizational performance, structural equation modeling (SEM) and multiple regression analysis were conducted. A Student's t test was conducted to examine the mean differences. The results of this study indicated that there is a positive relationship between effective knowledge management and organizational performance. However, no empirical evidence was found to suggest that knowledge management capabilities are linked to objective financial performance, which remains a topic for future research. Additionally, findings showed that knowledge management is affected by an organization's size, but not by the type of organization. The results of this study are valuable in establishing a valid and reliable survey instrument, as well as in providing strong evidence that knowledge management capabilities are essential to improving organizational performance, and in making important recommendations for future research.
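Tobin's q, mentioned above, is commonly approximated in empirical work as (market value of equity + book value of liabilities) / book value of total assets; a q above 1 suggests the market prices the firm's assets, often intangibles such as knowledge, above their replacement cost. The sketch below uses that common approximation (the study does not specify which variant it used) with invented figures, not data from the study.

```python
def tobins_q(market_equity: float, book_liabilities: float,
             book_assets: float) -> float:
    """Common empirical approximation of Tobin's q:
    (market value of equity + book value of liabilities)
    divided by the book value of total assets."""
    return (market_equity + book_liabilities) / book_assets

# Invented illustrative figures (in billions), not data from the study:
q = tobins_q(market_equity=120.0, book_liabilities=40.0, book_assets=100.0)
print(f"Tobin's q = {q:.2f}")  # → Tobin's q = 1.60
```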
Abstract:
The objective of this study is to identify and analyze, in the international literature, the metric correlations between performance evaluation and digital information management. It is justified by the need to broaden information science approaches to digital information management. It is an exploratory study with a quali-quantitative character, using the Proknow-C process to select the literature and to identify, analyze and reflect on the characteristics of the publications. The results identified the five most-cited authors, from the fields of general management and evaluation; the journal most receptive to the topic was the International Journal of Public Sector Management, together with the areas of communication and technology. The findings can support the development of interdisciplinary knowledge on the topic, with new contributions, since structural gaps in approaches within the field of information science became evident.