781 results for big data storage
Abstract:
As we enter an era of ‘big data’, asset information is becoming a deliverable of complex projects. Prior research suggests digital technologies enable rapid, flexible forms of project organizing. This research analyses practices of managing change in Airbus, CERN and Crossrail through desk-based review, interviews, visits and a cross-case workshop. These organizations deliver complex projects, rely on digital technologies to manage large data-sets, and use configuration management, a systems engineering approach with mid-20th-century origins, to establish and maintain integrity. In these organizations, configuration management has become more, rather than less, important. Asset information is structured, with change managed through digital systems using relatively hierarchical, asynchronous and sequential processes. The paper contributes by uncovering limits to flexibility in complex projects where integrity is important. Challenges of managing change are discussed, considering the evolving nature of configuration management, the potential use of analytics on complex projects, and implications for research and practice.
Abstract:
Pervasive healthcare aims to deliver deinstitutionalised healthcare services to patients anytime and anywhere. It involves remote data collection through mobile devices and sensor networks, where the data is usually large in volume, varied in format and high in frequency. The nature of big data, characterised by volume, variety, velocity and veracity, together with its analytical capabilities, complements the delivery of pervasive healthcare. However, there is limited research intertwining these two domains. Most research focuses mainly on the technical context of big data application in the healthcare sector; little attention has been paid to the strategic role of big data, which impacts the quality of healthcare service provision at the organisational level. Therefore, this paper delivers a conceptual view of a big data architecture for pervasive healthcare via an intensive literature review to address the aforementioned research problems. The paper provides three major contributions: 1) it identifies the research themes of big data and pervasive healthcare, 2) it establishes the relationships between these themes, which together compose the big data architecture for pervasive healthcare, and 3) it sheds light on future research, such as semiosis and sense-making, and enables practitioners to implement big data in pervasive healthcare through the proposed architecture.
Abstract:
Widespread commercial use of the internet has significantly increased the volume and scope of data being collected by organisations. ‘Big data’ has emerged as a term to encapsulate both the technical and commercial aspects of this growing data collection activity. To date, much of the discussion of big data has centred upon its transformational potential for innovation and efficiency, yet there has been less reflection on its wider implications beyond commercial value creation. This paper builds upon normal accident theory (NAT) to analyse the broader ethical implications of big data. It argues that the strategies behind big data require organisational systems that leave them vulnerable to normal accidents, that is to say some form of accident or disaster that is both unanticipated and inevitable. Whilst NAT has previously focused on the consequences of physical accidents, this paper suggests a new form of system accident that we label data accidents. These have distinct, less tangible and more complex characteristics and raise significant questions over the role of individual privacy in a ‘data society’. The paper concludes by considering the ways in which the risks of such data accidents might be managed or mitigated.
Abstract:
This paper discusses how global financial institutions are using big data analytics within their compliance operations. Much previous research has focused on the strategic implications of big data, but little has considered how such tools are entwined with regulatory breaches and investigations in financial services. Our work covers two in-depth qualitative case studies, each addressing a distinct type of analytics. The first case focuses on analytics that manage everyday compliance breaches and so are expected by managers. The second focuses on analytics that facilitate investigation and litigation where serious, unexpected breaches may have occurred. In doing so, the study examines micro-level data practices to understand how these tools are influencing operational risks and practices. The paper draws on two bodies of literature, the social studies of information systems and of finance, to guide our analysis and practitioner recommendations. The cases illustrate how technologies are implicated in multijurisdictional challenges and regulatory conflicts at each end of the operational risk spectrum. We find that compliance analytics both shape and report regulatory matters, yet firms often have difficulty recruiting individuals with the relevant but diverse skill sets. The cases also underscore the increasing need for financial organizations to adopt robust information governance policies and processes to ease future remediation efforts.
Abstract:
The FGV Projetos study, coordinated by economist Fernando Blumenschein, develops a methodological framework specific to government procurement, based on research carried out for the Fundo Nacional de Desenvolvimento da Educação (FNDE). The study highlights the potential of applying concepts from auction theory, together with "Big Data" analysis methods, to the design of public bidding sessions.
Abstract:
Digital technologies have become an important infrastructure for our lives across many dimensions: cultural, social, political and economic. Forms of digital mediation have altered traditional and conventional ways of organizing time and space. However, contextualizing and situating the narratives and practices of producing and analysing network data, generated in large volumes and at great speed, has been a major challenge for researchers around the world. Released at the end of 2014, the book Big data? Qualitative approaches to digital research (Emerald Books, 2014) offers a critical view of qualitative analyses of new types of data, platforms and media, of their implications for the future, and of how to actively improve research concerning them. With expert authors from the fields of sociology, political science, culture, communication, methodology and management, the book's editors, Martin Hand and Sam Hillyard, use the term digital social research to cover all the qualitative perspectives, drawn from diverse disciplines, concepts and methodological and empirical orientations, that attest to the integration and diversification of digital and data technologies in social life today.
Abstract:
EMAp - Escola de Matemática Aplicada (School of Applied Mathematics)
Abstract:
We present an innovative project at the intersection of information technology, management and law, aimed at optimizing results and reducing costs and time. The proposing team was formed within the Big Data e Gestão Processual project, which brings together three schools of the Fundação Getulio Vargas that are national references in Brazil: the schools of Law, Business Administration and Applied Mathematics, all in Rio de Janeiro.
Abstract:
Online geographic databases have grown steadily, as they have become a crucial source of information for both social networks and safety-critical systems. Since the quality of such applications is largely related to the richness and completeness of their data, it becomes imperative to develop adaptable and persistent storage systems, able to make use of several sources of information while enabling the fastest possible response from them. This work creates a shared and extensible geographic model, able to retrieve and store information from the major spatial sources available. A geographic-based system also has very high requirements in terms of scalability, computational power and domain complexity, causing several difficulties for a traditional relational database as the number of results increases. NoSQL systems provide valuable advantages in this scenario, in particular graph databases, which are capable of modelling vast amounts of interconnected data while providing a very substantial performance increase for several spatial requests, such as finding shortest-path routes and performing relationship lookups with high concurrency. In this work, we analyse the current state of geographic information systems and develop a unified geographic model, named GeoPlace Explorer (GE). GE is able to import and store spatial data from several online sources at a symbolic level in both a relational and a graph database, and several stress tests were performed on each in order to identify the advantages and disadvantages of each database paradigm.
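The shortest-path queries mentioned in this abstract are the kind of traversal a graph database answers natively. As a minimal illustration of the underlying idea (not of the GeoPlace Explorer system itself, whose schema is not given here), the sketch below runs Dijkstra's algorithm over a small hypothetical place graph held as an adjacency dict; the place names and distances are invented for the example.

```python
import heapq

def dijkstra(graph, start, goal):
    """Return (total cost, path) for the cheapest route from start to goal.

    graph: dict mapping node -> {neighbour: edge weight}.
    """
    queue = [(0.0, start, [start])]  # (cost so far, node, path taken)
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbour, weight in graph.get(node, {}).items():
            if neighbour not in seen:
                heapq.heappush(queue, (cost + weight, neighbour, path + [neighbour]))
    return float("inf"), []  # goal unreachable

# Hypothetical place graph; edge weights are distances in km.
places = {
    "station": {"market": 2.0, "park": 5.0},
    "market": {"park": 1.5, "museum": 4.0},
    "park": {"museum": 1.0},
    "museum": {},
}

cost, route = dijkstra(places, "station", "museum")
# route goes station -> market -> park -> museum at a total cost of 4.5 km
```

A graph database such as the one evaluated in the work expresses the same query declaratively over stored relationships, which is why relationship-heavy lookups scale better there than as multi-way relational joins.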
Abstract:
Antimony based glasses have been investigated for the first time regarding the possibility of holographic data storage using visible lasers sources. Changes in both refractive index and the absorption coefficient were measured using a holographic setup. The modulation of the optical constants is reversible by heat treatment. Bragg gratings were written under visible light of an Ar laser and erased thermally.
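For context, the standard textbook relations behind such a holographic measurement (not stated in the abstract itself) connect the writing geometry to the grating period and the measured diffraction efficiency to the index modulation:

```latex
% Two writing beams of wavelength \lambda_w intersecting at full angle
% 2\theta record a grating of period
\Lambda = \frac{\lambda_w}{2\sin\theta}

% Kogelnik's coupled-wave result for a lossless thick phase grating,
% read at the Bragg angle, links the diffraction efficiency \eta to the
% refractive-index modulation \Delta n over grating thickness d:
\eta = \sin^{2}\!\left(\frac{\pi\,\Delta n\,d}{\lambda\cos\theta}\right)
```

Inverting the second relation is the usual way a holographic setup of this kind extracts the refractive-index change reported in the abstract.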
Abstract:
The two main forces affecting economic development are the ongoing technological revolution and the challenge of sustainability. Technological change is altering patterns of production, consumption and behaviour in societies; at the same time, it is becoming increasingly difficult to ensure the sustainability of these new patterns because of the constraints resulting from the negative externalities generated by economic growth and, in many cases, by technical progress itself. Reorienting innovation towards reducing or, if possible, reversing the effects of these externalities could create the conditions for synergies between the two processes. Views on the subject vary widely: while some maintain that these synergies can easily be created if growth follows an environmentally friendly model, summarized in the concept of green growth, others argue that production and consumption patterns are changing too slowly and that any technological fix will come too late. These considerations apply to hard technologies, essentially those used in production. The present document explores the opportunities that new technologies, basically information and communication technologies, open up for increasing the effectiveness (outcomes) and efficiency (relative costs) of the soft technologies that can improve the way environmental issues are handled in business management and in public policy formulation and implementation.
Abstract:
The data revolution for sustainable development has triggered interest in the use of big data for official statistics, such that the United Nations Economic and Social Council considers it almost an obligation for statistical organizations to explore big data. Big data has been promoted as a timelier and cheaper alternative to traditional sources of official data, and one that offers great potential for monitoring the sustainable development goals. However, privacy concerns, technology and capacity remain significant obstacles to its use. This study makes a case for incorporating big data into official statistics in the Caribbean by highlighting the opportunities that big data provides for the subregion, while suggesting ways to manage the challenges. It serves as a starting point for further discussions on the many facets of big data and provides an initial platform upon which a Caribbean big data strategy could be built.
Abstract:
Graduate Program in Information Science - FFC