818 results for big data


Relevance:

100.00%

Publisher:

Abstract:

We are sympathetic to Bentley et al.'s attempt to encompass the wisdom of crowds in a generative model, but posit that success in using Big Data will require more sensitive measurements and more, and more varied, sources of information, as well as building on the indirect information available through technology, from ancillary technical features to data from brain-computer interfaces.

Relevance:

100.00%

Publisher:

Abstract:

JASMIN is a super-data-cluster designed to provide a high-performance, high-volume data analysis environment for the UK environmental science community. Thus far JASMIN has been used primarily by the atmospheric science and earth observation communities, both to support their direct scientific workflow and to curate data products in the STFC Centre for Environmental Data Archival (CEDA). The initial JASMIN configuration and first experiences are reported here, along with useful improvements in scientific workflow. It is clear from the explosive growth in stored data and use that there was pent-up demand for a suitable big-data analysis environment. This demand is not yet satisfied, in part because JASMIN does not yet have enough compute, its storage is fully allocated, and not all software needs are met. Plans to address these constraints are introduced.

Relevance:

100.00%

Publisher:

Abstract:

Owing to continuous advances in the computational power of handheld devices such as smartphones and tablet computers, it has become possible to perform Big Data operations, including modern data mining processes, onboard these small devices. A decade of research has proved the feasibility of what has been termed Mobile Data Mining, with a focus on a single mobile device running data mining processes. It was not until 2010, however, that the authors of this book initiated the Pocket Data Mining (PDM) project, exploiting seamless communication among handheld devices to perform data analysis tasks that were infeasible until recently. PDM is the process of collaboratively extracting knowledge from distributed data streams in a mobile computing environment. This book provides the reader with an in-depth treatment of this emerging area of research, giving details of the techniques used and thorough experimental studies. More importantly, and exclusive to this book, the authors provide a detailed practical guide to deploying PDM in the mobile environment. An important extension to the basic implementation of PDM, dealing with concept drift, is also reported. Finally, in the era of Big Data, potential applications of paramount importance offered by PDM in a variety of domains, including security, business and telemedicine, are discussed.
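The collaborative extraction the abstract describes can be illustrated with a minimal sketch: several devices each train a lightweight model on their own slice of a data stream, and a query is answered by confidence-weighted voting across devices. All names here (`DeviceModel`, `collaborative_predict`) are hypothetical illustrations, not the book's actual PDM implementation.

```python
from collections import Counter

class DeviceModel:
    """A hypothetical per-device learner: majority class per feature value."""
    def __init__(self):
        self.counts = {}

    def learn(self, x, label):
        # Update local counts from this device's portion of the stream.
        self.counts.setdefault(x, Counter())[label] += 1

    def predict(self, x):
        # Return the locally most frequent label and a confidence weight.
        if x not in self.counts:
            return None, 0.0
        label, n = self.counts[x].most_common(1)[0]
        return label, n / sum(self.counts[x].values())

def collaborative_predict(devices, x):
    """Confidence-weighted majority vote over all devices that can classify x."""
    votes = Counter()
    for d in devices:
        label, weight = d.predict(x)
        if label is not None:
            votes[label] += weight
    return votes.most_common(1)[0][0] if votes else None

# Three devices see interleaved slices of a shared stream.
devices = [DeviceModel() for _ in range(3)]
stream = [("high", "alert"), ("low", "ok"), ("high", "alert"),
          ("low", "ok"), ("high", "ok"), ("low", "ok")]
for i, (x, y) in enumerate(stream):
    devices[i % 3].learn(x, y)

print(collaborative_predict(devices, "high"))  # two of three devices vote "alert"
```

The design point is that only small per-device summaries, not raw streams, need to cross the network, which is what makes mining feasible on resource-constrained handhelds.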

Relevance:

100.00%

Publisher:

Abstract:

The term 'big data' has recently emerged to describe a range of technological and commercial trends enabling the storage and analysis of huge amounts of customer data, such as that generated by social networks and mobile devices. Much of the commercial promise of big data lies in the ability to generate valuable insights from collecting new types and volumes of data in ways that were not previously economically viable. At the same time, a number of questions have been raised about the implications for individual privacy. This paper explores key perspectives underlying the emergence of big data, and considers both the opportunities and the ethical challenges raised for market research.

Relevance:

100.00%

Publisher:

Abstract:

As we enter an era of ‘big data’, asset information is becoming a deliverable of complex projects. Prior research suggests digital technologies enable rapid, flexible forms of project organizing. This research analyses practices of managing change in Airbus, CERN and Crossrail through desk-based review, interviews, visits and a cross-case workshop. These organizations deliver complex projects, rely on digital technologies to manage large data sets, and use configuration management, a systems engineering approach with mid-20th-century origins, to establish and maintain integrity. In them, configuration management has become more, rather than less, important. Asset information is structured, with change managed through digital systems using relatively hierarchical, asynchronous and sequential processes. The paper contributes by uncovering limits to flexibility in complex projects where integrity is important. Challenges of managing change are discussed, considering the evolving nature of configuration management, the potential use of analytics on complex projects, and implications for research and practice.

Relevance:

100.00%

Publisher:

Abstract:

Pervasive healthcare aims to deliver deinstitutionalised healthcare services to patients anytime and anywhere. It involves remote data collection through mobile devices and sensor networks, where the data is typically large in volume, varied in format and high in frequency. The nature of big data, its volume, variety, velocity and veracity, together with its analytical capabilities, complements the delivery of pervasive healthcare. However, there is limited research intertwining these two domains. Most research focuses mainly on the technical context of big data applications in the healthcare sector; little attention has been paid to the strategic role of big data, which impacts the quality of healthcare service provision at the organisational level. Therefore, this paper delivers a conceptual view of big data architecture for pervasive healthcare via an intensive literature review to address the aforementioned research problems. The paper provides three major contributions: 1) it identifies the research themes of big data and pervasive healthcare, 2) it establishes the relationships between those themes, which together compose the big data architecture for pervasive healthcare, and 3) it sheds light on future research, such as semiosis and sense-making, and enables practitioners to implement big data in pervasive healthcare through the proposed architecture.

Relevance:

100.00%

Publisher:

Abstract:

Widespread commercial use of the internet has significantly increased the volume and scope of data being collected by organisations. ‘Big data’ has emerged as a term to encapsulate both the technical and commercial aspects of this growing data collection activity. To date, much of the discussion of big data has centred upon its transformational potential for innovation and efficiency, yet there has been less reflection on its wider implications beyond commercial value creation. This paper builds upon normal accident theory (NAT) to analyse the broader ethical implications of big data. It argues that the strategies behind big data require organisational systems that leave them vulnerable to normal accidents, that is to say some form of accident or disaster that is both unanticipated and inevitable. Whilst NAT has previously focused on the consequences of physical accidents, this paper suggests a new form of system accident that we label data accidents. These have distinct, less tangible and more complex characteristics and raise significant questions over the role of individual privacy in a ‘data society’. The paper concludes by considering the ways in which the risks of such data accidents might be managed or mitigated.

Relevance:

100.00%

Publisher:

Abstract:

The size and complexity of data sets generated within ecosystem-level programmes merit their capture, curation, storage, analysis, synthesis and visualisation using Big Data approaches. This review looks at previous attempts to organise and analyse such data through the International Biological Programme, and draws on the mistakes made and the lessons learned for effective Big Data approaches to current Research Councils United Kingdom (RCUK) ecosystem-level programmes, using Biodiversity and Ecosystem Service Sustainability (BESS) and Environmental Virtual Observatory Pilot (EVOp) as exemplars. The challenges raised by such data are identified and explored, and suggestions are made for the two major issues of extending analyses across different spatio-temporal scales and effectively integrating quantitative and qualitative data.

Relevance:

100.00%

Publisher:

Abstract:

This paper discusses how global financial institutions are using big data analytics within their compliance operations. Much previous research has focused on the strategic implications of big data, but little has considered how such tools are entwined with regulatory breaches and investigations in financial services. Our work covers two in-depth qualitative case studies, each addressing a distinct type of analytics. The first case focuses on analytics that manage everyday compliance breaches and so are expected by managers. The second focuses on analytics that facilitate investigation and litigation where serious, unexpected breaches may have occurred. In doing so, the study focuses on the micro level to understand how these tools are influencing operational risks and practices. The paper draws on two bodies of literature, the social studies of information systems and of finance, to guide the analysis and practitioner recommendations. The cases illustrate how technologies are implicated in multijurisdictional challenges and regulatory conflicts at each end of the operational risk spectrum. We find that compliance analytics both shape and report regulatory matters, yet firms often have difficulty recruiting individuals with relevant but diverse skill sets. The cases also underscore the increasing need for financial organizations to adopt robust information governance policies and processes to ease future remediation efforts.

Relevance:

100.00%

Publisher:

Abstract:

This FGV Projetos study, coordinated by the economist Fernando Blumenschein, develops a specific methodological framework for government procurement, based on research carried out for the Fundo Nacional de Desenvolvimento da Educação (FNDE). The study highlights the potential of applying concepts from auction theory, together with "Big Data" analysis methods, to the design of public procurement sessions.

Relevance:

100.00%

Publisher:

Abstract:

The large-scale collection and storage of data, combined with the capacity to process data that are not necessarily related to one another so as to generate new data and information, is a widely used technology today, generally known as Big Data. While it enables the creation of innovative new products and services that meet demands and solve problems across many sectors of society, Big Data raises a series of questions related to the rights to privacy and to the protection of personal data. This article aims to foster a debate on the reach of current legal protection of privacy and personal-data rights in this context, and consequently to encourage further studies on reconciling those rights with the freedom to innovate. To that end, it first addresses the positive and negative aspects of Big Data, identifying how it affects society and the economy broadly, including but not limited to issues of consumption, health, social organisation and government administration. It then identifies the effects of this technology on the rights to privacy and to personal-data protection, given that Big Data brings major changes to how data is stored and processed. Finally, it maps the current Brazilian regulatory framework protecting those rights, assessing whether it genuinely responds to the present challenge of reconciling innovation and privacy.

Relevance:

100.00%

Publisher:

Abstract:

Digital technologies have become an important infrastructure for our lives across many dimensions: cultural, social, political and economic. Forms of digital mediation have altered traditional and conventional ways of organising time and space. Yet contextualising and situating the narratives and practices of producing and analysing network data, generated at great volume and great speed, has been a major challenge for researchers around the world. Published in late 2014, the book Big data? Qualitative approaches to digital research (Emerald Books, 2014) offers a critical view of qualitative analyses of new types of data, platforms and media, their implications for the future, and how to actively improve research on them. With expert authors from sociology, political science, culture, communication, methodology and management, the book's editors, Martin Hand and Sam Hillyard, use the term digital social research for all the qualitative perspectives, across disciplines, concepts and methodological and empirical orientations, that attest to the integration and diversification of digital technologies and data in social life today.

Relevance:

100.00%

Publisher:

Abstract:

EMAp - Escola de Matemática Aplicada (School of Applied Mathematics)

Relevance:

100.00%

Publisher:

Abstract:

We present an innovative project at the intersection of information technology, management and law, intended to optimise outcomes and reduce costs and time. The proposing team was formed within the Big Data e Gestão Processual (Big Data and Case Management) project, in which three schools of the Fundação Getulio Vargas, national references throughout Brazil, participate: the schools of Law, Business Administration and Applied Mathematics, all in Rio de Janeiro.