997 results for Big Sowing Pool


Relevance:

20.00%

Publisher:

Abstract:

Big data is big news in almost every sector, including crisis communication. However, not everyone has access to big data, and even those who do often lack the tools needed to analyze and cross-reference such large data sets. This paper therefore looks at patterns in the small data sets we are able to collect with current tools, to determine whether actionable information can be found in what we already have. We analyzed 164,390 tweets collected during the 2011 earthquake to find out what types of location-specific information people mention in their tweets, and when they mention it. Based on this analysis, we find that even a small data set containing far less data than a big data set can be useful for quickly identifying priority disaster-specific areas.
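
The kind of analysis the abstract describes, scanning a modest collection of tweets for location-specific terms and noting when those mentions occur, can be illustrated with a short hypothetical sketch in Python. The keyword list, record layout and sample tweets below are illustrative assumptions only, not the authors' data or code.

    # Minimal sketch (not the authors' code): count tweets that mention
    # location-specific terms, bucketed by hour of posting.
    from collections import Counter
    from datetime import datetime

    LOCATION_TERMS = {"bridge", "hospital", "school", "highway", "station"}  # hypothetical terms

    def location_mentions_by_hour(tweets):
        """tweets: iterable of (iso_timestamp, text) pairs."""
        counts = Counter()
        for ts, text in tweets:
            words = {w.strip(".,!?#").lower() for w in text.split()}
            if words & LOCATION_TERMS:
                hour = datetime.fromisoformat(ts).replace(minute=0, second=0)
                counts[hour] += 1
        return counts

    sample = [("2011-02-22T12:51:00", "Bridge near the station has collapsed!"),
              ("2011-02-22T13:05:00", "Thinking of everyone affected today")]
    print(location_mentions_by_hour(sample))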

Relevance:

20.00%

Publisher:

Abstract:

More than ever before, contemporary societies are characterised by the huge amounts of data being transferred. Authorities, companies, academia and other stakeholders refer to Big Data when discussing the importance of large and complex datasets and developing possible solutions for their use. Big Data promises to be the next frontier of innovation for institutions and individuals, yet it also offers possibilities to predict and influence human behaviour with ever-greater precision.

Relevance:

20.00%

Publisher:

Abstract:

In most developing countries, the overall quality of labourers' livelihoods, workplace environments and the implementation of labour rights do not progress at the same rate as industrial development. To address this situation, the ILO has initiated the concept of 'decent work' to assist regulators in articulating labour-related social policy goals. Against this backdrop, this article assesses the Bangladesh Labour Law 2006 by reference to the four social principles developed by the ILO for ensuring 'decent work'. It explains the impact that the absence of these principles in the Law has on labour administration in the ready-made garment and ship-breaking industries. It finds that an appropriate legislative framework needs to be based on the principles of 'decent work' to establish a solid platform for sound labour regulation in Bangladesh.

Relevance:

20.00%

Publisher:

Abstract:

Big Data is a rising IT trend, similar to cloud computing, social networking or ubiquitous computing, and it can offer beneficial scenarios in the e-health arena. In such scenarios, Big Data needs to be kept secure for long periods of time if its benefits, such as finding cures for infectious diseases and protecting patient privacy, are to be realised. It is therefore desirable to analyse Big Data to produce meaningful information while the data is stored securely, which makes the analysis of database encryption techniques essential. In this study, we simulated three technical environments, namely plain text, Microsoft built-in encryption, and a custom Advanced Encryption Standard (AES) implementation using a bucket index in a Data-as-a-Service (DaaS) setting. The results showed that the custom AES-DaaS approach has a faster range-query response time than the Microsoft built-in encryption. Furthermore, the scalability tests showed that there are performance thresholds that depend on the physical IT resources; for efficient Big Data management in e-health it is therefore worth examining these scalability limits even in a cloud computing environment. In addition, when designing an e-health database, both patient privacy and system performance need to be treated as top priorities.
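
A minimal sketch of the bucket-index idea evaluated in the study follows, assuming Python and the third-party 'cryptography' package: plaintext bucket labels are stored alongside encrypted payloads, so a range query retrieves candidate buckets and the client decrypts and filters exactly. Fernet (AES-based) stands in here for the study's custom AES layer; the bucket width and record layout are assumptions for illustration, not the study's implementation.

    # Bucket-index range query over encrypted rows (illustrative sketch only).
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()
    cipher = Fernet(key)
    BUCKET_WIDTH = 10  # e.g. ages 0-9 -> bucket 0, 10-19 -> bucket 1, ...

    def store(records):
        """records: list of (age, payload_str) -> rows of (bucket_id, ciphertext)."""
        return [(age // BUCKET_WIDTH, cipher.encrypt(f"{age}|{payload}".encode()))
                for age, payload in records]

    def range_query(rows, lo, hi):
        # Server side: return whole candidate buckets using only plaintext labels.
        candidates = [tok for b, tok in rows if lo // BUCKET_WIDTH <= b <= hi // BUCKET_WIDTH]
        # Client side: decrypt candidates and filter exactly.
        results = []
        for tok in candidates:
            age, payload = cipher.decrypt(tok).decode().split("|", 1)
            if lo <= int(age) <= hi:
                results.append((int(age), payload))
        return results

    rows = store([(34, "patient A"), (47, "patient B"), (61, "patient C")])
    print(range_query(rows, 40, 65))   # -> patients B and C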

Relevance:

20.00%

Publisher:

Abstract:

Objective: To describe the characteristics of patients presenting to Emergency Departments (EDs) in Queensland, Australia, with injuries due to assault with a glass implement (‘glassing’), and to set this within the broader context of presentations due to alcohol-related violence. Methods: Analysis of prospectively collected ED injury surveillance data collated by the Queensland Injury Surveillance Unit (QISU) between 1999 and 2011. Cases of injury due to alcohol-related violence were identified and analysed using coded fields supplemented with qualitative data contained within the injury description text. Descriptive statistics were used to assess the characteristics of injury presentations due to alcohol-related violence. Violence included interpersonal violence and aggression (verbal aggression and object violence). Results: A total of 4,629 cases were studied. The study population was predominantly male (72%) and aged 18 to 24 years (36%), with males in this age group comprising more than a quarter of the study population (28%). Nine per cent of alcohol-related assault injuries were a consequence of ‘glassing’. The home was the most common location for alcohol-related violence (31%) and alcohol-related ‘glassings’ (33%). Overall, the most common glass object involved was a bottle (75%); however, within licensed venues an even mix of drinking glasses (44%) and glass bottles (45%) was identified. Conclusions: Contrary to the public perception generated by media coverage, ‘glassing’ incidents, particularly at licensed venues, constitute a relatively small proportion of all alcohol-related violence. The current study highlights the predominance of young men injured following alcohol-related violence, identifying a key population focus for prevention strategies.

Relevance:

20.00%

Publisher:

Abstract:

The promise of ‘big data’ has generated a great deal of interest in the development of new approaches to research in the humanities and social sciences, as well as a range of important critical interventions which warn against an unquestioned rush to ‘big data’. Drawing on the experience of developing innovative ‘big data’ approaches to social media research, this paper examines some of the repercussions for the scholarly research and publication practices of those researchers who do pursue the path of ‘big data’-centric investigation in their work. As researchers import the tools and methods of highly quantitative, statistical analysis from the ‘hard’ sciences into computational, digital humanities research, must they also subscribe to the language and assumptions underlying such ‘scientificity’? If so, how does this affect the choices made in gathering, processing, analysing, and disseminating the outcomes of digital humanities research? In particular, is there a need to rethink the forms and formats of publishing scholarly work in order to enable the rigorous scrutiny and replicability of research outcomes?

Relevance:

20.00%

Publisher:

Abstract:

As media institutions are encouraged to explore new production methodologies in the current economic crisis, they align with Schumpeter’s creative destruction provocation by exhibiting user-led political, organisational and socio-technical innovations. This paper highlights the significance of the cultural intermediary within the innovative, co-creative production arrangements for cultural artefacts by media professionals in institutional online communities. An institutional online community is defined as one that is housed, resourced and governed by commercial or non-commercial institutions and is not independently facilitated. Web 2.0 technologies have mobilised collaborative peer production activities for online content creation, and professional media institutions face challenges in engaging participatory audiences in practices that are beneficial for all concerned stakeholders. The interests of those stakeholders often do not align, highlighting the need for an intermediary role that understands and translates the norms, rhetorical tropes and day-to-day activities between the individuals engaging in participatory communication activities, for successful negotiation within the production process. This paper specifically explores the participatory relationship between the public service broadcaster (PSB), the Australian Broadcasting Corporation (ABC), and one of its online communities, ABC Pool (www.abc.net.au/pool). ABC Pool is an online platform developed and resourced by the ABC to encourage co-creation between audience members engaging in the production of user-generated content (UGC) and the professional producers housed within the ABC Radio Division. This empirical research emerges from a three-year research project in which I employed an ethnographic action research methodology and was embedded at the ABC as the community manager of ABC Pool. In participatory communication environments, users favour meritocratic heterarchical governance over traditional institutional hierarchical systems (Malaby 2009). A reputation environment based on meritocracy requires an intermediary to identify the stakeholders, understand their interests and communicate effectively between them to negotiate successful production outcomes (Bruns 2008; Banks 2009). The community manager generally occupies this role; however, it has emerged that other institutional production environments also employ an intermediary role under alternative monikers (Hutchinson 2012). A useful umbrella term to encompass the myriad roles within this space is the cultural intermediary. The ABC has experimented with three institutional online community governance models that engage in cultural intermediation in differing decentralised capacities. The first and most closed is a single-point-of-contact model in which one cultural intermediary controls all of the communication of the participatory project. The second is a model of multiple cultural intermediaries engaging in communication between the institutional online community stakeholders simultaneously. The third is the most open, yet problematic, as it promotes and empowers community participants to the level of cultural intermediaries. This paper uses the ABC Pool case study to highlight the differing levels of openness within cultural intermediation during the co-creative production process of a cultural artefact.

Relevance:

20.00%

Publisher:

Abstract:

This thesis addresses the question of what it means to be a public broadcaster in the context of a rapidly changing media landscape, in which audiences no longer only watch and consume but now also make and share media content. Through a close investigation of the ABC Pool community, this thesis documents how the different interests of the stakeholders within an institutional online community intersect and how those interests are negotiated within the Australian Broadcasting Corporation. It demonstrates a new approach towards the cultural intermediation of user-created content within institutional online communities. The research moves beyond the exploration of the community manager role as one type of intermediary to demonstrate the activities of multiple cultural intermediaries that engage in collaborative peer production. Cultural intermediation provides the basis for institutional online community governance.

Relevance:

20.00%

Publisher:

Abstract:

Acoustic sensing is a promising approach to scaling faunal biodiversity monitoring. Scaling the analysis of audio collected by acoustic sensors is a big data problem. Standard approaches to dealing with big acoustic data include automated recognition and crowd-based analysis. Automatic methods are fast at processing but hard to design rigorously, whilst manual methods are accurate but slow. In particular, manual methods of acoustic data analysis are constrained by a 1:1 time relationship between the data and its analysts: the inherent need to listen to the audio data. This paper demonstrates how the efficiency of crowd-sourced sound analysis can be increased by an order of magnitude through the visual inspection of audio visualized as spectrograms. Experimental data suggest that an analysis speedup of 12× is obtainable for suitable types of acoustic analysis when only spectrograms are shown.
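
The visualisation step described above can be sketched in a few lines: render the audio as a spectrogram image that analysts scan by eye rather than listening to in real time. The sketch below uses standard scientific Python libraries and a synthetic tone in place of a real sensor recording; it is an illustration, not the project's processing code.

    # Render audio as a spectrogram for visual (rather than aural) inspection.
    import numpy as np
    from scipy.signal import spectrogram
    import matplotlib.pyplot as plt

    fs = 22050                                        # sample rate (Hz)
    t = np.linspace(0, 10, fs * 10, endpoint=False)   # 10 seconds of samples
    audio = np.sin(2 * np.pi * (1000 + 200 * t) * t)  # synthetic rising tone

    f, times, Sxx = spectrogram(audio, fs=fs, nperseg=1024)
    plt.pcolormesh(times, f, 10 * np.log10(Sxx + 1e-12), shading="auto")
    plt.xlabel("Time (s)")
    plt.ylabel("Frequency (Hz)")
    plt.title("Spectrogram for visual inspection")
    plt.savefig("spectrogram.png")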

Relevance:

20.00%

Publisher:

Abstract:

Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a ‘noisy’ environment such as contemporary social media, is to collect the pertinent information: be that information for a specific study, tweets which can inform emergency services or other responders during an ongoing crisis, or information which gives an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies needing to be continually adapted to respond to this changing information. While many of the data sets collected and analyzed are pre-formed, that is, built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders.

The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions for this: content analysis and user profiling. In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as an authoritative source. Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection.

The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, as with American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services are still niche operations, much of the value of the information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate.

The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis. A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in a way, keyword creation is part strategy and part art. In this paper we describe strategies for the creation of a social media archive, specifically of tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement’s presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study.

The common theme amongst these papers is that of constructing a data set, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and the subsequent discussion, the panel will inform the wider research community not only about the objectives and limitations of data collection, live analytics, and filtering, but also about current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
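
The content-analysis idea described for the first paper, scoring incoming tweets for topical relevance and urgency and passing only the top-scoring items to responders, can be illustrated with a small sketch. The keyword weights, threshold and capacity below are assumptions for illustration, not values from the study.

    # Score incoming tweets and keep only those worth putting in front of responders.
    TOPIC_TERMS = {"flood": 2, "earthquake": 2, "trapped": 3, "evacuate": 2}   # hypothetical weights
    URGENCY_TERMS = {"help": 3, "urgent": 3, "now": 1, "injured": 3}

    def score(tweet_text):
        words = tweet_text.lower().split()
        relevance = sum(TOPIC_TERMS.get(w, 0) for w in words)
        urgency = sum(URGENCY_TERMS.get(w, 0) for w in words)
        return relevance + urgency

    def triage(tweets, capacity=10, threshold=4):
        """Return at most `capacity` tweets whose combined score clears the threshold."""
        ranked = sorted(tweets, key=score, reverse=True)
        return [t for t in ranked if score(t) >= threshold][:capacity]

    stream = ["Family trapped upstairs, flood rising, send help now",
              "Lovely sunny afternoon by the river"]
    print(triage(stream))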

Relevance:

20.00%

Publisher:

Abstract:

Modern health information systems can generate several exabytes of patient data, the so-called "Health Big Data", per year. Many health managers and experts believe that, with these data, it is possible to easily discover useful knowledge to improve health policies, increase patient safety and eliminate redundancies and unnecessary costs. The objective of this paper is to discuss the characteristics of Health Big Data as well as the challenges and solutions for health Big Data Analytics (BDA), the process of extracting knowledge from sets of Health Big Data, and to design and evaluate a pipelined framework for use as a guideline/reference in health BDA.
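
The pipelined structure referred to above can be sketched generically as a chain of stages, each consuming the previous stage's output. The stage names and toy records below are assumptions for illustration only and are not the framework the paper proposes.

    # Generic pipelined structure: ingest -> de-identify -> analyse -> report.
    from functools import reduce

    def ingest(_):
        return [{"id": 1, "name": "A. Patient", "glucose": 6.1},
                {"id": 2, "name": "B. Patient", "glucose": 9.4}]

    def deidentify(records):
        # Drop directly identifying fields before analysis.
        return [{k: v for k, v in r.items() if k != "name"} for r in records]

    def analyse(records):
        flagged = [r for r in records if r["glucose"] > 7.0]
        return {"n": len(records), "flagged": flagged}

    def report(summary):
        return f"{summary['n']} records processed, {len(summary['flagged'])} flagged"

    pipeline = [ingest, deidentify, analyse, report]
    print(reduce(lambda data, stage: stage(data), pipeline, None))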

Relevance:

20.00%

Publisher:

Abstract:

Environmental monitoring is becoming critical as human activity and climate change place greater pressures on biodiversity, leading to an increasing need for data to make informed decisions. Acoustic sensors can help collect data across large areas for extended periods, making them attractive for environmental monitoring. However, managing and analysing large volumes of environmental acoustic data is a great challenge and is consequently hindering the effective utilization of the large datasets collected. This paper presents an overview of our current techniques for collecting, storing and analysing large volumes of acoustic data efficiently, accurately and cost-effectively.

Relevance:

20.00%

Publisher:

Abstract:

Enterprises, both public and private, have rapidly begun exploiting the benefits of enterprise resource planning (ERP) combined with business analytics and "open data sets", which are often outside the control of the enterprise, to gain further efficiencies, build new service operations and increase business activity. In many cases, these business activities are based around relevant software systems hosted in a "cloud computing" environment. "Garbage in, garbage out", or "GIGO", is a term dating from the 1960s that has long been used to describe the problems of unqualified dependence on information systems. A more pertinent variation arose some time later, namely "garbage in, gospel out", signifying that with large-scale information systems, such as ERP and the use of open datasets in a cloud environment, the ability to verify the authenticity of the data sets used may be almost impossible, resulting in dependence upon questionable results. Illicit data set "impersonation" becomes a reality. At the same time, the ability to audit such results may be an important requirement, particularly in the public sector. This paper discusses the need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment, and analyses some current technologies on offer that may be appropriate. However, severe limitations in addressing these requirements have been identified, and the paper proposes further research work in the area.
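
One simple building block for the authenticity and reliability services discussed above is integrity checking of an open dataset against a digest published by its provider. The sketch below assumes such a published digest exists and uses a SHA-256 hash; it illustrates the idea only and is not a solution to the data set impersonation problem the paper identifies.

    # Verify a downloaded dataset against a provider-published SHA-256 digest.
    import hashlib

    def sha256_of_file(path, chunk_size=1 << 20):
        """Stream the file so arbitrarily large datasets can be hashed."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def verify_dataset(path, published_digest):
        """Compare a local copy against the digest the data provider published."""
        return sha256_of_file(path) == published_digest.lower()

    # Hypothetical usage; the path and digest are placeholders, not real values.
    # print(verify_dataset("open_data.csv", "ab12...ef"))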

Relevance:

20.00%

Publisher:

Abstract:

The Design Minds The Big Picture Toolkit was one of six K7-12 secondary school design toolkits commissioned by the State Library of Queensland (SLQ) Asia Pacific Design Library (APDL) to facilitate the delivery of the Stage 1 launch of its Design Minds online platform (www.designminds.org.au), a partnership initiative with Queensland Government Arts Queensland and the Smithsonian Cooper-Hewitt National Design Museum, on June 29, 2012. Design Minds toolkits are practical guides, underpinned by a combination of one to three of the Design Minds model phases of ‘Inquire’, ‘Ideate’ and ‘Implement’ (supported at each stage by structured reflection), designed to enhance existing school curriculum and empower students with real-life design exercises within the classroom environment. Toolkits directly identify links to NAPLAN, the National Curriculum, C2C and Professional Standards benchmarks, as well as the capabilities of successful and creative 21st-century citizens that they seek to engender in students through design thinking. Inspired by the Unlimited: Designing for the Asia Pacific Generation Workshop 2010 (http://eprints.qut.edu.au/47762/), this toolkit explores, through three distinct exercises, ‘design for the other 90%’, addressing tools and approaches to diverse and changing social, cultural, technological and environmental challenges. The Design Minds The Big Picture Toolkit challenges students to be active agents for change and to think creatively and optimistically about solutions to future global issues that deliver social, economic and environmental benefits. More generally, it aims to build young people’s awareness of the role of design in society and the value of design thinking skills in generating strategies to solve basic to complex systemic challenges, as well as to inspire post-secondary pathways and idea generation for education. The toolkit encourages students and teachers to develop sketching, making, communication, presentation and collaboration skills to improve their design process, as well as to pursue further inquiry (background research) to enhance the ideation exercises. Exercise 1 focuses on the ‘Inquire’ phase, Exercise 2 on the ‘Inquire’ and ‘Ideate’ phases, and Exercise 3 concentrates on the ‘Implement’ phase. Depending on the intensity of the focus, the unit of work could be developed over a 4-5 week program (approximately 4-6 x 60-minute lessons/workshops) or as smaller workshops treated as discrete learning experiences. The toolkit is available for public download from http://designminds.org.au/the-big-picture/ on the Design Minds website.

Relevance:

20.00%

Publisher:

Abstract:

Urban space has the potential to shape people's experience and understanding of the city and of the culture of a place. In some respects, murals and allied forms of wall art occupy the intersection of street art and public art, engaging with, and sometimes transforming, the urban space in which they exist and those who use it. While murals are often conceived as a more ‘permanent’ form of painted art, there has been a trend in recent years towards more deliberately transient forms of wall art, such as washed-wall murals and reverse graffiti. These varying forms of public wall art are embedded within the fabric of the urban space and its history. This paper explores the intersection of public space, public art and public memory in a mural project in the Irish city of Cork. Focussing on the washed-wall murals of Cork's historic Shandon district, we explore the sympathetic and synergetic relationship of this wall art with the heritage architecture of the built environment, and the murals as an expression of and for the local community, past and present. Through the Shandon Big Wash Up murals we reflect on the function of participatory public art as an explicit act of urban citizenship, one which works to support community-led re-enchantment in the city through a reconnection with its past.